Jan 20 22:35:26 crc systemd[1]: Starting Kubernetes Kubelet... Jan 20 22:35:26 crc restorecon[4711]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 20 22:35:26 
crc restorecon[4711]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 
22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc 
restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 
crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 
crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:26 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 
22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 22:35:27 crc 
restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 
22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 
22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc 
restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 22:35:27 crc restorecon[4711]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
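
Up to this point the journal is dominated by restorecon[4711] reporting that per-pod paths under /var/lib/kubelet kept their admin-customized SELinux labels ("not reset as customized by admin to <context>"). A quick way to condense that output is to tally the affected paths per pod UID and per target context. The following is a minimal sketch only, assuming the journal has been saved to a plain-text file; the file name kubelet.log and the helper name summarize_restorecon are illustrative and do not appear in the log:

    import re
    from collections import Counter

    # Illustrative helper (not part of the log): tally restorecon entries of the
    # form ".../pods/<UID>/<path> not reset as customized by admin to <context>"
    # from a saved journal excerpt (for example, journalctl output redirected to a file).
    ENTRY = re.compile(
        r"restorecon\[\d+\]: /var/lib/kubelet/pods/([0-9a-f-]{32,36})/\S* "
        r"not reset as customized by admin to (\S+)"
    )

    def summarize_restorecon(path="kubelet.log"):  # file name is an assumption
        per_pod, contexts = Counter(), Counter()
        with open(path) as fh:
            for line in fh:
                # findall returns (pod_uid, selinux_context) tuples for every entry on the line
                for pod_uid, context in ENTRY.findall(line):
                    per_pod[pod_uid] += 1
                    contexts[context] += 1
        for uid, count in per_pod.most_common():
            print(f"{uid}: {count} paths kept their admin-customized label")
        for ctx, count in contexts.most_common():
            print(f"  {ctx}: {count}")

    if __name__ == "__main__":
        summarize_restorecon()

Run against a saved journal, this prints one line per pod UID (the catalog pods 5225d0e4-... and 1d611f23-... above account for most entries) plus a breakdown of the SELinux contexts involved; entries for non-pod paths such as /var/lib/kubelet/plugins are deliberately skipped by the pattern.
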
Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 20 22:35:27 crc kubenswrapper[5030]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.785528 5030 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787770 5030 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787805 5030 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787809 5030 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787814 5030 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787817 5030 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787821 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787826 5030 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787829 5030 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787835 5030 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787839 5030 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787844 5030 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787848 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787853 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787856 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787861 5030 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787865 5030 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787868 5030 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787872 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787876 5030 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787880 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787885 5030 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787889 5030 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787894 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787898 5030 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787902 5030 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787906 5030 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787910 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787914 5030 feature_gate.go:330] unrecognized feature gate: Example Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787918 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787922 5030 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787925 5030 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787929 5030 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787932 5030 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787936 5030 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787939 5030 feature_gate.go:330] unrecognized 
feature gate: ExternalOIDC Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787944 5030 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787949 5030 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787954 5030 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787958 5030 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787961 5030 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787965 5030 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787969 5030 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787972 5030 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787977 5030 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787980 5030 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.787984 5030 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788010 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788014 5030 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788018 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788022 5030 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788026 5030 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788029 5030 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788034 5030 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788038 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788043 5030 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788048 5030 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788051 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788055 5030 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788058 5030 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788061 5030 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788065 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788068 5030 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788072 5030 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788075 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788078 5030 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788082 5030 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788085 5030 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788088 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788092 5030 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788096 5030 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.788100 5030 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
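The long runs of feature_gate.go:330 warnings above (repeated again below each time the gate map is re-applied) come from gate names this Kubernetes kubelet binary does not define; they appear to be cluster-level OpenShift gates (GatewayAPI, NewOLM, InsightsConfig, and so on) passed through wholesale, which the kubelet warns about and ignores, while the gates it does recognize (ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, CloudDualStackNodeIPs, KMSv1) are applied as logged. A small sketch, assuming the journal has been saved to a local file (the name kubelet.log is hypothetical, e.g. from journalctl -u kubelet), that reduces the noise to the distinct gate names and how often each was warned about:

# Sketch: summarize the repeated "unrecognized feature gate" warnings from a
# saved journal excerpt. The input file name is hypothetical.
import re
from collections import Counter

PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

counts = Counter()
with open("kubelet.log") as log:
    for line in log:
        counts.update(PATTERN.findall(line))

print(f"{len(counts)} distinct unrecognized gates")
for gate, n in counts.most_common():
    print(f"{n:4d}  {gate}")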
Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788362 5030 flags.go:64] FLAG: --address="0.0.0.0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788372 5030 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788380 5030 flags.go:64] FLAG: --anonymous-auth="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788385 5030 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788390 5030 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788395 5030 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788400 5030 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788406 5030 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788410 5030 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788414 5030 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788419 5030 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788423 5030 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788427 5030 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788431 5030 flags.go:64] FLAG: --cgroup-root="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788435 5030 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788439 5030 flags.go:64] FLAG: --client-ca-file="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788443 5030 flags.go:64] FLAG: --cloud-config="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788447 5030 flags.go:64] FLAG: --cloud-provider="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788451 5030 flags.go:64] FLAG: --cluster-dns="[]" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788456 5030 flags.go:64] FLAG: --cluster-domain="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788460 5030 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788465 5030 flags.go:64] FLAG: --config-dir="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788469 5030 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788473 5030 flags.go:64] FLAG: --container-log-max-files="5" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788478 5030 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788482 5030 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788487 5030 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788491 5030 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788495 5030 flags.go:64] FLAG: --contention-profiling="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 
22:35:27.788498 5030 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788502 5030 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788507 5030 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788511 5030 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788515 5030 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788520 5030 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788524 5030 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788528 5030 flags.go:64] FLAG: --enable-load-reader="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788531 5030 flags.go:64] FLAG: --enable-server="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788552 5030 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788558 5030 flags.go:64] FLAG: --event-burst="100" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788562 5030 flags.go:64] FLAG: --event-qps="50" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788566 5030 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788570 5030 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788574 5030 flags.go:64] FLAG: --eviction-hard="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788579 5030 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788584 5030 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788587 5030 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788592 5030 flags.go:64] FLAG: --eviction-soft="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788596 5030 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788599 5030 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788603 5030 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788608 5030 flags.go:64] FLAG: --experimental-mounter-path="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788612 5030 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788633 5030 flags.go:64] FLAG: --fail-swap-on="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788637 5030 flags.go:64] FLAG: --feature-gates="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788642 5030 flags.go:64] FLAG: --file-check-frequency="20s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788646 5030 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788651 5030 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788655 5030 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 
22:35:27.788659 5030 flags.go:64] FLAG: --healthz-port="10248" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788663 5030 flags.go:64] FLAG: --help="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788667 5030 flags.go:64] FLAG: --hostname-override="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788671 5030 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788675 5030 flags.go:64] FLAG: --http-check-frequency="20s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788679 5030 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788683 5030 flags.go:64] FLAG: --image-credential-provider-config="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788686 5030 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788690 5030 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788695 5030 flags.go:64] FLAG: --image-service-endpoint="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788699 5030 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788702 5030 flags.go:64] FLAG: --kube-api-burst="100" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788707 5030 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788711 5030 flags.go:64] FLAG: --kube-api-qps="50" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788715 5030 flags.go:64] FLAG: --kube-reserved="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788719 5030 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788723 5030 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788728 5030 flags.go:64] FLAG: --kubelet-cgroups="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788732 5030 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788737 5030 flags.go:64] FLAG: --lock-file="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788741 5030 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788744 5030 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788749 5030 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788754 5030 flags.go:64] FLAG: --log-json-split-stream="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788759 5030 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788762 5030 flags.go:64] FLAG: --log-text-split-stream="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788766 5030 flags.go:64] FLAG: --logging-format="text" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788770 5030 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788775 5030 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788779 5030 flags.go:64] FLAG: --manifest-url="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788783 5030 
flags.go:64] FLAG: --manifest-url-header="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788788 5030 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788792 5030 flags.go:64] FLAG: --max-open-files="1000000" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788797 5030 flags.go:64] FLAG: --max-pods="110" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788801 5030 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788805 5030 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788809 5030 flags.go:64] FLAG: --memory-manager-policy="None" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788813 5030 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788817 5030 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788821 5030 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788825 5030 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788834 5030 flags.go:64] FLAG: --node-status-max-images="50" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788838 5030 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788842 5030 flags.go:64] FLAG: --oom-score-adj="-999" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788846 5030 flags.go:64] FLAG: --pod-cidr="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788850 5030 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788856 5030 flags.go:64] FLAG: --pod-manifest-path="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788860 5030 flags.go:64] FLAG: --pod-max-pids="-1" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788864 5030 flags.go:64] FLAG: --pods-per-core="0" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788868 5030 flags.go:64] FLAG: --port="10250" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788872 5030 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788890 5030 flags.go:64] FLAG: --provider-id="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788894 5030 flags.go:64] FLAG: --qos-reserved="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788898 5030 flags.go:64] FLAG: --read-only-port="10255" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788902 5030 flags.go:64] FLAG: --register-node="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788907 5030 flags.go:64] FLAG: --register-schedulable="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788911 5030 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788917 5030 flags.go:64] FLAG: --registry-burst="10" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788921 5030 flags.go:64] FLAG: --registry-qps="5" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788925 5030 flags.go:64] 
FLAG: --reserved-cpus="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788930 5030 flags.go:64] FLAG: --reserved-memory="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788936 5030 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788940 5030 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788945 5030 flags.go:64] FLAG: --rotate-certificates="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788949 5030 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788954 5030 flags.go:64] FLAG: --runonce="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788958 5030 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788962 5030 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788967 5030 flags.go:64] FLAG: --seccomp-default="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788971 5030 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788975 5030 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788980 5030 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788984 5030 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788988 5030 flags.go:64] FLAG: --storage-driver-password="root" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788992 5030 flags.go:64] FLAG: --storage-driver-secure="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.788996 5030 flags.go:64] FLAG: --storage-driver-table="stats" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789000 5030 flags.go:64] FLAG: --storage-driver-user="root" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789004 5030 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789008 5030 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789012 5030 flags.go:64] FLAG: --system-cgroups="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789016 5030 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789022 5030 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789026 5030 flags.go:64] FLAG: --tls-cert-file="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789031 5030 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789036 5030 flags.go:64] FLAG: --tls-min-version="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789040 5030 flags.go:64] FLAG: --tls-private-key-file="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789044 5030 flags.go:64] FLAG: --topology-manager-policy="none" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789049 5030 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789053 5030 flags.go:64] FLAG: --topology-manager-scope="container" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789057 5030 flags.go:64] 
FLAG: --v="2" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789062 5030 flags.go:64] FLAG: --version="false" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789068 5030 flags.go:64] FLAG: --vmodule="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789073 5030 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789077 5030 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789179 5030 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789183 5030 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789187 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789191 5030 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789194 5030 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789198 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789201 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789205 5030 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789208 5030 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789212 5030 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789215 5030 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789219 5030 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789222 5030 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789226 5030 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789230 5030 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789233 5030 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789238 5030 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
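The flags.go:64 block above is the kubelet echoing every command-line flag with its effective value at verbosity -v=2, which is the quickest place to confirm settings such as --config=/etc/kubernetes/kubelet.conf, --node-ip=192.168.126.11 or --register-with-taints=node-role.kubernetes.io/master=:NoSchedule. A sketch for pulling that dump into a dictionary, based only on the FLAG: --name="value" format shown here (the input file name is again hypothetical):

# Sketch: parse the kubelet's startup flag dump (flags.go:64 entries) into a
# dict for inspection. Input path is hypothetical.
import re

FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')

flags = {}
with open("kubelet.log") as log:
    for line in log:
        for name, value in FLAG_RE.findall(line):
            flags[name] = value

print(flags.get("--config"))                 # expected here: /etc/kubernetes/kubelet.conf
print(flags.get("--node-ip"))                # expected here: 192.168.126.11
print(flags.get("--register-with-taints"))   # expected here: node-role.kubernetes.io/master=:NoSchedule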
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789244 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789248 5030 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789252 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789256 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789261 5030 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789265 5030 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789268 5030 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789273 5030 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789278 5030 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789282 5030 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789285 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789289 5030 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789293 5030 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789296 5030 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789300 5030 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789303 5030 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789306 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789310 5030 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789313 5030 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789317 5030 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789320 5030 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789323 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789327 5030 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789330 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789333 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 
22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789337 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789340 5030 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789344 5030 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789347 5030 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789350 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789354 5030 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789357 5030 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789362 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789366 5030 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789369 5030 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789372 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789377 5030 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789381 5030 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789384 5030 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789388 5030 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789391 5030 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789395 5030 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789398 5030 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789402 5030 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789405 5030 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789408 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789412 5030 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789415 5030 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789420 5030 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789424 5030 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789428 5030 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789431 5030 feature_gate.go:330] unrecognized feature gate: Example Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789436 5030 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.789440 5030 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.789446 5030 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.805032 5030 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.805079 5030 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805219 5030 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805232 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805243 5030 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805254 5030 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805264 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805275 5030 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805287 5030 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805298 5030 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805310 5030 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805324 5030 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805334 5030 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805344 5030 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805354 5030 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805364 5030 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805375 5030 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805385 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805395 5030 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805405 5030 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805414 5030 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805425 5030 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805437 5030 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805445 5030 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805454 5030 feature_gate.go:330] unrecognized feature gate: Example Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805462 5030 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805471 5030 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805479 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805486 5030 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805494 5030 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805502 5030 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805510 5030 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805517 5030 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805526 5030 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805534 5030 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805542 5030 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805549 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805560 5030 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805572 5030 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805580 5030 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805589 5030 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805599 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805609 5030 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805617 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805654 5030 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805662 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805669 5030 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805677 5030 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805685 5030 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805695 5030 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805704 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805712 5030 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805719 5030 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805730 5030 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805740 5030 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805749 5030 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805758 5030 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805767 5030 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805775 5030 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805783 5030 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805791 5030 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805799 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805807 5030 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805815 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805823 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805831 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805838 5030 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805845 5030 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805855 5030 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805863 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805871 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805878 5030 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.805886 5030 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.805900 5030 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806135 5030 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806151 5030 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806159 5030 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 
20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806167 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806177 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806188 5030 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806201 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806211 5030 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806221 5030 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806231 5030 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806244 5030 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806254 5030 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806264 5030 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806273 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806282 5030 feature_gate.go:330] unrecognized feature gate: Example Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806290 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806298 5030 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806305 5030 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806313 5030 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806320 5030 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806328 5030 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806336 5030 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806343 5030 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806351 5030 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806361 5030 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806371 5030 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806381 5030 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806389 5030 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806398 5030 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806407 5030 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806416 5030 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806424 5030 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806432 5030 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806440 5030 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806448 5030 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806456 5030 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806463 5030 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806471 5030 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806479 5030 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806486 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806493 5030 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806502 5030 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806509 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806517 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806524 5030 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806532 5030 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806542 5030 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806551 5030 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806559 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806566 5030 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806577 5030 feature_gate.go:353] Setting GA 
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806587 5030 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806596 5030 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806605 5030 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806613 5030 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806649 5030 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806657 5030 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806665 5030 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806678 5030 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806687 5030 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806694 5030 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806703 5030 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806710 5030 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806718 5030 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806726 5030 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806734 5030 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806742 5030 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806750 5030 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806757 5030 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806765 5030 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.806772 5030 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.806785 5030 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.807060 5030 server.go:940] 
"Client rotation is on, will bootstrap in background" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.811839 5030 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.812014 5030 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.812887 5030 server.go:997] "Starting client certificate rotation" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.812915 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.813117 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-17 16:54:32.329037452 +0000 UTC Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.813218 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.821676 5030 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.822781 5030 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.824180 5030 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.833419 5030 log.go:25] "Validated CRI v1 runtime API" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.846708 5030 log.go:25] "Validated CRI v1 image API" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.848170 5030 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.850826 5030 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-20-22-30-49-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.850866 5030 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.881966 5030 manager.go:217] Machine: {Timestamp:2026-01-20 22:35:27.87978326 +0000 UTC m=+0.200043638 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 
AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ef8501f1-c822-4135-bd61-b33341e82dce BootID:8166daa9-857d-40f2-9d8b-f852d2b2e6fb Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:67:7c:8d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:67:7c:8d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:69:d5:da Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:71:99:e6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:65:b5:24 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:66:66:6f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:06:62:83:ea:22:0d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:7d:17:0f:18:b9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} 
{Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.882382 5030 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.882540 5030 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.883219 5030 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.883502 5030 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.883558 5030 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.883846 5030 topology_manager.go:138] "Creating topology manager with none policy" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.883861 5030 container_manager_linux.go:303] "Creating device plugin manager" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.884089 5030 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.884133 5030 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.884377 5030 state_mem.go:36] "Initialized new in-memory state store" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.884469 5030 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.885675 5030 kubelet.go:418] "Attempting to sync node with API server" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.885701 5030 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.885726 5030 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.885740 5030 kubelet.go:324] "Adding apiserver pod source" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.885753 5030 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.887457 5030 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.887766 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.887881 5030 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.887959 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.888017 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.888109 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.888812 5030 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889369 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889396 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889406 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889415 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889430 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889439 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889448 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889462 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889473 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889489 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889539 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889549 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.889574 5030 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 20 22:35:27 crc kubenswrapper[5030]: 
I0120 22:35:27.890019 5030 server.go:1280] "Started kubelet" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.890273 5030 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.890250 5030 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.890939 5030 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:27 crc systemd[1]: Started Kubernetes Kubelet. Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.892135 5030 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.895035 5030 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188c915056d53811 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 22:35:27.889987601 +0000 UTC m=+0.210247899,LastTimestamp:2026-01-20 22:35:27.889987601 +0000 UTC m=+0.210247899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.895749 5030 server.go:460] "Adding debug handlers to kubelet server" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.900008 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.900062 5030 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.900774 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:13:32.363461051 +0000 UTC Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.900896 5030 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.900939 5030 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.900947 5030 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.901168 5030 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.905213 5030 factory.go:55] Registering systemd factory Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.905182 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.905270 5030 factory.go:221] Registration of the systemd 
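
Every failure in this stretch of the log (the CSR post, the Node and Service reflectors, the CSINode lookup, the event write, and the lease request just below) ends in the same dial tcp 38.102.83.9:6443: connect: connection refused, i.e. the kubelet has started before the API server behind api-int.crc.testing:6443 is listening, and each component simply retries on its own. A quick way to make that single root cause visible, again against the hypothetical kubelet.log dump:

# Group the startup "connection refused" errors by request path to show they
# all target the same not-yet-started API server (hypothetical kubelet.log).
import re
from collections import Counter

paths = Counter()
with open("kubelet.log") as f:
    for line in f:
        if "connection refused" not in line:
            continue
        m = re.search(r'https://api-int\.crc\.testing:6443(/[^"\\?\s]*)', line)
        if m:
            paths[m.group(1)] += 1

for path, count in paths.most_common():
    print(f"{count:3d}  {path}")
# Expected: a handful of paths (certificatesigningrequests, nodes, services,
# csinodes, events, leases) all refused on 38.102.83.9:6443.
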
container factory successfully Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.905520 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.905879 5030 factory.go:153] Registering CRI-O factory Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.905909 5030 factory.go:221] Registration of the crio container factory successfully Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.905992 5030 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.906027 5030 factory.go:103] Registering Raw factory Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.906045 5030 manager.go:1196] Started watching for new ooms in manager Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.906778 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.908061 5030 manager.go:319] Starting recovery of all containers Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.913723 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.913872 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.913955 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914054 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914138 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914217 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" 
seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914305 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914405 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914505 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914597 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914737 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914826 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914904 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.914982 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915056 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915143 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915221 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc 
kubenswrapper[5030]: I0120 22:35:27.915321 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915436 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915550 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915721 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915882 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.915974 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.916118 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.916259 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.916426 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.916576 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.916765 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 
22:35:27.916920 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917037 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917164 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917267 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917371 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917505 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917596 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.917896 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918000 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918095 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918188 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 20 22:35:27 
crc kubenswrapper[5030]: I0120 22:35:27.918294 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918400 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918501 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918588 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918777 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918875 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.918954 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919033 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919122 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919202 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919291 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 20 
22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919380 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919457 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919551 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919655 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919769 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.919882 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920014 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920110 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920194 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920279 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920364 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 20 22:35:27 crc 
kubenswrapper[5030]: I0120 22:35:27.920445 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920519 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920599 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920702 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920779 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920856 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.920931 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921017 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921094 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921169 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921259 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921351 5030 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921442 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921521 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921596 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921696 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921783 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921860 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.921939 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922014 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922088 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922167 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922246 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922343 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922425 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922500 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922579 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922701 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922819 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.922929 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.923040 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.923136 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.923286 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.923425 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.923553 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.925451 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.925583 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.925687 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.925814 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.925936 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926037 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926129 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926208 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926325 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926445 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.926528 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.927378 5030 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.927504 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.927645 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.927817 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.927921 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928010 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928118 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928206 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928297 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928402 5030 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928533 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928609 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928718 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928808 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.928997 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929085 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929163 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929245 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929343 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929433 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929510 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929585 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929686 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929767 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929843 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.929933 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930013 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930090 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930165 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930255 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930343 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930429 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930503 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930585 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930679 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930771 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930861 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.930956 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931039 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931119 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931194 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931277 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931355 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931429 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931505 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931578 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931769 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931857 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.931940 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932025 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932113 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932197 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932273 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932367 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932454 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932530 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932614 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932721 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932805 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932879 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.932953 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933043 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933133 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933238 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933327 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933416 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933490 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933588 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933746 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933849 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.933958 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934052 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934140 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934226 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934378 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934465 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934552 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934696 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934797 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934875 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.934955 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935028 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935110 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935194 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935269 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935371 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935497 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935617 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935793 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935876 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.935955 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936030 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936105 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936180 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936267 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936364 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936453 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936529 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936615 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936775 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.936909 5030 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.937015 5030 reconstruct.go:97] "Volume reconstruction finished" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.937111 5030 reconciler.go:26] "Reconciler: start to sync state" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.941802 5030 manager.go:324] Recovery completed Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.955609 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.958467 5030 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.960700 5030 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.960751 5030 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.960794 5030 kubelet.go:2335] "Starting kubelet main sync loop" Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.960874 5030 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.961447 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.961560 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.961674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.962427 5030 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.962524 5030 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.962602 5030 state_mem.go:36] "Initialized new in-memory state store" Jan 20 22:35:27 crc kubenswrapper[5030]: W0120 22:35:27.962740 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:27 crc kubenswrapper[5030]: E0120 22:35:27.962826 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.973245 5030 policy_none.go:49] "None policy: Start" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.975008 5030 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 20 22:35:27 crc kubenswrapper[5030]: I0120 22:35:27.975138 5030 state_mem.go:35] "Initializing new in-memory state store" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.001234 5030 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.025775 5030 manager.go:334] "Starting Device Plugin manager" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.025821 5030 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.025831 5030 server.go:79] "Starting device plugin registration server" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.026727 5030 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.026744 5030 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.026929 5030 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" 
Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.026994 5030 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.027001 5030 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.032518 5030 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.061070 5030 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.061143 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062011 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062059 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062247 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062603 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.062711 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063130 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063150 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063218 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063429 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063485 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063859 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063913 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063920 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063928 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.063936 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064103 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064113 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064124 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064131 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064148 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.064176 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065329 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065362 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065500 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065533 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065641 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065711 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.065735 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066402 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066433 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066445 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066551 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066571 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066768 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066815 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.066831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.067252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.067274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.067286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.106165 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.127664 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.128811 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.128874 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.128892 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.128924 5030 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.129594 5030 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140157 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.140713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242390 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.242901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.244070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.243872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.244834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.244980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.245023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.245086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.245998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.246932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.330279 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.331555 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.331596 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.331609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.331653 5030 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.332151 5030 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.408041 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.432543 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.436658 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-53b5da04f8c6feca829a55bcc6d04f3258ba3480821a8e86ff5e31288b429ee1 WatchSource:0}: Error finding container 53b5da04f8c6feca829a55bcc6d04f3258ba3480821a8e86ff5e31288b429ee1: Status 404 returned error can't find the container with id 53b5da04f8c6feca829a55bcc6d04f3258ba3480821a8e86ff5e31288b429ee1 Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.444816 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.455016 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ac9b4c36251dbb4f3786243a739e811325ca01db5929a4edf2fc2656d4d38c4d WatchSource:0}: Error finding container ac9b4c36251dbb4f3786243a739e811325ca01db5929a4edf2fc2656d4d38c4d: Status 404 returned error can't find the container with id ac9b4c36251dbb4f3786243a739e811325ca01db5929a4edf2fc2656d4d38c4d Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.459862 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.459922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e0a492e4caa4b1ced9e45299226ee422653bfd03d75a4c15ad61478cec29ed5d WatchSource:0}: Error finding container e0a492e4caa4b1ced9e45299226ee422653bfd03d75a4c15ad61478cec29ed5d: Status 404 returned error can't find the container with id e0a492e4caa4b1ced9e45299226ee422653bfd03d75a4c15ad61478cec29ed5d Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.465774 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.486195 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-102c442919b0cd7c6535aa051280cd75d025dccba6e9f656b3539e0d74ff9867 WatchSource:0}: Error finding container 102c442919b0cd7c6535aa051280cd75d025dccba6e9f656b3539e0d74ff9867: Status 404 returned error can't find the container with id 102c442919b0cd7c6535aa051280cd75d025dccba6e9f656b3539e0d74ff9867 Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.494477 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a035db81569900bd9506e180239a8a583ea1c6c91ce607f2c4077c73cb434b57 WatchSource:0}: Error finding container a035db81569900bd9506e180239a8a583ea1c6c91ce607f2c4077c73cb434b57: Status 404 returned error can't find the container with id a035db81569900bd9506e180239a8a583ea1c6c91ce607f2c4077c73cb434b57 Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.506972 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.732354 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.733614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.733677 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.733687 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.733713 5030 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.734181 5030 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Jan 20 22:35:28 crc kubenswrapper[5030]: W0120 22:35:28.791756 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:28 crc kubenswrapper[5030]: E0120 22:35:28.792276 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.892451 5030 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.9:6443: connect: connection refused Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.901685 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 21:45:05.20971075 +0000 UTC Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.966334 5030 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92" exitCode=0 Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.966399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.966496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"102c442919b0cd7c6535aa051280cd75d025dccba6e9f656b3539e0d74ff9867"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.966580 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.967717 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.967746 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.967758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.968760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.968778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0a492e4caa4b1ced9e45299226ee422653bfd03d75a4c15ad61478cec29ed5d"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970027 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251" exitCode=0 Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac9b4c36251dbb4f3786243a739e811325ca01db5929a4edf2fc2656d4d38c4d"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970175 5030 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970851 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970871 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.970879 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.974355 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975222 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975254 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975267 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975504 5030 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="811170c82242f1ec0b3369ad9937e0382a120f28ea0fca77977546bfc7ad036c" exitCode=0 Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"811170c82242f1ec0b3369ad9937e0382a120f28ea0fca77977546bfc7ad036c"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53b5da04f8c6feca829a55bcc6d04f3258ba3480821a8e86ff5e31288b429ee1"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.975768 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.976688 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.976749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.976769 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.977160 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca" exitCode=0 Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.977192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.977252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a035db81569900bd9506e180239a8a583ea1c6c91ce607f2c4077c73cb434b57"} Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.977377 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.978270 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.978303 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:28 crc kubenswrapper[5030]: I0120 22:35:28.978317 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:29 crc kubenswrapper[5030]: W0120 22:35:29.076502 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:29 crc kubenswrapper[5030]: E0120 22:35:29.076591 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:29 crc kubenswrapper[5030]: W0120 22:35:29.076602 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Jan 20 22:35:29 crc kubenswrapper[5030]: E0120 22:35:29.076702 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Jan 20 22:35:29 crc kubenswrapper[5030]: E0120 22:35:29.307870 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.534475 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.535795 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.535821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.535830 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.535853 5030 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.848661 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: 
Rotating certificates Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.902542 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:27:43.468614369 +0000 UTC Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.984788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.984830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.984841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.984881 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.986182 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.986207 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.986216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.988456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.988539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.988570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.988602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.991090 5030 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="56ae8f4a64fdcda35f1334399907f4a271d22a22bf7476f42232c2418eb1bcdd" exitCode=0 Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.991196 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"56ae8f4a64fdcda35f1334399907f4a271d22a22bf7476f42232c2418eb1bcdd"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.991390 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.992658 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.992710 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.992729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.993229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7a087c314f74fd0dd33aa0b5e7c70d924ab48168afe3d66fff4728cb82efddbc"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.993307 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.994275 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.994295 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.994304 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.996189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.996212 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.996222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7"} Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.996373 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.997333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.997359 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:29 crc kubenswrapper[5030]: I0120 22:35:29.997369 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 20 22:35:30 crc kubenswrapper[5030]: I0120 22:35:30.902729 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:33:03.538778613 +0000 UTC Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.004772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945"} Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.004882 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.006196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.006410 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.006603 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.008553 5030 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cf9fef688f776640ac8ba48d52dfb230f00627bc443e9434315a0e882698b849" exitCode=0 Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.008709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cf9fef688f776640ac8ba48d52dfb230f00627bc443e9434315a0e882698b849"} Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.008745 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.008932 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.009737 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.009953 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.010194 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.010126 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.010510 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.010536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.569965 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.903876 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 
+0000 UTC, rotation deadline is 2025-11-08 07:46:14.415511338 +0000 UTC Jan 20 22:35:31 crc kubenswrapper[5030]: I0120 22:35:31.993907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5fd2698d3f37b612fa931723dc7c83f37357623c820b444f7c798af21a317df"} Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90675c9f112b6084c079fe9ecd1f6ef7be53cf2473573ce9d6adad0ac2bff566"} Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05d82bc092c5e1b022aa38c85d4386c3a18be024acf89e3e45655e5a6a3550ad"} Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"85a1b8398310790bcb556fba00104d24fb2b286a512e6dd91b4d2da472bfacc2"} Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019793 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019794 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.019982 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021364 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021408 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021425 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021376 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.021487 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:32 crc kubenswrapper[5030]: I0120 22:35:32.904235 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 22:47:58.102464496 +0000 UTC Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.020194 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.029707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2aee9683d24830cf61fc3ca22fb4cf76089af1550b816de40661b19ff76072db"} Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.029807 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.029878 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.029904 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.029928 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031841 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031881 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031895 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031843 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031930 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031950 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031964 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.031992 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.246188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 20 22:35:33 crc kubenswrapper[5030]: I0120 22:35:33.904983 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:30:12.217798085 +0000 UTC Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.033184 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.035254 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.035342 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.035366 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.615188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 20 22:35:34 crc kubenswrapper[5030]: I0120 22:35:34.906466 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 22:14:04.471482164 +0000 UTC Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.036323 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.037682 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.037760 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.037780 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.126311 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.126575 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.128330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.128377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.128393 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.729592 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.881006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.881278 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.883146 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.883239 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.883258 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:35 crc kubenswrapper[5030]: I0120 22:35:35.907882 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 05:36:54.630188328 +0000 UTC Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.039519 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.039523 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041237 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041299 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041537 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041606 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.041623 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:36 crc kubenswrapper[5030]: I0120 22:35:36.908321 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 06:54:33.718183314 +0000 UTC Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.864872 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.865064 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.866218 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.866245 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.866261 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:37 crc kubenswrapper[5030]: I0120 22:35:37.909184 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 12:23:12.110807973 +0000 UTC Jan 20 22:35:38 crc kubenswrapper[5030]: E0120 22:35:38.032588 5030 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.676318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.676512 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.677799 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.677844 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.677861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.682376 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:38 crc kubenswrapper[5030]: I0120 22:35:38.910043 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 19:33:27.376148459 +0000 UTC Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.046108 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.046983 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.047012 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.047020 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.050419 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:39 crc kubenswrapper[5030]: W0120 22:35:39.484409 5030 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.484549 5030 trace.go:236] Trace[1359696742]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 22:35:29.482) (total time: 10001ms): Jan 20 22:35:39 crc kubenswrapper[5030]: Trace[1359696742]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (22:35:39.484) Jan 20 22:35:39 crc kubenswrapper[5030]: Trace[1359696742]: [10.001595265s] [10.001595265s] END Jan 20 22:35:39 crc kubenswrapper[5030]: E0120 22:35:39.484586 5030 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 22:35:39 crc kubenswrapper[5030]: E0120 22:35:39.537255 5030 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Jan 20 22:35:39 crc kubenswrapper[5030]: E0120 22:35:39.850420 5030 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.893080 5030 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 20 22:35:39 crc kubenswrapper[5030]: I0120 22:35:39.910477 5030 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 14:34:10.770380472 +0000 UTC Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.048808 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.050174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.050209 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.050221 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.865933 5030 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.866067 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 22:35:40 crc kubenswrapper[5030]: E0120 22:35:40.909269 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.911516 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 16:27:09.479328974 +0000 UTC Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.963206 5030 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.963274 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 22:35:40.972357 5030 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 22:35:40 crc kubenswrapper[5030]: I0120 
22:35:40.972439 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.138349 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.140669 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.140709 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.140719 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.140746 5030 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 22:35:41 crc kubenswrapper[5030]: I0120 22:35:41.912311 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 08:02:23.635017347 +0000 UTC Jan 20 22:35:42 crc kubenswrapper[5030]: I0120 22:35:42.913077 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 23:12:28.461490134 +0000 UTC Jan 20 22:35:43 crc kubenswrapper[5030]: I0120 22:35:43.859292 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 22:35:43 crc kubenswrapper[5030]: I0120 22:35:43.881108 5030 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 22:35:43 crc kubenswrapper[5030]: I0120 22:35:43.913666 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:47:51.419626499 +0000 UTC Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.658084 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.658330 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.659774 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.659825 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.659843 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.681340 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 20 22:35:44 crc kubenswrapper[5030]: I0120 22:35:44.913784 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 00:32:35.30012622 +0000 UTC Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.060248 5030 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.061338 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.061380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.061390 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.738471 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.738811 5030 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.740423 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.740485 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.740509 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.745875 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.914883 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:03:52.935585609 +0000 UTC Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.965897 5030 trace.go:236] Trace[491266246]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 22:35:31.838) (total time: 14127ms): Jan 20 22:35:45 crc kubenswrapper[5030]: Trace[491266246]: ---"Objects listed" error: 14127ms (22:35:45.965) Jan 20 22:35:45 crc kubenswrapper[5030]: Trace[491266246]: [14.127384053s] [14.127384053s] END Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.965957 5030 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.966312 5030 trace.go:236] Trace[829132186]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 22:35:31.711) (total time: 14254ms): Jan 20 22:35:45 crc kubenswrapper[5030]: Trace[829132186]: ---"Objects listed" error: 14254ms (22:35:45.966) Jan 20 22:35:45 crc kubenswrapper[5030]: Trace[829132186]: [14.254929553s] [14.254929553s] END Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.966343 5030 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.966447 5030 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.966800 5030 trace.go:236] Trace[1915938179]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 22:35:31.590) (total time: 14376ms): Jan 20 22:35:45 crc kubenswrapper[5030]: Trace[1915938179]: ---"Objects listed" error: 14376ms (22:35:45.966) Jan 20 22:35:45 crc kubenswrapper[5030]: 
Trace[1915938179]: [14.376579167s] [14.376579167s] END Jan 20 22:35:45 crc kubenswrapper[5030]: I0120 22:35:45.966841 5030 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.062152 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.156063 5030 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.156425 5030 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.159070 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.159126 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.159142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.159164 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.159180 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.175503 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.175805 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.179975 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.180018 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.180028 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.180043 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.180055 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.195842 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.199830 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.199873 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.199882 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.199896 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.199905 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.222975 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.227344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.227380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.227391 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.227409 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.227422 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.244358 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.254189 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.254221 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.254229 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.254242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.254276 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.270120 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.270230 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.271602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.271635 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.271643 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.271656 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.271666 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.301676 5030 csr.go:261] certificate signing request csr-pkvbx is approved, waiting to be issued Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.318168 5030 csr.go:257] certificate signing request csr-pkvbx is issued Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.374087 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.374120 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.374129 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.374141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.374151 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.475791 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.475842 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.475850 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.475863 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.475873 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.578791 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.578831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.578841 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.578859 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.578870 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.681198 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.681235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.681244 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.681257 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.681267 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.758703 5030 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.783797 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.783846 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.783858 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.783875 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.783888 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.886403 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.886446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.886459 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.886473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.886483 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.901823 5030 apiserver.go:52] "Watching apiserver" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.905349 5030 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.905752 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-machine-config-operator/machine-config-daemon-qb97t","openshift-multus/multus-n8v4f","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-5zbvj","openshift-image-registry/node-ca-fqn26","openshift-multus/multus-additional-cni-plugins-zzvtq","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906080 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906133 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906190 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906634 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906705 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906782 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.906780 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.906811 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:46 crc kubenswrapper[5030]: E0120 22:35:46.906870 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.906984 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.907114 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n8v4f" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.907453 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.907562 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.907595 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.912640 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.912660 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.912729 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.912783 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.912795 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.914972 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.914983 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915066 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 19:00:12.432684147 +0000 UTC Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915197 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915799 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915873 5030 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915995 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.915801 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.916045 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.916256 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.918479 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.918956 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.919002 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.919661 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.919867 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.920210 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.924213 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.924652 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.924767 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.925046 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.925102 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.918966 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.925854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.925621 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.960911 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.971320 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.983371 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.988226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.988274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.988285 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.988306 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.988316 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:46Z","lastTransitionTime":"2026-01-20T22:35:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:46 crc kubenswrapper[5030]: I0120 22:35:46.993036 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.003137 5030 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.004221 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.020895 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.029648 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.040563 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.048150 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.063802 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.068681 5030 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074061 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074220 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074275 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074515 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074645 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074793 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.074959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075345 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075434 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: 
"v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075610 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") 
pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075779 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.075974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076023 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076095 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076115 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 22:35:47 crc kubenswrapper[5030]: 
I0120 22:35:47.076408 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076646 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076719 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 
22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 22:35:47 crc 
kubenswrapper[5030]: I0120 22:35:47.076936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076958 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077623 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078808 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078866 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078945 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 
22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079954 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081152 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081262 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081637 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.076939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077319 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077436 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.077603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.078044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.079927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.080962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.081943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082220 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.088073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082594 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.082953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083205 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.083908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.084053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.084081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.084200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.084341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.084400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.085100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.085861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.086542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.087295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.088560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.087451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.087844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.088712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.088943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.089035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.089201 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.089275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.089469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.090283 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.090726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.090888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092462 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092680 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.092832 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:35:47.592809938 +0000 UTC m=+19.913070226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.092859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093247 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093887 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.093988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-bin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-etc-kubernetes\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " 
pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdmk\" (UniqueName: \"kubernetes.io/projected/c8ba00cb-4776-41c1-87d9-9ba37894c69e-kube-api-access-4rdmk\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-cnibin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094141 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-os-release\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-conf-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094179 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8ba00cb-4776-41c1-87d9-9ba37894c69e-serviceca\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094202 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-cnibin\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-system-cni-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: 
\"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-socket-dir-parent\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-multus-certs\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-kubelet\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczpq\" (UniqueName: \"kubernetes.io/projected/02009b42-ac8c-4b5e-8c69-9102da70a748-kube-api-access-qczpq\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094376 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-system-cni-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-cni-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094406 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-binary-copy\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094439 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-hostroot\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9mn\" (UniqueName: \"kubernetes.io/projected/7e610661-5072-4aa0-b1f1-75410b7f663b-kube-api-access-xj9mn\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094643 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpgp\" (UniqueName: \"kubernetes.io/projected/18c97a93-96a8-40e8-9f36-513f91962906-kube-api-access-ljpgp\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a963db-b558-4e9a-8e56-12636c3fe1c2-rootfs\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-netns\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 
22:35:47.094757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a963db-b558-4e9a-8e56-12636c3fe1c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-daemon-config\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pmb\" (UniqueName: \"kubernetes.io/projected/a2a963db-b558-4e9a-8e56-12636c3fe1c2-kube-api-access-p7pmb\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba00cb-4776-41c1-87d9-9ba37894c69e-host\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02009b42-ac8c-4b5e-8c69-9102da70a748-hosts-file\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-k8s-cni-cncf-io\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-multus\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.094991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-cni-binary-copy\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-os-release\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095032 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2a963db-b558-4e9a-8e56-12636c3fe1c2-proxy-tls\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095144 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095157 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095167 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095178 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095188 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095198 5030 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095208 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095218 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095226 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095241 5030 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095251 5030 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095262 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095273 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095284 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095293 5030 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095303 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095312 5030 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095323 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095333 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095343 5030 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095353 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095362 5030 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095371 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095390 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095400 5030 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095409 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095419 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095429 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095439 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095449 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095486 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095516 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095531 5030 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.095991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096000 5030 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096026 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096657 5030 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096670 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096684 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096696 5030 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096708 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.096768 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096781 5030 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.096854 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.096917 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.097977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098919 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.098951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099256 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099296 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099305 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.099810 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.100312 5030 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.100391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.100808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.100987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.101176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.101556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.101668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.101962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.102581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.102652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.102749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.102849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.103821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.104050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.104579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.104710 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.104712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.104786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105009 5030 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105034 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105082 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105097 5030 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105112 5030 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105128 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.105137 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:35:47.605118259 +0000 UTC m=+19.925378537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.105828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.106890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.107108 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:47.607096856 +0000 UTC m=+19.927357144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.107398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.107483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110288 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110680 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110417 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.110821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.111162 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.111201 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.111310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.111405 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.111421 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.111435 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.111500 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:47.611483069 +0000 UTC m=+19.931743357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.112000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.113949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114861 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114899 5030 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114953 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114980 5030 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.114996 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115008 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115021 5030 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115036 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115047 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115057 5030 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115068 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115084 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115094 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115104 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115117 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115130 5030 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115140 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115149 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115162 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115171 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115180 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115190 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115203 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115213 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115224 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115233 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115249 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115258 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115269 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115284 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115298 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115313 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115326 5030 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115344 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115357 5030 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115370 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115383 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115400 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115414 5030 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115519 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.116323 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernet
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.119022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.115427 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.120959 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.120953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.120987 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.121004 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.121055 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:47.621034665 +0000 UTC m=+19.941294953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.120420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123519 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123563 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123578 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123592 5030 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123603 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123613 5030 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123644 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123661 5030 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123673 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123683 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" 
Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123693 5030 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123704 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123713 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123722 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123732 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123741 5030 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.123749 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.124001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.127289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.128264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.128955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.130074 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.130330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.130579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.131106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.135582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.135845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.135940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.135951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.155042 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.159070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.167861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.201810 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.201842 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.201852 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.201867 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.201878 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-conf-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8ba00cb-4776-41c1-87d9-9ba37894c69e-serviceca\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-cnibin\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-socket-dir-parent\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-multus-certs\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-system-cni-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-system-cni-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-cni-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-kubelet\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225281 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qczpq\" (UniqueName: \"kubernetes.io/projected/02009b42-ac8c-4b5e-8c69-9102da70a748-kube-api-access-qczpq\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-binary-copy\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-hostroot\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9mn\" (UniqueName: \"kubernetes.io/projected/7e610661-5072-4aa0-b1f1-75410b7f663b-kube-api-access-xj9mn\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpgp\" (UniqueName: \"kubernetes.io/projected/18c97a93-96a8-40e8-9f36-513f91962906-kube-api-access-ljpgp\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a963db-b558-4e9a-8e56-12636c3fe1c2-rootfs\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-netns\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a963db-b558-4e9a-8e56-12636c3fe1c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225467 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-daemon-config\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba00cb-4776-41c1-87d9-9ba37894c69e-host\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pmb\" (UniqueName: \"kubernetes.io/projected/a2a963db-b558-4e9a-8e56-12636c3fe1c2-kube-api-access-p7pmb\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-k8s-cni-cncf-io\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-multus\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02009b42-ac8c-4b5e-8c69-9102da70a748-hosts-file\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-cni-binary-copy\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-os-release\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225603 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a2a963db-b558-4e9a-8e56-12636c3fe1c2-proxy-tls\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-cnibin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-os-release\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-bin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-etc-kubernetes\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdmk\" (UniqueName: \"kubernetes.io/projected/c8ba00cb-4776-41c1-87d9-9ba37894c69e-kube-api-access-4rdmk\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225764 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225779 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225788 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225796 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225805 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225814 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225823 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225834 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225842 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225851 5030 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225860 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225870 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225879 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225888 5030 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225897 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225905 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225914 5030 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225923 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225931 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225938 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225946 5030 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225954 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225962 5030 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225971 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.225992 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226000 5030 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226009 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226018 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226028 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226037 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226047 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226055 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226063 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226072 5030 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226082 5030 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226090 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226098 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226106 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226115 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226124 5030 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226133 5030 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226142 5030 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226151 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226159 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226168 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226177 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226186 5030 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226195 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226204 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226213 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226221 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226229 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226237 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226245 5030 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226253 5030 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226261 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226269 5030 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226277 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226286 5030 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226294 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226302 5030 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226310 5030 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226318 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226327 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226335 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226343 5030 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226352 5030 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226360 5030 reconciler_common.go:293] "Volume detached 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226369 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226377 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226385 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226394 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226403 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226414 5030 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226425 5030 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226436 5030 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226448 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226459 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226469 5030 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226479 5030 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226488 5030 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226498 5030 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226506 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226514 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226523 5030 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226532 5030 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226541 5030 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226549 5030 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-os-release\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-cnibin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.226968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-hostroot\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227049 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-system-cni-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-os-release\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-kubelet\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-etc-kubernetes\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c8ba00cb-4776-41c1-87d9-9ba37894c69e-host\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-k8s-cni-cncf-io\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-bin\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-cnibin\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-cni-dir\") pod 
\"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-system-cni-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-conf-dir\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-multus-certs\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02009b42-ac8c-4b5e-8c69-9102da70a748-hosts-file\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-cni-binary-copy\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a2a963db-b558-4e9a-8e56-12636c3fe1c2-rootfs\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-run-netns\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-host-var-lib-cni-multus\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-socket-dir-parent\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.227999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.228004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e610661-5072-4aa0-b1f1-75410b7f663b-multus-daemon-config\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.228194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a2a963db-b558-4e9a-8e56-12636c3fe1c2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.228238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18c97a93-96a8-40e8-9f36-513f91962906-cni-binary-copy\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.228277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c8ba00cb-4776-41c1-87d9-9ba37894c69e-serviceca\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.228304 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18c97a93-96a8-40e8-9f36-513f91962906-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.234713 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.240038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2a963db-b558-4e9a-8e56-12636c3fe1c2-proxy-tls\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.249070 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-217bcf2ac87cef733dec7e581d924db38d5c94b45e952f55303b3e68e056276d WatchSource:0}: Error finding container 217bcf2ac87cef733dec7e581d924db38d5c94b45e952f55303b3e68e056276d: Status 404 returned error can't find the container with id 217bcf2ac87cef733dec7e581d924db38d5c94b45e952f55303b3e68e056276d Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.249567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdmk\" (UniqueName: \"kubernetes.io/projected/c8ba00cb-4776-41c1-87d9-9ba37894c69e-kube-api-access-4rdmk\") pod \"node-ca-fqn26\" (UID: \"c8ba00cb-4776-41c1-87d9-9ba37894c69e\") " pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.256149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pmb\" (UniqueName: \"kubernetes.io/projected/a2a963db-b558-4e9a-8e56-12636c3fe1c2-kube-api-access-p7pmb\") pod \"machine-config-daemon-qb97t\" (UID: \"a2a963db-b558-4e9a-8e56-12636c3fe1c2\") " pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.257044 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.259349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczpq\" (UniqueName: \"kubernetes.io/projected/02009b42-ac8c-4b5e-8c69-9102da70a748-kube-api-access-qczpq\") pod \"node-resolver-5zbvj\" (UID: \"02009b42-ac8c-4b5e-8c69-9102da70a748\") " pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.271415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9mn\" (UniqueName: \"kubernetes.io/projected/7e610661-5072-4aa0-b1f1-75410b7f663b-kube-api-access-xj9mn\") pod \"multus-n8v4f\" (UID: \"7e610661-5072-4aa0-b1f1-75410b7f663b\") " pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.276558 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5zbvj" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.279256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpgp\" (UniqueName: \"kubernetes.io/projected/18c97a93-96a8-40e8-9f36-513f91962906-kube-api-access-ljpgp\") pod \"multus-additional-cni-plugins-zzvtq\" (UID: \"18c97a93-96a8-40e8-9f36-513f91962906\") " pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.281776 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fqn26" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.302483 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kq4dw"] Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.303298 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.306606 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.307833 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.308058 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.308307 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.308536 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.307547 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.307589 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.309307 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.309424 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.314296 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.314716 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.314935 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.318818 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-20 22:30:46 +0000 UTC, rotation deadline is 2026-11-27 05:34:47.621589631 +0000 UTC Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.318879 5030 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7446h59m0.302712525s for next certificate rotation Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wk7p\" (UniqueName: \"kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.326991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327064 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327098 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327170 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.327439 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.348872 5030 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.362641 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.390971 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.403348 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.416670 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.418739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.418769 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.418781 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.418798 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.418811 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.428978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc 
kubenswrapper[5030]: I0120 22:35:47.428996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429045 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wk7p\" (UniqueName: \"kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429148 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429339 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.429963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430623 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.430695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.431323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.433860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.435513 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.445991 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.455119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wk7p\" (UniqueName: \"kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p\") pod \"ovnkube-node-kq4dw\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.460042 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.476329 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.488248 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.501015 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.512019 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.521130 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.521181 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.521193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.521237 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.521250 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.526385 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.540898 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.548606 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n8v4f" Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.560764 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a963db_b558_4e9a_8e56_12636c3fe1c2.slice/crio-d5b5544e2e5f6c67fe57d77e66f1ee0db9d205c46847ea381554300afbbd3154 WatchSource:0}: Error finding container d5b5544e2e5f6c67fe57d77e66f1ee0db9d205c46847ea381554300afbbd3154: Status 404 returned error can't find the container with id d5b5544e2e5f6c67fe57d77e66f1ee0db9d205c46847ea381554300afbbd3154 Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.566442 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.579219 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e610661_5072_4aa0_b1f1_75410b7f663b.slice/crio-efdead91c0128c6970dd934c386080fcc77ffe5a94818b7587900d5c14282d08 WatchSource:0}: Error finding container efdead91c0128c6970dd934c386080fcc77ffe5a94818b7587900d5c14282d08: Status 404 returned error can't find the container with id efdead91c0128c6970dd934c386080fcc77ffe5a94818b7587900d5c14282d08 Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.623671 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.623698 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.623707 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.623721 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.623730 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.630945 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.631337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.631508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.631554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631568 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:35:48.631546247 +0000 UTC m=+20.951806535 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.631641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.631687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631711 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631811 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:47 crc 
kubenswrapper[5030]: E0120 22:35:47.631811 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631858 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631864 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:48.631852974 +0000 UTC m=+20.952113262 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631876 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631936 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:48.631917756 +0000 UTC m=+20.952178094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631828 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631738 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.631964 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.632004 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:35:48.631995558 +0000 UTC m=+20.952255936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:47 crc kubenswrapper[5030]: E0120 22:35:47.632026 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:48.632015539 +0000 UTC m=+20.952275927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.661970 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf449d4a_2037_4802_8370_1965d3026c07.slice/crio-beeda922fc880b32e36c237db8ea60c6490fde2c801311a1fae103837e897a85 WatchSource:0}: Error finding container beeda922fc880b32e36c237db8ea60c6490fde2c801311a1fae103837e897a85: Status 404 returned error can't find the container with id beeda922fc880b32e36c237db8ea60c6490fde2c801311a1fae103837e897a85 Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.731226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.731266 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.731277 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.731293 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.731304 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.814144 5030 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814414 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814459 5030 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814537 5030 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814561 5030 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814569 5030 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814664 5030 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814683 5030 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814707 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814682 5030 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814711 5030 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no 
items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814733 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814721 5030 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814760 5030 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814806 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814601 5030 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814811 5030 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814836 5030 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814585 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814853 5030 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814762 5030 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814870 5030 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": 
watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814877 5030 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814893 5030 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814768 5030 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814903 5030 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814800 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814813 5030 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814823 5030 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814821 5030 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814836 5030 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814941 5030 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc 
kubenswrapper[5030]: W0120 22:35:47.814847 5030 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814855 5030 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.814870 5030 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: W0120 22:35:47.815313 5030 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.833297 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.833334 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.833343 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.833357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.833367 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.868527 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.873506 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.885759 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.896078 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.915671 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:09:14.238721656 +0000 UTC Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.928959 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.935772 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.935823 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.935836 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:47 crc kubenswrapper[5030]: 
I0120 22:35:47.935853 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.936175 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:47Z","lastTransitionTime":"2026-01-20T22:35:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.966620 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.966678 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.967278 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.968334 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.969006 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.969996 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.970534 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.971169 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.972085 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.972723 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.973567 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.974055 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.975191 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.975701 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.976564 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.977145 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.977991 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.978611 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.978993 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.980012 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.980590 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.981084 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.981145 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.982015 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.982440 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.983499 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.984034 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.985080 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.985829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.986281 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.987281 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.987933 5030 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.988783 5030 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.988892 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.990528 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.991590 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.992043 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.993912 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.993945 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.995209 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.995913 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.996711 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.998035 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.998548 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 20 22:35:47 crc kubenswrapper[5030]: I0120 22:35:47.999848 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.001065 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.001779 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.002875 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.003530 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.004756 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.005517 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.006384 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.006909 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.007366 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.008094 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.008283 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.008861 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.009761 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.020150 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.037336 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: 
I0120 22:35:48.038279 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.038309 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.038320 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.038335 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.038347 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.052778 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.064800 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.068679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.069136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"51eea5ed76fb5184717152009e441c1bd22e9c70f980089d40f9e0c0e85754e4"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.070462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fqn26" event={"ID":"c8ba00cb-4776-41c1-87d9-9ba37894c69e","Type":"ContainerStarted","Data":"74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.070496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fqn26" event={"ID":"c8ba00cb-4776-41c1-87d9-9ba37894c69e","Type":"ContainerStarted","Data":"26fa6ab27b011105fb3a8a0a48586ee23cd98de0720a5294d637a209bb758c00"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.075498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5zbvj" event={"ID":"02009b42-ac8c-4b5e-8c69-9102da70a748","Type":"ContainerStarted","Data":"af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.075542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5zbvj" event={"ID":"02009b42-ac8c-4b5e-8c69-9102da70a748","Type":"ContainerStarted","Data":"52b9654dca8949c3dad34e904df027cb6c501720195e75b7e8ae6fe0f277f7e0"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.077644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerStarted","Data":"e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.077685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" 
event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerStarted","Data":"dc808e9f8095657d2abbd0b1a7b84145433483f013f515a9e418882fee108677"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.079716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.079757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.079768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"217bcf2ac87cef733dec7e581d924db38d5c94b45e952f55303b3e68e056276d"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.081078 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.081599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.081641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.081652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"d5b5544e2e5f6c67fe57d77e66f1ee0db9d205c46847ea381554300afbbd3154"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.082639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c1127f882d51e81d281527b281ebeef2cd91e333e247e742d36f57f215feaf6f"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.083795 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" exitCode=0 Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.083852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.083870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"beeda922fc880b32e36c237db8ea60c6490fde2c801311a1fae103837e897a85"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.084983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerStarted","Data":"14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.085077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerStarted","Data":"efdead91c0128c6970dd934c386080fcc77ffe5a94818b7587900d5c14282d08"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.095526 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.109305 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.120242 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.133011 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.144947 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.144986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.144998 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.145018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.145031 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.148558 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.170738 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.186133 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.216614 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.227817 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.242622 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.247509 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.247553 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.247561 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.247575 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.247584 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.254664 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.267279 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.277973 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.290799 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.302575 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.320893 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.335545 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.349522 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.350532 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.350570 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.350583 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.350601 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.350610 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.360151 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.374243 5030 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf
47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.386512 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.406093 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.416533 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.428876 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.440330 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.452546 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.452589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.452602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.452637 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.452651 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.458436 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.470230 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.496664 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.531833 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.554952 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.554992 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.555004 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.555021 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.555033 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.574712 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.643205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.643309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.643333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.643351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.643380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643491 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643505 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643508 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643531 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643558 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643552 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643573 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643587 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:50.64356854 +0000 UTC m=+22.963828818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643515 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643858 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:50.643790395 +0000 UTC m=+22.964050723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643898 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:50.643885087 +0000 UTC m=+22.964145415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.643987 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:50.643915498 +0000 UTC m=+22.964175806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.644035 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:35:50.64402106 +0000 UTC m=+22.964281718 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.658454 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.658504 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.658522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.658547 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.658564 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.696201 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.706783 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.731489 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.738458 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.761532 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.761583 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.761597 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.761645 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.761659 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.820816 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.852980 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.863477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.863510 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.863518 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.863531 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.863542 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.900353 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.900679 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.908179 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.914791 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.916112 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 00:02:37.710240251 +0000 UTC Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.957526 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.961316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.961361 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.961391 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.961424 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.961490 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:48 crc kubenswrapper[5030]: E0120 22:35:48.961598 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.965958 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.965990 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.966001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.966018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.966029 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:48Z","lastTransitionTime":"2026-01-20T22:35:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.971910 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 22:35:48 crc kubenswrapper[5030]: I0120 22:35:48.987148 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.035893 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.040233 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.042370 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.056513 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.065119 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.068903 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.068945 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.068954 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.068968 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.068978 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.092396 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4" exitCode=0 Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.092478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.094836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.102698 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.106064 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.106291 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.108881 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.122788 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc 
kubenswrapper[5030]: I0120 22:35:49.139961 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.157105 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.180115 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.180144 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.180152 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.180168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.180176 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.183699 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.213419 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.223783 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.244263 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.265647 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.282612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.282658 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc 
kubenswrapper[5030]: I0120 22:35:49.282669 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.282685 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.282697 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.307739 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.323305 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.343911 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.363275 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.385114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.385156 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.385167 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.385183 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.385195 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.395359 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.403686 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.423475 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.443604 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.463906 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.483828 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.487978 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 
22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.488023 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.488033 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.488053 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.488064 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.523994 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.553327 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a
8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.590144 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.590180 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.590204 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.590217 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.590227 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.591682 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.633207 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.671867 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.693076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.693117 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.693133 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.693149 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.693161 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.710208 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.758139 5030 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf
47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.792082 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.796274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.796318 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.796331 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.796350 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.796361 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.839342 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c
9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:49Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.899000 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.899038 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.899051 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.899068 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.899079 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:49Z","lastTransitionTime":"2026-01-20T22:35:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:49 crc kubenswrapper[5030]: I0120 22:35:49.917135 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:14:11.451468987 +0000 UTC Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.000936 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.000962 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.000972 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.000985 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.000996 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102648 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102684 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102695 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.102704 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.104205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.105845 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d" exitCode=0 Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.105879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.205662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.205713 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.205730 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.205753 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.205770 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.286582 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c
9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309454 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309518 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309530 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309559 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.309264 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.330874 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.346080 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.360191 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.380982 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc 
kubenswrapper[5030]: I0120 22:35:50.395499 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.409891 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.411548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.411598 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.411611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.411654 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.411668 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.438107 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.476215 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.498963 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.508178 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.513652 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.513676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.513684 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.513696 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.513704 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.520464 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.530075 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.539452 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f6
1330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.548901 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.558265 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.567564 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.591560 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.616289 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.616334 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.616345 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.616361 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.616371 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.631909 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.662678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.662840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.662854 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 22:35:54.662831902 +0000 UTC m=+26.983092190 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.662893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.662947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.663019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663054 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663099 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663115 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663119 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663128 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663141 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:35:54.663117118 +0000 UTC m=+26.983377486 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663119 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663167 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:54.663152979 +0000 UTC m=+26.983413287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663171 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663191 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663195 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:54.66318331 +0000 UTC m=+26.983443708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.663213 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:35:54.663206372 +0000 UTC m=+26.983466660 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.671305 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.710993 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.719279 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.719362 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.719385 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.719408 5030 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.719424 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.759067 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\
",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.794601 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.822283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.822331 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.822348 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.822371 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.822388 5030 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.847280 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z 
is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.881461 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.917106 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.917377 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 19:27:15.597879597 +0000 UTC Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.925164 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.925201 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.925212 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.925229 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.925242 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:50Z","lastTransitionTime":"2026-01-20T22:35:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.951815 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:50Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.961031 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.961069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:50 crc kubenswrapper[5030]: I0120 22:35:50.961069 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.961210 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.961320 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:50 crc kubenswrapper[5030]: E0120 22:35:50.961443 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.028569 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.028654 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.028674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.028703 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.028723 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.131719 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.131784 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.131802 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.131826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.131845 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.233987 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.234028 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.234039 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.234060 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.234073 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.337786 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.337839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.337850 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.337866 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.337881 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.440203 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.440245 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.440257 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.440276 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.440288 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.543490 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.543525 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.543534 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.543546 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.543556 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.645568 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.645609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.645639 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.645658 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.645669 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.747598 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.747663 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.747676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.747695 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.747717 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.850498 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.850533 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.850542 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.850565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.850575 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.918419 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 03:49:34.14785237 +0000 UTC Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.952111 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.952154 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.952167 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.952183 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:51 crc kubenswrapper[5030]: I0120 22:35:51.952195 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:51Z","lastTransitionTime":"2026-01-20T22:35:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.055372 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.055413 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.055422 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.055436 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.055446 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.114154 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1" exitCode=0 Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.114230 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.118403 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.136999 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.151200 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.157710 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.157752 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.157762 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.157777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.157788 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.166134 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.178227 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.196992 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.232206 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.248059 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.259840 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.260216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.260299 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.260312 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.260326 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.260335 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.275473 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.294238 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.308934 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.319599 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.331507 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.340832 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.362885 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.362919 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.362927 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.362940 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.362949 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.465938 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.465970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.465980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.465993 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.466001 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.568516 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.568600 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.568672 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.568705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.568728 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.670485 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.670526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.670536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.670550 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.670560 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.776584 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.776709 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.776735 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.776767 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.776792 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.879787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.879823 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.879835 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.879854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.879866 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.919509 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 15:47:10.632711182 +0000 UTC Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.962108 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.962173 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:52 crc kubenswrapper[5030]: E0120 22:35:52.962264 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:52 crc kubenswrapper[5030]: E0120 22:35:52.962383 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.962798 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:52 crc kubenswrapper[5030]: E0120 22:35:52.962926 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.982382 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.982431 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.982449 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.982473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:52 crc kubenswrapper[5030]: I0120 22:35:52.982489 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:52Z","lastTransitionTime":"2026-01-20T22:35:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.085040 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.085089 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.085101 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.085120 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.085132 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.124896 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2" exitCode=0 Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.124972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.138696 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7
pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.156311 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name
\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.167299 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.178894 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.187344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.187380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.187391 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.187408 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.187418 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.189075 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.198641 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.210355 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.222482 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.242360 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.257373 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-
kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.272184 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.282482 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.289570 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.289594 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.289605 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.289646 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.289662 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.294406 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.308472 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e
75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.392157 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.392395 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.392595 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.392761 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 
22:35:53.392857 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.495891 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.496176 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.496267 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.496359 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.496432 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.598908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.598950 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.598967 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.598986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.599001 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.702107 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.702179 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.702196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.702221 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.702244 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.804905 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.804975 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.804998 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.805032 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.805054 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.908679 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.908748 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.908814 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.908856 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.908880 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:53Z","lastTransitionTime":"2026-01-20T22:35:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:53 crc kubenswrapper[5030]: I0120 22:35:53.921465 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:25:35.286430643 +0000 UTC Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.011840 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.011900 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.011921 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.011950 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.011971 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.114535 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.114608 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.114676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.114727 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.114747 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.133368 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1" exitCode=0 Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.133412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.151790 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.168663 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.183537 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.197279 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.211150 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.217783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.217811 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.217821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.217834 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.217843 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.224979 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.235989 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.248413 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.262305 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.277210 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.286667 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.297332 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.307360 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.320114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.320144 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.320152 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.320166 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.320176 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.324887 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c
9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:54Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.423207 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.423274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.423292 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.423315 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.423332 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.526464 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.526524 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.526547 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.526591 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.526651 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.629523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.629580 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.629602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.629658 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.629678 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.706901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.707036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707073 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 22:36:02.707046091 +0000 UTC m=+35.027306379 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.707133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707174 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.707183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707194 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707208 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.707214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707265 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:02.707249366 +0000 UTC m=+35.027509664 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707299 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707302 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707355 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707449 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707469 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707361 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:02.707344158 +0000 UTC m=+35.027604506 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707546 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:02.707513372 +0000 UTC m=+35.027773660 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.707562 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:36:02.707553903 +0000 UTC m=+35.027814331 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.731890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.731934 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.731949 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.731971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.731986 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.835062 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.835130 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.835148 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.835174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.835190 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.922193 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:01:28.567862181 +0000 UTC Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.938295 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.938392 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.938420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.938451 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.938476 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:54Z","lastTransitionTime":"2026-01-20T22:35:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.961772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.961770 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:54 crc kubenswrapper[5030]: I0120 22:35:54.961797 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.962033 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.962196 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:54 crc kubenswrapper[5030]: E0120 22:35:54.962318 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.041159 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.041224 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.041244 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.041272 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.041296 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.143599 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.143711 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.143729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.143751 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.143767 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.144367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.145069 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.145377 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.151941 5030 generic.go:334] "Generic (PLEG): container finished" podID="18c97a93-96a8-40e8-9f36-513f91962906" containerID="81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7" exitCode=0 Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.152017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerDied","Data":"81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.164808 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.182341 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.185683 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.186816 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:35:55 crc 
kubenswrapper[5030]: I0120 22:35:55.195180 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.214219 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.228299 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.246357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.246552 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.246612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.246705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.246780 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.250017 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.263115 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.280236 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.294945 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.310229 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.324045 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.337439 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.349437 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.350750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.350787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.350797 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.350814 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.350826 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.367780 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.384156 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.404810 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b
579cecc5eae27a019fb26c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.422292 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.439763 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.451599 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.454443 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.454494 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.454506 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.454523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.454538 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.465133 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.478533 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.491596 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.536862 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.555481 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.556561 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.556601 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.556612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.556646 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.556659 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.566545 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.577256 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.585689 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.595224 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:55Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.659164 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.659220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.659236 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.659259 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.659275 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.762196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.762266 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.762290 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.762320 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.762344 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.865661 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.865724 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.865752 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.865781 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.865803 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.922696 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:18:40.127846482 +0000 UTC Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.968075 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.968111 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.968119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.968136 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:55 crc kubenswrapper[5030]: I0120 22:35:55.968146 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:55Z","lastTransitionTime":"2026-01-20T22:35:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.070523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.070556 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.070565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.070578 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.070590 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.159854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" event={"ID":"18c97a93-96a8-40e8-9f36-513f91962906","Type":"ContainerStarted","Data":"dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.159881 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.172524 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.173601 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.173615 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.173636 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.173648 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.173680 5030 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.194830 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.219284 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.241240 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.263315 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.276192 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.276226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.276238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.276255 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.276265 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.280163 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.295350 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.307102 5030 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.307137 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.307149 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.307165 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.307179 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.310277 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.331518 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.334784 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.335207 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.335230 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.335238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.335253 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.335262 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.351379 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.353063 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.356578 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.356662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.356682 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.356717 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.356754 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.372682 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376475 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376707 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376719 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.376750 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.391130 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.392120 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.395168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.395200 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.395211 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.395227 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.395239 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.409895 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.413576 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.413794 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.415228 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.415257 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.415269 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.415283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.415292 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.430580 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b
579cecc5eae27a019fb26c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:56Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.518436 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.518477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.518489 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.518510 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.518528 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.621124 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.621168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.621182 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.621197 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.621207 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.723281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.723353 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.723377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.723399 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.723415 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.825652 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.825693 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.825706 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.825723 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.825738 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.923993 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:17:41.814441832 +0000 UTC Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.928184 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.928222 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.928236 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.928258 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.928274 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:56Z","lastTransitionTime":"2026-01-20T22:35:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.961683 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.961724 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.961847 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.961925 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:56 crc kubenswrapper[5030]: I0120 22:35:56.962321 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:56 crc kubenswrapper[5030]: E0120 22:35:56.962416 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.030831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.030861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.030869 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.030883 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.030941 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.134017 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.134050 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.134058 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.134071 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.134079 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.167435 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/0.log" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.171709 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b" exitCode=1 Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.171742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.172945 5030 scope.go:117] "RemoveContainer" containerID="e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.191351 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.226931 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b
579cecc5eae27a019fb26c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"lice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863794 6295 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:35:56.863776 6295 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 22:35:56.863817 6295 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 22:35:56.863823 6295 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863877 6295 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 22:35:56.863912 6295 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:35:56.863921 6295 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0120 22:35:56.863943 6295 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 22:35:56.864029 6295 factory.go:656] Stopping watch factory\\\\nI0120 22:35:56.864060 6295 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 22:35:56.864066 6295 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 22:35:56.864078 6295 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 22:35:56.864079 6295 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:35:56.864086 6295 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 22:35:56.864098 6295 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.237679 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.237736 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.237757 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.237785 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.237807 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.250253 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.277391 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.292142 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.310973 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.326407 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.340871 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.340908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.340920 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.340937 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.340949 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.342930 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.360696 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.375195 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha
256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.392115 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.410839 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.424303 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.441533 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.442908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.442930 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.442939 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.442952 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.442963 5030 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.545346 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.545423 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.545449 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.545480 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.545501 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.647892 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.647947 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.647961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.647980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.647995 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.750153 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.750214 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.750226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.750242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.750256 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.852510 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.852567 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.852583 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.852609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.852647 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.924923 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 02:10:37.05605326 +0000 UTC Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.955222 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.955270 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.955284 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.955302 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.955316 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:57Z","lastTransitionTime":"2026-01-20T22:35:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.974744 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.986644 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:57 crc kubenswrapper[5030]: I0120 22:35:57.998672 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:57Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.010883 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.023005 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.040605 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.054476 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.056783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.056842 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.056858 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.056882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.056897 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.064962 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.077329 5030 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf
47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.091335 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.114489 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b
579cecc5eae27a019fb26c1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"lice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863794 6295 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:35:56.863776 6295 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 22:35:56.863817 6295 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 22:35:56.863823 6295 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863877 6295 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 22:35:56.863912 6295 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:35:56.863921 6295 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0120 22:35:56.863943 6295 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 22:35:56.864029 6295 factory.go:656] Stopping watch factory\\\\nI0120 22:35:56.864060 6295 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 22:35:56.864066 6295 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 22:35:56.864078 6295 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 22:35:56.864079 6295 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:35:56.864086 6295 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 22:35:56.864098 6295 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.128222 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.144955 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 
2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.154413 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.159589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.159632 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.159644 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.159659 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.159669 5030 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.177174 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/0.log" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.180020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.180143 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.191389 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.203431 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.212214 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.224950 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.235722 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.251900 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b6
5e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"lice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863794 6295 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:35:56.863776 6295 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 22:35:56.863817 6295 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 22:35:56.863823 6295 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863877 6295 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 22:35:56.863912 6295 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:35:56.863921 6295 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0120 22:35:56.863943 6295 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 22:35:56.864029 6295 factory.go:656] Stopping watch factory\\\\nI0120 22:35:56.864060 6295 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 22:35:56.864066 6295 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 22:35:56.864078 6295 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 22:35:56.864079 6295 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:35:56.864086 6295 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 22:35:56.864098 6295 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.261411 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.261454 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.261465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.261481 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.261493 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.263334 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.292303 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.304925 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.329879 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.347476 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.364971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.365036 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.365060 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.365089 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.365110 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.365384 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.382504 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.398459 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:58Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.468725 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.468805 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.468835 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.468867 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.468884 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.572019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.572078 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.572097 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.572118 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.572135 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.675927 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.675997 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.676027 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.676074 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.676095 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.778611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.778713 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.778732 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.778758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.778777 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.881749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.881810 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.881830 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.881852 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.881867 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.925749 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 03:11:02.247032221 +0000 UTC Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.961718 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.961786 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:35:58 crc kubenswrapper[5030]: E0120 22:35:58.961932 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.962011 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:35:58 crc kubenswrapper[5030]: E0120 22:35:58.962373 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:35:58 crc kubenswrapper[5030]: E0120 22:35:58.962230 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.984442 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.984526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.984548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.984579 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:58 crc kubenswrapper[5030]: I0120 22:35:58.984601 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:58Z","lastTransitionTime":"2026-01-20T22:35:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.087116 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.087149 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.087159 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.087174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.087184 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190434 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/1.log" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190928 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190938 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190953 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.190964 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.191267 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/0.log" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.194656 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" exitCode=1 Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.194701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.194755 5030 scope.go:117] "RemoveContainer" containerID="e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.195431 5030 scope.go:117] "RemoveContainer" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" Jan 20 22:35:59 crc kubenswrapper[5030]: E0120 22:35:59.195603 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.215385 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.233204 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.246340 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.261323 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.277447 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.294468 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.294522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.294538 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.294560 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.294575 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.295612 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.310612 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.324700 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.341989 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.365379 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b6
5e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"lice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863794 6295 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:35:56.863776 6295 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 22:35:56.863817 6295 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 22:35:56.863823 6295 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863877 6295 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 22:35:56.863912 6295 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:35:56.863921 6295 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0120 22:35:56.863943 6295 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 22:35:56.864029 6295 factory.go:656] Stopping watch factory\\\\nI0120 22:35:56.864060 6295 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 22:35:56.864066 6295 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 22:35:56.864078 6295 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 22:35:56.864079 6295 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:35:56.864086 6295 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 22:35:56.864098 6295 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start 
network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.386207 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.396872 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.396925 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.396944 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.396968 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.396986 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.403029 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.425781 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.436990 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.500676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.500753 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.500777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.500809 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.500831 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.603321 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.603363 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.603373 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.603390 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.603401 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.675337 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5"] Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.676696 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.680228 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.680299 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.700247 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.706613 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.706701 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.706750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.706776 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.706798 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.735015 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e087b53667615ac7563732ef39bf1588c452b07b579cecc5eae27a019fb26c1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"message\\\":\\\"lice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863794 6295 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:35:56.863776 6295 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 22:35:56.863817 6295 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 22:35:56.863823 6295 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 22:35:56.863877 6295 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 22:35:56.863912 6295 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:35:56.863921 6295 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0120 22:35:56.863943 6295 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 22:35:56.864029 6295 factory.go:656] Stopping watch factory\\\\nI0120 22:35:56.864060 6295 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 22:35:56.864066 6295 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 22:35:56.864078 6295 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 22:35:56.864079 6295 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:35:56.864086 6295 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 22:35:56.864098 6295 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.759121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjb6\" (UniqueName: \"kubernetes.io/projected/382d80ca-0935-4f90-b822-33b31b79b1f5-kube-api-access-dkjb6\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.759235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/382d80ca-0935-4f90-b822-33b31b79b1f5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.759324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.759373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.760654 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.781218 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.804671 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.810243 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.810286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc 
kubenswrapper[5030]: I0120 22:35:59.810309 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.810339 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.810363 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.822950 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.840802 5030 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 
22:35:59.856243 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.860053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.860106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.860199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjb6\" (UniqueName: \"kubernetes.io/projected/382d80ca-0935-4f90-b822-33b31b79b1f5-kube-api-access-dkjb6\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.860239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/382d80ca-0935-4f90-b822-33b31b79b1f5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.861289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.861297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/382d80ca-0935-4f90-b822-33b31b79b1f5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.868910 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/382d80ca-0935-4f90-b822-33b31b79b1f5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.872993 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.889193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjb6\" (UniqueName: \"kubernetes.io/projected/382d80ca-0935-4f90-b822-33b31b79b1f5-kube-api-access-dkjb6\") pod \"ovnkube-control-plane-749d76644c-ccgf5\" (UID: \"382d80ca-0935-4f90-b822-33b31b79b1f5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.899370 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c
7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.914693 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.914767 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.914791 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.914823 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.914847 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:35:59Z","lastTransitionTime":"2026-01-20T22:35:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.921602 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.926060 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:57:38.482641495 +0000 UTC Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.941712 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"
image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.962007 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.979171 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.994194 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:35:59Z is after 2025-08-24T17:21:41Z" Jan 20 22:35:59 crc kubenswrapper[5030]: I0120 22:35:59.998423 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.017284 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.017612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.017737 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.017829 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.017911 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: W0120 22:36:00.019789 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382d80ca_0935_4f90_b822_33b31b79b1f5.slice/crio-8b0313bbcc54a5c5af7895b4b73df757f2f742da6b791530243e10dba5ab2809 WatchSource:0}: Error finding container 8b0313bbcc54a5c5af7895b4b73df757f2f742da6b791530243e10dba5ab2809: Status 404 returned error can't find the container with id 8b0313bbcc54a5c5af7895b4b73df757f2f742da6b791530243e10dba5ab2809 Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.120811 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.120875 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.120895 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.120921 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.120938 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.206027 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" event={"ID":"382d80ca-0935-4f90-b822-33b31b79b1f5","Type":"ContainerStarted","Data":"8b0313bbcc54a5c5af7895b4b73df757f2f742da6b791530243e10dba5ab2809"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.206938 5030 scope.go:117] "RemoveContainer" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" Jan 20 22:36:00 crc kubenswrapper[5030]: E0120 22:36:00.207156 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.221052 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.222926 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.222971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.222986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.223006 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.223020 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.237771 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.261308 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.277798 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.291706 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.310277 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.325839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.325962 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.326019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.326080 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.326133 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.327373 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.343164 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.358359 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.375958 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.390446 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.403026 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.423478 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.428074 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.428104 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.428119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.428138 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.428150 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.441617 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.467739 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b6
5e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:00Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.530281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.530337 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.530353 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.530378 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.530395 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.633696 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.633962 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.634024 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.634093 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.634157 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.736600 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.736717 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.736741 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.736775 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.736801 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.839808 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.839851 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.839862 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.839877 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.839890 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.926649 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 07:27:51.566591107 +0000 UTC Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.943071 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.943162 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.943180 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.943204 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.943221 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:00Z","lastTransitionTime":"2026-01-20T22:36:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.961919 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.961967 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:00 crc kubenswrapper[5030]: E0120 22:36:00.962092 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:00 crc kubenswrapper[5030]: I0120 22:36:00.962440 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:00 crc kubenswrapper[5030]: E0120 22:36:00.962557 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:00 crc kubenswrapper[5030]: E0120 22:36:00.962669 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.046245 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.046311 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.046328 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.046354 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.046372 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.149282 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.149361 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.149387 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.149422 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.149444 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.212006 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-f2dgj"] Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.212889 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: E0120 22:36:01.212997 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.215479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" event={"ID":"382d80ca-0935-4f90-b822-33b31b79b1f5","Type":"ContainerStarted","Data":"876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.215544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" event={"ID":"382d80ca-0935-4f90-b822-33b31b79b1f5","Type":"ContainerStarted","Data":"8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.218454 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/1.log" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.234366 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.252794 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.252856 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.252873 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.252895 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.252917 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.256001 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.274309 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.276817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n47nm\" (UniqueName: \"kubernetes.io/projected/2b3c414b-bc29-41aa-a369-ecc2cc809691-kube-api-access-n47nm\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.276946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.296992 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.318783 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.338122 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.353593 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.355970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.356025 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.356042 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.356068 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.356085 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.374457 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.378878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n47nm\" (UniqueName: \"kubernetes.io/projected/2b3c414b-bc29-41aa-a369-ecc2cc809691-kube-api-access-n47nm\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.378973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: E0120 22:36:01.379196 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:01 crc kubenswrapper[5030]: E0120 22:36:01.379293 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:01.879265434 +0000 UTC m=+34.199525752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.396423 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.407875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n47nm\" (UniqueName: \"kubernetes.io/projected/2b3c414b-bc29-41aa-a369-ecc2cc809691-kube-api-access-n47nm\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.425342 5030 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.448926 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.458277 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.458350 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.458375 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.458441 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.458466 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.468885 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.484038 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.499691 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.517597 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.537161 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.561281 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.562056 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.562138 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.562161 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.562194 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.562218 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.579262 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.622477 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.644237 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.664608 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.664957 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.664990 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.665009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.665033 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.665051 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.684381 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.705146 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.724949 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.739103 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.759480 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.768187 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.768252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.768274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.768308 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.768331 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.779051 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.809351 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b6
5e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.825999 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.841827 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.860130 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.871742 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.871792 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc 
kubenswrapper[5030]: I0120 22:36:01.871806 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.871828 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.871844 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.872718 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:01Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.885328 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:01 crc kubenswrapper[5030]: E0120 22:36:01.885533 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:01 crc kubenswrapper[5030]: E0120 22:36:01.885604 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:02.885583958 +0000 UTC m=+35.205844256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.927752 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 12:49:14.51338947 +0000 UTC Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.973959 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.973999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.974016 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.974029 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:01 crc kubenswrapper[5030]: I0120 22:36:01.974040 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:01Z","lastTransitionTime":"2026-01-20T22:36:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.076504 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.076541 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.076551 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.076567 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.076579 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.179750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.179813 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.179826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.179843 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.179857 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.282066 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.282114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.282125 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.282143 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.282156 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.385724 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.385782 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.385795 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.385811 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.385822 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.489028 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.489387 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.489405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.489431 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.489454 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.592180 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.592207 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.592215 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.592227 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.592235 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.694585 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.694657 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.694675 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.694694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.694708 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.793649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.793771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.793800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.793843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.793911 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.793941 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:36:18.793893852 +0000 UTC m=+51.114154180 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.793948 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.793991 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794023 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794031 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794047 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.793999 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:18.793974494 +0000 UTC m=+51.114234792 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.794090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794110 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:18.794095547 +0000 UTC m=+51.114355865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794053 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794148 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794192 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:18.794178689 +0000 UTC m=+51.114438977 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.794223 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:18.79419902 +0000 UTC m=+51.114459338 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.797602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.797726 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.797758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.797789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.797812 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.895734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.895937 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.896065 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:04.896026623 +0000 UTC m=+37.216286951 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.900672 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.900731 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.900755 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.900783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.900817 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:02Z","lastTransitionTime":"2026-01-20T22:36:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.928487 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 15:08:34.214902813 +0000 UTC Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.961053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.961156 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.961073 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:02 crc kubenswrapper[5030]: I0120 22:36:02.961062 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.961372 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.961519 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.961698 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:02 crc kubenswrapper[5030]: E0120 22:36:02.961953 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.003190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.003235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.003246 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.003262 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.003273 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.105787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.105861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.105880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.105903 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.105921 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.211544 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.211614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.211689 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.211721 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.211743 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.315190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.315258 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.315281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.315343 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.315363 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.422012 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.422090 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.422117 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.422147 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.422170 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.525462 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.525550 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.525567 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.525590 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.525607 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.628118 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.628165 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.628176 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.628193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.628205 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.735561 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.735603 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.735615 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.735647 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.735660 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.837902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.837960 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.837977 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.838001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.838018 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.929345 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:09:20.781073534 +0000 UTC Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.940711 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.940763 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.940780 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.940802 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:03 crc kubenswrapper[5030]: I0120 22:36:03.940819 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:03Z","lastTransitionTime":"2026-01-20T22:36:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.044075 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.044124 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.044139 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.044160 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.044178 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.147362 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.147437 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.147461 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.147493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.147516 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.250441 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.250500 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.250519 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.250543 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.250560 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.354046 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.354105 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.354121 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.354145 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.354163 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.456903 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.456957 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.456970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.456989 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.457003 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.560358 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.560426 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.560443 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.560468 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.560485 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.663940 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.664026 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.664042 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.664066 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.664079 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.767156 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.767190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.767199 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.767212 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.767221 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.870066 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.870132 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.870152 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.870177 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.870196 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.916993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.917249 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.917353 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:08.917324973 +0000 UTC m=+41.237585291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.929505 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:56:54.897469399 +0000 UTC Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.961839 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.961903 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.961901 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.961857 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.962038 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.962155 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.962230 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:04 crc kubenswrapper[5030]: E0120 22:36:04.962402 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.973286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.973373 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.973392 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.973417 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:04 crc kubenswrapper[5030]: I0120 22:36:04.973436 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:04Z","lastTransitionTime":"2026-01-20T22:36:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.076756 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.076796 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.076821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.076835 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.076852 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.179991 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.180157 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.180179 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.180204 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.180220 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.283602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.283730 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.283749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.283773 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.283862 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.387333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.387384 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.387402 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.387425 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.387444 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.490447 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.490508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.490528 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.490548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.490564 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.593273 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.593304 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.593313 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.593330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.593341 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.695926 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.695998 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.696023 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.696052 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.696074 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.799091 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.799154 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.799184 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.799215 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.799238 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.902365 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.902425 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.902441 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.902467 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.902484 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:05Z","lastTransitionTime":"2026-01-20T22:36:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:05 crc kubenswrapper[5030]: I0120 22:36:05.930436 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 18:48:59.699925421 +0000 UTC Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.007439 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.007536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.007558 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.007586 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.007617 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.111701 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.111758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.111776 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.111800 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.111817 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.214451 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.214515 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.214532 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.214557 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.214577 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.318079 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.318477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.318580 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.318723 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.318822 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.422412 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.422479 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.422496 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.422522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.422542 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.507571 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.507663 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.507687 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.507715 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.507736 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.528494 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:06Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.533496 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.533550 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
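[Editor's note, not part of the captured log] The node-status patch above is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2026-01-20. A small sketch of how that claim could be checked directly against the endpoint follows; it assumes the third-party cryptography package and local access to the webhook port on the node, neither of which appears in the log.

# Hedged sketch (assumption: run on the node, `cryptography` package installed).
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743   # endpoint taken from the webhook error above

def webhook_cert_validity(host=HOST, port=PORT):
    # ssl.get_server_certificate() does not verify the peer by default, so it
    # still returns the PEM even though the certificate has expired.
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    return cert.not_valid_before, cert.not_valid_after

if __name__ == "__main__":
    not_before, not_after = webhook_cert_validity()
    print("notBefore:", not_before)
    print("notAfter: ", not_after)   # expected 2025-08-24T17:21:41Z per the log

If notAfter is indeed in the past relative to the node clock, the repeated "Error updating node status, will retry" entries that follow are expected until the webhook's serving certificate is rotated or the clock skew is resolved.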
event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.533573 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.533600 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.533659 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.554978 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:06Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.559924 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.559986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.560011 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.560041 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.560062 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.579490 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:06Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.584165 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.584215 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.584226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.584243 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.584256 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.602654 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:06Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.607426 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.607494 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.607508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.607526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.607561 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.625888 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:06Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.626157 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.628271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.628331 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.628348 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.628372 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.628392 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.731357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.731429 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.731464 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.731493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.731516 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.833588 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.833686 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.833705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.833729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.833747 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.931195 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 10:21:34.791671492 +0000 UTC Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.936488 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.936526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.936537 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.936552 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.936564 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:06Z","lastTransitionTime":"2026-01-20T22:36:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.962145 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.962257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.962273 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:06 crc kubenswrapper[5030]: I0120 22:36:06.962197 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.962388 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.962536 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.962737 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:06 crc kubenswrapper[5030]: E0120 22:36:06.962881 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.039758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.039832 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.039855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.039889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.039910 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.142037 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.142076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.142105 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.142119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.142129 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.244084 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.244193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.244209 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.244229 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.244245 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.346487 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.346530 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.346542 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.346558 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.346568 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.421188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.422915 5030 scope.go:117] "RemoveContainer" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" Jan 20 22:36:07 crc kubenswrapper[5030]: E0120 22:36:07.423281 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.449493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.449559 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.449577 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.449602 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.449656 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.552923 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.552979 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.552996 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.553019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.553036 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.657198 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.657271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.657291 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.657323 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.657345 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.760907 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.760988 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.761010 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.761040 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.761058 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.865381 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.865442 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.865455 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.865477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.865492 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.931774 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:59:46.784031019 +0000 UTC Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.968330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.968417 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.968450 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.968512 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.968563 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:07Z","lastTransitionTime":"2026-01-20T22:36:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:07 crc kubenswrapper[5030]: I0120 22:36:07.980378 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:07Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.003019 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.022983 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.041945 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071095 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071682 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071775 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071796 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.071844 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.083811 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.100133 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.121253 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.139610 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.170013 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.174781 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.174870 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.174889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.174916 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.174936 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.193143 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.224660 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b6
5e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.245817 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.272984 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.278212 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.278281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.278304 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.278333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.278351 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.290592 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.309361 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:08Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.382092 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.382185 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.382203 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.382230 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.382249 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.485520 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.485562 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.485574 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.485589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.485601 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.588300 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.588368 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.588377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.588398 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.588410 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.691380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.691480 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.691497 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.691526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.691552 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.795009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.795101 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.795119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.795178 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.795195 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.897726 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.897779 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.897791 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.897808 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.897821 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:08Z","lastTransitionTime":"2026-01-20T22:36:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.932403 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:52:39.696757032 +0000 UTC Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.960223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.960549 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.960732 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:16.960704003 +0000 UTC m=+49.280964321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.961101 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.961149 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.961203 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:08 crc kubenswrapper[5030]: I0120 22:36:08.961100 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.961307 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.961397 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.961579 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:08 crc kubenswrapper[5030]: E0120 22:36:08.961784 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.000212 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.000265 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.000282 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.000305 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.000322 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.102852 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.102908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.102920 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.102940 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.102952 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.205495 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.205545 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.205564 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.205583 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.205593 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.308538 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.308665 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.308695 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.308726 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.308750 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.411954 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.412010 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.412026 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.412046 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.412060 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.514798 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.514849 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.514859 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.514880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.514892 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.617648 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.617699 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.617715 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.617739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.617758 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.720999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.721046 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.721057 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.721076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.721090 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.824421 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.824499 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.824524 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.824548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.824566 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.928750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.928855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.928876 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.928907 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.928924 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:09Z","lastTransitionTime":"2026-01-20T22:36:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:09 crc kubenswrapper[5030]: I0120 22:36:09.933152 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 17:58:35.687269704 +0000 UTC Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.031701 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.031834 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.031861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.031895 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.031921 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.135109 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.135210 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.135228 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.135252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.135271 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.238614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.238725 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.238742 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.238768 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.238789 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.341601 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.341729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.341752 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.341777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.341795 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.444747 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.444817 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.444853 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.444880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.444899 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.548107 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.548216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.548235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.548259 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.548277 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.650992 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.651040 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.651051 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.651067 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.651079 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.758283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.758349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.758366 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.758394 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.758433 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.861883 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.861948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.861970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.861999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.862016 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.934183 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 23:51:05.27266685 +0000 UTC Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.961790 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.961831 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.961917 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:10 crc kubenswrapper[5030]: E0120 22:36:10.961960 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.961811 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:10 crc kubenswrapper[5030]: E0120 22:36:10.962170 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:10 crc kubenswrapper[5030]: E0120 22:36:10.962286 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:10 crc kubenswrapper[5030]: E0120 22:36:10.962586 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.964357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.964420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.964445 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.964475 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:10 crc kubenswrapper[5030]: I0120 22:36:10.964498 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:10Z","lastTransitionTime":"2026-01-20T22:36:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.067677 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.067751 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.067773 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.067801 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.067823 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.170970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.171036 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.171087 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.171105 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.171115 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.275291 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.275378 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.275405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.275433 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.275455 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.378079 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.378145 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.378158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.378178 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.378195 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.480692 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.480730 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.480739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.480751 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.480760 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.582788 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.582835 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.582845 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.582860 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.582870 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.685375 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.685457 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.685473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.685523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.685539 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.787925 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.788022 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.788044 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.788074 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.788097 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.891104 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.891166 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.891187 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.891214 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.891232 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.935444 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:47:01.37042124 +0000 UTC Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.994293 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.994336 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.994347 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.994366 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:11 crc kubenswrapper[5030]: I0120 22:36:11.994378 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:11Z","lastTransitionTime":"2026-01-20T22:36:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.097449 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.097523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.097542 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.097567 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.097586 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.200705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.200779 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.200802 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.200826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.200843 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.303182 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.303256 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.303282 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.303310 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.303329 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.406219 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.406271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.406282 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.406298 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.406308 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.509546 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.509636 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.509651 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.509673 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.509686 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.612365 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.612406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.612415 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.612429 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.612440 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.714966 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.715055 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.715097 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.715122 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.715134 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.817884 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.817944 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.817954 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.817968 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.817977 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.921402 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.921503 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.921532 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.921566 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.921605 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:12Z","lastTransitionTime":"2026-01-20T22:36:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.935762 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 16:16:51.779193943 +0000 UTC Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.961381 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.961450 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:12 crc kubenswrapper[5030]: E0120 22:36:12.961572 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.961617 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:12 crc kubenswrapper[5030]: E0120 22:36:12.961784 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:12 crc kubenswrapper[5030]: E0120 22:36:12.961911 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:12 crc kubenswrapper[5030]: I0120 22:36:12.962059 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:12 crc kubenswrapper[5030]: E0120 22:36:12.962215 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.023978 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.024017 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.024027 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.024042 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.024055 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.127548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.127662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.127691 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.127722 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.127744 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.231232 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.231343 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.231363 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.231394 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.231418 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.334319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.334396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.334419 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.334450 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.334476 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.437588 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.437705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.437729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.437756 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.437776 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.540929 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.540997 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.541015 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.541053 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.541072 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.644491 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.644559 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.644582 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.644610 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.644667 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.747908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.747976 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.747993 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.748015 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.748030 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.852119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.852189 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.852209 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.852238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.852261 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.936914 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 03:32:32.48528351 +0000 UTC Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.955423 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.955481 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.955498 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.955522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:13 crc kubenswrapper[5030]: I0120 22:36:13.955540 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:13Z","lastTransitionTime":"2026-01-20T22:36:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.058572 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.058653 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.058671 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.058694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.058711 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.162281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.162345 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.162363 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.162393 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.162419 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.265806 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.265862 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.265880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.265902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.265918 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.368453 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.368549 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.368573 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.368604 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.368666 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.471694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.471787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.471815 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.471855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.471880 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.574331 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.574380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.574393 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.574409 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.574420 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.677536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.677616 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.677674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.677703 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.677727 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.781828 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.781888 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.781923 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.781949 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.781967 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.884747 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.884822 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.884846 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.884877 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.884898 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.937289 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:48:22.097367816 +0000 UTC Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.962029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.962084 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.962113 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:14 crc kubenswrapper[5030]: E0120 22:36:14.962220 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.962318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:14 crc kubenswrapper[5030]: E0120 22:36:14.962481 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:14 crc kubenswrapper[5030]: E0120 22:36:14.962598 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:14 crc kubenswrapper[5030]: E0120 22:36:14.962804 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.987840 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.987890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.987902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.987921 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:14 crc kubenswrapper[5030]: I0120 22:36:14.987933 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:14Z","lastTransitionTime":"2026-01-20T22:36:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.090718 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.090779 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.090800 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.090823 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.090840 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.197318 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.197411 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.197454 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.197496 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.197522 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.300444 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.300508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.300523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.300545 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.300562 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.403769 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.403854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.403879 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.403910 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.403931 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.507022 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.507095 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.507108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.507127 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.507139 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.610116 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.610178 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.610197 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.610222 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.610241 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.713491 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.713562 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.713574 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.713594 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.713606 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.817235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.817295 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.817313 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.817335 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.817354 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.889520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.900456 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.911559 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",
\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:15Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.920381 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.920436 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.920453 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.920478 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.920499 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:15Z","lastTransitionTime":"2026-01-20T22:36:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.930057 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:15Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.937981 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 20:42:59.662774867 +0000 UTC Jan 20 
22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.952890 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:15Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.977498 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:15Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:15 crc kubenswrapper[5030]: I0120 22:36:15.999296 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:15Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.018989 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.022422 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.022473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.022490 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.022516 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.022530 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.040052 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.062245 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.081755 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.106396 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.124135 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.126252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.126318 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.126337 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.126364 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.126383 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.146588 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.158513 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.173041 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.190470 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.204288 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.229611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.229751 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.229778 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.229813 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.229839 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.332457 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.332545 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.332571 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.332603 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.332673 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.435584 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.435686 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.435710 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.435732 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.435749 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.539336 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.539401 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.539420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.539445 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.539464 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.643921 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.644001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.644025 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.644057 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.644080 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.747773 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.747857 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.747882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.747919 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.747944 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.833927 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.834025 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.834044 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.834077 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.834100 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.856648 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.862575 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.862686 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.862714 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.862749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.862774 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.884935 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.890718 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.890771 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.890789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.890814 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.890831 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.912266 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.917133 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.917222 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.917249 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.917281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.917300 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.939045 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:47:22.011638388 +0000 UTC Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.939973 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.947327 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.947446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.947465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.947933 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.948006 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.961964 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.962014 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.962085 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.961958 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.962172 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.962399 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.962434 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.962484 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.972186 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:16Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:16 crc kubenswrapper[5030]: E0120 22:36:16.972425 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.974938 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.975009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.975028 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.975053 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:16 crc kubenswrapper[5030]: I0120 22:36:16.975070 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:16Z","lastTransitionTime":"2026-01-20T22:36:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.054881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:17 crc kubenswrapper[5030]: E0120 22:36:17.055063 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:17 crc kubenswrapper[5030]: E0120 22:36:17.055114 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:33.055099602 +0000 UTC m=+65.375359900 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.077732 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.077787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.077801 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.077821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.077838 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.181554 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.181660 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.181680 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.181704 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.181722 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.284487 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.284590 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.284612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.284669 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.284688 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.387810 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.387893 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.387918 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.387948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.387971 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.491178 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.491231 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.491247 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.491270 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.491286 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.593429 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.593511 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.593538 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.593564 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.593581 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.696406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.696469 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.696486 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.696508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.696527 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.799609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.799685 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.799698 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.799741 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.799754 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.902678 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.902766 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.902786 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.902809 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.902828 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:17Z","lastTransitionTime":"2026-01-20T22:36:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.939464 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 21:34:51.248208978 +0000 UTC Jan 20 22:36:17 crc kubenswrapper[5030]: I0120 22:36:17.984042 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:17Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.005365 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.005441 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.005464 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.005493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.005517 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.009101 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.028373 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.059143 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.084033 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.102139 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.108850 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.108916 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.108936 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.108961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.108979 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.121209 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.143679 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.162833 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.185781 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.205769 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.212660 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.212724 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.212747 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.212771 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.212789 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.241658 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.261833 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.281343 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.306023 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.315286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.315346 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc 
kubenswrapper[5030]: I0120 22:36:18.315357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.315375 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.315420 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.323031 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.340310 5030 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:18Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.418374 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.418445 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.418460 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.418486 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.418498 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.520824 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.520887 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.520905 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.520933 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.520951 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.624535 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.624658 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.624680 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.624706 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.624724 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.728649 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.728704 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.728721 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.728744 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.728761 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.832179 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.832319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.832408 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.832480 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.832539 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.875545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.875759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.875794 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:36:50.875760176 +0000 UTC m=+83.196020504 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.875857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.875901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.875938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876005 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876043 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 
22:36:18.876053 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876078 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876110 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:50.876087884 +0000 UTC m=+83.196348202 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876114 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876149 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876170 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876170 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:50.876136405 +0000 UTC m=+83.196396723 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876226 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:50.876207757 +0000 UTC m=+83.196468075 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876267 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.876327 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:36:50.876313909 +0000 UTC m=+83.196574237 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.936199 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.936272 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.936297 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.936325 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.936346 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:18Z","lastTransitionTime":"2026-01-20T22:36:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.940465 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 10:00:54.298252756 +0000 UTC Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.961907 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.961971 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.962092 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.962196 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:18 crc kubenswrapper[5030]: I0120 22:36:18.962277 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.962311 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.962467 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:18 crc kubenswrapper[5030]: E0120 22:36:18.962688 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.039040 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.039095 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.039111 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.039134 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.039152 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.143344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.143409 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.143433 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.143464 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.143488 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.246310 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.246382 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.246406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.246438 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.246467 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.349882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.349961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.349980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.350005 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.350025 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.453054 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.453112 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.453130 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.453166 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.453191 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.556579 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.556684 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.556704 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.556731 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.556750 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.659379 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.659445 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.659467 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.659514 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.659536 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.762821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.762882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.762901 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.762925 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.762944 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.865294 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.865340 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.865356 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.865376 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.865391 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.941425 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:06:20.287092605 +0000 UTC Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.967915 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.967989 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.968011 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.968040 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:19 crc kubenswrapper[5030]: I0120 22:36:19.968064 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:19Z","lastTransitionTime":"2026-01-20T22:36:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.070665 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.070746 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.070779 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.070809 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.070835 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.172999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.173070 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.173119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.173147 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.173169 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.276003 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.276068 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.276090 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.276120 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.276141 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.379528 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.379579 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.379595 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.379663 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.379688 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.482420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.482489 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.482516 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.482573 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.482597 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.585604 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.585741 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.585761 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.585790 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.585812 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.688886 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.689003 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.689081 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.689108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.689128 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.792051 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.792120 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.792142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.792169 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.792191 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.895344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.895421 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.895446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.895472 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.895492 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.942325 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:16:17.416164128 +0000 UTC Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.961740 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.961801 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.961843 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.961762 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:20 crc kubenswrapper[5030]: E0120 22:36:20.961903 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:20 crc kubenswrapper[5030]: E0120 22:36:20.962027 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:20 crc kubenswrapper[5030]: E0120 22:36:20.962174 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:20 crc kubenswrapper[5030]: E0120 22:36:20.962321 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.998901 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.999003 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.999022 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.999047 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:20 crc kubenswrapper[5030]: I0120 22:36:20.999083 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:20Z","lastTransitionTime":"2026-01-20T22:36:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.101861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.101949 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.101976 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.102007 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.102032 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.205439 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.205511 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.205536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.205579 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.205602 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.308798 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.308854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.308870 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.308893 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.308911 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.412063 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.412694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.412856 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.413000 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.413129 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.516276 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.516369 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.516396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.516500 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.516531 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.620410 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.620481 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.620506 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.620537 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.620556 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.723512 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.723572 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.723588 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.723659 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.723678 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.826606 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.827061 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.827271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.827456 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.827673 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.930611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.930718 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.930742 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.930765 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.930782 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:21Z","lastTransitionTime":"2026-01-20T22:36:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.943463 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 09:00:12.602216166 +0000 UTC Jan 20 22:36:21 crc kubenswrapper[5030]: I0120 22:36:21.963200 5030 scope.go:117] "RemoveContainer" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.034529 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.034929 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.034955 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.034985 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.035009 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.137390 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.137443 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.137453 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.137466 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.137476 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.240062 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.240135 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.240158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.240188 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.240211 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.303759 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/1.log" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.308426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.309116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.324393 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.343431 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.343498 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.343533 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.343565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.343588 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.348204 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.362648 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.388136 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.411368 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.446517 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.446565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.446579 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.446601 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.446615 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.449798 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.462024 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.474059 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.486079 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.496573 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.511126 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.521469 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.534924 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.548826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.548856 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.548864 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.548876 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.548885 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.551070 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.564441 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.576794 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.593314 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:22Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.651035 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.651087 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.651097 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.651110 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.651120 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.753760 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.753853 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.753877 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.753912 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.753939 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.857094 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.857152 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.857167 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.857186 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.857198 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.944403 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:10:53.123617353 +0000 UTC Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.959960 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.960018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.960035 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.960059 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.960076 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:22Z","lastTransitionTime":"2026-01-20T22:36:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.961049 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:22 crc kubenswrapper[5030]: E0120 22:36:22.961183 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.961207 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.961238 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:22 crc kubenswrapper[5030]: E0120 22:36:22.961368 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:22 crc kubenswrapper[5030]: I0120 22:36:22.961212 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:22 crc kubenswrapper[5030]: E0120 22:36:22.961556 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:22 crc kubenswrapper[5030]: E0120 22:36:22.961700 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.062770 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.062817 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.062833 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.062859 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.062878 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.165950 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.166009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.166026 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.166050 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.166068 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.268598 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.268676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.268693 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.268718 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.268735 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.315525 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/2.log" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.316727 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/1.log" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.320158 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" exitCode=1 Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.320221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.320280 5030 scope.go:117] "RemoveContainer" containerID="91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.321733 5030 scope.go:117] "RemoveContainer" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" Jan 20 22:36:23 crc kubenswrapper[5030]: E0120 22:36:23.322063 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.343071 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.362222 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.372429 5030 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.372509 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.372532 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.372562 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.372586 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.384059 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.399218 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.419319 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.440707 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.461396 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.475478 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.475544 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.475563 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.475590 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.475609 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.477538 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.500806 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.521298 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.553033 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91e951323a68ab8fec25bf4cc11ae99019e0c4b65e29d1290bb28cfc9b913731\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:35:58Z\\\",\\\"message\\\":\\\"uilt service openshift-multus/multus-admission-controller LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.119\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0120 22:35:58.107221 6450 services_controller.go:444] Built service openshift-multus/multus-admission-controller LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0120 22:35:58.107234 6450 services_controller.go:445] Built service openshift-multus/multus-admission-controller LB template configs for network=default: []services.lbConfig(nil)\\\\nF0120 22:35:58.107232 6450 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller init\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 
controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.577594 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.578428 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.578516 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.578531 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.578549 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.578560 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.601563 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.618288 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.636459 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.652691 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.671887 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:23Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.681377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.681447 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.681465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.681489 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.681507 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.784000 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.784076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.784115 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.784135 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.784148 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.888119 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.888193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.888211 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.888238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.888265 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.945809 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 05:46:16.64795071 +0000 UTC Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.991472 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.991521 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.991538 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.991561 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:23 crc kubenswrapper[5030]: I0120 22:36:23.991731 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:23Z","lastTransitionTime":"2026-01-20T22:36:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.094792 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.094855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.094872 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.094896 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.094913 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.197999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.198100 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.198122 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.198155 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.198178 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.301037 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.301096 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.301114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.301138 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.301156 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.327221 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/2.log" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.332054 5030 scope.go:117] "RemoveContainer" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" Jan 20 22:36:24 crc kubenswrapper[5030]: E0120 22:36:24.332302 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.350392 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.365866 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.386911 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.404214 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.404254 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.404283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.404306 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.404317 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.414734 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.429131 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.449422 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.465018 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc 
kubenswrapper[5030]: I0120 22:36:24.483517 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.503430 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.508367 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.508426 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.508447 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.508476 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.508502 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.521240 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.539449 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.556185 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.576034 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.592206 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.610022 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.611778 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.611863 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.611886 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.611920 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.611944 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.629883 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.659955 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:24Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.714599 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.714681 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.714699 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.714723 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.714739 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.817545 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.817697 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.817714 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.817737 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.817754 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.920810 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.920865 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.920881 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.920902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.920922 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:24Z","lastTransitionTime":"2026-01-20T22:36:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.946775 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 17:00:01.446628902 +0000 UTC Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.961267 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.961336 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.961522 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:24 crc kubenswrapper[5030]: E0120 22:36:24.961513 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:24 crc kubenswrapper[5030]: I0120 22:36:24.961569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:24 crc kubenswrapper[5030]: E0120 22:36:24.961743 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:24 crc kubenswrapper[5030]: E0120 22:36:24.961848 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:24 crc kubenswrapper[5030]: E0120 22:36:24.962004 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.023799 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.023853 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.023868 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.023889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.023903 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.125915 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.125943 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.125951 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.125963 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.125972 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.228349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.228396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.228405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.228420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.228430 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.331655 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.331715 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.331740 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.331769 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.331826 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.435115 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.435167 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.435188 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.435216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.435236 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.538452 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.538502 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.538512 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.538527 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.538539 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.641984 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.642057 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.642081 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.642112 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.642140 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.745412 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.745508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.745529 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.745553 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.745570 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.849104 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.849147 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.849155 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.849170 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.849179 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.947572 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 12:08:02.12784749 +0000 UTC Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.951372 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.951404 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.951427 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.951491 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:25 crc kubenswrapper[5030]: I0120 22:36:25.951512 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:25Z","lastTransitionTime":"2026-01-20T22:36:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.053548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.053591 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.053605 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.053649 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.053665 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.155594 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.155693 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.155720 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.155753 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.155776 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.257999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.258058 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.258068 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.258080 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.258089 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.360961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.361035 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.361062 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.361092 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.361113 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.464290 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.464347 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.464388 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.464410 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.464425 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.567434 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.567495 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.567514 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.567541 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.567566 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.670340 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.670391 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.670403 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.670421 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.670445 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.773728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.773759 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.773770 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.773787 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.773800 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.876738 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.876815 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.876832 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.876861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.876882 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.948208 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 13:17:34.105571347 +0000 UTC Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.961781 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.961864 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.961773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.961789 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:26 crc kubenswrapper[5030]: E0120 22:36:26.961966 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:26 crc kubenswrapper[5030]: E0120 22:36:26.962144 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:26 crc kubenswrapper[5030]: E0120 22:36:26.962279 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:26 crc kubenswrapper[5030]: E0120 22:36:26.962380 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.980093 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.980141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.980158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.980181 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:26 crc kubenswrapper[5030]: I0120 22:36:26.980198 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:26Z","lastTransitionTime":"2026-01-20T22:36:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.082937 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.082980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.082994 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.083016 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.083031 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.185972 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.186009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.186020 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.186038 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.186053 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.252914 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.252951 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.252961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.252975 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.252983 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.297738 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.303091 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.303143 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
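The node-status patch itself is being rejected: the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 presents a serving certificate whose notAfter is 2025-08-24T17:21:41Z, while the node clock reads 2026-01-20T22:36:27Z. A minimal sketch that pulls the two timestamps out of the x509 error and reports how long the certificate has been expired is shown below; the error string is copied verbatim from the entry above.

    #!/usr/bin/env python3
    # Extract "current time X is after Y" from the x509 error above and
    # report how far past its notAfter date the webhook certificate is.
    import re
    from datetime import datetime, timezone

    err = "current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z"

    now_s, not_after_s = re.search(r"current time (\S+) is after (\S+)", err).groups()

    def parse(ts: str) -> datetime:
        return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

    delta = parse(now_s) - parse(not_after_s)
    print(f"webhook certificate expired {delta} ago "
          f"(notAfter={not_after_s}, node clock={now_s})")

Until that certificate is renewed (or the node clock corrected), every node-status patch should keep failing the same way, which matches the identical "will retry" errors that follow.
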
event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.303161 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.303189 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.303209 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.328017 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.331970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.332022 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.332033 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.332053 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.332063 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.344926 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.348246 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.348345 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.348369 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.348411 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.348435 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.362009 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.364969 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.364996 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.365004 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.365014 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.365023 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.377246 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: E0120 22:36:27.377360 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.379285 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.379307 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.379314 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.379327 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.379337 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.481997 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.482043 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.482059 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.482082 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.482098 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.584171 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.584208 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.584219 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.584237 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.584248 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.687008 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.687102 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.687153 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.687176 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.687192 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.791340 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.791394 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.791412 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.791434 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.791451 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.893095 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.893199 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.893220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.893247 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.893267 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.949291 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:16:58.768744111 +0000 UTC Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.979751 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 
2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.991501 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:27Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.995432 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.995750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.995766 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.995788 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:27 crc kubenswrapper[5030]: I0120 22:36:27.995803 5030 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:27Z","lastTransitionTime":"2026-01-20T22:36:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.005382 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.016923 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.038050 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.058178 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.074690 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.091280 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.099373 5030 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.099460 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.099485 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.099533 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.099557 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.106973 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.122613 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97
aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.137592 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.151425 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.163125 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.177369 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-l
ib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.190172 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.201966 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.202023 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.202037 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.202055 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.202065 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.203068 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.214600 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:28Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.303874 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.303926 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.303959 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.303977 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.303988 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.407333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.407397 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.407421 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.407448 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.407467 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.510218 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.510294 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.510317 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.510346 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.510368 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.612576 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.612705 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.612732 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.612757 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.612780 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.715802 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.715831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.715840 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.715854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.715863 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.818934 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.819001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.819019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.819044 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.819062 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.920839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.920880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.920889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.920904 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.920914 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:28Z","lastTransitionTime":"2026-01-20T22:36:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.950221 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 07:39:47.265537244 +0000 UTC Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.961817 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.961896 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.961938 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:28 crc kubenswrapper[5030]: I0120 22:36:28.961824 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:28 crc kubenswrapper[5030]: E0120 22:36:28.962048 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:28 crc kubenswrapper[5030]: E0120 22:36:28.962182 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:28 crc kubenswrapper[5030]: E0120 22:36:28.962312 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:28 crc kubenswrapper[5030]: E0120 22:36:28.962401 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.023694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.023728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.023739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.023760 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.023772 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.126589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.126670 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.126684 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.126701 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.126713 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.229338 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.229398 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.229412 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.229430 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.229445 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.331343 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.331399 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.331414 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.331439 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.331455 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.433386 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.433444 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.433465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.433489 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.433509 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.535965 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.536075 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.536112 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.536174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.536194 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.648103 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.648174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.648193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.648219 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.648238 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.751309 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.751354 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.751368 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.751386 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.751397 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.854745 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.854790 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.854803 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.854824 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.854837 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.951011 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 07:49:01.776024036 +0000 UTC Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.957133 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.957176 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.957184 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.957201 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:29 crc kubenswrapper[5030]: I0120 22:36:29.957214 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:29Z","lastTransitionTime":"2026-01-20T22:36:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.059901 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.059948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.059961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.059979 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.059991 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.163424 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.163466 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.163478 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.163494 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.163507 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.266406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.266466 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.266482 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.266505 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.266525 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.368926 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.369052 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.369077 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.369109 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.369130 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.472395 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.472459 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.472477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.472503 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.472530 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.575729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.575809 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.575826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.575854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.575871 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.678299 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.678356 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.678368 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.678395 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.678408 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.781796 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.781860 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.781878 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.781902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.781921 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.884509 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.884582 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.884663 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.884698 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.884722 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.951724 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:39:44.449768185 +0000 UTC Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.961110 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.961181 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.961233 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.961314 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:30 crc kubenswrapper[5030]: E0120 22:36:30.961352 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:30 crc kubenswrapper[5030]: E0120 22:36:30.961473 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:30 crc kubenswrapper[5030]: E0120 22:36:30.961737 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:30 crc kubenswrapper[5030]: E0120 22:36:30.961899 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.988019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.988101 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.988125 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.988153 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:30 crc kubenswrapper[5030]: I0120 22:36:30.988172 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:30Z","lastTransitionTime":"2026-01-20T22:36:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.090822 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.090880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.090902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.090927 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.090948 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.193760 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.193826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.193842 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.193865 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.193882 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.296671 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.296721 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.296732 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.296748 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.296760 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.399298 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.399338 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.399347 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.399362 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.399373 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.502242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.502288 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.502307 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.502330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.502347 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.604797 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.604861 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.604878 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.604901 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.604923 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.707688 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.707750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.707770 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.707794 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.707811 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.811992 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.812057 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.812074 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.812098 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.812115 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.913671 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.913704 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.913712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.913724 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.913733 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:31Z","lastTransitionTime":"2026-01-20T22:36:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:31 crc kubenswrapper[5030]: I0120 22:36:31.952533 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 03:52:23.104075141 +0000 UTC Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.015729 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.015768 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.015777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.015789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.015798 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.118908 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.118958 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.118973 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.118993 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.119006 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.222108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.222177 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.222198 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.222223 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.222243 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.324036 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.324078 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.324090 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.324106 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.324118 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.426783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.426828 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.426839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.426855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.426867 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.530510 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.530553 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.530564 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.530583 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.530598 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.633155 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.633204 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.633216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.633232 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.633244 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.735126 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.735168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.735177 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.735193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.735202 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.838206 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.838242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.838252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.838266 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.838277 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.940142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.940221 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.940234 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.940252 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.940265 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:32Z","lastTransitionTime":"2026-01-20T22:36:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.953756 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:07:58.290104172 +0000 UTC Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.960962 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.960978 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.961017 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:32 crc kubenswrapper[5030]: I0120 22:36:32.961037 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:32 crc kubenswrapper[5030]: E0120 22:36:32.961077 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:32 crc kubenswrapper[5030]: E0120 22:36:32.961131 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:32 crc kubenswrapper[5030]: E0120 22:36:32.961232 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:32 crc kubenswrapper[5030]: E0120 22:36:32.961413 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.042792 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.042823 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.042831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.042844 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.042853 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.140059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:33 crc kubenswrapper[5030]: E0120 22:36:33.140255 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:33 crc kubenswrapper[5030]: E0120 22:36:33.140335 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:05.140314336 +0000 UTC m=+97.460574624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.145209 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.145266 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.145284 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.145308 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.145324 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.247799 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.247864 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.247882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.247905 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.247961 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.350115 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.350162 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.350173 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.350190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.350202 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.362527 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/0.log" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.362586 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e610661-5072-4aa0-b1f1-75410b7f663b" containerID="14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf" exitCode=1 Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.362638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerDied","Data":"14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.363022 5030 scope.go:117] "RemoveContainer" containerID="14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.386404 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.402147 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.416112 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.426910 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.438969 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:
59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452423 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452511 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452663 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452688 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.452701 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.465185 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.479686 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.492237 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.509692 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.519920 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.532113 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.546019 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.554704 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.554743 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.554752 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.554766 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.554775 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.558116 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.567046 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.580767 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.594401 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:33Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.657493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.657546 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.657558 5030 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.657575 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.657586 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.760348 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.760378 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.760387 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.760400 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.760408 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.862708 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.862781 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.862803 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.862831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.862855 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.954167 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 19:42:14.93018118 +0000 UTC Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.965246 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.965292 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.965307 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.965324 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:33 crc kubenswrapper[5030]: I0120 22:36:33.965338 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:33Z","lastTransitionTime":"2026-01-20T22:36:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.067933 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.067976 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.067987 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.068003 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.068017 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.170780 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.170816 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.170826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.170842 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.170853 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.273188 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.273215 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.273223 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.273235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.273244 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.368192 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/0.log" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.368251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerStarted","Data":"b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.374283 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.374311 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.374321 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.374333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.374342 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.383798 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.399762 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.433201 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.451375 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.467052 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.476609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.476713 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.476736 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.476765 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.476787 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.486217 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.495998 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.506034 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.517692 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.528053 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.541976 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.551220 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.562192 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.570737 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.578699 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.578740 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.578749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.578763 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.578775 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.581104 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.592989 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.601838 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:34Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.681198 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.681271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.681285 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.681302 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.681314 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.783865 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.783936 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.783959 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.783983 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.784001 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.886528 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.886569 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.886577 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.886591 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.886601 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.954424 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 17:15:57.384299739 +0000 UTC Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.961301 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.961309 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.961388 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.961440 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:34 crc kubenswrapper[5030]: E0120 22:36:34.961593 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:34 crc kubenswrapper[5030]: E0120 22:36:34.961769 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:34 crc kubenswrapper[5030]: E0120 22:36:34.961832 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:34 crc kubenswrapper[5030]: E0120 22:36:34.961937 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.989762 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.989821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.989838 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.989859 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:34 crc kubenswrapper[5030]: I0120 22:36:34.989876 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:34Z","lastTransitionTime":"2026-01-20T22:36:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.092281 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.092331 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.092344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.092363 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.092378 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.195235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.195286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.195302 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.195324 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.195342 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.297696 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.297779 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.297802 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.297831 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.297854 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.399930 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.399970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.399981 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.399998 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.400011 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.502660 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.502686 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.502694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.502706 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.502714 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.604917 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.605739 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.605940 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.606097 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.606286 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.709318 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.709361 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.709373 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.709389 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.709400 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.812351 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.812423 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.812444 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.812472 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.812519 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.915578 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.915674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.915700 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.915728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.915749 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:35Z","lastTransitionTime":"2026-01-20T22:36:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.955143 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 09:25:46.699692104 +0000 UTC Jan 20 22:36:35 crc kubenswrapper[5030]: I0120 22:36:35.963362 5030 scope.go:117] "RemoveContainer" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" Jan 20 22:36:35 crc kubenswrapper[5030]: E0120 22:36:35.963740 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.019077 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.019118 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.019127 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.019142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.019154 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.122268 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.122336 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.122354 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.122380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.122399 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.224312 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.224353 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.224362 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.224377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.224386 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.326369 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.326425 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.326446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.326493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.326521 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.428961 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.428996 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.429009 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.429025 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.429036 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.531405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.531440 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.531451 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.531467 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.531478 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.634666 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.634746 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.634776 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.634800 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.634817 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.737308 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.737375 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.737393 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.737415 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.737438 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.839737 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.839780 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.839789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.839806 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.839816 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.942125 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.942185 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.942196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.942219 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.942233 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:36Z","lastTransitionTime":"2026-01-20T22:36:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.955658 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 11:50:34.996818186 +0000 UTC Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.962033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.962113 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:36 crc kubenswrapper[5030]: E0120 22:36:36.962157 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.962193 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:36 crc kubenswrapper[5030]: I0120 22:36:36.962224 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:36 crc kubenswrapper[5030]: E0120 22:36:36.962359 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:36 crc kubenswrapper[5030]: E0120 22:36:36.962482 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:36 crc kubenswrapper[5030]: E0120 22:36:36.962557 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.045199 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.045256 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.045267 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.045280 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.045288 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.147671 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.147716 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.147733 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.147755 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.147772 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.251229 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.251295 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.251319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.251350 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.251373 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.353654 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.353699 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.353712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.353730 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.353741 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.455990 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.456029 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.456038 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.456054 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.456065 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.559155 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.559188 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.559196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.559208 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.559216 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.662108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.662161 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.662174 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.662196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.662209 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.730377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.730712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.730814 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.730912 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.730995 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.746443 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.750357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.750396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.750405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.750420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.750430 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.763371 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.767304 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.767483 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.767514 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.767544 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.767566 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.780526 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.783924 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.783951 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.783958 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.783971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.783981 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.794983 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.798081 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.798105 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.798113 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.798133 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.798143 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.808339 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: E0120 22:36:37.808492 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.810173 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.810253 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.810268 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.810287 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.810324 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.912453 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.912488 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.912497 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.912530 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.912541 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:37Z","lastTransitionTime":"2026-01-20T22:36:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.956062 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 16:29:45.562885725 +0000 UTC Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.981548 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:37 crc kubenswrapper[5030]: I0120 22:36:37.995939 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:37Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.009385 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.014344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.014378 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.014389 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.014436 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.014449 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.024274 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.038246 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.048526 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.062284 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.076163 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.086412 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.104610 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.116959 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.117006 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.117018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.117034 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 
22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.117048 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.121938 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.144447 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.159000 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.172342 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.191375 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.203259 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.216814 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:38Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.219446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.219490 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.219502 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.219523 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.219535 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.322247 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.322305 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.322315 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.322330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.322340 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.424783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.424830 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.424839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.424858 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.424869 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.531657 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.531712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.531734 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.531761 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.531781 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.634340 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.634664 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.634811 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.634911 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.635007 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.737533 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.737871 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.738001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.738104 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.738207 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.841168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.841211 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.841220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.841235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.841244 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.944055 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.944137 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.944158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.944187 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.944208 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:38Z","lastTransitionTime":"2026-01-20T22:36:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.957232 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 22:57:23.75348233 +0000 UTC Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.961496 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:38 crc kubenswrapper[5030]: E0120 22:36:38.961640 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.961699 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.961749 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:38 crc kubenswrapper[5030]: E0120 22:36:38.961776 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:38 crc kubenswrapper[5030]: E0120 22:36:38.962060 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:38 crc kubenswrapper[5030]: I0120 22:36:38.962125 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:38 crc kubenswrapper[5030]: E0120 22:36:38.962173 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.047358 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.047427 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.047446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.047473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.047493 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.149578 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.149650 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.149662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.149681 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.149694 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.252611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.252694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.252712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.252735 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.252753 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.355521 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.355580 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.355597 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.355648 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.355671 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.458504 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.458551 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.458565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.458584 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.458598 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.562167 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.562200 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.562208 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.562239 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.562249 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.666228 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.666260 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.666270 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.666286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.666299 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.770543 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.770608 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.770645 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.770668 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.770686 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.873141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.873169 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.873176 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.873190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.873199 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.958477 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 18:34:10.470429054 +0000 UTC Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.974786 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.974876 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.974924 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.974948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:39 crc kubenswrapper[5030]: I0120 22:36:39.974968 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:39Z","lastTransitionTime":"2026-01-20T22:36:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.077819 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.077867 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.077877 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.077896 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.077908 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.180349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.180396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.180408 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.180426 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.180439 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.283589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.283679 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.283702 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.283730 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.283754 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.385934 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.385972 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.385980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.385994 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.386003 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.488881 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.488932 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.488949 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.488971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.488987 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.591614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.591678 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.591688 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.591703 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.591713 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.694933 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.694970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.694979 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.694994 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.695005 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.797243 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.797277 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.797287 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.797302 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.797311 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.900556 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.900917 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.901026 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.901137 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.901230 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:40Z","lastTransitionTime":"2026-01-20T22:36:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.959437 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 19:20:42.768267876 +0000 UTC Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.961835 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.961868 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.961848 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:40 crc kubenswrapper[5030]: E0120 22:36:40.961994 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:40 crc kubenswrapper[5030]: I0120 22:36:40.961851 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:40 crc kubenswrapper[5030]: E0120 22:36:40.962108 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:40 crc kubenswrapper[5030]: E0120 22:36:40.962190 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:40 crc kubenswrapper[5030]: E0120 22:36:40.962282 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.004220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.004543 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.004880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.005164 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.005276 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.111250 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.111303 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.111320 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.111342 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.111357 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.214335 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.214380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.214392 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.214420 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.214432 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.317105 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.317171 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.317191 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.317216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.317270 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.419985 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.420453 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.420676 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.420821 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.420976 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.523941 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.524001 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.524018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.524042 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.524059 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.626726 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.626766 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.626777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.626795 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.626810 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.730261 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.730309 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.730325 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.730342 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.730353 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.833153 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.833220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.833238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.833261 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.833279 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.936323 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.936370 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.936382 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.936401 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.936413 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:41Z","lastTransitionTime":"2026-01-20T22:36:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:41 crc kubenswrapper[5030]: I0120 22:36:41.960938 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 15:26:56.147974308 +0000 UTC Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.038536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.038582 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.038593 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.038609 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.038642 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.141364 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.141440 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.141465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.141495 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.141517 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.244613 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.244718 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.244737 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.244761 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.244781 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.348243 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.348380 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.348406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.348437 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.348459 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.451562 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.451604 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.451615 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.451648 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.451660 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.553691 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.553758 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.553767 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.553783 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.553795 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.655993 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.656031 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.656043 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.656061 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.656073 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.758111 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.758158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.758170 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.758186 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.758198 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.861321 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.861385 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.861405 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.861430 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.861448 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.961404 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:31:02.731190349 +0000 UTC Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.961457 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.961491 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.961472 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:42 crc kubenswrapper[5030]: E0120 22:36:42.961687 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:42 crc kubenswrapper[5030]: E0120 22:36:42.961578 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.961727 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:42 crc kubenswrapper[5030]: E0120 22:36:42.961777 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:42 crc kubenswrapper[5030]: E0120 22:36:42.961815 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.964259 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.964282 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.964289 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.964300 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:42 crc kubenswrapper[5030]: I0120 22:36:42.964310 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:42Z","lastTransitionTime":"2026-01-20T22:36:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.065998 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.066050 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.066061 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.066079 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.066094 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.169200 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.169238 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.169246 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.169259 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.169267 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.271667 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.271738 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.271755 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.271778 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.271794 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.374139 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.374203 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.374216 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.374235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.374246 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.476727 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.476798 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.476813 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.476833 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.476845 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.580239 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.580299 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.580319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.580342 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.580368 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.683452 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.683546 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.683558 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.683580 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.683596 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.786593 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.786728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.786750 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.786777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.786796 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.890372 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.890407 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.890416 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.890431 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.890444 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.962261 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 08:23:31.259621113 +0000 UTC Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.993508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.993572 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.993593 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.993657 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:43 crc kubenswrapper[5030]: I0120 22:36:43.993684 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:43Z","lastTransitionTime":"2026-01-20T22:36:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.096168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.096211 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.096219 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.096236 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.096245 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.199762 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.199830 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.199854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.199882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.199902 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.302858 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.302923 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.302942 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.302971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.302990 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.405564 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.405611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.405655 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.405675 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.405692 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.508675 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.508728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.508744 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.508767 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.508783 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.611994 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.612129 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.612149 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.612171 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.612187 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.714903 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.714968 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.714985 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.715010 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.715026 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.818595 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.818680 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.818697 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.818725 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.818743 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.922383 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.922446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.922462 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.922486 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.922507 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:44Z","lastTransitionTime":"2026-01-20T22:36:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.961352 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:44 crc kubenswrapper[5030]: E0120 22:36:44.961533 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.961611 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.961721 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.962122 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:44 crc kubenswrapper[5030]: E0120 22:36:44.962240 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:44 crc kubenswrapper[5030]: E0120 22:36:44.962183 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.962387 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:32:55.881423205 +0000 UTC Jan 20 22:36:44 crc kubenswrapper[5030]: E0120 22:36:44.962437 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:44 crc kubenswrapper[5030]: I0120 22:36:44.978814 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.025977 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.026038 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.026054 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.026083 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.026102 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.129764 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.129846 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.129867 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.129895 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.129915 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.233017 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.233078 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.233095 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.233117 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.233136 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.335948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.335983 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.335991 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.336023 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.336034 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.439233 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.439327 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.439345 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.439368 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.439387 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.542393 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.542511 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.542534 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.542563 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.542587 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.645991 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.646378 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.646567 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.646764 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.646928 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.750912 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.751338 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.751479 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.751673 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.751822 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.854957 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.855019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.855038 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.855064 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.855084 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.958446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.958548 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.958568 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.958592 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.958612 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:45Z","lastTransitionTime":"2026-01-20T22:36:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:45 crc kubenswrapper[5030]: I0120 22:36:45.962758 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 00:43:53.132312815 +0000 UTC Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.062162 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.062220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.062235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.062271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.062285 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.165901 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.165986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.166012 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.166043 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.166065 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.268860 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.268928 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.268944 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.268977 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.268996 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.372087 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.372155 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.372173 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.372198 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.372216 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.476337 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.476446 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.476470 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.476503 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.476528 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.580050 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.580108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.580124 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.580146 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.580164 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.682971 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.683049 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.683066 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.683090 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.683108 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.786088 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.786147 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.786165 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.786188 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.786207 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.889536 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.889577 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.889589 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.889606 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.889616 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.961854 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.961953 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.961871 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:46 crc kubenswrapper[5030]: E0120 22:36:46.962060 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.961854 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:46 crc kubenswrapper[5030]: E0120 22:36:46.962345 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:46 crc kubenswrapper[5030]: E0120 22:36:46.962327 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:46 crc kubenswrapper[5030]: E0120 22:36:46.962442 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.963052 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:46:46.366352606 +0000 UTC Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.992205 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.992243 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.992254 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.992268 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:46 crc kubenswrapper[5030]: I0120 22:36:46.992280 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:46Z","lastTransitionTime":"2026-01-20T22:36:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.095264 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.095314 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.095330 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.095353 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.095380 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.198890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.198959 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.198976 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.199019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.199054 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.302505 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.302566 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.302587 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.303280 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.303362 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.406460 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.406553 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.406576 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.406607 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.406668 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.509803 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.509872 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.509896 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.509925 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.509948 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.613090 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.613183 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.613199 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.613226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.613243 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.715931 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.715970 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.715981 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.715999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.716013 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.818294 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.818329 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.818341 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.818356 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.818366 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.922226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.922254 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.922264 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.922278 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.922288 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.946806 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.946853 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.946869 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.946890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.946904 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.963405 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 20:40:21.778889718 +0000 UTC Jan 20 22:36:47 crc kubenswrapper[5030]: E0120 22:36:47.967385 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.975073 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.975121 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.975140 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.975163 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.975181 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:47Z","lastTransitionTime":"2026-01-20T22:36:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.978696 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:47 crc kubenswrapper[5030]: E0120 22:36:47.995992 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-01-20T22:36:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:47 crc kubenswrapper[5030]: I0120 22:36:47.997498 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:47Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.000999 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.001035 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.001047 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.001076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.001095 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.014059 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.023981 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"e
f8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.028289 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.028357 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.028381 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.028409 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.028434 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.037319 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.051458 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056668 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056723 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056749 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056762 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.056789 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.072487 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.075183 5030 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8166daa9-857d-40f2-9d8b-f852d2b2e6fb\\\",\\\"systemUUID\\\":\\\"ef8501f1-c822-4135-bd61-b33341e82dce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.075414 5030 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.077255 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.077308 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.077327 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.077350 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.077364 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.086235 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.098357 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.107413 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.119672 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.130908 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0e294fc-1401-4e47-ab95-08026c695da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a087c314f74fd0dd33aa0b5e7c70d924ab48168afe3d66fff4728cb82efddbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.144122 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.174533 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e6
56b1268639b04a13b213dc1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.181234 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.181298 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.181319 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.181349 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.181369 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.193962 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.204907 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.226922 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.243077 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.258489 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:48Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.284250 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.284299 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.284318 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.284343 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.284361 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.386407 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.386487 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.386511 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.386544 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.386567 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.489327 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.489390 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.489413 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.489443 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.489468 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.592087 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.592144 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.592164 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.592193 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.592214 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.695459 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.695535 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.695558 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.695584 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.695601 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.798310 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.798365 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.798383 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.798406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.798424 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.901770 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.901855 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.901880 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.901911 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.901936 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:48Z","lastTransitionTime":"2026-01-20T22:36:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.961521 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.961569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.961570 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.961684 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.961825 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.962054 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.962214 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:48 crc kubenswrapper[5030]: E0120 22:36:48.962351 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:48 crc kubenswrapper[5030]: I0120 22:36:48.964497 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 06:00:47.408055164 +0000 UTC Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.004808 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.004870 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.004887 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.004912 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.004931 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.108604 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.108707 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.108764 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.108789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.108806 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.211809 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.211888 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.211915 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.211947 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.211970 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.316818 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.316889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.316907 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.316930 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.316954 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.420079 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.420141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.420154 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.420175 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.420190 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.523066 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.523121 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.523136 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.523158 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.523173 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.626026 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.626107 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.626131 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.626162 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.626189 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.729858 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.729924 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.729948 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.729979 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.730003 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.833744 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.833916 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.833943 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.833972 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.833998 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.936960 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.937019 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.937037 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.937076 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.937093 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:49Z","lastTransitionTime":"2026-01-20T22:36:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.962669 5030 scope.go:117] "RemoveContainer" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" Jan 20 22:36:49 crc kubenswrapper[5030]: I0120 22:36:49.965285 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:19:02.203395648 +0000 UTC Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.040226 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.040784 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.040807 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.040866 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.040887 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.144475 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.144540 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.144558 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.144584 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.144603 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.246854 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.246916 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.246934 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.246958 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.246976 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.349340 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.349377 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.349387 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.349400 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.349410 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.452108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.452180 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.452204 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.452235 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.452257 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.554541 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.554613 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.554703 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.554733 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.554754 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.658078 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.658116 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.658127 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.658142 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.658153 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.761086 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.761159 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.761177 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.761203 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.761221 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.864388 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.864424 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.864440 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.864462 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.864480 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.945290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.945404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945433 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.945406817 +0000 UTC m=+147.265667145 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.945467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.945515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945534 5030 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.945580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945600 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.945576401 +0000 UTC m=+147.265836709 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945759 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945760 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945783 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945798 5030 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945803 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945813 5030 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945856 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.945841917 +0000 UTC m=+147.266102215 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945863 5030 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945878 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.945867418 +0000 UTC m=+147.266127726 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.945897 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.945885418 +0000 UTC m=+147.266145716 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.961824 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.962027 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.962536 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.962747 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.962975 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.963083 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.963327 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:50 crc kubenswrapper[5030]: E0120 22:36:50.963907 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.965616 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 20:50:11.176323163 +0000 UTC Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.966829 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.966860 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.966890 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.966906 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:50 crc kubenswrapper[5030]: I0120 22:36:50.966916 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:50Z","lastTransitionTime":"2026-01-20T22:36:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.069981 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.070045 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.070063 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.070084 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.070095 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.172547 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.172582 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.172593 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.172612 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.172649 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.274187 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.274220 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.274228 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.274242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.274251 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.376388 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.376441 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.376450 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.376465 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.376474 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.429396 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/2.log" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.432808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.433275 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.451660 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f
5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.470011 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.478760 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.478796 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.478804 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.478816 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.478825 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.489902 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.507119 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.521693 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.532849 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.549205 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.571461 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.580958 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.581044 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.581060 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.581083 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.581099 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.592755 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.607074 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.621517 5030 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.634410 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.645990 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.657288 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.672975 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683092 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0e294fc-1401-4e47-ab95-08026c695da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a087c314f74fd0dd33aa0b5e7c70d924ab48168afe3d66fff4728cb82efddbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683603 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683652 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683675 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.683685 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.693555 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.710514 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42fd5a8b6914585e14021b107a968d5d3d450344
814968e90b8fff1d789f77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:51Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.786493 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.786559 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.786583 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.786616 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.786750 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.890437 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.890529 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.890547 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.890578 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.890598 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.966296 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:29:15.544393007 +0000 UTC Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.993242 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.993296 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.993310 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.993333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:51 crc kubenswrapper[5030]: I0120 22:36:51.993348 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:51Z","lastTransitionTime":"2026-01-20T22:36:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.096792 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.096839 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.096850 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.096867 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.096881 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.199902 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.199966 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.199988 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.200020 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.200040 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.303309 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.303418 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.303449 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.303482 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.303502 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.407225 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.407265 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.407274 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.407292 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.407304 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.439192 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/3.log" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.440359 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/2.log" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.444765 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" exitCode=1 Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.444821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.444881 5030 scope.go:117] "RemoveContainer" containerID="8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.445904 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:36:52 crc kubenswrapper[5030]: E0120 22:36:52.446357 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.470764 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.489835 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.507908 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.509912 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.509980 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.510161 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.510279 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.510323 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.528472 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef
4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.544701 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.561123 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.576466 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.591691 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.613483 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.613545 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.613565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.613597 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.613652 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.621894 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.661332 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.677705 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.693853 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.706924 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.716889 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.716956 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.716969 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.717010 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.717024 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.717149 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.740662 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apise
rver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 
2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.753157 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0e294fc-1401-4e47-ab95-08026c695da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a087c314f74fd0dd33aa0b5e7c70d924ab48168afe3d66fff4728cb82efddbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.769519 5030 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.792144 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e658be3e2895b5bed24d699af8b6f2bf3e129e656b1268639b04a13b213dc1c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:23Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:22.834729 6720 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 22:36:22.834777 6720 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 22:36:22.834806 6720 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 22:36:22.834873 6720 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 22:36:22.834941 6720 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 22:36:22.835370 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 22:36:22.835483 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 22:36:22.835550 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:22.835583 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:22.835729 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:51Z\\\",\\\"message\\\":\\\"bor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:51.568275 7139 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-f2dgj before timer (time: 2026-01-20 22:36:53.00259098 +0000 UTC m=+2.001671091): skip\\\\nI0120 22:36:51.568305 7139 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 52.891µs)\\\\nI0120 22:36:51.568226 7139 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0120 22:36:51.568351 7139 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 
22:36:51.568377 7139 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:36:51.568414 7139 factory.go:656] Stopping watch factory\\\\nI0120 22:36:51.568433 7139 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:51.568501 7139 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:36:51.568518 7139 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:51.568586 7139 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mou
ntPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:52Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.820432 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.820462 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.820473 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.820491 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.820503 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.923448 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.923508 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.923526 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.923550 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.923567 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:52Z","lastTransitionTime":"2026-01-20T22:36:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.962052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.962083 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.962065 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:52 crc kubenswrapper[5030]: E0120 22:36:52.962243 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.962279 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:52 crc kubenswrapper[5030]: E0120 22:36:52.962467 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:52 crc kubenswrapper[5030]: E0120 22:36:52.962778 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:52 crc kubenswrapper[5030]: E0120 22:36:52.962827 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:52 crc kubenswrapper[5030]: I0120 22:36:52.967606 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:10:32.156843571 +0000 UTC Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.027003 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.027404 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.027565 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.027754 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.027942 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.131414 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.131460 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.131477 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.131496 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.131511 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.233256 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.233286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.233311 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.233324 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.233333 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.335614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.335662 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.335674 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.335689 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.335699 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.438772 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.438847 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.438883 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.438918 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.438940 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.452270 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/3.log" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.458489 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:36:53 crc kubenswrapper[5030]: E0120 22:36:53.459729 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.481272 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.500929 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2a963db-b558-4e9a-8e56-12636c3fe1c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2bdb389d85255666e2af8574aa906f25dcfb7974adddd75dfe3ee0407a3da39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p7pmb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qb97t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.523389 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n8v4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e610661-5072-4aa0-b1f1-75410b7f663b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:33Z\\\",\\\"message\\\":\\\"2026-01-20T22:35:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225\\\\n2026-01-20T22:35:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6ed652b3-fa01-4329-a0d2-6bd0b5e1f225 to /host/opt/cni/bin/\\\\n2026-01-20T22:35:48Z [verbose] multus-daemon started\\\\n2026-01-20T22:35:48Z [verbose] Readiness Indicator file check\\\\n2026-01-20T22:36:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xj9mn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n8v4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542344 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542386 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542400 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542421 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542436 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.542695 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b3c414b-bc29-41aa-a369-ecc2cc809691\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n47nm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:36:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-f2dgj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.564658 5030 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b24841d3-f4d6-4e58-b4aa-f8d2a41fb281\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://884afb03e1894fb9180b62f17058826115b787008451bd230a4104183ad004a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768f237e5550046fe142cbc9b972fb1bfcd6fdbad2fdecd8e96d5e760b73dbd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://517113c3eb36dc8e1322d0dd40f4d15ceb30448ec3d05be6f61330ea7068278c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.584963 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.605407 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a87d96a3a08f1375404f64a0adec89964995972a621f4fb1d69685539e562f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://248a016345ede09061b263b51e5dd022f9f63f9a489f35cb9dd2c0ee4a371c30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.621354 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fqn26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8ba00cb-4776-41c1-87d9-9ba37894c69e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74b775a1b8f2c2a573ac416c4dce49b964965ee3006cf72ca42d4107db4c0066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rdmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fqn26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.642069 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56b27c65f86447e0ff20fecd12f82b2a5b3572a398d894460b23bdf141b99fbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.645256 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.645315 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.645333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.645359 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.645376 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.661371 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.691058 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf449d4a-2037-4802-8370-1965d3026c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T22:36:51Z\\\",\\\"message\\\":\\\"bor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.150:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 22:36:51.568275 7139 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-f2dgj before timer (time: 2026-01-20 22:36:53.00259098 +0000 UTC m=+2.001671091): skip\\\\nI0120 22:36:51.568305 7139 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 52.891µs)\\\\nI0120 22:36:51.568226 7139 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0120 22:36:51.568351 7139 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0120 22:36:51.568377 7139 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 22:36:51.568414 7139 factory.go:656] Stopping watch factory\\\\nI0120 22:36:51.568433 7139 ovnkube.go:599] Stopped ovnkube\\\\nI0120 22:36:51.568501 7139 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 22:36:51.568518 7139 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 22:36:51.568586 7139 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T22:36:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6wk7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kq4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.711966 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c2f1372-befa-4af3-89cb-869b61ef944e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.728348 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0e294fc-1401-4e47-ab95-08026c695da4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a087c314f74fd0dd33aa0b5e7c70d924ab48168afe3d66fff4728cb82efddbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec37384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec3
7384891d34d90b6a170761cba27322de98993f08332581d5fbc3cdfef52ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.749035 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.749160 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.749184 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.749213 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.749235 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.753894 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18c97a93-96a8-40e8-9f36-513f91962906\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea97717155f27494be8a99a50db3e2d02055ce42d3eeeec60cdcef4cf6bd82d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e99a8a3708cd42aad50e3bbcc30ef4b4fec94ba3ab6a94ff8b360e14cface0b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f86f596b6e325f0ea1bcc832660cfa46aad6584d1f998fee49595b5258b3e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb84aa3c796104f62a830527132f851a5c329d64a7055f8925c998dcfb10aea1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe1704e409f4d4f31f2ca8712410576e75b081ef7b7f7c2d948d77a47bcfca2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15932208df911d57792530198a613c42074078f7310e965471acc6af6154e8f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81748b024bd8d2ba95cdf05483979d6ea9e8af60d15bc29189231a56f2f5ffc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljpgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zzvtq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.771078 5030 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-5zbvj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02009b42-ac8c-4b5e-8c69-9102da70a748\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af86d14feb0fd71e6f04660f058e6bd50501ddfe013d83aa872104ce3a048f24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qczpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:46Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5zbvj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.789066 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382d80ca-0935-4f90-b822-33b31b79b1f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a5a75ab6709e352bd665fdebde9bc20b0497be9398043686737c0cc2c1c5b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://876255119ccaaa1038d803cff408ecd6d934805042b6722b026f6355a46044b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:36:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dkjb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ccgf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 
22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.807476 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8d9c2e0-ed03-4fda-b0fc-43a3e53a2a30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:36:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95a93fba3f4f3480ed58b7b22535695dcb978c47afd9f4fec0886d49a88d1ad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5324b780d1462962d65c46a0abda0743ce74550c1571862713fa79256b316aaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b87b63fa491cf7fb6be320cd3242839fe8f5f7347876548e29891982c295b92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a474b219e51f2e3d7747aa44b641ad7f195854636f0f7aaca13669cb6c50d92\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T22:35:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T22:35:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T22:35:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.826198 5030 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T22:35:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e1a028041e67b281e783aafcf52f57d0a3be27504c8c87f2a06f501a0862a6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T22:35:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T22:36:53Z is after 2025-08-24T17:21:41Z" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.855267 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.855427 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.856303 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.856342 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.856354 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.958571 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.958647 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.958664 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.958679 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.958689 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:53Z","lastTransitionTime":"2026-01-20T22:36:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:53 crc kubenswrapper[5030]: I0120 22:36:53.968186 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 19:18:50.860617458 +0000 UTC Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.062196 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.062278 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.062302 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.062333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.062359 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.166400 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.166471 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.166491 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.166516 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.166534 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.269209 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.269280 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.269306 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.269333 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.269355 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.371976 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.372039 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.372059 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.372083 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.372101 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.475576 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.475769 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.475789 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.475815 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.475832 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.579039 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.579109 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.579130 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.579159 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.579180 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.682550 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.682654 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.682688 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.682712 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.682730 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.785474 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.785588 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.785611 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.785677 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.785703 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.888039 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.888092 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.888106 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.888125 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.888137 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.961708 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.961772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.961900 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:54 crc kubenswrapper[5030]: E0120 22:36:54.962026 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.962075 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:54 crc kubenswrapper[5030]: E0120 22:36:54.962231 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:54 crc kubenswrapper[5030]: E0120 22:36:54.962317 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:54 crc kubenswrapper[5030]: E0120 22:36:54.962424 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.968719 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 09:32:50.240962174 +0000 UTC Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.991210 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.991253 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.991270 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.991291 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:54 crc kubenswrapper[5030]: I0120 22:36:54.991308 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:54Z","lastTransitionTime":"2026-01-20T22:36:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.093586 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.093677 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.093693 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.093711 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.093720 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.196685 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.196759 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.196777 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.196840 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.196858 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.299979 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.300050 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.300074 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.300103 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.300127 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.410043 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.410108 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.410134 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.410166 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.410190 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.514279 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.514345 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.514366 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.514390 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.514409 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.618124 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.618210 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.618233 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.618258 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.618275 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.720807 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.720882 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.720900 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.720930 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.720947 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.824486 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.824668 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.824696 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.824726 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.824747 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.927406 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.927447 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.927457 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.927476 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.927488 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:55Z","lastTransitionTime":"2026-01-20T22:36:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:55 crc kubenswrapper[5030]: I0120 22:36:55.969168 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 08:50:41.744599938 +0000 UTC Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.030600 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.030711 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.030734 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.030762 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.030796 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.133897 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.134383 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.134570 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.134886 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.135072 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.238512 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.238552 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.238566 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.238588 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.238602 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.342286 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.342348 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.342369 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.342398 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.342419 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.445067 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.445125 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.445144 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.445168 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.445185 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.548673 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.548764 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.548790 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.548826 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.549115 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.652114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.652190 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.652208 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.652233 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.652250 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.755728 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.755796 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.755817 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.755840 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.755860 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.858645 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.858694 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.858711 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.858733 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.858749 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.960719 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.960788 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.960822 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.960941 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.960980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:56 crc kubenswrapper[5030]: E0120 22:36:56.961054 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.961095 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.961150 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:56Z","lastTransitionTime":"2026-01-20T22:36:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:56 crc kubenswrapper[5030]: E0120 22:36:56.961251 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.961295 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.961323 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:56 crc kubenswrapper[5030]: E0120 22:36:56.961498 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:56 crc kubenswrapper[5030]: E0120 22:36:56.961674 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:56 crc kubenswrapper[5030]: I0120 22:36:56.970226 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:54:38.034516066 +0000 UTC Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.064886 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.064945 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.064962 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.064986 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.065006 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.168065 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.168110 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.168121 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.168141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.168163 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.271271 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.271337 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.271354 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.271384 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.271402 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.374396 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.374460 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.374476 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.374499 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.374518 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.476748 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.476794 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.476810 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.476833 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.476850 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.580522 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.580610 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.580690 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.580722 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.580749 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.684177 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.684227 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.684237 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.684255 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.684268 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.787059 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.787103 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.787114 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.787132 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.787145 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.890186 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.890267 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.890291 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.890321 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.890338 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.971433 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 00:00:58.437993931 +0000 UTC Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.993037 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.993096 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.993112 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.993141 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:57 crc kubenswrapper[5030]: I0120 22:36:57.993160 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:57Z","lastTransitionTime":"2026-01-20T22:36:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.013801 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.013773401 podStartE2EDuration="43.013773401s" podCreationTimestamp="2026-01-20 22:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:57.995202179 +0000 UTC m=+90.315462537" watchObservedRunningTime="2026-01-20 22:36:58.013773401 +0000 UTC m=+90.334033729" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.055361 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zzvtq" podStartSLOduration=72.055324571 podStartE2EDuration="1m12.055324571s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.03974435 +0000 UTC m=+90.360004678" watchObservedRunningTime="2026-01-20 22:36:58.055324571 +0000 UTC m=+90.375584909" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.055740 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5zbvj" podStartSLOduration=72.05572632 podStartE2EDuration="1m12.05572632s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.055087066 +0000 UTC m=+90.375347364" watchObservedRunningTime="2026-01-20 22:36:58.05572632 +0000 UTC m=+90.375986648" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.089806 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ccgf5" podStartSLOduration=71.089777532 podStartE2EDuration="1m11.089777532s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.076502335 +0000 UTC m=+90.396762633" watchObservedRunningTime="2026-01-20 22:36:58.089777532 +0000 UTC m=+90.410037850" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.095942 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.095985 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.096000 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.096018 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.096032 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:58Z","lastTransitionTime":"2026-01-20T22:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.118349 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.118323732 podStartE2EDuration="1m11.118323732s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.111819657 +0000 UTC m=+90.432079965" watchObservedRunningTime="2026-01-20 22:36:58.118323732 +0000 UTC m=+90.438584030" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.155563 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.155603 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.155614 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.155649 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.155661 5030 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T22:36:58Z","lastTransitionTime":"2026-01-20T22:36:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.189114 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podStartSLOduration=72.189092297 podStartE2EDuration="1m12.189092297s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.16817269 +0000 UTC m=+90.488432988" watchObservedRunningTime="2026-01-20 22:36:58.189092297 +0000 UTC m=+90.509352595" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.189228 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n8v4f" podStartSLOduration=72.189221411 podStartE2EDuration="1m12.189221411s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.188920033 +0000 UTC m=+90.509180351" watchObservedRunningTime="2026-01-20 22:36:58.189221411 +0000 UTC m=+90.509481709" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.212498 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7"] Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.213317 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.215012 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.215649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.215733 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.215981 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.238848 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fqn26" podStartSLOduration=72.238824512 podStartE2EDuration="1m12.238824512s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.237825338 +0000 UTC m=+90.558085636" watchObservedRunningTime="2026-01-20 22:36:58.238824512 +0000 UTC m=+90.559084810" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.253560 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.253541993 podStartE2EDuration="1m12.253541993s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.253199774 +0000 UTC m=+90.573460062" watchObservedRunningTime="2026-01-20 22:36:58.253541993 +0000 UTC m=+90.573802291" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.268608 
5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.268586811 podStartE2EDuration="14.268586811s" podCreationTimestamp="2026-01-20 22:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:58.267885604 +0000 UTC m=+90.588145892" watchObservedRunningTime="2026-01-20 22:36:58.268586811 +0000 UTC m=+90.588847119" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.319863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.319924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3da81e07-8c55-4286-9a9e-44ee0a0eb015-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.319958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.320017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da81e07-8c55-4286-9a9e-44ee0a0eb015-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.320083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da81e07-8c55-4286-9a9e-44ee0a0eb015-service-ca\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da81e07-8c55-4286-9a9e-44ee0a0eb015-service-ca\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: 
\"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3da81e07-8c55-4286-9a9e-44ee0a0eb015-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3da81e07-8c55-4286-9a9e-44ee0a0eb015-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.421518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da81e07-8c55-4286-9a9e-44ee0a0eb015-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.422312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3da81e07-8c55-4286-9a9e-44ee0a0eb015-service-ca\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.429833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da81e07-8c55-4286-9a9e-44ee0a0eb015-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.451837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3da81e07-8c55-4286-9a9e-44ee0a0eb015-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-76jh7\" (UID: \"3da81e07-8c55-4286-9a9e-44ee0a0eb015\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc 
kubenswrapper[5030]: I0120 22:36:58.530310 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.961223 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.961275 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.961237 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.961356 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:36:58 crc kubenswrapper[5030]: E0120 22:36:58.961373 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:36:58 crc kubenswrapper[5030]: E0120 22:36:58.961512 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:36:58 crc kubenswrapper[5030]: E0120 22:36:58.961778 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:36:58 crc kubenswrapper[5030]: E0120 22:36:58.961894 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.971716 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:03:01.880990739 +0000 UTC Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.971779 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 20 22:36:58 crc kubenswrapper[5030]: I0120 22:36:58.981793 5030 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 22:36:59 crc kubenswrapper[5030]: I0120 22:36:59.483466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" event={"ID":"3da81e07-8c55-4286-9a9e-44ee0a0eb015","Type":"ContainerStarted","Data":"c6bb70dd90fafc7cf6a189e7ac76ac47b12d24a5bc56e285bf6b9bd8c4ef58a6"} Jan 20 22:36:59 crc kubenswrapper[5030]: I0120 22:36:59.483519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" event={"ID":"3da81e07-8c55-4286-9a9e-44ee0a0eb015","Type":"ContainerStarted","Data":"7fe54c06582c225d45858184893e0b5fc9ef26a7399cce7c51ac38a070509eae"} Jan 20 22:36:59 crc kubenswrapper[5030]: I0120 22:36:59.506841 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-76jh7" podStartSLOduration=73.506807265 podStartE2EDuration="1m13.506807265s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:36:59.505920143 +0000 UTC m=+91.826180481" watchObservedRunningTime="2026-01-20 22:36:59.506807265 +0000 UTC m=+91.827067603" Jan 20 22:37:00 crc kubenswrapper[5030]: I0120 22:37:00.961695 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:00 crc kubenswrapper[5030]: E0120 22:37:00.962216 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:00 crc kubenswrapper[5030]: I0120 22:37:00.961753 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:00 crc kubenswrapper[5030]: E0120 22:37:00.962333 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:00 crc kubenswrapper[5030]: I0120 22:37:00.961709 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:00 crc kubenswrapper[5030]: I0120 22:37:00.961829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:00 crc kubenswrapper[5030]: E0120 22:37:00.962446 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:00 crc kubenswrapper[5030]: E0120 22:37:00.962557 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:00 crc kubenswrapper[5030]: I0120 22:37:00.980343 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 20 22:37:02 crc kubenswrapper[5030]: I0120 22:37:02.961446 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:02 crc kubenswrapper[5030]: I0120 22:37:02.961491 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:02 crc kubenswrapper[5030]: I0120 22:37:02.961505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:02 crc kubenswrapper[5030]: E0120 22:37:02.961866 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:02 crc kubenswrapper[5030]: E0120 22:37:02.961690 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:02 crc kubenswrapper[5030]: E0120 22:37:02.961923 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:02 crc kubenswrapper[5030]: I0120 22:37:02.961519 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:02 crc kubenswrapper[5030]: E0120 22:37:02.961993 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:04 crc kubenswrapper[5030]: I0120 22:37:04.962120 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:04 crc kubenswrapper[5030]: I0120 22:37:04.962121 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:04 crc kubenswrapper[5030]: I0120 22:37:04.962160 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:04 crc kubenswrapper[5030]: E0120 22:37:04.962316 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:04 crc kubenswrapper[5030]: E0120 22:37:04.962411 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:04 crc kubenswrapper[5030]: E0120 22:37:04.962604 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:04 crc kubenswrapper[5030]: I0120 22:37:04.962763 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:04 crc kubenswrapper[5030]: E0120 22:37:04.962903 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:05 crc kubenswrapper[5030]: I0120 22:37:05.192104 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:05 crc kubenswrapper[5030]: E0120 22:37:05.192315 5030 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:37:05 crc kubenswrapper[5030]: E0120 22:37:05.192425 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs podName:2b3c414b-bc29-41aa-a369-ecc2cc809691 nodeName:}" failed. No retries permitted until 2026-01-20 22:38:09.192392071 +0000 UTC m=+161.512652399 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs") pod "network-metrics-daemon-f2dgj" (UID: "2b3c414b-bc29-41aa-a369-ecc2cc809691") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 22:37:06 crc kubenswrapper[5030]: I0120 22:37:06.961480 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:06 crc kubenswrapper[5030]: I0120 22:37:06.961588 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:06 crc kubenswrapper[5030]: E0120 22:37:06.961701 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:06 crc kubenswrapper[5030]: I0120 22:37:06.962340 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:06 crc kubenswrapper[5030]: I0120 22:37:06.962800 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:06 crc kubenswrapper[5030]: E0120 22:37:06.966600 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:06 crc kubenswrapper[5030]: E0120 22:37:06.967041 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:06 crc kubenswrapper[5030]: E0120 22:37:06.967447 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:06 crc kubenswrapper[5030]: I0120 22:37:06.967798 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:37:06 crc kubenswrapper[5030]: E0120 22:37:06.968043 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:37:08 crc kubenswrapper[5030]: I0120 22:37:08.004794 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=8.00476841 podStartE2EDuration="8.00476841s" podCreationTimestamp="2026-01-20 22:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:08.003382277 +0000 UTC m=+100.323642605" watchObservedRunningTime="2026-01-20 22:37:08.00476841 +0000 UTC m=+100.325028708" Jan 20 22:37:08 crc kubenswrapper[5030]: I0120 22:37:08.961564 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:08 crc kubenswrapper[5030]: I0120 22:37:08.961713 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:08 crc kubenswrapper[5030]: E0120 22:37:08.961885 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:08 crc kubenswrapper[5030]: I0120 22:37:08.961916 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:08 crc kubenswrapper[5030]: E0120 22:37:08.962084 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:08 crc kubenswrapper[5030]: I0120 22:37:08.962198 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:08 crc kubenswrapper[5030]: E0120 22:37:08.962285 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:08 crc kubenswrapper[5030]: E0120 22:37:08.962353 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:10 crc kubenswrapper[5030]: I0120 22:37:10.962004 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:10 crc kubenswrapper[5030]: E0120 22:37:10.962119 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:10 crc kubenswrapper[5030]: I0120 22:37:10.962261 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:10 crc kubenswrapper[5030]: I0120 22:37:10.962014 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:10 crc kubenswrapper[5030]: E0120 22:37:10.962367 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:10 crc kubenswrapper[5030]: E0120 22:37:10.962584 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:10 crc kubenswrapper[5030]: I0120 22:37:10.962803 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:10 crc kubenswrapper[5030]: E0120 22:37:10.962890 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:12 crc kubenswrapper[5030]: I0120 22:37:12.961300 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:12 crc kubenswrapper[5030]: I0120 22:37:12.961385 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:12 crc kubenswrapper[5030]: I0120 22:37:12.961404 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:12 crc kubenswrapper[5030]: I0120 22:37:12.961319 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:12 crc kubenswrapper[5030]: E0120 22:37:12.961540 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:12 crc kubenswrapper[5030]: E0120 22:37:12.961752 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:12 crc kubenswrapper[5030]: E0120 22:37:12.961908 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:12 crc kubenswrapper[5030]: E0120 22:37:12.962004 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:14 crc kubenswrapper[5030]: I0120 22:37:14.961219 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:14 crc kubenswrapper[5030]: I0120 22:37:14.961344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:14 crc kubenswrapper[5030]: I0120 22:37:14.961363 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:14 crc kubenswrapper[5030]: E0120 22:37:14.961475 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:14 crc kubenswrapper[5030]: I0120 22:37:14.961219 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:14 crc kubenswrapper[5030]: E0120 22:37:14.961686 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:14 crc kubenswrapper[5030]: E0120 22:37:14.961793 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:14 crc kubenswrapper[5030]: E0120 22:37:14.961938 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:16 crc kubenswrapper[5030]: I0120 22:37:16.961412 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:16 crc kubenswrapper[5030]: I0120 22:37:16.961413 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:16 crc kubenswrapper[5030]: E0120 22:37:16.962180 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:16 crc kubenswrapper[5030]: I0120 22:37:16.961436 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:16 crc kubenswrapper[5030]: E0120 22:37:16.962296 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:16 crc kubenswrapper[5030]: E0120 22:37:16.962415 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:16 crc kubenswrapper[5030]: I0120 22:37:16.961453 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:16 crc kubenswrapper[5030]: E0120 22:37:16.962597 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:18 crc kubenswrapper[5030]: I0120 22:37:18.961158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:18 crc kubenswrapper[5030]: I0120 22:37:18.961164 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:18 crc kubenswrapper[5030]: I0120 22:37:18.961212 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:18 crc kubenswrapper[5030]: I0120 22:37:18.961209 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:18 crc kubenswrapper[5030]: E0120 22:37:18.961945 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:18 crc kubenswrapper[5030]: E0120 22:37:18.962183 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:18 crc kubenswrapper[5030]: E0120 22:37:18.962577 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:18 crc kubenswrapper[5030]: E0120 22:37:18.962576 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.554399 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/1.log" Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.555123 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/0.log" Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.555208 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e610661-5072-4aa0-b1f1-75410b7f663b" containerID="b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667" exitCode=1 Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.555262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerDied","Data":"b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667"} Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.555310 5030 scope.go:117] "RemoveContainer" containerID="14d99de591a716b530936977982e690a410b9006aecf4d7e8d212bfc508a8ddf" Jan 20 22:37:19 crc kubenswrapper[5030]: I0120 22:37:19.556194 5030 scope.go:117] "RemoveContainer" containerID="b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667" Jan 20 22:37:19 crc kubenswrapper[5030]: E0120 22:37:19.556582 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n8v4f_openshift-multus(7e610661-5072-4aa0-b1f1-75410b7f663b)\"" pod="openshift-multus/multus-n8v4f" podUID="7e610661-5072-4aa0-b1f1-75410b7f663b" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.566670 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/1.log" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.961368 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.961497 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:20 crc kubenswrapper[5030]: E0120 22:37:20.961529 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.961592 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.961707 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:20 crc kubenswrapper[5030]: E0120 22:37:20.962203 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:20 crc kubenswrapper[5030]: E0120 22:37:20.962351 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:20 crc kubenswrapper[5030]: E0120 22:37:20.962420 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:20 crc kubenswrapper[5030]: I0120 22:37:20.962885 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:37:20 crc kubenswrapper[5030]: E0120 22:37:20.963213 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kq4dw_openshift-ovn-kubernetes(bf449d4a-2037-4802-8370-1965d3026c07)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" Jan 20 22:37:22 crc kubenswrapper[5030]: I0120 22:37:22.961773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:22 crc kubenswrapper[5030]: I0120 22:37:22.961860 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:22 crc kubenswrapper[5030]: I0120 22:37:22.961877 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:22 crc kubenswrapper[5030]: E0120 22:37:22.961976 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:22 crc kubenswrapper[5030]: E0120 22:37:22.962130 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:22 crc kubenswrapper[5030]: E0120 22:37:22.962329 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:22 crc kubenswrapper[5030]: I0120 22:37:22.962415 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:22 crc kubenswrapper[5030]: E0120 22:37:22.962711 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:24 crc kubenswrapper[5030]: I0120 22:37:24.961429 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:24 crc kubenswrapper[5030]: I0120 22:37:24.961538 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:24 crc kubenswrapper[5030]: I0120 22:37:24.961720 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:24 crc kubenswrapper[5030]: E0120 22:37:24.961713 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:24 crc kubenswrapper[5030]: I0120 22:37:24.961857 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:24 crc kubenswrapper[5030]: E0120 22:37:24.962010 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:24 crc kubenswrapper[5030]: E0120 22:37:24.962158 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:24 crc kubenswrapper[5030]: E0120 22:37:24.962748 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:26 crc kubenswrapper[5030]: I0120 22:37:26.961691 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:26 crc kubenswrapper[5030]: I0120 22:37:26.961698 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:26 crc kubenswrapper[5030]: E0120 22:37:26.962206 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:26 crc kubenswrapper[5030]: I0120 22:37:26.961773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:26 crc kubenswrapper[5030]: I0120 22:37:26.961725 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:26 crc kubenswrapper[5030]: E0120 22:37:26.962388 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:26 crc kubenswrapper[5030]: E0120 22:37:26.962548 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:26 crc kubenswrapper[5030]: E0120 22:37:26.962808 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:27 crc kubenswrapper[5030]: E0120 22:37:27.944823 5030 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 20 22:37:28 crc kubenswrapper[5030]: E0120 22:37:28.051965 5030 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 22:37:28 crc kubenswrapper[5030]: I0120 22:37:28.961193 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:28 crc kubenswrapper[5030]: I0120 22:37:28.961345 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:28 crc kubenswrapper[5030]: I0120 22:37:28.961284 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:28 crc kubenswrapper[5030]: E0120 22:37:28.961454 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:28 crc kubenswrapper[5030]: I0120 22:37:28.961531 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:28 crc kubenswrapper[5030]: E0120 22:37:28.961772 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:28 crc kubenswrapper[5030]: E0120 22:37:28.961925 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:28 crc kubenswrapper[5030]: E0120 22:37:28.962016 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:30 crc kubenswrapper[5030]: I0120 22:37:30.961666 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:30 crc kubenswrapper[5030]: I0120 22:37:30.961701 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:30 crc kubenswrapper[5030]: E0120 22:37:30.961867 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:30 crc kubenswrapper[5030]: I0120 22:37:30.961885 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:30 crc kubenswrapper[5030]: I0120 22:37:30.961930 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:30 crc kubenswrapper[5030]: E0120 22:37:30.961975 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:30 crc kubenswrapper[5030]: E0120 22:37:30.962289 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:30 crc kubenswrapper[5030]: E0120 22:37:30.962266 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:32 crc kubenswrapper[5030]: I0120 22:37:32.961263 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:32 crc kubenswrapper[5030]: E0120 22:37:32.961926 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:32 crc kubenswrapper[5030]: I0120 22:37:32.961422 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:32 crc kubenswrapper[5030]: I0120 22:37:32.961960 5030 scope.go:117] "RemoveContainer" containerID="b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667" Jan 20 22:37:32 crc kubenswrapper[5030]: E0120 22:37:32.962012 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:32 crc kubenswrapper[5030]: I0120 22:37:32.961461 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:32 crc kubenswrapper[5030]: E0120 22:37:32.962059 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:32 crc kubenswrapper[5030]: I0120 22:37:32.961349 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:32 crc kubenswrapper[5030]: E0120 22:37:32.962104 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:33 crc kubenswrapper[5030]: E0120 22:37:33.053420 5030 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 22:37:33 crc kubenswrapper[5030]: I0120 22:37:33.615176 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/1.log" Jan 20 22:37:33 crc kubenswrapper[5030]: I0120 22:37:33.615247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerStarted","Data":"87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a"} Jan 20 22:37:34 crc kubenswrapper[5030]: I0120 22:37:34.961951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:34 crc kubenswrapper[5030]: I0120 22:37:34.961973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:34 crc kubenswrapper[5030]: E0120 22:37:34.962653 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:34 crc kubenswrapper[5030]: I0120 22:37:34.962067 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:34 crc kubenswrapper[5030]: E0120 22:37:34.962772 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:34 crc kubenswrapper[5030]: I0120 22:37:34.961998 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:34 crc kubenswrapper[5030]: E0120 22:37:34.962848 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:34 crc kubenswrapper[5030]: E0120 22:37:34.962957 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:35 crc kubenswrapper[5030]: I0120 22:37:35.962428 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.628301 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/3.log" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.631556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerStarted","Data":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.631978 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.678088 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podStartSLOduration=110.678067322 podStartE2EDuration="1m50.678067322s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:36.677771545 +0000 UTC m=+128.998031883" watchObservedRunningTime="2026-01-20 22:37:36.678067322 +0000 UTC m=+128.998327630" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.906898 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f2dgj"] Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.907057 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:36 crc kubenswrapper[5030]: E0120 22:37:36.907174 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.961533 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.961583 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:36 crc kubenswrapper[5030]: E0120 22:37:36.961682 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:36 crc kubenswrapper[5030]: I0120 22:37:36.961746 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:36 crc kubenswrapper[5030]: E0120 22:37:36.961832 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:36 crc kubenswrapper[5030]: E0120 22:37:36.962059 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:38 crc kubenswrapper[5030]: E0120 22:37:38.054496 5030 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 22:37:38 crc kubenswrapper[5030]: I0120 22:37:38.961728 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:38 crc kubenswrapper[5030]: I0120 22:37:38.961745 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:38 crc kubenswrapper[5030]: E0120 22:37:38.962309 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:38 crc kubenswrapper[5030]: I0120 22:37:38.961888 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:38 crc kubenswrapper[5030]: I0120 22:37:38.961788 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:38 crc kubenswrapper[5030]: E0120 22:37:38.962420 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:38 crc kubenswrapper[5030]: E0120 22:37:38.962526 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:38 crc kubenswrapper[5030]: E0120 22:37:38.962693 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:40 crc kubenswrapper[5030]: I0120 22:37:40.962072 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:40 crc kubenswrapper[5030]: I0120 22:37:40.962110 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:40 crc kubenswrapper[5030]: E0120 22:37:40.962367 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:40 crc kubenswrapper[5030]: I0120 22:37:40.962162 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:40 crc kubenswrapper[5030]: E0120 22:37:40.962557 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:40 crc kubenswrapper[5030]: I0120 22:37:40.962731 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:40 crc kubenswrapper[5030]: E0120 22:37:40.962818 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:40 crc kubenswrapper[5030]: E0120 22:37:40.962842 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:42 crc kubenswrapper[5030]: I0120 22:37:42.961614 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:42 crc kubenswrapper[5030]: I0120 22:37:42.961779 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:42 crc kubenswrapper[5030]: E0120 22:37:42.961809 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 22:37:42 crc kubenswrapper[5030]: I0120 22:37:42.961820 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:42 crc kubenswrapper[5030]: I0120 22:37:42.961862 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:42 crc kubenswrapper[5030]: E0120 22:37:42.961934 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 22:37:42 crc kubenswrapper[5030]: E0120 22:37:42.962052 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f2dgj" podUID="2b3c414b-bc29-41aa-a369-ecc2cc809691" Jan 20 22:37:42 crc kubenswrapper[5030]: E0120 22:37:42.962129 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.961596 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.961700 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.961768 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.962217 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965145 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965546 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965661 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965704 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965829 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 22:37:44 crc kubenswrapper[5030]: I0120 22:37:44.965829 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.157052 5030 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.210943 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nch"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.211744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.212431 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.213359 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.213827 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.214323 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.215253 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.215657 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.215672 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.215771 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.215959 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.216184 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.216513 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.216532 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.216841 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.218354 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.221245 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.222753 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll54f"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.223048 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.223454 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.225268 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5fgwp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.228564 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.228810 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.228976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.239021 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.239546 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vm7db"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.239593 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.240181 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.240342 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.240424 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.240593 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.241032 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.241083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.241310 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.241419 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.259753 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.259875 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260003 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260307 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260367 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260405 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260469 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260502 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260336 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260546 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260593 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260597 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260845 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.260972 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.261446 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.261557 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.261717 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" 
Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.261908 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.262090 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.262338 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.262409 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.263028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.263827 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.264251 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7fll2"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.264585 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.266162 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.270529 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.271558 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.271738 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.272967 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.276852 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.277268 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.277488 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.277819 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278006 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278204 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278310 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278311 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.278750 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.279350 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.280527 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h2r7q"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.282139 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283607 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.297635 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/09fc7d72-f430-4d90-9ab9-1d79db43b695-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352cd616-4b0f-4f66-9fff-d477bd1e827c-serving-cert\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbbs\" (UniqueName: \"kubernetes.io/projected/352cd616-4b0f-4f66-9fff-d477bd1e827c-kube-api-access-6fbbs\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977980fd-278e-43c6-a63d-32646c9cf3de-serving-cert\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/352cd616-4b0f-4f66-9fff-d477bd1e827c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gn55\" (UniqueName: \"kubernetes.io/projected/977980fd-278e-43c6-a63d-32646c9cf3de-kube-api-access-5gn55\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-config\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6bf\" (UniqueName: \"kubernetes.io/projected/09fc7d72-f430-4d90-9ab9-1d79db43b695-kube-api-access-8j6bf\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298609 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvq9b\" (UniqueName: \"kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-images\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-config\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.298836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-service-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283671 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283689 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.299330 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283725 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283765 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283805 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.283810 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.284209 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.284289 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.294583 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.300560 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-fmkdf"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.302468 5030 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.302643 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.302735 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.302822 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.302916 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.303203 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.303546 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.303677 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.303834 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.303924 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.305841 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.306108 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.307170 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.308881 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.309084 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.309271 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.309770 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.309977 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.309983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.310100 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.311418 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.311448 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.313239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.313571 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.314957 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.316376 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.316474 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.316585 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.316594 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.317368 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.317680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.317919 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.318106 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.318382 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 22:37:49 crc 
kubenswrapper[5030]: I0120 22:37:49.318529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.318671 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.319769 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vc527"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.320261 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.320651 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.320680 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.321004 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.321428 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.321482 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.323757 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ht5ll"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.324337 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.324363 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.324406 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.324810 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.324996 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325042 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325056 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325105 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325163 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325354 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325421 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.325564 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.326132 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.327602 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.327725 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.335385 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.336014 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.336534 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.337323 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.340517 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.341236 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.343400 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.347268 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.347827 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.348596 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.349355 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.357645 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.358735 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.358917 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.359254 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mh8ms"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.359814 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.360212 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.368220 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.368458 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.370067 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xhbbg"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.370286 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.370497 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.370642 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.370990 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.371128 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.371712 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.372068 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.374522 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.374652 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.376956 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.384348 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nch"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.385561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.386197 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.389664 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.390948 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll54f"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.393513 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.394359 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.394644 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.395800 5030 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c8xgq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.399879 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5fgwp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.399900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400064 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-etcd-serving-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfs5b\" (UniqueName: \"kubernetes.io/projected/c1d9550c-ae35-4705-aff2-7946b6434807-kube-api-access-lfs5b\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400462 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400614 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-serving-cert\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/09fc7d72-f430-4d90-9ab9-1d79db43b695-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.400980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58z9\" (UniqueName: \"kubernetes.io/projected/b061ed0d-eb5f-4569-9736-74c4705a7b82-kube-api-access-f58z9\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401017 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7fll2"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rs6\" (UniqueName: \"kubernetes.io/projected/e174ceb8-a286-43e5-b6e4-cfc426d77226-kube-api-access-86rs6\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352cd616-4b0f-4f66-9fff-d477bd1e827c-serving-cert\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401206 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27g6b\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-kube-api-access-27g6b\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: 
\"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fbbs\" (UniqueName: \"kubernetes.io/projected/352cd616-4b0f-4f66-9fff-d477bd1e827c-kube-api-access-6fbbs\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-service-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d9550c-ae35-4705-aff2-7946b6434807-audit-dir\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401297 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977980fd-278e-43c6-a63d-32646c9cf3de-serving-cert\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.402443 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.401319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-trusted-ca\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.402890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.402921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.402941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.402965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-config\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/352cd616-4b0f-4f66-9fff-d477bd1e827c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45km\" (UniqueName: \"kubernetes.io/projected/6f7b1164-858a-4230-a949-0b770ed638d8-kube-api-access-w45km\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403104 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6g9\" (UniqueName: \"kubernetes.io/projected/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-kube-api-access-2g6g9\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403122 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-audit\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshf4\" (UniqueName: \"kubernetes.io/projected/e4ad4478-2873-407f-a178-b0255511ef06-kube-api-access-qshf4\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jdkd\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-encryption-config\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-serving-cert\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e490f76-4197-4c4c-bfde-2911a16dad48-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: 
\"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.403292 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:49.903277316 +0000 UTC m=+142.223537604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/352cd616-4b0f-4f66-9fff-d477bd1e827c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-node-pullsecrets\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gn55\" (UniqueName: \"kubernetes.io/projected/977980fd-278e-43c6-a63d-32646c9cf3de-kube-api-access-5gn55\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e490f76-4197-4c4c-bfde-2911a16dad48-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403493 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4xq\" (UniqueName: \"kubernetes.io/projected/8de6d00e-a15c-47d8-a4a6-0212694a734f-kube-api-access-4n4xq\") pod \"dns-operator-744455d44c-vm7db\" (UID: \"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-serving-cert\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 
20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pws2h\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-kube-api-access-pws2h\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403581 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de6d00e-a15c-47d8-a4a6-0212694a734f-metrics-tls\") pod \"dns-operator-744455d44c-vm7db\" (UID: \"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b061ed0d-eb5f-4569-9736-74c4705a7b82-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403637 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-audit-dir\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-client\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgghl\" (UniqueName: \"kubernetes.io/projected/0e490f76-4197-4c4c-bfde-2911a16dad48-kube-api-access-pgghl\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-config\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") 
" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-etcd-client\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-image-import-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-serving-cert\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-config\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6bf\" (UniqueName: \"kubernetes.io/projected/09fc7d72-f430-4d90-9ab9-1d79db43b695-kube-api-access-8j6bf\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-client\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.403972 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-encryption-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7b1164-858a-4230-a949-0b770ed638d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvq9b\" (UniqueName: \"kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f7b1164-858a-4230-a949-0b770ed638d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404353 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtp9h\" (UniqueName: \"kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-images\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-audit-policies\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mspd2\" (UniqueName: \"kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2\") pod 
\"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404564 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9840bef3-5439-4096-bde3-812f3653da3c-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9840bef3-5439-4096-bde3-812f3653da3c-metrics-tls\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-config\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-service-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404688 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.404955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.405634 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-config\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.405757 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.406591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/09fc7d72-f430-4d90-9ab9-1d79db43b695-images\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.407602 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.408057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-config\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.408134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/977980fd-278e-43c6-a63d-32646c9cf3de-service-ca-bundle\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.408646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/352cd616-4b0f-4f66-9fff-d477bd1e827c-serving-cert\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.409975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/977980fd-278e-43c6-a63d-32646c9cf3de-serving-cert\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.410802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 
22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.410908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.412095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.413108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/09fc7d72-f430-4d90-9ab9-1d79db43b695-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.413278 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.413784 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.414406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ht5ll"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.415463 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.416517 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.417632 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.418669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.419723 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.421223 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5mcr2"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.422052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.422344 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j5gw4"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.423227 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.423515 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vm7db"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.426313 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.427436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.433914 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.434913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.436386 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.436419 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.437522 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h2r7q"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.438948 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5mcr2"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.440355 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fmkdf"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.441546 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.442733 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c8xgq"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.443842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mh8ms"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.445079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.446164 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.447863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.448824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xhbbg"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.449976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 
22:37:49.451504 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j5gw4"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.453502 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-882nz"] Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.454151 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.459746 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.473318 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.494333 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.505126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.505264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9840bef3-5439-4096-bde3-812f3653da3c-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.505308 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.005287408 +0000 UTC m=+142.325547696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.505339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mms\" (UniqueName: \"kubernetes.io/projected/c6a2a728-a83b-4fa2-b21a-748dcb79a930-kube-api-access-t9mms\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc 
kubenswrapper[5030]: I0120 22:37:49.514459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfs5b\" (UniqueName: \"kubernetes.io/projected/c1d9550c-ae35-4705-aff2-7946b6434807-kube-api-access-lfs5b\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-serving-cert\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-metrics-certs\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95sh\" (UniqueName: \"kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514953 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eadfdde-af50-4bad-a779-dc5af28f74fe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.514998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rs6\" (UniqueName: \"kubernetes.io/projected/e174ceb8-a286-43e5-b6e4-cfc426d77226-kube-api-access-86rs6\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515081 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d9550c-ae35-4705-aff2-7946b6434807-audit-dir\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515128 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-service-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-trusted-ca\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515170 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-service-ca-bundle\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515453 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.515821 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1d9550c-ae35-4705-aff2-7946b6434807-audit-dir\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.515827 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-20 22:37:50.01579874 +0000 UTC m=+142.336059028 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.516016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.516875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-trusted-ca\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-service-ca\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6g9\" (UniqueName: \"kubernetes.io/projected/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-kube-api-access-2g6g9\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-audit\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.517996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jdkd\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-serving-cert\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e490f76-4197-4c4c-bfde-2911a16dad48-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config\") 
pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de6d00e-a15c-47d8-a4a6-0212694a734f-metrics-tls\") pod \"dns-operator-744455d44c-vm7db\" (UID: \"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.518799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09197347-c50f-4fe1-b20d-3baabf9869fc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: \"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.519564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-serving-cert\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.519726 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.519915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-audit-dir\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgghl\" (UniqueName: \"kubernetes.io/projected/0e490f76-4197-4c4c-bfde-2911a16dad48-kube-api-access-pgghl\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-etcd-client\") pod \"apiserver-76f77b778f-5fgwp\" (UID: 
\"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-serving-cert\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-encryption-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjh6\" (UniqueName: \"kubernetes.io/projected/f12e0006-3347-4e5b-98ad-bd69997cbe22-kube-api-access-tnjh6\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: 
I0120 22:37:49.520444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-audit-dir\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eadfdde-af50-4bad-a779-dc5af28f74fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-audit-policies\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mspd2\" (UniqueName: \"kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9840bef3-5439-4096-bde3-812f3653da3c-metrics-tls\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.520992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-etcd-serving-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521066 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert\") pod 
\"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z9\" (UniqueName: \"kubernetes.io/projected/b061ed0d-eb5f-4569-9736-74c4705a7b82-kube-api-access-f58z9\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27g6b\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-kube-api-access-27g6b\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvpx\" (UniqueName: \"kubernetes.io/projected/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-kube-api-access-tjvpx\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-tmpfs\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4z2w\" (UniqueName: \"kubernetes.io/projected/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-kube-api-access-x4z2w\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert\") pod \"console-f9d7485db-vkrcb\" (UID: 
\"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdz7b\" (UniqueName: \"kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-config\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521330 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sstz\" (UniqueName: \"kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521376 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 
20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521436 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45km\" (UniqueName: \"kubernetes.io/projected/6f7b1164-858a-4230-a949-0b770ed638d8-kube-api-access-w45km\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-srv-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.521986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-audit\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.522164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-audit-policies\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.522987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.523067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.523094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-serving-cert\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.523489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8de6d00e-a15c-47d8-a4a6-0212694a734f-metrics-tls\") pod \"dns-operator-744455d44c-vm7db\" (UID: 
\"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.523940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e490f76-4197-4c4c-bfde-2911a16dad48-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.523050 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.524164 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfb2j\" (UniqueName: \"kubernetes.io/projected/8eadfdde-af50-4bad-a779-dc5af28f74fe-kube-api-access-sfb2j\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.524828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.525017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.525031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-etcd-client\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.525481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9840bef3-5439-4096-bde3-812f3653da3c-trusted-ca\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.525774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-etcd-serving-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.526567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527187 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qshf4\" (UniqueName: \"kubernetes.io/projected/e4ad4478-2873-407f-a178-b0255511ef06-kube-api-access-qshf4\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-serving-cert\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9840bef3-5439-4096-bde3-812f3653da3c-metrics-tls\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.527985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-encryption-config\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 
22:37:49.526130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-config\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64bt\" (UniqueName: \"kubernetes.io/projected/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-kube-api-access-q64bt\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98tw\" (UniqueName: \"kubernetes.io/projected/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-kube-api-access-x98tw\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-node-pullsecrets\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-images\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e490f76-4197-4c4c-bfde-2911a16dad48-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e4ad4478-2873-407f-a178-b0255511ef06-node-pullsecrets\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.528836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4xq\" (UniqueName: \"kubernetes.io/projected/8de6d00e-a15c-47d8-a4a6-0212694a734f-kube-api-access-4n4xq\") pod \"dns-operator-744455d44c-vm7db\" (UID: \"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pws2h\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-kube-api-access-pws2h\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-serving-cert\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529298 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-proxy-tls\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhh76\" (UniqueName: \"kubernetes.io/projected/09197347-c50f-4fe1-b20d-3baabf9869fc-kube-api-access-hhh76\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: 
\"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg4ct\" (UniqueName: \"kubernetes.io/projected/e1e5724b-6745-4d13-a19b-a631a2483004-kube-api-access-kg4ct\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b061ed0d-eb5f-4569-9736-74c4705a7b82-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-client\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-image-import-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-default-certificate\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-config\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529592 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hs66\" (UniqueName: \"kubernetes.io/projected/49dc7f5e-9a94-4a6a-83d0-c77019142654-kube-api-access-8hs66\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-client\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7b1164-858a-4230-a949-0b770ed638d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtp9h\" (UniqueName: \"kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e490f76-4197-4c4c-bfde-2911a16dad48-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.529778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f7b1164-858a-4230-a949-0b770ed638d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-stats-auth\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e174ceb8-a286-43e5-b6e4-cfc426d77226-config\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-encryption-config\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.530905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7b1164-858a-4230-a949-0b770ed638d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.531121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e4ad4478-2873-407f-a178-b0255511ef06-encryption-config\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.531581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e4ad4478-2873-407f-a178-b0255511ef06-image-import-ca\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.532139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-serving-cert\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.532938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e174ceb8-a286-43e5-b6e4-cfc426d77226-etcd-client\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.533087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.533507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f7b1164-858a-4230-a949-0b770ed638d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.533889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1d9550c-ae35-4705-aff2-7946b6434807-etcd-client\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.534202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.534572 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.535390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b061ed0d-eb5f-4569-9736-74c4705a7b82-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.535811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.554542 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.573526 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.593945 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.614279 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume\") pod 
\"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09197347-c50f-4fe1-b20d-3baabf9869fc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: \"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjh6\" (UniqueName: \"kubernetes.io/projected/f12e0006-3347-4e5b-98ad-bd69997cbe22-kube-api-access-tnjh6\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eadfdde-af50-4bad-a779-dc5af28f74fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: 
\"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvpx\" (UniqueName: \"kubernetes.io/projected/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-kube-api-access-tjvpx\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-tmpfs\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: 
\"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4z2w\" (UniqueName: \"kubernetes.io/projected/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-kube-api-access-x4z2w\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdz7b\" (UniqueName: \"kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sstz\" (UniqueName: \"kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-srv-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfb2j\" (UniqueName: \"kubernetes.io/projected/8eadfdde-af50-4bad-a779-dc5af28f74fe-kube-api-access-sfb2j\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-images\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64bt\" (UniqueName: \"kubernetes.io/projected/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-kube-api-access-q64bt\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98tw\" (UniqueName: \"kubernetes.io/projected/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-kube-api-access-x98tw\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-proxy-tls\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc 
kubenswrapper[5030]: I0120 22:37:49.631954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhh76\" (UniqueName: \"kubernetes.io/projected/09197347-c50f-4fe1-b20d-3baabf9869fc-kube-api-access-hhh76\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: \"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg4ct\" (UniqueName: \"kubernetes.io/projected/e1e5724b-6745-4d13-a19b-a631a2483004-kube-api-access-kg4ct\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.631983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-default-certificate\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hs66\" (UniqueName: \"kubernetes.io/projected/49dc7f5e-9a94-4a6a-83d0-c77019142654-kube-api-access-8hs66\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-stats-auth\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mms\" (UniqueName: \"kubernetes.io/projected/c6a2a728-a83b-4fa2-b21a-748dcb79a930-kube-api-access-t9mms\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-metrics-certs\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95sh\" (UniqueName: \"kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eadfdde-af50-4bad-a779-dc5af28f74fe-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-service-ca-bundle\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.632377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.632520 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.132506371 +0000 UTC m=+142.452766659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.633737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-tmpfs\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.633782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.634441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.635822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-service-ca-bundle\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.636355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.636384 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.637868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-stats-auth\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.638560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:49 crc kubenswrapper[5030]: 
I0120 22:37:49.639213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-metrics-certs\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.654288 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.674483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.678950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-default-certificate\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.694606 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.714495 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.733412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.734441 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.735539 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.235506687 +0000 UTC m=+142.555767005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.745367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-images\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.753928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.775537 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.795082 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.816082 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.835189 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.835211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.836729 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.336689969 +0000 UTC m=+142.656950297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.841598 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-proxy-tls\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.868169 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.879103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.891692 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.895359 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.898296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.938761 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.939948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:49 crc kubenswrapper[5030]: E0120 22:37:49.941059 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.441029506 +0000 UTC m=+142.761289844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.949063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.950439 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.953542 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.954338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.957857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.974655 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 22:37:49 crc kubenswrapper[5030]: I0120 22:37:49.988567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.000970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.006587 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.016776 5030 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.035610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.041756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.041892 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.54186395 +0000 UTC m=+142.862124248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.042814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.043228 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.543209762 +0000 UTC m=+142.863470080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.049190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.054259 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.067164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.081386 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.085770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.094483 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.114443 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.123479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.133281 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.134447 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:50 
crc kubenswrapper[5030]: I0120 22:37:50.143792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.143903 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.643879562 +0000 UTC m=+142.964139870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.144079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.144479 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.644464416 +0000 UTC m=+142.964724724 (durationBeforeRetry 500ms). 
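Each failed operation is parked by nestedpendingoperations with a "No retries permitted until ..." deadline: the reconciler re-queues the volume roughly every 100ms in this log, but the mount or unmount itself is only re-attempted once the 500ms durationBeforeRetry has elapsed. The pattern is an ordinary retry with backoff. The sketch below is a generic illustration using the apimachinery wait helpers; mountDevice is a hypothetical stand-in for the CSI call, and the kubelet's actual backoff policy is not reproduced here.

    // Generic retry-with-backoff sketch of the pattern seen in the
    // nestedpendingoperations lines above. Not the kubelet's implementation.
    package main

    import (
        "errors"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    var errNotRegistered = errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

    // mountDevice is a hypothetical stand-in that starts succeeding once the
    // driver has had time to register.
    func mountDevice(attempt int) error {
        if attempt < 4 {
            return errNotRegistered
        }
        return nil
    }

    func main() {
        attempt := 0
        // Start at 500ms, like durationBeforeRetry above, and grow from there.
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 8}
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            if err := mountDevice(attempt); err != nil {
                fmt.Printf("attempt %d: %v (no retry before the backoff expires)\n", attempt, err)
                return false, nil // not done, not fatal: retry after the backoff
            }
            return true, nil // done
        })
        fmt.Println("result:", err) // nil once mountDevice succeeds
    }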
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.163411 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.165297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eadfdde-af50-4bad-a779-dc5af28f74fe-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.173418 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.194248 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.208566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eadfdde-af50-4bad-a779-dc5af28f74fe-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.215213 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.234221 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.246047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.246212 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.746181871 +0000 UTC m=+143.066442189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.246508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.247033 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.747010811 +0000 UTC m=+143.067271159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.254612 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.274874 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.294463 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.314669 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.335288 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.348942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.349223 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.849184687 +0000 UTC m=+143.169445005 (durationBeforeRetry 500ms). 
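The interleaved reflector.go:368 lines ("Caches populated for *v1.Secret ..." / "*v1.ConfigMap ...") are the kubelet finishing the initial list for the per-namespace, per-object caches it uses to resolve secret and configMap volumes; each one unblocks the corresponding "MountVolume.SetUp succeeded" entry that follows it. Below is a minimal client-go sketch of the same mechanism, a shared informer plus WaitForCacheSync; the namespace is taken from the log, everything else (resync period, printing) is illustrative.

    // Minimal sketch of the reflector/informer mechanism behind the
    // "Caches populated for *v1.Secret ..." lines. Not the kubelet's code.
    package main

    import (
        "fmt"
        "os"
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // Watch Secrets in a single namespace, mirroring how the log scopes
        // each cache to one namespace and object.
        factory := informers.NewSharedInformerFactoryWithOptions(
            cs, 10*time.Minute, informers.WithNamespace("openshift-authentication"))
        secretInformer := factory.Core().V1().Secrets().Informer()

        stop := make(chan struct{})
        defer close(stop)
        factory.Start(stop)

        // The moment equivalent to "Caches populated": the initial LIST landed.
        if !cache.WaitForCacheSync(stop, secretInformer.HasSynced) {
            panic("timed out waiting for the secret cache to sync")
        }
        fmt.Println("secret cache populated:", len(secretInformer.GetStore().ListKeys()), "objects")
    }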
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.349715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.350159 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.85013852 +0000 UTC m=+143.170398838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.353197 5030 request.go:700] Waited for 1.011595056s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcatalog-operator-serving-cert&limit=500&resourceVersion=0 Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.355455 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.368046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.374562 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.378024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-profile-collector-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.379470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.386095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.395019 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.434857 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.448773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c6a2a728-a83b-4fa2-b21a-748dcb79a930-srv-cert\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.450780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.451087 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.951044255 +0000 UTC m=+143.271304583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.451670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.452038 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:50.952021248 +0000 UTC m=+143.272281536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.454581 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.475389 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.493580 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.514519 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.545738 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.553089 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.553206 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.05317511 +0000 UTC m=+143.373435398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.553375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.553698 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.053689162 +0000 UTC m=+143.373949450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.555112 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.565674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.574213 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.594187 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.605843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/09197347-c50f-4fe1-b20d-3baabf9869fc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: \"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.614516 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.632965 5030 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633017 5030 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633119 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config podName:e1e5724b-6745-4d13-a19b-a631a2483004 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.133066724 +0000 UTC m=+143.453327032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config") pod "machine-approver-56656f9798-x4j96" (UID: "e1e5724b-6745-4d13-a19b-a631a2483004") : failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633146 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config podName:e1e5724b-6745-4d13-a19b-a631a2483004 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:37:51.133134636 +0000 UTC m=+143.453394934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config") pod "machine-approver-56656f9798-x4j96" (UID: "e1e5724b-6745-4d13-a19b-a631a2483004") : failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633434 5030 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633459 5030 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633535 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle podName:49dc7f5e-9a94-4a6a-83d0-c77019142654 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.133511055 +0000 UTC m=+143.453771383 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle") pod "service-ca-9c57cc56f-xhbbg" (UID: "49dc7f5e-9a94-4a6a-83d0-c77019142654") : failed to sync configmap cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.633560 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert podName:f12e0006-3347-4e5b-98ad-bd69997cbe22 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.133548395 +0000 UTC m=+143.453808713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-njtmh" (UID: "f12e0006-3347-4e5b-98ad-bd69997cbe22") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634188 5030 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634297 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls podName:e1e5724b-6745-4d13-a19b-a631a2483004 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.134270293 +0000 UTC m=+143.454530681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls") pod "machine-approver-56656f9798-x4j96" (UID: "e1e5724b-6745-4d13-a19b-a631a2483004") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634687 5030 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634733 5030 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634777 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key podName:49dc7f5e-9a94-4a6a-83d0-c77019142654 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.134751874 +0000 UTC m=+143.455012242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key") pod "service-ca-9c57cc56f-xhbbg" (UID: "49dc7f5e-9a94-4a6a-83d0-c77019142654") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634795 5030 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634820 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert podName:ddc234b4-01ae-46aa-bc6b-e6e7a723c06b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.134797355 +0000 UTC m=+143.455057683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert") pod "packageserver-d55dfcdfc-m6wtc" (UID: "ddc234b4-01ae-46aa-bc6b-e6e7a723c06b") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634837 5030 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634876 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls podName:607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9 nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.134851647 +0000 UTC m=+143.455111975 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-82699" (UID: "607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.634903 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.634925 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert podName:ddc234b4-01ae-46aa-bc6b-e6e7a723c06b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.134904608 +0000 UTC m=+143.455165016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert") pod "packageserver-d55dfcdfc-m6wtc" (UID: "ddc234b4-01ae-46aa-bc6b-e6e7a723c06b") : failed to sync secret cache: timed out waiting for the condition Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.654308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.654334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.654561 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.154528056 +0000 UTC m=+143.474788394 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.655205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.655749 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.155727384 +0000 UTC m=+143.475987762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.675459 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.695014 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.715163 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.734582 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.754365 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.756810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.757063 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.257016839 +0000 UTC m=+143.577277167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.757496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.758035 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.258009063 +0000 UTC m=+143.578269381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.774103 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.794231 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.814735 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.834862 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.854969 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.859693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.859886 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.359850051 +0000 UTC m=+143.680110379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.860734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.861248 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.361219953 +0000 UTC m=+143.681480311 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.875559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.894918 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.914491 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.934884 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.954463 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.961611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.961779 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.461751159 +0000 UTC m=+143.782011487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.962044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:50 crc kubenswrapper[5030]: E0120 22:37:50.962494 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.462478447 +0000 UTC m=+143.782738775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.974612 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 22:37:50 crc kubenswrapper[5030]: I0120 22:37:50.995226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.014898 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.035247 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.053995 5030 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.063810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.064221 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.564182102 +0000 UTC m=+143.884442430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.064849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.065394 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.565372969 +0000 UTC m=+143.885633297 (durationBeforeRetry 500ms). 
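By this point the caches for the hostpath-provisioner namespace itself are being populated (the two entries just above and the kube-root-ca.crt one that follows), which suggests the kubelet is preparing to run the provisioner's own pods; only once the driver completes node registration can the stuck operations on pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 finally succeed. Purely as an illustration of that wait, the hypothetical snippet below polls the CSINode for node "crc" until kubevirt.io.hostpath-provisioner appears; the interval and timeout are arbitrary.

    // Hypothetical wait for the condition the kubelet keeps retrying on:
    // poll the CSINode for node "crc" until the hostpath-provisioner driver
    // has completed registration.
    package main

    import (
        "context"
        "fmt"
        "os"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        err = wait.PollUntilContextTimeout(context.Background(), 2*time.Second, 5*time.Minute, true,
            func(ctx context.Context) (bool, error) {
                csiNode, err := cs.StorageV1().CSINodes().Get(ctx, "crc", metav1.GetOptions{})
                if err != nil {
                    return false, nil // CSINode not readable yet; keep polling
                }
                for _, d := range csiNode.Spec.Drivers {
                    if d.Name == "kubevirt.io.hostpath-provisioner" {
                        return true, nil // registration done
                    }
                }
                return false, nil
            })
        if err != nil {
            fmt.Println("driver still not registered:", err)
            return
        }
        fmt.Println("driver registered; pending MountDevice/TearDown retries should now succeed")
    }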
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.074013 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.123764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbbs\" (UniqueName: \"kubernetes.io/projected/352cd616-4b0f-4f66-9fff-d477bd1e827c-kube-api-access-6fbbs\") pod \"openshift-config-operator-7777fb866f-5sb9g\" (UID: \"352cd616-4b0f-4f66-9fff-d477bd1e827c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.134696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gn55\" (UniqueName: \"kubernetes.io/projected/977980fd-278e-43c6-a63d-32646c9cf3de-kube-api-access-5gn55\") pod \"authentication-operator-69f744f599-7fll2\" (UID: \"977980fd-278e-43c6-a63d-32646c9cf3de\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.164325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6bf\" (UniqueName: \"kubernetes.io/projected/09fc7d72-f430-4d90-9ab9-1d79db43b695-kube-api-access-8j6bf\") pod \"machine-api-operator-5694c8668f-t7nch\" (UID: \"09fc7d72-f430-4d90-9ab9-1d79db43b695\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.166973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.167223 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.667171846 +0000 UTC m=+143.987432174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.167409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.167532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.167851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.167906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168286 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.168465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.168709 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.668691813 +0000 UTC m=+143.988952111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.169509 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-cabundle\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.169667 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.170100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1e5724b-6745-4d13-a19b-a631a2483004-auth-proxy-config\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.172661 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/49dc7f5e-9a94-4a6a-83d0-c77019142654-signing-key\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: 
\"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.172828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e1e5724b-6745-4d13-a19b-a631a2483004-machine-approver-tls\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.174343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f12e0006-3347-4e5b-98ad-bd69997cbe22-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.175123 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.175187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvq9b\" (UniqueName: \"kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b\") pod \"controller-manager-879f6c89f-2zwp4\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.175331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.175381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-webhook-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.177979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-apiservice-cert\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.195700 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.214921 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.234875 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.242287 5030 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.254454 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.261294 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.269577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.269854 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.769815953 +0000 UTC m=+144.090076281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.270122 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.270567 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.770550541 +0000 UTC m=+144.090810869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.276763 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.295416 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.335094 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.350029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.353387 5030 request.go:700] Waited for 1.898943819s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.355287 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.371533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.372468 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.87243941 +0000 UTC m=+144.192699738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.376156 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.384184 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.418199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfs5b\" (UniqueName: \"kubernetes.io/projected/c1d9550c-ae35-4705-aff2-7946b6434807-kube-api-access-lfs5b\") pod \"apiserver-7bbb656c7d-hszg6\" (UID: \"c1d9550c-ae35-4705-aff2-7946b6434807\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.433656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rs6\" (UniqueName: \"kubernetes.io/projected/e174ceb8-a286-43e5-b6e4-cfc426d77226-kube-api-access-86rs6\") pod \"etcd-operator-b45778765-h2r7q\" (UID: \"e174ceb8-a286-43e5-b6e4-cfc426d77226\") " pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.452146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.471456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6g9\" (UniqueName: \"kubernetes.io/projected/b471dba4-7cf5-4cbb-beb6-010bd32f8fde-kube-api-access-2g6g9\") pod \"console-operator-58897d9998-ll54f\" (UID: \"b471dba4-7cf5-4cbb-beb6-010bd32f8fde\") " pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.473373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.474994 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:51.974975804 +0000 UTC m=+144.295236112 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.479889 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7fll2"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.489487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jdkd\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: W0120 22:37:51.509576 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977980fd_278e_43c6_a63d_32646c9cf3de.slice/crio-5f5ddcb8bb95d2412f4ed8c01a105b9b06cf61d2e7d25a3b456648bea6035f3f WatchSource:0}: Error finding container 5f5ddcb8bb95d2412f4ed8c01a105b9b06cf61d2e7d25a3b456648bea6035f3f: Status 404 returned error can't find the container with id 5f5ddcb8bb95d2412f4ed8c01a105b9b06cf61d2e7d25a3b456648bea6035f3f Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.516717 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.518141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.546439 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.552320 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.557137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27g6b\" (UniqueName: \"kubernetes.io/projected/d0ad1055-4774-45e8-a1ab-e932ecb14bc8-kube-api-access-27g6b\") pod \"cluster-image-registry-operator-dc59b4c8b-27n89\" (UID: \"d0ad1055-4774-45e8-a1ab-e932ecb14bc8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.574037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.574416 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.074402445 +0000 UTC m=+144.394662733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.574515 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.581303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mspd2\" (UniqueName: \"kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2\") pod \"route-controller-manager-6576b87f9c-bv2jh\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.589919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45km\" (UniqueName: \"kubernetes.io/projected/6f7b1164-858a-4230-a949-0b770ed638d8-kube-api-access-w45km\") pod \"openshift-controller-manager-operator-756b6f6bc6-mjfgk\" (UID: \"6f7b1164-858a-4230-a949-0b770ed638d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.609343 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t7nch"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.610179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgghl\" (UniqueName: \"kubernetes.io/projected/0e490f76-4197-4c4c-bfde-2911a16dad48-kube-api-access-pgghl\") pod \"openshift-apiserver-operator-796bbdcf4f-6gbd8\" (UID: \"0e490f76-4197-4c4c-bfde-2911a16dad48\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:51 crc kubenswrapper[5030]: W0120 22:37:51.616167 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fc7d72_f430_4d90_9ab9_1d79db43b695.slice/crio-b5f606d4c529d2b3019101a504440f08d4871a33b61258b9a18a1e29ecb7d637 WatchSource:0}: Error finding container b5f606d4c529d2b3019101a504440f08d4871a33b61258b9a18a1e29ecb7d637: Status 404 returned error can't find the container with id b5f606d4c529d2b3019101a504440f08d4871a33b61258b9a18a1e29ecb7d637 Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.630475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58z9\" (UniqueName: \"kubernetes.io/projected/b061ed0d-eb5f-4569-9736-74c4705a7b82-kube-api-access-f58z9\") pod \"cluster-samples-operator-665b6dd947-7pp4d\" (UID: \"b061ed0d-eb5f-4569-9736-74c4705a7b82\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.652221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshf4\" (UniqueName: \"kubernetes.io/projected/e4ad4478-2873-407f-a178-b0255511ef06-kube-api-access-qshf4\") pod \"apiserver-76f77b778f-5fgwp\" (UID: \"e4ad4478-2873-407f-a178-b0255511ef06\") " pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.656846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.664990 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.667686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4xq\" (UniqueName: \"kubernetes.io/projected/8de6d00e-a15c-47d8-a4a6-0212694a734f-kube-api-access-4n4xq\") pod \"dns-operator-744455d44c-vm7db\" (UID: \"8de6d00e-a15c-47d8-a4a6-0212694a734f\") " pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.675922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.676317 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.176301854 +0000 UTC m=+144.496562162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: W0120 22:37:51.676930 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96c05319_8357_4042_ae09_77f0c85a73d9.slice/crio-8b3e883d14d221c4734d2edc7aea1c5efd877b9f4c908bbcd02245e0b3acd2b6 WatchSource:0}: Error finding container 8b3e883d14d221c4734d2edc7aea1c5efd877b9f4c908bbcd02245e0b3acd2b6: Status 404 returned error can't find the container with id 8b3e883d14d221c4734d2edc7aea1c5efd877b9f4c908bbcd02245e0b3acd2b6 Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.684304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" event={"ID":"977980fd-278e-43c6-a63d-32646c9cf3de","Type":"ContainerStarted","Data":"5f5ddcb8bb95d2412f4ed8c01a105b9b06cf61d2e7d25a3b456648bea6035f3f"} Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.685068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" event={"ID":"96c05319-8357-4042-ae09-77f0c85a73d9","Type":"ContainerStarted","Data":"8b3e883d14d221c4734d2edc7aea1c5efd877b9f4c908bbcd02245e0b3acd2b6"} Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.686738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" event={"ID":"09fc7d72-f430-4d90-9ab9-1d79db43b695","Type":"ContainerStarted","Data":"b5f606d4c529d2b3019101a504440f08d4871a33b61258b9a18a1e29ecb7d637"} Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.687506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" 
event={"ID":"352cd616-4b0f-4f66-9fff-d477bd1e827c","Type":"ContainerStarted","Data":"66cd623935c546f96c63699e8b026854d6f3e5c8cdbfbd4a6df13840bd00646c"} Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.690188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pws2h\" (UniqueName: \"kubernetes.io/projected/9840bef3-5439-4096-bde3-812f3653da3c-kube-api-access-pws2h\") pod \"ingress-operator-5b745b69d9-4d9dw\" (UID: \"9840bef3-5439-4096-bde3-812f3653da3c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.714124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtp9h\" (UniqueName: \"kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h\") pod \"console-f9d7485db-vkrcb\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.722542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.729299 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.746223 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.747689 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-h2r7q"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.749895 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvpx\" (UniqueName: \"kubernetes.io/projected/ddc234b4-01ae-46aa-bc6b-e6e7a723c06b-kube-api-access-tjvpx\") pod \"packageserver-d55dfcdfc-m6wtc\" (UID: \"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.751510 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.757454 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.763571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnjh6\" (UniqueName: \"kubernetes.io/projected/f12e0006-3347-4e5b-98ad-bd69997cbe22-kube-api-access-tnjh6\") pod \"package-server-manager-789f6589d5-njtmh\" (UID: \"f12e0006-3347-4e5b-98ad-bd69997cbe22\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.770650 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.780675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.781274 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.281258456 +0000 UTC m=+144.601518734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.784891 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.788766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4z2w\" (UniqueName: \"kubernetes.io/projected/d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc-kube-api-access-x4z2w\") pod \"router-default-5444994796-vc527\" (UID: \"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc\") " pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.788984 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.794251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51c966fa-8fea-4242-a6b5-2e8b9c55fdd0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4l4sh\" (UID: \"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.798777 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.807938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdz7b\" (UniqueName: \"kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b\") pod \"collect-profiles-29482470-4x64w\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.832359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sstz\" (UniqueName: \"kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz\") pod \"oauth-openshift-558db77b4-ht5ll\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.856213 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.858203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfb2j\" (UniqueName: \"kubernetes.io/projected/8eadfdde-af50-4bad-a779-dc5af28f74fe-kube-api-access-sfb2j\") pod \"kube-storage-version-migrator-operator-b67b599dd-pmzpg\" (UID: \"8eadfdde-af50-4bad-a779-dc5af28f74fe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.868936 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.869863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95sh\" (UniqueName: \"kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh\") pod \"catalog-operator-68c6474976-pxvmq\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.881195 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.882138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.882540 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.38252844 +0000 UTC m=+144.702788728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.886860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64bt\" (UniqueName: \"kubernetes.io/projected/3ae9bb3b-dab8-4286-aa17-1d04b2239dd6-kube-api-access-q64bt\") pod \"machine-config-operator-74547568cd-7d8jq\" (UID: \"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.898138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.900797 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.921754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mms\" (UniqueName: \"kubernetes.io/projected/c6a2a728-a83b-4fa2-b21a-748dcb79a930-kube-api-access-t9mms\") pod \"olm-operator-6b444d44fb-8h529\" (UID: \"c6a2a728-a83b-4fa2-b21a-748dcb79a930\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.931874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg4ct\" (UniqueName: \"kubernetes.io/projected/e1e5724b-6745-4d13-a19b-a631a2483004-kube-api-access-kg4ct\") pod \"machine-approver-56656f9798-x4j96\" (UID: \"e1e5724b-6745-4d13-a19b-a631a2483004\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.951686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhh76\" (UniqueName: \"kubernetes.io/projected/09197347-c50f-4fe1-b20d-3baabf9869fc-kube-api-access-hhh76\") pod \"multus-admission-controller-857f4d67dd-mh8ms\" (UID: \"09197347-c50f-4fe1-b20d-3baabf9869fc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.960180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.964589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.970923 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.973093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hs66\" (UniqueName: \"kubernetes.io/projected/49dc7f5e-9a94-4a6a-83d0-c77019142654-kube-api-access-8hs66\") pod \"service-ca-9c57cc56f-xhbbg\" (UID: \"49dc7f5e-9a94-4a6a-83d0-c77019142654\") " pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.977955 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.986836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.987064 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.487046292 +0000 UTC m=+144.807306580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.987213 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:51 crc kubenswrapper[5030]: E0120 22:37:51.987510 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.487502692 +0000 UTC m=+144.807762980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.992476 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.992782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98tw\" (UniqueName: \"kubernetes.io/projected/607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9-kube-api-access-x98tw\") pod \"control-plane-machine-set-operator-78cbb6b69f-82699\" (UID: \"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:51 crc kubenswrapper[5030]: I0120 22:37:51.996294 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.006079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.006254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.013091 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.018772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.031899 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.045901 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.059921 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.065704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-registration-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkw8t\" (UniqueName: \"kubernetes.io/projected/5ee883e4-13d7-4013-8e09-a217f02874f2-kube-api-access-gkw8t\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7kw\" (UniqueName: \"kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088316 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e17579-9d5a-40de-8207-17d85f4c803d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088405 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv6sf\" (UniqueName: 
\"kubernetes.io/projected/94de3b86-fede-47e1-8ecf-03f167572300-kube-api-access-vv6sf\") pod \"downloads-7954f5f757-fmkdf\" (UID: \"94de3b86-fede-47e1-8ecf-03f167572300\") " pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee883e4-13d7-4013-8e09-a217f02874f2-config-volume\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088615 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee883e4-13d7-4013-8e09-a217f02874f2-metrics-tls\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.088645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4f9h\" (UniqueName: \"kubernetes.io/projected/cd0500bb-3678-45b3-bf69-010672c72338-kube-api-access-g4f9h\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkl7\" (UniqueName: \"kubernetes.io/projected/ae10e694-2552-431e-88b6-2637e467c256-kube-api-access-crkl7\") pod \"migrator-59844c95c7-9vbnp\" (UID: \"ae10e694-2552-431e-88b6-2637e467c256\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbgm\" (UniqueName: \"kubernetes.io/projected/6123fd3b-008e-4cc1-ab2b-55805290b417-kube-api-access-ppbgm\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089833 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51e17579-9d5a-40de-8207-17d85f4c803d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-csi-data-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-plugins-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: 
\"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123fd3b-008e-4cc1-ab2b-55805290b417-config\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.089996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-socket-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-config\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxtc\" (UniqueName: \"kubernetes.io/projected/17ebad12-2a66-427b-b200-64383ca6631c-kube-api-access-2wxtc\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd0500bb-3678-45b3-bf69-010672c72338-cert\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e17579-9d5a-40de-8207-17d85f4c803d-config\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vst6x\" (UniqueName: \"kubernetes.io/projected/e89be5a8-754e-4a83-9cb5-691055aed0e2-kube-api-access-vst6x\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-mountpoint-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6123fd3b-008e-4cc1-ab2b-55805290b417-serving-cert\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e89be5a8-754e-4a83-9cb5-691055aed0e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.090338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e89be5a8-754e-4a83-9cb5-691055aed0e2-proxy-tls\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.092463 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.592446934 +0000 UTC m=+144.912707222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.129071 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5fgwp"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.154952 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vm7db"] Jan 20 22:37:52 crc kubenswrapper[5030]: W0120 22:37:52.155047 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc234b4_01ae_46aa_bc6b_e6e7a723c06b.slice/crio-2b3707f3954eac946ca74dd8f5cb98d214cad07bbc6fde237592ae295ce81dec WatchSource:0}: Error finding container 2b3707f3954eac946ca74dd8f5cb98d214cad07bbc6fde237592ae295ce81dec: Status 404 returned error can't find the container with id 2b3707f3954eac946ca74dd8f5cb98d214cad07bbc6fde237592ae295ce81dec Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e89be5a8-754e-4a83-9cb5-691055aed0e2-proxy-tls\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-registration-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkw8t\" (UniqueName: \"kubernetes.io/projected/5ee883e4-13d7-4013-8e09-a217f02874f2-kube-api-access-gkw8t\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-node-bootstrap-token\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7kw\" (UniqueName: \"kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.193996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e17579-9d5a-40de-8207-17d85f4c803d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-registration-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv6sf\" (UniqueName: \"kubernetes.io/projected/94de3b86-fede-47e1-8ecf-03f167572300-kube-api-access-vv6sf\") pod \"downloads-7954f5f757-fmkdf\" (UID: \"94de3b86-fede-47e1-8ecf-03f167572300\") " pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee883e4-13d7-4013-8e09-a217f02874f2-config-volume\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194439 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee883e4-13d7-4013-8e09-a217f02874f2-metrics-tls\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4f9h\" (UniqueName: \"kubernetes.io/projected/cd0500bb-3678-45b3-bf69-010672c72338-kube-api-access-g4f9h\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-certs\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc 
kubenswrapper[5030]: I0120 22:37:52.194769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkl7\" (UniqueName: \"kubernetes.io/projected/ae10e694-2552-431e-88b6-2637e467c256-kube-api-access-crkl7\") pod \"migrator-59844c95c7-9vbnp\" (UID: \"ae10e694-2552-431e-88b6-2637e467c256\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbgm\" (UniqueName: \"kubernetes.io/projected/6123fd3b-008e-4cc1-ab2b-55805290b417-kube-api-access-ppbgm\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51e17579-9d5a-40de-8207-17d85f4c803d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-csi-data-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.194910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52jc\" (UniqueName: \"kubernetes.io/projected/b3831420-3546-4f5f-aa58-c8465f7f33ef-kube-api-access-d52jc\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.195184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-plugins-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.195218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.195250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123fd3b-008e-4cc1-ab2b-55805290b417-config\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.195333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-socket-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-config\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wxtc\" (UniqueName: \"kubernetes.io/projected/17ebad12-2a66-427b-b200-64383ca6631c-kube-api-access-2wxtc\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd0500bb-3678-45b3-bf69-010672c72338-cert\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e17579-9d5a-40de-8207-17d85f4c803d-config\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vst6x\" (UniqueName: \"kubernetes.io/projected/e89be5a8-754e-4a83-9cb5-691055aed0e2-kube-api-access-vst6x\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-mountpoint-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6123fd3b-008e-4cc1-ab2b-55805290b417-serving-cert\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e89be5a8-754e-4a83-9cb5-691055aed0e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.196798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.198566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ee883e4-13d7-4013-8e09-a217f02874f2-config-volume\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.199035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-plugins-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.199641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-csi-data-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.200254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-config\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.195850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-socket-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.200754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.201380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51e17579-9d5a-40de-8207-17d85f4c803d-config\") pod 
\"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.202084 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.702067477 +0000 UTC m=+145.022327765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.202296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e89be5a8-754e-4a83-9cb5-691055aed0e2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.202329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6123fd3b-008e-4cc1-ab2b-55805290b417-config\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.203125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17ebad12-2a66-427b-b200-64383ca6631c-mountpoint-dir\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.216146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.217566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51e17579-9d5a-40de-8207-17d85f4c803d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.220288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkw8t\" (UniqueName: \"kubernetes.io/projected/5ee883e4-13d7-4013-8e09-a217f02874f2-kube-api-access-gkw8t\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 
22:37:52.222816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e89be5a8-754e-4a83-9cb5-691055aed0e2-proxy-tls\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.229189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.229282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ee883e4-13d7-4013-8e09-a217f02874f2-metrics-tls\") pod \"dns-default-j5gw4\" (UID: \"5ee883e4-13d7-4013-8e09-a217f02874f2\") " pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.229681 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6123fd3b-008e-4cc1-ab2b-55805290b417-serving-cert\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.229991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd0500bb-3678-45b3-bf69-010672c72338-cert\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.230483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7kw\" (UniqueName: \"kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw\") pod \"marketplace-operator-79b997595-tlnpr\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.252521 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbgm\" (UniqueName: \"kubernetes.io/projected/6123fd3b-008e-4cc1-ab2b-55805290b417-kube-api-access-ppbgm\") pod \"service-ca-operator-777779d784-5rcd9\" (UID: \"6123fd3b-008e-4cc1-ab2b-55805290b417\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.293425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkl7\" (UniqueName: \"kubernetes.io/projected/ae10e694-2552-431e-88b6-2637e467c256-kube-api-access-crkl7\") pod \"migrator-59844c95c7-9vbnp\" (UID: \"ae10e694-2552-431e-88b6-2637e467c256\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.294382 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.297953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.298411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-certs\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.298474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52jc\" (UniqueName: \"kubernetes.io/projected/b3831420-3546-4f5f-aa58-c8465f7f33ef-kube-api-access-d52jc\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.298656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-node-bootstrap-token\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.298702 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.79866342 +0000 UTC m=+145.118923708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.301039 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.311888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-certs\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.315367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b3831420-3546-4f5f-aa58-c8465f7f33ef-node-bootstrap-token\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.324304 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a3aeed-8352-4f1b-8f2a-0735c467f98c-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhcfn\" (UID: \"d0a3aeed-8352-4f1b-8f2a-0735c467f98c\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.326182 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.343253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv6sf\" (UniqueName: \"kubernetes.io/projected/94de3b86-fede-47e1-8ecf-03f167572300-kube-api-access-vv6sf\") pod \"downloads-7954f5f757-fmkdf\" (UID: \"94de3b86-fede-47e1-8ecf-03f167572300\") " pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.359564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4f9h\" (UniqueName: \"kubernetes.io/projected/cd0500bb-3678-45b3-bf69-010672c72338-kube-api-access-g4f9h\") pod \"ingress-canary-5mcr2\" (UID: \"cd0500bb-3678-45b3-bf69-010672c72338\") " pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.371031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51e17579-9d5a-40de-8207-17d85f4c803d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7zdhb\" (UID: \"51e17579-9d5a-40de-8207-17d85f4c803d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.396321 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5mcr2" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.398608 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.399324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.399583 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:52.899572296 +0000 UTC m=+145.219832584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.405917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxtc\" (UniqueName: \"kubernetes.io/projected/17ebad12-2a66-427b-b200-64383ca6631c-kube-api-access-2wxtc\") pod \"csi-hostpathplugin-c8xgq\" (UID: \"17ebad12-2a66-427b-b200-64383ca6631c\") " pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.414149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vst6x\" (UniqueName: \"kubernetes.io/projected/e89be5a8-754e-4a83-9cb5-691055aed0e2-kube-api-access-vst6x\") pod \"machine-config-controller-84d6567774-m27bp\" (UID: \"e89be5a8-754e-4a83-9cb5-691055aed0e2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.416908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ll54f"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.425166 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.442586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d52jc\" (UniqueName: \"kubernetes.io/projected/b3831420-3546-4f5f-aa58-c8465f7f33ef-kube-api-access-d52jc\") pod \"machine-config-server-882nz\" (UID: \"b3831420-3546-4f5f-aa58-c8465f7f33ef\") " pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.487524 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.494284 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.494856 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.501011 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.501303 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.001286451 +0000 UTC m=+145.321546739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.506331 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.555967 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.602772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.603184 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.10317291 +0000 UTC m=+145.423433198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.684290 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.691057 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.702928 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-882nz" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.704012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.704349 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.204332101 +0000 UTC m=+145.524592389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.737363 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq"] Jan 20 22:37:52 crc kubenswrapper[5030]: W0120 22:37:52.749006 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e490f76_4197_4c4c_bfde_2911a16dad48.slice/crio-8501020a3976ebcf169751a50a1a6e2799e2e86aa9100ea82e6e2c1fb2cffb8a WatchSource:0}: Error finding container 8501020a3976ebcf169751a50a1a6e2799e2e86aa9100ea82e6e2c1fb2cffb8a: Status 404 returned error can't find the container with id 8501020a3976ebcf169751a50a1a6e2799e2e86aa9100ea82e6e2c1fb2cffb8a Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.781128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" event={"ID":"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b","Type":"ContainerStarted","Data":"2b3707f3954eac946ca74dd8f5cb98d214cad07bbc6fde237592ae295ce81dec"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.787452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" event={"ID":"977980fd-278e-43c6-a63d-32646c9cf3de","Type":"ContainerStarted","Data":"d82fac8ac1cb6cf5efd53c560c8ce4f3700ae42838adfc15fde23d136a4f2257"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.806107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: 
\"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.806451 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.306439815 +0000 UTC m=+145.626700103 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.807649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" event={"ID":"96c05319-8357-4042-ae09-77f0c85a73d9","Type":"ContainerStarted","Data":"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.808207 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.810432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" event={"ID":"09fc7d72-f430-4d90-9ab9-1d79db43b695","Type":"ContainerStarted","Data":"7fdfe1072c483840548146cccdc4d2f173a3cadad3c42d28ed379e970ef8d750"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.810451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" event={"ID":"09fc7d72-f430-4d90-9ab9-1d79db43b695","Type":"ContainerStarted","Data":"dd3853bd3504dff276e2aa24a39426165485246850892a189195e2511e15df8a"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.829064 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.832340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" event={"ID":"e174ceb8-a286-43e5-b6e4-cfc426d77226","Type":"ContainerStarted","Data":"fac75c9e3dce602af4d4936c13b0863fc50183a026513553d0dc34fe3ad07a4c"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.832375 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" event={"ID":"e174ceb8-a286-43e5-b6e4-cfc426d77226","Type":"ContainerStarted","Data":"b4281e4976474848711a7d8d8e52ffa65bb2cad32c92bf291c00957b5e8be58a"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.833295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" event={"ID":"e4ad4478-2873-407f-a178-b0255511ef06","Type":"ContainerStarted","Data":"01e9b3cf2bf4ab7837b10711d868a0f5acfa96ab938c621c686b35c2e4b23f05"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.834535 5030 generic.go:334] "Generic (PLEG): container finished" podID="352cd616-4b0f-4f66-9fff-d477bd1e827c" 
containerID="4add5276cd962313a518f53bd8d3e2cfd87083f11a8842681e301160e4a649b1" exitCode=0 Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.834661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" event={"ID":"352cd616-4b0f-4f66-9fff-d477bd1e827c","Type":"ContainerDied","Data":"4add5276cd962313a518f53bd8d3e2cfd87083f11a8842681e301160e4a649b1"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.835316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vkrcb" event={"ID":"6d43e77d-5f9a-4b29-959f-190fa60d8807","Type":"ContainerStarted","Data":"5765b1b3f27896c79d26d45e2b414b5956e6e668caadda9586bf610e58aa8d18"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.836387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" event={"ID":"c1d9550c-ae35-4705-aff2-7946b6434807","Type":"ContainerStarted","Data":"733b972ac29d61ec526a7c07d2ab81dd9b354030637f486787ec6e55f097df67"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.836476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" event={"ID":"c1d9550c-ae35-4705-aff2-7946b6434807","Type":"ContainerStarted","Data":"2a11a85221af26a37da7c784a53190646aebcd62e5dad93efbe1a841b2dab64a"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.837531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" event={"ID":"e1e5724b-6745-4d13-a19b-a631a2483004","Type":"ContainerStarted","Data":"19076a9dc5642a35d28a7cd7bdbba39e316ff1b57be89579346364220d3e5ddb"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.838605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" event={"ID":"b061ed0d-eb5f-4569-9736-74c4705a7b82","Type":"ContainerStarted","Data":"4c0a229067c28978e18fc89acc7bd046ff49cf137a95e89cc3a8f19b3284737b"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.846744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" event={"ID":"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee","Type":"ContainerStarted","Data":"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.846780 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" event={"ID":"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee","Type":"ContainerStarted","Data":"f3864d9bb33c0ece73c7b5113f79ed0d6936e6165f0884ab98a1791b0adebc66"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.849314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.852321 5030 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bv2jh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.852373 5030 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.855005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vc527" event={"ID":"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc","Type":"ContainerStarted","Data":"bc4d2268fdcb46368e60d82b387adc67cfd6d978e487e5b374b2b1ed169f12a5"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.862906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" event={"ID":"8de6d00e-a15c-47d8-a4a6-0212694a734f","Type":"ContainerStarted","Data":"cf48804c7e5661e7f3b1bae5d128aec216f5aba63da9025dbbb10228f4bd6a3d"} Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.883203 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk"] Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.909963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:52 crc kubenswrapper[5030]: E0120 22:37:52.912773 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.41274983 +0000 UTC m=+145.733010108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:52 crc kubenswrapper[5030]: I0120 22:37:52.976579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.012594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.013437 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.513425 +0000 UTC m=+145.833685288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: W0120 22:37:53.110166 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae9bb3b_dab8_4286_aa17_1d04b2239dd6.slice/crio-1d28daa7d1100d2bfcd42ad3361cbb2d100d53194cd49b6af7011ae765de71d6 WatchSource:0}: Error finding container 1d28daa7d1100d2bfcd42ad3361cbb2d100d53194cd49b6af7011ae765de71d6: Status 404 returned error can't find the container with id 1d28daa7d1100d2bfcd42ad3361cbb2d100d53194cd49b6af7011ae765de71d6 Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.118264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.118568 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.618553616 +0000 UTC m=+145.938813904 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: W0120 22:37:53.118656 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9840bef3_5439_4096_bde3_812f3653da3c.slice/crio-43296b3a64c05ac87fa2795ab3ba337235159976dfd50b67fe9743d23f3334a0 WatchSource:0}: Error finding container 43296b3a64c05ac87fa2795ab3ba337235159976dfd50b67fe9743d23f3334a0: Status 404 returned error can't find the container with id 43296b3a64c05ac87fa2795ab3ba337235159976dfd50b67fe9743d23f3334a0 Jan 20 22:37:53 crc kubenswrapper[5030]: W0120 22:37:53.118797 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c966fa_8fea_4242_a6b5_2e8b9c55fdd0.slice/crio-e17973d9d840fd36f587dac2149d41c8c5535e421855ddb4b6ee5be3690b9a8d WatchSource:0}: Error finding container e17973d9d840fd36f587dac2149d41c8c5535e421855ddb4b6ee5be3690b9a8d: Status 404 returned error can't find the container with id e17973d9d840fd36f587dac2149d41c8c5535e421855ddb4b6ee5be3690b9a8d Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.219647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.220130 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.720118467 +0000 UTC m=+146.040378755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.321870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.322212 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 22:37:53.82219743 +0000 UTC m=+146.142457718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.332778 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" podStartSLOduration=126.332762612 podStartE2EDuration="2m6.332762612s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:53.32720325 +0000 UTC m=+145.647463548" watchObservedRunningTime="2026-01-20 22:37:53.332762612 +0000 UTC m=+145.653022900" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.426842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.427366 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:53.927354857 +0000 UTC m=+146.247615145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.526117 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.529168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.529502 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.029483442 +0000 UTC m=+146.349743720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.529609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.529810 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.529849 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.029840661 +0000 UTC m=+146.350100949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.550231 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.590645 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7fll2" podStartSLOduration=127.590614429 podStartE2EDuration="2m7.590614429s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:53.590059487 +0000 UTC m=+145.910319775" watchObservedRunningTime="2026-01-20 22:37:53.590614429 +0000 UTC m=+145.910874717" Jan 20 22:37:53 crc kubenswrapper[5030]: W0120 22:37:53.624272 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b55be0c_55e3_4bf3_badd_618e8dc9f738.slice/crio-b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d WatchSource:0}: Error finding container b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d: Status 404 returned error can't find the container with id b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.632236 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.633131 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.133108912 +0000 UTC m=+146.453369200 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.636917 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.136905823 +0000 UTC m=+146.457166111 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.636435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.739584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.757162 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.257133029 +0000 UTC m=+146.577393317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.823919 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-h2r7q" podStartSLOduration=127.823888761 podStartE2EDuration="2m7.823888761s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:53.808183256 +0000 UTC m=+146.128443534" watchObservedRunningTime="2026-01-20 22:37:53.823888761 +0000 UTC m=+146.144149119" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.825068 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mh8ms"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.857497 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" podStartSLOduration=127.857479962 podStartE2EDuration="2m7.857479962s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:53.85701882 +0000 UTC m=+146.177279108" watchObservedRunningTime="2026-01-20 22:37:53.857479962 +0000 UTC m=+146.177740250" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.860345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.860656 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.360644357 +0000 UTC m=+146.680904645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.896261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" event={"ID":"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6","Type":"ContainerStarted","Data":"1d28daa7d1100d2bfcd42ad3361cbb2d100d53194cd49b6af7011ae765de71d6"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.898774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ll54f" event={"ID":"b471dba4-7cf5-4cbb-beb6-010bd32f8fde","Type":"ContainerStarted","Data":"5fed3f54df43b85a4b636e79419aa615a1bb1605ffd38f0b7d820bcb5d60b2b3"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.901975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xhbbg"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.902792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" event={"ID":"6f7b1164-858a-4230-a949-0b770ed638d8","Type":"ContainerStarted","Data":"d3e0de03b4296fe87b913a644b66e6f32648d152f0f5eb08f8b5e65bb4bbb780"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.909392 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" event={"ID":"dce6cbef-8eea-40d0-83ee-8790d2bd9450","Type":"ContainerStarted","Data":"02ed1f2589a1078b2496e47c2ca39670e4d7cfcff601ab8d54db6520fdb20c7d"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.919334 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.921023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" event={"ID":"8eadfdde-af50-4bad-a779-dc5af28f74fe","Type":"ContainerStarted","Data":"d90ac1b555b6d6d3330efa1c43aa2d80b2924edfeb971a1fce85d7ab32fc028b"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.923330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" event={"ID":"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0","Type":"ContainerStarted","Data":"e17973d9d840fd36f587dac2149d41c8c5535e421855ddb4b6ee5be3690b9a8d"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.924980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-882nz" event={"ID":"b3831420-3546-4f5f-aa58-c8465f7f33ef","Type":"ContainerStarted","Data":"82787bb188466a613ff4ab5179a43f92de4ef5fc247659315950fd64d0b506d5"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.954991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" 
event={"ID":"ddc234b4-01ae-46aa-bc6b-e6e7a723c06b","Type":"ContainerStarted","Data":"59a1c77c6135af8ba173f7414487a6b26442baf22e70496d71a3ea2e81cb58ea"} Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.955883 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.960681 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ht5ll"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.961030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:53 crc kubenswrapper[5030]: E0120 22:37:53.962665 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.462648799 +0000 UTC m=+146.782909087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.963243 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699"] Jan 20 22:37:53 crc kubenswrapper[5030]: I0120 22:37:53.988368 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t7nch" podStartSLOduration=127.988347571 podStartE2EDuration="2m7.988347571s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:53.932119361 +0000 UTC m=+146.252379659" watchObservedRunningTime="2026-01-20 22:37:53.988347571 +0000 UTC m=+146.308607859" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.037928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" event={"ID":"d0ad1055-4774-45e8-a1ab-e932ecb14bc8","Type":"ContainerStarted","Data":"d5b0d0e320b0da3f5c58a7c5b71cdb4fa6e94e5040bbfa142e2fadd0b08cd317"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.038169 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5mcr2"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.044637 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" event={"ID":"6b55be0c-55e3-4bf3-badd-618e8dc9f738","Type":"ContainerStarted","Data":"b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.070054 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.070591 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh"] Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.070884 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.570870808 +0000 UTC m=+146.891131096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.101746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vkrcb" event={"ID":"6d43e77d-5f9a-4b29-959f-190fa60d8807","Type":"ContainerStarted","Data":"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.118330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" event={"ID":"b061ed0d-eb5f-4569-9736-74c4705a7b82","Type":"ContainerStarted","Data":"166969415a164cbe11b1d4893bd6b27e8f3ab1c4af7091f708bd62a2a7bf1a67"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.121905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9"] Jan 20 22:37:54 crc kubenswrapper[5030]: W0120 22:37:54.130771 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12e0006_3347_4e5b_98ad_bd69997cbe22.slice/crio-dc41fe4ac85d52cfe466d0901e4f8d2c083cf4e396431f42e57ac204533c9ee8 WatchSource:0}: Error finding container dc41fe4ac85d52cfe466d0901e4f8d2c083cf4e396431f42e57ac204533c9ee8: Status 404 returned error can't find the container with id dc41fe4ac85d52cfe466d0901e4f8d2c083cf4e396431f42e57ac204533c9ee8 Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.140090 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vc527" event={"ID":"d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc","Type":"ContainerStarted","Data":"c575ed1c7e1abd03bd9e0da0a3c06b5cd58199b3da33ea0f5d099da0ad39e1c6"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.157340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.157811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" 
event={"ID":"e4ad4478-2873-407f-a178-b0255511ef06","Type":"ContainerStarted","Data":"334b412a1a1e2663b8890d417096c8e92c0eca759bcb58e528c90515d7d9e554"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.160016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j5gw4"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.160037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" event={"ID":"09197347-c50f-4fe1-b20d-3baabf9869fc","Type":"ContainerStarted","Data":"3240891cedc801d95f518e46ead209f2c3b837812c3896da53b6b60370fc390a"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.163445 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-fmkdf"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.173167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.173498 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.673483984 +0000 UTC m=+146.993744272 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.179475 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1d9550c-ae35-4705-aff2-7946b6434807" containerID="733b972ac29d61ec526a7c07d2ab81dd9b354030637f486787ec6e55f097df67" exitCode=0 Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.179562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" event={"ID":"c1d9550c-ae35-4705-aff2-7946b6434807","Type":"ContainerDied","Data":"733b972ac29d61ec526a7c07d2ab81dd9b354030637f486787ec6e55f097df67"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.182224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" event={"ID":"0e490f76-4197-4c4c-bfde-2911a16dad48","Type":"ContainerStarted","Data":"8501020a3976ebcf169751a50a1a6e2799e2e86aa9100ea82e6e2c1fb2cffb8a"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.185286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" event={"ID":"9840bef3-5439-4096-bde3-812f3653da3c","Type":"ContainerStarted","Data":"43296b3a64c05ac87fa2795ab3ba337235159976dfd50b67fe9743d23f3334a0"} Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.192569 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 
20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.216726 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vkrcb" podStartSLOduration=128.216711645 podStartE2EDuration="2m8.216711645s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:54.173767462 +0000 UTC m=+146.494027750" watchObservedRunningTime="2026-01-20 22:37:54.216711645 +0000 UTC m=+146.536971933" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.259690 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" podStartSLOduration=127.259676749 podStartE2EDuration="2m7.259676749s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:54.21650746 +0000 UTC m=+146.536767768" watchObservedRunningTime="2026-01-20 22:37:54.259676749 +0000 UTC m=+146.579937037" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.278189 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.280654 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.780602798 +0000 UTC m=+147.100863086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.288650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.373936 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" podStartSLOduration=128.373917942 podStartE2EDuration="2m8.373917942s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:54.325819716 +0000 UTC m=+146.646080004" watchObservedRunningTime="2026-01-20 22:37:54.373917942 +0000 UTC m=+146.694178230" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.375034 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vc527" podStartSLOduration=128.375031039 podStartE2EDuration="2m8.375031039s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:54.37002737 +0000 UTC m=+146.690287668" watchObservedRunningTime="2026-01-20 22:37:54.375031039 +0000 UTC m=+146.695291327" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.381996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.382580 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.882562869 +0000 UTC m=+147.202823157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.483546 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.483935 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:54.983919885 +0000 UTC m=+147.304180173 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.584269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.584378 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.084354769 +0000 UTC m=+147.404615057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.584466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.584835 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.08482884 +0000 UTC m=+147.405089128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: W0120 22:37:54.678593 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d754cf_ca0b_4dd4_a322_a5ad7ce30dd5.slice/crio-666b9ea133f335c27ce3bb1fad9d6fcd7eb9ffcb1fa14a4271178d8e5a5d6c5d WatchSource:0}: Error finding container 666b9ea133f335c27ce3bb1fad9d6fcd7eb9ffcb1fa14a4271178d8e5a5d6c5d: Status 404 returned error can't find the container with id 666b9ea133f335c27ce3bb1fad9d6fcd7eb9ffcb1fa14a4271178d8e5a5d6c5d Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.684998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.685399 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.185375618 +0000 UTC m=+147.505635926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: W0120 22:37:54.709414 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae10e694_2552_431e_88b6_2637e467c256.slice/crio-2548e3fd0c00b6a8d6c5ff54fc105889d9b0c88b2f584af46a8f8c884ee2237f WatchSource:0}: Error finding container 2548e3fd0c00b6a8d6c5ff54fc105889d9b0c88b2f584af46a8f8c884ee2237f: Status 404 returned error can't find the container with id 2548e3fd0c00b6a8d6c5ff54fc105889d9b0c88b2f584af46a8f8c884ee2237f Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.710902 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.728477 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.736180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.737506 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c8xgq"] Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.786431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.787155 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.287138864 +0000 UTC m=+147.607399152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.887159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.887517 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.387500206 +0000 UTC m=+147.707760494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.903328 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.907855 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:37:54 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:37:54 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:54 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.907918 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.958215 5030 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m6wtc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.958289 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" podUID="ddc234b4-01ae-46aa-bc6b-e6e7a723c06b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.988696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.989392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.990296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.990342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.990455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:54 crc kubenswrapper[5030]: E0120 22:37:54.990768 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.490753717 +0000 UTC m=+147.811014005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.992212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.995488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:54 crc kubenswrapper[5030]: I0120 22:37:54.998262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.008972 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.091857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.092332 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.592317209 +0000 UTC m=+147.912577497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.181910 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.192360 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.193486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.193847 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.693833639 +0000 UTC m=+148.014093927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.219476 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.223593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" event={"ID":"c1d9550c-ae35-4705-aff2-7946b6434807","Type":"ContainerStarted","Data":"92bb985bf79452309d2388c2b365c78553accf1922912eb69addc4f6bd5e8e73"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.230021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerStarted","Data":"666b9ea133f335c27ce3bb1fad9d6fcd7eb9ffcb1fa14a4271178d8e5a5d6c5d"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.231772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" event={"ID":"f12e0006-3347-4e5b-98ad-bd69997cbe22","Type":"ContainerStarted","Data":"dc41fe4ac85d52cfe466d0901e4f8d2c083cf4e396431f42e57ac204533c9ee8"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.235063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" event={"ID":"b061ed0d-eb5f-4569-9736-74c4705a7b82","Type":"ContainerStarted","Data":"1aeb04d1586423867cc96d2c36f03bc7c6fec5cdb76adcc0980baa33dd80ec28"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.243287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" event={"ID":"d0ad1055-4774-45e8-a1ab-e932ecb14bc8","Type":"ContainerStarted","Data":"042a3b9f89356a6dbebf3cfd0ed2a65307c68fac11c3ecbc4788b276268fde0b"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 
22:37:55.255708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" podStartSLOduration=128.255687593 podStartE2EDuration="2m8.255687593s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.254080495 +0000 UTC m=+147.574340783" watchObservedRunningTime="2026-01-20 22:37:55.255687593 +0000 UTC m=+147.575947881" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.282130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" event={"ID":"8de6d00e-a15c-47d8-a4a6-0212694a734f","Type":"ContainerStarted","Data":"0e7e9cbc31a82de0b648cf7083bc9ffbe9aaf82750f31ff1329c70ef06f1c4c4"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.282173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" event={"ID":"8de6d00e-a15c-47d8-a4a6-0212694a734f","Type":"ContainerStarted","Data":"39cd6b86995a8842a7202978d398c5b995e2110f14b321546b15bec64ec01577"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.285923 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" event={"ID":"49dc7f5e-9a94-4a6a-83d0-c77019142654","Type":"ContainerStarted","Data":"d0af0fe118a382f98fbf5a9463abff186df111dca5f359c090cc8866712f8edb"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.285961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" event={"ID":"49dc7f5e-9a94-4a6a-83d0-c77019142654","Type":"ContainerStarted","Data":"62bc4aa5b3f7bf17f93f847ddf6715a01c8c558dad96a223abdd41b1f6956d6a"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.293997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.294199 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.794179131 +0000 UTC m=+148.114439419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.294592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.296759 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.796745932 +0000 UTC m=+148.117006220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.307882 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7pp4d" podStartSLOduration=129.307861638 podStartE2EDuration="2m9.307861638s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.297461769 +0000 UTC m=+147.617722067" watchObservedRunningTime="2026-01-20 22:37:55.307861638 +0000 UTC m=+147.628121926" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.355768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" event={"ID":"352cd616-4b0f-4f66-9fff-d477bd1e827c","Type":"ContainerStarted","Data":"85a32abb057f59df0052b71200e9e3d2c28f51b259e5dc9c02e155a7ee42be99"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.356329 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.363546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" event={"ID":"9840bef3-5439-4096-bde3-812f3653da3c","Type":"ContainerStarted","Data":"e44f65eb6983311de839972ef5dccd7a5d44d35cf7ebcdd28dc22bc0e17ff0c8"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.365763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" 
event={"ID":"6f7b1164-858a-4230-a949-0b770ed638d8","Type":"ContainerStarted","Data":"3a99fd2e7405d716ffb0e8036ce647d577f9ca54e378794d16d154d2c0d1291a"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.380951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-882nz" event={"ID":"b3831420-3546-4f5f-aa58-c8465f7f33ef","Type":"ContainerStarted","Data":"e49c2a8797910becf99c941ea5a99fdcffb6d39522a906d12fe2ebef4354088f"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.385898 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vm7db" podStartSLOduration=129.385881527 podStartE2EDuration="2m9.385881527s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.384648507 +0000 UTC m=+147.704908795" watchObservedRunningTime="2026-01-20 22:37:55.385881527 +0000 UTC m=+147.706141815" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.393632 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-27n89" podStartSLOduration=129.393598691 podStartE2EDuration="2m9.393598691s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.344947971 +0000 UTC m=+147.665208259" watchObservedRunningTime="2026-01-20 22:37:55.393598691 +0000 UTC m=+147.713858979" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.408791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.410029 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:55.910013292 +0000 UTC m=+148.230273580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.410727 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xhbbg" podStartSLOduration=128.410708749 podStartE2EDuration="2m8.410708749s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.404388978 +0000 UTC m=+147.724649266" watchObservedRunningTime="2026-01-20 22:37:55.410708749 +0000 UTC m=+147.730969047" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.451916 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" podStartSLOduration=129.45190203 podStartE2EDuration="2m9.45190203s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.45017664 +0000 UTC m=+147.770436928" watchObservedRunningTime="2026-01-20 22:37:55.45190203 +0000 UTC m=+147.772162318" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.463363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" event={"ID":"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9","Type":"ContainerStarted","Data":"18e47d1cc243b1d940fcd3dcb3ed763f7ca1e6eafdd8bc3a31f6de2a5726fbe8"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.463402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" event={"ID":"607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9","Type":"ContainerStarted","Data":"fee33bf6c0ede4a47af24d285f84fdcc3dfd765c57ab311fa67d63e9a7270406"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.491850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" event={"ID":"8eadfdde-af50-4bad-a779-dc5af28f74fe","Type":"ContainerStarted","Data":"9800834970b1d02262a69a9d990401c7a3744fb14c10b11718a91480d9630da9"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.515489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.515774 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.015760363 +0000 UTC m=+148.336020641 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.518331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" event={"ID":"51c966fa-8fea-4242-a6b5-2e8b9c55fdd0","Type":"ContainerStarted","Data":"e3616bcc0ed90d3864df55e9297f1f7febb403acc2e26656041991cbb07cba40"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.525740 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mjfgk" podStartSLOduration=129.525719 podStartE2EDuration="2m9.525719s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.482814957 +0000 UTC m=+147.803075245" watchObservedRunningTime="2026-01-20 22:37:55.525719 +0000 UTC m=+147.845979288" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.525849 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-882nz" podStartSLOduration=6.5258435729999995 podStartE2EDuration="6.525843573s" podCreationTimestamp="2026-01-20 22:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.514855431 +0000 UTC m=+147.835115719" watchObservedRunningTime="2026-01-20 22:37:55.525843573 +0000 UTC m=+147.846103861" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.550859 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pmzpg" podStartSLOduration=129.550840419 podStartE2EDuration="2m9.550840419s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.549256062 +0000 UTC m=+147.869516340" watchObservedRunningTime="2026-01-20 22:37:55.550840419 +0000 UTC m=+147.871100707" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.563920 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" event={"ID":"e89be5a8-754e-4a83-9cb5-691055aed0e2","Type":"ContainerStarted","Data":"f62ac2ee214884fabaa0c69910c02c64a2c000692f7696299597ab3fd91dcf96"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.620389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.621347 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.121332009 +0000 UTC m=+148.441592297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.629899 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-82699" podStartSLOduration=129.629881654 podStartE2EDuration="2m9.629881654s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.592356629 +0000 UTC m=+147.912616917" watchObservedRunningTime="2026-01-20 22:37:55.629881654 +0000 UTC m=+147.950141942" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.668226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" event={"ID":"c6a2a728-a83b-4fa2-b21a-748dcb79a930","Type":"ContainerStarted","Data":"fa5703ecd6455fd96ea3667b4e29e2e3b18675083eb2981a326aa3a2ff683aba"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.668484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" event={"ID":"c6a2a728-a83b-4fa2-b21a-748dcb79a930","Type":"ContainerStarted","Data":"cc59feedf2bb063b952d8fda2dc5e3edf23306056488ebfd17aad64fc6a7d9ff"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.669516 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.690750 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4l4sh" podStartSLOduration=129.690731634 podStartE2EDuration="2m9.690731634s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.631145853 +0000 UTC m=+147.951406141" watchObservedRunningTime="2026-01-20 22:37:55.690731634 +0000 UTC m=+148.010991922" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.705432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" event={"ID":"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6","Type":"ContainerStarted","Data":"6054960f94f69e24c59ba063560496eaab3a2763b7e4c8362a8c65859ae57d51"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.737881 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" event={"ID":"09197347-c50f-4fe1-b20d-3baabf9869fc","Type":"ContainerStarted","Data":"a97455988fc854f48eaba1d20d4615dddfc13b8d555c95926726b9b0074ee1cb"} Jan 20 22:37:55 
crc kubenswrapper[5030]: I0120 22:37:55.739296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.742868 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.242853457 +0000 UTC m=+148.563113735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.743006 5030 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-8h529 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.743065 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" podUID="c6a2a728-a83b-4fa2-b21a-748dcb79a930" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.782029 5030 generic.go:334] "Generic (PLEG): container finished" podID="e4ad4478-2873-407f-a178-b0255511ef06" containerID="334b412a1a1e2663b8890d417096c8e92c0eca759bcb58e528c90515d7d9e554" exitCode=0 Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.782275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" event={"ID":"e4ad4478-2873-407f-a178-b0255511ef06","Type":"ContainerDied","Data":"334b412a1a1e2663b8890d417096c8e92c0eca759bcb58e528c90515d7d9e554"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.840993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.841394 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.341374096 +0000 UTC m=+148.661634384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.855421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ll54f" event={"ID":"b471dba4-7cf5-4cbb-beb6-010bd32f8fde","Type":"ContainerStarted","Data":"3d0b567e14daf62e5e9f4da85b8b84046de9049507eb170ee51ff97bbf783755"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.862195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.878207 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ll54f" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.887441 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ll54f" podStartSLOduration=129.887422723 podStartE2EDuration="2m9.887422723s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.887059044 +0000 UTC m=+148.207319322" watchObservedRunningTime="2026-01-20 22:37:55.887422723 +0000 UTC m=+148.207683011" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.890599 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" podStartSLOduration=128.890573239 podStartE2EDuration="2m8.890573239s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.692506836 +0000 UTC m=+148.012767124" watchObservedRunningTime="2026-01-20 22:37:55.890573239 +0000 UTC m=+148.210833537" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.908445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" event={"ID":"e1e5724b-6745-4d13-a19b-a631a2483004","Type":"ContainerStarted","Data":"b6024c6273e84d1bddd384216dd526baff0bfa86bcfa7ae26a1752611dd74fd1"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.913191 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:37:55 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:37:55 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:55 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.913226 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.948236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:55 crc kubenswrapper[5030]: E0120 22:37:55.949152 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.449138385 +0000 UTC m=+148.769398673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.955400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fmkdf" event={"ID":"94de3b86-fede-47e1-8ecf-03f167572300","Type":"ContainerStarted","Data":"8c358a1b463f436c0c261d30d51e822b592b87a49651973b1a30d34fa64cf7ca"} Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.956308 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.961279 5030 patch_prober.go:28] interesting pod/downloads-7954f5f757-fmkdf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.961332 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fmkdf" podUID="94de3b86-fede-47e1-8ecf-03f167572300" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.974527 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-fmkdf" podStartSLOduration=129.97451268 podStartE2EDuration="2m9.97451268s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:55.973839763 +0000 UTC m=+148.294100051" watchObservedRunningTime="2026-01-20 22:37:55.97451268 +0000 UTC m=+148.294772968" Jan 20 22:37:55 crc kubenswrapper[5030]: I0120 22:37:55.995515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6gbd8" event={"ID":"0e490f76-4197-4c4c-bfde-2911a16dad48","Type":"ContainerStarted","Data":"222fcfcd6dceafea4b61b759dd1202d853f04cc37934c584ea6dd2532e947b2f"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.036157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" event={"ID":"ae10e694-2552-431e-88b6-2637e467c256","Type":"ContainerStarted","Data":"2548e3fd0c00b6a8d6c5ff54fc105889d9b0c88b2f584af46a8f8c884ee2237f"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.051299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.052596 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.5525762 +0000 UTC m=+148.872836488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.082926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" event={"ID":"4e59bf88-6561-4b84-93a3-4aa09d919f56","Type":"ContainerStarted","Data":"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.082983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" event={"ID":"4e59bf88-6561-4b84-93a3-4aa09d919f56","Type":"ContainerStarted","Data":"12ca7f2ac882c8f918efc86d51bb39d5292472e250ffde8e6ff35f5f396f6a61"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.084665 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.110675 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" podStartSLOduration=130.110658345 podStartE2EDuration="2m10.110658345s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:56.104952239 +0000 UTC m=+148.425212527" watchObservedRunningTime="2026-01-20 22:37:56.110658345 +0000 UTC m=+148.430918633" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.131282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" event={"ID":"6b55be0c-55e3-4bf3-badd-618e8dc9f738","Type":"ContainerStarted","Data":"a8eea5b347bca88e0564d71e4e5a6ec69c6b240ae4f761c7184c5d721ecc3b69"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.167767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.168077 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.668061184 +0000 UTC m=+148.988321552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.197813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" event={"ID":"51e17579-9d5a-40de-8207-17d85f4c803d","Type":"ContainerStarted","Data":"99207ce7a1973ded5cc8e978c17519b4392871fa91680286673c9112e125d807"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.229740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" event={"ID":"d0a3aeed-8352-4f1b-8f2a-0735c467f98c","Type":"ContainerStarted","Data":"4a7e75978f0cc4527e15e89d674e76c95c1cae5b455343c1dfbb70d87d393596"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.267816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j5gw4" event={"ID":"5ee883e4-13d7-4013-8e09-a217f02874f2","Type":"ContainerStarted","Data":"42b05b39c57d76329ea4a158d9130d352d6314f72652c04ef93d835335e6bd98"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.275607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.276613 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.776598641 +0000 UTC m=+149.096858919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.319757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" event={"ID":"dce6cbef-8eea-40d0-83ee-8790d2bd9450","Type":"ContainerStarted","Data":"c69fc5730d416ab0b9e63766bd41ff5288d7a1376f17aa93612b835bc23c8ef9"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.320910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.352234 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.358896 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" podStartSLOduration=130.358884842 podStartE2EDuration="2m10.358884842s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:56.198489629 +0000 UTC m=+148.518749917" watchObservedRunningTime="2026-01-20 22:37:56.358884842 +0000 UTC m=+148.679145130" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.362332 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" podStartSLOduration=129.362324245 podStartE2EDuration="2m9.362324245s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:56.356351662 +0000 UTC m=+148.676611950" watchObservedRunningTime="2026-01-20 22:37:56.362324245 +0000 UTC m=+148.682584533" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.378541 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.379472 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.879456863 +0000 UTC m=+149.199717151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.398892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" event={"ID":"17ebad12-2a66-427b-b200-64383ca6631c","Type":"ContainerStarted","Data":"d3ece465f8b93d1c3207b061dadce81261f6e5fdf3f38f27d0aea8ffebdb408a"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.426077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" event={"ID":"6123fd3b-008e-4cc1-ab2b-55805290b417","Type":"ContainerStarted","Data":"f886e3bfe536afa62768cc2e6cf18b9d29e00b4100ce5bca40d22ec2a35e448d"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.461159 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" podStartSLOduration=129.46114479 podStartE2EDuration="2m9.46114479s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:56.459224945 +0000 UTC m=+148.779485233" watchObservedRunningTime="2026-01-20 22:37:56.46114479 +0000 UTC m=+148.781405078" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.469690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5mcr2" event={"ID":"cd0500bb-3678-45b3-bf69-010672c72338","Type":"ContainerStarted","Data":"ab529e53340d3ebf9012ae1e416af00cf04d2e2d12c0edf290d9ec1615e8d3aa"} Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.474598 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m6wtc" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.485957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.487251 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:56.987232512 +0000 UTC m=+149.307492800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.519967 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5mcr2" podStartSLOduration=7.519948162 podStartE2EDuration="7.519948162s" podCreationTimestamp="2026-01-20 22:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:56.510644461 +0000 UTC m=+148.830904749" watchObservedRunningTime="2026-01-20 22:37:56.519948162 +0000 UTC m=+148.840208440" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.553353 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.553673 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.562975 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.590355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.590710 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.090684889 +0000 UTC m=+149.410945177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.692042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.692377 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.192361542 +0000 UTC m=+149.512621820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.793780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.794460 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.294444576 +0000 UTC m=+149.614704864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.861529 5030 csr.go:261] certificate signing request csr-hqc55 is approved, waiting to be issued Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.870578 5030 csr.go:257] certificate signing request csr-hqc55 is issued Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.895639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.895958 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.395940256 +0000 UTC m=+149.716200544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.907881 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:37:56 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:37:56 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:56 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.907942 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:37:56 crc kubenswrapper[5030]: I0120 22:37:56.996950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:56 crc kubenswrapper[5030]: E0120 22:37:56.997350 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-20 22:37:57.497332172 +0000 UTC m=+149.817592450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.085231 5030 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ht5ll container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.085304 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.097656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.097818 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.597794528 +0000 UTC m=+149.918054816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.097874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.098161 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.598147136 +0000 UTC m=+149.918407424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.199344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.199549 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.699517093 +0000 UTC m=+150.019777381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.199718 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.200118 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.700102076 +0000 UTC m=+150.020362434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.213156 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.214078 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.217189 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.228695 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.277508 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5sb9g" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.301103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.301265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb2gb\" (UniqueName: \"kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.301312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.301375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.301463 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.801448792 +0000 UTC m=+150.121709080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.402442 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403187 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb2gb\" (UniqueName: \"kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.403444 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:57.903427613 +0000 UTC m=+150.223687901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.403702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.404081 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.406478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.423803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.436710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb2gb\" (UniqueName: \"kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb\") pod \"certified-operators-s7w6m\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.504741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.504993 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.004973205 +0000 UTC m=+150.325233493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.505192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.505230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.505348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.505386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cmd\" (UniqueName: \"kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.505712 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.005700461 +0000 UTC m=+150.325960739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.516678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" event={"ID":"09197347-c50f-4fe1-b20d-3baabf9869fc","Type":"ContainerStarted","Data":"8f72ee15223e9c18a69400ee3c956a775936bbfc4a6cb17c385188b19b86912c"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.529731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j5gw4" event={"ID":"5ee883e4-13d7-4013-8e09-a217f02874f2","Type":"ContainerStarted","Data":"2d42022d5b2724fbb0d368af467da192a44b0f075c55b1ce613469710738fe0a"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.529774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j5gw4" event={"ID":"5ee883e4-13d7-4013-8e09-a217f02874f2","Type":"ContainerStarted","Data":"0534f6741ccddd13e44785f0700d9dc00d66ef067013169a45f3fb2a104e5fb0"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.529887 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-j5gw4" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.537207 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.547432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" event={"ID":"f12e0006-3347-4e5b-98ad-bd69997cbe22","Type":"ContainerStarted","Data":"9a912e156a19c1115785010b7bf8fbf553803673f927e8df27f4dfed40d640d5"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.547479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" event={"ID":"f12e0006-3347-4e5b-98ad-bd69997cbe22","Type":"ContainerStarted","Data":"450b8fbe62bab6fa1d27a6d2946b9fe9e5ca2067f430225b306f3163da169a4e"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.548344 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.558964 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j5gw4" podStartSLOduration=8.558948641 podStartE2EDuration="8.558948641s" podCreationTimestamp="2026-01-20 22:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.557906887 +0000 UTC m=+149.878167175" watchObservedRunningTime="2026-01-20 22:37:57.558948641 +0000 UTC m=+149.879208929" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.559211 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mh8ms" podStartSLOduration=130.559206147 podStartE2EDuration="2m10.559206147s" 
podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.53669836 +0000 UTC m=+149.856958648" watchObservedRunningTime="2026-01-20 22:37:57.559206147 +0000 UTC m=+149.879466425" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.567710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"84babbe2667438deb2fc3ab462f9e786b952e07b224fa5bd7cc6708b708686a9"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.567751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"90f4ca79d03bbe6beaa87824a72e1db05e75d88ddb26bfd91100a1a03cb1cc6b"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.568235 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.593206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" event={"ID":"e89be5a8-754e-4a83-9cb5-691055aed0e2","Type":"ContainerStarted","Data":"639d69152d98b8fd5aa903e8d7eed85b8720f97fe49094e0de7ee6c2a497f498"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.593249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" event={"ID":"e89be5a8-754e-4a83-9cb5-691055aed0e2","Type":"ContainerStarted","Data":"6c26ff29edaf4cdd1dca97b3253f34f47cad094bbdb79ab417a948abe6c9a19e"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.610019 5030 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.610299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.610460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cmd\" (UniqueName: \"kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.610549 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.610598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.611805 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.111787231 +0000 UTC m=+150.432047519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.612414 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" podStartSLOduration=130.612396235 podStartE2EDuration="2m10.612396235s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.590046172 +0000 UTC m=+149.910306460" watchObservedRunningTime="2026-01-20 22:37:57.612396235 +0000 UTC m=+149.932656523" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.612599 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.613496 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.613655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.614208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.628162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" event={"ID":"3ae9bb3b-dab8-4286-aa17-1d04b2239dd6","Type":"ContainerStarted","Data":"00cc90a9d99c2e2ab64185962970ff401f6af8ebaec3dac5aad371d8c2fa66ee"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.635515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cmd\" (UniqueName: \"kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd\") pod \"community-operators-rwqgg\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.638848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.655553 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-m27bp" podStartSLOduration=131.655536333 podStartE2EDuration="2m11.655536333s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.654735805 +0000 UTC m=+149.974996103" watchObservedRunningTime="2026-01-20 22:37:57.655536333 +0000 UTC m=+149.975796621" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.677528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" event={"ID":"e4ad4478-2873-407f-a178-b0255511ef06","Type":"ContainerStarted","Data":"ec911f095434b0130375da11efcfb39742e8c6b7875eefffb3008a9131c46225"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.678335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" event={"ID":"e4ad4478-2873-407f-a178-b0255511ef06","Type":"ContainerStarted","Data":"9a287e1090a419384e0a8740d51b72962588067cd276709aedd5a8c6b28c6c9e"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.687572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" event={"ID":"ae10e694-2552-431e-88b6-2637e467c256","Type":"ContainerStarted","Data":"7586cd462e580bc0f5e0b2b85268c541def099c469d429629d2a13dda811a66e"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.687641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" event={"ID":"ae10e694-2552-431e-88b6-2637e467c256","Type":"ContainerStarted","Data":"42f8da4b77649d53fb9188344bc22cae5fe8d7df20ad63b85eae40ace31f2ffc"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.690687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1cf53bea62b50cffb1553987cea77429416e61b252ae5c72c0d1c3cff3a0fa5b"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.690724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c0ccde0ad5e047dd712322ee529dff19b8b7b6c38037c9f9e8aacb5047438dce"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.711645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.712006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.712036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h96fm\" (UniqueName: \"kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.712074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.713841 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.213825943 +0000 UTC m=+150.534086231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.728934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerStarted","Data":"27418cd1f9ab974fdd707fae32edb6029362def21560dbea701ae7e4bcc59398"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.730064 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.731781 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" podStartSLOduration=131.731755531 podStartE2EDuration="2m11.731755531s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.708368153 +0000 UTC m=+150.028628451" watchObservedRunningTime="2026-01-20 22:37:57.731755531 +0000 UTC m=+150.052015819" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.732679 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.734225 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7d8jq" podStartSLOduration=131.734207998 podStartE2EDuration="2m11.734207998s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.732312624 +0000 UTC m=+150.052572912" watchObservedRunningTime="2026-01-20 22:37:57.734207998 +0000 UTC m=+150.054468286" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.745766 5030 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tlnpr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.745808 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.755391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" event={"ID":"d0a3aeed-8352-4f1b-8f2a-0735c467f98c","Type":"ContainerStarted","Data":"23b25641f1ae6048c55aa7d3c39da6261ca5243884912f5562077c2fe0975ba4"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.770339 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9vbnp" podStartSLOduration=131.77032314 podStartE2EDuration="2m11.77032314s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.769024479 +0000 UTC m=+150.089284767" watchObservedRunningTime="2026-01-20 22:37:57.77032314 +0000 UTC m=+150.090583428" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.783240 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-fmkdf" event={"ID":"94de3b86-fede-47e1-8ecf-03f167572300","Type":"ContainerStarted","Data":"d3c7bf6756f4796f9cf3bad7ccc3b72250c5efdb322c47cf51654ce26a2df846"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.783969 5030 patch_prober.go:28] interesting pod/downloads-7954f5f757-fmkdf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.783994 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fmkdf" podUID="94de3b86-fede-47e1-8ecf-03f167572300" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.800040 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" podStartSLOduration=130.800024428 podStartE2EDuration="2m10.800024428s" podCreationTimestamp="2026-01-20 22:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.797659472 +0000 UTC m=+150.117919760" watchObservedRunningTime="2026-01-20 22:37:57.800024428 +0000 UTC m=+150.120284726" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.804090 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.805106 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.810394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" event={"ID":"9840bef3-5439-4096-bde3-812f3653da3c","Type":"ContainerStarted","Data":"f737bddb7e90891ef87116d6fc5b7d2ae665ebf800f70962a91947f1c6f8f460"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.817791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.817964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.818005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h96fm\" (UniqueName: \"kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.818123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.818849 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.318803416 +0000 UTC m=+150.639063704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.819244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.821257 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.821811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.838395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" event={"ID":"51e17579-9d5a-40de-8207-17d85f4c803d","Type":"ContainerStarted","Data":"f4bb4928dee85ddf71f5912f85062a05a8d528998b4ffbb1474f058fa1b7f7bc"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.843054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rcd9" event={"ID":"6123fd3b-008e-4cc1-ab2b-55805290b417","Type":"ContainerStarted","Data":"d14253eeda8da75006a5d3347960db9603d3fb16f9ed3a0639c65036994cad5d"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.843668 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h96fm\" (UniqueName: \"kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm\") pod \"certified-operators-w9p75\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.846654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" event={"ID":"e1e5724b-6745-4d13-a19b-a631a2483004","Type":"ContainerStarted","Data":"68a0d67568d13486a9f4afee86d5a56b2b2bf76504f16b4077c391d1c45cfeea"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.857735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5mcr2" event={"ID":"cd0500bb-3678-45b3-bf69-010672c72338","Type":"ContainerStarted","Data":"74a0a92a8d4d0fb72fa3250ccabb84e249134387630b9d19cf48473d7c3f94f9"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.866531 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhcfn" podStartSLOduration=131.866509353 podStartE2EDuration="2m11.866509353s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.839602342 +0000 UTC m=+150.159862630" watchObservedRunningTime="2026-01-20 22:37:57.866509353 +0000 UTC m=+150.186769641" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.870107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a21efd52293dd381a09d7db00d92120027f32cf400f3d178abc8f1c327a5afe6"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.870143 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0bfb4e8dd61c89067a0158e27c2c713a55a35c1079d3bcda39795b0658d1d5de"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.871599 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-20 22:32:56 +0000 UTC, rotation deadline is 2026-10-29 20:37:58.91561369 +0000 UTC Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.871679 5030 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6766h0m1.043938184s for next certificate rotation Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.877286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" event={"ID":"17ebad12-2a66-427b-b200-64383ca6631c","Type":"ContainerStarted","Data":"40cb088910e62f0a6f7043617c7c1b50d954bbd2c10a8c422023239cf58be272"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.877347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" event={"ID":"17ebad12-2a66-427b-b200-64383ca6631c","Type":"ContainerStarted","Data":"0b806a51182e1ead2694ab61b9e96c908868940948dbcdd4b16bb46b40c59c44"} Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.888340 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4d9dw" podStartSLOduration=131.888309213 podStartE2EDuration="2m11.888309213s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.880676891 +0000 UTC m=+150.200937179" watchObservedRunningTime="2026-01-20 22:37:57.888309213 +0000 UTC m=+150.208569501" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.890986 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.895185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-8h529" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.910716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hszg6" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.910842 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:37:57 crc kubenswrapper[5030]: 
[-]has-synced failed: reason withheld Jan 20 22:37:57 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:57 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.910876 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.922614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.922758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.922783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.926537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnkf\" (UniqueName: \"kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:57 crc kubenswrapper[5030]: E0120 22:37:57.948500 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.448486007 +0000 UTC m=+150.768746285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dbvw5" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.956849 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7zdhb" podStartSLOduration=131.956831306 podStartE2EDuration="2m11.956831306s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.911869974 +0000 UTC m=+150.232130262" watchObservedRunningTime="2026-01-20 22:37:57.956831306 +0000 UTC m=+150.277091584" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.958827 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x4j96" podStartSLOduration=131.958821464 podStartE2EDuration="2m11.958821464s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:57.955239568 +0000 UTC m=+150.275499866" watchObservedRunningTime="2026-01-20 22:37:57.958821464 +0000 UTC m=+150.279081752" Jan 20 22:37:57 crc kubenswrapper[5030]: I0120 22:37:57.978342 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.049814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.050041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnkf\" (UniqueName: \"kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.050145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.050194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.050585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: E0120 22:37:58.050676 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 22:37:58.550654712 +0000 UTC m=+150.870915010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.052010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.086372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnkf\" (UniqueName: \"kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf\") pod \"community-operators-xvfww\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.118701 5030 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-20T22:37:57.610042959Z","Handler":null,"Name":""} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.135345 5030 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.135381 5030 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.139880 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.156293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.167305 5030 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
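The entries above capture the turning point of this section: every MountVolume.MountDevice and UnmountVolume.TearDown attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" and is backed off 500ms at a time, until the plugin watcher picks up the registration socket under /var/lib/kubelet/plugins_registry, csi_plugin.go registers the driver, and the pending mount finally proceeds (MountDevice is skipped because the driver does not advertise STAGE_UNSTAGE_VOLUME). The Go sketch below is only an illustrative approximation of that retry loop, not kubelet source: the in-memory maps, the mountDevice helper, and the attempt-count trigger for registration are hypothetical, while the driver name, volume ID, 500ms durationBeforeRetry, and the capability message are taken from the log.

package main

import (
	"fmt"
	"time"
)

// registeredDrivers stands in for the kubelet's CSI driver registry, which only
// gains an entry once the plugin watcher has seen the driver's registration
// socket under /var/lib/kubelet/plugins_registry (hypothetical stand-in, not
// the real data structure).
var registeredDrivers = map[string]struct{}{}

// stageUnstageSupported mirrors the STAGE_UNSTAGE_VOLUME capability check; the
// hostpath provisioner in this log does not advertise it, so staging is skipped.
var stageUnstageSupported = map[string]bool{
	"kubevirt.io.hostpath-provisioner": false,
}

// mountDevice approximates MountVolume.MountDevice: it fails while the driver
// is unregistered and becomes a no-op when STAGE_UNSTAGE_VOLUME is absent.
func mountDevice(driver string) error {
	if _, ok := registeredDrivers[driver]; !ok {
		return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", driver)
	}
	if !stageUnstageSupported[driver] {
		fmt.Println("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
		return nil
	}
	// A real implementation would issue the NodeStageVolume gRPC call here.
	return nil
}

func main() {
	const durationBeforeRetry = 500 * time.Millisecond // value reported in the log
	driver := "kubevirt.io.hostpath-provisioner"
	volume := "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"

	for attempt := 1; ; attempt++ {
		// Pretend the plugin watcher finds the registration socket on the third
		// pass, roughly where csi_plugin.go logs "Register new plugin".
		if attempt == 3 {
			registeredDrivers[driver] = struct{}{}
		}
		if err := mountDevice(driver); err != nil {
			fmt.Printf("attempt %d: %v; retrying after %s\n", attempt, err, durationBeforeRetry)
			time.Sleep(durationBeforeRetry)
			continue
		}
		fmt.Printf("attempt %d: MountVolume.MountDevice succeeded for volume %q\n", attempt, volume)
		return
	}
}

In the real kubelet the retry cadence comes from the volume manager's reconciler together with the per-operation backoff in nestedpendingoperations.go, and the registration itself is driven by the kubevirt.io.hostpath-provisioner-reg.sock socket visible in the plugin_watcher and RegisterPlugin entries above; the sketch collapses both into one loop purely to make the failure-then-success sequence in the log easier to follow.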
Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.167342 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.248494 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:37:58 crc kubenswrapper[5030]: W0120 22:37:58.288818 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25dd4196_71a9_4ac4_ba14_e29499a14b97.slice/crio-6864bfe41e58a5ed19564f16df60d76f2c32ff3a2c2741e9053528143b5a4226 WatchSource:0}: Error finding container 6864bfe41e58a5ed19564f16df60d76f2c32ff3a2c2741e9053528143b5a4226: Status 404 returned error can't find the container with id 6864bfe41e58a5ed19564f16df60d76f2c32ff3a2c2741e9053528143b5a4226 Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.326529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.337156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dbvw5\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.364195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.431254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.488837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 22:37:58 crc kubenswrapper[5030]: W0120 22:37:58.666167 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a85129d_c6c1_484b_b19b_d22ddfa78fd4.slice/crio-2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd WatchSource:0}: Error finding container 2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd: Status 404 returned error can't find the container with id 2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.669643 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.702318 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.903992 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:37:58 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:37:58 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:58 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.904354 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.904171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerDied","Data":"f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.904147 5030 generic.go:334] "Generic (PLEG): container finished" podID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerID="f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb" exitCode=0 Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.904472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerStarted","Data":"aa5445b5266c95f5eec22516cfff30bddfe56a2c63917ed982b05eb5d3b2390f"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.918381 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.927046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerStarted","Data":"54a8905efa3bfda9f6b135f1400371f68d47f5c35b90fded926bc064dabfd49f"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.927084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerStarted","Data":"2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.934407 5030 
generic.go:334] "Generic (PLEG): container finished" podID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerID="8cba325132615f82304d2bc68c63750a26d4562ee29af5331425c794267485bf" exitCode=0 Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.935765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerDied","Data":"8cba325132615f82304d2bc68c63750a26d4562ee29af5331425c794267485bf"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.935845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerStarted","Data":"6864bfe41e58a5ed19564f16df60d76f2c32ff3a2c2741e9053528143b5a4226"} Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.967226 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b55be0c-55e3-4bf3-badd-618e8dc9f738" containerID="a8eea5b347bca88e0564d71e4e5a6ec69c6b240ae4f761c7184c5d721ecc3b69" exitCode=0 Jan 20 22:37:58 crc kubenswrapper[5030]: I0120 22:37:58.967318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" event={"ID":"6b55be0c-55e3-4bf3-badd-618e8dc9f738","Type":"ContainerDied","Data":"a8eea5b347bca88e0564d71e4e5a6ec69c6b240ae4f761c7184c5d721ecc3b69"} Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.004105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" event={"ID":"17ebad12-2a66-427b-b200-64383ca6631c","Type":"ContainerStarted","Data":"c21f192fb67e0bcc5c5e330b8db839ddfb95064ad010b3c19c90693a80ec9a24"} Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.004351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" event={"ID":"17ebad12-2a66-427b-b200-64383ca6631c","Type":"ContainerStarted","Data":"d3492ca89b4c6571455fcb8467ac805f9491ca231243c6cc251a54d2fb767351"} Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.014914 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.045688 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-c8xgq" podStartSLOduration=10.045668653 podStartE2EDuration="10.045668653s" podCreationTimestamp="2026-01-20 22:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:37:59.042093808 +0000 UTC m=+151.362354116" watchObservedRunningTime="2026-01-20 22:37:59.045668653 +0000 UTC m=+151.365928941" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.046670 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerStarted","Data":"9e3e2049531dfd9edacbcf426b2df86b53eff7c01b10ae805dd0b6219edbdef7"} Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.058628 5030 patch_prober.go:28] interesting pod/downloads-7954f5f757-fmkdf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.058666 
5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fmkdf" podUID="94de3b86-fede-47e1-8ecf-03f167572300" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.067163 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.401135 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.402177 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.404560 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.420266 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.490261 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.490503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.490542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nt87\" (UniqueName: \"kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.591943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.591988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nt87\" (UniqueName: \"kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.592042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities\") pod 
\"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.592635 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.592853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.614934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nt87\" (UniqueName: \"kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87\") pod \"redhat-marketplace-c8mnn\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.714112 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.803766 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.804766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.811197 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.899408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.899459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.899488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbxlt\" (UniqueName: \"kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.905930 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Jan 20 22:37:59 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:37:59 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:37:59 crc kubenswrapper[5030]: healthz check failed Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.906002 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.971431 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 20 22:37:59 crc kubenswrapper[5030]: I0120 22:37:59.972648 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.001016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.001078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.001109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbxlt\" (UniqueName: \"kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.002064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.002843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.024718 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbxlt\" (UniqueName: \"kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt\") pod \"redhat-marketplace-zfx2r\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.066156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" 
event={"ID":"e6b44a8f-35c4-4a14-a7bc-ab287216c87d","Type":"ContainerStarted","Data":"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa"} Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.066195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" event={"ID":"e6b44a8f-35c4-4a14-a7bc-ab287216c87d","Type":"ContainerStarted","Data":"86415a1670562d1bac1b8ab29155b7e77cb42cc2a02b13f0e0246ca0297b4ff7"} Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.066765 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.070169 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerID="54a8905efa3bfda9f6b135f1400371f68d47f5c35b90fded926bc064dabfd49f" exitCode=0 Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.070263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerDied","Data":"54a8905efa3bfda9f6b135f1400371f68d47f5c35b90fded926bc064dabfd49f"} Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.072928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerStarted","Data":"833620d4712fceb6fe4b527de6942b2aa56c74b92585f29224df07a603cfd5c5"} Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.077600 5030 generic.go:334] "Generic (PLEG): container finished" podID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerID="dbfcbbfdc94d07878acd4c063509f224a093b447b62ede0db139bb9700f6b58d" exitCode=0 Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.077664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerDied","Data":"dbfcbbfdc94d07878acd4c063509f224a093b447b62ede0db139bb9700f6b58d"} Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.092487 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" podStartSLOduration=134.092465798 podStartE2EDuration="2m14.092465798s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:38:00.085981273 +0000 UTC m=+152.406241561" watchObservedRunningTime="2026-01-20 22:38:00.092465798 +0000 UTC m=+152.412726086" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.128778 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.282887 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.337235 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:38:00 crc kubenswrapper[5030]: W0120 22:38:00.366733 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ee5767_5a99_4b6a_8a75_d7b98a0a89de.slice/crio-afb708369cb20dbc49d74c088e89696d9281f3050fce2cdd8e374fbb387fbf3a WatchSource:0}: Error finding container afb708369cb20dbc49d74c088e89696d9281f3050fce2cdd8e374fbb387fbf3a: Status 404 returned error can't find the container with id afb708369cb20dbc49d74c088e89696d9281f3050fce2cdd8e374fbb387fbf3a Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.398054 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:38:00 crc kubenswrapper[5030]: E0120 22:38:00.398252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b55be0c-55e3-4bf3-badd-618e8dc9f738" containerName="collect-profiles" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.398263 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b55be0c-55e3-4bf3-badd-618e8dc9f738" containerName="collect-profiles" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.398409 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b55be0c-55e3-4bf3-badd-618e8dc9f738" containerName="collect-profiles" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.399089 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.400979 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.408303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdz7b\" (UniqueName: \"kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b\") pod \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.409018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b55be0c-55e3-4bf3-badd-618e8dc9f738" (UID: "6b55be0c-55e3-4bf3-badd-618e8dc9f738"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.409081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume\") pod \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.409114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.411613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b" (OuterVolumeSpecName: "kube-api-access-hdz7b") pod "6b55be0c-55e3-4bf3-badd-618e8dc9f738" (UID: "6b55be0c-55e3-4bf3-badd-618e8dc9f738"). InnerVolumeSpecName "kube-api-access-hdz7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.412300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume\") pod \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\" (UID: \"6b55be0c-55e3-4bf3-badd-618e8dc9f738\") " Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.412983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.413024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4j4\" (UniqueName: \"kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.413083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.413125 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b55be0c-55e3-4bf3-badd-618e8dc9f738-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.413140 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdz7b\" (UniqueName: \"kubernetes.io/projected/6b55be0c-55e3-4bf3-badd-618e8dc9f738-kube-api-access-hdz7b\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.415855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b55be0c-55e3-4bf3-badd-618e8dc9f738" (UID: "6b55be0c-55e3-4bf3-badd-618e8dc9f738"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.513975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.514070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.514093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4j4\" (UniqueName: \"kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.514133 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b55be0c-55e3-4bf3-badd-618e8dc9f738-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.514820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.515400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.543426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4j4\" (UniqueName: \"kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4\") pod \"redhat-operators-7tbm4\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.739457 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.806704 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.807669 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.815258 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.818757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45wx6\" (UniqueName: \"kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.818796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.818815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.905018 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:38:00 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:38:00 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:38:00 crc kubenswrapper[5030]: healthz check failed Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.905356 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.923487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45wx6\" (UniqueName: \"kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.923538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.923560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.924076 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.924210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.953936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45wx6\" (UniqueName: \"kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6\") pod \"redhat-operators-6wq5c\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:00 crc kubenswrapper[5030]: I0120 22:38:00.964176 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.086307 5030 generic.go:334] "Generic (PLEG): container finished" podID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerID="931d6348aa6043c6b561d526af8588532eb166eea99587d16b6d05fd9901446c" exitCode=0 Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.086368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerDied","Data":"931d6348aa6043c6b561d526af8588532eb166eea99587d16b6d05fd9901446c"} Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.086391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerStarted","Data":"afb708369cb20dbc49d74c088e89696d9281f3050fce2cdd8e374fbb387fbf3a"} Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.088854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerStarted","Data":"32c39608e4852c93503294d7816417361593c357bd5be5ab3ce1aa3d8378aaef"} Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.091103 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerID="d08e722d71147729b414b3cfb0d7684dfce45c0e62477941ec089d787bce7014" exitCode=0 Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.091189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerDied","Data":"d08e722d71147729b414b3cfb0d7684dfce45c0e62477941ec089d787bce7014"} Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.093828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" event={"ID":"6b55be0c-55e3-4bf3-badd-618e8dc9f738","Type":"ContainerDied","Data":"b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d"} Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.093851 5030 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b2547c8f7fb8b02577817e1d16919041f419d8fe8b45b44721ab17c9042bac9d" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.093888 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.125473 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.463448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.543300 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.544130 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.546429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.546965 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.547497 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.637936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.638012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.723351 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.723399 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.724361 5030 patch_prober.go:28] interesting pod/console-f9d7485db-vkrcb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.724423 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vkrcb" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.739156 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.739230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.740170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.792963 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.793020 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.795203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.803071 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.894517 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.901451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.904313 5030 patch_prober.go:28] interesting pod/router-default-5444994796-vc527 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 22:38:01 crc kubenswrapper[5030]: [-]has-synced failed: reason withheld Jan 20 22:38:01 crc kubenswrapper[5030]: [+]process-running ok Jan 20 22:38:01 crc kubenswrapper[5030]: healthz check failed Jan 20 22:38:01 crc kubenswrapper[5030]: I0120 22:38:01.904350 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vc527" podUID="d26c6dbb-0421-40ef-b5b6-b9523bb2a5dc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.102958 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerID="b3e14b51d81f45a5f5e373a4e0e64d7015afb20a0c690ae7dfcb2e3b0b80c5d6" exitCode=0 Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.103319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerDied","Data":"b3e14b51d81f45a5f5e373a4e0e64d7015afb20a0c690ae7dfcb2e3b0b80c5d6"} Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.106752 5030 generic.go:334] "Generic (PLEG): container finished" podID="71d11e04-7835-4965-9c82-a9410fe5c407" containerID="3c285efab944034f935a8ec8b38fff5e1ecc62f5bd09e14cf1c65267f5e0c73c" exitCode=0 Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.107897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerDied","Data":"3c285efab944034f935a8ec8b38fff5e1ecc62f5bd09e14cf1c65267f5e0c73c"} Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.107917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerStarted","Data":"f5e8f43bd737934959e1b27f6c3e477680fd18231da47a0dfd3e0166ae16feb8"} Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.118306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5fgwp" Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.176321 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.488727 5030 patch_prober.go:28] interesting pod/downloads-7954f5f757-fmkdf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.488776 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-fmkdf" podUID="94de3b86-fede-47e1-8ecf-03f167572300" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.489658 5030 patch_prober.go:28] interesting pod/downloads-7954f5f757-fmkdf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.489687 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-fmkdf" podUID="94de3b86-fede-47e1-8ecf-03f167572300" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.31:8080/\": dial tcp 10.217.0.31:8080: connect: connection refused" Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.905521 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:38:02 crc kubenswrapper[5030]: I0120 22:38:02.914553 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vc527" Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.918053 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.919903 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.922418 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.922435 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.924275 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.999734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:05 crc kubenswrapper[5030]: I0120 22:38:05.999779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:06 crc kubenswrapper[5030]: I0120 22:38:06.100938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:06 crc kubenswrapper[5030]: I0120 22:38:06.100984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:06 crc kubenswrapper[5030]: I0120 22:38:06.101128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:06 crc kubenswrapper[5030]: I0120 22:38:06.121397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:06 crc kubenswrapper[5030]: I0120 22:38:06.250540 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:07 crc kubenswrapper[5030]: I0120 22:38:07.401673 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j5gw4" Jan 20 22:38:07 crc kubenswrapper[5030]: I0120 22:38:07.442086 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:38:09 crc kubenswrapper[5030]: I0120 22:38:09.255840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:38:09 crc kubenswrapper[5030]: I0120 22:38:09.261994 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b3c414b-bc29-41aa-a369-ecc2cc809691-metrics-certs\") pod \"network-metrics-daemon-f2dgj\" (UID: \"2b3c414b-bc29-41aa-a369-ecc2cc809691\") " pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:38:09 crc kubenswrapper[5030]: I0120 22:38:09.309959 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f2dgj" Jan 20 22:38:09 crc kubenswrapper[5030]: I0120 22:38:09.978177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 22:38:09 crc kubenswrapper[5030]: W0120 22:38:09.988334 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b64a23c_ab66_46ca_8121_bb25b0af63be.slice/crio-049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa WatchSource:0}: Error finding container 049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa: Status 404 returned error can't find the container with id 049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.022867 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f2dgj"] Jan 20 22:38:10 crc kubenswrapper[5030]: W0120 22:38:10.075085 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3c414b_bc29_41aa_a369_ecc2cc809691.slice/crio-780743737d1e413053aea947cd269d53e51ebc70f8df59489672837621a6c9be WatchSource:0}: Error finding container 780743737d1e413053aea947cd269d53e51ebc70f8df59489672837621a6c9be: Status 404 returned error can't find the container with id 780743737d1e413053aea947cd269d53e51ebc70f8df59489672837621a6c9be Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.162661 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.162717 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.223453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" event={"ID":"2b3c414b-bc29-41aa-a369-ecc2cc809691","Type":"ContainerStarted","Data":"780743737d1e413053aea947cd269d53e51ebc70f8df59489672837621a6c9be"} Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.230286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b64a23c-ab66-46ca-8121-bb25b0af63be","Type":"ContainerStarted","Data":"049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa"} Jan 20 22:38:10 crc kubenswrapper[5030]: I0120 22:38:10.232209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ae73b5c-09e8-461f-a26a-5db5955938bc","Type":"ContainerStarted","Data":"ce21e0f48ebbfe04415394b624e9ed6ab23e3f9626fad7449e32443460b70297"} Jan 20 22:38:11 crc kubenswrapper[5030]: I0120 22:38:11.244417 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b64a23c-ab66-46ca-8121-bb25b0af63be","Type":"ContainerStarted","Data":"1c0ff9330e52ab19c7a62696af94fe1e1088309fc8106a0371d96288f6bd27b1"} Jan 20 22:38:11 crc kubenswrapper[5030]: 
I0120 22:38:11.245938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ae73b5c-09e8-461f-a26a-5db5955938bc","Type":"ContainerStarted","Data":"ab14dcc1c7e4c77650d1261c653f735c18ff8e93145a401cf8e96082aed51f49"} Jan 20 22:38:11 crc kubenswrapper[5030]: I0120 22:38:11.750121 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:38:11 crc kubenswrapper[5030]: I0120 22:38:11.755310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:38:11 crc kubenswrapper[5030]: I0120 22:38:11.767826 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=10.767807437 podStartE2EDuration="10.767807437s" podCreationTimestamp="2026-01-20 22:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:38:11.261983159 +0000 UTC m=+163.582243467" watchObservedRunningTime="2026-01-20 22:38:11.767807437 +0000 UTC m=+164.088067725" Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.255310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" event={"ID":"2b3c414b-bc29-41aa-a369-ecc2cc809691","Type":"ContainerStarted","Data":"7fd4846b52d10ec29bdb3dd4c062422251e6ac297dfb9d04b5e4645fb5b30c41"} Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.255386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f2dgj" event={"ID":"2b3c414b-bc29-41aa-a369-ecc2cc809691","Type":"ContainerStarted","Data":"7224e3edd85f719947389ca81579e6178443410f3ad3ffdb1bb27e693fcb95ce"} Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.257956 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ae73b5c-09e8-461f-a26a-5db5955938bc" containerID="ab14dcc1c7e4c77650d1261c653f735c18ff8e93145a401cf8e96082aed51f49" exitCode=0 Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.258025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ae73b5c-09e8-461f-a26a-5db5955938bc","Type":"ContainerDied","Data":"ab14dcc1c7e4c77650d1261c653f735c18ff8e93145a401cf8e96082aed51f49"} Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.292525 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=7.292494785 podStartE2EDuration="7.292494785s" podCreationTimestamp="2026-01-20 22:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:38:12.289269289 +0000 UTC m=+164.609529587" watchObservedRunningTime="2026-01-20 22:38:12.292494785 +0000 UTC m=+164.612755103" Jan 20 22:38:12 crc kubenswrapper[5030]: I0120 22:38:12.504732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-fmkdf" Jan 20 22:38:13 crc kubenswrapper[5030]: I0120 22:38:13.265940 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b64a23c-ab66-46ca-8121-bb25b0af63be" containerID="1c0ff9330e52ab19c7a62696af94fe1e1088309fc8106a0371d96288f6bd27b1" exitCode=0 Jan 20 22:38:13 crc kubenswrapper[5030]: I0120 
22:38:13.266024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b64a23c-ab66-46ca-8121-bb25b0af63be","Type":"ContainerDied","Data":"1c0ff9330e52ab19c7a62696af94fe1e1088309fc8106a0371d96288f6bd27b1"} Jan 20 22:38:13 crc kubenswrapper[5030]: I0120 22:38:13.285557 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f2dgj" podStartSLOduration=147.285534428 podStartE2EDuration="2m27.285534428s" podCreationTimestamp="2026-01-20 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:38:13.280808956 +0000 UTC m=+165.601069254" watchObservedRunningTime="2026-01-20 22:38:13.285534428 +0000 UTC m=+165.605794716" Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.573920 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.765152 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access\") pod \"8ae73b5c-09e8-461f-a26a-5db5955938bc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.765250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir\") pod \"8ae73b5c-09e8-461f-a26a-5db5955938bc\" (UID: \"8ae73b5c-09e8-461f-a26a-5db5955938bc\") " Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.765558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8ae73b5c-09e8-461f-a26a-5db5955938bc" (UID: "8ae73b5c-09e8-461f-a26a-5db5955938bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.774049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8ae73b5c-09e8-461f-a26a-5db5955938bc" (UID: "8ae73b5c-09e8-461f-a26a-5db5955938bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.866836 5030 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8ae73b5c-09e8-461f-a26a-5db5955938bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:16 crc kubenswrapper[5030]: I0120 22:38:16.866878 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8ae73b5c-09e8-461f-a26a-5db5955938bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:17 crc kubenswrapper[5030]: I0120 22:38:17.321021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8ae73b5c-09e8-461f-a26a-5db5955938bc","Type":"ContainerDied","Data":"ce21e0f48ebbfe04415394b624e9ed6ab23e3f9626fad7449e32443460b70297"} Jan 20 22:38:17 crc kubenswrapper[5030]: I0120 22:38:17.321371 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce21e0f48ebbfe04415394b624e9ed6ab23e3f9626fad7449e32443460b70297" Jan 20 22:38:17 crc kubenswrapper[5030]: I0120 22:38:17.321091 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.238529 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.327644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b64a23c-ab66-46ca-8121-bb25b0af63be","Type":"ContainerDied","Data":"049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa"} Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.327683 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="049f2f9e89c3515f5d30a04459c199cb6418ba12eca50d70953de4b5d7b32daa" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.327709 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.386233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access\") pod \"0b64a23c-ab66-46ca-8121-bb25b0af63be\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.386316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir\") pod \"0b64a23c-ab66-46ca-8121-bb25b0af63be\" (UID: \"0b64a23c-ab66-46ca-8121-bb25b0af63be\") " Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.386511 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b64a23c-ab66-46ca-8121-bb25b0af63be" (UID: "0b64a23c-ab66-46ca-8121-bb25b0af63be"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.386706 5030 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b64a23c-ab66-46ca-8121-bb25b0af63be-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.419509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b64a23c-ab66-46ca-8121-bb25b0af63be" (UID: "0b64a23c-ab66-46ca-8121-bb25b0af63be"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.440000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:38:18 crc kubenswrapper[5030]: I0120 22:38:18.487847 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b64a23c-ab66-46ca-8121-bb25b0af63be-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:20 crc kubenswrapper[5030]: E0120 22:38:20.566032 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2855433442/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 22:38:20 crc kubenswrapper[5030]: E0120 22:38:20.566391 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nt87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-c8mnn_openshift-marketplace(8c46fc78-f3af-42cc-84e3-c0445d7e686f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\"/var/tmp/container_images_storage2855433442/2\": happened during read: context canceled" logger="UnhandledError" Jan 20 22:38:20 crc kubenswrapper[5030]: E0120 22:38:20.567608 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2855433442/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c8mnn" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" Jan 20 22:38:30 crc kubenswrapper[5030]: E0120 22:38:30.139788 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c8mnn" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" Jan 20 22:38:32 crc kubenswrapper[5030]: I0120 22:38:32.066600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-njtmh" Jan 20 22:38:35 crc kubenswrapper[5030]: I0120 22:38:35.198116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 22:38:36 crc kubenswrapper[5030]: E0120 22:38:36.407521 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2460180830/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 22:38:36 crc kubenswrapper[5030]: E0120 22:38:36.408091 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tbxlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-zfx2r_openshift-marketplace(77ee5767-5a99-4b6a-8a75-d7b98a0a89de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2460180830/2\": happened during read: context canceled" logger="UnhandledError" Jan 20 22:38:36 crc kubenswrapper[5030]: E0120 22:38:36.409463 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2460180830/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zfx2r" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" Jan 20 22:38:36 crc kubenswrapper[5030]: E0120 22:38:36.440217 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zfx2r" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.433167 5030 generic.go:334] "Generic (PLEG): container finished" podID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerID="91690598c7284e43663136e401d12ba93f02c9558eded244007c6867044ba873" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.433216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerDied","Data":"91690598c7284e43663136e401d12ba93f02c9558eded244007c6867044ba873"} Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.435611 5030 generic.go:334] "Generic (PLEG): container finished" podID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerID="7dbd8dce79e05fe61c3ef97dce8cbe4c309ce769b7c8b1a819081d8cdb617dd0" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.435753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerDied","Data":"7dbd8dce79e05fe61c3ef97dce8cbe4c309ce769b7c8b1a819081d8cdb617dd0"} Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.439876 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerID="c9d278b96c2833fdd4ec6b3c222d65cecabc1400a99e6a5da787cdadffbe62a9" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.439965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerDied","Data":"c9d278b96c2833fdd4ec6b3c222d65cecabc1400a99e6a5da787cdadffbe62a9"} Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.441979 5030 generic.go:334] "Generic (PLEG): container finished" podID="71d11e04-7835-4965-9c82-a9410fe5c407" containerID="cef180134d14d8b1c7ec51c3d027eadd29584558bb67b95dbc24576619009da0" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.442034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerDied","Data":"cef180134d14d8b1c7ec51c3d027eadd29584558bb67b95dbc24576619009da0"} Jan 20 22:38:37 crc 
kubenswrapper[5030]: I0120 22:38:37.448065 5030 generic.go:334] "Generic (PLEG): container finished" podID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerID="ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.448144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerDied","Data":"ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677"} Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.457733 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerID="d120a165c85084026947a99f6a56e4671c37d43f4f56c49bab41c61f86fbd431" exitCode=0 Jan 20 22:38:37 crc kubenswrapper[5030]: I0120 22:38:37.457783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerDied","Data":"d120a165c85084026947a99f6a56e4671c37d43f4f56c49bab41c61f86fbd431"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.465547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerStarted","Data":"4cf316f5567c345b42168b52831684431f471ea09a73932ccdef46c6fd07ed14"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.469783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerStarted","Data":"edc143a9d913cc44027a8f0ded7dec2d6779bf36a171eac676f547629ee2df5d"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.474979 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerStarted","Data":"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.477034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerStarted","Data":"0b95c987e2e6180ab540e0177ac73fa5f4e2df06f86aec5dd1c9139a310f5032"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.478708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerStarted","Data":"6620bba3c7be37539aca2e0e790a50a3c8c46358dba87bcb2a0c3ddcbb9a34a4"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.483209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerStarted","Data":"1e454adb39792cdb1c7075907142434f88e8bc3e65cf2cff45c85d0c327f59cf"} Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.516347 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7tbm4" podStartSLOduration=8.738956163 podStartE2EDuration="38.516329559s" podCreationTimestamp="2026-01-20 22:38:00 +0000 UTC" firstStartedPulling="2026-01-20 22:38:08.136102901 +0000 UTC m=+160.456363189" lastFinishedPulling="2026-01-20 22:38:37.913476307 +0000 UTC m=+190.233736585" 
observedRunningTime="2026-01-20 22:38:38.490543134 +0000 UTC m=+190.810803452" watchObservedRunningTime="2026-01-20 22:38:38.516329559 +0000 UTC m=+190.836589847" Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.517813 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6wq5c" podStartSLOduration=8.688379097 podStartE2EDuration="38.517803814s" podCreationTimestamp="2026-01-20 22:38:00 +0000 UTC" firstStartedPulling="2026-01-20 22:38:08.135643 +0000 UTC m=+160.455903288" lastFinishedPulling="2026-01-20 22:38:37.965067717 +0000 UTC m=+190.285328005" observedRunningTime="2026-01-20 22:38:38.514857544 +0000 UTC m=+190.835117832" watchObservedRunningTime="2026-01-20 22:38:38.517803814 +0000 UTC m=+190.838064102" Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.534442 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvfww" podStartSLOduration=3.722650481 podStartE2EDuration="41.53442585s" podCreationTimestamp="2026-01-20 22:37:57 +0000 UTC" firstStartedPulling="2026-01-20 22:38:00.080689217 +0000 UTC m=+152.400949505" lastFinishedPulling="2026-01-20 22:38:37.892464586 +0000 UTC m=+190.212724874" observedRunningTime="2026-01-20 22:38:38.532967726 +0000 UTC m=+190.853228034" watchObservedRunningTime="2026-01-20 22:38:38.53442585 +0000 UTC m=+190.854686138" Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.549115 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9p75" podStartSLOduration=3.635225976 podStartE2EDuration="41.54909903s" podCreationTimestamp="2026-01-20 22:37:57 +0000 UTC" firstStartedPulling="2026-01-20 22:38:00.071964779 +0000 UTC m=+152.392225067" lastFinishedPulling="2026-01-20 22:38:37.985837803 +0000 UTC m=+190.306098121" observedRunningTime="2026-01-20 22:38:38.546905717 +0000 UTC m=+190.867166005" watchObservedRunningTime="2026-01-20 22:38:38.54909903 +0000 UTC m=+190.869359318" Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.573375 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwqgg" podStartSLOduration=2.627167956 podStartE2EDuration="41.573359278s" podCreationTimestamp="2026-01-20 22:37:57 +0000 UTC" firstStartedPulling="2026-01-20 22:37:58.918128733 +0000 UTC m=+151.238389011" lastFinishedPulling="2026-01-20 22:38:37.864320035 +0000 UTC m=+190.184580333" observedRunningTime="2026-01-20 22:38:38.568541663 +0000 UTC m=+190.888801951" watchObservedRunningTime="2026-01-20 22:38:38.573359278 +0000 UTC m=+190.893619566" Jan 20 22:38:38 crc kubenswrapper[5030]: I0120 22:38:38.592775 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s7w6m" podStartSLOduration=2.5102373780000002 podStartE2EDuration="41.592761011s" podCreationTimestamp="2026-01-20 22:37:57 +0000 UTC" firstStartedPulling="2026-01-20 22:37:58.958422763 +0000 UTC m=+151.278683051" lastFinishedPulling="2026-01-20 22:38:38.040946386 +0000 UTC m=+190.361206684" observedRunningTime="2026-01-20 22:38:38.58896934 +0000 UTC m=+190.909229628" watchObservedRunningTime="2026-01-20 22:38:38.592761011 +0000 UTC m=+190.913021299" Jan 20 22:38:40 crc kubenswrapper[5030]: I0120 22:38:40.157698 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:38:40 crc kubenswrapper[5030]: I0120 22:38:40.157938 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:38:40 crc kubenswrapper[5030]: I0120 22:38:40.740246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:40 crc kubenswrapper[5030]: I0120 22:38:40.740680 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.125722 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.125804 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.314611 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 22:38:41 crc kubenswrapper[5030]: E0120 22:38:41.315297 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b64a23c-ab66-46ca-8121-bb25b0af63be" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.315310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b64a23c-ab66-46ca-8121-bb25b0af63be" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: E0120 22:38:41.315333 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae73b5c-09e8-461f-a26a-5db5955938bc" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.315339 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae73b5c-09e8-461f-a26a-5db5955938bc" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.315435 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae73b5c-09e8-461f-a26a-5db5955938bc" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.315448 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b64a23c-ab66-46ca-8121-bb25b0af63be" containerName="pruner" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.315954 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.318519 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.318787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.329340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.415739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.415811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.517469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.517547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.517674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.536787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.642576 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:41 crc kubenswrapper[5030]: I0120 22:38:41.819858 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tbm4" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="registry-server" probeResult="failure" output=< Jan 20 22:38:41 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:38:41 crc kubenswrapper[5030]: > Jan 20 22:38:42 crc kubenswrapper[5030]: I0120 22:38:42.043036 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 22:38:42 crc kubenswrapper[5030]: W0120 22:38:42.051300 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod810d56f1_ec9e_4803_8041_30a04d396c9a.slice/crio-bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7 WatchSource:0}: Error finding container bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7: Status 404 returned error can't find the container with id bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7 Jan 20 22:38:42 crc kubenswrapper[5030]: I0120 22:38:42.200608 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6wq5c" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="registry-server" probeResult="failure" output=< Jan 20 22:38:42 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:38:42 crc kubenswrapper[5030]: > Jan 20 22:38:42 crc kubenswrapper[5030]: I0120 22:38:42.508357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"810d56f1-ec9e-4803-8041-30a04d396c9a","Type":"ContainerStarted","Data":"bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7"} Jan 20 22:38:43 crc kubenswrapper[5030]: I0120 22:38:43.527712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"810d56f1-ec9e-4803-8041-30a04d396c9a","Type":"ContainerStarted","Data":"9fc46927f10018916f29ff1a99d9d22e221c9a0bb44d2e7e049c95d290b3608f"} Jan 20 22:38:44 crc kubenswrapper[5030]: I0120 22:38:44.541885 5030 generic.go:334] "Generic (PLEG): container finished" podID="810d56f1-ec9e-4803-8041-30a04d396c9a" containerID="9fc46927f10018916f29ff1a99d9d22e221c9a0bb44d2e7e049c95d290b3608f" exitCode=0 Jan 20 22:38:44 crc kubenswrapper[5030]: I0120 22:38:44.542741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"810d56f1-ec9e-4803-8041-30a04d396c9a","Type":"ContainerDied","Data":"9fc46927f10018916f29ff1a99d9d22e221c9a0bb44d2e7e049c95d290b3608f"} Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.816144 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.915871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 22:38:45 crc kubenswrapper[5030]: E0120 22:38:45.916205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810d56f1-ec9e-4803-8041-30a04d396c9a" containerName="pruner" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.916217 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="810d56f1-ec9e-4803-8041-30a04d396c9a" containerName="pruner" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.916362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="810d56f1-ec9e-4803-8041-30a04d396c9a" containerName="pruner" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.916811 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.918055 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.972567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir\") pod \"810d56f1-ec9e-4803-8041-30a04d396c9a\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.972660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access\") pod \"810d56f1-ec9e-4803-8041-30a04d396c9a\" (UID: \"810d56f1-ec9e-4803-8041-30a04d396c9a\") " Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.972726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "810d56f1-ec9e-4803-8041-30a04d396c9a" (UID: "810d56f1-ec9e-4803-8041-30a04d396c9a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.972901 5030 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/810d56f1-ec9e-4803-8041-30a04d396c9a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:45 crc kubenswrapper[5030]: I0120 22:38:45.977800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "810d56f1-ec9e-4803-8041-30a04d396c9a" (UID: "810d56f1-ec9e-4803-8041-30a04d396c9a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.074530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.074584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.074685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.074727 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/810d56f1-ec9e-4803-8041-30a04d396c9a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.176262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.176326 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.176354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.176396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.176495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.195905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access\") pod \"installer-9-crc\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.234350 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.552279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"810d56f1-ec9e-4803-8041-30a04d396c9a","Type":"ContainerDied","Data":"bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7"} Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.552661 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdfb486a386101bb3ea0a1902684873c682b6e6194f7ed18fc18ec037b65d2f7" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.552326 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 22:38:46 crc kubenswrapper[5030]: I0120 22:38:46.628706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 22:38:46 crc kubenswrapper[5030]: W0120 22:38:46.632852 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4ed92e66_4b4b_4a90_91c4_c8636c22b7fe.slice/crio-4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2 WatchSource:0}: Error finding container 4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2: Status 404 returned error can't find the container with id 4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2 Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.537716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.538052 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.557186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe","Type":"ContainerStarted","Data":"fe0f4dcefc703999f73a631b3a8bca0ac62e5f8028493ea839f812589f0dc17d"} Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.557235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe","Type":"ContainerStarted","Data":"4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2"} Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.575878 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.575856031 podStartE2EDuration="2.575856031s" podCreationTimestamp="2026-01-20 22:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:38:47.568293306 +0000 UTC m=+199.888553594" watchObservedRunningTime="2026-01-20 22:38:47.575856031 +0000 UTC m=+199.896116319" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.593759 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.632600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.733943 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.733995 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.771635 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.979661 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:47 crc kubenswrapper[5030]: I0120 22:38:47.979926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.025257 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.141243 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.141846 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.185568 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.601547 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.605000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:48 crc kubenswrapper[5030]: I0120 22:38:48.605116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:49 crc kubenswrapper[5030]: I0120 22:38:49.629493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:38:50 crc kubenswrapper[5030]: I0120 22:38:50.803781 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:50 crc kubenswrapper[5030]: I0120 22:38:50.872114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:38:51 crc kubenswrapper[5030]: I0120 22:38:51.173659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:51 crc kubenswrapper[5030]: I0120 22:38:51.223378 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:51 crc kubenswrapper[5030]: I0120 22:38:51.579756 5030 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-xvfww" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="registry-server" containerID="cri-o://1e454adb39792cdb1c7075907142434f88e8bc3e65cf2cff45c85d0c327f59cf" gracePeriod=2 Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.023328 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.023583 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9p75" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="registry-server" containerID="cri-o://0b95c987e2e6180ab540e0177ac73fa5f4e2df06f86aec5dd1c9139a310f5032" gracePeriod=2 Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.588722 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerID="0b95c987e2e6180ab540e0177ac73fa5f4e2df06f86aec5dd1c9139a310f5032" exitCode=0 Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.588791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerDied","Data":"0b95c987e2e6180ab540e0177ac73fa5f4e2df06f86aec5dd1c9139a310f5032"} Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.593420 5030 generic.go:334] "Generic (PLEG): container finished" podID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerID="1e454adb39792cdb1c7075907142434f88e8bc3e65cf2cff45c85d0c327f59cf" exitCode=0 Jan 20 22:38:52 crc kubenswrapper[5030]: I0120 22:38:52.593465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerDied","Data":"1e454adb39792cdb1c7075907142434f88e8bc3e65cf2cff45c85d0c327f59cf"} Jan 20 22:38:54 crc kubenswrapper[5030]: I0120 22:38:54.421150 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:54 crc kubenswrapper[5030]: I0120 22:38:54.421789 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6wq5c" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="registry-server" containerID="cri-o://edc143a9d913cc44027a8f0ded7dec2d6779bf36a171eac676f547629ee2df5d" gracePeriod=2 Jan 20 22:38:55 crc kubenswrapper[5030]: I0120 22:38:55.609517 5030 generic.go:334] "Generic (PLEG): container finished" podID="71d11e04-7835-4965-9c82-a9410fe5c407" containerID="edc143a9d913cc44027a8f0ded7dec2d6779bf36a171eac676f547629ee2df5d" exitCode=0 Jan 20 22:38:55 crc kubenswrapper[5030]: I0120 22:38:55.609561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerDied","Data":"edc143a9d913cc44027a8f0ded7dec2d6779bf36a171eac676f547629ee2df5d"} Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.623925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9p75" event={"ID":"9a85129d-c6c1-484b-b19b-d22ddfa78fd4","Type":"ContainerDied","Data":"2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd"} Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.624246 5030 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2f46132986be22c00f48799f9ea8669b1ca861b11d77c2d6c91321e3060780cd" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.627069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfww" event={"ID":"24278bf4-72d4-4d8e-b912-7ac98cf12b52","Type":"ContainerDied","Data":"9e3e2049531dfd9edacbcf426b2df86b53eff7c01b10ae805dd0b6219edbdef7"} Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.627096 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e3e2049531dfd9edacbcf426b2df86b53eff7c01b10ae805dd0b6219edbdef7" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.658671 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.669553 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.722145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities\") pod \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.722309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h96fm\" (UniqueName: \"kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm\") pod \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.722369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content\") pod \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.722399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content\") pod \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\" (UID: \"9a85129d-c6c1-484b-b19b-d22ddfa78fd4\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.722418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities\") pod \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.724529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities" (OuterVolumeSpecName: "utilities") pod "9a85129d-c6c1-484b-b19b-d22ddfa78fd4" (UID: "9a85129d-c6c1-484b-b19b-d22ddfa78fd4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.726041 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xnkf\" (UniqueName: \"kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf\") pod \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\" (UID: \"24278bf4-72d4-4d8e-b912-7ac98cf12b52\") " Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.726244 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.729302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities" (OuterVolumeSpecName: "utilities") pod "24278bf4-72d4-4d8e-b912-7ac98cf12b52" (UID: "24278bf4-72d4-4d8e-b912-7ac98cf12b52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.732657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm" (OuterVolumeSpecName: "kube-api-access-h96fm") pod "9a85129d-c6c1-484b-b19b-d22ddfa78fd4" (UID: "9a85129d-c6c1-484b-b19b-d22ddfa78fd4"). InnerVolumeSpecName "kube-api-access-h96fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.732752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf" (OuterVolumeSpecName: "kube-api-access-7xnkf") pod "24278bf4-72d4-4d8e-b912-7ac98cf12b52" (UID: "24278bf4-72d4-4d8e-b912-7ac98cf12b52"). InnerVolumeSpecName "kube-api-access-7xnkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.775453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24278bf4-72d4-4d8e-b912-7ac98cf12b52" (UID: "24278bf4-72d4-4d8e-b912-7ac98cf12b52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.777323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a85129d-c6c1-484b-b19b-d22ddfa78fd4" (UID: "9a85129d-c6c1-484b-b19b-d22ddfa78fd4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.827774 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h96fm\" (UniqueName: \"kubernetes.io/projected/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-kube-api-access-h96fm\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.827810 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.827819 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a85129d-c6c1-484b-b19b-d22ddfa78fd4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.827827 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24278bf4-72d4-4d8e-b912-7ac98cf12b52-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:56 crc kubenswrapper[5030]: I0120 22:38:56.827838 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xnkf\" (UniqueName: \"kubernetes.io/projected/24278bf4-72d4-4d8e-b912-7ac98cf12b52-kube-api-access-7xnkf\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.631531 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9p75" Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.631601 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfww" Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.658059 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.662923 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9p75"] Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.683221 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.687007 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvfww"] Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.968802 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" path="/var/lib/kubelet/pods/24278bf4-72d4-4d8e-b912-7ac98cf12b52/volumes" Jan 20 22:38:57 crc kubenswrapper[5030]: I0120 22:38:57.969943 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" path="/var/lib/kubelet/pods/9a85129d-c6c1-484b-b19b-d22ddfa78fd4/volumes" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.174376 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.244948 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities\") pod \"71d11e04-7835-4965-9c82-a9410fe5c407\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.245063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45wx6\" (UniqueName: \"kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6\") pod \"71d11e04-7835-4965-9c82-a9410fe5c407\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.245096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content\") pod \"71d11e04-7835-4965-9c82-a9410fe5c407\" (UID: \"71d11e04-7835-4965-9c82-a9410fe5c407\") " Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.247883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities" (OuterVolumeSpecName: "utilities") pod "71d11e04-7835-4965-9c82-a9410fe5c407" (UID: "71d11e04-7835-4965-9c82-a9410fe5c407"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.252695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6" (OuterVolumeSpecName: "kube-api-access-45wx6") pod "71d11e04-7835-4965-9c82-a9410fe5c407" (UID: "71d11e04-7835-4965-9c82-a9410fe5c407"). InnerVolumeSpecName "kube-api-access-45wx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.346071 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.346097 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45wx6\" (UniqueName: \"kubernetes.io/projected/71d11e04-7835-4965-9c82-a9410fe5c407-kube-api-access-45wx6\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.356185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71d11e04-7835-4965-9c82-a9410fe5c407" (UID: "71d11e04-7835-4965-9c82-a9410fe5c407"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.447238 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d11e04-7835-4965-9c82-a9410fe5c407-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.637553 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerID="5ffa93c9b1bf9d96c0abad00bef119f60a63dbb4330a44ad28fa8054e41d187d" exitCode=0 Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.637649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerDied","Data":"5ffa93c9b1bf9d96c0abad00bef119f60a63dbb4330a44ad28fa8054e41d187d"} Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.641273 5030 generic.go:334] "Generic (PLEG): container finished" podID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerID="f89473ff3bb99f3b71af965d05da1405f3566f95182a9abff563bec7de0674d6" exitCode=0 Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.641331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerDied","Data":"f89473ff3bb99f3b71af965d05da1405f3566f95182a9abff563bec7de0674d6"} Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.645061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wq5c" event={"ID":"71d11e04-7835-4965-9c82-a9410fe5c407","Type":"ContainerDied","Data":"f5e8f43bd737934959e1b27f6c3e477680fd18231da47a0dfd3e0166ae16feb8"} Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.645103 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wq5c" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.645111 5030 scope.go:117] "RemoveContainer" containerID="edc143a9d913cc44027a8f0ded7dec2d6779bf36a171eac676f547629ee2df5d" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.698096 5030 scope.go:117] "RemoveContainer" containerID="cef180134d14d8b1c7ec51c3d027eadd29584558bb67b95dbc24576619009da0" Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.706992 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.709466 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6wq5c"] Jan 20 22:38:58 crc kubenswrapper[5030]: I0120 22:38:58.746067 5030 scope.go:117] "RemoveContainer" containerID="3c285efab944034f935a8ec8b38fff5e1ecc62f5bd09e14cf1c65267f5e0c73c" Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.652929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerStarted","Data":"38cb67c56cd40097cba8bb25d6bfd73eb424fe60a9e63ba2bf242f7f0c4aa8f4"} Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.656122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerStarted","Data":"05f3ec33daec50202ff4031d84c69a81e494a42a3917b0acf158f669023f5605"} Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.673516 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c8mnn" podStartSLOduration=2.7186225459999998 podStartE2EDuration="1m0.673499405s" podCreationTimestamp="2026-01-20 22:37:59 +0000 UTC" firstStartedPulling="2026-01-20 22:38:01.092379805 +0000 UTC m=+153.412640083" lastFinishedPulling="2026-01-20 22:38:59.047256654 +0000 UTC m=+211.367516942" observedRunningTime="2026-01-20 22:38:59.672244436 +0000 UTC m=+211.992504734" watchObservedRunningTime="2026-01-20 22:38:59.673499405 +0000 UTC m=+211.993759703" Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.696189 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zfx2r" podStartSLOduration=2.7153993290000003 podStartE2EDuration="1m0.696169451s" podCreationTimestamp="2026-01-20 22:37:59 +0000 UTC" firstStartedPulling="2026-01-20 22:38:01.087808796 +0000 UTC m=+153.408069084" lastFinishedPulling="2026-01-20 22:38:59.068578918 +0000 UTC m=+211.388839206" observedRunningTime="2026-01-20 22:38:59.694247056 +0000 UTC m=+212.014507344" watchObservedRunningTime="2026-01-20 22:38:59.696169451 +0000 UTC m=+212.016429739" Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.714932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.714965 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:38:59 crc kubenswrapper[5030]: I0120 22:38:59.970958 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" path="/var/lib/kubelet/pods/71d11e04-7835-4965-9c82-a9410fe5c407/volumes" Jan 20 22:39:00 crc kubenswrapper[5030]: I0120 
22:39:00.129245 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:00 crc kubenswrapper[5030]: I0120 22:39:00.129597 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:00 crc kubenswrapper[5030]: I0120 22:39:00.762440 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-c8mnn" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="registry-server" probeResult="failure" output=< Jan 20 22:39:00 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:39:00 crc kubenswrapper[5030]: > Jan 20 22:39:01 crc kubenswrapper[5030]: I0120 22:39:01.173229 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-zfx2r" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="registry-server" probeResult="failure" output=< Jan 20 22:39:01 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:39:01 crc kubenswrapper[5030]: > Jan 20 22:39:09 crc kubenswrapper[5030]: I0120 22:39:09.781897 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:39:09 crc kubenswrapper[5030]: I0120 22:39:09.853141 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.157195 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.157757 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.157851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.158929 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.159110 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce" gracePeriod=600 Jan 20 22:39:10 crc kubenswrapper[5030]: I0120 22:39:10.199003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:10 crc 
kubenswrapper[5030]: I0120 22:39:10.249459 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:11 crc kubenswrapper[5030]: I0120 22:39:11.030325 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:39:11 crc kubenswrapper[5030]: I0120 22:39:11.730489 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zfx2r" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="registry-server" containerID="cri-o://05f3ec33daec50202ff4031d84c69a81e494a42a3917b0acf158f669023f5605" gracePeriod=2 Jan 20 22:39:12 crc kubenswrapper[5030]: I0120 22:39:12.106323 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ht5ll"] Jan 20 22:39:12 crc kubenswrapper[5030]: I0120 22:39:12.738905 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce" exitCode=0 Jan 20 22:39:12 crc kubenswrapper[5030]: I0120 22:39:12.739110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce"} Jan 20 22:39:12 crc kubenswrapper[5030]: I0120 22:39:12.741897 5030 generic.go:334] "Generic (PLEG): container finished" podID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerID="05f3ec33daec50202ff4031d84c69a81e494a42a3917b0acf158f669023f5605" exitCode=0 Jan 20 22:39:12 crc kubenswrapper[5030]: I0120 22:39:12.741938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerDied","Data":"05f3ec33daec50202ff4031d84c69a81e494a42a3917b0acf158f669023f5605"} Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.007843 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.128119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbxlt\" (UniqueName: \"kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt\") pod \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.128158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content\") pod \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.128197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities\") pod \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\" (UID: \"77ee5767-5a99-4b6a-8a75-d7b98a0a89de\") " Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.129674 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities" (OuterVolumeSpecName: "utilities") pod "77ee5767-5a99-4b6a-8a75-d7b98a0a89de" (UID: "77ee5767-5a99-4b6a-8a75-d7b98a0a89de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.135215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt" (OuterVolumeSpecName: "kube-api-access-tbxlt") pod "77ee5767-5a99-4b6a-8a75-d7b98a0a89de" (UID: "77ee5767-5a99-4b6a-8a75-d7b98a0a89de"). InnerVolumeSpecName "kube-api-access-tbxlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.156565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77ee5767-5a99-4b6a-8a75-d7b98a0a89de" (UID: "77ee5767-5a99-4b6a-8a75-d7b98a0a89de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.229261 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbxlt\" (UniqueName: \"kubernetes.io/projected/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-kube-api-access-tbxlt\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.229293 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.229303 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77ee5767-5a99-4b6a-8a75-d7b98a0a89de-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.748880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b"} Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.750821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfx2r" event={"ID":"77ee5767-5a99-4b6a-8a75-d7b98a0a89de","Type":"ContainerDied","Data":"afb708369cb20dbc49d74c088e89696d9281f3050fce2cdd8e374fbb387fbf3a"} Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.750850 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfx2r" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.750889 5030 scope.go:117] "RemoveContainer" containerID="05f3ec33daec50202ff4031d84c69a81e494a42a3917b0acf158f669023f5605" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.775222 5030 scope.go:117] "RemoveContainer" containerID="f89473ff3bb99f3b71af965d05da1405f3566f95182a9abff563bec7de0674d6" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.797703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.803864 5030 scope.go:117] "RemoveContainer" containerID="931d6348aa6043c6b561d526af8588532eb166eea99587d16b6d05fd9901446c" Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.809416 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfx2r"] Jan 20 22:39:13 crc kubenswrapper[5030]: I0120 22:39:13.972993 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" path="/var/lib/kubelet/pods/77ee5767-5a99-4b6a-8a75-d7b98a0a89de/volumes" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.558587 5030 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559243 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559263 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559285 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559327 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559347 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559362 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559387 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559403 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559428 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559446 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559473 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559490 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559513 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559532 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559557 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559574 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="extract-utilities" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559589 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559601 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559648 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559661 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="extract-content" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.559676 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d11e04-7835-4965-9c82-a9410fe5c407" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559895 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a85129d-c6c1-484b-b19b-d22ddfa78fd4" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559910 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ee5767-5a99-4b6a-8a75-d7b98a0a89de" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.559935 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24278bf4-72d4-4d8e-b912-7ac98cf12b52" containerName="registry-server" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.560483 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.562989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.563452 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5" gracePeriod=15 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.563700 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945" gracePeriod=15 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.563823 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49" gracePeriod=15 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.563915 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14" gracePeriod=15 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.564006 5030 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48" gracePeriod=15 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.571788 5030 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572208 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572238 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572262 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572278 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572329 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572341 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572361 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 22:39:24 crc kubenswrapper[5030]: E0120 22:39:24.572393 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572405 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572580 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572597 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572616 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 
22:39:24.572661 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.572679 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.592899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.593393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.697285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.697763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.697874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.697956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.697995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698528 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.698775 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.701203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.701295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.701299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.701388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.827508 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.828460 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945" exitCode=0 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.828500 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49" exitCode=0 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.828512 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14" exitCode=0 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.828521 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48" exitCode=2 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.830487 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" containerID="fe0f4dcefc703999f73a631b3a8bca0ac62e5f8028493ea839f812589f0dc17d" exitCode=0 Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.830538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe","Type":"ContainerDied","Data":"fe0f4dcefc703999f73a631b3a8bca0ac62e5f8028493ea839f812589f0dc17d"} Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.831639 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:24 crc kubenswrapper[5030]: I0120 22:39:24.832083 5030 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:25 crc kubenswrapper[5030]: I0120 22:39:25.127699 5030 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 22:39:25 crc kubenswrapper[5030]: I0120 22:39:25.127782 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.172940 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.174236 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.225654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir\") pod \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.225746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock\") pod \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.225783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" (UID: "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.225864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock" (OuterVolumeSpecName: "var-lock") pod "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" (UID: "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.225892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access\") pod \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\" (UID: \"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe\") " Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.226388 5030 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.226412 5030 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.233751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" (UID: "4ed92e66-4b4b-4a90-91c4-c8636c22b7fe"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.327703 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ed92e66-4b4b-4a90-91c4-c8636c22b7fe-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.845131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4ed92e66-4b4b-4a90-91c4-c8636c22b7fe","Type":"ContainerDied","Data":"4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2"} Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.845183 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b4b5f9d967752c3072caa28f59fc0285209e5ba54f24b779d78aa1bb01149b2" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.845239 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 22:39:26 crc kubenswrapper[5030]: I0120 22:39:26.857151 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.489392 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.491608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.492563 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.493432 5030 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546887 5030 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546912 5030 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.546930 5030 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.856081 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.857192 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5" exitCode=0 Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.857252 5030 scope.go:117] "RemoveContainer" containerID="14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.857362 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.875647 5030 scope.go:117] "RemoveContainer" containerID="58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.883339 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.883799 5030 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.892037 5030 scope.go:117] "RemoveContainer" containerID="50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.907754 5030 scope.go:117] "RemoveContainer" containerID="29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.921535 5030 scope.go:117] "RemoveContainer" containerID="86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.944743 5030 scope.go:117] "RemoveContainer" containerID="1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.965962 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.966345 5030 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.978384 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.980171 5030 scope.go:117] "RemoveContainer" containerID="14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 22:39:27.980860 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\": container with ID starting with 14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945 not found: ID does not exist" containerID="14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.980915 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945"} err="failed to get container status \"14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\": rpc error: code = NotFound desc = could not find container \"14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945\": container with ID starting with 14c7cec0e1c977e8abe2504f55272771a1ada6b82b8eb350bf47a7213d128945 not found: ID does not exist" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.980948 5030 scope.go:117] "RemoveContainer" containerID="58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 22:39:27.981439 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\": container with ID starting with 58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49 not found: ID does not exist" containerID="58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.981471 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49"} err="failed to get container status \"58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\": rpc error: code = NotFound desc = could not find container \"58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49\": container with ID starting with 58093c63f7a3cc6ea1e49b2986e75326e371a460fc7383ad917012d5b9466b49 not found: ID does not exist" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.981495 5030 scope.go:117] "RemoveContainer" containerID="50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 
22:39:27.981804 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\": container with ID starting with 50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14 not found: ID does not exist" containerID="50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.981853 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14"} err="failed to get container status \"50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\": rpc error: code = NotFound desc = could not find container \"50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14\": container with ID starting with 50d09a2d76897a149b435518dcac57342bb841fb6b1011ef80ed3d3d5caf6d14 not found: ID does not exist" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.981927 5030 scope.go:117] "RemoveContainer" containerID="29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 22:39:27.982341 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\": container with ID starting with 29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48 not found: ID does not exist" containerID="29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.982414 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48"} err="failed to get container status \"29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\": rpc error: code = NotFound desc = could not find container \"29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48\": container with ID starting with 29a863104f3a8d989d4b2dbaff77d25aa059e2e9cc28749c96ffb8a34fa7ef48 not found: ID does not exist" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.982455 5030 scope.go:117] "RemoveContainer" containerID="86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 22:39:27.982808 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\": container with ID starting with 86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5 not found: ID does not exist" containerID="86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.982833 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5"} err="failed to get container status \"86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\": rpc error: code = NotFound desc = could not find container \"86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5\": container with ID starting with 86ab66776b1caca84b134880dd5789cca869707fe8a7b9e874320178bf7d13f5 not found: ID does not exist" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.982849 5030 
scope.go:117] "RemoveContainer" containerID="1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251" Jan 20 22:39:27 crc kubenswrapper[5030]: E0120 22:39:27.983174 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\": container with ID starting with 1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251 not found: ID does not exist" containerID="1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251" Jan 20 22:39:27 crc kubenswrapper[5030]: I0120 22:39:27.983216 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251"} err="failed to get container status \"1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\": rpc error: code = NotFound desc = could not find container \"1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251\": container with ID starting with 1ef36d3c4c7e12ea3e729aac5c2297a49c02f573d633a51ffc61501d6aa3b251 not found: ID does not exist" Jan 20 22:39:29 crc kubenswrapper[5030]: E0120 22:39:29.047182 5030 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" volumeName="registry-storage" Jan 20 22:39:29 crc kubenswrapper[5030]: E0120 22:39:29.604700 5030 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:29 crc kubenswrapper[5030]: I0120 22:39:29.605391 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:29 crc kubenswrapper[5030]: W0120 22:39:29.629765 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-7d36f36dc8b4048091583c9c24a4ea4d86f65f056f780490a80a2b14891310ab WatchSource:0}: Error finding container 7d36f36dc8b4048091583c9c24a4ea4d86f65f056f780490a80a2b14891310ab: Status 404 returned error can't find the container with id 7d36f36dc8b4048091583c9c24a4ea4d86f65f056f780490a80a2b14891310ab Jan 20 22:39:29 crc kubenswrapper[5030]: E0120 22:39:29.633050 5030 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c91889fcd3934 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 22:39:29.632368948 +0000 UTC m=+241.952629276,LastTimestamp:2026-01-20 22:39:29.632368948 +0000 UTC m=+241.952629276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 22:39:29 crc kubenswrapper[5030]: I0120 22:39:29.878495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7d36f36dc8b4048091583c9c24a4ea4d86f65f056f780490a80a2b14891310ab"} Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.704131 5030 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.704905 5030 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.705525 5030 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.706191 5030 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.706594 5030 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:30 crc kubenswrapper[5030]: I0120 22:39:30.706684 5030 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.707114 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Jan 20 22:39:30 crc kubenswrapper[5030]: E0120 22:39:30.907767 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Jan 20 22:39:31 crc kubenswrapper[5030]: E0120 22:39:31.309133 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Jan 20 22:39:31 crc kubenswrapper[5030]: I0120 22:39:31.901224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86"} Jan 20 22:39:31 crc kubenswrapper[5030]: E0120 22:39:31.902534 5030 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:31 crc kubenswrapper[5030]: I0120 22:39:31.902796 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:32 crc kubenswrapper[5030]: E0120 22:39:32.110646 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Jan 20 22:39:32 crc kubenswrapper[5030]: E0120 22:39:32.908571 5030 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:39:33 crc kubenswrapper[5030]: E0120 22:39:33.711917 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Jan 20 22:39:34 crc kubenswrapper[5030]: E0120 22:39:34.036316 5030 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c91889fcd3934 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 22:39:29.632368948 +0000 UTC m=+241.952629276,LastTimestamp:2026-01-20 22:39:29.632368948 +0000 UTC m=+241.952629276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 22:39:35 crc kubenswrapper[5030]: I0120 22:39:35.962192 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:35 crc kubenswrapper[5030]: I0120 22:39:35.963536 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:35 crc kubenswrapper[5030]: I0120 22:39:35.989765 5030 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:35 crc kubenswrapper[5030]: I0120 22:39:35.989823 5030 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:35 crc kubenswrapper[5030]: E0120 22:39:35.990506 5030 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:35 crc kubenswrapper[5030]: I0120 22:39:35.991320 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:36 crc kubenswrapper[5030]: E0120 22:39:36.912436 5030 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="6.4s" Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.938265 5030 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1dc41d3bd67a47eca2f697fe3dc5818a83fa136d03b00c72e95df087266dcda4" exitCode=0 Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.938322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1dc41d3bd67a47eca2f697fe3dc5818a83fa136d03b00c72e95df087266dcda4"} Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.938361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7a174440013faf29dd8e14326ee9288a621a21895ab5b398df160340085da65e"} Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.938782 5030 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.938804 5030 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:36 crc kubenswrapper[5030]: E0120 22:39:36.939357 5030 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:36 crc kubenswrapper[5030]: I0120 22:39:36.939358 5030 status_manager.go:851] "Failed to get status for pod" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.136431 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerName="oauth-openshift" containerID="cri-o://8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804" gracePeriod=15 Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.605785 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685359 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sstz\" (UniqueName: \"kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.685869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs\") pod \"4e59bf88-6561-4b84-93a3-4aa09d919f56\" (UID: \"4e59bf88-6561-4b84-93a3-4aa09d919f56\") " Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.686186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.686199 5030 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.686405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.687229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.688282 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.691549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.691926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz" (OuterVolumeSpecName: "kube-api-access-7sstz") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "kube-api-access-7sstz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.692237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.692489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.692783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.693011 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.693236 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.693343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.693539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4e59bf88-6561-4b84-93a3-4aa09d919f56" (UID: "4e59bf88-6561-4b84-93a3-4aa09d919f56"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787187 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787214 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787223 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787234 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sstz\" (UniqueName: \"kubernetes.io/projected/4e59bf88-6561-4b84-93a3-4aa09d919f56-kube-api-access-7sstz\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787246 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787255 5030 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787264 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787274 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787283 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787291 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787300 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787308 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.787318 5030 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4e59bf88-6561-4b84-93a3-4aa09d919f56-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.945725 5030 generic.go:334] "Generic (PLEG): container finished" podID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerID="8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804" exitCode=0 Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.945805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" event={"ID":"4e59bf88-6561-4b84-93a3-4aa09d919f56","Type":"ContainerDied","Data":"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804"} Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.945815 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.945835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ht5ll" event={"ID":"4e59bf88-6561-4b84-93a3-4aa09d919f56","Type":"ContainerDied","Data":"12ca7f2ac882c8f918efc86d51bb39d5292472e250ffde8e6ff35f5f396f6a61"} Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.945854 5030 scope.go:117] "RemoveContainer" containerID="8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.949095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b937eca9934a28a8ffaf59693f8b85ab60e1c4f9c32b0c7e0c2b40598e65cefc"} Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.949119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"159be4637a7e22b1f4be6e50e79e3eaa7373d7ffc5171fdc8bbf86bc7ca5c173"} Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.949128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"29451c58340c84afcb28f4b7793c7e12fc0bfc6772841f5437c1e76f3cb4dc33"} Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.952871 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.952925 5030 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a" exitCode=1 Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.952954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a"} Jan 20 
22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.953384 5030 scope.go:117] "RemoveContainer" containerID="0342e9b17d12b02dd3f70541ad515c875ad7e479c213f047e1388e3ba705081a" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.976780 5030 scope.go:117] "RemoveContainer" containerID="8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804" Jan 20 22:39:37 crc kubenswrapper[5030]: E0120 22:39:37.977361 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804\": container with ID starting with 8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804 not found: ID does not exist" containerID="8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804" Jan 20 22:39:37 crc kubenswrapper[5030]: I0120 22:39:37.977419 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804"} err="failed to get container status \"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804\": rpc error: code = NotFound desc = could not find container \"8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804\": container with ID starting with 8c0f0031caa99f69f22b0194665293b3648ae9958f088ed52a621811539f1804 not found: ID does not exist" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.676079 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.962764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ca765800db495d4f14cf34065b6815959c845866709353080e8789bc2a1ef27a"} Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.962827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d546239705fec16f3874cea239f9522ad41af8b1261a60a6ea0cd3471aaa33c"} Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.963032 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.963185 5030 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.963221 5030 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.966559 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 22:39:38 crc kubenswrapper[5030]: I0120 22:39:38.966724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2b28522703beda7c6ed9af771b68873a4c4352f9e87aecba628a9b4d2043ad9"} Jan 20 22:39:40 crc kubenswrapper[5030]: I0120 22:39:40.991596 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:40 crc kubenswrapper[5030]: I0120 22:39:40.992114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:41 crc kubenswrapper[5030]: I0120 22:39:41.001716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:41 crc kubenswrapper[5030]: I0120 22:39:41.570562 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:39:44 crc kubenswrapper[5030]: I0120 22:39:44.033260 5030 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:44 crc kubenswrapper[5030]: I0120 22:39:44.129425 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="743b4ded-7c8c-4646-93a0-cbebfde44312" Jan 20 22:39:45 crc kubenswrapper[5030]: I0120 22:39:45.006602 5030 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:45 crc kubenswrapper[5030]: I0120 22:39:45.006683 5030 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:45 crc kubenswrapper[5030]: I0120 22:39:45.009725 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="743b4ded-7c8c-4646-93a0-cbebfde44312" Jan 20 22:39:45 crc kubenswrapper[5030]: I0120 22:39:45.012815 5030 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://29451c58340c84afcb28f4b7793c7e12fc0bfc6772841f5437c1e76f3cb4dc33" Jan 20 22:39:45 crc kubenswrapper[5030]: I0120 22:39:45.012859 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:39:46 crc kubenswrapper[5030]: I0120 22:39:46.016030 5030 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:46 crc kubenswrapper[5030]: I0120 22:39:46.016090 5030 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="4c2f1372-befa-4af3-89cb-869b61ef944e" Jan 20 22:39:46 crc kubenswrapper[5030]: I0120 22:39:46.020264 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="743b4ded-7c8c-4646-93a0-cbebfde44312" Jan 20 22:39:48 crc kubenswrapper[5030]: I0120 22:39:48.677049 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:39:48 crc kubenswrapper[5030]: I0120 22:39:48.692149 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:39:49 crc kubenswrapper[5030]: I0120 22:39:49.045044 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 22:39:54 crc kubenswrapper[5030]: I0120 22:39:54.606676 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 22:39:54 crc kubenswrapper[5030]: I0120 22:39:54.704331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 22:39:55 crc kubenswrapper[5030]: I0120 22:39:55.744436 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 22:39:56 crc kubenswrapper[5030]: I0120 22:39:56.222959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 22:39:56 crc kubenswrapper[5030]: I0120 22:39:56.823426 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.024611 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.126302 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.184298 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.320590 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.353379 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.521785 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.587380 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.694119 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.721399 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.781821 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.809090 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.847665 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.953082 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 22:39:57 crc kubenswrapper[5030]: I0120 22:39:57.997122 5030 reflector.go:368] Caches populated for 
*v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.170585 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.178482 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.323297 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.352940 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.531821 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.614251 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.634994 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.835483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 20 22:39:58 crc kubenswrapper[5030]: I0120 22:39:58.861760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.001310 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.039078 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.113263 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.225583 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.225723 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.228701 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.240765 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.266955 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.326093 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.388389 5030 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.448329 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.452234 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.482298 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.502248 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.588678 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.681751 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.777725 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.880355 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 22:39:59 crc kubenswrapper[5030]: I0120 22:39:59.955808 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.002377 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.028551 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.030594 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.033738 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.040828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.069924 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.117474 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.284527 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.406465 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.458457 5030 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.472911 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.542760 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.612698 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.633736 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.737066 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.786295 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.862445 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 22:40:00 crc kubenswrapper[5030]: I0120 22:40:00.867832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.050321 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.064505 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.128393 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.133352 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.138167 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.158922 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.159256 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.221189 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.230272 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.251477 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.273826 5030 reflector.go:368] Caches 
populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.277817 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-ht5ll"] Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.277879 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.284406 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.290702 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.305967 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.305937873 podStartE2EDuration="17.305937873s" podCreationTimestamp="2026-01-20 22:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:01.299228181 +0000 UTC m=+273.619488559" watchObservedRunningTime="2026-01-20 22:40:01.305937873 +0000 UTC m=+273.626198201" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.356275 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.356577 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.519247 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.638957 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.765399 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.890985 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.917976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 22:40:01 crc kubenswrapper[5030]: I0120 22:40:01.975616 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" path="/var/lib/kubelet/pods/4e59bf88-6561-4b84-93a3-4aa09d919f56/volumes" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.019359 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.036574 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.176131 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 22:40:02 crc 
kubenswrapper[5030]: I0120 22:40:02.233139 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.317608 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.320808 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.507695 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.558966 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.579296 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.792235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.821005 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.850065 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.955537 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.997832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 22:40:02 crc kubenswrapper[5030]: I0120 22:40:02.998229 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.027975 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.042760 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.056254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.091808 5030 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.109404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.117680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.132942 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.173597 5030 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.320336 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.439247 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.451119 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.456461 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.462357 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.472741 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.483276 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.509386 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.510495 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.589108 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.675321 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.859842 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.888648 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 22:40:03 crc kubenswrapper[5030]: I0120 22:40:03.992420 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.010999 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.036978 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.071136 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 22:40:04 crc 
kubenswrapper[5030]: I0120 22:40:04.268451 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.271455 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.309433 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.482352 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.489194 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.550271 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.564397 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.616276 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.683325 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.694794 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.730443 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.766018 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.889604 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 22:40:04 crc kubenswrapper[5030]: I0120 22:40:04.915475 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.015367 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.105327 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.138664 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.182007 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.251508 5030 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.306245 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5566b8fdb8-6mckw"] Jan 20 22:40:05 crc kubenswrapper[5030]: E0120 22:40:05.306489 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" containerName="installer" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.306510 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" containerName="installer" Jan 20 22:40:05 crc kubenswrapper[5030]: E0120 22:40:05.306537 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerName="oauth-openshift" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.306544 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerName="oauth-openshift" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.306686 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e59bf88-6561-4b84-93a3-4aa09d919f56" containerName="oauth-openshift" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.306705 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed92e66-4b4b-4a90-91c4-c8636c22b7fe" containerName="installer" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.307147 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.313291 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.313661 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.314905 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.315508 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.316145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.316395 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.316792 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.316963 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.317110 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.316400 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 22:40:05 
crc kubenswrapper[5030]: I0120 22:40:05.317496 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.317759 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.326871 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.329227 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330329 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-error\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-login\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-session\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc 
kubenswrapper[5030]: I0120 22:40:05.330870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wkzm\" (UniqueName: \"kubernetes.io/projected/cd2ba62e-6ae4-4fc7-abcb-8798103be233-kube-api-access-8wkzm\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-dir\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.330970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-service-ca\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.331005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.331084 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-policies\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.331162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.331211 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-router-certs\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " 
pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.332964 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.381970 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.408920 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-router-certs\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-error\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.432740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-login\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.433534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-session\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wkzm\" (UniqueName: \"kubernetes.io/projected/cd2ba62e-6ae4-4fc7-abcb-8798103be233-kube-api-access-8wkzm\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-dir\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434517 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-service-ca\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-dir\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-provider-selection\") 
pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-policies\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.434830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.435165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-service-ca\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.435428 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2ba62e-6ae4-4fc7-abcb-8798103be233-audit-policies\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.439826 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.439912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-router-certs\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.440725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.441326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.443470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-session\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.444650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.445182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-login\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.451389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-user-template-error\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.454013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2ba62e-6ae4-4fc7-abcb-8798103be233-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.454859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wkzm\" (UniqueName: \"kubernetes.io/projected/cd2ba62e-6ae4-4fc7-abcb-8798103be233-kube-api-access-8wkzm\") pod \"oauth-openshift-5566b8fdb8-6mckw\" (UID: \"cd2ba62e-6ae4-4fc7-abcb-8798103be233\") " pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.479087 5030 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.484109 5030 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.484500 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86" gracePeriod=5 Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.540760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.631111 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.685000 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.693409 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.711392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.722720 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.751425 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.780253 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.854578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.875185 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.956698 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 22:40:05 crc kubenswrapper[5030]: I0120 22:40:05.965454 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.063432 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.080474 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.118230 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.222131 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.222826 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.348510 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.452391 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.514036 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 22:40:06 crc kubenswrapper[5030]: 
I0120 22:40:06.546394 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.571785 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.673008 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.676780 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.696308 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.712554 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.775124 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.915846 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.959679 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.970354 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 22:40:06 crc kubenswrapper[5030]: I0120 22:40:06.998927 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.003226 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.005450 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.018737 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.023393 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.120866 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.164480 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.226589 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.269063 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.284719 5030 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.304473 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.395088 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.539076 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.541119 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.582727 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.659657 5030 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.664362 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.766492 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 22:40:07 crc kubenswrapper[5030]: I0120 22:40:07.828758 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.046945 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.091683 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.113306 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.320949 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.449820 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.461372 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.503407 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.581219 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.676586 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.687295 5030 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.714580 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.789873 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.869738 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 22:40:08 crc kubenswrapper[5030]: I0120 22:40:08.944731 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.277693 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.320704 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.521275 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.600407 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.698868 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 22:40:09 crc kubenswrapper[5030]: I0120 22:40:09.780233 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.097179 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.138758 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.201991 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.324991 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.343596 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.453206 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.463406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.611726 5030 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.760321 5030 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.843472 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.855566 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 22:40:10 crc kubenswrapper[5030]: I0120 22:40:10.882203 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.113594 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.126683 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.126817 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.200967 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.201078 5030 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86" exitCode=137 Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.201156 5030 scope.go:117] "RemoveContainer" containerID="16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.201198 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.208716 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.225458 5030 scope.go:117] "RemoveContainer" containerID="16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86" Jan 20 22:40:11 crc kubenswrapper[5030]: E0120 22:40:11.226248 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86\": container with ID starting with 16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86 not found: ID does not exist" containerID="16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.226310 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86"} err="failed to get container status \"16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86\": rpc error: code = NotFound desc = could not find container \"16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86\": container with ID starting with 16c48af9d855875740861dff6b5339c2b35efeabc9566d44bd55d9e1f7e73d86 not found: ID does not exist" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.312979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313061 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.313854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.314271 5030 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.314385 5030 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.326705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.416324 5030 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.416373 5030 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.416387 5030 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.438006 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.474994 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.962189 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 22:40:11 crc kubenswrapper[5030]: I0120 22:40:11.971871 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 20 22:40:12 crc kubenswrapper[5030]: I0120 22:40:12.269073 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 22:40:12 crc kubenswrapper[5030]: I0120 22:40:12.377422 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 22:40:12 crc kubenswrapper[5030]: I0120 22:40:12.396157 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 22:40:12 crc kubenswrapper[5030]: I0120 22:40:12.620463 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 22:40:12 crc kubenswrapper[5030]: I0120 22:40:12.893449 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 22:40:13 crc kubenswrapper[5030]: I0120 22:40:13.168766 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5566b8fdb8-6mckw"] Jan 20 22:40:13 crc kubenswrapper[5030]: I0120 22:40:13.677615 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5566b8fdb8-6mckw"] Jan 20 22:40:14 crc kubenswrapper[5030]: I0120 22:40:14.224154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" event={"ID":"cd2ba62e-6ae4-4fc7-abcb-8798103be233","Type":"ContainerStarted","Data":"81435576af584a76fd52e3d9b09a8f18b655de7ce7d7e3b8ccdb6ac2a04336d1"} Jan 20 22:40:14 crc kubenswrapper[5030]: I0120 22:40:14.224203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" 
event={"ID":"cd2ba62e-6ae4-4fc7-abcb-8798103be233","Type":"ContainerStarted","Data":"acf091079c81b6baab97845bf92c64299555fb85ffadbdaa9317ebf299960886"} Jan 20 22:40:14 crc kubenswrapper[5030]: I0120 22:40:14.224799 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:14 crc kubenswrapper[5030]: I0120 22:40:14.253151 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" podStartSLOduration=62.253133945 podStartE2EDuration="1m2.253133945s" podCreationTimestamp="2026-01-20 22:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:14.249959539 +0000 UTC m=+286.570219827" watchObservedRunningTime="2026-01-20 22:40:14.253133945 +0000 UTC m=+286.573394233" Jan 20 22:40:14 crc kubenswrapper[5030]: I0120 22:40:14.658584 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5566b8fdb8-6mckw" Jan 20 22:40:27 crc kubenswrapper[5030]: I0120 22:40:27.816354 5030 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 20 22:40:29 crc kubenswrapper[5030]: I0120 22:40:29.314782 5030 generic.go:334] "Generic (PLEG): container finished" podID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerID="27418cd1f9ab974fdd707fae32edb6029362def21560dbea701ae7e4bcc59398" exitCode=0 Jan 20 22:40:29 crc kubenswrapper[5030]: I0120 22:40:29.314890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerDied","Data":"27418cd1f9ab974fdd707fae32edb6029362def21560dbea701ae7e4bcc59398"} Jan 20 22:40:29 crc kubenswrapper[5030]: I0120 22:40:29.315808 5030 scope.go:117] "RemoveContainer" containerID="27418cd1f9ab974fdd707fae32edb6029362def21560dbea701ae7e4bcc59398" Jan 20 22:40:30 crc kubenswrapper[5030]: I0120 22:40:30.323014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerStarted","Data":"41a937dedc3a633f2c419058e6465e868d662c0d3e05e8ca8727936a572327d0"} Jan 20 22:40:30 crc kubenswrapper[5030]: I0120 22:40:30.323780 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:40:30 crc kubenswrapper[5030]: I0120 22:40:30.325177 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.236733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.237188 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" podUID="96c05319-8357-4042-ae09-77f0c85a73d9" containerName="controller-manager" containerID="cri-o://8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2" gracePeriod=30 Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.338064 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.338389 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerName="route-controller-manager" containerID="cri-o://a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f" gracePeriod=30 Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.810175 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.910858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles\") pod \"96c05319-8357-4042-ae09-77f0c85a73d9\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.910927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca\") pod \"96c05319-8357-4042-ae09-77f0c85a73d9\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.910996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config\") pod \"96c05319-8357-4042-ae09-77f0c85a73d9\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.911019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert\") pod \"96c05319-8357-4042-ae09-77f0c85a73d9\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.911763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "96c05319-8357-4042-ae09-77f0c85a73d9" (UID: "96c05319-8357-4042-ae09-77f0c85a73d9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.911895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "96c05319-8357-4042-ae09-77f0c85a73d9" (UID: "96c05319-8357-4042-ae09-77f0c85a73d9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.911975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config" (OuterVolumeSpecName: "config") pod "96c05319-8357-4042-ae09-77f0c85a73d9" (UID: "96c05319-8357-4042-ae09-77f0c85a73d9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.912082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvq9b\" (UniqueName: \"kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b\") pod \"96c05319-8357-4042-ae09-77f0c85a73d9\" (UID: \"96c05319-8357-4042-ae09-77f0c85a73d9\") " Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.912387 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.912418 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.912432 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96c05319-8357-4042-ae09-77f0c85a73d9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.916852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b" (OuterVolumeSpecName: "kube-api-access-wvq9b") pod "96c05319-8357-4042-ae09-77f0c85a73d9" (UID: "96c05319-8357-4042-ae09-77f0c85a73d9"). InnerVolumeSpecName "kube-api-access-wvq9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:40:33 crc kubenswrapper[5030]: I0120 22:40:33.916884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "96c05319-8357-4042-ae09-77f0c85a73d9" (UID: "96c05319-8357-4042-ae09-77f0c85a73d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.013779 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96c05319-8357-4042-ae09-77f0c85a73d9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.013818 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvq9b\" (UniqueName: \"kubernetes.io/projected/96c05319-8357-4042-ae09-77f0c85a73d9-kube-api-access-wvq9b\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.168653 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.316501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert\") pod \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.316566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config\") pod \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.316587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca\") pod \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.316602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mspd2\" (UniqueName: \"kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2\") pod \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\" (UID: \"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee\") " Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.317614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config" (OuterVolumeSpecName: "config") pod "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" (UID: "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.317611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" (UID: "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.320569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" (UID: "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.320569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2" (OuterVolumeSpecName: "kube-api-access-mspd2") pod "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" (UID: "2cc86f8a-314a-40e6-adbd-9c4bc5cffeee"). InnerVolumeSpecName "kube-api-access-mspd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343324 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:34 crc kubenswrapper[5030]: E0120 22:40:34.343677 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343698 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 22:40:34 crc kubenswrapper[5030]: E0120 22:40:34.343722 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c05319-8357-4042-ae09-77f0c85a73d9" containerName="controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343736 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c05319-8357-4042-ae09-77f0c85a73d9" containerName="controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: E0120 22:40:34.343759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerName="route-controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343773 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerName="route-controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343957 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.343979 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerName="route-controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.344005 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c05319-8357-4042-ae09-77f0c85a73d9" containerName="controller-manager" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.344545 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.348233 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" containerID="a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f" exitCode=0 Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.348342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" event={"ID":"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee","Type":"ContainerDied","Data":"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f"} Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.348380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" event={"ID":"2cc86f8a-314a-40e6-adbd-9c4bc5cffeee","Type":"ContainerDied","Data":"f3864d9bb33c0ece73c7b5113f79ed0d6936e6165f0884ab98a1791b0adebc66"} Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.348408 5030 scope.go:117] "RemoveContainer" containerID="a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.348554 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.353812 5030 generic.go:334] "Generic (PLEG): container finished" podID="96c05319-8357-4042-ae09-77f0c85a73d9" containerID="8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2" exitCode=0 Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.353884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" event={"ID":"96c05319-8357-4042-ae09-77f0c85a73d9","Type":"ContainerDied","Data":"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2"} Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.353916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" event={"ID":"96c05319-8357-4042-ae09-77f0c85a73d9","Type":"ContainerDied","Data":"8b3e883d14d221c4734d2edc7aea1c5efd877b9f4c908bbcd02245e0b3acd2b6"} Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.354001 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2zwp4" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.359509 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.400215 5030 scope.go:117] "RemoveContainer" containerID="a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f" Jan 20 22:40:34 crc kubenswrapper[5030]: E0120 22:40:34.400741 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f\": container with ID starting with a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f not found: ID does not exist" containerID="a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.400807 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f"} err="failed to get container status \"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f\": rpc error: code = NotFound desc = could not find container \"a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f\": container with ID starting with a6ba8b8159d9e3093de70ab675deb411e4cf9ea60b656b7b87d9ae27950dfb4f not found: ID does not exist" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.400846 5030 scope.go:117] "RemoveContainer" containerID="8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.417654 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.417697 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.417715 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.417732 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mspd2\" (UniqueName: \"kubernetes.io/projected/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee-kube-api-access-mspd2\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.417856 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.425902 5030 scope.go:117] "RemoveContainer" containerID="8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.425951 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bv2jh"] Jan 20 22:40:34 crc kubenswrapper[5030]: E0120 22:40:34.426312 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2\": container with ID starting with 8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2 not found: ID does not exist" containerID="8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.426352 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2"} err="failed to get container status \"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2\": rpc error: code = NotFound desc = could not find container \"8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2\": container with ID starting with 8f8bb888dd3cc2499ff1de94615359ba78f585fac6589de240b4d7a5942aafc2 not found: ID does not exist" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.430653 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.434286 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2zwp4"] Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.519209 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62hsr\" (UniqueName: \"kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.519305 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.519413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.519447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.519486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.620597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62hsr\" (UniqueName: \"kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.620692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.620734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.620764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.620802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.621987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " 
pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.622222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.623656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.629065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.650444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62hsr\" (UniqueName: \"kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr\") pod \"controller-manager-7cfdf7ff57-qbhmq\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:34 crc kubenswrapper[5030]: I0120 22:40:34.706772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.133016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.343651 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.344544 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.351164 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.351608 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.352176 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.352856 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.353188 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.354319 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.370476 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.379836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" event={"ID":"772a5d23-03a9-4083-8c63-39ae6be1c8d8","Type":"ContainerStarted","Data":"388bfc39dbe211d213d25404cb5153e768a80b36e48fefdb1bab8f7f3b535062"} Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.379887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" event={"ID":"772a5d23-03a9-4083-8c63-39ae6be1c8d8","Type":"ContainerStarted","Data":"bb1309c32bb029692b1816b8a8cc6bf8a4e17bf80ddd9e1019426e526c6e65cd"} Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.380178 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.381416 5030 patch_prober.go:28] interesting pod/controller-manager-7cfdf7ff57-qbhmq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.381461 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.419484 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" podStartSLOduration=2.419459218 podStartE2EDuration="2.419459218s" podCreationTimestamp="2026-01-20 22:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 22:40:35.414217581 +0000 UTC m=+307.734477879" watchObservedRunningTime="2026-01-20 22:40:35.419459218 +0000 UTC m=+307.739719516" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.534261 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.534386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.534412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lgb4\" (UniqueName: \"kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.534496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.635100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lgb4\" (UniqueName: \"kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.635136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.635188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.635213 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca\") pod 
\"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.636133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.636264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.650868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.659703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lgb4\" (UniqueName: \"kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4\") pod \"route-controller-manager-5d59c4485f-8xhqz\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.699803 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.871936 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:35 crc kubenswrapper[5030]: W0120 22:40:35.880783 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fe4dfc_c01a_4824_85b2_38f79894b9ae.slice/crio-eec001e3f45632f4bf49007864212e7583d12ca84fd6832f3f8577792eb19465 WatchSource:0}: Error finding container eec001e3f45632f4bf49007864212e7583d12ca84fd6832f3f8577792eb19465: Status 404 returned error can't find the container with id eec001e3f45632f4bf49007864212e7583d12ca84fd6832f3f8577792eb19465 Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.969567 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc86f8a-314a-40e6-adbd-9c4bc5cffeee" path="/var/lib/kubelet/pods/2cc86f8a-314a-40e6-adbd-9c4bc5cffeee/volumes" Jan 20 22:40:35 crc kubenswrapper[5030]: I0120 22:40:35.970475 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c05319-8357-4042-ae09-77f0c85a73d9" path="/var/lib/kubelet/pods/96c05319-8357-4042-ae09-77f0c85a73d9/volumes" Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.385594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" event={"ID":"25fe4dfc-c01a-4824-85b2-38f79894b9ae","Type":"ContainerStarted","Data":"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048"} Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.385664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" event={"ID":"25fe4dfc-c01a-4824-85b2-38f79894b9ae","Type":"ContainerStarted","Data":"eec001e3f45632f4bf49007864212e7583d12ca84fd6832f3f8577792eb19465"} Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.387268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.391864 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.409473 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" podStartSLOduration=3.409456026 podStartE2EDuration="3.409456026s" podCreationTimestamp="2026-01-20 22:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:36.406138276 +0000 UTC m=+308.726398574" watchObservedRunningTime="2026-01-20 22:40:36.409456026 +0000 UTC m=+308.729716324" Jan 20 22:40:36 crc kubenswrapper[5030]: I0120 22:40:36.594268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.193917 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 
22:40:39.194357 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerName="controller-manager" containerID="cri-o://388bfc39dbe211d213d25404cb5153e768a80b36e48fefdb1bab8f7f3b535062" gracePeriod=30 Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.224265 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.405127 5030 generic.go:334] "Generic (PLEG): container finished" podID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerID="388bfc39dbe211d213d25404cb5153e768a80b36e48fefdb1bab8f7f3b535062" exitCode=0 Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.405236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" event={"ID":"772a5d23-03a9-4083-8c63-39ae6be1c8d8","Type":"ContainerDied","Data":"388bfc39dbe211d213d25404cb5153e768a80b36e48fefdb1bab8f7f3b535062"} Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.405334 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" podUID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" containerName="route-controller-manager" containerID="cri-o://9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048" gracePeriod=30 Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.641404 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.765233 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.788045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca\") pod \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.788099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config\") pod \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.788127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles\") pod \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.788248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62hsr\" (UniqueName: \"kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr\") pod \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.788282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert\") pod \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\" (UID: \"772a5d23-03a9-4083-8c63-39ae6be1c8d8\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.789997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "772a5d23-03a9-4083-8c63-39ae6be1c8d8" (UID: "772a5d23-03a9-4083-8c63-39ae6be1c8d8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.790037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config" (OuterVolumeSpecName: "config") pod "772a5d23-03a9-4083-8c63-39ae6be1c8d8" (UID: "772a5d23-03a9-4083-8c63-39ae6be1c8d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.790058 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "772a5d23-03a9-4083-8c63-39ae6be1c8d8" (UID: "772a5d23-03a9-4083-8c63-39ae6be1c8d8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.794587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr" (OuterVolumeSpecName: "kube-api-access-62hsr") pod "772a5d23-03a9-4083-8c63-39ae6be1c8d8" (UID: "772a5d23-03a9-4083-8c63-39ae6be1c8d8"). 
InnerVolumeSpecName "kube-api-access-62hsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.795540 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "772a5d23-03a9-4083-8c63-39ae6be1c8d8" (UID: "772a5d23-03a9-4083-8c63-39ae6be1c8d8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.889424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert\") pod \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.889493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config\") pod \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.889561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lgb4\" (UniqueName: \"kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4\") pod \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.889696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca\") pod \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\" (UID: \"25fe4dfc-c01a-4824-85b2-38f79894b9ae\") " Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.889977 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890002 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890016 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/772a5d23-03a9-4083-8c63-39ae6be1c8d8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890032 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62hsr\" (UniqueName: \"kubernetes.io/projected/772a5d23-03a9-4083-8c63-39ae6be1c8d8-kube-api-access-62hsr\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890045 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/772a5d23-03a9-4083-8c63-39ae6be1c8d8-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890725 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "25fe4dfc-c01a-4824-85b2-38f79894b9ae" (UID: 
"25fe4dfc-c01a-4824-85b2-38f79894b9ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.890870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config" (OuterVolumeSpecName: "config") pod "25fe4dfc-c01a-4824-85b2-38f79894b9ae" (UID: "25fe4dfc-c01a-4824-85b2-38f79894b9ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.892231 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25fe4dfc-c01a-4824-85b2-38f79894b9ae" (UID: "25fe4dfc-c01a-4824-85b2-38f79894b9ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.893817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4" (OuterVolumeSpecName: "kube-api-access-8lgb4") pod "25fe4dfc-c01a-4824-85b2-38f79894b9ae" (UID: "25fe4dfc-c01a-4824-85b2-38f79894b9ae"). InnerVolumeSpecName "kube-api-access-8lgb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.991743 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fe4dfc-c01a-4824-85b2-38f79894b9ae-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.992063 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.992086 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lgb4\" (UniqueName: \"kubernetes.io/projected/25fe4dfc-c01a-4824-85b2-38f79894b9ae-kube-api-access-8lgb4\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:39 crc kubenswrapper[5030]: I0120 22:40:39.992108 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fe4dfc-c01a-4824-85b2-38f79894b9ae-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.344760 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:40 crc kubenswrapper[5030]: E0120 22:40:40.346186 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerName="controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.346241 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerName="controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: E0120 22:40:40.346290 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" containerName="route-controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.346307 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" containerName="route-controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: 
I0120 22:40:40.346542 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" containerName="controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.346570 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" containerName="route-controller-manager" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.347105 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.350523 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.351127 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.364173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.369589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.410079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" event={"ID":"772a5d23-03a9-4083-8c63-39ae6be1c8d8","Type":"ContainerDied","Data":"bb1309c32bb029692b1816b8a8cc6bf8a4e17bf80ddd9e1019426e526c6e65cd"} Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.410126 5030 scope.go:117] "RemoveContainer" containerID="388bfc39dbe211d213d25404cb5153e768a80b36e48fefdb1bab8f7f3b535062" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.410177 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.411683 5030 generic.go:334] "Generic (PLEG): container finished" podID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" containerID="9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048" exitCode=0 Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.411716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" event={"ID":"25fe4dfc-c01a-4824-85b2-38f79894b9ae","Type":"ContainerDied","Data":"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048"} Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.411730 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.411731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz" event={"ID":"25fe4dfc-c01a-4824-85b2-38f79894b9ae","Type":"ContainerDied","Data":"eec001e3f45632f4bf49007864212e7583d12ca84fd6832f3f8577792eb19465"} Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.425614 5030 scope.go:117] "RemoveContainer" containerID="9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.431197 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.438728 5030 scope.go:117] "RemoveContainer" containerID="9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.439350 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cfdf7ff57-qbhmq"] Jan 20 22:40:40 crc kubenswrapper[5030]: E0120 22:40:40.440806 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048\": container with ID starting with 9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048 not found: ID does not exist" containerID="9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.440851 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048"} err="failed to get container status \"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048\": rpc error: code = NotFound desc = could not find container \"9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048\": container with ID starting with 9f5297bd612608bce6ca1d4d08f76e299ef1c535efd07e4101594c6b26658048 not found: ID does not exist" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.444325 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.447144 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d59c4485f-8xhqz"] Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.498809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.498853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 
20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499020 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bxws\" (UniqueName: \"kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499209 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j5hx\" (UniqueName: \"kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.499399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc 
kubenswrapper[5030]: I0120 22:40:40.603193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bxws\" (UniqueName: \"kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j5hx\" (UniqueName: \"kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.603352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.604267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.604450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.604918 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.605069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.605671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.607044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.607813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.621934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bxws\" (UniqueName: \"kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws\") pod \"controller-manager-8646798d4-fjnvz\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.630469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j5hx\" (UniqueName: \"kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx\") pod \"route-controller-manager-85cd79558b-s8r67\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " 
pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.670251 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.678711 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:40 crc kubenswrapper[5030]: I0120 22:40:40.973449 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.130021 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:41 crc kubenswrapper[5030]: W0120 22:40:41.133467 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3b3772_69c9_4cce_b72e_079d10918a75.slice/crio-de478d984e86929bd36c9887171693232ba787ff9927a0ea36d1f402db4a244f WatchSource:0}: Error finding container de478d984e86929bd36c9887171693232ba787ff9927a0ea36d1f402db4a244f: Status 404 returned error can't find the container with id de478d984e86929bd36c9887171693232ba787ff9927a0ea36d1f402db4a244f Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.418795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" event={"ID":"0a3b3772-69c9-4cce-b72e-079d10918a75","Type":"ContainerStarted","Data":"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae"} Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.419101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" event={"ID":"0a3b3772-69c9-4cce-b72e-079d10918a75","Type":"ContainerStarted","Data":"de478d984e86929bd36c9887171693232ba787ff9927a0ea36d1f402db4a244f"} Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.420117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.422766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" event={"ID":"ab33731d-7028-49a8-922d-64d2f8a2ab22","Type":"ContainerStarted","Data":"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18"} Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.422817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" event={"ID":"ab33731d-7028-49a8-922d-64d2f8a2ab22","Type":"ContainerStarted","Data":"1150a3f400fadb47836ccfecbd9c9e2bf59b43720dbc175a22f5ff2f1e1f6735"} Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.423136 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.427721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.466074 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" podStartSLOduration=2.466055668 podStartE2EDuration="2.466055668s" podCreationTimestamp="2026-01-20 22:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:41.447361998 +0000 UTC m=+313.767622286" watchObservedRunningTime="2026-01-20 22:40:41.466055668 +0000 UTC m=+313.786315956" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.467916 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" podStartSLOduration=2.467906842 podStartE2EDuration="2.467906842s" podCreationTimestamp="2026-01-20 22:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:41.463465226 +0000 UTC m=+313.783725504" watchObservedRunningTime="2026-01-20 22:40:41.467906842 +0000 UTC m=+313.788167130" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.720843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.972247 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fe4dfc-c01a-4824-85b2-38f79894b9ae" path="/var/lib/kubelet/pods/25fe4dfc-c01a-4824-85b2-38f79894b9ae/volumes" Jan 20 22:40:41 crc kubenswrapper[5030]: I0120 22:40:41.973714 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772a5d23-03a9-4083-8c63-39ae6be1c8d8" path="/var/lib/kubelet/pods/772a5d23-03a9-4083-8c63-39ae6be1c8d8/volumes" Jan 20 22:40:53 crc kubenswrapper[5030]: I0120 22:40:53.268593 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:53 crc kubenswrapper[5030]: I0120 22:40:53.269486 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" podUID="0a3b3772-69c9-4cce-b72e-079d10918a75" containerName="route-controller-manager" containerID="cri-o://6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae" gracePeriod=30 Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.212929 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.350668 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx"] Jan 20 22:40:54 crc kubenswrapper[5030]: E0120 22:40:54.350900 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3b3772-69c9-4cce-b72e-079d10918a75" containerName="route-controller-manager" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.350918 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3b3772-69c9-4cce-b72e-079d10918a75" containerName="route-controller-manager" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.351065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3b3772-69c9-4cce-b72e-079d10918a75" containerName="route-controller-manager" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.351510 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.364124 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx"] Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.395877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca\") pod \"0a3b3772-69c9-4cce-b72e-079d10918a75\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.395972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config\") pod \"0a3b3772-69c9-4cce-b72e-079d10918a75\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.396025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert\") pod \"0a3b3772-69c9-4cce-b72e-079d10918a75\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.396142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j5hx\" (UniqueName: \"kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx\") pod \"0a3b3772-69c9-4cce-b72e-079d10918a75\" (UID: \"0a3b3772-69c9-4cce-b72e-079d10918a75\") " Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.397040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config" (OuterVolumeSpecName: "config") pod "0a3b3772-69c9-4cce-b72e-079d10918a75" (UID: "0a3b3772-69c9-4cce-b72e-079d10918a75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.397136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a3b3772-69c9-4cce-b72e-079d10918a75" (UID: "0a3b3772-69c9-4cce-b72e-079d10918a75"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.415413 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx" (OuterVolumeSpecName: "kube-api-access-8j5hx") pod "0a3b3772-69c9-4cce-b72e-079d10918a75" (UID: "0a3b3772-69c9-4cce-b72e-079d10918a75"). InnerVolumeSpecName "kube-api-access-8j5hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.415543 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a3b3772-69c9-4cce-b72e-079d10918a75" (UID: "0a3b3772-69c9-4cce-b72e-079d10918a75"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.497630 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-config\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.497888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmhd\" (UniqueName: \"kubernetes.io/projected/37d68c50-0572-4d84-88b9-a0240cb28055-kube-api-access-gvmhd\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.497983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d68c50-0572-4d84-88b9-a0240cb28055-serving-cert\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.498083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-client-ca\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.498184 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j5hx\" (UniqueName: \"kubernetes.io/projected/0a3b3772-69c9-4cce-b72e-079d10918a75-kube-api-access-8j5hx\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.498245 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.498311 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3b3772-69c9-4cce-b72e-079d10918a75-config\") on node \"crc\" DevicePath 
\"\"" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.498366 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3b3772-69c9-4cce-b72e-079d10918a75-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.506557 5030 generic.go:334] "Generic (PLEG): container finished" podID="0a3b3772-69c9-4cce-b72e-079d10918a75" containerID="6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae" exitCode=0 Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.506675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" event={"ID":"0a3b3772-69c9-4cce-b72e-079d10918a75","Type":"ContainerDied","Data":"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae"} Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.506744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" event={"ID":"0a3b3772-69c9-4cce-b72e-079d10918a75","Type":"ContainerDied","Data":"de478d984e86929bd36c9887171693232ba787ff9927a0ea36d1f402db4a244f"} Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.506764 5030 scope.go:117] "RemoveContainer" containerID="6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.506671 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.536315 5030 scope.go:117] "RemoveContainer" containerID="6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae" Jan 20 22:40:54 crc kubenswrapper[5030]: E0120 22:40:54.536855 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae\": container with ID starting with 6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae not found: ID does not exist" containerID="6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.536912 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae"} err="failed to get container status \"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae\": rpc error: code = NotFound desc = could not find container \"6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae\": container with ID starting with 6993670ded5951f913b78ff14ebbbc17805a05b589c268fe5b9dbe5d21367fae not found: ID does not exist" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.545262 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.551775 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85cd79558b-s8r67"] Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.602453 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-config\") pod 
\"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.603150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmhd\" (UniqueName: \"kubernetes.io/projected/37d68c50-0572-4d84-88b9-a0240cb28055-kube-api-access-gvmhd\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.603385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d68c50-0572-4d84-88b9-a0240cb28055-serving-cert\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.603648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-client-ca\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.605140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-client-ca\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.605170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37d68c50-0572-4d84-88b9-a0240cb28055-config\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.608752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37d68c50-0572-4d84-88b9-a0240cb28055-serving-cert\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.624522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmhd\" (UniqueName: \"kubernetes.io/projected/37d68c50-0572-4d84-88b9-a0240cb28055-kube-api-access-gvmhd\") pod \"route-controller-manager-6878449bd4-dw9cx\" (UID: \"37d68c50-0572-4d84-88b9-a0240cb28055\") " pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:54 crc kubenswrapper[5030]: I0120 22:40:54.677131 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:55 crc kubenswrapper[5030]: I0120 22:40:55.078860 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx"] Jan 20 22:40:55 crc kubenswrapper[5030]: W0120 22:40:55.087599 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d68c50_0572_4d84_88b9_a0240cb28055.slice/crio-b67b377116705b100213f1aa850c58d46367d2faa19c96f4671300f58f60386a WatchSource:0}: Error finding container b67b377116705b100213f1aa850c58d46367d2faa19c96f4671300f58f60386a: Status 404 returned error can't find the container with id b67b377116705b100213f1aa850c58d46367d2faa19c96f4671300f58f60386a Jan 20 22:40:55 crc kubenswrapper[5030]: I0120 22:40:55.515381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" event={"ID":"37d68c50-0572-4d84-88b9-a0240cb28055","Type":"ContainerStarted","Data":"6526529beb487dfbf07214572c94cc19431145587de04085b0091343924e71bd"} Jan 20 22:40:55 crc kubenswrapper[5030]: I0120 22:40:55.515466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" event={"ID":"37d68c50-0572-4d84-88b9-a0240cb28055","Type":"ContainerStarted","Data":"b67b377116705b100213f1aa850c58d46367d2faa19c96f4671300f58f60386a"} Jan 20 22:40:55 crc kubenswrapper[5030]: I0120 22:40:55.535430 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" podStartSLOduration=2.535406323 podStartE2EDuration="2.535406323s" podCreationTimestamp="2026-01-20 22:40:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:40:55.532279086 +0000 UTC m=+327.852539374" watchObservedRunningTime="2026-01-20 22:40:55.535406323 +0000 UTC m=+327.855666621" Jan 20 22:40:55 crc kubenswrapper[5030]: I0120 22:40:55.970408 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3b3772-69c9-4cce-b72e-079d10918a75" path="/var/lib/kubelet/pods/0a3b3772-69c9-4cce-b72e-079d10918a75/volumes" Jan 20 22:40:56 crc kubenswrapper[5030]: I0120 22:40:56.523294 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:40:56 crc kubenswrapper[5030]: I0120 22:40:56.531612 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.093872 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6pllv"] Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.095390 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.112681 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6pllv"] Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.239735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0ddc7f2-a163-4750-b8d3-6730f66295a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.239816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-tls\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240553 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-certificates\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0ddc7f2-a163-4750-b8d3-6730f66295a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-bound-sa-token\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5t4\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-kube-api-access-cj5t4\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.240760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-trusted-ca\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.269223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0ddc7f2-a163-4750-b8d3-6730f66295a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-bound-sa-token\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5t4\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-kube-api-access-cj5t4\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-trusted-ca\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0ddc7f2-a163-4750-b8d3-6730f66295a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-tls\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.342421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-certificates\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.343703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f0ddc7f2-a163-4750-b8d3-6730f66295a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.344514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-trusted-ca\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.344600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-certificates\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.352209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-registry-tls\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.352222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f0ddc7f2-a163-4750-b8d3-6730f66295a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.361372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5t4\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-kube-api-access-cj5t4\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.371244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0ddc7f2-a163-4750-b8d3-6730f66295a4-bound-sa-token\") pod \"image-registry-66df7c8f76-6pllv\" (UID: \"f0ddc7f2-a163-4750-b8d3-6730f66295a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.419413 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.874435 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-6pllv"] Jan 20 22:41:26 crc kubenswrapper[5030]: W0120 22:41:26.884061 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ddc7f2_a163_4750_b8d3_6730f66295a4.slice/crio-5e632e058d5a95e47144489a16c95f4006e5a1a4018cfb325fc790279f1bf4d1 WatchSource:0}: Error finding container 5e632e058d5a95e47144489a16c95f4006e5a1a4018cfb325fc790279f1bf4d1: Status 404 returned error can't find the container with id 5e632e058d5a95e47144489a16c95f4006e5a1a4018cfb325fc790279f1bf4d1 Jan 20 22:41:26 crc kubenswrapper[5030]: I0120 22:41:26.916958 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" event={"ID":"f0ddc7f2-a163-4750-b8d3-6730f66295a4","Type":"ContainerStarted","Data":"5e632e058d5a95e47144489a16c95f4006e5a1a4018cfb325fc790279f1bf4d1"} Jan 20 22:41:27 crc kubenswrapper[5030]: I0120 22:41:27.926117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" event={"ID":"f0ddc7f2-a163-4750-b8d3-6730f66295a4","Type":"ContainerStarted","Data":"c5b5c7061181a22ddc6e554fe795788d7cc8a867862797fdda131f1323047eea"} Jan 20 22:41:27 crc kubenswrapper[5030]: I0120 22:41:27.926491 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:27 crc kubenswrapper[5030]: I0120 22:41:27.960121 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" podStartSLOduration=1.960087697 podStartE2EDuration="1.960087697s" podCreationTimestamp="2026-01-20 22:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:41:27.953764361 +0000 UTC m=+360.274024689" watchObservedRunningTime="2026-01-20 22:41:27.960087697 +0000 UTC m=+360.280348015" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.277589 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.278509 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" podUID="ab33731d-7028-49a8-922d-64d2f8a2ab22" containerName="controller-manager" containerID="cri-o://7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18" gracePeriod=30 Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.658895 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.749520 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config\") pod \"ab33731d-7028-49a8-922d-64d2f8a2ab22\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.749561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca\") pod \"ab33731d-7028-49a8-922d-64d2f8a2ab22\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.749640 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert\") pod \"ab33731d-7028-49a8-922d-64d2f8a2ab22\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.749705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles\") pod \"ab33731d-7028-49a8-922d-64d2f8a2ab22\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.749726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bxws\" (UniqueName: \"kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws\") pod \"ab33731d-7028-49a8-922d-64d2f8a2ab22\" (UID: \"ab33731d-7028-49a8-922d-64d2f8a2ab22\") " Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.750337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab33731d-7028-49a8-922d-64d2f8a2ab22" (UID: "ab33731d-7028-49a8-922d-64d2f8a2ab22"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.750745 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ab33731d-7028-49a8-922d-64d2f8a2ab22" (UID: "ab33731d-7028-49a8-922d-64d2f8a2ab22"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.750755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config" (OuterVolumeSpecName: "config") pod "ab33731d-7028-49a8-922d-64d2f8a2ab22" (UID: "ab33731d-7028-49a8-922d-64d2f8a2ab22"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.757016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab33731d-7028-49a8-922d-64d2f8a2ab22" (UID: "ab33731d-7028-49a8-922d-64d2f8a2ab22"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.757434 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws" (OuterVolumeSpecName: "kube-api-access-7bxws") pod "ab33731d-7028-49a8-922d-64d2f8a2ab22" (UID: "ab33731d-7028-49a8-922d-64d2f8a2ab22"). InnerVolumeSpecName "kube-api-access-7bxws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.851815 5030 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.851876 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bxws\" (UniqueName: \"kubernetes.io/projected/ab33731d-7028-49a8-922d-64d2f8a2ab22-kube-api-access-7bxws\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.851898 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.851912 5030 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab33731d-7028-49a8-922d-64d2f8a2ab22-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.851927 5030 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab33731d-7028-49a8-922d-64d2f8a2ab22-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.967305 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab33731d-7028-49a8-922d-64d2f8a2ab22" containerID="7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18" exitCode=0 Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.967407 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.968709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" event={"ID":"ab33731d-7028-49a8-922d-64d2f8a2ab22","Type":"ContainerDied","Data":"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18"} Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.968772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8646798d4-fjnvz" event={"ID":"ab33731d-7028-49a8-922d-64d2f8a2ab22","Type":"ContainerDied","Data":"1150a3f400fadb47836ccfecbd9c9e2bf59b43720dbc175a22f5ff2f1e1f6735"} Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.968801 5030 scope.go:117] "RemoveContainer" containerID="7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.985842 5030 scope.go:117] "RemoveContainer" containerID="7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18" Jan 20 22:41:33 crc kubenswrapper[5030]: E0120 22:41:33.986947 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18\": container with ID starting with 7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18 not found: ID does not exist" containerID="7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.986987 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18"} err="failed to get container status \"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18\": rpc error: code = NotFound desc = could not find container \"7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18\": container with ID starting with 7f536454f25563b4b60e30e46c9831bafd7cf35a6662224c731a8e8cea720a18 not found: ID does not exist" Jan 20 22:41:33 crc kubenswrapper[5030]: I0120 22:41:33.999052 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.003584 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8646798d4-fjnvz"] Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.374830 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-857bb89c7-ss8tl"] Jan 20 22:41:34 crc kubenswrapper[5030]: E0120 22:41:34.375184 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab33731d-7028-49a8-922d-64d2f8a2ab22" containerName="controller-manager" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.375207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab33731d-7028-49a8-922d-64d2f8a2ab22" containerName="controller-manager" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.375384 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab33731d-7028-49a8-922d-64d2f8a2ab22" containerName="controller-manager" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.376076 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.379051 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.379552 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.379608 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.383092 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.383133 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.383768 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.393696 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.393801 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857bb89c7-ss8tl"] Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.562787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-proxy-ca-bundles\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.562997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584f872a-7d62-4771-977a-052bbfb51e3f-serving-cert\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.563058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfr9\" (UniqueName: \"kubernetes.io/projected/584f872a-7d62-4771-977a-052bbfb51e3f-kube-api-access-jbfr9\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.563107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-client-ca\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.563150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-config\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.664426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584f872a-7d62-4771-977a-052bbfb51e3f-serving-cert\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.664889 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfr9\" (UniqueName: \"kubernetes.io/projected/584f872a-7d62-4771-977a-052bbfb51e3f-kube-api-access-jbfr9\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.665117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-client-ca\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.665373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-config\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.665712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-proxy-ca-bundles\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.667106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-client-ca\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.667357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-config\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.667929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/584f872a-7d62-4771-977a-052bbfb51e3f-proxy-ca-bundles\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc 
kubenswrapper[5030]: I0120 22:41:34.672450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584f872a-7d62-4771-977a-052bbfb51e3f-serving-cert\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.685446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfr9\" (UniqueName: \"kubernetes.io/projected/584f872a-7d62-4771-977a-052bbfb51e3f-kube-api-access-jbfr9\") pod \"controller-manager-857bb89c7-ss8tl\" (UID: \"584f872a-7d62-4771-977a-052bbfb51e3f\") " pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:34 crc kubenswrapper[5030]: I0120 22:41:34.701561 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.175051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-857bb89c7-ss8tl"] Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.976094 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab33731d-7028-49a8-922d-64d2f8a2ab22" path="/var/lib/kubelet/pods/ab33731d-7028-49a8-922d-64d2f8a2ab22/volumes" Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.983884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" event={"ID":"584f872a-7d62-4771-977a-052bbfb51e3f","Type":"ContainerStarted","Data":"1853f1e4e969d3233f2bba32fb4fcf3ba3b105b0ba9cbfb4d305546b2b9eb071"} Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.984150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" event={"ID":"584f872a-7d62-4771-977a-052bbfb51e3f","Type":"ContainerStarted","Data":"2ca35f1dc1340f4b4ff79d8ede216d4d9afb93a0f104326cfd107b5202d96b3b"} Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.984776 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:35 crc kubenswrapper[5030]: I0120 22:41:35.994943 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" Jan 20 22:41:36 crc kubenswrapper[5030]: I0120 22:41:36.016098 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-857bb89c7-ss8tl" podStartSLOduration=3.016080157 podStartE2EDuration="3.016080157s" podCreationTimestamp="2026-01-20 22:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:41:36.014595911 +0000 UTC m=+368.334856229" watchObservedRunningTime="2026-01-20 22:41:36.016080157 +0000 UTC m=+368.336340465" Jan 20 22:41:40 crc kubenswrapper[5030]: I0120 22:41:40.157655 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:41:40 crc kubenswrapper[5030]: I0120 
22:41:40.158236 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.498238 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.499279 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s7w6m" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="registry-server" containerID="cri-o://6620bba3c7be37539aca2e0e790a50a3c8c46358dba87bcb2a0c3ddcbb9a34a4" gracePeriod=30 Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.511849 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.513061 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwqgg" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="registry-server" containerID="cri-o://3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559" gracePeriod=30 Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.522820 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.523158 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" containerID="cri-o://41a937dedc3a633f2c419058e6465e868d662c0d3e05e8ca8727936a572327d0" gracePeriod=30 Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.529241 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.529854 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c8mnn" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="registry-server" containerID="cri-o://38cb67c56cd40097cba8bb25d6bfd73eb424fe60a9e63ba2bf242f7f0c4aa8f4" gracePeriod=30 Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.546028 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.548516 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.548806 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.549723 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7tbm4" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="registry-server" containerID="cri-o://4cf316f5567c345b42168b52831684431f471ea09a73932ccdef46c6fd07ed14" gracePeriod=30 Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.567070 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.685909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.685953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jwz\" (UniqueName: \"kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.686151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.787137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.787206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9jwz\" (UniqueName: \"kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.787269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.788748 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.793474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.803558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9jwz\" (UniqueName: \"kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz\") pod \"marketplace-operator-79b997595-jpsk9\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:42 crc kubenswrapper[5030]: I0120 22:41:42.981161 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.014466 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.024831 5030 generic.go:334] "Generic (PLEG): container finished" podID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerID="41a937dedc3a633f2c419058e6465e868d662c0d3e05e8ca8727936a572327d0" exitCode=0 Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.024924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerDied","Data":"41a937dedc3a633f2c419058e6465e868d662c0d3e05e8ca8727936a572327d0"} Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.025001 5030 scope.go:117] "RemoveContainer" containerID="27418cd1f9ab974fdd707fae32edb6029362def21560dbea701ae7e4bcc59398" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.027286 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerID="4cf316f5567c345b42168b52831684431f471ea09a73932ccdef46c6fd07ed14" exitCode=0 Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.027334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerDied","Data":"4cf316f5567c345b42168b52831684431f471ea09a73932ccdef46c6fd07ed14"} Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.034005 5030 generic.go:334] "Generic (PLEG): container finished" podID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerID="3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559" exitCode=0 Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.034108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerDied","Data":"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559"} Jan 20 22:41:43 crc 
kubenswrapper[5030]: I0120 22:41:43.034137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwqgg" event={"ID":"1481ba2e-3e83-4628-aef7-53fd09d2a02d","Type":"ContainerDied","Data":"aa5445b5266c95f5eec22516cfff30bddfe56a2c63917ed982b05eb5d3b2390f"} Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.034203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwqgg" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.043908 5030 generic.go:334] "Generic (PLEG): container finished" podID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerID="6620bba3c7be37539aca2e0e790a50a3c8c46358dba87bcb2a0c3ddcbb9a34a4" exitCode=0 Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.043965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerDied","Data":"6620bba3c7be37539aca2e0e790a50a3c8c46358dba87bcb2a0c3ddcbb9a34a4"} Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.046566 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerID="38cb67c56cd40097cba8bb25d6bfd73eb424fe60a9e63ba2bf242f7f0c4aa8f4" exitCode=0 Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.046593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerDied","Data":"38cb67c56cd40097cba8bb25d6bfd73eb424fe60a9e63ba2bf242f7f0c4aa8f4"} Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.075808 5030 scope.go:117] "RemoveContainer" containerID="3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.090902 5030 scope.go:117] "RemoveContainer" containerID="ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.134584 5030 scope.go:117] "RemoveContainer" containerID="f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.180331 5030 scope.go:117] "RemoveContainer" containerID="3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559" Jan 20 22:41:43 crc kubenswrapper[5030]: E0120 22:41:43.181733 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559\": container with ID starting with 3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559 not found: ID does not exist" containerID="3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.181773 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559"} err="failed to get container status \"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559\": rpc error: code = NotFound desc = could not find container \"3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559\": container with ID starting with 3b1b2ed71adf81cd57d34fd6c476908ace762e89f483636d20c55e287530a559 not found: ID does not exist" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.181799 5030 scope.go:117] "RemoveContainer" 
containerID="ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677" Jan 20 22:41:43 crc kubenswrapper[5030]: E0120 22:41:43.182264 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677\": container with ID starting with ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677 not found: ID does not exist" containerID="ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.182303 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677"} err="failed to get container status \"ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677\": rpc error: code = NotFound desc = could not find container \"ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677\": container with ID starting with ef0e35a45c4a850deffd6495b626facbf06956fe92a6c9b71d2e3fcf23621677 not found: ID does not exist" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.182333 5030 scope.go:117] "RemoveContainer" containerID="f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb" Jan 20 22:41:43 crc kubenswrapper[5030]: E0120 22:41:43.183282 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb\": container with ID starting with f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb not found: ID does not exist" containerID="f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.183308 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb"} err="failed to get container status \"f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb\": rpc error: code = NotFound desc = could not find container \"f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb\": container with ID starting with f39e59040bd29fc1d368fb0afcd7c6a885f264ea7d4043e026320bcf04eb48eb not found: ID does not exist" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.193283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2cmd\" (UniqueName: \"kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd\") pod \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.193390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities\") pod \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.193421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content\") pod \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\" (UID: \"1481ba2e-3e83-4628-aef7-53fd09d2a02d\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.195814 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities" (OuterVolumeSpecName: "utilities") pod "1481ba2e-3e83-4628-aef7-53fd09d2a02d" (UID: "1481ba2e-3e83-4628-aef7-53fd09d2a02d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.197701 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd" (OuterVolumeSpecName: "kube-api-access-w2cmd") pod "1481ba2e-3e83-4628-aef7-53fd09d2a02d" (UID: "1481ba2e-3e83-4628-aef7-53fd09d2a02d"). InnerVolumeSpecName "kube-api-access-w2cmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.261040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1481ba2e-3e83-4628-aef7-53fd09d2a02d" (UID: "1481ba2e-3e83-4628-aef7-53fd09d2a02d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.274823 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.282883 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.294788 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2cmd\" (UniqueName: \"kubernetes.io/projected/1481ba2e-3e83-4628-aef7-53fd09d2a02d-kube-api-access-w2cmd\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.294814 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.294824 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1481ba2e-3e83-4628-aef7-53fd09d2a02d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.295866 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.296902 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.366409 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.369920 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwqgg"] Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content\") pod \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca\") pod \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb2gb\" (UniqueName: \"kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb\") pod \"25dd4196-71a9-4ac4-ba14-e29499a14b97\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics\") pod \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content\") pod \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f4j4\" (UniqueName: \"kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4\") pod \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht7kw\" (UniqueName: \"kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw\") pod \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\" (UID: \"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content\") pod \"25dd4196-71a9-4ac4-ba14-e29499a14b97\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nt87\" (UniqueName: 
\"kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87\") pod \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities\") pod \"25dd4196-71a9-4ac4-ba14-e29499a14b97\" (UID: \"25dd4196-71a9-4ac4-ba14-e29499a14b97\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities\") pod \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\" (UID: \"ca452d17-ecf5-4ea1-8580-cca89ddb5b70\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.396696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities\") pod \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\" (UID: \"8c46fc78-f3af-42cc-84e3-c0445d7e686f\") " Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.397696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities" (OuterVolumeSpecName: "utilities") pod "8c46fc78-f3af-42cc-84e3-c0445d7e686f" (UID: "8c46fc78-f3af-42cc-84e3-c0445d7e686f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.398552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" (UID: "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.400179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities" (OuterVolumeSpecName: "utilities") pod "ca452d17-ecf5-4ea1-8580-cca89ddb5b70" (UID: "ca452d17-ecf5-4ea1-8580-cca89ddb5b70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.400176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities" (OuterVolumeSpecName: "utilities") pod "25dd4196-71a9-4ac4-ba14-e29499a14b97" (UID: "25dd4196-71a9-4ac4-ba14-e29499a14b97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.400339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw" (OuterVolumeSpecName: "kube-api-access-ht7kw") pod "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" (UID: "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5"). InnerVolumeSpecName "kube-api-access-ht7kw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.401539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4" (OuterVolumeSpecName: "kube-api-access-4f4j4") pod "ca452d17-ecf5-4ea1-8580-cca89ddb5b70" (UID: "ca452d17-ecf5-4ea1-8580-cca89ddb5b70"). InnerVolumeSpecName "kube-api-access-4f4j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.403323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb" (OuterVolumeSpecName: "kube-api-access-rb2gb") pod "25dd4196-71a9-4ac4-ba14-e29499a14b97" (UID: "25dd4196-71a9-4ac4-ba14-e29499a14b97"). InnerVolumeSpecName "kube-api-access-rb2gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.407703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87" (OuterVolumeSpecName: "kube-api-access-6nt87") pod "8c46fc78-f3af-42cc-84e3-c0445d7e686f" (UID: "8c46fc78-f3af-42cc-84e3-c0445d7e686f"). InnerVolumeSpecName "kube-api-access-6nt87". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.411210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" (UID: "00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.427408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c46fc78-f3af-42cc-84e3-c0445d7e686f" (UID: "8c46fc78-f3af-42cc-84e3-c0445d7e686f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.455869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25dd4196-71a9-4ac4-ba14-e29499a14b97" (UID: "25dd4196-71a9-4ac4-ba14-e29499a14b97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499556 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499597 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb2gb\" (UniqueName: \"kubernetes.io/projected/25dd4196-71a9-4ac4-ba14-e29499a14b97-kube-api-access-rb2gb\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499610 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499651 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499663 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f4j4\" (UniqueName: \"kubernetes.io/projected/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-kube-api-access-4f4j4\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499674 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht7kw\" (UniqueName: \"kubernetes.io/projected/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5-kube-api-access-ht7kw\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499685 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499744 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nt87\" (UniqueName: \"kubernetes.io/projected/8c46fc78-f3af-42cc-84e3-c0445d7e686f-kube-api-access-6nt87\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499759 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499772 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25dd4196-71a9-4ac4-ba14-e29499a14b97-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.499783 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c46fc78-f3af-42cc-84e3-c0445d7e686f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.505560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca452d17-ecf5-4ea1-8580-cca89ddb5b70" (UID: "ca452d17-ecf5-4ea1-8580-cca89ddb5b70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.600666 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca452d17-ecf5-4ea1-8580-cca89ddb5b70-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.604287 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 20 22:41:43 crc kubenswrapper[5030]: I0120 22:41:43.969931 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" path="/var/lib/kubelet/pods/1481ba2e-3e83-4628-aef7-53fd09d2a02d/volumes" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.058779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c8mnn" event={"ID":"8c46fc78-f3af-42cc-84e3-c0445d7e686f","Type":"ContainerDied","Data":"833620d4712fceb6fe4b527de6942b2aa56c74b92585f29224df07a603cfd5c5"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.058835 5030 scope.go:117] "RemoveContainer" containerID="38cb67c56cd40097cba8bb25d6bfd73eb424fe60a9e63ba2bf242f7f0c4aa8f4" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.058949 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c8mnn" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.064774 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.065616 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tlnpr" event={"ID":"00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5","Type":"ContainerDied","Data":"666b9ea133f335c27ce3bb1fad9d6fcd7eb9ffcb1fa14a4271178d8e5a5d6c5d"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.072790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" event={"ID":"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d","Type":"ContainerStarted","Data":"c691bfa4049ecba097c1f9a152b736785f885cf82af4c678a6026f230bec332c"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.072854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" event={"ID":"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d","Type":"ContainerStarted","Data":"f7f45411bb5d77f082c3efa4826fab77eecf9bacc887a61f4500f0ca142588c5"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.073336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.076550 5030 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jpsk9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body= Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.076603 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.078359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tbm4" event={"ID":"ca452d17-ecf5-4ea1-8580-cca89ddb5b70","Type":"ContainerDied","Data":"32c39608e4852c93503294d7816417361593c357bd5be5ab3ce1aa3d8378aaef"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.078449 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tbm4" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.081134 5030 scope.go:117] "RemoveContainer" containerID="5ffa93c9b1bf9d96c0abad00bef119f60a63dbb4330a44ad28fa8054e41d187d" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.083145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s7w6m" event={"ID":"25dd4196-71a9-4ac4-ba14-e29499a14b97","Type":"ContainerDied","Data":"6864bfe41e58a5ed19564f16df60d76f2c32ff3a2c2741e9053528143b5a4226"} Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.083228 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s7w6m" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.099094 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.101861 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tlnpr"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.102774 5030 scope.go:117] "RemoveContainer" containerID="d08e722d71147729b414b3cfb0d7684dfce45c0e62477941ec089d787bce7014" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.112167 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" podStartSLOduration=2.112138883 podStartE2EDuration="2.112138883s" podCreationTimestamp="2026-01-20 22:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:41:44.111326233 +0000 UTC m=+376.431586521" watchObservedRunningTime="2026-01-20 22:41:44.112138883 +0000 UTC m=+376.432399211" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.131480 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.135904 5030 scope.go:117] "RemoveContainer" containerID="41a937dedc3a633f2c419058e6465e868d662c0d3e05e8ca8727936a572327d0" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.136104 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c8mnn"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.147247 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.151777 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7tbm4"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.163097 5030 scope.go:117] "RemoveContainer" containerID="4cf316f5567c345b42168b52831684431f471ea09a73932ccdef46c6fd07ed14" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.166568 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.173120 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s7w6m"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.177529 5030 scope.go:117] "RemoveContainer" containerID="c9d278b96c2833fdd4ec6b3c222d65cecabc1400a99e6a5da787cdadffbe62a9" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.191898 5030 scope.go:117] "RemoveContainer" containerID="b3e14b51d81f45a5f5e373a4e0e64d7015afb20a0c690ae7dfcb2e3b0b80c5d6" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.209875 5030 scope.go:117] "RemoveContainer" containerID="6620bba3c7be37539aca2e0e790a50a3c8c46358dba87bcb2a0c3ddcbb9a34a4" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.226046 5030 scope.go:117] "RemoveContainer" containerID="91690598c7284e43663136e401d12ba93f02c9558eded244007c6867044ba873" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.239813 5030 scope.go:117] "RemoveContainer" containerID="8cba325132615f82304d2bc68c63750a26d4562ee29af5331425c794267485bf" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.512201 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.512879 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.512911 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.512944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.512963 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.512981 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.512999 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513013 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513026 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513046 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513058 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513079 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 
22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513094 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513108 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513121 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513136 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513149 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513163 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513210 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513231 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513245 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513270 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513283 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="extract-utilities" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513316 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="extract-content" Jan 20 22:41:44 crc kubenswrapper[5030]: E0120 22:41:44.513337 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513350 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513514 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513542 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" containerName="marketplace-operator" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513556 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513573 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1481ba2e-3e83-4628-aef7-53fd09d2a02d" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513593 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.513612 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" containerName="registry-server" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.515152 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.517864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.531239 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.618166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.618242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.618273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhwf\" (UniqueName: \"kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.714939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.716344 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.721090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.721198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.721273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhwf\" (UniqueName: \"kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.722015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.722027 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.722342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.723147 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.745945 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhwf\" (UniqueName: \"kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf\") pod \"community-operators-v64zq\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.822467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.822524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g75\" (UniqueName: \"kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75\") pod \"certified-operators-9nnfj\" (UID: 
\"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.822584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.846365 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.926759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.926834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g75\" (UniqueName: \"kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.926872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.928541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.928584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:44 crc kubenswrapper[5030]: I0120 22:41:44.952345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g75\" (UniqueName: \"kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75\") pod \"certified-operators-9nnfj\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.046170 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.097557 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.240475 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.433478 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 20 22:41:45 crc kubenswrapper[5030]: W0120 22:41:45.467551 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf880cab6_348f_406b_ab12_6053781eedce.slice/crio-7ff7a9f196b9838ecdf9c755d296121cdcb3f52ec287bea7b9330b11549e9a34 WatchSource:0}: Error finding container 7ff7a9f196b9838ecdf9c755d296121cdcb3f52ec287bea7b9330b11549e9a34: Status 404 returned error can't find the container with id 7ff7a9f196b9838ecdf9c755d296121cdcb3f52ec287bea7b9330b11549e9a34 Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.968292 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5" path="/var/lib/kubelet/pods/00d754cf-ca0b-4dd4-a322-a5ad7ce30dd5/volumes" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.969163 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25dd4196-71a9-4ac4-ba14-e29499a14b97" path="/var/lib/kubelet/pods/25dd4196-71a9-4ac4-ba14-e29499a14b97/volumes" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.969922 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c46fc78-f3af-42cc-84e3-c0445d7e686f" path="/var/lib/kubelet/pods/8c46fc78-f3af-42cc-84e3-c0445d7e686f/volumes" Jan 20 22:41:45 crc kubenswrapper[5030]: I0120 22:41:45.971636 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca452d17-ecf5-4ea1-8580-cca89ddb5b70" path="/var/lib/kubelet/pods/ca452d17-ecf5-4ea1-8580-cca89ddb5b70/volumes" Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.099274 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerID="c07b354577bcaf637fbf1161e625ed80ce96d282b5f21bf9505c00e63a9715f9" exitCode=0 Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.099324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerDied","Data":"c07b354577bcaf637fbf1161e625ed80ce96d282b5f21bf9505c00e63a9715f9"} Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.099383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerStarted","Data":"f52821f4988808b57e317dfa4f15b4b621ecd4229ff90f5ecc4b812c9089f14e"} Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.104642 5030 generic.go:334] "Generic (PLEG): container finished" podID="f880cab6-348f-406b-ab12-6053781eedce" containerID="10a97929e918afbe30046bbbda875def4a77352e6bc72b87106bea65e5c5f099" exitCode=0 Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.104744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" 
event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerDied","Data":"10a97929e918afbe30046bbbda875def4a77352e6bc72b87106bea65e5c5f099"} Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.104778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerStarted","Data":"7ff7a9f196b9838ecdf9c755d296121cdcb3f52ec287bea7b9330b11549e9a34"} Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.424875 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-6pllv" Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.474833 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.906068 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.907322 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.909615 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 22:41:46 crc kubenswrapper[5030]: I0120 22:41:46.929034 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.055706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.055799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrt8\" (UniqueName: \"kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.055819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.113673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.114815 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.117824 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.118171 5030 generic.go:334] "Generic (PLEG): container finished" podID="f880cab6-348f-406b-ab12-6053781eedce" containerID="c96c7e6c2c1f1b1e7b52997f64002751e0b87c85c79c229f2d0c80bd328f6e52" exitCode=0 Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.118211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerDied","Data":"c96c7e6c2c1f1b1e7b52997f64002751e0b87c85c79c229f2d0c80bd328f6e52"} Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.120937 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.156792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrt8\" (UniqueName: \"kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.156836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.156958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.157367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.158161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.180549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrt8\" (UniqueName: \"kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8\") pod \"redhat-marketplace-fv846\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.225115 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.261584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.261713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.261873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znl2x\" (UniqueName: \"kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.363430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.363512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znl2x\" (UniqueName: \"kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.363543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.364014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.364286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities\") pod \"redhat-operators-kf5kh\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.382510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znl2x\" (UniqueName: \"kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x\") pod \"redhat-operators-kf5kh\" (UID: 
\"3f09a830-0564-41d1-8e00-3232119becba\") " pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.477957 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.630130 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 20 22:41:47 crc kubenswrapper[5030]: W0120 22:41:47.635165 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58bea95b_6632_463b_8ff9_24a85f213dde.slice/crio-50d80229f9d6786d308f83d20e80bc964cb49f4d6cf800bce640a2f53c804bc8 WatchSource:0}: Error finding container 50d80229f9d6786d308f83d20e80bc964cb49f4d6cf800bce640a2f53c804bc8: Status 404 returned error can't find the container with id 50d80229f9d6786d308f83d20e80bc964cb49f4d6cf800bce640a2f53c804bc8 Jan 20 22:41:47 crc kubenswrapper[5030]: I0120 22:41:47.880843 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 20 22:41:47 crc kubenswrapper[5030]: W0120 22:41:47.906302 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f09a830_0564_41d1_8e00_3232119becba.slice/crio-e54fc46728846cc0f96a4a3d9eaf746b98c740e210bd0b416375a75b5230bf8d WatchSource:0}: Error finding container e54fc46728846cc0f96a4a3d9eaf746b98c740e210bd0b416375a75b5230bf8d: Status 404 returned error can't find the container with id e54fc46728846cc0f96a4a3d9eaf746b98c740e210bd0b416375a75b5230bf8d Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.125025 5030 generic.go:334] "Generic (PLEG): container finished" podID="58bea95b-6632-463b-8ff9-24a85f213dde" containerID="73be9cf8f1339b142151c1171c75060eb7944d2f49dffe59b4dc40eb865bd793" exitCode=0 Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.125104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerDied","Data":"73be9cf8f1339b142151c1171c75060eb7944d2f49dffe59b4dc40eb865bd793"} Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.125129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerStarted","Data":"50d80229f9d6786d308f83d20e80bc964cb49f4d6cf800bce640a2f53c804bc8"} Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.126455 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerID="94dba902fb8a6721913fefcf4a0d0b3582724872bb3352951c23449e34a024cf" exitCode=0 Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.126693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerDied","Data":"94dba902fb8a6721913fefcf4a0d0b3582724872bb3352951c23449e34a024cf"} Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 22:41:48.130660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerStarted","Data":"0b1c1e87cf452f7c0e8d907d8bd7c6982af26ab5a0ea69f838a99868391098fb"} Jan 20 22:41:48 crc kubenswrapper[5030]: I0120 
22:41:48.132419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerStarted","Data":"e54fc46728846cc0f96a4a3d9eaf746b98c740e210bd0b416375a75b5230bf8d"} Jan 20 22:41:49 crc kubenswrapper[5030]: I0120 22:41:49.139265 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f09a830-0564-41d1-8e00-3232119becba" containerID="5309b03a7907a030a6b8b6e3458914eb744ba927bf5c048ac173cbcc6279d89f" exitCode=0 Jan 20 22:41:49 crc kubenswrapper[5030]: I0120 22:41:49.139716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerDied","Data":"5309b03a7907a030a6b8b6e3458914eb744ba927bf5c048ac173cbcc6279d89f"} Jan 20 22:41:49 crc kubenswrapper[5030]: I0120 22:41:49.150722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerStarted","Data":"d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708"} Jan 20 22:41:49 crc kubenswrapper[5030]: I0120 22:41:49.163747 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9nnfj" podStartSLOduration=3.670581035 podStartE2EDuration="5.16370997s" podCreationTimestamp="2026-01-20 22:41:44 +0000 UTC" firstStartedPulling="2026-01-20 22:41:46.106795025 +0000 UTC m=+378.427055313" lastFinishedPulling="2026-01-20 22:41:47.59992392 +0000 UTC m=+379.920184248" observedRunningTime="2026-01-20 22:41:48.160795685 +0000 UTC m=+380.481055973" watchObservedRunningTime="2026-01-20 22:41:49.16370997 +0000 UTC m=+381.483970308" Jan 20 22:41:49 crc kubenswrapper[5030]: I0120 22:41:49.193324 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v64zq" podStartSLOduration=2.786558802 podStartE2EDuration="5.193306346s" podCreationTimestamp="2026-01-20 22:41:44 +0000 UTC" firstStartedPulling="2026-01-20 22:41:46.101037704 +0000 UTC m=+378.421298002" lastFinishedPulling="2026-01-20 22:41:48.507785258 +0000 UTC m=+380.828045546" observedRunningTime="2026-01-20 22:41:49.189729448 +0000 UTC m=+381.509989746" watchObservedRunningTime="2026-01-20 22:41:49.193306346 +0000 UTC m=+381.513566634" Jan 20 22:41:50 crc kubenswrapper[5030]: I0120 22:41:50.156912 5030 generic.go:334] "Generic (PLEG): container finished" podID="58bea95b-6632-463b-8ff9-24a85f213dde" containerID="891f24e0df6c930a7d936beacdc4e4f1f368ffaf6e5fc0c599677bc0f64fe044" exitCode=0 Jan 20 22:41:50 crc kubenswrapper[5030]: I0120 22:41:50.157011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerDied","Data":"891f24e0df6c930a7d936beacdc4e4f1f368ffaf6e5fc0c599677bc0f64fe044"} Jan 20 22:41:51 crc kubenswrapper[5030]: I0120 22:41:51.163650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerStarted","Data":"fef85f95d9352905e4aa39b823442e3681224682f150a380347d28a466215ced"} Jan 20 22:41:51 crc kubenswrapper[5030]: I0120 22:41:51.184085 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fv846" podStartSLOduration=3.711238248 
podStartE2EDuration="5.184070304s" podCreationTimestamp="2026-01-20 22:41:46 +0000 UTC" firstStartedPulling="2026-01-20 22:41:49.152073643 +0000 UTC m=+381.472333941" lastFinishedPulling="2026-01-20 22:41:50.624905709 +0000 UTC m=+382.945165997" observedRunningTime="2026-01-20 22:41:51.179740967 +0000 UTC m=+383.500001255" watchObservedRunningTime="2026-01-20 22:41:51.184070304 +0000 UTC m=+383.504330592" Jan 20 22:41:52 crc kubenswrapper[5030]: I0120 22:41:52.172078 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f09a830-0564-41d1-8e00-3232119becba" containerID="5feec5fcea41b0014502a011a58bd3b031086aa3df92b6c554b36baf9d8784e2" exitCode=0 Jan 20 22:41:52 crc kubenswrapper[5030]: I0120 22:41:52.172568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerDied","Data":"5feec5fcea41b0014502a011a58bd3b031086aa3df92b6c554b36baf9d8784e2"} Jan 20 22:41:54 crc kubenswrapper[5030]: I0120 22:41:54.847055 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:54 crc kubenswrapper[5030]: I0120 22:41:54.847595 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:54 crc kubenswrapper[5030]: I0120 22:41:54.888549 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.046646 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.046714 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.082759 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.193910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerStarted","Data":"376793db80f69aae127680fc34ea4dae95d690fbea083d5f60fe681742db9c1a"} Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.221058 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kf5kh" podStartSLOduration=2.8015252349999997 podStartE2EDuration="8.221038859s" podCreationTimestamp="2026-01-20 22:41:47 +0000 UTC" firstStartedPulling="2026-01-20 22:41:49.144979879 +0000 UTC m=+381.465240167" lastFinishedPulling="2026-01-20 22:41:54.564493493 +0000 UTC m=+386.884753791" observedRunningTime="2026-01-20 22:41:55.217188474 +0000 UTC m=+387.537448762" watchObservedRunningTime="2026-01-20 22:41:55.221038859 +0000 UTC m=+387.541299147" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.239965 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v64zq" Jan 20 22:41:55 crc kubenswrapper[5030]: I0120 22:41:55.246950 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9nnfj" Jan 20 22:41:57 crc kubenswrapper[5030]: I0120 22:41:57.226264 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:57 crc kubenswrapper[5030]: I0120 22:41:57.226758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:57 crc kubenswrapper[5030]: I0120 22:41:57.274538 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:57 crc kubenswrapper[5030]: I0120 22:41:57.478781 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:57 crc kubenswrapper[5030]: I0120 22:41:57.478828 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:41:58 crc kubenswrapper[5030]: I0120 22:41:58.252227 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 20 22:41:58 crc kubenswrapper[5030]: I0120 22:41:58.519697 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kf5kh" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="registry-server" probeResult="failure" output=< Jan 20 22:41:58 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:41:58 crc kubenswrapper[5030]: > Jan 20 22:42:07 crc kubenswrapper[5030]: I0120 22:42:07.518994 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:42:07 crc kubenswrapper[5030]: I0120 22:42:07.568330 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 20 22:42:10 crc kubenswrapper[5030]: I0120 22:42:10.157532 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:42:10 crc kubenswrapper[5030]: I0120 22:42:10.157607 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:42:11 crc kubenswrapper[5030]: I0120 22:42:11.524107 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" podUID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" containerName="registry" containerID="cri-o://9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa" gracePeriod=30 Jan 20 22:42:11 crc kubenswrapper[5030]: I0120 22:42:11.891063 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:42:11 crc kubenswrapper[5030]: I0120 22:42:11.999513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:11 crc kubenswrapper[5030]: I0120 22:42:11.999584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:11 crc kubenswrapper[5030]: I0120 22:42:11.999677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jdkd\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:11.999989 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000943 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted\") pod \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\" (UID: \"e6b44a8f-35c4-4a14-a7bc-ab287216c87d\") " Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.000665 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.001360 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.001422 5030 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.017760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd" (OuterVolumeSpecName: "kube-api-access-9jdkd") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "kube-api-access-9jdkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.018188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.018275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.018733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.018803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.022174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e6b44a8f-35c4-4a14-a7bc-ab287216c87d" (UID: "e6b44a8f-35c4-4a14-a7bc-ab287216c87d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.102533 5030 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.102571 5030 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.102586 5030 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.102601 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jdkd\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-kube-api-access-9jdkd\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.102612 5030 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b44a8f-35c4-4a14-a7bc-ab287216c87d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.284136 5030 generic.go:334] "Generic (PLEG): container finished" podID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" containerID="9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa" exitCode=0 Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.284187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" event={"ID":"e6b44a8f-35c4-4a14-a7bc-ab287216c87d","Type":"ContainerDied","Data":"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa"} Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.284218 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.284242 5030 scope.go:117] "RemoveContainer" containerID="9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.284232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dbvw5" event={"ID":"e6b44a8f-35c4-4a14-a7bc-ab287216c87d","Type":"ContainerDied","Data":"86415a1670562d1bac1b8ab29155b7e77cb42cc2a02b13f0e0246ca0297b4ff7"} Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.298889 5030 scope.go:117] "RemoveContainer" containerID="9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa" Jan 20 22:42:12 crc kubenswrapper[5030]: E0120 22:42:12.299294 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa\": container with ID starting with 9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa not found: ID does not exist" containerID="9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.299406 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa"} err="failed to get container status \"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa\": rpc error: code = NotFound desc = could not find container \"9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa\": container with ID starting with 9a90e6eb0307e61dcc2ec53f8e11f1fbff33f6c72b55553cd5f476220c0f5aaa not found: ID does not exist" Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.316705 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 22:42:12 crc kubenswrapper[5030]: I0120 22:42:12.322126 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dbvw5"] Jan 20 22:42:13 crc kubenswrapper[5030]: I0120 22:42:13.974751 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" path="/var/lib/kubelet/pods/e6b44a8f-35c4-4a14-a7bc-ab287216c87d/volumes" Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.157688 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.158535 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.158609 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.159518 5030 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.159658 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b" gracePeriod=600 Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.477791 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b" exitCode=0 Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.477901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b"} Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.478173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289"} Jan 20 22:42:40 crc kubenswrapper[5030]: I0120 22:42:40.478193 5030 scope.go:117] "RemoveContainer" containerID="c99c7570a412da9957475524691801f89f2575b8349d6a774c533d55a3bfdbce" Jan 20 22:44:28 crc kubenswrapper[5030]: I0120 22:44:28.241722 5030 scope.go:117] "RemoveContainer" containerID="dbfcbbfdc94d07878acd4c063509f224a093b447b62ede0db139bb9700f6b58d" Jan 20 22:44:28 crc kubenswrapper[5030]: I0120 22:44:28.278601 5030 scope.go:117] "RemoveContainer" containerID="54a8905efa3bfda9f6b135f1400371f68d47f5c35b90fded926bc064dabfd49f" Jan 20 22:44:40 crc kubenswrapper[5030]: I0120 22:44:40.157296 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:44:40 crc kubenswrapper[5030]: I0120 22:44:40.157923 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.201938 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp"] Jan 20 22:45:00 crc kubenswrapper[5030]: E0120 22:45:00.202820 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" containerName="registry" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.202841 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" containerName="registry" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.203011 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b44a8f-35c4-4a14-a7bc-ab287216c87d" containerName="registry" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.203546 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.205857 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.207826 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.210383 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp"] Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.277719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.277809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.277893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s6jm\" (UniqueName: \"kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.379343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.379498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.379577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s6jm\" (UniqueName: \"kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 
22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.381518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.389585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.414481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s6jm\" (UniqueName: \"kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm\") pod \"collect-profiles-29482485-74cfp\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.528562 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:00 crc kubenswrapper[5030]: I0120 22:45:00.758313 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp"] Jan 20 22:45:01 crc kubenswrapper[5030]: I0120 22:45:01.424155 5030 generic.go:334] "Generic (PLEG): container finished" podID="d932ea97-0b18-4e73-8f4d-024982fbbd5c" containerID="a06bb76ff4a3825e1075099128adc0c7f62a34e84ae7348cfcb05f086f4b2136" exitCode=0 Jan 20 22:45:01 crc kubenswrapper[5030]: I0120 22:45:01.424213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" event={"ID":"d932ea97-0b18-4e73-8f4d-024982fbbd5c","Type":"ContainerDied","Data":"a06bb76ff4a3825e1075099128adc0c7f62a34e84ae7348cfcb05f086f4b2136"} Jan 20 22:45:01 crc kubenswrapper[5030]: I0120 22:45:01.424251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" event={"ID":"d932ea97-0b18-4e73-8f4d-024982fbbd5c","Type":"ContainerStarted","Data":"328e9380ade0a360a484de14da628b7df1cea341dc634b6c03b73bdb82e27e64"} Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.691283 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.811682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume\") pod \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.811850 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s6jm\" (UniqueName: \"kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm\") pod \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.811880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume\") pod \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\" (UID: \"d932ea97-0b18-4e73-8f4d-024982fbbd5c\") " Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.812705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d932ea97-0b18-4e73-8f4d-024982fbbd5c" (UID: "d932ea97-0b18-4e73-8f4d-024982fbbd5c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.821354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d932ea97-0b18-4e73-8f4d-024982fbbd5c" (UID: "d932ea97-0b18-4e73-8f4d-024982fbbd5c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.821430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm" (OuterVolumeSpecName: "kube-api-access-2s6jm") pod "d932ea97-0b18-4e73-8f4d-024982fbbd5c" (UID: "d932ea97-0b18-4e73-8f4d-024982fbbd5c"). InnerVolumeSpecName "kube-api-access-2s6jm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.913605 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d932ea97-0b18-4e73-8f4d-024982fbbd5c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.913684 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d932ea97-0b18-4e73-8f4d-024982fbbd5c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 22:45:02 crc kubenswrapper[5030]: I0120 22:45:02.913701 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s6jm\" (UniqueName: \"kubernetes.io/projected/d932ea97-0b18-4e73-8f4d-024982fbbd5c-kube-api-access-2s6jm\") on node \"crc\" DevicePath \"\"" Jan 20 22:45:03 crc kubenswrapper[5030]: I0120 22:45:03.440761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" event={"ID":"d932ea97-0b18-4e73-8f4d-024982fbbd5c","Type":"ContainerDied","Data":"328e9380ade0a360a484de14da628b7df1cea341dc634b6c03b73bdb82e27e64"} Jan 20 22:45:03 crc kubenswrapper[5030]: I0120 22:45:03.440828 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328e9380ade0a360a484de14da628b7df1cea341dc634b6c03b73bdb82e27e64" Jan 20 22:45:03 crc kubenswrapper[5030]: I0120 22:45:03.440837 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp" Jan 20 22:45:10 crc kubenswrapper[5030]: I0120 22:45:10.157777 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:45:10 crc kubenswrapper[5030]: I0120 22:45:10.158302 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:45:28 crc kubenswrapper[5030]: I0120 22:45:28.326843 5030 scope.go:117] "RemoveContainer" containerID="1e454adb39792cdb1c7075907142434f88e8bc3e65cf2cff45c85d0c327f59cf" Jan 20 22:45:28 crc kubenswrapper[5030]: I0120 22:45:28.348495 5030 scope.go:117] "RemoveContainer" containerID="0b95c987e2e6180ab540e0177ac73fa5f4e2df06f86aec5dd1c9139a310f5032" Jan 20 22:45:28 crc kubenswrapper[5030]: I0120 22:45:28.362367 5030 scope.go:117] "RemoveContainer" containerID="d120a165c85084026947a99f6a56e4671c37d43f4f56c49bab41c61f86fbd431" Jan 20 22:45:28 crc kubenswrapper[5030]: I0120 22:45:28.381046 5030 scope.go:117] "RemoveContainer" containerID="7dbd8dce79e05fe61c3ef97dce8cbe4c309ce769b7c8b1a819081d8cdb617dd0" Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.159304 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.160122 5030 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.160216 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.161148 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.161226 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289" gracePeriod=600 Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.677697 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289" exitCode=0 Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.677759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289"} Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.678063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b"} Jan 20 22:45:40 crc kubenswrapper[5030]: I0120 22:45:40.678092 5030 scope.go:117] "RemoveContainer" containerID="f9e88aa4bf39a956e813cd75bd5356756871e80ce6ac73ab5e8b532a1293320b" Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.607438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kq4dw"] Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.608528 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="northd" containerID="cri-o://4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.608742 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="sbdb" containerID="cri-o://81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.608812 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" 
podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="nbdb" containerID="cri-o://1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.609120 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.609139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-node" containerID="cri-o://e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.609401 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-acl-logging" containerID="cri-o://4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.609966 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-controller" containerID="cri-o://f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.687235 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" containerID="cri-o://247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" gracePeriod=30 Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.979183 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/3.log" Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.981600 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovn-acl-logging/0.log" Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.982117 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovn-controller/0.log" Jan 20 22:46:48 crc kubenswrapper[5030]: I0120 22:46:48.982517 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030005 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jtthn"] Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-acl-logging" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030209 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-acl-logging" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030219 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030226 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030233 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030247 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030262 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030268 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-node" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-node" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030292 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kubecfg-setup" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kubecfg-setup" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030319 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d932ea97-0b18-4e73-8f4d-024982fbbd5c" containerName="collect-profiles" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030325 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d932ea97-0b18-4e73-8f4d-024982fbbd5c" containerName="collect-profiles" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030333 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="sbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030338 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="sbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="nbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="nbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030361 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="northd" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030366 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="northd" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030443 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d932ea97-0b18-4e73-8f4d-024982fbbd5c" containerName="collect-profiles" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030470 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-node" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030476 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-acl-logging" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030484 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030491 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030497 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="nbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="sbdb" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030515 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovn-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030522 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="northd" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030634 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.030646 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030653 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030731 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.030739 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf449d4a-2037-4802-8370-1965d3026c07" containerName="ovnkube-controller" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.033985 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.132699 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/2.log" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.133238 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/1.log" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.133301 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e610661-5072-4aa0-b1f1-75410b7f663b" containerID="87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a" exitCode=2 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.133372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerDied","Data":"87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.133418 5030 scope.go:117] "RemoveContainer" containerID="b0f5e385cd5f1702d045571944087e6d00cc5743b66386b068f212c853833667" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.134219 5030 scope.go:117] "RemoveContainer" containerID="87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.134554 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n8v4f_openshift-multus(7e610661-5072-4aa0-b1f1-75410b7f663b)\"" pod="openshift-multus/multus-n8v4f" podUID="7e610661-5072-4aa0-b1f1-75410b7f663b" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.136975 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovnkube-controller/3.log" Jan 20 22:46:49 crc 
kubenswrapper[5030]: I0120 22:46:49.139416 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovn-acl-logging/0.log" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140027 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kq4dw_bf449d4a-2037-4802-8370-1965d3026c07/ovn-controller/0.log" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140367 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140405 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140414 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140422 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140429 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140436 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" exitCode=0 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140438 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140444 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" exitCode=143 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140452 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf449d4a-2037-4802-8370-1965d3026c07" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" exitCode=143 Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140577 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140588 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140594 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140600 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140606 
5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140611 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140636 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140642 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140647 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140652 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140667 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140674 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140679 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140684 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140689 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140694 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140715 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140720 
5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140725 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140731 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140746 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140752 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140757 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140762 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140767 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140772 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140793 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140798 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140803 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140808 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140815 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kq4dw" event={"ID":"bf449d4a-2037-4802-8370-1965d3026c07","Type":"ContainerDied","Data":"beeda922fc880b32e36c237db8ea60c6490fde2c801311a1fae103837e897a85"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140822 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140828 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140834 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140839 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140844 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140850 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140870 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140877 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140882 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.140888 5030 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141868 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141935 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wk7p\" (UniqueName: \"kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.141981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log" (OuterVolumeSpecName: "node-log") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142407 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket" (OuterVolumeSpecName: "log-socket") pod 
"bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142688 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.142570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes\") pod \"bf449d4a-2037-4802-8370-1965d3026c07\" (UID: \"bf449d4a-2037-4802-8370-1965d3026c07\") " Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-bin\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-systemd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-node-log\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-log-socket\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-etc-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: 
\"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-env-overrides\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2nw7\" (UniqueName: \"kubernetes.io/projected/88194a1b-dd38-4eaa-8544-a69bdd75f948-kube-api-access-z2nw7\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-ovn\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-kubelet\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-systemd-units\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-config\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-netns\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib" 
(OuterVolumeSpecName: "ovnkube-script-lib") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143609 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-netd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143675 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-var-lib-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovn-node-metrics-cert\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-slash\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143751 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-script-lib\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143787 5030 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143798 5030 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.143807 5030 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144766 5030 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144782 5030 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144793 5030 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144801 5030 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144810 5030 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144818 5030 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144827 5030 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144836 5030 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-node-log\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144844 5030 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-log-socket\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144853 5030 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144861 5030 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf449d4a-2037-4802-8370-1965d3026c07-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144869 5030 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 20 
22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.144975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash" (OuterVolumeSpecName: "host-slash") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.147568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p" (OuterVolumeSpecName: "kube-api-access-6wk7p") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "kube-api-access-6wk7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.147832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.157403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bf449d4a-2037-4802-8370-1965d3026c07" (UID: "bf449d4a-2037-4802-8370-1965d3026c07"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.176817 5030 scope.go:117] "RemoveContainer" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.195994 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.213161 5030 scope.go:117] "RemoveContainer" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.228889 5030 scope.go:117] "RemoveContainer" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.243305 5030 scope.go:117] "RemoveContainer" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-var-lib-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovn-node-metrics-cert\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245381 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-slash\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-script-lib\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-bin\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-systemd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-node-log\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-log-socket\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-etc-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-systemd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-var-lib-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-slash\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-env-overrides\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2nw7\" (UniqueName: \"kubernetes.io/projected/88194a1b-dd38-4eaa-8544-a69bdd75f948-kube-api-access-z2nw7\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-ovn\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-kubelet\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-systemd-units\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-config\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245889 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-netns\") pod 
\"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-netd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-node-log\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-log-socket\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246105 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.245886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-run-netns\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246123 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-netd\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-cni-bin\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-env-overrides\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-etc-openvswitch\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-host-kubelet\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246206 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wk7p\" (UniqueName: \"kubernetes.io/projected/bf449d4a-2037-4802-8370-1965d3026c07-kube-api-access-6wk7p\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-run-ovn\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88194a1b-dd38-4eaa-8544-a69bdd75f948-systemd-units\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246286 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf449d4a-2037-4802-8370-1965d3026c07-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246314 5030 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246328 5030 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-host-slash\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246343 5030 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/bf449d4a-2037-4802-8370-1965d3026c07-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-script-lib\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.246602 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovnkube-config\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.248558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88194a1b-dd38-4eaa-8544-a69bdd75f948-ovn-node-metrics-cert\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.258546 5030 scope.go:117] "RemoveContainer" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.264984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2nw7\" (UniqueName: \"kubernetes.io/projected/88194a1b-dd38-4eaa-8544-a69bdd75f948-kube-api-access-z2nw7\") pod \"ovnkube-node-jtthn\" (UID: \"88194a1b-dd38-4eaa-8544-a69bdd75f948\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.269574 5030 scope.go:117] "RemoveContainer" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.282648 5030 scope.go:117] "RemoveContainer" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.298147 5030 scope.go:117] "RemoveContainer" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.314204 5030 scope.go:117] "RemoveContainer" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.329162 5030 scope.go:117] "RemoveContainer" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.329738 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": container with ID starting with 247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008 not found: ID does not exist" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.329772 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} err="failed to get container status \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": rpc error: code = NotFound desc = 
could not find container \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": container with ID starting with 247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.329798 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.330295 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": container with ID starting with 42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2 not found: ID does not exist" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.330337 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} err="failed to get container status \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": rpc error: code = NotFound desc = could not find container \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": container with ID starting with 42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.330367 5030 scope.go:117] "RemoveContainer" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.330669 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": container with ID starting with 81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede not found: ID does not exist" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.330692 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} err="failed to get container status \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": rpc error: code = NotFound desc = could not find container \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": container with ID starting with 81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.330707 5030 scope.go:117] "RemoveContainer" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.331140 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": container with ID starting with 1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab not found: ID does not exist" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.331177 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} err="failed to get container status \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": rpc error: code = NotFound desc = could not find container \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": container with ID starting with 1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.331199 5030 scope.go:117] "RemoveContainer" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.331563 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": container with ID starting with 4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352 not found: ID does not exist" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.331595 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} err="failed to get container status \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": rpc error: code = NotFound desc = could not find container \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": container with ID starting with 4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.331610 5030 scope.go:117] "RemoveContainer" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.332071 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": container with ID starting with 171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe not found: ID does not exist" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.332105 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} err="failed to get container status \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": rpc error: code = NotFound desc = could not find container \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": container with ID starting with 171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.332124 5030 scope.go:117] "RemoveContainer" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.332393 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": container with ID starting with e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c not found: ID does not exist" 
containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.332726 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} err="failed to get container status \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": rpc error: code = NotFound desc = could not find container \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": container with ID starting with e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.332744 5030 scope.go:117] "RemoveContainer" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.333016 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": container with ID starting with 4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95 not found: ID does not exist" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333043 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} err="failed to get container status \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": rpc error: code = NotFound desc = could not find container \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": container with ID starting with 4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333059 5030 scope.go:117] "RemoveContainer" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.333414 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": container with ID starting with f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3 not found: ID does not exist" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333444 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} err="failed to get container status \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": rpc error: code = NotFound desc = could not find container \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": container with ID starting with f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333462 5030 scope.go:117] "RemoveContainer" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: E0120 22:46:49.333746 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": container with ID starting with 00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902 not found: ID does not exist" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333776 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} err="failed to get container status \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": rpc error: code = NotFound desc = could not find container \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": container with ID starting with 00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.333797 5030 scope.go:117] "RemoveContainer" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334142 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} err="failed to get container status \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": rpc error: code = NotFound desc = could not find container \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": container with ID starting with 247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334164 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334504 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} err="failed to get container status \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": rpc error: code = NotFound desc = could not find container \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": container with ID starting with 42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334533 5030 scope.go:117] "RemoveContainer" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334908 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} err="failed to get container status \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": rpc error: code = NotFound desc = could not find container \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": container with ID starting with 81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.334937 5030 scope.go:117] "RemoveContainer" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335305 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} err="failed to get container status \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": rpc error: code = NotFound desc = could not find container \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": container with ID starting with 1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335331 5030 scope.go:117] "RemoveContainer" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335595 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} err="failed to get container status \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": rpc error: code = NotFound desc = could not find container \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": container with ID starting with 4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335636 5030 scope.go:117] "RemoveContainer" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335894 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} err="failed to get container status \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": rpc error: code = NotFound desc = could not find container \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": container with ID starting with 171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.335919 5030 scope.go:117] "RemoveContainer" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336150 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} err="failed to get container status \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": rpc error: code = NotFound desc = could not find container \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": container with ID starting with e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336179 5030 scope.go:117] "RemoveContainer" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336415 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} err="failed to get container status \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": rpc error: code = NotFound desc = could not find container \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": container with ID starting with 4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95 not found: ID does not exist" Jan 
20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336440 5030 scope.go:117] "RemoveContainer" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336676 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} err="failed to get container status \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": rpc error: code = NotFound desc = could not find container \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": container with ID starting with f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336696 5030 scope.go:117] "RemoveContainer" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336913 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} err="failed to get container status \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": rpc error: code = NotFound desc = could not find container \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": container with ID starting with 00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.336940 5030 scope.go:117] "RemoveContainer" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337117 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} err="failed to get container status \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": rpc error: code = NotFound desc = could not find container \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": container with ID starting with 247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337138 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337331 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} err="failed to get container status \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": rpc error: code = NotFound desc = could not find container \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": container with ID starting with 42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337350 5030 scope.go:117] "RemoveContainer" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337650 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} err="failed to get container status 
\"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": rpc error: code = NotFound desc = could not find container \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": container with ID starting with 81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337674 5030 scope.go:117] "RemoveContainer" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337870 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} err="failed to get container status \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": rpc error: code = NotFound desc = could not find container \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": container with ID starting with 1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.337886 5030 scope.go:117] "RemoveContainer" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338087 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} err="failed to get container status \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": rpc error: code = NotFound desc = could not find container \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": container with ID starting with 4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338106 5030 scope.go:117] "RemoveContainer" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338303 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} err="failed to get container status \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": rpc error: code = NotFound desc = could not find container \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": container with ID starting with 171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338324 5030 scope.go:117] "RemoveContainer" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338501 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} err="failed to get container status \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": rpc error: code = NotFound desc = could not find container \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": container with ID starting with e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338518 5030 scope.go:117] "RemoveContainer" 
containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338800 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} err="failed to get container status \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": rpc error: code = NotFound desc = could not find container \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": container with ID starting with 4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.338834 5030 scope.go:117] "RemoveContainer" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339049 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} err="failed to get container status \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": rpc error: code = NotFound desc = could not find container \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": container with ID starting with f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339074 5030 scope.go:117] "RemoveContainer" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339355 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} err="failed to get container status \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": rpc error: code = NotFound desc = could not find container \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": container with ID starting with 00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339379 5030 scope.go:117] "RemoveContainer" containerID="247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339579 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008"} err="failed to get container status \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": rpc error: code = NotFound desc = could not find container \"247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008\": container with ID starting with 247e22e0e1b2cdc972991912e6123ff5cb74bfe44a760ccaa933ae6848324008 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339605 5030 scope.go:117] "RemoveContainer" containerID="42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339885 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2"} err="failed to get container status \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": rpc error: code = NotFound desc = could not find 
container \"42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2\": container with ID starting with 42fd5a8b6914585e14021b107a968d5d3d450344814968e90b8fff1d789f77d2 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.339926 5030 scope.go:117] "RemoveContainer" containerID="81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340171 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede"} err="failed to get container status \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": rpc error: code = NotFound desc = could not find container \"81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede\": container with ID starting with 81f3e6645618f6008d1fdaeaf44b9313a1b001ff4e35ff9962b749b1832a3ede not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340196 5030 scope.go:117] "RemoveContainer" containerID="1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340402 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab"} err="failed to get container status \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": rpc error: code = NotFound desc = could not find container \"1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab\": container with ID starting with 1b6f783b0057455502b9ab5baa44b735546bfbef2fd97f42431dec49cf0bbeab not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340424 5030 scope.go:117] "RemoveContainer" containerID="4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340680 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352"} err="failed to get container status \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": rpc error: code = NotFound desc = could not find container \"4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352\": container with ID starting with 4d6f7f62c600d47be57a6acae466015ba59c6e855a17c535ebecc6ad82958352 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340702 5030 scope.go:117] "RemoveContainer" containerID="171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.340990 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe"} err="failed to get container status \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": rpc error: code = NotFound desc = could not find container \"171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe\": container with ID starting with 171c6cf39b103a5de08f675d6a4a0092430cf57e0d4efd16b4d18b759019fbbe not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.341019 5030 scope.go:117] "RemoveContainer" containerID="e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.341353 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c"} err="failed to get container status \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": rpc error: code = NotFound desc = could not find container \"e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c\": container with ID starting with e85e2a238d3a75b8a5a8306314eafd2698587a2627e0840200f60573141fb09c not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.341380 5030 scope.go:117] "RemoveContainer" containerID="4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.341656 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95"} err="failed to get container status \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": rpc error: code = NotFound desc = could not find container \"4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95\": container with ID starting with 4201f85eb7f6b0f498fd7a539f380e406d73982aa1900ff854a4725378050f95 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.341685 5030 scope.go:117] "RemoveContainer" containerID="f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.342187 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3"} err="failed to get container status \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": rpc error: code = NotFound desc = could not find container \"f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3\": container with ID starting with f3e8942dd7489302ae306cc0d77973306fe98d49b0f03f8a01292c291af0bfc3 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.342214 5030 scope.go:117] "RemoveContainer" containerID="00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.342437 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902"} err="failed to get container status \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": rpc error: code = NotFound desc = could not find container \"00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902\": container with ID starting with 00e4c3fb1cc83adf0dca41d023adb021d1c59091d66dfd739c9054bf0c781902 not found: ID does not exist" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.350088 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.470945 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kq4dw"] Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.488195 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kq4dw"] Jan 20 22:46:49 crc kubenswrapper[5030]: I0120 22:46:49.973598 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf449d4a-2037-4802-8370-1965d3026c07" path="/var/lib/kubelet/pods/bf449d4a-2037-4802-8370-1965d3026c07/volumes" Jan 20 22:46:50 crc kubenswrapper[5030]: I0120 22:46:50.151072 5030 generic.go:334] "Generic (PLEG): container finished" podID="88194a1b-dd38-4eaa-8544-a69bdd75f948" containerID="be1fa620260d52e89c186e27ca718ae731478e4075b513b49370644f2d8e1c38" exitCode=0 Jan 20 22:46:50 crc kubenswrapper[5030]: I0120 22:46:50.151200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerDied","Data":"be1fa620260d52e89c186e27ca718ae731478e4075b513b49370644f2d8e1c38"} Jan 20 22:46:50 crc kubenswrapper[5030]: I0120 22:46:50.151325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"f4f2a741ad59795c569a8dfc2ff1027e3dcf060d30723c1dde08ea5315d8e37b"} Jan 20 22:46:50 crc kubenswrapper[5030]: I0120 22:46:50.154586 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/2.log" Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"72efc36cf937ac96f4475995d01c92ad4f5f1a6c0c2aea226effcaf40d71e292"} Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"859eb14843438a6cc527b8fee8e9d1d400f127c2aa43e0b4d52dec6c96cbf7c3"} Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"407a6526ed7c411de4ecf73ab7654bd453afd67ef0edf11f42fa6b5b66d51251"} Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"f70968b41705f0bf140f83595a70d6326bc9d7f5a02f65ade82b4a07f2f89b79"} Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"08ed87a14fa3b4bf41d67e3f55a7fd3f1979fd626ecae19199c6ed78c4fd8df6"} Jan 20 22:46:51 crc kubenswrapper[5030]: I0120 22:46:51.175775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" 
event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"8e0c667a3fe8a448687e4ca3a72ecdb1e8bc1ffff03e8324fe04586db21554c6"} Jan 20 22:46:54 crc kubenswrapper[5030]: I0120 22:46:54.198730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"48d19cd6b05e18e712be625204474a4ab5ed72fff809837bb84c2188b8151b86"} Jan 20 22:46:56 crc kubenswrapper[5030]: I0120 22:46:56.213292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" event={"ID":"88194a1b-dd38-4eaa-8544-a69bdd75f948","Type":"ContainerStarted","Data":"b47c2c03b02ee70f55be932f40d527c7ad33c94a0c15d3c3c038093fd9af274e"} Jan 20 22:46:56 crc kubenswrapper[5030]: I0120 22:46:56.213946 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:56 crc kubenswrapper[5030]: I0120 22:46:56.213963 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:56 crc kubenswrapper[5030]: I0120 22:46:56.238406 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:56 crc kubenswrapper[5030]: I0120 22:46:56.242817 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" podStartSLOduration=7.242800862 podStartE2EDuration="7.242800862s" podCreationTimestamp="2026-01-20 22:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:46:56.240454596 +0000 UTC m=+688.560714874" watchObservedRunningTime="2026-01-20 22:46:56.242800862 +0000 UTC m=+688.563061150" Jan 20 22:46:57 crc kubenswrapper[5030]: I0120 22:46:57.219400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:46:57 crc kubenswrapper[5030]: I0120 22:46:57.259256 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:47:01 crc kubenswrapper[5030]: I0120 22:47:01.963099 5030 scope.go:117] "RemoveContainer" containerID="87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a" Jan 20 22:47:01 crc kubenswrapper[5030]: E0120 22:47:01.963936 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n8v4f_openshift-multus(7e610661-5072-4aa0-b1f1-75410b7f663b)\"" pod="openshift-multus/multus-n8v4f" podUID="7e610661-5072-4aa0-b1f1-75410b7f663b" Jan 20 22:47:12 crc kubenswrapper[5030]: I0120 22:47:12.962395 5030 scope.go:117] "RemoveContainer" containerID="87892d6c086c1bb0bb14fd6908aee383a0b4a4535439b578baf5b6b838452f1a" Jan 20 22:47:13 crc kubenswrapper[5030]: I0120 22:47:13.319370 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/2.log" Jan 20 22:47:13 crc kubenswrapper[5030]: I0120 22:47:13.319727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n8v4f" 
event={"ID":"7e610661-5072-4aa0-b1f1-75410b7f663b","Type":"ContainerStarted","Data":"d82500226c0aa10a7d98a06f8ebc87db1838668d516bd42c3251ef8fd843ef5d"} Jan 20 22:47:19 crc kubenswrapper[5030]: I0120 22:47:19.385275 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtthn" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.958063 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk"] Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.960565 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.963389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgvt\" (UniqueName: \"kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.963487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.963572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.964526 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 22:47:26 crc kubenswrapper[5030]: I0120 22:47:26.971371 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk"] Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.065079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgvt\" (UniqueName: \"kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.065178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" 
Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.065244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.066015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.067015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.104894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgvt\" (UniqueName: \"kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.282024 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:27 crc kubenswrapper[5030]: I0120 22:47:27.536959 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk"] Jan 20 22:47:28 crc kubenswrapper[5030]: I0120 22:47:28.418290 5030 generic.go:334] "Generic (PLEG): container finished" podID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerID="e8da43346499a27b3d7f9747979f3fed05d549e5d2324129196255ea4a8a0aca" exitCode=0 Jan 20 22:47:28 crc kubenswrapper[5030]: I0120 22:47:28.418384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" event={"ID":"d2493d5a-ed39-48f7-a46f-f5b19cc201ae","Type":"ContainerDied","Data":"e8da43346499a27b3d7f9747979f3fed05d549e5d2324129196255ea4a8a0aca"} Jan 20 22:47:28 crc kubenswrapper[5030]: I0120 22:47:28.419434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" event={"ID":"d2493d5a-ed39-48f7-a46f-f5b19cc201ae","Type":"ContainerStarted","Data":"2c9d6b53e42907fcf350b808658261500b32869c6ce4d1d6d8be59f411c8501e"} Jan 20 22:47:28 crc kubenswrapper[5030]: I0120 22:47:28.421198 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 22:47:30 crc kubenswrapper[5030]: I0120 22:47:30.434481 5030 generic.go:334] "Generic (PLEG): container finished" podID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerID="aec4f01612e489033df5a614d5560ed5de61b02d76761f9a9a237794773a1d46" exitCode=0 Jan 20 22:47:30 crc kubenswrapper[5030]: I0120 22:47:30.434526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" event={"ID":"d2493d5a-ed39-48f7-a46f-f5b19cc201ae","Type":"ContainerDied","Data":"aec4f01612e489033df5a614d5560ed5de61b02d76761f9a9a237794773a1d46"} Jan 20 22:47:31 crc kubenswrapper[5030]: I0120 22:47:31.445272 5030 generic.go:334] "Generic (PLEG): container finished" podID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerID="4d4d9a33bdfb8c13fbeb0946ff60dcf68c6090edf9a6bc630714daa89c02f200" exitCode=0 Jan 20 22:47:31 crc kubenswrapper[5030]: I0120 22:47:31.445367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" event={"ID":"d2493d5a-ed39-48f7-a46f-f5b19cc201ae","Type":"ContainerDied","Data":"4d4d9a33bdfb8c13fbeb0946ff60dcf68c6090edf9a6bc630714daa89c02f200"} Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.721830 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.738770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bgvt\" (UniqueName: \"kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt\") pod \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.738874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle\") pod \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.738920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util\") pod \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\" (UID: \"d2493d5a-ed39-48f7-a46f-f5b19cc201ae\") " Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.741249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle" (OuterVolumeSpecName: "bundle") pod "d2493d5a-ed39-48f7-a46f-f5b19cc201ae" (UID: "d2493d5a-ed39-48f7-a46f-f5b19cc201ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.748898 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt" (OuterVolumeSpecName: "kube-api-access-2bgvt") pod "d2493d5a-ed39-48f7-a46f-f5b19cc201ae" (UID: "d2493d5a-ed39-48f7-a46f-f5b19cc201ae"). InnerVolumeSpecName "kube-api-access-2bgvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.763051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util" (OuterVolumeSpecName: "util") pod "d2493d5a-ed39-48f7-a46f-f5b19cc201ae" (UID: "d2493d5a-ed39-48f7-a46f-f5b19cc201ae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.840387 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-util\") on node \"crc\" DevicePath \"\"" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.840439 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bgvt\" (UniqueName: \"kubernetes.io/projected/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-kube-api-access-2bgvt\") on node \"crc\" DevicePath \"\"" Jan 20 22:47:32 crc kubenswrapper[5030]: I0120 22:47:32.840463 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2493d5a-ed39-48f7-a46f-f5b19cc201ae-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:47:33 crc kubenswrapper[5030]: I0120 22:47:33.464221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" event={"ID":"d2493d5a-ed39-48f7-a46f-f5b19cc201ae","Type":"ContainerDied","Data":"2c9d6b53e42907fcf350b808658261500b32869c6ce4d1d6d8be59f411c8501e"} Jan 20 22:47:33 crc kubenswrapper[5030]: I0120 22:47:33.464602 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c9d6b53e42907fcf350b808658261500b32869c6ce4d1d6d8be59f411c8501e" Jan 20 22:47:33 crc kubenswrapper[5030]: I0120 22:47:33.464336 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.616418 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mwgvw"] Jan 20 22:47:34 crc kubenswrapper[5030]: E0120 22:47:34.616647 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="pull" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.616662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="pull" Jan 20 22:47:34 crc kubenswrapper[5030]: E0120 22:47:34.616679 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="util" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.616686 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="util" Jan 20 22:47:34 crc kubenswrapper[5030]: E0120 22:47:34.616701 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="extract" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.616708 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="extract" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.616852 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" containerName="extract" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.617249 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.620102 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b4mtr" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.620369 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.620572 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.628214 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mwgvw"] Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.664798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnzf\" (UniqueName: \"kubernetes.io/projected/c6f7ddb6-5178-42be-98ca-d39830d57dbc-kube-api-access-5fnzf\") pod \"nmstate-operator-646758c888-mwgvw\" (UID: \"c6f7ddb6-5178-42be-98ca-d39830d57dbc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.766117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnzf\" (UniqueName: \"kubernetes.io/projected/c6f7ddb6-5178-42be-98ca-d39830d57dbc-kube-api-access-5fnzf\") pod \"nmstate-operator-646758c888-mwgvw\" (UID: \"c6f7ddb6-5178-42be-98ca-d39830d57dbc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.783749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fnzf\" (UniqueName: \"kubernetes.io/projected/c6f7ddb6-5178-42be-98ca-d39830d57dbc-kube-api-access-5fnzf\") pod \"nmstate-operator-646758c888-mwgvw\" (UID: \"c6f7ddb6-5178-42be-98ca-d39830d57dbc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" Jan 20 22:47:34 crc kubenswrapper[5030]: I0120 22:47:34.930183 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" Jan 20 22:47:35 crc kubenswrapper[5030]: I0120 22:47:35.181022 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mwgvw"] Jan 20 22:47:35 crc kubenswrapper[5030]: I0120 22:47:35.476026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" event={"ID":"c6f7ddb6-5178-42be-98ca-d39830d57dbc","Type":"ContainerStarted","Data":"8844a3a1c82479e28e04b4c5e940164ac1d52647eb4dca126c2b4900fb11838a"} Jan 20 22:47:38 crc kubenswrapper[5030]: I0120 22:47:38.492551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" event={"ID":"c6f7ddb6-5178-42be-98ca-d39830d57dbc","Type":"ContainerStarted","Data":"4756669ac1ffd03f22147fc22927d5b4ace1bb14fb1ff253f33f51ad532d02be"} Jan 20 22:47:38 crc kubenswrapper[5030]: I0120 22:47:38.520787 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-mwgvw" podStartSLOduration=2.202380381 podStartE2EDuration="4.52076902s" podCreationTimestamp="2026-01-20 22:47:34 +0000 UTC" firstStartedPulling="2026-01-20 22:47:35.19440197 +0000 UTC m=+727.514662258" lastFinishedPulling="2026-01-20 22:47:37.512790589 +0000 UTC m=+729.833050897" observedRunningTime="2026-01-20 22:47:38.518206337 +0000 UTC m=+730.838466625" watchObservedRunningTime="2026-01-20 22:47:38.52076902 +0000 UTC m=+730.841029308" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.543251 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-xfzjh"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.545019 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.547342 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2ng7d" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.570831 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.571652 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.573504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.586729 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-xfzjh"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.592223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.597160 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tgc5w"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.598021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.688923 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.689523 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.691720 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.691854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.702071 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-kz25p" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.707596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7dns\" (UniqueName: \"kubernetes.io/projected/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-kube-api-access-k7dns\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: \"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkswd\" (UniqueName: \"kubernetes.io/projected/795abd33-3a47-45c5-a84b-7a2a69168a13-kube-api-access-jkswd\") pod \"nmstate-metrics-54757c584b-xfzjh\" (UID: \"795abd33-3a47-45c5-a84b-7a2a69168a13\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-ovs-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-nmstate-lock\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733633 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9znhd\" (UniqueName: \"kubernetes.io/projected/c1bb67ab-63d4-45d9-95e3-696c32134f61-kube-api-access-9znhd\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733664 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: 
\"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.733686 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-dbus-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.834872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-nmstate-lock\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.834930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1420cf7f-07c1-473d-9976-b0978952519d-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.834964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9znhd\" (UniqueName: \"kubernetes.io/projected/c1bb67ab-63d4-45d9-95e3-696c32134f61-kube-api-access-9znhd\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.834981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-nmstate-lock\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.834993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: \"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1420cf7f-07c1-473d-9976-b0978952519d-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-dbus-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7dns\" (UniqueName: 
\"kubernetes.io/projected/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-kube-api-access-k7dns\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: \"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkswd\" (UniqueName: \"kubernetes.io/projected/795abd33-3a47-45c5-a84b-7a2a69168a13-kube-api-access-jkswd\") pod \"nmstate-metrics-54757c584b-xfzjh\" (UID: \"795abd33-3a47-45c5-a84b-7a2a69168a13\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cknn\" (UniqueName: \"kubernetes.io/projected/1420cf7f-07c1-473d-9976-b0978952519d-kube-api-access-4cknn\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-ovs-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-dbus-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.835563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c1bb67ab-63d4-45d9-95e3-696c32134f61-ovs-socket\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.841865 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: \"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.852779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7dns\" (UniqueName: \"kubernetes.io/projected/eb8c76be-9c0b-4240-955e-016a3f5e0ccd-kube-api-access-k7dns\") pod \"nmstate-webhook-8474b5b9d8-bkdvp\" (UID: \"eb8c76be-9c0b-4240-955e-016a3f5e0ccd\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.858326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9znhd\" (UniqueName: \"kubernetes.io/projected/c1bb67ab-63d4-45d9-95e3-696c32134f61-kube-api-access-9znhd\") pod \"nmstate-handler-tgc5w\" (UID: \"c1bb67ab-63d4-45d9-95e3-696c32134f61\") " pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.859361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jkswd\" (UniqueName: \"kubernetes.io/projected/795abd33-3a47-45c5-a84b-7a2a69168a13-kube-api-access-jkswd\") pod \"nmstate-metrics-54757c584b-xfzjh\" (UID: \"795abd33-3a47-45c5-a84b-7a2a69168a13\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.863197 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.889949 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c948d7657-5ql22"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.890771 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.891399 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.902472 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c948d7657-5ql22"] Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.923163 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.936242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1420cf7f-07c1-473d-9976-b0978952519d-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.936319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1420cf7f-07c1-473d-9976-b0978952519d-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.936377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cknn\" (UniqueName: \"kubernetes.io/projected/1420cf7f-07c1-473d-9976-b0978952519d-kube-api-access-4cknn\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.937770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1420cf7f-07c1-473d-9976-b0978952519d-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.941103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1420cf7f-07c1-473d-9976-b0978952519d-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: I0120 22:47:39.951245 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cknn\" (UniqueName: \"kubernetes.io/projected/1420cf7f-07c1-473d-9976-b0978952519d-kube-api-access-4cknn\") pod \"nmstate-console-plugin-7754f76f8b-k6zls\" (UID: \"1420cf7f-07c1-473d-9976-b0978952519d\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:39 crc kubenswrapper[5030]: W0120 22:47:39.960489 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1bb67ab_63d4_45d9_95e3_696c32134f61.slice/crio-9bc86bdf637df1ab5e413ddaeea2cc294bff4b2fc7ec8bcf77ddbc959031977f WatchSource:0}: Error finding container 9bc86bdf637df1ab5e413ddaeea2cc294bff4b2fc7ec8bcf77ddbc959031977f: Status 404 returned error can't find the container with id 9bc86bdf637df1ab5e413ddaeea2cc294bff4b2fc7ec8bcf77ddbc959031977f Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.003868 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-oauth-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-oauth-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037754 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-trusted-ca-bundle\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bc98\" (UniqueName: \"kubernetes.io/projected/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-kube-api-access-5bc98\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-service-ca\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " 
pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.037834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-oauth-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-trusted-ca-bundle\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bc98\" (UniqueName: \"kubernetes.io/projected/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-kube-api-access-5bc98\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-service-ca\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139636 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.139665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-oauth-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.140728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-oauth-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " 
pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.140920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-trusted-ca-bundle\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.141598 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-service-ca\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.142224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.143489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-oauth-config\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.144027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-console-serving-cert\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.157367 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.157454 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.158292 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp"] Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.160911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bc98\" (UniqueName: \"kubernetes.io/projected/f857e6fe-de8a-49f6-8c38-25e5bc7c7b82-kube-api-access-5bc98\") pod \"console-c948d7657-5ql22\" (UID: \"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82\") " pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: W0120 22:47:40.210947 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1420cf7f_07c1_473d_9976_b0978952519d.slice/crio-28887665f259d3b6dfe97a06c7c5732bfaa246f6c30baee3b35b2474c94c8e1f WatchSource:0}: Error finding container 28887665f259d3b6dfe97a06c7c5732bfaa246f6c30baee3b35b2474c94c8e1f: Status 404 returned error can't find the container with id 28887665f259d3b6dfe97a06c7c5732bfaa246f6c30baee3b35b2474c94c8e1f Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.213667 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls"] Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.257769 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.294364 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-xfzjh"] Jan 20 22:47:40 crc kubenswrapper[5030]: W0120 22:47:40.298465 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795abd33_3a47_45c5_a84b_7a2a69168a13.slice/crio-95bab1b99e66c3efa193ee1aaad87a51273fc6a50a0b0e1e21029ed3455500fa WatchSource:0}: Error finding container 95bab1b99e66c3efa193ee1aaad87a51273fc6a50a0b0e1e21029ed3455500fa: Status 404 returned error can't find the container with id 95bab1b99e66c3efa193ee1aaad87a51273fc6a50a0b0e1e21029ed3455500fa Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.425217 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c948d7657-5ql22"] Jan 20 22:47:40 crc kubenswrapper[5030]: W0120 22:47:40.432760 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf857e6fe_de8a_49f6_8c38_25e5bc7c7b82.slice/crio-8aeec072f9eeb1b131a933b55befc7e2bcb733ac16efb3ef1f63b712ea898cea WatchSource:0}: Error finding container 8aeec072f9eeb1b131a933b55befc7e2bcb733ac16efb3ef1f63b712ea898cea: Status 404 returned error can't find the container with id 8aeec072f9eeb1b131a933b55befc7e2bcb733ac16efb3ef1f63b712ea898cea Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.506007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" event={"ID":"795abd33-3a47-45c5-a84b-7a2a69168a13","Type":"ContainerStarted","Data":"95bab1b99e66c3efa193ee1aaad87a51273fc6a50a0b0e1e21029ed3455500fa"} Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.507452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" event={"ID":"eb8c76be-9c0b-4240-955e-016a3f5e0ccd","Type":"ContainerStarted","Data":"c9eff6d7893e698f9177671656bc7959a624d497063c393ecaef57fe69d075c2"} Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.508730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" event={"ID":"1420cf7f-07c1-473d-9976-b0978952519d","Type":"ContainerStarted","Data":"28887665f259d3b6dfe97a06c7c5732bfaa246f6c30baee3b35b2474c94c8e1f"} Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 22:47:40.509774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tgc5w" event={"ID":"c1bb67ab-63d4-45d9-95e3-696c32134f61","Type":"ContainerStarted","Data":"9bc86bdf637df1ab5e413ddaeea2cc294bff4b2fc7ec8bcf77ddbc959031977f"} Jan 20 22:47:40 crc kubenswrapper[5030]: I0120 
22:47:40.510710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c948d7657-5ql22" event={"ID":"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82","Type":"ContainerStarted","Data":"8aeec072f9eeb1b131a933b55befc7e2bcb733ac16efb3ef1f63b712ea898cea"} Jan 20 22:47:41 crc kubenswrapper[5030]: I0120 22:47:41.517166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c948d7657-5ql22" event={"ID":"f857e6fe-de8a-49f6-8c38-25e5bc7c7b82","Type":"ContainerStarted","Data":"d0af166ff3af22f1211b5f6ca7cb1ddeec33365ad9bbf1bccb49faaf994e2816"} Jan 20 22:47:41 crc kubenswrapper[5030]: I0120 22:47:41.536549 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c948d7657-5ql22" podStartSLOduration=2.5365283720000003 podStartE2EDuration="2.536528372s" podCreationTimestamp="2026-01-20 22:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:47:41.532521504 +0000 UTC m=+733.852781792" watchObservedRunningTime="2026-01-20 22:47:41.536528372 +0000 UTC m=+733.856788660" Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.528036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" event={"ID":"795abd33-3a47-45c5-a84b-7a2a69168a13","Type":"ContainerStarted","Data":"36be4de4cbd89f0f578661de8f3658a48f1f5fd4bd51065d9f041fbea08ab7ab"} Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.530602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" event={"ID":"eb8c76be-9c0b-4240-955e-016a3f5e0ccd","Type":"ContainerStarted","Data":"0c498635ef17b635461caae1aac3c3bf0ded09168b36dc085e53175ddcd7b86d"} Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.530836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.532391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" event={"ID":"1420cf7f-07c1-473d-9976-b0978952519d","Type":"ContainerStarted","Data":"108dfce07fea075fe77ac48de98382874f60ea5d53290dbd65ef15787d97707d"} Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.536134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tgc5w" event={"ID":"c1bb67ab-63d4-45d9-95e3-696c32134f61","Type":"ContainerStarted","Data":"50065ff5ce7af46c10e3388b50bdd759afbec82c3e4156e809fccc629885e59a"} Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.550678 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" podStartSLOduration=2.105966468 podStartE2EDuration="4.550662309s" podCreationTimestamp="2026-01-20 22:47:39 +0000 UTC" firstStartedPulling="2026-01-20 22:47:40.173867329 +0000 UTC m=+732.494127617" lastFinishedPulling="2026-01-20 22:47:42.61856314 +0000 UTC m=+734.938823458" observedRunningTime="2026-01-20 22:47:43.547194014 +0000 UTC m=+735.867454362" watchObservedRunningTime="2026-01-20 22:47:43.550662309 +0000 UTC m=+735.870922597" Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.581552 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tgc5w" podStartSLOduration=1.925918996 podStartE2EDuration="4.581468651s" 
podCreationTimestamp="2026-01-20 22:47:39 +0000 UTC" firstStartedPulling="2026-01-20 22:47:39.963198199 +0000 UTC m=+732.283458477" lastFinishedPulling="2026-01-20 22:47:42.618747804 +0000 UTC m=+734.939008132" observedRunningTime="2026-01-20 22:47:43.567245793 +0000 UTC m=+735.887506081" watchObservedRunningTime="2026-01-20 22:47:43.581468651 +0000 UTC m=+735.901728989" Jan 20 22:47:43 crc kubenswrapper[5030]: I0120 22:47:43.591829 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-k6zls" podStartSLOduration=2.186627076 podStartE2EDuration="4.591798153s" podCreationTimestamp="2026-01-20 22:47:39 +0000 UTC" firstStartedPulling="2026-01-20 22:47:40.212967103 +0000 UTC m=+732.533227391" lastFinishedPulling="2026-01-20 22:47:42.61813814 +0000 UTC m=+734.938398468" observedRunningTime="2026-01-20 22:47:43.582169018 +0000 UTC m=+735.902429316" watchObservedRunningTime="2026-01-20 22:47:43.591798153 +0000 UTC m=+735.912058481" Jan 20 22:47:44 crc kubenswrapper[5030]: I0120 22:47:44.542275 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:45 crc kubenswrapper[5030]: I0120 22:47:45.553092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" event={"ID":"795abd33-3a47-45c5-a84b-7a2a69168a13","Type":"ContainerStarted","Data":"6042320e9b63d6d19d55d35679aef27ca794d73a2476643a0eae90c6570d521f"} Jan 20 22:47:45 crc kubenswrapper[5030]: I0120 22:47:45.580350 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-xfzjh" podStartSLOduration=1.756223446 podStartE2EDuration="6.580324784s" podCreationTimestamp="2026-01-20 22:47:39 +0000 UTC" firstStartedPulling="2026-01-20 22:47:40.301119764 +0000 UTC m=+732.621380072" lastFinishedPulling="2026-01-20 22:47:45.125221112 +0000 UTC m=+737.445481410" observedRunningTime="2026-01-20 22:47:45.575554378 +0000 UTC m=+737.895814726" watchObservedRunningTime="2026-01-20 22:47:45.580324784 +0000 UTC m=+737.900585102" Jan 20 22:47:49 crc kubenswrapper[5030]: I0120 22:47:49.957114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tgc5w" Jan 20 22:47:50 crc kubenswrapper[5030]: I0120 22:47:50.258372 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:50 crc kubenswrapper[5030]: I0120 22:47:50.258451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:50 crc kubenswrapper[5030]: I0120 22:47:50.267718 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:50 crc kubenswrapper[5030]: I0120 22:47:50.597865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c948d7657-5ql22" Jan 20 22:47:50 crc kubenswrapper[5030]: I0120 22:47:50.686073 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:47:59 crc kubenswrapper[5030]: I0120 22:47:59.902491 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-bkdvp" Jan 20 22:48:06 crc kubenswrapper[5030]: I0120 22:48:06.018698 5030 dynamic_cafile_content.go:123] "Loaded a new CA 
Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 22:48:10 crc kubenswrapper[5030]: I0120 22:48:10.157430 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:48:10 crc kubenswrapper[5030]: I0120 22:48:10.158072 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.129246 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm"] Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.131108 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.134134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.140092 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm"] Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.223521 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.223592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.223643 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nrq\" (UniqueName: \"kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.325250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nrq\" (UniqueName: \"kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.325395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.325502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.325934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.326138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.356567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nrq\" (UniqueName: \"kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.447813 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.663849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm"] Jan 20 22:48:14 crc kubenswrapper[5030]: W0120 22:48:14.673328 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374f7b93_b687_492f_8b68_c3b20c6566e6.slice/crio-59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4 WatchSource:0}: Error finding container 59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4: Status 404 returned error can't find the container with id 59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4 Jan 20 22:48:14 crc kubenswrapper[5030]: I0120 22:48:14.780407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" event={"ID":"374f7b93-b687-492f-8b68-c3b20c6566e6","Type":"ContainerStarted","Data":"59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4"} Jan 20 22:48:15 crc kubenswrapper[5030]: I0120 22:48:15.747861 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vkrcb" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerName="console" containerID="cri-o://3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b" gracePeriod=15 Jan 20 22:48:15 crc kubenswrapper[5030]: I0120 22:48:15.790126 5030 generic.go:334] "Generic (PLEG): container finished" podID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerID="ef640f906dbaa76beab7a1716314d8aaa49658be0d369fda671e8f07ba30aa04" exitCode=0 Jan 20 22:48:15 crc kubenswrapper[5030]: I0120 22:48:15.790398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" event={"ID":"374f7b93-b687-492f-8b68-c3b20c6566e6","Type":"ContainerDied","Data":"ef640f906dbaa76beab7a1716314d8aaa49658be0d369fda671e8f07ba30aa04"} Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.150118 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vkrcb_6d43e77d-5f9a-4b29-959f-190fa60d8807/console/0.log" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.150216 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.249669 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtp9h\" (UniqueName: \"kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.249872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.249905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.249964 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.250029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.250063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.250093 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert\") pod \"6d43e77d-5f9a-4b29-959f-190fa60d8807\" (UID: \"6d43e77d-5f9a-4b29-959f-190fa60d8807\") " Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.251238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config" (OuterVolumeSpecName: "console-config") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.251498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca" (OuterVolumeSpecName: "service-ca") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.251521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.252093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.255913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.257168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.259112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h" (OuterVolumeSpecName: "kube-api-access-rtp9h") pod "6d43e77d-5f9a-4b29-959f-190fa60d8807" (UID: "6d43e77d-5f9a-4b29-959f-190fa60d8807"). InnerVolumeSpecName "kube-api-access-rtp9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351570 5030 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351669 5030 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351688 5030 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351705 5030 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351721 5030 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d43e77d-5f9a-4b29-959f-190fa60d8807-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351738 5030 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d43e77d-5f9a-4b29-959f-190fa60d8807-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.351755 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtp9h\" (UniqueName: \"kubernetes.io/projected/6d43e77d-5f9a-4b29-959f-190fa60d8807-kube-api-access-rtp9h\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.462033 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:16 crc kubenswrapper[5030]: E0120 22:48:16.462396 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerName="console" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.462428 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerName="console" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.462656 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerName="console" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.463925 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.481016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.554325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsvwc\" (UniqueName: \"kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.554470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.554537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.656666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsvwc\" (UniqueName: \"kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.656784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.656837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.657761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.657842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.684398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wsvwc\" (UniqueName: \"kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc\") pod \"redhat-operators-4svzh\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.791508 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797153 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vkrcb_6d43e77d-5f9a-4b29-959f-190fa60d8807/console/0.log" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797279 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d43e77d-5f9a-4b29-959f-190fa60d8807" containerID="3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b" exitCode=2 Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797342 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vkrcb" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vkrcb" event={"ID":"6d43e77d-5f9a-4b29-959f-190fa60d8807","Type":"ContainerDied","Data":"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b"} Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vkrcb" event={"ID":"6d43e77d-5f9a-4b29-959f-190fa60d8807","Type":"ContainerDied","Data":"5765b1b3f27896c79d26d45e2b414b5956e6e668caadda9586bf610e58aa8d18"} Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.797698 5030 scope.go:117] "RemoveContainer" containerID="3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.820235 5030 scope.go:117] "RemoveContainer" containerID="3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b" Jan 20 22:48:16 crc kubenswrapper[5030]: E0120 22:48:16.821093 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b\": container with ID starting with 3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b not found: ID does not exist" containerID="3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.821137 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b"} err="failed to get container status \"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b\": rpc error: code = NotFound desc = could not find container \"3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b\": container with ID starting with 3380a5f8fa6234a24777c5a79ef63ce7322a44640be049d3222216bdfeb28a0b not found: ID does not exist" Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.830110 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:48:16 crc kubenswrapper[5030]: I0120 22:48:16.831233 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vkrcb"] Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 
22:48:17.018884 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.805839 5030 generic.go:334] "Generic (PLEG): container finished" podID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerID="511fae4c547cf98ed6f03319217eb7844557daa0df007ccbda74603d727145c7" exitCode=0 Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.805942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" event={"ID":"374f7b93-b687-492f-8b68-c3b20c6566e6","Type":"ContainerDied","Data":"511fae4c547cf98ed6f03319217eb7844557daa0df007ccbda74603d727145c7"} Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.808606 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerID="59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f" exitCode=0 Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.808669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerDied","Data":"59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f"} Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.808720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerStarted","Data":"3c946ab49c593b75f4c4285459959fcdf973be10be8a24d54f2a8108e1e20242"} Jan 20 22:48:17 crc kubenswrapper[5030]: I0120 22:48:17.969251 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d43e77d-5f9a-4b29-959f-190fa60d8807" path="/var/lib/kubelet/pods/6d43e77d-5f9a-4b29-959f-190fa60d8807/volumes" Jan 20 22:48:18 crc kubenswrapper[5030]: I0120 22:48:18.819017 5030 generic.go:334] "Generic (PLEG): container finished" podID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerID="ed0db88c97257f61aca10598a82a169e629e234db569a57b1eb4765d28c80a7f" exitCode=0 Jan 20 22:48:18 crc kubenswrapper[5030]: I0120 22:48:18.819063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" event={"ID":"374f7b93-b687-492f-8b68-c3b20c6566e6","Type":"ContainerDied","Data":"ed0db88c97257f61aca10598a82a169e629e234db569a57b1eb4765d28c80a7f"} Jan 20 22:48:19 crc kubenswrapper[5030]: I0120 22:48:19.826940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerStarted","Data":"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa"} Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.166117 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.302385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle\") pod \"374f7b93-b687-492f-8b68-c3b20c6566e6\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.302543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util\") pod \"374f7b93-b687-492f-8b68-c3b20c6566e6\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.302597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nrq\" (UniqueName: \"kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq\") pod \"374f7b93-b687-492f-8b68-c3b20c6566e6\" (UID: \"374f7b93-b687-492f-8b68-c3b20c6566e6\") " Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.303489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle" (OuterVolumeSpecName: "bundle") pod "374f7b93-b687-492f-8b68-c3b20c6566e6" (UID: "374f7b93-b687-492f-8b68-c3b20c6566e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.313035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq" (OuterVolumeSpecName: "kube-api-access-k5nrq") pod "374f7b93-b687-492f-8b68-c3b20c6566e6" (UID: "374f7b93-b687-492f-8b68-c3b20c6566e6"). InnerVolumeSpecName "kube-api-access-k5nrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.404720 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.404843 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5nrq\" (UniqueName: \"kubernetes.io/projected/374f7b93-b687-492f-8b68-c3b20c6566e6-kube-api-access-k5nrq\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.836235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" event={"ID":"374f7b93-b687-492f-8b68-c3b20c6566e6","Type":"ContainerDied","Data":"59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4"} Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.836284 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm" Jan 20 22:48:20 crc kubenswrapper[5030]: I0120 22:48:20.836297 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59248672db6bdd757e5a477c9aaa4dc6a3a04e385472e714de3ceaff870c86e4" Jan 20 22:48:22 crc kubenswrapper[5030]: I0120 22:48:22.175365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util" (OuterVolumeSpecName: "util") pod "374f7b93-b687-492f-8b68-c3b20c6566e6" (UID: "374f7b93-b687-492f-8b68-c3b20c6566e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:22 crc kubenswrapper[5030]: I0120 22:48:22.231197 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/374f7b93-b687-492f-8b68-c3b20c6566e6-util\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:22 crc kubenswrapper[5030]: I0120 22:48:22.852923 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerID="78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa" exitCode=0 Jan 20 22:48:22 crc kubenswrapper[5030]: I0120 22:48:22.853159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerDied","Data":"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa"} Jan 20 22:48:24 crc kubenswrapper[5030]: I0120 22:48:24.875515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerStarted","Data":"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6"} Jan 20 22:48:24 crc kubenswrapper[5030]: I0120 22:48:24.902599 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4svzh" podStartSLOduration=2.788478354 podStartE2EDuration="8.902575733s" podCreationTimestamp="2026-01-20 22:48:16 +0000 UTC" firstStartedPulling="2026-01-20 22:48:17.809561503 +0000 UTC m=+770.129821801" lastFinishedPulling="2026-01-20 22:48:23.923658882 +0000 UTC m=+776.243919180" observedRunningTime="2026-01-20 22:48:24.901099577 +0000 UTC m=+777.221359875" watchObservedRunningTime="2026-01-20 22:48:24.902575733 +0000 UTC m=+777.222836041" Jan 20 22:48:26 crc kubenswrapper[5030]: I0120 22:48:26.792427 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:26 crc kubenswrapper[5030]: I0120 22:48:26.793351 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:27 crc kubenswrapper[5030]: I0120 22:48:27.839010 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4svzh" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="registry-server" probeResult="failure" output=< Jan 20 22:48:27 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 22:48:27 crc kubenswrapper[5030]: > Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.043436 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:28 crc kubenswrapper[5030]: E0120 22:48:28.043997 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="pull" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.044011 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="pull" Jan 20 22:48:28 crc kubenswrapper[5030]: E0120 22:48:28.044026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="util" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.044033 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="util" Jan 20 22:48:28 crc kubenswrapper[5030]: E0120 22:48:28.044042 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="extract" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.044047 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="extract" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.044139 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" containerName="extract" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.044796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.055998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.208471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.208543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.208583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgxmx\" (UniqueName: \"kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.309863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.309903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities\") pod \"certified-operators-nkmhb\" 
(UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.309929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgxmx\" (UniqueName: \"kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.310429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.311000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.334345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgxmx\" (UniqueName: \"kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx\") pod \"certified-operators-nkmhb\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.367105 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.616201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:28 crc kubenswrapper[5030]: I0120 22:48:28.897055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerStarted","Data":"3e586adc9a97c014d0940de6422560f206548d9076b1fb105e1ab67f0215f580"} Jan 20 22:48:29 crc kubenswrapper[5030]: I0120 22:48:29.904340 5030 generic.go:334] "Generic (PLEG): container finished" podID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerID="b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c" exitCode=0 Jan 20 22:48:29 crc kubenswrapper[5030]: I0120 22:48:29.904458 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerDied","Data":"b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c"} Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.901944 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk"] Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.903755 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.907317 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.907373 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fhnwh" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.907421 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.907995 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.908350 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 20 22:48:30 crc kubenswrapper[5030]: I0120 22:48:30.985084 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk"] Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.043954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-webhook-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.044082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.044133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8s9w\" (UniqueName: \"kubernetes.io/projected/17d40a45-26fa-4c9d-91df-767448cd1cf6-kube-api-access-d8s9w\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.145023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-webhook-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.145136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.145175 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8s9w\" (UniqueName: \"kubernetes.io/projected/17d40a45-26fa-4c9d-91df-767448cd1cf6-kube-api-access-d8s9w\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.153182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-webhook-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.158165 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw"] Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.159072 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.160100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17d40a45-26fa-4c9d-91df-767448cd1cf6-apiservice-cert\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.162369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.162700 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xcdcs" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.162886 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.167317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8s9w\" (UniqueName: \"kubernetes.io/projected/17d40a45-26fa-4c9d-91df-767448cd1cf6-kube-api-access-d8s9w\") pod \"metallb-operator-controller-manager-6fd7497c88-v4gzk\" (UID: \"17d40a45-26fa-4c9d-91df-767448cd1cf6\") " pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.178003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw"] Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.225777 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.350366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-apiservice-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.350726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msr6s\" (UniqueName: \"kubernetes.io/projected/bfce14b1-de45-4c8d-852f-044ee0214880-kube-api-access-msr6s\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.350764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-webhook-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.428257 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk"] Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.452308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-apiservice-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.452389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msr6s\" (UniqueName: \"kubernetes.io/projected/bfce14b1-de45-4c8d-852f-044ee0214880-kube-api-access-msr6s\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.452440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-webhook-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.457973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-webhook-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.458493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/bfce14b1-de45-4c8d-852f-044ee0214880-apiservice-cert\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.472151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msr6s\" (UniqueName: \"kubernetes.io/projected/bfce14b1-de45-4c8d-852f-044ee0214880-kube-api-access-msr6s\") pod \"metallb-operator-webhook-server-74877969b9-vf5sw\" (UID: \"bfce14b1-de45-4c8d-852f-044ee0214880\") " pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.495229 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.688188 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw"] Jan 20 22:48:31 crc kubenswrapper[5030]: W0120 22:48:31.696278 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfce14b1_de45_4c8d_852f_044ee0214880.slice/crio-86562899d49f78c471ce6aad8f0850617c6e6cc42d8e6cb9bd11f00137077e79 WatchSource:0}: Error finding container 86562899d49f78c471ce6aad8f0850617c6e6cc42d8e6cb9bd11f00137077e79: Status 404 returned error can't find the container with id 86562899d49f78c471ce6aad8f0850617c6e6cc42d8e6cb9bd11f00137077e79 Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.926128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" event={"ID":"bfce14b1-de45-4c8d-852f-044ee0214880","Type":"ContainerStarted","Data":"86562899d49f78c471ce6aad8f0850617c6e6cc42d8e6cb9bd11f00137077e79"} Jan 20 22:48:31 crc kubenswrapper[5030]: I0120 22:48:31.927002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" event={"ID":"17d40a45-26fa-4c9d-91df-767448cd1cf6","Type":"ContainerStarted","Data":"916858bfe8ec260ed1b787b375201dadf041d28f92fd035e515ffb0d44aee9fe"} Jan 20 22:48:32 crc kubenswrapper[5030]: I0120 22:48:32.932882 5030 generic.go:334] "Generic (PLEG): container finished" podID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerID="ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48" exitCode=0 Jan 20 22:48:32 crc kubenswrapper[5030]: I0120 22:48:32.932944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerDied","Data":"ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48"} Jan 20 22:48:35 crc kubenswrapper[5030]: I0120 22:48:35.953173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerStarted","Data":"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc"} Jan 20 22:48:35 crc kubenswrapper[5030]: I0120 22:48:35.971362 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nkmhb" podStartSLOduration=4.488051679 podStartE2EDuration="7.971348462s" podCreationTimestamp="2026-01-20 22:48:28 +0000 UTC" 
firstStartedPulling="2026-01-20 22:48:29.906546035 +0000 UTC m=+782.226806343" lastFinishedPulling="2026-01-20 22:48:33.389842838 +0000 UTC m=+785.710103126" observedRunningTime="2026-01-20 22:48:35.97125111 +0000 UTC m=+788.291511398" watchObservedRunningTime="2026-01-20 22:48:35.971348462 +0000 UTC m=+788.291608750" Jan 20 22:48:36 crc kubenswrapper[5030]: I0120 22:48:36.833018 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:36 crc kubenswrapper[5030]: I0120 22:48:36.876905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:36 crc kubenswrapper[5030]: I0120 22:48:36.959897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" event={"ID":"17d40a45-26fa-4c9d-91df-767448cd1cf6","Type":"ContainerStarted","Data":"d778de0871048841726b0640bc26157f8266e7bfbe3affb57290d04bea77b5f2"} Jan 20 22:48:36 crc kubenswrapper[5030]: I0120 22:48:36.988296 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" podStartSLOduration=1.68538287 podStartE2EDuration="6.988262876s" podCreationTimestamp="2026-01-20 22:48:30 +0000 UTC" firstStartedPulling="2026-01-20 22:48:31.436607805 +0000 UTC m=+783.756868093" lastFinishedPulling="2026-01-20 22:48:36.739487811 +0000 UTC m=+789.059748099" observedRunningTime="2026-01-20 22:48:36.985890587 +0000 UTC m=+789.306150875" watchObservedRunningTime="2026-01-20 22:48:36.988262876 +0000 UTC m=+789.308523164" Jan 20 22:48:37 crc kubenswrapper[5030]: I0120 22:48:37.972257 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:48:38 crc kubenswrapper[5030]: I0120 22:48:38.367797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:38 crc kubenswrapper[5030]: I0120 22:48:38.369590 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:38 crc kubenswrapper[5030]: I0120 22:48:38.420401 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.033719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.033954 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4svzh" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="registry-server" containerID="cri-o://65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6" gracePeriod=2 Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.917289 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.978754 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerID="65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6" exitCode=0 Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.978841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerDied","Data":"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6"} Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.978865 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4svzh" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.978894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4svzh" event={"ID":"ddc6e15c-34a6-422b-a0ed-a94dec139c51","Type":"ContainerDied","Data":"3c946ab49c593b75f4c4285459959fcdf973be10be8a24d54f2a8108e1e20242"} Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.978919 5030 scope.go:117] "RemoveContainer" containerID="65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.988863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsvwc\" (UniqueName: \"kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc\") pod \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.988915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content\") pod \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.989013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities\") pod \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\" (UID: \"ddc6e15c-34a6-422b-a0ed-a94dec139c51\") " Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.989782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities" (OuterVolumeSpecName: "utilities") pod "ddc6e15c-34a6-422b-a0ed-a94dec139c51" (UID: "ddc6e15c-34a6-422b-a0ed-a94dec139c51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.991563 5030 scope.go:117] "RemoveContainer" containerID="78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa" Jan 20 22:48:39 crc kubenswrapper[5030]: I0120 22:48:39.996306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc" (OuterVolumeSpecName: "kube-api-access-wsvwc") pod "ddc6e15c-34a6-422b-a0ed-a94dec139c51" (UID: "ddc6e15c-34a6-422b-a0ed-a94dec139c51"). InnerVolumeSpecName "kube-api-access-wsvwc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.019451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.030501 5030 scope.go:117] "RemoveContainer" containerID="59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.045881 5030 scope.go:117] "RemoveContainer" containerID="65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6" Jan 20 22:48:40 crc kubenswrapper[5030]: E0120 22:48:40.046162 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6\": container with ID starting with 65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6 not found: ID does not exist" containerID="65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.046190 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6"} err="failed to get container status \"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6\": rpc error: code = NotFound desc = could not find container \"65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6\": container with ID starting with 65da798d5a5ba39bf07d47e8f6e2c56298595c0898afb0b38fec7850587c90e6 not found: ID does not exist" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.046211 5030 scope.go:117] "RemoveContainer" containerID="78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa" Jan 20 22:48:40 crc kubenswrapper[5030]: E0120 22:48:40.046386 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa\": container with ID starting with 78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa not found: ID does not exist" containerID="78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.046405 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa"} err="failed to get container status \"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa\": rpc error: code = NotFound desc = could not find container \"78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa\": container with ID starting with 78393b03442b929293314c66bb9ee72c2dc58991ffe2efec9c3fae4024e0b7fa not found: ID does not exist" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.046419 5030 scope.go:117] "RemoveContainer" containerID="59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f" Jan 20 22:48:40 crc kubenswrapper[5030]: E0120 22:48:40.048130 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f\": container with ID starting with 59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f not found: ID does not exist" containerID="59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f" Jan 20 
22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.048176 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f"} err="failed to get container status \"59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f\": rpc error: code = NotFound desc = could not find container \"59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f\": container with ID starting with 59b67571a3bac9809158faebd21aa68626b57797281b8d778982fd7d42cb3e3f not found: ID does not exist" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.089940 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.089971 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsvwc\" (UniqueName: \"kubernetes.io/projected/ddc6e15c-34a6-422b-a0ed-a94dec139c51-kube-api-access-wsvwc\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.116668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddc6e15c-34a6-422b-a0ed-a94dec139c51" (UID: "ddc6e15c-34a6-422b-a0ed-a94dec139c51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.157824 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.158151 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.158212 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.158887 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.158950 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b" gracePeriod=600 Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.191370 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ddc6e15c-34a6-422b-a0ed-a94dec139c51-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.309707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.313986 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4svzh"] Jan 20 22:48:40 crc kubenswrapper[5030]: I0120 22:48:40.841505 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:41 crc kubenswrapper[5030]: I0120 22:48:41.967946 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" path="/var/lib/kubelet/pods/ddc6e15c-34a6-422b-a0ed-a94dec139c51/volumes" Jan 20 22:48:42 crc kubenswrapper[5030]: I0120 22:48:42.002947 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nkmhb" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="registry-server" containerID="cri-o://bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc" gracePeriod=2 Jan 20 22:48:44 crc kubenswrapper[5030]: I0120 22:48:44.536015 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b" exitCode=0 Jan 20 22:48:44 crc kubenswrapper[5030]: I0120 22:48:44.536104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b"} Jan 20 22:48:44 crc kubenswrapper[5030]: I0120 22:48:44.536382 5030 scope.go:117] "RemoveContainer" containerID="5fb69d7c86677af730b5a8cb887d6364e5648de419687c9cc1175e8da6193289" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.411043 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.480301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content\") pod \"e5ebb2a1-9973-41c8-972d-e474f74155d0\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.480399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgxmx\" (UniqueName: \"kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx\") pod \"e5ebb2a1-9973-41c8-972d-e474f74155d0\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.480472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities\") pod \"e5ebb2a1-9973-41c8-972d-e474f74155d0\" (UID: \"e5ebb2a1-9973-41c8-972d-e474f74155d0\") " Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.481398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities" (OuterVolumeSpecName: "utilities") pod "e5ebb2a1-9973-41c8-972d-e474f74155d0" (UID: "e5ebb2a1-9973-41c8-972d-e474f74155d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.485577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx" (OuterVolumeSpecName: "kube-api-access-rgxmx") pod "e5ebb2a1-9973-41c8-972d-e474f74155d0" (UID: "e5ebb2a1-9973-41c8-972d-e474f74155d0"). InnerVolumeSpecName "kube-api-access-rgxmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.533041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5ebb2a1-9973-41c8-972d-e474f74155d0" (UID: "e5ebb2a1-9973-41c8-972d-e474f74155d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.543774 5030 generic.go:334] "Generic (PLEG): container finished" podID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerID="bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc" exitCode=0 Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.543840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerDied","Data":"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc"} Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.543902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nkmhb" event={"ID":"e5ebb2a1-9973-41c8-972d-e474f74155d0","Type":"ContainerDied","Data":"3e586adc9a97c014d0940de6422560f206548d9076b1fb105e1ab67f0215f580"} Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.543929 5030 scope.go:117] "RemoveContainer" containerID="bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.543849 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nkmhb" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.546994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f"} Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.561479 5030 scope.go:117] "RemoveContainer" containerID="ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.580464 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.582297 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.582312 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgxmx\" (UniqueName: \"kubernetes.io/projected/e5ebb2a1-9973-41c8-972d-e474f74155d0-kube-api-access-rgxmx\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.582322 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ebb2a1-9973-41c8-972d-e474f74155d0-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.584257 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nkmhb"] Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.588983 5030 scope.go:117] "RemoveContainer" containerID="b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.602837 5030 scope.go:117] "RemoveContainer" containerID="bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc" Jan 20 22:48:45 crc kubenswrapper[5030]: E0120 22:48:45.603259 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc\": container with ID starting with bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc not found: ID does not exist" containerID="bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.603299 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc"} err="failed to get container status \"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc\": rpc error: code = NotFound desc = could not find container \"bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc\": container with ID starting with bf5d0c6995dd2b250a82c7f96e7407ae752f57bab3639882c0f9a068846b1fdc not found: ID does not exist" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.603327 5030 scope.go:117] "RemoveContainer" containerID="ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48" Jan 20 22:48:45 crc kubenswrapper[5030]: E0120 22:48:45.603646 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48\": container with ID starting with ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48 not found: ID does not exist" containerID="ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.603673 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48"} err="failed to get container status \"ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48\": rpc error: code = NotFound desc = could not find container \"ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48\": container with ID starting with ba8276c3f73a2c8204f874aa3fded04ac5de0c8a3d3bfd3e13a91150ef5cde48 not found: ID does not exist" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.603691 5030 scope.go:117] "RemoveContainer" containerID="b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c" Jan 20 22:48:45 crc kubenswrapper[5030]: E0120 22:48:45.603893 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c\": container with ID starting with b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c not found: ID does not exist" containerID="b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.603922 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c"} err="failed to get container status \"b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c\": rpc error: code = NotFound desc = could not find container \"b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c\": container with ID starting with b58bcfb1c53618da1973e4d35cad282cd072543027d85e3bf670ab11a43d421c not found: ID does not exist" Jan 20 22:48:45 crc kubenswrapper[5030]: I0120 22:48:45.974224 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" 
path="/var/lib/kubelet/pods/e5ebb2a1-9973-41c8-972d-e474f74155d0/volumes" Jan 20 22:48:49 crc kubenswrapper[5030]: I0120 22:48:49.573726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" event={"ID":"bfce14b1-de45-4c8d-852f-044ee0214880","Type":"ContainerStarted","Data":"ef97995aa7903c5557c0ed388468dfee6d30ec1ee3ba9b1f9711c4dee9601ffa"} Jan 20 22:48:49 crc kubenswrapper[5030]: I0120 22:48:49.574359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:48:49 crc kubenswrapper[5030]: I0120 22:48:49.596708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" podStartSLOduration=1.019867218 podStartE2EDuration="18.596680128s" podCreationTimestamp="2026-01-20 22:48:31 +0000 UTC" firstStartedPulling="2026-01-20 22:48:31.700388714 +0000 UTC m=+784.020649002" lastFinishedPulling="2026-01-20 22:48:49.277201584 +0000 UTC m=+801.597461912" observedRunningTime="2026-01-20 22:48:49.595121949 +0000 UTC m=+801.915382317" watchObservedRunningTime="2026-01-20 22:48:49.596680128 +0000 UTC m=+801.916940456" Jan 20 22:49:01 crc kubenswrapper[5030]: I0120 22:49:01.501843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74877969b9-vf5sw" Jan 20 22:49:11 crc kubenswrapper[5030]: I0120 22:49:11.229920 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6fd7497c88-v4gzk" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111113 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cdvkw"] Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111337 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111366 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="extract-utilities" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111388 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="extract-utilities" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111397 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="extract-content" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111403 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="extract-content" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111417 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="extract-utilities" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111423 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="extract-utilities" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.111433 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="extract-content" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111439 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="extract-content" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111540 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ebb2a1-9973-41c8-972d-e474f74155d0" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.111550 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc6e15c-34a6-422b-a0ed-a94dec139c51" containerName="registry-server" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.113317 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.115429 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.115870 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-m9d7g" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.116445 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.136042 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.137939 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.143716 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.146849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.204716 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b2ttl"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.205466 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.207784 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.207923 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.208333 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k48dp" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.211043 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.218578 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-9npwj"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.219355 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.223797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.232176 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-9npwj"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-startup\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzx9\" (UniqueName: \"kubernetes.io/projected/d655cbb5-1c38-4c7c-b60c-13abdf46828b-kube-api-access-wxzx9\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics-certs\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-reloader\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-sockets\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/aab7d871-bc62-4186-898d-e80a2905bc64-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-conf\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.243908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpj56\" (UniqueName: \"kubernetes.io/projected/aab7d871-bc62-4186-898d-e80a2905bc64-kube-api-access-tpj56\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-conf\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345581 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-metrics-certs\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpj56\" (UniqueName: \"kubernetes.io/projected/aab7d871-bc62-4186-898d-e80a2905bc64-kube-api-access-tpj56\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-startup\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzx9\" (UniqueName: 
\"kubernetes.io/projected/d655cbb5-1c38-4c7c-b60c-13abdf46828b-kube-api-access-wxzx9\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics-certs\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-reloader\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bbacd039-a930-489a-9aca-1fac0748973b-metallb-excludel2\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-sockets\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab7d871-bc62-4186-898d-e80a2905bc64-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvc8p\" (UniqueName: \"kubernetes.io/projected/bbacd039-a930-489a-9aca-1fac0748973b-kube-api-access-hvc8p\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hh5k\" (UniqueName: \"kubernetes.io/projected/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-kube-api-access-2hh5k\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-cert\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics\") pod \"frr-k8s-cdvkw\" (UID: 
\"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.345980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-metrics-certs\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.346059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-conf\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.346496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-reloader\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.346649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.347030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-sockets\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.347395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d655cbb5-1c38-4c7c-b60c-13abdf46828b-frr-startup\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.351715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d655cbb5-1c38-4c7c-b60c-13abdf46828b-metrics-certs\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.352021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aab7d871-bc62-4186-898d-e80a2905bc64-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.363436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzx9\" (UniqueName: \"kubernetes.io/projected/d655cbb5-1c38-4c7c-b60c-13abdf46828b-kube-api-access-wxzx9\") pod \"frr-k8s-cdvkw\" (UID: \"d655cbb5-1c38-4c7c-b60c-13abdf46828b\") " pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.365219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpj56\" (UniqueName: 
\"kubernetes.io/projected/aab7d871-bc62-4186-898d-e80a2905bc64-kube-api-access-tpj56\") pod \"frr-k8s-webhook-server-7df86c4f6c-lfp28\" (UID: \"aab7d871-bc62-4186-898d-e80a2905bc64\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.429042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.447663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.447757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-metrics-certs\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.447760 5030 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.447970 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist podName:bbacd039-a930-489a-9aca-1fac0748973b nodeName:}" failed. No retries permitted until 2026-01-20 22:49:12.947947534 +0000 UTC m=+825.268207822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist") pod "speaker-b2ttl" (UID: "bbacd039-a930-489a-9aca-1fac0748973b") : secret "metallb-memberlist" not found Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.447854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bbacd039-a930-489a-9aca-1fac0748973b-metallb-excludel2\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.448309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvc8p\" (UniqueName: \"kubernetes.io/projected/bbacd039-a930-489a-9aca-1fac0748973b-kube-api-access-hvc8p\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.448395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hh5k\" (UniqueName: \"kubernetes.io/projected/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-kube-api-access-2hh5k\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.448462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-cert\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.448570 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-metrics-certs\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.449047 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bbacd039-a930-489a-9aca-1fac0748973b-metallb-excludel2\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.450868 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.453073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-metrics-certs\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.454296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-metrics-certs\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.457013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-cert\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.468912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvc8p\" (UniqueName: \"kubernetes.io/projected/bbacd039-a930-489a-9aca-1fac0748973b-kube-api-access-hvc8p\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.469991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hh5k\" (UniqueName: \"kubernetes.io/projected/c2b9b98a-8ac1-4b34-bf47-b04f6e498149-kube-api-access-2hh5k\") pod \"controller-6968d8fdc4-9npwj\" (UID: \"c2b9b98a-8ac1-4b34-bf47-b04f6e498149\") " pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.531506 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.720378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"e437997df85172f94da101b68d601161ffdcdb2547150f5582f28c0cf70fd4a8"} Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.874655 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28"] Jan 20 22:49:12 crc kubenswrapper[5030]: I0120 22:49:12.956138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.956335 5030 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 20 22:49:12 crc kubenswrapper[5030]: E0120 22:49:12.956418 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist podName:bbacd039-a930-489a-9aca-1fac0748973b nodeName:}" failed. No retries permitted until 2026-01-20 22:49:13.956395996 +0000 UTC m=+826.276656294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist") pod "speaker-b2ttl" (UID: "bbacd039-a930-489a-9aca-1fac0748973b") : secret "metallb-memberlist" not found Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.008825 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-9npwj"] Jan 20 22:49:13 crc kubenswrapper[5030]: W0120 22:49:13.011510 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b9b98a_8ac1_4b34_bf47_b04f6e498149.slice/crio-babd523e8cd28bfdc1125660d49b4e98c34dd170dd32cd7cc3e79be36c667edf WatchSource:0}: Error finding container babd523e8cd28bfdc1125660d49b4e98c34dd170dd32cd7cc3e79be36c667edf: Status 404 returned error can't find the container with id babd523e8cd28bfdc1125660d49b4e98c34dd170dd32cd7cc3e79be36c667edf Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.730466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9npwj" event={"ID":"c2b9b98a-8ac1-4b34-bf47-b04f6e498149","Type":"ContainerStarted","Data":"0d272fbb82cf9aee30bb1394182b37f2a65db35c3f1590dba3aa31cee749a66e"} Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.730834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9npwj" event={"ID":"c2b9b98a-8ac1-4b34-bf47-b04f6e498149","Type":"ContainerStarted","Data":"a64e021696acf242078c1a9f900e9423ce092c2178a8c3e86e6b19dab01b0a35"} Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.730851 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-9npwj" event={"ID":"c2b9b98a-8ac1-4b34-bf47-b04f6e498149","Type":"ContainerStarted","Data":"babd523e8cd28bfdc1125660d49b4e98c34dd170dd32cd7cc3e79be36c667edf"} Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.730872 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 
20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.733727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" event={"ID":"aab7d871-bc62-4186-898d-e80a2905bc64","Type":"ContainerStarted","Data":"27eb6434e903453199dec17a2aeb5bb17537189bcb191bbd86a43073db4e59d5"} Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.757854 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-9npwj" podStartSLOduration=1.757826982 podStartE2EDuration="1.757826982s" podCreationTimestamp="2026-01-20 22:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:49:13.757048012 +0000 UTC m=+826.077308300" watchObservedRunningTime="2026-01-20 22:49:13.757826982 +0000 UTC m=+826.078087300" Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.968904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:13 crc kubenswrapper[5030]: I0120 22:49:13.982452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bbacd039-a930-489a-9aca-1fac0748973b-memberlist\") pod \"speaker-b2ttl\" (UID: \"bbacd039-a930-489a-9aca-1fac0748973b\") " pod="metallb-system/speaker-b2ttl" Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.018091 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b2ttl" Jan 20 22:49:14 crc kubenswrapper[5030]: W0120 22:49:14.048192 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbacd039_a930_489a_9aca_1fac0748973b.slice/crio-7da012b65f8f28566690339350c66049d94af8e9f50957a5fa40a3eca4744bf9 WatchSource:0}: Error finding container 7da012b65f8f28566690339350c66049d94af8e9f50957a5fa40a3eca4744bf9: Status 404 returned error can't find the container with id 7da012b65f8f28566690339350c66049d94af8e9f50957a5fa40a3eca4744bf9 Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.742081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2ttl" event={"ID":"bbacd039-a930-489a-9aca-1fac0748973b","Type":"ContainerStarted","Data":"15d4b55dd4931ae7f57ca90ac19c1a2e7cc694b2911f3615ac7a39b04aa2492c"} Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.742116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2ttl" event={"ID":"bbacd039-a930-489a-9aca-1fac0748973b","Type":"ContainerStarted","Data":"09a592b2438fa2c25a33bbb77ad56201432b6d03327d7982f142c73a28c065dc"} Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.742125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2ttl" event={"ID":"bbacd039-a930-489a-9aca-1fac0748973b","Type":"ContainerStarted","Data":"7da012b65f8f28566690339350c66049d94af8e9f50957a5fa40a3eca4744bf9"} Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.742582 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-b2ttl" Jan 20 22:49:14 crc kubenswrapper[5030]: I0120 22:49:14.759919 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-b2ttl" podStartSLOduration=2.7599100869999997 podStartE2EDuration="2.759910087s" podCreationTimestamp="2026-01-20 22:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:49:14.758916532 +0000 UTC m=+827.079176820" watchObservedRunningTime="2026-01-20 22:49:14.759910087 +0000 UTC m=+827.080170365" Jan 20 22:49:20 crc kubenswrapper[5030]: I0120 22:49:20.812906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" event={"ID":"aab7d871-bc62-4186-898d-e80a2905bc64","Type":"ContainerStarted","Data":"1ae391c413ae0c7b39df3ce2bba7904da3c8b8931c235d774c80c27a1b751dcd"} Jan 20 22:49:20 crc kubenswrapper[5030]: I0120 22:49:20.813531 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:20 crc kubenswrapper[5030]: I0120 22:49:20.815571 5030 generic.go:334] "Generic (PLEG): container finished" podID="d655cbb5-1c38-4c7c-b60c-13abdf46828b" containerID="c27dfea2eef5c441c2c8a772f42ee2e5e9cea6db83f7bad2fbb67a1f03422411" exitCode=0 Jan 20 22:49:20 crc kubenswrapper[5030]: I0120 22:49:20.815689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerDied","Data":"c27dfea2eef5c441c2c8a772f42ee2e5e9cea6db83f7bad2fbb67a1f03422411"} Jan 20 22:49:20 crc kubenswrapper[5030]: I0120 22:49:20.845808 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" podStartSLOduration=1.842168539 podStartE2EDuration="8.845785548s" podCreationTimestamp="2026-01-20 22:49:12 +0000 UTC" firstStartedPulling="2026-01-20 22:49:12.880198992 +0000 UTC m=+825.200459270" lastFinishedPulling="2026-01-20 22:49:19.883815991 +0000 UTC m=+832.204076279" observedRunningTime="2026-01-20 22:49:20.839672377 +0000 UTC m=+833.159932735" watchObservedRunningTime="2026-01-20 22:49:20.845785548 +0000 UTC m=+833.166045876" Jan 20 22:49:21 crc kubenswrapper[5030]: I0120 22:49:21.823592 5030 generic.go:334] "Generic (PLEG): container finished" podID="d655cbb5-1c38-4c7c-b60c-13abdf46828b" containerID="f36ee171bd2a94eeb6939fa03d5239f2110b71900d382ab67f6c7980f1d687c1" exitCode=0 Jan 20 22:49:21 crc kubenswrapper[5030]: I0120 22:49:21.823711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerDied","Data":"f36ee171bd2a94eeb6939fa03d5239f2110b71900d382ab67f6c7980f1d687c1"} Jan 20 22:49:22 crc kubenswrapper[5030]: I0120 22:49:22.835997 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"2e63708b57b92983e745f1ce9ab3ad38b68cab32fa467e8bf805b2f64704825f"} Jan 20 22:49:23 crc kubenswrapper[5030]: I0120 22:49:23.849010 5030 generic.go:334] "Generic (PLEG): container finished" podID="d655cbb5-1c38-4c7c-b60c-13abdf46828b" containerID="2e63708b57b92983e745f1ce9ab3ad38b68cab32fa467e8bf805b2f64704825f" exitCode=0 Jan 20 22:49:23 crc kubenswrapper[5030]: I0120 22:49:23.849076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" 
event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerDied","Data":"2e63708b57b92983e745f1ce9ab3ad38b68cab32fa467e8bf805b2f64704825f"} Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.024195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b2ttl" Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.858042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"187841db9bf5cdeabafead46d2902346b8ce9163d6178839d1128b53076cc1d8"} Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.858336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"523ac9c1979b5d30dc32b5b9aad53f87623a83a79bec27687c202e009bde05a2"} Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.858347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"1ba74c362b3ae956f8d3889ad9dc731235b7a0c846f43423d63424a727241c96"} Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.858358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"9409f658ca1b81120cf1cebea07f8f1392e6d3dc5b8066564141f62a5ee9417b"} Jan 20 22:49:24 crc kubenswrapper[5030]: I0120 22:49:24.858366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"a69bd343a7f5731d0c24a04d76e5b1b9ff0254fef699eb0ec0536ba9a5205e23"} Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.444565 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs"] Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.445706 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.447936 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.460971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs"] Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.539188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.539238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.539267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lts99\" (UniqueName: \"kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.641264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.641317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.641351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lts99\" (UniqueName: \"kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.641906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.641941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.672077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lts99\" (UniqueName: \"kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.770909 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.871200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cdvkw" event={"ID":"d655cbb5-1c38-4c7c-b60c-13abdf46828b","Type":"ContainerStarted","Data":"ce2a5aae82cee1100b516af6a5f101b1eae4c43e209c549e402369d78bd87d84"} Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.871812 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:25 crc kubenswrapper[5030]: I0120 22:49:25.913116 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cdvkw" podStartSLOduration=6.662132177 podStartE2EDuration="13.913101337s" podCreationTimestamp="2026-01-20 22:49:12 +0000 UTC" firstStartedPulling="2026-01-20 22:49:12.613332166 +0000 UTC m=+824.933592454" lastFinishedPulling="2026-01-20 22:49:19.864301316 +0000 UTC m=+832.184561614" observedRunningTime="2026-01-20 22:49:25.911997039 +0000 UTC m=+838.232257337" watchObservedRunningTime="2026-01-20 22:49:25.913101337 +0000 UTC m=+838.233361625" Jan 20 22:49:26 crc kubenswrapper[5030]: I0120 22:49:26.041838 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs"] Jan 20 22:49:26 crc kubenswrapper[5030]: I0120 22:49:26.880126 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" event={"ID":"0334a325-c5cb-4543-8580-812e53d249c9","Type":"ContainerStarted","Data":"a23979b48269e1fe63182c9bbc77c06be54204b0ce9e6fffeba15750c0d8c9a0"} Jan 20 22:49:27 crc kubenswrapper[5030]: I0120 22:49:27.429818 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:27 crc kubenswrapper[5030]: I0120 22:49:27.658304 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:27 crc kubenswrapper[5030]: I0120 22:49:27.886230 5030 generic.go:334] "Generic (PLEG): container 
finished" podID="0334a325-c5cb-4543-8580-812e53d249c9" containerID="b69357e614609f04ab61fe32497743b2badf25fd3acde5392a4b47da28f2195f" exitCode=0 Jan 20 22:49:27 crc kubenswrapper[5030]: I0120 22:49:27.886909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" event={"ID":"0334a325-c5cb-4543-8580-812e53d249c9","Type":"ContainerDied","Data":"b69357e614609f04ab61fe32497743b2badf25fd3acde5392a4b47da28f2195f"} Jan 20 22:49:30 crc kubenswrapper[5030]: I0120 22:49:30.907821 5030 generic.go:334] "Generic (PLEG): container finished" podID="0334a325-c5cb-4543-8580-812e53d249c9" containerID="b8ac4d14b12c8614e819576692c333883243a0e4407cd600961844da2528502c" exitCode=0 Jan 20 22:49:30 crc kubenswrapper[5030]: I0120 22:49:30.907933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" event={"ID":"0334a325-c5cb-4543-8580-812e53d249c9","Type":"ContainerDied","Data":"b8ac4d14b12c8614e819576692c333883243a0e4407cd600961844da2528502c"} Jan 20 22:49:31 crc kubenswrapper[5030]: I0120 22:49:31.920325 5030 generic.go:334] "Generic (PLEG): container finished" podID="0334a325-c5cb-4543-8580-812e53d249c9" containerID="f7834ee254a7a5a4f19eaf01533b66dfb93b213e49fb150346aaedcd27a55264" exitCode=0 Jan 20 22:49:31 crc kubenswrapper[5030]: I0120 22:49:31.920420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" event={"ID":"0334a325-c5cb-4543-8580-812e53d249c9","Type":"ContainerDied","Data":"f7834ee254a7a5a4f19eaf01533b66dfb93b213e49fb150346aaedcd27a55264"} Jan 20 22:49:32 crc kubenswrapper[5030]: I0120 22:49:32.458149 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lfp28" Jan 20 22:49:32 crc kubenswrapper[5030]: I0120 22:49:32.539118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-9npwj" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.209947 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.250238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lts99\" (UniqueName: \"kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99\") pod \"0334a325-c5cb-4543-8580-812e53d249c9\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.250314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util\") pod \"0334a325-c5cb-4543-8580-812e53d249c9\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.250373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle\") pod \"0334a325-c5cb-4543-8580-812e53d249c9\" (UID: \"0334a325-c5cb-4543-8580-812e53d249c9\") " Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.251721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle" (OuterVolumeSpecName: "bundle") pod "0334a325-c5cb-4543-8580-812e53d249c9" (UID: "0334a325-c5cb-4543-8580-812e53d249c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.255778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99" (OuterVolumeSpecName: "kube-api-access-lts99") pod "0334a325-c5cb-4543-8580-812e53d249c9" (UID: "0334a325-c5cb-4543-8580-812e53d249c9"). InnerVolumeSpecName "kube-api-access-lts99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.277855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util" (OuterVolumeSpecName: "util") pod "0334a325-c5cb-4543-8580-812e53d249c9" (UID: "0334a325-c5cb-4543-8580-812e53d249c9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.352274 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lts99\" (UniqueName: \"kubernetes.io/projected/0334a325-c5cb-4543-8580-812e53d249c9-kube-api-access-lts99\") on node \"crc\" DevicePath \"\"" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.352309 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-util\") on node \"crc\" DevicePath \"\"" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.352318 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0334a325-c5cb-4543-8580-812e53d249c9-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.937969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" event={"ID":"0334a325-c5cb-4543-8580-812e53d249c9","Type":"ContainerDied","Data":"a23979b48269e1fe63182c9bbc77c06be54204b0ce9e6fffeba15750c0d8c9a0"} Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.938221 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23979b48269e1fe63182c9bbc77c06be54204b0ce9e6fffeba15750c0d8c9a0" Jan 20 22:49:33 crc kubenswrapper[5030]: I0120 22:49:33.938038 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.956472 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs"] Jan 20 22:49:37 crc kubenswrapper[5030]: E0120 22:49:37.957816 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="extract" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.957917 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="extract" Jan 20 22:49:37 crc kubenswrapper[5030]: E0120 22:49:37.957997 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="pull" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.958069 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="pull" Jan 20 22:49:37 crc kubenswrapper[5030]: E0120 22:49:37.958151 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="util" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.958222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="util" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.958466 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0334a325-c5cb-4543-8580-812e53d249c9" containerName="extract" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.960339 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.963976 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.964698 5030 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-l8xwq" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.973563 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 20 22:49:37 crc kubenswrapper[5030]: I0120 22:49:37.991271 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs"] Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.017199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9h2\" (UniqueName: \"kubernetes.io/projected/7fa1bd5a-38a8-4a55-9367-221d3613cced-kube-api-access-bt9h2\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.017510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fa1bd5a-38a8-4a55-9367-221d3613cced-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.119050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9h2\" (UniqueName: \"kubernetes.io/projected/7fa1bd5a-38a8-4a55-9367-221d3613cced-kube-api-access-bt9h2\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.119096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fa1bd5a-38a8-4a55-9367-221d3613cced-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.119581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fa1bd5a-38a8-4a55-9367-221d3613cced-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.139649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9h2\" (UniqueName: \"kubernetes.io/projected/7fa1bd5a-38a8-4a55-9367-221d3613cced-kube-api-access-bt9h2\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l9cxs\" (UID: \"7fa1bd5a-38a8-4a55-9367-221d3613cced\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.277777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.538701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs"] Jan 20 22:49:38 crc kubenswrapper[5030]: W0120 22:49:38.544695 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa1bd5a_38a8_4a55_9367_221d3613cced.slice/crio-0638c7e60eccddf705c896e262358d082a36bc61e625c778ea6426f43cc70d12 WatchSource:0}: Error finding container 0638c7e60eccddf705c896e262358d082a36bc61e625c778ea6426f43cc70d12: Status 404 returned error can't find the container with id 0638c7e60eccddf705c896e262358d082a36bc61e625c778ea6426f43cc70d12 Jan 20 22:49:38 crc kubenswrapper[5030]: I0120 22:49:38.972031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" event={"ID":"7fa1bd5a-38a8-4a55-9367-221d3613cced","Type":"ContainerStarted","Data":"0638c7e60eccddf705c896e262358d082a36bc61e625c778ea6426f43cc70d12"} Jan 20 22:49:42 crc kubenswrapper[5030]: I0120 22:49:42.435807 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cdvkw" Jan 20 22:49:46 crc kubenswrapper[5030]: I0120 22:49:46.018971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" event={"ID":"7fa1bd5a-38a8-4a55-9367-221d3613cced","Type":"ContainerStarted","Data":"06a2a7f437ec63e9bb72186c8441e20f40d3f26f12f43153a960c590ab13d839"} Jan 20 22:49:46 crc kubenswrapper[5030]: I0120 22:49:46.054074 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l9cxs" podStartSLOduration=2.19812382 podStartE2EDuration="9.054057167s" podCreationTimestamp="2026-01-20 22:49:37 +0000 UTC" firstStartedPulling="2026-01-20 22:49:38.546924419 +0000 UTC m=+850.867184707" lastFinishedPulling="2026-01-20 22:49:45.402857726 +0000 UTC m=+857.723118054" observedRunningTime="2026-01-20 22:49:46.05054084 +0000 UTC m=+858.370801168" watchObservedRunningTime="2026-01-20 22:49:46.054057167 +0000 UTC m=+858.374317455" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.876133 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd"] Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.877664 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.880403 5030 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9hksb" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.883293 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.883779 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.886264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd"] Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.936411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:52 crc kubenswrapper[5030]: I0120 22:49:52.936540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2t7q\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-kube-api-access-l2t7q\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.037800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.037966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2t7q\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-kube-api-access-l2t7q\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.059753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2t7q\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-kube-api-access-l2t7q\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.059760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-v7jrd\" (UID: \"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.201699 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" Jan 20 22:49:53 crc kubenswrapper[5030]: I0120 22:49:53.667268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd"] Jan 20 22:49:54 crc kubenswrapper[5030]: I0120 22:49:54.066779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" event={"ID":"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b","Type":"ContainerStarted","Data":"651a80783cd0d92be1d19ddc24a3c09ea4b86b36f5b24f384df4c2a8f0376b27"} Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.313938 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z6v29"] Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.315100 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.329221 5030 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m4slx" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.333583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z6v29"] Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.372864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.373532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbqz\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-kube-api-access-cqbqz\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.474280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.474380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbqz\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-kube-api-access-cqbqz\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.502961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.504376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqbqz\" (UniqueName: \"kubernetes.io/projected/d61e12f1-4790-4ba8-a7c5-58e755a5190e-kube-api-access-cqbqz\") pod \"cert-manager-webhook-f4fb5df64-z6v29\" (UID: \"d61e12f1-4790-4ba8-a7c5-58e755a5190e\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:55 crc kubenswrapper[5030]: I0120 22:49:55.654318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:49:56 crc kubenswrapper[5030]: I0120 22:49:56.107453 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-z6v29"] Jan 20 22:49:57 crc kubenswrapper[5030]: I0120 22:49:57.121200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" event={"ID":"d61e12f1-4790-4ba8-a7c5-58e755a5190e","Type":"ContainerStarted","Data":"14e1e041293942b9f09d8bf51c1971e5fc7c3791bebba0e9f49785920cb72f22"} Jan 20 22:50:02 crc kubenswrapper[5030]: I0120 22:50:02.151219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" event={"ID":"d61e12f1-4790-4ba8-a7c5-58e755a5190e","Type":"ContainerStarted","Data":"2f7b9c1967ef1f811ae8da4684731f883b89920b245bf040d6b67733e68c69a8"} Jan 20 22:50:02 crc kubenswrapper[5030]: I0120 22:50:02.152280 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:50:02 crc kubenswrapper[5030]: I0120 22:50:02.155750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" event={"ID":"5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b","Type":"ContainerStarted","Data":"94a528b0713c2a6e8b9758c478234ed5408421e7d5393156bf7b34e43ef9b44a"} Jan 20 22:50:02 crc kubenswrapper[5030]: I0120 22:50:02.178695 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" podStartSLOduration=2.05246353 podStartE2EDuration="7.178675412s" podCreationTimestamp="2026-01-20 22:49:55 +0000 UTC" firstStartedPulling="2026-01-20 22:49:56.118337164 +0000 UTC m=+868.438597452" lastFinishedPulling="2026-01-20 22:50:01.244549046 +0000 UTC m=+873.564809334" observedRunningTime="2026-01-20 22:50:02.177203645 +0000 UTC m=+874.497464013" watchObservedRunningTime="2026-01-20 22:50:02.178675412 +0000 UTC m=+874.498935700" Jan 20 22:50:02 crc kubenswrapper[5030]: I0120 22:50:02.200873 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-v7jrd" podStartSLOduration=2.664604529 podStartE2EDuration="10.200847102s" podCreationTimestamp="2026-01-20 22:49:52 +0000 UTC" firstStartedPulling="2026-01-20 22:49:53.67558429 +0000 UTC m=+865.995844618" lastFinishedPulling="2026-01-20 22:50:01.211826903 +0000 UTC m=+873.532087191" observedRunningTime="2026-01-20 22:50:02.198474214 +0000 UTC m=+874.518734512" watchObservedRunningTime="2026-01-20 22:50:02.200847102 +0000 UTC m=+874.521107430" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.023566 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zlqvx"] Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.024403 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.026392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hv2tk" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.040542 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zlqvx"] Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.211169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-bound-sa-token\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.211273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ftd\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-kube-api-access-d2ftd\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.312839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ftd\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-kube-api-access-d2ftd\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.313074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-bound-sa-token\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.332148 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-bound-sa-token\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.337556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ftd\" (UniqueName: \"kubernetes.io/projected/945ea6f2-a630-486b-bd03-4e07011e659f-kube-api-access-d2ftd\") pod \"cert-manager-86cb77c54b-zlqvx\" (UID: \"945ea6f2-a630-486b-bd03-4e07011e659f\") " pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.345969 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-zlqvx" Jan 20 22:50:03 crc kubenswrapper[5030]: I0120 22:50:03.566402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-zlqvx"] Jan 20 22:50:03 crc kubenswrapper[5030]: W0120 22:50:03.571425 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945ea6f2_a630_486b_bd03_4e07011e659f.slice/crio-7f75951f28a75f4b37df2ec306901f08037b5b5fe8d8e1dff0e889d2d4ec41f9 WatchSource:0}: Error finding container 7f75951f28a75f4b37df2ec306901f08037b5b5fe8d8e1dff0e889d2d4ec41f9: Status 404 returned error can't find the container with id 7f75951f28a75f4b37df2ec306901f08037b5b5fe8d8e1dff0e889d2d4ec41f9 Jan 20 22:50:04 crc kubenswrapper[5030]: I0120 22:50:04.167278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-zlqvx" event={"ID":"945ea6f2-a630-486b-bd03-4e07011e659f","Type":"ContainerStarted","Data":"dede21629a9f95f69ce22023db65f30416cb300f04ea011f6d8c75d11c70579a"} Jan 20 22:50:04 crc kubenswrapper[5030]: I0120 22:50:04.167322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-zlqvx" event={"ID":"945ea6f2-a630-486b-bd03-4e07011e659f","Type":"ContainerStarted","Data":"7f75951f28a75f4b37df2ec306901f08037b5b5fe8d8e1dff0e889d2d4ec41f9"} Jan 20 22:50:04 crc kubenswrapper[5030]: I0120 22:50:04.189083 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-zlqvx" podStartSLOduration=1.189063945 podStartE2EDuration="1.189063945s" podCreationTimestamp="2026-01-20 22:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:50:04.188073681 +0000 UTC m=+876.508333989" watchObservedRunningTime="2026-01-20 22:50:04.189063945 +0000 UTC m=+876.509324243" Jan 20 22:50:10 crc kubenswrapper[5030]: I0120 22:50:10.657489 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-z6v29" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.594066 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.595180 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.598832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-88psb" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.598838 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.606668 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.650683 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.764233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttg6j\" (UniqueName: \"kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j\") pod \"openstack-operator-index-98zg6\" (UID: \"a7be1522-2443-4fd3-97f9-0a8e85f99641\") " pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.865110 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttg6j\" (UniqueName: \"kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j\") pod \"openstack-operator-index-98zg6\" (UID: \"a7be1522-2443-4fd3-97f9-0a8e85f99641\") " pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.884910 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttg6j\" (UniqueName: \"kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j\") pod \"openstack-operator-index-98zg6\" (UID: \"a7be1522-2443-4fd3-97f9-0a8e85f99641\") " pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:13 crc kubenswrapper[5030]: I0120 22:50:13.913673 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:14 crc kubenswrapper[5030]: I0120 22:50:14.148073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:14 crc kubenswrapper[5030]: I0120 22:50:14.243574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98zg6" event={"ID":"a7be1522-2443-4fd3-97f9-0a8e85f99641","Type":"ContainerStarted","Data":"118c9b690625d1a4a0fb71e8043fa89932fdc34270140f60226ef39bbf34c57f"} Jan 20 22:50:16 crc kubenswrapper[5030]: I0120 22:50:16.260233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98zg6" event={"ID":"a7be1522-2443-4fd3-97f9-0a8e85f99641","Type":"ContainerStarted","Data":"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7"} Jan 20 22:50:16 crc kubenswrapper[5030]: I0120 22:50:16.280289 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-98zg6" podStartSLOduration=1.479762342 podStartE2EDuration="3.28026406s" podCreationTimestamp="2026-01-20 22:50:13 +0000 UTC" firstStartedPulling="2026-01-20 22:50:14.154029634 +0000 UTC m=+886.474289932" lastFinishedPulling="2026-01-20 22:50:15.954531362 +0000 UTC m=+888.274791650" observedRunningTime="2026-01-20 22:50:16.275902101 +0000 UTC m=+888.596162429" watchObservedRunningTime="2026-01-20 22:50:16.28026406 +0000 UTC m=+888.600524388" Jan 20 22:50:16 crc kubenswrapper[5030]: I0120 22:50:16.549243 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.151138 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.152072 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.167285 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.316510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr8vv\" (UniqueName: \"kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv\") pod \"openstack-operator-index-t75jj\" (UID: \"ff5263c0-d16c-41dd-a473-a329c770c17c\") " pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.418712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr8vv\" (UniqueName: \"kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv\") pod \"openstack-operator-index-t75jj\" (UID: \"ff5263c0-d16c-41dd-a473-a329c770c17c\") " pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.445265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr8vv\" (UniqueName: \"kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv\") pod \"openstack-operator-index-t75jj\" (UID: \"ff5263c0-d16c-41dd-a473-a329c770c17c\") " pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.489935 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:17 crc kubenswrapper[5030]: I0120 22:50:17.793537 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 20 22:50:17 crc kubenswrapper[5030]: W0120 22:50:17.803167 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5263c0_d16c_41dd_a473_a329c770c17c.slice/crio-c260c58585e003a105245cb984aedbbbc3ec07de889640d7a0a11403d88ab5a0 WatchSource:0}: Error finding container c260c58585e003a105245cb984aedbbbc3ec07de889640d7a0a11403d88ab5a0: Status 404 returned error can't find the container with id c260c58585e003a105245cb984aedbbbc3ec07de889640d7a0a11403d88ab5a0 Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.280875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t75jj" event={"ID":"ff5263c0-d16c-41dd-a473-a329c770c17c","Type":"ContainerStarted","Data":"c260c58585e003a105245cb984aedbbbc3ec07de889640d7a0a11403d88ab5a0"} Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.280998 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-98zg6" podUID="a7be1522-2443-4fd3-97f9-0a8e85f99641" containerName="registry-server" containerID="cri-o://c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7" gracePeriod=2 Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.660517 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.840375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttg6j\" (UniqueName: \"kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j\") pod \"a7be1522-2443-4fd3-97f9-0a8e85f99641\" (UID: \"a7be1522-2443-4fd3-97f9-0a8e85f99641\") " Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.847926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j" (OuterVolumeSpecName: "kube-api-access-ttg6j") pod "a7be1522-2443-4fd3-97f9-0a8e85f99641" (UID: "a7be1522-2443-4fd3-97f9-0a8e85f99641"). InnerVolumeSpecName "kube-api-access-ttg6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:50:18 crc kubenswrapper[5030]: I0120 22:50:18.942567 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttg6j\" (UniqueName: \"kubernetes.io/projected/a7be1522-2443-4fd3-97f9-0a8e85f99641-kube-api-access-ttg6j\") on node \"crc\" DevicePath \"\"" Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.291440 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7be1522-2443-4fd3-97f9-0a8e85f99641" containerID="c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7" exitCode=0 Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.291502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98zg6" event={"ID":"a7be1522-2443-4fd3-97f9-0a8e85f99641","Type":"ContainerDied","Data":"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7"} Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.291528 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-98zg6" Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.291551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98zg6" event={"ID":"a7be1522-2443-4fd3-97f9-0a8e85f99641","Type":"ContainerDied","Data":"118c9b690625d1a4a0fb71e8043fa89932fdc34270140f60226ef39bbf34c57f"} Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.291580 5030 scope.go:117] "RemoveContainer" containerID="c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7" Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.315567 5030 scope.go:117] "RemoveContainer" containerID="c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7" Jan 20 22:50:19 crc kubenswrapper[5030]: E0120 22:50:19.316273 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7\": container with ID starting with c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7 not found: ID does not exist" containerID="c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7" Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.316369 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7"} err="failed to get container status \"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7\": rpc error: code = NotFound desc = could not find container \"c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7\": container with ID starting with c2e8d95623c26e805dde54028221fffe11ff8d17008098088f1641e492a1d4b7 not found: ID does not exist" Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.341525 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.347512 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-98zg6"] Jan 20 22:50:19 crc kubenswrapper[5030]: I0120 22:50:19.977806 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7be1522-2443-4fd3-97f9-0a8e85f99641" path="/var/lib/kubelet/pods/a7be1522-2443-4fd3-97f9-0a8e85f99641/volumes" Jan 20 22:50:20 crc kubenswrapper[5030]: I0120 22:50:20.300792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t75jj" event={"ID":"ff5263c0-d16c-41dd-a473-a329c770c17c","Type":"ContainerStarted","Data":"d5cd2ad4e448b2529a942ea1ce402bbbff882edee6616a2f81d0df880a69cb7f"} Jan 20 22:50:20 crc kubenswrapper[5030]: I0120 22:50:20.325998 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-t75jj" podStartSLOduration=2.903495705 podStartE2EDuration="3.325966588s" podCreationTimestamp="2026-01-20 22:50:17 +0000 UTC" firstStartedPulling="2026-01-20 22:50:17.810021904 +0000 UTC m=+890.130282232" lastFinishedPulling="2026-01-20 22:50:18.232492817 +0000 UTC m=+890.552753115" observedRunningTime="2026-01-20 22:50:20.319135608 +0000 UTC m=+892.639395936" watchObservedRunningTime="2026-01-20 22:50:20.325966588 +0000 UTC m=+892.646226916" Jan 20 22:50:27 crc kubenswrapper[5030]: I0120 22:50:27.491201 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:27 crc kubenswrapper[5030]: I0120 22:50:27.491745 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:27 crc kubenswrapper[5030]: I0120 22:50:27.539807 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:28 crc kubenswrapper[5030]: I0120 22:50:28.383388 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-t75jj" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.191376 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v"] Jan 20 22:50:29 crc kubenswrapper[5030]: E0120 22:50:29.191607 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7be1522-2443-4fd3-97f9-0a8e85f99641" containerName="registry-server" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.191636 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7be1522-2443-4fd3-97f9-0a8e85f99641" containerName="registry-server" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.191758 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7be1522-2443-4fd3-97f9-0a8e85f99641" containerName="registry-server" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.192539 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.195331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.206560 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v"] Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.394840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.394934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.394987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwk7s\" (UniqueName: \"kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 
22:50:29.496467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.496543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.496584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwk7s\" (UniqueName: \"kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.497359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.497535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.530749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwk7s\" (UniqueName: \"kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s\") pod \"7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:29 crc kubenswrapper[5030]: I0120 22:50:29.808734 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:30 crc kubenswrapper[5030]: W0120 22:50:30.127412 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0012c2_48d6_40b9_8490_c0b904e785b7.slice/crio-c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a WatchSource:0}: Error finding container c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a: Status 404 returned error can't find the container with id c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a Jan 20 22:50:30 crc kubenswrapper[5030]: I0120 22:50:30.127467 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v"] Jan 20 22:50:30 crc kubenswrapper[5030]: I0120 22:50:30.368342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" event={"ID":"9a0012c2-48d6-40b9-8490-c0b904e785b7","Type":"ContainerStarted","Data":"c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a"} Jan 20 22:50:31 crc kubenswrapper[5030]: I0120 22:50:31.379507 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerID="9f2d63d3e6ab738ff2f2a269569e5b6cf524ea4f47c18034d1306d5a3cc5b29b" exitCode=0 Jan 20 22:50:31 crc kubenswrapper[5030]: I0120 22:50:31.379615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" event={"ID":"9a0012c2-48d6-40b9-8490-c0b904e785b7","Type":"ContainerDied","Data":"9f2d63d3e6ab738ff2f2a269569e5b6cf524ea4f47c18034d1306d5a3cc5b29b"} Jan 20 22:50:32 crc kubenswrapper[5030]: I0120 22:50:32.391668 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerID="35a4a73cc7b2a93dccec18d3a9260524935c827138006c6b0a4b3e14c932fee9" exitCode=0 Jan 20 22:50:32 crc kubenswrapper[5030]: I0120 22:50:32.391760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" event={"ID":"9a0012c2-48d6-40b9-8490-c0b904e785b7","Type":"ContainerDied","Data":"35a4a73cc7b2a93dccec18d3a9260524935c827138006c6b0a4b3e14c932fee9"} Jan 20 22:50:33 crc kubenswrapper[5030]: I0120 22:50:33.403119 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerID="324a62c6af727b39111590bbd594cae9d7d93c4239e962a7da2abaf5131d0cac" exitCode=0 Jan 20 22:50:33 crc kubenswrapper[5030]: I0120 22:50:33.403185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" event={"ID":"9a0012c2-48d6-40b9-8490-c0b904e785b7","Type":"ContainerDied","Data":"324a62c6af727b39111590bbd594cae9d7d93c4239e962a7da2abaf5131d0cac"} Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.699405 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.769899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle\") pod \"9a0012c2-48d6-40b9-8490-c0b904e785b7\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.769946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwk7s\" (UniqueName: \"kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s\") pod \"9a0012c2-48d6-40b9-8490-c0b904e785b7\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.769996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util\") pod \"9a0012c2-48d6-40b9-8490-c0b904e785b7\" (UID: \"9a0012c2-48d6-40b9-8490-c0b904e785b7\") " Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.770911 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle" (OuterVolumeSpecName: "bundle") pod "9a0012c2-48d6-40b9-8490-c0b904e785b7" (UID: "9a0012c2-48d6-40b9-8490-c0b904e785b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.778092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s" (OuterVolumeSpecName: "kube-api-access-bwk7s") pod "9a0012c2-48d6-40b9-8490-c0b904e785b7" (UID: "9a0012c2-48d6-40b9-8490-c0b904e785b7"). InnerVolumeSpecName "kube-api-access-bwk7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.790514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util" (OuterVolumeSpecName: "util") pod "9a0012c2-48d6-40b9-8490-c0b904e785b7" (UID: "9a0012c2-48d6-40b9-8490-c0b904e785b7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.871519 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.871575 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwk7s\" (UniqueName: \"kubernetes.io/projected/9a0012c2-48d6-40b9-8490-c0b904e785b7-kube-api-access-bwk7s\") on node \"crc\" DevicePath \"\"" Jan 20 22:50:34 crc kubenswrapper[5030]: I0120 22:50:34.871597 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a0012c2-48d6-40b9-8490-c0b904e785b7-util\") on node \"crc\" DevicePath \"\"" Jan 20 22:50:35 crc kubenswrapper[5030]: I0120 22:50:35.419811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" event={"ID":"9a0012c2-48d6-40b9-8490-c0b904e785b7","Type":"ContainerDied","Data":"c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a"} Jan 20 22:50:35 crc kubenswrapper[5030]: I0120 22:50:35.420385 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c544c3625d11ed31dac6feba195f18de684a637c33361d29ab32f8983c4df00a" Jan 20 22:50:35 crc kubenswrapper[5030]: I0120 22:50:35.419954 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.384401 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 20 22:50:41 crc kubenswrapper[5030]: E0120 22:50:41.384612 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="pull" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.384644 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="pull" Jan 20 22:50:41 crc kubenswrapper[5030]: E0120 22:50:41.384655 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="util" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.384660 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="util" Jan 20 22:50:41 crc kubenswrapper[5030]: E0120 22:50:41.384669 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="extract" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.384675 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="extract" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.385130 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" containerName="extract" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.386397 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.396481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-9thk5" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.410819 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.462878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwdf\" (UniqueName: \"kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf\") pod \"openstack-operator-controller-init-6d4d7d8545-fz9q7\" (UID: \"5fbec18c-c7ab-43eb-9440-c2a8973d3532\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.564002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkwdf\" (UniqueName: \"kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf\") pod \"openstack-operator-controller-init-6d4d7d8545-fz9q7\" (UID: \"5fbec18c-c7ab-43eb-9440-c2a8973d3532\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.588760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkwdf\" (UniqueName: \"kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf\") pod \"openstack-operator-controller-init-6d4d7d8545-fz9q7\" (UID: \"5fbec18c-c7ab-43eb-9440-c2a8973d3532\") " pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.709862 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:41 crc kubenswrapper[5030]: I0120 22:50:41.975276 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 20 22:50:42 crc kubenswrapper[5030]: I0120 22:50:42.468555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" event={"ID":"5fbec18c-c7ab-43eb-9440-c2a8973d3532","Type":"ContainerStarted","Data":"f4c3382246a3c95961ab4f3a240007cd3a134d3284a23038ebd7b692ddbdc754"} Jan 20 22:50:47 crc kubenswrapper[5030]: I0120 22:50:47.499460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" event={"ID":"5fbec18c-c7ab-43eb-9440-c2a8973d3532","Type":"ContainerStarted","Data":"395ed9d798d4cf55d2e4f993c6ce58fafbcf3f2feee517ef1ae91409e5ef8021"} Jan 20 22:50:47 crc kubenswrapper[5030]: I0120 22:50:47.499852 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:50:47 crc kubenswrapper[5030]: I0120 22:50:47.542027 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" podStartSLOduration=1.402060754 podStartE2EDuration="6.542004946s" podCreationTimestamp="2026-01-20 22:50:41 +0000 UTC" firstStartedPulling="2026-01-20 22:50:41.983088431 +0000 UTC m=+914.303348719" lastFinishedPulling="2026-01-20 22:50:47.123032623 +0000 UTC m=+919.443292911" observedRunningTime="2026-01-20 22:50:47.536660588 +0000 UTC m=+919.856920896" watchObservedRunningTime="2026-01-20 22:50:47.542004946 +0000 UTC m=+919.862265274" Jan 20 22:51:01 crc kubenswrapper[5030]: I0120 22:51:01.714378 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 20 22:51:10 crc kubenswrapper[5030]: I0120 22:51:10.157554 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:51:10 crc kubenswrapper[5030]: I0120 22:51:10.159364 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.140831 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.143169 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.153853 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.222474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pw7\" (UniqueName: \"kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.222776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.222909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.323733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.323797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pw7\" (UniqueName: \"kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.323857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.324302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.324349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.345797 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d7pw7\" (UniqueName: \"kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7\") pod \"redhat-marketplace-2x56r\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.458225 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:14 crc kubenswrapper[5030]: I0120 22:51:14.769775 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:15 crc kubenswrapper[5030]: I0120 22:51:15.715386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerStarted","Data":"16effcd85d354bec2fed42bac0c4db35483f5c6f1a11d72db0a0443fe026b071"} Jan 20 22:51:15 crc kubenswrapper[5030]: I0120 22:51:15.715836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerStarted","Data":"32ea6ca169368769ef3a5b001122acd90a415e37edb81072db38d95780b0cdfa"} Jan 20 22:51:16 crc kubenswrapper[5030]: I0120 22:51:16.728701 5030 generic.go:334] "Generic (PLEG): container finished" podID="8405eebd-ea44-4878-8055-b19c51043955" containerID="16effcd85d354bec2fed42bac0c4db35483f5c6f1a11d72db0a0443fe026b071" exitCode=0 Jan 20 22:51:16 crc kubenswrapper[5030]: I0120 22:51:16.728763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerDied","Data":"16effcd85d354bec2fed42bac0c4db35483f5c6f1a11d72db0a0443fe026b071"} Jan 20 22:51:18 crc kubenswrapper[5030]: I0120 22:51:18.744511 5030 generic.go:334] "Generic (PLEG): container finished" podID="8405eebd-ea44-4878-8055-b19c51043955" containerID="a358395e4352bb8064d92cfbd553b6128fbd8bc809547cfc472ee1732d8118eb" exitCode=0 Jan 20 22:51:18 crc kubenswrapper[5030]: I0120 22:51:18.744591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerDied","Data":"a358395e4352bb8064d92cfbd553b6128fbd8bc809547cfc472ee1732d8118eb"} Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.299024 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.300140 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.303023 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.303447 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cl95j" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.303733 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.304673 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nkgfc" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.313555 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.337904 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.339510 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.343498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-clndm" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.396172 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.404371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2cl4\" (UniqueName: \"kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4\") pod \"cinder-operator-controller-manager-9b68f5989-hw2rq\" (UID: \"98fe7bc7-a9a3-4681-9d86-2d134e784ed0\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.404425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll987\" (UniqueName: \"kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987\") pod \"barbican-operator-controller-manager-7ddb5c749-6hb48\" (UID: \"1d5ff7f0-0960-44b0-8637-446aa3906d52\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.415564 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.425288 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.426252 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.429862 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-72cgv" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.437737 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.438607 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.442831 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q279j" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.446265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.467934 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.468906 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.476213 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-snm9x" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.481453 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.507087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ttt\" (UniqueName: \"kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt\") pod \"glance-operator-controller-manager-c6994669c-lwzdd\" (UID: \"c9aea915-81f5-4204-9d3b-4d107d4678a1\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.507157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45n7f\" (UniqueName: \"kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f\") pod \"designate-operator-controller-manager-9f958b845-9qgvp\" (UID: \"9a08b9f6-0526-42ba-9439-1fad41456c73\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.507193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2cl4\" (UniqueName: \"kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4\") pod \"cinder-operator-controller-manager-9b68f5989-hw2rq\" (UID: \"98fe7bc7-a9a3-4681-9d86-2d134e784ed0\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.507224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll987\" (UniqueName: \"kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987\") pod \"barbican-operator-controller-manager-7ddb5c749-6hb48\" (UID: \"1d5ff7f0-0960-44b0-8637-446aa3906d52\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.519633 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.525395 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.526208 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.528670 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.539069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.539392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lkpt2" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.541688 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.542541 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.543822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll987\" (UniqueName: \"kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987\") pod \"barbican-operator-controller-manager-7ddb5c749-6hb48\" (UID: \"1d5ff7f0-0960-44b0-8637-446aa3906d52\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.544278 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.544987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.551343 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8lwlb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.551525 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s9sp2" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.565677 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.569158 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2cl4\" (UniqueName: \"kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4\") pod \"cinder-operator-controller-manager-9b68f5989-hw2rq\" (UID: \"98fe7bc7-a9a3-4681-9d86-2d134e784ed0\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.581975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.590242 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.591280 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.597016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qtmtf" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.602955 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ttt\" (UniqueName: \"kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt\") pod \"glance-operator-controller-manager-c6994669c-lwzdd\" (UID: \"c9aea915-81f5-4204-9d3b-4d107d4678a1\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prs4f\" (UniqueName: \"kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f\") pod \"heat-operator-controller-manager-594c8c9d5d-cq5dj\" (UID: \"d7b05cde-2eb1-4879-bc97-dd0a420d7617\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-587tm\" (UniqueName: \"kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm\") pod \"horizon-operator-controller-manager-77d5c5b54f-dh9wl\" (UID: \"dac66b05-4f64-4bad-9d59-9c01ccd16018\") " 
pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkmlr\" (UniqueName: \"kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45n7f\" (UniqueName: \"kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f\") pod \"designate-operator-controller-manager-9f958b845-9qgvp\" (UID: \"9a08b9f6-0526-42ba-9439-1fad41456c73\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.616580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.623893 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.624707 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.625395 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.630976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-v6622" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.642671 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.643464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.646658 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.648360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ttt\" (UniqueName: \"kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt\") pod \"glance-operator-controller-manager-c6994669c-lwzdd\" (UID: \"c9aea915-81f5-4204-9d3b-4d107d4678a1\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.670357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.674479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-n86hq" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.697177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.704146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45n7f\" (UniqueName: \"kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f\") pod \"designate-operator-controller-manager-9f958b845-9qgvp\" (UID: \"9a08b9f6-0526-42ba-9439-1fad41456c73\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.720775 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.731453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nzfn\" (UniqueName: \"kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn\") pod \"manila-operator-controller-manager-864f6b75bf-qqnjk\" (UID: \"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.731532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prs4f\" (UniqueName: \"kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f\") pod \"heat-operator-controller-manager-594c8c9d5d-cq5dj\" (UID: \"d7b05cde-2eb1-4879-bc97-dd0a420d7617\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.731561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-587tm\" (UniqueName: \"kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm\") pod \"horizon-operator-controller-manager-77d5c5b54f-dh9wl\" (UID: \"dac66b05-4f64-4bad-9d59-9c01ccd16018\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.731591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkmlr\" (UniqueName: \"kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: 
\"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.743415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7hr\" (UniqueName: \"kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr\") pod \"mariadb-operator-controller-manager-c87fff755-mq6l5\" (UID: \"96630154-0812-445c-bd03-9a2066468891\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.743484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmnv\" (UniqueName: \"kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv\") pod \"ironic-operator-controller-manager-78757b4889-dvhb6\" (UID: \"caa5b6cd-4c8e-424e-8ba9-578ccd662eab\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.743548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.743577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ndzq\" (UniqueName: \"kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq\") pod \"keystone-operator-controller-manager-767fdc4f47-2cdl5\" (UID: \"ad595f62-3880-4f00-8f70-879d673dcf55\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:19 crc kubenswrapper[5030]: E0120 22:51:19.743962 5030 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:19 crc kubenswrapper[5030]: E0120 22:51:19.744015 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert podName:123d952b-3c67-49ea-9c09-82b04a14d494 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:20.2440016 +0000 UTC m=+952.564261878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert") pod "infra-operator-controller-manager-77c48c7859-gtjfb" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494") : secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.744308 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.746206 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.746980 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.756058 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bqzfr" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.778199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prs4f\" (UniqueName: \"kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f\") pod \"heat-operator-controller-manager-594c8c9d5d-cq5dj\" (UID: \"d7b05cde-2eb1-4879-bc97-dd0a420d7617\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.781234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkmlr\" (UniqueName: \"kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.791542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.792709 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.796470 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-pj8rb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.798570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerStarted","Data":"d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010"} Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.807599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.809175 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.820266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-587tm\" (UniqueName: \"kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm\") pod \"horizon-operator-controller-manager-77d5c5b54f-dh9wl\" (UID: \"dac66b05-4f64-4bad-9d59-9c01ccd16018\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.829707 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.830543 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.844206 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-22zld" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.844404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.847681 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knmnv\" (UniqueName: \"kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv\") pod \"ironic-operator-controller-manager-78757b4889-dvhb6\" (UID: \"caa5b6cd-4c8e-424e-8ba9-578ccd662eab\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9zq\" (UniqueName: \"kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ndzq\" (UniqueName: \"kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq\") pod \"keystone-operator-controller-manager-767fdc4f47-2cdl5\" (UID: \"ad595f62-3880-4f00-8f70-879d673dcf55\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrxxp\" (UniqueName: \"kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp\") pod \"octavia-operator-controller-manager-7fc9b76cf6-w79zj\" (UID: \"f32a8aff-01df-47f8-83db-03ee02703e0f\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv5lf\" (UniqueName: \"kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf\") pod \"neutron-operator-controller-manager-cb4666565-dpmdr\" (UID: \"df584eed-267c-41ec-b019-6af9322630da\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nzfn\" (UniqueName: \"kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn\") pod \"manila-operator-controller-manager-864f6b75bf-qqnjk\" (UID: \"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225\") " 
pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.848315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7hr\" (UniqueName: \"kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr\") pod \"mariadb-operator-controller-manager-c87fff755-mq6l5\" (UID: \"96630154-0812-445c-bd03-9a2066468891\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.852038 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.869535 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.872771 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.873971 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2wz6x" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.881006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9kn8l" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.881168 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.906579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.906780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ndzq\" (UniqueName: \"kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq\") pod \"keystone-operator-controller-manager-767fdc4f47-2cdl5\" (UID: \"ad595f62-3880-4f00-8f70-879d673dcf55\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.907179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7hr\" (UniqueName: \"kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr\") pod \"mariadb-operator-controller-manager-c87fff755-mq6l5\" (UID: \"96630154-0812-445c-bd03-9a2066468891\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.907237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knmnv\" (UniqueName: 
\"kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv\") pod \"ironic-operator-controller-manager-78757b4889-dvhb6\" (UID: \"caa5b6cd-4c8e-424e-8ba9-578ccd662eab\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.935221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nzfn\" (UniqueName: \"kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn\") pod \"manila-operator-controller-manager-864f6b75bf-qqnjk\" (UID: \"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.942431 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.953318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9zq\" (UniqueName: \"kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.953388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrxxp\" (UniqueName: \"kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp\") pod \"octavia-operator-controller-manager-7fc9b76cf6-w79zj\" (UID: \"f32a8aff-01df-47f8-83db-03ee02703e0f\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.953434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv5lf\" (UniqueName: \"kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf\") pod \"neutron-operator-controller-manager-cb4666565-dpmdr\" (UID: \"df584eed-267c-41ec-b019-6af9322630da\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.953468 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbnhw\" (UniqueName: \"kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw\") pod \"nova-operator-controller-manager-65849867d6-sjfdt\" (UID: \"a2343f20-8163-4f12-85f6-d621d416444b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:19 crc kubenswrapper[5030]: I0120 22:51:19.953494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:19 crc kubenswrapper[5030]: E0120 22:51:19.954130 5030 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:19 crc kubenswrapper[5030]: E0120 
22:51:19.955060 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert podName:64490fad-ad11-4eea-b07f-c3c90f9bab8f nodeName:}" failed. No retries permitted until 2026-01-20 22:51:20.455028611 +0000 UTC m=+952.775288899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.004385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrxxp\" (UniqueName: \"kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp\") pod \"octavia-operator-controller-manager-7fc9b76cf6-w79zj\" (UID: \"f32a8aff-01df-47f8-83db-03ee02703e0f\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.008558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv5lf\" (UniqueName: \"kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf\") pod \"neutron-operator-controller-manager-cb4666565-dpmdr\" (UID: \"df584eed-267c-41ec-b019-6af9322630da\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.022097 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.036988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp9zq\" (UniqueName: \"kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.038820 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.041248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.041363 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.043224 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.043880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-l5gjw" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.044576 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.047026 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-nl964" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.057046 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.058122 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbnhw\" (UniqueName: \"kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw\") pod \"nova-operator-controller-manager-65849867d6-sjfdt\" (UID: \"a2343f20-8163-4f12-85f6-d621d416444b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.058166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2kv\" (UniqueName: \"kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv\") pod \"placement-operator-controller-manager-686df47fcb-n2m7p\" (UID: \"b73ba68f-7653-47c8-bd19-4a3f05afad72\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.058238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h9vz\" (UniqueName: \"kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz\") pod \"ovn-operator-controller-manager-55db956ddc-mvzhb\" (UID: \"1eab1afc-3438-48e1-9210-c49f8c94316d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.083927 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.084321 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.084417 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.084746 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.106897 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.122700 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.123745 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.123884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbnhw\" (UniqueName: \"kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw\") pod \"nova-operator-controller-manager-65849867d6-sjfdt\" (UID: \"a2343f20-8163-4f12-85f6-d621d416444b\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.128535 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vn9fj" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.129822 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.130426 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2x56r" podStartSLOduration=3.682789148 podStartE2EDuration="6.130409882s" podCreationTimestamp="2026-01-20 22:51:14 +0000 UTC" firstStartedPulling="2026-01-20 22:51:16.731520084 +0000 UTC m=+949.051780382" lastFinishedPulling="2026-01-20 22:51:19.179140828 +0000 UTC m=+951.499401116" observedRunningTime="2026-01-20 22:51:19.844660398 +0000 UTC m=+952.164920686" watchObservedRunningTime="2026-01-20 22:51:20.130409882 +0000 UTC m=+952.450670170" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.159451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7wr\" (UniqueName: \"kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr\") pod \"swift-operator-controller-manager-85dd56d4cc-nrctv\" (UID: \"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.159528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2kv\" (UniqueName: \"kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv\") pod \"placement-operator-controller-manager-686df47fcb-n2m7p\" (UID: \"b73ba68f-7653-47c8-bd19-4a3f05afad72\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.159560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrc7\" (UniqueName: \"kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7\") pod \"telemetry-operator-controller-manager-5f8f495fcf-jgsmp\" (UID: \"708640d3-cade-4b2d-b378-1f25cafb3575\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.159584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h9vz\" (UniqueName: \"kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz\") pod \"ovn-operator-controller-manager-55db956ddc-mvzhb\" (UID: \"1eab1afc-3438-48e1-9210-c49f8c94316d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.165478 5030 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.167324 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.170397 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-n4kl9" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.171374 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.188363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2kv\" (UniqueName: \"kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv\") pod \"placement-operator-controller-manager-686df47fcb-n2m7p\" (UID: \"b73ba68f-7653-47c8-bd19-4a3f05afad72\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.191282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h9vz\" (UniqueName: \"kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz\") pod \"ovn-operator-controller-manager-55db956ddc-mvzhb\" (UID: \"1eab1afc-3438-48e1-9210-c49f8c94316d\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.211300 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.224720 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.225601 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.226701 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.227456 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-z6q8k" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.227614 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.227920 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.249054 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.249488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.261429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.261485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbwjb\" (UniqueName: \"kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb\") pod \"test-operator-controller-manager-7cd8bc9dbb-pl7qm\" (UID: \"bf527df4-5b46-4894-ac83-7b5613ee7fae\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.261527 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7wr\" (UniqueName: \"kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr\") pod \"swift-operator-controller-manager-85dd56d4cc-nrctv\" (UID: \"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.261594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrc7\" (UniqueName: \"kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7\") pod \"telemetry-operator-controller-manager-5f8f495fcf-jgsmp\" (UID: \"708640d3-cade-4b2d-b378-1f25cafb3575\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.261955 5030 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.261997 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert podName:123d952b-3c67-49ea-9c09-82b04a14d494 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:21.261985005 +0000 UTC m=+953.582245293 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert") pod "infra-operator-controller-manager-77c48c7859-gtjfb" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494") : secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.286191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrc7\" (UniqueName: \"kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7\") pod \"telemetry-operator-controller-manager-5f8f495fcf-jgsmp\" (UID: \"708640d3-cade-4b2d-b378-1f25cafb3575\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.296452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7wr\" (UniqueName: \"kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr\") pod \"swift-operator-controller-manager-85dd56d4cc-nrctv\" (UID: \"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.307287 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.313731 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.314672 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.321237 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zhfxt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.329794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.349024 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.362603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.362652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.362687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhk5\" (UniqueName: \"kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5\") pod \"watcher-operator-controller-manager-64cd966744-g9v2l\" (UID: \"a4f33608-dc70-40e3-b757-8410b4f1938e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.362723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdgwt\" (UniqueName: \"kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.362782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbwjb\" (UniqueName: \"kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb\") pod \"test-operator-controller-manager-7cd8bc9dbb-pl7qm\" (UID: \"bf527df4-5b46-4894-ac83-7b5613ee7fae\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.395593 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbwjb\" (UniqueName: \"kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb\") pod \"test-operator-controller-manager-7cd8bc9dbb-pl7qm\" (UID: \"bf527df4-5b46-4894-ac83-7b5613ee7fae\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.417034 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.431570 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.457889 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhk5\" (UniqueName: \"kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5\") pod \"watcher-operator-controller-manager-64cd966744-g9v2l\" (UID: \"a4f33608-dc70-40e3-b757-8410b4f1938e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdgwt\" (UniqueName: \"kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcjv\" (UniqueName: \"kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8nglg\" (UID: \"bee1b017-6c33-42be-82ba-9af5cd5b5362\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.466279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466382 5030 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466436 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:20.966420669 +0000 UTC m=+953.286680947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "metrics-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466838 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466888 5030 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466902 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:20.966885209 +0000 UTC m=+953.287145497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.466917 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert podName:64490fad-ad11-4eea-b07f-c3c90f9bab8f nodeName:}" failed. No retries permitted until 2026-01-20 22:51:21.46690881 +0000 UTC m=+953.787169098 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.482338 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.485852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhk5\" (UniqueName: \"kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5\") pod \"watcher-operator-controller-manager-64cd966744-g9v2l\" (UID: \"a4f33608-dc70-40e3-b757-8410b4f1938e\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.492078 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdgwt\" (UniqueName: \"kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.492361 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.554844 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.567349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcjv\" (UniqueName: \"kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8nglg\" (UID: \"bee1b017-6c33-42be-82ba-9af5cd5b5362\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.587496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcjv\" (UniqueName: \"kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8nglg\" (UID: \"bee1b017-6c33-42be-82ba-9af5cd5b5362\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.599178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.620172 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 20 22:51:20 crc kubenswrapper[5030]: W0120 22:51:20.648973 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98fe7bc7_a9a3_4681_9d86_2d134e784ed0.slice/crio-30aed801a968d758af4b6d855a2ef6d6d91c6050f073d908fb33f6377fed4227 WatchSource:0}: Error finding container 30aed801a968d758af4b6d855a2ef6d6d91c6050f073d908fb33f6377fed4227: Status 404 returned error can't find the container with id 30aed801a968d758af4b6d855a2ef6d6d91c6050f073d908fb33f6377fed4227 Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.730428 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.766403 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.851978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" event={"ID":"c9aea915-81f5-4204-9d3b-4d107d4678a1","Type":"ContainerStarted","Data":"04eeeed3ac64e1451efbd97883acb294ae843803f74752e4477422cc9b11a002"} Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.860126 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" event={"ID":"98fe7bc7-a9a3-4681-9d86-2d134e784ed0","Type":"ContainerStarted","Data":"30aed801a968d758af4b6d855a2ef6d6d91c6050f073d908fb33f6377fed4227"} Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.869855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" event={"ID":"9a08b9f6-0526-42ba-9439-1fad41456c73","Type":"ContainerStarted","Data":"f3e2cbeb0278267c45cf2f1949649c84fee44f1c3ae03089f9b5363dffeb03f2"} Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.875631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" event={"ID":"caa5b6cd-4c8e-424e-8ba9-578ccd662eab","Type":"ContainerStarted","Data":"91fe49cac6b767fc3fad43be2ab43912ec4d57d9bf705669148fa947cc8530e1"} Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.876191 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.882487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" event={"ID":"1d5ff7f0-0960-44b0-8637-446aa3906d52","Type":"ContainerStarted","Data":"64e21b6bd260ff06d4c3d53a805a0c54484d958266d1ba19717faa6bf864d9c3"} Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.884101 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 20 22:51:20 crc kubenswrapper[5030]: W0120 22:51:20.884961 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad595f62_3880_4f00_8f70_879d673dcf55.slice/crio-3f4609d1c2943327f98dc0860442ae37a003eb0c8c8dd464054dc25633d8a127 WatchSource:0}: Error finding container 3f4609d1c2943327f98dc0860442ae37a003eb0c8c8dd464054dc25633d8a127: Status 404 returned error can't find the container with id 3f4609d1c2943327f98dc0860442ae37a003eb0c8c8dd464054dc25633d8a127 Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.909253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 20 22:51:20 crc kubenswrapper[5030]: W0120 22:51:20.913130 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac66b05_4f64_4bad_9d59_9c01ccd16018.slice/crio-0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0 WatchSource:0}: Error finding container 0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0: Status 404 returned error can't find the container with id 0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0 Jan 20 22:51:20 crc kubenswrapper[5030]: 
I0120 22:51:20.983097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: I0120 22:51:20.983341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.983287 5030 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.983462 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:21.983443302 +0000 UTC m=+954.303703590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "metrics-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.983547 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:20 crc kubenswrapper[5030]: E0120 22:51:20.983637 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:21.983601056 +0000 UTC m=+954.303861344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.091701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.095846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 20 22:51:21 crc kubenswrapper[5030]: W0120 22:51:21.097110 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5810f6d7_0bb3_4137_8ce4_03f0b1e9b225.slice/crio-935f3c59e93c64cea85ed2d5be55e63cafcaf1639be8be9438362ca7681d6735 WatchSource:0}: Error finding container 935f3c59e93c64cea85ed2d5be55e63cafcaf1639be8be9438362ca7681d6735: Status 404 returned error can't find the container with id 935f3c59e93c64cea85ed2d5be55e63cafcaf1639be8be9438362ca7681d6735 Jan 20 22:51:21 crc kubenswrapper[5030]: W0120 22:51:21.099806 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96630154_0812_445c_bd03_9a2066468891.slice/crio-f0f53acb4e69e31489ae5f14455a28bc895789db2a9d9911dfac82840d78f0c9 WatchSource:0}: Error finding container f0f53acb4e69e31489ae5f14455a28bc895789db2a9d9911dfac82840d78f0c9: Status 404 returned error can't find the container with id f0f53acb4e69e31489ae5f14455a28bc895789db2a9d9911dfac82840d78f0c9 Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.100217 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 20 22:51:21 crc kubenswrapper[5030]: W0120 22:51:21.110056 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf584eed_267c_41ec_b019_6af9322630da.slice/crio-92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc WatchSource:0}: Error finding container 92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc: Status 404 returned error can't find the container with id 92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.111966 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.213704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.219514 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.234391 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xrxxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-w79zj_openstack-operators(f32a8aff-01df-47f8-83db-03ee02703e0f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.235533 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.289767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.289979 5030 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.292895 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert podName:123d952b-3c67-49ea-9c09-82b04a14d494 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:23.292876071 +0000 UTC m=+955.613136369 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert") pod "infra-operator-controller-manager-77c48c7859-gtjfb" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494") : secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.305151 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.307320 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5h9vz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-mvzhb_openstack-operators(1eab1afc-3438-48e1-9210-c49f8c94316d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.307473 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbhk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-g9v2l_openstack-operators(a4f33608-dc70-40e3-b757-8410b4f1938e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.309090 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.309140 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.317602 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.325426 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbwjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-pl7qm_openstack-operators(bf527df4-5b46-4894-ac83-7b5613ee7fae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.325609 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8z7wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-nrctv_openstack-operators(1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.326592 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.326764 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.329573 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.340291 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.345724 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 20 22:51:21 crc kubenswrapper[5030]: W0120 22:51:21.345924 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod708640d3_cade_4b2d_b378_1f25cafb3575.slice/crio-e598c82ee9bfff80aef89685a049df0958cabccae007296671b9eb8dc6155114 WatchSource:0}: Error finding container e598c82ee9bfff80aef89685a049df0958cabccae007296671b9eb8dc6155114: Status 404 returned error can't find the container with id e598c82ee9bfff80aef89685a049df0958cabccae007296671b9eb8dc6155114 Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.352190 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hrc7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-jgsmp_openstack-operators(708640d3-cade-4b2d-b378-1f25cafb3575): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.353509 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.442364 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.495443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.495647 5030 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.495726 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert podName:64490fad-ad11-4eea-b07f-c3c90f9bab8f nodeName:}" failed. No retries permitted until 2026-01-20 22:51:23.49570383 +0000 UTC m=+955.815964188 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:21 crc kubenswrapper[5030]: I0120 22:51:21.937834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" event={"ID":"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225","Type":"ContainerStarted","Data":"935f3c59e93c64cea85ed2d5be55e63cafcaf1639be8be9438362ca7681d6735"} Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.965612 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" Jan 20 22:51:21 crc kubenswrapper[5030]: E0120 22:51:21.992195 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.006835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.006907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.007133 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.007179 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:24.00716523 +0000 UTC m=+956.327425508 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.007572 5030 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.007626 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:24.00758901 +0000 UTC m=+956.327849298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "metrics-server-cert" not found Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.019537 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.028030 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.028993 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" event={"ID":"1eab1afc-3438-48e1-9210-c49f8c94316d","Type":"ContainerStarted","Data":"7153f1dca68387e2a6d3300318f24efd0e08123231f30ee0fa253ab14ef21e89"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.029028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" event={"ID":"a4f33608-dc70-40e3-b757-8410b4f1938e","Type":"ContainerStarted","Data":"d77ad8f7e577fc1539cec8914e2c76b1cf272490a60878313932d418a9bee7dd"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.029041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" event={"ID":"96630154-0812-445c-bd03-9a2066468891","Type":"ContainerStarted","Data":"f0f53acb4e69e31489ae5f14455a28bc895789db2a9d9911dfac82840d78f0c9"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.029050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" event={"ID":"708640d3-cade-4b2d-b378-1f25cafb3575","Type":"ContainerStarted","Data":"e598c82ee9bfff80aef89685a049df0958cabccae007296671b9eb8dc6155114"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.029063 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" event={"ID":"dac66b05-4f64-4bad-9d59-9c01ccd16018","Type":"ContainerStarted","Data":"0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.029073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" event={"ID":"bf527df4-5b46-4894-ac83-7b5613ee7fae","Type":"ContainerStarted","Data":"7e34a090f497852624dff94cac24642c505e29a0112d097e507c1d063804471c"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.032821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" event={"ID":"b73ba68f-7653-47c8-bd19-4a3f05afad72","Type":"ContainerStarted","Data":"9aa3520b44cb611b8c2c70cf9782e71781c58c8199642c5756659871ca3ad8a4"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.036985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" event={"ID":"d7b05cde-2eb1-4879-bc97-dd0a420d7617","Type":"ContainerStarted","Data":"27b8e7fdbf0a93aa969d05b8df05ba304376d0624a935c3b19db1986ad813da8"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.046741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" event={"ID":"bee1b017-6c33-42be-82ba-9af5cd5b5362","Type":"ContainerStarted","Data":"5085c8ef0fa89c562c35f33a6f3a6baef9af8d7b2386c6a5a533595050133ed2"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.048698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" event={"ID":"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2","Type":"ContainerStarted","Data":"45056e4b4551d7ca50fe1510a2a3fcb124f7a6e3dcb50c106c56b9280376cd45"} Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.053542 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.055107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" event={"ID":"ad595f62-3880-4f00-8f70-879d673dcf55","Type":"ContainerStarted","Data":"3f4609d1c2943327f98dc0860442ae37a003eb0c8c8dd464054dc25633d8a127"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.056194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" event={"ID":"f32a8aff-01df-47f8-83db-03ee02703e0f","Type":"ContainerStarted","Data":"af20e3150d908877f0530fe90a7c29d6c4c54f8acdd7115e791b3d0bdbfcc7c4"} Jan 20 22:51:22 crc kubenswrapper[5030]: E0120 22:51:22.058648 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.059437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" event={"ID":"a2343f20-8163-4f12-85f6-d621d416444b","Type":"ContainerStarted","Data":"1b9233cacbdaab03a3e512ad967827c332cd4d73c727aee17bdf4c3317255495"} Jan 20 22:51:22 crc kubenswrapper[5030]: I0120 22:51:22.074342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" event={"ID":"df584eed-267c-41ec-b019-6af9322630da","Type":"ContainerStarted","Data":"92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc"} Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.094793 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.095045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.095532 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.095555 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.095857 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.096716 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" Jan 20 22:51:23 crc kubenswrapper[5030]: I0120 22:51:23.340316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.340514 5030 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.340608 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert podName:123d952b-3c67-49ea-9c09-82b04a14d494 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:27.340576553 +0000 UTC m=+959.660836922 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert") pod "infra-operator-controller-manager-77c48c7859-gtjfb" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494") : secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:23 crc kubenswrapper[5030]: I0120 22:51:23.543412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.543647 5030 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:23 crc kubenswrapper[5030]: E0120 22:51:23.543725 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert podName:64490fad-ad11-4eea-b07f-c3c90f9bab8f nodeName:}" failed. No retries permitted until 2026-01-20 22:51:27.543688909 +0000 UTC m=+959.863949197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:24 crc kubenswrapper[5030]: I0120 22:51:24.050452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:24 crc kubenswrapper[5030]: I0120 22:51:24.050496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.050634 5030 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.050720 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:28.050703011 +0000 UTC m=+960.370963299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "metrics-server-cert" not found Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.050641 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.050791 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:28.050776072 +0000 UTC m=+960.371036360 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.104268 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" Jan 20 22:51:24 crc kubenswrapper[5030]: E0120 22:51:24.104720 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" Jan 20 22:51:24 crc kubenswrapper[5030]: I0120 22:51:24.458980 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:24 crc kubenswrapper[5030]: I0120 22:51:24.459583 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:24 crc kubenswrapper[5030]: I0120 22:51:24.513888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:25 crc kubenswrapper[5030]: I0120 22:51:25.145788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:25 crc kubenswrapper[5030]: I0120 22:51:25.189677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:27 crc kubenswrapper[5030]: I0120 22:51:27.122163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2x56r" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="registry-server" containerID="cri-o://d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" gracePeriod=2 Jan 20 22:51:27 crc kubenswrapper[5030]: I0120 22:51:27.408302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:27 crc kubenswrapper[5030]: E0120 22:51:27.408509 5030 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:27 crc kubenswrapper[5030]: E0120 22:51:27.408608 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert podName:123d952b-3c67-49ea-9c09-82b04a14d494 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:35.408584401 +0000 UTC m=+967.728844699 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert") pod "infra-operator-controller-manager-77c48c7859-gtjfb" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494") : secret "infra-operator-webhook-server-cert" not found Jan 20 22:51:27 crc kubenswrapper[5030]: I0120 22:51:27.611420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:27 crc kubenswrapper[5030]: E0120 22:51:27.611564 5030 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:27 crc kubenswrapper[5030]: E0120 22:51:27.611645 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert podName:64490fad-ad11-4eea-b07f-c3c90f9bab8f nodeName:}" failed. No retries permitted until 2026-01-20 22:51:35.611608834 +0000 UTC m=+967.931869122 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert") pod "openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 22:51:28 crc kubenswrapper[5030]: I0120 22:51:28.124534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:28 crc kubenswrapper[5030]: I0120 22:51:28.124574 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:28 crc kubenswrapper[5030]: E0120 22:51:28.124759 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:28 crc kubenswrapper[5030]: E0120 22:51:28.124797 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:36.124784393 +0000 UTC m=+968.445044681 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:28 crc kubenswrapper[5030]: E0120 22:51:28.125024 5030 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 22:51:28 crc kubenswrapper[5030]: E0120 22:51:28.125047 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:36.125040208 +0000 UTC m=+968.445300496 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "metrics-server-cert" not found Jan 20 22:51:29 crc kubenswrapper[5030]: I0120 22:51:29.139762 5030 generic.go:334] "Generic (PLEG): container finished" podID="8405eebd-ea44-4878-8055-b19c51043955" containerID="d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" exitCode=0 Jan 20 22:51:29 crc kubenswrapper[5030]: I0120 22:51:29.139818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerDied","Data":"d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010"} Jan 20 22:51:34 crc kubenswrapper[5030]: E0120 22:51:34.459440 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010 is running failed: container process not found" containerID="d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 22:51:34 crc kubenswrapper[5030]: E0120 22:51:34.461534 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010 is running failed: container process not found" containerID="d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 22:51:34 crc kubenswrapper[5030]: E0120 22:51:34.462260 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010 is running failed: container process not found" containerID="d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 22:51:34 crc kubenswrapper[5030]: E0120 22:51:34.462456 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-2x56r" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="registry-server" Jan 20 22:51:35 crc 
kubenswrapper[5030]: I0120 22:51:35.447155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.453525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"infra-operator-controller-manager-77c48c7859-gtjfb\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:35 crc kubenswrapper[5030]: E0120 22:51:35.457557 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492" Jan 20 22:51:35 crc kubenswrapper[5030]: E0120 22:51:35.457715 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-prs4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
heat-operator-controller-manager-594c8c9d5d-cq5dj_openstack-operators(d7b05cde-2eb1-4879-bc97-dd0a420d7617): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:35 crc kubenswrapper[5030]: E0120 22:51:35.458861 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.507328 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lkpt2" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.516001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.649894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.653763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"openstack-baremetal-operator-controller-manager-5b9875986d7xnrt\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.863957 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-22zld" Jan 20 22:51:35 crc kubenswrapper[5030]: I0120 22:51:35.873066 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.061473 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.062406 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dv5lf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-cb4666565-dpmdr_openstack-operators(df584eed-267c-41ec-b019-6af9322630da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.063613 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" podUID="df584eed-267c-41ec-b019-6af9322630da" Jan 20 22:51:36 crc kubenswrapper[5030]: I0120 22:51:36.160130 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:36 crc kubenswrapper[5030]: I0120 22:51:36.160175 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.160355 5030 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.160402 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs podName:23bf9edf-4e02-4e6b-815d-c957b80275f0 nodeName:}" failed. No retries permitted until 2026-01-20 22:51:52.160387845 +0000 UTC m=+984.480648133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs") pod "openstack-operator-controller-manager-75bfd788c8-dslzz" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0") : secret "webhook-server-cert" not found Jan 20 22:51:36 crc kubenswrapper[5030]: I0120 22:51:36.164948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.195809 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" Jan 20 22:51:36 crc kubenswrapper[5030]: E0120 22:51:36.195971 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" podUID="df584eed-267c-41ec-b019-6af9322630da" Jan 20 22:51:37 crc kubenswrapper[5030]: E0120 22:51:37.653854 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8" Jan 20 22:51:37 crc kubenswrapper[5030]: E0120 22:51:37.654084 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-45n7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-9f958b845-9qgvp_openstack-operators(9a08b9f6-0526-42ba-9439-1fad41456c73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:37 crc kubenswrapper[5030]: E0120 22:51:37.655889 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.217685 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:0d59a405f50b37c833e14c0f4987e95c8769d9ab06a7087078bdd02568c18ca8\\\"\"" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.245653 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.245823 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-587tm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-77d5c5b54f-dh9wl_openstack-operators(dac66b05-4f64-4bad-9d59-9c01ccd16018): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.246949 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.885471 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.885712 5030 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knmnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-dvhb6_openstack-operators(caa5b6cd-4c8e-424e-8ba9-578ccd662eab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:38 crc kubenswrapper[5030]: E0120 22:51:38.886938 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.221878 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.225357 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:3311e627bcb860d9443592a2c67078417318c9eb77d8ef4d07f9aa7027d46822\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.381263 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.381569 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zz7hr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-mq6l5_openstack-operators(96630154-0812-445c-bd03-9a2066468891): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.384087 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" 
podUID="96630154-0812-445c-bd03-9a2066468891" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.914670 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.915004 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6fcjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-8nglg_openstack-operators(bee1b017-6c33-42be-82ba-9af5cd5b5362): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:39 crc kubenswrapper[5030]: E0120 22:51:39.916182 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" Jan 20 22:51:40 crc kubenswrapper[5030]: I0120 22:51:40.156973 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:51:40 crc kubenswrapper[5030]: I0120 22:51:40.157063 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.228802 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" podUID="96630154-0812-445c-bd03-9a2066468891" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.228936 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.450732 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.450941 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbnhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-sjfdt_openstack-operators(a2343f20-8163-4f12-85f6-d621d416444b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.452268 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" podUID="a2343f20-8163-4f12-85f6-d621d416444b" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.903882 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.904060 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2ndzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-2cdl5_openstack-operators(ad595f62-3880-4f00-8f70-879d673dcf55): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:51:40 crc kubenswrapper[5030]: E0120 22:51:40.905291 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" Jan 20 22:51:41 crc kubenswrapper[5030]: E0120 22:51:41.236137 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" podUID="a2343f20-8163-4f12-85f6-d621d416444b" Jan 20 22:51:41 crc kubenswrapper[5030]: E0120 22:51:41.236541 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.918826 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.947490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7pw7\" (UniqueName: \"kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7\") pod \"8405eebd-ea44-4878-8055-b19c51043955\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.947567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities\") pod \"8405eebd-ea44-4878-8055-b19c51043955\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.947640 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content\") pod \"8405eebd-ea44-4878-8055-b19c51043955\" (UID: \"8405eebd-ea44-4878-8055-b19c51043955\") " Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.949313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities" (OuterVolumeSpecName: "utilities") pod "8405eebd-ea44-4878-8055-b19c51043955" (UID: "8405eebd-ea44-4878-8055-b19c51043955"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.953944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7" (OuterVolumeSpecName: "kube-api-access-d7pw7") pod "8405eebd-ea44-4878-8055-b19c51043955" (UID: "8405eebd-ea44-4878-8055-b19c51043955"). InnerVolumeSpecName "kube-api-access-d7pw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:51:41 crc kubenswrapper[5030]: I0120 22:51:41.972320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8405eebd-ea44-4878-8055-b19c51043955" (UID: "8405eebd-ea44-4878-8055-b19c51043955"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.048899 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7pw7\" (UniqueName: \"kubernetes.io/projected/8405eebd-ea44-4878-8055-b19c51043955-kube-api-access-d7pw7\") on node \"crc\" DevicePath \"\"" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.048934 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.048945 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8405eebd-ea44-4878-8055-b19c51043955-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.242307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2x56r" event={"ID":"8405eebd-ea44-4878-8055-b19c51043955","Type":"ContainerDied","Data":"32ea6ca169368769ef3a5b001122acd90a415e37edb81072db38d95780b0cdfa"} Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.242355 5030 scope.go:117] "RemoveContainer" containerID="d836cf3ff923ff3efd33745e886bad734cb78c8c6df172a9fe37995cfb4c3010" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.242410 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2x56r" Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.269922 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:42 crc kubenswrapper[5030]: I0120 22:51:42.271366 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2x56r"] Jan 20 22:51:43 crc kubenswrapper[5030]: I0120 22:51:43.874971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 20 22:51:43 crc kubenswrapper[5030]: I0120 22:51:43.971445 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8405eebd-ea44-4878-8055-b19c51043955" path="/var/lib/kubelet/pods/8405eebd-ea44-4878-8055-b19c51043955/volumes" Jan 20 22:51:44 crc kubenswrapper[5030]: W0120 22:51:44.016915 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64490fad_ad11_4eea_b07f_c3c90f9bab8f.slice/crio-cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30 WatchSource:0}: Error finding container cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30: Status 404 returned error can't find the container with id cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30 Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.026267 5030 scope.go:117] "RemoveContainer" containerID="a358395e4352bb8064d92cfbd553b6128fbd8bc809547cfc472ee1732d8118eb" Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.172597 5030 scope.go:117] "RemoveContainer" containerID="16effcd85d354bec2fed42bac0c4db35483f5c6f1a11d72db0a0443fe026b071" Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.260152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" 
event={"ID":"98fe7bc7-a9a3-4681-9d86-2d134e784ed0","Type":"ContainerStarted","Data":"79c3796b053a1e0dbfc7557940757713515a2dd19e862a09c993127480e0e1fc"} Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.265898 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.268729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" event={"ID":"64490fad-ad11-4eea-b07f-c3c90f9bab8f","Type":"ContainerStarted","Data":"cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30"} Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.284007 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" podStartSLOduration=5.505973368 podStartE2EDuration="25.283991105s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.656320373 +0000 UTC m=+952.976580651" lastFinishedPulling="2026-01-20 22:51:40.4343381 +0000 UTC m=+972.754598388" observedRunningTime="2026-01-20 22:51:44.279593748 +0000 UTC m=+976.599854036" watchObservedRunningTime="2026-01-20 22:51:44.283991105 +0000 UTC m=+976.604251393" Jan 20 22:51:44 crc kubenswrapper[5030]: I0120 22:51:44.436520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.283448 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" event={"ID":"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2","Type":"ContainerStarted","Data":"24e74137735a20307ecb6198bd20f9e08759fa32dbac36ab6f996478b164bc37"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.284076 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.289715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" event={"ID":"123d952b-3c67-49ea-9c09-82b04a14d494","Type":"ContainerStarted","Data":"4c56725c9af1fd9984d9fe795fecb822e35ef346cecd8821f436327204352292"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.295118 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" event={"ID":"f32a8aff-01df-47f8-83db-03ee02703e0f","Type":"ContainerStarted","Data":"4b004e6eb4eea715e4d1d1225ebf941708a403dfaa7b81a3c79884fc564712b1"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.295497 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.296760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" event={"ID":"bf527df4-5b46-4894-ac83-7b5613ee7fae","Type":"ContainerStarted","Data":"24c0e2734280ba547ae5ef0d76f16dddb92ff6ddcf0d95bf42ae9dbf234ed756"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.297116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.298338 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" podStartSLOduration=3.5543074199999998 podStartE2EDuration="26.298328605s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.325536804 +0000 UTC m=+953.645797092" lastFinishedPulling="2026-01-20 22:51:44.069557989 +0000 UTC m=+976.389818277" observedRunningTime="2026-01-20 22:51:45.2962585 +0000 UTC m=+977.616518788" watchObservedRunningTime="2026-01-20 22:51:45.298328605 +0000 UTC m=+977.618588893" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.304246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" event={"ID":"1eab1afc-3438-48e1-9210-c49f8c94316d","Type":"ContainerStarted","Data":"5fb34c8e29ac29cd7e990d10eebe752719765609284b211b3eb8c9c3932928d3"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.304798 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.308227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" event={"ID":"c9aea915-81f5-4204-9d3b-4d107d4678a1","Type":"ContainerStarted","Data":"aca7b7399a1f0721b34e01b5e2aa237c8d9035db4cf09a589bb3d945a0420ae7"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.308689 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.310131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" event={"ID":"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225","Type":"ContainerStarted","Data":"29788f9d9bf1fc564feb6809bf41ef8c112495b8e2f29373c56d5f6af7a181b4"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.311428 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.314259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" event={"ID":"a4f33608-dc70-40e3-b757-8410b4f1938e","Type":"ContainerStarted","Data":"7ded8ff756807b2d90f4c65bdc54ae4f12c5406d841f6298740660fc611f6f3b"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.314518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.315432 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" podStartSLOduration=3.490857245 podStartE2EDuration="26.315420463s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.234276444 +0000 UTC m=+953.554536732" lastFinishedPulling="2026-01-20 22:51:44.058839662 +0000 UTC m=+976.379099950" observedRunningTime="2026-01-20 22:51:45.308695825 +0000 UTC m=+977.628956113" 
watchObservedRunningTime="2026-01-20 22:51:45.315420463 +0000 UTC m=+977.635680751" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.316322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" event={"ID":"b73ba68f-7653-47c8-bd19-4a3f05afad72","Type":"ContainerStarted","Data":"c3c465ef33d6aab6dd4b7551ec2eac0ee39cb7f40222d561b8e2d9fac3cdaa83"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.316778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.318427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" event={"ID":"1d5ff7f0-0960-44b0-8637-446aa3906d52","Type":"ContainerStarted","Data":"b720d678cbbfa9862850b06990f6e49cefc3e136fd89594ae535f4c99f8bf0f2"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.318516 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.319710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" event={"ID":"708640d3-cade-4b2d-b378-1f25cafb3575","Type":"ContainerStarted","Data":"f316aa61f7ada9f40c18b6a646bb9bdcb06db631aab04eb4aa5b0679c6f42397"} Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.319984 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.329818 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" podStartSLOduration=3.578018825 podStartE2EDuration="26.329799872s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.307162687 +0000 UTC m=+953.627422975" lastFinishedPulling="2026-01-20 22:51:44.058943714 +0000 UTC m=+976.379204022" observedRunningTime="2026-01-20 22:51:45.327491581 +0000 UTC m=+977.647751869" watchObservedRunningTime="2026-01-20 22:51:45.329799872 +0000 UTC m=+977.650060160" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.355845 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podStartSLOduration=3.614714908 podStartE2EDuration="26.355828598s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.325302249 +0000 UTC m=+953.645562537" lastFinishedPulling="2026-01-20 22:51:44.066415939 +0000 UTC m=+976.386676227" observedRunningTime="2026-01-20 22:51:45.351797958 +0000 UTC m=+977.672058246" watchObservedRunningTime="2026-01-20 22:51:45.355828598 +0000 UTC m=+977.676088886" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.379262 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" podStartSLOduration=6.155552444 podStartE2EDuration="26.379246616s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.655994855 +0000 UTC m=+952.976255143" lastFinishedPulling="2026-01-20 22:51:40.879689027 +0000 UTC 
m=+973.199949315" observedRunningTime="2026-01-20 22:51:45.372501407 +0000 UTC m=+977.692761705" watchObservedRunningTime="2026-01-20 22:51:45.379246616 +0000 UTC m=+977.699506904" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.391718 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" podStartSLOduration=7.057452057 podStartE2EDuration="26.391699392s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.098824147 +0000 UTC m=+953.419084435" lastFinishedPulling="2026-01-20 22:51:40.433071472 +0000 UTC m=+972.753331770" observedRunningTime="2026-01-20 22:51:45.386989788 +0000 UTC m=+977.707250076" watchObservedRunningTime="2026-01-20 22:51:45.391699392 +0000 UTC m=+977.711959680" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.401428 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" podStartSLOduration=6.546343233 podStartE2EDuration="26.401414117s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.578399208 +0000 UTC m=+952.898659496" lastFinishedPulling="2026-01-20 22:51:40.433470082 +0000 UTC m=+972.753730380" observedRunningTime="2026-01-20 22:51:45.400874375 +0000 UTC m=+977.721134663" watchObservedRunningTime="2026-01-20 22:51:45.401414117 +0000 UTC m=+977.721674405" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.422173 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" podStartSLOduration=3.704671969 podStartE2EDuration="26.422157947s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.352064151 +0000 UTC m=+953.672324439" lastFinishedPulling="2026-01-20 22:51:44.069550129 +0000 UTC m=+976.389810417" observedRunningTime="2026-01-20 22:51:45.416806008 +0000 UTC m=+977.737066286" watchObservedRunningTime="2026-01-20 22:51:45.422157947 +0000 UTC m=+977.742418235" Jan 20 22:51:45 crc kubenswrapper[5030]: I0120 22:51:45.431333 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podStartSLOduration=3.667060257 podStartE2EDuration="26.431315199s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.307400633 +0000 UTC m=+953.627660921" lastFinishedPulling="2026-01-20 22:51:44.071655575 +0000 UTC m=+976.391915863" observedRunningTime="2026-01-20 22:51:45.429498969 +0000 UTC m=+977.749759257" watchObservedRunningTime="2026-01-20 22:51:45.431315199 +0000 UTC m=+977.751575477" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.273982 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" podStartSLOduration=8.07379536 podStartE2EDuration="27.273965159s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.233228581 +0000 UTC m=+953.553488869" lastFinishedPulling="2026-01-20 22:51:40.43339838 +0000 UTC m=+972.753658668" observedRunningTime="2026-01-20 22:51:45.452973689 +0000 UTC m=+977.773233987" watchObservedRunningTime="2026-01-20 22:51:46.273965159 +0000 UTC m=+978.594225447" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.279315 5030 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:51:46 crc kubenswrapper[5030]: E0120 22:51:46.279604 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="registry-server" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.279630 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="registry-server" Jan 20 22:51:46 crc kubenswrapper[5030]: E0120 22:51:46.279640 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="extract-utilities" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.279647 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="extract-utilities" Jan 20 22:51:46 crc kubenswrapper[5030]: E0120 22:51:46.279664 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="extract-content" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.279670 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="extract-content" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.279788 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8405eebd-ea44-4878-8055-b19c51043955" containerName="registry-server" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.284808 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.298825 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.410387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.410503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2mc\" (UniqueName: \"kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.410529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.512414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 
22:51:46.512600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2mc\" (UniqueName: \"kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.512656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.513564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.514076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.533604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2mc\" (UniqueName: \"kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc\") pod \"community-operators-wf9r2\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:46 crc kubenswrapper[5030]: I0120 22:51:46.607553 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:47 crc kubenswrapper[5030]: I0120 22:51:47.675244 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:51:47 crc kubenswrapper[5030]: W0120 22:51:47.682341 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128ab8d5_e05a_4865_9479_8ab947928ad1.slice/crio-a545b8b2930feee2eb02ce45771ea87a1341760fb8872bf21f3e3277c60c9763 WatchSource:0}: Error finding container a545b8b2930feee2eb02ce45771ea87a1341760fb8872bf21f3e3277c60c9763: Status 404 returned error can't find the container with id a545b8b2930feee2eb02ce45771ea87a1341760fb8872bf21f3e3277c60c9763 Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.355286 5030 generic.go:334] "Generic (PLEG): container finished" podID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerID="c24a4fbc84cef1033cd2e3479152aa7a0374d15b8aa281a9a909788aed962c53" exitCode=0 Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.355373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerDied","Data":"c24a4fbc84cef1033cd2e3479152aa7a0374d15b8aa281a9a909788aed962c53"} Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.355797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerStarted","Data":"a545b8b2930feee2eb02ce45771ea87a1341760fb8872bf21f3e3277c60c9763"} Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.359152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" event={"ID":"64490fad-ad11-4eea-b07f-c3c90f9bab8f","Type":"ContainerStarted","Data":"40584f451b0c6b41ad84e7c1a4e84ad8546905f21a6fe716959214b8c35e3b2e"} Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.359286 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.362055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" event={"ID":"123d952b-3c67-49ea-9c09-82b04a14d494","Type":"ContainerStarted","Data":"5abe53d03f892d3544b17d81eba417d5a56af62aff5ab61676205fa77053baf6"} Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.362925 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.455213 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" podStartSLOduration=26.620777393 podStartE2EDuration="29.455195927s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:44.459474729 +0000 UTC m=+976.779735017" lastFinishedPulling="2026-01-20 22:51:47.293893243 +0000 UTC m=+979.614153551" observedRunningTime="2026-01-20 22:51:48.454034451 +0000 UTC m=+980.774294749" watchObservedRunningTime="2026-01-20 22:51:48.455195927 +0000 UTC m=+980.775456225" Jan 20 22:51:48 crc kubenswrapper[5030]: I0120 22:51:48.458871 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" podStartSLOduration=26.195673655 podStartE2EDuration="29.458860579s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:44.02620202 +0000 UTC m=+976.346462308" lastFinishedPulling="2026-01-20 22:51:47.289388944 +0000 UTC m=+979.609649232" observedRunningTime="2026-01-20 22:51:48.426694297 +0000 UTC m=+980.746954625" watchObservedRunningTime="2026-01-20 22:51:48.458860579 +0000 UTC m=+980.779120887" Jan 20 22:51:49 crc kubenswrapper[5030]: I0120 22:51:49.397094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerStarted","Data":"2581dc85f2b47e5eebf7208a1e708316fbc81f6fc2018b8b661c08e180e10d2e"} Jan 20 22:51:49 crc kubenswrapper[5030]: I0120 22:51:49.629200 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 20 22:51:49 crc kubenswrapper[5030]: I0120 22:51:49.650412 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 20 22:51:49 crc kubenswrapper[5030]: I0120 22:51:49.748000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.086772 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.254072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.311306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.354286 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.409739 5030 generic.go:334] "Generic (PLEG): container finished" podID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerID="2581dc85f2b47e5eebf7208a1e708316fbc81f6fc2018b8b661c08e180e10d2e" exitCode=0 Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.409870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerDied","Data":"2581dc85f2b47e5eebf7208a1e708316fbc81f6fc2018b8b661c08e180e10d2e"} Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.422693 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.441553 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.469914 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 20 22:51:50 crc kubenswrapper[5030]: I0120 22:51:50.498397 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 20 22:51:51 crc kubenswrapper[5030]: I0120 22:51:51.425332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerStarted","Data":"177a55e91198413a54815644032a87b55c40e544eea946e8a83ddc22f631b452"} Jan 20 22:51:51 crc kubenswrapper[5030]: I0120 22:51:51.448400 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wf9r2" podStartSLOduration=2.902413036 podStartE2EDuration="5.448385756s" podCreationTimestamp="2026-01-20 22:51:46 +0000 UTC" firstStartedPulling="2026-01-20 22:51:48.358745802 +0000 UTC m=+980.679006130" lastFinishedPulling="2026-01-20 22:51:50.904718552 +0000 UTC m=+983.224978850" observedRunningTime="2026-01-20 22:51:51.445081952 +0000 UTC m=+983.765342250" watchObservedRunningTime="2026-01-20 22:51:51.448385756 +0000 UTC m=+983.768646044" Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.194249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.201852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"openstack-operator-controller-manager-75bfd788c8-dslzz\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.222107 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-z6q8k" Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.230560 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.437544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" event={"ID":"df584eed-267c-41ec-b019-6af9322630da","Type":"ContainerStarted","Data":"e65d03a216edb99e7ff8a94c7c71b7abe78524f83d9aa3650be11579769cdee0"} Jan 20 22:51:52 crc kubenswrapper[5030]: I0120 22:51:52.693529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 20 22:51:52 crc kubenswrapper[5030]: W0120 22:51:52.702307 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23bf9edf_4e02_4e6b_815d_c957b80275f0.slice/crio-4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1 WatchSource:0}: Error finding container 4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1: Status 404 returned error can't find the container with id 4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1 Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.450471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" event={"ID":"dac66b05-4f64-4bad-9d59-9c01ccd16018","Type":"ContainerStarted","Data":"99b032a36da1aba76e618c982844826d6f9973086eeb750125c45339572b6225"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.451120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.453084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" event={"ID":"23bf9edf-4e02-4e6b-815d-c957b80275f0","Type":"ContainerStarted","Data":"09ab300b6027b5cb312945c9d6e4e14bd2732b961fe4b0fc413eabbd873d0271"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.453358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" event={"ID":"23bf9edf-4e02-4e6b-815d-c957b80275f0","Type":"ContainerStarted","Data":"4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.453550 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.454782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" event={"ID":"d7b05cde-2eb1-4879-bc97-dd0a420d7617","Type":"ContainerStarted","Data":"71b58c9561c91a3cafd3472b3584f780486758bc8767e83ad2f39a640c5cd1d5"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.455077 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.456079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" 
event={"ID":"caa5b6cd-4c8e-424e-8ba9-578ccd662eab","Type":"ContainerStarted","Data":"0b82f35514b33886fb833fea714971e2e950a0b2b66749e86203ca588d1951df"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.456271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.457599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" event={"ID":"9a08b9f6-0526-42ba-9439-1fad41456c73","Type":"ContainerStarted","Data":"eb9e1f492058fcaf14bce1a2e0dddb06a461741cf089920f0fd020ef503b9cb8"} Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.457737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.457913 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.481929 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" podStartSLOduration=2.527335121 podStartE2EDuration="34.481908434s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.915845166 +0000 UTC m=+953.236105454" lastFinishedPulling="2026-01-20 22:51:52.870418479 +0000 UTC m=+985.190678767" observedRunningTime="2026-01-20 22:51:53.47854792 +0000 UTC m=+985.798808218" watchObservedRunningTime="2026-01-20 22:51:53.481908434 +0000 UTC m=+985.802168722" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.501019 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" podStartSLOduration=4.229123758 podStartE2EDuration="34.500994077s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.111703662 +0000 UTC m=+953.431963950" lastFinishedPulling="2026-01-20 22:51:51.383573971 +0000 UTC m=+983.703834269" observedRunningTime="2026-01-20 22:51:53.495059804 +0000 UTC m=+985.815320092" watchObservedRunningTime="2026-01-20 22:51:53.500994077 +0000 UTC m=+985.821254375" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.518688 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" podStartSLOduration=33.518671547 podStartE2EDuration="33.518671547s" podCreationTimestamp="2026-01-20 22:51:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:51:53.51610935 +0000 UTC m=+985.836369638" watchObservedRunningTime="2026-01-20 22:51:53.518671547 +0000 UTC m=+985.838931845" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.535505 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" podStartSLOduration=3.867563134 podStartE2EDuration="34.535488049s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.765733474 +0000 UTC m=+953.085993762" lastFinishedPulling="2026-01-20 22:51:51.433658399 +0000 UTC m=+983.753918677" 
observedRunningTime="2026-01-20 22:51:53.532262608 +0000 UTC m=+985.852522896" watchObservedRunningTime="2026-01-20 22:51:53.535488049 +0000 UTC m=+985.855748357" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.548729 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" podStartSLOduration=2.6572664059999997 podStartE2EDuration="34.548707163s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.9024708 +0000 UTC m=+953.222731088" lastFinishedPulling="2026-01-20 22:51:52.793911547 +0000 UTC m=+985.114171845" observedRunningTime="2026-01-20 22:51:53.547479275 +0000 UTC m=+985.867739573" watchObservedRunningTime="2026-01-20 22:51:53.548707163 +0000 UTC m=+985.868967471" Jan 20 22:51:53 crc kubenswrapper[5030]: I0120 22:51:53.576195 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" podStartSLOduration=2.344885782 podStartE2EDuration="34.57616949s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.562590338 +0000 UTC m=+952.882850616" lastFinishedPulling="2026-01-20 22:51:52.793874036 +0000 UTC m=+985.114134324" observedRunningTime="2026-01-20 22:51:53.570742191 +0000 UTC m=+985.891002489" watchObservedRunningTime="2026-01-20 22:51:53.57616949 +0000 UTC m=+985.896429818" Jan 20 22:51:54 crc kubenswrapper[5030]: I0120 22:51:54.484910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" event={"ID":"bee1b017-6c33-42be-82ba-9af5cd5b5362","Type":"ContainerStarted","Data":"10c57e78eced15714eb0ea981712a0028cd5ff4428918476dae035318d85d001"} Jan 20 22:51:54 crc kubenswrapper[5030]: I0120 22:51:54.512612 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" podStartSLOduration=2.349343035 podStartE2EDuration="34.512594286s" podCreationTimestamp="2026-01-20 22:51:20 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.45589723 +0000 UTC m=+953.776157528" lastFinishedPulling="2026-01-20 22:51:53.619148471 +0000 UTC m=+985.939408779" observedRunningTime="2026-01-20 22:51:54.509910447 +0000 UTC m=+986.830170745" watchObservedRunningTime="2026-01-20 22:51:54.512594286 +0000 UTC m=+986.832854574" Jan 20 22:51:55 crc kubenswrapper[5030]: I0120 22:51:55.525985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 20 22:51:55 crc kubenswrapper[5030]: I0120 22:51:55.884826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.504136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" event={"ID":"ad595f62-3880-4f00-8f70-879d673dcf55","Type":"ContainerStarted","Data":"0aa89417ba2f6c0729f3c89c1c5b9fc64701d1b0b7276ab33b14d2c745d5d230"} Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.504345 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.528818 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" podStartSLOduration=2.420137826 podStartE2EDuration="37.52879829s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:20.892756425 +0000 UTC m=+953.213016713" lastFinishedPulling="2026-01-20 22:51:56.001416869 +0000 UTC m=+988.321677177" observedRunningTime="2026-01-20 22:51:56.526391757 +0000 UTC m=+988.846652095" watchObservedRunningTime="2026-01-20 22:51:56.52879829 +0000 UTC m=+988.849058588" Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.607900 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.608159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:56 crc kubenswrapper[5030]: I0120 22:51:56.692632 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.514662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" event={"ID":"a2343f20-8163-4f12-85f6-d621d416444b","Type":"ContainerStarted","Data":"a04440c595a263c44973ca82a87dcf37eb9e37396951ad858d0befee7751d427"} Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.514905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.517391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" event={"ID":"96630154-0812-445c-bd03-9a2066468891","Type":"ContainerStarted","Data":"6565260d57febc636c85ea1add62c31f1272ab12c8e5392a27d79459653a3bd3"} Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.518021 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.542180 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" podStartSLOduration=3.181601002 podStartE2EDuration="38.542153869s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.104044152 +0000 UTC m=+953.424304440" lastFinishedPulling="2026-01-20 22:51:56.464596979 +0000 UTC m=+988.784857307" observedRunningTime="2026-01-20 22:51:57.535372399 +0000 UTC m=+989.855632697" watchObservedRunningTime="2026-01-20 22:51:57.542153869 +0000 UTC m=+989.862414187" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.562659 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" podStartSLOduration=3.201136535 podStartE2EDuration="38.562639383s" podCreationTimestamp="2026-01-20 22:51:19 +0000 UTC" firstStartedPulling="2026-01-20 22:51:21.101653169 +0000 UTC m=+953.421913457" lastFinishedPulling="2026-01-20 22:51:56.463155977 +0000 UTC m=+988.783416305" observedRunningTime="2026-01-20 22:51:57.55437101 +0000 UTC m=+989.874631328" watchObservedRunningTime="2026-01-20 22:51:57.562639383 
+0000 UTC m=+989.882899671" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.584333 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:51:57 crc kubenswrapper[5030]: I0120 22:51:57.643838 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:51:59 crc kubenswrapper[5030]: I0120 22:51:59.536200 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wf9r2" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="registry-server" containerID="cri-o://177a55e91198413a54815644032a87b55c40e544eea946e8a83ddc22f631b452" gracePeriod=2 Jan 20 22:51:59 crc kubenswrapper[5030]: I0120 22:51:59.726421 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 20 22:52:00 crc kubenswrapper[5030]: I0120 22:52:00.025475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 20 22:52:00 crc kubenswrapper[5030]: I0120 22:52:00.087059 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 20 22:52:00 crc kubenswrapper[5030]: I0120 22:52:00.110836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 20 22:52:00 crc kubenswrapper[5030]: I0120 22:52:00.174183 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 20 22:52:01 crc kubenswrapper[5030]: I0120 22:52:01.558180 5030 generic.go:334] "Generic (PLEG): container finished" podID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerID="177a55e91198413a54815644032a87b55c40e544eea946e8a83ddc22f631b452" exitCode=0 Jan 20 22:52:01 crc kubenswrapper[5030]: I0120 22:52:01.558265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerDied","Data":"177a55e91198413a54815644032a87b55c40e544eea946e8a83ddc22f631b452"} Jan 20 22:52:02 crc kubenswrapper[5030]: I0120 22:52:02.236968 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.113533 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.202364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities\") pod \"128ab8d5-e05a-4865-9479-8ab947928ad1\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.202507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content\") pod \"128ab8d5-e05a-4865-9479-8ab947928ad1\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.202530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx2mc\" (UniqueName: \"kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc\") pod \"128ab8d5-e05a-4865-9479-8ab947928ad1\" (UID: \"128ab8d5-e05a-4865-9479-8ab947928ad1\") " Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.203521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities" (OuterVolumeSpecName: "utilities") pod "128ab8d5-e05a-4865-9479-8ab947928ad1" (UID: "128ab8d5-e05a-4865-9479-8ab947928ad1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.208301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc" (OuterVolumeSpecName: "kube-api-access-xx2mc") pod "128ab8d5-e05a-4865-9479-8ab947928ad1" (UID: "128ab8d5-e05a-4865-9479-8ab947928ad1"). InnerVolumeSpecName "kube-api-access-xx2mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.260965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128ab8d5-e05a-4865-9479-8ab947928ad1" (UID: "128ab8d5-e05a-4865-9479-8ab947928ad1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.304129 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.304168 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128ab8d5-e05a-4865-9479-8ab947928ad1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.304183 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx2mc\" (UniqueName: \"kubernetes.io/projected/128ab8d5-e05a-4865-9479-8ab947928ad1-kube-api-access-xx2mc\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.608779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf9r2" event={"ID":"128ab8d5-e05a-4865-9479-8ab947928ad1","Type":"ContainerDied","Data":"a545b8b2930feee2eb02ce45771ea87a1341760fb8872bf21f3e3277c60c9763"} Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.608840 5030 scope.go:117] "RemoveContainer" containerID="177a55e91198413a54815644032a87b55c40e544eea946e8a83ddc22f631b452" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.609516 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wf9r2" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.639465 5030 scope.go:117] "RemoveContainer" containerID="2581dc85f2b47e5eebf7208a1e708316fbc81f6fc2018b8b661c08e180e10d2e" Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.662816 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.667749 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wf9r2"] Jan 20 22:52:06 crc kubenswrapper[5030]: I0120 22:52:06.679294 5030 scope.go:117] "RemoveContainer" containerID="c24a4fbc84cef1033cd2e3479152aa7a0374d15b8aa281a9a909788aed962c53" Jan 20 22:52:07 crc kubenswrapper[5030]: I0120 22:52:07.977849 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" path="/var/lib/kubelet/pods/128ab8d5-e05a-4865-9479-8ab947928ad1/volumes" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.060422 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.089959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.157539 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.157697 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.157796 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.159226 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.159405 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f" gracePeriod=600 Jan 20 22:52:10 crc kubenswrapper[5030]: I0120 22:52:10.230146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 20 22:52:13 crc kubenswrapper[5030]: I0120 22:52:13.675695 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f" exitCode=0 Jan 20 22:52:13 crc kubenswrapper[5030]: I0120 22:52:13.677156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f"} Jan 20 22:52:13 crc kubenswrapper[5030]: I0120 22:52:13.677211 5030 scope.go:117] "RemoveContainer" containerID="495572ff74c26f1ee253c86f7edeb898261f7fe51c82acc33281d83f9074845b" Jan 20 22:52:14 crc kubenswrapper[5030]: I0120 22:52:14.685719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3"} Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.116102 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-8stz9"] Jan 20 22:52:17 crc kubenswrapper[5030]: E0120 22:52:17.117434 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="extract-utilities" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.117467 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="extract-utilities" Jan 20 22:52:17 crc kubenswrapper[5030]: E0120 22:52:17.117496 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="registry-server" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.117516 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="registry-server" Jan 20 22:52:17 crc kubenswrapper[5030]: E0120 22:52:17.117552 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="extract-content" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.117571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="extract-content" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.118112 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="128ab8d5-e05a-4865-9479-8ab947928ad1" containerName="registry-server" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.119174 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.121562 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.121736 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.122718 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.125601 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.128042 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-8stz9"] Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.285903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.286001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgppr\" (UniqueName: \"kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.286046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.387891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.388001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.388097 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgppr\" (UniqueName: \"kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.388460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.388809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.411003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgppr\" (UniqueName: \"kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr\") pod \"crc-storage-crc-8stz9\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.452228 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:17 crc kubenswrapper[5030]: I0120 22:52:17.895284 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-8stz9"] Jan 20 22:52:17 crc kubenswrapper[5030]: W0120 22:52:17.901825 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod943b6848_5a8e_4e04_92b3_21601e4b1ebb.slice/crio-d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7 WatchSource:0}: Error finding container d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7: Status 404 returned error can't find the container with id d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7 Jan 20 22:52:18 crc kubenswrapper[5030]: I0120 22:52:18.714310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8stz9" event={"ID":"943b6848-5a8e-4e04-92b3-21601e4b1ebb","Type":"ContainerStarted","Data":"d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7"} Jan 20 22:52:19 crc kubenswrapper[5030]: I0120 22:52:19.726777 5030 generic.go:334] "Generic (PLEG): container finished" podID="943b6848-5a8e-4e04-92b3-21601e4b1ebb" containerID="9d6749cd3ec4dbd9236dd6d285b026b84ee8f1460d06ad6511184a6fa9b84e5b" exitCode=0 Jan 20 22:52:19 crc kubenswrapper[5030]: I0120 22:52:19.726841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8stz9" event={"ID":"943b6848-5a8e-4e04-92b3-21601e4b1ebb","Type":"ContainerDied","Data":"9d6749cd3ec4dbd9236dd6d285b026b84ee8f1460d06ad6511184a6fa9b84e5b"} Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.064808 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.250957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgppr\" (UniqueName: \"kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr\") pod \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.251115 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage\") pod \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.251229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt\") pod \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\" (UID: \"943b6848-5a8e-4e04-92b3-21601e4b1ebb\") " Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.251690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "943b6848-5a8e-4e04-92b3-21601e4b1ebb" (UID: "943b6848-5a8e-4e04-92b3-21601e4b1ebb"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.257594 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr" (OuterVolumeSpecName: "kube-api-access-mgppr") pod "943b6848-5a8e-4e04-92b3-21601e4b1ebb" (UID: "943b6848-5a8e-4e04-92b3-21601e4b1ebb"). InnerVolumeSpecName "kube-api-access-mgppr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.279591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "943b6848-5a8e-4e04-92b3-21601e4b1ebb" (UID: "943b6848-5a8e-4e04-92b3-21601e4b1ebb"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.352975 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgppr\" (UniqueName: \"kubernetes.io/projected/943b6848-5a8e-4e04-92b3-21601e4b1ebb-kube-api-access-mgppr\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.353016 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/943b6848-5a8e-4e04-92b3-21601e4b1ebb-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.353029 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/943b6848-5a8e-4e04-92b3-21601e4b1ebb-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.744865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-8stz9" event={"ID":"943b6848-5a8e-4e04-92b3-21601e4b1ebb","Type":"ContainerDied","Data":"d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7"} Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.744901 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8003d0a2f87595556fbd08fd3888dd92ae756f94d3f40c49e97607743fd0ca7" Jan 20 22:52:21 crc kubenswrapper[5030]: I0120 22:52:21.744957 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-8stz9" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.510094 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-8stz9"] Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.517041 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-8stz9"] Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.641882 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-sqskk"] Jan 20 22:52:24 crc kubenswrapper[5030]: E0120 22:52:24.642348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943b6848-5a8e-4e04-92b3-21601e4b1ebb" containerName="storage" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.642372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="943b6848-5a8e-4e04-92b3-21601e4b1ebb" containerName="storage" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.642674 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="943b6848-5a8e-4e04-92b3-21601e4b1ebb" containerName="storage" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.643370 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.646378 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.646601 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.647262 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.648817 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.668346 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sqskk"] Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.699161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.699343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.699385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xqb\" (UniqueName: \"kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.800293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.800447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.800479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xqb\" (UniqueName: \"kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.801268 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " 
pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.802263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.831770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xqb\" (UniqueName: \"kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb\") pod \"crc-storage-crc-sqskk\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:24 crc kubenswrapper[5030]: I0120 22:52:24.972989 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:25 crc kubenswrapper[5030]: I0120 22:52:25.447778 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sqskk"] Jan 20 22:52:25 crc kubenswrapper[5030]: W0120 22:52:25.455702 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod460f4945_a971_411f_93ab_f36930319651.slice/crio-8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124 WatchSource:0}: Error finding container 8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124: Status 404 returned error can't find the container with id 8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124 Jan 20 22:52:25 crc kubenswrapper[5030]: I0120 22:52:25.775374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sqskk" event={"ID":"460f4945-a971-411f-93ab-f36930319651","Type":"ContainerStarted","Data":"8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124"} Jan 20 22:52:25 crc kubenswrapper[5030]: I0120 22:52:25.975316 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943b6848-5a8e-4e04-92b3-21601e4b1ebb" path="/var/lib/kubelet/pods/943b6848-5a8e-4e04-92b3-21601e4b1ebb/volumes" Jan 20 22:52:26 crc kubenswrapper[5030]: I0120 22:52:26.790022 5030 generic.go:334] "Generic (PLEG): container finished" podID="460f4945-a971-411f-93ab-f36930319651" containerID="edb04c1ded6bcd629be12b1024e65245d152a0cef4ff2887df6e7b31b08de55f" exitCode=0 Jan 20 22:52:26 crc kubenswrapper[5030]: I0120 22:52:26.790328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sqskk" event={"ID":"460f4945-a971-411f-93ab-f36930319651","Type":"ContainerDied","Data":"edb04c1ded6bcd629be12b1024e65245d152a0cef4ff2887df6e7b31b08de55f"} Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.058225 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.253347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt\") pod \"460f4945-a971-411f-93ab-f36930319651\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.253446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xqb\" (UniqueName: \"kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb\") pod \"460f4945-a971-411f-93ab-f36930319651\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.253463 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "460f4945-a971-411f-93ab-f36930319651" (UID: "460f4945-a971-411f-93ab-f36930319651"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.253501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage\") pod \"460f4945-a971-411f-93ab-f36930319651\" (UID: \"460f4945-a971-411f-93ab-f36930319651\") " Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.253885 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/460f4945-a971-411f-93ab-f36930319651-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.261259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb" (OuterVolumeSpecName: "kube-api-access-n5xqb") pod "460f4945-a971-411f-93ab-f36930319651" (UID: "460f4945-a971-411f-93ab-f36930319651"). InnerVolumeSpecName "kube-api-access-n5xqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.283129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "460f4945-a971-411f-93ab-f36930319651" (UID: "460f4945-a971-411f-93ab-f36930319651"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.354568 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xqb\" (UniqueName: \"kubernetes.io/projected/460f4945-a971-411f-93ab-f36930319651-kube-api-access-n5xqb\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.354598 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/460f4945-a971-411f-93ab-f36930319651-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.811177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sqskk" event={"ID":"460f4945-a971-411f-93ab-f36930319651","Type":"ContainerDied","Data":"8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124"} Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.811475 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8297d94935fa0e05d132a8b7cb976c87d3ef8ea572e2913b3ff7aa7475ce3124" Jan 20 22:52:28 crc kubenswrapper[5030]: I0120 22:52:28.811272 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sqskk" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.632597 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:52:31 crc kubenswrapper[5030]: E0120 22:52:31.633111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460f4945-a971-411f-93ab-f36930319651" containerName="storage" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.633122 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="460f4945-a971-411f-93ab-f36930319651" containerName="storage" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.633241 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="460f4945-a971-411f-93ab-f36930319651" containerName="storage" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.633923 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.636181 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openshift-service-ca.crt" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.636437 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kube-root-ca.crt" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.636781 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dnsmasq" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.636786 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dnsmasq-dnsmasq-dockercfg-mh7wx" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.656665 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.761716 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.762750 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.764370 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dnsmasq-svc" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.772874 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.811241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6zx2\" (UniqueName: \"kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.811279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zdt\" (UniqueName: \"kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.811298 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.811430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.811496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.912376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zx2\" (UniqueName: \"kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.912410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zdt\" (UniqueName: \"kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.912431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.912456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.912497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.913281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.913369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.913450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.941331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zdt\" (UniqueName: \"kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt\") pod \"dnsmasq-dnsmasq-f5849d7b9-4nsmn\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.941578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zx2\" (UniqueName: \"kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2\") pod \"dnsmasq-dnsmasq-84b9f45d47-q9pxj\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:31 crc kubenswrapper[5030]: I0120 22:52:31.952389 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.084859 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.230117 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.239270 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.525928 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.851830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" event={"ID":"044322cd-eba1-4ff9-9e14-dc8e9d02caf2","Type":"ContainerStarted","Data":"3177fffc6728f06139faee1119135db46ac46c3595181626c740f1fa952dd33d"} Jan 20 22:52:32 crc kubenswrapper[5030]: I0120 22:52:32.853945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" event={"ID":"12203532-a911-44b9-a1f8-e63b85b30a32","Type":"ContainerStarted","Data":"1a96d8919232e73788dec23dfe45a254919e8779c7d984fa19ab00fb9c412e42"} Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.477083 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.478445 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480267 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480448 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480678 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480698 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-8mqbb" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.480897 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.482596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.494290 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593108 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8sj\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.593316 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8sj\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.694971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.695508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.696002 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.696596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.696759 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.696993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.697021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.701988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.703649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.705453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.707710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.713472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8sj\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.720101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:41 crc kubenswrapper[5030]: I0120 22:52:41.796117 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.192440 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.193692 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.195737 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.195778 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.202126 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.202133 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.202718 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.203237 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-9t25t" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.209820 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.209968 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vzr9\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.311972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.312006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.312048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.312093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.312115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.422948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vzr9\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.423272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.424263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.424337 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") device mount path \"/mnt/openstack/pv12\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.425281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.426887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.430283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.430844 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.433311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.424348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.435110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.458869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vzr9\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.460263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.476232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:42 crc kubenswrapper[5030]: I0120 22:52:42.535578 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.659953 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.663587 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.666990 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.671336 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.671515 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.671652 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-p8blk" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.671825 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.680941 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8gws\" (UniqueName: \"kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.742992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.743111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8gws\" (UniqueName: \"kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.844431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.845047 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.845218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.845278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.845518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.846349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc 
kubenswrapper[5030]: I0120 22:52:43.849370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.849757 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.861798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8gws\" (UniqueName: \"kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.863469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:43 crc kubenswrapper[5030]: I0120 22:52:43.988312 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.045776 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.048556 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.051221 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.051404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-jmz96" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.051791 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.051800 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.056095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9n9n\" (UniqueName: \"kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc 
kubenswrapper[5030]: I0120 22:52:45.161830 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.161877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9n9n\" (UniqueName: \"kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.262979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.263004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 
22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.263041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.263756 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.264315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.265314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.265333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.266416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.272390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.282121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.292922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9n9n\" (UniqueName: \"kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 
crc kubenswrapper[5030]: I0120 22:52:45.324535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.375766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.384198 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.385036 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.386811 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.387124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-th2w4" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.387416 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.407070 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.465724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.465788 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.465823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.465845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.465860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc 
kubenswrapper[5030]: I0120 22:52:45.568062 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.568536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.568794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.569111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.569364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.569390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.569110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.572980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.573114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.594731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") pod \"memcached-0\" 
(UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:45 crc kubenswrapper[5030]: I0120 22:52:45.699941 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.178252 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.179333 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.181250 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-tzw4x" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.189494 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.295335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8sxc\" (UniqueName: \"kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc\") pod \"kube-state-metrics-0\" (UID: \"bd0dddd2-0952-4bdb-9149-08b1b3653cbb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.396559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8sxc\" (UniqueName: \"kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc\") pod \"kube-state-metrics-0\" (UID: \"bd0dddd2-0952-4bdb-9149-08b1b3653cbb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.414467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8sxc\" (UniqueName: \"kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc\") pod \"kube-state-metrics-0\" (UID: \"bd0dddd2-0952-4bdb-9149-08b1b3653cbb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:52:47 crc kubenswrapper[5030]: I0120 22:52:47.503230 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.676955 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.677129 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n689h697h5cbh9dh679h79hc8hb8h688h68h54fh5cbh88h67dh565h5cfh5ddh688h677h68ch64h67ch566h79h5h5b9h58h59ch568h68fh596h65dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dnsmasq,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dnsmasq-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dnsmasq-svc,SubPath:dnsmasq-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n6zx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dnsmasq-84b9f45d47-q9pxj_openstack-kuttl-tests(12203532-a911-44b9-a1f8-e63b85b30a32): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.678316 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.693968 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.694145 5030 kuberuntime_manager.go:1274] "Unhandled Error" 
err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56bh585hb8hf5h5f4h7fh658hc6h549h8fh595h58bh4h5d6h66ch544h5c6hdh586h7dh59hb7h67h675h648h547h557h576h59bh5bdh5c4h5cbq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dnsmasq,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g8zdt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dnsmasq-f5849d7b9-4nsmn_openstack-kuttl-tests(044322cd-eba1-4ff9-9e14-dc8e9d02caf2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.695326 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.989856 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" Jan 20 22:52:47 crc kubenswrapper[5030]: E0120 22:52:47.990392 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.153202 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:52:48 crc 
kubenswrapper[5030]: I0120 22:52:48.160839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.175729 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:52:48 crc kubenswrapper[5030]: W0120 22:52:48.299722 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd0dddd2_0952_4bdb_9149_08b1b3653cbb.slice/crio-12ba68aec1f5ec593692220b26209997a9d7ea613f75c79853a72240267ee41a WatchSource:0}: Error finding container 12ba68aec1f5ec593692220b26209997a9d7ea613f75c79853a72240267ee41a: Status 404 returned error can't find the container with id 12ba68aec1f5ec593692220b26209997a9d7ea613f75c79853a72240267ee41a Jan 20 22:52:48 crc kubenswrapper[5030]: W0120 22:52:48.307283 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a05e942_d886_4355_b352_50408cfe11e3.slice/crio-17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a WatchSource:0}: Error finding container 17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a: Status 404 returned error can't find the container with id 17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.313368 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.328161 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.336920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.996274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerStarted","Data":"efdde6c1f53faeac19b424fc4d6b1c4cdc852a6f40cd8e43bb5c38df18ca9c2f"} Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.997239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"bd0dddd2-0952-4bdb-9149-08b1b3653cbb","Type":"ContainerStarted","Data":"12ba68aec1f5ec593692220b26209997a9d7ea613f75c79853a72240267ee41a"} Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.997990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerStarted","Data":"17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a"} Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.998751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"efe6a225-5530-4922-91ec-1c4a939cb98b","Type":"ContainerStarted","Data":"1f7ea6ee541ca660624948bd02d48ba083b1b0ce0ae6c89acaf2cabd591dc181"} Jan 20 22:52:48 crc kubenswrapper[5030]: I0120 22:52:48.999350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerStarted","Data":"8af7359bc48418173b51693b863b4f938e54842a895b52d72f07b15be15b76c1"} Jan 20 22:52:49 crc kubenswrapper[5030]: I0120 22:52:49.000167 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerStarted","Data":"fade6fb19a5f99275c15e45e03f4dd9c4d7b7774a9d13abfb46ceed58f75c998"} Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.610124 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.611271 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.614188 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.614716 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.615072 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-xxssz" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.615270 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.615726 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.627305 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4rt\" (UniqueName: \"kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.656813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.657046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4rt\" (UniqueName: \"kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc 
kubenswrapper[5030]: I0120 22:52:50.759247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.759788 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.761062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.761963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.762510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.765795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.766515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.766840 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.776402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4rt\" (UniqueName: 
\"kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.786289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:50 crc kubenswrapper[5030]: I0120 22:52:50.940840 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:52:51 crc kubenswrapper[5030]: I0120 22:52:51.375108 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:52:51 crc kubenswrapper[5030]: W0120 22:52:51.381278 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69346425_07fa_46ee_a8a9_9750fd83ac46.slice/crio-b7ae022d96085c94041bc7645fcc1ddf6967ea12ab55391a2644d6e2140b5b78 WatchSource:0}: Error finding container b7ae022d96085c94041bc7645fcc1ddf6967ea12ab55391a2644d6e2140b5b78: Status 404 returned error can't find the container with id b7ae022d96085c94041bc7645fcc1ddf6967ea12ab55391a2644d6e2140b5b78 Jan 20 22:52:52 crc kubenswrapper[5030]: I0120 22:52:52.022421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerStarted","Data":"b7ae022d96085c94041bc7645fcc1ddf6967ea12ab55391a2644d6e2140b5b78"} Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.565585 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.567051 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.574360 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.576424 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.577400 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.577724 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.577869 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-54hfl" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617140 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617217 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617331 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.617521 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpx28\" (UniqueName: \"kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpx28\" (UniqueName: \"kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.718975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.719693 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.720297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.721249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.725786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.738799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.738885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.739249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.757029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.764753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpx28\" (UniqueName: \"kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28\") pod \"ovsdbserver-sb-0\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:52:53 crc kubenswrapper[5030]: I0120 22:52:53.905919 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:53:00 crc kubenswrapper[5030]: I0120 22:53:00.080403 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:53:00 crc kubenswrapper[5030]: W0120 22:53:00.374366 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42b7bb7d_f87c_4f2f_8d95_5d8ff851912c.slice/crio-84273878b9163e70f3a952aca0d221876ee6741a40855566b52a09e705afc60a WatchSource:0}: Error finding container 84273878b9163e70f3a952aca0d221876ee6741a40855566b52a09e705afc60a: Status 404 returned error can't find the container with id 84273878b9163e70f3a952aca0d221876ee6741a40855566b52a09e705afc60a Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.109863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerStarted","Data":"f4f036c607bf3ef974bccea536d7279888114e64c92919f79bea59ab01d3bf6a"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.111654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"bd0dddd2-0952-4bdb-9149-08b1b3653cbb","Type":"ContainerStarted","Data":"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.111829 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.113381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerStarted","Data":"b0d2103e4ac83ab42ccb1a0edfca0d51c401307b05570b94fa953f43e63443b3"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.115519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"efe6a225-5530-4922-91ec-1c4a939cb98b","Type":"ContainerStarted","Data":"6b6d7fced9bfec32b48d7d732119d3237bc9fe4d5517f7a246ca37788d005e77"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.115606 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.117725 5030 generic.go:334] "Generic (PLEG): container finished" podID="12203532-a911-44b9-a1f8-e63b85b30a32" containerID="fb19b49479cb655206e3149fcba22f286b15ec9d6c9d5548e3e033525c9b09a9" exitCode=0 Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.117768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" event={"ID":"12203532-a911-44b9-a1f8-e63b85b30a32","Type":"ContainerDied","Data":"fb19b49479cb655206e3149fcba22f286b15ec9d6c9d5548e3e033525c9b09a9"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.119422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerStarted","Data":"84273878b9163e70f3a952aca0d221876ee6741a40855566b52a09e705afc60a"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.121668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerStarted","Data":"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7"} Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.213517 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=4.869985568 podStartE2EDuration="16.213499084s" podCreationTimestamp="2026-01-20 22:52:45 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.17510195 +0000 UTC m=+1040.495362238" lastFinishedPulling="2026-01-20 22:52:59.518615466 +0000 UTC m=+1051.838875754" observedRunningTime="2026-01-20 22:53:01.207396405 +0000 UTC m=+1053.527656703" watchObservedRunningTime="2026-01-20 22:53:01.213499084 +0000 UTC m=+1053.533759372" Jan 20 22:53:01 crc kubenswrapper[5030]: I0120 22:53:01.231764 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.023965204 podStartE2EDuration="14.231745959s" podCreationTimestamp="2026-01-20 22:52:47 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.310014694 +0000 UTC m=+1040.630274982" lastFinishedPulling="2026-01-20 22:53:00.517795449 +0000 UTC m=+1052.838055737" observedRunningTime="2026-01-20 22:53:01.228544651 +0000 UTC m=+1053.548804949" watchObservedRunningTime="2026-01-20 22:53:01.231745959 +0000 UTC m=+1053.552006247" Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.131680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" event={"ID":"12203532-a911-44b9-a1f8-e63b85b30a32","Type":"ContainerStarted","Data":"0798a74683ee4a674b978d4932f7546f0e893fc80d67cc004b7de950080e0504"} Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.132725 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.134490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerStarted","Data":"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd"} Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.136370 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerStarted","Data":"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c"} Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.138099 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerStarted","Data":"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58"} Jan 20 22:53:02 crc kubenswrapper[5030]: I0120 22:53:02.149493 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" podStartSLOduration=3.155596703 podStartE2EDuration="31.149470663s" podCreationTimestamp="2026-01-20 22:52:31 +0000 UTC" firstStartedPulling="2026-01-20 22:52:32.52518294 +0000 UTC m=+1024.845443228" lastFinishedPulling="2026-01-20 22:53:00.5190569 +0000 UTC m=+1052.839317188" observedRunningTime="2026-01-20 22:53:02.14848759 +0000 UTC m=+1054.468747888" watchObservedRunningTime="2026-01-20 22:53:02.149470663 +0000 UTC m=+1054.469730951" Jan 20 22:53:04 crc 
kubenswrapper[5030]: I0120 22:53:04.154581 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a05e942-d886-4355-b352-50408cfe11e3" containerID="b0d2103e4ac83ab42ccb1a0edfca0d51c401307b05570b94fa953f43e63443b3" exitCode=0 Jan 20 22:53:04 crc kubenswrapper[5030]: I0120 22:53:04.154794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerDied","Data":"b0d2103e4ac83ab42ccb1a0edfca0d51c401307b05570b94fa953f43e63443b3"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.177460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerStarted","Data":"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.182285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerStarted","Data":"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.185608 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a857cf-5973-4c78-9758-ab04c932e180" containerID="f4f036c607bf3ef974bccea536d7279888114e64c92919f79bea59ab01d3bf6a" exitCode=0 Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.185820 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerDied","Data":"f4f036c607bf3ef974bccea536d7279888114e64c92919f79bea59ab01d3bf6a"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.189499 5030 generic.go:334] "Generic (PLEG): container finished" podID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerID="169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519" exitCode=0 Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.189825 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" event={"ID":"044322cd-eba1-4ff9-9e14-dc8e9d02caf2","Type":"ContainerDied","Data":"169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.197104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerStarted","Data":"f42f977d4b1f56e969e3645e77647af497eeead81e8ac35d88c15264e254f40f"} Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.216174 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=9.269219515 podStartE2EDuration="13.21614374s" podCreationTimestamp="2026-01-20 22:52:52 +0000 UTC" firstStartedPulling="2026-01-20 22:53:00.40886431 +0000 UTC m=+1052.729124598" lastFinishedPulling="2026-01-20 22:53:04.355788535 +0000 UTC m=+1056.676048823" observedRunningTime="2026-01-20 22:53:05.206564376 +0000 UTC m=+1057.526824704" watchObservedRunningTime="2026-01-20 22:53:05.21614374 +0000 UTC m=+1057.536404068" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.257235 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.2687590269999998 podStartE2EDuration="16.257214242s" 
podCreationTimestamp="2026-01-20 22:52:49 +0000 UTC" firstStartedPulling="2026-01-20 22:52:51.383654969 +0000 UTC m=+1043.703915257" lastFinishedPulling="2026-01-20 22:53:04.372110174 +0000 UTC m=+1056.692370472" observedRunningTime="2026-01-20 22:53:05.255520551 +0000 UTC m=+1057.575780859" watchObservedRunningTime="2026-01-20 22:53:05.257214242 +0000 UTC m=+1057.577474540" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.306923 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=12.101963423 podStartE2EDuration="23.306893745s" podCreationTimestamp="2026-01-20 22:52:42 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.31351708 +0000 UTC m=+1040.633777368" lastFinishedPulling="2026-01-20 22:52:59.518447402 +0000 UTC m=+1051.838707690" observedRunningTime="2026-01-20 22:53:05.304249781 +0000 UTC m=+1057.624510079" watchObservedRunningTime="2026-01-20 22:53:05.306893745 +0000 UTC m=+1057.627154063" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.701453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.906415 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.942060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.942110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.972853 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:53:05 crc kubenswrapper[5030]: I0120 22:53:05.983995 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.206602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerStarted","Data":"a959770d65b862147cb227ef990299c9ce51c5014c0b60151012a6e091f021b1"} Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.209059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" event={"ID":"044322cd-eba1-4ff9-9e14-dc8e9d02caf2","Type":"ContainerStarted","Data":"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea"} Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.209803 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.239474 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=10.503521755 podStartE2EDuration="22.239448061s" podCreationTimestamp="2026-01-20 22:52:44 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.155147663 +0000 UTC m=+1040.475407951" lastFinishedPulling="2026-01-20 22:52:59.891073969 +0000 UTC m=+1052.211334257" observedRunningTime="2026-01-20 22:53:06.225506141 +0000 UTC m=+1058.545766439" watchObservedRunningTime="2026-01-20 22:53:06.239448061 +0000 UTC 
m=+1058.559708359" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.247446 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" podStartSLOduration=-9223372001.607351 podStartE2EDuration="35.247424056s" podCreationTimestamp="2026-01-20 22:52:31 +0000 UTC" firstStartedPulling="2026-01-20 22:52:32.239038476 +0000 UTC m=+1024.559298764" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:53:06.244048744 +0000 UTC m=+1058.564309082" watchObservedRunningTime="2026-01-20 22:53:06.247424056 +0000 UTC m=+1058.567684384" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.257689 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.269519 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.533321 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.534694 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.536723 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.536854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.537321 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qkc9k" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.537458 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.553268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638696 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x2f\" (UniqueName: \"kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.638953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x2f\" (UniqueName: \"kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f\") pod \"ovn-northd-0\" (UID: 
\"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.740893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.741674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.741829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.741858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.745919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.746112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.746372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.760179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x2f\" (UniqueName: \"kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f\") pod \"ovn-northd-0\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.855299 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:06 crc kubenswrapper[5030]: I0120 22:53:06.952762 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:53:07 crc kubenswrapper[5030]: I0120 22:53:07.087043 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:53:07 crc kubenswrapper[5030]: I0120 22:53:07.154463 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:53:07 crc kubenswrapper[5030]: I0120 22:53:07.381543 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:53:07 crc kubenswrapper[5030]: I0120 22:53:07.514788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.226107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerStarted","Data":"25ede500a1cf863025c9de214d70add569b559d86113b90f5ff61923fad6b9ec"} Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.226345 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="dnsmasq-dns" containerID="cri-o://07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea" gracePeriod=10 Jan 20 22:53:08 crc kubenswrapper[5030]: E0120 22:53:08.362126 5030 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.9:51426->38.102.83.9:37955: write tcp 38.102.83.9:51426->38.102.83.9:37955: write: broken pipe Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.597548 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.604315 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.610208 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.610522 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-rrrtc" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.610683 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.610918 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.635082 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.775784 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.776058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.776134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.776168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zgwr\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.776203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.788452 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zdt\" (UniqueName: \"kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt\") pod \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config\") pod \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\" (UID: \"044322cd-eba1-4ff9-9e14-dc8e9d02caf2\") " Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zgwr\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.877861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: E0120 22:53:08.878108 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.878121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.878137 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc 
kubenswrapper[5030]: E0120 22:53:08.878131 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:53:08 crc kubenswrapper[5030]: E0120 22:53:08.878206 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift podName:f4870c60-c6b8-48d3-998e-a91670fcedbc nodeName:}" failed. No retries permitted until 2026-01-20 22:53:09.37818982 +0000 UTC m=+1061.698450098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift") pod "swift-storage-0" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc") : configmap "swift-ring-files" not found Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.878271 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.888061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt" (OuterVolumeSpecName: "kube-api-access-g8zdt") pod "044322cd-eba1-4ff9-9e14-dc8e9d02caf2" (UID: "044322cd-eba1-4ff9-9e14-dc8e9d02caf2"). InnerVolumeSpecName "kube-api-access-g8zdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.893684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zgwr\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.896998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.923291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config" (OuterVolumeSpecName: "config") pod "044322cd-eba1-4ff9-9e14-dc8e9d02caf2" (UID: "044322cd-eba1-4ff9-9e14-dc8e9d02caf2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.982005 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zdt\" (UniqueName: \"kubernetes.io/projected/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-kube-api-access-g8zdt\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:08 crc kubenswrapper[5030]: I0120 22:53:08.982041 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/044322cd-eba1-4ff9-9e14-dc8e9d02caf2-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.234899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerStarted","Data":"29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0"} Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.234954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerStarted","Data":"247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8"} Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.235060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.236693 5030 generic.go:334] "Generic (PLEG): container finished" podID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerID="07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea" exitCode=0 Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.236724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" event={"ID":"044322cd-eba1-4ff9-9e14-dc8e9d02caf2","Type":"ContainerDied","Data":"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea"} Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.236743 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" event={"ID":"044322cd-eba1-4ff9-9e14-dc8e9d02caf2","Type":"ContainerDied","Data":"3177fffc6728f06139faee1119135db46ac46c3595181626c740f1fa952dd33d"} Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.236758 5030 scope.go:117] "RemoveContainer" containerID="07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.236772 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.257712 5030 scope.go:117] "RemoveContainer" containerID="169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.259484 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.12976286 podStartE2EDuration="3.259464688s" podCreationTimestamp="2026-01-20 22:53:06 +0000 UTC" firstStartedPulling="2026-01-20 22:53:07.383489341 +0000 UTC m=+1059.703749629" lastFinishedPulling="2026-01-20 22:53:08.513191139 +0000 UTC m=+1060.833451457" observedRunningTime="2026-01-20 22:53:09.255024539 +0000 UTC m=+1061.575284867" watchObservedRunningTime="2026-01-20 22:53:09.259464688 +0000 UTC m=+1061.579724976" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.277323 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.279163 5030 scope.go:117] "RemoveContainer" containerID="07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea" Jan 20 22:53:09 crc kubenswrapper[5030]: E0120 22:53:09.279921 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea\": container with ID starting with 07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea not found: ID does not exist" containerID="07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.279989 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea"} err="failed to get container status \"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea\": rpc error: code = NotFound desc = could not find container \"07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea\": container with ID starting with 07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea not found: ID does not exist" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.280032 5030 scope.go:117] "RemoveContainer" containerID="169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519" Jan 20 22:53:09 crc kubenswrapper[5030]: E0120 22:53:09.280480 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519\": container with ID starting with 169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519 not found: ID does not exist" containerID="169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519" Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.280563 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519"} err="failed to get container status \"169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519\": rpc error: code = NotFound desc = could not find container \"169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519\": container with ID starting with 169977ece3ff70a661fdad360c1f83517955613b379024bfe6f5ccc0ead4a519 not found: ID does not exist" Jan 20 22:53:09 
crc kubenswrapper[5030]: I0120 22:53:09.285680 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-f5849d7b9-4nsmn"] Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.387782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:09 crc kubenswrapper[5030]: E0120 22:53:09.388150 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:53:09 crc kubenswrapper[5030]: E0120 22:53:09.388170 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:53:09 crc kubenswrapper[5030]: E0120 22:53:09.388206 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift podName:f4870c60-c6b8-48d3-998e-a91670fcedbc nodeName:}" failed. No retries permitted until 2026-01-20 22:53:10.38819041 +0000 UTC m=+1062.708450698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift") pod "swift-storage-0" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc") : configmap "swift-ring-files" not found Jan 20 22:53:09 crc kubenswrapper[5030]: I0120 22:53:09.973715 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" path="/var/lib/kubelet/pods/044322cd-eba1-4ff9-9e14-dc8e9d02caf2/volumes" Jan 20 22:53:10 crc kubenswrapper[5030]: I0120 22:53:10.404540 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:10 crc kubenswrapper[5030]: E0120 22:53:10.404971 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:53:10 crc kubenswrapper[5030]: E0120 22:53:10.405042 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:53:10 crc kubenswrapper[5030]: E0120 22:53:10.405133 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift podName:f4870c60-c6b8-48d3-998e-a91670fcedbc nodeName:}" failed. No retries permitted until 2026-01-20 22:53:12.405099156 +0000 UTC m=+1064.725359474 (durationBeforeRetry 2s). 
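The RemoveContainer/DeleteContainer errors at 22:53:09 above appear to be benign: by the time the kubelet asks CRI-O for the status of the dnsmasq-dns containers they have already been deleted, so the runtime answers with gRPC code NotFound and cleanup simply continues. A minimal sketch of treating NotFound as "already removed" for an idempotent delete; the removeContainer helper is a placeholder, not the kubelet's actual CRI client.

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call; here it always
// reports NotFound, like the runtime does for the already-deleted containers above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	id := "07a5a41ace821b7aa4c7fd86d1833b406e981f5922c4094d1c0bbf839c6069ea"
	if err := removeContainer(id); err != nil {
		// Treat NotFound as success: the container is gone either way.
		if status.Code(err) == codes.NotFound {
			fmt.Println("container already removed:", id)
			return
		}
		fmt.Println("remove failed:", err)
		return
	}
	fmt.Println("container removed:", id)
}
```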
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift") pod "swift-storage-0" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc") : configmap "swift-ring-files" not found Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.440249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:12 crc kubenswrapper[5030]: E0120 22:53:12.440446 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:53:12 crc kubenswrapper[5030]: E0120 22:53:12.440653 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:53:12 crc kubenswrapper[5030]: E0120 22:53:12.440711 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift podName:f4870c60-c6b8-48d3-998e-a91670fcedbc nodeName:}" failed. No retries permitted until 2026-01-20 22:53:16.440695011 +0000 UTC m=+1068.760955299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift") pod "swift-storage-0" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc") : configmap "swift-ring-files" not found Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.681875 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-gvgkg"] Jan 20 22:53:12 crc kubenswrapper[5030]: E0120 22:53:12.682211 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="init" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.682232 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="init" Jan 20 22:53:12 crc kubenswrapper[5030]: E0120 22:53:12.682242 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="dnsmasq-dns" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.682250 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="dnsmasq-dns" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.682455 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="044322cd-eba1-4ff9-9e14-dc8e9d02caf2" containerName="dnsmasq-dns" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.683027 5030 util.go:30] "No sandbox for pod can be found. 
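swift-storage-0 cannot mount its projected etc-swift volume because the swift-ring-files ConfigMap does not exist yet, and the volume manager re-queues the mount with a doubling delay: durationBeforeRetry goes 500ms, 1s, 2s, 4s above and reaches 8s further down, until the swift-ring-rebalance-gvgkg job admitted here presumably publishes swift-ring-files and the mount succeeds. Below is a minimal sketch of that doubling pattern; the 2-minute cap is an assumption for illustration, not taken from this log.

```go
package main

import (
	"fmt"
	"time"
)

// nextRetryDelay doubles the previous delay, starting at 500ms and stopping at
// an assumed cap. This mirrors the durationBeforeRetry progression seen in the
// etc-swift MountVolume.SetUp failures (500ms, 1s, 2s, 4s, 8s, ...).
func nextRetryDelay(prev time.Duration) time.Duration {
	const (
		initialDelay = 500 * time.Millisecond
		maxDelay     = 2 * time.Minute // assumed cap for this sketch
	)
	if prev < initialDelay {
		return initialDelay
	}
	next := prev * 2
	if next > maxDelay {
		return maxDelay
	}
	return next
}

func main() {
	var d time.Duration
	for i := 0; i < 6; i++ {
		d = nextRetryDelay(d)
		fmt.Println(d) // 500ms, 1s, 2s, 4s, 8s, 16s
	}
}
```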
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.685528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.685829 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.686104 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.697231 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-gvgkg"] Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.848827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.848926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.849027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dks9m\" (UniqueName: \"kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.849248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.849298 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.849495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.849574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts\") pod 
\"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dks9m\" (UniqueName: \"kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.951957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.952031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.952337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.952538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts\") pod 
\"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.952637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.957963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.965239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.968536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:12 crc kubenswrapper[5030]: I0120 22:53:12.972774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dks9m\" (UniqueName: \"kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m\") pod \"swift-ring-rebalance-gvgkg\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:13 crc kubenswrapper[5030]: I0120 22:53:13.004477 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:13 crc kubenswrapper[5030]: I0120 22:53:13.560154 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-gvgkg"] Jan 20 22:53:13 crc kubenswrapper[5030]: W0120 22:53:13.565892 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c693e21_8f8f_4437_9662_876c9f6f48e1.slice/crio-5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3 WatchSource:0}: Error finding container 5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3: Status 404 returned error can't find the container with id 5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3 Jan 20 22:53:13 crc kubenswrapper[5030]: I0120 22:53:13.989398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:53:13 crc kubenswrapper[5030]: I0120 22:53:13.989790 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:53:14 crc kubenswrapper[5030]: I0120 22:53:14.078338 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:53:14 crc kubenswrapper[5030]: I0120 22:53:14.278968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" event={"ID":"1c693e21-8f8f-4437-9662-876c9f6f48e1","Type":"ContainerStarted","Data":"5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3"} Jan 20 22:53:14 crc kubenswrapper[5030]: I0120 22:53:14.366570 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.362913 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.365426 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.367811 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.376696 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.376973 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.388936 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.425854 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-slndf"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.427914 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.455088 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-slndf"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.499971 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.504207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9tz\" (UniqueName: \"kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.504292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.504325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.504371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwb7c\" (UniqueName: \"kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.597434 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-xj9s4"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.598303 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.606375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9tz\" (UniqueName: \"kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.606414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.606439 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.606463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwb7c\" (UniqueName: \"kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.607510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.607927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.616286 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-xj9s4"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.629572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwb7c\" (UniqueName: \"kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c\") pod \"keystone-8c9f-account-create-update-fzrtn\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.642321 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9tz\" (UniqueName: \"kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz\") pod \"keystone-db-create-slndf\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " 
pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.693649 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.699168 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.700301 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.702047 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.708932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.709035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4jv\" (UniqueName: \"kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.712125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.769156 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.810849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4j5\" (UniqueName: \"kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.810942 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.810983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.811047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4jv\" (UniqueName: \"kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.811953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.836211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4jv\" (UniqueName: \"kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv\") pod \"placement-db-create-xj9s4\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.904046 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-6ck4d"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.905300 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.912416 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-6ck4d"] Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.912668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms4j5\" (UniqueName: \"kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.912734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.913545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.921954 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:15 crc kubenswrapper[5030]: I0120 22:53:15.931266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4j5\" (UniqueName: \"kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5\") pod \"placement-e5d9-account-create-update-6bg9n\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.014525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.014713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9469\" (UniqueName: \"kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.016758 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-523f-account-create-update-lfzlt"] Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.018257 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.020740 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.022159 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-523f-account-create-update-lfzlt"] Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.025402 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.118423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.118545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhz4c\" (UniqueName: \"kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.118778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9469\" (UniqueName: \"kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.118812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.119195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.136955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9469\" (UniqueName: \"kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469\") pod \"glance-db-create-6ck4d\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.221035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " 
pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.221261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhz4c\" (UniqueName: \"kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.221917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.223293 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.240071 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhz4c\" (UniqueName: \"kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c\") pod \"glance-523f-account-create-update-lfzlt\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.336460 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.378366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:53:16 crc kubenswrapper[5030]: I0120 22:53:16.526641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:16 crc kubenswrapper[5030]: E0120 22:53:16.526850 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:53:16 crc kubenswrapper[5030]: E0120 22:53:16.526868 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:53:16 crc kubenswrapper[5030]: E0120 22:53:16.526917 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift podName:f4870c60-c6b8-48d3-998e-a91670fcedbc nodeName:}" failed. No retries permitted until 2026-01-20 22:53:24.526899447 +0000 UTC m=+1076.847159735 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift") pod "swift-storage-0" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc") : configmap "swift-ring-files" not found Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.208566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-xj9s4"] Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.302910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn"] Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.329127 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-xj9s4" event={"ID":"2ab29253-03b2-4351-a4ba-96cff66ebe9d","Type":"ContainerStarted","Data":"11b487da71a2d4fc5b70ec2fb980f526bf3bd8654481a8f2d6be7fcd6a847236"} Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.330834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" event={"ID":"1c693e21-8f8f-4437-9662-876c9f6f48e1","Type":"ContainerStarted","Data":"5480da400ee80721be2f16ae8f3038789f5bab39358c94ed7e428011135c3283"} Jan 20 22:53:18 crc kubenswrapper[5030]: W0120 22:53:18.373495 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd6f8be1_d7f8_47ac_800d_0007b458ba14.slice/crio-8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001 WatchSource:0}: Error finding container 8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001: Status 404 returned error can't find the container with id 8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001 Jan 20 22:53:18 crc kubenswrapper[5030]: W0120 22:53:18.373908 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8016e98a_33a5_43a8_9480_1a424b187263.slice/crio-4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75 WatchSource:0}: Error finding container 4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75: Status 404 returned error can't find the container with id 4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75 Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.374021 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-slndf"] Jan 20 22:53:18 crc kubenswrapper[5030]: W0120 22:53:18.375290 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11a08db1_ff16_4bbc_a3c1_b34488e0a9b7.slice/crio-9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699 WatchSource:0}: Error finding container 9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699: Status 404 returned error can't find the container with id 9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699 Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.384702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-6ck4d"] Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.392688 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-523f-account-create-update-lfzlt"] Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.400073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n"] Jan 20 22:53:18 crc kubenswrapper[5030]: I0120 22:53:18.404945 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" podStartSLOduration=2.21878996 podStartE2EDuration="6.404929835s" podCreationTimestamp="2026-01-20 22:53:12 +0000 UTC" firstStartedPulling="2026-01-20 22:53:13.567838518 +0000 UTC m=+1065.888098806" lastFinishedPulling="2026-01-20 22:53:17.753978403 +0000 UTC m=+1070.074238681" observedRunningTime="2026-01-20 22:53:18.359422154 +0000 UTC m=+1070.679682442" watchObservedRunningTime="2026-01-20 22:53:18.404929835 +0000 UTC m=+1070.725190123" Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.341333 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd6f8be1-d7f8-47ac-800d-0007b458ba14" containerID="3a3477d03775c8409a06e6db45461245f50d32bf83272caaa790fbb1f2b619a0" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.341671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" event={"ID":"cd6f8be1-d7f8-47ac-800d-0007b458ba14","Type":"ContainerDied","Data":"3a3477d03775c8409a06e6db45461245f50d32bf83272caaa790fbb1f2b619a0"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.341697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" event={"ID":"cd6f8be1-d7f8-47ac-800d-0007b458ba14","Type":"ContainerStarted","Data":"8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.344283 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ab29253-03b2-4351-a4ba-96cff66ebe9d" containerID="5083ee0eecec9aa23fd8e6c82e1b7f2af5bc2a0e4f692f039c3caf0ab0de514a" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.344323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-xj9s4" event={"ID":"2ab29253-03b2-4351-a4ba-96cff66ebe9d","Type":"ContainerDied","Data":"5083ee0eecec9aa23fd8e6c82e1b7f2af5bc2a0e4f692f039c3caf0ab0de514a"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.348240 5030 generic.go:334] "Generic (PLEG): container finished" podID="11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" containerID="f62ebf5cd18768a4d65883ecf1d5961daa058a3f749fc304252503a2f82a463d" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.348343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" event={"ID":"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7","Type":"ContainerDied","Data":"f62ebf5cd18768a4d65883ecf1d5961daa058a3f749fc304252503a2f82a463d"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.348385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" event={"ID":"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7","Type":"ContainerStarted","Data":"9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.350944 5030 generic.go:334] "Generic (PLEG): container finished" podID="fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" containerID="3622233fe725e7827b49fc546a0a57fcc1df7bee2d883a8db35f60ca54cdcc3d" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.351021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-db-create-6ck4d" event={"ID":"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1","Type":"ContainerDied","Data":"3622233fe725e7827b49fc546a0a57fcc1df7bee2d883a8db35f60ca54cdcc3d"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.351051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-6ck4d" event={"ID":"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1","Type":"ContainerStarted","Data":"732d764956751afe13606e42f07307b8101af5264900b2b82f18fc64e25ce848"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.360970 5030 generic.go:334] "Generic (PLEG): container finished" podID="8016e98a-33a5-43a8-9480-1a424b187263" containerID="6fdf5ee996af7b294b4ccf748fd8a7949b814dbeedb68b669717d89b03aacbbb" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.361065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-slndf" event={"ID":"8016e98a-33a5-43a8-9480-1a424b187263","Type":"ContainerDied","Data":"6fdf5ee996af7b294b4ccf748fd8a7949b814dbeedb68b669717d89b03aacbbb"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.361105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-slndf" event={"ID":"8016e98a-33a5-43a8-9480-1a424b187263","Type":"ContainerStarted","Data":"4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.373476 5030 generic.go:334] "Generic (PLEG): container finished" podID="75632b87-176b-4dea-be41-ea40a8615763" containerID="e91ba2b5c96f1ceb6cf02b5d22e74541a14325e0f54d4686895e5fc2e62c482c" exitCode=0 Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.373530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" event={"ID":"75632b87-176b-4dea-be41-ea40a8615763","Type":"ContainerDied","Data":"e91ba2b5c96f1ceb6cf02b5d22e74541a14325e0f54d4686895e5fc2e62c482c"} Jan 20 22:53:19 crc kubenswrapper[5030]: I0120 22:53:19.373588 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" event={"ID":"75632b87-176b-4dea-be41-ea40a8615763","Type":"ContainerStarted","Data":"c10bd78a1c1b09ef74330f39d981e8ce6ed4a3cb2d1a1143fbc95ef7abe893c6"} Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.751170 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.804073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts\") pod \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.804117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c4jv\" (UniqueName: \"kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv\") pod \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\" (UID: \"2ab29253-03b2-4351-a4ba-96cff66ebe9d\") " Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.807166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ab29253-03b2-4351-a4ba-96cff66ebe9d" (UID: "2ab29253-03b2-4351-a4ba-96cff66ebe9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.809948 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv" (OuterVolumeSpecName: "kube-api-access-8c4jv") pod "2ab29253-03b2-4351-a4ba-96cff66ebe9d" (UID: "2ab29253-03b2-4351-a4ba-96cff66ebe9d"). InnerVolumeSpecName "kube-api-access-8c4jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.906054 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab29253-03b2-4351-a4ba-96cff66ebe9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:20 crc kubenswrapper[5030]: I0120 22:53:20.906083 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c4jv\" (UniqueName: \"kubernetes.io/projected/2ab29253-03b2-4351-a4ba-96cff66ebe9d-kube-api-access-8c4jv\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.059348 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.063871 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.067567 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.075503 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.079687 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts\") pod \"8016e98a-33a5-43a8-9480-1a424b187263\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhz4c\" (UniqueName: \"kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c\") pod \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts\") pod \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9469\" (UniqueName: \"kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469\") pod \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4j5\" (UniqueName: \"kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5\") pod \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\" (UID: \"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z9tz\" (UniqueName: \"kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz\") pod \"8016e98a-33a5-43a8-9480-1a424b187263\" (UID: \"8016e98a-33a5-43a8-9480-1a424b187263\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts\") pod \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\" (UID: \"cd6f8be1-d7f8-47ac-800d-0007b458ba14\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109438 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwb7c\" (UniqueName: \"kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c\") pod \"75632b87-176b-4dea-be41-ea40a8615763\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts\") pod \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\" (UID: \"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109497 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts\") pod \"75632b87-176b-4dea-be41-ea40a8615763\" (UID: \"75632b87-176b-4dea-be41-ea40a8615763\") " Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.109995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8016e98a-33a5-43a8-9480-1a424b187263" (UID: "8016e98a-33a5-43a8-9480-1a424b187263"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.110267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd6f8be1-d7f8-47ac-800d-0007b458ba14" (UID: "cd6f8be1-d7f8-47ac-800d-0007b458ba14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.110034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" (UID: "11a08db1-ff16-4bbc-a3c1-b34488e0a9b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.110387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75632b87-176b-4dea-be41-ea40a8615763" (UID: "75632b87-176b-4dea-be41-ea40a8615763"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.110442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" (UID: "fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.114495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5" (OuterVolumeSpecName: "kube-api-access-ms4j5") pod "11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" (UID: "11a08db1-ff16-4bbc-a3c1-b34488e0a9b7"). InnerVolumeSpecName "kube-api-access-ms4j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.114588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c" (OuterVolumeSpecName: "kube-api-access-xhz4c") pod "cd6f8be1-d7f8-47ac-800d-0007b458ba14" (UID: "cd6f8be1-d7f8-47ac-800d-0007b458ba14"). InnerVolumeSpecName "kube-api-access-xhz4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.114657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469" (OuterVolumeSpecName: "kube-api-access-x9469") pod "fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" (UID: "fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1"). InnerVolumeSpecName "kube-api-access-x9469". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.118361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz" (OuterVolumeSpecName: "kube-api-access-2z9tz") pod "8016e98a-33a5-43a8-9480-1a424b187263" (UID: "8016e98a-33a5-43a8-9480-1a424b187263"). InnerVolumeSpecName "kube-api-access-2z9tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.118437 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c" (OuterVolumeSpecName: "kube-api-access-hwb7c") pod "75632b87-176b-4dea-be41-ea40a8615763" (UID: "75632b87-176b-4dea-be41-ea40a8615763"). InnerVolumeSpecName "kube-api-access-hwb7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211327 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhz4c\" (UniqueName: \"kubernetes.io/projected/cd6f8be1-d7f8-47ac-800d-0007b458ba14-kube-api-access-xhz4c\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211360 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211369 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9469\" (UniqueName: \"kubernetes.io/projected/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-kube-api-access-x9469\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211377 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4j5\" (UniqueName: \"kubernetes.io/projected/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7-kube-api-access-ms4j5\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211386 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z9tz\" (UniqueName: \"kubernetes.io/projected/8016e98a-33a5-43a8-9480-1a424b187263-kube-api-access-2z9tz\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211394 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd6f8be1-d7f8-47ac-800d-0007b458ba14-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211403 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwb7c\" (UniqueName: \"kubernetes.io/projected/75632b87-176b-4dea-be41-ea40a8615763-kube-api-access-hwb7c\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211411 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211419 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75632b87-176b-4dea-be41-ea40a8615763-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.211428 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8016e98a-33a5-43a8-9480-1a424b187263-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.388512 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.388556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-523f-account-create-update-lfzlt" event={"ID":"cd6f8be1-d7f8-47ac-800d-0007b458ba14","Type":"ContainerDied","Data":"8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.388608 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cbedeb29d437a1ec6f35f5d68a0fbe1f005d90fa63338cc1ff9b4e20c3fa001" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.390539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-xj9s4" event={"ID":"2ab29253-03b2-4351-a4ba-96cff66ebe9d","Type":"ContainerDied","Data":"11b487da71a2d4fc5b70ec2fb980f526bf3bd8654481a8f2d6be7fcd6a847236"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.390582 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b487da71a2d4fc5b70ec2fb980f526bf3bd8654481a8f2d6be7fcd6a847236" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.390656 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-xj9s4" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.392125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" event={"ID":"11a08db1-ff16-4bbc-a3c1-b34488e0a9b7","Type":"ContainerDied","Data":"9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.392153 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb48284c731c35b417bb69d7450a9c96f707986a1097eb1e49467e9c673a699" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.392240 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.394784 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-6ck4d" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.394807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-6ck4d" event={"ID":"fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1","Type":"ContainerDied","Data":"732d764956751afe13606e42f07307b8101af5264900b2b82f18fc64e25ce848"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.395036 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732d764956751afe13606e42f07307b8101af5264900b2b82f18fc64e25ce848" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.396460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-slndf" event={"ID":"8016e98a-33a5-43a8-9480-1a424b187263","Type":"ContainerDied","Data":"4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.396510 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e87ade4e28ebc6d4ed71b5b12f9e83ab827ddda5c0e03d1ff76600d632bbb75" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.396584 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-slndf" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.402520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" event={"ID":"75632b87-176b-4dea-be41-ea40a8615763","Type":"ContainerDied","Data":"c10bd78a1c1b09ef74330f39d981e8ce6ed4a3cb2d1a1143fbc95ef7abe893c6"} Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.402572 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10bd78a1c1b09ef74330f39d981e8ce6ed4a3cb2d1a1143fbc95ef7abe893c6" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.402684 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn" Jan 20 22:53:21 crc kubenswrapper[5030]: I0120 22:53:21.934705 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624111 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n78kc"] Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624420 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624433 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624449 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8016e98a-33a5-43a8-9480-1a424b187263" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624456 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8016e98a-33a5-43a8-9480-1a424b187263" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624476 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd6f8be1-d7f8-47ac-800d-0007b458ba14" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624482 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6f8be1-d7f8-47ac-800d-0007b458ba14" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624499 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624504 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624514 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab29253-03b2-4351-a4ba-96cff66ebe9d" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624520 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab29253-03b2-4351-a4ba-96cff66ebe9d" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: E0120 22:53:22.624544 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75632b87-176b-4dea-be41-ea40a8615763" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624550 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="75632b87-176b-4dea-be41-ea40a8615763" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624699 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624709 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624721 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd6f8be1-d7f8-47ac-800d-0007b458ba14" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624730 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="75632b87-176b-4dea-be41-ea40a8615763" containerName="mariadb-account-create-update" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624742 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab29253-03b2-4351-a4ba-96cff66ebe9d" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.624749 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8016e98a-33a5-43a8-9480-1a424b187263" containerName="mariadb-database-create" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.625198 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.627809 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.633336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mrrd\" (UniqueName: \"kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.633515 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.639782 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n78kc"] Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.734561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.734733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mrrd\" (UniqueName: \"kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.735952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.758531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6mrrd\" (UniqueName: \"kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd\") pod \"root-account-create-update-n78kc\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:22 crc kubenswrapper[5030]: I0120 22:53:22.940269 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:23 crc kubenswrapper[5030]: I0120 22:53:23.385660 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n78kc"] Jan 20 22:53:23 crc kubenswrapper[5030]: I0120 22:53:23.416567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n78kc" event={"ID":"4221deee-e2cc-4277-be4e-7a0ddc776887","Type":"ContainerStarted","Data":"05bccbbecca6963fd6277db2e51088ee0f06485e98611a30e421fe0931548428"} Jan 20 22:53:24 crc kubenswrapper[5030]: I0120 22:53:24.423606 5030 generic.go:334] "Generic (PLEG): container finished" podID="4221deee-e2cc-4277-be4e-7a0ddc776887" containerID="1f129a58100ee9749133f57abfedbfee9be9910d03dfde215c4f5caac80fa22d" exitCode=0 Jan 20 22:53:24 crc kubenswrapper[5030]: I0120 22:53:24.423660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n78kc" event={"ID":"4221deee-e2cc-4277-be4e-7a0ddc776887","Type":"ContainerDied","Data":"1f129a58100ee9749133f57abfedbfee9be9910d03dfde215c4f5caac80fa22d"} Jan 20 22:53:24 crc kubenswrapper[5030]: I0120 22:53:24.563447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:24 crc kubenswrapper[5030]: I0120 22:53:24.573913 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"swift-storage-0\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:24 crc kubenswrapper[5030]: I0120 22:53:24.871238 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.173889 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.432096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"a32d7f934880c093501fe43086fd0946c6ccd073d9129de0239769825c5932ef"} Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.434154 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c693e21-8f8f-4437-9662-876c9f6f48e1" containerID="5480da400ee80721be2f16ae8f3038789f5bab39358c94ed7e428011135c3283" exitCode=0 Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.434268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" event={"ID":"1c693e21-8f8f-4437-9662-876c9f6f48e1","Type":"ContainerDied","Data":"5480da400ee80721be2f16ae8f3038789f5bab39358c94ed7e428011135c3283"} Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.738360 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.881979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mrrd\" (UniqueName: \"kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd\") pod \"4221deee-e2cc-4277-be4e-7a0ddc776887\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.882027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts\") pod \"4221deee-e2cc-4277-be4e-7a0ddc776887\" (UID: \"4221deee-e2cc-4277-be4e-7a0ddc776887\") " Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.882800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4221deee-e2cc-4277-be4e-7a0ddc776887" (UID: "4221deee-e2cc-4277-be4e-7a0ddc776887"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.886653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd" (OuterVolumeSpecName: "kube-api-access-6mrrd") pod "4221deee-e2cc-4277-be4e-7a0ddc776887" (UID: "4221deee-e2cc-4277-be4e-7a0ddc776887"). InnerVolumeSpecName "kube-api-access-6mrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.984333 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mrrd\" (UniqueName: \"kubernetes.io/projected/4221deee-e2cc-4277-be4e-7a0ddc776887-kube-api-access-6mrrd\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:25 crc kubenswrapper[5030]: I0120 22:53:25.984363 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4221deee-e2cc-4277-be4e-7a0ddc776887-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.197008 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-t8brv"] Jan 20 22:53:26 crc kubenswrapper[5030]: E0120 22:53:26.197484 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4221deee-e2cc-4277-be4e-7a0ddc776887" containerName="mariadb-account-create-update" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.197511 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4221deee-e2cc-4277-be4e-7a0ddc776887" containerName="mariadb-account-create-update" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.197849 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4221deee-e2cc-4277-be4e-7a0ddc776887" containerName="mariadb-account-create-update" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.198607 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.200840 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-7sczs" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.202399 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.209548 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-t8brv"] Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.390765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflb9\" (UniqueName: \"kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.390836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.390981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.391073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.447064 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-n78kc" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.447059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-n78kc" event={"ID":"4221deee-e2cc-4277-be4e-7a0ddc776887","Type":"ContainerDied","Data":"05bccbbecca6963fd6277db2e51088ee0f06485e98611a30e421fe0931548428"} Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.447225 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05bccbbecca6963fd6277db2e51088ee0f06485e98611a30e421fe0931548428" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.492160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.492261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflb9\" (UniqueName: \"kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.492304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.492392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.499169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.499332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.522349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle\") pod 
\"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.528940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflb9\" (UniqueName: \"kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9\") pod \"glance-db-sync-t8brv\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.835113 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:53:26 crc kubenswrapper[5030]: I0120 22:53:26.842843 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dks9m\" (UniqueName: \"kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.000519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts\") pod \"1c693e21-8f8f-4437-9662-876c9f6f48e1\" (UID: \"1c693e21-8f8f-4437-9662-876c9f6f48e1\") " Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.002385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.006119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m" (OuterVolumeSpecName: "kube-api-access-dks9m") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "kube-api-access-dks9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.019032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.022130 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c693e21-8f8f-4437-9662-876c9f6f48e1-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.022170 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.022185 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dks9m\" (UniqueName: \"kubernetes.io/projected/1c693e21-8f8f-4437-9662-876c9f6f48e1-kube-api-access-dks9m\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.036086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.041896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.044894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.051149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts" (OuterVolumeSpecName: "scripts") pod "1c693e21-8f8f-4437-9662-876c9f6f48e1" (UID: "1c693e21-8f8f-4437-9662-876c9f6f48e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.126866 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.127131 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.127143 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c693e21-8f8f-4437-9662-876c9f6f48e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.127151 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c693e21-8f8f-4437-9662-876c9f6f48e1-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.327499 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-t8brv"] Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.458086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" event={"ID":"1c693e21-8f8f-4437-9662-876c9f6f48e1","Type":"ContainerDied","Data":"5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3"} Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.458124 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5993bb64c5166ed18343382fbef1b4c82a7ca728147842fa20a3cf79c0a049f3" Jan 20 22:53:27 crc kubenswrapper[5030]: I0120 22:53:27.458194 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-gvgkg" Jan 20 22:53:27 crc kubenswrapper[5030]: W0120 22:53:27.598836 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod419fff1b_62d5_431e_b99c_acf5c8a5e61a.slice/crio-8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1 WatchSource:0}: Error finding container 8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1: Status 404 returned error can't find the container with id 8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1 Jan 20 22:53:28 crc kubenswrapper[5030]: I0120 22:53:28.469539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c"} Jan 20 22:53:28 crc kubenswrapper[5030]: I0120 22:53:28.469925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20"} Jan 20 22:53:28 crc kubenswrapper[5030]: I0120 22:53:28.469940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845"} Jan 20 22:53:28 crc kubenswrapper[5030]: I0120 22:53:28.470873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-t8brv" event={"ID":"419fff1b-62d5-431e-b99c-acf5c8a5e61a","Type":"ContainerStarted","Data":"8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1"} Jan 20 22:53:29 crc kubenswrapper[5030]: I0120 22:53:29.056580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n78kc"] Jan 20 22:53:29 crc kubenswrapper[5030]: I0120 22:53:29.062406 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-n78kc"] Jan 20 22:53:29 crc kubenswrapper[5030]: I0120 22:53:29.493799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add"} Jan 20 22:53:29 crc kubenswrapper[5030]: I0120 22:53:29.974877 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4221deee-e2cc-4277-be4e-7a0ddc776887" path="/var/lib/kubelet/pods/4221deee-e2cc-4277-be4e-7a0ddc776887/volumes" Jan 20 22:53:30 crc kubenswrapper[5030]: I0120 22:53:30.504683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d"} Jan 20 22:53:30 crc kubenswrapper[5030]: I0120 22:53:30.504725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73"} Jan 20 22:53:30 crc kubenswrapper[5030]: I0120 22:53:30.504738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1"} Jan 20 22:53:31 crc kubenswrapper[5030]: I0120 22:53:31.517861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a"} Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.067301 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tr48l"] Jan 20 22:53:34 crc kubenswrapper[5030]: E0120 22:53:34.068157 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c693e21-8f8f-4437-9662-876c9f6f48e1" containerName="swift-ring-rebalance" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.068179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c693e21-8f8f-4437-9662-876c9f6f48e1" containerName="swift-ring-rebalance" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.068494 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c693e21-8f8f-4437-9662-876c9f6f48e1" containerName="swift-ring-rebalance" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.069304 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.072976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.077010 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tr48l"] Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.145887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.146071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mn4\" (UniqueName: \"kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.247749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mn4\" (UniqueName: \"kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.247916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " 
pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.248717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.268241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mn4\" (UniqueName: \"kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4\") pod \"root-account-create-update-tr48l\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.385292 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.543487 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerID="b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58" exitCode=0 Jan 20 22:53:34 crc kubenswrapper[5030]: I0120 22:53:34.543541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerDied","Data":"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58"} Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.002846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tr48l"] Jan 20 22:53:35 crc kubenswrapper[5030]: W0120 22:53:35.012500 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2338665c_e47e_47a0_8b70_3fb568fdebc7.slice/crio-c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a WatchSource:0}: Error finding container c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a: Status 404 returned error can't find the container with id c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.553453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerStarted","Data":"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac"} Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.553688 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.557894 5030 generic.go:334] "Generic (PLEG): container finished" podID="38bfa948-051b-4e6c-92a5-3714e0652088" containerID="775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c" exitCode=0 Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.557954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerDied","Data":"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c"} Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.559476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/root-account-create-update-tr48l" event={"ID":"2338665c-e47e-47a0-8b70-3fb568fdebc7","Type":"ContainerStarted","Data":"c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a"} Jan 20 22:53:35 crc kubenswrapper[5030]: I0120 22:53:35.581044 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=44.235494686 podStartE2EDuration="55.581022582s" podCreationTimestamp="2026-01-20 22:52:40 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.172903896 +0000 UTC m=+1040.493164184" lastFinishedPulling="2026-01-20 22:52:59.518431792 +0000 UTC m=+1051.838692080" observedRunningTime="2026-01-20 22:53:35.573139349 +0000 UTC m=+1087.893399637" watchObservedRunningTime="2026-01-20 22:53:35.581022582 +0000 UTC m=+1087.901282870" Jan 20 22:53:43 crc kubenswrapper[5030]: E0120 22:53:43.542784 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f" Jan 20 22:53:43 crc kubenswrapper[5030]: E0120 22:53:43.543471 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pflb9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-t8brv_openstack-kuttl-tests(419fff1b-62d5-431e-b99c-acf5c8a5e61a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Jan 20 22:53:43 crc kubenswrapper[5030]: E0120 22:53:43.545257 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/glance-db-sync-t8brv" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" Jan 20 22:53:43 crc kubenswrapper[5030]: E0120 22:53:43.629966 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f\\\"\"" pod="openstack-kuttl-tests/glance-db-sync-t8brv" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.634727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerStarted","Data":"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.635230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.636501 5030 generic.go:334] "Generic (PLEG): container finished" podID="2338665c-e47e-47a0-8b70-3fb568fdebc7" containerID="4e441803cac62ba19b4b02a09bdf8586418dc277eee8924311eaeac3ee0c9a7b" exitCode=0 Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.636556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tr48l" event={"ID":"2338665c-e47e-47a0-8b70-3fb568fdebc7","Type":"ContainerDied","Data":"4e441803cac62ba19b4b02a09bdf8586418dc277eee8924311eaeac3ee0c9a7b"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.670144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.670216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.670229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.670243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a"} Jan 20 22:53:44 crc kubenswrapper[5030]: I0120 22:53:44.691522 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=52.178151652 podStartE2EDuration="1m3.691501814s" podCreationTimestamp="2026-01-20 22:52:41 +0000 UTC" firstStartedPulling="2026-01-20 22:52:48.315872307 
+0000 UTC m=+1040.636132615" lastFinishedPulling="2026-01-20 22:52:59.829222479 +0000 UTC m=+1052.149482777" observedRunningTime="2026-01-20 22:53:44.691328819 +0000 UTC m=+1097.011589117" watchObservedRunningTime="2026-01-20 22:53:44.691501814 +0000 UTC m=+1097.011762102" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.009478 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.152445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mn4\" (UniqueName: \"kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4\") pod \"2338665c-e47e-47a0-8b70-3fb568fdebc7\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.152484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts\") pod \"2338665c-e47e-47a0-8b70-3fb568fdebc7\" (UID: \"2338665c-e47e-47a0-8b70-3fb568fdebc7\") " Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.153260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2338665c-e47e-47a0-8b70-3fb568fdebc7" (UID: "2338665c-e47e-47a0-8b70-3fb568fdebc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.171795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4" (OuterVolumeSpecName: "kube-api-access-d6mn4") pod "2338665c-e47e-47a0-8b70-3fb568fdebc7" (UID: "2338665c-e47e-47a0-8b70-3fb568fdebc7"). InnerVolumeSpecName "kube-api-access-d6mn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.254448 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6mn4\" (UniqueName: \"kubernetes.io/projected/2338665c-e47e-47a0-8b70-3fb568fdebc7-kube-api-access-d6mn4\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.254495 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2338665c-e47e-47a0-8b70-3fb568fdebc7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.691442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c"} Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.691501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47"} Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.691516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerStarted","Data":"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11"} Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.693227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tr48l" event={"ID":"2338665c-e47e-47a0-8b70-3fb568fdebc7","Type":"ContainerDied","Data":"c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a"} Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.693266 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0abfdd15866c63d5c3d0be398e6172ff7858076b7df3eb8b9a9cbaaf3e4f37a" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.693289 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tr48l" Jan 20 22:53:46 crc kubenswrapper[5030]: I0120 22:53:46.734152 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.111332975 podStartE2EDuration="39.734133679s" podCreationTimestamp="2026-01-20 22:53:07 +0000 UTC" firstStartedPulling="2026-01-20 22:53:25.179405519 +0000 UTC m=+1077.499665817" lastFinishedPulling="2026-01-20 22:53:43.802206233 +0000 UTC m=+1096.122466521" observedRunningTime="2026-01-20 22:53:46.730244865 +0000 UTC m=+1099.050505243" watchObservedRunningTime="2026-01-20 22:53:46.734133679 +0000 UTC m=+1099.054393967" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.037810 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:53:47 crc kubenswrapper[5030]: E0120 22:53:47.038302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2338665c-e47e-47a0-8b70-3fb568fdebc7" containerName="mariadb-account-create-update" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.038314 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2338665c-e47e-47a0-8b70-3fb568fdebc7" containerName="mariadb-account-create-update" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.038462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2338665c-e47e-47a0-8b70-3fb568fdebc7" containerName="mariadb-account-create-update" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.039192 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.041848 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.060108 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.169080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.169132 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhlgf\" (UniqueName: \"kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.169244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.169294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.271613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.271873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.271930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhlgf\" (UniqueName: \"kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.272057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.273303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.273567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.273878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.301873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhlgf\" (UniqueName: \"kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf\") pod \"dnsmasq-dnsmasq-58b8ddd7c-xmthc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.367769 5030 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:47 crc kubenswrapper[5030]: I0120 22:53:47.832051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:53:47 crc kubenswrapper[5030]: W0120 22:53:47.840275 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e33cea_b99d_4398_9ecb_ec247a8eb2cc.slice/crio-f542f5d18fffb2253f8667b07c9e3d8d3f1c73f5909238abf9c771acef804efe WatchSource:0}: Error finding container f542f5d18fffb2253f8667b07c9e3d8d3f1c73f5909238abf9c771acef804efe: Status 404 returned error can't find the container with id f542f5d18fffb2253f8667b07c9e3d8d3f1c73f5909238abf9c771acef804efe Jan 20 22:53:48 crc kubenswrapper[5030]: I0120 22:53:48.710870 5030 generic.go:334] "Generic (PLEG): container finished" podID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerID="ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32" exitCode=0 Jan 20 22:53:48 crc kubenswrapper[5030]: I0120 22:53:48.710971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" event={"ID":"24e33cea-b99d-4398-9ecb-ec247a8eb2cc","Type":"ContainerDied","Data":"ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32"} Jan 20 22:53:48 crc kubenswrapper[5030]: I0120 22:53:48.711229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" event={"ID":"24e33cea-b99d-4398-9ecb-ec247a8eb2cc","Type":"ContainerStarted","Data":"f542f5d18fffb2253f8667b07c9e3d8d3f1c73f5909238abf9c771acef804efe"} Jan 20 22:53:49 crc kubenswrapper[5030]: I0120 22:53:49.726682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" event={"ID":"24e33cea-b99d-4398-9ecb-ec247a8eb2cc","Type":"ContainerStarted","Data":"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043"} Jan 20 22:53:49 crc kubenswrapper[5030]: I0120 22:53:49.728205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:49 crc kubenswrapper[5030]: I0120 22:53:49.772973 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" podStartSLOduration=2.772948466 podStartE2EDuration="2.772948466s" podCreationTimestamp="2026-01-20 22:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:53:49.760597534 +0000 UTC m=+1102.080857872" watchObservedRunningTime="2026-01-20 22:53:49.772948466 +0000 UTC m=+1102.093208794" Jan 20 22:53:51 crc kubenswrapper[5030]: I0120 22:53:51.800924 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.260791 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.262033 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.267574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.279209 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-76gkg"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.280460 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.300563 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-76gkg"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.349954 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-clvw5"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.351506 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.362912 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjx2t\" (UniqueName: \"kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.362985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hpwm\" (UniqueName: \"kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.363012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.363039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.367222 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.377682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-clvw5"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.395678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.396682 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.398928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.424071 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjx2t\" (UniqueName: \"kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hpwm\" (UniqueName: \"kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmg4\" (UniqueName: \"kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2x9v\" (UniqueName: \"kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.466897 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.470915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.481188 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-whswv"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.483348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.502887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjx2t\" (UniqueName: \"kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t\") pod \"cinder-96ed-account-create-update-2qx68\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.503411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hpwm\" (UniqueName: \"kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm\") pod \"cinder-db-create-76gkg\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.508199 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.531929 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-whswv"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6tg\" (UniqueName: \"kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmg4\" (UniqueName: \"kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2x9v\" (UniqueName: \"kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.567959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.568450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.569005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " 
pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.578870 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.588124 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.589254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.595823 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.597903 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.598145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmg4\" (UniqueName: \"kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4\") pod \"barbican-8cf9-account-create-update-jjgf7\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.598182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2x9v\" (UniqueName: \"kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v\") pod \"barbican-db-create-clvw5\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.602279 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.669270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcd28\" (UniqueName: \"kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28\") pod \"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.669314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6tg\" (UniqueName: \"kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.669371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts\") pod \"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.669414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.670219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.677016 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.712592 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-swfqz"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.713614 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.714249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6tg\" (UniqueName: \"kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg\") pod \"neutron-db-create-whswv\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.717033 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.717152 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.717415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x95hg" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.717524 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.735483 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-swfqz"] Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.771556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ds7\" (UniqueName: \"kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.771906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.771969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcd28\" (UniqueName: \"kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28\") pod 
\"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.772020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts\") pod \"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.772044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.775209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts\") pod \"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.775316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.796238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcd28\" (UniqueName: \"kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28\") pod \"neutron-2b61-account-create-update-zzbjt\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.873352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.873799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.873897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ds7\" (UniqueName: \"kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.877195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data\") pod \"keystone-db-sync-swfqz\" (UID: 
\"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.878322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.892115 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:52 crc kubenswrapper[5030]: I0120 22:53:52.900314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ds7\" (UniqueName: \"kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7\") pod \"keystone-db-sync-swfqz\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:52.999972 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.044672 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.050473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68"] Jan 20 22:53:53 crc kubenswrapper[5030]: W0120 22:53:53.066402 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc300a3_f7d4_4061_ba18_268032869088.slice/crio-e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212 WatchSource:0}: Error finding container e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212: Status 404 returned error can't find the container with id e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.110881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-76gkg"] Jan 20 22:53:53 crc kubenswrapper[5030]: W0120 22:53:53.121039 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb344be57_888d_48a1_8b94_26ed9061b2f2.slice/crio-38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222 WatchSource:0}: Error finding container 38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222: Status 404 returned error can't find the container with id 38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.209718 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-clvw5"] Jan 20 22:53:53 crc kubenswrapper[5030]: W0120 22:53:53.222614 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d757282_7e93_4149_82c8_560aa414ff9f.slice/crio-9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d WatchSource:0}: Error finding container 9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d: Status 404 returned error can't find the container with id 
9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.306248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7"] Jan 20 22:53:53 crc kubenswrapper[5030]: W0120 22:53:53.328768 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod208c8689_8a0c_409c_86e6_d0cfa8893c77.slice/crio-0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a WatchSource:0}: Error finding container 0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a: Status 404 returned error can't find the container with id 0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.411888 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-whswv"] Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.516648 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-swfqz"] Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.541924 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt"] Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.782433 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cc300a3-f7d4-4061-ba18-268032869088" containerID="44e9e86c1b6b010df1b0262a613eedd66cdead712f7feb3518e2d40f086da17e" exitCode=0 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.782554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" event={"ID":"2cc300a3-f7d4-4061-ba18-268032869088","Type":"ContainerDied","Data":"44e9e86c1b6b010df1b0262a613eedd66cdead712f7feb3518e2d40f086da17e"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.782603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" event={"ID":"2cc300a3-f7d4-4061-ba18-268032869088","Type":"ContainerStarted","Data":"e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.783847 5030 generic.go:334] "Generic (PLEG): container finished" podID="44044fda-5fae-4af3-8ef5-96c63087bed3" containerID="882cc8c16207535f682a857f2aff37f733997ce119b202b91104337ac96cd344" exitCode=0 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.783883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-whswv" event={"ID":"44044fda-5fae-4af3-8ef5-96c63087bed3","Type":"ContainerDied","Data":"882cc8c16207535f682a857f2aff37f733997ce119b202b91104337ac96cd344"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.784077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-whswv" event={"ID":"44044fda-5fae-4af3-8ef5-96c63087bed3","Type":"ContainerStarted","Data":"e79bc1f6fa144b8b0ec3594d24ce093f443774b9c3cccee9de228b6a89bddd1b"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.786230 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d757282-7e93-4149-82c8-560aa414ff9f" containerID="893109e8602c9959be263bcaa24129014f0586a2671d43af5bd007a9455a8741" exitCode=0 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.786283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-db-create-clvw5" event={"ID":"0d757282-7e93-4149-82c8-560aa414ff9f","Type":"ContainerDied","Data":"893109e8602c9959be263bcaa24129014f0586a2671d43af5bd007a9455a8741"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.786302 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-clvw5" event={"ID":"0d757282-7e93-4149-82c8-560aa414ff9f","Type":"ContainerStarted","Data":"9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.788764 5030 generic.go:334] "Generic (PLEG): container finished" podID="b344be57-888d-48a1-8b94-26ed9061b2f2" containerID="2232c770e9612ca6b18e044232df33e7ad98af8922670115d8c6b21ba6e27375" exitCode=0 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.788822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-76gkg" event={"ID":"b344be57-888d-48a1-8b94-26ed9061b2f2","Type":"ContainerDied","Data":"2232c770e9612ca6b18e044232df33e7ad98af8922670115d8c6b21ba6e27375"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.788844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-76gkg" event={"ID":"b344be57-888d-48a1-8b94-26ed9061b2f2","Type":"ContainerStarted","Data":"38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.790005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" event={"ID":"8dca47bc-b3f4-427a-b88c-451cc8b36bc9","Type":"ContainerStarted","Data":"87c446237eb0d0a77e1c09edd3107fbc18eaa7cd54a831e90a7bf11b04cbaa05"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.791875 5030 generic.go:334] "Generic (PLEG): container finished" podID="208c8689-8a0c-409c-86e6-d0cfa8893c77" containerID="95b2f36df5f25c28ef9a913f2bb113e62e72854104599c6bbe79672b14286604" exitCode=0 Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.791917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" event={"ID":"208c8689-8a0c-409c-86e6-d0cfa8893c77","Type":"ContainerDied","Data":"95b2f36df5f25c28ef9a913f2bb113e62e72854104599c6bbe79672b14286604"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.791985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" event={"ID":"208c8689-8a0c-409c-86e6-d0cfa8893c77","Type":"ContainerStarted","Data":"0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a"} Jan 20 22:53:53 crc kubenswrapper[5030]: I0120 22:53:53.793077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" event={"ID":"01baa0a2-d612-4ca4-9ee2-0957df3b6d77","Type":"ContainerStarted","Data":"bf64e368d08472e3339e0811ab994f88a1fbaa8ffa652b0abf2ad2728613edc9"} Jan 20 22:53:54 crc kubenswrapper[5030]: I0120 22:53:54.805349 5030 generic.go:334] "Generic (PLEG): container finished" podID="01baa0a2-d612-4ca4-9ee2-0957df3b6d77" containerID="524a55892a1a52741bd4d8c5a2facdaa841bff142021e22e5dd212b97c706f95" exitCode=0 Jan 20 22:53:54 crc kubenswrapper[5030]: I0120 22:53:54.805419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" 
event={"ID":"01baa0a2-d612-4ca4-9ee2-0957df3b6d77","Type":"ContainerDied","Data":"524a55892a1a52741bd4d8c5a2facdaa841bff142021e22e5dd212b97c706f95"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.184951 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.214303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls6tg\" (UniqueName: \"kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg\") pod \"44044fda-5fae-4af3-8ef5-96c63087bed3\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.214468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts\") pod \"44044fda-5fae-4af3-8ef5-96c63087bed3\" (UID: \"44044fda-5fae-4af3-8ef5-96c63087bed3\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.215835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44044fda-5fae-4af3-8ef5-96c63087bed3" (UID: "44044fda-5fae-4af3-8ef5-96c63087bed3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.223046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg" (OuterVolumeSpecName: "kube-api-access-ls6tg") pod "44044fda-5fae-4af3-8ef5-96c63087bed3" (UID: "44044fda-5fae-4af3-8ef5-96c63087bed3"). InnerVolumeSpecName "kube-api-access-ls6tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.316048 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44044fda-5fae-4af3-8ef5-96c63087bed3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.316076 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls6tg\" (UniqueName: \"kubernetes.io/projected/44044fda-5fae-4af3-8ef5-96c63087bed3-kube-api-access-ls6tg\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.320916 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.326437 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.331958 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.355583 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.418452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts\") pod \"208c8689-8a0c-409c-86e6-d0cfa8893c77\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.418848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts\") pod \"b344be57-888d-48a1-8b94-26ed9061b2f2\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.418955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmg4\" (UniqueName: \"kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4\") pod \"208c8689-8a0c-409c-86e6-d0cfa8893c77\" (UID: \"208c8689-8a0c-409c-86e6-d0cfa8893c77\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.418990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjx2t\" (UniqueName: \"kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t\") pod \"2cc300a3-f7d4-4061-ba18-268032869088\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.418999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "208c8689-8a0c-409c-86e6-d0cfa8893c77" (UID: "208c8689-8a0c-409c-86e6-d0cfa8893c77"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts\") pod \"0d757282-7e93-4149-82c8-560aa414ff9f\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts\") pod \"2cc300a3-f7d4-4061-ba18-268032869088\" (UID: \"2cc300a3-f7d4-4061-ba18-268032869088\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hpwm\" (UniqueName: \"kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm\") pod \"b344be57-888d-48a1-8b94-26ed9061b2f2\" (UID: \"b344be57-888d-48a1-8b94-26ed9061b2f2\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2x9v\" (UniqueName: \"kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v\") pod \"0d757282-7e93-4149-82c8-560aa414ff9f\" (UID: \"0d757282-7e93-4149-82c8-560aa414ff9f\") " Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b344be57-888d-48a1-8b94-26ed9061b2f2" (UID: "b344be57-888d-48a1-8b94-26ed9061b2f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d757282-7e93-4149-82c8-560aa414ff9f" (UID: "0d757282-7e93-4149-82c8-560aa414ff9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.419682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cc300a3-f7d4-4061-ba18-268032869088" (UID: "2cc300a3-f7d4-4061-ba18-268032869088"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.420157 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/208c8689-8a0c-409c-86e6-d0cfa8893c77-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.420176 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b344be57-888d-48a1-8b94-26ed9061b2f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.420188 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d757282-7e93-4149-82c8-560aa414ff9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.420385 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cc300a3-f7d4-4061-ba18-268032869088-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.424188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm" (OuterVolumeSpecName: "kube-api-access-4hpwm") pod "b344be57-888d-48a1-8b94-26ed9061b2f2" (UID: "b344be57-888d-48a1-8b94-26ed9061b2f2"). InnerVolumeSpecName "kube-api-access-4hpwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.424262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4" (OuterVolumeSpecName: "kube-api-access-4vmg4") pod "208c8689-8a0c-409c-86e6-d0cfa8893c77" (UID: "208c8689-8a0c-409c-86e6-d0cfa8893c77"). InnerVolumeSpecName "kube-api-access-4vmg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.424306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t" (OuterVolumeSpecName: "kube-api-access-pjx2t") pod "2cc300a3-f7d4-4061-ba18-268032869088" (UID: "2cc300a3-f7d4-4061-ba18-268032869088"). InnerVolumeSpecName "kube-api-access-pjx2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.424331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v" (OuterVolumeSpecName: "kube-api-access-w2x9v") pod "0d757282-7e93-4149-82c8-560aa414ff9f" (UID: "0d757282-7e93-4149-82c8-560aa414ff9f"). InnerVolumeSpecName "kube-api-access-w2x9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.522351 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmg4\" (UniqueName: \"kubernetes.io/projected/208c8689-8a0c-409c-86e6-d0cfa8893c77-kube-api-access-4vmg4\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.522396 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjx2t\" (UniqueName: \"kubernetes.io/projected/2cc300a3-f7d4-4061-ba18-268032869088-kube-api-access-pjx2t\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.522405 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hpwm\" (UniqueName: \"kubernetes.io/projected/b344be57-888d-48a1-8b94-26ed9061b2f2-kube-api-access-4hpwm\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.522451 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2x9v\" (UniqueName: \"kubernetes.io/projected/0d757282-7e93-4149-82c8-560aa414ff9f-kube-api-access-w2x9v\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.821012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-clvw5" event={"ID":"0d757282-7e93-4149-82c8-560aa414ff9f","Type":"ContainerDied","Data":"9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.821032 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-clvw5" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.821253 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b03da8d2542bd89c87d8a5febe91c6b8642927bd110bccf0156826de4c0684d" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.823549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-76gkg" event={"ID":"b344be57-888d-48a1-8b94-26ed9061b2f2","Type":"ContainerDied","Data":"38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.823588 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38527754490f24e38ebff68bc275b3146326db2225bf71ee050f79306d222222" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.823763 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-76gkg" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.825276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" event={"ID":"208c8689-8a0c-409c-86e6-d0cfa8893c77","Type":"ContainerDied","Data":"0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.825344 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe2752e27527a2a50033d5a817891dec19d6f374f447c4a86670c7d5008f18a" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.825296 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.827652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" event={"ID":"2cc300a3-f7d4-4061-ba18-268032869088","Type":"ContainerDied","Data":"e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.827687 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6557d357b0c9007d724250a6096bf76885900f9bc20a5f9457310b281b17212" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.827752 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.829710 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-whswv" Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.829932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-whswv" event={"ID":"44044fda-5fae-4af3-8ef5-96c63087bed3","Type":"ContainerDied","Data":"e79bc1f6fa144b8b0ec3594d24ce093f443774b9c3cccee9de228b6a89bddd1b"} Jan 20 22:53:55 crc kubenswrapper[5030]: I0120 22:53:55.829958 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79bc1f6fa144b8b0ec3594d24ce093f443774b9c3cccee9de228b6a89bddd1b" Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.369864 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.445707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.445974 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="dnsmasq-dns" containerID="cri-o://0798a74683ee4a674b978d4932f7546f0e893fc80d67cc004b7de950080e0504" gracePeriod=10 Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.876949 5030 generic.go:334] "Generic (PLEG): container finished" podID="12203532-a911-44b9-a1f8-e63b85b30a32" containerID="0798a74683ee4a674b978d4932f7546f0e893fc80d67cc004b7de950080e0504" exitCode=0 Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.877037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" event={"ID":"12203532-a911-44b9-a1f8-e63b85b30a32","Type":"ContainerDied","Data":"0798a74683ee4a674b978d4932f7546f0e893fc80d67cc004b7de950080e0504"} Jan 20 22:53:57 crc kubenswrapper[5030]: I0120 22:53:57.980727 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.067155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcd28\" (UniqueName: \"kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28\") pod \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.067671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts\") pod \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\" (UID: \"01baa0a2-d612-4ca4-9ee2-0957df3b6d77\") " Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.070714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01baa0a2-d612-4ca4-9ee2-0957df3b6d77" (UID: "01baa0a2-d612-4ca4-9ee2-0957df3b6d77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.071703 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.073529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28" (OuterVolumeSpecName: "kube-api-access-xcd28") pod "01baa0a2-d612-4ca4-9ee2-0957df3b6d77" (UID: "01baa0a2-d612-4ca4-9ee2-0957df3b6d77"). InnerVolumeSpecName "kube-api-access-xcd28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.149813 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.172390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc\") pod \"12203532-a911-44b9-a1f8-e63b85b30a32\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.172480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config\") pod \"12203532-a911-44b9-a1f8-e63b85b30a32\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.172551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6zx2\" (UniqueName: \"kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2\") pod \"12203532-a911-44b9-a1f8-e63b85b30a32\" (UID: \"12203532-a911-44b9-a1f8-e63b85b30a32\") " Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.172880 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcd28\" (UniqueName: \"kubernetes.io/projected/01baa0a2-d612-4ca4-9ee2-0957df3b6d77-kube-api-access-xcd28\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.178080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2" (OuterVolumeSpecName: "kube-api-access-n6zx2") pod "12203532-a911-44b9-a1f8-e63b85b30a32" (UID: "12203532-a911-44b9-a1f8-e63b85b30a32"). InnerVolumeSpecName "kube-api-access-n6zx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.226400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config" (OuterVolumeSpecName: "config") pod "12203532-a911-44b9-a1f8-e63b85b30a32" (UID: "12203532-a911-44b9-a1f8-e63b85b30a32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.231702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "12203532-a911-44b9-a1f8-e63b85b30a32" (UID: "12203532-a911-44b9-a1f8-e63b85b30a32"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.275877 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6zx2\" (UniqueName: \"kubernetes.io/projected/12203532-a911-44b9-a1f8-e63b85b30a32-kube-api-access-n6zx2\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.275912 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.275923 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12203532-a911-44b9-a1f8-e63b85b30a32-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.888592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-t8brv" event={"ID":"419fff1b-62d5-431e-b99c-acf5c8a5e61a","Type":"ContainerStarted","Data":"24ed350f07926d46377d70ae896f08ae5561446525b45f08d3c09e118a1e0e78"} Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.891632 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.891654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt" event={"ID":"01baa0a2-d612-4ca4-9ee2-0957df3b6d77","Type":"ContainerDied","Data":"bf64e368d08472e3339e0811ab994f88a1fbaa8ffa652b0abf2ad2728613edc9"} Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.891694 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf64e368d08472e3339e0811ab994f88a1fbaa8ffa652b0abf2ad2728613edc9" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.894052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" event={"ID":"12203532-a911-44b9-a1f8-e63b85b30a32","Type":"ContainerDied","Data":"1a96d8919232e73788dec23dfe45a254919e8779c7d984fa19ab00fb9c412e42"} Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.894091 5030 scope.go:117] "RemoveContainer" containerID="0798a74683ee4a674b978d4932f7546f0e893fc80d67cc004b7de950080e0504" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.894216 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.896957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" event={"ID":"8dca47bc-b3f4-427a-b88c-451cc8b36bc9","Type":"ContainerStarted","Data":"cd95a6397db4e9560793bf5984d28722a88bf228d5a81f2f64fa90df41af5578"} Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.929493 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-t8brv" podStartSLOduration=2.532771063 podStartE2EDuration="32.929354438s" podCreationTimestamp="2026-01-20 22:53:26 +0000 UTC" firstStartedPulling="2026-01-20 22:53:27.601294573 +0000 UTC m=+1079.921554861" lastFinishedPulling="2026-01-20 22:53:57.997877938 +0000 UTC m=+1110.318138236" observedRunningTime="2026-01-20 22:53:58.925962995 +0000 UTC m=+1111.246223303" watchObservedRunningTime="2026-01-20 22:53:58.929354438 +0000 UTC m=+1111.249614806" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.951738 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" podStartSLOduration=2.463352412 podStartE2EDuration="6.951712794s" podCreationTimestamp="2026-01-20 22:53:52 +0000 UTC" firstStartedPulling="2026-01-20 22:53:53.523936858 +0000 UTC m=+1105.844197146" lastFinishedPulling="2026-01-20 22:53:58.01229723 +0000 UTC m=+1110.332557528" observedRunningTime="2026-01-20 22:53:58.943093003 +0000 UTC m=+1111.263353291" watchObservedRunningTime="2026-01-20 22:53:58.951712794 +0000 UTC m=+1111.271973122" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.978835 5030 scope.go:117] "RemoveContainer" containerID="fb19b49479cb655206e3149fcba22f286b15ec9d6c9d5548e3e033525c9b09a9" Jan 20 22:53:58 crc kubenswrapper[5030]: I0120 22:53:58.986457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:53:59 crc kubenswrapper[5030]: I0120 22:53:59.030118 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q9pxj"] Jan 20 22:53:59 crc kubenswrapper[5030]: I0120 22:53:59.976415 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" path="/var/lib/kubelet/pods/12203532-a911-44b9-a1f8-e63b85b30a32/volumes" Jan 20 22:54:02 crc kubenswrapper[5030]: I0120 22:54:02.539940 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:54:03 crc kubenswrapper[5030]: I0120 22:54:03.947092 5030 generic.go:334] "Generic (PLEG): container finished" podID="8dca47bc-b3f4-427a-b88c-451cc8b36bc9" containerID="cd95a6397db4e9560793bf5984d28722a88bf228d5a81f2f64fa90df41af5578" exitCode=0 Jan 20 22:54:03 crc kubenswrapper[5030]: I0120 22:54:03.947208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" event={"ID":"8dca47bc-b3f4-427a-b88c-451cc8b36bc9","Type":"ContainerDied","Data":"cd95a6397db4e9560793bf5984d28722a88bf228d5a81f2f64fa90df41af5578"} Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.266358 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.312500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54ds7\" (UniqueName: \"kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7\") pod \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.312653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle\") pod \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.312721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data\") pod \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\" (UID: \"8dca47bc-b3f4-427a-b88c-451cc8b36bc9\") " Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.321361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7" (OuterVolumeSpecName: "kube-api-access-54ds7") pod "8dca47bc-b3f4-427a-b88c-451cc8b36bc9" (UID: "8dca47bc-b3f4-427a-b88c-451cc8b36bc9"). InnerVolumeSpecName "kube-api-access-54ds7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.353792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dca47bc-b3f4-427a-b88c-451cc8b36bc9" (UID: "8dca47bc-b3f4-427a-b88c-451cc8b36bc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.361094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data" (OuterVolumeSpecName: "config-data") pod "8dca47bc-b3f4-427a-b88c-451cc8b36bc9" (UID: "8dca47bc-b3f4-427a-b88c-451cc8b36bc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.415347 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54ds7\" (UniqueName: \"kubernetes.io/projected/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-kube-api-access-54ds7\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.415421 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.415447 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dca47bc-b3f4-427a-b88c-451cc8b36bc9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.974042 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.980564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-swfqz" event={"ID":"8dca47bc-b3f4-427a-b88c-451cc8b36bc9","Type":"ContainerDied","Data":"87c446237eb0d0a77e1c09edd3107fbc18eaa7cd54a831e90a7bf11b04cbaa05"} Jan 20 22:54:05 crc kubenswrapper[5030]: I0120 22:54:05.980718 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c446237eb0d0a77e1c09edd3107fbc18eaa7cd54a831e90a7bf11b04cbaa05" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.167661 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-8mzhq"] Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168443 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dca47bc-b3f4-427a-b88c-451cc8b36bc9" containerName="keystone-db-sync" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168463 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dca47bc-b3f4-427a-b88c-451cc8b36bc9" containerName="keystone-db-sync" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168481 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01baa0a2-d612-4ca4-9ee2-0957df3b6d77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168489 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="01baa0a2-d612-4ca4-9ee2-0957df3b6d77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168504 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc300a3-f7d4-4061-ba18-268032869088" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168510 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc300a3-f7d4-4061-ba18-268032869088" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168527 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="init" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="init" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168559 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44044fda-5fae-4af3-8ef5-96c63087bed3" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="44044fda-5fae-4af3-8ef5-96c63087bed3" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168588 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d757282-7e93-4149-82c8-560aa414ff9f" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168594 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d757282-7e93-4149-82c8-560aa414ff9f" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168629 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="dnsmasq-dns" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168636 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="dnsmasq-dns" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168653 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="208c8689-8a0c-409c-86e6-d0cfa8893c77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168660 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="208c8689-8a0c-409c-86e6-d0cfa8893c77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: E0120 22:54:06.168675 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b344be57-888d-48a1-8b94-26ed9061b2f2" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.168682 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b344be57-888d-48a1-8b94-26ed9061b2f2" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169015 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="12203532-a911-44b9-a1f8-e63b85b30a32" containerName="dnsmasq-dns" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169032 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="01baa0a2-d612-4ca4-9ee2-0957df3b6d77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169041 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dca47bc-b3f4-427a-b88c-451cc8b36bc9" containerName="keystone-db-sync" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169064 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d757282-7e93-4149-82c8-560aa414ff9f" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169084 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="44044fda-5fae-4af3-8ef5-96c63087bed3" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169098 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc300a3-f7d4-4061-ba18-268032869088" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169110 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b344be57-888d-48a1-8b94-26ed9061b2f2" containerName="mariadb-database-create" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.169128 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="208c8689-8a0c-409c-86e6-d0cfa8893c77" containerName="mariadb-account-create-update" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.174937 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.190649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.190945 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.191164 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.191382 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x95hg" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.191594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227313 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42pf\" (UniqueName: \"kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.227546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.239901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/keystone-bootstrap-8mzhq"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.328842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k42pf\" (UniqueName: \"kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.328923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.329017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.329050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.329109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.329149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.348154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.348824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.348946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc 
kubenswrapper[5030]: I0120 22:54:06.349400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.352338 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-k84rf"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.353366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.357817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.358212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.358385 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-6drfn" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.358960 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.384211 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qcfbt"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.385252 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.392833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42pf\" (UniqueName: \"kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf\") pod \"keystone-bootstrap-8mzhq\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.393979 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.394223 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.394343 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-nph5t" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.394601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-k84rf"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.406681 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qcfbt"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.424751 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xldkf"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.425860 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430635 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptl7b\" (UniqueName: \"kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jlb\" (UniqueName: \"kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.430830 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.437922 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.438147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-ng9ck" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.438271 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.442084 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xldkf"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.471206 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.473146 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.476998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.477224 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.497327 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-x4mch"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.498274 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.507738 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.515687 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.517140 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.525017 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-pqgvp" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.531272 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-x4mch"] Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptl7b\" (UniqueName: \"kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4txp\" (UniqueName: \"kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd\") pod 
\"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96x86\" (UniqueName: \"kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532299 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkpzl\" (UniqueName: \"kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532546 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jlb\" (UniqueName: \"kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.532579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.537477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.537576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " 
pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.546921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.550432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.551966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.563043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptl7b\" (UniqueName: \"kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.577220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.577730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config\") pod \"neutron-db-sync-qcfbt\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.594511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jlb\" (UniqueName: \"kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb\") pod \"cinder-db-sync-k84rf\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635729 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkpzl\" (UniqueName: \"kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4txp\" (UniqueName: \"kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96x86\" (UniqueName: \"kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.635975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.646355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.650998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.651454 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.661983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.663317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.663526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: 
I0120 22:54:06.663850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.671144 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.672399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.673988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.674469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.676083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkpzl\" (UniqueName: \"kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl\") pod \"placement-db-sync-xldkf\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.683249 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.690393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96x86\" (UniqueName: \"kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.703440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4txp\" (UniqueName: \"kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp\") pod \"barbican-db-sync-x4mch\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.704059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts\") pod \"ceilometer-0\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.713117 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.730377 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.742963 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:06 crc kubenswrapper[5030]: I0120 22:54:06.754377 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:07 crc kubenswrapper[5030]: W0120 22:54:07.154336 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b05918_cd1e_4650_ae50_80bf1e025669.slice/crio-5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5 WatchSource:0}: Error finding container 5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5: Status 404 returned error can't find the container with id 5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5 Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.156972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-8mzhq"] Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.240369 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qcfbt"] Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.246753 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-k84rf"] Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.257471 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xldkf"] Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.376702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-x4mch"] Jan 20 22:54:07 crc kubenswrapper[5030]: W0120 22:54:07.387171 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cb00a6c_b955_4b9f_b41c_06978a151b1e.slice/crio-8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545 WatchSource:0}: Error finding container 8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545: Status 404 returned error can't find the container with id 8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545 Jan 20 22:54:07 crc kubenswrapper[5030]: I0120 22:54:07.427281 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:07 crc kubenswrapper[5030]: W0120 22:54:07.430979 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7038973c_1328_4d3d_a720_11483c795e71.slice/crio-7638d3655f2a90d90497e13c0e6555607d3615b0c89974f2423a9d50d37e17f5 WatchSource:0}: Error finding container 7638d3655f2a90d90497e13c0e6555607d3615b0c89974f2423a9d50d37e17f5: Status 404 returned error can't find the container with id 7638d3655f2a90d90497e13c0e6555607d3615b0c89974f2423a9d50d37e17f5 Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.012838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerStarted","Data":"7638d3655f2a90d90497e13c0e6555607d3615b0c89974f2423a9d50d37e17f5"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.024718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" event={"ID":"307f56f7-ed99-4389-a33f-7e1c8dda480b","Type":"ContainerStarted","Data":"88738cbb413569f4c4058296972ed607d6cd267a123d3867672cea173147a930"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.030008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" 
event={"ID":"cf0019c8-f547-4abb-84db-73e5bc5e3e53","Type":"ContainerStarted","Data":"0d5fc4f1b3c7ed51fa948b656f3eb485e97fb558a4d5d8321c931ebb2fed3f7e"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.030077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" event={"ID":"cf0019c8-f547-4abb-84db-73e5bc5e3e53","Type":"ContainerStarted","Data":"b6fa9727e9f222d113d620f8431c3439a00a8213a66adb92e730125e1dc96df1"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.036973 5030 generic.go:334] "Generic (PLEG): container finished" podID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" containerID="24ed350f07926d46377d70ae896f08ae5561446525b45f08d3c09e118a1e0e78" exitCode=0 Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.037050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-t8brv" event={"ID":"419fff1b-62d5-431e-b99c-acf5c8a5e61a","Type":"ContainerDied","Data":"24ed350f07926d46377d70ae896f08ae5561446525b45f08d3c09e118a1e0e78"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.039554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" event={"ID":"2cb00a6c-b955-4b9f-b41c-06978a151b1e","Type":"ContainerStarted","Data":"8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.045823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" event={"ID":"e6b05918-cd1e-4650-ae50-80bf1e025669","Type":"ContainerStarted","Data":"8bd8acb85c09b4e34102253ea7289fb53effc5b251d4dec799a5f526c8eb6c2b"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.045855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" event={"ID":"e6b05918-cd1e-4650-ae50-80bf1e025669","Type":"ContainerStarted","Data":"5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.047897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xldkf" event={"ID":"8e2ce9c3-f457-4923-8ffa-8a393b4e176d","Type":"ContainerStarted","Data":"f33dd61b71d0d81defbc5761bff9510d19fd5f57e322e767e1db834b1e832dec"} Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.101016 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" podStartSLOduration=2.100910501 podStartE2EDuration="2.100910501s" podCreationTimestamp="2026-01-20 22:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:08.088409446 +0000 UTC m=+1120.408669734" watchObservedRunningTime="2026-01-20 22:54:08.100910501 +0000 UTC m=+1120.421170789" Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.126720 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" podStartSLOduration=2.12670317 podStartE2EDuration="2.12670317s" podCreationTimestamp="2026-01-20 22:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:08.12216619 +0000 UTC m=+1120.442426478" watchObservedRunningTime="2026-01-20 22:54:08.12670317 +0000 UTC m=+1120.446963458" Jan 20 22:54:08 crc kubenswrapper[5030]: I0120 22:54:08.890459 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:11 crc kubenswrapper[5030]: I0120 22:54:11.093076 5030 generic.go:334] "Generic (PLEG): container finished" podID="e6b05918-cd1e-4650-ae50-80bf1e025669" containerID="8bd8acb85c09b4e34102253ea7289fb53effc5b251d4dec799a5f526c8eb6c2b" exitCode=0 Jan 20 22:54:11 crc kubenswrapper[5030]: I0120 22:54:11.093567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" event={"ID":"e6b05918-cd1e-4650-ae50-80bf1e025669","Type":"ContainerDied","Data":"8bd8acb85c09b4e34102253ea7289fb53effc5b251d4dec799a5f526c8eb6c2b"} Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.598375 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.712930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.713112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.713163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.713189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.713281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.713332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k42pf\" (UniqueName: \"kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf\") pod \"e6b05918-cd1e-4650-ae50-80bf1e025669\" (UID: \"e6b05918-cd1e-4650-ae50-80bf1e025669\") " Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.719473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.720038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.720889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf" (OuterVolumeSpecName: "kube-api-access-k42pf") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "kube-api-access-k42pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.721574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts" (OuterVolumeSpecName: "scripts") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.750399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data" (OuterVolumeSpecName: "config-data") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.760740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6b05918-cd1e-4650-ae50-80bf1e025669" (UID: "e6b05918-cd1e-4650-ae50-80bf1e025669"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815914 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815944 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815953 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815961 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815971 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k42pf\" (UniqueName: \"kubernetes.io/projected/e6b05918-cd1e-4650-ae50-80bf1e025669-kube-api-access-k42pf\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:15 crc kubenswrapper[5030]: I0120 22:54:15.815980 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6b05918-cd1e-4650-ae50-80bf1e025669-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.162055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" event={"ID":"e6b05918-cd1e-4650-ae50-80bf1e025669","Type":"ContainerDied","Data":"5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5"} Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.162088 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee24c6d05f173669f145eef1be66ae524110e7a37df633afae6c66084b1d0a5" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.162224 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-8mzhq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.689122 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-8mzhq"] Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.695554 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-8mzhq"] Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.782214 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-b9nnq"] Jan 20 22:54:16 crc kubenswrapper[5030]: E0120 22:54:16.782789 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b05918-cd1e-4650-ae50-80bf1e025669" containerName="keystone-bootstrap" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.782812 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b05918-cd1e-4650-ae50-80bf1e025669" containerName="keystone-bootstrap" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.783061 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b05918-cd1e-4650-ae50-80bf1e025669" containerName="keystone-bootstrap" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.783675 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.787255 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.787588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.787763 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.787913 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.788097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x95hg" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.792328 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-b9nnq"] Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836256 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckch2\" (UniqueName: \"kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data\") pod 
\"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.836470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.937713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.938397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckch2\" (UniqueName: \"kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.938479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.938502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.938573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.938712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts\") pod 
\"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.942520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.943713 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.943980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.950117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.953435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:16 crc kubenswrapper[5030]: I0120 22:54:16.954689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckch2\" (UniqueName: \"kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2\") pod \"keystone-bootstrap-b9nnq\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.101506 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.172682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-t8brv" event={"ID":"419fff1b-62d5-431e-b99c-acf5c8a5e61a","Type":"ContainerDied","Data":"8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1"} Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.172751 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cab31abe4822cd6dd3582b1e7cd352b5668d73de8adda5f757a602e52da90e1" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.211877 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.349449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data\") pod \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.350013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle\") pod \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.350063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data\") pod \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.350153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pflb9\" (UniqueName: \"kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9\") pod \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\" (UID: \"419fff1b-62d5-431e-b99c-acf5c8a5e61a\") " Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.354529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9" (OuterVolumeSpecName: "kube-api-access-pflb9") pod "419fff1b-62d5-431e-b99c-acf5c8a5e61a" (UID: "419fff1b-62d5-431e-b99c-acf5c8a5e61a"). InnerVolumeSpecName "kube-api-access-pflb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.356732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "419fff1b-62d5-431e-b99c-acf5c8a5e61a" (UID: "419fff1b-62d5-431e-b99c-acf5c8a5e61a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.385125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "419fff1b-62d5-431e-b99c-acf5c8a5e61a" (UID: "419fff1b-62d5-431e-b99c-acf5c8a5e61a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.396245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data" (OuterVolumeSpecName: "config-data") pod "419fff1b-62d5-431e-b99c-acf5c8a5e61a" (UID: "419fff1b-62d5-431e-b99c-acf5c8a5e61a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.452555 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.452602 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.452641 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pflb9\" (UniqueName: \"kubernetes.io/projected/419fff1b-62d5-431e-b99c-acf5c8a5e61a-kube-api-access-pflb9\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.452665 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/419fff1b-62d5-431e-b99c-acf5c8a5e61a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:17 crc kubenswrapper[5030]: I0120 22:54:17.974121 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b05918-cd1e-4650-ae50-80bf1e025669" path="/var/lib/kubelet/pods/e6b05918-cd1e-4650-ae50-80bf1e025669/volumes" Jan 20 22:54:18 crc kubenswrapper[5030]: I0120 22:54:18.180418 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-t8brv" Jan 20 22:54:19 crc kubenswrapper[5030]: E0120 22:54:19.080568 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 20 22:54:19 crc kubenswrapper[5030]: E0120 22:54:19.080905 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4txp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-x4mch_openstack-kuttl-tests(2cb00a6c-b955-4b9f-b41c-06978a151b1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:54:19 crc kubenswrapper[5030]: E0120 22:54:19.082710 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" Jan 20 22:54:19 crc kubenswrapper[5030]: E0120 22:54:19.193731 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.672415 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:19 crc kubenswrapper[5030]: E0120 22:54:19.672799 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" containerName="glance-db-sync" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.672816 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" containerName="glance-db-sync" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.672978 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" containerName="glance-db-sync" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.673840 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.678360 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.687284 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.687390 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-7sczs" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.687511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.725985 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.727551 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.729907 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.762812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7h4\" (UniqueName: \"kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.789897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.790045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whxpq\" (UniqueName: \"kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.790108 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.790134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.790161 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.891956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892285 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mh7h4\" (UniqueName: \"kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whxpq\" (UniqueName: \"kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892348 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892390 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.892835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.893477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.894048 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.895894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.897632 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.897655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.898243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.899608 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.902079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.905446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.911529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.912317 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-whxpq\" (UniqueName: \"kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.914061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7h4\" (UniqueName: \"kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.922259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:19 crc kubenswrapper[5030]: I0120 22:54:19.930152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:20 crc kubenswrapper[5030]: I0120 22:54:20.005225 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:20 crc kubenswrapper[5030]: I0120 22:54:20.049017 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:21 crc kubenswrapper[5030]: I0120 22:54:21.020166 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:21 crc kubenswrapper[5030]: I0120 22:54:21.083811 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:27 crc kubenswrapper[5030]: E0120 22:54:27.058970 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 20 22:54:27 crc kubenswrapper[5030]: E0120 22:54:27.059444 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncch67fh697h579hch5fdh564h647h56fh587h699h5bch68ch59ch57ch656h78h649hcbhf9hch565h58bh69h89h575h6dh58bh546h54bhcbhf8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96x86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack-kuttl-tests(7038973c-1328-4d3d-a720-11483c795e71): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:54:29 crc kubenswrapper[5030]: E0120 22:54:29.249713 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 20 22:54:29 crc kubenswrapper[5030]: E0120 22:54:29.250704 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6jlb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-k84rf_openstack-kuttl-tests(307f56f7-ed99-4389-a33f-7e1c8dda480b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 22:54:29 crc kubenswrapper[5030]: E0120 22:54:29.252280 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" Jan 20 22:54:29 crc kubenswrapper[5030]: E0120 22:54:29.275204 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" Jan 20 22:54:29 crc kubenswrapper[5030]: I0120 22:54:29.739605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:29 crc kubenswrapper[5030]: I0120 22:54:29.813243 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/keystone-bootstrap-b9nnq"] Jan 20 22:54:29 crc kubenswrapper[5030]: I0120 22:54:29.823508 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:29 crc kubenswrapper[5030]: W0120 22:54:29.990721 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715ee943_81f3_4215_8e95_2dbb93ff0e3d.slice/crio-864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66 WatchSource:0}: Error finding container 864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66: Status 404 returned error can't find the container with id 864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66 Jan 20 22:54:29 crc kubenswrapper[5030]: W0120 22:54:29.993695 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52aa2084_808f_4185_af31_a2357b5df57a.slice/crio-56b52ab6f2d146c2298a60158fbd92cbaa0736b1b3ca0e6903c57c35da1b1189 WatchSource:0}: Error finding container 56b52ab6f2d146c2298a60158fbd92cbaa0736b1b3ca0e6903c57c35da1b1189: Status 404 returned error can't find the container with id 56b52ab6f2d146c2298a60158fbd92cbaa0736b1b3ca0e6903c57c35da1b1189 Jan 20 22:54:29 crc kubenswrapper[5030]: I0120 22:54:29.997016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 22:54:30 crc kubenswrapper[5030]: I0120 22:54:30.285459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerStarted","Data":"e93d5e697c3b8fc44a60d2d43286803e4863a30ccaf57d686864f5640031db36"} Jan 20 22:54:30 crc kubenswrapper[5030]: I0120 22:54:30.288917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xldkf" event={"ID":"8e2ce9c3-f457-4923-8ffa-8a393b4e176d","Type":"ContainerStarted","Data":"49d979ae396897b6fe84c0b16f5ec543ac67241f64c0cec0f9462b3f9343ec6c"} Jan 20 22:54:30 crc kubenswrapper[5030]: I0120 22:54:30.294859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" event={"ID":"715ee943-81f3-4215-8e95-2dbb93ff0e3d","Type":"ContainerStarted","Data":"864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66"} Jan 20 22:54:30 crc kubenswrapper[5030]: I0120 22:54:30.301973 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerStarted","Data":"56b52ab6f2d146c2298a60158fbd92cbaa0736b1b3ca0e6903c57c35da1b1189"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.315119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerStarted","Data":"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.319061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" event={"ID":"2cb00a6c-b955-4b9f-b41c-06978a151b1e","Type":"ContainerStarted","Data":"5d14d0e5317ecfbb409f5ce07548372a72d4a18c0e73ffe51bb4f499086b4b8e"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.324542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerStarted","Data":"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.326491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" event={"ID":"715ee943-81f3-4215-8e95-2dbb93ff0e3d","Type":"ContainerStarted","Data":"ad58e0af4cff640a3546268c9d9f769b218b122f4b3e5f8c8c22aee0588d00cc"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.335579 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" podStartSLOduration=2.276154779 podStartE2EDuration="25.335562293s" podCreationTimestamp="2026-01-20 22:54:06 +0000 UTC" firstStartedPulling="2026-01-20 22:54:07.389563775 +0000 UTC m=+1119.709824063" lastFinishedPulling="2026-01-20 22:54:30.448971289 +0000 UTC m=+1142.769231577" observedRunningTime="2026-01-20 22:54:31.333491673 +0000 UTC m=+1143.653751961" watchObservedRunningTime="2026-01-20 22:54:31.335562293 +0000 UTC m=+1143.655822581" Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.337765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerStarted","Data":"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b"} Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.338912 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-xldkf" podStartSLOduration=3.423466499 podStartE2EDuration="25.338899225s" podCreationTimestamp="2026-01-20 22:54:06 +0000 UTC" firstStartedPulling="2026-01-20 22:54:07.272027496 +0000 UTC m=+1119.592287784" lastFinishedPulling="2026-01-20 22:54:29.187460192 +0000 UTC m=+1141.507720510" observedRunningTime="2026-01-20 22:54:30.30935088 +0000 UTC m=+1142.629611188" watchObservedRunningTime="2026-01-20 22:54:31.338899225 +0000 UTC m=+1143.659159513" Jan 20 22:54:31 crc kubenswrapper[5030]: I0120 22:54:31.355600 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" podStartSLOduration=15.355584463 podStartE2EDuration="15.355584463s" podCreationTimestamp="2026-01-20 22:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:31.354442795 +0000 UTC m=+1143.674703083" watchObservedRunningTime="2026-01-20 22:54:31.355584463 +0000 UTC m=+1143.675844751" Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.347844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerStarted","Data":"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819"} Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.348221 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-log" containerID="cri-o://54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" gracePeriod=30 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.348603 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-httpd" containerID="cri-o://14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" gracePeriod=30 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.352149 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e2ce9c3-f457-4923-8ffa-8a393b4e176d" containerID="49d979ae396897b6fe84c0b16f5ec543ac67241f64c0cec0f9462b3f9343ec6c" exitCode=0 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.352188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xldkf" event={"ID":"8e2ce9c3-f457-4923-8ffa-8a393b4e176d","Type":"ContainerDied","Data":"49d979ae396897b6fe84c0b16f5ec543ac67241f64c0cec0f9462b3f9343ec6c"} Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.354095 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf0019c8-f547-4abb-84db-73e5bc5e3e53" containerID="0d5fc4f1b3c7ed51fa948b656f3eb485e97fb558a4d5d8321c931ebb2fed3f7e" exitCode=0 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.354145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" event={"ID":"cf0019c8-f547-4abb-84db-73e5bc5e3e53","Type":"ContainerDied","Data":"0d5fc4f1b3c7ed51fa948b656f3eb485e97fb558a4d5d8321c931ebb2fed3f7e"} Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.370303 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=14.370287784 podStartE2EDuration="14.370287784s" podCreationTimestamp="2026-01-20 22:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:32.369081315 +0000 UTC m=+1144.689341623" watchObservedRunningTime="2026-01-20 22:54:32.370287784 +0000 UTC m=+1144.690548072" Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.370741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerStarted","Data":"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f"} Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.371003 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-httpd" containerID="cri-o://25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" gracePeriod=30 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.371105 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-log" containerID="cri-o://0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" gracePeriod=30 Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.446127 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=14.446111666 podStartE2EDuration="14.446111666s" podCreationTimestamp="2026-01-20 22:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:32.422071878 +0000 UTC m=+1144.742332166" 
watchObservedRunningTime="2026-01-20 22:54:32.446111666 +0000 UTC m=+1144.766371954" Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.929224 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:32 crc kubenswrapper[5030]: I0120 22:54:32.986392 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.034323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.034371 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.034409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.034773 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.034972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.035092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs" (OuterVolumeSpecName: "logs") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.035159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.035325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.035614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.035658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs" (OuterVolumeSpecName: "logs") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whxpq\" (UniqueName: \"kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq\") pod \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\" (UID: \"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.036454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh7h4\" (UniqueName: \"kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4\") pod \"52aa2084-808f-4185-af31-a2357b5df57a\" (UID: \"52aa2084-808f-4185-af31-a2357b5df57a\") " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.037715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.039087 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.039220 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.039235 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.039246 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52aa2084-808f-4185-af31-a2357b5df57a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.042581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.044318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq" (OuterVolumeSpecName: "kube-api-access-whxpq") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "kube-api-access-whxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.044538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts" (OuterVolumeSpecName: "scripts") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.046127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts" (OuterVolumeSpecName: "scripts") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.047409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.047601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4" (OuterVolumeSpecName: "kube-api-access-mh7h4") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). 
InnerVolumeSpecName "kube-api-access-mh7h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.068384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.074801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.090567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data" (OuterVolumeSpecName: "config-data") pod "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" (UID: "39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.112981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data" (OuterVolumeSpecName: "config-data") pod "52aa2084-808f-4185-af31-a2357b5df57a" (UID: "52aa2084-808f-4185-af31-a2357b5df57a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.140884 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.140917 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.140928 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.140976 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.140993 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.141010 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.141022 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whxpq\" (UniqueName: \"kubernetes.io/projected/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2-kube-api-access-whxpq\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.141038 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.141048 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa2084-808f-4185-af31-a2357b5df57a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.141057 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh7h4\" (UniqueName: \"kubernetes.io/projected/52aa2084-808f-4185-af31-a2357b5df57a-kube-api-access-mh7h4\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.157817 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.159023 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.242574 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.242610 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383519 5030 generic.go:334] "Generic (PLEG): container finished" podID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerID="14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" exitCode=0 Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383587 5030 generic.go:334] "Generic (PLEG): container finished" podID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerID="54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" exitCode=143 Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerDied","Data":"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383668 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerDied","Data":"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2","Type":"ContainerDied","Data":"e93d5e697c3b8fc44a60d2d43286803e4863a30ccaf57d686864f5640031db36"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.383723 5030 scope.go:117] "RemoveContainer" containerID="14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.388157 5030 generic.go:334] "Generic (PLEG): container finished" podID="52aa2084-808f-4185-af31-a2357b5df57a" containerID="25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" exitCode=0 Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.388182 5030 generic.go:334] "Generic (PLEG): container finished" podID="52aa2084-808f-4185-af31-a2357b5df57a" containerID="0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" exitCode=143 Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.388577 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.388989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerDied","Data":"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.389019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerDied","Data":"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.389028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"52aa2084-808f-4185-af31-a2357b5df57a","Type":"ContainerDied","Data":"56b52ab6f2d146c2298a60158fbd92cbaa0736b1b3ca0e6903c57c35da1b1189"} Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.436886 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.461914 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.500807 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.511677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.522983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: E0120 22:54:33.523451 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523471 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: E0120 22:54:33.523526 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: E0120 22:54:33.523588 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523596 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: E0120 22:54:33.523606 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523611 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 
22:54:33.523862 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523907 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-httpd" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523919 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.523929 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52aa2084-808f-4185-af31-a2357b5df57a" containerName="glance-log" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.525257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.529059 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-7sczs" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.530163 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.530582 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.531877 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.533469 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.535348 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.540341 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.540414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.542461 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.546759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.546849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pssm2\" (UniqueName: \"kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.546906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.546939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.546991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.547022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.547043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc 
kubenswrapper[5030]: I0120 22:54:33.547081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.552609 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648478 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648527 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4p67j\" (UniqueName: \"kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pssm2\" (UniqueName: \"kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.648823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc 
kubenswrapper[5030]: I0120 22:54:33.648975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.649386 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.650301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.653919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.654873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.664105 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.664877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.671906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pssm2\" (UniqueName: \"kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.675744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 
22:54:33.751553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p67j\" (UniqueName: \"kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.751874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.754110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.754206 
5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.754286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.757338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.757599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.759431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.770855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p67j\" (UniqueName: \"kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.780823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.810119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.855052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.887868 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.973320 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2" path="/var/lib/kubelet/pods/39a2c4cb-d1b4-4083-8753-8ed33f3bbbf2/volumes" Jan 20 22:54:33 crc kubenswrapper[5030]: I0120 22:54:33.974133 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52aa2084-808f-4185-af31-a2357b5df57a" path="/var/lib/kubelet/pods/52aa2084-808f-4185-af31-a2357b5df57a/volumes" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.414703 5030 generic.go:334] "Generic (PLEG): container finished" podID="715ee943-81f3-4215-8e95-2dbb93ff0e3d" containerID="ad58e0af4cff640a3546268c9d9f769b218b122f4b3e5f8c8c22aee0588d00cc" exitCode=0 Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.414908 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" event={"ID":"715ee943-81f3-4215-8e95-2dbb93ff0e3d","Type":"ContainerDied","Data":"ad58e0af4cff640a3546268c9d9f769b218b122f4b3e5f8c8c22aee0588d00cc"} Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.417879 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" containerID="5d14d0e5317ecfbb409f5ce07548372a72d4a18c0e73ffe51bb4f499086b4b8e" exitCode=0 Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.418366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" event={"ID":"2cb00a6c-b955-4b9f-b41c-06978a151b1e","Type":"ContainerDied","Data":"5d14d0e5317ecfbb409f5ce07548372a72d4a18c0e73ffe51bb4f499086b4b8e"} Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.607029 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.614698 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.767897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data\") pod \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.767944 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkpzl\" (UniqueName: \"kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl\") pod \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.767970 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle\") pod \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.768028 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config\") pod \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.768076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs\") pod \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.768141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptl7b\" (UniqueName: \"kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b\") pod \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\" (UID: \"cf0019c8-f547-4abb-84db-73e5bc5e3e53\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.768185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle\") pod \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.768294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts\") pod \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\" (UID: \"8e2ce9c3-f457-4923-8ffa-8a393b4e176d\") " Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.770994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs" (OuterVolumeSpecName: "logs") pod "8e2ce9c3-f457-4923-8ffa-8a393b4e176d" (UID: "8e2ce9c3-f457-4923-8ffa-8a393b4e176d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.773712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts" (OuterVolumeSpecName: "scripts") pod "8e2ce9c3-f457-4923-8ffa-8a393b4e176d" (UID: "8e2ce9c3-f457-4923-8ffa-8a393b4e176d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.774317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl" (OuterVolumeSpecName: "kube-api-access-lkpzl") pod "8e2ce9c3-f457-4923-8ffa-8a393b4e176d" (UID: "8e2ce9c3-f457-4923-8ffa-8a393b4e176d"). InnerVolumeSpecName "kube-api-access-lkpzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.774990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b" (OuterVolumeSpecName: "kube-api-access-ptl7b") pod "cf0019c8-f547-4abb-84db-73e5bc5e3e53" (UID: "cf0019c8-f547-4abb-84db-73e5bc5e3e53"). InnerVolumeSpecName "kube-api-access-ptl7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.794174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf0019c8-f547-4abb-84db-73e5bc5e3e53" (UID: "cf0019c8-f547-4abb-84db-73e5bc5e3e53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.808324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config" (OuterVolumeSpecName: "config") pod "cf0019c8-f547-4abb-84db-73e5bc5e3e53" (UID: "cf0019c8-f547-4abb-84db-73e5bc5e3e53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.819336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e2ce9c3-f457-4923-8ffa-8a393b4e176d" (UID: "8e2ce9c3-f457-4923-8ffa-8a393b4e176d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.820065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data" (OuterVolumeSpecName: "config-data") pod "8e2ce9c3-f457-4923-8ffa-8a393b4e176d" (UID: "8e2ce9c3-f457-4923-8ffa-8a393b4e176d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870037 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870070 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870079 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkpzl\" (UniqueName: \"kubernetes.io/projected/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-kube-api-access-lkpzl\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870091 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870100 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf0019c8-f547-4abb-84db-73e5bc5e3e53-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870109 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870120 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptl7b\" (UniqueName: \"kubernetes.io/projected/cf0019c8-f547-4abb-84db-73e5bc5e3e53-kube-api-access-ptl7b\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:34 crc kubenswrapper[5030]: I0120 22:54:34.870217 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2ce9c3-f457-4923-8ffa-8a393b4e176d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.428553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xldkf" event={"ID":"8e2ce9c3-f457-4923-8ffa-8a393b4e176d","Type":"ContainerDied","Data":"f33dd61b71d0d81defbc5761bff9510d19fd5f57e322e767e1db834b1e832dec"} Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.428587 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f33dd61b71d0d81defbc5761bff9510d19fd5f57e322e767e1db834b1e832dec" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.428563 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xldkf" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.431373 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.431524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qcfbt" event={"ID":"cf0019c8-f547-4abb-84db-73e5bc5e3e53","Type":"ContainerDied","Data":"b6fa9727e9f222d113d620f8431c3439a00a8213a66adb92e730125e1dc96df1"} Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.431552 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6fa9727e9f222d113d620f8431c3439a00a8213a66adb92e730125e1dc96df1" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.889182 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:54:35 crc kubenswrapper[5030]: E0120 22:54:35.889461 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0019c8-f547-4abb-84db-73e5bc5e3e53" containerName="neutron-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.889472 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0019c8-f547-4abb-84db-73e5bc5e3e53" containerName="neutron-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: E0120 22:54:35.889499 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2ce9c3-f457-4923-8ffa-8a393b4e176d" containerName="placement-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.889505 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2ce9c3-f457-4923-8ffa-8a393b4e176d" containerName="placement-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.894144 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0019c8-f547-4abb-84db-73e5bc5e3e53" containerName="neutron-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.894174 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2ce9c3-f457-4923-8ffa-8a393b4e176d" containerName="placement-db-sync" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.895104 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.898579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.898691 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.898578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.902886 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.903107 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-ng9ck" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.927317 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.952136 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.953835 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.958695 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.961751 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.962859 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.972339 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-nph5t" Jan 20 22:54:35 crc kubenswrapper[5030]: I0120 22:54:35.987727 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.088907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.089604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.089687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.089729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.089763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.089948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzl9\" (UniqueName: \"kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090016 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.090177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj76r\" (UniqueName: \"kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.191201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.191261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.191300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj76r\" (UniqueName: \"kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 
22:54:36.191336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.191383 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzl9\" (UniqueName: \"kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192237 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.192281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 
22:54:36.192573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.197598 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.198457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.199955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.200077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.206749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.208700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.208991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.209336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.211243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzl9\" 
(UniqueName: \"kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.211824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config\") pod \"neutron-5dd888bb9-vzkmz\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.212019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj76r\" (UniqueName: \"kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r\") pod \"placement-56bb869d7d-pvd6v\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.243001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.290848 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.747407 5030 scope.go:117] "RemoveContainer" containerID="54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.833303 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.833404 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.868964 5030 scope.go:117] "RemoveContainer" containerID="14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" Jan 20 22:54:36 crc kubenswrapper[5030]: E0120 22:54:36.886087 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819\": container with ID starting with 14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819 not found: ID does not exist" containerID="14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.886134 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819"} err="failed to get container status \"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819\": rpc error: code = NotFound desc = could not find container \"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819\": container with ID starting with 14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819 not found: ID does not exist" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.886161 5030 scope.go:117] "RemoveContainer" containerID="54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.906468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle\") pod \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.908338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.908442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.909134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckch2\" (UniqueName: \"kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.909321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.909435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data\") pod 
\"2cb00a6c-b955-4b9f-b41c-06978a151b1e\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.911705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.912200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4txp\" (UniqueName: \"kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp\") pod \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\" (UID: \"2cb00a6c-b955-4b9f-b41c-06978a151b1e\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.912343 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle\") pod \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\" (UID: \"715ee943-81f3-4215-8e95-2dbb93ff0e3d\") " Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.915363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts" (OuterVolumeSpecName: "scripts") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.920674 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: E0120 22:54:36.920761 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d\": container with ID starting with 54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d not found: ID does not exist" containerID="54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.920815 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d"} err="failed to get container status \"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d\": rpc error: code = NotFound desc = could not find container \"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d\": container with ID starting with 54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d not found: ID does not exist" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.920846 5030 scope.go:117] "RemoveContainer" containerID="14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.921116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2cb00a6c-b955-4b9f-b41c-06978a151b1e" (UID: "2cb00a6c-b955-4b9f-b41c-06978a151b1e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.922966 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819"} err="failed to get container status \"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819\": rpc error: code = NotFound desc = could not find container \"14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819\": container with ID starting with 14588fdf1e8f83e151a0516cf059251f240a54c235e2321ae68df23043675819 not found: ID does not exist" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.923004 5030 scope.go:117] "RemoveContainer" containerID="54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.924399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.924670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2" (OuterVolumeSpecName: "kube-api-access-ckch2") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "kube-api-access-ckch2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.928388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp" (OuterVolumeSpecName: "kube-api-access-g4txp") pod "2cb00a6c-b955-4b9f-b41c-06978a151b1e" (UID: "2cb00a6c-b955-4b9f-b41c-06978a151b1e"). InnerVolumeSpecName "kube-api-access-g4txp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.928500 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d"} err="failed to get container status \"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d\": rpc error: code = NotFound desc = could not find container \"54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d\": container with ID starting with 54463402229e6a46190fcb3b0ac7130c81fbb1a43fd5d602bb3af854fb75cd7d not found: ID does not exist" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.928720 5030 scope.go:117] "RemoveContainer" containerID="25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.940709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data" (OuterVolumeSpecName: "config-data") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.975695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "715ee943-81f3-4215-8e95-2dbb93ff0e3d" (UID: "715ee943-81f3-4215-8e95-2dbb93ff0e3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.975964 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cb00a6c-b955-4b9f-b41c-06978a151b1e" (UID: "2cb00a6c-b955-4b9f-b41c-06978a151b1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:36 crc kubenswrapper[5030]: I0120 22:54:36.986411 5030 scope.go:117] "RemoveContainer" containerID="0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.021448 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.021475 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.021490 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.021503 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckch2\" (UniqueName: \"kubernetes.io/projected/715ee943-81f3-4215-8e95-2dbb93ff0e3d-kube-api-access-ckch2\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.022302 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.022315 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cb00a6c-b955-4b9f-b41c-06978a151b1e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.022327 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.022339 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4txp\" (UniqueName: \"kubernetes.io/projected/2cb00a6c-b955-4b9f-b41c-06978a151b1e-kube-api-access-g4txp\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.022349 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/715ee943-81f3-4215-8e95-2dbb93ff0e3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.029566 5030 scope.go:117] "RemoveContainer" containerID="25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" Jan 20 22:54:37 crc kubenswrapper[5030]: E0120 22:54:37.030017 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f\": container with ID starting with 25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f not found: ID does not exist" containerID="25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.030064 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f"} err="failed to get 
container status \"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f\": rpc error: code = NotFound desc = could not find container \"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f\": container with ID starting with 25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f not found: ID does not exist" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.030089 5030 scope.go:117] "RemoveContainer" containerID="0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" Jan 20 22:54:37 crc kubenswrapper[5030]: E0120 22:54:37.030466 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274\": container with ID starting with 0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274 not found: ID does not exist" containerID="0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.030508 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274"} err="failed to get container status \"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274\": rpc error: code = NotFound desc = could not find container \"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274\": container with ID starting with 0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274 not found: ID does not exist" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.030537 5030 scope.go:117] "RemoveContainer" containerID="25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.031153 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f"} err="failed to get container status \"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f\": rpc error: code = NotFound desc = could not find container \"25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f\": container with ID starting with 25a209ab70320efb957b518cfb95b5b4eae116664ea4a867509612504995ca7f not found: ID does not exist" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.031312 5030 scope.go:117] "RemoveContainer" containerID="0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274" Jan 20 22:54:37 crc kubenswrapper[5030]: I0120 22:54:37.035382 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274"} err="failed to get container status \"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274\": rpc error: code = NotFound desc = could not find container \"0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274\": container with ID starting with 0f11dadbd302cf6b6f720a85cad769e1dee664eed46e9f0d0a88c98e018bc274 not found: ID does not exist" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.299959 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.405598 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.425312 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:54:38 crc kubenswrapper[5030]: W0120 22:54:37.458797 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3fc09a_8a37_4559_91cb_817bcc4171b9.slice/crio-7cc3d813da304dfbd96b3a5e42d4eaba171503097105c591e1f6ce6b5cbda0dd WatchSource:0}: Error finding container 7cc3d813da304dfbd96b3a5e42d4eaba171503097105c591e1f6ce6b5cbda0dd: Status 404 returned error can't find the container with id 7cc3d813da304dfbd96b3a5e42d4eaba171503097105c591e1f6ce6b5cbda0dd Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.459299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerStarted","Data":"9b162603f333fd0d668e8eff38d9ad0e1e1b7f5d68524cc45923ebe413899017"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.466250 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerStarted","Data":"72369b726c39de69b65d98e8f98e6c534cdc30387a7c057a897292c6245a7ab7"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.468687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" event={"ID":"715ee943-81f3-4215-8e95-2dbb93ff0e3d","Type":"ContainerDied","Data":"864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.468718 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864bd189d7684a3b1c0a444da1c31fe9580e746e2ad2fd8c68042bca40e29a66" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.468796 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-b9nnq" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.493089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerStarted","Data":"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.499905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" event={"ID":"2cb00a6c-b955-4b9f-b41c-06978a151b1e","Type":"ContainerDied","Data":"8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.499949 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8350b67018e6a889ece2e5842465c65dc19059f0b29729fb4def64248be14545" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.500222 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-x4mch" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.515324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.988736 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:54:38 crc kubenswrapper[5030]: E0120 22:54:37.989017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715ee943-81f3-4215-8e95-2dbb93ff0e3d" containerName="keystone-bootstrap" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="715ee943-81f3-4215-8e95-2dbb93ff0e3d" containerName="keystone-bootstrap" Jan 20 22:54:38 crc kubenswrapper[5030]: E0120 22:54:37.989046 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" containerName="barbican-db-sync" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989052 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" containerName="barbican-db-sync" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989196 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="715ee943-81f3-4215-8e95-2dbb93ff0e3d" containerName="keystone-bootstrap" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989215 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" containerName="barbican-db-sync" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989683 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.989754 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.994983 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.995161 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.995510 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-x95hg" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.995905 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.995998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:37.996377 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.174824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " 
pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrcv\" (UniqueName: \"kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.175358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.215236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.216715 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.227124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.227190 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.227368 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-pqgvp" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.229742 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.236287 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.237733 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.242204 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.246592 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.275990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp7wh\" (UniqueName: \"kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data\") pod 
\"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276181 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6jg\" (UniqueName: \"kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.276396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrcv\" (UniqueName: \"kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.315322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.319129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.319195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.319496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.319677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.330478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrcv\" (UniqueName: \"kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.331161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.344311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys\") pod \"keystone-5ddff9b97-b55t4\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385516 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385576 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6jg\" (UniqueName: \"kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 
22:54:38.385713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp7wh\" (UniqueName: \"kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.385746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.388730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.389037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.389293 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.443246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.444281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.468054 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.468201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.469211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.469452 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.469669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.469927 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.471343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.531787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp7wh\" (UniqueName: \"kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh\") pod \"barbican-worker-c9769d76f-gpdjs\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.532306 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6jg\" (UniqueName: \"kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg\") pod \"barbican-keystone-listener-84c478564-jqlr9\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.563693 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.575935 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.586898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerStarted","Data":"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.590825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.590924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.590994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.591067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.591090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4c42\" (UniqueName: \"kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.618412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerStarted","Data":"5d30cc57716599d0d651620c6ec1f1fcf24192cc3b44b822bb51e448cbc85a7a"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.618707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerStarted","Data":"732357dd8392df70779f24553a774511a373c5ac65085763e1e0daf543c5cc10"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.618722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerStarted","Data":"1a6ccb43039d595cfbdc5c7ba5e30033e297f0c5636209af520a9f9d4dfd5757"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.618894 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.643173 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.649094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerStarted","Data":"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.649446 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" podStartSLOduration=3.649426765 podStartE2EDuration="3.649426765s" podCreationTimestamp="2026-01-20 22:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:38.645315864 +0000 UTC m=+1150.965576142" watchObservedRunningTime="2026-01-20 22:54:38.649426765 +0000 UTC m=+1150.969687053" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.663738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerStarted","Data":"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.663801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerStarted","Data":"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.663812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerStarted","Data":"7cc3d813da304dfbd96b3a5e42d4eaba171503097105c591e1f6ce6b5cbda0dd"} Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.665187 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.665215 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.685791 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.687468 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.692147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4c42\" (UniqueName: \"kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.692238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.692319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.692379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.692488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.693429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.693686 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.693774 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.706301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.706890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.715536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.720690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.728748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4c42\" (UniqueName: \"kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42\") pod \"barbican-api-7dd65bc764-srg7x\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.743509 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" podStartSLOduration=3.743489781 podStartE2EDuration="3.743489781s" podCreationTimestamp="2026-01-20 22:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:38.70328692 +0000 UTC m=+1151.023547218" watchObservedRunningTime="2026-01-20 22:54:38.743489781 +0000 UTC m=+1151.063750069" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqm9\" (UniqueName: \"kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config\") pod 
\"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.793906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqm9\" (UniqueName: \"kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.897662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.909479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.910015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.912006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.917905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.918550 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.920418 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.926416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:38 crc kubenswrapper[5030]: I0120 22:54:38.941315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqm9\" (UniqueName: \"kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9\") pod \"neutron-5fc4cf64cd-9987t\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.019380 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.154875 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:54:39 crc kubenswrapper[5030]: W0120 22:54:39.185641 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2732803_fafc_45fd_9cc7_4b61bd446fe0.slice/crio-64e63735adacf6c7238d20b1699a473042d698b7f08a4205fb78d95a2ced185e WatchSource:0}: Error finding container 64e63735adacf6c7238d20b1699a473042d698b7f08a4205fb78d95a2ced185e: Status 404 returned error can't find the container with id 64e63735adacf6c7238d20b1699a473042d698b7f08a4205fb78d95a2ced185e Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.337605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.466296 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.532852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.752540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerStarted","Data":"64e63735adacf6c7238d20b1699a473042d698b7f08a4205fb78d95a2ced185e"} Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.753522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerStarted","Data":"e2dae2e1c096e43be4e399e4f2bb449a749b918180bfcb1449f011c085cf8f0c"} Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.754495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" event={"ID":"1e567662-7ee7-4658-8954-06e43a966ce6","Type":"ContainerStarted","Data":"446a0e59d05ba7244d24e55ba1629ae3e9b86853d4d79eaac54309ec41db85cd"} Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.764323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerStarted","Data":"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06"} Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.778934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerStarted","Data":"2aa4e4b1173c3699a682420395dc50aa23d6e7720ee6294ef75d97af7f921c77"} Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.779896 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:54:39 crc kubenswrapper[5030]: I0120 22:54:39.808716 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=6.808699996 podStartE2EDuration="6.808699996s" podCreationTimestamp="2026-01-20 22:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:39.797034331 +0000 UTC m=+1152.117294619" watchObservedRunningTime="2026-01-20 22:54:39.808699996 +0000 UTC m=+1152.128960284" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.157565 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.157891 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.790792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerStarted","Data":"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.799005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerStarted","Data":"d6b173a1fa700dcf1708107f8b164a0ab9ae083a6b457897870e78dc7b9081a8"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.799046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerStarted","Data":"59b4500db8b85f3fadb5da747f22f0cdf4331c9422eb20c2227f847b377d1190"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.799070 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.799253 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.803283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" event={"ID":"1e567662-7ee7-4658-8954-06e43a966ce6","Type":"ContainerStarted","Data":"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.803993 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.805668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerStarted","Data":"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.805690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerStarted","Data":"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.805714 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerStarted","Data":"a380f459cecde946617edbe7684d2038dcc22d4cc30565abe22ed5462d3a2945"} Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.806112 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.816303 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=7.816292674 podStartE2EDuration="7.816292674s" podCreationTimestamp="2026-01-20 22:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:40.812293957 +0000 UTC m=+1153.132554235" watchObservedRunningTime="2026-01-20 22:54:40.816292674 +0000 UTC m=+1153.136552962" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.835169 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" podStartSLOduration=2.835154004 podStartE2EDuration="2.835154004s" podCreationTimestamp="2026-01-20 22:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:40.828846221 +0000 UTC m=+1153.149106509" watchObservedRunningTime="2026-01-20 22:54:40.835154004 +0000 UTC m=+1153.155414292" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.887932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" podStartSLOduration=2.887914973 podStartE2EDuration="2.887914973s" podCreationTimestamp="2026-01-20 22:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:40.879555299 +0000 UTC m=+1153.199815607" watchObservedRunningTime="2026-01-20 22:54:40.887914973 +0000 UTC m=+1153.208175261" Jan 20 22:54:40 crc kubenswrapper[5030]: I0120 22:54:40.912914 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" podStartSLOduration=3.912898173 podStartE2EDuration="3.912898173s" podCreationTimestamp="2026-01-20 22:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:40.905731088 +0000 UTC m=+1153.225991376" watchObservedRunningTime="2026-01-20 22:54:40.912898173 +0000 UTC m=+1153.233158461" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.023810 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.025298 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.035271 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.035603 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.042312 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv6vc\" (UniqueName: \"kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074476 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.074516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " 
pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176628 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv6vc\" (UniqueName: \"kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.176822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.178886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.183242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: 
\"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.183893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.184514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.189470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.190245 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.194119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv6vc\" (UniqueName: \"kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc\") pod \"barbican-api-76878489f6-7898t\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:42 crc kubenswrapper[5030]: I0120 22:54:42.343142 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.443757 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.855522 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.855587 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.888559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.888612 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.892821 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.907910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.925777 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:43 crc kubenswrapper[5030]: I0120 22:54:43.943285 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:44 crc kubenswrapper[5030]: I0120 22:54:44.835932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:44 crc kubenswrapper[5030]: I0120 22:54:44.836273 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:44 crc kubenswrapper[5030]: I0120 22:54:44.836287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:44 crc kubenswrapper[5030]: I0120 22:54:44.836297 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:46 crc kubenswrapper[5030]: I0120 22:54:46.751023 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:46 crc kubenswrapper[5030]: I0120 22:54:46.753023 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:54:46 crc kubenswrapper[5030]: I0120 22:54:46.838553 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:47 crc kubenswrapper[5030]: I0120 22:54:47.626083 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:54:48 crc kubenswrapper[5030]: W0120 22:54:48.859692 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod285c1a35_91f7_441f_aaf2_a228cccc8962.slice/crio-fdea3b413a52c003a0efb130ec633c91ad4e1ea476b33c3d3ada42ef194c8381 WatchSource:0}: Error finding container fdea3b413a52c003a0efb130ec633c91ad4e1ea476b33c3d3ada42ef194c8381: Status 404 returned error can't find the container with id fdea3b413a52c003a0efb130ec633c91ad4e1ea476b33c3d3ada42ef194c8381 Jan 20 22:54:49 crc kubenswrapper[5030]: I0120 22:54:49.879396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerStarted","Data":"fdea3b413a52c003a0efb130ec633c91ad4e1ea476b33c3d3ada42ef194c8381"} Jan 20 22:54:50 crc kubenswrapper[5030]: E0120 22:54:50.221316 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/ceilometer-0" podUID="7038973c-1328-4d3d-a720-11483c795e71" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.295181 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.463230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.888893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" event={"ID":"307f56f7-ed99-4389-a33f-7e1c8dda480b","Type":"ContainerStarted","Data":"1899dc7597c55dc2c6cfc75ca888715f8d68e0856cf45f3f65ffe82ffacc10d0"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.891195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerStarted","Data":"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.891302 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="ceilometer-notification-agent" containerID="cri-o://f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b" gracePeriod=30 Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.891335 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.891349 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="proxy-httpd" containerID="cri-o://3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f" gracePeriod=30 Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.891363 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="sg-core" containerID="cri-o://870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113" gracePeriod=30 Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.917367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" 
event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerStarted","Data":"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.917414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerStarted","Data":"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.917632 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.917689 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.930917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerStarted","Data":"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.930970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerStarted","Data":"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.946114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerStarted","Data":"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.946157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerStarted","Data":"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2"} Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.951806 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" podStartSLOduration=2.3140526550000002 podStartE2EDuration="44.9517818s" podCreationTimestamp="2026-01-20 22:54:06 +0000 UTC" firstStartedPulling="2026-01-20 22:54:07.26072547 +0000 UTC m=+1119.580985748" lastFinishedPulling="2026-01-20 22:54:49.898454575 +0000 UTC m=+1162.218714893" observedRunningTime="2026-01-20 22:54:50.914988641 +0000 UTC m=+1163.235248929" watchObservedRunningTime="2026-01-20 22:54:50.9517818 +0000 UTC m=+1163.272042088" Jan 20 22:54:50 crc kubenswrapper[5030]: I0120 22:54:50.968355 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" podStartSLOduration=9.968333293 podStartE2EDuration="9.968333293s" podCreationTimestamp="2026-01-20 22:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:54:50.963029054 +0000 UTC m=+1163.283289342" watchObservedRunningTime="2026-01-20 22:54:50.968333293 +0000 UTC m=+1163.288593581" Jan 20 22:54:51 crc kubenswrapper[5030]: I0120 22:54:51.035792 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" podStartSLOduration=2.696251154 podStartE2EDuration="13.03576994s" podCreationTimestamp="2026-01-20 22:54:38 +0000 UTC" firstStartedPulling="2026-01-20 22:54:39.386733725 +0000 UTC m=+1151.706994013" lastFinishedPulling="2026-01-20 22:54:49.726252471 +0000 UTC m=+1162.046512799" observedRunningTime="2026-01-20 22:54:50.990993607 +0000 UTC m=+1163.311253915" watchObservedRunningTime="2026-01-20 22:54:51.03576994 +0000 UTC m=+1163.356030228" Jan 20 22:54:51 crc kubenswrapper[5030]: I0120 22:54:51.956787 5030 generic.go:334] "Generic (PLEG): container finished" podID="7038973c-1328-4d3d-a720-11483c795e71" containerID="3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f" exitCode=0 Jan 20 22:54:51 crc kubenswrapper[5030]: I0120 22:54:51.957098 5030 generic.go:334] "Generic (PLEG): container finished" podID="7038973c-1328-4d3d-a720-11483c795e71" containerID="870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113" exitCode=2 Jan 20 22:54:51 crc kubenswrapper[5030]: I0120 22:54:51.956871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerDied","Data":"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f"} Jan 20 22:54:51 crc kubenswrapper[5030]: I0120 22:54:51.957290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerDied","Data":"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113"} Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.635696 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.697254 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" podStartSLOduration=6.203213762 podStartE2EDuration="16.69723633s" podCreationTimestamp="2026-01-20 22:54:38 +0000 UTC" firstStartedPulling="2026-01-20 22:54:39.235256076 +0000 UTC m=+1151.555516364" lastFinishedPulling="2026-01-20 22:54:49.729278644 +0000 UTC m=+1162.049538932" observedRunningTime="2026-01-20 22:54:51.03452632 +0000 UTC m=+1163.354786608" watchObservedRunningTime="2026-01-20 22:54:54.69723633 +0000 UTC m=+1167.017496628" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96x86\" (UniqueName: \"kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc 
kubenswrapper[5030]: I0120 22:54:54.811837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.811929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd\") pod \"7038973c-1328-4d3d-a720-11483c795e71\" (UID: \"7038973c-1328-4d3d-a720-11483c795e71\") " Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.812230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.812341 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.812724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.817357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86" (OuterVolumeSpecName: "kube-api-access-96x86") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "kube-api-access-96x86". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.838288 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts" (OuterVolumeSpecName: "scripts") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.839978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.853469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.877784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data" (OuterVolumeSpecName: "config-data") pod "7038973c-1328-4d3d-a720-11483c795e71" (UID: "7038973c-1328-4d3d-a720-11483c795e71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.913661 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.913863 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96x86\" (UniqueName: \"kubernetes.io/projected/7038973c-1328-4d3d-a720-11483c795e71-kube-api-access-96x86\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.913925 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.913980 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.914032 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7038973c-1328-4d3d-a720-11483c795e71-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.914085 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7038973c-1328-4d3d-a720-11483c795e71-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.989378 5030 generic.go:334] "Generic (PLEG): container finished" podID="7038973c-1328-4d3d-a720-11483c795e71" containerID="f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b" exitCode=0 Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.989410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerDied","Data":"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b"} Jan 20 22:54:54 crc 
kubenswrapper[5030]: I0120 22:54:54.989740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7038973c-1328-4d3d-a720-11483c795e71","Type":"ContainerDied","Data":"7638d3655f2a90d90497e13c0e6555607d3615b0c89974f2423a9d50d37e17f5"} Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.989797 5030 scope.go:117] "RemoveContainer" containerID="3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.989463 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.991868 5030 generic.go:334] "Generic (PLEG): container finished" podID="307f56f7-ed99-4389-a33f-7e1c8dda480b" containerID="1899dc7597c55dc2c6cfc75ca888715f8d68e0856cf45f3f65ffe82ffacc10d0" exitCode=0 Jan 20 22:54:54 crc kubenswrapper[5030]: I0120 22:54:54.991926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" event={"ID":"307f56f7-ed99-4389-a33f-7e1c8dda480b","Type":"ContainerDied","Data":"1899dc7597c55dc2c6cfc75ca888715f8d68e0856cf45f3f65ffe82ffacc10d0"} Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.019229 5030 scope.go:117] "RemoveContainer" containerID="870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.040757 5030 scope.go:117] "RemoveContainer" containerID="f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.056587 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.058817 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.099447 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.103860 5030 scope.go:117] "RemoveContainer" containerID="3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f" Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.105151 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="ceilometer-notification-agent" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105181 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="ceilometer-notification-agent" Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.105195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="proxy-httpd" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="proxy-httpd" Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.105221 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="sg-core" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105226 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="sg-core" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105395 5030 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="sg-core" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105420 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="proxy-httpd" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.105442 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7038973c-1328-4d3d-a720-11483c795e71" containerName="ceilometer-notification-agent" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.107176 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.109007 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f\": container with ID starting with 3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f not found: ID does not exist" containerID="3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.109274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f"} err="failed to get container status \"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f\": rpc error: code = NotFound desc = could not find container \"3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f\": container with ID starting with 3b3de02ac7c1072089f94c57e37253fa757076a18df2cccf1c4793d573e64e1f not found: ID does not exist" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.109299 5030 scope.go:117] "RemoveContainer" containerID="870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113" Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.111373 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113\": container with ID starting with 870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113 not found: ID does not exist" containerID="870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.111411 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113"} err="failed to get container status \"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113\": rpc error: code = NotFound desc = could not find container \"870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113\": container with ID starting with 870db84ff6a1f2377b0785f3e1b5bd45a3a2b1aec3a4e16c38c802a4bf83d113 not found: ID does not exist" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.111436 5030 scope.go:117] "RemoveContainer" containerID="f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.112180 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.112199 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 
22:54:55.112334 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:55 crc kubenswrapper[5030]: E0120 22:54:55.112419 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b\": container with ID starting with f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b not found: ID does not exist" containerID="f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.112444 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b"} err="failed to get container status \"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b\": rpc error: code = NotFound desc = could not find container \"f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b\": container with ID starting with f71a5041d12161e5443074c913005467b54b979803ff2118ddbfac789b90445b not found: ID does not exist" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218602 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218647 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.218700 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dglns\" (UniqueName: \"kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.320959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dglns\" (UniqueName: \"kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.321883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.323051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.327899 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.328131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.328594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.329124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.338550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dglns\" (UniqueName: \"kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns\") pod \"ceilometer-0\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.430007 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.888377 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:54:55 crc kubenswrapper[5030]: I0120 22:54:55.990161 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7038973c-1328-4d3d-a720-11483c795e71" path="/var/lib/kubelet/pods/7038973c-1328-4d3d-a720-11483c795e71/volumes" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.004339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerStarted","Data":"83ae7d812fc51c58507565a8dd297f0069138f4562e63949060139364ac2cb49"} Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.230575 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6jlb\" (UniqueName: \"kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.338778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data\") pod \"307f56f7-ed99-4389-a33f-7e1c8dda480b\" (UID: \"307f56f7-ed99-4389-a33f-7e1c8dda480b\") " Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.339301 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/307f56f7-ed99-4389-a33f-7e1c8dda480b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.345432 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.345495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb" (OuterVolumeSpecName: "kube-api-access-p6jlb") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "kube-api-access-p6jlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.347465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts" (OuterVolumeSpecName: "scripts") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.391193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.414060 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data" (OuterVolumeSpecName: "config-data") pod "307f56f7-ed99-4389-a33f-7e1c8dda480b" (UID: "307f56f7-ed99-4389-a33f-7e1c8dda480b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.440893 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.440933 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.440945 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.440957 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6jlb\" (UniqueName: \"kubernetes.io/projected/307f56f7-ed99-4389-a33f-7e1c8dda480b-kube-api-access-p6jlb\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:56 crc kubenswrapper[5030]: I0120 22:54:56.440970 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307f56f7-ed99-4389-a33f-7e1c8dda480b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.015928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" event={"ID":"307f56f7-ed99-4389-a33f-7e1c8dda480b","Type":"ContainerDied","Data":"88738cbb413569f4c4058296972ed607d6cd267a123d3867672cea173147a930"} Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.016255 5030 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88738cbb413569f4c4058296972ed607d6cd267a123d3867672cea173147a930" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.015977 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-k84rf" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.018267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerStarted","Data":"37cb3d5607e2a068aaaa15cfe3279c4b96baf7def7ef5cd895baa4791e14ab38"} Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.400924 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:54:57 crc kubenswrapper[5030]: E0120 22:54:57.401247 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" containerName="cinder-db-sync" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.401257 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" containerName="cinder-db-sync" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.401432 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" containerName="cinder-db-sync" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.405312 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.407553 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.407824 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.407950 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-6drfn" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.411478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.413429 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.461777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.461848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.461874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fgmp\" (UniqueName: 
\"kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.461914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.462087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.462214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.563852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.563921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.563966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.564027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.564061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.564084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgmp\" (UniqueName: 
\"kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.566149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.568662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.568875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.572352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.574143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.585254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgmp\" (UniqueName: \"kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp\") pod \"cinder-scheduler-0\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.586586 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.591078 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.594542 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.598499 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.665740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwvg\" (UniqueName: \"kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.733385 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwvg\" (UniqueName: \"kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.766946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.767007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.767096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.768218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.773359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.775265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.775963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.778100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.782404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwvg\" (UniqueName: \"kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg\") pod \"cinder-api-0\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:57 crc kubenswrapper[5030]: I0120 22:54:57.952575 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:54:58 crc kubenswrapper[5030]: I0120 22:54:58.058492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerStarted","Data":"f5b03dcd2042d09e4f15b24fb2f3daae8ab428a0a481927f6bedb02af229ca8d"} Jan 20 22:54:58 crc kubenswrapper[5030]: I0120 22:54:58.260707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:54:58 crc kubenswrapper[5030]: I0120 22:54:58.685906 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.068439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerStarted","Data":"9112c4c4c78a110c500bc39499eed1d35e62a8455b9722705521e5b12e184a91"} Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.073597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerStarted","Data":"04afda13a8fe20ac1b74fae31b5cce9caea4d095bbe8c614957f6a0527645c95"} Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.075080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerStarted","Data":"b6ec004ff87d258ca6c620bd94e86e8e216d1aff8d4c5f02ac948b9f7a76ea31"} Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.190526 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.507181 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.536347 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.608236 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.608463 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api-log" containerID="cri-o://59b4500db8b85f3fadb5da747f22f0cdf4331c9422eb20c2227f847b377d1190" gracePeriod=30 Jan 20 22:54:59 crc kubenswrapper[5030]: I0120 22:54:59.608542 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api" containerID="cri-o://d6b173a1fa700dcf1708107f8b164a0ab9ae083a6b457897870e78dc7b9081a8" gracePeriod=30 Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.088873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerStarted","Data":"a7bbf3d9b706a4ec1b9f2c244f0b6bb8b5bf58fc8ba075a8e55314bcf67a5402"} Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.092556 5030 generic.go:334] "Generic (PLEG): container finished" podID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerID="59b4500db8b85f3fadb5da747f22f0cdf4331c9422eb20c2227f847b377d1190" exitCode=143 Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.092655 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerDied","Data":"59b4500db8b85f3fadb5da747f22f0cdf4331c9422eb20c2227f847b377d1190"} Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.095242 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api-log" containerID="cri-o://c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f" gracePeriod=30 Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.095509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerStarted","Data":"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5"} Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.095543 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.095554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerStarted","Data":"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f"} Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.095606 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/cinder-api-0" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api" containerID="cri-o://845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5" gracePeriod=30 Jan 20 22:55:00 crc kubenswrapper[5030]: I0120 22:55:00.115274 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.11525722 podStartE2EDuration="3.11525722s" podCreationTimestamp="2026-01-20 22:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:00.112597875 +0000 UTC m=+1172.432858163" watchObservedRunningTime="2026-01-20 22:55:00.11525722 +0000 UTC m=+1172.435517498" Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.105224 5030 generic.go:334] "Generic (PLEG): container finished" podID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerID="c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f" exitCode=143 Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.105307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerDied","Data":"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f"} Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.109035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerStarted","Data":"d976ab81d0fc982ad1d8fb1bb2389e2903d277c20f5c252a0a10bccd290da3e7"} Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.109422 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.111823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerStarted","Data":"372084fd7988ec075d3f9fb4a81d8ff7a4fd248f2b6c327e515c916329b43e6c"} Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.136125 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.106733279 podStartE2EDuration="6.13610133s" podCreationTimestamp="2026-01-20 22:54:55 +0000 UTC" firstStartedPulling="2026-01-20 22:54:55.910206494 +0000 UTC m=+1168.230466812" lastFinishedPulling="2026-01-20 22:54:59.939574575 +0000 UTC m=+1172.259834863" observedRunningTime="2026-01-20 22:55:01.130945526 +0000 UTC m=+1173.451205874" watchObservedRunningTime="2026-01-20 22:55:01.13610133 +0000 UTC m=+1173.456361618" Jan 20 22:55:01 crc kubenswrapper[5030]: I0120 22:55:01.153406 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.20422815 podStartE2EDuration="4.15339047s" podCreationTimestamp="2026-01-20 22:54:57 +0000 UTC" firstStartedPulling="2026-01-20 22:54:58.283155807 +0000 UTC m=+1170.603416095" lastFinishedPulling="2026-01-20 22:54:59.232318127 +0000 UTC m=+1171.552578415" observedRunningTime="2026-01-20 22:55:01.151486664 +0000 UTC m=+1173.471746942" watchObservedRunningTime="2026-01-20 22:55:01.15339047 +0000 UTC m=+1173.473650758" Jan 20 22:55:02 crc kubenswrapper[5030]: I0120 22:55:02.734078 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.155207 5030 generic.go:334] "Generic (PLEG): container finished" podID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerID="d6b173a1fa700dcf1708107f8b164a0ab9ae083a6b457897870e78dc7b9081a8" exitCode=0 Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.155978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerDied","Data":"d6b173a1fa700dcf1708107f8b164a0ab9ae083a6b457897870e78dc7b9081a8"} Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.270949 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs\") pod \"45c84de5-3393-4292-be4d-0ccce61e0ace\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom\") pod \"45c84de5-3393-4292-be4d-0ccce61e0ace\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4c42\" (UniqueName: \"kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42\") pod \"45c84de5-3393-4292-be4d-0ccce61e0ace\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle\") pod \"45c84de5-3393-4292-be4d-0ccce61e0ace\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data\") pod \"45c84de5-3393-4292-be4d-0ccce61e0ace\" (UID: \"45c84de5-3393-4292-be4d-0ccce61e0ace\") " Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs" (OuterVolumeSpecName: "logs") pod "45c84de5-3393-4292-be4d-0ccce61e0ace" (UID: "45c84de5-3393-4292-be4d-0ccce61e0ace"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.318997 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c84de5-3393-4292-be4d-0ccce61e0ace-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.326541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42" (OuterVolumeSpecName: "kube-api-access-b4c42") pod "45c84de5-3393-4292-be4d-0ccce61e0ace" (UID: "45c84de5-3393-4292-be4d-0ccce61e0ace"). InnerVolumeSpecName "kube-api-access-b4c42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.329993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45c84de5-3393-4292-be4d-0ccce61e0ace" (UID: "45c84de5-3393-4292-be4d-0ccce61e0ace"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.366843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45c84de5-3393-4292-be4d-0ccce61e0ace" (UID: "45c84de5-3393-4292-be4d-0ccce61e0ace"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.370305 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data" (OuterVolumeSpecName: "config-data") pod "45c84de5-3393-4292-be4d-0ccce61e0ace" (UID: "45c84de5-3393-4292-be4d-0ccce61e0ace"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.421481 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.421537 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.421554 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4c42\" (UniqueName: \"kubernetes.io/projected/45c84de5-3393-4292-be4d-0ccce61e0ace-kube-api-access-b4c42\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:03 crc kubenswrapper[5030]: I0120 22:55:03.421568 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c84de5-3393-4292-be4d-0ccce61e0ace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.165095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" event={"ID":"45c84de5-3393-4292-be4d-0ccce61e0ace","Type":"ContainerDied","Data":"e2dae2e1c096e43be4e399e4f2bb449a749b918180bfcb1449f011c085cf8f0c"} Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.165136 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x" Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.165161 5030 scope.go:117] "RemoveContainer" containerID="d6b173a1fa700dcf1708107f8b164a0ab9ae083a6b457897870e78dc7b9081a8" Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.186008 5030 scope.go:117] "RemoveContainer" containerID="59b4500db8b85f3fadb5da747f22f0cdf4331c9422eb20c2227f847b377d1190" Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.195431 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:55:04 crc kubenswrapper[5030]: I0120 22:55:04.205828 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-7dd65bc764-srg7x"] Jan 20 22:55:05 crc kubenswrapper[5030]: I0120 22:55:05.975492 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" path="/var/lib/kubelet/pods/45c84de5-3393-4292-be4d-0ccce61e0ace/volumes" Jan 20 22:55:06 crc kubenswrapper[5030]: I0120 22:55:06.310561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:55:07 crc kubenswrapper[5030]: I0120 22:55:07.170137 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:55:07 crc kubenswrapper[5030]: I0120 22:55:07.237959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:55:07 crc kubenswrapper[5030]: I0120 22:55:07.929690 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:07 crc kubenswrapper[5030]: I0120 22:55:07.989867 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] 
Jan 20 22:55:08 crc kubenswrapper[5030]: I0120 22:55:08.217191 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="cinder-scheduler" containerID="cri-o://a7bbf3d9b706a4ec1b9f2c244f0b6bb8b5bf58fc8ba075a8e55314bcf67a5402" gracePeriod=30 Jan 20 22:55:08 crc kubenswrapper[5030]: I0120 22:55:08.217301 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="probe" containerID="cri-o://372084fd7988ec075d3f9fb4a81d8ff7a4fd248f2b6c327e515c916329b43e6c" gracePeriod=30 Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.034726 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.118439 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.118919 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-api" containerID="cri-o://732357dd8392df70779f24553a774511a373c5ac65085763e1e0daf543c5cc10" gracePeriod=30 Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.119636 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-httpd" containerID="cri-o://5d30cc57716599d0d651620c6ec1f1fcf24192cc3b44b822bb51e448cbc85a7a" gracePeriod=30 Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.234451 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerID="372084fd7988ec075d3f9fb4a81d8ff7a4fd248f2b6c327e515c916329b43e6c" exitCode=0 Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.234494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerDied","Data":"372084fd7988ec075d3f9fb4a81d8ff7a4fd248f2b6c327e515c916329b43e6c"} Jan 20 22:55:09 crc kubenswrapper[5030]: I0120 22:55:09.835386 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:10 crc kubenswrapper[5030]: I0120 22:55:10.157575 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:55:10 crc kubenswrapper[5030]: I0120 22:55:10.157673 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:55:10 crc kubenswrapper[5030]: I0120 22:55:10.247129 5030 generic.go:334] "Generic (PLEG): container finished" podID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" 
containerID="5d30cc57716599d0d651620c6ec1f1fcf24192cc3b44b822bb51e448cbc85a7a" exitCode=0 Jan 20 22:55:10 crc kubenswrapper[5030]: I0120 22:55:10.247197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerDied","Data":"5d30cc57716599d0d651620c6ec1f1fcf24192cc3b44b822bb51e448cbc85a7a"} Jan 20 22:55:10 crc kubenswrapper[5030]: I0120 22:55:10.258149 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.256140 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerID="a7bbf3d9b706a4ec1b9f2c244f0b6bb8b5bf58fc8ba075a8e55314bcf67a5402" exitCode=0 Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.256167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerDied","Data":"a7bbf3d9b706a4ec1b9f2c244f0b6bb8b5bf58fc8ba075a8e55314bcf67a5402"} Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.361951 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fgmp\" (UniqueName: \"kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id\") pod \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\" (UID: \"e7d3dbb0-879e-4c4c-8114-2e599be7953f\") " Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479678 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.479968 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d3dbb0-879e-4c4c-8114-2e599be7953f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.485517 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts" (OuterVolumeSpecName: "scripts") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.497387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp" (OuterVolumeSpecName: "kube-api-access-8fgmp") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "kube-api-access-8fgmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.497779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.528611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.582002 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fgmp\" (UniqueName: \"kubernetes.io/projected/e7d3dbb0-879e-4c4c-8114-2e599be7953f-kube-api-access-8fgmp\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.582049 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.582062 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.582074 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.595596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data" (OuterVolumeSpecName: "config-data") pod "e7d3dbb0-879e-4c4c-8114-2e599be7953f" (UID: "e7d3dbb0-879e-4c4c-8114-2e599be7953f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.684281 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d3dbb0-879e-4c4c-8114-2e599be7953f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.931450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:55:11 crc kubenswrapper[5030]: E0120 22:55:11.932256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api-log" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932279 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api-log" Jan 20 22:55:11 crc kubenswrapper[5030]: E0120 22:55:11.932294 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api" Jan 20 22:55:11 crc kubenswrapper[5030]: E0120 22:55:11.932327 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="cinder-scheduler" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932335 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="cinder-scheduler" Jan 20 22:55:11 crc kubenswrapper[5030]: E0120 22:55:11.932348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="probe" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="probe" Jan 20 
22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932614 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="probe" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932670 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932686 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45c84de5-3393-4292-be4d-0ccce61e0ace" containerName="barbican-api-log" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.932703 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" containerName="cinder-scheduler" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.933344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.936342 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.936385 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.940148 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-r9b4p" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.949404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.991016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.991137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.991190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:11 crc kubenswrapper[5030]: I0120 22:55:11.991218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkgcz\" (UniqueName: \"kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.093294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.093357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkgcz\" (UniqueName: \"kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.093479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.093548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.094351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.097108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.098119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.111191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkgcz\" (UniqueName: \"kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz\") pod \"openstackclient\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.267297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e7d3dbb0-879e-4c4c-8114-2e599be7953f","Type":"ContainerDied","Data":"b6ec004ff87d258ca6c620bd94e86e8e216d1aff8d4c5f02ac948b9f7a76ea31"} Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.267351 5030 scope.go:117] "RemoveContainer" containerID="372084fd7988ec075d3f9fb4a81d8ff7a4fd248f2b6c327e515c916329b43e6c" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.267449 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.298194 5030 scope.go:117] "RemoveContainer" containerID="a7bbf3d9b706a4ec1b9f2c244f0b6bb8b5bf58fc8ba075a8e55314bcf67a5402" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.306143 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.309989 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.334136 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.341977 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.343739 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.366076 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.405828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.406093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.406250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.406710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d94sk\" (UniqueName: \"kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.406838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.407097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.413384 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.511747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d94sk\" (UniqueName: \"kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.512975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.518869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.518908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.524897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.526419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.527562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d94sk\" (UniqueName: \"kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk\") pod \"cinder-scheduler-0\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.732067 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:12 crc kubenswrapper[5030]: I0120 22:55:12.811457 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.116267 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:55:13 crc kubenswrapper[5030]: W0120 22:55:13.129391 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd09f29b6_8161_4158_810d_efe783958358.slice/crio-f7f33823fa68b9cd7b2bca090004c206b04e4346f1ee3a9b7bd97feeccce6063 WatchSource:0}: Error finding container f7f33823fa68b9cd7b2bca090004c206b04e4346f1ee3a9b7bd97feeccce6063: Status 404 returned error can't find the container with id f7f33823fa68b9cd7b2bca090004c206b04e4346f1ee3a9b7bd97feeccce6063 Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.280689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"270d79a6-4c9f-415b-aa6a-e0d33da8177d","Type":"ContainerStarted","Data":"753a16f8f91ef7a7fff7f4e5b0e793f6136a1469999d68a5394ad2182d3d4ece"} Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.282892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerStarted","Data":"f7f33823fa68b9cd7b2bca090004c206b04e4346f1ee3a9b7bd97feeccce6063"} Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.288860 5030 generic.go:334] "Generic (PLEG): container finished" podID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerID="732357dd8392df70779f24553a774511a373c5ac65085763e1e0daf543c5cc10" exitCode=0 Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.288887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" 
event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerDied","Data":"732357dd8392df70779f24553a774511a373c5ac65085763e1e0daf543c5cc10"} Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.371731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.536350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config\") pod \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.536484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config\") pod \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.536560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs\") pod \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.537201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzl9\" (UniqueName: \"kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9\") pod \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.537239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle\") pod \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\" (UID: \"e82c7c33-c62b-4a4f-bf19-2e8176a84d79\") " Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.540774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9" (OuterVolumeSpecName: "kube-api-access-rpzl9") pod "e82c7c33-c62b-4a4f-bf19-2e8176a84d79" (UID: "e82c7c33-c62b-4a4f-bf19-2e8176a84d79"). InnerVolumeSpecName "kube-api-access-rpzl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.542097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e82c7c33-c62b-4a4f-bf19-2e8176a84d79" (UID: "e82c7c33-c62b-4a4f-bf19-2e8176a84d79"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.588032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e82c7c33-c62b-4a4f-bf19-2e8176a84d79" (UID: "e82c7c33-c62b-4a4f-bf19-2e8176a84d79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.590512 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config" (OuterVolumeSpecName: "config") pod "e82c7c33-c62b-4a4f-bf19-2e8176a84d79" (UID: "e82c7c33-c62b-4a4f-bf19-2e8176a84d79"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.614835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e82c7c33-c62b-4a4f-bf19-2e8176a84d79" (UID: "e82c7c33-c62b-4a4f-bf19-2e8176a84d79"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.645894 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.646060 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzl9\" (UniqueName: \"kubernetes.io/projected/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-kube-api-access-rpzl9\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.646131 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.646186 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.646238 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e82c7c33-c62b-4a4f-bf19-2e8176a84d79-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:13 crc kubenswrapper[5030]: I0120 22:55:13.975892 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d3dbb0-879e-4c4c-8114-2e599be7953f" path="/var/lib/kubelet/pods/e7d3dbb0-879e-4c4c-8114-2e599be7953f/volumes" Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.303435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerStarted","Data":"fef776c551a97498795259e77769d450688d573f30cb486c4698bfd320905171"} Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.303841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerStarted","Data":"a5eb9a66cd59578f8b736dfa3e91fcb0a66588356d5c43c532a1426a7815f1a9"} Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.307506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" event={"ID":"e82c7c33-c62b-4a4f-bf19-2e8176a84d79","Type":"ContainerDied","Data":"1a6ccb43039d595cfbdc5c7ba5e30033e297f0c5636209af520a9f9d4dfd5757"} Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.307555 5030 scope.go:117] "RemoveContainer" 
containerID="5d30cc57716599d0d651620c6ec1f1fcf24192cc3b44b822bb51e448cbc85a7a" Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.307701 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5dd888bb9-vzkmz" Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.327158 5030 scope.go:117] "RemoveContainer" containerID="732357dd8392df70779f24553a774511a373c5ac65085763e1e0daf543c5cc10" Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.333614 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.333596525 podStartE2EDuration="2.333596525s" podCreationTimestamp="2026-01-20 22:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:14.326511733 +0000 UTC m=+1186.646772021" watchObservedRunningTime="2026-01-20 22:55:14.333596525 +0000 UTC m=+1186.653856813" Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.350109 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:55:14 crc kubenswrapper[5030]: I0120 22:55:14.357833 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5dd888bb9-vzkmz"] Jan 20 22:55:15 crc kubenswrapper[5030]: I0120 22:55:15.975541 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" path="/var/lib/kubelet/pods/e82c7c33-c62b-4a4f-bf19-2e8176a84d79/volumes" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.088322 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:55:16 crc kubenswrapper[5030]: E0120 22:55:16.088722 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-api" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.088734 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-api" Jan 20 22:55:16 crc kubenswrapper[5030]: E0120 22:55:16.088777 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-httpd" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.088783 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-httpd" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.088957 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-api" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.088968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82c7c33-c62b-4a4f-bf19-2e8176a84d79" containerName="neutron-httpd" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.089860 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.092515 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.092703 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.092977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.099713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192581 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192751 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192793 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svt8\" (UniqueName: 
\"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.192869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294574 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svt8\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.294806 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.295786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.296481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.310688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.311020 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.312605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.315288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svt8\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.315329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.315482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data\") pod \"swift-proxy-5bdb48996-6pf22\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:16 crc kubenswrapper[5030]: I0120 22:55:16.412129 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.239835 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.240275 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-central-agent" containerID="cri-o://37cb3d5607e2a068aaaa15cfe3279c4b96baf7def7ef5cd895baa4791e14ab38" gracePeriod=30 Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.242737 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="proxy-httpd" containerID="cri-o://d976ab81d0fc982ad1d8fb1bb2389e2903d277c20f5c252a0a10bccd290da3e7" gracePeriod=30 Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.242864 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-notification-agent" containerID="cri-o://f5b03dcd2042d09e4f15b24fb2f3daae8ab428a0a481927f6bedb02af229ca8d" gracePeriod=30 Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.242911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="sg-core" containerID="cri-o://04afda13a8fe20ac1b74fae31b5cce9caea4d095bbe8c614957f6a0527645c95" gracePeriod=30 Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.264569 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.147:3000/\": EOF" Jan 20 22:55:17 crc kubenswrapper[5030]: I0120 22:55:17.735865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361183 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerID="d976ab81d0fc982ad1d8fb1bb2389e2903d277c20f5c252a0a10bccd290da3e7" exitCode=0 Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361430 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerID="04afda13a8fe20ac1b74fae31b5cce9caea4d095bbe8c614957f6a0527645c95" exitCode=2 Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361443 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerID="f5b03dcd2042d09e4f15b24fb2f3daae8ab428a0a481927f6bedb02af229ca8d" exitCode=0 Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361453 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerID="37cb3d5607e2a068aaaa15cfe3279c4b96baf7def7ef5cd895baa4791e14ab38" exitCode=0 Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerDied","Data":"d976ab81d0fc982ad1d8fb1bb2389e2903d277c20f5c252a0a10bccd290da3e7"} Jan 20 22:55:18 crc kubenswrapper[5030]: 
I0120 22:55:18.361490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerDied","Data":"04afda13a8fe20ac1b74fae31b5cce9caea4d095bbe8c614957f6a0527645c95"} Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerDied","Data":"f5b03dcd2042d09e4f15b24fb2f3daae8ab428a0a481927f6bedb02af229ca8d"} Jan 20 22:55:18 crc kubenswrapper[5030]: I0120 22:55:18.361522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerDied","Data":"37cb3d5607e2a068aaaa15cfe3279c4b96baf7def7ef5cd895baa4791e14ab38"} Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.347097 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.386540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"270d79a6-4c9f-415b-aa6a-e0d33da8177d","Type":"ContainerStarted","Data":"06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356"} Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.390224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4","Type":"ContainerDied","Data":"83ae7d812fc51c58507565a8dd297f0069138f4562e63949060139364ac2cb49"} Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.390272 5030 scope.go:117] "RemoveContainer" containerID="d976ab81d0fc982ad1d8fb1bb2389e2903d277c20f5c252a0a10bccd290da3e7" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.390300 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.400436 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.129937466 podStartE2EDuration="10.400421308s" podCreationTimestamp="2026-01-20 22:55:11 +0000 UTC" firstStartedPulling="2026-01-20 22:55:12.828825697 +0000 UTC m=+1185.149085995" lastFinishedPulling="2026-01-20 22:55:21.099309559 +0000 UTC m=+1193.419569837" observedRunningTime="2026-01-20 22:55:21.39966127 +0000 UTC m=+1193.719921558" watchObservedRunningTime="2026-01-20 22:55:21.400421308 +0000 UTC m=+1193.720681596" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.409924 5030 scope.go:117] "RemoveContainer" containerID="04afda13a8fe20ac1b74fae31b5cce9caea4d095bbe8c614957f6a0527645c95" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.428754 5030 scope.go:117] "RemoveContainer" containerID="f5b03dcd2042d09e4f15b24fb2f3daae8ab428a0a481927f6bedb02af229ca8d" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.445378 5030 scope.go:117] "RemoveContainer" containerID="37cb3d5607e2a068aaaa15cfe3279c4b96baf7def7ef5cd895baa4791e14ab38" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490698 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dglns\" (UniqueName: \"kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.490769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd\") pod \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\" (UID: \"4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4\") " Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.492073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.493659 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.497915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts" (OuterVolumeSpecName: "scripts") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.500609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns" (OuterVolumeSpecName: "kube-api-access-dglns") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "kube-api-access-dglns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.525734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: W0120 22:55:21.534794 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd556b2_18b0_43da_ad37_e84c9e7601cc.slice/crio-c5ccd8bb48eb11e95727d07ea061c249533afa1e5a4417b1a46ed9ca691a2be8 WatchSource:0}: Error finding container c5ccd8bb48eb11e95727d07ea061c249533afa1e5a4417b1a46ed9ca691a2be8: Status 404 returned error can't find the container with id c5ccd8bb48eb11e95727d07ea061c249533afa1e5a4417b1a46ed9ca691a2be8 Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.536163 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.584818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.585011 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data" (OuterVolumeSpecName: "config-data") pod "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" (UID: "4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593493 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593536 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593554 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593572 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dglns\" (UniqueName: \"kubernetes.io/projected/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-kube-api-access-dglns\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593589 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593604 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.593644 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.733864 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.745830 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.776780 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:21 crc kubenswrapper[5030]: E0120 22:55:21.777223 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-central-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777240 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-central-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: E0120 22:55:21.777253 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-notification-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777260 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-notification-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: E0120 22:55:21.777269 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="sg-core" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777276 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="sg-core" Jan 20 22:55:21 crc kubenswrapper[5030]: E0120 22:55:21.777298 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="proxy-httpd" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777303 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="proxy-httpd" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="proxy-httpd" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777477 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-notification-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777492 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="sg-core" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.777504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" containerName="ceilometer-central-agent" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.779011 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.785280 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.786076 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.791923 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.900997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqrn\" (UniqueName: \"kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.901543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:21 crc kubenswrapper[5030]: I0120 22:55:21.970970 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4" path="/var/lib/kubelet/pods/4ecd3cec-599a-4f7c-80a4-b1dc16d6c8e4/volumes" Jan 20 22:55:22 crc 
kubenswrapper[5030]: I0120 22:55:22.003360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqrn\" (UniqueName: \"kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.003590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.004035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.004095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.010402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.010500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.010642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.011162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.026655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqrn\" (UniqueName: \"kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn\") pod \"ceilometer-0\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.126527 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.403513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerStarted","Data":"65b8ebd2e425010f04d6e967a0f85be11bd51c38df6fd77b8844497673c771f4"} Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.403869 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.403906 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.403914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerStarted","Data":"709a2a7ededf39d4cf921562b31fe5c77003d1a66336a20cb086a47b4c5c88f2"} Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.403923 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerStarted","Data":"c5ccd8bb48eb11e95727d07ea061c249533afa1e5a4417b1a46ed9ca691a2be8"} Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.423186 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" podStartSLOduration=6.423171226 podStartE2EDuration="6.423171226s" podCreationTimestamp="2026-01-20 22:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:22.419203079 +0000 UTC m=+1194.739463357" 
watchObservedRunningTime="2026-01-20 22:55:22.423171226 +0000 UTC m=+1194.743431514" Jan 20 22:55:22 crc kubenswrapper[5030]: I0120 22:55:22.550494 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:22 crc kubenswrapper[5030]: W0120 22:55:22.559782 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b731572_00e6_412a_95ad_f29dfcb5fd25.slice/crio-c46877040775ccfc66bcb2cfe4418235d572594ad54f2d1f0b156e2b9a74229f WatchSource:0}: Error finding container c46877040775ccfc66bcb2cfe4418235d572594ad54f2d1f0b156e2b9a74229f: Status 404 returned error can't find the container with id c46877040775ccfc66bcb2cfe4418235d572594ad54f2d1f0b156e2b9a74229f Jan 20 22:55:23 crc kubenswrapper[5030]: I0120 22:55:23.059262 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:55:23 crc kubenswrapper[5030]: I0120 22:55:23.348504 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:23 crc kubenswrapper[5030]: I0120 22:55:23.427591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerStarted","Data":"c46877040775ccfc66bcb2cfe4418235d572594ad54f2d1f0b156e2b9a74229f"} Jan 20 22:55:24 crc kubenswrapper[5030]: I0120 22:55:24.444384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerStarted","Data":"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5"} Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.454038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerStarted","Data":"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c"} Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.576738 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-vx9c6"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.577892 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.589005 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-vx9c6"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.661557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrxn4\" (UniqueName: \"kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4\") pod \"nova-api-db-create-vx9c6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.661614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts\") pod \"nova-api-db-create-vx9c6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.673355 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-74g9t"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.674358 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.695138 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.696122 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.698218 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.705205 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-74g9t"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.722483 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.763889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np74b\" (UniqueName: \"kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.764496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.764663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrxn4\" (UniqueName: \"kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4\") pod \"nova-api-db-create-vx9c6\" (UID: 
\"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.764757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts\") pod \"nova-api-db-create-vx9c6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.765515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts\") pod \"nova-api-db-create-vx9c6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.775680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-7t7cc"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.776839 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.783916 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-7t7cc"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.789485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrxn4\" (UniqueName: \"kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4\") pod \"nova-api-db-create-vx9c6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.866230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bndx\" (UniqueName: \"kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.866317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72j6m\" (UniqueName: \"kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m\") pod \"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.866417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts\") pod \"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.866640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc 
kubenswrapper[5030]: I0120 22:55:25.866713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.866780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np74b\" (UniqueName: \"kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.867675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.880412 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.881444 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.894053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.897123 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.907521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587"] Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.911555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np74b\" (UniqueName: \"kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b\") pod \"nova-cell0-db-create-74g9t\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdf2\" (UniqueName: \"kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72j6m\" (UniqueName: \"kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m\") pod \"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts\") pod \"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.969307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bndx\" (UniqueName: \"kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.970535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts\") pod 
\"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.970802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.988246 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.990971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bndx\" (UniqueName: \"kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx\") pod \"nova-api-29d2-account-create-update-jxb4x\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:25 crc kubenswrapper[5030]: I0120 22:55:25.991127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72j6m\" (UniqueName: \"kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m\") pod \"nova-cell1-db-create-7t7cc\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.011026 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.070615 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdf2\" (UniqueName: \"kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.070733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.072390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.090071 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4"] Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.091269 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.093897 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.095130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdf2\" (UniqueName: \"kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2\") pod \"nova-cell0-e77d-account-create-update-4s587\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.102101 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4"] Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.128287 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.172085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.172697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxn5\" (UniqueName: \"kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.203124 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.274069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxn5\" (UniqueName: \"kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.274177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.275152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.297044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxn5\" (UniqueName: \"kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5\") pod \"nova-cell1-76de-account-create-update-mdgc4\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.314002 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.425901 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.429134 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.466589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-vx9c6"] Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.480009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerStarted","Data":"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876"} Jan 20 22:55:26 crc kubenswrapper[5030]: W0120 22:55:26.519741 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60acba6f_1f50_4392_9a13_6b6ef04c93a6.slice/crio-dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c WatchSource:0}: Error finding container dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c: Status 404 returned error can't find the container with id dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.619519 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x"] Jan 20 22:55:26 crc kubenswrapper[5030]: W0120 22:55:26.629356 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1df9c08_0838_4b4e_a77d_0105392e8c10.slice/crio-2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41 WatchSource:0}: Error finding container 2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41: Status 404 returned error can't find the container with id 2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41 Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.786721 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-74g9t"] Jan 20 22:55:26 crc kubenswrapper[5030]: W0120 22:55:26.787267 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod981d713c_d94c_4957_8d8d_eb000cc07969.slice/crio-9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace WatchSource:0}: Error finding container 9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace: Status 404 returned error can't find the container with id 9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace Jan 20 22:55:26 crc kubenswrapper[5030]: W0120 22:55:26.794473 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f91afae_2ac6_49ea_aa9a_b5c8eed0184f.slice/crio-8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9 WatchSource:0}: Error finding container 8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9: Status 404 returned error can't find the container with id 8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9 Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 
22:55:26.801432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-7t7cc"] Jan 20 22:55:26 crc kubenswrapper[5030]: I0120 22:55:26.920184 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587"] Jan 20 22:55:26 crc kubenswrapper[5030]: W0120 22:55:26.929167 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod830cabde_1736_47a1_add5_6fde195a1723.slice/crio-11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f WatchSource:0}: Error finding container 11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f: Status 404 returned error can't find the container with id 11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.041365 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4"] Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.497595 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" event={"ID":"830cabde-1736-47a1-add5-6fde195a1723","Type":"ContainerStarted","Data":"bbd90196bdd3a4a47bb0c4c568ffffbec0595ca6ecfec4c5cd5a3dbc6608e462"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.497654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" event={"ID":"830cabde-1736-47a1-add5-6fde195a1723","Type":"ContainerStarted","Data":"11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerStarted","Data":"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502458 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-central-agent" containerID="cri-o://5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5" gracePeriod=30 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502567 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="proxy-httpd" containerID="cri-o://6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d" gracePeriod=30 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502598 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502634 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="sg-core" containerID="cri-o://60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876" gracePeriod=30 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.502677 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" 
containerName="ceilometer-notification-agent" containerID="cri-o://c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c" gracePeriod=30 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.506112 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" containerID="66e620a76ff8286c5dc7276c1571b2da8a7c84757e27d0bad05d3e967a8f9819" exitCode=0 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.506207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" event={"ID":"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f","Type":"ContainerDied","Data":"66e620a76ff8286c5dc7276c1571b2da8a7c84757e27d0bad05d3e967a8f9819"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.506231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" event={"ID":"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f","Type":"ContainerStarted","Data":"8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.520037 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" podStartSLOduration=2.520016799 podStartE2EDuration="2.520016799s" podCreationTimestamp="2026-01-20 22:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:27.514586347 +0000 UTC m=+1199.834846645" watchObservedRunningTime="2026-01-20 22:55:27.520016799 +0000 UTC m=+1199.840277087" Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.529087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" event={"ID":"22a14bbb-3aa5-4a2f-b49b-446729d08cca","Type":"ContainerStarted","Data":"931e42e4636ed8c2ed4c71dffe3b6456f856ff06fb0a66e1ed6b1e2aef0b546c"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.529303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" event={"ID":"22a14bbb-3aa5-4a2f-b49b-446729d08cca","Type":"ContainerStarted","Data":"ea700918c6a43033121adb80b88834eee95733968110be9ddd0bd7bff7951b56"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.531418 5030 generic.go:334] "Generic (PLEG): container finished" podID="60acba6f-1f50-4392-9a13-6b6ef04c93a6" containerID="616aa75fd9e2ad8fd790729e288a6a541ddb55a66a96677982b3af6e2ceb68f2" exitCode=0 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.531506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" event={"ID":"60acba6f-1f50-4392-9a13-6b6ef04c93a6","Type":"ContainerDied","Data":"616aa75fd9e2ad8fd790729e288a6a541ddb55a66a96677982b3af6e2ceb68f2"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.531571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" event={"ID":"60acba6f-1f50-4392-9a13-6b6ef04c93a6","Type":"ContainerStarted","Data":"dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.541005 5030 generic.go:334] "Generic (PLEG): container finished" podID="981d713c-d94c-4957-8d8d-eb000cc07969" containerID="ecc31cd1a34d8f9168d6e06983ace5d6bc419548b4bce4c1012faf388f418b36" exitCode=0 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.541268 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" event={"ID":"981d713c-d94c-4957-8d8d-eb000cc07969","Type":"ContainerDied","Data":"ecc31cd1a34d8f9168d6e06983ace5d6bc419548b4bce4c1012faf388f418b36"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.554944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" event={"ID":"981d713c-d94c-4957-8d8d-eb000cc07969","Type":"ContainerStarted","Data":"9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.556204 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.354641747 podStartE2EDuration="6.556188867s" podCreationTimestamp="2026-01-20 22:55:21 +0000 UTC" firstStartedPulling="2026-01-20 22:55:22.561467163 +0000 UTC m=+1194.881727451" lastFinishedPulling="2026-01-20 22:55:26.763014283 +0000 UTC m=+1199.083274571" observedRunningTime="2026-01-20 22:55:27.536609912 +0000 UTC m=+1199.856870190" watchObservedRunningTime="2026-01-20 22:55:27.556188867 +0000 UTC m=+1199.876449155" Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.571081 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1df9c08-0838-4b4e-a77d-0105392e8c10" containerID="489a2af29b8eb0e259083f0edb59002efc22e0b485becab088c05419badfc6bd" exitCode=0 Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.571127 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" event={"ID":"d1df9c08-0838-4b4e-a77d-0105392e8c10","Type":"ContainerDied","Data":"489a2af29b8eb0e259083f0edb59002efc22e0b485becab088c05419badfc6bd"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.571150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" event={"ID":"d1df9c08-0838-4b4e-a77d-0105392e8c10","Type":"ContainerStarted","Data":"2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41"} Jan 20 22:55:27 crc kubenswrapper[5030]: I0120 22:55:27.636371 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" podStartSLOduration=1.636355853 podStartE2EDuration="1.636355853s" podCreationTimestamp="2026-01-20 22:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:27.610917236 +0000 UTC m=+1199.931177534" watchObservedRunningTime="2026-01-20 22:55:27.636355853 +0000 UTC m=+1199.956616141" Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.527882 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.528429 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-log" containerID="cri-o://c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c" gracePeriod=30 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.528582 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-httpd" 
containerID="cri-o://c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33" gracePeriod=30 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.594678 5030 generic.go:334] "Generic (PLEG): container finished" podID="22a14bbb-3aa5-4a2f-b49b-446729d08cca" containerID="931e42e4636ed8c2ed4c71dffe3b6456f856ff06fb0a66e1ed6b1e2aef0b546c" exitCode=0 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.594742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" event={"ID":"22a14bbb-3aa5-4a2f-b49b-446729d08cca","Type":"ContainerDied","Data":"931e42e4636ed8c2ed4c71dffe3b6456f856ff06fb0a66e1ed6b1e2aef0b546c"} Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.599251 5030 generic.go:334] "Generic (PLEG): container finished" podID="830cabde-1736-47a1-add5-6fde195a1723" containerID="bbd90196bdd3a4a47bb0c4c568ffffbec0595ca6ecfec4c5cd5a3dbc6608e462" exitCode=0 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.599308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" event={"ID":"830cabde-1736-47a1-add5-6fde195a1723","Type":"ContainerDied","Data":"bbd90196bdd3a4a47bb0c4c568ffffbec0595ca6ecfec4c5cd5a3dbc6608e462"} Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604511 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerID="6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d" exitCode=0 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604538 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerID="60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876" exitCode=2 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604553 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerID="c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c" exitCode=0 Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerDied","Data":"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d"} Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerDied","Data":"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876"} Jan 20 22:55:28 crc kubenswrapper[5030]: I0120 22:55:28.604752 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerDied","Data":"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.039706 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.129459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrxn4\" (UniqueName: \"kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4\") pod \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.129627 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts\") pod \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\" (UID: \"60acba6f-1f50-4392-9a13-6b6ef04c93a6\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.130696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60acba6f-1f50-4392-9a13-6b6ef04c93a6" (UID: "60acba6f-1f50-4392-9a13-6b6ef04c93a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.137619 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4" (OuterVolumeSpecName: "kube-api-access-jrxn4") pod "60acba6f-1f50-4392-9a13-6b6ef04c93a6" (UID: "60acba6f-1f50-4392-9a13-6b6ef04c93a6"). InnerVolumeSpecName "kube-api-access-jrxn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.231345 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60acba6f-1f50-4392-9a13-6b6ef04c93a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.231389 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrxn4\" (UniqueName: \"kubernetes.io/projected/60acba6f-1f50-4392-9a13-6b6ef04c93a6-kube-api-access-jrxn4\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.239115 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.243548 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.248285 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.335338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bndx\" (UniqueName: \"kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx\") pod \"d1df9c08-0838-4b4e-a77d-0105392e8c10\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.335388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72j6m\" (UniqueName: \"kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m\") pod \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.335430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np74b\" (UniqueName: \"kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b\") pod \"981d713c-d94c-4957-8d8d-eb000cc07969\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.335542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts\") pod \"981d713c-d94c-4957-8d8d-eb000cc07969\" (UID: \"981d713c-d94c-4957-8d8d-eb000cc07969\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.335560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts\") pod \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\" (UID: \"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.336005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts\") pod \"d1df9c08-0838-4b4e-a77d-0105392e8c10\" (UID: \"d1df9c08-0838-4b4e-a77d-0105392e8c10\") " Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.336798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1df9c08-0838-4b4e-a77d-0105392e8c10" (UID: "d1df9c08-0838-4b4e-a77d-0105392e8c10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.338243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" (UID: "6f91afae-2ac6-49ea-aa9a-b5c8eed0184f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.338445 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx" (OuterVolumeSpecName: "kube-api-access-4bndx") pod "d1df9c08-0838-4b4e-a77d-0105392e8c10" (UID: "d1df9c08-0838-4b4e-a77d-0105392e8c10"). 
InnerVolumeSpecName "kube-api-access-4bndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.343491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m" (OuterVolumeSpecName: "kube-api-access-72j6m") pod "6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" (UID: "6f91afae-2ac6-49ea-aa9a-b5c8eed0184f"). InnerVolumeSpecName "kube-api-access-72j6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.344088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b" (OuterVolumeSpecName: "kube-api-access-np74b") pod "981d713c-d94c-4957-8d8d-eb000cc07969" (UID: "981d713c-d94c-4957-8d8d-eb000cc07969"). InnerVolumeSpecName "kube-api-access-np74b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.348825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "981d713c-d94c-4957-8d8d-eb000cc07969" (UID: "981d713c-d94c-4957-8d8d-eb000cc07969"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437698 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1df9c08-0838-4b4e-a77d-0105392e8c10-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437732 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bndx\" (UniqueName: \"kubernetes.io/projected/d1df9c08-0838-4b4e-a77d-0105392e8c10-kube-api-access-4bndx\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437744 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72j6m\" (UniqueName: \"kubernetes.io/projected/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-kube-api-access-72j6m\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437752 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np74b\" (UniqueName: \"kubernetes.io/projected/981d713c-d94c-4957-8d8d-eb000cc07969-kube-api-access-np74b\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437761 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/981d713c-d94c-4957-8d8d-eb000cc07969-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.437769 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.614518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" event={"ID":"6f91afae-2ac6-49ea-aa9a-b5c8eed0184f","Type":"ContainerDied","Data":"8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.614562 5030 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fca1c2139961533181212731bf4287ba4fb0fe302d325926862333c5b0116d9" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.614599 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-7t7cc" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.617179 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.617179 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-vx9c6" event={"ID":"60acba6f-1f50-4392-9a13-6b6ef04c93a6","Type":"ContainerDied","Data":"dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.617292 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc33d28aa83f6ddeb4df36bd5776ad09f1a851ec9c5a15850199f1bf6485509c" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.618730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.618729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-74g9t" event={"ID":"981d713c-d94c-4957-8d8d-eb000cc07969","Type":"ContainerDied","Data":"9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.618783 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d16d9085ef04631036e35f6b172391b4715b1bef2beb05c153a0dc1c15b7ace" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.620297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" event={"ID":"d1df9c08-0838-4b4e-a77d-0105392e8c10","Type":"ContainerDied","Data":"2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.620332 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2721f513e920a8f2e26a1088ae2ad1d93e81211d2a608a09eb91782f37bf3c41" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.620370 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x" Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.622913 5030 generic.go:334] "Generic (PLEG): container finished" podID="484288bf-3189-4fbb-9166-1a039ba08128" containerID="c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c" exitCode=143 Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.623025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerDied","Data":"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c"} Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.877935 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.878162 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-log" containerID="cri-o://46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0" gracePeriod=30 Jan 20 22:55:29 crc kubenswrapper[5030]: I0120 22:55:29.878527 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-httpd" containerID="cri-o://a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06" gracePeriod=30 Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.125588 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.130841 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.149135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts\") pod \"830cabde-1736-47a1-add5-6fde195a1723\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.149354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdf2\" (UniqueName: \"kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2\") pod \"830cabde-1736-47a1-add5-6fde195a1723\" (UID: \"830cabde-1736-47a1-add5-6fde195a1723\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.149536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "830cabde-1736-47a1-add5-6fde195a1723" (UID: "830cabde-1736-47a1-add5-6fde195a1723"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.149857 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/830cabde-1736-47a1-add5-6fde195a1723-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.155806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2" (OuterVolumeSpecName: "kube-api-access-mvdf2") pod "830cabde-1736-47a1-add5-6fde195a1723" (UID: "830cabde-1736-47a1-add5-6fde195a1723"). InnerVolumeSpecName "kube-api-access-mvdf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.253140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttxn5\" (UniqueName: \"kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5\") pod \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.253256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts\") pod \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\" (UID: \"22a14bbb-3aa5-4a2f-b49b-446729d08cca\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.253685 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdf2\" (UniqueName: \"kubernetes.io/projected/830cabde-1736-47a1-add5-6fde195a1723-kube-api-access-mvdf2\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.254059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22a14bbb-3aa5-4a2f-b49b-446729d08cca" (UID: "22a14bbb-3aa5-4a2f-b49b-446729d08cca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.256662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5" (OuterVolumeSpecName: "kube-api-access-ttxn5") pod "22a14bbb-3aa5-4a2f-b49b-446729d08cca" (UID: "22a14bbb-3aa5-4a2f-b49b-446729d08cca"). InnerVolumeSpecName "kube-api-access-ttxn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.355758 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a14bbb-3aa5-4a2f-b49b-446729d08cca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.355783 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttxn5\" (UniqueName: \"kubernetes.io/projected/22a14bbb-3aa5-4a2f-b49b-446729d08cca-kube-api-access-ttxn5\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.470501 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.558756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwvg\" (UniqueName: \"kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.559704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom\") pod \"45649fa5-a578-4976-82ef-b45ed55f9f25\" (UID: \"45649fa5-a578-4976-82ef-b45ed55f9f25\") " Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.560390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs" (OuterVolumeSpecName: "logs") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.560580 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45649fa5-a578-4976-82ef-b45ed55f9f25-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.560741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.564786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.564841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts" (OuterVolumeSpecName: "scripts") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.564871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg" (OuterVolumeSpecName: "kube-api-access-rrwvg") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "kube-api-access-rrwvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.584424 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.617088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data" (OuterVolumeSpecName: "config-data") pod "45649fa5-a578-4976-82ef-b45ed55f9f25" (UID: "45649fa5-a578-4976-82ef-b45ed55f9f25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.651650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" event={"ID":"22a14bbb-3aa5-4a2f-b49b-446729d08cca","Type":"ContainerDied","Data":"ea700918c6a43033121adb80b88834eee95733968110be9ddd0bd7bff7951b56"} Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.651935 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea700918c6a43033121adb80b88834eee95733968110be9ddd0bd7bff7951b56" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.651686 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.658505 5030 generic.go:334] "Generic (PLEG): container finished" podID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerID="845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5" exitCode=137 Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.658555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerDied","Data":"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5"} Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.658589 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"45649fa5-a578-4976-82ef-b45ed55f9f25","Type":"ContainerDied","Data":"9112c4c4c78a110c500bc39499eed1d35e62a8455b9722705521e5b12e184a91"} Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.658606 5030 scope.go:117] "RemoveContainer" containerID="845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.658529 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661906 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661950 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwvg\" (UniqueName: \"kubernetes.io/projected/45649fa5-a578-4976-82ef-b45ed55f9f25-kube-api-access-rrwvg\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661962 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661972 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/45649fa5-a578-4976-82ef-b45ed55f9f25-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661980 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.661990 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45649fa5-a578-4976-82ef-b45ed55f9f25-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.662808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" event={"ID":"830cabde-1736-47a1-add5-6fde195a1723","Type":"ContainerDied","Data":"11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f"} Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.662832 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11662a10c222ecfa8027c951b91143485ba0ab346c969c68709faaebb767cb5f" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 
22:55:30.662830 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.665245 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerID="46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0" exitCode=143 Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.665351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerDied","Data":"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0"} Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.712834 5030 scope.go:117] "RemoveContainer" containerID="c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.730043 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.734896 5030 scope.go:117] "RemoveContainer" containerID="845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.735299 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5\": container with ID starting with 845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5 not found: ID does not exist" containerID="845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.735349 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5"} err="failed to get container status \"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5\": rpc error: code = NotFound desc = could not find container \"845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5\": container with ID starting with 845c6fff41d3f940ea4d8ed0e2d9e1995c54a35720968d4f8683ddcb92ad8af5 not found: ID does not exist" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.735386 5030 scope.go:117] "RemoveContainer" containerID="c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.735710 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f\": container with ID starting with c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f not found: ID does not exist" containerID="c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.735741 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f"} err="failed to get container status \"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f\": rpc error: code = NotFound desc = could not find container \"c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f\": container with ID starting with c496d9ec10aa5434563425ac1e672214770f61914d1dc19339acad85b178854f not found: ID does 
not exist" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.742549 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.754475 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.754947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1df9c08-0838-4b4e-a77d-0105392e8c10" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.754962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1df9c08-0838-4b4e-a77d-0105392e8c10" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.754975 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api-log" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.754984 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api-log" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755000 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="981d713c-d94c-4957-8d8d-eb000cc07969" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755008 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="981d713c-d94c-4957-8d8d-eb000cc07969" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755036 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a14bbb-3aa5-4a2f-b49b-446729d08cca" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755044 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a14bbb-3aa5-4a2f-b49b-446729d08cca" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755059 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830cabde-1736-47a1-add5-6fde195a1723" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755069 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="830cabde-1736-47a1-add5-6fde195a1723" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755086 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60acba6f-1f50-4392-9a13-6b6ef04c93a6" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755094 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60acba6f-1f50-4392-9a13-6b6ef04c93a6" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755112 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755120 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: E0120 22:55:30.755131 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755140 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755323 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755345 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755355 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="830cabde-1736-47a1-add5-6fde195a1723" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755370 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" containerName="cinder-api-log" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755392 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a14bbb-3aa5-4a2f-b49b-446729d08cca" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1df9c08-0838-4b4e-a77d-0105392e8c10" containerName="mariadb-account-create-update" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="60acba6f-1f50-4392-9a13-6b6ef04c93a6" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.755435 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="981d713c-d94c-4957-8d8d-eb000cc07969" containerName="mariadb-database-create" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.756559 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.758373 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.758437 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.761393 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.764135 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clrh\" (UniqueName: \"kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867064 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867363 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.867479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.968894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.968949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.968997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clrh\" (UniqueName: \"kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs\") pod \"cinder-api-0\" (UID: 
\"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.969734 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.973209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.973221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.973940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.974331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.992995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.993133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clrh\" (UniqueName: \"kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:30 crc kubenswrapper[5030]: I0120 22:55:30.993169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts\") pod \"cinder-api-0\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:31 crc kubenswrapper[5030]: I0120 22:55:31.073694 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:31 crc kubenswrapper[5030]: I0120 22:55:31.411871 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:55:31 crc kubenswrapper[5030]: I0120 22:55:31.682047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerStarted","Data":"803a871ac34dc6e83e980939ad71904b625abed93d38aea905fba51dbb908812"} Jan 20 22:55:31 crc kubenswrapper[5030]: I0120 22:55:31.978265 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45649fa5-a578-4976-82ef-b45ed55f9f25" path="/var/lib/kubelet/pods/45649fa5-a578-4976-82ef-b45ed55f9f25/volumes" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.158851 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324622 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pssm2\" (UniqueName: \"kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.324787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs\") pod \"484288bf-3189-4fbb-9166-1a039ba08128\" (UID: \"484288bf-3189-4fbb-9166-1a039ba08128\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.325043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.325224 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.325374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs" (OuterVolumeSpecName: "logs") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.333855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts" (OuterVolumeSpecName: "scripts") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.334444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2" (OuterVolumeSpecName: "kube-api-access-pssm2") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "kube-api-access-pssm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.337593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.340138 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.378217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.402775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427399 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/484288bf-3189-4fbb-9166-1a039ba08128-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427431 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427452 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427463 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427471 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pssm2\" (UniqueName: \"kubernetes.io/projected/484288bf-3189-4fbb-9166-1a039ba08128-kube-api-access-pssm2\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.427480 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.433289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data" (OuterVolumeSpecName: "config-data") pod "484288bf-3189-4fbb-9166-1a039ba08128" (UID: "484288bf-3189-4fbb-9166-1a039ba08128"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.450765 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzqrn\" (UniqueName: \"kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528536 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.528720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd\") pod \"1b731572-00e6-412a-95ad-f29dfcb5fd25\" (UID: \"1b731572-00e6-412a-95ad-f29dfcb5fd25\") " Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529270 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529564 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/484288bf-3189-4fbb-9166-1a039ba08128-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529578 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529587 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.529595 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b731572-00e6-412a-95ad-f29dfcb5fd25-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.531907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn" (OuterVolumeSpecName: "kube-api-access-nzqrn") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "kube-api-access-nzqrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.534939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts" (OuterVolumeSpecName: "scripts") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.554004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.594675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.624975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data" (OuterVolumeSpecName: "config-data") pod "1b731572-00e6-412a-95ad-f29dfcb5fd25" (UID: "1b731572-00e6-412a-95ad-f29dfcb5fd25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.631126 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.631162 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.631176 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzqrn\" (UniqueName: \"kubernetes.io/projected/1b731572-00e6-412a-95ad-f29dfcb5fd25-kube-api-access-nzqrn\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.631194 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.631207 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b731572-00e6-412a-95ad-f29dfcb5fd25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.695207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerStarted","Data":"ea1e0a375f06f75251c94f8a9680e30e93e6ebbeb077b27d82cd29945cdfd113"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.695246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerStarted","Data":"4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.699461 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerID="5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5" exitCode=0 Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.699543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerDied","Data":"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.699561 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.699602 5030 scope.go:117] "RemoveContainer" containerID="6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.699571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1b731572-00e6-412a-95ad-f29dfcb5fd25","Type":"ContainerDied","Data":"c46877040775ccfc66bcb2cfe4418235d572594ad54f2d1f0b156e2b9a74229f"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.706512 5030 generic.go:334] "Generic (PLEG): container finished" podID="484288bf-3189-4fbb-9166-1a039ba08128" containerID="c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33" exitCode=0 Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.706555 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.706580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerDied","Data":"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.706609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"484288bf-3189-4fbb-9166-1a039ba08128","Type":"ContainerDied","Data":"72369b726c39de69b65d98e8f98e6c534cdc30387a7c057a897292c6245a7ab7"} Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.744232 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.755443 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.765717 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.780115 5030 scope.go:117] "RemoveContainer" containerID="60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.789598 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818053 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818409 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="sg-core" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818426 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="sg-core" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818436 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-central-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818444 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-central-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818457 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-log" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818463 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-log" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818472 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818477 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818487 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="proxy-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="proxy-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.818517 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-notification-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818523 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-notification-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818712 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-notification-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818720 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="sg-core" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818729 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="proxy-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818742 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" containerName="ceilometer-central-agent" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818750 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-log" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.818764 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="484288bf-3189-4fbb-9166-1a039ba08128" containerName="glance-httpd" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.820165 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.820773 5030 scope.go:117] "RemoveContainer" containerID="c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.821649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.821813 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.827493 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.828997 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.833072 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.833250 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.838806 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.849411 5030 scope.go:117] "RemoveContainer" containerID="5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.853062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.874588 5030 scope.go:117] "RemoveContainer" containerID="6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.875011 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d\": container with ID starting with 6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d not found: ID does not exist" containerID="6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.875043 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d"} err="failed to get container status \"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d\": rpc error: code = NotFound desc = could not find container \"6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d\": container with ID starting with 6f1837d7b5fa506b76f2d219ea690d5ada282903b0adad3f7f55ad1c8646a29d not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.875063 5030 scope.go:117] "RemoveContainer" containerID="60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.875328 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876\": container 
with ID starting with 60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876 not found: ID does not exist" containerID="60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.875379 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876"} err="failed to get container status \"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876\": rpc error: code = NotFound desc = could not find container \"60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876\": container with ID starting with 60d8a9ac2380f9ec7a6a58ffa7b4a90a8145e870d46b3507cbef8b189a025876 not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.875416 5030 scope.go:117] "RemoveContainer" containerID="c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.875961 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c\": container with ID starting with c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c not found: ID does not exist" containerID="c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.875991 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c"} err="failed to get container status \"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c\": rpc error: code = NotFound desc = could not find container \"c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c\": container with ID starting with c547e6388a11520067d51cb497bf00f299b4b09026bf497927ed488f87ba206c not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.876006 5030 scope.go:117] "RemoveContainer" containerID="5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.876244 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5\": container with ID starting with 5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5 not found: ID does not exist" containerID="5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.876284 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5"} err="failed to get container status \"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5\": rpc error: code = NotFound desc = could not find container \"5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5\": container with ID starting with 5b18d35f58af6b1af1dd45d1c97d2163a9b7a42cd579499234b80bd1dd7729b5 not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.876329 5030 scope.go:117] "RemoveContainer" containerID="c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.896343 5030 scope.go:117] "RemoveContainer" 
containerID="c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.911737 5030 scope.go:117] "RemoveContainer" containerID="c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.912299 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33\": container with ID starting with c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33 not found: ID does not exist" containerID="c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.912363 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33"} err="failed to get container status \"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33\": rpc error: code = NotFound desc = could not find container \"c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33\": container with ID starting with c93df97c119c55c22e75d8a04150ddd1c5a2d7f15dc2b57ed704c6c7f485fb33 not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.912436 5030 scope.go:117] "RemoveContainer" containerID="c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c" Jan 20 22:55:32 crc kubenswrapper[5030]: E0120 22:55:32.913341 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c\": container with ID starting with c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c not found: ID does not exist" containerID="c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.913375 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c"} err="failed to get container status \"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c\": rpc error: code = NotFound desc = could not find container \"c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c\": container with ID starting with c86f3d18825271b702ba7c875db8138c946a9cb80e528f38d065a6ba41118b3c not found: ID does not exist" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69frn\" (UniqueName: \"kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.935974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbncs\" (UniqueName: \"kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936147 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936170 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:32 crc kubenswrapper[5030]: I0120 22:55:32.936266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.037985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hbncs\" (UniqueName: \"kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69frn\" (UniqueName: \"kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.038334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.039236 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.039406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.039451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.040096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.040361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.044345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.045157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.045544 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.046747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.048225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.051372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.056585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.056783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.060522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbncs\" (UniqueName: \"kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.065260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69frn\" (UniqueName: \"kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn\") pod \"ceilometer-0\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.092179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.151192 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.163987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.614232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:33 crc kubenswrapper[5030]: W0120 22:55:33.615671 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2183a5d1_2627_4b96_996e_627ce72f5bc6.slice/crio-9c6707a419c7e2c60e90c7be497340945e3f6f32636560cb1672421a78febeb9 WatchSource:0}: Error finding container 9c6707a419c7e2c60e90c7be497340945e3f6f32636560cb1672421a78febeb9: Status 404 returned error can't find the container with id 9c6707a419c7e2c60e90c7be497340945e3f6f32636560cb1672421a78febeb9 Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.644539 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.700912 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.731736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerStarted","Data":"7579ebaccaa33ec14f8e1d1c7e82a66614c0f6ace8b6b2afb2695cd0a9684c22"} Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.733580 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerID="a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06" exitCode=0 Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.733638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerDied","Data":"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06"} Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.733656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c99b064-a7ab-46e1-9f39-64227f14c796","Type":"ContainerDied","Data":"9b162603f333fd0d668e8eff38d9ad0e1e1b7f5d68524cc45923ebe413899017"} Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.733670 5030 scope.go:117] "RemoveContainer" containerID="a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.733684 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.745814 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerStarted","Data":"9c6707a419c7e2c60e90c7be497340945e3f6f32636560cb1672421a78febeb9"} Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.747151 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p67j\" (UniqueName: \"kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749637 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.749974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs" (OuterVolumeSpecName: "logs") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "logs". 
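Annotation (not part of the journal): the "SyncLoop (PLEG)" entries interleaved above record the container lifecycle transitions, ContainerDied for the old ceilometer-0 and glance pods and ContainerStarted for their replacements. A hypothetical Go sketch, again assuming a saved copy of this excerpt with one journal entry per line and a placeholder file name, that extracts those events into a per-pod timeline; the event payload kubenswrapper logs is plain JSON, so it can be decoded directly.

```go
// Sketch only: extract the SyncLoop (PLEG) events from a saved copy of this
// excerpt and print a per-pod timeline. The event payload
// ({"ID":...,"Type":...,"Data":...}) is plain JSON. The file name is a
// placeholder; one journal entry per line is assumed.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"regexp"
)

type plegEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // ContainerStarted, ContainerDied, ...
	Data string `json:"Data"` // container or sandbox ID
}

func main() {
	re := regexp.MustCompile(`SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{[^}]+\})`)

	f, err := os.Open("kubelet-journal.log") // placeholder
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		m := re.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		var ev plegEvent
		if err := json.Unmarshal([]byte(m[2]), &ev); err != nil {
			continue // tolerate entries truncated by line wrapping
		}
		fmt.Printf("%-55s %-18s %s\n", m[1], ev.Type, ev.Data)
	}
}
```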
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.757091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.757183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run\") pod \"7c99b064-a7ab-46e1-9f39-64227f14c796\" (UID: \"7c99b064-a7ab-46e1-9f39-64227f14c796\") " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.757410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts" (OuterVolumeSpecName: "scripts") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.758159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.758747 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.758832 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c99b064-a7ab-46e1-9f39-64227f14c796-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.758886 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.761750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.761826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j" (OuterVolumeSpecName: "kube-api-access-4p67j") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "kube-api-access-4p67j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.770375 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.770355483 podStartE2EDuration="3.770355483s" podCreationTimestamp="2026-01-20 22:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:33.764361638 +0000 UTC m=+1206.084621926" watchObservedRunningTime="2026-01-20 22:55:33.770355483 +0000 UTC m=+1206.090615761" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.781252 5030 scope.go:117] "RemoveContainer" containerID="46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.796078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.804800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data" (OuterVolumeSpecName: "config-data") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.813827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c99b064-a7ab-46e1-9f39-64227f14c796" (UID: "7c99b064-a7ab-46e1-9f39-64227f14c796"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.860811 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p67j\" (UniqueName: \"kubernetes.io/projected/7c99b064-a7ab-46e1-9f39-64227f14c796-kube-api-access-4p67j\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.860856 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.860870 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.860882 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.860895 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c99b064-a7ab-46e1-9f39-64227f14c796-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.875402 5030 scope.go:117] "RemoveContainer" containerID="a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06" Jan 20 22:55:33 crc kubenswrapper[5030]: E0120 22:55:33.876055 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06\": container with ID starting with a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06 not found: ID does not exist" containerID="a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.876093 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06"} err="failed to get container status \"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06\": rpc error: code = NotFound desc = could not find container \"a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06\": container with ID starting with a59c09d700ae01d32bd23b9e5481351100daeb6febd423cf4c7fcf2d6e662c06 not found: ID does not exist" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.876120 5030 scope.go:117] "RemoveContainer" containerID="46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0" Jan 20 22:55:33 crc kubenswrapper[5030]: E0120 22:55:33.876575 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0\": container with ID starting with 46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0 not found: ID does not exist" containerID="46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.876597 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0"} err="failed to get container status 
\"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0\": rpc error: code = NotFound desc = could not find container \"46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0\": container with ID starting with 46df37c503cb037bacfe11ad662227a33d86eea89c69d58b223169d6ff67a6c0 not found: ID does not exist" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.884574 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.963363 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.974462 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b731572-00e6-412a-95ad-f29dfcb5fd25" path="/var/lib/kubelet/pods/1b731572-00e6-412a-95ad-f29dfcb5fd25/volumes" Jan 20 22:55:33 crc kubenswrapper[5030]: I0120 22:55:33.975766 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484288bf-3189-4fbb-9166-1a039ba08128" path="/var/lib/kubelet/pods/484288bf-3189-4fbb-9166-1a039ba08128/volumes" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.061603 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.082963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.092431 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:34 crc kubenswrapper[5030]: E0120 22:55:34.092839 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-log" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.092857 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-log" Jan 20 22:55:34 crc kubenswrapper[5030]: E0120 22:55:34.092873 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-httpd" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.092879 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-httpd" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.093072 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-httpd" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.093102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" containerName="glance-log" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.094010 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.098272 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.098567 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.106324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.268536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.268848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.268875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.268918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72xcq\" (UniqueName: \"kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.269046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.269068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.269087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 
20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.269107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72xcq\" (UniqueName: \"kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.370986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.371477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.371579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.372066 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.375089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.375164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.375586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.376944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.389163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72xcq\" (UniqueName: \"kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.409299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.419080 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.760827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerStarted","Data":"ca0d50b0ac70c9b1652d1607dbb6f9befb40fbcbc096d3d4f0cacd17b828d7c4"} Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.778482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerStarted","Data":"4eb95752fd1da606b39641693b5639240910a1ec73d7ec7ac34635061387513d"} Jan 20 22:55:34 crc kubenswrapper[5030]: I0120 22:55:34.881492 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.791177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerStarted","Data":"11b5e34117b83101a01cef056a94b7f9f49e60e5742feac5554f4453dbe468a7"} Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.793864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerStarted","Data":"be10608095d180892da25bc367a49e31085449b06dae8af243a26e65d423eabd"} Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.796920 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerStarted","Data":"f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f"} Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.796980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerStarted","Data":"bb9c254ed7087071c1e4a6a64b5ffd65090cf6e8d35568315b3a537d68632b61"} Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.824100 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.824081397 podStartE2EDuration="3.824081397s" podCreationTimestamp="2026-01-20 22:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:35.813427878 +0000 UTC m=+1208.133688176" watchObservedRunningTime="2026-01-20 22:55:35.824081397 +0000 UTC m=+1208.144341685" Jan 20 22:55:35 crc kubenswrapper[5030]: I0120 22:55:35.972541 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c99b064-a7ab-46e1-9f39-64227f14c796" path="/var/lib/kubelet/pods/7c99b064-a7ab-46e1-9f39-64227f14c796/volumes" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.189421 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5"] Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.191122 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.193189 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.193350 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.194734 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-8chxr" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.204613 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5"] Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.325725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.325777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.325813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhsb4\" (UniqueName: \"kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.325968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.427671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.427739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.427775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhsb4\" (UniqueName: 
\"kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.427896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.432794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.438244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.438863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.445170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhsb4\" (UniqueName: \"kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4\") pod \"nova-cell0-conductor-db-sync-2c5c5\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.519749 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.539062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.808168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerStarted","Data":"bbe5c2167f06bdf774f0ccb4c6713fe4fbf2cb648e1b798b7e335d599a1ccc40"} Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.812870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerStarted","Data":"09189446050efd620d2103042e8ffbe0d00050dcd359f229a97abe5fb2fafe72"} Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.833892 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.833873749 podStartE2EDuration="2.833873749s" podCreationTimestamp="2026-01-20 22:55:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:55:36.831513922 +0000 UTC m=+1209.151774210" watchObservedRunningTime="2026-01-20 22:55:36.833873749 +0000 UTC m=+1209.154134037" Jan 20 22:55:36 crc kubenswrapper[5030]: I0120 22:55:36.956486 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5"] Jan 20 22:55:36 crc kubenswrapper[5030]: W0120 22:55:36.966017 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c19680_49a9_401d_bea1_db45fd5e9004.slice/crio-dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7 WatchSource:0}: Error finding container dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7: Status 404 returned error can't find the container with id dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7 Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.838301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" event={"ID":"70c19680-49a9-401d-bea1-db45fd5e9004","Type":"ContainerStarted","Data":"dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7"} Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerStarted","Data":"91e9a073bd8d96ffb1e9921b40aa367ce74be1aaaf7a18f28272b87379fed356"} Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846359 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-central-agent" containerID="cri-o://ca0d50b0ac70c9b1652d1607dbb6f9befb40fbcbc096d3d4f0cacd17b828d7c4" gracePeriod=30 Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846485 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="proxy-httpd" containerID="cri-o://91e9a073bd8d96ffb1e9921b40aa367ce74be1aaaf7a18f28272b87379fed356" gracePeriod=30 Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846520 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="sg-core" containerID="cri-o://bbe5c2167f06bdf774f0ccb4c6713fe4fbf2cb648e1b798b7e335d599a1ccc40" gracePeriod=30 Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.846550 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-notification-agent" containerID="cri-o://11b5e34117b83101a01cef056a94b7f9f49e60e5742feac5554f4453dbe468a7" gracePeriod=30 Jan 20 22:55:37 crc kubenswrapper[5030]: I0120 22:55:37.880343 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.27937118 podStartE2EDuration="5.880325392s" podCreationTimestamp="2026-01-20 22:55:32 +0000 UTC" firstStartedPulling="2026-01-20 22:55:33.61858686 +0000 UTC m=+1205.938847148" lastFinishedPulling="2026-01-20 22:55:37.219541062 +0000 UTC m=+1209.539801360" observedRunningTime="2026-01-20 22:55:37.867334096 +0000 UTC m=+1210.187594394" watchObservedRunningTime="2026-01-20 22:55:37.880325392 +0000 UTC m=+1210.200585680" Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.862504 5030 generic.go:334] "Generic (PLEG): container finished" podID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerID="91e9a073bd8d96ffb1e9921b40aa367ce74be1aaaf7a18f28272b87379fed356" exitCode=0 Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.862982 5030 generic.go:334] "Generic (PLEG): container finished" podID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerID="bbe5c2167f06bdf774f0ccb4c6713fe4fbf2cb648e1b798b7e335d599a1ccc40" exitCode=2 Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.862993 5030 generic.go:334] "Generic (PLEG): container finished" podID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerID="11b5e34117b83101a01cef056a94b7f9f49e60e5742feac5554f4453dbe468a7" exitCode=0 Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.862585 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerDied","Data":"91e9a073bd8d96ffb1e9921b40aa367ce74be1aaaf7a18f28272b87379fed356"} Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.863656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerDied","Data":"bbe5c2167f06bdf774f0ccb4c6713fe4fbf2cb648e1b798b7e335d599a1ccc40"} Jan 20 22:55:38 crc kubenswrapper[5030]: I0120 22:55:38.863703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerDied","Data":"11b5e34117b83101a01cef056a94b7f9f49e60e5742feac5554f4453dbe468a7"} Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.157236 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.157599 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.157666 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.158412 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.158483 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3" gracePeriod=600 Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.882055 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3" exitCode=0 Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.882133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3"} Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.882183 5030 scope.go:117] "RemoveContainer" containerID="5d3679ff59bff2ab777e8cc7a5527010f7cd48401dbe03736305ae0e00f5026f" Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.884924 5030 generic.go:334] "Generic (PLEG): container finished" podID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerID="ca0d50b0ac70c9b1652d1607dbb6f9befb40fbcbc096d3d4f0cacd17b828d7c4" exitCode=0 Jan 20 22:55:40 crc kubenswrapper[5030]: I0120 22:55:40.884950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerDied","Data":"ca0d50b0ac70c9b1652d1607dbb6f9befb40fbcbc096d3d4f0cacd17b828d7c4"} Jan 20 22:55:42 crc kubenswrapper[5030]: I0120 22:55:42.921721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.164759 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.164825 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.211020 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.212253 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.772942 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69frn\" (UniqueName: \"kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853890 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.853992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.854061 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts\") pod \"2183a5d1-2627-4b96-996e-627ce72f5bc6\" (UID: \"2183a5d1-2627-4b96-996e-627ce72f5bc6\") " Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.854447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.854649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.857549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts" (OuterVolumeSpecName: "scripts") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.857647 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn" (OuterVolumeSpecName: "kube-api-access-69frn") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "kube-api-access-69frn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.878887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.920671 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.925566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2183a5d1-2627-4b96-996e-627ce72f5bc6","Type":"ContainerDied","Data":"9c6707a419c7e2c60e90c7be497340945e3f6f32636560cb1672421a78febeb9"} Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.925642 5030 scope.go:117] "RemoveContainer" containerID="91e9a073bd8d96ffb1e9921b40aa367ce74be1aaaf7a18f28272b87379fed356" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.925767 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.930492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" event={"ID":"70c19680-49a9-401d-bea1-db45fd5e9004","Type":"ContainerStarted","Data":"7831bb3048791d0b20d632a1503d425a4657f693ca27819b51673e2e5d6aa956"} Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.939980 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.940008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab"} Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.940120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.953382 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" podStartSLOduration=1.348948753 podStartE2EDuration="7.953359122s" podCreationTimestamp="2026-01-20 22:55:36 +0000 UTC" firstStartedPulling="2026-01-20 22:55:36.969292817 +0000 UTC m=+1209.289553115" lastFinishedPulling="2026-01-20 22:55:43.573703166 +0000 UTC m=+1215.893963484" observedRunningTime="2026-01-20 22:55:43.944011115 +0000 UTC m=+1216.264271423" watchObservedRunningTime="2026-01-20 22:55:43.953359122 +0000 UTC m=+1216.273619410" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.953580 5030 scope.go:117] "RemoveContainer" containerID="bbe5c2167f06bdf774f0ccb4c6713fe4fbf2cb648e1b798b7e335d599a1ccc40" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.956520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data" (OuterVolumeSpecName: "config-data") pod "2183a5d1-2627-4b96-996e-627ce72f5bc6" (UID: "2183a5d1-2627-4b96-996e-627ce72f5bc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957483 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957517 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957530 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957540 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69frn\" (UniqueName: \"kubernetes.io/projected/2183a5d1-2627-4b96-996e-627ce72f5bc6-kube-api-access-69frn\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957548 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957557 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2183a5d1-2627-4b96-996e-627ce72f5bc6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.957566 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2183a5d1-2627-4b96-996e-627ce72f5bc6-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.973749 5030 scope.go:117] "RemoveContainer" containerID="11b5e34117b83101a01cef056a94b7f9f49e60e5742feac5554f4453dbe468a7" Jan 20 22:55:43 crc kubenswrapper[5030]: I0120 22:55:43.993082 5030 scope.go:117] "RemoveContainer" containerID="ca0d50b0ac70c9b1652d1607dbb6f9befb40fbcbc096d3d4f0cacd17b828d7c4" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.249142 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.257741 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.284435 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:44 crc kubenswrapper[5030]: E0120 22:55:44.284853 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-central-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.284873 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-central-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: E0120 22:55:44.284894 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="proxy-httpd" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.284903 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="proxy-httpd" Jan 20 
22:55:44 crc kubenswrapper[5030]: E0120 22:55:44.284921 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-notification-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.284931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-notification-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: E0120 22:55:44.284947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="sg-core" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.284955 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="sg-core" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.285179 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="proxy-httpd" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.285196 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="sg-core" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.285221 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-central-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.285230 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" containerName="ceilometer-notification-agent" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.292605 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.296226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.296453 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.317714 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.364591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.364684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.364706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbkm7\" (UniqueName: \"kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.364726 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.365127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.365214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.365244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.420664 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.421590 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.449285 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473272 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbkm7\" (UniqueName: \"kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.473503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.474367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.474722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.481776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.488602 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.497881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.498678 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.505662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tbkm7\" (UniqueName: \"kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7\") pod \"ceilometer-0\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.607717 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.956770 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:44 crc kubenswrapper[5030]: I0120 22:55:44.957048 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.048719 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.701359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.768886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.976449 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2183a5d1-2627-4b96-996e-627ce72f5bc6" path="/var/lib/kubelet/pods/2183a5d1-2627-4b96-996e-627ce72f5bc6/volumes" Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.977586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerStarted","Data":"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3"} Jan 20 22:55:45 crc kubenswrapper[5030]: I0120 22:55:45.977609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerStarted","Data":"b9af4c8f14fae1cf5d7741d64d8c6905fa32b08b1c180d6a842ed2866db1107a"} Jan 20 22:55:46 crc kubenswrapper[5030]: I0120 22:55:46.983554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerStarted","Data":"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658"} Jan 20 22:55:47 crc kubenswrapper[5030]: I0120 22:55:47.091973 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:47 crc kubenswrapper[5030]: I0120 22:55:47.092086 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:55:47 crc kubenswrapper[5030]: I0120 22:55:47.093384 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:55:48 crc kubenswrapper[5030]: I0120 22:55:48.030472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerStarted","Data":"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6"} Jan 20 22:55:50 crc kubenswrapper[5030]: I0120 22:55:50.060474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerStarted","Data":"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b"} Jan 20 22:55:50 crc kubenswrapper[5030]: I0120 22:55:50.061094 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:55:50 crc kubenswrapper[5030]: I0120 22:55:50.087866 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.595097695 podStartE2EDuration="6.087830874s" podCreationTimestamp="2026-01-20 22:55:44 +0000 UTC" firstStartedPulling="2026-01-20 22:55:45.045939214 +0000 UTC m=+1217.366199502" lastFinishedPulling="2026-01-20 22:55:49.538672363 +0000 UTC m=+1221.858932681" observedRunningTime="2026-01-20 22:55:50.083861178 +0000 UTC m=+1222.404121496" watchObservedRunningTime="2026-01-20 22:55:50.087830874 +0000 UTC m=+1222.408091202" Jan 20 22:55:56 crc kubenswrapper[5030]: I0120 22:55:56.110681 5030 generic.go:334] "Generic (PLEG): container finished" podID="70c19680-49a9-401d-bea1-db45fd5e9004" containerID="7831bb3048791d0b20d632a1503d425a4657f693ca27819b51673e2e5d6aa956" exitCode=0 Jan 20 22:55:56 crc kubenswrapper[5030]: I0120 22:55:56.110778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" event={"ID":"70c19680-49a9-401d-bea1-db45fd5e9004","Type":"ContainerDied","Data":"7831bb3048791d0b20d632a1503d425a4657f693ca27819b51673e2e5d6aa956"} Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.460520 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.519116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhsb4\" (UniqueName: \"kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4\") pod \"70c19680-49a9-401d-bea1-db45fd5e9004\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.519241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data\") pod \"70c19680-49a9-401d-bea1-db45fd5e9004\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.519278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts\") pod \"70c19680-49a9-401d-bea1-db45fd5e9004\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.519308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle\") pod \"70c19680-49a9-401d-bea1-db45fd5e9004\" (UID: \"70c19680-49a9-401d-bea1-db45fd5e9004\") " Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.525475 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts" (OuterVolumeSpecName: "scripts") pod "70c19680-49a9-401d-bea1-db45fd5e9004" (UID: "70c19680-49a9-401d-bea1-db45fd5e9004"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.525899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4" (OuterVolumeSpecName: "kube-api-access-xhsb4") pod "70c19680-49a9-401d-bea1-db45fd5e9004" (UID: "70c19680-49a9-401d-bea1-db45fd5e9004"). InnerVolumeSpecName "kube-api-access-xhsb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.546940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70c19680-49a9-401d-bea1-db45fd5e9004" (UID: "70c19680-49a9-401d-bea1-db45fd5e9004"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.553226 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data" (OuterVolumeSpecName: "config-data") pod "70c19680-49a9-401d-bea1-db45fd5e9004" (UID: "70c19680-49a9-401d-bea1-db45fd5e9004"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.621015 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.621049 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.621060 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c19680-49a9-401d-bea1-db45fd5e9004-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:57 crc kubenswrapper[5030]: I0120 22:55:57.621073 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhsb4\" (UniqueName: \"kubernetes.io/projected/70c19680-49a9-401d-bea1-db45fd5e9004-kube-api-access-xhsb4\") on node \"crc\" DevicePath \"\"" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.134201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" event={"ID":"70c19680-49a9-401d-bea1-db45fd5e9004","Type":"ContainerDied","Data":"dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7"} Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.134273 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd66abae615f0929b4850ddaac8da30b4cce24076675a9c2803456abea00dc7" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.134294 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.247091 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:55:58 crc kubenswrapper[5030]: E0120 22:55:58.247473 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c19680-49a9-401d-bea1-db45fd5e9004" containerName="nova-cell0-conductor-db-sync" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.247493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c19680-49a9-401d-bea1-db45fd5e9004" containerName="nova-cell0-conductor-db-sync" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.247776 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c19680-49a9-401d-bea1-db45fd5e9004" containerName="nova-cell0-conductor-db-sync" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.248483 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.250822 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.251923 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-8chxr" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.272469 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.332508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.332580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2h7r\" (UniqueName: \"kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.332792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.434873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.434959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2h7r\" (UniqueName: \"kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r\") pod \"nova-cell0-conductor-0\" 
(UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.435073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.439576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.440772 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.456692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2h7r\" (UniqueName: \"kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r\") pod \"nova-cell0-conductor-0\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:58 crc kubenswrapper[5030]: I0120 22:55:58.572598 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:55:59 crc kubenswrapper[5030]: I0120 22:55:59.033774 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:55:59 crc kubenswrapper[5030]: W0120 22:55:59.044587 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d59910a_9b07_47ff_b79b_661139ece8c3.slice/crio-b8b314698985d960cc62b1dcff7fe13d10cfed291fdc3bb43d84a42bc3d3ca1a WatchSource:0}: Error finding container b8b314698985d960cc62b1dcff7fe13d10cfed291fdc3bb43d84a42bc3d3ca1a: Status 404 returned error can't find the container with id b8b314698985d960cc62b1dcff7fe13d10cfed291fdc3bb43d84a42bc3d3ca1a Jan 20 22:55:59 crc kubenswrapper[5030]: I0120 22:55:59.146584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"1d59910a-9b07-47ff-b79b-661139ece8c3","Type":"ContainerStarted","Data":"b8b314698985d960cc62b1dcff7fe13d10cfed291fdc3bb43d84a42bc3d3ca1a"} Jan 20 22:56:00 crc kubenswrapper[5030]: I0120 22:56:00.156592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"1d59910a-9b07-47ff-b79b-661139ece8c3","Type":"ContainerStarted","Data":"789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e"} Jan 20 22:56:00 crc kubenswrapper[5030]: I0120 22:56:00.159437 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:56:00 crc kubenswrapper[5030]: I0120 22:56:00.188169 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" 
podStartSLOduration=2.188138905 podStartE2EDuration="2.188138905s" podCreationTimestamp="2026-01-20 22:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:00.180765257 +0000 UTC m=+1232.501025545" watchObservedRunningTime="2026-01-20 22:56:00.188138905 +0000 UTC m=+1232.508399223" Jan 20 22:56:08 crc kubenswrapper[5030]: I0120 22:56:08.615601 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.112121 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.113150 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.121292 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.121941 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.130794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.154154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.154242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6p4d\" (UniqueName: \"kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.154338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.154392 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.256050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.256178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.256203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6p4d\" (UniqueName: \"kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.256236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.264166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.265352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.283076 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.284810 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.286855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.292192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6p4d\" (UniqueName: \"kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.293412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kqh4h\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.338318 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.358162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.358223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.358282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.358303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bpc6\" (UniqueName: \"kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.367715 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.369312 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.386477 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.392770 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.450124 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460631 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bpc6\" (UniqueName: \"kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.460776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjt2c\" (UniqueName: \"kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.461730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs\") 
pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.470278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.498258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.503930 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.505311 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.507922 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.535687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bpc6\" (UniqueName: \"kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6\") pod \"nova-api-0\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.566689 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.568646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.568680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.568716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjt2c\" (UniqueName: \"kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.575594 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.581445 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.601995 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.606822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.608328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.615260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjt2c\" (UniqueName: \"kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.653968 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.670713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.670816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6zfb\" (UniqueName: \"kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.670856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.705610 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.737984 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7m95\" (UniqueName: \"kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6zfb\" (UniqueName: \"kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.775892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.784530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.785140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc 
kubenswrapper[5030]: I0120 22:56:09.796061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6zfb\" (UniqueName: \"kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb\") pod \"nova-scheduler-0\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.878727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7m95\" (UniqueName: \"kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.878855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.878899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.878955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.880590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.885440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.886284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.898876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7m95\" (UniqueName: \"kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95\") pod \"nova-metadata-0\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.946686 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:09 crc kubenswrapper[5030]: I0120 22:56:09.966381 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.053411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h"] Jan 20 22:56:10 crc kubenswrapper[5030]: W0120 22:56:10.070568 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4b12e1_a480_491b_94a7_57989dc998eb.slice/crio-e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06 WatchSource:0}: Error finding container e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06: Status 404 returned error can't find the container with id e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06 Jan 20 22:56:10 crc kubenswrapper[5030]: W0120 22:56:10.193241 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49b27d4b_53fe_4dd1_ac75_a00a12c33655.slice/crio-23e199514c7bf45d83f0e1e66b68fec71ef502374e169a5c073a8fb8ba0dd79e WatchSource:0}: Error finding container 23e199514c7bf45d83f0e1e66b68fec71ef502374e169a5c073a8fb8ba0dd79e: Status 404 returned error can't find the container with id 23e199514c7bf45d83f0e1e66b68fec71ef502374e169a5c073a8fb8ba0dd79e Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.200468 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.265365 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.266845 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.272197 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.272417 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 22:56:10 crc kubenswrapper[5030]: W0120 22:56:10.296544 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbda6086_3121_417a_ab80_989c2641c87a.slice/crio-1c03776b3466a82fe19d33911416515f4704b16b1b6d6a92e6245de72419fccb WatchSource:0}: Error finding container 1c03776b3466a82fe19d33911416515f4704b16b1b6d6a92e6245de72419fccb: Status 404 returned error can't find the container with id 1c03776b3466a82fe19d33911416515f4704b16b1b6d6a92e6245de72419fccb Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.296656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.301785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" event={"ID":"fa4b12e1-a480-491b-94a7-57989dc998eb","Type":"ContainerStarted","Data":"e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06"} Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.307769 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerStarted","Data":"23e199514c7bf45d83f0e1e66b68fec71ef502374e169a5c073a8fb8ba0dd79e"} Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.309104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.391642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.391981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfq8d\" (UniqueName: \"kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.392058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.392136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.397475 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.495803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.495862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfq8d\" (UniqueName: \"kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.495957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.495990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.503150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.505197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.508227 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.515007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.515564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rfq8d\" (UniqueName: \"kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d\") pod \"nova-cell1-conductor-db-sync-9thns\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:10 crc kubenswrapper[5030]: I0120 22:56:10.673452 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.141612 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns"] Jan 20 22:56:11 crc kubenswrapper[5030]: W0120 22:56:11.147123 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd0a29f_3a2b_4fdb_86b6_11fb77b00a35.slice/crio-ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500 WatchSource:0}: Error finding container ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500: Status 404 returned error can't find the container with id ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500 Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.320809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f8f24c28-d790-451a-81cb-ca190c707420","Type":"ContainerStarted","Data":"db95afd6f03d7907fd81e229e8f1363ea4395a01b419bf9f73307328959d98c0"} Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.324115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" event={"ID":"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35","Type":"ContainerStarted","Data":"ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500"} Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.327391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" event={"ID":"fa4b12e1-a480-491b-94a7-57989dc998eb","Type":"ContainerStarted","Data":"49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816"} Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.330917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerStarted","Data":"b66e54e92bf14bbfd3d49a7a817f5b5a072f1c7aeb63d3b8f1ce4e2d2b6a1f89"} Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.331749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"cbda6086-3121-417a-ab80-989c2641c87a","Type":"ContainerStarted","Data":"1c03776b3466a82fe19d33911416515f4704b16b1b6d6a92e6245de72419fccb"} Jan 20 22:56:11 crc kubenswrapper[5030]: I0120 22:56:11.351229 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" podStartSLOduration=2.3512125839999998 podStartE2EDuration="2.351212584s" podCreationTimestamp="2026-01-20 22:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:11.34404358 +0000 UTC m=+1243.664303868" watchObservedRunningTime="2026-01-20 22:56:11.351212584 +0000 UTC m=+1243.671472872" Jan 20 22:56:12 crc kubenswrapper[5030]: I0120 22:56:12.342553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" event={"ID":"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35","Type":"ContainerStarted","Data":"379f9e93e41d27df40fadbc04a7f498ac03dce05e1399bab7f71d6c134822330"} Jan 20 22:56:12 crc kubenswrapper[5030]: I0120 22:56:12.364730 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" podStartSLOduration=2.364713066 podStartE2EDuration="2.364713066s" podCreationTimestamp="2026-01-20 22:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:12.358339301 +0000 UTC m=+1244.678599589" watchObservedRunningTime="2026-01-20 22:56:12.364713066 +0000 UTC m=+1244.684973354" Jan 20 22:56:12 crc kubenswrapper[5030]: I0120 22:56:12.764131 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:12 crc kubenswrapper[5030]: I0120 22:56:12.772574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.365116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"cbda6086-3121-417a-ab80-989c2641c87a","Type":"ContainerStarted","Data":"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.365368 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="cbda6086-3121-417a-ab80-989c2641c87a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969" gracePeriod=30 Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.374307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f8f24c28-d790-451a-81cb-ca190c707420","Type":"ContainerStarted","Data":"26f2f0767268f4022645c573e1271916fd36287774ea788d59b21d7a72e29c72"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.377634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerStarted","Data":"2f80231e85030d0b524b044287053cf4535a6f353648dc9031d28703bdb28583"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.377763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerStarted","Data":"66e9f3e88813ae293b39322d69fb0eb0235a26062b99fea75af34d4fb4f87df4"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.385064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerStarted","Data":"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.385513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerStarted","Data":"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071"} Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.385256 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/nova-metadata-0" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-log" containerID="cri-o://1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" gracePeriod=30 Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.385680 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-metadata" containerID="cri-o://5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" gracePeriod=30 Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.405408 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.748759753 podStartE2EDuration="5.405331291s" podCreationTimestamp="2026-01-20 22:56:09 +0000 UTC" firstStartedPulling="2026-01-20 22:56:10.301505952 +0000 UTC m=+1242.621766240" lastFinishedPulling="2026-01-20 22:56:12.95807749 +0000 UTC m=+1245.278337778" observedRunningTime="2026-01-20 22:56:14.386197327 +0000 UTC m=+1246.706457635" watchObservedRunningTime="2026-01-20 22:56:14.405331291 +0000 UTC m=+1246.725591619" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.425170 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.984890436 podStartE2EDuration="5.425133822s" podCreationTimestamp="2026-01-20 22:56:09 +0000 UTC" firstStartedPulling="2026-01-20 22:56:10.51808267 +0000 UTC m=+1242.838342968" lastFinishedPulling="2026-01-20 22:56:12.958326066 +0000 UTC m=+1245.278586354" observedRunningTime="2026-01-20 22:56:14.415144809 +0000 UTC m=+1246.735405137" watchObservedRunningTime="2026-01-20 22:56:14.425133822 +0000 UTC m=+1246.745394160" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.451441 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.6892736 podStartE2EDuration="5.45141019s" podCreationTimestamp="2026-01-20 22:56:09 +0000 UTC" firstStartedPulling="2026-01-20 22:56:10.196240127 +0000 UTC m=+1242.516500415" lastFinishedPulling="2026-01-20 22:56:12.958376727 +0000 UTC m=+1245.278637005" observedRunningTime="2026-01-20 22:56:14.445685901 +0000 UTC m=+1246.765946209" watchObservedRunningTime="2026-01-20 22:56:14.45141019 +0000 UTC m=+1246.771670508" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.464833 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.898948671 podStartE2EDuration="5.464814216s" podCreationTimestamp="2026-01-20 22:56:09 +0000 UTC" firstStartedPulling="2026-01-20 22:56:10.404299318 +0000 UTC m=+1242.724559606" lastFinishedPulling="2026-01-20 22:56:12.970164853 +0000 UTC m=+1245.290425151" observedRunningTime="2026-01-20 22:56:14.463059822 +0000 UTC m=+1246.783320110" watchObservedRunningTime="2026-01-20 22:56:14.464814216 +0000 UTC m=+1246.785074504" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.618943 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.738907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.947872 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.966742 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.967040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:14 crc kubenswrapper[5030]: I0120 22:56:14.988030 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.092196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data\") pod \"7fd6ab88-2c2c-4303-b379-667b201b64b5\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.092269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle\") pod \"7fd6ab88-2c2c-4303-b379-667b201b64b5\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.092344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs\") pod \"7fd6ab88-2c2c-4303-b379-667b201b64b5\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.092494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7m95\" (UniqueName: \"kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95\") pod \"7fd6ab88-2c2c-4303-b379-667b201b64b5\" (UID: \"7fd6ab88-2c2c-4303-b379-667b201b64b5\") " Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.092780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs" (OuterVolumeSpecName: "logs") pod "7fd6ab88-2c2c-4303-b379-667b201b64b5" (UID: "7fd6ab88-2c2c-4303-b379-667b201b64b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.093516 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fd6ab88-2c2c-4303-b379-667b201b64b5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.097982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95" (OuterVolumeSpecName: "kube-api-access-b7m95") pod "7fd6ab88-2c2c-4303-b379-667b201b64b5" (UID: "7fd6ab88-2c2c-4303-b379-667b201b64b5"). InnerVolumeSpecName "kube-api-access-b7m95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.117272 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data" (OuterVolumeSpecName: "config-data") pod "7fd6ab88-2c2c-4303-b379-667b201b64b5" (UID: "7fd6ab88-2c2c-4303-b379-667b201b64b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.117492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fd6ab88-2c2c-4303-b379-667b201b64b5" (UID: "7fd6ab88-2c2c-4303-b379-667b201b64b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.194570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7m95\" (UniqueName: \"kubernetes.io/projected/7fd6ab88-2c2c-4303-b379-667b201b64b5-kube-api-access-b7m95\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.194596 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.194605 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd6ab88-2c2c-4303-b379-667b201b64b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.402639 5030 generic.go:334] "Generic (PLEG): container finished" podID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerID="5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" exitCode=0 Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.402670 5030 generic.go:334] "Generic (PLEG): container finished" podID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerID="1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" exitCode=143 Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.403209 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.405864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerDied","Data":"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0"} Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.405918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerDied","Data":"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071"} Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.405932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"7fd6ab88-2c2c-4303-b379-667b201b64b5","Type":"ContainerDied","Data":"b66e54e92bf14bbfd3d49a7a817f5b5a072f1c7aeb63d3b8f1ce4e2d2b6a1f89"} Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.405951 5030 scope.go:117] "RemoveContainer" containerID="5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.436667 5030 scope.go:117] "RemoveContainer" containerID="1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.446212 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.455050 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.464377 5030 scope.go:117] "RemoveContainer" containerID="5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" Jan 20 22:56:15 crc kubenswrapper[5030]: E0120 22:56:15.464976 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0\": container with ID starting with 5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0 not found: ID does not exist" containerID="5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465016 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0"} err="failed to get container status \"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0\": rpc error: code = NotFound desc = could not find container \"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0\": container with ID starting with 5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0 not found: ID does not exist" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465078 5030 scope.go:117] "RemoveContainer" containerID="1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" Jan 20 22:56:15 crc kubenswrapper[5030]: E0120 22:56:15.465372 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071\": container with ID starting with 1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071 not found: ID does not exist" 
containerID="1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465412 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071"} err="failed to get container status \"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071\": rpc error: code = NotFound desc = could not find container \"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071\": container with ID starting with 1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071 not found: ID does not exist" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465441 5030 scope.go:117] "RemoveContainer" containerID="5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465938 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0"} err="failed to get container status \"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0\": rpc error: code = NotFound desc = could not find container \"5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0\": container with ID starting with 5f44593e0525fd18e64ed71d455f0d520ce3b3ba0b8be3cc19161999dd6aa9e0 not found: ID does not exist" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.465961 5030 scope.go:117] "RemoveContainer" containerID="1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.466364 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071"} err="failed to get container status \"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071\": rpc error: code = NotFound desc = could not find container \"1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071\": container with ID starting with 1246e5b3b0bc1247e2caecc9e0b133997ba398984498dabd65368d04da18d071 not found: ID does not exist" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.494666 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:15 crc kubenswrapper[5030]: E0120 22:56:15.495280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-metadata" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.495310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-metadata" Jan 20 22:56:15 crc kubenswrapper[5030]: E0120 22:56:15.495339 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-log" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.495353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-log" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.495677 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-metadata" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.495722 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" containerName="nova-metadata-log" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.497246 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.505769 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.506127 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.529350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.601600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.601682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nxzm\" (UniqueName: \"kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.601731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.601854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.601902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.703991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.704083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc 
kubenswrapper[5030]: I0120 22:56:15.704261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.704292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nxzm\" (UniqueName: \"kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.704333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.705735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.708972 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.709685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.718088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.719596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nxzm\" (UniqueName: \"kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm\") pod \"nova-metadata-0\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.830878 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:15 crc kubenswrapper[5030]: I0120 22:56:15.982442 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd6ab88-2c2c-4303-b379-667b201b64b5" path="/var/lib/kubelet/pods/7fd6ab88-2c2c-4303-b379-667b201b64b5/volumes" Jan 20 22:56:16 crc kubenswrapper[5030]: W0120 22:56:16.302782 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92e4a98_2fb3_4c64_bf8a_4be41d3ab0ec.slice/crio-d1dd3a351091a6fde3c7001ec8ccef6c44a7259d505d6772960c96a78e176aa8 WatchSource:0}: Error finding container d1dd3a351091a6fde3c7001ec8ccef6c44a7259d505d6772960c96a78e176aa8: Status 404 returned error can't find the container with id d1dd3a351091a6fde3c7001ec8ccef6c44a7259d505d6772960c96a78e176aa8 Jan 20 22:56:16 crc kubenswrapper[5030]: I0120 22:56:16.303521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:16 crc kubenswrapper[5030]: I0120 22:56:16.414502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerStarted","Data":"d1dd3a351091a6fde3c7001ec8ccef6c44a7259d505d6772960c96a78e176aa8"} Jan 20 22:56:17 crc kubenswrapper[5030]: E0120 22:56:17.346705 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4b12e1_a480_491b_94a7_57989dc998eb.slice/crio-49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4b12e1_a480_491b_94a7_57989dc998eb.slice/crio-conmon-49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816.scope\": RecentStats: unable to find data in memory cache]" Jan 20 22:56:17 crc kubenswrapper[5030]: I0120 22:56:17.428541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerStarted","Data":"d815603d666eefed1fd3f6178b06f440e30f6b9e289186473e3e975fe4e7069e"} Jan 20 22:56:17 crc kubenswrapper[5030]: I0120 22:56:17.428581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerStarted","Data":"01ea3d4ddfa4cb62ecf8192f43cd81d916914e07e1acb715ab872268e62148f3"} Jan 20 22:56:17 crc kubenswrapper[5030]: I0120 22:56:17.431404 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa4b12e1-a480-491b-94a7-57989dc998eb" containerID="49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816" exitCode=0 Jan 20 22:56:17 crc kubenswrapper[5030]: I0120 22:56:17.431449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" event={"ID":"fa4b12e1-a480-491b-94a7-57989dc998eb","Type":"ContainerDied","Data":"49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816"} Jan 20 22:56:17 crc kubenswrapper[5030]: I0120 22:56:17.449047 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.449030896 podStartE2EDuration="2.449030896s" podCreationTimestamp="2026-01-20 22:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:17.445140541 +0000 UTC m=+1249.765400869" watchObservedRunningTime="2026-01-20 22:56:17.449030896 +0000 UTC m=+1249.769291184" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.426483 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.427149 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" containerName="kube-state-metrics" containerID="cri-o://09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f" gracePeriod=30 Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.823267 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.919132 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.957190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle\") pod \"fa4b12e1-a480-491b-94a7-57989dc998eb\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.957233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6p4d\" (UniqueName: \"kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d\") pod \"fa4b12e1-a480-491b-94a7-57989dc998eb\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.957298 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts\") pod \"fa4b12e1-a480-491b-94a7-57989dc998eb\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.957546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data\") pod \"fa4b12e1-a480-491b-94a7-57989dc998eb\" (UID: \"fa4b12e1-a480-491b-94a7-57989dc998eb\") " Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.962920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d" (OuterVolumeSpecName: "kube-api-access-d6p4d") pod "fa4b12e1-a480-491b-94a7-57989dc998eb" (UID: "fa4b12e1-a480-491b-94a7-57989dc998eb"). InnerVolumeSpecName "kube-api-access-d6p4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.963372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts" (OuterVolumeSpecName: "scripts") pod "fa4b12e1-a480-491b-94a7-57989dc998eb" (UID: "fa4b12e1-a480-491b-94a7-57989dc998eb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.987183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa4b12e1-a480-491b-94a7-57989dc998eb" (UID: "fa4b12e1-a480-491b-94a7-57989dc998eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:18 crc kubenswrapper[5030]: I0120 22:56:18.999837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data" (OuterVolumeSpecName: "config-data") pod "fa4b12e1-a480-491b-94a7-57989dc998eb" (UID: "fa4b12e1-a480-491b-94a7-57989dc998eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.059280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8sxc\" (UniqueName: \"kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc\") pod \"bd0dddd2-0952-4bdb-9149-08b1b3653cbb\" (UID: \"bd0dddd2-0952-4bdb-9149-08b1b3653cbb\") " Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.059919 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.059951 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6p4d\" (UniqueName: \"kubernetes.io/projected/fa4b12e1-a480-491b-94a7-57989dc998eb-kube-api-access-d6p4d\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.059972 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.059991 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4b12e1-a480-491b-94a7-57989dc998eb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.061769 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc" (OuterVolumeSpecName: "kube-api-access-p8sxc") pod "bd0dddd2-0952-4bdb-9149-08b1b3653cbb" (UID: "bd0dddd2-0952-4bdb-9149-08b1b3653cbb"). InnerVolumeSpecName "kube-api-access-p8sxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.162076 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8sxc\" (UniqueName: \"kubernetes.io/projected/bd0dddd2-0952-4bdb-9149-08b1b3653cbb-kube-api-access-p8sxc\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.449176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" event={"ID":"fa4b12e1-a480-491b-94a7-57989dc998eb","Type":"ContainerDied","Data":"e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06"} Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.449212 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.449222 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f90801775107e605a52690f5ca7bb2d7dc8fe64f05034e5e0360e38c501d06" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.450456 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" containerID="09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f" exitCode=2 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.450481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"bd0dddd2-0952-4bdb-9149-08b1b3653cbb","Type":"ContainerDied","Data":"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f"} Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.450497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"bd0dddd2-0952-4bdb-9149-08b1b3653cbb","Type":"ContainerDied","Data":"12ba68aec1f5ec593692220b26209997a9d7ea613f75c79853a72240267ee41a"} Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.450500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.450512 5030 scope.go:117] "RemoveContainer" containerID="09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.487316 5030 scope.go:117] "RemoveContainer" containerID="09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.488277 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: E0120 22:56:19.490765 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f\": container with ID starting with 09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f not found: ID does not exist" containerID="09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.490816 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f"} err="failed to get container status \"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f\": rpc error: code = NotFound desc = could not find container \"09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f\": container with ID starting with 09a1061cec01b660a9ddb3341b1806861ee961251efda9dbc91b67e899b7a10f not found: ID does not exist" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.503691 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.534808 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: E0120 22:56:19.535204 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" containerName="kube-state-metrics" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 
22:56:19.535216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" containerName="kube-state-metrics" Jan 20 22:56:19 crc kubenswrapper[5030]: E0120 22:56:19.535250 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4b12e1-a480-491b-94a7-57989dc998eb" containerName="nova-manage" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.535256 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4b12e1-a480-491b-94a7-57989dc998eb" containerName="nova-manage" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.535409 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" containerName="kube-state-metrics" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.535421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4b12e1-a480-491b-94a7-57989dc998eb" containerName="nova-manage" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.536063 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.543179 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.543431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.546442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.592993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4b6g\" (UniqueName: \"kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.593050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.593185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.593239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.670421 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:19 crc 
kubenswrapper[5030]: I0120 22:56:19.670991 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="f8f24c28-d790-451a-81cb-ca190c707420" containerName="nova-scheduler-scheduler" containerID="cri-o://26f2f0767268f4022645c573e1271916fd36287774ea788d59b21d7a72e29c72" gracePeriod=30 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.688010 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.688231 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-log" containerID="cri-o://66e9f3e88813ae293b39322d69fb0eb0235a26062b99fea75af34d4fb4f87df4" gracePeriod=30 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.688631 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-api" containerID="cri-o://2f80231e85030d0b524b044287053cf4535a6f353648dc9031d28703bdb28583" gracePeriod=30 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.698559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.698646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.698747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4b6g\" (UniqueName: \"kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.698776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.706314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.717383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.732156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4b6g\" (UniqueName: \"kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.732160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.738532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.738806 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-log" containerID="cri-o://01ea3d4ddfa4cb62ecf8192f43cd81d916914e07e1acb715ab872268e62148f3" gracePeriod=30 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.739278 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-metadata" containerID="cri-o://d815603d666eefed1fd3f6178b06f440e30f6b9e289186473e3e975fe4e7069e" gracePeriod=30 Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.959269 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:19 crc kubenswrapper[5030]: I0120 22:56:19.971020 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd0dddd2-0952-4bdb-9149-08b1b3653cbb" path="/var/lib/kubelet/pods/bd0dddd2-0952-4bdb-9149-08b1b3653cbb/volumes" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.344320 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.345028 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-central-agent" containerID="cri-o://dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3" gracePeriod=30 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.345155 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="proxy-httpd" containerID="cri-o://66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b" gracePeriod=30 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.345387 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="sg-core" containerID="cri-o://524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6" gracePeriod=30 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.345435 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-notification-agent" containerID="cri-o://db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658" gracePeriod=30 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.397252 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.462709 5030 generic.go:334] "Generic (PLEG): container finished" podID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerID="d815603d666eefed1fd3f6178b06f440e30f6b9e289186473e3e975fe4e7069e" exitCode=0 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.462737 5030 generic.go:334] "Generic (PLEG): container finished" podID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerID="01ea3d4ddfa4cb62ecf8192f43cd81d916914e07e1acb715ab872268e62148f3" exitCode=143 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.462781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerDied","Data":"d815603d666eefed1fd3f6178b06f440e30f6b9e289186473e3e975fe4e7069e"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.462806 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerDied","Data":"01ea3d4ddfa4cb62ecf8192f43cd81d916914e07e1acb715ab872268e62148f3"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.464944 5030 generic.go:334] "Generic (PLEG): container finished" podID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerID="2f80231e85030d0b524b044287053cf4535a6f353648dc9031d28703bdb28583" exitCode=0 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.464982 
5030 generic.go:334] "Generic (PLEG): container finished" podID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerID="66e9f3e88813ae293b39322d69fb0eb0235a26062b99fea75af34d4fb4f87df4" exitCode=143 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.465028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerDied","Data":"2f80231e85030d0b524b044287053cf4535a6f353648dc9031d28703bdb28583"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.465057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerDied","Data":"66e9f3e88813ae293b39322d69fb0eb0235a26062b99fea75af34d4fb4f87df4"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.468571 5030 generic.go:334] "Generic (PLEG): container finished" podID="8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" containerID="379f9e93e41d27df40fadbc04a7f498ac03dce05e1399bab7f71d6c134822330" exitCode=0 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.468611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" event={"ID":"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35","Type":"ContainerDied","Data":"379f9e93e41d27df40fadbc04a7f498ac03dce05e1399bab7f71d6c134822330"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.472852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d969fd5b-6579-4bc5-b64f-05bcb2aec29d","Type":"ContainerStarted","Data":"712b17246d7d0928ffcfb40f7ddc0b583f5dad565a9dc7d530c64379cf291a5e"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.474643 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8f24c28-d790-451a-81cb-ca190c707420" containerID="26f2f0767268f4022645c573e1271916fd36287774ea788d59b21d7a72e29c72" exitCode=0 Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.474675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f8f24c28-d790-451a-81cb-ca190c707420","Type":"ContainerDied","Data":"26f2f0767268f4022645c573e1271916fd36287774ea788d59b21d7a72e29c72"} Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.704005 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.715382 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.821972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nxzm\" (UniqueName: \"kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm\") pod \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs\") pod \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data\") pod \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs\") pod \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs\") pod \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle\") pod \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs" (OuterVolumeSpecName: "logs") pod "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" (UID: "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle\") pod \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data\") pod \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\" (UID: \"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822909 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bpc6\" (UniqueName: \"kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6\") pod \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\" (UID: \"49b27d4b-53fe-4dd1-ac75-a00a12c33655\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.822833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs" (OuterVolumeSpecName: "logs") pod "49b27d4b-53fe-4dd1-ac75-a00a12c33655" (UID: "49b27d4b-53fe-4dd1-ac75-a00a12c33655"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.823590 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.823615 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b27d4b-53fe-4dd1-ac75-a00a12c33655-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.828116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm" (OuterVolumeSpecName: "kube-api-access-9nxzm") pod "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" (UID: "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec"). InnerVolumeSpecName "kube-api-access-9nxzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.837414 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.837658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6" (OuterVolumeSpecName: "kube-api-access-8bpc6") pod "49b27d4b-53fe-4dd1-ac75-a00a12c33655" (UID: "49b27d4b-53fe-4dd1-ac75-a00a12c33655"). InnerVolumeSpecName "kube-api-access-8bpc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.850145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b27d4b-53fe-4dd1-ac75-a00a12c33655" (UID: "49b27d4b-53fe-4dd1-ac75-a00a12c33655"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.855112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data" (OuterVolumeSpecName: "config-data") pod "49b27d4b-53fe-4dd1-ac75-a00a12c33655" (UID: "49b27d4b-53fe-4dd1-ac75-a00a12c33655"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.858007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" (UID: "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.875981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data" (OuterVolumeSpecName: "config-data") pod "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" (UID: "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.881964 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" (UID: "b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.924524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6zfb\" (UniqueName: \"kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb\") pod \"f8f24c28-d790-451a-81cb-ca190c707420\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.924600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data\") pod \"f8f24c28-d790-451a-81cb-ca190c707420\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.924795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle\") pod \"f8f24c28-d790-451a-81cb-ca190c707420\" (UID: \"f8f24c28-d790-451a-81cb-ca190c707420\") " Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925210 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925228 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925238 5030 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8bpc6\" (UniqueName: \"kubernetes.io/projected/49b27d4b-53fe-4dd1-ac75-a00a12c33655-kube-api-access-8bpc6\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925250 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nxzm\" (UniqueName: \"kubernetes.io/projected/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-kube-api-access-9nxzm\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925258 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b27d4b-53fe-4dd1-ac75-a00a12c33655-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925267 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.925275 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.928080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb" (OuterVolumeSpecName: "kube-api-access-n6zfb") pod "f8f24c28-d790-451a-81cb-ca190c707420" (UID: "f8f24c28-d790-451a-81cb-ca190c707420"). InnerVolumeSpecName "kube-api-access-n6zfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.948180 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data" (OuterVolumeSpecName: "config-data") pod "f8f24c28-d790-451a-81cb-ca190c707420" (UID: "f8f24c28-d790-451a-81cb-ca190c707420"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:20 crc kubenswrapper[5030]: I0120 22:56:20.954280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8f24c28-d790-451a-81cb-ca190c707420" (UID: "f8f24c28-d790-451a-81cb-ca190c707420"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.026896 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.026928 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6zfb\" (UniqueName: \"kubernetes.io/projected/f8f24c28-d790-451a-81cb-ca190c707420-kube-api-access-n6zfb\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.026939 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8f24c28-d790-451a-81cb-ca190c707420-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.489510 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.489505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f8f24c28-d790-451a-81cb-ca190c707420","Type":"ContainerDied","Data":"db95afd6f03d7907fd81e229e8f1363ea4395a01b419bf9f73307328959d98c0"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.490540 5030 scope.go:117] "RemoveContainer" containerID="26f2f0767268f4022645c573e1271916fd36287774ea788d59b21d7a72e29c72" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494332 5030 generic.go:334] "Generic (PLEG): container finished" podID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerID="66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b" exitCode=0 Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494382 5030 generic.go:334] "Generic (PLEG): container finished" podID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerID="524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6" exitCode=2 Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494399 5030 generic.go:334] "Generic (PLEG): container finished" podID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerID="dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3" exitCode=0 Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerDied","Data":"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerDied","Data":"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.494559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerDied","Data":"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.503477 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.503708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec","Type":"ContainerDied","Data":"d1dd3a351091a6fde3c7001ec8ccef6c44a7259d505d6772960c96a78e176aa8"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.506606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"49b27d4b-53fe-4dd1-ac75-a00a12c33655","Type":"ContainerDied","Data":"23e199514c7bf45d83f0e1e66b68fec71ef502374e169a5c073a8fb8ba0dd79e"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.506740 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.509293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d969fd5b-6579-4bc5-b64f-05bcb2aec29d","Type":"ContainerStarted","Data":"baf6a60a1b77f421596bd948659ecac15aa8955fa1ce8ad3378955ea3bd359da"} Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.509682 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.533234 5030 scope.go:117] "RemoveContainer" containerID="d815603d666eefed1fd3f6178b06f440e30f6b9e289186473e3e975fe4e7069e" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.538973 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.08598034 podStartE2EDuration="2.538957417s" podCreationTimestamp="2026-01-20 22:56:19 +0000 UTC" firstStartedPulling="2026-01-20 22:56:20.438466613 +0000 UTC m=+1252.758726901" lastFinishedPulling="2026-01-20 22:56:20.89144369 +0000 UTC m=+1253.211703978" observedRunningTime="2026-01-20 22:56:21.531898106 +0000 UTC m=+1253.852158394" watchObservedRunningTime="2026-01-20 22:56:21.538957417 +0000 UTC m=+1253.859217705" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.574605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.590578 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.603419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: E0120 22:56:21.603893 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f24c28-d790-451a-81cb-ca190c707420" containerName="nova-scheduler-scheduler" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.603907 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f24c28-d790-451a-81cb-ca190c707420" containerName="nova-scheduler-scheduler" Jan 20 22:56:21 crc kubenswrapper[5030]: E0120 22:56:21.603927 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-log" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.603936 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-log" Jan 20 22:56:21 crc kubenswrapper[5030]: E0120 22:56:21.603964 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-log" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.603973 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-log" Jan 20 22:56:21 crc kubenswrapper[5030]: E0120 22:56:21.603989 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-metadata" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.603999 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-metadata" Jan 20 22:56:21 crc kubenswrapper[5030]: E0120 22:56:21.604017 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-api" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604025 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-api" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-metadata" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604244 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-log" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604262 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" containerName="nova-metadata-log" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604280 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" containerName="nova-api-api" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.604300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f24c28-d790-451a-81cb-ca190c707420" containerName="nova-scheduler-scheduler" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.605476 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.608616 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.608925 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.631988 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.640135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.640198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.640228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.640254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbcx8\" (UniqueName: \"kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8\") pod 
\"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.640276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.661727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.678549 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.690727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.698381 5030 scope.go:117] "RemoveContainer" containerID="01ea3d4ddfa4cb62ecf8192f43cd81d916914e07e1acb715ab872268e62148f3" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.701145 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.716768 5030 scope.go:117] "RemoveContainer" containerID="2f80231e85030d0b524b044287053cf4535a6f353648dc9031d28703bdb28583" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.718086 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.724070 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.726814 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.742424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.742515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.742551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.742581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbcx8\" (UniqueName: \"kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 
22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.742610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.743350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.744731 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.746929 5030 scope.go:117] "RemoveContainer" containerID="66e9f3e88813ae293b39322d69fb0eb0235a26062b99fea75af34d4fb4f87df4" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.749299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.749578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.751486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.759306 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbcx8\" (UniqueName: \"kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8\") pod \"nova-metadata-0\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.764843 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.766206 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.768454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.776285 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrbh\" (UniqueName: \"kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxwl4\" (UniqueName: \"kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.843984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.844003 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.945417 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946169 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrbh\" (UniqueName: \"kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxwl4\" (UniqueName: \"kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.946916 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.950068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.950107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.951210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.953407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.961745 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.963532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrbh\" (UniqueName: \"kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh\") pod \"nova-api-0\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.964218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxwl4\" (UniqueName: \"kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4\") pod \"nova-scheduler-0\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.968237 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.972177 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b27d4b-53fe-4dd1-ac75-a00a12c33655" path="/var/lib/kubelet/pods/49b27d4b-53fe-4dd1-ac75-a00a12c33655/volumes" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.972790 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec" path="/var/lib/kubelet/pods/b92e4a98-2fb3-4c64-bf8a-4be41d3ab0ec/volumes" Jan 20 22:56:21 crc kubenswrapper[5030]: I0120 22:56:21.973332 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f24c28-d790-451a-81cb-ca190c707420" path="/var/lib/kubelet/pods/f8f24c28-d790-451a-81cb-ca190c707420/volumes" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.040258 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.047933 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfq8d\" (UniqueName: \"kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d\") pod \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.047980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle\") pod \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.048133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data\") pod \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.048207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts\") pod \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\" (UID: \"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35\") " Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.056849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts" (OuterVolumeSpecName: "scripts") pod "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" (UID: "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.056860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d" (OuterVolumeSpecName: "kube-api-access-rfq8d") pod "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" (UID: "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35"). InnerVolumeSpecName "kube-api-access-rfq8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.074797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data" (OuterVolumeSpecName: "config-data") pod "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" (UID: "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.077069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" (UID: "8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.143099 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.150783 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.150816 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.150826 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfq8d\" (UniqueName: \"kubernetes.io/projected/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-kube-api-access-rfq8d\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.150838 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.392247 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:56:22 crc kubenswrapper[5030]: W0120 22:56:22.394746 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48442f94_e2f3_4585_8918_ce420c8ceebd.slice/crio-db575e18d28561f032445cf5cc8a11854b8285f4aa3f310d786c884820f0fe23 WatchSource:0}: Error finding container db575e18d28561f032445cf5cc8a11854b8285f4aa3f310d786c884820f0fe23: Status 404 returned error can't find the container with id db575e18d28561f032445cf5cc8a11854b8285f4aa3f310d786c884820f0fe23 Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.494708 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:56:22 crc kubenswrapper[5030]: W0120 22:56:22.496882 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32782f79_2dae_4307_a6b0_2ffd14a5516c.slice/crio-04e8a238074ad47bd1b364102eaecf6891e816bdd8f968b68a125f8340fb587e WatchSource:0}: Error finding container 04e8a238074ad47bd1b364102eaecf6891e816bdd8f968b68a125f8340fb587e: Status 404 returned error can't find the container with id 04e8a238074ad47bd1b364102eaecf6891e816bdd8f968b68a125f8340fb587e Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.520395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"32782f79-2dae-4307-a6b0-2ffd14a5516c","Type":"ContainerStarted","Data":"04e8a238074ad47bd1b364102eaecf6891e816bdd8f968b68a125f8340fb587e"} Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.521428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerStarted","Data":"db575e18d28561f032445cf5cc8a11854b8285f4aa3f310d786c884820f0fe23"} Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.529455 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.529706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns" event={"ID":"8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35","Type":"ContainerDied","Data":"ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500"} Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.529738 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea32e81053be2faecfa8578187e283c4ac34fca4fbdc258597741f84f3396500" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.570556 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:56:22 crc kubenswrapper[5030]: E0120 22:56:22.571073 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" containerName="nova-cell1-conductor-db-sync" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.571091 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" containerName="nova-cell1-conductor-db-sync" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.571284 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" containerName="nova-cell1-conductor-db-sync" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.571909 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.573708 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.606282 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.626593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.661514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwjxf\" (UniqueName: \"kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.661557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.661580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.763753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.763798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.763941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwjxf\" (UniqueName: \"kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.781398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.786172 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.789723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwjxf\" (UniqueName: \"kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf\") pod \"nova-cell1-conductor-0\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:22 crc kubenswrapper[5030]: I0120 22:56:22.918043 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.108889 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169064 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbkm7\" (UniqueName: \"kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.169955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.170060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.170122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.170924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.171993 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.172010 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.181788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7" (OuterVolumeSpecName: "kube-api-access-tbkm7") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "kube-api-access-tbkm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.195099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts" (OuterVolumeSpecName: "scripts") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.237360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.272636 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") pod \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\" (UID: \"59af2e95-cfa7-42b0-8d75-34d72fc69e7a\") " Jan 20 22:56:23 crc kubenswrapper[5030]: W0120 22:56:23.273351 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/59af2e95-cfa7-42b0-8d75-34d72fc69e7a/volumes/kubernetes.io~secret/combined-ca-bundle Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273854 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273886 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273899 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbkm7\" (UniqueName: \"kubernetes.io/projected/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-kube-api-access-tbkm7\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.273914 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.293781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data" (OuterVolumeSpecName: "config-data") pod "59af2e95-cfa7-42b0-8d75-34d72fc69e7a" (UID: "59af2e95-cfa7-42b0-8d75-34d72fc69e7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.375158 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59af2e95-cfa7-42b0-8d75-34d72fc69e7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.403431 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.539830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f0fffc32-86e9-4278-8c2d-37a04bbd89b1","Type":"ContainerStarted","Data":"1a42cc36f57f686bb515eac375c4829e1b50f352c61784a231a964fd7d816ac5"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.543092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerStarted","Data":"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.543129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerStarted","Data":"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.543141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerStarted","Data":"2830148096e3ec6bd3e07daf094b59b78eff2478c18e37dbded374588308efa6"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.544799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"32782f79-2dae-4307-a6b0-2ffd14a5516c","Type":"ContainerStarted","Data":"590e23862d0316a50a051bcb1d10f8f68ac4cffea79384d48d8e2cbffd0c8f39"} Jan 
20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.548356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerStarted","Data":"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.548400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerStarted","Data":"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.553291 5030 generic.go:334] "Generic (PLEG): container finished" podID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerID="db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658" exitCode=0 Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.553331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerDied","Data":"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.553373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"59af2e95-cfa7-42b0-8d75-34d72fc69e7a","Type":"ContainerDied","Data":"b9af4c8f14fae1cf5d7741d64d8c6905fa32b08b1c180d6a842ed2866db1107a"} Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.553397 5030 scope.go:117] "RemoveContainer" containerID="66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.553397 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.570210 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.570191945 podStartE2EDuration="2.570191945s" podCreationTimestamp="2026-01-20 22:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:23.563902892 +0000 UTC m=+1255.884163190" watchObservedRunningTime="2026-01-20 22:56:23.570191945 +0000 UTC m=+1255.890452233" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.603498 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.603461503 podStartE2EDuration="2.603461503s" podCreationTimestamp="2026-01-20 22:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:23.585863275 +0000 UTC m=+1255.906123573" watchObservedRunningTime="2026-01-20 22:56:23.603461503 +0000 UTC m=+1255.923721791" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.627302 5030 scope.go:117] "RemoveContainer" containerID="524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.636427 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.636405332 podStartE2EDuration="2.636405332s" podCreationTimestamp="2026-01-20 22:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:23.608484014 +0000 UTC m=+1255.928744292" watchObservedRunningTime="2026-01-20 22:56:23.636405332 +0000 UTC m=+1255.956665620" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.658144 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.668702 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.670499 5030 scope.go:117] "RemoveContainer" containerID="db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.675768 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.676163 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="proxy-httpd" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="proxy-httpd" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.676307 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-central-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676358 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-central-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.676415 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-notification-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676463 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-notification-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.676513 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="sg-core" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676559 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="sg-core" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676768 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-central-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676828 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="proxy-httpd" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="sg-core" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.676955 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" containerName="ceilometer-notification-agent" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.678464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.681300 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.681561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.681817 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.691298 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.703258 5030 scope.go:117] "RemoveContainer" containerID="dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.723973 5030 scope.go:117] "RemoveContainer" containerID="66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.724474 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b\": container with ID starting with 66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b not found: ID does not exist" containerID="66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.724521 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b"} err="failed to get container status \"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b\": rpc error: 
code = NotFound desc = could not find container \"66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b\": container with ID starting with 66e613445163cacf379c54655d656d7966e7f2f798a145995652a776c308b64b not found: ID does not exist" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.724547 5030 scope.go:117] "RemoveContainer" containerID="524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.725013 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6\": container with ID starting with 524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6 not found: ID does not exist" containerID="524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.725065 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6"} err="failed to get container status \"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6\": rpc error: code = NotFound desc = could not find container \"524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6\": container with ID starting with 524a2dcd4b1b17b23f66eb9c3310b38a02ebf8073b8c313772bf6ecbf19036c6 not found: ID does not exist" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.725097 5030 scope.go:117] "RemoveContainer" containerID="db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.725442 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658\": container with ID starting with db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658 not found: ID does not exist" containerID="db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.725467 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658"} err="failed to get container status \"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658\": rpc error: code = NotFound desc = could not find container \"db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658\": container with ID starting with db36e8d5253b1fb31f53685e50ba7918126cbd226416ba245bee2cd9a58f5658 not found: ID does not exist" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.725484 5030 scope.go:117] "RemoveContainer" containerID="dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3" Jan 20 22:56:23 crc kubenswrapper[5030]: E0120 22:56:23.725959 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3\": container with ID starting with dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3 not found: ID does not exist" containerID="dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.725991 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3"} err="failed to get container status \"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3\": rpc error: code = NotFound desc = could not find container \"dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3\": container with ID starting with dd5a6003b17772b156ee9be14fba2bb196106ed5d5cbc7d271a1d0a6d753b1b3 not found: ID does not exist" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddr2\" (UniqueName: \"kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.782765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" 
Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.885581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.886149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.886451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddr2\" (UniqueName: \"kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.886473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.887008 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.887224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.887364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.887591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.888368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.890258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data\") pod \"ceilometer-0\" (UID: 
\"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.890534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.892052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.892341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.894450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.895135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.911929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddr2\" (UniqueName: \"kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2\") pod \"ceilometer-0\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.977000 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59af2e95-cfa7-42b0-8d75-34d72fc69e7a" path="/var/lib/kubelet/pods/59af2e95-cfa7-42b0-8d75-34d72fc69e7a/volumes" Jan 20 22:56:23 crc kubenswrapper[5030]: I0120 22:56:23.997807 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:24 crc kubenswrapper[5030]: I0120 22:56:24.463022 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:24 crc kubenswrapper[5030]: I0120 22:56:24.566065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f0fffc32-86e9-4278-8c2d-37a04bbd89b1","Type":"ContainerStarted","Data":"efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4"} Jan 20 22:56:24 crc kubenswrapper[5030]: I0120 22:56:24.567056 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:24 crc kubenswrapper[5030]: I0120 22:56:24.579530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerStarted","Data":"e0ebf7b8db72188ee35516c2b8afd2bb33c062c782d5ae925de7133ae3504507"} Jan 20 22:56:24 crc kubenswrapper[5030]: I0120 22:56:24.597881 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.597862761 podStartE2EDuration="2.597862761s" podCreationTimestamp="2026-01-20 22:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:24.592358338 +0000 UTC m=+1256.912618616" watchObservedRunningTime="2026-01-20 22:56:24.597862761 +0000 UTC m=+1256.918123059" Jan 20 22:56:25 crc kubenswrapper[5030]: I0120 22:56:25.594650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerStarted","Data":"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075"} Jan 20 22:56:26 crc kubenswrapper[5030]: I0120 22:56:26.609049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerStarted","Data":"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2"} Jan 20 22:56:26 crc kubenswrapper[5030]: I0120 22:56:26.961937 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:26 crc kubenswrapper[5030]: I0120 22:56:26.962146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:27 crc kubenswrapper[5030]: I0120 22:56:27.041426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:27 crc kubenswrapper[5030]: I0120 22:56:27.618036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerStarted","Data":"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054"} Jan 20 22:56:29 crc kubenswrapper[5030]: I0120 22:56:29.643946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerStarted","Data":"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192"} Jan 20 22:56:29 crc kubenswrapper[5030]: I0120 22:56:29.645534 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:29 crc kubenswrapper[5030]: I0120 22:56:29.683207 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.197512008 podStartE2EDuration="6.683184746s" podCreationTimestamp="2026-01-20 22:56:23 +0000 UTC" firstStartedPulling="2026-01-20 22:56:24.477855218 +0000 UTC m=+1256.798115516" lastFinishedPulling="2026-01-20 22:56:28.963527926 +0000 UTC m=+1261.283788254" observedRunningTime="2026-01-20 22:56:29.674037703 +0000 UTC m=+1261.994298031" watchObservedRunningTime="2026-01-20 22:56:29.683184746 +0000 UTC m=+1262.003445044" Jan 20 22:56:29 crc kubenswrapper[5030]: I0120 22:56:29.986220 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:56:31 crc kubenswrapper[5030]: I0120 22:56:31.982200 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:31 crc kubenswrapper[5030]: I0120 22:56:31.982565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.041038 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.081864 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.147133 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.147271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.723427 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.968699 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.972835 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:56:32 crc kubenswrapper[5030]: I0120 22:56:32.972949 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 22:56:33 crc kubenswrapper[5030]: I0120 22:56:33.228137 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 22:56:33 crc kubenswrapper[5030]: I0120 22:56:33.228361 5030 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 22:56:41 crc kubenswrapper[5030]: I0120 22:56:41.975109 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:41 crc kubenswrapper[5030]: I0120 22:56:41.976948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:41 crc kubenswrapper[5030]: I0120 22:56:41.981318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.149352 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.149849 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.150577 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.153215 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.793845 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.799090 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:42 crc kubenswrapper[5030]: I0120 22:56:42.803921 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.788415 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.815253 5030 generic.go:334] "Generic (PLEG): container finished" podID="cbda6086-3121-417a-ab80-989c2641c87a" containerID="6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969" exitCode=137 Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.815336 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.815391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"cbda6086-3121-417a-ab80-989c2641c87a","Type":"ContainerDied","Data":"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969"} Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.815424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"cbda6086-3121-417a-ab80-989c2641c87a","Type":"ContainerDied","Data":"1c03776b3466a82fe19d33911416515f4704b16b1b6d6a92e6245de72419fccb"} Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.815441 5030 scope.go:117] "RemoveContainer" containerID="6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.835489 5030 scope.go:117] "RemoveContainer" containerID="6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969" Jan 20 22:56:44 crc kubenswrapper[5030]: E0120 22:56:44.835974 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969\": container with ID starting with 6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969 not found: ID does not exist" containerID="6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.836036 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969"} err="failed to get container status \"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969\": rpc error: code = NotFound desc = could not find container \"6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969\": container with ID starting with 6e0d50786fce5ea42ef27af39889f471118646fa9e63d66040f431ee9ea9a969 not found: ID does not exist" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.845727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.846578 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-central-agent" containerID="cri-o://688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075" gracePeriod=30 Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.846719 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="proxy-httpd" containerID="cri-o://de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192" gracePeriod=30 Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.846823 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="sg-core" containerID="cri-o://b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054" gracePeriod=30 Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.846847 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" 
podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-notification-agent" containerID="cri-o://aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2" gracePeriod=30 Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.858373 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.179:3000/\": EOF" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.867561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjt2c\" (UniqueName: \"kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c\") pod \"cbda6086-3121-417a-ab80-989c2641c87a\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.867728 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data\") pod \"cbda6086-3121-417a-ab80-989c2641c87a\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.867876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle\") pod \"cbda6086-3121-417a-ab80-989c2641c87a\" (UID: \"cbda6086-3121-417a-ab80-989c2641c87a\") " Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.878141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c" (OuterVolumeSpecName: "kube-api-access-gjt2c") pod "cbda6086-3121-417a-ab80-989c2641c87a" (UID: "cbda6086-3121-417a-ab80-989c2641c87a"). InnerVolumeSpecName "kube-api-access-gjt2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.895155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data" (OuterVolumeSpecName: "config-data") pod "cbda6086-3121-417a-ab80-989c2641c87a" (UID: "cbda6086-3121-417a-ab80-989c2641c87a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.904304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbda6086-3121-417a-ab80-989c2641c87a" (UID: "cbda6086-3121-417a-ab80-989c2641c87a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.969892 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.969927 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbda6086-3121-417a-ab80-989c2641c87a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:44 crc kubenswrapper[5030]: I0120 22:56:44.969938 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjt2c\" (UniqueName: \"kubernetes.io/projected/cbda6086-3121-417a-ab80-989c2641c87a-kube-api-access-gjt2c\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.151989 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.169829 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.200204 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:45 crc kubenswrapper[5030]: E0120 22:56:45.200594 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbda6086-3121-417a-ab80-989c2641c87a" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.200614 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbda6086-3121-417a-ab80-989c2641c87a" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.201726 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbda6086-3121-417a-ab80-989c2641c87a" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.202452 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.207347 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.207997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.208909 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.215033 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.275606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.275723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.275836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.275863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.275911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk9jg\" (UniqueName: \"kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.377860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.377982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.378015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.378079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk9jg\" (UniqueName: \"kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.378179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.386799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.387344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.387895 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.388644 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.402307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk9jg\" (UniqueName: \"kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg\") pod \"nova-cell1-novncproxy-0\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.519105 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.829526 5030 generic.go:334] "Generic (PLEG): container finished" podID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerID="de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192" exitCode=0 Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.829934 5030 generic.go:334] "Generic (PLEG): container finished" podID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerID="b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054" exitCode=2 Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.829958 5030 generic.go:334] "Generic (PLEG): container finished" podID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerID="688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075" exitCode=0 Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.829789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerDied","Data":"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192"} Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.830061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerDied","Data":"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054"} Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.830086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerDied","Data":"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075"} Jan 20 22:56:45 crc kubenswrapper[5030]: I0120 22:56:45.943600 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.013004 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbda6086-3121-417a-ab80-989c2641c87a" path="/var/lib/kubelet/pods/cbda6086-3121-417a-ab80-989c2641c87a/volumes" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.023793 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.024106 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-api" containerID="cri-o://e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861" gracePeriod=30 Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.024106 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-log" containerID="cri-o://d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3" gracePeriod=30 Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.837512 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.840869 5030 generic.go:334] "Generic (PLEG): container finished" podID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerID="d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3" exitCode=143 Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.840971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerDied","Data":"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3"} Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.842565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a199be04-e085-49d3-a376-abdf54773b99","Type":"ContainerStarted","Data":"19f231d19d885aa27ab7782d5877fea4392fd2ffd88a91c7505026e5f5c450f4"} Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.842613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a199be04-e085-49d3-a376-abdf54773b99","Type":"ContainerStarted","Data":"0e6b4ca7705812c489ba0d949368d954796ef3ea85d194df9bb90c9a032ec40f"} Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.845328 5030 generic.go:334] "Generic (PLEG): container finished" podID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerID="aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2" exitCode=0 Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.845411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerDied","Data":"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2"} Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.845445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"497735a2-eb2c-4cd4-b475-831e9d1fcdc3","Type":"ContainerDied","Data":"e0ebf7b8db72188ee35516c2b8afd2bb33c062c782d5ae925de7133ae3504507"} Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.845495 5030 scope.go:117] "RemoveContainer" containerID="de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.845413 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.865929 5030 scope.go:117] "RemoveContainer" containerID="b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.904363 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.9043427309999998 podStartE2EDuration="1.904342731s" podCreationTimestamp="2026-01-20 22:56:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:46.897728071 +0000 UTC m=+1279.217988359" watchObservedRunningTime="2026-01-20 22:56:46.904342731 +0000 UTC m=+1279.224603019" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddr2\" (UniqueName: \"kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2\") pod \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd\") pod 
\"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\" (UID: \"497735a2-eb2c-4cd4-b475-831e9d1fcdc3\") " Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.917935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.918613 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.920608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.925068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2" (OuterVolumeSpecName: "kube-api-access-tddr2") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "kube-api-access-tddr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.928808 5030 scope.go:117] "RemoveContainer" containerID="aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.935948 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts" (OuterVolumeSpecName: "scripts") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.949263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:46 crc kubenswrapper[5030]: I0120 22:56:46.978819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.002718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.020932 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.020983 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.021000 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.021011 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.021025 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tddr2\" (UniqueName: \"kubernetes.io/projected/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-kube-api-access-tddr2\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.021037 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.033953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data" (OuterVolumeSpecName: "config-data") pod "497735a2-eb2c-4cd4-b475-831e9d1fcdc3" (UID: "497735a2-eb2c-4cd4-b475-831e9d1fcdc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.042383 5030 scope.go:117] "RemoveContainer" containerID="688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.060350 5030 scope.go:117] "RemoveContainer" containerID="de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.061066 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192\": container with ID starting with de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192 not found: ID does not exist" containerID="de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.061130 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192"} err="failed to get container status \"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192\": rpc error: code = NotFound desc = could not find container \"de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192\": container with ID starting with de811efa46ab4fc85c062a9304f0749e0aae672f056aed2b67d7987cbd434192 not found: ID does not exist" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.061154 5030 scope.go:117] "RemoveContainer" containerID="b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.061552 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054\": container with ID starting with b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054 not found: ID does not exist" containerID="b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.061671 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054"} err="failed to get container status \"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054\": rpc error: code = NotFound desc = could not find container \"b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054\": container with ID starting with b78446ce7c295303e7237184bfd4f2d4b59b098e6bcc0750bd3d374f463c8054 not found: ID does not exist" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.061779 5030 scope.go:117] "RemoveContainer" containerID="aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.062168 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2\": container with ID starting with aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2 not found: ID does not exist" containerID="aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.062196 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2"} err="failed to get container status \"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2\": rpc error: code = NotFound desc = could not find container \"aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2\": container with ID starting with aed59368615779705dc94e1dadcd97da030104ed67118fbb559a2c9176b807e2 not found: ID does not exist" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.062214 5030 scope.go:117] "RemoveContainer" containerID="688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.062429 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075\": container with ID starting with 688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075 not found: ID does not exist" containerID="688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.062452 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075"} err="failed to get container status \"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075\": rpc error: code = NotFound desc = could not find container \"688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075\": container with ID starting with 688e9992dba81cee14e33d6f8669f2e3c6fc5e7c56db5b3e6734e7ecb946e075 not found: ID does not exist" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.122190 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497735a2-eb2c-4cd4-b475-831e9d1fcdc3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.182802 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.194723 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.207647 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.208851 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="sg-core" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.208876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="sg-core" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.208895 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="proxy-httpd" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.208903 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="proxy-httpd" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.208923 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-notification-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.208934 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-notification-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: E0120 22:56:47.208951 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-central-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.208968 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-central-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.209394 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="proxy-httpd" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.209417 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-notification-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.209429 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="ceilometer-central-agent" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.209465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" containerName="sg-core" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.217003 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.222521 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.223081 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.223303 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.266804 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.335908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5r7g\" (UniqueName: \"kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336570 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.336650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.437963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5r7g\" (UniqueName: \"kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438122 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438216 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.438938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.442849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.443572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.446232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.446546 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.447649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.458018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5r7g\" (UniqueName: \"kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g\") pod \"ceilometer-0\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.557641 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:47 crc kubenswrapper[5030]: I0120 22:56:47.981986 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497735a2-eb2c-4cd4-b475-831e9d1fcdc3" path="/var/lib/kubelet/pods/497735a2-eb2c-4cd4-b475-831e9d1fcdc3/volumes" Jan 20 22:56:48 crc kubenswrapper[5030]: I0120 22:56:48.026595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:48 crc kubenswrapper[5030]: W0120 22:56:48.030515 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940fda36_90a2_4de3_b784_8675539485bb.slice/crio-5a21c71afa0218394ce96032f76495a347c6ccae912e9f7c74604ed302202182 WatchSource:0}: Error finding container 5a21c71afa0218394ce96032f76495a347c6ccae912e9f7c74604ed302202182: Status 404 returned error can't find the container with id 5a21c71afa0218394ce96032f76495a347c6ccae912e9f7c74604ed302202182 Jan 20 22:56:48 crc kubenswrapper[5030]: I0120 22:56:48.601528 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:48 crc kubenswrapper[5030]: I0120 22:56:48.860676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerStarted","Data":"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672"} Jan 20 22:56:48 crc kubenswrapper[5030]: I0120 22:56:48.860902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerStarted","Data":"5a21c71afa0218394ce96032f76495a347c6ccae912e9f7c74604ed302202182"} Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.715035 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.868939 5030 generic.go:334] "Generic (PLEG): container finished" podID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerID="e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861" exitCode=0 Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.868996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerDied","Data":"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861"} Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.869022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"57e1c922-638b-45a3-8cb0-7bbd0001bff1","Type":"ContainerDied","Data":"2830148096e3ec6bd3e07daf094b59b78eff2478c18e37dbded374588308efa6"} Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.869041 5030 scope.go:117] "RemoveContainer" containerID="e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.869038 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.871791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerStarted","Data":"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105"} Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.885980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data\") pod \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.886049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jrbh\" (UniqueName: \"kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh\") pod \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.886197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle\") pod \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.886216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs\") pod \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\" (UID: \"57e1c922-638b-45a3-8cb0-7bbd0001bff1\") " Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.886584 5030 scope.go:117] "RemoveContainer" containerID="d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.887309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs" (OuterVolumeSpecName: "logs") pod "57e1c922-638b-45a3-8cb0-7bbd0001bff1" (UID: "57e1c922-638b-45a3-8cb0-7bbd0001bff1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.890453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh" (OuterVolumeSpecName: "kube-api-access-4jrbh") pod "57e1c922-638b-45a3-8cb0-7bbd0001bff1" (UID: "57e1c922-638b-45a3-8cb0-7bbd0001bff1"). InnerVolumeSpecName "kube-api-access-4jrbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.915386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data" (OuterVolumeSpecName: "config-data") pod "57e1c922-638b-45a3-8cb0-7bbd0001bff1" (UID: "57e1c922-638b-45a3-8cb0-7bbd0001bff1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.921576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57e1c922-638b-45a3-8cb0-7bbd0001bff1" (UID: "57e1c922-638b-45a3-8cb0-7bbd0001bff1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.937441 5030 scope.go:117] "RemoveContainer" containerID="e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861" Jan 20 22:56:49 crc kubenswrapper[5030]: E0120 22:56:49.937995 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861\": container with ID starting with e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861 not found: ID does not exist" containerID="e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.938050 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861"} err="failed to get container status \"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861\": rpc error: code = NotFound desc = could not find container \"e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861\": container with ID starting with e8fa0730d8388c812c23489ee334a55e6b4cf576e5e0f018427442c8ca0e3861 not found: ID does not exist" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.938075 5030 scope.go:117] "RemoveContainer" containerID="d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3" Jan 20 22:56:49 crc kubenswrapper[5030]: E0120 22:56:49.938370 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3\": container with ID starting with d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3 not found: ID does not exist" containerID="d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.938402 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3"} err="failed to get container status 
\"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3\": rpc error: code = NotFound desc = could not find container \"d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3\": container with ID starting with d57e9a596a0fb434f8d26f160376b517153fff7eaf795c7c50d57ed0779d15e3 not found: ID does not exist" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.988979 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.989027 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57e1c922-638b-45a3-8cb0-7bbd0001bff1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.989042 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e1c922-638b-45a3-8cb0-7bbd0001bff1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:49 crc kubenswrapper[5030]: I0120 22:56:49.989054 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jrbh\" (UniqueName: \"kubernetes.io/projected/57e1c922-638b-45a3-8cb0-7bbd0001bff1-kube-api-access-4jrbh\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.187337 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.208275 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.241816 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:50 crc kubenswrapper[5030]: E0120 22:56:50.242331 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-log" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.242372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-log" Jan 20 22:56:50 crc kubenswrapper[5030]: E0120 22:56:50.242392 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-api" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.242400 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-api" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.242639 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-api" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.242665 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" containerName="nova-api-log" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.244011 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.245953 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.246007 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.246828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.258450 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397632 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htv9\" (UniqueName: \"kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.397870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htv9\" (UniqueName: \"kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.499531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.503382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.505078 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.506779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.515437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.517895 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htv9\" (UniqueName: \"kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9\") pod \"nova-api-0\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.519613 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.563608 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:56:50 crc kubenswrapper[5030]: I0120 22:56:50.881074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerStarted","Data":"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0"} Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.045392 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:56:51 crc kubenswrapper[5030]: W0120 22:56:51.046308 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef537024_2950_4510_97c8_50b40e685989.slice/crio-ce954749df96b29e368fc2326a2927cf970486950a55da111f09f48e30eb1108 WatchSource:0}: Error finding container ce954749df96b29e368fc2326a2927cf970486950a55da111f09f48e30eb1108: Status 404 returned error can't find the container with id ce954749df96b29e368fc2326a2927cf970486950a55da111f09f48e30eb1108 Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.889404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerStarted","Data":"6ff87555539f5cf94dc0163f52d60833f852402cd8a43a14a4c802eb7604386a"} Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.889795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerStarted","Data":"ee57a3203c76267f5d0f9739dbc9009d8650bf4e73459122ea3c5f845e1212ed"} Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.889818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerStarted","Data":"ce954749df96b29e368fc2326a2927cf970486950a55da111f09f48e30eb1108"} Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.919247 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.919226076 podStartE2EDuration="1.919226076s" podCreationTimestamp="2026-01-20 22:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:51.904073348 +0000 UTC m=+1284.224333636" watchObservedRunningTime="2026-01-20 22:56:51.919226076 +0000 UTC m=+1284.239486364" Jan 20 22:56:51 crc kubenswrapper[5030]: I0120 22:56:51.972306 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e1c922-638b-45a3-8cb0-7bbd0001bff1" path="/var/lib/kubelet/pods/57e1c922-638b-45a3-8cb0-7bbd0001bff1/volumes" Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.913540 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-central-agent" containerID="cri-o://f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672" gracePeriod=30 Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.913965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerStarted","Data":"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5"} Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.914026 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.914337 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="proxy-httpd" containerID="cri-o://5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5" gracePeriod=30 Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.914417 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="sg-core" containerID="cri-o://cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0" gracePeriod=30 Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.914473 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-notification-agent" containerID="cri-o://46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105" gracePeriod=30 Jan 20 22:56:52 crc kubenswrapper[5030]: I0120 22:56:52.944717 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.31293953 podStartE2EDuration="5.944699229s" podCreationTimestamp="2026-01-20 22:56:47 +0000 UTC" firstStartedPulling="2026-01-20 22:56:48.033770098 +0000 UTC m=+1280.354030386" lastFinishedPulling="2026-01-20 22:56:51.665529797 +0000 UTC m=+1283.985790085" observedRunningTime="2026-01-20 22:56:52.942931805 +0000 UTC m=+1285.263192123" watchObservedRunningTime="2026-01-20 22:56:52.944699229 +0000 UTC m=+1285.264959527" Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.926925 5030 generic.go:334] "Generic (PLEG): container finished" podID="940fda36-90a2-4de3-b784-8675539485bb" containerID="5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5" exitCode=0 Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.927333 5030 generic.go:334] "Generic (PLEG): container finished" podID="940fda36-90a2-4de3-b784-8675539485bb" containerID="cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0" exitCode=2 Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.927353 5030 generic.go:334] "Generic (PLEG): container finished" podID="940fda36-90a2-4de3-b784-8675539485bb" containerID="46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105" exitCode=0 Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.927005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerDied","Data":"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5"} Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.927409 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerDied","Data":"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0"} Jan 20 22:56:53 crc kubenswrapper[5030]: I0120 22:56:53.927433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerDied","Data":"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105"} Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.520367 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.543416 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.711978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810139 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810378 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5r7g\" (UniqueName: \"kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: 
\"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.810563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd\") pod \"940fda36-90a2-4de3-b784-8675539485bb\" (UID: \"940fda36-90a2-4de3-b784-8675539485bb\") " Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.811527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.811558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.816609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g" (OuterVolumeSpecName: "kube-api-access-f5r7g") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "kube-api-access-f5r7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.816930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts" (OuterVolumeSpecName: "scripts") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.842750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.888903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.905905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.913497 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.913745 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.913916 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.914046 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.914174 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5r7g\" (UniqueName: \"kubernetes.io/projected/940fda36-90a2-4de3-b784-8675539485bb-kube-api-access-f5r7g\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.914293 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.914409 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/940fda36-90a2-4de3-b784-8675539485bb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.941922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data" (OuterVolumeSpecName: "config-data") pod "940fda36-90a2-4de3-b784-8675539485bb" (UID: "940fda36-90a2-4de3-b784-8675539485bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.949949 5030 generic.go:334] "Generic (PLEG): container finished" podID="940fda36-90a2-4de3-b784-8675539485bb" containerID="f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672" exitCode=0 Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.950027 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.950085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerDied","Data":"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672"} Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.950167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"940fda36-90a2-4de3-b784-8675539485bb","Type":"ContainerDied","Data":"5a21c71afa0218394ce96032f76495a347c6ccae912e9f7c74604ed302202182"} Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.950278 5030 scope.go:117] "RemoveContainer" containerID="5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.974550 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.974938 5030 scope.go:117] "RemoveContainer" containerID="cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0" Jan 20 22:56:55 crc kubenswrapper[5030]: I0120 22:56:55.992243 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.002589 5030 scope.go:117] "RemoveContainer" containerID="46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.007736 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.015862 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/940fda36-90a2-4de3-b784-8675539485bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.036736 5030 scope.go:117] "RemoveContainer" containerID="f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.049648 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.050298 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="proxy-httpd" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050322 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="proxy-httpd" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.050342 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-notification-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-notification-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.050378 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="sg-core" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050387 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="sg-core" Jan 20 22:56:56 crc 
kubenswrapper[5030]: E0120 22:56:56.050416 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-central-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050426 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-central-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050701 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-central-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050724 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="sg-core" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050750 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="ceilometer-notification-agent" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.050766 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="940fda36-90a2-4de3-b784-8675539485bb" containerName="proxy-httpd" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.052806 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.059451 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.059706 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.059850 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.066517 5030 scope.go:117] "RemoveContainer" containerID="5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.073861 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5\": container with ID starting with 5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5 not found: ID does not exist" containerID="5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.073911 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5"} err="failed to get container status \"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5\": rpc error: code = NotFound desc = could not find container \"5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5\": container with ID starting with 5d9a514440e09f143be73d50e175015b9fc59486b6e9b6338a3738e563b16aa5 not found: ID does not exist" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.073949 5030 scope.go:117] "RemoveContainer" containerID="cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.075074 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0\": container with ID starting with cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0 not found: ID does not exist" containerID="cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.075115 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0"} err="failed to get container status \"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0\": rpc error: code = NotFound desc = could not find container \"cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0\": container with ID starting with cadfc25eb7872c669a4f97cd5eb7a4e22d0fee1a322d328fdf649889711e94e0 not found: ID does not exist" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.075134 5030 scope.go:117] "RemoveContainer" containerID="46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.078223 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105\": container with ID starting with 46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105 not found: ID does not exist" containerID="46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.078274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105"} err="failed to get container status \"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105\": rpc error: code = NotFound desc = could not find container \"46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105\": container with ID starting with 46ec56b56209da1511d7e8641129b9abead6a71089fb75a7800bb992f8f5e105 not found: ID does not exist" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.078307 5030 scope.go:117] "RemoveContainer" containerID="f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672" Jan 20 22:56:56 crc kubenswrapper[5030]: E0120 22:56:56.078802 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672\": container with ID starting with f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672 not found: ID does not exist" containerID="f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.078854 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672"} err="failed to get container status \"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672\": rpc error: code = NotFound desc = could not find container \"f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672\": container with ID starting with f6c5b3416469033a9078104f6e8dd0e9de9a0b471faa620f74e3b12ef9a17672 not found: ID does not exist" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.087558 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 
22:56:56.187307 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.188470 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.191057 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.193027 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.220711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.220777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.220968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.221067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.221234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.221296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.221398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.221467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fkp57\" (UniqueName: \"kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.241535 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.322967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323132 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkp57\" (UniqueName: \"kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323594 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr62b\" (UniqueName: \"kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.323730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.327588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.328745 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.329219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.329847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.330742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.342212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkp57\" (UniqueName: \"kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57\") pod \"ceilometer-0\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.380893 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.425046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.425556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.425738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr62b\" (UniqueName: \"kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.425931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.429768 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.429823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.431593 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.441541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr62b\" (UniqueName: \"kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b\") pod \"nova-cell1-cell-mapping-4fgrc\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.512269 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:56:56 crc kubenswrapper[5030]: W0120 22:56:56.832461 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ddc1fbc_a41d_446b_81e3_a15febd39285.slice/crio-e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1 WatchSource:0}: Error finding container e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1: Status 404 returned error can't find the container with id e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1 Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.841009 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.964987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerStarted","Data":"e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1"} Jan 20 22:56:56 crc kubenswrapper[5030]: I0120 22:56:56.966128 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc"] Jan 20 22:56:56 crc kubenswrapper[5030]: W0120 22:56:56.968915 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod808972eb_d558_43db_9cd9_ce9315b4dd16.slice/crio-24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188 WatchSource:0}: Error finding container 24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188: Status 404 returned error can't find the container with id 24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188 Jan 20 22:56:57 crc kubenswrapper[5030]: I0120 22:56:57.981471 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940fda36-90a2-4de3-b784-8675539485bb" path="/var/lib/kubelet/pods/940fda36-90a2-4de3-b784-8675539485bb/volumes" Jan 20 22:56:57 crc kubenswrapper[5030]: I0120 22:56:57.982972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerStarted","Data":"a4a6c07a940fa33f24511acd3d3cfb359f4abd81b4a5617e7de7f6743af38fe9"} Jan 20 22:56:57 crc kubenswrapper[5030]: I0120 22:56:57.983005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" event={"ID":"808972eb-d558-43db-9cd9-ce9315b4dd16","Type":"ContainerStarted","Data":"de1ddfa2e63339299e0dc0bec48a287474e36a02e58e7951ddee8179a1a548a7"} Jan 20 
22:56:57 crc kubenswrapper[5030]: I0120 22:56:57.983032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" event={"ID":"808972eb-d558-43db-9cd9-ce9315b4dd16","Type":"ContainerStarted","Data":"24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188"} Jan 20 22:56:58 crc kubenswrapper[5030]: I0120 22:56:58.047583 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" podStartSLOduration=2.047562879 podStartE2EDuration="2.047562879s" podCreationTimestamp="2026-01-20 22:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:56:58.038559981 +0000 UTC m=+1290.358820269" watchObservedRunningTime="2026-01-20 22:56:58.047562879 +0000 UTC m=+1290.367823177" Jan 20 22:56:58 crc kubenswrapper[5030]: I0120 22:56:58.988749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerStarted","Data":"be006ffb3db61a321e993c269bb9ae3ea0ca63c388bc27053d0c190cbf6073af"} Jan 20 22:56:58 crc kubenswrapper[5030]: I0120 22:56:58.989079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerStarted","Data":"893f6987a6df5b3331fffed8a8ea0ba6aec4842601c274d8f1e560748fa5ffd8"} Jan 20 22:57:00 crc kubenswrapper[5030]: I0120 22:57:00.564875 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:00 crc kubenswrapper[5030]: I0120 22:57:00.565231 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:01 crc kubenswrapper[5030]: I0120 22:57:01.012776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerStarted","Data":"63057bd1179d40b3071a3468cfba935dac00fdf9cf0809c23272ed79c89a9eb3"} Jan 20 22:57:01 crc kubenswrapper[5030]: I0120 22:57:01.013220 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:57:01 crc kubenswrapper[5030]: I0120 22:57:01.046498 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.754605947 podStartE2EDuration="6.046477846s" podCreationTimestamp="2026-01-20 22:56:55 +0000 UTC" firstStartedPulling="2026-01-20 22:56:56.836421539 +0000 UTC m=+1289.156681827" lastFinishedPulling="2026-01-20 22:57:00.128293428 +0000 UTC m=+1292.448553726" observedRunningTime="2026-01-20 22:57:01.037119739 +0000 UTC m=+1293.357380027" watchObservedRunningTime="2026-01-20 22:57:01.046477846 +0000 UTC m=+1293.366738134" Jan 20 22:57:01 crc kubenswrapper[5030]: I0120 22:57:01.583932 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:01 crc kubenswrapper[5030]: I0120 22:57:01.583950 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.182:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:02 crc kubenswrapper[5030]: I0120 22:57:02.025200 5030 generic.go:334] "Generic (PLEG): container finished" podID="808972eb-d558-43db-9cd9-ce9315b4dd16" containerID="de1ddfa2e63339299e0dc0bec48a287474e36a02e58e7951ddee8179a1a548a7" exitCode=0 Jan 20 22:57:02 crc kubenswrapper[5030]: I0120 22:57:02.026613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" event={"ID":"808972eb-d558-43db-9cd9-ce9315b4dd16","Type":"ContainerDied","Data":"de1ddfa2e63339299e0dc0bec48a287474e36a02e58e7951ddee8179a1a548a7"} Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.491750 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.673819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data\") pod \"808972eb-d558-43db-9cd9-ce9315b4dd16\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.674078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr62b\" (UniqueName: \"kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b\") pod \"808972eb-d558-43db-9cd9-ce9315b4dd16\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.674132 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts\") pod \"808972eb-d558-43db-9cd9-ce9315b4dd16\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.674233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle\") pod \"808972eb-d558-43db-9cd9-ce9315b4dd16\" (UID: \"808972eb-d558-43db-9cd9-ce9315b4dd16\") " Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.681021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts" (OuterVolumeSpecName: "scripts") pod "808972eb-d558-43db-9cd9-ce9315b4dd16" (UID: "808972eb-d558-43db-9cd9-ce9315b4dd16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.681372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b" (OuterVolumeSpecName: "kube-api-access-fr62b") pod "808972eb-d558-43db-9cd9-ce9315b4dd16" (UID: "808972eb-d558-43db-9cd9-ce9315b4dd16"). InnerVolumeSpecName "kube-api-access-fr62b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.707689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "808972eb-d558-43db-9cd9-ce9315b4dd16" (UID: "808972eb-d558-43db-9cd9-ce9315b4dd16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.710974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data" (OuterVolumeSpecName: "config-data") pod "808972eb-d558-43db-9cd9-ce9315b4dd16" (UID: "808972eb-d558-43db-9cd9-ce9315b4dd16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.776721 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.776772 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr62b\" (UniqueName: \"kubernetes.io/projected/808972eb-d558-43db-9cd9-ce9315b4dd16-kube-api-access-fr62b\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.776793 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:03 crc kubenswrapper[5030]: I0120 22:57:03.776811 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/808972eb-d558-43db-9cd9-ce9315b4dd16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.051381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" event={"ID":"808972eb-d558-43db-9cd9-ce9315b4dd16","Type":"ContainerDied","Data":"24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188"} Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.051443 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24381b91141c2f41097a46afa7e779c8cdc8646793881784d47b78988957e188" Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.051520 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc" Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.207772 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.208071 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-log" containerID="cri-o://ee57a3203c76267f5d0f9739dbc9009d8650bf4e73459122ea3c5f845e1212ed" gracePeriod=30 Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.208639 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-api" containerID="cri-o://6ff87555539f5cf94dc0163f52d60833f852402cd8a43a14a4c802eb7604386a" gracePeriod=30 Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.222448 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.222935 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="32782f79-2dae-4307-a6b0-2ffd14a5516c" containerName="nova-scheduler-scheduler" containerID="cri-o://590e23862d0316a50a051bcb1d10f8f68ac4cffea79384d48d8e2cbffd0c8f39" gracePeriod=30 Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.235763 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.236040 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" containerID="cri-o://cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877" gracePeriod=30 Jan 20 22:57:04 crc kubenswrapper[5030]: I0120 22:57:04.236223 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" containerID="cri-o://c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d" gracePeriod=30 Jan 20 22:57:05 crc kubenswrapper[5030]: I0120 22:57:05.063569 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef537024-2950-4510-97c8-50b40e685989" containerID="ee57a3203c76267f5d0f9739dbc9009d8650bf4e73459122ea3c5f845e1212ed" exitCode=143 Jan 20 22:57:05 crc kubenswrapper[5030]: I0120 22:57:05.063638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerDied","Data":"ee57a3203c76267f5d0f9739dbc9009d8650bf4e73459122ea3c5f845e1212ed"} Jan 20 22:57:05 crc kubenswrapper[5030]: I0120 22:57:05.066336 5030 generic.go:334] "Generic (PLEG): container finished" podID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerID="cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877" exitCode=143 Jan 20 22:57:05 crc kubenswrapper[5030]: I0120 22:57:05.066377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerDied","Data":"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877"} Jan 20 22:57:06 crc kubenswrapper[5030]: 
I0120 22:57:06.081944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"32782f79-2dae-4307-a6b0-2ffd14a5516c","Type":"ContainerDied","Data":"590e23862d0316a50a051bcb1d10f8f68ac4cffea79384d48d8e2cbffd0c8f39"} Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.081956 5030 generic.go:334] "Generic (PLEG): container finished" podID="32782f79-2dae-4307-a6b0-2ffd14a5516c" containerID="590e23862d0316a50a051bcb1d10f8f68ac4cffea79384d48d8e2cbffd0c8f39" exitCode=0 Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.412752 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.534644 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle\") pod \"32782f79-2dae-4307-a6b0-2ffd14a5516c\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.534740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data\") pod \"32782f79-2dae-4307-a6b0-2ffd14a5516c\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.534959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxwl4\" (UniqueName: \"kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4\") pod \"32782f79-2dae-4307-a6b0-2ffd14a5516c\" (UID: \"32782f79-2dae-4307-a6b0-2ffd14a5516c\") " Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.548013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4" (OuterVolumeSpecName: "kube-api-access-jxwl4") pod "32782f79-2dae-4307-a6b0-2ffd14a5516c" (UID: "32782f79-2dae-4307-a6b0-2ffd14a5516c"). InnerVolumeSpecName "kube-api-access-jxwl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.566092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32782f79-2dae-4307-a6b0-2ffd14a5516c" (UID: "32782f79-2dae-4307-a6b0-2ffd14a5516c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.583848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data" (OuterVolumeSpecName: "config-data") pod "32782f79-2dae-4307-a6b0-2ffd14a5516c" (UID: "32782f79-2dae-4307-a6b0-2ffd14a5516c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.637277 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.637324 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32782f79-2dae-4307-a6b0-2ffd14a5516c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:06 crc kubenswrapper[5030]: I0120 22:57:06.637336 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxwl4\" (UniqueName: \"kubernetes.io/projected/32782f79-2dae-4307-a6b0-2ffd14a5516c-kube-api-access-jxwl4\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.098524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"32782f79-2dae-4307-a6b0-2ffd14a5516c","Type":"ContainerDied","Data":"04e8a238074ad47bd1b364102eaecf6891e816bdd8f968b68a125f8340fb587e"} Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.098596 5030 scope.go:117] "RemoveContainer" containerID="590e23862d0316a50a051bcb1d10f8f68ac4cffea79384d48d8e2cbffd0c8f39" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.098762 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.148268 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.168750 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.182237 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:07 crc kubenswrapper[5030]: E0120 22:57:07.182673 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32782f79-2dae-4307-a6b0-2ffd14a5516c" containerName="nova-scheduler-scheduler" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.182693 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="32782f79-2dae-4307-a6b0-2ffd14a5516c" containerName="nova-scheduler-scheduler" Jan 20 22:57:07 crc kubenswrapper[5030]: E0120 22:57:07.182714 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808972eb-d558-43db-9cd9-ce9315b4dd16" containerName="nova-manage" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.182720 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="808972eb-d558-43db-9cd9-ce9315b4dd16" containerName="nova-manage" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.182916 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="808972eb-d558-43db-9cd9-ce9315b4dd16" containerName="nova-manage" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.182936 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="32782f79-2dae-4307-a6b0-2ffd14a5516c" containerName="nova-scheduler-scheduler" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.183607 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.185863 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.193865 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.353189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.353310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.353374 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lckx7\" (UniqueName: \"kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.380385 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": read tcp 10.217.0.2:49546->10.217.0.175:8775: read: connection reset by peer" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.380408 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.175:8775/\": read tcp 10.217.0.2:49544->10.217.0.175:8775: read: connection reset by peer" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.455302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.455359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.455386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lckx7\" (UniqueName: \"kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 
crc kubenswrapper[5030]: I0120 22:57:07.460888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.462210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.476032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckx7\" (UniqueName: \"kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7\") pod \"nova-scheduler-0\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.505239 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.881311 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:07 crc kubenswrapper[5030]: I0120 22:57:07.972807 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32782f79-2dae-4307-a6b0-2ffd14a5516c" path="/var/lib/kubelet/pods/32782f79-2dae-4307-a6b0-2ffd14a5516c/volumes" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.002361 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:08 crc kubenswrapper[5030]: W0120 22:57:08.014287 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd48f030_3391_4fdb_b101_45b0a7f4b9a3.slice/crio-a0ce169c8dab7719e35efb95e7fc95eedf3c1d30c8e461aa5019df790f82a14e WatchSource:0}: Error finding container a0ce169c8dab7719e35efb95e7fc95eedf3c1d30c8e461aa5019df790f82a14e: Status 404 returned error can't find the container with id a0ce169c8dab7719e35efb95e7fc95eedf3c1d30c8e461aa5019df790f82a14e Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.067548 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs\") pod \"48442f94-e2f3-4585-8918-ce420c8ceebd\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.067613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data\") pod \"48442f94-e2f3-4585-8918-ce420c8ceebd\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.067667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle\") pod \"48442f94-e2f3-4585-8918-ce420c8ceebd\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 
22:57:08.067699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbcx8\" (UniqueName: \"kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8\") pod \"48442f94-e2f3-4585-8918-ce420c8ceebd\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.067842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs\") pod \"48442f94-e2f3-4585-8918-ce420c8ceebd\" (UID: \"48442f94-e2f3-4585-8918-ce420c8ceebd\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.068974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs" (OuterVolumeSpecName: "logs") pod "48442f94-e2f3-4585-8918-ce420c8ceebd" (UID: "48442f94-e2f3-4585-8918-ce420c8ceebd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.071998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8" (OuterVolumeSpecName: "kube-api-access-jbcx8") pod "48442f94-e2f3-4585-8918-ce420c8ceebd" (UID: "48442f94-e2f3-4585-8918-ce420c8ceebd"). InnerVolumeSpecName "kube-api-access-jbcx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.114998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data" (OuterVolumeSpecName: "config-data") pod "48442f94-e2f3-4585-8918-ce420c8ceebd" (UID: "48442f94-e2f3-4585-8918-ce420c8ceebd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.116386 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef537024-2950-4510-97c8-50b40e685989" containerID="6ff87555539f5cf94dc0163f52d60833f852402cd8a43a14a4c802eb7604386a" exitCode=0 Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.116443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerDied","Data":"6ff87555539f5cf94dc0163f52d60833f852402cd8a43a14a4c802eb7604386a"} Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.117010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48442f94-e2f3-4585-8918-ce420c8ceebd" (UID: "48442f94-e2f3-4585-8918-ce420c8ceebd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.137963 5030 generic.go:334] "Generic (PLEG): container finished" podID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerID="c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d" exitCode=0 Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.138044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerDied","Data":"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d"} Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.138083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"48442f94-e2f3-4585-8918-ce420c8ceebd","Type":"ContainerDied","Data":"db575e18d28561f032445cf5cc8a11854b8285f4aa3f310d786c884820f0fe23"} Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.138106 5030 scope.go:117] "RemoveContainer" containerID="c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.138200 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.139962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd48f030-3391-4fdb-b101-45b0a7f4b9a3","Type":"ContainerStarted","Data":"a0ce169c8dab7719e35efb95e7fc95eedf3c1d30c8e461aa5019df790f82a14e"} Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.147337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "48442f94-e2f3-4585-8918-ce420c8ceebd" (UID: "48442f94-e2f3-4585-8918-ce420c8ceebd"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.169655 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.169688 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.169697 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48442f94-e2f3-4585-8918-ce420c8ceebd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.169705 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbcx8\" (UniqueName: \"kubernetes.io/projected/48442f94-e2f3-4585-8918-ce420c8ceebd-kube-api-access-jbcx8\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.169714 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48442f94-e2f3-4585-8918-ce420c8ceebd-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.175423 5030 scope.go:117] "RemoveContainer" containerID="cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.203897 5030 scope.go:117] "RemoveContainer" containerID="c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d" Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.204463 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d\": container with ID starting with c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d not found: ID does not exist" containerID="c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.204516 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d"} err="failed to get container status \"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d\": rpc error: code = NotFound desc = could not find container \"c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d\": container with ID starting with c18103edf3e204b90386e22ef4f69d4f81ced1a152e9094afc110a226fe5326d not found: ID does not exist" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.204548 5030 scope.go:117] "RemoveContainer" containerID="cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877" Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.205015 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877\": container with ID starting with cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877 not found: ID does not exist" containerID="cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.205046 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877"} err="failed to get container status \"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877\": rpc error: code = NotFound desc = could not find container \"cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877\": container with ID starting with cbaba1e3573350a3d9e3cdf4e6e08ae2868b0a8abe44899607092a69523f0877 not found: ID does not exist" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.241385 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6htv9\" (UniqueName: \"kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.375927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs\") pod \"ef537024-2950-4510-97c8-50b40e685989\" (UID: \"ef537024-2950-4510-97c8-50b40e685989\") " Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.376359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs" (OuterVolumeSpecName: "logs") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.384459 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9" (OuterVolumeSpecName: "kube-api-access-6htv9") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "kube-api-access-6htv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.405974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data" (OuterVolumeSpecName: "config-data") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.426668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.438078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.445524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ef537024-2950-4510-97c8-50b40e685989" (UID: "ef537024-2950-4510-97c8-50b40e685989"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.477911 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.477949 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6htv9\" (UniqueName: \"kubernetes.io/projected/ef537024-2950-4510-97c8-50b40e685989-kube-api-access-6htv9\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.477962 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.477974 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.477988 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef537024-2950-4510-97c8-50b40e685989-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.478000 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef537024-2950-4510-97c8-50b40e685989-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.554533 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.562812 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571115 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.571475 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-api" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-api" Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.571506 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571512 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.571524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-log" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571531 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-log" Jan 20 22:57:08 crc kubenswrapper[5030]: E0120 22:57:08.571551 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" Jan 20 22:57:08 
crc kubenswrapper[5030]: I0120 22:57:08.571558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571804 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-log" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571817 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-log" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571830 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" containerName="nova-metadata-metadata" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.571839 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef537024-2950-4510-97c8-50b40e685989" containerName="nova-api-api" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.572684 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.575176 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.575595 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.597283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.682926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlb4n\" (UniqueName: \"kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.683075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.683120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.683175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.683432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.785562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.785656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlb4n\" (UniqueName: \"kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.785808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.785873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.785965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.786717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.790871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.790900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.791583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc 
kubenswrapper[5030]: I0120 22:57:08.804102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlb4n\" (UniqueName: \"kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n\") pod \"nova-metadata-0\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:08 crc kubenswrapper[5030]: I0120 22:57:08.890024 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.154739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd48f030-3391-4fdb-b101-45b0a7f4b9a3","Type":"ContainerStarted","Data":"38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98"} Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.159872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ef537024-2950-4510-97c8-50b40e685989","Type":"ContainerDied","Data":"ce954749df96b29e368fc2326a2927cf970486950a55da111f09f48e30eb1108"} Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.159907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.159919 5030 scope.go:117] "RemoveContainer" containerID="6ff87555539f5cf94dc0163f52d60833f852402cd8a43a14a4c802eb7604386a" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.169945 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.169926326 podStartE2EDuration="2.169926326s" podCreationTimestamp="2026-01-20 22:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:09.169613169 +0000 UTC m=+1301.489873457" watchObservedRunningTime="2026-01-20 22:57:09.169926326 +0000 UTC m=+1301.490186634" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.180426 5030 scope.go:117] "RemoveContainer" containerID="ee57a3203c76267f5d0f9739dbc9009d8650bf4e73459122ea3c5f845e1212ed" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.235057 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.249278 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.257652 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.259414 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.262845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.263045 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.263158 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.266931 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.401544 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404252 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8f9\" (UniqueName: \"kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.404897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: W0120 22:57:09.406147 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4277815_cc65_44c8_b719_7673ce4c1d88.slice/crio-6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52 WatchSource:0}: Error 
finding container 6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52: Status 404 returned error can't find the container with id 6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52 Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.506638 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8f9\" (UniqueName: \"kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.506774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.506824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.506881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.506942 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.507065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.507904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.512287 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.513278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.513720 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.523217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.537150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr8f9\" (UniqueName: \"kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9\") pod \"nova-api-0\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.580760 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.975966 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48442f94-e2f3-4585-8918-ce420c8ceebd" path="/var/lib/kubelet/pods/48442f94-e2f3-4585-8918-ce420c8ceebd/volumes" Jan 20 22:57:09 crc kubenswrapper[5030]: I0120 22:57:09.977348 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef537024-2950-4510-97c8-50b40e685989" path="/var/lib/kubelet/pods/ef537024-2950-4510-97c8-50b40e685989/volumes" Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.040930 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:10 crc kubenswrapper[5030]: W0120 22:57:10.043845 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b05ddec_40d4_42ea_987c_03718a02724d.slice/crio-6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368 WatchSource:0}: Error finding container 6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368: Status 404 returned error can't find the container with id 6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368 Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.172785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerStarted","Data":"54c949cb8fb90e5917444c912258fd302ea6bebb581039d1381ea77c29048eb2"} Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.172848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerStarted","Data":"1149b9108c8726b17b503e37defac2cb908a5b4f024630bb83ebda23ce06e15e"} Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.172873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerStarted","Data":"6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52"} Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.174101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerStarted","Data":"6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368"} Jan 20 22:57:10 crc kubenswrapper[5030]: I0120 22:57:10.201755 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.201735075 podStartE2EDuration="2.201735075s" podCreationTimestamp="2026-01-20 22:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:10.191378321 +0000 UTC m=+1302.511638619" watchObservedRunningTime="2026-01-20 22:57:10.201735075 +0000 UTC m=+1302.521995373" Jan 20 22:57:11 crc kubenswrapper[5030]: I0120 22:57:11.191333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerStarted","Data":"a6f737d8849e1b7dd322cb02df6ee01f0aa48f194ddabe4f0c964de3f0feac4f"} Jan 20 22:57:11 crc kubenswrapper[5030]: I0120 22:57:11.191897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerStarted","Data":"4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67"} Jan 20 22:57:11 crc kubenswrapper[5030]: I0120 22:57:11.241717 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.241689113 podStartE2EDuration="2.241689113s" podCreationTimestamp="2026-01-20 22:57:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:11.223120958 +0000 UTC m=+1303.543381256" watchObservedRunningTime="2026-01-20 22:57:11.241689113 +0000 UTC m=+1303.561949431" Jan 20 22:57:12 crc kubenswrapper[5030]: I0120 22:57:12.506157 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:13 crc kubenswrapper[5030]: I0120 22:57:13.891200 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:13 crc kubenswrapper[5030]: I0120 22:57:13.891292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:17 crc kubenswrapper[5030]: I0120 22:57:17.506471 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:17 crc kubenswrapper[5030]: I0120 22:57:17.532690 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:18 crc kubenswrapper[5030]: I0120 22:57:18.345998 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:18 crc kubenswrapper[5030]: I0120 22:57:18.890683 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:18 crc kubenswrapper[5030]: I0120 22:57:18.890758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:19 crc kubenswrapper[5030]: I0120 22:57:19.581461 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:19 crc 
kubenswrapper[5030]: I0120 22:57:19.581550 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:19 crc kubenswrapper[5030]: I0120 22:57:19.903848 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:19 crc kubenswrapper[5030]: I0120 22:57:19.903863 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:20 crc kubenswrapper[5030]: I0120 22:57:20.595778 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:20 crc kubenswrapper[5030]: I0120 22:57:20.595814 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:26 crc kubenswrapper[5030]: I0120 22:57:26.395756 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:57:28 crc kubenswrapper[5030]: I0120 22:57:28.900202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:28 crc kubenswrapper[5030]: I0120 22:57:28.901600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:28 crc kubenswrapper[5030]: I0120 22:57:28.913391 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.429718 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.588928 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.589024 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.589515 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.589558 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.595139 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:29 crc kubenswrapper[5030]: I0120 22:57:29.595211 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.795906 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.796816 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" containerName="openstackclient" containerID="cri-o://06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356" gracePeriod=2 Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.808848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.822739 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.823124 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="openstack-network-exporter" containerID="cri-o://bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f" gracePeriod=300 Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.830949 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.831178 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" containerID="cri-o://247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" gracePeriod=30 Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.831299 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="openstack-network-exporter" containerID="cri-o://29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0" gracePeriod=30 Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.852296 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" podUID="d71f95f1-872a-4016-b491-66efe081ba1c" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.859353 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:57:35 crc kubenswrapper[5030]: E0120 22:57:35.859757 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" containerName="openstackclient" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.859787 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" containerName="openstackclient" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.859954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" containerName="openstackclient" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.860862 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.877702 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.879590 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.902561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.944078 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.961586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9m56\" (UniqueName: \"kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.961665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.961691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.961763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.961808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.988217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:57:35 crc kubenswrapper[5030]: E0120 22:57:35.992258 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-5x7jr openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstackclient" 
podUID="d71f95f1-872a-4016-b491-66efe081ba1c" Jan 20 22:57:35 crc kubenswrapper[5030]: I0120 22:57:35.992383 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.018855 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.018897 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.020202 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.020371 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" containerName="memcached" containerID="cri-o://6b6d7fced9bfec32b48d7d732119d3237bc9fe4d5517f7a246ca37788d005e77" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.031948 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.052242 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.052462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="cinder-scheduler" containerID="cri-o://a5eb9a66cd59578f8b736dfa3e91fcb0a66588356d5c43c532a1426a7815f1a9" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.052581 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="probe" containerID="cri-o://fef776c551a97498795259e77769d450688d573f30cb486c4698bfd320905171" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.062138 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.062397 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-log" containerID="cri-o://f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.062544 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-httpd" containerID="cri-o://09189446050efd620d2103042e8ffbe0d00050dcd359f229a97abe5fb2fafe72" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc 
kubenswrapper[5030]: I0120 22:57:36.063323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9m56\" (UniqueName: \"kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.063529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7jr\" (UniqueName: \"kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.069581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs\") pod 
\"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.086783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.093706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.105029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.107224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.118990 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="ovsdbserver-nb" containerID="cri-o://d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7" gracePeriod=300 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.148671 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.179291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9m56\" (UniqueName: \"kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56\") pod \"barbican-keystone-listener-9fc44964b-hvwvt\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.209360 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.310006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.312670 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.312979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxwc\" (UniqueName: \"kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313158 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9vgs\" (UniqueName: \"kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7jr\" (UniqueName: \"kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313515 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.313532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.319410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 
22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.327730 5030 projected.go:194] Error preparing data for projected volume kube-api-access-5x7jr for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.327787 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr podName:d71f95f1-872a-4016-b491-66efe081ba1c nodeName:}" failed. No retries permitted until 2026-01-20 22:57:36.827772616 +0000 UTC m=+1329.148032894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5x7jr" (UniqueName: "kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr") pod "openstackclient" (UID: "d71f95f1-872a-4016-b491-66efe081ba1c") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.379223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.393103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.406923 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxwc\" (UniqueName: \"kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc 
kubenswrapper[5030]: I0120 22:57:36.415654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " 
pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9vgs\" (UniqueName: \"kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4j9m\" (UniqueName: \"kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.415985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.420691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.420983 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: 
I0120 22:57:36.421141 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.426276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.431030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.438255 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.441165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.443724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.445927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.447402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.450600 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.450852 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" containerID="cri-o://1149b9108c8726b17b503e37defac2cb908a5b4f024630bb83ebda23ce06e15e" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.450996 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" containerID="cri-o://54c949cb8fb90e5917444c912258fd302ea6bebb581039d1381ea77c29048eb2" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.455436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxwc\" (UniqueName: \"kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.460889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.463148 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.480586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data\") pod \"barbican-worker-64675d4b5c-z9thv\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.487221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9vgs\" (UniqueName: \"kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs\") pod \"barbican-api-67b967cd44-2rgq2\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.522655 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.522872 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" containerID="cri-o://4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.522986 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" containerID="cri-o://a6f737d8849e1b7dd322cb02df6ee01f0aa48f194ddabe4f0c964de3f0feac4f" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: 
I0120 22:57:36.524030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4j9m\" (UniqueName: \"kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.524211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.538495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.544985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.550388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.555579 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.556018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.556462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.580719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.580958 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerName="nova-scheduler-scheduler" containerID="cri-o://38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.582354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4j9m\" (UniqueName: \"kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m\") pod \"placement-7bd4b6f588-wr8mk\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.617912 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.618281 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api-log" containerID="cri-o://4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.618375 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api" containerID="cri-o://ea1e0a375f06f75251c94f8a9680e30e93e6ebbeb077b27d82cd29945cdfd113" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.631822 5030 generic.go:334] "Generic (PLEG): container finished" podID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerID="29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0" exitCode=2 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.631911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerDied","Data":"29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0"} Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.654664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.655268 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="openstack-network-exporter" containerID="cri-o://dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de" gracePeriod=300 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.665744 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.665952 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" containerID="cri-o://efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.666916 5030 generic.go:334] "Generic (PLEG): container finished" podID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerID="bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f" exitCode=2 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.667003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerDied","Data":"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f"} Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.693944 5030 generic.go:334] "Generic (PLEG): container finished" podID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerID="f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f" exitCode=143 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.694035 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.695721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerDied","Data":"f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f"} Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.705937 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.706126 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a199be04-e085-49d3-a376-abdf54773b99" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://19f231d19d885aa27ab7782d5877fea4392fd2ffd88a91c7505026e5f5c450f4" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.725116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.746417 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.749321 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.755707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.767664 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.768868 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.775542 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.775790 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-log" containerID="cri-o://4eb95752fd1da606b39641693b5639240910a1ec73d7ec7ac34635061387513d" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.775934 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-httpd" containerID="cri-o://be10608095d180892da25bc367a49e31085449b06dae8af243a26e65d423eabd" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.781432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnts\" (UniqueName: \"kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7jr\" (UniqueName: 
\"kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr\") pod \"openstackclient\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.850958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.853407 5030 projected.go:194] Error preparing data for projected volume kube-api-access-5x7jr for pod openstack-kuttl-tests/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.853467 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr podName:d71f95f1-872a-4016-b491-66efe081ba1c nodeName:}" failed. No retries permitted until 2026-01-20 22:57:37.853448464 +0000 UTC m=+1330.173708752 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5x7jr" (UniqueName: "kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr") pod "openstackclient" (UID: "d71f95f1-872a-4016-b491-66efe081ba1c") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.864933 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.867932 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.872012 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:36 crc kubenswrapper[5030]: E0120 22:57:36.872121 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.873070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="ovsdbserver-sb" containerID="cri-o://53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd" gracePeriod=300 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config\") pod \"neutron-65fcc556fd-25njc\" (UID: 
\"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnts\" (UniqueName: \"kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys\") pod 
\"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.953985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5ww\" (UniqueName: \"kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.954015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.954058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.958450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.961048 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.961275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.961743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.971157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.979909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnts\" (UniqueName: 
\"kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.980174 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="galera" containerID="cri-o://a959770d65b862147cb227ef990299c9ce51c5014c0b60151012a6e091f021b1" gracePeriod=30 Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.980301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle\") pod \"neutron-65fcc556fd-25njc\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:36 crc kubenswrapper[5030]: I0120 22:57:36.992860 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="galera" containerID="cri-o://f42f977d4b1f56e969e3645e77647af497eeead81e8ac35d88c15264e254f40f" gracePeriod=30 Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.027919 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a5a6f5_2652_4f8d_8a3b_ae3b58ab4869.slice/crio-29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6acf7ebc_be88_4e1d_9b31_bef0707ea082.slice/crio-f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69346425_07fa_46ee_a8a9_9750fd83ac46.slice/crio-conmon-bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b05ddec_40d4_42ea_987c_03718a02724d.slice/crio-4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42b7bb7d_f87c_4f2f_8d95_5d8ff851912c.slice/crio-53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69346425_07fa_46ee_a8a9_9750fd83ac46.slice/crio-bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69346425_07fa_46ee_a8a9_9750fd83ac46.slice/crio-conmon-d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf270c3_8bc4_4b11_8c04_10708f22f76e.slice/crio-conmon-4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752.scope\": RecentStats: unable to find data in memory cache]" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056021 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5ww\" (UniqueName: \"kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.056333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.063407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.063726 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.065026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.066986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.067018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.067221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.067262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.075229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5ww\" (UniqueName: \"kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww\") pod \"keystone-866d7bb55b-59hzj\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.131870 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.144321 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.157346 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.162392 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.179688 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.260585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret\") pod \"d71f95f1-872a-4016-b491-66efe081ba1c\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.261169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config\") pod \"d71f95f1-872a-4016-b491-66efe081ba1c\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.261206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle\") pod \"d71f95f1-872a-4016-b491-66efe081ba1c\" (UID: \"d71f95f1-872a-4016-b491-66efe081ba1c\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.261826 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x7jr\" (UniqueName: \"kubernetes.io/projected/d71f95f1-872a-4016-b491-66efe081ba1c-kube-api-access-5x7jr\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.265378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d71f95f1-872a-4016-b491-66efe081ba1c" (UID: "d71f95f1-872a-4016-b491-66efe081ba1c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.268879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d71f95f1-872a-4016-b491-66efe081ba1c" (UID: "d71f95f1-872a-4016-b491-66efe081ba1c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.302907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d71f95f1-872a-4016-b491-66efe081ba1c" (UID: "d71f95f1-872a-4016-b491-66efe081ba1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.364326 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.364383 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d71f95f1-872a-4016-b491-66efe081ba1c-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.364397 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d71f95f1-872a-4016-b491-66efe081ba1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.461470 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.499401 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.565775 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.652285 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.762245 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.767051 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.767131 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerName="nova-scheduler-scheduler" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.894447 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.926182 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_69346425-07fa-46ee-a8a9-9750fd83ac46/ovsdbserver-nb/0.log" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.926850 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg4rt\" (UniqueName: \"kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975632 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.975710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"69346425-07fa-46ee-a8a9-9750fd83ac46\" (UID: \"69346425-07fa-46ee-a8a9-9750fd83ac46\") " Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.990132 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_42b7bb7d-f87c-4f2f-8d95-5d8ff851912c/ovsdbserver-sb/0.log" Jan 20 22:57:37 crc kubenswrapper[5030]: I0120 22:57:37.990240 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:57:37 crc kubenswrapper[5030]: E0120 22:57:37.990325 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.001990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config" (OuterVolumeSpecName: "config") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.002121 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.004331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.013465 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b05ddec-40d4-42ea-987c-03718a02724d" containerID="4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.021037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt" (OuterVolumeSpecName: "kube-api-access-wg4rt") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "kube-api-access-wg4rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.021941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts" (OuterVolumeSpecName: "scripts") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.030807 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.030867 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.049832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.105529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.105569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.105699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.108854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.108939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.108962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpx28\" (UniqueName: \"kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.108984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.109012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs\") pod \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\" (UID: \"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c\") " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.132726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144089 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144148 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144159 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144171 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg4rt\" (UniqueName: \"kubernetes.io/projected/69346425-07fa-46ee-a8a9-9750fd83ac46-kube-api-access-wg4rt\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144183 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.144192 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69346425-07fa-46ee-a8a9-9750fd83ac46-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.149129 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_42b7bb7d-f87c-4f2f-8d95-5d8ff851912c/ovsdbserver-sb/0.log" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.149506 5030 generic.go:334] "Generic (PLEG): container finished" podID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerID="dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de" exitCode=2 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.149529 5030 generic.go:334] "Generic (PLEG): container finished" podID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerID="53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.149699 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.149862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28" (OuterVolumeSpecName: "kube-api-access-fpx28") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "kube-api-access-fpx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.150091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.156273 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config" (OuterVolumeSpecName: "config") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.178301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts" (OuterVolumeSpecName: "scripts") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.207913 5030 generic.go:334] "Generic (PLEG): container finished" podID="d09f29b6-8161-4158-810d-efe783958358" containerID="fef776c551a97498795259e77769d450688d573f30cb486c4698bfd320905171" exitCode=0 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.238977 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71f95f1-872a-4016-b491-66efe081ba1c" path="/var/lib/kubelet/pods/d71f95f1-872a-4016-b491-66efe081ba1c/volumes" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.258470 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.275280 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.275311 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.275323 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpx28\" (UniqueName: \"kubernetes.io/projected/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-kube-api-access-fpx28\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.275331 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" 
DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.275350 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.279742 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerID="4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.304533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.315983 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerID="789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" exitCode=0 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.349381 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_69346425-07fa-46ee-a8a9-9750fd83ac46/ovsdbserver-nb/0.log" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.349544 5030 generic.go:334] "Generic (PLEG): container finished" podID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerID="d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.349980 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.377943 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.385101 5030 generic.go:334] "Generic (PLEG): container finished" podID="962ec476-3693-41a1-b425-a275bf790beb" containerID="4eb95752fd1da606b39641693b5639240910a1ec73d7ec7ac34635061387513d" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.385752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.411170 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerID="1149b9108c8726b17b503e37defac2cb908a5b4f024630bb83ebda23ce06e15e" exitCode=143 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.430896 5030 generic.go:334] "Generic (PLEG): container finished" podID="a199be04-e085-49d3-a376-abdf54773b99" containerID="19f231d19d885aa27ab7782d5877fea4392fd2ffd88a91c7505026e5f5c450f4" exitCode=0 Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.431009 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461042 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerDied","Data":"4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerDied","Data":"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461111 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461233 5030 scope.go:117] "RemoveContainer" containerID="dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.461759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461771 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.461785 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="ovsdbserver-nb" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461792 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="ovsdbserver-nb" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.461821 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="ovsdbserver-sb" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461829 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="ovsdbserver-sb" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.461842 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.461849 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.462011 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="ovsdbserver-sb" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.462027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="ovsdbserver-nb" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.462035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.462045 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" containerName="openstack-network-exporter" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.467803 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.467962 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469584 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-523f-account-create-update-lfzlt"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469609 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-523f-account-create-update-lfzlt"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerDied","Data":"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerDied","Data":"fef776c551a97498795259e77769d450688d573f30cb486c4698bfd320905171"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerDied","Data":"4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"1d59910a-9b07-47ff-b79b-661139ece8c3","Type":"ContainerDied","Data":"789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469713 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerDied","Data":"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"69346425-07fa-46ee-a8a9-9750fd83ac46","Type":"ContainerDied","Data":"b7ae022d96085c94041bc7645fcc1ddf6967ea12ab55391a2644d6e2140b5b78"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerStarted","Data":"e2f1fb0d9eb9ea650acf54f5893cecdf9eca5f9ae8472088d53d46b93493a224"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerDied","Data":"4eb95752fd1da606b39641693b5639240910a1ec73d7ec7ac34635061387513d"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.469755 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerDied","Data":"1149b9108c8726b17b503e37defac2cb908a5b4f024630bb83ebda23ce06e15e"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.470300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a199be04-e085-49d3-a376-abdf54773b99","Type":"ContainerDied","Data":"19f231d19d885aa27ab7782d5877fea4392fd2ffd88a91c7505026e5f5c450f4"} Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.470398 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.497691 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.508663 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.510198 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.524049 5030 scope.go:117] "RemoveContainer" containerID="53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.541706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.572658 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l"] Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.576775 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e is running failed: container process not found" containerID="789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.577594 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e is running failed: container process not found" containerID="789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.578009 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e is running failed: container process not found" containerID="789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.578070 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerName="nova-cell0-conductor-conductor" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.587975 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.612106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.612178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.612210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " 
pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.612227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.612277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.614647 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.614701 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data podName:b0a5273a-9439-4881-b88a-2a3aa11e410b nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.114684341 +0000 UTC m=+1331.434944629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data") pod "rabbitmq-server-0" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b") : configmap "rabbitmq-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620126 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620399 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.620410 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.643947 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "69346425-07fa-46ee-a8a9-9750fd83ac46" (UID: "69346425-07fa-46ee-a8a9-9750fd83ac46"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.659122 5030 scope.go:117] "RemoveContainer" containerID="dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.665790 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de\": container with ID starting with dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de not found: ID does not exist" containerID="dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.665913 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de"} err="failed to get container status \"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de\": rpc error: code = NotFound desc = could not find container \"dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de\": container with ID starting with dc8cf096144942ad340ca9001d99a944cd0a4c991ff4c64ab1bbed9217b019de not found: ID does not exist" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.665950 5030 scope.go:117] "RemoveContainer" containerID="53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.666866 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd\": container with ID starting with 53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd not found: ID does not exist" containerID="53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.666896 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd"} err="failed to get container status \"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd\": rpc error: code = NotFound desc = could not find container \"53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd\": container with ID starting with 53ec0a3905198b272c3f683c87a5feb7d95ab2ce7af83d8e8fc0ac087a48d7dd not found: ID does not exist" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.666911 5030 scope.go:117] "RemoveContainer" containerID="bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.671142 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.703995 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-k84rf"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.715322 5030 scope.go:117] "RemoveContainer" containerID="d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.724198 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-76de-account-create-update-mdgc4"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726353 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726407 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726473 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.726580 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/69346425-07fa-46ee-a8a9-9750fd83ac46-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.726830 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.726897 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.226877813 +0000 UTC m=+1331.547138091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "barbican-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.727112 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.727141 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.227134969 +0000 UTC m=+1331.547395257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "barbican-config-data" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.729328 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.729376 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.229361984 +0000 UTC m=+1331.549622262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "combined-ca-bundle" not found Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.729675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.729871 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.729897 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.229889107 +0000 UTC m=+1331.550149395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "combined-ca-bundle" not found Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.734677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.735205 5030 projected.go:194] Error preparing data for projected volume kube-api-access-szznv for pod openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.735257 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.235239418 +0000 UTC m=+1331.555499706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-szznv" (UniqueName: "kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.735322 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kt76q for pod openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.735374 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.2353598 +0000 UTC m=+1331.555620088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kt76q" (UniqueName: "kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.739760 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.753844 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.758679 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-k84rf"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.759107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.776522 5030 scope.go:117] "RemoveContainer" containerID="bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.777454 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f\": container with ID starting with bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f not found: ID does not exist" containerID="bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.777498 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f"} err="failed to get container status \"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f\": rpc error: code = NotFound desc = could not find container \"bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f\": container with ID starting with bc56a1b8a3a01b5907e82e6ef3bec8b9bb28e7b990e3cb6ac554c649bb9cd05f not found: ID does not exist" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.777517 5030 scope.go:117] "RemoveContainer" containerID="d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7" Jan 20 22:57:38 crc kubenswrapper[5030]: E0120 22:57:38.790911 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7\": container with ID starting with d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7 not found: ID does not exist" containerID="d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.790946 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7"} err="failed to get container status \"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7\": rpc error: code = NotFound desc = could not find container \"d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7\": container with ID starting with d4247cd7ad73cef333d58bd562a667f7c008cbbc322cc89bfeca09b44ee786b7 not found: ID does not exist" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.813018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.818669 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8w6k"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.820073 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.833950 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.859325 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.867603 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.875670 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e77d-account-create-update-4s587"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.884894 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8w6k"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.891714 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-t8brv"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.901154 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-t8brv"] Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.917112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.932314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" (UID: "42b7bb7d-f87c-4f2f-8d95-5d8ff851912c"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.933829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.933948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.933964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.933981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934231 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934278 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.934289 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:38 crc kubenswrapper[5030]: I0120 22:57:38.944498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:38.999226 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-e5d9-account-create-update-6bg9n"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.037811 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 
22:57:39.037851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.037874 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.537859354 +0000 UTC m=+1331.858119642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.037896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.038049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.038287 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.038332 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.538318085 +0000 UTC m=+1331.858578373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-internal-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.038473 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.038494 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.538486529 +0000 UTC m=+1331.858746817 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.038826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.039644 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.039813 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.039841 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.539832682 +0000 UTC m=+1331.860092970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : configmap "openstack-cell1-scripts" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.040841 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.040871 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.540862617 +0000 UTC m=+1331.861122905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-public-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.041992 5030 projected.go:194] Error preparing data for projected volume kube-api-access-km2cf for pod openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.042032 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.542023426 +0000 UTC m=+1331.862283714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-km2cf" (UniqueName: "kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.044793 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4vr69 for pod openstack-kuttl-tests/root-account-create-update-g8w6k: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.044845 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69 podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:39.544831354 +0000 UTC m=+1331.865091642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4vr69" (UniqueName: "kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.046805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.131039 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.142951 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.150240 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data podName:b0a5273a-9439-4881-b88a-2a3aa11e410b nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.15021363 +0000 UTC m=+1332.470473918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data") pod "rabbitmq-server-0" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b") : configmap "rabbitmq-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.156853 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qcfbt"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.203245 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.205393 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qcfbt"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.244811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245159 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245197 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.245184821 +0000 UTC m=+1332.565445109 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245352 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245374 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.245367215 +0000 UTC m=+1332.565627493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245406 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245423 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.245417777 +0000 UTC m=+1332.565678055 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245450 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.245464 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.245459758 +0000 UTC m=+1332.565720046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.250280 5030 projected.go:194] Error preparing data for projected volume kube-api-access-szznv for pod openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.250342 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.250326487 +0000 UTC m=+1332.570586775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-szznv" (UniqueName: "kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.250392 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kt76q for pod openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.250415 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.250408199 +0000 UTC m=+1332.570668487 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kt76q" (UniqueName: "kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.256313 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tr48l"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.283490 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tr48l"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.298835 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.306810 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.311656 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.318556 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-4fgrc"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs\") pod \"a199be04-e085-49d3-a376-abdf54773b99\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle\") pod \"a199be04-e085-49d3-a376-abdf54773b99\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk9jg\" (UniqueName: \"kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg\") pod \"a199be04-e085-49d3-a376-abdf54773b99\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs\") pod \"a199be04-e085-49d3-a376-abdf54773b99\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data\") pod \"a199be04-e085-49d3-a376-abdf54773b99\" (UID: \"a199be04-e085-49d3-a376-abdf54773b99\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.348900 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-kqh4h"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.372364 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.389681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg" (OuterVolumeSpecName: "kube-api-access-hk9jg") pod "a199be04-e085-49d3-a376-abdf54773b99" (UID: "a199be04-e085-49d3-a376-abdf54773b99"). InnerVolumeSpecName "kube-api-access-hk9jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.389948 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.404672 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-8cf9-account-create-update-jjgf7"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.429405 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.433962 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="sg-core" containerID="cri-o://be006ffb3db61a321e993c269bb9ae3ea0ca63c388bc27053d0c190cbf6073af" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.433961 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="proxy-httpd" containerID="cri-o://63057bd1179d40b3071a3468cfba935dac00fdf9cf0809c23272ed79c89a9eb3" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.434182 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-notification-agent" containerID="cri-o://893f6987a6df5b3331fffed8a8ea0ba6aec4842601c274d8f1e560748fa5ffd8" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.434220 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-central-agent" containerID="cri-o://a4a6c07a940fa33f24511acd3d3cfb359f4abd81b4a5617e7de7f6743af38fe9" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data\") pod \"1d59910a-9b07-47ff-b79b-661139ece8c3\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkgcz\" (UniqueName: \"kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz\") pod \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config\") pod \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle\") pod \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle\") pod \"1d59910a-9b07-47ff-b79b-661139ece8c3\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2h7r\" (UniqueName: \"kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r\") pod \"1d59910a-9b07-47ff-b79b-661139ece8c3\" (UID: \"1d59910a-9b07-47ff-b79b-661139ece8c3\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.450925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret\") pod \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\" (UID: \"270d79a6-4c9f-415b-aa6a-e0d33da8177d\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.451399 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk9jg\" (UniqueName: \"kubernetes.io/projected/a199be04-e085-49d3-a376-abdf54773b99-kube-api-access-hk9jg\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.464274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerStarted","Data":"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.476635 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerStarted","Data":"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.476685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerStarted","Data":"e594999bcc298ec09bcf4cc91d8c81e08eed57723ae2f5308c3c7138f9972c8d"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.478334 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xldkf"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.478828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz" (OuterVolumeSpecName: "kube-api-access-lkgcz") pod "270d79a6-4c9f-415b-aa6a-e0d33da8177d" (UID: "270d79a6-4c9f-415b-aa6a-e0d33da8177d"). InnerVolumeSpecName "kube-api-access-lkgcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.478904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r" (OuterVolumeSpecName: "kube-api-access-v2h7r") pod "1d59910a-9b07-47ff-b79b-661139ece8c3" (UID: "1d59910a-9b07-47ff-b79b-661139ece8c3"). InnerVolumeSpecName "kube-api-access-v2h7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.479511 5030 generic.go:334] "Generic (PLEG): container finished" podID="efe6a225-5530-4922-91ec-1c4a939cb98b" containerID="6b6d7fced9bfec32b48d7d732119d3237bc9fe4d5517f7a246ca37788d005e77" exitCode=0 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.479562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"efe6a225-5530-4922-91ec-1c4a939cb98b","Type":"ContainerDied","Data":"6b6d7fced9bfec32b48d7d732119d3237bc9fe4d5517f7a246ca37788d005e77"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.487740 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a857cf-5973-4c78-9758-ab04c932e180" containerID="a959770d65b862147cb227ef990299c9ce51c5014c0b60151012a6e091f021b1" exitCode=0 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.487806 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerDied","Data":"a959770d65b862147cb227ef990299c9ce51c5014c0b60151012a6e091f021b1"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.491455 5030 generic.go:334] "Generic (PLEG): container finished" podID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" containerID="06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356" exitCode=137 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.491523 5030 scope.go:117] "RemoveContainer" containerID="06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.491741 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.494389 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xldkf"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.524323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"42b7bb7d-f87c-4f2f-8d95-5d8ff851912c","Type":"ContainerDied","Data":"84273878b9163e70f3a952aca0d221876ee6741a40855566b52a09e705afc60a"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.526454 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.535965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a199be04-e085-49d3-a376-abdf54773b99","Type":"ContainerDied","Data":"0e6b4ca7705812c489ba0d949368d954796ef3ea85d194df9bb90c9a032ec40f"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.536065 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.538566 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.542466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"1d59910a-9b07-47ff-b79b-661139ece8c3","Type":"ContainerDied","Data":"b8b314698985d960cc62b1dcff7fe13d10cfed291fdc3bb43d84a42bc3d3ca1a"} Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.542527 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.543278 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.546732 5030 scope.go:117] "RemoveContainer" containerID="06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.546799 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-gvgkg"] Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.547445 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356\": container with ID starting with 06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356 not found: ID does not exist" containerID="06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.547487 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356"} err="failed to get container status \"06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356\": rpc error: code = NotFound desc = could not find container \"06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356\": container with ID starting with 06277311f4997ca2086de214919a0b96983f447ac24c043ef95dfbbb0b0f8356 not found: ID does not exist" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.547514 5030 scope.go:117] "RemoveContainer" containerID="19f231d19d885aa27ab7782d5877fea4392fd2ffd88a91c7505026e5f5c450f4" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.553843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.553944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.553968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554062 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554203 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2h7r\" (UniqueName: \"kubernetes.io/projected/1d59910a-9b07-47ff-b79b-661139ece8c3-kube-api-access-v2h7r\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.554223 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkgcz\" (UniqueName: \"kubernetes.io/projected/270d79a6-4c9f-415b-aa6a-e0d33da8177d-kube-api-access-lkgcz\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554237 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554286 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.554272556 +0000 UTC m=+1332.874532844 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "barbican-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554298 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554375 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.554361468 +0000 UTC m=+1332.874621756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-internal-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554409 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554429 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.554423299 +0000 UTC m=+1332.874683577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "combined-ca-bundle" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554505 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.554538 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.554530172 +0000 UTC m=+1332.874790460 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : configmap "openstack-cell1-scripts" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.555708 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.555845 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.555824434 +0000 UTC m=+1332.876084722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-public-svc" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.557914 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4vr69 for pod openstack-kuttl-tests/root-account-create-update-g8w6k: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.557993 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69 podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.557975486 +0000 UTC m=+1332.878235774 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4vr69" (UniqueName: "kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.558038 5030 projected.go:194] Error preparing data for projected volume kube-api-access-km2cf for pod openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.558061 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.558055128 +0000 UTC m=+1332.878315416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-km2cf" (UniqueName: "kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.576498 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-gvgkg"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.604928 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.605696 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.610958 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerName="nova-cell0-conductor-conductor" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.610987 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerName="nova-cell0-conductor-conductor" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.611001 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="mysql-bootstrap" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611008 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="mysql-bootstrap" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.611069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" containerName="memcached" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611076 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" containerName="memcached" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.611087 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="galera" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611093 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="galera" Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.611111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a199be04-e085-49d3-a376-abdf54773b99" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611118 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a199be04-e085-49d3-a376-abdf54773b99" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611345 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a857cf-5973-4c78-9758-ab04c932e180" containerName="galera" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" containerName="memcached" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611374 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a199be04-e085-49d3-a376-abdf54773b99" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.611385 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" containerName="nova-cell0-conductor-conductor" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.615115 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.620656 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.624905 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-server" containerID="cri-o://c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625030 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="swift-recon-cron" containerID="cri-o://adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625065 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="rsync" containerID="cri-o://2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625099 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-expirer" containerID="cri-o://03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625131 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-updater" containerID="cri-o://61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625174 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-auditor" containerID="cri-o://dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625210 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-replicator" containerID="cri-o://2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625249 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-server" containerID="cri-o://966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625337 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-updater" containerID="cri-o://92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 
22:57:39.625375 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-auditor" containerID="cri-o://cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.625403 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-replicator" containerID="cri-o://95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.628344 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-replicator" containerID="cri-o://3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.628842 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-server" containerID="cri-o://c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.628999 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-reaper" containerID="cri-o://7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.630746 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-auditor" containerID="cri-o://ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9n9n\" (UniqueName: \"kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc 
kubenswrapper[5030]: I0120 22:57:39.655718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.655995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle\") pod \"95a857cf-5973-4c78-9758-ab04c932e180\" (UID: \"95a857cf-5973-4c78-9758-ab04c932e180\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.656385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.656528 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.657440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.658099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.658516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.736952 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.752168 5030 scope.go:117] "RemoveContainer" containerID="789a37f026b23cad58514a4681ab46b1426788dba79b1a6a382f7073a104bb9e" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.765429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.765478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.765513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.765564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.765642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4rl\" (UniqueName: \"kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766668 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766685 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95a857cf-5973-4c78-9758-ab04c932e180-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766696 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95a857cf-5973-4c78-9758-ab04c932e180-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.766710 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.772856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data" (OuterVolumeSpecName: "config-data") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.779734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.825071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n" (OuterVolumeSpecName: "kube-api-access-n9n9n") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "kube-api-access-n9n9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.886219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh" (OuterVolumeSpecName: "kube-api-access-4wxnh") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "kube-api-access-4wxnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.886541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") pod \"efe6a225-5530-4922-91ec-1c4a939cb98b\" (UID: \"efe6a225-5530-4922-91ec-1c4a939cb98b\") " Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.886929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.887287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4rl\" (UniqueName: \"kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.887333 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.887365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.887482 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/efe6a225-5530-4922-91ec-1c4a939cb98b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.887509 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9n9n\" (UniqueName: \"kubernetes.io/projected/95a857cf-5973-4c78-9758-ab04c932e180-kube-api-access-n9n9n\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: W0120 22:57:39.887604 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/efe6a225-5530-4922-91ec-1c4a939cb98b/volumes/kubernetes.io~projected/kube-api-access-4wxnh Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.889952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh" (OuterVolumeSpecName: "kube-api-access-4wxnh") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "kube-api-access-4wxnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.902319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.903189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.914493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.914802 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-api" containerID="cri-o://4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.915268 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-httpd" containerID="cri-o://03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6" gracePeriod=30 Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.941133 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-x4mch"] Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.958325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.989229 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wxnh\" (UniqueName: \"kubernetes.io/projected/efe6a225-5530-4922-91ec-1c4a939cb98b-kube-api-access-4wxnh\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.989270 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.989410 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: E0120 22:57:39.989471 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data podName:38bfa948-051b-4e6c-92a5-3714e0652088 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:40.489454212 +0000 UTC m=+1332.809714500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data") pod "rabbitmq-cell1-server-0" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088") : configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.992488 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a08db1-ff16-4bbc-a3c1-b34488e0a9b7" path="/var/lib/kubelet/pods/11a08db1-ff16-4bbc-a3c1-b34488e0a9b7/volumes" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.993139 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c693e21-8f8f-4437-9662-876c9f6f48e1" path="/var/lib/kubelet/pods/1c693e21-8f8f-4437-9662-876c9f6f48e1/volumes" Jan 20 22:57:39 crc kubenswrapper[5030]: I0120 22:57:39.993903 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="208c8689-8a0c-409c-86e6-d0cfa8893c77" path="/var/lib/kubelet/pods/208c8689-8a0c-409c-86e6-d0cfa8893c77/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.001274 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a14bbb-3aa5-4a2f-b49b-446729d08cca" path="/var/lib/kubelet/pods/22a14bbb-3aa5-4a2f-b49b-446729d08cca/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.002154 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2338665c-e47e-47a0-8b70-3fb568fdebc7" path="/var/lib/kubelet/pods/2338665c-e47e-47a0-8b70-3fb568fdebc7/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.002871 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307f56f7-ed99-4389-a33f-7e1c8dda480b" path="/var/lib/kubelet/pods/307f56f7-ed99-4389-a33f-7e1c8dda480b/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.006673 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4rl\" (UniqueName: \"kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl\") pod \"dnsmasq-dnsmasq-84b9f45d47-8fgcv\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.008986 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419fff1b-62d5-431e-b99c-acf5c8a5e61a" path="/var/lib/kubelet/pods/419fff1b-62d5-431e-b99c-acf5c8a5e61a/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.009658 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69346425-07fa-46ee-a8a9-9750fd83ac46" path="/var/lib/kubelet/pods/69346425-07fa-46ee-a8a9-9750fd83ac46/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.009709 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:42402->10.217.0.186:8775: read: connection reset by peer" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.009807 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:42414->10.217.0.186:8775: read: connection reset by peer" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.021135 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808972eb-d558-43db-9cd9-ce9315b4dd16" path="/var/lib/kubelet/pods/808972eb-d558-43db-9cd9-ce9315b4dd16/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.021906 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830cabde-1736-47a1-add5-6fde195a1723" path="/var/lib/kubelet/pods/830cabde-1736-47a1-add5-6fde195a1723/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.022500 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2ce9c3-f457-4923-8ffa-8a393b4e176d" path="/var/lib/kubelet/pods/8e2ce9c3-f457-4923-8ffa-8a393b4e176d/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.027398 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd6f8be1-d7f8-47ac-800d-0007b458ba14" path="/var/lib/kubelet/pods/cd6f8be1-d7f8-47ac-800d-0007b458ba14/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.028213 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0019c8-f547-4abb-84db-73e5bc5e3e53" path="/var/lib/kubelet/pods/cf0019c8-f547-4abb-84db-73e5bc5e3e53/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.028758 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4b12e1-a480-491b-94a7-57989dc998eb" path="/var/lib/kubelet/pods/fa4b12e1-a480-491b-94a7-57989dc998eb/volumes" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.120229 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.158670 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.178849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "270d79a6-4c9f-415b-aa6a-e0d33da8177d" (UID: "270d79a6-4c9f-415b-aa6a-e0d33da8177d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.203417 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.203449 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.203500 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.203542 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data podName:b0a5273a-9439-4881-b88a-2a3aa11e410b nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.203529455 +0000 UTC m=+1334.523789743 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data") pod "rabbitmq-server-0" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b") : configmap "rabbitmq-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.212411 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": read tcp 10.217.0.2:33814->10.217.0.187:8774: read: connection reset by peer" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.212502 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": read tcp 10.217.0.2:33826->10.217.0.187:8774: read: connection reset by peer" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.261038 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.306815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.306876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.308233 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.308313 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.308294024 +0000 UTC m=+1334.628554312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.312706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.312774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.312896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.313047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313259 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313311 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.313292827 +0000 UTC m=+1334.633553115 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313350 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313370 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.313363279 +0000 UTC m=+1334.633623567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313646 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.313678 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.313668656 +0000 UTC m=+1334.633929044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.315256 5030 projected.go:194] Error preparing data for projected volume kube-api-access-szznv for pod openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.315317 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.315298736 +0000 UTC m=+1334.635559024 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-szznv" (UniqueName: "kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.317801 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kt76q for pod openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.317834 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.317825408 +0000 UTC m=+1334.638085686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kt76q" (UniqueName: "kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.394313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "a199be04-e085-49d3-a376-abdf54773b99" (UID: "a199be04-e085-49d3-a376-abdf54773b99"). 
InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.415341 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.432681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.436477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a199be04-e085-49d3-a376-abdf54773b99" (UID: "a199be04-e085-49d3-a376-abdf54773b99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.442210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data" (OuterVolumeSpecName: "config-data") pod "a199be04-e085-49d3-a376-abdf54773b99" (UID: "a199be04-e085-49d3-a376-abdf54773b99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.472384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "270d79a6-4c9f-415b-aa6a-e0d33da8177d" (UID: "270d79a6-4c9f-415b-aa6a-e0d33da8177d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.489405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data" (OuterVolumeSpecName: "config-data") pod "1d59910a-9b07-47ff-b79b-661139ece8c3" (UID: "1d59910a-9b07-47ff-b79b-661139ece8c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.516921 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.517055 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data podName:38bfa948-051b-4e6c-92a5-3714e0652088 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:41.517035856 +0000 UTC m=+1333.837296144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data") pod "rabbitmq-cell1-server-0" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088") : configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.517246 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.517328 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.517384 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.517444 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.517505 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.558578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.562975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d59910a-9b07-47ff-b79b-661139ece8c3" (UID: "1d59910a-9b07-47ff-b79b-661139ece8c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.599612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "a199be04-e085-49d3-a376-abdf54773b99" (UID: "a199be04-e085-49d3-a376-abdf54773b99"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.609169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "270d79a6-4c9f-415b-aa6a-e0d33da8177d" (UID: "270d79a6-4c9f-415b-aa6a-e0d33da8177d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.617025 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "95a857cf-5973-4c78-9758-ab04c932e180" (UID: "95a857cf-5973-4c78-9758-ab04c932e180"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618913 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618923 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d59910a-9b07-47ff-b79b-661139ece8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618933 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/270d79a6-4c9f-415b-aa6a-e0d33da8177d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618943 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95a857cf-5973-4c78-9758-ab04c932e180-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.618952 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a199be04-e085-49d3-a376-abdf54773b99-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619023 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619067 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.61905401 +0000 UTC m=+1334.939314288 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-internal-svc" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619235 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619260 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.619252485 +0000 UTC m=+1334.939512773 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "barbican-config-data" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619534 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619555 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.619548903 +0000 UTC m=+1334.939809191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-public-svc" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619580 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619641 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.619591654 +0000 UTC m=+1334.939851942 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : configmap "openstack-cell1-scripts" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619895 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.619919 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.619912581 +0000 UTC m=+1334.940172869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "combined-ca-bundle" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.624981 5030 projected.go:194] Error preparing data for projected volume kube-api-access-km2cf for pod openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.625040 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.625022886 +0000 UTC m=+1334.945283174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-km2cf" (UniqueName: "kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.625084 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4vr69 for pod openstack-kuttl-tests/root-account-create-update-g8w6k: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.625105 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69 podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. 
No retries permitted until 2026-01-20 22:57:42.625098588 +0000 UTC m=+1334.945358876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4vr69" (UniqueName: "kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.626853 5030 generic.go:334] "Generic (PLEG): container finished" podID="962ec476-3693-41a1-b425-a275bf790beb" containerID="be10608095d180892da25bc367a49e31085449b06dae8af243a26e65d423eabd" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.627339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "efe6a225-5530-4922-91ec-1c4a939cb98b" (UID: "efe6a225-5530-4922-91ec-1c4a939cb98b"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.661064 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerID="ea1e0a375f06f75251c94f8a9680e30e93e6ebbeb077b27d82cd29945cdfd113" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.669806 5030 generic.go:334] "Generic (PLEG): container finished" podID="fd100215-a3f7-497e-842d-582681701903" containerID="03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.698609 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.709250 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerID="63057bd1179d40b3071a3468cfba935dac00fdf9cf0809c23272ed79c89a9eb3" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.709278 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerID="be006ffb3db61a321e993c269bb9ae3ea0ca63c388bc27053d0c190cbf6073af" exitCode=2 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.709286 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerID="893f6987a6df5b3331fffed8a8ea0ba6aec4842601c274d8f1e560748fa5ffd8" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.709292 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerID="a4a6c07a940fa33f24511acd3d3cfb359f4abd81b4a5617e7de7f6743af38fe9" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.712263 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerID="38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.714261 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerID="54c949cb8fb90e5917444c912258fd302ea6bebb581039d1381ea77c29048eb2" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.720679 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker-log" containerID="cri-o://16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.721183 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker" containerID="cri-o://6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.724404 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b05ddec-40d4-42ea-987c-03718a02724d" containerID="a6f737d8849e1b7dd322cb02df6ee01f0aa48f194ddabe4f0c964de3f0feac4f" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.733561 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/efe6a225-5530-4922-91ec-1c4a939cb98b-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.767290 5030 generic.go:334] "Generic (PLEG): container finished" podID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerID="09189446050efd620d2103042e8ffbe0d00050dcd359f229a97abe5fb2fafe72" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794781 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794810 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794818 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794825 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794834 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794841 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794848 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794855 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794860 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794868 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794876 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794883 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794889 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.794895 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845" exitCode=0 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.796422 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.978683 5030 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.015s" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.978931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerStarted","Data":"50767913068a65b6cf20e84d096e45421fdf9953a9461370793203236b0d12e7"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.978991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerDied","Data":"be10608095d180892da25bc367a49e31085449b06dae8af243a26e65d423eabd"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979011 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-x4mch"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979030 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerDied","Data":"ea1e0a375f06f75251c94f8a9680e30e93e6ebbeb077b27d82cd29945cdfd113"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerDied","Data":"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979065 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/cinder-db-create-76gkg"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"95a857cf-5973-4c78-9758-ab04c932e180","Type":"ContainerDied","Data":"efdde6c1f53faeac19b424fc4d6b1c4cdc852a6f40cd8e43bb5c38df18ca9c2f"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerDied","Data":"63057bd1179d40b3071a3468cfba935dac00fdf9cf0809c23272ed79c89a9eb3"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979100 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerDied","Data":"be006ffb3db61a321e993c269bb9ae3ea0ca63c388bc27053d0c190cbf6073af"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979113 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-76gkg"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979130 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerDied","Data":"893f6987a6df5b3331fffed8a8ea0ba6aec4842601c274d8f1e560748fa5ffd8"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979155 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979173 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979185 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerDied","Data":"a4a6c07a940fa33f24511acd3d3cfb359f4abd81b4a5617e7de7f6743af38fe9"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd48f030-3391-4fdb-b101-45b0a7f4b9a3","Type":"ContainerDied","Data":"38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979235 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerDied","Data":"54c949cb8fb90e5917444c912258fd302ea6bebb581039d1381ea77c29048eb2"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979260 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979273 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-96ed-account-create-update-2qx68"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.979288 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" event={"ID":"f0994e0e-f167-4c5a-bef7-95da6f589767","Type":"ContainerStarted","Data":"330b2c233250fdbd1793d68fb3084927a79634629cb113a67425e6c0f3ae8ebf"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982093 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-7t7cc"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerStarted","Data":"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerDied","Data":"a6f737d8849e1b7dd322cb02df6ee01f0aa48f194ddabe4f0c964de3f0feac4f"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982145 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-7t7cc"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerStarted","Data":"b5827bd4f30cc238b7506f6b8689c257ce264258c8d945378e8539c7984a11db"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerStarted","Data":"f733bf8ac42245482c67234a8af7ffa8f6b0e4bc988409fbd1db67e80fb9b0d4"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-whswv"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerDied","Data":"09189446050efd620d2103042e8ffbe0d00050dcd359f229a97abe5fb2fafe72"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982213 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-74g9t"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982244 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-whswv"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982255 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-74g9t"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982266 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-xj9s4"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982276 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-xj9s4"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982294 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982327 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73"} Jan 20 22:57:40 crc 
kubenswrapper[5030]: I0120 22:57:40.982368 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-2b61-account-create-update-zzbjt"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982397 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"efe6a225-5530-4922-91ec-1c4a939cb98b","Type":"ContainerDied","Data":"1f7ea6ee541ca660624948bd02d48ba083b1b0ce0ae6c89acaf2cabd591dc181"} Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982442 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2c5c5"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982454 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-vx9c6"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982463 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-vx9c6"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982473 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982484 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-29d2-account-create-update-jxb4x"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982502 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-9thns"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982510 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982519 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982527 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-fzrtn"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982535 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982556 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-clvw5"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982565 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-6ck4d"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982574 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-clvw5"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982583 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8w6k"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982597 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-6ck4d"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982614 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982641 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l"] Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.982651 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt"] Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.983254 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-kt76q], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" podUID="da108774-0e8b-4790-ac01-1a542370a374" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.984842 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-httpd" containerID="cri-o://709a2a7ededf39d4cf921562b31fe5c77003d1a66336a20cb086a47b4c5c88f2" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.985030 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerName="kube-state-metrics" containerID="cri-o://baf6a60a1b77f421596bd948659ecac15aa8955fa1ce8ad3378955ea3bd359da" gracePeriod=30 Jan 20 22:57:40 
crc kubenswrapper[5030]: I0120 22:57:40.985070 5030 scope.go:117] "RemoveContainer" containerID="a959770d65b862147cb227ef990299c9ce51c5014c0b60151012a6e091f021b1" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.985213 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-log" containerID="cri-o://f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.985286 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.986716 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-server" containerID="cri-o://65b8ebd2e425010f04d6e967a0f85be11bd51c38df6fd77b8844497673c771f4" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.986868 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker-log" containerID="cri-o://06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.987256 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker" containerID="cri-o://14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.987451 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4vr69 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/root-account-create-update-g8w6k" podUID="d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d" Jan 20 22:57:40 crc kubenswrapper[5030]: I0120 22:57:40.987510 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-api" containerID="cri-o://21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a" gracePeriod=30 Jan 20 22:57:40 crc kubenswrapper[5030]: E0120 22:57:40.990070 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-szznv], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" podUID="7bc0c068-8645-4d10-95a1-94cce9fe9592" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.002882 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.008839 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.009030 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener-log" containerID="cri-o://c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.009136 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener" containerID="cri-o://95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.024792 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.025021 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api-log" containerID="cri-o://d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.025139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api" containerID="cri-o://01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.031751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x"] Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.032596 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data internal-tls-certs kube-api-access-km2cf public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" podUID="e0613c56-1bb3-455b-a45d-6b9ff28427c5" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.045762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.046393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d52lp\" (UniqueName: \"kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.065398 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-b9nnq"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.073682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-swfqz"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.081125 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/keystone-bootstrap-b9nnq"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.086749 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-swfqz"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.092388 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.107576 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.107817 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" podUID="1e567662-7ee7-4658-8954-06e43a966ce6" containerName="keystone-api" containerID="cri-o://e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.131357 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="rabbitmq" containerID="cri-o://dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac" gracePeriod=604800 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.150527 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52lp\" (UniqueName: \"kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.150588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.150982 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.151034 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:41.651020042 +0000 UTC m=+1333.971280330 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : configmap "openstack-scripts" not found Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.163784 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf"] Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.189853 5030 projected.go:194] Error preparing data for projected volume kube-api-access-d52lp for pod openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.189976 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:41.689943093 +0000 UTC m=+1334.010203381 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-d52lp" (UniqueName: "kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.296851 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-slndf"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.296921 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-slndf"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.307611 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.314557 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" podStartSLOduration=6.314543189 podStartE2EDuration="6.314543189s" podCreationTimestamp="2026-01-20 22:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:40.735298761 +0000 UTC m=+1333.055559049" watchObservedRunningTime="2026-01-20 22:57:41.314543189 +0000 UTC m=+1333.634803477" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.355434 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="rabbitmq" containerID="cri-o://a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504" gracePeriod=604800 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.413292 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.152:8080/healthcheck\": dial tcp 10.217.0.152:8080: connect: connection refused" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.413304 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" 
podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.152:8080/healthcheck\": dial tcp 10.217.0.152:8080: connect: connection refused" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.428164 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.442974 5030 scope.go:117] "RemoveContainer" containerID="f4f036c607bf3ef974bccea536d7279888114e64c92919f79bea59ab01d3bf6a" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507610 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72xcq\" (UniqueName: \"kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.507778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\" (UID: \"6acf7ebc-be88-4e1d-9b31-bef0707ea082\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.509413 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.512325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs" (OuterVolumeSpecName: "logs") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.521369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq" (OuterVolumeSpecName: "kube-api-access-72xcq") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "kube-api-access-72xcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.525755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.525867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts" (OuterVolumeSpecName: "scripts") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.547399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611498 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611543 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611556 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611567 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6acf7ebc-be88-4e1d-9b31-bef0707ea082-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611581 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.611594 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72xcq\" (UniqueName: \"kubernetes.io/projected/6acf7ebc-be88-4e1d-9b31-bef0707ea082-kube-api-access-72xcq\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.612315 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.612392 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data podName:38bfa948-051b-4e6c-92a5-3714e0652088 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:43.612374038 +0000 UTC m=+1335.932634326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data") pod "rabbitmq-cell1-server-0" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088") : configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.622183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.625176 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.625288 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.628831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data" (OuterVolumeSpecName: "config-data") pod "6acf7ebc-be88-4e1d-9b31-bef0707ea082" (UID: "6acf7ebc-be88-4e1d-9b31-bef0707ea082"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: W0120 22:57:41.635505 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4e0a7b0_4723_42a8_aa99_c5f155dff065.slice/crio-5c5ba45262d5823fc921b92884c0f77002d1416c375024811477480f10417c7d WatchSource:0}: Error finding container 5c5ba45262d5823fc921b92884c0f77002d1416c375024811477480f10417c7d: Status 404 returned error can't find the container with id 5c5ba45262d5823fc921b92884c0f77002d1416c375024811477480f10417c7d Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.660848 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.714699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data\") pod \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.714834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lckx7\" (UniqueName: \"kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7\") pod \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.714875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle\") pod \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\" (UID: \"dd48f030-3391-4fdb-b101-45b0a7f4b9a3\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.715406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52lp\" (UniqueName: \"kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.715465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.715607 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.715660 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6acf7ebc-be88-4e1d-9b31-bef0707ea082-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.715673 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.715740 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.715865 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.715828247 +0000 UTC m=+1335.036088545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : configmap "openstack-scripts" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.721336 5030 projected.go:194] Error preparing data for projected volume kube-api-access-d52lp for pod openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.721409 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:42.721394063 +0000 UTC m=+1335.041654341 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-d52lp" (UniqueName: "kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.746734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data" (OuterVolumeSpecName: "config-data") pod "dd48f030-3391-4fdb-b101-45b0a7f4b9a3" (UID: "dd48f030-3391-4fdb-b101-45b0a7f4b9a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.754547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd48f030-3391-4fdb-b101-45b0a7f4b9a3" (UID: "dd48f030-3391-4fdb-b101-45b0a7f4b9a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.754850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7" (OuterVolumeSpecName: "kube-api-access-lckx7") pod "dd48f030-3391-4fdb-b101-45b0a7f4b9a3" (UID: "dd48f030-3391-4fdb-b101-45b0a7f4b9a3"). InnerVolumeSpecName "kube-api-access-lckx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.775253 5030 scope.go:117] "RemoveContainer" containerID="6b6d7fced9bfec32b48d7d732119d3237bc9fe4d5517f7a246ca37788d005e77" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.796872 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.806243 5030 generic.go:334] "Generic (PLEG): container finished" podID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerID="16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed" exitCode=143 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.806374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerDied","Data":"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.807634 5030 generic.go:334] "Generic (PLEG): container finished" podID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerID="baf6a60a1b77f421596bd948659ecac15aa8955fa1ce8ad3378955ea3bd359da" exitCode=2 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.807757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d969fd5b-6579-4bc5-b64f-05bcb2aec29d","Type":"ContainerDied","Data":"baf6a60a1b77f421596bd948659ecac15aa8955fa1ce8ad3378955ea3bd359da"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.807824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d969fd5b-6579-4bc5-b64f-05bcb2aec29d","Type":"ContainerDied","Data":"712b17246d7d0928ffcfb40f7ddc0b583f5dad565a9dc7d530c64379cf291a5e"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.807881 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="712b17246d7d0928ffcfb40f7ddc0b583f5dad565a9dc7d530c64379cf291a5e" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.809116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a4277815-cc65-44c8-b719-7673ce4c1d88","Type":"ContainerDied","Data":"6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.809193 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6753c4750027cbd2b74c3a98cd876a96119d4d1c1046b188106460bdd86fca52" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.811544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"6acf7ebc-be88-4e1d-9b31-bef0707ea082","Type":"ContainerDied","Data":"bb9c254ed7087071c1e4a6a64b5ffd65090cf6e8d35568315b3a537d68632b61"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.811717 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.823119 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.823143 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lckx7\" (UniqueName: \"kubernetes.io/projected/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-kube-api-access-lckx7\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.823154 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd48f030-3391-4fdb-b101-45b0a7f4b9a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.825064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2ddc1fbc-a41d-446b-81e3-a15febd39285","Type":"ContainerDied","Data":"e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.825128 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64b70ac088634e0a35988aae2c05712d18bd425b64b8c3b786f8bdee5fc97a1" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.827254 5030 generic.go:334] "Generic (PLEG): container finished" podID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" exitCode=0 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.828679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f0fffc32-86e9-4278-8c2d-37a04bbd89b1","Type":"ContainerDied","Data":"efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.843702 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.843933 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a05e942-d886-4355-b352-50408cfe11e3" containerID="f42f977d4b1f56e969e3645e77647af497eeead81e8ac35d88c15264e254f40f" exitCode=0 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.843999 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerDied","Data":"f42f977d4b1f56e969e3645e77647af497eeead81e8ac35d88c15264e254f40f"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.844018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"8a05e942-d886-4355-b352-50408cfe11e3","Type":"ContainerDied","Data":"17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.844030 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17458a6af4868b27fea5694c3ea52722ef21ec8531b391cdc69caeb5dc2e847a" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.847694 5030 generic.go:334] "Generic (PLEG): container finished" podID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerID="06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7" exitCode=143 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.847750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerDied","Data":"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.849829 5030 generic.go:334] "Generic (PLEG): container finished" podID="d09f29b6-8161-4158-810d-efe783958358" containerID="a5eb9a66cd59578f8b736dfa3e91fcb0a66588356d5c43c532a1426a7815f1a9" exitCode=0 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.849891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerDied","Data":"a5eb9a66cd59578f8b736dfa3e91fcb0a66588356d5c43c532a1426a7815f1a9"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.850887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" event={"ID":"b4e0a7b0-4723-42a8-aa99-c5f155dff065","Type":"ContainerStarted","Data":"5c5ba45262d5823fc921b92884c0f77002d1416c375024811477480f10417c7d"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.853646 5030 generic.go:334] "Generic (PLEG): container finished" podID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerID="65b8ebd2e425010f04d6e967a0f85be11bd51c38df6fd77b8844497673c771f4" exitCode=0 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.853675 5030 generic.go:334] "Generic (PLEG): container finished" podID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerID="709a2a7ededf39d4cf921562b31fe5c77003d1a66336a20cb086a47b4c5c88f2" exitCode=0 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.853718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerDied","Data":"65b8ebd2e425010f04d6e967a0f85be11bd51c38df6fd77b8844497673c771f4"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.853747 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerDied","Data":"709a2a7ededf39d4cf921562b31fe5c77003d1a66336a20cb086a47b4c5c88f2"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.865299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerStarted","Data":"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.865490 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener-log" containerID="cri-o://903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.865819 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener" containerID="cri-o://405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5" gracePeriod=30 Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.874316 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.888004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd48f030-3391-4fdb-b101-45b0a7f4b9a3","Type":"ContainerDied","Data":"a0ce169c8dab7719e35efb95e7fc95eedf3c1d30c8e461aa5019df790f82a14e"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.888168 5030 scope.go:117] "RemoveContainer" containerID="09189446050efd620d2103042e8ffbe0d00050dcd359f229a97abe5fb2fafe72" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.888261 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.888173 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.905056 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.905157 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.908535 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" podStartSLOduration=6.908516416 podStartE2EDuration="6.908516416s" podCreationTimestamp="2026-01-20 22:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:41.882716396 +0000 UTC m=+1334.202976694" watchObservedRunningTime="2026-01-20 22:57:41.908516416 +0000 UTC m=+1334.228776694" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.911049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4b05ddec-40d4-42ea-987c-03718a02724d","Type":"ContainerDied","Data":"6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.911138 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df536c3c6c433873930d888129d0a6a748662d5d4db80f163c0129a4c3d8368" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.917482 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerID="f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9" exitCode=143 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.917573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerDied","Data":"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.919140 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.920394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"bcf270c3-8bc4-4b11-8c04-10708f22f76e","Type":"ContainerDied","Data":"803a871ac34dc6e83e980939ad71904b625abed93d38aea905fba51dbb908812"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.923125 5030 generic.go:334] "Generic (PLEG): container finished" podID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerID="d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29" exitCode=143 Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.923178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerDied","Data":"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29"} Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.924227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlb4n\" (UniqueName: \"kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n\") pod \"a4277815-cc65-44c8-b719-7673ce4c1d88\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.924253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data\") pod \"a4277815-cc65-44c8-b719-7673ce4c1d88\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.924293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs\") pod \"a4277815-cc65-44c8-b719-7673ce4c1d88\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.924409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs\") pod \"a4277815-cc65-44c8-b719-7673ce4c1d88\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.924446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle\") pod \"a4277815-cc65-44c8-b719-7673ce4c1d88\" (UID: \"a4277815-cc65-44c8-b719-7673ce4c1d88\") " Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.925797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs" (OuterVolumeSpecName: "logs") pod "a4277815-cc65-44c8-b719-7673ce4c1d88" (UID: "a4277815-cc65-44c8-b719-7673ce4c1d88"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.954186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n" (OuterVolumeSpecName: "kube-api-access-xlb4n") pod "a4277815-cc65-44c8-b719-7673ce4c1d88" (UID: "a4277815-cc65-44c8-b719-7673ce4c1d88"). InnerVolumeSpecName "kube-api-access-xlb4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.956667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.956771 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.960804 5030 scope.go:117] "RemoveContainer" containerID="f2e4fba2a035d0d53c65eea9e0abbcb8b18335ef51e2f23ea789c2843967c10f" Jan 20 22:57:41 crc kubenswrapper[5030]: E0120 22:57:41.982919 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-d52lp operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" podUID="f12523fd-a36d-4490-93bb-aa22ef7a0595" Jan 20 22:57:41 crc kubenswrapper[5030]: I0120 22:57:41.986956 5030 generic.go:334] "Generic (PLEG): container finished" podID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerID="c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2" exitCode=143 Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.005478 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01baa0a2-d612-4ca4-9ee2-0957df3b6d77" path="/var/lib/kubelet/pods/01baa0a2-d612-4ca4-9ee2-0957df3b6d77/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.007169 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d757282-7e93-4149-82c8-560aa414ff9f" path="/var/lib/kubelet/pods/0d757282-7e93-4149-82c8-560aa414ff9f/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.007819 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" path="/var/lib/kubelet/pods/270d79a6-4c9f-415b-aa6a-e0d33da8177d/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.009038 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab29253-03b2-4351-a4ba-96cff66ebe9d" path="/var/lib/kubelet/pods/2ab29253-03b2-4351-a4ba-96cff66ebe9d/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.009127 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.009909 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb00a6c-b955-4b9f-b41c-06978a151b1e" path="/var/lib/kubelet/pods/2cb00a6c-b955-4b9f-b41c-06978a151b1e/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.010443 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc300a3-f7d4-4061-ba18-268032869088" path="/var/lib/kubelet/pods/2cc300a3-f7d4-4061-ba18-268032869088/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.010939 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.011095 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.011095 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.011181 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b7bb7d-f87c-4f2f-8d95-5d8ff851912c" path="/var/lib/kubelet/pods/42b7bb7d-f87c-4f2f-8d95-5d8ff851912c/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.011647 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.012384 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44044fda-5fae-4af3-8ef5-96c63087bed3" path="/var/lib/kubelet/pods/44044fda-5fae-4af3-8ef5-96c63087bed3/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.013083 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data" (OuterVolumeSpecName: "config-data") pod "a4277815-cc65-44c8-b719-7673ce4c1d88" (UID: "a4277815-cc65-44c8-b719-7673ce4c1d88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.013161 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.014157 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60acba6f-1f50-4392-9a13-6b6ef04c93a6" path="/var/lib/kubelet/pods/60acba6f-1f50-4392-9a13-6b6ef04c93a6/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.015073 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" path="/var/lib/kubelet/pods/6acf7ebc-be88-4e1d-9b31-bef0707ea082/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.016021 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f91afae-2ac6-49ea-aa9a-b5c8eed0184f" path="/var/lib/kubelet/pods/6f91afae-2ac6-49ea-aa9a-b5c8eed0184f/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.017889 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.018317 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c19680-49a9-401d-bea1-db45fd5e9004" path="/var/lib/kubelet/pods/70c19680-49a9-401d-bea1-db45fd5e9004/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.019463 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715ee943-81f3-4215-8e95-2dbb93ff0e3d" path="/var/lib/kubelet/pods/715ee943-81f3-4215-8e95-2dbb93ff0e3d/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.020297 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75632b87-176b-4dea-be41-ea40a8615763" path="/var/lib/kubelet/pods/75632b87-176b-4dea-be41-ea40a8615763/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.020890 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8016e98a-33a5-43a8-9480-1a424b187263" path="/var/lib/kubelet/pods/8016e98a-33a5-43a8-9480-1a424b187263/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.022189 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.022296 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dca47bc-b3f4-427a-b88c-451cc8b36bc9" path="/var/lib/kubelet/pods/8dca47bc-b3f4-427a-b88c-451cc8b36bc9/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.023154 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35" path="/var/lib/kubelet/pods/8dd0a29f-3a2b-4fdb-86b6-11fb77b00a35/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.024013 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="981d713c-d94c-4957-8d8d-eb000cc07969" path="/var/lib/kubelet/pods/981d713c-d94c-4957-8d8d-eb000cc07969/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.024745 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b344be57-888d-48a1-8b94-26ed9061b2f2" path="/var/lib/kubelet/pods/b344be57-888d-48a1-8b94-26ed9061b2f2/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.025316 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1df9c08-0838-4b4e-a77d-0105392e8c10" path="/var/lib/kubelet/pods/d1df9c08-0838-4b4e-a77d-0105392e8c10/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.025862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.025903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.025948 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.025978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.026007 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.026056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.026124 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-6clrh\" (UniqueName: \"kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.026158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.026179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs\") pod \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\" (UID: \"bcf270c3-8bc4-4b11-8c04-10708f22f76e\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.028610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs" (OuterVolumeSpecName: "logs") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.035282 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1" path="/var/lib/kubelet/pods/fd0a5408-7192-4ec5-8ce1-b80d5da4e4a1/volumes" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.037389 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlb4n\" (UniqueName: \"kubernetes.io/projected/a4277815-cc65-44c8-b719-7673ce4c1d88-kube-api-access-xlb4n\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.038909 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.038930 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4277815-cc65-44c8-b719-7673ce4c1d88-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.041980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.044597 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048448 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerDied","Data":"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2"} Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"962ec476-3693-41a1-b425-a275bf790beb","Type":"ContainerDied","Data":"7579ebaccaa33ec14f8e1d1c7e82a66614c0f6ace8b6b2afb2695cd0a9684c22"} Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048913 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048932 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.048943 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.049847 5030 scope.go:117] "RemoveContainer" containerID="38e3f4f4dcc255a9e5a20a831293e1da39522536a4dd156d458238e937469e98" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.052954 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.057466 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.063493 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.068308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.080212 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.084337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh" (OuterVolumeSpecName: "kube-api-access-6clrh") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "kube-api-access-6clrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.085143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts" (OuterVolumeSpecName: "scripts") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.088538 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.089812 5030 scope.go:117] "RemoveContainer" containerID="ea1e0a375f06f75251c94f8a9680e30e93e6ebbeb077b27d82cd29945cdfd113" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.122309 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d94sk\" (UniqueName: \"kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8gws\" (UniqueName: \"kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs\") pod \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs\") pod \"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: 
\"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbncs\" (UniqueName: \"kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141534 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config\") pod \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle\") pod \"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom\") pod \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141781 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr8f9\" (UniqueName: \"kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9\") pod 
\"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141805 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs\") pod \"da108774-0e8b-4790-ac01-1a542370a374\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs\") pod \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs\") pod \"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.141994 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data\") pod \"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142038 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142062 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkp57\" (UniqueName: \"kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom\") pod \"7bc0c068-8645-4d10-95a1-94cce9fe9592\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142199 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4b6g\" (UniqueName: \"kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g\") pod \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle\") pod \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\" (UID: \"d969fd5b-6579-4bc5-b64f-05bcb2aec29d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom\") pod \"da108774-0e8b-4790-ac01-1a542370a374\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142319 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs\") pod \"4b05ddec-40d4-42ea-987c-03718a02724d\" (UID: \"4b05ddec-40d4-42ea-987c-03718a02724d\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs\") pod \"7bc0c068-8645-4d10-95a1-94cce9fe9592\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs\") pod \"2ddc1fbc-a41d-446b-81e3-a15febd39285\" (UID: \"2ddc1fbc-a41d-446b-81e3-a15febd39285\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle\") pod \"8a05e942-d886-4355-b352-50408cfe11e3\" (UID: \"8a05e942-d886-4355-b352-50408cfe11e3\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle\") pod \"962ec476-3693-41a1-b425-a275bf790beb\" (UID: \"962ec476-3693-41a1-b425-a275bf790beb\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom\") pod \"d09f29b6-8161-4158-810d-efe783958358\" (UID: \"d09f29b6-8161-4158-810d-efe783958358\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.142877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143322 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d09f29b6-8161-4158-810d-efe783958358-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143340 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf270c3-8bc4-4b11-8c04-10708f22f76e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143351 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143361 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143374 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bcf270c3-8bc4-4b11-8c04-10708f22f76e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143382 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.143393 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6clrh\" (UniqueName: \"kubernetes.io/projected/bcf270c3-8bc4-4b11-8c04-10708f22f76e-kube-api-access-6clrh\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.145939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs" (OuterVolumeSpecName: "logs") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.153876 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.155852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs" (OuterVolumeSpecName: "logs") pod "da108774-0e8b-4790-ac01-1a542370a374" (UID: "da108774-0e8b-4790-ac01-1a542370a374"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.156040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.159883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.160068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs" (OuterVolumeSpecName: "logs") pod "e0613c56-1bb3-455b-a45d-6b9ff28427c5" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.160491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs" (OuterVolumeSpecName: "logs") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166502 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs" (OuterVolumeSpecName: "logs") pod "7bc0c068-8645-4d10-95a1-94cce9fe9592" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.166693 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9" (OuterVolumeSpecName: "kube-api-access-lr8f9") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "kube-api-access-lr8f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.168274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws" (OuterVolumeSpecName: "kube-api-access-r8gws") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "kube-api-access-r8gws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.169184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk" (OuterVolumeSpecName: "kube-api-access-d94sk") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "kube-api-access-d94sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.170179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57" (OuterVolumeSpecName: "kube-api-access-fkp57") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "kube-api-access-fkp57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.170600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0613c56-1bb3-455b-a45d-6b9ff28427c5" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.186810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da108774-0e8b-4790-ac01-1a542370a374" (UID: "da108774-0e8b-4790-ac01-1a542370a374"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.193476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts" (OuterVolumeSpecName: "scripts") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.193563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts" (OuterVolumeSpecName: "scripts") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.201640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g" (OuterVolumeSpecName: "kube-api-access-g4b6g") pod "d969fd5b-6579-4bc5-b64f-05bcb2aec29d" (UID: "d969fd5b-6579-4bc5-b64f-05bcb2aec29d"). InnerVolumeSpecName "kube-api-access-g4b6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.212986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs" (OuterVolumeSpecName: "kube-api-access-hbncs") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "kube-api-access-hbncs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.213332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7bc0c068-8645-4d10-95a1-94cce9fe9592" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.213572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts" (OuterVolumeSpecName: "scripts") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.218791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244032 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8svt8\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.244840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.245690 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.245915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle\") pod \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\" (UID: \"7dd556b2-18b0-43da-ad37-e84c9e7601cc\") " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.245430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.245460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.246968 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247043 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d94sk\" (UniqueName: \"kubernetes.io/projected/d09f29b6-8161-4158-810d-efe783958358-kube-api-access-d94sk\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247112 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8gws\" (UniqueName: \"kubernetes.io/projected/8a05e942-d886-4355-b352-50408cfe11e3-kube-api-access-r8gws\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247177 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247262 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247347 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbncs\" (UniqueName: \"kubernetes.io/projected/962ec476-3693-41a1-b425-a275bf790beb-kube-api-access-hbncs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247424 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247526 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247587 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr8f9\" (UniqueName: \"kubernetes.io/projected/4b05ddec-40d4-42ea-987c-03718a02724d-kube-api-access-lr8f9\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247662 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da108774-0e8b-4790-ac01-1a542370a374-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247718 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247781 5030 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247839 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0613c56-1bb3-455b-a45d-6b9ff28427c5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.247891 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7dd556b2-18b0-43da-ad37-e84c9e7601cc-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248010 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b05ddec-40d4-42ea-987c-03718a02724d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248067 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddc1fbc-a41d-446b-81e3-a15febd39285-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248121 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a05e942-d886-4355-b352-50408cfe11e3-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248374 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkp57\" (UniqueName: \"kubernetes.io/projected/2ddc1fbc-a41d-446b-81e3-a15febd39285-kube-api-access-fkp57\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248433 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248486 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248571 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4b6g\" (UniqueName: \"kubernetes.io/projected/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-api-access-g4b6g\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.248915 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962ec476-3693-41a1-b425-a275bf790beb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.249023 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.249082 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.249142 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/8a05e942-d886-4355-b352-50408cfe11e3-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.249203 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bc0c068-8645-4d10-95a1-94cce9fe9592-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.249256 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.246970 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.249417 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data podName:b0a5273a-9439-4881-b88a-2a3aa11e410b nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.249402428 +0000 UTC m=+1338.569662816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data") pod "rabbitmq-server-0" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b") : configmap "rabbitmq-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.268579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.274073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4277815-cc65-44c8-b719-7673ce4c1d88" (UID: "a4277815-cc65-44c8-b719-7673ce4c1d88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.274489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8" (OuterVolumeSpecName: "kube-api-access-8svt8") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "kube-api-access-8svt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.321788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "mysql-db") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") pod \"barbican-worker-76869767b9-5lj6l\" (UID: \"7bc0c068-8645-4d10-95a1-94cce9fe9592\") " pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.351900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") pod \"barbican-keystone-listener-5d9845b76b-jvdpt\" (UID: \"da108774-0e8b-4790-ac01-1a542370a374\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.352011 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.352029 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.352041 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8svt8\" (UniqueName: \"kubernetes.io/projected/7dd556b2-18b0-43da-ad37-e84c9e7601cc-kube-api-access-8svt8\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.352053 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352437 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352499 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.352483507 +0000 UTC m=+1338.672743785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352733 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352763 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.352756334 +0000 UTC m=+1338.673016622 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352908 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352936 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.352930388 +0000 UTC m=+1338.673190676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352969 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.352986 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.35298031 +0000 UTC m=+1338.673240598 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.358502 5030 projected.go:194] Error preparing data for projected volume kube-api-access-szznv for pod openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.358585 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv podName:7bc0c068-8645-4d10-95a1-94cce9fe9592 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.358565156 +0000 UTC m=+1338.678825444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-szznv" (UniqueName: "kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv") pod "barbican-worker-76869767b9-5lj6l" (UID: "7bc0c068-8645-4d10-95a1-94cce9fe9592") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.359590 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kt76q for pod openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.359704 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q podName:da108774-0e8b-4790-ac01-1a542370a374 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.359688243 +0000 UTC m=+1338.679948531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kt76q" (UniqueName: "kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q") pod "barbican-keystone-listener-5d9845b76b-jvdpt" (UID: "da108774-0e8b-4790-ac01-1a542370a374") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.419770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.454416 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.485558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a4277815-cc65-44c8-b719-7673ce4c1d88" (UID: "a4277815-cc65-44c8-b719-7673ce4c1d88"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.503823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.530705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d969fd5b-6579-4bc5-b64f-05bcb2aec29d" (UID: "d969fd5b-6579-4bc5-b64f-05bcb2aec29d"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.539099 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.556725 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.556885 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4277815-cc65-44c8-b719-7673ce4c1d88-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.556902 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.562938 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.571486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.630299 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data" (OuterVolumeSpecName: "config-data") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.639698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d969fd5b-6579-4bc5-b64f-05bcb2aec29d" (UID: "d969fd5b-6579-4bc5-b64f-05bcb2aec29d"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.645733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d969fd5b-6579-4bc5-b64f-05bcb2aec29d" (UID: "d969fd5b-6579-4bc5-b64f-05bcb2aec29d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.659518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.659698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.659787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.659904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") pod \"root-account-create-update-g8w6k\" (UID: \"d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d\") " pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660244 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") pod \"barbican-api-54b9d65d4d-n574x\" (UID: \"e0613c56-1bb3-455b-a45d-6b9ff28427c5\") " pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660391 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660481 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660540 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660600 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d969fd5b-6579-4bc5-b64f-05bcb2aec29d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.660671 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.660811 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661315 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.661298355 +0000 UTC m=+1338.981558643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-internal-svc" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661103 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661143 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661494 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.661471329 +0000 UTC m=+1338.981731617 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "combined-ca-bundle" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661749 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.661740416 +0000 UTC m=+1338.982000694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : configmap "openstack-cell1-scripts" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661954 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.662062 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.662054183 +0000 UTC m=+1338.982314471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "cert-barbican-public-svc" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.661973 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.662201 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.662194697 +0000 UTC m=+1338.982454985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : secret "barbican-config-data" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.664270 5030 projected.go:194] Error preparing data for projected volume kube-api-access-km2cf for pod openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.664402 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf podName:e0613c56-1bb3-455b-a45d-6b9ff28427c5 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.66439336 +0000 UTC m=+1338.984653648 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-km2cf" (UniqueName: "kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf") pod "barbican-api-54b9d65d4d-n574x" (UID: "e0613c56-1bb3-455b-a45d-6b9ff28427c5") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.670385 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4vr69 for pod openstack-kuttl-tests/root-account-create-update-g8w6k: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.670681 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69 podName:d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d nodeName:}" failed. No retries permitted until 2026-01-20 22:57:46.670663414 +0000 UTC m=+1338.990923702 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4vr69" (UniqueName: "kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69") pod "root-account-create-update-g8w6k" (UID: "d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.700861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.703742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a05e942-d886-4355-b352-50408cfe11e3" (UID: "8a05e942-d886-4355-b352-50408cfe11e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.718194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.720961 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.743790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.761794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.762815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d52lp\" (UniqueName: \"kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.762860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts\") pod \"keystone-8c9f-account-create-update-sstcf\" (UID: \"f12523fd-a36d-4490-93bb-aa22ef7a0595\") " pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.762968 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.762984 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.762994 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.763002 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.763012 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a05e942-d886-4355-b352-50408cfe11e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.763020 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.763069 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.763105 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:44.763092563 +0000 UTC m=+1337.083352851 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : configmap "openstack-scripts" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.766117 5030 projected.go:194] Error preparing data for projected volume kube-api-access-d52lp for pod openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.766161 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp podName:f12523fd-a36d-4490-93bb-aa22ef7a0595 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:44.766151258 +0000 UTC m=+1337.086411546 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-d52lp" (UniqueName: "kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp") pod "keystone-8c9f-account-create-update-sstcf" (UID: "f12523fd-a36d-4490-93bb-aa22ef7a0595") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.789068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data" (OuterVolumeSpecName: "config-data") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.799479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.807550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bcf270c3-8bc4-4b11-8c04-10708f22f76e" (UID: "bcf270c3-8bc4-4b11-8c04-10708f22f76e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.811738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.825097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.827922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b05ddec-40d4-42ea-987c-03718a02724d" (UID: "4b05ddec-40d4-42ea-987c-03718a02724d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.840430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.852956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.853852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data" (OuterVolumeSpecName: "config-data") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.859255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data" (OuterVolumeSpecName: "config-data") pod "962ec476-3693-41a1-b425-a275bf790beb" (UID: "962ec476-3693-41a1-b425-a275bf790beb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864599 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b05ddec-40d4-42ea-987c-03718a02724d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864694 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864705 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864715 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/962ec476-3693-41a1-b425-a275bf790beb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864723 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864732 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864740 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864750 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcf270c3-8bc4-4b11-8c04-10708f22f76e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864758 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.864766 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.868912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.869724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7dd556b2-18b0-43da-ad37-e84c9e7601cc" (UID: "7dd556b2-18b0-43da-ad37-e84c9e7601cc"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.880292 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data" (OuterVolumeSpecName: "config-data") pod "d09f29b6-8161-4158-810d-efe783958358" (UID: "d09f29b6-8161-4158-810d-efe783958358"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.891484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data" (OuterVolumeSpecName: "config-data") pod "2ddc1fbc-a41d-446b-81e3-a15febd39285" (UID: "2ddc1fbc-a41d-446b-81e3-a15febd39285"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.919933 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4 is running failed: container process not found" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.920405 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4 is running failed: container process not found" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.921458 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4 is running failed: container process not found" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 22:57:42 crc kubenswrapper[5030]: E0120 22:57:42.921494 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.966934 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd556b2-18b0-43da-ad37-e84c9e7601cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.967267 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.967408 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d09f29b6-8161-4158-810d-efe783958358-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.967597 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddc1fbc-a41d-446b-81e3-a15febd39285-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.989207 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:57:42 crc kubenswrapper[5030]: I0120 22:57:42.996412 5030 scope.go:117] "RemoveContainer" containerID="4519dfb732f0f1ed4b9e6668cf83ed384631e71d9e34923e24132c02c8524752" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.023427 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.032875 5030 scope.go:117] "RemoveContainer" containerID="be10608095d180892da25bc367a49e31085449b06dae8af243a26e65d423eabd" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.047732 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerID="73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f" exitCode=0 Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.047836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" event={"ID":"b4e0a7b0-4723-42a8-aa99-c5f155dff065","Type":"ContainerDied","Data":"73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.047892 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.055479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerStarted","Data":"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.060760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerStarted","Data":"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.069651 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom\") pod \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.069767 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs\") pod \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.069836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data\") pod \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.069862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle\") pod \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.070412 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwjxf\" (UniqueName: \"kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf\") pod \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.070458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6jg\" (UniqueName: \"kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg\") pod \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\" (UID: \"c2732803-fafc-45fd-9cc7-4b61bd446fe0\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.070666 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle\") pod \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\" (UID: \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.070722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data\") pod \"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\" (UID: 
\"f0fffc32-86e9-4278-8c2d-37a04bbd89b1\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.074938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" event={"ID":"f0994e0e-f167-4c5a-bef7-95da6f589767","Type":"ContainerStarted","Data":"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.075264 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" podUID="f0994e0e-f167-4c5a-bef7-95da6f589767" containerName="keystone-api" containerID="cri-o://54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb" gracePeriod=30 Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.076136 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.077896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs" (OuterVolumeSpecName: "logs") pod "c2732803-fafc-45fd-9cc7-4b61bd446fe0" (UID: "c2732803-fafc-45fd-9cc7-4b61bd446fe0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.082105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2732803-fafc-45fd-9cc7-4b61bd446fe0" (UID: "c2732803-fafc-45fd-9cc7-4b61bd446fe0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.083250 5030 scope.go:117] "RemoveContainer" containerID="4eb95752fd1da606b39641693b5639240910a1ec73d7ec7ac34635061387513d" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.086979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg" (OuterVolumeSpecName: "kube-api-access-sw6jg") pod "c2732803-fafc-45fd-9cc7-4b61bd446fe0" (UID: "c2732803-fafc-45fd-9cc7-4b61bd446fe0"). InnerVolumeSpecName "kube-api-access-sw6jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.090933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" event={"ID":"7dd556b2-18b0-43da-ad37-e84c9e7601cc","Type":"ContainerDied","Data":"c5ccd8bb48eb11e95727d07ea061c249533afa1e5a4417b1a46ed9ca691a2be8"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.091032 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.095853 5030 generic.go:334] "Generic (PLEG): container finished" podID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerID="14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66" exitCode=0 Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.095912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerDied","Data":"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.095938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" event={"ID":"bdb8aa29-a5b3-4060-a8a6-19d1eda96426","Type":"ContainerDied","Data":"2aa4e4b1173c3699a682420395dc50aa23d6e7720ee6294ef75d97af7f921c77"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.095993 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.104505 5030 generic.go:334] "Generic (PLEG): container finished" podID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerID="95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4" exitCode=0 Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.104571 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.104552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerDied","Data":"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.104649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9" event={"ID":"c2732803-fafc-45fd-9cc7-4b61bd446fe0","Type":"ContainerDied","Data":"64e63735adacf6c7238d20b1699a473042d698b7f08a4205fb78d95a2ced185e"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.107228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerStarted","Data":"18040f298fd1f797165b1d82dca99dc3c800a6017b34137e3a03f3f67c4e976f"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.107646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf" (OuterVolumeSpecName: "kube-api-access-rwjxf") pod "f0fffc32-86e9-4278-8c2d-37a04bbd89b1" (UID: "f0fffc32-86e9-4278-8c2d-37a04bbd89b1"). InnerVolumeSpecName "kube-api-access-rwjxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.108690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f0fffc32-86e9-4278-8c2d-37a04bbd89b1","Type":"ContainerDied","Data":"1a42cc36f57f686bb515eac375c4829e1b50f352c61784a231a964fd7d816ac5"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.108829 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.112067 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" podStartSLOduration=7.112055342 podStartE2EDuration="7.112055342s" podCreationTimestamp="2026-01-20 22:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:43.110948545 +0000 UTC m=+1335.431208833" watchObservedRunningTime="2026-01-20 22:57:43.112055342 +0000 UTC m=+1335.432315630" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.114831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d09f29b6-8161-4158-810d-efe783958358","Type":"ContainerDied","Data":"f7f33823fa68b9cd7b2bca090004c206b04e4346f1ee3a9b7bd97feeccce6063"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.115609 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.121711 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124369 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerID="903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a" exitCode=143 Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124541 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124582 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerDied","Data":"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a"} Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124649 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124554 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124609 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124734 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124744 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124787 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124833 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8w6k" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.124991 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.133845 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.172554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data\") pod \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.172803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle\") pod \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.172936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp7wh\" (UniqueName: \"kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh\") pod \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.173009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs\") pod \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.173164 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom\") pod \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\" (UID: \"bdb8aa29-a5b3-4060-a8a6-19d1eda96426\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.173451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs" (OuterVolumeSpecName: "logs") pod "bdb8aa29-a5b3-4060-a8a6-19d1eda96426" (UID: "bdb8aa29-a5b3-4060-a8a6-19d1eda96426"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.173868 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.173948 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2732803-fafc-45fd-9cc7-4b61bd446fe0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.174005 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.174073 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwjxf\" (UniqueName: \"kubernetes.io/projected/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-kube-api-access-rwjxf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.174140 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw6jg\" (UniqueName: \"kubernetes.io/projected/c2732803-fafc-45fd-9cc7-4b61bd446fe0-kube-api-access-sw6jg\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.178987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh" (OuterVolumeSpecName: "kube-api-access-dp7wh") pod "bdb8aa29-a5b3-4060-a8a6-19d1eda96426" (UID: "bdb8aa29-a5b3-4060-a8a6-19d1eda96426"). InnerVolumeSpecName "kube-api-access-dp7wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.180731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bdb8aa29-a5b3-4060-a8a6-19d1eda96426" (UID: "bdb8aa29-a5b3-4060-a8a6-19d1eda96426"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.249498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0fffc32-86e9-4278-8c2d-37a04bbd89b1" (UID: "f0fffc32-86e9-4278-8c2d-37a04bbd89b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.261969 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2732803-fafc-45fd-9cc7-4b61bd446fe0" (UID: "c2732803-fafc-45fd-9cc7-4b61bd446fe0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.261987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data" (OuterVolumeSpecName: "config-data") pod "f0fffc32-86e9-4278-8c2d-37a04bbd89b1" (UID: "f0fffc32-86e9-4278-8c2d-37a04bbd89b1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.275730 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.275759 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp7wh\" (UniqueName: \"kubernetes.io/projected/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-kube-api-access-dp7wh\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.275770 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.275779 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.275790 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fffc32-86e9-4278-8c2d-37a04bbd89b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.278310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data" (OuterVolumeSpecName: "config-data") pod "bdb8aa29-a5b3-4060-a8a6-19d1eda96426" (UID: "bdb8aa29-a5b3-4060-a8a6-19d1eda96426"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.289182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdb8aa29-a5b3-4060-a8a6-19d1eda96426" (UID: "bdb8aa29-a5b3-4060-a8a6-19d1eda96426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.299852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data" (OuterVolumeSpecName: "config-data") pod "c2732803-fafc-45fd-9cc7-4b61bd446fe0" (UID: "c2732803-fafc-45fd-9cc7-4b61bd446fe0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.377438 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.377807 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb8aa29-a5b3-4060-a8a6-19d1eda96426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.377823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2732803-fafc-45fd-9cc7-4b61bd446fe0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.434562 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.435612 5030 scope.go:117] "RemoveContainer" containerID="65b8ebd2e425010f04d6e967a0f85be11bd51c38df6fd77b8844497673c771f4" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.488868 5030 scope.go:117] "RemoveContainer" containerID="709a2a7ededf39d4cf921562b31fe5c77003d1a66336a20cb086a47b4c5c88f2" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.555859 5030 scope.go:117] "RemoveContainer" containerID="14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.590714 5030 scope.go:117] "RemoveContainer" containerID="06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.643932 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.658587 5030 scope.go:117] "RemoveContainer" containerID="14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66" Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.660063 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66\": container with ID starting with 14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66 not found: ID does not exist" containerID="14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.660116 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66"} err="failed to get container status \"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66\": rpc error: code = NotFound desc = could not find container \"14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66\": container with ID starting with 14bfbe083e51244aea6aa5252b2d46ca796ef97c8afa9c47bc422ff79846cd66 not found: ID does not exist" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.660142 5030 scope.go:117] "RemoveContainer" containerID="06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7" Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.661027 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7\": container with ID starting with 06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7 not found: ID does not exist" containerID="06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.661063 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7"} err="failed to get container status \"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7\": rpc error: code = NotFound desc = could not find container \"06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7\": container with ID starting with 06e68b68bba4d4968142a0ed2d7232e6d395b065e2c82b3bdda9db70cd9473e7 not found: ID does not exist" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.661131 5030 scope.go:117] "RemoveContainer" containerID="95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4" Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.682529 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.682607 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data podName:38bfa948-051b-4e6c-92a5-3714e0652088 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:47.682589316 +0000 UTC m=+1340.002849604 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data") pod "rabbitmq-cell1-server-0" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088") : configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.683926 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.693135 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-76869767b9-5lj6l"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.701011 5030 scope.go:117] "RemoveContainer" containerID="c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.712233 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.722980 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.725414 5030 scope.go:117] "RemoveContainer" containerID="95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.729357 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.730262 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4\": container with ID starting with 95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4 not found: ID does not exist" containerID="95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.730297 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4"} err="failed to get container status \"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4\": rpc error: code = NotFound desc = could not find container \"95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4\": container with ID starting with 95f59c7ca745d484ee25cabdc0bdf2f4f2cc68cfe3f951e0879e51278548c3c4 not found: ID does not exist" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.730322 5030 scope.go:117] "RemoveContainer" containerID="c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2" Jan 20 22:57:43 crc kubenswrapper[5030]: E0120 22:57:43.730642 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2\": container with ID starting with c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2 not found: ID does not exist" containerID="c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.730843 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2"} err="failed to get container status \"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2\": rpc 
error: code = NotFound desc = could not find container \"c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2\": container with ID starting with c7e72e59310be353b8fa092d384d923b5d454411590aafac1c75d0f354a04eb2 not found: ID does not exist" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.730882 5030 scope.go:117] "RemoveContainer" containerID="efd0142d2cdad98921b0a888af00007450dcab3e20ff251d67667fe179fb89a4" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.734806 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-c9769d76f-gpdjs"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.766215 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.773275 5030 scope.go:117] "RemoveContainer" containerID="fef776c551a97498795259e77769d450688d573f30cb486c4698bfd320905171" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.783143 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5d9845b76b-jvdpt"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.783752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.783840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5ww\" (UniqueName: \"kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle\") 
pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.784768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys\") pod \"f0994e0e-f167-4c5a-bef7-95da6f589767\" (UID: \"f0994e0e-f167-4c5a-bef7-95da6f589767\") " Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.785277 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szznv\" (UniqueName: \"kubernetes.io/projected/7bc0c068-8645-4d10-95a1-94cce9fe9592-kube-api-access-szznv\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.785353 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.785414 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bc0c068-8645-4d10-95a1-94cce9fe9592-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.787707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.790770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts" (OuterVolumeSpecName: "scripts") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.796962 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.801728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww" (OuterVolumeSpecName: "kube-api-access-dj5ww") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "kube-api-access-dj5ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.801736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.804203 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-5bdb48996-6pf22"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.809140 5030 scope.go:117] "RemoveContainer" containerID="a5eb9a66cd59578f8b736dfa3e91fcb0a66588356d5c43c532a1426a7815f1a9" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.810905 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.816568 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.823803 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.828781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.829141 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.834977 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.841390 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.847946 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.853774 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.853908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data" (OuterVolumeSpecName: "config-data") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.863800 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.876643 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886673 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886699 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886710 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886718 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886728 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da108774-0e8b-4790-ac01-1a542370a374-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886737 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt76q\" (UniqueName: \"kubernetes.io/projected/da108774-0e8b-4790-ac01-1a542370a374-kube-api-access-kt76q\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886746 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886760 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5ww\" (UniqueName: \"kubernetes.io/projected/f0994e0e-f167-4c5a-bef7-95da6f589767-kube-api-access-dj5ww\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.886768 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.887166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.892847 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8w6k"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.900183 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8w6k"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.903861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0994e0e-f167-4c5a-bef7-95da6f589767" (UID: "f0994e0e-f167-4c5a-bef7-95da6f589767"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.909378 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.913744 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-54b9d65d4d-n574x"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.918244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.922651 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84c478564-jqlr9"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.926905 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.931337 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.935897 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.942139 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.949780 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:57:43 crc kubenswrapper[5030]: I0120 22:57:43.954307 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.980360 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" path="/var/lib/kubelet/pods/2ddc1fbc-a41d-446b-81e3-a15febd39285/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.982307 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" path="/var/lib/kubelet/pods/4b05ddec-40d4-42ea-987c-03718a02724d/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.983220 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc0c068-8645-4d10-95a1-94cce9fe9592" path="/var/lib/kubelet/pods/7bc0c068-8645-4d10-95a1-94cce9fe9592/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.984398 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" 
path="/var/lib/kubelet/pods/7dd556b2-18b0-43da-ad37-e84c9e7601cc/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.985404 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a05e942-d886-4355-b352-50408cfe11e3" path="/var/lib/kubelet/pods/8a05e942-d886-4355-b352-50408cfe11e3/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.987376 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a857cf-5973-4c78-9758-ab04c932e180" path="/var/lib/kubelet/pods/95a857cf-5973-4c78-9758-ab04c932e180/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988527 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962ec476-3693-41a1-b425-a275bf790beb" path="/var/lib/kubelet/pods/962ec476-3693-41a1-b425-a275bf790beb/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988583 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988603 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vr69\" (UniqueName: \"kubernetes.io/projected/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-kube-api-access-4vr69\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988612 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988638 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0994e0e-f167-4c5a-bef7-95da6f589767-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988648 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2cf\" (UniqueName: \"kubernetes.io/projected/e0613c56-1bb3-455b-a45d-6b9ff28427c5-kube-api-access-km2cf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988657 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988667 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988676 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.988685 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0613c56-1bb3-455b-a45d-6b9ff28427c5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.989557 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" path="/var/lib/kubelet/pods/a4277815-cc65-44c8-b719-7673ce4c1d88/volumes" Jan 20 22:57:44 crc 
kubenswrapper[5030]: I0120 22:57:43.991202 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" path="/var/lib/kubelet/pods/bcf270c3-8bc4-4b11-8c04-10708f22f76e/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.992569 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" path="/var/lib/kubelet/pods/bdb8aa29-a5b3-4060-a8a6-19d1eda96426/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.994546 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" path="/var/lib/kubelet/pods/c2732803-fafc-45fd-9cc7-4b61bd446fe0/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.995380 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d09f29b6-8161-4158-810d-efe783958358" path="/var/lib/kubelet/pods/d09f29b6-8161-4158-810d-efe783958358/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.996055 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d" path="/var/lib/kubelet/pods/d5bb01e2-3d2e-4d4b-a851-4eff63db9d1d/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.996464 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" path="/var/lib/kubelet/pods/d969fd5b-6579-4bc5-b64f-05bcb2aec29d/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.997933 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da108774-0e8b-4790-ac01-1a542370a374" path="/var/lib/kubelet/pods/da108774-0e8b-4790-ac01-1a542370a374/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:43.998421 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" path="/var/lib/kubelet/pods/dd48f030-3391-4fdb-b101-45b0a7f4b9a3/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.000868 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0613c56-1bb3-455b-a45d-6b9ff28427c5" path="/var/lib/kubelet/pods/e0613c56-1bb3-455b-a45d-6b9ff28427c5/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.001477 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" path="/var/lib/kubelet/pods/f0fffc32-86e9-4278-8c2d-37a04bbd89b1/volumes" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.161943 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869/ovn-northd/0.log" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.161990 5030 generic.go:334] "Generic (PLEG): container finished" podID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" exitCode=139 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.162043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerDied","Data":"247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.167350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" 
event={"ID":"b4e0a7b0-4723-42a8-aa99-c5f155dff065","Type":"ContainerStarted","Data":"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.177524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerStarted","Data":"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.177693 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-log" containerID="cri-o://683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.177944 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-api" containerID="cri-o://76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.178038 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.178147 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.192101 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" podStartSLOduration=6.192078839 podStartE2EDuration="6.192078839s" podCreationTimestamp="2026-01-20 22:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:44.185008216 +0000 UTC m=+1336.505268514" watchObservedRunningTime="2026-01-20 22:57:44.192078839 +0000 UTC m=+1336.512339157" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.193849 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:43888->10.217.0.146:9311: read: connection reset by peer" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.193849 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:43890->10.217.0.146:9311: read: connection reset by peer" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.195059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerStarted","Data":"e0142e268b351b6a62427e70c71a48fc93815f1920a2cb306ed981400fac604f"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.195239 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" 
podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-api" containerID="cri-o://18040f298fd1f797165b1d82dca99dc3c800a6017b34137e3a03f3f67c4e976f" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.195359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.195441 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-httpd" containerID="cri-o://e0142e268b351b6a62427e70c71a48fc93815f1920a2cb306ed981400fac604f" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.205063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerStarted","Data":"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.205251 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api-log" containerID="cri-o://a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.205360 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.205393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.205420 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api" containerID="cri-o://b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" gracePeriod=30 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.209670 5030 generic.go:334] "Generic (PLEG): container finished" podID="f0994e0e-f167-4c5a-bef7-95da6f589767" containerID="54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb" exitCode=0 Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.209744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" event={"ID":"f0994e0e-f167-4c5a-bef7-95da6f589767","Type":"ContainerDied","Data":"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.209775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" event={"ID":"f0994e0e-f167-4c5a-bef7-95da6f589767","Type":"ContainerDied","Data":"330b2c233250fdbd1793d68fb3084927a79634629cb113a67425e6c0f3ae8ebf"} Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.209795 5030 scope.go:117] "RemoveContainer" containerID="54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.209897 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-866d7bb55b-59hzj" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.217857 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.224082 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" podStartSLOduration=8.224060531 podStartE2EDuration="8.224060531s" podCreationTimestamp="2026-01-20 22:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:44.20971867 +0000 UTC m=+1336.529979018" watchObservedRunningTime="2026-01-20 22:57:44.224060531 +0000 UTC m=+1336.544320819" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.245917 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" podStartSLOduration=8.245895335 podStartE2EDuration="8.245895335s" podCreationTimestamp="2026-01-20 22:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:44.245672438 +0000 UTC m=+1336.565932746" watchObservedRunningTime="2026-01-20 22:57:44.245895335 +0000 UTC m=+1336.566155633" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.268639 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" podStartSLOduration=9.268597109 podStartE2EDuration="9.268597109s" podCreationTimestamp="2026-01-20 22:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:57:44.268243051 +0000 UTC m=+1336.588503349" watchObservedRunningTime="2026-01-20 22:57:44.268597109 +0000 UTC m=+1336.588857417" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.364761 5030 scope.go:117] "RemoveContainer" containerID="54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb" Jan 20 22:57:44 crc kubenswrapper[5030]: E0120 22:57:44.365375 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb\": container with ID starting with 54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb not found: ID does not exist" containerID="54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.365414 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb"} err="failed to get container status \"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb\": rpc error: code = NotFound desc = could not find container \"54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb\": container with ID starting with 54243974c668734f9b6c987ca5121b2c2c4d3cb7b7efacc44e8bd71d567cfcbb not found: ID does not exist" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.548431 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869/ovn-northd/0.log" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.548732 5030 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.577782 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf"] Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.590512 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-8c9f-account-create-update-sstcf"] Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6x2f\" (UniqueName: \"kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.598760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts\") pod \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\" (UID: \"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.599578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.599759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts" (OuterVolumeSpecName: "scripts") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.600806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config" (OuterVolumeSpecName: "config") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.601954 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.608269 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f" (OuterVolumeSpecName: "kube-api-access-g6x2f") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "kube-api-access-g6x2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.608323 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-866d7bb55b-59hzj"] Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.640464 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.641061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.701548 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj76r\" (UniqueName: \"kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data\") pod \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\" (UID: \"7d3fc09a-8a37-4559-91cb-817bcc4171b9\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703862 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703915 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703927 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f12523fd-a36d-4490-93bb-aa22ef7a0595-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703936 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d52lp\" (UniqueName: 
\"kubernetes.io/projected/f12523fd-a36d-4490-93bb-aa22ef7a0595-kube-api-access-d52lp\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703948 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703987 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6x2f\" (UniqueName: \"kubernetes.io/projected/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-kube-api-access-g6x2f\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.703997 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.705374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs" (OuterVolumeSpecName: "logs") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.705821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.713287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r" (OuterVolumeSpecName: "kube-api-access-fj76r") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "kube-api-access-fj76r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.714946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts" (OuterVolumeSpecName: "scripts") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.716243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" (UID: "36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.787007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.791120 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data" (OuterVolumeSpecName: "config-data") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.793927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806371 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806437 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d3fc09a-8a37-4559-91cb-817bcc4171b9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806463 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806474 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806484 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806493 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj76r\" (UniqueName: \"kubernetes.io/projected/7d3fc09a-8a37-4559-91cb-817bcc4171b9-kube-api-access-fj76r\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806502 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.806512 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.810589 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.814184 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.820062 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.823010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d3fc09a-8a37-4559-91cb-817bcc4171b9" (UID: "7d3fc09a-8a37-4559-91cb-817bcc4171b9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.834151 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9vgs\" (UniqueName: \"kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907894 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.907987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv6vc\" (UniqueName: \"kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs" (OuterVolumeSpecName: "logs") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908381 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908408 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbrcv\" (UniqueName: \"kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: 
\"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908655 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4j9m\" (UniqueName: \"kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m\") pod \"59f93538-1f2d-4699-9b7b-dcf4c110178a\" (UID: \"59f93538-1f2d-4699-9b7b-dcf4c110178a\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908762 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908779 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" (UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs\") pod \"37a888a7-7114-4dcb-b271-4422a3802e3b\" 
(UID: \"37a888a7-7114-4dcb-b271-4422a3802e3b\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data\") pod \"285c1a35-91f7-441f-aaf2-a228cccc8962\" (UID: \"285c1a35-91f7-441f-aaf2-a228cccc8962\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.908849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs\") pod \"1e567662-7ee7-4658-8954-06e43a966ce6\" (UID: \"1e567662-7ee7-4658-8954-06e43a966ce6\") " Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.909117 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37a888a7-7114-4dcb-b271-4422a3802e3b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.909129 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d3fc09a-8a37-4559-91cb-817bcc4171b9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.910853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs" (OuterVolumeSpecName: "logs") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.915964 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs" (OuterVolumeSpecName: "kube-api-access-b9vgs") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "kube-api-access-b9vgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.916986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs" (OuterVolumeSpecName: "logs") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.918498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts" (OuterVolumeSpecName: "scripts") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.918525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc" (OuterVolumeSpecName: "kube-api-access-jv6vc") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "kube-api-access-jv6vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.920002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.920731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv" (OuterVolumeSpecName: "kube-api-access-jbrcv") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "kube-api-access-jbrcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.921071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts" (OuterVolumeSpecName: "scripts") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.921249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.929781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.931233 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m" (OuterVolumeSpecName: "kube-api-access-r4j9m") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "kube-api-access-r4j9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.931994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.936722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.953059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data" (OuterVolumeSpecName: "config-data") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.955204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.959588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.978332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.978585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.979187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.986916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.988224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.988406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data" (OuterVolumeSpecName: "config-data") pod "37a888a7-7114-4dcb-b271-4422a3802e3b" (UID: "37a888a7-7114-4dcb-b271-4422a3802e3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.994364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.998551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data" (OuterVolumeSpecName: "config-data") pod "285c1a35-91f7-441f-aaf2-a228cccc8962" (UID: "285c1a35-91f7-441f-aaf2-a228cccc8962"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:44 crc kubenswrapper[5030]: I0120 22:57:44.999159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data" (OuterVolumeSpecName: "config-data") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.007118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1e567662-7ee7-4658-8954-06e43a966ce6" (UID: "1e567662-7ee7-4658-8954-06e43a966ce6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010470 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbrcv\" (UniqueName: \"kubernetes.io/projected/1e567662-7ee7-4658-8954-06e43a966ce6-kube-api-access-jbrcv\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010493 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010502 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010511 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010519 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010528 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f93538-1f2d-4699-9b7b-dcf4c110178a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010537 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010545 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010553 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010560 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010569 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010576 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010586 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4j9m\" (UniqueName: \"kubernetes.io/projected/59f93538-1f2d-4699-9b7b-dcf4c110178a-kube-api-access-r4j9m\") on node \"crc\" DevicePath \"\"" Jan 20 
22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010593 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010601 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010609 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010630 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285c1a35-91f7-441f-aaf2-a228cccc8962-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010638 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010647 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010655 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37a888a7-7114-4dcb-b271-4422a3802e3b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010663 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9vgs\" (UniqueName: \"kubernetes.io/projected/37a888a7-7114-4dcb-b271-4422a3802e3b-kube-api-access-b9vgs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010671 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010679 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e567662-7ee7-4658-8954-06e43a966ce6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010687 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv6vc\" (UniqueName: \"kubernetes.io/projected/285c1a35-91f7-441f-aaf2-a228cccc8962-kube-api-access-jv6vc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010696 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.010704 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/285c1a35-91f7-441f-aaf2-a228cccc8962-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.023498 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.024794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59f93538-1f2d-4699-9b7b-dcf4c110178a" (UID: "59f93538-1f2d-4699-9b7b-dcf4c110178a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.112376 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.112636 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59f93538-1f2d-4699-9b7b-dcf4c110178a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.233831 5030 generic.go:334] "Generic (PLEG): container finished" podID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerID="e0142e268b351b6a62427e70c71a48fc93815f1920a2cb306ed981400fac604f" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.233948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerDied","Data":"e0142e268b351b6a62427e70c71a48fc93815f1920a2cb306ed981400fac604f"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.238249 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerID="21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.238492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerDied","Data":"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.238606 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.240290 5030 scope.go:117] "RemoveContainer" containerID="21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.240555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-56bb869d7d-pvd6v" event={"ID":"7d3fc09a-8a37-4559-91cb-817bcc4171b9","Type":"ContainerDied","Data":"7cc3d813da304dfbd96b3a5e42d4eaba171503097105c591e1f6ce6b5cbda0dd"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243321 5030 generic.go:334] "Generic (PLEG): container finished" podID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerID="b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243358 5030 generic.go:334] "Generic (PLEG): container finished" podID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerID="a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" exitCode=143 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerDied","Data":"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerDied","Data":"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" event={"ID":"37a888a7-7114-4dcb-b271-4422a3802e3b","Type":"ContainerDied","Data":"f733bf8ac42245482c67234a8af7ffa8f6b0e4bc988409fbd1db67e80fb9b0d4"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.243601 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.252800 5030 generic.go:334] "Generic (PLEG): container finished" podID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerID="01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.252874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerDied","Data":"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.252956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" event={"ID":"285c1a35-91f7-441f-aaf2-a228cccc8962","Type":"ContainerDied","Data":"fdea3b413a52c003a0efb130ec633c91ad4e1ea476b33c3d3ada42ef194c8381"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.253112 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-76878489f6-7898t" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.256825 5030 generic.go:334] "Generic (PLEG): container finished" podID="1e567662-7ee7-4658-8954-06e43a966ce6" containerID="e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.256869 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" event={"ID":"1e567662-7ee7-4658-8954-06e43a966ce6","Type":"ContainerDied","Data":"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.256895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" event={"ID":"1e567662-7ee7-4658-8954-06e43a966ce6","Type":"ContainerDied","Data":"446a0e59d05ba7244d24e55ba1629ae3e9b86853d4d79eaac54309ec41db85cd"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.256852 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5ddff9b97-b55t4" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259592 5030 generic.go:334] "Generic (PLEG): container finished" podID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerID="76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" exitCode=0 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259612 5030 generic.go:334] "Generic (PLEG): container finished" podID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerID="683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" exitCode=143 Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerDied","Data":"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerDied","Data":"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" event={"ID":"59f93538-1f2d-4699-9b7b-dcf4c110178a","Type":"ContainerDied","Data":"b5827bd4f30cc238b7506f6b8689c257ce264258c8d945378e8539c7984a11db"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.259727 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7bd4b6f588-wr8mk" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.267977 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.273451 5030 scope.go:117] "RemoveContainer" containerID="f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.274951 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869/ovn-northd/0.log" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.275405 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.279680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869","Type":"ContainerDied","Data":"25ede500a1cf863025c9de214d70add569b559d86113b90f5ff61923fad6b9ec"} Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.295911 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.303829 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-56bb869d7d-pvd6v"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.316354 5030 scope.go:117] "RemoveContainer" containerID="21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.316673 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.316954 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a\": container with ID starting with 21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a not found: ID does not exist" containerID="21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.317022 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a"} err="failed to get container status \"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a\": rpc error: code = NotFound desc = could not find container \"21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a\": container with ID starting with 21c6bb975875359b4aada1c097326bbd3a6c5bc0f7495610ae74c71812916f4a not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.317051 5030 scope.go:117] "RemoveContainer" containerID="f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.320744 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9\": container with ID starting with f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9 not found: ID does not exist" 
containerID="f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.320792 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9"} err="failed to get container status \"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9\": rpc error: code = NotFound desc = could not find container \"f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9\": container with ID starting with f8b91389342e688b3e0145f9dea09b80af881bed02683f96c927275a4d6eabc9 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.320887 5030 scope.go:117] "RemoveContainer" containerID="b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.322358 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-76878489f6-7898t"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.328068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.333420 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-67b967cd44-2rgq2"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.360457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.366517 5030 scope.go:117] "RemoveContainer" containerID="a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.369058 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.379778 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.387351 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5ddff9b97-b55t4"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.388160 5030 scope.go:117] "RemoveContainer" containerID="b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.388568 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de\": container with ID starting with b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de not found: ID does not exist" containerID="b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.388601 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de"} err="failed to get container status \"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de\": rpc error: code = NotFound desc = could not find container \"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de\": container with ID starting with b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.388671 5030 scope.go:117] 
"RemoveContainer" containerID="a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.389040 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437\": container with ID starting with a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437 not found: ID does not exist" containerID="a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.389136 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437"} err="failed to get container status \"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437\": rpc error: code = NotFound desc = could not find container \"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437\": container with ID starting with a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.389215 5030 scope.go:117] "RemoveContainer" containerID="b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.389737 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de"} err="failed to get container status \"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de\": rpc error: code = NotFound desc = could not find container \"b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de\": container with ID starting with b79abb1336665ef1394a1e4f2e2df0e7d850a4a2bacf34ffd630ec8660b209de not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.389829 5030 scope.go:117] "RemoveContainer" containerID="a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.390268 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437"} err="failed to get container status \"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437\": rpc error: code = NotFound desc = could not find container \"a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437\": container with ID starting with a7529a562e054b0127b0945b5f385f166b662c2cbaf906f8f9229b4c2051e437 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.390302 5030 scope.go:117] "RemoveContainer" containerID="01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.393389 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.398519 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7bd4b6f588-wr8mk"] Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.413213 5030 scope.go:117] "RemoveContainer" containerID="d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.433943 5030 scope.go:117] "RemoveContainer" 
containerID="01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.434384 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a\": container with ID starting with 01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a not found: ID does not exist" containerID="01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.434422 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a"} err="failed to get container status \"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a\": rpc error: code = NotFound desc = could not find container \"01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a\": container with ID starting with 01ac7883c22efebc724af58ba7cf2e86bb831d40c452847d4fd9f6769db4311a not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.434448 5030 scope.go:117] "RemoveContainer" containerID="d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.435033 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29\": container with ID starting with d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29 not found: ID does not exist" containerID="d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.435139 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29"} err="failed to get container status \"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29\": rpc error: code = NotFound desc = could not find container \"d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29\": container with ID starting with d27c68304cedb99531414f7ed01f149023e91c9d96d8750769fd70655c26aa29 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.435246 5030 scope.go:117] "RemoveContainer" containerID="e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.457928 5030 scope.go:117] "RemoveContainer" containerID="e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.458406 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5\": container with ID starting with e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5 not found: ID does not exist" containerID="e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.458442 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5"} err="failed to get container status \"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5\": rpc error: code = 
NotFound desc = could not find container \"e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5\": container with ID starting with e77f4e5c204c68469f44c8339a5b79d8d9973035aab21ac9dd8fa2c2f6325da5 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.458469 5030 scope.go:117] "RemoveContainer" containerID="76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.476018 5030 scope.go:117] "RemoveContainer" containerID="683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.509051 5030 scope.go:117] "RemoveContainer" containerID="76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.510317 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f\": container with ID starting with 76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f not found: ID does not exist" containerID="76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.510386 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f"} err="failed to get container status \"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f\": rpc error: code = NotFound desc = could not find container \"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f\": container with ID starting with 76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.510429 5030 scope.go:117] "RemoveContainer" containerID="683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" Jan 20 22:57:45 crc kubenswrapper[5030]: E0120 22:57:45.511275 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844\": container with ID starting with 683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844 not found: ID does not exist" containerID="683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.511319 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844"} err="failed to get container status \"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844\": rpc error: code = NotFound desc = could not find container \"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844\": container with ID starting with 683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.511346 5030 scope.go:117] "RemoveContainer" containerID="76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.511781 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f"} err="failed to get container status 
\"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f\": rpc error: code = NotFound desc = could not find container \"76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f\": container with ID starting with 76c24d0f5f895599a99c74578bb8c76fbbb91c70d0dc33b095ba614e4f688f3f not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.511821 5030 scope.go:117] "RemoveContainer" containerID="683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.512772 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844"} err="failed to get container status \"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844\": rpc error: code = NotFound desc = could not find container \"683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844\": container with ID starting with 683c36330579a981fccdc4329e8f08c7c192e2c3feec4ce0ffb8fa11510cb844 not found: ID does not exist" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.512861 5030 scope.go:117] "RemoveContainer" containerID="29fb501d2b9d7513a788b9257ebec18345b31379794a7b078577b83a4385a4f0" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.559057 5030 scope.go:117] "RemoveContainer" containerID="247edcbff378bb2e2fd1e3b880680b1439f95acd1f37bcf000a2cab517dc6fd8" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.979469 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e567662-7ee7-4658-8954-06e43a966ce6" path="/var/lib/kubelet/pods/1e567662-7ee7-4658-8954-06e43a966ce6/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.980580 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" path="/var/lib/kubelet/pods/285c1a35-91f7-441f-aaf2-a228cccc8962/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.981895 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" path="/var/lib/kubelet/pods/36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.984736 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" path="/var/lib/kubelet/pods/37a888a7-7114-4dcb-b271-4422a3802e3b/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.986138 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" path="/var/lib/kubelet/pods/59f93538-1f2d-4699-9b7b-dcf4c110178a/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.987383 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" path="/var/lib/kubelet/pods/7d3fc09a-8a37-4559-91cb-817bcc4171b9/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.988122 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0994e0e-f167-4c5a-bef7-95da6f589767" path="/var/lib/kubelet/pods/f0994e0e-f167-4c5a-bef7-95da6f589767/volumes" Jan 20 22:57:45 crc kubenswrapper[5030]: I0120 22:57:45.988762 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12523fd-a36d-4490-93bb-aa22ef7a0595" path="/var/lib/kubelet/pods/f12523fd-a36d-4490-93bb-aa22ef7a0595/volumes" Jan 20 22:57:46 crc kubenswrapper[5030]: I0120 22:57:46.077334 5030 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.160:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 22:57:46 crc kubenswrapper[5030]: E0120 22:57:46.336049 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 22:57:46 crc kubenswrapper[5030]: E0120 22:57:46.336167 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data podName:b0a5273a-9439-4881-b88a-2a3aa11e410b nodeName:}" failed. No retries permitted until 2026-01-20 22:57:54.336134932 +0000 UTC m=+1346.656395290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data") pod "rabbitmq-server-0" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b") : configmap "rabbitmq-config-data" not found Jan 20 22:57:47 crc kubenswrapper[5030]: E0120 22:57:47.757204 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:47 crc kubenswrapper[5030]: E0120 22:57:47.757733 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data podName:38bfa948-051b-4e6c-92a5-3714e0652088 nodeName:}" failed. No retries permitted until 2026-01-20 22:57:55.757714707 +0000 UTC m=+1348.077975015 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data") pod "rabbitmq-cell1-server-0" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088") : configmap "rabbitmq-cell1-config-data" not found Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.800126 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.858974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.859082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.859183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq8sj\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.859269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: 
\"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.859360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data\") pod \"b0a5273a-9439-4881-b88a-2a3aa11e410b\" (UID: \"b0a5273a-9439-4881-b88a-2a3aa11e410b\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.859986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.860034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.860317 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.860347 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.860363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.864667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj" (OuterVolumeSpecName: "kube-api-access-jq8sj") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "kube-api-access-jq8sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.864690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.865561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.867113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.868233 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info" (OuterVolumeSpecName: "pod-info") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.882978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data" (OuterVolumeSpecName: "config-data") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.919933 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.920181 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf" (OuterVolumeSpecName: "server-conf") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.935302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b0a5273a-9439-4881-b88a-2a3aa11e410b" (UID: "b0a5273a-9439-4881-b88a-2a3aa11e410b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vzr9\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961372 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: 
\"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret\") pod \"38bfa948-051b-4e6c-92a5-3714e0652088\" (UID: \"38bfa948-051b-4e6c-92a5-3714e0652088\") " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961768 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961787 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961798 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b0a5273a-9439-4881-b88a-2a3aa11e410b-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961808 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b0a5273a-9439-4881-b88a-2a3aa11e410b-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961818 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961827 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961836 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b0a5273a-9439-4881-b88a-2a3aa11e410b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961845 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq8sj\" (UniqueName: \"kubernetes.io/projected/b0a5273a-9439-4881-b88a-2a3aa11e410b-kube-api-access-jq8sj\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.961853 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b0a5273a-9439-4881-b88a-2a3aa11e410b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.964871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.966111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.966807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.967049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.975497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9" (OuterVolumeSpecName: "kube-api-access-9vzr9") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "kube-api-access-9vzr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.975504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.977176 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.979529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:57:47 crc kubenswrapper[5030]: I0120 22:57:47.982103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info" (OuterVolumeSpecName: "pod-info") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.000232 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf" (OuterVolumeSpecName: "server-conf") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.002818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data" (OuterVolumeSpecName: "config-data") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063027 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063063 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063076 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vzr9\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-kube-api-access-9vzr9\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063091 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063107 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063120 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063130 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38bfa948-051b-4e6c-92a5-3714e0652088-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063142 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38bfa948-051b-4e6c-92a5-3714e0652088-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063153 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063176 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") 
on node \"crc\" " Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.063188 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38bfa948-051b-4e6c-92a5-3714e0652088-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.064230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "38bfa948-051b-4e6c-92a5-3714e0652088" (UID: "38bfa948-051b-4e6c-92a5-3714e0652088"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.079487 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.164072 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38bfa948-051b-4e6c-92a5-3714e0652088-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.164108 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.338489 5030 generic.go:334] "Generic (PLEG): container finished" podID="38bfa948-051b-4e6c-92a5-3714e0652088" containerID="a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504" exitCode=0 Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.338608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerDied","Data":"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504"} Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.338766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"38bfa948-051b-4e6c-92a5-3714e0652088","Type":"ContainerDied","Data":"fade6fb19a5f99275c15e45e03f4dd9c4d7b7774a9d13abfb46ceed58f75c998"} Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.338802 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.338847 5030 scope.go:117] "RemoveContainer" containerID="a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.346115 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerID="dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac" exitCode=0 Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.346166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerDied","Data":"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac"} Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.346205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"b0a5273a-9439-4881-b88a-2a3aa11e410b","Type":"ContainerDied","Data":"8af7359bc48418173b51693b863b4f938e54842a895b52d72f07b15be15b76c1"} Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.346286 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.398158 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.401876 5030 scope.go:117] "RemoveContainer" containerID="775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.412351 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.423539 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.430893 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.485132 5030 scope.go:117] "RemoveContainer" containerID="a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504" Jan 20 22:57:48 crc kubenswrapper[5030]: E0120 22:57:48.485554 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504\": container with ID starting with a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504 not found: ID does not exist" containerID="a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.485582 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504"} err="failed to get container status \"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504\": rpc error: code = NotFound desc = could not find container \"a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504\": container with ID starting with a4829caf0f9e77c402ed59e975d250deb92fb12f806a28728d5643b813800504 not found: ID does not exist" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.485604 5030 scope.go:117] "RemoveContainer" 
containerID="775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c" Jan 20 22:57:48 crc kubenswrapper[5030]: E0120 22:57:48.486020 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c\": container with ID starting with 775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c not found: ID does not exist" containerID="775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.486046 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c"} err="failed to get container status \"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c\": rpc error: code = NotFound desc = could not find container \"775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c\": container with ID starting with 775623890bd1b1a33b5cac6e58ca14081b26ef86769bbbe4eaeddb36e108320c not found: ID does not exist" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.486064 5030 scope.go:117] "RemoveContainer" containerID="dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.509401 5030 scope.go:117] "RemoveContainer" containerID="b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.529654 5030 scope.go:117] "RemoveContainer" containerID="dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac" Jan 20 22:57:48 crc kubenswrapper[5030]: E0120 22:57:48.530073 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac\": container with ID starting with dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac not found: ID does not exist" containerID="dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.530135 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac"} err="failed to get container status \"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac\": rpc error: code = NotFound desc = could not find container \"dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac\": container with ID starting with dea827cf70a7f2a132d0be3a5d721e6d07a8e2f8519f56822cf6962aaa2d6fac not found: ID does not exist" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.530168 5030 scope.go:117] "RemoveContainer" containerID="b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58" Jan 20 22:57:48 crc kubenswrapper[5030]: E0120 22:57:48.530547 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58\": container with ID starting with b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58 not found: ID does not exist" containerID="b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58" Jan 20 22:57:48 crc kubenswrapper[5030]: I0120 22:57:48.530586 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58"} err="failed to get container status \"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58\": rpc error: code = NotFound desc = could not find container \"b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58\": container with ID starting with b3335bf3d79fe9a34a33b8eb93fa2ea9b983ca3d131a46215aea375ddf7c1e58 not found: ID does not exist" Jan 20 22:57:49 crc kubenswrapper[5030]: I0120 22:57:49.994977 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" path="/var/lib/kubelet/pods/38bfa948-051b-4e6c-92a5-3714e0652088/volumes" Jan 20 22:57:49 crc kubenswrapper[5030]: I0120 22:57:49.997808 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" path="/var/lib/kubelet/pods/b0a5273a-9439-4881-b88a-2a3aa11e410b/volumes" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.264024 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.350150 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.350455 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="dnsmasq-dns" containerID="cri-o://a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043" gracePeriod=10 Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.865566 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.914014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0\") pod \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.914150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc\") pod \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.914232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config\") pod \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.914277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhlgf\" (UniqueName: \"kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf\") pod \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\" (UID: \"24e33cea-b99d-4398-9ecb-ec247a8eb2cc\") " Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.927368 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf" (OuterVolumeSpecName: "kube-api-access-jhlgf") pod "24e33cea-b99d-4398-9ecb-ec247a8eb2cc" (UID: "24e33cea-b99d-4398-9ecb-ec247a8eb2cc"). InnerVolumeSpecName "kube-api-access-jhlgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.966955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config" (OuterVolumeSpecName: "config") pod "24e33cea-b99d-4398-9ecb-ec247a8eb2cc" (UID: "24e33cea-b99d-4398-9ecb-ec247a8eb2cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.972791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24e33cea-b99d-4398-9ecb-ec247a8eb2cc" (UID: "24e33cea-b99d-4398-9ecb-ec247a8eb2cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:50 crc kubenswrapper[5030]: I0120 22:57:50.991740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "24e33cea-b99d-4398-9ecb-ec247a8eb2cc" (UID: "24e33cea-b99d-4398-9ecb-ec247a8eb2cc"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.016502 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.016534 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.016548 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.016562 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhlgf\" (UniqueName: \"kubernetes.io/projected/24e33cea-b99d-4398-9ecb-ec247a8eb2cc-kube-api-access-jhlgf\") on node \"crc\" DevicePath \"\"" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.402814 5030 generic.go:334] "Generic (PLEG): container finished" podID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerID="a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043" exitCode=0 Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.402879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" event={"ID":"24e33cea-b99d-4398-9ecb-ec247a8eb2cc","Type":"ContainerDied","Data":"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043"} Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.402908 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.402953 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc" event={"ID":"24e33cea-b99d-4398-9ecb-ec247a8eb2cc","Type":"ContainerDied","Data":"f542f5d18fffb2253f8667b07c9e3d8d3f1c73f5909238abf9c771acef804efe"} Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.402981 5030 scope.go:117] "RemoveContainer" containerID="a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.452586 5030 scope.go:117] "RemoveContainer" containerID="ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.457875 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.468665 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b8ddd7c-xmthc"] Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.507642 5030 scope.go:117] "RemoveContainer" containerID="a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043" Jan 20 22:57:51 crc kubenswrapper[5030]: E0120 22:57:51.508101 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043\": container with ID starting with a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043 not found: ID does not exist" containerID="a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.508150 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043"} err="failed to get container status \"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043\": rpc error: code = NotFound desc = could not find container \"a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043\": container with ID starting with a62dda1b120931862d2ff8123b95183adde1c849221e13dfe84789cde234f043 not found: ID does not exist" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.508183 5030 scope.go:117] "RemoveContainer" containerID="ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32" Jan 20 22:57:51 crc kubenswrapper[5030]: E0120 22:57:51.508453 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32\": container with ID starting with ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32 not found: ID does not exist" containerID="ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.508502 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32"} err="failed to get container status \"ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32\": rpc error: code = NotFound desc = could not find container \"ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32\": container with ID starting with 
ffb4def33af47a65b5171a0efd49012016f4989e7ec77538f03ced86a5dc9e32 not found: ID does not exist" Jan 20 22:57:51 crc kubenswrapper[5030]: I0120 22:57:51.989414 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" path="/var/lib/kubelet/pods/24e33cea-b99d-4398-9ecb-ec247a8eb2cc/volumes" Jan 20 22:58:07 crc kubenswrapper[5030]: I0120 22:58:07.464886 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.193:9696/\": dial tcp 10.217.0.193:9696: connect: connection refused" Jan 20 22:58:09 crc kubenswrapper[5030]: I0120 22:58:09.021877 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9696/\": dial tcp 10.217.0.145:9696: connect: connection refused" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.158185 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.158242 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.401678 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5fc4cf64cd-9987t_fd100215-a3f7-497e-842d-582681701903/neutron-api/0.log" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.402013 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.448858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.448900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dqm9\" (UniqueName: \"kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.448951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.448993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.449010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.449039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.449071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config\") pod \"fd100215-a3f7-497e-842d-582681701903\" (UID: \"fd100215-a3f7-497e-842d-582681701903\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.456046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.463104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9" (OuterVolumeSpecName: "kube-api-access-8dqm9") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "kube-api-access-8dqm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.485979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.494359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config" (OuterVolumeSpecName: "config") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.496577 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.500744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.501462 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.503168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fd100215-a3f7-497e-842d-582681701903" (UID: "fd100215-a3f7-497e-842d-582681701903"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache\") pod \"f4870c60-c6b8-48d3-998e-a91670fcedbc\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") pod \"f4870c60-c6b8-48d3-998e-a91670fcedbc\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"f4870c60-c6b8-48d3-998e-a91670fcedbc\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock\") pod \"f4870c60-c6b8-48d3-998e-a91670fcedbc\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550372 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zgwr\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr\") pod \"f4870c60-c6b8-48d3-998e-a91670fcedbc\" (UID: \"f4870c60-c6b8-48d3-998e-a91670fcedbc\") " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550538 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550550 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550560 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550570 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550578 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550587 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dqm9\" (UniqueName: \"kubernetes.io/projected/fd100215-a3f7-497e-842d-582681701903-kube-api-access-8dqm9\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550596 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fd100215-a3f7-497e-842d-582681701903-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.550758 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache" (OuterVolumeSpecName: "cache") pod "f4870c60-c6b8-48d3-998e-a91670fcedbc" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.551001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock" (OuterVolumeSpecName: "lock") pod "f4870c60-c6b8-48d3-998e-a91670fcedbc" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.552905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "swift") pod "f4870c60-c6b8-48d3-998e-a91670fcedbc" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.553883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr" (OuterVolumeSpecName: "kube-api-access-4zgwr") pod "f4870c60-c6b8-48d3-998e-a91670fcedbc" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc"). InnerVolumeSpecName "kube-api-access-4zgwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.554605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4870c60-c6b8-48d3-998e-a91670fcedbc" (UID: "f4870c60-c6b8-48d3-998e-a91670fcedbc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.605102 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerID="adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c" exitCode=137 Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.605171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c"} Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.605200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"f4870c60-c6b8-48d3-998e-a91670fcedbc","Type":"ContainerDied","Data":"a32d7f934880c093501fe43086fd0946c6ccd073d9129de0239769825c5932ef"} Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.605222 5030 scope.go:117] "RemoveContainer" containerID="adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.605222 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.610821 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5fc4cf64cd-9987t_fd100215-a3f7-497e-842d-582681701903/neutron-api/0.log" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.610891 5030 generic.go:334] "Generic (PLEG): container finished" podID="fd100215-a3f7-497e-842d-582681701903" containerID="4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351" exitCode=137 Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.610930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerDied","Data":"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351"} Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.610957 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.610966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5fc4cf64cd-9987t" event={"ID":"fd100215-a3f7-497e-842d-582681701903","Type":"ContainerDied","Data":"a380f459cecde946617edbe7684d2038dcc22d4cc30565abe22ed5462d3a2945"} Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.634043 5030 scope.go:117] "RemoveContainer" containerID="2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.653116 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.653147 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-lock\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.653159 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zgwr\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-kube-api-access-4zgwr\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.653169 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f4870c60-c6b8-48d3-998e-a91670fcedbc-cache\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.653177 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f4870c60-c6b8-48d3-998e-a91670fcedbc-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.655229 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.660365 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.665407 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.666292 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.668130 5030 scope.go:117] "RemoveContainer" containerID="03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.670516 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5fc4cf64cd-9987t"] Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.685023 5030 scope.go:117] "RemoveContainer" containerID="61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.740357 5030 scope.go:117] "RemoveContainer" containerID="dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.754646 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.758504 5030 scope.go:117] "RemoveContainer" containerID="2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.862510 5030 scope.go:117] "RemoveContainer" containerID="966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.884884 5030 scope.go:117] "RemoveContainer" containerID="92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.913260 5030 scope.go:117] "RemoveContainer" containerID="cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.931051 5030 scope.go:117] "RemoveContainer" containerID="95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.952588 5030 scope.go:117] "RemoveContainer" containerID="c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.977076 5030 scope.go:117] "RemoveContainer" containerID="7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add" Jan 20 22:58:10 crc kubenswrapper[5030]: I0120 22:58:10.988231 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod270d79a6-4c9f-415b-aa6a-e0d33da8177d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod270d79a6-4c9f-415b-aa6a-e0d33da8177d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod270d79a6_4c9f_415b_aa6a_e0d33da8177d.slice" Jan 20 22:58:10 crc kubenswrapper[5030]: E0120 22:58:10.988279 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod270d79a6-4c9f-415b-aa6a-e0d33da8177d] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod270d79a6-4c9f-415b-aa6a-e0d33da8177d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod270d79a6_4c9f_415b_aa6a_e0d33da8177d.slice" pod="openstack-kuttl-tests/openstackclient" podUID="270d79a6-4c9f-415b-aa6a-e0d33da8177d" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.044595 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.045962 5030 scope.go:117] "RemoveContainer" containerID="ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.057990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpxwc\" (UniqueName: \"kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc\") pod \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.058075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle\") pod \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.058183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs\") pod \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.058742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs" (OuterVolumeSpecName: "logs") pod "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" (UID: "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.058826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data\") pod \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.059109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom\") pod \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\" (UID: \"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef\") " Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.060071 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.066180 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc" (OuterVolumeSpecName: "kube-api-access-fpxwc") pod "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" (UID: "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef"). InnerVolumeSpecName "kube-api-access-fpxwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.066254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" (UID: "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.077991 5030 scope.go:117] "RemoveContainer" containerID="3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.089968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" (UID: "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.104303 5030 scope.go:117] "RemoveContainer" containerID="c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.106761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data" (OuterVolumeSpecName: "config-data") pod "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" (UID: "b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.120161 5030 scope.go:117] "RemoveContainer" containerID="adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.120568 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c\": container with ID starting with adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c not found: ID does not exist" containerID="adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.120609 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c"} err="failed to get container status \"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c\": rpc error: code = NotFound desc = could not find container \"adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c\": container with ID starting with adc617fb88a39b843a0ff031f77209c3ae2283dc26aec23acae2b5de8b19f18c not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.120653 5030 scope.go:117] "RemoveContainer" containerID="2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.120904 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47\": container with ID starting with 2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47 not found: ID does not exist" containerID="2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.120945 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47"} err="failed to get container status \"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47\": rpc error: code = NotFound desc = could not find 
container \"2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47\": container with ID starting with 2bb4c6cb0c30a38e71228d46f1ce0a2d6242c26273e09475f37ec15be630df47 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.120975 5030 scope.go:117] "RemoveContainer" containerID="03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.121215 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11\": container with ID starting with 03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11 not found: ID does not exist" containerID="03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121245 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11"} err="failed to get container status \"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11\": rpc error: code = NotFound desc = could not find container \"03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11\": container with ID starting with 03d8131565b2aa8215b283479dbb08d399b764b7bdd91342366f8b11d7c1bd11 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121264 5030 scope.go:117] "RemoveContainer" containerID="61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.121535 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e\": container with ID starting with 61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e not found: ID does not exist" containerID="61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121595 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e"} err="failed to get container status \"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e\": rpc error: code = NotFound desc = could not find container \"61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e\": container with ID starting with 61166ebcaa8d52cbda3ab4ddba35312982c50bdd4e6acec74584508c92aaf82e not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121654 5030 scope.go:117] "RemoveContainer" containerID="dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.121921 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0\": container with ID starting with dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0 not found: ID does not exist" containerID="dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121953 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0"} err="failed to get container status \"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0\": rpc error: code = NotFound desc = could not find container \"dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0\": container with ID starting with dd5a67ef4fb551fa2f475133669cec13544df0cc96f8214505d7da9eba2e8bd0 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.121971 5030 scope.go:117] "RemoveContainer" containerID="2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.122171 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b\": container with ID starting with 2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b not found: ID does not exist" containerID="2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122197 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b"} err="failed to get container status \"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b\": rpc error: code = NotFound desc = could not find container \"2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b\": container with ID starting with 2334a73469889f8ad1c51ce74e70eb724d6a8113be8a4e9fc40821d4ff9a633b not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122214 5030 scope.go:117] "RemoveContainer" containerID="966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.122393 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a\": container with ID starting with 966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a not found: ID does not exist" containerID="966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122420 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a"} err="failed to get container status \"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a\": rpc error: code = NotFound desc = could not find container \"966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a\": container with ID starting with 966b0cf5daaff4c0b35472ac3ae84907a5ab999f5d4629a16af22c0f9f24aa9a not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122436 5030 scope.go:117] "RemoveContainer" containerID="92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.122750 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a\": container with ID starting with 92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a not found: ID does not exist" 
containerID="92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122815 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a"} err="failed to get container status \"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a\": rpc error: code = NotFound desc = could not find container \"92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a\": container with ID starting with 92b7b86eb5275b933dafd8fadf06f237a24ffe511636756664335966de4acf0a not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.122852 5030 scope.go:117] "RemoveContainer" containerID="cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.123081 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d\": container with ID starting with cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d not found: ID does not exist" containerID="cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123113 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d"} err="failed to get container status \"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d\": rpc error: code = NotFound desc = could not find container \"cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d\": container with ID starting with cb2e2e77e02b8db23512519bebfb695114aaf3212af696ecbeb14d791e69248d not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123133 5030 scope.go:117] "RemoveContainer" containerID="95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.123389 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73\": container with ID starting with 95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73 not found: ID does not exist" containerID="95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123441 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73"} err="failed to get container status \"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73\": rpc error: code = NotFound desc = could not find container \"95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73\": container with ID starting with 95c5f038cd2f0fd19a1d2f8de88ec001899e1f73b1fb3f3369a0e83f7a93bc73 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123473 5030 scope.go:117] "RemoveContainer" containerID="c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.123759 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1\": container with ID starting with c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1 not found: ID does not exist" containerID="c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123789 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1"} err="failed to get container status \"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1\": rpc error: code = NotFound desc = could not find container \"c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1\": container with ID starting with c4478344184d417c7bf47c30c7e0b83365751f9ca859287ab58440f5bfb834e1 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.123807 5030 scope.go:117] "RemoveContainer" containerID="7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.123994 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add\": container with ID starting with 7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add not found: ID does not exist" containerID="7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124023 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add"} err="failed to get container status \"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add\": rpc error: code = NotFound desc = could not find container \"7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add\": container with ID starting with 7b3f774418e3437069778c78fb657a5ed3fb7e251d10f744a2722ba73b312add not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124040 5030 scope.go:117] "RemoveContainer" containerID="ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.124216 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c\": container with ID starting with ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c not found: ID does not exist" containerID="ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124241 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c"} err="failed to get container status \"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c\": rpc error: code = NotFound desc = could not find container \"ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c\": container with ID starting with ae49717efe652fa5f134b16d6554d620221090eb39bb69192b38b9c574c2291c not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124257 5030 scope.go:117] "RemoveContainer" containerID="3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20" Jan 20 22:58:11 crc 
kubenswrapper[5030]: E0120 22:58:11.124484 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20\": container with ID starting with 3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20 not found: ID does not exist" containerID="3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124568 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20"} err="failed to get container status \"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20\": rpc error: code = NotFound desc = could not find container \"3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20\": container with ID starting with 3e0d8293f0f6aca058326e5ad7fece6c0d7a637efa32ef0554b0a6456daa1e20 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124615 5030 scope.go:117] "RemoveContainer" containerID="c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.124880 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845\": container with ID starting with c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845 not found: ID does not exist" containerID="c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124909 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845"} err="failed to get container status \"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845\": rpc error: code = NotFound desc = could not find container \"c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845\": container with ID starting with c623f5b67162ae7ba2ac2764b304e3991464bb5c5b5ab6193cf3fd5b24bb2845 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.124926 5030 scope.go:117] "RemoveContainer" containerID="03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.142370 5030 scope.go:117] "RemoveContainer" containerID="4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.161136 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpxwc\" (UniqueName: \"kubernetes.io/projected/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-kube-api-access-fpxwc\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.161175 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.161186 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.161195 5030 reconciler_common.go:293] "Volume 
detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.166881 5030 scope.go:117] "RemoveContainer" containerID="03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.167330 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6\": container with ID starting with 03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6 not found: ID does not exist" containerID="03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.167386 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6"} err="failed to get container status \"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6\": rpc error: code = NotFound desc = could not find container \"03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6\": container with ID starting with 03d2222b1f44670ddf7c4edbaa91540b2ac22ae3a87c8da53d412157de2aefb6 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.167421 5030 scope.go:117] "RemoveContainer" containerID="4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.167824 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351\": container with ID starting with 4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351 not found: ID does not exist" containerID="4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.167848 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351"} err="failed to get container status \"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351\": rpc error: code = NotFound desc = could not find container \"4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351\": container with ID starting with 4ef4f1c19ec69f0a2a6ee1ccdc738015babfb45998ce1816892ed6bcd3c46351 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.423634 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1d59910a-9b07-47ff-b79b-661139ece8c3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1d59910a-9b07-47ff-b79b-661139ece8c3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1d59910a_9b07_47ff_b79b_661139ece8c3.slice" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.423690 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod1d59910a-9b07-47ff-b79b-661139ece8c3] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod1d59910a-9b07-47ff-b79b-661139ece8c3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1d59910a_9b07_47ff_b79b_661139ece8c3.slice" 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.437476 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda199be04-e085-49d3-a376-abdf54773b99"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda199be04-e085-49d3-a376-abdf54773b99] : Timed out while waiting for systemd to remove kubepods-besteffort-poda199be04_e085_49d3_a376_abdf54773b99.slice" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.437557 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda199be04-e085-49d3-a376-abdf54773b99] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda199be04-e085-49d3-a376-abdf54773b99] : Timed out while waiting for systemd to remove kubepods-besteffort-poda199be04_e085_49d3_a376_abdf54773b99.slice" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a199be04-e085-49d3-a376-abdf54773b99" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.441713 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podefe6a225-5530-4922-91ec-1c4a939cb98b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podefe6a225-5530-4922-91ec-1c4a939cb98b] : Timed out while waiting for systemd to remove kubepods-besteffort-podefe6a225_5530_4922_91ec_1c4a939cb98b.slice" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.441774 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podefe6a225-5530-4922-91ec-1c4a939cb98b] : unable to destroy cgroup paths for cgroup [kubepods besteffort podefe6a225-5530-4922-91ec-1c4a939cb98b] : Timed out while waiting for systemd to remove kubepods-besteffort-podefe6a225_5530_4922_91ec_1c4a939cb98b.slice" pod="openstack-kuttl-tests/memcached-0" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.626536 5030 generic.go:334] "Generic (PLEG): container finished" podID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerID="6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4" exitCode=137 Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.626590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerDied","Data":"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4"} Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.626662 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.626689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv" event={"ID":"b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef","Type":"ContainerDied","Data":"e594999bcc298ec09bcf4cc91d8c81e08eed57723ae2f5308c3c7138f9972c8d"} Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.626724 5030 scope.go:117] "RemoveContainer" containerID="6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.630810 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.630853 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.630922 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.631032 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.654138 5030 scope.go:117] "RemoveContainer" containerID="16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.675728 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.685866 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.687276 5030 scope.go:117] "RemoveContainer" containerID="6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.687796 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4\": container with ID starting with 6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4 not found: ID does not exist" containerID="6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.687842 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4"} err="failed to get container status \"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4\": rpc error: code = NotFound desc = could not find container \"6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4\": container with ID starting with 6247a77514dea17f931c5a0520c612b84e0184270b42ce2a744c871dd5a047b4 not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.687874 5030 scope.go:117] "RemoveContainer" containerID="16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed" Jan 20 22:58:11 crc kubenswrapper[5030]: E0120 22:58:11.688708 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed\": container with ID starting with 16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed not found: ID does not exist" containerID="16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.688868 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed"} err="failed to get container status \"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed\": rpc error: code = NotFound desc = could not find container \"16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed\": container with ID starting with 
16e00ec261436bba79f4e1f1da0aa6c5a6077529ffff0c75640ae041e0a9aaed not found: ID does not exist" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.694068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.702275 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.731789 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.740360 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.748698 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.755381 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-64675d4b5c-z9thv"] Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.976798 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d59910a-9b07-47ff-b79b-661139ece8c3" path="/var/lib/kubelet/pods/1d59910a-9b07-47ff-b79b-661139ece8c3/volumes" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.977498 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a199be04-e085-49d3-a376-abdf54773b99" path="/var/lib/kubelet/pods/a199be04-e085-49d3-a376-abdf54773b99/volumes" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.978508 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" path="/var/lib/kubelet/pods/b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef/volumes" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.980111 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe6a225-5530-4922-91ec-1c4a939cb98b" path="/var/lib/kubelet/pods/efe6a225-5530-4922-91ec-1c4a939cb98b/volumes" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.981328 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" path="/var/lib/kubelet/pods/f4870c60-c6b8-48d3-998e-a91670fcedbc/volumes" Jan 20 22:58:11 crc kubenswrapper[5030]: I0120 22:58:11.984049 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd100215-a3f7-497e-842d-582681701903" path="/var/lib/kubelet/pods/fd100215-a3f7-497e-842d-582681701903/volumes" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.290284 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.379540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs\") pod \"ce21c77f-25ce-4bba-b711-7375ce5e824c\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.379668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom\") pod \"ce21c77f-25ce-4bba-b711-7375ce5e824c\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.379718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle\") pod \"ce21c77f-25ce-4bba-b711-7375ce5e824c\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.379781 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9m56\" (UniqueName: \"kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56\") pod \"ce21c77f-25ce-4bba-b711-7375ce5e824c\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.379845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data\") pod \"ce21c77f-25ce-4bba-b711-7375ce5e824c\" (UID: \"ce21c77f-25ce-4bba-b711-7375ce5e824c\") " Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.380034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs" (OuterVolumeSpecName: "logs") pod "ce21c77f-25ce-4bba-b711-7375ce5e824c" (UID: "ce21c77f-25ce-4bba-b711-7375ce5e824c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.383107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce21c77f-25ce-4bba-b711-7375ce5e824c" (UID: "ce21c77f-25ce-4bba-b711-7375ce5e824c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.384920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56" (OuterVolumeSpecName: "kube-api-access-f9m56") pod "ce21c77f-25ce-4bba-b711-7375ce5e824c" (UID: "ce21c77f-25ce-4bba-b711-7375ce5e824c"). InnerVolumeSpecName "kube-api-access-f9m56". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.416137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce21c77f-25ce-4bba-b711-7375ce5e824c" (UID: "ce21c77f-25ce-4bba-b711-7375ce5e824c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.422480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data" (OuterVolumeSpecName: "config-data") pod "ce21c77f-25ce-4bba-b711-7375ce5e824c" (UID: "ce21c77f-25ce-4bba-b711-7375ce5e824c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.481762 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.481797 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9m56\" (UniqueName: \"kubernetes.io/projected/ce21c77f-25ce-4bba-b711-7375ce5e824c-kube-api-access-f9m56\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.481809 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.481817 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce21c77f-25ce-4bba-b711-7375ce5e824c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.481827 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce21c77f-25ce-4bba-b711-7375ce5e824c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.641191 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerID="405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5" exitCode=137 Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.641258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerDied","Data":"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5"} Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.641291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" event={"ID":"ce21c77f-25ce-4bba-b711-7375ce5e824c","Type":"ContainerDied","Data":"e2f1fb0d9eb9ea650acf54f5893cecdf9eca5f9ae8472088d53d46b93493a224"} Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.641308 5030 scope.go:117] "RemoveContainer" containerID="405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.641312 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.682325 5030 scope.go:117] "RemoveContainer" containerID="903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.699990 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.708602 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-9fc44964b-hvwvt"] Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.732348 5030 scope.go:117] "RemoveContainer" containerID="405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5" Jan 20 22:58:12 crc kubenswrapper[5030]: E0120 22:58:12.732959 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5\": container with ID starting with 405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5 not found: ID does not exist" containerID="405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.732988 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5"} err="failed to get container status \"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5\": rpc error: code = NotFound desc = could not find container \"405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5\": container with ID starting with 405b9eff86b1d0a72cc550154567e82c0cebeba952b16c2b03fcc3116f8d17a5 not found: ID does not exist" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.733012 5030 scope.go:117] "RemoveContainer" containerID="903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a" Jan 20 22:58:12 crc kubenswrapper[5030]: E0120 22:58:12.736973 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a\": container with ID starting with 903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a not found: ID does not exist" containerID="903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a" Jan 20 22:58:12 crc kubenswrapper[5030]: I0120 22:58:12.737014 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a"} err="failed to get container status \"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a\": rpc error: code = NotFound desc = could not find container \"903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a\": container with ID starting with 903e3d3d344c10e9eb7cd1b444ed7504bc9dc3e0e1295314208b54614b16480a not found: ID does not exist" Jan 20 22:58:13 crc kubenswrapper[5030]: I0120 22:58:13.975171 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" path="/var/lib/kubelet/pods/ce21c77f-25ce-4bba-b711-7375ce5e824c/volumes" Jan 20 22:58:14 crc kubenswrapper[5030]: I0120 22:58:14.672596 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_neutron-65fcc556fd-25njc_3ce810a5-c4fb-4211-9bf1-eb6300276d70/neutron-api/0.log" Jan 20 22:58:14 crc kubenswrapper[5030]: I0120 22:58:14.672991 5030 generic.go:334] "Generic (PLEG): container finished" podID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerID="18040f298fd1f797165b1d82dca99dc3c800a6017b34137e3a03f3f67c4e976f" exitCode=137 Jan 20 22:58:14 crc kubenswrapper[5030]: I0120 22:58:14.673046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerDied","Data":"18040f298fd1f797165b1d82dca99dc3c800a6017b34137e3a03f3f67c4e976f"} Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.163814 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-65fcc556fd-25njc_3ce810a5-c4fb-4211-9bf1-eb6300276d70/neutron-api/0.log" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.163888 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.226760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.226821 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.226859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.226902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.226983 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgnts\" (UniqueName: \"kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.227015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" (UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.227080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config\") pod \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\" 
(UID: \"3ce810a5-c4fb-4211-9bf1-eb6300276d70\") " Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.242762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.242786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts" (OuterVolumeSpecName: "kube-api-access-cgnts") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "kube-api-access-cgnts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.269299 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.276197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.282707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config" (OuterVolumeSpecName: "config") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.292894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.293865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3ce810a5-c4fb-4211-9bf1-eb6300276d70" (UID: "3ce810a5-c4fb-4211-9bf1-eb6300276d70"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328755 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328805 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328828 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328847 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328866 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328884 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3ce810a5-c4fb-4211-9bf1-eb6300276d70-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.328902 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgnts\" (UniqueName: \"kubernetes.io/projected/3ce810a5-c4fb-4211-9bf1-eb6300276d70-kube-api-access-cgnts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.693937 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-65fcc556fd-25njc_3ce810a5-c4fb-4211-9bf1-eb6300276d70/neutron-api/0.log" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.694360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" event={"ID":"3ce810a5-c4fb-4211-9bf1-eb6300276d70","Type":"ContainerDied","Data":"50767913068a65b6cf20e84d096e45421fdf9953a9461370793203236b0d12e7"} Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.694405 5030 scope.go:117] "RemoveContainer" containerID="e0142e268b351b6a62427e70c71a48fc93815f1920a2cb306ed981400fac604f" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.694494 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-65fcc556fd-25njc" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.734708 5030 scope.go:117] "RemoveContainer" containerID="18040f298fd1f797165b1d82dca99dc3c800a6017b34137e3a03f3f67c4e976f" Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.758143 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.764883 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-65fcc556fd-25njc"] Jan 20 22:58:15 crc kubenswrapper[5030]: I0120 22:58:15.991720 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" path="/var/lib/kubelet/pods/3ce810a5-c4fb-4211-9bf1-eb6300276d70/volumes" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.513537 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514331 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514346 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514364 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="swift-recon-cron" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="swift-recon-cron" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514383 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514392 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514402 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-expirer" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-expirer" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514423 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514431 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514444 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514452 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514463 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="sg-core" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="sg-core" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514485 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0994e0e-f167-4c5a-bef7-95da6f589767" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0994e0e-f167-4c5a-bef7-95da6f589767" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerName="kube-state-metrics" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerName="kube-state-metrics" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514545 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514553 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514567 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514575 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514591 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514598 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514611 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="galera" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514641 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="galera" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514655 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="rsync" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="rsync" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514671 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514680 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514692 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e567662-7ee7-4658-8954-06e43a966ce6" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514702 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e567662-7ee7-4658-8954-06e43a966ce6" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514710 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="init" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514721 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="init" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514729 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514737 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="setup-container" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514758 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="setup-container" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514772 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514779 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514789 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514796 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514807 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514815 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514826 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514835 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-server" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514849 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514856 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514872 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514879 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514906 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-reaper" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-reaper" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514922 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514939 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514946 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514957 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="openstack-network-exporter" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514964 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="openstack-network-exporter" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.514983 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-server" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.514994 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515001 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515025 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515033 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515044 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515051 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515064 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515071 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515094 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515101 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515112 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="dnsmasq-dns" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515120 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="dnsmasq-dns" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515131 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515138 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515154 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerName="nova-scheduler-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515162 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerName="nova-scheduler-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515183 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515202 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515210 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515220 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515228 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: 
E0120 22:58:25.515240 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515248 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515257 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515287 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="probe" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515306 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="probe" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515327 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515336 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515343 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515356 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-notification-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515364 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-notification-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515374 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515381 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515394 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515402 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-api" Jan 20 
22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515410 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515417 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515431 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515439 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515460 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-server" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515469 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515477 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515487 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515506 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="cinder-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515515 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="cinder-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515528 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515548 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="mysql-bootstrap" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515555 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="mysql-bootstrap" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515564 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" 
containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515585 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515592 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-server" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515602 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515609 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-api" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515692 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515702 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515711 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515718 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515730 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-central-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515738 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-central-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515746 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515754 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515767 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515814 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: E0120 22:58:25.515824 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="setup-container" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515832 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="setup-container" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.515986 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516004 5030 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516013 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-expirer" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516026 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516039 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516051 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="openstack-network-exporter" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516059 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="cinder-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516073 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516084 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516097 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a888a7-7114-4dcb-b271-4422a3802e3b" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516106 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-reaper" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a5273a-9439-4881-b88a-2a3aa11e410b" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516126 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-central-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516136 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516148 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516161 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516171 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fffc32-86e9-4278-8c2d-37a04bbd89b1" containerName="nova-cell1-conductor-conductor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516182 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516192 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" 
containerName="object-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516206 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516217 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516226 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd48f030-3391-4fdb-b101-45b0a7f4b9a3" containerName="nova-scheduler-scheduler" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516237 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a05e942-d886-4355-b352-50408cfe11e3" containerName="galera" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516258 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bfa948-051b-4e6c-92a5-3714e0652088" containerName="rabbitmq" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516269 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e33cea-b99d-4398-9ecb-ec247a8eb2cc" containerName="dnsmasq-dns" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516277 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516289 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516299 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516308 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-auditor" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e567662-7ee7-4658-8954-06e43a966ce6" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce810a5-c4fb-4211-9bf1-eb6300276d70" containerName="neutron-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516335 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516342 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a5a6f5-2652-4f8d-8a3b-ae3b58ab4869" containerName="ovn-northd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516363 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="285c1a35-91f7-441f-aaf2-a228cccc8962" containerName="barbican-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516375 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb8aa29-a5b3-4060-a8a6-19d1eda96426" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516384 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="rsync" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516394 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd100215-a3f7-497e-842d-582681701903" containerName="neutron-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="sg-core" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516417 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d969fd5b-6579-4bc5-b64f-05bcb2aec29d" containerName="kube-state-metrics" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf270c3-8bc4-4b11-8c04-10708f22f76e" containerName="cinder-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516433 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516442 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516449 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2732803-fafc-45fd-9cc7-4b61bd446fe0" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="account-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516483 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3fc09a-8a37-4559-91cb-817bcc4171b9" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516493 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4277815-cc65-44c8-b719-7673ce4c1d88" containerName="nova-metadata-metadata" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516513 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0994e0e-f167-4c5a-bef7-95da6f589767" containerName="keystone-api" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516524 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="object-replicator" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516535 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce21c77f-25ce-4bba-b711-7375ce5e824c" containerName="barbican-keystone-listener" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516547 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b05ddec-40d4-42ea-987c-03718a02724d" containerName="nova-api-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516556 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-server" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516566 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="swift-recon-cron" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516575 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="962ec476-3693-41a1-b425-a275bf790beb" containerName="glance-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516588 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d09f29b6-8161-4158-810d-efe783958358" containerName="probe" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516598 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bf0ab4-25bd-4aec-85bb-c4daa416c7ef" containerName="barbican-worker" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516607 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddc1fbc-a41d-446b-81e3-a15febd39285" containerName="ceilometer-notification-agent" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516643 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd556b2-18b0-43da-ad37-e84c9e7601cc" containerName="proxy-httpd" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516652 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6acf7ebc-be88-4e1d-9b31-bef0707ea082" containerName="glance-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516662 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f93538-1f2d-4699-9b7b-dcf4c110178a" containerName="placement-log" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.516673 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4870c60-c6b8-48d3-998e-a91670fcedbc" containerName="container-updater" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.517558 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.524420 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.529601 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.529663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-d79nm" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.529674 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.529699 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.530056 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.530306 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.537653 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.605900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.605983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25ll\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.606274 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25ll\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.707915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.708314 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.709099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.709863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.710610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.710766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.710835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.714252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.716261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.716873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.719459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.726067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25ll\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.740393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:25 crc kubenswrapper[5030]: I0120 22:58:25.854980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.089999 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.778346 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.781552 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.786039 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.786074 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.786414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.787443 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.788022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.793241 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.793358 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-x5wl9" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.798200 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.810383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerStarted","Data":"042a45836f5533a4769ed35bfb4435bb732499ab8cf956d04c6181f28acffce9"} Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65db\" (UniqueName: 
\"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.837999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.838105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.838176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.838306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.838363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.838525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65db\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.939861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.940268 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.940419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.940800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.940851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.941693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.943424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 
22:58:26.945525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.945789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.945801 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.968890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65db\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:26 crc kubenswrapper[5030]: I0120 22:58:26.972852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.146486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.147053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.322335 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.325062 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.333305 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.334854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.335142 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.337815 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-jcxx5" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.340783 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.346542 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 
22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4kr\" (UniqueName: \"kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.453390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.556979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4kr\" (UniqueName: \"kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.557055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 
22:58:27.557228 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.557422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.557702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.558054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.559481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.560372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.562794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.564715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.581942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4kr\" (UniqueName: \"kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.583814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-0\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.609268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 22:58:27 crc kubenswrapper[5030]: W0120 22:58:27.617379 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f11c15_8856_45fd_9703_348310781d5a.slice/crio-ef13c838ced0f980ead5be321b135461da111a3826340912690b010245dfb0c6 WatchSource:0}: Error finding container ef13c838ced0f980ead5be321b135461da111a3826340912690b010245dfb0c6: Status 404 returned error can't find the container with id ef13c838ced0f980ead5be321b135461da111a3826340912690b010245dfb0c6 Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.652739 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.823469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerStarted","Data":"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2"} Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.827898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerStarted","Data":"ef13c838ced0f980ead5be321b135461da111a3826340912690b010245dfb0c6"} Jan 20 22:58:27 crc kubenswrapper[5030]: I0120 22:58:27.879200 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 22:58:27 crc kubenswrapper[5030]: W0120 22:58:27.882827 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a16fd61_1cde_4e12_8e9f_db0c8f2182b7.slice/crio-48aa49b8cf040be83cb11c7fe449fde83c14cc4b734a69f5407c7447559a3e9b WatchSource:0}: Error finding container 48aa49b8cf040be83cb11c7fe449fde83c14cc4b734a69f5407c7447559a3e9b: Status 404 returned error can't find the container with id 48aa49b8cf040be83cb11c7fe449fde83c14cc4b734a69f5407c7447559a3e9b Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.690428 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.692324 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.694262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-bz4km" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.696936 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.696982 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.697096 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.710799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.834692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerStarted","Data":"90624f592f91634ffe7a336b3a4f4109cf49de743eedeca34a6df4b0c20d7765"} Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.834743 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerStarted","Data":"48aa49b8cf040be83cb11c7fe449fde83c14cc4b734a69f5407c7447559a3e9b"} Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25csx\" (UniqueName: \"kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.882916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984414 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25csx\" (UniqueName: \"kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984796 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.984934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.985085 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.985274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.986266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.992138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:28 crc kubenswrapper[5030]: I0120 22:58:28.992257 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.000147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25csx\" (UniqueName: \"kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.003591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.007791 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.229911 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 22:58:29 crc kubenswrapper[5030]: W0120 22:58:29.232569 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf927a168_d367_4174_9e6e_fd9b7964346b.slice/crio-53e9263094fb938ecb15f86897c564340afd1397551859b0675e06fe8ca5851a WatchSource:0}: Error finding container 53e9263094fb938ecb15f86897c564340afd1397551859b0675e06fe8ca5851a: Status 404 returned error can't find the container with id 53e9263094fb938ecb15f86897c564340afd1397551859b0675e06fe8ca5851a Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.761224 5030 scope.go:117] "RemoveContainer" containerID="9d6749cd3ec4dbd9236dd6d285b026b84ee8f1460d06ad6511184a6fa9b84e5b" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.845427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerStarted","Data":"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b"} Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.845470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerStarted","Data":"53e9263094fb938ecb15f86897c564340afd1397551859b0675e06fe8ca5851a"} Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.847006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerStarted","Data":"023df1c950388e7dcb139b685ad0bdfed5f5d067074b9530aa12528857b11e81"} Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.911932 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.912827 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.914471 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-f4998" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.914713 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.914835 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 22:58:29 crc kubenswrapper[5030]: I0120 22:58:29.924115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.002371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q59nz\" (UniqueName: \"kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.002436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.002546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.002574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.002678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.103681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.103769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.103810 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q59nz\" (UniqueName: \"kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.103839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.103903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.104704 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.105371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.109094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.109121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.120181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q59nz\" (UniqueName: \"kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz\") pod \"memcached-0\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.230975 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.663896 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 22:58:30 crc kubenswrapper[5030]: W0120 22:58:30.669139 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f73fe0c_f004_4b64_8952_cc6d7ebb26a9.slice/crio-7ad8983153ac237af54853992a2d9be4d360e8a81b2945f412f2e918378244b6 WatchSource:0}: Error finding container 7ad8983153ac237af54853992a2d9be4d360e8a81b2945f412f2e918378244b6: Status 404 returned error can't find the container with id 7ad8983153ac237af54853992a2d9be4d360e8a81b2945f412f2e918378244b6 Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.864290 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerID="90624f592f91634ffe7a336b3a4f4109cf49de743eedeca34a6df4b0c20d7765" exitCode=0 Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.864412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerDied","Data":"90624f592f91634ffe7a336b3a4f4109cf49de743eedeca34a6df4b0c20d7765"} Jan 20 22:58:30 crc kubenswrapper[5030]: I0120 22:58:30.868469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9","Type":"ContainerStarted","Data":"7ad8983153ac237af54853992a2d9be4d360e8a81b2945f412f2e918378244b6"} Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.859108 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.860549 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.866863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.869066 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-gjsr5" Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.877613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerStarted","Data":"b2bea81020fe0e73b0b3e7d490ce3abe8750029ce24d8cb352e08256fba27883"} Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.896859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9","Type":"ContainerStarted","Data":"ff2baf8e3414b506fcd4fe9cdcb3f98e506b98207980d6f88effbe946d2e0a60"} Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.897641 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.916002 5030 generic.go:334] "Generic (PLEG): container finished" podID="f927a168-d367-4174-9e6e-fd9b7964346b" containerID="23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b" exitCode=0 Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.916243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerDied","Data":"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b"} Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.934062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtm4n\" (UniqueName: \"kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n\") pod \"kube-state-metrics-0\" (UID: \"4799c457-9dc2-4576-961d-9e5c5cbf0fd7\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.946844 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=4.9468273400000005 podStartE2EDuration="4.94682734s" podCreationTimestamp="2026-01-20 22:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:31.926794511 +0000 UTC m=+1384.247054829" watchObservedRunningTime="2026-01-20 22:58:31.94682734 +0000 UTC m=+1384.267087628" Jan 20 22:58:31 crc kubenswrapper[5030]: I0120 22:58:31.961483 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.961464398 podStartE2EDuration="2.961464398s" podCreationTimestamp="2026-01-20 22:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:31.961030097 +0000 UTC m=+1384.281290385" watchObservedRunningTime="2026-01-20 22:58:31.961464398 +0000 UTC m=+1384.281724676" Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.035769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtm4n\" (UniqueName: 
\"kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n\") pod \"kube-state-metrics-0\" (UID: \"4799c457-9dc2-4576-961d-9e5c5cbf0fd7\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.051285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtm4n\" (UniqueName: \"kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n\") pod \"kube-state-metrics-0\" (UID: \"4799c457-9dc2-4576-961d-9e5c5cbf0fd7\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.179488 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.421277 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 22:58:32 crc kubenswrapper[5030]: W0120 22:58:32.422130 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4799c457_9dc2_4576_961d_9e5c5cbf0fd7.slice/crio-dd1d3506617600dee86435951a9669ea5806a8732cb658b420236393a76001b3 WatchSource:0}: Error finding container dd1d3506617600dee86435951a9669ea5806a8732cb658b420236393a76001b3: Status 404 returned error can't find the container with id dd1d3506617600dee86435951a9669ea5806a8732cb658b420236393a76001b3 Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.423856 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.927528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4799c457-9dc2-4576-961d-9e5c5cbf0fd7","Type":"ContainerStarted","Data":"dd1d3506617600dee86435951a9669ea5806a8732cb658b420236393a76001b3"} Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.930146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerStarted","Data":"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380"} Jan 20 22:58:32 crc kubenswrapper[5030]: I0120 22:58:32.958460 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=4.958411755 podStartE2EDuration="4.958411755s" podCreationTimestamp="2026-01-20 22:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:32.957911272 +0000 UTC m=+1385.278171600" watchObservedRunningTime="2026-01-20 22:58:32.958411755 +0000 UTC m=+1385.278672093" Jan 20 22:58:33 crc kubenswrapper[5030]: E0120 22:58:33.659527 5030 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.9:33298->38.102.83.9:37955: read tcp 38.102.83.9:33298->38.102.83.9:37955: read: connection reset by peer Jan 20 22:58:33 crc kubenswrapper[5030]: E0120 22:58:33.659743 5030 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.9:33298->38.102.83.9:37955: write tcp 38.102.83.9:33298->38.102.83.9:37955: write: broken pipe Jan 20 22:58:33 crc kubenswrapper[5030]: I0120 22:58:33.941191 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4799c457-9dc2-4576-961d-9e5c5cbf0fd7","Type":"ContainerStarted","Data":"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d"} Jan 20 22:58:33 crc kubenswrapper[5030]: I0120 22:58:33.960514 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.589908009 podStartE2EDuration="2.960488876s" podCreationTimestamp="2026-01-20 22:58:31 +0000 UTC" firstStartedPulling="2026-01-20 22:58:32.423653385 +0000 UTC m=+1384.743913673" lastFinishedPulling="2026-01-20 22:58:32.794234212 +0000 UTC m=+1385.114494540" observedRunningTime="2026-01-20 22:58:33.957997316 +0000 UTC m=+1386.278257644" watchObservedRunningTime="2026-01-20 22:58:33.960488876 +0000 UTC m=+1386.280749204" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.882969 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.884479 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.887347 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.887483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-pz99d" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.887519 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.887343 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.899827 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.922696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:58:34 crc kubenswrapper[5030]: I0120 22:58:34.950981 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.001920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.001980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92x4t\" (UniqueName: \"kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t\") pod \"ovsdbserver-nb-0\" (UID: 
\"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.002143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 
crc kubenswrapper[5030]: I0120 22:58:35.103740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.103930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92x4t\" (UniqueName: \"kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.104084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.104667 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.105543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.105596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.110353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.110960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.112239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.123433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92x4t\" (UniqueName: \"kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.126628 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-nb-0\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.211318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.232971 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.675975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 22:58:35 crc kubenswrapper[5030]: W0120 22:58:35.678998 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886eef18_1c15_40bc_a798_217df2a568ff.slice/crio-4c5c790317fff819e3752f45df8b1bf73074faeb39193fba5dcd1b456a8cce1e WatchSource:0}: Error finding container 4c5c790317fff819e3752f45df8b1bf73074faeb39193fba5dcd1b456a8cce1e: Status 404 returned error can't find the container with id 4c5c790317fff819e3752f45df8b1bf73074faeb39193fba5dcd1b456a8cce1e Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.959818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerStarted","Data":"2e33323b62f3534db68317bbdcfa4b96b235ccb6f864576f5bc4f881659045cb"} Jan 20 22:58:35 crc kubenswrapper[5030]: I0120 22:58:35.960155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerStarted","Data":"4c5c790317fff819e3752f45df8b1bf73074faeb39193fba5dcd1b456a8cce1e"} Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.963422 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.964760 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.967732 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.968083 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.969083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-jwgzb" Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.969202 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 22:58:36 crc kubenswrapper[5030]: I0120 22:58:36.971107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerStarted","Data":"0baa25f272096765a528195b72ca66c3413a72c927cbfa02fe2cad36d4bbd81c"} Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.030704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.046499 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.046480952 podStartE2EDuration="3.046480952s" podCreationTimestamp="2026-01-20 22:58:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:37.044217246 +0000 UTC m=+1389.364477534" watchObservedRunningTime="2026-01-20 22:58:37.046480952 +0000 UTC m=+1389.366741250" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.136673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.136721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.136747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.136836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.136920 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.137008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.137053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.137187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9kl8\" (UniqueName: \"kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9kl8\" (UniqueName: \"kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238344 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238484 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.238524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.239665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.239680 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.239978 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.240159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.248770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.250796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.252242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.263331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9kl8\" (UniqueName: \"kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.273868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.294682 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.653349 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.653774 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.752662 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.776232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 22:58:37 crc kubenswrapper[5030]: W0120 22:58:37.782708 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode90be399_4f89_42ef_950a_2ef8c14a4d0f.slice/crio-caae53aba34ae8f4022ee1724b905e205b673a25f26e839d113d103817f284f4 WatchSource:0}: Error finding container caae53aba34ae8f4022ee1724b905e205b673a25f26e839d113d103817f284f4: Status 404 returned error can't find the container with id caae53aba34ae8f4022ee1724b905e205b673a25f26e839d113d103817f284f4 Jan 20 22:58:37 crc kubenswrapper[5030]: I0120 22:58:37.980968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerStarted","Data":"caae53aba34ae8f4022ee1724b905e205b673a25f26e839d113d103817f284f4"} Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.082770 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.213331 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.407357 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.996238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerStarted","Data":"e9be47aacf56d38b357ffb541b2521e1d61edfb8742915789559a4601680022e"} Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.996607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerStarted","Data":"0744807f85e5619e2c08ebdc739a96d47e1c08e5f289d32e4fc562f7632a83d0"} Jan 20 22:58:38 crc kubenswrapper[5030]: I0120 22:58:38.997893 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.009196 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.009322 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.029761 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.029733535 podStartE2EDuration="3.029733535s" podCreationTimestamp="2026-01-20 22:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:39.024889307 +0000 UTC m=+1391.345149635" watchObservedRunningTime="2026-01-20 22:58:39.029733535 +0000 UTC m=+1391.349993863" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.124377 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.851606 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-brxx2"] Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.853226 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.874024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-brxx2"] Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.887942 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.888165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hmq\" (UniqueName: \"kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.889577 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-476mm"] Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.890689 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.897038 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.910215 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-476mm"] Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.990584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hmq\" (UniqueName: \"kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.990829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.990902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.990972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m755\" (UniqueName: \"kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:39 crc kubenswrapper[5030]: I0120 22:58:39.992377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.012793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hmq\" (UniqueName: \"kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq\") pod \"keystone-db-create-brxx2\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.052990 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.092727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " 
pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.092967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m755\" (UniqueName: \"kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.094422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.107547 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.128864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m755\" (UniqueName: \"kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755\") pod \"keystone-afd0-account-create-update-476mm\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.139465 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-mbfq9"] Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.141661 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.155962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-mbfq9"] Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.157039 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.157111 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.187263 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.225509 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.252261 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp"] Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.267986 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp"] Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.268088 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.270339 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.296262 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.310534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.310804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwhm\" (UniqueName: \"kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.310826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.310854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkln\" (UniqueName: \"kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.345268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.412532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkln\" (UniqueName: \"kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.412838 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.412986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.413072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwhm\" (UniqueName: \"kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.413891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.414239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.427978 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkln\" (UniqueName: \"kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln\") pod \"placement-f6c7-account-create-update-rs7dp\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:40 crc kubenswrapper[5030]: I0120 22:58:40.428045 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwhm\" (UniqueName: \"kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm\") pod \"placement-db-create-mbfq9\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:40.475229 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:40.635866 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-brxx2"] Jan 20 22:58:41 crc kubenswrapper[5030]: W0120 22:58:40.640275 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399e28e7_a389_47c4_b8c2_89d8dbad34be.slice/crio-8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed WatchSource:0}: Error finding container 8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed: Status 404 returned error can't find the container with id 8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:40.641373 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:40.756596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-476mm"] Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.019008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-brxx2" event={"ID":"399e28e7-a389-47c4-b8c2-89d8dbad34be","Type":"ContainerStarted","Data":"b9431845f134e3c49f0b4fda626f37bee07bab7e57e203f049a9bdac3398065b"} Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.019057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-brxx2" event={"ID":"399e28e7-a389-47c4-b8c2-89d8dbad34be","Type":"ContainerStarted","Data":"8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed"} Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.024778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" event={"ID":"1a712d79-3228-4e70-ae9d-8436a0dee203","Type":"ContainerStarted","Data":"15f3f2e178878a7a7591f1942904d8f760ff5c2f3da40e1b5354683582d9d89a"} Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.024860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" event={"ID":"1a712d79-3228-4e70-ae9d-8436a0dee203","Type":"ContainerStarted","Data":"732d2f56e911467f0506c0097c543ea3b08da186ebb2cb470aa1df43fd61aeed"} Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.025408 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.041012 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-create-brxx2" podStartSLOduration=2.040982101 podStartE2EDuration="2.040982101s" podCreationTimestamp="2026-01-20 22:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:41.04050808 +0000 UTC m=+1393.360768378" watchObservedRunningTime="2026-01-20 22:58:41.040982101 +0000 UTC m=+1393.361242389" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.062596 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" podStartSLOduration=2.06257402 podStartE2EDuration="2.06257402s" 
podCreationTimestamp="2026-01-20 22:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:41.05603464 +0000 UTC m=+1393.376294948" watchObservedRunningTime="2026-01-20 22:58:41.06257402 +0000 UTC m=+1393.382834308" Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.659481 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp"] Jan 20 22:58:41 crc kubenswrapper[5030]: W0120 22:58:41.663312 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10a501a_a4d8_4baf_b4f0_f76193fc92f3.slice/crio-2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5 WatchSource:0}: Error finding container 2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5: Status 404 returned error can't find the container with id 2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5 Jan 20 22:58:41 crc kubenswrapper[5030]: I0120 22:58:41.670190 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-mbfq9"] Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.035775 5030 generic.go:334] "Generic (PLEG): container finished" podID="399e28e7-a389-47c4-b8c2-89d8dbad34be" containerID="b9431845f134e3c49f0b4fda626f37bee07bab7e57e203f049a9bdac3398065b" exitCode=0 Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.035859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-brxx2" event={"ID":"399e28e7-a389-47c4-b8c2-89d8dbad34be","Type":"ContainerDied","Data":"b9431845f134e3c49f0b4fda626f37bee07bab7e57e203f049a9bdac3398065b"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.038235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" event={"ID":"145c5e5d-0043-46f7-af17-4426bea1a3eb","Type":"ContainerStarted","Data":"51833c9fe476e8a63424a4a38609c31c548e23bb8b6381dba3b27392a0a655de"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.038283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" event={"ID":"145c5e5d-0043-46f7-af17-4426bea1a3eb","Type":"ContainerStarted","Data":"c6d112cf934dfdef042538db235d8629571756a01c0e50051efe894723cf653c"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.040313 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a712d79-3228-4e70-ae9d-8436a0dee203" containerID="15f3f2e178878a7a7591f1942904d8f760ff5c2f3da40e1b5354683582d9d89a" exitCode=0 Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.040381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" event={"ID":"1a712d79-3228-4e70-ae9d-8436a0dee203","Type":"ContainerDied","Data":"15f3f2e178878a7a7591f1942904d8f760ff5c2f3da40e1b5354683582d9d89a"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.043009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-mbfq9" event={"ID":"a10a501a-a4d8-4baf-b4f0-f76193fc92f3","Type":"ContainerStarted","Data":"2fb065624a0cd717b1029696de26712931b9386e0c465d5a6fc6c506a938300a"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.043067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-db-create-mbfq9" event={"ID":"a10a501a-a4d8-4baf-b4f0-f76193fc92f3","Type":"ContainerStarted","Data":"2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5"} Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.099281 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-create-mbfq9" podStartSLOduration=2.099261537 podStartE2EDuration="2.099261537s" podCreationTimestamp="2026-01-20 22:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:42.088453314 +0000 UTC m=+1394.408713622" watchObservedRunningTime="2026-01-20 22:58:42.099261537 +0000 UTC m=+1394.419521845" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.111496 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" podStartSLOduration=2.111475256 podStartE2EDuration="2.111475256s" podCreationTimestamp="2026-01-20 22:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:42.106590407 +0000 UTC m=+1394.426850695" watchObservedRunningTime="2026-01-20 22:58:42.111475256 +0000 UTC m=+1394.431735544" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.122117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.188499 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.338725 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.340561 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.342962 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-cv6lg" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.343288 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.343565 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.352647 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.352955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.366467 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.374276 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.378775 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.379064 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-72mn5" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.379188 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.379442 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.405039 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441751 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgltc\" (UniqueName: \"kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtnp\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp\") pod 
\"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441806 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.441918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543595 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543770 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 
22:58:42.543793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgltc\" (UniqueName: \"kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtnp\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.543909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: E0120 22:58:42.544560 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:58:42 crc kubenswrapper[5030]: E0120 22:58:42.544580 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:58:42 crc kubenswrapper[5030]: E0120 22:58:42.544616 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift podName:d8bcc41f-fad5-4773-be90-803444a622b0 nodeName:}" 
failed. No retries permitted until 2026-01-20 22:58:43.044601842 +0000 UTC m=+1395.364862130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift") pod "swift-storage-0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0") : configmap "swift-ring-files" not found Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.544638 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.544695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.544770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.545043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.545485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.545501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.550820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.551209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.552186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.561227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtnp\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.563331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgltc\" (UniqueName: \"kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc\") pod \"ovn-northd-0\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.565992 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:42 crc kubenswrapper[5030]: I0120 22:58:42.654219 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.028045 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-b5r8x"] Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.029886 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.037174 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.037206 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.047263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-b5r8x"] Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.047577 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.053071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:43 crc kubenswrapper[5030]: E0120 22:58:43.053664 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:58:43 crc kubenswrapper[5030]: E0120 22:58:43.053693 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:58:43 crc kubenswrapper[5030]: E0120 22:58:43.053758 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift podName:d8bcc41f-fad5-4773-be90-803444a622b0 nodeName:}" failed. 
No retries permitted until 2026-01-20 22:58:44.053733556 +0000 UTC m=+1396.373993924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift") pod "swift-storage-0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0") : configmap "swift-ring-files" not found Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.056599 5030 generic.go:334] "Generic (PLEG): container finished" podID="a10a501a-a4d8-4baf-b4f0-f76193fc92f3" containerID="2fb065624a0cd717b1029696de26712931b9386e0c465d5a6fc6c506a938300a" exitCode=0 Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.056731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-mbfq9" event={"ID":"a10a501a-a4d8-4baf-b4f0-f76193fc92f3","Type":"ContainerDied","Data":"2fb065624a0cd717b1029696de26712931b9386e0c465d5a6fc6c506a938300a"} Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.064294 5030 generic.go:334] "Generic (PLEG): container finished" podID="145c5e5d-0043-46f7-af17-4426bea1a3eb" containerID="51833c9fe476e8a63424a4a38609c31c548e23bb8b6381dba3b27392a0a655de" exitCode=0 Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.065034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" event={"ID":"145c5e5d-0043-46f7-af17-4426bea1a3eb","Type":"ContainerDied","Data":"51833c9fe476e8a63424a4a38609c31c548e23bb8b6381dba3b27392a0a655de"} Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.125031 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 22:58:43 crc kubenswrapper[5030]: W0120 22:58:43.140258 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4db8327_61c4_4757_bed5_728ef0f8bbc6.slice/crio-eeb0058f88d6fd8423fef2453c0ebc56111945cc904514ce6a8c59eb3d1573e2 WatchSource:0}: Error finding container eeb0058f88d6fd8423fef2453c0ebc56111945cc904514ce6a8c59eb3d1573e2: Status 404 returned error can't find the container with id eeb0058f88d6fd8423fef2453c0ebc56111945cc904514ce6a8c59eb3d1573e2 Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.154774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.154882 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.154906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.154945 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.154995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.155079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.155116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbgch\" (UniqueName: \"kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.258636 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbgch\" (UniqueName: \"kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 
20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259549 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.259576 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.260348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.260977 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.261445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.263207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.263602 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.265561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.281948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbgch\" (UniqueName: \"kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch\") pod \"swift-ring-rebalance-b5r8x\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 
22:58:43.364054 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.452068 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.478659 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.567201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5hmq\" (UniqueName: \"kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq\") pod \"399e28e7-a389-47c4-b8c2-89d8dbad34be\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.567546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts\") pod \"399e28e7-a389-47c4-b8c2-89d8dbad34be\" (UID: \"399e28e7-a389-47c4-b8c2-89d8dbad34be\") " Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.567641 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts\") pod \"1a712d79-3228-4e70-ae9d-8436a0dee203\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.567697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m755\" (UniqueName: \"kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755\") pod \"1a712d79-3228-4e70-ae9d-8436a0dee203\" (UID: \"1a712d79-3228-4e70-ae9d-8436a0dee203\") " Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.568055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a712d79-3228-4e70-ae9d-8436a0dee203" (UID: "1a712d79-3228-4e70-ae9d-8436a0dee203"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.568307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "399e28e7-a389-47c4-b8c2-89d8dbad34be" (UID: "399e28e7-a389-47c4-b8c2-89d8dbad34be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.570950 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq" (OuterVolumeSpecName: "kube-api-access-p5hmq") pod "399e28e7-a389-47c4-b8c2-89d8dbad34be" (UID: "399e28e7-a389-47c4-b8c2-89d8dbad34be"). InnerVolumeSpecName "kube-api-access-p5hmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.571493 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755" (OuterVolumeSpecName: "kube-api-access-6m755") pod "1a712d79-3228-4e70-ae9d-8436a0dee203" (UID: "1a712d79-3228-4e70-ae9d-8436a0dee203"). InnerVolumeSpecName "kube-api-access-6m755". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.669701 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/399e28e7-a389-47c4-b8c2-89d8dbad34be-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.669741 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a712d79-3228-4e70-ae9d-8436a0dee203-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.669757 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m755\" (UniqueName: \"kubernetes.io/projected/1a712d79-3228-4e70-ae9d-8436a0dee203-kube-api-access-6m755\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.669773 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5hmq\" (UniqueName: \"kubernetes.io/projected/399e28e7-a389-47c4-b8c2-89d8dbad34be-kube-api-access-p5hmq\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:43 crc kubenswrapper[5030]: W0120 22:58:43.812507 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013d4d42_09ab_4f76_b77c_09a52e7c2635.slice/crio-99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8 WatchSource:0}: Error finding container 99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8: Status 404 returned error can't find the container with id 99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8 Jan 20 22:58:43 crc kubenswrapper[5030]: I0120 22:58:43.818593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-b5r8x"] Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.073838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:44 crc kubenswrapper[5030]: E0120 22:58:44.074045 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:58:44 crc kubenswrapper[5030]: E0120 22:58:44.074261 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:58:44 crc kubenswrapper[5030]: E0120 22:58:44.074378 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift podName:d8bcc41f-fad5-4773-be90-803444a622b0 nodeName:}" failed. No retries permitted until 2026-01-20 22:58:46.074344511 +0000 UTC m=+1398.394604829 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift") pod "swift-storage-0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0") : configmap "swift-ring-files" not found Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.077333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" event={"ID":"1a712d79-3228-4e70-ae9d-8436a0dee203","Type":"ContainerDied","Data":"732d2f56e911467f0506c0097c543ea3b08da186ebb2cb470aa1df43fd61aeed"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.077377 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732d2f56e911467f0506c0097c543ea3b08da186ebb2cb470aa1df43fd61aeed" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.077382 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-476mm" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.079331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerStarted","Data":"96fcd4787c2a4fc0ac3198c4288890edb23f73112901a2786f3617f650cef644"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.079391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerStarted","Data":"5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.079421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerStarted","Data":"eeb0058f88d6fd8423fef2453c0ebc56111945cc904514ce6a8c59eb3d1573e2"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.079587 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.084451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" event={"ID":"013d4d42-09ab-4f76-b77c-09a52e7c2635","Type":"ContainerStarted","Data":"f2c5b9db286e929674b9d0249c5dd2cb7c764e76513905ab03fd1fffafa3a335"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.084491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" event={"ID":"013d4d42-09ab-4f76-b77c-09a52e7c2635","Type":"ContainerStarted","Data":"99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.086174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-brxx2" event={"ID":"399e28e7-a389-47c4-b8c2-89d8dbad34be","Type":"ContainerDied","Data":"8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed"} Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.086222 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d5b79f48376d2724e81cad712cf07fe1c1ad4dbf53d769de98433c06436a6ed" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.086336 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-brxx2" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.113573 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.113554979 podStartE2EDuration="2.113554979s" podCreationTimestamp="2026-01-20 22:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:44.108779753 +0000 UTC m=+1396.429040041" watchObservedRunningTime="2026-01-20 22:58:44.113554979 +0000 UTC m=+1396.433815267" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.149555 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" podStartSLOduration=1.149532969 podStartE2EDuration="1.149532969s" podCreationTimestamp="2026-01-20 22:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:44.140427956 +0000 UTC m=+1396.460688264" watchObservedRunningTime="2026-01-20 22:58:44.149532969 +0000 UTC m=+1396.469793257" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.413978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.479455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts\") pod \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.479639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcwhm\" (UniqueName: \"kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm\") pod \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\" (UID: \"a10a501a-a4d8-4baf-b4f0-f76193fc92f3\") " Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.479954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a10a501a-a4d8-4baf-b4f0-f76193fc92f3" (UID: "a10a501a-a4d8-4baf-b4f0-f76193fc92f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.480131 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.484785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm" (OuterVolumeSpecName: "kube-api-access-zcwhm") pod "a10a501a-a4d8-4baf-b4f0-f76193fc92f3" (UID: "a10a501a-a4d8-4baf-b4f0-f76193fc92f3"). InnerVolumeSpecName "kube-api-access-zcwhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.486210 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.581221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngkln\" (UniqueName: \"kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln\") pod \"145c5e5d-0043-46f7-af17-4426bea1a3eb\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.581346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts\") pod \"145c5e5d-0043-46f7-af17-4426bea1a3eb\" (UID: \"145c5e5d-0043-46f7-af17-4426bea1a3eb\") " Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.581899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "145c5e5d-0043-46f7-af17-4426bea1a3eb" (UID: "145c5e5d-0043-46f7-af17-4426bea1a3eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.582056 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcwhm\" (UniqueName: \"kubernetes.io/projected/a10a501a-a4d8-4baf-b4f0-f76193fc92f3-kube-api-access-zcwhm\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.582095 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145c5e5d-0043-46f7-af17-4426bea1a3eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.583869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln" (OuterVolumeSpecName: "kube-api-access-ngkln") pod "145c5e5d-0043-46f7-af17-4426bea1a3eb" (UID: "145c5e5d-0043-46f7-af17-4426bea1a3eb"). InnerVolumeSpecName "kube-api-access-ngkln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:44 crc kubenswrapper[5030]: I0120 22:58:44.684420 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngkln\" (UniqueName: \"kubernetes.io/projected/145c5e5d-0043-46f7-af17-4426bea1a3eb-kube-api-access-ngkln\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.097561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" event={"ID":"145c5e5d-0043-46f7-af17-4426bea1a3eb","Type":"ContainerDied","Data":"c6d112cf934dfdef042538db235d8629571756a01c0e50051efe894723cf653c"} Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.097596 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.097633 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6d112cf934dfdef042538db235d8629571756a01c0e50051efe894723cf653c" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.101067 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-mbfq9" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.101676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-mbfq9" event={"ID":"a10a501a-a4d8-4baf-b4f0-f76193fc92f3","Type":"ContainerDied","Data":"2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5"} Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.101725 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac884c198c81141e23353a7db7269cb35c113632742ed7ab295920ee048fdb5" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.412764 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-wdrgs"] Jan 20 22:58:45 crc kubenswrapper[5030]: E0120 22:58:45.413252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145c5e5d-0043-46f7-af17-4426bea1a3eb" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413284 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="145c5e5d-0043-46f7-af17-4426bea1a3eb" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: E0120 22:58:45.413309 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399e28e7-a389-47c4-b8c2-89d8dbad34be" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413320 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="399e28e7-a389-47c4-b8c2-89d8dbad34be" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: E0120 22:58:45.413343 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10a501a-a4d8-4baf-b4f0-f76193fc92f3" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10a501a-a4d8-4baf-b4f0-f76193fc92f3" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: E0120 22:58:45.413375 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a712d79-3228-4e70-ae9d-8436a0dee203" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413387 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a712d79-3228-4e70-ae9d-8436a0dee203" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413677 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a712d79-3228-4e70-ae9d-8436a0dee203" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413715 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="399e28e7-a389-47c4-b8c2-89d8dbad34be" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413739 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="145c5e5d-0043-46f7-af17-4426bea1a3eb" containerName="mariadb-account-create-update" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.413766 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10a501a-a4d8-4baf-b4f0-f76193fc92f3" containerName="mariadb-database-create" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.414544 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.420910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-wdrgs"] Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.503033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztf65\" (UniqueName: \"kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.503079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.518721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-b741-account-create-update-w9xxk"] Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.519788 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.521796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.534322 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-b741-account-create-update-w9xxk"] Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.605368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztf65\" (UniqueName: \"kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.605467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.605567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8q6c\" (UniqueName: \"kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.605654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc 
kubenswrapper[5030]: I0120 22:58:45.606525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.623561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztf65\" (UniqueName: \"kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65\") pod \"glance-db-create-wdrgs\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.707154 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8q6c\" (UniqueName: \"kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.707228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.708000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.725075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8q6c\" (UniqueName: \"kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c\") pod \"glance-b741-account-create-update-w9xxk\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.733749 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:45 crc kubenswrapper[5030]: I0120 22:58:45.854731 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:46 crc kubenswrapper[5030]: I0120 22:58:46.112530 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:46 crc kubenswrapper[5030]: E0120 22:58:46.113332 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:58:46 crc kubenswrapper[5030]: E0120 22:58:46.113356 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:58:46 crc kubenswrapper[5030]: E0120 22:58:46.113394 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift podName:d8bcc41f-fad5-4773-be90-803444a622b0 nodeName:}" failed. No retries permitted until 2026-01-20 22:58:50.113378637 +0000 UTC m=+1402.433638925 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift") pod "swift-storage-0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0") : configmap "swift-ring-files" not found Jan 20 22:58:46 crc kubenswrapper[5030]: I0120 22:58:46.180779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-wdrgs"] Jan 20 22:58:46 crc kubenswrapper[5030]: W0120 22:58:46.184272 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf8ec7d_d1a5_45c1_8c1b_106c2358caff.slice/crio-5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f WatchSource:0}: Error finding container 5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f: Status 404 returned error can't find the container with id 5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f Jan 20 22:58:46 crc kubenswrapper[5030]: I0120 22:58:46.314442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-b741-account-create-update-w9xxk"] Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.122699 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5285dc6-c907-4a96-b97a-97950b15c428" containerID="f1f879bde92b8e7e837720c11d4960f9a23eb7c5819999adfe14cc248145d4fc" exitCode=0 Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.122802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" event={"ID":"b5285dc6-c907-4a96-b97a-97950b15c428","Type":"ContainerDied","Data":"f1f879bde92b8e7e837720c11d4960f9a23eb7c5819999adfe14cc248145d4fc"} Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.122834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" event={"ID":"b5285dc6-c907-4a96-b97a-97950b15c428","Type":"ContainerStarted","Data":"bae40595a793cdb5d3c5d3698b757186003950e86b9c2caef00b824529b87e30"} Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.125293 5030 generic.go:334] "Generic (PLEG): container finished" podID="4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" 
containerID="e246add32a8f2b92c56ef74c9ae087941267aacd2dac7ba62b4badf399e3ef9a" exitCode=0 Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.125345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-wdrgs" event={"ID":"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff","Type":"ContainerDied","Data":"e246add32a8f2b92c56ef74c9ae087941267aacd2dac7ba62b4badf399e3ef9a"} Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.125373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-wdrgs" event={"ID":"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff","Type":"ContainerStarted","Data":"5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f"} Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.305410 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t8vvb"] Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.306535 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.311512 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.328475 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t8vvb"] Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.433165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.433234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhpf\" (UniqueName: \"kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.534834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhpf\" (UniqueName: \"kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.534973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.535678 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " 
pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.558651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhpf\" (UniqueName: \"kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf\") pod \"root-account-create-update-t8vvb\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:47 crc kubenswrapper[5030]: I0120 22:58:47.624994 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:48 crc kubenswrapper[5030]: W0120 22:58:48.140897 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc010c4a0_9ac8_4016_ab6d_86cbce546e3f.slice/crio-5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149 WatchSource:0}: Error finding container 5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149: Status 404 returned error can't find the container with id 5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149 Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.145685 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t8vvb"] Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.544602 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.564752 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.653463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts\") pod \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.653638 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts\") pod \"b5285dc6-c907-4a96-b97a-97950b15c428\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.653740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8q6c\" (UniqueName: \"kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c\") pod \"b5285dc6-c907-4a96-b97a-97950b15c428\" (UID: \"b5285dc6-c907-4a96-b97a-97950b15c428\") " Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.653811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztf65\" (UniqueName: \"kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65\") pod \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\" (UID: \"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff\") " Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.654343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"b5285dc6-c907-4a96-b97a-97950b15c428" (UID: "b5285dc6-c907-4a96-b97a-97950b15c428"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.654800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" (UID: "4cf8ec7d-d1a5-45c1-8c1b-106c2358caff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.659089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65" (OuterVolumeSpecName: "kube-api-access-ztf65") pod "4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" (UID: "4cf8ec7d-d1a5-45c1-8c1b-106c2358caff"). InnerVolumeSpecName "kube-api-access-ztf65". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.659189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c" (OuterVolumeSpecName: "kube-api-access-r8q6c") pod "b5285dc6-c907-4a96-b97a-97950b15c428" (UID: "b5285dc6-c907-4a96-b97a-97950b15c428"). InnerVolumeSpecName "kube-api-access-r8q6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.755238 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5285dc6-c907-4a96-b97a-97950b15c428-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.755557 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8q6c\" (UniqueName: \"kubernetes.io/projected/b5285dc6-c907-4a96-b97a-97950b15c428-kube-api-access-r8q6c\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.755569 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztf65\" (UniqueName: \"kubernetes.io/projected/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-kube-api-access-ztf65\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:48 crc kubenswrapper[5030]: I0120 22:58:48.755578 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.149435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" event={"ID":"b5285dc6-c907-4a96-b97a-97950b15c428","Type":"ContainerDied","Data":"bae40595a793cdb5d3c5d3698b757186003950e86b9c2caef00b824529b87e30"} Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.149486 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-b741-account-create-update-w9xxk" Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.149712 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae40595a793cdb5d3c5d3698b757186003950e86b9c2caef00b824529b87e30" Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.151884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-wdrgs" event={"ID":"4cf8ec7d-d1a5-45c1-8c1b-106c2358caff","Type":"ContainerDied","Data":"5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f"} Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.151906 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-wdrgs" Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.152164 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5622327f65e9cef1e9ae1fc0faf9cef608f82cf2bdd7d584fea85f522b80245f" Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.153769 5030 generic.go:334] "Generic (PLEG): container finished" podID="c010c4a0-9ac8-4016-ab6d-86cbce546e3f" containerID="7228ae851e48f872fc68053da066eb092303bbfc14163b39931113d15daeae2a" exitCode=0 Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.153827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" event={"ID":"c010c4a0-9ac8-4016-ab6d-86cbce546e3f","Type":"ContainerDied","Data":"7228ae851e48f872fc68053da066eb092303bbfc14163b39931113d15daeae2a"} Jan 20 22:58:49 crc kubenswrapper[5030]: I0120 22:58:49.153868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" event={"ID":"c010c4a0-9ac8-4016-ab6d-86cbce546e3f","Type":"ContainerStarted","Data":"5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149"} Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.189148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.189408 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.191237 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.191415 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift podName:d8bcc41f-fad5-4773-be90-803444a622b0 nodeName:}" failed. No retries permitted until 2026-01-20 22:58:58.191386949 +0000 UTC m=+1410.511647277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift") pod "swift-storage-0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0") : configmap "swift-ring-files" not found Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.681608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.743360 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-qgczp"] Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.743747 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c010c4a0-9ac8-4016-ab6d-86cbce546e3f" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.743770 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c010c4a0-9ac8-4016-ab6d-86cbce546e3f" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.743792 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5285dc6-c907-4a96-b97a-97950b15c428" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.743801 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5285dc6-c907-4a96-b97a-97950b15c428" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: E0120 22:58:50.743812 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" containerName="mariadb-database-create" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.743821 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" containerName="mariadb-database-create" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.744038 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" containerName="mariadb-database-create" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.744055 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5285dc6-c907-4a96-b97a-97950b15c428" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.744073 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c010c4a0-9ac8-4016-ab6d-86cbce546e3f" containerName="mariadb-account-create-update" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.744768 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.748030 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.748103 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-ghprt" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.755908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-qgczp"] Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.803982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmhpf\" (UniqueName: \"kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf\") pod \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.804053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts\") pod \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\" (UID: \"c010c4a0-9ac8-4016-ab6d-86cbce546e3f\") " Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.804309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.804352 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrcl\" (UniqueName: \"kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.804370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.804496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.805131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c010c4a0-9ac8-4016-ab6d-86cbce546e3f" (UID: "c010c4a0-9ac8-4016-ab6d-86cbce546e3f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.809245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf" (OuterVolumeSpecName: "kube-api-access-jmhpf") pod "c010c4a0-9ac8-4016-ab6d-86cbce546e3f" (UID: "c010c4a0-9ac8-4016-ab6d-86cbce546e3f"). InnerVolumeSpecName "kube-api-access-jmhpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.905863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.906256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.906326 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrcl\" (UniqueName: \"kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.906360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.906607 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmhpf\" (UniqueName: \"kubernetes.io/projected/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-kube-api-access-jmhpf\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.906678 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c010c4a0-9ac8-4016-ab6d-86cbce546e3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.910235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.910351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.910443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:50 crc kubenswrapper[5030]: I0120 22:58:50.930171 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrcl\" (UniqueName: \"kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl\") pod \"glance-db-sync-qgczp\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.067796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.207323 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.207327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-t8vvb" event={"ID":"c010c4a0-9ac8-4016-ab6d-86cbce546e3f","Type":"ContainerDied","Data":"5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149"} Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.207374 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccbc00108e7b63dd287393de3570480984f8adec41cac1728f540f112379149" Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.230748 5030 generic.go:334] "Generic (PLEG): container finished" podID="013d4d42-09ab-4f76-b77c-09a52e7c2635" containerID="f2c5b9db286e929674b9d0249c5dd2cb7c764e76513905ab03fd1fffafa3a335" exitCode=0 Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.230796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" event={"ID":"013d4d42-09ab-4f76-b77c-09a52e7c2635","Type":"ContainerDied","Data":"f2c5b9db286e929674b9d0249c5dd2cb7c764e76513905ab03fd1fffafa3a335"} Jan 20 22:58:51 crc kubenswrapper[5030]: I0120 22:58:51.629336 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-qgczp"] Jan 20 22:58:51 crc kubenswrapper[5030]: W0120 22:58:51.634121 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8292fbbe_24f2_4dde_82f5_338ac93e33f1.slice/crio-c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426 WatchSource:0}: Error finding container c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426: Status 404 returned error can't find the container with id c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426 Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.245803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-qgczp" event={"ID":"8292fbbe-24f2-4dde-82f5-338ac93e33f1","Type":"ContainerStarted","Data":"c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426"} Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.670515 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.745928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.745984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.746013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.746059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbgch\" (UniqueName: \"kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.746111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.746129 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.746171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle\") pod \"013d4d42-09ab-4f76-b77c-09a52e7c2635\" (UID: \"013d4d42-09ab-4f76-b77c-09a52e7c2635\") " Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.747027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.747944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.755924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.756086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch" (OuterVolumeSpecName: "kube-api-access-dbgch") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "kube-api-access-dbgch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.772017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts" (OuterVolumeSpecName: "scripts") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.773915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.790758 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "013d4d42-09ab-4f76-b77c-09a52e7c2635" (UID: "013d4d42-09ab-4f76-b77c-09a52e7c2635"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847744 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbgch\" (UniqueName: \"kubernetes.io/projected/013d4d42-09ab-4f76-b77c-09a52e7c2635-kube-api-access-dbgch\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847780 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847789 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/013d4d42-09ab-4f76-b77c-09a52e7c2635-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847798 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847806 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847814 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/013d4d42-09ab-4f76-b77c-09a52e7c2635-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:52 crc kubenswrapper[5030]: I0120 22:58:52.847823 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/013d4d42-09ab-4f76-b77c-09a52e7c2635-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.263163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" event={"ID":"013d4d42-09ab-4f76-b77c-09a52e7c2635","Type":"ContainerDied","Data":"99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8"} Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.263475 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99febb9ac1b6d949370411a86118e2c1c78e077859668a391f6dbd576a95ecf8" Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.263202 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-b5r8x" Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.265326 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-qgczp" event={"ID":"8292fbbe-24f2-4dde-82f5-338ac93e33f1","Type":"ContainerStarted","Data":"0d2d09ed40317efbf1687169ffd9762f52fd37ce6338e32a2cef774fa2697e92"} Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.303478 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-qgczp" podStartSLOduration=3.30344824 podStartE2EDuration="3.30344824s" podCreationTimestamp="2026-01-20 22:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:58:53.288483755 +0000 UTC m=+1405.608744083" watchObservedRunningTime="2026-01-20 22:58:53.30344824 +0000 UTC m=+1405.623708558" Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.762330 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t8vvb"] Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.771994 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-t8vvb"] Jan 20 22:58:53 crc kubenswrapper[5030]: I0120 22:58:53.979959 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c010c4a0-9ac8-4016-ab6d-86cbce546e3f" path="/var/lib/kubelet/pods/c010c4a0-9ac8-4016-ab6d-86cbce546e3f/volumes" Jan 20 22:58:55 crc kubenswrapper[5030]: I0120 22:58:55.285250 5030 generic.go:334] "Generic (PLEG): container finished" podID="8292fbbe-24f2-4dde-82f5-338ac93e33f1" containerID="0d2d09ed40317efbf1687169ffd9762f52fd37ce6338e32a2cef774fa2697e92" exitCode=0 Jan 20 22:58:55 crc kubenswrapper[5030]: I0120 22:58:55.285310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-qgczp" event={"ID":"8292fbbe-24f2-4dde-82f5-338ac93e33f1","Type":"ContainerDied","Data":"0d2d09ed40317efbf1687169ffd9762f52fd37ce6338e32a2cef774fa2697e92"} Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.719022 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.818331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data\") pod \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.818503 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle\") pod \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.818679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjrcl\" (UniqueName: \"kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl\") pod \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.818793 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data\") pod \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\" (UID: \"8292fbbe-24f2-4dde-82f5-338ac93e33f1\") " Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.823659 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8292fbbe-24f2-4dde-82f5-338ac93e33f1" (UID: "8292fbbe-24f2-4dde-82f5-338ac93e33f1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.823693 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl" (OuterVolumeSpecName: "kube-api-access-wjrcl") pod "8292fbbe-24f2-4dde-82f5-338ac93e33f1" (UID: "8292fbbe-24f2-4dde-82f5-338ac93e33f1"). InnerVolumeSpecName "kube-api-access-wjrcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.844801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8292fbbe-24f2-4dde-82f5-338ac93e33f1" (UID: "8292fbbe-24f2-4dde-82f5-338ac93e33f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.873183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data" (OuterVolumeSpecName: "config-data") pod "8292fbbe-24f2-4dde-82f5-338ac93e33f1" (UID: "8292fbbe-24f2-4dde-82f5-338ac93e33f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.922866 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.922907 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjrcl\" (UniqueName: \"kubernetes.io/projected/8292fbbe-24f2-4dde-82f5-338ac93e33f1-kube-api-access-wjrcl\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.922922 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:56 crc kubenswrapper[5030]: I0120 22:58:56.922934 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8292fbbe-24f2-4dde-82f5-338ac93e33f1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:58:57 crc kubenswrapper[5030]: I0120 22:58:57.304008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-qgczp" event={"ID":"8292fbbe-24f2-4dde-82f5-338ac93e33f1","Type":"ContainerDied","Data":"c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426"} Jan 20 22:58:57 crc kubenswrapper[5030]: I0120 22:58:57.304067 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c063746c42a565c9ba7c3e3d6d0df3e1a468f8d05f2f270dade3434ae6f4d426" Jan 20 22:58:57 crc kubenswrapper[5030]: I0120 22:58:57.304136 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-qgczp" Jan 20 22:58:57 crc kubenswrapper[5030]: I0120 22:58:57.711007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.243720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.253709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"swift-storage-0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.292196 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.755090 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bsjb8"] Jan 20 22:58:58 crc kubenswrapper[5030]: E0120 22:58:58.755728 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013d4d42-09ab-4f76-b77c-09a52e7c2635" containerName="swift-ring-rebalance" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.755747 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="013d4d42-09ab-4f76-b77c-09a52e7c2635" containerName="swift-ring-rebalance" Jan 20 22:58:58 crc kubenswrapper[5030]: E0120 22:58:58.755775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8292fbbe-24f2-4dde-82f5-338ac93e33f1" containerName="glance-db-sync" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.755782 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8292fbbe-24f2-4dde-82f5-338ac93e33f1" containerName="glance-db-sync" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.755930 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="013d4d42-09ab-4f76-b77c-09a52e7c2635" containerName="swift-ring-rebalance" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.755945 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8292fbbe-24f2-4dde-82f5-338ac93e33f1" containerName="glance-db-sync" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.756458 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.763846 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.792877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bsjb8"] Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.811473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.856590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4s5\" (UniqueName: \"kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.856768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.958455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4s5\" (UniqueName: \"kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 
22:58:58.959019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.960050 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:58 crc kubenswrapper[5030]: I0120 22:58:58.978397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4s5\" (UniqueName: \"kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5\") pod \"root-account-create-update-bsjb8\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.085218 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.339102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216"} Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.339512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053"} Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.339530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994"} Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.339562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"6948576a4578a82d92748b68434f12aee9a957e1f209430ca19e9192db60a78c"} Jan 20 22:58:59 crc kubenswrapper[5030]: I0120 22:58:59.518570 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bsjb8"] Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.348220 5030 generic.go:334] "Generic (PLEG): container finished" podID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerID="56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2" exitCode=0 Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.348289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerDied","Data":"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.355113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.355157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.355170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.355183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.355194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.357273 5030 generic.go:334] "Generic (PLEG): container finished" podID="28ea1315-f9a0-4297-8c79-4432e0969375" containerID="c4dcf5e1ee539c112bd3f6d2e928eb1b4f90aa50eaed6f7be68fbe2d8e6e37e8" exitCode=0 Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.357298 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" event={"ID":"28ea1315-f9a0-4297-8c79-4432e0969375","Type":"ContainerDied","Data":"c4dcf5e1ee539c112bd3f6d2e928eb1b4f90aa50eaed6f7be68fbe2d8e6e37e8"} Jan 20 22:59:00 crc kubenswrapper[5030]: I0120 22:59:00.357315 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" event={"ID":"28ea1315-f9a0-4297-8c79-4432e0969375","Type":"ContainerStarted","Data":"40744805d7c9255cb035881987cf29edad32aba9dc2d484625b06e5bf403e92e"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.368857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.369338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.369349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.369358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.369367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.369379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.370961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerStarted","Data":"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.371171 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.372513 5030 generic.go:334] "Generic (PLEG): container finished" podID="83f11c15-8856-45fd-9703-348310781d5a" containerID="023df1c950388e7dcb139b685ad0bdfed5f5d067074b9530aa12528857b11e81" exitCode=0 Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.372598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerDied","Data":"023df1c950388e7dcb139b685ad0bdfed5f5d067074b9530aa12528857b11e81"} Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.400406 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.400388909 podStartE2EDuration="37.400388909s" podCreationTimestamp="2026-01-20 22:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:01.389893113 +0000 UTC m=+1413.710153401" watchObservedRunningTime="2026-01-20 22:59:01.400388909 +0000 UTC m=+1413.720649197" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.700507 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.805215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts\") pod \"28ea1315-f9a0-4297-8c79-4432e0969375\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.805562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq4s5\" (UniqueName: \"kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5\") pod \"28ea1315-f9a0-4297-8c79-4432e0969375\" (UID: \"28ea1315-f9a0-4297-8c79-4432e0969375\") " Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.805808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28ea1315-f9a0-4297-8c79-4432e0969375" (UID: "28ea1315-f9a0-4297-8c79-4432e0969375"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.805972 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ea1315-f9a0-4297-8c79-4432e0969375-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.808901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5" (OuterVolumeSpecName: "kube-api-access-tq4s5") pod "28ea1315-f9a0-4297-8c79-4432e0969375" (UID: "28ea1315-f9a0-4297-8c79-4432e0969375"). InnerVolumeSpecName "kube-api-access-tq4s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:01 crc kubenswrapper[5030]: I0120 22:59:01.906888 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq4s5\" (UniqueName: \"kubernetes.io/projected/28ea1315-f9a0-4297-8c79-4432e0969375-kube-api-access-tq4s5\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.382347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" event={"ID":"28ea1315-f9a0-4297-8c79-4432e0969375","Type":"ContainerDied","Data":"40744805d7c9255cb035881987cf29edad32aba9dc2d484625b06e5bf403e92e"} Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.382391 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40744805d7c9255cb035881987cf29edad32aba9dc2d484625b06e5bf403e92e" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.382364 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bsjb8" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.386638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerStarted","Data":"dcc99476597aaed55627fb273471e78f5cfba90216403d60549d509f104df53d"} Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.386860 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.394239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerStarted","Data":"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f"} Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.412210 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.412192749 podStartE2EDuration="37.412192749s" podCreationTimestamp="2026-01-20 22:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:02.411318547 +0000 UTC m=+1414.731578835" watchObservedRunningTime="2026-01-20 22:59:02.412192749 +0000 UTC m=+1414.732453037" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.449638 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=20.449587923 podStartE2EDuration="20.449587923s" podCreationTimestamp="2026-01-20 22:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:02.446876307 +0000 UTC m=+1414.767136605" watchObservedRunningTime="2026-01-20 22:59:02.449587923 +0000 UTC m=+1414.769848211" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.600225 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 22:59:02 crc kubenswrapper[5030]: E0120 22:59:02.600683 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ea1315-f9a0-4297-8c79-4432e0969375" containerName="mariadb-account-create-update" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.600700 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ea1315-f9a0-4297-8c79-4432e0969375" containerName="mariadb-account-create-update" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.600875 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ea1315-f9a0-4297-8c79-4432e0969375" containerName="mariadb-account-create-update" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.601585 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.605395 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.634179 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.720505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qtj7\" (UniqueName: \"kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.720869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.720894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.720945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.822728 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qtj7\" (UniqueName: \"kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.822815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.822848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.822930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.823760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.823959 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.824376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.839643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qtj7\" (UniqueName: \"kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7\") pod \"dnsmasq-dnsmasq-6ff445895c-69rsf\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:02 crc kubenswrapper[5030]: I0120 22:59:02.922695 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:03 crc kubenswrapper[5030]: I0120 22:59:03.328649 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 22:59:03 crc kubenswrapper[5030]: W0120 22:59:03.334776 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c46ecc6_0196_4659_b06f_3603396bc91a.slice/crio-9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23 WatchSource:0}: Error finding container 9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23: Status 404 returned error can't find the container with id 9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23 Jan 20 22:59:03 crc kubenswrapper[5030]: I0120 22:59:03.403820 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" event={"ID":"6c46ecc6-0196-4659-b06f-3603396bc91a","Type":"ContainerStarted","Data":"9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23"} Jan 20 22:59:04 crc kubenswrapper[5030]: I0120 22:59:04.412888 5030 generic.go:334] "Generic (PLEG): container finished" podID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerID="a7766a0903428d42d7f819e6196e488f4f97723340372fdc9bc2a319a03b945f" exitCode=0 Jan 20 22:59:04 crc kubenswrapper[5030]: I0120 22:59:04.412926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" event={"ID":"6c46ecc6-0196-4659-b06f-3603396bc91a","Type":"ContainerDied","Data":"a7766a0903428d42d7f819e6196e488f4f97723340372fdc9bc2a319a03b945f"} Jan 20 22:59:05 crc kubenswrapper[5030]: I0120 22:59:05.427916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" event={"ID":"6c46ecc6-0196-4659-b06f-3603396bc91a","Type":"ContainerStarted","Data":"272f3362cfb0f6639a3278329b5b414c51b4001addfaf831fd46b21f2f8238ee"} Jan 20 22:59:05 crc kubenswrapper[5030]: I0120 22:59:05.428437 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:05 crc kubenswrapper[5030]: I0120 22:59:05.448999 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" podStartSLOduration=3.448984172 podStartE2EDuration="3.448984172s" podCreationTimestamp="2026-01-20 22:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:05.443762254 +0000 UTC m=+1417.764022582" watchObservedRunningTime="2026-01-20 22:59:05.448984172 +0000 UTC m=+1417.769244460" Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.157571 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.158279 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.158347 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.159323 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.159418 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab" gracePeriod=600 Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.477737 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab" exitCode=0 Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.477801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab"} Jan 20 22:59:10 crc kubenswrapper[5030]: I0120 22:59:10.478111 5030 scope.go:117] "RemoveContainer" containerID="f25c78cd3e3e9922643e9336b23ff6b4fab097716027081d191bbcb08c23aed3" Jan 20 22:59:11 crc kubenswrapper[5030]: I0120 22:59:11.492400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460"} Jan 20 22:59:12 crc kubenswrapper[5030]: I0120 22:59:12.924896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.006288 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.006783 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="dnsmasq-dns" containerID="cri-o://cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703" gracePeriod=10 Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.440676 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.502458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb4rl\" (UniqueName: \"kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl\") pod \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.502875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config\") pod \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.502979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc\") pod \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\" (UID: \"b4e0a7b0-4723-42a8-aa99-c5f155dff065\") " Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.511836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl" (OuterVolumeSpecName: "kube-api-access-kb4rl") pod "b4e0a7b0-4723-42a8-aa99-c5f155dff065" (UID: "b4e0a7b0-4723-42a8-aa99-c5f155dff065"). InnerVolumeSpecName "kube-api-access-kb4rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.515070 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerID="cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703" exitCode=0 Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.515128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" event={"ID":"b4e0a7b0-4723-42a8-aa99-c5f155dff065","Type":"ContainerDied","Data":"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703"} Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.515168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" event={"ID":"b4e0a7b0-4723-42a8-aa99-c5f155dff065","Type":"ContainerDied","Data":"5c5ba45262d5823fc921b92884c0f77002d1416c375024811477480f10417c7d"} Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.515193 5030 scope.go:117] "RemoveContainer" containerID="cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.515393 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.572763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b4e0a7b0-4723-42a8-aa99-c5f155dff065" (UID: "b4e0a7b0-4723-42a8-aa99-c5f155dff065"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.583558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config" (OuterVolumeSpecName: "config") pod "b4e0a7b0-4723-42a8-aa99-c5f155dff065" (UID: "b4e0a7b0-4723-42a8-aa99-c5f155dff065"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.595502 5030 scope.go:117] "RemoveContainer" containerID="73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.604293 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.604318 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb4rl\" (UniqueName: \"kubernetes.io/projected/b4e0a7b0-4723-42a8-aa99-c5f155dff065-kube-api-access-kb4rl\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.604329 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4e0a7b0-4723-42a8-aa99-c5f155dff065-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.616128 5030 scope.go:117] "RemoveContainer" containerID="cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703" Jan 20 22:59:13 crc kubenswrapper[5030]: E0120 22:59:13.616580 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703\": container with ID starting with cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703 not found: ID does not exist" containerID="cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.616669 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703"} err="failed to get container status \"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703\": rpc error: code = NotFound desc = could not find container \"cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703\": container with ID starting with cd3823ccaebe114b42c66e749c5091c7a2afeeff5db990cd2453a24eb789b703 not found: ID does not exist" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.616700 5030 scope.go:117] "RemoveContainer" containerID="73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f" Jan 20 22:59:13 crc kubenswrapper[5030]: E0120 22:59:13.616982 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f\": container with ID starting with 73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f not found: ID does not exist" containerID="73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.617010 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f"} err="failed to get container status \"73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f\": rpc error: code = NotFound desc = could not find container \"73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f\": container with ID starting with 73e6371cccef5385cdf656fa200e1a4aa393d71c27e38df1a06c78a99d91d53f not found: ID does not exist" Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.846946 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.855967 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8fgcv"] Jan 20 22:59:13 crc kubenswrapper[5030]: I0120 22:59:13.976699 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" path="/var/lib/kubelet/pods/b4e0a7b0-4723-42a8-aa99-c5f155dff065/volumes" Jan 20 22:59:15 crc kubenswrapper[5030]: I0120 22:59:15.858902 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.150964 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.740689 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-qqptl"] Jan 20 22:59:17 crc kubenswrapper[5030]: E0120 22:59:17.740990 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="dnsmasq-dns" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.741005 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="dnsmasq-dns" Jan 20 22:59:17 crc kubenswrapper[5030]: E0120 22:59:17.741039 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="init" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.741045 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="init" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.741183 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e0a7b0-4723-42a8-aa99-c5f155dff065" containerName="dnsmasq-dns" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.741649 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.750002 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-qqptl"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.841673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t86dg"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.842548 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.853760 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t86dg"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.866570 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.868667 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.872604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66krd\" (UniqueName: \"kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.872669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.881501 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.883234 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.937348 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.938423 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.940192 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.943824 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-s2nnr"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.944864 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.956173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.970869 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-s2nnr"] Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkq4s\" (UniqueName: \"kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66krd\" (UniqueName: \"kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8z6k\" (UniqueName: \"kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.973648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:17 crc kubenswrapper[5030]: I0120 22:59:17.974240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.024227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66krd\" (UniqueName: 
\"kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd\") pod \"barbican-db-create-qqptl\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.067779 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkq4s\" (UniqueName: \"kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4gw\" (UniqueName: \"kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074659 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8z6k\" (UniqueName: \"kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.074732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md46j\" 
(UniqueName: \"kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.075459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.075696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.096818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkq4s\" (UniqueName: \"kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s\") pod \"cinder-4cc4-account-create-update-lmfr2\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.099939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-c82wm"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.100864 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.103320 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.103794 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.103987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5nbxn" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.104148 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.109013 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-c82wm"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.112652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8z6k\" (UniqueName: \"kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k\") pod \"cinder-db-create-t86dg\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.154558 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.157375 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.158254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.161061 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.175539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.175597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.175632 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4gw\" (UniqueName: \"kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.175680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md46j\" (UniqueName: \"kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.176499 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.178183 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.178379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.185861 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.193484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md46j\" (UniqueName: \"kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j\") pod \"barbican-73e2-account-create-update-whnpl\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.200335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4gw\" (UniqueName: \"kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw\") pod \"neutron-db-create-s2nnr\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.252900 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.280948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttd2b\" (UniqueName: \"kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.281424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.281532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.281571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpl9d\" (UniqueName: \"kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.281636 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.294467 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.383502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.383565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.383585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpl9d\" (UniqueName: \"kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.383606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.383664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttd2b\" (UniqueName: \"kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.386859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.390159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.394177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.407204 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttd2b\" (UniqueName: 
\"kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b\") pod \"neutron-e899-account-create-update-bwc2v\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.414953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpl9d\" (UniqueName: \"kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d\") pod \"keystone-db-sync-c82wm\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.491797 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.541866 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-qqptl"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.575824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-qqptl" event={"ID":"910e33af-9b19-4387-babd-c19cce0cfd29","Type":"ContainerStarted","Data":"e88e36fc447994d62ed970c7af897adb00db66ef0869cd8160049218e5bea23c"} Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.601560 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.682715 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.756958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t86dg"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.822466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-s2nnr"] Jan 20 22:59:18 crc kubenswrapper[5030]: W0120 22:59:18.848395 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda601abae_f93a_48ec_8597_00ef38c3e1e6.slice/crio-d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8 WatchSource:0}: Error finding container d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8: Status 404 returned error can't find the container with id d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8 Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.896148 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl"] Jan 20 22:59:18 crc kubenswrapper[5030]: I0120 22:59:18.967374 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-c82wm"] Jan 20 22:59:18 crc kubenswrapper[5030]: W0120 22:59:18.996575 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbd22d0_db4b_485c_b2f6_20bfa502543f.slice/crio-cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb WatchSource:0}: Error finding container cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb: Status 404 returned error can't find the container with id cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb Jan 20 22:59:19 crc kubenswrapper[5030]: 
I0120 22:59:19.157670 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v"] Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.586964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" event={"ID":"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a","Type":"ContainerStarted","Data":"9229318f7cb77ae4d53c9c07b2207d74bfdc19cd971563178ba54067d58e96ad"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.587363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" event={"ID":"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a","Type":"ContainerStarted","Data":"6784a6d94b12b64167c227ac68ee3c276e83f440bd8c5baf0dcb8b22ee755a17"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.590448 5030 generic.go:334] "Generic (PLEG): container finished" podID="a601abae-f93a-48ec-8597-00ef38c3e1e6" containerID="8aaf087519b345b06fa1b7e7bcfa88a8075014f84a204827b107d99bae073ccb" exitCode=0 Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.590516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" event={"ID":"a601abae-f93a-48ec-8597-00ef38c3e1e6","Type":"ContainerDied","Data":"8aaf087519b345b06fa1b7e7bcfa88a8075014f84a204827b107d99bae073ccb"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.590543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" event={"ID":"a601abae-f93a-48ec-8597-00ef38c3e1e6","Type":"ContainerStarted","Data":"d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.593135 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" containerID="ed4735ac8e65d646343fe55203e0d23a01fa79f4cd61e5be232941c6f5de79bb" exitCode=0 Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.593184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t86dg" event={"ID":"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4","Type":"ContainerDied","Data":"ed4735ac8e65d646343fe55203e0d23a01fa79f4cd61e5be232941c6f5de79bb"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.593236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t86dg" event={"ID":"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4","Type":"ContainerStarted","Data":"c2d351597eaee596e2a1a45aa409260109083cdcce2f45736bbc3c6a0df5468c"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.596470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" event={"ID":"d1f367a2-2685-4a40-b29d-2a69508228f5","Type":"ContainerStarted","Data":"1fdbf3f0525c086956ed40b1d3edf22385f43cf65cd8c8f8464f203907256fa6"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.596523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" event={"ID":"d1f367a2-2685-4a40-b29d-2a69508228f5","Type":"ContainerStarted","Data":"851db81d02ca1e21505be83c54aae7b9579637673db519eb3317d841e7890c49"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.598538 5030 generic.go:334] "Generic (PLEG): container finished" podID="a43751e7-1432-4e99-8e62-50d14a3c9470" containerID="9c3071b7e01af1fad2cd7e74c1d4926f81b1317666d5ffe14fa1a2e5d0253fcb" exitCode=0 
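The PLEG entries above ("SyncLoop (PLEG): event for pod" with Type ContainerStarted/ContainerDied, plus the matching "Generic (PLEG): container finished ... exitCode=0" lines) trace the whole lifecycle of the short-lived job containers (db-create, account-create-update, db-sync): sandbox created, container started, container exited with code 0. Below is a minimal parsing sketch, assuming klog-formatted kubelet lines like those above are piped on stdin; the regular expression, script name, and output format are illustrative assumptions, not part of kubelet or of any existing tool.

#!/usr/bin/env python3
# Illustrative sketch only: pair ContainerStarted/ContainerDied PLEG events
# from klog-formatted kubelet lines to estimate how long each container ran.
# The regex and report format are assumptions, not kubelet code.
import re
import sys
from datetime import datetime

PLEG = re.compile(
    r'I(?P<ts>\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+kubelet\.go:\d+\]\s+'
    r'"SyncLoop \(PLEG\): event for pod"\s+pod="(?P<pod>[^"]+)"\s+'
    r'event=\{.*?"Type":"(?P<type>ContainerStarted|ContainerDied)",'
    r'"Data":"(?P<cid>[0-9a-f]{64})"\}'
)

def container_runtimes(lines, year="2026"):
    """Return {(pod, short container id): seconds between Started and Died}."""
    started, runtimes = {}, {}
    for line in lines:
        # A physical line may hold several journal entries, so scan all matches.
        for m in PLEG.finditer(line):
            # klog timestamps carry no year; supply it explicitly.
            ts = datetime.strptime(year + m.group("ts"), "%Y%m%d %H:%M:%S.%f")
            cid = m.group("cid")
            if m.group("type") == "ContainerStarted":
                started[cid] = ts
            elif cid in started:
                delta = (ts - started.pop(cid)).total_seconds()
                runtimes[(m.group("pod"), cid[:12])] = delta
    return runtimes

if __name__ == "__main__":
    for (pod, cid), secs in sorted(container_runtimes(sys.stdin).items(),
                                   key=lambda kv: kv[1]):
        print(f"{secs:7.3f}s  {cid}  {pod}")

Under the same assumptions the sketch can be fed straight from the journal, for example "journalctl -u kubelet | python3 pleg_runtimes.py" (assuming the kubelet runs as a systemd unit named kubelet); containers whose Started event falls outside the captured window are simply skipped.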
Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.598582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" event={"ID":"a43751e7-1432-4e99-8e62-50d14a3c9470","Type":"ContainerDied","Data":"9c3071b7e01af1fad2cd7e74c1d4926f81b1317666d5ffe14fa1a2e5d0253fcb"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.598599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" event={"ID":"a43751e7-1432-4e99-8e62-50d14a3c9470","Type":"ContainerStarted","Data":"2f5e173c12032ed901166a55b5b0d683f2bc317fdb4f931de38251611654a231"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.600350 5030 generic.go:334] "Generic (PLEG): container finished" podID="910e33af-9b19-4387-babd-c19cce0cfd29" containerID="f65bf0c0a190a4be243558d97f9121553b84f600642d4efa0249963226d5019d" exitCode=0 Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.600419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-qqptl" event={"ID":"910e33af-9b19-4387-babd-c19cce0cfd29","Type":"ContainerDied","Data":"f65bf0c0a190a4be243558d97f9121553b84f600642d4efa0249963226d5019d"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.603957 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" podStartSLOduration=2.603940302 podStartE2EDuration="2.603940302s" podCreationTimestamp="2026-01-20 22:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:19.601047531 +0000 UTC m=+1431.921307819" watchObservedRunningTime="2026-01-20 22:59:19.603940302 +0000 UTC m=+1431.924200590" Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.609396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" event={"ID":"0dbd22d0-db4b-485c-b2f6-20bfa502543f","Type":"ContainerStarted","Data":"930e742a8c3b6bfba53d77a91e0a752230ac32c65e3b1f9f159ccd247b30505f"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.609434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" event={"ID":"0dbd22d0-db4b-485c-b2f6-20bfa502543f","Type":"ContainerStarted","Data":"cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb"} Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.666857 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" podStartSLOduration=1.666839901 podStartE2EDuration="1.666839901s" podCreationTimestamp="2026-01-20 22:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:19.660799253 +0000 UTC m=+1431.981059541" watchObservedRunningTime="2026-01-20 22:59:19.666839901 +0000 UTC m=+1431.987100189" Jan 20 22:59:19 crc kubenswrapper[5030]: I0120 22:59:19.689216 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" podStartSLOduration=1.689202115 podStartE2EDuration="1.689202115s" podCreationTimestamp="2026-01-20 22:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:19.683479804 +0000 UTC 
m=+1432.003740092" watchObservedRunningTime="2026-01-20 22:59:19.689202115 +0000 UTC m=+1432.009462403" Jan 20 22:59:20 crc kubenswrapper[5030]: I0120 22:59:20.621078 5030 generic.go:334] "Generic (PLEG): container finished" podID="faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" containerID="9229318f7cb77ae4d53c9c07b2207d74bfdc19cd971563178ba54067d58e96ad" exitCode=0 Jan 20 22:59:20 crc kubenswrapper[5030]: I0120 22:59:20.621169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" event={"ID":"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a","Type":"ContainerDied","Data":"9229318f7cb77ae4d53c9c07b2207d74bfdc19cd971563178ba54067d58e96ad"} Jan 20 22:59:20 crc kubenswrapper[5030]: I0120 22:59:20.624029 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1f367a2-2685-4a40-b29d-2a69508228f5" containerID="1fdbf3f0525c086956ed40b1d3edf22385f43cf65cd8c8f8464f203907256fa6" exitCode=0 Jan 20 22:59:20 crc kubenswrapper[5030]: I0120 22:59:20.624067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" event={"ID":"d1f367a2-2685-4a40-b29d-2a69508228f5","Type":"ContainerDied","Data":"1fdbf3f0525c086956ed40b1d3edf22385f43cf65cd8c8f8464f203907256fa6"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.021064 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.120724 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.124839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.130172 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.144722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts\") pod \"a43751e7-1432-4e99-8e62-50d14a3c9470\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.144869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkq4s\" (UniqueName: \"kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s\") pod \"a43751e7-1432-4e99-8e62-50d14a3c9470\" (UID: \"a43751e7-1432-4e99-8e62-50d14a3c9470\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.148005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a43751e7-1432-4e99-8e62-50d14a3c9470" (UID: "a43751e7-1432-4e99-8e62-50d14a3c9470"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.152567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s" (OuterVolumeSpecName: "kube-api-access-rkq4s") pod "a43751e7-1432-4e99-8e62-50d14a3c9470" (UID: "a43751e7-1432-4e99-8e62-50d14a3c9470"). InnerVolumeSpecName "kube-api-access-rkq4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246170 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g4gw\" (UniqueName: \"kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw\") pod \"a601abae-f93a-48ec-8597-00ef38c3e1e6\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts\") pod \"910e33af-9b19-4387-babd-c19cce0cfd29\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8z6k\" (UniqueName: \"kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k\") pod \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66krd\" (UniqueName: \"kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd\") pod \"910e33af-9b19-4387-babd-c19cce0cfd29\" (UID: \"910e33af-9b19-4387-babd-c19cce0cfd29\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts\") pod \"a601abae-f93a-48ec-8597-00ef38c3e1e6\" (UID: \"a601abae-f93a-48ec-8597-00ef38c3e1e6\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts\") pod \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\" (UID: \"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4\") " Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246801 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkq4s\" (UniqueName: \"kubernetes.io/projected/a43751e7-1432-4e99-8e62-50d14a3c9470-kube-api-access-rkq4s\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246826 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a43751e7-1432-4e99-8e62-50d14a3c9470-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.246814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"a601abae-f93a-48ec-8597-00ef38c3e1e6" (UID: "a601abae-f93a-48ec-8597-00ef38c3e1e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.247081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "910e33af-9b19-4387-babd-c19cce0cfd29" (UID: "910e33af-9b19-4387-babd-c19cce0cfd29"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.247261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" (UID: "6ebfa275-f618-4aaf-b02b-f4a6db7c78e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.249126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k" (OuterVolumeSpecName: "kube-api-access-l8z6k") pod "6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" (UID: "6ebfa275-f618-4aaf-b02b-f4a6db7c78e4"). InnerVolumeSpecName "kube-api-access-l8z6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.251069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw" (OuterVolumeSpecName: "kube-api-access-5g4gw") pod "a601abae-f93a-48ec-8597-00ef38c3e1e6" (UID: "a601abae-f93a-48ec-8597-00ef38c3e1e6"). InnerVolumeSpecName "kube-api-access-5g4gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.251798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd" (OuterVolumeSpecName: "kube-api-access-66krd") pod "910e33af-9b19-4387-babd-c19cce0cfd29" (UID: "910e33af-9b19-4387-babd-c19cce0cfd29"). InnerVolumeSpecName "kube-api-access-66krd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348328 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66krd\" (UniqueName: \"kubernetes.io/projected/910e33af-9b19-4387-babd-c19cce0cfd29-kube-api-access-66krd\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348381 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a601abae-f93a-48ec-8597-00ef38c3e1e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348401 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348422 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g4gw\" (UniqueName: \"kubernetes.io/projected/a601abae-f93a-48ec-8597-00ef38c3e1e6-kube-api-access-5g4gw\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348439 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910e33af-9b19-4387-babd-c19cce0cfd29-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.348456 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8z6k\" (UniqueName: \"kubernetes.io/projected/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4-kube-api-access-l8z6k\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.637616 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.637663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2" event={"ID":"a43751e7-1432-4e99-8e62-50d14a3c9470","Type":"ContainerDied","Data":"2f5e173c12032ed901166a55b5b0d683f2bc317fdb4f931de38251611654a231"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.637719 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f5e173c12032ed901166a55b5b0d683f2bc317fdb4f931de38251611654a231" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.639947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-qqptl" event={"ID":"910e33af-9b19-4387-babd-c19cce0cfd29","Type":"ContainerDied","Data":"e88e36fc447994d62ed970c7af897adb00db66ef0869cd8160049218e5bea23c"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.639996 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e88e36fc447994d62ed970c7af897adb00db66ef0869cd8160049218e5bea23c" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.640012 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-qqptl" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.641943 5030 generic.go:334] "Generic (PLEG): container finished" podID="0dbd22d0-db4b-485c-b2f6-20bfa502543f" containerID="930e742a8c3b6bfba53d77a91e0a752230ac32c65e3b1f9f159ccd247b30505f" exitCode=0 Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.641998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" event={"ID":"0dbd22d0-db4b-485c-b2f6-20bfa502543f","Type":"ContainerDied","Data":"930e742a8c3b6bfba53d77a91e0a752230ac32c65e3b1f9f159ccd247b30505f"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.644259 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.644264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-s2nnr" event={"ID":"a601abae-f93a-48ec-8597-00ef38c3e1e6","Type":"ContainerDied","Data":"d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.644320 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d83ff557c555e4ea7402a11908200e27c0bcb910492ed7a3fa3c3f8eeff1e4d8" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.647503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t86dg" event={"ID":"6ebfa275-f618-4aaf-b02b-f4a6db7c78e4","Type":"ContainerDied","Data":"c2d351597eaee596e2a1a45aa409260109083cdcce2f45736bbc3c6a0df5468c"} Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.647539 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d351597eaee596e2a1a45aa409260109083cdcce2f45736bbc3c6a0df5468c" Jan 20 22:59:21 crc kubenswrapper[5030]: I0120 22:59:21.647736 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t86dg" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.060965 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.148403 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.162561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts\") pod \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.162771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md46j\" (UniqueName: \"kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j\") pod \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\" (UID: \"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a\") " Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.163126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" (UID: "faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.171826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j" (OuterVolumeSpecName: "kube-api-access-md46j") pod "faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" (UID: "faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a"). InnerVolumeSpecName "kube-api-access-md46j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.264043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttd2b\" (UniqueName: \"kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b\") pod \"d1f367a2-2685-4a40-b29d-2a69508228f5\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.264481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts\") pod \"d1f367a2-2685-4a40-b29d-2a69508228f5\" (UID: \"d1f367a2-2685-4a40-b29d-2a69508228f5\") " Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.264942 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md46j\" (UniqueName: \"kubernetes.io/projected/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-kube-api-access-md46j\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.265004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1f367a2-2685-4a40-b29d-2a69508228f5" (UID: "d1f367a2-2685-4a40-b29d-2a69508228f5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.265027 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.266546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b" (OuterVolumeSpecName: "kube-api-access-ttd2b") pod "d1f367a2-2685-4a40-b29d-2a69508228f5" (UID: "d1f367a2-2685-4a40-b29d-2a69508228f5"). InnerVolumeSpecName "kube-api-access-ttd2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.366812 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1f367a2-2685-4a40-b29d-2a69508228f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.366842 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttd2b\" (UniqueName: \"kubernetes.io/projected/d1f367a2-2685-4a40-b29d-2a69508228f5-kube-api-access-ttd2b\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.665350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" event={"ID":"d1f367a2-2685-4a40-b29d-2a69508228f5","Type":"ContainerDied","Data":"851db81d02ca1e21505be83c54aae7b9579637673db519eb3317d841e7890c49"} Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.665430 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="851db81d02ca1e21505be83c54aae7b9579637673db519eb3317d841e7890c49" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.665373 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.668381 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.668421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl" event={"ID":"faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a","Type":"ContainerDied","Data":"6784a6d94b12b64167c227ac68ee3c276e83f440bd8c5baf0dcb8b22ee755a17"} Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.668499 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6784a6d94b12b64167c227ac68ee3c276e83f440bd8c5baf0dcb8b22ee755a17" Jan 20 22:59:22 crc kubenswrapper[5030]: I0120 22:59:22.952722 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.078116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle\") pod \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.078352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpl9d\" (UniqueName: \"kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d\") pod \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.078381 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data\") pod \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\" (UID: \"0dbd22d0-db4b-485c-b2f6-20bfa502543f\") " Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.084170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d" (OuterVolumeSpecName: "kube-api-access-tpl9d") pod "0dbd22d0-db4b-485c-b2f6-20bfa502543f" (UID: "0dbd22d0-db4b-485c-b2f6-20bfa502543f"). InnerVolumeSpecName "kube-api-access-tpl9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.114089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dbd22d0-db4b-485c-b2f6-20bfa502543f" (UID: "0dbd22d0-db4b-485c-b2f6-20bfa502543f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.139755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data" (OuterVolumeSpecName: "config-data") pod "0dbd22d0-db4b-485c-b2f6-20bfa502543f" (UID: "0dbd22d0-db4b-485c-b2f6-20bfa502543f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.180651 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpl9d\" (UniqueName: \"kubernetes.io/projected/0dbd22d0-db4b-485c-b2f6-20bfa502543f-kube-api-access-tpl9d\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.180974 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.180990 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbd22d0-db4b-485c-b2f6-20bfa502543f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.694131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" event={"ID":"0dbd22d0-db4b-485c-b2f6-20bfa502543f","Type":"ContainerDied","Data":"cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb"} Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.694218 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb418540327854ea632fe66f2c7138f433453a30862b571d1a2bc56d7c663dcb" Jan 20 22:59:23 crc kubenswrapper[5030]: I0120 22:59:23.694319 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-c82wm" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.156611 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-78ff9"] Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.160999 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f367a2-2685-4a40-b29d-2a69508228f5" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161022 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f367a2-2685-4a40-b29d-2a69508228f5" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161042 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910e33af-9b19-4387-babd-c19cce0cfd29" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161048 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="910e33af-9b19-4387-babd-c19cce0cfd29" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161056 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43751e7-1432-4e99-8e62-50d14a3c9470" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43751e7-1432-4e99-8e62-50d14a3c9470" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161076 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbd22d0-db4b-485c-b2f6-20bfa502543f" containerName="keystone-db-sync" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161082 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbd22d0-db4b-485c-b2f6-20bfa502543f" containerName="keystone-db-sync" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161090 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a601abae-f93a-48ec-8597-00ef38c3e1e6" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161097 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a601abae-f93a-48ec-8597-00ef38c3e1e6" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161107 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: E0120 22:59:24.161122 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161127 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161267 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="910e33af-9b19-4387-babd-c19cce0cfd29" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161275 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a601abae-f93a-48ec-8597-00ef38c3e1e6" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43751e7-1432-4e99-8e62-50d14a3c9470" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161298 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f367a2-2685-4a40-b29d-2a69508228f5" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161307 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" containerName="mariadb-account-create-update" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbd22d0-db4b-485c-b2f6-20bfa502543f" containerName="keystone-db-sync" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161331 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" containerName="mariadb-database-create" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.161834 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.165871 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.166004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.166083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.166147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.166392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5nbxn" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.181947 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-78ff9"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.245905 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.247179 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.250224 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.258813 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.259014 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-ghprt" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.259290 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.266730 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlvgj\" (UniqueName: \"kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data\") pod \"keystone-bootstrap-78ff9\" (UID: 
\"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.300271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.306204 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.307554 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.312572 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.315922 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.330673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.332761 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.341674 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.341760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.345898 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.356289 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-m7m8q"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.357239 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.362835 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.363005 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-zx6f5" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.363101 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.368589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.382338 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-m7m8q"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401405 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfglr\" (UniqueName: \"kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401559 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7gn9\" (UniqueName: \"kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc 
kubenswrapper[5030]: I0120 22:59:24.401715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlvgj\" (UniqueName: \"kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.401847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.411372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.411875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.412242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.422023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.433035 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-w9wqz"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.434295 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.437970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.439221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlvgj\" (UniqueName: \"kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj\") pod \"keystone-bootstrap-78ff9\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.441840 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-tmgt6" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.442031 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.465149 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-w9wqz"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.481294 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-vbdht"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.482305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.488146 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.488405 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-j55hb" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.488503 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.489890 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwbk9\" (UniqueName: \"kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2ktn\" (UniqueName: \"kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfglr\" (UniqueName: \"kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7gn9\" (UniqueName: \"kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.503604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.504520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.504597 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.505014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.505647 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.508950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.508999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqtc\" (UniqueName: \"kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.509054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.509086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.509107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.509497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.509812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-vbdht"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.512250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc 
kubenswrapper[5030]: I0120 22:59:24.512860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.513046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.513561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.514030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.515723 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.517610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.521527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.526358 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8lg48"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.527398 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.533257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.533391 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.533431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.533878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-pnjbw" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.533985 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8lg48"] Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.536378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfglr\" (UniqueName: \"kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.543109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7gn9\" (UniqueName: \"kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.564062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.569225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2ktn\" (UniqueName: \"kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 
22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613688 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613733 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613888 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqtc\" (UniqueName: \"kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lp6j\" (UniqueName: \"kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.613982 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh2pn\" (UniqueName: \"kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614064 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwbk9\" (UniqueName: \"kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614464 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.614908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.620984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.622260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.623151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.623908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle\") pod 
\"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.623935 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.624969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.625904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.627021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.641134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2ktn\" (UniqueName: \"kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn\") pod \"neutron-db-sync-m7m8q\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.641830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqtc\" (UniqueName: \"kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.644230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwbk9\" (UniqueName: \"kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9\") pod \"barbican-db-sync-w9wqz\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.644707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts\") pod \"ceilometer-0\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.647052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.681601 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh2pn\" (UniqueName: \"kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.715933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.716010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.716127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 
22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.716068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lp6j\" (UniqueName: \"kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.716564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.716588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.717579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.722386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.724574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.726563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.727882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.729365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.730392 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.733777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.735702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh2pn\" (UniqueName: \"kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn\") pod \"placement-db-sync-8lg48\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.736799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lp6j\" (UniqueName: \"kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j\") pod \"cinder-db-sync-vbdht\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.814155 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.868168 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.942033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.951953 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:24 crc kubenswrapper[5030]: I0120 22:59:24.963754 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-78ff9"] Jan 20 22:59:24 crc kubenswrapper[5030]: W0120 22:59:24.981774 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfdc05b_f070_4e63_9606_c7f604a5ccda.slice/crio-7287c7f837030fd575c19cdbe0942e70cbb5f05008764e622ce4ce8ef151fa68 WatchSource:0}: Error finding container 7287c7f837030fd575c19cdbe0942e70cbb5f05008764e622ce4ce8ef151fa68: Status 404 returned error can't find the container with id 7287c7f837030fd575c19cdbe0942e70cbb5f05008764e622ce4ce8ef151fa68 Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.122972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.142812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.224756 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-m7m8q"] Jan 20 22:59:25 crc kubenswrapper[5030]: W0120 22:59:25.232955 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded15b8f4_d0a2_401f_ac7e_cc273da45269.slice/crio-a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b WatchSource:0}: Error finding container a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b: Status 404 returned error can't find the container with id a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.323119 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:25 crc kubenswrapper[5030]: W0120 22:59:25.328387 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe32cdca_a8f0_4c66_91bb_0ae2e0da99a8.slice/crio-04e85f606351b27d4497ba02bbda0c75306c596d681dc7954c992b7e6eb24702 WatchSource:0}: Error finding container 04e85f606351b27d4497ba02bbda0c75306c596d681dc7954c992b7e6eb24702: Status 404 returned error can't find the container with id 04e85f606351b27d4497ba02bbda0c75306c596d681dc7954c992b7e6eb24702 Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.333215 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-w9wqz"] Jan 20 22:59:25 crc kubenswrapper[5030]: W0120 22:59:25.351011 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc105b2eb_c6ab_47a6_bdcb_9cf5e9893bae.slice/crio-2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8 WatchSource:0}: Error finding container 2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8: Status 404 returned error can't find the container with id 2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8 Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.507278 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8lg48"] Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.515126 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/cinder-db-sync-vbdht"] Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.796023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerStarted","Data":"56edc5d2efeea9a28c13816471d41914ce49041d2de59fd1ec22460767b79903"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.798878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" event={"ID":"ed15b8f4-d0a2-401f-ac7e-cc273da45269","Type":"ContainerStarted","Data":"b55acc6188f53bf283b05b9661bb3eb3caf7b2386532ae980483b10b21b0a5fb"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.798900 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" event={"ID":"ed15b8f4-d0a2-401f-ac7e-cc273da45269","Type":"ContainerStarted","Data":"a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.802264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" event={"ID":"3e588d13-720f-45b7-aaa8-d339dc20f695","Type":"ContainerStarted","Data":"2e055d2541edb88f6ffee86733420bae13f69cbba28736eecba171ae51521dec"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.804255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerStarted","Data":"d59da9f94d4d076d36a2a053e93a80f47f72a8da615471f83d2e406500828b45"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.806510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" event={"ID":"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae","Type":"ContainerStarted","Data":"f18030c78bda0c891f73eaa93b340a7df761b53cca2ed85b4b69c8267068017e"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.806597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" event={"ID":"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae","Type":"ContainerStarted","Data":"2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.809292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" event={"ID":"cdfdc05b-f070-4e63-9606-c7f604a5ccda","Type":"ContainerStarted","Data":"36afc91b2b88250d491610095902402ca8687e8f5009809524f0cc02447b6ac9"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.809387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" event={"ID":"cdfdc05b-f070-4e63-9606-c7f604a5ccda","Type":"ContainerStarted","Data":"7287c7f837030fd575c19cdbe0942e70cbb5f05008764e622ce4ce8ef151fa68"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.815030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerStarted","Data":"04e85f606351b27d4497ba02bbda0c75306c596d681dc7954c992b7e6eb24702"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.816352 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" podStartSLOduration=1.816333673 podStartE2EDuration="1.816333673s" podCreationTimestamp="2026-01-20 
22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:25.812959541 +0000 UTC m=+1438.133219829" watchObservedRunningTime="2026-01-20 22:59:25.816333673 +0000 UTC m=+1438.136593961" Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.834278 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" podStartSLOduration=1.834262348 podStartE2EDuration="1.834262348s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:25.825779232 +0000 UTC m=+1438.146039520" watchObservedRunningTime="2026-01-20 22:59:25.834262348 +0000 UTC m=+1438.154522636" Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.852213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8lg48" event={"ID":"c92053f1-ec4e-4dee-ac05-a4860c140b41","Type":"ContainerStarted","Data":"58c4c940230c2969da4de4c10a965e74815f9a0617d579d94c2cea62112932fc"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.852252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8lg48" event={"ID":"c92053f1-ec4e-4dee-ac05-a4860c140b41","Type":"ContainerStarted","Data":"dd220a7d9dff3614e53f27ec375dae9bdff4da07e7de34c9a66d258483c68356"} Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.853742 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" podStartSLOduration=1.853725012 podStartE2EDuration="1.853725012s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:25.840379568 +0000 UTC m=+1438.160639856" watchObservedRunningTime="2026-01-20 22:59:25.853725012 +0000 UTC m=+1438.173985300" Jan 20 22:59:25 crc kubenswrapper[5030]: I0120 22:59:25.879962 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-8lg48" podStartSLOduration=1.879945099 podStartE2EDuration="1.879945099s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:25.874777343 +0000 UTC m=+1438.195037631" watchObservedRunningTime="2026-01-20 22:59:25.879945099 +0000 UTC m=+1438.200205387" Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.868092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerStarted","Data":"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d"} Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.880906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" event={"ID":"3e588d13-720f-45b7-aaa8-d339dc20f695","Type":"ContainerStarted","Data":"44f6f5122053dee469c5f5f15881894d07ecebb1b02c5a8aeebf93f8fe66c416"} Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.885245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerStarted","Data":"adf6b9b7c07af52aab5a63971fbefa3a17b151f0c57d391ef2c2b3e03cca6ed6"} Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.900300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerStarted","Data":"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f"} Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.902669 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" podStartSLOduration=2.902654631 podStartE2EDuration="2.902654631s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:26.894943684 +0000 UTC m=+1439.215203972" watchObservedRunningTime="2026-01-20 22:59:26.902654631 +0000 UTC m=+1439.222914919" Jan 20 22:59:26 crc kubenswrapper[5030]: I0120 22:59:26.995943 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.436089 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.501985 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.909562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerStarted","Data":"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9"} Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.909694 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-log" containerID="cri-o://6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" gracePeriod=30 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.909742 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-httpd" containerID="cri-o://3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" gracePeriod=30 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.913044 5030 generic.go:334] "Generic (PLEG): container finished" podID="c92053f1-ec4e-4dee-ac05-a4860c140b41" containerID="58c4c940230c2969da4de4c10a965e74815f9a0617d579d94c2cea62112932fc" exitCode=0 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.913105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8lg48" event={"ID":"c92053f1-ec4e-4dee-ac05-a4860c140b41","Type":"ContainerDied","Data":"58c4c940230c2969da4de4c10a965e74815f9a0617d579d94c2cea62112932fc"} Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.917388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerStarted","Data":"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711"} Jan 20 
22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.917523 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-log" containerID="cri-o://cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" gracePeriod=30 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.917614 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-httpd" containerID="cri-o://b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" gracePeriod=30 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.929976 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerStarted","Data":"c81c0d711506b6eeceb9bbf72e9abc922b35819dae6ed02a943b536214b59bb3"} Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.932379 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.932364583 podStartE2EDuration="3.932364583s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:27.930469018 +0000 UTC m=+1440.250729306" watchObservedRunningTime="2026-01-20 22:59:27.932364583 +0000 UTC m=+1440.252624861" Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.934974 5030 generic.go:334] "Generic (PLEG): container finished" podID="c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" containerID="f18030c78bda0c891f73eaa93b340a7df761b53cca2ed85b4b69c8267068017e" exitCode=0 Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.935726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" event={"ID":"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae","Type":"ContainerDied","Data":"f18030c78bda0c891f73eaa93b340a7df761b53cca2ed85b4b69c8267068017e"} Jan 20 22:59:27 crc kubenswrapper[5030]: I0120 22:59:27.986593 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.986578061 podStartE2EDuration="3.986578061s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:27.985487715 +0000 UTC m=+1440.305748003" watchObservedRunningTime="2026-01-20 22:59:27.986578061 +0000 UTC m=+1440.306838349" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.545340 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.620064 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7gn9\" (UniqueName: \"kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739427 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739560 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfglr\" (UniqueName: \"kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs\") pod \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\" (UID: \"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.739999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run\") pod \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\" (UID: \"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8\") " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.740708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.741119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs" (OuterVolumeSpecName: "logs") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.741973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs" (OuterVolumeSpecName: "logs") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.743551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.745993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts" (OuterVolumeSpecName: "scripts") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.746126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.746517 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9" (OuterVolumeSpecName: "kube-api-access-j7gn9") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "kube-api-access-j7gn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.747805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts" (OuterVolumeSpecName: "scripts") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.748101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.750073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr" (OuterVolumeSpecName: "kube-api-access-dfglr") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "kube-api-access-dfglr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.778294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.781772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.796365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data" (OuterVolumeSpecName: "config-data") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.800453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" (UID: "b0c2332f-3093-47c0-9b1e-cc2780a9a3b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.802013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data" (OuterVolumeSpecName: "config-data") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.805357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" (UID: "be32cdca-a8f0-4c66-91bb-0ae2e0da99a8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841847 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841886 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841896 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfglr\" (UniqueName: \"kubernetes.io/projected/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-kube-api-access-dfglr\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841909 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841918 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841927 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841937 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7gn9\" (UniqueName: \"kubernetes.io/projected/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-kube-api-access-j7gn9\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841945 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841979 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.841989 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842002 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842011 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842019 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842027 5030 reconciler_common.go:293] 
"Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842035 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.842045 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.858464 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.863126 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.943014 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.943039 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947000 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerID="b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" exitCode=0 Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947040 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerID="cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" exitCode=143 Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerDied","Data":"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerDied","Data":"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b0c2332f-3093-47c0-9b1e-cc2780a9a3b7","Type":"ContainerDied","Data":"56edc5d2efeea9a28c13816471d41914ce49041d2de59fd1ec22460767b79903"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947189 5030 scope.go:117] "RemoveContainer" containerID="b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.947349 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.953557 5030 generic.go:334] "Generic (PLEG): container finished" podID="ed15b8f4-d0a2-401f-ac7e-cc273da45269" containerID="b55acc6188f53bf283b05b9661bb3eb3caf7b2386532ae980483b10b21b0a5fb" exitCode=0 Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.953781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" event={"ID":"ed15b8f4-d0a2-401f-ac7e-cc273da45269","Type":"ContainerDied","Data":"b55acc6188f53bf283b05b9661bb3eb3caf7b2386532ae980483b10b21b0a5fb"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.957656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerStarted","Data":"a2c273022a42bfa06dbc3b574e6c50272816d7abc8a2a2ee16e3f3ee91c019cb"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.968894 5030 generic.go:334] "Generic (PLEG): container finished" podID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerID="3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" exitCode=0 Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.969089 5030 generic.go:334] "Generic (PLEG): container finished" podID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerID="6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" exitCode=143 Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.968977 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.968996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerDied","Data":"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.971577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerDied","Data":"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.971600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be32cdca-a8f0-4c66-91bb-0ae2e0da99a8","Type":"ContainerDied","Data":"04e85f606351b27d4497ba02bbda0c75306c596d681dc7954c992b7e6eb24702"} Jan 20 22:59:28 crc kubenswrapper[5030]: I0120 22:59:28.976745 5030 scope.go:117] "RemoveContainer" containerID="cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.005738 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.037533 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.037890 5030 scope.go:117] "RemoveContainer" containerID="b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.040360 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711\": container with ID starting with b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711 not found: ID does not exist" containerID="b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.040389 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711"} err="failed to get container status \"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711\": rpc error: code = NotFound desc = could not find container \"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711\": container with ID starting with b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711 not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.040411 5030 scope.go:117] "RemoveContainer" containerID="cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.052266 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d\": container with ID starting with cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d not found: ID does not exist" containerID="cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.052310 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d"} err="failed to get container status \"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d\": rpc error: code = NotFound desc = could not find container \"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d\": container with ID starting with cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.052336 5030 scope.go:117] "RemoveContainer" containerID="b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.052813 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711"} err="failed to get container status \"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711\": rpc error: code = NotFound desc = could not find container \"b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711\": container with ID starting with b4780aed27b4cd4546a48bf2e037d75f27492c08b974aca0ef6c8de36b820711 not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.052831 5030 scope.go:117] "RemoveContainer" containerID="cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.056924 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d"} err="failed to get container status \"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d\": rpc error: code = NotFound desc = could not find container \"cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d\": container with ID 
starting with cdca21a623608d3a610ebbbea5e927f65ad9cfb075c69811451a61ff577f0a1d not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.056947 5030 scope.go:117] "RemoveContainer" containerID="3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.071083 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.071891 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.071906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.071925 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.071931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.071958 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.071968 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.071989 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.071995 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.072315 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.072332 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.072347 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-log" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.072368 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" containerName="glance-httpd" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.078610 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.081091 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-ghprt" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.084052 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.084225 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.084366 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.096738 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.117774 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.133729 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.138824 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.140206 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.141880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.144908 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.156073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.182982 5030 scope.go:117] "RemoveContainer" containerID="6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.220176 5030 scope.go:117] "RemoveContainer" containerID="3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.221065 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9\": container with ID starting with 3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9 not found: ID does not exist" containerID="3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.221259 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9"} err="failed to get container status \"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9\": rpc error: code = NotFound desc = could not find container 
\"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9\": container with ID starting with 3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9 not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.221327 5030 scope.go:117] "RemoveContainer" containerID="6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" Jan 20 22:59:29 crc kubenswrapper[5030]: E0120 22:59:29.222087 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f\": container with ID starting with 6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f not found: ID does not exist" containerID="6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.222125 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f"} err="failed to get container status \"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f\": rpc error: code = NotFound desc = could not find container \"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f\": container with ID starting with 6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.222151 5030 scope.go:117] "RemoveContainer" containerID="3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.222482 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9"} err="failed to get container status \"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9\": rpc error: code = NotFound desc = could not find container \"3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9\": container with ID starting with 3e9949c7befa50af18937fec01588a5a4125b13e1501ed8101d14bb5d45f6db9 not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.222500 5030 scope.go:117] "RemoveContainer" containerID="6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.222834 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f"} err="failed to get container status \"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f\": rpc error: code = NotFound desc = could not find container \"6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f\": container with ID starting with 6bc38b21069a79b407da21b86f4ee07f846d2ade38dca5bc933dafbcee62968f not found: ID does not exist" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251548 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251566 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.251599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mct7x\" (UniqueName: \"kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnc8\" (UniqueName: \"kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.252138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mct7x\" (UniqueName: \"kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jfnc8\" (UniqueName: \"kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.353961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.354225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.354264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.354644 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.355180 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.359090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.364338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.366431 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.367091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.373380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.374362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.376107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.377159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.379806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.383243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.384194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnc8\" (UniqueName: \"kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.386115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mct7x\" (UniqueName: \"kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.423465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.430433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.480683 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.489973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.547583 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.552045 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle\") pod \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwbk9\" (UniqueName: \"kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9\") pod \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle\") pod \"c92053f1-ec4e-4dee-ac05-a4860c140b41\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts\") pod \"c92053f1-ec4e-4dee-ac05-a4860c140b41\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs\") pod \"c92053f1-ec4e-4dee-ac05-a4860c140b41\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data\") pod \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\" (UID: \"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data\") pod \"c92053f1-ec4e-4dee-ac05-a4860c140b41\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.661580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh2pn\" (UniqueName: \"kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn\") pod \"c92053f1-ec4e-4dee-ac05-a4860c140b41\" (UID: \"c92053f1-ec4e-4dee-ac05-a4860c140b41\") " Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.671005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs" (OuterVolumeSpecName: "logs") pod "c92053f1-ec4e-4dee-ac05-a4860c140b41" (UID: "c92053f1-ec4e-4dee-ac05-a4860c140b41"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.691105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9" (OuterVolumeSpecName: "kube-api-access-gwbk9") pod "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" (UID: "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae"). InnerVolumeSpecName "kube-api-access-gwbk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.693189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" (UID: "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.699082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts" (OuterVolumeSpecName: "scripts") pod "c92053f1-ec4e-4dee-ac05-a4860c140b41" (UID: "c92053f1-ec4e-4dee-ac05-a4860c140b41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.709509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn" (OuterVolumeSpecName: "kube-api-access-fh2pn") pod "c92053f1-ec4e-4dee-ac05-a4860c140b41" (UID: "c92053f1-ec4e-4dee-ac05-a4860c140b41"). InnerVolumeSpecName "kube-api-access-fh2pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.740798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data" (OuterVolumeSpecName: "config-data") pod "c92053f1-ec4e-4dee-ac05-a4860c140b41" (UID: "c92053f1-ec4e-4dee-ac05-a4860c140b41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.742314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" (UID: "c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767828 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767860 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767873 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh2pn\" (UniqueName: \"kubernetes.io/projected/c92053f1-ec4e-4dee-ac05-a4860c140b41-kube-api-access-fh2pn\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767885 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767895 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwbk9\" (UniqueName: \"kubernetes.io/projected/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae-kube-api-access-gwbk9\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767905 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.767917 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c92053f1-ec4e-4dee-ac05-a4860c140b41-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.785259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c92053f1-ec4e-4dee-ac05-a4860c140b41" (UID: "c92053f1-ec4e-4dee-ac05-a4860c140b41"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.868990 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92053f1-ec4e-4dee-ac05-a4860c140b41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.989108 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c2332f-3093-47c0-9b1e-cc2780a9a3b7" path="/var/lib/kubelet/pods/b0c2332f-3093-47c0-9b1e-cc2780a9a3b7/volumes" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.990114 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be32cdca-a8f0-4c66-91bb-0ae2e0da99a8" path="/var/lib/kubelet/pods/be32cdca-a8f0-4c66-91bb-0ae2e0da99a8/volumes" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.999785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" event={"ID":"c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae","Type":"ContainerDied","Data":"2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8"} Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.999827 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2369f7584fc875acb35fc4fa8972ed120ad91f6ff87f1ae374a7776ec89719b8" Jan 20 22:59:29 crc kubenswrapper[5030]: I0120 22:59:29.999801 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-w9wqz" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.001370 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdfdc05b-f070-4e63-9606-c7f604a5ccda" containerID="36afc91b2b88250d491610095902402ca8687e8f5009809524f0cc02447b6ac9" exitCode=0 Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.001429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" event={"ID":"cdfdc05b-f070-4e63-9606-c7f604a5ccda","Type":"ContainerDied","Data":"36afc91b2b88250d491610095902402ca8687e8f5009809524f0cc02447b6ac9"} Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.007054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8lg48" event={"ID":"c92053f1-ec4e-4dee-ac05-a4860c140b41","Type":"ContainerDied","Data":"dd220a7d9dff3614e53f27ec375dae9bdff4da07e7de34c9a66d258483c68356"} Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.007077 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd220a7d9dff3614e53f27ec375dae9bdff4da07e7de34c9a66d258483c68356" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.007109 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8lg48" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.122224 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 22:59:30 crc kubenswrapper[5030]: E0120 22:59:30.122574 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" containerName="barbican-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.122585 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" containerName="barbican-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: E0120 22:59:30.122608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92053f1-ec4e-4dee-ac05-a4860c140b41" containerName="placement-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.122614 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92053f1-ec4e-4dee-ac05-a4860c140b41" containerName="placement-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.122823 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92053f1-ec4e-4dee-ac05-a4860c140b41" containerName="placement-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.122851 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" containerName="barbican-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.123602 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.129584 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.130242 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-pnjbw" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.130356 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.146842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.162221 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.263751 5030 scope.go:117] "RemoveContainer" containerID="1f129a58100ee9749133f57abfedbfee9be9910d03dfde215c4f5caac80fa22d" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.294473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.312425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.314147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.314175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.314254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.314339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swghr\" (UniqueName: \"kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.320228 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.327822 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.330678 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.331086 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.331405 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-tmgt6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.347752 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.349367 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.352928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.353770 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.359400 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.387435 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.390601 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.397809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.399002 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.415440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.415487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.415507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.415569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swghr\" (UniqueName: \"kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.415608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.416056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs\") pod 
\"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.436289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.439279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.447054 5030 scope.go:117] "RemoveContainer" containerID="3622233fe725e7827b49fc546a0a57fcc1df7bee2d883a8db35f60ca54cdcc3d" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.447367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swghr\" (UniqueName: \"kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.451184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data\") pod \"placement-9855649bd-rgwm6\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.461262 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.497369 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.517787 5030 scope.go:117] "RemoveContainer" containerID="5083ee0eecec9aa23fd8e6c82e1b7f2af5bc2a0e4f692f039c3caf0ab0de514a" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.518876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.518906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.518935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.518971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.518994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmddf\" (UniqueName: \"kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpfp\" (UniqueName: \"kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpqw\" (UniqueName: \"kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs\") pod 
\"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.519318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.572324 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 22:59:30 crc kubenswrapper[5030]: E0120 22:59:30.572734 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed15b8f4-d0a2-401f-ac7e-cc273da45269" 
containerName="neutron-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.572751 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed15b8f4-d0a2-401f-ac7e-cc273da45269" containerName="neutron-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.572968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed15b8f4-d0a2-401f-ac7e-cc273da45269" containerName="neutron-db-sync" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.575366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.579170 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.579974 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.582210 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.608100 5030 scope.go:117] "RemoveContainer" containerID="3a3477d03775c8409a06e6db45461245f50d32bf83272caaa790fbb1f2b619a0" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621106 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2ktn\" (UniqueName: \"kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn\") pod \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621184 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config\") pod \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle\") pod \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\" (UID: \"ed15b8f4-d0a2-401f-ac7e-cc273da45269\") " Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle\") pod \"barbican-api-7ff8bf756d-jk7l2\" 
(UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmddf\" (UniqueName: \"kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpfp\" (UniqueName: \"kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpqw\" (UniqueName: \"kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.621904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.622879 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.626876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.631489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.632505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.634744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 
22:59:30.636338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.636790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.637041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn" (OuterVolumeSpecName: "kube-api-access-k2ktn") pod "ed15b8f4-d0a2-401f-ac7e-cc273da45269" (UID: "ed15b8f4-d0a2-401f-ac7e-cc273da45269"). InnerVolumeSpecName "kube-api-access-k2ktn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.640633 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpqw\" (UniqueName: \"kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.642228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.643755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.643847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom\") pod \"barbican-worker-85fffb8dc7-27vbq\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.644205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpfp\" (UniqueName: \"kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp\") pod \"barbican-keystone-listener-74b4fcd47d-8qrmq\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.646501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data\") pod 
\"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.647469 5030 scope.go:117] "RemoveContainer" containerID="b0d2103e4ac83ab42ccb1a0edfca0d51c401307b05570b94fa953f43e63443b3" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.651119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmddf\" (UniqueName: \"kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.652451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom\") pod \"barbican-api-7ff8bf756d-jk7l2\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.673718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed15b8f4-d0a2-401f-ac7e-cc273da45269" (UID: "ed15b8f4-d0a2-401f-ac7e-cc273da45269"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.690210 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.692551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config" (OuterVolumeSpecName: "config") pod "ed15b8f4-d0a2-401f-ac7e-cc273da45269" (UID: "ed15b8f4-d0a2-401f-ac7e-cc273da45269"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.699884 5030 scope.go:117] "RemoveContainer" containerID="5480da400ee80721be2f16ae8f3038789f5bab39358c94ed7e428011135c3283" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.723753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.723826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.723847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbb8k\" (UniqueName: \"kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724955 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2ktn\" (UniqueName: \"kubernetes.io/projected/ed15b8f4-d0a2-401f-ac7e-cc273da45269-kube-api-access-k2ktn\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724971 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-config\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.724981 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed15b8f4-d0a2-401f-ac7e-cc273da45269-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.774902 5030 scope.go:117] "RemoveContainer" containerID="f62ebf5cd18768a4d65883ecf1d5961daa058a3f749fc304252503a2f82a463d" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.785542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.796089 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826526 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbb8k\" (UniqueName: \"kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.826788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.827490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.831663 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.831796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.832830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.833219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.836294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.859951 5030 scope.go:117] "RemoveContainer" containerID="f42f977d4b1f56e969e3645e77647af497eeead81e8ac35d88c15264e254f40f" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.873571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbb8k\" (UniqueName: \"kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k\") pod \"placement-777ff759db-mgr2r\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.901220 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.927373 5030 scope.go:117] "RemoveContainer" containerID="e91ba2b5c96f1ceb6cf02b5d22e74541a14325e0f54d4686895e5fc2e62c482c" Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.963493 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 22:59:30 crc kubenswrapper[5030]: I0120 22:59:30.987221 5030 scope.go:117] "RemoveContainer" containerID="6fdf5ee996af7b294b4ccf748fd8a7949b814dbeedb68b669717d89b03aacbbb" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.032190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerStarted","Data":"e873a504335074ef9b6e599db1a678cb382455d2f7f6a91a65fca438ad552396"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.032522 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-central-agent" containerID="cri-o://adf6b9b7c07af52aab5a63971fbefa3a17b151f0c57d391ef2c2b3e03cca6ed6" gracePeriod=30 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.032684 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="proxy-httpd" containerID="cri-o://e873a504335074ef9b6e599db1a678cb382455d2f7f6a91a65fca438ad552396" gracePeriod=30 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.032773 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="sg-core" containerID="cri-o://a2c273022a42bfa06dbc3b574e6c50272816d7abc8a2a2ee16e3f3ee91c019cb" gracePeriod=30 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.032856 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-notification-agent" containerID="cri-o://c81c0d711506b6eeceb9bbf72e9abc922b35819dae6ed02a943b536214b59bb3" gracePeriod=30 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.033428 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.063333 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.36658372 podStartE2EDuration="7.063317386s" podCreationTimestamp="2026-01-20 22:59:24 +0000 UTC" firstStartedPulling="2026-01-20 22:59:25.148614791 +0000 UTC m=+1437.468875079" lastFinishedPulling="2026-01-20 22:59:29.845348457 +0000 UTC m=+1442.165608745" observedRunningTime="2026-01-20 22:59:31.053071037 +0000 UTC m=+1443.373331325" watchObservedRunningTime="2026-01-20 22:59:31.063317386 +0000 UTC m=+1443.383577674" Jan 20 22:59:31 crc kubenswrapper[5030]: W0120 22:59:31.073780 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c547bf6_ed95_4bc8_8573_3047e4689484.slice/crio-2c9572d1f3ae94ab6e57b82eb1f102f5c4b17430a168712d9171152cea0b0ae6 WatchSource:0}: Error finding container 
2c9572d1f3ae94ab6e57b82eb1f102f5c4b17430a168712d9171152cea0b0ae6: Status 404 returned error can't find the container with id 2c9572d1f3ae94ab6e57b82eb1f102f5c4b17430a168712d9171152cea0b0ae6 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.109410 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e588d13-720f-45b7-aaa8-d339dc20f695" containerID="44f6f5122053dee469c5f5f15881894d07ecebb1b02c5a8aeebf93f8fe66c416" exitCode=0 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.109467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" event={"ID":"3e588d13-720f-45b7-aaa8-d339dc20f695","Type":"ContainerDied","Data":"44f6f5122053dee469c5f5f15881894d07ecebb1b02c5a8aeebf93f8fe66c416"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.113093 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" event={"ID":"ed15b8f4-d0a2-401f-ac7e-cc273da45269","Type":"ContainerDied","Data":"a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.113111 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a49501582461db055200519fb5cb4d0b18985ad1880f8b3539f52f062f3e1a7b" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.113149 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-m7m8q" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.175894 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.184130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerStarted","Data":"8d4fbb30a5cf6219d5682098586dc3cec9e1cbd936b41a0e2dec7fd3e4246ed4"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.188230 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerStarted","Data":"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.188379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerStarted","Data":"625a7d3ce44403049e137f9f93a519a84747870829838643d2aeacacc8b9fcb6"} Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.238157 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.239536 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.246205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-zx6f5" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.246220 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.246489 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.247862 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.266032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.342540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.343286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckr6v\" (UniqueName: \"kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.343319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.343364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.343386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.445966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.446010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.446122 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.446149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckr6v\" (UniqueName: \"kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.446189 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.451366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.457229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.461811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.462097 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.463579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.467739 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckr6v\" (UniqueName: \"kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v\") pod \"neutron-66c88c7dcc-g5x75\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc 
kubenswrapper[5030]: I0120 22:59:31.598163 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.608935 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:31 crc kubenswrapper[5030]: W0120 22:59:31.655922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e38dceb_8a8c_4a9f_8f41_91bb9868510a.slice/crio-14550e8bdeb6cf43c6c32cd1cb27c1f76f8e4a904b422f81d9a6d70e781662d9 WatchSource:0}: Error finding container 14550e8bdeb6cf43c6c32cd1cb27c1f76f8e4a904b422f81d9a6d70e781662d9: Status 404 returned error can't find the container with id 14550e8bdeb6cf43c6c32cd1cb27c1f76f8e4a904b422f81d9a6d70e781662d9 Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.674793 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.742116 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.855776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.855885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.855919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.855941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlvgj\" (UniqueName: \"kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.856025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.856051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle\") pod \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\" (UID: \"cdfdc05b-f070-4e63-9606-c7f604a5ccda\") " Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.863168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts" (OuterVolumeSpecName: "scripts") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.867007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj" (OuterVolumeSpecName: "kube-api-access-zlvgj") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "kube-api-access-zlvgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.868139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.875887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.959923 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlvgj\" (UniqueName: \"kubernetes.io/projected/cdfdc05b-f070-4e63-9606-c7f604a5ccda-kube-api-access-zlvgj\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.960272 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.960285 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.960341 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:31 crc kubenswrapper[5030]: I0120 22:59:31.998969 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data" (OuterVolumeSpecName: "config-data") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.032007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfdc05b-f070-4e63-9606-c7f604a5ccda" (UID: "cdfdc05b-f070-4e63-9606-c7f604a5ccda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.062695 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.063743 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfdc05b-f070-4e63-9606-c7f604a5ccda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.162665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-78ff9"] Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.171272 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-78ff9"] Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203417 5030 generic.go:334] "Generic (PLEG): container finished" podID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerID="e873a504335074ef9b6e599db1a678cb382455d2f7f6a91a65fca438ad552396" exitCode=0 Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203444 5030 generic.go:334] "Generic (PLEG): container finished" podID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerID="a2c273022a42bfa06dbc3b574e6c50272816d7abc8a2a2ee16e3f3ee91c019cb" exitCode=2 Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203451 5030 generic.go:334] "Generic (PLEG): container finished" podID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerID="c81c0d711506b6eeceb9bbf72e9abc922b35819dae6ed02a943b536214b59bb3" exitCode=0 Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203457 5030 generic.go:334] "Generic (PLEG): container finished" podID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerID="adf6b9b7c07af52aab5a63971fbefa3a17b151f0c57d391ef2c2b3e03cca6ed6" exitCode=0 Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerDied","Data":"e873a504335074ef9b6e599db1a678cb382455d2f7f6a91a65fca438ad552396"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerDied","Data":"a2c273022a42bfa06dbc3b574e6c50272816d7abc8a2a2ee16e3f3ee91c019cb"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerDied","Data":"c81c0d711506b6eeceb9bbf72e9abc922b35819dae6ed02a943b536214b59bb3"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerDied","Data":"adf6b9b7c07af52aab5a63971fbefa3a17b151f0c57d391ef2c2b3e03cca6ed6"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.203541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"15247204-6ca3-4d52-ac9e-e8ba2e682d35","Type":"ContainerDied","Data":"d59da9f94d4d076d36a2a053e93a80f47f72a8da615471f83d2e406500828b45"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 
22:59:32.203551 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d59da9f94d4d076d36a2a053e93a80f47f72a8da615471f83d2e406500828b45" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.205441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerStarted","Data":"f914453f875b54ee6ab683459a0c75064627074a817073323f80398d441f2bb8"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.205466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerStarted","Data":"14550e8bdeb6cf43c6c32cd1cb27c1f76f8e4a904b422f81d9a6d70e781662d9"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.207022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerStarted","Data":"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.207045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerStarted","Data":"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.207054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerStarted","Data":"2c9572d1f3ae94ab6e57b82eb1f102f5c4b17430a168712d9171152cea0b0ae6"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.208114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.208145 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.215170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerStarted","Data":"8da669032a42ae5fafc499507531665ae7680439dcecff59acbf618249a406b2"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.215208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerStarted","Data":"cac1cd239c8eb7be62a520b3a04834943d6ef25c4b2d01f12353c199ab2ebfde"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.218471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerStarted","Data":"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.221560 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7287c7f837030fd575c19cdbe0942e70cbb5f05008764e622ce4ce8ef151fa68" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.221670 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-78ff9" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.246679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerStarted","Data":"203a6757ba66bed0aa792c3622d72d2811284af7b92f313c118c72e3bd14c9b0"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.246717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerStarted","Data":"9c9cc79f5dd471123bad13d3eca786383b406d209ce6569d1d53b3d158c5fc53"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.252202 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerStarted","Data":"f452b9ea45f1195ca7898583c3bfd33b36728479e749928f1d50f07d9c2da3b7"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.256310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerStarted","Data":"aaad7c24ea8a9b3796aa4e5cfe2cc0e940b0c172d22637110c91ea576ac7f3ca"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.256353 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerStarted","Data":"26ac43c883c15abf4af3c5422e5d1c62a3bf4d492dd374cfe9f9364237f47b17"} Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.257159 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" podStartSLOduration=2.257133887 podStartE2EDuration="2.257133887s" podCreationTimestamp="2026-01-20 22:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:32.23134999 +0000 UTC m=+1444.551610288" watchObservedRunningTime="2026-01-20 22:59:32.257133887 +0000 UTC m=+1444.577394175" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.270566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.279658 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cx7lh"] Jan 20 22:59:32 crc kubenswrapper[5030]: E0120 22:59:32.279992 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfdc05b-f070-4e63-9606-c7f604a5ccda" containerName="keystone-bootstrap" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.280011 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfdc05b-f070-4e63-9606-c7f604a5ccda" containerName="keystone-bootstrap" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.280167 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfdc05b-f070-4e63-9606-c7f604a5ccda" containerName="keystone-bootstrap" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.280706 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.282099 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.282528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.284281 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.285402 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5nbxn" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.285561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.288683 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cx7lh"] Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.294061 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.294041814 podStartE2EDuration="3.294041814s" podCreationTimestamp="2026-01-20 22:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:32.267306724 +0000 UTC m=+1444.587567012" watchObservedRunningTime="2026-01-20 22:59:32.294041814 +0000 UTC m=+1444.614302102" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.303107 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.303091225 podStartE2EDuration="3.303091225s" podCreationTimestamp="2026-01-20 22:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:32.301823274 +0000 UTC m=+1444.622083562" watchObservedRunningTime="2026-01-20 22:59:32.303091225 +0000 UTC m=+1444.623351513" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.422991 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474501 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.474585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srn59\" (UniqueName: \"kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.575460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rqtc\" (UniqueName: \"kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: 
\"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle\") pod \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\" (UID: \"15247204-6ca3-4d52-ac9e-e8ba2e682d35\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srn59\" (UniqueName: \"kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.576038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.580264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.586406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts" (OuterVolumeSpecName: "scripts") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.587213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.588896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc" (OuterVolumeSpecName: "kube-api-access-5rqtc") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "kube-api-access-5rqtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.590223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.591340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.593299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.593953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.594892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srn59\" (UniqueName: \"kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59\") pod \"keystone-bootstrap-cx7lh\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.622221 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.628087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.679799 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rqtc\" (UniqueName: \"kubernetes.io/projected/15247204-6ca3-4d52-ac9e-e8ba2e682d35-kube-api-access-5rqtc\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.680160 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.680240 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.680318 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.680383 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15247204-6ca3-4d52-ac9e-e8ba2e682d35-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.701252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.728913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data" (OuterVolumeSpecName: "config-data") pod "15247204-6ca3-4d52-ac9e-e8ba2e682d35" (UID: "15247204-6ca3-4d52-ac9e-e8ba2e682d35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.771944 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.780784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.780843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.780884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lp6j\" (UniqueName: \"kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.780930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.781046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.781064 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle\") pod \"3e588d13-720f-45b7-aaa8-d339dc20f695\" (UID: \"3e588d13-720f-45b7-aaa8-d339dc20f695\") " Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.781365 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.781383 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15247204-6ca3-4d52-ac9e-e8ba2e682d35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.783108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.793815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts" (OuterVolumeSpecName: "scripts") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.795805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j" (OuterVolumeSpecName: "kube-api-access-4lp6j") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "kube-api-access-4lp6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.807432 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.820944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.889734 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.889983 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.889992 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.890005 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lp6j\" (UniqueName: \"kubernetes.io/projected/3e588d13-720f-45b7-aaa8-d339dc20f695-kube-api-access-4lp6j\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.890015 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e588d13-720f-45b7-aaa8-d339dc20f695-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.966562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data" (OuterVolumeSpecName: "config-data") pod "3e588d13-720f-45b7-aaa8-d339dc20f695" (UID: "3e588d13-720f-45b7-aaa8-d339dc20f695"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:32 crc kubenswrapper[5030]: I0120 22:59:32.991290 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e588d13-720f-45b7-aaa8-d339dc20f695-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.275643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerStarted","Data":"cf2a56f804091f8d287a400f7f7a6cd9c78c7a43c30bba4d5b9ef210fd03c3b7"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.277420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" event={"ID":"3e588d13-720f-45b7-aaa8-d339dc20f695","Type":"ContainerDied","Data":"2e055d2541edb88f6ffee86733420bae13f69cbba28736eecba171ae51521dec"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.277450 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e055d2541edb88f6ffee86733420bae13f69cbba28736eecba171ae51521dec" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.277497 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-vbdht" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.280826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerStarted","Data":"20bded7c0405fc45f3c485eca633da7c4fe33c8e4a392d411706cb5a621b8095"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.281426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.281454 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.286208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerStarted","Data":"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.286263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerStarted","Data":"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.286276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerStarted","Data":"948b9845ccaabe9a087c6af8f47ee8142ec39a299c118713f401e3da952e8dc2"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.286805 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.287875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" 
event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerStarted","Data":"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.287914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerStarted","Data":"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.288429 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.288555 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.296576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerStarted","Data":"3fcdf58baca53e88af4e903ab1f51ec63e8c6a04e6a3019435c78eaefad8028e"} Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.298045 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.372428 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: E0120 22:59:33.373052 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-notification-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-notification-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: E0120 22:59:33.373089 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="proxy-httpd" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="proxy-httpd" Jan 20 22:59:33 crc kubenswrapper[5030]: E0120 22:59:33.373125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e588d13-720f-45b7-aaa8-d339dc20f695" containerName="cinder-db-sync" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373132 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e588d13-720f-45b7-aaa8-d339dc20f695" containerName="cinder-db-sync" Jan 20 22:59:33 crc kubenswrapper[5030]: E0120 22:59:33.373145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-central-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-central-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: E0120 22:59:33.373165 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="sg-core" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373170 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="sg-core" Jan 20 
22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373510 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-notification-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373528 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="proxy-httpd" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373536 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="ceilometer-central-agent" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373546 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e588d13-720f-45b7-aaa8-d339dc20f695" containerName="cinder-db-sync" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.373553 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" containerName="sg-core" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.374629 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" podStartSLOduration=3.374591402 podStartE2EDuration="3.374591402s" podCreationTimestamp="2026-01-20 22:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:33.302506729 +0000 UTC m=+1445.622767017" watchObservedRunningTime="2026-01-20 22:59:33.374591402 +0000 UTC m=+1445.694851690" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.374765 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.381233 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-j55hb" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.381562 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.381648 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.381854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.410316 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.411274 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" podStartSLOduration=3.411265153 podStartE2EDuration="3.411265153s" podCreationTimestamp="2026-01-20 22:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:33.374048388 +0000 UTC m=+1445.694308676" watchObservedRunningTime="2026-01-20 22:59:33.411265153 +0000 UTC m=+1445.731525441" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.441530 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.451944 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.477514 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cx7lh"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp22t\" (UniqueName: \"kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.501277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.517179 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.519256 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.523462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.528179 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.529672 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.530186 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.538844 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.558743 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" podStartSLOduration=2.558719168 podStartE2EDuration="2.558719168s" podCreationTimestamp="2026-01-20 22:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:33.440566716 +0000 UTC m=+1445.760827004" watchObservedRunningTime="2026-01-20 22:59:33.558719168 +0000 UTC m=+1445.878979456" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.569579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.579335 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.596399 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" podStartSLOduration=3.596374503 podStartE2EDuration="3.596374503s" podCreationTimestamp="2026-01-20 22:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:33.493194205 +0000 UTC m=+1445.813454503" watchObservedRunningTime="2026-01-20 22:59:33.596374503 +0000 UTC m=+1445.916634791" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp22t\" (UniqueName: \"kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602927 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.602946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdvwz\" (UniqueName: \"kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.603248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.606811 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" podStartSLOduration=3.606781977 podStartE2EDuration="3.606781977s" podCreationTimestamp="2026-01-20 22:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:33.52134469 +0000 UTC m=+1445.841604978" watchObservedRunningTime="2026-01-20 22:59:33.606781977 +0000 UTC m=+1445.927042265" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.612512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.612551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.612741 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.617244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.619890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp22t\" (UniqueName: \"kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t\") pod \"cinder-scheduler-0\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.695959 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdvwz\" (UniqueName: \"kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705405 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.705749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.706487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.713232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.715365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.715521 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" 
Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.718459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.725490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdvwz\" (UniqueName: \"kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz\") pod \"ceilometer-0\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.807674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808108 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.808500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.809059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.814214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.814301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.817659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.818088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.825755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s\") pod \"cinder-api-0\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.872942 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.897138 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.990546 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15247204-6ca3-4d52-ac9e-e8ba2e682d35" path="/var/lib/kubelet/pods/15247204-6ca3-4d52-ac9e-e8ba2e682d35/volumes" Jan 20 22:59:33 crc kubenswrapper[5030]: I0120 22:59:33.991375 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfdc05b-f070-4e63-9606-c7f604a5ccda" path="/var/lib/kubelet/pods/cdfdc05b-f070-4e63-9606-c7f604a5ccda/volumes" Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.137661 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:34 crc kubenswrapper[5030]: W0120 22:59:34.157789 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2978dafa_f9c4_4170_beb7_006349acb611.slice/crio-49d5778dde06fe428ffc68850ffd5bbf09e41774142cec04b6413d1a49cd706d WatchSource:0}: Error finding container 49d5778dde06fe428ffc68850ffd5bbf09e41774142cec04b6413d1a49cd706d: Status 404 returned error can't find the container with id 49d5778dde06fe428ffc68850ffd5bbf09e41774142cec04b6413d1a49cd706d Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.307385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerStarted","Data":"49d5778dde06fe428ffc68850ffd5bbf09e41774142cec04b6413d1a49cd706d"} Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.310034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" event={"ID":"70aec167-1edb-4bc8-8170-f769f2672134","Type":"ContainerStarted","Data":"838a40628f24c61170dd8dd7851a372a7e7957c4115606ae38fb7c89ecebe31e"} Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.310073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" event={"ID":"70aec167-1edb-4bc8-8170-f769f2672134","Type":"ContainerStarted","Data":"6b4610eff9216e0f6d854cc5a35fc3d8807fe004703aa18e9f7f1608339bd76f"} Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.328341 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:34 crc kubenswrapper[5030]: I0120 22:59:34.333516 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" podStartSLOduration=2.333499173 podStartE2EDuration="2.333499173s" podCreationTimestamp="2026-01-20 22:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:34.325392675 +0000 UTC m=+1446.645652963" watchObservedRunningTime="2026-01-20 22:59:34.333499173 +0000 UTC m=+1446.653759451" Jan 20 22:59:34 crc kubenswrapper[5030]: W0120 22:59:34.335154 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f336347_0f81_4441_96ab_61163b122739.slice/crio-f7fdf183f09b316675cfc03683f8a42f84ecc0ddbadf2761626640721e6718df WatchSource:0}: Error finding container f7fdf183f09b316675cfc03683f8a42f84ecc0ddbadf2761626640721e6718df: Status 404 returned error can't find the container with id f7fdf183f09b316675cfc03683f8a42f84ecc0ddbadf2761626640721e6718df Jan 20 22:59:34 crc 
kubenswrapper[5030]: I0120 22:59:34.452774 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 22:59:35 crc kubenswrapper[5030]: I0120 22:59:35.323087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerStarted","Data":"ecfd3c5416b4086094f57cfe9cc2b60f08ae34daa6e97879254febfac8e1fd01"} Jan 20 22:59:35 crc kubenswrapper[5030]: I0120 22:59:35.323592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerStarted","Data":"f7fdf183f09b316675cfc03683f8a42f84ecc0ddbadf2761626640721e6718df"} Jan 20 22:59:35 crc kubenswrapper[5030]: I0120 22:59:35.327445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerStarted","Data":"85749da8118a19d4ee6beacddb1b7b4dc39a928829589089c98b6d3551dd0bdb"} Jan 20 22:59:35 crc kubenswrapper[5030]: I0120 22:59:35.333002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerStarted","Data":"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b"} Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.342663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerStarted","Data":"7f8a29c4096b03ee93407a0ec983ab93df3073848d2a78028289c39b392b6a6a"} Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.343077 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.344441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerStarted","Data":"100287f0b6f74ab7203d3a8bd63df77c1e5592cc2b3a660adcb64f49efadfb64"} Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.346070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerStarted","Data":"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b"} Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.369519 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.369502257 podStartE2EDuration="3.369502257s" podCreationTimestamp="2026-01-20 22:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:36.366472674 +0000 UTC m=+1448.686732962" watchObservedRunningTime="2026-01-20 22:59:36.369502257 +0000 UTC m=+1448.689762545" Jan 20 22:59:36 crc kubenswrapper[5030]: I0120 22:59:36.393924 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.39390556 podStartE2EDuration="3.39390556s" podCreationTimestamp="2026-01-20 22:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:36.39060652 +0000 UTC m=+1448.710866808" 
watchObservedRunningTime="2026-01-20 22:59:36.39390556 +0000 UTC m=+1448.714165848" Jan 20 22:59:37 crc kubenswrapper[5030]: I0120 22:59:37.359670 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerStarted","Data":"8dce74e269908d3d2a775a636d4a4986668541ea1b915b14ee03edd033b8c98c"} Jan 20 22:59:37 crc kubenswrapper[5030]: I0120 22:59:37.360012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerStarted","Data":"d8c5df5d409b433301543b513acb94e59d487da2998ab3b5dfa1af569b6de93b"} Jan 20 22:59:37 crc kubenswrapper[5030]: I0120 22:59:37.361559 5030 generic.go:334] "Generic (PLEG): container finished" podID="70aec167-1edb-4bc8-8170-f769f2672134" containerID="838a40628f24c61170dd8dd7851a372a7e7957c4115606ae38fb7c89ecebe31e" exitCode=0 Jan 20 22:59:37 crc kubenswrapper[5030]: I0120 22:59:37.361587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" event={"ID":"70aec167-1edb-4bc8-8170-f769f2672134","Type":"ContainerDied","Data":"838a40628f24c61170dd8dd7851a372a7e7957c4115606ae38fb7c89ecebe31e"} Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.696164 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.791674 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srn59\" (UniqueName: \"kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900302 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.900474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys\") pod \"70aec167-1edb-4bc8-8170-f769f2672134\" (UID: \"70aec167-1edb-4bc8-8170-f769f2672134\") " Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.908693 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.910329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59" (OuterVolumeSpecName: "kube-api-access-srn59") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "kube-api-access-srn59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.913174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.913232 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts" (OuterVolumeSpecName: "scripts") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.926993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.942785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data" (OuterVolumeSpecName: "config-data") pod "70aec167-1edb-4bc8-8170-f769f2672134" (UID: "70aec167-1edb-4bc8-8170-f769f2672134"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.987095 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.987313 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api-log" containerID="cri-o://ecfd3c5416b4086094f57cfe9cc2b60f08ae34daa6e97879254febfac8e1fd01" gracePeriod=30 Jan 20 22:59:38 crc kubenswrapper[5030]: I0120 22:59:38.987733 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api" containerID="cri-o://7f8a29c4096b03ee93407a0ec983ab93df3073848d2a78028289c39b392b6a6a" gracePeriod=30 Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002176 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srn59\" (UniqueName: \"kubernetes.io/projected/70aec167-1edb-4bc8-8170-f769f2672134-kube-api-access-srn59\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002208 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002219 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002228 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002236 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.002245 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70aec167-1edb-4bc8-8170-f769f2672134-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.401434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" event={"ID":"70aec167-1edb-4bc8-8170-f769f2672134","Type":"ContainerDied","Data":"6b4610eff9216e0f6d854cc5a35fc3d8807fe004703aa18e9f7f1608339bd76f"} Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.401740 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4610eff9216e0f6d854cc5a35fc3d8807fe004703aa18e9f7f1608339bd76f" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.401798 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cx7lh" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.419180 5030 generic.go:334] "Generic (PLEG): container finished" podID="0f336347-0f81-4441-96ab-61163b122739" containerID="7f8a29c4096b03ee93407a0ec983ab93df3073848d2a78028289c39b392b6a6a" exitCode=0 Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.419210 5030 generic.go:334] "Generic (PLEG): container finished" podID="0f336347-0f81-4441-96ab-61163b122739" containerID="ecfd3c5416b4086094f57cfe9cc2b60f08ae34daa6e97879254febfac8e1fd01" exitCode=143 Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.419270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerDied","Data":"7f8a29c4096b03ee93407a0ec983ab93df3073848d2a78028289c39b392b6a6a"} Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.419297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerDied","Data":"ecfd3c5416b4086094f57cfe9cc2b60f08ae34daa6e97879254febfac8e1fd01"} Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.424561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerStarted","Data":"a0802d9d36c2d6e26de2172094c5bdc0c08c4a0572a400a5f5bd5a01b3251e12"} Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.424765 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.446740 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.58022918 podStartE2EDuration="6.446725404s" podCreationTimestamp="2026-01-20 22:59:33 +0000 UTC" firstStartedPulling="2026-01-20 22:59:34.456120173 +0000 UTC m=+1446.776380461" lastFinishedPulling="2026-01-20 22:59:38.322616367 +0000 UTC m=+1450.642876685" observedRunningTime="2026-01-20 22:59:39.445670368 +0000 UTC m=+1451.765930656" watchObservedRunningTime="2026-01-20 22:59:39.446725404 +0000 UTC m=+1451.766985692" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.481337 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.481646 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.490654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.490779 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.527233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.535679 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.538083 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.544407 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.612741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs" (OuterVolumeSpecName: "logs") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.613085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts\") pod \"0f336347-0f81-4441-96ab-61163b122739\" (UID: \"0f336347-0f81-4441-96ab-61163b122739\") " Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.613534 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f336347-0f81-4441-96ab-61163b122739-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.613108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.625772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.626441 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 22:59:39 crc kubenswrapper[5030]: E0120 22:59:39.626926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70aec167-1edb-4bc8-8170-f769f2672134" containerName="keystone-bootstrap" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.626953 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70aec167-1edb-4bc8-8170-f769f2672134" containerName="keystone-bootstrap" Jan 20 22:59:39 crc kubenswrapper[5030]: E0120 22:59:39.626975 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.626984 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api" Jan 20 22:59:39 crc kubenswrapper[5030]: E0120 22:59:39.627021 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api-log" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.627029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api-log" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.626451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s" (OuterVolumeSpecName: "kube-api-access-7zj7s") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "kube-api-access-7zj7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.627229 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api-log" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.627261 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f336347-0f81-4441-96ab-61163b122739" containerName="cinder-api" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.627278 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70aec167-1edb-4bc8-8170-f769f2672134" containerName="keystone-bootstrap" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.627986 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.637167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts" (OuterVolumeSpecName: "scripts") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.637203 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.637400 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.637482 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-5nbxn" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.637662 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.648389 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.648600 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.674662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvw9\" (UniqueName: \"kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715422 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715435 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715445 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715453 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f336347-0f81-4441-96ab-61163b122739-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.715463 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zj7s\" (UniqueName: \"kubernetes.io/projected/0f336347-0f81-4441-96ab-61163b122739-kube-api-access-7zj7s\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.728125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.789744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data" (OuterVolumeSpecName: "config-data") pod "0f336347-0f81-4441-96ab-61163b122739" (UID: "0f336347-0f81-4441-96ab-61163b122739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.807608 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvw9\" (UniqueName: \"kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " 
pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818701 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.818820 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f336347-0f81-4441-96ab-61163b122739-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.827721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.828756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.835127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.842149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.844245 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.844473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.846274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:39 crc kubenswrapper[5030]: I0120 22:59:39.856095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvw9\" (UniqueName: \"kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9\") pod \"keystone-6765d59f9b-2rs69\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.051356 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.439500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.439537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"0f336347-0f81-4441-96ab-61163b122739","Type":"ContainerDied","Data":"f7fdf183f09b316675cfc03683f8a42f84ecc0ddbadf2761626640721e6718df"} Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.439990 5030 scope.go:117] "RemoveContainer" containerID="7f8a29c4096b03ee93407a0ec983ab93df3073848d2a78028289c39b392b6a6a" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.440744 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.440774 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.440815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.440833 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.472541 5030 scope.go:117] "RemoveContainer" containerID="ecfd3c5416b4086094f57cfe9cc2b60f08ae34daa6e97879254febfac8e1fd01" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.472676 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.482769 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 
22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.497177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.509929 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.512408 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.516562 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.516770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.516778 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.527540 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.628238 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.629602 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.633073 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.633318 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.636646 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.636742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637358 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67rg\" (UniqueName: \"kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.637640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.684587 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmfp\" (UniqueName: \"kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp\") pod \"neutron-57bd57cf66-fdt4n\" (UID: 
\"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739866 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67rg\" (UniqueName: \"kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.739992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 
22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740164 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.740241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.744310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.747882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.749988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.750418 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.750583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.751180 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.752387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.762035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67rg\" (UniqueName: \"kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg\") pod \"cinder-api-0\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmfp\" (UniqueName: \"kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.842880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" 
Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.847549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.848220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.849133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.849259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.850006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.852182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:40 crc kubenswrapper[5030]: I0120 22:59:40.867225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmfp\" (UniqueName: \"kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp\") pod \"neutron-57bd57cf66-fdt4n\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.027296 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.060665 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.317022 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 22:59:41 crc kubenswrapper[5030]: W0120 22:59:41.321157 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b9cc70_f5f7_4bf6_9bc7_15a2c706a1a8.slice/crio-c0e02a10d3feda6c69bd09d2d18eef87657d7324fa8faaf320b57694bf208acf WatchSource:0}: Error finding container c0e02a10d3feda6c69bd09d2d18eef87657d7324fa8faaf320b57694bf208acf: Status 404 returned error can't find the container with id c0e02a10d3feda6c69bd09d2d18eef87657d7324fa8faaf320b57694bf208acf Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.467553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8","Type":"ContainerStarted","Data":"c0e02a10d3feda6c69bd09d2d18eef87657d7324fa8faaf320b57694bf208acf"} Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.471895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" event={"ID":"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d","Type":"ContainerStarted","Data":"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2"} Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.471935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" event={"ID":"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d","Type":"ContainerStarted","Data":"2fdf0d827c5cfc2af154771cf82f329e30b020edd426413fc0e8ecc69b819bbe"} Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.494540 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" podStartSLOduration=2.494523815 podStartE2EDuration="2.494523815s" podCreationTimestamp="2026-01-20 22:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:41.487723699 +0000 UTC m=+1453.807983987" watchObservedRunningTime="2026-01-20 22:59:41.494523815 +0000 UTC m=+1453.814784103" Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.583891 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 22:59:41 crc kubenswrapper[5030]: I0120 22:59:41.984999 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f336347-0f81-4441-96ab-61163b122739" path="/var/lib/kubelet/pods/0f336347-0f81-4441-96ab-61163b122739/volumes" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.273896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.394068 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.484023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerStarted","Data":"dd92b5dd5973bf0786d8d8b9c80aff684fc1a3124f9a956ceefe2ee92d51545f"} Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.484058 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerStarted","Data":"25d71b9ff6b3918f06b7ac2f245f258bfaef55aef059830afb2f1b222de24f94"} Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.484071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerStarted","Data":"8c6bcac09d4231a5bf5dba205344abd8281bbaca28674d737f171a31a10e77c4"} Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.484904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.490296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8","Type":"ContainerStarted","Data":"c011608dc0cda953d39ae86ce2df8b44ddec5366f2481a6fa9504f761cc6409a"} Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.490336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.500805 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.501135 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.503203 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" podStartSLOduration=2.503091834 podStartE2EDuration="2.503091834s" podCreationTimestamp="2026-01-20 22:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:42.50134182 +0000 UTC m=+1454.821602108" watchObservedRunningTime="2026-01-20 22:59:42.503091834 +0000 UTC m=+1454.823352132" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.503487 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.507121 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.507219 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 22:59:42 crc kubenswrapper[5030]: I0120 22:59:42.508114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.499147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8","Type":"ContainerStarted","Data":"1e2c3887d4bfb803bee6d88be2bd09aad12d8e435404a6d4a7f84152d7fd324e"} Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.519934 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.5199167019999997 podStartE2EDuration="3.519916702s" podCreationTimestamp="2026-01-20 22:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:43.518419306 +0000 UTC m=+1455.838679594" watchObservedRunningTime="2026-01-20 22:59:43.519916702 +0000 UTC m=+1455.840176990" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.554976 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.556281 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.558018 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.558425 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.568246 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg92w\" (UniqueName: \"kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: 
\"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.698850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.800853 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg92w\" (UniqueName: \"kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.801336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.808176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.816414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.816579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.816872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.817064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.818680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg92w\" (UniqueName: \"kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w\") pod \"barbican-api-84f4bb648-q4s7q\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.870465 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:43 crc kubenswrapper[5030]: I0120 22:59:43.953503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.021270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.368027 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 22:59:44 crc kubenswrapper[5030]: W0120 22:59:44.370031 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0188ec37_5fa0_4c9c_82ea_7eb7010c88d3.slice/crio-c8d370e6dfb97831885b9d5f61d78141e1999f187cef03777e6c1d3940c10aee WatchSource:0}: Error finding container c8d370e6dfb97831885b9d5f61d78141e1999f187cef03777e6c1d3940c10aee: Status 404 returned error can't find the container with id c8d370e6dfb97831885b9d5f61d78141e1999f187cef03777e6c1d3940c10aee Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.509471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerStarted","Data":"3bdf48c04934d54f7661f747d57939590f3632f793a8c0aa8ae04262dd469c85"} Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.509865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.509890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerStarted","Data":"c8d370e6dfb97831885b9d5f61d78141e1999f187cef03777e6c1d3940c10aee"} Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.509873 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="probe" containerID="cri-o://955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b" gracePeriod=30 Jan 20 22:59:44 crc kubenswrapper[5030]: I0120 22:59:44.510497 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="cinder-scheduler" containerID="cri-o://e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b" gracePeriod=30 Jan 20 22:59:45 crc kubenswrapper[5030]: I0120 22:59:45.528509 5030 generic.go:334] "Generic (PLEG): container finished" podID="2978dafa-f9c4-4170-beb7-006349acb611" containerID="955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b" exitCode=0 Jan 20 22:59:45 crc kubenswrapper[5030]: I0120 22:59:45.528648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerDied","Data":"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b"} Jan 20 22:59:45 crc kubenswrapper[5030]: I0120 22:59:45.533576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" 
event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerStarted","Data":"abe1b9d9c501552d8beeada5cb25d3fbed77f8cbf429b1c1351a006527292ef3"} Jan 20 22:59:45 crc kubenswrapper[5030]: I0120 22:59:45.557097 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" podStartSLOduration=2.557075985 podStartE2EDuration="2.557075985s" podCreationTimestamp="2026-01-20 22:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:45.555598859 +0000 UTC m=+1457.875859147" watchObservedRunningTime="2026-01-20 22:59:45.557075985 +0000 UTC m=+1457.877336293" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.334603 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.457889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458068 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458124 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458223 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp22t\" (UniqueName: \"kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t\") pod \"2978dafa-f9c4-4170-beb7-006349acb611\" (UID: \"2978dafa-f9c4-4170-beb7-006349acb611\") " Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.458936 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2978dafa-f9c4-4170-beb7-006349acb611-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.469331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t" (OuterVolumeSpecName: "kube-api-access-pp22t") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "kube-api-access-pp22t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.469736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts" (OuterVolumeSpecName: "scripts") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.470526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.527361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547637 5030 generic.go:334] "Generic (PLEG): container finished" podID="2978dafa-f9c4-4170-beb7-006349acb611" containerID="e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b" exitCode=0 Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547715 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerDied","Data":"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b"} Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"2978dafa-f9c4-4170-beb7-006349acb611","Type":"ContainerDied","Data":"49d5778dde06fe428ffc68850ffd5bbf09e41774142cec04b6413d1a49cd706d"} Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547813 5030 scope.go:117] "RemoveContainer" containerID="955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.547855 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.548467 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.560258 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.560296 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.560310 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.560326 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp22t\" (UniqueName: \"kubernetes.io/projected/2978dafa-f9c4-4170-beb7-006349acb611-kube-api-access-pp22t\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.583870 5030 scope.go:117] "RemoveContainer" containerID="e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.608672 5030 scope.go:117] "RemoveContainer" containerID="955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b" Jan 20 22:59:46 crc kubenswrapper[5030]: E0120 22:59:46.609078 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b\": container with ID starting with 955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b not found: ID does not exist" containerID="955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.609111 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b"} err="failed to get container status \"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b\": rpc error: code = NotFound desc = could not find container 
\"955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b\": container with ID starting with 955d210467a70be6c11932f1fd8727d14c75ab7051cd8f6fd2776946289e070b not found: ID does not exist" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.609134 5030 scope.go:117] "RemoveContainer" containerID="e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b" Jan 20 22:59:46 crc kubenswrapper[5030]: E0120 22:59:46.609369 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b\": container with ID starting with e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b not found: ID does not exist" containerID="e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.609401 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b"} err="failed to get container status \"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b\": rpc error: code = NotFound desc = could not find container \"e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b\": container with ID starting with e1f106cc826e8d122e21fc6f9520fcd6d24aeb2b09b2973e566ef48c044cc50b not found: ID does not exist" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.612412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data" (OuterVolumeSpecName: "config-data") pod "2978dafa-f9c4-4170-beb7-006349acb611" (UID: "2978dafa-f9c4-4170-beb7-006349acb611"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.661682 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2978dafa-f9c4-4170-beb7-006349acb611-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.897169 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.904770 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.918542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:46 crc kubenswrapper[5030]: E0120 22:59:46.918900 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="probe" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.918921 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="probe" Jan 20 22:59:46 crc kubenswrapper[5030]: E0120 22:59:46.918947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="cinder-scheduler" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.918957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="cinder-scheduler" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.919180 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="cinder-scheduler" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.919205 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2978dafa-f9c4-4170-beb7-006349acb611" containerName="probe" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.920085 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.923183 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 22:59:46 crc kubenswrapper[5030]: I0120 22:59:46.936347 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069635 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzr7\" (UniqueName: \"kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.069990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzr7\" (UniqueName: \"kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.172780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.173498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.177649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.178393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.178502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.187096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.210219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-tgzr7\" (UniqueName: \"kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7\") pod \"cinder-scheduler-0\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.237005 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.666804 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 22:59:47 crc kubenswrapper[5030]: W0120 22:59:47.670417 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf87ad01b_11b8_43e1_9a04_5947ff6b0441.slice/crio-6076955485ede4c367e36abab60b2f3dbe9f3f386fab65d2c32b33b048ad2447 WatchSource:0}: Error finding container 6076955485ede4c367e36abab60b2f3dbe9f3f386fab65d2c32b33b048ad2447: Status 404 returned error can't find the container with id 6076955485ede4c367e36abab60b2f3dbe9f3f386fab65d2c32b33b048ad2447 Jan 20 22:59:47 crc kubenswrapper[5030]: I0120 22:59:47.988329 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2978dafa-f9c4-4170-beb7-006349acb611" path="/var/lib/kubelet/pods/2978dafa-f9c4-4170-beb7-006349acb611/volumes" Jan 20 22:59:48 crc kubenswrapper[5030]: I0120 22:59:48.577239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerStarted","Data":"54990945a497a45075119ed3dc036df2589703681e3a56dcfccafab5ebeb782b"} Jan 20 22:59:48 crc kubenswrapper[5030]: I0120 22:59:48.577293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerStarted","Data":"6076955485ede4c367e36abab60b2f3dbe9f3f386fab65d2c32b33b048ad2447"} Jan 20 22:59:49 crc kubenswrapper[5030]: I0120 22:59:49.594559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerStarted","Data":"97c9b3af0a221749a5d3d0a48cb4ab7d818c47e0f19d9b433358dccfb2edecb5"} Jan 20 22:59:49 crc kubenswrapper[5030]: I0120 22:59:49.622929 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.622911654 podStartE2EDuration="3.622911654s" podCreationTimestamp="2026-01-20 22:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 22:59:49.615417441 +0000 UTC m=+1461.935677759" watchObservedRunningTime="2026-01-20 22:59:49.622911654 +0000 UTC m=+1461.943171942" Jan 20 22:59:52 crc kubenswrapper[5030]: I0120 22:59:52.237302 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:52 crc kubenswrapper[5030]: I0120 22:59:52.793922 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.172880 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.224531 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.298667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.298946 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api-log" containerID="cri-o://f914453f875b54ee6ab683459a0c75064627074a817073323f80398d441f2bb8" gracePeriod=30 Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.299110 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api" containerID="cri-o://20bded7c0405fc45f3c485eca633da7c4fe33c8e4a392d411706cb5a621b8095" gracePeriod=30 Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.684204 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerID="f914453f875b54ee6ab683459a0c75064627074a817073323f80398d441f2bb8" exitCode=143 Jan 20 22:59:55 crc kubenswrapper[5030]: I0120 22:59:55.684567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerDied","Data":"f914453f875b54ee6ab683459a0c75064627074a817073323f80398d441f2bb8"} Jan 20 22:59:57 crc kubenswrapper[5030]: I0120 22:59:57.483383 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 22:59:58 crc kubenswrapper[5030]: I0120 22:59:58.439135 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.242:9311/healthcheck\": read tcp 10.217.0.2:32978->10.217.0.242:9311: read: connection reset by peer" Jan 20 22:59:58 crc kubenswrapper[5030]: I0120 22:59:58.439218 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.242:9311/healthcheck\": read tcp 10.217.0.2:32994->10.217.0.242:9311: read: connection reset by peer" Jan 20 22:59:58 crc kubenswrapper[5030]: I0120 22:59:58.719429 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerID="20bded7c0405fc45f3c485eca633da7c4fe33c8e4a392d411706cb5a621b8095" exitCode=0 Jan 20 22:59:58 crc kubenswrapper[5030]: I0120 22:59:58.719551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerDied","Data":"20bded7c0405fc45f3c485eca633da7c4fe33c8e4a392d411706cb5a621b8095"} Jan 20 22:59:58 crc kubenswrapper[5030]: I0120 22:59:58.910880 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.017449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data\") pod \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.017582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom\") pod \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.017670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmddf\" (UniqueName: \"kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf\") pod \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.017733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle\") pod \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.017757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs\") pod \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\" (UID: \"3e38dceb-8a8c-4a9f-8f41-91bb9868510a\") " Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.019649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs" (OuterVolumeSpecName: "logs") pod "3e38dceb-8a8c-4a9f-8f41-91bb9868510a" (UID: "3e38dceb-8a8c-4a9f-8f41-91bb9868510a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.023902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e38dceb-8a8c-4a9f-8f41-91bb9868510a" (UID: "3e38dceb-8a8c-4a9f-8f41-91bb9868510a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.024192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf" (OuterVolumeSpecName: "kube-api-access-kmddf") pod "3e38dceb-8a8c-4a9f-8f41-91bb9868510a" (UID: "3e38dceb-8a8c-4a9f-8f41-91bb9868510a"). InnerVolumeSpecName "kube-api-access-kmddf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.063194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e38dceb-8a8c-4a9f-8f41-91bb9868510a" (UID: "3e38dceb-8a8c-4a9f-8f41-91bb9868510a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.081834 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data" (OuterVolumeSpecName: "config-data") pod "3e38dceb-8a8c-4a9f-8f41-91bb9868510a" (UID: "3e38dceb-8a8c-4a9f-8f41-91bb9868510a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.119580 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.119646 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmddf\" (UniqueName: \"kubernetes.io/projected/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-kube-api-access-kmddf\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.119662 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.119673 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.119686 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e38dceb-8a8c-4a9f-8f41-91bb9868510a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.735816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" event={"ID":"3e38dceb-8a8c-4a9f-8f41-91bb9868510a","Type":"ContainerDied","Data":"14550e8bdeb6cf43c6c32cd1cb27c1f76f8e4a904b422f81d9a6d70e781662d9"} Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.736305 5030 scope.go:117] "RemoveContainer" containerID="20bded7c0405fc45f3c485eca633da7c4fe33c8e4a392d411706cb5a621b8095" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.736497 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.789511 5030 scope.go:117] "RemoveContainer" containerID="f914453f875b54ee6ab683459a0c75064627074a817073323f80398d441f2bb8" Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.799460 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.807193 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-7ff8bf756d-jk7l2"] Jan 20 22:59:59 crc kubenswrapper[5030]: I0120 22:59:59.975006 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" path="/var/lib/kubelet/pods/3e38dceb-8a8c-4a9f-8f41-91bb9868510a/volumes" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.151902 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng"] Jan 20 23:00:00 crc kubenswrapper[5030]: E0120 23:00:00.152465 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api-log" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.152497 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api-log" Jan 20 23:00:00 crc kubenswrapper[5030]: E0120 23:00:00.152534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.152548 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.152869 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api-log" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.152911 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e38dceb-8a8c-4a9f-8f41-91bb9868510a" containerName="barbican-api" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.153869 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.158064 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.158082 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.194182 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng"] Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.243556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg99f\" (UniqueName: \"kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.244047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.244111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.346246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.346696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.346921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg99f\" (UniqueName: \"kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.348000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume\") pod 
\"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.351838 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.378242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg99f\" (UniqueName: \"kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f\") pod \"collect-profiles-29482500-fwwng\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.488793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:00 crc kubenswrapper[5030]: I0120 23:00:00.959659 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng"] Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.516725 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.518301 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.618050 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.764776 5030 generic.go:334] "Generic (PLEG): container finished" podID="69981745-6b95-4420-8607-6189df827955" containerID="bda6d746b7d14d48dfab7091c45f743116fdf4934edb9a99356585797691c9bd" exitCode=0 Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.764875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" event={"ID":"69981745-6b95-4420-8607-6189df827955","Type":"ContainerDied","Data":"bda6d746b7d14d48dfab7091c45f743116fdf4934edb9a99356585797691c9bd"} Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.764914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" event={"ID":"69981745-6b95-4420-8607-6189df827955","Type":"ContainerStarted","Data":"d8465efdca9432ccd4cb2016a54a4051f462b9ea6522822b46d2f54607774055"} Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.930611 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 23:00:01 crc kubenswrapper[5030]: I0120 23:00:01.938061 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 23:00:02 crc kubenswrapper[5030]: I0120 23:00:02.079089 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 23:00:02 crc kubenswrapper[5030]: I0120 23:00:02.774139 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-log" containerID="cri-o://612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67" gracePeriod=30 Jan 20 23:00:02 crc kubenswrapper[5030]: I0120 23:00:02.774216 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-api" containerID="cri-o://618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5" gracePeriod=30 Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.104788 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.199393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume\") pod \"69981745-6b95-4420-8607-6189df827955\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.199812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg99f\" (UniqueName: \"kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f\") pod \"69981745-6b95-4420-8607-6189df827955\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.199839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume\") pod \"69981745-6b95-4420-8607-6189df827955\" (UID: \"69981745-6b95-4420-8607-6189df827955\") " Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.200065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume" (OuterVolumeSpecName: "config-volume") pod "69981745-6b95-4420-8607-6189df827955" (UID: "69981745-6b95-4420-8607-6189df827955"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.200255 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69981745-6b95-4420-8607-6189df827955-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.205286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f" (OuterVolumeSpecName: "kube-api-access-tg99f") pod "69981745-6b95-4420-8607-6189df827955" (UID: "69981745-6b95-4420-8607-6189df827955"). InnerVolumeSpecName "kube-api-access-tg99f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.205368 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69981745-6b95-4420-8607-6189df827955" (UID: "69981745-6b95-4420-8607-6189df827955"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.302307 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg99f\" (UniqueName: \"kubernetes.io/projected/69981745-6b95-4420-8607-6189df827955-kube-api-access-tg99f\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.302552 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69981745-6b95-4420-8607-6189df827955-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.787087 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerID="612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67" exitCode=143 Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.787192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerDied","Data":"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67"} Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.789677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" event={"ID":"69981745-6b95-4420-8607-6189df827955","Type":"ContainerDied","Data":"d8465efdca9432ccd4cb2016a54a4051f462b9ea6522822b46d2f54607774055"} Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.789729 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.789783 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8465efdca9432ccd4cb2016a54a4051f462b9ea6522822b46d2f54607774055" Jan 20 23:00:03 crc kubenswrapper[5030]: I0120 23:00:03.882846 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.349043 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.461411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts\") pod \"0c547bf6-ed95-4bc8-8573-3047e4689484\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.461483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle\") pod \"0c547bf6-ed95-4bc8-8573-3047e4689484\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.461758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swghr\" (UniqueName: \"kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr\") pod \"0c547bf6-ed95-4bc8-8573-3047e4689484\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.461814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data\") pod \"0c547bf6-ed95-4bc8-8573-3047e4689484\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.461918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs\") pod \"0c547bf6-ed95-4bc8-8573-3047e4689484\" (UID: \"0c547bf6-ed95-4bc8-8573-3047e4689484\") " Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.462699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs" (OuterVolumeSpecName: "logs") pod "0c547bf6-ed95-4bc8-8573-3047e4689484" (UID: "0c547bf6-ed95-4bc8-8573-3047e4689484"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.467064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts" (OuterVolumeSpecName: "scripts") pod "0c547bf6-ed95-4bc8-8573-3047e4689484" (UID: "0c547bf6-ed95-4bc8-8573-3047e4689484"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.467195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr" (OuterVolumeSpecName: "kube-api-access-swghr") pod "0c547bf6-ed95-4bc8-8573-3047e4689484" (UID: "0c547bf6-ed95-4bc8-8573-3047e4689484"). InnerVolumeSpecName "kube-api-access-swghr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.522500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c547bf6-ed95-4bc8-8573-3047e4689484" (UID: "0c547bf6-ed95-4bc8-8573-3047e4689484"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.525988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data" (OuterVolumeSpecName: "config-data") pod "0c547bf6-ed95-4bc8-8573-3047e4689484" (UID: "0c547bf6-ed95-4bc8-8573-3047e4689484"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.564315 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swghr\" (UniqueName: \"kubernetes.io/projected/0c547bf6-ed95-4bc8-8573-3047e4689484-kube-api-access-swghr\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.564658 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.564893 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c547bf6-ed95-4bc8-8573-3047e4689484-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.565175 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.565314 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c547bf6-ed95-4bc8-8573-3047e4689484-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.837904 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerID="618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5" exitCode=0 Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.837983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerDied","Data":"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5"} Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.838034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" event={"ID":"0c547bf6-ed95-4bc8-8573-3047e4689484","Type":"ContainerDied","Data":"2c9572d1f3ae94ab6e57b82eb1f102f5c4b17430a168712d9171152cea0b0ae6"} Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.838073 5030 scope.go:117] "RemoveContainer" containerID="618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.838327 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-9855649bd-rgwm6" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.884959 5030 scope.go:117] "RemoveContainer" containerID="612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.901256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.915587 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-9855649bd-rgwm6"] Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.919062 5030 scope.go:117] "RemoveContainer" containerID="618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5" Jan 20 23:00:06 crc kubenswrapper[5030]: E0120 23:00:06.919775 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5\": container with ID starting with 618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5 not found: ID does not exist" containerID="618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.919921 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5"} err="failed to get container status \"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5\": rpc error: code = NotFound desc = could not find container \"618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5\": container with ID starting with 618c6b906a2fc068bead3717fa5d723efe80c68d1b2e2c9f5efe125a0f7571a5 not found: ID does not exist" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.920072 5030 scope.go:117] "RemoveContainer" containerID="612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67" Jan 20 23:00:06 crc kubenswrapper[5030]: E0120 23:00:06.920577 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67\": container with ID starting with 612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67 not found: ID does not exist" containerID="612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67" Jan 20 23:00:06 crc kubenswrapper[5030]: I0120 23:00:06.920686 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67"} err="failed to get container status \"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67\": rpc error: code = NotFound desc = could not find container \"612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67\": container with ID starting with 612c19ede81e6b0e21637082653d323f14d40b7293d3a145df76bd10f3f66c67 not found: ID does not exist" Jan 20 23:00:07 crc kubenswrapper[5030]: I0120 23:00:07.972749 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" path="/var/lib/kubelet/pods/0c547bf6-ed95-4bc8-8573-3047e4689484/volumes" Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.079020 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 23:00:11 crc 
kubenswrapper[5030]: I0120 23:00:11.168166 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.168406 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-api" containerID="cri-o://7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e" gracePeriod=30 Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.168476 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-httpd" containerID="cri-o://b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777" gracePeriod=30 Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.502135 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.903288 5030 generic.go:334] "Generic (PLEG): container finished" podID="c840a11b-13a9-494d-804d-ab91e0300073" containerID="b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777" exitCode=0 Jan 20 23:00:11 crc kubenswrapper[5030]: I0120 23:00:11.903362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerDied","Data":"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777"} Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.123256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.124050 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-central-agent" containerID="cri-o://100287f0b6f74ab7203d3a8bd63df77c1e5592cc2b3a660adcb64f49efadfb64" gracePeriod=30 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.124094 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="sg-core" containerID="cri-o://8dce74e269908d3d2a775a636d4a4986668541ea1b915b14ee03edd033b8c98c" gracePeriod=30 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.124132 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-notification-agent" containerID="cri-o://d8c5df5d409b433301543b513acb94e59d487da2998ab3b5dfa1af569b6de93b" gracePeriod=30 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.124360 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="proxy-httpd" containerID="cri-o://a0802d9d36c2d6e26de2172094c5bdc0c08c4a0572a400a5f5bd5a01b3251e12" gracePeriod=30 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.209052 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:00:15 crc kubenswrapper[5030]: E0120 23:00:15.209919 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="69981745-6b95-4420-8607-6189df827955" containerName="collect-profiles" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.209935 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69981745-6b95-4420-8607-6189df827955" containerName="collect-profiles" Jan 20 23:00:15 crc kubenswrapper[5030]: E0120 23:00:15.209948 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-api" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.209955 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-api" Jan 20 23:00:15 crc kubenswrapper[5030]: E0120 23:00:15.209980 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-log" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.209987 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-log" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.210153 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69981745-6b95-4420-8607-6189df827955" containerName="collect-profiles" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.210166 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-log" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.210172 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c547bf6-ed95-4bc8-8573-3047e4689484" containerName="placement-api" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.211056 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.213583 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.213938 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.220038 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.249173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330688 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd\") pod 
\"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.330901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgg8t\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.432765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgg8t\" (UniqueName: 
\"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.433983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.434207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.439871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.439997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.442090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.444291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.449804 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgg8t\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.453820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data\") pod \"swift-proxy-f55649dc6-gfkb8\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.528300 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948304 5030 generic.go:334] "Generic (PLEG): container finished" podID="60afc403-7a57-4953-b772-938b04583542" containerID="a0802d9d36c2d6e26de2172094c5bdc0c08c4a0572a400a5f5bd5a01b3251e12" exitCode=0 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948539 5030 generic.go:334] "Generic (PLEG): container finished" podID="60afc403-7a57-4953-b772-938b04583542" containerID="8dce74e269908d3d2a775a636d4a4986668541ea1b915b14ee03edd033b8c98c" exitCode=2 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948547 5030 generic.go:334] "Generic (PLEG): container finished" podID="60afc403-7a57-4953-b772-938b04583542" containerID="100287f0b6f74ab7203d3a8bd63df77c1e5592cc2b3a660adcb64f49efadfb64" exitCode=0 Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerDied","Data":"a0802d9d36c2d6e26de2172094c5bdc0c08c4a0572a400a5f5bd5a01b3251e12"} Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerDied","Data":"8dce74e269908d3d2a775a636d4a4986668541ea1b915b14ee03edd033b8c98c"} Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.948600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerDied","Data":"100287f0b6f74ab7203d3a8bd63df77c1e5592cc2b3a660adcb64f49efadfb64"} Jan 20 23:00:15 crc kubenswrapper[5030]: I0120 23:00:15.976263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.424416 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.425947 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.428897 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.428966 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-lc5qt" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.430555 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.433634 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.449542 5030 status_manager.go:875] "Failed to update status for pod" pod="openstack-kuttl-tests/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32df1baa-dcd2-4d09-8458-517d6ce782aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T23:00:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T23:00:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T23:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T23:00:16Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b4f8494513a3af102066fec5868ab167ac8664aceb2f0c639d7a0b60260a944\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqfbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T23:00:16Z\\\"}}\" for pod \"openstack-kuttl-tests\"/\"openstackclient\": pods \"openstackclient\" not found" Jan 20 23:00:16 crc 
kubenswrapper[5030]: I0120 23:00:16.462242 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: E0120 23:00:16.465700 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-gqfbf openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-gqfbf openstack-config openstack-config-secret]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.483463 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.509705 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.511546 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.538747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.542656 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.554753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.554794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69sz9\" (UniqueName: \"kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.554820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.554904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.638359 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.656587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.656750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.656792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69sz9\" (UniqueName: \"kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.656826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.657701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.661421 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.664878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.677500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69sz9\" (UniqueName: \"kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9\") pod \"openstackclient\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.758508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle\") pod \"c840a11b-13a9-494d-804d-ab91e0300073\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.758584 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs\") pod \"c840a11b-13a9-494d-804d-ab91e0300073\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.758606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config\") pod \"c840a11b-13a9-494d-804d-ab91e0300073\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.758652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config\") pod \"c840a11b-13a9-494d-804d-ab91e0300073\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.758708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckr6v\" (UniqueName: \"kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v\") pod \"c840a11b-13a9-494d-804d-ab91e0300073\" (UID: \"c840a11b-13a9-494d-804d-ab91e0300073\") " Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.762467 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v" (OuterVolumeSpecName: "kube-api-access-ckr6v") pod "c840a11b-13a9-494d-804d-ab91e0300073" (UID: "c840a11b-13a9-494d-804d-ab91e0300073"). InnerVolumeSpecName "kube-api-access-ckr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.763736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c840a11b-13a9-494d-804d-ab91e0300073" (UID: "c840a11b-13a9-494d-804d-ab91e0300073"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.809438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config" (OuterVolumeSpecName: "config") pod "c840a11b-13a9-494d-804d-ab91e0300073" (UID: "c840a11b-13a9-494d-804d-ab91e0300073"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.816554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c840a11b-13a9-494d-804d-ab91e0300073" (UID: "c840a11b-13a9-494d-804d-ab91e0300073"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.834752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c840a11b-13a9-494d-804d-ab91e0300073" (UID: "c840a11b-13a9-494d-804d-ab91e0300073"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.844742 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.868328 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.868363 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.868373 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.868382 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c840a11b-13a9-494d-804d-ab91e0300073-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.868392 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckr6v\" (UniqueName: \"kubernetes.io/projected/c840a11b-13a9-494d-804d-ab91e0300073-kube-api-access-ckr6v\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.962577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerStarted","Data":"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906"} Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.962609 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.962643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerStarted","Data":"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29"} Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.962654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.962664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerStarted","Data":"46b888c998f50e5e6e9383ccf6b9cab1b056e9c057649e106064ec6ac0cbd6fb"} Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.964362 5030 generic.go:334] "Generic (PLEG): container finished" podID="c840a11b-13a9-494d-804d-ab91e0300073" containerID="7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e" exitCode=0 Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.964427 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.964943 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.965299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerDied","Data":"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e"} Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.965362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-66c88c7dcc-g5x75" event={"ID":"c840a11b-13a9-494d-804d-ab91e0300073","Type":"ContainerDied","Data":"948b9845ccaabe9a087c6af8f47ee8142ec39a299c118713f401e3da952e8dc2"} Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.965387 5030 scope.go:117] "RemoveContainer" containerID="b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.980917 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.992150 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" Jan 20 23:00:16 crc kubenswrapper[5030]: I0120 23:00:16.996078 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" podStartSLOduration=1.9960558179999999 podStartE2EDuration="1.996055818s" podCreationTimestamp="2026-01-20 23:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:16.986560057 +0000 UTC m=+1489.306820345" watchObservedRunningTime="2026-01-20 23:00:16.996055818 +0000 UTC m=+1489.316316106" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.029406 5030 scope.go:117] "RemoveContainer" containerID="7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.029456 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.035863 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-66c88c7dcc-g5x75"] Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.058016 5030 scope.go:117] "RemoveContainer" containerID="b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777" Jan 20 23:00:17 crc kubenswrapper[5030]: E0120 23:00:17.062114 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777\": container with ID starting with b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777 not found: ID does not exist" containerID="b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.062244 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777"} err="failed to get container status \"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777\": rpc error: code = NotFound desc = could not find container 
\"b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777\": container with ID starting with b1a100196f52cd51df47685e166b707ce9ec8c7b991d38c1902ddd14662f3777 not found: ID does not exist" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.062326 5030 scope.go:117] "RemoveContainer" containerID="7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e" Jan 20 23:00:17 crc kubenswrapper[5030]: E0120 23:00:17.062962 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e\": container with ID starting with 7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e not found: ID does not exist" containerID="7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.063001 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e"} err="failed to get container status \"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e\": rpc error: code = NotFound desc = could not find container \"7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e\": container with ID starting with 7e59dd71b1509599776887729e46e1021780ba0d3f30a81c91a3255fe7bb5b8e not found: ID does not exist" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.275234 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:00:17 crc kubenswrapper[5030]: W0120 23:00:17.286715 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod703b2784_b7db_4424_b479_91d7c53f9ecb.slice/crio-b70e0d9896c1f4e53e9f85a99886b66d7ef32088a5ae8a80ebf8e7a35421c81c WatchSource:0}: Error finding container b70e0d9896c1f4e53e9f85a99886b66d7ef32088a5ae8a80ebf8e7a35421c81c: Status 404 returned error can't find the container with id b70e0d9896c1f4e53e9f85a99886b66d7ef32088a5ae8a80ebf8e7a35421c81c Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.974180 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" path="/var/lib/kubelet/pods/32df1baa-dcd2-4d09-8458-517d6ce782aa/volumes" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.974936 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c840a11b-13a9-494d-804d-ab91e0300073" path="/var/lib/kubelet/pods/c840a11b-13a9-494d-804d-ab91e0300073/volumes" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.979876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"703b2784-b7db-4424-b479-91d7c53f9ecb","Type":"ContainerStarted","Data":"bd4a155b76873118c2373604fae7e69b21203161dbc22f566ab70841f2e1c6ae"} Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.979921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"703b2784-b7db-4424-b479-91d7c53f9ecb","Type":"ContainerStarted","Data":"b70e0d9896c1f4e53e9f85a99886b66d7ef32088a5ae8a80ebf8e7a35421c81c"} Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.979937 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:00:17 crc kubenswrapper[5030]: I0120 23:00:17.984974 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" Jan 20 23:00:18 crc kubenswrapper[5030]: I0120 23:00:18.006641 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="32df1baa-dcd2-4d09-8458-517d6ce782aa" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" Jan 20 23:00:18 crc kubenswrapper[5030]: I0120 23:00:18.996888 5030 generic.go:334] "Generic (PLEG): container finished" podID="60afc403-7a57-4953-b772-938b04583542" containerID="d8c5df5d409b433301543b513acb94e59d487da2998ab3b5dfa1af569b6de93b" exitCode=0 Jan 20 23:00:18 crc kubenswrapper[5030]: I0120 23:00:18.996970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerDied","Data":"d8c5df5d409b433301543b513acb94e59d487da2998ab3b5dfa1af569b6de93b"} Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.285136 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.309332 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=3.309310402 podStartE2EDuration="3.309310402s" podCreationTimestamp="2026-01-20 23:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:18.003634092 +0000 UTC m=+1490.323894400" watchObservedRunningTime="2026-01-20 23:00:19.309310402 +0000 UTC m=+1491.629570710" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdvwz\" (UniqueName: \"kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.411880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml\") pod \"60afc403-7a57-4953-b772-938b04583542\" (UID: \"60afc403-7a57-4953-b772-938b04583542\") " Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.413024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.413395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.417902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz" (OuterVolumeSpecName: "kube-api-access-vdvwz") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "kube-api-access-vdvwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.418557 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts" (OuterVolumeSpecName: "scripts") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.440175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.479540 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.500577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data" (OuterVolumeSpecName: "config-data") pod "60afc403-7a57-4953-b772-938b04583542" (UID: "60afc403-7a57-4953-b772-938b04583542"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513703 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdvwz\" (UniqueName: \"kubernetes.io/projected/60afc403-7a57-4953-b772-938b04583542-kube-api-access-vdvwz\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513746 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513756 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513765 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60afc403-7a57-4953-b772-938b04583542-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513802 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513811 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:19 crc kubenswrapper[5030]: I0120 23:00:19.513818 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60afc403-7a57-4953-b772-938b04583542-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.015790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"60afc403-7a57-4953-b772-938b04583542","Type":"ContainerDied","Data":"85749da8118a19d4ee6beacddb1b7b4dc39a928829589089c98b6d3551dd0bdb"} Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.015922 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.016089 5030 scope.go:117] "RemoveContainer" containerID="a0802d9d36c2d6e26de2172094c5bdc0c08c4a0572a400a5f5bd5a01b3251e12" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.059464 5030 scope.go:117] "RemoveContainer" containerID="8dce74e269908d3d2a775a636d4a4986668541ea1b915b14ee03edd033b8c98c" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.064784 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.077398 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.090908 5030 scope.go:117] "RemoveContainer" containerID="d8c5df5d409b433301543b513acb94e59d487da2998ab3b5dfa1af569b6de93b" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092088 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092763 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-central-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092795 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-central-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092814 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="sg-core" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092839 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="sg-core" Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092861 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092874 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092905 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-api" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092917 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-api" Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092940 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="proxy-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092952 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="proxy-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: E0120 23:00:20.092974 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-notification-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.092986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-notification-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093268 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-central-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093296 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="ceilometer-notification-agent" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c840a11b-13a9-494d-804d-ab91e0300073" containerName="neutron-api" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093356 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="sg-core" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.093373 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="60afc403-7a57-4953-b772-938b04583542" containerName="proxy-httpd" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.096448 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.099827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.100252 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.100454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.129029 5030 scope.go:117] "RemoveContainer" containerID="100287f0b6f74ab7203d3a8bd63df77c1e5592cc2b3a660adcb64f49efadfb64" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjh8d\" (UniqueName: \"kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.227243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjh8d\" (UniqueName: \"kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.328758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.329927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.330208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.333434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.333542 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.334127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.338653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.358609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjh8d\" (UniqueName: \"kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d\") pod \"ceilometer-0\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.428289 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:20 crc kubenswrapper[5030]: I0120 23:00:20.867229 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:21 crc kubenswrapper[5030]: I0120 23:00:21.025686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerStarted","Data":"339f53a129568e40439b133eb6f9f277b7f87dd1d36f7017d141d0221458c516"} Jan 20 23:00:21 crc kubenswrapper[5030]: I0120 23:00:21.979436 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60afc403-7a57-4953-b772-938b04583542" path="/var/lib/kubelet/pods/60afc403-7a57-4953-b772-938b04583542/volumes" Jan 20 23:00:22 crc kubenswrapper[5030]: I0120 23:00:22.043986 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerStarted","Data":"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4"} Jan 20 23:00:23 crc kubenswrapper[5030]: I0120 23:00:23.058046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerStarted","Data":"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe"} Jan 20 23:00:23 crc kubenswrapper[5030]: I0120 23:00:23.058409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerStarted","Data":"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550"} Jan 20 23:00:24 crc kubenswrapper[5030]: I0120 23:00:24.864033 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.087091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerStarted","Data":"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8"} Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.087344 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.125534 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.806292733 podStartE2EDuration="5.125512843s" podCreationTimestamp="2026-01-20 23:00:20 +0000 UTC" firstStartedPulling="2026-01-20 23:00:20.866795374 +0000 UTC m=+1493.187055662" lastFinishedPulling="2026-01-20 23:00:24.186015444 +0000 UTC m=+1496.506275772" observedRunningTime="2026-01-20 23:00:25.11551463 +0000 UTC m=+1497.435774938" watchObservedRunningTime="2026-01-20 23:00:25.125512843 +0000 UTC m=+1497.445773131" Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.533475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.536110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.789942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 
23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.790163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-log" containerID="cri-o://203a6757ba66bed0aa792c3622d72d2811284af7b92f313c118c72e3bd14c9b0" gracePeriod=30 Jan 20 23:00:25 crc kubenswrapper[5030]: I0120 23:00:25.790592 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-httpd" containerID="cri-o://9c9cc79f5dd471123bad13d3eca786383b406d209ce6569d1d53b3d158c5fc53" gracePeriod=30 Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.096563 5030 generic.go:334] "Generic (PLEG): container finished" podID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerID="203a6757ba66bed0aa792c3622d72d2811284af7b92f313c118c72e3bd14c9b0" exitCode=143 Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.099374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerDied","Data":"203a6757ba66bed0aa792c3622d72d2811284af7b92f313c118c72e3bd14c9b0"} Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.099663 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-central-agent" containerID="cri-o://be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4" gracePeriod=30 Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.100098 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="proxy-httpd" containerID="cri-o://8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8" gracePeriod=30 Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.100219 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="sg-core" containerID="cri-o://948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe" gracePeriod=30 Jan 20 23:00:26 crc kubenswrapper[5030]: I0120 23:00:26.100333 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-notification-agent" containerID="cri-o://75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550" gracePeriod=30 Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.117427 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerID="8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8" exitCode=0 Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.117468 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerID="948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe" exitCode=2 Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.117483 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerID="75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550" exitCode=0 Jan 20 23:00:27 crc 
kubenswrapper[5030]: I0120 23:00:27.117502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerDied","Data":"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8"} Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.117572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerDied","Data":"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe"} Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.117592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerDied","Data":"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550"} Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.394803 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.395044 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-log" containerID="cri-o://3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192" gracePeriod=30 Jan 20 23:00:27 crc kubenswrapper[5030]: I0120 23:00:27.395097 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-httpd" containerID="cri-o://6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2" gracePeriod=30 Jan 20 23:00:28 crc kubenswrapper[5030]: I0120 23:00:28.128466 5030 generic.go:334] "Generic (PLEG): container finished" podID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerID="3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192" exitCode=143 Jan 20 23:00:28 crc kubenswrapper[5030]: I0120 23:00:28.128573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerDied","Data":"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192"} Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.138547 5030 generic.go:334] "Generic (PLEG): container finished" podID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerID="9c9cc79f5dd471123bad13d3eca786383b406d209ce6569d1d53b3d158c5fc53" exitCode=0 Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.138662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerDied","Data":"9c9cc79f5dd471123bad13d3eca786383b406d209ce6569d1d53b3d158c5fc53"} Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.341133 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.387807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.388847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs" (OuterVolumeSpecName: "logs") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.389902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfnc8\" (UniqueName: \"kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8\") pod \"53eabbda-71af-46ee-85df-de9997ed9b2b\" (UID: \"53eabbda-71af-46ee-85df-de9997ed9b2b\") " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.390514 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 
23:00:29.393218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.400420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.406318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts" (OuterVolumeSpecName: "scripts") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.408243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8" (OuterVolumeSpecName: "kube-api-access-jfnc8") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "kube-api-access-jfnc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.449181 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.459749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.480996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data" (OuterVolumeSpecName: "config-data") pod "53eabbda-71af-46ee-85df-de9997ed9b2b" (UID: "53eabbda-71af-46ee-85df-de9997ed9b2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491850 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfnc8\" (UniqueName: \"kubernetes.io/projected/53eabbda-71af-46ee-85df-de9997ed9b2b-kube-api-access-jfnc8\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491891 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491906 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491918 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491933 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/53eabbda-71af-46ee-85df-de9997ed9b2b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491972 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.491986 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53eabbda-71af-46ee-85df-de9997ed9b2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.525987 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:00:29 crc kubenswrapper[5030]: I0120 23:00:29.593689 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.149050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"53eabbda-71af-46ee-85df-de9997ed9b2b","Type":"ContainerDied","Data":"8d4fbb30a5cf6219d5682098586dc3cec9e1cbd936b41a0e2dec7fd3e4246ed4"} Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.149152 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.149366 5030 scope.go:117] "RemoveContainer" containerID="9c9cc79f5dd471123bad13d3eca786383b406d209ce6569d1d53b3d158c5fc53" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.172859 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.183692 5030 scope.go:117] "RemoveContainer" containerID="203a6757ba66bed0aa792c3622d72d2811284af7b92f313c118c72e3bd14c9b0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.189805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.208666 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:00:30 crc kubenswrapper[5030]: E0120 23:00:30.209116 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-log" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.209139 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-log" Jan 20 23:00:30 crc kubenswrapper[5030]: E0120 23:00:30.209209 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-httpd" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.209218 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-httpd" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.209440 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-log" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.209465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" containerName="glance-httpd" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.210563 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.219990 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.220047 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.228418 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308418 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.308749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc 
kubenswrapper[5030]: I0120 23:00:30.308775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lnh5\" (UniqueName: \"kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.410800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.410854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lnh5\" (UniqueName: \"kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.410947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.410972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411295 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.411565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.412325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.416982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.417933 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.424349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.427084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.435884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.441988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lnh5\" (UniqueName: \"kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5\") pod \"glance-default-external-api-0\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.533400 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.540034 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.237:9292/healthcheck\": read tcp 10.217.0.2:39372->10.217.0.237:9292: read: connection reset by peer" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.540039 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.237:9292/healthcheck\": read tcp 10.217.0.2:39386->10.217.0.237:9292: read: connection reset by peer" Jan 20 23:00:30 crc kubenswrapper[5030]: I0120 23:00:30.968572 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.006910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:00:31 crc kubenswrapper[5030]: W0120 23:00:31.009943 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d90984a_ad24_4f85_b39e_fff5b8152a04.slice/crio-5861c632c04e0fa81a9ea8da2cd209929246f83b14798ccd4ab85ffefebc3695 WatchSource:0}: Error finding container 5861c632c04e0fa81a9ea8da2cd209929246f83b14798ccd4ab85ffefebc3695: Status 404 returned error can't find the container with id 5861c632c04e0fa81a9ea8da2cd209929246f83b14798ccd4ab85ffefebc3695 Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.019997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.020031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.020073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mct7x\" (UniqueName: \"kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.020158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs\") pod \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\" (UID: \"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb\") " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.020714 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.026558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs" (OuterVolumeSpecName: "logs") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.031267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts" (OuterVolumeSpecName: "scripts") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.031776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.032084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x" (OuterVolumeSpecName: "kube-api-access-mct7x") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). 
InnerVolumeSpecName "kube-api-access-mct7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.073630 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.085850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.100136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data" (OuterVolumeSpecName: "config-data") pod "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" (UID: "0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121602 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121647 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121658 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121668 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121677 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121686 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mct7x\" (UniqueName: \"kubernetes.io/projected/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-kube-api-access-mct7x\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.121694 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.141947 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 
23:00:31.166656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerStarted","Data":"5861c632c04e0fa81a9ea8da2cd209929246f83b14798ccd4ab85ffefebc3695"} Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.171331 5030 generic.go:334] "Generic (PLEG): container finished" podID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerID="6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2" exitCode=0 Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.171389 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.171378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerDied","Data":"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2"} Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.171515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb","Type":"ContainerDied","Data":"625a7d3ce44403049e137f9f93a519a84747870829838643d2aeacacc8b9fcb6"} Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.171546 5030 scope.go:117] "RemoveContainer" containerID="6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.195303 5030 scope.go:117] "RemoveContainer" containerID="3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.209794 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.223396 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.224371 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.234755 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:31 crc kubenswrapper[5030]: E0120 23:00:31.235164 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-httpd" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.235181 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-httpd" Jan 20 23:00:31 crc kubenswrapper[5030]: E0120 23:00:31.235204 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-log" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.235211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-log" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.235406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-log" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 
23:00:31.235425 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" containerName="glance-httpd" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.236287 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.239589 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.239749 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.242851 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.249784 5030 scope.go:117] "RemoveContainer" containerID="6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2" Jan 20 23:00:31 crc kubenswrapper[5030]: E0120 23:00:31.250208 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2\": container with ID starting with 6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2 not found: ID does not exist" containerID="6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.250257 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2"} err="failed to get container status \"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2\": rpc error: code = NotFound desc = could not find container \"6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2\": container with ID starting with 6e73868661483fabc0e7f3ee5f4e53c03032db3a5af7d1a65ec1cb93103cb2e2 not found: ID does not exist" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.250282 5030 scope.go:117] "RemoveContainer" containerID="3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192" Jan 20 23:00:31 crc kubenswrapper[5030]: E0120 23:00:31.251731 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192\": container with ID starting with 3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192 not found: ID does not exist" containerID="3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.251760 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192"} err="failed to get container status \"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192\": rpc error: code = NotFound desc = could not find container \"3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192\": container with ID starting with 3b0c5a28407c7b691f9cde3beaaf43dbc504f80e10cc07f583e835c611bd6192 not found: ID does not exist" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427206 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427322 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjnc\" (UniqueName: \"kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.427396 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.528909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 
crc kubenswrapper[5030]: I0120 23:00:31.528983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjnc\" (UniqueName: \"kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.529135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.530268 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.532468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.534065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.534408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.538461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.540486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.542344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.549473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjnc\" (UniqueName: \"kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.557730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.860906 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.978702 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb" path="/var/lib/kubelet/pods/0959e3fe-810d-4c1b-a8f7-ea15bf25e9fb/volumes" Jan 20 23:00:31 crc kubenswrapper[5030]: I0120 23:00:31.980318 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53eabbda-71af-46ee-85df-de9997ed9b2b" path="/var/lib/kubelet/pods/53eabbda-71af-46ee-85df-de9997ed9b2b/volumes" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.020004 5030 scope.go:117] "RemoveContainer" containerID="ad58e0af4cff640a3546268c9d9f769b218b122f4b3e5f8c8c22aee0588d00cc" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.102946 5030 scope.go:117] "RemoveContainer" containerID="95b2f36df5f25c28ef9a913f2bb113e62e72854104599c6bbe79672b14286604" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.153719 5030 scope.go:117] "RemoveContainer" containerID="5d14d0e5317ecfbb409f5ce07548372a72d4a18c0e73ffe51bb4f499086b4b8e" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.191166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerStarted","Data":"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921"} Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.207325 5030 scope.go:117] "RemoveContainer" containerID="49d979ae396897b6fe84c0b16f5ec543ac67241f64c0cec0f9462b3f9343ec6c" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.238691 5030 scope.go:117] "RemoveContainer" containerID="44e9e86c1b6b010df1b0262a613eedd66cdead712f7feb3518e2d40f086da17e" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.271024 5030 scope.go:117] "RemoveContainer" containerID="24ed350f07926d46377d70ae896f08ae5561446525b45f08d3c09e118a1e0e78" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.313223 5030 scope.go:117] "RemoveContainer" containerID="cd95a6397db4e9560793bf5984d28722a88bf228d5a81f2f64fa90df41af5578" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.317876 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:00:32 crc kubenswrapper[5030]: W0120 23:00:32.329775 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fc512e7_ef40_4589_a094_9906cf91a0d3.slice/crio-d6e925334f43dc8e3a0c2fdf773b419b58a41ffe0512709f9744f65729e1ab2d WatchSource:0}: Error finding container d6e925334f43dc8e3a0c2fdf773b419b58a41ffe0512709f9744f65729e1ab2d: Status 404 returned error can't find the container with id d6e925334f43dc8e3a0c2fdf773b419b58a41ffe0512709f9744f65729e1ab2d Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.352858 5030 scope.go:117] "RemoveContainer" containerID="524a55892a1a52741bd4d8c5a2facdaa841bff142021e22e5dd212b97c706f95" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.389997 5030 scope.go:117] "RemoveContainer" containerID="4e441803cac62ba19b4b02a09bdf8586418dc277eee8924311eaeac3ee0c9a7b" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.422538 5030 scope.go:117] "RemoveContainer" containerID="8bd8acb85c09b4e34102253ea7289fb53effc5b251d4dec799a5f526c8eb6c2b" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.463958 5030 scope.go:117] "RemoveContainer" 
containerID="893109e8602c9959be263bcaa24129014f0586a2671d43af5bd007a9455a8741" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.486497 5030 scope.go:117] "RemoveContainer" containerID="0d5fc4f1b3c7ed51fa948b656f3eb485e97fb558a4d5d8321c931ebb2fed3f7e" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.533873 5030 scope.go:117] "RemoveContainer" containerID="882cc8c16207535f682a857f2aff37f733997ce119b202b91104337ac96cd344" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.599048 5030 scope.go:117] "RemoveContainer" containerID="2232c770e9612ca6b18e044232df33e7ad98af8922670115d8c6b21ba6e27375" Jan 20 23:00:32 crc kubenswrapper[5030]: I0120 23:00:32.889586 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.059001 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjh8d\" (UniqueName: \"kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.059127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.059154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.060409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.060725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.060763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.060860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.060886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts\") pod \"6d8bdba4-cb98-410c-a3f2-e65111577e74\" (UID: \"6d8bdba4-cb98-410c-a3f2-e65111577e74\") " Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.061795 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.062155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.068141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d" (OuterVolumeSpecName: "kube-api-access-xjh8d") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "kube-api-access-xjh8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.068172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts" (OuterVolumeSpecName: "scripts") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.091871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.138524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.160124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data" (OuterVolumeSpecName: "config-data") pod "6d8bdba4-cb98-410c-a3f2-e65111577e74" (UID: "6d8bdba4-cb98-410c-a3f2-e65111577e74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162439 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162541 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162612 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162694 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8bdba4-cb98-410c-a3f2-e65111577e74-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162760 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8bdba4-cb98-410c-a3f2-e65111577e74-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.162818 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjh8d\" (UniqueName: \"kubernetes.io/projected/6d8bdba4-cb98-410c-a3f2-e65111577e74-kube-api-access-xjh8d\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.209833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerStarted","Data":"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0"} Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.211856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerStarted","Data":"896d652e86e3f6645f535d8655829b4fc84bc1a6b041883ee7d85864aa92a7c3"} Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.212068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerStarted","Data":"d6e925334f43dc8e3a0c2fdf773b419b58a41ffe0512709f9744f65729e1ab2d"} Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 
23:00:33.216825 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerID="be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4" exitCode=0 Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.216856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerDied","Data":"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4"} Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.216871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"6d8bdba4-cb98-410c-a3f2-e65111577e74","Type":"ContainerDied","Data":"339f53a129568e40439b133eb6f9f277b7f87dd1d36f7017d141d0221458c516"} Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.216881 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.216892 5030 scope.go:117] "RemoveContainer" containerID="8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.234728 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.234692275 podStartE2EDuration="3.234692275s" podCreationTimestamp="2026-01-20 23:00:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:33.227707095 +0000 UTC m=+1505.547967393" watchObservedRunningTime="2026-01-20 23:00:33.234692275 +0000 UTC m=+1505.554952563" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.260141 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.264930 5030 scope.go:117] "RemoveContainer" containerID="948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.277546 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.283711 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.284102 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="sg-core" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="sg-core" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.284143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-central-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284149 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-central-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.284163 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="proxy-httpd" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284169 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="proxy-httpd" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.284181 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-notification-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284186 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-notification-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="proxy-httpd" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284381 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-central-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="sg-core" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.284404 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" containerName="ceilometer-notification-agent" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.296141 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.298298 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.300342 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.303192 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.316612 5030 scope.go:117] "RemoveContainer" containerID="75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.339142 5030 scope.go:117] "RemoveContainer" containerID="be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.362332 5030 scope.go:117] "RemoveContainer" containerID="8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.363213 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8\": container with ID starting with 8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8 not found: ID does not exist" containerID="8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.363318 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8"} err="failed to get container status \"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8\": rpc error: code = NotFound desc = could not find container \"8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8\": container with ID starting with 
8cb21e0bd4248fc0020825e85f7b9b0c029a91d637107704c9dcd4bec4aa85b8 not found: ID does not exist" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.363417 5030 scope.go:117] "RemoveContainer" containerID="948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.363957 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe\": container with ID starting with 948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe not found: ID does not exist" containerID="948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.363986 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe"} err="failed to get container status \"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe\": rpc error: code = NotFound desc = could not find container \"948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe\": container with ID starting with 948ddc82a958686ff209d61a104892ec314ae9149a981c1e742dcebd579967fe not found: ID does not exist" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.364006 5030 scope.go:117] "RemoveContainer" containerID="75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.364316 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550\": container with ID starting with 75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550 not found: ID does not exist" containerID="75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.364353 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550"} err="failed to get container status \"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550\": rpc error: code = NotFound desc = could not find container \"75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550\": container with ID starting with 75bc2fae05fafa815037b121ffce74491c51d25a92d8e8c5940ec2a9a0576550 not found: ID does not exist" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.364417 5030 scope.go:117] "RemoveContainer" containerID="be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4" Jan 20 23:00:33 crc kubenswrapper[5030]: E0120 23:00:33.364840 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4\": container with ID starting with be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4 not found: ID does not exist" containerID="be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.364881 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4"} err="failed to get container status \"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4\": rpc 
error: code = NotFound desc = could not find container \"be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4\": container with ID starting with be41f10e6402837652eac00e45c74a23de7e9c39482dfe1fe4b7ea58a6c8d0c4 not found: ID does not exist" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472439 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kth6\" (UniqueName: \"kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.472971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.473087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574187 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kth6\" (UniqueName: \"kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.574696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.575093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.575149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.578595 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.579572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.579936 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.580108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.591095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kth6\" (UniqueName: \"kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6\") pod \"ceilometer-0\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.611338 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:33 crc kubenswrapper[5030]: I0120 23:00:33.984778 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8bdba4-cb98-410c-a3f2-e65111577e74" path="/var/lib/kubelet/pods/6d8bdba4-cb98-410c-a3f2-e65111577e74/volumes" Jan 20 23:00:34 crc kubenswrapper[5030]: I0120 23:00:34.043764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:00:34 crc kubenswrapper[5030]: W0120 23:00:34.048711 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19c4076d_316a_436e_b7a8_ed9b5f942f0c.slice/crio-ff5f229b58d0072f20cc04685331d6d3daa8d9889851c099c73b0134bdb57d5c WatchSource:0}: Error finding container ff5f229b58d0072f20cc04685331d6d3daa8d9889851c099c73b0134bdb57d5c: Status 404 returned error can't find the container with id ff5f229b58d0072f20cc04685331d6d3daa8d9889851c099c73b0134bdb57d5c Jan 20 23:00:34 crc kubenswrapper[5030]: I0120 23:00:34.226947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerStarted","Data":"ff5f229b58d0072f20cc04685331d6d3daa8d9889851c099c73b0134bdb57d5c"} Jan 20 23:00:34 crc kubenswrapper[5030]: I0120 23:00:34.229270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerStarted","Data":"32dd60b784789cf7134c38532e81ef67d0d751a5bd26694f1fd92042ac4cc701"} Jan 20 23:00:34 crc kubenswrapper[5030]: I0120 23:00:34.263055 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.263036303 podStartE2EDuration="3.263036303s" podCreationTimestamp="2026-01-20 23:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:34.258675297 +0000 UTC m=+1506.578935615" watchObservedRunningTime="2026-01-20 23:00:34.263036303 +0000 UTC m=+1506.583296601" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.250272 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerStarted","Data":"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333"} Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.637879 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-q6mv6"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.647166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.647897 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.649026 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.653649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.664686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-q6mv6"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.685393 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.741603 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-zr5tr"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.742670 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.767883 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-zr5tr"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.813459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhn59\" (UniqueName: \"kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.813570 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.813633 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.813699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd8g\" (UniqueName: 
\"kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.818127 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-wmpkb"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.819267 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.830289 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-wmpkb"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.862771 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.864544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.874177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.890981 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt"] Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd8g\" (UniqueName: \"kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdlmx\" (UniqueName: \"kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx\") pod \"nova-cell1-db-create-wmpkb\" (UID: 
\"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts\") pod \"nova-cell1-db-create-wmpkb\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhn59\" (UniqueName: \"kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.915723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmvm\" (UniqueName: \"kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.916166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.920926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.932529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhn59\" (UniqueName: \"kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59\") pod \"nova-api-be4a-account-create-update-vq6sd\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.932853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd8g\" (UniqueName: \"kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g\") pod \"nova-api-db-create-q6mv6\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.973054 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:35 crc kubenswrapper[5030]: I0120 23:00:35.990479 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.017459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdlmx\" (UniqueName: \"kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx\") pod \"nova-cell1-db-create-wmpkb\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.018182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts\") pod \"nova-cell1-db-create-wmpkb\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.018233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts\") pod \"nova-cell1-db-create-wmpkb\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.018280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmvm\" (UniqueName: \"kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.021738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqzw\" (UniqueName: \"kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.021854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.021912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.022859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.037287 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hdlmx\" (UniqueName: \"kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx\") pod \"nova-cell1-db-create-wmpkb\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.037969 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq"] Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.039079 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.041248 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.043441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmvm\" (UniqueName: \"kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm\") pod \"nova-cell0-db-create-zr5tr\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.051323 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq"] Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.062669 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.124887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnqzw\" (UniqueName: \"kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.125248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.128151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.144354 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.146463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqzw\" (UniqueName: \"kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw\") pod \"nova-cell0-d0e8-account-create-update-47pjt\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.226741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.226826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ll5\" (UniqueName: \"kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.281331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerStarted","Data":"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e"} Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.281370 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerStarted","Data":"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2"} Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.283700 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.328483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ll5\" (UniqueName: \"kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.328702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.329520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.353201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ll5\" (UniqueName: \"kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5\") pod \"nova-cell1-7aac-account-create-update-v8vdq\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.494197 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-q6mv6"] Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.552748 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd"] Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.566569 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-zr5tr"] Jan 20 23:00:36 crc kubenswrapper[5030]: W0120 23:00:36.585435 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d75890_b0d8_4622_8471_e44bdc3e925f.slice/crio-05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff WatchSource:0}: Error finding container 05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff: Status 404 returned error can't find the container with id 05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.653836 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:36 crc kubenswrapper[5030]: W0120 23:00:36.658564 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc82005b0_b1f6_4884_bc9a_f131a62edd97.slice/crio-8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2 WatchSource:0}: Error finding container 8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2: Status 404 returned error can't find the container with id 8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2 Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.659057 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-wmpkb"] Jan 20 23:00:36 crc kubenswrapper[5030]: I0120 23:00:36.760889 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt"] Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.067573 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq"] Jan 20 23:00:37 crc kubenswrapper[5030]: W0120 23:00:37.168466 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda315c01e_271c_4200_b411_c1a9fa71820f.slice/crio-c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f WatchSource:0}: Error finding container c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f: Status 404 returned error can't find the container with id c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.292168 5030 generic.go:334] "Generic (PLEG): container finished" podID="8fb7df47-fb12-4daa-9442-52c1dfd18bea" containerID="4f4ec23418a7fb4081ca771a7f62f966b6982ff38a1e94ba24bc598ee9b01174" exitCode=0 Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.292459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" event={"ID":"8fb7df47-fb12-4daa-9442-52c1dfd18bea","Type":"ContainerDied","Data":"4f4ec23418a7fb4081ca771a7f62f966b6982ff38a1e94ba24bc598ee9b01174"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.292501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" event={"ID":"8fb7df47-fb12-4daa-9442-52c1dfd18bea","Type":"ContainerStarted","Data":"16f575684ba70f5727c2d734d3f5c24725649b5f2d0ebed9cf50d8905c62cdeb"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.296807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" event={"ID":"84d75890-b0d8-4622-8471-e44bdc3e925f","Type":"ContainerDied","Data":"b80da2e4f32300b6f46c477fe14dab075793bedce95e881576e43e8d1bfce9c8"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.297260 5030 generic.go:334] "Generic (PLEG): container finished" podID="84d75890-b0d8-4622-8471-e44bdc3e925f" containerID="b80da2e4f32300b6f46c477fe14dab075793bedce95e881576e43e8d1bfce9c8" exitCode=0 Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.297334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" 
event={"ID":"84d75890-b0d8-4622-8471-e44bdc3e925f","Type":"ContainerStarted","Data":"05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.299308 5030 generic.go:334] "Generic (PLEG): container finished" podID="c82005b0-b1f6-4884-bc9a-f131a62edd97" containerID="69bd31968979fa52907dfd19ffa2733b98eb3b378f5267a768a9ed8e3af8a0b8" exitCode=0 Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.299360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" event={"ID":"c82005b0-b1f6-4884-bc9a-f131a62edd97","Type":"ContainerDied","Data":"69bd31968979fa52907dfd19ffa2733b98eb3b378f5267a768a9ed8e3af8a0b8"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.299382 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" event={"ID":"c82005b0-b1f6-4884-bc9a-f131a62edd97","Type":"ContainerStarted","Data":"8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.300879 5030 generic.go:334] "Generic (PLEG): container finished" podID="db1552c5-e1e0-4c78-8860-243a48c39e6a" containerID="6fcd6c82f9b98061477b62981fd66acb9608345abb256874ad023be0bb233075" exitCode=0 Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.300932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" event={"ID":"db1552c5-e1e0-4c78-8860-243a48c39e6a","Type":"ContainerDied","Data":"6fcd6c82f9b98061477b62981fd66acb9608345abb256874ad023be0bb233075"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.300955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" event={"ID":"db1552c5-e1e0-4c78-8860-243a48c39e6a","Type":"ContainerStarted","Data":"40cfa332081063d735da650071c8cab2d496bb3c4af595db36cec1877f93cfc2"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.301897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" event={"ID":"a315c01e-271c-4200-b411-c1a9fa71820f","Type":"ContainerStarted","Data":"c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.303080 5030 generic.go:334] "Generic (PLEG): container finished" podID="7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" containerID="7eaa48654e4ad98b0166fb2076c14af3b45afccd85f1add13ccfa8fda294f671" exitCode=0 Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.303137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" event={"ID":"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5","Type":"ContainerDied","Data":"7eaa48654e4ad98b0166fb2076c14af3b45afccd85f1add13ccfa8fda294f671"} Jan 20 23:00:37 crc kubenswrapper[5030]: I0120 23:00:37.303156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" event={"ID":"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5","Type":"ContainerStarted","Data":"204dbdbb9e978db17ad0a1cd61aef35f9cc100304d51f6d850364b066e3321ab"} Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.332397 5030 generic.go:334] "Generic (PLEG): container finished" podID="a315c01e-271c-4200-b411-c1a9fa71820f" containerID="dbfeb70d4766b33590ea3682d4b59046ad66cacc8ce576e77fadaf87a7c55995" exitCode=0 Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.333218 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" event={"ID":"a315c01e-271c-4200-b411-c1a9fa71820f","Type":"ContainerDied","Data":"dbfeb70d4766b33590ea3682d4b59046ad66cacc8ce576e77fadaf87a7c55995"} Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.347576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerStarted","Data":"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c"} Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.348306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.402116 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.6027951219999999 podStartE2EDuration="5.402095102s" podCreationTimestamp="2026-01-20 23:00:33 +0000 UTC" firstStartedPulling="2026-01-20 23:00:34.051749807 +0000 UTC m=+1506.372010095" lastFinishedPulling="2026-01-20 23:00:37.851049787 +0000 UTC m=+1510.171310075" observedRunningTime="2026-01-20 23:00:38.385810687 +0000 UTC m=+1510.706070995" watchObservedRunningTime="2026-01-20 23:00:38.402095102 +0000 UTC m=+1510.722355400" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.804346 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.913232 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.917408 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.923485 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.935402 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.976107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhd8g\" (UniqueName: \"kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g\") pod \"db1552c5-e1e0-4c78-8860-243a48c39e6a\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.976162 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts\") pod \"db1552c5-e1e0-4c78-8860-243a48c39e6a\" (UID: \"db1552c5-e1e0-4c78-8860-243a48c39e6a\") " Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.976833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db1552c5-e1e0-4c78-8860-243a48c39e6a" (UID: "db1552c5-e1e0-4c78-8860-243a48c39e6a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:38 crc kubenswrapper[5030]: I0120 23:00:38.981484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g" (OuterVolumeSpecName: "kube-api-access-vhd8g") pod "db1552c5-e1e0-4c78-8860-243a48c39e6a" (UID: "db1552c5-e1e0-4c78-8860-243a48c39e6a"). InnerVolumeSpecName "kube-api-access-vhd8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhn59\" (UniqueName: \"kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59\") pod \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdlmx\" (UniqueName: \"kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx\") pod \"c82005b0-b1f6-4884-bc9a-f131a62edd97\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgmvm\" (UniqueName: \"kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm\") pod \"84d75890-b0d8-4622-8471-e44bdc3e925f\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts\") pod \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\" (UID: \"8fb7df47-fb12-4daa-9442-52c1dfd18bea\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts\") pod \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts\") pod \"c82005b0-b1f6-4884-bc9a-f131a62edd97\" (UID: \"c82005b0-b1f6-4884-bc9a-f131a62edd97\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnqzw\" (UniqueName: \"kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw\") pod \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\" (UID: \"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fb7df47-fb12-4daa-9442-52c1dfd18bea" (UID: "8fb7df47-fb12-4daa-9442-52c1dfd18bea"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" (UID: "7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.078942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts\") pod \"84d75890-b0d8-4622-8471-e44bdc3e925f\" (UID: \"84d75890-b0d8-4622-8471-e44bdc3e925f\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c82005b0-b1f6-4884-bc9a-f131a62edd97" (UID: "c82005b0-b1f6-4884-bc9a-f131a62edd97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079378 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhd8g\" (UniqueName: \"kubernetes.io/projected/db1552c5-e1e0-4c78-8860-243a48c39e6a-kube-api-access-vhd8g\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079396 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db1552c5-e1e0-4c78-8860-243a48c39e6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079406 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fb7df47-fb12-4daa-9442-52c1dfd18bea-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079415 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079423 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c82005b0-b1f6-4884-bc9a-f131a62edd97-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.079455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84d75890-b0d8-4622-8471-e44bdc3e925f" (UID: "84d75890-b0d8-4622-8471-e44bdc3e925f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.081237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm" (OuterVolumeSpecName: "kube-api-access-dgmvm") pod "84d75890-b0d8-4622-8471-e44bdc3e925f" (UID: "84d75890-b0d8-4622-8471-e44bdc3e925f"). InnerVolumeSpecName "kube-api-access-dgmvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.082169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59" (OuterVolumeSpecName: "kube-api-access-zhn59") pod "8fb7df47-fb12-4daa-9442-52c1dfd18bea" (UID: "8fb7df47-fb12-4daa-9442-52c1dfd18bea"). InnerVolumeSpecName "kube-api-access-zhn59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.082933 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw" (OuterVolumeSpecName: "kube-api-access-nnqzw") pod "7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" (UID: "7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5"). InnerVolumeSpecName "kube-api-access-nnqzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.083243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx" (OuterVolumeSpecName: "kube-api-access-hdlmx") pod "c82005b0-b1f6-4884-bc9a-f131a62edd97" (UID: "c82005b0-b1f6-4884-bc9a-f131a62edd97"). InnerVolumeSpecName "kube-api-access-hdlmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.181723 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhn59\" (UniqueName: \"kubernetes.io/projected/8fb7df47-fb12-4daa-9442-52c1dfd18bea-kube-api-access-zhn59\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.181767 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdlmx\" (UniqueName: \"kubernetes.io/projected/c82005b0-b1f6-4884-bc9a-f131a62edd97-kube-api-access-hdlmx\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.181780 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgmvm\" (UniqueName: \"kubernetes.io/projected/84d75890-b0d8-4622-8471-e44bdc3e925f-kube-api-access-dgmvm\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.181792 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnqzw\" (UniqueName: \"kubernetes.io/projected/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5-kube-api-access-nnqzw\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.181804 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d75890-b0d8-4622-8471-e44bdc3e925f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.359388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" event={"ID":"db1552c5-e1e0-4c78-8860-243a48c39e6a","Type":"ContainerDied","Data":"40cfa332081063d735da650071c8cab2d496bb3c4af595db36cec1877f93cfc2"} Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.359434 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40cfa332081063d735da650071c8cab2d496bb3c4af595db36cec1877f93cfc2" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.359408 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-q6mv6" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.362818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" event={"ID":"7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5","Type":"ContainerDied","Data":"204dbdbb9e978db17ad0a1cd61aef35f9cc100304d51f6d850364b066e3321ab"} Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.362855 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.362857 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204dbdbb9e978db17ad0a1cd61aef35f9cc100304d51f6d850364b066e3321ab" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.368352 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.368363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd" event={"ID":"8fb7df47-fb12-4daa-9442-52c1dfd18bea","Type":"ContainerDied","Data":"16f575684ba70f5727c2d734d3f5c24725649b5f2d0ebed9cf50d8905c62cdeb"} Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.368398 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16f575684ba70f5727c2d734d3f5c24725649b5f2d0ebed9cf50d8905c62cdeb" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.375455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" event={"ID":"84d75890-b0d8-4622-8471-e44bdc3e925f","Type":"ContainerDied","Data":"05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff"} Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.375490 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05d2ac3127aef74fbfaa08227f2f54b9857f3145b39a9a82abcf628ade949bff" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.375542 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-zr5tr" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.386781 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.386880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-wmpkb" event={"ID":"c82005b0-b1f6-4884-bc9a-f131a62edd97","Type":"ContainerDied","Data":"8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2"} Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.386910 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8288de3d39fbbe3f9930648ec3ec479b70af9548d70e30ee309c4bd1633bf1a2" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.679798 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.794666 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts\") pod \"a315c01e-271c-4200-b411-c1a9fa71820f\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.794889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94ll5\" (UniqueName: \"kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5\") pod \"a315c01e-271c-4200-b411-c1a9fa71820f\" (UID: \"a315c01e-271c-4200-b411-c1a9fa71820f\") " Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.795324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a315c01e-271c-4200-b411-c1a9fa71820f" (UID: "a315c01e-271c-4200-b411-c1a9fa71820f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.795466 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a315c01e-271c-4200-b411-c1a9fa71820f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.797835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5" (OuterVolumeSpecName: "kube-api-access-94ll5") pod "a315c01e-271c-4200-b411-c1a9fa71820f" (UID: "a315c01e-271c-4200-b411-c1a9fa71820f"). InnerVolumeSpecName "kube-api-access-94ll5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:39 crc kubenswrapper[5030]: I0120 23:00:39.896808 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94ll5\" (UniqueName: \"kubernetes.io/projected/a315c01e-271c-4200-b411-c1a9fa71820f-kube-api-access-94ll5\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.398937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" event={"ID":"a315c01e-271c-4200-b411-c1a9fa71820f","Type":"ContainerDied","Data":"c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f"} Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.399008 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87fc30b0a8f0ba3e2c9c6ae7a4f92550a1d26c49fd3951d7b203a5b6868a43f" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.399944 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.534421 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.534502 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.567301 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:40 crc kubenswrapper[5030]: I0120 23:00:40.601518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.067554 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj"] Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068341 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82005b0-b1f6-4884-bc9a-f131a62edd97" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068358 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82005b0-b1f6-4884-bc9a-f131a62edd97" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068372 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068420 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068432 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb7df47-fb12-4daa-9442-52c1dfd18bea" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068440 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb7df47-fb12-4daa-9442-52c1dfd18bea" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068452 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1552c5-e1e0-4c78-8860-243a48c39e6a" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068458 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1552c5-e1e0-4c78-8860-243a48c39e6a" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068470 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d75890-b0d8-4622-8471-e44bdc3e925f" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068477 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d75890-b0d8-4622-8471-e44bdc3e925f" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: E0120 23:00:41.068489 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a315c01e-271c-4200-b411-c1a9fa71820f" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a315c01e-271c-4200-b411-c1a9fa71820f" 
containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068713 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068733 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1552c5-e1e0-4c78-8860-243a48c39e6a" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068745 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb7df47-fb12-4daa-9442-52c1dfd18bea" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068755 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82005b0-b1f6-4884-bc9a-f131a62edd97" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068767 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a315c01e-271c-4200-b411-c1a9fa71820f" containerName="mariadb-account-create-update" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.068778 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d75890-b0d8-4622-8471-e44bdc3e925f" containerName="mariadb-database-create" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.069308 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.073602 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-2qp62" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.074479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.074891 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.092099 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj"] Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.223988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.224136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.224228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 
23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.224316 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzmg\" (UniqueName: \"kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.326387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.326524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzmg\" (UniqueName: \"kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.326708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.326810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.333733 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.333789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.339538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.349506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzmg\" (UniqueName: \"kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg\") pod 
\"nova-cell0-conductor-db-sync-wrlxj\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.395300 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.407117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.407158 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.862753 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.863118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.870943 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj"] Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.891127 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:41 crc kubenswrapper[5030]: I0120 23:00:41.908416 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:42 crc kubenswrapper[5030]: I0120 23:00:42.417320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" event={"ID":"0b38a913-9251-435b-9656-5c2cee72e288","Type":"ContainerStarted","Data":"ecbdf9eff636c5fd5c83072aa27c4b23c8d55653d5d725a2825277d02afc5dc4"} Jan 20 23:00:42 crc kubenswrapper[5030]: I0120 23:00:42.417571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" event={"ID":"0b38a913-9251-435b-9656-5c2cee72e288","Type":"ContainerStarted","Data":"29586bcb47bf86e89a675c8f04bdf6df3e1b188e7491ec42c3e73b18faa5383f"} Jan 20 23:00:42 crc kubenswrapper[5030]: I0120 23:00:42.418793 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:42 crc kubenswrapper[5030]: I0120 23:00:42.418838 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:42 crc kubenswrapper[5030]: I0120 23:00:42.440346 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" podStartSLOduration=1.440330152 podStartE2EDuration="1.440330152s" podCreationTimestamp="2026-01-20 23:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:42.436090208 +0000 UTC m=+1514.756350496" watchObservedRunningTime="2026-01-20 23:00:42.440330152 +0000 UTC m=+1514.760590450" Jan 20 23:00:43 crc kubenswrapper[5030]: I0120 23:00:43.237420 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:43 crc 
kubenswrapper[5030]: I0120 23:00:43.251463 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:00:44 crc kubenswrapper[5030]: I0120 23:00:44.243055 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:44 crc kubenswrapper[5030]: I0120 23:00:44.277421 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:00:46 crc kubenswrapper[5030]: I0120 23:00:46.463756 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b38a913-9251-435b-9656-5c2cee72e288" containerID="ecbdf9eff636c5fd5c83072aa27c4b23c8d55653d5d725a2825277d02afc5dc4" exitCode=0 Jan 20 23:00:46 crc kubenswrapper[5030]: I0120 23:00:46.463884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" event={"ID":"0b38a913-9251-435b-9656-5c2cee72e288","Type":"ContainerDied","Data":"ecbdf9eff636c5fd5c83072aa27c4b23c8d55653d5d725a2825277d02afc5dc4"} Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.862526 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.951942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data\") pod \"0b38a913-9251-435b-9656-5c2cee72e288\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.952034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle\") pod \"0b38a913-9251-435b-9656-5c2cee72e288\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.952073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzmg\" (UniqueName: \"kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg\") pod \"0b38a913-9251-435b-9656-5c2cee72e288\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.952176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts\") pod \"0b38a913-9251-435b-9656-5c2cee72e288\" (UID: \"0b38a913-9251-435b-9656-5c2cee72e288\") " Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.958518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg" (OuterVolumeSpecName: "kube-api-access-rlzmg") pod "0b38a913-9251-435b-9656-5c2cee72e288" (UID: "0b38a913-9251-435b-9656-5c2cee72e288"). InnerVolumeSpecName "kube-api-access-rlzmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.968613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts" (OuterVolumeSpecName: "scripts") pod "0b38a913-9251-435b-9656-5c2cee72e288" (UID: "0b38a913-9251-435b-9656-5c2cee72e288"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:47 crc kubenswrapper[5030]: I0120 23:00:47.997167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b38a913-9251-435b-9656-5c2cee72e288" (UID: "0b38a913-9251-435b-9656-5c2cee72e288"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.002069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data" (OuterVolumeSpecName: "config-data") pod "0b38a913-9251-435b-9656-5c2cee72e288" (UID: "0b38a913-9251-435b-9656-5c2cee72e288"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.053963 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.054135 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.054195 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b38a913-9251-435b-9656-5c2cee72e288-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.054286 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzmg\" (UniqueName: \"kubernetes.io/projected/0b38a913-9251-435b-9656-5c2cee72e288-kube-api-access-rlzmg\") on node \"crc\" DevicePath \"\"" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.493847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" event={"ID":"0b38a913-9251-435b-9656-5c2cee72e288","Type":"ContainerDied","Data":"29586bcb47bf86e89a675c8f04bdf6df3e1b188e7491ec42c3e73b18faa5383f"} Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.493892 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29586bcb47bf86e89a675c8f04bdf6df3e1b188e7491ec42c3e73b18faa5383f" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.493934 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.581754 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:00:48 crc kubenswrapper[5030]: E0120 23:00:48.582205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b38a913-9251-435b-9656-5c2cee72e288" containerName="nova-cell0-conductor-db-sync" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.582225 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b38a913-9251-435b-9656-5c2cee72e288" containerName="nova-cell0-conductor-db-sync" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.582419 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b38a913-9251-435b-9656-5c2cee72e288" containerName="nova-cell0-conductor-db-sync" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.583178 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.586655 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.587441 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-2qp62" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.594473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.666709 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmd4\" (UniqueName: \"kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.666796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.666861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.768188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmd4\" (UniqueName: \"kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.768303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.768396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.775197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.777116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.803407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmd4\" (UniqueName: \"kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4\") pod \"nova-cell0-conductor-0\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:48 crc kubenswrapper[5030]: I0120 23:00:48.921374 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:49 crc kubenswrapper[5030]: I0120 23:00:49.179559 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:00:49 crc kubenswrapper[5030]: W0120 23:00:49.183913 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4789919e_c06b_413a_8185_6766523e9925.slice/crio-4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad WatchSource:0}: Error finding container 4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad: Status 404 returned error can't find the container with id 4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad Jan 20 23:00:49 crc kubenswrapper[5030]: I0120 23:00:49.505456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4789919e-c06b-413a-8185-6766523e9925","Type":"ContainerStarted","Data":"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2"} Jan 20 23:00:49 crc kubenswrapper[5030]: I0120 23:00:49.505948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:00:49 crc kubenswrapper[5030]: I0120 23:00:49.505964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4789919e-c06b-413a-8185-6766523e9925","Type":"ContainerStarted","Data":"4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad"} Jan 20 23:00:58 crc kubenswrapper[5030]: I0120 23:00:58.967280 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 
23:00:58 crc kubenswrapper[5030]: I0120 23:00:58.989932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=10.989911037 podStartE2EDuration="10.989911037s" podCreationTimestamp="2026-01-20 23:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:00:49.526126865 +0000 UTC m=+1521.846387183" watchObservedRunningTime="2026-01-20 23:00:58.989911037 +0000 UTC m=+1531.310171365" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.480523 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.482544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.485528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.485559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.507567 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.581565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.581715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.582061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.582175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nn2j\" (UniqueName: \"kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.619356 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.621334 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.624385 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.629255 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.636000 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.643145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.703755 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kshl5\" (UniqueName: \"kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrz7\" (UniqueName: 
\"kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nn2j\" (UniqueName: \"kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.706480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.723908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.730900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.731553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.731845 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.741738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nn2j\" (UniqueName: \"kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j\") pod \"nova-cell0-cell-mapping-csrrm\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 
23:00:59.807543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kshl5\" (UniqueName: \"kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrz7\" (UniqueName: \"kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.807860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.816417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.818003 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.829255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.835533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.836008 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.853536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.861691 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.863165 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.866873 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.876953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kshl5\" (UniqueName: \"kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5\") pod \"nova-scheduler-0\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.889334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.890536 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrq2x\" (UniqueName: \"kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.918897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwp7\" (UniqueName: \"kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.920238 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.929277 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.935923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrz7\" (UniqueName: \"kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7\") pod \"nova-api-0\" (UID: 
\"5b60400c-9868-401b-81b6-6e279fe14c5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:00:59 crc kubenswrapper[5030]: I0120 23:00:59.961684 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.004464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.019810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.019898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq2x\" (UniqueName: \"kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.019940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.020000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.020089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwp7\" (UniqueName: \"kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.020152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.020192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.023502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.028410 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.031313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.032142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.032567 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.040277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.044771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwp7\" (UniqueName: \"kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7\") pod \"nova-metadata-0\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.051056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq2x\" (UniqueName: \"kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.136254 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cron-29482501-nv2t9"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.137563 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.156511 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29482501-nv2t9"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.227876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.227939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.227962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.227990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv68s\" (UniqueName: \"kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.308858 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.320104 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.335149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.335246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.335279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.335318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv68s\" (UniqueName: \"kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.341246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.341284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.341490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.354370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv68s\" (UniqueName: \"kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s\") pod \"keystone-cron-29482501-nv2t9\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.441220 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.466857 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.532233 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.535859 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.543070 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.543245 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.553035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.567058 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.574807 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:00 crc kubenswrapper[5030]: W0120 23:01:00.585163 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b60400c_9868_401b_81b6_6e279fe14c5f.slice/crio-6c8a2da0b56a71c81c379924e82f019c877e1911d494af8cd561dd735874086c WatchSource:0}: Error finding container 6c8a2da0b56a71c81c379924e82f019c877e1911d494af8cd561dd735874086c: Status 404 returned error can't find the container with id 6c8a2da0b56a71c81c379924e82f019c877e1911d494af8cd561dd735874086c Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.640651 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.640967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbw4\" (UniqueName: \"kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.641009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.641093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.712511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" event={"ID":"33b0c307-dd4b-41d1-a042-98ee993d330a","Type":"ContainerStarted","Data":"f60106e35683c5d569cb990f9fda62a442d36c2261c8ffc07a18885ab32e1f23"} Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.712574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" event={"ID":"33b0c307-dd4b-41d1-a042-98ee993d330a","Type":"ContainerStarted","Data":"537bf8300f9b144ee9fcee45e125dd387947e58e52bd3551dd249d51235bc704"} Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.717878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerStarted","Data":"6c8a2da0b56a71c81c379924e82f019c877e1911d494af8cd561dd735874086c"} Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.719643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f4868cb8-1f63-4955-b57d-0857ed6bf776","Type":"ContainerStarted","Data":"f60798d2f5c573e2e40a540aec06095754e9757fe477925be060a6c4d9d7c39a"} Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.729753 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" podStartSLOduration=1.7297355620000001 podStartE2EDuration="1.729735562s" podCreationTimestamp="2026-01-20 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:00.727414065 +0000 UTC m=+1533.047674353" watchObservedRunningTime="2026-01-20 23:01:00.729735562 +0000 UTC m=+1533.049995850" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.744357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.744408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbw4\" (UniqueName: \"kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.744445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.744488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.749346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.754979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.758924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.768292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbw4\" (UniqueName: \"kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4\") pod \"nova-cell1-conductor-db-sync-nzf7r\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.801379 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.870535 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.890160 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:00 crc kubenswrapper[5030]: I0120 23:01:00.976974 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29482501-nv2t9"] Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.322515 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r"] Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.736926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f4868cb8-1f63-4955-b57d-0857ed6bf776","Type":"ContainerStarted","Data":"4e6f3c85c4b4f7f9e04d60f32cfc062d7db5f163d1d2c4e0e05067d0af71943d"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.739011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" event={"ID":"ce959850-3505-4e4b-9055-caa9b213416a","Type":"ContainerStarted","Data":"6f52ea9610953ea6508ff7b23f08c8f684930217e4f49a7a64f1aa5e4dae1b5d"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.739057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" event={"ID":"ce959850-3505-4e4b-9055-caa9b213416a","Type":"ContainerStarted","Data":"6cbeb7e22f154fe5208ff91d672a47027ee1ae863b832b411b275dfc6a680713"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.740632 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7d118055-40a6-48ed-a7c4-ac3b267c53bd","Type":"ContainerStarted","Data":"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.740661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7d118055-40a6-48ed-a7c4-ac3b267c53bd","Type":"ContainerStarted","Data":"993f4b8057a3cbd28f20c9cf784907069019b5c9f3dd72e26b2b8c0cf738a02d"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.742915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerStarted","Data":"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.742948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerStarted","Data":"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.742973 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerStarted","Data":"1166ba303d2189cbc1e033922a3ba0f272e5b3945aacd7629461d128bb36b42f"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.744771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" event={"ID":"94b1f55a-64ae-4fc1-91d3-5d65502e870c","Type":"ContainerStarted","Data":"aa572d94787348cd7844f38114a079565a13848b4376a45837e9cca1482b4fba"} Jan 20 23:01:01 crc kubenswrapper[5030]: 
I0120 23:01:01.744839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" event={"ID":"94b1f55a-64ae-4fc1-91d3-5d65502e870c","Type":"ContainerStarted","Data":"dff65cabe8d64426a345d6de23691e69b116a07ab5044a40ad2f157157e261b6"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.746680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerStarted","Data":"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.746719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerStarted","Data":"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de"} Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.764005 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.763989344 podStartE2EDuration="2.763989344s" podCreationTimestamp="2026-01-20 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.760587531 +0000 UTC m=+1534.080847819" watchObservedRunningTime="2026-01-20 23:01:01.763989344 +0000 UTC m=+1534.084249632" Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.785054 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.785037786 podStartE2EDuration="2.785037786s" podCreationTimestamp="2026-01-20 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.780561798 +0000 UTC m=+1534.100822086" watchObservedRunningTime="2026-01-20 23:01:01.785037786 +0000 UTC m=+1534.105298064" Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.802114 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.802100531 podStartE2EDuration="2.802100531s" podCreationTimestamp="2026-01-20 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.794943337 +0000 UTC m=+1534.115203625" watchObservedRunningTime="2026-01-20 23:01:01.802100531 +0000 UTC m=+1534.122360819" Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.831485 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" podStartSLOduration=1.831471125 podStartE2EDuration="1.831471125s" podCreationTimestamp="2026-01-20 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.827749174 +0000 UTC m=+1534.148009462" watchObservedRunningTime="2026-01-20 23:01:01.831471125 +0000 UTC m=+1534.151731413" Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.831584 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" podStartSLOduration=1.831581057 podStartE2EDuration="1.831581057s" podCreationTimestamp="2026-01-20 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.817532316 +0000 UTC m=+1534.137792604" watchObservedRunningTime="2026-01-20 23:01:01.831581057 +0000 UTC m=+1534.151841345" Jan 20 23:01:01 crc kubenswrapper[5030]: I0120 23:01:01.846999 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.846982822 podStartE2EDuration="2.846982822s" podCreationTimestamp="2026-01-20 23:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:01.840875964 +0000 UTC m=+1534.161136252" watchObservedRunningTime="2026-01-20 23:01:01.846982822 +0000 UTC m=+1534.167243110" Jan 20 23:01:02 crc kubenswrapper[5030]: I0120 23:01:02.560702 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:02 crc kubenswrapper[5030]: I0120 23:01:02.578247 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.616605 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.763711 5030 generic.go:334] "Generic (PLEG): container finished" podID="94b1f55a-64ae-4fc1-91d3-5d65502e870c" containerID="aa572d94787348cd7844f38114a079565a13848b4376a45837e9cca1482b4fba" exitCode=0 Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.763790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" event={"ID":"94b1f55a-64ae-4fc1-91d3-5d65502e870c","Type":"ContainerDied","Data":"aa572d94787348cd7844f38114a079565a13848b4376a45837e9cca1482b4fba"} Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.763899 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-log" containerID="cri-o://a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" gracePeriod=30 Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.763959 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-metadata" containerID="cri-o://c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" gracePeriod=30 Jan 20 23:01:03 crc kubenswrapper[5030]: I0120 23:01:03.764453 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab" gracePeriod=30 Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.328516 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.411686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs\") pod \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.411959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs" (OuterVolumeSpecName: "logs") pod "c48efe35-f76b-4a56-9951-0d2ba7e61ea3" (UID: "c48efe35-f76b-4a56-9951-0d2ba7e61ea3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.411998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data\") pod \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.412035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfwp7\" (UniqueName: \"kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7\") pod \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.412141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle\") pod \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\" (UID: \"c48efe35-f76b-4a56-9951-0d2ba7e61ea3\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.412601 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.417865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7" (OuterVolumeSpecName: "kube-api-access-dfwp7") pod "c48efe35-f76b-4a56-9951-0d2ba7e61ea3" (UID: "c48efe35-f76b-4a56-9951-0d2ba7e61ea3"). InnerVolumeSpecName "kube-api-access-dfwp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.440713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data" (OuterVolumeSpecName: "config-data") pod "c48efe35-f76b-4a56-9951-0d2ba7e61ea3" (UID: "c48efe35-f76b-4a56-9951-0d2ba7e61ea3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.441024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c48efe35-f76b-4a56-9951-0d2ba7e61ea3" (UID: "c48efe35-f76b-4a56-9951-0d2ba7e61ea3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.478981 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrq2x\" (UniqueName: \"kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x\") pod \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle\") pod \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data\") pod \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\" (UID: \"7d118055-40a6-48ed-a7c4-ac3b267c53bd\") " Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514845 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514856 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfwp7\" (UniqueName: \"kubernetes.io/projected/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-kube-api-access-dfwp7\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.514866 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c48efe35-f76b-4a56-9951-0d2ba7e61ea3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.518767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x" (OuterVolumeSpecName: "kube-api-access-wrq2x") pod "7d118055-40a6-48ed-a7c4-ac3b267c53bd" (UID: "7d118055-40a6-48ed-a7c4-ac3b267c53bd"). InnerVolumeSpecName "kube-api-access-wrq2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.541936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d118055-40a6-48ed-a7c4-ac3b267c53bd" (UID: "7d118055-40a6-48ed-a7c4-ac3b267c53bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.551982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data" (OuterVolumeSpecName: "config-data") pod "7d118055-40a6-48ed-a7c4-ac3b267c53bd" (UID: "7d118055-40a6-48ed-a7c4-ac3b267c53bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.615960 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.615991 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrq2x\" (UniqueName: \"kubernetes.io/projected/7d118055-40a6-48ed-a7c4-ac3b267c53bd-kube-api-access-wrq2x\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.616001 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d118055-40a6-48ed-a7c4-ac3b267c53bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.778079 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce959850-3505-4e4b-9055-caa9b213416a" containerID="6f52ea9610953ea6508ff7b23f08c8f684930217e4f49a7a64f1aa5e4dae1b5d" exitCode=0 Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.778173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" event={"ID":"ce959850-3505-4e4b-9055-caa9b213416a","Type":"ContainerDied","Data":"6f52ea9610953ea6508ff7b23f08c8f684930217e4f49a7a64f1aa5e4dae1b5d"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.780520 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" containerID="78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab" exitCode=0 Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.780564 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.780615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7d118055-40a6-48ed-a7c4-ac3b267c53bd","Type":"ContainerDied","Data":"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.780676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7d118055-40a6-48ed-a7c4-ac3b267c53bd","Type":"ContainerDied","Data":"993f4b8057a3cbd28f20c9cf784907069019b5c9f3dd72e26b2b8c0cf738a02d"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.780704 5030 scope.go:117] "RemoveContainer" containerID="78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.785793 5030 generic.go:334] "Generic (PLEG): container finished" podID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerID="c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" exitCode=0 Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.785819 5030 generic.go:334] "Generic (PLEG): container finished" podID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerID="a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" exitCode=143 Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.785938 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.792763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerDied","Data":"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.792954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerDied","Data":"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.793026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c48efe35-f76b-4a56-9951-0d2ba7e61ea3","Type":"ContainerDied","Data":"1166ba303d2189cbc1e033922a3ba0f272e5b3945aacd7629461d128bb36b42f"} Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.815488 5030 scope.go:117] "RemoveContainer" containerID="78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab" Jan 20 23:01:04 crc kubenswrapper[5030]: E0120 23:01:04.816138 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab\": container with ID starting with 78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab not found: ID does not exist" containerID="78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.816180 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab"} err="failed to get container status \"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab\": rpc error: code = NotFound desc = could not find container \"78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab\": container with ID starting with 78bb4756bf3dbe66249916e13a219aa5e564af2cf8d6950e19565a49c27e42ab not found: ID does not exist" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.816206 5030 scope.go:117] "RemoveContainer" containerID="c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.850003 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.868511 5030 scope.go:117] "RemoveContainer" containerID="a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.872554 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.891574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.901747 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: E0120 23:01:04.902238 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902263 
5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:01:04 crc kubenswrapper[5030]: E0120 23:01:04.902295 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-log" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-log" Jan 20 23:01:04 crc kubenswrapper[5030]: E0120 23:01:04.902318 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-metadata" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902329 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-metadata" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902586 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902678 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-log" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.902716 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" containerName="nova-metadata-metadata" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.903570 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.912755 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.912962 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.913742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.922012 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.923318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.923370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.923409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.923463 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmwmf\" (UniqueName: \"kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.923572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.933941 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.942422 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.944413 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.947732 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.948104 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.950791 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.953904 5030 scope.go:117] "RemoveContainer" containerID="c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" Jan 20 23:01:04 crc kubenswrapper[5030]: E0120 23:01:04.956870 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68\": container with ID starting with c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68 not found: ID does not exist" containerID="c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.956916 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68"} err="failed to get container status \"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68\": rpc error: code = NotFound desc = could not find container \"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68\": container with ID starting with c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68 not found: ID does not exist" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.956944 5030 scope.go:117] "RemoveContainer" containerID="a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" Jan 20 23:01:04 crc 
kubenswrapper[5030]: E0120 23:01:04.957699 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701\": container with ID starting with a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701 not found: ID does not exist" containerID="a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.957729 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701"} err="failed to get container status \"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701\": rpc error: code = NotFound desc = could not find container \"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701\": container with ID starting with a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701 not found: ID does not exist" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.957750 5030 scope.go:117] "RemoveContainer" containerID="c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.958335 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68"} err="failed to get container status \"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68\": rpc error: code = NotFound desc = could not find container \"c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68\": container with ID starting with c6878d802e3c55362ecf44296dc231cacdf811864431e717407b67f7297d2e68 not found: ID does not exist" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.958353 5030 scope.go:117] "RemoveContainer" containerID="a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701" Jan 20 23:01:04 crc kubenswrapper[5030]: I0120 23:01:04.960776 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701"} err="failed to get container status \"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701\": rpc error: code = NotFound desc = could not find container \"a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701\": container with ID starting with a055a0d823d14b64fbf665d3d13f2272ac8c2928a8d76b953174db90ad48a701 not found: ID does not exist" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.025748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.026815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmwmf\" (UniqueName: \"kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.026925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.027138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.027178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwhgz\" (UniqueName: \"kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.031088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.032061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.033267 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.036116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.040886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.051573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmwmf\" (UniqueName: \"kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf\") pod \"nova-cell1-novncproxy-0\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.128401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.129997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.130355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwhgz\" (UniqueName: \"kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.130787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.130822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.131852 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.132609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.133188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.134508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.143846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwhgz\" (UniqueName: \"kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz\") pod \"nova-metadata-0\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.207644 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.232493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle\") pod \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.232570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv68s\" (UniqueName: \"kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s\") pod \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.232708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys\") pod \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.232752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data\") pod \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\" (UID: \"94b1f55a-64ae-4fc1-91d3-5d65502e870c\") " Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.241109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "94b1f55a-64ae-4fc1-91d3-5d65502e870c" (UID: "94b1f55a-64ae-4fc1-91d3-5d65502e870c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.244873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s" (OuterVolumeSpecName: "kube-api-access-nv68s") pod "94b1f55a-64ae-4fc1-91d3-5d65502e870c" (UID: "94b1f55a-64ae-4fc1-91d3-5d65502e870c"). InnerVolumeSpecName "kube-api-access-nv68s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.246339 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.266551 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.275454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94b1f55a-64ae-4fc1-91d3-5d65502e870c" (UID: "94b1f55a-64ae-4fc1-91d3-5d65502e870c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.290957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data" (OuterVolumeSpecName: "config-data") pod "94b1f55a-64ae-4fc1-91d3-5d65502e870c" (UID: "94b1f55a-64ae-4fc1-91d3-5d65502e870c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.335436 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.335466 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.335480 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b1f55a-64ae-4fc1-91d3-5d65502e870c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.335495 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv68s\" (UniqueName: \"kubernetes.io/projected/94b1f55a-64ae-4fc1-91d3-5d65502e870c-kube-api-access-nv68s\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.720599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:01:05 crc kubenswrapper[5030]: W0120 23:01:05.759176 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18e1433_ed65_4d7e_bc78_62681cbfdcae.slice/crio-8bf2043e8c546af7e912260110d1bbee73808a9f1435de79a19d65eca550877b WatchSource:0}: Error finding container 8bf2043e8c546af7e912260110d1bbee73808a9f1435de79a19d65eca550877b: Status 404 returned error can't find the container with id 8bf2043e8c546af7e912260110d1bbee73808a9f1435de79a19d65eca550877b Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.760639 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.799466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c4a387ba-cb79-4f5d-a313-37204d51d189","Type":"ContainerStarted","Data":"b7d2e5aba18fa2fbf1dac48da0ce44d2e5229298ef70dea322a5b649be96ac68"} Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.804475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" event={"ID":"94b1f55a-64ae-4fc1-91d3-5d65502e870c","Type":"ContainerDied","Data":"dff65cabe8d64426a345d6de23691e69b116a07ab5044a40ad2f157157e261b6"} Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.804527 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff65cabe8d64426a345d6de23691e69b116a07ab5044a40ad2f157157e261b6" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.804545 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cron-29482501-nv2t9" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.805844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerStarted","Data":"8bf2043e8c546af7e912260110d1bbee73808a9f1435de79a19d65eca550877b"} Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.975703 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d118055-40a6-48ed-a7c4-ac3b267c53bd" path="/var/lib/kubelet/pods/7d118055-40a6-48ed-a7c4-ac3b267c53bd/volumes" Jan 20 23:01:05 crc kubenswrapper[5030]: I0120 23:01:05.977260 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c48efe35-f76b-4a56-9951-0d2ba7e61ea3" path="/var/lib/kubelet/pods/c48efe35-f76b-4a56-9951-0d2ba7e61ea3/volumes" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.141853 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.253586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data\") pod \"ce959850-3505-4e4b-9055-caa9b213416a\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.253921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle\") pod \"ce959850-3505-4e4b-9055-caa9b213416a\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.253988 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbw4\" (UniqueName: \"kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4\") pod \"ce959850-3505-4e4b-9055-caa9b213416a\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.254169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts\") pod \"ce959850-3505-4e4b-9055-caa9b213416a\" (UID: \"ce959850-3505-4e4b-9055-caa9b213416a\") " Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.257538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4" (OuterVolumeSpecName: "kube-api-access-jcbw4") pod "ce959850-3505-4e4b-9055-caa9b213416a" (UID: "ce959850-3505-4e4b-9055-caa9b213416a"). InnerVolumeSpecName "kube-api-access-jcbw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.262136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts" (OuterVolumeSpecName: "scripts") pod "ce959850-3505-4e4b-9055-caa9b213416a" (UID: "ce959850-3505-4e4b-9055-caa9b213416a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.278160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce959850-3505-4e4b-9055-caa9b213416a" (UID: "ce959850-3505-4e4b-9055-caa9b213416a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.279958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data" (OuterVolumeSpecName: "config-data") pod "ce959850-3505-4e4b-9055-caa9b213416a" (UID: "ce959850-3505-4e4b-9055-caa9b213416a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.356137 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.356170 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.356180 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce959850-3505-4e4b-9055-caa9b213416a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.356191 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbw4\" (UniqueName: \"kubernetes.io/projected/ce959850-3505-4e4b-9055-caa9b213416a-kube-api-access-jcbw4\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.817157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerStarted","Data":"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb"} Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.817223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerStarted","Data":"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f"} Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.818787 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" event={"ID":"ce959850-3505-4e4b-9055-caa9b213416a","Type":"ContainerDied","Data":"6cbeb7e22f154fe5208ff91d672a47027ee1ae863b832b411b275dfc6a680713"} Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.818816 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.818836 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbeb7e22f154fe5208ff91d672a47027ee1ae863b832b411b275dfc6a680713" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.833231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c4a387ba-cb79-4f5d-a313-37204d51d189","Type":"ContainerStarted","Data":"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e"} Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.837472 5030 generic.go:334] "Generic (PLEG): container finished" podID="33b0c307-dd4b-41d1-a042-98ee993d330a" containerID="f60106e35683c5d569cb990f9fda62a442d36c2261c8ffc07a18885ab32e1f23" exitCode=0 Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.837515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" event={"ID":"33b0c307-dd4b-41d1-a042-98ee993d330a","Type":"ContainerDied","Data":"f60106e35683c5d569cb990f9fda62a442d36c2261c8ffc07a18885ab32e1f23"} Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.844994 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.844978502 podStartE2EDuration="2.844978502s" podCreationTimestamp="2026-01-20 23:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:06.840013511 +0000 UTC m=+1539.160273799" watchObservedRunningTime="2026-01-20 23:01:06.844978502 +0000 UTC m=+1539.165238790" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.873250 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.873226529 podStartE2EDuration="2.873226529s" podCreationTimestamp="2026-01-20 23:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:06.867042678 +0000 UTC m=+1539.187302966" watchObservedRunningTime="2026-01-20 23:01:06.873226529 +0000 UTC m=+1539.193486817" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.905346 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:01:06 crc kubenswrapper[5030]: E0120 23:01:06.905904 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce959850-3505-4e4b-9055-caa9b213416a" containerName="nova-cell1-conductor-db-sync" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.905930 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce959850-3505-4e4b-9055-caa9b213416a" containerName="nova-cell1-conductor-db-sync" Jan 20 23:01:06 crc kubenswrapper[5030]: E0120 23:01:06.905959 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b1f55a-64ae-4fc1-91d3-5d65502e870c" containerName="keystone-cron" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.905969 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b1f55a-64ae-4fc1-91d3-5d65502e870c" containerName="keystone-cron" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.906201 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce959850-3505-4e4b-9055-caa9b213416a" containerName="nova-cell1-conductor-db-sync" 
Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.906238 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b1f55a-64ae-4fc1-91d3-5d65502e870c" containerName="keystone-cron" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.907061 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.911159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.915132 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.967692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.967742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:06 crc kubenswrapper[5030]: I0120 23:01:06.967956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzg2f\" (UniqueName: \"kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.069292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.069359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.069447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzg2f\" (UniqueName: \"kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.073761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc 
kubenswrapper[5030]: I0120 23:01:07.076461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.084019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzg2f\" (UniqueName: \"kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f\") pod \"nova-cell1-conductor-0\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.226110 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.251614 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.251878 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" containerName="kube-state-metrics" containerID="cri-o://b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d" gracePeriod=30 Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.735908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.797407 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.847864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"5699cc7d-e528-4022-bff1-7057062e4a03","Type":"ContainerStarted","Data":"039b12a18dd4009b921a385ac492c0fe4c0dc5a70de823c3c59cd5ff252ef641"} Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.849606 5030 generic.go:334] "Generic (PLEG): container finished" podID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" containerID="b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d" exitCode=2 Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.849846 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.850554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4799c457-9dc2-4576-961d-9e5c5cbf0fd7","Type":"ContainerDied","Data":"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d"} Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.850573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4799c457-9dc2-4576-961d-9e5c5cbf0fd7","Type":"ContainerDied","Data":"dd1d3506617600dee86435951a9669ea5806a8732cb658b420236393a76001b3"} Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.850588 5030 scope.go:117] "RemoveContainer" containerID="b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.869917 5030 scope.go:117] "RemoveContainer" containerID="b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d" Jan 20 23:01:07 crc kubenswrapper[5030]: E0120 23:01:07.870723 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d\": container with ID starting with b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d not found: ID does not exist" containerID="b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.870778 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d"} err="failed to get container status \"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d\": rpc error: code = NotFound desc = could not find container \"b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d\": container with ID starting with b054c744cc5b60da751b376f974aa4b8b8d215604cd72b70b44a32268c3f6d3d not found: ID does not exist" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.882303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtm4n\" (UniqueName: \"kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n\") pod \"4799c457-9dc2-4576-961d-9e5c5cbf0fd7\" (UID: \"4799c457-9dc2-4576-961d-9e5c5cbf0fd7\") " Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.887694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n" (OuterVolumeSpecName: "kube-api-access-vtm4n") pod "4799c457-9dc2-4576-961d-9e5c5cbf0fd7" (UID: "4799c457-9dc2-4576-961d-9e5c5cbf0fd7"). InnerVolumeSpecName "kube-api-access-vtm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:07 crc kubenswrapper[5030]: I0120 23:01:07.993541 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtm4n\" (UniqueName: \"kubernetes.io/projected/4799c457-9dc2-4576-961d-9e5c5cbf0fd7-kube-api-access-vtm4n\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.204061 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.218602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.233482 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.260359 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:08 crc kubenswrapper[5030]: E0120 23:01:08.260739 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" containerName="kube-state-metrics" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.260755 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" containerName="kube-state-metrics" Jan 20 23:01:08 crc kubenswrapper[5030]: E0120 23:01:08.260791 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b0c307-dd4b-41d1-a042-98ee993d330a" containerName="nova-manage" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.260797 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b0c307-dd4b-41d1-a042-98ee993d330a" containerName="nova-manage" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.260971 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" containerName="kube-state-metrics" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.260983 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b0c307-dd4b-41d1-a042-98ee993d330a" containerName="nova-manage" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.261571 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.265358 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.265521 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.280605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.298585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data\") pod \"33b0c307-dd4b-41d1-a042-98ee993d330a\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.298672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nn2j\" (UniqueName: \"kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j\") pod \"33b0c307-dd4b-41d1-a042-98ee993d330a\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.298799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle\") pod \"33b0c307-dd4b-41d1-a042-98ee993d330a\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.298952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts\") pod \"33b0c307-dd4b-41d1-a042-98ee993d330a\" (UID: \"33b0c307-dd4b-41d1-a042-98ee993d330a\") " Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.299311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.299380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.299439 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.299512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6zw\" (UniqueName: 
\"kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.303161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts" (OuterVolumeSpecName: "scripts") pod "33b0c307-dd4b-41d1-a042-98ee993d330a" (UID: "33b0c307-dd4b-41d1-a042-98ee993d330a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.313864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j" (OuterVolumeSpecName: "kube-api-access-4nn2j") pod "33b0c307-dd4b-41d1-a042-98ee993d330a" (UID: "33b0c307-dd4b-41d1-a042-98ee993d330a"). InnerVolumeSpecName "kube-api-access-4nn2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.324310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data" (OuterVolumeSpecName: "config-data") pod "33b0c307-dd4b-41d1-a042-98ee993d330a" (UID: "33b0c307-dd4b-41d1-a042-98ee993d330a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.327790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33b0c307-dd4b-41d1-a042-98ee993d330a" (UID: "33b0c307-dd4b-41d1-a042-98ee993d330a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6zw\" (UniqueName: \"kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401352 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401386 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401400 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nn2j\" (UniqueName: \"kubernetes.io/projected/33b0c307-dd4b-41d1-a042-98ee993d330a-kube-api-access-4nn2j\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.401414 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b0c307-dd4b-41d1-a042-98ee993d330a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.404828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.405422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc 
kubenswrapper[5030]: I0120 23:01:08.413329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.420743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6zw\" (UniqueName: \"kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw\") pod \"kube-state-metrics-0\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.586750 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.862441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"5699cc7d-e528-4022-bff1-7057062e4a03","Type":"ContainerStarted","Data":"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7"} Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.864015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.865276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" event={"ID":"33b0c307-dd4b-41d1-a042-98ee993d330a","Type":"ContainerDied","Data":"537bf8300f9b144ee9fcee45e125dd387947e58e52bd3551dd249d51235bc704"} Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.865329 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="537bf8300f9b144ee9fcee45e125dd387947e58e52bd3551dd249d51235bc704" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.865332 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm" Jan 20 23:01:08 crc kubenswrapper[5030]: I0120 23:01:08.891043 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.8910220300000002 podStartE2EDuration="2.89102203s" podCreationTimestamp="2026-01-20 23:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:08.88482811 +0000 UTC m=+1541.205088428" watchObservedRunningTime="2026-01-20 23:01:08.89102203 +0000 UTC m=+1541.211282338" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.026840 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.027203 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-log" containerID="cri-o://b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.027602 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-api" containerID="cri-o://f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.046946 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.064358 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.064609 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="f4868cb8-1f63-4955-b57d-0857ed6bf776" containerName="nova-scheduler-scheduler" containerID="cri-o://4e6f3c85c4b4f7f9e04d60f32cfc062d7db5f163d1d2c4e0e05067d0af71943d" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.082314 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.082527 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-log" containerID="cri-o://fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.082985 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-metadata" containerID="cri-o://5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.271448 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.271733 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-central-agent" 
containerID="cri-o://1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.272177 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="proxy-httpd" containerID="cri-o://4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.272219 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="sg-core" containerID="cri-o://fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.272254 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-notification-agent" containerID="cri-o://ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2" gracePeriod=30 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.628336 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.640005 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs\") pod \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle\") pod \"5b60400c-9868-401b-81b6-6e279fe14c5f\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs\") pod \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle\") pod \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxrz7\" (UniqueName: \"kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7\") pod \"5b60400c-9868-401b-81b6-6e279fe14c5f\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data\") pod \"5b60400c-9868-401b-81b6-6e279fe14c5f\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwhgz\" (UniqueName: \"kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz\") pod \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736945 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs\") pod \"5b60400c-9868-401b-81b6-6e279fe14c5f\" (UID: \"5b60400c-9868-401b-81b6-6e279fe14c5f\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.736968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data\") pod \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\" (UID: \"b18e1433-ed65-4d7e-bc78-62681cbfdcae\") " Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.737243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs" (OuterVolumeSpecName: "logs") pod "b18e1433-ed65-4d7e-bc78-62681cbfdcae" (UID: "b18e1433-ed65-4d7e-bc78-62681cbfdcae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.737329 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18e1433-ed65-4d7e-bc78-62681cbfdcae-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.737464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs" (OuterVolumeSpecName: "logs") pod "5b60400c-9868-401b-81b6-6e279fe14c5f" (UID: "5b60400c-9868-401b-81b6-6e279fe14c5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.743306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7" (OuterVolumeSpecName: "kube-api-access-lxrz7") pod "5b60400c-9868-401b-81b6-6e279fe14c5f" (UID: "5b60400c-9868-401b-81b6-6e279fe14c5f"). InnerVolumeSpecName "kube-api-access-lxrz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.743360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz" (OuterVolumeSpecName: "kube-api-access-fwhgz") pod "b18e1433-ed65-4d7e-bc78-62681cbfdcae" (UID: "b18e1433-ed65-4d7e-bc78-62681cbfdcae"). InnerVolumeSpecName "kube-api-access-fwhgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.761012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data" (OuterVolumeSpecName: "config-data") pod "5b60400c-9868-401b-81b6-6e279fe14c5f" (UID: "5b60400c-9868-401b-81b6-6e279fe14c5f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.762957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b18e1433-ed65-4d7e-bc78-62681cbfdcae" (UID: "b18e1433-ed65-4d7e-bc78-62681cbfdcae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.763913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b60400c-9868-401b-81b6-6e279fe14c5f" (UID: "5b60400c-9868-401b-81b6-6e279fe14c5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.764041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data" (OuterVolumeSpecName: "config-data") pod "b18e1433-ed65-4d7e-bc78-62681cbfdcae" (UID: "b18e1433-ed65-4d7e-bc78-62681cbfdcae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.791094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b18e1433-ed65-4d7e-bc78-62681cbfdcae" (UID: "b18e1433-ed65-4d7e-bc78-62681cbfdcae"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838736 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838769 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxrz7\" (UniqueName: \"kubernetes.io/projected/5b60400c-9868-401b-81b6-6e279fe14c5f-kube-api-access-lxrz7\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838782 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838795 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwhgz\" (UniqueName: \"kubernetes.io/projected/b18e1433-ed65-4d7e-bc78-62681cbfdcae-kube-api-access-fwhgz\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838804 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b60400c-9868-401b-81b6-6e279fe14c5f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838813 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838821 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b18e1433-ed65-4d7e-bc78-62681cbfdcae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.838830 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b60400c-9868-401b-81b6-6e279fe14c5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875437 5030 generic.go:334] "Generic (PLEG): container finished" podID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerID="5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" exitCode=0 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875474 5030 generic.go:334] "Generic (PLEG): container finished" podID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerID="fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" exitCode=143 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875490 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerDied","Data":"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerDied","Data":"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b18e1433-ed65-4d7e-bc78-62681cbfdcae","Type":"ContainerDied","Data":"8bf2043e8c546af7e912260110d1bbee73808a9f1435de79a19d65eca550877b"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.875576 5030 scope.go:117] "RemoveContainer" containerID="5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.878893 5030 generic.go:334] "Generic (PLEG): container finished" podID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerID="4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c" exitCode=0 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.878915 5030 generic.go:334] "Generic (PLEG): container finished" podID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerID="fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e" exitCode=2 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.878926 5030 generic.go:334] "Generic (PLEG): container finished" podID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerID="1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333" exitCode=0 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.878927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerDied","Data":"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.879005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerDied","Data":"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.879031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerDied","Data":"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.883838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6e835bcd-3585-4a51-ba54-aeb66b426afa","Type":"ContainerStarted","Data":"4224aea8dfc41047d5b9bd1083b109636d208592edde37fe0ce27e2bbe90849f"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.883878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6e835bcd-3585-4a51-ba54-aeb66b426afa","Type":"ContainerStarted","Data":"b2a7cfc6a0a635a29a4a4804cf1e8c3aa65170e2efd04603769ad6d5473595be"} Jan 20 23:01:09 crc kubenswrapper[5030]: 
I0120 23:01:09.884091 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886296 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerID="f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" exitCode=0 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886320 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerID="b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" exitCode=143 Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886352 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerDied","Data":"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerDied","Data":"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.886447 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5b60400c-9868-401b-81b6-6e279fe14c5f","Type":"ContainerDied","Data":"6c8a2da0b56a71c81c379924e82f019c877e1911d494af8cd561dd735874086c"} Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.902516 5030 scope.go:117] "RemoveContainer" containerID="fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.926086 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.514094737 podStartE2EDuration="1.926068792s" podCreationTimestamp="2026-01-20 23:01:08 +0000 UTC" firstStartedPulling="2026-01-20 23:01:09.037818679 +0000 UTC m=+1541.358078957" lastFinishedPulling="2026-01-20 23:01:09.449792594 +0000 UTC m=+1541.770053012" observedRunningTime="2026-01-20 23:01:09.910710388 +0000 UTC m=+1542.230970706" watchObservedRunningTime="2026-01-20 23:01:09.926068792 +0000 UTC m=+1542.246329090" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.929573 5030 scope.go:117] "RemoveContainer" containerID="5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" Jan 20 23:01:09 crc kubenswrapper[5030]: E0120 23:01:09.930512 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb\": container with ID starting with 5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb not found: ID does not exist" containerID="5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.930547 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb"} err="failed to get container status \"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb\": rpc 
error: code = NotFound desc = could not find container \"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb\": container with ID starting with 5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb not found: ID does not exist" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.930571 5030 scope.go:117] "RemoveContainer" containerID="fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" Jan 20 23:01:09 crc kubenswrapper[5030]: E0120 23:01:09.931094 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f\": container with ID starting with fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f not found: ID does not exist" containerID="fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.931126 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f"} err="failed to get container status \"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f\": rpc error: code = NotFound desc = could not find container \"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f\": container with ID starting with fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f not found: ID does not exist" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.931145 5030 scope.go:117] "RemoveContainer" containerID="5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.932026 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb"} err="failed to get container status \"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb\": rpc error: code = NotFound desc = could not find container \"5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb\": container with ID starting with 5a59b69e0355a502176679c998fc22c11e00b9695dbaf86bc7cc48d5f4f1b9bb not found: ID does not exist" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.932066 5030 scope.go:117] "RemoveContainer" containerID="fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.932923 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f"} err="failed to get container status \"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f\": rpc error: code = NotFound desc = could not find container \"fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f\": container with ID starting with fe1d6e4970fb9d1b0cf2d88b7ce235b97fb7257626b8ffe4bfc1e39d6585167f not found: ID does not exist" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.932945 5030 scope.go:117] "RemoveContainer" containerID="f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.960542 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:09 crc kubenswrapper[5030]: I0120 23:01:09.994712 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4799c457-9dc2-4576-961d-9e5c5cbf0fd7" 
path="/var/lib/kubelet/pods/4799c457-9dc2-4576-961d-9e5c5cbf0fd7/volumes" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.006023 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.012690 5030 scope.go:117] "RemoveContainer" containerID="b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.018750 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.028548 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.029019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-log" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029039 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-log" Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.029056 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-api" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-api" Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.029075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-log" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029081 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-log" Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.029091 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-metadata" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029097 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-metadata" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-metadata" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-api" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" containerName="nova-metadata-log" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.029329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" containerName="nova-api-log" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.030340 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.032480 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.032667 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.038002 5030 scope.go:117] "RemoveContainer" containerID="f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.038908 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9\": container with ID starting with f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9 not found: ID does not exist" containerID="f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.039014 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9"} err="failed to get container status \"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9\": rpc error: code = NotFound desc = could not find container \"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9\": container with ID starting with f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9 not found: ID does not exist" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.039100 5030 scope.go:117] "RemoveContainer" containerID="b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.039288 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: E0120 23:01:10.042819 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de\": container with ID starting with b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de not found: ID does not exist" containerID="b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.042864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de"} err="failed to get container status \"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de\": rpc error: code = NotFound desc = could not find container \"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de\": container with ID starting with b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de not found: ID does not exist" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.042891 5030 scope.go:117] "RemoveContainer" containerID="f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.044002 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9"} err="failed to get container status 
\"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9\": rpc error: code = NotFound desc = could not find container \"f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9\": container with ID starting with f6fb14c511a2a638b3751ae021169909deaba3d6999c4410a5e2846a9e0f22a9 not found: ID does not exist" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.044038 5030 scope.go:117] "RemoveContainer" containerID="b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.046162 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.047869 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de"} err="failed to get container status \"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de\": rpc error: code = NotFound desc = could not find container \"b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de\": container with ID starting with b3d8692393c2baf04e5f63db11d6bbeb48b1327f1a36cc951bbc9741731ff7de not found: ID does not exist" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.054611 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.056431 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.058243 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.064426 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jc8b\" (UniqueName: \"kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144780 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144951 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.144991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prsg\" (UniqueName: \"kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.145026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.145053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.157753 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.157804 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jc8b\" (UniqueName: \"kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9prsg\" (UniqueName: \"kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.247900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.249316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.251150 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.252063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.254909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.255281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.256555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.257549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.258448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.270562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jc8b\" (UniqueName: \"kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b\") pod \"nova-api-0\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.271484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prsg\" (UniqueName: \"kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg\") pod \"nova-metadata-0\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.351349 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.372585 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:10 crc kubenswrapper[5030]: I0120 23:01:10.932542 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:10 crc kubenswrapper[5030]: W0120 23:01:10.941028 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613 WatchSource:0}: Error finding container d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613: Status 404 returned error can't find the container with id d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613 Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.010284 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:11 crc kubenswrapper[5030]: W0120 23:01:11.012988 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de WatchSource:0}: Error finding container 37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de: Status 404 returned error can't find the container with id 37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.780204 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.874452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.875127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.875578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kth6\" (UniqueName: \"kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.875080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.875731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.876166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.876383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.876697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd\") pod \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\" (UID: \"19c4076d-316a-436e-b7a8-ed9b5f942f0c\") " Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.877071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.877485 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.877561 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19c4076d-316a-436e-b7a8-ed9b5f942f0c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.879067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts" (OuterVolumeSpecName: "scripts") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.879465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6" (OuterVolumeSpecName: "kube-api-access-2kth6") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "kube-api-access-2kth6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.911439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerStarted","Data":"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.911580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerStarted","Data":"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.911597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerStarted","Data":"d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.914735 5030 generic.go:334] "Generic (PLEG): container finished" podID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerID="ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2" exitCode=0 Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.914839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.914866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerDied","Data":"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.915211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19c4076d-316a-436e-b7a8-ed9b5f942f0c","Type":"ContainerDied","Data":"ff5f229b58d0072f20cc04685331d6d3daa8d9889851c099c73b0134bdb57d5c"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.915239 5030 scope.go:117] "RemoveContainer" containerID="4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.914932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.918591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerStarted","Data":"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.918719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerStarted","Data":"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.918792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerStarted","Data":"37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de"} Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.938506 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.938488724 podStartE2EDuration="2.938488724s" podCreationTimestamp="2026-01-20 23:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:11.932444616 +0000 UTC m=+1544.252704914" watchObservedRunningTime="2026-01-20 23:01:11.938488724 +0000 UTC m=+1544.258749012" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.951822 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.951803987 podStartE2EDuration="2.951803987s" podCreationTimestamp="2026-01-20 23:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:11.948545058 +0000 UTC m=+1544.268805366" watchObservedRunningTime="2026-01-20 23:01:11.951803987 +0000 UTC m=+1544.272064275" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.955244 5030 scope.go:117] "RemoveContainer" containerID="fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.959466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.973075 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b60400c-9868-401b-81b6-6e279fe14c5f" path="/var/lib/kubelet/pods/5b60400c-9868-401b-81b6-6e279fe14c5f/volumes" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.973724 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18e1433-ed65-4d7e-bc78-62681cbfdcae" path="/var/lib/kubelet/pods/b18e1433-ed65-4d7e-bc78-62681cbfdcae/volumes" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.979537 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.979672 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.979738 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kth6\" (UniqueName: \"kubernetes.io/projected/19c4076d-316a-436e-b7a8-ed9b5f942f0c-kube-api-access-2kth6\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.979795 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.980690 5030 scope.go:117] "RemoveContainer" containerID="ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2" Jan 20 23:01:11 crc kubenswrapper[5030]: I0120 23:01:11.984335 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data" (OuterVolumeSpecName: "config-data") pod "19c4076d-316a-436e-b7a8-ed9b5f942f0c" (UID: "19c4076d-316a-436e-b7a8-ed9b5f942f0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.007544 5030 scope.go:117] "RemoveContainer" containerID="1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.029357 5030 scope.go:117] "RemoveContainer" containerID="4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.029749 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c\": container with ID starting with 4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c not found: ID does not exist" containerID="4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.029779 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c"} err="failed to get container status \"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c\": rpc error: code = NotFound desc = could not find container \"4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c\": container with ID starting with 4607e5ae35f297ec38d67baba04586d4646282afea61cb310f8e921df3ea408c not found: ID does not exist" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.029818 5030 scope.go:117] "RemoveContainer" containerID="fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.030130 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e\": container with ID starting with fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e not found: ID does not exist" containerID="fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.030149 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e"} err="failed to get container status \"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e\": rpc error: code = NotFound desc = could not find container \"fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e\": container with ID starting with fcc3a0c29008ae396a861e080ef889b9da3b41c8a5c3d991bd4c17a8045f6e3e not found: ID does not exist" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.030161 5030 scope.go:117] "RemoveContainer" containerID="ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.030364 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2\": container with ID starting with ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2 not found: ID does not exist" containerID="ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.030381 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2"} err="failed to get container status \"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2\": rpc error: code = NotFound desc = could not find container \"ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2\": container with ID starting with ba8dba67d6db502a3953f80a2d5031043c4063dfff96429f3a4ca5b0816d87a2 not found: ID does not exist" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.030393 5030 scope.go:117] "RemoveContainer" containerID="1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.030597 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333\": container with ID starting with 1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333 not found: ID does not exist" containerID="1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.030613 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333"} err="failed to get container status \"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333\": rpc error: code = NotFound desc = could not find container \"1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333\": container with ID starting with 1f686e077c11dd76e21872c810c9be19eb982cdf94d00717d56de9cec79fd333 not found: ID does not exist" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.081131 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19c4076d-316a-436e-b7a8-ed9b5f942f0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.259756 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.285151 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.285480 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.299661 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.300237 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="proxy-httpd" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="proxy-httpd" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.300292 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-central-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300301 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-central-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.300322 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-notification-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300331 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-notification-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.300356 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="sg-core" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300365 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="sg-core" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300595 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-central-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300613 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="proxy-httpd" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300656 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="sg-core" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.300669 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" containerName="ceilometer-notification-agent" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.304375 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.309006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.309345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.309963 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.322962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390722 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.390927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nw4l\" (UniqueName: \"kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: E0120 23:01:12.393766 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19c4076d_316a_436e_b7a8_ed9b5f942f0c.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.491950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.491996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nw4l\" (UniqueName: \"kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492025 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.492905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.497566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.497729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.497919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.498605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.511703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.516137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nw4l\" (UniqueName: \"kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l\") pod \"ceilometer-0\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.627950 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.929714 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4868cb8-1f63-4955-b57d-0857ed6bf776" containerID="4e6f3c85c4b4f7f9e04d60f32cfc062d7db5f163d1d2c4e0e05067d0af71943d" exitCode=0 Jan 20 23:01:12 crc kubenswrapper[5030]: I0120 23:01:12.929921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f4868cb8-1f63-4955-b57d-0857ed6bf776","Type":"ContainerDied","Data":"4e6f3c85c4b4f7f9e04d60f32cfc062d7db5f163d1d2c4e0e05067d0af71943d"} Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.105391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.268491 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.305996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data\") pod \"f4868cb8-1f63-4955-b57d-0857ed6bf776\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.306082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle\") pod \"f4868cb8-1f63-4955-b57d-0857ed6bf776\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.306128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kshl5\" (UniqueName: \"kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5\") pod \"f4868cb8-1f63-4955-b57d-0857ed6bf776\" (UID: \"f4868cb8-1f63-4955-b57d-0857ed6bf776\") " Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.312505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5" (OuterVolumeSpecName: "kube-api-access-kshl5") pod "f4868cb8-1f63-4955-b57d-0857ed6bf776" (UID: "f4868cb8-1f63-4955-b57d-0857ed6bf776"). InnerVolumeSpecName "kube-api-access-kshl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.337719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data" (OuterVolumeSpecName: "config-data") pod "f4868cb8-1f63-4955-b57d-0857ed6bf776" (UID: "f4868cb8-1f63-4955-b57d-0857ed6bf776"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.340177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4868cb8-1f63-4955-b57d-0857ed6bf776" (UID: "f4868cb8-1f63-4955-b57d-0857ed6bf776"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.408324 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.408365 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4868cb8-1f63-4955-b57d-0857ed6bf776-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.408387 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kshl5\" (UniqueName: \"kubernetes.io/projected/f4868cb8-1f63-4955-b57d-0857ed6bf776-kube-api-access-kshl5\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.943508 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.943592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"f4868cb8-1f63-4955-b57d-0857ed6bf776","Type":"ContainerDied","Data":"f60798d2f5c573e2e40a540aec06095754e9757fe477925be060a6c4d9d7c39a"} Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.943957 5030 scope.go:117] "RemoveContainer" containerID="4e6f3c85c4b4f7f9e04d60f32cfc062d7db5f163d1d2c4e0e05067d0af71943d" Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.947041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerStarted","Data":"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60"} Jan 20 23:01:13 crc kubenswrapper[5030]: I0120 23:01:13.947106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerStarted","Data":"50f4823fb9fe76278d8c70257557f46de3a71c6bd75e15f0caf00c3a1c10860d"} Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.008865 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c4076d-316a-436e-b7a8-ed9b5f942f0c" path="/var/lib/kubelet/pods/19c4076d-316a-436e-b7a8-ed9b5f942f0c/volumes" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.012134 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.012164 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.012192 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:14 crc kubenswrapper[5030]: E0120 23:01:14.012525 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4868cb8-1f63-4955-b57d-0857ed6bf776" containerName="nova-scheduler-scheduler" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.012542 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4868cb8-1f63-4955-b57d-0857ed6bf776" containerName="nova-scheduler-scheduler" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.012716 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4868cb8-1f63-4955-b57d-0857ed6bf776" containerName="nova-scheduler-scheduler" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.013316 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.019970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.020854 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.123986 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.124275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.124748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sxv\" (UniqueName: \"kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.226154 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.226282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75sxv\" (UniqueName: \"kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.226345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.230466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.232661 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.244665 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sxv\" (UniqueName: \"kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv\") pod \"nova-scheduler-0\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.343359 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.809669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:14 crc kubenswrapper[5030]: W0120 23:01:14.812223 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6768bc62_1cf3_4135_b061_64f99a6d479f.slice/crio-9db41a651d6e71856ae95e325cbf750baa03e7d59df51c47a350ffa7cd62bc1d WatchSource:0}: Error finding container 9db41a651d6e71856ae95e325cbf750baa03e7d59df51c47a350ffa7cd62bc1d: Status 404 returned error can't find the container with id 9db41a651d6e71856ae95e325cbf750baa03e7d59df51c47a350ffa7cd62bc1d Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.959156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6768bc62-1cf3-4135-b061-64f99a6d479f","Type":"ContainerStarted","Data":"9db41a651d6e71856ae95e325cbf750baa03e7d59df51c47a350ffa7cd62bc1d"} Jan 20 23:01:14 crc kubenswrapper[5030]: I0120 23:01:14.964301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerStarted","Data":"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc"} Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.247344 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.265707 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.352003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.352242 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.970584 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4868cb8-1f63-4955-b57d-0857ed6bf776" path="/var/lib/kubelet/pods/f4868cb8-1f63-4955-b57d-0857ed6bf776/volumes" Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.974281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6768bc62-1cf3-4135-b061-64f99a6d479f","Type":"ContainerStarted","Data":"e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a"} Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.977227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerStarted","Data":"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d"} Jan 20 23:01:15 crc kubenswrapper[5030]: I0120 23:01:15.993167 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.004451 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.004434596 podStartE2EDuration="3.004434596s" podCreationTimestamp="2026-01-20 23:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:15.988494748 +0000 UTC m=+1548.308755066" watchObservedRunningTime="2026-01-20 23:01:16.004434596 +0000 UTC m=+1548.324694884" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.137393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf"] Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.138452 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.140591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.141462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.155800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf"] Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.167663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.167798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4zjn\" (UniqueName: \"kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.167904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.167932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.268951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4zjn\" (UniqueName: \"kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn\") pod 
\"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.269035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.269054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.269139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.273073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.275607 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.281835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.287585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4zjn\" (UniqueName: \"kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn\") pod \"nova-cell1-cell-mapping-ms7xf\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.459005 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.919212 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf"] Jan 20 23:01:16 crc kubenswrapper[5030]: I0120 23:01:16.994712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" event={"ID":"657d77bd-c7a3-441a-84dd-652ede942310","Type":"ContainerStarted","Data":"3610a01d4423971b728a5d1f3c721b8807b86cdee5096e265a002969c3ed8fa6"} Jan 20 23:01:17 crc kubenswrapper[5030]: I0120 23:01:17.000092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerStarted","Data":"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041"} Jan 20 23:01:17 crc kubenswrapper[5030]: I0120 23:01:17.027960 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.945466592 podStartE2EDuration="5.027943147s" podCreationTimestamp="2026-01-20 23:01:12 +0000 UTC" firstStartedPulling="2026-01-20 23:01:13.140584166 +0000 UTC m=+1545.460844454" lastFinishedPulling="2026-01-20 23:01:16.223060721 +0000 UTC m=+1548.543321009" observedRunningTime="2026-01-20 23:01:17.019796189 +0000 UTC m=+1549.340056517" watchObservedRunningTime="2026-01-20 23:01:17.027943147 +0000 UTC m=+1549.348203445" Jan 20 23:01:18 crc kubenswrapper[5030]: I0120 23:01:18.013724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" event={"ID":"657d77bd-c7a3-441a-84dd-652ede942310","Type":"ContainerStarted","Data":"62de685673aa44af6c5c901ff95c07029682edb3f840e756280af27d277bf5a7"} Jan 20 23:01:18 crc kubenswrapper[5030]: I0120 23:01:18.014464 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:18 crc kubenswrapper[5030]: I0120 23:01:18.601659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:01:18 crc kubenswrapper[5030]: I0120 23:01:18.635455 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" podStartSLOduration=2.635425765 podStartE2EDuration="2.635425765s" podCreationTimestamp="2026-01-20 23:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:18.029711479 +0000 UTC m=+1550.349971787" watchObservedRunningTime="2026-01-20 23:01:18.635425765 +0000 UTC m=+1550.955686093" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.343719 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.611546 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.614286 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.623922 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.735707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.735783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwpn\" (UniqueName: \"kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.735877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.837793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.837980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.838035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwpn\" (UniqueName: \"kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.839034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.839356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.860180 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mjwpn\" (UniqueName: \"kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn\") pod \"redhat-marketplace-s9zt2\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:19 crc kubenswrapper[5030]: I0120 23:01:19.939415 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:20 crc kubenswrapper[5030]: I0120 23:01:20.351589 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:20 crc kubenswrapper[5030]: I0120 23:01:20.352815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:20 crc kubenswrapper[5030]: I0120 23:01:20.373566 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:20 crc kubenswrapper[5030]: I0120 23:01:20.373598 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:21 crc kubenswrapper[5030]: I0120 23:01:21.366802 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.25:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:21 crc kubenswrapper[5030]: I0120 23:01:21.366904 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.25:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:21 crc kubenswrapper[5030]: I0120 23:01:21.384175 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:21 crc kubenswrapper[5030]: W0120 23:01:21.391837 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e2d3aa6_7b93_41d0_aa60_64eaab9a15eb.slice/crio-3a2867dc072261bf5275a867a451dce85c53867a454d724ad5869ab7f1c07c3f WatchSource:0}: Error finding container 3a2867dc072261bf5275a867a451dce85c53867a454d724ad5869ab7f1c07c3f: Status 404 returned error can't find the container with id 3a2867dc072261bf5275a867a451dce85c53867a454d724ad5869ab7f1c07c3f Jan 20 23:01:21 crc kubenswrapper[5030]: I0120 23:01:21.457082 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.26:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:21 crc kubenswrapper[5030]: I0120 23:01:21.457110 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.26:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:22 crc kubenswrapper[5030]: I0120 23:01:22.100487 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerID="85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d" exitCode=0 Jan 20 23:01:22 crc kubenswrapper[5030]: I0120 23:01:22.100567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerDied","Data":"85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d"} Jan 20 23:01:22 crc kubenswrapper[5030]: I0120 23:01:22.100594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerStarted","Data":"3a2867dc072261bf5275a867a451dce85c53867a454d724ad5869ab7f1c07c3f"} Jan 20 23:01:22 crc kubenswrapper[5030]: I0120 23:01:22.104542 5030 generic.go:334] "Generic (PLEG): container finished" podID="657d77bd-c7a3-441a-84dd-652ede942310" containerID="62de685673aa44af6c5c901ff95c07029682edb3f840e756280af27d277bf5a7" exitCode=0 Jan 20 23:01:22 crc kubenswrapper[5030]: I0120 23:01:22.104590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" event={"ID":"657d77bd-c7a3-441a-84dd-652ede942310","Type":"ContainerDied","Data":"62de685673aa44af6c5c901ff95c07029682edb3f840e756280af27d277bf5a7"} Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.118113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerStarted","Data":"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5"} Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.469774 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.529163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle\") pod \"657d77bd-c7a3-441a-84dd-652ede942310\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.529305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data\") pod \"657d77bd-c7a3-441a-84dd-652ede942310\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.529376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4zjn\" (UniqueName: \"kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn\") pod \"657d77bd-c7a3-441a-84dd-652ede942310\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.529405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts\") pod \"657d77bd-c7a3-441a-84dd-652ede942310\" (UID: \"657d77bd-c7a3-441a-84dd-652ede942310\") " Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.536355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts" (OuterVolumeSpecName: "scripts") pod "657d77bd-c7a3-441a-84dd-652ede942310" (UID: "657d77bd-c7a3-441a-84dd-652ede942310"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.538074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn" (OuterVolumeSpecName: "kube-api-access-f4zjn") pod "657d77bd-c7a3-441a-84dd-652ede942310" (UID: "657d77bd-c7a3-441a-84dd-652ede942310"). InnerVolumeSpecName "kube-api-access-f4zjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.574807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data" (OuterVolumeSpecName: "config-data") pod "657d77bd-c7a3-441a-84dd-652ede942310" (UID: "657d77bd-c7a3-441a-84dd-652ede942310"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.580902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "657d77bd-c7a3-441a-84dd-652ede942310" (UID: "657d77bd-c7a3-441a-84dd-652ede942310"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.632152 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.632205 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.632219 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4zjn\" (UniqueName: \"kubernetes.io/projected/657d77bd-c7a3-441a-84dd-652ede942310-kube-api-access-f4zjn\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:23 crc kubenswrapper[5030]: I0120 23:01:23.632234 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657d77bd-c7a3-441a-84dd-652ede942310-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.135329 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerID="ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5" exitCode=0 Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.135439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerDied","Data":"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5"} Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.139856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" event={"ID":"657d77bd-c7a3-441a-84dd-652ede942310","Type":"ContainerDied","Data":"3610a01d4423971b728a5d1f3c721b8807b86cdee5096e265a002969c3ed8fa6"} Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.139914 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3610a01d4423971b728a5d1f3c721b8807b86cdee5096e265a002969c3ed8fa6" Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.139994 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf" Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.343720 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.396205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.423179 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.423556 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-log" containerID="cri-o://6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e" gracePeriod=30 Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.423718 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-api" containerID="cri-o://df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0" gracePeriod=30 Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.439815 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.511528 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.511799 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-log" containerID="cri-o://fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def" gracePeriod=30 Jan 20 23:01:24 crc kubenswrapper[5030]: I0120 23:01:24.511953 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-metadata" containerID="cri-o://9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8" gracePeriod=30 Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.151826 5030 generic.go:334] "Generic (PLEG): container finished" podID="91a29aab-8344-44a5-a736-6ec987b286e7" containerID="fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def" exitCode=143 Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.151886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerDied","Data":"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def"} Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.154223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerStarted","Data":"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf"} Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.156078 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerID="6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e" exitCode=143 Jan 20 23:01:25 crc 
kubenswrapper[5030]: I0120 23:01:25.156130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerDied","Data":"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e"} Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.171708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s9zt2" podStartSLOduration=3.415885507 podStartE2EDuration="6.171690329s" podCreationTimestamp="2026-01-20 23:01:19 +0000 UTC" firstStartedPulling="2026-01-20 23:01:22.10379858 +0000 UTC m=+1554.424058878" lastFinishedPulling="2026-01-20 23:01:24.859603372 +0000 UTC m=+1557.179863700" observedRunningTime="2026-01-20 23:01:25.169465295 +0000 UTC m=+1557.489725583" watchObservedRunningTime="2026-01-20 23:01:25.171690329 +0000 UTC m=+1557.491950617" Jan 20 23:01:25 crc kubenswrapper[5030]: I0120 23:01:25.185220 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:26 crc kubenswrapper[5030]: I0120 23:01:26.165191 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerName="nova-scheduler-scheduler" containerID="cri-o://e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" gracePeriod=30 Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.098722 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.105669 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.187852 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerID="df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0" exitCode=0 Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.187924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerDied","Data":"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0"} Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.187951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"0e850b76-b34f-44d5-8723-ce657ebf49b6","Type":"ContainerDied","Data":"37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de"} Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.187972 5030 scope.go:117] "RemoveContainer" containerID="df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.188140 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.191187 5030 generic.go:334] "Generic (PLEG): container finished" podID="91a29aab-8344-44a5-a736-6ec987b286e7" containerID="9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8" exitCode=0 Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.191267 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.191265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerDied","Data":"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8"} Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.191648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"91a29aab-8344-44a5-a736-6ec987b286e7","Type":"ContainerDied","Data":"d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613"} Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.210490 5030 scope.go:117] "RemoveContainer" containerID="6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.230917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9prsg\" (UniqueName: \"kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg\") pod \"91a29aab-8344-44a5-a736-6ec987b286e7\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.230991 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data\") pod \"91a29aab-8344-44a5-a736-6ec987b286e7\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle\") pod \"91a29aab-8344-44a5-a736-6ec987b286e7\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs\") pod \"91a29aab-8344-44a5-a736-6ec987b286e7\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle\") pod \"0e850b76-b34f-44d5-8723-ce657ebf49b6\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs\") pod \"0e850b76-b34f-44d5-8723-ce657ebf49b6\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data\") pod \"0e850b76-b34f-44d5-8723-ce657ebf49b6\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs\") pod \"91a29aab-8344-44a5-a736-6ec987b286e7\" (UID: \"91a29aab-8344-44a5-a736-6ec987b286e7\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.231409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jc8b\" (UniqueName: \"kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b\") pod \"0e850b76-b34f-44d5-8723-ce657ebf49b6\" (UID: \"0e850b76-b34f-44d5-8723-ce657ebf49b6\") " Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.233082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs" (OuterVolumeSpecName: "logs") pod "91a29aab-8344-44a5-a736-6ec987b286e7" (UID: "91a29aab-8344-44a5-a736-6ec987b286e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.234212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs" (OuterVolumeSpecName: "logs") pod "0e850b76-b34f-44d5-8723-ce657ebf49b6" (UID: "0e850b76-b34f-44d5-8723-ce657ebf49b6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.240052 5030 scope.go:117] "RemoveContainer" containerID="df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.249303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg" (OuterVolumeSpecName: "kube-api-access-9prsg") pod "91a29aab-8344-44a5-a736-6ec987b286e7" (UID: "91a29aab-8344-44a5-a736-6ec987b286e7"). InnerVolumeSpecName "kube-api-access-9prsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.249442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b" (OuterVolumeSpecName: "kube-api-access-2jc8b") pod "0e850b76-b34f-44d5-8723-ce657ebf49b6" (UID: "0e850b76-b34f-44d5-8723-ce657ebf49b6"). InnerVolumeSpecName "kube-api-access-2jc8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.249387 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0\": container with ID starting with df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0 not found: ID does not exist" containerID="df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.249514 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0"} err="failed to get container status \"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0\": rpc error: code = NotFound desc = could not find container \"df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0\": container with ID starting with df0e53ae7397bf5f9a81af2d566612f2c4ea85843b5446ce373ddc9fc56834e0 not found: ID does not exist" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.249543 5030 scope.go:117] "RemoveContainer" containerID="6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.249976 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e\": container with ID starting with 6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e not found: ID does not exist" containerID="6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.250009 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e"} err="failed to get container status \"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e\": rpc error: code = NotFound desc = could not find container \"6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e\": container with ID starting with 6f4d1ad6fde510fd8991f49aa2ffc3bec3865f2e86c3443863810760a596606e not found: ID does not exist" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.250062 5030 scope.go:117] "RemoveContainer" containerID="9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.268081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data" (OuterVolumeSpecName: "config-data") pod "91a29aab-8344-44a5-a736-6ec987b286e7" (UID: "91a29aab-8344-44a5-a736-6ec987b286e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.272107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data" (OuterVolumeSpecName: "config-data") pod "0e850b76-b34f-44d5-8723-ce657ebf49b6" (UID: "0e850b76-b34f-44d5-8723-ce657ebf49b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.274069 5030 scope.go:117] "RemoveContainer" containerID="fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.277524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91a29aab-8344-44a5-a736-6ec987b286e7" (UID: "91a29aab-8344-44a5-a736-6ec987b286e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.279893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e850b76-b34f-44d5-8723-ce657ebf49b6" (UID: "0e850b76-b34f-44d5-8723-ce657ebf49b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.295183 5030 scope.go:117] "RemoveContainer" containerID="9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.295589 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8\": container with ID starting with 9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8 not found: ID does not exist" containerID="9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.295673 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8"} err="failed to get container status \"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8\": rpc error: code = NotFound desc = could not find container \"9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8\": container with ID starting with 9019c6e3de8e3b5c6c69f545d75528a3a3d50b6bd8120858c42802af8687d3e8 not found: ID does not exist" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.295708 5030 scope.go:117] "RemoveContainer" containerID="fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.296028 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def\": container with ID starting with fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def not found: ID does not exist" containerID="fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.296056 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def"} err="failed to get container status \"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def\": rpc error: code = NotFound desc = could not find container \"fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def\": container with ID starting with 
fefd41374a528bcf03ae16c1857115655d70febcbd2986de16707ae8901a4def not found: ID does not exist" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.308144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "91a29aab-8344-44a5-a736-6ec987b286e7" (UID: "91a29aab-8344-44a5-a736-6ec987b286e7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333364 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333395 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333407 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jc8b\" (UniqueName: \"kubernetes.io/projected/0e850b76-b34f-44d5-8723-ce657ebf49b6-kube-api-access-2jc8b\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333416 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9prsg\" (UniqueName: \"kubernetes.io/projected/91a29aab-8344-44a5-a736-6ec987b286e7-kube-api-access-9prsg\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333425 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333434 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a29aab-8344-44a5-a736-6ec987b286e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333442 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a29aab-8344-44a5-a736-6ec987b286e7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333452 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e850b76-b34f-44d5-8723-ce657ebf49b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.333461 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e850b76-b34f-44d5-8723-ce657ebf49b6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.544632 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.559485 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.568431 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.585485 5030 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593333 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.593772 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657d77bd-c7a3-441a-84dd-652ede942310" containerName="nova-manage" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593790 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="657d77bd-c7a3-441a-84dd-652ede942310" containerName="nova-manage" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.593806 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-log" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593813 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-log" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.593826 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-metadata" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593833 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-metadata" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.593846 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-log" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593852 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-log" Jan 20 23:01:28 crc kubenswrapper[5030]: E0120 23:01:28.593866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-api" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.593872 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-api" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.594045 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-log" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.594062 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-log" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.594071 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="657d77bd-c7a3-441a-84dd-652ede942310" containerName="nova-manage" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.594086 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" containerName="nova-api-api" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.594093 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" containerName="nova-metadata-metadata" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.595168 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.597441 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.601572 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.604138 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.615168 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.618441 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.621991 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.634650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.653026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.653322 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6qt\" (UniqueName: \"kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.653393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.653424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760337 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc 
kubenswrapper[5030]: I0120 23:01:28.760412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xczwn\" (UniqueName: \"kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760482 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6qt\" (UniqueName: \"kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760889 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.760951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.766338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.768601 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.785666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6qt\" (UniqueName: \"kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt\") pod \"nova-api-0\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.862663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.862740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.862826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xczwn\" (UniqueName: \"kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.862856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.862959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.863109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.866787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.867464 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc 
kubenswrapper[5030]: I0120 23:01:28.868345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.878579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xczwn\" (UniqueName: \"kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn\") pod \"nova-metadata-0\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.917977 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:28 crc kubenswrapper[5030]: I0120 23:01:28.937743 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:29 crc kubenswrapper[5030]: E0120 23:01:29.345679 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:01:29 crc kubenswrapper[5030]: E0120 23:01:29.347027 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:01:29 crc kubenswrapper[5030]: E0120 23:01:29.348513 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:01:29 crc kubenswrapper[5030]: E0120 23:01:29.348559 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerName="nova-scheduler-scheduler" Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.441690 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.512780 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.939581 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.940065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.974075 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e850b76-b34f-44d5-8723-ce657ebf49b6" 
path="/var/lib/kubelet/pods/0e850b76-b34f-44d5-8723-ce657ebf49b6/volumes" Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.975010 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a29aab-8344-44a5-a736-6ec987b286e7" path="/var/lib/kubelet/pods/91a29aab-8344-44a5-a736-6ec987b286e7/volumes" Jan 20 23:01:29 crc kubenswrapper[5030]: I0120 23:01:29.987504 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.214451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerStarted","Data":"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.214518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerStarted","Data":"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.214531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerStarted","Data":"61459a67070b863fb3d1eb9dcc2f1640425c4e2159637480d72683645d32d93f"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.216158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerStarted","Data":"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.216189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerStarted","Data":"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.216205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerStarted","Data":"0527e235ad5746b0d3f7ad974aaaa0cc09a08a98a1320e11e230a0bfbda88d68"} Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.239995 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.239978258 podStartE2EDuration="2.239978258s" podCreationTimestamp="2026-01-20 23:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:30.231231785 +0000 UTC m=+1562.551492073" watchObservedRunningTime="2026-01-20 23:01:30.239978258 +0000 UTC m=+1562.560238546" Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.257954 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.257934045 podStartE2EDuration="2.257934045s" podCreationTimestamp="2026-01-20 23:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:30.250138465 +0000 UTC m=+1562.570398763" watchObservedRunningTime="2026-01-20 23:01:30.257934045 +0000 UTC m=+1562.578194343" Jan 20 23:01:30 crc 
kubenswrapper[5030]: I0120 23:01:30.278701 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:30 crc kubenswrapper[5030]: I0120 23:01:30.329520 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.241173 5030 generic.go:334] "Generic (PLEG): container finished" podID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerID="e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" exitCode=0 Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.241231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6768bc62-1cf3-4135-b061-64f99a6d479f","Type":"ContainerDied","Data":"e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a"} Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.456289 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.515785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle\") pod \"6768bc62-1cf3-4135-b061-64f99a6d479f\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.515985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data\") pod \"6768bc62-1cf3-4135-b061-64f99a6d479f\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.516084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75sxv\" (UniqueName: \"kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv\") pod \"6768bc62-1cf3-4135-b061-64f99a6d479f\" (UID: \"6768bc62-1cf3-4135-b061-64f99a6d479f\") " Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.524761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv" (OuterVolumeSpecName: "kube-api-access-75sxv") pod "6768bc62-1cf3-4135-b061-64f99a6d479f" (UID: "6768bc62-1cf3-4135-b061-64f99a6d479f"). InnerVolumeSpecName "kube-api-access-75sxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.561697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6768bc62-1cf3-4135-b061-64f99a6d479f" (UID: "6768bc62-1cf3-4135-b061-64f99a6d479f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.575637 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data" (OuterVolumeSpecName: "config-data") pod "6768bc62-1cf3-4135-b061-64f99a6d479f" (UID: "6768bc62-1cf3-4135-b061-64f99a6d479f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.627240 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.627309 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75sxv\" (UniqueName: \"kubernetes.io/projected/6768bc62-1cf3-4135-b061-64f99a6d479f-kube-api-access-75sxv\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:31 crc kubenswrapper[5030]: I0120 23:01:31.627330 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6768bc62-1cf3-4135-b061-64f99a6d479f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.251856 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s9zt2" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="registry-server" containerID="cri-o://27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf" gracePeriod=2 Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.252151 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.252493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6768bc62-1cf3-4135-b061-64f99a6d479f","Type":"ContainerDied","Data":"9db41a651d6e71856ae95e325cbf750baa03e7d59df51c47a350ffa7cd62bc1d"} Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.252559 5030 scope.go:117] "RemoveContainer" containerID="e372e9bfd9f6333c1bc02350302286900a50f57a6abe1f9789a879147361734a" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.277150 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.286403 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.303812 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:32 crc kubenswrapper[5030]: E0120 23:01:32.304251 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerName="nova-scheduler-scheduler" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.304272 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerName="nova-scheduler-scheduler" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.304510 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" containerName="nova-scheduler-scheduler" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.305196 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.311596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.315823 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.444477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.444519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.444547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2xb\" (UniqueName: \"kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.546820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.547118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.547159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2xb\" (UniqueName: \"kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.555716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.559312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.577112 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2xb\" (UniqueName: \"kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb\") pod \"nova-scheduler-0\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.704225 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.726146 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.851491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwpn\" (UniqueName: \"kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn\") pod \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.851838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content\") pod \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.851928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities\") pod \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\" (UID: \"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb\") " Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.855367 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn" (OuterVolumeSpecName: "kube-api-access-mjwpn") pod "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" (UID: "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb"). InnerVolumeSpecName "kube-api-access-mjwpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.862380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities" (OuterVolumeSpecName: "utilities") pod "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" (UID: "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.872457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" (UID: "0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:32 crc kubenswrapper[5030]: E0120 23:01:32.881230 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache]" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.955053 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.955135 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjwpn\" (UniqueName: \"kubernetes.io/projected/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-kube-api-access-mjwpn\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:32 crc kubenswrapper[5030]: I0120 23:01:32.955167 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.035882 5030 scope.go:117] "RemoveContainer" containerID="1899dc7597c55dc2c6cfc75ca888715f8d68e0856cf45f3f65ffe82ffacc10d0" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.095597 5030 scope.go:117] "RemoveContainer" containerID="616aa75fd9e2ad8fd790729e288a6a541ddb55a66a96677982b3af6e2ceb68f2" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.121742 5030 scope.go:117] "RemoveContainer" containerID="ecc31cd1a34d8f9168d6e06983ace5d6bc419548b4bce4c1012faf388f418b36" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.175365 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.181090 5030 scope.go:117] "RemoveContainer" containerID="66e620a76ff8286c5dc7276c1571b2da8a7c84757e27d0bad05d3e967a8f9819" Jan 20 23:01:33 crc kubenswrapper[5030]: W0120 23:01:33.188365 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2071f00e_fdae_400c_bec2_f9092faef33d.slice/crio-195c07e9207c0d526a9d755e7b0bfe80dcdf587a2a21c54a61240f8f0a999522 WatchSource:0}: Error finding container 195c07e9207c0d526a9d755e7b0bfe80dcdf587a2a21c54a61240f8f0a999522: Status 404 returned error can't find the container with id 195c07e9207c0d526a9d755e7b0bfe80dcdf587a2a21c54a61240f8f0a999522 Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.256047 5030 scope.go:117] "RemoveContainer" containerID="489a2af29b8eb0e259083f0edb59002efc22e0b485becab088c05419badfc6bd" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.273506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2071f00e-fdae-400c-bec2-f9092faef33d","Type":"ContainerStarted","Data":"195c07e9207c0d526a9d755e7b0bfe80dcdf587a2a21c54a61240f8f0a999522"} Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.277009 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerID="27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf" exitCode=0 Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.277145 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s9zt2" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.277101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerDied","Data":"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf"} Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.277420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s9zt2" event={"ID":"0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb","Type":"ContainerDied","Data":"3a2867dc072261bf5275a867a451dce85c53867a454d724ad5869ab7f1c07c3f"} Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.277446 5030 scope.go:117] "RemoveContainer" containerID="27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.298368 5030 scope.go:117] "RemoveContainer" containerID="931e42e4636ed8c2ed4c71dffe3b6456f856ff06fb0a66e1ed6b1e2aef0b546c" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.315389 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.324155 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s9zt2"] Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.329505 5030 scope.go:117] "RemoveContainer" containerID="ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.357204 5030 scope.go:117] "RemoveContainer" containerID="bbd90196bdd3a4a47bb0c4c568ffffbec0595ca6ecfec4c5cd5a3dbc6608e462" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.376749 5030 scope.go:117] "RemoveContainer" containerID="85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.410854 5030 scope.go:117] "RemoveContainer" containerID="27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf" Jan 20 23:01:33 crc kubenswrapper[5030]: E0120 23:01:33.411443 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf\": container with ID starting with 27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf not found: ID does not exist" containerID="27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.411497 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf"} err="failed to get container status \"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf\": rpc error: code = NotFound desc = could not find container \"27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf\": container with ID starting with 27e0995842d568abe5153f73aa0a781e635d3a102263863d7497e11decd590cf not found: ID does not exist" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.411535 5030 scope.go:117] "RemoveContainer" 
containerID="ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5" Jan 20 23:01:33 crc kubenswrapper[5030]: E0120 23:01:33.411985 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5\": container with ID starting with ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5 not found: ID does not exist" containerID="ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.412097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5"} err="failed to get container status \"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5\": rpc error: code = NotFound desc = could not find container \"ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5\": container with ID starting with ebf02f9fd0439ea98f802bd91276eafca210d4f05ef8d7718d27101c1d91aae5 not found: ID does not exist" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.412215 5030 scope.go:117] "RemoveContainer" containerID="85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d" Jan 20 23:01:33 crc kubenswrapper[5030]: E0120 23:01:33.412633 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d\": container with ID starting with 85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d not found: ID does not exist" containerID="85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.412673 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d"} err="failed to get container status \"85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d\": rpc error: code = NotFound desc = could not find container \"85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d\": container with ID starting with 85ab71d8b7ed484bc88f91053985f98b08119343be39f412b7f1d9838affad1d not found: ID does not exist" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.938646 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.938702 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.991770 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" path="/var/lib/kubelet/pods/0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb/volumes" Jan 20 23:01:33 crc kubenswrapper[5030]: I0120 23:01:33.994140 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6768bc62-1cf3-4135-b061-64f99a6d479f" path="/var/lib/kubelet/pods/6768bc62-1cf3-4135-b061-64f99a6d479f/volumes" Jan 20 23:01:34 crc kubenswrapper[5030]: I0120 23:01:34.297401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2071f00e-fdae-400c-bec2-f9092faef33d","Type":"ContainerStarted","Data":"765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600"} Jan 20 
23:01:34 crc kubenswrapper[5030]: I0120 23:01:34.326687 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.326670635 podStartE2EDuration="2.326670635s" podCreationTimestamp="2026-01-20 23:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:34.323440075 +0000 UTC m=+1566.643700443" watchObservedRunningTime="2026-01-20 23:01:34.326670635 +0000 UTC m=+1566.646930923" Jan 20 23:01:37 crc kubenswrapper[5030]: I0120 23:01:37.727978 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:38 crc kubenswrapper[5030]: I0120 23:01:38.918369 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:38 crc kubenswrapper[5030]: I0120 23:01:38.919356 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:38 crc kubenswrapper[5030]: I0120 23:01:38.938589 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:38 crc kubenswrapper[5030]: I0120 23:01:38.938683 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.018800 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.32:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.018851 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.32:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.018810 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.31:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.018810 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.31:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.157664 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:01:40 crc kubenswrapper[5030]: I0120 23:01:40.157718 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:01:42 crc kubenswrapper[5030]: I0120 23:01:42.642539 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:42 crc kubenswrapper[5030]: I0120 23:01:42.727923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:42 crc kubenswrapper[5030]: I0120 23:01:42.754233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:43 crc kubenswrapper[5030]: E0120 23:01:43.114202 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache]" Jan 20 23:01:43 crc kubenswrapper[5030]: I0120 23:01:43.451899 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.926582 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.927171 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.928056 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.928284 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.933891 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.935928 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.950337 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.962551 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:48 crc kubenswrapper[5030]: I0120 23:01:48.963220 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:49 crc kubenswrapper[5030]: I0120 23:01:49.487183 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:01:50 crc kubenswrapper[5030]: I0120 23:01:50.738341 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:50 crc kubenswrapper[5030]: I0120 
23:01:50.740250 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-central-agent" containerID="cri-o://0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60" gracePeriod=30 Jan 20 23:01:50 crc kubenswrapper[5030]: I0120 23:01:50.740396 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="proxy-httpd" containerID="cri-o://fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041" gracePeriod=30 Jan 20 23:01:50 crc kubenswrapper[5030]: I0120 23:01:50.740447 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="sg-core" containerID="cri-o://80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d" gracePeriod=30 Jan 20 23:01:50 crc kubenswrapper[5030]: I0120 23:01:50.740485 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-notification-agent" containerID="cri-o://2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc" gracePeriod=30 Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.501986 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerID="fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041" exitCode=0 Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.502473 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerID="80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d" exitCode=2 Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.502506 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerID="0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60" exitCode=0 Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.502068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerDied","Data":"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041"} Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.502588 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerDied","Data":"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d"} Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.502603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerDied","Data":"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60"} Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.616736 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.616998 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-log" 
containerID="cri-o://7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01" gracePeriod=30 Jan 20 23:01:51 crc kubenswrapper[5030]: I0120 23:01:51.617081 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-api" containerID="cri-o://b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d" gracePeriod=30 Jan 20 23:01:52 crc kubenswrapper[5030]: I0120 23:01:52.511600 5030 generic.go:334] "Generic (PLEG): container finished" podID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerID="7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01" exitCode=143 Jan 20 23:01:52 crc kubenswrapper[5030]: I0120 23:01:52.511645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerDied","Data":"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01"} Jan 20 23:01:53 crc kubenswrapper[5030]: E0120 23:01:53.315657 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache]" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.272299 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.418185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs\") pod \"873931fd-cdee-4177-b8f2-4c8063bd0626\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.418254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle\") pod \"873931fd-cdee-4177-b8f2-4c8063bd0626\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.418322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st6qt\" (UniqueName: \"kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt\") pod \"873931fd-cdee-4177-b8f2-4c8063bd0626\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.418519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data\") pod \"873931fd-cdee-4177-b8f2-4c8063bd0626\" (UID: \"873931fd-cdee-4177-b8f2-4c8063bd0626\") " Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.420267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs" (OuterVolumeSpecName: "logs") pod "873931fd-cdee-4177-b8f2-4c8063bd0626" (UID: "873931fd-cdee-4177-b8f2-4c8063bd0626"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.424103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt" (OuterVolumeSpecName: "kube-api-access-st6qt") pod "873931fd-cdee-4177-b8f2-4c8063bd0626" (UID: "873931fd-cdee-4177-b8f2-4c8063bd0626"). InnerVolumeSpecName "kube-api-access-st6qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.444029 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data" (OuterVolumeSpecName: "config-data") pod "873931fd-cdee-4177-b8f2-4c8063bd0626" (UID: "873931fd-cdee-4177-b8f2-4c8063bd0626"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.445173 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "873931fd-cdee-4177-b8f2-4c8063bd0626" (UID: "873931fd-cdee-4177-b8f2-4c8063bd0626"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.520684 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.520918 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873931fd-cdee-4177-b8f2-4c8063bd0626-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.520995 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873931fd-cdee-4177-b8f2-4c8063bd0626-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.521067 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st6qt\" (UniqueName: \"kubernetes.io/projected/873931fd-cdee-4177-b8f2-4c8063bd0626-kube-api-access-st6qt\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.566717 5030 generic.go:334] "Generic (PLEG): container finished" podID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerID="b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d" exitCode=0 Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.566766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerDied","Data":"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d"} Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.566808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"873931fd-cdee-4177-b8f2-4c8063bd0626","Type":"ContainerDied","Data":"0527e235ad5746b0d3f7ad974aaaa0cc09a08a98a1320e11e230a0bfbda88d68"} Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.566840 5030 scope.go:117] "RemoveContainer" containerID="b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.567186 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.591214 5030 scope.go:117] "RemoveContainer" containerID="7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.613899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.614577 5030 scope.go:117] "RemoveContainer" containerID="b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.615064 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d\": container with ID starting with b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d not found: ID does not exist" containerID="b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.615102 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d"} err="failed to get container status \"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d\": rpc error: code = NotFound desc = could not find container \"b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d\": container with ID starting with b0cf8625792b65b5a4afb9498b633cef35873c125bcd0f848566c3c6f677743d not found: ID does not exist" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.615127 5030 scope.go:117] "RemoveContainer" containerID="7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.615605 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01\": container with ID starting with 7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01 not found: ID does not exist" containerID="7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.615662 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01"} err="failed to get container status \"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01\": rpc error: code = NotFound desc = could not find container \"7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01\": container with ID starting with 7eac10302ce3014cb84b4e2d90a80c0f1efddbca4b19c6dca6c7bc2cbe57ef01 not found: ID does not exist" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.628653 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.641566 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.642495 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="registry-server" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.642586 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" 
containerName="registry-server" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.642676 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="extract-content" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.642739 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="extract-content" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.642824 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="extract-utilities" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.642889 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="extract-utilities" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.642965 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-api" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.643024 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-api" Jan 20 23:01:55 crc kubenswrapper[5030]: E0120 23:01:55.643087 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-log" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.643144 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-log" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.643539 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-log" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.643647 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" containerName="nova-api-api" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.643744 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2d3aa6-7b93-41d0-aa60-64eaab9a15eb" containerName="registry-server" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.645145 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.647042 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.647106 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.647369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.649193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724832 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.724909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkpq\" (UniqueName: \"kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.826818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.826881 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.826959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.827003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.827059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.827142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkpq\" (UniqueName: \"kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.827586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.830886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.830937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.831259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.832498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.843311 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5hkpq\" (UniqueName: \"kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq\") pod \"nova-api-0\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.972415 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873931fd-cdee-4177-b8f2-4c8063bd0626" path="/var/lib/kubelet/pods/873931fd-cdee-4177-b8f2-4c8063bd0626/volumes" Jan 20 23:01:55 crc kubenswrapper[5030]: I0120 23:01:55.974308 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:01:56 crc kubenswrapper[5030]: I0120 23:01:56.434680 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:01:56 crc kubenswrapper[5030]: I0120 23:01:56.578972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerStarted","Data":"ea8361cea39466f58a0e0ba4e7a8fb1d29c3b50db4beee685da502dc82042523"} Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.054065 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150417 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150632 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nw4l\" (UniqueName: \"kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150672 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.150689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs\") pod \"d9f09bef-5dba-4628-adf3-c440c793dd50\" (UID: \"d9f09bef-5dba-4628-adf3-c440c793dd50\") " Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.151279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.151534 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.156764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l" (OuterVolumeSpecName: "kube-api-access-9nw4l") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "kube-api-access-9nw4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.157477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts" (OuterVolumeSpecName: "scripts") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.183481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.198307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.250325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data" (OuterVolumeSpecName: "config-data") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252430 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nw4l\" (UniqueName: \"kubernetes.io/projected/d9f09bef-5dba-4628-adf3-c440c793dd50-kube-api-access-9nw4l\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252469 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252484 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252496 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252509 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252520 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.252530 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f09bef-5dba-4628-adf3-c440c793dd50-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.260515 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9f09bef-5dba-4628-adf3-c440c793dd50" (UID: "d9f09bef-5dba-4628-adf3-c440c793dd50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.353982 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f09bef-5dba-4628-adf3-c440c793dd50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394286 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.394608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="proxy-httpd" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394716 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="proxy-httpd" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.394747 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-notification-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394755 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-notification-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.394764 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="sg-core" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394771 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="sg-core" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.394783 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-central-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394789 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-central-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-central-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394951 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="sg-core" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="ceilometer-notification-agent" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.394971 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerName="proxy-httpd" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.395519 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.396892 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncontroller-ovncontroller-dockercfg-glmxz" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.397290 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-scripts" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.400417 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovncontroller-ovndbs" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.403999 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.441634 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.443711 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.457850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.457915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.457993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.458023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.458038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w8vb\" (UniqueName: \"kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.458081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " 
pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.458101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.464489 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.466091 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.471516 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-metrics-config" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.499850 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.534488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w8vb\" (UniqueName: \"kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h9scc\" (UniqueName: \"kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563558 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.563582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66h7t\" (UniqueName: \"kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.565443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.565514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.565540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.567297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.568782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.569555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.586412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w8vb\" (UniqueName: \"kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb\") pod \"ovn-controller-nzw2k\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.604654 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9f09bef-5dba-4628-adf3-c440c793dd50" containerID="2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc" exitCode=0 Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.604742 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.604761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerDied","Data":"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc"} Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.604864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d9f09bef-5dba-4628-adf3-c440c793dd50","Type":"ContainerDied","Data":"50f4823fb9fe76278d8c70257557f46de3a71c6bd75e15f0caf00c3a1c10860d"} Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.604895 5030 scope.go:117] "RemoveContainer" containerID="fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.608149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerStarted","Data":"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71"} Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.608180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerStarted","Data":"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0"} Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.637049 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.637029012 podStartE2EDuration="2.637029012s" podCreationTimestamp="2026-01-20 23:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:57.629248014 +0000 UTC m=+1589.949508302" watchObservedRunningTime="2026-01-20 23:01:57.637029012 +0000 UTC m=+1589.957289300" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.645091 5030 scope.go:117] "RemoveContainer" 
containerID="80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.646510 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.656329 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665415 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9scc\" (UniqueName: \"kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc 
kubenswrapper[5030]: I0120 23:01:57.665545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.665630 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66h7t\" (UniqueName: \"kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.667101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.667225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.667348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.667459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.668217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.668383 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.668482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.672358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.679118 5030 scope.go:117] "RemoveContainer" containerID="2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.680694 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.683786 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.685699 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.687658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.687800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.697419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.701342 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.702952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9scc\" (UniqueName: \"kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc\") pod \"ovn-controller-ovs-tnb66\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.704122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66h7t\" (UniqueName: \"kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.705250 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle\") pod \"ovn-controller-metrics-qdt54\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.713438 5030 scope.go:117] "RemoveContainer" containerID="0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.713842 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.758863 5030 scope.go:117] "RemoveContainer" containerID="fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.759447 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041\": container with ID starting with fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041 not found: ID does not exist" containerID="fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.759476 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041"} err="failed to get container status \"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041\": rpc error: code = NotFound desc = could not find container \"fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041\": container with ID starting with fd64a3d0252666a93d9d1778117f9917ec05f69b515ee10b13d1ac7146aea041 not found: ID does not exist" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.759500 5030 scope.go:117] "RemoveContainer" containerID="80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.759990 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d\": container with ID starting with 80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d not found: ID does not exist" containerID="80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.760028 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d"} err="failed to get container status \"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d\": rpc error: code = NotFound desc = could not find container \"80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d\": container with ID starting with 80a14381ea24f2faad80e46fd57367dccad4f601b4e2e50f1b99ff4d3236a41d not found: ID does not exist" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.760045 5030 scope.go:117] "RemoveContainer" containerID="2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.760328 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc\": container with ID starting with 2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc not found: ID does not exist" containerID="2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.760351 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc"} err="failed to get container status \"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc\": rpc error: code = NotFound desc = could not find container \"2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc\": container with ID starting with 2acb5a5593855b0b97108d18e658bc30b9de7c2326da2ca5cf487fa7371d2dfc not found: ID does not exist" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.760366 5030 scope.go:117] "RemoveContainer" containerID="0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60" Jan 20 23:01:57 crc kubenswrapper[5030]: E0120 23:01:57.760590 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60\": container with ID starting with 0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60 not found: ID does not exist" containerID="0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.760612 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60"} err="failed to get container status \"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60\": rpc error: code = NotFound desc = could not find container \"0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60\": container with ID starting with 0d0716e8e70864125838d7a292a8e610c00fd026cb5a41287b3f82baabf81a60 not found: ID does not exist" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.765467 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5tm4\" (UniqueName: \"kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.768516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.793719 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.870592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5tm4\" (UniqueName: \"kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.871264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.875216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.875488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.880001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.880132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.883815 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.889907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.891953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.900076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5tm4\" (UniqueName: \"kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4\") pod \"ceilometer-0\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:57 crc kubenswrapper[5030]: I0120 23:01:57.971390 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f09bef-5dba-4628-adf3-c440c793dd50" path="/var/lib/kubelet/pods/d9f09bef-5dba-4628-adf3-c440c793dd50/volumes" Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.009814 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.241637 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.351034 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.386918 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:01:58 crc kubenswrapper[5030]: W0120 23:01:58.387275 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fc3408_8eeb_4eef_a3f5_d90ffa41de23.slice/crio-e2bbb40bc6265a90e230b93fb787abb4998da19e4ef498e3b1ed0b709f71cde5 WatchSource:0}: Error finding container e2bbb40bc6265a90e230b93fb787abb4998da19e4ef498e3b1ed0b709f71cde5: Status 404 returned error can't find the container with id e2bbb40bc6265a90e230b93fb787abb4998da19e4ef498e3b1ed0b709f71cde5 Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.547536 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:01:58 crc kubenswrapper[5030]: W0120 23:01:58.559439 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c264e34_26dd_4ab6_88e1_a44f64fac9da.slice/crio-ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f WatchSource:0}: Error finding container ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f: Status 404 returned error can't find the container with id ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.630944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerStarted","Data":"e2bbb40bc6265a90e230b93fb787abb4998da19e4ef498e3b1ed0b709f71cde5"} Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.632695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerStarted","Data":"ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f"} Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.644480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-nzw2k" event={"ID":"a374d235-1f65-459e-a2b3-088c67ea01a9","Type":"ContainerStarted","Data":"8b3cb7ee8ec71da861b6dfaeff6d98398bda7249a88bdb4a28cb338a213d1620"} Jan 20 23:01:58 crc kubenswrapper[5030]: I0120 23:01:58.648860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" event={"ID":"599db120-1b92-4a0a-8784-3979907751b0","Type":"ContainerStarted","Data":"fdd187e113a53689521b896185a8c082fdf1396a94ffe1d6da4a2705830c087c"} Jan 20 23:01:59 crc kubenswrapper[5030]: I0120 23:01:59.661243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerStarted","Data":"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc"} Jan 20 23:01:59 crc kubenswrapper[5030]: I0120 23:01:59.671117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" event={"ID":"599db120-1b92-4a0a-8784-3979907751b0","Type":"ContainerStarted","Data":"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27"} Jan 20 23:01:59 crc kubenswrapper[5030]: I0120 23:01:59.748147 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" podStartSLOduration=2.7481216330000002 podStartE2EDuration="2.748121633s" podCreationTimestamp="2026-01-20 23:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:01:59.706932071 +0000 UTC m=+1592.027192359" watchObservedRunningTime="2026-01-20 23:01:59.748121633 +0000 UTC m=+1592.068381931" Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.681138 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerID="a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc" exitCode=0 Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.682339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerDied","Data":"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc"} Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.687046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerStarted","Data":"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5"} Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.687157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerStarted","Data":"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856"} Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.690794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-nzw2k" event={"ID":"a374d235-1f65-459e-a2b3-088c67ea01a9","Type":"ContainerStarted","Data":"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d"} Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.690842 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:02:00 crc kubenswrapper[5030]: I0120 23:02:00.753713 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-nzw2k" podStartSLOduration=2.493147394 podStartE2EDuration="3.753692428s" podCreationTimestamp="2026-01-20 23:01:57 +0000 UTC" firstStartedPulling="2026-01-20 23:01:58.254920573 +0000 UTC m=+1590.575180861" lastFinishedPulling="2026-01-20 23:01:59.515465587 +0000 UTC m=+1591.835725895" observedRunningTime="2026-01-20 23:02:00.737474353 +0000 UTC m=+1593.057734651" watchObservedRunningTime="2026-01-20 23:02:00.753692428 +0000 UTC m=+1593.073952716" Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.700115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerStarted","Data":"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18"} Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.700744 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerStarted","Data":"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407"} Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.700768 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.700782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.704229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerStarted","Data":"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333"} Jan 20 23:02:01 crc kubenswrapper[5030]: I0120 23:02:01.725522 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podStartSLOduration=4.07834593 podStartE2EDuration="4.725502072s" podCreationTimestamp="2026-01-20 23:01:57 +0000 UTC" firstStartedPulling="2026-01-20 23:01:58.389906335 +0000 UTC m=+1590.710166613" lastFinishedPulling="2026-01-20 23:01:59.037062467 +0000 UTC m=+1591.357322755" observedRunningTime="2026-01-20 23:02:01.719028745 +0000 UTC m=+1594.039289033" watchObservedRunningTime="2026-01-20 23:02:01.725502072 +0000 UTC m=+1594.045762360" Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.603276 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.616162 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.625752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.625956 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" podUID="599db120-1b92-4a0a-8784-3979907751b0" containerName="openstack-network-exporter" containerID="cri-o://3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27" gracePeriod=30 Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.712890 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" secret="" err="secret \"ovncontroller-ovncontroller-dockercfg-glmxz\" not found" Jan 20 23:02:02 crc kubenswrapper[5030]: E0120 23:02:02.860036 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:02:02 crc kubenswrapper[5030]: E0120 23:02:02.860126 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:03.360105873 +0000 UTC m=+1595.680366241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:02 crc kubenswrapper[5030]: E0120 23:02:02.890789 5030 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack-kuttl-tests/ovn-controller-nzw2k" message=< Jan 20 23:02:02 crc kubenswrapper[5030]: Exiting ovn-controller (1) [ OK ] Jan 20 23:02:02 crc kubenswrapper[5030]: > Jan 20 23:02:02 crc kubenswrapper[5030]: E0120 23:02:02.890837 5030 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack-kuttl-tests/ovn-controller-nzw2k" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerName="ovn-controller" containerID="cri-o://0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" Jan 20 23:02:02 crc kubenswrapper[5030]: I0120 23:02:02.890871 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-nzw2k" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerName="ovn-controller" containerID="cri-o://0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" gracePeriod=30 Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.157885 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-qdt54_599db120-1b92-4a0a-8784-3979907751b0/openstack-network-exporter/0.log" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.158226 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.270260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.270668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66h7t\" (UniqueName: \"kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.270844 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.271010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.271115 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.271220 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config\") pod \"599db120-1b92-4a0a-8784-3979907751b0\" (UID: \"599db120-1b92-4a0a-8784-3979907751b0\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.273569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.279341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config" (OuterVolumeSpecName: "config") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.279509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.286176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t" (OuterVolumeSpecName: "kube-api-access-66h7t") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "kube-api-access-66h7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.325006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.358266 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.367285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "599db120-1b92-4a0a-8784-3979907751b0" (UID: "599db120-1b92-4a0a-8784-3979907751b0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373076 5030 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373109 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373122 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/599db120-1b92-4a0a-8784-3979907751b0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373134 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/599db120-1b92-4a0a-8784-3979907751b0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373144 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/599db120-1b92-4a0a-8784-3979907751b0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.373153 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66h7t\" (UniqueName: \"kubernetes.io/projected/599db120-1b92-4a0a-8784-3979907751b0-kube-api-access-66h7t\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: E0120 23:02:03.373161 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:02:03 crc kubenswrapper[5030]: E0120 23:02:03.373227 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:04.373210137 +0000 UTC m=+1596.693470425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.473746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.474301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.474244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.474855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w8vb\" (UniqueName: \"kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.475091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.475162 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.475235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.475254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts\") pod \"a374d235-1f65-459e-a2b3-088c67ea01a9\" (UID: \"a374d235-1f65-459e-a2b3-088c67ea01a9\") " Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.475979 5030 reconciler_common.go:293] "Volume detached for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.476860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts" (OuterVolumeSpecName: "scripts") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.476849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run" (OuterVolumeSpecName: "var-run") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.476895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.479195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb" (OuterVolumeSpecName: "kube-api-access-5w8vb") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "kube-api-access-5w8vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.504010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.556484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "a374d235-1f65-459e-a2b3-088c67ea01a9" (UID: "a374d235-1f65-459e-a2b3-088c67ea01a9"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577238 5030 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577273 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577285 5030 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a374d235-1f65-459e-a2b3-088c67ea01a9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577293 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a374d235-1f65-459e-a2b3-088c67ea01a9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577302 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a374d235-1f65-459e-a2b3-088c67ea01a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.577311 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w8vb\" (UniqueName: \"kubernetes.io/projected/a374d235-1f65-459e-a2b3-088c67ea01a9-kube-api-access-5w8vb\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:03 crc kubenswrapper[5030]: E0120 23:02:03.584315 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache]" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.721980 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-qdt54_599db120-1b92-4a0a-8784-3979907751b0/openstack-network-exporter/0.log" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.722028 5030 generic.go:334] "Generic (PLEG): container finished" podID="599db120-1b92-4a0a-8784-3979907751b0" containerID="3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27" exitCode=2 Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.722083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" event={"ID":"599db120-1b92-4a0a-8784-3979907751b0","Type":"ContainerDied","Data":"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27"} Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.722108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" event={"ID":"599db120-1b92-4a0a-8784-3979907751b0","Type":"ContainerDied","Data":"fdd187e113a53689521b896185a8c082fdf1396a94ffe1d6da4a2705830c087c"} Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.722123 5030 scope.go:117] 
"RemoveContainer" containerID="3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.722126 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-qdt54" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.726457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerStarted","Data":"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02"} Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.726688 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.728356 5030 generic.go:334] "Generic (PLEG): container finished" podID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerID="0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" exitCode=0 Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.728443 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-nzw2k" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.728394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-nzw2k" event={"ID":"a374d235-1f65-459e-a2b3-088c67ea01a9","Type":"ContainerDied","Data":"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d"} Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.728615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-nzw2k" event={"ID":"a374d235-1f65-459e-a2b3-088c67ea01a9","Type":"ContainerDied","Data":"8b3cb7ee8ec71da861b6dfaeff6d98398bda7249a88bdb4a28cb338a213d1620"} Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.758261 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.340609136 podStartE2EDuration="6.758241087s" podCreationTimestamp="2026-01-20 23:01:57 +0000 UTC" firstStartedPulling="2026-01-20 23:01:58.564044728 +0000 UTC m=+1590.884305016" lastFinishedPulling="2026-01-20 23:02:02.981676679 +0000 UTC m=+1595.301936967" observedRunningTime="2026-01-20 23:02:03.744996055 +0000 UTC m=+1596.065256353" watchObservedRunningTime="2026-01-20 23:02:03.758241087 +0000 UTC m=+1596.078501375" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.791888 5030 scope.go:117] "RemoveContainer" containerID="3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27" Jan 20 23:02:03 crc kubenswrapper[5030]: E0120 23:02:03.794933 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27\": container with ID starting with 3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27 not found: ID does not exist" containerID="3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.794987 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27"} err="failed to get container status \"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27\": rpc error: code = NotFound desc = could not find container 
\"3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27\": container with ID starting with 3a62ce4c2fd235e8370495928de1a63e5dd29e97781d47e61b49eb2d1a201e27 not found: ID does not exist" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.795009 5030 scope.go:117] "RemoveContainer" containerID="0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.834707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.835027 5030 scope.go:117] "RemoveContainer" containerID="0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" Jan 20 23:02:03 crc kubenswrapper[5030]: E0120 23:02:03.835999 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d\": container with ID starting with 0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d not found: ID does not exist" containerID="0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.836048 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d"} err="failed to get container status \"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d\": rpc error: code = NotFound desc = could not find container \"0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d\": container with ID starting with 0c1eab3d6efed17fe4362d9529fae0c5e3246fba4c6c37248d2b94855614a33d not found: ID does not exist" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.847161 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-qdt54"] Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.857478 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.865798 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-nzw2k"] Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.973558 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599db120-1b92-4a0a-8784-3979907751b0" path="/var/lib/kubelet/pods/599db120-1b92-4a0a-8784-3979907751b0/volumes" Jan 20 23:02:03 crc kubenswrapper[5030]: I0120 23:02:03.974933 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" path="/var/lib/kubelet/pods/a374d235-1f65-459e-a2b3-088c67ea01a9/volumes" Jan 20 23:02:04 crc kubenswrapper[5030]: I0120 23:02:04.114164 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovs-vswitchd" containerID="cri-o://73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" gracePeriod=30 Jan 20 23:02:04 crc kubenswrapper[5030]: E0120 23:02:04.364140 5030 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 20 23:02:04 crc kubenswrapper[5030]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:02:04 crc kubenswrapper[5030]: + source 
/usr/local/bin/container-scripts/functions Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:02:04 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:02:04 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:02:04 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:02:04 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:02:04 crc kubenswrapper[5030]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" message=< Jan 20 23:02:04 crc kubenswrapper[5030]: Exiting ovsdb-server (5) [ OK ] Jan 20 23:02:04 crc kubenswrapper[5030]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:02:04 crc kubenswrapper[5030]: + source /usr/local/bin/container-scripts/functions Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:02:04 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:02:04 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:02:04 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:02:04 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:02:04 crc kubenswrapper[5030]: > Jan 20 23:02:04 crc kubenswrapper[5030]: E0120 23:02:04.364190 5030 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 20 23:02:04 crc kubenswrapper[5030]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:02:04 crc kubenswrapper[5030]: + source /usr/local/bin/container-scripts/functions Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:02:04 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:02:04 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:02:04 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:02:04 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:02:04 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:02:04 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:02:04 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:02:04 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:02:04 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:02:04 crc kubenswrapper[5030]: > pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" containerID="cri-o://a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" Jan 20 23:02:04 crc kubenswrapper[5030]: I0120 23:02:04.364237 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" containerID="cri-o://a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" gracePeriod=30 Jan 20 23:02:04 crc kubenswrapper[5030]: E0120 23:02:04.391906 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:02:04 crc kubenswrapper[5030]: E0120 23:02:04.392021 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:06.391996524 +0000 UTC m=+1598.712256812 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:04 crc kubenswrapper[5030]: I0120 23:02:04.752720 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" exitCode=0 Jan 20 23:02:04 crc kubenswrapper[5030]: I0120 23:02:04.752763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerDied","Data":"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407"} Jan 20 23:02:05 crc kubenswrapper[5030]: I0120 23:02:05.975532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:05 crc kubenswrapper[5030]: I0120 23:02:05.975907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:06 crc kubenswrapper[5030]: E0120 23:02:06.437444 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:02:06 crc kubenswrapper[5030]: E0120 23:02:06.437554 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:10.43752693 +0000 UTC m=+1602.757787248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:06 crc kubenswrapper[5030]: I0120 23:02:06.989884 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.34:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:02:06 crc kubenswrapper[5030]: I0120 23:02:06.989901 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.34:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.158088 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.158902 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:02:10 crc 
kubenswrapper[5030]: I0120 23:02:10.158976 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.160046 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.160151 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" gracePeriod=600 Jan 20 23:02:10 crc kubenswrapper[5030]: E0120 23:02:10.294500 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:02:10 crc kubenswrapper[5030]: E0120 23:02:10.530284 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:02:10 crc kubenswrapper[5030]: E0120 23:02:10.530381 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:18.530357706 +0000 UTC m=+1610.850618024 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.869365 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" exitCode=0 Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.869444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460"} Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.869507 5030 scope.go:117] "RemoveContainer" containerID="e42d8051ffc8563878f969388446a5e74e804108b7b6e90bd869128b6c0317ab" Jan 20 23:02:10 crc kubenswrapper[5030]: I0120 23:02:10.870777 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:02:10 crc kubenswrapper[5030]: E0120 23:02:10.871529 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:02:13 crc kubenswrapper[5030]: E0120 23:02:13.850558 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache]" Jan 20 23:02:15 crc kubenswrapper[5030]: I0120 23:02:15.985990 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:15 crc kubenswrapper[5030]: I0120 23:02:15.986585 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:15 crc kubenswrapper[5030]: I0120 23:02:15.987205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:15 crc kubenswrapper[5030]: I0120 23:02:15.987239 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:15 crc kubenswrapper[5030]: I0120 23:02:15.998991 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:16 crc kubenswrapper[5030]: I0120 23:02:16.000105 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:18 crc kubenswrapper[5030]: E0120 23:02:18.616780 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: 
configmap "ovncontroller-scripts" not found Jan 20 23:02:18 crc kubenswrapper[5030]: E0120 23:02:18.617264 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts podName:e8fc3408-8eeb-4eef-a3f5-d90ffa41de23 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:34.617226635 +0000 UTC m=+1626.937486973 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts") pod "ovn-controller-ovs-tnb66" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23") : configmap "ovncontroller-scripts" not found Jan 20 23:02:20 crc kubenswrapper[5030]: I0120 23:02:20.963268 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:02:20 crc kubenswrapper[5030]: E0120 23:02:20.964282 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:02:24 crc kubenswrapper[5030]: E0120 23:02:24.183312 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a29aab_8344_44a5_a736_6ec987b286e7.slice/crio-d9360bc8c6322154560a4f25a4f9d9ca7301edd66ec5241983317fa79a183613\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e850b76_b34f_44d5_8723_ce657ebf49b6.slice/crio-37279f13d7c2a7fa8c76a84c716e56eada0cbac8632cc485677c5c7da6fc58de\": RecentStats: unable to find data in memory cache]" Jan 20 23:02:28 crc kubenswrapper[5030]: I0120 23:02:28.021652 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:02:30 crc kubenswrapper[5030]: E0120 23:02:30.987897 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/ca4d725065c024a4a45073384bb3a2520cf78ed5c136c49ea83d33b7341c2821/diff" to get inode usage: stat /var/lib/containers/storage/overlay/ca4d725065c024a4a45073384bb3a2520cf78ed5c136c49ea83d33b7341c2821/diff: no such file or directory, extraDiskErr: Jan 20 23:02:30 crc kubenswrapper[5030]: E0120 23:02:30.989171 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/485112f2bf18200e2721aa81629ff39c1ed1f23f8650aa6bf761df6c08ab1cc1/diff" to get inode usage: stat /var/lib/containers/storage/overlay/485112f2bf18200e2721aa81629ff39c1ed1f23f8650aa6bf761df6c08ab1cc1/diff: no such file or directory, extraDiskErr: Jan 20 23:02:31 crc kubenswrapper[5030]: I0120 23:02:31.962672 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:02:31 crc kubenswrapper[5030]: E0120 23:02:31.963128 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.767236 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 is running failed: container process not found" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.768576 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.768966 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 is running failed: container process not found" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.769671 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 is running failed: container process not found" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.769775 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.770830 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.772296 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 20 23:02:32 crc kubenswrapper[5030]: E0120 23:02:32.772363 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovs-vswitchd" Jan 20 23:02:33 crc kubenswrapper[5030]: I0120 23:02:33.702528 5030 scope.go:117] "RemoveContainer" containerID="49ed1aef5ea7985dfc9dc8cd928f0107137a32f74b75f621873618e8e8aaf816" Jan 20 23:02:33 crc kubenswrapper[5030]: I0120 23:02:33.769462 5030 scope.go:117] "RemoveContainer" containerID="7831bb3048791d0b20d632a1503d425a4657f693ca27819b51673e2e5d6aa956" Jan 20 23:02:33 crc kubenswrapper[5030]: I0120 23:02:33.823881 5030 scope.go:117] "RemoveContainer" containerID="baf6a60a1b77f421596bd948659ecac15aa8955fa1ce8ad3378955ea3bd359da" Jan 20 23:02:33 crc kubenswrapper[5030]: I0120 23:02:33.849310 5030 scope.go:117] "RemoveContainer" containerID="379f9e93e41d27df40fadbc04a7f498ac03dce05e1399bab7f71d6c134822330" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.533674 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-tnb66_e8fc3408-8eeb-4eef-a3f5-d90ffa41de23/ovs-vswitchd/0.log" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.534798 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701391 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701434 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9scc\" (UniqueName: \"kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log\") pod \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\" (UID: \"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23\") " Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib" (OuterVolumeSpecName: "var-lib") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.701985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run" (OuterVolumeSpecName: "var-run") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.702512 5030 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.702536 5030 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.702550 5030 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-lib\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.702486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log" (OuterVolumeSpecName: "var-log") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.704684 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts" (OuterVolumeSpecName: "scripts") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.712338 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc" (OuterVolumeSpecName: "kube-api-access-h9scc") pod "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" (UID: "e8fc3408-8eeb-4eef-a3f5-d90ffa41de23"). InnerVolumeSpecName "kube-api-access-h9scc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.804943 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9scc\" (UniqueName: \"kubernetes.io/projected/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-kube-api-access-h9scc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.805851 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:34 crc kubenswrapper[5030]: I0120 23:02:34.805904 5030 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.162488 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-tnb66_e8fc3408-8eeb-4eef-a3f5-d90ffa41de23/ovs-vswitchd/0.log" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.164389 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" exitCode=137 Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.164449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerDied","Data":"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18"} Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.164467 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.164500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-tnb66" event={"ID":"e8fc3408-8eeb-4eef-a3f5-d90ffa41de23","Type":"ContainerDied","Data":"e2bbb40bc6265a90e230b93fb787abb4998da19e4ef498e3b1ed0b709f71cde5"} Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.164529 5030 scope.go:117] "RemoveContainer" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.206433 5030 scope.go:117] "RemoveContainer" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.224774 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.242391 5030 scope.go:117] "RemoveContainer" containerID="a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.242542 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-tnb66"] Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.289282 5030 scope.go:117] "RemoveContainer" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" Jan 20 23:02:35 crc kubenswrapper[5030]: E0120 23:02:35.290022 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18\": container with ID starting with 73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18 not found: ID does not exist" containerID="73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.290092 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18"} err="failed to get container status \"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18\": rpc error: code = NotFound desc = could not find container \"73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18\": container with ID starting with 73b76271b2a25078b5402e7a9c01e1320d2d40e264aa025c08d9a90bea49ec18 not found: ID does not exist" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.290137 5030 scope.go:117] "RemoveContainer" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" Jan 20 23:02:35 crc kubenswrapper[5030]: E0120 23:02:35.291716 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407\": container with ID starting with a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 not found: ID does not exist" containerID="a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.291866 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407"} err="failed to get container status \"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407\": rpc error: code = NotFound desc = could not find 
container \"a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407\": container with ID starting with a6bc64ac3ad8ba11787062db8bccdb37e41917c414ce13c9b18a0c889c8a3407 not found: ID does not exist" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.291908 5030 scope.go:117] "RemoveContainer" containerID="a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc" Jan 20 23:02:35 crc kubenswrapper[5030]: E0120 23:02:35.293132 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc\": container with ID starting with a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc not found: ID does not exist" containerID="a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.293220 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc"} err="failed to get container status \"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc\": rpc error: code = NotFound desc = could not find container \"a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc\": container with ID starting with a479d83b4cf11748a2ed078f2865a4cbdf198c24afc29ea367174fa05297b4dc not found: ID does not exist" Jan 20 23:02:35 crc kubenswrapper[5030]: I0120 23:02:35.980976 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" path="/var/lib/kubelet/pods/e8fc3408-8eeb-4eef-a3f5-d90ffa41de23/volumes" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.321707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.322145 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" containerName="openstackclient" containerID="cri-o://bd4a155b76873118c2373604fae7e69b21203161dbc22f566ab70841f2e1c6ae" gracePeriod=2 Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.346158 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.381343 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bsjb8"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.415135 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bsjb8"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.441989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462425 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462800 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" containerName="openstackclient" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462817 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" containerName="openstackclient" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462829 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599db120-1b92-4a0a-8784-3979907751b0" containerName="openstack-network-exporter" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462835 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="599db120-1b92-4a0a-8784-3979907751b0" containerName="openstack-network-exporter" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462852 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server-init" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462858 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server-init" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462878 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462884 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462901 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerName="ovn-controller" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerName="ovn-controller" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.462920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovs-vswitchd" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.462928 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovs-vswitchd" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463075 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" containerName="openstackclient" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463088 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a374d235-1f65-459e-a2b3-088c67ea01a9" containerName="ovn-controller" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463095 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovsdb-server" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463111 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fc3408-8eeb-4eef-a3f5-d90ffa41de23" containerName="ovs-vswitchd" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463124 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="599db120-1b92-4a0a-8784-3979907751b0" containerName="openstack-network-exporter" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.463684 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.470022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.482006 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.555307 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.556425 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.564944 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.576588 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:02:37 crc kubenswrapper[5030]: E0120 23:02:37.576676 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data podName:83f11c15-8856-45fd-9703-348310781d5a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.076659424 +0000 UTC m=+1630.396919712 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data") pod "rabbitmq-server-0" (UID: "83f11c15-8856-45fd-9703-348310781d5a") : configmap "rabbitmq-config-data" not found Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.578850 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.588988 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.590306 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.610861 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.626025 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-lmfr2"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.626632 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.655982 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzcc\" (UniqueName: \"kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676833 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6plg\" (UniqueName: \"kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.676940 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cvm\" (UniqueName: \"kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: 
I0120 23:02:37.684430 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.684711 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" containerID="cri-o://5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" gracePeriod=30 Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.684830 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="openstack-network-exporter" containerID="cri-o://96fcd4787c2a4fc0ac3198c4288890edb23f73112901a2786f3617f650cef644" gracePeriod=30 Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.729014 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.729654 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="openstack-network-exporter" containerID="cri-o://e9be47aacf56d38b357ffb541b2521e1d61edfb8742915789559a4601680022e" gracePeriod=300 Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.771680 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzcc\" (UniqueName: \"kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779130 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6plg\" (UniqueName: \"kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.779183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cvm\" (UniqueName: \"kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.780124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.780552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.781121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.799207 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.800402 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.810677 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.845085 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-bwc2v"] Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.856251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cvm\" (UniqueName: \"kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm\") pod \"root-account-create-update-4j2gm\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.857136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6plg\" (UniqueName: \"kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg\") pod \"neutron-e899-account-create-update-rq7hk\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.865863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzcc\" (UniqueName: \"kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc\") pod \"cinder-4cc4-account-create-update-49tr4\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.888592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9zn\" (UniqueName: \"kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.888697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.888929 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.914454 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.984191 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" secret="" err="secret \"ovncluster-ovndbcluster-nb-dockercfg-pz99d\" not found" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.994714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9zn\" (UniqueName: \"kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.994852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:37 crc kubenswrapper[5030]: I0120 23:02:37.995607 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.010905 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ea1315-f9a0-4297-8c79-4432e0969375" path="/var/lib/kubelet/pods/28ea1315-f9a0-4297-8c79-4432e0969375/volumes" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.011896 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43751e7-1432-4e99-8e62-50d14a3c9470" path="/var/lib/kubelet/pods/a43751e7-1432-4e99-8e62-50d14a3c9470/volumes" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.012441 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f367a2-2685-4a40-b29d-2a69508228f5" path="/var/lib/kubelet/pods/d1f367a2-2685-4a40-b29d-2a69508228f5/volumes" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.015067 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-w9wqz"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.018850 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="ovsdbserver-sb" containerID="cri-o://0744807f85e5619e2c08ebdc739a96d47e1c08e5f289d32e4fc562f7632a83d0" gracePeriod=300 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.033037 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-w9wqz"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.034483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9zn\" (UniqueName: \"kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn\") pod \"placement-f6c7-account-create-update-w4mp2\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.034892 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" 
probeResult="failure" output=< Jan 20 23:02:38 crc kubenswrapper[5030]: 2026-01-20T23:02:37Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:02:38 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:02:38 crc kubenswrapper[5030]: 2026-01-20T23:02:38Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:02:38 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:02:38 crc kubenswrapper[5030]: > Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.067421 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-qgczp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.099350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.100574 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.100613 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.60060022 +0000 UTC m=+1630.920860508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovn-metrics" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.100931 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.100965 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.600957319 +0000 UTC m=+1630.921217607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101391 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101414 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.60140646 +0000 UTC m=+1630.921666748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "combined-ca-bundle" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101437 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101454 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.601449371 +0000 UTC m=+1630.921709659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-config" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101495 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101512 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data podName:83f11c15-8856-45fd-9703-348310781d5a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.101506472 +0000 UTC m=+1631.421766760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data") pod "rabbitmq-server-0" (UID: "83f11c15-8856-45fd-9703-348310781d5a") : configmap "rabbitmq-config-data" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101540 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.101555 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:38.601550683 +0000 UTC m=+1630.921810961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.108508 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.121018 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" probeResult="failure" output=< Jan 20 23:02:38 crc kubenswrapper[5030]: 2026-01-20T23:02:37Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:02:38 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:02:38 crc kubenswrapper[5030]: 2026-01-20T23:02:38Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:02:38 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:02:38 crc kubenswrapper[5030]: > Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.148677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-qgczp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.179459 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.187957 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.199756 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-rs7dp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.235572 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-vbdht"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.283023 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-vbdht"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.315441 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.334565 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.377268 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-vq6sd"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.466690 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.468361 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.497247 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.537876 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8lg48"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551008 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_e90be399-4f89-42ef-950a-2ef8c14a4d0f/ovsdbserver-sb/0.log" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551052 5030 generic.go:334] "Generic (PLEG): container finished" podID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerID="e9be47aacf56d38b357ffb541b2521e1d61edfb8742915789559a4601680022e" exitCode=2 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551074 5030 generic.go:334] "Generic (PLEG): container finished" podID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerID="0744807f85e5619e2c08ebdc739a96d47e1c08e5f289d32e4fc562f7632a83d0" exitCode=143 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerDied","Data":"e9be47aacf56d38b357ffb541b2521e1d61edfb8742915789559a4601680022e"} Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerDied","Data":"0744807f85e5619e2c08ebdc739a96d47e1c08e5f289d32e4fc562f7632a83d0"} Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clv2r\" (UniqueName: \"kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.551584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.565784 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.593595 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8lg48"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.598924 5030 generic.go:334] "Generic (PLEG): container finished" podID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerID="96fcd4787c2a4fc0ac3198c4288890edb23f73112901a2786f3617f650cef644" exitCode=2 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.599191 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerDied","Data":"96fcd4787c2a4fc0ac3198c4288890edb23f73112901a2786f3617f650cef644"} Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.599479 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="openstack-network-exporter" containerID="cri-o://0baa25f272096765a528195b72ca66c3413a72c927cbfa02fe2cad36d4bbd81c" gracePeriod=300 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.620821 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.665909 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.667075 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.674300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clv2r\" (UniqueName: \"kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.674345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.675149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679353 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679426 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.679409471 +0000 UTC m=+1631.999669759 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-config" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679655 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679682 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.679674827 +0000 UTC m=+1631.999935115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovn-metrics" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679713 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679732 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.679726398 +0000 UTC m=+1631.999986686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "combined-ca-bundle" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679768 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679786 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.67978087 +0000 UTC m=+1632.000041158 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679811 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.679827 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.67982153 +0000 UTC m=+1632.000081818 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.681483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.700202 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.712912 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.714076 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.726890 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.729886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clv2r\" (UniqueName: \"kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r\") pod \"nova-api-be4a-account-create-update-6nlfp\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.730188 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.740395 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-m7m8q"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.759755 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-m7m8q"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.776589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhn7h\" (UniqueName: \"kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.776795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.782568 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.782816 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-log" 
containerID="cri-o://896d652e86e3f6645f535d8655829b4fc84bc1a6b041883ee7d85864aa92a7c3" gracePeriod=30 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.782944 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-httpd" containerID="cri-o://32dd60b784789cf7134c38532e81ef67d0d751a5bd26694f1fd92042ac4cc701" gracePeriod=30 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.822690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.857453 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-47pjt"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.858396 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="ovsdbserver-nb" containerID="cri-o://2e33323b62f3534db68317bbdcfa4b96b235ccb6f864576f5bc4f881659045cb" gracePeriod=300 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.879714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbtjb\" (UniqueName: \"kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.879997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.880023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.880076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhn7h\" (UniqueName: \"kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.889337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.889452 5030 configmap.go:193] Couldn't get configMap 
openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:38 crc kubenswrapper[5030]: E0120 23:02:38.889492 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data podName:b9bc4ba4-4b03-441b-b492-6de67c2647b6 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.389477447 +0000 UTC m=+1631.709737735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.913010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhn7h\" (UniqueName: \"kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h\") pod \"nova-cell0-d0e8-account-create-update-dg2g2\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.914326 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.914870 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="cinder-scheduler" containerID="cri-o://54990945a497a45075119ed3dc036df2589703681e3a56dcfccafab5ebeb782b" gracePeriod=30 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.915006 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="probe" containerID="cri-o://97c9b3af0a221749a5d3d0a48cb4ab7d818c47e0f19d9b433358dccfb2edecb5" gracePeriod=30 Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.946514 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-b741-account-create-update-w9xxk"] Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.982386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbtjb\" (UniqueName: \"kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.982601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:38 crc kubenswrapper[5030]: I0120 23:02:38.984588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " 
pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.009658 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-b741-account-create-update-w9xxk"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.033679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-wdrgs"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.048674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbtjb\" (UniqueName: \"kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb\") pod \"nova-cell1-7aac-account-create-update-f7s69\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.076280 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-wdrgs"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.127733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.128000 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-log" containerID="cri-o://bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.128351 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-api" containerID="cri-o://913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.158061 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-b5r8x"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.172541 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-b5r8x"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.182671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183165 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-server" containerID="cri-o://bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183564 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="swift-recon-cron" containerID="cri-o://31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183607 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="rsync" containerID="cri-o://021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f" gracePeriod=30 Jan 20 23:02:39 
crc kubenswrapper[5030]: I0120 23:02:39.183660 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-expirer" containerID="cri-o://a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183696 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-updater" containerID="cri-o://ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183728 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-auditor" containerID="cri-o://56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183756 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-replicator" containerID="cri-o://cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183784 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-server" containerID="cri-o://11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183814 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-updater" containerID="cri-o://14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183844 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-auditor" containerID="cri-o://2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183874 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-replicator" containerID="cri-o://030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183915 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-server" containerID="cri-o://c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183942 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-reaper" 
containerID="cri-o://3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.183970 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-auditor" containerID="cri-o://95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.184000 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-replicator" containerID="cri-o://81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.199582 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.199663 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data podName:83f11c15-8856-45fd-9703-348310781d5a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.199647568 +0000 UTC m=+1633.519907856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data") pod "rabbitmq-server-0" (UID: "83f11c15-8856-45fd-9703-348310781d5a") : configmap "rabbitmq-config-data" not found Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.219963 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.225675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.238463 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.244555 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-v8vdq"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.259596 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.260096 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-log" containerID="cri-o://4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.260469 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-httpd" containerID="cri-o://9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.265443 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.276209 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_e90be399-4f89-42ef-950a-2ef8c14a4d0f/ovsdbserver-sb/0.log" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.276270 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.358955 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.364067 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="openstack-network-exporter" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.364100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="openstack-network-exporter" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.364133 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="ovsdbserver-sb" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.364141 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="ovsdbserver-sb" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.372877 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="openstack-network-exporter" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.372919 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" containerName="ovsdbserver-sb" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.382508 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.439671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.444477 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-api" containerID="cri-o://25d71b9ff6b3918f06b7ac2f245f258bfaef55aef059830afb2f1b222de24f94" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.446559 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-httpd" containerID="cri-o://dd92b5dd5973bf0786d8d8b9c80aff684fc1a3124f9a956ceefe2ee92d51545f" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.449653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config" (OuterVolumeSpecName: "config") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.448902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.453221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.453477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.453584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9kl8\" (UniqueName: \"kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.453772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.453907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.483531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.456058 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.485209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts" (OuterVolumeSpecName: "scripts") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.485981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle\") pod \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\" (UID: \"e90be399-4f89-42ef-950a-2ef8c14a4d0f\") " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.487558 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.487577 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90be399-4f89-42ef-950a-2ef8c14a4d0f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.487586 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.492284 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.492349 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts podName:43e23459-b501-42f3-bf9c-2d50df954c66 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:39.992331793 +0000 UTC m=+1632.312592071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts") pod "nova-cell1-7aac-account-create-update-f7s69" (UID: "43e23459-b501-42f3-bf9c-2d50df954c66") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.492818 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.492845 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data podName:b9bc4ba4-4b03-441b-b492-6de67c2647b6 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.492836445 +0000 UTC m=+1632.813096733 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.493724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8" (OuterVolumeSpecName: "kube-api-access-q9kl8") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "kube-api-access-q9kl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.504894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.505679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.505895 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api-log" containerID="cri-o://c011608dc0cda953d39ae86ce2df8b44ddec5366f2481a6fa9504f761cc6409a" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.506241 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api" containerID="cri-o://1e2c3887d4bfb803bee6d88be2bd09aad12d8e435404a6d4a7f84152d7fd324e" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.551504 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.573951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.588909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-qqptl"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.593543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.593605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75ff\" (UniqueName: \"kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.594324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.594406 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.594418 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9kl8\" (UniqueName: \"kubernetes.io/projected/e90be399-4f89-42ef-950a-2ef8c14a4d0f-kube-api-access-q9kl8\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.594437 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.637049 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-qqptl"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.642398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.643571 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "e90be399-4f89-42ef-950a-2ef8c14a4d0f" (UID: "e90be399-4f89-42ef-950a-2ef8c14a4d0f"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.651721 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.657097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.660238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerDied","Data":"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.662971 5030 generic.go:334] "Generic (PLEG): container finished" podID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerID="4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921" exitCode=143 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.667055 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.674215 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-73e2-account-create-update-whnpl"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.679369 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.679717 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker-log" containerID="cri-o://8da669032a42ae5fafc499507531665ae7680439dcecff59acbf618249a406b2" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.680558 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker" containerID="cri-o://cf2a56f804091f8d287a400f7f7a6cd9c78c7a43c30bba4d5b9ef210fd03c3b7" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.698375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.698649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.698742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75ff\" (UniqueName: \"kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.698887 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.698946 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.699473 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e90be399-4f89-42ef-950a-2ef8c14a4d0f-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.699637 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.699733 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.699718404 +0000 UTC m=+1634.019978692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovn-metrics" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.700070 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.700154 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.700145855 +0000 UTC m=+1634.020406143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "combined-ca-bundle" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.700813 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.701064 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.701043517 +0000 UTC m=+1634.021303805 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.701101 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-config: configmap "ovndbcluster-nb-config" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.701121 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.701115608 +0000 UTC m=+1634.021375896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-config" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.701144 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovndbcluster-nb-scripts: configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.701163 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts podName:886eef18-1c15-40bc-a798-217df2a568ff nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.701157459 +0000 UTC m=+1634.021417747 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts") pod "ovsdbserver-nb-0" (UID: "886eef18-1c15-40bc-a798-217df2a568ff") : configmap "ovndbcluster-nb-scripts" not found Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.700898 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_886eef18-1c15-40bc-a798-217df2a568ff/ovsdbserver-nb/0.log" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.701206 5030 generic.go:334] "Generic (PLEG): container finished" podID="886eef18-1c15-40bc-a798-217df2a568ff" containerID="0baa25f272096765a528195b72ca66c3413a72c927cbfa02fe2cad36d4bbd81c" exitCode=2 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.701223 5030 generic.go:334] "Generic (PLEG): container finished" podID="886eef18-1c15-40bc-a798-217df2a568ff" containerID="2e33323b62f3534db68317bbdcfa4b96b235ccb6f864576f5bc4f881659045cb" exitCode=143 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.701310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerDied","Data":"0baa25f272096765a528195b72ca66c3413a72c927cbfa02fe2cad36d4bbd81c"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.701337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerDied","Data":"2e33323b62f3534db68317bbdcfa4b96b235ccb6f864576f5bc4f881659045cb"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.707412 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-ms7xf"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.708535 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.709466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.747041 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:39 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:02:39 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:02:39 crc kubenswrapper[5030]: else Jan 20 23:02:39 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:39 crc kubenswrapper[5030]: fi Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:39 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:39 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:39 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:39 crc kubenswrapper[5030]: # support updates Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.748099 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" podUID="74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.752161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75ff\" (UniqueName: \"kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff\") pod \"dnsmasq-dnsmasq-84b9f45d47-z22p9\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.758677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.775774 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-csrrm"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780468 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780492 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780498 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780504 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780510 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780516 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780521 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780527 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780534 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780540 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780546 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780552 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053" exitCode=0 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780699 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.780715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.781865 5030 generic.go:334] "Generic (PLEG): container finished" podID="703b2784-b7db-4424-b479-91d7c53f9ecb" containerID="bd4a155b76873118c2373604fae7e69b21203161dbc22f566ab70841f2e1c6ae" exitCode=137 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.783057 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_e90be399-4f89-42ef-950a-2ef8c14a4d0f/ovsdbserver-sb/0.log" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.783099 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"e90be399-4f89-42ef-950a-2ef8c14a4d0f","Type":"ContainerDied","Data":"caae53aba34ae8f4022ee1724b905e205b673a25f26e839d113d103817f284f4"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.783119 5030 scope.go:117] "RemoveContainer" containerID="e9be47aacf56d38b357ffb541b2521e1d61edfb8742915789559a4601680022e" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.783226 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.808295 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.812686 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerID="bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5" exitCode=143 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.812811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerDied","Data":"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.815664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t86dg"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.823046 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t86dg"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.824669 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerID="896d652e86e3f6645f535d8655829b4fc84bc1a6b041883ee7d85864aa92a7c3" exitCode=143 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.824709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerDied","Data":"896d652e86e3f6645f535d8655829b4fc84bc1a6b041883ee7d85864aa92a7c3"} Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.854949 5030 scope.go:117] "RemoveContainer" containerID="0744807f85e5619e2c08ebdc739a96d47e1c08e5f289d32e4fc562f7632a83d0" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.863063 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.863306 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api-log" containerID="cri-o://3bdf48c04934d54f7661f747d57939590f3632f793a8c0aa8ae04262dd469c85" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.864356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api" containerID="cri-o://abe1b9d9c501552d8beeada5cb25d3fbed77f8cbf429b1c1351a006527292ef3" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.885449 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_886eef18-1c15-40bc-a798-217df2a568ff/ovsdbserver-nb/0.log" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.885526 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.892597 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.892807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener-log" containerID="cri-o://aaad7c24ea8a9b3796aa4e5cfe2cc0e940b0c172d22637110c91ea576ac7f3ca" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.892909 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener" containerID="cri-o://3fcdf58baca53e88af4e903ab1f51ec63e8c6a04e6a3019435c78eaefad8028e" gracePeriod=30 Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.908308 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-mbfq9"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.919942 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-mbfq9"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.927909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 23:02:39 crc kubenswrapper[5030]: I0120 23:02:39.947704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.955737 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:39 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:02:39 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:02:39 crc kubenswrapper[5030]: else Jan 20 23:02:39 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:39 crc kubenswrapper[5030]: fi Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:39 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:39 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:39 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:39 crc kubenswrapper[5030]: # support updates Jan 20 23:02:39 crc kubenswrapper[5030]: Jan 20 23:02:39 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:39 crc kubenswrapper[5030]: E0120 23:02:39.957893 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" podUID="7be6ad04-b67e-42c1-867f-996017de8a50" Jan 20 23:02:39 crc kubenswrapper[5030]: W0120 23:02:39.981124 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod207ab9b6_de27_45b0_945c_68ed0ab8afbf.slice/crio-61d8ae2b8c363577b217d9e51a4c7799afc9e31d8f835c0fccf03aa388711626 WatchSource:0}: Error finding container 61d8ae2b8c363577b217d9e51a4c7799afc9e31d8f835c0fccf03aa388711626: Status 404 returned error can't find the container with id 61d8ae2b8c363577b217d9e51a4c7799afc9e31d8f835c0fccf03aa388711626 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.006554 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013d4d42-09ab-4f76-b77c-09a52e7c2635" path="/var/lib/kubelet/pods/013d4d42-09ab-4f76-b77c-09a52e7c2635/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.007298 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145c5e5d-0043-46f7-af17-4426bea1a3eb" path="/var/lib/kubelet/pods/145c5e5d-0043-46f7-af17-4426bea1a3eb/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.009952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010282 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.010397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92x4t\" (UniqueName: \"kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t\") pod \"886eef18-1c15-40bc-a798-217df2a568ff\" (UID: \"886eef18-1c15-40bc-a798-217df2a568ff\") " Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.017067 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.017751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.017830 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.019657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config" (OuterVolumeSpecName: "config") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.021816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts" (OuterVolumeSpecName: "scripts") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.023656 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" podUID="207ab9b6-de27-45b0-945c-68ed0ab8afbf" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.030159 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts podName:43e23459-b501-42f3-bf9c-2d50df954c66 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.030134517 +0000 UTC m=+1633.350394805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts") pod "nova-cell1-7aac-account-create-update-f7s69" (UID: "43e23459-b501-42f3-bf9c-2d50df954c66") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.031474 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b0c307-dd4b-41d1-a042-98ee993d330a" path="/var/lib/kubelet/pods/33b0c307-dd4b-41d1-a042-98ee993d330a/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.036709 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e588d13-720f-45b7-aaa8-d339dc20f695" path="/var/lib/kubelet/pods/3e588d13-720f-45b7-aaa8-d339dc20f695/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.037577 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf8ec7d-d1a5-45c1-8c1b-106c2358caff" path="/var/lib/kubelet/pods/4cf8ec7d-d1a5-45c1-8c1b-106c2358caff/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.038202 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657d77bd-c7a3-441a-84dd-652ede942310" path="/var/lib/kubelet/pods/657d77bd-c7a3-441a-84dd-652ede942310/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.051718 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.054448 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebfa275-f618-4aaf-b02b-f4a6db7c78e4" path="/var/lib/kubelet/pods/6ebfa275-f618-4aaf-b02b-f4a6db7c78e4/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.055258 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5" path="/var/lib/kubelet/pods/7408aa73-b3a2-4f7f-bd47-8d5f4cd4c1e5/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.056112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8292fbbe-24f2-4dde-82f5-338ac93e33f1" path="/var/lib/kubelet/pods/8292fbbe-24f2-4dde-82f5-338ac93e33f1/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.056907 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb7df47-fb12-4daa-9442-52c1dfd18bea" path="/var/lib/kubelet/pods/8fb7df47-fb12-4daa-9442-52c1dfd18bea/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.057485 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910e33af-9b19-4387-babd-c19cce0cfd29" path="/var/lib/kubelet/pods/910e33af-9b19-4387-babd-c19cce0cfd29/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.067103 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10a501a-a4d8-4baf-b4f0-f76193fc92f3" path="/var/lib/kubelet/pods/a10a501a-a4d8-4baf-b4f0-f76193fc92f3/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.068064 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a315c01e-271c-4200-b411-c1a9fa71820f" path="/var/lib/kubelet/pods/a315c01e-271c-4200-b411-c1a9fa71820f/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.068607 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t" (OuterVolumeSpecName: "kube-api-access-92x4t") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "kube-api-access-92x4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.068756 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5285dc6-c907-4a96-b97a-97950b15c428" path="/var/lib/kubelet/pods/b5285dc6-c907-4a96-b97a-97950b15c428/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.081843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.082543 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="rabbitmq" containerID="cri-o://dcc99476597aaed55627fb273471e78f5cfba90216403d60549d509f104df53d" gracePeriod=604800 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.082694 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae" path="/var/lib/kubelet/pods/c105b2eb-c6ab-47a6-bdcb-9cf5e9893bae/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.083402 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92053f1-ec4e-4dee-ac05-a4860c140b41" path="/var/lib/kubelet/pods/c92053f1-ec4e-4dee-ac05-a4860c140b41/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.096845 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed15b8f4-d0a2-401f-ac7e-cc273da45269" path="/var/lib/kubelet/pods/ed15b8f4-d0a2-401f-ac7e-cc273da45269/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.097737 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a" path="/var/lib/kubelet/pods/faa3e9a9-bae7-45dc-9b43-a9ae4ee7916a/volumes" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.120990 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.121033 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.121046 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/886eef18-1c15-40bc-a798-217df2a568ff-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.121058 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/886eef18-1c15-40bc-a798-217df2a568ff-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.121072 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92x4t\" (UniqueName: \"kubernetes.io/projected/886eef18-1c15-40bc-a798-217df2a568ff-kube-api-access-92x4t\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.122290 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.130854 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.130988 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.131026 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 
23:02:40.131327 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.133654 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.622324878 +0000 UTC m=+1632.942585166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.133694 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.633681724 +0000 UTC m=+1632.953942012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "combined-ca-bundle" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.133735 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.633727345 +0000 UTC m=+1632.953987633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "cert-galera-openstack-svc" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.133747 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.633741275 +0000 UTC m=+1632.954001563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.133761 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:40.633755896 +0000 UTC m=+1632.954016184 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.138140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.180386 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197218 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-s2nnr"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197245 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-s2nnr"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197258 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197272 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197281 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197291 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-q6mv6"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197302 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-q6mv6"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197312 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197327 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197691 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-httpd" containerID="cri-o://c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.197827 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-log" containerID="cri-o://7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.198198 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-server" 
containerID="cri-o://45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.198259 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-api" containerID="cri-o://bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.198388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.202308 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.202470 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-central-agent" containerID="cri-o://aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.202538 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="proxy-httpd" containerID="cri-o://ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.202569 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="sg-core" containerID="cri-o://6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.202595 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-notification-agent" containerID="cri-o://0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.263176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret\") pod \"703b2784-b7db-4424-b479-91d7c53f9ecb\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.263272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config\") pod \"703b2784-b7db-4424-b479-91d7c53f9ecb\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.263451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69sz9\" (UniqueName: \"kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9\") pod 
\"703b2784-b7db-4424-b479-91d7c53f9ecb\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.263473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle\") pod \"703b2784-b7db-4424-b479-91d7c53f9ecb\" (UID: \"703b2784-b7db-4424-b479-91d7c53f9ecb\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.267531 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.267875 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" containerID="cri-o://765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.268782 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.268811 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.270333 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.270370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "886eef18-1c15-40bc-a798-217df2a568ff" (UID: "886eef18-1c15-40bc-a798-217df2a568ff"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.274409 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.275019 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" podUID="23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.275491 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" podUID="1adc2a4a-9139-449f-a755-1a6ede8c141d" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.286865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9" (OuterVolumeSpecName: "kube-api-access-69sz9") pod "703b2784-b7db-4424-b479-91d7c53f9ecb" (UID: "703b2784-b7db-4424-b479-91d7c53f9ecb"). InnerVolumeSpecName "kube-api-access-69sz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.289706 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.331237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "703b2784-b7db-4424-b479-91d7c53f9ecb" (UID: "703b2784-b7db-4424-b479-91d7c53f9ecb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.331637 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-wmpkb"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.341460 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-wmpkb"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.352472 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "703b2784-b7db-4424-b479-91d7c53f9ecb" (UID: "703b2784-b7db-4424-b479-91d7c53f9ecb"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.362409 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.370402 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "703b2784-b7db-4424-b479-91d7c53f9ecb" (UID: "703b2784-b7db-4424-b479-91d7c53f9ecb"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.374741 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-zr5tr"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378495 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69sz9\" (UniqueName: \"kubernetes.io/projected/703b2784-b7db-4424-b479-91d7c53f9ecb-kube-api-access-69sz9\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378535 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378547 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378556 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378564 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/886eef18-1c15-40bc-a798-217df2a568ff-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.378574 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/703b2784-b7db-4424-b479-91d7c53f9ecb-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.382712 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-zr5tr"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.398404 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.398670 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" containerID="cri-o://0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.399221 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" containerID="cri-o://b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.400286 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.409604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.424126 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-wrlxj"] Jan 20 23:02:40 
crc kubenswrapper[5030]: I0120 23:02:40.432667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.432881 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4789919e-c06b-413a-8185-6766523e9925" containerName="nova-cell0-conductor-conductor" containerID="cri-o://492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.438663 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.446838 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.447091 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="5699cc7d-e528-4022-bff1-7057062e4a03" containerName="nova-cell1-conductor-conductor" containerID="cri-o://3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.461606 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.462068 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c4a387ba-cb79-4f5d-a313-37204d51d189" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.467180 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.472699 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-nzf7r"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.499652 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.505419 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.505650 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="6e835bcd-3585-4a51-ba54-aeb66b426afa" containerName="kube-state-metrics" containerID="cri-o://4224aea8dfc41047d5b9bd1083b109636d208592edde37fe0ce27e2bbe90849f" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.514141 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="galera" containerID="cri-o://c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.527048 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.531810 5030 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.255:8080/healthcheck\": dial tcp 10.217.0.255:8080: connect: connection refused" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.532136 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.255:8080/healthcheck\": dial tcp 10.217.0.255:8080: connect: connection refused" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.545208 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.551674 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "nova_cell1" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell1" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.553380 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" podUID="43e23459-b501-42f3-bf9c-2d50df954c66" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.557689 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl"] Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.558131 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="ovsdbserver-nb" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.558149 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="ovsdbserver-nb" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.558157 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="openstack-network-exporter" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.558164 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="openstack-network-exporter" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.558330 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="openstack-network-exporter" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.558349 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eef18-1c15-40bc-a798-217df2a568ff" containerName="ovsdbserver-nb" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.558958 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.562060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.565153 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.573279 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="rabbitmq" containerID="cri-o://2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a" gracePeriod=604800 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.577687 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.581485 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.585212 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.585263 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data podName:b9bc4ba4-4b03-441b-b492-6de67c2647b6 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.585249111 +0000 UTC m=+1634.905509399 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.586438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.598201 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-476mm"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.608051 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-476mm"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.613256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.616388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" containerName="memcached" containerID="cri-o://ff2baf8e3414b506fcd4fe9cdcb3f98e506b98207980d6f88effbe946d2e0a60" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.620825 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.629309 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.639112 5030 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cx7lh"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.643862 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cx7lh"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.648461 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-c82wm"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.662170 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-c82wm"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.684756 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29482501-nv2t9"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.686544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.686930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tgls\" (UniqueName: \"kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687202 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687302 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.687286732 +0000 UTC m=+1634.007547020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "cert-galera-openstack-svc" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687400 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687474 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.687465476 +0000 UTC m=+1634.007725764 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "combined-ca-bundle" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687331 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687603 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.687596169 +0000 UTC m=+1634.007856457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687416 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687758 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.687751684 +0000 UTC m=+1634.008011972 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687239 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.687880 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.687873987 +0000 UTC m=+1634.008134265 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.689524 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cron-29482501-nv2t9"] Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.691556 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.692932 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" podUID="ea9bfa72-d35e-4dad-9fcf-747c90607c2d" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.694006 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.698596 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.698907 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" podUID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" containerName="keystone-api" containerID="cri-o://03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2" gracePeriod=30 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.704715 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-brxx2"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.713232 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-brxx2"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.718914 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl"] Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.719589 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5tgls operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" podUID="6a8b63a6-0ddc-47f6-9cc3-30862eb55edf" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.741958 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.762682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.789244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.789450 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.789534 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. 
No retries permitted until 2026-01-20 23:02:41.289512757 +0000 UTC m=+1633.609773045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.789564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgls\" (UniqueName: \"kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.794195 5030 projected.go:194] Error preparing data for projected volume kube-api-access-5tgls for pod openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.794251 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.294235032 +0000 UTC m=+1633.614495320 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5tgls" (UniqueName: "kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.806582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.836490 5030 generic.go:334] "Generic (PLEG): container finished" podID="6e835bcd-3585-4a51-ba54-aeb66b426afa" containerID="4224aea8dfc41047d5b9bd1083b109636d208592edde37fe0ce27e2bbe90849f" exitCode=2 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.836544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6e835bcd-3585-4a51-ba54-aeb66b426afa","Type":"ContainerDied","Data":"4224aea8dfc41047d5b9bd1083b109636d208592edde37fe0ce27e2bbe90849f"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.837209 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.839752 5030 generic.go:334] "Generic (PLEG): container finished" podID="177bf646-3453-466c-ac69-7777db44dda5" containerID="45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906" exitCode=0 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.839772 5030 generic.go:334] "Generic (PLEG): container finished" podID="177bf646-3453-466c-ac69-7777db44dda5" containerID="c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29" exitCode=0 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.839800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerDied","Data":"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.839815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerDied","Data":"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.839830 5030 scope.go:117] "RemoveContainer" containerID="45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.846003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" event={"ID":"ea9bfa72-d35e-4dad-9fcf-747c90607c2d","Type":"ContainerStarted","Data":"b0cff95ace65239bbe602d15dcde1ff8d6a3dbc69cf195ce50523ac148de07ef"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.852244 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.889711 5030 generic.go:334] "Generic (PLEG): container finished" podID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerID="8da669032a42ae5fafc499507531665ae7680439dcecff59acbf618249a406b2" exitCode=143 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.889767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerDied","Data":"8da669032a42ae5fafc499507531665ae7680439dcecff59acbf618249a406b2"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.893313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.893352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.893861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgg8t\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.894566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data\") pod \"177bf646-3453-466c-ac69-7777db44dda5\" (UID: \"177bf646-3453-466c-ac69-7777db44dda5\") " Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.895287 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.895305 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/177bf646-3453-466c-ac69-7777db44dda5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.898566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.899876 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f" exitCode=0 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.899897 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994" exitCode=0 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.899948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.899966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.904018 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerID="aaad7c24ea8a9b3796aa4e5cfe2cc0e940b0c172d22637110c91ea576ac7f3ca" exitCode=143 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.904054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerDied","Data":"aaad7c24ea8a9b3796aa4e5cfe2cc0e940b0c172d22637110c91ea576ac7f3ca"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.918079 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerID="3bdf48c04934d54f7661f747d57939590f3632f793a8c0aa8ae04262dd469c85" exitCode=143 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.918164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerDied","Data":"3bdf48c04934d54f7661f747d57939590f3632f793a8c0aa8ae04262dd469c85"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.919000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t" (OuterVolumeSpecName: "kube-api-access-qgg8t") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "kube-api-access-qgg8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.919957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" event={"ID":"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3","Type":"ContainerStarted","Data":"f653eeff8519405fd6277a476af479ebbfee38fefa0a0138ddf8c03e46c0d408"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.920236 5030 scope.go:117] "RemoveContainer" containerID="c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.939409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" event={"ID":"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a","Type":"ContainerStarted","Data":"ba3a0ea5399c82383ea22cd9e5e09b92b84a0316f3ec57dc864b445c2f9a74d0"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.940682 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" secret="" err="secret \"galera-openstack-dockercfg-jcxx5\" not found" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.951632 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerID="0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9" exitCode=143 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.951690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerDied","Data":"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9"} Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.965386 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.967662 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" podUID="74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.968430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" event={"ID":"7be6ad04-b67e-42c1-867f-996017de8a50","Type":"ContainerStarted","Data":"956e5d6a8b04b4095c00aa0154120c10bf4af15a6e53c980483d9bd9d32f8aba"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.969108 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-4j2gm" secret="" err="secret \"galera-openstack-cell1-dockercfg-bz4km\" not found" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.974180 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:40 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:02:40 crc kubenswrapper[5030]: else Jan 20 23:02:40 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:40 crc kubenswrapper[5030]: fi Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:40 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:40 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:40 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:40 crc kubenswrapper[5030]: # support updates Jan 20 23:02:40 crc kubenswrapper[5030]: Jan 20 23:02:40 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.975717 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" podUID="7be6ad04-b67e-42c1-867f-996017de8a50" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.978011 5030 generic.go:334] "Generic (PLEG): container finished" podID="7dfd1851-3088-4199-9b12-47964306a9b5" containerID="dd92b5dd5973bf0786d8d8b9c80aff684fc1a3124f9a956ceefe2ee92d51545f" exitCode=0 Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.978120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerDied","Data":"dd92b5dd5973bf0786d8d8b9c80aff684fc1a3124f9a956ceefe2ee92d51545f"} Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.995303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.996565 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgg8t\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-kube-api-access-qgg8t\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.996590 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: I0120 23:02:40.996600 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/177bf646-3453-466c-ac69-7777db44dda5-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.997265 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:40 crc kubenswrapper[5030]: E0120 23:02:40.997317 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts podName:74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.497297089 +0000 UTC m=+1633.817557377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts") pod "cinder-4cc4-account-create-update-49tr4" (UID: "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.009215 5030 generic.go:334] "Generic (PLEG): container finished" podID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerID="c011608dc0cda953d39ae86ce2df8b44ddec5366f2481a6fa9504f761cc6409a" exitCode=143 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.009363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8","Type":"ContainerDied","Data":"c011608dc0cda953d39ae86ce2df8b44ddec5366f2481a6fa9504f761cc6409a"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.015367 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.016182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data" (OuterVolumeSpecName: "config-data") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.017778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "177bf646-3453-466c-ac69-7777db44dda5" (UID: "177bf646-3453-466c-ac69-7777db44dda5"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.020972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" event={"ID":"1adc2a4a-9139-449f-a755-1a6ede8c141d","Type":"ContainerStarted","Data":"860c3a0d3d06a15d598175c59609c54a5a058e3d23f646aa2bec257f177e19f5"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.057259 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerID="7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0" exitCode=143 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.057544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerDied","Data":"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.068660 5030 generic.go:334] "Generic (PLEG): container finished" podID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerID="97c9b3af0a221749a5d3d0a48cb4ab7d818c47e0f19d9b433358dccfb2edecb5" exitCode=0 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.068746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerDied","Data":"97c9b3af0a221749a5d3d0a48cb4ab7d818c47e0f19d9b433358dccfb2edecb5"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.073078 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.251:9696/\": dial tcp 10.217.0.251:9696: connect: connection refused" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.079504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" event={"ID":"207ab9b6-de27-45b0-945c-68ed0ab8afbf","Type":"ContainerStarted","Data":"61d8ae2b8c363577b217d9e51a4c7799afc9e31d8f835c0fccf03aa388711626"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.080132 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" secret="" err="secret \"galera-openstack-dockercfg-jcxx5\" not found" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.082457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" event={"ID":"43e23459-b501-42f3-bf9c-2d50df954c66","Type":"ContainerStarted","Data":"29b62b5c41db7c6f8dc6d1d8fbf5037522dffc1da90f1fe13ad075a2703aeea8"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.102673 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.102709 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.102719 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177bf646-3453-466c-ac69-7777db44dda5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.103143 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.103228 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.103368 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts podName:43e23459-b501-42f3-bf9c-2d50df954c66 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.103341866 +0000 UTC m=+1635.423602154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts") pod "nova-cell1-7aac-account-create-update-f7s69" (UID: "43e23459-b501-42f3-bf9c-2d50df954c66") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.103466 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts podName:7be6ad04-b67e-42c1-867f-996017de8a50 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.603456068 +0000 UTC m=+1633.923716356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts") pod "root-account-create-update-4j2gm" (UID: "7be6ad04-b67e-42c1-867f-996017de8a50") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.105969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" event={"ID":"f6db0c18-060b-4be2-9a11-3df481a3abca","Type":"ContainerStarted","Data":"4af79e0461ca284398575e67a2d268a9da3cf44ec5315b0cf25a34c1a07995cd"} Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.119906 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:02:41 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:02:41 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:02:41 crc kubenswrapper[5030]: else Jan 20 23:02:41 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:02:41 crc kubenswrapper[5030]: fi Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:02:41 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:02:41 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:02:41 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:02:41 crc kubenswrapper[5030]: # support updates Jan 20 23:02:41 crc kubenswrapper[5030]: Jan 20 23:02:41 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.121749 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" podUID="207ab9b6-de27-45b0-945c-68ed0ab8afbf" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140148 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerID="ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02" exitCode=0 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140201 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerID="6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333" exitCode=2 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140211 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerID="aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856" exitCode=0 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerDied","Data":"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerDied","Data":"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.140488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerDied","Data":"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.150377 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_886eef18-1c15-40bc-a798-217df2a568ff/ovsdbserver-nb/0.log" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.150809 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.152845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"886eef18-1c15-40bc-a798-217df2a568ff","Type":"ContainerDied","Data":"4c5c790317fff819e3752f45df8b1bf73074faeb39193fba5dcd1b456a8cce1e"} Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.153071 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.207319 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.207376 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data podName:83f11c15-8856-45fd-9703-348310781d5a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:45.207362505 +0000 UTC m=+1637.527622793 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data") pod "rabbitmq-server-0" (UID: "83f11c15-8856-45fd-9703-348310781d5a") : configmap "rabbitmq-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.208711 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.208747 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts podName:207ab9b6-de27-45b0-945c-68ed0ab8afbf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:41.708737349 +0000 UTC m=+1634.028997637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts") pod "neutron-e899-account-create-update-rq7hk" (UID: "207ab9b6-de27-45b0-945c-68ed0ab8afbf") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.233984 5030 scope.go:117] "RemoveContainer" containerID="45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.235130 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906\": container with ID starting with 45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906 not found: ID does not exist" containerID="45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.235160 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906"} err="failed to get container status \"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906\": rpc error: code = NotFound desc = could not find container \"45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906\": container with ID starting with 45ada1c5b5263d9253d72b7b3529d52163687af3862b2a42b49f7e5e3b235906 not found: ID does not exist" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.235179 5030 scope.go:117] "RemoveContainer" containerID="c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.235801 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29\": container with ID starting with 
c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29 not found: ID does not exist" containerID="c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.235859 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29"} err="failed to get container status \"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29\": rpc error: code = NotFound desc = could not find container \"c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29\": container with ID starting with c4c34433a2e5244dfc75cef087ad787ef3f93d3b31b9b4666e5d1d0506c65a29 not found: ID does not exist" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.235889 5030 scope.go:117] "RemoveContainer" containerID="bd4a155b76873118c2373604fae7e69b21203161dbc22f566ab70841f2e1c6ae" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.262215 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.288188 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.288212 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.288253 5030 scope.go:117] "RemoveContainer" containerID="0baa25f272096765a528195b72ca66c3413a72c927cbfa02fe2cad36d4bbd81c" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.292849 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.293074 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.308843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config\") pod \"6e835bcd-3585-4a51-ba54-aeb66b426afa\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.308968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle\") pod \"6e835bcd-3585-4a51-ba54-aeb66b426afa\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.309085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs\") pod \"6e835bcd-3585-4a51-ba54-aeb66b426afa\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.309292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b6zw\" (UniqueName: \"kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw\") pod \"6e835bcd-3585-4a51-ba54-aeb66b426afa\" (UID: \"6e835bcd-3585-4a51-ba54-aeb66b426afa\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.310743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgls\" (UniqueName: \"kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.310902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.311093 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.311137 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.311123317 +0000 UTC m=+1634.631383605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.315797 5030 projected.go:194] Error preparing data for projected volume kube-api-access-5tgls for pod openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.315861 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.315844092 +0000 UTC m=+1634.636104380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5tgls" (UniqueName: "kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.320102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw" (OuterVolumeSpecName: "kube-api-access-7b6zw") pod "6e835bcd-3585-4a51-ba54-aeb66b426afa" (UID: "6e835bcd-3585-4a51-ba54-aeb66b426afa"). InnerVolumeSpecName "kube-api-access-7b6zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.348441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "6e835bcd-3585-4a51-ba54-aeb66b426afa" (UID: "6e835bcd-3585-4a51-ba54-aeb66b426afa"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.357214 5030 scope.go:117] "RemoveContainer" containerID="2e33323b62f3534db68317bbdcfa4b96b235ccb6f864576f5bc4f881659045cb" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.385374 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.386031 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-server" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-server" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.386069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e835bcd-3585-4a51-ba54-aeb66b426afa" containerName="kube-state-metrics" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386076 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e835bcd-3585-4a51-ba54-aeb66b426afa" containerName="kube-state-metrics" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.386091 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-httpd" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386096 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-httpd" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386642 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-httpd" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386660 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="177bf646-3453-466c-ac69-7777db44dda5" containerName="proxy-server" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.386681 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e835bcd-3585-4a51-ba54-aeb66b426afa" containerName="kube-state-metrics" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.387893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "6e835bcd-3585-4a51-ba54-aeb66b426afa" (UID: "6e835bcd-3585-4a51-ba54-aeb66b426afa"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.388115 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.395981 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.407827 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.414745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts\") pod \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.414819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhn7h\" (UniqueName: \"kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h\") pod \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\" (UID: \"ea9bfa72-d35e-4dad-9fcf-747c90607c2d\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.415417 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.415430 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b6zw\" (UniqueName: \"kubernetes.io/projected/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-api-access-7b6zw\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.415438 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.416923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea9bfa72-d35e-4dad-9fcf-747c90607c2d" (UID: "ea9bfa72-d35e-4dad-9fcf-747c90607c2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.424381 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h" (OuterVolumeSpecName: "kube-api-access-hhn7h") pod "ea9bfa72-d35e-4dad-9fcf-747c90607c2d" (UID: "ea9bfa72-d35e-4dad-9fcf-747c90607c2d"). InnerVolumeSpecName "kube-api-access-hhn7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.424483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e835bcd-3585-4a51-ba54-aeb66b426afa" (UID: "6e835bcd-3585-4a51-ba54-aeb66b426afa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.520368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clv2r\" (UniqueName: \"kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r\") pod \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.520756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts\") pod \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\" (UID: \"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6x6\" (UniqueName: \"kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521710 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e835bcd-3585-4a51-ba54-aeb66b426afa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521735 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.521749 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhn7h\" (UniqueName: \"kubernetes.io/projected/ea9bfa72-d35e-4dad-9fcf-747c90607c2d-kube-api-access-hhn7h\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.523016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3" (UID: "23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.523101 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.523160 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts podName:74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.523139821 +0000 UTC m=+1634.843400109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts") pod "cinder-4cc4-account-create-update-49tr4" (UID: "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.523749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r" (OuterVolumeSpecName: "kube-api-access-clv2r") pod "23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3" (UID: "23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3"). InnerVolumeSpecName "kube-api-access-clv2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.526288 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="galera" containerID="cri-o://b2bea81020fe0e73b0b3e7d490ce3abe8750029ce24d8cb352e08256fba27883" gracePeriod=30 Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.623695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6x6\" (UniqueName: \"kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.623726 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.623761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.623808 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts podName:7be6ad04-b67e-42c1-867f-996017de8a50 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.623786638 +0000 UTC m=+1634.944046986 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts") pod "root-account-create-update-4j2gm" (UID: "7be6ad04-b67e-42c1-867f-996017de8a50") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.623957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.624207 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clv2r\" (UniqueName: \"kubernetes.io/projected/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-kube-api-access-clv2r\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.624223 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.624318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.624495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.644584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6x6\" (UniqueName: \"kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6\") pod \"community-operators-vljcv\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.647171 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.712042 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.718169 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.724298 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.724788 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle\") pod \"c4a387ba-cb79-4f5d-a313-37204d51d189\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.724823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs\") pod \"c4a387ba-cb79-4f5d-a313-37204d51d189\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.724931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmwmf\" (UniqueName: \"kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf\") pod \"c4a387ba-cb79-4f5d-a313-37204d51d189\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.725015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs\") pod \"c4a387ba-cb79-4f5d-a313-37204d51d189\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.725050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data\") pod \"c4a387ba-cb79-4f5d-a313-37204d51d189\" (UID: \"c4a387ba-cb79-4f5d-a313-37204d51d189\") " Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.728167 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.728222 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.728207917 +0000 UTC m=+1636.048468205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729292 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-config-data: configmap "openstack-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729344 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.729328843 +0000 UTC m=+1636.049589211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-config-data" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729760 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729820 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.729809795 +0000 UTC m=+1636.050070083 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "cert-galera-openstack-svc" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729850 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729898 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.729890547 +0000 UTC m=+1636.050150835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729935 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.729982 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle podName:9a16fd61-1cde-4e12-8e9f-db0c8f2182b7 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:43.729976289 +0000 UTC m=+1636.050236577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle") pod "openstack-galera-0" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7") : secret "combined-ca-bundle" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.730014 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: E0120 23:02:41.730139 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts podName:207ab9b6-de27-45b0-945c-68ed0ab8afbf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:42.730097862 +0000 UTC m=+1635.050358150 (durationBeforeRetry 1s). 
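
For openstack-galera-0, every ConfigMap- and Secret-backed volume (kolla-config, config-data-default, galera-tls-certs, operator-scripts, combined-ca-bundle) fails the same way because the backing objects have been deleted while the pod still exists. A hedged client-go sketch that walks a pod's volume sources and reports which referenced ConfigMaps and Secrets are missing; the pod and namespace names come from the log, the audit itself is only an illustration:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a default kubeconfig at ~/.kube/config.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ns, podName := "openstack-kuttl-tests", "openstack-galera-0"
	ctx := context.Background()

	pod, err := cs.CoreV1().Pods(ns).Get(ctx, podName, metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.ConfigMap != nil:
			if _, err := cs.CoreV1().ConfigMaps(ns).Get(ctx, v.ConfigMap.Name, metav1.GetOptions{}); err != nil {
				fmt.Printf("volume %q: configmap %q missing: %v\n", v.Name, v.ConfigMap.Name, err)
			}
		case v.Secret != nil:
			if _, err := cs.CoreV1().Secrets(ns).Get(ctx, v.Secret.SecretName, metav1.GetOptions{}); err != nil {
				fmt.Printf("volume %q: secret %q missing: %v\n", v.Name, v.Secret.SecretName, err)
			}
		}
	}
}
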
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts") pod "neutron-e899-account-create-update-rq7hk" (UID: "207ab9b6-de27-45b0-945c-68ed0ab8afbf") : configmap "openstack-scripts" not found Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.732327 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.739433 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf" (OuterVolumeSpecName: "kube-api-access-rmwmf") pod "c4a387ba-cb79-4f5d-a313-37204d51d189" (UID: "c4a387ba-cb79-4f5d-a313-37204d51d189"). InnerVolumeSpecName "kube-api-access-rmwmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.764013 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts\") pod 
\"1adc2a4a-9139-449f-a755-1a6ede8c141d\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbtjb\" (UniqueName: \"kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb\") pod \"43e23459-b501-42f3-bf9c-2d50df954c66\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data\") pod \"5699cc7d-e528-4022-bff1-7057062e4a03\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf9zn\" (UniqueName: \"kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn\") pod \"1adc2a4a-9139-449f-a755-1a6ede8c141d\" (UID: \"1adc2a4a-9139-449f-a755-1a6ede8c141d\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25csx\" (UniqueName: \"kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle\") pod \"5699cc7d-e528-4022-bff1-7057062e4a03\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzg2f\" (UniqueName: \"kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f\") pod \"5699cc7d-e528-4022-bff1-7057062e4a03\" (UID: \"5699cc7d-e528-4022-bff1-7057062e4a03\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default\") pod \"f927a168-d367-4174-9e6e-fd9b7964346b\" (UID: \"f927a168-d367-4174-9e6e-fd9b7964346b\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.829863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts\") pod \"43e23459-b501-42f3-bf9c-2d50df954c66\" (UID: \"43e23459-b501-42f3-bf9c-2d50df954c66\") " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.830268 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmwmf\" (UniqueName: \"kubernetes.io/projected/c4a387ba-cb79-4f5d-a313-37204d51d189-kube-api-access-rmwmf\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.830590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"43e23459-b501-42f3-bf9c-2d50df954c66" (UID: "43e23459-b501-42f3-bf9c-2d50df954c66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.830929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.835662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.835694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.836265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4a387ba-cb79-4f5d-a313-37204d51d189" (UID: "c4a387ba-cb79-4f5d-a313-37204d51d189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.836386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.837051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1adc2a4a-9139-449f-a755-1a6ede8c141d" (UID: "1adc2a4a-9139-449f-a755-1a6ede8c141d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.873548 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn" (OuterVolumeSpecName: "kube-api-access-rf9zn") pod "1adc2a4a-9139-449f-a755-1a6ede8c141d" (UID: "1adc2a4a-9139-449f-a755-1a6ede8c141d"). InnerVolumeSpecName "kube-api-access-rf9zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.876371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f" (OuterVolumeSpecName: "kube-api-access-gzg2f") pod "5699cc7d-e528-4022-bff1-7057062e4a03" (UID: "5699cc7d-e528-4022-bff1-7057062e4a03"). InnerVolumeSpecName "kube-api-access-gzg2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.877683 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx" (OuterVolumeSpecName: "kube-api-access-25csx") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "kube-api-access-25csx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.877795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb" (OuterVolumeSpecName: "kube-api-access-zbtjb") pod "43e23459-b501-42f3-bf9c-2d50df954c66" (UID: "43e23459-b501-42f3-bf9c-2d50df954c66"). InnerVolumeSpecName "kube-api-access-zbtjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.905316 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943682 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25csx\" (UniqueName: \"kubernetes.io/projected/f927a168-d367-4174-9e6e-fd9b7964346b-kube-api-access-25csx\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943712 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzg2f\" (UniqueName: \"kubernetes.io/projected/5699cc7d-e528-4022-bff1-7057062e4a03-kube-api-access-gzg2f\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943721 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943732 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43e23459-b501-42f3-bf9c-2d50df954c66-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943741 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943750 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f927a168-d367-4174-9e6e-fd9b7964346b-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943758 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f927a168-d367-4174-9e6e-fd9b7964346b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943778 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943787 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1adc2a4a-9139-449f-a755-1a6ede8c141d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943811 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbtjb\" (UniqueName: \"kubernetes.io/projected/43e23459-b501-42f3-bf9c-2d50df954c66-kube-api-access-zbtjb\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943820 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf9zn\" (UniqueName: \"kubernetes.io/projected/1adc2a4a-9139-449f-a755-1a6ede8c141d-kube-api-access-rf9zn\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.943829 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.978274 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.980322 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b38a913-9251-435b-9656-5c2cee72e288" path="/var/lib/kubelet/pods/0b38a913-9251-435b-9656-5c2cee72e288/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.980847 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbd22d0-db4b-485c-b2f6-20bfa502543f" path="/var/lib/kubelet/pods/0dbd22d0-db4b-485c-b2f6-20bfa502543f/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.981843 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a712d79-3228-4e70-ae9d-8436a0dee203" path="/var/lib/kubelet/pods/1a712d79-3228-4e70-ae9d-8436a0dee203/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.982826 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399e28e7-a389-47c4-b8c2-89d8dbad34be" path="/var/lib/kubelet/pods/399e28e7-a389-47c4-b8c2-89d8dbad34be/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.984034 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703b2784-b7db-4424-b479-91d7c53f9ecb" path="/var/lib/kubelet/pods/703b2784-b7db-4424-b479-91d7c53f9ecb/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.984523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data" (OuterVolumeSpecName: "config-data") pod "5699cc7d-e528-4022-bff1-7057062e4a03" (UID: "5699cc7d-e528-4022-bff1-7057062e4a03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.984583 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70aec167-1edb-4bc8-8170-f769f2672134" path="/var/lib/kubelet/pods/70aec167-1edb-4bc8-8170-f769f2672134/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.985147 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d75890-b0d8-4622-8471-e44bdc3e925f" path="/var/lib/kubelet/pods/84d75890-b0d8-4622-8471-e44bdc3e925f/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.986339 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886eef18-1c15-40bc-a798-217df2a568ff" path="/var/lib/kubelet/pods/886eef18-1c15-40bc-a798-217df2a568ff/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.986864 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b1f55a-64ae-4fc1-91d3-5d65502e870c" path="/var/lib/kubelet/pods/94b1f55a-64ae-4fc1-91d3-5d65502e870c/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.987315 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a601abae-f93a-48ec-8597-00ef38c3e1e6" path="/var/lib/kubelet/pods/a601abae-f93a-48ec-8597-00ef38c3e1e6/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.988490 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82005b0-b1f6-4884-bc9a-f131a62edd97" path="/var/lib/kubelet/pods/c82005b0-b1f6-4884-bc9a-f131a62edd97/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.989286 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce959850-3505-4e4b-9055-caa9b213416a" path="/var/lib/kubelet/pods/ce959850-3505-4e4b-9055-caa9b213416a/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.990260 
5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1552c5-e1e0-4c78-8860-243a48c39e6a" path="/var/lib/kubelet/pods/db1552c5-e1e0-4c78-8860-243a48c39e6a/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.991384 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90be399-4f89-42ef-950a-2ef8c14a4d0f" path="/var/lib/kubelet/pods/e90be399-4f89-42ef-950a-2ef8c14a4d0f/volumes" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.993805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5699cc7d-e528-4022-bff1-7057062e4a03" (UID: "5699cc7d-e528-4022-bff1-7057062e4a03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:41 crc kubenswrapper[5030]: I0120 23:02:41.995783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "c4a387ba-cb79-4f5d-a313-37204d51d189" (UID: "c4a387ba-cb79-4f5d-a313-37204d51d189"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.028465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.035250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "c4a387ba-cb79-4f5d-a313-37204d51d189" (UID: "c4a387ba-cb79-4f5d-a313-37204d51d189"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.036206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data" (OuterVolumeSpecName: "config-data") pod "c4a387ba-cb79-4f5d-a313-37204d51d189" (UID: "c4a387ba-cb79-4f5d-a313-37204d51d189"). InnerVolumeSpecName "config-data". 
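
The "Cleaned up orphaned pod volumes dir" lines show the kubelet removing /var/lib/kubelet/pods/<uid>/volumes for pods that no longer exist in the API. A small sketch, run on the node and assuming the default kubelet root directory, that lists which pod directories still carry a volumes/ subdirectory, i.e. what would still be left for this cleanup pass:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	root := "/var/lib/kubelet/pods" // default kubelet pod directory; adjust if --root-dir differs
	entries, err := os.ReadDir(root)
	if err != nil {
		panic(err)
	}
	for _, e := range entries {
		if !e.IsDir() {
			continue
		}
		volDir := filepath.Join(root, e.Name(), "volumes")
		if info, err := os.Stat(volDir); err == nil && info.IsDir() {
			fmt.Printf("pod %s still has a volumes dir: %s\n", e.Name(), volDir)
		}
	}
}
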
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047904 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047953 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047966 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5699cc7d-e528-4022-bff1-7057062e4a03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047975 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047985 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.047993 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a387ba-cb79-4f5d-a313-37204d51d189-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.048001 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.069576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f927a168-d367-4174-9e6e-fd9b7964346b" (UID: "f927a168-d367-4174-9e6e-fd9b7964346b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.151507 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f927a168-d367-4174-9e6e-fd9b7964346b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.188927 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.197440 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerID="b2bea81020fe0e73b0b3e7d490ce3abe8750029ce24d8cb352e08256fba27883" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.200949 5030 generic.go:334] "Generic (PLEG): container finished" podID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerID="54990945a497a45075119ed3dc036df2589703681e3a56dcfccafab5ebeb782b" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.203216 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" containerID="ff2baf8e3414b506fcd4fe9cdcb3f98e506b98207980d6f88effbe946d2e0a60" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.218360 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225123 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerID="32dd60b784789cf7134c38532e81ef67d0d751a5bd26694f1fd92042ac4cc701" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2" event={"ID":"1adc2a4a-9139-449f-a755-1a6ede8c141d","Type":"ContainerDied","Data":"860c3a0d3d06a15d598175c59609c54a5a058e3d23f646aa2bec257f177e19f5"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerDied","Data":"b2bea81020fe0e73b0b3e7d490ce3abe8750029ce24d8cb352e08256fba27883"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerDied","Data":"54990945a497a45075119ed3dc036df2589703681e3a56dcfccafab5ebeb782b"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9","Type":"ContainerDied","Data":"ff2baf8e3414b506fcd4fe9cdcb3f98e506b98207980d6f88effbe946d2e0a60"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69" event={"ID":"43e23459-b501-42f3-bf9c-2d50df954c66","Type":"ContainerDied","Data":"29b62b5c41db7c6f8dc6d1d8fbf5037522dffc1da90f1fe13ad075a2703aeea8"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.225930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerDied","Data":"32dd60b784789cf7134c38532e81ef67d0d751a5bd26694f1fd92042ac4cc701"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.232636 5030 generic.go:334] "Generic (PLEG): container finished" podID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerID="91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a" exitCode=0 Jan 20 23:02:42 crc 
kubenswrapper[5030]: I0120 23:02:42.232698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" event={"ID":"f6db0c18-060b-4be2-9a11-3df481a3abca","Type":"ContainerDied","Data":"91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.237133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6e835bcd-3585-4a51-ba54-aeb66b426afa","Type":"ContainerDied","Data":"b2a7cfc6a0a635a29a4a4804cf1e8c3aa65170e2efd04603769ad6d5473595be"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.237171 5030 scope.go:117] "RemoveContainer" containerID="4224aea8dfc41047d5b9bd1083b109636d208592edde37fe0ce27e2bbe90849f" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.237253 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.250217 5030 generic.go:334] "Generic (PLEG): container finished" podID="f927a168-d367-4174-9e6e-fd9b7964346b" containerID="c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.250293 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.250306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerDied","Data":"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.250347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f927a168-d367-4174-9e6e-fd9b7964346b","Type":"ContainerDied","Data":"53e9263094fb938ecb15f86897c564340afd1397551859b0675e06fe8ca5851a"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.254010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" event={"ID":"177bf646-3453-466c-ac69-7777db44dda5","Type":"ContainerDied","Data":"46b888c998f50e5e6e9383ccf6b9cab1b056e9c057649e106064ec6ac0cbd6fb"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.254213 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.258017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" event={"ID":"ea9bfa72-d35e-4dad-9fcf-747c90607c2d","Type":"ContainerDied","Data":"b0cff95ace65239bbe602d15dcde1ff8d6a3dbc69cf195ce50523ac148de07ef"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.258063 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.269282 5030 generic.go:334] "Generic (PLEG): container finished" podID="5699cc7d-e528-4022-bff1-7057062e4a03" containerID="3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.269338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"5699cc7d-e528-4022-bff1-7057062e4a03","Type":"ContainerDied","Data":"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.269363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"5699cc7d-e528-4022-bff1-7057062e4a03","Type":"ContainerDied","Data":"039b12a18dd4009b921a385ac492c0fe4c0dc5a70de823c3c59cd5ff252ef641"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.269405 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.271742 5030 generic.go:334] "Generic (PLEG): container finished" podID="c4a387ba-cb79-4f5d-a313-37204d51d189" containerID="3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e" exitCode=0 Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.271802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c4a387ba-cb79-4f5d-a313-37204d51d189","Type":"ContainerDied","Data":"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.271821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c4a387ba-cb79-4f5d-a313-37204d51d189","Type":"ContainerDied","Data":"b7d2e5aba18fa2fbf1dac48da0ce44d2e5229298ef70dea322a5b649be96ac68"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.271873 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.273764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" event={"ID":"23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3","Type":"ContainerDied","Data":"f653eeff8519405fd6277a476af479ebbfee38fefa0a0138ddf8c03e46c0d408"} Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.274057 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.274107 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.355294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.355471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgls\" (UniqueName: \"kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls\") pod \"keystone-afd0-account-create-update-2xqzl\" (UID: \"6a8b63a6-0ddc-47f6-9cc3-30862eb55edf\") " pod="openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.356542 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.356672 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:44.356631733 +0000 UTC m=+1636.676892021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.358456 5030 projected.go:194] Error preparing data for projected volume kube-api-access-5tgls for pod openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.358517 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls podName:6a8b63a6-0ddc-47f6-9cc3-30862eb55edf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:44.358502789 +0000 UTC m=+1636.678763077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5tgls" (UniqueName: "kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls") pod "keystone-afd0-account-create-update-2xqzl" (UID: "6a8b63a6-0ddc-47f6-9cc3-30862eb55edf") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.595062 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.596655 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.596814 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data podName:b9bc4ba4-4b03-441b-b492-6de67c2647b6 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:02:46.596796742 +0000 UTC m=+1638.917057030 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.597175 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.597203 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts podName:74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a nodeName:}" failed. No retries permitted until 2026-01-20 23:02:44.597195061 +0000 UTC m=+1636.917455349 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts") pod "cinder-4cc4-account-create-update-49tr4" (UID: "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a") : configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.599344 5030 scope.go:117] "RemoveContainer" containerID="c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.656808 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.657193 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.658237 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.663040 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.663075 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.683888 5030 scope.go:117] "RemoveContainer" containerID="23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.693538 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 
23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgzr7\" (UniqueName: \"kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697706 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom\") pod \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\" (UID: \"f87ad01b-11b8-43e1-9a04-5947ff6b0441\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.697747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.698527 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f87ad01b-11b8-43e1-9a04-5947ff6b0441-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.698592 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.698649 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts podName:7be6ad04-b67e-42c1-867f-996017de8a50 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:44.698632557 +0000 UTC m=+1637.018892845 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts") pod "root-account-create-update-4j2gm" (UID: "7be6ad04-b67e-42c1-867f-996017de8a50") : configmap "openstack-cell1-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.706139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7" (OuterVolumeSpecName: "kube-api-access-tgzr7") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). InnerVolumeSpecName "kube-api-access-tgzr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.706694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts" (OuterVolumeSpecName: "scripts") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.722088 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.250:8776/healthcheck\": read tcp 10.217.0.2:45054->10.217.0.250:8776: read: connection reset by peer" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.725716 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-f6c7-account-create-update-w4mp2"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.741919 5030 scope.go:117] "RemoveContainer" containerID="c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.743088 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380\": container with ID starting with c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380 not found: ID does not exist" containerID="c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.743829 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380"} err="failed to get container status \"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380\": rpc error: code = NotFound desc = could not find container \"c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380\": container with ID starting with c4f519b9aa19de2a59b393d537b2988696190ba41760458874985232c34d2380 not found: ID does not exist" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.744386 5030 scope.go:117] "RemoveContainer" containerID="23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.744975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.747925 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.753038 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.753097 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b\": container with ID starting with 23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b not found: ID does not exist" containerID="23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.753120 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b"} err="failed to get container status \"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b\": rpc error: code = NotFound desc = could not find container \"23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b\": container with ID starting with 23246055da36c2f333112d28800e6e96f2ab653464ea645c0d2e1aab9a862a2b not found: ID does not exist" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.753274 5030 scope.go:117] "RemoveContainer" containerID="3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.770371 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.770415 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.776333 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs\") pod \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config\") pod \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data\") pod \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q59nz\" (UniqueName: \"kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz\") pod \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle\") pod \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\" (UID: \"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799886 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799985 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgzr7\" (UniqueName: \"kubernetes.io/projected/f87ad01b-11b8-43e1-9a04-5947ff6b0441-kube-api-access-tgzr7\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.799998 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.800050 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.800087 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts podName:207ab9b6-de27-45b0-945c-68ed0ab8afbf nodeName:}" failed. No retries permitted until 2026-01-20 23:02:44.800074093 +0000 UTC m=+1637.120334381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts") pod "neutron-e899-account-create-update-rq7hk" (UID: "207ab9b6-de27-45b0-945c-68ed0ab8afbf") : configmap "openstack-scripts" not found Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.800495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" (UID: "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.801102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data" (OuterVolumeSpecName: "config-data") pod "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" (UID: "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.812818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz" (OuterVolumeSpecName: "kube-api-access-q59nz") pod "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" (UID: "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9"). InnerVolumeSpecName "kube-api-access-q59nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.831798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.838069 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.841831 5030 scope.go:117] "RemoveContainer" containerID="3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.846953 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7\": container with ID starting with 3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7 not found: ID does not exist" containerID="3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.846977 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7"} err="failed to get container status \"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7\": rpc error: code = NotFound desc = could not find container \"3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7\": container with ID starting with 3840da32c6f4ec49386ce37acd28fcc6e739b91b2b7ca663f4470e40df7a6ce7 not found: ID does not exist" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.846995 5030 scope.go:117] "RemoveContainer" containerID="3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.848505 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.881849 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.891760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" (UID: "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.901770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t4kr\" (UniqueName: \"kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.901950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.901978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902719 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.902750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config\") pod \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\" (UID: \"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7\") " Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903207 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903233 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903244 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903257 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q59nz\" (UniqueName: \"kubernetes.io/projected/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-kube-api-access-q59nz\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903269 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.903279 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.906381 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.908778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.915928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.916053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data" (OuterVolumeSpecName: "config-data") pod "f87ad01b-11b8-43e1-9a04-5947ff6b0441" (UID: "f87ad01b-11b8-43e1-9a04-5947ff6b0441"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.916698 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.938860 5030 scope.go:117] "RemoveContainer" containerID="3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.939605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr" (OuterVolumeSpecName: "kube-api-access-6t4kr") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "kube-api-access-6t4kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: E0120 23:02:42.939873 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e\": container with ID starting with 3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e not found: ID does not exist" containerID="3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.939914 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e"} err="failed to get container status \"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e\": rpc error: code = NotFound desc = could not find container \"3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e\": container with ID starting with 3bcbe084389694d96a65b34b6aa0d2aa7c93b51780ed5c86c1d3a10c6a57ad9e not found: ID does not exist" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.944540 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.949720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "mysql-db") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.969840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" (UID: "6f73fe0c-f004-4b64-8952-cc6d7ebb26a9"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.974350 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.986885 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-f55649dc6-gfkb8"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.993528 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.994856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" (UID: "9a16fd61-1cde-4e12-8e9f-db0c8f2182b7"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:42 crc kubenswrapper[5030]: I0120 23:02:42.998671 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.008303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.008832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.008986 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6plg\" (UniqueName: \"kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg\") pod \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts\") pod \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\" (UID: \"207ab9b6-de27-45b0-945c-68ed0ab8afbf\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjnc\" (UniqueName: \"kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.009737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"0fc512e7-ef40-4589-a094-9906cf91a0d3\" (UID: \"0fc512e7-ef40-4589-a094-9906cf91a0d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "207ab9b6-de27-45b0-945c-68ed0ab8afbf" (UID: "207ab9b6-de27-45b0-945c-68ed0ab8afbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014611 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014687 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/207ab9b6-de27-45b0-945c-68ed0ab8afbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014741 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014794 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014845 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014907 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.014960 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f87ad01b-11b8-43e1-9a04-5947ff6b0441-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.015010 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t4kr\" (UniqueName: \"kubernetes.io/projected/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-kube-api-access-6t4kr\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.015061 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.015111 5030 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.015171 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.020479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc" (OuterVolumeSpecName: "kube-api-access-mdjnc") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "kube-api-access-mdjnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.031994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs" (OuterVolumeSpecName: "logs") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.037565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.045482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg" (OuterVolumeSpecName: "kube-api-access-c6plg") pod "207ab9b6-de27-45b0-945c-68ed0ab8afbf" (UID: "207ab9b6-de27-45b0-945c-68ed0ab8afbf"). InnerVolumeSpecName "kube-api-access-c6plg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.052969 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.059692 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-afd0-account-create-update-2xqzl"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.060027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts" (OuterVolumeSpecName: "scripts") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.085345 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.098414 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-be4a-account-create-update-6nlfp"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.103446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.105853 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.106990 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.114445 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117132 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117170 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6plg\" (UniqueName: \"kubernetes.io/projected/207ab9b6-de27-45b0-945c-68ed0ab8afbf-kube-api-access-c6plg\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117181 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjnc\" (UniqueName: \"kubernetes.io/projected/0fc512e7-ef40-4589-a094-9906cf91a0d3-kube-api-access-mdjnc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117198 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117206 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fc512e7-ef40-4589-a094-9906cf91a0d3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.117246 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.119325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data" (OuterVolumeSpecName: "config-data") pod 
"0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.128051 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.136055 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-d0e8-account-create-update-dg2g2"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.136664 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.143544 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.150004 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.156545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0fc512e7-ef40-4589-a094-9906cf91a0d3" (UID: "0fc512e7-ef40-4589-a094-9906cf91a0d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.164934 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.170418 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-7aac-account-create-update-f7s69"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.218825 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.218844 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.218868 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tgls\" (UniqueName: \"kubernetes.io/projected/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf-kube-api-access-5tgls\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.218877 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.218886 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc512e7-ef40-4589-a094-9906cf91a0d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.220092 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.225519 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.259989 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.273692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.290855 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.310109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" event={"ID":"7be6ad04-b67e-42c1-867f-996017de8a50","Type":"ContainerDied","Data":"956e5d6a8b04b4095c00aa0154120c10bf4af15a6e53c980483d9bd9d32f8aba"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.310412 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4j2gm" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cvm\" (UniqueName: \"kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm\") pod \"7be6ad04-b67e-42c1-867f-996017de8a50\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts\") pod \"7be6ad04-b67e-42c1-867f-996017de8a50\" (UID: \"7be6ad04-b67e-42c1-867f-996017de8a50\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"9a16fd61-1cde-4e12-8e9f-db0c8f2182b7","Type":"ContainerDied","Data":"48aa49b8cf040be83cb11c7fe449fde83c14cc4b734a69f5407c7447559a3e9b"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 
23:02:43.319976 5030 scope.go:117] "RemoveContainer" containerID="b2bea81020fe0e73b0b3e7d490ce3abe8750029ce24d8cb352e08256fba27883" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.319996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320058 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbb8k\" (UniqueName: \"kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs\") pod \"3e7e101a-46be-4de8-97af-d47cdce4ef90\" (UID: \"3e7e101a-46be-4de8-97af-d47cdce4ef90\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts\") pod \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320120 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.320141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzcc\" (UniqueName: \"kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc\") pod \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\" (UID: \"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.321680 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7be6ad04-b67e-42c1-867f-996017de8a50" (UID: "7be6ad04-b67e-42c1-867f-996017de8a50"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.322305 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs" (OuterVolumeSpecName: "logs") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.322592 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a" (UID: "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.325054 5030 generic.go:334] "Generic (PLEG): container finished" podID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerID="abe1b9d9c501552d8beeada5cb25d3fbed77f8cbf429b1c1351a006527292ef3" exitCode=0 Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.325148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerDied","Data":"abe1b9d9c501552d8beeada5cb25d3fbed77f8cbf429b1c1351a006527292ef3"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.335985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc" (OuterVolumeSpecName: "kube-api-access-pdzcc") pod "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a" (UID: "74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a"). InnerVolumeSpecName "kube-api-access-pdzcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.340701 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k" (OuterVolumeSpecName: "kube-api-access-fbb8k") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "kube-api-access-fbb8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.344072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" event={"ID":"f6db0c18-060b-4be2-9a11-3df481a3abca","Type":"ContainerStarted","Data":"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.344201 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.344912 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.350961 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.350995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"6f73fe0c-f004-4b64-8952-cc6d7ebb26a9","Type":"ContainerDied","Data":"7ad8983153ac237af54853992a2d9be4d360e8a81b2945f412f2e918378244b6"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.362545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts" (OuterVolumeSpecName: "scripts") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.363237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm" (OuterVolumeSpecName: "kube-api-access-z7cvm") pod "7be6ad04-b67e-42c1-867f-996017de8a50" (UID: "7be6ad04-b67e-42c1-867f-996017de8a50"). InnerVolumeSpecName "kube-api-access-z7cvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.364106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.365076 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_c4db8327-61c4-4757-bed5-728ef0f8bbc6/ovn-northd/0.log" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.365120 5030 generic.go:334] "Generic (PLEG): container finished" podID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerID="5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" exitCode=139 Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.365184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerDied","Data":"5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.369381 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.376905 5030 generic.go:334] "Generic (PLEG): container finished" podID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerID="1e2c3887d4bfb803bee6d88be2bd09aad12d8e435404a6d4a7f84152d7fd324e" exitCode=0 Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.377180 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.377312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8","Type":"ContainerDied","Data":"1e2c3887d4bfb803bee6d88be2bd09aad12d8e435404a6d4a7f84152d7fd324e"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.379300 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" podStartSLOduration=5.379287614 podStartE2EDuration="5.379287614s" podCreationTimestamp="2026-01-20 23:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:02:43.372512709 +0000 UTC m=+1635.692772997" watchObservedRunningTime="2026-01-20 23:02:43.379287614 +0000 UTC m=+1635.699547902" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.395856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"f87ad01b-11b8-43e1-9a04-5947ff6b0441","Type":"ContainerDied","Data":"6076955485ede4c367e36abab60b2f3dbe9f3f386fab65d2c32b33b048ad2447"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.395970 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.403858 5030 scope.go:117] "RemoveContainer" containerID="90624f592f91634ffe7a336b3a4f4109cf49de743eedeca34a6df4b0c20d7765" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.414219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data" (OuterVolumeSpecName: "config-data") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422164 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422163 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerID="913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273" exitCode=0 Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422240 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerDied","Data":"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422298 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" event={"ID":"3e7e101a-46be-4de8-97af-d47cdce4ef90","Type":"ContainerDied","Data":"f452b9ea45f1195ca7898583c3bfd33b36728479e749928f1d50f07d9c2da3b7"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lnh5\" (UniqueName: \"kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422297 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-777ff759db-mgr2r" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t67rg\" (UniqueName: \"kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: 
I0120 23:02:43.422576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"5d90984a-ad24-4f85-b39e-fff5b8152a04\" (UID: \"5d90984a-ad24-4f85-b39e-fff5b8152a04\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs\") pod \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\" (UID: \"47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422941 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422953 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422963 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbb8k\" (UniqueName: \"kubernetes.io/projected/3e7e101a-46be-4de8-97af-d47cdce4ef90-kube-api-access-fbb8k\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422972 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7e101a-46be-4de8-97af-d47cdce4ef90-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422981 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422989 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzcc\" (UniqueName: \"kubernetes.io/projected/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a-kube-api-access-pdzcc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.422999 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7cvm\" (UniqueName: \"kubernetes.io/projected/7be6ad04-b67e-42c1-867f-996017de8a50-kube-api-access-z7cvm\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.423007 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.423015 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7be6ad04-b67e-42c1-867f-996017de8a50-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.423291 
5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs" (OuterVolumeSpecName: "logs") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.423417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.423599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs" (OuterVolumeSpecName: "logs") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.445155 5030 generic.go:334] "Generic (PLEG): container finished" podID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerID="9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0" exitCode=0 Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.445252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerDied","Data":"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.445281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"5d90984a-ad24-4f85-b39e-fff5b8152a04","Type":"ContainerDied","Data":"5861c632c04e0fa81a9ea8da2cd209929246f83b14798ccd4ab85ffefebc3695"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.445352 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.450555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.451420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.453400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.455105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.455131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" event={"ID":"74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a","Type":"ContainerDied","Data":"ba3a0ea5399c82383ea22cd9e5e09b92b84a0316f3ec57dc864b445c2f9a74d0"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.455203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.458372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts" (OuterVolumeSpecName: "scripts") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.460191 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.461694 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.462148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk" event={"ID":"207ab9b6-de27-45b0-945c-68ed0ab8afbf","Type":"ContainerDied","Data":"61d8ae2b8c363577b217d9e51a4c7799afc9e31d8f835c0fccf03aa388711626"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.472359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5" (OuterVolumeSpecName: "kube-api-access-6lnh5") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "kube-api-access-6lnh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.472388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0fc512e7-ef40-4589-a094-9906cf91a0d3","Type":"ContainerDied","Data":"d6e925334f43dc8e3a0c2fdf773b419b58a41ffe0512709f9744f65729e1ab2d"} Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.472482 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.480895 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.482967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg" (OuterVolumeSpecName: "kube-api-access-t67rg") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "kube-api-access-t67rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.483050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts" (OuterVolumeSpecName: "scripts") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.483536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.488069 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.491026 5030 scope.go:117] "RemoveContainer" containerID="ff2baf8e3414b506fcd4fe9cdcb3f98e506b98207980d6f88effbe946d2e0a60" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.517706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.530776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e7e101a-46be-4de8-97af-d47cdce4ef90" (UID: "3e7e101a-46be-4de8-97af-d47cdce4ef90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.531878 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534576 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534638 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t67rg\" (UniqueName: \"kubernetes.io/projected/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-kube-api-access-t67rg\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534672 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534683 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534692 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534701 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534709 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534718 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534725 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534733 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d90984a-ad24-4f85-b39e-fff5b8152a04-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534742 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534750 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lnh5\" (UniqueName: \"kubernetes.io/projected/5d90984a-ad24-4f85-b39e-fff5b8152a04-kube-api-access-6lnh5\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.534760 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7e101a-46be-4de8-97af-d47cdce4ef90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc 
kubenswrapper[5030]: I0120 23:02:43.540504 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.544069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.556483 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-4cc4-account-create-update-49tr4"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.559768 5030 scope.go:117] "RemoveContainer" containerID="1e2c3887d4bfb803bee6d88be2bd09aad12d8e435404a6d4a7f84152d7fd324e" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.560926 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.561768 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.570049 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.589726 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.596932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-e899-account-create-update-rq7hk"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.604966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data" (OuterVolumeSpecName: "config-data") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.611654 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.614794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d90984a-ad24-4f85-b39e-fff5b8152a04" (UID: "5d90984a-ad24-4f85-b39e-fff5b8152a04"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.636637 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.636680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.636797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.636867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637095 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637118 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg92w\" (UniqueName: \"kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs\") pod \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\" (UID: \"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637470 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637487 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637495 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637506 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-public-tls-certs\") on node 
\"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637514 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d90984a-ad24-4f85-b39e-fff5b8152a04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.637993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs" (OuterVolumeSpecName: "logs") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.638242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.640009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.642466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w" (OuterVolumeSpecName: "kube-api-access-sg92w") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "kube-api-access-sg92w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.677420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.681946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data" (OuterVolumeSpecName: "config-data") pod "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" (UID: "47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.693028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data" (OuterVolumeSpecName: "config-data") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.699177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.702553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" (UID: "0188ec37-5fa0-4c9c-82ea-7eb7010c88d3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.714024 5030 scope.go:117] "RemoveContainer" containerID="c011608dc0cda953d39ae86ce2df8b44ddec5366f2481a6fa9504f761cc6409a" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740867 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740900 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740915 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740929 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740940 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740952 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740962 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740978 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg92w\" (UniqueName: \"kubernetes.io/projected/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-kube-api-access-sg92w\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.740989 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc 
kubenswrapper[5030]: I0120 23:02:43.744501 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_c4db8327-61c4-4757-bed5-728ef0f8bbc6/ovn-northd/0.log" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.744573 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.751196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.758921 5030 scope.go:117] "RemoveContainer" containerID="97c9b3af0a221749a5d3d0a48cb4ab7d818c47e0f19d9b433358dccfb2edecb5" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.766424 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4j2gm"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.785662 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.793902 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-777ff759db-mgr2r"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.809451 5030 scope.go:117] "RemoveContainer" containerID="54990945a497a45075119ed3dc036df2589703681e3a56dcfccafab5ebeb782b" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.815093 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.822342 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.842589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.842968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.843180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.843241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgltc\" (UniqueName: \"kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.843281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" 
(UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.843312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.843341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts\") pod \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\" (UID: \"c4db8327-61c4-4757-bed5-728ef0f8bbc6\") " Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.844495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts" (OuterVolumeSpecName: "scripts") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.844990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config" (OuterVolumeSpecName: "config") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.847229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.873958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc" (OuterVolumeSpecName: "kube-api-access-cgltc") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "kube-api-access-cgltc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.879956 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.883393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.908012 5030 scope.go:117] "RemoveContainer" containerID="913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273" Jan 20 23:02:43 crc kubenswrapper[5030]: E0120 23:02:43.922796 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 is running failed: container process not found" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:02:43 crc kubenswrapper[5030]: E0120 23:02:43.923359 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 is running failed: container process not found" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:02:43 crc kubenswrapper[5030]: E0120 23:02:43.923755 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 is running failed: container process not found" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:02:43 crc kubenswrapper[5030]: E0120 23:02:43.923823 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="4789919e-c06b-413a-8185-6766523e9925" containerName="nova-cell0-conductor-conductor" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.945430 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.945467 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgltc\" (UniqueName: \"kubernetes.io/projected/c4db8327-61c4-4757-bed5-728ef0f8bbc6-kube-api-access-cgltc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.945481 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.945491 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.945500 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4db8327-61c4-4757-bed5-728ef0f8bbc6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.974598 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" path="/var/lib/kubelet/pods/0fc512e7-ef40-4589-a094-9906cf91a0d3/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.975365 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="177bf646-3453-466c-ac69-7777db44dda5" path="/var/lib/kubelet/pods/177bf646-3453-466c-ac69-7777db44dda5/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.975940 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adc2a4a-9139-449f-a755-1a6ede8c141d" path="/var/lib/kubelet/pods/1adc2a4a-9139-449f-a755-1a6ede8c141d/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.976707 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207ab9b6-de27-45b0-945c-68ed0ab8afbf" path="/var/lib/kubelet/pods/207ab9b6-de27-45b0-945c-68ed0ab8afbf/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.977018 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3" path="/var/lib/kubelet/pods/23ceeb5f-0fab-46ab-9eec-a0fda5e4e6f3/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.977360 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" path="/var/lib/kubelet/pods/3e7e101a-46be-4de8-97af-d47cdce4ef90/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.977942 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e23459-b501-42f3-bf9c-2d50df954c66" path="/var/lib/kubelet/pods/43e23459-b501-42f3-bf9c-2d50df954c66/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.978249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5699cc7d-e528-4022-bff1-7057062e4a03" path="/var/lib/kubelet/pods/5699cc7d-e528-4022-bff1-7057062e4a03/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.979181 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" path="/var/lib/kubelet/pods/5d90984a-ad24-4f85-b39e-fff5b8152a04/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.979689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8b63a6-0ddc-47f6-9cc3-30862eb55edf" path="/var/lib/kubelet/pods/6a8b63a6-0ddc-47f6-9cc3-30862eb55edf/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.979967 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e835bcd-3585-4a51-ba54-aeb66b426afa" path="/var/lib/kubelet/pods/6e835bcd-3585-4a51-ba54-aeb66b426afa/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.980941 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" path="/var/lib/kubelet/pods/6f73fe0c-f004-4b64-8952-cc6d7ebb26a9/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.981418 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a" path="/var/lib/kubelet/pods/74b5216c-b1f6-42eb-ba4a-63d31f3e7a8a/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.981796 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be6ad04-b67e-42c1-867f-996017de8a50" path="/var/lib/kubelet/pods/7be6ad04-b67e-42c1-867f-996017de8a50/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.982213 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" 
path="/var/lib/kubelet/pods/9a16fd61-1cde-4e12-8e9f-db0c8f2182b7/volumes" Jan 20 23:02:43 crc kubenswrapper[5030]: I0120 23:02:43.982981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:43.983915 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a387ba-cb79-4f5d-a313-37204d51d189" path="/var/lib/kubelet/pods/c4a387ba-cb79-4f5d-a313-37204d51d189/volumes" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:43.984349 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9bfa72-d35e-4dad-9fcf-747c90607c2d" path="/var/lib/kubelet/pods/ea9bfa72-d35e-4dad-9fcf-747c90607c2d/volumes" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:43.984680 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" path="/var/lib/kubelet/pods/f87ad01b-11b8-43e1-9a04-5947ff6b0441/volumes" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:43.988521 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" path="/var/lib/kubelet/pods/f927a168-d367-4174-9e6e-fd9b7964346b/volumes" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:43.989831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c4db8327-61c4-4757-bed5-728ef0f8bbc6" (UID: "c4db8327-61c4-4757-bed5-728ef0f8bbc6"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5tm4\" (UniqueName: \"kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048571 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.048840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts\") pod \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\" (UID: \"5c264e34-26dd-4ab6-88e1-a44f64fac9da\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.049404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.050211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.050535 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.050550 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5c264e34-26dd-4ab6-88e1-a44f64fac9da-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.050559 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.050569 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4db8327-61c4-4757-bed5-728ef0f8bbc6-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.052424 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4" (OuterVolumeSpecName: "kube-api-access-g5tm4") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "kube-api-access-g5tm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.054839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts" (OuterVolumeSpecName: "scripts") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.110259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.118822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.151524 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.151553 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.151563 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5tm4\" (UniqueName: \"kubernetes.io/projected/5c264e34-26dd-4ab6-88e1-a44f64fac9da-kube-api-access-g5tm4\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.151573 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.163596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data" (OuterVolumeSpecName: "config-data") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.167931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c264e34-26dd-4ab6-88e1-a44f64fac9da" (UID: "5c264e34-26dd-4ab6-88e1-a44f64fac9da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.220184 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.228931 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.229357 5030 scope.go:117] "RemoveContainer" containerID="bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.236394 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.247100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.253635 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.260737 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c264e34-26dd-4ab6-88e1-a44f64fac9da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.260778 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.268257 5030 scope.go:117] "RemoveContainer" containerID="913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.268537 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273\": container with ID starting with 913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273 not found: ID does not exist" containerID="913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.268562 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273"} err="failed to get container status \"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273\": rpc error: code = NotFound desc = could not find container \"913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273\": container with ID starting with 913daf955b0d563fe98b12119e758cdafbabc393cfd58e273b8eca4ad2a22273 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.268583 5030 scope.go:117] "RemoveContainer" containerID="bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.268765 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5\": container with ID starting with bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5 not found: ID does not exist" containerID="bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.268780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5"} err="failed to get container status \"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5\": rpc error: code = NotFound desc = could not find container \"bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5\": container with ID starting with bb165cf308dfeb395cf94e6cb6d70f6e2a782cc4e59efe496e7ca248092fc2b5 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.268807 5030 scope.go:117] "RemoveContainer" 
containerID="9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.287900 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.309232 5030 scope.go:117] "RemoveContainer" containerID="4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.334099 5030 scope.go:117] "RemoveContainer" containerID="9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.334707 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0\": container with ID starting with 9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0 not found: ID does not exist" containerID="9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.334751 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0"} err="failed to get container status \"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0\": rpc error: code = NotFound desc = could not find container \"9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0\": container with ID starting with 9e870211888f2d26672f4b7bb80891aa23aedcb37c96e9715ce4623aaf9698b0 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.334777 5030 scope.go:117] "RemoveContainer" containerID="4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.335289 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921\": container with ID starting with 4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921 not found: ID does not exist" containerID="4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.335332 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921"} err="failed to get container status \"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921\": rpc error: code = NotFound desc = could not find container \"4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921\": container with ID starting with 4149f309379d97cf970398e78be56e16f3aa567df776e1d3fb1cdb1b410e4921 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.335463 5030 scope.go:117] "RemoveContainer" containerID="32dd60b784789cf7134c38532e81ef67d0d751a5bd26694f1fd92042ac4cc701" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361261 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data\") pod \"b5f99e18-c5b4-4def-9f5e-5be497088422\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle\") pod \"4789919e-c06b-413a-8185-6766523e9925\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs\") pod \"b5f99e18-c5b4-4def-9f5e-5be497088422\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle\") pod \"b5f99e18-c5b4-4def-9f5e-5be497088422\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpmd4\" (UniqueName: \"kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4\") pod \"4789919e-c06b-413a-8185-6766523e9925\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs\") pod \"b5f99e18-c5b4-4def-9f5e-5be497088422\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkpq\" (UniqueName: \"kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361610 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xczwn\" (UniqueName: \"kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn\") pod \"b5f99e18-c5b4-4def-9f5e-5be497088422\" (UID: \"b5f99e18-c5b4-4def-9f5e-5be497088422\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data\") pod \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\" (UID: \"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data\") pod \"4789919e-c06b-413a-8185-6766523e9925\" (UID: \"4789919e-c06b-413a-8185-6766523e9925\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.361970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs" (OuterVolumeSpecName: "logs") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.362259 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.364066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs" (OuterVolumeSpecName: "logs") pod "b5f99e18-c5b4-4def-9f5e-5be497088422" (UID: "b5f99e18-c5b4-4def-9f5e-5be497088422"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.375101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn" (OuterVolumeSpecName: "kube-api-access-xczwn") pod "b5f99e18-c5b4-4def-9f5e-5be497088422" (UID: "b5f99e18-c5b4-4def-9f5e-5be497088422"). InnerVolumeSpecName "kube-api-access-xczwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.375260 5030 scope.go:117] "RemoveContainer" containerID="896d652e86e3f6645f535d8655829b4fc84bc1a6b041883ee7d85864aa92a7c3" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.384881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq" (OuterVolumeSpecName: "kube-api-access-5hkpq") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). InnerVolumeSpecName "kube-api-access-5hkpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.392522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4" (OuterVolumeSpecName: "kube-api-access-bpmd4") pod "4789919e-c06b-413a-8185-6766523e9925" (UID: "4789919e-c06b-413a-8185-6766523e9925"). 
InnerVolumeSpecName "kube-api-access-bpmd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.396943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4789919e-c06b-413a-8185-6766523e9925" (UID: "4789919e-c06b-413a-8185-6766523e9925"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.396990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data" (OuterVolumeSpecName: "config-data") pod "4789919e-c06b-413a-8185-6766523e9925" (UID: "4789919e-c06b-413a-8185-6766523e9925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.398045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.404272 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data" (OuterVolumeSpecName: "config-data") pod "b5f99e18-c5b4-4def-9f5e-5be497088422" (UID: "b5f99e18-c5b4-4def-9f5e-5be497088422"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.416647 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data" (OuterVolumeSpecName: "config-data") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.426436 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5f99e18-c5b4-4def-9f5e-5be497088422" (UID: "b5f99e18-c5b4-4def-9f5e-5be497088422"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.427495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.432655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" (UID: "6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.448874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5f99e18-c5b4-4def-9f5e-5be497088422" (UID: "b5f99e18-c5b4-4def-9f5e-5be497088422"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rvw9\" (UniqueName: \"kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.463862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data\") pod \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\" (UID: \"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d\") " Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464166 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464182 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464192 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464201 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f99e18-c5b4-4def-9f5e-5be497088422-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464210 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464219 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464229 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpmd4\" (UniqueName: \"kubernetes.io/projected/4789919e-c06b-413a-8185-6766523e9925-kube-api-access-bpmd4\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464237 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f99e18-c5b4-4def-9f5e-5be497088422-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464246 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkpq\" (UniqueName: \"kubernetes.io/projected/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-kube-api-access-5hkpq\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464255 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464264 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xczwn\" (UniqueName: \"kubernetes.io/projected/b5f99e18-c5b4-4def-9f5e-5be497088422-kube-api-access-xczwn\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464273 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.464282 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4789919e-c06b-413a-8185-6766523e9925-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.470353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.470447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts" (OuterVolumeSpecName: "scripts") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.471449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.472032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9" (OuterVolumeSpecName: "kube-api-access-4rvw9") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "kube-api-access-4rvw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.489156 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerID="0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.489225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerDied","Data":"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.489259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5c264e34-26dd-4ab6-88e1-a44f64fac9da","Type":"ContainerDied","Data":"ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.489277 5030 scope.go:117] "RemoveContainer" containerID="ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.489637 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.493437 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data" (OuterVolumeSpecName: "config-data") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.496341 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" containerID="03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.496396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" event={"ID":"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d","Type":"ContainerDied","Data":"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.496421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" event={"ID":"b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d","Type":"ContainerDied","Data":"2fdf0d827c5cfc2af154771cf82f329e30b020edd426413fc0e8ecc69b819bbe"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.496483 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6765d59f9b-2rs69" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.501829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" event={"ID":"0188ec37-5fa0-4c9c-82ea-7eb7010c88d3","Type":"ContainerDied","Data":"c8d370e6dfb97831885b9d5f61d78141e1999f187cef03777e6c1d3940c10aee"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.501909 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.506426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.508451 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b5cb860-d149-4999-a4bb-be57ead53529" containerID="072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.508512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerDied","Data":"072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.508534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerStarted","Data":"174b1208575986cd971adf051d4da5478249654bf2cfac4d8d16fbb4b6d8fdfb"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.512800 5030 scope.go:117] "RemoveContainer" containerID="6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.523391 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.531341 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerID="bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.531464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerDied","Data":"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.531501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba","Type":"ContainerDied","Data":"ea8361cea39466f58a0e0ba4e7a8fb1d29c3b50db4beee685da502dc82042523"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.531588 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.535359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.536952 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-84f4bb648-q4s7q"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.537670 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerID="b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.537813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerDied","Data":"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.537886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b5f99e18-c5b4-4def-9f5e-5be497088422","Type":"ContainerDied","Data":"61459a67070b863fb3d1eb9dcc2f1640425c4e2159637480d72683645d32d93f"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.537999 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.542652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" (UID: "b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.546914 5030 scope.go:117] "RemoveContainer" containerID="0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.547149 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_c4db8327-61c4-4757-bed5-728ef0f8bbc6/ovn-northd/0.log" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.547285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"c4db8327-61c4-4757-bed5-728ef0f8bbc6","Type":"ContainerDied","Data":"eeb0058f88d6fd8423fef2453c0ebc56111945cc904514ce6a8c59eb3d1573e2"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.547488 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.561040 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.561134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4789919e-c06b-413a-8185-6766523e9925","Type":"ContainerDied","Data":"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.561051 5030 generic.go:334] "Generic (PLEG): container finished" podID="4789919e-c06b-413a-8185-6766523e9925" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" exitCode=0 Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.561360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"4789919e-c06b-413a-8185-6766523e9925","Type":"ContainerDied","Data":"4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad"} Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566709 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566741 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rvw9\" (UniqueName: \"kubernetes.io/projected/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-kube-api-access-4rvw9\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566753 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566763 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566772 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566781 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566790 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.566797 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.569609 5030 scope.go:117] "RemoveContainer" containerID="aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.584436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.604536 5030 scope.go:117] "RemoveContainer" 
containerID="ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.604942 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02\": container with ID starting with ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02 not found: ID does not exist" containerID="ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.604982 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02"} err="failed to get container status \"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02\": rpc error: code = NotFound desc = could not find container \"ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02\": container with ID starting with ed33b27e4afb955b83d56cfdd6383bdd0da34ea1240c3f18f863547d28dffa02 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.605008 5030 scope.go:117] "RemoveContainer" containerID="6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.605997 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333\": container with ID starting with 6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333 not found: ID does not exist" containerID="6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606032 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333"} err="failed to get container status \"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333\": rpc error: code = NotFound desc = could not find container \"6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333\": container with ID starting with 6edc24e70d18a4d950a7bac3cb71da281a70c3b75b61a409054feff5a2f24333 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606054 5030 scope.go:117] "RemoveContainer" containerID="0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.606653 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5\": container with ID starting with 0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5 not found: ID does not exist" containerID="0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606677 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5"} err="failed to get container status \"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5\": rpc error: code = NotFound desc = could not find container \"0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5\": container with ID starting with 
0a9bc2b2e62fa1665346965719dba7ef174e556ec608612008ea3d57f6ce58a5 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606692 5030 scope.go:117] "RemoveContainer" containerID="aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.606891 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856\": container with ID starting with aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856 not found: ID does not exist" containerID="aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606916 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856"} err="failed to get container status \"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856\": rpc error: code = NotFound desc = could not find container \"aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856\": container with ID starting with aa2beec7d83b690d8c2d49158b55f640524cb3d1f27c3a03e7ff160cba304856 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.606931 5030 scope.go:117] "RemoveContainer" containerID="03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.608195 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.614628 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.624772 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.630647 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.635559 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.635798 5030 scope.go:117] "RemoveContainer" containerID="03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.636244 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2\": container with ID starting with 03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2 not found: ID does not exist" containerID="03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.636279 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2"} err="failed to get container status \"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2\": rpc error: code = NotFound desc = could not find container \"03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2\": container with ID starting with 03000a5337c54b499955e11b625aa2770b106374d729d827663c69d19147b1d2 not found: ID does not exist" 
Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.636309 5030 scope.go:117] "RemoveContainer" containerID="abe1b9d9c501552d8beeada5cb25d3fbed77f8cbf429b1c1351a006527292ef3" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.639978 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.645470 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.650228 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.654829 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.654893 5030 scope.go:117] "RemoveContainer" containerID="3bdf48c04934d54f7661f747d57939590f3632f793a8c0aa8ae04262dd469c85" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.670406 5030 scope.go:117] "RemoveContainer" containerID="bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.680464 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c264e34_26dd_4ab6_88e1_a44f64fac9da.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4789919e_c06b_413a_8185_6766523e9925.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef25eff_8fd8_40b8_881b_e8b7eb12d1ba.slice/crio-ea8361cea39466f58a0e0ba4e7a8fb1d29c3b50db4beee685da502dc82042523\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c264e34_26dd_4ab6_88e1_a44f64fac9da.slice/crio-ed6546c3d4beb04f45b36dcdb4f2de02020bc33919dc288ece2fcbae267f0a4f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f99e18_c5b4_4def_9f5e_5be497088422.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4789919e_c06b_413a_8185_6766523e9925.slice/crio-4e70aed4d79380e1fb50bea706a72607f0fa412f6f53a229cfa795a9d5a3b9ad\": RecentStats: unable to find data in memory cache]" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.703112 5030 scope.go:117] "RemoveContainer" containerID="7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.725556 5030 scope.go:117] "RemoveContainer" containerID="bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.725936 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71\": container with ID starting with bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71 not found: ID does not exist" containerID="bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.725966 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71"} err="failed to get container status \"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71\": rpc error: code = NotFound desc = could not find container \"bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71\": container with ID starting with bacdb63cfad87ddbc9e7c7e70fdb4097469fbce0f8efae90054339392ef9fc71 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.725988 5030 scope.go:117] "RemoveContainer" containerID="7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.726281 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0\": container with ID starting with 7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0 not found: ID does not exist" containerID="7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.726301 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0"} err="failed to get container status \"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0\": rpc error: code = NotFound desc = could not find container \"7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0\": container with ID starting with 7bebe1daa1d25250b1dedbc99eb90a92119deaafd6bb66ce48014fb925d151e0 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.726313 5030 scope.go:117] "RemoveContainer" containerID="b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.747696 5030 scope.go:117] "RemoveContainer" containerID="0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.764641 5030 scope.go:117] "RemoveContainer" containerID="b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.765426 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521\": container with ID starting with b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521 not found: ID does not exist" containerID="b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.765457 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521"} err="failed to get container status \"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521\": rpc error: code = NotFound desc = could not find container \"b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521\": container with ID starting with b3074f8fcdf7f1a56717e9304ba5737a0e0d1592d96a5bd9d970577342c28521 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.765479 5030 scope.go:117] "RemoveContainer" containerID="0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.765881 5030 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9\": container with ID starting with 0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9 not found: ID does not exist" containerID="0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.765903 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9"} err="failed to get container status \"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9\": rpc error: code = NotFound desc = could not find container \"0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9\": container with ID starting with 0351a1478214b121a7bb8088f4a8ae6a86f4ec9b8b21b1882a5aae98732aa2f9 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.765915 5030 scope.go:117] "RemoveContainer" containerID="96fcd4787c2a4fc0ac3198c4288890edb23f73112901a2786f3617f650cef644" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.782332 5030 scope.go:117] "RemoveContainer" containerID="5ee7da575f223a9839f7f97dae1e5c9caf513e03dec1badc74521523c66f260c" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.799942 5030 scope.go:117] "RemoveContainer" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.819243 5030 scope.go:117] "RemoveContainer" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" Jan 20 23:02:44 crc kubenswrapper[5030]: E0120 23:02:44.826773 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2\": container with ID starting with 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 not found: ID does not exist" containerID="492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.826825 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2"} err="failed to get container status \"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2\": rpc error: code = NotFound desc = could not find container \"492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2\": container with ID starting with 492b515956edc8eb4708834d87b0afd9ebcebf9e8e0d0446fcfd1a16965d45b2 not found: ID does not exist" Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.839043 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 23:02:44 crc kubenswrapper[5030]: I0120 23:02:44.853049 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6765d59f9b-2rs69"] Jan 20 23:02:45 crc kubenswrapper[5030]: E0120 23:02:45.275654 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:02:45 crc kubenswrapper[5030]: E0120 23:02:45.275726 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data podName:83f11c15-8856-45fd-9703-348310781d5a nodeName:}" 
failed. No retries permitted until 2026-01-20 23:02:53.275712555 +0000 UTC m=+1645.595972843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data") pod "rabbitmq-server-0" (UID: "83f11c15-8856-45fd-9703-348310781d5a") : configmap "rabbitmq-config-data" not found Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.579203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerStarted","Data":"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93"} Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.856546 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.201:5671: connect: connection refused" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.962044 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:02:45 crc kubenswrapper[5030]: E0120 23:02:45.962450 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.978730 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" path="/var/lib/kubelet/pods/0188ec37-5fa0-4c9c-82ea-7eb7010c88d3/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.980141 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4789919e-c06b-413a-8185-6766523e9925" path="/var/lib/kubelet/pods/4789919e-c06b-413a-8185-6766523e9925/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.981300 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" path="/var/lib/kubelet/pods/47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.983539 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" path="/var/lib/kubelet/pods/5c264e34-26dd-4ab6-88e1-a44f64fac9da/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.985031 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" path="/var/lib/kubelet/pods/6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.987369 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" path="/var/lib/kubelet/pods/b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.988580 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" path="/var/lib/kubelet/pods/b5f99e18-c5b4-4def-9f5e-5be497088422/volumes" Jan 20 23:02:45 crc kubenswrapper[5030]: I0120 23:02:45.989949 5030 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" path="/var/lib/kubelet/pods/c4db8327-61c4-4757-bed5-728ef0f8bbc6/volumes" Jan 20 23:02:46 crc kubenswrapper[5030]: E0120 23:02:46.601200 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:46 crc kubenswrapper[5030]: E0120 23:02:46.601284 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data podName:b9bc4ba4-4b03-441b-b492-6de67c2647b6 nodeName:}" failed. No retries permitted until 2026-01-20 23:02:54.601262539 +0000 UTC m=+1646.921522837 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data") pod "rabbitmq-cell1-server-0" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:02:46 crc kubenswrapper[5030]: I0120 23:02:46.616371 5030 generic.go:334] "Generic (PLEG): container finished" podID="83f11c15-8856-45fd-9703-348310781d5a" containerID="dcc99476597aaed55627fb273471e78f5cfba90216403d60549d509f104df53d" exitCode=0 Jan 20 23:02:46 crc kubenswrapper[5030]: I0120 23:02:46.616458 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerDied","Data":"dcc99476597aaed55627fb273471e78f5cfba90216403d60549d509f104df53d"} Jan 20 23:02:46 crc kubenswrapper[5030]: I0120 23:02:46.620271 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b5cb860-d149-4999-a4bb-be57ead53529" containerID="a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93" exitCode=0 Jan 20 23:02:46 crc kubenswrapper[5030]: I0120 23:02:46.620355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerDied","Data":"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93"} Jan 20 23:02:46 crc kubenswrapper[5030]: I0120 23:02:46.851131 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008711 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65db\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" 
(UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.008931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info\") pod \"83f11c15-8856-45fd-9703-348310781d5a\" (UID: \"83f11c15-8856-45fd-9703-348310781d5a\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.009121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.009423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.009702 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.009723 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.009742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.013569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.013852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.013962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info" (OuterVolumeSpecName: "pod-info") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.014031 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db" (OuterVolumeSpecName: "kube-api-access-w65db") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "kube-api-access-w65db". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.018823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.046579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data" (OuterVolumeSpecName: "config-data") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.069875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf" (OuterVolumeSpecName: "server-conf") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.114174 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115050 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115100 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115125 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65db\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-kube-api-access-w65db\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115145 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83f11c15-8856-45fd-9703-348310781d5a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115165 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83f11c15-8856-45fd-9703-348310781d5a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115186 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.115205 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83f11c15-8856-45fd-9703-348310781d5a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.137004 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.137943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "83f11c15-8856-45fd-9703-348310781d5a" (UID: "83f11c15-8856-45fd-9703-348310781d5a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.203891 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.217106 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.217141 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83f11c15-8856-45fd-9703-348310781d5a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318240 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318328 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m25ll\" 
(UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.318672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret\") pod \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\" (UID: \"b9bc4ba4-4b03-441b-b492-6de67c2647b6\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.319873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.320324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.321196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.322804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.322821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll" (OuterVolumeSpecName: "kube-api-access-m25ll") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "kube-api-access-m25ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.323466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info" (OuterVolumeSpecName: "pod-info") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.324592 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.328967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.341999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data" (OuterVolumeSpecName: "config-data") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.375779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf" (OuterVolumeSpecName: "server-conf") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.420716 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m25ll\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-kube-api-access-m25ll\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421021 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421109 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9bc4ba4-4b03-441b-b492-6de67c2647b6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421184 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421286 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421385 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421462 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b9bc4ba4-4b03-441b-b492-6de67c2647b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421543 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421657 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.421737 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9bc4ba4-4b03-441b-b492-6de67c2647b6-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.435573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b9bc4ba4-4b03-441b-b492-6de67c2647b6" (UID: "b9bc4ba4-4b03-441b-b492-6de67c2647b6"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.442030 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.523183 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.523221 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9bc4ba4-4b03-441b-b492-6de67c2647b6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.630422 5030 generic.go:334] "Generic (PLEG): container finished" podID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerID="2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a" exitCode=0 Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.630487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerDied","Data":"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.630514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"b9bc4ba4-4b03-441b-b492-6de67c2647b6","Type":"ContainerDied","Data":"042a45836f5533a4769ed35bfb4435bb732499ab8cf956d04c6181f28acffce9"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.630531 5030 scope.go:117] "RemoveContainer" containerID="2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.630636 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.633324 5030 generic.go:334] "Generic (PLEG): container finished" podID="2071f00e-fdae-400c-bec2-f9092faef33d" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" exitCode=0 Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.633414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2071f00e-fdae-400c-bec2-f9092faef33d","Type":"ContainerDied","Data":"765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.636180 5030 generic.go:334] "Generic (PLEG): container finished" podID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerID="cf2a56f804091f8d287a400f7f7a6cd9c78c7a43c30bba4d5b9ef210fd03c3b7" exitCode=0 Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.636238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerDied","Data":"cf2a56f804091f8d287a400f7f7a6cd9c78c7a43c30bba4d5b9ef210fd03c3b7"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.638788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"83f11c15-8856-45fd-9703-348310781d5a","Type":"ContainerDied","Data":"ef13c838ced0f980ead5be321b135461da111a3826340912690b010245dfb0c6"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.638852 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.643357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerStarted","Data":"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.645546 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerID="3fcdf58baca53e88af4e903ab1f51ec63e8c6a04e6a3019435c78eaefad8028e" exitCode=0 Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.645591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerDied","Data":"3fcdf58baca53e88af4e903ab1f51ec63e8c6a04e6a3019435c78eaefad8028e"} Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.661731 5030 scope.go:117] "RemoveContainer" containerID="56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.663386 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vljcv" podStartSLOduration=4.067642609 podStartE2EDuration="6.663368409s" podCreationTimestamp="2026-01-20 23:02:41 +0000 UTC" firstStartedPulling="2026-01-20 23:02:44.513020755 +0000 UTC m=+1636.833281033" lastFinishedPulling="2026-01-20 23:02:47.108746545 +0000 UTC m=+1639.429006833" observedRunningTime="2026-01-20 23:02:47.65890557 +0000 UTC m=+1639.979165878" watchObservedRunningTime="2026-01-20 23:02:47.663368409 +0000 UTC m=+1639.983628697" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.697497 5030 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.698168 5030 scope.go:117] "RemoveContainer" containerID="2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a" Jan 20 23:02:47 crc kubenswrapper[5030]: E0120 23:02:47.698724 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a\": container with ID starting with 2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a not found: ID does not exist" containerID="2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.698781 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a"} err="failed to get container status \"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a\": rpc error: code = NotFound desc = could not find container \"2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a\": container with ID starting with 2eaa0bf043f813374e4a3eece104c7b2f4bc50fb0dfe7dac3ec20252cfacfb0a not found: ID does not exist" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.698813 5030 scope.go:117] "RemoveContainer" containerID="56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2" Jan 20 23:02:47 crc kubenswrapper[5030]: E0120 23:02:47.699338 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2\": container with ID starting with 56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2 not found: ID does not exist" containerID="56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.699365 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2"} err="failed to get container status \"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2\": rpc error: code = NotFound desc = could not find container \"56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2\": container with ID starting with 56eaf2fe8a93d7268fc8ee7e06205cca6c5d17255e04bed26c15d451551557f2 not found: ID does not exist" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.699385 5030 scope.go:117] "RemoveContainer" containerID="dcc99476597aaed55627fb273471e78f5cfba90216403d60549d509f104df53d" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.716242 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.727294 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:02:47 crc kubenswrapper[5030]: E0120 23:02:47.729098 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600 is running failed: container process not found" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:47 
crc kubenswrapper[5030]: E0120 23:02:47.729494 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600 is running failed: container process not found" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.729631 5030 scope.go:117] "RemoveContainer" containerID="023df1c950388e7dcb139b685ad0bdfed5f5d067074b9530aa12528857b11e81" Jan 20 23:02:47 crc kubenswrapper[5030]: E0120 23:02:47.729729 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600 is running failed: container process not found" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:02:47 crc kubenswrapper[5030]: E0120 23:02:47.729759 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.733859 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.805047 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.872874 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.888428 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom\") pod \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle\") pod \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs\") pod \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhpfp\" (UniqueName: \"kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp\") pod \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data\") pod \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\" (UID: \"9b5e90d7-03d0-48f4-abc2-914ccd98c0db\") " Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.929897 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs" (OuterVolumeSpecName: "logs") pod "9b5e90d7-03d0-48f4-abc2-914ccd98c0db" (UID: "9b5e90d7-03d0-48f4-abc2-914ccd98c0db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.932845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b5e90d7-03d0-48f4-abc2-914ccd98c0db" (UID: "9b5e90d7-03d0-48f4-abc2-914ccd98c0db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.932885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp" (OuterVolumeSpecName: "kube-api-access-xhpfp") pod "9b5e90d7-03d0-48f4-abc2-914ccd98c0db" (UID: "9b5e90d7-03d0-48f4-abc2-914ccd98c0db"). InnerVolumeSpecName "kube-api-access-xhpfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.947820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b5e90d7-03d0-48f4-abc2-914ccd98c0db" (UID: "9b5e90d7-03d0-48f4-abc2-914ccd98c0db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.970556 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f11c15-8856-45fd-9703-348310781d5a" path="/var/lib/kubelet/pods/83f11c15-8856-45fd-9703-348310781d5a/volumes" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.971440 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" path="/var/lib/kubelet/pods/b9bc4ba4-4b03-441b-b492-6de67c2647b6/volumes" Jan 20 23:02:47 crc kubenswrapper[5030]: I0120 23:02:47.995800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data" (OuterVolumeSpecName: "config-data") pod "9b5e90d7-03d0-48f4-abc2-914ccd98c0db" (UID: "9b5e90d7-03d0-48f4-abc2-914ccd98c0db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.030583 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlpqw\" (UniqueName: \"kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw\") pod \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.030905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs\") pod \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.030959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data\") pod \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle\") pod \"2071f00e-fdae-400c-bec2-f9092faef33d\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data\") pod \"2071f00e-fdae-400c-bec2-f9092faef33d\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle\") pod \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2xb\" (UniqueName: \"kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb\") pod \"2071f00e-fdae-400c-bec2-f9092faef33d\" (UID: \"2071f00e-fdae-400c-bec2-f9092faef33d\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031265 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom\") pod \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\" (UID: \"f62d42b6-4ec4-4f63-8f7c-31975f6677bb\") " Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031566 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhpfp\" (UniqueName: \"kubernetes.io/projected/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-kube-api-access-xhpfp\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031582 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031594 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031609 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031639 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b5e90d7-03d0-48f4-abc2-914ccd98c0db-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.031661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs" (OuterVolumeSpecName: "logs") pod "f62d42b6-4ec4-4f63-8f7c-31975f6677bb" (UID: "f62d42b6-4ec4-4f63-8f7c-31975f6677bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.033673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw" (OuterVolumeSpecName: "kube-api-access-rlpqw") pod "f62d42b6-4ec4-4f63-8f7c-31975f6677bb" (UID: "f62d42b6-4ec4-4f63-8f7c-31975f6677bb"). InnerVolumeSpecName "kube-api-access-rlpqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.036364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f62d42b6-4ec4-4f63-8f7c-31975f6677bb" (UID: "f62d42b6-4ec4-4f63-8f7c-31975f6677bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.036992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb" (OuterVolumeSpecName: "kube-api-access-mh2xb") pod "2071f00e-fdae-400c-bec2-f9092faef33d" (UID: "2071f00e-fdae-400c-bec2-f9092faef33d"). InnerVolumeSpecName "kube-api-access-mh2xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.051161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2071f00e-fdae-400c-bec2-f9092faef33d" (UID: "2071f00e-fdae-400c-bec2-f9092faef33d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.061522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data" (OuterVolumeSpecName: "config-data") pod "2071f00e-fdae-400c-bec2-f9092faef33d" (UID: "2071f00e-fdae-400c-bec2-f9092faef33d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.063827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f62d42b6-4ec4-4f63-8f7c-31975f6677bb" (UID: "f62d42b6-4ec4-4f63-8f7c-31975f6677bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.091179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data" (OuterVolumeSpecName: "config-data") pod "f62d42b6-4ec4-4f63-8f7c-31975f6677bb" (UID: "f62d42b6-4ec4-4f63-8f7c-31975f6677bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133645 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133716 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133737 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh2xb\" (UniqueName: \"kubernetes.io/projected/2071f00e-fdae-400c-bec2-f9092faef33d-kube-api-access-mh2xb\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133757 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133776 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlpqw\" (UniqueName: \"kubernetes.io/projected/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-kube-api-access-rlpqw\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133793 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133810 5030 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f62d42b6-4ec4-4f63-8f7c-31975f6677bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.133827 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2071f00e-fdae-400c-bec2-f9092faef33d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.661929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" event={"ID":"f62d42b6-4ec4-4f63-8f7c-31975f6677bb","Type":"ContainerDied","Data":"cac1cd239c8eb7be62a520b3a04834943d6ef25c4b2d01f12353c199ab2ebfde"} Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.661990 5030 scope.go:117] "RemoveContainer" containerID="cf2a56f804091f8d287a400f7f7a6cd9c78c7a43c30bba4d5b9ef210fd03c3b7" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.661994 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.673832 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" event={"ID":"9b5e90d7-03d0-48f4-abc2-914ccd98c0db","Type":"ContainerDied","Data":"26ac43c883c15abf4af3c5422e5d1c62a3bf4d492dd374cfe9f9364237f47b17"} Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.674012 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.684313 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.684337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"2071f00e-fdae-400c-bec2-f9092faef33d","Type":"ContainerDied","Data":"195c07e9207c0d526a9d755e7b0bfe80dcdf587a2a21c54a61240f8f0a999522"} Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.710283 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.715820 5030 scope.go:117] "RemoveContainer" containerID="8da669032a42ae5fafc499507531665ae7680439dcecff59acbf618249a406b2" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.727361 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-85fffb8dc7-27vbq"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.755182 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.761860 5030 scope.go:117] "RemoveContainer" containerID="3fcdf58baca53e88af4e903ab1f51ec63e8c6a04e6a3019435c78eaefad8028e" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.763500 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-74b4fcd47d-8qrmq"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.772872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.779758 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.795320 5030 scope.go:117] "RemoveContainer" containerID="aaad7c24ea8a9b3796aa4e5cfe2cc0e940b0c172d22637110c91ea576ac7f3ca" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.824657 5030 scope.go:117] "RemoveContainer" containerID="765e3ac1c87f4032588114551dc62fb1a693dd741ba177fbeee1bce786fcd600" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.941617 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.32:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:02:48 crc kubenswrapper[5030]: I0120 23:02:48.941841 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.32:8775/\": dial tcp 10.217.1.32:8775: i/o timeout" Jan 20 23:02:49 crc kubenswrapper[5030]: I0120 23:02:49.979049 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" path="/var/lib/kubelet/pods/2071f00e-fdae-400c-bec2-f9092faef33d/volumes" Jan 20 23:02:49 crc kubenswrapper[5030]: I0120 23:02:49.980806 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" path="/var/lib/kubelet/pods/9b5e90d7-03d0-48f4-abc2-914ccd98c0db/volumes" Jan 20 23:02:49 crc kubenswrapper[5030]: I0120 23:02:49.982479 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" path="/var/lib/kubelet/pods/f62d42b6-4ec4-4f63-8f7c-31975f6677bb/volumes" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.054864 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.150940 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.151850 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="dnsmasq-dns" containerID="cri-o://272f3362cfb0f6639a3278329b5b414c51b4001addfaf831fd46b21f2f8238ee" gracePeriod=10 Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.717789 5030 generic.go:334] "Generic (PLEG): container finished" podID="7dfd1851-3088-4199-9b12-47964306a9b5" containerID="25d71b9ff6b3918f06b7ac2f245f258bfaef55aef059830afb2f1b222de24f94" exitCode=0 Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.717857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerDied","Data":"25d71b9ff6b3918f06b7ac2f245f258bfaef55aef059830afb2f1b222de24f94"} Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.717887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" event={"ID":"7dfd1851-3088-4199-9b12-47964306a9b5","Type":"ContainerDied","Data":"8c6bcac09d4231a5bf5dba205344abd8281bbaca28674d737f171a31a10e77c4"} Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.717901 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c6bcac09d4231a5bf5dba205344abd8281bbaca28674d737f171a31a10e77c4" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.720289 5030 generic.go:334] "Generic (PLEG): container finished" podID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerID="272f3362cfb0f6639a3278329b5b414c51b4001addfaf831fd46b21f2f8238ee" exitCode=0 Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.720315 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" event={"ID":"6c46ecc6-0196-4659-b06f-3603396bc91a","Type":"ContainerDied","Data":"272f3362cfb0f6639a3278329b5b414c51b4001addfaf831fd46b21f2f8238ee"} Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.720334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" event={"ID":"6c46ecc6-0196-4659-b06f-3603396bc91a","Type":"ContainerDied","Data":"9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23"} Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.720345 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1b58d376b88dd02ed13bec244f2c6f8fa1aae357ad98403b6ddedc02096d23" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.722232 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.729079 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.884773 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.885349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qtj7\" (UniqueName: \"kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7\") pod \"6c46ecc6-0196-4659-b06f-3603396bc91a\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.885559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc\") pod \"6c46ecc6-0196-4659-b06f-3603396bc91a\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.885689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvmfp\" (UniqueName: \"kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.885834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.885993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.886111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.886312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config\") pod \"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.886517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0\") pod \"6c46ecc6-0196-4659-b06f-3603396bc91a\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.887102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs\") pod 
\"7dfd1851-3088-4199-9b12-47964306a9b5\" (UID: \"7dfd1851-3088-4199-9b12-47964306a9b5\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.887308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config\") pod \"6c46ecc6-0196-4659-b06f-3603396bc91a\" (UID: \"6c46ecc6-0196-4659-b06f-3603396bc91a\") " Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.890600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp" (OuterVolumeSpecName: "kube-api-access-rvmfp") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "kube-api-access-rvmfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.891333 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7" (OuterVolumeSpecName: "kube-api-access-8qtj7") pod "6c46ecc6-0196-4659-b06f-3603396bc91a" (UID: "6c46ecc6-0196-4659-b06f-3603396bc91a"). InnerVolumeSpecName "kube-api-access-8qtj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.892169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.924752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c46ecc6-0196-4659-b06f-3603396bc91a" (UID: "6c46ecc6-0196-4659-b06f-3603396bc91a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.926154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config" (OuterVolumeSpecName: "config") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.928713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.931546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "6c46ecc6-0196-4659-b06f-3603396bc91a" (UID: "6c46ecc6-0196-4659-b06f-3603396bc91a"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.937419 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config" (OuterVolumeSpecName: "config") pod "6c46ecc6-0196-4659-b06f-3603396bc91a" (UID: "6c46ecc6-0196-4659-b06f-3603396bc91a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.938485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.939670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.945321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7dfd1851-3088-4199-9b12-47964306a9b5" (UID: "7dfd1851-3088-4199-9b12-47964306a9b5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989544 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989584 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qtj7\" (UniqueName: \"kubernetes.io/projected/6c46ecc6-0196-4659-b06f-3603396bc91a-kube-api-access-8qtj7\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989599 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989611 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvmfp\" (UniqueName: \"kubernetes.io/projected/7dfd1851-3088-4199-9b12-47964306a9b5-kube-api-access-rvmfp\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989639 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989650 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989661 5030 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989673 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989684 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989695 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfd1851-3088-4199-9b12-47964306a9b5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:50 crc kubenswrapper[5030]: I0120 23:02:50.989706 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c46ecc6-0196-4659-b06f-3603396bc91a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.719854 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.719923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.734049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-57bd57cf66-fdt4n" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.734055 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.797288 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.810931 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-57bd57cf66-fdt4n"] Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.813889 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.821919 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.828901 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6ff445895c-69rsf"] Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.978235 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" path="/var/lib/kubelet/pods/6c46ecc6-0196-4659-b06f-3603396bc91a/volumes" Jan 20 23:02:51 crc kubenswrapper[5030]: I0120 23:02:51.979419 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" path="/var/lib/kubelet/pods/7dfd1851-3088-4199-9b12-47964306a9b5/volumes" Jan 20 23:02:52 crc kubenswrapper[5030]: I0120 23:02:52.795588 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:52 crc kubenswrapper[5030]: I0120 23:02:52.853164 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:54 crc kubenswrapper[5030]: I0120 23:02:54.783393 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vljcv" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="registry-server" containerID="cri-o://6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0" gracePeriod=2 Jan 20 23:02:54 crc kubenswrapper[5030]: E0120 23:02:54.960566 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5cb860_d149_4999_a4bb_be57ead53529.slice/crio-6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b5cb860_d149_4999_a4bb_be57ead53529.slice/crio-conmon-6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.205010 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.358098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content\") pod \"6b5cb860-d149-4999-a4bb-be57ead53529\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.358721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d6x6\" (UniqueName: \"kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6\") pod \"6b5cb860-d149-4999-a4bb-be57ead53529\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.358801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities\") pod \"6b5cb860-d149-4999-a4bb-be57ead53529\" (UID: \"6b5cb860-d149-4999-a4bb-be57ead53529\") " Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.360255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities" (OuterVolumeSpecName: "utilities") pod "6b5cb860-d149-4999-a4bb-be57ead53529" (UID: "6b5cb860-d149-4999-a4bb-be57ead53529"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.370189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6" (OuterVolumeSpecName: "kube-api-access-9d6x6") pod "6b5cb860-d149-4999-a4bb-be57ead53529" (UID: "6b5cb860-d149-4999-a4bb-be57ead53529"). InnerVolumeSpecName "kube-api-access-9d6x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.415030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b5cb860-d149-4999-a4bb-be57ead53529" (UID: "6b5cb860-d149-4999-a4bb-be57ead53529"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.459975 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d6x6\" (UniqueName: \"kubernetes.io/projected/6b5cb860-d149-4999-a4bb-be57ead53529-kube-api-access-9d6x6\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.460176 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.460235 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b5cb860-d149-4999-a4bb-be57ead53529-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.796257 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b5cb860-d149-4999-a4bb-be57ead53529" containerID="6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0" exitCode=0 Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.796320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerDied","Data":"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0"} Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.796364 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vljcv" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.796419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vljcv" event={"ID":"6b5cb860-d149-4999-a4bb-be57ead53529","Type":"ContainerDied","Data":"174b1208575986cd971adf051d4da5478249654bf2cfac4d8d16fbb4b6d8fdfb"} Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.796464 5030 scope.go:117] "RemoveContainer" containerID="6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.844062 5030 scope.go:117] "RemoveContainer" containerID="a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.853144 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.863019 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vljcv"] Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.883042 5030 scope.go:117] "RemoveContainer" containerID="072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.924665 5030 scope.go:117] "RemoveContainer" containerID="6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0" Jan 20 23:02:55 crc kubenswrapper[5030]: E0120 23:02:55.925192 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0\": container with ID starting with 6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0 not found: ID does not exist" containerID="6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.925274 
5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0"} err="failed to get container status \"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0\": rpc error: code = NotFound desc = could not find container \"6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0\": container with ID starting with 6a315e33db82820e53d9a37849a0b2e4145d606c59836be7008829c4ddff56e0 not found: ID does not exist" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.925318 5030 scope.go:117] "RemoveContainer" containerID="a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93" Jan 20 23:02:55 crc kubenswrapper[5030]: E0120 23:02:55.925976 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93\": container with ID starting with a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93 not found: ID does not exist" containerID="a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.926010 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93"} err="failed to get container status \"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93\": rpc error: code = NotFound desc = could not find container \"a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93\": container with ID starting with a49c5065ccd33f9fa9001282a9b4eb232a7f9fd9af393854807e464310a43b93 not found: ID does not exist" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.926033 5030 scope.go:117] "RemoveContainer" containerID="072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab" Jan 20 23:02:55 crc kubenswrapper[5030]: E0120 23:02:55.926418 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab\": container with ID starting with 072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab not found: ID does not exist" containerID="072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.926476 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab"} err="failed to get container status \"072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab\": rpc error: code = NotFound desc = could not find container \"072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab\": container with ID starting with 072e84c24bcb5fbbfc5adf7ca0749e8ce5e5a9a248260590e82bbf8c30a161ab not found: ID does not exist" Jan 20 23:02:55 crc kubenswrapper[5030]: I0120 23:02:55.979499 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" path="/var/lib/kubelet/pods/6b5cb860-d149-4999-a4bb-be57ead53529/volumes" Jan 20 23:03:00 crc kubenswrapper[5030]: I0120 23:03:00.963004 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:03:00 crc kubenswrapper[5030]: E0120 23:03:00.964259 5030 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.627818 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.805868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock\") pod \"d8bcc41f-fad5-4773-be90-803444a622b0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.805913 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"d8bcc41f-fad5-4773-be90-803444a622b0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.805978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") pod \"d8bcc41f-fad5-4773-be90-803444a622b0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.805997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrtnp\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp\") pod \"d8bcc41f-fad5-4773-be90-803444a622b0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.806041 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache\") pod \"d8bcc41f-fad5-4773-be90-803444a622b0\" (UID: \"d8bcc41f-fad5-4773-be90-803444a622b0\") " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.806713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock" (OuterVolumeSpecName: "lock") pod "d8bcc41f-fad5-4773-be90-803444a622b0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.807464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache" (OuterVolumeSpecName: "cache") pod "d8bcc41f-fad5-4773-be90-803444a622b0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.812517 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp" (OuterVolumeSpecName: "kube-api-access-rrtnp") pod "d8bcc41f-fad5-4773-be90-803444a622b0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0"). InnerVolumeSpecName "kube-api-access-rrtnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.813966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d8bcc41f-fad5-4773-be90-803444a622b0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.815202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "swift") pod "d8bcc41f-fad5-4773-be90-803444a622b0" (UID: "d8bcc41f-fad5-4773-be90-803444a622b0"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.907633 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.907669 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrtnp\" (UniqueName: \"kubernetes.io/projected/d8bcc41f-fad5-4773-be90-803444a622b0-kube-api-access-rrtnp\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.907684 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.907697 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d8bcc41f-fad5-4773-be90-803444a622b0-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.907735 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.920407 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.978510 5030 generic.go:334] "Generic (PLEG): container finished" podID="d8bcc41f-fad5-4773-be90-803444a622b0" containerID="31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f" exitCode=137 Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.978590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f"} Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.978658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d8bcc41f-fad5-4773-be90-803444a622b0","Type":"ContainerDied","Data":"6948576a4578a82d92748b68434f12aee9a957e1f209430ca19e9192db60a78c"} Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.978679 5030 scope.go:117] "RemoveContainer" containerID="31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f" Jan 20 23:03:09 crc kubenswrapper[5030]: I0120 23:03:09.978706 5030 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.008443 5030 scope.go:117] "RemoveContainer" containerID="021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.008813 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.021349 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.033222 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.042470 5030 scope.go:117] "RemoveContainer" containerID="a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.068293 5030 scope.go:117] "RemoveContainer" containerID="ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.095857 5030 scope.go:117] "RemoveContainer" containerID="56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.113994 5030 scope.go:117] "RemoveContainer" containerID="cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.141689 5030 scope.go:117] "RemoveContainer" containerID="11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.163153 5030 scope.go:117] "RemoveContainer" containerID="14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.184780 5030 scope.go:117] "RemoveContainer" containerID="2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.205789 5030 scope.go:117] "RemoveContainer" containerID="030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.233020 5030 scope.go:117] "RemoveContainer" containerID="c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.265358 5030 scope.go:117] "RemoveContainer" containerID="3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.287542 5030 scope.go:117] "RemoveContainer" containerID="95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.311753 5030 scope.go:117] "RemoveContainer" containerID="81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.341015 5030 scope.go:117] "RemoveContainer" containerID="bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.367097 5030 scope.go:117] "RemoveContainer" containerID="31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.367613 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f\": container with ID starting with 31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f not found: ID does not exist" containerID="31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.367673 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f"} err="failed to get container status \"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f\": rpc error: code = NotFound desc = could not find container \"31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f\": container with ID starting with 31484680afb5e1717e0037f4e7b46b2af2b23145d165332e6243e3d95db0a10f not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.367695 5030 scope.go:117] "RemoveContainer" containerID="021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.368067 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f\": container with ID starting with 021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f not found: ID does not exist" containerID="021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368095 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f"} err="failed to get container status \"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f\": rpc error: code = NotFound desc = could not find container \"021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f\": container with ID starting with 021cea42e21ad43546b51f3127b140505655ea943ba44527c60d312311c3e34f not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368117 5030 scope.go:117] "RemoveContainer" containerID="a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.368586 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81\": container with ID starting with a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81 not found: ID does not exist" containerID="a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368647 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81"} err="failed to get container status \"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81\": rpc error: code = NotFound desc = could not find container \"a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81\": container with ID starting with a432bf14acfe81bc271e02f652015ce152b293eb9646041378fc56598872ca81 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368667 5030 scope.go:117] "RemoveContainer" containerID="ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b" Jan 20 23:03:10 crc 
kubenswrapper[5030]: E0120 23:03:10.368932 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b\": container with ID starting with ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b not found: ID does not exist" containerID="ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368967 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b"} err="failed to get container status \"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b\": rpc error: code = NotFound desc = could not find container \"ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b\": container with ID starting with ecdfbe98e48feb2bbd526ae30ed7b61315c792112126a0fe9ff631a269b2721b not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.368985 5030 scope.go:117] "RemoveContainer" containerID="56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.369224 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71\": container with ID starting with 56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71 not found: ID does not exist" containerID="56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.369245 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71"} err="failed to get container status \"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71\": rpc error: code = NotFound desc = could not find container \"56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71\": container with ID starting with 56618c09fc923586f43bcbb44d653ee3a7d3a621886e4715ea74da7134d82d71 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.369260 5030 scope.go:117] "RemoveContainer" containerID="cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.369545 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400\": container with ID starting with cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400 not found: ID does not exist" containerID="cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.369578 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400"} err="failed to get container status \"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400\": rpc error: code = NotFound desc = could not find container \"cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400\": container with ID starting with cafdcd217051f0178624a66e9db90a71b91346975e830dc33b5be81f1cbce400 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: 
I0120 23:03:10.369596 5030 scope.go:117] "RemoveContainer" containerID="11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.371643 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963\": container with ID starting with 11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963 not found: ID does not exist" containerID="11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.371670 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963"} err="failed to get container status \"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963\": rpc error: code = NotFound desc = could not find container \"11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963\": container with ID starting with 11b09f54c09fd448fff25683584a5e001e51f0cffe01fad0c973005e9cc64963 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.371685 5030 scope.go:117] "RemoveContainer" containerID="14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.372028 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544\": container with ID starting with 14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544 not found: ID does not exist" containerID="14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372084 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544"} err="failed to get container status \"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544\": rpc error: code = NotFound desc = could not find container \"14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544\": container with ID starting with 14144cf1922d6be5801bcf6d78d028157ed7e8d1d57067fa2c77b371a66a2544 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372102 5030 scope.go:117] "RemoveContainer" containerID="2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.372432 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87\": container with ID starting with 2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87 not found: ID does not exist" containerID="2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372465 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87"} err="failed to get container status \"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87\": rpc error: code = NotFound desc = could not find container \"2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87\": container 
with ID starting with 2a9f08f93ceb305eb8bbbc6f99714cc83ac38112f37d032b0618da20043f1d87 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372486 5030 scope.go:117] "RemoveContainer" containerID="030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.372828 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae\": container with ID starting with 030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae not found: ID does not exist" containerID="030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372858 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae"} err="failed to get container status \"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae\": rpc error: code = NotFound desc = could not find container \"030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae\": container with ID starting with 030d4cd534eb9dfce0da93b394af8bc39c6d3274923f61e781d77d1230701dae not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.372875 5030 scope.go:117] "RemoveContainer" containerID="c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.373185 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189\": container with ID starting with c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189 not found: ID does not exist" containerID="c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.373224 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189"} err="failed to get container status \"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189\": rpc error: code = NotFound desc = could not find container \"c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189\": container with ID starting with c0d724abadc5d3c497596ad5aa09fe62eb5baaea033ab9d5697927b249d3b189 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.373241 5030 scope.go:117] "RemoveContainer" containerID="3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.373730 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629\": container with ID starting with 3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629 not found: ID does not exist" containerID="3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.373758 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629"} err="failed to get container status 
\"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629\": rpc error: code = NotFound desc = could not find container \"3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629\": container with ID starting with 3b3e2f8dd2ace93c881c281f8989f46699dbadd47bb896f1d2bf1e9f061b8629 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.373778 5030 scope.go:117] "RemoveContainer" containerID="95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.374118 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216\": container with ID starting with 95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216 not found: ID does not exist" containerID="95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.374144 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216"} err="failed to get container status \"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216\": rpc error: code = NotFound desc = could not find container \"95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216\": container with ID starting with 95c78ba26ea1327b543cb4fcc17a119b2b94fc9979e98f800b3989e33137c216 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.374160 5030 scope.go:117] "RemoveContainer" containerID="81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.374607 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053\": container with ID starting with 81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053 not found: ID does not exist" containerID="81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.374668 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053"} err="failed to get container status \"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053\": rpc error: code = NotFound desc = could not find container \"81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053\": container with ID starting with 81f204e94e50c10aad98f32fa610bc765fd0625f37c6d72281e27a9a07ef0053 not found: ID does not exist" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.374687 5030 scope.go:117] "RemoveContainer" containerID="bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994" Jan 20 23:03:10 crc kubenswrapper[5030]: E0120 23:03:10.375165 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994\": container with ID starting with bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994 not found: ID does not exist" containerID="bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994" Jan 20 23:03:10 crc kubenswrapper[5030]: I0120 23:03:10.375186 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994"} err="failed to get container status \"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994\": rpc error: code = NotFound desc = could not find container \"bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994\": container with ID starting with bc19f8ea22af2621575e8fed2298df70012df1256810526ccdb4808e80ca0994 not found: ID does not exist" Jan 20 23:03:11 crc kubenswrapper[5030]: I0120 23:03:11.983885 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" path="/var/lib/kubelet/pods/d8bcc41f-fad5-4773-be90-803444a622b0/volumes" Jan 20 23:03:12 crc kubenswrapper[5030]: I0120 23:03:12.610016 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6a8b63a6-0ddc-47f6-9cc3-30862eb55edf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6a8b63a6-0ddc-47f6-9cc3-30862eb55edf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6a8b63a6_0ddc_47f6_9cc3_30862eb55edf.slice" Jan 20 23:03:15 crc kubenswrapper[5030]: I0120 23:03:15.961690 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:03:15 crc kubenswrapper[5030]: E0120 23:03:15.962088 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.238548 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-sqskk"] Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.250192 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-sqskk"] Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377012 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xmsrw"] Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377312 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377330 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-server" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377341 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="registry-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377349 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="registry-server" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377365 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377373 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 
23:03:21.377387 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377407 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5699cc7d-e528-4022-bff1-7057062e4a03" containerName="nova-cell1-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377437 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5699cc7d-e528-4022-bff1-7057062e4a03" containerName="nova-cell1-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377451 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377459 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377469 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="sg-core" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377478 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="sg-core" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377499 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377509 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377517 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377532 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" containerName="keystone-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377539 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" containerName="keystone-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377550 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-expirer" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-expirer" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377572 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377579 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 
23:03:21.377593 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377601 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377609 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377634 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377643 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="rsync" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377650 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="rsync" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377660 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-reaper" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377668 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-reaper" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377683 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="extract-content" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377690 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="extract-content" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377703 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="setup-container" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377710 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="setup-container" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377720 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377727 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377739 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377747 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377760 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377767 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 
23:03:21.377779 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377786 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377797 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377817 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377825 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377834 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377841 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377849 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-notification-agent" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377856 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-notification-agent" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377868 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="swift-recon-cron" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="swift-recon-cron" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377886 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="init" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377893 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="init" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377904 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377911 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377922 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377930 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377940 
5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377947 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-server" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="proxy-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377968 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="proxy-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377978 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="mysql-bootstrap" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.377985 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="mysql-bootstrap" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.377998 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378006 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-server" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378015 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378023 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378032 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="setup-container" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378039 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="setup-container" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378051 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378058 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378067 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a387ba-cb79-4f5d-a313-37204d51d189" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378075 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a387ba-cb79-4f5d-a313-37204d51d189" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-central-agent" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378090 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-central-agent" Jan 20 23:03:21 
crc kubenswrapper[5030]: E0120 23:03:21.378100 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378109 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378122 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378129 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378138 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="cinder-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="cinder-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378166 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378184 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="openstack-network-exporter" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378204 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="openstack-network-exporter" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378212 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" containerName="memcached" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378219 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" containerName="memcached" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378233 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378240 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378259 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378273 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="probe" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378280 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="probe" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378292 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378300 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378313 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4789919e-c06b-413a-8185-6766523e9925" containerName="nova-cell0-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378321 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4789919e-c06b-413a-8185-6766523e9925" containerName="nova-cell0-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378335 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="mysql-bootstrap" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378346 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="mysql-bootstrap" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378358 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378368 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="dnsmasq-dns" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378389 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="dnsmasq-dns" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378407 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378421 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378428 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378440 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378447 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378455 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378463 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378475 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378482 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-api" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378491 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378498 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378517 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker-log" Jan 20 23:03:21 crc kubenswrapper[5030]: E0120 23:03:21.378526 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="extract-utilities" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="extract-utilities" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378704 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378715 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-metadata" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378728 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378738 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378748 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4789919e-c06b-413a-8185-6766523e9925" containerName="nova-cell0-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378761 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="openstack-network-exporter" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378773 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-api" Jan 
20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378782 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="sg-core" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378790 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5699cc7d-e528-4022-bff1-7057062e4a03" containerName="nova-cell1-conductor-conductor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378801 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378812 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f73fe0c-f004-4b64-8952-cc6d7ebb26a9" containerName="memcached" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378824 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378833 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378844 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378855 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4db8327-61c4-4757-bed5-728ef0f8bbc6" containerName="ovn-northd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f927a168-d367-4174-9e6e-fd9b7964346b" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378883 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f99e18-c5b4-4def-9f5e-5be497088422" containerName="nova-metadata-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378892 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-notification-agent" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bc4ba4-4b03-441b-b492-6de67c2647b6" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378935 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-expirer" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378944 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378956 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a6bfb2-dc60-47ff-80bc-ea70b7a37c9d" containerName="keystone-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378966 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6b5cb860-d149-4999-a4bb-be57ead53529" containerName="registry-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.378977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-auditor" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379003 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f11c15-8856-45fd-9703-348310781d5a" containerName="rabbitmq" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379014 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62d42b6-4ec4-4f63-8f7c-31975f6677bb" containerName="barbican-worker-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="probe" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379036 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379050 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef25eff-8fd8-40b8-881b-e8b7eb12d1ba" containerName="nova-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379063 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379073 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a387ba-cb79-4f5d-a313-37204d51d189" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379081 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-server" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379092 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87ad01b-11b8-43e1-9a04-5947ff6b0441" containerName="cinder-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379103 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="proxy-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379114 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c46ecc6-0196-4659-b06f-3603396bc91a" containerName="dnsmasq-dns" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379124 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379135 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-reaper" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379144 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="container-updater" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379154 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c264e34-26dd-4ab6-88e1-a44f64fac9da" containerName="ceilometer-central-agent" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379165 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-log" Jan 20 
23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379173 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0188ec37-5fa0-4c9c-82ea-7eb7010c88d3" containerName="barbican-api-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379182 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="rsync" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379193 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7e101a-46be-4de8-97af-d47cdce4ef90" containerName="placement-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379203 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="swift-recon-cron" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379214 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="object-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379223 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d90984a-ad24-4f85-b39e-fff5b8152a04" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379236 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc512e7-ef40-4589-a094-9906cf91a0d3" containerName="glance-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379248 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a16fd61-1cde-4e12-8e9f-db0c8f2182b7" containerName="galera" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379260 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b9cc70-f5f7-4bf6-9bc7-15a2c706a1a8" containerName="cinder-api" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379269 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2071f00e-fdae-400c-bec2-f9092faef33d" containerName="nova-scheduler-scheduler" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379280 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5e90d7-03d0-48f4-abc2-914ccd98c0db" containerName="barbican-keystone-listener-log" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379291 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcc41f-fad5-4773-be90-803444a622b0" containerName="account-replicator" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dfd1851-3088-4199-9b12-47964306a9b5" containerName="neutron-httpd" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.379853 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.383102 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.384184 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.384219 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.384431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.415310 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xmsrw"] Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.502557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.502773 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.502863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nscg\" (UniqueName: \"kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.604341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.604470 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nscg\" (UniqueName: \"kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.604592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.604790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " 
pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.606091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.637583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nscg\" (UniqueName: \"kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg\") pod \"crc-storage-crc-xmsrw\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.704358 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:21 crc kubenswrapper[5030]: I0120 23:03:21.983278 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460f4945-a971-411f-93ab-f36930319651" path="/var/lib/kubelet/pods/460f4945-a971-411f-93ab-f36930319651/volumes" Jan 20 23:03:22 crc kubenswrapper[5030]: I0120 23:03:22.246656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xmsrw"] Jan 20 23:03:23 crc kubenswrapper[5030]: I0120 23:03:23.134920 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xmsrw" event={"ID":"2f2c82a7-76d1-44d4-ac12-0958b571d5ac","Type":"ContainerStarted","Data":"262337461f3e9750b3223c467dbf9bcb5eebed3cc7ea72a46af2095d1c02899a"} Jan 20 23:03:24 crc kubenswrapper[5030]: I0120 23:03:24.150521 5030 generic.go:334] "Generic (PLEG): container finished" podID="2f2c82a7-76d1-44d4-ac12-0958b571d5ac" containerID="f1d8809b1d196273dc294cf293af381f668786c5df69401b667fed9278b6508b" exitCode=0 Jan 20 23:03:24 crc kubenswrapper[5030]: I0120 23:03:24.151049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xmsrw" event={"ID":"2f2c82a7-76d1-44d4-ac12-0958b571d5ac","Type":"ContainerDied","Data":"f1d8809b1d196273dc294cf293af381f668786c5df69401b667fed9278b6508b"} Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.478475 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.676244 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt\") pod \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.676510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nscg\" (UniqueName: \"kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg\") pod \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.676576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage\") pod \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\" (UID: \"2f2c82a7-76d1-44d4-ac12-0958b571d5ac\") " Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.676783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "2f2c82a7-76d1-44d4-ac12-0958b571d5ac" (UID: "2f2c82a7-76d1-44d4-ac12-0958b571d5ac"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.677227 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.683720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg" (OuterVolumeSpecName: "kube-api-access-7nscg") pod "2f2c82a7-76d1-44d4-ac12-0958b571d5ac" (UID: "2f2c82a7-76d1-44d4-ac12-0958b571d5ac"). InnerVolumeSpecName "kube-api-access-7nscg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.695550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "2f2c82a7-76d1-44d4-ac12-0958b571d5ac" (UID: "2f2c82a7-76d1-44d4-ac12-0958b571d5ac"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.778180 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nscg\" (UniqueName: \"kubernetes.io/projected/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-kube-api-access-7nscg\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:25 crc kubenswrapper[5030]: I0120 23:03:25.778210 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/2f2c82a7-76d1-44d4-ac12-0958b571d5ac-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:26 crc kubenswrapper[5030]: I0120 23:03:26.172190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xmsrw" event={"ID":"2f2c82a7-76d1-44d4-ac12-0958b571d5ac","Type":"ContainerDied","Data":"262337461f3e9750b3223c467dbf9bcb5eebed3cc7ea72a46af2095d1c02899a"} Jan 20 23:03:26 crc kubenswrapper[5030]: I0120 23:03:26.172268 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262337461f3e9750b3223c467dbf9bcb5eebed3cc7ea72a46af2095d1c02899a" Jan 20 23:03:26 crc kubenswrapper[5030]: I0120 23:03:26.172296 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xmsrw" Jan 20 23:03:26 crc kubenswrapper[5030]: I0120 23:03:26.962065 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:03:26 crc kubenswrapper[5030]: E0120 23:03:26.962521 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.384146 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-xmsrw"] Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.391745 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-xmsrw"] Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.507850 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7k6np"] Jan 20 23:03:29 crc kubenswrapper[5030]: E0120 23:03:29.508448 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2c82a7-76d1-44d4-ac12-0958b571d5ac" containerName="storage" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.508475 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2c82a7-76d1-44d4-ac12-0958b571d5ac" containerName="storage" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.508654 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2c82a7-76d1-44d4-ac12-0958b571d5ac" containerName="storage" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.509448 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.512657 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.512953 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.513408 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.514879 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.522670 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7k6np"] Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.581257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.581990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5rz\" (UniqueName: \"kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.582149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.684423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5rz\" (UniqueName: \"kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.684551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.684657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.685070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " 
pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.686568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.722543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5rz\" (UniqueName: \"kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz\") pod \"crc-storage-crc-7k6np\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.835898 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:29 crc kubenswrapper[5030]: I0120 23:03:29.985139 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2c82a7-76d1-44d4-ac12-0958b571d5ac" path="/var/lib/kubelet/pods/2f2c82a7-76d1-44d4-ac12-0958b571d5ac/volumes" Jan 20 23:03:30 crc kubenswrapper[5030]: I0120 23:03:30.125080 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7k6np"] Jan 20 23:03:30 crc kubenswrapper[5030]: I0120 23:03:30.212915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7k6np" event={"ID":"0e19c0ae-a028-4c60-90bd-dc184f834639","Type":"ContainerStarted","Data":"c1aca38a90695f9c778a898deb629021e234d8761f949639545166e4b95ccfe5"} Jan 20 23:03:31 crc kubenswrapper[5030]: I0120 23:03:31.224919 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e19c0ae-a028-4c60-90bd-dc184f834639" containerID="9a906110591b1c9cf594b5c7f6ade4774e1be0f703f6d02471b591d8d76c6641" exitCode=0 Jan 20 23:03:31 crc kubenswrapper[5030]: I0120 23:03:31.225014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7k6np" event={"ID":"0e19c0ae-a028-4c60-90bd-dc184f834639","Type":"ContainerDied","Data":"9a906110591b1c9cf594b5c7f6ade4774e1be0f703f6d02471b591d8d76c6641"} Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.566917 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.724530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5rz\" (UniqueName: \"kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz\") pod \"0e19c0ae-a028-4c60-90bd-dc184f834639\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.724591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt\") pod \"0e19c0ae-a028-4c60-90bd-dc184f834639\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.724706 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage\") pod \"0e19c0ae-a028-4c60-90bd-dc184f834639\" (UID: \"0e19c0ae-a028-4c60-90bd-dc184f834639\") " Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.724867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "0e19c0ae-a028-4c60-90bd-dc184f834639" (UID: "0e19c0ae-a028-4c60-90bd-dc184f834639"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.725684 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/0e19c0ae-a028-4c60-90bd-dc184f834639-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.730838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz" (OuterVolumeSpecName: "kube-api-access-6f5rz") pod "0e19c0ae-a028-4c60-90bd-dc184f834639" (UID: "0e19c0ae-a028-4c60-90bd-dc184f834639"). InnerVolumeSpecName "kube-api-access-6f5rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.743220 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "0e19c0ae-a028-4c60-90bd-dc184f834639" (UID: "0e19c0ae-a028-4c60-90bd-dc184f834639"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.827883 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/0e19c0ae-a028-4c60-90bd-dc184f834639-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:32 crc kubenswrapper[5030]: I0120 23:03:32.827934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f5rz\" (UniqueName: \"kubernetes.io/projected/0e19c0ae-a028-4c60-90bd-dc184f834639-kube-api-access-6f5rz\") on node \"crc\" DevicePath \"\"" Jan 20 23:03:33 crc kubenswrapper[5030]: I0120 23:03:33.256167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7k6np" event={"ID":"0e19c0ae-a028-4c60-90bd-dc184f834639","Type":"ContainerDied","Data":"c1aca38a90695f9c778a898deb629021e234d8761f949639545166e4b95ccfe5"} Jan 20 23:03:33 crc kubenswrapper[5030]: I0120 23:03:33.256223 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1aca38a90695f9c778a898deb629021e234d8761f949639545166e4b95ccfe5" Jan 20 23:03:33 crc kubenswrapper[5030]: I0120 23:03:33.256382 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7k6np" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.084734 5030 scope.go:117] "RemoveContainer" containerID="63057bd1179d40b3071a3468cfba935dac00fdf9cf0809c23272ed79c89a9eb3" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.119354 5030 scope.go:117] "RemoveContainer" containerID="1149b9108c8726b17b503e37defac2cb908a5b4f024630bb83ebda23ce06e15e" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.143499 5030 scope.go:117] "RemoveContainer" containerID="de1ddfa2e63339299e0dc0bec48a287474e36a02e58e7951ddee8179a1a548a7" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.195424 5030 scope.go:117] "RemoveContainer" containerID="a6f737d8849e1b7dd322cb02df6ee01f0aa48f194ddabe4f0c964de3f0feac4f" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.224883 5030 scope.go:117] "RemoveContainer" containerID="edb04c1ded6bcd629be12b1024e65245d152a0cef4ff2887df6e7b31b08de55f" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.249917 5030 scope.go:117] "RemoveContainer" containerID="893f6987a6df5b3331fffed8a8ea0ba6aec4842601c274d8f1e560748fa5ffd8" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.284569 5030 scope.go:117] "RemoveContainer" containerID="a4a6c07a940fa33f24511acd3d3cfb359f4abd81b4a5617e7de7f6743af38fe9" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.301935 5030 scope.go:117] "RemoveContainer" containerID="4b3edaf72ef2938e18892d64d5b81c473524ae4b964355521fc26202296a8f67" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.321401 5030 scope.go:117] "RemoveContainer" containerID="54c949cb8fb90e5917444c912258fd302ea6bebb581039d1381ea77c29048eb2" Jan 20 23:03:34 crc kubenswrapper[5030]: I0120 23:03:34.344713 5030 scope.go:117] "RemoveContainer" containerID="be006ffb3db61a321e993c269bb9ae3ea0ca63c388bc27053d0c190cbf6073af" Jan 20 23:03:37 crc kubenswrapper[5030]: I0120 23:03:37.969907 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:03:37 crc kubenswrapper[5030]: E0120 23:03:37.970491 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.556694 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:03:44 crc kubenswrapper[5030]: E0120 23:03:44.557457 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e19c0ae-a028-4c60-90bd-dc184f834639" containerName="storage" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.557470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e19c0ae-a028-4c60-90bd-dc184f834639" containerName="storage" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.557620 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e19c0ae-a028-4c60-90bd-dc184f834639" containerName="storage" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.558324 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.560295 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.561026 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.561125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.562176 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.562250 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.562575 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-tgs5j" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.562663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.574228 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705367 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705602 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nlm\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705802 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.705837 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.806959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nlm\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.807216 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.808004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.808243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.808530 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.808938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.809004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.810210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.813792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 
23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.815969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.819077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.821251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.827057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nlm\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.827760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:44 crc kubenswrapper[5030]: I0120 23:03:44.884129 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.300182 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.374578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerStarted","Data":"198c46fbff0b02ec58713aac2e3aed20927f56beb3bcd29624042538a10fb9ad"} Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.686906 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.688240 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.692800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.693077 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.693172 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-xthmg" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.694047 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.694180 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.695779 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.697570 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.716425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.819647 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.819845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jt8\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.819997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820122 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820221 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.820707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921787 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jt8\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.921992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.924059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.924195 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" 
(UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.925325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.926964 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.929084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.929090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.930129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.930907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.932000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.933884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.950247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:45 crc kubenswrapper[5030]: I0120 23:03:45.956408 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s4jt8\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8\") pod \"rabbitmq-server-0\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.023568 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.273469 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:03:46 crc kubenswrapper[5030]: W0120 23:03:46.345846 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ede01a6_9221_4276_b537_212e5da27f5c.slice/crio-f110732b2b8320173a9a0256ee3d6ea0412e4623b249bed84157371192363e50 WatchSource:0}: Error finding container f110732b2b8320173a9a0256ee3d6ea0412e4623b249bed84157371192363e50: Status 404 returned error can't find the container with id f110732b2b8320173a9a0256ee3d6ea0412e4623b249bed84157371192363e50 Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.391239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerStarted","Data":"f110732b2b8320173a9a0256ee3d6ea0412e4623b249bed84157371192363e50"} Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.964263 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.968916 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.977102 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-c2dbx" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.977300 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.977811 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.979710 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.979812 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:03:46 crc kubenswrapper[5030]: I0120 23:03:46.985458 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038648 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrqm\" (UniqueName: \"kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.038999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.139914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140291 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.140912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.141031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrqm\" (UniqueName: \"kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.141064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.141525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.141714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.142780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts\") 
pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.142931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.146075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.158345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.166413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrqm\" (UniqueName: \"kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.168146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.296421 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.407965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerStarted","Data":"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a"} Jan 20 23:03:47 crc kubenswrapper[5030]: I0120 23:03:47.885109 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.417166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerStarted","Data":"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79"} Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.418991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerStarted","Data":"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c"} Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.419048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerStarted","Data":"1ab3bed68b1854174840ebd72cbcf854434d5af7db649a2018f4a61a6c04892f"} Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.471104 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.472253 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.475209 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-7wrsw" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.476824 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.477881 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.478598 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.495317 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563437 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwk5\" (UniqueName: \"kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.563852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc 
kubenswrapper[5030]: I0120 23:03:48.563897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.564212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666323 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 
crc kubenswrapper[5030]: I0120 23:03:48.666369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwk5\" (UniqueName: \"kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666571 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.666687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.667114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.667338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.668017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.674211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.679344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.691558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwk5\" (UniqueName: \"kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 
crc kubenswrapper[5030]: I0120 23:03:48.698931 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.699229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.706354 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.708788 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-tb4nm" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.709037 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.709141 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.732800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.767831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.768101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.768149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.768169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fms9k\" (UniqueName: \"kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.768203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.789174 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.869610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.869692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.869747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.869771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fms9k\" (UniqueName: \"kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.869808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.870366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.870498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.873847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.874266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:48 crc kubenswrapper[5030]: I0120 23:03:48.892613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fms9k\" (UniqueName: 
\"kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k\") pod \"memcached-0\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:49 crc kubenswrapper[5030]: I0120 23:03:49.065303 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:49 crc kubenswrapper[5030]: I0120 23:03:49.252027 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:03:49 crc kubenswrapper[5030]: I0120 23:03:49.275490 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:03:49 crc kubenswrapper[5030]: I0120 23:03:49.427245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"11f45a61-64e0-41ac-85bd-660c5a2c83be","Type":"ContainerStarted","Data":"909e491ca41f02445b9bfd666378a82a3944a12b0f1f9142f0839021c03093fb"} Jan 20 23:03:49 crc kubenswrapper[5030]: I0120 23:03:49.429557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerStarted","Data":"a2da9c9ec0ad052792dfb644ad3b4a584aeddba98b2deeefbfaaf9f4edbce253"} Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.404475 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.405556 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.410430 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-j6g6c" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.417326 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.461138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerStarted","Data":"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77"} Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.465354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"11f45a61-64e0-41ac-85bd-660c5a2c83be","Type":"ContainerStarted","Data":"1cada3db4af66ec4d1c27414fa05f35d51c9d85cd6c0e4252f5af47988c2a0b3"} Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.465588 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.504685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czskw\" (UniqueName: \"kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw\") pod \"kube-state-metrics-0\" (UID: \"d808c4c2-9747-4f7f-aea4-a3a2698b453e\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.513259 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.513241411 podStartE2EDuration="2.513241411s" podCreationTimestamp="2026-01-20 
23:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:03:50.505169344 +0000 UTC m=+1702.825429632" watchObservedRunningTime="2026-01-20 23:03:50.513241411 +0000 UTC m=+1702.833501699" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.606066 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czskw\" (UniqueName: \"kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw\") pod \"kube-state-metrics-0\" (UID: \"d808c4c2-9747-4f7f-aea4-a3a2698b453e\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.629388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czskw\" (UniqueName: \"kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw\") pod \"kube-state-metrics-0\" (UID: \"d808c4c2-9747-4f7f-aea4-a3a2698b453e\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:50 crc kubenswrapper[5030]: I0120 23:03:50.725187 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:51 crc kubenswrapper[5030]: I0120 23:03:51.168405 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:03:51 crc kubenswrapper[5030]: I0120 23:03:51.174221 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:03:51 crc kubenswrapper[5030]: I0120 23:03:51.473850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d808c4c2-9747-4f7f-aea4-a3a2698b453e","Type":"ContainerStarted","Data":"e345fa82caee8495b122425cd4d168793900d4938d6213a556b387c516a65a1e"} Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.483471 5030 generic.go:334] "Generic (PLEG): container finished" podID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerID="1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c" exitCode=0 Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.483706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerDied","Data":"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c"} Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.486064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d808c4c2-9747-4f7f-aea4-a3a2698b453e","Type":"ContainerStarted","Data":"63f933bb3c5d62aebcaf790abe7a9783ebabfd7c507c26b2013dc71a5002bae9"} Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.487230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.533744 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.103789109 podStartE2EDuration="2.533723812s" podCreationTimestamp="2026-01-20 23:03:50 +0000 UTC" firstStartedPulling="2026-01-20 23:03:51.174019139 +0000 UTC m=+1703.494279427" lastFinishedPulling="2026-01-20 23:03:51.603953822 +0000 UTC m=+1703.924214130" observedRunningTime="2026-01-20 23:03:52.532668666 +0000 UTC m=+1704.852928974" 
watchObservedRunningTime="2026-01-20 23:03:52.533723812 +0000 UTC m=+1704.853984100" Jan 20 23:03:52 crc kubenswrapper[5030]: I0120 23:03:52.962022 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:03:52 crc kubenswrapper[5030]: E0120 23:03:52.962773 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:03:53 crc kubenswrapper[5030]: I0120 23:03:53.499836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerStarted","Data":"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5"} Jan 20 23:03:53 crc kubenswrapper[5030]: I0120 23:03:53.510417 5030 generic.go:334] "Generic (PLEG): container finished" podID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerID="b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77" exitCode=0 Jan 20 23:03:53 crc kubenswrapper[5030]: I0120 23:03:53.510498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerDied","Data":"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77"} Jan 20 23:03:53 crc kubenswrapper[5030]: I0120 23:03:53.539040 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.539019413 podStartE2EDuration="8.539019413s" podCreationTimestamp="2026-01-20 23:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:03:53.532931424 +0000 UTC m=+1705.853191752" watchObservedRunningTime="2026-01-20 23:03:53.539019413 +0000 UTC m=+1705.859279711" Jan 20 23:03:54 crc kubenswrapper[5030]: I0120 23:03:54.067007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:03:54 crc kubenswrapper[5030]: I0120 23:03:54.520034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerStarted","Data":"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863"} Jan 20 23:03:54 crc kubenswrapper[5030]: I0120 23:03:54.536819 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.53680211 podStartE2EDuration="7.53680211s" podCreationTimestamp="2026-01-20 23:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:03:54.535009906 +0000 UTC m=+1706.855270204" watchObservedRunningTime="2026-01-20 23:03:54.53680211 +0000 UTC m=+1706.857062408" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.446937 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.448594 5030 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.451644 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.455491 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.455741 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.455911 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-2vxnl" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.456374 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.470985 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579321 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52q5p\" (UniqueName: \"kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 
23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.579583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52q5p\" (UniqueName: \"kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.680765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.681289 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.681407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.681746 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.682302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.696024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.696429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.696650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.703743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52q5p\" (UniqueName: \"kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.714192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:55 crc kubenswrapper[5030]: I0120 23:03:55.769919 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.223340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:03:56 crc kubenswrapper[5030]: W0120 23:03:56.226960 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0abaaf5_8404_4d53_ab0d_ceb843879e7b.slice/crio-b4ebde8a21ef69b50d163b6790e3f68a93b61e18d955907ca97606c78be05807 WatchSource:0}: Error finding container b4ebde8a21ef69b50d163b6790e3f68a93b61e18d955907ca97606c78be05807: Status 404 returned error can't find the container with id b4ebde8a21ef69b50d163b6790e3f68a93b61e18d955907ca97606c78be05807 Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.537026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerStarted","Data":"2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531"} Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.537073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerStarted","Data":"b4ebde8a21ef69b50d163b6790e3f68a93b61e18d955907ca97606c78be05807"} Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.683538 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.684704 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.686870 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-2vcpw" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.687429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.687589 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.687874 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.717086 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.802477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.802962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qshdk\" (UniqueName: \"kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.803383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904453 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qshdk\" (UniqueName: \"kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.904475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.905952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.906501 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.906835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.905201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.909605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.909771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.913311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.926003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qshdk\" (UniqueName: \"kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:56 crc kubenswrapper[5030]: I0120 23:03:56.929225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.060771 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.297305 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.297350 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.369732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.500954 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:03:57 crc kubenswrapper[5030]: W0120 23:03:57.505543 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod076e3516_7a56_4b76_ba13_4f4f11f83af8.slice/crio-1e9774ee3008170e37fc8225fc5439b39532c85172bb7503fdf9acaa839534dc WatchSource:0}: Error finding container 1e9774ee3008170e37fc8225fc5439b39532c85172bb7503fdf9acaa839534dc: Status 404 returned error can't find the container with id 1e9774ee3008170e37fc8225fc5439b39532c85172bb7503fdf9acaa839534dc Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.545398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerStarted","Data":"1e9774ee3008170e37fc8225fc5439b39532c85172bb7503fdf9acaa839534dc"} Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.550270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerStarted","Data":"143fd3fdb7a82cd6e839d0f97c95e1923c32e0d47238eb2b8ba19f425ffccf46"} Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.570151 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.570132025 podStartE2EDuration="3.570132025s" podCreationTimestamp="2026-01-20 23:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:03:57.567932721 +0000 UTC m=+1709.888193019" watchObservedRunningTime="2026-01-20 23:03:57.570132025 +0000 UTC m=+1709.890392333" Jan 20 23:03:57 crc kubenswrapper[5030]: I0120 23:03:57.633645 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.562770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerStarted","Data":"55836a84ea42622b1c95a5c3544f4cddcb27aff9869f6f7aca168d293dd1544e"} Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.563107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerStarted","Data":"aa914e7fae792f39c001fbef3009f7737f6fcbd15f05ee4dbed06af2baea26c7"} Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.590858 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.59083689 
podStartE2EDuration="3.59083689s" podCreationTimestamp="2026-01-20 23:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:03:58.584834144 +0000 UTC m=+1710.905094452" watchObservedRunningTime="2026-01-20 23:03:58.59083689 +0000 UTC m=+1710.911097188" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.628971 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-p58br"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.630072 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.645457 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-p58br"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.729707 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.730689 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.733583 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.736005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.736037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbdw\" (UniqueName: \"kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.740692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.770081 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.790004 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.790219 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.837645 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.837705 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hcbdw\" (UniqueName: \"kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.837801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.837899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc2p\" (UniqueName: \"kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.838877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.857257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbdw\" (UniqueName: \"kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw\") pod \"keystone-db-create-p58br\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.916388 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-fmgxs"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.917506 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.929757 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-fmgxs"] Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.939529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc2p\" (UniqueName: \"kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.939654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.940406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.952881 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:03:58 crc kubenswrapper[5030]: I0120 23:03:58.957248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc2p\" (UniqueName: \"kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p\") pod \"keystone-cb8a-account-create-update-lgmp8\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.040047 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl"] Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.041022 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.043118 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.043678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clcp6\" (UniqueName: \"kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6\") pod \"placement-db-create-fmgxs\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.043827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts\") pod \"placement-db-create-fmgxs\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.049535 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.056203 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl"] Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.146583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts\") pod \"placement-db-create-fmgxs\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.146744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clcp6\" (UniqueName: \"kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6\") pod \"placement-db-create-fmgxs\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.146770 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.146801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qsb\" (UniqueName: \"kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.147753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts\") pod \"placement-db-create-fmgxs\" (UID: 
\"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.163288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clcp6\" (UniqueName: \"kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6\") pod \"placement-db-create-fmgxs\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.244235 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.248377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.248438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qsb\" (UniqueName: \"kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.249380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.272137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qsb\" (UniqueName: \"kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb\") pod \"placement-73fb-account-create-update-ndhrl\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.343663 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8"] Jan 20 23:03:59 crc kubenswrapper[5030]: W0120 23:03:59.348192 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeebe328_1fb3_442d_90aa_aac7c2cf9457.slice/crio-0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86 WatchSource:0}: Error finding container 0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86: Status 404 returned error can't find the container with id 0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86 Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.372496 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.435286 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-p58br"] Jan 20 23:03:59 crc kubenswrapper[5030]: W0120 23:03:59.439208 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49108146_06bd_4395_99a7_3b3d37ce1920.slice/crio-bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f WatchSource:0}: Error finding container bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f: Status 404 returned error can't find the container with id bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.573421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-p58br" event={"ID":"49108146-06bd-4395-99a7-3b3d37ce1920","Type":"ContainerStarted","Data":"bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f"} Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.575982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" event={"ID":"beebe328-1fb3-442d-90aa-aac7c2cf9457","Type":"ContainerStarted","Data":"0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86"} Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.654478 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-fmgxs"] Jan 20 23:03:59 crc kubenswrapper[5030]: W0120 23:03:59.667084 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c878c14_0187_4224_bffa_99280c03dae7.slice/crio-60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1 WatchSource:0}: Error finding container 60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1: Status 404 returned error can't find the container with id 60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1 Jan 20 23:03:59 crc kubenswrapper[5030]: I0120 23:03:59.822436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl"] Jan 20 23:03:59 crc kubenswrapper[5030]: W0120 23:03:59.892351 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod364de244_2190_47e1_bade_2c277858c97a.slice/crio-caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021 WatchSource:0}: Error finding container caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021: Status 404 returned error can't find the container with id caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021 Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.062965 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.123905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.589602 5030 generic.go:334] "Generic (PLEG): container finished" podID="beebe328-1fb3-442d-90aa-aac7c2cf9457" containerID="23c3b4bff5c9d478f4f105968fe365da138cfa39a0900042d6d05c0b978b4131" exitCode=0 Jan 20 23:04:00 
crc kubenswrapper[5030]: I0120 23:04:00.589844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" event={"ID":"beebe328-1fb3-442d-90aa-aac7c2cf9457","Type":"ContainerDied","Data":"23c3b4bff5c9d478f4f105968fe365da138cfa39a0900042d6d05c0b978b4131"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.599264 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c878c14-0187-4224-bffa-99280c03dae7" containerID="2a584d50b092bb2e78449a32c91df2402a0537b6a34ad6a3f2ec1df462b05a74" exitCode=0 Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.599387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-fmgxs" event={"ID":"0c878c14-0187-4224-bffa-99280c03dae7","Type":"ContainerDied","Data":"2a584d50b092bb2e78449a32c91df2402a0537b6a34ad6a3f2ec1df462b05a74"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.599433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-fmgxs" event={"ID":"0c878c14-0187-4224-bffa-99280c03dae7","Type":"ContainerStarted","Data":"60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.601371 5030 generic.go:334] "Generic (PLEG): container finished" podID="364de244-2190-47e1-bade-2c277858c97a" containerID="438639437ff6cdf5a7b8b51f8127f0dcdd80ff295e3a8e7788eb685ff527e8e2" exitCode=0 Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.601471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" event={"ID":"364de244-2190-47e1-bade-2c277858c97a","Type":"ContainerDied","Data":"438639437ff6cdf5a7b8b51f8127f0dcdd80ff295e3a8e7788eb685ff527e8e2"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.601494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" event={"ID":"364de244-2190-47e1-bade-2c277858c97a","Type":"ContainerStarted","Data":"caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.602890 5030 generic.go:334] "Generic (PLEG): container finished" podID="49108146-06bd-4395-99a7-3b3d37ce1920" containerID="745d8a8358077afde7e45bd24c3baddb33dc50783803442ad724e0850b039df1" exitCode=0 Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.603771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-p58br" event={"ID":"49108146-06bd-4395-99a7-3b3d37ce1920","Type":"ContainerDied","Data":"745d8a8358077afde7e45bd24c3baddb33dc50783803442ad724e0850b039df1"} Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.603843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.730008 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:04:00 crc kubenswrapper[5030]: I0120 23:04:00.770811 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.286788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.348477 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.830703 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.836202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.836328 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.839083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-fdp95" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.839257 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.839851 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.840174 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.840357 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:04:01 crc kubenswrapper[5030]: I0120 23:04:01.899168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.003292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m97n\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.003411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.003460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.003547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.003599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " 
pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.042855 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.104908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m97n\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.104979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.105215 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.105236 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.105290 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift podName:7ebac554-54b2-415f-9d13-dc616983bff9 nodeName:}" failed. No retries permitted until 2026-01-20 23:04:02.605270806 +0000 UTC m=+1714.925531094 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift") pod "swift-storage-0" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9") : configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105568 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.105689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.108175 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.123867 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m97n\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.152761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.199584 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kzxqs"] Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.200305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="364de244-2190-47e1-bade-2c277858c97a" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.202392 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="364de244-2190-47e1-bade-2c277858c97a" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.206102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="364de244-2190-47e1-bade-2c277858c97a" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.206574 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kzxqs"] Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.206671 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.207043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts\") pod \"364de244-2190-47e1-bade-2c277858c97a\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.208514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8qsb\" (UniqueName: \"kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb\") pod \"364de244-2190-47e1-bade-2c277858c97a\" (UID: \"364de244-2190-47e1-bade-2c277858c97a\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.209230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "364de244-2190-47e1-bade-2c277858c97a" (UID: "364de244-2190-47e1-bade-2c277858c97a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.211122 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.211326 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.211456 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.213287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb" (OuterVolumeSpecName: "kube-api-access-r8qsb") pod "364de244-2190-47e1-bade-2c277858c97a" (UID: "364de244-2190-47e1-bade-2c277858c97a"). InnerVolumeSpecName "kube-api-access-r8qsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.240715 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.245510 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.256219 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312468 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312638 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312682 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/364de244-2190-47e1-bade-2c277858c97a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.312694 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8qsb\" (UniqueName: \"kubernetes.io/projected/364de244-2190-47e1-bade-2c277858c97a-kube-api-access-r8qsb\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc 
kubenswrapper[5030]: I0120 23:04:02.362643 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.363072 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c878c14-0187-4224-bffa-99280c03dae7" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363088 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c878c14-0187-4224-bffa-99280c03dae7" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.363115 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beebe328-1fb3-442d-90aa-aac7c2cf9457" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363124 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="beebe328-1fb3-442d-90aa-aac7c2cf9457" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.363145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49108146-06bd-4395-99a7-3b3d37ce1920" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49108146-06bd-4395-99a7-3b3d37ce1920" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363309 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49108146-06bd-4395-99a7-3b3d37ce1920" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c878c14-0187-4224-bffa-99280c03dae7" containerName="mariadb-database-create" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.363351 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="beebe328-1fb3-442d-90aa-aac7c2cf9457" containerName="mariadb-account-create-update" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.364202 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.370239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-447fw" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.370440 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.371058 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.371598 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.375219 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414204 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts\") pod \"0c878c14-0187-4224-bffa-99280c03dae7\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbdw\" (UniqueName: \"kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw\") pod \"49108146-06bd-4395-99a7-3b3d37ce1920\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts\") pod \"49108146-06bd-4395-99a7-3b3d37ce1920\" (UID: \"49108146-06bd-4395-99a7-3b3d37ce1920\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clcp6\" (UniqueName: \"kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6\") pod \"0c878c14-0187-4224-bffa-99280c03dae7\" (UID: \"0c878c14-0187-4224-bffa-99280c03dae7\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts\") pod \"beebe328-1fb3-442d-90aa-aac7c2cf9457\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqc2p\" (UniqueName: \"kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p\") pod \"beebe328-1fb3-442d-90aa-aac7c2cf9457\" (UID: \"beebe328-1fb3-442d-90aa-aac7c2cf9457\") " Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: 
I0120 23:04:02.414816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.414985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.415020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.415954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.416525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c878c14-0187-4224-bffa-99280c03dae7" (UID: "0c878c14-0187-4224-bffa-99280c03dae7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.418410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.418405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "beebe328-1fb3-442d-90aa-aac7c2cf9457" (UID: "beebe328-1fb3-442d-90aa-aac7c2cf9457"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.420043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw" (OuterVolumeSpecName: "kube-api-access-hcbdw") pod "49108146-06bd-4395-99a7-3b3d37ce1920" (UID: "49108146-06bd-4395-99a7-3b3d37ce1920"). InnerVolumeSpecName "kube-api-access-hcbdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.421090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6" (OuterVolumeSpecName: "kube-api-access-clcp6") pod "0c878c14-0187-4224-bffa-99280c03dae7" (UID: "0c878c14-0187-4224-bffa-99280c03dae7"). InnerVolumeSpecName "kube-api-access-clcp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.421401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49108146-06bd-4395-99a7-3b3d37ce1920" (UID: "49108146-06bd-4395-99a7-3b3d37ce1920"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.421705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.423068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.423239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p" (OuterVolumeSpecName: "kube-api-access-zqc2p") pod "beebe328-1fb3-442d-90aa-aac7c2cf9457" (UID: "beebe328-1fb3-442d-90aa-aac7c2cf9457"). InnerVolumeSpecName "kube-api-access-zqc2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.422987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.430241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.439928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc\") pod \"swift-ring-rebalance-kzxqs\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.516818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.516877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnf25\" (UniqueName: \"kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517100 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517179 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c878c14-0187-4224-bffa-99280c03dae7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517193 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbdw\" (UniqueName: \"kubernetes.io/projected/49108146-06bd-4395-99a7-3b3d37ce1920-kube-api-access-hcbdw\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517206 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49108146-06bd-4395-99a7-3b3d37ce1920-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517219 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clcp6\" (UniqueName: \"kubernetes.io/projected/0c878c14-0187-4224-bffa-99280c03dae7-kube-api-access-clcp6\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517230 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/beebe328-1fb3-442d-90aa-aac7c2cf9457-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.517242 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqc2p\" (UniqueName: \"kubernetes.io/projected/beebe328-1fb3-442d-90aa-aac7c2cf9457-kube-api-access-zqc2p\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.562758 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnf25\" (UniqueName: \"kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.623853 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.624584 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.624610 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: E0120 23:04:02.624808 5030 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift podName:7ebac554-54b2-415f-9d13-dc616983bff9 nodeName:}" failed. No retries permitted until 2026-01-20 23:04:03.624789081 +0000 UTC m=+1715.945049379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift") pod "swift-storage-0" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9") : configmap "swift-ring-files" not found Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.628652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.629654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.629677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.630176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.631970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.632603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-p58br" event={"ID":"49108146-06bd-4395-99a7-3b3d37ce1920","Type":"ContainerDied","Data":"bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f"} Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.632665 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa9ab02542f40e6f5ad2f0d74ef22aaa1d1bb474d4a49d7dd9ca2546e63170f" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.632764 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-p58br" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.634054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.634659 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.634650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8" event={"ID":"beebe328-1fb3-442d-90aa-aac7c2cf9457","Type":"ContainerDied","Data":"0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86"} Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.634909 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e2073f1185cf3a6ac2c4dc8e2d8b43c74f95fac499912dce3b8fc6630fbab86" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.639145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-fmgxs" event={"ID":"0c878c14-0187-4224-bffa-99280c03dae7","Type":"ContainerDied","Data":"60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1"} Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.639174 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60064dcebde8c03b521e6e9c51e5f9c31e4a205fc8bf6190401d7e76cd450cf1" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.639237 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-fmgxs" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.647557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" event={"ID":"364de244-2190-47e1-bade-2c277858c97a","Type":"ContainerDied","Data":"caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021"} Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.647650 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.647661 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caa2eeb96a8a4f09d360147fee1ed3f82a25d03770042eb49007afed4c7fc021" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.648377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnf25\" (UniqueName: \"kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25\") pod \"ovn-northd-0\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:02 crc kubenswrapper[5030]: I0120 23:04:02.681979 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.018589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kzxqs"] Jan 20 23:04:03 crc kubenswrapper[5030]: W0120 23:04:03.023107 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc11c685c_7088_4b9a_ae6a_6fcf926abf3c.slice/crio-dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf WatchSource:0}: Error finding container dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf: Status 404 returned error can't find the container with id dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.138944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.640931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:03 crc kubenswrapper[5030]: E0120 23:04:03.641102 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:04:03 crc kubenswrapper[5030]: E0120 23:04:03.641138 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:04:03 crc kubenswrapper[5030]: E0120 23:04:03.641199 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift podName:7ebac554-54b2-415f-9d13-dc616983bff9 nodeName:}" failed. No retries permitted until 2026-01-20 23:04:05.641181852 +0000 UTC m=+1717.961442140 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift") pod "swift-storage-0" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9") : configmap "swift-ring-files" not found Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.656171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerStarted","Data":"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1"} Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.656213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerStarted","Data":"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7"} Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.656224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerStarted","Data":"078298afef122c853f4483a01a811bf7424016846dac12e491649c1b0e43647a"} Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.656294 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.658211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" event={"ID":"c11c685c-7088-4b9a-ae6a-6fcf926abf3c","Type":"ContainerStarted","Data":"ae017982458ddc6af7aca2197dbd82b45a8fce1dc25b7adfb5138c1f5c66675e"} Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.658268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" event={"ID":"c11c685c-7088-4b9a-ae6a-6fcf926abf3c","Type":"ContainerStarted","Data":"dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf"} Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.678938 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.6789173910000001 podStartE2EDuration="1.678917391s" podCreationTimestamp="2026-01-20 23:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:03.678300207 +0000 UTC m=+1715.998560495" watchObservedRunningTime="2026-01-20 23:04:03.678917391 +0000 UTC m=+1715.999177669" Jan 20 23:04:03 crc kubenswrapper[5030]: I0120 23:04:03.695407 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" podStartSLOduration=1.6953882729999998 podStartE2EDuration="1.695388273s" podCreationTimestamp="2026-01-20 23:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:03.692178894 +0000 UTC m=+1716.012439182" watchObservedRunningTime="2026-01-20 23:04:03.695388273 +0000 UTC m=+1716.015648561" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.242601 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-q4dk6"] Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.243682 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.289344 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-q4dk6"] Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.352892 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb"] Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.353532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchv4\" (UniqueName: \"kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.353584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.354053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.358966 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.368794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb"] Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.454711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmv8\" (UniqueName: \"kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.454832 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.454862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchv4\" (UniqueName: \"kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.454885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc 
kubenswrapper[5030]: I0120 23:04:04.455541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.483262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchv4\" (UniqueName: \"kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4\") pod \"glance-db-create-q4dk6\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.556818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmv8\" (UniqueName: \"kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.556913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.557557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.573379 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.586124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmv8\" (UniqueName: \"kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8\") pod \"glance-a2f7-account-create-update-d6xpb\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:04 crc kubenswrapper[5030]: I0120 23:04:04.677944 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.141771 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-q4dk6"] Jan 20 23:04:05 crc kubenswrapper[5030]: W0120 23:04:05.153141 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d8c060_85fa_4664_97ef_22370096f2b1.slice/crio-a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7 WatchSource:0}: Error finding container a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7: Status 404 returned error can't find the container with id a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7 Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.270937 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb"] Jan 20 23:04:05 crc kubenswrapper[5030]: W0120 23:04:05.275422 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb5e616_18ad_4dc4_b8c4_b69a6c649bb4.slice/crio-119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3 WatchSource:0}: Error finding container 119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3: Status 404 returned error can't find the container with id 119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3 Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.674857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:05 crc kubenswrapper[5030]: E0120 23:04:05.675109 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:04:05 crc kubenswrapper[5030]: E0120 23:04:05.675285 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:04:05 crc kubenswrapper[5030]: E0120 23:04:05.675354 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift podName:7ebac554-54b2-415f-9d13-dc616983bff9 nodeName:}" failed. No retries permitted until 2026-01-20 23:04:09.675331206 +0000 UTC m=+1721.995591504 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift") pod "swift-storage-0" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9") : configmap "swift-ring-files" not found Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.693889 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7d8c060-85fa-4664-97ef-22370096f2b1" containerID="2ae32ab73523fefa4a071347e2c2b96e0ad6fcaa89f2c5f18bc0b2e6a93d85ba" exitCode=0 Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.693961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-q4dk6" event={"ID":"a7d8c060-85fa-4664-97ef-22370096f2b1","Type":"ContainerDied","Data":"2ae32ab73523fefa4a071347e2c2b96e0ad6fcaa89f2c5f18bc0b2e6a93d85ba"} Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.693990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-q4dk6" event={"ID":"a7d8c060-85fa-4664-97ef-22370096f2b1","Type":"ContainerStarted","Data":"a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7"} Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.696008 5030 generic.go:334] "Generic (PLEG): container finished" podID="efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" containerID="fd9b139e4f8e23ee3d63600f68be17c4b6a9c41704e129cb9fee4ca166609d8e" exitCode=0 Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.696061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" event={"ID":"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4","Type":"ContainerDied","Data":"fd9b139e4f8e23ee3d63600f68be17c4b6a9c41704e129cb9fee4ca166609d8e"} Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.696093 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" event={"ID":"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4","Type":"ContainerStarted","Data":"119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3"} Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.963180 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:04:05 crc kubenswrapper[5030]: E0120 23:04:05.963434 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.972970 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z694x"] Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.974050 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.976229 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z694x"] Jan 20 23:04:05 crc kubenswrapper[5030]: I0120 23:04:05.976579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.087914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.088150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdqpf\" (UniqueName: \"kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.189384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.189590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdqpf\" (UniqueName: \"kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.190742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.227269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdqpf\" (UniqueName: \"kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf\") pod \"root-account-create-update-z694x\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.295451 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:06 crc kubenswrapper[5030]: I0120 23:04:06.829382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z694x"] Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.116846 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.154167 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.207760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts\") pod \"a7d8c060-85fa-4664-97ef-22370096f2b1\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.207852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qchv4\" (UniqueName: \"kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4\") pod \"a7d8c060-85fa-4664-97ef-22370096f2b1\" (UID: \"a7d8c060-85fa-4664-97ef-22370096f2b1\") " Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.208904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7d8c060-85fa-4664-97ef-22370096f2b1" (UID: "a7d8c060-85fa-4664-97ef-22370096f2b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.212899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4" (OuterVolumeSpecName: "kube-api-access-qchv4") pod "a7d8c060-85fa-4664-97ef-22370096f2b1" (UID: "a7d8c060-85fa-4664-97ef-22370096f2b1"). InnerVolumeSpecName "kube-api-access-qchv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.309736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmv8\" (UniqueName: \"kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8\") pod \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.310083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts\") pod \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\" (UID: \"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4\") " Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.310524 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d8c060-85fa-4664-97ef-22370096f2b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.310535 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qchv4\" (UniqueName: \"kubernetes.io/projected/a7d8c060-85fa-4664-97ef-22370096f2b1-kube-api-access-qchv4\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.311216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" (UID: "efb5e616-18ad-4dc4-b8c4-b69a6c649bb4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.314127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8" (OuterVolumeSpecName: "kube-api-access-xbmv8") pod "efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" (UID: "efb5e616-18ad-4dc4-b8c4-b69a6c649bb4"). InnerVolumeSpecName "kube-api-access-xbmv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.412151 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmv8\" (UniqueName: \"kubernetes.io/projected/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-kube-api-access-xbmv8\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.412194 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.719726 5030 generic.go:334] "Generic (PLEG): container finished" podID="94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" containerID="e6ae00a5b5176d1e8061fa02216441a4d16f157d7c215977e50d6e93c60f64c1" exitCode=0 Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.719781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z694x" event={"ID":"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf","Type":"ContainerDied","Data":"e6ae00a5b5176d1e8061fa02216441a4d16f157d7c215977e50d6e93c60f64c1"} Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.719841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z694x" event={"ID":"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf","Type":"ContainerStarted","Data":"3510e23ee5893a3d00044264ee43fa08e0edecdfaedd67fa0b165572ec0ae872"} Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.722579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" event={"ID":"efb5e616-18ad-4dc4-b8c4-b69a6c649bb4","Type":"ContainerDied","Data":"119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3"} Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.722640 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119f51504264ad0ca7c234a46a89f3d12e0a4fb40afe3146b55a4b3dd88513f3" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.722700 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.724831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-q4dk6" event={"ID":"a7d8c060-85fa-4664-97ef-22370096f2b1","Type":"ContainerDied","Data":"a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7"} Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.724854 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97e2a67947cb83728cfcafd0df173e3183881de267d434d5e0e3c857ae846b7" Jan 20 23:04:07 crc kubenswrapper[5030]: I0120 23:04:07.724930 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-q4dk6" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.146365 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.246724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdqpf\" (UniqueName: \"kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf\") pod \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.246794 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts\") pod \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\" (UID: \"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf\") " Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.247524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" (UID: "94847cfc-3a91-4bb3-a3f8-c77c642b9fdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.261832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf" (OuterVolumeSpecName: "kube-api-access-vdqpf") pod "94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" (UID: "94847cfc-3a91-4bb3-a3f8-c77c642b9fdf"). InnerVolumeSpecName "kube-api-access-vdqpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.348462 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdqpf\" (UniqueName: \"kubernetes.io/projected/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-kube-api-access-vdqpf\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.348494 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.503873 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4qlnz"] Jan 20 23:04:09 crc kubenswrapper[5030]: E0120 23:04:09.504160 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504175 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: E0120 23:04:09.504187 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d8c060-85fa-4664-97ef-22370096f2b1" containerName="mariadb-database-create" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504193 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7d8c060-85fa-4664-97ef-22370096f2b1" containerName="mariadb-database-create" Jan 20 23:04:09 crc kubenswrapper[5030]: E0120 23:04:09.504207 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 
23:04:09.504214 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d8c060-85fa-4664-97ef-22370096f2b1" containerName="mariadb-database-create" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504393 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504402 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" containerName="mariadb-account-create-update" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.504910 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.507080 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-g4857" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.507124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.517045 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4qlnz"] Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.652925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.652970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.653034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.653410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqq98\" (UniqueName: \"kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z694x" event={"ID":"94847cfc-3a91-4bb3-a3f8-c77c642b9fdf","Type":"ContainerDied","Data":"3510e23ee5893a3d00044264ee43fa08e0edecdfaedd67fa0b165572ec0ae872"} Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754475 5030 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3510e23ee5893a3d00044264ee43fa08e0edecdfaedd67fa0b165572ec0ae872" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754547 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z694x" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.754903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqq98\" (UniqueName: \"kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.755098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.755132 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.764711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"swift-storage-0\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.765022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.773168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.774719 5030 generic.go:334] "Generic (PLEG): container finished" podID="c11c685c-7088-4b9a-ae6a-6fcf926abf3c" 
containerID="ae017982458ddc6af7aca2197dbd82b45a8fce1dc25b7adfb5138c1f5c66675e" exitCode=0 Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.774767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" event={"ID":"c11c685c-7088-4b9a-ae6a-6fcf926abf3c","Type":"ContainerDied","Data":"ae017982458ddc6af7aca2197dbd82b45a8fce1dc25b7adfb5138c1f5c66675e"} Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.775083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.778571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqq98\" (UniqueName: \"kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98\") pod \"glance-db-sync-4qlnz\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.821500 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:09 crc kubenswrapper[5030]: I0120 23:04:09.953288 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.297341 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4qlnz"] Jan 20 23:04:10 crc kubenswrapper[5030]: W0120 23:04:10.301882 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e9fb28d_51f1_4f0f_a2ee_dd4a379a0a92.slice/crio-f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db WatchSource:0}: Error finding container f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db: Status 404 returned error can't find the container with id f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.448750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.792605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"e48b459962ea7bc574b006ce6341dda88c60d32146ba90952d7090ce53463da0"} Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.792662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"222855be530361537c189bb68ea262fab03aa1e573ca39420ca757a8fc9752a9"} Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.792671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"9c3af1cc8881556988bb09e0cef1c91c5085b9e1f776fcc78cffc47380390a67"} Jan 20 23:04:10 crc kubenswrapper[5030]: I0120 23:04:10.794043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" 
event={"ID":"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92","Type":"ContainerStarted","Data":"f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.197294 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.282531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift\") pod \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\" (UID: \"c11c685c-7088-4b9a-ae6a-6fcf926abf3c\") " Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.283550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.284485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.286690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc" (OuterVolumeSpecName: "kube-api-access-7jclc") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "kube-api-access-7jclc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.294649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.309960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts" (OuterVolumeSpecName: "scripts") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.319901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.322655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11c685c-7088-4b9a-ae6a-6fcf926abf3c" (UID: "c11c685c-7088-4b9a-ae6a-6fcf926abf3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384235 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384267 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384278 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384287 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384298 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-kube-api-access-7jclc\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384308 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.384316 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c11c685c-7088-4b9a-ae6a-6fcf926abf3c-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.805919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"c6d0a29d7c5b2300d15ec8193bba8327f5c9413176c849542e224a3285bd14db"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.806243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"e0a2f61c01c7d8a27a3ed5ef3a647a8e8fbd99b4f1d1be9c8bc289f70dc07409"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.806254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"ef5734bddbf1b14a4ff079c9189361ff4acf245e82b0aa907a1cab9c7c034a64"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.806263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"ba9956fcef6b04c46db28c6ae2b349ec3802723f420d58f0af21b4f46ec7c18c"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.806273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"5c9f5726d07ae41454c108e2c0d2fd019e4139eca228dc8e2ae6c5ddaee5f754"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.806281 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"4340f305b67f1fdb453229a8e50f3723634ae23ea784fed8c8391cf3ca98c768"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.807420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" event={"ID":"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92","Type":"ContainerStarted","Data":"a3c25eb491fc9772df5063d25427ecd83cd060358bb698133efd3c33d5bcaccb"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.809552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" event={"ID":"c11c685c-7088-4b9a-ae6a-6fcf926abf3c","Type":"ContainerDied","Data":"dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf"} Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.809575 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd5990318f172a99e2a6a782e9926009bed5627bed23946a8bfdf5e006b16fbf" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.809601 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-kzxqs" Jan 20 23:04:11 crc kubenswrapper[5030]: I0120 23:04:11.828063 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" podStartSLOduration=2.828046372 podStartE2EDuration="2.828046372s" podCreationTimestamp="2026-01-20 23:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:11.821090463 +0000 UTC m=+1724.141350761" watchObservedRunningTime="2026-01-20 23:04:11.828046372 +0000 UTC m=+1724.148306660" Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.370195 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z694x"] Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.379108 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z694x"] Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"2f419f949223bd3eecb657c2e5db8dc3d6e15172bce4f76b79a2996c7f986fab"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"0228e6e5a2bf8d67091bb924e23d89a3dacb9f0b27fee5bc85fea465f5349cc2"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"6e533722ba2e3c86bb244f1d172c2e5b6d85b47103226081550521e55d75cf57"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"71177140e4344999214a9b119df28b5034a95818f303e43d4d9fda79f212ab31"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 
23:04:12.822543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"46e37ba2eb5235b1620489492fb3a576c004b9a14e7fd13ba34cffba482905bf"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"4b6cf4885689d7d5fb7caeeeabdeec6302123ed28f3ab03a8471fc3d784419c4"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.822567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerStarted","Data":"2a7fd43e4183dd4379e4ba1e71ac29b4bdb0e6c7406d53a86f975b26229f1c3a"} Jan 20 23:04:12 crc kubenswrapper[5030]: I0120 23:04:12.864803 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=12.864785609 podStartE2EDuration="12.864785609s" podCreationTimestamp="2026-01-20 23:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:12.857765428 +0000 UTC m=+1725.178025716" watchObservedRunningTime="2026-01-20 23:04:12.864785609 +0000 UTC m=+1725.185045897" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.017678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:04:13 crc kubenswrapper[5030]: E0120 23:04:13.018075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11c685c-7088-4b9a-ae6a-6fcf926abf3c" containerName="swift-ring-rebalance" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.018100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11c685c-7088-4b9a-ae6a-6fcf926abf3c" containerName="swift-ring-rebalance" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.018316 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11c685c-7088-4b9a-ae6a-6fcf926abf3c" containerName="swift-ring-rebalance" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.019247 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.021270 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.031934 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.108592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64r4m\" (UniqueName: \"kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.108677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.108715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.108753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.210523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64r4m\" (UniqueName: \"kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.210577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.210610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.210663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.211444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.212202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.212719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.234487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64r4m\" (UniqueName: \"kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m\") pod \"dnsmasq-dnsmasq-7d9dfbfb8c-59tpc\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.333715 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.735563 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:04:13 crc kubenswrapper[5030]: W0120 23:04:13.743284 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8831bf4c_47d9_4612_a313_abbb7305ed03.slice/crio-39149dd9ef01dbe393090834ebcbe34246d1401d2ab3b005f2b2674d8d8e91bd WatchSource:0}: Error finding container 39149dd9ef01dbe393090834ebcbe34246d1401d2ab3b005f2b2674d8d8e91bd: Status 404 returned error can't find the container with id 39149dd9ef01dbe393090834ebcbe34246d1401d2ab3b005f2b2674d8d8e91bd Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.833340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" event={"ID":"8831bf4c-47d9-4612-a313-abbb7305ed03","Type":"ContainerStarted","Data":"39149dd9ef01dbe393090834ebcbe34246d1401d2ab3b005f2b2674d8d8e91bd"} Jan 20 23:04:13 crc kubenswrapper[5030]: I0120 23:04:13.984667 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94847cfc-3a91-4bb3-a3f8-c77c642b9fdf" path="/var/lib/kubelet/pods/94847cfc-3a91-4bb3-a3f8-c77c642b9fdf/volumes" Jan 20 23:04:14 crc kubenswrapper[5030]: I0120 23:04:14.846215 5030 generic.go:334] "Generic (PLEG): container finished" podID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerID="f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608" exitCode=0 Jan 20 23:04:14 crc kubenswrapper[5030]: I0120 23:04:14.846284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" event={"ID":"8831bf4c-47d9-4612-a313-abbb7305ed03","Type":"ContainerDied","Data":"f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608"} Jan 20 23:04:15 crc kubenswrapper[5030]: I0120 23:04:15.856815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" event={"ID":"8831bf4c-47d9-4612-a313-abbb7305ed03","Type":"ContainerStarted","Data":"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b"} Jan 20 23:04:15 crc kubenswrapper[5030]: I0120 23:04:15.858271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:15 crc kubenswrapper[5030]: I0120 23:04:15.895514 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" podStartSLOduration=3.89548739 podStartE2EDuration="3.89548739s" podCreationTimestamp="2026-01-20 23:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:15.885931897 +0000 UTC m=+1728.206192195" watchObservedRunningTime="2026-01-20 23:04:15.89548739 +0000 UTC m=+1728.215747718" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.385556 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z7qq5"] Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.387793 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.399182 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.405304 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z7qq5"] Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.482985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.483118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjq7c\" (UniqueName: \"kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.584253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.584374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjq7c\" (UniqueName: \"kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.585262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.611114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjq7c\" (UniqueName: \"kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c\") pod \"root-account-create-update-z7qq5\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.726184 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.776033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.884133 5030 generic.go:334] "Generic (PLEG): container finished" podID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" containerID="a3c25eb491fc9772df5063d25427ecd83cd060358bb698133efd3c33d5bcaccb" exitCode=0 Jan 20 23:04:17 crc kubenswrapper[5030]: I0120 23:04:17.884858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" event={"ID":"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92","Type":"ContainerDied","Data":"a3c25eb491fc9772df5063d25427ecd83cd060358bb698133efd3c33d5bcaccb"} Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.208658 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z7qq5"] Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.896974 5030 generic.go:334] "Generic (PLEG): container finished" podID="77ab4d17-e667-42e0-bbc4-d0802a254b6d" containerID="81069aac51082cb84904c36eea24a6a7c42b1204c1fb2deb3fbd811976d06a4f" exitCode=0 Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.897110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" event={"ID":"77ab4d17-e667-42e0-bbc4-d0802a254b6d","Type":"ContainerDied","Data":"81069aac51082cb84904c36eea24a6a7c42b1204c1fb2deb3fbd811976d06a4f"} Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.897194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" event={"ID":"77ab4d17-e667-42e0-bbc4-d0802a254b6d","Type":"ContainerStarted","Data":"68a504064272969136fbf731d0ba9d6ec167eb9a6f63a0b9e60db56c51122ddb"} Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.902287 5030 generic.go:334] "Generic (PLEG): container finished" podID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerID="cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a" exitCode=0 Jan 20 23:04:18 crc kubenswrapper[5030]: I0120 23:04:18.902327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerDied","Data":"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a"} Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.326661 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.417790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqq98\" (UniqueName: \"kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98\") pod \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.417917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle\") pod \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.417963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data\") pod \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.418000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data\") pod \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\" (UID: \"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92\") " Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.422551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98" (OuterVolumeSpecName: "kube-api-access-lqq98") pod "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" (UID: "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92"). InnerVolumeSpecName "kube-api-access-lqq98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.423287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" (UID: "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.445249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" (UID: "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.484082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data" (OuterVolumeSpecName: "config-data") pod "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" (UID: "5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.519851 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqq98\" (UniqueName: \"kubernetes.io/projected/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-kube-api-access-lqq98\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.519894 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.519907 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.519918 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.915276 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.915282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" event={"ID":"5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92","Type":"ContainerDied","Data":"f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db"} Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.915674 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77ea467c22298303b3739bba2ec37c7b25a9e82458565092cb191a65147a5db" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.917591 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ede01a6-9221-4276-b537-212e5da27f5c" containerID="ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79" exitCode=0 Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.917683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerDied","Data":"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79"} Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.920562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerStarted","Data":"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b"} Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.920944 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:04:19 crc kubenswrapper[5030]: I0120 23:04:19.998532 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.998515685 podStartE2EDuration="36.998515685s" podCreationTimestamp="2026-01-20 23:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:19.977572624 +0000 UTC m=+1732.297832912" watchObservedRunningTime="2026-01-20 23:04:19.998515685 +0000 UTC m=+1732.318775973" Jan 20 23:04:20 crc 
kubenswrapper[5030]: I0120 23:04:20.315635 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.433250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjq7c\" (UniqueName: \"kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c\") pod \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.433457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts\") pod \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\" (UID: \"77ab4d17-e667-42e0-bbc4-d0802a254b6d\") " Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.434090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77ab4d17-e667-42e0-bbc4-d0802a254b6d" (UID: "77ab4d17-e667-42e0-bbc4-d0802a254b6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.439488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c" (OuterVolumeSpecName: "kube-api-access-xjq7c") pod "77ab4d17-e667-42e0-bbc4-d0802a254b6d" (UID: "77ab4d17-e667-42e0-bbc4-d0802a254b6d"). InnerVolumeSpecName "kube-api-access-xjq7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.535335 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjq7c\" (UniqueName: \"kubernetes.io/projected/77ab4d17-e667-42e0-bbc4-d0802a254b6d-kube-api-access-xjq7c\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.535365 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77ab4d17-e667-42e0-bbc4-d0802a254b6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.928982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" event={"ID":"77ab4d17-e667-42e0-bbc4-d0802a254b6d","Type":"ContainerDied","Data":"68a504064272969136fbf731d0ba9d6ec167eb9a6f63a0b9e60db56c51122ddb"} Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.929034 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68a504064272969136fbf731d0ba9d6ec167eb9a6f63a0b9e60db56c51122ddb" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.929067 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-z7qq5" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.932568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerStarted","Data":"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4"} Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.932937 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.962297 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:04:20 crc kubenswrapper[5030]: E0120 23:04:20.962528 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:04:20 crc kubenswrapper[5030]: I0120 23:04:20.969844 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.969814096 podStartE2EDuration="36.969814096s" podCreationTimestamp="2026-01-20 23:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:20.959404422 +0000 UTC m=+1733.279664720" watchObservedRunningTime="2026-01-20 23:04:20.969814096 +0000 UTC m=+1733.290074424" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.335515 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.403634 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.403845 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="dnsmasq-dns" containerID="cri-o://edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b" gracePeriod=10 Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.847761 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.957377 5030 generic.go:334] "Generic (PLEG): container finished" podID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerID="edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b" exitCode=0 Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.957442 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.957430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" event={"ID":"f6db0c18-060b-4be2-9a11-3df481a3abca","Type":"ContainerDied","Data":"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b"} Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.957796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9" event={"ID":"f6db0c18-060b-4be2-9a11-3df481a3abca","Type":"ContainerDied","Data":"4af79e0461ca284398575e67a2d268a9da3cf44ec5315b0cf25a34c1a07995cd"} Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.957822 5030 scope.go:117] "RemoveContainer" containerID="edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.981042 5030 scope.go:117] "RemoveContainer" containerID="91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a" Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.986380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config\") pod \"f6db0c18-060b-4be2-9a11-3df481a3abca\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.986502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75ff\" (UniqueName: \"kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff\") pod \"f6db0c18-060b-4be2-9a11-3df481a3abca\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.986544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc\") pod \"f6db0c18-060b-4be2-9a11-3df481a3abca\" (UID: \"f6db0c18-060b-4be2-9a11-3df481a3abca\") " Jan 20 23:04:23 crc kubenswrapper[5030]: I0120 23:04:23.991215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff" (OuterVolumeSpecName: "kube-api-access-j75ff") pod "f6db0c18-060b-4be2-9a11-3df481a3abca" (UID: "f6db0c18-060b-4be2-9a11-3df481a3abca"). InnerVolumeSpecName "kube-api-access-j75ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.001444 5030 scope.go:117] "RemoveContainer" containerID="edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b" Jan 20 23:04:24 crc kubenswrapper[5030]: E0120 23:04:24.002091 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b\": container with ID starting with edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b not found: ID does not exist" containerID="edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.002126 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b"} err="failed to get container status \"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b\": rpc error: code = NotFound desc = could not find container \"edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b\": container with ID starting with edd6221aa1184740bf7b1b82a394a6ad535840a9387fd6d10f9040803703f18b not found: ID does not exist" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.002147 5030 scope.go:117] "RemoveContainer" containerID="91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a" Jan 20 23:04:24 crc kubenswrapper[5030]: E0120 23:04:24.002458 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a\": container with ID starting with 91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a not found: ID does not exist" containerID="91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.002510 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a"} err="failed to get container status \"91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a\": rpc error: code = NotFound desc = could not find container \"91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a\": container with ID starting with 91e6a8963318ec7bd0552820186fd8be5e5d5bc9103782aebeffb44f5f7fe32a not found: ID does not exist" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.020182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config" (OuterVolumeSpecName: "config") pod "f6db0c18-060b-4be2-9a11-3df481a3abca" (UID: "f6db0c18-060b-4be2-9a11-3df481a3abca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.037067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f6db0c18-060b-4be2-9a11-3df481a3abca" (UID: "f6db0c18-060b-4be2-9a11-3df481a3abca"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.088685 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75ff\" (UniqueName: \"kubernetes.io/projected/f6db0c18-060b-4be2-9a11-3df481a3abca-kube-api-access-j75ff\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.088713 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.088723 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db0c18-060b-4be2-9a11-3df481a3abca-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.296733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:04:24 crc kubenswrapper[5030]: I0120 23:04:24.300796 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-z22p9"] Jan 20 23:04:25 crc kubenswrapper[5030]: I0120 23:04:25.982166 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" path="/var/lib/kubelet/pods/f6db0c18-060b-4be2-9a11-3df481a3abca/volumes" Jan 20 23:04:34 crc kubenswrapper[5030]: I0120 23:04:34.887909 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:04:35 crc kubenswrapper[5030]: I0120 23:04:35.962222 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:04:35 crc kubenswrapper[5030]: E0120 23:04:35.962710 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.026836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.982482 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-krpzk"] Jan 20 23:04:36 crc kubenswrapper[5030]: E0120 23:04:36.983177 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ab4d17-e667-42e0-bbc4-d0802a254b6d" containerName="mariadb-account-create-update" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983195 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ab4d17-e667-42e0-bbc4-d0802a254b6d" containerName="mariadb-account-create-update" Jan 20 23:04:36 crc kubenswrapper[5030]: E0120 23:04:36.983217 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="dnsmasq-dns" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983227 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="dnsmasq-dns" Jan 20 23:04:36 crc kubenswrapper[5030]: E0120 23:04:36.983243 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="init" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983251 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="init" Jan 20 23:04:36 crc kubenswrapper[5030]: E0120 23:04:36.983277 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" containerName="glance-db-sync" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" containerName="glance-db-sync" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983466 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ab4d17-e667-42e0-bbc4-d0802a254b6d" containerName="mariadb-account-create-update" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" containerName="glance-db-sync" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.983513 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6db0c18-060b-4be2-9a11-3df481a3abca" containerName="dnsmasq-dns" Jan 20 23:04:36 crc kubenswrapper[5030]: I0120 23:04:36.984145 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.000060 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-krpzk"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.008498 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-n5h5x"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.009554 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.035912 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-n5h5x"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.113205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9cb\" (UniqueName: \"kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.113259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.113359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gr6l\" (UniqueName: \"kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.113378 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.115261 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kgl2n"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.116287 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.122937 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-58cfg"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.123850 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.127199 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.136313 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kgl2n"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.161265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-58cfg"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.206366 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.207320 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.209492 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrz2s\" (UniqueName: \"kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gr6l\" (UniqueName: \"kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.214989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9cb\" (UniqueName: \"kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.215011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpksj\" (UniqueName: \"kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.215048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.215498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.216031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.240068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gr6l\" (UniqueName: \"kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l\") pod \"neutron-db-create-n5h5x\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.241255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9cb\" (UniqueName: \"kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb\") pod \"cinder-db-create-krpzk\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.302925 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-mrxvh"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.304970 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.309865 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.310094 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.310352 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2vb6q" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.310512 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrz2s\" (UniqueName: \"kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlths\" (UniqueName: \"kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322521 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpksj\" (UniqueName: \"kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.322591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.323832 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.324523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.325399 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.331072 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.334801 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-mrxvh"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.344833 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.346789 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.353577 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.357199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.361079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrz2s\" (UniqueName: \"kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s\") pod \"neutron-b757-account-create-update-58cfg\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.396481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpksj\" (UniqueName: \"kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj\") pod \"barbican-db-create-kgl2n\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlths\" (UniqueName: \"kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: 
\"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427587 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlcc\" (UniqueName: \"kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.427758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl96s\" (UniqueName: \"kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.428426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.443426 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.443554 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlths\" (UniqueName: \"kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths\") pod \"cinder-0e14-account-create-update-kxft9\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.452579 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.522088 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.530502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlcc\" (UniqueName: \"kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.530570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl96s\" (UniqueName: \"kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.530609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.530679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.530722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.531307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.538802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.538865 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc 
kubenswrapper[5030]: I0120 23:04:37.555352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl96s\" (UniqueName: \"kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s\") pod \"barbican-a33f-account-create-update-k7gcm\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.557941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlcc\" (UniqueName: \"kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc\") pod \"keystone-db-sync-mrxvh\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.632454 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.770336 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.857034 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-n5h5x"] Jan 20 23:04:37 crc kubenswrapper[5030]: I0120 23:04:37.986250 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-krpzk"] Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.063697 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-58cfg"] Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.077892 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9"] Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.096182 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kgl2n"] Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.121885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" event={"ID":"a437decd-da76-4967-80ac-674e94cfed08","Type":"ContainerStarted","Data":"b8f644080eb7594aa2963e939ced6662de5131ceebed9d2d123ef475ce95217e"} Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.124605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-krpzk" event={"ID":"867dbc1d-4b5e-4803-b3ee-aabaf77f4282","Type":"ContainerStarted","Data":"3c211bacd2950b1707e65acf02865a3f305188577e5d70fb71094d65b629113e"} Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.127342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" event={"ID":"25712fb0-ffff-4189-a2a8-f6f0e88778ff","Type":"ContainerStarted","Data":"1f0965177f09b885701678b3a85480e90b7149337fdeb135ff36eef3c364719d"} Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.127372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" event={"ID":"25712fb0-ffff-4189-a2a8-f6f0e88778ff","Type":"ContainerStarted","Data":"3d067a34fe4b4b64603ad4f9e81706a561b5f4705b768bd5c92ddeb71b219cc2"} Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.128594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" event={"ID":"e0e36085-7754-49ab-b3c8-7b3b810aff4a","Type":"ContainerStarted","Data":"e67b9201fdd7d264fe42ee75278f968e52951b0aded9c4688264f291c614327c"} Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.146243 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" podStartSLOduration=2.146228394 podStartE2EDuration="2.146228394s" podCreationTimestamp="2026-01-20 23:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:38.143371355 +0000 UTC m=+1750.463631643" watchObservedRunningTime="2026-01-20 23:04:38.146228394 +0000 UTC m=+1750.466488682" Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.207140 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-mrxvh"] Jan 20 23:04:38 crc kubenswrapper[5030]: I0120 23:04:38.324397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm"] Jan 20 23:04:38 crc kubenswrapper[5030]: W0120 23:04:38.332671 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d3e6560_7767_4dcf_a6b7_c0a510179c8c.slice/crio-f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49 WatchSource:0}: Error finding container f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49: Status 404 returned error can't find the container with id f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.140752 5030 generic.go:334] "Generic (PLEG): container finished" podID="867dbc1d-4b5e-4803-b3ee-aabaf77f4282" containerID="b81ff34b6eebd5990d479504e7a8389fc6a76b12390c4a30c7a2cb9b553d3af3" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.140818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-krpzk" event={"ID":"867dbc1d-4b5e-4803-b3ee-aabaf77f4282","Type":"ContainerDied","Data":"b81ff34b6eebd5990d479504e7a8389fc6a76b12390c4a30c7a2cb9b553d3af3"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.143991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" event={"ID":"72208017-a92b-44ff-9f45-bafc84173dfd","Type":"ContainerStarted","Data":"e5e0bba9966895f6218767c5716dcc0df0126ba0c296f81b6f368434a9473f17"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.144019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" event={"ID":"72208017-a92b-44ff-9f45-bafc84173dfd","Type":"ContainerStarted","Data":"494c8383b060dad1632d0a0e9ca6d1d7edca3638adda41f387f57f1a686e9bbf"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.145795 5030 generic.go:334] "Generic (PLEG): container finished" podID="25712fb0-ffff-4189-a2a8-f6f0e88778ff" containerID="1f0965177f09b885701678b3a85480e90b7149337fdeb135ff36eef3c364719d" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.145846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" event={"ID":"25712fb0-ffff-4189-a2a8-f6f0e88778ff","Type":"ContainerDied","Data":"1f0965177f09b885701678b3a85480e90b7149337fdeb135ff36eef3c364719d"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.147049 
5030 generic.go:334] "Generic (PLEG): container finished" podID="e0e36085-7754-49ab-b3c8-7b3b810aff4a" containerID="a088a2c71d5376f6316b4b1e10827836e559e903b68e3cf49a068c14fa399de3" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.147092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" event={"ID":"e0e36085-7754-49ab-b3c8-7b3b810aff4a","Type":"ContainerDied","Data":"a088a2c71d5376f6316b4b1e10827836e559e903b68e3cf49a068c14fa399de3"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.148334 5030 generic.go:334] "Generic (PLEG): container finished" podID="2d3e6560-7767-4dcf-a6b7-c0a510179c8c" containerID="8e081848c548e341bad0e73dab25932b18c37499573198982675c824eb766cce" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.148407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" event={"ID":"2d3e6560-7767-4dcf-a6b7-c0a510179c8c","Type":"ContainerDied","Data":"8e081848c548e341bad0e73dab25932b18c37499573198982675c824eb766cce"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.148429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" event={"ID":"2d3e6560-7767-4dcf-a6b7-c0a510179c8c","Type":"ContainerStarted","Data":"f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.151199 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" containerID="3edb379847341a6e47f821c96e48da36c12ae72e7fbf28a90433bd533216077a" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.151232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" event={"ID":"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf","Type":"ContainerDied","Data":"3edb379847341a6e47f821c96e48da36c12ae72e7fbf28a90433bd533216077a"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.151264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" event={"ID":"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf","Type":"ContainerStarted","Data":"38ff3b47dc972208bde58adda40023788c2c4dcc179c9bdb05b5e0c49d1fe500"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.155846 5030 generic.go:334] "Generic (PLEG): container finished" podID="a437decd-da76-4967-80ac-674e94cfed08" containerID="1dd447d544e7681d6930f430646e45366ea16afd38076d543d31132350e8193c" exitCode=0 Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.155882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" event={"ID":"a437decd-da76-4967-80ac-674e94cfed08","Type":"ContainerDied","Data":"1dd447d544e7681d6930f430646e45366ea16afd38076d543d31132350e8193c"} Jan 20 23:04:39 crc kubenswrapper[5030]: I0120 23:04:39.286104 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" podStartSLOduration=2.285619021 podStartE2EDuration="2.285619021s" podCreationTimestamp="2026-01-20 23:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:39.280989118 +0000 UTC m=+1751.601249406" watchObservedRunningTime="2026-01-20 23:04:39.285619021 +0000 UTC m=+1751.605879309" Jan 20 23:04:40 crc 
kubenswrapper[5030]: I0120 23:04:40.548526 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.603155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts\") pod \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.603277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw9cb\" (UniqueName: \"kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb\") pod \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\" (UID: \"867dbc1d-4b5e-4803-b3ee-aabaf77f4282\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.604213 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "867dbc1d-4b5e-4803-b3ee-aabaf77f4282" (UID: "867dbc1d-4b5e-4803-b3ee-aabaf77f4282"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.618457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb" (OuterVolumeSpecName: "kube-api-access-zw9cb") pod "867dbc1d-4b5e-4803-b3ee-aabaf77f4282" (UID: "867dbc1d-4b5e-4803-b3ee-aabaf77f4282"). InnerVolumeSpecName "kube-api-access-zw9cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.704533 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw9cb\" (UniqueName: \"kubernetes.io/projected/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-kube-api-access-zw9cb\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.704568 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/867dbc1d-4b5e-4803-b3ee-aabaf77f4282-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.741297 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.745730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.758659 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.769612 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.790144 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.808216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts\") pod \"a437decd-da76-4967-80ac-674e94cfed08\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.808299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlths\" (UniqueName: \"kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths\") pod \"a437decd-da76-4967-80ac-674e94cfed08\" (UID: \"a437decd-da76-4967-80ac-674e94cfed08\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.808484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrz2s\" (UniqueName: \"kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s\") pod \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.808519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts\") pod \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\" (UID: \"e0e36085-7754-49ab-b3c8-7b3b810aff4a\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.809239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a437decd-da76-4967-80ac-674e94cfed08" (UID: "a437decd-da76-4967-80ac-674e94cfed08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.810573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0e36085-7754-49ab-b3c8-7b3b810aff4a" (UID: "e0e36085-7754-49ab-b3c8-7b3b810aff4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.822476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths" (OuterVolumeSpecName: "kube-api-access-tlths") pod "a437decd-da76-4967-80ac-674e94cfed08" (UID: "a437decd-da76-4967-80ac-674e94cfed08"). InnerVolumeSpecName "kube-api-access-tlths". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.826974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s" (OuterVolumeSpecName: "kube-api-access-qrz2s") pod "e0e36085-7754-49ab-b3c8-7b3b810aff4a" (UID: "e0e36085-7754-49ab-b3c8-7b3b810aff4a"). InnerVolumeSpecName "kube-api-access-qrz2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpksj\" (UniqueName: \"kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj\") pod \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts\") pod \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts\") pod \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\" (UID: \"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gr6l\" (UniqueName: \"kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l\") pod \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\" (UID: \"25712fb0-ffff-4189-a2a8-f6f0e88778ff\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910552 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts\") pod \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.910665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl96s\" (UniqueName: \"kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s\") pod \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\" (UID: \"2d3e6560-7767-4dcf-a6b7-c0a510179c8c\") " Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25712fb0-ffff-4189-a2a8-f6f0e88778ff" (UID: "25712fb0-ffff-4189-a2a8-f6f0e88778ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" (UID: "6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911383 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d3e6560-7767-4dcf-a6b7-c0a510179c8c" (UID: "2d3e6560-7767-4dcf-a6b7-c0a510179c8c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911763 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911791 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrz2s\" (UniqueName: \"kubernetes.io/projected/e0e36085-7754-49ab-b3c8-7b3b810aff4a-kube-api-access-qrz2s\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911805 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e36085-7754-49ab-b3c8-7b3b810aff4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911818 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a437decd-da76-4967-80ac-674e94cfed08-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911831 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlths\" (UniqueName: \"kubernetes.io/projected/a437decd-da76-4967-80ac-674e94cfed08-kube-api-access-tlths\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911843 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25712fb0-ffff-4189-a2a8-f6f0e88778ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.911856 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.913233 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj" (OuterVolumeSpecName: "kube-api-access-hpksj") pod "6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" (UID: "6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf"). InnerVolumeSpecName "kube-api-access-hpksj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.913722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s" (OuterVolumeSpecName: "kube-api-access-sl96s") pod "2d3e6560-7767-4dcf-a6b7-c0a510179c8c" (UID: "2d3e6560-7767-4dcf-a6b7-c0a510179c8c"). InnerVolumeSpecName "kube-api-access-sl96s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:40 crc kubenswrapper[5030]: I0120 23:04:40.913994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l" (OuterVolumeSpecName: "kube-api-access-8gr6l") pod "25712fb0-ffff-4189-a2a8-f6f0e88778ff" (UID: "25712fb0-ffff-4189-a2a8-f6f0e88778ff"). InnerVolumeSpecName "kube-api-access-8gr6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.013863 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gr6l\" (UniqueName: \"kubernetes.io/projected/25712fb0-ffff-4189-a2a8-f6f0e88778ff-kube-api-access-8gr6l\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.013900 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl96s\" (UniqueName: \"kubernetes.io/projected/2d3e6560-7767-4dcf-a6b7-c0a510179c8c-kube-api-access-sl96s\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.013909 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpksj\" (UniqueName: \"kubernetes.io/projected/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf-kube-api-access-hpksj\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.177683 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.177677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-n5h5x" event={"ID":"25712fb0-ffff-4189-a2a8-f6f0e88778ff","Type":"ContainerDied","Data":"3d067a34fe4b4b64603ad4f9e81706a561b5f4705b768bd5c92ddeb71b219cc2"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.177884 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d067a34fe4b4b64603ad4f9e81706a561b5f4705b768bd5c92ddeb71b219cc2" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.180157 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.180972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b757-account-create-update-58cfg" event={"ID":"e0e36085-7754-49ab-b3c8-7b3b810aff4a","Type":"ContainerDied","Data":"e67b9201fdd7d264fe42ee75278f968e52951b0aded9c4688264f291c614327c"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.181053 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e67b9201fdd7d264fe42ee75278f968e52951b0aded9c4688264f291c614327c" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.183567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" event={"ID":"2d3e6560-7767-4dcf-a6b7-c0a510179c8c","Type":"ContainerDied","Data":"f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.183613 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e7ff955038312a1da3c4ebd436a71128c785220f572273ae596e5bd0292d49" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.183724 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.187832 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" event={"ID":"6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf","Type":"ContainerDied","Data":"38ff3b47dc972208bde58adda40023788c2c4dcc179c9bdb05b5e0c49d1fe500"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.187911 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ff3b47dc972208bde58adda40023788c2c4dcc179c9bdb05b5e0c49d1fe500" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.188031 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kgl2n" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.191606 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.191591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9" event={"ID":"a437decd-da76-4967-80ac-674e94cfed08","Type":"ContainerDied","Data":"b8f644080eb7594aa2963e939ced6662de5131ceebed9d2d123ef475ce95217e"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.192143 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f644080eb7594aa2963e939ced6662de5131ceebed9d2d123ef475ce95217e" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.193720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-krpzk" event={"ID":"867dbc1d-4b5e-4803-b3ee-aabaf77f4282","Type":"ContainerDied","Data":"3c211bacd2950b1707e65acf02865a3f305188577e5d70fb71094d65b629113e"} Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.193748 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c211bacd2950b1707e65acf02865a3f305188577e5d70fb71094d65b629113e" Jan 20 23:04:41 crc kubenswrapper[5030]: I0120 23:04:41.193820 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-krpzk" Jan 20 23:04:42 crc kubenswrapper[5030]: I0120 23:04:42.205543 5030 generic.go:334] "Generic (PLEG): container finished" podID="72208017-a92b-44ff-9f45-bafc84173dfd" containerID="e5e0bba9966895f6218767c5716dcc0df0126ba0c296f81b6f368434a9473f17" exitCode=0 Jan 20 23:04:42 crc kubenswrapper[5030]: I0120 23:04:42.205589 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" event={"ID":"72208017-a92b-44ff-9f45-bafc84173dfd","Type":"ContainerDied","Data":"e5e0bba9966895f6218767c5716dcc0df0126ba0c296f81b6f368434a9473f17"} Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.615684 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.764443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxlcc\" (UniqueName: \"kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc\") pod \"72208017-a92b-44ff-9f45-bafc84173dfd\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.764584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle\") pod \"72208017-a92b-44ff-9f45-bafc84173dfd\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.765986 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data\") pod \"72208017-a92b-44ff-9f45-bafc84173dfd\" (UID: \"72208017-a92b-44ff-9f45-bafc84173dfd\") " Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.775685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc" (OuterVolumeSpecName: "kube-api-access-mxlcc") pod "72208017-a92b-44ff-9f45-bafc84173dfd" (UID: "72208017-a92b-44ff-9f45-bafc84173dfd"). InnerVolumeSpecName "kube-api-access-mxlcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.825088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72208017-a92b-44ff-9f45-bafc84173dfd" (UID: "72208017-a92b-44ff-9f45-bafc84173dfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.827342 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data" (OuterVolumeSpecName: "config-data") pod "72208017-a92b-44ff-9f45-bafc84173dfd" (UID: "72208017-a92b-44ff-9f45-bafc84173dfd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.868231 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.868265 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxlcc\" (UniqueName: \"kubernetes.io/projected/72208017-a92b-44ff-9f45-bafc84173dfd-kube-api-access-mxlcc\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:43 crc kubenswrapper[5030]: I0120 23:04:43.868278 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72208017-a92b-44ff-9f45-bafc84173dfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.228062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" event={"ID":"72208017-a92b-44ff-9f45-bafc84173dfd","Type":"ContainerDied","Data":"494c8383b060dad1632d0a0e9ca6d1d7edca3638adda41f387f57f1a686e9bbf"} Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.228113 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="494c8383b060dad1632d0a0e9ca6d1d7edca3638adda41f387f57f1a686e9bbf" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.228120 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-mrxvh" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.438551 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rdd9j"] Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439000 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3e6560-7767-4dcf-a6b7-c0a510179c8c" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439021 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3e6560-7767-4dcf-a6b7-c0a510179c8c" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439041 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72208017-a92b-44ff-9f45-bafc84173dfd" containerName="keystone-db-sync" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439050 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="72208017-a92b-44ff-9f45-bafc84173dfd" containerName="keystone-db-sync" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439074 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867dbc1d-4b5e-4803-b3ee-aabaf77f4282" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439085 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="867dbc1d-4b5e-4803-b3ee-aabaf77f4282" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439103 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25712fb0-ffff-4189-a2a8-f6f0e88778ff" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439111 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="25712fb0-ffff-4189-a2a8-f6f0e88778ff" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439121 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439129 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439141 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a437decd-da76-4967-80ac-674e94cfed08" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439149 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a437decd-da76-4967-80ac-674e94cfed08" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: E0120 23:04:44.439164 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e36085-7754-49ab-b3c8-7b3b810aff4a" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439172 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e36085-7754-49ab-b3c8-7b3b810aff4a" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439347 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="72208017-a92b-44ff-9f45-bafc84173dfd" containerName="keystone-db-sync" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439365 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="25712fb0-ffff-4189-a2a8-f6f0e88778ff" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439383 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a437decd-da76-4967-80ac-674e94cfed08" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439396 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e36085-7754-49ab-b3c8-7b3b810aff4a" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439409 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="867dbc1d-4b5e-4803-b3ee-aabaf77f4282" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" containerName="mariadb-database-create" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.439441 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3e6560-7767-4dcf-a6b7-c0a510179c8c" containerName="mariadb-account-create-update" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.440060 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.443602 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.444000 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.444171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.444503 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.444721 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2vb6q" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.481992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rdd9j"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579754 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.579866 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmqg\" (UniqueName: \"kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.606573 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.608477 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.611769 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.611942 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94zfx\" (UniqueName: \"kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681690 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmqg\" (UniqueName: \"kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.681819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.685042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.688252 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.689037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.689537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc 
kubenswrapper[5030]: I0120 23:04:44.689848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.696154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.707284 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-wflcw"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.708350 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.715955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-5jblq" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.716147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.716258 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.724919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-wflcw"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.743666 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-jpzls"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.743885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmqg\" (UniqueName: \"kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg\") pod \"keystone-bootstrap-rdd9j\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.744690 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.750154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.750184 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-896tn" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.750663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.764347 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-jpzls"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.764664 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.777683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-grk67"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.778609 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.780579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-2d6zz" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.781000 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.781129 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.782809 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.782882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94zfx\" (UniqueName: \"kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.782912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.782939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxvf\" (UniqueName: \"kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.782957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.783008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.783030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") pod \"ceilometer-0\" (UID: 
\"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.783047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.783071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.783088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.792144 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.797147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.797671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.807308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.809854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.825398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.828742 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-grk67"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 
23:04:44.829397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94zfx\" (UniqueName: \"kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx\") pod \"ceilometer-0\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kr7g\" (UniqueName: \"kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw47k\" (UniqueName: \"kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" 
Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxvf\" (UniqueName: \"kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.889487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.898275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.905622 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.914778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxvf\" (UniqueName: \"kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf\") pod \"neutron-db-sync-wflcw\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:44 crc 
kubenswrapper[5030]: I0120 23:04:44.923364 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9dztn"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.923585 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.926271 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.931977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.932156 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-s9xtv" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.952045 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9dztn"] Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw47k\" (UniqueName: \"kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993439 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993817 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kr7g\" (UniqueName: \"kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.993899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.994007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.994294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:44 crc kubenswrapper[5030]: I0120 23:04:44.994759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.001032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.004923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.006037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") 
" pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.009124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.009398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.009563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.010981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw47k\" (UniqueName: \"kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k\") pod \"cinder-db-sync-jpzls\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.012207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kr7g\" (UniqueName: \"kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.023362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle\") pod \"placement-db-sync-grk67\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.095491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.095601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.095640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5wh5\" (UniqueName: \"kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " 
pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.192505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.197489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.197578 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.197607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5wh5\" (UniqueName: \"kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.201229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.202822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.206182 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.213333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5wh5\" (UniqueName: \"kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5\") pod \"barbican-db-sync-9dztn\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.214302 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.257452 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.337521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rdd9j"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.441682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:45 crc kubenswrapper[5030]: W0120 23:04:45.455858 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdf2496_d5b8_406d_8693_6e39c33950de.slice/crio-09edff785978bab73a5f509ed8bcf3271a792aa4777d1e9c59ee5fbe20961b9a WatchSource:0}: Error finding container 09edff785978bab73a5f509ed8bcf3271a792aa4777d1e9c59ee5fbe20961b9a: Status 404 returned error can't find the container with id 09edff785978bab73a5f509ed8bcf3271a792aa4777d1e9c59ee5fbe20961b9a Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.564902 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.566574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.568827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.569492 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.569719 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-g4857" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.569858 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.576720 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.637156 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.638503 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.645046 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.645225 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.653715 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.702886 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-wflcw"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdxj\" (UniqueName: \"kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.711569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftg4\" (UniqueName: \"kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc 
kubenswrapper[5030]: I0120 23:04:45.711594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.796113 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-jpzls"] Jan 20 23:04:45 crc kubenswrapper[5030]: W0120 23:04:45.808450 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1382233_4692_4fff_8586_47c406224b47.slice/crio-4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010 WatchSource:0}: Error finding container 4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010: Status 404 returned error can't find the container with id 4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010 Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.809059 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-grk67"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dftg4\" (UniqueName: \"kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdxj\" (UniqueName: \"kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.815983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.816317 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.816345 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.817364 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9dztn"] Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.817885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.821951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.822993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.823448 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.826300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.827216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.827943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.828172 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.829482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.830103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.855297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftg4\" (UniqueName: \"kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.856156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdxj\" (UniqueName: \"kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.859959 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.860014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.885084 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:45 crc kubenswrapper[5030]: I0120 23:04:45.955750 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.268541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" event={"ID":"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7","Type":"ContainerStarted","Data":"c23637e20582b8309584063ddf23d6a0262a9a279ee34e0a1943487699e39324"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.271047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-grk67" event={"ID":"b1382233-4692-4fff-8586-47c406224b47","Type":"ContainerStarted","Data":"e632d16da87f475682232fd9152c12b9f4e4bef74c0982e287d33cc4f3c9ba3b"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.271069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-grk67" event={"ID":"b1382233-4692-4fff-8586-47c406224b47","Type":"ContainerStarted","Data":"4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.287818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" event={"ID":"18897247-c232-43c2-9923-86f0391699ab","Type":"ContainerStarted","Data":"fea1372c596cefb82e941decc5c1f9e70bb6ceb4faaae0afec71eaec02f7aaf9"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.287863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" event={"ID":"18897247-c232-43c2-9923-86f0391699ab","Type":"ContainerStarted","Data":"5a6285a7848d0a0df85f8f1b408c99e290992f1b563763bfa499432d2dd48fc3"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.295703 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-grk67" podStartSLOduration=2.295658172 podStartE2EDuration="2.295658172s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:46.285507965 +0000 UTC m=+1758.605768253" watchObservedRunningTime="2026-01-20 23:04:46.295658172 +0000 UTC m=+1758.615918460" Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.307653 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerStarted","Data":"09edff785978bab73a5f509ed8bcf3271a792aa4777d1e9c59ee5fbe20961b9a"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.309411 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" podStartSLOduration=2.309392497 podStartE2EDuration="2.309392497s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:46.305814001 +0000 UTC m=+1758.626074289" watchObservedRunningTime="2026-01-20 23:04:46.309392497 +0000 UTC m=+1758.629652795" Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.319688 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" event={"ID":"c1bb61bc-fb1d-463c-b1e8-8949e01ca893","Type":"ContainerStarted","Data":"cc33277d470448158fbe92e67fa94ac0abfb328a0c0c4b05536c09660bb8ac02"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.319767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" event={"ID":"c1bb61bc-fb1d-463c-b1e8-8949e01ca893","Type":"ContainerStarted","Data":"a6c68174c433bdb6cfc06141a30e194b92cc76c7e915168b8a27d3fbad7ddca6"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.328114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" event={"ID":"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7","Type":"ContainerStarted","Data":"feb1a638c84cf3386e550d8bc38c115b9e45a247fbc73b072ff09a3993cb8acb"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.328164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" event={"ID":"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7","Type":"ContainerStarted","Data":"b9c0e2c541011d086695aada757acada2cec792829240644d82c751f38a38115"} Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.350253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.351682 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" podStartSLOduration=2.351672877 podStartE2EDuration="2.351672877s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:46.338892846 +0000 UTC m=+1758.659153134" watchObservedRunningTime="2026-01-20 23:04:46.351672877 +0000 UTC m=+1758.671933165" Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.377846 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" podStartSLOduration=2.377820104 podStartE2EDuration="2.377820104s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:46.35547673 +0000 UTC m=+1758.675737018" watchObservedRunningTime="2026-01-20 23:04:46.377820104 +0000 UTC m=+1758.698080392" Jan 20 23:04:46 crc kubenswrapper[5030]: I0120 23:04:46.461223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:46 crc 
kubenswrapper[5030]: I0120 23:04:46.911268 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.014904 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.064120 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.384877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerStarted","Data":"51366bf435f3a024269cbee1e6072ae8753473b50ad2302bea9eee39258dcdd1"} Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.411471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerStarted","Data":"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb"} Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.411525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerStarted","Data":"974ee5751c859e7f9f89de2de27a3753379e77f618952d9b75e0dcd899c0f38c"} Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.420292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" event={"ID":"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7","Type":"ContainerStarted","Data":"7c053da04f9e763cce1ed24ccb24b42f76ae096e7227f61b45fb9840d0d097ad"} Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.427148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerStarted","Data":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} Jan 20 23:04:47 crc kubenswrapper[5030]: I0120 23:04:47.452994 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" podStartSLOduration=3.452976286 podStartE2EDuration="3.452976286s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:47.445895094 +0000 UTC m=+1759.766155382" watchObservedRunningTime="2026-01-20 23:04:47.452976286 +0000 UTC m=+1759.773236574" Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.440066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerStarted","Data":"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d"} Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.440163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-log" containerID="cri-o://57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" gracePeriod=30 Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.440218 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-httpd" containerID="cri-o://12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" gracePeriod=30 Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.443718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerStarted","Data":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.446537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerStarted","Data":"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd"} Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.446561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerStarted","Data":"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b"} Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.446678 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-log" containerID="cri-o://579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" gracePeriod=30 Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.446778 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-httpd" containerID="cri-o://e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" gracePeriod=30 Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.471244 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.471225002 podStartE2EDuration="4.471225002s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:48.465265097 +0000 UTC m=+1760.785525405" watchObservedRunningTime="2026-01-20 23:04:48.471225002 +0000 UTC m=+1760.791485290" Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.520137 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.520113443 podStartE2EDuration="4.520113443s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:48.497217285 +0000 UTC m=+1760.817477573" watchObservedRunningTime="2026-01-20 23:04:48.520113443 +0000 UTC m=+1760.840373731" Jan 20 23:04:48 crc kubenswrapper[5030]: I0120 23:04:48.955905 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.046884 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.086882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wdxj\" (UniqueName: \"kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.086946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.086978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.087005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.087062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.087146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.087166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.087190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data\") pod \"427db03c-2d7b-444c-8bca-933cba12fc4c\" (UID: \"427db03c-2d7b-444c-8bca-933cba12fc4c\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.092020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.092241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs" (OuterVolumeSpecName: "logs") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.093506 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.095337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj" (OuterVolumeSpecName: "kube-api-access-5wdxj") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "kube-api-access-5wdxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.096015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts" (OuterVolumeSpecName: "scripts") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.122870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.137136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data" (OuterVolumeSpecName: "config-data") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.146961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "427db03c-2d7b-444c-8bca-933cba12fc4c" (UID: "427db03c-2d7b-444c-8bca-933cba12fc4c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dftg4\" (UniqueName: \"kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\" (UID: \"9058e93a-dadd-4a54-aa47-af4c43b1d10a\") " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs" (OuterVolumeSpecName: "logs") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188805 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wdxj\" (UniqueName: \"kubernetes.io/projected/427db03c-2d7b-444c-8bca-933cba12fc4c-kube-api-access-5wdxj\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188869 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188884 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188900 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188913 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188941 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188954 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427db03c-2d7b-444c-8bca-933cba12fc4c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.188966 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427db03c-2d7b-444c-8bca-933cba12fc4c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.192781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts" (OuterVolumeSpecName: "scripts") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.196451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4" (OuterVolumeSpecName: "kube-api-access-dftg4") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "kube-api-access-dftg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.197718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.212112 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.217857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.233822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data" (OuterVolumeSpecName: "config-data") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.241328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9058e93a-dadd-4a54-aa47-af4c43b1d10a" (UID: "9058e93a-dadd-4a54-aa47-af4c43b1d10a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290334 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290368 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290382 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290395 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dftg4\" (UniqueName: \"kubernetes.io/projected/9058e93a-dadd-4a54-aa47-af4c43b1d10a-kube-api-access-dftg4\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290408 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9058e93a-dadd-4a54-aa47-af4c43b1d10a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290419 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290429 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9058e93a-dadd-4a54-aa47-af4c43b1d10a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290467 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.290478 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.307804 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.392443 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459450 5030 generic.go:334] "Generic (PLEG): container finished" podID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerID="e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" exitCode=143 Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459520 5030 generic.go:334] "Generic (PLEG): container finished" podID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerID="579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" exitCode=143 Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459541 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerDied","Data":"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerDied","Data":"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"427db03c-2d7b-444c-8bca-933cba12fc4c","Type":"ContainerDied","Data":"51366bf435f3a024269cbee1e6072ae8753473b50ad2302bea9eee39258dcdd1"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.459681 5030 scope.go:117] "RemoveContainer" containerID="e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462259 5030 generic.go:334] "Generic (PLEG): container finished" podID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerID="12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" exitCode=0 Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462283 5030 generic.go:334] "Generic (PLEG): container finished" podID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerID="57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" exitCode=143 Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462326 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerDied","Data":"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerDied","Data":"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.462403 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"9058e93a-dadd-4a54-aa47-af4c43b1d10a","Type":"ContainerDied","Data":"974ee5751c859e7f9f89de2de27a3753379e77f618952d9b75e0dcd899c0f38c"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.464283 5030 generic.go:334] "Generic (PLEG): container finished" podID="b1382233-4692-4fff-8586-47c406224b47" containerID="e632d16da87f475682232fd9152c12b9f4e4bef74c0982e287d33cc4f3c9ba3b" exitCode=0 Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.464348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-grk67" event={"ID":"b1382233-4692-4fff-8586-47c406224b47","Type":"ContainerDied","Data":"e632d16da87f475682232fd9152c12b9f4e4bef74c0982e287d33cc4f3c9ba3b"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.466782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerStarted","Data":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.501606 5030 scope.go:117] "RemoveContainer" containerID="579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.513182 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.520160 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.536996 5030 scope.go:117] "RemoveContainer" containerID="e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.537482 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd\": container with ID starting with e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd not found: ID does not exist" containerID="e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.537529 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd"} err="failed to get container status \"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd\": rpc error: code = NotFound desc = could not find container 
\"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd\": container with ID starting with e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.537559 5030 scope.go:117] "RemoveContainer" containerID="579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.537915 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b\": container with ID starting with 579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b not found: ID does not exist" containerID="579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.537937 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b"} err="failed to get container status \"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b\": rpc error: code = NotFound desc = could not find container \"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b\": container with ID starting with 579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.537951 5030 scope.go:117] "RemoveContainer" containerID="e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.538928 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd"} err="failed to get container status \"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd\": rpc error: code = NotFound desc = could not find container \"e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd\": container with ID starting with e690a652ba82fe1821a733d3fbe7331ec44b2f84eb829ed49450ae002a3c92bd not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.538966 5030 scope.go:117] "RemoveContainer" containerID="579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.539450 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b"} err="failed to get container status \"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b\": rpc error: code = NotFound desc = could not find container \"579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b\": container with ID starting with 579648bd9ebd63f6cf4c4fc7323312afb2c1a50f3295f09bf7b535fa13673f5b not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.539500 5030 scope.go:117] "RemoveContainer" containerID="12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.546652 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.562482 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: 
I0120 23:04:49.579543 5030 scope.go:117] "RemoveContainer" containerID="57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.579715 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.580080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.580121 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580131 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.580149 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580157 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.580190 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580198 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580392 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580418 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-log" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580436 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.580447 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" containerName="glance-httpd" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.581709 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.587585 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.588053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.589108 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.592419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.594050 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.596139 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.596344 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.600836 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.601026 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-g4857" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.605842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.640792 5030 scope.go:117] "RemoveContainer" containerID="12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.641205 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d\": container with ID starting with 12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d not found: ID does not exist" containerID="12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.641245 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d"} err="failed to get container status \"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d\": rpc error: code = NotFound desc = could not find container \"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d\": container with ID starting with 12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.641271 5030 scope.go:117] "RemoveContainer" containerID="57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.641731 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb\": container with ID starting with 57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb not found: ID does not exist" containerID="57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.641782 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb"} err="failed to get container status \"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb\": rpc error: code = NotFound desc = could not find container \"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb\": container with ID starting with 57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.641835 5030 scope.go:117] "RemoveContainer" containerID="12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.642236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d"} err="failed to get container status \"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d\": rpc error: code = NotFound desc = could not find container \"12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d\": container with ID starting with 12eedad293e5e65b0166f950aa34e66e3cef65fe99670fb1f60bd14544a2b43d not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.642264 5030 scope.go:117] "RemoveContainer" containerID="57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.643039 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb"} err="failed to get container status \"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb\": rpc error: code = NotFound desc = could not find container \"57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb\": container with ID starting with 57bdd85c2eb8b797df6b43f41b300abb04d9132ae2841d30c8c516c21efb5cdb not found: ID does not exist" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.700921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.700985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxnf\" (UniqueName: \"kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf\") pod \"glance-default-external-api-0\" (UID: 
\"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwch\" (UniqueName: \"kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701407 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.701592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803415 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxnf\" (UniqueName: \"kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwch\" (UniqueName: \"kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.803989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.804009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.804387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.804814 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.805736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.805766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.805783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.811363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.811463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.811505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.811969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.816834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.817387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.817434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.820646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.824004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwch\" (UniqueName: \"kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.828688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxnf\" (UniqueName: \"kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.835706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.836801 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.914029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.923393 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.962774 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:04:49 crc kubenswrapper[5030]: E0120 23:04:49.962984 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.973848 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427db03c-2d7b-444c-8bca-933cba12fc4c" path="/var/lib/kubelet/pods/427db03c-2d7b-444c-8bca-933cba12fc4c/volumes" Jan 20 23:04:49 crc kubenswrapper[5030]: I0120 23:04:49.974494 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9058e93a-dadd-4a54-aa47-af4c43b1d10a" path="/var/lib/kubelet/pods/9058e93a-dadd-4a54-aa47-af4c43b1d10a/volumes" Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.089555 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5e9fb28d_51f1_4f0f_a2ee_dd4a379a0a92.slice" Jan 20 23:04:50 crc kubenswrapper[5030]: E0120 23:04:50.089947 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92] : Timed out 
while waiting for systemd to remove kubepods-besteffort-pod5e9fb28d_51f1_4f0f_a2ee_dd4a379a0a92.slice" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" podUID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.418594 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.459229 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:04:50 crc kubenswrapper[5030]: W0120 23:04:50.474735 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1c1b889_379f_4f53_9426_cd5da16ccc34.slice/crio-2042ceed21d280b471824b70d6624fb08ba9a4c49f23f339828e93ab337ea50f WatchSource:0}: Error finding container 2042ceed21d280b471824b70d6624fb08ba9a4c49f23f339828e93ab337ea50f: Status 404 returned error can't find the container with id 2042ceed21d280b471824b70d6624fb08ba9a4c49f23f339828e93ab337ea50f Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.500719 5030 generic.go:334] "Generic (PLEG): container finished" podID="18897247-c232-43c2-9923-86f0391699ab" containerID="fea1372c596cefb82e941decc5c1f9e70bb6ceb4faaae0afec71eaec02f7aaf9" exitCode=0 Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.500806 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" event={"ID":"18897247-c232-43c2-9923-86f0391699ab","Type":"ContainerDied","Data":"fea1372c596cefb82e941decc5c1f9e70bb6ceb4faaae0afec71eaec02f7aaf9"} Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.504167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerStarted","Data":"d0500aefd223a7ad0a6cf73449a7d0bda27f7fa62d082286196e56b9d921b999"} Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.521500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4qlnz" Jan 20 23:04:50 crc kubenswrapper[5030]: I0120 23:04:50.945608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.026487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle\") pod \"b1382233-4692-4fff-8586-47c406224b47\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.026690 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs\") pod \"b1382233-4692-4fff-8586-47c406224b47\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.026764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts\") pod \"b1382233-4692-4fff-8586-47c406224b47\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.026789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kr7g\" (UniqueName: \"kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g\") pod \"b1382233-4692-4fff-8586-47c406224b47\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.026843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data\") pod \"b1382233-4692-4fff-8586-47c406224b47\" (UID: \"b1382233-4692-4fff-8586-47c406224b47\") " Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.027426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs" (OuterVolumeSpecName: "logs") pod "b1382233-4692-4fff-8586-47c406224b47" (UID: "b1382233-4692-4fff-8586-47c406224b47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.027857 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1382233-4692-4fff-8586-47c406224b47-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.032864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts" (OuterVolumeSpecName: "scripts") pod "b1382233-4692-4fff-8586-47c406224b47" (UID: "b1382233-4692-4fff-8586-47c406224b47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.036666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g" (OuterVolumeSpecName: "kube-api-access-9kr7g") pod "b1382233-4692-4fff-8586-47c406224b47" (UID: "b1382233-4692-4fff-8586-47c406224b47"). InnerVolumeSpecName "kube-api-access-9kr7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.059900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data" (OuterVolumeSpecName: "config-data") pod "b1382233-4692-4fff-8586-47c406224b47" (UID: "b1382233-4692-4fff-8586-47c406224b47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.064750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1382233-4692-4fff-8586-47c406224b47" (UID: "b1382233-4692-4fff-8586-47c406224b47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.129247 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.129289 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.129304 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1382233-4692-4fff-8586-47c406224b47-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.129317 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kr7g\" (UniqueName: \"kubernetes.io/projected/b1382233-4692-4fff-8586-47c406224b47-kube-api-access-9kr7g\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.537927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-grk67" event={"ID":"b1382233-4692-4fff-8586-47c406224b47","Type":"ContainerDied","Data":"4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.537975 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee6ad9cc3d48fe039719c7d14a0bd4a5727afd3dd3f4aee9dd898bff6621010" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.537939 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-grk67" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.541103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerStarted","Data":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.541325 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-central-agent" containerID="cri-o://8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" gracePeriod=30 Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.541638 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.541959 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="proxy-httpd" containerID="cri-o://b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" gracePeriod=30 Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.542034 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="sg-core" containerID="cri-o://10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" gracePeriod=30 Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.542097 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-notification-agent" containerID="cri-o://192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" gracePeriod=30 Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.544658 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1bb61bc-fb1d-463c-b1e8-8949e01ca893" containerID="cc33277d470448158fbe92e67fa94ac0abfb328a0c0c4b05536c09660bb8ac02" exitCode=0 Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.544725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" event={"ID":"c1bb61bc-fb1d-463c-b1e8-8949e01ca893","Type":"ContainerDied","Data":"cc33277d470448158fbe92e67fa94ac0abfb328a0c0c4b05536c09660bb8ac02"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.546322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerStarted","Data":"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.546350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerStarted","Data":"2042ceed21d280b471824b70d6624fb08ba9a4c49f23f339828e93ab337ea50f"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.550668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerStarted","Data":"11ba608fcb875ff78e07ffe256b78dc318fe1c7fa59d1ff1d49fd823f5cb3132"} Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.560056 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.722355608 podStartE2EDuration="7.560041139s" podCreationTimestamp="2026-01-20 23:04:44 +0000 UTC" firstStartedPulling="2026-01-20 23:04:45.461051951 +0000 UTC m=+1757.781312239" lastFinishedPulling="2026-01-20 23:04:50.298737482 +0000 UTC m=+1762.618997770" observedRunningTime="2026-01-20 23:04:51.55804303 +0000 UTC m=+1763.878303438" watchObservedRunningTime="2026-01-20 23:04:51.560041139 +0000 UTC m=+1763.880301427" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.671314 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:04:51 crc kubenswrapper[5030]: E0120 23:04:51.678668 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1382233-4692-4fff-8586-47c406224b47" containerName="placement-db-sync" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.678742 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1382233-4692-4fff-8586-47c406224b47" containerName="placement-db-sync" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.678971 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1382233-4692-4fff-8586-47c406224b47" containerName="placement-db-sync" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.679932 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.689230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.689601 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-2d6zz" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.691438 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.691776 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.691987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.693419 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.744868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.744919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jft59\" (UniqueName: \"kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59\") pod 
\"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.745002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.745039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.745057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.745073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.745089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jft59\" (UniqueName: \"kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847617 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.847695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.848152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.853506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.853675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.854416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.854799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.855328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle\") pod 
\"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:51 crc kubenswrapper[5030]: I0120 23:04:51.868282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jft59\" (UniqueName: \"kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59\") pod \"placement-7869b46478-p797p\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.076860 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.082129 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152131 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152641 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgmqg\" (UniqueName: \"kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.152674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts\") pod \"18897247-c232-43c2-9923-86f0391699ab\" (UID: \"18897247-c232-43c2-9923-86f0391699ab\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.159331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.159350 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.174751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts" (OuterVolumeSpecName: "scripts") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.174886 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg" (OuterVolumeSpecName: "kube-api-access-dgmqg") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "kube-api-access-dgmqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.198304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data" (OuterVolumeSpecName: "config-data") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.202426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18897247-c232-43c2-9923-86f0391699ab" (UID: "18897247-c232-43c2-9923-86f0391699ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254208 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254242 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254253 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254266 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254275 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgmqg\" (UniqueName: \"kubernetes.io/projected/18897247-c232-43c2-9923-86f0391699ab-kube-api-access-dgmqg\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.254286 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18897247-c232-43c2-9923-86f0391699ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.270942 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.355842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.355884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94zfx\" (UniqueName: \"kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: 
\"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.356498 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.358749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx" (OuterVolumeSpecName: "kube-api-access-94zfx") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "kube-api-access-94zfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.359679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts" (OuterVolumeSpecName: "scripts") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.381768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.425844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.457146 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data" (OuterVolumeSpecName: "config-data") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.457916 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") pod \"2cdf2496-d5b8-406d-8693-6e39c33950de\" (UID: \"2cdf2496-d5b8-406d-8693-6e39c33950de\") " Jan 20 23:04:52 crc kubenswrapper[5030]: W0120 23:04:52.458037 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2cdf2496-d5b8-406d-8693-6e39c33950de/volumes/kubernetes.io~secret/config-data Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data" (OuterVolumeSpecName: "config-data") pod "2cdf2496-d5b8-406d-8693-6e39c33950de" (UID: "2cdf2496-d5b8-406d-8693-6e39c33950de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458358 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94zfx\" (UniqueName: \"kubernetes.io/projected/2cdf2496-d5b8-406d-8693-6e39c33950de-kube-api-access-94zfx\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458395 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458408 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458418 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdf2496-d5b8-406d-8693-6e39c33950de-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.458427 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2cdf2496-d5b8-406d-8693-6e39c33950de-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.526093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.567494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" 
event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerStarted","Data":"e233ca221348f7805dd360b2f065fe3a1854fe954b610e9cacf74feb82b6f310"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.569844 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.569886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-rdd9j" event={"ID":"18897247-c232-43c2-9923-86f0391699ab","Type":"ContainerDied","Data":"5a6285a7848d0a0df85f8f1b408c99e290992f1b563763bfa499432d2dd48fc3"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.570337 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6285a7848d0a0df85f8f1b408c99e290992f1b563763bfa499432d2dd48fc3" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574273 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" exitCode=0 Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574425 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" exitCode=2 Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574526 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" exitCode=0 Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574614 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" exitCode=0 Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerDied","Data":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerDied","Data":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.575045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerDied","Data":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.575167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerDied","Data":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.575254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"2cdf2496-d5b8-406d-8693-6e39c33950de","Type":"ContainerDied","Data":"09edff785978bab73a5f509ed8bcf3271a792aa4777d1e9c59ee5fbe20961b9a"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574997 5030 scope.go:117] "RemoveContainer" 
containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.574331 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.586594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerStarted","Data":"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.591347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerStarted","Data":"355a65489075362570fc4a79cf6fc0137cce9d33afec9c8da23c87eea6ec654f"} Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.630818 5030 scope.go:117] "RemoveContainer" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.651069 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.651051138 podStartE2EDuration="3.651051138s" podCreationTimestamp="2026-01-20 23:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:52.618371541 +0000 UTC m=+1764.938631839" watchObservedRunningTime="2026-01-20 23:04:52.651051138 +0000 UTC m=+1764.971311426" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.651767 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.651757244 podStartE2EDuration="3.651757244s" podCreationTimestamp="2026-01-20 23:04:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:52.649013637 +0000 UTC m=+1764.969273925" watchObservedRunningTime="2026-01-20 23:04:52.651757244 +0000 UTC m=+1764.972017532" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.670472 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rdd9j"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.679250 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-rdd9j"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.695212 5030 scope.go:117] "RemoveContainer" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.707729 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.728495 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.744703 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.745111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="proxy-httpd" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745131 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="proxy-httpd" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.745151 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-central-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-central-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.745166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-notification-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745172 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-notification-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.745185 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18897247-c232-43c2-9923-86f0391699ab" containerName="keystone-bootstrap" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745191 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="18897247-c232-43c2-9923-86f0391699ab" containerName="keystone-bootstrap" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.745203 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="sg-core" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745209 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="sg-core" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745351 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="proxy-httpd" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745369 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="sg-core" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745381 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-central-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745392 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" containerName="ceilometer-notification-agent" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.745402 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="18897247-c232-43c2-9923-86f0391699ab" containerName="keystone-bootstrap" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.746863 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.749414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.749563 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.764795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.764850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.764897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.764945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.764977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.765019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.765042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9tm\" (UniqueName: \"kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.772266 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.783067 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kqx2p"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.784842 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.788306 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.788693 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2vb6q" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.788844 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.788949 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.789248 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.800782 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kqx2p"] Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.833358 5030 scope.go:117] "RemoveContainer" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.860504 5030 scope.go:117] "RemoveContainer" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.861046 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": container with ID starting with b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe not found: ID does not exist" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.861077 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} err="failed to get container status \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": rpc error: code = NotFound desc = could not find container \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": container with ID starting with b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.861096 5030 scope.go:117] "RemoveContainer" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.861470 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": container with ID starting with 10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec not found: ID does not exist" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.861526 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} err="failed to get container status \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": rpc error: code = NotFound desc = 
could not find container \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": container with ID starting with 10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.861560 5030 scope.go:117] "RemoveContainer" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.861945 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": container with ID starting with 192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf not found: ID does not exist" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.861976 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} err="failed to get container status \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": rpc error: code = NotFound desc = could not find container \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": container with ID starting with 192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.862014 5030 scope.go:117] "RemoveContainer" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: E0120 23:04:52.862456 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": container with ID starting with 8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34 not found: ID does not exist" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.862519 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} err="failed to get container status \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": rpc error: code = NotFound desc = could not find container \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": container with ID starting with 8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34 not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.862551 5030 scope.go:117] "RemoveContainer" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.862905 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} err="failed to get container status \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": rpc error: code = NotFound desc = could not find container \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": container with ID starting with b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.862944 5030 scope.go:117] 
"RemoveContainer" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863331 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} err="failed to get container status \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": rpc error: code = NotFound desc = could not find container \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": container with ID starting with 10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863362 5030 scope.go:117] "RemoveContainer" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863636 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} err="failed to get container status \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": rpc error: code = NotFound desc = could not find container \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": container with ID starting with 192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863664 5030 scope.go:117] "RemoveContainer" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863867 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} err="failed to get container status \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": rpc error: code = NotFound desc = could not find container \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": container with ID starting with 8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34 not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.863908 5030 scope.go:117] "RemoveContainer" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864142 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} err="failed to get container status \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": rpc error: code = NotFound desc = could not find container \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": container with ID starting with b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864163 5030 scope.go:117] "RemoveContainer" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864423 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} err="failed to get container status \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": rpc error: code = NotFound desc 
= could not find container \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": container with ID starting with 10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864449 5030 scope.go:117] "RemoveContainer" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864738 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} err="failed to get container status \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": rpc error: code = NotFound desc = could not find container \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": container with ID starting with 192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.864759 5030 scope.go:117] "RemoveContainer" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.865027 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} err="failed to get container status \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": rpc error: code = NotFound desc = could not find container \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": container with ID starting with 8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34 not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.865047 5030 scope.go:117] "RemoveContainer" containerID="b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.865853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867407 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 
crc kubenswrapper[5030]: I0120 23:04:52.867441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9tm\" (UniqueName: \"kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867767 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x4pz\" (UniqueName: \"kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.867840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.868986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.869015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.869463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe"} err="failed to get container status \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": rpc error: code = NotFound desc = could not find container \"b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe\": container with ID starting with b6e0398aaac9e4b0bdc3e84fa692fdfbf26ce89fd3364721e775dbf5767268fe not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.869497 5030 scope.go:117] "RemoveContainer" containerID="10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.869872 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec"} err="failed to get container status \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": rpc error: code = NotFound desc = could not find container \"10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec\": container with ID starting with 10a88e51d85c955aa646f159e0133e289c45e982ad7067bbd50a62bc89d77cec not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.869902 5030 scope.go:117] "RemoveContainer" containerID="192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.870156 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf"} err="failed to get container status \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": rpc error: code = NotFound desc = could not find container \"192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf\": container with ID starting with 192679356bd8c69233200767b1cfce1aab508ede28d32016c946d6cb44bf0acf not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.870179 5030 scope.go:117] "RemoveContainer" containerID="8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.870377 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34"} err="failed to get container status \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": rpc error: code = NotFound desc = could not find container \"8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34\": container with ID starting with 8f881177aaf74d2b89f79fdf0e8687fe66f4e9171d910b4f6b3a72b43dfd5a34 not found: ID does not exist" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.878779 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.880471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.882484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.883081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.889684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9tm\" (UniqueName: \"kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm\") pod \"ceilometer-0\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.969998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x4pz\" (UniqueName: \"kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.970058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.970094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.970145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.970171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.970196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.973164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.973262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.974144 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.974269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.974345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.987034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x4pz\" (UniqueName: \"kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz\") pod \"keystone-bootstrap-kqx2p\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:52 crc kubenswrapper[5030]: I0120 23:04:52.994755 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.071452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle\") pod \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.071589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5wh5\" (UniqueName: \"kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5\") pod \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.071647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data\") pod \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\" (UID: \"c1bb61bc-fb1d-463c-b1e8-8949e01ca893\") " Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.074552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5" (OuterVolumeSpecName: "kube-api-access-s5wh5") pod "c1bb61bc-fb1d-463c-b1e8-8949e01ca893" (UID: "c1bb61bc-fb1d-463c-b1e8-8949e01ca893"). InnerVolumeSpecName "kube-api-access-s5wh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.075099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c1bb61bc-fb1d-463c-b1e8-8949e01ca893" (UID: "c1bb61bc-fb1d-463c-b1e8-8949e01ca893"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.092989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1bb61bc-fb1d-463c-b1e8-8949e01ca893" (UID: "c1bb61bc-fb1d-463c-b1e8-8949e01ca893"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.149720 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.166248 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.174039 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.174082 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.174093 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5wh5\" (UniqueName: \"kubernetes.io/projected/c1bb61bc-fb1d-463c-b1e8-8949e01ca893-kube-api-access-s5wh5\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.601823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" event={"ID":"c1bb61bc-fb1d-463c-b1e8-8949e01ca893","Type":"ContainerDied","Data":"a6c68174c433bdb6cfc06141a30e194b92cc76c7e915168b8a27d3fbad7ddca6"} Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.602124 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c68174c433bdb6cfc06141a30e194b92cc76c7e915168b8a27d3fbad7ddca6" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.601846 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9dztn" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.620884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerStarted","Data":"59ac6043f93ef086250501bf602da8db89685b82321f351a7d96e1a127965ace"} Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.620918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerStarted","Data":"94d39046ffec8b1af080acbe58f700c3628f887d518971230088dd0fae4ee9cd"} Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.620932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.621314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.661694 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.668025 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7869b46478-p797p" podStartSLOduration=2.668006901 podStartE2EDuration="2.668006901s" podCreationTimestamp="2026-01-20 23:04:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:53.649567972 +0000 UTC m=+1765.969828270" watchObservedRunningTime="2026-01-20 23:04:53.668006901 +0000 UTC m=+1765.988267189" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.724534 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/keystone-bootstrap-kqx2p"] Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.849857 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:04:53 crc kubenswrapper[5030]: E0120 23:04:53.859279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bb61bc-fb1d-463c-b1e8-8949e01ca893" containerName="barbican-db-sync" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.859303 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bb61bc-fb1d-463c-b1e8-8949e01ca893" containerName="barbican-db-sync" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.859471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1bb61bc-fb1d-463c-b1e8-8949e01ca893" containerName="barbican-db-sync" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.860316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.865998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.866173 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.866335 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-s9xtv" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.894716 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.896208 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.911029 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:04:53 crc kubenswrapper[5030]: I0120 23:04:53.928989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rn29\" (UniqueName: \"kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc 
kubenswrapper[5030]: I0120 23:04:54.002847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnhg\" (UniqueName: \"kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.002917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.017590 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18897247-c232-43c2-9923-86f0391699ab" path="/var/lib/kubelet/pods/18897247-c232-43c2-9923-86f0391699ab/volumes" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.018494 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdf2496-d5b8-406d-8693-6e39c33950de" path="/var/lib/kubelet/pods/2cdf2496-d5b8-406d-8693-6e39c33950de/volumes" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.019746 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.049871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.051375 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.060001 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.061006 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnhg\" (UniqueName: \"kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8rn29\" (UniqueName: \"kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.104825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.106161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.106185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.108133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.109150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.110444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.111571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc 
kubenswrapper[5030]: I0120 23:04:54.112817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.118339 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.119897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnhg\" (UniqueName: \"kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg\") pod \"barbican-keystone-listener-574ddc89bb-w4hm5\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.121995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rn29\" (UniqueName: \"kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29\") pod \"barbican-worker-74ccf5ddc7-wj4px\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.206194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.206511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.206552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.206604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.206684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm64x\" (UniqueName: 
\"kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.246044 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.295387 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm64x\" (UniqueName: \"kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.308762 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.311851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.312293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.314395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.325186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm64x\" (UniqueName: \"kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x\") pod \"barbican-api-799ff9684d-67v2r\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.381973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.633061 5030 generic.go:334] "Generic (PLEG): container finished" podID="425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" containerID="7c053da04f9e763cce1ed24ccb24b42f76ae096e7227f61b45fb9840d0d097ad" exitCode=0 Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.633220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" event={"ID":"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7","Type":"ContainerDied","Data":"7c053da04f9e763cce1ed24ccb24b42f76ae096e7227f61b45fb9840d0d097ad"} Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.634720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" event={"ID":"d44bc258-6678-4851-a91f-f6e382262bf9","Type":"ContainerStarted","Data":"55f2c9b45d924d0e23919182c095b361caabfee3b3a8fd458ebce7784dff7d49"} Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.634746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" event={"ID":"d44bc258-6678-4851-a91f-f6e382262bf9","Type":"ContainerStarted","Data":"4085f511e27cdf995e82f59ab0e22a46777282440654928ad0c31360f7230483"} Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.636136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerStarted","Data":"4f9f113804a4276a11bbd7a355acea21f00bc02ed42433215af9f3ec417b390c"} Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.636178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerStarted","Data":"fea5bc5d2c00aa5413463a34d36196ae530d12109a24852923f21ad9d43493d4"} Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.684032 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" podStartSLOduration=2.6840109720000003 podStartE2EDuration="2.684010972s" podCreationTimestamp="2026-01-20 23:04:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:04:54.669065958 +0000 UTC m=+1766.989326246" watchObservedRunningTime="2026-01-20 23:04:54.684010972 +0000 UTC m=+1767.004271260" Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.692403 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.790831 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:04:54 crc kubenswrapper[5030]: I0120 23:04:54.894855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:04:54 crc kubenswrapper[5030]: W0120 23:04:54.913175 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod605ed7f2_e057_4d9b_b80c_1b80a0b3ae4d.slice/crio-b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d WatchSource:0}: Error finding container b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d: Status 404 returned error can't find the container with id b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.646355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerStarted","Data":"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.646617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerStarted","Data":"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.646664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerStarted","Data":"a9455d4abf69d4d830bfd00a0ad36891faafe3a49ee298915dc3eeeb2bdd3743"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.650112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerStarted","Data":"5141cc3f8fec2508048bd571d6893ee4dd69b19c8d0b4735e355b4a0c9bb32b3"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.653220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerStarted","Data":"0c3a5c19c1e410128793e3779060588805094169ed3914675342d1f99f09c499"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.653254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerStarted","Data":"e409d88079d3d24e3cc7058504a95aef60f2cbca47dae33b0546b8ba2533ca3a"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.653264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerStarted","Data":"b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d"} Jan 20 
23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.654372 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.654396 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.657864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerStarted","Data":"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.657917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerStarted","Data":"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.657927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerStarted","Data":"4c4ca9d754f745d5dcc4840a067783a2ba9decd45cad5e204e08ae9586b7270c"} Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.672303 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" podStartSLOduration=2.672285757 podStartE2EDuration="2.672285757s" podCreationTimestamp="2026-01-20 23:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:55.666783083 +0000 UTC m=+1767.987043371" watchObservedRunningTime="2026-01-20 23:04:55.672285757 +0000 UTC m=+1767.992546045" Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.690593 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" podStartSLOduration=2.6905744929999997 podStartE2EDuration="2.690574493s" podCreationTimestamp="2026-01-20 23:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:55.682347003 +0000 UTC m=+1768.002607291" watchObservedRunningTime="2026-01-20 23:04:55.690574493 +0000 UTC m=+1768.010834781" Jan 20 23:04:55 crc kubenswrapper[5030]: I0120 23:04:55.729779 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" podStartSLOduration=2.7297600170000003 podStartE2EDuration="2.729760017s" podCreationTimestamp="2026-01-20 23:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:55.718598345 +0000 UTC m=+1768.038858633" watchObservedRunningTime="2026-01-20 23:04:55.729760017 +0000 UTC m=+1768.050020305" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.093446 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.247772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw47k\" (UniqueName: \"kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle\") pod \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\" (UID: \"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7\") " Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.248741 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.261769 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.264881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts" (OuterVolumeSpecName: "scripts") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.269955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k" (OuterVolumeSpecName: "kube-api-access-pw47k") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "kube-api-access-pw47k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.277990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.292975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data" (OuterVolumeSpecName: "config-data") pod "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" (UID: "425b72eb-bdf3-42af-bf9e-4a2b8431b0e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.350132 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.350164 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.350175 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw47k\" (UniqueName: \"kubernetes.io/projected/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-kube-api-access-pw47k\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.350188 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.350197 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.681072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerStarted","Data":"d64f84f11d342393009bbaea6a332279247d7db0d59200ce6548a27f2d410c13"} Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.683508 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" event={"ID":"425b72eb-bdf3-42af-bf9e-4a2b8431b0e7","Type":"ContainerDied","Data":"c23637e20582b8309584063ddf23d6a0262a9a279ee34e0a1943487699e39324"} Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.683576 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23637e20582b8309584063ddf23d6a0262a9a279ee34e0a1943487699e39324" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.683711 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-jpzls" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.924407 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:04:56 crc kubenswrapper[5030]: E0120 23:04:56.924782 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" containerName="cinder-db-sync" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.924798 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" containerName="cinder-db-sync" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.924985 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" containerName="cinder-db-sync" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.925814 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.932503 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-896tn" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.933272 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.933386 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:04:56 crc kubenswrapper[5030]: I0120 23:04:56.933487 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.022846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.073413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.073739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.073762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.073905 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hfx\" (UniqueName: \"kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.073959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.074008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.120327 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.121794 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.125404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.158852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.175452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hfx\" (UniqueName: \"kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.175506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.175552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.175573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc 
kubenswrapper[5030]: I0120 23:04:57.175588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.175606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.180442 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.182678 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.193723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.195756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.199496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.226281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hfx\" (UniqueName: \"kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx\") pod \"cinder-scheduler-0\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.276819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.276874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd7tx\" (UniqueName: 
\"kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.276898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.276962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.276992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.277013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.277028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.361385 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.378849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379284 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd7tx\" (UniqueName: \"kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.379776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.382683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.383336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.385079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.386267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.405678 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd7tx\" (UniqueName: \"kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx\") pod \"cinder-api-0\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.439323 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.703271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerStarted","Data":"2ef52465ddd335691209703804aeb21ba338d53e8296574b1eca1b4832254289"} Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.704067 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.707282 5030 generic.go:334] "Generic (PLEG): container finished" podID="d44bc258-6678-4851-a91f-f6e382262bf9" containerID="55f2c9b45d924d0e23919182c095b361caabfee3b3a8fd458ebce7784dff7d49" exitCode=0 Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.707356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" event={"ID":"d44bc258-6678-4851-a91f-f6e382262bf9","Type":"ContainerDied","Data":"55f2c9b45d924d0e23919182c095b361caabfee3b3a8fd458ebce7784dff7d49"} Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.729717 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.5031047060000002 podStartE2EDuration="5.729700799s" podCreationTimestamp="2026-01-20 23:04:52 +0000 UTC" firstStartedPulling="2026-01-20 23:04:53.67614421 +0000 UTC m=+1765.996404498" lastFinishedPulling="2026-01-20 23:04:56.902740313 +0000 UTC m=+1769.223000591" observedRunningTime="2026-01-20 23:04:57.727539346 +0000 UTC m=+1770.047799634" watchObservedRunningTime="2026-01-20 23:04:57.729700799 +0000 UTC m=+1770.049961087" Jan 20 23:04:57 crc kubenswrapper[5030]: 
W0120 23:04:57.880123 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod991271c7_67ba_45ae_ba73_f18c7b800f18.slice/crio-6823ffec3be245ed7155c947b58ea087a2f0c5cf836819ef0d574219da54777d WatchSource:0}: Error finding container 6823ffec3be245ed7155c947b58ea087a2f0c5cf836819ef0d574219da54777d: Status 404 returned error can't find the container with id 6823ffec3be245ed7155c947b58ea087a2f0c5cf836819ef0d574219da54777d Jan 20 23:04:57 crc kubenswrapper[5030]: I0120 23:04:57.881391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:04:58 crc kubenswrapper[5030]: I0120 23:04:58.020888 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:04:58 crc kubenswrapper[5030]: W0120 23:04:58.024389 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5998b8ca_9d38_4656_8a54_4b85f05014dd.slice/crio-9292bfd14c1616dac7ad047a9c3e471c574c4418298d18ab61c328f8a683c5ac WatchSource:0}: Error finding container 9292bfd14c1616dac7ad047a9c3e471c574c4418298d18ab61c328f8a683c5ac: Status 404 returned error can't find the container with id 9292bfd14c1616dac7ad047a9c3e471c574c4418298d18ab61c328f8a683c5ac Jan 20 23:04:58 crc kubenswrapper[5030]: I0120 23:04:58.744455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerStarted","Data":"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4"} Jan 20 23:04:58 crc kubenswrapper[5030]: I0120 23:04:58.744723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerStarted","Data":"9292bfd14c1616dac7ad047a9c3e471c574c4418298d18ab61c328f8a683c5ac"} Jan 20 23:04:58 crc kubenswrapper[5030]: I0120 23:04:58.746893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerStarted","Data":"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645"} Jan 20 23:04:58 crc kubenswrapper[5030]: I0120 23:04:58.746916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerStarted","Data":"6823ffec3be245ed7155c947b58ea087a2f0c5cf836819ef0d574219da54777d"} Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.138595 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.312683 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.312758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.312829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.312901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x4pz\" (UniqueName: \"kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.312935 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.313014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys\") pod \"d44bc258-6678-4851-a91f-f6e382262bf9\" (UID: \"d44bc258-6678-4851-a91f-f6e382262bf9\") " Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.318260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz" (OuterVolumeSpecName: "kube-api-access-4x4pz") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "kube-api-access-4x4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.318640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts" (OuterVolumeSpecName: "scripts") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.319340 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.321794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.338464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data" (OuterVolumeSpecName: "config-data") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.347779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d44bc258-6678-4851-a91f-f6e382262bf9" (UID: "d44bc258-6678-4851-a91f-f6e382262bf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.415984 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.416020 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.416032 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x4pz\" (UniqueName: \"kubernetes.io/projected/d44bc258-6678-4851-a91f-f6e382262bf9-kube-api-access-4x4pz\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.416042 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.416051 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.416058 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d44bc258-6678-4851-a91f-f6e382262bf9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.757164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" event={"ID":"d44bc258-6678-4851-a91f-f6e382262bf9","Type":"ContainerDied","Data":"4085f511e27cdf995e82f59ab0e22a46777282440654928ad0c31360f7230483"} Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.757538 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4085f511e27cdf995e82f59ab0e22a46777282440654928ad0c31360f7230483" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.757349 5030 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kqx2p" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.759832 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerStarted","Data":"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d"} Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.761156 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.764139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerStarted","Data":"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8"} Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.813585 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.813564783 podStartE2EDuration="2.813564783s" podCreationTimestamp="2026-01-20 23:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:59.784502845 +0000 UTC m=+1772.104763133" watchObservedRunningTime="2026-01-20 23:04:59.813564783 +0000 UTC m=+1772.133825061" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.820719 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.820701537 podStartE2EDuration="3.820701537s" podCreationTimestamp="2026-01-20 23:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:04:59.806717627 +0000 UTC m=+1772.126977915" watchObservedRunningTime="2026-01-20 23:04:59.820701537 +0000 UTC m=+1772.140961825" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.895843 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:04:59 crc kubenswrapper[5030]: E0120 23:04:59.896199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44bc258-6678-4851-a91f-f6e382262bf9" containerName="keystone-bootstrap" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.896216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44bc258-6678-4851-a91f-f6e382262bf9" containerName="keystone-bootstrap" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.896376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d44bc258-6678-4851-a91f-f6e382262bf9" containerName="keystone-bootstrap" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.896921 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.898959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.900053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.900369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.901368 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.904159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.904556 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2vb6q" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.910790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.914809 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.915676 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.928478 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.928525 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.984464 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.984573 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.984640 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:04:59 crc kubenswrapper[5030]: I0120 23:04:59.984683 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.016387 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ckj\" (UniqueName: \"kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027455 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.027841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7ckj\" (UniqueName: \"kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle\") pod \"keystone-67567dfbc6-zn99j\" (UID: 
\"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.130858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.135474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.135563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.136020 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " 
pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.136238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.137717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.139803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.140547 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.158194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ckj\" (UniqueName: \"kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj\") pod \"keystone-67567dfbc6-zn99j\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.213165 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.538149 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.539823 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.545746 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.545794 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.566049 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674gj\" (UniqueName: \"kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.641856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " 
pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759118 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759516 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674gj\" (UniqueName: \"kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.759872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.761313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.775326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.777795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.780153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" event={"ID":"e0ca1870-fde9-4d58-9593-555c107cada4","Type":"ContainerStarted","Data":"df0724fb704495efada931d4949058aef5caa4e000b7a820e4a92309511cc47b"} Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801504 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.801515 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.813151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.814236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674gj\" (UniqueName: \"kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj\") pod \"barbican-api-8c8d65fc8-zg74b\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:00 crc kubenswrapper[5030]: I0120 23:05:00.862969 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.378651 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:05:01 crc kubenswrapper[5030]: W0120 23:05:01.396063 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd4290d_9d6d_4a8e_9b8b_a1f51b675e2c.slice/crio-8ba66e73fadff9100438a6ab8d73292c0621ba600a6b7e7b25d0fb716bdc9147 WatchSource:0}: Error finding container 8ba66e73fadff9100438a6ab8d73292c0621ba600a6b7e7b25d0fb716bdc9147: Status 404 returned error can't find the container with id 8ba66e73fadff9100438a6ab8d73292c0621ba600a6b7e7b25d0fb716bdc9147 Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.809856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" event={"ID":"e0ca1870-fde9-4d58-9593-555c107cada4","Type":"ContainerStarted","Data":"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858"} Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.810293 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.814580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerStarted","Data":"9685f0161fa2dc2a70a60068493799a7269233e5fdcb1286db91750b507a8e03"} Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.814613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerStarted","Data":"1e47d9a62be110198d6a21ed0e2f60d6739a5d24b7c3d9403df7146ed5e0e34c"} Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.814643 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.814651 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerStarted","Data":"8ba66e73fadff9100438a6ab8d73292c0621ba600a6b7e7b25d0fb716bdc9147"} Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.815127 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api-log" containerID="cri-o://fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" gracePeriod=30 Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.815204 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.815238 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api" containerID="cri-o://4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" gracePeriod=30 Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.841629 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" podStartSLOduration=2.841595079 podStartE2EDuration="2.841595079s" podCreationTimestamp="2026-01-20 23:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:01.827749602 +0000 UTC m=+1774.148009890" watchObservedRunningTime="2026-01-20 23:05:01.841595079 +0000 UTC m=+1774.161855367" Jan 20 23:05:01 crc kubenswrapper[5030]: I0120 23:05:01.856176 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" podStartSLOduration=1.856157383 podStartE2EDuration="1.856157383s" podCreationTimestamp="2026-01-20 23:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:01.85312229 +0000 UTC m=+1774.173382598" watchObservedRunningTime="2026-01-20 23:05:01.856157383 +0000 UTC m=+1774.176417681" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.362836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.422013 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd7tx\" (UniqueName: \"kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.491733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 
23:05:02.491818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id\") pod \"5998b8ca-9d38-4656-8a54-4b85f05014dd\" (UID: \"5998b8ca-9d38-4656-8a54-4b85f05014dd\") " Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.493069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.493319 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs" (OuterVolumeSpecName: "logs") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.499117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx" (OuterVolumeSpecName: "kube-api-access-zd7tx") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "kube-api-access-zd7tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.500691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts" (OuterVolumeSpecName: "scripts") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.507520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.525345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.548005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data" (OuterVolumeSpecName: "config-data") pod "5998b8ca-9d38-4656-8a54-4b85f05014dd" (UID: "5998b8ca-9d38-4656-8a54-4b85f05014dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594378 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd7tx\" (UniqueName: \"kubernetes.io/projected/5998b8ca-9d38-4656-8a54-4b85f05014dd-kube-api-access-zd7tx\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594438 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594451 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5998b8ca-9d38-4656-8a54-4b85f05014dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594464 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594499 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594511 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5998b8ca-9d38-4656-8a54-4b85f05014dd-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.594524 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5998b8ca-9d38-4656-8a54-4b85f05014dd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827723 5030 generic.go:334] "Generic (PLEG): container finished" podID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerID="4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" exitCode=0 Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827773 5030 generic.go:334] "Generic (PLEG): container finished" podID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerID="fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" exitCode=143 Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerDied","Data":"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d"} Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerDied","Data":"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4"} Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"5998b8ca-9d38-4656-8a54-4b85f05014dd","Type":"ContainerDied","Data":"9292bfd14c1616dac7ad047a9c3e471c574c4418298d18ab61c328f8a683c5ac"} Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827813 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.827878 5030 scope.go:117] "RemoveContainer" containerID="4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.828020 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.828030 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.832195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.834806 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.860059 5030 scope.go:117] "RemoveContainer" containerID="fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.867308 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.867414 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.877342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.891738 5030 scope.go:117] "RemoveContainer" containerID="4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" Jan 20 23:05:02 crc kubenswrapper[5030]: E0120 23:05:02.894542 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d\": container with ID starting with 4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d not found: ID does not exist" containerID="4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.894599 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d"} err="failed to get container status \"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d\": rpc error: code = NotFound desc = could not find container \"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d\": container with ID starting with 4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d not found: ID does not exist" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.894656 5030 scope.go:117] "RemoveContainer" containerID="fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" Jan 20 23:05:02 crc kubenswrapper[5030]: E0120 23:05:02.897584 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4\": container with ID starting with fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4 not found: ID does not exist" containerID="fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" Jan 20 23:05:02 crc kubenswrapper[5030]: 
I0120 23:05:02.897653 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4"} err="failed to get container status \"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4\": rpc error: code = NotFound desc = could not find container \"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4\": container with ID starting with fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4 not found: ID does not exist" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.897689 5030 scope.go:117] "RemoveContainer" containerID="4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.897853 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.899768 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d"} err="failed to get container status \"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d\": rpc error: code = NotFound desc = could not find container \"4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d\": container with ID starting with 4c1f76fad070c0b2326c1c42e08f83a1b1af521bf419fb0a41927e08132e886d not found: ID does not exist" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.899811 5030 scope.go:117] "RemoveContainer" containerID="fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.905054 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4"} err="failed to get container status \"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4\": rpc error: code = NotFound desc = could not find container \"fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4\": container with ID starting with fce3dd2cb463d41c6f82d0039678d30ce387e442e1dbb1bda5a95cf3e8c8c4c4 not found: ID does not exist" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.935310 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.974803 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:02 crc kubenswrapper[5030]: E0120 23:05:02.975767 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api-log" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.975799 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api-log" Jan 20 23:05:02 crc kubenswrapper[5030]: E0120 23:05:02.977783 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.977814 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.978454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api" 
Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.978510 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" containerName="cinder-api-log" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.980443 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.983518 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.984378 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:05:02 crc kubenswrapper[5030]: I0120 23:05:02.984554 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.002220 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbj4\" (UniqueName: \"kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.102664 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.203989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbj4\" (UniqueName: \"kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.204035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.204604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.208875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.209226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.209747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.209904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.212443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.216019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.220460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbj4\" (UniqueName: \"kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4\") pod \"cinder-api-0\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.312402 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:03 crc kubenswrapper[5030]: W0120 23:05:03.784808 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63645689_a02d_4ee6_b712_554130bdca30.slice/crio-1855439598a81b763fc62654a4b8858232e8d6fdbde25d898d67837a55ac5fcc WatchSource:0}: Error finding container 1855439598a81b763fc62654a4b8858232e8d6fdbde25d898d67837a55ac5fcc: Status 404 returned error can't find the container with id 1855439598a81b763fc62654a4b8858232e8d6fdbde25d898d67837a55ac5fcc Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.786022 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.843793 5030 generic.go:334] "Generic (PLEG): container finished" podID="a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" containerID="feb1a638c84cf3386e550d8bc38c115b9e45a247fbc73b072ff09a3993cb8acb" exitCode=0 Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.843850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" event={"ID":"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7","Type":"ContainerDied","Data":"feb1a638c84cf3386e550d8bc38c115b9e45a247fbc73b072ff09a3993cb8acb"} Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.858810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerStarted","Data":"1855439598a81b763fc62654a4b8858232e8d6fdbde25d898d67837a55ac5fcc"} Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.961691 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:05:03 crc kubenswrapper[5030]: E0120 23:05:03.962097 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:05:03 crc kubenswrapper[5030]: I0120 23:05:03.977581 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5998b8ca-9d38-4656-8a54-4b85f05014dd" path="/var/lib/kubelet/pods/5998b8ca-9d38-4656-8a54-4b85f05014dd/volumes" Jan 20 23:05:04 crc kubenswrapper[5030]: I0120 23:05:04.876859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerStarted","Data":"8afb44471792a68df8295fcfe9340c136a8e28722e34319c5bd8ccede98487ee"} Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.228378 5030 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.251771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config\") pod \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.251814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle\") pod \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.251923 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxvf\" (UniqueName: \"kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf\") pod \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\" (UID: \"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7\") " Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.257098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf" (OuterVolumeSpecName: "kube-api-access-fnxvf") pod "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" (UID: "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7"). InnerVolumeSpecName "kube-api-access-fnxvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.275211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" (UID: "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.281717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config" (OuterVolumeSpecName: "config") pod "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" (UID: "a667d4ce-6850-4d3d-8a28-d1a24eb13ae7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.354312 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.354577 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.354663 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxvf\" (UniqueName: \"kubernetes.io/projected/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7-kube-api-access-fnxvf\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.748566 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.820013 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.891923 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.891912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-wflcw" event={"ID":"a667d4ce-6850-4d3d-8a28-d1a24eb13ae7","Type":"ContainerDied","Data":"b9c0e2c541011d086695aada757acada2cec792829240644d82c751f38a38115"} Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.893838 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c0e2c541011d086695aada757acada2cec792829240644d82c751f38a38115" Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.898883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerStarted","Data":"76bbe65e67022afeb40494b3071315eb174c4e626acd1c77fb924d72243b1dbb"} Jan 20 23:05:05 crc kubenswrapper[5030]: I0120 23:05:05.942480 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.94246049 podStartE2EDuration="3.94246049s" podCreationTimestamp="2026-01-20 23:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:05.926681905 +0000 UTC m=+1778.246942193" watchObservedRunningTime="2026-01-20 23:05:05.94246049 +0000 UTC m=+1778.262720788" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.028346 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:06 crc kubenswrapper[5030]: E0120 23:05:06.029207 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" containerName="neutron-db-sync" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.029295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" containerName="neutron-db-sync" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.029895 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" containerName="neutron-db-sync" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.031460 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.049374 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.049641 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.049793 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.049929 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-5jblq" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.057142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.067325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwkm\" (UniqueName: \"kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.067377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.067402 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.070097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.070529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.172391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " 
pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.172449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwkm\" (UniqueName: \"kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.172480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.172501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.172523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.176842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.176854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.177094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.180886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc kubenswrapper[5030]: I0120 23:05:06.190197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwkm\" (UniqueName: \"kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm\") pod \"neutron-85b4b556b9-gdz7b\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:06 crc 
kubenswrapper[5030]: I0120 23:05:06.370998 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:06.908052 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.694322 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.755644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:07 crc kubenswrapper[5030]: W0120 23:05:07.757815 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e5066b_6521_4a21_aec7_ee8470028811.slice/crio-677d9236d3f6ac5ea5aa59b789ad9d6e06273d145af3504e29bf0b2f5d6daf4c WatchSource:0}: Error finding container 677d9236d3f6ac5ea5aa59b789ad9d6e06273d145af3504e29bf0b2f5d6daf4c: Status 404 returned error can't find the container with id 677d9236d3f6ac5ea5aa59b789ad9d6e06273d145af3504e29bf0b2f5d6daf4c Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.763355 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.922528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerStarted","Data":"677d9236d3f6ac5ea5aa59b789ad9d6e06273d145af3504e29bf0b2f5d6daf4c"} Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.922836 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="cinder-scheduler" containerID="cri-o://7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645" gracePeriod=30 Jan 20 23:05:07 crc kubenswrapper[5030]: I0120 23:05:07.922919 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="probe" containerID="cri-o://083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8" gracePeriod=30 Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.724048 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.728039 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.730554 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.739930 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.743803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815406 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8569g\" (UniqueName: \"kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.815568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 
23:05:08.917165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8569g\" (UniqueName: \"kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.917437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.921041 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.921649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.926996 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.927313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.927555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.933409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.949198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8569g\" (UniqueName: \"kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g\") pod \"neutron-677b6f7646-2t2kv\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.950059 5030 generic.go:334] "Generic (PLEG): container finished" podID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerID="083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8" exitCode=0 Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.950103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerDied","Data":"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8"} Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.953512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerStarted","Data":"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56"} Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.953547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerStarted","Data":"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd"} Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.954714 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:08 crc kubenswrapper[5030]: I0120 23:05:08.974103 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" podStartSLOduration=3.9740897840000002 podStartE2EDuration="3.974089784s" podCreationTimestamp="2026-01-20 23:05:05 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:08.971158803 +0000 UTC m=+1781.291419091" watchObservedRunningTime="2026-01-20 23:05:08.974089784 +0000 UTC m=+1781.294350072" Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.046552 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.509090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.937026 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.979945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerStarted","Data":"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b"} Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.979990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerStarted","Data":"cf810cb414b460060cb68353480728f3d35d4508163577f0f7c0716bd8de5958"} Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.982387 5030 generic.go:334] "Generic (PLEG): container finished" podID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerID="7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645" exitCode=0 Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.983473 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.983985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerDied","Data":"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645"} Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.984015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"991271c7-67ba-45ae-ba73-f18c7b800f18","Type":"ContainerDied","Data":"6823ffec3be245ed7155c947b58ea087a2f0c5cf836819ef0d574219da54777d"} Jan 20 23:05:09 crc kubenswrapper[5030]: I0120 23:05:09.984036 5030 scope.go:117] "RemoveContainer" containerID="083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.015531 5030 scope.go:117] "RemoveContainer" containerID="7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hfx\" (UniqueName: \"kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.037748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id\") pod \"991271c7-67ba-45ae-ba73-f18c7b800f18\" (UID: \"991271c7-67ba-45ae-ba73-f18c7b800f18\") " Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.039700 5030 scope.go:117] "RemoveContainer" containerID="083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8" Jan 20 23:05:10 crc kubenswrapper[5030]: E0120 23:05:10.040691 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8\": container with ID starting with 083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8 not found: ID does not exist" containerID="083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.040718 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8"} err="failed to get container status \"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8\": rpc error: code = NotFound desc = could not find container \"083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8\": container with ID starting with 083fa6e70157ee0d9cc4d23b5e281cbba6cc4848736abacffb288f9d1649d1c8 not found: ID does not exist" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.040716 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.040737 5030 scope.go:117] "RemoveContainer" containerID="7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645" Jan 20 23:05:10 crc kubenswrapper[5030]: E0120 23:05:10.041716 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645\": container with ID starting with 7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645 not found: ID does not exist" containerID="7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.041749 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645"} err="failed to get container status \"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645\": rpc error: code = NotFound desc = could not find container \"7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645\": container with ID starting with 7aaef5944257c8160778a89d0dad44a7bad0610243894c7f621c8bfee35d7645 not found: ID does not exist" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.044157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx" (OuterVolumeSpecName: "kube-api-access-m5hfx") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "kube-api-access-m5hfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.044241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts" (OuterVolumeSpecName: "scripts") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.052414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.087514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.131953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data" (OuterVolumeSpecName: "config-data") pod "991271c7-67ba-45ae-ba73-f18c7b800f18" (UID: "991271c7-67ba-45ae-ba73-f18c7b800f18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141165 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/991271c7-67ba-45ae-ba73-f18c7b800f18-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141349 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141414 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141465 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141667 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hfx\" (UniqueName: \"kubernetes.io/projected/991271c7-67ba-45ae-ba73-f18c7b800f18-kube-api-access-m5hfx\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.141733 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991271c7-67ba-45ae-ba73-f18c7b800f18-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.315538 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.331187 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.344638 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:10 crc kubenswrapper[5030]: E0120 23:05:10.345187 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="cinder-scheduler" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.345280 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="cinder-scheduler" Jan 20 23:05:10 crc kubenswrapper[5030]: E0120 23:05:10.345362 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="probe" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.345424 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="probe" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.345679 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="probe" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.345742 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" containerName="cinder-scheduler" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.346655 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.348946 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.377932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.445972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.446110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bxk\" (UniqueName: \"kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.446143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.446183 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.446202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.446355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.547752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.547800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.547849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.547881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.547944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.548101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bxk\" (UniqueName: \"kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.548408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.552051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc 
kubenswrapper[5030]: I0120 23:05:10.552197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.552226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.552735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.564086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bxk\" (UniqueName: \"kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk\") pod \"cinder-scheduler-0\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:10 crc kubenswrapper[5030]: I0120 23:05:10.670065 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:11 crc kubenswrapper[5030]: I0120 23:05:11.000424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerStarted","Data":"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665"} Jan 20 23:05:11 crc kubenswrapper[5030]: I0120 23:05:11.001159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:11 crc kubenswrapper[5030]: I0120 23:05:11.028797 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" podStartSLOduration=3.028775618 podStartE2EDuration="3.028775618s" podCreationTimestamp="2026-01-20 23:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:11.01942581 +0000 UTC m=+1783.339686098" watchObservedRunningTime="2026-01-20 23:05:11.028775618 +0000 UTC m=+1783.349035906" Jan 20 23:05:11 crc kubenswrapper[5030]: W0120 23:05:11.172679 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1728938b_621e_44f4_91e2_c061a23aa819.slice/crio-757f521c9d31487a057624a6b3e97bff2316d3bdd4bafa5de2112e15d8363490 WatchSource:0}: Error finding container 757f521c9d31487a057624a6b3e97bff2316d3bdd4bafa5de2112e15d8363490: Status 404 returned error can't find the container with id 757f521c9d31487a057624a6b3e97bff2316d3bdd4bafa5de2112e15d8363490 Jan 20 23:05:11 crc kubenswrapper[5030]: I0120 23:05:11.180035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:05:11 crc kubenswrapper[5030]: I0120 23:05:11.978776 5030 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="991271c7-67ba-45ae-ba73-f18c7b800f18" path="/var/lib/kubelet/pods/991271c7-67ba-45ae-ba73-f18c7b800f18/volumes" Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.012323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerStarted","Data":"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2"} Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.012442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerStarted","Data":"757f521c9d31487a057624a6b3e97bff2316d3bdd4bafa5de2112e15d8363490"} Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.288969 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.396922 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.458522 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.458834 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api-log" containerID="cri-o://e409d88079d3d24e3cc7058504a95aef60f2cbca47dae33b0546b8ba2533ca3a" gracePeriod=30 Jan 20 23:05:12 crc kubenswrapper[5030]: I0120 23:05:12.459237 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api" containerID="cri-o://0c3a5c19c1e410128793e3779060588805094169ed3914675342d1f99f09c499" gracePeriod=30 Jan 20 23:05:13 crc kubenswrapper[5030]: I0120 23:05:13.025516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerStarted","Data":"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940"} Jan 20 23:05:13 crc kubenswrapper[5030]: I0120 23:05:13.028367 5030 generic.go:334] "Generic (PLEG): container finished" podID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerID="e409d88079d3d24e3cc7058504a95aef60f2cbca47dae33b0546b8ba2533ca3a" exitCode=143 Jan 20 23:05:13 crc kubenswrapper[5030]: I0120 23:05:13.029231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerDied","Data":"e409d88079d3d24e3cc7058504a95aef60f2cbca47dae33b0546b8ba2533ca3a"} Jan 20 23:05:13 crc kubenswrapper[5030]: I0120 23:05:13.046926 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.046910894 podStartE2EDuration="3.046910894s" podCreationTimestamp="2026-01-20 23:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:13.040302483 +0000 UTC m=+1785.360562761" watchObservedRunningTime="2026-01-20 23:05:13.046910894 +0000 UTC 
m=+1785.367171182" Jan 20 23:05:15 crc kubenswrapper[5030]: I0120 23:05:15.090174 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:05:15 crc kubenswrapper[5030]: I0120 23:05:15.615407 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.94:9311/healthcheck\": read tcp 10.217.0.2:58514->10.217.1.94:9311: read: connection reset by peer" Jan 20 23:05:15 crc kubenswrapper[5030]: I0120 23:05:15.616183 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.94:9311/healthcheck\": read tcp 10.217.0.2:58500->10.217.1.94:9311: read: connection reset by peer" Jan 20 23:05:15 crc kubenswrapper[5030]: I0120 23:05:15.670234 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:15 crc kubenswrapper[5030]: I0120 23:05:15.964086 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:05:15 crc kubenswrapper[5030]: E0120 23:05:15.964761 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.063858 5030 generic.go:334] "Generic (PLEG): container finished" podID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerID="0c3a5c19c1e410128793e3779060588805094169ed3914675342d1f99f09c499" exitCode=0 Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.063906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerDied","Data":"0c3a5c19c1e410128793e3779060588805094169ed3914675342d1f99f09c499"} Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.063935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" event={"ID":"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d","Type":"ContainerDied","Data":"b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d"} Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.063949 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1fbe7575ca366617a1b55d7a8942e235d6b557dd4b0def85a9a0496a527198d" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.064755 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.160855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm64x\" (UniqueName: \"kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x\") pod \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.161037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom\") pod \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.161930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle\") pod \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.161957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data\") pod \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.162035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs\") pod \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\" (UID: \"605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d\") " Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.162480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs" (OuterVolumeSpecName: "logs") pod "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" (UID: "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.166869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x" (OuterVolumeSpecName: "kube-api-access-vm64x") pod "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" (UID: "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d"). InnerVolumeSpecName "kube-api-access-vm64x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.167151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" (UID: "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.196311 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" (UID: "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.232504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data" (OuterVolumeSpecName: "config-data") pod "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" (UID: "605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.263812 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.263876 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm64x\" (UniqueName: \"kubernetes.io/projected/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-kube-api-access-vm64x\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.263902 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.263926 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:16 crc kubenswrapper[5030]: I0120 23:05:16.263944 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:17 crc kubenswrapper[5030]: I0120 23:05:17.073088 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-799ff9684d-67v2r" Jan 20 23:05:17 crc kubenswrapper[5030]: I0120 23:05:17.131525 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:05:17 crc kubenswrapper[5030]: I0120 23:05:17.142400 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-799ff9684d-67v2r"] Jan 20 23:05:17 crc kubenswrapper[5030]: I0120 23:05:17.979597 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" path="/var/lib/kubelet/pods/605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d/volumes" Jan 20 23:05:20 crc kubenswrapper[5030]: I0120 23:05:20.904740 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:05:23 crc kubenswrapper[5030]: I0120 23:05:23.091721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:05:23 crc kubenswrapper[5030]: I0120 23:05:23.117588 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:05:23 crc kubenswrapper[5030]: I0120 23:05:23.157210 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:29 crc kubenswrapper[5030]: I0120 23:05:29.962267 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:05:29 crc kubenswrapper[5030]: E0120 23:05:29.964423 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:05:31 crc kubenswrapper[5030]: I0120 23:05:31.681116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.082580 5030 scope.go:117] "RemoveContainer" containerID="f18030c78bda0c891f73eaa93b340a7df761b53cca2ed85b4b69c8267068017e" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.116330 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:05:35 crc kubenswrapper[5030]: E0120 23:05:35.123724 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.123935 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api" Jan 20 23:05:35 crc kubenswrapper[5030]: E0120 23:05:35.124088 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api-log" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.124189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api-log" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.124648 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api-log" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.129548 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="605ed7f2-e057-4d9b-b80c-1b80a0b3ae4d" containerName="barbican-api" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.130677 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.134454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-dcfsp" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.134951 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.141070 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.143144 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.183585 5030 scope.go:117] "RemoveContainer" containerID="c81c0d711506b6eeceb9bbf72e9abc922b35819dae6ed02a943b536214b59bb3" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.206767 5030 scope.go:117] "RemoveContainer" containerID="a2c273022a42bfa06dbc3b574e6c50272816d7abc8a2a2ee16e3f3ee91c019cb" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.227729 5030 scope.go:117] "RemoveContainer" containerID="a7766a0903428d42d7f819e6196e488f4f97723340372fdc9bc2a319a03b945f" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.245810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.245901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrcwj\" (UniqueName: \"kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.245955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.246035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.251799 5030 scope.go:117] "RemoveContainer" containerID="e246add32a8f2b92c56ef74c9ae087941267aacd2dac7ba62b4badf399e3ef9a" Jan 20 23:05:35 crc 
kubenswrapper[5030]: I0120 23:05:35.290832 5030 scope.go:117] "RemoveContainer" containerID="f65bf0c0a190a4be243558d97f9121553b84f600642d4efa0249963226d5019d" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.325030 5030 scope.go:117] "RemoveContainer" containerID="f2c5b9db286e929674b9d0249c5dd2cb7c764e76513905ab03fd1fffafa3a335" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.348075 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.348167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.348219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrcwj\" (UniqueName: \"kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.348256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.351013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.354752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.357482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.364014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrcwj\" (UniqueName: \"kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj\") pod \"openstackclient\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.369075 5030 scope.go:117] "RemoveContainer" 
containerID="b9431845f134e3c49f0b4fda626f37bee07bab7e57e203f049a9bdac3398065b" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.398288 5030 scope.go:117] "RemoveContainer" containerID="36afc91b2b88250d491610095902402ca8687e8f5009809524f0cc02447b6ac9" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.439133 5030 scope.go:117] "RemoveContainer" containerID="f1f879bde92b8e7e837720c11d4960f9a23eb7c5819999adfe14cc248145d4fc" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.469350 5030 scope.go:117] "RemoveContainer" containerID="2fb065624a0cd717b1029696de26712931b9386e0c465d5a6fc6c506a938300a" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.491449 5030 scope.go:117] "RemoveContainer" containerID="51833c9fe476e8a63424a4a38609c31c548e23bb8b6381dba3b27392a0a655de" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.498773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.513429 5030 scope.go:117] "RemoveContainer" containerID="e873a504335074ef9b6e599db1a678cb382455d2f7f6a91a65fca438ad552396" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.552892 5030 scope.go:117] "RemoveContainer" containerID="0d2d09ed40317efbf1687169ffd9762f52fd37ce6338e32a2cef774fa2697e92" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.587238 5030 scope.go:117] "RemoveContainer" containerID="15f3f2e178878a7a7591f1942904d8f760ff5c2f3da40e1b5354683582d9d89a" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.624336 5030 scope.go:117] "RemoveContainer" containerID="58c4c940230c2969da4de4c10a965e74815f9a0617d579d94c2cea62112932fc" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.662738 5030 scope.go:117] "RemoveContainer" containerID="1fdbf3f0525c086956ed40b1d3edf22385f43cf65cd8c8f8464f203907256fa6" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.692923 5030 scope.go:117] "RemoveContainer" containerID="44f6f5122053dee469c5f5f15881894d07ecebb1b02c5a8aeebf93f8fe66c416" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.731036 5030 scope.go:117] "RemoveContainer" containerID="adf6b9b7c07af52aab5a63971fbefa3a17b151f0c57d391ef2c2b3e03cca6ed6" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.752039 5030 scope.go:117] "RemoveContainer" containerID="930e742a8c3b6bfba53d77a91e0a752230ac32c65e3b1f9f159ccd247b30505f" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.790345 5030 scope.go:117] "RemoveContainer" containerID="9229318f7cb77ae4d53c9c07b2207d74bfdc19cd971563178ba54067d58e96ad" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.816845 5030 scope.go:117] "RemoveContainer" containerID="ed4735ac8e65d646343fe55203e0d23a01fa79f4cd61e5be232941c6f5de79bb" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.834403 5030 scope.go:117] "RemoveContainer" containerID="b55acc6188f53bf283b05b9661bb3eb3caf7b2386532ae980483b10b21b0a5fb" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.884895 5030 scope.go:117] "RemoveContainer" containerID="838a40628f24c61170dd8dd7851a372a7e7957c4115606ae38fb7c89ecebe31e" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.922213 5030 scope.go:117] "RemoveContainer" containerID="8aaf087519b345b06fa1b7e7bcfa88a8075014f84a204827b107d99bae073ccb" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.945095 5030 scope.go:117] "RemoveContainer" containerID="9c3071b7e01af1fad2cd7e74c1d4926f81b1317666d5ffe14fa1a2e5d0253fcb" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.951240 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.963507 5030 scope.go:117] "RemoveContainer" containerID="272f3362cfb0f6639a3278329b5b414c51b4001addfaf831fd46b21f2f8238ee" Jan 20 23:05:35 crc kubenswrapper[5030]: I0120 23:05:35.987416 5030 scope.go:117] "RemoveContainer" containerID="c4dcf5e1ee539c112bd3f6d2e928eb1b4f90aa50eaed6f7be68fbe2d8e6e37e8" Jan 20 23:05:36 crc kubenswrapper[5030]: I0120 23:05:36.009205 5030 scope.go:117] "RemoveContainer" containerID="7228ae851e48f872fc68053da066eb092303bbfc14163b39931113d15daeae2a" Jan 20 23:05:36 crc kubenswrapper[5030]: I0120 23:05:36.352794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99","Type":"ContainerStarted","Data":"fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261"} Jan 20 23:05:36 crc kubenswrapper[5030]: I0120 23:05:36.353075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99","Type":"ContainerStarted","Data":"524ed3d080a6567afd674ecafba1a17e305340a068ae356d5f33c3322f09abc7"} Jan 20 23:05:36 crc kubenswrapper[5030]: I0120 23:05:36.373709 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.373683921 podStartE2EDuration="1.373683921s" podCreationTimestamp="2026-01-20 23:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:36.365942851 +0000 UTC m=+1808.686203129" watchObservedRunningTime="2026-01-20 23:05:36.373683921 +0000 UTC m=+1808.693944219" Jan 20 23:05:36 crc kubenswrapper[5030]: I0120 23:05:36.387009 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.353172 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.355090 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.357570 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.358063 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.361749 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.372920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl559\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.437895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl559\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.539951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.540453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.540649 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.541053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.540872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.548679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.548815 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.551937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.552905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.553605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.566441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl559\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559\") pod \"swift-proxy-6c6dffc9bb-hmplz\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:38 crc kubenswrapper[5030]: I0120 23:05:38.675711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.059980 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.121205 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.121520 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-api" containerID="cri-o://7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd" gracePeriod=30 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.121693 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-httpd" containerID="cri-o://e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56" gracePeriod=30 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.147470 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.418520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerStarted","Data":"7d91b777f27a8846dc117e3e1f244fee0123666d9f72ac9992025e702d704e39"} Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.421410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerStarted","Data":"a06a815ed889dad6f5811fac204d7f02ab487d8f58ddbdfe6d158f07d14d0a34"} Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.421471 5030 generic.go:334] "Generic (PLEG): container finished" podID="e1e5066b-6521-4a21-aec7-ee8470028811" containerID="e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56" exitCode=0 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.421479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerDied","Data":"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56"} Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.959117 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.959520 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-central-agent" containerID="cri-o://4f9f113804a4276a11bbd7a355acea21f00bc02ed42433215af9f3ec417b390c" gracePeriod=30 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.959744 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-notification-agent" containerID="cri-o://5141cc3f8fec2508048bd571d6893ee4dd69b19c8d0b4735e355b4a0c9bb32b3" gracePeriod=30 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.959864 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="sg-core" containerID="cri-o://d64f84f11d342393009bbaea6a332279247d7db0d59200ce6548a27f2d410c13" gracePeriod=30 Jan 20 23:05:39 crc kubenswrapper[5030]: I0120 23:05:39.959748 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="proxy-httpd" containerID="cri-o://2ef52465ddd335691209703804aeb21ba338d53e8296574b1eca1b4832254289" gracePeriod=30 Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437601 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b70086d-b309-4113-bcee-34b36bf512a7" containerID="2ef52465ddd335691209703804aeb21ba338d53e8296574b1eca1b4832254289" exitCode=0 Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437661 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b70086d-b309-4113-bcee-34b36bf512a7" containerID="d64f84f11d342393009bbaea6a332279247d7db0d59200ce6548a27f2d410c13" exitCode=2 Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437670 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b70086d-b309-4113-bcee-34b36bf512a7" containerID="4f9f113804a4276a11bbd7a355acea21f00bc02ed42433215af9f3ec417b390c" exitCode=0 Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerDied","Data":"2ef52465ddd335691209703804aeb21ba338d53e8296574b1eca1b4832254289"} Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerDied","Data":"d64f84f11d342393009bbaea6a332279247d7db0d59200ce6548a27f2d410c13"} Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.437744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerDied","Data":"4f9f113804a4276a11bbd7a355acea21f00bc02ed42433215af9f3ec417b390c"} Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.441063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerStarted","Data":"1ace08f4089db5eedfbc26aaa16427e4c52aaf88c8dbe004dece963061aab561"} Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.441370 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.441384 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:40 crc kubenswrapper[5030]: I0120 23:05:40.481799 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" podStartSLOduration=2.481781568 podStartE2EDuration="2.481781568s" podCreationTimestamp="2026-01-20 23:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:40.469006887 +0000 UTC m=+1812.789267175" 
watchObservedRunningTime="2026-01-20 23:05:40.481781568 +0000 UTC m=+1812.802041856" Jan 20 23:05:42 crc kubenswrapper[5030]: I0120 23:05:42.963144 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:05:42 crc kubenswrapper[5030]: E0120 23:05:42.963543 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.493972 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b70086d-b309-4113-bcee-34b36bf512a7" containerID="5141cc3f8fec2508048bd571d6893ee4dd69b19c8d0b4735e355b4a0c9bb32b3" exitCode=0 Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.494018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerDied","Data":"5141cc3f8fec2508048bd571d6893ee4dd69b19c8d0b4735e355b4a0c9bb32b3"} Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.638031 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753068 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9tm\" (UniqueName: \"kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" 
(UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml\") pod \"7b70086d-b309-4113-bcee-34b36bf512a7\" (UID: \"7b70086d-b309-4113-bcee-34b36bf512a7\") " Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.753591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.754090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.764891 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts" (OuterVolumeSpecName: "scripts") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.765143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm" (OuterVolumeSpecName: "kube-api-access-kk9tm") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "kube-api-access-kk9tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.787250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.842926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855144 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855176 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9tm\" (UniqueName: \"kubernetes.io/projected/7b70086d-b309-4113-bcee-34b36bf512a7-kube-api-access-kk9tm\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855185 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855194 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b70086d-b309-4113-bcee-34b36bf512a7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855203 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.855212 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.873143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data" (OuterVolumeSpecName: "config-data") pod "7b70086d-b309-4113-bcee-34b36bf512a7" (UID: "7b70086d-b309-4113-bcee-34b36bf512a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:43 crc kubenswrapper[5030]: I0120 23:05:43.957106 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b70086d-b309-4113-bcee-34b36bf512a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.210657 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.262368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config\") pod \"e1e5066b-6521-4a21-aec7-ee8470028811\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.262422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs\") pod \"e1e5066b-6521-4a21-aec7-ee8470028811\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.262575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle\") pod \"e1e5066b-6521-4a21-aec7-ee8470028811\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.262605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drwkm\" (UniqueName: \"kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm\") pod \"e1e5066b-6521-4a21-aec7-ee8470028811\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.262654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config\") pod \"e1e5066b-6521-4a21-aec7-ee8470028811\" (UID: \"e1e5066b-6521-4a21-aec7-ee8470028811\") " Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.268603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e1e5066b-6521-4a21-aec7-ee8470028811" (UID: "e1e5066b-6521-4a21-aec7-ee8470028811"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.269845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm" (OuterVolumeSpecName: "kube-api-access-drwkm") pod "e1e5066b-6521-4a21-aec7-ee8470028811" (UID: "e1e5066b-6521-4a21-aec7-ee8470028811"). InnerVolumeSpecName "kube-api-access-drwkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.318874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1e5066b-6521-4a21-aec7-ee8470028811" (UID: "e1e5066b-6521-4a21-aec7-ee8470028811"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.333136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config" (OuterVolumeSpecName: "config") pod "e1e5066b-6521-4a21-aec7-ee8470028811" (UID: "e1e5066b-6521-4a21-aec7-ee8470028811"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.354518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e1e5066b-6521-4a21-aec7-ee8470028811" (UID: "e1e5066b-6521-4a21-aec7-ee8470028811"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.364659 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.364688 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.364699 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.364712 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drwkm\" (UniqueName: \"kubernetes.io/projected/e1e5066b-6521-4a21-aec7-ee8470028811-kube-api-access-drwkm\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.364721 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1e5066b-6521-4a21-aec7-ee8470028811-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.507343 5030 generic.go:334] "Generic (PLEG): container finished" podID="e1e5066b-6521-4a21-aec7-ee8470028811" containerID="7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd" exitCode=0 Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.507383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerDied","Data":"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd"} Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.507438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" event={"ID":"e1e5066b-6521-4a21-aec7-ee8470028811","Type":"ContainerDied","Data":"677d9236d3f6ac5ea5aa59b789ad9d6e06273d145af3504e29bf0b2f5d6daf4c"} Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.507468 5030 scope.go:117] "RemoveContainer" containerID="e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.507472 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-85b4b556b9-gdz7b" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.510704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7b70086d-b309-4113-bcee-34b36bf512a7","Type":"ContainerDied","Data":"fea5bc5d2c00aa5413463a34d36196ae530d12109a24852923f21ad9d43493d4"} Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.510787 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.548839 5030 scope.go:117] "RemoveContainer" containerID="7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.561492 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.569234 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.584950 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590538 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590876 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-central-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590896 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-central-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590914 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-notification-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590921 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-notification-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590940 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="proxy-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590947 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="proxy-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590957 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-api" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590963 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-api" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590974 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="sg-core" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590979 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="sg-core" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.590991 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.590997 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591142 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-api" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591157 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="proxy-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591168 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-central-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591178 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" containerName="neutron-httpd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591193 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="sg-core" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.591203 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" containerName="ceilometer-notification-agent" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.592882 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.592973 5030 scope.go:117] "RemoveContainer" containerID="e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.594149 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56\": container with ID starting with e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56 not found: ID does not exist" containerID="e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.594203 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56"} err="failed to get container status \"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56\": rpc error: code = NotFound desc = could not find container \"e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56\": container with ID starting with e84cff37bf897c3fe607c04c0f3891a72b6ad601a4b9cea5ae770d85b26fdc56 not found: ID does not exist" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.594233 5030 scope.go:117] "RemoveContainer" containerID="7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd" Jan 20 23:05:44 crc kubenswrapper[5030]: E0120 23:05:44.594445 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd\": container with ID starting with 7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd not found: ID does not exist" containerID="7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.594468 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd"} err="failed to get container status \"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd\": rpc error: code = NotFound desc = could not find container \"7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd\": container with ID starting with 
7d8afe6beeb49bea88586ad01735215dbf61aba50d91a2be011617309bccb5fd not found: ID does not exist" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.594482 5030 scope.go:117] "RemoveContainer" containerID="2ef52465ddd335691209703804aeb21ba338d53e8296574b1eca1b4832254289" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.599135 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.601874 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-85b4b556b9-gdz7b"] Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.601959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.629662 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.666983 5030 scope.go:117] "RemoveContainer" containerID="d64f84f11d342393009bbaea6a332279247d7db0d59200ce6548a27f2d410c13" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668767 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668802 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6gh\" (UniqueName: \"kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.668844 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.714611 5030 scope.go:117] "RemoveContainer" containerID="5141cc3f8fec2508048bd571d6893ee4dd69b19c8d0b4735e355b4a0c9bb32b3" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.732872 5030 scope.go:117] "RemoveContainer" containerID="4f9f113804a4276a11bbd7a355acea21f00bc02ed42433215af9f3ec417b390c" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6gh\" (UniqueName: \"kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.770562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.771370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" 
Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.771937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.777316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.777911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.778341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.778651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.793238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6gh\" (UniqueName: \"kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh\") pod \"ceilometer-0\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:44 crc kubenswrapper[5030]: I0120 23:05:44.924528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:45 crc kubenswrapper[5030]: I0120 23:05:45.357545 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:45 crc kubenswrapper[5030]: I0120 23:05:45.425062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:45 crc kubenswrapper[5030]: I0120 23:05:45.527314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerStarted","Data":"8bbce3af4abf07b8c3c807e790383cdd1737a7487bb843cefdbe8512e64c14a1"} Jan 20 23:05:45 crc kubenswrapper[5030]: I0120 23:05:45.973524 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b70086d-b309-4113-bcee-34b36bf512a7" path="/var/lib/kubelet/pods/7b70086d-b309-4113-bcee-34b36bf512a7/volumes" Jan 20 23:05:45 crc kubenswrapper[5030]: I0120 23:05:45.975580 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e5066b-6521-4a21-aec7-ee8470028811" path="/var/lib/kubelet/pods/e1e5066b-6521-4a21-aec7-ee8470028811/volumes" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.400313 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-2m9r9"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.401404 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.414586 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-2m9r9"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.504238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtsjs\" (UniqueName: \"kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.504285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.517260 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.518438 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.520907 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.526342 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-v84g9"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.527458 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.557002 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.562243 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-v84g9"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.562973 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerStarted","Data":"a97a61f027df0970b8918a2764d2761555411d7a5abf6bb0302f267f55cf9b39"} Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2rc\" (UniqueName: \"kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtsjs\" (UniqueName: \"kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6dp5\" (UniqueName: \"kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.608753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.611434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.612638 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-8qnm7"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.613766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.621666 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-8qnm7"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.634255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtsjs\" (UniqueName: \"kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs\") pod \"nova-api-db-create-2m9r9\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.709811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts\") pod \"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.709895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2rc\" (UniqueName: \"kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.709943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.709977 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.710011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6dp5\" (UniqueName: \"kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.710035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82749\" (UniqueName: \"kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749\") pod 
\"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.711063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.711067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.714829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.722173 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.723265 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.729911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2rc\" (UniqueName: \"kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc\") pod \"nova-api-5185-account-create-update-fvtmn\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.730038 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.731650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.740294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6dp5\" (UniqueName: \"kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5\") pod \"nova-cell0-db-create-v84g9\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.811275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.811348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: 
\"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.811487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82749\" (UniqueName: \"kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749\") pod \"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.812424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts\") pod \"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.813256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts\") pod \"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.830859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82749\" (UniqueName: \"kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749\") pod \"nova-cell1-db-create-8qnm7\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.914024 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.914077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.914778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.917819 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.918990 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.922785 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.928187 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p"] Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.932163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf\") pod \"nova-cell0-c9bf-account-create-update-nqql2\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:46 crc kubenswrapper[5030]: I0120 23:05:46.998180 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.006222 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.016145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zhsq\" (UniqueName: \"kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.016190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.025996 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.093628 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.117912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zhsq\" (UniqueName: \"kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.118171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.118792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.138785 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zhsq\" (UniqueName: \"kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq\") pod \"nova-cell1-46ec-account-create-update-vnw9p\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.192272 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-2m9r9"] Jan 20 23:05:47 crc kubenswrapper[5030]: W0120 23:05:47.210454 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod288cd5d7_f17b_4184_9220_da634c367fde.slice/crio-d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556 WatchSource:0}: Error finding container d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556: Status 404 returned error can't find the container with id d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556 Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.246532 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.493722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn"] Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.581826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" event={"ID":"62549b05-ca68-4615-9cf8-ee0a6f5c5d69","Type":"ContainerStarted","Data":"0a829ec7cc8e154bafbe6898d6bb8035221de527a411977d67117c59fd874773"} Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.586176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" event={"ID":"288cd5d7-f17b-4184-9220-da634c367fde","Type":"ContainerStarted","Data":"9b9922dacab84e233a14985be81f6c61373bf733824192ed0355af9dc1045a6e"} Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.586221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" event={"ID":"288cd5d7-f17b-4184-9220-da634c367fde","Type":"ContainerStarted","Data":"d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556"} Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.619902 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" podStartSLOduration=1.6198868499999999 podStartE2EDuration="1.61988685s" podCreationTimestamp="2026-01-20 23:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:47.617375208 +0000 UTC m=+1819.937635496" watchObservedRunningTime="2026-01-20 23:05:47.61988685 +0000 UTC m=+1819.940147148" Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.626937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerStarted","Data":"02494de1f86f8014a5b0d66c7f0ce9155ce1cfece2389b90dd09177746a61a0e"} Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.626983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerStarted","Data":"6689a4b81616125bb5c8154771bc3d81c905135f719111aeef31d9c9cf266356"} Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.676113 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-8qnm7"] Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.683763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-v84g9"] Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.746307 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2"] Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.920778 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.921020 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-log" containerID="cri-o://23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9" 
gracePeriod=30 Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.921242 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-httpd" containerID="cri-o://2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e" gracePeriod=30 Jan 20 23:05:47 crc kubenswrapper[5030]: I0120 23:05:47.951424 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p"] Jan 20 23:05:47 crc kubenswrapper[5030]: W0120 23:05:47.951905 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod775ba99a_26a5_4fd1_a46b_0d85647a97aa.slice/crio-2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e WatchSource:0}: Error finding container 2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e: Status 404 returned error can't find the container with id 2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.640288 5030 generic.go:334] "Generic (PLEG): container finished" podID="775ba99a-26a5-4fd1-a46b-0d85647a97aa" containerID="272d75bf9c3c2e74c8b39ab9a0c6c94f4aca0e5a6fdfeb82d45bbdd9a5cc5e12" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.640389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" event={"ID":"775ba99a-26a5-4fd1-a46b-0d85647a97aa","Type":"ContainerDied","Data":"272d75bf9c3c2e74c8b39ab9a0c6c94f4aca0e5a6fdfeb82d45bbdd9a5cc5e12"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.640812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" event={"ID":"775ba99a-26a5-4fd1-a46b-0d85647a97aa","Type":"ContainerStarted","Data":"2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.642964 5030 generic.go:334] "Generic (PLEG): container finished" podID="48e38148-bd69-40ba-a439-0ed6e3be605a" containerID="57cfb712970271072b19d4d183fff7c643dc704e5e1907442c4839bf65f54d4c" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.643036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" event={"ID":"48e38148-bd69-40ba-a439-0ed6e3be605a","Type":"ContainerDied","Data":"57cfb712970271072b19d4d183fff7c643dc704e5e1907442c4839bf65f54d4c"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.643181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" event={"ID":"48e38148-bd69-40ba-a439-0ed6e3be605a","Type":"ContainerStarted","Data":"4373f0c93c3cf93cf6d302c4858cb4180a1bd7d5cbb5c9607692681caae30834"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.652403 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerID="23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9" exitCode=143 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.652561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerDied","Data":"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.661747 5030 generic.go:334] "Generic (PLEG): container finished" podID="62549b05-ca68-4615-9cf8-ee0a6f5c5d69" containerID="741a4b81aa00f963bf02f9332ccebe85e1d0760931f18f3eb6bcb184b92fd180" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.661803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" event={"ID":"62549b05-ca68-4615-9cf8-ee0a6f5c5d69","Type":"ContainerDied","Data":"741a4b81aa00f963bf02f9332ccebe85e1d0760931f18f3eb6bcb184b92fd180"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.663214 5030 generic.go:334] "Generic (PLEG): container finished" podID="ea3c7c52-f274-4811-8559-49b78f7b9bfa" containerID="e55c81a9c6be668bdf5b020760ecbb0376b4bd21b7daa84f73ecacc1bd8c9253" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.663249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" event={"ID":"ea3c7c52-f274-4811-8559-49b78f7b9bfa","Type":"ContainerDied","Data":"e55c81a9c6be668bdf5b020760ecbb0376b4bd21b7daa84f73ecacc1bd8c9253"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.663262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" event={"ID":"ea3c7c52-f274-4811-8559-49b78f7b9bfa","Type":"ContainerStarted","Data":"1fb0fec29c34dad147a6a3b8559f846ab80fd48695287ca4c1de50591626e15c"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.664988 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc12aecf-2818-47e1-81b0-8643dc895afb" containerID="a508ab8c85c7e03a4396f79456f3d11b9a5a5c184f3e100bc13ce8623c314a5a" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.665127 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" event={"ID":"dc12aecf-2818-47e1-81b0-8643dc895afb","Type":"ContainerDied","Data":"a508ab8c85c7e03a4396f79456f3d11b9a5a5c184f3e100bc13ce8623c314a5a"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.665155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" event={"ID":"dc12aecf-2818-47e1-81b0-8643dc895afb","Type":"ContainerStarted","Data":"2918bed48ca7c44bb4f2697dfa25157a61e6e8eecf4369509666e5c35408c88c"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.681498 5030 generic.go:334] "Generic (PLEG): container finished" podID="288cd5d7-f17b-4184-9220-da634c367fde" containerID="9b9922dacab84e233a14985be81f6c61373bf733824192ed0355af9dc1045a6e" exitCode=0 Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.681533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" event={"ID":"288cd5d7-f17b-4184-9220-da634c367fde","Type":"ContainerDied","Data":"9b9922dacab84e233a14985be81f6c61373bf733824192ed0355af9dc1045a6e"} Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.688582 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:48 crc kubenswrapper[5030]: I0120 23:05:48.692148 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:05:49 crc kubenswrapper[5030]: 
I0120 23:05:49.546426 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.547011 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-httpd" containerID="cri-o://355a65489075362570fc4a79cf6fc0137cce9d33afec9c8da23c87eea6ec654f" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.546956 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-log" containerID="cri-o://11ba608fcb875ff78e07ffe256b78dc318fe1c7fa59d1ff1d49fd823f5cb3132" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerStarted","Data":"29e3ce0be17e64dc30a9d69ac467390c805d0134bcdd32cce3b1b94cab6612f9"} Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692825 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-notification-agent" containerID="cri-o://6689a4b81616125bb5c8154771bc3d81c905135f719111aeef31d9c9cf266356" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692837 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692759 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="sg-core" containerID="cri-o://02494de1f86f8014a5b0d66c7f0ce9155ce1cfece2389b90dd09177746a61a0e" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692733 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-central-agent" containerID="cri-o://a97a61f027df0970b8918a2764d2761555411d7a5abf6bb0302f267f55cf9b39" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.692774 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="proxy-httpd" containerID="cri-o://29e3ce0be17e64dc30a9d69ac467390c805d0134bcdd32cce3b1b94cab6612f9" gracePeriod=30 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.697897 5030 generic.go:334] "Generic (PLEG): container finished" podID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerID="11ba608fcb875ff78e07ffe256b78dc318fe1c7fa59d1ff1d49fd823f5cb3132" exitCode=143 Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.697930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerDied","Data":"11ba608fcb875ff78e07ffe256b78dc318fe1c7fa59d1ff1d49fd823f5cb3132"} Jan 20 23:05:49 crc kubenswrapper[5030]: I0120 23:05:49.721072 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.5021052900000003 podStartE2EDuration="5.721054917s" podCreationTimestamp="2026-01-20 23:05:44 +0000 UTC" firstStartedPulling="2026-01-20 23:05:45.362376905 +0000 UTC m=+1817.682637193" lastFinishedPulling="2026-01-20 23:05:48.581326522 +0000 UTC m=+1820.901586820" observedRunningTime="2026-01-20 23:05:49.714033086 +0000 UTC m=+1822.034293374" watchObservedRunningTime="2026-01-20 23:05:49.721054917 +0000 UTC m=+1822.041315205" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.121464 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.186380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts\") pod \"dc12aecf-2818-47e1-81b0-8643dc895afb\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.186448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82749\" (UniqueName: \"kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749\") pod \"dc12aecf-2818-47e1-81b0-8643dc895afb\" (UID: \"dc12aecf-2818-47e1-81b0-8643dc895afb\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.187323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc12aecf-2818-47e1-81b0-8643dc895afb" (UID: "dc12aecf-2818-47e1-81b0-8643dc895afb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.195344 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.197565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749" (OuterVolumeSpecName: "kube-api-access-82749") pod "dc12aecf-2818-47e1-81b0-8643dc895afb" (UID: "dc12aecf-2818-47e1-81b0-8643dc895afb"). InnerVolumeSpecName "kube-api-access-82749". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.230415 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.241211 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.250141 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.262751 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.287950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts\") pod \"288cd5d7-f17b-4184-9220-da634c367fde\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.287993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zhsq\" (UniqueName: \"kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq\") pod \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.288067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts\") pod \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\" (UID: \"775ba99a-26a5-4fd1-a46b-0d85647a97aa\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.288853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts\") pod \"48e38148-bd69-40ba-a439-0ed6e3be605a\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.288921 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "288cd5d7-f17b-4184-9220-da634c367fde" (UID: "288cd5d7-f17b-4184-9220-da634c367fde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.288975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "775ba99a-26a5-4fd1-a46b-0d85647a97aa" (UID: "775ba99a-26a5-4fd1-a46b-0d85647a97aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6dp5\" (UniqueName: \"kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5\") pod \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtsjs\" (UniqueName: \"kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs\") pod \"288cd5d7-f17b-4184-9220-da634c367fde\" (UID: \"288cd5d7-f17b-4184-9220-da634c367fde\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289121 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf\") pod \"48e38148-bd69-40ba-a439-0ed6e3be605a\" (UID: \"48e38148-bd69-40ba-a439-0ed6e3be605a\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts\") pod \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\" (UID: \"ea3c7c52-f274-4811-8559-49b78f7b9bfa\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289493 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48e38148-bd69-40ba-a439-0ed6e3be605a" (UID: "48e38148-bd69-40ba-a439-0ed6e3be605a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289600 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/775ba99a-26a5-4fd1-a46b-0d85647a97aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289631 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e38148-bd69-40ba-a439-0ed6e3be605a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289642 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc12aecf-2818-47e1-81b0-8643dc895afb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289654 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82749\" (UniqueName: \"kubernetes.io/projected/dc12aecf-2818-47e1-81b0-8643dc895afb-kube-api-access-82749\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289667 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/288cd5d7-f17b-4184-9220-da634c367fde-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.289854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea3c7c52-f274-4811-8559-49b78f7b9bfa" (UID: "ea3c7c52-f274-4811-8559-49b78f7b9bfa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.292606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs" (OuterVolumeSpecName: "kube-api-access-qtsjs") pod "288cd5d7-f17b-4184-9220-da634c367fde" (UID: "288cd5d7-f17b-4184-9220-da634c367fde"). InnerVolumeSpecName "kube-api-access-qtsjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.292678 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq" (OuterVolumeSpecName: "kube-api-access-7zhsq") pod "775ba99a-26a5-4fd1-a46b-0d85647a97aa" (UID: "775ba99a-26a5-4fd1-a46b-0d85647a97aa"). InnerVolumeSpecName "kube-api-access-7zhsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.295934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5" (OuterVolumeSpecName: "kube-api-access-d6dp5") pod "ea3c7c52-f274-4811-8559-49b78f7b9bfa" (UID: "ea3c7c52-f274-4811-8559-49b78f7b9bfa"). InnerVolumeSpecName "kube-api-access-d6dp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.299409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf" (OuterVolumeSpecName: "kube-api-access-hmgpf") pod "48e38148-bd69-40ba-a439-0ed6e3be605a" (UID: "48e38148-bd69-40ba-a439-0ed6e3be605a"). InnerVolumeSpecName "kube-api-access-hmgpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts\") pod \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2rc\" (UniqueName: \"kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc\") pod \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\" (UID: \"62549b05-ca68-4615-9cf8-ee0a6f5c5d69\") " Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391896 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6dp5\" (UniqueName: \"kubernetes.io/projected/ea3c7c52-f274-4811-8559-49b78f7b9bfa-kube-api-access-d6dp5\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391913 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtsjs\" (UniqueName: \"kubernetes.io/projected/288cd5d7-f17b-4184-9220-da634c367fde-kube-api-access-qtsjs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391924 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmgpf\" (UniqueName: \"kubernetes.io/projected/48e38148-bd69-40ba-a439-0ed6e3be605a-kube-api-access-hmgpf\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391933 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea3c7c52-f274-4811-8559-49b78f7b9bfa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.391941 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zhsq\" (UniqueName: \"kubernetes.io/projected/775ba99a-26a5-4fd1-a46b-0d85647a97aa-kube-api-access-7zhsq\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.395227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62549b05-ca68-4615-9cf8-ee0a6f5c5d69" (UID: "62549b05-ca68-4615-9cf8-ee0a6f5c5d69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.396310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc" (OuterVolumeSpecName: "kube-api-access-vm2rc") pod "62549b05-ca68-4615-9cf8-ee0a6f5c5d69" (UID: "62549b05-ca68-4615-9cf8-ee0a6f5c5d69"). InnerVolumeSpecName "kube-api-access-vm2rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.492988 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.493015 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2rc\" (UniqueName: \"kubernetes.io/projected/62549b05-ca68-4615-9cf8-ee0a6f5c5d69-kube-api-access-vm2rc\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.709158 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.709154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn" event={"ID":"62549b05-ca68-4615-9cf8-ee0a6f5c5d69","Type":"ContainerDied","Data":"0a829ec7cc8e154bafbe6898d6bb8035221de527a411977d67117c59fd874773"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.709299 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a829ec7cc8e154bafbe6898d6bb8035221de527a411977d67117c59fd874773" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.711291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" event={"ID":"ea3c7c52-f274-4811-8559-49b78f7b9bfa","Type":"ContainerDied","Data":"1fb0fec29c34dad147a6a3b8559f846ab80fd48695287ca4c1de50591626e15c"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.711341 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fb0fec29c34dad147a6a3b8559f846ab80fd48695287ca4c1de50591626e15c" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.711312 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-v84g9" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.713337 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.713372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-8qnm7" event={"ID":"dc12aecf-2818-47e1-81b0-8643dc895afb","Type":"ContainerDied","Data":"2918bed48ca7c44bb4f2697dfa25157a61e6e8eecf4369509666e5c35408c88c"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.713419 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2918bed48ca7c44bb4f2697dfa25157a61e6e8eecf4369509666e5c35408c88c" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.715965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" event={"ID":"288cd5d7-f17b-4184-9220-da634c367fde","Type":"ContainerDied","Data":"d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.715992 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-2m9r9" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.716010 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3aeb04e2bf42b997441c743340619a7fbc53153dbc29260e8c3a942773d1556" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721741 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerID="29e3ce0be17e64dc30a9d69ac467390c805d0134bcdd32cce3b1b94cab6612f9" exitCode=0 Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721780 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerID="02494de1f86f8014a5b0d66c7f0ce9155ce1cfece2389b90dd09177746a61a0e" exitCode=2 Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721799 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerID="6689a4b81616125bb5c8154771bc3d81c905135f719111aeef31d9c9cf266356" exitCode=0 Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerDied","Data":"29e3ce0be17e64dc30a9d69ac467390c805d0134bcdd32cce3b1b94cab6612f9"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerDied","Data":"02494de1f86f8014a5b0d66c7f0ce9155ce1cfece2389b90dd09177746a61a0e"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.721934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerDied","Data":"6689a4b81616125bb5c8154771bc3d81c905135f719111aeef31d9c9cf266356"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.724129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" event={"ID":"48e38148-bd69-40ba-a439-0ed6e3be605a","Type":"ContainerDied","Data":"4373f0c93c3cf93cf6d302c4858cb4180a1bd7d5cbb5c9607692681caae30834"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.724177 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4373f0c93c3cf93cf6d302c4858cb4180a1bd7d5cbb5c9607692681caae30834" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.724290 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.727753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" event={"ID":"775ba99a-26a5-4fd1-a46b-0d85647a97aa","Type":"ContainerDied","Data":"2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e"} Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.727792 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f25ba78a4e1130a353f26bf991d3abbeccede67c353c3bf791080b51aafe65e" Jan 20 23:05:50 crc kubenswrapper[5030]: I0120 23:05:50.727848 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.073003 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.88:9292/healthcheck\": read tcp 10.217.0.2:48932->10.217.1.88:9292: read: connection reset by peer" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.073070 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.88:9292/healthcheck\": read tcp 10.217.0.2:48924->10.217.1.88:9292: read: connection reset by peer" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.513542 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjxnf\" (UniqueName: \"kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " 
Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.613875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d1c1b889-379f-4f53-9426-cd5da16ccc34\" (UID: \"d1c1b889-379f-4f53-9426-cd5da16ccc34\") " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.614471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs" (OuterVolumeSpecName: "logs") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.614606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.618773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.624047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf" (OuterVolumeSpecName: "kube-api-access-jjxnf") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "kube-api-access-jjxnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.643481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts" (OuterVolumeSpecName: "scripts") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.653973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.680198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data" (OuterVolumeSpecName: "config-data") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.691979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1c1b889-379f-4f53-9426-cd5da16ccc34" (UID: "d1c1b889-379f-4f53-9426-cd5da16ccc34"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715125 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715149 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715160 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715171 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715178 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1c1b889-379f-4f53-9426-cd5da16ccc34-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715186 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1c1b889-379f-4f53-9426-cd5da16ccc34-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715221 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.715231 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjxnf\" (UniqueName: \"kubernetes.io/projected/d1c1b889-379f-4f53-9426-cd5da16ccc34-kube-api-access-jjxnf\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.737551 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.752008 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerID="2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e" exitCode=0 Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.752076 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.752100 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerDied","Data":"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e"} Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.752378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"d1c1b889-379f-4f53-9426-cd5da16ccc34","Type":"ContainerDied","Data":"2042ceed21d280b471824b70d6624fb08ba9a4c49f23f339828e93ab337ea50f"} Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.752401 5030 scope.go:117] "RemoveContainer" containerID="2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.780216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.782029 5030 scope.go:117] "RemoveContainer" containerID="23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.790815 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.813929 5030 scope.go:117] "RemoveContainer" containerID="2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.814468 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.814530 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e\": container with ID starting with 2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e not found: ID does not exist" containerID="2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.814593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e"} err="failed to get container status \"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e\": rpc error: code = NotFound desc = could not find container \"2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e\": container with ID starting with 2b554ea16953e30e845cf19ca4b9182ead41c46fc5580c12af02d9e53332bb1e not found: ID does not exist" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.814651 5030 scope.go:117] "RemoveContainer" containerID="23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.814991 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc12aecf-2818-47e1-81b0-8643dc895afb" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.815049 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc12aecf-2818-47e1-81b0-8643dc895afb" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.815117 5030 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ea3c7c52-f274-4811-8559-49b78f7b9bfa" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.815166 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3c7c52-f274-4811-8559-49b78f7b9bfa" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.815222 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288cd5d7-f17b-4184-9220-da634c367fde" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.815285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="288cd5d7-f17b-4184-9220-da634c367fde" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.817232 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62549b05-ca68-4615-9cf8-ee0a6f5c5d69" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.817333 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="62549b05-ca68-4615-9cf8-ee0a6f5c5d69" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.817402 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-log" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.817453 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-log" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.817505 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-httpd" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.817549 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-httpd" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.817601 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e38148-bd69-40ba-a439-0ed6e3be605a" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.817712 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e38148-bd69-40ba-a439-0ed6e3be605a" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.817770 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775ba99a-26a5-4fd1-a46b-0d85647a97aa" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.817819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="775ba99a-26a5-4fd1-a46b-0d85647a97aa" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818097 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="62549b05-ca68-4615-9cf8-ee0a6f5c5d69" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818154 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc12aecf-2818-47e1-81b0-8643dc895afb" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818219 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-httpd" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818275 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ea3c7c52-f274-4811-8559-49b78f7b9bfa" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818373 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="288cd5d7-f17b-4184-9220-da634c367fde" containerName="mariadb-database-create" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818447 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="775ba99a-26a5-4fd1-a46b-0d85647a97aa" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818507 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e38148-bd69-40ba-a439-0ed6e3be605a" containerName="mariadb-account-create-update" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.818559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" containerName="glance-log" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.819690 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.820237 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: E0120 23:05:51.815092 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9\": container with ID starting with 23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9 not found: ID does not exist" containerID="23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.820814 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9"} err="failed to get container status \"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9\": rpc error: code = NotFound desc = could not find container \"23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9\": container with ID starting with 23fb52203c1208e4eeb12c51e5097a053bd1b139254715b80d20e1ac118896b9 not found: ID does not exist" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.824937 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.825290 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.826935 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.920950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.921310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.921475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.921851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.921945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.921987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.922025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fnp\" (UniqueName: \"kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.922045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.922512 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n"] Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.923519 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.930309 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.930711 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-d552q" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.930798 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.938604 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n"] Jan 20 23:05:51 crc kubenswrapper[5030]: I0120 23:05:51.971912 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c1b889-379f-4f53-9426-cd5da16ccc34" path="/var/lib/kubelet/pods/d1c1b889-379f-4f53-9426-cd5da16ccc34/volumes" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023158 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5q2n\" (UniqueName: \"kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6fnp\" (UniqueName: \"kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc 
kubenswrapper[5030]: I0120 23:05:52.023241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023353 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.023897 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.024014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.028311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.028818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.029250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.029470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.038090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6fnp\" (UniqueName: \"kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.050398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.124974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.125052 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5q2n\" (UniqueName: \"kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.125098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: 
\"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.125116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.129438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.129463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.129909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.140274 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.143529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5q2n\" (UniqueName: \"kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n\") pod \"nova-cell0-conductor-db-sync-q5l2n\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.242064 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.556184 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:05:52 crc kubenswrapper[5030]: W0120 23:05:52.557370 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc14f499_5449_459d_874e_a7fe1b79e469.slice/crio-d7211d602a1c53fe5d6cdfaeaf27028436d65115df95d6b4395a12df62fad210 WatchSource:0}: Error finding container d7211d602a1c53fe5d6cdfaeaf27028436d65115df95d6b4395a12df62fad210: Status 404 returned error can't find the container with id d7211d602a1c53fe5d6cdfaeaf27028436d65115df95d6b4395a12df62fad210 Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.711593 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.87:9292/healthcheck\": read tcp 10.217.0.2:60794->10.217.1.87:9292: read: connection reset by peer" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.711593 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.87:9292/healthcheck\": read tcp 10.217.0.2:60806->10.217.1.87:9292: read: connection reset by peer" Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.719578 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n"] Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.782828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerStarted","Data":"d7211d602a1c53fe5d6cdfaeaf27028436d65115df95d6b4395a12df62fad210"} Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.791731 5030 generic.go:334] "Generic (PLEG): container finished" podID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerID="355a65489075362570fc4a79cf6fc0137cce9d33afec9c8da23c87eea6ec654f" exitCode=0 Jan 20 23:05:52 crc kubenswrapper[5030]: I0120 23:05:52.791770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerDied","Data":"355a65489075362570fc4a79cf6fc0137cce9d33afec9c8da23c87eea6ec654f"} Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.052090 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fwch\" (UniqueName: \"kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.145521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\" (UID: \"3a3d4bee-3c5a-4be3-8b79-af236dd08645\") " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.146317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.146505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs" (OuterVolumeSpecName: "logs") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.151394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts" (OuterVolumeSpecName: "scripts") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.151648 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch" (OuterVolumeSpecName: "kube-api-access-9fwch") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "kube-api-access-9fwch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.151583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.184338 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.217997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247678 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247709 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3d4bee-3c5a-4be3-8b79-af236dd08645-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247721 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247742 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247754 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fwch\" (UniqueName: \"kubernetes.io/projected/3a3d4bee-3c5a-4be3-8b79-af236dd08645-kube-api-access-9fwch\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247766 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247776 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.247758 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data" (OuterVolumeSpecName: "config-data") pod "3a3d4bee-3c5a-4be3-8b79-af236dd08645" (UID: "3a3d4bee-3c5a-4be3-8b79-af236dd08645"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.275395 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.348812 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3d4bee-3c5a-4be3-8b79-af236dd08645-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.348841 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.802731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerStarted","Data":"bb03a7a9462c3d1f022ea25c6e705e371d2aa4a25dcb01ea0294ce1846049a9b"} Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.805170 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.805691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"3a3d4bee-3c5a-4be3-8b79-af236dd08645","Type":"ContainerDied","Data":"d0500aefd223a7ad0a6cf73449a7d0bda27f7fa62d082286196e56b9d921b999"} Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.805732 5030 scope.go:117] "RemoveContainer" containerID="355a65489075362570fc4a79cf6fc0137cce9d33afec9c8da23c87eea6ec654f" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.807836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" event={"ID":"316f5040-d923-4ef3-9fbf-f52bf080d8dc","Type":"ContainerStarted","Data":"898572d4bfe70ff7e5c0bc5e173412aee6bb392651d855f94362d6a8dcdc23c8"} Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.807884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" event={"ID":"316f5040-d923-4ef3-9fbf-f52bf080d8dc","Type":"ContainerStarted","Data":"fd18b38560edfb3aba6c738cbccb5d14518a35d2513aaa6d81964ca7af420745"} Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.837254 5030 scope.go:117] "RemoveContainer" containerID="11ba608fcb875ff78e07ffe256b78dc318fe1c7fa59d1ff1d49fd823f5cb3132" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.839461 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" podStartSLOduration=2.839435565 podStartE2EDuration="2.839435565s" podCreationTimestamp="2026-01-20 23:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:53.831313917 +0000 UTC m=+1826.151574215" watchObservedRunningTime="2026-01-20 23:05:53.839435565 +0000 UTC m=+1826.159695863" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.859558 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.873794 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.883345 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:53 crc kubenswrapper[5030]: E0120 23:05:53.884878 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-httpd" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.884920 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-httpd" Jan 20 23:05:53 crc kubenswrapper[5030]: E0120 23:05:53.884933 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-log" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.884978 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-log" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.885550 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-httpd" Jan 20 23:05:53 crc 
kubenswrapper[5030]: I0120 23:05:53.885583 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" containerName="glance-log" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.889819 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.891646 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.892377 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.896815 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.964184 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:05:53 crc kubenswrapper[5030]: E0120 23:05:53.965218 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.973853 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a3d4bee-3c5a-4be3-8b79-af236dd08645" path="/var/lib/kubelet/pods/3a3d4bee-3c5a-4be3-8b79-af236dd08645/volumes" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.980976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzq55\" (UniqueName: \"kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: 
I0120 23:05:53.981115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:53 crc kubenswrapper[5030]: I0120 23:05:53.981211 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.082679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzq55\" (UniqueName: \"kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.083440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.083602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.084172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.084291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.084468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.084573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.084684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.085684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.085153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.085697 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.089213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.095725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.099741 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzq55\" (UniqueName: \"kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55\") pod \"glance-default-internal-api-0\" (UID: 
\"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.102494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.110010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.120631 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-internal-api-0\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.205152 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.629765 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:05:54 crc kubenswrapper[5030]: W0120 23:05:54.630884 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ec7945_d514_4036_b888_6cc63c552195.slice/crio-8d79895e347e67e7872b234130dafcf8fa3a7e82a3d87ab0a0ed73c7b1f413e8 WatchSource:0}: Error finding container 8d79895e347e67e7872b234130dafcf8fa3a7e82a3d87ab0a0ed73c7b1f413e8: Status 404 returned error can't find the container with id 8d79895e347e67e7872b234130dafcf8fa3a7e82a3d87ab0a0ed73c7b1f413e8 Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.817749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerStarted","Data":"8d79895e347e67e7872b234130dafcf8fa3a7e82a3d87ab0a0ed73c7b1f413e8"} Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.834227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerStarted","Data":"1779ccb659c7f6cf01660be8e527df0c53283c0e70f691746ab04f5da6b5a774"} Jan 20 23:05:54 crc kubenswrapper[5030]: I0120 23:05:54.864218 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.86419557 podStartE2EDuration="3.86419557s" podCreationTimestamp="2026-01-20 23:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:54.854150905 +0000 UTC m=+1827.174411193" watchObservedRunningTime="2026-01-20 23:05:54.86419557 +0000 UTC m=+1827.184455858" Jan 20 23:05:55 crc kubenswrapper[5030]: I0120 23:05:55.844209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerStarted","Data":"8e0043eba4e2ddfdb9777f320a29f7f7423ad5d5f2afe41d13e3932b5a6a9147"} Jan 20 23:05:55 crc kubenswrapper[5030]: I0120 23:05:55.844804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerStarted","Data":"459e0a1d8f69b89f034afa3986d4b2888fa00baf09bf1b847ebc91cc377c8c2a"} Jan 20 23:05:55 crc kubenswrapper[5030]: I0120 23:05:55.848393 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerID="a97a61f027df0970b8918a2764d2761555411d7a5abf6bb0302f267f55cf9b39" exitCode=0 Jan 20 23:05:55 crc kubenswrapper[5030]: I0120 23:05:55.848497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerDied","Data":"a97a61f027df0970b8918a2764d2761555411d7a5abf6bb0302f267f55cf9b39"} Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.155092 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.190138 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.1901163710000002 podStartE2EDuration="3.190116371s" podCreationTimestamp="2026-01-20 23:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:05:55.871743075 +0000 UTC m=+1828.192003363" watchObservedRunningTime="2026-01-20 23:05:56.190116371 +0000 UTC m=+1828.510376669" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml\") pod 
\"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.240776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd6gh\" (UniqueName: \"kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh\") pod \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\" (UID: \"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12\") " Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.241771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.241873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.246534 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts" (OuterVolumeSpecName: "scripts") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.246602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh" (OuterVolumeSpecName: "kube-api-access-wd6gh") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "kube-api-access-wd6gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.279024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.307734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.323274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data" (OuterVolumeSpecName: "config-data") pod "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" (UID: "4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342818 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342853 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342865 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342874 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342882 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342890 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.342899 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd6gh\" (UniqueName: \"kubernetes.io/projected/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12-kube-api-access-wd6gh\") on node \"crc\" DevicePath \"\"" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.870575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12","Type":"ContainerDied","Data":"8bbce3af4abf07b8c3c807e790383cdd1737a7487bb843cefdbe8512e64c14a1"} Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.870656 5030 scope.go:117] "RemoveContainer" containerID="29e3ce0be17e64dc30a9d69ac467390c805d0134bcdd32cce3b1b94cab6612f9" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.870656 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.911735 5030 scope.go:117] "RemoveContainer" containerID="02494de1f86f8014a5b0d66c7f0ce9155ce1cfece2389b90dd09177746a61a0e" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.924457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.931143 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.952818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:56 crc kubenswrapper[5030]: E0120 23:05:56.953148 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="proxy-httpd" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953164 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="proxy-httpd" Jan 20 23:05:56 crc kubenswrapper[5030]: E0120 23:05:56.953188 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="sg-core" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953194 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="sg-core" Jan 20 23:05:56 crc kubenswrapper[5030]: E0120 23:05:56.953209 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-central-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-central-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: E0120 23:05:56.953227 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-notification-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953233 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-notification-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953374 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="proxy-httpd" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953391 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-central-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953403 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="sg-core" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.953417 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" containerName="ceilometer-notification-agent" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.954802 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.957096 5030 scope.go:117] "RemoveContainer" containerID="6689a4b81616125bb5c8154771bc3d81c905135f719111aeef31d9c9cf266356" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.959226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.959254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.977187 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:56 crc kubenswrapper[5030]: I0120 23:05:56.986435 5030 scope.go:117] "RemoveContainer" containerID="a97a61f027df0970b8918a2764d2761555411d7a5abf6bb0302f267f55cf9b39" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9dn\" (UniqueName: \"kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058517 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.058764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9dn\" (UniqueName: \"kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.160997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.161049 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.161780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.162426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.167267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.168239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.174672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.175076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.180018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9dn\" (UniqueName: \"kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn\") pod \"ceilometer-0\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.277640 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.734104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:05:57 crc kubenswrapper[5030]: W0120 23:05:57.736522 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb54c2bb_728c_4701_9b57_cd322a642cb4.slice/crio-225b4c5fcd389367c1779de16751be5b936e5c359f04e2b11f1f03a183a4fe82 WatchSource:0}: Error finding container 225b4c5fcd389367c1779de16751be5b936e5c359f04e2b11f1f03a183a4fe82: Status 404 returned error can't find the container with id 225b4c5fcd389367c1779de16751be5b936e5c359f04e2b11f1f03a183a4fe82 Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.882468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerStarted","Data":"225b4c5fcd389367c1779de16751be5b936e5c359f04e2b11f1f03a183a4fe82"} Jan 20 23:05:57 crc kubenswrapper[5030]: I0120 23:05:57.985039 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12" path="/var/lib/kubelet/pods/4f45ba4a-c3b9-47ae-8c79-c70d3e1c1f12/volumes" Jan 20 23:05:58 crc kubenswrapper[5030]: I0120 23:05:58.895810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerStarted","Data":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} Jan 20 23:05:59 crc kubenswrapper[5030]: I0120 23:05:59.910354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerStarted","Data":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} Jan 20 23:06:00 crc kubenswrapper[5030]: I0120 23:06:00.920963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerStarted","Data":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} Jan 20 23:06:01 crc kubenswrapper[5030]: I0120 23:06:01.933336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerStarted","Data":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} Jan 20 23:06:01 crc kubenswrapper[5030]: I0120 23:06:01.933636 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:01 crc kubenswrapper[5030]: I0120 23:06:01.950633 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.222281767 podStartE2EDuration="5.950605212s" podCreationTimestamp="2026-01-20 23:05:56 +0000 UTC" firstStartedPulling="2026-01-20 23:05:57.742216922 +0000 UTC m=+1830.062477210" lastFinishedPulling="2026-01-20 23:06:01.470540367 +0000 UTC m=+1833.790800655" observedRunningTime="2026-01-20 23:06:01.949960587 +0000 UTC m=+1834.270220885" watchObservedRunningTime="2026-01-20 23:06:01.950605212 +0000 UTC m=+1834.270865500" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.141521 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.142716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.182250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.197580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.942781 5030 generic.go:334] "Generic (PLEG): container finished" podID="316f5040-d923-4ef3-9fbf-f52bf080d8dc" containerID="898572d4bfe70ff7e5c0bc5e173412aee6bb392651d855f94362d6a8dcdc23c8" exitCode=0 Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.942848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" event={"ID":"316f5040-d923-4ef3-9fbf-f52bf080d8dc","Type":"ContainerDied","Data":"898572d4bfe70ff7e5c0bc5e173412aee6bb392651d855f94362d6a8dcdc23c8"} Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.943561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:02 crc kubenswrapper[5030]: I0120 23:06:02.943581 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.205549 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:04 crc 
kubenswrapper[5030]: I0120 23:06:04.205898 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.272320 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.277526 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.440869 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.600410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts\") pod \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.600508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5q2n\" (UniqueName: \"kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n\") pod \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.600679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle\") pod \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.600877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data\") pod \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\" (UID: \"316f5040-d923-4ef3-9fbf-f52bf080d8dc\") " Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.614808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n" (OuterVolumeSpecName: "kube-api-access-t5q2n") pod "316f5040-d923-4ef3-9fbf-f52bf080d8dc" (UID: "316f5040-d923-4ef3-9fbf-f52bf080d8dc"). InnerVolumeSpecName "kube-api-access-t5q2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.621988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts" (OuterVolumeSpecName: "scripts") pod "316f5040-d923-4ef3-9fbf-f52bf080d8dc" (UID: "316f5040-d923-4ef3-9fbf-f52bf080d8dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.630879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "316f5040-d923-4ef3-9fbf-f52bf080d8dc" (UID: "316f5040-d923-4ef3-9fbf-f52bf080d8dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.644705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data" (OuterVolumeSpecName: "config-data") pod "316f5040-d923-4ef3-9fbf-f52bf080d8dc" (UID: "316f5040-d923-4ef3-9fbf-f52bf080d8dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.703368 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.703392 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.703402 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5q2n\" (UniqueName: \"kubernetes.io/projected/316f5040-d923-4ef3-9fbf-f52bf080d8dc-kube-api-access-t5q2n\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.703413 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f5040-d923-4ef3-9fbf-f52bf080d8dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.962008 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.962091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n" event={"ID":"316f5040-d923-4ef3-9fbf-f52bf080d8dc","Type":"ContainerDied","Data":"fd18b38560edfb3aba6c738cbccb5d14518a35d2513aaa6d81964ca7af420745"} Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.962428 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd18b38560edfb3aba6c738cbccb5d14518a35d2513aaa6d81964ca7af420745" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.962478 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:04 crc kubenswrapper[5030]: I0120 23:06:04.962503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.036474 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.036587 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.044002 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:06:05 crc kubenswrapper[5030]: E0120 23:06:05.044509 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316f5040-d923-4ef3-9fbf-f52bf080d8dc" containerName="nova-cell0-conductor-db-sync" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.044531 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="316f5040-d923-4ef3-9fbf-f52bf080d8dc" containerName="nova-cell0-conductor-db-sync" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.044789 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="316f5040-d923-4ef3-9fbf-f52bf080d8dc" containerName="nova-cell0-conductor-db-sync" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.045486 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.047685 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.049701 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-d552q" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.050025 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.068721 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.213145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.213201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.213244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrvh\" (UniqueName: \"kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.239292 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.239513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-central-agent" containerID="cri-o://ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" gracePeriod=30 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.239643 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-notification-agent" containerID="cri-o://4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" gracePeriod=30 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.239641 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" 
podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="sg-core" containerID="cri-o://5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" gracePeriod=30 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.239777 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="proxy-httpd" containerID="cri-o://63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" gracePeriod=30 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.314756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.314815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.314858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrvh\" (UniqueName: \"kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.319535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.321747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.332380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrvh\" (UniqueName: \"kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh\") pod \"nova-cell0-conductor-0\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.363972 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.800323 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.931937 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.978903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"480916ab-abdd-474b-a001-12e99c3e1e42","Type":"ContainerStarted","Data":"8b9328262bb10ec471943b09c62e8e73ada907f52ca6af9ad3801d543824add6"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982017 5030 generic.go:334] "Generic (PLEG): container finished" podID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" exitCode=0 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982054 5030 generic.go:334] "Generic (PLEG): container finished" podID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" exitCode=2 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982069 5030 generic.go:334] "Generic (PLEG): container finished" podID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" exitCode=0 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982078 5030 generic.go:334] "Generic (PLEG): container finished" podID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" exitCode=0 Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerDied","Data":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerDied","Data":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerDied","Data":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerDied","Data":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"db54c2bb-728c-4701-9b57-cd322a642cb4","Type":"ContainerDied","Data":"225b4c5fcd389367c1779de16751be5b936e5c359f04e2b11f1f03a183a4fe82"} Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982395 5030 scope.go:117] "RemoveContainer" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:05 crc kubenswrapper[5030]: I0120 23:06:05.982930 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.005560 5030 scope.go:117] "RemoveContainer" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.023646 5030 scope.go:117] "RemoveContainer" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj9dn\" (UniqueName: \"kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028617 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.028676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd\") pod \"db54c2bb-728c-4701-9b57-cd322a642cb4\" (UID: \"db54c2bb-728c-4701-9b57-cd322a642cb4\") " Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.030434 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.031369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.034233 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn" (OuterVolumeSpecName: "kube-api-access-jj9dn") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "kube-api-access-jj9dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.034659 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts" (OuterVolumeSpecName: "scripts") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.046829 5030 scope.go:117] "RemoveContainer" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.054799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.069476 5030 scope.go:117] "RemoveContainer" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.070134 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": container with ID starting with 63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235 not found: ID does not exist" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.070270 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} err="failed to get container status \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": rpc error: code = NotFound desc = could not find container \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": container with ID starting with 63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.070419 5030 scope.go:117] "RemoveContainer" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.070900 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": container with ID starting with 5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083 not found: ID does not exist" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.070954 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} err="failed to get container status \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": rpc error: code = NotFound desc = could not find container \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": container with ID starting with 5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.070977 5030 scope.go:117] "RemoveContainer" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.071358 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": container with ID starting with 4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0 not found: ID does not exist" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.071380 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} err="failed to get container status \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": rpc error: code = NotFound desc = could not 
find container \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": container with ID starting with 4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.071394 5030 scope.go:117] "RemoveContainer" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.071994 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": container with ID starting with ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0 not found: ID does not exist" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.072338 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} err="failed to get container status \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": rpc error: code = NotFound desc = could not find container \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": container with ID starting with ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.072367 5030 scope.go:117] "RemoveContainer" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.072845 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} err="failed to get container status \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": rpc error: code = NotFound desc = could not find container \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": container with ID starting with 63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.072907 5030 scope.go:117] "RemoveContainer" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.073239 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} err="failed to get container status \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": rpc error: code = NotFound desc = could not find container \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": container with ID starting with 5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.073261 5030 scope.go:117] "RemoveContainer" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.073838 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} err="failed to get container status \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": rpc error: code = NotFound desc = could not 
find container \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": container with ID starting with 4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.073866 5030 scope.go:117] "RemoveContainer" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.074521 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} err="failed to get container status \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": rpc error: code = NotFound desc = could not find container \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": container with ID starting with ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.074570 5030 scope.go:117] "RemoveContainer" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.074905 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} err="failed to get container status \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": rpc error: code = NotFound desc = could not find container \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": container with ID starting with 63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.074925 5030 scope.go:117] "RemoveContainer" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075093 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} err="failed to get container status \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": rpc error: code = NotFound desc = could not find container \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": container with ID starting with 5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075109 5030 scope.go:117] "RemoveContainer" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075277 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} err="failed to get container status \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": rpc error: code = NotFound desc = could not find container \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": container with ID starting with 4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075297 5030 scope.go:117] "RemoveContainer" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075606 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} err="failed to get container status \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": rpc error: code = NotFound desc = could not find container \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": container with ID starting with ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075646 5030 scope.go:117] "RemoveContainer" containerID="63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075867 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235"} err="failed to get container status \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": rpc error: code = NotFound desc = could not find container \"63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235\": container with ID starting with 63af16094a9a1c928d0ac82ec8bd918642c6ea20226abb3c7ee01c8193703235 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.075889 5030 scope.go:117] "RemoveContainer" containerID="5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.076092 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083"} err="failed to get container status \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": rpc error: code = NotFound desc = could not find container \"5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083\": container with ID starting with 5af453f071eb36402f4f9cc85bcf7faae2e53032fa20d20ae07ddea63398b083 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.076111 5030 scope.go:117] "RemoveContainer" containerID="4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.076460 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0"} err="failed to get container status \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": rpc error: code = NotFound desc = could not find container \"4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0\": container with ID starting with 4635c849d1074e904ec618422f8c704247a953832e41f39178077e4e74ac0fb0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.076611 5030 scope.go:117] "RemoveContainer" containerID="ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.077018 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0"} err="failed to get container status \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": rpc error: code = NotFound desc = could not find container \"ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0\": container with ID starting with 
ce8ce94e93fbfcfcc203ea370b87ad93ab5ba7cacb976f9b32624e04669d86f0 not found: ID does not exist" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.097944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.112633 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data" (OuterVolumeSpecName: "config-data") pod "db54c2bb-728c-4701-9b57-cd322a642cb4" (UID: "db54c2bb-728c-4701-9b57-cd322a642cb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130606 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130658 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj9dn\" (UniqueName: \"kubernetes.io/projected/db54c2bb-728c-4701-9b57-cd322a642cb4-kube-api-access-jj9dn\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130675 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130684 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130693 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db54c2bb-728c-4701-9b57-cd322a642cb4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130702 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.130711 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db54c2bb-728c-4701-9b57-cd322a642cb4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.356279 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.389638 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.411713 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.412255 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-central-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: 
I0120 23:06:06.412282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-central-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.412304 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-notification-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-notification-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.412335 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="sg-core" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412343 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="sg-core" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.412368 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="proxy-httpd" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412375 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="proxy-httpd" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="sg-core" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412571 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-central-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="ceilometer-notification-agent" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.412616 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" containerName="proxy-httpd" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.414583 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.418407 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.418709 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.424987 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.444333 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tqg\" (UniqueName: \"kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.546746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.546883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.546947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.547005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tqg\" (UniqueName: \"kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.547330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.547704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.547762 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.547780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.549578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.554083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.554833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.556591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.560384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.585565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tqg\" (UniqueName: \"kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg\") pod \"ceilometer-0\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.726708 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.744887 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.754901 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.962235 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:06:06 crc kubenswrapper[5030]: E0120 23:06:06.962792 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.993398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"480916ab-abdd-474b-a001-12e99c3e1e42","Type":"ContainerStarted","Data":"0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd"} Jan 20 23:06:06 crc kubenswrapper[5030]: I0120 23:06:06.993880 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:07 crc kubenswrapper[5030]: I0120 23:06:07.023966 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.023946375 podStartE2EDuration="2.023946375s" podCreationTimestamp="2026-01-20 23:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:07.021475524 +0000 UTC m=+1839.341735812" watchObservedRunningTime="2026-01-20 23:06:07.023946375 +0000 UTC m=+1839.344206663" Jan 20 23:06:07 crc kubenswrapper[5030]: I0120 23:06:07.229899 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:07 crc kubenswrapper[5030]: I0120 23:06:07.986489 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db54c2bb-728c-4701-9b57-cd322a642cb4" path="/var/lib/kubelet/pods/db54c2bb-728c-4701-9b57-cd322a642cb4/volumes" Jan 20 23:06:08 crc kubenswrapper[5030]: I0120 23:06:08.018094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerStarted","Data":"3e5f4f08565d2fabcb922ce081027f5b288dc07697610d03330b564ce0b972fb"} Jan 20 23:06:09 crc kubenswrapper[5030]: I0120 23:06:09.026393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerStarted","Data":"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1"} Jan 20 23:06:09 crc kubenswrapper[5030]: I0120 23:06:09.026715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerStarted","Data":"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66"} Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.039499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerStarted","Data":"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3"} Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.404009 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.908558 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24"] Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.910097 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.912301 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.912583 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:06:10 crc kubenswrapper[5030]: I0120 23:06:10.928644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.024916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.025340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.025423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.025708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8md\" (UniqueName: \"kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.029542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.030950 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.034636 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.040942 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.096418 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.097396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerStarted","Data":"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84"} Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.097426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.097485 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.100309 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.120767 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129280 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cl7\" (UniqueName: \"kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8md\" (UniqueName: \"kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129634 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.129712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm2bn\" (UniqueName: \"kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.132156 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.136438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.141241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.141294 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.143039 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.148080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.160288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8md\" (UniqueName: \"kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md\") pod \"nova-cell0-cell-mapping-q2p24\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.160525 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.161681 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.183044 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.950291321 podStartE2EDuration="5.183024684s" podCreationTimestamp="2026-01-20 23:06:06 +0000 UTC" firstStartedPulling="2026-01-20 23:06:07.248329001 +0000 UTC m=+1839.568589289" lastFinishedPulling="2026-01-20 23:06:10.481062334 +0000 UTC m=+1842.801322652" observedRunningTime="2026-01-20 23:06:11.136933441 +0000 UTC m=+1843.457193729" watchObservedRunningTime="2026-01-20 23:06:11.183024684 +0000 UTC m=+1843.503284972" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.234054 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cl7\" (UniqueName: \"kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhvlt\" (UniqueName: \"kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238918 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm2bn\" (UniqueName: \"kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.238985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.239013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.239367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.242178 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.242539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.243605 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.244186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.245933 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.246251 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.248030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.260171 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cl7\" (UniqueName: \"kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7\") pod \"nova-api-0\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.281774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm2bn\" (UniqueName: \"kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn\") pod \"nova-cell1-novncproxy-0\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.294162 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.340797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.340861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.340884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.340929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dghxp\" (UniqueName: \"kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.340991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.341013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.341055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhvlt\" (UniqueName: \"kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.345810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.349100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 
crc kubenswrapper[5030]: I0120 23:06:11.354206 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.360740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhvlt\" (UniqueName: \"kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt\") pod \"nova-scheduler-0\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.423632 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.443233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.443287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.443353 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dghxp\" (UniqueName: \"kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.443432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.444452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.452295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.452927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.461437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dghxp\" (UniqueName: 
\"kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp\") pod \"nova-metadata-0\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.529207 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.574920 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.741512 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.939129 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:11 crc kubenswrapper[5030]: I0120 23:06:11.998219 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5"] Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:11.999253 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.005184 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.009686 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.012234 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5"] Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.047530 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.059508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.059717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.059741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.059763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkjk\" (UniqueName: 
\"kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.113850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerStarted","Data":"45c8e1336b154790738d5eda8cd0a5777bd09e33750ece1f1fd29d236362259c"} Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.115664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6333656f-5635-45be-ae40-d6b3a88d5dd2","Type":"ContainerStarted","Data":"6c80b59be69852446bd48f69bc963b6f7a846d8567be9056502f556258fcb7ad"} Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.117942 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.118313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" event={"ID":"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7","Type":"ContainerStarted","Data":"23ae2d82921cf0555f438e643bf03e657a5b1d48089e782a3020dd8f67bf0058"} Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.118337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" event={"ID":"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7","Type":"ContainerStarted","Data":"1107b8f981ecb7bffbf07bf47bc18620173ff30e9d10f5d0e025d7d56ff203ce"} Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.143845 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" podStartSLOduration=2.14382751 podStartE2EDuration="2.14382751s" podCreationTimestamp="2026-01-20 23:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:12.129495031 +0000 UTC m=+1844.449755319" watchObservedRunningTime="2026-01-20 23:06:12.14382751 +0000 UTC m=+1844.464087798" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.161878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.161926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.161953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkjk\" (UniqueName: \"kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 
23:06:12.162007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.170252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.170503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.172565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.187016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkjk\" (UniqueName: \"kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk\") pod \"nova-cell1-conductor-db-sync-hglg5\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.213338 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.326720 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:12 crc kubenswrapper[5030]: I0120 23:06:12.886383 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5"] Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.128441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerStarted","Data":"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.128484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerStarted","Data":"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.130947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerStarted","Data":"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.130993 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerStarted","Data":"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.131003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerStarted","Data":"6c7fa41875f922e791a25c05714600a2b3547cfeb592dbce0bef84afef6f71de"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.132729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6333656f-5635-45be-ae40-d6b3a88d5dd2","Type":"ContainerStarted","Data":"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.134419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5","Type":"ContainerStarted","Data":"99a71f0deee15d0a9699a8ffa07c2e18238d26e0c5c3bddbccea94b51ac564a5"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.134466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5","Type":"ContainerStarted","Data":"78eef86f14b235bdf72f51feecef935db186bd06481e9be8b748c5cddf83dd7e"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.139778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" event={"ID":"646a1e7c-67a8-4acc-8810-2ba983fb655e","Type":"ContainerStarted","Data":"b6f79fd59ed7ecafe0fd25c1903c1b870250f9d5ab3c3f88ec28f9fb73996a13"} Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.154892 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.15487545 podStartE2EDuration="2.15487545s" podCreationTimestamp="2026-01-20 23:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 
23:06:13.151013347 +0000 UTC m=+1845.471273635" watchObservedRunningTime="2026-01-20 23:06:13.15487545 +0000 UTC m=+1845.475135738" Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.172808 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.172787826 podStartE2EDuration="2.172787826s" podCreationTimestamp="2026-01-20 23:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:13.169916437 +0000 UTC m=+1845.490176725" watchObservedRunningTime="2026-01-20 23:06:13.172787826 +0000 UTC m=+1845.493048114" Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.208127 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.208103787 podStartE2EDuration="2.208103787s" podCreationTimestamp="2026-01-20 23:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:13.184276806 +0000 UTC m=+1845.504537094" watchObservedRunningTime="2026-01-20 23:06:13.208103787 +0000 UTC m=+1845.528364085" Jan 20 23:06:13 crc kubenswrapper[5030]: I0120 23:06:13.211504 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.21149593 podStartE2EDuration="2.21149593s" podCreationTimestamp="2026-01-20 23:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:13.201896386 +0000 UTC m=+1845.522156674" watchObservedRunningTime="2026-01-20 23:06:13.21149593 +0000 UTC m=+1845.531756218" Jan 20 23:06:14 crc kubenswrapper[5030]: I0120 23:06:14.060251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:14 crc kubenswrapper[5030]: I0120 23:06:14.086606 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:14 crc kubenswrapper[5030]: I0120 23:06:14.148825 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" event={"ID":"646a1e7c-67a8-4acc-8810-2ba983fb655e","Type":"ContainerStarted","Data":"b7540dd8f63045c75c56efba1cd97bca76da39abdb06d7e436a00757cad793ab"} Jan 20 23:06:14 crc kubenswrapper[5030]: I0120 23:06:14.164712 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" podStartSLOduration=3.164695341 podStartE2EDuration="3.164695341s" podCreationTimestamp="2026-01-20 23:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:14.162267662 +0000 UTC m=+1846.482527950" watchObservedRunningTime="2026-01-20 23:06:14.164695341 +0000 UTC m=+1846.484955629" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.157405 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6333656f-5635-45be-ae40-d6b3a88d5dd2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36" gracePeriod=30 Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.157566 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-log" containerID="cri-o://f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" gracePeriod=30 Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.157661 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-metadata" containerID="cri-o://cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" gracePeriod=30 Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.731153 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.784954 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.833714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm2bn\" (UniqueName: \"kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn\") pod \"6333656f-5635-45be-ae40-d6b3a88d5dd2\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.833753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dghxp\" (UniqueName: \"kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp\") pod \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.833825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs\") pod \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.833854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle\") pod \"6333656f-5635-45be-ae40-d6b3a88d5dd2\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.833870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data\") pod \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.834326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle\") pod \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\" (UID: \"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.834401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs" (OuterVolumeSpecName: "logs") pod "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" (UID: "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.834476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data\") pod \"6333656f-5635-45be-ae40-d6b3a88d5dd2\" (UID: \"6333656f-5635-45be-ae40-d6b3a88d5dd2\") " Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.835189 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.841142 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn" (OuterVolumeSpecName: "kube-api-access-jm2bn") pod "6333656f-5635-45be-ae40-d6b3a88d5dd2" (UID: "6333656f-5635-45be-ae40-d6b3a88d5dd2"). InnerVolumeSpecName "kube-api-access-jm2bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.841545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp" (OuterVolumeSpecName: "kube-api-access-dghxp") pod "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" (UID: "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d"). InnerVolumeSpecName "kube-api-access-dghxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.861771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data" (OuterVolumeSpecName: "config-data") pod "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" (UID: "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.869528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data" (OuterVolumeSpecName: "config-data") pod "6333656f-5635-45be-ae40-d6b3a88d5dd2" (UID: "6333656f-5635-45be-ae40-d6b3a88d5dd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.870659 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6333656f-5635-45be-ae40-d6b3a88d5dd2" (UID: "6333656f-5635-45be-ae40-d6b3a88d5dd2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.871576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" (UID: "f9c99ef4-87f8-4778-ba74-7d941bc3dc7d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936432 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936464 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936475 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm2bn\" (UniqueName: \"kubernetes.io/projected/6333656f-5635-45be-ae40-d6b3a88d5dd2-kube-api-access-jm2bn\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936485 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dghxp\" (UniqueName: \"kubernetes.io/projected/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-kube-api-access-dghxp\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936499 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6333656f-5635-45be-ae40-d6b3a88d5dd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:15 crc kubenswrapper[5030]: I0120 23:06:15.936508 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168409 5030 generic.go:334] "Generic (PLEG): container finished" podID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerID="cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" exitCode=0 Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168441 5030 generic.go:334] "Generic (PLEG): container finished" podID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerID="f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" exitCode=143 Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168474 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerDied","Data":"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a"} Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerDied","Data":"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e"} Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f9c99ef4-87f8-4778-ba74-7d941bc3dc7d","Type":"ContainerDied","Data":"6c7fa41875f922e791a25c05714600a2b3547cfeb592dbce0bef84afef6f71de"} Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.168963 5030 scope.go:117] "RemoveContainer" containerID="cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.174075 5030 generic.go:334] "Generic (PLEG): container finished" podID="6333656f-5635-45be-ae40-d6b3a88d5dd2" containerID="7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36" exitCode=0 Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.174198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6333656f-5635-45be-ae40-d6b3a88d5dd2","Type":"ContainerDied","Data":"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36"} Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.176026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6333656f-5635-45be-ae40-d6b3a88d5dd2","Type":"ContainerDied","Data":"6c80b59be69852446bd48f69bc963b6f7a846d8567be9056502f556258fcb7ad"} Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.174413 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.199329 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.209338 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.222461 5030 scope.go:117] "RemoveContainer" containerID="f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.251521 5030 scope.go:117] "RemoveContainer" containerID="cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.251666 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.258022 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-log" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258074 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-log" Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.258107 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6333656f-5635-45be-ae40-d6b3a88d5dd2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6333656f-5635-45be-ae40-d6b3a88d5dd2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.258185 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-metadata" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258249 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-metadata" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258553 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6333656f-5635-45be-ae40-d6b3a88d5dd2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258581 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-log" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.258595 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" containerName="nova-metadata-metadata" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.259710 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.260469 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a\": container with ID starting with cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a not found: ID does not exist" containerID="cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.260524 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a"} err="failed to get container status \"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a\": rpc error: code = NotFound desc = could not find container \"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a\": container with ID starting with cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a not found: ID does not exist" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.260557 5030 scope.go:117] "RemoveContainer" containerID="f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.262486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.262759 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.262979 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e\": container with ID starting with f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e not found: ID does not exist" containerID="f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.263017 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e"} err="failed to get container status \"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e\": rpc error: code = NotFound desc = could not find container \"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e\": container with ID starting with f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e not found: ID does not exist" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.263041 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.263046 5030 scope.go:117] "RemoveContainer" containerID="cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.264238 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a"} err="failed to get container status \"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a\": rpc error: code = NotFound desc = could not find container \"cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a\": 
container with ID starting with cb13b4a01854da78235055335c2b915e6bf3959d148849dc693a2290f41ff33a not found: ID does not exist" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.264268 5030 scope.go:117] "RemoveContainer" containerID="f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.264488 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e"} err="failed to get container status \"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e\": rpc error: code = NotFound desc = could not find container \"f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e\": container with ID starting with f498875f4214d93718761874afbacbeed48cd5a754806a7380b0812d1de1d20e not found: ID does not exist" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.264534 5030 scope.go:117] "RemoveContainer" containerID="7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.305039 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.314488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.325749 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.327344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.329713 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.329994 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.330212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.334920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.345718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62x2m\" (UniqueName: \"kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.345782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.345818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346264 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.346548 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kb8\" (UniqueName: \"kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.360250 5030 scope.go:117] "RemoveContainer" containerID="7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36" Jan 20 23:06:16 crc kubenswrapper[5030]: E0120 23:06:16.360763 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36\": container with ID starting with 
7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36 not found: ID does not exist" containerID="7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.360802 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36"} err="failed to get container status \"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36\": rpc error: code = NotFound desc = could not find container \"7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36\": container with ID starting with 7d74d76e748ce42248bad643833e42d4f0940d3e32c93fcd3dc5fae403342d36 not found: ID does not exist" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448678 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kb8\" (UniqueName: \"kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.448803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62x2m\" (UniqueName: \"kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.449215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.449361 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.450195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.450333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.451430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.453740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.462766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.469860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.469860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.470541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.471212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.471433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.473029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kb8\" (UniqueName: \"kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8\") pod \"nova-metadata-0\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.473169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62x2m\" (UniqueName: \"kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.530368 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.656470 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:16 crc kubenswrapper[5030]: I0120 23:06:16.667889 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:17 crc kubenswrapper[5030]: I0120 23:06:17.140890 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:06:17 crc kubenswrapper[5030]: I0120 23:06:17.194320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"49b5232f-22f5-443f-8b8d-bf8f8b5889e4","Type":"ContainerStarted","Data":"edf244324b0d100ade396554331be9e7b568a487f3b6827277b6cd8e62cd010a"} Jan 20 23:06:17 crc kubenswrapper[5030]: I0120 23:06:17.226464 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:17 crc kubenswrapper[5030]: W0120 23:06:17.230906 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb97c2ab_db93_44cc_b05f_439a08be695a.slice/crio-73c6505bd805e05f200143ced9b4b1a40292bba4f34a42e37479ec580f757029 WatchSource:0}: Error finding container 73c6505bd805e05f200143ced9b4b1a40292bba4f34a42e37479ec580f757029: Status 404 returned error can't find the container with id 73c6505bd805e05f200143ced9b4b1a40292bba4f34a42e37479ec580f757029 Jan 20 23:06:17 crc kubenswrapper[5030]: I0120 23:06:17.978567 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6333656f-5635-45be-ae40-d6b3a88d5dd2" path="/var/lib/kubelet/pods/6333656f-5635-45be-ae40-d6b3a88d5dd2/volumes" Jan 20 23:06:17 crc kubenswrapper[5030]: I0120 23:06:17.979709 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c99ef4-87f8-4778-ba74-7d941bc3dc7d" path="/var/lib/kubelet/pods/f9c99ef4-87f8-4778-ba74-7d941bc3dc7d/volumes" Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.204681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerStarted","Data":"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723"} Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.204739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerStarted","Data":"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49"} Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.204753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerStarted","Data":"73c6505bd805e05f200143ced9b4b1a40292bba4f34a42e37479ec580f757029"} Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.207681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"49b5232f-22f5-443f-8b8d-bf8f8b5889e4","Type":"ContainerStarted","Data":"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2"} Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.232990 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.232966488 podStartE2EDuration="2.232966488s" podCreationTimestamp="2026-01-20 23:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:18.222460162 +0000 UTC m=+1850.542720450" 
watchObservedRunningTime="2026-01-20 23:06:18.232966488 +0000 UTC m=+1850.553226776" Jan 20 23:06:18 crc kubenswrapper[5030]: I0120 23:06:18.246827 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.246809265 podStartE2EDuration="2.246809265s" podCreationTimestamp="2026-01-20 23:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:18.24204689 +0000 UTC m=+1850.562307178" watchObservedRunningTime="2026-01-20 23:06:18.246809265 +0000 UTC m=+1850.567069553" Jan 20 23:06:19 crc kubenswrapper[5030]: I0120 23:06:19.222113 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" containerID="23ae2d82921cf0555f438e643bf03e657a5b1d48089e782a3020dd8f67bf0058" exitCode=0 Jan 20 23:06:19 crc kubenswrapper[5030]: I0120 23:06:19.222255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" event={"ID":"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7","Type":"ContainerDied","Data":"23ae2d82921cf0555f438e643bf03e657a5b1d48089e782a3020dd8f67bf0058"} Jan 20 23:06:19 crc kubenswrapper[5030]: I0120 23:06:19.226558 5030 generic.go:334] "Generic (PLEG): container finished" podID="646a1e7c-67a8-4acc-8810-2ba983fb655e" containerID="b7540dd8f63045c75c56efba1cd97bca76da39abdb06d7e436a00757cad793ab" exitCode=0 Jan 20 23:06:19 crc kubenswrapper[5030]: I0120 23:06:19.226674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" event={"ID":"646a1e7c-67a8-4acc-8810-2ba983fb655e","Type":"ContainerDied","Data":"b7540dd8f63045c75c56efba1cd97bca76da39abdb06d7e436a00757cad793ab"} Jan 20 23:06:19 crc kubenswrapper[5030]: I0120 23:06:19.962574 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:06:19 crc kubenswrapper[5030]: E0120 23:06:19.963035 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.792843 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.800007 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.855902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8md\" (UniqueName: \"kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md\") pod \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.856176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts\") pod \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.856320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle\") pod \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.856410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data\") pod \"646a1e7c-67a8-4acc-8810-2ba983fb655e\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.856513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle\") pod \"646a1e7c-67a8-4acc-8810-2ba983fb655e\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.857098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkjk\" (UniqueName: \"kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk\") pod \"646a1e7c-67a8-4acc-8810-2ba983fb655e\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.857264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts\") pod \"646a1e7c-67a8-4acc-8810-2ba983fb655e\" (UID: \"646a1e7c-67a8-4acc-8810-2ba983fb655e\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.857376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data\") pod \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\" (UID: \"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7\") " Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.862597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts" (OuterVolumeSpecName: "scripts") pod "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" (UID: "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.863270 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md" (OuterVolumeSpecName: "kube-api-access-4x8md") pod "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" (UID: "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7"). InnerVolumeSpecName "kube-api-access-4x8md". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.863962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk" (OuterVolumeSpecName: "kube-api-access-vqkjk") pod "646a1e7c-67a8-4acc-8810-2ba983fb655e" (UID: "646a1e7c-67a8-4acc-8810-2ba983fb655e"). InnerVolumeSpecName "kube-api-access-vqkjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.874641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts" (OuterVolumeSpecName: "scripts") pod "646a1e7c-67a8-4acc-8810-2ba983fb655e" (UID: "646a1e7c-67a8-4acc-8810-2ba983fb655e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.889992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data" (OuterVolumeSpecName: "config-data") pod "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" (UID: "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.892981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "646a1e7c-67a8-4acc-8810-2ba983fb655e" (UID: "646a1e7c-67a8-4acc-8810-2ba983fb655e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.896890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" (UID: "c1e5244a-19c2-4e6f-bced-f44b38a0d3b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.899846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data" (OuterVolumeSpecName: "config-data") pod "646a1e7c-67a8-4acc-8810-2ba983fb655e" (UID: "646a1e7c-67a8-4acc-8810-2ba983fb655e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959138 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959319 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqkjk\" (UniqueName: \"kubernetes.io/projected/646a1e7c-67a8-4acc-8810-2ba983fb655e-kube-api-access-vqkjk\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959407 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959485 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959601 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8md\" (UniqueName: \"kubernetes.io/projected/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-kube-api-access-4x8md\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959731 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959817 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:20 crc kubenswrapper[5030]: I0120 23:06:20.959896 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/646a1e7c-67a8-4acc-8810-2ba983fb655e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.269139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" event={"ID":"646a1e7c-67a8-4acc-8810-2ba983fb655e","Type":"ContainerDied","Data":"b6f79fd59ed7ecafe0fd25c1903c1b870250f9d5ab3c3f88ec28f9fb73996a13"} Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.269528 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6f79fd59ed7ecafe0fd25c1903c1b870250f9d5ab3c3f88ec28f9fb73996a13" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.269180 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.272171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" event={"ID":"c1e5244a-19c2-4e6f-bced-f44b38a0d3b7","Type":"ContainerDied","Data":"1107b8f981ecb7bffbf07bf47bc18620173ff30e9d10f5d0e025d7d56ff203ce"} Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.272216 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1107b8f981ecb7bffbf07bf47bc18620173ff30e9d10f5d0e025d7d56ff203ce" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.272323 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.340409 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:06:21 crc kubenswrapper[5030]: E0120 23:06:21.340935 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="646a1e7c-67a8-4acc-8810-2ba983fb655e" containerName="nova-cell1-conductor-db-sync" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.340968 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="646a1e7c-67a8-4acc-8810-2ba983fb655e" containerName="nova-cell1-conductor-db-sync" Jan 20 23:06:21 crc kubenswrapper[5030]: E0120 23:06:21.340996 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" containerName="nova-manage" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.341005 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" containerName="nova-manage" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.341195 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="646a1e7c-67a8-4acc-8810-2ba983fb655e" containerName="nova-cell1-conductor-db-sync" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.341237 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" containerName="nova-manage" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.342125 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.344500 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.354040 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.355568 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.355600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.473031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.473068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.473106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fzh\" (UniqueName: \"kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.480719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.497916 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.498138 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" containerName="nova-scheduler-scheduler" containerID="cri-o://99a71f0deee15d0a9699a8ffa07c2e18238d26e0c5c3bddbccea94b51ac564a5" gracePeriod=30 Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.525294 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.525565 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-log" containerID="cri-o://35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" gracePeriod=30 Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.525586 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-metadata" 
containerID="cri-o://f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" gracePeriod=30 Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.574488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.574542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.574581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fzh\" (UniqueName: \"kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.578549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.578922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.591135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fzh\" (UniqueName: \"kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh\") pod \"nova-cell1-conductor-0\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.658589 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.658658 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.662175 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:21 crc kubenswrapper[5030]: I0120 23:06:21.669060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.135712 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.140828 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: W0120 23:06:22.140923 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda666baa5_8c6c_40e4_8c4d_7ab347b29e04.slice/crio-ec014873a1bd14099b9d990a92fcb7b5aa7c462964ac3d40fca07374b556f60c WatchSource:0}: Error finding container ec014873a1bd14099b9d990a92fcb7b5aa7c462964ac3d40fca07374b556f60c: Status 404 returned error can't find the container with id ec014873a1bd14099b9d990a92fcb7b5aa7c462964ac3d40fca07374b556f60c Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs\") pod \"eb97c2ab-db93-44cc-b05f-439a08be695a\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs\") pod \"eb97c2ab-db93-44cc-b05f-439a08be695a\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle\") pod \"eb97c2ab-db93-44cc-b05f-439a08be695a\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data\") pod \"eb97c2ab-db93-44cc-b05f-439a08be695a\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs" (OuterVolumeSpecName: "logs") pod "eb97c2ab-db93-44cc-b05f-439a08be695a" (UID: "eb97c2ab-db93-44cc-b05f-439a08be695a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.184893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2kb8\" (UniqueName: \"kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8\") pod \"eb97c2ab-db93-44cc-b05f-439a08be695a\" (UID: \"eb97c2ab-db93-44cc-b05f-439a08be695a\") " Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.185443 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c2ab-db93-44cc-b05f-439a08be695a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.194802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8" (OuterVolumeSpecName: "kube-api-access-w2kb8") pod "eb97c2ab-db93-44cc-b05f-439a08be695a" (UID: "eb97c2ab-db93-44cc-b05f-439a08be695a"). InnerVolumeSpecName "kube-api-access-w2kb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.215168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data" (OuterVolumeSpecName: "config-data") pod "eb97c2ab-db93-44cc-b05f-439a08be695a" (UID: "eb97c2ab-db93-44cc-b05f-439a08be695a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.223240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb97c2ab-db93-44cc-b05f-439a08be695a" (UID: "eb97c2ab-db93-44cc-b05f-439a08be695a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.249405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eb97c2ab-db93-44cc-b05f-439a08be695a" (UID: "eb97c2ab-db93-44cc-b05f-439a08be695a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284290 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerID="f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" exitCode=0 Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284332 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerID="35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" exitCode=143 Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerDied","Data":"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723"} Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerDied","Data":"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49"} Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"eb97c2ab-db93-44cc-b05f-439a08be695a","Type":"ContainerDied","Data":"73c6505bd805e05f200143ced9b4b1a40292bba4f34a42e37479ec580f757029"} Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284439 5030 scope.go:117] "RemoveContainer" containerID="f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.284581 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.290666 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2kb8\" (UniqueName: \"kubernetes.io/projected/eb97c2ab-db93-44cc-b05f-439a08be695a-kube-api-access-w2kb8\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.290690 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.290701 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.290730 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c2ab-db93-44cc-b05f-439a08be695a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.291554 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-log" containerID="cri-o://9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e" gracePeriod=30 Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.291916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"a666baa5-8c6c-40e4-8c4d-7ab347b29e04","Type":"ContainerStarted","Data":"ec014873a1bd14099b9d990a92fcb7b5aa7c462964ac3d40fca07374b556f60c"} Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.291982 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-api" containerID="cri-o://7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded" gracePeriod=30 Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.298327 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.119:8774/\": EOF" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.299523 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.119:8774/\": EOF" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.327376 5030 scope.go:117] "RemoveContainer" containerID="35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.337796 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.357781 5030 scope.go:117] "RemoveContainer" containerID="f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" Jan 20 23:06:22 crc kubenswrapper[5030]: E0120 23:06:22.358256 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723\": container with ID starting with f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723 not found: ID does not exist" containerID="f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358324 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723"} err="failed to get container status \"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723\": rpc error: code = NotFound desc = could not find container \"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723\": container with ID starting with f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723 not found: ID does not exist" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358351 5030 scope.go:117] "RemoveContainer" containerID="35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" Jan 20 23:06:22 crc kubenswrapper[5030]: E0120 23:06:22.358654 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49\": container with ID starting with 35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49 not found: ID does not exist" containerID="35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358705 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49"} err="failed to get container status \"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49\": rpc error: code = NotFound desc = could not find container \"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49\": container with ID starting with 35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49 not found: ID does not exist" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358735 5030 scope.go:117] "RemoveContainer" containerID="f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358945 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723"} err="failed to get container status \"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723\": rpc error: code = NotFound desc = could not find container \"f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723\": container with ID starting with f255179c9687834c04e7787aab60d861cbd634e99ba97206cc69132f17f4d723 not found: ID does not exist" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.358968 5030 scope.go:117] "RemoveContainer" containerID="35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.359169 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49"} err="failed to get container status \"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49\": rpc error: code = NotFound desc = could not find container \"35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49\": container with ID starting with 
35292a28c8d220e6fcc15c7efc98b18fa29d9ee229c74c0f96ab147c11d2ff49 not found: ID does not exist" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.378956 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.399349 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:22 crc kubenswrapper[5030]: E0120 23:06:22.399738 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-log" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.399771 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-log" Jan 20 23:06:22 crc kubenswrapper[5030]: E0120 23:06:22.399799 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-metadata" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.399805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-metadata" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.399987 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-log" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.400006 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" containerName="nova-metadata-metadata" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.401352 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.405145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.405315 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.410905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.495051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv75w\" (UniqueName: \"kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.495102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.495173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc 
kubenswrapper[5030]: I0120 23:06:22.495214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.495276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv75w\" (UniqueName: \"kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.597970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.601614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.601755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.603068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.622979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv75w\" (UniqueName: \"kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w\") pod \"nova-metadata-0\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:22 crc kubenswrapper[5030]: I0120 23:06:22.715499 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.182201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.304880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"a666baa5-8c6c-40e4-8c4d-7ab347b29e04","Type":"ContainerStarted","Data":"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225"} Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.305033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.306776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerStarted","Data":"31dfcbe4f6c17af95678bea3af05fdd10cfac96b22190b7fdb1e9a0be587b09c"} Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.308394 5030 generic.go:334] "Generic (PLEG): container finished" podID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerID="9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e" exitCode=143 Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.308419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerDied","Data":"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e"} Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.329802 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.329782693 podStartE2EDuration="2.329782693s" podCreationTimestamp="2026-01-20 23:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:23.323752995 +0000 UTC m=+1855.644013283" watchObservedRunningTime="2026-01-20 23:06:23.329782693 +0000 UTC m=+1855.650043001" Jan 20 23:06:23 crc kubenswrapper[5030]: I0120 23:06:23.978508 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb97c2ab-db93-44cc-b05f-439a08be695a" path="/var/lib/kubelet/pods/eb97c2ab-db93-44cc-b05f-439a08be695a/volumes" Jan 20 23:06:24 crc 
kubenswrapper[5030]: I0120 23:06:24.323316 5030 generic.go:334] "Generic (PLEG): container finished" podID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" containerID="99a71f0deee15d0a9699a8ffa07c2e18238d26e0c5c3bddbccea94b51ac564a5" exitCode=0 Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.323372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5","Type":"ContainerDied","Data":"99a71f0deee15d0a9699a8ffa07c2e18238d26e0c5c3bddbccea94b51ac564a5"} Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.323765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5","Type":"ContainerDied","Data":"78eef86f14b235bdf72f51feecef935db186bd06481e9be8b748c5cddf83dd7e"} Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.323783 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78eef86f14b235bdf72f51feecef935db186bd06481e9be8b748c5cddf83dd7e" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.328965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerStarted","Data":"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4"} Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.329025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerStarted","Data":"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed"} Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.349886 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.355332 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.355306765 podStartE2EDuration="2.355306765s" podCreationTimestamp="2026-01-20 23:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:24.351841751 +0000 UTC m=+1856.672102049" watchObservedRunningTime="2026-01-20 23:06:24.355306765 +0000 UTC m=+1856.675567083" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.429470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle\") pod \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.429677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data\") pod \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.429798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhvlt\" (UniqueName: \"kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt\") pod \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\" (UID: \"0f7dec90-ff53-49e4-ad99-a24b0e54e5b5\") " Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.435848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt" (OuterVolumeSpecName: "kube-api-access-rhvlt") pod "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" (UID: "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5"). InnerVolumeSpecName "kube-api-access-rhvlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.456147 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data" (OuterVolumeSpecName: "config-data") pod "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" (UID: "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.470197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" (UID: "0f7dec90-ff53-49e4-ad99-a24b0e54e5b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.533404 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.533462 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhvlt\" (UniqueName: \"kubernetes.io/projected/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-kube-api-access-rhvlt\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:24 crc kubenswrapper[5030]: I0120 23:06:24.533481 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.340867 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.388563 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.397327 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.428358 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:25 crc kubenswrapper[5030]: E0120 23:06:25.429216 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" containerName="nova-scheduler-scheduler" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.429353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" containerName="nova-scheduler-scheduler" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.429792 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" containerName="nova-scheduler-scheduler" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.430903 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.433002 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.441925 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.555535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.555644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.555900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvjtf\" (UniqueName: \"kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.659059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.659152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.659283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvjtf\" (UniqueName: \"kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.667642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.667781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.694720 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvjtf\" (UniqueName: \"kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf\") pod \"nova-scheduler-0\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.765064 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:25 crc kubenswrapper[5030]: I0120 23:06:25.981463 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f7dec90-ff53-49e4-ad99-a24b0e54e5b5" path="/var/lib/kubelet/pods/0f7dec90-ff53-49e4-ad99-a24b0e54e5b5/volumes" Jan 20 23:06:26 crc kubenswrapper[5030]: I0120 23:06:26.313706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:26 crc kubenswrapper[5030]: W0120 23:06:26.316328 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd4297f4_36e8_419c_81cd_1fa36be55bc2.slice/crio-cd59ca04ffaa1865f9ae803413ab6517536d7967907837d94af85707a3719e41 WatchSource:0}: Error finding container cd59ca04ffaa1865f9ae803413ab6517536d7967907837d94af85707a3719e41: Status 404 returned error can't find the container with id cd59ca04ffaa1865f9ae803413ab6517536d7967907837d94af85707a3719e41 Jan 20 23:06:26 crc kubenswrapper[5030]: I0120 23:06:26.353600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd4297f4-36e8-419c-81cd-1fa36be55bc2","Type":"ContainerStarted","Data":"cd59ca04ffaa1865f9ae803413ab6517536d7967907837d94af85707a3719e41"} Jan 20 23:06:26 crc kubenswrapper[5030]: I0120 23:06:26.669145 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:26 crc kubenswrapper[5030]: I0120 23:06:26.704336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:27 crc kubenswrapper[5030]: I0120 23:06:27.370501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd4297f4-36e8-419c-81cd-1fa36be55bc2","Type":"ContainerStarted","Data":"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915"} Jan 20 23:06:27 crc kubenswrapper[5030]: I0120 23:06:27.401565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:06:27 crc kubenswrapper[5030]: I0120 23:06:27.407375 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.407357667 podStartE2EDuration="2.407357667s" podCreationTimestamp="2026-01-20 23:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:27.397974098 +0000 UTC m=+1859.718234386" watchObservedRunningTime="2026-01-20 23:06:27.407357667 +0000 UTC m=+1859.727617955" Jan 20 23:06:27 crc kubenswrapper[5030]: I0120 23:06:27.716444 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:27 crc kubenswrapper[5030]: I0120 23:06:27.716977 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.163821 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.211551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle\") pod \"c7166ca5-636c-4498-ad08-14bd9dbfd983\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.211655 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs\") pod \"c7166ca5-636c-4498-ad08-14bd9dbfd983\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.211686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5cl7\" (UniqueName: \"kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7\") pod \"c7166ca5-636c-4498-ad08-14bd9dbfd983\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.211741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data\") pod \"c7166ca5-636c-4498-ad08-14bd9dbfd983\" (UID: \"c7166ca5-636c-4498-ad08-14bd9dbfd983\") " Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.212446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs" (OuterVolumeSpecName: "logs") pod "c7166ca5-636c-4498-ad08-14bd9dbfd983" (UID: "c7166ca5-636c-4498-ad08-14bd9dbfd983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.219186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7" (OuterVolumeSpecName: "kube-api-access-z5cl7") pod "c7166ca5-636c-4498-ad08-14bd9dbfd983" (UID: "c7166ca5-636c-4498-ad08-14bd9dbfd983"). InnerVolumeSpecName "kube-api-access-z5cl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.241077 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data" (OuterVolumeSpecName: "config-data") pod "c7166ca5-636c-4498-ad08-14bd9dbfd983" (UID: "c7166ca5-636c-4498-ad08-14bd9dbfd983"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.245325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7166ca5-636c-4498-ad08-14bd9dbfd983" (UID: "c7166ca5-636c-4498-ad08-14bd9dbfd983"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.313637 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.313675 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7166ca5-636c-4498-ad08-14bd9dbfd983-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.313688 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5cl7\" (UniqueName: \"kubernetes.io/projected/c7166ca5-636c-4498-ad08-14bd9dbfd983-kube-api-access-z5cl7\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.313702 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7166ca5-636c-4498-ad08-14bd9dbfd983-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.381971 5030 generic.go:334] "Generic (PLEG): container finished" podID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerID="7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded" exitCode=0 Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.382934 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.386207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerDied","Data":"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded"} Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.386279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c7166ca5-636c-4498-ad08-14bd9dbfd983","Type":"ContainerDied","Data":"45c8e1336b154790738d5eda8cd0a5777bd09e33750ece1f1fd29d236362259c"} Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.386326 5030 scope.go:117] "RemoveContainer" containerID="7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.429318 5030 scope.go:117] "RemoveContainer" containerID="9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.429500 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.444086 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.476940 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:28 crc kubenswrapper[5030]: E0120 23:06:28.477587 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-api" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.477608 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-api" Jan 20 23:06:28 crc kubenswrapper[5030]: E0120 23:06:28.477637 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-log" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.477644 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-log" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.477967 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-api" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.478031 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" containerName="nova-api-log" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.478106 5030 scope.go:117] "RemoveContainer" containerID="7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded" Jan 20 23:06:28 crc kubenswrapper[5030]: E0120 23:06:28.483948 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded\": container with ID starting with 7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded not found: ID does not exist" containerID="7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.484038 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded"} err="failed to get container status \"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded\": rpc error: code = NotFound desc = could not find container \"7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded\": container with ID starting with 7d86c9bc2c14e418b59e081656b1d1e26b5b5863d296d478be709a9868692ded not found: ID does not exist" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.484077 5030 scope.go:117] "RemoveContainer" containerID="9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.492944 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.499316 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:28 crc kubenswrapper[5030]: E0120 23:06:28.499445 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e\": container with ID starting with 9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e not found: ID does not exist" containerID="9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.499477 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e"} err="failed to get container status \"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e\": rpc error: code = NotFound desc = could not find container \"9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e\": container with ID starting with 9ae6a527e170b07f5e79f9192f378238f23f99ebe8f3e1621446ffd3a2aab26e not found: ID does not exist" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.499868 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.618628 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm4sw\" (UniqueName: \"kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.618676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.618813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.618858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.720856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.720921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.720962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm4sw\" (UniqueName: \"kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.720984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.721707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.727124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.727451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.738334 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm4sw\" (UniqueName: \"kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw\") pod \"nova-api-0\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:28 crc kubenswrapper[5030]: I0120 23:06:28.838546 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:29 crc kubenswrapper[5030]: W0120 23:06:29.312777 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode844ffd7_188b_4333_84f4_474c6f346630.slice/crio-be6f8426d3cfc0a95ab34817fd68baf63c450690d5b6dce074f8ea126ccc0b27 WatchSource:0}: Error finding container be6f8426d3cfc0a95ab34817fd68baf63c450690d5b6dce074f8ea126ccc0b27: Status 404 returned error can't find the container with id be6f8426d3cfc0a95ab34817fd68baf63c450690d5b6dce074f8ea126ccc0b27 Jan 20 23:06:29 crc kubenswrapper[5030]: I0120 23:06:29.315579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:29 crc kubenswrapper[5030]: I0120 23:06:29.397710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerStarted","Data":"be6f8426d3cfc0a95ab34817fd68baf63c450690d5b6dce074f8ea126ccc0b27"} Jan 20 23:06:29 crc kubenswrapper[5030]: I0120 23:06:29.980700 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7166ca5-636c-4498-ad08-14bd9dbfd983" path="/var/lib/kubelet/pods/c7166ca5-636c-4498-ad08-14bd9dbfd983/volumes" Jan 20 23:06:30 crc kubenswrapper[5030]: I0120 23:06:30.418750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerStarted","Data":"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31"} Jan 20 23:06:30 crc kubenswrapper[5030]: I0120 23:06:30.419039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerStarted","Data":"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11"} Jan 20 23:06:30 crc kubenswrapper[5030]: I0120 23:06:30.449856 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.449823694 podStartE2EDuration="2.449823694s" podCreationTimestamp="2026-01-20 23:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:30.444388682 +0000 UTC m=+1862.764649060" watchObservedRunningTime="2026-01-20 23:06:30.449823694 +0000 UTC m=+1862.770084012" Jan 20 23:06:30 crc kubenswrapper[5030]: I0120 23:06:30.765398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:31 crc kubenswrapper[5030]: I0120 23:06:31.712453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.305378 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-84wng"] Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.307904 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.311887 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.312961 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.317730 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-84wng"] Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.397800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.397865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8w4\" (UniqueName: \"kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.397894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.397932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.499466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.499547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8w4\" (UniqueName: \"kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.499584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc 
kubenswrapper[5030]: I0120 23:06:32.499667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.505553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.505561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.506464 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.526084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8w4\" (UniqueName: \"kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4\") pod \"nova-cell1-cell-mapping-84wng\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.637744 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.717014 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:32 crc kubenswrapper[5030]: I0120 23:06:32.717047 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.088742 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-84wng"] Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.460153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" event={"ID":"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30","Type":"ContainerStarted","Data":"ddfad440bf1dbcb119302c876ff1167f8af714dafbb96b9a1371e0c947bbd1ef"} Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.460765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" event={"ID":"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30","Type":"ContainerStarted","Data":"6e5f504ce28f2dd4a44bb3533c93d290d2344000072b8801a69e2c68aa6cb1b1"} Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.487519 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" podStartSLOduration=1.487498805 podStartE2EDuration="1.487498805s" podCreationTimestamp="2026-01-20 23:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:33.487346841 +0000 UTC m=+1865.807607139" watchObservedRunningTime="2026-01-20 23:06:33.487498805 +0000 UTC m=+1865.807759113" Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.737897 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.127:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.737934 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.127:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:33 crc kubenswrapper[5030]: I0120 23:06:33.963015 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:06:33 crc kubenswrapper[5030]: E0120 23:06:33.963446 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:06:35 crc kubenswrapper[5030]: I0120 23:06:35.765726 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:35 crc kubenswrapper[5030]: I0120 
23:06:35.815239 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:36 crc kubenswrapper[5030]: I0120 23:06:36.558977 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:36 crc kubenswrapper[5030]: I0120 23:06:36.595785 5030 scope.go:117] "RemoveContainer" containerID="25d71b9ff6b3918f06b7ac2f245f258bfaef55aef059830afb2f1b222de24f94" Jan 20 23:06:36 crc kubenswrapper[5030]: I0120 23:06:36.642715 5030 scope.go:117] "RemoveContainer" containerID="6fcd6c82f9b98061477b62981fd66acb9608345abb256874ad023be0bb233075" Jan 20 23:06:36 crc kubenswrapper[5030]: I0120 23:06:36.686308 5030 scope.go:117] "RemoveContainer" containerID="dd92b5dd5973bf0786d8d8b9c80aff684fc1a3124f9a956ceefe2ee92d51545f" Jan 20 23:06:36 crc kubenswrapper[5030]: I0120 23:06:36.759268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:38 crc kubenswrapper[5030]: I0120 23:06:38.536725 5030 generic.go:334] "Generic (PLEG): container finished" podID="eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" containerID="ddfad440bf1dbcb119302c876ff1167f8af714dafbb96b9a1371e0c947bbd1ef" exitCode=0 Jan 20 23:06:38 crc kubenswrapper[5030]: I0120 23:06:38.536778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" event={"ID":"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30","Type":"ContainerDied","Data":"ddfad440bf1dbcb119302c876ff1167f8af714dafbb96b9a1371e0c947bbd1ef"} Jan 20 23:06:38 crc kubenswrapper[5030]: I0120 23:06:38.839795 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:38 crc kubenswrapper[5030]: I0120 23:06:38.839846 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:39 crc kubenswrapper[5030]: I0120 23:06:39.865290 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:39 crc kubenswrapper[5030]: I0120 23:06:39.923949 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.129:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:39 crc kubenswrapper[5030]: I0120 23:06:39.924233 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.129:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.053425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts\") pod \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.053857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data\") pod \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.053879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle\") pod \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.053963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw8w4\" (UniqueName: \"kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4\") pod \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\" (UID: \"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30\") " Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.059330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts" (OuterVolumeSpecName: "scripts") pod "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" (UID: "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.060405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4" (OuterVolumeSpecName: "kube-api-access-lw8w4") pod "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" (UID: "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30"). InnerVolumeSpecName "kube-api-access-lw8w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.088465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data" (OuterVolumeSpecName: "config-data") pod "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" (UID: "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.092341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" (UID: "eea7dea3-e338-41cb-9c57-e5c8f4e4ef30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.156400 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw8w4\" (UniqueName: \"kubernetes.io/projected/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-kube-api-access-lw8w4\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.156431 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.156444 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.156458 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.314459 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.314958 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerName="kube-state-metrics" containerID="cri-o://63f933bb3c5d62aebcaf790abe7a9783ebabfd7c507c26b2013dc71a5002bae9" gracePeriod=30 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.559790 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.559740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-84wng" event={"ID":"eea7dea3-e338-41cb-9c57-e5c8f4e4ef30","Type":"ContainerDied","Data":"6e5f504ce28f2dd4a44bb3533c93d290d2344000072b8801a69e2c68aa6cb1b1"} Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.560454 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e5f504ce28f2dd4a44bb3533c93d290d2344000072b8801a69e2c68aa6cb1b1" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.561946 5030 generic.go:334] "Generic (PLEG): container finished" podID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerID="63f933bb3c5d62aebcaf790abe7a9783ebabfd7c507c26b2013dc71a5002bae9" exitCode=2 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.561982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d808c4c2-9747-4f7f-aea4-a3a2698b453e","Type":"ContainerDied","Data":"63f933bb3c5d62aebcaf790abe7a9783ebabfd7c507c26b2013dc71a5002bae9"} Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.726554 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.1.56:8081/readyz\": dial tcp 10.217.1.56:8081: connect: connection refused" Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.763473 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.763950 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-log" containerID="cri-o://0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11" gracePeriod=30 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.764617 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-api" containerID="cri-o://dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31" gracePeriod=30 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.775498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.775749 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" containerName="nova-scheduler-scheduler" containerID="cri-o://6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915" gracePeriod=30 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.813649 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 23:06:40.814489 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-log" containerID="cri-o://8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed" gracePeriod=30 Jan 20 23:06:40 crc kubenswrapper[5030]: I0120 
23:06:40.814661 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-metadata" containerID="cri-o://b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4" gracePeriod=30 Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.292466 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.376181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czskw\" (UniqueName: \"kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw\") pod \"d808c4c2-9747-4f7f-aea4-a3a2698b453e\" (UID: \"d808c4c2-9747-4f7f-aea4-a3a2698b453e\") " Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.385343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw" (OuterVolumeSpecName: "kube-api-access-czskw") pod "d808c4c2-9747-4f7f-aea4-a3a2698b453e" (UID: "d808c4c2-9747-4f7f-aea4-a3a2698b453e"). InnerVolumeSpecName "kube-api-access-czskw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.478825 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czskw\" (UniqueName: \"kubernetes.io/projected/d808c4c2-9747-4f7f-aea4-a3a2698b453e-kube-api-access-czskw\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.573866 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerID="8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed" exitCode=143 Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.573958 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerDied","Data":"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed"} Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.575726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d808c4c2-9747-4f7f-aea4-a3a2698b453e","Type":"ContainerDied","Data":"e345fa82caee8495b122425cd4d168793900d4938d6213a556b387c516a65a1e"} Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.575756 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.575795 5030 scope.go:117] "RemoveContainer" containerID="63f933bb3c5d62aebcaf790abe7a9783ebabfd7c507c26b2013dc71a5002bae9" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.578021 5030 generic.go:334] "Generic (PLEG): container finished" podID="e844ffd7-188b-4333-84f4-474c6f346630" containerID="0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11" exitCode=143 Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.578061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerDied","Data":"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11"} Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.617603 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.630650 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.646201 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:41 crc kubenswrapper[5030]: E0120 23:06:41.647337 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" containerName="nova-manage" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.647385 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" containerName="nova-manage" Jan 20 23:06:41 crc kubenswrapper[5030]: E0120 23:06:41.647445 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerName="kube-state-metrics" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.647454 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerName="kube-state-metrics" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.648003 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" containerName="kube-state-metrics" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.648039 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" containerName="nova-manage" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.649055 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.652349 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.653074 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.683275 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.783743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.783825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.783985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.784460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sldsp\" (UniqueName: \"kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.886300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sldsp\" (UniqueName: \"kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.886411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.886441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 
23:06:41.886490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.897564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.901358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.904822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.908695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sldsp\" (UniqueName: \"kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp\") pod \"kube-state-metrics-0\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.974142 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d808c4c2-9747-4f7f-aea4-a3a2698b453e" path="/var/lib/kubelet/pods/d808c4c2-9747-4f7f-aea4-a3a2698b453e/volumes" Jan 20 23:06:41 crc kubenswrapper[5030]: I0120 23:06:41.986154 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.130885 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.131290 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="proxy-httpd" containerID="cri-o://f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84" gracePeriod=30 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.131336 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="sg-core" containerID="cri-o://8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3" gracePeriod=30 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.131357 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-central-agent" containerID="cri-o://64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66" gracePeriod=30 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.131382 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-notification-agent" containerID="cri-o://3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1" gracePeriod=30 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.437344 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.588514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c","Type":"ContainerStarted","Data":"e0913b7a61fba5f555780dc8ee66b46a2eb1f575b2c36c64e4b482cc25cf4906"} Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593091 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerID="f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84" exitCode=0 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593123 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerID="8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3" exitCode=2 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593133 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerID="64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66" exitCode=0 Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerDied","Data":"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84"} Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerDied","Data":"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3"} 
Jan 20 23:06:42 crc kubenswrapper[5030]: I0120 23:06:42.593183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerDied","Data":"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66"} Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.067103 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.208833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle\") pod \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.208919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvjtf\" (UniqueName: \"kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf\") pod \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.208992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data\") pod \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\" (UID: \"dd4297f4-36e8-419c-81cd-1fa36be55bc2\") " Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.223844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf" (OuterVolumeSpecName: "kube-api-access-tvjtf") pod "dd4297f4-36e8-419c-81cd-1fa36be55bc2" (UID: "dd4297f4-36e8-419c-81cd-1fa36be55bc2"). InnerVolumeSpecName "kube-api-access-tvjtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.232893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd4297f4-36e8-419c-81cd-1fa36be55bc2" (UID: "dd4297f4-36e8-419c-81cd-1fa36be55bc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.233166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data" (OuterVolumeSpecName: "config-data") pod "dd4297f4-36e8-419c-81cd-1fa36be55bc2" (UID: "dd4297f4-36e8-419c-81cd-1fa36be55bc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.311436 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvjtf\" (UniqueName: \"kubernetes.io/projected/dd4297f4-36e8-419c-81cd-1fa36be55bc2-kube-api-access-tvjtf\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.311823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.311837 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4297f4-36e8-419c-81cd-1fa36be55bc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.602800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c","Type":"ContainerStarted","Data":"8fff461aea2537b44e18dcf720322f47cc70d29129fcd1dd6c7ce152cd4912d9"} Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.603559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.605246 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" containerID="6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915" exitCode=0 Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.605321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd4297f4-36e8-419c-81cd-1fa36be55bc2","Type":"ContainerDied","Data":"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915"} Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.605346 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"dd4297f4-36e8-419c-81cd-1fa36be55bc2","Type":"ContainerDied","Data":"cd59ca04ffaa1865f9ae803413ab6517536d7967907837d94af85707a3719e41"} Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.605368 5030 scope.go:117] "RemoveContainer" containerID="6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.605556 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.627705 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.249415938 podStartE2EDuration="2.627688193s" podCreationTimestamp="2026-01-20 23:06:41 +0000 UTC" firstStartedPulling="2026-01-20 23:06:42.442306936 +0000 UTC m=+1874.762567254" lastFinishedPulling="2026-01-20 23:06:42.820579221 +0000 UTC m=+1875.140839509" observedRunningTime="2026-01-20 23:06:43.621025751 +0000 UTC m=+1875.941286039" watchObservedRunningTime="2026-01-20 23:06:43.627688193 +0000 UTC m=+1875.947948481" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.632805 5030 scope.go:117] "RemoveContainer" containerID="6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915" Jan 20 23:06:43 crc kubenswrapper[5030]: E0120 23:06:43.633267 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915\": container with ID starting with 6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915 not found: ID does not exist" containerID="6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.633313 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915"} err="failed to get container status \"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915\": rpc error: code = NotFound desc = could not find container \"6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915\": container with ID starting with 6e58555f160e268b77ba7e16690198097633f07e1b7605f94dbddc164622a915 not found: ID does not exist" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.648480 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.655879 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.669911 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:43 crc kubenswrapper[5030]: E0120 23:06:43.670383 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" containerName="nova-scheduler-scheduler" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.670411 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" containerName="nova-scheduler-scheduler" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.670743 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" containerName="nova-scheduler-scheduler" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.671310 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.673273 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.692716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.820903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.821062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.821150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8gf\" (UniqueName: \"kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.922824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.922917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.922981 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8gf\" (UniqueName: \"kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.926691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.928199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.951668 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8gf\" (UniqueName: \"kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf\") pod \"nova-scheduler-0\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.972157 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4297f4-36e8-419c-81cd-1fa36be55bc2" path="/var/lib/kubelet/pods/dd4297f4-36e8-419c-81cd-1fa36be55bc2/volumes" Jan 20 23:06:43 crc kubenswrapper[5030]: I0120 23:06:43.991177 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.371211 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.494038 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.536832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs\") pod \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.537153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv75w\" (UniqueName: \"kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w\") pod \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.537287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data\") pod \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.537401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs\") pod \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.537547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle\") pod \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\" (UID: \"8bc1ae48-151e-4fc9-b08c-4afa012265ae\") " Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.540152 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs" (OuterVolumeSpecName: "logs") pod "8bc1ae48-151e-4fc9-b08c-4afa012265ae" (UID: "8bc1ae48-151e-4fc9-b08c-4afa012265ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.544983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w" (OuterVolumeSpecName: "kube-api-access-rv75w") pod "8bc1ae48-151e-4fc9-b08c-4afa012265ae" (UID: "8bc1ae48-151e-4fc9-b08c-4afa012265ae"). InnerVolumeSpecName "kube-api-access-rv75w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.568639 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data" (OuterVolumeSpecName: "config-data") pod "8bc1ae48-151e-4fc9-b08c-4afa012265ae" (UID: "8bc1ae48-151e-4fc9-b08c-4afa012265ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.571289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bc1ae48-151e-4fc9-b08c-4afa012265ae" (UID: "8bc1ae48-151e-4fc9-b08c-4afa012265ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.592067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8bc1ae48-151e-4fc9-b08c-4afa012265ae" (UID: "8bc1ae48-151e-4fc9-b08c-4afa012265ae"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.617951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b74f69b6-bbb2-4578-a61c-9479f6802e01","Type":"ContainerStarted","Data":"ddb1e50f0149d2f277fe1610af98284b0db0c78e952d740ce24f39679bc3fc95"} Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.622382 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerID="b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4" exitCode=0 Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.622429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerDied","Data":"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4"} Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.622479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"8bc1ae48-151e-4fc9-b08c-4afa012265ae","Type":"ContainerDied","Data":"31dfcbe4f6c17af95678bea3af05fdd10cfac96b22190b7fdb1e9a0be587b09c"} Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.622496 5030 scope.go:117] "RemoveContainer" containerID="b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.622517 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.639573 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.639652 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.639665 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv75w\" (UniqueName: \"kubernetes.io/projected/8bc1ae48-151e-4fc9-b08c-4afa012265ae-kube-api-access-rv75w\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.639675 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc1ae48-151e-4fc9-b08c-4afa012265ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.639687 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc1ae48-151e-4fc9-b08c-4afa012265ae-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.669732 5030 scope.go:117] "RemoveContainer" containerID="8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.674894 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.697209 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.707426 5030 scope.go:117] "RemoveContainer" containerID="b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4" Jan 20 23:06:44 crc kubenswrapper[5030]: E0120 23:06:44.708241 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4\": container with ID starting with b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4 not found: ID does not exist" containerID="b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.708293 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4"} err="failed to get container status \"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4\": rpc error: code = NotFound desc = could not find container \"b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4\": container with ID starting with b23f8c8c1c7670e182626ea44ac8643174e86ad72b395b0a56ce5b1274afa1c4 not found: ID does not exist" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.708327 5030 scope.go:117] "RemoveContainer" containerID="8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed" Jan 20 23:06:44 crc kubenswrapper[5030]: E0120 23:06:44.709048 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed\": container with ID starting with 8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed not found: ID does not exist" containerID="8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.709109 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed"} err="failed to get container status \"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed\": rpc error: code = NotFound desc = could not find container \"8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed\": container with ID starting with 8ccb7871d1d322b35b080a53848a656dec1a9d94b88aef6026877a164f9327ed not found: ID does not exist" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.716719 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:44 crc kubenswrapper[5030]: E0120 23:06:44.717402 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-metadata" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.717442 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-metadata" Jan 20 23:06:44 crc kubenswrapper[5030]: E0120 23:06:44.717508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-log" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.717524 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-log" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.717888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-metadata" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.717947 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" containerName="nova-metadata-log" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.719733 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.726463 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.728366 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.732763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.843901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.843969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.844093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qffw\" (UniqueName: \"kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.844126 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.844181 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.945341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.945416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.945463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qffw\" (UniqueName: 
\"kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.945492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.945513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.947053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.951488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.951515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.952017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.963723 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:06:44 crc kubenswrapper[5030]: E0120 23:06:44.963966 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:06:44 crc kubenswrapper[5030]: I0120 23:06:44.967773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qffw\" (UniqueName: \"kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw\") pod \"nova-metadata-0\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.038677 5030 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.445030 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.502042 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.551854 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.563169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs\") pod \"e844ffd7-188b-4333-84f4-474c6f346630\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.563223 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle\") pod \"e844ffd7-188b-4333-84f4-474c6f346630\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.563251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data\") pod \"e844ffd7-188b-4333-84f4-474c6f346630\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.563303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm4sw\" (UniqueName: \"kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw\") pod \"e844ffd7-188b-4333-84f4-474c6f346630\" (UID: \"e844ffd7-188b-4333-84f4-474c6f346630\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.564019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs" (OuterVolumeSpecName: "logs") pod "e844ffd7-188b-4333-84f4-474c6f346630" (UID: "e844ffd7-188b-4333-84f4-474c6f346630"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.571294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw" (OuterVolumeSpecName: "kube-api-access-hm4sw") pod "e844ffd7-188b-4333-84f4-474c6f346630" (UID: "e844ffd7-188b-4333-84f4-474c6f346630"). InnerVolumeSpecName "kube-api-access-hm4sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.612572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data" (OuterVolumeSpecName: "config-data") pod "e844ffd7-188b-4333-84f4-474c6f346630" (UID: "e844ffd7-188b-4333-84f4-474c6f346630"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.622766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e844ffd7-188b-4333-84f4-474c6f346630" (UID: "e844ffd7-188b-4333-84f4-474c6f346630"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.635501 5030 generic.go:334] "Generic (PLEG): container finished" podID="e844ffd7-188b-4333-84f4-474c6f346630" containerID="dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31" exitCode=0 Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.635552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerDied","Data":"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.635573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e844ffd7-188b-4333-84f4-474c6f346630","Type":"ContainerDied","Data":"be6f8426d3cfc0a95ab34817fd68baf63c450690d5b6dce074f8ea126ccc0b27"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.635588 5030 scope.go:117] "RemoveContainer" containerID="dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.635697 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.647613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerStarted","Data":"2637b198c127ff15d3cda806ad9c17309ec22af0631afe4f583165d198101b60"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.666877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667041 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667111 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667204 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2tqg\" (UniqueName: \"kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg\") pod \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\" (UID: \"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e\") " Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667553 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e844ffd7-188b-4333-84f4-474c6f346630-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667564 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667574 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e844ffd7-188b-4333-84f4-474c6f346630-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.667582 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm4sw\" (UniqueName: \"kubernetes.io/projected/e844ffd7-188b-4333-84f4-474c6f346630-kube-api-access-hm4sw\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.670918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.672661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.675870 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerID="3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1" exitCode=0 Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.675923 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerDied","Data":"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.675951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e","Type":"ContainerDied","Data":"3e5f4f08565d2fabcb922ce081027f5b288dc07697610d03330b564ce0b972fb"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.676009 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.676186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts" (OuterVolumeSpecName: "scripts") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.679933 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.685275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg" (OuterVolumeSpecName: "kube-api-access-p2tqg") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "kube-api-access-p2tqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.687824 5030 scope.go:117] "RemoveContainer" containerID="0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.689138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b74f69b6-bbb2-4578-a61c-9479f6802e01","Type":"ContainerStarted","Data":"65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561"} Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.714552 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.721724 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722130 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-notification-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722144 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-notification-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-api" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-api" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722180 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-log" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722187 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-log" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="sg-core" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722205 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="sg-core" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722215 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="proxy-httpd" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="proxy-httpd" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.722238 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-central-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722243 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-central-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="proxy-httpd" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722439 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-central-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722451 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-api" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722468 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="sg-core" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722478 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e844ffd7-188b-4333-84f4-474c6f346630" containerName="nova-api-log" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.722488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" containerName="ceilometer-notification-agent" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.723489 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.726598 5030 scope.go:117] "RemoveContainer" containerID="dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.728586 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.729851 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31\": container with ID starting with dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31 not found: ID does not exist" containerID="dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.729907 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31"} err="failed to get container status \"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31\": rpc error: code = NotFound desc = could not find container \"dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31\": container with ID starting with dc72430e074a7f953f4baf7c240facdf41b788f369e96c34ce426557194c2c31 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.729944 5030 scope.go:117] "RemoveContainer" containerID="0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.730378 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11\": container with ID starting with 0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11 not found: ID does not exist" containerID="0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.730407 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11"} err="failed to get container status \"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11\": rpc error: code = NotFound desc = could not find 
container \"0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11\": container with ID starting with 0ff9251db9165741ab99181da39dd3279ddbee949749fbe1b36b854f83078b11 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.730425 5030 scope.go:117] "RemoveContainer" containerID="f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.732710 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.738423 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.738406242 podStartE2EDuration="2.738406242s" podCreationTimestamp="2026-01-20 23:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:45.709009116 +0000 UTC m=+1878.029269404" watchObservedRunningTime="2026-01-20 23:06:45.738406242 +0000 UTC m=+1878.058666530" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.742447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.758550 5030 scope.go:117] "RemoveContainer" containerID="8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.768542 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.768574 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.768584 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.768594 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.768604 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2tqg\" (UniqueName: \"kubernetes.io/projected/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-kube-api-access-p2tqg\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.783396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.784376 5030 scope.go:117] "RemoveContainer" containerID="3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.789271 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data" (OuterVolumeSpecName: "config-data") pod "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" (UID: "bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.803305 5030 scope.go:117] "RemoveContainer" containerID="64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.827335 5030 scope.go:117] "RemoveContainer" containerID="f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.827907 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84\": container with ID starting with f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84 not found: ID does not exist" containerID="f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.827962 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84"} err="failed to get container status \"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84\": rpc error: code = NotFound desc = could not find container \"f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84\": container with ID starting with f5c32f92175e7a5b690a5c4cb113f30fada3f5b40fce50511e15e82e36509d84 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.827988 5030 scope.go:117] "RemoveContainer" containerID="8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.828540 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3\": container with ID starting with 8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3 not found: ID does not exist" containerID="8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.828579 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3"} err="failed to get container status \"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3\": rpc error: code = NotFound desc = could not find container \"8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3\": container with ID starting with 8663f21c6f657fc21b551689ac7e403a73d7b917ddc92f30bbe96a82a764ffa3 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.828662 5030 scope.go:117] "RemoveContainer" containerID="3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.828938 
5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1\": container with ID starting with 3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1 not found: ID does not exist" containerID="3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.828958 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1"} err="failed to get container status \"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1\": rpc error: code = NotFound desc = could not find container \"3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1\": container with ID starting with 3be2cb38d9f117696064c23ddcb58967316a94eb60bc236b7633172fcd0883c1 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.828969 5030 scope.go:117] "RemoveContainer" containerID="64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66" Jan 20 23:06:45 crc kubenswrapper[5030]: E0120 23:06:45.829173 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66\": container with ID starting with 64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66 not found: ID does not exist" containerID="64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.829188 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66"} err="failed to get container status \"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66\": rpc error: code = NotFound desc = could not find container \"64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66\": container with ID starting with 64276a673eccc337b3a92ee9cfe55913394327ac5fea09449c77770c10978c66 not found: ID does not exist" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc2k8\" (UniqueName: \"kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870829 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.870840 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.973428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.973545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc2k8\" (UniqueName: \"kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.973610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.974598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.974010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.978428 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.979451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.981520 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc1ae48-151e-4fc9-b08c-4afa012265ae" path="/var/lib/kubelet/pods/8bc1ae48-151e-4fc9-b08c-4afa012265ae/volumes" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.982404 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e844ffd7-188b-4333-84f4-474c6f346630" path="/var/lib/kubelet/pods/e844ffd7-188b-4333-84f4-474c6f346630/volumes" Jan 20 23:06:45 crc kubenswrapper[5030]: I0120 23:06:45.995856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc2k8\" (UniqueName: \"kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8\") pod \"nova-api-0\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.046154 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.163004 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.200225 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.205738 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.211853 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.215004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.215341 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.215576 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.229033 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.393567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.393839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28dt\" (UniqueName: \"kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.393862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.393969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data\") pod \"ceilometer-0\" (UID: 
\"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.394043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.394214 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.394262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.394411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28dt\" (UniqueName: \"kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496597 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.496696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.497269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.497691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.506076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.506119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.507115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.509758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.510463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: 
I0120 23:06:46.513115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.514603 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28dt\" (UniqueName: \"kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt\") pod \"ceilometer-0\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.527378 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.715111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerStarted","Data":"2d202c8a6069a8f94f4c396d447c7c9ca8b664cf155fd902dbbcf820ccd0a8ea"} Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.715528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerStarted","Data":"19fb231c59f8089b6cc2f266b334a33db07302a867317497a770819abfdc1c42"} Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.725911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerStarted","Data":"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111"} Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.725950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerStarted","Data":"d10b77d90476ca3487dbf0d272e3924a71562bab01716a9ee558ee5259643775"} Jan 20 23:06:46 crc kubenswrapper[5030]: I0120 23:06:46.744837 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.744775889 podStartE2EDuration="2.744775889s" podCreationTimestamp="2026-01-20 23:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:46.739155642 +0000 UTC m=+1879.059415930" watchObservedRunningTime="2026-01-20 23:06:46.744775889 +0000 UTC m=+1879.065036177" Jan 20 23:06:47 crc kubenswrapper[5030]: I0120 23:06:47.000144 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:06:47 crc kubenswrapper[5030]: W0120 23:06:47.001347 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cd2ab40_dcc9_46b8_831a_25ed1ce447f9.slice/crio-82a1899e4374d9b3004e53d503619161600150ec7cf6af54d11064480baba32c WatchSource:0}: Error finding container 82a1899e4374d9b3004e53d503619161600150ec7cf6af54d11064480baba32c: Status 404 returned error can't find the container with id 82a1899e4374d9b3004e53d503619161600150ec7cf6af54d11064480baba32c Jan 20 23:06:47 crc kubenswrapper[5030]: I0120 23:06:47.743002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerStarted","Data":"82a1899e4374d9b3004e53d503619161600150ec7cf6af54d11064480baba32c"} Jan 20 23:06:47 crc kubenswrapper[5030]: I0120 
23:06:47.744989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerStarted","Data":"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817"} Jan 20 23:06:47 crc kubenswrapper[5030]: I0120 23:06:47.770540 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.770520066 podStartE2EDuration="2.770520066s" podCreationTimestamp="2026-01-20 23:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:06:47.761698152 +0000 UTC m=+1880.081958440" watchObservedRunningTime="2026-01-20 23:06:47.770520066 +0000 UTC m=+1880.090780364" Jan 20 23:06:47 crc kubenswrapper[5030]: I0120 23:06:47.979892 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e" path="/var/lib/kubelet/pods/bd4b4b0b-6797-44fa-83cb-1b00e9c28a3e/volumes" Jan 20 23:06:48 crc kubenswrapper[5030]: I0120 23:06:48.753818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerStarted","Data":"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5"} Jan 20 23:06:48 crc kubenswrapper[5030]: I0120 23:06:48.991191 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:49 crc kubenswrapper[5030]: I0120 23:06:49.767527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerStarted","Data":"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a"} Jan 20 23:06:50 crc kubenswrapper[5030]: I0120 23:06:50.039512 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:50 crc kubenswrapper[5030]: I0120 23:06:50.040763 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:50 crc kubenswrapper[5030]: I0120 23:06:50.782735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerStarted","Data":"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663"} Jan 20 23:06:51 crc kubenswrapper[5030]: I0120 23:06:51.796645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerStarted","Data":"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18"} Jan 20 23:06:51 crc kubenswrapper[5030]: I0120 23:06:51.796943 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:06:51 crc kubenswrapper[5030]: I0120 23:06:51.826391 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.3514829179999999 podStartE2EDuration="5.826368201s" podCreationTimestamp="2026-01-20 23:06:46 +0000 UTC" firstStartedPulling="2026-01-20 23:06:47.003586044 +0000 UTC m=+1879.323846372" lastFinishedPulling="2026-01-20 23:06:51.478471357 +0000 UTC m=+1883.798731655" observedRunningTime="2026-01-20 23:06:51.818959861 +0000 UTC 
m=+1884.139220159" watchObservedRunningTime="2026-01-20 23:06:51.826368201 +0000 UTC m=+1884.146628499" Jan 20 23:06:52 crc kubenswrapper[5030]: I0120 23:06:52.002609 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:06:53 crc kubenswrapper[5030]: I0120 23:06:53.991972 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:54 crc kubenswrapper[5030]: I0120 23:06:54.026349 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:54 crc kubenswrapper[5030]: I0120 23:06:54.857839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:06:55 crc kubenswrapper[5030]: I0120 23:06:55.040523 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:55 crc kubenswrapper[5030]: I0120 23:06:55.040586 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:06:56 crc kubenswrapper[5030]: I0120 23:06:56.047189 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:56 crc kubenswrapper[5030]: I0120 23:06:56.047241 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:06:56 crc kubenswrapper[5030]: I0120 23:06:56.054770 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:56 crc kubenswrapper[5030]: I0120 23:06:56.054871 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:57 crc kubenswrapper[5030]: I0120 23:06:57.088850 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.134:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:57 crc kubenswrapper[5030]: I0120 23:06:57.088900 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.134:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:06:59 crc kubenswrapper[5030]: I0120 23:06:59.962557 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:06:59 crc kubenswrapper[5030]: E0120 23:06:59.963369 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:07:05 crc kubenswrapper[5030]: I0120 23:07:05.061446 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:07:05 crc kubenswrapper[5030]: I0120 23:07:05.065593 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:07:05 crc kubenswrapper[5030]: I0120 23:07:05.080497 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:07:05 crc kubenswrapper[5030]: I0120 23:07:05.081349 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.052669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.053449 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.054072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.058471 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.965308 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:06 crc kubenswrapper[5030]: I0120 23:07:06.970155 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.797674 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.798579 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-central-agent" containerID="cri-o://67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" gracePeriod=30 Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.798725 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="proxy-httpd" containerID="cri-o://420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" gracePeriod=30 Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.798790 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="sg-core" containerID="cri-o://1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" gracePeriod=30 Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.798809 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-notification-agent" 
containerID="cri-o://144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" gracePeriod=30 Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.820978 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.986023 5030 generic.go:334] "Generic (PLEG): container finished" podID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" exitCode=2 Jan 20 23:07:08 crc kubenswrapper[5030]: I0120 23:07:08.986071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerDied","Data":"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663"} Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.959368 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997510 5030 generic.go:334] "Generic (PLEG): container finished" podID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" exitCode=0 Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997864 5030 generic.go:334] "Generic (PLEG): container finished" podID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" exitCode=0 Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997877 5030 generic.go:334] "Generic (PLEG): container finished" podID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" exitCode=0 Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerDied","Data":"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18"} Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997923 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerDied","Data":"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a"} Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerDied","Data":"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5"} Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9","Type":"ContainerDied","Data":"82a1899e4374d9b3004e53d503619161600150ec7cf6af54d11064480baba32c"} Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.997958 5030 scope.go:117] "RemoveContainer" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" Jan 20 23:07:09 crc kubenswrapper[5030]: I0120 23:07:09.998103 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.022165 5030 scope.go:117] "RemoveContainer" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.050519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.050588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.050865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s28dt\" (UniqueName: \"kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle\") pod \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\" (UID: \"5cd2ab40-dcc9-46b8-831a-25ed1ce447f9\") " Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.051811 
5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.052518 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.052538 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.056520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt" (OuterVolumeSpecName: "kube-api-access-s28dt") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "kube-api-access-s28dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.068890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts" (OuterVolumeSpecName: "scripts") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.070890 5030 scope.go:117] "RemoveContainer" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.088492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.149782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.153934 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.153960 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.153970 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.153980 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s28dt\" (UniqueName: \"kubernetes.io/projected/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-kube-api-access-s28dt\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.161611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data" (OuterVolumeSpecName: "config-data") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.166981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" (UID: "5cd2ab40-dcc9-46b8-831a-25ed1ce447f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.246645 5030 scope.go:117] "RemoveContainer" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.255825 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.255853 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.271137 5030 scope.go:117] "RemoveContainer" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.272639 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": container with ID starting with 420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18 not found: ID does not exist" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.272690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18"} err="failed to get container status \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": rpc error: code = NotFound desc = could not find container \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": container with ID starting with 420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.272718 5030 scope.go:117] "RemoveContainer" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.273156 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": container with ID starting with 1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663 not found: ID does not exist" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273190 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663"} err="failed to get container status \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": rpc error: code = NotFound desc = could not find container \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": container with ID starting with 1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273212 5030 scope.go:117] "RemoveContainer" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.273477 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": container with ID starting with 144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a not found: ID does not exist" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273518 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a"} err="failed to get container status \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": rpc error: code = NotFound desc = could not find container \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": container with ID starting with 144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273544 5030 scope.go:117] "RemoveContainer" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.273801 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": container with ID starting with 67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5 not found: ID does not exist" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273825 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5"} err="failed to get container status \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": rpc error: code = NotFound desc = could not find container \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": container with ID starting with 67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.273838 5030 scope.go:117] "RemoveContainer" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274037 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18"} err="failed to get container status \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": rpc error: code = NotFound desc = could not find container \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": container with ID starting with 420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274060 5030 scope.go:117] "RemoveContainer" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274269 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663"} err="failed to get container status \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": rpc error: code = NotFound desc = could not find container 
\"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": container with ID starting with 1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274294 5030 scope.go:117] "RemoveContainer" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274488 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a"} err="failed to get container status \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": rpc error: code = NotFound desc = could not find container \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": container with ID starting with 144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274509 5030 scope.go:117] "RemoveContainer" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274713 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5"} err="failed to get container status \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": rpc error: code = NotFound desc = could not find container \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": container with ID starting with 67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274732 5030 scope.go:117] "RemoveContainer" containerID="420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18"} err="failed to get container status \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": rpc error: code = NotFound desc = could not find container \"420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18\": container with ID starting with 420192e8af9cd67f3168bb8f1a2a03d6953e9b7986d1e8893b1c1979cf253b18 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.274920 5030 scope.go:117] "RemoveContainer" containerID="1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.275096 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663"} err="failed to get container status \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": rpc error: code = NotFound desc = could not find container \"1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663\": container with ID starting with 1b5604ec9f8fac87e308c29d80fc53506ef9fb2f500f20edf07a726b1de84663 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.275112 5030 scope.go:117] "RemoveContainer" containerID="144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.275298 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a"} err="failed to get container status \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": rpc error: code = NotFound desc = could not find container \"144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a\": container with ID starting with 144063c200a5ab1e74e983166d4a61ef2a5ad1622fc1eaf8116c2eb05c5e221a not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.275319 5030 scope.go:117] "RemoveContainer" containerID="67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.275469 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5"} err="failed to get container status \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": rpc error: code = NotFound desc = could not find container \"67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5\": container with ID starting with 67bbbaa2152bd99cc125f7730eb3b33eae643df36c597a5dba5ba9733150bcb5 not found: ID does not exist" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.328708 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.345462 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.372879 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.373254 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-notification-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373279 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-notification-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.373306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-central-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-central-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.373322 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="proxy-httpd" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373328 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="proxy-httpd" Jan 20 23:07:10 crc kubenswrapper[5030]: E0120 23:07:10.373345 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="sg-core" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="sg-core" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373500 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" 
containerName="ceilometer-notification-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373514 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="proxy-httpd" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373529 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="ceilometer-central-agent" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.373543 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" containerName="sg-core" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.374982 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.377772 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.379209 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.384688 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.389100 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.458880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st2cf\" (UniqueName: \"kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.458932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.458961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.459021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.459037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.459051 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.459197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.459332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st2cf\" (UniqueName: \"kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.561691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.562049 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.562334 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.566196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.566285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.566471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.567675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.567715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.577301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st2cf\" (UniqueName: \"kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf\") pod \"ceilometer-0\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:10 crc kubenswrapper[5030]: I0120 23:07:10.694055 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:11 crc kubenswrapper[5030]: I0120 23:07:11.135152 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:11 crc kubenswrapper[5030]: W0120 23:07:11.136825 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fbc8f3d_c103_44c1_ac39_963383927298.slice/crio-323514164ec51c17f373aca4d2ea49a17d7e21642b141bda29e7245d854fc68f WatchSource:0}: Error finding container 323514164ec51c17f373aca4d2ea49a17d7e21642b141bda29e7245d854fc68f: Status 404 returned error can't find the container with id 323514164ec51c17f373aca4d2ea49a17d7e21642b141bda29e7245d854fc68f Jan 20 23:07:11 crc kubenswrapper[5030]: I0120 23:07:11.157690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:11 crc kubenswrapper[5030]: I0120 23:07:11.157911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-log" containerID="cri-o://ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111" gracePeriod=30 Jan 20 23:07:11 crc kubenswrapper[5030]: I0120 23:07:11.158033 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-api" containerID="cri-o://35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817" gracePeriod=30 Jan 20 23:07:11 crc kubenswrapper[5030]: I0120 23:07:11.983828 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cd2ab40-dcc9-46b8-831a-25ed1ce447f9" path="/var/lib/kubelet/pods/5cd2ab40-dcc9-46b8-831a-25ed1ce447f9/volumes" Jan 20 23:07:12 crc kubenswrapper[5030]: I0120 23:07:12.024380 5030 generic.go:334] "Generic (PLEG): container finished" podID="46643fed-d966-40fa-bd2d-a623e341cc22" containerID="ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111" exitCode=143 Jan 20 23:07:12 crc kubenswrapper[5030]: I0120 23:07:12.024440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerDied","Data":"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111"} Jan 20 23:07:12 crc kubenswrapper[5030]: I0120 23:07:12.027118 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerStarted","Data":"1ada5657c0ab92d2e37c68f3c3a8795a07c7e3f6b3784b49d9928703c0eb8a6b"} Jan 20 23:07:12 crc kubenswrapper[5030]: I0120 23:07:12.027692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerStarted","Data":"323514164ec51c17f373aca4d2ea49a17d7e21642b141bda29e7245d854fc68f"} Jan 20 23:07:12 crc kubenswrapper[5030]: I0120 23:07:12.425314 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:13 crc kubenswrapper[5030]: I0120 23:07:13.041147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerStarted","Data":"47b6880acb6eec1dc3481e292765412f6e14455ba1de3248d6739a81ad379bf2"} Jan 20 
23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.060289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerStarted","Data":"a868bbf49948a9db47505cfb9af10383df7c7060db49bd2fa0bc2d951a9ff9d5"} Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.869669 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.942532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs\") pod \"46643fed-d966-40fa-bd2d-a623e341cc22\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.942823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle\") pod \"46643fed-d966-40fa-bd2d-a623e341cc22\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.942977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data\") pod \"46643fed-d966-40fa-bd2d-a623e341cc22\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.943031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc2k8\" (UniqueName: \"kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8\") pod \"46643fed-d966-40fa-bd2d-a623e341cc22\" (UID: \"46643fed-d966-40fa-bd2d-a623e341cc22\") " Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.945376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs" (OuterVolumeSpecName: "logs") pod "46643fed-d966-40fa-bd2d-a623e341cc22" (UID: "46643fed-d966-40fa-bd2d-a623e341cc22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.952752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8" (OuterVolumeSpecName: "kube-api-access-xc2k8") pod "46643fed-d966-40fa-bd2d-a623e341cc22" (UID: "46643fed-d966-40fa-bd2d-a623e341cc22"). InnerVolumeSpecName "kube-api-access-xc2k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.963604 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.976006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46643fed-d966-40fa-bd2d-a623e341cc22" (UID: "46643fed-d966-40fa-bd2d-a623e341cc22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:14 crc kubenswrapper[5030]: I0120 23:07:14.982816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data" (OuterVolumeSpecName: "config-data") pod "46643fed-d966-40fa-bd2d-a623e341cc22" (UID: "46643fed-d966-40fa-bd2d-a623e341cc22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.048035 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.048068 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc2k8\" (UniqueName: \"kubernetes.io/projected/46643fed-d966-40fa-bd2d-a623e341cc22-kube-api-access-xc2k8\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.048082 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46643fed-d966-40fa-bd2d-a623e341cc22-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.048093 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46643fed-d966-40fa-bd2d-a623e341cc22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerStarted","Data":"51ab2319e1e12fac61a767b6c8e4a563862f0b2517267e7aece6523d60d5e8ae"} Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078363 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-central-agent" containerID="cri-o://1ada5657c0ab92d2e37c68f3c3a8795a07c7e3f6b3784b49d9928703c0eb8a6b" gracePeriod=30 Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078728 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078755 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="proxy-httpd" containerID="cri-o://51ab2319e1e12fac61a767b6c8e4a563862f0b2517267e7aece6523d60d5e8ae" gracePeriod=30 Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078810 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="sg-core" containerID="cri-o://a868bbf49948a9db47505cfb9af10383df7c7060db49bd2fa0bc2d951a9ff9d5" gracePeriod=30 Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.078809 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-notification-agent" containerID="cri-o://47b6880acb6eec1dc3481e292765412f6e14455ba1de3248d6739a81ad379bf2" gracePeriod=30 Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.097131 5030 
generic.go:334] "Generic (PLEG): container finished" podID="46643fed-d966-40fa-bd2d-a623e341cc22" containerID="35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817" exitCode=0 Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.097170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerDied","Data":"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817"} Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.097195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"46643fed-d966-40fa-bd2d-a623e341cc22","Type":"ContainerDied","Data":"d10b77d90476ca3487dbf0d272e3924a71562bab01716a9ee558ee5259643775"} Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.097212 5030 scope.go:117] "RemoveContainer" containerID="35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.097366 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.114610 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.638782025 podStartE2EDuration="5.114591328s" podCreationTimestamp="2026-01-20 23:07:10 +0000 UTC" firstStartedPulling="2026-01-20 23:07:11.1423789 +0000 UTC m=+1903.462639239" lastFinishedPulling="2026-01-20 23:07:14.618188244 +0000 UTC m=+1906.938448542" observedRunningTime="2026-01-20 23:07:15.099286195 +0000 UTC m=+1907.419546483" watchObservedRunningTime="2026-01-20 23:07:15.114591328 +0000 UTC m=+1907.434851616" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.143460 5030 scope.go:117] "RemoveContainer" containerID="ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.149161 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.158238 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.175612 5030 scope.go:117] "RemoveContainer" containerID="35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.177190 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:15 crc kubenswrapper[5030]: E0120 23:07:15.177524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-log" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.177541 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-log" Jan 20 23:07:15 crc kubenswrapper[5030]: E0120 23:07:15.177576 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-api" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.177582 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-api" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.177762 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-log" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.177780 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" containerName="nova-api-api" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.178881 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: E0120 23:07:15.182089 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817\": container with ID starting with 35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817 not found: ID does not exist" containerID="35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182247 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817"} err="failed to get container status \"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817\": rpc error: code = NotFound desc = could not find container \"35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817\": container with ID starting with 35ccf2b17b3c46464b1c4743b34d1d319bc3c0ab5bedfe9f0ad04ec481506817 not found: ID does not exist" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182362 5030 scope.go:117] "RemoveContainer" containerID="ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182515 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:07:15 crc kubenswrapper[5030]: E0120 23:07:15.182933 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111\": container with ID starting with ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111 not found: ID does not exist" containerID="ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.182957 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111"} err="failed to get container status \"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111\": rpc error: code = NotFound desc = could not find container \"ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111\": container with ID starting with ba32217690e786626f2dcd194b14af1af39cdc0b09f842cd29c0f38e3cbfc111 not found: ID does not exist" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.194982 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.250687 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.250747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z88nw\" (UniqueName: \"kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.250790 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.250921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.251009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.251193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z88nw\" (UniqueName: \"kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.353413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.354004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.357451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.357536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.357799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.360151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.368659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z88nw\" (UniqueName: \"kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw\") pod \"nova-api-0\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.499897 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.983984 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46643fed-d966-40fa-bd2d-a623e341cc22" path="/var/lib/kubelet/pods/46643fed-d966-40fa-bd2d-a623e341cc22/volumes" Jan 20 23:07:15 crc kubenswrapper[5030]: I0120 23:07:15.985374 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.110371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerStarted","Data":"7d734a100670ce4293494e4925ac17467db764f2908d8753da4ceec6a8815490"} Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.113880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe"} Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117012 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fbc8f3d-c103-44c1-ac39-963383927298" containerID="51ab2319e1e12fac61a767b6c8e4a563862f0b2517267e7aece6523d60d5e8ae" exitCode=0 Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117096 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fbc8f3d-c103-44c1-ac39-963383927298" containerID="a868bbf49948a9db47505cfb9af10383df7c7060db49bd2fa0bc2d951a9ff9d5" exitCode=2 Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117159 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fbc8f3d-c103-44c1-ac39-963383927298" containerID="47b6880acb6eec1dc3481e292765412f6e14455ba1de3248d6739a81ad379bf2" exitCode=0 Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerDied","Data":"51ab2319e1e12fac61a767b6c8e4a563862f0b2517267e7aece6523d60d5e8ae"} Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerDied","Data":"a868bbf49948a9db47505cfb9af10383df7c7060db49bd2fa0bc2d951a9ff9d5"} Jan 20 23:07:16 crc kubenswrapper[5030]: I0120 23:07:16.117361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerDied","Data":"47b6880acb6eec1dc3481e292765412f6e14455ba1de3248d6739a81ad379bf2"} Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.131254 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fbc8f3d-c103-44c1-ac39-963383927298" containerID="1ada5657c0ab92d2e37c68f3c3a8795a07c7e3f6b3784b49d9928703c0eb8a6b" exitCode=0 Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.131349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerDied","Data":"1ada5657c0ab92d2e37c68f3c3a8795a07c7e3f6b3784b49d9928703c0eb8a6b"} Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.134366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerStarted","Data":"ffffc1c09fe8eaaa4c9873343803eceb97caa93ec62a7a0f5b0fb64447d616c8"} Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.134422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerStarted","Data":"61d51d6533347c2adcbfbfb7e645e966f84e627cfee7356d6c103a97b3f6d938"} Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.170262 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.170239535 podStartE2EDuration="2.170239535s" podCreationTimestamp="2026-01-20 23:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:07:17.159186866 +0000 UTC m=+1909.479447194" watchObservedRunningTime="2026-01-20 23:07:17.170239535 +0000 UTC m=+1909.490499823" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.574401 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.715036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.715593 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.715742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.716406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.716670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.717168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.717540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.717565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.717768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st2cf\" (UniqueName: \"kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf\") pod \"1fbc8f3d-c103-44c1-ac39-963383927298\" (UID: \"1fbc8f3d-c103-44c1-ac39-963383927298\") " Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.718457 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.718776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.722391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts" (OuterVolumeSpecName: "scripts") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.725535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf" (OuterVolumeSpecName: "kube-api-access-st2cf") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "kube-api-access-st2cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.769139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.779232 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.815452 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.819573 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.819756 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.819821 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1fbc8f3d-c103-44c1-ac39-963383927298-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.819880 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st2cf\" (UniqueName: \"kubernetes.io/projected/1fbc8f3d-c103-44c1-ac39-963383927298-kube-api-access-st2cf\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.819945 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.820006 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.845511 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data" (OuterVolumeSpecName: "config-data") pod "1fbc8f3d-c103-44c1-ac39-963383927298" (UID: "1fbc8f3d-c103-44c1-ac39-963383927298"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:17 crc kubenswrapper[5030]: I0120 23:07:17.922133 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fbc8f3d-c103-44c1-ac39-963383927298-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.151538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1fbc8f3d-c103-44c1-ac39-963383927298","Type":"ContainerDied","Data":"323514164ec51c17f373aca4d2ea49a17d7e21642b141bda29e7245d854fc68f"} Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.151653 5030 scope.go:117] "RemoveContainer" containerID="51ab2319e1e12fac61a767b6c8e4a563862f0b2517267e7aece6523d60d5e8ae" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.151565 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.184917 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.186009 5030 scope.go:117] "RemoveContainer" containerID="a868bbf49948a9db47505cfb9af10383df7c7060db49bd2fa0bc2d951a9ff9d5" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.198366 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.220689 5030 scope.go:117] "RemoveContainer" containerID="47b6880acb6eec1dc3481e292765412f6e14455ba1de3248d6739a81ad379bf2" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.222240 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:18 crc kubenswrapper[5030]: E0120 23:07:18.222690 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="sg-core" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.222717 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="sg-core" Jan 20 23:07:18 crc kubenswrapper[5030]: E0120 23:07:18.222744 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="proxy-httpd" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.222752 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="proxy-httpd" Jan 20 23:07:18 crc kubenswrapper[5030]: E0120 23:07:18.222771 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-central-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.222780 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-central-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: E0120 23:07:18.222797 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-notification-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.222807 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-notification-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.223004 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-central-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.223026 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="ceilometer-notification-agent" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.223039 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="sg-core" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.223059 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" containerName="proxy-httpd" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.225539 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.231159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.231464 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.231730 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.244922 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.260499 5030 scope.go:117] "RemoveContainer" containerID="1ada5657c0ab92d2e37c68f3c3a8795a07c7e3f6b3784b49d9928703c0eb8a6b" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.329523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz695\" (UniqueName: \"kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.329582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.329818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.329949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.330077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.330286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.330371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.330490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.432786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.432927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.433035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.433839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.434006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.434880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: 
I0120 23:07:18.435663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.435773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.435908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz695\" (UniqueName: \"kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.436453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.438896 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.439216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.440514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.440693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.454679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.457735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz695\" (UniqueName: \"kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695\") pod \"ceilometer-0\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:18 crc kubenswrapper[5030]: I0120 23:07:18.563117 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:19 crc kubenswrapper[5030]: W0120 23:07:19.086536 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b94587_03e7_4220_b417_0244e3623cd5.slice/crio-b271955100407803cc3b336caad14a4684c87d03e1174a9f38f22372a86d5637 WatchSource:0}: Error finding container b271955100407803cc3b336caad14a4684c87d03e1174a9f38f22372a86d5637: Status 404 returned error can't find the container with id b271955100407803cc3b336caad14a4684c87d03e1174a9f38f22372a86d5637 Jan 20 23:07:19 crc kubenswrapper[5030]: I0120 23:07:19.104224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:07:19 crc kubenswrapper[5030]: I0120 23:07:19.166633 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerStarted","Data":"b271955100407803cc3b336caad14a4684c87d03e1174a9f38f22372a86d5637"} Jan 20 23:07:19 crc kubenswrapper[5030]: I0120 23:07:19.977143 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fbc8f3d-c103-44c1-ac39-963383927298" path="/var/lib/kubelet/pods/1fbc8f3d-c103-44c1-ac39-963383927298/volumes" Jan 20 23:07:20 crc kubenswrapper[5030]: I0120 23:07:20.179424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerStarted","Data":"946c17aa01d77e8eccf38bf873615ae3de802c969d5a314ca7b0bcf0eb6374f1"} Jan 20 23:07:21 crc kubenswrapper[5030]: I0120 23:07:21.188659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerStarted","Data":"4f4bc6591caf71c0df2653257968c6767a5209155e845f49b6ad42788e1d5137"} Jan 20 23:07:22 crc kubenswrapper[5030]: I0120 23:07:22.203139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerStarted","Data":"2de53c3b859c17f6dce4f358962fc2f49e746c59df64f97dfaf9b707a5340a38"} Jan 20 23:07:23 crc kubenswrapper[5030]: I0120 23:07:23.222204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerStarted","Data":"06fbce813ee541d9eeed7850ca354fca0580cf614c4eb0a8edaf5b5a22908da6"} Jan 20 23:07:23 crc kubenswrapper[5030]: I0120 23:07:23.224043 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:23 crc kubenswrapper[5030]: I0120 23:07:23.270122 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.99587318 podStartE2EDuration="5.270095055s" podCreationTimestamp="2026-01-20 23:07:18 +0000 UTC" firstStartedPulling="2026-01-20 23:07:19.092954144 +0000 UTC m=+1911.413214472" lastFinishedPulling="2026-01-20 23:07:22.367176019 +0000 UTC m=+1914.687436347" observedRunningTime="2026-01-20 23:07:23.254443154 +0000 UTC m=+1915.574703502" watchObservedRunningTime="2026-01-20 23:07:23.270095055 +0000 UTC m=+1915.590355383" Jan 20 23:07:25 crc 
kubenswrapper[5030]: I0120 23:07:25.502197 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.503010 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.771178 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.774215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.792544 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.923736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.923841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzd5\" (UniqueName: \"kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:25 crc kubenswrapper[5030]: I0120 23:07:25.924052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.025836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glzd5\" (UniqueName: \"kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.025960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.026087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.026538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities\") pod \"redhat-operators-n5tfc\" (UID: 
\"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.026577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.046915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzd5\" (UniqueName: \"kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5\") pod \"redhat-operators-n5tfc\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.111629 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.520811 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.137:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.520816 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.137:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:07:26 crc kubenswrapper[5030]: I0120 23:07:26.575989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:27 crc kubenswrapper[5030]: I0120 23:07:27.270674 5030 generic.go:334] "Generic (PLEG): container finished" podID="1cbd351a-05f9-4807-b011-f41df17c5749" containerID="105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6" exitCode=0 Jan 20 23:07:27 crc kubenswrapper[5030]: I0120 23:07:27.270934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerDied","Data":"105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6"} Jan 20 23:07:27 crc kubenswrapper[5030]: I0120 23:07:27.270957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerStarted","Data":"db5c57f9b45fcb5aa1412f85798cdfa21d30d0e06857f23b2c5f00540119a793"} Jan 20 23:07:29 crc kubenswrapper[5030]: I0120 23:07:29.300884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerStarted","Data":"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506"} Jan 20 23:07:32 crc kubenswrapper[5030]: I0120 23:07:32.339423 5030 generic.go:334] "Generic (PLEG): container finished" podID="1cbd351a-05f9-4807-b011-f41df17c5749" containerID="27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506" exitCode=0 Jan 20 23:07:32 crc kubenswrapper[5030]: I0120 23:07:32.339538 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerDied","Data":"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506"} Jan 20 23:07:33 crc kubenswrapper[5030]: I0120 23:07:33.355072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerStarted","Data":"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216"} Jan 20 23:07:33 crc kubenswrapper[5030]: I0120 23:07:33.380590 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5tfc" podStartSLOduration=2.815467865 podStartE2EDuration="8.380566327s" podCreationTimestamp="2026-01-20 23:07:25 +0000 UTC" firstStartedPulling="2026-01-20 23:07:27.272472117 +0000 UTC m=+1919.592732405" lastFinishedPulling="2026-01-20 23:07:32.837570569 +0000 UTC m=+1925.157830867" observedRunningTime="2026-01-20 23:07:33.374594992 +0000 UTC m=+1925.694855330" watchObservedRunningTime="2026-01-20 23:07:33.380566327 +0000 UTC m=+1925.700826615" Jan 20 23:07:35 crc kubenswrapper[5030]: I0120 23:07:35.517390 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:35 crc kubenswrapper[5030]: I0120 23:07:35.519117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:35 crc kubenswrapper[5030]: I0120 23:07:35.519415 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:35 crc kubenswrapper[5030]: I0120 23:07:35.528711 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:36 crc kubenswrapper[5030]: I0120 23:07:36.113243 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:36 crc kubenswrapper[5030]: I0120 23:07:36.113319 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:36 crc kubenswrapper[5030]: I0120 23:07:36.399201 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:36 crc kubenswrapper[5030]: I0120 23:07:36.411474 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:07:36 crc kubenswrapper[5030]: I0120 23:07:36.996392 5030 scope.go:117] "RemoveContainer" containerID="69bd31968979fa52907dfd19ffa2733b98eb3b378f5267a768a9ed8e3af8a0b8" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.018818 5030 scope.go:117] "RemoveContainer" containerID="ecbdf9eff636c5fd5c83072aa27c4b23c8d55653d5d725a2825277d02afc5dc4" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.087432 5030 scope.go:117] "RemoveContainer" containerID="b80da2e4f32300b6f46c477fe14dab075793bedce95e881576e43e8d1bfce9c8" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.109771 5030 scope.go:117] "RemoveContainer" containerID="dbfeb70d4766b33590ea3682d4b59046ad66cacc8ce576e77fadaf87a7c55995" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.151487 5030 scope.go:117] "RemoveContainer" containerID="f60106e35683c5d569cb990f9fda62a442d36c2261c8ffc07a18885ab32e1f23" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.184215 
5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5tfc" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="registry-server" probeResult="failure" output=< Jan 20 23:07:37 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:07:37 crc kubenswrapper[5030]: > Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.205211 5030 scope.go:117] "RemoveContainer" containerID="aa572d94787348cd7844f38114a079565a13848b4376a45837e9cca1482b4fba" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.233254 5030 scope.go:117] "RemoveContainer" containerID="6f52ea9610953ea6508ff7b23f08c8f684930217e4f49a7a64f1aa5e4dae1b5d" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.272835 5030 scope.go:117] "RemoveContainer" containerID="62de685673aa44af6c5c901ff95c07029682edb3f840e756280af27d277bf5a7" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.319837 5030 scope.go:117] "RemoveContainer" containerID="7eaa48654e4ad98b0166fb2076c14af3b45afccd85f1add13ccfa8fda294f671" Jan 20 23:07:37 crc kubenswrapper[5030]: I0120 23:07:37.349931 5030 scope.go:117] "RemoveContainer" containerID="4f4ec23418a7fb4081ca771a7f62f966b6982ff38a1e94ba24bc598ee9b01174" Jan 20 23:07:46 crc kubenswrapper[5030]: I0120 23:07:46.194911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:46 crc kubenswrapper[5030]: I0120 23:07:46.278192 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:46 crc kubenswrapper[5030]: I0120 23:07:46.446708 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:47 crc kubenswrapper[5030]: I0120 23:07:47.536268 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5tfc" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="registry-server" containerID="cri-o://31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216" gracePeriod=2 Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.026038 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.120299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzd5\" (UniqueName: \"kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5\") pod \"1cbd351a-05f9-4807-b011-f41df17c5749\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.120426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities\") pod \"1cbd351a-05f9-4807-b011-f41df17c5749\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.120556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content\") pod \"1cbd351a-05f9-4807-b011-f41df17c5749\" (UID: \"1cbd351a-05f9-4807-b011-f41df17c5749\") " Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.121861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities" (OuterVolumeSpecName: "utilities") pod "1cbd351a-05f9-4807-b011-f41df17c5749" (UID: "1cbd351a-05f9-4807-b011-f41df17c5749"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.127104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5" (OuterVolumeSpecName: "kube-api-access-glzd5") pod "1cbd351a-05f9-4807-b011-f41df17c5749" (UID: "1cbd351a-05f9-4807-b011-f41df17c5749"). InnerVolumeSpecName "kube-api-access-glzd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.222977 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glzd5\" (UniqueName: \"kubernetes.io/projected/1cbd351a-05f9-4807-b011-f41df17c5749-kube-api-access-glzd5\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.223017 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.281235 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cbd351a-05f9-4807-b011-f41df17c5749" (UID: "1cbd351a-05f9-4807-b011-f41df17c5749"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.325002 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cbd351a-05f9-4807-b011-f41df17c5749-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.551957 5030 generic.go:334] "Generic (PLEG): container finished" podID="1cbd351a-05f9-4807-b011-f41df17c5749" containerID="31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216" exitCode=0 Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.552040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerDied","Data":"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216"} Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.552053 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5tfc" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.552983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5tfc" event={"ID":"1cbd351a-05f9-4807-b011-f41df17c5749","Type":"ContainerDied","Data":"db5c57f9b45fcb5aa1412f85798cdfa21d30d0e06857f23b2c5f00540119a793"} Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.553060 5030 scope.go:117] "RemoveContainer" containerID="31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.573365 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.592181 5030 scope.go:117] "RemoveContainer" containerID="27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.633582 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.670145 5030 scope.go:117] "RemoveContainer" containerID="105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.675037 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5tfc"] Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.741773 5030 scope.go:117] "RemoveContainer" containerID="31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216" Jan 20 23:07:48 crc kubenswrapper[5030]: E0120 23:07:48.742773 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216\": container with ID starting with 31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216 not found: ID does not exist" containerID="31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.742829 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216"} err="failed to get container status \"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216\": rpc error: code = NotFound desc = could not find container 
\"31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216\": container with ID starting with 31151087c0156218b3f3234741a7ce015c45ab375154198e8f7d253b3f9a6216 not found: ID does not exist" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.742862 5030 scope.go:117] "RemoveContainer" containerID="27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506" Jan 20 23:07:48 crc kubenswrapper[5030]: E0120 23:07:48.743205 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506\": container with ID starting with 27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506 not found: ID does not exist" containerID="27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.743232 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506"} err="failed to get container status \"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506\": rpc error: code = NotFound desc = could not find container \"27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506\": container with ID starting with 27f2802d654235a6db3e7db4a83d4b0312e34dedca9f0150e9d3a229c348c506 not found: ID does not exist" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.743250 5030 scope.go:117] "RemoveContainer" containerID="105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6" Jan 20 23:07:48 crc kubenswrapper[5030]: E0120 23:07:48.743947 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6\": container with ID starting with 105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6 not found: ID does not exist" containerID="105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6" Jan 20 23:07:48 crc kubenswrapper[5030]: I0120 23:07:48.743991 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6"} err="failed to get container status \"105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6\": rpc error: code = NotFound desc = could not find container \"105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6\": container with ID starting with 105a86987a15592dec94da35cfe1aeaaef7d469c8d09307d4c4f66a3cb4746d6 not found: ID does not exist" Jan 20 23:07:49 crc kubenswrapper[5030]: I0120 23:07:49.983918 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" path="/var/lib/kubelet/pods/1cbd351a-05f9-4807-b011-f41df17c5749/volumes" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.859359 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:07:53 crc kubenswrapper[5030]: E0120 23:07:53.861028 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="extract-utilities" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.861115 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="extract-utilities" Jan 20 23:07:53 crc kubenswrapper[5030]: E0120 23:07:53.861188 
5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="registry-server" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.861242 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="registry-server" Jan 20 23:07:53 crc kubenswrapper[5030]: E0120 23:07:53.861311 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="extract-content" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.861368 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="extract-content" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.862079 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbd351a-05f9-4807-b011-f41df17c5749" containerName="registry-server" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.863487 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.866167 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncontroller-ovncontroller-dockercfg-wj25c" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.866403 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-scripts" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.880513 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.882040 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.890192 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovncontroller-ovndbs" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.901965 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.914733 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.931691 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.932991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6wl\" (UniqueName: \"kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.933649 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.935248 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-metrics-config" Jan 20 23:07:53 crc kubenswrapper[5030]: I0120 23:07:53.945521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.011097 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.013010 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.017844 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-extra-scripts" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.019400 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.035225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.035578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.035887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: 
\"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036404 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6wl\" (UniqueName: \"kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqhv\" (UniqueName: \"kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.036896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037064 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib\") pod 
\"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037530 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.037840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5w9k\" (UniqueName: \"kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.038708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.039759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.041068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.041553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.044853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " 
pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.063063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6wl\" (UniqueName: \"kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl\") pod \"ovn-controller-ovs-ccrvp\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.139239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.139751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.139908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.139869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.139694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.140023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.140585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.140718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " 
pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.140831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.140938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.141524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.141652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2b96\" (UniqueName: \"kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.141751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5w9k\" (UniqueName: \"kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.141864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.141949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142066 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.142655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqhv\" (UniqueName: \"kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.143894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.143982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.144029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.145474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts\") pod \"ovn-controller-ndh7q\" 
(UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.146132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.146336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.147383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.152150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.158443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqhv\" (UniqueName: \"kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv\") pod \"ovn-controller-ndh7q\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.161779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5w9k\" (UniqueName: \"kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k\") pod \"ovn-controller-metrics-9254x\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.196547 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.214251 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.244887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2b96\" (UniqueName: \"kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.245356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.245426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.245448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run\") pod 
\"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.246010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.247091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.258577 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.262326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2b96\" (UniqueName: \"kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96\") pod \"ovn-controller-ndh7q-config-795n4\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.346251 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.670784 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.740787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.810173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:54 crc kubenswrapper[5030]: I0120 23:07:54.859734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.622133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q" event={"ID":"893a5628-7014-4e7d-aab7-1454984546b2","Type":"ContainerStarted","Data":"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.622599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q" event={"ID":"893a5628-7014-4e7d-aab7-1454984546b2","Type":"ContainerStarted","Data":"dc7573b0075ea670867abd2bca0b33398fd34aa544abc90a015bb035ef941292"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.622647 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.625973 5030 generic.go:334] "Generic (PLEG): container finished" podID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerID="a250ee84175492205fc032ec7ef1b2826234a8f95c9690768cde08732f54aa4b" exitCode=0 Jan 20 23:07:55 
crc kubenswrapper[5030]: I0120 23:07:55.626080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerDied","Data":"a250ee84175492205fc032ec7ef1b2826234a8f95c9690768cde08732f54aa4b"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.626123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerStarted","Data":"422e43c0a7491e0958feeeae08e497dd170261f0bda74a4f571b22520c40fdc6"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.631087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" event={"ID":"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746","Type":"ContainerStarted","Data":"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.631212 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" event={"ID":"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746","Type":"ContainerStarted","Data":"d0c951ec4d45a090e88b00c6542ff8f9b2fc57d79eee7a5ffb45f2da5adb54c3"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.633798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" event={"ID":"aad8615a-ad29-42bc-81ba-48a88ec9e901","Type":"ContainerStarted","Data":"d6822b95c4a216a9a63dcbccea5e2206e6ee671b6acd2b901417699cad643d22"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.633911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" event={"ID":"aad8615a-ad29-42bc-81ba-48a88ec9e901","Type":"ContainerStarted","Data":"a861fb543b4cc778a64bc36138e60d3aaa387b7acaf9e285e4b5a4da4afd8847"} Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.647765 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ndh7q" podStartSLOduration=2.647745438 podStartE2EDuration="2.647745438s" podCreationTimestamp="2026-01-20 23:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:07:55.639440785 +0000 UTC m=+1947.959701083" watchObservedRunningTime="2026-01-20 23:07:55.647745438 +0000 UTC m=+1947.968005736" Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.680110 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" podStartSLOduration=2.680084158 podStartE2EDuration="2.680084158s" podCreationTimestamp="2026-01-20 23:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:07:55.674094162 +0000 UTC m=+1947.994354450" watchObservedRunningTime="2026-01-20 23:07:55.680084158 +0000 UTC m=+1948.000344486" Jan 20 23:07:55 crc kubenswrapper[5030]: I0120 23:07:55.709602 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" podStartSLOduration=2.709580199 podStartE2EDuration="2.709580199s" podCreationTimestamp="2026-01-20 23:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 
23:07:55.699064863 +0000 UTC m=+1948.019325151" watchObservedRunningTime="2026-01-20 23:07:55.709580199 +0000 UTC m=+1948.029840487" Jan 20 23:07:56 crc kubenswrapper[5030]: I0120 23:07:56.646476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerStarted","Data":"1c83759a327a23747c10530750f3480ef054908978614d8ec14557542b7805c0"} Jan 20 23:07:56 crc kubenswrapper[5030]: I0120 23:07:56.647135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerStarted","Data":"8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b"} Jan 20 23:07:56 crc kubenswrapper[5030]: I0120 23:07:56.647166 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:56 crc kubenswrapper[5030]: I0120 23:07:56.678265 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" podStartSLOduration=3.678244077 podStartE2EDuration="3.678244077s" podCreationTimestamp="2026-01-20 23:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:07:56.66771593 +0000 UTC m=+1948.987976228" watchObservedRunningTime="2026-01-20 23:07:56.678244077 +0000 UTC m=+1948.998504375" Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.658557 5030 generic.go:334] "Generic (PLEG): container finished" podID="aad8615a-ad29-42bc-81ba-48a88ec9e901" containerID="d6822b95c4a216a9a63dcbccea5e2206e6ee671b6acd2b901417699cad643d22" exitCode=0 Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.658715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" event={"ID":"aad8615a-ad29-42bc-81ba-48a88ec9e901","Type":"ContainerDied","Data":"d6822b95c4a216a9a63dcbccea5e2206e6ee671b6acd2b901417699cad643d22"} Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.659058 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.975966 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.994517 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:57 crc kubenswrapper[5030]: I0120 23:07:57.994726 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" podUID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" containerName="openstack-network-exporter" containerID="cri-o://7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9" gracePeriod=30 Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.004057 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.012317 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.129582 5030 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 
'/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack-kuttl-tests/ovn-controller-ndh7q" message=< Jan 20 23:07:58 crc kubenswrapper[5030]: Exiting ovn-controller (1) [ OK ] Jan 20 23:07:58 crc kubenswrapper[5030]: > Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.129974 5030 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack-kuttl-tests/ovn-controller-ndh7q" podUID="893a5628-7014-4e7d-aab7-1454984546b2" containerName="ovn-controller" containerID="cri-o://e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.130023 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ndh7q" podUID="893a5628-7014-4e7d-aab7-1454984546b2" containerName="ovn-controller" containerID="cri-o://e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" gracePeriod=30 Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.510861 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-9254x_bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746/openstack-network-exporter/0.log" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.510928 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.515682 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cqhv\" (UniqueName: \"kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5w9k\" (UniqueName: \"kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run" (OuterVolumeSpecName: "var-run") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn\") pod \"893a5628-7014-4e7d-aab7-1454984546b2\" (UID: \"893a5628-7014-4e7d-aab7-1454984546b2\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle\") pod \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\" (UID: \"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746\") " Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.644872 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645472 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts" (OuterVolumeSpecName: "scripts") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645555 5030 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645569 5030 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645580 5030 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/893a5628-7014-4e7d-aab7-1454984546b2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645588 5030 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.645597 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.646179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config" (OuterVolumeSpecName: "config") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.649988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv" (OuterVolumeSpecName: "kube-api-access-6cqhv") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "kube-api-access-6cqhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.651184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k" (OuterVolumeSpecName: "kube-api-access-b5w9k") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "kube-api-access-b5w9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.674919 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675681 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-metrics-9254x_bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746/openstack-network-exporter/0.log" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675743 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" containerID="7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9" exitCode=2 Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" event={"ID":"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746","Type":"ContainerDied","Data":"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9"} Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" event={"ID":"bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746","Type":"ContainerDied","Data":"d0c951ec4d45a090e88b00c6542ff8f9b2fc57d79eee7a5ffb45f2da5adb54c3"} Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675847 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-metrics-9254x" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.675886 5030 scope.go:117] "RemoveContainer" containerID="7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.683430 5030 generic.go:334] "Generic (PLEG): container finished" podID="893a5628-7014-4e7d-aab7-1454984546b2" containerID="e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" exitCode=0 Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.683509 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.683526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q" event={"ID":"893a5628-7014-4e7d-aab7-1454984546b2","Type":"ContainerDied","Data":"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698"} Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.683693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q" event={"ID":"893a5628-7014-4e7d-aab7-1454984546b2","Type":"ContainerDied","Data":"dc7573b0075ea670867abd2bca0b33398fd34aa544abc90a015bb035ef941292"} Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.685523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.686127 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" secret="" err="secret \"ovncontroller-ovncontroller-dockercfg-wj25c\" not found" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.706494 5030 scope.go:117] "RemoveContainer" containerID="7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9" Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.707024 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9\": container with ID starting with 7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9 not found: ID does not exist" containerID="7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.707092 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9"} err="failed to get container status \"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9\": rpc error: code = NotFound desc = could not find container \"7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9\": container with ID starting with 7d983c088eb6b5aada8b43eaa56707af22c8edd8086b021d4020bff5a98b62a9 not found: ID does not exist" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.707121 5030 scope.go:117] "RemoveContainer" containerID="e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.727280 5030 scope.go:117] "RemoveContainer" containerID="e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.728412 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698\": container with ID starting with e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698 not found: ID does not exist" containerID="e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.728455 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698"} err="failed to get container status \"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698\": rpc error: code = NotFound desc = could not find container \"e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698\": container with ID starting with e40e6f321201070f6240b084932cdb3bbb294339eb2a47e25c2c415c07ded698 not found: ID does not exist" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.731614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" (UID: "bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746897 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/893a5628-7014-4e7d-aab7-1454984546b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746928 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cqhv\" (UniqueName: \"kubernetes.io/projected/893a5628-7014-4e7d-aab7-1454984546b2-kube-api-access-6cqhv\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746938 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746947 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746957 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5w9k\" (UniqueName: \"kubernetes.io/projected/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-kube-api-access-b5w9k\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746966 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.746974 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.749407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "893a5628-7014-4e7d-aab7-1454984546b2" (UID: "893a5628-7014-4e7d-aab7-1454984546b2"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.849233 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:07:58 crc kubenswrapper[5030]: E0120 23:07:58.849347 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts podName:9d870142-c0e3-45cd-81c1-5c6a2325be55 nodeName:}" failed. No retries permitted until 2026-01-20 23:07:59.349320323 +0000 UTC m=+1951.669580701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts") pod "ovn-controller-ovs-ccrvp" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55") : configmap "ovncontroller-scripts" not found Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.850536 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/893a5628-7014-4e7d-aab7-1454984546b2-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:58 crc kubenswrapper[5030]: I0120 23:07:58.949394 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.015080 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.023543 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-metrics-9254x"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.031581 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.039300 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run" (OuterVolumeSpecName: "var-run") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2b96\" (UniqueName: \"kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn\") pod \"aad8615a-ad29-42bc-81ba-48a88ec9e901\" (UID: \"aad8615a-ad29-42bc-81ba-48a88ec9e901\") " Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.053888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054274 5030 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054292 5030 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054302 5030 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad8615a-ad29-42bc-81ba-48a88ec9e901-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.054923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts" (OuterVolumeSpecName: "scripts") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.057606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96" (OuterVolumeSpecName: "kube-api-access-p2b96") pod "aad8615a-ad29-42bc-81ba-48a88ec9e901" (UID: "aad8615a-ad29-42bc-81ba-48a88ec9e901"). InnerVolumeSpecName "kube-api-access-p2b96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.157049 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.157081 5030 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad8615a-ad29-42bc-81ba-48a88ec9e901-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.157093 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2b96\" (UniqueName: \"kubernetes.io/projected/aad8615a-ad29-42bc-81ba-48a88ec9e901-kube-api-access-p2b96\") on node \"crc\" DevicePath \"\"" Jan 20 23:07:59 crc kubenswrapper[5030]: E0120 23:07:59.360652 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:07:59 crc kubenswrapper[5030]: E0120 23:07:59.361037 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts podName:9d870142-c0e3-45cd-81c1-5c6a2325be55 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:00.361013475 +0000 UTC m=+1952.681273803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts") pod "ovn-controller-ovs-ccrvp" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55") : configmap "ovncontroller-scripts" not found Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.699429 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.699429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4" event={"ID":"aad8615a-ad29-42bc-81ba-48a88ec9e901","Type":"ContainerDied","Data":"a861fb543b4cc778a64bc36138e60d3aaa387b7acaf9e285e4b5a4da4afd8847"} Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.700049 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a861fb543b4cc778a64bc36138e60d3aaa387b7acaf9e285e4b5a4da4afd8847" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.748136 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.758982 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ndh7q-config-795n4"] Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.974813 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="893a5628-7014-4e7d-aab7-1454984546b2" path="/var/lib/kubelet/pods/893a5628-7014-4e7d-aab7-1454984546b2/volumes" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.975747 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad8615a-ad29-42bc-81ba-48a88ec9e901" path="/var/lib/kubelet/pods/aad8615a-ad29-42bc-81ba-48a88ec9e901/volumes" Jan 20 23:07:59 crc kubenswrapper[5030]: I0120 23:07:59.976355 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" path="/var/lib/kubelet/pods/bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746/volumes" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.113804 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovs-vswitchd" containerID="cri-o://1c83759a327a23747c10530750f3480ef054908978614d8ec14557542b7805c0" gracePeriod=30 Jan 20 23:08:00 crc kubenswrapper[5030]: E0120 23:08:00.333088 5030 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 20 23:08:00 crc kubenswrapper[5030]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:08:00 crc kubenswrapper[5030]: + source /usr/local/bin/container-scripts/functions Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:08:00 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:08:00 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:08:00 crc 
kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:08:00 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:08:00 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:08:00 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:08:00 crc kubenswrapper[5030]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" message=< Jan 20 23:08:00 crc kubenswrapper[5030]: Exiting ovsdb-server (5) [ OK ] Jan 20 23:08:00 crc kubenswrapper[5030]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:08:00 crc kubenswrapper[5030]: + source /usr/local/bin/container-scripts/functions Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:08:00 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:08:00 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:08:00 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:08:00 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:08:00 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:08:00 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:08:00 crc kubenswrapper[5030]: > Jan 20 23:08:00 crc kubenswrapper[5030]: E0120 23:08:00.333153 5030 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 20 23:08:00 crc kubenswrapper[5030]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 20 23:08:00 crc kubenswrapper[5030]: + source /usr/local/bin/container-scripts/functions Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNBridge=br-int Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNRemote=tcp:localhost:6642 Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNEncapType=geneve Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNAvailabilityZones= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ EnableChassisAsGateway=true Jan 20 23:08:00 crc kubenswrapper[5030]: ++ PhysicalNetworks= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ OVNHostName= Jan 20 23:08:00 crc kubenswrapper[5030]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 20 23:08:00 crc kubenswrapper[5030]: ++ ovs_dir=/var/lib/openvswitch Jan 20 23:08:00 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 20 23:08:00 crc kubenswrapper[5030]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 20 23:08:00 crc kubenswrapper[5030]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + sleep 0.5 Jan 20 23:08:00 crc kubenswrapper[5030]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 20 23:08:00 crc kubenswrapper[5030]: + cleanup_ovsdb_server_semaphore Jan 20 23:08:00 crc kubenswrapper[5030]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 20 23:08:00 crc kubenswrapper[5030]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 20 23:08:00 crc kubenswrapper[5030]: > pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server" containerID="cri-o://8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.333191 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server" containerID="cri-o://8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b" gracePeriod=30 Jan 20 23:08:00 crc kubenswrapper[5030]: E0120 23:08:00.381815 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/ovncontroller-scripts: configmap "ovncontroller-scripts" not found Jan 20 23:08:00 crc kubenswrapper[5030]: E0120 23:08:00.381895 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts podName:9d870142-c0e3-45cd-81c1-5c6a2325be55 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:02.381876428 +0000 UTC m=+1954.702136716 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts") pod "ovn-controller-ovs-ccrvp" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55") : configmap "ovncontroller-scripts" not found Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.708661 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-ccrvp_9d870142-c0e3-45cd-81c1-5c6a2325be55/ovs-vswitchd/0.log" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709798 5030 generic.go:334] "Generic (PLEG): container finished" podID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerID="1c83759a327a23747c10530750f3480ef054908978614d8ec14557542b7805c0" exitCode=143 Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709823 5030 generic.go:334] "Generic (PLEG): container finished" podID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerID="8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b" exitCode=0 Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerDied","Data":"1c83759a327a23747c10530750f3480ef054908978614d8ec14557542b7805c0"} Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerDied","Data":"8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b"} Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" event={"ID":"9d870142-c0e3-45cd-81c1-5c6a2325be55","Type":"ContainerDied","Data":"422e43c0a7491e0958feeeae08e497dd170261f0bda74a4f571b22520c40fdc6"} Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.709886 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422e43c0a7491e0958feeeae08e497dd170261f0bda74a4f571b22520c40fdc6" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.738128 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-controller-ovs-ccrvp_9d870142-c0e3-45cd-81c1-5c6a2325be55/ovs-vswitchd/0.log" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.739203 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.890895 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891008 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib" (OuterVolumeSpecName: "var-lib") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6wl\" (UniqueName: \"kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log" (OuterVolumeSpecName: "var-log") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run\") pod \"9d870142-c0e3-45cd-81c1-5c6a2325be55\" (UID: \"9d870142-c0e3-45cd-81c1-5c6a2325be55\") " Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.891523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run" (OuterVolumeSpecName: "var-run") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.892170 5030 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.892207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts" (OuterVolumeSpecName: "scripts") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.892211 5030 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-lib\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.892262 5030 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.892272 5030 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9d870142-c0e3-45cd-81c1-5c6a2325be55-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.899925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl" (OuterVolumeSpecName: "kube-api-access-pd6wl") pod "9d870142-c0e3-45cd-81c1-5c6a2325be55" (UID: "9d870142-c0e3-45cd-81c1-5c6a2325be55"). InnerVolumeSpecName "kube-api-access-pd6wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.993446 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d870142-c0e3-45cd-81c1-5c6a2325be55-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:00 crc kubenswrapper[5030]: I0120 23:08:00.993474 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6wl\" (UniqueName: \"kubernetes.io/projected/9d870142-c0e3-45cd-81c1-5c6a2325be55-kube-api-access-pd6wl\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:01 crc kubenswrapper[5030]: I0120 23:08:01.721181 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-controller-ovs-ccrvp" Jan 20 23:08:01 crc kubenswrapper[5030]: I0120 23:08:01.769060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:08:01 crc kubenswrapper[5030]: I0120 23:08:01.780336 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-controller-ovs-ccrvp"] Jan 20 23:08:01 crc kubenswrapper[5030]: I0120 23:08:01.980780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" path="/var/lib/kubelet/pods/9d870142-c0e3-45cd-81c1-5c6a2325be55/volumes" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.794250 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.794683 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" containerName="openstackclient" containerID="cri-o://fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261" gracePeriod=2 Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.833820 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.857224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.920688 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921092 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" containerName="openstack-network-exporter" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921106 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" containerName="openstack-network-exporter" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921126 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921133 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server-init" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921150 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server-init" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921169 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893a5628-7014-4e7d-aab7-1454984546b2" containerName="ovn-controller" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921174 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="893a5628-7014-4e7d-aab7-1454984546b2" containerName="ovn-controller" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921187 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" containerName="openstackclient" Jan 20 23:08:03 crc 
kubenswrapper[5030]: I0120 23:08:03.921192 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" containerName="openstackclient" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921204 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad8615a-ad29-42bc-81ba-48a88ec9e901" containerName="ovn-config" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921210 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad8615a-ad29-42bc-81ba-48a88ec9e901" containerName="ovn-config" Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.921222 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovs-vswitchd" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921228 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovs-vswitchd" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921405 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="893a5628-7014-4e7d-aab7-1454984546b2" containerName="ovn-controller" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921425 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovs-vswitchd" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921435 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcfc1ad9-48d4-447d-88b4-1ca4ec0bb746" containerName="openstack-network-exporter" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921443 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad8615a-ad29-42bc-81ba-48a88ec9e901" containerName="ovn-config" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" containerName="openstackclient" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.921474 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d870142-c0e3-45cd-81c1-5c6a2325be55" containerName="ovsdb-server" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.922098 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.927845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.933376 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.960849 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:03 crc kubenswrapper[5030]: E0120 23:08:03.960908 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data podName:78f61cd4-3d8c-48e0-bebd-03969052e500 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:04.460889165 +0000 UTC m=+1956.781149453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data") pod "rabbitmq-cell1-server-0" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.994229 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.995343 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.995505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:03 crc kubenswrapper[5030]: I0120 23:08:03.997498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.005880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.014430 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.028389 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.042283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdb66\" (UniqueName: \"kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flwx\" (UniqueName: \"kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2p8j\" (UniqueName: \"kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " 
pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.057002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.057103 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.056730 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl"] Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.057568 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle podName:b74f69b6-bbb2-4578-a61c-9479f6802e01 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:04.557546876 +0000 UTC m=+1956.877807224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle") pod "nova-scheduler-0" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01") : secret "combined-ca-bundle" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.068231 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-ndhrl"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.087505 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.088692 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.092673 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.105640 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.106972 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.133123 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.144701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2flwx\" (UniqueName: \"kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2p8j\" (UniqueName: \"kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.159932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds292\" (UniqueName: \"kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc 
kubenswrapper[5030]: I0120 23:08:04.159970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqwr5\" (UniqueName: \"kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.160011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.160030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb66\" (UniqueName: \"kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.160890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.161836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.161982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.166573 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.174430 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.198177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2p8j\" (UniqueName: \"kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j\") pod \"nova-cell0-c9bf-account-create-update-pcwvt\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.222722 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.245102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flwx\" (UniqueName: \"kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx\") pod \"root-account-create-update-h7nmm\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.245541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb66\" (UniqueName: \"kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66\") pod \"placement-73fb-account-create-update-7rshq\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.255046 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-nqql2"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.262021 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.262690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds292\" (UniqueName: \"kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.262762 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqwr5\" (UniqueName: \"kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.262831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.262870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.264217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 
23:08:04.264310 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.264362 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data podName:6ede01a6-9221-4276-b537-212e5da27f5c nodeName:}" failed. No retries permitted until 2026-01-20 23:08:04.764347248 +0000 UTC m=+1957.084607536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data") pod "rabbitmq-server-0" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c") : configmap "rabbitmq-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.265228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.284411 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.302919 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-vnw9p"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.312697 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.340710 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.353352 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.353709 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.366841 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.380681 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqwr5\" (UniqueName: \"kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5\") pod \"glance-a2f7-account-create-update-4gsvm\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.386260 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.413446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds292\" (UniqueName: \"kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292\") pod \"nova-cell1-46ec-account-create-update-nvkl8\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.442649 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.449777 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-d6xpb"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.459210 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.468898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.469048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppkd\" (UniqueName: \"kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.469294 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.469724 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data podName:78f61cd4-3d8c-48e0-bebd-03969052e500 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:05.469706406 +0000 UTC m=+1957.789966694 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data") pod "rabbitmq-cell1-server-0" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.510030 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.526137 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="openstack-network-exporter" containerID="cri-o://143fd3fdb7a82cd6e839d0f97c95e1923c32e0d47238eb2b8ba19f425ffccf46" gracePeriod=300 Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.554611 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.555055 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="ovn-northd" containerID="cri-o://a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" gracePeriod=30 Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.556013 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="openstack-network-exporter" containerID="cri-o://024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1" gracePeriod=30 Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.568890 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z7qq5"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.571899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppkd\" (UniqueName: \"kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.573195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.573373 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.573420 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle podName:b74f69b6-bbb2-4578-a61c-9479f6802e01 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:05.57340521 +0000 UTC m=+1957.893665488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle") pod "nova-scheduler-0" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01") : secret "combined-ca-bundle" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.574425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.591634 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-z7qq5"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.643251 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.710085 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="ovsdbserver-nb" containerID="cri-o://2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" gracePeriod=300 Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.735991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppkd\" (UniqueName: \"kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd\") pod \"neutron-b757-account-create-update-dvwc9\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.754830 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-58cfg"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.773016 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.787711 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-58cfg"] Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.791302 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: E0120 23:08:04.791357 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data podName:6ede01a6-9221-4276-b537-212e5da27f5c nodeName:}" failed. No retries permitted until 2026-01-20 23:08:05.791342085 +0000 UTC m=+1958.111602373 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data") pod "rabbitmq-server-0" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c") : configmap "rabbitmq-config-data" not found Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.812672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.813263 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="openstack-network-exporter" containerID="cri-o://55836a84ea42622b1c95a5c3544f4cddcb27aff9869f6f7aca168d293dd1544e" gracePeriod=300 Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.832693 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9dztn"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.879631 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9dztn"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.913970 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.940380 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-0e14-account-create-update-kxft9"] Jan 20 23:08:04 crc kubenswrapper[5030]: I0120 23:08:04.985688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4qlnz"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.013989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4qlnz"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.040681 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-grk67"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.104276 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-grk67"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.132723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-wflcw"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.197415 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-wflcw"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.209686 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="ovsdbserver-sb" containerID="cri-o://aa914e7fae792f39c001fbef3009f7737f6fcbd15f05ee4dbed06af2baea26c7" gracePeriod=300 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.231690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.277833 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-84wng"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.312559 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-84wng"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.335787 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-q2p24"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.349954 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-jpzls"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.370405 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-jpzls"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.408684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.416657 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kzxqs"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.429631 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-kzxqs"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.445853 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446308 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-server" containerID="cri-o://222855be530361537c189bb68ea262fab03aa1e573ca39420ca757a8fc9752a9" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446706 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="swift-recon-cron" containerID="cri-o://2f419f949223bd3eecb657c2e5db8dc3d6e15172bce4f76b79a2996c7f986fab" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="rsync" containerID="cri-o://0228e6e5a2bf8d67091bb924e23d89a3dacb9f0b27fee5bc85fea465f5349cc2" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446785 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-expirer" containerID="cri-o://6e533722ba2e3c86bb244f1d172c2e5b6d85b47103226081550521e55d75cf57" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446825 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-updater" containerID="cri-o://71177140e4344999214a9b119df28b5034a95818f303e43d4d9fda79f212ab31" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446852 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-auditor" containerID="cri-o://46e37ba2eb5235b1620489492fb3a576c004b9a14e7fd13ba34cffba482905bf" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446883 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-replicator" 
containerID="cri-o://4b6cf4885689d7d5fb7caeeeabdeec6302123ed28f3ab03a8471fc3d784419c4" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-server" containerID="cri-o://2a7fd43e4183dd4379e4ba1e71ac29b4bdb0e6c7406d53a86f975b26229f1c3a" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446940 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-updater" containerID="cri-o://c6d0a29d7c5b2300d15ec8193bba8327f5c9413176c849542e224a3285bd14db" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446972 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-auditor" containerID="cri-o://4340f305b67f1fdb453229a8e50f3723634ae23ea784fed8c8391cf3ca98c768" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.446967 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-reaper" containerID="cri-o://5c9f5726d07ae41454c108e2c0d2fd019e4139eca228dc8e2ae6c5ddaee5f754" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.447002 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-replicator" containerID="cri-o://e48b459962ea7bc574b006ce6341dda88c60d32146ba90952d7090ce53463da0" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.447017 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-auditor" containerID="cri-o://e0a2f61c01c7d8a27a3ed5ef3a647a8e8fbd99b4f1d1be9c8bc289f70dc07409" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.447052 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-replicator" containerID="cri-o://ef5734bddbf1b14a4ff079c9189361ff4acf245e82b0aa907a1cab9c7c034a64" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.447082 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-server" containerID="cri-o://ba9956fcef6b04c46db28c6ae2b349ec3802723f420d58f0af21b4f46ec7c18c" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.458549 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.458814 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-log" containerID="cri-o://459e0a1d8f69b89f034afa3986d4b2888fa00baf09bf1b847ebc91cc377c8c2a" 
gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.459207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-httpd" containerID="cri-o://8e0043eba4e2ddfdb9777f320a29f7f7423ad5d5f2afe41d13e3932b5a6a9147" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.467130 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="rabbitmq" containerID="cri-o://2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b" gracePeriod=604800 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.470437 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kgl2n"] Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.519515 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.519570 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data podName:78f61cd4-3d8c-48e0-bebd-03969052e500 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:07.519555597 +0000 UTC m=+1959.839815885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data") pod "rabbitmq-cell1-server-0" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.524225 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kgl2n"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.541877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.542132 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7869b46478-p797p" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-log" containerID="cri-o://94d39046ffec8b1af080acbe58f700c3628f887d518971230088dd0fae4ee9cd" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.542544 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7869b46478-p797p" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-api" containerID="cri-o://59ac6043f93ef086250501bf602da8db89685b82321f351a7d96e1a127965ace" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.570939 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.571216 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-api" containerID="cri-o://b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.571637 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-httpd" containerID="cri-o://bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.617838 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.623773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.629294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.629653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknqp\" (UniqueName: \"kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.629750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.630057 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.630102 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle podName:b74f69b6-bbb2-4578-a61c-9479f6802e01 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:07.630086568 +0000 UTC m=+1959.950346856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle") pod "nova-scheduler-0" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01") : secret "combined-ca-bundle" not found Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.726475 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.727128 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-log" containerID="cri-o://bb03a7a9462c3d1f022ea25c6e705e371d2aa4a25dcb01ea0294ce1846049a9b" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.728153 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-httpd" containerID="cri-o://1779ccb659c7f6cf01660be8e527df0c53283c0e70f691746ab04f5da6b5a774" gracePeriod=30 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.748133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.756634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wknqp\" (UniqueName: \"kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.756805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.753600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.757692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.792648 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.797858 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531 is running failed: container process not found" containerID="2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.809525 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531 is running failed: container process not found" containerID="2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.810794 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531 is running failed: container process not found" containerID="2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.810877 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="ovsdbserver-nb" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.815371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknqp\" (UniqueName: \"kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp\") pod \"dnsmasq-dnsmasq-84b9f45d47-9ht9g\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.818667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-q4dk6"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.829950 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_076e3516-7a56-4b76-ba13-4f4f11f83af8/ovsdbserver-sb/0.log" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.829986 5030 generic.go:334] "Generic (PLEG): container finished" podID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerID="55836a84ea42622b1c95a5c3544f4cddcb27aff9869f6f7aca168d293dd1544e" exitCode=2 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.830003 5030 generic.go:334] "Generic (PLEG): container finished" podID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerID="aa914e7fae792f39c001fbef3009f7737f6fcbd15f05ee4dbed06af2baea26c7" exitCode=143 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.830043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerDied","Data":"55836a84ea42622b1c95a5c3544f4cddcb27aff9869f6f7aca168d293dd1544e"} Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.830067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerDied","Data":"aa914e7fae792f39c001fbef3009f7737f6fcbd15f05ee4dbed06af2baea26c7"} Jan 
20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.835587 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-q4dk6"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.845805 5030 generic.go:334] "Generic (PLEG): container finished" podID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerID="bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665" exitCode=0 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.845872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerDied","Data":"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665"} Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.859258 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.859318 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data podName:6ede01a6-9221-4276-b537-212e5da27f5c nodeName:}" failed. No retries permitted until 2026-01-20 23:08:07.859301389 +0000 UTC m=+1960.179561677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data") pod "rabbitmq-server-0" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c") : configmap "rabbitmq-config-data" not found Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.859907 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.861301 5030 generic.go:334] "Generic (PLEG): container finished" podID="47ec7945-d514-4036-b888-6cc63c552195" containerID="459e0a1d8f69b89f034afa3986d4b2888fa00baf09bf1b847ebc91cc377c8c2a" exitCode=143 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.861366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerDied","Data":"459e0a1d8f69b89f034afa3986d4b2888fa00baf09bf1b847ebc91cc377c8c2a"} Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.875443 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:05 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 23:08:05 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 23:08:05 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 23:08:05 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 23:08:05 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:08:05 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:08:05 crc kubenswrapper[5030]: else Jan 20 23:08:05 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:05 crc kubenswrapper[5030]: fi Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 
23:08:05 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:05 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:05 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:05 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:05 crc kubenswrapper[5030]: # support updates Jan 20 23:08:05 crc kubenswrapper[5030]: Jan 20 23:08:05 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:05 crc kubenswrapper[5030]: E0120 23:08:05.876639 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" podUID="06e5d8f2-57ee-4aba-aea9-db1da4cb32dc" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.880177 5030 generic.go:334] "Generic (PLEG): container finished" podID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerID="024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1" exitCode=2 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.880230 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerDied","Data":"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1"} Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.886919 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.897759 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_a0abaaf5-8404-4d53-ab0d-ceb843879e7b/ovsdbserver-nb/0.log" Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.897806 5030 generic.go:334] "Generic (PLEG): container finished" podID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerID="143fd3fdb7a82cd6e839d0f97c95e1923c32e0d47238eb2b8ba19f425ffccf46" exitCode=2 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.897821 5030 generic.go:334] "Generic (PLEG): container finished" podID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerID="2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" exitCode=143 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.897882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerDied","Data":"143fd3fdb7a82cd6e839d0f97c95e1923c32e0d47238eb2b8ba19f425ffccf46"} Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.897907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerDied","Data":"2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531"} Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.900003 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerID="94d39046ffec8b1af080acbe58f700c3628f887d518971230088dd0fae4ee9cd" exitCode=143 Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.900047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" 
event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerDied","Data":"94d39046ffec8b1af080acbe58f700c3628f887d518971230088dd0fae4ee9cd"} Jan 20 23:08:05 crc kubenswrapper[5030]: I0120 23:08:05.940534 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-a33f-account-create-update-k7gcm"] Jan 20 23:08:05 crc kubenswrapper[5030]: W0120 23:08:05.995048 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3559ba4b_6e2a_4df1_900f_4a0138dab333.slice/crio-73ee186f1ebbffd7604290cb324748c50fa03cef14113dee116ffdb8311833b3 WatchSource:0}: Error finding container 73ee186f1ebbffd7604290cb324748c50fa03cef14113dee116ffdb8311833b3: Status 404 returned error can't find the container with id 73ee186f1ebbffd7604290cb324748c50fa03cef14113dee116ffdb8311833b3 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.011869 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="rabbitmq" containerID="cri-o://7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4" gracePeriod=604800 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.013152 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3e6560-7767-4dcf-a6b7-c0a510179c8c" path="/var/lib/kubelet/pods/2d3e6560-7767-4dcf-a6b7-c0a510179c8c/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.013699 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="364de244-2190-47e1-bade-2c277858c97a" path="/var/lib/kubelet/pods/364de244-2190-47e1-bade-2c277858c97a/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.014205 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425b72eb-bdf3-42af-bf9e-4a2b8431b0e7" path="/var/lib/kubelet/pods/425b72eb-bdf3-42af-bf9e-4a2b8431b0e7/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.014947 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e38148-bd69-40ba-a439-0ed6e3be605a" path="/var/lib/kubelet/pods/48e38148-bd69-40ba-a439-0ed6e3be605a/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.015943 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92" path="/var/lib/kubelet/pods/5e9fb28d-51f1-4f0f-a2ee-dd4a379a0a92/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.016661 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf" path="/var/lib/kubelet/pods/6b90244e-e3da-4d2d-bfdd-e6308a4dcbdf/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.017141 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775ba99a-26a5-4fd1-a46b-0d85647a97aa" path="/var/lib/kubelet/pods/775ba99a-26a5-4fd1-a46b-0d85647a97aa/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.018339 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ab4d17-e667-42e0-bbc4-d0802a254b6d" path="/var/lib/kubelet/pods/77ab4d17-e667-42e0-bbc4-d0802a254b6d/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.019161 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a437decd-da76-4967-80ac-674e94cfed08" path="/var/lib/kubelet/pods/a437decd-da76-4967-80ac-674e94cfed08/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.019771 
5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a667d4ce-6850-4d3d-8a28-d1a24eb13ae7" path="/var/lib/kubelet/pods/a667d4ce-6850-4d3d-8a28-d1a24eb13ae7/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.020689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d8c060-85fa-4664-97ef-22370096f2b1" path="/var/lib/kubelet/pods/a7d8c060-85fa-4664-97ef-22370096f2b1/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.021495 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1382233-4692-4fff-8586-47c406224b47" path="/var/lib/kubelet/pods/b1382233-4692-4fff-8586-47c406224b47/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.022064 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11c685c-7088-4b9a-ae6a-6fcf926abf3c" path="/var/lib/kubelet/pods/c11c685c-7088-4b9a-ae6a-6fcf926abf3c/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.022567 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1bb61bc-fb1d-463c-b1e8-8949e01ca893" path="/var/lib/kubelet/pods/c1bb61bc-fb1d-463c-b1e8-8949e01ca893/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.023457 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e5244a-19c2-4e6f-bced-f44b38a0d3b7" path="/var/lib/kubelet/pods/c1e5244a-19c2-4e6f-bced-f44b38a0d3b7/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.024025 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e36085-7754-49ab-b3c8-7b3b810aff4a" path="/var/lib/kubelet/pods/e0e36085-7754-49ab-b3c8-7b3b810aff4a/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.024493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea7dea3-e338-41cb-9c57-e5c8f4e4ef30" path="/var/lib/kubelet/pods/eea7dea3-e338-41cb-9c57-e5c8f4e4ef30/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.025413 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb5e616-18ad-4dc4-b8c4-b69a6c649bb4" path="/var/lib/kubelet/pods/efb5e616-18ad-4dc4-b8c4-b69a6c649bb4/volumes" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.025973 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.026266 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" containerID="cri-o://19fb231c59f8089b6cc2f266b334a33db07302a867317497a770819abfdc1c42" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.026559 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" containerID="cri-o://2d202c8a6069a8f94f4c396d447c7c9ca8b664cf155fd902dbbcf820ccd0a8ea" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.030893 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.031066 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" 
containerName="barbican-keystone-listener-log" containerID="cri-o://4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.031316 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener" containerID="cri-o://9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043019 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="6e533722ba2e3c86bb244f1d172c2e5b6d85b47103226081550521e55d75cf57" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043322 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="71177140e4344999214a9b119df28b5034a95818f303e43d4d9fda79f212ab31" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043330 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="46e37ba2eb5235b1620489492fb3a576c004b9a14e7fd13ba34cffba482905bf" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043337 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="4b6cf4885689d7d5fb7caeeeabdeec6302123ed28f3ab03a8471fc3d784419c4" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043343 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="c6d0a29d7c5b2300d15ec8193bba8327f5c9413176c849542e224a3285bd14db" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043350 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="e0a2f61c01c7d8a27a3ed5ef3a647a8e8fbd99b4f1d1be9c8bc289f70dc07409" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043356 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="ef5734bddbf1b14a4ff079c9189361ff4acf245e82b0aa907a1cab9c7c034a64" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043364 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="5c9f5726d07ae41454c108e2c0d2fd019e4139eca228dc8e2ae6c5ddaee5f754" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043370 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="4340f305b67f1fdb453229a8e50f3723634ae23ea784fed8c8391cf3ca98c768" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043376 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="e48b459962ea7bc574b006ce6341dda88c60d32146ba90952d7090ce53463da0" exitCode=0 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043398 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"6e533722ba2e3c86bb244f1d172c2e5b6d85b47103226081550521e55d75cf57"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"71177140e4344999214a9b119df28b5034a95818f303e43d4d9fda79f212ab31"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"46e37ba2eb5235b1620489492fb3a576c004b9a14e7fd13ba34cffba482905bf"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"4b6cf4885689d7d5fb7caeeeabdeec6302123ed28f3ab03a8471fc3d784419c4"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"c6d0a29d7c5b2300d15ec8193bba8327f5c9413176c849542e224a3285bd14db"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"e0a2f61c01c7d8a27a3ed5ef3a647a8e8fbd99b4f1d1be9c8bc289f70dc07409"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043486 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"ef5734bddbf1b14a4ff079c9189361ff4acf245e82b0aa907a1cab9c7c034a64"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"5c9f5726d07ae41454c108e2c0d2fd019e4139eca228dc8e2ae6c5ddaee5f754"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"4340f305b67f1fdb453229a8e50f3723634ae23ea784fed8c8391cf3ca98c768"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.043513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"e48b459962ea7bc574b006ce6341dda88c60d32146ba90952d7090ce53463da0"} Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.057097 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.084299 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.084959 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker-log" containerID="cri-o://72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.085439 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker" containerID="cri-o://b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.111690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.164388 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.164690 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="cinder-scheduler" containerID="cri-o://f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.164818 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="probe" containerID="cri-o://87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.176521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.184764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.196920 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.197186 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-log" containerID="cri-o://61d51d6533347c2adcbfbfb7e645e966f84e627cfee7356d6c103a97b3f6d938" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.199315 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-api" containerID="cri-o://ffffc1c09fe8eaaa4c9873343803eceb97caa93ec62a7a0f5b0fb64447d616c8" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.235116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.235384 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api-log" containerID="cri-o://8afb44471792a68df8295fcfe9340c136a8e28722e34319c5bd8ccede98487ee" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.235584 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api" containerID="cri-o://76bbe65e67022afeb40494b3071315eb174c4e626acd1c77fb924d72243b1dbb" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.267529 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.268017 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api-log" containerID="cri-o://1e47d9a62be110198d6a21ed0e2f60d6739a5d24b7c3d9403df7146ed5e0e34c" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.268362 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api" containerID="cri-o://9685f0161fa2dc2a70a60068493799a7269233e5fdcb1286db91750b507a8e03" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.294087 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-fmgxs"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.320682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.320940 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-httpd" containerID="cri-o://7d91b777f27a8846dc117e3e1f244fee0123666d9f72ac9992025e702d704e39" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.321067 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-server" containerID="cri-o://1ace08f4089db5eedfbc26aaa16427e4c52aaf88c8dbe004dece963061aab561" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.347458 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-fmgxs"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.364794 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.410458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-krpzk"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.449851 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-krpzk"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.485055 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-n5h5x"] Jan 20 23:08:06 crc 
kubenswrapper[5030]: I0120 23:08:06.506271 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_a0abaaf5-8404-4d53-ab0d-ceb843879e7b/ovsdbserver-nb/0.log" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.506353 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.523828 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="galera" containerID="cri-o://f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.524180 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_076e3516-7a56-4b76-ba13-4f4f11f83af8/ovsdbserver-sb/0.log" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.524235 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.524320 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-n5h5x"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.541027 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.553250 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-8qnm7"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.564460 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-8qnm7"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.577500 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:06 crc kubenswrapper[5030]: W0120 23:08:06.580021 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec16e80e_8f86_4dfc_ac11_32f7cc4a3bd2.slice/crio-bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6 WatchSource:0}: Error finding container bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6: Status 404 returned error can't find the container with id bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.588301 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.588699 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 
23:08:06.589593 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52q5p\" (UniqueName: \"kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589761 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589890 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589922 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc 
kubenswrapper[5030]: I0120 23:08:06.589974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.589997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.590012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.590027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs\") pod \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\" (UID: \"a0abaaf5-8404-4d53-ab0d-ceb843879e7b\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.590043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qshdk\" (UniqueName: \"kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk\") pod \"076e3516-7a56-4b76-ba13-4f4f11f83af8\" (UID: \"076e3516-7a56-4b76-ba13-4f4f11f83af8\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.597119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.599125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts" (OuterVolumeSpecName: "scripts") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.600018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config" (OuterVolumeSpecName: "config") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.600302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.601896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts" (OuterVolumeSpecName: "scripts") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.602266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config" (OuterVolumeSpecName: "config") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.604158 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:06 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:08:06 crc kubenswrapper[5030]: else Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:06 crc kubenswrapper[5030]: fi Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:06 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:06 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:06 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:06 crc kubenswrapper[5030]: # support updates Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.604838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk" (OuterVolumeSpecName: "kube-api-access-qshdk") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "kube-api-access-qshdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.605525 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" podUID="ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.612716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-v84g9"] Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.613282 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:06 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: if [ -n "nova_cell1" ]; then Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell1" Jan 20 23:08:06 crc kubenswrapper[5030]: else Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:06 crc kubenswrapper[5030]: fi Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:06 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:06 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:06 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:06 crc kubenswrapper[5030]: # support updates Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.613831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p" (OuterVolumeSpecName: "kube-api-access-52q5p") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "kube-api-access-52q5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.614357 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" podUID="c5f82922-ba42-4068-b84b-adc7297a751f" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.627208 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.636864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.636920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.676778 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.125:6080/vnc_lite.html\": dial tcp 10.217.1.125:6080: connect: connection refused" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.677065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.677808 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:06 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:08:06 crc kubenswrapper[5030]: else Jan 20 23:08:06 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:06 crc kubenswrapper[5030]: fi Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:06 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:06 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:06 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:06 crc kubenswrapper[5030]: # support updates Jan 20 23:08:06 crc kubenswrapper[5030]: Jan 20 23:08:06 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:06 crc kubenswrapper[5030]: E0120 23:08:06.682953 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" podUID="8a7e9fd9-51c7-4047-9342-ac3368e079f1" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.718794 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-v84g9"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.721970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728767 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728790 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52q5p\" (UniqueName: \"kubernetes.io/projected/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-kube-api-access-52q5p\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728805 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728814 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/076e3516-7a56-4b76-ba13-4f4f11f83af8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728823 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728832 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728932 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728957 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728968 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.728983 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.729002 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qshdk\" (UniqueName: \"kubernetes.io/projected/076e3516-7a56-4b76-ba13-4f4f11f83af8-kube-api-access-qshdk\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.729021 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.756714 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-2m9r9"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.770909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.790559 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-2m9r9"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.798103 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.803734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.804462 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.818741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a0abaaf5-8404-4d53-ab0d-ceb843879e7b" (UID: "a0abaaf5-8404-4d53-ab0d-ceb843879e7b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.820909 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-5185-account-create-update-fvtmn"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.828971 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.829151 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerName="nova-scheduler-scheduler" containerID="cri-o://65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.831817 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.831838 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.831847 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.831874 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0abaaf5-8404-4d53-ab0d-ceb843879e7b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.836151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.845397 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.870706 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.870926 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.873567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hglg5"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.884406 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.884456 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.884647 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.887553 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-q5l2n"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.892418 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.892882 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-central-agent" containerID="cri-o://946c17aa01d77e8eccf38bf873615ae3de802c969d5a314ca7b0bcf0eb6374f1" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.892972 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.892988 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="sg-core" containerID="cri-o://2de53c3b859c17f6dce4f358962fc2f49e746c59df64f97dfaf9b707a5340a38" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.892980 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="proxy-httpd" containerID="cri-o://06fbce813ee541d9eeed7850ca354fca0580cf614c4eb0a8edaf5b5a22908da6" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.893027 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-notification-agent" containerID="cri-o://4f4bc6591caf71c0df2653257968c6767a5209155e845f49b6ad42788e1d5137" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.897947 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.898307 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" containerName="kube-state-metrics" containerID="cri-o://8fff461aea2537b44e18dcf720322f47cc70d29129fcd1dd6c7ce152cd4912d9" gracePeriod=30 Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.916952 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.933150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle\") pod \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.933261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret\") pod \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.933370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrcwj\" (UniqueName: \"kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj\") pod \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.933523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config\") pod \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\" (UID: \"cd8f6ca4-97c3-4971-b18f-cfbd9f599c99\") " Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.933984 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.956038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "076e3516-7a56-4b76-ba13-4f4f11f83af8" (UID: "076e3516-7a56-4b76-ba13-4f4f11f83af8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.960068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj" (OuterVolumeSpecName: "kube-api-access-qrcwj") pod "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" (UID: "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99"). InnerVolumeSpecName "kube-api-access-qrcwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.962367 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.969150 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.970772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" (UID: "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.975642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" (UID: "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.976702 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:06 crc kubenswrapper[5030]: I0120 23:08:06.992343 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.009304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" (UID: "cd8f6ca4-97c3-4971-b18f-cfbd9f599c99"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.010493 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:07 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:08:07 crc kubenswrapper[5030]: else Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:07 crc kubenswrapper[5030]: fi Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:07 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:07 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:07 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:07 crc kubenswrapper[5030]: # support updates Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.012445 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" podUID="07b2a1c0-8ab2-433a-9f20-169b898c0501" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.039456 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrcwj\" (UniqueName: \"kubernetes.io/projected/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-kube-api-access-qrcwj\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.039500 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/076e3516-7a56-4b76-ba13-4f4f11f83af8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.039511 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.039519 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.039528 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.061654 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerID="1ace08f4089db5eedfbc26aaa16427e4c52aaf88c8dbe004dece963061aab561" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.061692 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerID="7d91b777f27a8846dc117e3e1f244fee0123666d9f72ac9992025e702d704e39" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.061712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerDied","Data":"1ace08f4089db5eedfbc26aaa16427e4c52aaf88c8dbe004dece963061aab561"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.062321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerDied","Data":"7d91b777f27a8846dc117e3e1f244fee0123666d9f72ac9992025e702d704e39"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118167 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="0228e6e5a2bf8d67091bb924e23d89a3dacb9f0b27fee5bc85fea465f5349cc2" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118191 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="2a7fd43e4183dd4379e4ba1e71ac29b4bdb0e6c7406d53a86f975b26229f1c3a" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118200 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="ba9956fcef6b04c46db28c6ae2b349ec3802723f420d58f0af21b4f46ec7c18c" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118206 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="222855be530361537c189bb68ea262fab03aa1e573ca39420ca757a8fc9752a9" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"0228e6e5a2bf8d67091bb924e23d89a3dacb9f0b27fee5bc85fea465f5349cc2"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"2a7fd43e4183dd4379e4ba1e71ac29b4bdb0e6c7406d53a86f975b26229f1c3a"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"ba9956fcef6b04c46db28c6ae2b349ec3802723f420d58f0af21b4f46ec7c18c"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.118287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"222855be530361537c189bb68ea262fab03aa1e573ca39420ca757a8fc9752a9"} Jan 20 
23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.119443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" event={"ID":"07b2a1c0-8ab2-433a-9f20-169b898c0501","Type":"ContainerStarted","Data":"cdc3a3907f73feae4f9af165667d40fef503d6913b4afeb396ada1cbccc7827c"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.140512 5030 generic.go:334] "Generic (PLEG): container finished" podID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerID="72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.140602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerDied","Data":"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.189940 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b94587-03e7-4220-b417-0244e3623cd5" containerID="2de53c3b859c17f6dce4f358962fc2f49e746c59df64f97dfaf9b707a5340a38" exitCode=2 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.190047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerDied","Data":"2de53c3b859c17f6dce4f358962fc2f49e746c59df64f97dfaf9b707a5340a38"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.194446 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_076e3516-7a56-4b76-ba13-4f4f11f83af8/ovsdbserver-sb/0.log" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.194753 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.194770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"076e3516-7a56-4b76-ba13-4f4f11f83af8","Type":"ContainerDied","Data":"1e9774ee3008170e37fc8225fc5439b39532c85172bb7503fdf9acaa839534dc"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.194872 5030 scope.go:117] "RemoveContainer" containerID="55836a84ea42622b1c95a5c3544f4cddcb27aff9869f6f7aca168d293dd1544e" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.203642 5030 generic.go:334] "Generic (PLEG): container finished" podID="63645689-a02d-4ee6-b712-554130bdca30" containerID="8afb44471792a68df8295fcfe9340c136a8e28722e34319c5bd8ccede98487ee" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.203700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerDied","Data":"8afb44471792a68df8295fcfe9340c136a8e28722e34319c5bd8ccede98487ee"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.217649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" event={"ID":"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc","Type":"ContainerStarted","Data":"5f542dfa7b2bec2acde4d439d64f6d69d3426130c372d7886dc4662d11d8d49d"} Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.222036 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:07 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:08:07 crc kubenswrapper[5030]: else Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:07 crc kubenswrapper[5030]: fi Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:07 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:07 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:07 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:07 crc kubenswrapper[5030]: # support updates Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.226786 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" podUID="06e5d8f2-57ee-4aba-aea9-db1da4cb32dc" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.238816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" event={"ID":"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2","Type":"ContainerStarted","Data":"bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.282530 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerID="1e47d9a62be110198d6a21ed0e2f60d6739a5d24b7c3d9403df7146ed5e0e34c" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.282655 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:08:07 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:08:07 crc kubenswrapper[5030]: else Jan 20 23:08:07 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:08:07 crc kubenswrapper[5030]: fi Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:08:07 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:08:07 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:08:07 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:08:07 crc kubenswrapper[5030]: # support updates Jan 20 23:08:07 crc kubenswrapper[5030]: Jan 20 23:08:07 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.282704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerDied","Data":"1e47d9a62be110198d6a21ed0e2f60d6739a5d24b7c3d9403df7146ed5e0e34c"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.284322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" event={"ID":"1ab7f925-f4fc-4393-920e-38907a760bf4","Type":"ContainerStarted","Data":"c278fe9524c3246e4f26e57ff30d7760098e40cd2be96930ab5085c598b2e78e"} Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.286101 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" podUID="ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.289178 5030 generic.go:334] "Generic (PLEG): container finished" podID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerID="19fb231c59f8089b6cc2f266b334a33db07302a867317497a770819abfdc1c42" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.289252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerDied","Data":"19fb231c59f8089b6cc2f266b334a33db07302a867317497a770819abfdc1c42"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.293339 5030 generic.go:334] "Generic (PLEG): container finished" podID="1728938b-621e-44f4-91e2-c061a23aa819" containerID="87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940" exitCode=0 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.293398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerDied","Data":"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.297228 5030 generic.go:334] "Generic (PLEG): container finished" podID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerID="61d51d6533347c2adcbfbfb7e645e966f84e627cfee7356d6c103a97b3f6d938" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.297271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerDied","Data":"61d51d6533347c2adcbfbfb7e645e966f84e627cfee7356d6c103a97b3f6d938"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.298383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" event={"ID":"c5f82922-ba42-4068-b84b-adc7297a751f","Type":"ContainerStarted","Data":"26bdc796a150f85a551349cc6633149f7f6ebe1378a5a42d7ccd760c537ccf88"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.337512 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" 
containerID="8fff461aea2537b44e18dcf720322f47cc70d29129fcd1dd6c7ce152cd4912d9" exitCode=2 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.337592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c","Type":"ContainerDied","Data":"8fff461aea2537b44e18dcf720322f47cc70d29129fcd1dd6c7ce152cd4912d9"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.345400 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc14f499-5449-459d-874e-a7fe1b79e469" containerID="bb03a7a9462c3d1f022ea25c6e705e371d2aa4a25dcb01ea0294ce1846049a9b" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.345522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerDied","Data":"bb03a7a9462c3d1f022ea25c6e705e371d2aa4a25dcb01ea0294ce1846049a9b"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.350746 5030 generic.go:334] "Generic (PLEG): container finished" podID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerID="4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f" exitCode=143 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.350837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerDied","Data":"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.352788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" event={"ID":"8a7e9fd9-51c7-4047-9342-ac3368e079f1","Type":"ContainerStarted","Data":"fc747c8e47d8720f34fe64e74253a2b80472c1ada25f9050feb6df56eb3f9eab"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.359134 5030 generic.go:334] "Generic (PLEG): container finished" podID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerID="1b3bd34f6462c487c3b7f7516d49f86c3163d11b2e4ea0798004538ed4feb2ee" exitCode=1 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.359198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" event={"ID":"3559ba4b-6e2a-4df1-900f-4a0138dab333","Type":"ContainerDied","Data":"1b3bd34f6462c487c3b7f7516d49f86c3163d11b2e4ea0798004538ed4feb2ee"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.359217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" event={"ID":"3559ba4b-6e2a-4df1-900f-4a0138dab333","Type":"ContainerStarted","Data":"73ee186f1ebbffd7604290cb324748c50fa03cef14113dee116ffdb8311833b3"} Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.359784 5030 scope.go:117] "RemoveContainer" containerID="1b3bd34f6462c487c3b7f7516d49f86c3163d11b2e4ea0798004538ed4feb2ee" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.377332 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_a0abaaf5-8404-4d53-ab0d-ceb843879e7b/ovsdbserver-nb/0.log" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.377402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"a0abaaf5-8404-4d53-ab0d-ceb843879e7b","Type":"ContainerDied","Data":"b4ebde8a21ef69b50d163b6790e3f68a93b61e18d955907ca97606c78be05807"} 
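The two "Unhandled Error" dumps above both come from the same mariadb-account-create-update command (once for the placement account, once for nova_cell0); the script body is cut short by the logger, and the corresponding StartContainer calls fail with CreateContainerConfigError because the placement-db-secret and nova-cell0-db-secret secrets are not found. The pattern the script's own comments describe (create the account explicitly with CREATE USER, apply password and TLS settings with ALTER USER so the same statements also handle updates, then GRANT on the named database) would look roughly like the sketch below. This is only an illustrative reconstruction under those assumptions; the real statements are truncated in the log, and the host variable and account names here are placeholders rather than values taken from it.

# Hedged sketch of the CREATE-then-ALTER pattern the truncated script refers to;
# MYSQL_REMOTE_HOST and the 'placement' names are illustrative placeholders.
MYSQL_CMD="mysql -h ${MYSQL_REMOTE_HOST} -u root -P 3306"
$MYSQL_CMD <<SQL
-- MySQL 8 no longer creates accounts implicitly on GRANT, so create explicitly first:
CREATE USER IF NOT EXISTS 'placement'@'%';
-- Setting the password (and any TLS options) via ALTER works on both MariaDB and
-- MySQL, and is safe to re-run when the account already exists:
ALTER USER 'placement'@'%' IDENTIFIED BY '${DatabasePassword}';
GRANT ALL PRIVILEGES ON \`placement\`.* TO 'placement'@'%';
SQL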
Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.377490 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.404398 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" containerID="fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261" exitCode=137 Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.404461 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.549894 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.549962 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data podName:78f61cd4-3d8c-48e0-bebd-03969052e500 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:11.549942746 +0000 UTC m=+1963.870203034 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data") pod "rabbitmq-cell1-server-0" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.614763 5030 scope.go:117] "RemoveContainer" containerID="aa914e7fae792f39c001fbef3009f7737f6fcbd15f05ee4dbed06af2baea26c7" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.638028 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.645708 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.652401 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.652496 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle podName:b74f69b6-bbb2-4578-a61c-9479f6802e01 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:11.652473811 +0000 UTC m=+1963.972734159 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle") pod "nova-scheduler-0" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01") : secret "combined-ca-bundle" not found Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.657133 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.662820 5030 scope.go:117] "RemoveContainer" containerID="143fd3fdb7a82cd6e839d0f97c95e1923c32e0d47238eb2b8ba19f425ffccf46" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.673515 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.678848 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.701716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.709767 5030 scope.go:117] "RemoveContainer" containerID="2a38d763c0caa77b3b4fae1898fcea6589985d2f9175090530c93c3047d97531" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.712796 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.723339 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.725158 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.725199 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="ovn-northd" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.732663 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.752907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl559\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.752967 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config\") pod \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vppkd\" (UniqueName: \"kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd\") pod 
\"07b2a1c0-8ab2-433a-9f20-169b898c0501\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sldsp\" (UniqueName: \"kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp\") pod \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle\") pod \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs\") pod \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\" (UID: \"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753334 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle\") pod \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\" (UID: \"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.753379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts\") pod 
\"07b2a1c0-8ab2-433a-9f20-169b898c0501\" (UID: \"07b2a1c0-8ab2-433a-9f20-169b898c0501\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.755251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07b2a1c0-8ab2-433a-9f20-169b898c0501" (UID: "07b2a1c0-8ab2-433a-9f20-169b898c0501"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.769168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.774987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.777838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.784236 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559" (OuterVolumeSpecName: "kube-api-access-fl559") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "kube-api-access-fl559". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.792260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd" (OuterVolumeSpecName: "kube-api-access-vppkd") pod "07b2a1c0-8ab2-433a-9f20-169b898c0501" (UID: "07b2a1c0-8ab2-433a-9f20-169b898c0501"). InnerVolumeSpecName "kube-api-access-vppkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.792608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp" (OuterVolumeSpecName: "kube-api-access-sldsp") pod "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" (UID: "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c"). InnerVolumeSpecName "kube-api-access-sldsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.794149 5030 scope.go:117] "RemoveContainer" containerID="fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.829112 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.830892 5030 scope.go:117] "RemoveContainer" containerID="fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.832213 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261\": container with ID starting with fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261 not found: ID does not exist" containerID="fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.832241 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261"} err="failed to get container status \"fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261\": rpc error: code = NotFound desc = could not find container \"fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261\": container with ID starting with fc1bd89789fbcdfff592fdf01d1355ec90b510f11eea0c403079203e91249261 not found: ID does not exist" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856389 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856419 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856427 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856437 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07b2a1c0-8ab2-433a-9f20-169b898c0501-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856449 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl559\" (UniqueName: \"kubernetes.io/projected/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-kube-api-access-fl559\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856458 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vppkd\" (UniqueName: \"kubernetes.io/projected/07b2a1c0-8ab2-433a-9f20-169b898c0501-kube-api-access-vppkd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.856467 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sldsp\" (UniqueName: \"kubernetes.io/projected/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-api-access-sldsp\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.883479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" (UID: 
"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.904100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" (UID: "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.918802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" (UID: "3b8587e7-4e10-44f3-a92c-3ae1bdb9667c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.937216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.947031 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.951511 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data" (OuterVolumeSpecName: "config-data") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwk5\" (UniqueName: \"kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.957794 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config\") pod \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\" (UID: \"a3bd79be-d1b9-4637-9dca-93cd7dbb5294\") " Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958558 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958580 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958590 5030 reconciler_common.go:293] "Volume detached for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958601 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958611 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.958633 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.958795 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:08:07 crc kubenswrapper[5030]: E0120 23:08:07.958853 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data podName:6ede01a6-9221-4276-b537-212e5da27f5c nodeName:}" failed. No retries permitted until 2026-01-20 23:08:11.958827786 +0000 UTC m=+1964.279088074 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data") pod "rabbitmq-server-0" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c") : configmap "rabbitmq-config-data" not found Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.961895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.963273 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.965169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.965686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.986660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5" (OuterVolumeSpecName: "kube-api-access-dqwk5") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "kube-api-access-dqwk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:07 crc kubenswrapper[5030]: I0120 23:08:07.995668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" (UID: "cdc24c1f-8ba7-46ac-97fb-9da1274d15cf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.006651 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" path="/var/lib/kubelet/pods/076e3516-7a56-4b76-ba13-4f4f11f83af8/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.007868 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c878c14-0187-4224-bffa-99280c03dae7" path="/var/lib/kubelet/pods/0c878c14-0187-4224-bffa-99280c03dae7/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.009241 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25712fb0-ffff-4189-a2a8-f6f0e88778ff" path="/var/lib/kubelet/pods/25712fb0-ffff-4189-a2a8-f6f0e88778ff/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.010327 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288cd5d7-f17b-4184-9220-da634c367fde" path="/var/lib/kubelet/pods/288cd5d7-f17b-4184-9220-da634c367fde/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.013418 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316f5040-d923-4ef3-9fbf-f52bf080d8dc" path="/var/lib/kubelet/pods/316f5040-d923-4ef3-9fbf-f52bf080d8dc/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.014076 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62549b05-ca68-4615-9cf8-ee0a6f5c5d69" path="/var/lib/kubelet/pods/62549b05-ca68-4615-9cf8-ee0a6f5c5d69/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.017075 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="646a1e7c-67a8-4acc-8810-2ba983fb655e" path="/var/lib/kubelet/pods/646a1e7c-67a8-4acc-8810-2ba983fb655e/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.020340 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867dbc1d-4b5e-4803-b3ee-aabaf77f4282" path="/var/lib/kubelet/pods/867dbc1d-4b5e-4803-b3ee-aabaf77f4282/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.022849 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" path="/var/lib/kubelet/pods/a0abaaf5-8404-4d53-ab0d-ceb843879e7b/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.023894 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8f6ca4-97c3-4971-b18f-cfbd9f599c99" path="/var/lib/kubelet/pods/cd8f6ca4-97c3-4971-b18f-cfbd9f599c99/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.025838 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc12aecf-2818-47e1-81b0-8643dc895afb" path="/var/lib/kubelet/pods/dc12aecf-2818-47e1-81b0-8643dc895afb/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.027193 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3c7c52-f274-4811-8559-49b78f7b9bfa" path="/var/lib/kubelet/pods/ea3c7c52-f274-4811-8559-49b78f7b9bfa/volumes" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.045717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.070320 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.073523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.073874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3bd79be-d1b9-4637-9dca-93cd7dbb5294" (UID: "a3bd79be-d1b9-4637-9dca-93cd7dbb5294"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082554 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082604 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082614 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082639 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082649 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082657 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082666 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwk5\" (UniqueName: \"kubernetes.io/projected/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kube-api-access-dqwk5\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082674 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.082682 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3bd79be-d1b9-4637-9dca-93cd7dbb5294-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.090298 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.113884 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.154701 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.155035 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds292\" (UniqueName: \"kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292\") pod \"c5f82922-ba42-4068-b84b-adc7297a751f\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqwr5\" (UniqueName: \"kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5\") pod \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62x2m\" (UniqueName: \"kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m\") pod \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts\") pod \"c5f82922-ba42-4068-b84b-adc7297a751f\" (UID: \"c5f82922-ba42-4068-b84b-adc7297a751f\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs\") pod \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs\") pod \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183910 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle\") pod \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.183934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data\") pod \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\" (UID: \"49b5232f-22f5-443f-8b8d-bf8f8b5889e4\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.184004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts\") pod \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\" (UID: \"8a7e9fd9-51c7-4047-9342-ac3368e079f1\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.184391 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") 
on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.186068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a7e9fd9-51c7-4047-9342-ac3368e079f1" (UID: "8a7e9fd9-51c7-4047-9342-ac3368e079f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.186462 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5f82922-ba42-4068-b84b-adc7297a751f" (UID: "c5f82922-ba42-4068-b84b-adc7297a751f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.189741 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.197737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m" (OuterVolumeSpecName: "kube-api-access-62x2m") pod "49b5232f-22f5-443f-8b8d-bf8f8b5889e4" (UID: "49b5232f-22f5-443f-8b8d-bf8f8b5889e4"). InnerVolumeSpecName "kube-api-access-62x2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.213587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292" (OuterVolumeSpecName: "kube-api-access-ds292") pod "c5f82922-ba42-4068-b84b-adc7297a751f" (UID: "c5f82922-ba42-4068-b84b-adc7297a751f"). InnerVolumeSpecName "kube-api-access-ds292". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.226312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data" (OuterVolumeSpecName: "config-data") pod "49b5232f-22f5-443f-8b8d-bf8f8b5889e4" (UID: "49b5232f-22f5-443f-8b8d-bf8f8b5889e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.226710 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5" (OuterVolumeSpecName: "kube-api-access-lqwr5") pod "8a7e9fd9-51c7-4047-9342-ac3368e079f1" (UID: "8a7e9fd9-51c7-4047-9342-ac3368e079f1"). InnerVolumeSpecName "kube-api-access-lqwr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.280722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49b5232f-22f5-443f-8b8d-bf8f8b5889e4" (UID: "49b5232f-22f5-443f-8b8d-bf8f8b5889e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.282080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "49b5232f-22f5-443f-8b8d-bf8f8b5889e4" (UID: "49b5232f-22f5-443f-8b8d-bf8f8b5889e4"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs\") pod \"bbd870c6-92ea-4c50-a0dd-77e716077572\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rn29\" (UniqueName: \"kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29\") pod \"128b8307-03b0-488a-9e89-83c29af6dd9b\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285630 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs\") pod \"128b8307-03b0-488a-9e89-83c29af6dd9b\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnhg\" (UniqueName: \"kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg\") pod \"bbd870c6-92ea-4c50-a0dd-77e716077572\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom\") pod \"128b8307-03b0-488a-9e89-83c29af6dd9b\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data\") pod \"128b8307-03b0-488a-9e89-83c29af6dd9b\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.285952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom\") pod \"bbd870c6-92ea-4c50-a0dd-77e716077572\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data\") pod \"bbd870c6-92ea-4c50-a0dd-77e716077572\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle\") pod \"bbd870c6-92ea-4c50-a0dd-77e716077572\" (UID: \"bbd870c6-92ea-4c50-a0dd-77e716077572\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle\") pod \"128b8307-03b0-488a-9e89-83c29af6dd9b\" (UID: \"128b8307-03b0-488a-9e89-83c29af6dd9b\") " Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286674 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62x2m\" (UniqueName: \"kubernetes.io/projected/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-kube-api-access-62x2m\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286694 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5f82922-ba42-4068-b84b-adc7297a751f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286704 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286713 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286724 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286732 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a7e9fd9-51c7-4047-9342-ac3368e079f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286743 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds292\" (UniqueName: \"kubernetes.io/projected/c5f82922-ba42-4068-b84b-adc7297a751f-kube-api-access-ds292\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.286752 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqwr5\" (UniqueName: \"kubernetes.io/projected/8a7e9fd9-51c7-4047-9342-ac3368e079f1-kube-api-access-lqwr5\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.287663 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs" (OuterVolumeSpecName: "logs") pod "bbd870c6-92ea-4c50-a0dd-77e716077572" (UID: "bbd870c6-92ea-4c50-a0dd-77e716077572"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.288028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs" (OuterVolumeSpecName: "logs") pod "128b8307-03b0-488a-9e89-83c29af6dd9b" (UID: "128b8307-03b0-488a-9e89-83c29af6dd9b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.300025 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29" (OuterVolumeSpecName: "kube-api-access-8rn29") pod "128b8307-03b0-488a-9e89-83c29af6dd9b" (UID: "128b8307-03b0-488a-9e89-83c29af6dd9b"). InnerVolumeSpecName "kube-api-access-8rn29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.300090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg" (OuterVolumeSpecName: "kube-api-access-ttnhg") pod "bbd870c6-92ea-4c50-a0dd-77e716077572" (UID: "bbd870c6-92ea-4c50-a0dd-77e716077572"). InnerVolumeSpecName "kube-api-access-ttnhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.306883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "128b8307-03b0-488a-9e89-83c29af6dd9b" (UID: "128b8307-03b0-488a-9e89-83c29af6dd9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.308145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bbd870c6-92ea-4c50-a0dd-77e716077572" (UID: "bbd870c6-92ea-4c50-a0dd-77e716077572"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.308664 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "49b5232f-22f5-443f-8b8d-bf8f8b5889e4" (UID: "49b5232f-22f5-443f-8b8d-bf8f8b5889e4"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.361531 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "128b8307-03b0-488a-9e89-83c29af6dd9b" (UID: "128b8307-03b0-488a-9e89-83c29af6dd9b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.386276 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.386471 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="11f45a61-64e0-41ac-85bd-660c5a2c83be" containerName="memcached" containerID="cri-o://1cada3db4af66ec4d1c27414fa05f35d51c9d85cd6c0e4252f5af47988c2a0b3" gracePeriod=30 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389720 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/49b5232f-22f5-443f-8b8d-bf8f8b5889e4-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389743 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389752 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbd870c6-92ea-4c50-a0dd-77e716077572-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389761 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rn29\" (UniqueName: \"kubernetes.io/projected/128b8307-03b0-488a-9e89-83c29af6dd9b-kube-api-access-8rn29\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389771 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/128b8307-03b0-488a-9e89-83c29af6dd9b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389780 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnhg\" (UniqueName: \"kubernetes.io/projected/bbd870c6-92ea-4c50-a0dd-77e716077572-kube-api-access-ttnhg\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389788 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.389796 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.425871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data" (OuterVolumeSpecName: "config-data") pod "128b8307-03b0-488a-9e89-83c29af6dd9b" (UID: "128b8307-03b0-488a-9e89-83c29af6dd9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.441357 5030 generic.go:334] "Generic (PLEG): container finished" podID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerID="f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.441409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerDied","Data":"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.441434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"a3bd79be-d1b9-4637-9dca-93cd7dbb5294","Type":"ContainerDied","Data":"a2da9c9ec0ad052792dfb644ad3b4a584aeddba98b2deeefbfaaf9f4edbce253"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.441449 5030 scope.go:117] "RemoveContainer" containerID="f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.441578 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.445858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" event={"ID":"07b2a1c0-8ab2-433a-9f20-169b898c0501","Type":"ContainerDied","Data":"cdc3a3907f73feae4f9af165667d40fef503d6913b4afeb396ada1cbccc7827c"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.445904 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.470013 5030 generic.go:334] "Generic (PLEG): container finished" podID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerID="9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.470077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerDied","Data":"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.470102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" event={"ID":"bbd870c6-92ea-4c50-a0dd-77e716077572","Type":"ContainerDied","Data":"4c4ca9d754f745d5dcc4840a067783a2ba9decd45cad5e204e08ae9586b7270c"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.470161 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.471614 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.475446 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerID="cdd063454e35efa5256bac14a74c8c4544724bce8bf98d0ecc179f865f7e95de" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.475506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" event={"ID":"1ab7f925-f4fc-4393-920e-38907a760bf4","Type":"ContainerDied","Data":"cdd063454e35efa5256bac14a74c8c4544724bce8bf98d0ecc179f865f7e95de"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.475521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data" (OuterVolumeSpecName: "config-data") pod "bbd870c6-92ea-4c50-a0dd-77e716077572" (UID: "bbd870c6-92ea-4c50-a0dd-77e716077572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.491061 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-lgmp8"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.500474 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b8307-03b0-488a-9e89-83c29af6dd9b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.500502 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.514694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbd870c6-92ea-4c50-a0dd-77e716077572" (UID: "bbd870c6-92ea-4c50-a0dd-77e716077572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.532891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" event={"ID":"8a7e9fd9-51c7-4047-9342-ac3368e079f1","Type":"ContainerDied","Data":"fc747c8e47d8720f34fe64e74253a2b80472c1ada25f9050feb6df56eb3f9eab"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.533009 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.572589 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b"] Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.572943 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="ovsdbserver-sb" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.572955 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="ovsdbserver-sb" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.572968 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.572974 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.572987 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.572993 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="galera" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573009 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="galera" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573015 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-httpd" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573020 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-httpd" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573031 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573046 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="ovsdbserver-nb" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573051 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="ovsdbserver-nb" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573060 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 
23:08:08.573075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-server" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573080 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-server" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573087 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="mysql-bootstrap" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573093 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="mysql-bootstrap" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573104 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener-log" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573110 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener-log" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" containerName="kube-state-metrics" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573124 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" containerName="kube-state-metrics" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573139 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker-log" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573144 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker-log" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.573153 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573324 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573335 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" containerName="galera" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573344 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="ovsdbserver-nb" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener-log" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-httpd" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573373 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" containerName="kube-state-metrics" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573386 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" containerName="barbican-keystone-listener" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573393 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0abaaf5-8404-4d53-ab0d-ceb843879e7b" containerName="openstack-network-exporter" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="076e3516-7a56-4b76-ba13-4f4f11f83af8" containerName="ovsdbserver-sb" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573413 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" containerName="proxy-server" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573428 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker-log" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573437 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerName="barbican-worker" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.573888 5030 generic.go:334] "Generic (PLEG): container finished" podID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" containerID="68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.574021 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.585531 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.588141 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.602012 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd870c6-92ea-4c50-a0dd-77e716077572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.623061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"49b5232f-22f5-443f-8b8d-bf8f8b5889e4","Type":"ContainerDied","Data":"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.623098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"49b5232f-22f5-443f-8b8d-bf8f8b5889e4","Type":"ContainerDied","Data":"edf244324b0d100ade396554331be9e7b568a487f3b6827277b6cd8e62cd010a"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.623110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"3b8587e7-4e10-44f3-a92c-3ae1bdb9667c","Type":"ContainerDied","Data":"e0913b7a61fba5f555780dc8ee66b46a2eb1f575b2c36c64e4b482cc25cf4906"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.623121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8" event={"ID":"c5f82922-ba42-4068-b84b-adc7297a751f","Type":"ContainerDied","Data":"26bdc796a150f85a551349cc6633149f7f6ebe1378a5a42d7ccd760c537ccf88"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.623135 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.629563 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kqx2p"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.641852 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.641993 5030 scope.go:117] "RemoveContainer" containerID="b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.646857 5030 generic.go:334] "Generic (PLEG): container finished" podID="128b8307-03b0-488a-9e89-83c29af6dd9b" containerID="b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.646912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerDied","Data":"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.646933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" event={"ID":"128b8307-03b0-488a-9e89-83c29af6dd9b","Type":"ContainerDied","Data":"a9455d4abf69d4d830bfd00a0ad36891faafe3a49ee298915dc3eeeb2bdd3743"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.646986 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.656095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" event={"ID":"cdc24c1f-8ba7-46ac-97fb-9da1274d15cf","Type":"ContainerDied","Data":"a06a815ed889dad6f5811fac204d7f02ab487d8f58ddbdfe6d158f07d14d0a34"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.656216 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.666845 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b94587-03e7-4220-b417-0244e3623cd5" containerID="06fbce813ee541d9eeed7850ca354fca0580cf614c4eb0a8edaf5b5a22908da6" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.666877 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b94587-03e7-4220-b417-0244e3623cd5" containerID="946c17aa01d77e8eccf38bf873615ae3de802c969d5a314ca7b0bcf0eb6374f1" exitCode=0 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.666953 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerDied","Data":"06fbce813ee541d9eeed7850ca354fca0580cf614c4eb0a8edaf5b5a22908da6"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.667006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerDied","Data":"946c17aa01d77e8eccf38bf873615ae3de802c969d5a314ca7b0bcf0eb6374f1"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.676761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-mrxvh"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.709438 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh9zw\" (UniqueName: \"kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.709524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.732739 5030 scope.go:117] "RemoveContainer" containerID="f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.751736 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-mrxvh"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.751790 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kqx2p"] Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.757308 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863\": container with ID starting with f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863 not found: ID does not exist" containerID="f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.757350 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863"} err="failed to get container status \"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863\": rpc error: code = NotFound desc = could not find container \"f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863\": container with ID starting with f9125a8e056c241ba3ecc8ab6eed158313bbc4eae392988d937ab45defc16863 not found: ID does not exist" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.757373 5030 scope.go:117] "RemoveContainer" containerID="b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.757553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.758394 5030 generic.go:334] "Generic (PLEG): container finished" podID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerID="4b0d4d83939b8a57f9922652cba204b62e772b9ed6fd96ffea1528b8ae360c41" exitCode=1 Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.758593 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77\": container with ID starting with b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77 not found: ID does not exist" containerID="b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.758638 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77"} err="failed to get container status \"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77\": rpc error: code = NotFound desc = could not find container \"b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77\": container with ID starting with b0e41ea8ff28c7473b445e12018325b5cf0e9b0b6d55e693a8b950412cfbdc77 not found: ID does not exist" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.758667 5030 scope.go:117] "RemoveContainer" containerID="9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.759287 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-h7nmm" secret="" err="secret \"galera-openstack-dockercfg-c2dbx\" not found" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.759314 5030 scope.go:117] "RemoveContainer" containerID="4b0d4d83939b8a57f9922652cba204b62e772b9ed6fd96ffea1528b8ae360c41" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.759590 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-h7nmm_openstack-kuttl-tests(3559ba4b-6e2a-4df1-900f-4a0138dab333)\"" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.759639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" event={"ID":"3559ba4b-6e2a-4df1-900f-4a0138dab333","Type":"ContainerDied","Data":"4b0d4d83939b8a57f9922652cba204b62e772b9ed6fd96ffea1528b8ae360c41"} Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.784855 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.785060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" podUID="e0ca1870-fde9-4d58-9593-555c107cada4" containerName="keystone-api" containerID="cri-o://71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858" gracePeriod=30 Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.796028 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.814224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.814366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9zw\" (UniqueName: \"kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.814613 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.814717 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:09.314701277 +0000 UTC m=+1961.634961565 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : configmap "openstack-scripts" not found Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.814763 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.814795 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts podName:3559ba4b-6e2a-4df1-900f-4a0138dab333 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:09.314783139 +0000 UTC m=+1961.635043507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts") pod "root-account-create-update-h7nmm" (UID: "3559ba4b-6e2a-4df1-900f-4a0138dab333") : configmap "openstack-scripts" not found Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.831656 5030 projected.go:194] Error preparing data for projected volume kube-api-access-sh9zw for pod openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.831715 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:09.331698133 +0000 UTC m=+1961.651958421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sh9zw" (UniqueName: "kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.873775 5030 scope.go:117] "RemoveContainer" containerID="4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.874685 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.890916 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-p58br"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.900574 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-p58br"] Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.906366 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b"] Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.918944 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-sh9zw operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" podUID="e7476965-cef4-4fe9-a85b-f22312a83d52" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.923124 5030 scope.go:117] "RemoveContainer" containerID="9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.928438 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654\": container with ID starting with 9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654 not found: ID does not exist" containerID="9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.928592 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654"} err="failed to get container status \"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654\": rpc error: code = NotFound desc = could not find container \"9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654\": container with ID starting with 9fcef5bb10bc0f76675b28df6b7242170fa10c2e7d377591672b4db441f8a654 not found: ID does not exist" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.932936 5030 scope.go:117] "RemoveContainer" containerID="4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f" Jan 20 23:08:08 crc kubenswrapper[5030]: E0120 23:08:08.935006 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f\": container with ID starting with 4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f not found: ID does not exist" containerID="4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.935123 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f"} err="failed to get container status \"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f\": rpc error: code = NotFound desc = could not find container \"4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f\": container with ID starting with 4f38b5a344b1f202e5c8a24ab0113aed84acc42e572b88dfe648cfd86fc0dc6f not found: ID does not exist" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.935199 5030 scope.go:117] "RemoveContainer" containerID="68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2" Jan 20 23:08:08 crc kubenswrapper[5030]: I0120 23:08:08.968611 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.007532 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-a2f7-account-create-update-4gsvm"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.013388 5030 scope.go:117] "RemoveContainer" containerID="68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2" Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.014153 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2\": container with ID starting with 68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2 not found: ID does not exist" containerID="68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.014203 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2"} err="failed to get container status \"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2\": rpc error: code = NotFound desc = could not find container \"68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2\": container with ID starting with 68aba122543fc2cde94a4194512c12897ed28fdcdd2ea3c2f997cce1c69f0ad2 not found: ID does not exist" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.014245 5030 scope.go:117] "RemoveContainer" containerID="8fff461aea2537b44e18dcf720322f47cc70d29129fcd1dd6c7ce152cd4912d9" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.037125 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.038110 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.044661 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-46ec-account-create-update-nvkl8"] Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.045239 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.050985 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.051062 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerName="nova-scheduler-scheduler" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.053546 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.101:9696/\": dial tcp 10.217.1.101:9696: connect: connection refused" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.063782 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.067480 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-b757-account-create-update-dvwc9"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.074210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.080483 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.092427 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.097117 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.117186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.124324 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.128980 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.134078 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-574ddc89bb-w4hm5"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.150963 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.158290 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6c6dffc9bb-hmplz"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.169815 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.173551 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-74ccf5ddc7-wj4px"] Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.320699 5030 scope.go:117] "RemoveContainer" containerID="b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.364821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9zw\" (UniqueName: \"kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.364937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.365076 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.365119 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:10.365105746 +0000 UTC m=+1962.685366034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : configmap "openstack-scripts" not found Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.365580 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.365611 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts podName:3559ba4b-6e2a-4df1-900f-4a0138dab333 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:10.365604178 +0000 UTC m=+1962.685864466 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts") pod "root-account-create-update-h7nmm" (UID: "3559ba4b-6e2a-4df1-900f-4a0138dab333") : configmap "openstack-scripts" not found Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.375993 5030 projected.go:194] Error preparing data for projected volume kube-api-access-sh9zw for pod openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.376054 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:10.376036123 +0000 UTC m=+1962.696296411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sh9zw" (UniqueName: "kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.405911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="galera" containerID="cri-o://3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5" gracePeriod=30 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.469001 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.99:8776/healthcheck\": read tcp 10.217.0.2:58158->10.217.1.99:8776: read: connection reset by peer" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.771456 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b94587-03e7-4220-b417-0244e3623cd5" containerID="4f4bc6591caf71c0df2653257968c6767a5209155e845f49b6ad42788e1d5137" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.771504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerDied","Data":"4f4bc6591caf71c0df2653257968c6767a5209155e845f49b6ad42788e1d5137"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.774403 5030 generic.go:334] "Generic (PLEG): container finished" podID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerID="ffffc1c09fe8eaaa4c9873343803eceb97caa93ec62a7a0f5b0fb64447d616c8" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.774448 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerDied","Data":"ffffc1c09fe8eaaa4c9873343803eceb97caa93ec62a7a0f5b0fb64447d616c8"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.789703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" event={"ID":"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc","Type":"ContainerDied","Data":"5f542dfa7b2bec2acde4d439d64f6d69d3426130c372d7886dc4662d11d8d49d"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 
23:08:09.789741 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f542dfa7b2bec2acde4d439d64f6d69d3426130c372d7886dc4662d11d8d49d" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.791131 5030 generic.go:334] "Generic (PLEG): container finished" podID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerID="2d202c8a6069a8f94f4c396d447c7c9ca8b664cf155fd902dbbcf820ccd0a8ea" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.791168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerDied","Data":"2d202c8a6069a8f94f4c396d447c7c9ca8b664cf155fd902dbbcf820ccd0a8ea"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.800358 5030 generic.go:334] "Generic (PLEG): container finished" podID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerID="65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.800459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b74f69b6-bbb2-4578-a61c-9479f6802e01","Type":"ContainerDied","Data":"65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.803178 5030 generic.go:334] "Generic (PLEG): container finished" podID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerID="59ac6043f93ef086250501bf602da8db89685b82321f351a7d96e1a127965ace" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.803228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerDied","Data":"59ac6043f93ef086250501bf602da8db89685b82321f351a7d96e1a127965ace"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.803262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7869b46478-p797p" event={"ID":"4ee5d591-43e7-437c-9b55-b9132f22246a","Type":"ContainerDied","Data":"e233ca221348f7805dd360b2f065fe3a1854fe954b610e9cacf74feb82b6f310"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.803273 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e233ca221348f7805dd360b2f065fe3a1854fe954b610e9cacf74feb82b6f310" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.805250 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerID="9685f0161fa2dc2a70a60068493799a7269233e5fdcb1286db91750b507a8e03" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.805290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerDied","Data":"9685f0161fa2dc2a70a60068493799a7269233e5fdcb1286db91750b507a8e03"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.807210 5030 generic.go:334] "Generic (PLEG): container finished" podID="47ec7945-d514-4036-b888-6cc63c552195" containerID="8e0043eba4e2ddfdb9777f320a29f7f7423ad5d5f2afe41d13e3932b5a6a9147" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.807252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerDied","Data":"8e0043eba4e2ddfdb9777f320a29f7f7423ad5d5f2afe41d13e3932b5a6a9147"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.814895 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc14f499-5449-459d-874e-a7fe1b79e469" containerID="1779ccb659c7f6cf01660be8e527df0c53283c0e70f691746ab04f5da6b5a774" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.814951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerDied","Data":"1779ccb659c7f6cf01660be8e527df0c53283c0e70f691746ab04f5da6b5a774"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.832951 5030 generic.go:334] "Generic (PLEG): container finished" podID="11f45a61-64e0-41ac-85bd-660c5a2c83be" containerID="1cada3db4af66ec4d1c27414fa05f35d51c9d85cd6c0e4252f5af47988c2a0b3" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.833024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"11f45a61-64e0-41ac-85bd-660c5a2c83be","Type":"ContainerDied","Data":"1cada3db4af66ec4d1c27414fa05f35d51c9d85cd6c0e4252f5af47988c2a0b3"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.846898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" event={"ID":"1ab7f925-f4fc-4393-920e-38907a760bf4","Type":"ContainerStarted","Data":"9572b28bc85be6e8020d2b2f0a24def1773727783a81e9334fe40d21f3e8e65b"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.848432 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.852018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" event={"ID":"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2","Type":"ContainerDied","Data":"bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.852060 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb0be371f8766b46fb8a8eb9da2275bcc053b6f37c8f7c73e59249190e3df8c6" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.870332 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" podStartSLOduration=4.870298629 podStartE2EDuration="4.870298629s" podCreationTimestamp="2026-01-20 23:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:08:09.866588419 +0000 UTC m=+1962.186848707" watchObservedRunningTime="2026-01-20 23:08:09.870298629 +0000 UTC m=+1962.190558907" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.880030 5030 generic.go:334] "Generic (PLEG): container finished" podID="63645689-a02d-4ee6-b712-554130bdca30" containerID="76bbe65e67022afeb40494b3071315eb174c4e626acd1c77fb924d72243b1dbb" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.880122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerDied","Data":"76bbe65e67022afeb40494b3071315eb174c4e626acd1c77fb924d72243b1dbb"} Jan 20 23:08:09 crc 
kubenswrapper[5030]: I0120 23:08:09.891428 5030 generic.go:334] "Generic (PLEG): container finished" podID="480916ab-abdd-474b-a001-12e99c3e1e42" containerID="0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" exitCode=0 Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.891516 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.893524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"480916ab-abdd-474b-a001-12e99c3e1e42","Type":"ContainerDied","Data":"0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd"} Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.894993 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-h7nmm" secret="" err="secret \"galera-openstack-dockercfg-c2dbx\" not found" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.895028 5030 scope.go:117] "RemoveContainer" containerID="4b0d4d83939b8a57f9922652cba204b62e772b9ed6fd96ffea1528b8ae360c41" Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.895200 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-h7nmm_openstack-kuttl-tests(3559ba4b-6e2a-4df1-900f-4a0138dab333)\"" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.922092 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.932469 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.956872 5030 scope.go:117] "RemoveContainer" containerID="72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.958245 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.958950 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.987357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts\") pod \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.988385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06e5d8f2-57ee-4aba-aea9-db1da4cb32dc" (UID: "06e5d8f2-57ee-4aba-aea9-db1da4cb32dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.988681 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b2a1c0-8ab2-433a-9f20-169b898c0501" path="/var/lib/kubelet/pods/07b2a1c0-8ab2-433a-9f20-169b898c0501/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.988981 5030 scope.go:117] "RemoveContainer" containerID="b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.989102 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128b8307-03b0-488a-9e89-83c29af6dd9b" path="/var/lib/kubelet/pods/128b8307-03b0-488a-9e89-83c29af6dd9b/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.989156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdb66\" (UniqueName: \"kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66\") pod \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\" (UID: \"06e5d8f2-57ee-4aba-aea9-db1da4cb32dc\") " Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.989762 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8587e7-4e10-44f3-a92c-3ae1bdb9667c" path="/var/lib/kubelet/pods/3b8587e7-4e10-44f3-a92c-3ae1bdb9667c/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.990294 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49108146-06bd-4395-99a7-3b3d37ce1920" path="/var/lib/kubelet/pods/49108146-06bd-4395-99a7-3b3d37ce1920/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.990510 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b\": container with ID starting with b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b not found: ID does not exist" containerID="b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.990556 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b"} err="failed to get container status \"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b\": rpc error: code = NotFound desc = could not find container \"b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b\": container with ID starting with b907239be444a9423c0f1c089334d96acd30b0631822f0e4c0784e1217a02a7b not found: ID does not exist" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.990585 5030 scope.go:117] "RemoveContainer" containerID="72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571" Jan 20 23:08:09 crc kubenswrapper[5030]: E0120 23:08:09.990890 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571\": container with ID starting with 72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571 not found: ID does not exist" containerID="72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.990916 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571"} err="failed to get 
container status \"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571\": rpc error: code = NotFound desc = could not find container \"72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571\": container with ID starting with 72b1998c6c8dee12853739e805d1cf361d95b2e26677c4f5d66171c955ca2571 not found: ID does not exist" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.990930 5030 scope.go:117] "RemoveContainer" containerID="1ace08f4089db5eedfbc26aaa16427e4c52aaf88c8dbe004dece963061aab561" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.991321 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b5232f-22f5-443f-8b8d-bf8f8b5889e4" path="/var/lib/kubelet/pods/49b5232f-22f5-443f-8b8d-bf8f8b5889e4/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.991867 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72208017-a92b-44ff-9f45-bafc84173dfd" path="/var/lib/kubelet/pods/72208017-a92b-44ff-9f45-bafc84173dfd/volumes" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.992060 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:09 crc kubenswrapper[5030]: I0120 23:08:09.996885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66" (OuterVolumeSpecName: "kube-api-access-xdb66") pod "06e5d8f2-57ee-4aba-aea9-db1da4cb32dc" (UID: "06e5d8f2-57ee-4aba-aea9-db1da4cb32dc"). InnerVolumeSpecName "kube-api-access-xdb66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.003093 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a7e9fd9-51c7-4047-9342-ac3368e079f1" path="/var/lib/kubelet/pods/8a7e9fd9-51c7-4047-9342-ac3368e079f1/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.003566 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bd79be-d1b9-4637-9dca-93cd7dbb5294" path="/var/lib/kubelet/pods/a3bd79be-d1b9-4637-9dca-93cd7dbb5294/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.004183 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd870c6-92ea-4c50-a0dd-77e716077572" path="/var/lib/kubelet/pods/bbd870c6-92ea-4c50-a0dd-77e716077572/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.005382 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beebe328-1fb3-442d-90aa-aac7c2cf9457" path="/var/lib/kubelet/pods/beebe328-1fb3-442d-90aa-aac7c2cf9457/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.005989 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f82922-ba42-4068-b84b-adc7297a751f" path="/var/lib/kubelet/pods/c5f82922-ba42-4068-b84b-adc7297a751f/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.006461 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc24c1f-8ba7-46ac-97fb-9da1274d15cf" path="/var/lib/kubelet/pods/cdc24c1f-8ba7-46ac-97fb-9da1274d15cf/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.007663 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d44bc258-6678-4851-a91f-f6e382262bf9" path="/var/lib/kubelet/pods/d44bc258-6678-4851-a91f-f6e382262bf9/volumes" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 
23:08:10.047795 5030 scope.go:117] "RemoveContainer" containerID="7d91b777f27a8846dc117e3e1f244fee0123666d9f72ac9992025e702d704e39" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.064849 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.070247 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.088175 5030 scope.go:117] "RemoveContainer" containerID="1b3bd34f6462c487c3b7f7516d49f86c3163d11b2e4ea0798004538ed4feb2ee" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2p8j\" (UniqueName: \"kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j\") pod \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts\") pod \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\" (UID: \"ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jft59\" (UniqueName: \"kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 
23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.093687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs\") pod \"4ee5d591-43e7-437c-9b55-b9132f22246a\" (UID: \"4ee5d591-43e7-437c-9b55-b9132f22246a\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.094103 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdb66\" (UniqueName: \"kubernetes.io/projected/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc-kube-api-access-xdb66\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.096384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2" (UID: "ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.097054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs" (OuterVolumeSpecName: "logs") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.100737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j" (OuterVolumeSpecName: "kube-api-access-r2p8j") pod "ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2" (UID: "ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2"). InnerVolumeSpecName "kube-api-access-r2p8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.101601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59" (OuterVolumeSpecName: "kube-api-access-jft59") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "kube-api-access-jft59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.110122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts" (OuterVolumeSpecName: "scripts") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.117636 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.133978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.153782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.163773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data" (OuterVolumeSpecName: "config-data") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.196988 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs\") pod \"0037fd82-93c2-4ba4-87c2-563388f2b56c\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qffw\" (UniqueName: \"kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw\") pod \"0037fd82-93c2-4ba4-87c2-563388f2b56c\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzq55\" (UniqueName: \"kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197161 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs\") pod \"11f45a61-64e0-41ac-85bd-660c5a2c83be\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197307 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data\") pod \"0037fd82-93c2-4ba4-87c2-563388f2b56c\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197378 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle\") pod \"0037fd82-93c2-4ba4-87c2-563388f2b56c\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle\") pod \"11f45a61-64e0-41ac-85bd-660c5a2c83be\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: 
\"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data\") pod \"11f45a61-64e0-41ac-85bd-660c5a2c83be\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fms9k\" (UniqueName: \"kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k\") pod \"11f45a61-64e0-41ac-85bd-660c5a2c83be\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs\") pod \"0037fd82-93c2-4ba4-87c2-563388f2b56c\" (UID: \"0037fd82-93c2-4ba4-87c2-563388f2b56c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6fnp\" (UniqueName: \"kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run\") pod \"dc14f499-5449-459d-874e-a7fe1b79e469\" (UID: \"dc14f499-5449-459d-874e-a7fe1b79e469\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.197775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config\") pod \"11f45a61-64e0-41ac-85bd-660c5a2c83be\" (UID: \"11f45a61-64e0-41ac-85bd-660c5a2c83be\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198671 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198693 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jft59\" (UniqueName: \"kubernetes.io/projected/4ee5d591-43e7-437c-9b55-b9132f22246a-kube-api-access-jft59\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198707 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198718 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198730 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee5d591-43e7-437c-9b55-b9132f22246a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2p8j\" (UniqueName: \"kubernetes.io/projected/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-kube-api-access-r2p8j\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.198753 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.199256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "11f45a61-64e0-41ac-85bd-660c5a2c83be" (UID: "11f45a61-64e0-41ac-85bd-660c5a2c83be"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.200011 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data" (OuterVolumeSpecName: "config-data") pod "11f45a61-64e0-41ac-85bd-660c5a2c83be" (UID: "11f45a61-64e0-41ac-85bd-660c5a2c83be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.200022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.200471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs" (OuterVolumeSpecName: "logs") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.201668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs" (OuterVolumeSpecName: "logs") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.204377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.205291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs" (OuterVolumeSpecName: "logs") pod "0037fd82-93c2-4ba4-87c2-563388f2b56c" (UID: "0037fd82-93c2-4ba4-87c2-563388f2b56c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.207865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.209390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.217757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k" (OuterVolumeSpecName: "kube-api-access-fms9k") pod "11f45a61-64e0-41ac-85bd-660c5a2c83be" (UID: "11f45a61-64e0-41ac-85bd-660c5a2c83be"). InnerVolumeSpecName "kube-api-access-fms9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.217771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts" (OuterVolumeSpecName: "scripts") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.217807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp" (OuterVolumeSpecName: "kube-api-access-x6fnp") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "kube-api-access-x6fnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.219731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts" (OuterVolumeSpecName: "scripts") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.224714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw" (OuterVolumeSpecName: "kube-api-access-8qffw") pod "0037fd82-93c2-4ba4-87c2-563388f2b56c" (UID: "0037fd82-93c2-4ba4-87c2-563388f2b56c"). InnerVolumeSpecName "kube-api-access-8qffw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.224985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.226712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55" (OuterVolumeSpecName: "kube-api-access-pzq55") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "kube-api-access-pzq55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.236339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0037fd82-93c2-4ba4-87c2-563388f2b56c" (UID: "0037fd82-93c2-4ba4-87c2-563388f2b56c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.245003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.285601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4ee5d591-43e7-437c-9b55-b9132f22246a" (UID: "4ee5d591-43e7-437c-9b55-b9132f22246a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.286018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11f45a61-64e0-41ac-85bd-660c5a2c83be" (UID: "11f45a61-64e0-41ac-85bd-660c5a2c83be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.295235 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data" (OuterVolumeSpecName: "config-data") pod "0037fd82-93c2-4ba4-87c2-563388f2b56c" (UID: "0037fd82-93c2-4ba4-87c2-563388f2b56c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301495 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301519 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qffw\" (UniqueName: \"kubernetes.io/projected/0037fd82-93c2-4ba4-87c2-563388f2b56c-kube-api-access-8qffw\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301530 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzq55\" (UniqueName: \"kubernetes.io/projected/47ec7945-d514-4036-b888-6cc63c552195-kube-api-access-pzq55\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301540 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301549 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301804 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301817 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301826 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301834 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301842 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301850 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301858 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47ec7945-d514-4036-b888-6cc63c552195-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301929 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301943 5030 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301952 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fms9k\" (UniqueName: \"kubernetes.io/projected/11f45a61-64e0-41ac-85bd-660c5a2c83be-kube-api-access-fms9k\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301962 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0037fd82-93c2-4ba4-87c2-563388f2b56c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301971 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6fnp\" (UniqueName: \"kubernetes.io/projected/dc14f499-5449-459d-874e-a7fe1b79e469-kube-api-access-x6fnp\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301979 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301986 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ee5d591-43e7-437c-9b55-b9132f22246a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.301996 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc14f499-5449-459d-874e-a7fe1b79e469-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.302004 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11f45a61-64e0-41ac-85bd-660c5a2c83be-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.313850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "11f45a61-64e0-41ac-85bd-660c5a2c83be" (UID: "11f45a61-64e0-41ac-85bd-660c5a2c83be"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.370938 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd is running failed: container process not found" containerID="0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.371369 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd is running failed: container process not found" containerID="0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.372001 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd is running failed: container process not found" containerID="0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.372047 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" containerName="nova-cell0-conductor-conductor" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.386162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data" (OuterVolumeSpecName: "config-data") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.398281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.401883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data" (OuterVolumeSpecName: "config-data") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.411223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: W0120 23:08:10.414939 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/47ec7945-d514-4036-b888-6cc63c552195/volumes/kubernetes.io~secret/combined-ca-bundle Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.414975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47ec7945-d514-4036-b888-6cc63c552195" (UID: "47ec7945-d514-4036-b888-6cc63c552195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.415003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") pod \"47ec7945-d514-4036-b888-6cc63c552195\" (UID: \"47ec7945-d514-4036-b888-6cc63c552195\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.416068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.416162 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.416217 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts podName:3559ba4b-6e2a-4df1-900f-4a0138dab333 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:12.416200277 +0000 UTC m=+1964.736460565 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts") pod "root-account-create-update-h7nmm" (UID: "3559ba4b-6e2a-4df1-900f-4a0138dab333") : configmap "openstack-scripts" not found Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.416364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh9zw\" (UniqueName: \"kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw\") pod \"keystone-cb8a-account-create-update-pdb5b\" (UID: \"e7476965-cef4-4fe9-a85b-f22312a83d52\") " pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.416536 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.416560 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:12.416553286 +0000 UTC m=+1964.736813574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : configmap "openstack-scripts" not found Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.424084 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/11f45a61-64e0-41ac-85bd-660c5a2c83be-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.424115 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.424127 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.424136 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ec7945-d514-4036-b888-6cc63c552195-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.424145 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.439546 5030 projected.go:194] Error preparing data for projected volume kube-api-access-sh9zw for pod openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:10 crc kubenswrapper[5030]: E0120 23:08:10.439613 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw podName:e7476965-cef4-4fe9-a85b-f22312a83d52 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:08:12.439594169 +0000 UTC m=+1964.759854447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sh9zw" (UniqueName: "kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw") pod "keystone-cb8a-account-create-update-pdb5b" (UID: "e7476965-cef4-4fe9-a85b-f22312a83d52") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.448846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0037fd82-93c2-4ba4-87c2-563388f2b56c" (UID: "0037fd82-93c2-4ba4-87c2-563388f2b56c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.449869 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.453713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc14f499-5449-459d-874e-a7fe1b79e469" (UID: "dc14f499-5449-459d-874e-a7fe1b79e469"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.456407 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.525883 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.525920 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0037fd82-93c2-4ba4-87c2-563388f2b56c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.525930 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc14f499-5449-459d-874e-a7fe1b79e469-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.525939 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.611678 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.617980 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.623050 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.630535 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.640693 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.665328 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.710561 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729038 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data\") pod \"b74f69b6-bbb2-4578-a61c-9479f6802e01\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674gj\" (UniqueName: \"kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbj4\" (UniqueName: \"kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.729914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 
23:08:10.730015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730093 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj8gf\" (UniqueName: \"kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf\") pod \"b74f69b6-bbb2-4578-a61c-9479f6802e01\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z88nw\" (UniqueName: \"kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz695\" (UniqueName: \"kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730773 
5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730856 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs\") pod \"2b88d820-54a4-4307-82de-6b7e7b5cd435\" (UID: \"2b88d820-54a4-4307-82de-6b7e7b5cd435\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.730993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731092 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731165 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs" (OuterVolumeSpecName: "logs") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle\") pod \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\" (UID: \"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle\") pod \"b74f69b6-bbb2-4578-a61c-9479f6802e01\" (UID: \"b74f69b6-bbb2-4578-a61c-9479f6802e01\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs\") pod \"63645689-a02d-4ee6-b712-554130bdca30\" (UID: \"63645689-a02d-4ee6-b712-554130bdca30\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.731591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs\") pod \"b6b94587-03e7-4220-b417-0244e3623cd5\" (UID: \"b6b94587-03e7-4220-b417-0244e3623cd5\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.732253 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63645689-a02d-4ee6-b712-554130bdca30-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.741936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs" (OuterVolumeSpecName: "logs") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.744122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.745405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4" (OuterVolumeSpecName: "kube-api-access-ffbj4") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "kube-api-access-ffbj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.746425 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs" (OuterVolumeSpecName: "logs") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.748353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj" (OuterVolumeSpecName: "kube-api-access-674gj") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "kube-api-access-674gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.750108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.757274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.762552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw" (OuterVolumeSpecName: "kube-api-access-z88nw") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "kube-api-access-z88nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.768183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts" (OuterVolumeSpecName: "scripts") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.767736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts" (OuterVolumeSpecName: "scripts") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.768924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.777881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf" (OuterVolumeSpecName: "kube-api-access-jj8gf") pod "b74f69b6-bbb2-4578-a61c-9479f6802e01" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01"). InnerVolumeSpecName "kube-api-access-jj8gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.783353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.785907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695" (OuterVolumeSpecName: "kube-api-access-vz695") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "kube-api-access-vz695". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.811562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7bxk\" (UniqueName: \"kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle\") pod \"480916ab-abdd-474b-a001-12e99c3e1e42\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data\") pod \"480916ab-abdd-474b-a001-12e99c3e1e42\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom\") pod \"1728938b-621e-44f4-91e2-c061a23aa819\" (UID: \"1728938b-621e-44f4-91e2-c061a23aa819\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.833797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrvh\" (UniqueName: \"kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh\") pod \"480916ab-abdd-474b-a001-12e99c3e1e42\" (UID: \"480916ab-abdd-474b-a001-12e99c3e1e42\") " Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834105 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674gj\" (UniqueName: \"kubernetes.io/projected/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-kube-api-access-674gj\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc 
kubenswrapper[5030]: I0120 23:08:10.834116 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbj4\" (UniqueName: \"kubernetes.io/projected/63645689-a02d-4ee6-b712-554130bdca30-kube-api-access-ffbj4\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834125 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834135 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b88d820-54a4-4307-82de-6b7e7b5cd435-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834144 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj8gf\" (UniqueName: \"kubernetes.io/projected/b74f69b6-bbb2-4578-a61c-9479f6802e01-kube-api-access-jj8gf\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834152 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z88nw\" (UniqueName: \"kubernetes.io/projected/2b88d820-54a4-4307-82de-6b7e7b5cd435-kube-api-access-z88nw\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834162 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz695\" (UniqueName: \"kubernetes.io/projected/b6b94587-03e7-4220-b417-0244e3623cd5-kube-api-access-vz695\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834170 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834178 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834187 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834195 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834204 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63645689-a02d-4ee6-b712-554130bdca30-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834212 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834219 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b94587-03e7-4220-b417-0244e3623cd5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834227 5030 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.834931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.835186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.840717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts" (OuterVolumeSpecName: "scripts") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.840726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk" (OuterVolumeSpecName: "kube-api-access-d7bxk") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "kube-api-access-d7bxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.842204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh" (OuterVolumeSpecName: "kube-api-access-cnrvh") pod "480916ab-abdd-474b-a001-12e99c3e1e42" (UID: "480916ab-abdd-474b-a001-12e99c3e1e42"). InnerVolumeSpecName "kube-api-access-cnrvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.842738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.852165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b74f69b6-bbb2-4578-a61c-9479f6802e01" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.855049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data" (OuterVolumeSpecName: "config-data") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.865721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.866234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.868508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.873304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data" (OuterVolumeSpecName: "config-data") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:10 crc kubenswrapper[5030]: I0120 23:08:10.874824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data" (OuterVolumeSpecName: "config-data") pod "b74f69b6-bbb2-4578-a61c-9479f6802e01" (UID: "b74f69b6-bbb2-4578-a61c-9479f6802e01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.882763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.882791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.883764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.893875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63645689-a02d-4ee6-b712-554130bdca30" (UID: "63645689-a02d-4ee6-b712-554130bdca30"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.904758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"dc14f499-5449-459d-874e-a7fe1b79e469","Type":"ContainerDied","Data":"d7211d602a1c53fe5d6cdfaeaf27028436d65115df95d6b4395a12df62fad210"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.904813 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.904831 5030 scope.go:117] "RemoveContainer" containerID="1779ccb659c7f6cf01660be8e527df0c53283c0e70f691746ab04f5da6b5a774" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.907076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"11f45a61-64e0-41ac-85bd-660c5a2c83be","Type":"ContainerDied","Data":"909e491ca41f02445b9bfd666378a82a3944a12b0f1f9142f0839021c03093fb"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.907162 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.914060 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "480916ab-abdd-474b-a001-12e99c3e1e42" (UID: "480916ab-abdd-474b-a001-12e99c3e1e42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.918396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0037fd82-93c2-4ba4-87c2-563388f2b56c","Type":"ContainerDied","Data":"2637b198c127ff15d3cda806ad9c17309ec22af0631afe4f583165d198101b60"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.918508 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.924048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.924172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data" (OuterVolumeSpecName: "config-data") pod "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" (UID: "6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.924555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b88d820-54a4-4307-82de-6b7e7b5cd435" (UID: "2b88d820-54a4-4307-82de-6b7e7b5cd435"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.926464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"63645689-a02d-4ee6-b712-554130bdca30","Type":"ContainerDied","Data":"1855439598a81b763fc62654a4b8858232e8d6fdbde25d898d67837a55ac5fcc"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.926549 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.935114 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.935613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b" event={"ID":"6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c","Type":"ContainerDied","Data":"8ba66e73fadff9100438a6ab8d73292c0621ba600a6b7e7b25d0fb716bdc9147"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.935906 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.935931 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.935943 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936612 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936648 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936661 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrvh\" (UniqueName: \"kubernetes.io/projected/480916ab-abdd-474b-a001-12e99c3e1e42-kube-api-access-cnrvh\") on node \"crc\" DevicePath 
\"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936677 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936688 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936702 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7bxk\" (UniqueName: \"kubernetes.io/projected/1728938b-621e-44f4-91e2-c061a23aa819-kube-api-access-d7bxk\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936714 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936729 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936740 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936753 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936765 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936778 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936789 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1728938b-621e-44f4-91e2-c061a23aa819-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936801 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936813 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63645689-a02d-4ee6-b712-554130bdca30-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936825 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc 
kubenswrapper[5030]: I0120 23:08:10.936837 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b88d820-54a4-4307-82de-6b7e7b5cd435-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.936849 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74f69b6-bbb2-4578-a61c-9479f6802e01-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.937906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data" (OuterVolumeSpecName: "config-data") pod "480916ab-abdd-474b-a001-12e99c3e1e42" (UID: "480916ab-abdd-474b-a001-12e99c3e1e42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.940657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data" (OuterVolumeSpecName: "config-data") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.941672 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.945252 5030 generic.go:334] "Generic (PLEG): container finished" podID="1728938b-621e-44f4-91e2-c061a23aa819" containerID="f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2" exitCode=0 Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.945293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerDied","Data":"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.945312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1728938b-621e-44f4-91e2-c061a23aa819","Type":"ContainerDied","Data":"757f521c9d31487a057624a6b3e97bff2316d3bdd4bafa5de2112e15d8363490"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.945349 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.955487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"2b88d820-54a4-4307-82de-6b7e7b5cd435","Type":"ContainerDied","Data":"7d734a100670ce4293494e4925ac17467db764f2908d8753da4ceec6a8815490"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.955663 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.957845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6b94587-03e7-4220-b417-0244e3623cd5" (UID: "b6b94587-03e7-4220-b417-0244e3623cd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.958294 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.958500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"480916ab-abdd-474b-a001-12e99c3e1e42","Type":"ContainerDied","Data":"8b9328262bb10ec471943b09c62e8e73ada907f52ca6af9ad3801d543824add6"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.964605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b6b94587-03e7-4220-b417-0244e3623cd5","Type":"ContainerDied","Data":"b271955100407803cc3b336caad14a4684c87d03e1174a9f38f22372a86d5637"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.964717 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.974896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"47ec7945-d514-4036-b888-6cc63c552195","Type":"ContainerDied","Data":"8d79895e347e67e7872b234130dafcf8fa3a7e82a3d87ab0a0ed73c7b1f413e8"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.974969 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.993507 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.993541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"b74f69b6-bbb2-4578-a61c-9479f6802e01","Type":"ContainerDied","Data":"ddb1e50f0149d2f277fe1610af98284b0db0c78e952d740ce24f39679bc3fc95"} Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.995380 5030 scope.go:117] "RemoveContainer" containerID="bb03a7a9462c3d1f022ea25c6e705e371d2aa4a25dcb01ea0294ce1846049a9b" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.997610 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7869b46478-p797p" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.997687 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.997960 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-73fb-account-create-update-7rshq" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:10.997967 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.006307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data" (OuterVolumeSpecName: "config-data") pod "1728938b-621e-44f4-91e2-c061a23aa819" (UID: "1728938b-621e-44f4-91e2-c061a23aa819"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.024846 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.035904 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.039275 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.039302 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1728938b-621e-44f4-91e2-c061a23aa819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.039313 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/480916ab-abdd-474b-a001-12e99c3e1e42-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.039322 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.039332 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b94587-03e7-4220-b417-0244e3623cd5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.042006 5030 scope.go:117] "RemoveContainer" containerID="1cada3db4af66ec4d1c27414fa05f35d51c9d85cd6c0e4252f5af47988c2a0b3" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.057376 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.062338 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.070729 5030 scope.go:117] "RemoveContainer" containerID="2d202c8a6069a8f94f4c396d447c7c9ca8b664cf155fd902dbbcf820ccd0a8ea" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.070913 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.078553 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.091748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.101669 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-8c8d65fc8-zg74b"] Jan 20 23:08:11 crc 
kubenswrapper[5030]: I0120 23:08:11.112243 5030 scope.go:117] "RemoveContainer" containerID="19fb231c59f8089b6cc2f266b334a33db07302a867317497a770819abfdc1c42" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.142806 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.160432 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.169067 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.179149 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7869b46478-p797p"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.185950 5030 scope.go:117] "RemoveContainer" containerID="76bbe65e67022afeb40494b3071315eb174c4e626acd1c77fb924d72243b1dbb" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.189058 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.193867 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.198799 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.212448 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.233020 5030 scope.go:117] "RemoveContainer" containerID="8afb44471792a68df8295fcfe9340c136a8e28722e34319c5bd8ccede98487ee" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.237234 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.245814 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-73fb-account-create-update-7rshq"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.254562 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.260989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.272318 5030 scope.go:117] "RemoveContainer" containerID="9685f0161fa2dc2a70a60068493799a7269233e5fdcb1286db91750b507a8e03" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.277339 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.288755 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-cb8a-account-create-update-pdb5b"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.291840 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.302607 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.317011 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.322590 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-c9bf-account-create-update-pcwvt"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.328177 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.336868 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.342882 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.349131 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.418109 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.426044 5030 scope.go:117] "RemoveContainer" containerID="1e47d9a62be110198d6a21ed0e2f60d6739a5d24b7c3d9403df7146ed5e0e34c" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.449145 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh9zw\" (UniqueName: \"kubernetes.io/projected/e7476965-cef4-4fe9-a85b-f22312a83d52-kube-api-access-sh9zw\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.449199 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7476965-cef4-4fe9-a85b-f22312a83d52-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.463713 5030 scope.go:117] "RemoveContainer" containerID="87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.497342 5030 scope.go:117] "RemoveContainer" containerID="f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.522727 5030 scope.go:117] "RemoveContainer" containerID="87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940" Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.523607 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940\": container with ID starting with 87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940 not found: ID does not exist" containerID="87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.523660 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940"} err="failed to get container status \"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940\": rpc error: code = NotFound desc = could not find container \"87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940\": container with ID starting with 87560233fddfaf3c5921a24575892e3f668b172a67d7c5d34650e6ff443be940 not found: ID does not exist" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 
23:08:11.523682 5030 scope.go:117] "RemoveContainer" containerID="f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2" Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.524091 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2\": container with ID starting with f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2 not found: ID does not exist" containerID="f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.524151 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2"} err="failed to get container status \"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2\": rpc error: code = NotFound desc = could not find container \"f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2\": container with ID starting with f02ca9d198cfa26557c21f23bad287bf72936e7445101fad35f42301c6d41bc2 not found: ID does not exist" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.524178 5030 scope.go:117] "RemoveContainer" containerID="ffffc1c09fe8eaaa4c9873343803eceb97caa93ec62a7a0f5b0fb64447d616c8" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.550818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2flwx\" (UniqueName: \"kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx\") pod \"3559ba4b-6e2a-4df1-900f-4a0138dab333\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.551045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts\") pod \"3559ba4b-6e2a-4df1-900f-4a0138dab333\" (UID: \"3559ba4b-6e2a-4df1-900f-4a0138dab333\") " Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.551636 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.551724 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data podName:78f61cd4-3d8c-48e0-bebd-03969052e500 nodeName:}" failed. No retries permitted until 2026-01-20 23:08:19.551697731 +0000 UTC m=+1971.871958019 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data") pod "rabbitmq-cell1-server-0" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.551747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3559ba4b-6e2a-4df1-900f-4a0138dab333" (UID: "3559ba4b-6e2a-4df1-900f-4a0138dab333"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.559882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx" (OuterVolumeSpecName: "kube-api-access-2flwx") pod "3559ba4b-6e2a-4df1-900f-4a0138dab333" (UID: "3559ba4b-6e2a-4df1-900f-4a0138dab333"). InnerVolumeSpecName "kube-api-access-2flwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.631109 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_1399fab1-c25c-4b58-a58e-9dfd0062a09c/ovn-northd/0.log" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.631181 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.644677 5030 scope.go:117] "RemoveContainer" containerID="61d51d6533347c2adcbfbfb7e645e966f84e627cfee7356d6c103a97b3f6d938" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.652636 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2flwx\" (UniqueName: \"kubernetes.io/projected/3559ba4b-6e2a-4df1-900f-4a0138dab333-kube-api-access-2flwx\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.652670 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3559ba4b-6e2a-4df1-900f-4a0138dab333-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.665241 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.666352 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.667584 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:08:11 crc kubenswrapper[5030]: E0120 23:08:11.667639 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerName="nova-cell1-conductor-conductor" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.742969 5030 scope.go:117] "RemoveContainer" containerID="0f3a917976c46bd8ac3d7283588cd417a1a1576ec8637bc5f388d2baab0c0ebd" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753438 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnf25\" (UniqueName: \"kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.753867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts\") pod \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\" (UID: \"1399fab1-c25c-4b58-a58e-9dfd0062a09c\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.754358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.754268 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config" (OuterVolumeSpecName: "config") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.754912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts" (OuterVolumeSpecName: "scripts") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.757888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25" (OuterVolumeSpecName: "kube-api-access-hnf25") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "kube-api-access-hnf25". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.767817 5030 scope.go:117] "RemoveContainer" containerID="06fbce813ee541d9eeed7850ca354fca0580cf614c4eb0a8edaf5b5a22908da6" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.776770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.794836 5030 scope.go:117] "RemoveContainer" containerID="2de53c3b859c17f6dce4f358962fc2f49e746c59df64f97dfaf9b707a5340a38" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.805542 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.812879 5030 scope.go:117] "RemoveContainer" containerID="4f4bc6591caf71c0df2653257968c6767a5209155e845f49b6ad42788e1d5137" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.816522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.829443 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1399fab1-c25c-4b58-a58e-9dfd0062a09c" (UID: "1399fab1-c25c-4b58-a58e-9dfd0062a09c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.845654 5030 scope.go:117] "RemoveContainer" containerID="946c17aa01d77e8eccf38bf873615ae3de802c969d5a314ca7b0bcf0eb6374f1" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.855707 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnf25\" (UniqueName: \"kubernetes.io/projected/1399fab1-c25c-4b58-a58e-9dfd0062a09c-kube-api-access-hnf25\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856009 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856094 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856192 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1399fab1-c25c-4b58-a58e-9dfd0062a09c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856270 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856354 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.856426 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1399fab1-c25c-4b58-a58e-9dfd0062a09c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.861682 5030 scope.go:117] "RemoveContainer" containerID="8e0043eba4e2ddfdb9777f320a29f7f7423ad5d5f2afe41d13e3932b5a6a9147" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.887084 5030 scope.go:117] "RemoveContainer" containerID="459e0a1d8f69b89f034afa3986d4b2888fa00baf09bf1b847ebc91cc377c8c2a" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.908504 5030 scope.go:117] "RemoveContainer" containerID="65a340046c8d57278deb025b2315942cfd40c2be9694e1d9f412ed27c7d0f561" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhrqm\" (UniqueName: \"kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958503 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config\") pod \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\" (UID: \"cae8a133-5a30-4a92-a0cd-e07c5941af0f\") " Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.958794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.959250 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.959256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.959363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.959649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.962528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm" (OuterVolumeSpecName: "kube-api-access-mhrqm") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "kube-api-access-mhrqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.971993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.973358 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" path="/var/lib/kubelet/pods/0037fd82-93c2-4ba4-87c2-563388f2b56c/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.974110 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e5d8f2-57ee-4aba-aea9-db1da4cb32dc" path="/var/lib/kubelet/pods/06e5d8f2-57ee-4aba-aea9-db1da4cb32dc/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.974865 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f45a61-64e0-41ac-85bd-660c5a2c83be" path="/var/lib/kubelet/pods/11f45a61-64e0-41ac-85bd-660c5a2c83be/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.975319 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1728938b-621e-44f4-91e2-c061a23aa819" path="/var/lib/kubelet/pods/1728938b-621e-44f4-91e2-c061a23aa819/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.976675 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" path="/var/lib/kubelet/pods/2b88d820-54a4-4307-82de-6b7e7b5cd435/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.977498 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ec7945-d514-4036-b888-6cc63c552195" path="/var/lib/kubelet/pods/47ec7945-d514-4036-b888-6cc63c552195/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.978924 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" path="/var/lib/kubelet/pods/480916ab-abdd-474b-a001-12e99c3e1e42/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.979522 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" path="/var/lib/kubelet/pods/4ee5d591-43e7-437c-9b55-b9132f22246a/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.980925 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="63645689-a02d-4ee6-b712-554130bdca30" path="/var/lib/kubelet/pods/63645689-a02d-4ee6-b712-554130bdca30/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.981656 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" path="/var/lib/kubelet/pods/6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.982856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.983200 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" path="/var/lib/kubelet/pods/b6b94587-03e7-4220-b417-0244e3623cd5/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.984926 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" path="/var/lib/kubelet/pods/b74f69b6-bbb2-4578-a61c-9479f6802e01/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.986052 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" path="/var/lib/kubelet/pods/dc14f499-5449-459d-874e-a7fe1b79e469/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.987303 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7476965-cef4-4fe9-a85b-f22312a83d52" path="/var/lib/kubelet/pods/e7476965-cef4-4fe9-a85b-f22312a83d52/volumes" Jan 20 23:08:11 crc kubenswrapper[5030]: I0120 23:08:11.987758 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2" path="/var/lib/kubelet/pods/ec16e80e-8f86-4dfc-ac11-32f7cc4a3bd2/volumes" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.008099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cae8a133-5a30-4a92-a0cd-e07c5941af0f" (UID: "cae8a133-5a30-4a92-a0cd-e07c5941af0f"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.011390 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.022530 5030 generic.go:334] "Generic (PLEG): container finished" podID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerID="3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5" exitCode=0 Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.022591 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.022632 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerDied","Data":"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.022747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cae8a133-5a30-4a92-a0cd-e07c5941af0f","Type":"ContainerDied","Data":"1ab3bed68b1854174840ebd72cbcf854434d5af7db649a2018f4a61a6c04892f"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.022772 5030 scope.go:117] "RemoveContainer" containerID="3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.032109 5030 generic.go:334] "Generic (PLEG): container finished" podID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerID="2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b" exitCode=0 Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.032165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerDied","Data":"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.032188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"78f61cd4-3d8c-48e0-bebd-03969052e500","Type":"ContainerDied","Data":"198c46fbff0b02ec58713aac2e3aed20927f56beb3bcd29624042538a10fb9ad"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.032238 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.044887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" event={"ID":"3559ba4b-6e2a-4df1-900f-4a0138dab333","Type":"ContainerDied","Data":"73ee186f1ebbffd7604290cb324748c50fa03cef14113dee116ffdb8311833b3"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.045013 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h7nmm" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064795 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064848 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064864 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhrqm\" (UniqueName: \"kubernetes.io/projected/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kube-api-access-mhrqm\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064877 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064891 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064908 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae8a133-5a30-4a92-a0cd-e07c5941af0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.064920 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cae8a133-5a30-4a92-a0cd-e07c5941af0f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.068129 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.068217 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data podName:6ede01a6-9221-4276-b537-212e5da27f5c nodeName:}" failed. No retries permitted until 2026-01-20 23:08:20.06818832 +0000 UTC m=+1972.388448608 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data") pod "rabbitmq-server-0" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c") : configmap "rabbitmq-config-data" not found Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.078711 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_1399fab1-c25c-4b58-a58e-9dfd0062a09c/ovn-northd/0.log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.078768 5030 generic.go:334] "Generic (PLEG): container finished" podID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" exitCode=139 Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.078864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerDied","Data":"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.078887 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.078895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1399fab1-c25c-4b58-a58e-9dfd0062a09c","Type":"ContainerDied","Data":"078298afef122c853f4483a01a811bf7424016846dac12e491649c1b0e43647a"} Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.103998 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.118452 5030 scope.go:117] "RemoveContainer" containerID="1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.122955 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.132817 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.138083 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.143839 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h7nmm"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.151447 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.153142 5030 scope.go:117] "RemoveContainer" containerID="3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.153534 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5\": container with ID starting with 3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5 not found: ID does not exist" containerID="3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.153568 5030 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5"} err="failed to get container status \"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5\": rpc error: code = NotFound desc = could not find container \"3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5\": container with ID starting with 3ce1c0b95d67e31204548dcefbbebb8d7587a4e6d04adf65b96d1d9f7681e5f5 not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.153592 5030 scope.go:117] "RemoveContainer" containerID="1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.153944 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c\": container with ID starting with 1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c not found: ID does not exist" containerID="1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.153966 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c"} err="failed to get container status \"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c\": rpc error: code = NotFound desc = could not find container \"1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c\": container with ID starting with 1f4c3fcafbcbc7133742f0c2259065e74d4e0153ceb5bf2f0ca3a73ef1200c9c not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.153979 5030 scope.go:117] "RemoveContainer" containerID="2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.159024 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.166914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.166971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167075 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8nlm\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf\") pod \"78f61cd4-3d8c-48e0-bebd-03969052e500\" (UID: \"78f61cd4-3d8c-48e0-bebd-03969052e500\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.167906 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.168373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.169182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.173498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info" (OuterVolumeSpecName: "pod-info") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.173904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.177390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm" (OuterVolumeSpecName: "kube-api-access-t8nlm") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "kube-api-access-t8nlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.178294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.180027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.182716 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.193407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data" (OuterVolumeSpecName: "config-data") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.208004 5030 scope.go:117] "RemoveContainer" containerID="cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.212720 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213092 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="galera" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213108 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="galera" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213127 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213135 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213147 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="mysql-bootstrap" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213155 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="mysql-bootstrap" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="sg-core" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="sg-core" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213188 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213195 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213209 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-api" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213228 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-notification-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213236 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-notification-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213250 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213257 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213275 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213387 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f45a61-64e0-41ac-85bd-660c5a2c83be" containerName="memcached" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f45a61-64e0-41ac-85bd-660c5a2c83be" containerName="memcached" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213438 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213448 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213460 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="openstack-network-exporter" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213468 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="openstack-network-exporter" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213479 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213487 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213501 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="proxy-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213509 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="proxy-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213520 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="rabbitmq" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213527 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="rabbitmq" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213535 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" containerName="nova-cell0-conductor-conductor" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213542 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" containerName="nova-cell0-conductor-conductor" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213551 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="setup-container" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213557 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="setup-container" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213568 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-central-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213575 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-central-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213586 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213594 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213602 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="ovn-northd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213609 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="ovn-northd" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213633 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213642 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213653 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213661 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213672 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213680 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-api" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213690 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213700 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213714 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="cinder-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213721 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="cinder-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213740 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerName="nova-scheduler-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213757 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerName="nova-scheduler-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213772 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="probe" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213779 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="probe" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213788 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213795 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.213807 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213814 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.213995 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214010 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="cinder-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214023 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214033 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="ovn-northd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214042 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" containerName="openstack-network-exporter" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214053 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-notification-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214076 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f45a61-64e0-41ac-85bd-660c5a2c83be" containerName="memcached" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214089 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" 
containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214100 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd4290d-9d6d-4a8e-9b8b-a1f51b675e2c" containerName="barbican-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214111 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214121 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214133 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" containerName="galera" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214141 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214150 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74f69b6-bbb2-4578-a61c-9479f6802e01" containerName="nova-scheduler-scheduler" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214160 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" containerName="rabbitmq" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214167 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="63645689-a02d-4ee6-b712-554130bdca30" containerName="cinder-api" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214179 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="sg-core" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214189 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b88d820-54a4-4307-82de-6b7e7b5cd435" containerName="nova-api-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214198 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214210 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="ceilometer-central-agent" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214221 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1728938b-621e-44f4-91e2-c061a23aa819" containerName="probe" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214232 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee5d591-43e7-437c-9b55-b9132f22246a" containerName="placement-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.214268 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ec7945-d514-4036-b888-6cc63c552195" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.216180 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b94587-03e7-4220-b417-0244e3623cd5" containerName="proxy-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.216209 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-log" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.216224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc14f499-5449-459d-874e-a7fe1b79e469" containerName="glance-httpd" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.216271 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="480916ab-abdd-474b-a001-12e99c3e1e42" containerName="nova-cell0-conductor-conductor" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.216505 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.216517 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" containerName="mariadb-account-create-update" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.220983 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.228158 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.232255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf" (OuterVolumeSpecName: "server-conf") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.262708 5030 scope.go:117] "RemoveContainer" containerID="2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269563 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78f61cd4-3d8c-48e0-bebd-03969052e500-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269587 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269596 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269605 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269613 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269700 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78f61cd4-3d8c-48e0-bebd-03969052e500-pod-info\") on node \"crc\" DevicePath \"\"" Jan 
20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269709 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78f61cd4-3d8c-48e0-bebd-03969052e500-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269717 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269724 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.269733 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8nlm\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-kube-api-access-t8nlm\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.269983 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b\": container with ID starting with 2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b not found: ID does not exist" containerID="2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.271579 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b"} err="failed to get container status \"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b\": rpc error: code = NotFound desc = could not find container \"2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b\": container with ID starting with 2f5b088739c93b3850099728469fea790afce761d8524b43ba23c86d11359f9b not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.271608 5030 scope.go:117] "RemoveContainer" containerID="cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.272401 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a\": container with ID starting with cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a not found: ID does not exist" containerID="cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.272419 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a"} err="failed to get container status \"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a\": rpc error: code = NotFound desc = could not find container \"cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a\": container with ID starting with cec126b6647bad2db7e9ddc4b5f214085e98e532d7618dfac27c453a22f8875a not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.272431 5030 scope.go:117] "RemoveContainer" containerID="4b0d4d83939b8a57f9922652cba204b62e772b9ed6fd96ffea1528b8ae360c41" Jan 
20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.286278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "78f61cd4-3d8c-48e0-bebd-03969052e500" (UID: "78f61cd4-3d8c-48e0-bebd-03969052e500"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.286728 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.371781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.371878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7zw\" (UniqueName: \"kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.371920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.372281 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78f61cd4-3d8c-48e0-bebd-03969052e500-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.372310 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.388484 5030 scope.go:117] "RemoveContainer" containerID="024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.472614 5030 scope.go:117] "RemoveContainer" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.473234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.473322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " 
pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.473386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7zw\" (UniqueName: \"kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.473839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.479903 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.481321 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.484890 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.485299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.491470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7zw\" (UniqueName: \"kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw\") pod \"certified-operators-tbhl7\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.494875 5030 scope.go:117] "RemoveContainer" containerID="024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.496516 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1\": container with ID starting with 024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1 not found: ID does not exist" containerID="024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.496558 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1"} err="failed to get container status \"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1\": rpc error: code = NotFound desc = could not find container \"024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1\": container with ID starting with 024213f076bfea3599e3af6ece9e72be955eb0c53722fd158681cd3ee0b2aef1 not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.496582 5030 scope.go:117] "RemoveContainer" 
containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" Jan 20 23:08:12 crc kubenswrapper[5030]: E0120 23:08:12.496928 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7\": container with ID starting with a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7 not found: ID does not exist" containerID="a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.496983 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7"} err="failed to get container status \"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7\": rpc error: code = NotFound desc = could not find container \"a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7\": container with ID starting with a79fee917f7604dd6c73a5ed6f47bf4fc6bbc65fed359ac78a4fe66d8fb239c7 not found: ID does not exist" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.551907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.553604 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts\") pod 
\"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.587814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7ckj\" (UniqueName: \"kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj\") pod \"e0ca1870-fde9-4d58-9593-555c107cada4\" (UID: \"e0ca1870-fde9-4d58-9593-555c107cada4\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.592942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.593928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.594991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj" (OuterVolumeSpecName: "kube-api-access-j7ckj") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "kube-api-access-j7ckj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.605370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts" (OuterVolumeSpecName: "scripts") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.643913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data" (OuterVolumeSpecName: "config-data") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.645343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.660524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.660975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0ca1870-fde9-4d58-9593-555c107cada4" (UID: "e0ca1870-fde9-4d58-9593-555c107cada4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.691973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jt8\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692019 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf\") pod \"6ede01a6-9221-4276-b537-212e5da27f5c\" (UID: \"6ede01a6-9221-4276-b537-212e5da27f5c\") " Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692458 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692477 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692488 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692500 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692513 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692526 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7ckj\" (UniqueName: \"kubernetes.io/projected/e0ca1870-fde9-4d58-9593-555c107cada4-kube-api-access-j7ckj\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692539 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692551 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0ca1870-fde9-4d58-9593-555c107cada4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.692968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.694826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.697564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.697644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.703976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.712994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info" (OuterVolumeSpecName: "pod-info") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.713127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.723442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8" (OuterVolumeSpecName: "kube-api-access-s4jt8") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "kube-api-access-s4jt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.752195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data" (OuterVolumeSpecName: "config-data") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.764291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf" (OuterVolumeSpecName: "server-conf") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.787877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6ede01a6-9221-4276-b537-212e5da27f5c" (UID: "6ede01a6-9221-4276-b537-212e5da27f5c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793461 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793489 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793498 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793509 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793520 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793530 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jt8\" (UniqueName: \"kubernetes.io/projected/6ede01a6-9221-4276-b537-212e5da27f5c-kube-api-access-s4jt8\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793539 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6ede01a6-9221-4276-b537-212e5da27f5c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793568 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:08:12 crc kubenswrapper[5030]: 
I0120 23:08:12.793577 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6ede01a6-9221-4276-b537-212e5da27f5c-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793585 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6ede01a6-9221-4276-b537-212e5da27f5c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.793594 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6ede01a6-9221-4276-b537-212e5da27f5c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.810953 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:08:12 crc kubenswrapper[5030]: I0120 23:08:12.894650 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.062868 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.100104 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0ca1870-fde9-4d58-9593-555c107cada4" containerID="71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858" exitCode=0 Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.100173 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.100158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" event={"ID":"e0ca1870-fde9-4d58-9593-555c107cada4","Type":"ContainerDied","Data":"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858"} Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.100254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-67567dfbc6-zn99j" event={"ID":"e0ca1870-fde9-4d58-9593-555c107cada4","Type":"ContainerDied","Data":"df0724fb704495efada931d4949058aef5caa4e000b7a820e4a92309511cc47b"} Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.100276 5030 scope.go:117] "RemoveContainer" containerID="71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.109510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerStarted","Data":"9103a2c5f96a2e3b15f6ca66bf75c056d002c0d7cbd674de17882fff796c98be"} Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.114409 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ede01a6-9221-4276-b537-212e5da27f5c" containerID="7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4" exitCode=0 Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.114449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerDied","Data":"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4"} Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.114466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"6ede01a6-9221-4276-b537-212e5da27f5c","Type":"ContainerDied","Data":"f110732b2b8320173a9a0256ee3d6ea0412e4623b249bed84157371192363e50"} Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.114511 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.123840 5030 scope.go:117] "RemoveContainer" containerID="71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858" Jan 20 23:08:13 crc kubenswrapper[5030]: E0120 23:08:13.124233 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858\": container with ID starting with 71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858 not found: ID does not exist" containerID="71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.124277 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858"} err="failed to get container status \"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858\": rpc error: code = NotFound desc = could not find container \"71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858\": container with ID starting with 71895f6baf692fcb059570c10acee58639753ce9643aab3879a97adb3c69f858 not found: ID does not exist" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.124302 5030 scope.go:117] "RemoveContainer" containerID="7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.138335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.144389 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-67567dfbc6-zn99j"] Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.152756 5030 scope.go:117] "RemoveContainer" containerID="ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.153747 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.162031 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.197857 5030 scope.go:117] "RemoveContainer" containerID="7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4" Jan 20 23:08:13 crc kubenswrapper[5030]: E0120 23:08:13.232071 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4\": container with ID starting with 7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4 not found: ID does not exist" containerID="7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.232115 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4"} err="failed to get container status \"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4\": rpc error: code = NotFound desc = could not find container \"7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4\": container with ID starting with 7e8d67b1a16612d40e2f749471f6d722e5a192ea12c3992c445e75cee863bda4 not 
found: ID does not exist" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.232143 5030 scope.go:117] "RemoveContainer" containerID="ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79" Jan 20 23:08:13 crc kubenswrapper[5030]: E0120 23:08:13.248053 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79\": container with ID starting with ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79 not found: ID does not exist" containerID="ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.248095 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79"} err="failed to get container status \"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79\": rpc error: code = NotFound desc = could not find container \"ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79\": container with ID starting with ff8398a734783704ed53adf1f8cf6ac6b31a5b7eda7fc7e80e5c0e36f03e7f79 not found: ID does not exist" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.694035 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.812747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data\") pod \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.812814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle\") pod \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.812972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2fzh\" (UniqueName: \"kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh\") pod \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\" (UID: \"a666baa5-8c6c-40e4-8c4d-7ab347b29e04\") " Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.820854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh" (OuterVolumeSpecName: "kube-api-access-t2fzh") pod "a666baa5-8c6c-40e4-8c4d-7ab347b29e04" (UID: "a666baa5-8c6c-40e4-8c4d-7ab347b29e04"). InnerVolumeSpecName "kube-api-access-t2fzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.839641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data" (OuterVolumeSpecName: "config-data") pod "a666baa5-8c6c-40e4-8c4d-7ab347b29e04" (UID: "a666baa5-8c6c-40e4-8c4d-7ab347b29e04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.862742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a666baa5-8c6c-40e4-8c4d-7ab347b29e04" (UID: "a666baa5-8c6c-40e4-8c4d-7ab347b29e04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.915507 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2fzh\" (UniqueName: \"kubernetes.io/projected/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-kube-api-access-t2fzh\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.915598 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.915638 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a666baa5-8c6c-40e4-8c4d-7ab347b29e04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.969803 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1399fab1-c25c-4b58-a58e-9dfd0062a09c" path="/var/lib/kubelet/pods/1399fab1-c25c-4b58-a58e-9dfd0062a09c/volumes" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.970353 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3559ba4b-6e2a-4df1-900f-4a0138dab333" path="/var/lib/kubelet/pods/3559ba4b-6e2a-4df1-900f-4a0138dab333/volumes" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.971019 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" path="/var/lib/kubelet/pods/6ede01a6-9221-4276-b537-212e5da27f5c/volumes" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.972161 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f61cd4-3d8c-48e0-bebd-03969052e500" path="/var/lib/kubelet/pods/78f61cd4-3d8c-48e0-bebd-03969052e500/volumes" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.972777 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae8a133-5a30-4a92-a0cd-e07c5941af0f" path="/var/lib/kubelet/pods/cae8a133-5a30-4a92-a0cd-e07c5941af0f/volumes" Jan 20 23:08:13 crc kubenswrapper[5030]: I0120 23:08:13.973752 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ca1870-fde9-4d58-9593-555c107cada4" path="/var/lib/kubelet/pods/e0ca1870-fde9-4d58-9593-555c107cada4/volumes" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.134367 5030 generic.go:334] "Generic (PLEG): container finished" podID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" exitCode=0 Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.134408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"a666baa5-8c6c-40e4-8c4d-7ab347b29e04","Type":"ContainerDied","Data":"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225"} Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.134433 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.134454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"a666baa5-8c6c-40e4-8c4d-7ab347b29e04","Type":"ContainerDied","Data":"ec014873a1bd14099b9d990a92fcb7b5aa7c462964ac3d40fca07374b556f60c"} Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.134471 5030 scope.go:117] "RemoveContainer" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.138490 5030 generic.go:334] "Generic (PLEG): container finished" podID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerID="79d2dc858c907b9b9c180b8344193a05fea6b7daac96c62a065fe0f37f1e7319" exitCode=0 Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.138544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerDied","Data":"79d2dc858c907b9b9c180b8344193a05fea6b7daac96c62a065fe0f37f1e7319"} Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.158906 5030 scope.go:117] "RemoveContainer" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" Jan 20 23:08:14 crc kubenswrapper[5030]: E0120 23:08:14.159391 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225\": container with ID starting with 0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225 not found: ID does not exist" containerID="0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.159452 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225"} err="failed to get container status \"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225\": rpc error: code = NotFound desc = could not find container \"0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225\": container with ID starting with 0009502e9727c8847c4c240f6de42ff45d425c301e8bd51e69cdf56f95bb8225 not found: ID does not exist" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.183572 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.192188 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.602307 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.726904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.727447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8569g\" (UniqueName: \"kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g\") pod \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\" (UID: \"8659ca99-8cf1-4b79-90f1-e9db988aaefa\") " Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.733941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.734322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g" (OuterVolumeSpecName: "kube-api-access-8569g") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "kube-api-access-8569g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.775090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.779131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.779457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config" (OuterVolumeSpecName: "config") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.783532 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.801473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8659ca99-8cf1-4b79-90f1-e9db988aaefa" (UID: "8659ca99-8cf1-4b79-90f1-e9db988aaefa"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829277 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829318 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8569g\" (UniqueName: \"kubernetes.io/projected/8659ca99-8cf1-4b79-90f1-e9db988aaefa-kube-api-access-8569g\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829329 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829341 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829351 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829360 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:14 crc kubenswrapper[5030]: I0120 23:08:14.829368 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8659ca99-8cf1-4b79-90f1-e9db988aaefa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.040811 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.041743 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0037fd82-93c2-4ba4-87c2-563388f2b56c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.133:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.157210 5030 generic.go:334] "Generic (PLEG): container finished" podID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerID="b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b" exitCode=0 Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.157281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerDied","Data":"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b"} Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.157312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" 
event={"ID":"8659ca99-8cf1-4b79-90f1-e9db988aaefa","Type":"ContainerDied","Data":"cf810cb414b460060cb68353480728f3d35d4508163577f0f7c0716bd8de5958"} Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.157334 5030 scope.go:117] "RemoveContainer" containerID="bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.157450 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-677b6f7646-2t2kv" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.192343 5030 scope.go:117] "RemoveContainer" containerID="b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.200033 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.208563 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-677b6f7646-2t2kv"] Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.228668 5030 scope.go:117] "RemoveContainer" containerID="bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665" Jan 20 23:08:15 crc kubenswrapper[5030]: E0120 23:08:15.229331 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665\": container with ID starting with bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665 not found: ID does not exist" containerID="bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.229397 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665"} err="failed to get container status \"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665\": rpc error: code = NotFound desc = could not find container \"bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665\": container with ID starting with bf734de45d5f60047a30e1ceed955053b0626031cab53137b8904fe6db18d665 not found: ID does not exist" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.229449 5030 scope.go:117] "RemoveContainer" containerID="b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b" Jan 20 23:08:15 crc kubenswrapper[5030]: E0120 23:08:15.230447 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b\": container with ID starting with b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b not found: ID does not exist" containerID="b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.230491 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b"} err="failed to get container status \"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b\": rpc error: code = NotFound desc = could not find container \"b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b\": container with ID starting with b9d0aa43e5924f881e26530da6f2716e1ea0b9c60a52a97d1c2c1728e2fab06b not found: ID does not exist" Jan 20 23:08:15 crc 
kubenswrapper[5030]: I0120 23:08:15.979026 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" path="/var/lib/kubelet/pods/8659ca99-8cf1-4b79-90f1-e9db988aaefa/volumes" Jan 20 23:08:15 crc kubenswrapper[5030]: I0120 23:08:15.980290 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" path="/var/lib/kubelet/pods/a666baa5-8c6c-40e4-8c4d-7ab347b29e04/volumes" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.059832 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.146898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.147699 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="dnsmasq-dns" containerID="cri-o://a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b" gracePeriod=10 Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.581034 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.762818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64r4m\" (UniqueName: \"kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m\") pod \"8831bf4c-47d9-4612-a313-abbb7305ed03\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.762896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc\") pod \"8831bf4c-47d9-4612-a313-abbb7305ed03\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.762942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0\") pod \"8831bf4c-47d9-4612-a313-abbb7305ed03\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.763063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config\") pod \"8831bf4c-47d9-4612-a313-abbb7305ed03\" (UID: \"8831bf4c-47d9-4612-a313-abbb7305ed03\") " Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.768716 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m" (OuterVolumeSpecName: "kube-api-access-64r4m") pod "8831bf4c-47d9-4612-a313-abbb7305ed03" (UID: "8831bf4c-47d9-4612-a313-abbb7305ed03"). InnerVolumeSpecName "kube-api-access-64r4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.807420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8831bf4c-47d9-4612-a313-abbb7305ed03" (UID: "8831bf4c-47d9-4612-a313-abbb7305ed03"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.818078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "8831bf4c-47d9-4612-a313-abbb7305ed03" (UID: "8831bf4c-47d9-4612-a313-abbb7305ed03"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.828306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config" (OuterVolumeSpecName: "config") pod "8831bf4c-47d9-4612-a313-abbb7305ed03" (UID: "8831bf4c-47d9-4612-a313-abbb7305ed03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.864672 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64r4m\" (UniqueName: \"kubernetes.io/projected/8831bf4c-47d9-4612-a313-abbb7305ed03-kube-api-access-64r4m\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.864726 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.864746 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:16 crc kubenswrapper[5030]: I0120 23:08:16.864763 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8831bf4c-47d9-4612-a313-abbb7305ed03-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.188263 5030 generic.go:334] "Generic (PLEG): container finished" podID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerID="a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b" exitCode=0 Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.188336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" event={"ID":"8831bf4c-47d9-4612-a313-abbb7305ed03","Type":"ContainerDied","Data":"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b"} Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.188388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" event={"ID":"8831bf4c-47d9-4612-a313-abbb7305ed03","Type":"ContainerDied","Data":"39149dd9ef01dbe393090834ebcbe34246d1401d2ab3b005f2b2674d8d8e91bd"} Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.188417 5030 scope.go:117] "RemoveContainer" containerID="a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b" Jan 20 23:08:17 crc 
kubenswrapper[5030]: I0120 23:08:17.188513 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.221610 5030 scope.go:117] "RemoveContainer" containerID="f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.242682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.253122 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d9dfbfb8c-59tpc"] Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.263430 5030 scope.go:117] "RemoveContainer" containerID="a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b" Jan 20 23:08:17 crc kubenswrapper[5030]: E0120 23:08:17.268412 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b\": container with ID starting with a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b not found: ID does not exist" containerID="a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.268487 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b"} err="failed to get container status \"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b\": rpc error: code = NotFound desc = could not find container \"a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b\": container with ID starting with a26d5b2837a0e64faafa7120a4f35aa3fe39c4b2885fdb2829f8e628d88a311b not found: ID does not exist" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.268531 5030 scope.go:117] "RemoveContainer" containerID="f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608" Jan 20 23:08:17 crc kubenswrapper[5030]: E0120 23:08:17.269231 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608\": container with ID starting with f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608 not found: ID does not exist" containerID="f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.269352 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608"} err="failed to get container status \"f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608\": rpc error: code = NotFound desc = could not find container \"f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608\": container with ID starting with f9f4e5e4ef0633e901b2f3e5dcb8d01446f70fcef427b87e340a5e8c919d4608 not found: ID does not exist" Jan 20 23:08:17 crc kubenswrapper[5030]: I0120 23:08:17.970166 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" path="/var/lib/kubelet/pods/8831bf4c-47d9-4612-a313-abbb7305ed03/volumes" Jan 20 23:08:19 crc kubenswrapper[5030]: I0120 23:08:19.214650 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerID="49f1ec1cc5af1feea3de1581411a0ec38552e9fb8e7afbf284f597c5a5201a4e" exitCode=0 Jan 20 23:08:19 crc kubenswrapper[5030]: I0120 23:08:19.214702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerDied","Data":"49f1ec1cc5af1feea3de1581411a0ec38552e9fb8e7afbf284f597c5a5201a4e"} Jan 20 23:08:20 crc kubenswrapper[5030]: I0120 23:08:20.226251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerStarted","Data":"fa4899e258cb9327c5dd3f71f20af4f6589759eca6ce88646a9fa976eb47c801"} Jan 20 23:08:20 crc kubenswrapper[5030]: I0120 23:08:20.251500 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tbhl7" podStartSLOduration=2.7321529 podStartE2EDuration="8.251474994s" podCreationTimestamp="2026-01-20 23:08:12 +0000 UTC" firstStartedPulling="2026-01-20 23:08:14.140395911 +0000 UTC m=+1966.460656199" lastFinishedPulling="2026-01-20 23:08:19.659717965 +0000 UTC m=+1971.979978293" observedRunningTime="2026-01-20 23:08:20.245236421 +0000 UTC m=+1972.565496779" watchObservedRunningTime="2026-01-20 23:08:20.251474994 +0000 UTC m=+1972.571735292" Jan 20 23:08:22 crc kubenswrapper[5030]: I0120 23:08:22.554143 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:22 crc kubenswrapper[5030]: I0120 23:08:22.554592 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:22 crc kubenswrapper[5030]: I0120 23:08:22.627321 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:32 crc kubenswrapper[5030]: I0120 23:08:32.619559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.079335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.080187 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tbhl7" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="registry-server" containerID="cri-o://fa4899e258cb9327c5dd3f71f20af4f6589759eca6ce88646a9fa976eb47c801" gracePeriod=2 Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.828820 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ebac554-54b2-415f-9d13-dc616983bff9" containerID="2f419f949223bd3eecb657c2e5db8dc3d6e15172bce4f76b79a2996c7f986fab" exitCode=137 Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.829102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"2f419f949223bd3eecb657c2e5db8dc3d6e15172bce4f76b79a2996c7f986fab"} Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.831969 5030 generic.go:334] "Generic (PLEG): container finished" podID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerID="fa4899e258cb9327c5dd3f71f20af4f6589759eca6ce88646a9fa976eb47c801" 
exitCode=0 Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.832012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerDied","Data":"fa4899e258cb9327c5dd3f71f20af4f6589759eca6ce88646a9fa976eb47c801"} Jan 20 23:08:35 crc kubenswrapper[5030]: I0120 23:08:35.970257 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.037216 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") pod \"7ebac554-54b2-415f-9d13-dc616983bff9\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7zw\" (UniqueName: \"kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw\") pod \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache\") pod \"7ebac554-54b2-415f-9d13-dc616983bff9\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities\") pod \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content\") pod \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\" (UID: \"c513c4f5-8e96-4a9f-b0ee-ef08bac09288\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock\") pod \"7ebac554-54b2-415f-9d13-dc616983bff9\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.164497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache" (OuterVolumeSpecName: "cache") pod "7ebac554-54b2-415f-9d13-dc616983bff9" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.164598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock" (OuterVolumeSpecName: "lock") pod "7ebac554-54b2-415f-9d13-dc616983bff9" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9"). 
InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.163851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m97n\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n\") pod \"7ebac554-54b2-415f-9d13-dc616983bff9\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.164756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7ebac554-54b2-415f-9d13-dc616983bff9\" (UID: \"7ebac554-54b2-415f-9d13-dc616983bff9\") " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.164780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities" (OuterVolumeSpecName: "utilities") pod "c513c4f5-8e96-4a9f-b0ee-ef08bac09288" (UID: "c513c4f5-8e96-4a9f-b0ee-ef08bac09288"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.165302 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.165328 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.165347 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7ebac554-54b2-415f-9d13-dc616983bff9-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.169826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw" (OuterVolumeSpecName: "kube-api-access-4f7zw") pod "c513c4f5-8e96-4a9f-b0ee-ef08bac09288" (UID: "c513c4f5-8e96-4a9f-b0ee-ef08bac09288"). InnerVolumeSpecName "kube-api-access-4f7zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.169878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "7ebac554-54b2-415f-9d13-dc616983bff9" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.171184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7ebac554-54b2-415f-9d13-dc616983bff9" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.171867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n" (OuterVolumeSpecName: "kube-api-access-2m97n") pod "7ebac554-54b2-415f-9d13-dc616983bff9" (UID: "7ebac554-54b2-415f-9d13-dc616983bff9"). InnerVolumeSpecName "kube-api-access-2m97n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.231122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c513c4f5-8e96-4a9f-b0ee-ef08bac09288" (UID: "c513c4f5-8e96-4a9f-b0ee-ef08bac09288"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.266761 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.266801 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m97n\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-kube-api-access-2m97n\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.266846 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.266859 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7ebac554-54b2-415f-9d13-dc616983bff9-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.266871 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7zw\" (UniqueName: \"kubernetes.io/projected/c513c4f5-8e96-4a9f-b0ee-ef08bac09288-kube-api-access-4f7zw\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.291727 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.368543 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.851988 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.852070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"7ebac554-54b2-415f-9d13-dc616983bff9","Type":"ContainerDied","Data":"9c3af1cc8881556988bb09e0cef1c91c5085b9e1f776fcc78cffc47380390a67"} Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.852140 5030 scope.go:117] "RemoveContainer" containerID="2f419f949223bd3eecb657c2e5db8dc3d6e15172bce4f76b79a2996c7f986fab" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.857196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbhl7" event={"ID":"c513c4f5-8e96-4a9f-b0ee-ef08bac09288","Type":"ContainerDied","Data":"9103a2c5f96a2e3b15f6ca66bf75c056d002c0d7cbd674de17882fff796c98be"} Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.857301 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbhl7" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.901865 5030 scope.go:117] "RemoveContainer" containerID="0228e6e5a2bf8d67091bb924e23d89a3dacb9f0b27fee5bc85fea465f5349cc2" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.904826 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.921666 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.933558 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.940164 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tbhl7"] Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.941137 5030 scope.go:117] "RemoveContainer" containerID="6e533722ba2e3c86bb244f1d172c2e5b6d85b47103226081550521e55d75cf57" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.967379 5030 scope.go:117] "RemoveContainer" containerID="71177140e4344999214a9b119df28b5034a95818f303e43d4d9fda79f212ab31" Jan 20 23:08:36 crc kubenswrapper[5030]: I0120 23:08:36.995663 5030 scope.go:117] "RemoveContainer" containerID="46e37ba2eb5235b1620489492fb3a576c004b9a14e7fd13ba34cffba482905bf" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.023127 5030 scope.go:117] "RemoveContainer" containerID="4b6cf4885689d7d5fb7caeeeabdeec6302123ed28f3ab03a8471fc3d784419c4" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.051334 5030 scope.go:117] "RemoveContainer" containerID="2a7fd43e4183dd4379e4ba1e71ac29b4bdb0e6c7406d53a86f975b26229f1c3a" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.077280 5030 scope.go:117] "RemoveContainer" containerID="c6d0a29d7c5b2300d15ec8193bba8327f5c9413176c849542e224a3285bd14db" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.111112 5030 scope.go:117] "RemoveContainer" containerID="e0a2f61c01c7d8a27a3ed5ef3a647a8e8fbd99b4f1d1be9c8bc289f70dc07409" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.143908 5030 scope.go:117] "RemoveContainer" containerID="ef5734bddbf1b14a4ff079c9189361ff4acf245e82b0aa907a1cab9c7c034a64" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.170933 5030 scope.go:117] "RemoveContainer" containerID="ba9956fcef6b04c46db28c6ae2b349ec3802723f420d58f0af21b4f46ec7c18c" Jan 20 
23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.204428 5030 scope.go:117] "RemoveContainer" containerID="5c9f5726d07ae41454c108e2c0d2fd019e4139eca228dc8e2ae6c5ddaee5f754" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.226889 5030 scope.go:117] "RemoveContainer" containerID="4340f305b67f1fdb453229a8e50f3723634ae23ea784fed8c8391cf3ca98c768" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.253769 5030 scope.go:117] "RemoveContainer" containerID="e48b459962ea7bc574b006ce6341dda88c60d32146ba90952d7090ce53463da0" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.281866 5030 scope.go:117] "RemoveContainer" containerID="222855be530361537c189bb68ea262fab03aa1e573ca39420ca757a8fc9752a9" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.317217 5030 scope.go:117] "RemoveContainer" containerID="fa4899e258cb9327c5dd3f71f20af4f6589759eca6ce88646a9fa976eb47c801" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.342275 5030 scope.go:117] "RemoveContainer" containerID="49f1ec1cc5af1feea3de1581411a0ec38552e9fb8e7afbf284f597c5a5201a4e" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.373383 5030 scope.go:117] "RemoveContainer" containerID="79d2dc858c907b9b9c180b8344193a05fea6b7daac96c62a065fe0f37f1e7319" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.980544 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" path="/var/lib/kubelet/pods/7ebac554-54b2-415f-9d13-dc616983bff9/volumes" Jan 20 23:08:37 crc kubenswrapper[5030]: I0120 23:08:37.984324 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" path="/var/lib/kubelet/pods/c513c4f5-8e96-4a9f-b0ee-ef08bac09288/volumes" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.830739 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-7k6np"] Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.835749 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-7k6np"] Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.974083 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e19c0ae-a028-4c60-90bd-dc184f834639" path="/var/lib/kubelet/pods/0e19c0ae-a028-4c60-90bd-dc184f834639/volumes" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.978898 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ntss5"] Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979326 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979359 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979374 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979388 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-server" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979416 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="swift-recon-cron" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979428 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="swift-recon-cron" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979449 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="extract-content" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979461 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="extract-content" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="extract-utilities" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979489 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="extract-utilities" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="setup-container" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979522 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="setup-container" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979543 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979588 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979613 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="init" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979655 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="init" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979675 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="rsync" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979689 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="rsync" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979712 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="dnsmasq-dns" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979724 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="dnsmasq-dns" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979747 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979758 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979777 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-api" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979789 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-api" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979806 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979821 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979837 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-reaper" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979849 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-reaper" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979871 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979883 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979901 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="rabbitmq" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979912 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="rabbitmq" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979933 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ca1870-fde9-4d58-9593-555c107cada4" containerName="keystone-api" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979945 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ca1870-fde9-4d58-9593-555c107cada4" containerName="keystone-api" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.979974 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.979993 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980008 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980027 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerName="nova-cell1-conductor-conductor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980039 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerName="nova-cell1-conductor-conductor" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980054 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-server" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="registry-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="registry-server" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-expirer" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-expirer" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980147 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-server" Jan 20 23:08:47 crc kubenswrapper[5030]: E0120 23:08:47.980175 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-httpd" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980188 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-httpd" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ca1870-fde9-4d58-9593-555c107cada4" containerName="keystone-api" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980446 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ede01a6-9221-4276-b537-212e5da27f5c" containerName="rabbitmq" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980467 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980486 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980505 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980518 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980538 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="rsync" Jan 20 23:08:47 crc 
kubenswrapper[5030]: I0120 23:08:47.980559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="swift-recon-cron" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980581 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980595 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980617 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8831bf4c-47d9-4612-a313-abbb7305ed03" containerName="dnsmasq-dns" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980668 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-expirer" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980687 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-replicator" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980700 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a666baa5-8c6c-40e4-8c4d-7ab347b29e04" containerName="nova-cell1-conductor-conductor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980722 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980740 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c513c4f5-8e96-4a9f-b0ee-ef08bac09288" containerName="registry-server" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980753 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980767 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="account-reaper" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980784 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="container-auditor" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980802 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ebac554-54b2-415f-9d13-dc616983bff9" containerName="object-updater" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980820 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-api" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.980836 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8659ca99-8cf1-4b79-90f1-e9db988aaefa" containerName="neutron-httpd" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.981520 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.984224 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.984230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.984514 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:08:47 crc kubenswrapper[5030]: I0120 23:08:47.987485 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.006025 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ntss5"] Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.062371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.062509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.062605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfng\" (UniqueName: \"kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.163762 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.163892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.163997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfng\" (UniqueName: \"kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.164324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " 
pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.165436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.198022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfng\" (UniqueName: \"kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng\") pod \"crc-storage-crc-ntss5\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.328157 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.813373 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ntss5"] Jan 20 23:08:48 crc kubenswrapper[5030]: I0120 23:08:48.999126 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ntss5" event={"ID":"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37","Type":"ContainerStarted","Data":"381e390ef40fd00bf2ce0d33094880d981f8dda2a3feb3081ebce2a5a8b3c35b"} Jan 20 23:08:50 crc kubenswrapper[5030]: I0120 23:08:50.014672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ntss5" event={"ID":"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37","Type":"ContainerStarted","Data":"551531362bd2e6dbde84980ae713ad54587b8b92a636785573c6691a77427bf2"} Jan 20 23:08:50 crc kubenswrapper[5030]: I0120 23:08:50.040419 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-ntss5" podStartSLOduration=2.270295353 podStartE2EDuration="3.040399508s" podCreationTimestamp="2026-01-20 23:08:47 +0000 UTC" firstStartedPulling="2026-01-20 23:08:48.825999887 +0000 UTC m=+2001.146260215" lastFinishedPulling="2026-01-20 23:08:49.596104082 +0000 UTC m=+2001.916364370" observedRunningTime="2026-01-20 23:08:50.0367849 +0000 UTC m=+2002.357045228" watchObservedRunningTime="2026-01-20 23:08:50.040399508 +0000 UTC m=+2002.360659806" Jan 20 23:08:51 crc kubenswrapper[5030]: I0120 23:08:51.029294 5030 generic.go:334] "Generic (PLEG): container finished" podID="093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" containerID="551531362bd2e6dbde84980ae713ad54587b8b92a636785573c6691a77427bf2" exitCode=0 Jan 20 23:08:51 crc kubenswrapper[5030]: I0120 23:08:51.029395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ntss5" event={"ID":"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37","Type":"ContainerDied","Data":"551531362bd2e6dbde84980ae713ad54587b8b92a636785573c6691a77427bf2"} Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.425243 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.436670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage\") pod \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.437026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvfng\" (UniqueName: \"kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng\") pod \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.437215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt\") pod \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\" (UID: \"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37\") " Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.437375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" (UID: "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.438060 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.445012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng" (OuterVolumeSpecName: "kube-api-access-qvfng") pod "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" (UID: "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37"). InnerVolumeSpecName "kube-api-access-qvfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.472469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" (UID: "093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.539506 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvfng\" (UniqueName: \"kubernetes.io/projected/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-kube-api-access-qvfng\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:52 crc kubenswrapper[5030]: I0120 23:08:52.539542 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:53 crc kubenswrapper[5030]: I0120 23:08:53.064888 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ntss5" event={"ID":"093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37","Type":"ContainerDied","Data":"381e390ef40fd00bf2ce0d33094880d981f8dda2a3feb3081ebce2a5a8b3c35b"} Jan 20 23:08:53 crc kubenswrapper[5030]: I0120 23:08:53.064951 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="381e390ef40fd00bf2ce0d33094880d981f8dda2a3feb3081ebce2a5a8b3c35b" Jan 20 23:08:53 crc kubenswrapper[5030]: I0120 23:08:53.064963 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ntss5" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.590341 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ntss5"] Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.601084 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ntss5"] Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.733838 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tk6zs"] Jan 20 23:08:55 crc kubenswrapper[5030]: E0120 23:08:55.734271 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" containerName="storage" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.734301 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" containerName="storage" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.734554 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" containerName="storage" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.735355 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.739515 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.739594 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.739602 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.740112 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.750870 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tk6zs"] Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.799471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dw2\" (UniqueName: \"kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.799583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.799730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.901367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dw2\" (UniqueName: \"kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.901473 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.901591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.901921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " 
pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.903017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.945867 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dw2\" (UniqueName: \"kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2\") pod \"crc-storage-crc-tk6zs\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:55 crc kubenswrapper[5030]: I0120 23:08:55.983829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37" path="/var/lib/kubelet/pods/093bbf5f-a7ae-4e1b-8e1b-97a2c6ea1b37/volumes" Jan 20 23:08:56 crc kubenswrapper[5030]: I0120 23:08:56.062463 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:56 crc kubenswrapper[5030]: I0120 23:08:56.593290 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tk6zs"] Jan 20 23:08:56 crc kubenswrapper[5030]: I0120 23:08:56.603434 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:08:57 crc kubenswrapper[5030]: I0120 23:08:57.114305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk6zs" event={"ID":"4a293400-ab1b-4e7c-90fd-6eb9356a5713","Type":"ContainerStarted","Data":"f3e9bb5345c331ed6f6de86155caa9e376cd8c03847bac1b642aa781ad9cc9fc"} Jan 20 23:08:58 crc kubenswrapper[5030]: I0120 23:08:58.127437 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a293400-ab1b-4e7c-90fd-6eb9356a5713" containerID="df1a60f7cd7548e04244683ed2dd64f62aefb4d7e1f8455813c670c81d31dee1" exitCode=0 Jan 20 23:08:58 crc kubenswrapper[5030]: I0120 23:08:58.127521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk6zs" event={"ID":"4a293400-ab1b-4e7c-90fd-6eb9356a5713","Type":"ContainerDied","Data":"df1a60f7cd7548e04244683ed2dd64f62aefb4d7e1f8455813c670c81d31dee1"} Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.505148 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.559398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage\") pod \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.559496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt\") pod \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.559526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9dw2\" (UniqueName: \"kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2\") pod \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\" (UID: \"4a293400-ab1b-4e7c-90fd-6eb9356a5713\") " Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.559729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "4a293400-ab1b-4e7c-90fd-6eb9356a5713" (UID: "4a293400-ab1b-4e7c-90fd-6eb9356a5713"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.564010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2" (OuterVolumeSpecName: "kube-api-access-q9dw2") pod "4a293400-ab1b-4e7c-90fd-6eb9356a5713" (UID: "4a293400-ab1b-4e7c-90fd-6eb9356a5713"). InnerVolumeSpecName "kube-api-access-q9dw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.595390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "4a293400-ab1b-4e7c-90fd-6eb9356a5713" (UID: "4a293400-ab1b-4e7c-90fd-6eb9356a5713"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.661202 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/4a293400-ab1b-4e7c-90fd-6eb9356a5713-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.661418 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9dw2\" (UniqueName: \"kubernetes.io/projected/4a293400-ab1b-4e7c-90fd-6eb9356a5713-kube-api-access-q9dw2\") on node \"crc\" DevicePath \"\"" Jan 20 23:08:59 crc kubenswrapper[5030]: I0120 23:08:59.661449 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/4a293400-ab1b-4e7c-90fd-6eb9356a5713-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:00 crc kubenswrapper[5030]: I0120 23:09:00.154085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tk6zs" event={"ID":"4a293400-ab1b-4e7c-90fd-6eb9356a5713","Type":"ContainerDied","Data":"f3e9bb5345c331ed6f6de86155caa9e376cd8c03847bac1b642aa781ad9cc9fc"} Jan 20 23:09:00 crc kubenswrapper[5030]: I0120 23:09:00.154142 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e9bb5345c331ed6f6de86155caa9e376cd8c03847bac1b642aa781ad9cc9fc" Jan 20 23:09:00 crc kubenswrapper[5030]: I0120 23:09:00.154283 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tk6zs" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.777879 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756"] Jan 20 23:09:04 crc kubenswrapper[5030]: E0120 23:09:04.778659 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a293400-ab1b-4e7c-90fd-6eb9356a5713" containerName="storage" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.778681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a293400-ab1b-4e7c-90fd-6eb9356a5713" containerName="storage" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.778932 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a293400-ab1b-4e7c-90fd-6eb9356a5713" containerName="storage" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.780613 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.785868 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.797194 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756"] Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.845000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.845237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.845285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62k65\" (UniqueName: \"kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.947601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.948099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.948138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62k65\" (UniqueName: \"kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.948849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.948874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:04 crc kubenswrapper[5030]: I0120 23:09:04.974130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62k65\" (UniqueName: \"kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:05 crc kubenswrapper[5030]: I0120 23:09:05.113513 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:05 crc kubenswrapper[5030]: I0120 23:09:05.374032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756"] Jan 20 23:09:06 crc kubenswrapper[5030]: I0120 23:09:06.251165 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerID="8dbd0fe360394c16eba3d53eba7c7f66786ce4fb1d84c4fe592f6ceb2dfd1694" exitCode=0 Jan 20 23:09:06 crc kubenswrapper[5030]: I0120 23:09:06.251242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" event={"ID":"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4","Type":"ContainerDied","Data":"8dbd0fe360394c16eba3d53eba7c7f66786ce4fb1d84c4fe592f6ceb2dfd1694"} Jan 20 23:09:06 crc kubenswrapper[5030]: I0120 23:09:06.251309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" event={"ID":"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4","Type":"ContainerStarted","Data":"a4482527c1588f2761ece87bceb30776cf11626c3dacc70035d173f88c3da712"} Jan 20 23:09:08 crc kubenswrapper[5030]: I0120 23:09:08.280206 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerID="7fbba8623965287f6f79bbf1f6d27d61e11abc8acc492b03cebfaee1740afe30" exitCode=0 Jan 20 23:09:08 crc kubenswrapper[5030]: I0120 23:09:08.280260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" event={"ID":"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4","Type":"ContainerDied","Data":"7fbba8623965287f6f79bbf1f6d27d61e11abc8acc492b03cebfaee1740afe30"} Jan 20 23:09:09 crc kubenswrapper[5030]: I0120 23:09:09.296295 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerID="be36195b526a7d6bf6b9a7c2898ffaed2eba98b03ee9ade26bdb19e273e31813" exitCode=0 Jan 20 23:09:09 crc kubenswrapper[5030]: I0120 
23:09:09.296382 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" event={"ID":"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4","Type":"ContainerDied","Data":"be36195b526a7d6bf6b9a7c2898ffaed2eba98b03ee9ade26bdb19e273e31813"} Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.710964 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.738676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle\") pod \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.738897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62k65\" (UniqueName: \"kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65\") pod \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.738946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util\") pod \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\" (UID: \"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4\") " Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.744476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle" (OuterVolumeSpecName: "bundle") pod "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" (UID: "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.746400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65" (OuterVolumeSpecName: "kube-api-access-62k65") pod "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" (UID: "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4"). InnerVolumeSpecName "kube-api-access-62k65". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.774442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util" (OuterVolumeSpecName: "util") pod "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" (UID: "6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.841513 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.841923 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62k65\" (UniqueName: \"kubernetes.io/projected/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-kube-api-access-62k65\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:10 crc kubenswrapper[5030]: I0120 23:09:10.842073 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4-util\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:11 crc kubenswrapper[5030]: I0120 23:09:11.320865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" event={"ID":"6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4","Type":"ContainerDied","Data":"a4482527c1588f2761ece87bceb30776cf11626c3dacc70035d173f88c3da712"} Jan 20 23:09:11 crc kubenswrapper[5030]: I0120 23:09:11.320923 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4482527c1588f2761ece87bceb30776cf11626c3dacc70035d173f88c3da712" Jan 20 23:09:11 crc kubenswrapper[5030]: I0120 23:09:11.320947 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.165577 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz"] Jan 20 23:09:20 crc kubenswrapper[5030]: E0120 23:09:20.166302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="util" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.166314 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="util" Jan 20 23:09:20 crc kubenswrapper[5030]: E0120 23:09:20.166325 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="pull" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.166331 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="pull" Jan 20 23:09:20 crc kubenswrapper[5030]: E0120 23:09:20.166346 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="extract" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.166353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="extract" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.166476 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" containerName="extract" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.166962 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.168855 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.169273 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.169348 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jhprl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.177841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wxh\" (UniqueName: \"kubernetes.io/projected/1492b841-c24d-49a9-843c-7348405b87be-kube-api-access-p2wxh\") pod \"obo-prometheus-operator-68bc856cb9-l4rnz\" (UID: \"1492b841-c24d-49a9-843c-7348405b87be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.178501 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.279673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wxh\" (UniqueName: \"kubernetes.io/projected/1492b841-c24d-49a9-843c-7348405b87be-kube-api-access-p2wxh\") pod \"obo-prometheus-operator-68bc856cb9-l4rnz\" (UID: \"1492b841-c24d-49a9-843c-7348405b87be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.285576 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.286408 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.289997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wwmks" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.290415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.298918 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.299126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wxh\" (UniqueName: \"kubernetes.io/projected/1492b841-c24d-49a9-843c-7348405b87be-kube-api-access-p2wxh\") pod \"obo-prometheus-operator-68bc856cb9-l4rnz\" (UID: \"1492b841-c24d-49a9-843c-7348405b87be\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.307153 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.307969 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.323224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.381550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.381840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.381921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.381947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.406718 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n46gl"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.407660 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.410479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.410680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-kgmzh" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.418394 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n46gl"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.481071 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf8qv\" (UniqueName: \"kubernetes.io/projected/1daef7a7-0684-404a-8e5a-4850d692e21e-kube-api-access-gf8qv\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1daef7a7-0684-404a-8e5a-4850d692e21e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.483385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.487081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.489359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.494971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7410b4ea-f2f7-4350-aa51-0730f3463d7b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr\" (UID: \"7410b4ea-f2f7-4350-aa51-0730f3463d7b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.501393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-wlqst"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.502465 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.507957 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-krcv5" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.510037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a9c5ee4-1f00-4f1a-8147-f2ad854edb84-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j\" (UID: \"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.513388 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-wlqst"] Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.584393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsrkp\" (UniqueName: \"kubernetes.io/projected/711a3beb-9e88-42cd-86f6-0e88e52c05e3-kube-api-access-vsrkp\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.584447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/711a3beb-9e88-42cd-86f6-0e88e52c05e3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.584586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf8qv\" (UniqueName: \"kubernetes.io/projected/1daef7a7-0684-404a-8e5a-4850d692e21e-kube-api-access-gf8qv\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.584641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1daef7a7-0684-404a-8e5a-4850d692e21e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " 
pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.588101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1daef7a7-0684-404a-8e5a-4850d692e21e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.599662 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.605265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf8qv\" (UniqueName: \"kubernetes.io/projected/1daef7a7-0684-404a-8e5a-4850d692e21e-kube-api-access-gf8qv\") pod \"observability-operator-59bdc8b94-n46gl\" (UID: \"1daef7a7-0684-404a-8e5a-4850d692e21e\") " pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.652067 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.688844 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/711a3beb-9e88-42cd-86f6-0e88e52c05e3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.688964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsrkp\" (UniqueName: \"kubernetes.io/projected/711a3beb-9e88-42cd-86f6-0e88e52c05e3-kube-api-access-vsrkp\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.690176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/711a3beb-9e88-42cd-86f6-0e88e52c05e3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.713657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsrkp\" (UniqueName: \"kubernetes.io/projected/711a3beb-9e88-42cd-86f6-0e88e52c05e3-kube-api-access-vsrkp\") pod \"perses-operator-5bf474d74f-wlqst\" (UID: \"711a3beb-9e88-42cd-86f6-0e88e52c05e3\") " pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.726893 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:20 crc kubenswrapper[5030]: I0120 23:09:20.865030 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.030128 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz"] Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.165441 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j"] Jan 20 23:09:21 crc kubenswrapper[5030]: W0120 23:09:21.171152 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9c5ee4_1f00_4f1a_8147_f2ad854edb84.slice/crio-38f7cc03cb167a1af0a61a0be0980a38060d2041c6cced06f9f03679107930fc WatchSource:0}: Error finding container 38f7cc03cb167a1af0a61a0be0980a38060d2041c6cced06f9f03679107930fc: Status 404 returned error can't find the container with id 38f7cc03cb167a1af0a61a0be0980a38060d2041c6cced06f9f03679107930fc Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.241562 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr"] Jan 20 23:09:21 crc kubenswrapper[5030]: W0120 23:09:21.248664 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7410b4ea_f2f7_4350_aa51_0730f3463d7b.slice/crio-a2ea88e72dafe432d11de09c6b419c95e26f506a02eabbe379625de81f45611b WatchSource:0}: Error finding container a2ea88e72dafe432d11de09c6b419c95e26f506a02eabbe379625de81f45611b: Status 404 returned error can't find the container with id a2ea88e72dafe432d11de09c6b419c95e26f506a02eabbe379625de81f45611b Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.318347 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-n46gl"] Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.394491 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-wlqst"] Jan 20 23:09:21 crc kubenswrapper[5030]: W0120 23:09:21.398868 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711a3beb_9e88_42cd_86f6_0e88e52c05e3.slice/crio-ea39795d46ea9a2ee3c0f7aa51381bc57996c3a8d6ad95284a1b140a6a16178c WatchSource:0}: Error finding container ea39795d46ea9a2ee3c0f7aa51381bc57996c3a8d6ad95284a1b140a6a16178c: Status 404 returned error can't find the container with id ea39795d46ea9a2ee3c0f7aa51381bc57996c3a8d6ad95284a1b140a6a16178c Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.406570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" event={"ID":"1492b841-c24d-49a9-843c-7348405b87be","Type":"ContainerStarted","Data":"cca6f908acd7162a40f9d37f4bbfad061242444691b9999dbe8ee10acead624f"} Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.407428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" event={"ID":"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84","Type":"ContainerStarted","Data":"38f7cc03cb167a1af0a61a0be0980a38060d2041c6cced06f9f03679107930fc"} Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.408237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" 
event={"ID":"1daef7a7-0684-404a-8e5a-4850d692e21e","Type":"ContainerStarted","Data":"28c127d69150e91dd28335e555a8cd072e0c4386c896039a23e7a1165e840019"} Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.409053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" event={"ID":"711a3beb-9e88-42cd-86f6-0e88e52c05e3","Type":"ContainerStarted","Data":"ea39795d46ea9a2ee3c0f7aa51381bc57996c3a8d6ad95284a1b140a6a16178c"} Jan 20 23:09:21 crc kubenswrapper[5030]: I0120 23:09:21.409890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" event={"ID":"7410b4ea-f2f7-4350-aa51-0730f3463d7b","Type":"ContainerStarted","Data":"a2ea88e72dafe432d11de09c6b419c95e26f506a02eabbe379625de81f45611b"} Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.376133 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.378029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.380097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.380889 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.381148 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.381317 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-vqg9l" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.381512 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.381803 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.382329 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.398207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445587 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwbv\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.445703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546713 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwbv\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.546912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547606 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.547734 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.548022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.548671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.552520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.552816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.559228 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.564445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.570913 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.575501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwbv\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv\") pod \"rabbitmq-server-0\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:24 crc kubenswrapper[5030]: I0120 23:09:24.700731 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.746055 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.747460 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.749249 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.750155 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.753648 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-rzlfs" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.753841 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.759885 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.760545 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5j6h\" (UniqueName: \"kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 
23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.768608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869213 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5j6h\" (UniqueName: \"kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869448 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.869991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.870458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.870888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.871788 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.876378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.881196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.889716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5j6h\" (UniqueName: \"kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:25 crc kubenswrapper[5030]: I0120 23:09:25.906736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.074036 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.167411 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.168387 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.172687 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.172698 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.172997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-cpgxv" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.184949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.185001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.185061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.185101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.185142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlddk\" (UniqueName: \"kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.199823 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.286794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlddk\" (UniqueName: 
\"kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.286887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.286934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.287019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.287096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.288014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.288417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.292941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.304995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.312854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlddk\" (UniqueName: \"kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk\") pod \"memcached-0\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:26 crc kubenswrapper[5030]: I0120 23:09:26.486812 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.163944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.174223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.478392 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" event={"ID":"1492b841-c24d-49a9-843c-7348405b87be","Type":"ContainerStarted","Data":"2cdd156bf92774d0957beb2e9bc0e0279ddd7eb49a14875438371e45cbaab9ff"} Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.496648 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-l4rnz" podStartSLOduration=1.898995122 podStartE2EDuration="7.496612378s" podCreationTimestamp="2026-01-20 23:09:20 +0000 UTC" firstStartedPulling="2026-01-20 23:09:21.051140806 +0000 UTC m=+2033.371401094" lastFinishedPulling="2026-01-20 23:09:26.648758062 +0000 UTC m=+2038.969018350" observedRunningTime="2026-01-20 23:09:27.493600075 +0000 UTC m=+2039.813860373" watchObservedRunningTime="2026-01-20 23:09:27.496612378 +0000 UTC m=+2039.816872676" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.500467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" event={"ID":"3a9c5ee4-1f00-4f1a-8147-f2ad854edb84","Type":"ContainerStarted","Data":"1220b827a129132603f25404a4be8cbb9adb7a5154531f42ce0a7996ba2f8f42"} Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.505491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" event={"ID":"711a3beb-9e88-42cd-86f6-0e88e52c05e3","Type":"ContainerStarted","Data":"52ce19147256b2fbd04eec2b3766ee499937e0a4ab5991ccd2db775c86dd2d6f"} Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.506031 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.507984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" event={"ID":"7410b4ea-f2f7-4350-aa51-0730f3463d7b","Type":"ContainerStarted","Data":"9e54a0eeaf48e0532542cbdc91d7f571cbf51ac3f4fa319b9cb2a94824bea8eb"} Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.520392 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j" podStartSLOduration=2.026314632 podStartE2EDuration="7.520377989s" podCreationTimestamp="2026-01-20 23:09:20 +0000 UTC" firstStartedPulling="2026-01-20 23:09:21.173208718 +0000 UTC m=+2033.493469006" lastFinishedPulling="2026-01-20 23:09:26.667272075 +0000 UTC m=+2038.987532363" observedRunningTime="2026-01-20 23:09:27.518484363 +0000 UTC m=+2039.838744671" watchObservedRunningTime="2026-01-20 23:09:27.520377989 +0000 UTC m=+2039.840638277" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.539752 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" podStartSLOduration=2.269570606 
podStartE2EDuration="7.539733852s" podCreationTimestamp="2026-01-20 23:09:20 +0000 UTC" firstStartedPulling="2026-01-20 23:09:21.400970983 +0000 UTC m=+2033.721231271" lastFinishedPulling="2026-01-20 23:09:26.671134229 +0000 UTC m=+2038.991394517" observedRunningTime="2026-01-20 23:09:27.533754626 +0000 UTC m=+2039.854014924" watchObservedRunningTime="2026-01-20 23:09:27.539733852 +0000 UTC m=+2039.859994150" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.552351 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr" podStartSLOduration=2.167697757 podStartE2EDuration="7.55233301s" podCreationTimestamp="2026-01-20 23:09:20 +0000 UTC" firstStartedPulling="2026-01-20 23:09:21.251012689 +0000 UTC m=+2033.571272977" lastFinishedPulling="2026-01-20 23:09:26.635647952 +0000 UTC m=+2038.955908230" observedRunningTime="2026-01-20 23:09:27.54946645 +0000 UTC m=+2039.869726738" watchObservedRunningTime="2026-01-20 23:09:27.55233301 +0000 UTC m=+2039.872593298" Jan 20 23:09:27 crc kubenswrapper[5030]: I0120 23:09:27.851339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.539295 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.540433 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.544529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-9qg6f" Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.552020 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.633124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdslf\" (UniqueName: \"kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf\") pod \"kube-state-metrics-0\" (UID: \"d7bb0c16-38ce-41d8-b090-842104679b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.755784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdslf\" (UniqueName: \"kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf\") pod \"kube-state-metrics-0\" (UID: \"d7bb0c16-38ce-41d8-b090-842104679b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.790310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdslf\" (UniqueName: \"kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf\") pod \"kube-state-metrics-0\" (UID: \"d7bb0c16-38ce-41d8-b090-842104679b2a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:28 crc kubenswrapper[5030]: I0120 23:09:28.861707 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.851385 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.852946 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.857946 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.858247 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.858367 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-r64td" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.858478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.858634 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.867162 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:09:30 crc kubenswrapper[5030]: W0120 23:09:30.871950 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fa7d356_e537_4b10_829b_9f07de284e64.slice/crio-4c852fa817a40b30745becbe4d10ea2f1b1e6ec84c4012411b67498fbde8845f WatchSource:0}: Error finding container 4c852fa817a40b30745becbe4d10ea2f1b1e6ec84c4012411b67498fbde8845f: Status 404 returned error can't find the container with id 4c852fa817a40b30745becbe4d10ea2f1b1e6ec84c4012411b67498fbde8845f Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxbp\" (UniqueName: \"kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:30 crc kubenswrapper[5030]: I0120 23:09:30.897683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.000564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxbp\" (UniqueName: \"kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.000897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.000961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.000982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.001013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 
23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.001059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.001096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.001123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.002251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.004286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.004739 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.005130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.014871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.019874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.025439 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.031401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxbp\" (UniqueName: \"kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.032382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"ovsdbserver-nb-0\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.054541 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:31 crc kubenswrapper[5030]: W0120 23:09:31.329976 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7bb0c16_38ce_41d8_b090_842104679b2a.slice/crio-ca1026e445e69d4b1c5e62011e842c0f518029fde20c197fa2e9cc1fd52040af WatchSource:0}: Error finding container ca1026e445e69d4b1c5e62011e842c0f518029fde20c197fa2e9cc1fd52040af: Status 404 returned error can't find the container with id ca1026e445e69d4b1c5e62011e842c0f518029fde20c197fa2e9cc1fd52040af Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.330194 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.493403 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:09:31 crc kubenswrapper[5030]: W0120 23:09:31.498137 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod682ea052_23fc_4878_ae8e_b5887a6e83b8.slice/crio-b603bfe9e8c779c496279c300c8a563436121eb38e61ef6fbc644da2b6132a0d WatchSource:0}: Error finding container b603bfe9e8c779c496279c300c8a563436121eb38e61ef6fbc644da2b6132a0d: Status 404 returned error can't find the container with id b603bfe9e8c779c496279c300c8a563436121eb38e61ef6fbc644da2b6132a0d Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.535474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" event={"ID":"1daef7a7-0684-404a-8e5a-4850d692e21e","Type":"ContainerStarted","Data":"ed6fd76f5aff658bf70bad099c3b35aa0463ce844c8af3a57f67518cbcae10d6"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.536847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerStarted","Data":"0bb5172cf37332f877e6edf9a908bfb7d962347a4504c2961384b777b72fd68d"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.536872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerStarted","Data":"4c852fa817a40b30745becbe4d10ea2f1b1e6ec84c4012411b67498fbde8845f"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 
23:09:31.538227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerStarted","Data":"b603bfe9e8c779c496279c300c8a563436121eb38e61ef6fbc644da2b6132a0d"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.539266 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerStarted","Data":"166c0118eb51409e199ccc5e924b57cf4a6300397f664a6f64c04671064b1d3c"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.540592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deba34a1-e52c-4ee9-b8fb-07dbe410184c","Type":"ContainerStarted","Data":"5c57eec86df36ec8b65b60ad4ae937344c23be3ea71cb4377819fbc39e8de8e1"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.540655 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deba34a1-e52c-4ee9-b8fb-07dbe410184c","Type":"ContainerStarted","Data":"2e2e01d73ac13c66e483610e5659b591aa8a25b91fc14d3bdce31748babff9df"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.541359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.545841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d7bb0c16-38ce-41d8-b090-842104679b2a","Type":"ContainerStarted","Data":"ca1026e445e69d4b1c5e62011e842c0f518029fde20c197fa2e9cc1fd52040af"} Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.559613 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" podStartSLOduration=1.902957628 podStartE2EDuration="11.559592799s" podCreationTimestamp="2026-01-20 23:09:20 +0000 UTC" firstStartedPulling="2026-01-20 23:09:21.323113661 +0000 UTC m=+2033.643373949" lastFinishedPulling="2026-01-20 23:09:30.979748832 +0000 UTC m=+2043.300009120" observedRunningTime="2026-01-20 23:09:31.556067483 +0000 UTC m=+2043.876327781" watchObservedRunningTime="2026-01-20 23:09:31.559592799 +0000 UTC m=+2043.879853097" Jan 20 23:09:31 crc kubenswrapper[5030]: I0120 23:09:31.586094 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=5.586069096 podStartE2EDuration="5.586069096s" podCreationTimestamp="2026-01-20 23:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:31.580951551 +0000 UTC m=+2043.901211839" watchObservedRunningTime="2026-01-20 23:09:31.586069096 +0000 UTC m=+2043.906329384" Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.638906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerStarted","Data":"593b160210b2fb95026561fbd077c54222972c387ef711518ba53a962cd1ab83"} Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.639206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerStarted","Data":"84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6"} Jan 20 23:09:32 
crc kubenswrapper[5030]: I0120 23:09:32.664658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerStarted","Data":"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff"} Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.671263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d7bb0c16-38ce-41d8-b090-842104679b2a","Type":"ContainerStarted","Data":"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed"} Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.671426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.672423 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.680722 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-n46gl" Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.686108 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.686095253 podStartE2EDuration="3.686095253s" podCreationTimestamp="2026-01-20 23:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:32.674685064 +0000 UTC m=+2044.994945352" watchObservedRunningTime="2026-01-20 23:09:32.686095253 +0000 UTC m=+2045.006355541" Jan 20 23:09:32 crc kubenswrapper[5030]: I0120 23:09:32.751255 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=4.358179062 podStartE2EDuration="4.751236735s" podCreationTimestamp="2026-01-20 23:09:28 +0000 UTC" firstStartedPulling="2026-01-20 23:09:31.332333877 +0000 UTC m=+2043.652594165" lastFinishedPulling="2026-01-20 23:09:31.72539155 +0000 UTC m=+2044.045651838" observedRunningTime="2026-01-20 23:09:32.710787216 +0000 UTC m=+2045.031047524" watchObservedRunningTime="2026-01-20 23:09:32.751236735 +0000 UTC m=+2045.071497023" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.055557 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.101086 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.600834 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.601881 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.603535 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.603805 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.603938 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.604955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-b6wpm" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.612734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn68h\" (UniqueName: \"kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671462 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.671554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.684926 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fa7d356-e537-4b10-829b-9f07de284e64" containerID="0bb5172cf37332f877e6edf9a908bfb7d962347a4504c2961384b777b72fd68d" exitCode=0 Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.684997 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerDied","Data":"0bb5172cf37332f877e6edf9a908bfb7d962347a4504c2961384b777b72fd68d"} Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.685217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.772973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn68h\" (UniqueName: \"kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.773968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.774000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.774567 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.775060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.777269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.781079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.781301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.782843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.796639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.797303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn68h\" (UniqueName: \"kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h\") pod \"ovsdbserver-sb-0\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:34 crc kubenswrapper[5030]: I0120 23:09:34.944057 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:35 crc kubenswrapper[5030]: I0120 23:09:35.693416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerStarted","Data":"33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c"} Jan 20 23:09:35 crc kubenswrapper[5030]: I0120 23:09:35.722291 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=11.722272216 podStartE2EDuration="11.722272216s" podCreationTimestamp="2026-01-20 23:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:35.713332398 +0000 UTC m=+2048.033592686" watchObservedRunningTime="2026-01-20 23:09:35.722272216 +0000 UTC m=+2048.042532504" Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.075715 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.076049 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.093514 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.163763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:09:36 crc kubenswrapper[5030]: W0120 23:09:36.169822 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba022bd_4168_4c45_a924_061342f680dd.slice/crio-635d96c387b5636a89141b1cc720c935902a4f4347411c5eb8d998372c7330fe WatchSource:0}: Error finding container 635d96c387b5636a89141b1cc720c935902a4f4347411c5eb8d998372c7330fe: Status 404 returned error can't find the container with id 635d96c387b5636a89141b1cc720c935902a4f4347411c5eb8d998372c7330fe Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.488539 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.707962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerStarted","Data":"ae6ead157a67c32e8e554cbbc941bac6690f0a81f5616ed9ead93667b7e0e30f"} Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.708008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerStarted","Data":"9ec086e5497da0177834476824c0a381c6c4c92f9315ab92e56f076e2aee3773"} Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.708020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerStarted","Data":"635d96c387b5636a89141b1cc720c935902a4f4347411c5eb8d998372c7330fe"} Jan 20 23:09:36 crc kubenswrapper[5030]: I0120 23:09:36.729042 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.729023064 podStartE2EDuration="3.729023064s" podCreationTimestamp="2026-01-20 23:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:36.726751548 +0000 UTC m=+2049.047011846" watchObservedRunningTime="2026-01-20 23:09:36.729023064 +0000 UTC m=+2049.049283352" Jan 20 23:09:37 crc kubenswrapper[5030]: I0120 23:09:37.944428 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:38 crc kubenswrapper[5030]: I0120 23:09:38.475637 5030 scope.go:117] "RemoveContainer" containerID="9a906110591b1c9cf594b5c7f6ade4774e1be0f703f6d02471b591d8d76c6641" Jan 20 23:09:38 crc kubenswrapper[5030]: I0120 23:09:38.503001 5030 scope.go:117] "RemoveContainer" containerID="f1d8809b1d196273dc294cf293af381f668786c5df69401b667fed9278b6508b" Jan 20 23:09:38 crc kubenswrapper[5030]: I0120 23:09:38.875180 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.612885 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.615291 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.619593 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-tls-assets-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.619923 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-cluster-tls-config" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.620066 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-generated" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.620197 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"alertmanager-metric-storage-web-config" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.620605 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-alertmanager-dockercfg-mf5pq" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.685124 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.743981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn95g\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744174 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.744447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.824871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.829325 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.831133 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.831381 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.831706 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-vvgkp" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.832124 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.845642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.845685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.845722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.845751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.845790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.846572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.846607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn95g\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.847097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.852261 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.853127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.853137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.853610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.858938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.871067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " 
pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.888694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn95g\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g\") pod \"alertmanager-metric-storage-0\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.931944 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.945706 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.948542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzdc\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.948678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.948708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.948745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:39 crc kubenswrapper[5030]: I0120 23:09:39.948768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.050461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.050512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.050586 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.050608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.050673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzdc\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.053172 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.053203 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.053250 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift podName:70a9dd1f-f075-4833-b9e8-3fc856743ab8 nodeName:}" failed. No retries permitted until 2026-01-20 23:09:40.553232635 +0000 UTC m=+2052.873492923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift") pod "swift-storage-0" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8") : configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.053362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.053430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.053537 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.083519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzdc\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.097111 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.157246 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.157294 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.181107 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.277114 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.454089 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:09:40 crc kubenswrapper[5030]: W0120 23:09:40.466085 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf331453a_4129_49ea_b554_06f628abf66a.slice/crio-f6ffe7728ca88ceb4d9f43aa44a217ee69da22ab5eeb0584913769b7d72ad978 WatchSource:0}: Error finding container f6ffe7728ca88ceb4d9f43aa44a217ee69da22ab5eeb0584913769b7d72ad978: Status 404 returned error can't find the container with id f6ffe7728ca88ceb4d9f43aa44a217ee69da22ab5eeb0584913769b7d72ad978 Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.559892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.560102 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.560129 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: E0120 23:09:40.560189 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift podName:70a9dd1f-f075-4833-b9e8-3fc856743ab8 nodeName:}" failed. No retries permitted until 2026-01-20 23:09:41.560169701 +0000 UTC m=+2053.880429989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift") pod "swift-storage-0" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8") : configmap "swift-ring-files" not found Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.735755 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerStarted","Data":"f6ffe7728ca88ceb4d9f43aa44a217ee69da22ab5eeb0584913769b7d72ad978"} Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.873469 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-wlqst" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.939399 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.941154 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.942866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-web-config" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.943689 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-prometheus-dockercfg-82v6w" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.943895 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.943994 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-1" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.944222 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.944329 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.944406 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-2" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.950593 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-tls-assets-0" Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.966157 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:09:40 crc kubenswrapper[5030]: I0120 23:09:40.988001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77ts\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.067955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 
23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f77ts\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.169771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.171788 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.171835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.172077 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.172964 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.175973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.176514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.178936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.178968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.190991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.191819 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77ts\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.235053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.256932 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.575317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:41 crc kubenswrapper[5030]: E0120 23:09:41.575513 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:09:41 crc kubenswrapper[5030]: E0120 23:09:41.575537 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:09:41 crc kubenswrapper[5030]: E0120 23:09:41.575588 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift podName:70a9dd1f-f075-4833-b9e8-3fc856743ab8 nodeName:}" failed. No retries permitted until 2026-01-20 23:09:43.57557182 +0000 UTC m=+2055.895832108 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift") pod "swift-storage-0" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8") : configmap "swift-ring-files" not found Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.674991 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:09:41 crc kubenswrapper[5030]: W0120 23:09:41.683188 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4c6ac6_9b26_4047_a309_eb01bf39412f.slice/crio-4d379d1cc61c7cdd4213d3b6410a63ed3821d13956cc3aefa96cf4740f07fe7c WatchSource:0}: Error finding container 4d379d1cc61c7cdd4213d3b6410a63ed3821d13956cc3aefa96cf4740f07fe7c: Status 404 returned error can't find the container with id 4d379d1cc61c7cdd4213d3b6410a63ed3821d13956cc3aefa96cf4740f07fe7c Jan 20 23:09:41 crc kubenswrapper[5030]: I0120 23:09:41.743607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerStarted","Data":"4d379d1cc61c7cdd4213d3b6410a63ed3821d13956cc3aefa96cf4740f07fe7c"} Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.603773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:43 crc kubenswrapper[5030]: E0120 23:09:43.603972 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:09:43 crc kubenswrapper[5030]: E0120 23:09:43.604132 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:09:43 crc kubenswrapper[5030]: E0120 23:09:43.604192 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift podName:70a9dd1f-f075-4833-b9e8-3fc856743ab8 nodeName:}" failed. No retries permitted until 2026-01-20 23:09:47.604173665 +0000 UTC m=+2059.924433963 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift") pod "swift-storage-0" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8") : configmap "swift-ring-files" not found Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.704638 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-5mkz5"] Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.705565 5030 util.go:30] "No sandbox for pod can be found. 
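The etc-swift failures above come from the kubelet's per-operation retry handling: each failed MountVolume.SetUp schedules the next attempt with a doubled delay, visible here as durationBeforeRetry 2s and then 4s. A minimal sketch of that doubling schedule, assuming the observed 2s starting delay and an arbitrary 2m cap (the real cap is not visible in this log):

    // Reproduce the doubling durationBeforeRetry schedule seen above.
    // The 2s starting value comes from the log; the 2m cap is an assumption,
    // not taken from kubelet source.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 2 * time.Second
        const maxDelay = 2 * time.Minute // assumed cap
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: next retry in %s\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
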
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.707148 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.707307 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.707155 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.724730 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-5mkz5"] Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.742238 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-5mkz5"] Jan 20 23:09:43 crc kubenswrapper[5030]: E0120 23:09:43.742852 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-4m8dt ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-4m8dt ring-data-devices scripts swiftconf]: context canceled" pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" podUID="15a69469-57be-4bc5-838f-ed224c86fd94" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.762472 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2qn2g"] Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.763486 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.767505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.778207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2qn2g"] Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.781768 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.807848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.807914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.808000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.808091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8dt\" (UniqueName: \"kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.808215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.808287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.808338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.909993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66b8b\" (UniqueName: \"kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910404 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.910469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8dt\" (UniqueName: \"kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.912296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.913667 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.913920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.920226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.920407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.921309 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:43 crc kubenswrapper[5030]: I0120 23:09:43.934423 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8dt\" (UniqueName: \"kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt\") pod \"swift-ring-rebalance-5mkz5\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.011907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.011950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.011981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m8dt\" (UniqueName: \"kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf\") pod \"15a69469-57be-4bc5-838f-ed224c86fd94\" (UID: \"15a69469-57be-4bc5-838f-ed224c86fd94\") " Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66b8b\" (UniqueName: \"kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012608 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/15a69469-57be-4bc5-838f-ed224c86fd94-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.012919 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.013101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.013289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.013409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts" (OuterVolumeSpecName: "scripts") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.013574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.019080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.019255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.019606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.019759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.019968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.021121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt" (OuterVolumeSpecName: "kube-api-access-4m8dt") pod "15a69469-57be-4bc5-838f-ed224c86fd94" (UID: "15a69469-57be-4bc5-838f-ed224c86fd94"). InnerVolumeSpecName "kube-api-access-4m8dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.037216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.039428 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66b8b\" (UniqueName: \"kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b\") pod \"swift-ring-rebalance-2qn2g\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.080680 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122243 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122284 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122295 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/15a69469-57be-4bc5-838f-ed224c86fd94-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122304 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122316 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m8dt\" (UniqueName: \"kubernetes.io/projected/15a69469-57be-4bc5-838f-ed224c86fd94-kube-api-access-4m8dt\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.122326 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/15a69469-57be-4bc5-838f-ed224c86fd94-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.726449 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2qn2g"] Jan 20 23:09:44 crc kubenswrapper[5030]: W0120 23:09:44.733332 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20d1400_78ce_4d6d_b499_359aca94def8.slice/crio-ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1 WatchSource:0}: Error finding container ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1: Status 404 returned error can't find the container with id ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1 Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.799432 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fc6vh"] Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.840743 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.850524 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fc6vh"] Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.851875 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.852061 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-5mkz5" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.852060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" event={"ID":"c20d1400-78ce-4d6d-b499-359aca94def8","Type":"ContainerStarted","Data":"ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1"} Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.943927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gqr\" (UniqueName: \"kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.944002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.958342 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-5mkz5"] Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.983098 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-5mkz5"] Jan 20 23:09:44 crc kubenswrapper[5030]: I0120 23:09:44.991912 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.050554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gqr\" (UniqueName: \"kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.050654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.052019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.096281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gqr\" (UniqueName: \"kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr\") pod \"root-account-create-update-fc6vh\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.193672 5030 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.430256 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.432012 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.438591 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.438799 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.438916 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-4rtfq" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.439140 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.449023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzc4m\" (UniqueName: \"kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560766 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.560847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzc4m\" (UniqueName: \"kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.662232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.663060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.663559 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.664078 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.672677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.673335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.674129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.683937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzc4m\" (UniqueName: \"kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m\") pod \"ovn-northd-0\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.743765 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fc6vh"] Jan 20 23:09:45 crc kubenswrapper[5030]: W0120 23:09:45.753580 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aad0d16_4da7_47d6_8707_e0bf8fa3740d.slice/crio-40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c WatchSource:0}: Error finding container 40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c: Status 404 returned error can't find the container with id 40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.753922 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.871909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" event={"ID":"c20d1400-78ce-4d6d-b499-359aca94def8","Type":"ContainerStarted","Data":"887ccad86af31ce3e1594fcc7349c3577a0c6d31033210df78179bff8c7f5ceb"} Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.893519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" event={"ID":"6aad0d16-4da7-47d6-8707-e0bf8fa3740d","Type":"ContainerStarted","Data":"40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c"} Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.899406 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" podStartSLOduration=2.899389135 podStartE2EDuration="2.899389135s" podCreationTimestamp="2026-01-20 23:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:45.895577901 +0000 UTC m=+2058.215838179" watchObservedRunningTime="2026-01-20 23:09:45.899389135 +0000 UTC m=+2058.219649423" Jan 20 23:09:45 crc kubenswrapper[5030]: I0120 23:09:45.980238 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a69469-57be-4bc5-838f-ed224c86fd94" path="/var/lib/kubelet/pods/15a69469-57be-4bc5-838f-ed224c86fd94/volumes" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.180327 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-v9ngp"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.181506 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.206684 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.207713 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.216977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.229657 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-v9ngp"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.263747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.286393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.286470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl4cv\" (UniqueName: \"kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.286521 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.286546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2hcg\" (UniqueName: \"kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.390414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.390561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl4cv\" (UniqueName: \"kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.390613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " 
pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.390667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2hcg\" (UniqueName: \"kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.391652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.392235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.393164 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.424519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2hcg\" (UniqueName: \"kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg\") pod \"keystone-a5ce-account-create-update-tzcng\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.435121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl4cv\" (UniqueName: \"kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv\") pod \"keystone-db-create-v9ngp\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.501685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-jct9z"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.503096 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.506971 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.536484 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-4528-account-create-update-8v25n"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.537423 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.549021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.560675 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.572616 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jct9z"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.581028 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-4528-account-create-update-8v25n"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.594539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ml8\" (UniqueName: \"kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8\") pod \"placement-db-create-jct9z\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.594613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts\") pod \"placement-db-create-jct9z\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.594667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.594728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrf8k\" (UniqueName: \"kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.698971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrf8k\" (UniqueName: \"kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.699065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ml8\" (UniqueName: \"kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8\") pod \"placement-db-create-jct9z\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.699123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts\") pod \"placement-db-create-jct9z\" (UID: 
\"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.699151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.700240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.704826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts\") pod \"placement-db-create-jct9z\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.704883 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-zn648"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.715105 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.743215 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-zn648"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.797220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrf8k\" (UniqueName: \"kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k\") pod \"placement-4528-account-create-update-8v25n\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.805275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnp64\" (UniqueName: \"kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.805295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ml8\" (UniqueName: \"kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8\") pod \"placement-db-create-jct9z\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.805543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 
23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.837839 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.843200 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.867212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.915500 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.915639 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnp64\" (UniqueName: \"kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.919410 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8"] Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.920129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.945939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnp64\" (UniqueName: \"kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64\") pod \"glance-db-create-zn648\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:46 crc kubenswrapper[5030]: I0120 23:09:46.961596 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.015415 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.016356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c8rb\" (UniqueName: \"kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb\") pod \"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.016460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts\") pod \"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.018598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerStarted","Data":"dbc6123b56a3c21f27e32f1cb0e8c8a4b6d91827e4c18576685a7383b2df5bd4"} Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.029826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" event={"ID":"6aad0d16-4da7-47d6-8707-e0bf8fa3740d","Type":"ContainerStarted","Data":"05a36c447531d8197fe63c5039928766cf8cd79248e11ef0aabf190a64554fc4"} Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.065808 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.077850 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" podStartSLOduration=3.077831907 podStartE2EDuration="3.077831907s" podCreationTimestamp="2026-01-20 23:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:47.049095096 +0000 UTC m=+2059.369355384" watchObservedRunningTime="2026-01-20 23:09:47.077831907 +0000 UTC m=+2059.398092195" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.121434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c8rb\" (UniqueName: \"kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb\") pod \"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.121648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts\") pod \"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.124242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts\") pod 
\"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.146670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c8rb\" (UniqueName: \"kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb\") pod \"glance-d1e5-account-create-update-fsnq8\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.330216 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.352893 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng"] Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.591527 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-v9ngp"] Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.631450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:47 crc kubenswrapper[5030]: E0120 23:09:47.631676 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:09:47 crc kubenswrapper[5030]: E0120 23:09:47.631690 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:09:47 crc kubenswrapper[5030]: E0120 23:09:47.631734 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift podName:70a9dd1f-f075-4833-b9e8-3fc856743ab8 nodeName:}" failed. No retries permitted until 2026-01-20 23:09:55.631719861 +0000 UTC m=+2067.951980139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift") pod "swift-storage-0" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8") : configmap "swift-ring-files" not found Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.801246 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-zn648"] Jan 20 23:09:47 crc kubenswrapper[5030]: I0120 23:09:47.912927 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-4528-account-create-update-8v25n"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.037294 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jct9z"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.037789 5030 generic.go:334] "Generic (PLEG): container finished" podID="6aad0d16-4da7-47d6-8707-e0bf8fa3740d" containerID="05a36c447531d8197fe63c5039928766cf8cd79248e11ef0aabf190a64554fc4" exitCode=0 Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.037873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" event={"ID":"6aad0d16-4da7-47d6-8707-e0bf8fa3740d","Type":"ContainerDied","Data":"05a36c447531d8197fe63c5039928766cf8cd79248e11ef0aabf190a64554fc4"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.050325 5030 generic.go:334] "Generic (PLEG): container finished" podID="c70a1a36-5e84-42ef-bafd-cddecb57dfd7" containerID="ad07a3cdfc37fecbb9e6cd9f1b6d7bbe467ef4063fc636147ca3c18406a5de63" exitCode=0 Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.050441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" event={"ID":"c70a1a36-5e84-42ef-bafd-cddecb57dfd7","Type":"ContainerDied","Data":"ad07a3cdfc37fecbb9e6cd9f1b6d7bbe467ef4063fc636147ca3c18406a5de63"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.050495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" event={"ID":"c70a1a36-5e84-42ef-bafd-cddecb57dfd7","Type":"ContainerStarted","Data":"f2123f9b11a1936fc7f7aed1beec7e68ef01a850196012537edbe5c1a73af88d"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.053169 5030 generic.go:334] "Generic (PLEG): container finished" podID="48e2b069-8ada-46bf-9b74-8f899c5fb3b2" containerID="bd0d7f8c3ddd224fae9dbfe730191a14cb606035d6b54b8853968ab3874c0688" exitCode=0 Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.053252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" event={"ID":"48e2b069-8ada-46bf-9b74-8f899c5fb3b2","Type":"ContainerDied","Data":"bd0d7f8c3ddd224fae9dbfe730191a14cb606035d6b54b8853968ab3874c0688"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.053285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" event={"ID":"48e2b069-8ada-46bf-9b74-8f899c5fb3b2","Type":"ContainerStarted","Data":"13bb8feffadcc9a27a68c49643586a94fd38c2a6a2562c72575e56575c474587"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.060015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerStarted","Data":"d81bac0da6673890a9c30a104bdda356d42b4ea1937571eec0145eb0c3b6b9ec"} Jan 20 23:09:48 crc kubenswrapper[5030]: 
I0120 23:09:48.060062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerStarted","Data":"a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc"} Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.060241 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.069207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.136494 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.136477653 podStartE2EDuration="3.136477653s" podCreationTimestamp="2026-01-20 23:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:48.132147547 +0000 UTC m=+2060.452407845" watchObservedRunningTime="2026-01-20 23:09:48.136477653 +0000 UTC m=+2060.456737941" Jan 20 23:09:48 crc kubenswrapper[5030]: W0120 23:09:48.495214 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf14d7d3_54c6_4a6c_996a_07be5a7f4261.slice/crio-37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9 WatchSource:0}: Error finding container 37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9: Status 404 returned error can't find the container with id 37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9 Jan 20 23:09:48 crc kubenswrapper[5030]: W0120 23:09:48.505924 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeff49bf_c912_4af2_98b5_97ea9cca86e4.slice/crio-ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48 WatchSource:0}: Error finding container ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48: Status 404 returned error can't find the container with id ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48 Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.740902 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-db-create-lgbqp"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.742603 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.758147 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-lgbqp"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.860895 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.862544 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.863043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvd7\" (UniqueName: \"kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.864677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.865400 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-db-secret" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.869959 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt"] Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.966016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvd7\" (UniqueName: \"kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.966091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.966133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.966212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtln2\" (UniqueName: \"kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:48 crc kubenswrapper[5030]: I0120 23:09:48.967438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.042968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvd7\" 
(UniqueName: \"kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7\") pod \"watcher-db-create-lgbqp\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.069786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtln2\" (UniqueName: \"kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.069879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.070669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.076656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-zn648" event={"ID":"feff49bf-c912-4af2-98b5-97ea9cca86e4","Type":"ContainerStarted","Data":"ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48"} Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.078553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" event={"ID":"cf0297cb-4da9-453d-ab16-b516a4a45987","Type":"ContainerStarted","Data":"5f7d35b2a367f9318f220339f6dee3e01642d38b191bb85dd167bc66c7052558"} Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.078596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" event={"ID":"cf0297cb-4da9-453d-ab16-b516a4a45987","Type":"ContainerStarted","Data":"9fe9c921890bd76073c26d8d9844f6e4254509f185b2bdaf960cdf2890300d5b"} Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.080787 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" event={"ID":"94adfb2f-fb7f-438c-83e0-2d356bcdb122","Type":"ContainerStarted","Data":"f0c319d434603d5cca5731e6e06e5cc5ab367ba74821077ad39f503ea413d420"} Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.080830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" event={"ID":"94adfb2f-fb7f-438c-83e0-2d356bcdb122","Type":"ContainerStarted","Data":"80b6880d5d2e036692b23f8de9e9acde7bd28540198037b5c5722ac773ccc7f2"} Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.082717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jct9z" event={"ID":"df14d7d3-54c6-4a6c-996a-07be5a7f4261","Type":"ContainerStarted","Data":"37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9"} Jan 20 23:09:49 crc 
kubenswrapper[5030]: I0120 23:09:49.104057 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" podStartSLOduration=3.104038953 podStartE2EDuration="3.104038953s" podCreationTimestamp="2026-01-20 23:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:49.096442968 +0000 UTC m=+2061.416703286" watchObservedRunningTime="2026-01-20 23:09:49.104038953 +0000 UTC m=+2061.424299241" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.115059 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.121008 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" podStartSLOduration=3.120987578 podStartE2EDuration="3.120987578s" podCreationTimestamp="2026-01-20 23:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:49.115585506 +0000 UTC m=+2061.435845794" watchObservedRunningTime="2026-01-20 23:09:49.120987578 +0000 UTC m=+2061.441247866" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.145293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtln2\" (UniqueName: \"kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2\") pod \"watcher-dbc5-account-create-update-dr9qt\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.183524 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.615049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.632224 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.666655 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.681893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2hcg\" (UniqueName: \"kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg\") pod \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.681942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts\") pod \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\" (UID: \"48e2b069-8ada-46bf-9b74-8f899c5fb3b2\") " Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.682824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48e2b069-8ada-46bf-9b74-8f899c5fb3b2" (UID: "48e2b069-8ada-46bf-9b74-8f899c5fb3b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.694330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg" (OuterVolumeSpecName: "kube-api-access-s2hcg") pod "48e2b069-8ada-46bf-9b74-8f899c5fb3b2" (UID: "48e2b069-8ada-46bf-9b74-8f899c5fb3b2"). InnerVolumeSpecName "kube-api-access-s2hcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: W0120 23:09:49.752027 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fe039bb_1c22_493b_a545_1f2843b9330e.slice/crio-086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60 WatchSource:0}: Error finding container 086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60: Status 404 returned error can't find the container with id 086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60 Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.753946 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-lgbqp"] Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.767024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt"] Jan 20 23:09:49 crc kubenswrapper[5030]: W0120 23:09:49.776095 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b1cf97_227f_4c53_b2a4_b272d8803ae7.slice/crio-063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a WatchSource:0}: Error finding container 063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a: Status 404 returned error can't find the container with id 063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.782992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2gqr\" (UniqueName: \"kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr\") pod \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " Jan 20 23:09:49 crc 
kubenswrapper[5030]: I0120 23:09:49.783035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts\") pod \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\" (UID: \"6aad0d16-4da7-47d6-8707-e0bf8fa3740d\") " Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.783136 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts\") pod \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.783204 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl4cv\" (UniqueName: \"kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv\") pod \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\" (UID: \"c70a1a36-5e84-42ef-bafd-cddecb57dfd7\") " Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.783495 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2hcg\" (UniqueName: \"kubernetes.io/projected/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-kube-api-access-s2hcg\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.783512 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48e2b069-8ada-46bf-9b74-8f899c5fb3b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.783804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6aad0d16-4da7-47d6-8707-e0bf8fa3740d" (UID: "6aad0d16-4da7-47d6-8707-e0bf8fa3740d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.784287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c70a1a36-5e84-42ef-bafd-cddecb57dfd7" (UID: "c70a1a36-5e84-42ef-bafd-cddecb57dfd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.787353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv" (OuterVolumeSpecName: "kube-api-access-xl4cv") pod "c70a1a36-5e84-42ef-bafd-cddecb57dfd7" (UID: "c70a1a36-5e84-42ef-bafd-cddecb57dfd7"). InnerVolumeSpecName "kube-api-access-xl4cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.787720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr" (OuterVolumeSpecName: "kube-api-access-c2gqr") pod "6aad0d16-4da7-47d6-8707-e0bf8fa3740d" (UID: "6aad0d16-4da7-47d6-8707-e0bf8fa3740d"). InnerVolumeSpecName "kube-api-access-c2gqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.885805 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.886104 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl4cv\" (UniqueName: \"kubernetes.io/projected/c70a1a36-5e84-42ef-bafd-cddecb57dfd7-kube-api-access-xl4cv\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.886116 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2gqr\" (UniqueName: \"kubernetes.io/projected/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-kube-api-access-c2gqr\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:49 crc kubenswrapper[5030]: I0120 23:09:49.886125 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aad0d16-4da7-47d6-8707-e0bf8fa3740d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.105313 5030 generic.go:334] "Generic (PLEG): container finished" podID="df14d7d3-54c6-4a6c-996a-07be5a7f4261" containerID="ad8326da150ba9e0b7c9cc03399f4b3024338fea28aab1d3e8440ad17897133b" exitCode=0 Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.105437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jct9z" event={"ID":"df14d7d3-54c6-4a6c-996a-07be5a7f4261","Type":"ContainerDied","Data":"ad8326da150ba9e0b7c9cc03399f4b3024338fea28aab1d3e8440ad17897133b"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.113848 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.113875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-fc6vh" event={"ID":"6aad0d16-4da7-47d6-8707-e0bf8fa3740d","Type":"ContainerDied","Data":"40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.113923 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40212e3089b256927035893247a60c274de7adbec91897fc7331c53f8518614c" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.120309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" event={"ID":"c70a1a36-5e84-42ef-bafd-cddecb57dfd7","Type":"ContainerDied","Data":"f2123f9b11a1936fc7f7aed1beec7e68ef01a850196012537edbe5c1a73af88d"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.120358 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2123f9b11a1936fc7f7aed1beec7e68ef01a850196012537edbe5c1a73af88d" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.120334 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-v9ngp" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.122804 5030 generic.go:334] "Generic (PLEG): container finished" podID="feff49bf-c912-4af2-98b5-97ea9cca86e4" containerID="7a40b1202cf51379eb919ca23664e6520b4fcd12c201fcc3b3d03e4d4583657b" exitCode=0 Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.122978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-zn648" event={"ID":"feff49bf-c912-4af2-98b5-97ea9cca86e4","Type":"ContainerDied","Data":"7a40b1202cf51379eb919ca23664e6520b4fcd12c201fcc3b3d03e4d4583657b"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.130616 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf0297cb-4da9-453d-ab16-b516a4a45987" containerID="5f7d35b2a367f9318f220339f6dee3e01642d38b191bb85dd167bc66c7052558" exitCode=0 Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.130703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" event={"ID":"cf0297cb-4da9-453d-ab16-b516a4a45987","Type":"ContainerDied","Data":"5f7d35b2a367f9318f220339f6dee3e01642d38b191bb85dd167bc66c7052558"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.133540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" event={"ID":"48e2b069-8ada-46bf-9b74-8f899c5fb3b2","Type":"ContainerDied","Data":"13bb8feffadcc9a27a68c49643586a94fd38c2a6a2562c72575e56575c474587"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.133583 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13bb8feffadcc9a27a68c49643586a94fd38c2a6a2562c72575e56575c474587" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.133684 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.135906 5030 generic.go:334] "Generic (PLEG): container finished" podID="94adfb2f-fb7f-438c-83e0-2d356bcdb122" containerID="f0c319d434603d5cca5731e6e06e5cc5ab367ba74821077ad39f503ea413d420" exitCode=0 Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.135960 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" event={"ID":"94adfb2f-fb7f-438c-83e0-2d356bcdb122","Type":"ContainerDied","Data":"f0c319d434603d5cca5731e6e06e5cc5ab367ba74821077ad39f503ea413d420"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.137499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" event={"ID":"5fe039bb-1c22-493b-a545-1f2843b9330e","Type":"ContainerStarted","Data":"086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.140288 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" event={"ID":"a9b1cf97-227f-4c53-b2a4-b272d8803ae7","Type":"ContainerStarted","Data":"eaa3c999fcab3e563feef80868cb66e702932c50fc14ab1246ef3976a2fc0297"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.140322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" event={"ID":"a9b1cf97-227f-4c53-b2a4-b272d8803ae7","Type":"ContainerStarted","Data":"063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a"} Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.181356 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" podStartSLOduration=2.181335376 podStartE2EDuration="2.181335376s" podCreationTimestamp="2026-01-20 23:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:50.17088582 +0000 UTC m=+2062.491146118" watchObservedRunningTime="2026-01-20 23:09:50.181335376 +0000 UTC m=+2062.501595674" Jan 20 23:09:50 crc kubenswrapper[5030]: I0120 23:09:50.212495 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" podStartSLOduration=2.212477616 podStartE2EDuration="2.212477616s" podCreationTimestamp="2026-01-20 23:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:50.204877351 +0000 UTC m=+2062.525137639" watchObservedRunningTime="2026-01-20 23:09:50.212477616 +0000 UTC m=+2062.532737894" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.158329 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9b1cf97-227f-4c53-b2a4-b272d8803ae7" containerID="eaa3c999fcab3e563feef80868cb66e702932c50fc14ab1246ef3976a2fc0297" exitCode=0 Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.158863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" event={"ID":"a9b1cf97-227f-4c53-b2a4-b272d8803ae7","Type":"ContainerDied","Data":"eaa3c999fcab3e563feef80868cb66e702932c50fc14ab1246ef3976a2fc0297"} Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.162552 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="5fe039bb-1c22-493b-a545-1f2843b9330e" containerID="282f387f459239e1abbb114ff819d5f8200b97d66e1dceb89e38ce1c553d65f3" exitCode=0 Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.162862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" event={"ID":"5fe039bb-1c22-493b-a545-1f2843b9330e","Type":"ContainerDied","Data":"282f387f459239e1abbb114ff819d5f8200b97d66e1dceb89e38ce1c553d65f3"} Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.640701 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.724144 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ml8\" (UniqueName: \"kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8\") pod \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.724296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts\") pod \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\" (UID: \"df14d7d3-54c6-4a6c-996a-07be5a7f4261\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.724903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df14d7d3-54c6-4a6c-996a-07be5a7f4261" (UID: "df14d7d3-54c6-4a6c-996a-07be5a7f4261"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.729321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8" (OuterVolumeSpecName: "kube-api-access-z8ml8") pod "df14d7d3-54c6-4a6c-996a-07be5a7f4261" (UID: "df14d7d3-54c6-4a6c-996a-07be5a7f4261"). InnerVolumeSpecName "kube-api-access-z8ml8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.751731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.756830 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.763140 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.826452 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8ml8\" (UniqueName: \"kubernetes.io/projected/df14d7d3-54c6-4a6c-996a-07be5a7f4261-kube-api-access-z8ml8\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.826500 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df14d7d3-54c6-4a6c-996a-07be5a7f4261-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.927766 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrf8k\" (UniqueName: \"kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k\") pod \"cf0297cb-4da9-453d-ab16-b516a4a45987\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.927809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnp64\" (UniqueName: \"kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64\") pod \"feff49bf-c912-4af2-98b5-97ea9cca86e4\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.927875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts\") pod \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.927910 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts\") pod \"cf0297cb-4da9-453d-ab16-b516a4a45987\" (UID: \"cf0297cb-4da9-453d-ab16-b516a4a45987\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.928012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c8rb\" (UniqueName: \"kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb\") pod \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\" (UID: \"94adfb2f-fb7f-438c-83e0-2d356bcdb122\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.928033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts\") pod \"feff49bf-c912-4af2-98b5-97ea9cca86e4\" (UID: \"feff49bf-c912-4af2-98b5-97ea9cca86e4\") " Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.928743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf0297cb-4da9-453d-ab16-b516a4a45987" (UID: "cf0297cb-4da9-453d-ab16-b516a4a45987"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.928885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "feff49bf-c912-4af2-98b5-97ea9cca86e4" (UID: "feff49bf-c912-4af2-98b5-97ea9cca86e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.929454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94adfb2f-fb7f-438c-83e0-2d356bcdb122" (UID: "94adfb2f-fb7f-438c-83e0-2d356bcdb122"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.932379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb" (OuterVolumeSpecName: "kube-api-access-7c8rb") pod "94adfb2f-fb7f-438c-83e0-2d356bcdb122" (UID: "94adfb2f-fb7f-438c-83e0-2d356bcdb122"). InnerVolumeSpecName "kube-api-access-7c8rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.933593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64" (OuterVolumeSpecName: "kube-api-access-jnp64") pod "feff49bf-c912-4af2-98b5-97ea9cca86e4" (UID: "feff49bf-c912-4af2-98b5-97ea9cca86e4"). InnerVolumeSpecName "kube-api-access-jnp64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:51 crc kubenswrapper[5030]: I0120 23:09:51.934355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k" (OuterVolumeSpecName: "kube-api-access-qrf8k") pod "cf0297cb-4da9-453d-ab16-b516a4a45987" (UID: "cf0297cb-4da9-453d-ab16-b516a4a45987"). InnerVolumeSpecName "kube-api-access-qrf8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030356 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94adfb2f-fb7f-438c-83e0-2d356bcdb122-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030390 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf0297cb-4da9-453d-ab16-b516a4a45987-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030404 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c8rb\" (UniqueName: \"kubernetes.io/projected/94adfb2f-fb7f-438c-83e0-2d356bcdb122-kube-api-access-7c8rb\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030418 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feff49bf-c912-4af2-98b5-97ea9cca86e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030431 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrf8k\" (UniqueName: \"kubernetes.io/projected/cf0297cb-4da9-453d-ab16-b516a4a45987-kube-api-access-qrf8k\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.030442 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnp64\" (UniqueName: \"kubernetes.io/projected/feff49bf-c912-4af2-98b5-97ea9cca86e4-kube-api-access-jnp64\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.176479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-zn648" event={"ID":"feff49bf-c912-4af2-98b5-97ea9cca86e4","Type":"ContainerDied","Data":"ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.176538 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea205b01c032b3959d7fbfde2f1fec58cf76bb7cc198e4e7d20f1ef3073eaa48" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.176616 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-zn648" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.179560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" event={"ID":"cf0297cb-4da9-453d-ab16-b516a4a45987","Type":"ContainerDied","Data":"9fe9c921890bd76073c26d8d9844f6e4254509f185b2bdaf960cdf2890300d5b"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.179718 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fe9c921890bd76073c26d8d9844f6e4254509f185b2bdaf960cdf2890300d5b" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.179821 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-4528-account-create-update-8v25n" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.182981 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" event={"ID":"94adfb2f-fb7f-438c-83e0-2d356bcdb122","Type":"ContainerDied","Data":"80b6880d5d2e036692b23f8de9e9acde7bd28540198037b5c5722ac773ccc7f2"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.183015 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.183028 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b6880d5d2e036692b23f8de9e9acde7bd28540198037b5c5722ac773ccc7f2" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.186098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerStarted","Data":"81c0add93369774cfdec79815a42700be1d5e4729d4c196bfc35f4a4f82494d7"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.188216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jct9z" event={"ID":"df14d7d3-54c6-4a6c-996a-07be5a7f4261","Type":"ContainerDied","Data":"37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.188251 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jct9z" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.188261 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37a0cf4491d2a3237c86e63be55be5aae5c040571bb8e1f6c71a2aefebd8c7c9" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.190555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerStarted","Data":"adc1e639f97f48f66f61a2912c7d0b9d66666e05456c9a9d9aaa15a28a4004ea"} Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.457748 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.536867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts\") pod \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.536964 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtln2\" (UniqueName: \"kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2\") pod \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\" (UID: \"a9b1cf97-227f-4c53-b2a4-b272d8803ae7\") " Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.537695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9b1cf97-227f-4c53-b2a4-b272d8803ae7" (UID: "a9b1cf97-227f-4c53-b2a4-b272d8803ae7"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.540446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2" (OuterVolumeSpecName: "kube-api-access-vtln2") pod "a9b1cf97-227f-4c53-b2a4-b272d8803ae7" (UID: "a9b1cf97-227f-4c53-b2a4-b272d8803ae7"). InnerVolumeSpecName "kube-api-access-vtln2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.639864 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.639939 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtln2\" (UniqueName: \"kubernetes.io/projected/a9b1cf97-227f-4c53-b2a4-b272d8803ae7-kube-api-access-vtln2\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.643382 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.740814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvd7\" (UniqueName: \"kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7\") pod \"5fe039bb-1c22-493b-a545-1f2843b9330e\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.741008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts\") pod \"5fe039bb-1c22-493b-a545-1f2843b9330e\" (UID: \"5fe039bb-1c22-493b-a545-1f2843b9330e\") " Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.741463 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fe039bb-1c22-493b-a545-1f2843b9330e" (UID: "5fe039bb-1c22-493b-a545-1f2843b9330e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.744013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7" (OuterVolumeSpecName: "kube-api-access-nfvd7") pod "5fe039bb-1c22-493b-a545-1f2843b9330e" (UID: "5fe039bb-1c22-493b-a545-1f2843b9330e"). InnerVolumeSpecName "kube-api-access-nfvd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.843526 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvd7\" (UniqueName: \"kubernetes.io/projected/5fe039bb-1c22-493b-a545-1f2843b9330e-kube-api-access-nfvd7\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:52 crc kubenswrapper[5030]: I0120 23:09:52.843889 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fe039bb-1c22-493b-a545-1f2843b9330e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.205788 5030 generic.go:334] "Generic (PLEG): container finished" podID="c20d1400-78ce-4d6d-b499-359aca94def8" containerID="887ccad86af31ce3e1594fcc7349c3577a0c6d31033210df78179bff8c7f5ceb" exitCode=0 Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.205937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" event={"ID":"c20d1400-78ce-4d6d-b499-359aca94def8","Type":"ContainerDied","Data":"887ccad86af31ce3e1594fcc7349c3577a0c6d31033210df78179bff8c7f5ceb"} Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.209500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" event={"ID":"5fe039bb-1c22-493b-a545-1f2843b9330e","Type":"ContainerDied","Data":"086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60"} Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.209739 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="086190e650dee3d29b231576989374d1715c98541a312b6517afe61753c21d60" Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.209950 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-create-lgbqp" Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.212778 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.212821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt" event={"ID":"a9b1cf97-227f-4c53-b2a4-b272d8803ae7","Type":"ContainerDied","Data":"063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a"} Jan 20 23:09:53 crc kubenswrapper[5030]: I0120 23:09:53.212860 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063956c1442233070a0593f67ed86b9576a49e732dcc35c79d54541534f49a3a" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.616351 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.774899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.774980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.775017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.775093 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.775126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.775157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.775212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66b8b\" (UniqueName: \"kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b\") pod \"c20d1400-78ce-4d6d-b499-359aca94def8\" (UID: \"c20d1400-78ce-4d6d-b499-359aca94def8\") " Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.776132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.776970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.780100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b" (OuterVolumeSpecName: "kube-api-access-66b8b") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "kube-api-access-66b8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.787561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.807071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts" (OuterVolumeSpecName: "scripts") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.811689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.820831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c20d1400-78ce-4d6d-b499-359aca94def8" (UID: "c20d1400-78ce-4d6d-b499-359aca94def8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877038 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66b8b\" (UniqueName: \"kubernetes.io/projected/c20d1400-78ce-4d6d-b499-359aca94def8-kube-api-access-66b8b\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877104 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877126 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c20d1400-78ce-4d6d-b499-359aca94def8-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877146 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877167 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877189 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c20d1400-78ce-4d6d-b499-359aca94def8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:54 crc kubenswrapper[5030]: I0120 23:09:54.877245 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c20d1400-78ce-4d6d-b499-359aca94def8-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.230019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" event={"ID":"c20d1400-78ce-4d6d-b499-359aca94def8","Type":"ContainerDied","Data":"ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1"} Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.230417 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7dd77b8f8d0d1f0ca73efdf7a397596f4c7a712927fea0f740402376aec3e1" Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.230481 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-2qn2g" Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.690684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.697324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"swift-storage-0\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:55 crc kubenswrapper[5030]: I0120 23:09:55.744684 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:09:56 crc kubenswrapper[5030]: I0120 23:09:56.268603 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:09:56 crc kubenswrapper[5030]: W0120 23:09:56.272613 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a9dd1f_f075_4833_b9e8_3fc856743ab8.slice/crio-02ab03e5fc6eda916c9515fe0f901ffcfb5ff6663636096457e50cde6498c21a WatchSource:0}: Error finding container 02ab03e5fc6eda916c9515fe0f901ffcfb5ff6663636096457e50cde6498c21a: Status 404 returned error can't find the container with id 02ab03e5fc6eda916c9515fe0f901ffcfb5ff6663636096457e50cde6498c21a Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.028431 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-s6lbv"] Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029258 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20d1400-78ce-4d6d-b499-359aca94def8" containerName="swift-ring-rebalance" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029271 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20d1400-78ce-4d6d-b499-359aca94def8" containerName="swift-ring-rebalance" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029282 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aad0d16-4da7-47d6-8707-e0bf8fa3740d" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029290 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aad0d16-4da7-47d6-8707-e0bf8fa3740d" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029301 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df14d7d3-54c6-4a6c-996a-07be5a7f4261" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029307 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df14d7d3-54c6-4a6c-996a-07be5a7f4261" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029316 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e2b069-8ada-46bf-9b74-8f899c5fb3b2" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029322 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e2b069-8ada-46bf-9b74-8f899c5fb3b2" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029328 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feff49bf-c912-4af2-98b5-97ea9cca86e4" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029334 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="feff49bf-c912-4af2-98b5-97ea9cca86e4" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029346 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70a1a36-5e84-42ef-bafd-cddecb57dfd7" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70a1a36-5e84-42ef-bafd-cddecb57dfd7" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029359 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="94adfb2f-fb7f-438c-83e0-2d356bcdb122" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029365 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94adfb2f-fb7f-438c-83e0-2d356bcdb122" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029375 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0297cb-4da9-453d-ab16-b516a4a45987" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0297cb-4da9-453d-ab16-b516a4a45987" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029398 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe039bb-1c22-493b-a545-1f2843b9330e" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029404 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe039bb-1c22-493b-a545-1f2843b9330e" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: E0120 23:09:57.029420 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b1cf97-227f-4c53-b2a4-b272d8803ae7" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029427 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b1cf97-227f-4c53-b2a4-b272d8803ae7" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe039bb-1c22-493b-a545-1f2843b9330e" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029571 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df14d7d3-54c6-4a6c-996a-07be5a7f4261" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029578 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aad0d16-4da7-47d6-8707-e0bf8fa3740d" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0297cb-4da9-453d-ab16-b516a4a45987" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029596 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94adfb2f-fb7f-438c-83e0-2d356bcdb122" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029607 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b1cf97-227f-4c53-b2a4-b272d8803ae7" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029629 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e2b069-8ada-46bf-9b74-8f899c5fb3b2" containerName="mariadb-account-create-update" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029641 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20d1400-78ce-4d6d-b499-359aca94def8" containerName="swift-ring-rebalance" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.029651 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70a1a36-5e84-42ef-bafd-cddecb57dfd7" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 
23:09:57.029659 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="feff49bf-c912-4af2-98b5-97ea9cca86e4" containerName="mariadb-database-create" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.030172 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.032782 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-w5gc8" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.033430 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.053444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-s6lbv"] Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.119130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.119249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.120053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.120491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltbbx\" (UniqueName: \"kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.222384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.222613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.222703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle\") pod 
\"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.222887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltbbx\" (UniqueName: \"kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.227830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.228274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.228667 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.241744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltbbx\" (UniqueName: \"kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx\") pod \"glance-db-sync-s6lbv\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.259405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"0cd31ae9c67288c5d006a85915b20396118db52b0eb4af518b3f1e74a1973aad"} Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.259446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"3baf6bd0266a5a9fd8e4a8400417e2564cf33cb06f85ced28b5396ab30bdbc41"} Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.259457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"a98d6c66eb67d678dafdd47b3afa23c516a4bb86312f8871a44e29574063d568"} Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.259467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"54da1c1411ab5ceca784346a76d3a2e8df0bf52604dc8fe074f570c94f675b9c"} Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.259477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"02ab03e5fc6eda916c9515fe0f901ffcfb5ff6663636096457e50cde6498c21a"} Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.371238 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:09:57 crc kubenswrapper[5030]: I0120 23:09:57.824294 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-s6lbv"] Jan 20 23:09:57 crc kubenswrapper[5030]: W0120 23:09:57.837526 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590fa392_5f69_4b7b_9def_790dc20e071f.slice/crio-64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2 WatchSource:0}: Error finding container 64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2: Status 404 returned error can't find the container with id 64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2 Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.272324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" event={"ID":"590fa392-5f69-4b7b-9def-790dc20e071f","Type":"ContainerStarted","Data":"64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.275759 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerID="81c0add93369774cfdec79815a42700be1d5e4729d4c196bfc35f4a4f82494d7" exitCode=0 Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.275823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerDied","Data":"81c0add93369774cfdec79815a42700be1d5e4729d4c196bfc35f4a4f82494d7"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.278986 5030 generic.go:334] "Generic (PLEG): container finished" podID="f331453a-4129-49ea-b554-06f628abf66a" containerID="adc1e639f97f48f66f61a2912c7d0b9d66666e05456c9a9d9aaa15a28a4004ea" exitCode=0 Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.279203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerDied","Data":"adc1e639f97f48f66f61a2912c7d0b9d66666e05456c9a9d9aaa15a28a4004ea"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.292482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"8cfd138e7964a40907591a741551bede8ca3e6f85ec899d199026118111003c6"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.292526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"9822aac9b52040694bad86d906341b866e4efd664919a2f1c1b84e448aa91fd2"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.292536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"6139d1ebd8fe4bd1772b00058abe6fe6f57fc73bf53a9abf6d30f7b0b31612d5"} Jan 20 23:09:58 crc kubenswrapper[5030]: I0120 23:09:58.292545 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.307558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" event={"ID":"590fa392-5f69-4b7b-9def-790dc20e071f","Type":"ContainerStarted","Data":"53d5d2fb64e11f0a1deba4a0b4ed1e4ed95fa56c211991aefeda39fb89c388f5"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.315909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"fef2f8940ea5258c15f06f5d7dd8a6156a4b7e207d9173332274b4da36d97ab5"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.315956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"df21f467aa48e9cd73ecf780edb3fde78c478ee9e3e1fe10188497a5fc87d1a4"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.315970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"09c4955040ae79ee588305769f26a9faab7bd5b88198cf480c047e86852ecd2d"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.315982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"237d5c77a55c616b660a6d8a7a0789ffbbf5636dcf37dc9b4177236859da902e"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.315993 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"02d2cd5180513116470bf587dced9406e0cccb4577828e686f7ad2f6e7756b90"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.316006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"a0b0fd6857b1e7dc5deae4a9fc1014ddd91b0f25a6c5eabcb93bc1c9b7ad0c4e"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.316017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerStarted","Data":"39ce2002c75ebf0e4ceffe55cf58454e136385b8b14ce8fbfd7b9e95218bc1d4"} Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.336730 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" podStartSLOduration=2.33670307 podStartE2EDuration="2.33670307s" podCreationTimestamp="2026-01-20 23:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:59.333977683 +0000 UTC m=+2071.654237981" watchObservedRunningTime="2026-01-20 23:09:59.33670307 +0000 UTC m=+2071.656963408" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.386170 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.386147617 
podStartE2EDuration="21.386147617s" podCreationTimestamp="2026-01-20 23:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:09:59.378593263 +0000 UTC m=+2071.698853591" watchObservedRunningTime="2026-01-20 23:09:59.386147617 +0000 UTC m=+2071.706407915" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.539590 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.540881 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.543448 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.555599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.662830 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.663424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.663467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.663514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6cmf\" (UniqueName: \"kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.765030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6cmf\" (UniqueName: \"kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.765102 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: 
\"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.765183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.765230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.766095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.766247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.766332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.795246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6cmf\" (UniqueName: \"kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf\") pod \"dnsmasq-dnsmasq-85c5cb8b5f-rlz2f\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:09:59 crc kubenswrapper[5030]: I0120 23:09:59.863960 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:10:00 crc kubenswrapper[5030]: I0120 23:10:00.330320 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:10:00 crc kubenswrapper[5030]: I0120 23:10:00.823732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:10:01 crc kubenswrapper[5030]: I0120 23:10:01.343482 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerID="3ea09b1544393da68aa47bb7fcfd6181f5613976340f55cf267b19b8e8bada7c" exitCode=0 Jan 20 23:10:01 crc kubenswrapper[5030]: I0120 23:10:01.343552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" event={"ID":"6b0797cf-f0bf-4d22-9b01-975b439e5a36","Type":"ContainerDied","Data":"3ea09b1544393da68aa47bb7fcfd6181f5613976340f55cf267b19b8e8bada7c"} Jan 20 23:10:01 crc kubenswrapper[5030]: I0120 23:10:01.343756 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" event={"ID":"6b0797cf-f0bf-4d22-9b01-975b439e5a36","Type":"ContainerStarted","Data":"a2b0050e24e3087ad6390305d44469f92dc1abb74e7f566baa55f357adcdb710"} Jan 20 23:10:02 crc kubenswrapper[5030]: I0120 23:10:02.357862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" event={"ID":"6b0797cf-f0bf-4d22-9b01-975b439e5a36","Type":"ContainerStarted","Data":"a98358ab78eec0b694659f796a37b14f6795dd13ef8a648b4600dfb66d2fd05c"} Jan 20 23:10:02 crc kubenswrapper[5030]: I0120 23:10:02.357981 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:10:02 crc kubenswrapper[5030]: I0120 23:10:02.369286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerStarted","Data":"a4bc8dbaee8fec446141a83feeeaff6bfbd90d9c9c809d618868cdf3d15fa788"} Jan 20 23:10:02 crc kubenswrapper[5030]: I0120 23:10:02.382998 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" podStartSLOduration=3.382975819 podStartE2EDuration="3.382975819s" podCreationTimestamp="2026-01-20 23:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:02.376953442 +0000 UTC m=+2074.697213730" watchObservedRunningTime="2026-01-20 23:10:02.382975819 +0000 UTC m=+2074.703236117" Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.390107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerStarted","Data":"1cc12b6243a15a276fb893fecf16ad79a1fa615e0561dc80434a104cb6551b4c"} Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.390384 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.393393 5030 generic.go:334] "Generic (PLEG): container finished" podID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" 
containerID="41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff" exitCode=0 Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.393457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerDied","Data":"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff"} Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.395832 5030 generic.go:334] "Generic (PLEG): container finished" podID="590fa392-5f69-4b7b-9def-790dc20e071f" containerID="53d5d2fb64e11f0a1deba4a0b4ed1e4ed95fa56c211991aefeda39fb89c388f5" exitCode=0 Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.395889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" event={"ID":"590fa392-5f69-4b7b-9def-790dc20e071f","Type":"ContainerDied","Data":"53d5d2fb64e11f0a1deba4a0b4ed1e4ed95fa56c211991aefeda39fb89c388f5"} Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.397181 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:10:04 crc kubenswrapper[5030]: I0120 23:10:04.428401 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podStartSLOduration=4.679445993 podStartE2EDuration="25.428378654s" podCreationTimestamp="2026-01-20 23:09:39 +0000 UTC" firstStartedPulling="2026-01-20 23:09:40.467952477 +0000 UTC m=+2052.788212765" lastFinishedPulling="2026-01-20 23:10:01.216885138 +0000 UTC m=+2073.537145426" observedRunningTime="2026-01-20 23:10:04.420123743 +0000 UTC m=+2076.740384041" watchObservedRunningTime="2026-01-20 23:10:04.428378654 +0000 UTC m=+2076.748638942" Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.408915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerStarted","Data":"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8"} Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.411700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerStarted","Data":"80c605c66cd522f8f765e017919e8e73caf65deb8d8f70b145a11ea221744d70"} Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.413683 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.462990 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=42.462963213 podStartE2EDuration="42.462963213s" podCreationTimestamp="2026-01-20 23:09:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:05.45139243 +0000 UTC m=+2077.771652758" watchObservedRunningTime="2026-01-20 23:10:05.462963213 +0000 UTC m=+2077.783223541" Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.916967 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.994563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle\") pod \"590fa392-5f69-4b7b-9def-790dc20e071f\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.994749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data\") pod \"590fa392-5f69-4b7b-9def-790dc20e071f\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.994792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltbbx\" (UniqueName: \"kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx\") pod \"590fa392-5f69-4b7b-9def-790dc20e071f\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " Jan 20 23:10:05 crc kubenswrapper[5030]: I0120 23:10:05.994853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data\") pod \"590fa392-5f69-4b7b-9def-790dc20e071f\" (UID: \"590fa392-5f69-4b7b-9def-790dc20e071f\") " Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.002565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "590fa392-5f69-4b7b-9def-790dc20e071f" (UID: "590fa392-5f69-4b7b-9def-790dc20e071f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.006703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx" (OuterVolumeSpecName: "kube-api-access-ltbbx") pod "590fa392-5f69-4b7b-9def-790dc20e071f" (UID: "590fa392-5f69-4b7b-9def-790dc20e071f"). InnerVolumeSpecName "kube-api-access-ltbbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.030768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "590fa392-5f69-4b7b-9def-790dc20e071f" (UID: "590fa392-5f69-4b7b-9def-790dc20e071f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.053721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data" (OuterVolumeSpecName: "config-data") pod "590fa392-5f69-4b7b-9def-790dc20e071f" (UID: "590fa392-5f69-4b7b-9def-790dc20e071f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.099507 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.099556 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.099570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltbbx\" (UniqueName: \"kubernetes.io/projected/590fa392-5f69-4b7b-9def-790dc20e071f-kube-api-access-ltbbx\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.099588 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590fa392-5f69-4b7b-9def-790dc20e071f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.436534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" event={"ID":"590fa392-5f69-4b7b-9def-790dc20e071f","Type":"ContainerDied","Data":"64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2"} Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.436597 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64b8ab38984018e400d640080d72dfc26852dfc0dff421eb350e7ea658a677b2" Jan 20 23:10:06 crc kubenswrapper[5030]: I0120 23:10:06.437494 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-s6lbv" Jan 20 23:10:08 crc kubenswrapper[5030]: I0120 23:10:08.455532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerStarted","Data":"91c5f057c90a781f3cc420c4ec9ef0d245558e00e54dc7ea910fb0d2310af899"} Jan 20 23:10:09 crc kubenswrapper[5030]: I0120 23:10:09.866596 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:10:09 crc kubenswrapper[5030]: I0120 23:10:09.938958 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:10:09 crc kubenswrapper[5030]: I0120 23:10:09.939204 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="dnsmasq-dns" containerID="cri-o://9572b28bc85be6e8020d2b2f0a24def1773727783a81e9334fe40d21f3e8e65b" gracePeriod=10 Jan 20 23:10:10 crc kubenswrapper[5030]: I0120 23:10:10.156953 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:10:10 crc kubenswrapper[5030]: I0120 23:10:10.157013 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:10:10 crc kubenswrapper[5030]: I0120 23:10:10.481848 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerID="9572b28bc85be6e8020d2b2f0a24def1773727783a81e9334fe40d21f3e8e65b" exitCode=0 Jan 20 23:10:10 crc kubenswrapper[5030]: I0120 23:10:10.481919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" event={"ID":"1ab7f925-f4fc-4393-920e-38907a760bf4","Type":"ContainerDied","Data":"9572b28bc85be6e8020d2b2f0a24def1773727783a81e9334fe40d21f3e8e65b"} Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.146959 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.276933 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wknqp\" (UniqueName: \"kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp\") pod \"1ab7f925-f4fc-4393-920e-38907a760bf4\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.277006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config\") pod \"1ab7f925-f4fc-4393-920e-38907a760bf4\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.277028 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc\") pod \"1ab7f925-f4fc-4393-920e-38907a760bf4\" (UID: \"1ab7f925-f4fc-4393-920e-38907a760bf4\") " Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.284633 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp" (OuterVolumeSpecName: "kube-api-access-wknqp") pod "1ab7f925-f4fc-4393-920e-38907a760bf4" (UID: "1ab7f925-f4fc-4393-920e-38907a760bf4"). InnerVolumeSpecName "kube-api-access-wknqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.321000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "1ab7f925-f4fc-4393-920e-38907a760bf4" (UID: "1ab7f925-f4fc-4393-920e-38907a760bf4"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.346733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config" (OuterVolumeSpecName: "config") pod "1ab7f925-f4fc-4393-920e-38907a760bf4" (UID: "1ab7f925-f4fc-4393-920e-38907a760bf4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.380993 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wknqp\" (UniqueName: \"kubernetes.io/projected/1ab7f925-f4fc-4393-920e-38907a760bf4-kube-api-access-wknqp\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.381331 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.381555 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ab7f925-f4fc-4393-920e-38907a760bf4-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.495917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerStarted","Data":"f2631a14d00bf858921d99a56f28d6f8aa713a92364714274f4423086ce45852"} Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.499517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" event={"ID":"1ab7f925-f4fc-4393-920e-38907a760bf4","Type":"ContainerDied","Data":"c278fe9524c3246e4f26e57ff30d7760098e40cd2be96930ab5085c598b2e78e"} Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.499549 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.499577 5030 scope.go:117] "RemoveContainer" containerID="9572b28bc85be6e8020d2b2f0a24def1773727783a81e9334fe40d21f3e8e65b" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.537540 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podStartSLOduration=3.358027238 podStartE2EDuration="32.537493872s" podCreationTimestamp="2026-01-20 23:09:39 +0000 UTC" firstStartedPulling="2026-01-20 23:09:41.68688672 +0000 UTC m=+2054.007147008" lastFinishedPulling="2026-01-20 23:10:10.866353354 +0000 UTC m=+2083.186613642" observedRunningTime="2026-01-20 23:10:11.518359045 +0000 UTC m=+2083.838619393" watchObservedRunningTime="2026-01-20 23:10:11.537493872 +0000 UTC m=+2083.857754200" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.547763 5030 scope.go:117] "RemoveContainer" containerID="cdd063454e35efa5256bac14a74c8c4544724bce8bf98d0ecc179f865f7e95de" Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.552737 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.563789 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g"] Jan 20 23:10:11 crc kubenswrapper[5030]: I0120 23:10:11.978261 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" path="/var/lib/kubelet/pods/1ab7f925-f4fc-4393-920e-38907a760bf4/volumes" Jan 20 23:10:14 crc kubenswrapper[5030]: I0120 23:10:14.704833 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 
23:10:15.093089 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-wvcfl"] Jan 20 23:10:15 crc kubenswrapper[5030]: E0120 23:10:15.093609 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590fa392-5f69-4b7b-9def-790dc20e071f" containerName="glance-db-sync" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.093639 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="590fa392-5f69-4b7b-9def-790dc20e071f" containerName="glance-db-sync" Jan 20 23:10:15 crc kubenswrapper[5030]: E0120 23:10:15.093670 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="init" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.093676 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="init" Jan 20 23:10:15 crc kubenswrapper[5030]: E0120 23:10:15.093692 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="dnsmasq-dns" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.093697 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="dnsmasq-dns" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.093833 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="590fa392-5f69-4b7b-9def-790dc20e071f" containerName="glance-db-sync" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.093850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="dnsmasq-dns" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.094373 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.103486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-watcher-dockercfg-cx859" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.103718 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-config-data" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.124914 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-r4dcm"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.126271 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.136957 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-wvcfl"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.139484 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-r4dcm"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.143979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.144025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.144059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qwf\" (UniqueName: \"kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.144077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.144286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrdw\" (UniqueName: \"kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.144323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.188517 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cdk2h"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.189491 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.206220 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.207266 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.212879 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.213521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cdk2h"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.236286 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245195 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85x28\" (UniqueName: \"kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qwf\" (UniqueName: \"kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9ht\" (UniqueName: \"kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht\") pod \"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc 
kubenswrapper[5030]: I0120 23:10:15.245328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts\") pod \"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrdw\" (UniqueName: \"kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.245373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.246717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.260689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.261192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.261990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.275940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrdw\" (UniqueName: \"kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw\") pod \"watcher-db-sync-wvcfl\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.276614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qwf\" (UniqueName: \"kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf\") pod \"cinder-db-create-r4dcm\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 
23:10:15.297134 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.298255 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.300613 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.310340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9ht\" (UniqueName: \"kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht\") pod \"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts\") pod \"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgwv\" (UniqueName: \"kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.346891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85x28\" (UniqueName: \"kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.347293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts\") pod 
\"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.347951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.366458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9ht\" (UniqueName: \"kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht\") pod \"barbican-db-create-cdk2h\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.377530 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-qpw45"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.378103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85x28\" (UniqueName: \"kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28\") pod \"barbican-f20d-account-create-update-9v9cn\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.378442 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.394995 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-qpw45"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.417370 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.447905 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.448217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgwv\" (UniqueName: \"kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.448259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stdlz\" (UniqueName: \"kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.448311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.448394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.448663 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-sfwxf"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.449020 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.449833 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.453698 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.454154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.454429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.455325 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-zrrcb" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.458775 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-sfwxf"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.472360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgwv\" (UniqueName: \"kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv\") pod \"cinder-4d3d-account-create-update-69ll5\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.513779 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.518419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.520572 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.525401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.528988 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.530640 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stdlz\" (UniqueName: \"kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glwg9\" (UniqueName: \"kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552438 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blq49\" (UniqueName: \"kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.552460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.553584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.580247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-stdlz\" (UniqueName: \"kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz\") pod \"neutron-db-create-qpw45\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.641837 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.654268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.654355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.654388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glwg9\" (UniqueName: \"kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.654406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blq49\" (UniqueName: \"kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.654428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.656142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.658952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.662034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.677108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glwg9\" (UniqueName: \"kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9\") pod \"neutron-6617-account-create-update-bk4hx\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.678444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blq49\" (UniqueName: \"kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49\") pod \"keystone-db-sync-sfwxf\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.731306 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.784206 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.843484 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.938456 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-wvcfl"] Jan 20 23:10:15 crc kubenswrapper[5030]: I0120 23:10:15.991814 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-r4dcm"] Jan 20 23:10:15 crc kubenswrapper[5030]: W0120 23:10:15.999106 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40c90c31_4ceb_4fdb_a889_993ae7d16a18.slice/crio-5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84 WatchSource:0}: Error finding container 5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84: Status 404 returned error can't find the container with id 5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84 Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.078846 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-9ht9g" podUID="1ab7f925-f4fc-4393-920e-38907a760bf4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.150:5353: i/o timeout" Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.126198 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cdk2h"] Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.218730 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn"] Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.229511 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5"] Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.257520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.337267 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-sfwxf"] Jan 20 23:10:16 crc kubenswrapper[5030]: W0120 23:10:16.344310 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2efdbb9c_5339_420d_9413_ae2132514f02.slice/crio-c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025 WatchSource:0}: Error finding container c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025: Status 404 returned error can't find the container with id c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025 Jan 20 23:10:16 crc kubenswrapper[5030]: W0120 23:10:16.355125 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53324643_071d_4bf3_8ea7_702076e4941e.slice/crio-c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc WatchSource:0}: Error finding container c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc: Status 404 returned error can't find the container with id c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.358879 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-qpw45"] Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.474093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx"] Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.560552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-qpw45" event={"ID":"53324643-071d-4bf3-8ea7-702076e4941e","Type":"ContainerStarted","Data":"9d581ab397f069a331a60a0bdfd08ffce432ef07516ec625a705f575027756ff"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.560592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-qpw45" event={"ID":"53324643-071d-4bf3-8ea7-702076e4941e","Type":"ContainerStarted","Data":"c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.563450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" event={"ID":"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3","Type":"ContainerStarted","Data":"33106b797843d9588bd322d3b3adb04bf5c3040e2e4b63f532a40ae848593c1d"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.563498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" event={"ID":"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3","Type":"ContainerStarted","Data":"6ca9d50b2e80dc3fede32930f08c3e8199c6a0cdedb9ae1f2eb31deec431487b"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.570148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" event={"ID":"2efdbb9c-5339-420d-9413-ae2132514f02","Type":"ContainerStarted","Data":"c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.585165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" 
event={"ID":"74cf33a3-0b67-4cc3-b502-7c003f57c236","Type":"ContainerStarted","Data":"a0e36bff6aeef889950b93027cc52fc165afbc0b916d1334fd7e0d85f235533f"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.588966 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-qpw45" podStartSLOduration=1.588941814 podStartE2EDuration="1.588941814s" podCreationTimestamp="2026-01-20 23:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:16.580689993 +0000 UTC m=+2088.900950281" watchObservedRunningTime="2026-01-20 23:10:16.588941814 +0000 UTC m=+2088.909202102" Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.600594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" event={"ID":"047bfc50-9efa-45ba-936c-dd2e03010d76","Type":"ContainerStarted","Data":"4d9a28b163f7a21fe0cc830be92c60e8b594629659fbb834c8152fd68e6afbfd"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.607684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" event={"ID":"9f621682-a425-4314-ae0d-423bc0b23935","Type":"ContainerStarted","Data":"cf60a07c21784c33a730b50212b0e627d92244d1aa8eb31a2c93013cc8796a77"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.611441 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" podStartSLOduration=1.611419244 podStartE2EDuration="1.611419244s" podCreationTimestamp="2026-01-20 23:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:16.606848151 +0000 UTC m=+2088.927108439" watchObservedRunningTime="2026-01-20 23:10:16.611419244 +0000 UTC m=+2088.931679532" Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.614850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" event={"ID":"40c90c31-4ceb-4fdb-a889-993ae7d16a18","Type":"ContainerStarted","Data":"a9b1b63e4c235c4b3ae22db457ef86079fa86f69f2ae3b716377a91a3ea1ec01"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.614894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" event={"ID":"40c90c31-4ceb-4fdb-a889-993ae7d16a18","Type":"ContainerStarted","Data":"5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.619054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" event={"ID":"3e35c4d2-863f-4074-8bf1-41a8eabd2910","Type":"ContainerStarted","Data":"9491186a3a78527013f443f716cbaf6ddce1ea97f6036b87136fe96596682eb1"} Jan 20 23:10:16 crc kubenswrapper[5030]: I0120 23:10:16.619080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" event={"ID":"3e35c4d2-863f-4074-8bf1-41a8eabd2910","Type":"ContainerStarted","Data":"2ae67fa7f8c1596d362c6c9eecb16b76af3e9a5f7465893f2f715b54413537dc"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.632640 5030 generic.go:334] "Generic (PLEG): container finished" podID="9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" containerID="33106b797843d9588bd322d3b3adb04bf5c3040e2e4b63f532a40ae848593c1d" exitCode=0 Jan 20 23:10:17 crc 
kubenswrapper[5030]: I0120 23:10:17.632840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" event={"ID":"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3","Type":"ContainerDied","Data":"33106b797843d9588bd322d3b3adb04bf5c3040e2e4b63f532a40ae848593c1d"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.635866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" event={"ID":"2efdbb9c-5339-420d-9413-ae2132514f02","Type":"ContainerStarted","Data":"a90f48a1dd1d0d8929bad9dd8192cc2e9d1d97e95f316c8ba38b821ea7094fff"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.637992 5030 generic.go:334] "Generic (PLEG): container finished" podID="74cf33a3-0b67-4cc3-b502-7c003f57c236" containerID="88037d15f9d9506edc3d17a7d8d74962aa4a73fe38912728d3b943d2a4ac0fc2" exitCode=0 Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.638048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" event={"ID":"74cf33a3-0b67-4cc3-b502-7c003f57c236","Type":"ContainerDied","Data":"88037d15f9d9506edc3d17a7d8d74962aa4a73fe38912728d3b943d2a4ac0fc2"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.639967 5030 generic.go:334] "Generic (PLEG): container finished" podID="047bfc50-9efa-45ba-936c-dd2e03010d76" containerID="7e0c101fc63b1db43244e04e05803e60eca277ec041218297a5354f0bfcf1f84" exitCode=0 Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.640024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" event={"ID":"047bfc50-9efa-45ba-936c-dd2e03010d76","Type":"ContainerDied","Data":"7e0c101fc63b1db43244e04e05803e60eca277ec041218297a5354f0bfcf1f84"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.643213 5030 generic.go:334] "Generic (PLEG): container finished" podID="40c90c31-4ceb-4fdb-a889-993ae7d16a18" containerID="a9b1b63e4c235c4b3ae22db457ef86079fa86f69f2ae3b716377a91a3ea1ec01" exitCode=0 Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.643284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" event={"ID":"40c90c31-4ceb-4fdb-a889-993ae7d16a18","Type":"ContainerDied","Data":"a9b1b63e4c235c4b3ae22db457ef86079fa86f69f2ae3b716377a91a3ea1ec01"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.645318 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e35c4d2-863f-4074-8bf1-41a8eabd2910" containerID="9491186a3a78527013f443f716cbaf6ddce1ea97f6036b87136fe96596682eb1" exitCode=0 Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.645384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" event={"ID":"3e35c4d2-863f-4074-8bf1-41a8eabd2910","Type":"ContainerDied","Data":"9491186a3a78527013f443f716cbaf6ddce1ea97f6036b87136fe96596682eb1"} Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.657683 5030 generic.go:334] "Generic (PLEG): container finished" podID="53324643-071d-4bf3-8ea7-702076e4941e" containerID="9d581ab397f069a331a60a0bdfd08ffce432ef07516ec625a705f575027756ff" exitCode=0 Jan 20 23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.657791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-qpw45" event={"ID":"53324643-071d-4bf3-8ea7-702076e4941e","Type":"ContainerDied","Data":"9d581ab397f069a331a60a0bdfd08ffce432ef07516ec625a705f575027756ff"} Jan 20 
23:10:17 crc kubenswrapper[5030]: I0120 23:10:17.670250 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" podStartSLOduration=2.670229444 podStartE2EDuration="2.670229444s" podCreationTimestamp="2026-01-20 23:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:17.663208452 +0000 UTC m=+2089.983468760" watchObservedRunningTime="2026-01-20 23:10:17.670229444 +0000 UTC m=+2089.990489732" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.293013 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.301432 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.315488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts\") pod \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.315548 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts\") pod \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.315706 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46qwf\" (UniqueName: \"kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf\") pod \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\" (UID: \"40c90c31-4ceb-4fdb-a889-993ae7d16a18\") " Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.315742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9ht\" (UniqueName: \"kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht\") pod \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\" (UID: \"3e35c4d2-863f-4074-8bf1-41a8eabd2910\") " Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.316093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40c90c31-4ceb-4fdb-a889-993ae7d16a18" (UID: "40c90c31-4ceb-4fdb-a889-993ae7d16a18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.316967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e35c4d2-863f-4074-8bf1-41a8eabd2910" (UID: "3e35c4d2-863f-4074-8bf1-41a8eabd2910"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.329664 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht" (OuterVolumeSpecName: "kube-api-access-fh9ht") pod "3e35c4d2-863f-4074-8bf1-41a8eabd2910" (UID: "3e35c4d2-863f-4074-8bf1-41a8eabd2910"). InnerVolumeSpecName "kube-api-access-fh9ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.330451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf" (OuterVolumeSpecName: "kube-api-access-46qwf") pod "40c90c31-4ceb-4fdb-a889-993ae7d16a18" (UID: "40c90c31-4ceb-4fdb-a889-993ae7d16a18"). InnerVolumeSpecName "kube-api-access-46qwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.417993 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e35c4d2-863f-4074-8bf1-41a8eabd2910-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.418024 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40c90c31-4ceb-4fdb-a889-993ae7d16a18-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.418036 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46qwf\" (UniqueName: \"kubernetes.io/projected/40c90c31-4ceb-4fdb-a889-993ae7d16a18-kube-api-access-46qwf\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.418047 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9ht\" (UniqueName: \"kubernetes.io/projected/3e35c4d2-863f-4074-8bf1-41a8eabd2910-kube-api-access-fh9ht\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.670144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" event={"ID":"40c90c31-4ceb-4fdb-a889-993ae7d16a18","Type":"ContainerDied","Data":"5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84"} Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.670167 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-r4dcm" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.670184 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f61d6d2e64bcb595c1ce83167c78ceadecf4a424c8e708e14af8d61ff7bfa84" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.674875 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.674882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cdk2h" event={"ID":"3e35c4d2-863f-4074-8bf1-41a8eabd2910","Type":"ContainerDied","Data":"2ae67fa7f8c1596d362c6c9eecb16b76af3e9a5f7465893f2f715b54413537dc"} Jan 20 23:10:18 crc kubenswrapper[5030]: I0120 23:10:18.674927 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae67fa7f8c1596d362c6c9eecb16b76af3e9a5f7465893f2f715b54413537dc" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.067912 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.110124 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.125147 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.134644 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.241847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts\") pod \"53324643-071d-4bf3-8ea7-702076e4941e\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.241906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgwv\" (UniqueName: \"kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv\") pod \"047bfc50-9efa-45ba-936c-dd2e03010d76\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts\") pod \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53324643-071d-4bf3-8ea7-702076e4941e" (UID: "53324643-071d-4bf3-8ea7-702076e4941e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts\") pod \"74cf33a3-0b67-4cc3-b502-7c003f57c236\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glwg9\" (UniqueName: \"kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9\") pod \"74cf33a3-0b67-4cc3-b502-7c003f57c236\" (UID: \"74cf33a3-0b67-4cc3-b502-7c003f57c236\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stdlz\" (UniqueName: \"kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz\") pod \"53324643-071d-4bf3-8ea7-702076e4941e\" (UID: \"53324643-071d-4bf3-8ea7-702076e4941e\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts\") pod \"047bfc50-9efa-45ba-936c-dd2e03010d76\" (UID: \"047bfc50-9efa-45ba-936c-dd2e03010d76\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.242972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85x28\" (UniqueName: \"kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28\") pod \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\" (UID: \"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3\") " Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.243266 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53324643-071d-4bf3-8ea7-702076e4941e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.244250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" (UID: "9409c1f8-6f79-494b-8dde-4fd7eedc9ae3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.244577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "047bfc50-9efa-45ba-936c-dd2e03010d76" (UID: "047bfc50-9efa-45ba-936c-dd2e03010d76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.244957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74cf33a3-0b67-4cc3-b502-7c003f57c236" (UID: "74cf33a3-0b67-4cc3-b502-7c003f57c236"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.248865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz" (OuterVolumeSpecName: "kube-api-access-stdlz") pod "53324643-071d-4bf3-8ea7-702076e4941e" (UID: "53324643-071d-4bf3-8ea7-702076e4941e"). InnerVolumeSpecName "kube-api-access-stdlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.248685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28" (OuterVolumeSpecName: "kube-api-access-85x28") pod "9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" (UID: "9409c1f8-6f79-494b-8dde-4fd7eedc9ae3"). InnerVolumeSpecName "kube-api-access-85x28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.249766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv" (OuterVolumeSpecName: "kube-api-access-wfgwv") pod "047bfc50-9efa-45ba-936c-dd2e03010d76" (UID: "047bfc50-9efa-45ba-936c-dd2e03010d76"). InnerVolumeSpecName "kube-api-access-wfgwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.250822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9" (OuterVolumeSpecName: "kube-api-access-glwg9") pod "74cf33a3-0b67-4cc3-b502-7c003f57c236" (UID: "74cf33a3-0b67-4cc3-b502-7c003f57c236"). InnerVolumeSpecName "kube-api-access-glwg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.344971 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345003 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74cf33a3-0b67-4cc3-b502-7c003f57c236-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345016 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glwg9\" (UniqueName: \"kubernetes.io/projected/74cf33a3-0b67-4cc3-b502-7c003f57c236-kube-api-access-glwg9\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345028 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stdlz\" (UniqueName: \"kubernetes.io/projected/53324643-071d-4bf3-8ea7-702076e4941e-kube-api-access-stdlz\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345037 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047bfc50-9efa-45ba-936c-dd2e03010d76-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345046 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85x28\" (UniqueName: \"kubernetes.io/projected/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3-kube-api-access-85x28\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.345055 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgwv\" (UniqueName: \"kubernetes.io/projected/047bfc50-9efa-45ba-936c-dd2e03010d76-kube-api-access-wfgwv\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.684548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-qpw45" event={"ID":"53324643-071d-4bf3-8ea7-702076e4941e","Type":"ContainerDied","Data":"c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc"} Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.684587 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4bb8ae3f61229676a52d704b8e20beb78fe37f9e84519d17cec0462f8ac00cc" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.684568 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-qpw45" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.686955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" event={"ID":"9409c1f8-6f79-494b-8dde-4fd7eedc9ae3","Type":"ContainerDied","Data":"6ca9d50b2e80dc3fede32930f08c3e8199c6a0cdedb9ae1f2eb31deec431487b"} Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.687022 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ca9d50b2e80dc3fede32930f08c3e8199c6a0cdedb9ae1f2eb31deec431487b" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.686976 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.694597 5030 generic.go:334] "Generic (PLEG): container finished" podID="2efdbb9c-5339-420d-9413-ae2132514f02" containerID="a90f48a1dd1d0d8929bad9dd8192cc2e9d1d97e95f316c8ba38b821ea7094fff" exitCode=0 Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.694648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" event={"ID":"2efdbb9c-5339-420d-9413-ae2132514f02","Type":"ContainerDied","Data":"a90f48a1dd1d0d8929bad9dd8192cc2e9d1d97e95f316c8ba38b821ea7094fff"} Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.700302 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.700440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx" event={"ID":"74cf33a3-0b67-4cc3-b502-7c003f57c236","Type":"ContainerDied","Data":"a0e36bff6aeef889950b93027cc52fc165afbc0b916d1334fd7e0d85f235533f"} Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.700470 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0e36bff6aeef889950b93027cc52fc165afbc0b916d1334fd7e0d85f235533f" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.702603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" event={"ID":"047bfc50-9efa-45ba-936c-dd2e03010d76","Type":"ContainerDied","Data":"4d9a28b163f7a21fe0cc830be92c60e8b594629659fbb834c8152fd68e6afbfd"} Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.702642 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5" Jan 20 23:10:19 crc kubenswrapper[5030]: I0120 23:10:19.702658 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9a28b163f7a21fe0cc830be92c60e8b594629659fbb834c8152fd68e6afbfd" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.588642 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.735816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blq49\" (UniqueName: \"kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49\") pod \"2efdbb9c-5339-420d-9413-ae2132514f02\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.735859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle\") pod \"2efdbb9c-5339-420d-9413-ae2132514f02\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.735959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data\") pod \"2efdbb9c-5339-420d-9413-ae2132514f02\" (UID: \"2efdbb9c-5339-420d-9413-ae2132514f02\") " Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.741720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49" (OuterVolumeSpecName: "kube-api-access-blq49") pod "2efdbb9c-5339-420d-9413-ae2132514f02" (UID: "2efdbb9c-5339-420d-9413-ae2132514f02"). InnerVolumeSpecName "kube-api-access-blq49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.756406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" event={"ID":"2efdbb9c-5339-420d-9413-ae2132514f02","Type":"ContainerDied","Data":"c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025"} Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.756453 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5e7f28bac808e109334fa53afd59c3fc9cb5f2447fe1effdd25e082ecc42025" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.756452 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-sfwxf" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.767192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efdbb9c-5339-420d-9413-ae2132514f02" (UID: "2efdbb9c-5339-420d-9413-ae2132514f02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.784516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data" (OuterVolumeSpecName: "config-data") pod "2efdbb9c-5339-420d-9413-ae2132514f02" (UID: "2efdbb9c-5339-420d-9413-ae2132514f02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.838009 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blq49\" (UniqueName: \"kubernetes.io/projected/2efdbb9c-5339-420d-9413-ae2132514f02-kube-api-access-blq49\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.838050 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:24 crc kubenswrapper[5030]: I0120 23:10:24.838066 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efdbb9c-5339-420d-9413-ae2132514f02-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.806768 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kz5cc"] Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807648 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807664 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807694 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74cf33a3-0b67-4cc3-b502-7c003f57c236" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807704 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74cf33a3-0b67-4cc3-b502-7c003f57c236" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807715 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c90c31-4ceb-4fdb-a889-993ae7d16a18" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807724 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c90c31-4ceb-4fdb-a889-993ae7d16a18" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807737 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047bfc50-9efa-45ba-936c-dd2e03010d76" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807745 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="047bfc50-9efa-45ba-936c-dd2e03010d76" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807761 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53324643-071d-4bf3-8ea7-702076e4941e" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807769 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53324643-071d-4bf3-8ea7-702076e4941e" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: E0120 23:10:25.807785 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efdbb9c-5339-420d-9413-ae2132514f02" containerName="keystone-db-sync" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807793 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efdbb9c-5339-420d-9413-ae2132514f02" containerName="keystone-db-sync" Jan 20 23:10:25 
crc kubenswrapper[5030]: E0120 23:10:25.807804 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e35c4d2-863f-4074-8bf1-41a8eabd2910" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807812 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e35c4d2-863f-4074-8bf1-41a8eabd2910" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.807999 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53324643-071d-4bf3-8ea7-702076e4941e" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808018 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="047bfc50-9efa-45ba-936c-dd2e03010d76" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74cf33a3-0b67-4cc3-b502-7c003f57c236" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808046 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" containerName="mariadb-account-create-update" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808062 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e35c4d2-863f-4074-8bf1-41a8eabd2910" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808077 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efdbb9c-5339-420d-9413-ae2132514f02" containerName="keystone-db-sync" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808094 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c90c31-4ceb-4fdb-a889-993ae7d16a18" containerName="mariadb-database-create" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.808733 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.814125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.814242 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.820383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.820679 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-zrrcb" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.820725 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.839062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kz5cc"] Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vdt\" (UniqueName: \"kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:25 crc kubenswrapper[5030]: I0120 23:10:25.957655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.064748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.064878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.064939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.064960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.064979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vdt\" (UniqueName: \"kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.065005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.084007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.084576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.085112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc 
kubenswrapper[5030]: I0120 23:10:26.110243 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8x74r"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.111146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.111493 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.122020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.122331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-8xkvf" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.122599 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.127608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vdt\" (UniqueName: \"kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.143818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-82sw7"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.157201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys\") pod \"keystone-bootstrap-kz5cc\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.228475 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.261986 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.262263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8x74r"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.262412 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.267782 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-wgg84" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.268185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.280275 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.285986 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qncrz"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.287006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.305404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.305796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-7cm49" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.305878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data\") pod 
\"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kpzj\" (UniqueName: \"kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbbp\" (UniqueName: \"kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.319296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.321303 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-82sw7"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.345809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qncrz"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.362149 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-xz7mx"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.363450 5030 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.368699 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-xz7mx"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.370131 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.370362 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-znx9n" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.387686 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.389643 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.404701 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.404730 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.413923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421202 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421312 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.421456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.426874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kpzj\" (UniqueName: \"kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.426989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtrg\" (UniqueName: \"kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.427020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvbbp\" (UniqueName: \"kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.427044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.427455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.423360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.422465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.429134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.431539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.432566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.432826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.439467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.454510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.455028 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.457175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvbbp\" (UniqueName: \"kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp\") pod \"placement-db-sync-8x74r\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.459251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kpzj\" (UniqueName: \"kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj\") pod \"cinder-db-sync-82sw7\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.528910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtrg\" (UniqueName: \"kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.528984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqm9c\" (UniqueName: \"kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98kfz\" (UniqueName: \"kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.529379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.538792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.540999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.558237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtrg\" (UniqueName: \"kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg\") pod \"neutron-db-sync-qncrz\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " 
pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630297 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqm9c\" (UniqueName: \"kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630477 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98kfz\" (UniqueName: \"kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630575 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.630979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.633937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.634363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.634862 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.635939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.635983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.636178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.641470 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.646940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqm9c\" (UniqueName: \"kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c\") pod \"ceilometer-0\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.647567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98kfz\" (UniqueName: \"kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz\") pod \"barbican-db-sync-xz7mx\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.691357 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.720351 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.795751 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.819056 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.824222 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.967720 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.969366 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.974280 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.974840 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.975046 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-w5gc8" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.975159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:10:26 crc kubenswrapper[5030]: I0120 23:10:26.980793 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8lh\" (UniqueName: \"kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 
23:10:27.037746 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.037785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.139270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: 
I0120 23:10:27.139298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8lh\" (UniqueName: \"kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.140011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.140030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.140199 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.143117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.144385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.152422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.153545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8lh\" (UniqueName: \"kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.153548 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc 
kubenswrapper[5030]: I0120 23:10:27.172015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.291717 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.369968 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.371647 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.374938 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.375125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.380218 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445463 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 
crc kubenswrapper[5030]: I0120 23:10:27.445502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.445560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzlv\" (UniqueName: \"kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546630 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546791 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.546940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzlv\" (UniqueName: \"kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.547461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.547687 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.547800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.560501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.560513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.560593 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.561299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.568514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.574183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzlv\" (UniqueName: \"kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv\") pod \"glance-default-internal-api-0\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:27 crc kubenswrapper[5030]: I0120 23:10:27.693255 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:28 crc kubenswrapper[5030]: I0120 23:10:28.285720 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:28 crc kubenswrapper[5030]: I0120 23:10:28.354467 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:28 crc kubenswrapper[5030]: I0120 23:10:28.373749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:30 crc kubenswrapper[5030]: E0120 23:10:30.439311 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-master-centos9/openstack-watcher-api@sha256:b880e5d410fd87b038bd5a55febfeb0f83f501169f5be72c83a91dd5de9ec11b" Jan 20 23:10:30 crc kubenswrapper[5030]: E0120 23:10:30.441825 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:quay.io/podified-master-centos9/openstack-watcher-api@sha256:b880e5d410fd87b038bd5a55febfeb0f83f501169f5be72c83a91dd5de9ec11b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fsrdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-db-sync-wvcfl_openstack-kuttl-tests(9f621682-a425-4314-ae0d-423bc0b23935): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 23:10:30 crc kubenswrapper[5030]: E0120 23:10:30.443126 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" podUID="9f621682-a425-4314-ae0d-423bc0b23935" Jan 20 23:10:30 crc kubenswrapper[5030]: E0120 23:10:30.835948 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-master-centos9/openstack-watcher-api@sha256:b880e5d410fd87b038bd5a55febfeb0f83f501169f5be72c83a91dd5de9ec11b\\\"\"" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" podUID="9f621682-a425-4314-ae0d-423bc0b23935" Jan 20 23:10:30 crc kubenswrapper[5030]: I0120 23:10:30.913340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-82sw7"] Jan 20 23:10:30 crc kubenswrapper[5030]: W0120 23:10:30.915885 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod818882ee_6bdd_4fa5_8294_d2ffe317ac07.slice/crio-94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8 WatchSource:0}: Error finding container 
94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8: Status 404 returned error can't find the container with id 94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8 Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.153904 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qncrz"] Jan 20 23:10:31 crc kubenswrapper[5030]: W0120 23:10:31.162380 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3937a364_08e4_44d1_a93e_8dd4a203f200.slice/crio-3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187 WatchSource:0}: Error finding container 3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187: Status 404 returned error can't find the container with id 3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187 Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.166165 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kz5cc"] Jan 20 23:10:31 crc kubenswrapper[5030]: W0120 23:10:31.168840 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd4126e6_c5d3_4d42_9ca2_07029f5e3368.slice/crio-08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2 WatchSource:0}: Error finding container 08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2: Status 404 returned error can't find the container with id 08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2 Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.174286 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:31 crc kubenswrapper[5030]: W0120 23:10:31.175059 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf69572b8_b37f_4e3c_9b64_e9076db53f18.slice/crio-aa6b59248d1470fedf611c4e5e1d67f169745462391cadd36f1da360c1440ce8 WatchSource:0}: Error finding container aa6b59248d1470fedf611c4e5e1d67f169745462391cadd36f1da360c1440ce8: Status 404 returned error can't find the container with id aa6b59248d1470fedf611c4e5e1d67f169745462391cadd36f1da360c1440ce8 Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.180344 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8x74r"] Jan 20 23:10:31 crc kubenswrapper[5030]: W0120 23:10:31.180538 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfecb64f3_f295_42da_950d_b595899d3c52.slice/crio-358b1a0d5d55ce2c522fc2d15c85ff2068688fb5a1a4c3a355f702e9094d9fa0 WatchSource:0}: Error finding container 358b1a0d5d55ce2c522fc2d15c85ff2068688fb5a1a4c3a355f702e9094d9fa0: Status 404 returned error can't find the container with id 358b1a0d5d55ce2c522fc2d15c85ff2068688fb5a1a4c3a355f702e9094d9fa0 Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.184826 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.388332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.402676 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/barbican-db-sync-xz7mx"] Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.410301 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.844543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerStarted","Data":"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.844899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerStarted","Data":"358b1a0d5d55ce2c522fc2d15c85ff2068688fb5a1a4c3a355f702e9094d9fa0"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.845673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerStarted","Data":"c919a1561d01324d4f129ff5731926b89f5b71f987182082f4c2dbe55112d28f"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.846707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerStarted","Data":"1aa7ff6fdf796de3abe99274b55007a8e84c3ee27933ed3bc8cfcb3c7b9d4dfd"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.848082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" event={"ID":"818882ee-6bdd-4fa5-8294-d2ffe317ac07","Type":"ContainerStarted","Data":"ff54e0d54961da77f3e62ba00cba1c63dc7ac2c110e0251d825b9f5703c04b3f"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.848105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" event={"ID":"818882ee-6bdd-4fa5-8294-d2ffe317ac07","Type":"ContainerStarted","Data":"94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.851542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8x74r" event={"ID":"3937a364-08e4-44d1-a93e-8dd4a203f200","Type":"ContainerStarted","Data":"7919fd8bc32c52113da1c830916d802ce7efb1753f1c016315bf56674d6f572c"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.851586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8x74r" event={"ID":"3937a364-08e4-44d1-a93e-8dd4a203f200","Type":"ContainerStarted","Data":"3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.852992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" event={"ID":"cd4126e6-c5d3-4d42-9ca2-07029f5e3368","Type":"ContainerStarted","Data":"bffef7765f1438423591828481b992e73d3dfc2858bd854db49a96094325b559"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.853019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" event={"ID":"cd4126e6-c5d3-4d42-9ca2-07029f5e3368","Type":"ContainerStarted","Data":"08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.854547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" event={"ID":"f69572b8-b37f-4e3c-9b64-e9076db53f18","Type":"ContainerStarted","Data":"4324979a35d72509d550aecc0fddf62f163b48cbd3e592489c10890fd9880571"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.854572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" event={"ID":"f69572b8-b37f-4e3c-9b64-e9076db53f18","Type":"ContainerStarted","Data":"aa6b59248d1470fedf611c4e5e1d67f169745462391cadd36f1da360c1440ce8"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.858149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" event={"ID":"4983d8d3-58c0-4c76-9200-dfb9daefaf64","Type":"ContainerStarted","Data":"db8a03c81c30c7c0f36823f636297df6ce0a2e06bb363add7caf45bf70d3c118"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.858195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" event={"ID":"4983d8d3-58c0-4c76-9200-dfb9daefaf64","Type":"ContainerStarted","Data":"4c96de8c9b1209250873e2879e7920d0ffdf2a079bd51bb6f50b7c69451286f0"} Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.883470 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" podStartSLOduration=5.883452526 podStartE2EDuration="5.883452526s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:31.883012106 +0000 UTC m=+2104.203272394" watchObservedRunningTime="2026-01-20 23:10:31.883452526 +0000 UTC m=+2104.203712814" Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.907110 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-8x74r" podStartSLOduration=5.907092974 podStartE2EDuration="5.907092974s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:31.897324115 +0000 UTC m=+2104.217584403" watchObservedRunningTime="2026-01-20 23:10:31.907092974 +0000 UTC m=+2104.227353262" Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.941507 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" podStartSLOduration=5.9414917339999995 podStartE2EDuration="5.941491734s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:31.936763109 +0000 UTC m=+2104.257023397" watchObservedRunningTime="2026-01-20 23:10:31.941491734 +0000 UTC m=+2104.261752022" Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.945061 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" podStartSLOduration=5.945047891 podStartE2EDuration="5.945047891s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:31.920582364 +0000 UTC m=+2104.240842662" watchObservedRunningTime="2026-01-20 23:10:31.945047891 +0000 UTC m=+2104.265308179" Jan 20 23:10:31 crc kubenswrapper[5030]: I0120 23:10:31.956588 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" podStartSLOduration=6.956571072 podStartE2EDuration="6.956571072s" podCreationTimestamp="2026-01-20 23:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:31.950430173 +0000 UTC m=+2104.270690541" watchObservedRunningTime="2026-01-20 23:10:31.956571072 +0000 UTC m=+2104.276831360" Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.884853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerStarted","Data":"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6"} Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.884911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-log" containerID="cri-o://0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" gracePeriod=30 Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.885524 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-httpd" containerID="cri-o://063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" gracePeriod=30 Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.888204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerStarted","Data":"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb"} Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.898928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerStarted","Data":"93e05fd4390c7930eea40ffb0b128a3c7efd393dee960fe75a044a26d1180047"} Jan 20 23:10:32 crc kubenswrapper[5030]: I0120 23:10:32.912884 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=7.912867738 podStartE2EDuration="7.912867738s" podCreationTimestamp="2026-01-20 23:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:32.908305986 +0000 UTC m=+2105.228566284" watchObservedRunningTime="2026-01-20 23:10:32.912867738 +0000 UTC m=+2105.233128026" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.402654 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.464185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.464285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.464327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.464365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.464787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs" (OuterVolumeSpecName: "logs") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465170 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr8lh\" (UniqueName: \"kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data\") pod \"fecb64f3-f295-42da-950d-b595899d3c52\" (UID: \"fecb64f3-f295-42da-950d-b595899d3c52\") " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465561 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.465945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.480891 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh" (OuterVolumeSpecName: "kube-api-access-zr8lh") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "kube-api-access-zr8lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.481005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts" (OuterVolumeSpecName: "scripts") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.481782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.530430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data" (OuterVolumeSpecName: "config-data") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.535735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.543638 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fecb64f3-f295-42da-950d-b595899d3c52" (UID: "fecb64f3-f295-42da-950d-b595899d3c52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566598 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566644 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566654 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fecb64f3-f295-42da-950d-b595899d3c52-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566663 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr8lh\" (UniqueName: \"kubernetes.io/projected/fecb64f3-f295-42da-950d-b595899d3c52-kube-api-access-zr8lh\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566677 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566686 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fecb64f3-f295-42da-950d-b595899d3c52-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.566718 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.586492 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.667878 5030 reconciler_common.go:293] "Volume 
detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910262 5030 generic.go:334] "Generic (PLEG): container finished" podID="fecb64f3-f295-42da-950d-b595899d3c52" containerID="063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" exitCode=143 Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910306 5030 generic.go:334] "Generic (PLEG): container finished" podID="fecb64f3-f295-42da-950d-b595899d3c52" containerID="0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" exitCode=143 Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerDied","Data":"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6"} Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerDied","Data":"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43"} Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"fecb64f3-f295-42da-950d-b595899d3c52","Type":"ContainerDied","Data":"358b1a0d5d55ce2c522fc2d15c85ff2068688fb5a1a4c3a355f702e9094d9fa0"} Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910426 5030 scope.go:117] "RemoveContainer" containerID="063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.910588 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.917113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerStarted","Data":"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6"} Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.917351 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-log" containerID="cri-o://10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" gracePeriod=30 Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.917385 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-httpd" containerID="cri-o://ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" gracePeriod=30 Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.927793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerStarted","Data":"e8e7436bd2501d29035883b10bb837e40ac85eee5032e5783e47d4670ba2866b"} Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.941669 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=7.941649174 podStartE2EDuration="7.941649174s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:33.936938699 +0000 UTC m=+2106.257198977" watchObservedRunningTime="2026-01-20 23:10:33.941649174 +0000 UTC m=+2106.261909462" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.947751 5030 scope.go:117] "RemoveContainer" containerID="0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.994347 5030 scope.go:117] "RemoveContainer" containerID="063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" Jan 20 23:10:33 crc kubenswrapper[5030]: E0120 23:10:33.996017 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6\": container with ID starting with 063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6 not found: ID does not exist" containerID="063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.996072 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6"} err="failed to get container status \"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6\": rpc error: code = NotFound desc = could not find container \"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6\": container with ID starting with 063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6 not found: ID does not exist" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.996092 5030 scope.go:117] 
"RemoveContainer" containerID="0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" Jan 20 23:10:33 crc kubenswrapper[5030]: E0120 23:10:33.996783 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43\": container with ID starting with 0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43 not found: ID does not exist" containerID="0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.996827 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43"} err="failed to get container status \"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43\": rpc error: code = NotFound desc = could not find container \"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43\": container with ID starting with 0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43 not found: ID does not exist" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.996853 5030 scope.go:117] "RemoveContainer" containerID="063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6" Jan 20 23:10:33 crc kubenswrapper[5030]: I0120 23:10:33.997586 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:33.998874 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6"} err="failed to get container status \"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6\": rpc error: code = NotFound desc = could not find container \"063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6\": container with ID starting with 063ed96632c065aa3ffda4afdc87f5d243f643d136170c37335f616692bfd4f6 not found: ID does not exist" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:33.998908 5030 scope.go:117] "RemoveContainer" containerID="0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:33.999097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43"} err="failed to get container status \"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43\": rpc error: code = NotFound desc = could not find container \"0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43\": container with ID starting with 0749f9384e6ea4f8b9cb972f8a6420352331c857c6f920436242bc7a3bdfca43 not found: ID does not exist" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.007412 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.014716 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:34 crc kubenswrapper[5030]: E0120 23:10:34.015169 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-log" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.015185 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-log" Jan 20 23:10:34 crc kubenswrapper[5030]: E0120 23:10:34.015196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-httpd" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.015202 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-httpd" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.015350 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-httpd" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.015373 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecb64f3-f295-42da-950d-b595899d3c52" containerName="glance-log" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.016245 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.018937 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.019122 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.051808 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktth\" (UniqueName: \"kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.075994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.076009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vktth\" (UniqueName: 
\"kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.177913 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.178873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.179075 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.184182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.187904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.188641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.196941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.203133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktth\" (UniqueName: \"kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.219754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.390447 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.396811 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.481585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgzlv\" (UniqueName: \"kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482371 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.482393 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.483001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs" (OuterVolumeSpecName: "logs") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.483046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts\") pod \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\" (UID: \"9954eb52-c1ae-4c2e-9e81-9e576d65c90c\") " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.483081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.483404 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.483415 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.486999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts" (OuterVolumeSpecName: "scripts") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.487503 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.492968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv" (OuterVolumeSpecName: "kube-api-access-tgzlv") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "kube-api-access-tgzlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.516435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.540128 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data" (OuterVolumeSpecName: "config-data") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.566148 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9954eb52-c1ae-4c2e-9e81-9e576d65c90c" (UID: "9954eb52-c1ae-4c2e-9e81-9e576d65c90c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.584928 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.584975 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.584985 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.584996 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgzlv\" (UniqueName: \"kubernetes.io/projected/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-kube-api-access-tgzlv\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.585005 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.585013 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9954eb52-c1ae-4c2e-9e81-9e576d65c90c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.616932 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.687021 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.868593 5030 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:10:34 crc kubenswrapper[5030]: W0120 23:10:34.879583 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5fbd872_796a_4b41_b925_516fc6db87f6.slice/crio-51b13d2e74518375b839ebdd3a626fab34e90c11bca7bc268289841ba997a5f0 WatchSource:0}: Error finding container 51b13d2e74518375b839ebdd3a626fab34e90c11bca7bc268289841ba997a5f0: Status 404 returned error can't find the container with id 51b13d2e74518375b839ebdd3a626fab34e90c11bca7bc268289841ba997a5f0 Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.945003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerStarted","Data":"51b13d2e74518375b839ebdd3a626fab34e90c11bca7bc268289841ba997a5f0"} Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.950674 5030 generic.go:334] "Generic (PLEG): container finished" podID="3937a364-08e4-44d1-a93e-8dd4a203f200" containerID="7919fd8bc32c52113da1c830916d802ce7efb1753f1c016315bf56674d6f572c" exitCode=0 Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.950762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8x74r" event={"ID":"3937a364-08e4-44d1-a93e-8dd4a203f200","Type":"ContainerDied","Data":"7919fd8bc32c52113da1c830916d802ce7efb1753f1c016315bf56674d6f572c"} Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960545 5030 generic.go:334] "Generic (PLEG): container finished" podID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerID="ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" exitCode=0 Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960574 5030 generic.go:334] "Generic (PLEG): container finished" podID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerID="10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" exitCode=143 Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960708 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerDied","Data":"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6"} Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerDied","Data":"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb"} Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"9954eb52-c1ae-4c2e-9e81-9e576d65c90c","Type":"ContainerDied","Data":"c919a1561d01324d4f129ff5731926b89f5b71f987182082f4c2dbe55112d28f"} Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.960846 5030 scope.go:117] "RemoveContainer" containerID="ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" Jan 20 23:10:34 crc kubenswrapper[5030]: I0120 23:10:34.972134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerStarted","Data":"02eb4e5a22318beb65ae2d1e3ab0bd1770c0bf9d17ceca08eee138acb3445f34"} Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.010568 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.024439 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.026718 5030 scope.go:117] "RemoveContainer" containerID="10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.037724 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:35 crc kubenswrapper[5030]: E0120 23:10:35.038142 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-log" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.038158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-log" Jan 20 23:10:35 crc kubenswrapper[5030]: E0120 23:10:35.038175 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-httpd" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.038189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-httpd" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.038368 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-httpd" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.038379 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" containerName="glance-log" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.039270 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.043498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.043688 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.060958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.082388 5030 scope.go:117] "RemoveContainer" containerID="ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" Jan 20 23:10:35 crc kubenswrapper[5030]: E0120 23:10:35.085424 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6\": container with ID starting with ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6 not found: ID does not exist" containerID="ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.085462 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6"} err="failed to get container status \"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6\": rpc error: code = NotFound desc = could not find container \"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6\": container with ID starting with ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6 not found: ID does not exist" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.085495 5030 scope.go:117] "RemoveContainer" containerID="10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" Jan 20 23:10:35 crc kubenswrapper[5030]: E0120 23:10:35.085922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb\": container with ID starting with 10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb not found: ID does not exist" containerID="10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.085999 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb"} err="failed to get container status \"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb\": rpc error: code = NotFound desc = could not find container \"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb\": container with ID starting with 10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb not found: ID does not exist" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.086056 5030 scope.go:117] "RemoveContainer" containerID="ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.087522 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6"} err="failed to get container 
status \"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6\": rpc error: code = NotFound desc = could not find container \"ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6\": container with ID starting with ce21d40128c77ed78a2639ab95b25425832070524ea24e1fd1c34bed840e66b6 not found: ID does not exist" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.087582 5030 scope.go:117] "RemoveContainer" containerID="10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.088358 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb"} err="failed to get container status \"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb\": rpc error: code = NotFound desc = could not find container \"10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb\": container with ID starting with 10e89b5115b89b66d7a90021163f10d97d8fd43044c77d9114fa2e96bc2a3ebb not found: ID does not exist" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112362 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112440 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8vxw\" (UniqueName: \"kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.112794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.118362 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.118704 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="prometheus" containerID="cri-o://80c605c66cd522f8f765e017919e8e73caf65deb8d8f70b145a11ea221744d70" gracePeriod=600 Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.118857 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="thanos-sidecar" containerID="cri-o://f2631a14d00bf858921d99a56f28d6f8aa713a92364714274f4423086ce45852" gracePeriod=600 Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.118917 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="config-reloader" containerID="cri-o://91c5f057c90a781f3cc420c4ec9ef0d245558e00e54dc7ea910fb0d2310af899" gracePeriod=600 Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8vxw\" (UniqueName: \"kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.214995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.215023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.215044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.215544 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.215773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.216055 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.224795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.225052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.225369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.225661 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.237954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8vxw\" (UniqueName: \"kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.249873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:35 crc kubenswrapper[5030]: I0120 23:10:35.388086 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.008957 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9954eb52-c1ae-4c2e-9e81-9e576d65c90c" path="/var/lib/kubelet/pods/9954eb52-c1ae-4c2e-9e81-9e576d65c90c/volumes" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.010140 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecb64f3-f295-42da-950d-b595899d3c52" path="/var/lib/kubelet/pods/fecb64f3-f295-42da-950d-b595899d3c52/volumes" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.016388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerStarted","Data":"053f678a1a0cf6510e44aeba332cc377d889571f025e9cd39037db9969376bc1"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.026071 5030 generic.go:334] "Generic (PLEG): container finished" podID="f69572b8-b37f-4e3c-9b64-e9076db53f18" containerID="4324979a35d72509d550aecc0fddf62f163b48cbd3e592489c10890fd9880571" exitCode=0 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.026194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" event={"ID":"f69572b8-b37f-4e3c-9b64-e9076db53f18","Type":"ContainerDied","Data":"4324979a35d72509d550aecc0fddf62f163b48cbd3e592489c10890fd9880571"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031140 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerID="f2631a14d00bf858921d99a56f28d6f8aa713a92364714274f4423086ce45852" exitCode=0 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031163 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerID="91c5f057c90a781f3cc420c4ec9ef0d245558e00e54dc7ea910fb0d2310af899" exitCode=0 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031171 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerID="80c605c66cd522f8f765e017919e8e73caf65deb8d8f70b145a11ea221744d70" exitCode=0 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerDied","Data":"f2631a14d00bf858921d99a56f28d6f8aa713a92364714274f4423086ce45852"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerDied","Data":"91c5f057c90a781f3cc420c4ec9ef0d245558e00e54dc7ea910fb0d2310af899"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.031226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerDied","Data":"80c605c66cd522f8f765e017919e8e73caf65deb8d8f70b145a11ea221744d70"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.034647 5030 generic.go:334] "Generic (PLEG): container finished" podID="4983d8d3-58c0-4c76-9200-dfb9daefaf64" containerID="db8a03c81c30c7c0f36823f636297df6ce0a2e06bb363add7caf45bf70d3c118" exitCode=0 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.034716 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" event={"ID":"4983d8d3-58c0-4c76-9200-dfb9daefaf64","Type":"ContainerDied","Data":"db8a03c81c30c7c0f36823f636297df6ce0a2e06bb363add7caf45bf70d3c118"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.054681 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-central-agent" containerID="cri-o://93e05fd4390c7930eea40ffb0b128a3c7efd393dee960fe75a044a26d1180047" gracePeriod=30 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.054907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerStarted","Data":"fbe27cf2995fea8f398082bac7eea857f2a9475c0a7a41b96d6ac3447c0862fc"} Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.054944 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="proxy-httpd" containerID="cri-o://fbe27cf2995fea8f398082bac7eea857f2a9475c0a7a41b96d6ac3447c0862fc" gracePeriod=30 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.054983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.055002 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="sg-core" containerID="cri-o://02eb4e5a22318beb65ae2d1e3ab0bd1770c0bf9d17ceca08eee138acb3445f34" gracePeriod=30 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.055038 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-notification-agent" containerID="cri-o://e8e7436bd2501d29035883b10bb837e40ac85eee5032e5783e47d4670ba2866b" gracePeriod=30 Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.099552 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=6.053654264 podStartE2EDuration="10.099535008s" podCreationTimestamp="2026-01-20 23:10:26 +0000 UTC" firstStartedPulling="2026-01-20 23:10:31.413038712 +0000 UTC m=+2103.733299000" lastFinishedPulling="2026-01-20 23:10:35.458919456 +0000 UTC m=+2107.779179744" observedRunningTime="2026-01-20 23:10:36.092194659 +0000 UTC m=+2108.412454957" watchObservedRunningTime="2026-01-20 23:10:36.099535008 +0000 UTC m=+2108.419795296" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.143334 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.143588 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.339269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.340423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f77ts\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2\") pod \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\" (UID: \"fc4c6ac6-9b26-4047-a309-eb01bf39412f\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341640 5030 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.341652 5030 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.342969 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.345812 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config" (OuterVolumeSpecName: "config") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.347050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.351522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.351592 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.360747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.361549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts" (OuterVolumeSpecName: "kube-api-access-f77ts") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "kube-api-access-f77ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.365863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out" (OuterVolumeSpecName: "config-out") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.380265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config" (OuterVolumeSpecName: "web-config") pod "fc4c6ac6-9b26-4047-a309-eb01bf39412f" (UID: "fc4c6ac6-9b26-4047-a309-eb01bf39412f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442820 5030 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442862 5030 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config-out\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442878 5030 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-web-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442890 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f77ts\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-kube-api-access-f77ts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442902 5030 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fc4c6ac6-9b26-4047-a309-eb01bf39412f-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442911 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fc4c6ac6-9b26-4047-a309-eb01bf39412f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442944 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.442957 5030 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc4c6ac6-9b26-4047-a309-eb01bf39412f-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.478482 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.543698 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts\") pod \"3937a364-08e4-44d1-a93e-8dd4a203f200\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.543808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data\") pod \"3937a364-08e4-44d1-a93e-8dd4a203f200\" (UID: 
\"3937a364-08e4-44d1-a93e-8dd4a203f200\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.543846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvbbp\" (UniqueName: \"kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp\") pod \"3937a364-08e4-44d1-a93e-8dd4a203f200\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.543892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs\") pod \"3937a364-08e4-44d1-a93e-8dd4a203f200\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.543944 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle\") pod \"3937a364-08e4-44d1-a93e-8dd4a203f200\" (UID: \"3937a364-08e4-44d1-a93e-8dd4a203f200\") " Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.544304 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.544937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs" (OuterVolumeSpecName: "logs") pod "3937a364-08e4-44d1-a93e-8dd4a203f200" (UID: "3937a364-08e4-44d1-a93e-8dd4a203f200"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.548318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts" (OuterVolumeSpecName: "scripts") pod "3937a364-08e4-44d1-a93e-8dd4a203f200" (UID: "3937a364-08e4-44d1-a93e-8dd4a203f200"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.548457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp" (OuterVolumeSpecName: "kube-api-access-pvbbp") pod "3937a364-08e4-44d1-a93e-8dd4a203f200" (UID: "3937a364-08e4-44d1-a93e-8dd4a203f200"). InnerVolumeSpecName "kube-api-access-pvbbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.584202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3937a364-08e4-44d1-a93e-8dd4a203f200" (UID: "3937a364-08e4-44d1-a93e-8dd4a203f200"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.586204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data" (OuterVolumeSpecName: "config-data") pod "3937a364-08e4-44d1-a93e-8dd4a203f200" (UID: "3937a364-08e4-44d1-a93e-8dd4a203f200"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.647033 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3937a364-08e4-44d1-a93e-8dd4a203f200-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.647077 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.647091 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.647102 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3937a364-08e4-44d1-a93e-8dd4a203f200-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:36 crc kubenswrapper[5030]: I0120 23:10:36.647117 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvbbp\" (UniqueName: \"kubernetes.io/projected/3937a364-08e4-44d1-a93e-8dd4a203f200-kube-api-access-pvbbp\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.089372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerStarted","Data":"71173d5ca0843e6f5b5688d151d66d0caee91c181cd50badb07d7166cf7f0ef6"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.089672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerStarted","Data":"7fc127f0ccf1c519c0b61afd47578f823e10cbbef32abf2437b0477539214c31"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.096267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerStarted","Data":"326c2d125eb3a698d87da6113f935f138da26210af73c40cb324ca56a9aa23a4"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.100369 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:10:37 crc kubenswrapper[5030]: E0120 23:10:37.100768 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="prometheus" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.100785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="prometheus" Jan 20 23:10:37 crc kubenswrapper[5030]: E0120 23:10:37.100801 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="init-config-reloader" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.100808 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="init-config-reloader" Jan 20 23:10:37 crc kubenswrapper[5030]: E0120 23:10:37.100826 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3937a364-08e4-44d1-a93e-8dd4a203f200" containerName="placement-db-sync" Jan 20 23:10:37 crc 
kubenswrapper[5030]: I0120 23:10:37.100833 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3937a364-08e4-44d1-a93e-8dd4a203f200" containerName="placement-db-sync" Jan 20 23:10:37 crc kubenswrapper[5030]: E0120 23:10:37.100843 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="thanos-sidecar" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.100849 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="thanos-sidecar" Jan 20 23:10:37 crc kubenswrapper[5030]: E0120 23:10:37.100856 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="config-reloader" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.100862 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="config-reloader" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.101024 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="prometheus" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.101034 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3937a364-08e4-44d1-a93e-8dd4a203f200" containerName="placement-db-sync" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.101043 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="config-reloader" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.101051 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" containerName="thanos-sidecar" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.101874 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.104138 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.104439 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.112728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8x74r" event={"ID":"3937a364-08e4-44d1-a93e-8dd4a203f200","Type":"ContainerDied","Data":"3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.112772 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a65b49d1e4304d1ce514efce34470e9317697138d8d69046a14f85d7d857187" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.112831 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8x74r" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.136579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.138374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"fc4c6ac6-9b26-4047-a309-eb01bf39412f","Type":"ContainerDied","Data":"4d379d1cc61c7cdd4213d3b6410a63ed3821d13956cc3aefa96cf4740f07fe7c"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.138488 5030 scope.go:117] "RemoveContainer" containerID="f2631a14d00bf858921d99a56f28d6f8aa713a92364714274f4423086ce45852" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.138796 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.171545 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e7109cf-496b-406f-817c-76e72c01ef90" containerID="fbe27cf2995fea8f398082bac7eea857f2a9475c0a7a41b96d6ac3447c0862fc" exitCode=0 Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.171584 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e7109cf-496b-406f-817c-76e72c01ef90" containerID="02eb4e5a22318beb65ae2d1e3ab0bd1770c0bf9d17ceca08eee138acb3445f34" exitCode=2 Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.171596 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e7109cf-496b-406f-817c-76e72c01ef90" containerID="e8e7436bd2501d29035883b10bb837e40ac85eee5032e5783e47d4670ba2866b" exitCode=0 Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.171634 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e7109cf-496b-406f-817c-76e72c01ef90" containerID="93e05fd4390c7930eea40ffb0b128a3c7efd393dee960fe75a044a26d1180047" exitCode=0 Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.172169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerDied","Data":"fbe27cf2995fea8f398082bac7eea857f2a9475c0a7a41b96d6ac3447c0862fc"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.172432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerDied","Data":"02eb4e5a22318beb65ae2d1e3ab0bd1770c0bf9d17ceca08eee138acb3445f34"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.172501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerDied","Data":"e8e7436bd2501d29035883b10bb837e40ac85eee5032e5783e47d4670ba2866b"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.172522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerDied","Data":"93e05fd4390c7930eea40ffb0b128a3c7efd393dee960fe75a044a26d1180047"} Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.188256 5030 scope.go:117] "RemoveContainer" containerID="91c5f057c90a781f3cc420c4ec9ef0d245558e00e54dc7ea910fb0d2310af899" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.200390 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.200335403 podStartE2EDuration="4.200335403s" podCreationTimestamp="2026-01-20 23:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:37.131601464 +0000 UTC m=+2109.451861762" watchObservedRunningTime="2026-01-20 23:10:37.200335403 +0000 UTC m=+2109.520595691" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.227258 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.235637 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.255671 5030 scope.go:117] "RemoveContainer" containerID="80c605c66cd522f8f765e017919e8e73caf65deb8d8f70b145a11ea221744d70" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260720 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.260886 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxp4m\" (UniqueName: \"kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.275401 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.277488 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.282235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.282491 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"metric-storage-prometheus-dockercfg-82v6w" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.282890 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.283534 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-web-config" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.284061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.284324 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-metric-storage-prometheus-svc" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.287208 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-1" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.287458 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-2" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.287638 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"prometheus-metric-storage-rulefiles-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.291933 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"prometheus-metric-storage-tls-assets-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.295820 5030 scope.go:117] "RemoveContainer" containerID="81c0add93369774cfdec79815a42700be1d5e4729d4c196bfc35f4a4f82494d7" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362008 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfkm5\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.362611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.363961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.364013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.364050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.364096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxp4m\" (UniqueName: \"kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.364124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.364150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.365016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.366946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.371780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.371973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.372116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.373700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.387656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxp4m\" (UniqueName: \"kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m\") pod \"placement-65fb4986d8-k2r2k\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.435373 5030 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.436476 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.476704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.476808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.476858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.476934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.476961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfkm5\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.477661 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.478025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.480183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " 
pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.480592 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.488308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.488694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.489481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.489816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.490264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.496527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfkm5\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.499086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.500145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.506458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.542398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"prometheus-metric-storage-0\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqm9c\" (UniqueName: \"kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.578822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml\") pod \"8e7109cf-496b-406f-817c-76e72c01ef90\" (UID: \"8e7109cf-496b-406f-817c-76e72c01ef90\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.581038 
5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.581280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.584410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c" (OuterVolumeSpecName: "kube-api-access-jqm9c") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "kube-api-access-jqm9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.588207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts" (OuterVolumeSpecName: "scripts") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.623057 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.643359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.680504 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.680536 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqm9c\" (UniqueName: \"kubernetes.io/projected/8e7109cf-496b-406f-817c-76e72c01ef90-kube-api-access-jqm9c\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.680547 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.680556 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e7109cf-496b-406f-817c-76e72c01ef90-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.680566 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.710521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.716411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data" (OuterVolumeSpecName: "config-data") pod "8e7109cf-496b-406f-817c-76e72c01ef90" (UID: "8e7109cf-496b-406f-817c-76e72c01ef90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.726302 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.730398 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.782180 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.782412 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e7109cf-496b-406f-817c-76e72c01ef90-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98kfz\" (UniqueName: \"kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz\") pod \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle\") pod \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data\") pod \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\" (UID: \"4983d8d3-58c0-4c76-9200-dfb9daefaf64\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4vdt\" (UniqueName: \"kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.883763 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts\") pod \"f69572b8-b37f-4e3c-9b64-e9076db53f18\" (UID: \"f69572b8-b37f-4e3c-9b64-e9076db53f18\") " Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.887411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4983d8d3-58c0-4c76-9200-dfb9daefaf64" (UID: "4983d8d3-58c0-4c76-9200-dfb9daefaf64"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.888229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt" (OuterVolumeSpecName: "kube-api-access-m4vdt") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "kube-api-access-m4vdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.888387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.888586 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz" (OuterVolumeSpecName: "kube-api-access-98kfz") pod "4983d8d3-58c0-4c76-9200-dfb9daefaf64" (UID: "4983d8d3-58c0-4c76-9200-dfb9daefaf64"). InnerVolumeSpecName "kube-api-access-98kfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.888998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.889960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts" (OuterVolumeSpecName: "scripts") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.911016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data" (OuterVolumeSpecName: "config-data") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.911471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f69572b8-b37f-4e3c-9b64-e9076db53f18" (UID: "f69572b8-b37f-4e3c-9b64-e9076db53f18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.926666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4983d8d3-58c0-4c76-9200-dfb9daefaf64" (UID: "4983d8d3-58c0-4c76-9200-dfb9daefaf64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.951201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:10:37 crc kubenswrapper[5030]: W0120 23:10:37.951969 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b296da3_d86c_431b_889e_a38437804d3a.slice/crio-6d51eb7685047c08ab7fa7cb5f81c0f5a56b71b7701a86aa0d0143e5d3e360cc WatchSource:0}: Error finding container 6d51eb7685047c08ab7fa7cb5f81c0f5a56b71b7701a86aa0d0143e5d3e360cc: Status 404 returned error can't find the container with id 6d51eb7685047c08ab7fa7cb5f81c0f5a56b71b7701a86aa0d0143e5d3e360cc Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.974001 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc4c6ac6-9b26-4047-a309-eb01bf39412f" path="/var/lib/kubelet/pods/fc4c6ac6-9b26-4047-a309-eb01bf39412f/volumes" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.985928 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986097 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4vdt\" (UniqueName: \"kubernetes.io/projected/f69572b8-b37f-4e3c-9b64-e9076db53f18-kube-api-access-m4vdt\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986179 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986256 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986331 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986425 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98kfz\" (UniqueName: \"kubernetes.io/projected/4983d8d3-58c0-4c76-9200-dfb9daefaf64-kube-api-access-98kfz\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc 
kubenswrapper[5030]: I0120 23:10:37.986516 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986597 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f69572b8-b37f-4e3c-9b64-e9076db53f18-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:37 crc kubenswrapper[5030]: I0120 23:10:37.986726 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4983d8d3-58c0-4c76-9200-dfb9daefaf64-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:38 crc kubenswrapper[5030]: W0120 23:10:38.140197 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1354ec_1e74_42e4_8eda_ffcba8af62a8.slice/crio-4b7687b382562c9996b451560f66f08288773b36e3b165c118b1b7c3a5471a69 WatchSource:0}: Error finding container 4b7687b382562c9996b451560f66f08288773b36e3b165c118b1b7c3a5471a69: Status 404 returned error can't find the container with id 4b7687b382562c9996b451560f66f08288773b36e3b165c118b1b7c3a5471a69 Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.146469 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.155358 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kz5cc"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.163455 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-kz5cc"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.184160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerStarted","Data":"4b7687b382562c9996b451560f66f08288773b36e3b165c118b1b7c3a5471a69"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.189795 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.190056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8e7109cf-496b-406f-817c-76e72c01ef90","Type":"ContainerDied","Data":"1aa7ff6fdf796de3abe99274b55007a8e84c3ee27933ed3bc8cfcb3c7b9d4dfd"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.190109 5030 scope.go:117] "RemoveContainer" containerID="fbe27cf2995fea8f398082bac7eea857f2a9475c0a7a41b96d6ac3447c0862fc" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.207340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerStarted","Data":"8722ba8a449d1974a6cabc858a5b686ba2dbe1041c85c5ff5e6525f64ac6076a"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.214578 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kpvj"] Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215042 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-central-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-central-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215086 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="proxy-httpd" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="proxy-httpd" Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215106 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4983d8d3-58c0-4c76-9200-dfb9daefaf64" containerName="barbican-db-sync" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215116 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4983d8d3-58c0-4c76-9200-dfb9daefaf64" containerName="barbican-db-sync" Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215132 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-notification-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215140 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-notification-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215174 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="sg-core" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215182 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="sg-core" Jan 20 23:10:38 crc kubenswrapper[5030]: E0120 23:10:38.215194 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f69572b8-b37f-4e3c-9b64-e9076db53f18" containerName="keystone-bootstrap" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69572b8-b37f-4e3c-9b64-e9076db53f18" containerName="keystone-bootstrap" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215386 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4983d8d3-58c0-4c76-9200-dfb9daefaf64" containerName="barbican-db-sync" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-central-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f69572b8-b37f-4e3c-9b64-e9076db53f18" containerName="keystone-bootstrap" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215438 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="sg-core" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="proxy-httpd" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.215465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" containerName="ceilometer-notification-agent" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.216832 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerStarted","Data":"6959375e26fa2c4a22be160d0bdb837d17a80304dda7a78b4f10357d3e5cdc2d"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.216863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerStarted","Data":"6d51eb7685047c08ab7fa7cb5f81c0f5a56b71b7701a86aa0d0143e5d3e360cc"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.216947 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.226432 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6b59248d1470fedf611c4e5e1d67f169745462391cadd36f1da360c1440ce8" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.226536 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-kz5cc" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.227848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kpvj"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.258778 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.258761614 podStartE2EDuration="3.258761614s" podCreationTimestamp="2026-01-20 23:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:38.228662259 +0000 UTC m=+2110.548922547" watchObservedRunningTime="2026-01-20 23:10:38.258761614 +0000 UTC m=+2110.579021902" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.282267 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.282705 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-xz7mx" event={"ID":"4983d8d3-58c0-4c76-9200-dfb9daefaf64","Type":"ContainerDied","Data":"4c96de8c9b1209250873e2879e7920d0ffdf2a079bd51bb6f50b7c69451286f0"} Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.282724 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c96de8c9b1209250873e2879e7920d0ffdf2a079bd51bb6f50b7c69451286f0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.290926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.291088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.291115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.291150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.291178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.291260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dskf\" (UniqueName: \"kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.313067 5030 scope.go:117] "RemoveContainer" containerID="02eb4e5a22318beb65ae2d1e3ab0bd1770c0bf9d17ceca08eee138acb3445f34" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.349755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dskf\" (UniqueName: \"kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.394801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.407907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.408878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.410438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.412136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.435885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.439306 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.448417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dskf\" (UniqueName: \"kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf\") pod \"keystone-bootstrap-6kpvj\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.471833 5030 scope.go:117] "RemoveContainer" containerID="e8e7436bd2501d29035883b10bb837e40ac85eee5032e5783e47d4670ba2866b" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.481745 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.483172 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.494889 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.495485 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.499662 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.512984 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-znx9n" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.527076 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.528423 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.532153 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.554533 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.555100 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.556969 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.585586 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.585792 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.586479 5030 scope.go:117] "RemoveContainer" containerID="1f0965177f09b885701678b3a85480e90b7149337fdeb135ff36eef3c364719d" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.602981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.603139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.603167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.603352 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8fd7\" (UniqueName: \"kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.604310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.610024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.631735 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.653764 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.655190 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.658414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.663796 5030 scope.go:117] "RemoveContainer" containerID="93e05fd4390c7930eea40ffb0b128a3c7efd393dee960fe75a044a26d1180047" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.665105 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.713417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szl22\" (UniqueName: \"kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqzdn\" (UniqueName: \"kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715923 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.715945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6rct\" (UniqueName: \"kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8fd7\" (UniqueName: \"kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716323 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.716387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.721146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.730474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 
23:10:38.735697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.738378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.740903 5030 scope.go:117] "RemoveContainer" containerID="ae017982458ddc6af7aca2197dbd82b45a8fce1dc25b7adfb5138c1f5c66675e" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.747141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8fd7\" (UniqueName: \"kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7\") pod \"barbican-keystone-listener-6647c4cc54-xdx92\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.789257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.805245 5030 scope.go:117] "RemoveContainer" containerID="fd9b139e4f8e23ee3d63600f68be17c4b6a9c41704e129cb9fee4ca166609d8e" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.818870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.818919 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.818939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.818993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szl22\" (UniqueName: \"kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqzdn\" (UniqueName: \"kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6rct\" (UniqueName: \"kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819284 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819326 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.819401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.827721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.828172 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.828744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.829114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.829413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.833679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.839667 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.845330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.846214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.846695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.847143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.848093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.856991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.863846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqzdn\" (UniqueName: \"kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn\") pod \"barbican-worker-747bd99d97-8wlhs\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.865252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 
23:10:38.868697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szl22\" (UniqueName: \"kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22\") pod \"ceilometer-0\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.870263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6rct\" (UniqueName: \"kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct\") pod \"barbican-api-8689d4ccf8-dsl7m\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.890034 5030 scope.go:117] "RemoveContainer" containerID="2ae32ab73523fefa4a071347e2c2b96e0ad6fcaa89f2c5f18bc0b2e6a93d85ba" Jan 20 23:10:38 crc kubenswrapper[5030]: I0120 23:10:38.959472 5030 scope.go:117] "RemoveContainer" containerID="745d8a8358077afde7e45bd24c3baddb33dc50783803442ad724e0850b039df1" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.009856 5030 scope.go:117] "RemoveContainer" containerID="1dd447d544e7681d6930f430646e45366ea16afd38076d543d31132350e8193c" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.036560 5030 scope.go:117] "RemoveContainer" containerID="a3c25eb491fc9772df5063d25427ecd83cd060358bb698133efd3c33d5bcaccb" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.108022 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.109179 5030 scope.go:117] "RemoveContainer" containerID="a088a2c71d5376f6316b4b1e10827836e559e903b68e3cf49a068c14fa399de3" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.120671 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.127658 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.154064 5030 scope.go:117] "RemoveContainer" containerID="438639437ff6cdf5a7b8b51f8127f0dcdd80ff295e3a8e7788eb685ff527e8e2" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.213915 5030 scope.go:117] "RemoveContainer" containerID="81069aac51082cb84904c36eea24a6a7c42b1204c1fb2deb3fbd811976d06a4f" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.276872 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kpvj"] Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.285374 5030 scope.go:117] "RemoveContainer" containerID="2a584d50b092bb2e78449a32c91df2402a0537b6a34ad6a3f2ec1df462b05a74" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.337845 5030 scope.go:117] "RemoveContainer" containerID="8e081848c548e341bad0e73dab25932b18c37499573198982675c824eb766cce" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.370098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerStarted","Data":"bf4743f2f827f2f4f9f2f8517fc9b12848a8b1ea2c24f37ec0a0b7b4cefab31e"} Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.370167 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.370189 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.396950 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" podStartSLOduration=2.396906803 podStartE2EDuration="2.396906803s" podCreationTimestamp="2026-01-20 23:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:39.390402804 +0000 UTC m=+2111.710663092" watchObservedRunningTime="2026-01-20 23:10:39.396906803 +0000 UTC m=+2111.717167091" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.427444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.700790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.757319 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.773417 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:10:39 crc kubenswrapper[5030]: W0120 23:10:39.846271 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode35248d6_8ad0_425b_a55d_ab390f26120c.slice/crio-1b4a0b9f602177cf21b9c455ed94e9c8ec3f6932b959a420a6cb042543b219c0 WatchSource:0}: Error finding container 1b4a0b9f602177cf21b9c455ed94e9c8ec3f6932b959a420a6cb042543b219c0: Status 404 returned error can't find the container with id 1b4a0b9f602177cf21b9c455ed94e9c8ec3f6932b959a420a6cb042543b219c0 Jan 20 23:10:39 crc kubenswrapper[5030]: W0120 
23:10:39.848135 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde431a48_e0d3_4508_a1e7_54598bbab91d.slice/crio-4692275494f8393f51f0db6c5891e760b2e79c0bcec93c1cdf71a2d7326dbb6d WatchSource:0}: Error finding container 4692275494f8393f51f0db6c5891e760b2e79c0bcec93c1cdf71a2d7326dbb6d: Status 404 returned error can't find the container with id 4692275494f8393f51f0db6c5891e760b2e79c0bcec93c1cdf71a2d7326dbb6d Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.857588 5030 scope.go:117] "RemoveContainer" containerID="e5e0bba9966895f6218767c5716dcc0df0126ba0c296f81b6f368434a9473f17" Jan 20 23:10:39 crc kubenswrapper[5030]: W0120 23:10:39.859217 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a44575_58d7_4b33_bf60_5e1592938de0.slice/crio-99ea2f92bd0085e00b6ed8e61b6ee5210a2952d1d805b9a13f4fb12ded19ba36 WatchSource:0}: Error finding container 99ea2f92bd0085e00b6ed8e61b6ee5210a2952d1d805b9a13f4fb12ded19ba36: Status 404 returned error can't find the container with id 99ea2f92bd0085e00b6ed8e61b6ee5210a2952d1d805b9a13f4fb12ded19ba36 Jan 20 23:10:39 crc kubenswrapper[5030]: W0120 23:10:39.862426 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2354629_cb98_42f4_bad0_675148557a8b.slice/crio-8bc90db94e27e6c81fe0d3b87f7414446a3b142fd022e669b2ce52bfbbe4b355 WatchSource:0}: Error finding container 8bc90db94e27e6c81fe0d3b87f7414446a3b142fd022e669b2ce52bfbbe4b355: Status 404 returned error can't find the container with id 8bc90db94e27e6c81fe0d3b87f7414446a3b142fd022e669b2ce52bfbbe4b355 Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.972733 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7109cf-496b-406f-817c-76e72c01ef90" path="/var/lib/kubelet/pods/8e7109cf-496b-406f-817c-76e72c01ef90/volumes" Jan 20 23:10:39 crc kubenswrapper[5030]: I0120 23:10:39.973400 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69572b8-b37f-4e3c-9b64-e9076db53f18" path="/var/lib/kubelet/pods/f69572b8-b37f-4e3c-9b64-e9076db53f18/volumes" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.157147 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.157443 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.157485 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.158300 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.158366 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe" gracePeriod=600 Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.401713 5030 generic.go:334] "Generic (PLEG): container finished" podID="818882ee-6bdd-4fa5-8294-d2ffe317ac07" containerID="ff54e0d54961da77f3e62ba00cba1c63dc7ac2c110e0251d825b9f5703c04b3f" exitCode=0 Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.401788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" event={"ID":"818882ee-6bdd-4fa5-8294-d2ffe317ac07","Type":"ContainerDied","Data":"ff54e0d54961da77f3e62ba00cba1c63dc7ac2c110e0251d825b9f5703c04b3f"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.404347 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe" exitCode=0 Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.404377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.404410 5030 scope.go:117] "RemoveContainer" containerID="a4be7f5ea905a1cff38b0d78f20d8889059d3e2348384072e966cbf5916a2460" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.406488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerStarted","Data":"8bc90db94e27e6c81fe0d3b87f7414446a3b142fd022e669b2ce52bfbbe4b355"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.407781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerStarted","Data":"99ea2f92bd0085e00b6ed8e61b6ee5210a2952d1d805b9a13f4fb12ded19ba36"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.414173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerStarted","Data":"4692275494f8393f51f0db6c5891e760b2e79c0bcec93c1cdf71a2d7326dbb6d"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.417286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerStarted","Data":"1b4a0b9f602177cf21b9c455ed94e9c8ec3f6932b959a420a6cb042543b219c0"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.424118 5030 scope.go:117] "RemoveContainer" containerID="b81ff34b6eebd5990d479504e7a8389fc6a76b12390c4a30c7a2cb9b553d3af3" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.430846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" 
event={"ID":"354c4165-5673-49e5-9f0f-9b657e9c969f","Type":"ContainerStarted","Data":"4de6380c1a8e970bc5c44cec3d099afdd96799357c85a7117c4477a8ca761b17"} Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.566574 5030 scope.go:117] "RemoveContainer" containerID="23c3b4bff5c9d478f4f105968fe365da138cfa39a0900042d6d05c0b978b4131" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.744830 5030 scope.go:117] "RemoveContainer" containerID="e6ae00a5b5176d1e8061fa02216441a4d16f157d7c215977e50d6e93c60f64c1" Jan 20 23:10:40 crc kubenswrapper[5030]: I0120 23:10:40.789498 5030 scope.go:117] "RemoveContainer" containerID="3edb379847341a6e47f821c96e48da36c12ae72e7fbf28a90433bd533216077a" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.439287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerStarted","Data":"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.439515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerStarted","Data":"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.441368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerStarted","Data":"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.441574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerStarted","Data":"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.442811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerStarted","Data":"e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.444469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerStarted","Data":"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.444499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerStarted","Data":"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.444559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.444593 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.446831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.448424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" event={"ID":"354c4165-5673-49e5-9f0f-9b657e9c969f","Type":"ContainerStarted","Data":"96be5fdb61f3c54676e8d0b8aa0dde3fe140c4ecbf8af87897e961ce635a4a5c"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.450194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerStarted","Data":"9d8a694665e2a6a02f99434282adda9069dade83d59ff2b7c6049c05690f0ad1"} Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.468194 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" podStartSLOduration=3.468178281 podStartE2EDuration="3.468178281s" podCreationTimestamp="2026-01-20 23:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:41.465994307 +0000 UTC m=+2113.786254635" watchObservedRunningTime="2026-01-20 23:10:41.468178281 +0000 UTC m=+2113.788438569" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.516480 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" podStartSLOduration=3.51646371 podStartE2EDuration="3.51646371s" podCreationTimestamp="2026-01-20 23:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:41.509333656 +0000 UTC m=+2113.829593954" watchObservedRunningTime="2026-01-20 23:10:41.51646371 +0000 UTC m=+2113.836723998" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.552184 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" podStartSLOduration=3.552166763 podStartE2EDuration="3.552166763s" podCreationTimestamp="2026-01-20 23:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:41.547978591 +0000 UTC m=+2113.868238879" watchObservedRunningTime="2026-01-20 23:10:41.552166763 +0000 UTC m=+2113.872427051" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.585115 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" podStartSLOduration=3.585094817 podStartE2EDuration="3.585094817s" podCreationTimestamp="2026-01-20 23:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:41.577587144 +0000 UTC m=+2113.897847432" watchObservedRunningTime="2026-01-20 23:10:41.585094817 +0000 UTC m=+2113.905355105" Jan 20 23:10:41 crc kubenswrapper[5030]: I0120 23:10:41.870981 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.002607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.002707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.003463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.003524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.003596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.003708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.003732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kpzj\" (UniqueName: \"kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj\") pod \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\" (UID: \"818882ee-6bdd-4fa5-8294-d2ffe317ac07\") " Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.004550 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/818882ee-6bdd-4fa5-8294-d2ffe317ac07-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.009884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts" (OuterVolumeSpecName: "scripts") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.009996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj" (OuterVolumeSpecName: "kube-api-access-6kpzj") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "kube-api-access-6kpzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.011406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.050375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.080165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data" (OuterVolumeSpecName: "config-data") pod "818882ee-6bdd-4fa5-8294-d2ffe317ac07" (UID: "818882ee-6bdd-4fa5-8294-d2ffe317ac07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.106491 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kpzj\" (UniqueName: \"kubernetes.io/projected/818882ee-6bdd-4fa5-8294-d2ffe317ac07-kube-api-access-6kpzj\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.106708 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.106792 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.106850 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.106913 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/818882ee-6bdd-4fa5-8294-d2ffe317ac07-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.463365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" event={"ID":"818882ee-6bdd-4fa5-8294-d2ffe317ac07","Type":"ContainerDied","Data":"94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8"} Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.463704 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94793905d4cdebd00c213f05a5633eff16b7f2814e4bbb8a7ea6e175bb6e6cc8" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.463567 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-82sw7" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.714710 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:42 crc kubenswrapper[5030]: E0120 23:10:42.715104 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818882ee-6bdd-4fa5-8294-d2ffe317ac07" containerName="cinder-db-sync" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.715121 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="818882ee-6bdd-4fa5-8294-d2ffe317ac07" containerName="cinder-db-sync" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.715275 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="818882ee-6bdd-4fa5-8294-d2ffe317ac07" containerName="cinder-db-sync" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.716155 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.718842 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.719040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-wgg84" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.719161 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.719367 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.748035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7758n\" (UniqueName: \"kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.824927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.842445 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:42 crc kubenswrapper[5030]: 
I0120 23:10:42.843865 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.845579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.860554 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.915123 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.916698 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.918885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.919052 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7758n\" (UniqueName: \"kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.926695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 
23:10:42.926868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.933585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.934098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.934166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.950551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7758n\" (UniqueName: \"kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.955086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts\") pod \"cinder-scheduler-0\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:42 crc kubenswrapper[5030]: I0120 23:10:42.965679 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " 
pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqmbh\" (UniqueName: 
\"kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgpxp\" (UniqueName: \"kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.028814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.037089 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.134782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135052 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmbh\" (UniqueName: \"kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgpxp\" (UniqueName: \"kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 
23:10:43.135192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135310 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.135339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.138191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.139300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.139402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.142520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.145349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.145886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.147063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.147549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.149222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.153581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.153611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgpxp\" (UniqueName: \"kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp\") pod \"barbican-api-c5f987788-8dlzq\" (UID: 
\"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.154410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data\") pod \"barbican-api-c5f987788-8dlzq\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.155465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.175493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmbh\" (UniqueName: \"kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh\") pod \"cinder-api-0\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.231961 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.457993 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.481407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerStarted","Data":"810bba1559104d0f47323ad648e6e7f24dfccd3c29f21edb3d5e1595173f2e71"} Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.481449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerStarted","Data":"5dd7898784fb28db16c57924aeb1b18210ee989a99b4444e66babb5cdb48ff0f"} Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.490642 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.708726 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:10:43 crc kubenswrapper[5030]: I0120 23:10:43.936224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:43 crc kubenswrapper[5030]: W0120 23:10:43.945958 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9374a87c_86c7_42a7_9b0a_e35a61d871f5.slice/crio-a5ceb80090f29be3e397178ee69eff4d57dbe37cb50e477f225fed142c114aa5 WatchSource:0}: Error finding container a5ceb80090f29be3e397178ee69eff4d57dbe37cb50e477f225fed142c114aa5: Status 404 returned error can't find the container with id a5ceb80090f29be3e397178ee69eff4d57dbe37cb50e477f225fed142c114aa5 Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.391938 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.392229 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.435887 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.470064 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.495584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerStarted","Data":"87d1b7e4f027e9c3956e54d48bd40ab55ba7e8e535e094ae200325bba241269b"} Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.496939 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerStarted","Data":"a5ceb80090f29be3e397178ee69eff4d57dbe37cb50e477f225fed142c114aa5"} Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.498796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerStarted","Data":"6da96e85e71775220ada0b0bd685567b9ffd43cc9d125a2bf9ce50b5e63d8545"} Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.498852 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:44 crc kubenswrapper[5030]: I0120 23:10:44.499003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.388248 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.389286 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.429239 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.429665 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.431302 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.508146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:45 crc kubenswrapper[5030]: I0120 23:10:45.508203 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:46 crc kubenswrapper[5030]: E0120 23:10:46.114969 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1354ec_1e74_42e4_8eda_ffcba8af62a8.slice/crio-conmon-e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0.scope\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1354ec_1e74_42e4_8eda_ffcba8af62a8.slice/crio-e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.515646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerStarted","Data":"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a"} Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.516955 5030 generic.go:334] "Generic (PLEG): container finished" podID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerID="e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0" exitCode=0 Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.517041 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.517050 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.517042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerDied","Data":"e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0"} Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.652330 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:46 crc kubenswrapper[5030]: I0120 23:10:46.654278 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.166176 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.543201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerStarted","Data":"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33"} Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.553873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerStarted","Data":"e11a6225db06c03537442bca4cc97ea574e0f4e5666909d3028483fe4fb389c2"} Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.554250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.559715 5030 generic.go:334] "Generic (PLEG): container finished" podID="354c4165-5673-49e5-9f0f-9b657e9c969f" containerID="96be5fdb61f3c54676e8d0b8aa0dde3fe140c4ecbf8af87897e961ce635a4a5c" exitCode=0 Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.559837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" event={"ID":"354c4165-5673-49e5-9f0f-9b657e9c969f","Type":"ContainerDied","Data":"96be5fdb61f3c54676e8d0b8aa0dde3fe140c4ecbf8af87897e961ce635a4a5c"} Jan 20 23:10:47 crc kubenswrapper[5030]: 
I0120 23:10:47.570028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerStarted","Data":"253233ff162c9e7305f61b0faa0810f0949d931fef29e1de6252df9e9b3381e3"} Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.582201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerStarted","Data":"e53a19888fbe4b142de355206d3ab31f28f294382a2552dd84cc805eb5b41bed"} Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.582272 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.582281 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.583146 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.202561457 podStartE2EDuration="9.583127847s" podCreationTimestamp="2026-01-20 23:10:38 +0000 UTC" firstStartedPulling="2026-01-20 23:10:39.86090368 +0000 UTC m=+2112.181163968" lastFinishedPulling="2026-01-20 23:10:47.24147007 +0000 UTC m=+2119.561730358" observedRunningTime="2026-01-20 23:10:47.576576427 +0000 UTC m=+2119.896836715" watchObservedRunningTime="2026-01-20 23:10:47.583127847 +0000 UTC m=+2119.903388135" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.760202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:47 crc kubenswrapper[5030]: I0120 23:10:47.763851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.590198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerStarted","Data":"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26"} Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.592141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerStarted","Data":"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7"} Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.592204 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api-log" containerID="cri-o://cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" gracePeriod=30 Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.592242 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api" containerID="cri-o://ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" gracePeriod=30 Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.592216 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.594470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" event={"ID":"9f621682-a425-4314-ae0d-423bc0b23935","Type":"ContainerStarted","Data":"f7071f8a75e8f74dd7b4086b6cfd77c0d19c5687e5f92846bb994539bc2278af"} Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.599719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerStarted","Data":"e40e9ac3a45289a55deb5dd5f6e1a134c721f434436b873c0c20a597937dd46e"} Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.599743 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.600987 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.634095 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=6.634080556 podStartE2EDuration="6.634080556s" podCreationTimestamp="2026-01-20 23:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:48.627238498 +0000 UTC m=+2120.947498786" watchObservedRunningTime="2026-01-20 23:10:48.634080556 +0000 UTC m=+2120.954340844" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.660877 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=6.66086478 podStartE2EDuration="6.66086478s" podCreationTimestamp="2026-01-20 23:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:48.656145584 +0000 UTC m=+2120.976405872" watchObservedRunningTime="2026-01-20 23:10:48.66086478 +0000 UTC m=+2120.981125068" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.693556 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" podStartSLOduration=6.693542058 podStartE2EDuration="6.693542058s" podCreationTimestamp="2026-01-20 23:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:48.684144278 +0000 UTC m=+2121.004404586" watchObservedRunningTime="2026-01-20 23:10:48.693542058 +0000 UTC m=+2121.013802346" Jan 20 23:10:48 crc kubenswrapper[5030]: I0120 23:10:48.728008 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" podStartSLOduration=2.518542108 podStartE2EDuration="33.7279833s" podCreationTimestamp="2026-01-20 23:10:15 +0000 UTC" firstStartedPulling="2026-01-20 23:10:15.962191471 +0000 UTC m=+2088.282451759" lastFinishedPulling="2026-01-20 23:10:47.171632663 +0000 UTC m=+2119.491892951" observedRunningTime="2026-01-20 23:10:48.703721777 +0000 UTC m=+2121.023982065" watchObservedRunningTime="2026-01-20 23:10:48.7279833 +0000 UTC m=+2121.048243588" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.182561 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.273877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dskf\" (UniqueName: \"kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf\") pod \"354c4165-5673-49e5-9f0f-9b657e9c969f\" (UID: \"354c4165-5673-49e5-9f0f-9b657e9c969f\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.444465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.444599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf" (OuterVolumeSpecName: "kube-api-access-9dskf") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "kube-api-access-9dskf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.444862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts" (OuterVolumeSpecName: "scripts") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.445098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.478390 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.478418 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.478428 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dskf\" (UniqueName: \"kubernetes.io/projected/354c4165-5673-49e5-9f0f-9b657e9c969f-kube-api-access-9dskf\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.478440 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.570537 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data" (OuterVolumeSpecName: "config-data") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.574013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "354c4165-5673-49e5-9f0f-9b657e9c969f" (UID: "354c4165-5673-49e5-9f0f-9b657e9c969f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.580383 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.580414 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/354c4165-5673-49e5-9f0f-9b657e9c969f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.608090 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616681 5030 generic.go:334] "Generic (PLEG): container finished" podID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerID="ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" exitCode=0 Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616706 5030 generic.go:334] "Generic (PLEG): container finished" podID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerID="cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" exitCode=143 Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerDied","Data":"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7"} Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerDied","Data":"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33"} Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9374a87c-86c7-42a7-9b0a-e35a61d871f5","Type":"ContainerDied","Data":"a5ceb80090f29be3e397178ee69eff4d57dbe37cb50e477f225fed142c114aa5"} Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.616798 5030 scope.go:117] "RemoveContainer" containerID="ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.617249 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.635339 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.637750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6kpvj" event={"ID":"354c4165-5673-49e5-9f0f-9b657e9c969f","Type":"ContainerDied","Data":"4de6380c1a8e970bc5c44cec3d099afdd96799357c85a7117c4477a8ca761b17"} Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.637782 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de6380c1a8e970bc5c44cec3d099afdd96799357c85a7117c4477a8ca761b17" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.662730 5030 scope.go:117] "RemoveContainer" containerID="cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.748990 5030 scope.go:117] "RemoveContainer" containerID="ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" Jan 20 23:10:49 crc kubenswrapper[5030]: E0120 23:10:49.750535 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7\": container with ID starting with ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7 not found: ID does not exist" containerID="ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.750569 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7"} err="failed to get container status \"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7\": rpc error: code = NotFound desc = could not find container \"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7\": container with ID starting with ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7 not found: ID does not exist" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.750593 5030 scope.go:117] "RemoveContainer" containerID="cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" Jan 20 23:10:49 crc kubenswrapper[5030]: E0120 23:10:49.750800 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33\": container with ID starting with cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33 not found: ID does not exist" containerID="cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.750829 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33"} err="failed to get container status \"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33\": rpc error: code = NotFound desc = could not find container \"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33\": container with ID starting with cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33 not found: ID does not exist" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.750844 5030 scope.go:117] "RemoveContainer" containerID="ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.751007 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7"} err="failed to get container status \"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7\": rpc error: code = NotFound desc = could not find container \"ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7\": container with ID starting with ee35a59fb09b3b48103866af5353e9b7223dac57d57f33a1831c37a4f42e2cf7 not found: ID does not exist" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.751021 5030 scope.go:117] "RemoveContainer" containerID="cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.751177 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33"} err="failed to get container status \"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33\": rpc error: code = NotFound desc = could not find container \"cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33\": container with ID starting with cb0eb6d574c9c0c7cc1765d2d5bd51ea5254100bdc1add5df1a9b7ade7822b33 not found: ID does not exist" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792006 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:10:49 crc kubenswrapper[5030]: E0120 23:10:49.792418 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354c4165-5673-49e5-9f0f-9b657e9c969f" containerName="keystone-bootstrap" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792430 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="354c4165-5673-49e5-9f0f-9b657e9c969f" containerName="keystone-bootstrap" Jan 20 23:10:49 crc kubenswrapper[5030]: E0120 23:10:49.792441 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api-log" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792447 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api-log" Jan 20 23:10:49 crc kubenswrapper[5030]: E0120 23:10:49.792460 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792648 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api-log" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792662 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" containerName="cinder-api" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.792678 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="354c4165-5673-49e5-9f0f-9b657e9c969f" containerName="keystone-bootstrap" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: 
\"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793384 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.793518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqmbh\" (UniqueName: \"kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh\") pod \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\" (UID: \"9374a87c-86c7-42a7-9b0a-e35a61d871f5\") " Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.796098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs" (OuterVolumeSpecName: "logs") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.796176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.797665 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.798244 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.798383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.798393 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.798529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.798651 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-zrrcb" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.809773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh" (OuterVolumeSpecName: "kube-api-access-bqmbh") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "kube-api-access-bqmbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.820803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.821709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.828038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts" (OuterVolumeSpecName: "scripts") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.872832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.877107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data" (OuterVolumeSpecName: "config-data") pod "9374a87c-86c7-42a7-9b0a-e35a61d871f5" (UID: "9374a87c-86c7-42a7-9b0a-e35a61d871f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.894992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9msb7\" (UniqueName: \"kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895252 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895306 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895317 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9374a87c-86c7-42a7-9b0a-e35a61d871f5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895326 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895334 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895343 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9374a87c-86c7-42a7-9b0a-e35a61d871f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895351 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9374a87c-86c7-42a7-9b0a-e35a61d871f5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.895359 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqmbh\" (UniqueName: \"kubernetes.io/projected/9374a87c-86c7-42a7-9b0a-e35a61d871f5-kube-api-access-bqmbh\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.981328 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:49 crc kubenswrapper[5030]: I0120 23:10:49.987704 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.000849 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.002175 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.005586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.005801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.005900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006392 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006751 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc 
kubenswrapper[5030]: I0120 23:10:50.006873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9msb7\" (UniqueName: \"kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.006970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grk2d\" (UniqueName: \"kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.007602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.009408 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.009682 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:10:50 crc kubenswrapper[5030]: 
I0120 23:10:50.010220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.010603 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.014985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.015545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.019491 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.020570 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.020716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.021176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.021675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.036144 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9msb7\" (UniqueName: \"kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7\") pod \"keystone-7f866c7db4-82phj\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.108498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.108632 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.108743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grk2d\" (UniqueName: \"kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.108875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.109307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 
23:10:50.110169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.112677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.113159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.113429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.113797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.114135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.114550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.125167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grk2d\" (UniqueName: \"kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d\") pod \"cinder-api-0\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.266401 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.321546 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.638405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.648307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerStarted","Data":"06d13adbffbcd724575da3d439b5c58b3043b222f3e76d916058a23944d86b4f"} Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.648379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerStarted","Data":"722d4093088b0113351d34ef93f46a8253354211420a96de0ce82f8dc64ebd7b"} Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.688563 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podStartSLOduration=13.688538772 podStartE2EDuration="13.688538772s" podCreationTimestamp="2026-01-20 23:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:50.677354359 +0000 UTC m=+2122.997614647" watchObservedRunningTime="2026-01-20 23:10:50.688538772 +0000 UTC m=+2123.008799060" Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.735890 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:10:50 crc kubenswrapper[5030]: I0120 23:10:50.890222 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:10:50 crc kubenswrapper[5030]: W0120 23:10:50.893214 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c254adc_e7a0_4ef6_8b53_a6e0ed7b79ab.slice/crio-c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376 WatchSource:0}: Error finding container c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376: Status 404 returned error can't find the container with id c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376 Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.132633 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.664914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" event={"ID":"03b59562-d96f-453a-bfc7-a9f3c7e7f872","Type":"ContainerStarted","Data":"7ce9f7684a297141dd328f9a8c3e12154ea1db484b1ac1e034af93f31c64eab1"} Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.664960 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" event={"ID":"03b59562-d96f-453a-bfc7-a9f3c7e7f872","Type":"ContainerStarted","Data":"55cb7a6b93f3c971f30d12b05d3eb90c90bab4e19c118061da82f49e26511216"} Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.665721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.667895 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="9f621682-a425-4314-ae0d-423bc0b23935" containerID="f7071f8a75e8f74dd7b4086b6cfd77c0d19c5687e5f92846bb994539bc2278af" exitCode=0 Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.667957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" event={"ID":"9f621682-a425-4314-ae0d-423bc0b23935","Type":"ContainerDied","Data":"f7071f8a75e8f74dd7b4086b6cfd77c0d19c5687e5f92846bb994539bc2278af"} Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.670133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerStarted","Data":"7fa5493ab547d6dfa6b51ddf09c04288a1a6dcaaf5124b6386a2e61f891bb6d0"} Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.670169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerStarted","Data":"c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376"} Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.683768 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" podStartSLOduration=2.683749798 podStartE2EDuration="2.683749798s" podCreationTimestamp="2026-01-20 23:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:51.681583475 +0000 UTC m=+2124.001843753" watchObservedRunningTime="2026-01-20 23:10:51.683749798 +0000 UTC m=+2124.004010086" Jan 20 23:10:51 crc kubenswrapper[5030]: I0120 23:10:51.978349 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9374a87c-86c7-42a7-9b0a-e35a61d871f5" path="/var/lib/kubelet/pods/9374a87c-86c7-42a7-9b0a-e35a61d871f5/volumes" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.624478 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.625052 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.631464 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.679978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerStarted","Data":"1decb37a6a7c1d14ac75639b689c1ba474b929273efe96a724937a53898e5fcd"} Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.681311 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.685991 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:10:52 crc kubenswrapper[5030]: I0120 23:10:52.705378 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.705359069 podStartE2EDuration="3.705359069s" podCreationTimestamp="2026-01-20 23:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-20 23:10:52.704724124 +0000 UTC m=+2125.024984422" watchObservedRunningTime="2026-01-20 23:10:52.705359069 +0000 UTC m=+2125.025619357" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.038213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.049060 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.171980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsrdw\" (UniqueName: \"kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw\") pod \"9f621682-a425-4314-ae0d-423bc0b23935\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.172496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle\") pod \"9f621682-a425-4314-ae0d-423bc0b23935\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.172605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data\") pod \"9f621682-a425-4314-ae0d-423bc0b23935\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.172735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data\") pod \"9f621682-a425-4314-ae0d-423bc0b23935\" (UID: \"9f621682-a425-4314-ae0d-423bc0b23935\") " Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.177320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f621682-a425-4314-ae0d-423bc0b23935" (UID: "9f621682-a425-4314-ae0d-423bc0b23935"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.184113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw" (OuterVolumeSpecName: "kube-api-access-fsrdw") pod "9f621682-a425-4314-ae0d-423bc0b23935" (UID: "9f621682-a425-4314-ae0d-423bc0b23935"). InnerVolumeSpecName "kube-api-access-fsrdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.204760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f621682-a425-4314-ae0d-423bc0b23935" (UID: "9f621682-a425-4314-ae0d-423bc0b23935"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.221489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data" (OuterVolumeSpecName: "config-data") pod "9f621682-a425-4314-ae0d-423bc0b23935" (UID: "9f621682-a425-4314-ae0d-423bc0b23935"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.260659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.275024 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsrdw\" (UniqueName: \"kubernetes.io/projected/9f621682-a425-4314-ae0d-423bc0b23935-kube-api-access-fsrdw\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.275060 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.275071 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.275080 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f621682-a425-4314-ae0d-423bc0b23935-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.693048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" event={"ID":"9f621682-a425-4314-ae0d-423bc0b23935","Type":"ContainerDied","Data":"cf60a07c21784c33a730b50212b0e627d92244d1aa8eb31a2c93013cc8796a77"} Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.693231 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-db-sync-wvcfl" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.693299 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf60a07c21784c33a730b50212b0e627d92244d1aa8eb31a2c93013cc8796a77" Jan 20 23:10:53 crc kubenswrapper[5030]: I0120 23:10:53.775097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.002136 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: E0120 23:10:54.002585 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f621682-a425-4314-ae0d-423bc0b23935" containerName="watcher-db-sync" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.002604 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f621682-a425-4314-ae0d-423bc0b23935" containerName="watcher-db-sync" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.002863 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f621682-a425-4314-ae0d-423bc0b23935" containerName="watcher-db-sync" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.004041 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.006292 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-watcher-dockercfg-cx859" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.009033 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.012226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-api-config-data" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.017033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.020770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-applier-config-data" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.041467 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.048190 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.088839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.088908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.089017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj4xt\" (UniqueName: \"kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.089107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.089146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.119838 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.121022 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.124574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-decision-engine-config-data" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.130116 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190468 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzlz\" (UniqueName: \"kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190636 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj4xt\" (UniqueName: \"kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.190712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 
23:10:54.190733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.191506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.195065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.195108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.197654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.206202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj4xt\" (UniqueName: \"kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt\") pod \"watcher-api-0\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grdtf\" (UniqueName: \"kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzlz\" (UniqueName: \"kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.292889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.293298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.296465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.297961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.317065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzlz\" (UniqueName: \"kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz\") pod 
\"watcher-applier-0\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.317483 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.330588 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.397391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.397455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.397487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grdtf\" (UniqueName: \"kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.397542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.397573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.401054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.402178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.402547 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: 
\"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.403910 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.417213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grdtf\" (UniqueName: \"kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf\") pod \"watcher-decision-engine-0\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.448988 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.718969 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd4126e6-c5d3-4d42-9ca2-07029f5e3368" containerID="bffef7765f1438423591828481b992e73d3dfc2858bd854db49a96094325b559" exitCode=0 Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.719066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" event={"ID":"cd4126e6-c5d3-4d42-9ca2-07029f5e3368","Type":"ContainerDied","Data":"bffef7765f1438423591828481b992e73d3dfc2858bd854db49a96094325b559"} Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.719205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="cinder-scheduler" containerID="cri-o://58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a" gracePeriod=30 Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.719326 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="probe" containerID="cri-o://f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26" gracePeriod=30 Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.763493 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.856235 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.869043 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.929574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.929854 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api-log" containerID="cri-o://c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934" gracePeriod=30 Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.930301 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api" containerID="cri-o://a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3" gracePeriod=30 Jan 20 23:10:54 crc kubenswrapper[5030]: I0120 23:10:54.944853 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.024262 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.731058 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"529ed4cf-39aa-463b-bf65-da6900fa74e4","Type":"ContainerStarted","Data":"c0215662cba4d3fcee5762da221eedd081e1779a395d41797f159130a47d0f0b"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.734104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerStarted","Data":"362390f7e5e2722bdb9dfa7ff294373b01676d6e2e6144fe7ff43a1a55bcb624"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.734130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerStarted","Data":"6ccfdc465e4afa42f0fce4e71a832f3fe6e0e0a176f66bf3b2dd07a114bfd007"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.734140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerStarted","Data":"45cbd2742643a43463adfd12a55cf9ff200f49d92f855c287cffacc4d5ab724e"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.734972 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.741733 5030 generic.go:334] "Generic (PLEG): container finished" podID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerID="f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26" exitCode=0 Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.741814 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerDied","Data":"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.743858 5030 generic.go:334] "Generic (PLEG): container finished" podID="23a44575-58d7-4b33-bf60-5e1592938de0" containerID="c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934" exitCode=143 Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.743907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerDied","Data":"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934"} Jan 20 23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.746039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"48a72147-2d43-4262-afd5-e06c554d7379","Type":"ContainerStarted","Data":"8ecea9c64378b037f85abf8f32f3055666c1bf1a9e8387f440cdc25b1cfe49c0"} Jan 20 
23:10:55 crc kubenswrapper[5030]: I0120 23:10:55.757521 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-api-0" podStartSLOduration=2.757502952 podStartE2EDuration="2.757502952s" podCreationTimestamp="2026-01-20 23:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:55.751818583 +0000 UTC m=+2128.072078871" watchObservedRunningTime="2026-01-20 23:10:55.757502952 +0000 UTC m=+2128.077763240" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.128687 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.252953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mtrg\" (UniqueName: \"kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg\") pod \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.253694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config\") pod \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.253746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle\") pod \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\" (UID: \"cd4126e6-c5d3-4d42-9ca2-07029f5e3368\") " Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.258815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg" (OuterVolumeSpecName: "kube-api-access-2mtrg") pod "cd4126e6-c5d3-4d42-9ca2-07029f5e3368" (UID: "cd4126e6-c5d3-4d42-9ca2-07029f5e3368"). InnerVolumeSpecName "kube-api-access-2mtrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.298519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd4126e6-c5d3-4d42-9ca2-07029f5e3368" (UID: "cd4126e6-c5d3-4d42-9ca2-07029f5e3368"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.321074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config" (OuterVolumeSpecName: "config") pod "cd4126e6-c5d3-4d42-9ca2-07029f5e3368" (UID: "cd4126e6-c5d3-4d42-9ca2-07029f5e3368"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.356363 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mtrg\" (UniqueName: \"kubernetes.io/projected/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-kube-api-access-2mtrg\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.356393 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.356405 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd4126e6-c5d3-4d42-9ca2-07029f5e3368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.764174 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.765104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qncrz" event={"ID":"cd4126e6-c5d3-4d42-9ca2-07029f5e3368","Type":"ContainerDied","Data":"08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2"} Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.765134 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e9ae82d646800dcb325d1c5671df33f53267d975131032a7c5a837b2756bd2" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.982679 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:10:56 crc kubenswrapper[5030]: E0120 23:10:56.983589 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4126e6-c5d3-4d42-9ca2-07029f5e3368" containerName="neutron-db-sync" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.983602 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4126e6-c5d3-4d42-9ca2-07029f5e3368" containerName="neutron-db-sync" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.984888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4126e6-c5d3-4d42-9ca2-07029f5e3368" containerName="neutron-db-sync" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.986445 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.992478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.992733 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-7cm49" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.995496 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:10:56 crc kubenswrapper[5030]: I0120 23:10:56.995559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.014352 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.088402 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.088450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.088474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.089029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.089272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbhz\" (UniqueName: \"kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.190856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.190940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-swbhz\" (UniqueName: \"kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.190997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.191015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.191034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.197129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.199690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.201298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.205205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.209326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbhz\" (UniqueName: \"kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz\") pod \"neutron-694b9bdbdc-2fwcf\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.320013 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:57 crc kubenswrapper[5030]: I0120 23:10:57.957666 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.010481 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:10:58 crc kubenswrapper[5030]: W0120 23:10:58.020341 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e045aec_3446_4403_9f41_3460c1e4edd8.slice/crio-6e004e65f75f1c4f661c3346d30743dd494fe9516ae6fa3aa71a5ef321bad700 WatchSource:0}: Error finding container 6e004e65f75f1c4f661c3346d30743dd494fe9516ae6fa3aa71a5ef321bad700: Status 404 returned error can't find the container with id 6e004e65f75f1c4f661c3346d30743dd494fe9516ae6fa3aa71a5ef321bad700 Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.588980 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.720495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs\") pod \"23a44575-58d7-4b33-bf60-5e1592938de0\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.720645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle\") pod \"23a44575-58d7-4b33-bf60-5e1592938de0\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.720750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6rct\" (UniqueName: \"kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct\") pod \"23a44575-58d7-4b33-bf60-5e1592938de0\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.720772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data\") pod \"23a44575-58d7-4b33-bf60-5e1592938de0\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.720825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom\") pod \"23a44575-58d7-4b33-bf60-5e1592938de0\" (UID: \"23a44575-58d7-4b33-bf60-5e1592938de0\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.721061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs" (OuterVolumeSpecName: "logs") pod "23a44575-58d7-4b33-bf60-5e1592938de0" (UID: "23a44575-58d7-4b33-bf60-5e1592938de0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.721240 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23a44575-58d7-4b33-bf60-5e1592938de0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.725305 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23a44575-58d7-4b33-bf60-5e1592938de0" (UID: "23a44575-58d7-4b33-bf60-5e1592938de0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.731225 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct" (OuterVolumeSpecName: "kube-api-access-g6rct") pod "23a44575-58d7-4b33-bf60-5e1592938de0" (UID: "23a44575-58d7-4b33-bf60-5e1592938de0"). InnerVolumeSpecName "kube-api-access-g6rct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.735260 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.751753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23a44575-58d7-4b33-bf60-5e1592938de0" (UID: "23a44575-58d7-4b33-bf60-5e1592938de0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.779720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data" (OuterVolumeSpecName: "config-data") pod "23a44575-58d7-4b33-bf60-5e1592938de0" (UID: "23a44575-58d7-4b33-bf60-5e1592938de0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.796642 5030 generic.go:334] "Generic (PLEG): container finished" podID="23a44575-58d7-4b33-bf60-5e1592938de0" containerID="a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3" exitCode=0 Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.797130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerDied","Data":"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.797209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" event={"ID":"23a44575-58d7-4b33-bf60-5e1592938de0","Type":"ContainerDied","Data":"99ea2f92bd0085e00b6ed8e61b6ee5210a2952d1d805b9a13f4fb12ded19ba36"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.797277 5030 scope.go:117] "RemoveContainer" containerID="a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.797468 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.807390 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"48a72147-2d43-4262-afd5-e06c554d7379","Type":"ContainerStarted","Data":"588fc920f6558236538877bdec6583b515618625942d9f216a444cabdbea34f9"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.842922 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7758n\" (UniqueName: \"kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843249 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom\") pod \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\" (UID: \"33ce50c2-c998-4879-9c9e-c8e80e8fd485\") " Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843553 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843565 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6rct\" (UniqueName: \"kubernetes.io/projected/23a44575-58d7-4b33-bf60-5e1592938de0-kube-api-access-g6rct\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843576 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.843586 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/23a44575-58d7-4b33-bf60-5e1592938de0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.850756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.850808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.880360 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-decision-engine-0" podStartSLOduration=2.445154244 podStartE2EDuration="4.880343283s" podCreationTimestamp="2026-01-20 23:10:54 +0000 UTC" firstStartedPulling="2026-01-20 23:10:55.055736396 +0000 UTC m=+2127.375996684" lastFinishedPulling="2026-01-20 23:10:57.490925445 +0000 UTC m=+2129.811185723" observedRunningTime="2026-01-20 23:10:58.84218349 +0000 UTC m=+2131.162443778" watchObservedRunningTime="2026-01-20 23:10:58.880343283 +0000 UTC m=+2131.200603571" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.889842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n" (OuterVolumeSpecName: "kube-api-access-7758n") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "kube-api-access-7758n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.896789 5030 scope.go:117] "RemoveContainer" containerID="c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.897074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts" (OuterVolumeSpecName: "scripts") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.897829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"529ed4cf-39aa-463b-bf65-da6900fa74e4","Type":"ContainerStarted","Data":"4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.922046 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.942609 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-8689d4ccf8-dsl7m"] Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.962809 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.962837 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33ce50c2-c998-4879-9c9e-c8e80e8fd485-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.962847 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.962856 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7758n\" (UniqueName: \"kubernetes.io/projected/33ce50c2-c998-4879-9c9e-c8e80e8fd485-kube-api-access-7758n\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.973890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerStarted","Data":"a09b3aeb461e9751eafd059a5278797b9fba7440f6db9b9ac3e7ae86e2e084e5"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.973926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerStarted","Data":"2adc9505bd51441e424188ca6fc7870a632c7aaf4ff225ef4486d6433fc3a30a"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.973938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerStarted","Data":"6e004e65f75f1c4f661c3346d30743dd494fe9516ae6fa3aa71a5ef321bad700"} Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.973962 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:10:58 crc kubenswrapper[5030]: I0120 23:10:58.983400 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-applier-0" podStartSLOduration=3.353514094 podStartE2EDuration="5.98338192s" podCreationTimestamp="2026-01-20 23:10:53 +0000 UTC" firstStartedPulling="2026-01-20 23:10:54.85944019 +0000 UTC m=+2127.179700478" lastFinishedPulling="2026-01-20 23:10:57.489308016 +0000 UTC m=+2129.809568304" observedRunningTime="2026-01-20 23:10:58.948008396 +0000 UTC m=+2131.268268684" watchObservedRunningTime="2026-01-20 23:10:58.98338192 +0000 
UTC m=+2131.303642208" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.024945 5030 scope.go:117] "RemoveContainer" containerID="a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026755 5030 generic.go:334] "Generic (PLEG): container finished" podID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerID="58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a" exitCode=0 Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerDied","Data":"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a"} Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"33ce50c2-c998-4879-9c9e-c8e80e8fd485","Type":"ContainerDied","Data":"87d1b7e4f027e9c3956e54d48bd40ab55ba7e8e535e094ae200325bba241269b"} Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.026843 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3\": container with ID starting with a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3 not found: ID does not exist" containerID="a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3"} err="failed to get container status \"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3\": rpc error: code = NotFound desc = could not find container \"a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3\": container with ID starting with a65e7eec2719fa9999163669c81ea7a03d598d4d9ceaade4adca1a34314b5de3 not found: ID does not exist" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026902 5030 scope.go:117] "RemoveContainer" containerID="c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.026967 5030 util.go:48] "No ready sandbox for pod can be found. 
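The pod_startup_latency_tracker entries above each report two figures. They are consistent with podStartE2EDuration being measured from podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration being that same interval minus the image-pull window between firstStartedPulling and lastFinishedPulling (taken from the monotonic m=+ offsets). Here is a short check against the watcher-decision-engine-0 entry, using the timestamps exactly as printed in the log; the relationship is inferred from these numbers rather than quoted from kubelet source.

```go
// Recomputes the two durations reported for watcher-decision-engine-0 from
// the timestamps in the log. The "E2E minus image-pull time" relationship is
// inferred from the figures above, not quoted from kubelet source.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time formatting, as printed in the log lines.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, _ := time.Parse(layout, "2026-01-20 23:10:54 +0000 UTC")
	running, _ := time.Parse(layout, "2026-01-20 23:10:58.880343283 +0000 UTC")

	// Image-pull window from the monotonic offsets (m=+...) attached to
	// firstStartedPulling and lastFinishedPulling.
	pull := time.Duration((2129.811185723 - 2127.375996684) * float64(time.Second))

	e2e := running.Sub(created)
	fmt.Println("podStartE2EDuration:", e2e)      // 4.880343283s
	fmt.Println("podStartSLOduration:", e2e-pull) // ~2.445154244s
}
```

The watcher-applier-0 entry checks out the same way: 5.98338192s minus a 2.629867826s pull window gives the reported 3.353514094.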
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.028021 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934\": container with ID starting with c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934 not found: ID does not exist" containerID="c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.028048 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934"} err="failed to get container status \"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934\": rpc error: code = NotFound desc = could not find container \"c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934\": container with ID starting with c98691c50c7558cadcd4c891c5e1dd88fcbc902c6642d771d9ee7849bd033934 not found: ID does not exist" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.028061 5030 scope.go:117] "RemoveContainer" containerID="f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.034397 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" podStartSLOduration=3.034376807 podStartE2EDuration="3.034376807s" podCreationTimestamp="2026-01-20 23:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:10:59.02755136 +0000 UTC m=+2131.347811648" watchObservedRunningTime="2026-01-20 23:10:59.034376807 +0000 UTC m=+2131.354637095" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.083474 5030 scope.go:117] "RemoveContainer" containerID="58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.101733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.105707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data" (OuterVolumeSpecName: "config-data") pod "33ce50c2-c998-4879-9c9e-c8e80e8fd485" (UID: "33ce50c2-c998-4879-9c9e-c8e80e8fd485"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.156003 5030 scope.go:117] "RemoveContainer" containerID="f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.157450 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26\": container with ID starting with f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26 not found: ID does not exist" containerID="f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.157486 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26"} err="failed to get container status \"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26\": rpc error: code = NotFound desc = could not find container \"f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26\": container with ID starting with f0e85d91353391beaadfc676a49bcecabece2f85b110a918b4e60e9f221f9a26 not found: ID does not exist" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.157505 5030 scope.go:117] "RemoveContainer" containerID="58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.157843 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a\": container with ID starting with 58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a not found: ID does not exist" containerID="58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.157866 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a"} err="failed to get container status \"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a\": rpc error: code = NotFound desc = could not find container \"58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a\": container with ID starting with 58bb7cefca1fb3c47879aaf43e560060d90b85da9f280149c6c7ce438fc7ac9a not found: ID does not exist" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.173615 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.173653 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ce50c2-c998-4879-9c9e-c8e80e8fd485-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.318393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.331320 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.392818 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.411391 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.429696 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.430088 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="cinder-scheduler" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430105 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="cinder-scheduler" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.430117 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api-log" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430123 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api-log" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.430143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api" Jan 20 23:10:59 crc kubenswrapper[5030]: E0120 23:10:59.430161 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="probe" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430167 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="probe" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="cinder-scheduler" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430338 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api-log" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430347 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" containerName="probe" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.430363 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" containerName="barbican-api" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.431343 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.440111 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.449185 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.581293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.581493 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.581967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.582083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjrml\" (UniqueName: \"kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.582228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.582311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjrml\" (UniqueName: \"kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.684642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.693648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.697053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.702730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjrml\" (UniqueName: \"kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.706908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.712175 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts\") pod \"cinder-scheduler-0\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.752356 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.977858 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a44575-58d7-4b33-bf60-5e1592938de0" path="/var/lib/kubelet/pods/23a44575-58d7-4b33-bf60-5e1592938de0/volumes" Jan 20 23:10:59 crc kubenswrapper[5030]: I0120 23:10:59.979069 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ce50c2-c998-4879-9c9e-c8e80e8fd485" path="/var/lib/kubelet/pods/33ce50c2-c998-4879-9c9e-c8e80e8fd485/volumes" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.061683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.063194 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.068002 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.068204 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.074643 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.195584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.195662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.196310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.196524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfp5\" (UniqueName: \"kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.196563 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.196655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.196701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.212904 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:11:00 crc kubenswrapper[5030]: W0120 23:11:00.220562 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39d2f867_199b_446c_9f02_c8b2f842df63.slice/crio-375a932709a0124814adc43ac08fff5d02401d9ced857ef46ee28f723910fa85 WatchSource:0}: Error finding container 375a932709a0124814adc43ac08fff5d02401d9ced857ef46ee28f723910fa85: Status 404 returned error can't find the container with id 375a932709a0124814adc43ac08fff5d02401d9ced857ef46ee28f723910fa85 Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfp5\" (UniqueName: \"kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298756 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.298845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.303281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.306114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.306157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.309073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.309282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.310106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.323517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfp5\" (UniqueName: 
\"kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5\") pod \"neutron-d9778659d-djkxb\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.395445 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:00 crc kubenswrapper[5030]: I0120 23:11:00.920542 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:11:00 crc kubenswrapper[5030]: W0120 23:11:00.927507 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65835ff1_85e2_4b33_a384_c93fdf569c40.slice/crio-b1bbbbcdac7a677bb23329174532b0167d4ac661543b3432b8472b8a2a02c593 WatchSource:0}: Error finding container b1bbbbcdac7a677bb23329174532b0167d4ac661543b3432b8472b8a2a02c593: Status 404 returned error can't find the container with id b1bbbbcdac7a677bb23329174532b0167d4ac661543b3432b8472b8a2a02c593 Jan 20 23:11:01 crc kubenswrapper[5030]: I0120 23:11:01.060027 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerStarted","Data":"b1bbbbcdac7a677bb23329174532b0167d4ac661543b3432b8472b8a2a02c593"} Jan 20 23:11:01 crc kubenswrapper[5030]: I0120 23:11:01.064026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerStarted","Data":"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444"} Jan 20 23:11:01 crc kubenswrapper[5030]: I0120 23:11:01.064092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerStarted","Data":"375a932709a0124814adc43ac08fff5d02401d9ced857ef46ee28f723910fa85"} Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.073581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerStarted","Data":"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf"} Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.073905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerStarted","Data":"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d"} Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.074055 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.078493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerStarted","Data":"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68"} Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.100080 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" podStartSLOduration=2.1000608 podStartE2EDuration="2.1000608s" podCreationTimestamp="2026-01-20 23:11:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:02.097672682 +0000 UTC m=+2134.417932980" watchObservedRunningTime="2026-01-20 23:11:02.1000608 +0000 UTC m=+2134.420321088" Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.141715 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.141699148 podStartE2EDuration="3.141699148s" podCreationTimestamp="2026-01-20 23:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:02.128103746 +0000 UTC m=+2134.448364034" watchObservedRunningTime="2026-01-20 23:11:02.141699148 +0000 UTC m=+2134.461959436" Jan 20 23:11:02 crc kubenswrapper[5030]: I0120 23:11:02.379579 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.318453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.325065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.331811 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.394312 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.450067 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.477802 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:04 crc kubenswrapper[5030]: I0120 23:11:04.752718 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:11:05 crc kubenswrapper[5030]: I0120 23:11:05.106536 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:05 crc kubenswrapper[5030]: I0120 23:11:05.115866 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:05 crc kubenswrapper[5030]: I0120 23:11:05.145379 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:11:05 crc kubenswrapper[5030]: I0120 23:11:05.152481 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:07 crc kubenswrapper[5030]: I0120 23:11:07.780712 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:07 crc kubenswrapper[5030]: I0120 23:11:07.781122 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api-log" 
containerID="cri-o://6ccfdc465e4afa42f0fce4e71a832f3fe6e0e0a176f66bf3b2dd07a114bfd007" gracePeriod=30 Jan 20 23:11:07 crc kubenswrapper[5030]: I0120 23:11:07.781171 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api" containerID="cri-o://362390f7e5e2722bdb9dfa7ff294373b01676d6e2e6144fe7ff43a1a55bcb624" gracePeriod=30 Jan 20 23:11:08 crc kubenswrapper[5030]: I0120 23:11:08.135587 5030 generic.go:334] "Generic (PLEG): container finished" podID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerID="6ccfdc465e4afa42f0fce4e71a832f3fe6e0e0a176f66bf3b2dd07a114bfd007" exitCode=143 Jan 20 23:11:08 crc kubenswrapper[5030]: I0120 23:11:08.135663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerDied","Data":"6ccfdc465e4afa42f0fce4e71a832f3fe6e0e0a176f66bf3b2dd07a114bfd007"} Jan 20 23:11:08 crc kubenswrapper[5030]: I0120 23:11:08.396506 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:11:08 crc kubenswrapper[5030]: I0120 23:11:08.437876 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:11:09 crc kubenswrapper[5030]: I0120 23:11:09.125978 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:09 crc kubenswrapper[5030]: I0120 23:11:09.938169 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.013270 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.1.214:9322/\": read tcp 10.217.0.2:40776->10.217.1.214:9322: read: connection reset by peer" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.013291 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.1.214:9322/\": read tcp 10.217.0.2:40770->10.217.1.214:9322: read: connection reset by peer" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.180672 5030 generic.go:334] "Generic (PLEG): container finished" podID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerID="362390f7e5e2722bdb9dfa7ff294373b01676d6e2e6144fe7ff43a1a55bcb624" exitCode=0 Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.180712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerDied","Data":"362390f7e5e2722bdb9dfa7ff294373b01676d6e2e6144fe7ff43a1a55bcb624"} Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.509244 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.603499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data\") pod \"0bddc34f-b741-44c3-bbfe-51b78a663cec\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.603568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle\") pod \"0bddc34f-b741-44c3-bbfe-51b78a663cec\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.603645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca\") pod \"0bddc34f-b741-44c3-bbfe-51b78a663cec\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.603688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj4xt\" (UniqueName: \"kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt\") pod \"0bddc34f-b741-44c3-bbfe-51b78a663cec\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.603709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs\") pod \"0bddc34f-b741-44c3-bbfe-51b78a663cec\" (UID: \"0bddc34f-b741-44c3-bbfe-51b78a663cec\") " Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.604571 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs" (OuterVolumeSpecName: "logs") pod "0bddc34f-b741-44c3-bbfe-51b78a663cec" (UID: "0bddc34f-b741-44c3-bbfe-51b78a663cec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.609093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt" (OuterVolumeSpecName: "kube-api-access-xj4xt") pod "0bddc34f-b741-44c3-bbfe-51b78a663cec" (UID: "0bddc34f-b741-44c3-bbfe-51b78a663cec"). InnerVolumeSpecName "kube-api-access-xj4xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.627700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bddc34f-b741-44c3-bbfe-51b78a663cec" (UID: "0bddc34f-b741-44c3-bbfe-51b78a663cec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.632970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0bddc34f-b741-44c3-bbfe-51b78a663cec" (UID: "0bddc34f-b741-44c3-bbfe-51b78a663cec"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.653580 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data" (OuterVolumeSpecName: "config-data") pod "0bddc34f-b741-44c3-bbfe-51b78a663cec" (UID: "0bddc34f-b741-44c3-bbfe-51b78a663cec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.706266 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.706651 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.706840 5030 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0bddc34f-b741-44c3-bbfe-51b78a663cec-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.706993 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj4xt\" (UniqueName: \"kubernetes.io/projected/0bddc34f-b741-44c3-bbfe-51b78a663cec-kube-api-access-xj4xt\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:11 crc kubenswrapper[5030]: I0120 23:11:11.707119 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bddc34f-b741-44c3-bbfe-51b78a663cec-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.201926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"0bddc34f-b741-44c3-bbfe-51b78a663cec","Type":"ContainerDied","Data":"45cbd2742643a43463adfd12a55cf9ff200f49d92f855c287cffacc4d5ab724e"} Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.201988 5030 scope.go:117] "RemoveContainer" containerID="362390f7e5e2722bdb9dfa7ff294373b01676d6e2e6144fe7ff43a1a55bcb624" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.202030 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.230716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.240973 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.248934 5030 scope.go:117] "RemoveContainer" containerID="6ccfdc465e4afa42f0fce4e71a832f3fe6e0e0a176f66bf3b2dd07a114bfd007" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.259057 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:12 crc kubenswrapper[5030]: E0120 23:11:12.259496 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api-log" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.259521 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api-log" Jan 20 23:11:12 crc kubenswrapper[5030]: E0120 23:11:12.259548 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.259559 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.259794 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.259812 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" containerName="watcher-api-log" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.260956 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.270578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-api-config-data" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.270789 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-watcher-public-svc" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.270970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-watcher-internal-svc" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.273324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.319913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.319974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.319997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.320122 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.320281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ntf\" (UniqueName: \"kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.320312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.320568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 
23:11:12.421597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ntf\" (UniqueName: \"kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.421858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.422441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.425613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.426413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data\") pod \"watcher-api-0\" (UID: 
\"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.427074 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.427343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.429569 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.452857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ntf\" (UniqueName: \"kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf\") pod \"watcher-api-0\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:12 crc kubenswrapper[5030]: I0120 23:11:12.578888 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:13 crc kubenswrapper[5030]: I0120 23:11:13.073910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:11:13 crc kubenswrapper[5030]: I0120 23:11:13.215077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerStarted","Data":"6fc4be01630ab3040830a21792f67c6f2385770be3db6d0ccd8ff393d36b2bd7"} Jan 20 23:11:13 crc kubenswrapper[5030]: I0120 23:11:13.980548 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bddc34f-b741-44c3-bbfe-51b78a663cec" path="/var/lib/kubelet/pods/0bddc34f-b741-44c3-bbfe-51b78a663cec/volumes" Jan 20 23:11:14 crc kubenswrapper[5030]: I0120 23:11:14.231232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerStarted","Data":"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6"} Jan 20 23:11:14 crc kubenswrapper[5030]: I0120 23:11:14.231292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerStarted","Data":"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0"} Jan 20 23:11:14 crc kubenswrapper[5030]: I0120 23:11:14.231585 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:14 crc kubenswrapper[5030]: I0120 23:11:14.264303 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-api-0" podStartSLOduration=2.26428096 podStartE2EDuration="2.26428096s" podCreationTimestamp="2026-01-20 
23:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:14.252654196 +0000 UTC m=+2146.572914494" watchObservedRunningTime="2026-01-20 23:11:14.26428096 +0000 UTC m=+2146.584541288" Jan 20 23:11:16 crc kubenswrapper[5030]: I0120 23:11:16.079292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:17 crc kubenswrapper[5030]: I0120 23:11:17.579388 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:21 crc kubenswrapper[5030]: I0120 23:11:21.838962 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.580140 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.587224 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.851387 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.853123 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.857127 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-z8rq5" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.857350 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.857507 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.862576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.938346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.938416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.938867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:22 crc kubenswrapper[5030]: I0120 23:11:22.938928 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.041037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.041699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.041849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.042674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.043590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.054363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.057856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.060664 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs\") pod \"openstackclient\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.172318 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.334917 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:11:23 crc kubenswrapper[5030]: I0120 23:11:23.657611 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:11:24 crc kubenswrapper[5030]: I0120 23:11:24.341987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"276869e9-7303-4347-a105-fdc604a60324","Type":"ContainerStarted","Data":"fbd9bf9f73aba592b00498bb42493eb96178822d5c2e8da3dcd3907782e1e416"} Jan 20 23:11:24 crc kubenswrapper[5030]: I0120 23:11:24.349123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"276869e9-7303-4347-a105-fdc604a60324","Type":"ContainerStarted","Data":"8c040e8efb9fe728547d88d6c818188007ed0341567fe51a35b908056f3cf94d"} Jan 20 23:11:24 crc kubenswrapper[5030]: I0120 23:11:24.373320 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.373296744 podStartE2EDuration="2.373296744s" podCreationTimestamp="2026-01-20 23:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:24.362003268 +0000 UTC m=+2156.682263586" watchObservedRunningTime="2026-01-20 23:11:24.373296744 +0000 UTC m=+2156.693557032" Jan 20 23:11:27 crc kubenswrapper[5030]: I0120 23:11:27.454342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.056320 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.058292 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.065180 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.065311 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.065393 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.067853 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.255801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.255869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.255929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.255956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.255982 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff2bk\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.256000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.256146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.256185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.273963 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.274240 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-central-agent" containerID="cri-o://9d8a694665e2a6a02f99434282adda9069dade83d59ff2b7c6049c05690f0ad1" gracePeriod=30 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.274279 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="proxy-httpd" containerID="cri-o://e11a6225db06c03537442bca4cc97ea574e0f4e5666909d3028483fe4fb389c2" gracePeriod=30 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.274320 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="sg-core" containerID="cri-o://810bba1559104d0f47323ad648e6e7f24dfccd3c29f21edb3d5e1595173f2e71" gracePeriod=30 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.274320 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-notification-agent" containerID="cri-o://5dd7898784fb28db16c57924aeb1b18210ee989a99b4444e66babb5cdb48ff0f" gracePeriod=30 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358317 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358347 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff2bk\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.358870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.365318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.365585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.365611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.365764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.366730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.374797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff2bk\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk\") pod \"swift-proxy-b8cd55c6b-gdzdd\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.385297 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.462905 5030 generic.go:334] "Generic (PLEG): container finished" podID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerID="e11a6225db06c03537442bca4cc97ea574e0f4e5666909d3028483fe4fb389c2" exitCode=0 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.462937 5030 generic.go:334] "Generic (PLEG): container finished" podID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerID="810bba1559104d0f47323ad648e6e7f24dfccd3c29f21edb3d5e1595173f2e71" exitCode=2 Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.462956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerDied","Data":"e11a6225db06c03537442bca4cc97ea574e0f4e5666909d3028483fe4fb389c2"} Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.462990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerDied","Data":"810bba1559104d0f47323ad648e6e7f24dfccd3c29f21edb3d5e1595173f2e71"} Jan 20 23:11:28 crc kubenswrapper[5030]: I0120 23:11:28.853795 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.009598 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.009834 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-log" containerID="cri-o://053f678a1a0cf6510e44aeba332cc377d889571f025e9cd39037db9969376bc1" gracePeriod=30 Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.009964 
5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-httpd" containerID="cri-o://326c2d125eb3a698d87da6113f935f138da26210af73c40cb324ca56a9aa23a4" gracePeriod=30 Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.473589 5030 generic.go:334] "Generic (PLEG): container finished" podID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerID="9d8a694665e2a6a02f99434282adda9069dade83d59ff2b7c6049c05690f0ad1" exitCode=0 Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.473640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerDied","Data":"9d8a694665e2a6a02f99434282adda9069dade83d59ff2b7c6049c05690f0ad1"} Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.475974 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerStarted","Data":"a603f5479f1751fca07b831722eb086379566bd909e2e23ac72bbb41392e0f4d"} Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.476003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerStarted","Data":"444566938fcbfab3901e6fd0f098d24ad50b1a35701c21a72a39c1fc137f4568"} Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.476013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerStarted","Data":"4a64b3c6378f7152c57a04b7dc2b20f750c7d82ab0081e988883ad85a13d61b7"} Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.476128 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.478044 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerID="053f678a1a0cf6510e44aeba332cc377d889571f025e9cd39037db9969376bc1" exitCode=143 Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.478075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerDied","Data":"053f678a1a0cf6510e44aeba332cc377d889571f025e9cd39037db9969376bc1"} Jan 20 23:11:29 crc kubenswrapper[5030]: I0120 23:11:29.499914 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" podStartSLOduration=1.499894582 podStartE2EDuration="1.499894582s" podCreationTimestamp="2026-01-20 23:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:29.497074703 +0000 UTC m=+2161.817335011" watchObservedRunningTime="2026-01-20 23:11:29.499894582 +0000 UTC m=+2161.820154870" Jan 20 23:11:30 crc kubenswrapper[5030]: I0120 23:11:30.408766 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:11:30 crc kubenswrapper[5030]: I0120 23:11:30.479020 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:11:30 crc kubenswrapper[5030]: I0120 23:11:30.479532 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-api" containerID="cri-o://2adc9505bd51441e424188ca6fc7870a632c7aaf4ff225ef4486d6433fc3a30a" gracePeriod=30 Jan 20 23:11:30 crc kubenswrapper[5030]: I0120 23:11:30.479571 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-httpd" containerID="cri-o://a09b3aeb461e9751eafd059a5278797b9fba7440f6db9b9ac3e7ae86e2e084e5" gracePeriod=30 Jan 20 23:11:30 crc kubenswrapper[5030]: I0120 23:11:30.495992 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.342232 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.342788 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-log" containerID="cri-o://71173d5ca0843e6f5b5688d151d66d0caee91c181cd50badb07d7166cf7f0ef6" gracePeriod=30 Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.342845 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-httpd" containerID="cri-o://8722ba8a449d1974a6cabc858a5b686ba2dbe1041c85c5ff5e6525f64ac6076a" gracePeriod=30 Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.521286 5030 generic.go:334] "Generic (PLEG): container finished" podID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerID="5dd7898784fb28db16c57924aeb1b18210ee989a99b4444e66babb5cdb48ff0f" exitCode=0 Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.521359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerDied","Data":"5dd7898784fb28db16c57924aeb1b18210ee989a99b4444e66babb5cdb48ff0f"} Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.526065 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerID="a09b3aeb461e9751eafd059a5278797b9fba7440f6db9b9ac3e7ae86e2e084e5" exitCode=0 Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.526135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerDied","Data":"a09b3aeb461e9751eafd059a5278797b9fba7440f6db9b9ac3e7ae86e2e084e5"} Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.529321 5030 generic.go:334] "Generic (PLEG): container finished" podID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerID="71173d5ca0843e6f5b5688d151d66d0caee91c181cd50badb07d7166cf7f0ef6" exitCode=143 Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.529456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerDied","Data":"71173d5ca0843e6f5b5688d151d66d0caee91c181cd50badb07d7166cf7f0ef6"} Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.791474 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.846509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.846682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.846812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.846841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.847267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.847206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.847341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szl22\" (UniqueName: \"kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.847691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd\") pod \"de431a48-e0d3-4508-a1e7-54598bbab91d\" (UID: \"de431a48-e0d3-4508-a1e7-54598bbab91d\") " Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.848093 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.848352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.852121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts" (OuterVolumeSpecName: "scripts") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.858695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22" (OuterVolumeSpecName: "kube-api-access-szl22") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "kube-api-access-szl22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.949295 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.949324 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szl22\" (UniqueName: \"kubernetes.io/projected/de431a48-e0d3-4508-a1e7-54598bbab91d-kube-api-access-szl22\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.949335 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de431a48-e0d3-4508-a1e7-54598bbab91d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:31 crc kubenswrapper[5030]: I0120 23:11:31.954783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.017133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.054935 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.055598 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.064897 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data" (OuterVolumeSpecName: "config-data") pod "de431a48-e0d3-4508-a1e7-54598bbab91d" (UID: "de431a48-e0d3-4508-a1e7-54598bbab91d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.065744 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-8pxqj"] Jan 20 23:11:32 crc kubenswrapper[5030]: E0120 23:11:32.066109 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="sg-core" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066133 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="sg-core" Jan 20 23:11:32 crc kubenswrapper[5030]: E0120 23:11:32.066151 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-notification-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-notification-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: E0120 23:11:32.066171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-central-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066177 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-central-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: E0120 23:11:32.066195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="proxy-httpd" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066201 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="proxy-httpd" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066804 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="proxy-httpd" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066826 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-central-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066838 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="ceilometer-notification-agent" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.066859 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" containerName="sg-core" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.067443 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.082121 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-8pxqj"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.118986 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.120094 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.121959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.145206 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.158031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9mxf\" (UniqueName: \"kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.158359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.158483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rm4\" (UniqueName: \"kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.158536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.158647 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/de431a48-e0d3-4508-a1e7-54598bbab91d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.173941 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-22qhj"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.175204 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.176229 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-22qhj"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rm4\" (UniqueName: \"kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9mxf\" (UniqueName: \"kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.260488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4zq\" (UniqueName: \"kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.261341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " 
pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.261578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.268590 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.270093 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.272280 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.282021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9mxf\" (UniqueName: \"kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf\") pod \"nova-api-db-create-8pxqj\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.284813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.288768 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rm4\" (UniqueName: \"kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4\") pod \"nova-api-aaf6-account-create-update-tdntq\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.362331 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.362371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlscm\" (UniqueName: \"kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.362425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.362505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4zq\" 
(UniqueName: \"kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.363458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.371669 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-mbn7q"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.372855 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.382752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4zq\" (UniqueName: \"kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq\") pod \"nova-cell0-db-create-22qhj\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.383296 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.390968 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-mbn7q"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.437281 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.464010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.464273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcjv\" (UniqueName: \"kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.464389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.464502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlscm\" (UniqueName: \"kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.464830 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.465826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.466258 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.476813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.484003 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.491757 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.492108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlscm\" (UniqueName: \"kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm\") pod \"nova-cell0-e28c-account-create-update-7lfld\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.569382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkj7n\" (UniqueName: \"kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.569497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.569667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.569731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcjv\" (UniqueName: \"kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.570953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.573907 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerID="326c2d125eb3a698d87da6113f935f138da26210af73c40cb324ca56a9aa23a4" exitCode=0 Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.573989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerDied","Data":"326c2d125eb3a698d87da6113f935f138da26210af73c40cb324ca56a9aa23a4"} Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.576843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"de431a48-e0d3-4508-a1e7-54598bbab91d","Type":"ContainerDied","Data":"4692275494f8393f51f0db6c5891e760b2e79c0bcec93c1cdf71a2d7326dbb6d"} Jan 20 23:11:32 crc 
kubenswrapper[5030]: I0120 23:11:32.576874 5030 scope.go:117] "RemoveContainer" containerID="e11a6225db06c03537442bca4cc97ea574e0f4e5666909d3028483fe4fb389c2" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.577017 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.585548 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.597856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcjv\" (UniqueName: \"kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv\") pod \"nova-cell1-db-create-mbn7q\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.641420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.674439 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkj7n\" (UniqueName: \"kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.674509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.675360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.675668 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.709958 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.710416 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.712881 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.713643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkj7n\" (UniqueName: \"kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n\") pod \"nova-cell1-884d-account-create-update-887vg\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.715946 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.718176 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.719188 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.719380 5030 scope.go:117] "RemoveContainer" containerID="810bba1559104d0f47323ad648e6e7f24dfccd3c29f21edb3d5e1595173f2e71" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776337 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frv82\" (UniqueName: \"kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.776557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 
23:11:32.776698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.849897 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879517 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879595 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frv82\" (UniqueName: \"kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.879611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.880469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.880743 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.890870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.894935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.895118 5030 scope.go:117] "RemoveContainer" containerID="5dd7898784fb28db16c57924aeb1b18210ee989a99b4444e66babb5cdb48ff0f" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.895278 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.903292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.907074 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frv82\" (UniqueName: \"kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.912555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.966872 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980669 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vktth\" (UniqueName: \"kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980871 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.980954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle\") pod \"a5fbd872-796a-4b41-b925-516fc6db87f6\" (UID: \"a5fbd872-796a-4b41-b925-516fc6db87f6\") " Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.983327 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs" (OuterVolumeSpecName: "logs") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.986166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts" (OuterVolumeSpecName: "scripts") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.986352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:32 crc kubenswrapper[5030]: I0120 23:11:32.991840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.016712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth" (OuterVolumeSpecName: "kube-api-access-vktth") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "kube-api-access-vktth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.016972 5030 scope.go:117] "RemoveContainer" containerID="9d8a694665e2a6a02f99434282adda9069dade83d59ff2b7c6049c05690f0ad1" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.032998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-8pxqj"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.037047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:33 crc kubenswrapper[5030]: W0120 23:11:33.051123 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91e13cbf_a921_49d9_ab19_631c7122b04c.slice/crio-c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386 WatchSource:0}: Error finding container c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386: Status 404 returned error can't find the container with id c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386 Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.073194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data" (OuterVolumeSpecName: "config-data") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086107 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086152 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086192 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086208 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vktth\" (UniqueName: \"kubernetes.io/projected/a5fbd872-796a-4b41-b925-516fc6db87f6-kube-api-access-vktth\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086221 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086274 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.086288 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5fbd872-796a-4b41-b925-516fc6db87f6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.098219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a5fbd872-796a-4b41-b925-516fc6db87f6" (UID: "a5fbd872-796a-4b41-b925-516fc6db87f6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.112580 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.165288 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.187260 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5fbd872-796a-4b41-b925-516fc6db87f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.187287 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.338438 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-22qhj"] Jan 20 23:11:33 crc kubenswrapper[5030]: W0120 23:11:33.353248 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod229641c0_730e_4c1b_b5ba_89b3e250b1c9.slice/crio-7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c WatchSource:0}: Error finding container 7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c: Status 404 returned error can't find the container with id 7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.452521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.468893 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-mbn7q"] Jan 20 23:11:33 crc kubenswrapper[5030]: W0120 23:11:33.508738 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod946a76ec_5148_426e_97b5_bd22220a7ff2.slice/crio-2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01 WatchSource:0}: Error finding container 2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01: Status 404 returned error can't find the container with id 2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01 Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.594210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" event={"ID":"effb0df3-9c75-4104-90c2-505e1e4e3b1b","Type":"ContainerStarted","Data":"bc6af635e7469805210beb35edd587c7aa233e75f38fdddef526c1f10038f18a"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.594251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" event={"ID":"effb0df3-9c75-4104-90c2-505e1e4e3b1b","Type":"ContainerStarted","Data":"f8e9df1ed7867a8b684e95f2eb251c3437223ab0c618d1f6962b94c9426591f9"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.610154 5030 generic.go:334] "Generic (PLEG): container finished" podID="91e13cbf-a921-49d9-ab19-631c7122b04c" containerID="a8f87c8119662cd288e317c6cd09aea15848c621b368e0d59c2e3dde575d7349" exitCode=0 Jan 20 
23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.610244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" event={"ID":"91e13cbf-a921-49d9-ab19-631c7122b04c","Type":"ContainerDied","Data":"a8f87c8119662cd288e317c6cd09aea15848c621b368e0d59c2e3dde575d7349"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.610267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" event={"ID":"91e13cbf-a921-49d9-ab19-631c7122b04c","Type":"ContainerStarted","Data":"c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.616526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" event={"ID":"bc090b44-6543-43f0-a4cb-4ee41c4fdab2","Type":"ContainerStarted","Data":"f855f3c8113651bd998969f31c54bbef1450c223429cd7aeba26ff12c190e285"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.621680 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.623657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" event={"ID":"946a76ec-5148-426e-97b5-bd22220a7ff2","Type":"ContainerStarted","Data":"2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.629792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" event={"ID":"229641c0-730e-4c1b-b5ba-89b3e250b1c9","Type":"ContainerStarted","Data":"33f20a4031ef4829ecbbf9e28989b665faf28e57166d8564c12773236bce9457"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.629824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" event={"ID":"229641c0-730e-4c1b-b5ba-89b3e250b1c9","Type":"ContainerStarted","Data":"7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.632052 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" podStartSLOduration=1.632023143 podStartE2EDuration="1.632023143s" podCreationTimestamp="2026-01-20 23:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:33.608965969 +0000 UTC m=+2165.929226257" watchObservedRunningTime="2026-01-20 23:11:33.632023143 +0000 UTC m=+2165.952283431" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.636790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5fbd872-796a-4b41-b925-516fc6db87f6","Type":"ContainerDied","Data":"51b13d2e74518375b839ebdd3a626fab34e90c11bca7bc268289841ba997a5f0"} Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.636857 5030 scope.go:117] "RemoveContainer" containerID="326c2d125eb3a698d87da6113f935f138da26210af73c40cb324ca56a9aa23a4" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.636989 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.655602 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" podStartSLOduration=1.6555854989999998 podStartE2EDuration="1.655585499s" podCreationTimestamp="2026-01-20 23:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:33.643927884 +0000 UTC m=+2165.964188172" watchObservedRunningTime="2026-01-20 23:11:33.655585499 +0000 UTC m=+2165.975845787" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.755232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:33 crc kubenswrapper[5030]: W0120 23:11:33.761388 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80b5a44a_5492_4aa7_bd30_8639b6527e66.slice/crio-d417312e39e06811cd427971e4ff28e1da3cb43b324b96d35898f6e9d0af01af WatchSource:0}: Error finding container d417312e39e06811cd427971e4ff28e1da3cb43b324b96d35898f6e9d0af01af: Status 404 returned error can't find the container with id d417312e39e06811cd427971e4ff28e1da3cb43b324b96d35898f6e9d0af01af Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.891544 5030 scope.go:117] "RemoveContainer" containerID="053f678a1a0cf6510e44aeba332cc377d889571f025e9cd39037db9969376bc1" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.929486 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.945420 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.958408 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:33 crc kubenswrapper[5030]: E0120 23:11:33.958853 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-httpd" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.958868 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-httpd" Jan 20 23:11:33 crc kubenswrapper[5030]: E0120 23:11:33.958883 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-log" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.958889 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-log" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.959059 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-httpd" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.959078 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" containerName="glance-log" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.960083 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.961858 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.967313 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.972123 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5fbd872-796a-4b41-b925-516fc6db87f6" path="/var/lib/kubelet/pods/a5fbd872-796a-4b41-b925-516fc6db87f6/volumes" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.972886 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de431a48-e0d3-4508-a1e7-54598bbab91d" path="/var/lib/kubelet/pods/de431a48-e0d3-4508-a1e7-54598bbab91d/volumes" Jan 20 23:11:33 crc kubenswrapper[5030]: I0120 23:11:33.986702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.009607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7khh\" (UniqueName: \"kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.009738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.009760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.009789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.009805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.012309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs\") pod \"glance-default-external-api-0\" (UID: 
\"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.012372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.012412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7khh\" (UniqueName: \"kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114802 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.114846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.120720 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.121503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.121766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.131555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.133199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7khh\" (UniqueName: \"kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh\") pod 
\"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.177540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.282242 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.665463 5030 generic.go:334] "Generic (PLEG): container finished" podID="b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" containerID="82ab471c3d64055113b2021eef08677b0e9845c041b161dbbb64d459e70517df" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.665553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" event={"ID":"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139","Type":"ContainerDied","Data":"82ab471c3d64055113b2021eef08677b0e9845c041b161dbbb64d459e70517df"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.666008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" event={"ID":"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139","Type":"ContainerStarted","Data":"64f84cc1afe8c7deff3d6a62281a708a7ab258df052690dad06041ceed4ddb3c"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.670688 5030 generic.go:334] "Generic (PLEG): container finished" podID="effb0df3-9c75-4104-90c2-505e1e4e3b1b" containerID="bc6af635e7469805210beb35edd587c7aa233e75f38fdddef526c1f10038f18a" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.670771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" event={"ID":"effb0df3-9c75-4104-90c2-505e1e4e3b1b","Type":"ContainerDied","Data":"bc6af635e7469805210beb35edd587c7aa233e75f38fdddef526c1f10038f18a"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.683393 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc090b44-6543-43f0-a4cb-4ee41c4fdab2" containerID="c2d39dc919ce31b1d04bba4cd39e59cbcbecfe645a2a254ced36db8b14b27f79" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.683476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" event={"ID":"bc090b44-6543-43f0-a4cb-4ee41c4fdab2","Type":"ContainerDied","Data":"c2d39dc919ce31b1d04bba4cd39e59cbcbecfe645a2a254ced36db8b14b27f79"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.689453 5030 generic.go:334] "Generic (PLEG): container finished" podID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerID="8722ba8a449d1974a6cabc858a5b686ba2dbe1041c85c5ff5e6525f64ac6076a" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.689532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerDied","Data":"8722ba8a449d1974a6cabc858a5b686ba2dbe1041c85c5ff5e6525f64ac6076a"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.697766 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="946a76ec-5148-426e-97b5-bd22220a7ff2" containerID="3a9e6b3c5136ed70e89b31faf61fa24914c889d05394db12868771de49733760" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.697894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" event={"ID":"946a76ec-5148-426e-97b5-bd22220a7ff2","Type":"ContainerDied","Data":"3a9e6b3c5136ed70e89b31faf61fa24914c889d05394db12868771de49733760"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.700746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerStarted","Data":"d417312e39e06811cd427971e4ff28e1da3cb43b324b96d35898f6e9d0af01af"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.706160 5030 generic.go:334] "Generic (PLEG): container finished" podID="229641c0-730e-4c1b-b5ba-89b3e250b1c9" containerID="33f20a4031ef4829ecbbf9e28989b665faf28e57166d8564c12773236bce9457" exitCode=0 Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.706227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" event={"ID":"229641c0-730e-4c1b-b5ba-89b3e250b1c9","Type":"ContainerDied","Data":"33f20a4031ef4829ecbbf9e28989b665faf28e57166d8564c12773236bce9457"} Jan 20 23:11:34 crc kubenswrapper[5030]: I0120 23:11:34.752417 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:11:34 crc kubenswrapper[5030]: W0120 23:11:34.777646 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9833c5_3a7c_426c_b5e2_4e1151e9ebe9.slice/crio-bbccdc40651dfa4bf2b17228da4035b78046324e64507e3f43c65803a0ace128 WatchSource:0}: Error finding container bbccdc40651dfa4bf2b17228da4035b78046324e64507e3f43c65803a0ace128: Status 404 returned error can't find the container with id bbccdc40651dfa4bf2b17228da4035b78046324e64507e3f43c65803a0ace128 Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.038257 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134630 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8vxw\" (UniqueName: \"kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134766 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.134791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs\") pod \"efb2e499-552b-40a7-9f9d-cc30acd4e585\" (UID: \"efb2e499-552b-40a7-9f9d-cc30acd4e585\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.135656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs" (OuterVolumeSpecName: "logs") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.139800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.143759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw" (OuterVolumeSpecName: "kube-api-access-x8vxw") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "kube-api-access-x8vxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.144854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.145410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts" (OuterVolumeSpecName: "scripts") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.170396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.173310 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.210811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.236576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9mxf\" (UniqueName: \"kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf\") pod \"91e13cbf-a921-49d9-ab19-631c7122b04c\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.236820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts\") pod \"91e13cbf-a921-49d9-ab19-631c7122b04c\" (UID: \"91e13cbf-a921-49d9-ab19-631c7122b04c\") " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237342 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237373 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237385 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8vxw\" (UniqueName: \"kubernetes.io/projected/efb2e499-552b-40a7-9f9d-cc30acd4e585-kube-api-access-x8vxw\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237396 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237405 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237428 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.237440 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efb2e499-552b-40a7-9f9d-cc30acd4e585-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.238019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91e13cbf-a921-49d9-ab19-631c7122b04c" (UID: "91e13cbf-a921-49d9-ab19-631c7122b04c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.242129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf" (OuterVolumeSpecName: "kube-api-access-h9mxf") pod "91e13cbf-a921-49d9-ab19-631c7122b04c" (UID: "91e13cbf-a921-49d9-ab19-631c7122b04c"). InnerVolumeSpecName "kube-api-access-h9mxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.250491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data" (OuterVolumeSpecName: "config-data") pod "efb2e499-552b-40a7-9f9d-cc30acd4e585" (UID: "efb2e499-552b-40a7-9f9d-cc30acd4e585"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.257376 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.339274 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9mxf\" (UniqueName: \"kubernetes.io/projected/91e13cbf-a921-49d9-ab19-631c7122b04c-kube-api-access-h9mxf\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.342741 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.342764 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb2e499-552b-40a7-9f9d-cc30acd4e585-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.342776 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91e13cbf-a921-49d9-ab19-631c7122b04c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.614397 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.614652 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-decision-engine-0" podUID="48a72147-2d43-4262-afd5-e06c554d7379" containerName="watcher-decision-engine" containerID="cri-o://588fc920f6558236538877bdec6583b515618625942d9f216a444cabdbea34f9" gracePeriod=30 Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.744319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerStarted","Data":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.750046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" event={"ID":"91e13cbf-a921-49d9-ab19-631c7122b04c","Type":"ContainerDied","Data":"c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386"} Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.750070 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-8pxqj" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.750088 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c30fffa53be3c02a2455e49ac24c7e4a88206de34dc38f7efce82ead5189c386" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.752106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerStarted","Data":"781cd384eff78e55f6cd571c577fc758748a027bc29a451c8ad2e11b486203dd"} Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.752144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerStarted","Data":"bbccdc40651dfa4bf2b17228da4035b78046324e64507e3f43c65803a0ace128"} Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.754335 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.754742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"efb2e499-552b-40a7-9f9d-cc30acd4e585","Type":"ContainerDied","Data":"7fc127f0ccf1c519c0b61afd47578f823e10cbbef32abf2437b0477539214c31"} Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.754785 5030 scope.go:117] "RemoveContainer" containerID="8722ba8a449d1974a6cabc858a5b686ba2dbe1041c85c5ff5e6525f64ac6076a" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.898427 5030 scope.go:117] "RemoveContainer" containerID="71173d5ca0843e6f5b5688d151d66d0caee91c181cd50badb07d7166cf7f0ef6" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.898752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.910023 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926255 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:35 crc kubenswrapper[5030]: E0120 23:11:35.926641 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-log" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926652 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-log" Jan 20 23:11:35 crc kubenswrapper[5030]: E0120 23:11:35.926697 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e13cbf-a921-49d9-ab19-631c7122b04c" containerName="mariadb-database-create" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e13cbf-a921-49d9-ab19-631c7122b04c" containerName="mariadb-database-create" Jan 20 23:11:35 crc kubenswrapper[5030]: E0120 23:11:35.926715 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-httpd" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926721 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" 
containerName="glance-httpd" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e13cbf-a921-49d9-ab19-631c7122b04c" containerName="mariadb-database-create" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926884 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-httpd" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.926903 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" containerName="glance-log" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.928343 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.931939 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.932124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.951073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:35 crc kubenswrapper[5030]: I0120 23:11:35.993933 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb2e499-552b-40a7-9f9d-cc30acd4e585" path="/var/lib/kubelet/pods/efb2e499-552b-40a7-9f9d-cc30acd4e585/volumes" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.057547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.057596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.058747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.058804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.058842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwgq\" (UniqueName: 
\"kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.058938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.058992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.059025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.163741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwgq\" (UniqueName: \"kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164210 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.164287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.171048 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.172374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.172403 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.183309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.186357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.187469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.193477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.199440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.200011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwgq\" (UniqueName: \"kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.276180 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.334968 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.402400 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.470901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts\") pod \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.471130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24rm4\" (UniqueName: \"kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4\") pod \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\" (UID: \"effb0df3-9c75-4104-90c2-505e1e4e3b1b\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.471477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "effb0df3-9c75-4104-90c2-505e1e4e3b1b" (UID: "effb0df3-9c75-4104-90c2-505e1e4e3b1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.474752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4" (OuterVolumeSpecName: "kube-api-access-24rm4") pod "effb0df3-9c75-4104-90c2-505e1e4e3b1b" (UID: "effb0df3-9c75-4104-90c2-505e1e4e3b1b"). InnerVolumeSpecName "kube-api-access-24rm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.572454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlscm\" (UniqueName: \"kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm\") pod \"946a76ec-5148-426e-97b5-bd22220a7ff2\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.572760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts\") pod \"946a76ec-5148-426e-97b5-bd22220a7ff2\" (UID: \"946a76ec-5148-426e-97b5-bd22220a7ff2\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.573188 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24rm4\" (UniqueName: \"kubernetes.io/projected/effb0df3-9c75-4104-90c2-505e1e4e3b1b-kube-api-access-24rm4\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.573201 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/effb0df3-9c75-4104-90c2-505e1e4e3b1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.573508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "946a76ec-5148-426e-97b5-bd22220a7ff2" (UID: "946a76ec-5148-426e-97b5-bd22220a7ff2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.580215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.583751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm" (OuterVolumeSpecName: "kube-api-access-mlscm") pod "946a76ec-5148-426e-97b5-bd22220a7ff2" (UID: "946a76ec-5148-426e-97b5-bd22220a7ff2"). InnerVolumeSpecName "kube-api-access-mlscm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.674853 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlscm\" (UniqueName: \"kubernetes.io/projected/946a76ec-5148-426e-97b5-bd22220a7ff2-kube-api-access-mlscm\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.674889 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/946a76ec-5148-426e-97b5-bd22220a7ff2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.714792 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.806764 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.807902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld" event={"ID":"946a76ec-5148-426e-97b5-bd22220a7ff2","Type":"ContainerDied","Data":"2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.807942 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fc9e99f368baa369bfe5954383209ed356791d0ffeb27e2eb360da011621f01" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.811534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerStarted","Data":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.812766 5030 generic.go:334] "Generic (PLEG): container finished" podID="48a72147-2d43-4262-afd5-e06c554d7379" containerID="588fc920f6558236538877bdec6583b515618625942d9f216a444cabdbea34f9" exitCode=0 Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.812803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"48a72147-2d43-4262-afd5-e06c554d7379","Type":"ContainerDied","Data":"588fc920f6558236538877bdec6583b515618625942d9f216a444cabdbea34f9"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.814582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" event={"ID":"effb0df3-9c75-4104-90c2-505e1e4e3b1b","Type":"ContainerDied","Data":"f8e9df1ed7867a8b684e95f2eb251c3437223ab0c618d1f6962b94c9426591f9"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.814599 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e9df1ed7867a8b684e95f2eb251c3437223ab0c618d1f6962b94c9426591f9" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.816257 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.835147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" event={"ID":"bc090b44-6543-43f0-a4cb-4ee41c4fdab2","Type":"ContainerDied","Data":"f855f3c8113651bd998969f31c54bbef1450c223429cd7aeba26ff12c190e285"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.835195 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f855f3c8113651bd998969f31c54bbef1450c223429cd7aeba26ff12c190e285" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.835251 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-mbn7q" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.839775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerStarted","Data":"4afb199e076e08bd1f320a82a9ae4355b464db743e01e9a15d1354569ef3e0d8"} Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.878538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcjv\" (UniqueName: \"kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv\") pod \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.878604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts\") pod \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\" (UID: \"bc090b44-6543-43f0-a4cb-4ee41c4fdab2\") " Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.880072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc090b44-6543-43f0-a4cb-4ee41c4fdab2" (UID: "bc090b44-6543-43f0-a4cb-4ee41c4fdab2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.888840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv" (OuterVolumeSpecName: "kube-api-access-sxcjv") pod "bc090b44-6543-43f0-a4cb-4ee41c4fdab2" (UID: "bc090b44-6543-43f0-a4cb-4ee41c4fdab2"). InnerVolumeSpecName "kube-api-access-sxcjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.980912 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxcjv\" (UniqueName: \"kubernetes.io/projected/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-kube-api-access-sxcjv\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:36 crc kubenswrapper[5030]: I0120 23:11:36.980948 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc090b44-6543-43f0-a4cb-4ee41c4fdab2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.000912 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.023235 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.0232008 podStartE2EDuration="4.0232008s" podCreationTimestamp="2026-01-20 23:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:36.87299349 +0000 UTC m=+2169.193253788" watchObservedRunningTime="2026-01-20 23:11:37.0232008 +0000 UTC m=+2169.343461088" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.030839 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.040888 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.082125 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts\") pod \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.082191 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc4zq\" (UniqueName: \"kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq\") pod \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\" (UID: \"229641c0-730e-4c1b-b5ba-89b3e250b1c9\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.082938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "229641c0-730e-4c1b-b5ba-89b3e250b1c9" (UID: "229641c0-730e-4c1b-b5ba-89b3e250b1c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.091852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq" (OuterVolumeSpecName: "kube-api-access-kc4zq") pod "229641c0-730e-4c1b-b5ba-89b3e250b1c9" (UID: "229641c0-730e-4c1b-b5ba-89b3e250b1c9"). InnerVolumeSpecName "kube-api-access-kc4zq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle\") pod \"48a72147-2d43-4262-afd5-e06c554d7379\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkj7n\" (UniqueName: \"kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n\") pod \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca\") pod \"48a72147-2d43-4262-afd5-e06c554d7379\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data\") pod \"48a72147-2d43-4262-afd5-e06c554d7379\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grdtf\" (UniqueName: \"kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf\") pod \"48a72147-2d43-4262-afd5-e06c554d7379\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts\") pod \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\" (UID: \"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.183749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs\") pod \"48a72147-2d43-4262-afd5-e06c554d7379\" (UID: \"48a72147-2d43-4262-afd5-e06c554d7379\") " Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.184169 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/229641c0-730e-4c1b-b5ba-89b3e250b1c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.184185 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc4zq\" (UniqueName: \"kubernetes.io/projected/229641c0-730e-4c1b-b5ba-89b3e250b1c9-kube-api-access-kc4zq\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.184246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs" (OuterVolumeSpecName: "logs") pod "48a72147-2d43-4262-afd5-e06c554d7379" (UID: "48a72147-2d43-4262-afd5-e06c554d7379"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.184334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" (UID: "b79f6cab-1cbe-42c5-9e2c-63f34c7c8139"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.187994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf" (OuterVolumeSpecName: "kube-api-access-grdtf") pod "48a72147-2d43-4262-afd5-e06c554d7379" (UID: "48a72147-2d43-4262-afd5-e06c554d7379"). InnerVolumeSpecName "kube-api-access-grdtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.188189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n" (OuterVolumeSpecName: "kube-api-access-dkj7n") pod "b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" (UID: "b79f6cab-1cbe-42c5-9e2c-63f34c7c8139"). InnerVolumeSpecName "kube-api-access-dkj7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.211474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "48a72147-2d43-4262-afd5-e06c554d7379" (UID: "48a72147-2d43-4262-afd5-e06c554d7379"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.221782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48a72147-2d43-4262-afd5-e06c554d7379" (UID: "48a72147-2d43-4262-afd5-e06c554d7379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.233549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:11:37 crc kubenswrapper[5030]: W0120 23:11:37.235272 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcb7653a_62fc_4a7c_b6fd_720c3f73d47c.slice/crio-c4e00e0a3fb79ec51d9ca0ca579d187f08cdeb998f3ca6f1c8d2c90619ed953d WatchSource:0}: Error finding container c4e00e0a3fb79ec51d9ca0ca579d187f08cdeb998f3ca6f1c8d2c90619ed953d: Status 404 returned error can't find the container with id c4e00e0a3fb79ec51d9ca0ca579d187f08cdeb998f3ca6f1c8d2c90619ed953d Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.255200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data" (OuterVolumeSpecName: "config-data") pod "48a72147-2d43-4262-afd5-e06c554d7379" (UID: "48a72147-2d43-4262-afd5-e06c554d7379"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285197 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285225 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkj7n\" (UniqueName: \"kubernetes.io/projected/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-kube-api-access-dkj7n\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285236 5030 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285246 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a72147-2d43-4262-afd5-e06c554d7379-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285254 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grdtf\" (UniqueName: \"kubernetes.io/projected/48a72147-2d43-4262-afd5-e06c554d7379-kube-api-access-grdtf\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285264 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.285273 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a72147-2d43-4262-afd5-e06c554d7379-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:37 crc kubenswrapper[5030]: E0120 23:11:37.338609 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc090b44_6543_43f0_a4cb_4ee41c4fdab2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc090b44_6543_43f0_a4cb_4ee41c4fdab2.slice/crio-f855f3c8113651bd998969f31c54bbef1450c223429cd7aeba26ff12c190e285\": RecentStats: unable to find data in memory cache]" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.880109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" event={"ID":"229641c0-730e-4c1b-b5ba-89b3e250b1c9","Type":"ContainerDied","Data":"7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c"} Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.881416 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d74b8769a3ee2d4036de5985b118a79cf780cc0596c125aab02160082af9a5c" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.881569 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-22qhj" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.895355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" event={"ID":"b79f6cab-1cbe-42c5-9e2c-63f34c7c8139","Type":"ContainerDied","Data":"64f84cc1afe8c7deff3d6a62281a708a7ab258df052690dad06041ceed4ddb3c"} Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.895395 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64f84cc1afe8c7deff3d6a62281a708a7ab258df052690dad06041ceed4ddb3c" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.895466 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.906758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerStarted","Data":"9a976ba4a26e8a14c9b5d735869936056345cb3a87523033f8a1c7c29e43c684"} Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.906800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerStarted","Data":"c4e00e0a3fb79ec51d9ca0ca579d187f08cdeb998f3ca6f1c8d2c90619ed953d"} Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.915818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"48a72147-2d43-4262-afd5-e06c554d7379","Type":"ContainerDied","Data":"8ecea9c64378b037f85abf8f32f3055666c1bf1a9e8387f440cdc25b1cfe49c0"} Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.915859 5030 scope.go:117] "RemoveContainer" containerID="588fc920f6558236538877bdec6583b515618625942d9f216a444cabdbea34f9" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.915971 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:37 crc kubenswrapper[5030]: I0120 23:11:37.928135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerStarted","Data":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.000262 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.045725 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059316 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059766 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059800 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059816 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="946a76ec-5148-426e-97b5-bd22220a7ff2" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059823 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="946a76ec-5148-426e-97b5-bd22220a7ff2" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059840 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc090b44-6543-43f0-a4cb-4ee41c4fdab2" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059847 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc090b44-6543-43f0-a4cb-4ee41c4fdab2" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059857 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effb0df3-9c75-4104-90c2-505e1e4e3b1b" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059881 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="effb0df3-9c75-4104-90c2-505e1e4e3b1b" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059899 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a72147-2d43-4262-afd5-e06c554d7379" containerName="watcher-decision-engine" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059905 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a72147-2d43-4262-afd5-e06c554d7379" containerName="watcher-decision-engine" Jan 20 23:11:38 crc kubenswrapper[5030]: E0120 23:11:38.059920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="229641c0-730e-4c1b-b5ba-89b3e250b1c9" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.059926 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="229641c0-730e-4c1b-b5ba-89b3e250b1c9" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060151 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48a72147-2d43-4262-afd5-e06c554d7379" containerName="watcher-decision-engine" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="effb0df3-9c75-4104-90c2-505e1e4e3b1b" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060177 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060208 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="946a76ec-5148-426e-97b5-bd22220a7ff2" containerName="mariadb-account-create-update" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060219 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc090b44-6543-43f0-a4cb-4ee41c4fdab2" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060232 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="229641c0-730e-4c1b-b5ba-89b3e250b1c9" containerName="mariadb-database-create" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.060924 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.063522 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"watcher-decision-engine-config-data" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.069671 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.201909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.202016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.202039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.202077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.202111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrsr\" 
(UniqueName: \"kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.303174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.303490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.303514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.303555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.303592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrsr\" (UniqueName: \"kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.304710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.307669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.308177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.316129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.318453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrsr\" (UniqueName: \"kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr\") pod \"watcher-decision-engine-0\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.390205 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.392835 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.399499 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.918746 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:11:38 crc kubenswrapper[5030]: W0120 23:11:38.930606 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d47b79_8224_4a87_8ffc_808a6743d00a.slice/crio-6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b WatchSource:0}: Error finding container 6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b: Status 404 returned error can't find the container with id 6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.940748 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerStarted","Data":"b0b7fbcae4e87a8e2c4501e0a9f4a0d919d679b8183c68e6faeceada12e60e1e"} Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.947126 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"28d47b79-8224-4a87-8ffc-808a6743d00a","Type":"ContainerStarted","Data":"6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b"} Jan 20 23:11:38 crc kubenswrapper[5030]: I0120 23:11:38.982037 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.9819961 podStartE2EDuration="3.9819961s" podCreationTimestamp="2026-01-20 23:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:38.973988524 +0000 UTC m=+2171.294248812" watchObservedRunningTime="2026-01-20 23:11:38.9819961 +0000 UTC m=+2171.302256388" Jan 20 23:11:39 crc kubenswrapper[5030]: I0120 23:11:39.995466 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a72147-2d43-4262-afd5-e06c554d7379" path="/var/lib/kubelet/pods/48a72147-2d43-4262-afd5-e06c554d7379/volumes" Jan 20 23:11:39 crc kubenswrapper[5030]: I0120 23:11:39.996518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"28d47b79-8224-4a87-8ffc-808a6743d00a","Type":"ContainerStarted","Data":"bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40"} Jan 20 23:11:39 crc kubenswrapper[5030]: I0120 23:11:39.999488 5030 generic.go:334] "Generic (PLEG): container finished" podID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerID="2adc9505bd51441e424188ca6fc7870a632c7aaf4ff225ef4486d6433fc3a30a" exitCode=0 Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.001818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerDied","Data":"2adc9505bd51441e424188ca6fc7870a632c7aaf4ff225ef4486d6433fc3a30a"} Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.015862 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/watcher-decision-engine-0" podStartSLOduration=3.015841289 podStartE2EDuration="3.015841289s" podCreationTimestamp="2026-01-20 23:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:40.014013275 +0000 UTC m=+2172.334273573" watchObservedRunningTime="2026-01-20 23:11:40.015841289 +0000 UTC m=+2172.336101577" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.386315 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.442396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbhz\" (UniqueName: \"kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz\") pod \"0e045aec-3446-4403-9f41-3460c1e4edd8\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.442489 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs\") pod \"0e045aec-3446-4403-9f41-3460c1e4edd8\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.442524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config\") pod \"0e045aec-3446-4403-9f41-3460c1e4edd8\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.442545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config\") pod \"0e045aec-3446-4403-9f41-3460c1e4edd8\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.442653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle\") pod \"0e045aec-3446-4403-9f41-3460c1e4edd8\" (UID: \"0e045aec-3446-4403-9f41-3460c1e4edd8\") " Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.450449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config" 
(OuterVolumeSpecName: "httpd-config") pod "0e045aec-3446-4403-9f41-3460c1e4edd8" (UID: "0e045aec-3446-4403-9f41-3460c1e4edd8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.461792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz" (OuterVolumeSpecName: "kube-api-access-swbhz") pod "0e045aec-3446-4403-9f41-3460c1e4edd8" (UID: "0e045aec-3446-4403-9f41-3460c1e4edd8"). InnerVolumeSpecName "kube-api-access-swbhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.530125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e045aec-3446-4403-9f41-3460c1e4edd8" (UID: "0e045aec-3446-4403-9f41-3460c1e4edd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.532161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config" (OuterVolumeSpecName: "config") pod "0e045aec-3446-4403-9f41-3460c1e4edd8" (UID: "0e045aec-3446-4403-9f41-3460c1e4edd8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.544502 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.544529 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.544541 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbhz\" (UniqueName: \"kubernetes.io/projected/0e045aec-3446-4403-9f41-3460c1e4edd8-kube-api-access-swbhz\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.544550 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.558261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0e045aec-3446-4403-9f41-3460c1e4edd8" (UID: "0e045aec-3446-4403-9f41-3460c1e4edd8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:40 crc kubenswrapper[5030]: I0120 23:11:40.646127 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e045aec-3446-4403-9f41-3460c1e4edd8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.013900 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-central-agent" containerID="cri-o://4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" gracePeriod=30 Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.014119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerStarted","Data":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.014155 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.014375 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="proxy-httpd" containerID="cri-o://483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" gracePeriod=30 Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.014415 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="sg-core" containerID="cri-o://bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" gracePeriod=30 Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.014445 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-notification-agent" containerID="cri-o://026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" gracePeriod=30 Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.019947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" event={"ID":"0e045aec-3446-4403-9f41-3460c1e4edd8","Type":"ContainerDied","Data":"6e004e65f75f1c4f661c3346d30743dd494fe9516ae6fa3aa71a5ef321bad700"} Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.020018 5030 scope.go:117] "RemoveContainer" containerID="a09b3aeb461e9751eafd059a5278797b9fba7440f6db9b9ac3e7ae86e2e084e5" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.020495 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.051264 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.575478705 podStartE2EDuration="9.051248458s" podCreationTimestamp="2026-01-20 23:11:32 +0000 UTC" firstStartedPulling="2026-01-20 23:11:33.763362743 +0000 UTC m=+2166.083623031" lastFinishedPulling="2026-01-20 23:11:40.239132496 +0000 UTC m=+2172.559392784" observedRunningTime="2026-01-20 23:11:41.036383124 +0000 UTC m=+2173.356643452" watchObservedRunningTime="2026-01-20 23:11:41.051248458 +0000 UTC m=+2173.371508746" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.057365 5030 scope.go:117] "RemoveContainer" containerID="2adc9505bd51441e424188ca6fc7870a632c7aaf4ff225ef4486d6433fc3a30a" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.070484 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.081542 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-694b9bdbdc-2fwcf"] Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.349649 5030 scope.go:117] "RemoveContainer" containerID="0c3a5c19c1e410128793e3779060588805094169ed3914675342d1f99f09c499" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.466181 5030 scope.go:117] "RemoveContainer" containerID="cc33277d470448158fbe92e67fa94ac0abfb328a0c0c4b05536c09660bb8ac02" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.489772 5030 scope.go:117] "RemoveContainer" containerID="e632d16da87f475682232fd9152c12b9f4e4bef74c0982e287d33cc4f3c9ba3b" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.528068 5030 scope.go:117] "RemoveContainer" containerID="e409d88079d3d24e3cc7058504a95aef60f2cbca47dae33b0546b8ba2533ca3a" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.554393 5030 scope.go:117] "RemoveContainer" containerID="94d39046ffec8b1af080acbe58f700c3628f887d518971230088dd0fae4ee9cd" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.581978 5030 scope.go:117] "RemoveContainer" containerID="fea1372c596cefb82e941decc5c1f9e70bb6ceb4faaae0afec71eaec02f7aaf9" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.615654 5030 scope.go:117] "RemoveContainer" containerID="55f2c9b45d924d0e23919182c095b361caabfee3b3a8fd458ebce7784dff7d49" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.645631 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.649112 5030 scope.go:117] "RemoveContainer" containerID="59ac6043f93ef086250501bf602da8db89685b82321f351a7d96e1a127965ace" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frv82\" (UniqueName: \"kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.667828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts\") pod \"80b5a44a-5492-4aa7-bd30-8639b6527e66\" (UID: \"80b5a44a-5492-4aa7-bd30-8639b6527e66\") " Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.668265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.668416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.669072 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.669090 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80b5a44a-5492-4aa7-bd30-8639b6527e66-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.676080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts" (OuterVolumeSpecName: "scripts") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.676271 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82" (OuterVolumeSpecName: "kube-api-access-frv82") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "kube-api-access-frv82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.682089 5030 scope.go:117] "RemoveContainer" containerID="7c053da04f9e763cce1ed24ccb24b42f76ae096e7227f61b45fb9840d0d097ad" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.708379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.727022 5030 scope.go:117] "RemoveContainer" containerID="feb1a638c84cf3386e550d8bc38c115b9e45a247fbc73b072ff09a3993cb8acb" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.743028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.770911 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data" (OuterVolumeSpecName: "config-data") pod "80b5a44a-5492-4aa7-bd30-8639b6527e66" (UID: "80b5a44a-5492-4aa7-bd30-8639b6527e66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.771493 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.771531 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frv82\" (UniqueName: \"kubernetes.io/projected/80b5a44a-5492-4aa7-bd30-8639b6527e66-kube-api-access-frv82\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.771547 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.771560 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.771574 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b5a44a-5492-4aa7-bd30-8639b6527e66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:41 crc kubenswrapper[5030]: I0120 23:11:41.985028 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" path="/var/lib/kubelet/pods/0e045aec-3446-4403-9f41-3460c1e4edd8/volumes" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.037928 5030 generic.go:334] "Generic (PLEG): container finished" podID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" exitCode=0 Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.037957 5030 generic.go:334] "Generic (PLEG): container finished" podID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" exitCode=2 Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.037965 5030 generic.go:334] "Generic (PLEG): container finished" podID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" exitCode=0 Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.037979 5030 generic.go:334] "Generic (PLEG): container finished" podID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" exitCode=0 Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerDied","Data":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerDied","Data":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerDied","Data":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerDied","Data":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"80b5a44a-5492-4aa7-bd30-8639b6527e66","Type":"ContainerDied","Data":"d417312e39e06811cd427971e4ff28e1da3cb43b324b96d35898f6e9d0af01af"} Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038073 5030 scope.go:117] "RemoveContainer" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.038170 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.065989 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.080808 5030 scope.go:117] "RemoveContainer" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.080807 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.089821 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090263 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-central-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-central-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090303 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="sg-core" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="sg-core" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090326 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090351 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="proxy-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090358 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="proxy-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090366 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" 
containerName="neutron-api" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090372 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-api" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.090380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-notification-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090386 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-notification-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="proxy-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-central-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090978 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="sg-core" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090989 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-api" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.090997 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e045aec-3446-4403-9f41-3460c1e4edd8" containerName="neutron-httpd" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.091008 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" containerName="ceilometer-notification-agent" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.092969 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.096034 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.096115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.097244 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.124929 5030 scope.go:117] "RemoveContainer" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.146923 5030 scope.go:117] "RemoveContainer" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.167729 5030 scope.go:117] "RemoveContainer" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.172430 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": container with ID starting with 483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5 not found: ID does not exist" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.172472 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} err="failed to get container status \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": rpc error: code = NotFound desc = could not find container \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": container with ID starting with 483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.172497 5030 scope.go:117] "RemoveContainer" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.173013 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": container with ID starting with bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127 not found: ID does not exist" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.173074 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} err="failed to get container status \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": rpc error: code = NotFound desc = could not find container \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": container with ID starting with bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.173098 5030 scope.go:117] "RemoveContainer" 
containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.174094 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": container with ID starting with 026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b not found: ID does not exist" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.174208 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} err="failed to get container status \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": rpc error: code = NotFound desc = could not find container \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": container with ID starting with 026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.174324 5030 scope.go:117] "RemoveContainer" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: E0120 23:11:42.175550 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": container with ID starting with 4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761 not found: ID does not exist" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.175596 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} err="failed to get container status \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": rpc error: code = NotFound desc = could not find container \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": container with ID starting with 4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.175661 5030 scope.go:117] "RemoveContainer" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177077 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} err="failed to get container status \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": rpc error: code = NotFound desc = could not find container \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": container with ID starting with 483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177133 5030 scope.go:117] "RemoveContainer" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177875 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} err="failed to get container status \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": rpc error: code = NotFound desc = could not find container \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": container with ID starting with bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177962 5030 scope.go:117] "RemoveContainer" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.177972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.178127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.178171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.178202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.178283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.178315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w92j\" (UniqueName: \"kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.179593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} err="failed to get container status 
\"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": rpc error: code = NotFound desc = could not find container \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": container with ID starting with 026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.179633 5030 scope.go:117] "RemoveContainer" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.183798 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} err="failed to get container status \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": rpc error: code = NotFound desc = could not find container \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": container with ID starting with 4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.183835 5030 scope.go:117] "RemoveContainer" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184161 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} err="failed to get container status \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": rpc error: code = NotFound desc = could not find container \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": container with ID starting with 483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184187 5030 scope.go:117] "RemoveContainer" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184568 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} err="failed to get container status \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": rpc error: code = NotFound desc = could not find container \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": container with ID starting with bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184609 5030 scope.go:117] "RemoveContainer" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184963 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} err="failed to get container status \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": rpc error: code = NotFound desc = could not find container \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": container with ID starting with 026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.184999 5030 scope.go:117] "RemoveContainer" 
containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.185183 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} err="failed to get container status \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": rpc error: code = NotFound desc = could not find container \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": container with ID starting with 4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.185202 5030 scope.go:117] "RemoveContainer" containerID="483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.185362 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5"} err="failed to get container status \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": rpc error: code = NotFound desc = could not find container \"483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5\": container with ID starting with 483a7d974314c3f639b16471327b158c2b6e79b62249e442f71c87cb614416e5 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.185383 5030 scope.go:117] "RemoveContainer" containerID="bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.188753 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127"} err="failed to get container status \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": rpc error: code = NotFound desc = could not find container \"bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127\": container with ID starting with bd27cb5c15779748d275e9434d3197b4f74f233cbab8438b651561c31a331127 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.188794 5030 scope.go:117] "RemoveContainer" containerID="026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.189137 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b"} err="failed to get container status \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": rpc error: code = NotFound desc = could not find container \"026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b\": container with ID starting with 026df1a8b5b5a1c75e7c06670bd37ba624dbe2640108e3ef7b739e3fd5a5fe8b not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.189158 5030 scope.go:117] "RemoveContainer" containerID="4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.189354 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761"} err="failed to get container status \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": rpc error: code = NotFound desc = could not find 
container \"4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761\": container with ID starting with 4bfd025671c7667debb9620f0ed9a44fc2565a188073451a49373f86ac4bd761 not found: ID does not exist" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279284 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w92j\" (UniqueName: \"kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.279888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.280524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.283467 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.285345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.285858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.295240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.317354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w92j\" (UniqueName: \"kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j\") pod \"ceilometer-0\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.423651 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.658687 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv"] Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.660153 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.663486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.663853 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-4btnm" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.664017 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.675231 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv"] Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.686757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42g9c\" (UniqueName: \"kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.686808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.687196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.687299 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.788513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.788580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.788616 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42g9c\" (UniqueName: 
\"kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.788676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.795168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.801151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.801505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.806085 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42g9c\" (UniqueName: \"kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c\") pod \"nova-cell0-conductor-db-sync-2llnv\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.909867 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:42 crc kubenswrapper[5030]: W0120 23:11:42.911196 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8edcc1ee_8342_4100_b0a2_f78e28f678a3.slice/crio-f3bd6c0cb3d88f9d746c72da3849dcf5392fb76da75b0fd2c9abaab49734c5b1 WatchSource:0}: Error finding container f3bd6c0cb3d88f9d746c72da3849dcf5392fb76da75b0fd2c9abaab49734c5b1: Status 404 returned error can't find the container with id f3bd6c0cb3d88f9d746c72da3849dcf5392fb76da75b0fd2c9abaab49734c5b1 Jan 20 23:11:42 crc kubenswrapper[5030]: I0120 23:11:42.981005 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:43 crc kubenswrapper[5030]: I0120 23:11:43.054921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerStarted","Data":"f3bd6c0cb3d88f9d746c72da3849dcf5392fb76da75b0fd2c9abaab49734c5b1"} Jan 20 23:11:43 crc kubenswrapper[5030]: I0120 23:11:43.432554 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv"] Jan 20 23:11:43 crc kubenswrapper[5030]: I0120 23:11:43.980653 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b5a44a-5492-4aa7-bd30-8639b6527e66" path="/var/lib/kubelet/pods/80b5a44a-5492-4aa7-bd30-8639b6527e66/volumes" Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.068117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerStarted","Data":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.070978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" event={"ID":"9329b737-5b3e-4513-a65a-5797ad48998f","Type":"ContainerStarted","Data":"2810d82b7b0d1df9d395d91516e331ed9c41cba08efeb944359966702a2c540a"} Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.071019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" event={"ID":"9329b737-5b3e-4513-a65a-5797ad48998f","Type":"ContainerStarted","Data":"ce84b955945e431cbe4222813f924096864bb14901f230dffacb0b44b0a467de"} Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.094029 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" podStartSLOduration=2.094004242 podStartE2EDuration="2.094004242s" podCreationTimestamp="2026-01-20 23:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:44.091450789 +0000 UTC m=+2176.411711117" watchObservedRunningTime="2026-01-20 23:11:44.094004242 +0000 UTC m=+2176.414264550" Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.283199 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.283246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.318848 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:44 crc kubenswrapper[5030]: I0120 23:11:44.348952 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:45 crc kubenswrapper[5030]: I0120 23:11:45.103748 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerStarted","Data":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} Jan 20 23:11:45 crc kubenswrapper[5030]: I0120 23:11:45.104102 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:45 crc kubenswrapper[5030]: I0120 23:11:45.104120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.122828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerStarted","Data":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.580811 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.580899 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.624733 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.632652 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.922043 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:46 crc kubenswrapper[5030]: I0120 23:11:46.926284 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:11:47 crc kubenswrapper[5030]: I0120 23:11:47.132322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerStarted","Data":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} Jan 20 23:11:47 crc kubenswrapper[5030]: I0120 23:11:47.132864 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:47 crc kubenswrapper[5030]: I0120 23:11:47.133024 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:47 crc kubenswrapper[5030]: I0120 23:11:47.154164 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.470296792 podStartE2EDuration="5.15415054s" podCreationTimestamp="2026-01-20 23:11:42 +0000 UTC" firstStartedPulling="2026-01-20 23:11:42.913102228 +0000 UTC m=+2175.233362516" lastFinishedPulling="2026-01-20 23:11:46.596955976 +0000 UTC m=+2178.917216264" observedRunningTime="2026-01-20 23:11:47.149959667 +0000 UTC m=+2179.470219955" watchObservedRunningTime="2026-01-20 23:11:47.15415054 +0000 UTC m=+2179.474410818" Jan 20 23:11:48 crc kubenswrapper[5030]: I0120 23:11:48.139259 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:48 crc kubenswrapper[5030]: I0120 23:11:48.391290 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:48 crc kubenswrapper[5030]: I0120 23:11:48.441220 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:48 crc kubenswrapper[5030]: I0120 23:11:48.997081 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:49 crc kubenswrapper[5030]: I0120 23:11:49.010988 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:11:49 crc kubenswrapper[5030]: I0120 23:11:49.134332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:49 crc kubenswrapper[5030]: I0120 23:11:49.148486 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:49 crc kubenswrapper[5030]: I0120 23:11:49.184981 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.158291 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-central-agent" containerID="cri-o://b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" gracePeriod=30 Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.158397 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="proxy-httpd" containerID="cri-o://d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" gracePeriod=30 Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.158481 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-notification-agent" containerID="cri-o://ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" gracePeriod=30 Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.158513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="sg-core" containerID="cri-o://198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" gracePeriod=30 Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.879374 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w92j\" (UniqueName: \"kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.940767 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts\") pod \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\" (UID: \"8edcc1ee-8342-4100-b0a2-f78e28f678a3\") " Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.943384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.943751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.955990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j" (OuterVolumeSpecName: "kube-api-access-8w92j") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "kube-api-access-8w92j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.957533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts" (OuterVolumeSpecName: "scripts") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:50 crc kubenswrapper[5030]: I0120 23:11:50.985390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.019687 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.043840 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.043984 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.044014 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w92j\" (UniqueName: \"kubernetes.io/projected/8edcc1ee-8342-4100-b0a2-f78e28f678a3-kube-api-access-8w92j\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.044030 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.044044 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8edcc1ee-8342-4100-b0a2-f78e28f678a3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.044062 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.045073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data" (OuterVolumeSpecName: "config-data") pod "8edcc1ee-8342-4100-b0a2-f78e28f678a3" (UID: "8edcc1ee-8342-4100-b0a2-f78e28f678a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.145750 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8edcc1ee-8342-4100-b0a2-f78e28f678a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171411 5030 generic.go:334] "Generic (PLEG): container finished" podID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" exitCode=0 Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171442 5030 generic.go:334] "Generic (PLEG): container finished" podID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" exitCode=2 Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171450 5030 generic.go:334] "Generic (PLEG): container finished" podID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" exitCode=0 Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171457 5030 generic.go:334] "Generic (PLEG): container finished" podID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" exitCode=0 Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerDied","Data":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171514 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171543 5030 scope.go:117] "RemoveContainer" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerDied","Data":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerDied","Data":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerDied","Data":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.171687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8edcc1ee-8342-4100-b0a2-f78e28f678a3","Type":"ContainerDied","Data":"f3bd6c0cb3d88f9d746c72da3849dcf5392fb76da75b0fd2c9abaab49734c5b1"} Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.206493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.206601 5030 scope.go:117] "RemoveContainer" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.232887 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.243028 5030 scope.go:117] "RemoveContainer" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.283194 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.283827 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="proxy-httpd" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.283855 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="proxy-httpd" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.283876 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-notification-agent" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.283885 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-notification-agent" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.283909 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-central-agent" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.283917 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-central-agent" Jan 20 
23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.283934 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="sg-core" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.283944 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="sg-core" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.284180 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-notification-agent" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.284205 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="sg-core" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.284224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="ceilometer-central-agent" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.284240 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" containerName="proxy-httpd" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.286410 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.288806 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.289508 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.291602 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.329096 5030 scope.go:117] "RemoveContainer" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkn5\" (UniqueName: \"kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.349522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.353882 5030 scope.go:117] "RemoveContainer" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.354286 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": container with ID starting with d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c not found: ID does not exist" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.354317 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} err="failed to get container status \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": rpc error: code = NotFound desc = could not find container \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": container with ID starting with d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.354341 5030 scope.go:117] "RemoveContainer" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.354771 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": container with ID starting with 198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be not found: ID does not exist" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.354814 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} err="failed to get container status \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": rpc error: code = NotFound desc = could not find container 
\"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": container with ID starting with 198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.354842 5030 scope.go:117] "RemoveContainer" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.355126 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": container with ID starting with ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d not found: ID does not exist" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355148 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} err="failed to get container status \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": rpc error: code = NotFound desc = could not find container \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": container with ID starting with ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355161 5030 scope.go:117] "RemoveContainer" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: E0120 23:11:51.355370 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": container with ID starting with b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7 not found: ID does not exist" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355389 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} err="failed to get container status \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": rpc error: code = NotFound desc = could not find container \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": container with ID starting with b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7 not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355401 5030 scope.go:117] "RemoveContainer" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355609 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} err="failed to get container status \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": rpc error: code = NotFound desc = could not find container \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": container with ID starting with d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355640 5030 scope.go:117] "RemoveContainer" 
containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355821 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} err="failed to get container status \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": rpc error: code = NotFound desc = could not find container \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": container with ID starting with 198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.355839 5030 scope.go:117] "RemoveContainer" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.360716 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} err="failed to get container status \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": rpc error: code = NotFound desc = could not find container \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": container with ID starting with ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.360741 5030 scope.go:117] "RemoveContainer" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362026 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} err="failed to get container status \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": rpc error: code = NotFound desc = could not find container \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": container with ID starting with b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7 not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362046 5030 scope.go:117] "RemoveContainer" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362540 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} err="failed to get container status \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": rpc error: code = NotFound desc = could not find container \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": container with ID starting with d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362559 5030 scope.go:117] "RemoveContainer" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} err="failed to get container status \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": rpc error: code = NotFound desc = could not find 
container \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": container with ID starting with 198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.362948 5030 scope.go:117] "RemoveContainer" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363243 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} err="failed to get container status \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": rpc error: code = NotFound desc = could not find container \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": container with ID starting with ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363262 5030 scope.go:117] "RemoveContainer" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363485 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} err="failed to get container status \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": rpc error: code = NotFound desc = could not find container \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": container with ID starting with b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7 not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363512 5030 scope.go:117] "RemoveContainer" containerID="d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363788 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c"} err="failed to get container status \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": rpc error: code = NotFound desc = could not find container \"d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c\": container with ID starting with d3890ff02cd7078811268d04ef22cc28cae1c5e8667c775badb7b0296d01db9c not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.363815 5030 scope.go:117] "RemoveContainer" containerID="198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.364255 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be"} err="failed to get container status \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": rpc error: code = NotFound desc = could not find container \"198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be\": container with ID starting with 198b10b493e4375e40e7b9f8473a209122a7a9dd2a564888e97f88c19bd688be not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.364280 5030 scope.go:117] "RemoveContainer" containerID="ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.364553 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d"} err="failed to get container status \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": rpc error: code = NotFound desc = could not find container \"ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d\": container with ID starting with ffd27e6e49338ac5b7f52544a580520bb620a914aa63bde6a2d09eaaf4017d7d not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.364580 5030 scope.go:117] "RemoveContainer" containerID="b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.364884 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7"} err="failed to get container status \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": rpc error: code = NotFound desc = could not find container \"b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7\": container with ID starting with b640045ba5c16cd2d7ae985c03e29601ad58cdc3de4ca2458c6a5d0ac89e4fa7 not found: ID does not exist" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.451677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.451726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.452196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.451747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.453366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.453760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkn5\" (UniqueName: \"kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.453827 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.453850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.454301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.455657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.456029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.457055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.457324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.468821 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxkn5\" (UniqueName: \"kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5\") pod \"ceilometer-0\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.629036 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:51 crc kubenswrapper[5030]: I0120 23:11:51.974364 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edcc1ee-8342-4100-b0a2-f78e28f678a3" path="/var/lib/kubelet/pods/8edcc1ee-8342-4100-b0a2-f78e28f678a3/volumes" Jan 20 23:11:52 crc kubenswrapper[5030]: I0120 23:11:52.061358 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:52 crc kubenswrapper[5030]: W0120 23:11:52.064883 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc66480db_ddbc_4146_8ff7_c45106fb527c.slice/crio-c36297c7ec770426a2b80b9c973eacf90099f4ed46419c7c8987cf14041eaad0 WatchSource:0}: Error finding container c36297c7ec770426a2b80b9c973eacf90099f4ed46419c7c8987cf14041eaad0: Status 404 returned error can't find the container with id c36297c7ec770426a2b80b9c973eacf90099f4ed46419c7c8987cf14041eaad0 Jan 20 23:11:52 crc kubenswrapper[5030]: I0120 23:11:52.180929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerStarted","Data":"c36297c7ec770426a2b80b9c973eacf90099f4ed46419c7c8987cf14041eaad0"} Jan 20 23:11:53 crc kubenswrapper[5030]: I0120 23:11:53.195401 5030 generic.go:334] "Generic (PLEG): container finished" podID="9329b737-5b3e-4513-a65a-5797ad48998f" containerID="2810d82b7b0d1df9d395d91516e331ed9c41cba08efeb944359966702a2c540a" exitCode=0 Jan 20 23:11:53 crc kubenswrapper[5030]: I0120 23:11:53.195520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" event={"ID":"9329b737-5b3e-4513-a65a-5797ad48998f","Type":"ContainerDied","Data":"2810d82b7b0d1df9d395d91516e331ed9c41cba08efeb944359966702a2c540a"} Jan 20 23:11:53 crc kubenswrapper[5030]: I0120 23:11:53.197778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerStarted","Data":"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b"} Jan 20 23:11:53 crc kubenswrapper[5030]: I0120 23:11:53.208018 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.211831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerStarted","Data":"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b"} Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.575574 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.718949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts\") pod \"9329b737-5b3e-4513-a65a-5797ad48998f\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.719042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42g9c\" (UniqueName: \"kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c\") pod \"9329b737-5b3e-4513-a65a-5797ad48998f\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.719165 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle\") pod \"9329b737-5b3e-4513-a65a-5797ad48998f\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.719315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data\") pod \"9329b737-5b3e-4513-a65a-5797ad48998f\" (UID: \"9329b737-5b3e-4513-a65a-5797ad48998f\") " Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.728553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c" (OuterVolumeSpecName: "kube-api-access-42g9c") pod "9329b737-5b3e-4513-a65a-5797ad48998f" (UID: "9329b737-5b3e-4513-a65a-5797ad48998f"). InnerVolumeSpecName "kube-api-access-42g9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.756571 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data" (OuterVolumeSpecName: "config-data") pod "9329b737-5b3e-4513-a65a-5797ad48998f" (UID: "9329b737-5b3e-4513-a65a-5797ad48998f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.756750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts" (OuterVolumeSpecName: "scripts") pod "9329b737-5b3e-4513-a65a-5797ad48998f" (UID: "9329b737-5b3e-4513-a65a-5797ad48998f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.772072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9329b737-5b3e-4513-a65a-5797ad48998f" (UID: "9329b737-5b3e-4513-a65a-5797ad48998f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.821113 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.821148 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.821160 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42g9c\" (UniqueName: \"kubernetes.io/projected/9329b737-5b3e-4513-a65a-5797ad48998f-kube-api-access-42g9c\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:54 crc kubenswrapper[5030]: I0120 23:11:54.821174 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9329b737-5b3e-4513-a65a-5797ad48998f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.224790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" event={"ID":"9329b737-5b3e-4513-a65a-5797ad48998f","Type":"ContainerDied","Data":"ce84b955945e431cbe4222813f924096864bb14901f230dffacb0b44b0a467de"} Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.224830 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce84b955945e431cbe4222813f924096864bb14901f230dffacb0b44b0a467de" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.224804 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.227420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerStarted","Data":"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17"} Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.946246 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs"] Jan 20 23:11:55 crc kubenswrapper[5030]: E0120 23:11:55.946905 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9329b737-5b3e-4513-a65a-5797ad48998f" containerName="nova-cell0-conductor-db-sync" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.946922 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9329b737-5b3e-4513-a65a-5797ad48998f" containerName="nova-cell0-conductor-db-sync" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.947147 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9329b737-5b3e-4513-a65a-5797ad48998f" containerName="nova-cell0-conductor-db-sync" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.947817 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.950911 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.951121 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-4btnm" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.951186 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:11:55 crc kubenswrapper[5030]: I0120 23:11:55.979327 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.042469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.042797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.042847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknxm\" (UniqueName: \"kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.042875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.059562 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.070712 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.079564 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.088259 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwb6m\" (UniqueName: \"kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknxm\" (UniqueName: \"kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.145848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc 
kubenswrapper[5030]: I0120 23:11:56.152529 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.153787 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.157217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.157559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.160408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.170817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.183329 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.199335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknxm\" (UniqueName: \"kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm\") pod \"nova-cell0-cell-mapping-hppcs\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247154 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247200 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwvl\" (UniqueName: \"kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwb6m\" (UniqueName: \"kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.247348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.248044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.257317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.258027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.265912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerStarted","Data":"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f"} Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.266060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-central-agent" containerID="cri-o://c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b" gracePeriod=30 Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.266423 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.266797 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="proxy-httpd" 
containerID="cri-o://bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f" gracePeriod=30 Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.266965 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-notification-agent" containerID="cri-o://fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b" gracePeriod=30 Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.267009 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="sg-core" containerID="cri-o://7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17" gracePeriod=30 Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.273088 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.274543 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.275202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.287408 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.295097 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.297381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwb6m\" (UniqueName: \"kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m\") pod \"nova-api-0\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.330399 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.8524925479999999 podStartE2EDuration="5.330378553s" podCreationTimestamp="2026-01-20 23:11:51 +0000 UTC" firstStartedPulling="2026-01-20 23:11:52.067837057 +0000 UTC m=+2184.388097365" lastFinishedPulling="2026-01-20 23:11:55.545723082 +0000 UTC m=+2187.865983370" observedRunningTime="2026-01-20 23:11:56.311407081 +0000 UTC m=+2188.631667369" watchObservedRunningTime="2026-01-20 23:11:56.330378553 +0000 UTC m=+2188.650638841" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.349883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350346 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4pbz\" (UniqueName: \"kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.350805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwvl\" (UniqueName: \"kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.358401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.367169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.369919 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.371131 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.378331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.381790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwvl\" (UniqueName: \"kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl\") pod \"nova-scheduler-0\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.388689 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.413669 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.452630 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.452922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.452979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4pbz\" (UniqueName: \"kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.453019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.453050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drssf\" (UniqueName: \"kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.453085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.453120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.457087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.457501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.458347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.476963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4pbz\" (UniqueName: \"kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz\") pod \"nova-metadata-0\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.529039 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.529794 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.542353 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: E0120 23:11:56.543168 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-drssf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="2d45633d-c5fe-4e10-a508-b3bb6613f6c3" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.554607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drssf\" (UniqueName: \"kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.554682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.554750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.558342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.558586 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.560124 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.564312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.577145 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.594149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drssf\" (UniqueName: \"kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf\") pod \"nova-cell1-novncproxy-0\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:56 crc kubenswrapper[5030]: I0120 23:11:56.936264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs"] Jan 20 23:11:56 crc kubenswrapper[5030]: W0120 23:11:56.938282 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1983856c_2527_4397_9320_c20fb28c3eaf.slice/crio-ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9 WatchSource:0}: Error finding container ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9: Status 404 returned error can't find the container with id ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9 Jan 20 23:11:57 crc kubenswrapper[5030]: W0120 23:11:57.094717 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f14ca2_d8d5_4dee_a353_61186c8a8b7b.slice/crio-bd4d17017ae9fbac0e5667b28f6e87be4b598ce9360a036832f4a343d68f52d3 WatchSource:0}: Error finding container bd4d17017ae9fbac0e5667b28f6e87be4b598ce9360a036832f4a343d68f52d3: Status 404 returned error can't find the container with id bd4d17017ae9fbac0e5667b28f6e87be4b598ce9360a036832f4a343d68f52d3 Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.095438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.212155 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.301592 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.312386 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9"] Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.313565 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314036 5030 generic.go:334] "Generic (PLEG): container finished" podID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerID="bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f" exitCode=0 Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314058 5030 generic.go:334] "Generic (PLEG): container finished" podID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerID="7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17" exitCode=2 Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314065 5030 generic.go:334] "Generic (PLEG): container finished" podID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerID="fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b" exitCode=0 Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerDied","Data":"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerDied","Data":"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.314132 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerDied","Data":"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.316442 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.317364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerStarted","Data":"e799b32eecf7cc23a962fb9a253436f495a04e66eaeb41aff80c86acb7bf2048"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.319345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.322903 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9"] Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.323857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" event={"ID":"1983856c-2527-4397-9320-c20fb28c3eaf","Type":"ContainerStarted","Data":"a49455d12fd72cbf1f03ae03da0c2d0f55e27b28828ca4175a8011acd09f2387"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.323878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" event={"ID":"1983856c-2527-4397-9320-c20fb28c3eaf","Type":"ContainerStarted","Data":"ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.339642 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.340457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerStarted","Data":"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.340476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerStarted","Data":"bd4d17017ae9fbac0e5667b28f6e87be4b598ce9360a036832f4a343d68f52d3"} Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.360771 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.377120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg96v\" (UniqueName: \"kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.377180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.377239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.377290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.381251 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" podStartSLOduration=2.38123366 podStartE2EDuration="2.38123366s" podCreationTimestamp="2026-01-20 23:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:57.353594124 +0000 UTC m=+2189.673854412" watchObservedRunningTime="2026-01-20 23:11:57.38123366 +0000 UTC m=+2189.701493948" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drssf\" (UniqueName: \"kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf\") pod \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " 
Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478489 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle\") pod \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data\") pod \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\" (UID: \"2d45633d-c5fe-4e10-a508-b3bb6613f6c3\") " Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg96v\" (UniqueName: \"kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.478993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.483375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf" (OuterVolumeSpecName: "kube-api-access-drssf") pod "2d45633d-c5fe-4e10-a508-b3bb6613f6c3" (UID: "2d45633d-c5fe-4e10-a508-b3bb6613f6c3"). InnerVolumeSpecName "kube-api-access-drssf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.484209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.486042 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d45633d-c5fe-4e10-a508-b3bb6613f6c3" (UID: "2d45633d-c5fe-4e10-a508-b3bb6613f6c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.487119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.489188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.489797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data" (OuterVolumeSpecName: "config-data") pod "2d45633d-c5fe-4e10-a508-b3bb6613f6c3" (UID: "2d45633d-c5fe-4e10-a508-b3bb6613f6c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.498197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg96v\" (UniqueName: \"kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v\") pod \"nova-cell1-conductor-db-sync-hk8d9\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.580901 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drssf\" (UniqueName: \"kubernetes.io/projected/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-kube-api-access-drssf\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.580945 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.580957 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d45633d-c5fe-4e10-a508-b3bb6613f6c3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:57 crc kubenswrapper[5030]: I0120 23:11:57.660934 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.227228 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9"] Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.372829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" event={"ID":"18b32995-1917-478d-b62b-66d4fba0cbee","Type":"ContainerStarted","Data":"d8cb39a3b5e2aac75a5c3facab7a834e3c732ca77902aa97fa95d8dc48a8422f"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.386841 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-log" containerID="cri-o://e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" gracePeriod=30 Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.387290 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-metadata" containerID="cri-o://0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" gracePeriod=30 Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.387289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerStarted","Data":"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.387351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerStarted","Data":"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.396883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerStarted","Data":"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.396932 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-log" containerID="cri-o://44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" gracePeriod=30 Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.396991 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-api" containerID="cri-o://174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" gracePeriod=30 Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.409141 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="0c9520ac-d743-44eb-942f-6ffc49538ae1" containerName="nova-scheduler-scheduler" containerID="cri-o://3bab7e848d1c904c3ea9680a2511ae2b06c32b2cb43119543388719f7eb1b7bd" gracePeriod=30 Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.409204 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" 
podStartSLOduration=2.409173785 podStartE2EDuration="2.409173785s" podCreationTimestamp="2026-01-20 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:58.406037039 +0000 UTC m=+2190.726297337" watchObservedRunningTime="2026-01-20 23:11:58.409173785 +0000 UTC m=+2190.729434073" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.409342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0c9520ac-d743-44eb-942f-6ffc49538ae1","Type":"ContainerStarted","Data":"3bab7e848d1c904c3ea9680a2511ae2b06c32b2cb43119543388719f7eb1b7bd"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.409401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0c9520ac-d743-44eb-942f-6ffc49538ae1","Type":"ContainerStarted","Data":"f6d0cb257d5bf860290c57b4ac57a872f0b32edd9efaa9e15695346ac985387f"} Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.409420 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.441499 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.441483834 podStartE2EDuration="2.441483834s" podCreationTimestamp="2026-01-20 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:58.436639636 +0000 UTC m=+2190.756899924" watchObservedRunningTime="2026-01-20 23:11:58.441483834 +0000 UTC m=+2190.761744122" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.493680 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.493417053 podStartE2EDuration="2.493417053s" podCreationTimestamp="2026-01-20 23:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:58.452724759 +0000 UTC m=+2190.772985047" watchObservedRunningTime="2026-01-20 23:11:58.493417053 +0000 UTC m=+2190.813677341" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.532099 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.546919 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.560675 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.562089 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.564590 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.571963 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.617872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wlxn\" (UniqueName: \"kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.617942 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.618063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.719589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wlxn\" (UniqueName: \"kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.719670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.719693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.726677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.728584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.735923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wlxn\" (UniqueName: \"kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn\") pod \"nova-cell1-novncproxy-0\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:58 crc kubenswrapper[5030]: I0120 23:11:58.806303 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.041845 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.082444 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwb6m\" (UniqueName: \"kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m\") pod \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle\") pod \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs\") pod \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data\") pod \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle\") pod \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4pbz\" (UniqueName: \"kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz\") pod \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.139498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data\") pod \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\" (UID: \"81f14ca2-d8d5-4dee-a353-61186c8a8b7b\") " Jan 20 23:11:59 crc 
kubenswrapper[5030]: I0120 23:11:59.139593 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs\") pod \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\" (UID: \"a41a6c93-1830-4e85-8be1-4d422d2a9c63\") " Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.140450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs" (OuterVolumeSpecName: "logs") pod "a41a6c93-1830-4e85-8be1-4d422d2a9c63" (UID: "a41a6c93-1830-4e85-8be1-4d422d2a9c63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.148493 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m" (OuterVolumeSpecName: "kube-api-access-rwb6m") pod "81f14ca2-d8d5-4dee-a353-61186c8a8b7b" (UID: "81f14ca2-d8d5-4dee-a353-61186c8a8b7b"). InnerVolumeSpecName "kube-api-access-rwb6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.149985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz" (OuterVolumeSpecName: "kube-api-access-m4pbz") pod "a41a6c93-1830-4e85-8be1-4d422d2a9c63" (UID: "a41a6c93-1830-4e85-8be1-4d422d2a9c63"). InnerVolumeSpecName "kube-api-access-m4pbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.152749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.170251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs" (OuterVolumeSpecName: "logs") pod "81f14ca2-d8d5-4dee-a353-61186c8a8b7b" (UID: "81f14ca2-d8d5-4dee-a353-61186c8a8b7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.177604 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data" (OuterVolumeSpecName: "config-data") pod "81f14ca2-d8d5-4dee-a353-61186c8a8b7b" (UID: "81f14ca2-d8d5-4dee-a353-61186c8a8b7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.184240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data" (OuterVolumeSpecName: "config-data") pod "a41a6c93-1830-4e85-8be1-4d422d2a9c63" (UID: "a41a6c93-1830-4e85-8be1-4d422d2a9c63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.185582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81f14ca2-d8d5-4dee-a353-61186c8a8b7b" (UID: "81f14ca2-d8d5-4dee-a353-61186c8a8b7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.187752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a41a6c93-1830-4e85-8be1-4d422d2a9c63" (UID: "a41a6c93-1830-4e85-8be1-4d422d2a9c63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242120 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242146 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4pbz\" (UniqueName: \"kubernetes.io/projected/a41a6c93-1830-4e85-8be1-4d422d2a9c63-kube-api-access-m4pbz\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242157 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242167 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a41a6c93-1830-4e85-8be1-4d422d2a9c63-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242176 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwb6m\" (UniqueName: \"kubernetes.io/projected/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-kube-api-access-rwb6m\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242184 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242192 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81f14ca2-d8d5-4dee-a353-61186c8a8b7b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.242200 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a41a6c93-1830-4e85-8be1-4d422d2a9c63-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.387441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: W0120 23:11:59.394031 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef985aa_30c1_4bbb_9ff6_b97693927a0c.slice/crio-49916d742b3e42d44211ec5ae394a352f365f60386f2926e2bbd4ea3fb5d32f2 WatchSource:0}: Error finding container 49916d742b3e42d44211ec5ae394a352f365f60386f2926e2bbd4ea3fb5d32f2: Status 404 returned error can't find the container with id 49916d742b3e42d44211ec5ae394a352f365f60386f2926e2bbd4ea3fb5d32f2 Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.417497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" 
event={"ID":"eef985aa-30c1-4bbb-9ff6-b97693927a0c","Type":"ContainerStarted","Data":"49916d742b3e42d44211ec5ae394a352f365f60386f2926e2bbd4ea3fb5d32f2"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.420501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" event={"ID":"18b32995-1917-478d-b62b-66d4fba0cbee","Type":"ContainerStarted","Data":"a86a208b37c4b3725dd0af75943dbb3f175452a065a1bb1a4420bf3a64fdf949"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423347 5030 generic.go:334] "Generic (PLEG): container finished" podID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerID="0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" exitCode=0 Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423369 5030 generic.go:334] "Generic (PLEG): container finished" podID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerID="e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" exitCode=143 Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerDied","Data":"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerDied","Data":"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"a41a6c93-1830-4e85-8be1-4d422d2a9c63","Type":"ContainerDied","Data":"e799b32eecf7cc23a962fb9a253436f495a04e66eaeb41aff80c86acb7bf2048"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423435 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.423466 5030 scope.go:117] "RemoveContainer" containerID="0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442245 5030 generic.go:334] "Generic (PLEG): container finished" podID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerID="174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" exitCode=0 Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442275 5030 generic.go:334] "Generic (PLEG): container finished" podID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerID="44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" exitCode=143 Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerDied","Data":"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerDied","Data":"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"81f14ca2-d8d5-4dee-a353-61186c8a8b7b","Type":"ContainerDied","Data":"bd4d17017ae9fbac0e5667b28f6e87be4b598ce9360a036832f4a343d68f52d3"} Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.442423 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.445486 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" podStartSLOduration=2.445473365 podStartE2EDuration="2.445473365s" podCreationTimestamp="2026-01-20 23:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:11:59.43300712 +0000 UTC m=+2191.753267418" watchObservedRunningTime="2026-01-20 23:11:59.445473365 +0000 UTC m=+2191.765733653" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.466867 5030 scope.go:117] "RemoveContainer" containerID="e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.474102 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.486593 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.504514 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.504942 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-log" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.504963 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-log" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.504979 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-log" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.504987 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-log" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.505003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-api" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.505009 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-api" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.505022 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-metadata" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.505028 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-metadata" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.505205 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-log" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.505234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-log" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.505253 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" containerName="nova-metadata-metadata" Jan 20 23:11:59 crc 
kubenswrapper[5030]: I0120 23:11:59.505262 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" containerName="nova-api-api" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.506373 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.507345 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.515010 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.515234 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.516184 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.522423 5030 scope.go:117] "RemoveContainer" containerID="0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.523379 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181\": container with ID starting with 0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181 not found: ID does not exist" containerID="0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.523431 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181"} err="failed to get container status \"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181\": rpc error: code = NotFound desc = could not find container \"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181\": container with ID starting with 0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181 not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.523453 5030 scope.go:117] "RemoveContainer" containerID="e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.523926 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf\": container with ID starting with e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf not found: ID does not exist" containerID="e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.523942 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf"} err="failed to get container status \"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf\": rpc error: code = NotFound desc = could not find container \"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf\": container with ID starting with e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf not found: ID does not exist" Jan 20 23:11:59 crc 
kubenswrapper[5030]: I0120 23:11:59.524115 5030 scope.go:117] "RemoveContainer" containerID="0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.524293 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.524533 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181"} err="failed to get container status \"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181\": rpc error: code = NotFound desc = could not find container \"0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181\": container with ID starting with 0898d548a7cfb8d5c116a786a7dde44205075f00c71fbd37ab212b7d4ff22181 not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.524584 5030 scope.go:117] "RemoveContainer" containerID="e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.525059 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf"} err="failed to get container status \"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf\": rpc error: code = NotFound desc = could not find container \"e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf\": container with ID starting with e754b1c0aca34d4984092c2b22d85101a4aa5e4b6e58690972b7872456358eaf not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.525072 5030 scope.go:117] "RemoveContainer" containerID="174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.550706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.551028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.551139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.551208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.551265 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mx64\" (UniqueName: \"kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.580428 5030 scope.go:117] "RemoveContainer" containerID="44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.592782 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.594477 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.597267 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.614729 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mx64\" (UniqueName: \"kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vp59\" (UniqueName: \"kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653576 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.653658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.654026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.658355 5030 scope.go:117] "RemoveContainer" containerID="174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.660221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.661677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.663414 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb\": container with ID starting with 174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb not found: ID does not exist" containerID="174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.663445 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb"} err="failed to get container status \"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb\": rpc error: code = NotFound desc = could not find container \"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb\": container with ID starting with 174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.663467 5030 scope.go:117] "RemoveContainer" containerID="44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" Jan 20 
23:11:59 crc kubenswrapper[5030]: E0120 23:11:59.664703 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c\": container with ID starting with 44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c not found: ID does not exist" containerID="44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.664741 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c"} err="failed to get container status \"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c\": rpc error: code = NotFound desc = could not find container \"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c\": container with ID starting with 44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.664754 5030 scope.go:117] "RemoveContainer" containerID="174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.665730 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb"} err="failed to get container status \"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb\": rpc error: code = NotFound desc = could not find container \"174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb\": container with ID starting with 174f99345ce00296eae0c12f56a666be74a7cec92be5fab3570531c03f8ff4bb not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.665763 5030 scope.go:117] "RemoveContainer" containerID="44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.668076 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c"} err="failed to get container status \"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c\": rpc error: code = NotFound desc = could not find container \"44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c\": container with ID starting with 44a3f7f5a388d78d9ad5141e11b35dad946d108cedca31766553c772c2b2764c not found: ID does not exist" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.675727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mx64\" (UniqueName: \"kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.682853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data\") pod \"nova-metadata-0\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.755669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.755716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.755741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.755771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vp59\" (UniqueName: \"kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.756114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.759186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.761592 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.789224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vp59\" (UniqueName: \"kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59\") pod \"nova-api-0\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.928523 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.939987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.975826 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d45633d-c5fe-4e10-a508-b3bb6613f6c3" path="/var/lib/kubelet/pods/2d45633d-c5fe-4e10-a508-b3bb6613f6c3/volumes" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.976223 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f14ca2-d8d5-4dee-a353-61186c8a8b7b" path="/var/lib/kubelet/pods/81f14ca2-d8d5-4dee-a353-61186c8a8b7b/volumes" Jan 20 23:11:59 crc kubenswrapper[5030]: I0120 23:11:59.976996 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41a6c93-1830-4e85-8be1-4d422d2a9c63" path="/var/lib/kubelet/pods/a41a6c93-1830-4e85-8be1-4d422d2a9c63/volumes" Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.461517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerStarted","Data":"dd99fb6b5cf2375443c2b9e4a96579c31808d38953444c2016074d35c6a2d9fe"} Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.462495 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.475441 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6" gracePeriod=30 Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.475531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eef985aa-30c1-4bbb-9ff6-b97693927a0c","Type":"ContainerStarted","Data":"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6"} Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.500980 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.500959584 podStartE2EDuration="2.500959584s" podCreationTimestamp="2026-01-20 23:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:00.493860481 +0000 UTC m=+2192.814120779" watchObservedRunningTime="2026-01-20 23:12:00.500959584 +0000 UTC m=+2192.821219872" Jan 20 23:12:00 crc kubenswrapper[5030]: W0120 23:12:00.554045 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode942f538_98eb_4877_91f8_f0a966ad053d.slice/crio-dc12881a69f1ff81ddf194bf9832be067da4b4ebe332fa0d79af9afd855edc39 WatchSource:0}: Error finding container dc12881a69f1ff81ddf194bf9832be067da4b4ebe332fa0d79af9afd855edc39: Status 404 returned error can't find the container with id dc12881a69f1ff81ddf194bf9832be067da4b4ebe332fa0d79af9afd855edc39 Jan 20 23:12:00 crc kubenswrapper[5030]: I0120 23:12:00.568163 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.486003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerStarted","Data":"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d"} Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.486350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerStarted","Data":"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a"} Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.488292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerStarted","Data":"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d"} Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.488337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerStarted","Data":"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541"} Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.488347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerStarted","Data":"dc12881a69f1ff81ddf194bf9832be067da4b4ebe332fa0d79af9afd855edc39"} Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.506600 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.5065779839999998 podStartE2EDuration="2.506577984s" podCreationTimestamp="2026-01-20 23:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:01.501545151 +0000 UTC m=+2193.821805449" watchObservedRunningTime="2026-01-20 23:12:01.506577984 +0000 UTC m=+2193.826838282" Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.534250 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.53422965 podStartE2EDuration="2.53422965s" podCreationTimestamp="2026-01-20 23:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:01.526695625 +0000 UTC m=+2193.846955913" watchObservedRunningTime="2026-01-20 23:12:01.53422965 +0000 UTC m=+2193.854489938" Jan 20 23:12:01 crc kubenswrapper[5030]: I0120 23:12:01.560740 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:03 crc kubenswrapper[5030]: I0120 23:12:03.807037 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:04 crc kubenswrapper[5030]: I0120 23:12:04.929350 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:04 crc kubenswrapper[5030]: I0120 23:12:04.929400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:05 crc kubenswrapper[5030]: I0120 23:12:05.534658 5030 generic.go:334] "Generic (PLEG): container finished" podID="18b32995-1917-478d-b62b-66d4fba0cbee" containerID="a86a208b37c4b3725dd0af75943dbb3f175452a065a1bb1a4420bf3a64fdf949" exitCode=0 Jan 20 23:12:05 crc kubenswrapper[5030]: I0120 
23:12:05.534772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" event={"ID":"18b32995-1917-478d-b62b-66d4fba0cbee","Type":"ContainerDied","Data":"a86a208b37c4b3725dd0af75943dbb3f175452a065a1bb1a4420bf3a64fdf949"} Jan 20 23:12:05 crc kubenswrapper[5030]: I0120 23:12:05.537744 5030 generic.go:334] "Generic (PLEG): container finished" podID="1983856c-2527-4397-9320-c20fb28c3eaf" containerID="a49455d12fd72cbf1f03ae03da0c2d0f55e27b28828ca4175a8011acd09f2387" exitCode=0 Jan 20 23:12:05 crc kubenswrapper[5030]: I0120 23:12:05.537798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" event={"ID":"1983856c-2527-4397-9320-c20fb28c3eaf","Type":"ContainerDied","Data":"a49455d12fd72cbf1f03ae03da0c2d0f55e27b28828ca4175a8011acd09f2387"} Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.088495 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.218846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.218981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxkn5\" (UniqueName: \"kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219635 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.220104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.219444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd\") pod \"c66480db-ddbc-4146-8ff7-c45106fb527c\" (UID: \"c66480db-ddbc-4146-8ff7-c45106fb527c\") " Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.221573 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.221615 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c66480db-ddbc-4146-8ff7-c45106fb527c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.226445 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5" (OuterVolumeSpecName: "kube-api-access-gxkn5") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "kube-api-access-gxkn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.233751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts" (OuterVolumeSpecName: "scripts") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.246549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.296359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.322856 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.322889 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxkn5\" (UniqueName: \"kubernetes.io/projected/c66480db-ddbc-4146-8ff7-c45106fb527c-kube-api-access-gxkn5\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.322901 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.322911 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.349212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data" (OuterVolumeSpecName: "config-data") pod "c66480db-ddbc-4146-8ff7-c45106fb527c" (UID: "c66480db-ddbc-4146-8ff7-c45106fb527c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.425110 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c66480db-ddbc-4146-8ff7-c45106fb527c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.557162 5030 generic.go:334] "Generic (PLEG): container finished" podID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerID="c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b" exitCode=0 Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.557229 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.557237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerDied","Data":"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b"} Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.557299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c66480db-ddbc-4146-8ff7-c45106fb527c","Type":"ContainerDied","Data":"c36297c7ec770426a2b80b9c973eacf90099f4ed46419c7c8987cf14041eaad0"} Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.557331 5030 scope.go:117] "RemoveContainer" containerID="bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.591222 5030 scope.go:117] "RemoveContainer" containerID="7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.667858 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.675979 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.681030 5030 scope.go:117] "RemoveContainer" containerID="fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.686815 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.687185 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="proxy-httpd" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="proxy-httpd" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.687225 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-central-agent" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-central-agent" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.687255 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-notification-agent" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687261 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-notification-agent" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.687271 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="sg-core" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687277 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="sg-core" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687444 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-central-agent" Jan 20 23:12:06 crc 
kubenswrapper[5030]: I0120 23:12:06.687462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="ceilometer-notification-agent" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687470 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="proxy-httpd" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.687484 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" containerName="sg-core" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.689583 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.692083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.692938 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.711028 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.742428 5030 scope.go:117] "RemoveContainer" containerID="c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.760290 5030 scope.go:117] "RemoveContainer" containerID="bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.760725 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f\": container with ID starting with bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f not found: ID does not exist" containerID="bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.760771 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f"} err="failed to get container status \"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f\": rpc error: code = NotFound desc = could not find container \"bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f\": container with ID starting with bfb9b531224f43f9cb325c161c59a99d27c81815008b1c89544e7a27ff6bed3f not found: ID does not exist" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.760796 5030 scope.go:117] "RemoveContainer" containerID="7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.761137 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17\": container with ID starting with 7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17 not found: ID does not exist" containerID="7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.761164 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17"} err="failed to get container status \"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17\": rpc error: code = NotFound desc = could not find container \"7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17\": container with ID starting with 7ed99142b10a144dc410a01a3b58aa0fd558792e97af93d0902b7b9e46b2be17 not found: ID does not exist" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.761180 5030 scope.go:117] "RemoveContainer" containerID="fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.761530 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b\": container with ID starting with fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b not found: ID does not exist" containerID="fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.761560 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b"} err="failed to get container status \"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b\": rpc error: code = NotFound desc = could not find container \"fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b\": container with ID starting with fc764d22c18ceb40b4cb93cca6c724a9f6139a9a5102632bed0a5f7f70959d8b not found: ID does not exist" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.761582 5030 scope.go:117] "RemoveContainer" containerID="c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b" Jan 20 23:12:06 crc kubenswrapper[5030]: E0120 23:12:06.761824 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b\": container with ID starting with c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b not found: ID does not exist" containerID="c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.761845 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b"} err="failed to get container status \"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b\": rpc error: code = NotFound desc = could not find container \"c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b\": container with ID starting with c9dfb4ab9c075027f17c55d4fc7b21c7d6f389d15e6de87d8605e062bbd00f5b not found: ID does not exist" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834616 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz5t2\" (UniqueName: 
\"kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.834902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937678 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc 
kubenswrapper[5030]: I0120 23:12:06.937860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz5t2\" (UniqueName: \"kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.937984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.938420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.942608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.942871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.942900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.944596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.945337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:06 crc kubenswrapper[5030]: I0120 23:12:06.959782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz5t2\" (UniqueName: \"kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2\") pod \"ceilometer-0\" (UID: 
\"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.028261 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.116968 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.127091 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.243741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data\") pod \"1983856c-2527-4397-9320-c20fb28c3eaf\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.243815 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cknxm\" (UniqueName: \"kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm\") pod \"1983856c-2527-4397-9320-c20fb28c3eaf\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.243846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts\") pod \"18b32995-1917-478d-b62b-66d4fba0cbee\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.243903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle\") pod \"1983856c-2527-4397-9320-c20fb28c3eaf\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.243934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts\") pod \"1983856c-2527-4397-9320-c20fb28c3eaf\" (UID: \"1983856c-2527-4397-9320-c20fb28c3eaf\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.244046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg96v\" (UniqueName: \"kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v\") pod \"18b32995-1917-478d-b62b-66d4fba0cbee\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.244077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data\") pod \"18b32995-1917-478d-b62b-66d4fba0cbee\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.244209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle\") pod \"18b32995-1917-478d-b62b-66d4fba0cbee\" (UID: \"18b32995-1917-478d-b62b-66d4fba0cbee\") " Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 
23:12:07.249359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm" (OuterVolumeSpecName: "kube-api-access-cknxm") pod "1983856c-2527-4397-9320-c20fb28c3eaf" (UID: "1983856c-2527-4397-9320-c20fb28c3eaf"). InnerVolumeSpecName "kube-api-access-cknxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.250226 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts" (OuterVolumeSpecName: "scripts") pod "18b32995-1917-478d-b62b-66d4fba0cbee" (UID: "18b32995-1917-478d-b62b-66d4fba0cbee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.253136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v" (OuterVolumeSpecName: "kube-api-access-sg96v") pod "18b32995-1917-478d-b62b-66d4fba0cbee" (UID: "18b32995-1917-478d-b62b-66d4fba0cbee"). InnerVolumeSpecName "kube-api-access-sg96v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.255037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts" (OuterVolumeSpecName: "scripts") pod "1983856c-2527-4397-9320-c20fb28c3eaf" (UID: "1983856c-2527-4397-9320-c20fb28c3eaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.272652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data" (OuterVolumeSpecName: "config-data") pod "18b32995-1917-478d-b62b-66d4fba0cbee" (UID: "18b32995-1917-478d-b62b-66d4fba0cbee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.273017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18b32995-1917-478d-b62b-66d4fba0cbee" (UID: "18b32995-1917-478d-b62b-66d4fba0cbee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.281644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1983856c-2527-4397-9320-c20fb28c3eaf" (UID: "1983856c-2527-4397-9320-c20fb28c3eaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.286176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data" (OuterVolumeSpecName: "config-data") pod "1983856c-2527-4397-9320-c20fb28c3eaf" (UID: "1983856c-2527-4397-9320-c20fb28c3eaf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346420 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346471 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346486 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cknxm\" (UniqueName: \"kubernetes.io/projected/1983856c-2527-4397-9320-c20fb28c3eaf-kube-api-access-cknxm\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346500 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346513 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346524 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1983856c-2527-4397-9320-c20fb28c3eaf-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346535 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg96v\" (UniqueName: \"kubernetes.io/projected/18b32995-1917-478d-b62b-66d4fba0cbee-kube-api-access-sg96v\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.346547 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b32995-1917-478d-b62b-66d4fba0cbee-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.486144 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.565309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerStarted","Data":"be3bd4012b78cf6a4329cd97607925e0b26aad9a281e9e07f016c91c34655251"} Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.567112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" event={"ID":"1983856c-2527-4397-9320-c20fb28c3eaf","Type":"ContainerDied","Data":"ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9"} Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.567142 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca37d8985b3f16ce4865f1b6ef95159f81327cbc1c44e92c8edf01686d00dac9" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.567166 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.569921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" event={"ID":"18b32995-1917-478d-b62b-66d4fba0cbee","Type":"ContainerDied","Data":"d8cb39a3b5e2aac75a5c3facab7a834e3c732ca77902aa97fa95d8dc48a8422f"} Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.569953 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8cb39a3b5e2aac75a5c3facab7a834e3c732ca77902aa97fa95d8dc48a8422f" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.569994 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.663845 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:12:07 crc kubenswrapper[5030]: E0120 23:12:07.664269 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1983856c-2527-4397-9320-c20fb28c3eaf" containerName="nova-manage" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.664287 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1983856c-2527-4397-9320-c20fb28c3eaf" containerName="nova-manage" Jan 20 23:12:07 crc kubenswrapper[5030]: E0120 23:12:07.664314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b32995-1917-478d-b62b-66d4fba0cbee" containerName="nova-cell1-conductor-db-sync" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.664321 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b32995-1917-478d-b62b-66d4fba0cbee" containerName="nova-cell1-conductor-db-sync" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.664492 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1983856c-2527-4397-9320-c20fb28c3eaf" containerName="nova-manage" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.664515 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b32995-1917-478d-b62b-66d4fba0cbee" containerName="nova-cell1-conductor-db-sync" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.665149 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.667396 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.674115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.754755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.754867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.754937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njzjp\" (UniqueName: \"kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.763308 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.763580 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-log" containerID="cri-o://cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" gracePeriod=30 Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.763613 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-api" containerID="cri-o://5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" gracePeriod=30 Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.800504 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.801229 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-log" containerID="cri-o://be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" gracePeriod=30 Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.801495 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-metadata" containerID="cri-o://48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" gracePeriod=30 Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.856015 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.856095 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.856144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzjp\" (UniqueName: \"kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.863771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.873284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.877014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzjp\" (UniqueName: \"kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp\") pod \"nova-cell1-conductor-0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.975212 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66480db-ddbc-4146-8ff7-c45106fb527c" path="/var/lib/kubelet/pods/c66480db-ddbc-4146-8ff7-c45106fb527c/volumes" Jan 20 23:12:07 crc kubenswrapper[5030]: I0120 23:12:07.989266 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.017536 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod334851eb_8b75_4e87_a0ee_4d1af4cc02b4.slice/crio-conmon-be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.332555 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.396675 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mx64\" (UniqueName: \"kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64\") pod \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs\") pod \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data\") pod \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vp59\" (UniqueName: \"kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59\") pod \"e942f538-98eb-4877-91f8-f0a966ad053d\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data\") pod \"e942f538-98eb-4877-91f8-f0a966ad053d\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle\") pod \"e942f538-98eb-4877-91f8-f0a966ad053d\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs\") pod \"e942f538-98eb-4877-91f8-f0a966ad053d\" (UID: \"e942f538-98eb-4877-91f8-f0a966ad053d\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle\") pod \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.467562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs\") pod \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\" (UID: \"334851eb-8b75-4e87-a0ee-4d1af4cc02b4\") " Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.468331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs" (OuterVolumeSpecName: "logs") pod "e942f538-98eb-4877-91f8-f0a966ad053d" (UID: 
"e942f538-98eb-4877-91f8-f0a966ad053d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.468846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs" (OuterVolumeSpecName: "logs") pod "334851eb-8b75-4e87-a0ee-4d1af4cc02b4" (UID: "334851eb-8b75-4e87-a0ee-4d1af4cc02b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.471669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59" (OuterVolumeSpecName: "kube-api-access-5vp59") pod "e942f538-98eb-4877-91f8-f0a966ad053d" (UID: "e942f538-98eb-4877-91f8-f0a966ad053d"). InnerVolumeSpecName "kube-api-access-5vp59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.471857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64" (OuterVolumeSpecName: "kube-api-access-5mx64") pod "334851eb-8b75-4e87-a0ee-4d1af4cc02b4" (UID: "334851eb-8b75-4e87-a0ee-4d1af4cc02b4"). InnerVolumeSpecName "kube-api-access-5mx64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.494729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data" (OuterVolumeSpecName: "config-data") pod "334851eb-8b75-4e87-a0ee-4d1af4cc02b4" (UID: "334851eb-8b75-4e87-a0ee-4d1af4cc02b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.495835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data" (OuterVolumeSpecName: "config-data") pod "e942f538-98eb-4877-91f8-f0a966ad053d" (UID: "e942f538-98eb-4877-91f8-f0a966ad053d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.505175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "334851eb-8b75-4e87-a0ee-4d1af4cc02b4" (UID: "334851eb-8b75-4e87-a0ee-4d1af4cc02b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.506528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e942f538-98eb-4877-91f8-f0a966ad053d" (UID: "e942f538-98eb-4877-91f8-f0a966ad053d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.526035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.529510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "334851eb-8b75-4e87-a0ee-4d1af4cc02b4" (UID: "334851eb-8b75-4e87-a0ee-4d1af4cc02b4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:08 crc kubenswrapper[5030]: W0120 23:12:08.533737 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fe612c_1c3a_4269_8177_c09b217721c0.slice/crio-9556bff1ce1b568094892948cd334e8a457b628dc10d3a86c6f75249d495d434 WatchSource:0}: Error finding container 9556bff1ce1b568094892948cd334e8a457b628dc10d3a86c6f75249d495d434: Status 404 returned error can't find the container with id 9556bff1ce1b568094892948cd334e8a457b628dc10d3a86c6f75249d495d434 Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.569675 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vp59\" (UniqueName: \"kubernetes.io/projected/e942f538-98eb-4877-91f8-f0a966ad053d-kube-api-access-5vp59\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.569879 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.569951 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e942f538-98eb-4877-91f8-f0a966ad053d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570007 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e942f538-98eb-4877-91f8-f0a966ad053d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570061 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570123 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570182 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570236 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mx64\" (UniqueName: \"kubernetes.io/projected/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-kube-api-access-5mx64\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.570304 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/334851eb-8b75-4e87-a0ee-4d1af4cc02b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591708 5030 generic.go:334] "Generic (PLEG): container finished" podID="e942f538-98eb-4877-91f8-f0a966ad053d" containerID="5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" exitCode=0 Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591738 5030 generic.go:334] "Generic (PLEG): container finished" podID="e942f538-98eb-4877-91f8-f0a966ad053d" containerID="cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" exitCode=143 Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerDied","Data":"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerDied","Data":"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591814 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e942f538-98eb-4877-91f8-f0a966ad053d","Type":"ContainerDied","Data":"dc12881a69f1ff81ddf194bf9832be067da4b4ebe332fa0d79af9afd855edc39"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591820 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.591829 5030 scope.go:117] "RemoveContainer" containerID="5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.592995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f8fe612c-1c3a-4269-8177-c09b217721c0","Type":"ContainerStarted","Data":"9556bff1ce1b568094892948cd334e8a457b628dc10d3a86c6f75249d495d434"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.600841 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.601022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerDied","Data":"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.601518 5030 generic.go:334] "Generic (PLEG): container finished" podID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerID="48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" exitCode=0 Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.601647 5030 generic.go:334] "Generic (PLEG): container finished" podID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerID="be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" exitCode=143 Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.601862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerDied","Data":"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.601959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"334851eb-8b75-4e87-a0ee-4d1af4cc02b4","Type":"ContainerDied","Data":"dd99fb6b5cf2375443c2b9e4a96579c31808d38953444c2016074d35c6a2d9fe"} Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.669908 5030 scope.go:117] "RemoveContainer" containerID="cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.707127 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.709373 5030 scope.go:117] "RemoveContainer" containerID="5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.709878 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d\": container with ID starting with 5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d not found: ID does not exist" containerID="5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.709916 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d"} err="failed to get container status \"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d\": rpc error: code = NotFound desc = could not find container \"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d\": container with ID starting with 5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.709943 5030 scope.go:117] "RemoveContainer" containerID="cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.710266 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541\": container with ID starting 
with cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541 not found: ID does not exist" containerID="cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710294 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541"} err="failed to get container status \"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541\": rpc error: code = NotFound desc = could not find container \"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541\": container with ID starting with cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541 not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710313 5030 scope.go:117] "RemoveContainer" containerID="5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710535 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d"} err="failed to get container status \"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d\": rpc error: code = NotFound desc = could not find container \"5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d\": container with ID starting with 5a017f0d75a2b1b2df4c6128674e3e0c1219ea294796b8680fd9e5bfbbd4885d not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710560 5030 scope.go:117] "RemoveContainer" containerID="cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710923 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541"} err="failed to get container status \"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541\": rpc error: code = NotFound desc = could not find container \"cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541\": container with ID starting with cf80a441aa139cb5d8991cdc7513bf89d5e1f425f393cb598dae97d118a19541 not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.710953 5030 scope.go:117] "RemoveContainer" containerID="48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.718864 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.727634 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.738339 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.745153 5030 scope.go:117] "RemoveContainer" containerID="be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.763419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.763820 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-log" Jan 20 23:12:08 crc 
kubenswrapper[5030]: I0120 23:12:08.763838 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-log" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.763856 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-metadata" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.763868 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-metadata" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.763885 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-log" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.763891 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-log" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.763904 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-api" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.763909 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-api" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.764063 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-log" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.764076 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-metadata" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.764093 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" containerName="nova-metadata-log" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.764105 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" containerName="nova-api-api" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.765194 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.768184 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.772794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.784307 5030 scope.go:117] "RemoveContainer" containerID="48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.785849 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d\": container with ID starting with 48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d not found: ID does not exist" containerID="48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.785902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d"} err="failed to get container status \"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d\": rpc error: code = NotFound desc = could not find container \"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d\": container with ID starting with 48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.785934 5030 scope.go:117] "RemoveContainer" containerID="be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" Jan 20 23:12:08 crc kubenswrapper[5030]: E0120 23:12:08.790340 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a\": container with ID starting with be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a not found: ID does not exist" containerID="be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.790384 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a"} err="failed to get container status \"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a\": rpc error: code = NotFound desc = could not find container \"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a\": container with ID starting with be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.790408 5030 scope.go:117] "RemoveContainer" containerID="48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.793522 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d"} err="failed to get container status \"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d\": rpc error: code = NotFound desc = could not find container \"48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d\": container with ID starting with 
48e5916f4e1a9d03fb03f8dc556efe4210698efb236815adc11ed5d3e38ac11d not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.793820 5030 scope.go:117] "RemoveContainer" containerID="be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.796044 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a"} err="failed to get container status \"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a\": rpc error: code = NotFound desc = could not find container \"be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a\": container with ID starting with be2e8b472e5c530acbd60f4c027a8ae17ca39812ba4e92bb7a7d069f2b892d0a not found: ID does not exist" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.802723 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.805338 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.807631 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.811833 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.814138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4pl\" (UniqueName: \"kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sts\" (UniqueName: \"kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874597 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.874989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc 
kubenswrapper[5030]: I0120 23:12:08.976464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4pl\" (UniqueName: \"kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.976522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sts\" (UniqueName: \"kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.977139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.977353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.981379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.982432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.982435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.983682 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.984061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.993667 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4pl\" (UniqueName: \"kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl\") pod \"nova-api-0\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:08 crc kubenswrapper[5030]: I0120 23:12:08.999519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sts\" (UniqueName: \"kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts\") pod \"nova-metadata-0\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.091656 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.131857 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.576381 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.634607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerStarted","Data":"415d12738e7161847a971855ba34466bb1446d27019f10ccdee4acbd437d86e0"} Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.637124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerStarted","Data":"6d6429ad1c5cf505e741b2afcde3786c0f871b3a9e56e3e91eb06baf3a3c9a5b"} Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.640537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f8fe612c-1c3a-4269-8177-c09b217721c0","Type":"ContainerStarted","Data":"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194"} Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.640728 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.664913 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.664889755 podStartE2EDuration="2.664889755s" podCreationTimestamp="2026-01-20 23:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:09.657869136 +0000 UTC m=+2201.978129444" watchObservedRunningTime="2026-01-20 23:12:09.664889755 +0000 UTC m=+2201.985150043" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.688741 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:09 crc kubenswrapper[5030]: W0120 23:12:09.696185 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf281c690_6bd6_4c53_b35c_e023c899af09.slice/crio-90334a60f3e8e0d7fa0a5696400d15ce50220c5604b440e0cdfff29c48d4f8ae WatchSource:0}: Error finding container 90334a60f3e8e0d7fa0a5696400d15ce50220c5604b440e0cdfff29c48d4f8ae: Status 404 returned error can't find the container with id 90334a60f3e8e0d7fa0a5696400d15ce50220c5604b440e0cdfff29c48d4f8ae Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.972097 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334851eb-8b75-4e87-a0ee-4d1af4cc02b4" path="/var/lib/kubelet/pods/334851eb-8b75-4e87-a0ee-4d1af4cc02b4/volumes" Jan 20 23:12:09 crc kubenswrapper[5030]: I0120 23:12:09.972863 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e942f538-98eb-4877-91f8-f0a966ad053d" path="/var/lib/kubelet/pods/e942f538-98eb-4877-91f8-f0a966ad053d/volumes" Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.653053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerStarted","Data":"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.653356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerStarted","Data":"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.655286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerStarted","Data":"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.655345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerStarted","Data":"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.655357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerStarted","Data":"90334a60f3e8e0d7fa0a5696400d15ce50220c5604b440e0cdfff29c48d4f8ae"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.658070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerStarted","Data":"3551a6ad6a7f9896ce984607a0aa92f84930bab450b78f2926a609f4d1fb569f"} Jan 20 23:12:10 crc kubenswrapper[5030]: I0120 23:12:10.686445 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.68641778 podStartE2EDuration="2.68641778s" podCreationTimestamp="2026-01-20 23:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:10.668731424 +0000 UTC m=+2202.988991702" watchObservedRunningTime="2026-01-20 23:12:10.68641778 +0000 UTC m=+2203.006678108" Jan 20 23:12:10 crc 
kubenswrapper[5030]: I0120 23:12:10.701210 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.701189957 podStartE2EDuration="2.701189957s" podCreationTimestamp="2026-01-20 23:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:10.695241274 +0000 UTC m=+2203.015501602" watchObservedRunningTime="2026-01-20 23:12:10.701189957 +0000 UTC m=+2203.021450245" Jan 20 23:12:11 crc kubenswrapper[5030]: I0120 23:12:11.672159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerStarted","Data":"21a0b20ed7872a7491e01433237609bcf70f423e4212925a50f316efc5c62607"} Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.322099 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.325057 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.333032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.439056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hgr6\" (UniqueName: \"kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.439118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.439213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.540933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hgr6\" (UniqueName: \"kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.540990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.541024 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.541533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.541588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.558016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hgr6\" (UniqueName: \"kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6\") pod \"redhat-marketplace-drs6f\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.644011 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.681760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerStarted","Data":"1920caa76db3c9912949bcd704fb5751bf11db12d8ad8a34a3321e431783f5b0"} Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.681982 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:12 crc kubenswrapper[5030]: I0120 23:12:12.712384 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.149067805 podStartE2EDuration="6.712366139s" podCreationTimestamp="2026-01-20 23:12:06 +0000 UTC" firstStartedPulling="2026-01-20 23:12:07.492759225 +0000 UTC m=+2199.813019513" lastFinishedPulling="2026-01-20 23:12:12.056057559 +0000 UTC m=+2204.376317847" observedRunningTime="2026-01-20 23:12:12.709812999 +0000 UTC m=+2205.030073287" watchObservedRunningTime="2026-01-20 23:12:12.712366139 +0000 UTC m=+2205.032626427" Jan 20 23:12:13 crc kubenswrapper[5030]: I0120 23:12:13.085404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:13 crc kubenswrapper[5030]: W0120 23:12:13.111780 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb62bb447_625b_45a3_828b_76bc52147687.slice/crio-d1e9967ed42eb4dece4d45e875c9683dfccc203e89baf193ec8874b13951bf84 WatchSource:0}: Error finding container d1e9967ed42eb4dece4d45e875c9683dfccc203e89baf193ec8874b13951bf84: Status 404 returned error can't find the container with id d1e9967ed42eb4dece4d45e875c9683dfccc203e89baf193ec8874b13951bf84 Jan 20 23:12:13 crc kubenswrapper[5030]: I0120 23:12:13.691329 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="b62bb447-625b-45a3-828b-76bc52147687" containerID="84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f" exitCode=0 Jan 20 23:12:13 crc kubenswrapper[5030]: I0120 23:12:13.691418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerDied","Data":"84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f"} Jan 20 23:12:13 crc kubenswrapper[5030]: I0120 23:12:13.691476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerStarted","Data":"d1e9967ed42eb4dece4d45e875c9683dfccc203e89baf193ec8874b13951bf84"} Jan 20 23:12:14 crc kubenswrapper[5030]: I0120 23:12:14.133828 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:14 crc kubenswrapper[5030]: I0120 23:12:14.134141 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:14 crc kubenswrapper[5030]: I0120 23:12:14.710499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerStarted","Data":"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5"} Jan 20 23:12:15 crc kubenswrapper[5030]: I0120 23:12:15.724924 5030 generic.go:334] "Generic (PLEG): container finished" podID="b62bb447-625b-45a3-828b-76bc52147687" containerID="d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5" exitCode=0 Jan 20 23:12:15 crc kubenswrapper[5030]: I0120 23:12:15.725175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerDied","Data":"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5"} Jan 20 23:12:16 crc kubenswrapper[5030]: I0120 23:12:16.744359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerStarted","Data":"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77"} Jan 20 23:12:16 crc kubenswrapper[5030]: I0120 23:12:16.772323 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-drs6f" podStartSLOduration=2.033167079 podStartE2EDuration="4.772306103s" podCreationTimestamp="2026-01-20 23:12:12 +0000 UTC" firstStartedPulling="2026-01-20 23:12:13.69306271 +0000 UTC m=+2206.013323028" lastFinishedPulling="2026-01-20 23:12:16.432201764 +0000 UTC m=+2208.752462052" observedRunningTime="2026-01-20 23:12:16.769029074 +0000 UTC m=+2209.089289372" watchObservedRunningTime="2026-01-20 23:12:16.772306103 +0000 UTC m=+2209.092566391" Jan 20 23:12:18 crc kubenswrapper[5030]: I0120 23:12:18.037433 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:12:19 crc kubenswrapper[5030]: I0120 23:12:19.093548 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:19 crc kubenswrapper[5030]: I0120 23:12:19.094143 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:19 crc kubenswrapper[5030]: I0120 23:12:19.133843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:19 crc kubenswrapper[5030]: I0120 23:12:19.133896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:20 crc kubenswrapper[5030]: I0120 23:12:20.176858 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:20 crc kubenswrapper[5030]: I0120 23:12:20.259847 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:20 crc kubenswrapper[5030]: I0120 23:12:20.259931 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.248:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:20 crc kubenswrapper[5030]: I0120 23:12:20.260129 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.248:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:22 crc kubenswrapper[5030]: I0120 23:12:22.645084 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:22 crc kubenswrapper[5030]: I0120 23:12:22.645458 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:22 crc kubenswrapper[5030]: I0120 23:12:22.723739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:22 crc kubenswrapper[5030]: I0120 23:12:22.880497 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:23 crc kubenswrapper[5030]: I0120 23:12:23.497443 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:24 crc kubenswrapper[5030]: I0120 23:12:24.844775 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-drs6f" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="registry-server" containerID="cri-o://c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77" gracePeriod=2 Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.285935 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.409451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content\") pod \"b62bb447-625b-45a3-828b-76bc52147687\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.409800 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities\") pod \"b62bb447-625b-45a3-828b-76bc52147687\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.409839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hgr6\" (UniqueName: \"kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6\") pod \"b62bb447-625b-45a3-828b-76bc52147687\" (UID: \"b62bb447-625b-45a3-828b-76bc52147687\") " Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.411162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities" (OuterVolumeSpecName: "utilities") pod "b62bb447-625b-45a3-828b-76bc52147687" (UID: "b62bb447-625b-45a3-828b-76bc52147687"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.415792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6" (OuterVolumeSpecName: "kube-api-access-9hgr6") pod "b62bb447-625b-45a3-828b-76bc52147687" (UID: "b62bb447-625b-45a3-828b-76bc52147687"). InnerVolumeSpecName "kube-api-access-9hgr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.437393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b62bb447-625b-45a3-828b-76bc52147687" (UID: "b62bb447-625b-45a3-828b-76bc52147687"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.512463 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.512496 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b62bb447-625b-45a3-828b-76bc52147687-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.512510 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hgr6\" (UniqueName: \"kubernetes.io/projected/b62bb447-625b-45a3-828b-76bc52147687-kube-api-access-9hgr6\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.857705 5030 generic.go:334] "Generic (PLEG): container finished" podID="b62bb447-625b-45a3-828b-76bc52147687" containerID="c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77" exitCode=0 Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.857758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerDied","Data":"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77"} Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.857804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-drs6f" event={"ID":"b62bb447-625b-45a3-828b-76bc52147687","Type":"ContainerDied","Data":"d1e9967ed42eb4dece4d45e875c9683dfccc203e89baf193ec8874b13951bf84"} Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.857853 5030 scope.go:117] "RemoveContainer" containerID="c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.858754 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-drs6f" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.886388 5030 scope.go:117] "RemoveContainer" containerID="d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.911742 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.925660 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-drs6f"] Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.933091 5030 scope.go:117] "RemoveContainer" containerID="84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.968446 5030 scope.go:117] "RemoveContainer" containerID="c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77" Jan 20 23:12:25 crc kubenswrapper[5030]: E0120 23:12:25.969959 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77\": container with ID starting with c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77 not found: ID does not exist" containerID="c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.970000 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77"} err="failed to get container status \"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77\": rpc error: code = NotFound desc = could not find container \"c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77\": container with ID starting with c244d2667dff9dc8fc417c0874e0432312819d42f2d83588dd1df74d82433c77 not found: ID does not exist" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.970023 5030 scope.go:117] "RemoveContainer" containerID="d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5" Jan 20 23:12:25 crc kubenswrapper[5030]: E0120 23:12:25.970546 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5\": container with ID starting with d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5 not found: ID does not exist" containerID="d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.970577 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5"} err="failed to get container status \"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5\": rpc error: code = NotFound desc = could not find container \"d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5\": container with ID starting with d58c09e995f9f48ec22ebb7393339b5b4a3d9d66d8f8e06c112247593bd74ef5 not found: ID does not exist" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.970594 5030 scope.go:117] "RemoveContainer" containerID="84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f" Jan 20 23:12:25 crc kubenswrapper[5030]: E0120 23:12:25.971003 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f\": container with ID starting with 84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f not found: ID does not exist" containerID="84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.971032 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f"} err="failed to get container status \"84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f\": rpc error: code = NotFound desc = could not find container \"84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f\": container with ID starting with 84ffb3ce2ccdd49860ffeadc3a7cf9bbac40f3439884425e8feabfcdf71a1b4f not found: ID does not exist" Jan 20 23:12:25 crc kubenswrapper[5030]: I0120 23:12:25.981980 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62bb447-625b-45a3-828b-76bc52147687" path="/var/lib/kubelet/pods/b62bb447-625b-45a3-828b-76bc52147687/volumes" Jan 20 23:12:28 crc kubenswrapper[5030]: I0120 23:12:28.896789 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c9520ac-d743-44eb-942f-6ffc49538ae1" containerID="3bab7e848d1c904c3ea9680a2511ae2b06c32b2cb43119543388719f7eb1b7bd" exitCode=137 Jan 20 23:12:28 crc kubenswrapper[5030]: I0120 23:12:28.896898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0c9520ac-d743-44eb-942f-6ffc49538ae1","Type":"ContainerDied","Data":"3bab7e848d1c904c3ea9680a2511ae2b06c32b2cb43119543388719f7eb1b7bd"} Jan 20 23:12:28 crc kubenswrapper[5030]: I0120 23:12:28.897399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0c9520ac-d743-44eb-942f-6ffc49538ae1","Type":"ContainerDied","Data":"f6d0cb257d5bf860290c57b4ac57a872f0b32edd9efaa9e15695346ac985387f"} Jan 20 23:12:28 crc kubenswrapper[5030]: I0120 23:12:28.897423 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d0cb257d5bf860290c57b4ac57a872f0b32edd9efaa9e15695346ac985387f" Jan 20 23:12:28 crc kubenswrapper[5030]: I0120 23:12:28.975333 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.090233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data\") pod \"0c9520ac-d743-44eb-942f-6ffc49538ae1\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.090526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxwvl\" (UniqueName: \"kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl\") pod \"0c9520ac-d743-44eb-942f-6ffc49538ae1\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.090613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle\") pod \"0c9520ac-d743-44eb-942f-6ffc49538ae1\" (UID: \"0c9520ac-d743-44eb-942f-6ffc49538ae1\") " Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.097957 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.099195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.103367 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.108069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl" (OuterVolumeSpecName: "kube-api-access-gxwvl") pod "0c9520ac-d743-44eb-942f-6ffc49538ae1" (UID: "0c9520ac-d743-44eb-942f-6ffc49538ae1"). InnerVolumeSpecName "kube-api-access-gxwvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.117589 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.127646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c9520ac-d743-44eb-942f-6ffc49538ae1" (UID: "0c9520ac-d743-44eb-942f-6ffc49538ae1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.128599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data" (OuterVolumeSpecName: "config-data") pod "0c9520ac-d743-44eb-942f-6ffc49538ae1" (UID: "0c9520ac-d743-44eb-942f-6ffc49538ae1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.146006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.146095 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.152052 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.155404 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.193750 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxwvl\" (UniqueName: \"kubernetes.io/projected/0c9520ac-d743-44eb-942f-6ffc49538ae1-kube-api-access-gxwvl\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.194736 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.194841 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9520ac-d743-44eb-942f-6ffc49538ae1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.909985 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.910986 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:29 crc kubenswrapper[5030]: I0120 23:12:29.915060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.018524 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.053703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.065388 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.067186 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="extract-utilities" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.067339 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="extract-utilities" Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.067417 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9520ac-d743-44eb-942f-6ffc49538ae1" containerName="nova-scheduler-scheduler" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.067471 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9520ac-d743-44eb-942f-6ffc49538ae1" containerName="nova-scheduler-scheduler" Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.067542 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="registry-server" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.067605 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="registry-server" Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.067706 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="extract-content" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.067768 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="extract-content" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.068141 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9520ac-d743-44eb-942f-6ffc49538ae1" containerName="nova-scheduler-scheduler" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.068223 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62bb447-625b-45a3-828b-76bc52147687" containerName="registry-server" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.069130 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.086252 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.087723 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.118389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.118508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.118545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxc44\" (UniqueName: \"kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.219678 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.219744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxc44\" (UniqueName: \"kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.219811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.224099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.224109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.244210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxc44\" (UniqueName: \"kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44\") pod \"nova-scheduler-0\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.420464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.809359 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef985aa_30c1_4bbb_9ff6_b97693927a0c.slice/crio-conmon-0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.882242 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.904784 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.924800 5030 generic.go:334] "Generic (PLEG): container finished" podID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" containerID="0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6" exitCode=137 Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.925072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eef985aa-30c1-4bbb-9ff6-b97693927a0c","Type":"ContainerDied","Data":"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6"} Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.925097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eef985aa-30c1-4bbb-9ff6-b97693927a0c","Type":"ContainerDied","Data":"49916d742b3e42d44211ec5ae394a352f365f60386f2926e2bbd4ea3fb5d32f2"} Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.925114 5030 scope.go:117] "RemoveContainer" containerID="0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.925221 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.929010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"056d6457-8742-4d6c-a497-dd9c75a8681b","Type":"ContainerStarted","Data":"ca3cc313f6d6499641ecafd7d7e8c991c1e26fd7f029b671212c68b0ed8f8f63"} Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.931242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wlxn\" (UniqueName: \"kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn\") pod \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.931431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data\") pod \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.931557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle\") pod \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\" (UID: \"eef985aa-30c1-4bbb-9ff6-b97693927a0c\") " Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.936099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn" (OuterVolumeSpecName: "kube-api-access-5wlxn") pod "eef985aa-30c1-4bbb-9ff6-b97693927a0c" (UID: "eef985aa-30c1-4bbb-9ff6-b97693927a0c"). InnerVolumeSpecName "kube-api-access-5wlxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.947758 5030 scope.go:117] "RemoveContainer" containerID="0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6" Jan 20 23:12:30 crc kubenswrapper[5030]: E0120 23:12:30.950156 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6\": container with ID starting with 0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6 not found: ID does not exist" containerID="0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.950202 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6"} err="failed to get container status \"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6\": rpc error: code = NotFound desc = could not find container \"0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6\": container with ID starting with 0f6c619647bd5ef9b6c1eeaac3f8ba03e140af405a4930deabe94b192f2dc1a6 not found: ID does not exist" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.966742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef985aa-30c1-4bbb-9ff6-b97693927a0c" (UID: "eef985aa-30c1-4bbb-9ff6-b97693927a0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:30 crc kubenswrapper[5030]: I0120 23:12:30.971222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data" (OuterVolumeSpecName: "config-data") pod "eef985aa-30c1-4bbb-9ff6-b97693927a0c" (UID: "eef985aa-30c1-4bbb-9ff6-b97693927a0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.034244 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.034270 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wlxn\" (UniqueName: \"kubernetes.io/projected/eef985aa-30c1-4bbb-9ff6-b97693927a0c-kube-api-access-5wlxn\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.034280 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef985aa-30c1-4bbb-9ff6-b97693927a0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.302242 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.315791 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.326126 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:12:31 crc kubenswrapper[5030]: E0120 23:12:31.326511 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.326528 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.326762 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.327356 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.329376 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.329658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.329765 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.334210 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.440521 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.440577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zc5v\" (UniqueName: \"kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.440639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.440694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.440814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.542678 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.542778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.542810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zc5v\" (UniqueName: \"kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.542847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.542869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.558919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.558922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.559108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.559742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zc5v\" (UniqueName: \"kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.559983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.642909 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.938072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"056d6457-8742-4d6c-a497-dd9c75a8681b","Type":"ContainerStarted","Data":"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91"} Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.962464 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.962448098 podStartE2EDuration="2.962448098s" podCreationTimestamp="2026-01-20 23:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:31.958888342 +0000 UTC m=+2224.279148630" watchObservedRunningTime="2026-01-20 23:12:31.962448098 +0000 UTC m=+2224.282708376" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.973536 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9520ac-d743-44eb-942f-6ffc49538ae1" path="/var/lib/kubelet/pods/0c9520ac-d743-44eb-942f-6ffc49538ae1/volumes" Jan 20 23:12:31 crc kubenswrapper[5030]: I0120 23:12:31.975010 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef985aa-30c1-4bbb-9ff6-b97693927a0c" path="/var/lib/kubelet/pods/eef985aa-30c1-4bbb-9ff6-b97693927a0c/volumes" Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.099471 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.100104 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-central-agent" containerID="cri-o://6d6429ad1c5cf505e741b2afcde3786c0f871b3a9e56e3e91eb06baf3a3c9a5b" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.100199 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-notification-agent" containerID="cri-o://3551a6ad6a7f9896ce984607a0aa92f84930bab450b78f2926a609f4d1fb569f" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.100194 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="sg-core" containerID="cri-o://21a0b20ed7872a7491e01433237609bcf70f423e4212925a50f316efc5c62607" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.100263 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="proxy-httpd" containerID="cri-o://1920caa76db3c9912949bcd704fb5751bf11db12d8ad8a34a3321e431783f5b0" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.109019 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.245:3000/\": EOF" Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.159527 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 
23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954327 5030 generic.go:334] "Generic (PLEG): container finished" podID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerID="1920caa76db3c9912949bcd704fb5751bf11db12d8ad8a34a3321e431783f5b0" exitCode=0 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954567 5030 generic.go:334] "Generic (PLEG): container finished" podID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerID="21a0b20ed7872a7491e01433237609bcf70f423e4212925a50f316efc5c62607" exitCode=2 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954575 5030 generic.go:334] "Generic (PLEG): container finished" podID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerID="6d6429ad1c5cf505e741b2afcde3786c0f871b3a9e56e3e91eb06baf3a3c9a5b" exitCode=0 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerDied","Data":"1920caa76db3c9912949bcd704fb5751bf11db12d8ad8a34a3321e431783f5b0"} Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerDied","Data":"21a0b20ed7872a7491e01433237609bcf70f423e4212925a50f316efc5c62607"} Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.954653 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerDied","Data":"6d6429ad1c5cf505e741b2afcde3786c0f871b3a9e56e3e91eb06baf3a3c9a5b"} Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.957340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b4eb966-3abf-4d3a-b65c-e05c8366c302","Type":"ContainerStarted","Data":"a311ea86198bf17bcf45cd98e7b6fc056d1db58bf1b30b8a6c217b794dae71eb"} Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.957380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b4eb966-3abf-4d3a-b65c-e05c8366c302","Type":"ContainerStarted","Data":"bfbfa3697dcedc21eed3e8c48da58f623c89e02dbc3c4db25b2696963e479e52"} Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.984874 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.985123 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-log" containerID="cri-o://1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.985207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-api" containerID="cri-o://93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1" gracePeriod=30 Jan 20 23:12:32 crc kubenswrapper[5030]: I0120 23:12:32.988885 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.988865202 podStartE2EDuration="1.988865202s" podCreationTimestamp="2026-01-20 23:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:32.97638107 +0000 UTC m=+2225.296641358" watchObservedRunningTime="2026-01-20 23:12:32.988865202 +0000 UTC m=+2225.309125500" Jan 20 23:12:33 crc kubenswrapper[5030]: I0120 23:12:33.966843 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerID="1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca" exitCode=143 Jan 20 23:12:33 crc kubenswrapper[5030]: I0120 23:12:33.972849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerDied","Data":"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca"} Jan 20 23:12:35 crc kubenswrapper[5030]: I0120 23:12:35.420654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:35 crc kubenswrapper[5030]: I0120 23:12:35.992880 5030 generic.go:334] "Generic (PLEG): container finished" podID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerID="3551a6ad6a7f9896ce984607a0aa92f84930bab450b78f2926a609f4d1fb569f" exitCode=0 Jan 20 23:12:35 crc kubenswrapper[5030]: I0120 23:12:35.993227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerDied","Data":"3551a6ad6a7f9896ce984607a0aa92f84930bab450b78f2926a609f4d1fb569f"} Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.284274 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.339954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz5t2\" (UniqueName: \"kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340239 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd\") pod \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\" (UID: \"a322fe84-6902-4d63-8eb8-1bf586b7f0ff\") " Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.341180 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.340767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.345952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts" (OuterVolumeSpecName: "scripts") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:36 crc kubenswrapper[5030]: I0120 23:12:36.348963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2" (OuterVolumeSpecName: "kube-api-access-jz5t2") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "kube-api-access-jz5t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.443395 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.443421 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz5t2\" (UniqueName: \"kubernetes.io/projected/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-kube-api-access-jz5t2\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.443431 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.453554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.463401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.479325 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.520563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data" (OuterVolumeSpecName: "config-data") pod "a322fe84-6902-4d63-8eb8-1bf586b7f0ff" (UID: "a322fe84-6902-4d63-8eb8-1bf586b7f0ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.544667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data\") pod \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.544742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl4pl\" (UniqueName: \"kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl\") pod \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.544781 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle\") pod \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.544808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs\") pod \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\" (UID: \"1fea78ed-d40b-4cf9-893c-5130ff681e3a\") " Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.545154 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.545166 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.545176 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a322fe84-6902-4d63-8eb8-1bf586b7f0ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.545564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs" (OuterVolumeSpecName: "logs") pod "1fea78ed-d40b-4cf9-893c-5130ff681e3a" (UID: "1fea78ed-d40b-4cf9-893c-5130ff681e3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.550556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl" (OuterVolumeSpecName: "kube-api-access-hl4pl") pod "1fea78ed-d40b-4cf9-893c-5130ff681e3a" (UID: "1fea78ed-d40b-4cf9-893c-5130ff681e3a"). InnerVolumeSpecName "kube-api-access-hl4pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.576948 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fea78ed-d40b-4cf9-893c-5130ff681e3a" (UID: "1fea78ed-d40b-4cf9-893c-5130ff681e3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.583216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data" (OuterVolumeSpecName: "config-data") pod "1fea78ed-d40b-4cf9-893c-5130ff681e3a" (UID: "1fea78ed-d40b-4cf9-893c-5130ff681e3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.643159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.646982 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.647004 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl4pl\" (UniqueName: \"kubernetes.io/projected/1fea78ed-d40b-4cf9-893c-5130ff681e3a-kube-api-access-hl4pl\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.647017 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fea78ed-d40b-4cf9-893c-5130ff681e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:36.647026 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fea78ed-d40b-4cf9-893c-5130ff681e3a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.003096 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerID="93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1" exitCode=0 Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.003151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerDied","Data":"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1"} Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.003175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1fea78ed-d40b-4cf9-893c-5130ff681e3a","Type":"ContainerDied","Data":"415d12738e7161847a971855ba34466bb1446d27019f10ccdee4acbd437d86e0"} Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.003173 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.003198 5030 scope.go:117] "RemoveContainer" containerID="93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.008909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a322fe84-6902-4d63-8eb8-1bf586b7f0ff","Type":"ContainerDied","Data":"be3bd4012b78cf6a4329cd97607925e0b26aad9a281e9e07f016c91c34655251"} Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.009000 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.043951 5030 scope.go:117] "RemoveContainer" containerID="1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.045221 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.058866 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067138 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067481 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-notification-agent" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-notification-agent" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067501 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-log" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067507 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-log" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067523 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-api" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-api" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="proxy-httpd" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067541 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="proxy-httpd" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067556 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="sg-core" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067561 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="sg-core" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.067577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-central-agent" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067583 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-central-agent" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067762 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-central-agent" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067774 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="ceilometer-notification-agent" Jan 20 23:12:37 
crc kubenswrapper[5030]: I0120 23:12:37.067785 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="proxy-httpd" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067798 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-log" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067809 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" containerName="sg-core" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.067820 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" containerName="nova-api-api" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.069730 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.077427 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.077945 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.078308 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.079537 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.094663 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.107791 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.112296 5030 scope.go:117] "RemoveContainer" containerID="93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.116872 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1\": container with ID starting with 93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1 not found: ID does not exist" containerID="93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.116905 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1"} err="failed to get container status \"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1\": rpc error: code = NotFound desc = could not find container \"93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1\": container with ID starting with 93d894b25f0dd791964b556ec34de02333a673581fcb40dc43c0fb213b0e64b1 not found: ID does not exist" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.116945 5030 scope.go:117] "RemoveContainer" containerID="1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca" Jan 20 23:12:37 crc kubenswrapper[5030]: E0120 23:12:37.118717 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca\": container with ID starting with 1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca not found: ID does not exist" containerID="1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.118751 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca"} err="failed to get container status \"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca\": rpc error: code = NotFound desc = could not find container \"1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca\": container with ID starting with 1b472fab068d969984ea5daeec86f6ec516de3bfcaa4a46e06ded1e201939fca not found: ID does not exist" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.118765 5030 scope.go:117] "RemoveContainer" containerID="1920caa76db3c9912949bcd704fb5751bf11db12d8ad8a34a3321e431783f5b0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.125172 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.127452 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.131243 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.131492 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.136174 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.158849 5030 scope.go:117] "RemoveContainer" containerID="21a0b20ed7872a7491e01433237609bcf70f423e4212925a50f316efc5c62607" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.182401 5030 scope.go:117] "RemoveContainer" containerID="3551a6ad6a7f9896ce984607a0aa92f84930bab450b78f2926a609f4d1fb569f" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.206396 5030 scope.go:117] "RemoveContainer" containerID="6d6429ad1c5cf505e741b2afcde3786c0f871b3a9e56e3e91eb06baf3a3c9a5b" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259100 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfn44\" (UniqueName: \"kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259141 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9ptp\" (UniqueName: \"kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs\") pod \"nova-api-0\" (UID: 
\"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259329 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.259388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.260330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc 
kubenswrapper[5030]: I0120 23:12:37.260387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfn44\" (UniqueName: \"kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9ptp\" (UniqueName: \"kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts\") pod 
\"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.362964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.363010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.363090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.363170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.363363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.363584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.364357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.367157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.367613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.368566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.368743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.369184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.369178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.369407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.371174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.379374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfn44\" (UniqueName: \"kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44\") pod \"ceilometer-0\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.382459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9ptp\" (UniqueName: \"kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp\") pod \"nova-api-0\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.400539 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.471428 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.890496 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.979087 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fea78ed-d40b-4cf9-893c-5130ff681e3a" path="/var/lib/kubelet/pods/1fea78ed-d40b-4cf9-893c-5130ff681e3a/volumes" Jan 20 23:12:37 crc kubenswrapper[5030]: I0120 23:12:37.980136 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a322fe84-6902-4d63-8eb8-1bf586b7f0ff" path="/var/lib/kubelet/pods/a322fe84-6902-4d63-8eb8-1bf586b7f0ff/volumes" Jan 20 23:12:38 crc kubenswrapper[5030]: I0120 23:12:38.038315 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:12:38 crc kubenswrapper[5030]: I0120 23:12:38.049478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerStarted","Data":"bf45d1e0633c4a063370b13f0f5f74f2f89240c371241acd7d83455184d12326"} Jan 20 23:12:38 crc kubenswrapper[5030]: I0120 23:12:38.051342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerStarted","Data":"5f06366e9e7d171836c27d8001feb68b6522b9a4a5e0d196ba21453a1998bcc6"} Jan 20 23:12:39 crc kubenswrapper[5030]: I0120 23:12:39.065152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerStarted","Data":"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d"} Jan 20 23:12:39 crc kubenswrapper[5030]: I0120 23:12:39.068692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerStarted","Data":"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52"} Jan 20 23:12:39 crc kubenswrapper[5030]: I0120 23:12:39.068824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerStarted","Data":"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c"} Jan 20 23:12:39 crc kubenswrapper[5030]: I0120 23:12:39.109853 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.10983498 podStartE2EDuration="2.10983498s" podCreationTimestamp="2026-01-20 23:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:39.102719749 +0000 UTC m=+2231.422980067" watchObservedRunningTime="2026-01-20 23:12:39.10983498 +0000 UTC m=+2231.430095268" Jan 20 23:12:40 crc kubenswrapper[5030]: I0120 23:12:40.089162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerStarted","Data":"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb"} Jan 20 23:12:40 crc kubenswrapper[5030]: I0120 23:12:40.157739 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:12:40 crc kubenswrapper[5030]: I0120 23:12:40.157905 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:12:40 crc kubenswrapper[5030]: I0120 23:12:40.420781 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:40 crc kubenswrapper[5030]: I0120 23:12:40.465161 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:41 crc kubenswrapper[5030]: I0120 23:12:41.111882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerStarted","Data":"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab"} Jan 20 23:12:41 crc kubenswrapper[5030]: I0120 23:12:41.146577 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:41 crc kubenswrapper[5030]: I0120 23:12:41.643599 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:41 crc kubenswrapper[5030]: I0120 23:12:41.669331 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.121780 5030 scope.go:117] "RemoveContainer" containerID="ddfad440bf1dbcb119302c876ff1167f8af714dafbb96b9a1371e0c947bbd1ef" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.141271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.191881 5030 scope.go:117] "RemoveContainer" containerID="57cfb712970271072b19d4d183fff7c643dc704e5e1907442c4839bf65f54d4c" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.210569 5030 scope.go:117] "RemoveContainer" containerID="272d75bf9c3c2e74c8b39ab9a0c6c94f4aca0e5a6fdfeb82d45bbdd9a5cc5e12" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.254271 5030 scope.go:117] "RemoveContainer" containerID="9b9922dacab84e233a14985be81f6c61373bf733824192ed0355af9dc1045a6e" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.287114 5030 scope.go:117] "RemoveContainer" containerID="23ae2d82921cf0555f438e643bf03e657a5b1d48089e782a3020dd8f67bf0058" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.345140 5030 scope.go:117] "RemoveContainer" containerID="a508ab8c85c7e03a4396f79456f3d11b9a5a5c184f3e100bc13ce8623c314a5a" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.369817 5030 scope.go:117] "RemoveContainer" containerID="99a71f0deee15d0a9699a8ffa07c2e18238d26e0c5c3bddbccea94b51ac564a5" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.382493 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n"] Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.384601 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.386922 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.387398 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.394016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n"] Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.401896 5030 scope.go:117] "RemoveContainer" containerID="898572d4bfe70ff7e5c0bc5e173412aee6bb392651d855f94362d6a8dcdc23c8" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.450472 5030 scope.go:117] "RemoveContainer" containerID="e55c81a9c6be668bdf5b020760ecbb0376b4bd21b7daa84f73ecacc1bd8c9253" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.473876 5030 scope.go:117] "RemoveContainer" containerID="b7540dd8f63045c75c56efba1cd97bca76da39abdb06d7e436a00757cad793ab" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.511280 5030 scope.go:117] "RemoveContainer" containerID="741a4b81aa00f963bf02f9332ccebe85e1d0760931f18f3eb6bcb184b92fd180" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.578077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.578200 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.578262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbt9\" (UniqueName: \"kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.578881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.680839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.680907 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.681000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.681053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbt9\" (UniqueName: \"kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.687665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.687692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.694204 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:42 crc kubenswrapper[5030]: I0120 23:12:42.707197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbt9\" (UniqueName: \"kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9\") pod \"nova-cell1-cell-mapping-fff5n\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:43 crc kubenswrapper[5030]: I0120 23:12:43.006361 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:43 crc kubenswrapper[5030]: I0120 23:12:43.469878 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n"] Jan 20 23:12:43 crc kubenswrapper[5030]: W0120 23:12:43.474198 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod025de003_1154_4871_b722_7d83c8aab74a.slice/crio-e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2 WatchSource:0}: Error finding container e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2: Status 404 returned error can't find the container with id e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2 Jan 20 23:12:44 crc kubenswrapper[5030]: I0120 23:12:44.146704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" event={"ID":"025de003-1154-4871-b722-7d83c8aab74a","Type":"ContainerStarted","Data":"7b21fefc5a3e56be8e6e529b2aa999397d9311389a31236928705e843a13b1df"} Jan 20 23:12:44 crc kubenswrapper[5030]: I0120 23:12:44.147063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" event={"ID":"025de003-1154-4871-b722-7d83c8aab74a","Type":"ContainerStarted","Data":"e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2"} Jan 20 23:12:44 crc kubenswrapper[5030]: I0120 23:12:44.180864 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" podStartSLOduration=2.1808368160000002 podStartE2EDuration="2.180836816s" podCreationTimestamp="2026-01-20 23:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:44.166323565 +0000 UTC m=+2236.486583883" watchObservedRunningTime="2026-01-20 23:12:44.180836816 +0000 UTC m=+2236.501097144" Jan 20 23:12:47 crc kubenswrapper[5030]: I0120 23:12:47.401734 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:47 crc kubenswrapper[5030]: I0120 23:12:47.402502 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:48 crc kubenswrapper[5030]: I0120 23:12:48.193015 5030 generic.go:334] "Generic (PLEG): container finished" podID="025de003-1154-4871-b722-7d83c8aab74a" containerID="7b21fefc5a3e56be8e6e529b2aa999397d9311389a31236928705e843a13b1df" exitCode=0 Jan 20 23:12:48 crc kubenswrapper[5030]: I0120 23:12:48.193057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" event={"ID":"025de003-1154-4871-b722-7d83c8aab74a","Type":"ContainerDied","Data":"7b21fefc5a3e56be8e6e529b2aa999397d9311389a31236928705e843a13b1df"} Jan 20 23:12:48 crc kubenswrapper[5030]: I0120 23:12:48.421443 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.252:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:48 crc kubenswrapper[5030]: I0120 23:12:48.421909 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.252:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.630998 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.734258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts\") pod \"025de003-1154-4871-b722-7d83c8aab74a\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.734448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data\") pod \"025de003-1154-4871-b722-7d83c8aab74a\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.735376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfbt9\" (UniqueName: \"kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9\") pod \"025de003-1154-4871-b722-7d83c8aab74a\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.735429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle\") pod \"025de003-1154-4871-b722-7d83c8aab74a\" (UID: \"025de003-1154-4871-b722-7d83c8aab74a\") " Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.740546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts" (OuterVolumeSpecName: "scripts") pod "025de003-1154-4871-b722-7d83c8aab74a" (UID: "025de003-1154-4871-b722-7d83c8aab74a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.740717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9" (OuterVolumeSpecName: "kube-api-access-mfbt9") pod "025de003-1154-4871-b722-7d83c8aab74a" (UID: "025de003-1154-4871-b722-7d83c8aab74a"). InnerVolumeSpecName "kube-api-access-mfbt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.768166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "025de003-1154-4871-b722-7d83c8aab74a" (UID: "025de003-1154-4871-b722-7d83c8aab74a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.770498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data" (OuterVolumeSpecName: "config-data") pod "025de003-1154-4871-b722-7d83c8aab74a" (UID: "025de003-1154-4871-b722-7d83c8aab74a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.838337 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.838591 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.838708 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfbt9\" (UniqueName: \"kubernetes.io/projected/025de003-1154-4871-b722-7d83c8aab74a-kube-api-access-mfbt9\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:49 crc kubenswrapper[5030]: I0120 23:12:49.838785 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025de003-1154-4871-b722-7d83c8aab74a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.217112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerStarted","Data":"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d"} Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.217569 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.219275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" event={"ID":"025de003-1154-4871-b722-7d83c8aab74a","Type":"ContainerDied","Data":"e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2"} Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.219302 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e74cf52e63c21a82ed8ab73259bbeec1cf0d17ae381ced13f807de862af83fa2" Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.219609 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n" Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.265222 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.079838625 podStartE2EDuration="13.26520378s" podCreationTimestamp="2026-01-20 23:12:37 +0000 UTC" firstStartedPulling="2026-01-20 23:12:38.027708721 +0000 UTC m=+2230.347968999" lastFinishedPulling="2026-01-20 23:12:49.213073856 +0000 UTC m=+2241.533334154" observedRunningTime="2026-01-20 23:12:50.244678675 +0000 UTC m=+2242.564938973" watchObservedRunningTime="2026-01-20 23:12:50.26520378 +0000 UTC m=+2242.585464068" Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.405716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.405997 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-log" containerID="cri-o://947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c" gracePeriod=30 Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.406143 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-api" containerID="cri-o://17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52" gracePeriod=30 Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.437915 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.438157 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="056d6457-8742-4d6c-a497-dd9c75a8681b" containerName="nova-scheduler-scheduler" containerID="cri-o://e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91" gracePeriod=30 Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.451572 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.451876 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-log" containerID="cri-o://c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624" gracePeriod=30 Jan 20 23:12:50 crc kubenswrapper[5030]: I0120 23:12:50.452015 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-metadata" containerID="cri-o://0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34" gracePeriod=30 Jan 20 23:12:51 crc kubenswrapper[5030]: I0120 23:12:51.235297 5030 generic.go:334] "Generic (PLEG): container finished" podID="f281c690-6bd6-4c53-b35c-e023c899af09" containerID="c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624" exitCode=143 Jan 20 23:12:51 crc kubenswrapper[5030]: I0120 23:12:51.235400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerDied","Data":"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624"} Jan 20 23:12:51 crc kubenswrapper[5030]: I0120 23:12:51.239499 5030 generic.go:334] "Generic (PLEG): container finished" podID="03d04802-be7b-437b-94fb-eb88151f97c7" containerID="947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c" exitCode=143 Jan 20 23:12:51 crc kubenswrapper[5030]: I0120 23:12:51.239536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerDied","Data":"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c"} Jan 20 23:12:53 crc kubenswrapper[5030]: I0120 23:12:53.897316 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.024501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data\") pod \"056d6457-8742-4d6c-a497-dd9c75a8681b\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.024732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxc44\" (UniqueName: \"kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44\") pod \"056d6457-8742-4d6c-a497-dd9c75a8681b\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.025014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle\") pod \"056d6457-8742-4d6c-a497-dd9c75a8681b\" (UID: \"056d6457-8742-4d6c-a497-dd9c75a8681b\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.030443 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44" (OuterVolumeSpecName: "kube-api-access-hxc44") pod "056d6457-8742-4d6c-a497-dd9c75a8681b" (UID: "056d6457-8742-4d6c-a497-dd9c75a8681b"). InnerVolumeSpecName "kube-api-access-hxc44". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.050149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data" (OuterVolumeSpecName: "config-data") pod "056d6457-8742-4d6c-a497-dd9c75a8681b" (UID: "056d6457-8742-4d6c-a497-dd9c75a8681b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.060957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "056d6457-8742-4d6c-a497-dd9c75a8681b" (UID: "056d6457-8742-4d6c-a497-dd9c75a8681b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.066775 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.075686 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.126659 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.126684 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/056d6457-8742-4d6c-a497-dd9c75a8681b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.126696 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxc44\" (UniqueName: \"kubernetes.io/projected/056d6457-8742-4d6c-a497-dd9c75a8681b-kube-api-access-hxc44\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.227889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs\") pod \"f281c690-6bd6-4c53-b35c-e023c899af09\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.227987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data\") pod \"f281c690-6bd6-4c53-b35c-e023c899af09\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs\") pod \"f281c690-6bd6-4c53-b35c-e023c899af09\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle\") pod \"f281c690-6bd6-4c53-b35c-e023c899af09\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc 
kubenswrapper[5030]: I0120 23:12:54.228576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs" (OuterVolumeSpecName: "logs") pod "f281c690-6bd6-4c53-b35c-e023c899af09" (UID: "f281c690-6bd6-4c53-b35c-e023c899af09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4sts\" (UniqueName: \"kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts\") pod \"f281c690-6bd6-4c53-b35c-e023c899af09\" (UID: \"f281c690-6bd6-4c53-b35c-e023c899af09\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.228918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.229045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9ptp\" (UniqueName: \"kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.229143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs\") pod \"03d04802-be7b-437b-94fb-eb88151f97c7\" (UID: \"03d04802-be7b-437b-94fb-eb88151f97c7\") " Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.229783 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281c690-6bd6-4c53-b35c-e023c899af09-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.229842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs" (OuterVolumeSpecName: "logs") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.231902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts" (OuterVolumeSpecName: "kube-api-access-j4sts") pod "f281c690-6bd6-4c53-b35c-e023c899af09" (UID: "f281c690-6bd6-4c53-b35c-e023c899af09"). InnerVolumeSpecName "kube-api-access-j4sts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.232421 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp" (OuterVolumeSpecName: "kube-api-access-w9ptp") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "kube-api-access-w9ptp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.253283 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data" (OuterVolumeSpecName: "config-data") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.255965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.257131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f281c690-6bd6-4c53-b35c-e023c899af09" (UID: "f281c690-6bd6-4c53-b35c-e023c899af09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.282507 5030 generic.go:334] "Generic (PLEG): container finished" podID="056d6457-8742-4d6c-a497-dd9c75a8681b" containerID="e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91" exitCode=0 Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.282632 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.283031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"056d6457-8742-4d6c-a497-dd9c75a8681b","Type":"ContainerDied","Data":"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.283076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"056d6457-8742-4d6c-a497-dd9c75a8681b","Type":"ContainerDied","Data":"ca3cc313f6d6499641ecafd7d7e8c991c1e26fd7f029b671212c68b0ed8f8f63"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.283092 5030 scope.go:117] "RemoveContainer" containerID="e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.284315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data" (OuterVolumeSpecName: "config-data") pod "f281c690-6bd6-4c53-b35c-e023c899af09" (UID: "f281c690-6bd6-4c53-b35c-e023c899af09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.284869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.287060 5030 generic.go:334] "Generic (PLEG): container finished" podID="f281c690-6bd6-4c53-b35c-e023c899af09" containerID="0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34" exitCode=0 Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.287153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerDied","Data":"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.287200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f281c690-6bd6-4c53-b35c-e023c899af09","Type":"ContainerDied","Data":"90334a60f3e8e0d7fa0a5696400d15ce50220c5604b440e0cdfff29c48d4f8ae"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.287156 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.289831 5030 generic.go:334] "Generic (PLEG): container finished" podID="03d04802-be7b-437b-94fb-eb88151f97c7" containerID="17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52" exitCode=0 Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.289858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerDied","Data":"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.289874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"03d04802-be7b-437b-94fb-eb88151f97c7","Type":"ContainerDied","Data":"bf45d1e0633c4a063370b13f0f5f74f2f89240c371241acd7d83455184d12326"} Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.289904 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.317154 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.318769 5030 scope.go:117] "RemoveContainer" containerID="e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.320593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f281c690-6bd6-4c53-b35c-e023c899af09" (UID: "f281c690-6bd6-4c53-b35c-e023c899af09"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.322144 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91\": container with ID starting with e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91 not found: ID does not exist" containerID="e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.322211 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91"} err="failed to get container status \"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91\": rpc error: code = NotFound desc = could not find container \"e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91\": container with ID starting with e4a850aa1442af0f782bbfa2f081e52af907d72c072d3ce3948d8f5585cd9a91 not found: ID does not exist" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.322250 5030 scope.go:117] "RemoveContainer" containerID="0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333082 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333115 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333126 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333136 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4sts\" (UniqueName: \"kubernetes.io/projected/f281c690-6bd6-4c53-b35c-e023c899af09-kube-api-access-j4sts\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333146 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03d04802-be7b-437b-94fb-eb88151f97c7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333155 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9ptp\" (UniqueName: \"kubernetes.io/projected/03d04802-be7b-437b-94fb-eb88151f97c7-kube-api-access-w9ptp\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333165 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333175 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333186 5030 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f281c690-6bd6-4c53-b35c-e023c899af09-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333205 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.333913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03d04802-be7b-437b-94fb-eb88151f97c7" (UID: "03d04802-be7b-437b-94fb-eb88151f97c7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.344363 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.344920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-metadata" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.344941 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-metadata" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.344956 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-api" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.344962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-api" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.344979 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-log" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.344985 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-log" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.344998 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="056d6457-8742-4d6c-a497-dd9c75a8681b" containerName="nova-scheduler-scheduler" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.345004 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="056d6457-8742-4d6c-a497-dd9c75a8681b" containerName="nova-scheduler-scheduler" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.345021 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025de003-1154-4871-b722-7d83c8aab74a" containerName="nova-manage" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.345026 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="025de003-1154-4871-b722-7d83c8aab74a" containerName="nova-manage" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.345038 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-log" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.345044 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-log" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.345218 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="056d6457-8742-4d6c-a497-dd9c75a8681b" containerName="nova-scheduler-scheduler" Jan 20 
23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.345233 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-api" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.346207 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-log" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.346234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" containerName="nova-api-log" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.346248 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="025de003-1154-4871-b722-7d83c8aab74a" containerName="nova-manage" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.346258 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" containerName="nova-metadata-metadata" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.346877 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.352519 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.352926 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.357840 5030 scope.go:117] "RemoveContainer" containerID="c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.388468 5030 scope.go:117] "RemoveContainer" containerID="0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.389131 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34\": container with ID starting with 0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34 not found: ID does not exist" containerID="0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.389157 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34"} err="failed to get container status \"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34\": rpc error: code = NotFound desc = could not find container \"0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34\": container with ID starting with 0a8d2a7a25279d9304e1850209b0c77edf0e5d3138a8ce2cc66e6ac6c2dcae34 not found: ID does not exist" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.389178 5030 scope.go:117] "RemoveContainer" containerID="c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.389486 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624\": container with ID starting with c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624 not found: ID does not exist" 
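The NotFound errors above come back from the CRI runtime when the kubelet asks for the status of a container it has already removed; the kubelet logs the error and continues, since a missing container is the desired end state of RemoveContainer. A minimal sketch of that "treat NotFound as already deleted" pattern for any gRPC-backed API, using the standard status and codes helpers (removeContainer is a hypothetical stand-in for the CRI call):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer / ContainerStatus
// round trip; here it always reports the container as missing.
func removeContainer(id string) error {
	return status.Error(codes.NotFound, "could not find container "+id)
}

func cleanUp(id string) error {
	err := removeContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		// Already gone: removal is idempotent, so NotFound is not a failure.
		return nil
	}
	return fmt.Errorf("removing container %s: %w", id, err)
}

func main() {
	if err := cleanUp("947d5c71f7e7"); err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println("container removed or already absent")
}
```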
containerID="c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.389541 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624"} err="failed to get container status \"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624\": rpc error: code = NotFound desc = could not find container \"c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624\": container with ID starting with c80c53ba88caf9cdc71e70c963723d8e80d4aaced08cd26d33f3d20373ef7624 not found: ID does not exist" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.389567 5030 scope.go:117] "RemoveContainer" containerID="17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.408286 5030 scope.go:117] "RemoveContainer" containerID="947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.429076 5030 scope.go:117] "RemoveContainer" containerID="17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.429542 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52\": container with ID starting with 17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52 not found: ID does not exist" containerID="17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.429603 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52"} err="failed to get container status \"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52\": rpc error: code = NotFound desc = could not find container \"17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52\": container with ID starting with 17dd52c806162812d6a1b27460a0037870da2675c322cff24641dd0fefb8fe52 not found: ID does not exist" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.429645 5030 scope.go:117] "RemoveContainer" containerID="947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c" Jan 20 23:12:54 crc kubenswrapper[5030]: E0120 23:12:54.429917 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c\": container with ID starting with 947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c not found: ID does not exist" containerID="947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.429942 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c"} err="failed to get container status \"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c\": rpc error: code = NotFound desc = could not find container \"947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c\": container with ID starting with 947d5c71f7e781d4bd6096549e0b7ed0edb2757a6a4f6328ec135069ead79a1c not found: ID does not exist" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 
23:12:54.435523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5dk\" (UniqueName: \"kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.435613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.435729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.435811 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03d04802-be7b-437b-94fb-eb88151f97c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.537247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5dk\" (UniqueName: \"kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.537420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.537700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.541935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.543567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.554987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5dk\" (UniqueName: 
\"kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk\") pod \"nova-scheduler-0\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.655570 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.670863 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.671132 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.684156 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.734929 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.746599 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.750198 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.753314 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.753388 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.753479 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.753517 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.755249 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.758102 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.758292 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.762178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.773745 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.844423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5rv\" (UniqueName: \"kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.845996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 
23:12:54.846081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.846173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.846268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.846357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m8q\" (UniqueName: \"kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m8q\" (UniqueName: \"kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5rv\" (UniqueName: \"kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.948823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.949588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.954282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.954475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.954516 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.954725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.955656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.955965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.956749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.963402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m8q\" (UniqueName: \"kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q\") pod \"nova-metadata-0\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:54 crc kubenswrapper[5030]: I0120 23:12:54.964817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5rv\" (UniqueName: \"kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv\") pod \"nova-api-0\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.115057 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.123293 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.123486 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:12:55 crc kubenswrapper[5030]: W0120 23:12:55.135466 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod935fdabd_9fb0_4aec_8920_4ccbaf4416a3.slice/crio-2f2378951181784ae7c970bc36e95ac59f829c05bfd64485bd20512a61f67910 WatchSource:0}: Error finding container 2f2378951181784ae7c970bc36e95ac59f829c05bfd64485bd20512a61f67910: Status 404 returned error can't find the container with id 2f2378951181784ae7c970bc36e95ac59f829c05bfd64485bd20512a61f67910 Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.308936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"935fdabd-9fb0-4aec-8920-4ccbaf4416a3","Type":"ContainerStarted","Data":"2f2378951181784ae7c970bc36e95ac59f829c05bfd64485bd20512a61f67910"} Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.591052 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.667509 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:12:55 crc kubenswrapper[5030]: W0120 23:12:55.669822 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e84f01_7009_4bfc_8cfb_2af258ceed3c.slice/crio-adbab748012b4835b73cc6a39938f15d85643d69bd26da766d4cc801e9a7d8f6 WatchSource:0}: Error finding container adbab748012b4835b73cc6a39938f15d85643d69bd26da766d4cc801e9a7d8f6: Status 404 returned error can't find the container with id adbab748012b4835b73cc6a39938f15d85643d69bd26da766d4cc801e9a7d8f6 Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.982433 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d04802-be7b-437b-94fb-eb88151f97c7" path="/var/lib/kubelet/pods/03d04802-be7b-437b-94fb-eb88151f97c7/volumes" Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.983252 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="056d6457-8742-4d6c-a497-dd9c75a8681b" path="/var/lib/kubelet/pods/056d6457-8742-4d6c-a497-dd9c75a8681b/volumes" Jan 20 23:12:55 crc kubenswrapper[5030]: I0120 23:12:55.983945 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f281c690-6bd6-4c53-b35c-e023c899af09" path="/var/lib/kubelet/pods/f281c690-6bd6-4c53-b35c-e023c899af09/volumes" Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.338167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerStarted","Data":"14b424c8af972f0138ff073374e0e6c981d49cdcb262e43e51bfaad6dc967e68"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.338538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerStarted","Data":"f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.338565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerStarted","Data":"79d1a584b801a41f3a5b989619a3373f95baea8a07e06d175bee484b1ef356f1"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.344317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"935fdabd-9fb0-4aec-8920-4ccbaf4416a3","Type":"ContainerStarted","Data":"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.348457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerStarted","Data":"5191d760f933dad69163b25ccde46995e6d88d09b8f999f482abfafa83968828"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.348510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerStarted","Data":"f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.348525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerStarted","Data":"adbab748012b4835b73cc6a39938f15d85643d69bd26da766d4cc801e9a7d8f6"} Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.367543 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.367526588 podStartE2EDuration="2.367526588s" podCreationTimestamp="2026-01-20 23:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:56.355984239 +0000 UTC m=+2248.676244527" watchObservedRunningTime="2026-01-20 23:12:56.367526588 +0000 UTC m=+2248.687786876" Jan 20 23:12:56 crc kubenswrapper[5030]: I0120 23:12:56.394812 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.394790696 podStartE2EDuration="2.394790696s" podCreationTimestamp="2026-01-20 23:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:56.385246066 +0000 UTC m=+2248.705506364" watchObservedRunningTime="2026-01-20 23:12:56.394790696 +0000 UTC m=+2248.715050984" Jan 20 23:12:59 crc kubenswrapper[5030]: I0120 23:12:59.671547 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:13:00 crc kubenswrapper[5030]: I0120 23:13:00.123815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:00 crc kubenswrapper[5030]: I0120 23:13:00.123873 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:04 crc kubenswrapper[5030]: I0120 23:13:04.671823 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:13:04 crc kubenswrapper[5030]: I0120 23:13:04.721287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:13:04 crc kubenswrapper[5030]: I0120 23:13:04.758865 5030 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=10.758808285 podStartE2EDuration="10.758808285s" podCreationTimestamp="2026-01-20 23:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:12:56.404759327 +0000 UTC m=+2248.725019625" watchObservedRunningTime="2026-01-20 23:13:04.758808285 +0000 UTC m=+2257.079068613" Jan 20 23:13:05 crc kubenswrapper[5030]: I0120 23:13:05.115811 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:05 crc kubenswrapper[5030]: I0120 23:13:05.115921 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:05 crc kubenswrapper[5030]: I0120 23:13:05.123570 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:05 crc kubenswrapper[5030]: I0120 23:13:05.123610 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:05 crc kubenswrapper[5030]: I0120 23:13:05.486069 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:13:06 crc kubenswrapper[5030]: I0120 23:13:06.128761 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.20:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:13:06 crc kubenswrapper[5030]: I0120 23:13:06.128784 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.20:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:13:06 crc kubenswrapper[5030]: I0120 23:13:06.142903 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:13:06 crc kubenswrapper[5030]: I0120 23:13:06.142928 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:13:07 crc kubenswrapper[5030]: I0120 23:13:07.478199 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:10 crc kubenswrapper[5030]: I0120 23:13:10.157470 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:13:10 crc kubenswrapper[5030]: I0120 23:13:10.157886 5030 prober.go:107] "Probe failed" probeType="Liveness" 
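[Annotation] The "Probe failed" entries above are the nova-api-0 and nova-metadata-0 startup probes timing out against https://10.217.0.20:8774/ and https://10.217.0.44:8775/ while the services are still coming up; both pods report probe="startup" status="started" about ten seconds later, so a short burst like this is expected. The machine-config-daemon liveness failure is unrelated to the OpenStack pods. A sketch, with the same per-entry strings and caveats as the first one, that groups failures by pod, container and probe type:

import re
from collections import Counter

FAIL = re.compile(
    r'"Probe failed" probeType="([^"]+)" pod="([^"]+)" podUID="[^"]+" '
    r'containerName="([^"]+)"'
)

def probe_failures(entries):
    # (pod, container, probe type) -> number of failures seen in this window.
    # A short burst during startup is normal; a steadily growing count is not.
    counts = Counter()
    for e in entries:
        m = FAIL.search(e)
        if m:
            counts[(m.group(2), m.group(3), m.group(1))] += 1
    return counts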
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:13:11 crc kubenswrapper[5030]: I0120 23:13:11.995785 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:11.996247 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d7bb0c16-38ce-41d8-b090-842104679b2a" containerName="kube-state-metrics" containerID="cri-o://f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed" gracePeriod=30 Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.462103 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.539298 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7bb0c16-38ce-41d8-b090-842104679b2a" containerID="f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed" exitCode=2 Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.539362 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.539357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d7bb0c16-38ce-41d8-b090-842104679b2a","Type":"ContainerDied","Data":"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed"} Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.539533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d7bb0c16-38ce-41d8-b090-842104679b2a","Type":"ContainerDied","Data":"ca1026e445e69d4b1c5e62011e842c0f518029fde20c197fa2e9cc1fd52040af"} Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.539571 5030 scope.go:117] "RemoveContainer" containerID="f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.559049 5030 scope.go:117] "RemoveContainer" containerID="f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed" Jan 20 23:13:12 crc kubenswrapper[5030]: E0120 23:13:12.559527 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed\": container with ID starting with f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed not found: ID does not exist" containerID="f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.559559 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed"} err="failed to get container status \"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed\": rpc error: code = NotFound desc = could not find container \"f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed\": container with ID starting with f5ebd33c9acc4f750d280ab2a5f57801cf2340ac2e82461bfa53375980cb7bed not found: ID does not exist" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 
23:13:12.615924 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdslf\" (UniqueName: \"kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf\") pod \"d7bb0c16-38ce-41d8-b090-842104679b2a\" (UID: \"d7bb0c16-38ce-41d8-b090-842104679b2a\") " Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.621765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf" (OuterVolumeSpecName: "kube-api-access-vdslf") pod "d7bb0c16-38ce-41d8-b090-842104679b2a" (UID: "d7bb0c16-38ce-41d8-b090-842104679b2a"). InnerVolumeSpecName "kube-api-access-vdslf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.719036 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdslf\" (UniqueName: \"kubernetes.io/projected/d7bb0c16-38ce-41d8-b090-842104679b2a-kube-api-access-vdslf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.877820 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.886382 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.901536 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:12 crc kubenswrapper[5030]: E0120 23:13:12.903799 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb0c16-38ce-41d8-b090-842104679b2a" containerName="kube-state-metrics" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.903825 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb0c16-38ce-41d8-b090-842104679b2a" containerName="kube-state-metrics" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.904046 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb0c16-38ce-41d8-b090-842104679b2a" containerName="kube-state-metrics" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.904903 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.907492 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.907534 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:13:12 crc kubenswrapper[5030]: I0120 23:13:12.920038 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.023878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.023953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.024196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.024445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjfs\" (UniqueName: \"kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.126694 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjfs\" (UniqueName: \"kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.126998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.127095 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc 
kubenswrapper[5030]: I0120 23:13:13.127317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.132166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.133578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.141225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.145924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjfs\" (UniqueName: \"kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs\") pod \"kube-state-metrics-0\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.222965 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.692162 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.767379 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.767793 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-central-agent" containerID="cri-o://74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d" gracePeriod=30 Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.767849 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="sg-core" containerID="cri-o://aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab" gracePeriod=30 Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.767911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="proxy-httpd" containerID="cri-o://f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d" gracePeriod=30 Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.767877 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-notification-agent" containerID="cri-o://45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb" gracePeriod=30 Jan 20 23:13:13 crc kubenswrapper[5030]: I0120 23:13:13.973783 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb0c16-38ce-41d8-b090-842104679b2a" path="/var/lib/kubelet/pods/d7bb0c16-38ce-41d8-b090-842104679b2a/volumes" Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580419 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8221006-b113-4388-8dd1-5c297d3057f2" containerID="f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d" exitCode=0 Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580483 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8221006-b113-4388-8dd1-5c297d3057f2" containerID="aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab" exitCode=2 Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580501 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8221006-b113-4388-8dd1-5c297d3057f2" containerID="74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d" exitCode=0 Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerDied","Data":"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d"} Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerDied","Data":"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab"} Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.580698 5030 
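[Annotation] The ceilometer-0 delete fans out into four "Killing container with a grace period" entries, one per container (proxy-httpd, sg-core, ceilometer-central-agent, ceilometer-notification-agent), each with the default 30-second grace period; the "container finished" records that follow show exit code 0 for proxy-httpd and the two agents and 2 for sg-core. A sketch listing the kill requests, same per-entry strings and caveats as above:

import re

KILL = re.compile(
    r'"Killing container with a grace period" pod="([^"]+)" podUID="[^"]+" '
    r'containerName="([^"]+)" containerID="cri-o://([0-9a-f]+)" gracePeriod=(\d+)'
)

def kill_requests(entries):
    # (pod, container, short container ID, grace period in seconds)
    out = []
    for e in entries:
        m = KILL.search(e)
        if m:
            out.append((m.group(1), m.group(2), m.group(3)[:12], int(m.group(4))))
    return out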
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerDied","Data":"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d"} Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.585296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6d52d930-f169-42d5-a93d-d9d30906d5c8","Type":"ContainerStarted","Data":"ee353b1c49e0f185072e424e0c04f05bea612841a01568ff06beb0fff7400225"} Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.585348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6d52d930-f169-42d5-a93d-d9d30906d5c8","Type":"ContainerStarted","Data":"d9c1cff779d09e8c608fde6674adc1a0a6cc2298c8dcae729b529c90c9b85c3a"} Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.585511 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:14 crc kubenswrapper[5030]: I0120 23:13:14.618407 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.251122184 podStartE2EDuration="2.618381109s" podCreationTimestamp="2026-01-20 23:13:12 +0000 UTC" firstStartedPulling="2026-01-20 23:13:13.696810255 +0000 UTC m=+2266.017070543" lastFinishedPulling="2026-01-20 23:13:14.06406918 +0000 UTC m=+2266.384329468" observedRunningTime="2026-01-20 23:13:14.604951315 +0000 UTC m=+2266.925211643" watchObservedRunningTime="2026-01-20 23:13:14.618381109 +0000 UTC m=+2266.938641437" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.125058 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.125749 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.129652 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.132336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.132521 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.137974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.141860 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.168068 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.269783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.269864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.269937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270038 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfn44\" (UniqueName: \"kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44\") pod \"e8221006-b113-4388-8dd1-5c297d3057f2\" (UID: \"e8221006-b113-4388-8dd1-5c297d3057f2\") " Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270342 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.270958 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.271355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.275391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44" (OuterVolumeSpecName: "kube-api-access-lfn44") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "kube-api-access-lfn44". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.281935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts" (OuterVolumeSpecName: "scripts") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.299295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.345391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.365650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data" (OuterVolumeSpecName: "config-data") pod "e8221006-b113-4388-8dd1-5c297d3057f2" (UID: "e8221006-b113-4388-8dd1-5c297d3057f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.372916 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.372953 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.372969 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfn44\" (UniqueName: \"kubernetes.io/projected/e8221006-b113-4388-8dd1-5c297d3057f2-kube-api-access-lfn44\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.372982 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8221006-b113-4388-8dd1-5c297d3057f2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.372994 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.373005 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8221006-b113-4388-8dd1-5c297d3057f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.604077 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8221006-b113-4388-8dd1-5c297d3057f2" containerID="45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb" exitCode=0 Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.605417 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.607546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerDied","Data":"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb"} Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.607642 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e8221006-b113-4388-8dd1-5c297d3057f2","Type":"ContainerDied","Data":"5f06366e9e7d171836c27d8001feb68b6522b9a4a5e0d196ba21453a1998bcc6"} Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.607685 5030 scope.go:117] "RemoveContainer" containerID="f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.609067 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.618725 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.620082 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.648912 5030 scope.go:117] "RemoveContainer" containerID="aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.690686 5030 scope.go:117] "RemoveContainer" containerID="45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.730100 5030 scope.go:117] "RemoveContainer" containerID="74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.766119 5030 scope.go:117] "RemoveContainer" containerID="f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.767687 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d\": container with ID starting with f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d not found: ID does not exist" containerID="f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.767728 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d"} err="failed to get container status \"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d\": rpc error: code = NotFound desc = could not find container \"f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d\": container with ID starting with f9d7b66ea535c50368d809f77bef6106ffd34a4fdfa236183bf6052129108f7d not found: ID does not exist" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.767752 5030 scope.go:117] "RemoveContainer" containerID="aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.768805 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab\": container with ID starting with aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab not found: ID does not exist" containerID="aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.768837 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab"} err="failed to get container status \"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab\": rpc error: code = NotFound desc = could not find container \"aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab\": container with ID starting with aa0f48ea6ef68395fe50a607882e357701f58e47315a580ae3fb7086594500ab not found: ID does not exist" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.768856 5030 scope.go:117] "RemoveContainer" containerID="45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.773841 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb\": container with ID starting with 45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb not found: ID does not exist" containerID="45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.773907 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb"} err="failed to get container status \"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb\": rpc error: code = NotFound desc = could not find container \"45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb\": container with ID starting with 45945c28e049f19caa74bdaa60824fe31de062dde7f1bf6c7eeb887f1fd6c3bb not found: ID does not exist" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.773937 5030 scope.go:117] "RemoveContainer" containerID="74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.774330 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d\": container with ID starting with 74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d not found: ID does not exist" containerID="74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.774355 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d"} err="failed to get container status \"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d\": rpc error: code = NotFound desc = could not find container \"74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d\": container with ID starting with 74a9f31739f785d263ef15b1b9525a58e3e8c9acd391968b607075e97356471d not found: ID does not exist" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.780458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 
23:13:15.802851 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.814758 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.815213 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="sg-core" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="sg-core" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.815246 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="proxy-httpd" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815254 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="proxy-httpd" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.815280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-central-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815288 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-central-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: E0120 23:13:15.815310 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-notification-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815318 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-notification-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815510 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-notification-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815539 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="proxy-httpd" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815547 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="ceilometer-central-agent" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.815563 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" containerName="sg-core" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.817356 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.818773 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.819670 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.819863 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.821097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.973023 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8221006-b113-4388-8dd1-5c297d3057f2" path="/var/lib/kubelet/pods/e8221006-b113-4388-8dd1-5c297d3057f2/volumes" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994558 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994806 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.994969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:15 crc kubenswrapper[5030]: I0120 23:13:15.995102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl5nf\" (UniqueName: \"kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.096832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.096890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.096955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl5nf\" (UniqueName: \"kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.097028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.097052 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.097068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.097109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.097124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc 
kubenswrapper[5030]: I0120 23:13:16.097383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.100033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.102088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.102687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.105425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.105455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.105993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.114962 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl5nf\" (UniqueName: \"kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf\") pod \"ceilometer-0\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.134435 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.449222 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:16 crc kubenswrapper[5030]: I0120 23:13:16.613768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerStarted","Data":"6b235e1482c00f139f72258b244e425359558c1c5e1e3b82b55ef29e51500c6c"} Jan 20 23:13:18 crc kubenswrapper[5030]: I0120 23:13:18.642557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerStarted","Data":"25852a0b06abaef1152c482d4491725ea52efdac6ad8a6d9de42b23da10cf0ee"} Jan 20 23:13:19 crc kubenswrapper[5030]: I0120 23:13:19.656106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerStarted","Data":"ba3004e2d56e03c4b37e9b30e64983c6fd5e37ab9a10b78ca4fc812d194cf0c1"} Jan 20 23:13:19 crc kubenswrapper[5030]: I0120 23:13:19.656497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerStarted","Data":"2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889"} Jan 20 23:13:21 crc kubenswrapper[5030]: I0120 23:13:21.686022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerStarted","Data":"7837b80e02d030e2ce03ff4cee41239d1427301430cd07ae2624f6088414713f"} Jan 20 23:13:21 crc kubenswrapper[5030]: I0120 23:13:21.688162 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:21 crc kubenswrapper[5030]: I0120 23:13:21.727147 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.006606373 podStartE2EDuration="6.727130419s" podCreationTimestamp="2026-01-20 23:13:15 +0000 UTC" firstStartedPulling="2026-01-20 23:13:16.448304297 +0000 UTC m=+2268.768564585" lastFinishedPulling="2026-01-20 23:13:21.168828303 +0000 UTC m=+2273.489088631" observedRunningTime="2026-01-20 23:13:21.709502993 +0000 UTC m=+2274.029763321" watchObservedRunningTime="2026-01-20 23:13:21.727130419 +0000 UTC m=+2274.047390697" Jan 20 23:13:23 crc kubenswrapper[5030]: I0120 23:13:23.232484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.157145 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.157937 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:13:40 crc kubenswrapper[5030]: 
I0120 23:13:40.158009 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.158995 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.159098 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" gracePeriod=600 Jan 20 23:13:40 crc kubenswrapper[5030]: E0120 23:13:40.311949 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.931664 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" exitCode=0 Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.931688 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3"} Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.931799 5030 scope.go:117] "RemoveContainer" containerID="cbac8a8efde457ecc93be8c164837cd99df9bb7ee7e2524b52649b2e550edfbe" Jan 20 23:13:40 crc kubenswrapper[5030]: I0120 23:13:40.932780 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:13:40 crc kubenswrapper[5030]: E0120 23:13:40.933290 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.065171 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.071794 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.077024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.152369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.152690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.152894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbr4\" (UniqueName: \"kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.254893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbr4\" (UniqueName: \"kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.255094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.255149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.255920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.256536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.290175 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pzbr4\" (UniqueName: \"kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4\") pod \"community-operators-m5dvm\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.396377 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.930827 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:41 crc kubenswrapper[5030]: I0120 23:13:41.944260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerStarted","Data":"02ab49a1053ef22b4bd08283adf64ac364fb532b2a50f0b4570a3ca1e2438562"} Jan 20 23:13:42 crc kubenswrapper[5030]: I0120 23:13:42.970452 5030 generic.go:334] "Generic (PLEG): container finished" podID="93de7225-d504-409c-b4b7-8e0826937e9d" containerID="4e9a105c7a4566c0283977510c3c81350acb14b6445e2f615f8b8c7cfcd944a8" exitCode=0 Jan 20 23:13:42 crc kubenswrapper[5030]: I0120 23:13:42.970508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerDied","Data":"4e9a105c7a4566c0283977510c3c81350acb14b6445e2f615f8b8c7cfcd944a8"} Jan 20 23:13:45 crc kubenswrapper[5030]: I0120 23:13:45.007489 5030 generic.go:334] "Generic (PLEG): container finished" podID="93de7225-d504-409c-b4b7-8e0826937e9d" containerID="d95e27461c55aafa37a1ed5e780d795b6de8def937c32b8fb3766bda32116d42" exitCode=0 Jan 20 23:13:45 crc kubenswrapper[5030]: I0120 23:13:45.007686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerDied","Data":"d95e27461c55aafa37a1ed5e780d795b6de8def937c32b8fb3766bda32116d42"} Jan 20 23:13:46 crc kubenswrapper[5030]: I0120 23:13:46.146689 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:47 crc kubenswrapper[5030]: I0120 23:13:47.035287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerStarted","Data":"76189224976e8521cfe7d9597e16f26ebe7d1e854b742ed413b053b50b1fb456"} Jan 20 23:13:47 crc kubenswrapper[5030]: I0120 23:13:47.079139 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m5dvm" podStartSLOduration=3.098732878 podStartE2EDuration="6.079106693s" podCreationTimestamp="2026-01-20 23:13:41 +0000 UTC" firstStartedPulling="2026-01-20 23:13:42.973034808 +0000 UTC m=+2295.293295136" lastFinishedPulling="2026-01-20 23:13:45.953408653 +0000 UTC m=+2298.273668951" observedRunningTime="2026-01-20 23:13:47.065814243 +0000 UTC m=+2299.386074611" watchObservedRunningTime="2026-01-20 23:13:47.079106693 +0000 UTC m=+2299.399366981" Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.554812 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.555528 5030 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="276869e9-7303-4347-a105-fdc604a60324" containerName="openstackclient" containerID="cri-o://fbd9bf9f73aba592b00498bb42493eb96178822d5c2e8da3dcd3907782e1e416" gracePeriod=2 Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.582356 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.605128 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.689723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-4528-account-create-update-8v25n"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.712132 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-4528-account-create-update-8v25n"] Jan 20 23:13:50 crc kubenswrapper[5030]: E0120 23:13:50.740341 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:13:50 crc kubenswrapper[5030]: E0120 23:13:50.740586 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data podName:8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac nodeName:}" failed. No retries permitted until 2026-01-20 23:13:51.240570639 +0000 UTC m=+2303.560830927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data") pod "rabbitmq-server-0" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac") : configmap "rabbitmq-config-data" not found Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.871763 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.891717 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.924710 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6617-account-create-update-bk4hx"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.938442 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-f20d-account-create-update-9v9cn"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.960326 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.974599 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-tdntq"] Jan 20 23:13:50 crc kubenswrapper[5030]: I0120 23:13:50.988691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.004896 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-d1e5-account-create-update-fsnq8"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.018856 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.019172 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="openstack-network-exporter" containerID="cri-o://593b160210b2fb95026561fbd077c54222972c387ef711518ba53a962cd1ab83" gracePeriod=300 Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.119677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.119945 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" containerID="cri-o://a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" gracePeriod=30 Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.120340 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="openstack-network-exporter" containerID="cri-o://d81bac0da6673890a9c30a104bdda356d42b4ea1937571eec0145eb0c3b6b9ec" gracePeriod=30 Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.163987 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.179687 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fc6vh"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.199153 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-e28c-account-create-update-7lfld"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.239200 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8x74r"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.253208 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="ovsdbserver-nb" containerID="cri-o://84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" gracePeriod=300 Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.285766 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.285836 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data podName:8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac nodeName:}" failed. No retries permitted until 2026-01-20 23:13:52.285820529 +0000 UTC m=+2304.606080817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data") pod "rabbitmq-server-0" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac") : configmap "rabbitmq-config-data" not found Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.360656 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8x74r"] Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.361092 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: EOF, stdout: , stderr: , exit code -1" containerID="84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.371071 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.372407 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.372477 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="ovsdbserver-nb" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.391768 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-fc6vh"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.403127 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.403378 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.416613 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:51 crc kubenswrapper[5030]: E0120 23:13:51.417077 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276869e9-7303-4347-a105-fdc604a60324" containerName="openstackclient" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.417094 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="276869e9-7303-4347-a105-fdc604a60324" containerName="openstackclient" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.417316 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="276869e9-7303-4347-a105-fdc604a60324" containerName="openstackclient" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.418032 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.443835 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.463510 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.495345 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-xz7mx"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.526107 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-s6lbv"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.555693 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-wvcfl"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.587964 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-xz7mx"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.600705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.600763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q45tf\" (UniqueName: \"kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.601725 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.622843 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-db-sync-wvcfl"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.645671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.662746 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" probeResult="failure" output=< Jan 20 23:13:51 crc kubenswrapper[5030]: 2026-01-20T23:13:51Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:13:51 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:13:51 crc kubenswrapper[5030]: 2026-01-20T23:13:51Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:13:51 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:13:51 crc kubenswrapper[5030]: > Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.687839 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/glance-db-sync-s6lbv"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.703597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.703849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q45tf\" (UniqueName: \"kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.704258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.718680 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-884d-account-create-update-887vg"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.732944 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qncrz"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.757823 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.767328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q45tf\" (UniqueName: \"kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf\") pod \"nova-api-aaf6-account-create-update-85t2q\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.774738 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qncrz"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.822717 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.823362 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="openstack-network-exporter" containerID="cri-o://ae6ead157a67c32e8e554cbbc941bac6690f0a81f5616ed9ead93667b7e0e30f" gracePeriod=300 Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.845232 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.846510 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.852115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.867004 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.874028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.890688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.891594 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" containerName="kube-state-metrics" containerID="cri-o://ee353b1c49e0f185072e424e0c04f05bea612841a01568ff06beb0fff7400225" gracePeriod=30 Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.920000 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2qn2g"] Jan 20 23:13:51 crc kubenswrapper[5030]: I0120 23:13:51.936708 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-2qn2g"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.010677 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3937a364-08e4-44d1-a93e-8dd4a203f200" path="/var/lib/kubelet/pods/3937a364-08e4-44d1-a93e-8dd4a203f200/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.013079 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4983d8d3-58c0-4c76-9200-dfb9daefaf64" path="/var/lib/kubelet/pods/4983d8d3-58c0-4c76-9200-dfb9daefaf64/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.013641 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590fa392-5f69-4b7b-9def-790dc20e071f" path="/var/lib/kubelet/pods/590fa392-5f69-4b7b-9def-790dc20e071f/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.014246 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aad0d16-4da7-47d6-8707-e0bf8fa3740d" path="/var/lib/kubelet/pods/6aad0d16-4da7-47d6-8707-e0bf8fa3740d/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.015538 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74cf33a3-0b67-4cc3-b502-7c003f57c236" path="/var/lib/kubelet/pods/74cf33a3-0b67-4cc3-b502-7c003f57c236/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.017662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddnc6\" (UniqueName: \"kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6\") pod \"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.017878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts\") pod 
\"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.019601 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9409c1f8-6f79-494b-8dde-4fd7eedc9ae3" path="/var/lib/kubelet/pods/9409c1f8-6f79-494b-8dde-4fd7eedc9ae3/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.020408 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="946a76ec-5148-426e-97b5-bd22220a7ff2" path="/var/lib/kubelet/pods/946a76ec-5148-426e-97b5-bd22220a7ff2/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.022220 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94adfb2f-fb7f-438c-83e0-2d356bcdb122" path="/var/lib/kubelet/pods/94adfb2f-fb7f-438c-83e0-2d356bcdb122/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.023327 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f621682-a425-4314-ae0d-423bc0b23935" path="/var/lib/kubelet/pods/9f621682-a425-4314-ae0d-423bc0b23935/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.023904 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79f6cab-1cbe-42c5-9e2c-63f34c7c8139" path="/var/lib/kubelet/pods/b79f6cab-1cbe-42c5-9e2c-63f34c7c8139/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.024558 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20d1400-78ce-4d6d-b499-359aca94def8" path="/var/lib/kubelet/pods/c20d1400-78ce-4d6d-b499-359aca94def8/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.026483 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4126e6-c5d3-4d42-9ca2-07029f5e3368" path="/var/lib/kubelet/pods/cd4126e6-c5d3-4d42-9ca2-07029f5e3368/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.027012 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0297cb-4da9-453d-ab16-b516a4a45987" path="/var/lib/kubelet/pods/cf0297cb-4da9-453d-ab16-b516a4a45987/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.027531 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effb0df3-9c75-4104-90c2-505e1e4e3b1b" path="/var/lib/kubelet/pods/effb0df3-9c75-4104-90c2-505e1e4e3b1b/volumes" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.027758 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="ovsdbserver-sb" containerID="cri-o://9ec086e5497da0177834476824c0a381c6c4c92f9315ab92e56f076e2aee3773" gracePeriod=300 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.030142 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-82sw7"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.030173 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-82sw7"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.095730 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.096012 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-log" 
containerID="cri-o://6959375e26fa2c4a22be160d0bdb837d17a80304dda7a78b4f10357d3e5cdc2d" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.096431 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-api" containerID="cri-o://bf4743f2f827f2f4f9f2f8517fc9b12848a8b1ea2c24f37ec0a0b7b4cefab31e" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.119448 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.120457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts\") pod \"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.120607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddnc6\" (UniqueName: \"kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6\") pod \"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.122678 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts\") pod \"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.143269 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-hppcs"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.145513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddnc6\" (UniqueName: \"kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6\") pod \"root-account-create-update-sks9w\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.160252 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.166218 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-fff5n"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.167122 5030 generic.go:334] "Generic (PLEG): container finished" podID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerID="d81bac0da6673890a9c30a104bdda356d42b4ea1937571eec0145eb0c3b6b9ec" exitCode=2 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.167171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerDied","Data":"d81bac0da6673890a9c30a104bdda356d42b4ea1937571eec0145eb0c3b6b9ec"} Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.178876 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_682ea052-23fc-4878-ae8e-b5887a6e83b8/ovsdbserver-nb/0.log" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.178945 5030 generic.go:334] "Generic (PLEG): container finished" podID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerID="593b160210b2fb95026561fbd077c54222972c387ef711518ba53a962cd1ab83" exitCode=2 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.178963 5030 generic.go:334] "Generic (PLEG): container finished" podID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerID="84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" exitCode=143 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.179007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerDied","Data":"593b160210b2fb95026561fbd077c54222972c387ef711518ba53a962cd1ab83"} Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.179039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerDied","Data":"84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6"} Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.179751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180112 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-server" containerID="cri-o://54da1c1411ab5ceca784346a76d3a2e8df0bf52604dc8fe074f570c94f675b9c" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180430 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="swift-recon-cron" containerID="cri-o://fef2f8940ea5258c15f06f5d7dd8a6156a4b7e207d9173332274b4da36d97ab5" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180477 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="rsync" containerID="cri-o://df21f467aa48e9cd73ecf780edb3fde78c478ee9e3e1fe10188497a5fc87d1a4" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180509 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-expirer" containerID="cri-o://09c4955040ae79ee588305769f26a9faab7bd5b88198cf480c047e86852ecd2d" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180537 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-updater" containerID="cri-o://237d5c77a55c616b660a6d8a7a0789ffbbf5636dcf37dc9b4177236859da902e" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180564 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-auditor" 
containerID="cri-o://02d2cd5180513116470bf587dced9406e0cccb4577828e686f7ad2f6e7756b90" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180591 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-replicator" containerID="cri-o://a0b0fd6857b1e7dc5deae4a9fc1014ddd91b0f25a6c5eabcb93bc1c9b7ad0c4e" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180635 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-server" containerID="cri-o://39ce2002c75ebf0e4ceffe55cf58454e136385b8b14ce8fbfd7b9e95218bc1d4" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180667 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-updater" containerID="cri-o://8cfd138e7964a40907591a741551bede8ca3e6f85ec899d199026118111003c6" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180693 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-auditor" containerID="cri-o://3baf6bd0266a5a9fd8e4a8400417e2564cf33cb06f85ced28b5396ab30bdbc41" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180722 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-replicator" containerID="cri-o://a98d6c66eb67d678dafdd47b3afa23c516a4bb86312f8871a44e29574063d568" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180750 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-reaper" containerID="cri-o://0cd31ae9c67288c5d006a85915b20396118db52b0eb4af518b3f1e74a1973aad" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180789 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-replicator" containerID="cri-o://6139d1ebd8fe4bd1772b00058abe6fe6f57fc73bf53a9abf6d30f7b0b31612d5" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180904 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-server" containerID="cri-o://8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.180923 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-auditor" containerID="cri-o://9822aac9b52040694bad86d906341b866e4efd664919a2f1c1b84e448aa91fd2" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.188722 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d52d930-f169-42d5-a93d-d9d30906d5c8" 
containerID="ee353b1c49e0f185072e424e0c04f05bea612841a01568ff06beb0fff7400225" exitCode=2 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.190714 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6d52d930-f169-42d5-a93d-d9d30906d5c8","Type":"ContainerDied","Data":"ee353b1c49e0f185072e424e0c04f05bea612841a01568ff06beb0fff7400225"} Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.221443 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.221691 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api-log" containerID="cri-o://12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.222070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-api-0" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api" containerID="cri-o://5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.227542 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_1ba022bd-4168-4c45-a924-061342f680dd/ovsdbserver-sb/0.log" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.227670 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ba022bd-4168-4c45-a924-061342f680dd" containerID="ae6ead157a67c32e8e554cbbc941bac6690f0a81f5616ed9ead93667b7e0e30f" exitCode=2 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.234532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerDied","Data":"ae6ead157a67c32e8e554cbbc941bac6690f0a81f5616ed9ead93667b7e0e30f"} Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.234574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.310988 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.312645 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.339717 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:13:52 crc kubenswrapper[5030]: E0120 23:13:52.403856 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:13:52 crc kubenswrapper[5030]: E0120 23:13:52.405045 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data podName:8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac nodeName:}" failed. No retries permitted until 2026-01-20 23:13:54.405024142 +0000 UTC m=+2306.725284430 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data") pod "rabbitmq-server-0" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac") : configmap "rabbitmq-config-data" not found Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.496936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58bj7\" (UniqueName: \"kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.497272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.497318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.497939 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.498400 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-api" containerID="cri-o://6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.499003 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-httpd" containerID="cri-o://23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.526368 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jct9z"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.562091 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.599150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58bj7\" (UniqueName: \"kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.599276 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: 
\"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.599372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.603806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.605104 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.627846 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jct9z"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.655367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58bj7\" (UniqueName: \"kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7\") pod \"dnsmasq-dnsmasq-84b9f45d47-pvptl\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.657934 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.658187 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-decision-engine-0" podUID="28d47b79-8224-4a87-8ffc-808a6743d00a" containerName="watcher-decision-engine" containerID="cri-o://bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.693677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.693923 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-log" containerID="cri-o://9a976ba4a26e8a14c9b5d735869936056345cb3a87523033f8a1c7c29e43c684" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.694407 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-httpd" containerID="cri-o://b0b7fbcae4e87a8e2c4501e0a9f4a0d919d679b8183c68e6faeceada12e60e1e" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.710870 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 
23:13:52.711089 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/watcher-applier-0" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerName="watcher-applier" containerID="cri-o://4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.737739 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.738046 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-log" containerID="cri-o://781cd384eff78e55f6cd571c577fc758748a027bc29a451c8ad2e11b486203dd" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.738513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-httpd" containerID="cri-o://4afb199e076e08bd1f320a82a9ae4355b464db743e01e9a15d1354569ef3e0d8" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.752486 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.757719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.757994 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="cinder-scheduler" containerID="cri-o://3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.758146 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="probe" containerID="cri-o://ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.783686 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.784083 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-central-agent" containerID="cri-o://25852a0b06abaef1152c482d4491725ea52efdac6ad8a6d9de42b23da10cf0ee" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.784517 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="proxy-httpd" containerID="cri-o://7837b80e02d030e2ce03ff4cee41239d1427301430cd07ae2624f6088414713f" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.784570 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="sg-core" containerID="cri-o://2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889" 
gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.784611 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-notification-agent" containerID="cri-o://ba3004e2d56e03c4b37e9b30e64983c6fd5e37ab9a10b78ca4fc812d194cf0c1" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.792943 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.798685 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-qpw45"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.805681 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-qpw45"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.813423 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.813728 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api-log" containerID="cri-o://7fa5493ab547d6dfa6b51ddf09c04288a1a6dcaaf5124b6386a2e61f891bb6d0" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.814119 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api" containerID="cri-o://1decb37a6a7c1d14ac75639b689c1ba474b929273efe96a724937a53898e5fcd" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.837531 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-zn648"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.844424 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-zn648"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.867727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.888938 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-4d3d-account-create-update-69ll5"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.907336 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.909210 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" containerID="cri-o://f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.909659 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" containerID="cri-o://14b424c8af972f0138ff073374e0e6c981d49cdcb262e43e51bfaad6dc967e68" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.937914 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 
23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.938140 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener-log" containerID="cri-o://a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.938313 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener" containerID="cri-o://e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.966085 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.966366 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker-log" containerID="cri-o://4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5" gracePeriod=30 Jan 20 23:13:52 crc kubenswrapper[5030]: I0120 23:13:52.966864 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker" containerID="cri-o://d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.008321 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-r4dcm"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.024685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-r4dcm"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.065815 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.077552 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.082265 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api-log" containerID="cri-o://253233ff162c9e7305f61b0faa0810f0949d931fef29e1de6252df9e9b3381e3" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.083080 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api" containerID="cri-o://e40e9ac3a45289a55deb5dd5f6e1a134c721f434436b873c0c20a597937dd46e" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.107428 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.107661 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" 
podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" containerID="cri-o://f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.107789 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" containerID="cri-o://5191d760f933dad69163b25ccde46995e6d88d09b8f999f482abfafa83968828" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.179573 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cdk2h"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.192424 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.203675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-lgbqp"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.218585 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cdk2h"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.276291 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="rabbitmq" containerID="cri-o://0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8" gracePeriod=604800 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.289911 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-dbc5-account-create-update-dr9qt"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.318030 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2354629-cb98-42f4-bad0-675148557a8b" containerID="4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.318102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerDied","Data":"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.337271 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerID="f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.337330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerDied","Data":"f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.354099 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-db-create-lgbqp"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.354334 5030 generic.go:334] "Generic (PLEG): container finished" podID="276869e9-7303-4347-a105-fdc604a60324" containerID="fbd9bf9f73aba592b00498bb42493eb96178822d5c2e8da3dcd3907782e1e416" exitCode=137 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.369719 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" 
containerID="781cd384eff78e55f6cd571c577fc758748a027bc29a451c8ad2e11b486203dd" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.369790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerDied","Data":"781cd384eff78e55f6cd571c577fc758748a027bc29a451c8ad2e11b486203dd"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.373597 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-mbn7q"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.395111 5030 generic.go:334] "Generic (PLEG): container finished" podID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerID="9a976ba4a26e8a14c9b5d735869936056345cb3a87523033f8a1c7c29e43c684" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.395241 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerDied","Data":"9a976ba4a26e8a14c9b5d735869936056345cb3a87523033f8a1c7c29e43c684"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.405323 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-mbn7q"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.406043 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_1ba022bd-4168-4c45-a924-061342f680dd/ovsdbserver-sb/0.log" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.406106 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ba022bd-4168-4c45-a924-061342f680dd" containerID="9ec086e5497da0177834476824c0a381c6c4c92f9315ab92e56f076e2aee3773" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.406195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerDied","Data":"9ec086e5497da0177834476824c0a381c6c4c92f9315ab92e56f076e2aee3773"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.412191 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerID="7fa5493ab547d6dfa6b51ddf09c04288a1a6dcaaf5124b6386a2e61f891bb6d0" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.412260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerDied","Data":"7fa5493ab547d6dfa6b51ddf09c04288a1a6dcaaf5124b6386a2e61f891bb6d0"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.422733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-22qhj"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.426950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"6d52d930-f169-42d5-a93d-d9d30906d5c8","Type":"ContainerDied","Data":"d9c1cff779d09e8c608fde6674adc1a0a6cc2298c8dcae729b529c90c9b85c3a"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.426988 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9c1cff779d09e8c608fde6674adc1a0a6cc2298c8dcae729b529c90c9b85c3a" Jan 20 23:13:53 crc kubenswrapper[5030]: E0120 23:13:53.433578 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:13:53 
crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:13:53 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:13:53 crc kubenswrapper[5030]: else Jan 20 23:13:53 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:13:53 crc kubenswrapper[5030]: fi Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:13:53 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:13:53 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:13:53 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:13:53 crc kubenswrapper[5030]: # support updates Jan 20 23:13:53 crc kubenswrapper[5030]: Jan 20 23:13:53 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:13:53 crc kubenswrapper[5030]: E0120 23:13:53.435462 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" podUID="7100ef4a-f07d-4531-81f9-65fb79ae49d6" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.435647 5030 generic.go:334] "Generic (PLEG): container finished" podID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerID="a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.435704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerDied","Data":"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.441651 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-22qhj"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.458183 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerID="2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889" exitCode=2 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.458293 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.458322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerDied","Data":"2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889"} 
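The entries above record a namespace-wide teardown of openstack-kuttl-tests: SyncLoop DELETE events arrive from the API, the kubelet kills each container with its configured grace period, PLEG then reports the containers as finished (exit code 143 for a SIGTERM that was honoured, 137 for a SIGKILL, 0 or 2 for processes that stopped on their own), and setup for objects that are already gone starts to fail (the missing rabbitmq-config-data configmap, the missing nova-api-db-secret). The sketch below is a minimal, hypothetical Python helper for summarizing a capture like this one; the helper itself, its default file name (kubelet.log), and the chosen patterns are editorial assumptions based only on the message formats visible above, not part of the original log.

```python
#!/usr/bin/env python3
"""Hypothetical helper: summarize a kubelet journal capture like the one above.

Assumes the journal text was saved verbatim to a plain file and that the
message formats match the ones visible in this capture. A reading aid for
bulk-teardown logs, not an official tool.
"""
import re
import sys
from collections import Counter

# Entries sometimes wrap across physical lines in a capture like this one,
# so match against the whole text with DOTALL instead of line by line.
KILL_RE = re.compile(
    r'"Killing container with a grace period" pod="([^"]+)"'
    r'.*?containerName="([^"]+)".*?gracePeriod=(\d+)', re.DOTALL)
DIED_RE = re.compile(
    r'"Generic \(PLEG\): container finished".*?exitCode=(\d+)', re.DOTALL)
MOUNT_FAIL_RE = re.compile(r'MountVolume\.SetUp failed for volume "([^"]+)"')


def summarize(path: str) -> None:
    with open(path, encoding="utf-8", errors="replace") as fh:
        text = fh.read()

    kills = KILL_RE.findall(text)
    print(f"containers asked to stop: {len(kills)}")
    for pod, container, grace in kills:
        print(f"  {pod} / {container} (gracePeriod={grace}s)")

    # 143 = 128+SIGTERM, 137 = 128+SIGKILL, 0 = clean exit, 2 = error exit.
    exit_codes = Counter(int(code) for code in DIED_RE.findall(text))
    print(f"exit codes reported by PLEG: {dict(sorted(exit_codes.items()))}")

    mount_failures = Counter(MOUNT_FAIL_RE.findall(text))
    print(f"volume setup failures: {dict(mount_failures)}")


if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")
```

Run against a saved copy of the journal, e.g. `python3 summarize_teardown.py kubelet.log`; on this capture it would report the swift-storage-0, ceilometer-0, glance, cinder, nova and barbican container kills, the mix of 0/2/137/143 exit codes, and the repeated config-data mount failure.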
Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.462868 5030 generic.go:334] "Generic (PLEG): container finished" podID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerID="23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.462927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerDied","Data":"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.466795 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.466857 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b296da3-d86c-431b-889e-a38437804d3a" containerID="6959375e26fa2c4a22be160d0bdb837d17a80304dda7a78b4f10357d3e5cdc2d" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.466903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerDied","Data":"6959375e26fa2c4a22be160d0bdb837d17a80304dda7a78b4f10357d3e5cdc2d"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.468773 5030 generic.go:334] "Generic (PLEG): container finished" podID="70ab6da8-893f-49a3-867f-02b2f7395331" containerID="12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0" exitCode=143 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.468815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerDied","Data":"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.474964 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_1ba022bd-4168-4c45-a924-061342f680dd/ovsdbserver-sb/0.log" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.475025 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.479255 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-8pxqj"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482543 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="df21f467aa48e9cd73ecf780edb3fde78c478ee9e3e1fe10188497a5fc87d1a4" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482566 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="09c4955040ae79ee588305769f26a9faab7bd5b88198cf480c047e86852ecd2d" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482573 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="237d5c77a55c616b660a6d8a7a0789ffbbf5636dcf37dc9b4177236859da902e" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482579 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="02d2cd5180513116470bf587dced9406e0cccb4577828e686f7ad2f6e7756b90" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482585 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="a0b0fd6857b1e7dc5deae4a9fc1014ddd91b0f25a6c5eabcb93bc1c9b7ad0c4e" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482592 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="39ce2002c75ebf0e4ceffe55cf58454e136385b8b14ce8fbfd7b9e95218bc1d4" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482599 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="8cfd138e7964a40907591a741551bede8ca3e6f85ec899d199026118111003c6" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482605 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="9822aac9b52040694bad86d906341b866e4efd664919a2f1c1b84e448aa91fd2" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482612 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="6139d1ebd8fe4bd1772b00058abe6fe6f57fc73bf53a9abf6d30f7b0b31612d5" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482646 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482653 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="0cd31ae9c67288c5d006a85915b20396118db52b0eb4af518b3f1e74a1973aad" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482660 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="3baf6bd0266a5a9fd8e4a8400417e2564cf33cb06f85ced28b5396ab30bdbc41" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482666 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" 
containerID="a98d6c66eb67d678dafdd47b3afa23c516a4bb86312f8871a44e29574063d568" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.482675 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="54da1c1411ab5ceca784346a76d3a2e8df0bf52604dc8fe074f570c94f675b9c" exitCode=0 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"df21f467aa48e9cd73ecf780edb3fde78c478ee9e3e1fe10188497a5fc87d1a4"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"09c4955040ae79ee588305769f26a9faab7bd5b88198cf480c047e86852ecd2d"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"237d5c77a55c616b660a6d8a7a0789ffbbf5636dcf37dc9b4177236859da902e"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"02d2cd5180513116470bf587dced9406e0cccb4577828e686f7ad2f6e7756b90"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"a0b0fd6857b1e7dc5deae4a9fc1014ddd91b0f25a6c5eabcb93bc1c9b7ad0c4e"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"39ce2002c75ebf0e4ceffe55cf58454e136385b8b14ce8fbfd7b9e95218bc1d4"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"8cfd138e7964a40907591a741551bede8ca3e6f85ec899d199026118111003c6"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"9822aac9b52040694bad86d906341b866e4efd664919a2f1c1b84e448aa91fd2"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"6139d1ebd8fe4bd1772b00058abe6fe6f57fc73bf53a9abf6d30f7b0b31612d5"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483266 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483275 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"0cd31ae9c67288c5d006a85915b20396118db52b0eb4af518b3f1e74a1973aad"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"3baf6bd0266a5a9fd8e4a8400417e2564cf33cb06f85ced28b5396ab30bdbc41"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"a98d6c66eb67d678dafdd47b3afa23c516a4bb86312f8871a44e29574063d568"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.483300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"54da1c1411ab5ceca784346a76d3a2e8df0bf52604dc8fe074f570c94f675b9c"} Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.485124 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_682ea052-23fc-4878-ae8e-b5887a6e83b8/ovsdbserver-nb/0.log" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.485285 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.511239 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-8pxqj"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.522800 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.533784 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2llnv"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn68h\" (UniqueName: \"kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 
23:13:53.539709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs\") pod \"6d52d930-f169-42d5-a93d-d9d30906d5c8\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle\") pod \"6d52d930-f169-42d5-a93d-d9d30906d5c8\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539916 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfjfs\" (UniqueName: \"kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs\") pod \"6d52d930-f169-42d5-a93d-d9d30906d5c8\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config\") pod \"6d52d930-f169-42d5-a93d-d9d30906d5c8\" (UID: \"6d52d930-f169-42d5-a93d-d9d30906d5c8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.539980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 
crc kubenswrapper[5030]: I0120 23:13:53.540136 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs\") pod \"1ba022bd-4168-4c45-a924-061342f680dd\" (UID: \"1ba022bd-4168-4c45-a924-061342f680dd\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.540317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpxbp\" (UniqueName: \"kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp\") pod \"682ea052-23fc-4878-ae8e-b5887a6e83b8\" (UID: \"682ea052-23fc-4878-ae8e-b5887a6e83b8\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.542700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config" (OuterVolumeSpecName: "config") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.544413 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.545804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config" (OuterVolumeSpecName: "config") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.546570 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts" (OuterVolumeSpecName: "scripts") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.547146 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts" (OuterVolumeSpecName: "scripts") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.547209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.567009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.570613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.571494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp" (OuterVolumeSpecName: "kube-api-access-lpxbp") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "kube-api-access-lpxbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.571614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.573065 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.573462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a311ea86198bf17bcf45cd98e7b6fc056d1db58bf1b30b8a6c217b794dae71eb" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.587171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h" (OuterVolumeSpecName: "kube-api-access-zn68h") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "kube-api-access-zn68h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.587243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs" (OuterVolumeSpecName: "kube-api-access-qfjfs") pod "6d52d930-f169-42d5-a93d-d9d30906d5c8" (UID: "6d52d930-f169-42d5-a93d-d9d30906d5c8"). InnerVolumeSpecName "kube-api-access-qfjfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.600335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.605597 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.607695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "6d52d930-f169-42d5-a93d-d9d30906d5c8" (UID: "6d52d930-f169-42d5-a93d-d9d30906d5c8"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.614048 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.614316 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f8fe612c-1c3a-4269-8177-c09b217721c0" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.624413 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-hk8d9"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.631823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.639018 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.639325 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="alertmanager" containerID="cri-o://a4bc8dbaee8fec446141a83feeeaff6bfbd90d9c9c809d618868cdf3d15fa788" gracePeriod=120 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.639806 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="config-reloader" containerID="cri-o://1cc12b6243a15a276fb893fecf16ad79a1fa615e0561dc80434a104cb6551b4c" gracePeriod=120 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.644748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.646901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config\") pod \"276869e9-7303-4347-a105-fdc604a60324\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647011 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret\") pod \"276869e9-7303-4347-a105-fdc604a60324\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs\") pod \"276869e9-7303-4347-a105-fdc604a60324\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle\") pod \"276869e9-7303-4347-a105-fdc604a60324\" (UID: \"276869e9-7303-4347-a105-fdc604a60324\") " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647613 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647647 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ba022bd-4168-4c45-a924-061342f680dd-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647667 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647679 5030 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647689 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647698 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647707 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpxbp\" (UniqueName: \"kubernetes.io/projected/682ea052-23fc-4878-ae8e-b5887a6e83b8-kube-api-access-lpxbp\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647717 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/682ea052-23fc-4878-ae8e-b5887a6e83b8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647731 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647740 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn68h\" (UniqueName: \"kubernetes.io/projected/1ba022bd-4168-4c45-a924-061342f680dd-kube-api-access-zn68h\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647749 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ba022bd-4168-4c45-a924-061342f680dd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.647758 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfjfs\" (UniqueName: \"kubernetes.io/projected/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-api-access-qfjfs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.650050 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="prometheus" containerID="cri-o://e53a19888fbe4b142de355206d3ab31f28f294382a2552dd84cc805eb5b41bed" gracePeriod=600 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.650716 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="thanos-sidecar" containerID="cri-o://06d13adbffbcd724575da3d439b5c58b3043b222f3e76d916058a23944d86b4f" gracePeriod=600 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.650789 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/prometheus-metric-storage-0" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="config-reloader" containerID="cri-o://722d4093088b0113351d34ef93f46a8253354211420a96de0ce82f8dc64ebd7b" gracePeriod=600 Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.674990 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.675243 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" containerID="cri-o://52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" gracePeriod=30 Jan 20 23:13:53 crc kubenswrapper[5030]: E0120 23:13:53.678530 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a9dd1f_f075_4833_b9e8_3fc856743ab8.slice/crio-conmon-8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2354629_cb98_42f4_bad0_675148557a8b.slice/crio-4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode35248d6_8ad0_425b_a55d_ab390f26120c.slice/crio-a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9ae46d8_5c20_4710_90c6_b2d0120d9231.slice/crio-conmon-f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e84f01_7009_4bfc_8cfb_2af258ceed3c.slice/crio-conmon-f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7b0bf8f_9251_40a3_8177_ee7ec9aec622.slice/crio-2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2354629_cb98_42f4_bad0_675148557a8b.slice/crio-conmon-4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode35248d6_8ad0_425b_a55d_ab390f26120c.slice/crio-conmon-a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a9dd1f_f075_4833_b9e8_3fc856743ab8.slice/crio-8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e84f01_7009_4bfc_8cfb_2af258ceed3c.slice/crio-f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.683430 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.684057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs" (OuterVolumeSpecName: "kube-api-access-24fjs") pod "276869e9-7303-4347-a105-fdc604a60324" (UID: 
"276869e9-7303-4347-a105-fdc604a60324"). InnerVolumeSpecName "kube-api-access-24fjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.693097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d52d930-f169-42d5-a93d-d9d30906d5c8" (UID: "6d52d930-f169-42d5-a93d-d9d30906d5c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.709296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.740106 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.748682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.748744 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.749050 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24fjs\" (UniqueName: \"kubernetes.io/projected/276869e9-7303-4347-a105-fdc604a60324-kube-api-access-24fjs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.749077 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.749088 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.749097 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.798843 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.813717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "276869e9-7303-4347-a105-fdc604a60324" (UID: "276869e9-7303-4347-a105-fdc604a60324"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.852666 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.852704 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.859152 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.872409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "1ba022bd-4168-4c45-a924-061342f680dd" (UID: "1ba022bd-4168-4c45-a924-061342f680dd"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.925421 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "6d52d930-f169-42d5-a93d-d9d30906d5c8" (UID: "6d52d930-f169-42d5-a93d-d9d30906d5c8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.930874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "276869e9-7303-4347-a105-fdc604a60324" (UID: "276869e9-7303-4347-a105-fdc604a60324"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.938822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "276869e9-7303-4347-a105-fdc604a60324" (UID: "276869e9-7303-4347-a105-fdc604a60324"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.955272 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d52d930-f169-42d5-a93d-d9d30906d5c8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.955317 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.955331 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/276869e9-7303-4347-a105-fdc604a60324-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.955341 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ba022bd-4168-4c45-a924-061342f680dd-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.955354 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/276869e9-7303-4347-a105-fdc604a60324-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.965318 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:13:53 crc kubenswrapper[5030]: E0120 23:13:53.965584 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.966372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.972900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "682ea052-23fc-4878-ae8e-b5887a6e83b8" (UID: "682ea052-23fc-4878-ae8e-b5887a6e83b8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.986479 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025de003-1154-4871-b722-7d83c8aab74a" path="/var/lib/kubelet/pods/025de003-1154-4871-b722-7d83c8aab74a/volumes" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.987676 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047bfc50-9efa-45ba-936c-dd2e03010d76" path="/var/lib/kubelet/pods/047bfc50-9efa-45ba-936c-dd2e03010d76/volumes" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.992828 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b32995-1917-478d-b62b-66d4fba0cbee" path="/var/lib/kubelet/pods/18b32995-1917-478d-b62b-66d4fba0cbee/volumes" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.993476 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1983856c-2527-4397-9320-c20fb28c3eaf" path="/var/lib/kubelet/pods/1983856c-2527-4397-9320-c20fb28c3eaf/volumes" Jan 20 23:13:53 crc kubenswrapper[5030]: I0120 23:13:53.993986 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="229641c0-730e-4c1b-b5ba-89b3e250b1c9" path="/var/lib/kubelet/pods/229641c0-730e-4c1b-b5ba-89b3e250b1c9/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.002237 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276869e9-7303-4347-a105-fdc604a60324" path="/var/lib/kubelet/pods/276869e9-7303-4347-a105-fdc604a60324/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.003058 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e35c4d2-863f-4074-8bf1-41a8eabd2910" path="/var/lib/kubelet/pods/3e35c4d2-863f-4074-8bf1-41a8eabd2910/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.003787 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c90c31-4ceb-4fdb-a889-993ae7d16a18" path="/var/lib/kubelet/pods/40c90c31-4ceb-4fdb-a889-993ae7d16a18/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.006372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53324643-071d-4bf3-8ea7-702076e4941e" path="/var/lib/kubelet/pods/53324643-071d-4bf3-8ea7-702076e4941e/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.007085 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe039bb-1c22-493b-a545-1f2843b9330e" path="/var/lib/kubelet/pods/5fe039bb-1c22-493b-a545-1f2843b9330e/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.007804 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818882ee-6bdd-4fa5-8294-d2ffe317ac07" path="/var/lib/kubelet/pods/818882ee-6bdd-4fa5-8294-d2ffe317ac07/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.010038 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e13cbf-a921-49d9-ab19-631c7122b04c" path="/var/lib/kubelet/pods/91e13cbf-a921-49d9-ab19-631c7122b04c/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.010635 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9329b737-5b3e-4513-a65a-5797ad48998f" path="/var/lib/kubelet/pods/9329b737-5b3e-4513-a65a-5797ad48998f/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.011201 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b1cf97-227f-4c53-b2a4-b272d8803ae7" path="/var/lib/kubelet/pods/a9b1cf97-227f-4c53-b2a4-b272d8803ae7/volumes" 
Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.011767 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc090b44-6543-43f0-a4cb-4ee41c4fdab2" path="/var/lib/kubelet/pods/bc090b44-6543-43f0-a4cb-4ee41c4fdab2/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.014852 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df14d7d3-54c6-4a6c-996a-07be5a7f4261" path="/var/lib/kubelet/pods/df14d7d3-54c6-4a6c-996a-07be5a7f4261/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.017334 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feff49bf-c912-4af2-98b5-97ea9cca86e4" path="/var/lib/kubelet/pods/feff49bf-c912-4af2-98b5-97ea9cca86e4/volumes" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.057242 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.057272 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/682ea052-23fc-4878-ae8e-b5887a6e83b8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.140662 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.141207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-httpd" containerID="cri-o://444566938fcbfab3901e6fd0f098d24ad50b1a35701c21a72a39c1fc137f4568" gracePeriod=30 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.141382 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-server" containerID="cri-o://a603f5479f1751fca07b831722eb086379566bd909e2e23ac72bbb41392e0f4d" gracePeriod=30 Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.334641 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.337759 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.346386 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.346471 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/watcher-applier-0" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerName="watcher-applier" Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.467970 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.468037 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data podName:8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac nodeName:}" failed. No retries permitted until 2026-01-20 23:13:58.468014155 +0000 UTC m=+2310.788274433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data") pod "rabbitmq-server-0" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac") : configmap "rabbitmq-config-data" not found Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.486636 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.499936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" event={"ID":"7100ef4a-f07d-4531-81f9-65fb79ae49d6","Type":"ContainerStarted","Data":"e80ec98bf53751ebe37d7d6c513f1b18c80622c5dab2c8d3ae1e7724171ab863"} Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.506461 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:13:54 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:13:54 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:13:54 crc kubenswrapper[5030]: else Jan 20 23:13:54 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:13:54 crc kubenswrapper[5030]: fi Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:13:54 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:13:54 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:13:54 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:13:54 crc kubenswrapper[5030]: # support updates Jan 20 23:13:54 crc kubenswrapper[5030]: Jan 20 23:13:54 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.507821 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" podUID="7100ef4a-f07d-4531-81f9-65fb79ae49d6" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.515076 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" containerID="a311ea86198bf17bcf45cd98e7b6fc056d1db58bf1b30b8a6c217b794dae71eb" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.515155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b4eb966-3abf-4d3a-b65c-e05c8366c302","Type":"ContainerDied","Data":"a311ea86198bf17bcf45cd98e7b6fc056d1db58bf1b30b8a6c217b794dae71eb"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.524348 5030 generic.go:334] "Generic (PLEG): container finished" podID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerID="f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d" exitCode=143 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.524453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerDied","Data":"f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.536884 5030 generic.go:334] "Generic (PLEG): container finished" podID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerID="516e9e3aaabd9e815871f9c17822c1a181f04d33bbf8a76cb12858b29bd5fad3" exitCode=1 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.537006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-sks9w" event={"ID":"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121","Type":"ContainerDied","Data":"516e9e3aaabd9e815871f9c17822c1a181f04d33bbf8a76cb12858b29bd5fad3"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.537034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-sks9w" event={"ID":"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121","Type":"ContainerStarted","Data":"d92021bd9f26a8919bd8a03fd3d9d6c347c4b6cecb9b84ab99d1a599dd0e6c13"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.537816 5030 scope.go:117] "RemoveContainer" containerID="516e9e3aaabd9e815871f9c17822c1a181f04d33bbf8a76cb12858b29bd5fad3" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.545296 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_1ba022bd-4168-4c45-a924-061342f680dd/ovsdbserver-sb/0.log" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.545389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"1ba022bd-4168-4c45-a924-061342f680dd","Type":"ContainerDied","Data":"635d96c387b5636a89141b1cc720c935902a4f4347411c5eb8d998372c7330fe"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.545427 5030 scope.go:117] "RemoveContainer" 
containerID="ae6ead157a67c32e8e554cbbc941bac6690f0a81f5616ed9ead93667b7e0e30f" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.545639 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.573553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjrml\" (UniqueName: \"kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.574275 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.574320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.574487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.575797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.575850 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data\") pod \"39d2f867-199b-446c-9f02-c8b2f842df63\" (UID: \"39d2f867-199b-446c-9f02-c8b2f842df63\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.587580 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerID="7837b80e02d030e2ce03ff4cee41239d1427301430cd07ae2624f6088414713f" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.587777 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerID="25852a0b06abaef1152c482d4491725ea52efdac6ad8a6d9de42b23da10cf0ee" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.587873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerDied","Data":"7837b80e02d030e2ce03ff4cee41239d1427301430cd07ae2624f6088414713f"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.587903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerDied","Data":"25852a0b06abaef1152c482d4491725ea52efdac6ad8a6d9de42b23da10cf0ee"} Jan 20 23:13:54 crc 
kubenswrapper[5030]: I0120 23:13:54.588005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.595764 5030 generic.go:334] "Generic (PLEG): container finished" podID="f331453a-4129-49ea-b554-06f628abf66a" containerID="1cc12b6243a15a276fb893fecf16ad79a1fa615e0561dc80434a104cb6551b4c" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.595872 5030 generic.go:334] "Generic (PLEG): container finished" podID="f331453a-4129-49ea-b554-06f628abf66a" containerID="a4bc8dbaee8fec446141a83feeeaff6bfbd90d9c9c809d618868cdf3d15fa788" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.595940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerDied","Data":"1cc12b6243a15a276fb893fecf16ad79a1fa615e0561dc80434a104cb6551b4c"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.596009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerDied","Data":"a4bc8dbaee8fec446141a83feeeaff6bfbd90d9c9c809d618868cdf3d15fa788"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.599129 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.602743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml" (OuterVolumeSpecName: "kube-api-access-pjrml") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "kube-api-access-pjrml". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.612642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.624592 5030 scope.go:117] "RemoveContainer" containerID="9ec086e5497da0177834476824c0a381c6c4c92f9315ab92e56f076e2aee3773" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.624918 5030 generic.go:334] "Generic (PLEG): container finished" podID="39d2f867-199b-446c-9f02-c8b2f842df63" containerID="ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.624942 5030 generic.go:334] "Generic (PLEG): container finished" podID="39d2f867-199b-446c-9f02-c8b2f842df63" containerID="3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.624990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerDied","Data":"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.625019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerDied","Data":"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.625033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"39d2f867-199b-446c-9f02-c8b2f842df63","Type":"ContainerDied","Data":"375a932709a0124814adc43ac08fff5d02401d9ced857ef46ee28f723910fa85"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.625097 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.628284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts" (OuterVolumeSpecName: "scripts") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.636592 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.640852 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerID="253233ff162c9e7305f61b0faa0810f0949d931fef29e1de6252df9e9b3381e3" exitCode=143 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.640982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerDied","Data":"253233ff162c9e7305f61b0faa0810f0949d931fef29e1de6252df9e9b3381e3"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659769 5030 generic.go:334] "Generic (PLEG): container finished" podID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerID="06d13adbffbcd724575da3d439b5c58b3043b222f3e76d916058a23944d86b4f" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659824 5030 generic.go:334] "Generic (PLEG): container finished" podID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerID="722d4093088b0113351d34ef93f46a8253354211420a96de0ce82f8dc64ebd7b" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659837 5030 generic.go:334] "Generic (PLEG): container finished" podID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerID="e53a19888fbe4b142de355206d3ab31f28f294382a2552dd84cc805eb5b41bed" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerDied","Data":"06d13adbffbcd724575da3d439b5c58b3043b222f3e76d916058a23944d86b4f"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerDied","Data":"722d4093088b0113351d34ef93f46a8253354211420a96de0ce82f8dc64ebd7b"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.659957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerDied","Data":"e53a19888fbe4b142de355206d3ab31f28f294382a2552dd84cc805eb5b41bed"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.665248 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.677205 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.679246 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerID="fc25b97d9ab3c28c713e451e309b06690ce958169a7863890665189163eec9f1" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.679338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" event={"ID":"5c2cc197-a38a-41a4-8c40-c65602a593ea","Type":"ContainerDied","Data":"fc25b97d9ab3c28c713e451e309b06690ce958169a7863890665189163eec9f1"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.679373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" event={"ID":"5c2cc197-a38a-41a4-8c40-c65602a593ea","Type":"ContainerStarted","Data":"f07eb9d7bce3c2d9348ee2081a98fee3bcc609c384a002ba9686ac930a2bbd0f"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.685053 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjrml\" (UniqueName: \"kubernetes.io/projected/39d2f867-199b-446c-9f02-c8b2f842df63-kube-api-access-pjrml\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.685093 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/39d2f867-199b-446c-9f02-c8b2f842df63-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.685106 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.685119 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.687452 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_682ea052-23fc-4878-ae8e-b5887a6e83b8/ovsdbserver-nb/0.log" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.687670 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.688176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"682ea052-23fc-4878-ae8e-b5887a6e83b8","Type":"ContainerDied","Data":"b603bfe9e8c779c496279c300c8a563436121eb38e61ef6fbc644da2b6132a0d"} Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.694873 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.702735 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.161:5671: connect: connection refused" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.703202 5030 generic.go:334] "Generic (PLEG): container finished" podID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerID="444566938fcbfab3901e6fd0f098d24ad50b1a35701c21a72a39c1fc137f4568" exitCode=0 Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.703315 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.704299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerDied","Data":"444566938fcbfab3901e6fd0f098d24ad50b1a35701c21a72a39c1fc137f4568"} Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.704462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m5dvm" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="registry-server" containerID="cri-o://76189224976e8521cfe7d9597e16f26ebe7d1e854b742ed413b053b50b1fb456" gracePeriod=2 Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.717944 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:54 crc kubenswrapper[5030]: E0120 23:13:54.718002 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.763075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data" (OuterVolumeSpecName: "config-data") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.786819 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.797792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39d2f867-199b-446c-9f02-c8b2f842df63" (UID: "39d2f867-199b-446c-9f02-c8b2f842df63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.891804 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d2f867-199b-446c-9f02-c8b2f842df63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.918020 5030 scope.go:117] "RemoveContainer" containerID="ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.933877 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.992827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs\") pod \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.992869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zc5v\" (UniqueName: \"kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v\") pod \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.993104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle\") pod \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.993147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs\") pod \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " Jan 20 23:13:54 crc kubenswrapper[5030]: I0120 23:13:54.993178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data\") pod \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\" (UID: \"0b4eb966-3abf-4d3a-b65c-e05c8366c302\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.001772 5030 scope.go:117] "RemoveContainer" containerID="3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.027559 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.056176 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.063437 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v" (OuterVolumeSpecName: "kube-api-access-7zc5v") pod "0b4eb966-3abf-4d3a-b65c-e05c8366c302" (UID: "0b4eb966-3abf-4d3a-b65c-e05c8366c302"). InnerVolumeSpecName "kube-api-access-7zc5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.066331 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.073280 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.080090 5030 scope.go:117] "RemoveContainer" containerID="ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.080601 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68\": container with ID starting with ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68 not found: ID does not exist" containerID="ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.080664 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68"} err="failed to get container status \"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68\": rpc error: code = NotFound desc = could not find container \"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68\": container with ID starting with ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68 not found: ID does not exist" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.080697 5030 scope.go:117] "RemoveContainer" containerID="3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.080934 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444\": container with ID starting with 3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444 not found: ID does not exist" containerID="3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.080963 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444"} err="failed to get container status \"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444\": rpc error: code = NotFound desc = could not find container \"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444\": container with ID starting with 3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444 not found: ID does not exist" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.080976 5030 
scope.go:117] "RemoveContainer" containerID="ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.081201 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68"} err="failed to get container status \"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68\": rpc error: code = NotFound desc = could not find container \"ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68\": container with ID starting with ff0f7da03801e35a0e5a845e23cbb679b9e4ce6bb7e45f4cbb29430c60f89d68 not found: ID does not exist" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.081216 5030 scope.go:117] "RemoveContainer" containerID="3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.082205 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444"} err="failed to get container status \"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444\": rpc error: code = NotFound desc = could not find container \"3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444\": container with ID starting with 3cb809cfffdf78e0a5a59e9b6eefcde2297faba93ed46d36968b6566e7e3a444 not found: ID does not exist" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.082436 5030 scope.go:117] "RemoveContainer" containerID="fbd9bf9f73aba592b00498bb42493eb96178822d5c2e8da3dcd3907782e1e416" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.082451 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.088925 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.096233 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.096303 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:55.596285839 +0000 UTC m=+2307.916546127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-internal-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.096489 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zc5v\" (UniqueName: \"kubernetes.io/projected/0b4eb966-3abf-4d3a-b65c-e05c8366c302-kube-api-access-7zc5v\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.096955 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.096983 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:55.596976486 +0000 UTC m=+2307.917236774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "combined-ca-bundle" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.097219 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.097290 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:55.597274193 +0000 UTC m=+2307.917534481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-public-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.098822 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.111860 5030 scope.go:117] "RemoveContainer" containerID="593b160210b2fb95026561fbd077c54222972c387ef711518ba53a962cd1ab83" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.144757 5030 scope.go:117] "RemoveContainer" containerID="84e53198c624c2bd25c3ed46fb9e8cf0112438a848dd4e30d479645f7a46beb6" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.150996 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.197840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.197892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.197982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfkm5\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2\") pod 
\"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.198540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets\") pod \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\" (UID: \"db1354ec-1e74-42e4-8eda-ffcba8af62a8\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.204302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.206992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.207597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.226260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config" (OuterVolumeSpecName: "config") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.226332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out" (OuterVolumeSpecName: "config-out") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.229227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5" (OuterVolumeSpecName: "kube-api-access-bfkm5") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "kube-api-access-bfkm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.248308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.253399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.253553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.261126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.266512 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.274527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.300645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.300853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.300903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.300985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.301133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn95g\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.301223 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.301291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config\") pod \"f331453a-4129-49ea-b554-06f628abf66a\" (UID: \"f331453a-4129-49ea-b554-06f628abf66a\") " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db" (OuterVolumeSpecName: "alertmanager-metric-storage-db") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). 
InnerVolumeSpecName "alertmanager-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303766 5030 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303787 5030 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303803 5030 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303818 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303832 5030 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303845 5030 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.303858 5030 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304361 5030 reconciler_common.go:293] "Volume detached for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-alertmanager-metric-storage-db\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304378 5030 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304418 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfkm5\" (UniqueName: \"kubernetes.io/projected/db1354ec-1e74-42e4-8eda-ffcba8af62a8-kube-api-access-bfkm5\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304431 5030 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/db1354ec-1e74-42e4-8eda-ffcba8af62a8-config-out\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304443 5030 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/db1354ec-1e74-42e4-8eda-ffcba8af62a8-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.304508 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.337942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume" (OuterVolumeSpecName: "config-volume") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.338034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g" (OuterVolumeSpecName: "kube-api-access-wn95g") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "kube-api-access-wn95g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.354252 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.354482 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerName="memcached" containerID="cri-o://5c57eec86df36ec8b65b60ad4ae937344c23be3ea71cb4377819fbc39e8de8e1" gracePeriod=30 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.383525 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.397607 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-tzcng"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.397680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267"] Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.397993 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="cinder-scheduler" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398004 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="cinder-scheduler" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398015 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="init-config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398023 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="init-config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398035 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="thanos-sidecar" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398041 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="thanos-sidecar" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398049 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398055 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398075 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398088 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="ovsdbserver-sb" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398093 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="ovsdbserver-sb" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398117 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398128 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398155 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="prometheus" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398171 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="prometheus" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398179 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="alertmanager" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398185 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="alertmanager" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="probe" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398201 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="probe" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398211 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="ovsdbserver-nb" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398217 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="ovsdbserver-nb" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398227 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398251 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398260 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" containerName="kube-state-metrics" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398266 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" containerName="kube-state-metrics" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.398277 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="init-config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="init-config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398437 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" containerName="kube-state-metrics" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398448 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="alertmanager" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398458 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="thanos-sidecar" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398497 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398505 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="prometheus" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398515 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="probe" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398523 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba022bd-4168-4c45-a924-061342f680dd" containerName="ovsdbserver-sb" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398536 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="openstack-network-exporter" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398544 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" containerName="config-reloader" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398554 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" containerName="ovsdbserver-nb" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.398563 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" containerName="cinder-scheduler" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.399120 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.401577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out" (OuterVolumeSpecName: "config-out") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.405439 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.406893 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn95g\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-kube-api-access-wn95g\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.406923 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.406931 5030 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f331453a-4129-49ea-b554-06f628abf66a-config-out\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.407002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.422827 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kpvj"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.423167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data" (OuterVolumeSpecName: "config-data") pod "0b4eb966-3abf-4d3a-b65c-e05c8366c302" (UID: "0b4eb966-3abf-4d3a-b65c-e05c8366c302"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.444249 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6kpvj"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.452605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.486176 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-sfwxf"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.509771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.509872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t527\" (UniqueName: \"kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.509970 5030 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f331453a-4129-49ea-b554-06f628abf66a-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.509981 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.568333 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-sfwxf"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.568982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "0b4eb966-3abf-4d3a-b65c-e05c8366c302" (UID: "0b4eb966-3abf-4d3a-b65c-e05c8366c302"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.575906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b4eb966-3abf-4d3a-b65c-e05c8366c302" (UID: "0b4eb966-3abf-4d3a-b65c-e05c8366c302"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.580807 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.589518 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-v9ngp"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.595504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "0b4eb966-3abf-4d3a-b65c-e05c8366c302" (UID: "0b4eb966-3abf-4d3a-b65c-e05c8366c302"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.600826 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.610488 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.611976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.612089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t527\" (UniqueName: \"kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612193 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.612245 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612260 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.112242443 +0000 UTC m=+2308.432502731 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : configmap "openstack-scripts" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.612287 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.612303 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.612316 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b4eb966-3abf-4d3a-b65c-e05c8366c302-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612327 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612369 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.612354505 +0000 UTC m=+2308.932614793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-public-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612520 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612542 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.6125354 +0000 UTC m=+2308.932795788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "combined-ca-bundle" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612573 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.612591 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.612586161 +0000 UTC m=+2308.932846449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-internal-svc" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.617266 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4t527 for pod openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.617344 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527 podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.117328396 +0000 UTC m=+2308.437588684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4t527" (UniqueName: "kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.623101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.623285 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-v9ngp"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.634906 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.638905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config" (OuterVolumeSpecName: "web-config") pod "f331453a-4129-49ea-b554-06f628abf66a" (UID: "f331453a-4129-49ea-b554-06f628abf66a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.642305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.664892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config" (OuterVolumeSpecName: "web-config") pod "db1354ec-1e74-42e4-8eda-ffcba8af62a8" (UID: "db1354ec-1e74-42e4-8eda-ffcba8af62a8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.684234 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.1.220:9322/\": read tcp 10.217.0.2:44824->10.217.1.220:9322: read: connection reset by peer" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.684304 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/watcher-api-0" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.1.220:9322/\": read tcp 10.217.0.2:44828->10.217.1.220:9322: read: connection reset by peer" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.714614 5030 reconciler_common.go:293] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-cluster-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.714658 5030 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/db1354ec-1e74-42e4-8eda-ffcba8af62a8-web-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.714669 5030 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f331453a-4129-49ea-b554-06f628abf66a-web-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.729015 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerID="ba3004e2d56e03c4b37e9b30e64983c6fd5e37ab9a10b78ca4fc812d194cf0c1" exitCode=0 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.729082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerDied","Data":"ba3004e2d56e03c4b37e9b30e64983c6fd5e37ab9a10b78ca4fc812d194cf0c1"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.729109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e7b0bf8f-9251-40a3-8177-ee7ec9aec622","Type":"ContainerDied","Data":"6b235e1482c00f139f72258b244e425359558c1c5e1e3b82b55ef29e51500c6c"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.729120 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b235e1482c00f139f72258b244e425359558c1c5e1e3b82b55ef29e51500c6c" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.732086 5030 generic.go:334] "Generic (PLEG): container finished" podID="93de7225-d504-409c-b4b7-8e0826937e9d" containerID="76189224976e8521cfe7d9597e16f26ebe7d1e854b742ed413b053b50b1fb456" exitCode=0 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.732168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerDied","Data":"76189224976e8521cfe7d9597e16f26ebe7d1e854b742ed413b053b50b1fb456"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.732323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5dvm" 
event={"ID":"93de7225-d504-409c-b4b7-8e0826937e9d","Type":"ContainerDied","Data":"02ab49a1053ef22b4bd08283adf64ac364fb532b2a50f0b4570a3ca1e2438562"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.732353 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ab49a1053ef22b4bd08283adf64ac364fb532b2a50f0b4570a3ca1e2438562" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.743021 5030 generic.go:334] "Generic (PLEG): container finished" podID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerID="e959b9a2c88101b1abc84c9677d1da8781c558a6553ea4387f6edbb3c88f918b" exitCode=1 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.743077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-sks9w" event={"ID":"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121","Type":"ContainerDied","Data":"e959b9a2c88101b1abc84c9677d1da8781c558a6553ea4387f6edbb3c88f918b"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.743110 5030 scope.go:117] "RemoveContainer" containerID="516e9e3aaabd9e815871f9c17822c1a181f04d33bbf8a76cb12858b29bd5fad3" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.743609 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-sks9w" secret="" err="secret \"galera-openstack-dockercfg-rzlfs\" not found" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.743659 5030 scope.go:117] "RemoveContainer" containerID="e959b9a2c88101b1abc84c9677d1da8781c558a6553ea4387f6edbb3c88f918b" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.744013 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-sks9w_openstack-kuttl-tests(ee0a79ab-ccac-4a83-9e3e-7f1f232f2121)\"" pod="openstack-kuttl-tests/root-account-create-update-sks9w" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.763998 5030 generic.go:334] "Generic (PLEG): container finished" podID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerID="a603f5479f1751fca07b831722eb086379566bd909e2e23ac72bbb41392e0f4d" exitCode=0 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.764160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerDied","Data":"a603f5479f1751fca07b831722eb086379566bd909e2e23ac72bbb41392e0f4d"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.764199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" event={"ID":"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c","Type":"ContainerDied","Data":"4a64b3c6378f7152c57a04b7dc2b20f750c7d82ab0081e988883ad85a13d61b7"} Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.764221 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.764249 5030 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4a64b3c6378f7152c57a04b7dc2b20f750c7d82ab0081e988883ad85a13d61b7" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.766522 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.771414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"0b4eb966-3abf-4d3a-b65c-e05c8366c302","Type":"ContainerDied","Data":"bfbfa3697dcedc21eed3e8c48da58f623c89e02dbc3c4db25b2696963e479e52"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.771778 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.772870 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.772930 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.774280 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b296da3-d86c-431b-889e-a38437804d3a" containerID="bf4743f2f827f2f4f9f2f8517fc9b12848a8b1ea2c24f37ec0a0b7b4cefab31e" exitCode=0 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.774344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerDied","Data":"bf4743f2f827f2f4f9f2f8517fc9b12848a8b1ea2c24f37ec0a0b7b4cefab31e"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.778325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" event={"ID":"5c2cc197-a38a-41a4-8c40-c65602a593ea","Type":"ContainerStarted","Data":"2c07e5959ee3fc2cc7c2847460f6aace325ab51ecac6b3fa4806c949e47bad35"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.778747 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.786471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" event={"ID":"f331453a-4129-49ea-b554-06f628abf66a","Type":"ContainerDied","Data":"f6ffe7728ca88ceb4d9f43aa44a217ee69da22ab5eeb0584913769b7d72ad978"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.786568 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.792807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/prometheus-metric-storage-0" event={"ID":"db1354ec-1e74-42e4-8eda-ffcba8af62a8","Type":"ContainerDied","Data":"4b7687b382562c9996b451560f66f08288773b36e3b165c118b1b7c3a5471a69"} Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.793557 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/prometheus-metric-storage-0" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.796731 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" podStartSLOduration=3.796708965 podStartE2EDuration="3.796708965s" podCreationTimestamp="2026-01-20 23:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:13:55.791411827 +0000 UTC m=+2308.111672125" watchObservedRunningTime="2026-01-20 23:13:55.796708965 +0000 UTC m=+2308.116969243" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.801975 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="galera" containerID="cri-o://33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" gracePeriod=30 Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.802405 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" podUID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" containerName="keystone-api" containerID="cri-o://7ce9f7684a297141dd328f9a8c3e12154ea1db484b1ac1e034af93f31c64eab1" gracePeriod=30 Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.820224 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.820310 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts podName:ee0a79ab-ccac-4a83-9e3e-7f1f232f2121 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:56.320290155 +0000 UTC m=+2308.640550443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts") pod "root-account-create-update-sks9w" (UID: "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121") : configmap "openstack-scripts" not found Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.876884 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.877157 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:55 crc kubenswrapper[5030]: E0120 23:13:55.886667 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4t527 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" podUID="63d56792-0e0e-490d-9f7b-c613b6b6f979" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.900713 5030 scope.go:117] "RemoveContainer" containerID="a311ea86198bf17bcf45cd98e7b6fc056d1db58bf1b30b8a6c217b794dae71eb" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.903095 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.904246 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.912774 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.918106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.924726 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/alertmanager-metric-storage-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.945780 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:13:55 crc kubenswrapper[5030]: I0120 23:13:55.952509 5030 scope.go:117] "RemoveContainer" containerID="1cc12b6243a15a276fb893fecf16ad79a1fa615e0561dc80434a104cb6551b4c" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.002277 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b4eb966-3abf-4d3a-b65c-e05c8366c302" path="/var/lib/kubelet/pods/0b4eb966-3abf-4d3a-b65c-e05c8366c302/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.002950 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba022bd-4168-4c45-a924-061342f680dd" path="/var/lib/kubelet/pods/1ba022bd-4168-4c45-a924-061342f680dd/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.004730 5030 scope.go:117] "RemoveContainer" containerID="a4bc8dbaee8fec446141a83feeeaff6bfbd90d9c9c809d618868cdf3d15fa788" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.005812 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efdbb9c-5339-420d-9413-ae2132514f02" path="/var/lib/kubelet/pods/2efdbb9c-5339-420d-9413-ae2132514f02/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.006422 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354c4165-5673-49e5-9f0f-9b657e9c969f" path="/var/lib/kubelet/pods/354c4165-5673-49e5-9f0f-9b657e9c969f/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.006914 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d2f867-199b-446c-9f02-c8b2f842df63" path="/var/lib/kubelet/pods/39d2f867-199b-446c-9f02-c8b2f842df63/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.008226 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e2b069-8ada-46bf-9b74-8f899c5fb3b2" path="/var/lib/kubelet/pods/48e2b069-8ada-46bf-9b74-8f899c5fb3b2/volumes" Jan 20 
23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.008843 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682ea052-23fc-4878-ae8e-b5887a6e83b8" path="/var/lib/kubelet/pods/682ea052-23fc-4878-ae8e-b5887a6e83b8/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.009454 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" path="/var/lib/kubelet/pods/6d52d930-f169-42d5-a93d-d9d30906d5c8/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.010487 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70a1a36-5e84-42ef-bafd-cddecb57dfd7" path="/var/lib/kubelet/pods/c70a1a36-5e84-42ef-bafd-cddecb57dfd7/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.011094 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f331453a-4129-49ea-b554-06f628abf66a" path="/var/lib/kubelet/pods/f331453a-4129-49ea-b554-06f628abf66a/volumes" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.016183 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.018296 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/prometheus-metric-storage-0"] Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024632 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff2bk\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content\") pod \"93de7225-d504-409c-b4b7-8e0826937e9d\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: 
I0120 23:13:56.024819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024895 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities\") pod \"93de7225-d504-409c-b4b7-8e0826937e9d\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024938 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.024963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift\") pod \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\" (UID: \"f3c1f839-16bf-4ca2-8e07-b616fb9aef9c\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025190 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzbr4\" (UniqueName: \"kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4\") pod \"93de7225-d504-409c-b4b7-8e0826937e9d\" (UID: \"93de7225-d504-409c-b4b7-8e0826937e9d\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl5nf\" (UniqueName: \"kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf\") pod \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\" (UID: \"e7b0bf8f-9251-40a3-8177-ee7ec9aec622\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025311 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.025750 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.030204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.036095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities" (OuterVolumeSpecName: "utilities") pod "93de7225-d504-409c-b4b7-8e0826937e9d" (UID: "93de7225-d504-409c-b4b7-8e0826937e9d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.041351 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.213:8776/healthcheck\": read tcp 10.217.0.2:59554->10.217.1.213:8776: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.041414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts" (OuterVolumeSpecName: "scripts") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.041505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf" (OuterVolumeSpecName: "kube-api-access-xl5nf") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "kube-api-access-xl5nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.043366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.044570 5030 scope.go:117] "RemoveContainer" containerID="adc1e639f97f48f66f61a2912c7d0b9d66666e05456c9a9d9aaa15a28a4004ea" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.046450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.059820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk" (OuterVolumeSpecName: "kube-api-access-ff2bk") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "kube-api-access-ff2bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.078107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4" (OuterVolumeSpecName: "kube-api-access-pzbr4") pod "93de7225-d504-409c-b4b7-8e0826937e9d" (UID: "93de7225-d504-409c-b4b7-8e0826937e9d"). InnerVolumeSpecName "kube-api-access-pzbr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.080226 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.082531 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.085858 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.085939 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="galera" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.099412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93de7225-d504-409c-b4b7-8e0826937e9d" (UID: "93de7225-d504-409c-b4b7-8e0826937e9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.120514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.122905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxp4m\" (UniqueName: \"kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.127756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs\") pod \"0b296da3-d86c-431b-889e-a38437804d3a\" (UID: \"0b296da3-d86c-431b-889e-a38437804d3a\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t527\" (UniqueName: \"kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128177 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzbr4\" (UniqueName: \"kubernetes.io/projected/93de7225-d504-409c-b4b7-8e0826937e9d-kube-api-access-pzbr4\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128200 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl5nf\" (UniqueName: \"kubernetes.io/projected/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-kube-api-access-xl5nf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128209 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128217 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff2bk\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-kube-api-access-ff2bk\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128226 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128235 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128243 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128250 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93de7225-d504-409c-b4b7-8e0826937e9d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128258 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128266 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.128273 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.128408 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.128460 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:13:57.128442832 +0000 UTC m=+2309.448703120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : configmap "openstack-scripts" not found Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.135880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.138827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs" (OuterVolumeSpecName: "logs") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.140117 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4t527 for pod openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.140195 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527 podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:57.140173205 +0000 UTC m=+2309.460433493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4t527" (UniqueName: "kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.149686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts" (OuterVolumeSpecName: "scripts") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.157029 5030 scope.go:117] "RemoveContainer" containerID="06d13adbffbcd724575da3d439b5c58b3043b222f3e76d916058a23944d86b4f" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.157050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m" (OuterVolumeSpecName: "kube-api-access-mxp4m") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "kube-api-access-mxp4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.184267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.229706 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.229740 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.229756 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.229769 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxp4m\" (UniqueName: \"kubernetes.io/projected/0b296da3-d86c-431b-889e-a38437804d3a-kube-api-access-mxp4m\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.229782 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b296da3-d86c-431b-889e-a38437804d3a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.243291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data" (OuterVolumeSpecName: "config-data") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.246514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" (UID: "f3c1f839-16bf-4ca2-8e07-b616fb9aef9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.249026 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.211:9311/healthcheck\": read tcp 10.217.0.2:46782->10.217.1.211:9311: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.249722 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.211:9311/healthcheck\": read tcp 10.217.0.2:46794->10.217.1.211:9311: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.289712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.308786 5030 scope.go:117] "RemoveContainer" containerID="722d4093088b0113351d34ef93f46a8253354211420a96de0ce82f8dc64ebd7b" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.318741 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": read tcp 10.217.0.2:47566->10.217.0.44:8775: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.319015 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": read tcp 10.217.0.2:47550->10.217.0.44:8775: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.331990 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.332021 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.332031 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.332109 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.332237 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts podName:ee0a79ab-ccac-4a83-9e3e-7f1f232f2121 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:13:57.33220102 +0000 UTC m=+2309.652461308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts") pod "root-account-create-update-sks9w" (UID: "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121") : configmap "openstack-scripts" not found Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.337481 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.20:8774/\": read tcp 10.217.0.2:49494->10.217.0.20:8774: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.337550 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.20:8774/\": read tcp 10.217.0.2:49498->10.217.0.20:8774: read: connection reset by peer" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.363864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.386713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data" (OuterVolumeSpecName: "config-data") pod "e7b0bf8f-9251-40a3-8177-ee7ec9aec622" (UID: "e7b0bf8f-9251-40a3-8177-ee7ec9aec622"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.412039 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data" (OuterVolumeSpecName: "config-data") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.435753 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.435790 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.435801 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7b0bf8f-9251-40a3-8177-ee7ec9aec622-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.439045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.454144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0b296da3-d86c-431b-889e-a38437804d3a" (UID: "0b296da3-d86c-431b-889e-a38437804d3a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.497054 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.163:11211: connect: connection refused" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.538279 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.538325 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b296da3-d86c-431b-889e-a38437804d3a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640343 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640413 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:58.640396138 +0000 UTC m=+2310.960656426 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-internal-svc" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640487 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640546 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:58.640529931 +0000 UTC m=+2310.960790219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-public-svc" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640580 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:13:56 crc kubenswrapper[5030]: E0120 23:13:56.640604 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:58.640597043 +0000 UTC m=+2310.960857331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "combined-ca-bundle" not found Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.817717 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerID="1decb37a6a7c1d14ac75639b689c1ba474b929273efe96a724937a53898e5fcd" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.817871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerDied","Data":"1decb37a6a7c1d14ac75639b689c1ba474b929273efe96a724937a53898e5fcd"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.817905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab","Type":"ContainerDied","Data":"c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.817943 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c511903ef3583c6577bc458b52bea49f96d7401aba967350226ca754f53fa376" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.820426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" event={"ID":"0b296da3-d86c-431b-889e-a38437804d3a","Type":"ContainerDied","Data":"6d51eb7685047c08ab7fa7cb5f81c0f5a56b71b7701a86aa0d0143e5d3e360cc"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.820534 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-65fb4986d8-k2r2k" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.835739 5030 generic.go:334] "Generic (PLEG): container finished" podID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerID="b0b7fbcae4e87a8e2c4501e0a9f4a0d919d679b8183c68e6faeceada12e60e1e" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.835801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerDied","Data":"b0b7fbcae4e87a8e2c4501e0a9f4a0d919d679b8183c68e6faeceada12e60e1e"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.871367 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerID="14b424c8af972f0138ff073374e0e6c981d49cdcb262e43e51bfaad6dc967e68" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.871426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerDied","Data":"14b424c8af972f0138ff073374e0e6c981d49cdcb262e43e51bfaad6dc967e68"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.873159 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.873805 5030 generic.go:334] "Generic (PLEG): container finished" podID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerID="4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.873852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"529ed4cf-39aa-463b-bf65-da6900fa74e4","Type":"ContainerDied","Data":"4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.879231 5030 generic.go:334] "Generic (PLEG): container finished" podID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerID="5c57eec86df36ec8b65b60ad4ae937344c23be3ea71cb4377819fbc39e8de8e1" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.879253 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.879274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deba34a1-e52c-4ee9-b8fb-07dbe410184c","Type":"ContainerDied","Data":"5c57eec86df36ec8b65b60ad4ae937344c23be3ea71cb4377819fbc39e8de8e1"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.890125 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.894431 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.894462 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerID="e40e9ac3a45289a55deb5dd5f6e1a134c721f434436b873c0c20a597937dd46e" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.894512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerDied","Data":"e40e9ac3a45289a55deb5dd5f6e1a134c721f434436b873c0c20a597937dd46e"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.909109 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.911016 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-65fb4986d8-k2r2k"] Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.911337 5030 scope.go:117] "RemoveContainer" containerID="e53a19888fbe4b142de355206d3ab31f28f294382a2552dd84cc805eb5b41bed" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.918182 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerID="4afb199e076e08bd1f320a82a9ae4355b464db743e01e9a15d1354569ef3e0d8" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.918263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerDied","Data":"4afb199e076e08bd1f320a82a9ae4355b464db743e01e9a15d1354569ef3e0d8"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.941228 5030 generic.go:334] "Generic (PLEG): container finished" podID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerID="5191d760f933dad69163b25ccde46995e6d88d09b8f999f482abfafa83968828" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.941328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerDied","Data":"5191d760f933dad69163b25ccde46995e6d88d09b8f999f482abfafa83968828"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.944069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" event={"ID":"7100ef4a-f07d-4531-81f9-65fb79ae49d6","Type":"ContainerDied","Data":"e80ec98bf53751ebe37d7d6c513f1b18c80622c5dab2c8d3ae1e7724171ab863"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.944197 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.944838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.944996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grk2d\" (UniqueName: \"kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts\") pod \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs\") pod 
\"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.946745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q45tf\" (UniqueName: \"kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf\") pod \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\" (UID: \"7100ef4a-f07d-4531-81f9-65fb79ae49d6\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.946917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ntf\" (UniqueName: \"kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.947044 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.947122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.947224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.947308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca\") pod \"70ab6da8-893f-49a3-867f-02b2f7395331\" (UID: \"70ab6da8-893f-49a3-867f-02b2f7395331\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.947373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs\") pod \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\" (UID: \"3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab\") " Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.945202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.946779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs" (OuterVolumeSpecName: "logs") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.948348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs" (OuterVolumeSpecName: "logs") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.949330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7100ef4a-f07d-4531-81f9-65fb79ae49d6" (UID: "7100ef4a-f07d-4531-81f9-65fb79ae49d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.951037 5030 generic.go:334] "Generic (PLEG): container finished" podID="70ab6da8-893f-49a3-867f-02b2f7395331" containerID="5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6" exitCode=0 Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.951183 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-api-0" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.951236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerDied","Data":"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.951267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-api-0" event={"ID":"70ab6da8-893f-49a3-867f-02b2f7395331","Type":"ContainerDied","Data":"6fc4be01630ab3040830a21792f67c6f2385770be3db6d0ccd8ff393d36b2bd7"} Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.954106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf" (OuterVolumeSpecName: "kube-api-access-q45tf") pod "7100ef4a-f07d-4531-81f9-65fb79ae49d6" (UID: "7100ef4a-f07d-4531-81f9-65fb79ae49d6"). InnerVolumeSpecName "kube-api-access-q45tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.959944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf" (OuterVolumeSpecName: "kube-api-access-q6ntf") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "kube-api-access-q6ntf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.964698 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m5dvm" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.964904 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.965020 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.966003 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.966000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d" (OuterVolumeSpecName: "kube-api-access-grk2d") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "kube-api-access-grk2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.978480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.985257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts" (OuterVolumeSpecName: "scripts") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:56 crc kubenswrapper[5030]: I0120 23:13:56.999001 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.007075 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.017416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.032172 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.035885 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77m8q\" (UniqueName: \"kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q\") pod \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs\") pod \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs\") pod \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data\") pod \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwgq\" (UniqueName: \"kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052703 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle\") pod \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\" (UID: \"80e84f01-7009-4bfc-8cfb-2af258ceed3c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053303 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70ab6da8-893f-49a3-867f-02b2f7395331-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053324 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grk2d\" (UniqueName: \"kubernetes.io/projected/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-kube-api-access-grk2d\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053335 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053347 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7100ef4a-f07d-4531-81f9-65fb79ae49d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053356 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053366 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q45tf\" (UniqueName: \"kubernetes.io/projected/7100ef4a-f07d-4531-81f9-65fb79ae49d6-kube-api-access-q45tf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053376 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ntf\" (UniqueName: \"kubernetes.io/projected/70ab6da8-893f-49a3-867f-02b2f7395331-kube-api-access-q6ntf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053385 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053393 5030 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053404 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.052367 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-b8cd55c6b-gdzdd"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.053661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs" (OuterVolumeSpecName: "logs") pod "80e84f01-7009-4bfc-8cfb-2af258ceed3c" (UID: "80e84f01-7009-4bfc-8cfb-2af258ceed3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.054994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs" (OuterVolumeSpecName: "logs") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.057941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.059370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q" (OuterVolumeSpecName: "kube-api-access-77m8q") pod "80e84f01-7009-4bfc-8cfb-2af258ceed3c" (UID: "80e84f01-7009-4bfc-8cfb-2af258ceed3c"). InnerVolumeSpecName "kube-api-access-77m8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.065040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.070602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts" (OuterVolumeSpecName: "scripts") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.078836 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.082439 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.084392 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.089839 5030 scope.go:117] "RemoveContainer" containerID="e6154836a82a7f6acdf91096708f2199ef6808a460dc436a26d094f9263ef3e0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.094495 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.115252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq" (OuterVolumeSpecName: "kube-api-access-frwgq") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "kube-api-access-frwgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.132210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.158784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.158849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.158897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config\") pod \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.158965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.158992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle\") pod \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs\") pod \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlddk\" (UniqueName: \"kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk\") pod \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data\") pod \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\" (UID: \"deba34a1-e52c-4ee9-b8fb-07dbe410184c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgpxp\" (UniqueName: \"kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7khh\" (UniqueName: \"kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts\") pod \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\" (UID: \"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.159608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom\") pod \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\" (UID: \"4b106e3d-3a2a-44d6-a377-8eb7f8f37825\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160216 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t527\" (UniqueName: \"kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527\") pod \"keystone-a5ce-account-create-update-7j267\" (UID: \"63d56792-0e0e-490d-9f7b-c613b6b6f979\") " pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160547 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77m8q\" (UniqueName: \"kubernetes.io/projected/80e84f01-7009-4bfc-8cfb-2af258ceed3c-kube-api-access-77m8q\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160563 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80e84f01-7009-4bfc-8cfb-2af258ceed3c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160573 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160583 5030 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc 
kubenswrapper[5030]: I0120 23:13:57.160593 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwgq\" (UniqueName: \"kubernetes.io/projected/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-kube-api-access-frwgq\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160602 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.160611 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.161064 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.161062 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs" (OuterVolumeSpecName: "logs") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.165729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs" (OuterVolumeSpecName: "logs") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.166114 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.166384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "deba34a1-e52c-4ee9-b8fb-07dbe410184c" (UID: "deba34a1-e52c-4ee9-b8fb-07dbe410184c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.167534 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.174056 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.174118 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:59.17410121 +0000 UTC m=+2311.494361498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : configmap "openstack-scripts" not found Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.182133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data" (OuterVolumeSpecName: "config-data") pod "deba34a1-e52c-4ee9-b8fb-07dbe410184c" (UID: "deba34a1-e52c-4ee9-b8fb-07dbe410184c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.182278 5030 scope.go:117] "RemoveContainer" containerID="bf4743f2f827f2f4f9f2f8517fc9b12848a8b1ea2c24f37ec0a0b7b4cefab31e" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.182917 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data" (OuterVolumeSpecName: "config-data") pod "80e84f01-7009-4bfc-8cfb-2af258ceed3c" (UID: "80e84f01-7009-4bfc-8cfb-2af258ceed3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.185028 5030 projected.go:194] Error preparing data for projected volume kube-api-access-4t527 for pod openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.185104 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527 podName:63d56792-0e0e-490d-9f7b-c613b6b6f979 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:59.185081175 +0000 UTC m=+2311.505341683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4t527" (UniqueName: "kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527") pod "keystone-a5ce-account-create-update-7j267" (UID: "63d56792-0e0e-490d-9f7b-c613b6b6f979") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.195271 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh" (OuterVolumeSpecName: "kube-api-access-q7khh") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "kube-api-access-q7khh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.195310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk" (OuterVolumeSpecName: "kube-api-access-hlddk") pod "deba34a1-e52c-4ee9-b8fb-07dbe410184c" (UID: "deba34a1-e52c-4ee9-b8fb-07dbe410184c"). InnerVolumeSpecName "kube-api-access-hlddk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.195494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp" (OuterVolumeSpecName: "kube-api-access-sgpxp") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "kube-api-access-sgpxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.196460 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.200745 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.209559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.221036 5030 scope.go:117] "RemoveContainer" containerID="6959375e26fa2c4a22be160d0bdb837d17a80304dda7a78b4f10357d3e5cdc2d" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.221971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts" (OuterVolumeSpecName: "scripts") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.222199 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m5dvm"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.241345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data" (OuterVolumeSpecName: "config-data") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.244402 5030 scope.go:117] "RemoveContainer" containerID="5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.261682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.261762 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.261867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.261912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f5rv\" (UniqueName: \"kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.261953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs\") pod \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\" (UID: \"d9ae46d8-5c20-4710-90c6-b2d0120d9231\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262461 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262485 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262497 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262507 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlddk\" (UniqueName: \"kubernetes.io/projected/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kube-api-access-hlddk\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262516 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262539 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262553 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262567 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgpxp\" (UniqueName: \"kubernetes.io/projected/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-kube-api-access-sgpxp\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262580 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7khh\" (UniqueName: \"kubernetes.io/projected/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-kube-api-access-q7khh\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262591 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262604 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262665 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/deba34a1-e52c-4ee9-b8fb-07dbe410184c-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.262681 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.274826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs" (OuterVolumeSpecName: "logs") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.298814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.307788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv" (OuterVolumeSpecName: "kube-api-access-9f5rv") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "kube-api-access-9f5rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.323804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.336054 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.353155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.358341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80e84f01-7009-4bfc-8cfb-2af258ceed3c" (UID: "80e84f01-7009-4bfc-8cfb-2af258ceed3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.370358 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.370462 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts podName:ee0a79ab-ccac-4a83-9e3e-7f1f232f2121 nodeName:}" failed. No retries permitted until 2026-01-20 23:13:59.370439489 +0000 UTC m=+2311.690699767 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts") pod "root-account-create-update-sks9w" (UID: "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121") : configmap "openstack-scripts" not found Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.370674 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ae46d8-5c20-4710-90c6-b2d0120d9231-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.370747 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.370838 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f5rv\" (UniqueName: \"kubernetes.io/projected/d9ae46d8-5c20-4710-90c6-b2d0120d9231-kube-api-access-9f5rv\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.370904 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.370958 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.371098 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.371170 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.382149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.388680 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.402800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.406643 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.411530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data" (OuterVolumeSpecName: "config-data") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.415677 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data" (OuterVolumeSpecName: "config-data") pod "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" (UID: "3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.433408 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.436666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "70ab6da8-893f-49a3-867f-02b2f7395331" (UID: "70ab6da8-893f-49a3-867f-02b2f7395331"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.468391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deba34a1-e52c-4ee9-b8fb-07dbe410184c" (UID: "deba34a1-e52c-4ee9-b8fb-07dbe410184c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.468797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.471847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.471975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") pod \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\" (UID: \"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c\") " Jan 20 23:13:57 crc kubenswrapper[5030]: W0120 23:13:57.472071 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c/volumes/kubernetes.io~secret/internal-tls-certs Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data" (OuterVolumeSpecName: "config-data") pod "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" (UID: "dcb7653a-62fc-4a7c-b6fd-720c3f73d47c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472794 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472816 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472830 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472953 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472965 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472974 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472983 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472994 5030 reconciler_common.go:293] "Volume detached for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.473003 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.472992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data" (OuterVolumeSpecName: "config-data") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.473011 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.473076 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/70ab6da8-893f-49a3-867f-02b2f7395331-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.473089 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.474873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.489341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.492034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "deba34a1-e52c-4ee9-b8fb-07dbe410184c" (UID: "deba34a1-e52c-4ee9-b8fb-07dbe410184c"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.492666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "80e84f01-7009-4bfc-8cfb-2af258ceed3c" (UID: "80e84f01-7009-4bfc-8cfb-2af258ceed3c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.495983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4b106e3d-3a2a-44d6-a377-8eb7f8f37825" (UID: "4b106e3d-3a2a-44d6-a377-8eb7f8f37825"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.496984 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.500455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data" (OuterVolumeSpecName: "config-data") pod "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" (UID: "eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.509903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d9ae46d8-5c20-4710-90c6-b2d0120d9231" (UID: "d9ae46d8-5c20-4710-90c6-b2d0120d9231"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575802 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575873 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/deba34a1-e52c-4ee9-b8fb-07dbe410184c-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575894 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575915 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e84f01-7009-4bfc-8cfb-2af258ceed3c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575932 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575950 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ae46d8-5c20-4710-90c6-b2d0120d9231-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575970 5030 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.575991 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.576007 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106e3d-3a2a-44d6-a377-8eb7f8f37825-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.596658 5030 scope.go:117] "RemoveContainer" containerID="12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.600238 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.615366 5030 scope.go:117] "RemoveContainer" containerID="5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6" Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.615995 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6\": container with ID starting with 5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6 not found: ID does not exist" containerID="5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.616030 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6"} err="failed to get container status \"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6\": rpc error: code = NotFound desc = could not find container \"5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6\": container with ID starting with 5507e516abcaf14cb7732b59ea6cde2029913fbfae8b91df16c0097cb4b9cba6 not found: ID does not exist" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.616086 5030 scope.go:117] "RemoveContainer" containerID="12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0" Jan 20 23:13:57 crc kubenswrapper[5030]: E0120 23:13:57.616415 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0\": container with ID starting with 12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0 not found: ID does not exist" containerID="12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.616500 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0"} err="failed to get container status \"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0\": rpc error: code = NotFound desc = could not find container \"12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0\": container with ID starting with 12254c06fd109c72533ed7dc0710acf0c25c1330709a37f4aa8ac1e19abd08d0 not found: ID does not exist" Jan 20 23:13:57 crc 
kubenswrapper[5030]: I0120 23:13:57.641895 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.649034 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.661466 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.669702 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-aaf6-account-create-update-85t2q"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.677507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle\") pod \"529ed4cf-39aa-463b-bf65-da6900fa74e4\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.677544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs\") pod \"529ed4cf-39aa-463b-bf65-da6900fa74e4\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.677707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data\") pod \"529ed4cf-39aa-463b-bf65-da6900fa74e4\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.677748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzlz\" (UniqueName: \"kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz\") pod \"529ed4cf-39aa-463b-bf65-da6900fa74e4\" (UID: \"529ed4cf-39aa-463b-bf65-da6900fa74e4\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.678223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs" (OuterVolumeSpecName: "logs") pod "529ed4cf-39aa-463b-bf65-da6900fa74e4" (UID: "529ed4cf-39aa-463b-bf65-da6900fa74e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.678500 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.685868 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-api-0"] Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.686149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz" (OuterVolumeSpecName: "kube-api-access-gfzlz") pod "529ed4cf-39aa-463b-bf65-da6900fa74e4" (UID: "529ed4cf-39aa-463b-bf65-da6900fa74e4"). InnerVolumeSpecName "kube-api-access-gfzlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.710906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "529ed4cf-39aa-463b-bf65-da6900fa74e4" (UID: "529ed4cf-39aa-463b-bf65-da6900fa74e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.733985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data" (OuterVolumeSpecName: "config-data") pod "529ed4cf-39aa-463b-bf65-da6900fa74e4" (UID: "529ed4cf-39aa-463b-bf65-da6900fa74e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.779880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzjp\" (UniqueName: \"kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp\") pod \"f8fe612c-1c3a-4269-8177-c09b217721c0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.779934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddnc6\" (UniqueName: \"kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6\") pod \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.779961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts\") pod \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\" (UID: \"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle\") pod \"f8fe612c-1c3a-4269-8177-c09b217721c0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data\") pod \"f8fe612c-1c3a-4269-8177-c09b217721c0\" (UID: \"f8fe612c-1c3a-4269-8177-c09b217721c0\") " Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780594 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780612 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzlz\" (UniqueName: \"kubernetes.io/projected/529ed4cf-39aa-463b-bf65-da6900fa74e4-kube-api-access-gfzlz\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780638 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529ed4cf-39aa-463b-bf65-da6900fa74e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 
23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780649 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/529ed4cf-39aa-463b-bf65-da6900fa74e4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.780703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" (UID: "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.783660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6" (OuterVolumeSpecName: "kube-api-access-ddnc6") pod "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" (UID: "ee0a79ab-ccac-4a83-9e3e-7f1f232f2121"). InnerVolumeSpecName "kube-api-access-ddnc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.783858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp" (OuterVolumeSpecName: "kube-api-access-njzjp") pod "f8fe612c-1c3a-4269-8177-c09b217721c0" (UID: "f8fe612c-1c3a-4269-8177-c09b217721c0"). InnerVolumeSpecName "kube-api-access-njzjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.808297 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data" (OuterVolumeSpecName: "config-data") pod "f8fe612c-1c3a-4269-8177-c09b217721c0" (UID: "f8fe612c-1c3a-4269-8177-c09b217721c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.809030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8fe612c-1c3a-4269-8177-c09b217721c0" (UID: "f8fe612c-1c3a-4269-8177-c09b217721c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.882139 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.882171 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8fe612c-1c3a-4269-8177-c09b217721c0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.882180 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzjp\" (UniqueName: \"kubernetes.io/projected/f8fe612c-1c3a-4269-8177-c09b217721c0-kube-api-access-njzjp\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.882194 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddnc6\" (UniqueName: \"kubernetes.io/projected/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-kube-api-access-ddnc6\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.882204 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.933136 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/alertmanager-metric-storage-0" podUID="f331453a-4129-49ea-b554-06f628abf66a" containerName="alertmanager" probeResult="failure" output="Get \"http://10.217.1.167:9093/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.994694 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b296da3-d86c-431b-889e-a38437804d3a" path="/var/lib/kubelet/pods/0b296da3-d86c-431b-889e-a38437804d3a/volumes" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.995367 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" path="/var/lib/kubelet/pods/70ab6da8-893f-49a3-867f-02b2f7395331/volumes" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.999515 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8fe612c-1c3a-4269-8177-c09b217721c0" containerID="7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194" exitCode=0 Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.999555 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7100ef4a-f07d-4531-81f9-65fb79ae49d6" path="/var/lib/kubelet/pods/7100ef4a-f07d-4531-81f9-65fb79ae49d6/volumes" Jan 20 23:13:57 crc kubenswrapper[5030]: I0120 23:13:57.999789 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.001543 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" path="/var/lib/kubelet/pods/93de7225-d504-409c-b4b7-8e0826937e9d/volumes" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.007544 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1354ec-1e74-42e4-8eda-ffcba8af62a8" path="/var/lib/kubelet/pods/db1354ec-1e74-42e4-8eda-ffcba8af62a8/volumes" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.008743 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.009404 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" path="/var/lib/kubelet/pods/e7b0bf8f-9251-40a3-8177-ee7ec9aec622/volumes" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.013248 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" path="/var/lib/kubelet/pods/f3c1f839-16bf-4ca2-8e07-b616fb9aef9c/volumes" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.015827 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/watcher-applier-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.020424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f8fe612c-1c3a-4269-8177-c09b217721c0","Type":"ContainerDied","Data":"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.020481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f8fe612c-1c3a-4269-8177-c09b217721c0","Type":"ContainerDied","Data":"9556bff1ce1b568094892948cd334e8a457b628dc10d3a86c6f75249d495d434"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.020503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"dcb7653a-62fc-4a7c-b6fd-720c3f73d47c","Type":"ContainerDied","Data":"c4e00e0a3fb79ec51d9ca0ca579d187f08cdeb998f3ca6f1c8d2c90619ed953d"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.020524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-applier-0" event={"ID":"529ed4cf-39aa-463b-bf65-da6900fa74e4","Type":"ContainerDied","Data":"c0215662cba4d3fcee5762da221eedd081e1779a395d41797f159130a47d0f0b"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.020555 5030 scope.go:117] "RemoveContainer" containerID="7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.034432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" event={"ID":"4b106e3d-3a2a-44d6-a377-8eb7f8f37825","Type":"ContainerDied","Data":"6da96e85e71775220ada0b0bd685567b9ffd43cc9d125a2bf9ce50b5e63d8545"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.034671 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c5f987788-8dlzq" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.051471 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fa7d356-e537-4b10-829b-9f07de284e64" containerID="33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" exitCode=0 Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.051555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerDied","Data":"33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.062680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"80e84f01-7009-4bfc-8cfb-2af258ceed3c","Type":"ContainerDied","Data":"adbab748012b4835b73cc6a39938f15d85643d69bd26da766d4cc801e9a7d8f6"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.062823 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.071880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"deba34a1-e52c-4ee9-b8fb-07dbe410184c","Type":"ContainerDied","Data":"2e2e01d73ac13c66e483610e5659b591aa8a25b91fc14d3bdce31748babff9df"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.071909 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.077929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9","Type":"ContainerDied","Data":"bbccdc40651dfa4bf2b17228da4035b78046324e64507e3f43c65803a0ace128"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.078088 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.079867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-sks9w" event={"ID":"ee0a79ab-ccac-4a83-9e3e-7f1f232f2121","Type":"ContainerDied","Data":"d92021bd9f26a8919bd8a03fd3d9d6c347c4b6cecb9b84ab99d1a599dd0e6c13"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.079920 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-sks9w" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.086400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d9ae46d8-5c20-4710-90c6-b2d0120d9231","Type":"ContainerDied","Data":"79d1a584b801a41f3a5b989619a3373f95baea8a07e06d175bee484b1ef356f1"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.086559 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.094586 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_45ef6962-b7c4-487b-9b04-d954fbd30b3a/ovn-northd/0.log" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.094647 5030 generic.go:334] "Generic (PLEG): container finished" podID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerID="a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" exitCode=139 Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.094759 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.097005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerDied","Data":"a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc"} Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.097101 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.126241 5030 scope.go:117] "RemoveContainer" containerID="7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194" Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.130289 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194\": container with ID starting with 7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194 not found: ID does not exist" containerID="7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.130330 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194"} err="failed to get container status \"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194\": rpc error: code = NotFound desc = could not find container \"7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194\": container with ID starting with 7eda88b8cde3a5c2b63324054ee5424a80054e7b8f662825b7f8b622f7a48194 not found: ID does not exist" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.130355 5030 scope.go:117] "RemoveContainer" containerID="b0b7fbcae4e87a8e2c4501e0a9f4a0d919d679b8183c68e6faeceada12e60e1e" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.158590 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.163376 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.169380 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.173757 5030 scope.go:117] "RemoveContainer" containerID="9a976ba4a26e8a14c9b5d735869936056345cb3a87523033f8a1c7c29e43c684" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.206795 5030 scope.go:117] "RemoveContainer" containerID="4066467fc36d5b2f8ca58161d2e2528be4d9375a2251420eb10d0d6a759ae68e" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.223667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.228913 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="6d52d930-f169-42d5-a93d-d9d30906d5c8" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.45:8081/readyz\": dial tcp 10.217.0.45:8081: i/o timeout" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.234282 5030 scope.go:117] "RemoveContainer" containerID="e40e9ac3a45289a55deb5dd5f6e1a134c721f434436b873c0c20a597937dd46e" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.243393 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.257587 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.270361 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-sks9w"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.285200 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: 
\"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292372 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.292434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5j6h\" (UniqueName: \"kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h\") pod \"0fa7d356-e537-4b10-829b-9f07de284e64\" (UID: \"0fa7d356-e537-4b10-829b-9f07de284e64\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.294358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.295307 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.295442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.296022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.296740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.299828 5030 scope.go:117] "RemoveContainer" containerID="253233ff162c9e7305f61b0faa0810f0949d931fef29e1de6252df9e9b3381e3" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.304239 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.310779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.311753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h" (OuterVolumeSpecName: "kube-api-access-h5j6h") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "kube-api-access-h5j6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.313684 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-applier-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.328425 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.336689 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.341593 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-a5ce-account-create-update-7j267"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.359374 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.367310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0fa7d356-e537-4b10-829b-9f07de284e64" (UID: "0fa7d356-e537-4b10-829b-9f07de284e64"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.371720 5030 scope.go:117] "RemoveContainer" containerID="5191d760f933dad69163b25ccde46995e6d88d09b8f999f482abfafa83968828" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.373778 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.382570 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.388750 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.395319 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396153 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d56792-0e0e-490d-9f7b-c613b6b6f979-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396192 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396205 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396218 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396231 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5j6h\" (UniqueName: \"kubernetes.io/projected/0fa7d356-e537-4b10-829b-9f07de284e64-kube-api-access-h5j6h\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396241 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396276 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396289 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fa7d356-e537-4b10-829b-9f07de284e64-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396300 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0fa7d356-e537-4b10-829b-9f07de284e64-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.396309 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t527\" (UniqueName: 
\"kubernetes.io/projected/63d56792-0e0e-490d-9f7b-c613b6b6f979-kube-api-access-4t527\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.400419 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.406578 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.412318 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-c5f987788-8dlzq"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.413649 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.417437 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.421737 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.431554 5030 scope.go:117] "RemoveContainer" containerID="f8d84d8355d9ec6b696884a0457ca932cf4fbadb0be127718cc3bfa520459e7d" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.476702 5030 scope.go:117] "RemoveContainer" containerID="5c57eec86df36ec8b65b60ad4ae937344c23be3ea71cb4377819fbc39e8de8e1" Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.498246 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.498257 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.498309 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data podName:8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac nodeName:}" failed. No retries permitted until 2026-01-20 23:14:06.498292981 +0000 UTC m=+2318.818553269 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data") pod "rabbitmq-server-0" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac") : configmap "rabbitmq-config-data" not found Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.506820 5030 scope.go:117] "RemoveContainer" containerID="4afb199e076e08bd1f320a82a9ae4355b464db743e01e9a15d1354569ef3e0d8" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.532318 5030 scope.go:117] "RemoveContainer" containerID="781cd384eff78e55f6cd571c577fc758748a027bc29a451c8ad2e11b486203dd" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.551094 5030 scope.go:117] "RemoveContainer" containerID="e959b9a2c88101b1abc84c9677d1da8781c558a6553ea4387f6edbb3c88f918b" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.570051 5030 scope.go:117] "RemoveContainer" containerID="14b424c8af972f0138ff073374e0e6c981d49cdcb262e43e51bfaad6dc967e68" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.595992 5030 scope.go:117] "RemoveContainer" containerID="f803f31c1484582fd4598d69e21199293bf4f36c02688642dfeddaee147a10ab" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.628265 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_45ef6962-b7c4-487b-9b04-d954fbd30b3a/ovn-northd/0.log" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.628337 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzc4m\" (UniqueName: 
\"kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.701827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle\") pod \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\" (UID: \"45ef6962-b7c4-487b-9b04-d954fbd30b3a\") " Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.702287 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.702430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config" (OuterVolumeSpecName: "config") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.702581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts" (OuterVolumeSpecName: "scripts") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.703165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.703315 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.703431 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.705185 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:14:02.702321926 +0000 UTC m=+2315.022582224 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-public-svc" not found Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.705236 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:14:02.705221336 +0000 UTC m=+2315.025481634 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "combined-ca-bundle" not found Jan 20 23:13:58 crc kubenswrapper[5030]: E0120 23:13:58.705259 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs podName:03b59562-d96f-453a-bfc7-a9f3c7e7f872 nodeName:}" failed. No retries permitted until 2026-01-20 23:14:02.705249227 +0000 UTC m=+2315.025509525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs") pod "keystone-7f866c7db4-82phj" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872") : secret "cert-keystone-internal-svc" not found Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.717280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m" (OuterVolumeSpecName: "kube-api-access-vzc4m") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "kube-api-access-vzc4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.729483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.776943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.787468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "45ef6962-b7c4-487b-9b04-d954fbd30b3a" (UID: "45ef6962-b7c4-487b-9b04-d954fbd30b3a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803644 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803675 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803692 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803705 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45ef6962-b7c4-487b-9b04-d954fbd30b3a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803717 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45ef6962-b7c4-487b-9b04-d954fbd30b3a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803733 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzc4m\" (UniqueName: \"kubernetes.io/projected/45ef6962-b7c4-487b-9b04-d954fbd30b3a-kube-api-access-vzc4m\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:58 crc kubenswrapper[5030]: I0120 23:13:58.803745 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45ef6962-b7c4-487b-9b04-d954fbd30b3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.104784 5030 generic.go:334] "Generic (PLEG): container finished" podID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" containerID="7ce9f7684a297141dd328f9a8c3e12154ea1db484b1ac1e034af93f31c64eab1" exitCode=0 Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.104897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" event={"ID":"03b59562-d96f-453a-bfc7-a9f3c7e7f872","Type":"ContainerDied","Data":"7ce9f7684a297141dd328f9a8c3e12154ea1db484b1ac1e034af93f31c64eab1"} Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.109698 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_45ef6962-b7c4-487b-9b04-d954fbd30b3a/ovn-northd/0.log" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.109778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"45ef6962-b7c4-487b-9b04-d954fbd30b3a","Type":"ContainerDied","Data":"dbc6123b56a3c21f27e32f1cb0e8c8a4b6d91827e4c18576685a7383b2df5bd4"} Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.109818 5030 scope.go:117] "RemoveContainer" containerID="d81bac0da6673890a9c30a104bdda356d42b4ea1937571eec0145eb0c3b6b9ec" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.109844 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.114730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0fa7d356-e537-4b10-829b-9f07de284e64","Type":"ContainerDied","Data":"4c852fa817a40b30745becbe4d10ea2f1b1e6ec84c4012411b67498fbde8845f"} Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.114829 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.142886 5030 scope.go:117] "RemoveContainer" containerID="a26f3d469d1bc70918fe59af26ae8ccec88cda9bef22615517691b50482821fc" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.166293 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.174114 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.178174 5030 scope.go:117] "RemoveContainer" containerID="33728d9e320468adf6786f01946b2906c887ec040f837a9e394ae711dd71ba4c" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.181579 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.189044 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.214707 5030 scope.go:117] "RemoveContainer" containerID="0bb5172cf37332f877e6edf9a908bfb7d962347a4504c2961384b777b72fd68d" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.311908 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9msb7\" (UniqueName: \"kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.412655 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data\") pod \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\" (UID: \"03b59562-d96f-453a-bfc7-a9f3c7e7f872\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.416836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.417353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7" (OuterVolumeSpecName: "kube-api-access-9msb7") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "kube-api-access-9msb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.417374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts" (OuterVolumeSpecName: "scripts") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.436015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.438560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.441960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data" (OuterVolumeSpecName: "config-data") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.459477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.461684 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03b59562-d96f-453a-bfc7-a9f3c7e7f872" (UID: "03b59562-d96f-453a-bfc7-a9f3c7e7f872"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514587 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514643 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9msb7\" (UniqueName: \"kubernetes.io/projected/03b59562-d96f-453a-bfc7-a9f3c7e7f872-kube-api-access-9msb7\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514655 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514664 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514673 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514683 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514691 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.514699 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03b59562-d96f-453a-bfc7-a9f3c7e7f872-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: E0120 23:13:59.673006 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:59 crc kubenswrapper[5030]: E0120 23:13:59.674317 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:59 crc kubenswrapper[5030]: E0120 23:13:59.675631 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:13:59 crc kubenswrapper[5030]: E0120 23:13:59.675656 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container 
is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.747479 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.818979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819070 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jwbv\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: 
\"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls\") pod \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\" (UID: \"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac\") " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.819995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.820074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.820555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.822861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info" (OuterVolumeSpecName: "pod-info") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.824599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.824833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.824979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.826407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv" (OuterVolumeSpecName: "kube-api-access-2jwbv") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "kube-api-access-2jwbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.837696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data" (OuterVolumeSpecName: "config-data") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.888760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf" (OuterVolumeSpecName: "server-conf") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.900790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" (UID: "8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921847 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jwbv\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-kube-api-access-2jwbv\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921894 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921918 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921936 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921953 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921971 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.921989 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.922008 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.922025 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.922088 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.922108 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.951416 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.983451 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" path="/var/lib/kubelet/pods/0fa7d356-e537-4b10-829b-9f07de284e64/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.984805 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" path="/var/lib/kubelet/pods/3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.986826 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" path="/var/lib/kubelet/pods/45ef6962-b7c4-487b-9b04-d954fbd30b3a/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.989760 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" path="/var/lib/kubelet/pods/4b106e3d-3a2a-44d6-a377-8eb7f8f37825/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.990690 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" path="/var/lib/kubelet/pods/529ed4cf-39aa-463b-bf65-da6900fa74e4/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.991243 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d56792-0e0e-490d-9f7b-c613b6b6f979" path="/var/lib/kubelet/pods/63d56792-0e0e-490d-9f7b-c613b6b6f979/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.991756 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" path="/var/lib/kubelet/pods/80e84f01-7009-4bfc-8cfb-2af258ceed3c/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.992377 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" path="/var/lib/kubelet/pods/d9ae46d8-5c20-4710-90c6-b2d0120d9231/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.993679 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" path="/var/lib/kubelet/pods/dcb7653a-62fc-4a7c-b6fd-720c3f73d47c/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.994635 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" path="/var/lib/kubelet/pods/deba34a1-e52c-4ee9-b8fb-07dbe410184c/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.995776 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" path="/var/lib/kubelet/pods/eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.996596 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" path="/var/lib/kubelet/pods/ee0a79ab-ccac-4a83-9e3e-7f1f232f2121/volumes" Jan 20 23:13:59 crc kubenswrapper[5030]: I0120 23:13:59.997305 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fe612c-1c3a-4269-8177-c09b217721c0" path="/var/lib/kubelet/pods/f8fe612c-1c3a-4269-8177-c09b217721c0/volumes" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.031036 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.172492 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.172581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7f866c7db4-82phj" event={"ID":"03b59562-d96f-453a-bfc7-a9f3c7e7f872","Type":"ContainerDied","Data":"55cb7a6b93f3c971f30d12b05d3eb90c90bab4e19c118061da82f49e26511216"} Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.172681 5030 scope.go:117] "RemoveContainer" containerID="7ce9f7684a297141dd328f9a8c3e12154ea1db484b1ac1e034af93f31c64eab1" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.181024 5030 generic.go:334] "Generic (PLEG): container finished" podID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerID="0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8" exitCode=0 Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.181287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerDied","Data":"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8"} Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.181512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac","Type":"ContainerDied","Data":"166c0118eb51409e199ccc5e924b57cf4a6300397f664a6f64c04671064b1d3c"} Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.181332 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.224497 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.232596 5030 scope.go:117] "RemoveContainer" containerID="0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.241991 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7f866c7db4-82phj"] Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.261633 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.268457 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.307649 5030 scope.go:117] "RemoveContainer" containerID="41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.336124 5030 scope.go:117] "RemoveContainer" containerID="0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8" Jan 20 23:14:00 crc kubenswrapper[5030]: E0120 23:14:00.336884 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8\": container with ID starting with 0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8 not found: ID does not exist" containerID="0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.336938 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8"} err="failed to get container status \"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8\": rpc error: code = NotFound desc = could not find container \"0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8\": container with ID starting with 0dea74d7713f6e95acbd5e0508ac0db11fd7ae82d19264ddcc72a80ed955f3f8 not found: ID does not exist" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.336976 5030 scope.go:117] "RemoveContainer" containerID="41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff" Jan 20 23:14:00 crc kubenswrapper[5030]: E0120 23:14:00.337424 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff\": container with ID starting with 41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff not found: ID does not exist" containerID="41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.337479 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff"} err="failed to get container status \"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff\": rpc error: code = NotFound desc = could not find container \"41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff\": container with ID starting with 41c64328990c95c8e74986ddc791bda40376cfba59526d1bcd67f04916bafbff not found: ID does not exist" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.397535 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.219:9696/\": dial tcp 10.217.1.219:9696: connect: connection refused" Jan 20 23:14:00 crc kubenswrapper[5030]: I0120 23:14:00.976415 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.045735 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.051363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqzdn\" (UniqueName: \"kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn\") pod \"b2354629-cb98-42f4-bad0-675148557a8b\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.051554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle\") pod \"b2354629-cb98-42f4-bad0-675148557a8b\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.051600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs\") pod \"b2354629-cb98-42f4-bad0-675148557a8b\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.051643 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data\") pod \"b2354629-cb98-42f4-bad0-675148557a8b\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.051699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom\") pod \"b2354629-cb98-42f4-bad0-675148557a8b\" (UID: \"b2354629-cb98-42f4-bad0-675148557a8b\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.052817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs" (OuterVolumeSpecName: "logs") pod "b2354629-cb98-42f4-bad0-675148557a8b" (UID: "b2354629-cb98-42f4-bad0-675148557a8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.058106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn" (OuterVolumeSpecName: "kube-api-access-jqzdn") pod "b2354629-cb98-42f4-bad0-675148557a8b" (UID: "b2354629-cb98-42f4-bad0-675148557a8b"). InnerVolumeSpecName "kube-api-access-jqzdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.058145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b2354629-cb98-42f4-bad0-675148557a8b" (UID: "b2354629-cb98-42f4-bad0-675148557a8b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.081098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2354629-cb98-42f4-bad0-675148557a8b" (UID: "b2354629-cb98-42f4-bad0-675148557a8b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.102004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data" (OuterVolumeSpecName: "config-data") pod "b2354629-cb98-42f4-bad0-675148557a8b" (UID: "b2354629-cb98-42f4-bad0-675148557a8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.153122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom\") pod \"e35248d6-8ad0-425b-a55d-ab390f26120c\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.153187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle\") pod \"e35248d6-8ad0-425b-a55d-ab390f26120c\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.153260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs\") pod \"e35248d6-8ad0-425b-a55d-ab390f26120c\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.153303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data\") pod \"e35248d6-8ad0-425b-a55d-ab390f26120c\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.153436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8fd7\" (UniqueName: \"kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7\") pod \"e35248d6-8ad0-425b-a55d-ab390f26120c\" (UID: \"e35248d6-8ad0-425b-a55d-ab390f26120c\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs" (OuterVolumeSpecName: "logs") pod "e35248d6-8ad0-425b-a55d-ab390f26120c" (UID: "e35248d6-8ad0-425b-a55d-ab390f26120c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154670 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2354629-cb98-42f4-bad0-675148557a8b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154694 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154706 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154718 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqzdn\" (UniqueName: \"kubernetes.io/projected/b2354629-cb98-42f4-bad0-675148557a8b-kube-api-access-jqzdn\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154729 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e35248d6-8ad0-425b-a55d-ab390f26120c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.154765 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2354629-cb98-42f4-bad0-675148557a8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.157149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e35248d6-8ad0-425b-a55d-ab390f26120c" (UID: "e35248d6-8ad0-425b-a55d-ab390f26120c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.157567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7" (OuterVolumeSpecName: "kube-api-access-z8fd7") pod "e35248d6-8ad0-425b-a55d-ab390f26120c" (UID: "e35248d6-8ad0-425b-a55d-ab390f26120c"). InnerVolumeSpecName "kube-api-access-z8fd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.173608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e35248d6-8ad0-425b-a55d-ab390f26120c" (UID: "e35248d6-8ad0-425b-a55d-ab390f26120c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.188602 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.205153 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2354629-cb98-42f4-bad0-675148557a8b" containerID="d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475" exitCode=0 Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.205242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerDied","Data":"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.205282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" event={"ID":"b2354629-cb98-42f4-bad0-675148557a8b","Type":"ContainerDied","Data":"8bc90db94e27e6c81fe0d3b87f7414446a3b142fd022e669b2ce52bfbbe4b355"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.205302 5030 scope.go:117] "RemoveContainer" containerID="d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.205300 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.211104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data" (OuterVolumeSpecName: "config-data") pod "e35248d6-8ad0-425b-a55d-ab390f26120c" (UID: "e35248d6-8ad0-425b-a55d-ab390f26120c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.212919 5030 generic.go:334] "Generic (PLEG): container finished" podID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerID="e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca" exitCode=0 Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.213019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerDied","Data":"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.213081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" event={"ID":"e35248d6-8ad0-425b-a55d-ab390f26120c","Type":"ContainerDied","Data":"1b4a0b9f602177cf21b9c455ed94e9c8ec3f6932b959a420a6cb042543b219c0"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.213037 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.231301 5030 scope.go:117] "RemoveContainer" containerID="4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.231422 5030 generic.go:334] "Generic (PLEG): container finished" podID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" exitCode=0 Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.231557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"935fdabd-9fb0-4aec-8920-4ccbaf4416a3","Type":"ContainerDied","Data":"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.231589 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.231632 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"935fdabd-9fb0-4aec-8920-4ccbaf4416a3","Type":"ContainerDied","Data":"2f2378951181784ae7c970bc36e95ac59f829c05bfd64485bd20512a61f67910"} Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data\") pod \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht5dk\" (UniqueName: \"kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk\") pod \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle\") pod \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\" (UID: \"935fdabd-9fb0-4aec-8920-4ccbaf4416a3\") " Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256547 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256567 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256580 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e35248d6-8ad0-425b-a55d-ab390f26120c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.256594 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8fd7\" (UniqueName: \"kubernetes.io/projected/e35248d6-8ad0-425b-a55d-ab390f26120c-kube-api-access-z8fd7\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 
23:14:01.259558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk" (OuterVolumeSpecName: "kube-api-access-ht5dk") pod "935fdabd-9fb0-4aec-8920-4ccbaf4416a3" (UID: "935fdabd-9fb0-4aec-8920-4ccbaf4416a3"). InnerVolumeSpecName "kube-api-access-ht5dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.280579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "935fdabd-9fb0-4aec-8920-4ccbaf4416a3" (UID: "935fdabd-9fb0-4aec-8920-4ccbaf4416a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.281477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data" (OuterVolumeSpecName: "config-data") pod "935fdabd-9fb0-4aec-8920-4ccbaf4416a3" (UID: "935fdabd-9fb0-4aec-8920-4ccbaf4416a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.347434 5030 scope.go:117] "RemoveContainer" containerID="d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475" Jan 20 23:14:01 crc kubenswrapper[5030]: E0120 23:14:01.347939 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475\": container with ID starting with d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475 not found: ID does not exist" containerID="d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.347978 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475"} err="failed to get container status \"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475\": rpc error: code = NotFound desc = could not find container \"d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475\": container with ID starting with d25a29179b1fb97279dddb13bc8a1318a8a5a57098f7af8c31048f2d300e2475 not found: ID does not exist" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.348002 5030 scope.go:117] "RemoveContainer" containerID="4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5" Jan 20 23:14:01 crc kubenswrapper[5030]: E0120 23:14:01.348319 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5\": container with ID starting with 4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5 not found: ID does not exist" containerID="4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.348337 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5"} err="failed to get container status \"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5\": rpc error: code = NotFound desc = could not 
find container \"4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5\": container with ID starting with 4df84c5f2f640e79714079dfad074cdba86b59da1f68acbe5452536735fb75d5 not found: ID does not exist" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.348348 5030 scope.go:117] "RemoveContainer" containerID="e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.353977 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.358503 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.358559 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht5dk\" (UniqueName: \"kubernetes.io/projected/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-kube-api-access-ht5dk\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.358569 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935fdabd-9fb0-4aec-8920-4ccbaf4416a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.362136 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-747bd99d97-8wlhs"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.370151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.376366 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6647c4cc54-xdx92"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.376605 5030 scope.go:117] "RemoveContainer" containerID="a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.394269 5030 scope.go:117] "RemoveContainer" containerID="e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca" Jan 20 23:14:01 crc kubenswrapper[5030]: E0120 23:14:01.395329 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca\": container with ID starting with e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca not found: ID does not exist" containerID="e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.395368 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca"} err="failed to get container status \"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca\": rpc error: code = NotFound desc = could not find container \"e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca\": container with ID starting with e6a766d34ead818ebeb37ccaabe733eb8ee48b183b9e05802b89424703cbd7ca not found: ID does not exist" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.395398 5030 scope.go:117] "RemoveContainer" containerID="a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9" Jan 20 
23:14:01 crc kubenswrapper[5030]: E0120 23:14:01.395771 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9\": container with ID starting with a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9 not found: ID does not exist" containerID="a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.395812 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9"} err="failed to get container status \"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9\": rpc error: code = NotFound desc = could not find container \"a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9\": container with ID starting with a83a06fdb8b6d9a910cc5205bbc63cf63fdf75a815d953e146b28e0382bacff9 not found: ID does not exist" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.395839 5030 scope.go:117] "RemoveContainer" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.414697 5030 scope.go:117] "RemoveContainer" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" Jan 20 23:14:01 crc kubenswrapper[5030]: E0120 23:14:01.415035 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245\": container with ID starting with 52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245 not found: ID does not exist" containerID="52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.415069 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245"} err="failed to get container status \"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245\": rpc error: code = NotFound desc = could not find container \"52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245\": container with ID starting with 52c8db33c572c36a98dd6dc3c28705b771f1fc91dfa12e7411ac8f4552af1245 not found: ID does not exist" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.567120 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.576362 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.977093 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" path="/var/lib/kubelet/pods/03b59562-d96f-453a-bfc7-a9f3c7e7f872/volumes" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.978063 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" path="/var/lib/kubelet/pods/8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac/volumes" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.978959 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" path="/var/lib/kubelet/pods/935fdabd-9fb0-4aec-8920-4ccbaf4416a3/volumes" Jan 20 
23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.980258 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2354629-cb98-42f4-bad0-675148557a8b" path="/var/lib/kubelet/pods/b2354629-cb98-42f4-bad0-675148557a8b/volumes" Jan 20 23:14:01 crc kubenswrapper[5030]: I0120 23:14:01.981043 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" path="/var/lib/kubelet/pods/e35248d6-8ad0-425b-a55d-ab390f26120c/volumes" Jan 20 23:14:02 crc kubenswrapper[5030]: I0120 23:14:02.756577 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:14:02 crc kubenswrapper[5030]: I0120 23:14:02.830592 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:14:02 crc kubenswrapper[5030]: I0120 23:14:02.830870 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="dnsmasq-dns" containerID="cri-o://a98358ab78eec0b694659f796a37b14f6795dd13ef8a648b4600dfb66d2fd05c" gracePeriod=10 Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.273738 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerID="a98358ab78eec0b694659f796a37b14f6795dd13ef8a648b4600dfb66d2fd05c" exitCode=0 Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.273785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" event={"ID":"6b0797cf-f0bf-4d22-9b01-975b439e5a36","Type":"ContainerDied","Data":"a98358ab78eec0b694659f796a37b14f6795dd13ef8a648b4600dfb66d2fd05c"} Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.273813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" event={"ID":"6b0797cf-f0bf-4d22-9b01-975b439e5a36","Type":"ContainerDied","Data":"a2b0050e24e3087ad6390305d44469f92dc1abb74e7f566baa55f357adcdb710"} Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.273827 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2b0050e24e3087ad6390305d44469f92dc1abb74e7f566baa55f357adcdb710" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.325576 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.403263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0\") pod \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.403539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc\") pod \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.403606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config\") pod \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.403652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6cmf\" (UniqueName: \"kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf\") pod \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\" (UID: \"6b0797cf-f0bf-4d22-9b01-975b439e5a36\") " Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.409775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf" (OuterVolumeSpecName: "kube-api-access-h6cmf") pod "6b0797cf-f0bf-4d22-9b01-975b439e5a36" (UID: "6b0797cf-f0bf-4d22-9b01-975b439e5a36"). InnerVolumeSpecName "kube-api-access-h6cmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.447895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "6b0797cf-f0bf-4d22-9b01-975b439e5a36" (UID: "6b0797cf-f0bf-4d22-9b01-975b439e5a36"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.467960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config" (OuterVolumeSpecName: "config") pod "6b0797cf-f0bf-4d22-9b01-975b439e5a36" (UID: "6b0797cf-f0bf-4d22-9b01-975b439e5a36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.472605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6b0797cf-f0bf-4d22-9b01-975b439e5a36" (UID: "6b0797cf-f0bf-4d22-9b01-975b439e5a36"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.504960 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.504988 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.504998 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0797cf-f0bf-4d22-9b01-975b439e5a36-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:03 crc kubenswrapper[5030]: I0120 23:14:03.505009 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6cmf\" (UniqueName: \"kubernetes.io/projected/6b0797cf-f0bf-4d22-9b01-975b439e5a36-kube-api-access-h6cmf\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:03 crc kubenswrapper[5030]: E0120 23:14:03.977945 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d47b79_8224_4a87_8ffc_808a6743d00a.slice/crio-bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28d47b79_8224_4a87_8ffc_808a6743d00a.slice/crio-conmon-bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.284661 5030 generic.go:334] "Generic (PLEG): container finished" podID="28d47b79-8224-4a87-8ffc-808a6743d00a" containerID="bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40" exitCode=0 Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.284769 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.285597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"28d47b79-8224-4a87-8ffc-808a6743d00a","Type":"ContainerDied","Data":"bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40"} Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.285661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/watcher-decision-engine-0" event={"ID":"28d47b79-8224-4a87-8ffc-808a6743d00a","Type":"ContainerDied","Data":"6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b"} Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.285677 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6494dfe0bf15335df1d555e64e5bdad3155f6ca50d23495e750c0cd9ad39591b" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.319427 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.344351 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.349800 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-85c5cb8b5f-rlz2f"] Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.419897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data\") pod \"28d47b79-8224-4a87-8ffc-808a6743d00a\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.420190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca\") pod \"28d47b79-8224-4a87-8ffc-808a6743d00a\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.420360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs\") pod \"28d47b79-8224-4a87-8ffc-808a6743d00a\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.420435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle\") pod \"28d47b79-8224-4a87-8ffc-808a6743d00a\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.420576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdrsr\" (UniqueName: \"kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr\") pod \"28d47b79-8224-4a87-8ffc-808a6743d00a\" (UID: \"28d47b79-8224-4a87-8ffc-808a6743d00a\") " Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.420860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs" (OuterVolumeSpecName: "logs") pod "28d47b79-8224-4a87-8ffc-808a6743d00a" (UID: "28d47b79-8224-4a87-8ffc-808a6743d00a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.421449 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28d47b79-8224-4a87-8ffc-808a6743d00a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.429786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr" (OuterVolumeSpecName: "kube-api-access-cdrsr") pod "28d47b79-8224-4a87-8ffc-808a6743d00a" (UID: "28d47b79-8224-4a87-8ffc-808a6743d00a"). InnerVolumeSpecName "kube-api-access-cdrsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.457169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28d47b79-8224-4a87-8ffc-808a6743d00a" (UID: "28d47b79-8224-4a87-8ffc-808a6743d00a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.468980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data" (OuterVolumeSpecName: "config-data") pod "28d47b79-8224-4a87-8ffc-808a6743d00a" (UID: "28d47b79-8224-4a87-8ffc-808a6743d00a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.474415 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "28d47b79-8224-4a87-8ffc-808a6743d00a" (UID: "28d47b79-8224-4a87-8ffc-808a6743d00a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.522867 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.522914 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdrsr\" (UniqueName: \"kubernetes.io/projected/28d47b79-8224-4a87-8ffc-808a6743d00a-kube-api-access-cdrsr\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.522935 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:04 crc kubenswrapper[5030]: I0120 23:14:04.522950 5030 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/28d47b79-8224-4a87-8ffc-808a6743d00a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:05 crc kubenswrapper[5030]: I0120 23:14:05.295199 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/watcher-decision-engine-0" Jan 20 23:14:05 crc kubenswrapper[5030]: I0120 23:14:05.340921 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:14:05 crc kubenswrapper[5030]: I0120 23:14:05.352433 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/watcher-decision-engine-0"] Jan 20 23:14:05 crc kubenswrapper[5030]: I0120 23:14:05.979447 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d47b79-8224-4a87-8ffc-808a6743d00a" path="/var/lib/kubelet/pods/28d47b79-8224-4a87-8ffc-808a6743d00a/volumes" Jan 20 23:14:05 crc kubenswrapper[5030]: I0120 23:14:05.980696 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" path="/var/lib/kubelet/pods/6b0797cf-f0bf-4d22-9b01-975b439e5a36/volumes" Jan 20 23:14:08 crc kubenswrapper[5030]: I0120 23:14:08.962433 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:14:08 crc kubenswrapper[5030]: E0120 23:14:08.962722 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.058451 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxfp5\" (UniqueName: \"kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: 
\"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.121325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs\") pod \"65835ff1-85e2-4b33-a384-c93fdf569c40\" (UID: \"65835ff1-85e2-4b33-a384-c93fdf569c40\") " Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.126826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5" (OuterVolumeSpecName: "kube-api-access-fxfp5") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "kube-api-access-fxfp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.127052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.156176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.164693 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config" (OuterVolumeSpecName: "config") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.165949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.170336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.183763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "65835ff1-85e2-4b33-a384-c93fdf569c40" (UID: "65835ff1-85e2-4b33-a384-c93fdf569c40"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.222515 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.222744 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxfp5\" (UniqueName: \"kubernetes.io/projected/65835ff1-85e2-4b33-a384-c93fdf569c40-kube-api-access-fxfp5\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.222805 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.222894 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.222948 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.223027 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.223089 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/65835ff1-85e2-4b33-a384-c93fdf569c40-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.424509 5030 generic.go:334] "Generic (PLEG): container finished" podID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerID="6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d" exitCode=0 Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.424584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerDied","Data":"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d"} Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.424670 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.424693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-d9778659d-djkxb" event={"ID":"65835ff1-85e2-4b33-a384-c93fdf569c40","Type":"ContainerDied","Data":"b1bbbbcdac7a677bb23329174532b0167d4ac661543b3432b8472b8a2a02c593"} Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.424739 5030 scope.go:117] "RemoveContainer" containerID="23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.464190 5030 scope.go:117] "RemoveContainer" containerID="6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.475965 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.481148 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-d9778659d-djkxb"] Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.499676 5030 scope.go:117] "RemoveContainer" containerID="23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf" Jan 20 23:14:15 crc kubenswrapper[5030]: E0120 23:14:15.500169 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf\": container with ID starting with 23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf not found: ID does not exist" containerID="23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.500203 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf"} err="failed to get container status \"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf\": rpc error: code = NotFound desc = could not find container \"23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf\": container with ID starting with 23839d0d94f910990f762e129be6aa928cce6a34a983e78f892a4b78ebb1dbaf not found: ID does not exist" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.500225 5030 scope.go:117] "RemoveContainer" containerID="6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d" Jan 20 23:14:15 crc kubenswrapper[5030]: E0120 23:14:15.500841 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d\": container with ID starting with 6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d not found: ID does not exist" containerID="6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.500863 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d"} err="failed to get container status \"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d\": rpc error: code = NotFound desc = could not find container \"6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d\": container with ID starting with 6f907dbea8308c724188d082653b0290d84929b8eaa1ab9fe4f6b434755bb79d not found: ID 
does not exist" Jan 20 23:14:15 crc kubenswrapper[5030]: I0120 23:14:15.978425 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" path="/var/lib/kubelet/pods/65835ff1-85e2-4b33-a384-c93fdf569c40/volumes" Jan 20 23:14:21 crc kubenswrapper[5030]: I0120 23:14:21.962842 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:14:21 crc kubenswrapper[5030]: E0120 23:14:21.964654 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.535923 5030 generic.go:334] "Generic (PLEG): container finished" podID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerID="fef2f8940ea5258c15f06f5d7dd8a6156a4b7e207d9173332274b4da36d97ab5" exitCode=137 Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.536078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"fef2f8940ea5258c15f06f5d7dd8a6156a4b7e207d9173332274b4da36d97ab5"} Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.684169 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.742609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock\") pod \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.742717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") pod \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.742787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.742834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache\") pod \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.742946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbzdc\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc\") pod \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\" (UID: \"70a9dd1f-f075-4833-b9e8-3fc856743ab8\") " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.743833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock" (OuterVolumeSpecName: "lock") pod "70a9dd1f-f075-4833-b9e8-3fc856743ab8" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.744418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache" (OuterVolumeSpecName: "cache") pod "70a9dd1f-f075-4833-b9e8-3fc856743ab8" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.754090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "swift") pod "70a9dd1f-f075-4833-b9e8-3fc856743ab8" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.754156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "70a9dd1f-f075-4833-b9e8-3fc856743ab8" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.754219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc" (OuterVolumeSpecName: "kube-api-access-nbzdc") pod "70a9dd1f-f075-4833-b9e8-3fc856743ab8" (UID: "70a9dd1f-f075-4833-b9e8-3fc856743ab8"). InnerVolumeSpecName "kube-api-access-nbzdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.845335 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.845410 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.845513 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.846209 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70a9dd1f-f075-4833-b9e8-3fc856743ab8-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.846289 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbzdc\" (UniqueName: \"kubernetes.io/projected/70a9dd1f-f075-4833-b9e8-3fc856743ab8-kube-api-access-nbzdc\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.871430 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:14:22 crc kubenswrapper[5030]: I0120 23:14:22.948027 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.560000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"70a9dd1f-f075-4833-b9e8-3fc856743ab8","Type":"ContainerDied","Data":"02ab03e5fc6eda916c9515fe0f901ffcfb5ff6663636096457e50cde6498c21a"} Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.560087 5030 scope.go:117] "RemoveContainer" containerID="fef2f8940ea5258c15f06f5d7dd8a6156a4b7e207d9173332274b4da36d97ab5" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.560255 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.600272 5030 scope.go:117] "RemoveContainer" containerID="df21f467aa48e9cd73ecf780edb3fde78c478ee9e3e1fe10188497a5fc87d1a4" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.627367 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.637206 5030 scope.go:117] "RemoveContainer" containerID="09c4955040ae79ee588305769f26a9faab7bd5b88198cf480c047e86852ecd2d" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.638956 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.666536 5030 scope.go:117] "RemoveContainer" containerID="237d5c77a55c616b660a6d8a7a0789ffbbf5636dcf37dc9b4177236859da902e" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.696616 5030 scope.go:117] "RemoveContainer" containerID="02d2cd5180513116470bf587dced9406e0cccb4577828e686f7ad2f6e7756b90" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.722344 5030 scope.go:117] "RemoveContainer" containerID="a0b0fd6857b1e7dc5deae4a9fc1014ddd91b0f25a6c5eabcb93bc1c9b7ad0c4e" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.747389 5030 scope.go:117] "RemoveContainer" containerID="39ce2002c75ebf0e4ceffe55cf58454e136385b8b14ce8fbfd7b9e95218bc1d4" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.779068 5030 scope.go:117] "RemoveContainer" containerID="8cfd138e7964a40907591a741551bede8ca3e6f85ec899d199026118111003c6" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.805742 5030 scope.go:117] "RemoveContainer" containerID="9822aac9b52040694bad86d906341b866e4efd664919a2f1c1b84e448aa91fd2" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.834205 5030 scope.go:117] "RemoveContainer" containerID="6139d1ebd8fe4bd1772b00058abe6fe6f57fc73bf53a9abf6d30f7b0b31612d5" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.861917 5030 scope.go:117] "RemoveContainer" containerID="8cc93827bc67f52466465d92f18c3615d4a04a5a5999d606cd743da99355de49" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.889460 5030 scope.go:117] "RemoveContainer" containerID="0cd31ae9c67288c5d006a85915b20396118db52b0eb4af518b3f1e74a1973aad" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.915023 5030 scope.go:117] "RemoveContainer" containerID="3baf6bd0266a5a9fd8e4a8400417e2564cf33cb06f85ced28b5396ab30bdbc41" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.952844 5030 scope.go:117] "RemoveContainer" containerID="a98d6c66eb67d678dafdd47b3afa23c516a4bb86312f8871a44e29574063d568" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.978502 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" path="/var/lib/kubelet/pods/70a9dd1f-f075-4833-b9e8-3fc856743ab8/volumes" Jan 20 23:14:23 crc kubenswrapper[5030]: I0120 23:14:23.984317 5030 scope.go:117] "RemoveContainer" containerID="54da1c1411ab5ceca784346a76d3a2e8df0bf52604dc8fe074f570c94f675b9c" Jan 20 23:14:34 crc kubenswrapper[5030]: I0120 23:14:34.962930 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:14:34 crc kubenswrapper[5030]: E0120 23:14:34.964057 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.287868 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tk6zs"] Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.299067 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tk6zs"] Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.415683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-fzk44"] Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.416298 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="openstack-network-exporter" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.416402 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="openstack-network-exporter" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.416498 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="dnsmasq-dns" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.416592 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="dnsmasq-dns" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.416713 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.416792 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.416877 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.416950 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-server" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417025 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417102 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417240 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d47b79-8224-4a87-8ffc-808a6743d00a" containerName="watcher-decision-engine" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417416 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d47b79-8224-4a87-8ffc-808a6743d00a" 
containerName="watcher-decision-engine" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417490 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417666 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fe612c-1c3a-4269-8177-c09b217721c0" containerName="nova-cell1-conductor-conductor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fe612c-1c3a-4269-8177-c09b217721c0" containerName="nova-cell1-conductor-conductor" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.417875 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-notification-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.417949 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-notification-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418024 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerName="watcher-applier" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerName="watcher-applier" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418179 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418254 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418336 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418415 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418494 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418565 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418673 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418758 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418836 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="registry-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.418912 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="registry-server" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.418995 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="setup-container" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419069 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="setup-container" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419147 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419213 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419284 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419358 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419433 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419532 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419613 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="rsync" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419715 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="rsync" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419794 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.419868 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.419938 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.420010 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.420085 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.420180 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.420252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.420324 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.420401 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-reaper" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.420474 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-reaper" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.420552 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" containerName="keystone-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.422748 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" containerName="keystone-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.422842 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.422925 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423061 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.423142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423221 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.423294 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.423453 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423529 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-central-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.423597 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-central-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423699 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.423774 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.423847 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: 
I0120 23:14:36.424085 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.424186 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.424255 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.424332 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.424403 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.424477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.424544 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.424614 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="galera" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.424817 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="galera" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.424915 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.424995 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425084 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425228 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-expirer" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425284 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-expirer" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425341 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425399 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-server" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425507 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-server" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425563 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerName="memcached" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425637 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerName="memcached" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425719 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425775 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425832 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.425893 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.425962 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426031 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426104 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="mysql-bootstrap" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426171 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="mysql-bootstrap" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426242 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="sg-core" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="sg-core" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426392 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="extract-content" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426460 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="extract-content" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426538 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426604 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426701 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 
23:14:36.426772 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426847 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="swift-recon-cron" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.426923 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="swift-recon-cron" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.426999 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener-log" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427303 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-server" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427448 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427521 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="rabbitmq" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427590 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="rabbitmq" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427698 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="extract-utilities" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427766 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="extract-utilities" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427834 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.427901 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.427974 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428045 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.428140 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="init" Jan 
20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428218 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="init" Jan 20 23:14:36 crc kubenswrapper[5030]: E0120 23:14:36.428290 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428698 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428788 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428853 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="rsync" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428911 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.428968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="935fdabd-9fb0-4aec-8920-4ccbaf4416a3" containerName="nova-scheduler-scheduler" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429204 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429270 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="swift-recon-cron" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429332 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="openstack-network-exporter" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429395 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429456 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="529ed4cf-39aa-463b-bf65-da6900fa74e4" containerName="watcher-applier" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429511 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b296da3-d86c-431b-889e-a38437804d3a" containerName="placement-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429591 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429676 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429755 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429828 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0fa7d356-e537-4b10-829b-9f07de284e64" containerName="galera" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429883 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.429993 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430048 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430100 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0a79ab-ccac-4a83-9e3e-7f1f232f2121" containerName="mariadb-account-create-update" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430230 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ab6da8-893f-49a3-867f-02b2f7395331" containerName="watcher-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430294 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03b59562-d96f-453a-bfc7-a9f3c7e7f872" containerName="keystone-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106e3d-3a2a-44d6-a377-8eb7f8f37825" containerName="barbican-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430424 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-expirer" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430483 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ef6962-b7c4-487b-9b04-d954fbd30b3a" containerName="ovn-northd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430545 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430599 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430729 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e84f01-7009-4bfc-8cfb-2af258ceed3c" containerName="nova-metadata-metadata" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430808 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="93de7225-d504-409c-b4b7-8e0826937e9d" containerName="registry-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430868 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35248d6-8ad0-425b-a55d-ab390f26120c" containerName="barbican-keystone-listener-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.430925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" 
containerName="container-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431056 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="sg-core" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431110 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431166 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431279 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-reaper" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431332 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="object-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9833c5-3a7c-426c-b5e2-4e1151e9ebe9" containerName="glance-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431461 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ae46d8-5c20-4710-90c6-b2d0120d9231" containerName="nova-api-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431539 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fe612c-1c3a-4269-8177-c09b217721c0" containerName="nova-cell1-conductor-conductor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431611 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-central-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431716 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c254adc-e7a0-4ef6-8b53-a6e0ed7b79ab" containerName="cinder-api-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431775 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-server" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431829 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c1f839-16bf-4ca2-8e07-b616fb9aef9c" containerName="proxy-httpd" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431889 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d47b79-8224-4a87-8ffc-808a6743d00a" containerName="watcher-decision-engine" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431945 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65835ff1-85e2-4b33-a384-c93fdf569c40" containerName="neutron-api" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.431999 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-replicator" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432054 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2354629-cb98-42f4-bad0-675148557a8b" containerName="barbican-worker" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 
23:14:36.432130 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-updater" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432183 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b0797cf-f0bf-4d22-9b01-975b439e5a36" containerName="dnsmasq-dns" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6e4c2f-4a2b-48c8-9e9a-1a754e5be7ac" containerName="rabbitmq" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432298 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b0bf8f-9251-40a3-8177-ee7ec9aec622" containerName="ceilometer-notification-agent" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432351 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="deba34a1-e52c-4ee9-b8fb-07dbe410184c" containerName="memcached" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="container-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432460 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb7653a-62fc-4a7c-b6fd-720c3f73d47c" containerName="glance-log" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.432514 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a9dd1f-f075-4833-b9e8-3fc856743ab8" containerName="account-auditor" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.433019 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fzk44"] Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.433169 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.435807 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.435927 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.435987 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.436132 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.483285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djnwb\" (UniqueName: \"kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.483347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.483411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.585384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.585501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.585674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djnwb\" (UniqueName: \"kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.585935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.586698 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.607352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djnwb\" (UniqueName: \"kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb\") pod \"crc-storage-crc-fzk44\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:36 crc kubenswrapper[5030]: I0120 23:14:36.761723 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:37 crc kubenswrapper[5030]: I0120 23:14:37.250549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fzk44"] Jan 20 23:14:37 crc kubenswrapper[5030]: I0120 23:14:37.255989 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:14:37 crc kubenswrapper[5030]: I0120 23:14:37.734900 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fzk44" event={"ID":"e06491c2-9db3-485c-abe8-b93c0c8d27b4","Type":"ContainerStarted","Data":"206a750d504f259589268c3b116df025462dfb26b873838b17e27bd78425a635"} Jan 20 23:14:37 crc kubenswrapper[5030]: I0120 23:14:37.982362 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a293400-ab1b-4e7c-90fd-6eb9356a5713" path="/var/lib/kubelet/pods/4a293400-ab1b-4e7c-90fd-6eb9356a5713/volumes" Jan 20 23:14:38 crc kubenswrapper[5030]: I0120 23:14:38.750944 5030 generic.go:334] "Generic (PLEG): container finished" podID="e06491c2-9db3-485c-abe8-b93c0c8d27b4" containerID="2b0195567e390a19554c983b7344fac252c95ea9a88d9187dcfec9456bdc0669" exitCode=0 Jan 20 23:14:38 crc kubenswrapper[5030]: I0120 23:14:38.751057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fzk44" event={"ID":"e06491c2-9db3-485c-abe8-b93c0c8d27b4","Type":"ContainerDied","Data":"2b0195567e390a19554c983b7344fac252c95ea9a88d9187dcfec9456bdc0669"} Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.142636 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.257432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djnwb\" (UniqueName: \"kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb\") pod \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.257572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt\") pod \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.257685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage\") pod \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\" (UID: \"e06491c2-9db3-485c-abe8-b93c0c8d27b4\") " Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.257777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e06491c2-9db3-485c-abe8-b93c0c8d27b4" (UID: "e06491c2-9db3-485c-abe8-b93c0c8d27b4"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.258249 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e06491c2-9db3-485c-abe8-b93c0c8d27b4-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.265696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb" (OuterVolumeSpecName: "kube-api-access-djnwb") pod "e06491c2-9db3-485c-abe8-b93c0c8d27b4" (UID: "e06491c2-9db3-485c-abe8-b93c0c8d27b4"). InnerVolumeSpecName "kube-api-access-djnwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.289599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e06491c2-9db3-485c-abe8-b93c0c8d27b4" (UID: "e06491c2-9db3-485c-abe8-b93c0c8d27b4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.359303 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djnwb\" (UniqueName: \"kubernetes.io/projected/e06491c2-9db3-485c-abe8-b93c0c8d27b4-kube-api-access-djnwb\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.359341 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e06491c2-9db3-485c-abe8-b93c0c8d27b4-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.777064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fzk44" event={"ID":"e06491c2-9db3-485c-abe8-b93c0c8d27b4","Type":"ContainerDied","Data":"206a750d504f259589268c3b116df025462dfb26b873838b17e27bd78425a635"} Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.777146 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206a750d504f259589268c3b116df025462dfb26b873838b17e27bd78425a635" Jan 20 23:14:40 crc kubenswrapper[5030]: I0120 23:14:40.777173 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fzk44" Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.117045 5030 scope.go:117] "RemoveContainer" containerID="a250ee84175492205fc032ec7ef1b2826234a8f95c9690768cde08732f54aa4b" Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.163489 5030 scope.go:117] "RemoveContainer" containerID="1c83759a327a23747c10530750f3480ef054908978614d8ec14557542b7805c0" Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.201790 5030 scope.go:117] "RemoveContainer" containerID="8c414e8e4d8f67036d5bc982cb1a10fc1658a7e83ee567f657f4df23a6dd409b" Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.232954 5030 scope.go:117] "RemoveContainer" containerID="d6822b95c4a216a9a63dcbccea5e2206e6ee671b6acd2b901417699cad643d22" Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.929867 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-fzk44"] Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.941650 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-fzk44"] Jan 20 23:14:43 crc kubenswrapper[5030]: I0120 23:14:43.979736 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e06491c2-9db3-485c-abe8-b93c0c8d27b4" path="/var/lib/kubelet/pods/e06491c2-9db3-485c-abe8-b93c0c8d27b4/volumes" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.071798 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mwwl4"] Jan 20 23:14:44 crc kubenswrapper[5030]: E0120 23:14:44.072168 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e06491c2-9db3-485c-abe8-b93c0c8d27b4" containerName="storage" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.072187 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e06491c2-9db3-485c-abe8-b93c0c8d27b4" containerName="storage" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.072334 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e06491c2-9db3-485c-abe8-b93c0c8d27b4" containerName="storage" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.072913 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.075647 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.076041 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.076464 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.077506 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.098885 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mwwl4"] Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.226192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.226250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtlvc\" (UniqueName: \"kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.226355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.328142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.328249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.328288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtlvc\" (UniqueName: \"kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.329027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " 
pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.330128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.358523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtlvc\" (UniqueName: \"kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc\") pod \"crc-storage-crc-mwwl4\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.396907 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.665472 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mwwl4"] Jan 20 23:14:44 crc kubenswrapper[5030]: I0120 23:14:44.825340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mwwl4" event={"ID":"80f2b839-4822-4121-a227-334e3a1fb94a","Type":"ContainerStarted","Data":"bc0a0e45fea26af095a7a873f9805f788b1de14b71b942b5650c5687a2aa4364"} Jan 20 23:14:45 crc kubenswrapper[5030]: I0120 23:14:45.841236 5030 generic.go:334] "Generic (PLEG): container finished" podID="80f2b839-4822-4121-a227-334e3a1fb94a" containerID="0799a414ab9bc02a00a60fde72c5c4830fc0204c4a5564607ae804f0a7459cfb" exitCode=0 Jan 20 23:14:45 crc kubenswrapper[5030]: I0120 23:14:45.841310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mwwl4" event={"ID":"80f2b839-4822-4121-a227-334e3a1fb94a","Type":"ContainerDied","Data":"0799a414ab9bc02a00a60fde72c5c4830fc0204c4a5564607ae804f0a7459cfb"} Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.170556 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.278947 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage\") pod \"80f2b839-4822-4121-a227-334e3a1fb94a\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.279121 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt\") pod \"80f2b839-4822-4121-a227-334e3a1fb94a\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.279249 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtlvc\" (UniqueName: \"kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc\") pod \"80f2b839-4822-4121-a227-334e3a1fb94a\" (UID: \"80f2b839-4822-4121-a227-334e3a1fb94a\") " Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.279341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "80f2b839-4822-4121-a227-334e3a1fb94a" (UID: "80f2b839-4822-4121-a227-334e3a1fb94a"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.279892 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/80f2b839-4822-4121-a227-334e3a1fb94a-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.285121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc" (OuterVolumeSpecName: "kube-api-access-qtlvc") pod "80f2b839-4822-4121-a227-334e3a1fb94a" (UID: "80f2b839-4822-4121-a227-334e3a1fb94a"). InnerVolumeSpecName "kube-api-access-qtlvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.307518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "80f2b839-4822-4121-a227-334e3a1fb94a" (UID: "80f2b839-4822-4121-a227-334e3a1fb94a"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.381514 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/80f2b839-4822-4121-a227-334e3a1fb94a-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.381573 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtlvc\" (UniqueName: \"kubernetes.io/projected/80f2b839-4822-4121-a227-334e3a1fb94a-kube-api-access-qtlvc\") on node \"crc\" DevicePath \"\"" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.875815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mwwl4" event={"ID":"80f2b839-4822-4121-a227-334e3a1fb94a","Type":"ContainerDied","Data":"bc0a0e45fea26af095a7a873f9805f788b1de14b71b942b5650c5687a2aa4364"} Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.875889 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mwwl4" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.875895 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0a0e45fea26af095a7a873f9805f788b1de14b71b942b5650c5687a2aa4364" Jan 20 23:14:47 crc kubenswrapper[5030]: I0120 23:14:47.969933 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:14:47 crc kubenswrapper[5030]: E0120 23:14:47.970380 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.149013 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt"] Jan 20 23:15:00 crc kubenswrapper[5030]: E0120 23:15:00.154276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f2b839-4822-4121-a227-334e3a1fb94a" containerName="storage" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.154309 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f2b839-4822-4121-a227-334e3a1fb94a" containerName="storage" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.154564 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f2b839-4822-4121-a227-334e3a1fb94a" containerName="storage" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.155452 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.158000 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.158435 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.167658 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt"] Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.289968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvzp\" (UniqueName: \"kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.290044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.290086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.391292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvzp\" (UniqueName: \"kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.391349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.391388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.392783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume\") pod 
\"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.400707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.409826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvzp\" (UniqueName: \"kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp\") pod \"collect-profiles-29482515-6kzdt\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:00 crc kubenswrapper[5030]: I0120 23:15:00.473899 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:01 crc kubenswrapper[5030]: I0120 23:15:01.025530 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt"] Jan 20 23:15:01 crc kubenswrapper[5030]: I0120 23:15:01.963097 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:15:01 crc kubenswrapper[5030]: E0120 23:15:01.965547 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:02 crc kubenswrapper[5030]: I0120 23:15:02.028563 5030 generic.go:334] "Generic (PLEG): container finished" podID="35630068-6911-4cd5-93bb-7d81bd6a1e72" containerID="408858c7b76bc42f0d3a71d267c335c5c087f74c9039d5854354217999c10a68" exitCode=0 Jan 20 23:15:02 crc kubenswrapper[5030]: I0120 23:15:02.028596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" event={"ID":"35630068-6911-4cd5-93bb-7d81bd6a1e72","Type":"ContainerDied","Data":"408858c7b76bc42f0d3a71d267c335c5c087f74c9039d5854354217999c10a68"} Jan 20 23:15:02 crc kubenswrapper[5030]: I0120 23:15:02.028634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" event={"ID":"35630068-6911-4cd5-93bb-7d81bd6a1e72","Type":"ContainerStarted","Data":"2922fd2bd910ffc9848c060f9fb9e72f25a11ef18e925d4beac63a4b74061aeb"} Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.369196 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.539237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume\") pod \"35630068-6911-4cd5-93bb-7d81bd6a1e72\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.539556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume\") pod \"35630068-6911-4cd5-93bb-7d81bd6a1e72\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.539663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsvzp\" (UniqueName: \"kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp\") pod \"35630068-6911-4cd5-93bb-7d81bd6a1e72\" (UID: \"35630068-6911-4cd5-93bb-7d81bd6a1e72\") " Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.540709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume" (OuterVolumeSpecName: "config-volume") pod "35630068-6911-4cd5-93bb-7d81bd6a1e72" (UID: "35630068-6911-4cd5-93bb-7d81bd6a1e72"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.547025 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp" (OuterVolumeSpecName: "kube-api-access-vsvzp") pod "35630068-6911-4cd5-93bb-7d81bd6a1e72" (UID: "35630068-6911-4cd5-93bb-7d81bd6a1e72"). InnerVolumeSpecName "kube-api-access-vsvzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.547139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35630068-6911-4cd5-93bb-7d81bd6a1e72" (UID: "35630068-6911-4cd5-93bb-7d81bd6a1e72"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.642118 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35630068-6911-4cd5-93bb-7d81bd6a1e72-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.642185 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsvzp\" (UniqueName: \"kubernetes.io/projected/35630068-6911-4cd5-93bb-7d81bd6a1e72-kube-api-access-vsvzp\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.642208 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35630068-6911-4cd5-93bb-7d81bd6a1e72-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.878488 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:15:03 crc kubenswrapper[5030]: E0120 23:15:03.879007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35630068-6911-4cd5-93bb-7d81bd6a1e72" containerName="collect-profiles" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.879029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35630068-6911-4cd5-93bb-7d81bd6a1e72" containerName="collect-profiles" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.879328 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35630068-6911-4cd5-93bb-7d81bd6a1e72" containerName="collect-profiles" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.880774 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.885904 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.886514 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-74m5h" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.886808 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.887023 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.887299 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.887466 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.887676 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.888261 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.891466 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.896719 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.906444 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.908212 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.941908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.954473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:03 crc kubenswrapper[5030]: I0120 23:15:03.957747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv8dr\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.045809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" event={"ID":"35630068-6911-4cd5-93bb-7d81bd6a1e72","Type":"ContainerDied","Data":"2922fd2bd910ffc9848c060f9fb9e72f25a11ef18e925d4beac63a4b74061aeb"} Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.046036 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2922fd2bd910ffc9848c060f9fb9e72f25a11ef18e925d4beac63a4b74061aeb" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.046085 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.059931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv8dr\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.059989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmb7\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060433 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060530 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060649 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 
23:15:04.060865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.060995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk6q\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.061032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.061042 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.061057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.061608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 
23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.062073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.062762 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.062832 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.063836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.064597 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.066559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.068417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.068543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.081585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv8dr\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.091908 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmb7\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167876 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.167978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk6q\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.168327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.169228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.169260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.169366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.169533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.169903 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.170063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.170642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.171060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.171094 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.171588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.171691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.171990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.174386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info\") pod \"rabbitmq-cell1-server-2\" (UID: 
\"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.175705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.176949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.177376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.177770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.179060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.181149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.186424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk6q\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.193452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmb7\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.200453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-2\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" 
Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.203448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-1\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.207718 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.237027 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.249766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.449036 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w"] Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.455391 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482470-4x64w"] Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.706583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:15:04 crc kubenswrapper[5030]: W0120 23:15:04.707575 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995342f6_fe0b_4188_976c_5151541dd002.slice/crio-62150a8f853956dd62fa0336f565eec30c91832921b138ff215bebacbfac1c22 WatchSource:0}: Error finding container 62150a8f853956dd62fa0336f565eec30c91832921b138ff215bebacbfac1c22: Status 404 returned error can't find the container with id 62150a8f853956dd62fa0336f565eec30c91832921b138ff215bebacbfac1c22 Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.765656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:15:04 crc kubenswrapper[5030]: W0120 23:15:04.775390 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54757d6_0570_4290_a802_d82ff94dffb9.slice/crio-923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5 WatchSource:0}: Error finding container 923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5: Status 404 returned error can't find the container with id 923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5 Jan 20 23:15:04 crc kubenswrapper[5030]: I0120 23:15:04.786947 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:15:04 crc kubenswrapper[5030]: W0120 23:15:04.790948 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e9d298_7338_47c3_8591_b7803798203f.slice/crio-372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b WatchSource:0}: Error finding container 372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b: Status 404 returned error can't find the container with id 372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.007560 5030 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.009174 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.011227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.011401 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.012497 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.014251 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-4zg8b" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.014337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.014352 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.015343 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.024843 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.048359 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.049736 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.058541 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.060612 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.063017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerStarted","Data":"923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5"} Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.064969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerStarted","Data":"62150a8f853956dd62fa0336f565eec30c91832921b138ff215bebacbfac1c22"} Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.069711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerStarted","Data":"372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b"} Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.079759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082579 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.082957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.083182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqn7m\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.096329 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqn7m\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186629 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tv6\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksgn\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186870 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186905 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.186930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187100 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.187912 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.188825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.189647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.193250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.194431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.197804 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.201507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.206101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqn7m\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.215368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-server-0\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.294735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.294865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9tv6\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.294917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295104 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rksgn\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295225 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295500 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls\") 
pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 
23:15:05.295890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.295923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.296224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.297401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.298094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.298193 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.299128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.299553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.300486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.300730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.301298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.344563 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.350979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.353939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.353950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.359414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.360607 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.361713 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.361956 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.363929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9tv6\" (UniqueName: 
\"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.364541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.365305 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksgn\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.480512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"rabbitmq-server-1\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.677313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-server-2\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.699023 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.829268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:15:05 crc kubenswrapper[5030]: W0120 23:15:05.848181 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded443431_2b5c_4533_8523_d01380c98c1d.slice/crio-c1ed5f52af5d3a7aea7605a15475d802b60154d0600fbc943b8538b8756caac7 WatchSource:0}: Error finding container c1ed5f52af5d3a7aea7605a15475d802b60154d0600fbc943b8538b8756caac7: Status 404 returned error can't find the container with id c1ed5f52af5d3a7aea7605a15475d802b60154d0600fbc943b8538b8756caac7 Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.972150 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:05 crc kubenswrapper[5030]: I0120 23:15:05.973223 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b55be0c-55e3-4bf3-badd-618e8dc9f738" path="/var/lib/kubelet/pods/6b55be0c-55e3-4bf3-badd-618e8dc9f738/volumes" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.079535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerStarted","Data":"c1ed5f52af5d3a7aea7605a15475d802b60154d0600fbc943b8538b8756caac7"} Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.171000 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.400594 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.440456 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.442020 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.451477 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.451724 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.452050 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.452113 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-zn7wk" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.456303 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.458685 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.476389 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.477710 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.503660 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.504951 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.511425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.516849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.517854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.517953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.517986 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.518082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6842r\" (UniqueName: \"kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.518130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.518157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.518276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.518392 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.619999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74rl\" (UniqueName: \"kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6842r\" (UniqueName: \"kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620126 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620181 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts\") pod \"openstack-galera-1\" (UID: 
\"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620264 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620362 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbl4\" (UniqueName: \"kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.620975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.621024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.621067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.621263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.621286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.621686 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.623924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.623953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.628478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.645359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6842r\" (UniqueName: \"kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.653290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721659 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated\") pod \"openstack-galera-2\" (UID: 
\"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbl4\" (UniqueName: \"kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74rl\" (UniqueName: \"kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.721971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc 
kubenswrapper[5030]: I0120 23:15:06.721999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722019 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.722954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723087 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723417 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.723954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.734914 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.737682 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74rl\" (UniqueName: \"kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.740794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbl4\" (UniqueName: \"kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.748312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.753980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.754164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.755353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.778812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-2\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.804514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-1\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.890565 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.900702 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:06 crc kubenswrapper[5030]: I0120 23:15:06.953610 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.092654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerStarted","Data":"ce04af762c42921ec01aa3670edc17c959abd025e114d214262bf6b03d163300"} Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.094917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerStarted","Data":"165efdad72daf753004568da6e026ea0e21356f8eff4ac64ca0eba7376ae6625"} Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.097307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerStarted","Data":"752b6a46f4ce2e600f72af503766db84caedf6a5b37e0be36eaa583d522b93d0"} Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.103563 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerStarted","Data":"94f45336453177298d12d18e1f69aac4ece70b0a27b5fda3a802dcbbe48cf180"} Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.104878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerStarted","Data":"7d5497c08cbd622e49fcc51b99641a1df34a35f61b4bc17ebbc6a57130accf3b"} Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.404374 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:15:07 crc kubenswrapper[5030]: W0120 23:15:07.409175 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf776dcd_8af4_45b0_9c39_0e8db2af4879.slice/crio-a85ca7b9bd72973fcabfe3800e05c7eebed97699919da36c88f980de7144d275 WatchSource:0}: Error finding container a85ca7b9bd72973fcabfe3800e05c7eebed97699919da36c88f980de7144d275: Status 404 returned error can't find the container with id a85ca7b9bd72973fcabfe3800e05c7eebed97699919da36c88f980de7144d275 Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.480339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:15:07 crc kubenswrapper[5030]: W0120 23:15:07.489275 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba5579b_ac6a_439c_894d_25a5c14d6242.slice/crio-0a0f098aefe0f39a03bf9b39e3f6ba567956e1ea0f46f7a85b2ce0124cfeb144 WatchSource:0}: Error finding container 0a0f098aefe0f39a03bf9b39e3f6ba567956e1ea0f46f7a85b2ce0124cfeb144: Status 404 returned error can't find the container with id 0a0f098aefe0f39a03bf9b39e3f6ba567956e1ea0f46f7a85b2ce0124cfeb144 Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.500785 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:15:07 crc kubenswrapper[5030]: W0120 23:15:07.547522 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5039f885_ec1c_48d9_8e06_cf672e3581df.slice/crio-8a7c192a3a9525f9af47eb6b8b657e1414c239f6f2e8d44ad7da72e45d07b7e3 WatchSource:0}: Error finding container 8a7c192a3a9525f9af47eb6b8b657e1414c239f6f2e8d44ad7da72e45d07b7e3: Status 404 returned error can't find the container with id 8a7c192a3a9525f9af47eb6b8b657e1414c239f6f2e8d44ad7da72e45d07b7e3 Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.857442 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.861578 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.871230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.873100 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.875013 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.876711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.877885 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.877918 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-ctwg4" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.895016 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.896835 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.933884 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv9x8\" (UniqueName: \"kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945333 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.945353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.949657 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:15:07 crc kubenswrapper[5030]: I0120 23:15:07.958505 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.046875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9x8\" (UniqueName: \"kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qfw\" (UniqueName: \"kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047632 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h78r\" (UniqueName: \"kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.047878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.048648 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.053153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.053781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.054870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.055060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.056680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.061013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.083275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9x8\" (UniqueName: \"kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.123842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.129229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerStarted","Data":"5f28b962d9cd9e0f5d731673f323a4101176d3bfafc6679aeb1a997bbdac5c30"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.129272 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerStarted","Data":"0a0f098aefe0f39a03bf9b39e3f6ba567956e1ea0f46f7a85b2ce0124cfeb144"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h78r\" (UniqueName: \"kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: 
\"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54qfw\" (UniqueName: \"kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.149915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: 
\"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.151885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.151934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152666 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerStarted","Data":"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerStarted","Data":"a85ca7b9bd72973fcabfe3800e05c7eebed97699919da36c88f980de7144d275"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.152974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.153422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.153536 5030 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") device mount path \"/mnt/openstack/pv12\"" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.159025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.160022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.160063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.161326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.162288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.165377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.180124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h78r\" (UniqueName: \"kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.185655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qfw\" (UniqueName: \"kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.188007 5030 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.190873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerStarted","Data":"784863a42052e57b14121f049e5cd14fe8a4044d16008f175d245e4a44552a4e"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.213891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerStarted","Data":"f161398cbd5641c14973f948a7a3f2a59ed72f415935e18bc8af68a8d9fd3226"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.236093 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.236967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-cell1-galera-1\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.237111 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.240319 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.240470 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.253426 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-twj8h" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.253864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-2\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.282797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerStarted","Data":"5f313fcb94932129478d90e13f8338fba0016c110147ee11418a6747ebcfed4d"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.282855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerStarted","Data":"8a7c192a3a9525f9af47eb6b8b657e1414c239f6f2e8d44ad7da72e45d07b7e3"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.282872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerStarted","Data":"e06d7030eda4e797f3bf0fb208cadb2bda83a4dfc38d8656bdc15ff205433692"} Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.282887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.507658 5030 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.518748 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.582540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.582604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdk8\" (UniqueName: \"kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.582687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.582708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.582768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.684379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.684465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.684501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdk8\" (UniqueName: \"kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.684534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.684552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.685340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.685984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.689205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.689366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.702280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdk8\" (UniqueName: \"kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8\") pod \"memcached-0\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.851374 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:15:08 crc kubenswrapper[5030]: W0120 23:15:08.852999 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod473c0915_5bed_4dc5_8b2a_7899b05b9afc.slice/crio-35fc66e37795e49ff03010bae4833bf0b0a1dba482694a2ee04b6136ab9b892f WatchSource:0}: Error finding container 35fc66e37795e49ff03010bae4833bf0b0a1dba482694a2ee04b6136ab9b892f: Status 404 returned error can't find the container with id 35fc66e37795e49ff03010bae4833bf0b0a1dba482694a2ee04b6136ab9b892f Jan 20 23:15:08 crc kubenswrapper[5030]: I0120 23:15:08.945069 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.050367 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:15:09 crc kubenswrapper[5030]: W0120 23:15:09.056439 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5da8d3_64b4_4c3b_a154_58eaf9379610.slice/crio-dd86c9ea545cf980466c9bb41e8a660751838f63337ce1194166b6842f3b0ea6 WatchSource:0}: Error finding container dd86c9ea545cf980466c9bb41e8a660751838f63337ce1194166b6842f3b0ea6: Status 404 returned error can't find the container with id dd86c9ea545cf980466c9bb41e8a660751838f63337ce1194166b6842f3b0ea6 Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.061131 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.296577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerStarted","Data":"be373b7a9672d0d64b714b3087d91961119dd1aa33d4a8808c64a1422b4c41fa"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.296678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerStarted","Data":"bbdc232122010f480fbcaeb0a155ca38341653bfb3c4d7e16f45693381bcb6ec"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.299431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerStarted","Data":"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.299483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerStarted","Data":"35fc66e37795e49ff03010bae4833bf0b0a1dba482694a2ee04b6136ab9b892f"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.302427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerStarted","Data":"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.302488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerStarted","Data":"dd86c9ea545cf980466c9bb41e8a660751838f63337ce1194166b6842f3b0ea6"} Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.415348 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:15:09 crc kubenswrapper[5030]: W0120 23:15:09.421135 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfaa83b_13e1_4a2f_a722_b0583f2641ed.slice/crio-81fdaf95be809cb6dffe4401b2865b42b619c06a292132bb661f33734b3abc3f WatchSource:0}: Error finding container 81fdaf95be809cb6dffe4401b2865b42b619c06a292132bb661f33734b3abc3f: Status 404 returned error can't find the container with id 
81fdaf95be809cb6dffe4401b2865b42b619c06a292132bb661f33734b3abc3f Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.833179 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.834137 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.837227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-nktrs" Jan 20 23:15:09 crc kubenswrapper[5030]: I0120 23:15:09.850334 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.008899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5f8l\" (UniqueName: \"kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l\") pod \"kube-state-metrics-0\" (UID: \"4147eacc-6d45-4c82-a803-fe339fa6ae6f\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.110357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5f8l\" (UniqueName: \"kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l\") pod \"kube-state-metrics-0\" (UID: \"4147eacc-6d45-4c82-a803-fe339fa6ae6f\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.125711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5f8l\" (UniqueName: \"kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l\") pod \"kube-state-metrics-0\" (UID: \"4147eacc-6d45-4c82-a803-fe339fa6ae6f\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.150740 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.314812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"8dfaa83b-13e1-4a2f-a722-b0583f2641ed","Type":"ContainerStarted","Data":"2b4f10ed0e4a73fb5cad8e46980901e7c694bff893bde9501737b27debf1b796"} Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.314885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"8dfaa83b-13e1-4a2f-a722-b0583f2641ed","Type":"ContainerStarted","Data":"81fdaf95be809cb6dffe4401b2865b42b619c06a292132bb661f33734b3abc3f"} Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.315805 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.348628 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.348595038 podStartE2EDuration="2.348595038s" podCreationTimestamp="2026-01-20 23:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:10.34127259 +0000 UTC m=+2382.661532888" watchObservedRunningTime="2026-01-20 23:15:10.348595038 +0000 UTC m=+2382.668855326" Jan 20 23:15:10 crc kubenswrapper[5030]: I0120 23:15:10.620948 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:15:11 crc kubenswrapper[5030]: I0120 23:15:11.327689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4147eacc-6d45-4c82-a803-fe339fa6ae6f","Type":"ContainerStarted","Data":"9aa42067b73c5428ecbf3ae60386d8e35417c37e2a9c6f37bf6fbe92dae11307"} Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.337296 5030 generic.go:334] "Generic (PLEG): container finished" podID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerID="5f313fcb94932129478d90e13f8338fba0016c110147ee11418a6747ebcfed4d" exitCode=0 Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.337418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerDied","Data":"5f313fcb94932129478d90e13f8338fba0016c110147ee11418a6747ebcfed4d"} Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.339330 5030 generic.go:334] "Generic (PLEG): container finished" podID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerID="5f28b962d9cd9e0f5d731673f323a4101176d3bfafc6679aeb1a997bbdac5c30" exitCode=0 Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.339393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerDied","Data":"5f28b962d9cd9e0f5d731673f323a4101176d3bfafc6679aeb1a997bbdac5c30"} Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.343104 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerID="8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354" exitCode=0 Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.343134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerDied","Data":"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354"} Jan 20 23:15:12 crc kubenswrapper[5030]: I0120 23:15:12.961520 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:15:12 crc kubenswrapper[5030]: E0120 23:15:12.962073 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.355736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4147eacc-6d45-4c82-a803-fe339fa6ae6f","Type":"ContainerStarted","Data":"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.355816 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.358996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerStarted","Data":"7e4d3b88e2537910ea5a1bb30373dd22a7907ac81c0ca61d89885761256dedd0"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.361827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerStarted","Data":"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.364136 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerID="42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8" exitCode=0 Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.364188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerDied","Data":"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.367252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerStarted","Data":"6970aa8432c4959152b2e372b5408ff4682d8c4aae0f06b65a11779ac7a8e3a0"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.372058 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef744864-0218-4aad-9ffa-a926311790dd" containerID="be373b7a9672d0d64b714b3087d91961119dd1aa33d4a8808c64a1422b4c41fa" exitCode=0 Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.372116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerDied","Data":"be373b7a9672d0d64b714b3087d91961119dd1aa33d4a8808c64a1422b4c41fa"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.377019 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerID="535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d" exitCode=0 Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.377086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerDied","Data":"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d"} Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.399710 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.882114021 podStartE2EDuration="4.39968498s" podCreationTimestamp="2026-01-20 23:15:09 +0000 UTC" firstStartedPulling="2026-01-20 23:15:10.624616499 +0000 UTC m=+2382.944876797" lastFinishedPulling="2026-01-20 23:15:12.142187468 +0000 UTC m=+2384.462447756" observedRunningTime="2026-01-20 23:15:13.381969422 +0000 UTC m=+2385.702229720" watchObservedRunningTime="2026-01-20 23:15:13.39968498 +0000 UTC m=+2385.719945278" Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.425923 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.425907413000001 podStartE2EDuration="8.425907413s" podCreationTimestamp="2026-01-20 23:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:13.41546602 +0000 UTC m=+2385.735726338" watchObservedRunningTime="2026-01-20 23:15:13.425907413 +0000 UTC m=+2385.746167701" Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.449082 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-2" podStartSLOduration=8.449063941 podStartE2EDuration="8.449063941s" podCreationTimestamp="2026-01-20 23:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:13.444782768 +0000 UTC m=+2385.765043126" watchObservedRunningTime="2026-01-20 23:15:13.449063941 +0000 UTC m=+2385.769324249" Jan 20 23:15:13 crc kubenswrapper[5030]: I0120 23:15:13.500857 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-1" podStartSLOduration=8.500838111 podStartE2EDuration="8.500838111s" podCreationTimestamp="2026-01-20 23:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:13.494322123 +0000 UTC m=+2385.814582421" watchObservedRunningTime="2026-01-20 23:15:13.500838111 +0000 UTC m=+2385.821098399" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.394330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerStarted","Data":"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7"} Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.400826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerStarted","Data":"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927"} Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.417588 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerStarted","Data":"a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23"} Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.423450 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podStartSLOduration=8.423432029 podStartE2EDuration="8.423432029s" podCreationTimestamp="2026-01-20 23:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:14.420229082 +0000 UTC m=+2386.740489410" watchObservedRunningTime="2026-01-20 23:15:14.423432029 +0000 UTC m=+2386.743692327" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.460169 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=8.460144164999999 podStartE2EDuration="8.460144165s" podCreationTimestamp="2026-01-20 23:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:14.454283534 +0000 UTC m=+2386.774543852" watchObservedRunningTime="2026-01-20 23:15:14.460144165 +0000 UTC m=+2386.780404463" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.477941 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-1" podStartSLOduration=8.477914094 podStartE2EDuration="8.477914094s" podCreationTimestamp="2026-01-20 23:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:14.477220287 +0000 UTC m=+2386.797480585" watchObservedRunningTime="2026-01-20 23:15:14.477914094 +0000 UTC m=+2386.798174422" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.803343 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.804711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.808021 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.808028 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.808454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-98tvr" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.814147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.814355 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.824873 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.887719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.887779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zc8\" (UniqueName: \"kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.887813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.887871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.888080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.888128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 
23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.888253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.888353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990631 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zc8\" (UniqueName: \"kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.990765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.991087 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.991604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.992294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.992562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.996433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.998488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:14 crc kubenswrapper[5030]: I0120 23:15:14.999939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:15 crc kubenswrapper[5030]: I0120 23:15:15.020432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zc8\" (UniqueName: \"kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:15 crc kubenswrapper[5030]: I0120 23:15:15.020914 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:15 crc kubenswrapper[5030]: I0120 23:15:15.123538 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:15 crc kubenswrapper[5030]: I0120 23:15:15.558451 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:15:15 crc kubenswrapper[5030]: W0120 23:15:15.565894 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7b5a2a9_c6db_4d98_b203_5503e4b7075c.slice/crio-dc3ceb45ba211ba7d0009d6893d08a4f3048c964f22e393caac74abdae209589 WatchSource:0}: Error finding container dc3ceb45ba211ba7d0009d6893d08a4f3048c964f22e393caac74abdae209589: Status 404 returned error can't find the container with id dc3ceb45ba211ba7d0009d6893d08a4f3048c964f22e393caac74abdae209589 Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.146390 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.148915 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.153344 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.153556 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.154028 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-xl4ff" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.159295 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.163174 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26v98\" (UniqueName: \"kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.213798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") 
pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26v98\" (UniqueName: \"kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316827 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.316851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.317127 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.317154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.318039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.318988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.321396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.321687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.325268 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.349793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26v98\" (UniqueName: \"kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.355337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.435848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerStarted","Data":"c985cc11f4e3ac6e3f0060dc9a127411c7d93295135d6768cf20b1612fe484c2"} Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.435894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerStarted","Data":"4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908"} Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.435905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerStarted","Data":"dc3ceb45ba211ba7d0009d6893d08a4f3048c964f22e393caac74abdae209589"} Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.457942 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.457920995 podStartE2EDuration="3.457920995s" podCreationTimestamp="2026-01-20 23:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:16.452251038 +0000 UTC m=+2388.772511336" watchObservedRunningTime="2026-01-20 23:15:16.457920995 +0000 UTC m=+2388.778181293" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.478642 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.891886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.892148 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.901051 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.901083 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.954268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.954322 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:16 crc kubenswrapper[5030]: I0120 23:15:16.974023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:15:16 crc kubenswrapper[5030]: W0120 23:15:16.979478 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef09e13_a7ab_4e00_a12c_fb544a0791ba.slice/crio-a54a17f7765517e607697e25e75f56100e9905b25af73aad95d6b390ac988ab6 WatchSource:0}: Error finding container a54a17f7765517e607697e25e75f56100e9905b25af73aad95d6b390ac988ab6: Status 404 returned error can't find the container with id a54a17f7765517e607697e25e75f56100e9905b25af73aad95d6b390ac988ab6 Jan 20 23:15:17 crc kubenswrapper[5030]: I0120 23:15:17.450806 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerStarted","Data":"993cc6b278b50127911dfb227a9f751177138cfc79c86f0047ff5f55bad4e71a"} Jan 20 23:15:17 crc kubenswrapper[5030]: I0120 23:15:17.450873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerStarted","Data":"acf7a1c2cf6b3d46585291710fe47cc13af65d867980957cc480cde40622afc4"} Jan 20 23:15:17 crc kubenswrapper[5030]: I0120 23:15:17.450895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerStarted","Data":"a54a17f7765517e607697e25e75f56100e9905b25af73aad95d6b390ac988ab6"} Jan 20 23:15:17 crc kubenswrapper[5030]: I0120 23:15:17.477350 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.477330309 podStartE2EDuration="2.477330309s" podCreationTimestamp="2026-01-20 23:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:17.474102982 +0000 UTC m=+2389.794363300" watchObservedRunningTime="2026-01-20 23:15:17.477330309 +0000 UTC m=+2389.797590617" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.123941 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.189402 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.189801 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.190167 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.465856 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.508579 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.508710 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.519693 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.519767 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:18 crc kubenswrapper[5030]: I0120 23:15:18.947287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:15:19 crc kubenswrapper[5030]: I0120 23:15:19.479104 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:20 crc kubenswrapper[5030]: I0120 23:15:20.161702 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:15:20 crc kubenswrapper[5030]: I0120 23:15:20.179484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.158111 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.163186 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.165357 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-6zwff" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.165640 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.165801 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.165807 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.186323 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.210390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.210438 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.210471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.210667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6h2\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.210870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.312272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6h2\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.312538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: 
\"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.312658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.312681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.312702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.312883 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.312902 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.312954 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift podName:d060c583-6bca-4692-a1e4-84ebc743f826 nodeName:}" failed. No retries permitted until 2026-01-20 23:15:21.812932987 +0000 UTC m=+2394.133193275 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift") pod "swift-storage-0" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826") : configmap "swift-ring-files" not found Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.313356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.313399 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.313637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.337522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6h2\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.343685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.478839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.531994 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.630532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:15:21 crc kubenswrapper[5030]: I0120 23:15:21.821234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.821376 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.821404 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:15:21 crc kubenswrapper[5030]: E0120 23:15:21.821470 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift podName:d060c583-6bca-4692-a1e4-84ebc743f826 nodeName:}" failed. No retries permitted until 2026-01-20 23:15:22.821449641 +0000 UTC m=+2395.141709929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift") pod "swift-storage-0" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826") : configmap "swift-ring-files" not found Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.532087 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.602935 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.816548 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.818341 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.824115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.828159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-s5x8l" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.828208 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.828295 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.833971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.841247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:22 crc kubenswrapper[5030]: E0120 23:15:22.841636 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:15:22 crc kubenswrapper[5030]: E0120 23:15:22.841665 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:15:22 crc kubenswrapper[5030]: E0120 23:15:22.841757 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift podName:d060c583-6bca-4692-a1e4-84ebc743f826 nodeName:}" failed. No retries permitted until 2026-01-20 23:15:24.841734406 +0000 UTC m=+2397.161994704 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift") pod "swift-storage-0" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826") : configmap "swift-ring-files" not found Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmjj5\" (UniqueName: \"kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.942947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:22 crc kubenswrapper[5030]: I0120 23:15:22.943047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmjj5\" (UniqueName: \"kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.044481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.045481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.045703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.045822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.050165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.050830 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.050967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.059711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmjj5\" (UniqueName: \"kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5\") pod \"ovn-northd-0\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.150407 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:23 crc kubenswrapper[5030]: W0120 23:15:23.598203 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod683bb59f_0c0f_4f12_970b_b786e886cae3.slice/crio-7da788db8486da11320cca377df018d7a71cd8aee6416603d06a9002be1b267a WatchSource:0}: Error finding container 7da788db8486da11320cca377df018d7a71cd8aee6416603d06a9002be1b267a: Status 404 returned error can't find the container with id 7da788db8486da11320cca377df018d7a71cd8aee6416603d06a9002be1b267a Jan 20 23:15:23 crc kubenswrapper[5030]: I0120 23:15:23.598397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:15:24 crc kubenswrapper[5030]: I0120 23:15:24.528133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerStarted","Data":"0838f1d78c9937fe1578213b08a61215c2546c6cc57d57d9399d671aecb5ea39"} Jan 20 23:15:24 crc kubenswrapper[5030]: I0120 23:15:24.528501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerStarted","Data":"f0b6f096253aba52285115aadb119a7459485eb7348095df5221d30a264dfa06"} Jan 20 23:15:24 crc kubenswrapper[5030]: I0120 23:15:24.528525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerStarted","Data":"7da788db8486da11320cca377df018d7a71cd8aee6416603d06a9002be1b267a"} Jan 20 23:15:24 crc kubenswrapper[5030]: I0120 23:15:24.528574 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:24 crc kubenswrapper[5030]: I0120 23:15:24.873841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:24 crc kubenswrapper[5030]: E0120 23:15:24.874043 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not 
found Jan 20 23:15:24 crc kubenswrapper[5030]: E0120 23:15:24.874060 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:15:24 crc kubenswrapper[5030]: E0120 23:15:24.874113 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift podName:d060c583-6bca-4692-a1e4-84ebc743f826 nodeName:}" failed. No retries permitted until 2026-01-20 23:15:28.87409503 +0000 UTC m=+2401.194355318 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift") pod "swift-storage-0" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826") : configmap "swift-ring-files" not found Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.004063 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.004040647 podStartE2EDuration="3.004040647s" podCreationTimestamp="2026-01-20 23:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:24.562976831 +0000 UTC m=+2396.883237149" watchObservedRunningTime="2026-01-20 23:15:25.004040647 +0000 UTC m=+2397.324300935" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.007966 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-fgr22"] Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.009580 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.015837 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.017034 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.025403 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.031058 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-fgr22"] Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.178767 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj6d5\" (UniqueName: \"kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.178906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.178954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.178998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.179045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.179064 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.179154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.280821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281527 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj6d5\" (UniqueName: \"kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.281314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.282868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.282869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.287115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.287728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.288264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.305455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj6d5\" (UniqueName: 
\"kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5\") pod \"swift-ring-rebalance-fgr22\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.342369 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.415380 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pkd7j"] Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.416478 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.418571 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.453694 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pkd7j"] Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.588420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts\") pod \"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.588500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2zq\" (UniqueName: \"kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq\") pod \"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.690531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts\") pod \"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.690870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2zq\" (UniqueName: \"kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq\") pod \"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.691587 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts\") pod \"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.712932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2zq\" (UniqueName: \"kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq\") pod 
\"root-account-create-update-pkd7j\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.793468 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:25 crc kubenswrapper[5030]: I0120 23:15:25.839396 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-fgr22"] Jan 20 23:15:25 crc kubenswrapper[5030]: W0120 23:15:25.843759 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68dec5c8_ac10_4af8_b442_11df66af8d8f.slice/crio-8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81 WatchSource:0}: Error finding container 8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81: Status 404 returned error can't find the container with id 8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81 Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.272262 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pkd7j"] Jan 20 23:15:26 crc kubenswrapper[5030]: W0120 23:15:26.281095 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod373ab197_e116_4ebc_9fda_0dbf359b8a58.slice/crio-1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8 WatchSource:0}: Error finding container 1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8: Status 404 returned error can't find the container with id 1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8 Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.546853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" event={"ID":"68dec5c8-ac10-4af8-b442-11df66af8d8f","Type":"ContainerStarted","Data":"d1a2aa936166935c4ce2906d91846b385b067a38b360d28d89c7fd9c07cc9408"} Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.547145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" event={"ID":"68dec5c8-ac10-4af8-b442-11df66af8d8f","Type":"ContainerStarted","Data":"8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81"} Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.547767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" event={"ID":"373ab197-e116-4ebc-9fda-0dbf359b8a58","Type":"ContainerStarted","Data":"1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8"} Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.665792 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.702276 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" podStartSLOduration=2.702259496 podStartE2EDuration="2.702259496s" podCreationTimestamp="2026-01-20 23:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:26.571664013 +0000 UTC m=+2398.891924311" watchObservedRunningTime="2026-01-20 23:15:26.702259496 +0000 UTC m=+2399.022519784" Jan 20 23:15:26 crc 
kubenswrapper[5030]: I0120 23:15:26.749374 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:15:26 crc kubenswrapper[5030]: I0120 23:15:26.963345 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:15:26 crc kubenswrapper[5030]: E0120 23:15:26.963599 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:27 crc kubenswrapper[5030]: I0120 23:15:27.015103 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-2" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="galera" probeResult="failure" output=< Jan 20 23:15:27 crc kubenswrapper[5030]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 20 23:15:27 crc kubenswrapper[5030]: > Jan 20 23:15:27 crc kubenswrapper[5030]: E0120 23:15:27.162907 5030 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.9:60766->38.102.83.9:37955: write tcp 192.168.126.11:10250->192.168.126.11:45210: write: broken pipe Jan 20 23:15:27 crc kubenswrapper[5030]: I0120 23:15:27.567858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" event={"ID":"373ab197-e116-4ebc-9fda-0dbf359b8a58","Type":"ContainerDied","Data":"23ab3d19a3162b3c362222b9220b0da88a73e77ca2fa8dc2831bc0408c8efb1d"} Jan 20 23:15:27 crc kubenswrapper[5030]: I0120 23:15:27.567846 5030 generic.go:334] "Generic (PLEG): container finished" podID="373ab197-e116-4ebc-9fda-0dbf359b8a58" containerID="23ab3d19a3162b3c362222b9220b0da88a73e77ca2fa8dc2831bc0408c8efb1d" exitCode=0 Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.052467 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-4rdsx"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.054390 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.060944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-4rdsx"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.197844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftdg\" (UniqueName: \"kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.197967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.215497 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-tc777"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.216962 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.218811 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.228006 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-tc777"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.299227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.299771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7shq\" (UniqueName: \"kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.299953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.300097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftdg\" (UniqueName: \"kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " 
pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.299982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.326345 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftdg\" (UniqueName: \"kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg\") pod \"keystone-db-create-4rdsx\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.388863 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.401835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.401874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7shq\" (UniqueName: \"kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.402806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.428853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7shq\" (UniqueName: \"kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq\") pod \"keystone-3977-account-create-update-tc777\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.433555 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-dlbqk"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.437414 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.446921 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dlbqk"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.502925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcppk\" (UniqueName: \"kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk\") pod \"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.503492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts\") pod \"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.537819 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.556553 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.558131 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.564223 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.584085 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.605481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts\") pod \"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.605593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcppk\" (UniqueName: \"kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk\") pod \"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.607359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts\") pod \"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.631174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcppk\" (UniqueName: \"kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk\") pod 
\"placement-db-create-dlbqk\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.706975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxbt\" (UniqueName: \"kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.707074 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.808911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mxbt\" (UniqueName: \"kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.809051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.809760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.815881 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.824844 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mxbt\" (UniqueName: \"kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt\") pod \"placement-e5c6-account-create-update-s9wrb\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.910609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:28 crc kubenswrapper[5030]: E0120 23:15:28.910766 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:15:28 crc kubenswrapper[5030]: E0120 23:15:28.910793 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:15:28 crc kubenswrapper[5030]: E0120 23:15:28.910853 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift podName:d060c583-6bca-4692-a1e4-84ebc743f826 nodeName:}" failed. No retries permitted until 2026-01-20 23:15:36.910833182 +0000 UTC m=+2409.231093470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift") pod "swift-storage-0" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826") : configmap "swift-ring-files" not found Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.938442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-4rdsx"] Jan 20 23:15:28 crc kubenswrapper[5030]: I0120 23:15:28.942136 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:28 crc kubenswrapper[5030]: W0120 23:15:28.948737 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271b7d5f_4799_47ce_b503_0e41bfc34751.slice/crio-95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904 WatchSource:0}: Error finding container 95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904: Status 404 returned error can't find the container with id 95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904 Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.089962 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.144924 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-gnnv9"] Jan 20 23:15:29 crc kubenswrapper[5030]: E0120 23:15:29.145268 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="373ab197-e116-4ebc-9fda-0dbf359b8a58" containerName="mariadb-account-create-update" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.145280 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="373ab197-e116-4ebc-9fda-0dbf359b8a58" containerName="mariadb-account-create-update" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.145449 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="373ab197-e116-4ebc-9fda-0dbf359b8a58" containerName="mariadb-account-create-update" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.146042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.170011 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gnnv9"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.181435 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-tc777"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.220436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts\") pod \"373ab197-e116-4ebc-9fda-0dbf359b8a58\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.220533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2zq\" (UniqueName: \"kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq\") pod \"373ab197-e116-4ebc-9fda-0dbf359b8a58\" (UID: \"373ab197-e116-4ebc-9fda-0dbf359b8a58\") " Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.220821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2cqh\" (UniqueName: \"kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.220866 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.221064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "373ab197-e116-4ebc-9fda-0dbf359b8a58" (UID: "373ab197-e116-4ebc-9fda-0dbf359b8a58"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.230746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq" (OuterVolumeSpecName: "kube-api-access-xz2zq") pod "373ab197-e116-4ebc-9fda-0dbf359b8a58" (UID: "373ab197-e116-4ebc-9fda-0dbf359b8a58"). InnerVolumeSpecName "kube-api-access-xz2zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.262931 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-xgf2v"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.264511 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.266512 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.271855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-xgf2v"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.322812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2cqh\" (UniqueName: \"kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.322864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.322911 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/373ab197-e116-4ebc-9fda-0dbf359b8a58-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.322922 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2zq\" (UniqueName: \"kubernetes.io/projected/373ab197-e116-4ebc-9fda-0dbf359b8a58-kube-api-access-xz2zq\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.323485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.352258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2cqh\" (UniqueName: \"kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh\") pod \"glance-db-create-gnnv9\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.414839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/placement-db-create-dlbqk"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.424541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.424669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrxt\" (UniqueName: \"kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.464071 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.525612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrxt\" (UniqueName: \"kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.525725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.526399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.569075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrxt\" (UniqueName: \"kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt\") pod \"glance-1026-account-create-update-xgf2v\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.573213 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb"] Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.592005 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.613722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" event={"ID":"f74ad968-4584-45a0-b1e9-783c98abb6f4","Type":"ContainerStarted","Data":"792cb958e3010513ca28d399f852b0e27ce52b1f4fb7f79b6e5a138bf32c9351"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.613763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" event={"ID":"f74ad968-4584-45a0-b1e9-783c98abb6f4","Type":"ContainerStarted","Data":"3f02e3d07d4ce3f9246651fe51ae6d710d15d66cba2c0baa070b775b79bc2616"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.624782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dlbqk" event={"ID":"359366f6-13e3-4ce0-be91-1ecc9bd33613","Type":"ContainerStarted","Data":"e7bbab20ff89d9be21c42a4fd78f6187a4a9444c98ad28ea0bfebb04e110314e"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.631998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" event={"ID":"271b7d5f-4799-47ce-b503-0e41bfc34751","Type":"ContainerStarted","Data":"6f5089b9d57c3414c07cfe3a78f2c07bcbe1e7a78e9ee5624df9d18e36a5fd93"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.632039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" event={"ID":"271b7d5f-4799-47ce-b503-0e41bfc34751","Type":"ContainerStarted","Data":"95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.644904 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" podStartSLOduration=1.644886329 podStartE2EDuration="1.644886329s" podCreationTimestamp="2026-01-20 23:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:29.642146563 +0000 UTC m=+2401.962406851" watchObservedRunningTime="2026-01-20 23:15:29.644886329 +0000 UTC m=+2401.965146607" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.646121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" event={"ID":"373ab197-e116-4ebc-9fda-0dbf359b8a58","Type":"ContainerDied","Data":"1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8"} Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.646176 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c9df2fc67a7ce5efe7b4dca3853ea8cb8dd5a84b73e972a26115cb67ff0aaa8" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.646316 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-pkd7j" Jan 20 23:15:29 crc kubenswrapper[5030]: I0120 23:15:29.692512 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" podStartSLOduration=1.692491398 podStartE2EDuration="1.692491398s" podCreationTimestamp="2026-01-20 23:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:29.667995318 +0000 UTC m=+2401.988255606" watchObservedRunningTime="2026-01-20 23:15:29.692491398 +0000 UTC m=+2402.012751686" Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.166078 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-xgf2v"] Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.172652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gnnv9"] Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.658137 5030 generic.go:334] "Generic (PLEG): container finished" podID="359366f6-13e3-4ce0-be91-1ecc9bd33613" containerID="f8d50c0f4240936dce63e32dc4fe73758a74f2c14533a746b39d074c52eb8285" exitCode=0 Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.658222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dlbqk" event={"ID":"359366f6-13e3-4ce0-be91-1ecc9bd33613","Type":"ContainerDied","Data":"f8d50c0f4240936dce63e32dc4fe73758a74f2c14533a746b39d074c52eb8285"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.661182 5030 generic.go:334] "Generic (PLEG): container finished" podID="271b7d5f-4799-47ce-b503-0e41bfc34751" containerID="6f5089b9d57c3414c07cfe3a78f2c07bcbe1e7a78e9ee5624df9d18e36a5fd93" exitCode=0 Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.661292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" event={"ID":"271b7d5f-4799-47ce-b503-0e41bfc34751","Type":"ContainerDied","Data":"6f5089b9d57c3414c07cfe3a78f2c07bcbe1e7a78e9ee5624df9d18e36a5fd93"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.663388 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e70d93d-ba09-4dcc-95b1-f25a68604a29" containerID="194f5ba11638676e1a1ddfbc95a8db6d66d354d3f02a3aca30a220cde44f7c5e" exitCode=0 Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.663462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" event={"ID":"3e70d93d-ba09-4dcc-95b1-f25a68604a29","Type":"ContainerDied","Data":"194f5ba11638676e1a1ddfbc95a8db6d66d354d3f02a3aca30a220cde44f7c5e"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.663486 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" event={"ID":"3e70d93d-ba09-4dcc-95b1-f25a68604a29","Type":"ContainerStarted","Data":"e1fd85f18e69d169e5e068a4bb42930a74bb189cd974d93247443a3c8844087a"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.676427 5030 generic.go:334] "Generic (PLEG): container finished" podID="3313a132-ec2a-462d-ad27-645c76243ac6" containerID="b4464c2867c3edb0059011581cf4f977a29af49058830c50e8ad505fb9d5fbc6" exitCode=0 Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.676658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gnnv9" 
event={"ID":"3313a132-ec2a-462d-ad27-645c76243ac6","Type":"ContainerDied","Data":"b4464c2867c3edb0059011581cf4f977a29af49058830c50e8ad505fb9d5fbc6"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.676692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gnnv9" event={"ID":"3313a132-ec2a-462d-ad27-645c76243ac6","Type":"ContainerStarted","Data":"406e9f9f1b42e074d60d2c8aab3e0aa5595c4a87549a59553d07c0f3e035bde1"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.683827 5030 generic.go:334] "Generic (PLEG): container finished" podID="f74ad968-4584-45a0-b1e9-783c98abb6f4" containerID="792cb958e3010513ca28d399f852b0e27ce52b1f4fb7f79b6e5a138bf32c9351" exitCode=0 Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.684203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" event={"ID":"f74ad968-4584-45a0-b1e9-783c98abb6f4","Type":"ContainerDied","Data":"792cb958e3010513ca28d399f852b0e27ce52b1f4fb7f79b6e5a138bf32c9351"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.694325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" event={"ID":"7301c9ea-0137-4118-9bff-ab6934177bb5","Type":"ContainerStarted","Data":"e57aabed19e5fba72fb49b809d6dcc31f2a0753495ac995fd3a0afcefc01d277"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.694358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" event={"ID":"7301c9ea-0137-4118-9bff-ab6934177bb5","Type":"ContainerStarted","Data":"93589a794b5962f726cf9d85c061f8be54cc269c679c1bcfba8499374b80c96d"} Jan 20 23:15:30 crc kubenswrapper[5030]: I0120 23:15:30.773343 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" podStartSLOduration=1.773320366 podStartE2EDuration="1.773320366s" podCreationTimestamp="2026-01-20 23:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:30.746229982 +0000 UTC m=+2403.066490270" watchObservedRunningTime="2026-01-20 23:15:30.773320366 +0000 UTC m=+2403.093580654" Jan 20 23:15:31 crc kubenswrapper[5030]: I0120 23:15:31.707576 5030 generic.go:334] "Generic (PLEG): container finished" podID="7301c9ea-0137-4118-9bff-ab6934177bb5" containerID="e57aabed19e5fba72fb49b809d6dcc31f2a0753495ac995fd3a0afcefc01d277" exitCode=0 Jan 20 23:15:31 crc kubenswrapper[5030]: I0120 23:15:31.707808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" event={"ID":"7301c9ea-0137-4118-9bff-ab6934177bb5","Type":"ContainerDied","Data":"e57aabed19e5fba72fb49b809d6dcc31f2a0753495ac995fd3a0afcefc01d277"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.218516 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.283080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts\") pod \"3313a132-ec2a-462d-ad27-645c76243ac6\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.283139 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2cqh\" (UniqueName: \"kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh\") pod \"3313a132-ec2a-462d-ad27-645c76243ac6\" (UID: \"3313a132-ec2a-462d-ad27-645c76243ac6\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.283729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3313a132-ec2a-462d-ad27-645c76243ac6" (UID: "3313a132-ec2a-462d-ad27-645c76243ac6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.289638 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh" (OuterVolumeSpecName: "kube-api-access-x2cqh") pod "3313a132-ec2a-462d-ad27-645c76243ac6" (UID: "3313a132-ec2a-462d-ad27-645c76243ac6"). InnerVolumeSpecName "kube-api-access-x2cqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.384084 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.385012 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3313a132-ec2a-462d-ad27-645c76243ac6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.385041 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2cqh\" (UniqueName: \"kubernetes.io/projected/3313a132-ec2a-462d-ad27-645c76243ac6-kube-api-access-x2cqh\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.388265 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.393940 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.399253 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.486650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7shq\" (UniqueName: \"kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq\") pod \"f74ad968-4584-45a0-b1e9-783c98abb6f4\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.486691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts\") pod \"359366f6-13e3-4ce0-be91-1ecc9bd33613\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.486760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mxbt\" (UniqueName: \"kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt\") pod \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.486893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcppk\" (UniqueName: \"kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk\") pod \"359366f6-13e3-4ce0-be91-1ecc9bd33613\" (UID: \"359366f6-13e3-4ce0-be91-1ecc9bd33613\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.486988 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts\") pod \"271b7d5f-4799-47ce-b503-0e41bfc34751\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts\") pod \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\" (UID: \"3e70d93d-ba09-4dcc-95b1-f25a68604a29\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "359366f6-13e3-4ce0-be91-1ecc9bd33613" (UID: "359366f6-13e3-4ce0-be91-1ecc9bd33613"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "271b7d5f-4799-47ce-b503-0e41bfc34751" (UID: "271b7d5f-4799-47ce-b503-0e41bfc34751"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e70d93d-ba09-4dcc-95b1-f25a68604a29" (UID: "3e70d93d-ba09-4dcc-95b1-f25a68604a29"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cftdg\" (UniqueName: \"kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg\") pod \"271b7d5f-4799-47ce-b503-0e41bfc34751\" (UID: \"271b7d5f-4799-47ce-b503-0e41bfc34751\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.487977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts\") pod \"f74ad968-4584-45a0-b1e9-783c98abb6f4\" (UID: \"f74ad968-4584-45a0-b1e9-783c98abb6f4\") " Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.488439 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359366f6-13e3-4ce0-be91-1ecc9bd33613-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.488454 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/271b7d5f-4799-47ce-b503-0e41bfc34751-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.488464 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e70d93d-ba09-4dcc-95b1-f25a68604a29-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.489613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f74ad968-4584-45a0-b1e9-783c98abb6f4" (UID: "f74ad968-4584-45a0-b1e9-783c98abb6f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.489728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq" (OuterVolumeSpecName: "kube-api-access-k7shq") pod "f74ad968-4584-45a0-b1e9-783c98abb6f4" (UID: "f74ad968-4584-45a0-b1e9-783c98abb6f4"). InnerVolumeSpecName "kube-api-access-k7shq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.492731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk" (OuterVolumeSpecName: "kube-api-access-rcppk") pod "359366f6-13e3-4ce0-be91-1ecc9bd33613" (UID: "359366f6-13e3-4ce0-be91-1ecc9bd33613"). InnerVolumeSpecName "kube-api-access-rcppk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.494784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg" (OuterVolumeSpecName: "kube-api-access-cftdg") pod "271b7d5f-4799-47ce-b503-0e41bfc34751" (UID: "271b7d5f-4799-47ce-b503-0e41bfc34751"). InnerVolumeSpecName "kube-api-access-cftdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.497075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt" (OuterVolumeSpecName: "kube-api-access-8mxbt") pod "3e70d93d-ba09-4dcc-95b1-f25a68604a29" (UID: "3e70d93d-ba09-4dcc-95b1-f25a68604a29"). InnerVolumeSpecName "kube-api-access-8mxbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.589920 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7shq\" (UniqueName: \"kubernetes.io/projected/f74ad968-4584-45a0-b1e9-783c98abb6f4-kube-api-access-k7shq\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.589950 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mxbt\" (UniqueName: \"kubernetes.io/projected/3e70d93d-ba09-4dcc-95b1-f25a68604a29-kube-api-access-8mxbt\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.589961 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcppk\" (UniqueName: \"kubernetes.io/projected/359366f6-13e3-4ce0-be91-1ecc9bd33613-kube-api-access-rcppk\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.589970 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cftdg\" (UniqueName: \"kubernetes.io/projected/271b7d5f-4799-47ce-b503-0e41bfc34751-kube-api-access-cftdg\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.589981 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f74ad968-4584-45a0-b1e9-783c98abb6f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.718796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dlbqk" event={"ID":"359366f6-13e3-4ce0-be91-1ecc9bd33613","Type":"ContainerDied","Data":"e7bbab20ff89d9be21c42a4fd78f6187a4a9444c98ad28ea0bfebb04e110314e"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.718842 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7bbab20ff89d9be21c42a4fd78f6187a4a9444c98ad28ea0bfebb04e110314e" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.718894 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dlbqk" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.721344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" event={"ID":"271b7d5f-4799-47ce-b503-0e41bfc34751","Type":"ContainerDied","Data":"95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.721383 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95261b84405599ec06e2360e8f5903fe24355bf079c756907a3fd3fdb7c45904" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.721446 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-4rdsx" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.724429 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.724450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb" event={"ID":"3e70d93d-ba09-4dcc-95b1-f25a68604a29","Type":"ContainerDied","Data":"e1fd85f18e69d169e5e068a4bb42930a74bb189cd974d93247443a3c8844087a"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.724497 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1fd85f18e69d169e5e068a4bb42930a74bb189cd974d93247443a3c8844087a" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.725994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gnnv9" event={"ID":"3313a132-ec2a-462d-ad27-645c76243ac6","Type":"ContainerDied","Data":"406e9f9f1b42e074d60d2c8aab3e0aa5595c4a87549a59553d07c0f3e035bde1"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.726104 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="406e9f9f1b42e074d60d2c8aab3e0aa5595c4a87549a59553d07c0f3e035bde1" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.726049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gnnv9" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.728012 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.728523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-3977-account-create-update-tc777" event={"ID":"f74ad968-4584-45a0-b1e9-783c98abb6f4","Type":"ContainerDied","Data":"3f02e3d07d4ce3f9246651fe51ae6d710d15d66cba2c0baa070b775b79bc2616"} Jan 20 23:15:32 crc kubenswrapper[5030]: I0120 23:15:32.728559 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f02e3d07d4ce3f9246651fe51ae6d710d15d66cba2c0baa070b775b79bc2616" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.028800 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.097949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts\") pod \"7301c9ea-0137-4118-9bff-ab6934177bb5\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.098960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7301c9ea-0137-4118-9bff-ab6934177bb5" (UID: "7301c9ea-0137-4118-9bff-ab6934177bb5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.099146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcrxt\" (UniqueName: \"kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt\") pod \"7301c9ea-0137-4118-9bff-ab6934177bb5\" (UID: \"7301c9ea-0137-4118-9bff-ab6934177bb5\") " Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.101211 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7301c9ea-0137-4118-9bff-ab6934177bb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.104704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt" (OuterVolumeSpecName: "kube-api-access-rcrxt") pod "7301c9ea-0137-4118-9bff-ab6934177bb5" (UID: "7301c9ea-0137-4118-9bff-ab6934177bb5"). InnerVolumeSpecName "kube-api-access-rcrxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.219024 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcrxt\" (UniqueName: \"kubernetes.io/projected/7301c9ea-0137-4118-9bff-ab6934177bb5-kube-api-access-rcrxt\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.234318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.741109 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.741107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-xgf2v" event={"ID":"7301c9ea-0137-4118-9bff-ab6934177bb5","Type":"ContainerDied","Data":"93589a794b5962f726cf9d85c061f8be54cc269c679c1bcfba8499374b80c96d"} Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.741256 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93589a794b5962f726cf9d85c061f8be54cc269c679c1bcfba8499374b80c96d" Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.744471 5030 generic.go:334] "Generic (PLEG): container finished" podID="68dec5c8-ac10-4af8-b442-11df66af8d8f" containerID="d1a2aa936166935c4ce2906d91846b385b067a38b360d28d89c7fd9c07cc9408" exitCode=0 Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.744543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" event={"ID":"68dec5c8-ac10-4af8-b442-11df66af8d8f","Type":"ContainerDied","Data":"d1a2aa936166935c4ce2906d91846b385b067a38b360d28d89c7fd9c07cc9408"} Jan 20 23:15:33 crc kubenswrapper[5030]: I0120 23:15:33.919343 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.031734 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.571951 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-hn24q"] Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 
23:15:34.572249 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74ad968-4584-45a0-b1e9-783c98abb6f4" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572261 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74ad968-4584-45a0-b1e9-783c98abb6f4" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 23:15:34.572271 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271b7d5f-4799-47ce-b503-0e41bfc34751" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572277 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="271b7d5f-4799-47ce-b503-0e41bfc34751" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 23:15:34.572293 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3313a132-ec2a-462d-ad27-645c76243ac6" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572299 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3313a132-ec2a-462d-ad27-645c76243ac6" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 23:15:34.572313 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e70d93d-ba09-4dcc-95b1-f25a68604a29" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572319 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e70d93d-ba09-4dcc-95b1-f25a68604a29" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 23:15:34.572328 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7301c9ea-0137-4118-9bff-ab6934177bb5" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572334 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7301c9ea-0137-4118-9bff-ab6934177bb5" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: E0120 23:15:34.572344 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359366f6-13e3-4ce0-be91-1ecc9bd33613" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="359366f6-13e3-4ce0-be91-1ecc9bd33613" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572510 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3313a132-ec2a-462d-ad27-645c76243ac6" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572521 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="271b7d5f-4799-47ce-b503-0e41bfc34751" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572536 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7301c9ea-0137-4118-9bff-ab6934177bb5" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572546 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="359366f6-13e3-4ce0-be91-1ecc9bd33613" containerName="mariadb-database-create" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.572558 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74ad968-4584-45a0-b1e9-783c98abb6f4" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc 
kubenswrapper[5030]: I0120 23:15:34.572565 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e70d93d-ba09-4dcc-95b1-f25a68604a29" containerName="mariadb-account-create-update" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.573094 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.578813 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.578875 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-zvqxh" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.587360 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-hn24q"] Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.645151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.645224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.645245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.645349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpz6h\" (UniqueName: \"kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.746442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpz6h\" (UniqueName: \"kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.747569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.747706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.747796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.754322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.758140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.764358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.771358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpz6h\" (UniqueName: \"kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h\") pod \"glance-db-sync-hn24q\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:34 crc kubenswrapper[5030]: I0120 23:15:34.888637 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.069868 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.154811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.154928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.154963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj6d5\" (UniqueName: \"kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.155034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.155076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.155158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.155174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf\") pod \"68dec5c8-ac10-4af8-b442-11df66af8d8f\" (UID: \"68dec5c8-ac10-4af8-b442-11df66af8d8f\") " Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.156167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.157396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.161292 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5" (OuterVolumeSpecName: "kube-api-access-bj6d5") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "kube-api-access-bj6d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.164418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.183206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.187168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.187400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts" (OuterVolumeSpecName: "scripts") pod "68dec5c8-ac10-4af8-b442-11df66af8d8f" (UID: "68dec5c8-ac10-4af8-b442-11df66af8d8f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259072 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259220 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259255 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259274 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/68dec5c8-ac10-4af8-b442-11df66af8d8f-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259293 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj6d5\" (UniqueName: \"kubernetes.io/projected/68dec5c8-ac10-4af8-b442-11df66af8d8f-kube-api-access-bj6d5\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259318 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/68dec5c8-ac10-4af8-b442-11df66af8d8f-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.259335 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68dec5c8-ac10-4af8-b442-11df66af8d8f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.340556 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-hn24q"] Jan 20 23:15:35 crc kubenswrapper[5030]: W0120 23:15:35.342086 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod421b8192_d4d9_41a8_800d_3f3afab07ac7.slice/crio-f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9 WatchSource:0}: Error finding container f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9: Status 404 returned error can't find the container with id f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9 Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.763775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" event={"ID":"68dec5c8-ac10-4af8-b442-11df66af8d8f","Type":"ContainerDied","Data":"8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81"} Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.763849 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c9a814ad3b111edb4b6cf940958ca62372f6cbbce57301c52e43e4bc9917d81" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.763917 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-fgr22" Jan 20 23:15:35 crc kubenswrapper[5030]: I0120 23:15:35.771833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-hn24q" event={"ID":"421b8192-d4d9-41a8-800d-3f3afab07ac7","Type":"ContainerStarted","Data":"f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9"} Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.785438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-hn24q" event={"ID":"421b8192-d4d9-41a8-800d-3f3afab07ac7","Type":"ContainerStarted","Data":"ef9e28371f66994ada583894a1e45f03b589a60920c442f3c6622eb45d5ee18b"} Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.817995 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-hn24q" podStartSLOduration=2.817972172 podStartE2EDuration="2.817972172s" podCreationTimestamp="2026-01-20 23:15:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:36.811131507 +0000 UTC m=+2409.131391825" watchObservedRunningTime="2026-01-20 23:15:36.817972172 +0000 UTC m=+2409.138232490" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.861683 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pkd7j"] Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.871167 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-pkd7j"] Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.880613 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h948j"] Jan 20 23:15:36 crc kubenswrapper[5030]: E0120 23:15:36.880966 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68dec5c8-ac10-4af8-b442-11df66af8d8f" containerName="swift-ring-rebalance" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.880983 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68dec5c8-ac10-4af8-b442-11df66af8d8f" containerName="swift-ring-rebalance" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.881146 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68dec5c8-ac10-4af8-b442-11df66af8d8f" containerName="swift-ring-rebalance" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.881830 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.887796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.890969 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h948j"] Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.993018 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.993116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-587bt\" (UniqueName: \"kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:36 crc kubenswrapper[5030]: I0120 23:15:36.993235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.000090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"swift-storage-0\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.088532 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.094996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.095084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-587bt\" (UniqueName: \"kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.095712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.115057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-587bt\" (UniqueName: \"kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt\") pod \"root-account-create-update-h948j\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.203411 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.570848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:15:37 crc kubenswrapper[5030]: W0120 23:15:37.571317 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd060c583_6bca_4692_a1e4_84ebc743f826.slice/crio-49d8e170bf2a0fa87edd1e94b9adb4f397fa36dcb637ef4a5a52449a9620c987 WatchSource:0}: Error finding container 49d8e170bf2a0fa87edd1e94b9adb4f397fa36dcb637ef4a5a52449a9620c987: Status 404 returned error can't find the container with id 49d8e170bf2a0fa87edd1e94b9adb4f397fa36dcb637ef4a5a52449a9620c987 Jan 20 23:15:37 crc kubenswrapper[5030]: W0120 23:15:37.655852 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303c674e_2c9f_4166_97f9_98ec856371ae.slice/crio-d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea WatchSource:0}: Error finding container d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea: Status 404 returned error can't find the container with id d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.656574 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h948j"] Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.798928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h948j" event={"ID":"303c674e-2c9f-4166-97f9-98ec856371ae","Type":"ContainerStarted","Data":"d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea"} Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.812439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"86a2e3b58648f4c6865951dbe0e5fd173fe948355567c57408f5478873a50cea"} Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.812489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"49d8e170bf2a0fa87edd1e94b9adb4f397fa36dcb637ef4a5a52449a9620c987"} Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.898250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:37 crc kubenswrapper[5030]: I0120 23:15:37.975720 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="373ab197-e116-4ebc-9fda-0dbf359b8a58" path="/var/lib/kubelet/pods/373ab197-e116-4ebc-9fda-0dbf359b8a58/volumes" Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.010357 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.071913 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.154503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.640131 5030 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="galera" probeResult="failure" output=< Jan 20 23:15:38 crc kubenswrapper[5030]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 20 23:15:38 crc kubenswrapper[5030]: > Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.823305 5030 generic.go:334] "Generic (PLEG): container finished" podID="65e9d298-7338-47c3-8591-b7803798203f" containerID="752b6a46f4ce2e600f72af503766db84caedf6a5b37e0be36eaa583d522b93d0" exitCode=0 Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.823377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerDied","Data":"752b6a46f4ce2e600f72af503766db84caedf6a5b37e0be36eaa583d522b93d0"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"c172e50bb94cc43be14453c778229aa3ad0f5285b8f09db985f5f9ca934e07e0"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"8420aa4a66cecb302aa6d121c7a14ab12b3529e6eb3b2d23cd341bdb0404c01b"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"ddb4980b7a5fddbacfa0f9edafb461b4d52d06ebf1613fcacc617ee6aa6c2f36"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"9953d67dbf3825bda9fc957256627f35a45dbc98f02872cfe23353e4f525b038"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"2724c2174e7487a3d20c504c8b82914e661dd1586f42ae55254b615ffc14a493"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.829577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"e7a7f0b5138e5ce84a4aee902bbc98528f273ace28ad5b3739a5c7855fce0f41"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.831592 5030 generic.go:334] "Generic (PLEG): container finished" podID="a54757d6-0570-4290-a802-d82ff94dffb9" containerID="7d5497c08cbd622e49fcc51b99641a1df34a35f61b4bc17ebbc6a57130accf3b" exitCode=0 Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.831640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerDied","Data":"7d5497c08cbd622e49fcc51b99641a1df34a35f61b4bc17ebbc6a57130accf3b"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.833611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h948j" 
event={"ID":"303c674e-2c9f-4166-97f9-98ec856371ae","Type":"ContainerStarted","Data":"716936d721449a258bd4a806a8716ac0e4856d04d567e559f47e27a903f4413e"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.835490 5030 generic.go:334] "Generic (PLEG): container finished" podID="995342f6-fe0b-4188-976c-5151541dd002" containerID="ce04af762c42921ec01aa3670edc17c959abd025e114d214262bf6b03d163300" exitCode=0 Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.835668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerDied","Data":"ce04af762c42921ec01aa3670edc17c959abd025e114d214262bf6b03d163300"} Jan 20 23:15:38 crc kubenswrapper[5030]: I0120 23:15:38.939374 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-h948j" podStartSLOduration=2.939351895 podStartE2EDuration="2.939351895s" podCreationTimestamp="2026-01-20 23:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:38.919486034 +0000 UTC m=+2411.239746322" watchObservedRunningTime="2026-01-20 23:15:38.939351895 +0000 UTC m=+2411.259612183" Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.845922 5030 generic.go:334] "Generic (PLEG): container finished" podID="ed443431-2b5c-4533-8523-d01380c98c1d" containerID="e06d7030eda4e797f3bf0fb208cadb2bda83a4dfc38d8656bdc15ff205433692" exitCode=0 Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.846009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerDied","Data":"e06d7030eda4e797f3bf0fb208cadb2bda83a4dfc38d8656bdc15ff205433692"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.880753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerStarted","Data":"5ec228c62d11e892bccd8b3e7622fe27d4c99380436acc8ab9a70d07539cf3ba"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.881193 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.888290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"d0baa50e2392f25298e7d0f9821ab644a7212ae9a2f2e398ab1e0dc812fdc4da"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.888339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"5f30fa9f5b0648ade3945014008c72e70964fe50f71f44800cc8bb9e7ffd26fd"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.888350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"c3edaddce88c0d920a63257712c29ebc2820562a0cebc56bf7063f7045098b45"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.890700 5030 generic.go:334] "Generic (PLEG): container finished" podID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" 
containerID="784863a42052e57b14121f049e5cd14fe8a4044d16008f175d245e4a44552a4e" exitCode=0 Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.890812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerDied","Data":"784863a42052e57b14121f049e5cd14fe8a4044d16008f175d245e4a44552a4e"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.900646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerStarted","Data":"b4810b367aef312cb07a5bc212828530d847353295a4a0029f98ba6de0de7144"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.900849 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.902381 5030 generic.go:334] "Generic (PLEG): container finished" podID="303c674e-2c9f-4166-97f9-98ec856371ae" containerID="716936d721449a258bd4a806a8716ac0e4856d04d567e559f47e27a903f4413e" exitCode=0 Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.902444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h948j" event={"ID":"303c674e-2c9f-4166-97f9-98ec856371ae","Type":"ContainerDied","Data":"716936d721449a258bd4a806a8716ac0e4856d04d567e559f47e27a903f4413e"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.904945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerStarted","Data":"38eba0f78e9bf8588976acd650a6cbc2107e5878a8dbe1f929ebf5401b2d0697"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.905592 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.907042 5030 generic.go:334] "Generic (PLEG): container finished" podID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerID="f161398cbd5641c14973f948a7a3f2a59ed72f415935e18bc8af68a8d9fd3226" exitCode=0 Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.907070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerDied","Data":"f161398cbd5641c14973f948a7a3f2a59ed72f415935e18bc8af68a8d9fd3226"} Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.916374 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podStartSLOduration=37.916352955 podStartE2EDuration="37.916352955s" podCreationTimestamp="2026-01-20 23:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:39.908019204 +0000 UTC m=+2412.228279492" watchObservedRunningTime="2026-01-20 23:15:39.916352955 +0000 UTC m=+2412.236613243" Jan 20 23:15:39 crc kubenswrapper[5030]: I0120 23:15:39.944439 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podStartSLOduration=37.944422043 podStartE2EDuration="37.944422043s" podCreationTimestamp="2026-01-20 23:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-20 23:15:39.935005866 +0000 UTC m=+2412.255266154" watchObservedRunningTime="2026-01-20 23:15:39.944422043 +0000 UTC m=+2412.264682331" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.014790 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=38.014774511 podStartE2EDuration="38.014774511s" podCreationTimestamp="2026-01-20 23:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:40.014136605 +0000 UTC m=+2412.334396913" watchObservedRunningTime="2026-01-20 23:15:40.014774511 +0000 UTC m=+2412.335034789" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.932790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"c088151164804d480f3d602813acc6fe8110edbb60819ceaf2f0559e9b333bb1"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.933045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"d04f17499f96be1bcb743043670578f206129ddc5a8598a1d01aac043f27b4a5"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.933057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"ff021e7947768f870e170bfceeedcefa9f23164365486fe0e68cf658ba9ee312"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.936111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerStarted","Data":"e36b6c921d799153e5dd8b22436909b17571e1a429199d1f21643266c33dc577"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.936317 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.938010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerStarted","Data":"a74fc0dc19c36c183a9bd3ffda3fbb852f4467116d2b61c87b1bef566563caba"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.938865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.941845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerStarted","Data":"07c23d45fe75d1a2499dda70559be5d8b50e9dddf38b74b4c6637c72301d7701"} Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.942313 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.961907 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:15:40 crc kubenswrapper[5030]: E0120 23:15:40.962148 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.962711 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-2" podStartSLOduration=37.962696541 podStartE2EDuration="37.962696541s" podCreationTimestamp="2026-01-20 23:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:40.955689952 +0000 UTC m=+2413.275950250" watchObservedRunningTime="2026-01-20 23:15:40.962696541 +0000 UTC m=+2413.282956829" Jan 20 23:15:40 crc kubenswrapper[5030]: I0120 23:15:40.983799 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-1" podStartSLOduration=37.98378272 podStartE2EDuration="37.98378272s" podCreationTimestamp="2026-01-20 23:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:40.977664131 +0000 UTC m=+2413.297924429" watchObservedRunningTime="2026-01-20 23:15:40.98378272 +0000 UTC m=+2413.304043008" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.007649 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.007631505 podStartE2EDuration="38.007631505s" podCreationTimestamp="2026-01-20 23:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:41.003978337 +0000 UTC m=+2413.324238635" watchObservedRunningTime="2026-01-20 23:15:41.007631505 +0000 UTC m=+2413.327891793" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.174738 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.352563 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.421453 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.481925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-587bt\" (UniqueName: \"kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt\") pod \"303c674e-2c9f-4166-97f9-98ec856371ae\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.482098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts\") pod \"303c674e-2c9f-4166-97f9-98ec856371ae\" (UID: \"303c674e-2c9f-4166-97f9-98ec856371ae\") " Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.482989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "303c674e-2c9f-4166-97f9-98ec856371ae" (UID: "303c674e-2c9f-4166-97f9-98ec856371ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.489759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt" (OuterVolumeSpecName: "kube-api-access-587bt") pod "303c674e-2c9f-4166-97f9-98ec856371ae" (UID: "303c674e-2c9f-4166-97f9-98ec856371ae"). InnerVolumeSpecName "kube-api-access-587bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.583637 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/303c674e-2c9f-4166-97f9-98ec856371ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.583675 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-587bt\" (UniqueName: \"kubernetes.io/projected/303c674e-2c9f-4166-97f9-98ec856371ae-kube-api-access-587bt\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.951561 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-h948j" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.951740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-h948j" event={"ID":"303c674e-2c9f-4166-97f9-98ec856371ae","Type":"ContainerDied","Data":"d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea"} Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.951778 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d601f4fd40eb721382ad43906d0bfe3f0b3b79cf26d3ae6c8043bcfae0b182ea" Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.957361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"31bdfc8eeca403bd17e795b8cf31b719db2f4805a115c8604e8ad198f7327da9"} Jan 20 23:15:41 crc kubenswrapper[5030]: I0120 23:15:41.957405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerStarted","Data":"63b6c6d4bece1d87f4729a39f17170c16567071872bdc73d5549a42bd98e07ba"} Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.040218 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=22.040200737 podStartE2EDuration="22.040200737s" podCreationTimestamp="2026-01-20 23:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:42.028605248 +0000 UTC m=+2414.348865546" watchObservedRunningTime="2026-01-20 23:15:42.040200737 +0000 UTC m=+2414.360461015" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.322931 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:15:42 crc kubenswrapper[5030]: E0120 23:15:42.323223 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="303c674e-2c9f-4166-97f9-98ec856371ae" containerName="mariadb-account-create-update" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.323238 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="303c674e-2c9f-4166-97f9-98ec856371ae" containerName="mariadb-account-create-update" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.323421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="303c674e-2c9f-4166-97f9-98ec856371ae" containerName="mariadb-account-create-update" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.324282 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.330571 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.337645 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.498073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlm47\" (UniqueName: \"kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.498157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.498288 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.498449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.599640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlm47\" (UniqueName: \"kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.599701 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.599744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.599781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" 
(UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.600546 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.600632 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.601057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.623543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlm47\" (UniqueName: \"kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47\") pod \"dnsmasq-dnsmasq-7d5699597c-kfxwm\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:42 crc kubenswrapper[5030]: I0120 23:15:42.640908 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.146443 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:15:43 crc kubenswrapper[5030]: W0120 23:15:43.149467 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod195d8fc3_e36b_4a36_92a4_da935758d9ff.slice/crio-3d75ef8c50cd00560e268d9b45702f8d547c037d46c6de51a0e2d5516f80205f WatchSource:0}: Error finding container 3d75ef8c50cd00560e268d9b45702f8d547c037d46c6de51a0e2d5516f80205f: Status 404 returned error can't find the container with id 3d75ef8c50cd00560e268d9b45702f8d547c037d46c6de51a0e2d5516f80205f Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.935102 5030 scope.go:117] "RemoveContainer" containerID="df1a60f7cd7548e04244683ed2dd64f62aefb4d7e1f8455813c670c81d31dee1" Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.975331 5030 scope.go:117] "RemoveContainer" containerID="a8eea5b347bca88e0564d71e4e5a6ec69c6b240ae4f761c7184c5d721ecc3b69" Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.997488 5030 generic.go:334] "Generic (PLEG): container finished" podID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerID="9d65101e2fecb07ee01f739c10329f54162ad363b50389fe00638cfc9abde483" exitCode=0 Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.997584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" event={"ID":"195d8fc3-e36b-4a36-92a4-da935758d9ff","Type":"ContainerDied","Data":"9d65101e2fecb07ee01f739c10329f54162ad363b50389fe00638cfc9abde483"} Jan 20 23:15:43 crc kubenswrapper[5030]: I0120 23:15:43.998071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" event={"ID":"195d8fc3-e36b-4a36-92a4-da935758d9ff","Type":"ContainerStarted","Data":"3d75ef8c50cd00560e268d9b45702f8d547c037d46c6de51a0e2d5516f80205f"} Jan 20 23:15:44 crc kubenswrapper[5030]: I0120 23:15:44.032995 5030 scope.go:117] "RemoveContainer" containerID="551531362bd2e6dbde84980ae713ad54587b8b92a636785573c6691a77427bf2" Jan 20 23:15:45 crc kubenswrapper[5030]: I0120 23:15:45.021928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" event={"ID":"195d8fc3-e36b-4a36-92a4-da935758d9ff","Type":"ContainerStarted","Data":"eaa01e4f601a64005981e893cc41a4981cdd97ea4254fbd0d3e015df117736eb"} Jan 20 23:15:45 crc kubenswrapper[5030]: I0120 23:15:45.022126 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:45 crc kubenswrapper[5030]: I0120 23:15:45.052319 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" podStartSLOduration=3.052295459 podStartE2EDuration="3.052295459s" podCreationTimestamp="2026-01-20 23:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:15:45.036383614 +0000 UTC m=+2417.356643902" watchObservedRunningTime="2026-01-20 23:15:45.052295459 +0000 UTC m=+2417.372555777" Jan 20 23:15:46 crc kubenswrapper[5030]: I0120 23:15:46.034969 5030 generic.go:334] "Generic (PLEG): container finished" podID="421b8192-d4d9-41a8-800d-3f3afab07ac7" 
containerID="ef9e28371f66994ada583894a1e45f03b589a60920c442f3c6622eb45d5ee18b" exitCode=0 Jan 20 23:15:46 crc kubenswrapper[5030]: I0120 23:15:46.035078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-hn24q" event={"ID":"421b8192-d4d9-41a8-800d-3f3afab07ac7","Type":"ContainerDied","Data":"ef9e28371f66994ada583894a1e45f03b589a60920c442f3c6622eb45d5ee18b"} Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.415428 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.606244 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpz6h\" (UniqueName: \"kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h\") pod \"421b8192-d4d9-41a8-800d-3f3afab07ac7\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.606300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data\") pod \"421b8192-d4d9-41a8-800d-3f3afab07ac7\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.606360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle\") pod \"421b8192-d4d9-41a8-800d-3f3afab07ac7\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.606377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data\") pod \"421b8192-d4d9-41a8-800d-3f3afab07ac7\" (UID: \"421b8192-d4d9-41a8-800d-3f3afab07ac7\") " Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.612803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "421b8192-d4d9-41a8-800d-3f3afab07ac7" (UID: "421b8192-d4d9-41a8-800d-3f3afab07ac7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.612902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h" (OuterVolumeSpecName: "kube-api-access-rpz6h") pod "421b8192-d4d9-41a8-800d-3f3afab07ac7" (UID: "421b8192-d4d9-41a8-800d-3f3afab07ac7"). InnerVolumeSpecName "kube-api-access-rpz6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.632963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "421b8192-d4d9-41a8-800d-3f3afab07ac7" (UID: "421b8192-d4d9-41a8-800d-3f3afab07ac7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.653132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data" (OuterVolumeSpecName: "config-data") pod "421b8192-d4d9-41a8-800d-3f3afab07ac7" (UID: "421b8192-d4d9-41a8-800d-3f3afab07ac7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.708487 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.708771 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.708870 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/421b8192-d4d9-41a8-800d-3f3afab07ac7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:47 crc kubenswrapper[5030]: I0120 23:15:47.708967 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpz6h\" (UniqueName: \"kubernetes.io/projected/421b8192-d4d9-41a8-800d-3f3afab07ac7-kube-api-access-rpz6h\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:48 crc kubenswrapper[5030]: I0120 23:15:48.062916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-hn24q" event={"ID":"421b8192-d4d9-41a8-800d-3f3afab07ac7","Type":"ContainerDied","Data":"f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9"} Jan 20 23:15:48 crc kubenswrapper[5030]: I0120 23:15:48.062974 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f503fdaa56fdcbff37a2cdc674a0b6d82d91839b3387289ed32bc8eede143fa9" Jan 20 23:15:48 crc kubenswrapper[5030]: I0120 23:15:48.063033 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-hn24q" Jan 20 23:15:52 crc kubenswrapper[5030]: I0120 23:15:52.643769 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:15:52 crc kubenswrapper[5030]: I0120 23:15:52.720366 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:15:52 crc kubenswrapper[5030]: I0120 23:15:52.721070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="dnsmasq-dns" containerID="cri-o://2c07e5959ee3fc2cc7c2847460f6aace325ab51ecac6b3fa4806c949e47bad35" gracePeriod=10 Jan 20 23:15:52 crc kubenswrapper[5030]: I0120 23:15:52.754232 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.70:5353: connect: connection refused" Jan 20 23:15:52 crc kubenswrapper[5030]: I0120 23:15:52.962536 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:15:52 crc kubenswrapper[5030]: E0120 23:15:52.962783 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.109914 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerID="2c07e5959ee3fc2cc7c2847460f6aace325ab51ecac6b3fa4806c949e47bad35" exitCode=0 Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.109982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" event={"ID":"5c2cc197-a38a-41a4-8c40-c65602a593ea","Type":"ContainerDied","Data":"2c07e5959ee3fc2cc7c2847460f6aace325ab51ecac6b3fa4806c949e47bad35"} Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.278411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.414212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58bj7\" (UniqueName: \"kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7\") pod \"5c2cc197-a38a-41a4-8c40-c65602a593ea\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.414287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config\") pod \"5c2cc197-a38a-41a4-8c40-c65602a593ea\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.414391 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc\") pod \"5c2cc197-a38a-41a4-8c40-c65602a593ea\" (UID: \"5c2cc197-a38a-41a4-8c40-c65602a593ea\") " Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.420685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7" (OuterVolumeSpecName: "kube-api-access-58bj7") pod "5c2cc197-a38a-41a4-8c40-c65602a593ea" (UID: "5c2cc197-a38a-41a4-8c40-c65602a593ea"). InnerVolumeSpecName "kube-api-access-58bj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.465375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "5c2cc197-a38a-41a4-8c40-c65602a593ea" (UID: "5c2cc197-a38a-41a4-8c40-c65602a593ea"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.474200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config" (OuterVolumeSpecName: "config") pod "5c2cc197-a38a-41a4-8c40-c65602a593ea" (UID: "5c2cc197-a38a-41a4-8c40-c65602a593ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.516092 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58bj7\" (UniqueName: \"kubernetes.io/projected/5c2cc197-a38a-41a4-8c40-c65602a593ea-kube-api-access-58bj7\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.516125 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:53 crc kubenswrapper[5030]: I0120 23:15:53.516135 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5c2cc197-a38a-41a4-8c40-c65602a593ea-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.123085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" event={"ID":"5c2cc197-a38a-41a4-8c40-c65602a593ea","Type":"ContainerDied","Data":"f07eb9d7bce3c2d9348ee2081a98fee3bcc609c384a002ba9686ac930a2bbd0f"} Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.123154 5030 scope.go:117] "RemoveContainer" containerID="2c07e5959ee3fc2cc7c2847460f6aace325ab51ecac6b3fa4806c949e47bad35" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.123337 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.168935 5030 scope.go:117] "RemoveContainer" containerID="fc25b97d9ab3c28c713e451e309b06690ce958169a7863890665189163eec9f1" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.169970 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.204511 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-pvptl"] Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.238142 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.252925 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Jan 20 23:15:54 crc kubenswrapper[5030]: I0120 23:15:54.254466 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jan 20 23:15:55 crc kubenswrapper[5030]: I0120 23:15:55.350792 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:15:55 crc kubenswrapper[5030]: I0120 23:15:55.703957 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:15:55 crc kubenswrapper[5030]: I0120 23:15:55.973540 5030 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" path="/var/lib/kubelet/pods/5c2cc197-a38a-41a4-8c40-c65602a593ea/volumes" Jan 20 23:15:55 crc kubenswrapper[5030]: I0120 23:15:55.975576 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Jan 20 23:16:04 crc kubenswrapper[5030]: I0120 23:16:04.211969 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:16:04 crc kubenswrapper[5030]: I0120 23:16:04.241888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:16:04 crc kubenswrapper[5030]: I0120 23:16:04.254428 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:16:05 crc kubenswrapper[5030]: I0120 23:16:05.962983 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:16:05 crc kubenswrapper[5030]: E0120 23:16:05.963367 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:16:05 crc kubenswrapper[5030]: I0120 23:16:05.977162 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.312577 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jm65l"] Jan 20 23:16:06 crc kubenswrapper[5030]: E0120 23:16:06.312910 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="init" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.312925 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="init" Jan 20 23:16:06 crc kubenswrapper[5030]: E0120 23:16:06.312948 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421b8192-d4d9-41a8-800d-3f3afab07ac7" containerName="glance-db-sync" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.312954 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="421b8192-d4d9-41a8-800d-3f3afab07ac7" containerName="glance-db-sync" Jan 20 23:16:06 crc kubenswrapper[5030]: E0120 23:16:06.312968 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="dnsmasq-dns" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.312975 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="dnsmasq-dns" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.313117 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2cc197-a38a-41a4-8c40-c65602a593ea" containerName="dnsmasq-dns" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.313132 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="421b8192-d4d9-41a8-800d-3f3afab07ac7" containerName="glance-db-sync" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.313726 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.330018 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jm65l"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.368803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.368977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmkw\" (UniqueName: \"kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.404601 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.406092 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.408167 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.413152 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-x7ll9"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.414521 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.427282 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-x7ll9"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.453757 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmkw\" (UniqueName: \"kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwv22\" (UniqueName: \"kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.471932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.472765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.514373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmkw\" (UniqueName: 
\"kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw\") pod \"cinder-db-create-jm65l\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.518497 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-7ddpx"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.519604 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.534585 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.535896 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.537199 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.542053 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-7ddpx"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.568177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwv22\" (UniqueName: \"kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghcg\" (UniqueName: \"kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " 
pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.573481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.574168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.574205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.601440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwv22\" (UniqueName: \"kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22\") pod \"neutron-db-create-x7ll9\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.607707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv\") pod \"cinder-e400-account-create-update-wgs6j\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.628635 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.635059 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.636456 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.638742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.665860 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.674757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbmjz\" (UniqueName: \"kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.674803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.674841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghcg\" (UniqueName: \"kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.675185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.675861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.693098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghcg\" (UniqueName: \"kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg\") pod \"barbican-db-create-7ddpx\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.722890 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.731322 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.777596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.777656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.777698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bv5c\" (UniqueName: \"kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.777831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbmjz\" (UniqueName: \"kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.781923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.798805 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-9v5m5"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.799855 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.801753 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.802153 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.803598 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.804552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbmjz\" (UniqueName: \"kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz\") pod \"barbican-ea87-account-create-update-jdnng\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.810358 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-88kcj" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.827657 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-9v5m5"] Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.879392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.879740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.879768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.879804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bv5c\" (UniqueName: \"kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.879843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc48z\" (UniqueName: \"kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.880206 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.888746 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.906575 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bv5c\" (UniqueName: \"kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c\") pod \"neutron-35d5-account-create-update-glbzc\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.981355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.981391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.981442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc48z\" (UniqueName: \"kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.986213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:06 crc kubenswrapper[5030]: I0120 23:16:06.986848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.000153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc48z\" (UniqueName: \"kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z\") pod \"keystone-db-sync-9v5m5\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.043526 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.054062 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.127350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jm65l"] Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.131422 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.299250 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j"] Jan 20 23:16:07 crc kubenswrapper[5030]: W0120 23:16:07.310634 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2775de5c_23db_45b4_a940_183a73cd8fb4.slice/crio-2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a WatchSource:0}: Error finding container 2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a: Status 404 returned error can't find the container with id 2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a Jan 20 23:16:07 crc kubenswrapper[5030]: W0120 23:16:07.312537 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3767ed5d_0e94_4d9d_8647_db60f59d03d7.slice/crio-4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146 WatchSource:0}: Error finding container 4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146: Status 404 returned error can't find the container with id 4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146 Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.318477 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-x7ll9"] Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.320472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jm65l" event={"ID":"e81789b9-e782-4dbe-9832-a614011b349d","Type":"ContainerStarted","Data":"2a4a38bd1ab40c582c1b7d3e431a8055db6991d2e00432c547393412f78cc52c"} Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.430024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-7ddpx"] Jan 20 23:16:07 crc kubenswrapper[5030]: W0120 23:16:07.449678 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae1ddae_2f96_4602_92de_80ea30454c54.slice/crio-6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362 WatchSource:0}: Error finding container 6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362: Status 404 returned error can't find the container with id 6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362 Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.515178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng"] Jan 20 23:16:07 crc kubenswrapper[5030]: W0120 23:16:07.552021 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a877089_7047_4846_a420_c2c4060562f5.slice/crio-3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71 WatchSource:0}: Error finding container 3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71: Status 404 returned error can't find the container with id 3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71 Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.595328 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc"] Jan 20 23:16:07 crc kubenswrapper[5030]: I0120 23:16:07.730083 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-9v5m5"] Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.329836 5030 generic.go:334] "Generic (PLEG): container finished" podID="e81789b9-e782-4dbe-9832-a614011b349d" containerID="2fc8ca77520abb462f511cdc67fcb4d4b532e52fdeff91ec65e80699f856a33b" exitCode=0 Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.330453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jm65l" event={"ID":"e81789b9-e782-4dbe-9832-a614011b349d","Type":"ContainerDied","Data":"2fc8ca77520abb462f511cdc67fcb4d4b532e52fdeff91ec65e80699f856a33b"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.331529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" event={"ID":"de53eca9-723a-4cc3-b742-8b2c091c53a3","Type":"ContainerStarted","Data":"2c9729c566b30fd83e8dc16f913064627dab1c0d4147acc8dfa17e723cc5653a"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.331555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" event={"ID":"de53eca9-723a-4cc3-b742-8b2c091c53a3","Type":"ContainerStarted","Data":"07c712279f37328c5e715c51a7be347ee4671809448365bf9dea6922618e95e5"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.332877 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ae1ddae-2f96-4602-92de-80ea30454c54" containerID="2fffc0464c76dcd205b4adc0bbf5e64f2ccf1ded597857e6c78cdfd2bd0884e3" exitCode=0 Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.332942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" event={"ID":"2ae1ddae-2f96-4602-92de-80ea30454c54","Type":"ContainerDied","Data":"2fffc0464c76dcd205b4adc0bbf5e64f2ccf1ded597857e6c78cdfd2bd0884e3"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.332966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" event={"ID":"2ae1ddae-2f96-4602-92de-80ea30454c54","Type":"ContainerStarted","Data":"6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.334286 5030 generic.go:334] "Generic (PLEG): container finished" podID="3767ed5d-0e94-4d9d-8647-db60f59d03d7" containerID="defefea15dd6535180f7c0202106736a984c14cefca0c9d1fcc0e37e5e341a5c" exitCode=0 Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.334347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" event={"ID":"3767ed5d-0e94-4d9d-8647-db60f59d03d7","Type":"ContainerDied","Data":"defefea15dd6535180f7c0202106736a984c14cefca0c9d1fcc0e37e5e341a5c"} Jan 20 23:16:08 crc kubenswrapper[5030]: 
I0120 23:16:08.334371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" event={"ID":"3767ed5d-0e94-4d9d-8647-db60f59d03d7","Type":"ContainerStarted","Data":"4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.335648 5030 generic.go:334] "Generic (PLEG): container finished" podID="2775de5c-23db-45b4-a940-183a73cd8fb4" containerID="becb9ebf78886f627f252d5d9efae37f435c9773f0c72636e2a16f48b81f3c65" exitCode=0 Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.335689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" event={"ID":"2775de5c-23db-45b4-a940-183a73cd8fb4","Type":"ContainerDied","Data":"becb9ebf78886f627f252d5d9efae37f435c9773f0c72636e2a16f48b81f3c65"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.335704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" event={"ID":"2775de5c-23db-45b4-a940-183a73cd8fb4","Type":"ContainerStarted","Data":"2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.336907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" event={"ID":"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe","Type":"ContainerStarted","Data":"1cd43d16a69bd38005f4c564af5a563ffb74b009be30301f57db13f1b6bfaf60"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.336930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" event={"ID":"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe","Type":"ContainerStarted","Data":"a4ed42cce6c789a00e3f6e9aa7a7381f32ce4128c3dff701a7c4a4718f56e860"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.338736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" event={"ID":"2a877089-7047-4846-a420-c2c4060562f5","Type":"ContainerStarted","Data":"4b519540152f7b8b3ca7a036718809383dc33dcff141d1828031db8b17e23383"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.338760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" event={"ID":"2a877089-7047-4846-a420-c2c4060562f5","Type":"ContainerStarted","Data":"3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71"} Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.360570 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" podStartSLOduration=2.360547766 podStartE2EDuration="2.360547766s" podCreationTimestamp="2026-01-20 23:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:08.358218279 +0000 UTC m=+2440.678478567" watchObservedRunningTime="2026-01-20 23:16:08.360547766 +0000 UTC m=+2440.680808054" Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.388321 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" podStartSLOduration=2.388300475 podStartE2EDuration="2.388300475s" podCreationTimestamp="2026-01-20 23:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:16:08.384827931 +0000 UTC m=+2440.705088219" watchObservedRunningTime="2026-01-20 23:16:08.388300475 +0000 UTC m=+2440.708560763" Jan 20 23:16:08 crc kubenswrapper[5030]: I0120 23:16:08.400749 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" podStartSLOduration=2.400724055 podStartE2EDuration="2.400724055s" podCreationTimestamp="2026-01-20 23:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:08.396724978 +0000 UTC m=+2440.716985276" watchObservedRunningTime="2026-01-20 23:16:08.400724055 +0000 UTC m=+2440.720984343" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.352718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" event={"ID":"de53eca9-723a-4cc3-b742-8b2c091c53a3","Type":"ContainerDied","Data":"2c9729c566b30fd83e8dc16f913064627dab1c0d4147acc8dfa17e723cc5653a"} Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.352357 5030 generic.go:334] "Generic (PLEG): container finished" podID="de53eca9-723a-4cc3-b742-8b2c091c53a3" containerID="2c9729c566b30fd83e8dc16f913064627dab1c0d4147acc8dfa17e723cc5653a" exitCode=0 Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.359363 5030 generic.go:334] "Generic (PLEG): container finished" podID="2a877089-7047-4846-a420-c2c4060562f5" containerID="4b519540152f7b8b3ca7a036718809383dc33dcff141d1828031db8b17e23383" exitCode=0 Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.359668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" event={"ID":"2a877089-7047-4846-a420-c2c4060562f5","Type":"ContainerDied","Data":"4b519540152f7b8b3ca7a036718809383dc33dcff141d1828031db8b17e23383"} Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.803512 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.919571 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.927797 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.932227 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.938024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsmkw\" (UniqueName: \"kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw\") pod \"e81789b9-e782-4dbe-9832-a614011b349d\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.938361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts\") pod \"e81789b9-e782-4dbe-9832-a614011b349d\" (UID: \"e81789b9-e782-4dbe-9832-a614011b349d\") " Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.938875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e81789b9-e782-4dbe-9832-a614011b349d" (UID: "e81789b9-e782-4dbe-9832-a614011b349d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.939096 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e81789b9-e782-4dbe-9832-a614011b349d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:09 crc kubenswrapper[5030]: I0120 23:16:09.949712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw" (OuterVolumeSpecName: "kube-api-access-lsmkw") pod "e81789b9-e782-4dbe-9832-a614011b349d" (UID: "e81789b9-e782-4dbe-9832-a614011b349d"). InnerVolumeSpecName "kube-api-access-lsmkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.039913 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghcg\" (UniqueName: \"kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg\") pod \"2ae1ddae-2f96-4602-92de-80ea30454c54\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.039978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts\") pod \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts\") pod \"2ae1ddae-2f96-4602-92de-80ea30454c54\" (UID: \"2ae1ddae-2f96-4602-92de-80ea30454c54\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts\") pod \"2775de5c-23db-45b4-a940-183a73cd8fb4\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwv22\" (UniqueName: \"kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22\") pod \"2775de5c-23db-45b4-a940-183a73cd8fb4\" (UID: \"2775de5c-23db-45b4-a940-183a73cd8fb4\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv\") pod \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\" (UID: \"3767ed5d-0e94-4d9d-8647-db60f59d03d7\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040419 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3767ed5d-0e94-4d9d-8647-db60f59d03d7" (UID: "3767ed5d-0e94-4d9d-8647-db60f59d03d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ae1ddae-2f96-4602-92de-80ea30454c54" (UID: "2ae1ddae-2f96-4602-92de-80ea30454c54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040677 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2775de5c-23db-45b4-a940-183a73cd8fb4" (UID: "2775de5c-23db-45b4-a940-183a73cd8fb4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040853 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsmkw\" (UniqueName: \"kubernetes.io/projected/e81789b9-e782-4dbe-9832-a614011b349d-kube-api-access-lsmkw\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040893 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3767ed5d-0e94-4d9d-8647-db60f59d03d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040903 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1ddae-2f96-4602-92de-80ea30454c54-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.040912 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2775de5c-23db-45b4-a940-183a73cd8fb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.043653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22" (OuterVolumeSpecName: "kube-api-access-cwv22") pod "2775de5c-23db-45b4-a940-183a73cd8fb4" (UID: "2775de5c-23db-45b4-a940-183a73cd8fb4"). InnerVolumeSpecName "kube-api-access-cwv22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.044515 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg" (OuterVolumeSpecName: "kube-api-access-pghcg") pod "2ae1ddae-2f96-4602-92de-80ea30454c54" (UID: "2ae1ddae-2f96-4602-92de-80ea30454c54"). InnerVolumeSpecName "kube-api-access-pghcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.045380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv" (OuterVolumeSpecName: "kube-api-access-mjcfv") pod "3767ed5d-0e94-4d9d-8647-db60f59d03d7" (UID: "3767ed5d-0e94-4d9d-8647-db60f59d03d7"). InnerVolumeSpecName "kube-api-access-mjcfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.142845 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwv22\" (UniqueName: \"kubernetes.io/projected/2775de5c-23db-45b4-a940-183a73cd8fb4-kube-api-access-cwv22\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.142896 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcfv\" (UniqueName: \"kubernetes.io/projected/3767ed5d-0e94-4d9d-8647-db60f59d03d7-kube-api-access-mjcfv\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.142908 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghcg\" (UniqueName: \"kubernetes.io/projected/2ae1ddae-2f96-4602-92de-80ea30454c54-kube-api-access-pghcg\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.371500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jm65l" event={"ID":"e81789b9-e782-4dbe-9832-a614011b349d","Type":"ContainerDied","Data":"2a4a38bd1ab40c582c1b7d3e431a8055db6991d2e00432c547393412f78cc52c"} Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.371561 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4a38bd1ab40c582c1b7d3e431a8055db6991d2e00432c547393412f78cc52c" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.371564 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jm65l" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.383091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" event={"ID":"2ae1ddae-2f96-4602-92de-80ea30454c54","Type":"ContainerDied","Data":"6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362"} Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.383433 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f92d8ad3835e34df583d5082f01c7264a7551ea80f3f177d875c4033d6cd362" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.383557 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-7ddpx" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.385995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" event={"ID":"3767ed5d-0e94-4d9d-8647-db60f59d03d7","Type":"ContainerDied","Data":"4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146"} Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.386382 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4341d24e76fafd3d9878a4276eca94b56d45b0e122e16f521c19607dfcd05146" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.386104 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.388513 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.390289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-x7ll9" event={"ID":"2775de5c-23db-45b4-a940-183a73cd8fb4","Type":"ContainerDied","Data":"2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a"} Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.390331 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf89285e9615e9364dc4910e5ec68ca49db245ba473370165cf9ab0c76b683a" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.766778 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.772263 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.855722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts\") pod \"2a877089-7047-4846-a420-c2c4060562f5\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.855780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts\") pod \"de53eca9-723a-4cc3-b742-8b2c091c53a3\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.855870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbmjz\" (UniqueName: \"kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz\") pod \"2a877089-7047-4846-a420-c2c4060562f5\" (UID: \"2a877089-7047-4846-a420-c2c4060562f5\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.856015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bv5c\" (UniqueName: \"kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c\") pod \"de53eca9-723a-4cc3-b742-8b2c091c53a3\" (UID: \"de53eca9-723a-4cc3-b742-8b2c091c53a3\") " Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.856285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de53eca9-723a-4cc3-b742-8b2c091c53a3" (UID: "de53eca9-723a-4cc3-b742-8b2c091c53a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.856404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a877089-7047-4846-a420-c2c4060562f5" (UID: "2a877089-7047-4846-a420-c2c4060562f5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.856447 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de53eca9-723a-4cc3-b742-8b2c091c53a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.861764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c" (OuterVolumeSpecName: "kube-api-access-6bv5c") pod "de53eca9-723a-4cc3-b742-8b2c091c53a3" (UID: "de53eca9-723a-4cc3-b742-8b2c091c53a3"). InnerVolumeSpecName "kube-api-access-6bv5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.867808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz" (OuterVolumeSpecName: "kube-api-access-jbmjz") pod "2a877089-7047-4846-a420-c2c4060562f5" (UID: "2a877089-7047-4846-a420-c2c4060562f5"). InnerVolumeSpecName "kube-api-access-jbmjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.957941 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a877089-7047-4846-a420-c2c4060562f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.958004 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbmjz\" (UniqueName: \"kubernetes.io/projected/2a877089-7047-4846-a420-c2c4060562f5-kube-api-access-jbmjz\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:10 crc kubenswrapper[5030]: I0120 23:16:10.958027 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bv5c\" (UniqueName: \"kubernetes.io/projected/de53eca9-723a-4cc3-b742-8b2c091c53a3-kube-api-access-6bv5c\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.401572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" event={"ID":"2a877089-7047-4846-a420-c2c4060562f5","Type":"ContainerDied","Data":"3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71"} Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.401791 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7705bb4680ec84363a4601a4355015a39c3f56d366fa9b8fd78e4e9a978f71" Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.401657 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng" Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.403880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" event={"ID":"de53eca9-723a-4cc3-b742-8b2c091c53a3","Type":"ContainerDied","Data":"07c712279f37328c5e715c51a7be347ee4671809448365bf9dea6922618e95e5"} Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.403926 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07c712279f37328c5e715c51a7be347ee4671809448365bf9dea6922618e95e5" Jan 20 23:16:11 crc kubenswrapper[5030]: I0120 23:16:11.403927 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc" Jan 20 23:16:12 crc kubenswrapper[5030]: I0120 23:16:12.417965 5030 generic.go:334] "Generic (PLEG): container finished" podID="148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" containerID="1cd43d16a69bd38005f4c564af5a563ffb74b009be30301f57db13f1b6bfaf60" exitCode=0 Jan 20 23:16:12 crc kubenswrapper[5030]: I0120 23:16:12.418130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" event={"ID":"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe","Type":"ContainerDied","Data":"1cd43d16a69bd38005f4c564af5a563ffb74b009be30301f57db13f1b6bfaf60"} Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.866606 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.909282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc48z\" (UniqueName: \"kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z\") pod \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.909415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") pod \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.909545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle\") pod \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.916156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z" (OuterVolumeSpecName: "kube-api-access-sc48z") pod "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" (UID: "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe"). InnerVolumeSpecName "kube-api-access-sc48z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:13 crc kubenswrapper[5030]: I0120 23:16:13.942652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" (UID: "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.013140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data" (OuterVolumeSpecName: "config-data") pod "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" (UID: "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.013230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") pod \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\" (UID: \"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe\") " Jan 20 23:16:14 crc kubenswrapper[5030]: W0120 23:16:14.013427 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe/volumes/kubernetes.io~secret/config-data Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.013439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data" (OuterVolumeSpecName: "config-data") pod "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" (UID: "148ae5bf-6f06-48b3-a9fd-04f99fbfebfe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.014171 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.014196 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc48z\" (UniqueName: \"kubernetes.io/projected/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-kube-api-access-sc48z\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.014209 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.443652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" event={"ID":"148ae5bf-6f06-48b3-a9fd-04f99fbfebfe","Type":"ContainerDied","Data":"a4ed42cce6c789a00e3f6e9aa7a7381f32ce4128c3dff701a7c4a4718f56e860"} Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.443735 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4ed42cce6c789a00e3f6e9aa7a7381f32ce4128c3dff701a7c4a4718f56e860" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.443865 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-9v5m5" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.664983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2xf4v"] Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665407 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81789b9-e782-4dbe-9832-a614011b349d" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665425 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81789b9-e782-4dbe-9832-a614011b349d" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665439 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3767ed5d-0e94-4d9d-8647-db60f59d03d7" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665445 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3767ed5d-0e94-4d9d-8647-db60f59d03d7" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de53eca9-723a-4cc3-b742-8b2c091c53a3" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665461 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="de53eca9-723a-4cc3-b742-8b2c091c53a3" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665474 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2775de5c-23db-45b4-a940-183a73cd8fb4" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665483 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2775de5c-23db-45b4-a940-183a73cd8fb4" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665490 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" containerName="keystone-db-sync" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665496 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" containerName="keystone-db-sync" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665512 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a877089-7047-4846-a420-c2c4060562f5" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665518 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a877089-7047-4846-a420-c2c4060562f5" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: E0120 23:16:14.665527 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae1ddae-2f96-4602-92de-80ea30454c54" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae1ddae-2f96-4602-92de-80ea30454c54" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665709 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2775de5c-23db-45b4-a940-183a73cd8fb4" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665724 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" 
containerName="keystone-db-sync" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665735 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81789b9-e782-4dbe-9832-a614011b349d" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665747 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3767ed5d-0e94-4d9d-8647-db60f59d03d7" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665754 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae1ddae-2f96-4602-92de-80ea30454c54" containerName="mariadb-database-create" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665767 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="de53eca9-723a-4cc3-b742-8b2c091c53a3" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.665776 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a877089-7047-4846-a420-c2c4060562f5" containerName="mariadb-account-create-update" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.666367 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.668355 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.668961 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.668975 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-88kcj" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.669412 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.670794 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.679941 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2xf4v"] Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fcs\" (UniqueName: \"kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725754 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.725935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.829160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fcs\" (UniqueName: \"kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.829504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.829674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.829822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.829943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.830034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: 
\"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.832955 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.834459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.834727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.836951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.837246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.839175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.847597 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.864021 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.868845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.883357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.896545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fcs\" (UniqueName: \"kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs\") pod \"keystone-bootstrap-2xf4v\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wptg\" (UniqueName: \"kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.934821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.964729 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-bdw8v"] Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.965812 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.978443 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.990341 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.990533 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-74drc" Jan 20 23:16:14 crc kubenswrapper[5030]: I0120 23:16:14.992963 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.035056 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-bdw8v"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.035914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.035955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.035987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwk49\" (UniqueName: \"kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wptg\" (UniqueName: \"kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.036249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.037040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.049478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.049828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.055871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.056413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.075180 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.077488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wptg\" (UniqueName: \"kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg\") pod \"ceilometer-0\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.082392 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7wlmd"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.083520 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.089373 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.089590 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.098290 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-dbr5q" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.132869 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7wlmd"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143038 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwk49\" (UniqueName: \"kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6rf8\" (UniqueName: \"kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc 
kubenswrapper[5030]: I0120 23:16:15.143253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.143296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.149253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.150685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.163934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.176124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.183643 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hk4fb"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.184979 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.190344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.193469 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.193990 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-5jnzt" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.217238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwk49\" (UniqueName: \"kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49\") pod \"cinder-db-sync-bdw8v\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.217305 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-pzk5s"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.224156 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.235822 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hk4fb"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.235912 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.248029 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.248516 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-bskvr" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.248567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgws\" (UniqueName: \"kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.248972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6rf8\" (UniqueName: \"kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.249143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data\") pod \"placement-db-sync-hk4fb\" (UID: 
\"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.262483 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-pzk5s"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.282328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.285405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.294075 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.299918 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6rf8\" (UniqueName: \"kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8\") pod \"neutron-db-sync-7wlmd\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.322987 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgws\" (UniqueName: \"kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qckj\" (UniqueName: \"kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.353859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.359518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.365227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.370042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.370763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.382197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgws\" (UniqueName: \"kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws\") pod \"placement-db-sync-hk4fb\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.456876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data\") 
pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.456985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.457014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qckj\" (UniqueName: \"kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.462266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.469755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.472475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qckj\" (UniqueName: \"kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj\") pod \"barbican-db-sync-pzk5s\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.531932 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.537480 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.576431 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.770577 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.772035 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.776599 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-zvqxh" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.776927 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.777048 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.780351 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.822226 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2xf4v"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.844840 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.864945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865003 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5jm\" (UniqueName: \"kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.865439 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.867430 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.871600 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.871799 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.889802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.902797 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5jm\" (UniqueName: \"kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmd52\" (UniqueName: \"kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.967846 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.969348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.969432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.969454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.969658 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.969926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.983841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.984308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.985972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/cinder-db-sync-bdw8v"] Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.986792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:15 crc kubenswrapper[5030]: I0120 23:16:15.989347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.017481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn5jm\" (UniqueName: \"kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.025523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmd52\" (UniqueName: \"kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072213 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.072714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.073029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.073133 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.079834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.082179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.085125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.086303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.091401 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hk4fb"] Jan 20 23:16:16 crc kubenswrapper[5030]: W0120 23:16:16.101431 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24326cd7_d35a_41a1_b953_3ca15ad5d5f0.slice/crio-82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0 WatchSource:0}: Error finding container 82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0: Status 404 returned error can't find the container with id 82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0 Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.102738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmd52\" (UniqueName: \"kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.109328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.110902 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7wlmd"] Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.113988 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: W0120 23:16:16.129934 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8466bfd_157a_488e_a8a3_856b4837603a.slice/crio-916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad WatchSource:0}: Error finding container 916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad: Status 404 returned error can't find the container with id 916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.133361 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.250772 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-pzk5s"] Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.491676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" event={"ID":"41f0470c-c766-4578-b6ac-a3c182ea512e","Type":"ContainerStarted","Data":"64e8ddea6808b444003252fcc52a6d1bf1a5cedf7605c158182a33e5e1061381"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.492051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" event={"ID":"41f0470c-c766-4578-b6ac-a3c182ea512e","Type":"ContainerStarted","Data":"82a34867668251a23d4c6c1659885b85b22d77489cc09bc65817cee219efb488"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.508826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" event={"ID":"aab764fb-e311-4d5c-8139-65120ec5f9eb","Type":"ContainerStarted","Data":"dfde573ba78cc60dd1244f7109533ecfac6aa676c36092ae03c24b17936d8227"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.513344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" event={"ID":"24326cd7-d35a-41a1-b953-3ca15ad5d5f0","Type":"ContainerStarted","Data":"a63c3e2c0fa9c632fa78a3a408312008d457ce68e6f245d77409731f31a4fec8"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.513382 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" event={"ID":"24326cd7-d35a-41a1-b953-3ca15ad5d5f0","Type":"ContainerStarted","Data":"82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.516255 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" podStartSLOduration=2.516246995 podStartE2EDuration="2.516246995s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:16.510853954 +0000 UTC m=+2448.831114252" watchObservedRunningTime="2026-01-20 23:16:16.516246995 +0000 UTC m=+2448.836507283" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.536429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerStarted","Data":"0e4b964cc2f3e1674de73e44e90008db7042d8059c4261ac1f1d8f768a16cb54"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.539090 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" podStartSLOduration=1.539075515 podStartE2EDuration="1.539075515s" podCreationTimestamp="2026-01-20 23:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:16.536696688 +0000 UTC m=+2448.856956976" watchObservedRunningTime="2026-01-20 23:16:16.539075515 +0000 UTC m=+2448.859335803" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.543502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" 
event={"ID":"830844d5-003c-44cd-bf80-5c78c96c1862","Type":"ContainerStarted","Data":"b48bc3c3df5e58bdccf2a916583c6b9b9c86dcbe2e3c9109286c45a48f936567"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.545555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" event={"ID":"b8466bfd-157a-488e-a8a3-856b4837603a","Type":"ContainerStarted","Data":"b52e95dd143ed5cf453a5cbefbdce934cf06f59d7f7dc81da3443df81759cf02"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.545581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" event={"ID":"b8466bfd-157a-488e-a8a3-856b4837603a","Type":"ContainerStarted","Data":"916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad"} Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.563326 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" podStartSLOduration=2.56331238 podStartE2EDuration="2.56331238s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:16.562638284 +0000 UTC m=+2448.882898662" watchObservedRunningTime="2026-01-20 23:16:16.56331238 +0000 UTC m=+2448.883572668" Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.618142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:16 crc kubenswrapper[5030]: W0120 23:16:16.621315 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e5c279_1975_4f72_bf29_26cd37764cf5.slice/crio-6facc543ab4db79fdce65ab652f59feb2782d5e3068e3c99eb2454814a416bc4 WatchSource:0}: Error finding container 6facc543ab4db79fdce65ab652f59feb2782d5e3068e3c99eb2454814a416bc4: Status 404 returned error can't find the container with id 6facc543ab4db79fdce65ab652f59feb2782d5e3068e3c99eb2454814a416bc4 Jan 20 23:16:16 crc kubenswrapper[5030]: I0120 23:16:16.708133 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:16 crc kubenswrapper[5030]: W0120 23:16:16.721463 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb920579_7b71_4699_b06c_d7529faa5b97.slice/crio-05009376ee1cb4e431e51aa938a13efa95383a6ff9019b503759911cc5957919 WatchSource:0}: Error finding container 05009376ee1cb4e431e51aa938a13efa95383a6ff9019b503759911cc5957919: Status 404 returned error can't find the container with id 05009376ee1cb4e431e51aa938a13efa95383a6ff9019b503759911cc5957919 Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.580116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerStarted","Data":"0216a8227c8340641007cbe78f48d541351fbca63f36e13db04d1de3452f1b69"} Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.580662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerStarted","Data":"6facc543ab4db79fdce65ab652f59feb2782d5e3068e3c99eb2454814a416bc4"} Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.585277 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerStarted","Data":"05009376ee1cb4e431e51aa938a13efa95383a6ff9019b503759911cc5957919"} Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.589229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" event={"ID":"830844d5-003c-44cd-bf80-5c78c96c1862","Type":"ContainerStarted","Data":"6d9a540904da26e50af38624029f0074073b55fb95f2afb13be3d84f6f8527ef"} Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.603529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" event={"ID":"aab764fb-e311-4d5c-8139-65120ec5f9eb","Type":"ContainerStarted","Data":"962d321b449e41fd746f074eb545941f90420af47ecf8de63641e62ace89067f"} Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.620861 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" podStartSLOduration=3.6208465050000003 podStartE2EDuration="3.620846505s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:17.61941132 +0000 UTC m=+2449.939671608" watchObservedRunningTime="2026-01-20 23:16:17.620846505 +0000 UTC m=+2449.941106793" Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.638231 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" podStartSLOduration=2.638217065 podStartE2EDuration="2.638217065s" podCreationTimestamp="2026-01-20 23:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:17.604550592 +0000 UTC m=+2449.924810880" watchObservedRunningTime="2026-01-20 23:16:17.638217065 +0000 UTC m=+2449.958477353" Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.970789 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:16:17 crc kubenswrapper[5030]: E0120 23:16:17.971201 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:16:17 crc kubenswrapper[5030]: I0120 23:16:17.994761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.052636 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.075991 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.630184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerStarted","Data":"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07"} Jan 20 
23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.630440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerStarted","Data":"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98"} Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.630482 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-log" containerID="cri-o://262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" gracePeriod=30 Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.630641 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-httpd" containerID="cri-o://1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" gracePeriod=30 Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.641078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerStarted","Data":"bebb92ff566d18dbb68c8efb9a95be01cc65868460875314afe46fc26161f6e7"} Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.641165 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-log" containerID="cri-o://0216a8227c8340641007cbe78f48d541351fbca63f36e13db04d1de3452f1b69" gracePeriod=30 Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.641248 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-httpd" containerID="cri-o://bebb92ff566d18dbb68c8efb9a95be01cc65868460875314afe46fc26161f6e7" gracePeriod=30 Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.656987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerStarted","Data":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.657025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerStarted","Data":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.665018 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.664998887 podStartE2EDuration="4.664998887s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:18.655084218 +0000 UTC m=+2450.975344506" watchObservedRunningTime="2026-01-20 23:16:18.664998887 +0000 UTC m=+2450.985259175" Jan 20 23:16:18 crc kubenswrapper[5030]: I0120 23:16:18.681762 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" 
podStartSLOduration=4.681745851 podStartE2EDuration="4.681745851s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:18.675409739 +0000 UTC m=+2450.995670027" watchObservedRunningTime="2026-01-20 23:16:18.681745851 +0000 UTC m=+2451.002006139" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.334651 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmd52\" (UniqueName: \"kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360866 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.360952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.361020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.361045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.362499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.364262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs" (OuterVolumeSpecName: "logs") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.386560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52" (OuterVolumeSpecName: "kube-api-access-pmd52") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "kube-api-access-pmd52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.387547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts" (OuterVolumeSpecName: "scripts") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.394857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.426104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.461815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data" (OuterVolumeSpecName: "config-data") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") pod \"eb920579-7b71-4699-b06c-d7529faa5b97\" (UID: \"eb920579-7b71-4699-b06c-d7529faa5b97\") " Jan 20 23:16:19 crc kubenswrapper[5030]: W0120 23:16:19.462527 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/eb920579-7b71-4699-b06c-d7529faa5b97/volumes/kubernetes.io~secret/config-data Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data" (OuterVolumeSpecName: "config-data") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462753 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462766 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb920579-7b71-4699-b06c-d7529faa5b97-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462776 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462801 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462810 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462822 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmd52\" (UniqueName: \"kubernetes.io/projected/eb920579-7b71-4699-b06c-d7529faa5b97-kube-api-access-pmd52\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.462830 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.470395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb920579-7b71-4699-b06c-d7529faa5b97" (UID: "eb920579-7b71-4699-b06c-d7529faa5b97"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.537534 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.564213 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb920579-7b71-4699-b06c-d7529faa5b97-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.564241 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.673709 5030 generic.go:334] "Generic (PLEG): container finished" podID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerID="bebb92ff566d18dbb68c8efb9a95be01cc65868460875314afe46fc26161f6e7" exitCode=0 Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.673742 5030 generic.go:334] "Generic (PLEG): container finished" podID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerID="0216a8227c8340641007cbe78f48d541351fbca63f36e13db04d1de3452f1b69" exitCode=143 Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.673775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerDied","Data":"bebb92ff566d18dbb68c8efb9a95be01cc65868460875314afe46fc26161f6e7"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.674185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerDied","Data":"0216a8227c8340641007cbe78f48d541351fbca63f36e13db04d1de3452f1b69"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.678158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerStarted","Data":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680176 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb920579-7b71-4699-b06c-d7529faa5b97" containerID="1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" exitCode=143 Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680198 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb920579-7b71-4699-b06c-d7529faa5b97" containerID="262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" exitCode=143 Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerDied","Data":"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerDied","Data":"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"eb920579-7b71-4699-b06c-d7529faa5b97","Type":"ContainerDied","Data":"05009376ee1cb4e431e51aa938a13efa95383a6ff9019b503759911cc5957919"} Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680253 5030 scope.go:117] "RemoveContainer" containerID="1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.680278 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.722831 5030 scope.go:117] "RemoveContainer" containerID="262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.733997 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.744309 5030 scope.go:117] "RemoveContainer" containerID="1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.744682 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07\": container with ID starting with 1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07 not found: ID does not exist" containerID="1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.744712 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07"} err="failed to get container status \"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07\": rpc error: code = NotFound desc = could not find container \"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07\": container with ID starting with 1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07 not found: ID does not exist" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.744733 5030 scope.go:117] "RemoveContainer" containerID="262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.745478 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98\": container with ID starting with 262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98 not found: ID does not exist" containerID="262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.745499 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98"} err="failed to get container status \"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98\": rpc error: code = NotFound desc = could not find container \"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98\": container with ID starting with 262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98 not found: ID does not exist" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.745514 5030 scope.go:117] "RemoveContainer" 
containerID="1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.745824 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07"} err="failed to get container status \"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07\": rpc error: code = NotFound desc = could not find container \"1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07\": container with ID starting with 1596809b409171706b4fd4868377fcede1fc0436825638df601e95f93e8dea07 not found: ID does not exist" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.745846 5030 scope.go:117] "RemoveContainer" containerID="262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.746007 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98"} err="failed to get container status \"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98\": rpc error: code = NotFound desc = could not find container \"262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98\": container with ID starting with 262208838becf886677baa73efea6c6daa77ed9f1e48da46ca5cd4b1bc1a5a98 not found: ID does not exist" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.753994 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.771129 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782250 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.782667 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.782723 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-httpd" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782730 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-httpd" Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.782748 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782755 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: E0120 23:16:19.782763 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-httpd" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782769 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-httpd" 
Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782927 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-httpd" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782945 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-httpd" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782956 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.782966 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" containerName="glance-log" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.783925 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.785949 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.786069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.804515 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.868742 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn5jm\" (UniqueName: \"kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.868798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.868854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.868911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.868997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs\") pod \"85e5c279-1975-4f72-bf29-26cd37764cf5\" (UID: \"85e5c279-1975-4f72-bf29-26cd37764cf5\") " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869298 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs" (OuterVolumeSpecName: "logs") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869651 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.869980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.870063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.870093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwghd\" (UniqueName: \"kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.870142 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.870153 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85e5c279-1975-4f72-bf29-26cd37764cf5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.874576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts" (OuterVolumeSpecName: "scripts") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.889638 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm" (OuterVolumeSpecName: "kube-api-access-jn5jm") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "kube-api-access-jn5jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.894026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.936022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.939727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.956466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data" (OuterVolumeSpecName: "config-data") pod "85e5c279-1975-4f72-bf29-26cd37764cf5" (UID: "85e5c279-1975-4f72-bf29-26cd37764cf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.971941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.971990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972075 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972190 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwghd\" (UniqueName: \"kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972239 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972250 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972269 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972279 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972287 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85e5c279-1975-4f72-bf29-26cd37764cf5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.972297 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn5jm\" (UniqueName: \"kubernetes.io/projected/85e5c279-1975-4f72-bf29-26cd37764cf5-kube-api-access-jn5jm\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.976468 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb920579-7b71-4699-b06c-d7529faa5b97" path="/var/lib/kubelet/pods/eb920579-7b71-4699-b06c-d7529faa5b97/volumes" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.978132 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.980310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.980371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.983583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.985110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:19 crc kubenswrapper[5030]: I0120 23:16:19.988891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.000691 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.002098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.003077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwghd\" (UniqueName: \"kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.018743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.074120 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.107501 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.549167 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:16:20 crc kubenswrapper[5030]: W0120 23:16:20.550722 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb150af41_d2a3_4f36_bbd0_159e468cf579.slice/crio-53ba6f999c975f115153f6b3f69dd5f81ee2916390c23b9faaa43570783b7fee WatchSource:0}: Error finding container 53ba6f999c975f115153f6b3f69dd5f81ee2916390c23b9faaa43570783b7fee: Status 404 returned error can't find the container with id 53ba6f999c975f115153f6b3f69dd5f81ee2916390c23b9faaa43570783b7fee Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.692436 5030 generic.go:334] "Generic (PLEG): container finished" podID="24326cd7-d35a-41a1-b953-3ca15ad5d5f0" containerID="a63c3e2c0fa9c632fa78a3a408312008d457ce68e6f245d77409731f31a4fec8" exitCode=0 Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.692497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" event={"ID":"24326cd7-d35a-41a1-b953-3ca15ad5d5f0","Type":"ContainerDied","Data":"a63c3e2c0fa9c632fa78a3a408312008d457ce68e6f245d77409731f31a4fec8"} Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.694989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"85e5c279-1975-4f72-bf29-26cd37764cf5","Type":"ContainerDied","Data":"6facc543ab4db79fdce65ab652f59feb2782d5e3068e3c99eb2454814a416bc4"} Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.695022 5030 scope.go:117] "RemoveContainer" containerID="bebb92ff566d18dbb68c8efb9a95be01cc65868460875314afe46fc26161f6e7" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.695093 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerStarted","Data":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704553 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="proxy-httpd" containerID="cri-o://bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" gracePeriod=30 Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704525 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="sg-core" containerID="cri-o://f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" gracePeriod=30 Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704506 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-central-agent" containerID="cri-o://78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" gracePeriod=30 Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.704674 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-notification-agent" containerID="cri-o://3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" gracePeriod=30 Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.714345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerStarted","Data":"53ba6f999c975f115153f6b3f69dd5f81ee2916390c23b9faaa43570783b7fee"} Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.747236 5030 scope.go:117] "RemoveContainer" containerID="0216a8227c8340641007cbe78f48d541351fbca63f36e13db04d1de3452f1b69" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.747389 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.766106 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.805216 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.806687 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.808990 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.809333 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.809390 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.340585005 podStartE2EDuration="6.809372365s" podCreationTimestamp="2026-01-20 23:16:14 +0000 UTC" firstStartedPulling="2026-01-20 23:16:15.901103827 +0000 UTC m=+2448.221364125" lastFinishedPulling="2026-01-20 23:16:20.369891177 +0000 UTC m=+2452.690151485" observedRunningTime="2026-01-20 23:16:20.762765609 +0000 UTC m=+2453.083025897" watchObservedRunningTime="2026-01-20 23:16:20.809372365 +0000 UTC m=+2453.129632653" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.833803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.890849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.890899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.891218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.892589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbnct\" (UniqueName: \"kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.892822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.892875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.892937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.892955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.994798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995356 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.995420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbnct\" (UniqueName: \"kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.997985 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.998331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:20 crc kubenswrapper[5030]: I0120 23:16:20.998888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.001641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.004576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.007039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.007660 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.010496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gbnct\" (UniqueName: \"kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.032486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.123215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.465727 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wptg\" (UniqueName: \"kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504172 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.504326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd\") pod \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\" (UID: \"963ffd35-3168-49a4-a9a7-5fdc54d7d606\") " Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.505080 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.505604 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.507800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg" (OuterVolumeSpecName: "kube-api-access-5wptg") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "kube-api-access-5wptg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.510068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts" (OuterVolumeSpecName: "scripts") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.528796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.578067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.608962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wptg\" (UniqueName: \"kubernetes.io/projected/963ffd35-3168-49a4-a9a7-5fdc54d7d606-kube-api-access-5wptg\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.609001 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.610001 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.610032 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.610048 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.610059 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/963ffd35-3168-49a4-a9a7-5fdc54d7d606-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.610930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data" (OuterVolumeSpecName: "config-data") pod "963ffd35-3168-49a4-a9a7-5fdc54d7d606" (UID: "963ffd35-3168-49a4-a9a7-5fdc54d7d606"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.627716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.712066 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963ffd35-3168-49a4-a9a7-5fdc54d7d606-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.735163 5030 generic.go:334] "Generic (PLEG): container finished" podID="41f0470c-c766-4578-b6ac-a3c182ea512e" containerID="64e8ddea6808b444003252fcc52a6d1bf1a5cedf7605c158182a33e5e1061381" exitCode=0 Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.735251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" event={"ID":"41f0470c-c766-4578-b6ac-a3c182ea512e","Type":"ContainerDied","Data":"64e8ddea6808b444003252fcc52a6d1bf1a5cedf7605c158182a33e5e1061381"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.737372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerStarted","Data":"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.741228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerStarted","Data":"38bcd5a1f25e0883f7b350207a30a8db416dd54db126a21b4d6179df35d4c348"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.745858 5030 generic.go:334] "Generic (PLEG): container finished" podID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" exitCode=0 Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.745889 5030 generic.go:334] "Generic (PLEG): container finished" podID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" exitCode=2 Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.745897 5030 generic.go:334] "Generic (PLEG): container finished" podID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" exitCode=0 Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.745904 5030 generic.go:334] "Generic (PLEG): container finished" podID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" exitCode=0 Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746093 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerDied","Data":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerDied","Data":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerDied","Data":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerDied","Data":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"963ffd35-3168-49a4-a9a7-5fdc54d7d606","Type":"ContainerDied","Data":"0e4b964cc2f3e1674de73e44e90008db7042d8059c4261ac1f1d8f768a16cb54"} Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.746982 5030 scope.go:117] "RemoveContainer" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.805591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.814869 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.832899 5030 scope.go:117] "RemoveContainer" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.834476 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:21 crc kubenswrapper[5030]: E0120 23:16:21.834895 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-notification-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.834913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-notification-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: E0120 23:16:21.834926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="sg-core" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.834934 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="sg-core" Jan 20 23:16:21 crc kubenswrapper[5030]: E0120 23:16:21.834944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-central-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.834950 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-central-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: E0120 23:16:21.834976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="proxy-httpd" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.834981 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="proxy-httpd" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.835134 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="sg-core" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.835144 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-central-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.835159 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="ceilometer-notification-agent" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.835175 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" containerName="proxy-httpd" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.836823 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.839806 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.839968 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.844783 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.917469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.917524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.917652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxmh\" (UniqueName: \"kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.917706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.917826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.918067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.918099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.976995 5030 scope.go:117] "RemoveContainer" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.978298 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e5c279-1975-4f72-bf29-26cd37764cf5" path="/var/lib/kubelet/pods/85e5c279-1975-4f72-bf29-26cd37764cf5/volumes" Jan 20 23:16:21 crc kubenswrapper[5030]: I0120 23:16:21.978909 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963ffd35-3168-49a4-a9a7-5fdc54d7d606" path="/var/lib/kubelet/pods/963ffd35-3168-49a4-a9a7-5fdc54d7d606/volumes" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxmh\" (UniqueName: \"kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.020548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.021137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.021449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.032056 5030 scope.go:117] "RemoveContainer" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.032496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.032524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.032823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.037244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.071121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxmh\" (UniqueName: \"kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh\") pod \"ceilometer-0\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.141044 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.160672 5030 scope.go:117] "RemoveContainer" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:22 crc kubenswrapper[5030]: E0120 23:16:22.161236 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": container with ID starting with bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b not found: ID does not exist" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.161266 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} err="failed to get container status \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": rpc error: code = NotFound desc = could not find container \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": container with ID starting with bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.161288 5030 scope.go:117] "RemoveContainer" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:22 crc kubenswrapper[5030]: E0120 23:16:22.161775 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": container with ID starting with f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6 not found: ID does not exist" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.161813 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} err="failed to get container status \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": rpc error: code = NotFound desc = could not find container \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": container with ID starting with f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.161841 5030 scope.go:117] "RemoveContainer" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:22 crc kubenswrapper[5030]: E0120 23:16:22.162231 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": container with ID starting with 3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2 not found: ID does not exist" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162289 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} err="failed to 
get container status \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": rpc error: code = NotFound desc = could not find container \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": container with ID starting with 3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162321 5030 scope.go:117] "RemoveContainer" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: E0120 23:16:22.162677 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": container with ID starting with 78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd not found: ID does not exist" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162705 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} err="failed to get container status \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": rpc error: code = NotFound desc = could not find container \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": container with ID starting with 78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162724 5030 scope.go:117] "RemoveContainer" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162922 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} err="failed to get container status \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": rpc error: code = NotFound desc = could not find container \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": container with ID starting with bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.162941 5030 scope.go:117] "RemoveContainer" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163115 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} err="failed to get container status \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": rpc error: code = NotFound desc = could not find container \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": container with ID starting with f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163128 5030 scope.go:117] "RemoveContainer" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163285 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} err="failed to 
get container status \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": rpc error: code = NotFound desc = could not find container \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": container with ID starting with 3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163301 5030 scope.go:117] "RemoveContainer" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163449 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} err="failed to get container status \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": rpc error: code = NotFound desc = could not find container \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": container with ID starting with 78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163465 5030 scope.go:117] "RemoveContainer" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} err="failed to get container status \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": rpc error: code = NotFound desc = could not find container \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": container with ID starting with bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.163781 5030 scope.go:117] "RemoveContainer" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.164102 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} err="failed to get container status \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": rpc error: code = NotFound desc = could not find container \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": container with ID starting with f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.164124 5030 scope.go:117] "RemoveContainer" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.165403 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} err="failed to get container status \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": rpc error: code = NotFound desc = could not find container \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": container with ID starting with 3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.165426 5030 scope.go:117] "RemoveContainer" 
containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.166439 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} err="failed to get container status \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": rpc error: code = NotFound desc = could not find container \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": container with ID starting with 78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.166478 5030 scope.go:117] "RemoveContainer" containerID="bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.166674 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b"} err="failed to get container status \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": rpc error: code = NotFound desc = could not find container \"bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b\": container with ID starting with bb0e695abcabdb85578b912ecddfb1db329eb8e33188bce447d680da10558d9b not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.166994 5030 scope.go:117] "RemoveContainer" containerID="f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.167433 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6"} err="failed to get container status \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": rpc error: code = NotFound desc = could not find container \"f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6\": container with ID starting with f8ac38a1e7df784a854a70b48c381f9bd0dcc6d2bf1fe685ffbe42ae0a9023e6 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.167448 5030 scope.go:117] "RemoveContainer" containerID="3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.167854 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2"} err="failed to get container status \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": rpc error: code = NotFound desc = could not find container \"3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2\": container with ID starting with 3b2bfdf2afba9fb219c1ce670590074c91fbd48ca87c85936f71d68e0360b2b2 not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.167868 5030 scope.go:117] "RemoveContainer" containerID="78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.168101 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd"} err="failed to get container status \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": rpc error: code = NotFound desc = could not find 
container \"78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd\": container with ID starting with 78e26e3ddf67d1aee8fbe4994241ed5aa24243a2580be84448d72525f19c09dd not found: ID does not exist" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.223601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwgws\" (UniqueName: \"kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws\") pod \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.223681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data\") pod \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.223831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts\") pod \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.223892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle\") pod \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.223962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs\") pod \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\" (UID: \"24326cd7-d35a-41a1-b953-3ca15ad5d5f0\") " Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.224529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs" (OuterVolumeSpecName: "logs") pod "24326cd7-d35a-41a1-b953-3ca15ad5d5f0" (UID: "24326cd7-d35a-41a1-b953-3ca15ad5d5f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.230158 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts" (OuterVolumeSpecName: "scripts") pod "24326cd7-d35a-41a1-b953-3ca15ad5d5f0" (UID: "24326cd7-d35a-41a1-b953-3ca15ad5d5f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.230169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws" (OuterVolumeSpecName: "kube-api-access-rwgws") pod "24326cd7-d35a-41a1-b953-3ca15ad5d5f0" (UID: "24326cd7-d35a-41a1-b953-3ca15ad5d5f0"). InnerVolumeSpecName "kube-api-access-rwgws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.260721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data" (OuterVolumeSpecName: "config-data") pod "24326cd7-d35a-41a1-b953-3ca15ad5d5f0" (UID: "24326cd7-d35a-41a1-b953-3ca15ad5d5f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.266894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24326cd7-d35a-41a1-b953-3ca15ad5d5f0" (UID: "24326cd7-d35a-41a1-b953-3ca15ad5d5f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.288203 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.332005 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.332053 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.332073 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.332087 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwgws\" (UniqueName: \"kubernetes.io/projected/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-kube-api-access-rwgws\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.332100 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24326cd7-d35a-41a1-b953-3ca15ad5d5f0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.733481 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:16:22 crc kubenswrapper[5030]: W0120 23:16:22.738169 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8a6e168_5084_4ff7_96e9_86b047eb8da4.slice/crio-f19a61eba07fe4ea4e199c3de27916551857b7ba13008381060823afbe93858b WatchSource:0}: Error finding container f19a61eba07fe4ea4e199c3de27916551857b7ba13008381060823afbe93858b: Status 404 returned error can't find the container with id f19a61eba07fe4ea4e199c3de27916551857b7ba13008381060823afbe93858b Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.811905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerStarted","Data":"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36"} Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.823858 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerStarted","Data":"f19a61eba07fe4ea4e199c3de27916551857b7ba13008381060823afbe93858b"} Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.842021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" event={"ID":"24326cd7-d35a-41a1-b953-3ca15ad5d5f0","Type":"ContainerDied","Data":"82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0"} Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.842305 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-hk4fb" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.842316 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c5a672737f6c5f4e93f12bce69a8c406d7b63919c6183b59766fb838ac64e0" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.858922 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.858899684 podStartE2EDuration="3.858899684s" podCreationTimestamp="2026-01-20 23:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:22.845681314 +0000 UTC m=+2455.165941602" watchObservedRunningTime="2026-01-20 23:16:22.858899684 +0000 UTC m=+2455.179159982" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.859108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerStarted","Data":"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b"} Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.872115 5030 generic.go:334] "Generic (PLEG): container finished" podID="830844d5-003c-44cd-bf80-5c78c96c1862" containerID="6d9a540904da26e50af38624029f0074073b55fb95f2afb13be3d84f6f8527ef" exitCode=0 Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.872279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" event={"ID":"830844d5-003c-44cd-bf80-5c78c96c1862","Type":"ContainerDied","Data":"6d9a540904da26e50af38624029f0074073b55fb95f2afb13be3d84f6f8527ef"} Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.983344 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:16:22 crc kubenswrapper[5030]: E0120 23:16:22.983705 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24326cd7-d35a-41a1-b953-3ca15ad5d5f0" containerName="placement-db-sync" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.983737 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24326cd7-d35a-41a1-b953-3ca15ad5d5f0" containerName="placement-db-sync" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.983913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24326cd7-d35a-41a1-b953-3ca15ad5d5f0" containerName="placement-db-sync" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.984750 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.992282 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.992506 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-5jnzt" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.992720 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.992864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:16:22 crc kubenswrapper[5030]: I0120 23:16:22.998151 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.009686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkrq6\" (UniqueName: \"kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6\") pod \"placement-847f86bdb-chr55\" (UID: 
\"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.049820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkrq6\" (UniqueName: \"kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.151670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.154125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs\") pod \"placement-847f86bdb-chr55\" (UID: 
\"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.163089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.163426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.165356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.166582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.173247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkrq6\" (UniqueName: \"kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.187315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") pod \"placement-847f86bdb-chr55\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.317573 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.365845 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fcs\" (UniqueName: \"kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.456702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data\") pod \"41f0470c-c766-4578-b6ac-a3c182ea512e\" (UID: \"41f0470c-c766-4578-b6ac-a3c182ea512e\") " Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.464771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs" (OuterVolumeSpecName: "kube-api-access-v5fcs") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "kube-api-access-v5fcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.465263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.466700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts" (OuterVolumeSpecName: "scripts") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.477721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.492538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data" (OuterVolumeSpecName: "config-data") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.500021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f0470c-c766-4578-b6ac-a3c182ea512e" (UID: "41f0470c-c766-4578-b6ac-a3c182ea512e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559110 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559502 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fcs\" (UniqueName: \"kubernetes.io/projected/41f0470c-c766-4578-b6ac-a3c182ea512e-kube-api-access-v5fcs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559524 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559547 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559573 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.559663 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f0470c-c766-4578-b6ac-a3c182ea512e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.774372 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.866479 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2xf4v"] Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.874149 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2xf4v"] Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.888335 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerStarted","Data":"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23"} Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.891117 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2xf4v" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.891132 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82a34867668251a23d4c6c1659885b85b22d77489cc09bc65817cee219efb488" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.902177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerStarted","Data":"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34"} Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.904199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerStarted","Data":"1ad82cb309994355dd75bacb6def2fb63cb49faec561af7264c273e18bc1ab4f"} Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.925031 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.925013295 podStartE2EDuration="3.925013295s" podCreationTimestamp="2026-01-20 23:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:23.910380012 +0000 UTC m=+2456.230640320" watchObservedRunningTime="2026-01-20 23:16:23.925013295 +0000 UTC m=+2456.245273583" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.982027 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f0470c-c766-4578-b6ac-a3c182ea512e" path="/var/lib/kubelet/pods/41f0470c-c766-4578-b6ac-a3c182ea512e/volumes" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.982974 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-t2s2w"] Jan 20 23:16:23 crc kubenswrapper[5030]: E0120 23:16:23.983543 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f0470c-c766-4578-b6ac-a3c182ea512e" containerName="keystone-bootstrap" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.983608 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f0470c-c766-4578-b6ac-a3c182ea512e" containerName="keystone-bootstrap" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.983811 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f0470c-c766-4578-b6ac-a3c182ea512e" containerName="keystone-bootstrap" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.985259 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-t2s2w"] Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.985374 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.989182 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.989466 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-88kcj" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.989934 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.990048 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:16:23 crc kubenswrapper[5030]: I0120 23:16:23.990354 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179593 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lvph\" (UniqueName: \"kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.179913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lvph\" (UniqueName: \"kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.281516 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.286491 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.287062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.287922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.290175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.290301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.293667 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.301901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lvph\" (UniqueName: \"kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph\") pod \"keystone-bootstrap-t2s2w\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.317906 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.486449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data\") pod \"830844d5-003c-44cd-bf80-5c78c96c1862\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.486830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qckj\" (UniqueName: \"kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj\") pod \"830844d5-003c-44cd-bf80-5c78c96c1862\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.486976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle\") pod \"830844d5-003c-44cd-bf80-5c78c96c1862\" (UID: \"830844d5-003c-44cd-bf80-5c78c96c1862\") " Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.493713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "830844d5-003c-44cd-bf80-5c78c96c1862" (UID: "830844d5-003c-44cd-bf80-5c78c96c1862"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.495017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj" (OuterVolumeSpecName: "kube-api-access-7qckj") pod "830844d5-003c-44cd-bf80-5c78c96c1862" (UID: "830844d5-003c-44cd-bf80-5c78c96c1862"). InnerVolumeSpecName "kube-api-access-7qckj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.539558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "830844d5-003c-44cd-bf80-5c78c96c1862" (UID: "830844d5-003c-44cd-bf80-5c78c96c1862"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.589359 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.589392 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/830844d5-003c-44cd-bf80-5c78c96c1862-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.589404 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qckj\" (UniqueName: \"kubernetes.io/projected/830844d5-003c-44cd-bf80-5c78c96c1862-kube-api-access-7qckj\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.792391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-t2s2w"] Jan 20 23:16:24 crc kubenswrapper[5030]: W0120 23:16:24.795993 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca320b58_7251_428b_a18b_ba5c632eb546.slice/crio-f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f WatchSource:0}: Error finding container f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f: Status 404 returned error can't find the container with id f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.920997 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" event={"ID":"ca320b58-7251-428b-a18b-ba5c632eb546","Type":"ContainerStarted","Data":"f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f"} Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.926824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerStarted","Data":"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2"} Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.930355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerStarted","Data":"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd"} Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.930400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerStarted","Data":"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c"} Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.930451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:24 crc 
kubenswrapper[5030]: I0120 23:16:24.930473 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.940067 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.940084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-pzk5s" event={"ID":"830844d5-003c-44cd-bf80-5c78c96c1862","Type":"ContainerDied","Data":"b48bc3c3df5e58bdccf2a916583c6b9b9c86dcbe2e3c9109286c45a48f936567"} Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.940133 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b48bc3c3df5e58bdccf2a916583c6b9b9c86dcbe2e3c9109286c45a48f936567" Jan 20 23:16:24 crc kubenswrapper[5030]: I0120 23:16:24.968906 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" podStartSLOduration=2.968884271 podStartE2EDuration="2.968884271s" podCreationTimestamp="2026-01-20 23:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:24.957425544 +0000 UTC m=+2457.277685832" watchObservedRunningTime="2026-01-20 23:16:24.968884271 +0000 UTC m=+2457.289144559" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.152682 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:16:25 crc kubenswrapper[5030]: E0120 23:16:25.153047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830844d5-003c-44cd-bf80-5c78c96c1862" containerName="barbican-db-sync" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.153058 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="830844d5-003c-44cd-bf80-5c78c96c1862" containerName="barbican-db-sync" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.153247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="830844d5-003c-44cd-bf80-5c78c96c1862" containerName="barbican-db-sync" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.154153 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.159022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.163615 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-bskvr" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.163866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.176549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.200641 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.204248 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.206891 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.222881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.315346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5n4t\" (UniqueName: \"kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kjtp\" (UniqueName: \"kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316793 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.316850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.321771 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.328973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.331842 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.355479 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5n4t\" (UniqueName: \"kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kjtp\" (UniqueName: \"kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418641 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418855 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 
23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.418949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzrhc\" (UniqueName: \"kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.419089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.419874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.423688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.424342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.425864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.426036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.441431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.443539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.444348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kjtp\" (UniqueName: \"kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp\") pod \"barbican-keystone-listener-67fcc84c58-mxnkm\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.470919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5n4t\" (UniqueName: \"kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t\") pod \"barbican-worker-5748c46787-5fm27\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.488355 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzrhc\" (UniqueName: \"kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.521687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.525056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.525140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.526201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data\") pod \"barbican-api-545bc46c64-vbtck\" (UID: 
\"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.533570 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.540782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzrhc\" (UniqueName: \"kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc\") pod \"barbican-api-545bc46c64-vbtck\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.654656 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.951042 5030 generic.go:334] "Generic (PLEG): container finished" podID="aab764fb-e311-4d5c-8139-65120ec5f9eb" containerID="962d321b449e41fd746f074eb545941f90420af47ecf8de63641e62ace89067f" exitCode=0 Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.951103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" event={"ID":"aab764fb-e311-4d5c-8139-65120ec5f9eb","Type":"ContainerDied","Data":"962d321b449e41fd746f074eb545941f90420af47ecf8de63641e62ace89067f"} Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.952837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" event={"ID":"ca320b58-7251-428b-a18b-ba5c632eb546","Type":"ContainerStarted","Data":"2c66ada43959a170fa4404aa8228c36bf2d04ad637119ab68bac6365e9aa4160"} Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.955581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerStarted","Data":"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea"} Jan 20 23:16:25 crc kubenswrapper[5030]: I0120 23:16:25.957013 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.104877 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" podStartSLOduration=3.104860788 podStartE2EDuration="3.104860788s" podCreationTimestamp="2026-01-20 23:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:26.00695628 +0000 UTC m=+2458.327216568" watchObservedRunningTime="2026-01-20 23:16:26.104860788 +0000 UTC m=+2458.425121076" Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.109831 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.176642 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:26 crc kubenswrapper[5030]: W0120 23:16:26.182561 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a6d8f73_8f49_4501_8fde_95d7fcdfb921.slice/crio-4f31548937bd7ac847a7844a6c576abcc1403660718ff0a8052b5afad2421491 WatchSource:0}: Error finding container 4f31548937bd7ac847a7844a6c576abcc1403660718ff0a8052b5afad2421491: Status 404 returned error can't find the container with id 4f31548937bd7ac847a7844a6c576abcc1403660718ff0a8052b5afad2421491 Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.969157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerStarted","Data":"2cde153b5bb54432054bfba907408b59ea1d150dae9a9dfecb5a08d718525b1c"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.969391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerStarted","Data":"4f31548937bd7ac847a7844a6c576abcc1403660718ff0a8052b5afad2421491"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.973815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerStarted","Data":"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.973848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerStarted","Data":"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.973859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerStarted","Data":"98ed6ea2e1ae1d5a798f1cd07fdf870debb021c1e34fba786b210915bcbb5872"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.979112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerStarted","Data":"12ac3c08c01fa3165cd6cab4fc0e255fb705e419bf03d4f6f9041ab8c246bd09"} Jan 20 23:16:26 crc kubenswrapper[5030]: I0120 23:16:26.979154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerStarted","Data":"dd7bc71f52051bd49bf081bebfad56ce7784ffdb02bda65b8f96c08631242c97"} Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.395363 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.419881 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" podStartSLOduration=2.419862143 podStartE2EDuration="2.419862143s" podCreationTimestamp="2026-01-20 23:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:26.995001325 +0000 UTC m=+2459.315261623" watchObservedRunningTime="2026-01-20 23:16:27.419862143 +0000 UTC m=+2459.740122431" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.558376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.558451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.558513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.558719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.558792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.559154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.559458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwk49\" (UniqueName: \"kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49\") pod \"aab764fb-e311-4d5c-8139-65120ec5f9eb\" (UID: \"aab764fb-e311-4d5c-8139-65120ec5f9eb\") " Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.559937 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aab764fb-e311-4d5c-8139-65120ec5f9eb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.564013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49" (OuterVolumeSpecName: "kube-api-access-gwk49") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "kube-api-access-gwk49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.564361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.565712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts" (OuterVolumeSpecName: "scripts") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.593865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.622943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data" (OuterVolumeSpecName: "config-data") pod "aab764fb-e311-4d5c-8139-65120ec5f9eb" (UID: "aab764fb-e311-4d5c-8139-65120ec5f9eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.662307 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.662349 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwk49\" (UniqueName: \"kubernetes.io/projected/aab764fb-e311-4d5c-8139-65120ec5f9eb-kube-api-access-gwk49\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.662363 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.662377 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:27 crc kubenswrapper[5030]: I0120 23:16:27.662389 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aab764fb-e311-4d5c-8139-65120ec5f9eb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.015384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerStarted","Data":"06ee7ec944a9d5065323264cea939f93bd87f7249327c7674f96be41e6bbd605"} Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.015472 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.015494 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.021752 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" event={"ID":"aab764fb-e311-4d5c-8139-65120ec5f9eb","Type":"ContainerDied","Data":"dfde573ba78cc60dd1244f7109533ecfac6aa676c36092ae03c24b17936d8227"} Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.021788 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfde573ba78cc60dd1244f7109533ecfac6aa676c36092ae03c24b17936d8227" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.021788 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-bdw8v" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.034813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerStarted","Data":"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868"} Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.035500 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.049890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerStarted","Data":"88a2387b417e90bfd501bc833d3ec1e5d951fed92f4116f90417ad981562c482"} Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.079578 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" podStartSLOduration=3.079561063 podStartE2EDuration="3.079561063s" podCreationTimestamp="2026-01-20 23:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:28.071804427 +0000 UTC m=+2460.392064715" watchObservedRunningTime="2026-01-20 23:16:28.079561063 +0000 UTC m=+2460.399821351" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.120510 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.453490763 podStartE2EDuration="7.120493764s" podCreationTimestamp="2026-01-20 23:16:21 +0000 UTC" firstStartedPulling="2026-01-20 23:16:22.741277414 +0000 UTC m=+2455.061537712" lastFinishedPulling="2026-01-20 23:16:26.408280425 +0000 UTC m=+2458.728540713" observedRunningTime="2026-01-20 23:16:28.108389474 +0000 UTC m=+2460.428649762" watchObservedRunningTime="2026-01-20 23:16:28.120493764 +0000 UTC m=+2460.440754052" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.159788 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:28 crc kubenswrapper[5030]: E0120 23:16:28.160197 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab764fb-e311-4d5c-8139-65120ec5f9eb" containerName="cinder-db-sync" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.160213 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab764fb-e311-4d5c-8139-65120ec5f9eb" containerName="cinder-db-sync" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.160392 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab764fb-e311-4d5c-8139-65120ec5f9eb" containerName="cinder-db-sync" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.161331 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.171864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.172004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.172124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.172277 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-74drc" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.195841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.195898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.195951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2clk\" (UniqueName: \"kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.196023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.196039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.196063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.198657 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.207875 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" podStartSLOduration=3.20785777 podStartE2EDuration="3.20785777s" podCreationTimestamp="2026-01-20 23:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:28.17159479 +0000 UTC m=+2460.491855068" watchObservedRunningTime="2026-01-20 23:16:28.20785777 +0000 UTC m=+2460.528118058" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.295017 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.296555 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2clk\" (UniqueName: \"kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.297971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.299217 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.300300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.304105 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.304545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.306168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.310117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.317055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2clk\" (UniqueName: \"kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.327285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x77q9\" (UniqueName: \"kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.400531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.502317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.503063 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.503240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.503329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x77q9\" (UniqueName: \"kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.503401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.503498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc 
kubenswrapper[5030]: I0120 23:16:28.503575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.504131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.504246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.505997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.507553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.509401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.520179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.524739 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x77q9\" (UniqueName: \"kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9\") pod \"cinder-api-0\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.543110 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.640747 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.642744 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.644910 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.648342 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.652953 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709132 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmlv\" (UniqueName: \"kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709298 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " 
pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.709597 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmlv\" (UniqueName: \"kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.813970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.816413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.818075 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.818705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.819170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.826543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.826949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.847156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwmlv\" (UniqueName: \"kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv\") pod \"barbican-api-84bf68df56-874dk\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.962597 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:16:28 crc kubenswrapper[5030]: E0120 23:16:28.962853 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:16:28 crc kubenswrapper[5030]: I0120 23:16:28.975861 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:29 crc kubenswrapper[5030]: I0120 23:16:29.087826 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca320b58-7251-428b-a18b-ba5c632eb546" containerID="2c66ada43959a170fa4404aa8228c36bf2d04ad637119ab68bac6365e9aa4160" exitCode=0 Jan 20 23:16:29 crc kubenswrapper[5030]: I0120 23:16:29.087872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" event={"ID":"ca320b58-7251-428b-a18b-ba5c632eb546","Type":"ContainerDied","Data":"2c66ada43959a170fa4404aa8228c36bf2d04ad637119ab68bac6365e9aa4160"} Jan 20 23:16:29 crc kubenswrapper[5030]: I0120 23:16:29.101432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:29 crc kubenswrapper[5030]: I0120 23:16:29.270267 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:29 crc kubenswrapper[5030]: I0120 23:16:29.536093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:16:29 crc kubenswrapper[5030]: W0120 23:16:29.540901 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a68c1e_d515_4dac_a25a_d7ed5da4b15c.slice/crio-89553af5727ecc437b2d1ea8da1cfc3196096efa610a1336a09881122cfd2e19 WatchSource:0}: Error finding container 89553af5727ecc437b2d1ea8da1cfc3196096efa610a1336a09881122cfd2e19: Status 404 returned error can't find the container with id 89553af5727ecc437b2d1ea8da1cfc3196096efa610a1336a09881122cfd2e19 Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.111904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.112366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.156851 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerStarted","Data":"05743f2bd663ee4c6959a35a5be0c01f08b3fea21e5994bca8e2faae3bb81e6a"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.156896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerStarted","Data":"89553af5727ecc437b2d1ea8da1cfc3196096efa610a1336a09881122cfd2e19"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.197815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.202048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerStarted","Data":"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.202086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" 
event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerStarted","Data":"1e9f4b720904a6d53a5f55dd2d0e651c680dfe76e86f9f738dadc1f25b33ea1a"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.263316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerStarted","Data":"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.263365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerStarted","Data":"4c8885cc51edbe4b92ef6ffbdbe57ac72d729921988e0c1152739d2cff608bb9"} Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.374716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.754611 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.866827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.866876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.866900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.866940 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.867036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lvph\" (UniqueName: \"kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.867084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys\") pod \"ca320b58-7251-428b-a18b-ba5c632eb546\" (UID: \"ca320b58-7251-428b-a18b-ba5c632eb546\") " Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.872382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts" (OuterVolumeSpecName: "scripts") pod 
"ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.872491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.878178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph" (OuterVolumeSpecName: "kube-api-access-2lvph") pod "ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "kube-api-access-2lvph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.879865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.901859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.917603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data" (OuterVolumeSpecName: "config-data") pod "ca320b58-7251-428b-a18b-ba5c632eb546" (UID: "ca320b58-7251-428b-a18b-ba5c632eb546"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969212 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lvph\" (UniqueName: \"kubernetes.io/projected/ca320b58-7251-428b-a18b-ba5c632eb546-kube-api-access-2lvph\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969246 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969255 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969265 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969317 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:30 crc kubenswrapper[5030]: I0120 23:16:30.969328 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca320b58-7251-428b-a18b-ba5c632eb546-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.123996 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.124064 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.165459 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.182795 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.255818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:16:31 crc kubenswrapper[5030]: E0120 23:16:31.256205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca320b58-7251-428b-a18b-ba5c632eb546" containerName="keystone-bootstrap" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.256220 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca320b58-7251-428b-a18b-ba5c632eb546" containerName="keystone-bootstrap" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.256396 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca320b58-7251-428b-a18b-ba5c632eb546" containerName="keystone-bootstrap" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.257011 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.258787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.260098 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274641 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrnqm\" (UniqueName: \"kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.274524 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:16:31 crc kubenswrapper[5030]: 
I0120 23:16:31.275084 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.275158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerStarted","Data":"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e"} Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.275380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.276004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerStarted","Data":"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08"} Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.279153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerStarted","Data":"d64a394c4f184d65be789b464e93507d0d51cff95adc8f21cab504aa94390ded"} Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.279742 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.279998 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.282744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" event={"ID":"ca320b58-7251-428b-a18b-ba5c632eb546","Type":"ContainerDied","Data":"f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f"} Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.283015 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f214098fdd6139be8618c4b38805d53fde9324eecd02470870f66f7f3521b84f" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.283037 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.283050 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.283059 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.283068 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.282877 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-t2s2w" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.349648 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.349631312 podStartE2EDuration="3.349631312s" podCreationTimestamp="2026-01-20 23:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:31.323318501 +0000 UTC m=+2463.643578789" watchObservedRunningTime="2026-01-20 23:16:31.349631312 +0000 UTC m=+2463.669891600" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.369900 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" podStartSLOduration=3.369880937 podStartE2EDuration="3.369880937s" podCreationTimestamp="2026-01-20 23:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:31.361698371 +0000 UTC m=+2463.681958659" watchObservedRunningTime="2026-01-20 23:16:31.369880937 +0000 UTC m=+2463.690141225" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377639 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrnqm\" (UniqueName: \"kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " 
pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.377963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.386225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.390240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.390583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.393115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.398672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.398753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.408508 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.408491014 podStartE2EDuration="3.408491014s" podCreationTimestamp="2026-01-20 23:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:16:31.380525883 +0000 UTC m=+2463.700786171" watchObservedRunningTime="2026-01-20 23:16:31.408491014 +0000 UTC m=+2463.728751302" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.418485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.420775 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrnqm\" (UniqueName: \"kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm\") pod \"keystone-7b566b8847-cg4mk\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:31 crc kubenswrapper[5030]: I0120 23:16:31.574013 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:32 crc kubenswrapper[5030]: I0120 23:16:32.066800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:16:32 crc kubenswrapper[5030]: W0120 23:16:32.070961 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09176751_6268_4f1b_9348_5adb2938e4ca.slice/crio-de8085f07a604d20402ab02d2aa43506f40e06e478b644c7a18f578aa97a45c9 WatchSource:0}: Error finding container de8085f07a604d20402ab02d2aa43506f40e06e478b644c7a18f578aa97a45c9: Status 404 returned error can't find the container with id de8085f07a604d20402ab02d2aa43506f40e06e478b644c7a18f578aa97a45c9 Jan 20 23:16:32 crc kubenswrapper[5030]: I0120 23:16:32.291068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" event={"ID":"09176751-6268-4f1b-9348-5adb2938e4ca","Type":"ContainerStarted","Data":"de8085f07a604d20402ab02d2aa43506f40e06e478b644c7a18f578aa97a45c9"} Jan 20 23:16:32 crc kubenswrapper[5030]: I0120 23:16:32.496021 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.303599 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.303647 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.303641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" event={"ID":"09176751-6268-4f1b-9348-5adb2938e4ca","Type":"ContainerStarted","Data":"5335c6175ed4a270a6a0424ed58b7a7123c2bee5625d69ab613e39861259cab6"} Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.303699 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.303711 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.304996 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api" 
containerID="cri-o://ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" gracePeriod=30 Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.305156 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api-log" containerID="cri-o://62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" gracePeriod=30 Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.335326 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" podStartSLOduration=2.33530462 podStartE2EDuration="2.33530462s" podCreationTimestamp="2026-01-20 23:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:33.326728784 +0000 UTC m=+2465.646989082" watchObservedRunningTime="2026-01-20 23:16:33.33530462 +0000 UTC m=+2465.655564908" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.543907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:33 crc kubenswrapper[5030]: I0120 23:16:33.885723 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.050813 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x77q9\" (UniqueName: \"kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.132720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id\") pod \"92537751-2231-4449-9155-124a77773f50\" (UID: \"92537751-2231-4449-9155-124a77773f50\") " Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.133175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.135906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs" (OuterVolumeSpecName: "logs") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.152775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.154909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9" (OuterVolumeSpecName: "kube-api-access-x77q9") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "kube-api-access-x77q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.165822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts" (OuterVolumeSpecName: "scripts") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.195663 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235280 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235321 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235330 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92537751-2231-4449-9155-124a77773f50-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235351 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x77q9\" (UniqueName: \"kubernetes.io/projected/92537751-2231-4449-9155-124a77773f50-kube-api-access-x77q9\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235362 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.235370 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/92537751-2231-4449-9155-124a77773f50-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.253810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data" (OuterVolumeSpecName: "config-data") pod "92537751-2231-4449-9155-124a77773f50" (UID: "92537751-2231-4449-9155-124a77773f50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.313940 5030 generic.go:334] "Generic (PLEG): container finished" podID="92537751-2231-4449-9155-124a77773f50" containerID="ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" exitCode=0 Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314133 5030 generic.go:334] "Generic (PLEG): container finished" podID="92537751-2231-4449-9155-124a77773f50" containerID="62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" exitCode=143 Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314069 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314027 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerDied","Data":"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e"} Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerDied","Data":"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1"} Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"92537751-2231-4449-9155-124a77773f50","Type":"ContainerDied","Data":"1e9f4b720904a6d53a5f55dd2d0e651c680dfe76e86f9f738dadc1f25b33ea1a"} Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314310 5030 scope.go:117] "RemoveContainer" containerID="ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.314440 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.315141 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.326204 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.326320 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.335560 5030 scope.go:117] "RemoveContainer" containerID="62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.336656 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92537751-2231-4449-9155-124a77773f50-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.353786 5030 scope.go:117] "RemoveContainer" containerID="ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" Jan 20 23:16:34 crc kubenswrapper[5030]: E0120 23:16:34.356885 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e\": container with ID starting with ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e not found: ID does not exist" containerID="ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.356956 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e"} err="failed to get container status \"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e\": rpc error: code = NotFound desc = could not find container \"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e\": container with ID starting with ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e not found: ID does not exist" Jan 20 23:16:34 crc kubenswrapper[5030]: 
I0120 23:16:34.356982 5030 scope.go:117] "RemoveContainer" containerID="62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" Jan 20 23:16:34 crc kubenswrapper[5030]: E0120 23:16:34.357387 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1\": container with ID starting with 62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1 not found: ID does not exist" containerID="62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.357414 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1"} err="failed to get container status \"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1\": rpc error: code = NotFound desc = could not find container \"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1\": container with ID starting with 62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1 not found: ID does not exist" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.357430 5030 scope.go:117] "RemoveContainer" containerID="ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.357695 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e"} err="failed to get container status \"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e\": rpc error: code = NotFound desc = could not find container \"ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e\": container with ID starting with ddd5b19873581e04b894687190d319de8799c8d5d5586af231599b45c702f93e not found: ID does not exist" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.357733 5030 scope.go:117] "RemoveContainer" containerID="62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.357991 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1"} err="failed to get container status \"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1\": rpc error: code = NotFound desc = could not find container \"62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1\": container with ID starting with 62564317b0af4d1f30a7edce9d2bf5b2cec1445f6e1ac2b11f3df098b63bbfd1 not found: ID does not exist" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.358924 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.372370 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.379470 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.396410 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:34 crc kubenswrapper[5030]: E0120 23:16:34.396848 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.396866 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api" Jan 20 23:16:34 crc kubenswrapper[5030]: E0120 23:16:34.396882 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api-log" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.396889 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api-log" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.397069 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.397104 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92537751-2231-4449-9155-124a77773f50" containerName="cinder-api-log" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.398109 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.399867 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.399998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.400020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.439680 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.498810 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545378 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rdw\" (UniqueName: 
\"kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545551 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.545719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86rdw\" (UniqueName: \"kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw\") pod \"cinder-api-0\" (UID: 
\"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.647803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.651338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.651817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.651862 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.652887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.655392 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.661097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.668127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rdw\" (UniqueName: \"kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw\") pod \"cinder-api-0\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:34 crc kubenswrapper[5030]: I0120 23:16:34.728166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:35 crc kubenswrapper[5030]: I0120 23:16:35.177332 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:16:35 crc kubenswrapper[5030]: W0120 23:16:35.179396 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb175c0c8_ee1a_404d_9a14_d37744a24ece.slice/crio-cec0f26abf2ae33583bcab23496ac4f774202bf153430ca20224a7ba96e017f5 WatchSource:0}: Error finding container cec0f26abf2ae33583bcab23496ac4f774202bf153430ca20224a7ba96e017f5: Status 404 returned error can't find the container with id cec0f26abf2ae33583bcab23496ac4f774202bf153430ca20224a7ba96e017f5 Jan 20 23:16:35 crc kubenswrapper[5030]: I0120 23:16:35.331367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerStarted","Data":"cec0f26abf2ae33583bcab23496ac4f774202bf153430ca20224a7ba96e017f5"} Jan 20 23:16:35 crc kubenswrapper[5030]: I0120 23:16:35.977276 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92537751-2231-4449-9155-124a77773f50" path="/var/lib/kubelet/pods/92537751-2231-4449-9155-124a77773f50/volumes" Jan 20 23:16:36 crc kubenswrapper[5030]: I0120 23:16:36.343436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerStarted","Data":"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee"} Jan 20 23:16:37 crc kubenswrapper[5030]: I0120 23:16:37.185186 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:37 crc kubenswrapper[5030]: I0120 23:16:37.252573 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:37 crc kubenswrapper[5030]: I0120 23:16:37.356710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerStarted","Data":"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9"} Jan 20 23:16:37 crc kubenswrapper[5030]: I0120 23:16:37.382915 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.382895104 podStartE2EDuration="3.382895104s" podCreationTimestamp="2026-01-20 23:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:37.37526235 +0000 UTC m=+2469.695522648" watchObservedRunningTime="2026-01-20 23:16:37.382895104 +0000 UTC m=+2469.703155392" Jan 20 23:16:38 crc kubenswrapper[5030]: I0120 23:16:38.370525 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:38 crc kubenswrapper[5030]: I0120 23:16:38.778150 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:38 crc kubenswrapper[5030]: I0120 23:16:38.839599 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:39 crc kubenswrapper[5030]: I0120 23:16:39.387319 5030 generic.go:334] "Generic (PLEG): container finished" podID="b8466bfd-157a-488e-a8a3-856b4837603a" containerID="b52e95dd143ed5cf453a5cbefbdce934cf06f59d7f7dc81da3443df81759cf02" exitCode=0 Jan 20 23:16:39 crc kubenswrapper[5030]: I0120 23:16:39.387393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" event={"ID":"b8466bfd-157a-488e-a8a3-856b4837603a","Type":"ContainerDied","Data":"b52e95dd143ed5cf453a5cbefbdce934cf06f59d7f7dc81da3443df81759cf02"} Jan 20 23:16:39 crc kubenswrapper[5030]: I0120 23:16:39.387675 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="cinder-scheduler" containerID="cri-o://80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac" gracePeriod=30 Jan 20 23:16:39 crc kubenswrapper[5030]: I0120 23:16:39.388059 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="probe" containerID="cri-o://a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08" gracePeriod=30 Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.401714 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerID="a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08" exitCode=0 Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.401820 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerDied","Data":"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08"} Jan 20 23:16:40 crc 
kubenswrapper[5030]: I0120 23:16:40.537213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.602529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.682191 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.682409 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api-log" containerID="cri-o://2cde153b5bb54432054bfba907408b59ea1d150dae9a9dfecb5a08d718525b1c" gracePeriod=30 Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.682639 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api" containerID="cri-o://06ee7ec944a9d5065323264cea939f93bd87f7249327c7674f96be41e6bbd605" gracePeriod=30 Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.854433 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.962369 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:16:40 crc kubenswrapper[5030]: E0120 23:16:40.962735 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.973237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle\") pod \"b8466bfd-157a-488e-a8a3-856b4837603a\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.973312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config\") pod \"b8466bfd-157a-488e-a8a3-856b4837603a\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.973483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6rf8\" (UniqueName: \"kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8\") pod \"b8466bfd-157a-488e-a8a3-856b4837603a\" (UID: \"b8466bfd-157a-488e-a8a3-856b4837603a\") " Jan 20 23:16:40 crc kubenswrapper[5030]: I0120 23:16:40.982579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8" (OuterVolumeSpecName: "kube-api-access-t6rf8") pod "b8466bfd-157a-488e-a8a3-856b4837603a" (UID: 
"b8466bfd-157a-488e-a8a3-856b4837603a"). InnerVolumeSpecName "kube-api-access-t6rf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.019745 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8466bfd-157a-488e-a8a3-856b4837603a" (UID: "b8466bfd-157a-488e-a8a3-856b4837603a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.037296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config" (OuterVolumeSpecName: "config") pod "b8466bfd-157a-488e-a8a3-856b4837603a" (UID: "b8466bfd-157a-488e-a8a3-856b4837603a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.075930 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6rf8\" (UniqueName: \"kubernetes.io/projected/b8466bfd-157a-488e-a8a3-856b4837603a-kube-api-access-t6rf8\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.075971 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.075986 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8466bfd-157a-488e-a8a3-856b4837603a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.418408 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerID="2cde153b5bb54432054bfba907408b59ea1d150dae9a9dfecb5a08d718525b1c" exitCode=143 Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.418505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerDied","Data":"2cde153b5bb54432054bfba907408b59ea1d150dae9a9dfecb5a08d718525b1c"} Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.420599 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.420741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7wlmd" event={"ID":"b8466bfd-157a-488e-a8a3-856b4837603a","Type":"ContainerDied","Data":"916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad"} Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.420782 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916c00ca82b0f096d113633a3ae2784ca1831cb4174538e1c51a0a3e0bd1fbad" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.603969 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:16:41 crc kubenswrapper[5030]: E0120 23:16:41.604545 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8466bfd-157a-488e-a8a3-856b4837603a" containerName="neutron-db-sync" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.604561 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8466bfd-157a-488e-a8a3-856b4837603a" containerName="neutron-db-sync" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.605115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8466bfd-157a-488e-a8a3-856b4837603a" containerName="neutron-db-sync" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.606069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.613361 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.613559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.613672 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-dbr5q" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.613800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.636020 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.691178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.691232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfns2\" (UniqueName: \"kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.691393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config\") pod 
\"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.691425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.691491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.793849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.794567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfns2\" (UniqueName: \"kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.814874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.814955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.815045 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.839633 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.839695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfns2\" (UniqueName: 
\"kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.840793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.850863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.852250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs\") pod \"neutron-6854d54cb9-lmp44\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:41 crc kubenswrapper[5030]: I0120 23:16:41.940679 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.017305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.019914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2clk\" (UniqueName: \"kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.020282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.020304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.020324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.020398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 
23:16:42.020430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom\") pod \"7ca89209-f523-4a10-88fd-204d88ca6a94\" (UID: \"7ca89209-f523-4a10-88fd-204d88ca6a94\") " Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.020631 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.021829 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ca89209-f523-4a10-88fd-204d88ca6a94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.025389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.025468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk" (OuterVolumeSpecName: "kube-api-access-t2clk") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "kube-api-access-t2clk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.026770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts" (OuterVolumeSpecName: "scripts") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.076970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data" (OuterVolumeSpecName: "config-data") pod "7ca89209-f523-4a10-88fd-204d88ca6a94" (UID: "7ca89209-f523-4a10-88fd-204d88ca6a94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123751 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123767 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123794 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2clk\" (UniqueName: \"kubernetes.io/projected/7ca89209-f523-4a10-88fd-204d88ca6a94-kube-api-access-t2clk\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123806 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.123814 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca89209-f523-4a10-88fd-204d88ca6a94-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.432577 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerID="80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac" exitCode=0 Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.432677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerDied","Data":"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac"} Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.433040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7ca89209-f523-4a10-88fd-204d88ca6a94","Type":"ContainerDied","Data":"4c8885cc51edbe4b92ef6ffbdbe57ac72d729921988e0c1152739d2cff608bb9"} Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.433085 5030 scope.go:117] "RemoveContainer" containerID="a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.432700 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.460460 5030 scope.go:117] "RemoveContainer" containerID="80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.479052 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.491139 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.504812 5030 scope.go:117] "RemoveContainer" containerID="a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08" Jan 20 23:16:42 crc kubenswrapper[5030]: E0120 23:16:42.505286 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08\": container with ID starting with a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08 not found: ID does not exist" containerID="a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.505348 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08"} err="failed to get container status \"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08\": rpc error: code = NotFound desc = could not find container \"a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08\": container with ID starting with a7b7d78175af643e005d1b77bc3f44262d772f10f0024e025c2af150ad833d08 not found: ID does not exist" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.505380 5030 scope.go:117] "RemoveContainer" containerID="80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac" Jan 20 23:16:42 crc kubenswrapper[5030]: E0120 23:16:42.505740 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac\": container with ID starting with 80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac not found: ID does not exist" containerID="80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.505789 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac"} err="failed to get container status \"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac\": rpc error: code = NotFound desc = could not find container \"80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac\": container with ID starting with 80eab3e871243c714e4f43435e693e302387545833a85b4bc6f78a7a5a06aaac not found: ID does not exist" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.511791 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.534312 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:42 crc kubenswrapper[5030]: E0120 23:16:42.534720 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="cinder-scheduler" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.534737 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="cinder-scheduler" Jan 20 23:16:42 crc kubenswrapper[5030]: E0120 23:16:42.534759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="probe" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.534766 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="probe" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.534977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="probe" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.535005 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" containerName="cinder-scheduler" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.536115 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.540224 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.546362 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpfnk\" (UniqueName: \"kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.634898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.736814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.736894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.736963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.737005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.737021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.737226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpfnk\" (UniqueName: \"kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.737266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.742378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.742665 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.743859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.745988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:42 crc kubenswrapper[5030]: I0120 23:16:42.757900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfnk\" (UniqueName: \"kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk\") pod \"cinder-scheduler-0\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.001550 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.449482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerStarted","Data":"0f674785d15a66a0824eddd1da0e18f743a568ad97ea8e98800c71504892f015"} Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.449878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerStarted","Data":"1a6aeedce58898831a51abb4593a2b921a2ad31f8e4c6edb75453fd364129a5b"} Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.449902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerStarted","Data":"51a540c8dc762695227130e63f0102f240e44574e378c707ae515b6d5db7b176"} Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.452208 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.468523 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.475674 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" podStartSLOduration=2.475655482 podStartE2EDuration="2.475655482s" podCreationTimestamp="2026-01-20 23:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:43.471448782 +0000 UTC m=+2475.791709070" watchObservedRunningTime="2026-01-20 23:16:43.475655482 +0000 UTC m=+2475.795915780" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 
23:16:43.919593 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.921132 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.924561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.936237 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.941415 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:16:43 crc kubenswrapper[5030]: I0120 23:16:43.994210 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca89209-f523-4a10-88fd-204d88ca6a94" path="/var/lib/kubelet/pods/7ca89209-f523-4a10-88fd-204d88ca6a94/volumes" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.063586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtkl\" (UniqueName: \"kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " 
pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.064407 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.165796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtkl\" (UniqueName: \"kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.166767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.171091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " 
pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.175497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.176032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.176752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.176949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.177217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.203426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtkl\" (UniqueName: \"kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl\") pod \"neutron-cc7b7c5c8-d9whd\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.218675 5030 scope.go:117] "RemoveContainer" containerID="4324979a35d72509d550aecc0fddf62f163b48cbd3e592489c10890fd9880571" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.245146 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.456154 5030 scope.go:117] "RemoveContainer" containerID="5f7d35b2a367f9318f220339f6dee3e01642d38b191bb85dd167bc66c7052558" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.504518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerStarted","Data":"70fb88353fc70c3b9dab582d23bbede418372d313927da95f856408d4d739fad"} Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.504567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerStarted","Data":"d896aba19503d5dbab8247f5cebe0bfaed0c7e5c6249a0cea78395927026fb0d"} Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.521459 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerID="06ee7ec944a9d5065323264cea939f93bd87f7249327c7674f96be41e6bbd605" exitCode=0 Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.521522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerDied","Data":"06ee7ec944a9d5065323264cea939f93bd87f7249327c7674f96be41e6bbd605"} Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.538844 5030 scope.go:117] "RemoveContainer" containerID="887ccad86af31ce3e1594fcc7349c3577a0c6d31033210df78179bff8c7f5ceb" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.568342 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.581855 5030 scope.go:117] "RemoveContainer" containerID="05a36c447531d8197fe63c5039928766cf8cd79248e11ef0aabf190a64554fc4" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.626656 5030 scope.go:117] "RemoveContainer" containerID="9491186a3a78527013f443f716cbaf6ddce1ea97f6036b87136fe96596682eb1" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.649920 5030 scope.go:117] "RemoveContainer" containerID="7a40b1202cf51379eb919ca23664e6520b4fcd12c201fcc3b3d03e4d4583657b" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.668113 5030 scope.go:117] "RemoveContainer" containerID="7e0c101fc63b1db43244e04e05803e60eca277ec041218297a5354f0bfcf1f84" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.679897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle\") pod \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.679990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data\") pod \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.680013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs\") pod 
\"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.680100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzrhc\" (UniqueName: \"kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc\") pod \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.680157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom\") pod \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\" (UID: \"5a6d8f73-8f49-4501-8fde-95d7fcdfb921\") " Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.683535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs" (OuterVolumeSpecName: "logs") pod "5a6d8f73-8f49-4501-8fde-95d7fcdfb921" (UID: "5a6d8f73-8f49-4501-8fde-95d7fcdfb921"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.691165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5a6d8f73-8f49-4501-8fde-95d7fcdfb921" (UID: "5a6d8f73-8f49-4501-8fde-95d7fcdfb921"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.693746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc" (OuterVolumeSpecName: "kube-api-access-wzrhc") pod "5a6d8f73-8f49-4501-8fde-95d7fcdfb921" (UID: "5a6d8f73-8f49-4501-8fde-95d7fcdfb921"). InnerVolumeSpecName "kube-api-access-wzrhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.708071 5030 scope.go:117] "RemoveContainer" containerID="3ea09b1544393da68aa47bb7fcfd6181f5613976340f55cf267b19b8e8bada7c" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.712394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a6d8f73-8f49-4501-8fde-95d7fcdfb921" (UID: "5a6d8f73-8f49-4501-8fde-95d7fcdfb921"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.737194 5030 scope.go:117] "RemoveContainer" containerID="ff54e0d54961da77f3e62ba00cba1c63dc7ac2c110e0251d825b9f5703c04b3f" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.765925 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.779939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data" (OuterVolumeSpecName: "config-data") pod "5a6d8f73-8f49-4501-8fde-95d7fcdfb921" (UID: "5a6d8f73-8f49-4501-8fde-95d7fcdfb921"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.782434 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.782461 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.782471 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.782482 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzrhc\" (UniqueName: \"kubernetes.io/projected/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-kube-api-access-wzrhc\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.782492 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5a6d8f73-8f49-4501-8fde-95d7fcdfb921-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.801560 5030 scope.go:117] "RemoveContainer" containerID="7919fd8bc32c52113da1c830916d802ce7efb1753f1c016315bf56674d6f572c" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.834495 5030 scope.go:117] "RemoveContainer" containerID="88037d15f9d9506edc3d17a7d8d74962aa4a73fe38912728d3b943d2a4ac0fc2" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.866187 5030 scope.go:117] "RemoveContainer" containerID="53d5d2fb64e11f0a1deba4a0b4ed1e4ed95fa56c211991aefeda39fb89c388f5" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.916714 5030 scope.go:117] "RemoveContainer" containerID="a9b1b63e4c235c4b3ae22db457ef86079fa86f69f2ae3b716377a91a3ea1ec01" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.947548 5030 scope.go:117] "RemoveContainer" containerID="96be5fdb61f3c54676e8d0b8aa0dde3fe140c4ecbf8af87897e961ce635a4a5c" Jan 20 23:16:44 crc kubenswrapper[5030]: I0120 23:16:44.994284 5030 scope.go:117] "RemoveContainer" containerID="bd0d7f8c3ddd224fae9dbfe730191a14cb606035d6b54b8853968ab3874c0688" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.026089 5030 scope.go:117] "RemoveContainer" containerID="eaa3c999fcab3e563feef80868cb66e702932c50fc14ab1246ef3976a2fc0297" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.053475 5030 scope.go:117] "RemoveContainer" containerID="db8a03c81c30c7c0f36823f636297df6ce0a2e06bb363add7caf45bf70d3c118" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.092905 5030 scope.go:117] "RemoveContainer" containerID="a90f48a1dd1d0d8929bad9dd8192cc2e9d1d97e95f316c8ba38b821ea7094fff" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.145815 5030 scope.go:117] "RemoveContainer" containerID="a98358ab78eec0b694659f796a37b14f6795dd13ef8a648b4600dfb66d2fd05c" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.191698 5030 scope.go:117] "RemoveContainer" containerID="ad8326da150ba9e0b7c9cc03399f4b3024338fea28aab1d3e8440ad17897133b" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.218368 5030 scope.go:117] "RemoveContainer" containerID="f0c319d434603d5cca5731e6e06e5cc5ab367ba74821077ad39f503ea413d420" Jan 20 23:16:45 
crc kubenswrapper[5030]: I0120 23:16:45.335987 5030 scope.go:117] "RemoveContainer" containerID="ad07a3cdfc37fecbb9e6cd9f1b6d7bbe467ef4063fc636147ca3c18406a5de63" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.376106 5030 scope.go:117] "RemoveContainer" containerID="33106b797843d9588bd322d3b3adb04bf5c3040e2e4b63f532a40ae848593c1d" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.421900 5030 scope.go:117] "RemoveContainer" containerID="bffef7765f1438423591828481b992e73d3dfc2858bd854db49a96094325b559" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.492260 5030 scope.go:117] "RemoveContainer" containerID="282f387f459239e1abbb114ff819d5f8200b97d66e1dceb89e38ce1c553d65f3" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.542800 5030 scope.go:117] "RemoveContainer" containerID="9d581ab397f069a331a60a0bdfd08ffce432ef07516ec625a705f575027756ff" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.571502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerStarted","Data":"3b40fee42e01b5b734218bca90644c05dc6e1540c1debc8a4a06f30fcb8a9bf8"} Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.588873 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.5888595089999997 podStartE2EDuration="3.588859509s" podCreationTimestamp="2026-01-20 23:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:45.588244534 +0000 UTC m=+2477.908504822" watchObservedRunningTime="2026-01-20 23:16:45.588859509 +0000 UTC m=+2477.909119797" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.626309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerStarted","Data":"e47ac64b526f7c8c1e0183da0c8d577a84461451afc84c5ba2ba9e3c80008b0f"} Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.626350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerStarted","Data":"6949cd28ba252efdec3e0d70cae363c20b2cd6eb423def67f259c29ca96e51c2"} Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.626360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerStarted","Data":"2293d1c739a446e5a0beec33e710dafc58e2b78138f44f5fa202ae92299bbe7c"} Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.627425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.669331 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" podStartSLOduration=2.669315798 podStartE2EDuration="2.669315798s" podCreationTimestamp="2026-01-20 23:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:16:45.660288312 +0000 UTC m=+2477.980548600" watchObservedRunningTime="2026-01-20 23:16:45.669315798 +0000 UTC m=+2477.989576086" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 
23:16:45.686121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" event={"ID":"5a6d8f73-8f49-4501-8fde-95d7fcdfb921","Type":"ContainerDied","Data":"4f31548937bd7ac847a7844a6c576abcc1403660718ff0a8052b5afad2421491"} Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.686179 5030 scope.go:117] "RemoveContainer" containerID="06ee7ec944a9d5065323264cea939f93bd87f7249327c7674f96be41e6bbd605" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.686218 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-545bc46c64-vbtck" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.728799 5030 scope.go:117] "RemoveContainer" containerID="2cde153b5bb54432054bfba907408b59ea1d150dae9a9dfecb5a08d718525b1c" Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.744097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.762658 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-545bc46c64-vbtck"] Jan 20 23:16:45 crc kubenswrapper[5030]: I0120 23:16:45.977688 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" path="/var/lib/kubelet/pods/5a6d8f73-8f49-4501-8fde-95d7fcdfb921/volumes" Jan 20 23:16:46 crc kubenswrapper[5030]: I0120 23:16:46.735361 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:16:48 crc kubenswrapper[5030]: I0120 23:16:48.002120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:51 crc kubenswrapper[5030]: I0120 23:16:51.962835 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:16:51 crc kubenswrapper[5030]: E0120 23:16:51.963942 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:16:52 crc kubenswrapper[5030]: I0120 23:16:52.296554 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:16:53 crc kubenswrapper[5030]: I0120 23:16:53.211565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:16:54 crc kubenswrapper[5030]: I0120 23:16:54.453112 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:16:54 crc kubenswrapper[5030]: I0120 23:16:54.488786 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:17:02 crc kubenswrapper[5030]: I0120 23:17:02.962078 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:17:02 crc kubenswrapper[5030]: E0120 23:17:02.962851 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:17:03 crc kubenswrapper[5030]: I0120 23:17:03.039133 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.548379 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:17:04 crc kubenswrapper[5030]: E0120 23:17:04.549489 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api-log" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.549521 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api-log" Jan 20 23:17:04 crc kubenswrapper[5030]: E0120 23:17:04.549681 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.549708 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.551755 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.551815 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6d8f73-8f49-4501-8fde-95d7fcdfb921" containerName="barbican-api-log" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.553386 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.556459 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.556976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-22k9c" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.558745 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.566235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.580524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.580702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.580742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.580766 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f282\" (UniqueName: \"kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.682827 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.682920 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.682962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f282\" (UniqueName: \"kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.683047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.685010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.689759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.690332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.699174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f282\" (UniqueName: \"kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282\") pod \"openstackclient\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:04 crc kubenswrapper[5030]: I0120 23:17:04.887851 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:17:05 crc kubenswrapper[5030]: I0120 23:17:05.462392 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:17:05 crc kubenswrapper[5030]: I0120 23:17:05.919839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"97fdbcd3-4caf-4b83-b15a-a3dec765984f","Type":"ContainerStarted","Data":"4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647"} Jan 20 23:17:05 crc kubenswrapper[5030]: I0120 23:17:05.920195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"97fdbcd3-4caf-4b83-b15a-a3dec765984f","Type":"ContainerStarted","Data":"9810798fb952e71a0747a77367543662f031fed6c9ec3a25b7b34afe91847f9d"} Jan 20 23:17:05 crc kubenswrapper[5030]: I0120 23:17:05.932004 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.931987253 podStartE2EDuration="1.931987253s" podCreationTimestamp="2026-01-20 23:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:05.931132882 +0000 UTC m=+2498.251393180" watchObservedRunningTime="2026-01-20 23:17:05.931987253 +0000 UTC m=+2498.252247541" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.671357 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.674016 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.676355 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.679771 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.685366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.691069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752632 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752686 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnqqx\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752806 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.752860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.854911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.854983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855047 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nnqqx\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.855261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.856006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.856291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.861591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.862400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.863074 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.865320 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.869658 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.876783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnqqx\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx\") pod \"swift-proxy-ffdf777d-qspf6\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:08 crc kubenswrapper[5030]: I0120 23:17:08.991101 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.120077 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.120398 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-central-agent" containerID="cri-o://97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34" gracePeriod=30 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.120412 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="proxy-httpd" containerID="cri-o://50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868" gracePeriod=30 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.120513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="sg-core" containerID="cri-o://d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea" gracePeriod=30 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.120525 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-notification-agent" containerID="cri-o://6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2" gracePeriod=30 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.702291 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.972632 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerID="50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868" exitCode=0 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.972915 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerID="d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea" exitCode=2 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.972927 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerID="97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34" exitCode=0 Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.974086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerStarted","Data":"ec5e526c2cdd05718671ccd6ac4f385ffc8385a61f4c21bec9eea239556ebbbc"} Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.974125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerStarted","Data":"7225b48e4f67fbe319c8386b782b5de8014d9ab4d4a0b86b9145f610062514ac"} Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.974135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerDied","Data":"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868"} Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.974147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerDied","Data":"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea"} Jan 20 23:17:09 crc kubenswrapper[5030]: I0120 23:17:09.974156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerDied","Data":"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34"} Jan 20 23:17:10 crc kubenswrapper[5030]: I0120 23:17:10.992674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerStarted","Data":"c4a40a78281e11cd16875b9ee533f316223288e089e031dfb22cb627128bcb6c"} Jan 20 23:17:10 crc kubenswrapper[5030]: I0120 23:17:10.993280 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:10 crc kubenswrapper[5030]: I0120 23:17:10.993325 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:11 crc kubenswrapper[5030]: I0120 23:17:11.055409 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" podStartSLOduration=3.055384006 podStartE2EDuration="3.055384006s" podCreationTimestamp="2026-01-20 23:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:11.018783228 +0000 UTC m=+2503.339043556" watchObservedRunningTime="2026-01-20 23:17:11.055384006 +0000 UTC m=+2503.375644314" Jan 20 23:17:12 crc kubenswrapper[5030]: I0120 
23:17:12.027205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.855371 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.979660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.979792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.979892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.979975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxmh\" (UniqueName: \"kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.980012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.980042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.980140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd\") pod \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\" (UID: \"c8a6e168-5084-4ff7-96e9-86b047eb8da4\") " Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.982176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.982379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.985852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts" (OuterVolumeSpecName: "scripts") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:13 crc kubenswrapper[5030]: I0120 23:17:13.986584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh" (OuterVolumeSpecName: "kube-api-access-vgxmh") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "kube-api-access-vgxmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.007061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.026326 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerID="6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2" exitCode=0 Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.026366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerDied","Data":"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2"} Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.026391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c8a6e168-5084-4ff7-96e9-86b047eb8da4","Type":"ContainerDied","Data":"f19a61eba07fe4ea4e199c3de27916551857b7ba13008381060823afbe93858b"} Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.026406 5030 scope.go:117] "RemoveContainer" containerID="50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.026524 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.070418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.082922 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.082968 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxmh\" (UniqueName: \"kubernetes.io/projected/c8a6e168-5084-4ff7-96e9-86b047eb8da4-kube-api-access-vgxmh\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.082983 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.082995 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.083006 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a6e168-5084-4ff7-96e9-86b047eb8da4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.083018 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.117841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data" (OuterVolumeSpecName: "config-data") pod "c8a6e168-5084-4ff7-96e9-86b047eb8da4" (UID: "c8a6e168-5084-4ff7-96e9-86b047eb8da4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.122816 5030 scope.go:117] "RemoveContainer" containerID="d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.152204 5030 scope.go:117] "RemoveContainer" containerID="6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.177456 5030 scope.go:117] "RemoveContainer" containerID="97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.185382 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a6e168-5084-4ff7-96e9-86b047eb8da4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.213811 5030 scope.go:117] "RemoveContainer" containerID="50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.214864 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868\": container with ID starting with 50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868 not found: ID does not exist" containerID="50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.214897 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868"} err="failed to get container status \"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868\": rpc error: code = NotFound desc = could not find container \"50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868\": container with ID starting with 50124740c8e7c366ad0b103868e37f29699db8b35cb72c17ac6b12a49bf4b868 not found: ID does not exist" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.214918 5030 scope.go:117] "RemoveContainer" containerID="d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.216876 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea\": container with ID starting with d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea not found: ID does not exist" containerID="d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.216900 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea"} err="failed to get container status \"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea\": rpc error: code = NotFound desc = could not find container \"d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea\": container with ID starting with d37a11924a05a8c9890afffc7e94e6751a7a55b62f415ddc75612f6fbf4ef0ea not found: ID does not exist" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.216917 5030 scope.go:117] "RemoveContainer" containerID="6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 
23:17:14.217847 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2\": container with ID starting with 6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2 not found: ID does not exist" containerID="6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.217871 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2"} err="failed to get container status \"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2\": rpc error: code = NotFound desc = could not find container \"6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2\": container with ID starting with 6b041ec1cfd36b8cc15ebfa5ed756044856e97d42755f7989aa1330547d730a2 not found: ID does not exist" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.217885 5030 scope.go:117] "RemoveContainer" containerID="97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.221773 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34\": container with ID starting with 97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34 not found: ID does not exist" containerID="97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.221797 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34"} err="failed to get container status \"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34\": rpc error: code = NotFound desc = could not find container \"97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34\": container with ID starting with 97cc1de893464b29ca710c23a72e9bf716d2df3974fc910033898669d3c1cd34 not found: ID does not exist" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.318597 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.358442 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.386271 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.416659 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.416903 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-api" containerID="cri-o://1a6aeedce58898831a51abb4593a2b921a2ad31f8e4c6edb75453fd364129a5b" gracePeriod=30 Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.417345 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" podUID="652222f4-da37-437a-9f90-f89857b8399b" 
containerName="neutron-httpd" containerID="cri-o://0f674785d15a66a0824eddd1da0e18f743a568ad97ea8e98800c71504892f015" gracePeriod=30 Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.426393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.426839 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="proxy-httpd" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.426857 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="proxy-httpd" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.426884 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="sg-core" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.426891 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="sg-core" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.426909 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-central-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.426915 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-central-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: E0120 23:17:14.426931 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-notification-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.426938 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-notification-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.427116 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-central-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.427131 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="ceilometer-notification-agent" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.427150 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="proxy-httpd" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.427163 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" containerName="sg-core" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.429099 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.433178 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.433287 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.437502 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmglk\" (UniqueName: \"kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.492498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data\") pod \"ceilometer-0\" (UID: 
\"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594500 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmglk\" (UniqueName: \"kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.594927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.595520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.595522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.599820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.599920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.599979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.601258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.616857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmglk\" (UniqueName: \"kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk\") pod \"ceilometer-0\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.754053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.940685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rj9lg"] Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.942197 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:14 crc kubenswrapper[5030]: I0120 23:17:14.971446 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rj9lg"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.051277 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-t4nrn"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.052507 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.055218 5030 generic.go:334] "Generic (PLEG): container finished" podID="652222f4-da37-437a-9f90-f89857b8399b" containerID="0f674785d15a66a0824eddd1da0e18f743a568ad97ea8e98800c71504892f015" exitCode=0 Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.055262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerDied","Data":"0f674785d15a66a0824eddd1da0e18f743a568ad97ea8e98800c71504892f015"} Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.066408 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.067674 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.069632 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.076669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-t4nrn"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.087029 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.110155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dzkm\" (UniqueName: \"kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.110505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.131552 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5svxp"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.133069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.146727 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5svxp"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dzkm\" (UniqueName: \"kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9m5g\" (UniqueName: \"kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j67zk\" (UniqueName: \"kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.213425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.214743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.225822 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.227423 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.228880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.232160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dzkm\" (UniqueName: \"kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm\") pod \"nova-api-db-create-rj9lg\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.233517 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.274751 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9m5g\" (UniqueName: \"kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68z6\" (UniqueName: \"kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j67zk\" (UniqueName: \"kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.315750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.316221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.335776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-j67zk\" (UniqueName: \"kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk\") pod \"nova-api-83e2-account-create-update-pmcqn\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.344175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9m5g\" (UniqueName: \"kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g\") pod \"nova-cell0-db-create-t4nrn\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.355569 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.384393 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.399467 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.424222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.424430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.424724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfr4v\" (UniqueName: \"kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.424792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c68z6\" (UniqueName: \"kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.426175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.446286 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.447516 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.449372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68z6\" (UniqueName: \"kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6\") pod \"nova-cell1-db-create-5svxp\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.457161 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.465188 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.505905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.527236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfr4v\" (UniqueName: \"kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.527299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.527373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.527417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkk8p\" (UniqueName: \"kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.528257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.568068 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfr4v\" (UniqueName: \"kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v\") pod \"nova-cell0-b1b2-account-create-update-6wwjc\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.628873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.628956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkk8p\" (UniqueName: \"kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.630213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.646995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkk8p\" (UniqueName: \"kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p\") pod \"nova-cell1-5d10-account-create-update-v94fx\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.755020 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rj9lg"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.798061 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.844378 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.939390 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.979504 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a6e168-5084-4ff7-96e9-86b047eb8da4" path="/var/lib/kubelet/pods/c8a6e168-5084-4ff7-96e9-86b047eb8da4/volumes" Jan 20 23:17:15 crc kubenswrapper[5030]: I0120 23:17:15.980289 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn"] Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.058379 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-t4nrn"] Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.098899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" event={"ID":"468d96c6-0dc6-46d7-b85c-0041ef331f7d","Type":"ContainerStarted","Data":"1b7751dcd40bec86717488413a4a83f5132363b0b012bdda34434f4694946e8d"} Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.102673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" event={"ID":"f1bee829-9ebf-4a41-8293-a7328324b33f","Type":"ContainerStarted","Data":"6a7ba4f0dd84f46b89a429cd2c20cef0d3ba99a5eff0dfc1ffb1c070547beeff"} Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.102702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" event={"ID":"f1bee829-9ebf-4a41-8293-a7328324b33f","Type":"ContainerStarted","Data":"523860be83891322b05e0636344adb1e6188af1be178d9284733399d9f7922be"} Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.109394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerStarted","Data":"fdc8e278eb1ea40baf3ed2dba414bbd3a25a6e821a66eea5ac995f8c7c5e01d6"} Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.127729 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5svxp"] Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.133817 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" podStartSLOduration=2.1338027999999998 podStartE2EDuration="2.1338028s" podCreationTimestamp="2026-01-20 23:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:16.114216231 +0000 UTC m=+2508.434476579" watchObservedRunningTime="2026-01-20 23:17:16.1338028 +0000 UTC m=+2508.454063088" Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.377716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx"] Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.422973 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.423192 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" 
containerName="glance-log" containerID="cri-o://d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b" gracePeriod=30 Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.423303 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-httpd" containerID="cri-o://91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23" gracePeriod=30 Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.437239 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc"] Jan 20 23:17:16 crc kubenswrapper[5030]: W0120 23:17:16.448964 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b90192c_b1c7_4c7c_a4cd_b86ed62f0903.slice/crio-44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd WatchSource:0}: Error finding container 44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd: Status 404 returned error can't find the container with id 44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd Jan 20 23:17:16 crc kubenswrapper[5030]: I0120 23:17:16.962054 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:17:16 crc kubenswrapper[5030]: E0120 23:17:16.962482 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.123323 5030 generic.go:334] "Generic (PLEG): container finished" podID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerID="d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b" exitCode=143 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.123654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerDied","Data":"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.126057 5030 generic.go:334] "Generic (PLEG): container finished" podID="f9dd2651-9595-4f1a-885f-b7c766d8ffff" containerID="6e48555d7f59d362f76f98e2a005a7ec8e530db62c2612fee273f24639d73fc1" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.126123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" event={"ID":"f9dd2651-9595-4f1a-885f-b7c766d8ffff","Type":"ContainerDied","Data":"6e48555d7f59d362f76f98e2a005a7ec8e530db62c2612fee273f24639d73fc1"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.126151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" event={"ID":"f9dd2651-9595-4f1a-885f-b7c766d8ffff","Type":"ContainerStarted","Data":"f0da3fa84123101b1d1cb99aa7e9af8a67946747033d0e2a9b13b5d19e7f410d"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.127749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerStarted","Data":"49115faf9e4cca8649a875a9d81b2f07f5541bc765c46bf3ab690053a58c78ba"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.131418 5030 generic.go:334] "Generic (PLEG): container finished" podID="652222f4-da37-437a-9f90-f89857b8399b" containerID="1a6aeedce58898831a51abb4593a2b921a2ad31f8e4c6edb75453fd364129a5b" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.131479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerDied","Data":"1a6aeedce58898831a51abb4593a2b921a2ad31f8e4c6edb75453fd364129a5b"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.137160 5030 generic.go:334] "Generic (PLEG): container finished" podID="e9d0af08-12b5-437a-9adc-37ff294ee30f" containerID="c6d102c5c2ff29bc487b0f86be647e928791c6f33823052475caed180c2ca557" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.137206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" event={"ID":"e9d0af08-12b5-437a-9adc-37ff294ee30f","Type":"ContainerDied","Data":"c6d102c5c2ff29bc487b0f86be647e928791c6f33823052475caed180c2ca557"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.137222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" event={"ID":"e9d0af08-12b5-437a-9adc-37ff294ee30f","Type":"ContainerStarted","Data":"9a12d00fc408dc215e9f523b8ee07c6afe0348befcb0e0b84c06a0ebbb75bffd"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.141339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" event={"ID":"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903","Type":"ContainerStarted","Data":"44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.146079 5030 generic.go:334] "Generic (PLEG): container finished" podID="468d96c6-0dc6-46d7-b85c-0041ef331f7d" containerID="692e91461f0a4c20cf2df5e35e786a30067f875df78474941822ccf0e5c4873d" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.146132 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" event={"ID":"468d96c6-0dc6-46d7-b85c-0041ef331f7d","Type":"ContainerDied","Data":"692e91461f0a4c20cf2df5e35e786a30067f875df78474941822ccf0e5c4873d"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.155940 5030 generic.go:334] "Generic (PLEG): container finished" podID="6762fe60-da88-4169-ac82-e435ae1486db" containerID="03f6942764b2fc6914994b80ca01b57c398168e809ec4912feb82cbb3398d604" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.156005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" event={"ID":"6762fe60-da88-4169-ac82-e435ae1486db","Type":"ContainerDied","Data":"03f6942764b2fc6914994b80ca01b57c398168e809ec4912feb82cbb3398d604"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.156029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" event={"ID":"6762fe60-da88-4169-ac82-e435ae1486db","Type":"ContainerStarted","Data":"798a1a1eb9c749fd609ee6c431521d33430420f3db6fc9a6bac917f3729d3ed4"} Jan 20 23:17:17 crc kubenswrapper[5030]: 
I0120 23:17:17.161508 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1bee829-9ebf-4a41-8293-a7328324b33f" containerID="6a7ba4f0dd84f46b89a429cd2c20cef0d3ba99a5eff0dfc1ffb1c070547beeff" exitCode=0 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.161548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" event={"ID":"f1bee829-9ebf-4a41-8293-a7328324b33f","Type":"ContainerDied","Data":"6a7ba4f0dd84f46b89a429cd2c20cef0d3ba99a5eff0dfc1ffb1c070547beeff"} Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.192892 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.261240 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.261439 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-log" containerID="cri-o://4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282" gracePeriod=30 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.261556 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-httpd" containerID="cri-o://1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36" gracePeriod=30 Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.300434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config\") pod \"652222f4-da37-437a-9f90-f89857b8399b\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.300483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs\") pod \"652222f4-da37-437a-9f90-f89857b8399b\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.300661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config\") pod \"652222f4-da37-437a-9f90-f89857b8399b\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.300774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfns2\" (UniqueName: \"kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2\") pod \"652222f4-da37-437a-9f90-f89857b8399b\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.300842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle\") pod \"652222f4-da37-437a-9f90-f89857b8399b\" (UID: \"652222f4-da37-437a-9f90-f89857b8399b\") " Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.322440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2" (OuterVolumeSpecName: "kube-api-access-gfns2") pod "652222f4-da37-437a-9f90-f89857b8399b" (UID: "652222f4-da37-437a-9f90-f89857b8399b"). InnerVolumeSpecName "kube-api-access-gfns2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.322940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "652222f4-da37-437a-9f90-f89857b8399b" (UID: "652222f4-da37-437a-9f90-f89857b8399b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.375003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "652222f4-da37-437a-9f90-f89857b8399b" (UID: "652222f4-da37-437a-9f90-f89857b8399b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.385322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config" (OuterVolumeSpecName: "config") pod "652222f4-da37-437a-9f90-f89857b8399b" (UID: "652222f4-da37-437a-9f90-f89857b8399b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.402906 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.402941 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfns2\" (UniqueName: \"kubernetes.io/projected/652222f4-da37-437a-9f90-f89857b8399b-kube-api-access-gfns2\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.402952 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.402961 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.420367 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "652222f4-da37-437a-9f90-f89857b8399b" (UID: "652222f4-da37-437a-9f90-f89857b8399b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:17 crc kubenswrapper[5030]: I0120 23:17:17.505274 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/652222f4-da37-437a-9f90-f89857b8399b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.184133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" event={"ID":"652222f4-da37-437a-9f90-f89857b8399b","Type":"ContainerDied","Data":"51a540c8dc762695227130e63f0102f240e44574e378c707ae515b6d5db7b176"} Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.184167 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6854d54cb9-lmp44" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.184398 5030 scope.go:117] "RemoveContainer" containerID="0f674785d15a66a0824eddd1da0e18f743a568ad97ea8e98800c71504892f015" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.189921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerStarted","Data":"7bce6c3837218662f07d89b0975094b7323b176883ec479adf3b9c3d8d278504"} Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.189968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerStarted","Data":"a7d332f8eeb25b0745eb5d9d351364fed994b7f285df5857008d9364145e82c4"} Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.222689 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.225967 5030 generic.go:334] "Generic (PLEG): container finished" podID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerID="4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282" exitCode=143 Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.226038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerDied","Data":"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282"} Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.244460 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6854d54cb9-lmp44"] Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.255195 5030 scope.go:117] "RemoveContainer" containerID="1a6aeedce58898831a51abb4593a2b921a2ad31f8e4c6edb75453fd364129a5b" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.258814 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" containerID="86f2bd833f651a2a67a1766f97e95fffb23c70a4b05e1a965f5196f9e0a203d2" exitCode=0 Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.259415 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" event={"ID":"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903","Type":"ContainerDied","Data":"86f2bd833f651a2a67a1766f97e95fffb23c70a4b05e1a965f5196f9e0a203d2"} Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.780443 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.838779 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts\") pod \"6762fe60-da88-4169-ac82-e435ae1486db\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.838832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9m5g\" (UniqueName: \"kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g\") pod \"6762fe60-da88-4169-ac82-e435ae1486db\" (UID: \"6762fe60-da88-4169-ac82-e435ae1486db\") " Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.839882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6762fe60-da88-4169-ac82-e435ae1486db" (UID: "6762fe60-da88-4169-ac82-e435ae1486db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.844473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g" (OuterVolumeSpecName: "kube-api-access-j9m5g") pod "6762fe60-da88-4169-ac82-e435ae1486db" (UID: "6762fe60-da88-4169-ac82-e435ae1486db"). InnerVolumeSpecName "kube-api-access-j9m5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.940558 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6762fe60-da88-4169-ac82-e435ae1486db-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.940594 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9m5g\" (UniqueName: \"kubernetes.io/projected/6762fe60-da88-4169-ac82-e435ae1486db-kube-api-access-j9m5g\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.970010 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.976095 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.991016 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.993217 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:18 crc kubenswrapper[5030]: I0120 23:17:18.997257 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.000863 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.146643 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts\") pod \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.146778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j67zk\" (UniqueName: \"kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk\") pod \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\" (UID: \"468d96c6-0dc6-46d7-b85c-0041ef331f7d\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.146833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts\") pod \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "468d96c6-0dc6-46d7-b85c-0041ef331f7d" (UID: "468d96c6-0dc6-46d7-b85c-0041ef331f7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9dd2651-9595-4f1a-885f-b7c766d8ffff" (UID: "f9dd2651-9595-4f1a-885f-b7c766d8ffff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c68z6\" (UniqueName: \"kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6\") pod \"e9d0af08-12b5-437a-9adc-37ff294ee30f\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts\") pod \"f1bee829-9ebf-4a41-8293-a7328324b33f\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dzkm\" (UniqueName: \"kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm\") pod \"f1bee829-9ebf-4a41-8293-a7328324b33f\" (UID: \"f1bee829-9ebf-4a41-8293-a7328324b33f\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts\") pod \"e9d0af08-12b5-437a-9adc-37ff294ee30f\" (UID: \"e9d0af08-12b5-437a-9adc-37ff294ee30f\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.147446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkk8p\" (UniqueName: \"kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p\") pod \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\" (UID: \"f9dd2651-9595-4f1a-885f-b7c766d8ffff\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.148151 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9dd2651-9595-4f1a-885f-b7c766d8ffff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.148175 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/468d96c6-0dc6-46d7-b85c-0041ef331f7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.148444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9d0af08-12b5-437a-9adc-37ff294ee30f" (UID: "e9d0af08-12b5-437a-9adc-37ff294ee30f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.150064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1bee829-9ebf-4a41-8293-a7328324b33f" (UID: "f1bee829-9ebf-4a41-8293-a7328324b33f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.155490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6" (OuterVolumeSpecName: "kube-api-access-c68z6") pod "e9d0af08-12b5-437a-9adc-37ff294ee30f" (UID: "e9d0af08-12b5-437a-9adc-37ff294ee30f"). InnerVolumeSpecName "kube-api-access-c68z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.155535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p" (OuterVolumeSpecName: "kube-api-access-pkk8p") pod "f9dd2651-9595-4f1a-885f-b7c766d8ffff" (UID: "f9dd2651-9595-4f1a-885f-b7c766d8ffff"). InnerVolumeSpecName "kube-api-access-pkk8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.155553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm" (OuterVolumeSpecName: "kube-api-access-9dzkm") pod "f1bee829-9ebf-4a41-8293-a7328324b33f" (UID: "f1bee829-9ebf-4a41-8293-a7328324b33f"). InnerVolumeSpecName "kube-api-access-9dzkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.155717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk" (OuterVolumeSpecName: "kube-api-access-j67zk") pod "468d96c6-0dc6-46d7-b85c-0041ef331f7d" (UID: "468d96c6-0dc6-46d7-b85c-0041ef331f7d"). InnerVolumeSpecName "kube-api-access-j67zk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250331 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j67zk\" (UniqueName: \"kubernetes.io/projected/468d96c6-0dc6-46d7-b85c-0041ef331f7d-kube-api-access-j67zk\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250360 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c68z6\" (UniqueName: \"kubernetes.io/projected/e9d0af08-12b5-437a-9adc-37ff294ee30f-kube-api-access-c68z6\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250372 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1bee829-9ebf-4a41-8293-a7328324b33f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dzkm\" (UniqueName: \"kubernetes.io/projected/f1bee829-9ebf-4a41-8293-a7328324b33f-kube-api-access-9dzkm\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250390 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d0af08-12b5-437a-9adc-37ff294ee30f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.250400 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkk8p\" (UniqueName: \"kubernetes.io/projected/f9dd2651-9595-4f1a-885f-b7c766d8ffff-kube-api-access-pkk8p\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.268234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" event={"ID":"f9dd2651-9595-4f1a-885f-b7c766d8ffff","Type":"ContainerDied","Data":"f0da3fa84123101b1d1cb99aa7e9af8a67946747033d0e2a9b13b5d19e7f410d"} Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.268527 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0da3fa84123101b1d1cb99aa7e9af8a67946747033d0e2a9b13b5d19e7f410d" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.268271 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.270049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.270044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn" event={"ID":"468d96c6-0dc6-46d7-b85c-0041ef331f7d","Type":"ContainerDied","Data":"1b7751dcd40bec86717488413a4a83f5132363b0b012bdda34434f4694946e8d"} Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.270161 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7751dcd40bec86717488413a4a83f5132363b0b012bdda34434f4694946e8d" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.271781 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.271777 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-t4nrn" event={"ID":"6762fe60-da88-4169-ac82-e435ae1486db","Type":"ContainerDied","Data":"798a1a1eb9c749fd609ee6c431521d33430420f3db6fc9a6bac917f3729d3ed4"} Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.271826 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798a1a1eb9c749fd609ee6c431521d33430420f3db6fc9a6bac917f3729d3ed4" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.280946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" event={"ID":"f1bee829-9ebf-4a41-8293-a7328324b33f","Type":"ContainerDied","Data":"523860be83891322b05e0636344adb1e6188af1be178d9284733399d9f7922be"} Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.280974 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="523860be83891322b05e0636344adb1e6188af1be178d9284733399d9f7922be" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.280978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-rj9lg" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.283569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" event={"ID":"e9d0af08-12b5-437a-9adc-37ff294ee30f","Type":"ContainerDied","Data":"9a12d00fc408dc215e9f523b8ee07c6afe0348befcb0e0b84c06a0ebbb75bffd"} Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.283593 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-5svxp" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.283609 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a12d00fc408dc215e9f523b8ee07c6afe0348befcb0e0b84c06a0ebbb75bffd" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.530557 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.657984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfr4v\" (UniqueName: \"kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v\") pod \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.658182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts\") pod \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\" (UID: \"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903\") " Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.658551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" (UID: "0b90192c-b1c7-4c7c-a4cd-b86ed62f0903"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.658738 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.663106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v" (OuterVolumeSpecName: "kube-api-access-sfr4v") pod "0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" (UID: "0b90192c-b1c7-4c7c-a4cd-b86ed62f0903"). InnerVolumeSpecName "kube-api-access-sfr4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.760793 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfr4v\" (UniqueName: \"kubernetes.io/projected/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903-kube-api-access-sfr4v\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:19 crc kubenswrapper[5030]: I0120 23:17:19.975986 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652222f4-da37-437a-9f90-f89857b8399b" path="/var/lib/kubelet/pods/652222f4-da37-437a-9f90-f89857b8399b/volumes" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.051602 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbnct\" (UniqueName: \"kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: 
\"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs\") pod \"57213c76-ec53-41ad-a0c2-bc850c6450e9\" (UID: \"57213c76-ec53-41ad-a0c2-bc850c6450e9\") " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.169979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.170320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs" (OuterVolumeSpecName: "logs") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.170430 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.173859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts" (OuterVolumeSpecName: "scripts") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.175411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.176953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct" (OuterVolumeSpecName: "kube-api-access-gbnct") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "kube-api-access-gbnct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.197486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.219868 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.227151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data" (OuterVolumeSpecName: "config-data") pod "57213c76-ec53-41ad-a0c2-bc850c6450e9" (UID: "57213c76-ec53-41ad-a0c2-bc850c6450e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271909 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271943 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271954 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271964 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbnct\" (UniqueName: \"kubernetes.io/projected/57213c76-ec53-41ad-a0c2-bc850c6450e9-kube-api-access-gbnct\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271973 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57213c76-ec53-41ad-a0c2-bc850c6450e9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.271982 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57213c76-ec53-41ad-a0c2-bc850c6450e9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.272013 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.291107 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.293406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" event={"ID":"0b90192c-b1c7-4c7c-a4cd-b86ed62f0903","Type":"ContainerDied","Data":"44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd"} Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.293509 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44920e6d48ce257d6dfbf7e201a16a3017bd7843ce433db35957af603c4e08bd" Jan 20 
23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.293456 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.295796 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.295821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerDied","Data":"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23"} Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.295743 5030 generic.go:334] "Generic (PLEG): container finished" podID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerID="91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23" exitCode=0 Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.295961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"57213c76-ec53-41ad-a0c2-bc850c6450e9","Type":"ContainerDied","Data":"38bcd5a1f25e0883f7b350207a30a8db416dd54db126a21b4d6179df35d4c348"} Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.296055 5030 scope.go:117] "RemoveContainer" containerID="91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerStarted","Data":"32c9c38b60046f632a1e66250e35d54f20f80f3410c0e829eeef5f4cd08c9dc4"} Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298809 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-central-agent" containerID="cri-o://49115faf9e4cca8649a875a9d81b2f07f5541bc765c46bf3ab690053a58c78ba" gracePeriod=30 Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298866 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="proxy-httpd" containerID="cri-o://32c9c38b60046f632a1e66250e35d54f20f80f3410c0e829eeef5f4cd08c9dc4" gracePeriod=30 Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298893 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298931 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="sg-core" containerID="cri-o://7bce6c3837218662f07d89b0975094b7323b176883ec479adf3b9c3d8d278504" gracePeriod=30 Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.298912 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-notification-agent" containerID="cri-o://a7d332f8eeb25b0745eb5d9d351364fed994b7f285df5857008d9364145e82c4" gracePeriod=30 Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.321559 5030 scope.go:117] "RemoveContainer" 
containerID="d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.326658 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.909798398 podStartE2EDuration="6.326641168s" podCreationTimestamp="2026-01-20 23:17:14 +0000 UTC" firstStartedPulling="2026-01-20 23:17:15.35940234 +0000 UTC m=+2507.679662628" lastFinishedPulling="2026-01-20 23:17:19.77624512 +0000 UTC m=+2512.096505398" observedRunningTime="2026-01-20 23:17:20.324455636 +0000 UTC m=+2512.644715924" watchObservedRunningTime="2026-01-20 23:17:20.326641168 +0000 UTC m=+2512.646901456" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.363954 5030 scope.go:117] "RemoveContainer" containerID="91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.365418 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23\": container with ID starting with 91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23 not found: ID does not exist" containerID="91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.365523 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23"} err="failed to get container status \"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23\": rpc error: code = NotFound desc = could not find container \"91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23\": container with ID starting with 91d2eee7f4a70bfaff3e7bbd1c18dd79fe9e6768d85b58181b7c77bdea80cc23 not found: ID does not exist" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.365561 5030 scope.go:117] "RemoveContainer" containerID="d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.366350 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b\": container with ID starting with d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b not found: ID does not exist" containerID="d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.366524 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b"} err="failed to get container status \"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b\": rpc error: code = NotFound desc = could not find container \"d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b\": container with ID starting with d7669dc3f89da19bb41a80426ab0d69429bf073c8734960200737d02eab6660b not found: ID does not exist" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.385313 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.386070 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.396567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.411709 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.412120 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-log" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.412138 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-log" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.412153 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-api" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.412160 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-api" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.412173 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d96c6-0dc6-46d7-b85c-0041ef331f7d" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.412179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d96c6-0dc6-46d7-b85c-0041ef331f7d" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.412198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1bee829-9ebf-4a41-8293-a7328324b33f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.412204 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bee829-9ebf-4a41-8293-a7328324b33f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.417860 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419497 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.419526 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9dd2651-9595-4f1a-885f-b7c766d8ffff" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9dd2651-9595-4f1a-885f-b7c766d8ffff" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.419550 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.419571 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d0af08-12b5-437a-9adc-37ff294ee30f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419577 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d0af08-12b5-437a-9adc-37ff294ee30f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.419592 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6762fe60-da88-4169-ac82-e435ae1486db" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419598 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6762fe60-da88-4169-ac82-e435ae1486db" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: E0120 23:17:20.419608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419614 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419921 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419947 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6762fe60-da88-4169-ac82-e435ae1486db" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419959 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419969 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-api" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419978 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1bee829-9ebf-4a41-8293-a7328324b33f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.419992 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="468d96c6-0dc6-46d7-b85c-0041ef331f7d" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.420003 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="652222f4-da37-437a-9f90-f89857b8399b" containerName="neutron-httpd" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.420011 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d0af08-12b5-437a-9adc-37ff294ee30f" containerName="mariadb-database-create" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.420019 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" containerName="glance-log" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.420030 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9dd2651-9595-4f1a-885f-b7c766d8ffff" containerName="mariadb-account-create-update" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.421953 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.424345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.424530 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.433859 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.590607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.590945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.590972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.591004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.591257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.591395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.591481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8sk7\" (UniqueName: \"kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc 
kubenswrapper[5030]: I0120 23:17:20.591522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.603838 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.143:9292/healthcheck\": read tcp 10.217.0.2:33492->10.217.0.143:9292: read: connection reset by peer" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.603852 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.143:9292/healthcheck\": read tcp 10.217.0.2:33508->10.217.0.143:9292: read: connection reset by peer" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8sk7\" (UniqueName: \"kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.693785 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.696193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.696485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.697124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.697530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.700040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.702387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.720127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8sk7\" (UniqueName: \"kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.721443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:20 crc kubenswrapper[5030]: I0120 23:17:20.745874 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.076355 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.197955 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:17:21 crc kubenswrapper[5030]: W0120 23:17:21.203672 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd75cb8e_9fcd_4465_88fb_49e3600608dc.slice/crio-ff631c9ca80cc4a02c30452998625385af68b600a57971ffca78e17e35a60100 WatchSource:0}: Error finding container ff631c9ca80cc4a02c30452998625385af68b600a57971ffca78e17e35a60100: Status 404 returned error can't find the container with id ff631c9ca80cc4a02c30452998625385af68b600a57971ffca78e17e35a60100 Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.204823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.204919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwghd\" (UniqueName: \"kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205136 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs\") pod \"b150af41-d2a3-4f36-bbd0-159e468cf579\" (UID: \"b150af41-d2a3-4f36-bbd0-159e468cf579\") " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205769 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.205897 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs" (OuterVolumeSpecName: "logs") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.210540 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd" (OuterVolumeSpecName: "kube-api-access-nwghd") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "kube-api-access-nwghd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.212065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.213461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts" (OuterVolumeSpecName: "scripts") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.234196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.261310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.263487 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data" (OuterVolumeSpecName: "config-data") pod "b150af41-d2a3-4f36-bbd0-159e468cf579" (UID: "b150af41-d2a3-4f36-bbd0-159e468cf579"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307190 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307226 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307241 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307256 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b150af41-d2a3-4f36-bbd0-159e468cf579-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307269 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307283 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwghd\" (UniqueName: \"kubernetes.io/projected/b150af41-d2a3-4f36-bbd0-159e468cf579-kube-api-access-nwghd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.307294 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b150af41-d2a3-4f36-bbd0-159e468cf579-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.311604 5030 generic.go:334] "Generic (PLEG): container finished" podID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerID="32c9c38b60046f632a1e66250e35d54f20f80f3410c0e829eeef5f4cd08c9dc4" exitCode=0 Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.311721 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerID="7bce6c3837218662f07d89b0975094b7323b176883ec479adf3b9c3d8d278504" exitCode=2 Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.311798 5030 generic.go:334] "Generic (PLEG): container finished" podID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerID="a7d332f8eeb25b0745eb5d9d351364fed994b7f285df5857008d9364145e82c4" exitCode=0 Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.311726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerDied","Data":"32c9c38b60046f632a1e66250e35d54f20f80f3410c0e829eeef5f4cd08c9dc4"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.311961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerDied","Data":"7bce6c3837218662f07d89b0975094b7323b176883ec479adf3b9c3d8d278504"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.312026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerDied","Data":"a7d332f8eeb25b0745eb5d9d351364fed994b7f285df5857008d9364145e82c4"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.313861 5030 generic.go:334] "Generic (PLEG): container finished" podID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerID="1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36" exitCode=0 Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.313919 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.313937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerDied","Data":"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.313964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b150af41-d2a3-4f36-bbd0-159e468cf579","Type":"ContainerDied","Data":"53ba6f999c975f115153f6b3f69dd5f81ee2916390c23b9faaa43570783b7fee"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.313982 5030 scope.go:117] "RemoveContainer" containerID="1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.315682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerStarted","Data":"ff631c9ca80cc4a02c30452998625385af68b600a57971ffca78e17e35a60100"} Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.333466 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.373940 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.381223 5030 scope.go:117] "RemoveContainer" containerID="4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282" 
Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.384643 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.393776 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:21 crc kubenswrapper[5030]: E0120 23:17:21.394305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-log" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.394327 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-log" Jan 20 23:17:21 crc kubenswrapper[5030]: E0120 23:17:21.394341 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-httpd" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.394348 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-httpd" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.394536 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-log" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.394552 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" containerName="glance-httpd" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.395589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.405690 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.405845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.408736 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.415565 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.420382 5030 scope.go:117] "RemoveContainer" containerID="1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36" Jan 20 23:17:21 crc kubenswrapper[5030]: E0120 23:17:21.420779 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36\": container with ID starting with 1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36 not found: ID does not exist" containerID="1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.420810 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36"} err="failed to get container status \"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36\": rpc error: code = NotFound desc = could not 
find container \"1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36\": container with ID starting with 1ed7ef5c50b59bf38f186f5d6e1cda474cdf4e7c360b255f82fe49304dc9ce36 not found: ID does not exist" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.420830 5030 scope.go:117] "RemoveContainer" containerID="4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282" Jan 20 23:17:21 crc kubenswrapper[5030]: E0120 23:17:21.421018 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282\": container with ID starting with 4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282 not found: ID does not exist" containerID="4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.421047 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282"} err="failed to get container status \"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282\": rpc error: code = NotFound desc = could not find container \"4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282\": container with ID starting with 4b7cb4b235ffcb2073ae792c2fcc75799b3869e069dff101834160746d5c9282 not found: ID does not exist" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.509909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.509966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbsf\" (UniqueName: \"kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.510006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.510065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.510094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 
crc kubenswrapper[5030]: I0120 23:17:21.510180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.510236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.510262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.612365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbsf\" (UniqueName: \"kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.613944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.614269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.615240 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.617514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.618906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.618943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.625537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.626654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.632927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbsf\" (UniqueName: \"kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.650941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-0\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.721893 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.978420 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57213c76-ec53-41ad-a0c2-bc850c6450e9" path="/var/lib/kubelet/pods/57213c76-ec53-41ad-a0c2-bc850c6450e9/volumes" Jan 20 23:17:21 crc kubenswrapper[5030]: I0120 23:17:21.979560 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b150af41-d2a3-4f36-bbd0-159e468cf579" path="/var/lib/kubelet/pods/b150af41-d2a3-4f36-bbd0-159e468cf579/volumes" Jan 20 23:17:22 crc kubenswrapper[5030]: I0120 23:17:22.188292 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:17:22 crc kubenswrapper[5030]: I0120 23:17:22.328634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerStarted","Data":"0e2d70adc03803d4f2fa32b277855266f614ffbd221e6dbc4f92e76fc69947a2"} Jan 20 23:17:22 crc kubenswrapper[5030]: I0120 23:17:22.332067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerStarted","Data":"744df9f6f8632e4bde232c473af1353285e6fa32962770ad640031ef5ec3543a"} Jan 20 23:17:23 crc kubenswrapper[5030]: I0120 23:17:23.342306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerStarted","Data":"50dad30f02e2ad71b34ca576e9d068767ee44b13ac49b20df2f355b3c50adad3"} Jan 20 23:17:23 crc kubenswrapper[5030]: I0120 23:17:23.342354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerStarted","Data":"e4df01fc1632a811eb3b665c2886fe79657be01644364c0ace65e378ddb318ce"} Jan 20 23:17:23 crc kubenswrapper[5030]: I0120 23:17:23.344165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerStarted","Data":"7af3824becb0a8b511712e8c071c2a7723b825950e7dda89578b7c6ba4037ce4"} Jan 20 23:17:23 crc kubenswrapper[5030]: I0120 23:17:23.371779 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" 
podStartSLOduration=2.371758633 podStartE2EDuration="2.371758633s" podCreationTimestamp="2026-01-20 23:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:23.363475164 +0000 UTC m=+2515.683735452" watchObservedRunningTime="2026-01-20 23:17:23.371758633 +0000 UTC m=+2515.692018931" Jan 20 23:17:23 crc kubenswrapper[5030]: I0120 23:17:23.393306 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.393280949 podStartE2EDuration="3.393280949s" podCreationTimestamp="2026-01-20 23:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:23.385033681 +0000 UTC m=+2515.705293969" watchObservedRunningTime="2026-01-20 23:17:23.393280949 +0000 UTC m=+2515.713541237" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.373537 5030 generic.go:334] "Generic (PLEG): container finished" podID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerID="49115faf9e4cca8649a875a9d81b2f07f5541bc765c46bf3ab690053a58c78ba" exitCode=0 Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.373599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerDied","Data":"49115faf9e4cca8649a875a9d81b2f07f5541bc765c46bf3ab690053a58c78ba"} Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.642826 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d"] Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.644271 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.648869 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-82pqm" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.649502 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.653662 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.668524 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d"] Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.778343 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.792835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.793052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fg2r\" (UniqueName: \"kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.793227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.793341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmglk\" (UniqueName: \"kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 
23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.895768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts\") pod \"7307ecd5-19b6-4257-8aa6-f7683128ed15\" (UID: \"7307ecd5-19b6-4257-8aa6-f7683128ed15\") " Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.896468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.896833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.896905 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.897038 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fg2r\" (UniqueName: \"kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.897149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.897223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.897511 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.898092 5030 reconciler_common.go:293] "Volume 
detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7307ecd5-19b6-4257-8aa6-f7683128ed15-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.902097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts" (OuterVolumeSpecName: "scripts") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.903023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.903342 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk" (OuterVolumeSpecName: "kube-api-access-kmglk") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "kube-api-access-kmglk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.906335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.913474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.919504 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fg2r\" (UniqueName: \"kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r\") pod \"nova-cell0-conductor-db-sync-vrv9d\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.938071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.963264 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:25 crc kubenswrapper[5030]: I0120 23:17:25.984808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.001129 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.001239 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmglk\" (UniqueName: \"kubernetes.io/projected/7307ecd5-19b6-4257-8aa6-f7683128ed15-kube-api-access-kmglk\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.001412 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.001470 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.033041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data" (OuterVolumeSpecName: "config-data") pod "7307ecd5-19b6-4257-8aa6-f7683128ed15" (UID: "7307ecd5-19b6-4257-8aa6-f7683128ed15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.104375 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7307ecd5-19b6-4257-8aa6-f7683128ed15-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.390189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7307ecd5-19b6-4257-8aa6-f7683128ed15","Type":"ContainerDied","Data":"fdc8e278eb1ea40baf3ed2dba414bbd3a25a6e821a66eea5ac995f8c7c5e01d6"} Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.390275 5030 scope.go:117] "RemoveContainer" containerID="32c9c38b60046f632a1e66250e35d54f20f80f3410c0e829eeef5f4cd08c9dc4" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.390291 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.436138 5030 scope.go:117] "RemoveContainer" containerID="7bce6c3837218662f07d89b0975094b7323b176883ec479adf3b9c3d8d278504" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.452289 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d"] Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.501551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.516116 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.526693 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:26 crc kubenswrapper[5030]: E0120 23:17:26.527160 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="proxy-httpd" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="proxy-httpd" Jan 20 23:17:26 crc kubenswrapper[5030]: E0120 23:17:26.527210 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="sg-core" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527217 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="sg-core" Jan 20 23:17:26 crc kubenswrapper[5030]: E0120 23:17:26.527232 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-central-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-central-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: E0120 23:17:26.527250 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-notification-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527257 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-notification-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527443 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="sg-core" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="proxy-httpd" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527460 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-notification-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.527472 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" containerName="ceilometer-central-agent" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.529160 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.532585 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.533249 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.537266 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.615883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnj7t\" (UniqueName: \"kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.615964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.616061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.616089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.616229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.616379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.616452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.717842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.717895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.717949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnj7t\" (UniqueName: \"kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.717984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.718029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.718046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.718071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.718693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.719594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.723077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.723495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.723565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.735649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:26 crc kubenswrapper[5030]: I0120 23:17:26.738086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnj7t\" (UniqueName: \"kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t\") pod \"ceilometer-0\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.261285 5030 scope.go:117] "RemoveContainer" containerID="a7d332f8eeb25b0745eb5d9d351364fed994b7f285df5857008d9364145e82c4" Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.320873 5030 scope.go:117] "RemoveContainer" containerID="49115faf9e4cca8649a875a9d81b2f07f5541bc765c46bf3ab690053a58c78ba" Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.322248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.401051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" event={"ID":"1b42b071-9e29-4c1b-a187-e79623838902","Type":"ContainerStarted","Data":"9999471141c83c0e8b093977a735c1bb6794b7b632975ee7ff219513491befd1"} Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.401341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" event={"ID":"1b42b071-9e29-4c1b-a187-e79623838902","Type":"ContainerStarted","Data":"dd206b82a4f38083b62a85a00c0c328c49482cc174f3c78c36343cbcbf848609"} Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.417128 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" podStartSLOduration=2.417112563 podStartE2EDuration="2.417112563s" podCreationTimestamp="2026-01-20 23:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:27.416761035 +0000 UTC m=+2519.737021323" watchObservedRunningTime="2026-01-20 23:17:27.417112563 +0000 UTC m=+2519.737372851" Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.770684 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:27 crc kubenswrapper[5030]: I0120 23:17:27.974247 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7307ecd5-19b6-4257-8aa6-f7683128ed15" path="/var/lib/kubelet/pods/7307ecd5-19b6-4257-8aa6-f7683128ed15/volumes" Jan 20 23:17:28 crc kubenswrapper[5030]: I0120 23:17:28.410382 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerStarted","Data":"13f7ae1310a806de3e64b4c7ebf2389758e5664d41a7af1e8a5de1916fec9f89"} Jan 20 23:17:28 crc kubenswrapper[5030]: I0120 23:17:28.962515 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:17:28 crc kubenswrapper[5030]: E0120 23:17:28.963898 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:17:29 crc kubenswrapper[5030]: I0120 23:17:29.426502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerStarted","Data":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.440274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerStarted","Data":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.440669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerStarted","Data":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.746653 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.746897 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.783235 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:30 crc kubenswrapper[5030]: I0120 23:17:30.798822 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.453287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.453315 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.722634 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.723080 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.765206 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:31 crc kubenswrapper[5030]: I0120 23:17:31.767888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:32 crc kubenswrapper[5030]: I0120 23:17:32.465282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerStarted","Data":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} Jan 20 23:17:32 crc kubenswrapper[5030]: I0120 23:17:32.466366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:32 crc kubenswrapper[5030]: I0120 23:17:32.466392 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:32 crc kubenswrapper[5030]: I0120 23:17:32.508916 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.871037489 podStartE2EDuration="6.508895028s" podCreationTimestamp="2026-01-20 23:17:26 +0000 UTC" firstStartedPulling="2026-01-20 23:17:27.775747523 +0000 UTC m=+2520.096007811" lastFinishedPulling="2026-01-20 23:17:31.413605062 +0000 UTC m=+2523.733865350" observedRunningTime="2026-01-20 23:17:32.488415107 +0000 UTC m=+2524.808675475" watchObservedRunningTime="2026-01-20 23:17:32.508895028 +0000 UTC m=+2524.829155316" Jan 20 23:17:33 crc kubenswrapper[5030]: I0120 23:17:33.420165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:33 crc kubenswrapper[5030]: I0120 23:17:33.421594 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:17:33 crc kubenswrapper[5030]: I0120 23:17:33.475631 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:34 crc kubenswrapper[5030]: I0120 23:17:34.037901 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:34 crc kubenswrapper[5030]: I0120 23:17:34.516392 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:34 crc kubenswrapper[5030]: I0120 23:17:34.517219 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:17:34 crc kubenswrapper[5030]: I0120 23:17:34.521774 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:17:35 crc kubenswrapper[5030]: I0120 23:17:35.498115 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-central-agent" containerID="cri-o://ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" gracePeriod=30 Jan 20 23:17:35 crc kubenswrapper[5030]: I0120 23:17:35.498694 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-notification-agent" containerID="cri-o://204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" 
gracePeriod=30 Jan 20 23:17:35 crc kubenswrapper[5030]: I0120 23:17:35.498806 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="sg-core" containerID="cri-o://b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" gracePeriod=30 Jan 20 23:17:35 crc kubenswrapper[5030]: I0120 23:17:35.498887 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="proxy-httpd" containerID="cri-o://045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" gracePeriod=30 Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.227022 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnj7t\" (UniqueName: \"kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332333 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.332514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd\") pod \"b81f9748-e8da-40b7-8edc-687d07805075\" (UID: \"b81f9748-e8da-40b7-8edc-687d07805075\") " Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.333580 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.334090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.338570 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts" (OuterVolumeSpecName: "scripts") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.340595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t" (OuterVolumeSpecName: "kube-api-access-fnj7t") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "kube-api-access-fnj7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.366187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.407534 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435164 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435190 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435199 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435209 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnj7t\" (UniqueName: \"kubernetes.io/projected/b81f9748-e8da-40b7-8edc-687d07805075-kube-api-access-fnj7t\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435219 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.435231 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b81f9748-e8da-40b7-8edc-687d07805075-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.456831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data" (OuterVolumeSpecName: "config-data") pod "b81f9748-e8da-40b7-8edc-687d07805075" (UID: "b81f9748-e8da-40b7-8edc-687d07805075"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522516 5030 generic.go:334] "Generic (PLEG): container finished" podID="b81f9748-e8da-40b7-8edc-687d07805075" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" exitCode=0 Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522548 5030 generic.go:334] "Generic (PLEG): container finished" podID="b81f9748-e8da-40b7-8edc-687d07805075" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" exitCode=2 Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522558 5030 generic.go:334] "Generic (PLEG): container finished" podID="b81f9748-e8da-40b7-8edc-687d07805075" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" exitCode=0 Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522566 5030 generic.go:334] "Generic (PLEG): container finished" podID="b81f9748-e8da-40b7-8edc-687d07805075" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" exitCode=0 Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerDied","Data":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerDied","Data":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerDied","Data":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerDied","Data":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522654 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522670 5030 scope.go:117] "RemoveContainer" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.522672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b81f9748-e8da-40b7-8edc-687d07805075","Type":"ContainerDied","Data":"13f7ae1310a806de3e64b4c7ebf2389758e5664d41a7af1e8a5de1916fec9f89"} Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.536817 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b81f9748-e8da-40b7-8edc-687d07805075-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.561080 5030 scope.go:117] "RemoveContainer" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.561675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.575327 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585299 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.585676 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-notification-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585692 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-notification-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.585710 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="sg-core" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585716 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="sg-core" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.585745 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-central-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585751 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-central-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.585757 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="proxy-httpd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585763 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="proxy-httpd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="proxy-httpd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585952 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-central-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585964 
5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="sg-core" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.585978 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b81f9748-e8da-40b7-8edc-687d07805075" containerName="ceilometer-notification-agent" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.589685 5030 scope.go:117] "RemoveContainer" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.590127 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.593289 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.593613 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.600130 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.616226 5030 scope.go:117] "RemoveContainer" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxjw\" (UniqueName: \"kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.638368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.713611 5030 scope.go:117] "RemoveContainer" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.714112 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": container with ID starting with 045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e not found: ID does not exist" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714144 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} err="failed to get container status \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": rpc error: code = NotFound desc = could not find container \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": container with ID starting with 045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714172 5030 scope.go:117] "RemoveContainer" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.714367 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": container with ID starting with b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d not found: ID does not exist" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714391 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} err="failed to get container status \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": rpc error: code = NotFound desc = could not find container \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": container with ID starting with b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714410 5030 scope.go:117] "RemoveContainer" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.714596 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": container with ID starting with 
204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0 not found: ID does not exist" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714640 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} err="failed to get container status \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": rpc error: code = NotFound desc = could not find container \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": container with ID starting with 204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0 not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714673 5030 scope.go:117] "RemoveContainer" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: E0120 23:17:36.714875 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": container with ID starting with ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd not found: ID does not exist" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714897 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} err="failed to get container status \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": rpc error: code = NotFound desc = could not find container \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": container with ID starting with ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.714913 5030 scope.go:117] "RemoveContainer" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715069 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} err="failed to get container status \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": rpc error: code = NotFound desc = could not find container \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": container with ID starting with 045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715092 5030 scope.go:117] "RemoveContainer" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715405 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} err="failed to get container status \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": rpc error: code = NotFound desc = could not find container \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": container with ID starting with b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d not found: ID does not exist" Jan 20 
23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715427 5030 scope.go:117] "RemoveContainer" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715680 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} err="failed to get container status \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": rpc error: code = NotFound desc = could not find container \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": container with ID starting with 204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0 not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715708 5030 scope.go:117] "RemoveContainer" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715946 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} err="failed to get container status \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": rpc error: code = NotFound desc = could not find container \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": container with ID starting with ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.715969 5030 scope.go:117] "RemoveContainer" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.716174 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} err="failed to get container status \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": rpc error: code = NotFound desc = could not find container \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": container with ID starting with 045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.716277 5030 scope.go:117] "RemoveContainer" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.716745 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} err="failed to get container status \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": rpc error: code = NotFound desc = could not find container \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": container with ID starting with b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.716769 5030 scope.go:117] "RemoveContainer" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717021 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} err="failed to get container status 
\"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": rpc error: code = NotFound desc = could not find container \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": container with ID starting with 204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0 not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717042 5030 scope.go:117] "RemoveContainer" containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717300 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} err="failed to get container status \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": rpc error: code = NotFound desc = could not find container \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": container with ID starting with ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717401 5030 scope.go:117] "RemoveContainer" containerID="045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717666 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e"} err="failed to get container status \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": rpc error: code = NotFound desc = could not find container \"045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e\": container with ID starting with 045ffcf7cdcacf1c626b3433ac60c5e6bcf061e6d26688cdf6c28d83e999148e not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717689 5030 scope.go:117] "RemoveContainer" containerID="b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717887 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d"} err="failed to get container status \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": rpc error: code = NotFound desc = could not find container \"b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d\": container with ID starting with b800177c36dfad5a5dc56364622ff17b81f6588032cbbf45fb2bae4cf9da734d not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.717911 5030 scope.go:117] "RemoveContainer" containerID="204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.718143 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0"} err="failed to get container status \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": rpc error: code = NotFound desc = could not find container \"204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0\": container with ID starting with 204bb46fc52898bdc6c55a4eb240d0cce60af68b9cdd9170c8f135a4285836d0 not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.718236 5030 scope.go:117] "RemoveContainer" 
containerID="ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.718524 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd"} err="failed to get container status \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": rpc error: code = NotFound desc = could not find container \"ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd\": container with ID starting with ca7c4898dbcd4ab3efd75bfab9cf0294d2aa7ce2f7aa62a3fa604077c3bcdbbd not found: ID does not exist" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.739884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740541 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxjw\" (UniqueName: \"kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.740947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.741243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.744421 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.744690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.744953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.746662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:36 crc kubenswrapper[5030]: I0120 23:17:36.757341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxjw\" (UniqueName: \"kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw\") pod \"ceilometer-0\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:37 crc kubenswrapper[5030]: I0120 23:17:37.008577 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:37 crc kubenswrapper[5030]: W0120 23:17:37.480778 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4574c6ab_54d4_44b5_8adb_2ecb2d0dc8c2.slice/crio-d07e79312dd05be9a84c4b9ab2e0cdc68cb4e3cc6fea228809208806610c009c WatchSource:0}: Error finding container d07e79312dd05be9a84c4b9ab2e0cdc68cb4e3cc6fea228809208806610c009c: Status 404 returned error can't find the container with id d07e79312dd05be9a84c4b9ab2e0cdc68cb4e3cc6fea228809208806610c009c Jan 20 23:17:37 crc kubenswrapper[5030]: I0120 23:17:37.482232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:17:37 crc kubenswrapper[5030]: I0120 23:17:37.540804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerStarted","Data":"d07e79312dd05be9a84c4b9ab2e0cdc68cb4e3cc6fea228809208806610c009c"} Jan 20 23:17:37 crc kubenswrapper[5030]: I0120 23:17:37.976392 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81f9748-e8da-40b7-8edc-687d07805075" path="/var/lib/kubelet/pods/b81f9748-e8da-40b7-8edc-687d07805075/volumes" Jan 20 23:17:38 crc kubenswrapper[5030]: I0120 23:17:38.562567 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b42b071-9e29-4c1b-a187-e79623838902" containerID="9999471141c83c0e8b093977a735c1bb6794b7b632975ee7ff219513491befd1" exitCode=0 Jan 20 23:17:38 crc kubenswrapper[5030]: I0120 23:17:38.562667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" event={"ID":"1b42b071-9e29-4c1b-a187-e79623838902","Type":"ContainerDied","Data":"9999471141c83c0e8b093977a735c1bb6794b7b632975ee7ff219513491befd1"} Jan 20 23:17:38 crc kubenswrapper[5030]: I0120 23:17:38.569593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerStarted","Data":"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f"} Jan 20 23:17:39 crc kubenswrapper[5030]: I0120 23:17:39.582169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerStarted","Data":"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f"} Jan 20 23:17:39 crc kubenswrapper[5030]: I0120 23:17:39.881812 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.003578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data\") pod \"1b42b071-9e29-4c1b-a187-e79623838902\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.004277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts\") pod \"1b42b071-9e29-4c1b-a187-e79623838902\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.004524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle\") pod \"1b42b071-9e29-4c1b-a187-e79623838902\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.005112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fg2r\" (UniqueName: \"kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r\") pod \"1b42b071-9e29-4c1b-a187-e79623838902\" (UID: \"1b42b071-9e29-4c1b-a187-e79623838902\") " Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.010237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts" (OuterVolumeSpecName: "scripts") pod "1b42b071-9e29-4c1b-a187-e79623838902" (UID: "1b42b071-9e29-4c1b-a187-e79623838902"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.010326 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r" (OuterVolumeSpecName: "kube-api-access-4fg2r") pod "1b42b071-9e29-4c1b-a187-e79623838902" (UID: "1b42b071-9e29-4c1b-a187-e79623838902"). InnerVolumeSpecName "kube-api-access-4fg2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.032060 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data" (OuterVolumeSpecName: "config-data") pod "1b42b071-9e29-4c1b-a187-e79623838902" (UID: "1b42b071-9e29-4c1b-a187-e79623838902"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.043121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b42b071-9e29-4c1b-a187-e79623838902" (UID: "1b42b071-9e29-4c1b-a187-e79623838902"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.108310 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.108416 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fg2r\" (UniqueName: \"kubernetes.io/projected/1b42b071-9e29-4c1b-a187-e79623838902-kube-api-access-4fg2r\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.108473 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.108550 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b42b071-9e29-4c1b-a187-e79623838902-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.604305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" event={"ID":"1b42b071-9e29-4c1b-a187-e79623838902","Type":"ContainerDied","Data":"dd206b82a4f38083b62a85a00c0c328c49482cc174f3c78c36343cbcbf848609"} Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.604371 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd206b82a4f38083b62a85a00c0c328c49482cc174f3c78c36343cbcbf848609" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.604446 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.612284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerStarted","Data":"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55"} Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.719450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:17:40 crc kubenswrapper[5030]: E0120 23:17:40.720157 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b42b071-9e29-4c1b-a187-e79623838902" containerName="nova-cell0-conductor-db-sync" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.720193 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b42b071-9e29-4c1b-a187-e79623838902" containerName="nova-cell0-conductor-db-sync" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.720554 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b42b071-9e29-4c1b-a187-e79623838902" containerName="nova-cell0-conductor-db-sync" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.721645 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.727369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-82pqm" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.727415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.727877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.824243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kshsm\" (UniqueName: \"kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.824689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.824773 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.926255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kshsm\" (UniqueName: \"kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.926339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.926411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.934315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.942141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.943935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kshsm\" (UniqueName: \"kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm\") pod \"nova-cell0-conductor-0\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:40 crc kubenswrapper[5030]: I0120 23:17:40.962448 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:17:40 crc kubenswrapper[5030]: E0120 23:17:40.962834 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.050451 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.535366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.622826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d","Type":"ContainerStarted","Data":"10eb60b928b07a79fe138ef219d7d32491415f9dc55ca9767af8e1040ab88854"} Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.625426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerStarted","Data":"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255"} Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.625573 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:17:41 crc kubenswrapper[5030]: I0120 23:17:41.661458 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.8921355119999999 podStartE2EDuration="5.661427552s" podCreationTimestamp="2026-01-20 23:17:36 +0000 UTC" firstStartedPulling="2026-01-20 23:17:37.483525974 +0000 UTC m=+2529.803786302" lastFinishedPulling="2026-01-20 23:17:41.252818054 +0000 UTC m=+2533.573078342" observedRunningTime="2026-01-20 23:17:41.646309271 +0000 UTC m=+2533.966569589" watchObservedRunningTime="2026-01-20 23:17:41.661427552 +0000 UTC m=+2533.981687880" Jan 20 23:17:42 crc kubenswrapper[5030]: I0120 23:17:42.643852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d","Type":"ContainerStarted","Data":"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5"} Jan 20 23:17:42 crc kubenswrapper[5030]: I0120 23:17:42.667540 5030 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.66751857 podStartE2EDuration="2.66751857s" podCreationTimestamp="2026-01-20 23:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:42.663238977 +0000 UTC m=+2534.983499275" watchObservedRunningTime="2026-01-20 23:17:42.66751857 +0000 UTC m=+2534.987778878" Jan 20 23:17:43 crc kubenswrapper[5030]: I0120 23:17:43.652826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.083047 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.255714 5030 scope.go:117] "RemoveContainer" containerID="33f20a4031ef4829ecbbf9e28989b665faf28e57166d8564c12773236bce9457" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.306093 5030 scope.go:117] "RemoveContainer" containerID="444566938fcbfab3901e6fd0f098d24ad50b1a35701c21a72a39c1fc137f4568" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.383161 5030 scope.go:117] "RemoveContainer" containerID="82ab471c3d64055113b2021eef08677b0e9845c041b161dbbb64d459e70517df" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.410235 5030 scope.go:117] "RemoveContainer" containerID="c2d39dc919ce31b1d04bba4cd39e59cbcbecfe645a2a254ced36db8b14b27f79" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.477053 5030 scope.go:117] "RemoveContainer" containerID="3a9e6b3c5136ed70e89b31faf61fa24914c889d05394db12868771de49733760" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.525177 5030 scope.go:117] "RemoveContainer" containerID="a603f5479f1751fca07b831722eb086379566bd909e2e23ac72bbb41392e0f4d" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.567410 5030 scope.go:117] "RemoveContainer" containerID="f7071f8a75e8f74dd7b4086b6cfd77c0d19c5687e5f92846bb994539bc2278af" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.614568 5030 scope.go:117] "RemoveContainer" containerID="2810d82b7b0d1df9d395d91516e331ed9c41cba08efeb944359966702a2c540a" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.678454 5030 scope.go:117] "RemoveContainer" containerID="a8f87c8119662cd288e317c6cd09aea15848c621b368e0d59c2e3dde575d7349" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.703141 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.704725 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.707109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.707898 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.716255 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.716510 5030 scope.go:117] "RemoveContainer" containerID="bdd7503e79666863d6afaca4fc262aecba84505a7e41daf73b9635bf49a4eb40" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.768525 5030 scope.go:117] "RemoveContainer" containerID="1decb37a6a7c1d14ac75639b689c1ba474b929273efe96a724937a53898e5fcd" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.777744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.777816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.777940 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2n4\" (UniqueName: \"kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.777960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.821110 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.822711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.824449 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.831010 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.832613 5030 scope.go:117] "RemoveContainer" containerID="bc6af635e7469805210beb35edd587c7aa233e75f38fdddef526c1f10038f18a" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.884729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2n4\" (UniqueName: \"kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.884776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.884836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.884879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.892429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.902405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.908953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.909414 5030 scope.go:117] "RemoveContainer" containerID="7fa5493ab547d6dfa6b51ddf09c04288a1a6dcaaf5124b6386a2e61f891bb6d0" Jan 20 23:17:46 crc 
kubenswrapper[5030]: I0120 23:17:46.914556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2n4\" (UniqueName: \"kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4\") pod \"nova-cell0-cell-mapping-6nqfg\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.915029 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.916274 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.918511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.959787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.988390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.988504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.988555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gx5\" (UniqueName: \"kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:46 crc kubenswrapper[5030]: I0120 23:17:46.988570 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.014738 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.016260 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.020495 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.036573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.036659 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.050678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.052140 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.057300 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.086309 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.091999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gx5\" (UniqueName: \"kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ps7z\" (UniqueName: \"kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6lt\" (UniqueName: \"kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.092666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.093039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.097518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.098926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.108716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gx5\" (UniqueName: \"kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5\") pod \"nova-api-0\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.153817 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194602 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194694 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ps7z\" (UniqueName: \"kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgdzn\" (UniqueName: \"kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6lt\" (UniqueName: \"kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 
23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.194946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.195908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.200859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.204305 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.204850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.205887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.219282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6lt\" (UniqueName: \"kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt\") pod \"nova-cell1-novncproxy-0\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.224683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ps7z\" (UniqueName: \"kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z\") pod \"nova-metadata-0\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.240607 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.296947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgdzn\" (UniqueName: \"kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.297174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.297241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.310267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.315236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.317750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgdzn\" (UniqueName: \"kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn\") pod \"nova-scheduler-0\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.359312 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.384093 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.549246 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg"] Jan 20 23:17:47 crc kubenswrapper[5030]: W0120 23:17:47.610355 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fcd0796_3d13_46c8_a117_74ff1f36d4cc.slice/crio-630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3 WatchSource:0}: Error finding container 630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3: Status 404 returned error can't find the container with id 630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3 Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.646192 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.648080 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.650054 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.651235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.663039 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.691349 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: W0120 23:17:47.696139 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod878d7a51_fe53_42bb_982d_8a5971f9be80.slice/crio-d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d WatchSource:0}: Error finding container d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d: Status 404 returned error can't find the container with id d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.743108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerStarted","Data":"d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d"} Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.748566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" event={"ID":"7fcd0796-3d13-46c8-a117-74ff1f36d4cc","Type":"ContainerStarted","Data":"630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3"} Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.808120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 
23:17:47.808478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.808552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrf69\" (UniqueName: \"kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.808574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.842317 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.909884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.909939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.910012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrf69\" (UniqueName: \"kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.910046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.913815 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.915812 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.922693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.930514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrf69\" (UniqueName: \"kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69\") pod \"nova-cell1-conductor-db-sync-648db\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.979113 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:47 crc kubenswrapper[5030]: I0120 23:17:47.979586 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.053746 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.471379 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db"] Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.758713 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerStarted","Data":"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.758755 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerStarted","Data":"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.758767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerStarted","Data":"a3eff70627f87dc04e661a5b2519f3e459a5f575e7750dab7170a8e80bc7a872"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.760356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" event={"ID":"7fcd0796-3d13-46c8-a117-74ff1f36d4cc","Type":"ContainerStarted","Data":"88572615c9470cc1f53e5fe138aec4921f5f4bbf0c0b6644397ebffec41a7164"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.761691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" event={"ID":"bef71c28-3c90-4bc9-bd49-13629ca6f394","Type":"ContainerStarted","Data":"686aad5aa9f2af3ead8f0f1957fe2fbc06afa5bd392f0df0060aebc4b8dcab80"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.761714 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" event={"ID":"bef71c28-3c90-4bc9-bd49-13629ca6f394","Type":"ContainerStarted","Data":"bf5973ab7e4cffb87dbb2b454f3386accb9a34d211ac30f35529a4038ade039b"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.762980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"73a011b0-0e05-4983-a66e-c244d0a17ba2","Type":"ContainerStarted","Data":"237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.763005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"73a011b0-0e05-4983-a66e-c244d0a17ba2","Type":"ContainerStarted","Data":"e1372681d6bbd6f87067a32482a31401ce0358ea67fa49afac45a32511c087b6"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.765718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"951e26bc-aedf-4210-b5e2-d2436c2a90b1","Type":"ContainerStarted","Data":"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.765761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"951e26bc-aedf-4210-b5e2-d2436c2a90b1","Type":"ContainerStarted","Data":"dbac930019da903f836cabf5ba7f988d8ff451f1fed0156299e3e77a6ca29be7"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.767429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerStarted","Data":"e04fbae0a14def52a1ca6c867992be7bd258bd9de533d50626669a6d7a7ffc2c"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.767464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerStarted","Data":"a1d23fe2b7adbf63edbabfdc30bb528d1409be3a65c700d226dbc7625861d5ec"} Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.775916 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.775901284 podStartE2EDuration="2.775901284s" podCreationTimestamp="2026-01-20 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.773835624 +0000 UTC m=+2541.094095912" watchObservedRunningTime="2026-01-20 23:17:48.775901284 +0000 UTC m=+2541.096161572" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.793932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.793916736 podStartE2EDuration="2.793916736s" podCreationTimestamp="2026-01-20 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.789732985 +0000 UTC m=+2541.109993273" watchObservedRunningTime="2026-01-20 23:17:48.793916736 +0000 UTC m=+2541.114177024" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.804791 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.804774625 podStartE2EDuration="2.804774625s" podCreationTimestamp="2026-01-20 23:17:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.803070195 +0000 UTC m=+2541.123330483" watchObservedRunningTime="2026-01-20 23:17:48.804774625 +0000 UTC m=+2541.125034914" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.822679 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" podStartSLOduration=2.822662755 podStartE2EDuration="2.822662755s" podCreationTimestamp="2026-01-20 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.815529574 +0000 UTC m=+2541.135789862" watchObservedRunningTime="2026-01-20 23:17:48.822662755 +0000 UTC m=+2541.142923043" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.835448 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.835433381 podStartE2EDuration="2.835433381s" podCreationTimestamp="2026-01-20 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.828427513 +0000 UTC m=+2541.148687801" watchObservedRunningTime="2026-01-20 23:17:48.835433381 +0000 UTC m=+2541.155693669" Jan 20 23:17:48 crc kubenswrapper[5030]: I0120 23:17:48.845498 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" podStartSLOduration=1.845481692 podStartE2EDuration="1.845481692s" podCreationTimestamp="2026-01-20 23:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:48.842085171 +0000 UTC m=+2541.162345469" watchObservedRunningTime="2026-01-20 23:17:48.845481692 +0000 UTC m=+2541.165741980" Jan 20 23:17:50 crc kubenswrapper[5030]: I0120 23:17:50.243187 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:50 crc kubenswrapper[5030]: I0120 23:17:50.282608 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:50 crc kubenswrapper[5030]: I0120 23:17:50.782539 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191" gracePeriod=30 Jan 20 23:17:50 crc kubenswrapper[5030]: I0120 23:17:50.782629 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-log" containerID="cri-o://b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" gracePeriod=30 Jan 20 23:17:50 crc kubenswrapper[5030]: I0120 23:17:50.782680 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-metadata" containerID="cri-o://3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" gracePeriod=30 Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.371825 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.500138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle\") pod \"761871f5-8189-4a5e-b442-61bfff40bb23\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.500290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs\") pod \"761871f5-8189-4a5e-b442-61bfff40bb23\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.500330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data\") pod \"761871f5-8189-4a5e-b442-61bfff40bb23\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.500428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ps7z\" (UniqueName: \"kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z\") pod \"761871f5-8189-4a5e-b442-61bfff40bb23\" (UID: \"761871f5-8189-4a5e-b442-61bfff40bb23\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.500856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs" (OuterVolumeSpecName: "logs") pod "761871f5-8189-4a5e-b442-61bfff40bb23" (UID: "761871f5-8189-4a5e-b442-61bfff40bb23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.503856 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761871f5-8189-4a5e-b442-61bfff40bb23-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.507059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z" (OuterVolumeSpecName: "kube-api-access-7ps7z") pod "761871f5-8189-4a5e-b442-61bfff40bb23" (UID: "761871f5-8189-4a5e-b442-61bfff40bb23"). InnerVolumeSpecName "kube-api-access-7ps7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.536441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data" (OuterVolumeSpecName: "config-data") pod "761871f5-8189-4a5e-b442-61bfff40bb23" (UID: "761871f5-8189-4a5e-b442-61bfff40bb23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.547484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761871f5-8189-4a5e-b442-61bfff40bb23" (UID: "761871f5-8189-4a5e-b442-61bfff40bb23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.606021 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.606387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761871f5-8189-4a5e-b442-61bfff40bb23-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.606405 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ps7z\" (UniqueName: \"kubernetes.io/projected/761871f5-8189-4a5e-b442-61bfff40bb23-kube-api-access-7ps7z\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.619558 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.707442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6lt\" (UniqueName: \"kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt\") pod \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.707539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data\") pod \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.707710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle\") pod \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\" (UID: \"951e26bc-aedf-4210-b5e2-d2436c2a90b1\") " Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.711879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt" (OuterVolumeSpecName: "kube-api-access-sf6lt") pod "951e26bc-aedf-4210-b5e2-d2436c2a90b1" (UID: "951e26bc-aedf-4210-b5e2-d2436c2a90b1"). InnerVolumeSpecName "kube-api-access-sf6lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.742550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data" (OuterVolumeSpecName: "config-data") pod "951e26bc-aedf-4210-b5e2-d2436c2a90b1" (UID: "951e26bc-aedf-4210-b5e2-d2436c2a90b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.749003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "951e26bc-aedf-4210-b5e2-d2436c2a90b1" (UID: "951e26bc-aedf-4210-b5e2-d2436c2a90b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792720 5030 generic.go:334] "Generic (PLEG): container finished" podID="761871f5-8189-4a5e-b442-61bfff40bb23" containerID="3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" exitCode=0 Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792750 5030 generic.go:334] "Generic (PLEG): container finished" podID="761871f5-8189-4a5e-b442-61bfff40bb23" containerID="b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" exitCode=143 Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792777 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerDied","Data":"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438"} Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerDied","Data":"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f"} Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"761871f5-8189-4a5e-b442-61bfff40bb23","Type":"ContainerDied","Data":"a3eff70627f87dc04e661a5b2519f3e459a5f575e7750dab7170a8e80bc7a872"} Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.792852 5030 scope.go:117] "RemoveContainer" containerID="3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.796459 5030 generic.go:334] "Generic (PLEG): container finished" podID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" containerID="8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191" exitCode=0 Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.796491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"951e26bc-aedf-4210-b5e2-d2436c2a90b1","Type":"ContainerDied","Data":"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191"} Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.796512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"951e26bc-aedf-4210-b5e2-d2436c2a90b1","Type":"ContainerDied","Data":"dbac930019da903f836cabf5ba7f988d8ff451f1fed0156299e3e77a6ca29be7"} Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.796556 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.809548 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.809585 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6lt\" (UniqueName: \"kubernetes.io/projected/951e26bc-aedf-4210-b5e2-d2436c2a90b1-kube-api-access-sf6lt\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.809602 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e26bc-aedf-4210-b5e2-d2436c2a90b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.824594 5030 scope.go:117] "RemoveContainer" containerID="b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.833037 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.850659 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.858933 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.869451 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.877098 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.877778 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-metadata" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.877857 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-metadata" Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.877937 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-log" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.877998 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-log" Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.878061 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.878116 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.878340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.878415 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" 
containerName="nova-metadata-log" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.878487 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" containerName="nova-metadata-metadata" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.883451 5030 scope.go:117] "RemoveContainer" containerID="3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.884747 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.885875 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438\": container with ID starting with 3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438 not found: ID does not exist" containerID="3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.885954 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438"} err="failed to get container status \"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438\": rpc error: code = NotFound desc = could not find container \"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438\": container with ID starting with 3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438 not found: ID does not exist" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.885993 5030 scope.go:117] "RemoveContainer" containerID="b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.893554 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.893718 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.900384 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f\": container with ID starting with b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f not found: ID does not exist" containerID="b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.900427 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f"} err="failed to get container status \"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f\": rpc error: code = NotFound desc = could not find container \"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f\": container with ID starting with b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f not found: ID does not exist" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.900453 5030 scope.go:117] "RemoveContainer" containerID="3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.903906 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438"} err="failed to get container status \"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438\": rpc error: code = NotFound desc = could not find container \"3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438\": container with ID starting with 3a07560e23e71ad598717d04ed67ede208a05e3978a9b4989d2063f07adc3438 not found: ID does not exist" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.903939 5030 scope.go:117] "RemoveContainer" containerID="b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.907891 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f"} err="failed to get container status \"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f\": rpc error: code = NotFound desc = could not find container \"b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f\": container with ID starting with b2d39d697fbfe7ede7e3134aaadebcd088137d409d4b8c200301ec3c13f8862f not found: ID does not exist" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.907921 5030 scope.go:117] "RemoveContainer" containerID="8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.916571 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.942052 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.943763 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.946928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.947136 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.947336 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.954486 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.960427 5030 scope.go:117] "RemoveContainer" containerID="8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191" Jan 20 23:17:51 crc kubenswrapper[5030]: E0120 23:17:51.960960 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191\": container with ID starting with 8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191 not found: ID does not exist" containerID="8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.961005 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191"} err="failed to get container status \"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191\": rpc error: code = NotFound desc = could not find container \"8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191\": container with ID starting with 8577b5f40e530b7a4c036e39483bb7e8ca87a05ce542970247312e4be676a191 not found: ID does not exist" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.972337 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761871f5-8189-4a5e-b442-61bfff40bb23" path="/var/lib/kubelet/pods/761871f5-8189-4a5e-b442-61bfff40bb23/volumes" Jan 20 23:17:51 crc kubenswrapper[5030]: I0120 23:17:51.972938 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951e26bc-aedf-4210-b5e2-d2436c2a90b1" path="/var/lib/kubelet/pods/951e26bc-aedf-4210-b5e2-d2436c2a90b1/volumes" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62pl\" (UniqueName: \"kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019443 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019464 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.019613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzwq2\" (UniqueName: \"kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.121722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.121842 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.121909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.121944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzwq2\" (UniqueName: \"kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62pl\" (UniqueName: \"kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.122277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.123332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.127469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.127468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.127503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.128925 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.135288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.135787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.137969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.145869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzwq2\" (UniqueName: \"kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2\") pod \"nova-metadata-0\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.146986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62pl\" (UniqueName: \"kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.253610 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.263119 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.384947 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:52 crc kubenswrapper[5030]: W0120 23:17:52.727964 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b2c3b53_0696_4565_95c5_2b50726c885e.slice/crio-a7edf0f949082c2c80dc1e3ea0d23428b4dd0df116914643add17a300b05c228 WatchSource:0}: Error finding container a7edf0f949082c2c80dc1e3ea0d23428b4dd0df116914643add17a300b05c228: Status 404 returned error can't find the container with id a7edf0f949082c2c80dc1e3ea0d23428b4dd0df116914643add17a300b05c228 Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.728881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:52 crc kubenswrapper[5030]: W0120 23:17:52.798268 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc079a7fc_dcd4_48da_bdd8_b47d93405531.slice/crio-392688e607d62f33ed93e839116811ee0318a10dea920512c7284575812e16b1 WatchSource:0}: Error finding container 392688e607d62f33ed93e839116811ee0318a10dea920512c7284575812e16b1: Status 404 returned error can't find the container with id 392688e607d62f33ed93e839116811ee0318a10dea920512c7284575812e16b1 Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.801515 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:17:52 crc kubenswrapper[5030]: I0120 23:17:52.809331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerStarted","Data":"a7edf0f949082c2c80dc1e3ea0d23428b4dd0df116914643add17a300b05c228"} Jan 20 23:17:53 crc kubenswrapper[5030]: I0120 23:17:53.821180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c079a7fc-dcd4-48da-bdd8-b47d93405531","Type":"ContainerStarted","Data":"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f"} Jan 20 23:17:53 crc kubenswrapper[5030]: I0120 23:17:53.821556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c079a7fc-dcd4-48da-bdd8-b47d93405531","Type":"ContainerStarted","Data":"392688e607d62f33ed93e839116811ee0318a10dea920512c7284575812e16b1"} Jan 20 23:17:53 crc kubenswrapper[5030]: I0120 23:17:53.823109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerStarted","Data":"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80"} Jan 20 23:17:53 crc kubenswrapper[5030]: I0120 23:17:53.823162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerStarted","Data":"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920"} Jan 20 23:17:53 crc kubenswrapper[5030]: I0120 23:17:53.849245 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.849228125 podStartE2EDuration="2.849228125s" podCreationTimestamp="2026-01-20 23:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:53.842570616 +0000 UTC m=+2546.162830924" watchObservedRunningTime="2026-01-20 23:17:53.849228125 +0000 UTC m=+2546.169488413" Jan 20 23:17:55 crc kubenswrapper[5030]: I0120 23:17:55.847990 5030 generic.go:334] "Generic (PLEG): container finished" podID="7fcd0796-3d13-46c8-a117-74ff1f36d4cc" containerID="88572615c9470cc1f53e5fe138aec4921f5f4bbf0c0b6644397ebffec41a7164" exitCode=0 Jan 20 23:17:55 crc kubenswrapper[5030]: I0120 23:17:55.848070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" event={"ID":"7fcd0796-3d13-46c8-a117-74ff1f36d4cc","Type":"ContainerDied","Data":"88572615c9470cc1f53e5fe138aec4921f5f4bbf0c0b6644397ebffec41a7164"} Jan 20 23:17:55 crc kubenswrapper[5030]: I0120 23:17:55.879577 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=4.879551174 podStartE2EDuration="4.879551174s" podCreationTimestamp="2026-01-20 23:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:17:53.86318436 +0000 UTC m=+2546.183444648" watchObservedRunningTime="2026-01-20 23:17:55.879551174 +0000 UTC m=+2548.199811502" Jan 20 23:17:55 crc kubenswrapper[5030]: I0120 23:17:55.962537 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:17:55 crc kubenswrapper[5030]: E0120 23:17:55.963101 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:17:56 crc kubenswrapper[5030]: I0120 23:17:56.869146 5030 generic.go:334] "Generic (PLEG): container finished" podID="bef71c28-3c90-4bc9-bd49-13629ca6f394" containerID="686aad5aa9f2af3ead8f0f1957fe2fbc06afa5bd392f0df0060aebc4b8dcab80" exitCode=0 Jan 20 23:17:56 crc kubenswrapper[5030]: I0120 23:17:56.869239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" event={"ID":"bef71c28-3c90-4bc9-bd49-13629ca6f394","Type":"ContainerDied","Data":"686aad5aa9f2af3ead8f0f1957fe2fbc06afa5bd392f0df0060aebc4b8dcab80"} Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.155239 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.155592 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.254613 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.255499 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.263533 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.271400 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.334121 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2n4\" (UniqueName: \"kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4\") pod \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.334225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data\") pod \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.334278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts\") pod \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.334488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle\") pod \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\" (UID: \"7fcd0796-3d13-46c8-a117-74ff1f36d4cc\") " Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.341833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts" (OuterVolumeSpecName: "scripts") pod "7fcd0796-3d13-46c8-a117-74ff1f36d4cc" (UID: "7fcd0796-3d13-46c8-a117-74ff1f36d4cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.341949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4" (OuterVolumeSpecName: "kube-api-access-kn2n4") pod "7fcd0796-3d13-46c8-a117-74ff1f36d4cc" (UID: "7fcd0796-3d13-46c8-a117-74ff1f36d4cc"). InnerVolumeSpecName "kube-api-access-kn2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.369087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fcd0796-3d13-46c8-a117-74ff1f36d4cc" (UID: "7fcd0796-3d13-46c8-a117-74ff1f36d4cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.369650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data" (OuterVolumeSpecName: "config-data") pod "7fcd0796-3d13-46c8-a117-74ff1f36d4cc" (UID: "7fcd0796-3d13-46c8-a117-74ff1f36d4cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.385094 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.425932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.437022 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.437064 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.437078 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.437094 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2n4\" (UniqueName: \"kubernetes.io/projected/7fcd0796-3d13-46c8-a117-74ff1f36d4cc-kube-api-access-kn2n4\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.883532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" event={"ID":"7fcd0796-3d13-46c8-a117-74ff1f36d4cc","Type":"ContainerDied","Data":"630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3"} Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.883913 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630a0cdc0f6521b4524a6f5393f0b3c6d252f342c32045f37f7364f8513da6b3" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.883673 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg" Jan 20 23:17:57 crc kubenswrapper[5030]: I0120 23:17:57.941977 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.157968 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.158203 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-log" containerID="cri-o://a1d23fe2b7adbf63edbabfdc30bb528d1409be3a65c700d226dbc7625861d5ec" gracePeriod=30 Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.158610 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-api" containerID="cri-o://e04fbae0a14def52a1ca6c867992be7bd258bd9de533d50626669a6d7a7ffc2c" gracePeriod=30 Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.168920 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.169148 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.186931 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.346423 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.383159 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.456826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle\") pod \"bef71c28-3c90-4bc9-bd49-13629ca6f394\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.456904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data\") pod \"bef71c28-3c90-4bc9-bd49-13629ca6f394\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.457057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrf69\" (UniqueName: \"kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69\") pod \"bef71c28-3c90-4bc9-bd49-13629ca6f394\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.457142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts\") pod \"bef71c28-3c90-4bc9-bd49-13629ca6f394\" (UID: \"bef71c28-3c90-4bc9-bd49-13629ca6f394\") " Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.461854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts" (OuterVolumeSpecName: "scripts") pod "bef71c28-3c90-4bc9-bd49-13629ca6f394" (UID: "bef71c28-3c90-4bc9-bd49-13629ca6f394"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.461883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69" (OuterVolumeSpecName: "kube-api-access-jrf69") pod "bef71c28-3c90-4bc9-bd49-13629ca6f394" (UID: "bef71c28-3c90-4bc9-bd49-13629ca6f394"). InnerVolumeSpecName "kube-api-access-jrf69". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.482975 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data" (OuterVolumeSpecName: "config-data") pod "bef71c28-3c90-4bc9-bd49-13629ca6f394" (UID: "bef71c28-3c90-4bc9-bd49-13629ca6f394"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.484818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bef71c28-3c90-4bc9-bd49-13629ca6f394" (UID: "bef71c28-3c90-4bc9-bd49-13629ca6f394"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.559310 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrf69\" (UniqueName: \"kubernetes.io/projected/bef71c28-3c90-4bc9-bd49-13629ca6f394-kube-api-access-jrf69\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.559341 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.559354 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.559362 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef71c28-3c90-4bc9-bd49-13629ca6f394-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.895755 5030 generic.go:334] "Generic (PLEG): container finished" podID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerID="a1d23fe2b7adbf63edbabfdc30bb528d1409be3a65c700d226dbc7625861d5ec" exitCode=143 Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.895810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerDied","Data":"a1d23fe2b7adbf63edbabfdc30bb528d1409be3a65c700d226dbc7625861d5ec"} Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.897876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" event={"ID":"bef71c28-3c90-4bc9-bd49-13629ca6f394","Type":"ContainerDied","Data":"bf5973ab7e4cffb87dbb2b454f3386accb9a34d211ac30f35529a4038ade039b"} Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.897909 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf5973ab7e4cffb87dbb2b454f3386accb9a34d211ac30f35529a4038ade039b" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.901287 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-log" containerID="cri-o://254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" gracePeriod=30 Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.901325 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-metadata" containerID="cri-o://8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" gracePeriod=30 Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.901140 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.986246 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:17:58 crc kubenswrapper[5030]: E0120 23:17:58.986744 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef71c28-3c90-4bc9-bd49-13629ca6f394" containerName="nova-cell1-conductor-db-sync" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.986763 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef71c28-3c90-4bc9-bd49-13629ca6f394" containerName="nova-cell1-conductor-db-sync" Jan 20 23:17:58 crc kubenswrapper[5030]: E0120 23:17:58.986788 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fcd0796-3d13-46c8-a117-74ff1f36d4cc" containerName="nova-manage" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.986794 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcd0796-3d13-46c8-a117-74ff1f36d4cc" containerName="nova-manage" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.986978 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef71c28-3c90-4bc9-bd49-13629ca6f394" containerName="nova-cell1-conductor-db-sync" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.987011 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fcd0796-3d13-46c8-a117-74ff1f36d4cc" containerName="nova-manage" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.987712 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:58 crc kubenswrapper[5030]: I0120 23:17:58.995466 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.003513 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.069962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.070030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztwkh\" (UniqueName: \"kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.070073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.171665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.171724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztwkh\" (UniqueName: \"kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.171768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.177735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.179506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.216218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztwkh\" (UniqueName: \"kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh\") pod \"nova-cell1-conductor-0\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.314372 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.500784 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.586871 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzwq2\" (UniqueName: \"kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2\") pod \"4b2c3b53-0696-4565-95c5-2b50726c885e\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.587240 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs\") pod \"4b2c3b53-0696-4565-95c5-2b50726c885e\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.587435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle\") pod \"4b2c3b53-0696-4565-95c5-2b50726c885e\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.587551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs\") pod \"4b2c3b53-0696-4565-95c5-2b50726c885e\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.587602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data\") pod \"4b2c3b53-0696-4565-95c5-2b50726c885e\" (UID: \"4b2c3b53-0696-4565-95c5-2b50726c885e\") " Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.587684 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs" (OuterVolumeSpecName: "logs") pod "4b2c3b53-0696-4565-95c5-2b50726c885e" (UID: "4b2c3b53-0696-4565-95c5-2b50726c885e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.588024 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b2c3b53-0696-4565-95c5-2b50726c885e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.592144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2" (OuterVolumeSpecName: "kube-api-access-zzwq2") pod "4b2c3b53-0696-4565-95c5-2b50726c885e" (UID: "4b2c3b53-0696-4565-95c5-2b50726c885e"). InnerVolumeSpecName "kube-api-access-zzwq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.617085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data" (OuterVolumeSpecName: "config-data") pod "4b2c3b53-0696-4565-95c5-2b50726c885e" (UID: "4b2c3b53-0696-4565-95c5-2b50726c885e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.618797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b2c3b53-0696-4565-95c5-2b50726c885e" (UID: "4b2c3b53-0696-4565-95c5-2b50726c885e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.635599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4b2c3b53-0696-4565-95c5-2b50726c885e" (UID: "4b2c3b53-0696-4565-95c5-2b50726c885e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.689612 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.689659 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.689669 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzwq2\" (UniqueName: \"kubernetes.io/projected/4b2c3b53-0696-4565-95c5-2b50726c885e-kube-api-access-zzwq2\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.689679 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2c3b53-0696-4565-95c5-2b50726c885e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.786656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:17:59 crc kubenswrapper[5030]: W0120 23:17:59.789414 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0863eb67_3ead_4744_b338_aaf75284e458.slice/crio-250dde89a7a2f299c5ed56a1c07167f550c75c679d1ddcf673f5d9cb8e071d82 WatchSource:0}: Error finding container 250dde89a7a2f299c5ed56a1c07167f550c75c679d1ddcf673f5d9cb8e071d82: Status 404 returned error can't find the container with id 250dde89a7a2f299c5ed56a1c07167f550c75c679d1ddcf673f5d9cb8e071d82 Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907461 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerID="8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" exitCode=0 Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907490 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerID="254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" exitCode=143 Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerDied","Data":"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80"} Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907541 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerDied","Data":"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920"} Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4b2c3b53-0696-4565-95c5-2b50726c885e","Type":"ContainerDied","Data":"a7edf0f949082c2c80dc1e3ea0d23428b4dd0df116914643add17a300b05c228"} Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.907586 5030 scope.go:117] "RemoveContainer" containerID="8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.909512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0863eb67-3ead-4744-b338-aaf75284e458","Type":"ContainerStarted","Data":"250dde89a7a2f299c5ed56a1c07167f550c75c679d1ddcf673f5d9cb8e071d82"} Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.909678 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerName="nova-scheduler-scheduler" containerID="cri-o://237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" gracePeriod=30 Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.930334 5030 scope.go:117] "RemoveContainer" containerID="254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.975652 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.975687 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.978412 5030 scope.go:117] "RemoveContainer" containerID="8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" Jan 20 23:17:59 crc kubenswrapper[5030]: E0120 23:17:59.978982 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80\": container with ID starting with 8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80 not found: ID does not exist" containerID="8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979034 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80"} err="failed to get container status \"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80\": rpc error: code = NotFound desc = could not find container \"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80\": container with ID starting with 8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80 not found: 
ID does not exist" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979060 5030 scope.go:117] "RemoveContainer" containerID="254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" Jan 20 23:17:59 crc kubenswrapper[5030]: E0120 23:17:59.979391 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920\": container with ID starting with 254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920 not found: ID does not exist" containerID="254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979421 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920"} err="failed to get container status \"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920\": rpc error: code = NotFound desc = could not find container \"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920\": container with ID starting with 254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920 not found: ID does not exist" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979443 5030 scope.go:117] "RemoveContainer" containerID="8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979863 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80"} err="failed to get container status \"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80\": rpc error: code = NotFound desc = could not find container \"8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80\": container with ID starting with 8d46d63053b613b7f9c18d22393cd21e777ac92f4167b433f3002c8273cc3d80 not found: ID does not exist" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.979886 5030 scope.go:117] "RemoveContainer" containerID="254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.980130 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920"} err="failed to get container status \"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920\": rpc error: code = NotFound desc = could not find container \"254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920\": container with ID starting with 254199f0a6670eafcb31548a82c7ebbe2c8069897ab93ee61e78ff304077f920 not found: ID does not exist" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.998227 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:17:59 crc kubenswrapper[5030]: E0120 23:17:59.998852 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-metadata" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.998878 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-metadata" Jan 20 23:17:59 crc kubenswrapper[5030]: E0120 23:17:59.998918 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" 
containerName="nova-metadata-log" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.998928 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-log" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.999173 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-log" Jan 20 23:17:59 crc kubenswrapper[5030]: I0120 23:17:59.999200 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" containerName="nova-metadata-metadata" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.000613 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.003645 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.004309 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.011597 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.095399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.095857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.095966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.096170 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.096291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46j8d\" (UniqueName: \"kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.199691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.199842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.199895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.199961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46j8d\" (UniqueName: \"kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.200086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.200890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.204694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.206026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.209116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.233019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46j8d\" (UniqueName: \"kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d\") pod \"nova-metadata-0\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 
23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.318934 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.857609 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.919531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerStarted","Data":"6b007a96e5de9399b28a1e05154ad7bc2d0c5a1012c32098be5e5e9c579560e6"} Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.923731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0863eb67-3ead-4744-b338-aaf75284e458","Type":"ContainerStarted","Data":"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f"} Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.924757 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:18:00 crc kubenswrapper[5030]: I0120 23:18:00.941719 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.941701769 podStartE2EDuration="2.941701769s" podCreationTimestamp="2026-01-20 23:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:00.9388368 +0000 UTC m=+2553.259097118" watchObservedRunningTime="2026-01-20 23:18:00.941701769 +0000 UTC m=+2553.261962057" Jan 20 23:18:01 crc kubenswrapper[5030]: I0120 23:18:01.937459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerStarted","Data":"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647"} Jan 20 23:18:01 crc kubenswrapper[5030]: I0120 23:18:01.937795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerStarted","Data":"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85"} Jan 20 23:18:01 crc kubenswrapper[5030]: I0120 23:18:01.972332 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.972298533 podStartE2EDuration="2.972298533s" podCreationTimestamp="2026-01-20 23:17:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:01.963462131 +0000 UTC m=+2554.283722429" watchObservedRunningTime="2026-01-20 23:18:01.972298533 +0000 UTC m=+2554.292558891" Jan 20 23:18:01 crc kubenswrapper[5030]: I0120 23:18:01.984134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2c3b53-0696-4565-95c5-2b50726c885e" path="/var/lib/kubelet/pods/4b2c3b53-0696-4565-95c5-2b50726c885e/volumes" Jan 20 23:18:02 crc kubenswrapper[5030]: I0120 23:18:02.264393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:18:02 crc kubenswrapper[5030]: I0120 23:18:02.293133 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:18:02 crc kubenswrapper[5030]: E0120 23:18:02.387758 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:18:02 crc kubenswrapper[5030]: E0120 23:18:02.390701 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:18:02 crc kubenswrapper[5030]: E0120 23:18:02.392881 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:18:02 crc kubenswrapper[5030]: E0120 23:18:02.393164 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerName="nova-scheduler-scheduler" Jan 20 23:18:02 crc kubenswrapper[5030]: I0120 23:18:02.975831 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:18:03 crc kubenswrapper[5030]: I0120 23:18:03.961892 5030 generic.go:334] "Generic (PLEG): container finished" podID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerID="e04fbae0a14def52a1ca6c867992be7bd258bd9de533d50626669a6d7a7ffc2c" exitCode=0 Jan 20 23:18:03 crc kubenswrapper[5030]: I0120 23:18:03.973355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerDied","Data":"e04fbae0a14def52a1ca6c867992be7bd258bd9de533d50626669a6d7a7ffc2c"} Jan 20 23:18:03 crc kubenswrapper[5030]: I0120 23:18:03.973386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"878d7a51-fe53-42bb-982d-8a5971f9be80","Type":"ContainerDied","Data":"d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d"} Jan 20 23:18:03 crc kubenswrapper[5030]: I0120 23:18:03.973398 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0c814694b53576e7be357a466ccd86ad84180bdf8a2686713e6290e5cebf42d" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.034067 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.195250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data\") pod \"878d7a51-fe53-42bb-982d-8a5971f9be80\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.195353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9gx5\" (UniqueName: \"kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5\") pod \"878d7a51-fe53-42bb-982d-8a5971f9be80\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.195502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle\") pod \"878d7a51-fe53-42bb-982d-8a5971f9be80\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.195558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs\") pod \"878d7a51-fe53-42bb-982d-8a5971f9be80\" (UID: \"878d7a51-fe53-42bb-982d-8a5971f9be80\") " Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.196513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs" (OuterVolumeSpecName: "logs") pod "878d7a51-fe53-42bb-982d-8a5971f9be80" (UID: "878d7a51-fe53-42bb-982d-8a5971f9be80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.203006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5" (OuterVolumeSpecName: "kube-api-access-k9gx5") pod "878d7a51-fe53-42bb-982d-8a5971f9be80" (UID: "878d7a51-fe53-42bb-982d-8a5971f9be80"). InnerVolumeSpecName "kube-api-access-k9gx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.226114 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "878d7a51-fe53-42bb-982d-8a5971f9be80" (UID: "878d7a51-fe53-42bb-982d-8a5971f9be80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.240098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data" (OuterVolumeSpecName: "config-data") pod "878d7a51-fe53-42bb-982d-8a5971f9be80" (UID: "878d7a51-fe53-42bb-982d-8a5971f9be80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.297922 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.297962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9gx5\" (UniqueName: \"kubernetes.io/projected/878d7a51-fe53-42bb-982d-8a5971f9be80-kube-api-access-k9gx5\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.297978 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/878d7a51-fe53-42bb-982d-8a5971f9be80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.297991 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/878d7a51-fe53-42bb-982d-8a5971f9be80-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.367261 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.992575 5030 generic.go:334] "Generic (PLEG): container finished" podID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerID="237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" exitCode=0 Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.992698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"73a011b0-0e05-4983-a66e-c244d0a17ba2","Type":"ContainerDied","Data":"237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff"} Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993025 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993159 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv"] Jan 20 23:18:04 crc kubenswrapper[5030]: E0120 23:18:04.993601 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-log" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993612 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-log" Jan 20 23:18:04 crc kubenswrapper[5030]: E0120 23:18:04.993659 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-api" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993666 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-api" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993863 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-log" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.993879 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" containerName="nova-api-api" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.994608 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.996280 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:18:04 crc kubenswrapper[5030]: I0120 23:18:04.996827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.014678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv"] Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.037842 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.094806 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.106214 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.112838 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.117870 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.126839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.128291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.128450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.128552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.128596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtml8\" (UniqueName: \"kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.230572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.230713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.230842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.231111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpk26\" (UniqueName: \"kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.231198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.231342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtml8\" (UniqueName: \"kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.232689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.233451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.237373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.237425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.241120 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.247478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtml8\" (UniqueName: \"kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8\") pod \"nova-cell1-cell-mapping-2m6rv\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.322802 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.322896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.323297 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.324851 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.336316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.336392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.336434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.336504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpk26\" (UniqueName: \"kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.340338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.343319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.354971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.363193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpk26\" (UniqueName: \"kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26\") pod \"nova-api-0\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.438279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data\") pod \"73a011b0-0e05-4983-a66e-c244d0a17ba2\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.438395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgdzn\" (UniqueName: \"kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn\") pod \"73a011b0-0e05-4983-a66e-c244d0a17ba2\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.438479 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.439006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle\") pod \"73a011b0-0e05-4983-a66e-c244d0a17ba2\" (UID: \"73a011b0-0e05-4983-a66e-c244d0a17ba2\") " Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.443054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn" (OuterVolumeSpecName: "kube-api-access-dgdzn") pod "73a011b0-0e05-4983-a66e-c244d0a17ba2" (UID: "73a011b0-0e05-4983-a66e-c244d0a17ba2"). InnerVolumeSpecName "kube-api-access-dgdzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.467203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73a011b0-0e05-4983-a66e-c244d0a17ba2" (UID: "73a011b0-0e05-4983-a66e-c244d0a17ba2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.468254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data" (OuterVolumeSpecName: "config-data") pod "73a011b0-0e05-4983-a66e-c244d0a17ba2" (UID: "73a011b0-0e05-4983-a66e-c244d0a17ba2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.541167 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.541194 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgdzn\" (UniqueName: \"kubernetes.io/projected/73a011b0-0e05-4983-a66e-c244d0a17ba2-kube-api-access-dgdzn\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.541205 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a011b0-0e05-4983-a66e-c244d0a17ba2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.738992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv"] Jan 20 23:18:05 crc kubenswrapper[5030]: W0120 23:18:05.740922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf31a8aa_416b_44ff_b20c_d2b6d4a86a6c.slice/crio-04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666 WatchSource:0}: Error finding container 04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666: Status 404 returned error can't find the container with id 04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666 Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.883735 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:05 crc kubenswrapper[5030]: W0120 23:18:05.885700 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc700c8d5_3fe9_47d4_bd5d_e3ab9dcdd2f4.slice/crio-ec73eebe3ba289efa2bc73cc87e13c54a9361ab9bd277c22be828fd4a691c1ca WatchSource:0}: Error finding container ec73eebe3ba289efa2bc73cc87e13c54a9361ab9bd277c22be828fd4a691c1ca: Status 404 returned error can't find the container with id ec73eebe3ba289efa2bc73cc87e13c54a9361ab9bd277c22be828fd4a691c1ca Jan 20 23:18:05 crc kubenswrapper[5030]: I0120 23:18:05.978742 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="878d7a51-fe53-42bb-982d-8a5971f9be80" path="/var/lib/kubelet/pods/878d7a51-fe53-42bb-982d-8a5971f9be80/volumes" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.011820 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"73a011b0-0e05-4983-a66e-c244d0a17ba2","Type":"ContainerDied","Data":"e1372681d6bbd6f87067a32482a31401ce0358ea67fa49afac45a32511c087b6"} Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.011870 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.011890 5030 scope.go:117] "RemoveContainer" containerID="237690b55b605a9bc54570366454b4344a9c2730228ac46949a77661cbd4bfff" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.012969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" event={"ID":"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c","Type":"ContainerStarted","Data":"04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666"} Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.014538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerStarted","Data":"ec73eebe3ba289efa2bc73cc87e13c54a9361ab9bd277c22be828fd4a691c1ca"} Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.061191 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.081764 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.107456 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:06 crc kubenswrapper[5030]: E0120 23:18:06.108003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerName="nova-scheduler-scheduler" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.108032 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerName="nova-scheduler-scheduler" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.108304 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" containerName="nova-scheduler-scheduler" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.109146 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.111218 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.116725 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.252749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.253076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.253098 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxrb\" (UniqueName: \"kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.354925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.355067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.355100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfxrb\" (UniqueName: \"kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.361685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.361706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.371355 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfxrb\" (UniqueName: \"kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb\") pod \"nova-scheduler-0\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.428828 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.962740 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:18:06 crc kubenswrapper[5030]: E0120 23:18:06.963340 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:18:06 crc kubenswrapper[5030]: I0120 23:18:06.963749 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.013453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.033171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" event={"ID":"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c","Type":"ContainerStarted","Data":"28ed1595451c9dbd0de380e299bb4915d4ae4edad80751f4c795f728b868d5b4"} Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.038361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerStarted","Data":"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712"} Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.038402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerStarted","Data":"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f"} Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.043505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4b2b1e84-6e97-4755-9601-a76801950ef5","Type":"ContainerStarted","Data":"eec6efc54f58304f0d4035dde3e6e1d57ef6c27e5b0909c3b9bb94eb33496a7a"} Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.090943 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.090925402 podStartE2EDuration="2.090925402s" podCreationTimestamp="2026-01-20 23:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:07.087607952 +0000 UTC m=+2559.407868250" watchObservedRunningTime="2026-01-20 23:18:07.090925402 +0000 UTC m=+2559.411185690" Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.096927 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" 
podStartSLOduration=3.096915925 podStartE2EDuration="3.096915925s" podCreationTimestamp="2026-01-20 23:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:07.073166796 +0000 UTC m=+2559.393427084" watchObservedRunningTime="2026-01-20 23:18:07.096915925 +0000 UTC m=+2559.417176213" Jan 20 23:18:07 crc kubenswrapper[5030]: I0120 23:18:07.986530 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a011b0-0e05-4983-a66e-c244d0a17ba2" path="/var/lib/kubelet/pods/73a011b0-0e05-4983-a66e-c244d0a17ba2/volumes" Jan 20 23:18:08 crc kubenswrapper[5030]: I0120 23:18:08.057684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4b2b1e84-6e97-4755-9601-a76801950ef5","Type":"ContainerStarted","Data":"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760"} Jan 20 23:18:08 crc kubenswrapper[5030]: I0120 23:18:08.086041 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.085579714 podStartE2EDuration="2.085579714s" podCreationTimestamp="2026-01-20 23:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:08.082391648 +0000 UTC m=+2560.402651966" watchObservedRunningTime="2026-01-20 23:18:08.085579714 +0000 UTC m=+2560.405840042" Jan 20 23:18:10 crc kubenswrapper[5030]: I0120 23:18:10.320963 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:10 crc kubenswrapper[5030]: I0120 23:18:10.321293 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:10 crc kubenswrapper[5030]: I0120 23:18:10.473205 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:10 crc kubenswrapper[5030]: I0120 23:18:10.473732 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" containerName="kube-state-metrics" containerID="cri-o://0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02" gracePeriod=30 Jan 20 23:18:10 crc kubenswrapper[5030]: I0120 23:18:10.986447 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.063974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5f8l\" (UniqueName: \"kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l\") pod \"4147eacc-6d45-4c82-a803-fe339fa6ae6f\" (UID: \"4147eacc-6d45-4c82-a803-fe339fa6ae6f\") " Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.075987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l" (OuterVolumeSpecName: "kube-api-access-q5f8l") pod "4147eacc-6d45-4c82-a803-fe339fa6ae6f" (UID: "4147eacc-6d45-4c82-a803-fe339fa6ae6f"). InnerVolumeSpecName "kube-api-access-q5f8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.099299 5030 generic.go:334] "Generic (PLEG): container finished" podID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" containerID="0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02" exitCode=2 Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.099353 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4147eacc-6d45-4c82-a803-fe339fa6ae6f","Type":"ContainerDied","Data":"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02"} Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.099379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4147eacc-6d45-4c82-a803-fe339fa6ae6f","Type":"ContainerDied","Data":"9aa42067b73c5428ecbf3ae60386d8e35417c37e2a9c6f37bf6fbe92dae11307"} Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.099396 5030 scope.go:117] "RemoveContainer" containerID="0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.099501 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.114490 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" containerID="28ed1595451c9dbd0de380e299bb4915d4ae4edad80751f4c795f728b868d5b4" exitCode=0 Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.114533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" event={"ID":"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c","Type":"ContainerDied","Data":"28ed1595451c9dbd0de380e299bb4915d4ae4edad80751f4c795f728b868d5b4"} Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.162040 5030 scope.go:117] "RemoveContainer" containerID="0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02" Jan 20 23:18:11 crc kubenswrapper[5030]: E0120 23:18:11.162552 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02\": container with ID starting with 0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02 not found: ID does not exist" containerID="0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.162584 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02"} err="failed to get container status \"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02\": rpc error: code = NotFound desc = could not find container \"0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02\": container with ID starting with 0fbddf80c87828b1d9017e886e1447381703fca3d184f446eab5cc0614376b02 not found: ID does not exist" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.165842 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5f8l\" (UniqueName: \"kubernetes.io/projected/4147eacc-6d45-4c82-a803-fe339fa6ae6f-kube-api-access-q5f8l\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.183946 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.195962 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.204444 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:11 crc kubenswrapper[5030]: E0120 23:18:11.204895 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" containerName="kube-state-metrics" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.204913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" containerName="kube-state-metrics" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.205102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" containerName="kube-state-metrics" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.205795 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.208970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.221438 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.225495 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.335758 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.336014 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.368862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.368969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxmq\" (UniqueName: \"kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.369014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.369067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.429202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.471209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxmq\" (UniqueName: \"kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.471264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.471321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.471416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.474856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.475181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.475532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") 
" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.488244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxmq\" (UniqueName: \"kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq\") pod \"kube-state-metrics-0\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.535132 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.974512 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4147eacc-6d45-4c82-a803-fe339fa6ae6f" path="/var/lib/kubelet/pods/4147eacc-6d45-4c82-a803-fe339fa6ae6f/volumes" Jan 20 23:18:11 crc kubenswrapper[5030]: W0120 23:18:11.987147 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27f2c51d_7595_4b7f_bdfb_3e50f0a6c657.slice/crio-108c9acd81ab4f0728753e29bcf10ec789e4a9075dd58deafcfa9b957458cbfd WatchSource:0}: Error finding container 108c9acd81ab4f0728753e29bcf10ec789e4a9075dd58deafcfa9b957458cbfd: Status 404 returned error can't find the container with id 108c9acd81ab4f0728753e29bcf10ec789e4a9075dd58deafcfa9b957458cbfd Jan 20 23:18:11 crc kubenswrapper[5030]: I0120 23:18:11.991909 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.126824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657","Type":"ContainerStarted","Data":"108c9acd81ab4f0728753e29bcf10ec789e4a9075dd58deafcfa9b957458cbfd"} Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.359131 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.359681 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-central-agent" containerID="cri-o://0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f" gracePeriod=30 Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.359822 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-notification-agent" containerID="cri-o://2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f" gracePeriod=30 Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.359865 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="sg-core" containerID="cri-o://c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55" gracePeriod=30 Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.360994 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="proxy-httpd" containerID="cri-o://5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255" gracePeriod=30 Jan 20 23:18:12 crc 
kubenswrapper[5030]: I0120 23:18:12.566120 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.697024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts\") pod \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.697285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data\") pod \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.697386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle\") pod \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.697539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtml8\" (UniqueName: \"kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8\") pod \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\" (UID: \"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c\") " Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.701641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8" (OuterVolumeSpecName: "kube-api-access-vtml8") pod "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" (UID: "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c"). InnerVolumeSpecName "kube-api-access-vtml8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.718211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts" (OuterVolumeSpecName: "scripts") pod "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" (UID: "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.724513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" (UID: "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.726117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data" (OuterVolumeSpecName: "config-data") pod "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" (UID: "bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.799417 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtml8\" (UniqueName: \"kubernetes.io/projected/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-kube-api-access-vtml8\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.799450 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.799459 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:12 crc kubenswrapper[5030]: I0120 23:18:12.799469 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.138569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" event={"ID":"bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c","Type":"ContainerDied","Data":"04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666"} Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.138661 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e1191a21133306372d85a45574bc71d37015f62c33249ed44ecc3ecadab666" Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.139051 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv" Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.141873 5030 generic.go:334] "Generic (PLEG): container finished" podID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerID="5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255" exitCode=0 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.141916 5030 generic.go:334] "Generic (PLEG): container finished" podID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerID="c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55" exitCode=2 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.141932 5030 generic.go:334] "Generic (PLEG): container finished" podID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerID="0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f" exitCode=0 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.141930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerDied","Data":"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255"} Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.141989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerDied","Data":"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55"} Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.142009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerDied","Data":"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f"} Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.144368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657","Type":"ContainerStarted","Data":"9a0f15aa4ffcb12861a489b33c1e7515d93c0502d191361b6c6737274054b795"} Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.145356 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.184306 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.7913425520000001 podStartE2EDuration="2.184287485s" podCreationTimestamp="2026-01-20 23:18:11 +0000 UTC" firstStartedPulling="2026-01-20 23:18:11.989535814 +0000 UTC m=+2564.309796102" lastFinishedPulling="2026-01-20 23:18:12.382480747 +0000 UTC m=+2564.702741035" observedRunningTime="2026-01-20 23:18:13.162460871 +0000 UTC m=+2565.482721199" watchObservedRunningTime="2026-01-20 23:18:13.184287485 +0000 UTC m=+2565.504547783" Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.333664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.333889 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="4b2b1e84-6e97-4755-9601-a76801950ef5" containerName="nova-scheduler-scheduler" containerID="cri-o://3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760" gracePeriod=30 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.345839 5030 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.346144 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-log" containerID="cri-o://878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" gracePeriod=30 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.346195 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-api" containerID="cri-o://6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" gracePeriod=30 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.379898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.380104 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-log" containerID="cri-o://d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85" gracePeriod=30 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.380470 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-metadata" containerID="cri-o://a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647" gracePeriod=30 Jan 20 23:18:13 crc kubenswrapper[5030]: I0120 23:18:13.922801 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.031952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle\") pod \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.032021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpk26\" (UniqueName: \"kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26\") pod \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.032158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs\") pod \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.032271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data\") pod \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\" (UID: \"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.032675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs" (OuterVolumeSpecName: "logs") pod "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" (UID: "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.032981 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.038387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26" (OuterVolumeSpecName: "kube-api-access-kpk26") pod "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" (UID: "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4"). InnerVolumeSpecName "kube-api-access-kpk26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.062588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data" (OuterVolumeSpecName: "config-data") pod "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" (UID: "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.069530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" (UID: "c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.135987 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.136029 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.136046 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpk26\" (UniqueName: \"kubernetes.io/projected/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4-kube-api-access-kpk26\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.155223 5030 generic.go:334] "Generic (PLEG): container finished" podID="f80d5571-57c5-4fef-82df-13411479cb44" containerID="d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85" exitCode=143 Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.155268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerDied","Data":"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85"} Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157462 5030 generic.go:334] "Generic (PLEG): container finished" podID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerID="6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" exitCode=0 Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157492 5030 generic.go:334] "Generic (PLEG): container finished" podID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerID="878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" exitCode=143 Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157527 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerDied","Data":"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712"} Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerDied","Data":"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f"} Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4","Type":"ContainerDied","Data":"ec73eebe3ba289efa2bc73cc87e13c54a9361ab9bd277c22be828fd4a691c1ca"} Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.157702 5030 scope.go:117] "RemoveContainer" containerID="6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.234683 5030 scope.go:117] "RemoveContainer" containerID="878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.234853 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.252438 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.261161 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:14 crc kubenswrapper[5030]: E0120 23:18:14.262003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" containerName="nova-manage" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" containerName="nova-manage" Jan 20 23:18:14 crc kubenswrapper[5030]: E0120 23:18:14.262118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-api" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-api" Jan 20 23:18:14 crc kubenswrapper[5030]: E0120 23:18:14.262158 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-log" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262168 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-log" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262477 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" containerName="nova-api-log" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262499 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" containerName="nova-manage" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.262529 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" 
containerName="nova-api-api" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.263870 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.266257 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.268016 5030 scope.go:117] "RemoveContainer" containerID="6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" Jan 20 23:18:14 crc kubenswrapper[5030]: E0120 23:18:14.268847 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712\": container with ID starting with 6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712 not found: ID does not exist" containerID="6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.269111 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712"} err="failed to get container status \"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712\": rpc error: code = NotFound desc = could not find container \"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712\": container with ID starting with 6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712 not found: ID does not exist" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.269274 5030 scope.go:117] "RemoveContainer" containerID="878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" Jan 20 23:18:14 crc kubenswrapper[5030]: E0120 23:18:14.269655 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f\": container with ID starting with 878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f not found: ID does not exist" containerID="878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.269678 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f"} err="failed to get container status \"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f\": rpc error: code = NotFound desc = could not find container \"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f\": container with ID starting with 878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f not found: ID does not exist" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.269692 5030 scope.go:117] "RemoveContainer" containerID="6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.270127 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712"} err="failed to get container status \"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712\": rpc error: code = NotFound desc = could not find container \"6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712\": container with ID starting with 
6e6ba7233006b9f0d4ca345943faa46b64afd3e58a3120acf6fb5c4b7c90f712 not found: ID does not exist" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.270182 5030 scope.go:117] "RemoveContainer" containerID="878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.270463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f"} err="failed to get container status \"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f\": rpc error: code = NotFound desc = could not find container \"878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f\": container with ID starting with 878a62d1d026864817dd9af296998d2b827570bebb3e2ba2e82dda2c6d03c72f not found: ID does not exist" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.271755 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.341343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.341612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzdgk\" (UniqueName: \"kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.341684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.341782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.447940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.448115 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.448193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzdgk\" (UniqueName: \"kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk\") pod 
\"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.448504 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.448559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.451758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.452392 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.469517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzdgk\" (UniqueName: \"kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk\") pod \"nova-api-0\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.547012 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.588445 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.651046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle\") pod \"4b2b1e84-6e97-4755-9601-a76801950ef5\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.651263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data\") pod \"4b2b1e84-6e97-4755-9601-a76801950ef5\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.651450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfxrb\" (UniqueName: \"kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb\") pod \"4b2b1e84-6e97-4755-9601-a76801950ef5\" (UID: \"4b2b1e84-6e97-4755-9601-a76801950ef5\") " Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.654922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb" (OuterVolumeSpecName: "kube-api-access-rfxrb") pod "4b2b1e84-6e97-4755-9601-a76801950ef5" (UID: "4b2b1e84-6e97-4755-9601-a76801950ef5"). InnerVolumeSpecName "kube-api-access-rfxrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.682455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data" (OuterVolumeSpecName: "config-data") pod "4b2b1e84-6e97-4755-9601-a76801950ef5" (UID: "4b2b1e84-6e97-4755-9601-a76801950ef5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.687156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b2b1e84-6e97-4755-9601-a76801950ef5" (UID: "4b2b1e84-6e97-4755-9601-a76801950ef5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.753191 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfxrb\" (UniqueName: \"kubernetes.io/projected/4b2b1e84-6e97-4755-9601-a76801950ef5-kube-api-access-rfxrb\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.753477 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:14 crc kubenswrapper[5030]: I0120 23:18:14.753488 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b2b1e84-6e97-4755-9601-a76801950ef5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.035201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.161433 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.169465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerStarted","Data":"ad96590fad30a237c7e4eff5214a139ad4cd2485ae37f10a66cbd974f870e9d6"} Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.172829 5030 generic.go:334] "Generic (PLEG): container finished" podID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerID="2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f" exitCode=0 Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.172883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerDied","Data":"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f"} Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.172916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2","Type":"ContainerDied","Data":"d07e79312dd05be9a84c4b9ab2e0cdc68cb4e3cc6fea228809208806610c009c"} Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.173087 5030 scope.go:117] "RemoveContainer" containerID="5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.173220 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.179711 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b2b1e84-6e97-4755-9601-a76801950ef5" containerID="3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760" exitCode=0 Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.180749 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.182669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4b2b1e84-6e97-4755-9601-a76801950ef5","Type":"ContainerDied","Data":"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760"} Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.182706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"4b2b1e84-6e97-4755-9601-a76801950ef5","Type":"ContainerDied","Data":"eec6efc54f58304f0d4035dde3e6e1d57ef6c27e5b0909c3b9bb94eb33496a7a"} Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.205171 5030 scope.go:117] "RemoveContainer" containerID="c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.230832 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.251666 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.264490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.264599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.264746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.264800 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.265463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxjw\" (UniqueName: \"kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.265520 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.265546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data\") pod \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\" (UID: \"4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2\") " Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.267175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.267339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.271387 5030 scope.go:117] "RemoveContainer" containerID="2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.273705 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.274399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="proxy-httpd" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274429 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="proxy-httpd" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.274457 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b1e84-6e97-4755-9601-a76801950ef5" containerName="nova-scheduler-scheduler" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274467 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b1e84-6e97-4755-9601-a76801950ef5" containerName="nova-scheduler-scheduler" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.274486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-central-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-central-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.274512 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="sg-core" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274520 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="sg-core" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.274533 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-notification-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274542 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-notification-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274807 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="sg-core" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274835 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-notification-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274849 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="proxy-httpd" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" containerName="ceilometer-central-agent" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.274911 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2b1e84-6e97-4755-9601-a76801950ef5" containerName="nova-scheduler-scheduler" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.276897 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.280016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.289904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw" (OuterVolumeSpecName: "kube-api-access-6qxjw") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "kube-api-access-6qxjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.296004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts" (OuterVolumeSpecName: "scripts") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.307566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.308031 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.351451 5030 scope.go:117] "RemoveContainer" containerID="0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqfkk\" (UniqueName: \"kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367453 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367463 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxjw\" (UniqueName: \"kubernetes.io/projected/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-kube-api-access-6qxjw\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367472 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367481 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.367560 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.371228 5030 scope.go:117] "RemoveContainer" containerID="5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.371746 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255\": container with ID starting with 5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255 not found: ID does not exist" containerID="5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.371782 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255"} err="failed to get container status \"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255\": rpc error: code = NotFound desc = could not find container \"5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255\": container with ID starting with 5bb3a1baca5a5e3b5654b4e0bf660ec981ab5c5ff139ba5b8bd7c7cc1a350255 not found: ID does not exist" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.371849 5030 scope.go:117] "RemoveContainer" containerID="c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.372186 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55\": container with ID starting with c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55 not found: ID does not exist" containerID="c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372260 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55"} err="failed to get container status \"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55\": rpc error: code = NotFound desc = could not find container \"c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55\": container with ID starting with c7472721bad1245b286f02302917d468dcadbca30bdd149bd656849d03762d55 not found: ID does not exist" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372287 5030 scope.go:117] "RemoveContainer" containerID="2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.372558 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f\": container with ID starting with 2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f not found: ID does not exist" containerID="2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372582 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f"} err="failed to get container status \"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f\": rpc error: code = NotFound desc = could not find container \"2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f\": container with ID starting with 2b67334f3a38bcb0b4c3c6b6298b5fbfc294085c5a2a7cd3f00f8a73ee3de45f not found: ID does not exist" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372597 5030 scope.go:117] "RemoveContainer" containerID="0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.372927 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f\": container with ID starting with 0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f not found: ID does not exist" 
containerID="0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372950 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f"} err="failed to get container status \"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f\": rpc error: code = NotFound desc = could not find container \"0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f\": container with ID starting with 0e78a7035183ce85c56fa66a0a9d17074f673fa9b629ef74ce70dc73e7753c1f not found: ID does not exist" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.372964 5030 scope.go:117] "RemoveContainer" containerID="3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.374061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.390579 5030 scope.go:117] "RemoveContainer" containerID="3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760" Jan 20 23:18:15 crc kubenswrapper[5030]: E0120 23:18:15.391115 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760\": container with ID starting with 3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760 not found: ID does not exist" containerID="3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.391227 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760"} err="failed to get container status \"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760\": rpc error: code = NotFound desc = could not find container \"3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760\": container with ID starting with 3ad74e5d8af00be1d836e7e3ec5dd41e7cdc139463c7c45d1c4effbf9180b760 not found: ID does not exist" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.410497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data" (OuterVolumeSpecName: "config-data") pod "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" (UID: "4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.468650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.469761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.469867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqfkk\" (UniqueName: \"kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.470048 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.470110 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.471991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.475699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.489138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqfkk\" (UniqueName: \"kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk\") pod \"nova-scheduler-0\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.507359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.519174 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.529331 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.532548 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.535287 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.536019 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.538254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.539271 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.643020 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4bs\" (UniqueName: \"kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.673996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.674037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.775730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4bs\" (UniqueName: \"kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.775971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.775999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.776955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.777264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.777336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.777591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.777766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.777845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.778464 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.780769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.781533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.782294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.782921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.783510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.801162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4bs\" (UniqueName: \"kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs\") pod \"ceilometer-0\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.849396 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.981989 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2" path="/var/lib/kubelet/pods/4574c6ab-54d4-44b5-8adb-2ecb2d0dc8c2/volumes" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.984405 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2b1e84-6e97-4755-9601-a76801950ef5" path="/var/lib/kubelet/pods/4b2b1e84-6e97-4755-9601-a76801950ef5/volumes" Jan 20 23:18:15 crc kubenswrapper[5030]: I0120 23:18:15.985025 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4" path="/var/lib/kubelet/pods/c700c8d5-3fe9-47d4-bd5d-e3ab9dcdd2f4/volumes" Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.079228 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:18:16 crc kubenswrapper[5030]: W0120 23:18:16.086611 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8de9d1d_d7de_4313_9b7b_0f07992ede1b.slice/crio-bd99e9f5232c30cbe40654c3425630656b4efdeb865aa25cbf04119baede8ed7 WatchSource:0}: Error finding container bd99e9f5232c30cbe40654c3425630656b4efdeb865aa25cbf04119baede8ed7: Status 404 returned error can't find the container with id bd99e9f5232c30cbe40654c3425630656b4efdeb865aa25cbf04119baede8ed7 Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.192352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e8de9d1d-d7de-4313-9b7b-0f07992ede1b","Type":"ContainerStarted","Data":"bd99e9f5232c30cbe40654c3425630656b4efdeb865aa25cbf04119baede8ed7"} Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.194426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerStarted","Data":"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5"} Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.194468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerStarted","Data":"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac"} Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.220363 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.220342842 podStartE2EDuration="2.220342842s" podCreationTimestamp="2026-01-20 23:18:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:16.215424813 +0000 UTC m=+2568.535685191" watchObservedRunningTime="2026-01-20 23:18:16.220342842 +0000 UTC m=+2568.540603140" Jan 20 23:18:16 crc kubenswrapper[5030]: W0120 23:18:16.316370 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod057cf7ce_3b6b_410a_a7b7_4c7abaa29400.slice/crio-bc0fc4adf772e70f87475cb5cc0dce622d8456e22478d749d1099a73840bc901 WatchSource:0}: Error finding container bc0fc4adf772e70f87475cb5cc0dce622d8456e22478d749d1099a73840bc901: Status 404 returned error can't find the container with id 
bc0fc4adf772e70f87475cb5cc0dce622d8456e22478d749d1099a73840bc901 Jan 20 23:18:16 crc kubenswrapper[5030]: I0120 23:18:16.317201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.017769 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.115346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs\") pod \"f80d5571-57c5-4fef-82df-13411479cb44\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.115583 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs\") pod \"f80d5571-57c5-4fef-82df-13411479cb44\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.116466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle\") pod \"f80d5571-57c5-4fef-82df-13411479cb44\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.116557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46j8d\" (UniqueName: \"kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d\") pod \"f80d5571-57c5-4fef-82df-13411479cb44\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.115929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs" (OuterVolumeSpecName: "logs") pod "f80d5571-57c5-4fef-82df-13411479cb44" (UID: "f80d5571-57c5-4fef-82df-13411479cb44"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.116714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data\") pod \"f80d5571-57c5-4fef-82df-13411479cb44\" (UID: \"f80d5571-57c5-4fef-82df-13411479cb44\") " Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.118240 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f80d5571-57c5-4fef-82df-13411479cb44-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.121479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d" (OuterVolumeSpecName: "kube-api-access-46j8d") pod "f80d5571-57c5-4fef-82df-13411479cb44" (UID: "f80d5571-57c5-4fef-82df-13411479cb44"). InnerVolumeSpecName "kube-api-access-46j8d". 
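The pod_startup_latency_tracker entry for nova-api-0 above reports podStartE2EDuration="2.220342842s", which is exactly the quoted watchObservedRunningTime minus podCreationTimestamp (23:18:16.220342842 − 23:18:14); podStartSLOduration matches it here because both image-pull timestamps are zeroed. For ceilometer-0 further down, the SLO figure is smaller than the E2E figure by roughly the firstStartedPulling→lastFinishedPulling window. A small sketch that replays the subtraction (the layout string is Go's reference time; the timestamps are copied from the entry):

```go
// startup_e2e.go — replay the podStartE2EDuration arithmetic from the
// nova-api-0 "Observed pod startup duration" entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go reference time
	created, err := time.Parse(layout, "2026-01-20 23:18:14 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2026-01-20 23:18:16.220342842 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(running.Sub(created)) // 2.220342842s, matching podStartE2EDuration
}
```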
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.148109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data" (OuterVolumeSpecName: "config-data") pod "f80d5571-57c5-4fef-82df-13411479cb44" (UID: "f80d5571-57c5-4fef-82df-13411479cb44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.149197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f80d5571-57c5-4fef-82df-13411479cb44" (UID: "f80d5571-57c5-4fef-82df-13411479cb44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.178255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f80d5571-57c5-4fef-82df-13411479cb44" (UID: "f80d5571-57c5-4fef-82df-13411479cb44"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.212549 5030 generic.go:334] "Generic (PLEG): container finished" podID="f80d5571-57c5-4fef-82df-13411479cb44" containerID="a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647" exitCode=0 Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.212638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerDied","Data":"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647"} Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.212669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f80d5571-57c5-4fef-82df-13411479cb44","Type":"ContainerDied","Data":"6b007a96e5de9399b28a1e05154ad7bc2d0c5a1012c32098be5e5e9c579560e6"} Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.212689 5030 scope.go:117] "RemoveContainer" containerID="a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.212792 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.227957 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46j8d\" (UniqueName: \"kubernetes.io/projected/f80d5571-57c5-4fef-82df-13411479cb44-kube-api-access-46j8d\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.227991 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.228007 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.228021 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80d5571-57c5-4fef-82df-13411479cb44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.232215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e8de9d1d-d7de-4313-9b7b-0f07992ede1b","Type":"ContainerStarted","Data":"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c"} Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.235910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerStarted","Data":"75f338bb0e4ed92531a6b0facd675d48c3ba45af1129daa07c19b8dea4db37af"} Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.235955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerStarted","Data":"bc0fc4adf772e70f87475cb5cc0dce622d8456e22478d749d1099a73840bc901"} Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.249695 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.249669496 podStartE2EDuration="2.249669496s" podCreationTimestamp="2026-01-20 23:18:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:17.249100192 +0000 UTC m=+2569.569360520" watchObservedRunningTime="2026-01-20 23:18:17.249669496 +0000 UTC m=+2569.569929814" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.265573 5030 scope.go:117] "RemoveContainer" containerID="d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.284232 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.309808 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.310265 5030 scope.go:117] "RemoveContainer" containerID="a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647" Jan 20 23:18:17 crc kubenswrapper[5030]: E0120 23:18:17.311914 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647\": container with ID starting with a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647 not found: ID does not exist" containerID="a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.311949 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647"} err="failed to get container status \"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647\": rpc error: code = NotFound desc = could not find container \"a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647\": container with ID starting with a22e11a9b9529094badc689ebfe2f258bac7218ac2b6b75586f9176f2c16a647 not found: ID does not exist" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.311976 5030 scope.go:117] "RemoveContainer" containerID="d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85" Jan 20 23:18:17 crc kubenswrapper[5030]: E0120 23:18:17.312395 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85\": container with ID starting with d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85 not found: ID does not exist" containerID="d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.312420 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85"} err="failed to get container status \"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85\": rpc error: code = NotFound desc = could not find container \"d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85\": container with ID starting with d80240b3201497ba480218a5ad2bea99b4075f963c41c08a3bbbea4747fcdf85 not found: ID does not exist" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.321404 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:17 crc kubenswrapper[5030]: E0120 23:18:17.322000 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-metadata" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.322016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-metadata" Jan 20 23:18:17 crc kubenswrapper[5030]: E0120 23:18:17.322053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-log" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.322061 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-log" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.322213 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-metadata" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.322234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80d5571-57c5-4fef-82df-13411479cb44" containerName="nova-metadata-log" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 
23:18:17.323215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.328256 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.331781 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.332581 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.431528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.431600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95j9\" (UniqueName: \"kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.431660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.431810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.431878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.533447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.533510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95j9\" (UniqueName: \"kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.533543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.533573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.533593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.535971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.540353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.541058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.543827 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.554379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95j9\" (UniqueName: \"kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9\") pod \"nova-metadata-0\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:17 crc kubenswrapper[5030]: I0120 23:18:17.665120 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:18 crc kubenswrapper[5030]: I0120 23:18:18.003112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80d5571-57c5-4fef-82df-13411479cb44" path="/var/lib/kubelet/pods/f80d5571-57c5-4fef-82df-13411479cb44/volumes" Jan 20 23:18:18 crc kubenswrapper[5030]: I0120 23:18:18.147338 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:18:18 crc kubenswrapper[5030]: W0120 23:18:18.152883 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3fc979_4f14_4578_9cae_d44c44ba3e8c.slice/crio-4fa48c692de39440b132c3e5aba941e2cdfd78e07ec01fc0b76b37802afc6b7f WatchSource:0}: Error finding container 4fa48c692de39440b132c3e5aba941e2cdfd78e07ec01fc0b76b37802afc6b7f: Status 404 returned error can't find the container with id 4fa48c692de39440b132c3e5aba941e2cdfd78e07ec01fc0b76b37802afc6b7f Jan 20 23:18:18 crc kubenswrapper[5030]: I0120 23:18:18.245325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerStarted","Data":"80f20f3e14fd30974a6862ee1199368c15661cf9f73c9489fc0941e110a5e9a9"} Jan 20 23:18:18 crc kubenswrapper[5030]: I0120 23:18:18.246307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerStarted","Data":"4fa48c692de39440b132c3e5aba941e2cdfd78e07ec01fc0b76b37802afc6b7f"} Jan 20 23:18:19 crc kubenswrapper[5030]: I0120 23:18:19.257557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerStarted","Data":"6937c695cff1667d0d62ce1c83b0152b926dae8c296b22eca0b03bd5169765f4"} Jan 20 23:18:19 crc kubenswrapper[5030]: I0120 23:18:19.258946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerStarted","Data":"4acdd1af318211b5450f634e63d8de0e0421722ea1b31f9eda4efdab1cfbcc50"} Jan 20 23:18:19 crc kubenswrapper[5030]: I0120 23:18:19.259919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerStarted","Data":"3ef15dff2ccad333aa1ac506611e8c97a406d3ce2a293ca656e5a559a26db6a9"} Jan 20 23:18:19 crc kubenswrapper[5030]: I0120 23:18:19.279644 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.2796071749999998 podStartE2EDuration="2.279607175s" podCreationTimestamp="2026-01-20 23:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:19.276774877 +0000 UTC m=+2571.597035165" watchObservedRunningTime="2026-01-20 23:18:19.279607175 +0000 UTC m=+2571.599867473" Jan 20 23:18:19 crc kubenswrapper[5030]: I0120 23:18:19.971826 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:18:19 crc kubenswrapper[5030]: E0120 23:18:19.972385 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:18:20 crc kubenswrapper[5030]: I0120 23:18:20.276284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerStarted","Data":"01cd66eb65d88e91e692d54445f9da60eb95ccc70ff99e18625b3af0a40809a5"} Jan 20 23:18:20 crc kubenswrapper[5030]: I0120 23:18:20.276578 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:20 crc kubenswrapper[5030]: I0120 23:18:20.297505 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.919284524 podStartE2EDuration="5.297488595s" podCreationTimestamp="2026-01-20 23:18:15 +0000 UTC" firstStartedPulling="2026-01-20 23:18:16.320796701 +0000 UTC m=+2568.641057009" lastFinishedPulling="2026-01-20 23:18:19.699000792 +0000 UTC m=+2572.019261080" observedRunningTime="2026-01-20 23:18:20.293859147 +0000 UTC m=+2572.614119445" watchObservedRunningTime="2026-01-20 23:18:20.297488595 +0000 UTC m=+2572.617748873" Jan 20 23:18:20 crc kubenswrapper[5030]: I0120 23:18:20.644057 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:21 crc kubenswrapper[5030]: I0120 23:18:21.559154 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:18:22 crc kubenswrapper[5030]: I0120 23:18:22.665489 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:22 crc kubenswrapper[5030]: I0120 23:18:22.666399 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:24 crc kubenswrapper[5030]: I0120 23:18:24.589827 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:24 crc kubenswrapper[5030]: I0120 23:18:24.590185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:25 crc kubenswrapper[5030]: I0120 23:18:25.643571 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:25 crc kubenswrapper[5030]: I0120 23:18:25.671811 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:25 crc kubenswrapper[5030]: I0120 23:18:25.671855 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:25 crc kubenswrapper[5030]: I0120 23:18:25.677477 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:26 crc kubenswrapper[5030]: I0120 23:18:26.497154 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:18:27 crc kubenswrapper[5030]: I0120 23:18:27.665934 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:27 crc kubenswrapper[5030]: I0120 23:18:27.666027 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:28 crc kubenswrapper[5030]: I0120 23:18:28.679744 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:28 crc kubenswrapper[5030]: I0120 23:18:28.679814 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:33 crc kubenswrapper[5030]: I0120 23:18:33.964006 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:18:33 crc kubenswrapper[5030]: E0120 23:18:33.965031 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:18:34 crc kubenswrapper[5030]: I0120 23:18:34.593405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:34 crc kubenswrapper[5030]: I0120 23:18:34.594788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:34 crc kubenswrapper[5030]: I0120 23:18:34.597103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:34 crc kubenswrapper[5030]: I0120 23:18:34.600063 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:35 crc kubenswrapper[5030]: I0120 23:18:35.583354 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:35 crc kubenswrapper[5030]: I0120 23:18:35.589264 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.260103 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.262250 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.279745 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.315929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.316063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.316096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw69r\" (UniqueName: \"kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.354613 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.355017 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="sg-core" containerID="cri-o://3ef15dff2ccad333aa1ac506611e8c97a406d3ce2a293ca656e5a559a26db6a9" gracePeriod=30 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.355221 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="proxy-httpd" containerID="cri-o://01cd66eb65d88e91e692d54445f9da60eb95ccc70ff99e18625b3af0a40809a5" gracePeriod=30 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.355307 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-notification-agent" containerID="cri-o://80f20f3e14fd30974a6862ee1199368c15661cf9f73c9489fc0941e110a5e9a9" gracePeriod=30 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.357296 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-central-agent" containerID="cri-o://75f338bb0e4ed92531a6b0facd675d48c3ba45af1129daa07c19b8dea4db37af" gracePeriod=30 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.373335 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.418666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nw69r\" (UniqueName: \"kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.418837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.418997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.419746 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.419818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.450663 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw69r\" (UniqueName: \"kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r\") pod \"redhat-operators-fmxgh\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.582690 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.618741 5030 generic.go:334] "Generic (PLEG): container finished" podID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerID="01cd66eb65d88e91e692d54445f9da60eb95ccc70ff99e18625b3af0a40809a5" exitCode=0 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.618770 5030 generic.go:334] "Generic (PLEG): container finished" podID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerID="3ef15dff2ccad333aa1ac506611e8c97a406d3ce2a293ca656e5a559a26db6a9" exitCode=2 Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.619568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerDied","Data":"01cd66eb65d88e91e692d54445f9da60eb95ccc70ff99e18625b3af0a40809a5"} Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.619597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerDied","Data":"3ef15dff2ccad333aa1ac506611e8c97a406d3ce2a293ca656e5a559a26db6a9"} Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.674300 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.680068 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:37 crc kubenswrapper[5030]: I0120 23:18:37.682506 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:38 crc kubenswrapper[5030]: W0120 23:18:38.015898 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e78595_7120_46bf_8096_25d98b7ce2a9.slice/crio-a48797815bcf4a5f5a467a264417bd76fa8d79fc23d15c7fc8117115d56349e7 WatchSource:0}: Error finding container a48797815bcf4a5f5a467a264417bd76fa8d79fc23d15c7fc8117115d56349e7: Status 404 returned error can't find the container with id a48797815bcf4a5f5a467a264417bd76fa8d79fc23d15c7fc8117115d56349e7 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.017423 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.367034 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.628756 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerID="8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45" exitCode=0 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.628819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerDied","Data":"8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45"} Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.628870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerStarted","Data":"a48797815bcf4a5f5a467a264417bd76fa8d79fc23d15c7fc8117115d56349e7"} Jan 20 23:18:38 crc kubenswrapper[5030]: 
I0120 23:18:38.631729 5030 generic.go:334] "Generic (PLEG): container finished" podID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerID="80f20f3e14fd30974a6862ee1199368c15661cf9f73c9489fc0941e110a5e9a9" exitCode=0 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.631747 5030 generic.go:334] "Generic (PLEG): container finished" podID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerID="75f338bb0e4ed92531a6b0facd675d48c3ba45af1129daa07c19b8dea4db37af" exitCode=0 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.631819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerDied","Data":"80f20f3e14fd30974a6862ee1199368c15661cf9f73c9489fc0941e110a5e9a9"} Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.631862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerDied","Data":"75f338bb0e4ed92531a6b0facd675d48c3ba45af1129daa07c19b8dea4db37af"} Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.631905 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-log" containerID="cri-o://09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac" gracePeriod=30 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.631989 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-api" containerID="cri-o://37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5" gracePeriod=30 Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.689310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.752438 5030 util.go:48] "No ready sandbox for pod can be found. 
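Above, the kubelet kills the ceilometer-0 and nova-api-0 containers with gracePeriod=30: SIGTERM first, SIGKILL only if the process is still running when the grace period expires. Consistent with that, the ContainerDied event for nova-api-log a little further down reports exitCode=143, the conventional 128+signal encoding for a process that terminated on SIGTERM. A one-liner to confirm the arithmetic (Unix signal numbering assumed):

```go
// sigterm_exitcode.go — show that exit code 143 is 128 + SIGTERM(15),
// the encoding behind the exitCode=143 ContainerDied event below.
package main

import (
	"fmt"
	"syscall"
)

func main() {
	fmt.Println(128 + int(syscall.SIGTERM)) // prints 143 on Linux
}
```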
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm4bs\" (UniqueName: \"kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.844652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd\") pod \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\" (UID: \"057cf7ce-3b6b-410a-a7b7-4c7abaa29400\") " Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.845413 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.853894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.870823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs" (OuterVolumeSpecName: "kube-api-access-mm4bs") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "kube-api-access-mm4bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.893773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts" (OuterVolumeSpecName: "scripts") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.956902 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.956929 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.956939 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.956950 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4bs\" (UniqueName: \"kubernetes.io/projected/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-kube-api-access-mm4bs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:38 crc kubenswrapper[5030]: I0120 23:18:38.985082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.027491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.048315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.059182 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.059335 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.059349 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.073124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data" (OuterVolumeSpecName: "config-data") pod "057cf7ce-3b6b-410a-a7b7-4c7abaa29400" (UID: "057cf7ce-3b6b-410a-a7b7-4c7abaa29400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.161387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/057cf7ce-3b6b-410a-a7b7-4c7abaa29400-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.646446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"057cf7ce-3b6b-410a-a7b7-4c7abaa29400","Type":"ContainerDied","Data":"bc0fc4adf772e70f87475cb5cc0dce622d8456e22478d749d1099a73840bc901"} Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.646481 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.646516 5030 scope.go:117] "RemoveContainer" containerID="01cd66eb65d88e91e692d54445f9da60eb95ccc70ff99e18625b3af0a40809a5" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.662651 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerID="09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac" exitCode=143 Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.662718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerDied","Data":"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac"} Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.672706 5030 scope.go:117] "RemoveContainer" containerID="3ef15dff2ccad333aa1ac506611e8c97a406d3ce2a293ca656e5a559a26db6a9" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.711005 5030 scope.go:117] "RemoveContainer" containerID="80f20f3e14fd30974a6862ee1199368c15661cf9f73c9489fc0941e110a5e9a9" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.728218 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.740827 5030 scope.go:117] "RemoveContainer" containerID="75f338bb0e4ed92531a6b0facd675d48c3ba45af1129daa07c19b8dea4db37af" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.745443 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.754615 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:39 crc kubenswrapper[5030]: E0120 23:18:39.755128 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-central-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755146 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-central-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: E0120 23:18:39.755162 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="proxy-httpd" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755170 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="proxy-httpd" Jan 20 23:18:39 crc kubenswrapper[5030]: E0120 23:18:39.755195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-notification-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-notification-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: E0120 23:18:39.755220 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="sg-core" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755226 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="sg-core" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755457 5030 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-central-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="sg-core" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755495 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="proxy-httpd" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.755504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" containerName="ceilometer-notification-agent" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.757793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.763236 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.763457 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.765723 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.770199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.782752 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.782796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.782846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.782864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.782886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpkf\" (UniqueName: \"kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc 
kubenswrapper[5030]: I0120 23:18:39.782989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.783038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.783066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.884837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.884897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.884958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.884983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.885014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpkf\" (UniqueName: \"kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.885065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.885105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.885142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.885475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.886042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.890472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.891043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.892076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.892092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.892952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.907931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpkf\" (UniqueName: \"kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf\") pod \"ceilometer-0\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:39 crc kubenswrapper[5030]: I0120 23:18:39.975938 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="057cf7ce-3b6b-410a-a7b7-4c7abaa29400" path="/var/lib/kubelet/pods/057cf7ce-3b6b-410a-a7b7-4c7abaa29400/volumes" Jan 20 23:18:40 crc kubenswrapper[5030]: I0120 23:18:40.077834 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:40 crc kubenswrapper[5030]: I0120 23:18:40.519837 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:40 crc kubenswrapper[5030]: I0120 23:18:40.676656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerStarted","Data":"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0"} Jan 20 23:18:40 crc kubenswrapper[5030]: I0120 23:18:40.678222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerStarted","Data":"a8ef70e3cb0d13bb5a7dc79d1ab5d5d3debccecaa9a2c216779610527d4b8802"} Jan 20 23:18:41 crc kubenswrapper[5030]: I0120 23:18:41.163682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:41 crc kubenswrapper[5030]: I0120 23:18:41.688765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerStarted","Data":"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93"} Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.413647 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.436361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data\") pod \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.436577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs\") pod \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.436695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzdgk\" (UniqueName: \"kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk\") pod \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.436777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle\") pod \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\" (UID: \"bc9a3bc6-4485-4d58-939b-53bcbc937d48\") " Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.437152 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs" (OuterVolumeSpecName: "logs") pod "bc9a3bc6-4485-4d58-939b-53bcbc937d48" (UID: "bc9a3bc6-4485-4d58-939b-53bcbc937d48"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.437684 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc9a3bc6-4485-4d58-939b-53bcbc937d48-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.449554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk" (OuterVolumeSpecName: "kube-api-access-mzdgk") pod "bc9a3bc6-4485-4d58-939b-53bcbc937d48" (UID: "bc9a3bc6-4485-4d58-939b-53bcbc937d48"). InnerVolumeSpecName "kube-api-access-mzdgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.500490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data" (OuterVolumeSpecName: "config-data") pod "bc9a3bc6-4485-4d58-939b-53bcbc937d48" (UID: "bc9a3bc6-4485-4d58-939b-53bcbc937d48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.522823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc9a3bc6-4485-4d58-939b-53bcbc937d48" (UID: "bc9a3bc6-4485-4d58-939b-53bcbc937d48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.541753 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.541787 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzdgk\" (UniqueName: \"kubernetes.io/projected/bc9a3bc6-4485-4d58-939b-53bcbc937d48-kube-api-access-mzdgk\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.541799 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9a3bc6-4485-4d58-939b-53bcbc937d48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.701105 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerID="37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5" exitCode=0 Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.701152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerDied","Data":"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5"} Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.701177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"bc9a3bc6-4485-4d58-939b-53bcbc937d48","Type":"ContainerDied","Data":"ad96590fad30a237c7e4eff5214a139ad4cd2485ae37f10a66cbd974f870e9d6"} Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.701194 5030 scope.go:117] "RemoveContainer" containerID="37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 
23:18:42.701319 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.735684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.736461 5030 scope.go:117] "RemoveContainer" containerID="09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.747071 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.755580 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:42 crc kubenswrapper[5030]: E0120 23:18:42.755981 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-log" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.755997 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-log" Jan 20 23:18:42 crc kubenswrapper[5030]: E0120 23:18:42.756040 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-api" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.756048 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-api" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.756270 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-api" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.756297 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" containerName="nova-api-log" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.757376 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.760756 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.760756 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.761129 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.762949 5030 scope.go:117] "RemoveContainer" containerID="37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5" Jan 20 23:18:42 crc kubenswrapper[5030]: E0120 23:18:42.763548 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5\": container with ID starting with 37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5 not found: ID does not exist" containerID="37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.763578 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5"} err="failed to get container status \"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5\": rpc error: code = NotFound desc = could not find container \"37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5\": container with ID starting with 37ffde59ab454a57f951598a3da3d18b4dcd431cf7626836ccf91be1b8cc0de5 not found: ID does not exist" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.763599 5030 scope.go:117] "RemoveContainer" containerID="09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac" Jan 20 23:18:42 crc kubenswrapper[5030]: E0120 23:18:42.763838 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac\": container with ID starting with 09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac not found: ID does not exist" containerID="09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.763865 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac"} err="failed to get container status \"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac\": rpc error: code = NotFound desc = could not find container \"09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac\": container with ID starting with 09df9180d7cb987d28ef89a0c521500e624ee57aed9f4a33780620103020bcac not found: ID does not exist" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.768897 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.947867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data\") pod \"nova-api-0\" (UID: 
\"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.948172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.948230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.948259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.948290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j9g\" (UniqueName: \"kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:42 crc kubenswrapper[5030]: I0120 23:18:42.948311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049921 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-47j9g\" (UniqueName: \"kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.049937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.051125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.053890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.057000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.061564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.065246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.088535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j9g\" (UniqueName: \"kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g\") pod \"nova-api-0\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.379190 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.716372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerStarted","Data":"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf"} Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.718327 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerID="434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0" exitCode=0 Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.718387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerDied","Data":"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0"} Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.823442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:18:43 crc kubenswrapper[5030]: W0120 23:18:43.829380 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bb6b715_a2a6_4c25_83c3_315fd30b8d92.slice/crio-7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab WatchSource:0}: Error finding container 7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab: Status 404 returned error can't find the container with id 7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab Jan 20 23:18:43 crc kubenswrapper[5030]: I0120 23:18:43.977609 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9a3bc6-4485-4d58-939b-53bcbc937d48" path="/var/lib/kubelet/pods/bc9a3bc6-4485-4d58-939b-53bcbc937d48/volumes" Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.740575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerStarted","Data":"300cae00e1e96594e491715916fe7d890923b30fd9931bc6ed9c3ee0ac8dbeaf"} Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.740971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerStarted","Data":"fe1a9151522425bf756d729b22560d3e26a3f05df618b4a2133b2f1cf4ed87cf"} Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.740998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerStarted","Data":"7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab"} Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.743792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerStarted","Data":"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf"} Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.747399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerStarted","Data":"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8"} Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.780334 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.780314521 podStartE2EDuration="2.780314521s" podCreationTimestamp="2026-01-20 23:18:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:18:44.771793037 +0000 UTC m=+2597.092053425" watchObservedRunningTime="2026-01-20 23:18:44.780314521 +0000 UTC m=+2597.100574819" Jan 20 23:18:44 crc kubenswrapper[5030]: I0120 23:18:44.801938 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fmxgh" podStartSLOduration=2.225272937 podStartE2EDuration="7.801906899s" podCreationTimestamp="2026-01-20 23:18:37 +0000 UTC" firstStartedPulling="2026-01-20 23:18:38.630187646 +0000 UTC m=+2590.950447934" lastFinishedPulling="2026-01-20 23:18:44.206821588 +0000 UTC m=+2596.527081896" observedRunningTime="2026-01-20 23:18:44.796686104 +0000 UTC m=+2597.116946392" watchObservedRunningTime="2026-01-20 23:18:44.801906899 +0000 UTC m=+2597.122167227" Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.762632 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerStarted","Data":"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67"} Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.762951 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-central-agent" containerID="cri-o://72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93" gracePeriod=30 Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.763731 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="proxy-httpd" containerID="cri-o://7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67" gracePeriod=30 Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.763755 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="sg-core" containerID="cri-o://3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf" gracePeriod=30 Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.763778 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-notification-agent" containerID="cri-o://e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf" gracePeriod=30 Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 23:18:45.802259 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.949486967 podStartE2EDuration="6.802232278s" podCreationTimestamp="2026-01-20 23:18:39 +0000 UTC" firstStartedPulling="2026-01-20 23:18:40.534649848 +0000 UTC m=+2592.854910136" lastFinishedPulling="2026-01-20 23:18:45.387395119 +0000 UTC m=+2597.707655447" observedRunningTime="2026-01-20 23:18:45.788644922 +0000 UTC m=+2598.108905240" watchObservedRunningTime="2026-01-20 23:18:45.802232278 +0000 UTC m=+2598.122492606" Jan 20 23:18:45 crc kubenswrapper[5030]: I0120 
23:18:45.966786 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.785374 5030 generic.go:334] "Generic (PLEG): container finished" podID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerID="7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67" exitCode=0 Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.786076 5030 generic.go:334] "Generic (PLEG): container finished" podID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerID="3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf" exitCode=2 Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.786092 5030 generic.go:334] "Generic (PLEG): container finished" podID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerID="e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf" exitCode=0 Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.785462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerDied","Data":"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67"} Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.786171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerDied","Data":"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf"} Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.786195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerDied","Data":"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf"} Jan 20 23:18:46 crc kubenswrapper[5030]: I0120 23:18:46.792967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114"} Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.583512 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.583590 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.601973 5030 scope.go:117] "RemoveContainer" containerID="a49455d12fd72cbf1f03ae03da0c2d0f55e27b28828ca4175a8011acd09f2387" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.647257 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.649842 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.657703 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.741203 5030 scope.go:117] "RemoveContainer" containerID="3bab7e848d1c904c3ea9680a2511ae2b06c32b2cb43119543388719f7eb1b7bd" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.748975 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.749148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.749199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.749254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj4x5\" (UniqueName: \"kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.760055 5030 scope.go:117] "RemoveContainer" containerID="7b21fefc5a3e56be8e6e529b2aa999397d9311389a31236928705e843a13b1df" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.804878 5030 scope.go:117] "RemoveContainer" containerID="a86a208b37c4b3725dd0af75943dbb3f175452a065a1bb1a4420bf3a64fdf949" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.814727 5030 generic.go:334] "Generic (PLEG): container finished" podID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerID="72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93" exitCode=0 Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.814781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerDied","Data":"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93"} Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.814805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f76e9da0-893b-45ee-9a86-2254184e1b8f","Type":"ContainerDied","Data":"a8ef70e3cb0d13bb5a7dc79d1ab5d5d3debccecaa9a2c216779610527d4b8802"} Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.814869 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.815558 5030 scope.go:117] "RemoveContainer" containerID="7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnpkf\" (UniqueName: \"kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf\") pod \"f76e9da0-893b-45ee-9a86-2254184e1b8f\" (UID: \"f76e9da0-893b-45ee-9a86-2254184e1b8f\") " Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.850962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851003 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj4x5\" (UniqueName: \"kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851342 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.851574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.855596 5030 scope.go:117] "RemoveContainer" containerID="3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.857855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf" (OuterVolumeSpecName: "kube-api-access-jnpkf") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "kube-api-access-jnpkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.858162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts" (OuterVolumeSpecName: "scripts") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.875153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj4x5\" (UniqueName: \"kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5\") pod \"certified-operators-rmkfj\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.885233 5030 scope.go:117] "RemoveContainer" containerID="e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.886774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.913314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.937058 5030 scope.go:117] "RemoveContainer" containerID="72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.952950 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.952973 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.952982 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.952991 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f76e9da0-893b-45ee-9a86-2254184e1b8f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.952999 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.953009 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnpkf\" (UniqueName: \"kubernetes.io/projected/f76e9da0-893b-45ee-9a86-2254184e1b8f-kube-api-access-jnpkf\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.955339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.966530 5030 scope.go:117] "RemoveContainer" containerID="7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67" Jan 20 23:18:47 crc kubenswrapper[5030]: E0120 23:18:47.971781 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67\": container with ID starting with 7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67 not found: ID does not exist" containerID="7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.971815 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67"} err="failed to get container status \"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67\": rpc error: code = NotFound desc = could not find container \"7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67\": container with ID starting with 7329e18f69a8e76397a70ad3e3e190bf757c09afedb1a5bb99a58341cea5fa67 not found: ID does not exist" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.971836 5030 scope.go:117] "RemoveContainer" containerID="3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf" Jan 20 23:18:47 crc kubenswrapper[5030]: E0120 23:18:47.975141 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf\": container with ID starting with 3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf not found: ID does not exist" containerID="3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.975167 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf"} err="failed to get container status \"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf\": rpc error: code = NotFound desc = could not find container \"3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf\": container with ID starting with 3e1796240f72dcdbd57b756c6395a70c480589d67f7bb93de3357af84ea140cf not found: ID does not exist" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.975181 5030 scope.go:117] "RemoveContainer" containerID="e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.976278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data" (OuterVolumeSpecName: "config-data") pod "f76e9da0-893b-45ee-9a86-2254184e1b8f" (UID: "f76e9da0-893b-45ee-9a86-2254184e1b8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:18:47 crc kubenswrapper[5030]: E0120 23:18:47.977144 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf\": container with ID starting with e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf not found: ID does not exist" containerID="e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.977186 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf"} err="failed to get container status \"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf\": rpc error: code = NotFound desc = could not find container \"e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf\": container with ID starting with e80ef8346e9ef9a484f93e9cc4cff69690d04d9409cb2c9fcc021ccdb0415bdf not found: ID does not exist" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.977215 5030 scope.go:117] "RemoveContainer" containerID="72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93" Jan 20 23:18:47 crc kubenswrapper[5030]: E0120 23:18:47.977531 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93\": container with ID starting with 72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93 not found: ID does not exist" containerID="72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93" Jan 20 23:18:47 crc kubenswrapper[5030]: I0120 23:18:47.977566 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93"} err="failed to get container status \"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93\": rpc error: code = NotFound desc = could not find container \"72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93\": container with ID starting with 72f3fdbb85e3f1600e9aa97a07c2b0e26512694b71a0762170d887e01a441e93 not found: ID does not exist" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.054391 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.054419 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f76e9da0-893b-45ee-9a86-2254184e1b8f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.057518 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.146665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.158185 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.185680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:48 crc kubenswrapper[5030]: E0120 23:18:48.186187 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-central-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186201 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-central-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: E0120 23:18:48.186229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-notification-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-notification-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: E0120 23:18:48.186257 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="proxy-httpd" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186268 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="proxy-httpd" Jan 20 23:18:48 crc kubenswrapper[5030]: E0120 23:18:48.186296 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="sg-core" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="sg-core" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186518 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="proxy-httpd" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186539 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-notification-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186557 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="sg-core" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.186566 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" containerName="ceilometer-central-agent" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.188677 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.193593 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.193977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.194307 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.207154 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.364542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.364957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.364996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.365043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rwwp\" (UniqueName: \"kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.365072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.365113 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.365164 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.365179 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.466756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rwwp\" (UniqueName: \"kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.467290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.468846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.473485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.477067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.477647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.477725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.477752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.479439 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.485115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rwwp\" (UniqueName: \"kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp\") pod \"ceilometer-0\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.530766 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.617361 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:18:48 crc kubenswrapper[5030]: W0120 23:18:48.628853 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20c2b53_c20c_4205_a6d3_d33750654784.slice/crio-d76cdc71bd3243642f4a98c2d2adb9dc5fe01f164d49a3f4e24c8b124e5851fe WatchSource:0}: Error finding container d76cdc71bd3243642f4a98c2d2adb9dc5fe01f164d49a3f4e24c8b124e5851fe: Status 404 returned error can't find the container with id d76cdc71bd3243642f4a98c2d2adb9dc5fe01f164d49a3f4e24c8b124e5851fe Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.659640 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmxgh" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" probeResult="failure" output=< Jan 20 23:18:48 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:18:48 crc kubenswrapper[5030]: > Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.834592 5030 generic.go:334] "Generic (PLEG): container finished" podID="a20c2b53-c20c-4205-a6d3-d33750654784" containerID="5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4" exitCode=0 Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.834738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerDied","Data":"5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4"} Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.834771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerStarted","Data":"d76cdc71bd3243642f4a98c2d2adb9dc5fe01f164d49a3f4e24c8b124e5851fe"} Jan 20 23:18:48 crc kubenswrapper[5030]: I0120 23:18:48.963052 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:18:48 crc kubenswrapper[5030]: W0120 23:18:48.965809 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7da27e47_283b_48aa_9439_f8b3cf0c3426.slice/crio-c1a6abc4deb91d85e7c40a9e16991e793b6e49f9b706362617ba43d16c9dda49 WatchSource:0}: Error finding container c1a6abc4deb91d85e7c40a9e16991e793b6e49f9b706362617ba43d16c9dda49: Status 404 returned error can't find the container with id c1a6abc4deb91d85e7c40a9e16991e793b6e49f9b706362617ba43d16c9dda49 Jan 20 23:18:49 crc kubenswrapper[5030]: I0120 23:18:49.862525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerStarted","Data":"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd"} Jan 20 23:18:49 crc kubenswrapper[5030]: I0120 23:18:49.867778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerStarted","Data":"b7437d316b2192f7896af5d93582c95337eb7084b35a8b488d5714c2c0ab610f"} Jan 20 23:18:49 crc kubenswrapper[5030]: I0120 23:18:49.867854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerStarted","Data":"c1a6abc4deb91d85e7c40a9e16991e793b6e49f9b706362617ba43d16c9dda49"} Jan 20 23:18:49 crc kubenswrapper[5030]: I0120 23:18:49.977446 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76e9da0-893b-45ee-9a86-2254184e1b8f" path="/var/lib/kubelet/pods/f76e9da0-893b-45ee-9a86-2254184e1b8f/volumes" Jan 20 23:18:50 crc kubenswrapper[5030]: I0120 23:18:50.884483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerStarted","Data":"491a040f8ee5635c6af0dc1b66387a1b9706edbf38f9295a9f919a35c55cb5ba"} Jan 20 23:18:50 crc kubenswrapper[5030]: I0120 23:18:50.888679 5030 generic.go:334] "Generic (PLEG): container finished" podID="a20c2b53-c20c-4205-a6d3-d33750654784" containerID="eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd" exitCode=0 Jan 20 23:18:50 crc kubenswrapper[5030]: I0120 23:18:50.888724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerDied","Data":"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd"} Jan 20 23:18:51 crc kubenswrapper[5030]: I0120 23:18:51.899790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerStarted","Data":"5cb235fbc11dbb1f4a1b8e0b540880dc076e8771be46d898537d4c8df6eaa4dc"} Jan 20 23:18:51 crc kubenswrapper[5030]: I0120 23:18:51.902373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerStarted","Data":"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a"} Jan 20 23:18:51 crc kubenswrapper[5030]: I0120 23:18:51.933267 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rmkfj" podStartSLOduration=2.507043184 podStartE2EDuration="4.933241635s" podCreationTimestamp="2026-01-20 23:18:47 +0000 UTC" firstStartedPulling="2026-01-20 23:18:48.840277693 +0000 UTC m=+2601.160537991" lastFinishedPulling="2026-01-20 23:18:51.266476154 +0000 UTC m=+2603.586736442" observedRunningTime="2026-01-20 23:18:51.926994984 +0000 UTC m=+2604.247255272" watchObservedRunningTime="2026-01-20 23:18:51.933241635 +0000 UTC m=+2604.253501953" Jan 20 23:18:53 crc kubenswrapper[5030]: I0120 23:18:53.380138 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:53 crc kubenswrapper[5030]: I0120 23:18:53.380766 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:18:53 crc kubenswrapper[5030]: I0120 23:18:53.934934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerStarted","Data":"cb217008ccf67ac0890f7c0b296c6e0fbe72ac6353cd839700724550dc254a84"} Jan 20 23:18:53 crc kubenswrapper[5030]: I0120 23:18:53.936300 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:18:53 crc kubenswrapper[5030]: I0120 23:18:53.976287 5030 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.034614863 podStartE2EDuration="5.976253327s" podCreationTimestamp="2026-01-20 23:18:48 +0000 UTC" firstStartedPulling="2026-01-20 23:18:48.96945231 +0000 UTC m=+2601.289712598" lastFinishedPulling="2026-01-20 23:18:52.911090774 +0000 UTC m=+2605.231351062" observedRunningTime="2026-01-20 23:18:53.970395257 +0000 UTC m=+2606.290655545" watchObservedRunningTime="2026-01-20 23:18:53.976253327 +0000 UTC m=+2606.296513645" Jan 20 23:18:54 crc kubenswrapper[5030]: I0120 23:18:54.399833 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:54 crc kubenswrapper[5030]: I0120 23:18:54.399846 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:18:58 crc kubenswrapper[5030]: I0120 23:18:58.057814 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:58 crc kubenswrapper[5030]: I0120 23:18:58.058485 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:58 crc kubenswrapper[5030]: I0120 23:18:58.138923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:58 crc kubenswrapper[5030]: I0120 23:18:58.651020 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmxgh" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" probeResult="failure" output=< Jan 20 23:18:58 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:18:58 crc kubenswrapper[5030]: > Jan 20 23:18:59 crc kubenswrapper[5030]: I0120 23:18:59.080431 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:18:59 crc kubenswrapper[5030]: I0120 23:18:59.161305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.023878 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rmkfj" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="registry-server" containerID="cri-o://aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a" gracePeriod=2 Jan 20 23:19:01 crc kubenswrapper[5030]: E0120 23:19:01.284712 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20c2b53_c20c_4205_a6d3_d33750654784.slice/crio-conmon-aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20c2b53_c20c_4205_a6d3_d33750654784.slice/crio-aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.564432 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.595602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities\") pod \"a20c2b53-c20c-4205-a6d3-d33750654784\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.595801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content\") pod \"a20c2b53-c20c-4205-a6d3-d33750654784\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.595942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj4x5\" (UniqueName: \"kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5\") pod \"a20c2b53-c20c-4205-a6d3-d33750654784\" (UID: \"a20c2b53-c20c-4205-a6d3-d33750654784\") " Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.596405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities" (OuterVolumeSpecName: "utilities") pod "a20c2b53-c20c-4205-a6d3-d33750654784" (UID: "a20c2b53-c20c-4205-a6d3-d33750654784"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.596817 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.603927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5" (OuterVolumeSpecName: "kube-api-access-lj4x5") pod "a20c2b53-c20c-4205-a6d3-d33750654784" (UID: "a20c2b53-c20c-4205-a6d3-d33750654784"). InnerVolumeSpecName "kube-api-access-lj4x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.639666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a20c2b53-c20c-4205-a6d3-d33750654784" (UID: "a20c2b53-c20c-4205-a6d3-d33750654784"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.698086 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a20c2b53-c20c-4205-a6d3-d33750654784-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:01 crc kubenswrapper[5030]: I0120 23:19:01.698122 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj4x5\" (UniqueName: \"kubernetes.io/projected/a20c2b53-c20c-4205-a6d3-d33750654784-kube-api-access-lj4x5\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.035419 5030 generic.go:334] "Generic (PLEG): container finished" podID="a20c2b53-c20c-4205-a6d3-d33750654784" containerID="aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a" exitCode=0 Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.035485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerDied","Data":"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a"} Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.035846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmkfj" event={"ID":"a20c2b53-c20c-4205-a6d3-d33750654784","Type":"ContainerDied","Data":"d76cdc71bd3243642f4a98c2d2adb9dc5fe01f164d49a3f4e24c8b124e5851fe"} Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.035876 5030 scope.go:117] "RemoveContainer" containerID="aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.035518 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rmkfj" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.071308 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.075642 5030 scope.go:117] "RemoveContainer" containerID="eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.089751 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rmkfj"] Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.099754 5030 scope.go:117] "RemoveContainer" containerID="5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.159268 5030 scope.go:117] "RemoveContainer" containerID="aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a" Jan 20 23:19:02 crc kubenswrapper[5030]: E0120 23:19:02.159775 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a\": container with ID starting with aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a not found: ID does not exist" containerID="aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.159807 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a"} err="failed to get container status \"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a\": rpc error: code = NotFound desc = could not find container \"aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a\": container with ID starting with aad7afcf8d5a04bd15312a54fcf6d2dd6b339446aac3a3d98d41adc8f3e04f5a not found: ID does not exist" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.159829 5030 scope.go:117] "RemoveContainer" containerID="eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd" Jan 20 23:19:02 crc kubenswrapper[5030]: E0120 23:19:02.160203 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd\": container with ID starting with eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd not found: ID does not exist" containerID="eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.160268 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd"} err="failed to get container status \"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd\": rpc error: code = NotFound desc = could not find container \"eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd\": container with ID starting with eab73fa73699476dfa9f6bbd10b8653d9c9495a6fa8f1f786aff5509eec544bd not found: ID does not exist" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.160303 5030 scope.go:117] "RemoveContainer" containerID="5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4" Jan 20 23:19:02 crc kubenswrapper[5030]: E0120 23:19:02.160936 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4\": container with ID starting with 5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4 not found: ID does not exist" containerID="5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4" Jan 20 23:19:02 crc kubenswrapper[5030]: I0120 23:19:02.160965 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4"} err="failed to get container status \"5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4\": rpc error: code = NotFound desc = could not find container \"5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4\": container with ID starting with 5676ad5ff6b616079ff594877d1698a31d5dde2cc682a8af04e13cc9d5f542b4 not found: ID does not exist" Jan 20 23:19:03 crc kubenswrapper[5030]: I0120 23:19:03.391797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:03 crc kubenswrapper[5030]: I0120 23:19:03.392282 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:03 crc kubenswrapper[5030]: I0120 23:19:03.392938 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:03 crc kubenswrapper[5030]: I0120 23:19:03.403028 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:03 crc kubenswrapper[5030]: I0120 23:19:03.976521 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" path="/var/lib/kubelet/pods/a20c2b53-c20c-4205-a6d3-d33750654784/volumes" Jan 20 23:19:04 crc kubenswrapper[5030]: I0120 23:19:04.058674 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:04 crc kubenswrapper[5030]: I0120 23:19:04.068598 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:07 crc kubenswrapper[5030]: I0120 23:19:07.658874 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:19:07 crc kubenswrapper[5030]: I0120 23:19:07.748878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:19:08 crc kubenswrapper[5030]: I0120 23:19:08.472116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.123721 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmxgh" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" containerID="cri-o://206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8" gracePeriod=2 Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.662175 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.772518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content\") pod \"a6e78595-7120-46bf-8096-25d98b7ce2a9\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.772972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw69r\" (UniqueName: \"kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r\") pod \"a6e78595-7120-46bf-8096-25d98b7ce2a9\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.773027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities\") pod \"a6e78595-7120-46bf-8096-25d98b7ce2a9\" (UID: \"a6e78595-7120-46bf-8096-25d98b7ce2a9\") " Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.773751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities" (OuterVolumeSpecName: "utilities") pod "a6e78595-7120-46bf-8096-25d98b7ce2a9" (UID: "a6e78595-7120-46bf-8096-25d98b7ce2a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.778741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r" (OuterVolumeSpecName: "kube-api-access-nw69r") pod "a6e78595-7120-46bf-8096-25d98b7ce2a9" (UID: "a6e78595-7120-46bf-8096-25d98b7ce2a9"). InnerVolumeSpecName "kube-api-access-nw69r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.874996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e78595-7120-46bf-8096-25d98b7ce2a9" (UID: "a6e78595-7120-46bf-8096-25d98b7ce2a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.875717 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw69r\" (UniqueName: \"kubernetes.io/projected/a6e78595-7120-46bf-8096-25d98b7ce2a9-kube-api-access-nw69r\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.875747 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:09 crc kubenswrapper[5030]: I0120 23:19:09.875762 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e78595-7120-46bf-8096-25d98b7ce2a9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.140257 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerID="206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8" exitCode=0 Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.140333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerDied","Data":"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8"} Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.140372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmxgh" event={"ID":"a6e78595-7120-46bf-8096-25d98b7ce2a9","Type":"ContainerDied","Data":"a48797815bcf4a5f5a467a264417bd76fa8d79fc23d15c7fc8117115d56349e7"} Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.140388 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmxgh" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.140400 5030 scope.go:117] "RemoveContainer" containerID="206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.174691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.178689 5030 scope.go:117] "RemoveContainer" containerID="434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.193923 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmxgh"] Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.218877 5030 scope.go:117] "RemoveContainer" containerID="8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.277905 5030 scope.go:117] "RemoveContainer" containerID="206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8" Jan 20 23:19:10 crc kubenswrapper[5030]: E0120 23:19:10.278508 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8\": container with ID starting with 206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8 not found: ID does not exist" containerID="206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.278571 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8"} err="failed to get container status \"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8\": rpc error: code = NotFound desc = could not find container \"206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8\": container with ID starting with 206ce47024540eb040d455134ed9df8683428a511eacaaeeb11f805d81fea5a8 not found: ID does not exist" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.278614 5030 scope.go:117] "RemoveContainer" containerID="434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0" Jan 20 23:19:10 crc kubenswrapper[5030]: E0120 23:19:10.279409 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0\": container with ID starting with 434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0 not found: ID does not exist" containerID="434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.279490 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0"} err="failed to get container status \"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0\": rpc error: code = NotFound desc = could not find container \"434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0\": container with ID starting with 434fa7c998e3bfd9fd45f7e556abf84c144ba9a652f8dc3483e7095a4f1212b0 not found: ID does not exist" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.279551 5030 scope.go:117] "RemoveContainer" 
containerID="8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45" Jan 20 23:19:10 crc kubenswrapper[5030]: E0120 23:19:10.280171 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45\": container with ID starting with 8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45 not found: ID does not exist" containerID="8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45" Jan 20 23:19:10 crc kubenswrapper[5030]: I0120 23:19:10.280348 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45"} err="failed to get container status \"8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45\": rpc error: code = NotFound desc = could not find container \"8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45\": container with ID starting with 8675df25996b84ab570a10f59e7ce0ca0082b9e9964a8ad24cfe17825d6cfc45 not found: ID does not exist" Jan 20 23:19:11 crc kubenswrapper[5030]: I0120 23:19:11.980897 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" path="/var/lib/kubelet/pods/a6e78595-7120-46bf-8096-25d98b7ce2a9/volumes" Jan 20 23:19:18 crc kubenswrapper[5030]: I0120 23:19:18.543778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.737461 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738094 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="extract-content" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738107 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="extract-content" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738122 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738128 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="extract-utilities" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738152 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="extract-utilities" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738167 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738173 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738182 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="extract-content" Jan 20 
23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738188 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="extract-content" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.738201 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="extract-utilities" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="extract-utilities" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738370 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e78595-7120-46bf-8096-25d98b7ce2a9" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738388 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20c2b53-c20c-4205-a6d3-d33750654784" containerName="registry-server" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.738989 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.745735 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.745772 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.747069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.761284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftm9p\" (UniqueName: \"kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.761415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.761517 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.766550 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.769805 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.770015 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" containerName="openstackclient" containerID="cri-o://4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647" gracePeriod=2 Jan 20 23:19:22 crc 
kubenswrapper[5030]: I0120 23:19:22.805879 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.828062 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.854216 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.862684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-xgf2v"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.864138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftm9p\" (UniqueName: \"kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.864218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.865194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.882671 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-xgf2v"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.914815 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.960813 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h948j"] Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.968704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ng7\" (UniqueName: \"kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.968888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.969996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftm9p\" 
(UniqueName: \"kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p\") pod \"glance-1026-account-create-update-h84cz\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.970049 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:22 crc kubenswrapper[5030]: E0120 23:19:22.970102 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data podName:59bcab9d-b9c0-40fe-8973-df39eeca1dd4 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:23.470088452 +0000 UTC m=+2635.790348740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data") pod "rabbitmq-server-1" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4") : configmap "rabbitmq-config-data" not found Jan 20 23:19:22 crc kubenswrapper[5030]: I0120 23:19:22.977664 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.048941 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-h948j"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.065366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.072198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28ng7\" (UniqueName: \"kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.116464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.120170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.126601 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/placement-config-data: secret "placement-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.126703 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data podName:fa4827b5-38d8-4647-a2bd-bfff1b91d34e nodeName:}" failed. No retries permitted until 2026-01-20 23:19:23.626677276 +0000 UTC m=+2635.946937564 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data") pod "placement-847f86bdb-chr55" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e") : secret "placement-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.137709 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.138676 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" containerName="openstackclient" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.138688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" containerName="openstackclient" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.139021 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" containerName="openstackclient" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.140398 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.174858 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.210348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ng7\" (UniqueName: \"kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7\") pod \"root-account-create-update-jjv69\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.229813 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.229900 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data podName:56408fa4-b1eb-4a4d-bc8a-9ab82b88957f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:23.729872541 +0000 UTC m=+2636.050132819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data") pod "rabbitmq-server-2" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f") : configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.246394 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.246605 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data podName:ed443431-2b5c-4533-8523-d01380c98c1d nodeName:}" failed. No retries permitted until 2026-01-20 23:19:23.746589662 +0000 UTC m=+2636.066849950 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data") pod "rabbitmq-server-0" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d") : configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.254689 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.311905 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.328343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.328493 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.328544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4wr\" (UniqueName: \"kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.330928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.342685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.344238 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.350415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.373231 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.380935 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.412980 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-s9wrb"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.431942 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rgs\" (UniqueName: \"kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.432058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.432096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8g28\" (UniqueName: \"kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.432146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4wr\" (UniqueName: \"kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.432194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.432240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.444402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " 
pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.446772 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.477691 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.489195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4wr\" (UniqueName: \"kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr\") pod \"placement-e5c6-account-create-update-lckcf\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.503694 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.526756 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.527370 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="openstack-network-exporter" containerID="cri-o://993cc6b278b50127911dfb227a9f751177138cfc79c86f0047ff5f55bad4e71a" gracePeriod=300 Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.533706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.533760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8g28\" (UniqueName: \"kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.533869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.533918 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rgs\" (UniqueName: \"kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.537566 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc 
kubenswrapper[5030]: E0120 23:19:23.537659 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data podName:59bcab9d-b9c0-40fe-8973-df39eeca1dd4 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.537639942 +0000 UTC m=+2636.857900230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data") pod "rabbitmq-server-1" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4") : configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.538430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.551563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.572181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rgs\" (UniqueName: \"kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs\") pod \"neutron-35d5-account-create-update-vslxf\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.593312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8g28\" (UniqueName: \"kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28\") pod \"cinder-e400-account-create-update-6mkqw\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.604031 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-wgs6j"] Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.636169 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/placement-config-data: secret "placement-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.638432 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data podName:fa4827b5-38d8-4647-a2bd-bfff1b91d34e nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.638415518 +0000 UTC m=+2636.958675806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data") pod "placement-847f86bdb-chr55" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e") : secret "placement-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.641957 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.647255 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.661299 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="ovsdbserver-sb" containerID="cri-o://acf7a1c2cf6b3d46585291710fe47cc13af65d867980957cc480cde40622afc4" gracePeriod=300 Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.705337 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.710848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-glbzc"] Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.742124 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.742194 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data podName:56408fa4-b1eb-4a4d-bc8a-9ab82b88957f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.742178516 +0000 UTC m=+2637.062438814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data") pod "rabbitmq-server-2" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f") : configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.743609 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.751295 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.755504 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.759688 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.762552 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.776457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.792686 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-jdnng"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.805675 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.806930 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.812752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-hn24q"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.814987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.826660 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.852693 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn"] Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.868691 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: E0120 23:19:23.868763 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data podName:ed443431-2b5c-4533-8523-d01380c98c1d nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.868745361 +0000 UTC m=+2637.189005649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data") pod "rabbitmq-server-0" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d") : configmap "rabbitmq-config-data" not found Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.877334 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.877541 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="ovn-northd" containerID="cri-o://f0b6f096253aba52285115aadb119a7459485eb7348095df5221d30a264dfa06" gracePeriod=30 Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.877859 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="openstack-network-exporter" containerID="cri-o://0838f1d78c9937fe1578213b08a61215c2546c6cc57d57d9399d671aecb5ea39" gracePeriod=30 Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.890559 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-pmcqn"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.911778 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-hn24q"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.940504 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.949248 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.949694 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="openstack-network-exporter" 
containerID="cri-o://c985cc11f4e3ac6e3f0060dc9a127411c7d93295135d6768cf20b1612fe484c2" gracePeriod=300 Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.971277 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" secret="" err="secret \"barbican-barbican-dockercfg-bskvr\" not found" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.973760 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/cinder-scheduler-0" secret="" err="secret \"cinder-cinder-dockercfg-74drc\" not found" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.975129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.975248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnd7f\" (UniqueName: \"kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.975345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m59t\" (UniqueName: \"kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.975423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.978134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a877089-7047-4846-a420-c2c4060562f5" path="/var/lib/kubelet/pods/2a877089-7047-4846-a420-c2c4060562f5/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.980332 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="303c674e-2c9f-4166-97f9-98ec856371ae" path="/var/lib/kubelet/pods/303c674e-2c9f-4166-97f9-98ec856371ae/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.981207 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3767ed5d-0e94-4d9d-8647-db60f59d03d7" path="/var/lib/kubelet/pods/3767ed5d-0e94-4d9d-8647-db60f59d03d7/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.981830 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e70d93d-ba09-4dcc-95b1-f25a68604a29" path="/var/lib/kubelet/pods/3e70d93d-ba09-4dcc-95b1-f25a68604a29/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.984740 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421b8192-d4d9-41a8-800d-3f3afab07ac7" path="/var/lib/kubelet/pods/421b8192-d4d9-41a8-800d-3f3afab07ac7/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.985600 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468d96c6-0dc6-46d7-b85c-0041ef331f7d" path="/var/lib/kubelet/pods/468d96c6-0dc6-46d7-b85c-0041ef331f7d/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.986226 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7301c9ea-0137-4118-9bff-ab6934177bb5" path="/var/lib/kubelet/pods/7301c9ea-0137-4118-9bff-ab6934177bb5/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.987637 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de53eca9-723a-4cc3-b742-8b2c091c53a3" path="/var/lib/kubelet/pods/de53eca9-723a-4cc3-b742-8b2c091c53a3/volumes" Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.988297 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:19:23 crc kubenswrapper[5030]: I0120 23:19:23.988387 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.050802 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hk4fb"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.068762 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-hk4fb"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.079224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.079275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnd7f\" (UniqueName: \"kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.079335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m59t\" (UniqueName: \"kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.079372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.099562 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" 
not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.099642 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.599605708 +0000 UTC m=+2636.919865996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scheduler-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.100405 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.100443 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.600434198 +0000 UTC m=+2636.920694486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.101412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.118025 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7wlmd"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.122795 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="ovsdbserver-nb" containerID="cri-o://4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" gracePeriod=300 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.127526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m59t\" (UniqueName: \"kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t\") pod \"barbican-ea87-account-create-update-rg22q\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.132486 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.132864 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.632829445 +0000 UTC m=+2636.953089733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scripts" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.133004 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.133072 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data podName:65e9d298-7338-47c3-8591-b7803798203f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.63305059 +0000 UTC m=+2636.953310878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data") pod "rabbitmq-cell1-server-1" (UID: "65e9d298-7338-47c3-8591-b7803798203f") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.133149 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.133188 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.633180803 +0000 UTC m=+2636.953441091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-api-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.133948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.134358 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.134464 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.634439103 +0000 UTC m=+2636.954699391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.146684 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.148639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnd7f\" (UniqueName: \"kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f\") pod \"nova-api-83e2-account-create-update-bwpfv\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.168975 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-bdw8v"] Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.190060 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.190146 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data podName:a54757d6-0570-4290-a802-d82ff94dffb9 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.690120719 +0000 UTC m=+2637.010381007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data") pod "rabbitmq-cell1-server-2" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.200034 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.200129 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data podName:995342f6-fe0b-4188-976c-5151541dd002 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:24.700080567 +0000 UTC m=+2637.020340845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data") pod "rabbitmq-cell1-server-0" (UID: "995342f6-fe0b-4188-976c-5151541dd002") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.244419 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.254371 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7wlmd"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.266185 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-bdw8v"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.279260 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-b1b2-account-create-update-6wwjc"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.310118 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.329860 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5d10-account-create-update-v94fx"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.338266 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-pzk5s"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.347955 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-pzk5s"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.365840 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_683bb59f-0c0f-4f12-970b-b786e886cae3/ovn-northd/0.log" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.365881 5030 generic.go:334] "Generic (PLEG): container finished" podID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerID="0838f1d78c9937fe1578213b08a61215c2546c6cc57d57d9399d671aecb5ea39" exitCode=2 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.365897 5030 generic.go:334] "Generic (PLEG): container finished" podID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerID="f0b6f096253aba52285115aadb119a7459485eb7348095df5221d30a264dfa06" exitCode=143 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.365942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerDied","Data":"0838f1d78c9937fe1578213b08a61215c2546c6cc57d57d9399d671aecb5ea39"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.365969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerDied","Data":"f0b6f096253aba52285115aadb119a7459485eb7348095df5221d30a264dfa06"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.375691 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d7b5a2a9-c6db-4d98-b203-5503e4b7075c/ovsdbserver-nb/0.log" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.375735 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerID="c985cc11f4e3ac6e3f0060dc9a127411c7d93295135d6768cf20b1612fe484c2" exitCode=2 Jan 20 
23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.375753 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerID="4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" exitCode=143 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.375819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerDied","Data":"c985cc11f4e3ac6e3f0060dc9a127411c7d93295135d6768cf20b1612fe484c2"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.375845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerDied","Data":"4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.381923 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.382172 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-log" containerID="cri-o://e4df01fc1632a811eb3b665c2886fe79657be01644364c0ace65e378ddb318ce" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.382897 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-httpd" containerID="cri-o://50dad30f02e2ad71b34ca576e9d068767ee44b13ac49b20df2f355b3c50adad3" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.392445 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_eef09e13-a7ab-4e00-a12c-fb544a0791ba/ovsdbserver-sb/0.log" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.392615 5030 generic.go:334] "Generic (PLEG): container finished" podID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerID="993cc6b278b50127911dfb227a9f751177138cfc79c86f0047ff5f55bad4e71a" exitCode=2 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.392715 5030 generic.go:334] "Generic (PLEG): container finished" podID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerID="acf7a1c2cf6b3d46585291710fe47cc13af65d867980957cc480cde40622afc4" exitCode=143 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.392802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerDied","Data":"993cc6b278b50127911dfb227a9f751177138cfc79c86f0047ff5f55bad4e71a"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.392946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerDied","Data":"acf7a1c2cf6b3d46585291710fe47cc13af65d867980957cc480cde40622afc4"} Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.400580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.405196 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:24 crc kubenswrapper[5030]: W0120 23:19:24.414376 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5bc4584_e61d_47a6_b76f_0ff39b47fe1f.slice/crio-b76c22c9bfa805c53d87eeb3f9212726eb23cdef997a0f13a87b245e436d19af WatchSource:0}: Error finding container b76c22c9bfa805c53d87eeb3f9212726eb23cdef997a0f13a87b245e436d19af: Status 404 returned error can't find the container with id b76c22c9bfa805c53d87eeb3f9212726eb23cdef997a0f13a87b245e436d19af Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.424856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-6nqfg"] Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.443793 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:24 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:19:24 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:19:24 crc kubenswrapper[5030]: else Jan 20 23:19:24 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:24 crc kubenswrapper[5030]: fi Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:24 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:24 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:24 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:24 crc kubenswrapper[5030]: # support updates Jan 20 23:19:24 crc kubenswrapper[5030]: Jan 20 23:19:24 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.447082 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" podUID="f5bc4584-e61d-47a6-b76f-0ff39b47fe1f" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.476604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.492449 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-2m6rv"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.509544 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.510574 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-log" containerID="cri-o://744df9f6f8632e4bde232c473af1353285e6fa32962770ad640031ef5ec3543a" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.510922 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-httpd" containerID="cri-o://7af3824becb0a8b511712e8c071c2a7723b825950e7dda89578b7c6ba4037ce4" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.546303 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-fgr22"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.573187 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-fgr22"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.582332 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_eef09e13-a7ab-4e00-a12c-fb544a0791ba/ovsdbserver-sb/0.log" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.582411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.588470 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589180 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-server" containerID="cri-o://86a2e3b58648f4c6865951dbe0e5fd173fe948355567c57408f5478873a50cea" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589305 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="swift-recon-cron" containerID="cri-o://31bdfc8eeca403bd17e795b8cf31b719db2f4805a115c8604e8ad198f7327da9" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589355 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="rsync" containerID="cri-o://63b6c6d4bece1d87f4729a39f17170c16567071872bdc73d5549a42bd98e07ba" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589396 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-expirer" containerID="cri-o://c088151164804d480f3d602813acc6fe8110edbb60819ceaf2f0559e9b333bb1" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589435 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-updater" containerID="cri-o://d04f17499f96be1bcb743043670578f206129ddc5a8598a1d01aac043f27b4a5" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589480 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-auditor" containerID="cri-o://ff021e7947768f870e170bfceeedcefa9f23164365486fe0e68cf658ba9ee312" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589518 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-replicator" containerID="cri-o://d0baa50e2392f25298e7d0f9821ab644a7212ae9a2f2e398ab1e0dc812fdc4da" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589572 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-server" containerID="cri-o://5f30fa9f5b0648ade3945014008c72e70964fe50f71f44800cc8bb9e7ffd26fd" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589615 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-updater" containerID="cri-o://c3edaddce88c0d920a63257712c29ebc2820562a0cebc56bf7063f7045098b45" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589677 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-auditor" containerID="cri-o://c172e50bb94cc43be14453c778229aa3ad0f5285b8f09db985f5f9ca934e07e0" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589709 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-replicator" containerID="cri-o://8420aa4a66cecb302aa6d121c7a14ab12b3529e6eb3b2d23cd341bdb0404c01b" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589739 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-server" containerID="cri-o://ddb4980b7a5fddbacfa0f9edafb461b4d52d06ebf1613fcacc617ee6aa6c2f36" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589768 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-reaper" containerID="cri-o://9953d67dbf3825bda9fc957256627f35a45dbc98f02872cfe23353e4f525b038" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589795 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-auditor" containerID="cri-o://2724c2174e7487a3d20c504c8b82914e661dd1586f42ae55254b615ffc14a493" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.589832 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-replicator" containerID="cri-o://e7a7f0b5138e5ce84a4aee902bbc98528f273ace28ad5b3739a5c7855fce0f41" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.605850 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.606095 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-log" containerID="cri-o://195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.606215 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-api" containerID="cri-o://63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.614384 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.614589 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="cinder-scheduler" containerID="cri-o://70fb88353fc70c3b9dab582d23bbede418372d313927da95f856408d4d739fad" 
gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.614742 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="probe" containerID="cri-o://3b40fee42e01b5b734218bca90644c05dc6e1540c1debc8a4a06f30fcb8a9bf8" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.620520 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.620586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.621722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26v98\" (UniqueName: \"kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.622461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.622602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.622700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.622716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.622763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs\") pod \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\" (UID: \"eef09e13-a7ab-4e00-a12c-fb544a0791ba\") " Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.623358 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.623496 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data podName:59bcab9d-b9c0-40fe-8973-df39eeca1dd4 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:26.62348099 +0000 UTC m=+2638.943741278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data") pod "rabbitmq-server-1" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4") : configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.624887 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.624960 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.624941825 +0000 UTC m=+2637.945202113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scheduler-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.625546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.626084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts" (OuterVolumeSpecName: "scripts") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.626854 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.626933 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.626913633 +0000 UTC m=+2637.947173921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.629764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config" (OuterVolumeSpecName: "config") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.656791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98" (OuterVolumeSpecName: "kube-api-access-26v98") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "kube-api-access-26v98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.660029 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.660299 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api-log" containerID="cri-o://3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.660765 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api" containerID="cri-o://494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.679349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.683990 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.684261 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-api" containerID="cri-o://6949cd28ba252efdec3e0d70cae363c20b2cd6eb423def67f259c29ca96e51c2" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.684731 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-httpd" containerID="cri-o://e47ac64b526f7c8c1e0183da0c8d577a84461451afc84c5ba2ba9e3c80008b0f" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.707870 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.708598 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="openstack-network-exporter" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.708637 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="openstack-network-exporter" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.708656 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="ovsdbserver-sb" Jan 20 23:19:24 crc 
kubenswrapper[5030]: I0120 23:19:24.708663 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="ovsdbserver-sb" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.710135 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="openstack-network-exporter" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.710170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" containerName="ovsdbserver-sb" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.711515 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.727564 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.732496 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.732554 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.732540096 +0000 UTC m=+2638.052800384 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.733248 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26v98\" (UniqueName: \"kubernetes.io/projected/eef09e13-a7ab-4e00-a12c-fb544a0791ba-kube-api-access-26v98\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.733263 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.733274 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.733284 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eef09e13-a7ab-4e00-a12c-fb544a0791ba-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.733307 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734682 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/placement-config-data: secret "placement-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734719 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data podName:fa4827b5-38d8-4647-a2bd-bfff1b91d34e nodeName:}" failed. 
No retries permitted until 2026-01-20 23:19:26.734707778 +0000 UTC m=+2639.054968066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data") pod "placement-847f86bdb-chr55" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e") : secret "placement-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734754 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734776 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data podName:995342f6-fe0b-4188-976c-5151541dd002 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.73476799 +0000 UTC m=+2638.055028278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data") pod "rabbitmq-cell1-server-0" (UID: "995342f6-fe0b-4188-976c-5151541dd002") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734808 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734830 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734842 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data podName:a54757d6-0570-4290-a802-d82ff94dffb9 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.734834531 +0000 UTC m=+2638.055094819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data") pod "rabbitmq-cell1-server-2" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734864 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.734853972 +0000 UTC m=+2638.055114260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-api-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734876 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734903 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data podName:65e9d298-7338-47c3-8591-b7803798203f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.734895593 +0000 UTC m=+2638.055155881 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data") pod "rabbitmq-cell1-server-1" (UID: "65e9d298-7338-47c3-8591-b7803798203f") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.734962 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.735019 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:25.735003415 +0000 UTC m=+2638.055263703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scripts" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.769961 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.779719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.787578 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.803991 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.807733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.807827 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gnnv9"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.820355 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gnnv9"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.836757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vf4\" (UniqueName: \"kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.836962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.837013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.837083 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.837095 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.837179 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.837218 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data podName:56408fa4-b1eb-4a4d-bc8a-9ab82b88957f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:26.837203896 +0000 UTC m=+2639.157464184 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data") pod "rabbitmq-server-2" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f") : configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.871910 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-1" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="rabbitmq" containerID="cri-o://a74fc0dc19c36c183a9bd3ffda3fbb852f4467116d2b61c87b1bef566563caba" gracePeriod=604800 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.883079 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="rabbitmq" containerID="cri-o://07c23d45fe75d1a2499dda70559be5d8b50e9dddf38b74b4c6637c72301d7701" gracePeriod=604800 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.883233 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.917296 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.917917 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" containerID="cri-o://e36b6c921d799153e5dd8b22436909b17571e1a429199d1f21643266c33dc577" gracePeriod=604800 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.940418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.940512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.940581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vf4\" (UniqueName: \"kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.942450 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: E0120 23:19:24.942506 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data podName:ed443431-2b5c-4533-8523-d01380c98c1d nodeName:}" failed. No retries permitted until 2026-01-20 23:19:26.942490271 +0000 UTC m=+2639.262750559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data") pod "rabbitmq-server-0" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d") : configmap "rabbitmq-config-data" not found Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.943046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.943399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.950229 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.954967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.956027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "eef09e13-a7ab-4e00-a12c-fb544a0791ba" (UID: "eef09e13-a7ab-4e00-a12c-fb544a0791ba"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.963040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vf4\" (UniqueName: \"kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4\") pod \"dnsmasq-dnsmasq-84b9f45d47-24hk2\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.981328 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jm65l"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.990156 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jm65l"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.998577 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.998896 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" containerID="cri-o://6937c695cff1667d0d62ce1c83b0152b926dae8c296b22eca0b03bd5169765f4" gracePeriod=30 Jan 20 23:19:24 crc kubenswrapper[5030]: I0120 23:19:24.998836 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" containerID="cri-o://4acdd1af318211b5450f634e63d8de0e0421722ea1b31f9eda4efdab1cfbcc50" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.015222 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.045410 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.045444 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef09e13-a7ab-4e00-a12c-fb544a0791ba-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.055721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.056084 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-log" containerID="cri-o://fe1a9151522425bf756d729b22560d3e26a3f05df618b4a2133b2f1cf4ed87cf" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.056530 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-api" containerID="cri-o://300cae00e1e96594e491715916fe7d890923b30fd9931bc6ed9c3ee0ac8dbeaf" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.073437 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.105755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-x7ll9"] Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.133564 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908 is running failed: container process not found" containerID="4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.136381 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908 is running failed: container process not found" containerID="4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.158258 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908 is running failed: container process not found" containerID="4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.158333 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="ovsdbserver-nb" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.180719 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-x7ll9"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.221743 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.235721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.244489 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:19:25 crc kubenswrapper[5030]: W0120 23:19:25.271997 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62358e70_13cb_4b80_8d09_f5ae4f43f6bd.slice/crio-69ca8a0f65213103b6432ff4ceb6b83389a392b8bb7a410045ccff616da55263 WatchSource:0}: Error finding container 69ca8a0f65213103b6432ff4ceb6b83389a392b8bb7a410045ccff616da55263: Status 404 returned error can't find the container with id 69ca8a0f65213103b6432ff4ceb6b83389a392b8bb7a410045ccff616da55263 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.274132 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.292673 5030 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-t4nrn"] Jan 20 23:19:25 crc kubenswrapper[5030]: W0120 23:19:25.295725 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fc63968_1e2b_46fb_b8f3_d4cfd459410c.slice/crio-39e1c77fc43ef513673176366b4a66fb35e92b56f1fd1811af8d55b5fd788004 WatchSource:0}: Error finding container 39e1c77fc43ef513673176366b4a66fb35e92b56f1fd1811af8d55b5fd788004: Status 404 returned error can't find the container with id 39e1c77fc43ef513673176366b4a66fb35e92b56f1fd1811af8d55b5fd788004 Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.296508 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.296956 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.297173 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.297390 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.299441 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" podUID="d84d5a65-b170-4bad-aa8a-ddd90f8f48f9" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.299479 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" podUID="131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.299497 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" podUID="e313a291-92ea-456b-9b16-7eb70ebfbe29" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.299515 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" podUID="62358e70-13cb-4b80-8d09-f5ae4f43f6bd" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.299680 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-t4nrn"] Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.326209 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.327577 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" podUID="7fc63968-1e2b-46fb-b8f3-d4cfd459410c" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.331658 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.332211 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker-log" containerID="cri-o://8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.332662 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker" containerID="cri-o://284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.346496 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.354261 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-7ddpx"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.362873 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-7ddpx"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.373532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.373972 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener-log" containerID="cri-o://12ac3c08c01fa3165cd6cab4fc0e255fb705e419bf03d4f6f9041ab8c246bd09" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.374591 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener" containerID="cri-o://88a2387b417e90bfd501bc833d3ec1e5d951fed92f4116f90417ad981562c482" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.386766 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5svxp"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.398598 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.402185 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-5svxp"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.409660 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dlbqk"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.419996 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dlbqk"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.429848 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rj9lg"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.437798 5030 generic.go:334] "Generic (PLEG): container finished" podID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerID="4acdd1af318211b5450f634e63d8de0e0421722ea1b31f9eda4efdab1cfbcc50" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.437882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerDied","Data":"4acdd1af318211b5450f634e63d8de0e0421722ea1b31f9eda4efdab1cfbcc50"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.447394 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.452462 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerID="195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.452717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerDied","Data":"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.455541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.455892 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c079a7fc-dcd4-48da-bdd8-b47d93405531" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.462206 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_683bb59f-0c0f-4f12-970b-b786e886cae3/ovn-northd/0.log" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.462337 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.464656 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-rj9lg"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.470696 5030 generic.go:334] "Generic (PLEG): container finished" podID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerID="7c83141a8f44daede2829c14c813ca55519c2208bddf5b310b332a5b4db4f332" exitCode=1 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.470808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jjv69" event={"ID":"8d6c006d-67ff-426d-9203-5c769b474fa3","Type":"ContainerDied","Data":"7c83141a8f44daede2829c14c813ca55519c2208bddf5b310b332a5b4db4f332"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.470850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jjv69" event={"ID":"8d6c006d-67ff-426d-9203-5c769b474fa3","Type":"ContainerStarted","Data":"48a2c00e0f645b82fb7c8a947a27dbe27867527fac54e630665e3dd0f242e44b"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.471638 5030 scope.go:117] "RemoveContainer" containerID="7c83141a8f44daede2829c14c813ca55519c2208bddf5b310b332a5b4db4f332" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.481613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" event={"ID":"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9","Type":"ContainerStarted","Data":"f7af5852f843e702eb374830066e571af883b75be3b5711f1447731388122dd0"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.487736 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.492287 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.496289 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.496638 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api-log" containerID="cri-o://05743f2bd663ee4c6959a35a5be0c01f08b3fea21e5994bca8e2faae3bb81e6a" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.496680 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api" containerID="cri-o://d64a394c4f184d65be789b464e93507d0d51cff95adc8f21cab504aa94390ded" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.498486 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d7b5a2a9-c6db-4d98-b203-5503e4b7075c/ovsdbserver-nb/0.log" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.498688 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.504330 5030 generic.go:334] "Generic (PLEG): container finished" podID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerID="e47ac64b526f7c8c1e0183da0c8d577a84461451afc84c5ba2ba9e3c80008b0f" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.504414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerDied","Data":"e47ac64b526f7c8c1e0183da0c8d577a84461451afc84c5ba2ba9e3c80008b0f"} Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.514343 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.515550 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" podUID="d84d5a65-b170-4bad-aa8a-ddd90f8f48f9" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.517938 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.520656 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerID="744df9f6f8632e4bde232c473af1353285e6fa32962770ad640031ef5ec3543a" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.520733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerDied","Data":"744df9f6f8632e4bde232c473af1353285e6fa32962770ad640031ef5ec3543a"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.536690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.545367 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_683bb59f-0c0f-4f12-970b-b786e886cae3/ovn-northd/0.log" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.545453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"683bb59f-0c0f-4f12-970b-b786e886cae3","Type":"ContainerDied","Data":"7da788db8486da11320cca377df018d7a71cd8aee6416603d06a9002be1b267a"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.545490 5030 scope.go:117] "RemoveContainer" containerID="0838f1d78c9937fe1578213b08a61215c2546c6cc57d57d9399d671aecb5ea39" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.545617 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.553268 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_d7b5a2a9-c6db-4d98-b203-5503e4b7075c/ovsdbserver-nb/0.log" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.553344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"d7b5a2a9-c6db-4d98-b203-5503e4b7075c","Type":"ContainerDied","Data":"dc3ceb45ba211ba7d0009d6893d08a4f3048c964f22e393caac74abdae209589"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.553421 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.559681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" event={"ID":"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7","Type":"ContainerStarted","Data":"f8559d3e07e568f2d4b62436d82a1621448096e0ee9a867a98d54f48744b46f2"} Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.560977 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.564655 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" podUID="131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.565957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.565999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret\") pod \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7zc8\" (UniqueName: \"kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle\") pod \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc 
kubenswrapper[5030]: I0120 23:19:25.566253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config\") pod \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmjj5\" (UniqueName: \"kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f282\" (UniqueName: \"kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282\") pod \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\" (UID: \"97fdbcd3-4caf-4b83-b15a-a3dec765984f\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.566582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir\") pod \"683bb59f-0c0f-4f12-970b-b786e886cae3\" (UID: \"683bb59f-0c0f-4f12-970b-b786e886cae3\") " Jan 20 23:19:25 crc 
kubenswrapper[5030]: I0120 23:19:25.566604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\" (UID: \"d7b5a2a9-c6db-4d98-b203-5503e4b7075c\") " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.568265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.568766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts" (OuterVolumeSpecName: "scripts") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.570558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts" (OuterVolumeSpecName: "scripts") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.571324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config" (OuterVolumeSpecName: "config") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.574663 5030 generic.go:334] "Generic (PLEG): container finished" podID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" containerID="4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647" exitCode=137 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.574800 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.575145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5" (OuterVolumeSpecName: "kube-api-access-qmjj5") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "kube-api-access-qmjj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.576310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.577627 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282" (OuterVolumeSpecName: "kube-api-access-8f282") pod "97fdbcd3-4caf-4b83-b15a-a3dec765984f" (UID: "97fdbcd3-4caf-4b83-b15a-a3dec765984f"). InnerVolumeSpecName "kube-api-access-8f282". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.577803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" event={"ID":"e313a291-92ea-456b-9b16-7eb70ebfbe29","Type":"ContainerStarted","Data":"3449f28e7a6f6d192c0d67f2192a98400e5b03a94b2babc4cbb597521df34e37"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.577822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config" (OuterVolumeSpecName: "config") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.583567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.584426 5030 generic.go:334] "Generic (PLEG): container finished" podID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerID="3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.584523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerDied","Data":"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.591421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" event={"ID":"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f","Type":"ContainerStarted","Data":"b76c22c9bfa805c53d87eeb3f9212726eb23cdef997a0f13a87b245e436d19af"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.594255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8" (OuterVolumeSpecName: "kube-api-access-v7zc8") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "kube-api-access-v7zc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.604676 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.607461 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="rabbitmq" containerID="cri-o://b4810b367aef312cb07a5bc212828530d847353295a4a0029f98ba6de0de7144" gracePeriod=604800 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.614582 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.626519 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.626872 5030 scope.go:117] "RemoveContainer" containerID="f0b6f096253aba52285115aadb119a7459485eb7348095df5221d30a264dfa06" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.627102 5030 generic.go:334] "Generic (PLEG): container finished" podID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerID="fe1a9151522425bf756d729b22560d3e26a3f05df618b4a2133b2f1cf4ed87cf" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.627157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerDied","Data":"fe1a9151522425bf756d729b22560d3e26a3f05df618b4a2133b2f1cf4ed87cf"} Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.627372 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.628779 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" podUID="f5bc4584-e61d-47a6-b76f-0ff39b47fe1f" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.632354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.639605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.653272 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-2" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="galera" containerID="cri-o://43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.654772 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="rabbitmq" containerID="cri-o://5ec228c62d11e892bccd8b3e7622fe27d4c99380436acc8ab9a70d07539cf3ba" gracePeriod=604800 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.660492 5030 generic.go:334] "Generic (PLEG): container finished" podID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerID="e4df01fc1632a811eb3b665c2886fe79657be01644364c0ace65e378ddb318ce" exitCode=143 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.660579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerDied","Data":"e4df01fc1632a811eb3b665c2886fe79657be01644364c0ace65e378ddb318ce"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.665833 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.668108 5030 scope.go:117] "RemoveContainer" containerID="c985cc11f4e3ac6e3f0060dc9a127411c7d93295135d6768cf20b1612fe484c2" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.675959 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676212 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676223 5030 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmjj5\" (UniqueName: \"kubernetes.io/projected/683bb59f-0c0f-4f12-970b-b786e886cae3-kube-api-access-qmjj5\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676232 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676242 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f282\" (UniqueName: \"kubernetes.io/projected/97fdbcd3-4caf-4b83-b15a-a3dec765984f-kube-api-access-8f282\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676250 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676272 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676281 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7zc8\" (UniqueName: \"kubernetes.io/projected/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-kube-api-access-v7zc8\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676291 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/683bb59f-0c0f-4f12-970b-b786e886cae3-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676299 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.676307 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.676443 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.676496 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.676479842 +0000 UTC m=+2639.996740130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scheduler-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.677077 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.677415 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.677396354 +0000 UTC m=+2639.997656642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.686250 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.708083 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-1" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.731599 5030 scope.go:117] "RemoveContainer" containerID="4319fd893e780281e4fa992c547508838c8f4137c581cbf49edf43fa6a462908" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.732831 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738107 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="63b6c6d4bece1d87f4729a39f17170c16567071872bdc73d5549a42bd98e07ba" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738139 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="c088151164804d480f3d602813acc6fe8110edbb60819ceaf2f0559e9b333bb1" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738148 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="d04f17499f96be1bcb743043670578f206129ddc5a8598a1d01aac043f27b4a5" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738156 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="ff021e7947768f870e170bfceeedcefa9f23164365486fe0e68cf658ba9ee312" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738164 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="d0baa50e2392f25298e7d0f9821ab644a7212ae9a2f2e398ab1e0dc812fdc4da" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738173 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="c3edaddce88c0d920a63257712c29ebc2820562a0cebc56bf7063f7045098b45" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 
23:19:25.738179 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="c172e50bb94cc43be14453c778229aa3ad0f5285b8f09db985f5f9ca934e07e0" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738186 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="8420aa4a66cecb302aa6d121c7a14ab12b3529e6eb3b2d23cd341bdb0404c01b" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738194 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="ddb4980b7a5fddbacfa0f9edafb461b4d52d06ebf1613fcacc617ee6aa6c2f36" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738206 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="9953d67dbf3825bda9fc957256627f35a45dbc98f02872cfe23353e4f525b038" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738212 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="2724c2174e7487a3d20c504c8b82914e661dd1586f42ae55254b615ffc14a493" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738218 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="e7a7f0b5138e5ce84a4aee902bbc98528f273ace28ad5b3739a5c7855fce0f41" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738226 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="86a2e3b58648f4c6865951dbe0e5fd173fe948355567c57408f5478873a50cea" exitCode=0 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"63b6c6d4bece1d87f4729a39f17170c16567071872bdc73d5549a42bd98e07ba"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"c088151164804d480f3d602813acc6fe8110edbb60819ceaf2f0559e9b333bb1"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"d04f17499f96be1bcb743043670578f206129ddc5a8598a1d01aac043f27b4a5"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"ff021e7947768f870e170bfceeedcefa9f23164365486fe0e68cf658ba9ee312"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.738473 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerName="nova-scheduler-scheduler" containerID="cri-o://ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"d0baa50e2392f25298e7d0f9821ab644a7212ae9a2f2e398ab1e0dc812fdc4da"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"c3edaddce88c0d920a63257712c29ebc2820562a0cebc56bf7063f7045098b45"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"c172e50bb94cc43be14453c778229aa3ad0f5285b8f09db985f5f9ca934e07e0"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"8420aa4a66cecb302aa6d121c7a14ab12b3529e6eb3b2d23cd341bdb0404c01b"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"ddb4980b7a5fddbacfa0f9edafb461b4d52d06ebf1613fcacc617ee6aa6c2f36"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"9953d67dbf3825bda9fc957256627f35a45dbc98f02872cfe23353e4f525b038"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"2724c2174e7487a3d20c504c8b82914e661dd1586f42ae55254b615ffc14a493"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"e7a7f0b5138e5ce84a4aee902bbc98528f273ace28ad5b3739a5c7855fce0f41"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.741841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"86a2e3b58648f4c6865951dbe0e5fd173fe948355567c57408f5478873a50cea"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.766202 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.772904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "97fdbcd3-4caf-4b83-b15a-a3dec765984f" (UID: "97fdbcd3-4caf-4b83-b15a-a3dec765984f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.796378 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_eef09e13-a7ab-4e00-a12c-fb544a0791ba/ovsdbserver-sb/0.log" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.796595 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.796480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"eef09e13-a7ab-4e00-a12c-fb544a0791ba","Type":"ContainerDied","Data":"a54a17f7765517e607697e25e75f56100e9905b25af73aad95d6b390ac988ab6"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.797461 5030 scope.go:117] "RemoveContainer" containerID="4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.797522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.798841 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.798888 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data podName:a54757d6-0570-4290-a802-d82ff94dffb9 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.798870378 +0000 UTC m=+2640.119130666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data") pod "rabbitmq-cell1-server-2" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799035 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799101 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data podName:65e9d298-7338-47c3-8591-b7803798203f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.799086523 +0000 UTC m=+2640.119346811 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data") pod "rabbitmq-cell1-server-1" (UID: "65e9d298-7338-47c3-8591-b7803798203f") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799202 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799244 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.799233516 +0000 UTC m=+2640.119493804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-api-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799369 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799431 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.79942239 +0000 UTC m=+2640.119682678 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scripts" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799753 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.799782 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.79977405 +0000 UTC m=+2640.120034338 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.799990 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.800006 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.800072 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.800112 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data podName:995342f6-fe0b-4188-976c-5151541dd002 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:27.800097377 +0000 UTC m=+2640.120357665 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data") pod "rabbitmq-cell1-server-0" (UID: "995342f6-fe0b-4188-976c-5151541dd002") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.808372 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-648db"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.809573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" event={"ID":"62358e70-13cb-4b80-8d09-f5ae4f43f6bd","Type":"ContainerStarted","Data":"69ca8a0f65213103b6432ff4ceb6b83389a392b8bb7a410045ccff616da55263"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.813388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97fdbcd3-4caf-4b83-b15a-a3dec765984f" (UID: "97fdbcd3-4caf-4b83-b15a-a3dec765984f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.819122 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="rabbitmq" containerID="cri-o://38eba0f78e9bf8588976acd650a6cbc2107e5878a8dbe1f929ebf5401b2d0697" gracePeriod=604800 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.819295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" event={"ID":"7fc63968-1e2b-46fb-b8f3-d4cfd459410c","Type":"ContainerStarted","Data":"39e1c77fc43ef513673176366b4a66fb35e92b56f1fd1811af8d55b5fd788004"} Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.853946 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.872187 5030 scope.go:117] "RemoveContainer" containerID="4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.872876 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.873110 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0863eb67-3ead-4744-b338-aaf75284e458" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.888755 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647\": container with ID starting with 4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647 not found: ID does not exist" containerID="4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.888798 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647"} err="failed to get container status \"4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647\": rpc error: code = NotFound desc = could not find container \"4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647\": container with ID starting with 4630bdc572e916afefdf83bc662ba7a0751c9ba42447e962df99d64177728647 not found: ID does not exist" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.888821 5030 scope.go:117] "RemoveContainer" containerID="993cc6b278b50127911dfb227a9f751177138cfc79c86f0047ff5f55bad4e71a" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.899525 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:19:25 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 
crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:19:25 crc kubenswrapper[5030]: else Jan 20 23:19:25 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:19:25 crc kubenswrapper[5030]: fi Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:19:25 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:19:25 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:19:25 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:19:25 crc kubenswrapper[5030]: # support updates Jan 20 23:19:25 crc kubenswrapper[5030]: Jan 20 23:19:25 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:19:25 crc kubenswrapper[5030]: E0120 23:19:25.900876 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" podUID="7fc63968-1e2b-46fb-b8f3-d4cfd459410c" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.906097 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.906129 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.924723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.924956 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" gracePeriod=30 Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.928541 5030 scope.go:117] "RemoveContainer" containerID="acf7a1c2cf6b3d46585291710fe47cc13af65d867980957cc480cde40622afc4" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.940384 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.946430 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-vrv9d"] Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.976929 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-2" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Jan 20 23:19:25 crc 
kubenswrapper[5030]: I0120 23:19:25.981484 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b90192c-b1c7-4c7c-a4cd-b86ed62f0903" path="/var/lib/kubelet/pods/0b90192c-b1c7-4c7c-a4cd-b86ed62f0903/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.982045 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b42b071-9e29-4c1b-a187-e79623838902" path="/var/lib/kubelet/pods/1b42b071-9e29-4c1b-a187-e79623838902/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.982611 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24326cd7-d35a-41a1-b953-3ca15ad5d5f0" path="/var/lib/kubelet/pods/24326cd7-d35a-41a1-b953-3ca15ad5d5f0/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.984750 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2775de5c-23db-45b4-a940-183a73cd8fb4" path="/var/lib/kubelet/pods/2775de5c-23db-45b4-a940-183a73cd8fb4/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.985669 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae1ddae-2f96-4602-92de-80ea30454c54" path="/var/lib/kubelet/pods/2ae1ddae-2f96-4602-92de-80ea30454c54/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.986767 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3313a132-ec2a-462d-ad27-645c76243ac6" path="/var/lib/kubelet/pods/3313a132-ec2a-462d-ad27-645c76243ac6/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.987288 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359366f6-13e3-4ce0-be91-1ecc9bd33613" path="/var/lib/kubelet/pods/359366f6-13e3-4ce0-be91-1ecc9bd33613/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.988025 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6762fe60-da88-4169-ac82-e435ae1486db" path="/var/lib/kubelet/pods/6762fe60-da88-4169-ac82-e435ae1486db/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.989293 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68dec5c8-ac10-4af8-b442-11df66af8d8f" path="/var/lib/kubelet/pods/68dec5c8-ac10-4af8-b442-11df66af8d8f/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.989932 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcd0796-3d13-46c8-a117-74ff1f36d4cc" path="/var/lib/kubelet/pods/7fcd0796-3d13-46c8-a117-74ff1f36d4cc/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.991247 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830844d5-003c-44cd-bf80-5c78c96c1862" path="/var/lib/kubelet/pods/830844d5-003c-44cd-bf80-5c78c96c1862/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.996459 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab764fb-e311-4d5c-8139-65120ec5f9eb" path="/var/lib/kubelet/pods/aab764fb-e311-4d5c-8139-65120ec5f9eb/volumes" Jan 20 23:19:25 crc kubenswrapper[5030]: I0120 23:19:25.998025 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8466bfd-157a-488e-a8a3-856b4837603a" path="/var/lib/kubelet/pods/b8466bfd-157a-488e-a8a3-856b4837603a/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.001939 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef71c28-3c90-4bc9-bd49-13629ca6f394" path="/var/lib/kubelet/pods/bef71c28-3c90-4bc9-bd49-13629ca6f394/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.002567 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c" path="/var/lib/kubelet/pods/bf31a8aa-416b-44ff-b20c-d2b6d4a86a6c/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.003290 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81789b9-e782-4dbe-9832-a614011b349d" path="/var/lib/kubelet/pods/e81789b9-e782-4dbe-9832-a614011b349d/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.003974 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d0af08-12b5-437a-9adc-37ff294ee30f" path="/var/lib/kubelet/pods/e9d0af08-12b5-437a-9adc-37ff294ee30f/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.004973 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1bee829-9ebf-4a41-8293-a7328324b33f" path="/var/lib/kubelet/pods/f1bee829-9ebf-4a41-8293-a7328324b33f/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.005493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9dd2651-9595-4f1a-885f-b7c766d8ffff" path="/var/lib/kubelet/pods/f9dd2651-9595-4f1a-885f-b7c766d8ffff/volumes" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.047835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "97fdbcd3-4caf-4b83-b15a-a3dec765984f" (UID: "97fdbcd3-4caf-4b83-b15a-a3dec765984f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.058403 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.060768 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.062890 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.062925 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.064510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: 
"683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.073988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.110463 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/97fdbcd3-4caf-4b83-b15a-a3dec765984f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.110488 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.110499 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.121009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d7b5a2a9-c6db-4d98-b203-5503e4b7075c" (UID: "d7b5a2a9-c6db-4d98-b203-5503e4b7075c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.124116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "683bb59f-0c0f-4f12-970b-b786e886cae3" (UID: "683bb59f-0c0f-4f12-970b-b786e886cae3"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.217155 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7b5a2a9-c6db-4d98-b203-5503e4b7075c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.217189 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/683bb59f-0c0f-4f12-970b-b786e886cae3-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.275013 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.275045 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.275057 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.275085 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.276518 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-httpd" containerID="cri-o://ec5e526c2cdd05718671ccd6ac4f385ffc8385a61f4c21bec9eea239556ebbbc" gracePeriod=30 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.278756 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-server" containerID="cri-o://c4a40a78281e11cd16875b9ee533f316223288e089e031dfb22cb627128bcb6c" gracePeriod=30 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.311689 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.344800 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.385902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.412580 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.420441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts\") pod \"e313a291-92ea-456b-9b16-7eb70ebfbe29\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.420532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8g28\" (UniqueName: \"kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28\") pod \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.420559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rgs\" (UniqueName: \"kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs\") pod \"e313a291-92ea-456b-9b16-7eb70ebfbe29\" (UID: \"e313a291-92ea-456b-9b16-7eb70ebfbe29\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.420649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts\") pod \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\" (UID: \"62358e70-13cb-4b80-8d09-f5ae4f43f6bd\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.421488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62358e70-13cb-4b80-8d09-f5ae4f43f6bd" (UID: "62358e70-13cb-4b80-8d09-f5ae4f43f6bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.422129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e313a291-92ea-456b-9b16-7eb70ebfbe29" (UID: "e313a291-92ea-456b-9b16-7eb70ebfbe29"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.430485 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.438847 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.473359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28" (OuterVolumeSpecName: "kube-api-access-q8g28") pod "62358e70-13cb-4b80-8d09-f5ae4f43f6bd" (UID: "62358e70-13cb-4b80-8d09-f5ae4f43f6bd"). InnerVolumeSpecName "kube-api-access-q8g28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.475259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs" (OuterVolumeSpecName: "kube-api-access-j6rgs") pod "e313a291-92ea-456b-9b16-7eb70ebfbe29" (UID: "e313a291-92ea-456b-9b16-7eb70ebfbe29"). InnerVolumeSpecName "kube-api-access-j6rgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.523333 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e313a291-92ea-456b-9b16-7eb70ebfbe29-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.523595 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8g28\" (UniqueName: \"kubernetes.io/projected/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-kube-api-access-q8g28\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.523606 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rgs\" (UniqueName: \"kubernetes.io/projected/e313a291-92ea-456b-9b16-7eb70ebfbe29-kube-api-access-j6rgs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.523651 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62358e70-13cb-4b80-8d09-f5ae4f43f6bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.527006 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.624982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62pl\" (UniqueName: \"kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl\") pod \"c079a7fc-dcd4-48da-bdd8-b47d93405531\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.625059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs\") pod \"c079a7fc-dcd4-48da-bdd8-b47d93405531\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.625187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle\") pod \"c079a7fc-dcd4-48da-bdd8-b47d93405531\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.625219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs\") pod \"c079a7fc-dcd4-48da-bdd8-b47d93405531\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.625256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data\") pod 
\"c079a7fc-dcd4-48da-bdd8-b47d93405531\" (UID: \"c079a7fc-dcd4-48da-bdd8-b47d93405531\") " Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.625992 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.626072 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data podName:59bcab9d-b9c0-40fe-8973-df39eeca1dd4 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:30.626050284 +0000 UTC m=+2642.946310652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data") pod "rabbitmq-server-1" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4") : configmap "rabbitmq-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.632118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl" (OuterVolumeSpecName: "kube-api-access-g62pl") pod "c079a7fc-dcd4-48da-bdd8-b47d93405531" (UID: "c079a7fc-dcd4-48da-bdd8-b47d93405531"). InnerVolumeSpecName "kube-api-access-g62pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.664901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data" (OuterVolumeSpecName: "config-data") pod "c079a7fc-dcd4-48da-bdd8-b47d93405531" (UID: "c079a7fc-dcd4-48da-bdd8-b47d93405531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.705047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c079a7fc-dcd4-48da-bdd8-b47d93405531" (UID: "c079a7fc-dcd4-48da-bdd8-b47d93405531"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.713875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "c079a7fc-dcd4-48da-bdd8-b47d93405531" (UID: "c079a7fc-dcd4-48da-bdd8-b47d93405531"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.728123 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.728155 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.728165 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.728175 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62pl\" (UniqueName: \"kubernetes.io/projected/c079a7fc-dcd4-48da-bdd8-b47d93405531-kube-api-access-g62pl\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.743175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "c079a7fc-dcd4-48da-bdd8-b47d93405531" (UID: "c079a7fc-dcd4-48da-bdd8-b47d93405531"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.829270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" event={"ID":"e313a291-92ea-456b-9b16-7eb70ebfbe29","Type":"ContainerDied","Data":"3449f28e7a6f6d192c0d67f2192a98400e5b03a94b2babc4cbb597521df34e37"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.829370 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.830267 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c079a7fc-dcd4-48da-bdd8-b47d93405531-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.830352 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/placement-config-data: secret "placement-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.830390 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data podName:fa4827b5-38d8-4647-a2bd-bfff1b91d34e nodeName:}" failed. No retries permitted until 2026-01-20 23:19:30.830377954 +0000 UTC m=+2643.150638242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data") pod "placement-847f86bdb-chr55" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e") : secret "placement-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.831034 5030 generic.go:334] "Generic (PLEG): container finished" podID="f482c09a-b782-4aa8-b776-29055767e21d" containerID="254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.831170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" event={"ID":"f482c09a-b782-4aa8-b776-29055767e21d","Type":"ContainerDied","Data":"254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.831196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" event={"ID":"f482c09a-b782-4aa8-b776-29055767e21d","Type":"ContainerStarted","Data":"851c266773abfc2753d4ed64389684d7caa4ecd9ef1e7d2e5c40505681e14f9e"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.864453 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="5f30fa9f5b0648ade3945014008c72e70964fe50f71f44800cc8bb9e7ffd26fd" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.864555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"5f30fa9f5b0648ade3945014008c72e70964fe50f71f44800cc8bb9e7ffd26fd"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.867758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" event={"ID":"62358e70-13cb-4b80-8d09-f5ae4f43f6bd","Type":"ContainerDied","Data":"69ca8a0f65213103b6432ff4ceb6b83389a392b8bb7a410045ccff616da55263"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.867828 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.888598 5030 generic.go:334] "Generic (PLEG): container finished" podID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerID="12ac3c08c01fa3165cd6cab4fc0e255fb705e419bf03d4f6f9041ab8c246bd09" exitCode=143 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.888665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerDied","Data":"12ac3c08c01fa3165cd6cab4fc0e255fb705e419bf03d4f6f9041ab8c246bd09"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.891155 5030 generic.go:334] "Generic (PLEG): container finished" podID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerID="c4a40a78281e11cd16875b9ee533f316223288e089e031dfb22cb627128bcb6c" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.891174 5030 generic.go:334] "Generic (PLEG): container finished" podID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerID="ec5e526c2cdd05718671ccd6ac4f385ffc8385a61f4c21bec9eea239556ebbbc" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.891238 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.891256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerDied","Data":"c4a40a78281e11cd16875b9ee533f316223288e089e031dfb22cb627128bcb6c"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.891269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerDied","Data":"ec5e526c2cdd05718671ccd6ac4f385ffc8385a61f4c21bec9eea239556ebbbc"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.893417 5030 generic.go:334] "Generic (PLEG): container finished" podID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerID="3a68c4f1688aff97806a60e8d2e05ae8b38410610f1b708c39ed861bf30a90fc" exitCode=1 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.893453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jjv69" event={"ID":"8d6c006d-67ff-426d-9203-5c769b474fa3","Type":"ContainerDied","Data":"3a68c4f1688aff97806a60e8d2e05ae8b38410610f1b708c39ed861bf30a90fc"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.893474 5030 scope.go:117] "RemoveContainer" containerID="7c83141a8f44daede2829c14c813ca55519c2208bddf5b310b332a5b4db4f332" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.894673 5030 scope.go:117] "RemoveContainer" containerID="3a68c4f1688aff97806a60e8d2e05ae8b38410610f1b708c39ed861bf30a90fc" Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.895014 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-jjv69_openstack-kuttl-tests(8d6c006d-67ff-426d-9203-5c769b474fa3)\"" pod="openstack-kuttl-tests/root-account-create-update-jjv69" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.905203 5030 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-35d5-account-create-update-vslxf"] Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.905788 5030 generic.go:334] "Generic (PLEG): container finished" podID="c079a7fc-dcd4-48da-bdd8-b47d93405531" containerID="7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.905864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c079a7fc-dcd4-48da-bdd8-b47d93405531","Type":"ContainerDied","Data":"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.905892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c079a7fc-dcd4-48da-bdd8-b47d93405531","Type":"ContainerDied","Data":"392688e607d62f33ed93e839116811ee0318a10dea920512c7284575812e16b1"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.905890 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.916772 5030 generic.go:334] "Generic (PLEG): container finished" podID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerID="05743f2bd663ee4c6959a35a5be0c01f08b3fea21e5994bca8e2faae3bb81e6a" exitCode=143 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.916889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerDied","Data":"05743f2bd663ee4c6959a35a5be0c01f08b3fea21e5994bca8e2faae3bb81e6a"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.923960 5030 generic.go:334] "Generic (PLEG): container finished" podID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerID="3b40fee42e01b5b734218bca90644c05dc6e1540c1debc8a4a06f30fcb8a9bf8" exitCode=0 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.924017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerDied","Data":"3b40fee42e01b5b734218bca90644c05dc6e1540c1debc8a4a06f30fcb8a9bf8"} Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.933150 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: E0120 23:19:26.933214 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data podName:56408fa4-b1eb-4a4d-bc8a-9ab82b88957f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:30.93319645 +0000 UTC m=+2643.253456818 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data") pod "rabbitmq-server-2" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f") : configmap "rabbitmq-config-data" not found Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.935931 5030 generic.go:334] "Generic (PLEG): container finished" podID="c0e11584-61c4-455a-9943-28b4228c8921" containerID="8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4" exitCode=143 Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.936172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerDied","Data":"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4"} Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.964124 5030 scope.go:117] "RemoveContainer" containerID="7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f" Jan 20 23:19:26 crc kubenswrapper[5030]: I0120 23:19:26.969606 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038262 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnqqx\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038666 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.038721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd\") pod \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\" (UID: \"24c9c78b-da19-4586-ae0c-97d5d63aeee2\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.061509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.063100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.063793 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.071236 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.079778 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.081513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx" (OuterVolumeSpecName: "kube-api-access-nnqqx") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "kube-api-access-nnqqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.083001 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.083064 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data podName:ed443431-2b5c-4533-8523-d01380c98c1d nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.083047363 +0000 UTC m=+2643.403307651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data") pod "rabbitmq-server-0" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d") : configmap "rabbitmq-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.093517 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.115224 5030 scope.go:117] "RemoveContainer" containerID="7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.119188 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f\": container with ID starting with 7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f not found: ID does not exist" containerID="7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.119241 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f"} err="failed to get container status \"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f\": rpc error: code = NotFound desc = could not find container \"7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f\": container with ID starting with 7ea9a82b9e61683bff33c64e59c082cc2b81940cd861351fcaf415abf328e32f not found: ID does not exist" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.131848 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.145093 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-e400-account-create-update-6mkqw"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.168731 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnqqx\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-kube-api-access-nnqqx\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.168757 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24c9c78b-da19-4586-ae0c-97d5d63aeee2-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.168766 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24c9c78b-da19-4586-ae0c-97d5d63aeee2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.201225 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data" (OuterVolumeSpecName: "config-data") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.202883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.225569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.228206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24c9c78b-da19-4586-ae0c-97d5d63aeee2" (UID: "24c9c78b-da19-4586-ae0c-97d5d63aeee2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.270509 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.270537 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.270547 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.270560 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24c9c78b-da19-4586-ae0c-97d5d63aeee2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.311104 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h78r\" (UniqueName: \"kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.371990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.372022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs\") pod \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\" (UID: \"1c5da8d3-64b4-4c3b-a154-58eaf9379610\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.376300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.377362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.378009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.378041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.384775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r" (OuterVolumeSpecName: "kube-api-access-2h78r") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "kube-api-access-2h78r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.409997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.410194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.426764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "1c5da8d3-64b4-4c3b-a154-58eaf9379610" (UID: "1c5da8d3-64b4-4c3b-a154-58eaf9379610"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.436768 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.462710 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476293 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476320 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476330 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476339 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h78r\" (UniqueName: \"kubernetes.io/projected/1c5da8d3-64b4-4c3b-a154-58eaf9379610-kube-api-access-2h78r\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476348 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476357 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1c5da8d3-64b4-4c3b-a154-58eaf9379610-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476374 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.476384 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c5da8d3-64b4-4c3b-a154-58eaf9379610-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.480939 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.492884 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.493159 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-central-agent" containerID="cri-o://b7437d316b2192f7896af5d93582c95337eb7084b35a8b488d5714c2c0ab610f" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.493282 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="proxy-httpd" containerID="cri-o://cb217008ccf67ac0890f7c0b296c6e0fbe72ac6353cd839700724550dc254a84" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.493318 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="sg-core" containerID="cri-o://5cb235fbc11dbb1f4a1b8e0b540880dc076e8771be46d898537d4c8df6eaa4dc" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.493349 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-notification-agent" containerID="cri-o://491a040f8ee5635c6af0dc1b66387a1b9706edbf38f9295a9f919a35c55cb5ba" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.530676 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.538884 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.539097 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" containerName="kube-state-metrics" containerID="cri-o://9a0f15aa4ffcb12861a489b33c1e7515d93c0502d191361b6c6737274054b795" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftm9p\" (UniqueName: \"kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p\") pod \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m59t\" (UniqueName: \"kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t\") pod \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnd7f\" (UniqueName: \"kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f\") pod \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\" (UID: 
\"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts\") pod \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\" (UID: \"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts\") pod \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\" (UID: \"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.580896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts\") pod \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\" (UID: \"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.581378 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.581799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d84d5a65-b170-4bad-aa8a-ddd90f8f48f9" (UID: "d84d5a65-b170-4bad-aa8a-ddd90f8f48f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.582012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" (UID: "131dfd04-f9d2-4fb7-b432-4f34f70cc4b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.582064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5bc4584-e61d-47a6-b76f-0ff39b47fe1f" (UID: "f5bc4584-e61d-47a6-b76f-0ff39b47fe1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.585816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f" (OuterVolumeSpecName: "kube-api-access-rnd7f") pod "d84d5a65-b170-4bad-aa8a-ddd90f8f48f9" (UID: "d84d5a65-b170-4bad-aa8a-ddd90f8f48f9"). InnerVolumeSpecName "kube-api-access-rnd7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.587999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t" (OuterVolumeSpecName: "kube-api-access-7m59t") pod "131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" (UID: "131dfd04-f9d2-4fb7-b432-4f34f70cc4b7"). 
InnerVolumeSpecName "kube-api-access-7m59t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.589418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p" (OuterVolumeSpecName: "kube-api-access-ftm9p") pod "f5bc4584-e61d-47a6-b76f-0ff39b47fe1f" (UID: "f5bc4584-e61d-47a6-b76f-0ff39b47fe1f"). InnerVolumeSpecName "kube-api-access-ftm9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.646078 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-tc777"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.651537 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.651796 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerName="memcached" containerID="cri-o://2b4f10ed0e4a73fb5cad8e46980901e7c694bff893bde9501737b27debf1b796" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.677118 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-tc777"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691342 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnd7f\" (UniqueName: \"kubernetes.io/projected/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-kube-api-access-rnd7f\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691380 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691392 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691404 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691416 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftm9p\" (UniqueName: \"kubernetes.io/projected/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f-kube-api-access-ftm9p\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.691427 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m59t\" (UniqueName: \"kubernetes.io/projected/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7-kube-api-access-7m59t\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.691532 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.691585 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c 
nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.691567826 +0000 UTC m=+2644.011828114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.691941 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scheduler-config-data: secret "cinder-scheduler-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.691982 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.691972136 +0000 UTC m=+2644.012232424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scheduler-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.697382 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-skt8h"] Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698042 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="mysql-bootstrap" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="mysql-bootstrap" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698085 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="ovn-northd" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="ovn-northd" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698110 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c079a7fc-dcd4-48da-bdd8-b47d93405531" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698118 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c079a7fc-dcd4-48da-bdd8-b47d93405531" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698140 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698147 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-server" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698176 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-server" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698191 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="galera" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698201 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="galera" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698225 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="ovsdbserver-nb" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698233 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="ovsdbserver-nb" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698246 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-httpd" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-httpd" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.698274 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698518 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-httpd" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698537 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698554 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="openstack-network-exporter" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698572 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c079a7fc-dcd4-48da-bdd8-b47d93405531" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698583 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" containerName="ovsdbserver-nb" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698603 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" containerName="proxy-server" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698635 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" containerName="ovn-northd" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.698648 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerName="galera" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.699601 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.711565 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.719835 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.719927 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-skt8h"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.731163 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-9v5m5"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.763126 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-t2s2w"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.770510 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-9v5m5"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.792074 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-t2s2w"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.793003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts\") pod \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.793111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4wr\" (UniqueName: \"kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr\") pod \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\" (UID: \"7fc63968-1e2b-46fb-b8f3-d4cfd459410c\") " Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.793364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px2mc\" (UniqueName: \"kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.793438 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.794064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fc63968-1e2b-46fb-b8f3-d4cfd459410c" (UID: "7fc63968-1e2b-46fb-b8f3-d4cfd459410c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.796078 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-1" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="galera" containerID="cri-o://a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" gracePeriod=28 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.797294 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.797484 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" podUID="09176751-6268-4f1b-9348-5adb2938e4ca" containerName="keystone-api" containerID="cri-o://5335c6175ed4a270a6a0424ed58b7a7123c2bee5625d69ab613e39861259cab6" gracePeriod=30 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.802415 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr" (OuterVolumeSpecName: "kube-api-access-ks4wr") pod "7fc63968-1e2b-46fb-b8f3-d4cfd459410c" (UID: "7fc63968-1e2b-46fb-b8f3-d4cfd459410c"). InnerVolumeSpecName "kube-api-access-ks4wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.811727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.818398 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.834789 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.844724 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-skt8h"] Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.845586 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-px2mc operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" podUID="147222a1-10a0-4701-81a3-dae6f96c9cd0" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.851914 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-4rdsx"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.860741 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-4rdsx"] Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.881882 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.897836 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.898328 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data podName:a54757d6-0570-4290-a802-d82ff94dffb9 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:19:31.898312404 +0000 UTC m=+2644.218572692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data") pod "rabbitmq-cell1-server-2" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.898893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2mc\" (UniqueName: \"kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.899047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.899209 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.899271 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4wr\" (UniqueName: \"kubernetes.io/projected/7fc63968-1e2b-46fb-b8f3-d4cfd459410c-kube-api-access-ks4wr\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899367 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899444 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data podName:995342f6-fe0b-4188-976c-5151541dd002 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.899434941 +0000 UTC m=+2644.219695229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data") pod "rabbitmq-cell1-server-0" (UID: "995342f6-fe0b-4188-976c-5151541dd002") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899583 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-scripts: secret "cinder-scripts" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899672 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.899663576 +0000 UTC m=+2644.219923864 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-scripts" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899762 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-api-config-data: secret "barbican-api-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899842 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom podName:40a68c1e-d515-4dac-a25a-d7ed5da4b15c nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.89983473 +0000 UTC m=+2644.220095018 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom") pod "barbican-api-84bf68df56-874dk" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c") : secret "barbican-api-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899924 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.899994 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data podName:65e9d298-7338-47c3-8591-b7803798203f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.899986584 +0000 UTC m=+2644.220246872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data") pod "rabbitmq-cell1-server-1" (UID: "65e9d298-7338-47c3-8591-b7803798203f") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.900256 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cinder-config-data: secret "cinder-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.900386 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data podName:74f7986d-211a-4216-b92f-ad900ddcc45f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.900378924 +0000 UTC m=+2644.220639202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data") pod "cinder-scheduler-0" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f") : secret "cinder-config-data" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.900353 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.900519 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:28.400512297 +0000 UTC m=+2640.720772585 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : configmap "openstack-scripts" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.902918 5030 projected.go:194] Error preparing data for projected volume kube-api-access-px2mc for pod openstack-kuttl-tests/keystone-3977-account-create-update-skt8h: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:27 crc kubenswrapper[5030]: E0120 23:19:27.902997 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:28.402978656 +0000 UTC m=+2640.723238944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-px2mc" (UniqueName: "kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.970744 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.977356 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148ae5bf-6f06-48b3-a9fd-04f99fbfebfe" path="/var/lib/kubelet/pods/148ae5bf-6f06-48b3-a9fd-04f99fbfebfe/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.978065 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="271b7d5f-4799-47ce-b503-0e41bfc34751" path="/var/lib/kubelet/pods/271b7d5f-4799-47ce-b503-0e41bfc34751/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.979021 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62358e70-13cb-4b80-8d09-f5ae4f43f6bd" path="/var/lib/kubelet/pods/62358e70-13cb-4b80-8d09-f5ae4f43f6bd/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.979689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683bb59f-0c0f-4f12-970b-b786e886cae3" path="/var/lib/kubelet/pods/683bb59f-0c0f-4f12-970b-b786e886cae3/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.981048 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97fdbcd3-4caf-4b83-b15a-a3dec765984f" path="/var/lib/kubelet/pods/97fdbcd3-4caf-4b83-b15a-a3dec765984f/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.981721 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c079a7fc-dcd4-48da-bdd8-b47d93405531" path="/var/lib/kubelet/pods/c079a7fc-dcd4-48da-bdd8-b47d93405531/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.982457 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca320b58-7251-428b-a18b-ba5c632eb546" path="/var/lib/kubelet/pods/ca320b58-7251-428b-a18b-ba5c632eb546/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.983467 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b5a2a9-c6db-4d98-b203-5503e4b7075c" path="/var/lib/kubelet/pods/d7b5a2a9-c6db-4d98-b203-5503e4b7075c/volumes" Jan 20 23:19:27 crc 
kubenswrapper[5030]: I0120 23:19:27.984030 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e313a291-92ea-456b-9b16-7eb70ebfbe29" path="/var/lib/kubelet/pods/e313a291-92ea-456b-9b16-7eb70ebfbe29/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.984325 5030 generic.go:334] "Generic (PLEG): container finished" podID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerID="cb217008ccf67ac0890f7c0b296c6e0fbe72ac6353cd839700724550dc254a84" exitCode=0 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.984347 5030 generic.go:334] "Generic (PLEG): container finished" podID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerID="5cb235fbc11dbb1f4a1b8e0b540880dc076e8771be46d898537d4c8df6eaa4dc" exitCode=2 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.984459 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef09e13-a7ab-4e00-a12c-fb544a0791ba" path="/var/lib/kubelet/pods/eef09e13-a7ab-4e00-a12c-fb544a0791ba/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.985394 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74ad968-4584-45a0-b1e9-783c98abb6f4" path="/var/lib/kubelet/pods/f74ad968-4584-45a0-b1e9-783c98abb6f4/volumes" Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.987242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" event={"ID":"131dfd04-f9d2-4fb7-b432-4f34f70cc4b7","Type":"ContainerDied","Data":"f8559d3e07e568f2d4b62436d82a1621448096e0ee9a867a98d54f48744b46f2"} Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.987264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerDied","Data":"cb217008ccf67ac0890f7c0b296c6e0fbe72ac6353cd839700724550dc254a84"} Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.987277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerDied","Data":"5cb235fbc11dbb1f4a1b8e0b540880dc076e8771be46d898537d4c8df6eaa4dc"} Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.996184 5030 generic.go:334] "Generic (PLEG): container finished" podID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" containerID="9a0f15aa4ffcb12861a489b33c1e7515d93c0502d191361b6c6737274054b795" exitCode=2 Jan 20 23:19:27 crc kubenswrapper[5030]: I0120 23:19:27.996281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657","Type":"ContainerDied","Data":"9a0f15aa4ffcb12861a489b33c1e7515d93c0502d191361b6c6737274054b795"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.005236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" event={"ID":"f482c09a-b782-4aa8-b776-29055767e21d","Type":"ContainerStarted","Data":"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.005526 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.010030 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.010008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv" event={"ID":"d84d5a65-b170-4bad-aa8a-ddd90f8f48f9","Type":"ContainerDied","Data":"f7af5852f843e702eb374830066e571af883b75be3b5711f1447731388122dd0"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.016771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" event={"ID":"7fc63968-1e2b-46fb-b8f3-d4cfd459410c","Type":"ContainerDied","Data":"39e1c77fc43ef513673176366b4a66fb35e92b56f1fd1811af8d55b5fd788004"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.016829 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.055118 5030 generic.go:334] "Generic (PLEG): container finished" podID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerID="50dad30f02e2ad71b34ca576e9d068767ee44b13ac49b20df2f355b3c50adad3" exitCode=0 Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.055768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerDied","Data":"50dad30f02e2ad71b34ca576e9d068767ee44b13ac49b20df2f355b3c50adad3"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.057684 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" podStartSLOduration=4.057654685 podStartE2EDuration="4.057654685s" podCreationTimestamp="2026-01-20 23:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:19:28.027719827 +0000 UTC m=+2640.347980115" watchObservedRunningTime="2026-01-20 23:19:28.057654685 +0000 UTC m=+2640.377914973" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.062267 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" containerID="43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7" exitCode=0 Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.062675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerDied","Data":"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.062723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-2" event={"ID":"1c5da8d3-64b4-4c3b-a154-58eaf9379610","Type":"ContainerDied","Data":"dd86c9ea545cf980466c9bb41e8a660751838f63337ce1194166b6842f3b0ea6"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.062752 5030 scope.go:117] "RemoveContainer" containerID="43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.062960 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-2" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.078311 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerID="7af3824becb0a8b511712e8c071c2a7723b825950e7dda89578b7c6ba4037ce4" exitCode=0 Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.078412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerDied","Data":"7af3824becb0a8b511712e8c071c2a7723b825950e7dda89578b7c6ba4037ce4"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.124386 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-jjv69" secret="" err="secret \"galera-openstack-dockercfg-zn7wk\" not found" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.124437 5030 scope.go:117] "RemoveContainer" containerID="3a68c4f1688aff97806a60e8d2e05ae8b38410610f1b708c39ed861bf30a90fc" Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.124656 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-jjv69_openstack-kuttl-tests(8d6c006d-67ff-426d-9203-5c769b474fa3)\"" pod="openstack-kuttl-tests/root-account-create-update-jjv69" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.126502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" event={"ID":"f5bc4584-e61d-47a6-b76f-0ff39b47fe1f","Type":"ContainerDied","Data":"b76c22c9bfa805c53d87eeb3f9212726eb23cdef997a0f13a87b245e436d19af"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.126884 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-1026-account-create-update-h84cz" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.136630 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.136751 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.136764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6" event={"ID":"24c9c78b-da19-4586-ae0c-97d5d63aeee2","Type":"ContainerDied","Data":"7225b48e4f67fbe319c8386b782b5de8014d9ab4d4a0b86b9145f610062514ac"} Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.146859 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-2" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="galera" containerID="cri-o://7e4d3b88e2537910ea5a1bb30373dd22a7907ac81c0ca61d89885761256dedd0" gracePeriod=30 Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.228120 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.228204 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts podName:8d6c006d-67ff-426d-9203-5c769b474fa3 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:28.728184425 +0000 UTC m=+2641.048444713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts") pod "root-account-create-update-jjv69" (UID: "8d6c006d-67ff-426d-9203-5c769b474fa3") : configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.430532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2mc\" (UniqueName: \"kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.430605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.430755 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.430803 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:29.430786603 +0000 UTC m=+2641.751046891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.435354 5030 projected.go:194] Error preparing data for projected volume kube-api-access-px2mc for pod openstack-kuttl-tests/keystone-3977-account-create-update-skt8h: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.435409 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:29.435392734 +0000 UTC m=+2641.755653022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-px2mc" (UniqueName: "kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.451728 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:58384->10.217.0.191:8775: read: connection reset by peer" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.451800 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:58372->10.217.0.191:8775: read: connection reset by peer" Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.510105 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.512198 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.514804 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.514840 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" probeType="Readiness" pod="openstack-kuttl-tests/openstack-cell1-galera-1" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="galera" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.636344 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.641791 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.680207 5030 scope.go:117] "RemoveContainer" containerID="42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.698346 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.706798 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.713282 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.715400 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-2"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.718264 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.734384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config\") pod \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.734451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle\") pod \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.734497 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs\") pod \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.734525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxmq\" (UniqueName: \"kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq\") pod \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\" (UID: \"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657\") " Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.737046 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.737115 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts podName:8d6c006d-67ff-426d-9203-5c769b474fa3 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:29.737096978 +0000 UTC m=+2642.057357266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts") pod "root-account-create-update-jjv69" (UID: "8d6c006d-67ff-426d-9203-5c769b474fa3") : configmap "openstack-scripts" not found Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.737477 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.739931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq" (OuterVolumeSpecName: "kube-api-access-2mxmq") pod "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" (UID: "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657"). InnerVolumeSpecName "kube-api-access-2mxmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.742015 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-83e2-account-create-update-bwpfv"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.755721 5030 scope.go:117] "RemoveContainer" containerID="43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7" Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.757973 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7\": container with ID starting with 43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7 not found: ID does not exist" containerID="43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.758003 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7"} err="failed to get container status \"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7\": rpc error: code = NotFound desc = could not find container \"43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7\": container with ID starting with 43baa2367474f77b4fdfa479c6f54d61e03942a2fcd5b0dd9dcd42e7366551d7 not found: ID does not exist" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.758023 5030 scope.go:117] "RemoveContainer" containerID="42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8" Jan 20 23:19:28 crc kubenswrapper[5030]: E0120 23:19:28.758550 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8\": container with ID starting with 42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8 not found: ID does not exist" containerID="42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.758576 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8"} err="failed to get container status 
\"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8\": rpc error: code = NotFound desc = could not find container \"42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8\": container with ID starting with 42e0535d6cdf00203b6a7f7cf05556132480c1a17ee95f17af617ecebd0114f8 not found: ID does not exist" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.758593 5030 scope.go:117] "RemoveContainer" containerID="c4a40a78281e11cd16875b9ee533f316223288e089e031dfb22cb627128bcb6c" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.772606 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.779391 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-e5c6-account-create-update-lckcf"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.792762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" (UID: "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.796732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" (UID: "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.829035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" (UID: "27f2c51d-7595-4b7f-bdfb-3e50f0a6c657"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.830412 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.838555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbsf\" (UniqueName: \"kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.838603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8sk7\" (UniqueName: \"kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkrq6\" (UniqueName: \"kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841512 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841571 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.841633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842001 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842054 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs\") pod \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\" (UID: \"c16c5bc4-fe32-4711-a21f-c6e7919d41d3\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842106 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842123 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" 
(UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs\") pod \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\" (UID: \"bd75cb8e-9fcd-4465-88fb-49e3600608dc\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842675 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842696 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842706 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.842716 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxmq\" (UniqueName: \"kubernetes.io/projected/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657-kube-api-access-2mxmq\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.845162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs" (OuterVolumeSpecName: "logs") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.845541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf" (OuterVolumeSpecName: "kube-api-access-qvbsf") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "kube-api-access-qvbsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.846449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs" (OuterVolumeSpecName: "logs") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.846657 5030 scope.go:117] "RemoveContainer" containerID="ec5e526c2cdd05718671ccd6ac4f385ffc8385a61f4c21bec9eea239556ebbbc" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.846891 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.847491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.847970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.852477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6" (OuterVolumeSpecName: "kube-api-access-kkrq6") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "kube-api-access-kkrq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.864291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts" (OuterVolumeSpecName: "scripts") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.867039 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs" (OuterVolumeSpecName: "logs") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.870644 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-ffdf777d-qspf6"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.876589 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts" (OuterVolumeSpecName: "scripts") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.881316 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.881391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts" (OuterVolumeSpecName: "scripts") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.888632 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.891418 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.897138 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7" (OuterVolumeSpecName: "kube-api-access-m8sk7") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "kube-api-access-m8sk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.897495 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-1026-account-create-update-h84cz"] Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86rdw\" (UniqueName: \"kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944475 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.944614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs\") pod \"b175c0c8-ee1a-404d-9a14-d37744a24ece\" (UID: \"b175c0c8-ee1a-404d-9a14-d37744a24ece\") " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945269 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkrq6\" (UniqueName: \"kubernetes.io/projected/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-kube-api-access-kkrq6\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945282 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945291 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945310 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945320 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945329 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945337 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945345 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd75cb8e-9fcd-4465-88fb-49e3600608dc-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945352 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945360 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvbsf\" (UniqueName: \"kubernetes.io/projected/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-kube-api-access-qvbsf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945370 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945402 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.945412 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8sk7\" (UniqueName: \"kubernetes.io/projected/bd75cb8e-9fcd-4465-88fb-49e3600608dc-kube-api-access-m8sk7\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.948373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.951614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs" (OuterVolumeSpecName: "logs") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.957584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data" (OuterVolumeSpecName: "config-data") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.959582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw" (OuterVolumeSpecName: "kube-api-access-86rdw") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "kube-api-access-86rdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.959591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts" (OuterVolumeSpecName: "scripts") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.964234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.979137 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Jan 20 23:19:28 crc kubenswrapper[5030]: I0120 23:19:28.979404 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.153:9311/healthcheck\": dial tcp 10.217.0.153:9311: connect: connection refused" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.016494 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048004 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86rdw\" (UniqueName: \"kubernetes.io/projected/b175c0c8-ee1a-404d-9a14-d37744a24ece-kube-api-access-86rdw\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048026 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048037 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b175c0c8-ee1a-404d-9a14-d37744a24ece-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048045 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048055 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048064 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.048072 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b175c0c8-ee1a-404d-9a14-d37744a24ece-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.066849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.081096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.124850 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.155103 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.155136 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.155146 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.166303 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerID="63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.166359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerDied","Data":"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.166429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" event={"ID":"fa4827b5-38d8-4647-a2bd-bfff1b91d34e","Type":"ContainerDied","Data":"1ad82cb309994355dd75bacb6def2fb63cb49faec561af7264c273e18bc1ab4f"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.166447 5030 scope.go:117] "RemoveContainer" containerID="63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.166545 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-847f86bdb-chr55" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.172188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.174646 5030 generic.go:334] "Generic (PLEG): container finished" podID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerID="b7437d316b2192f7896af5d93582c95337eb7084b35a8b488d5714c2c0ab610f" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.174708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerDied","Data":"b7437d316b2192f7896af5d93582c95337eb7084b35a8b488d5714c2c0ab610f"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.181254 5030 generic.go:334] "Generic (PLEG): container finished" podID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerID="2b4f10ed0e4a73fb5cad8e46980901e7c694bff893bde9501737b27debf1b796" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.181402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"8dfaa83b-13e1-4a2f-a722-b0583f2641ed","Type":"ContainerDied","Data":"2b4f10ed0e4a73fb5cad8e46980901e7c694bff893bde9501737b27debf1b796"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.194231 5030 generic.go:334] "Generic (PLEG): container finished" podID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerID="70fb88353fc70c3b9dab582d23bbede418372d313927da95f856408d4d739fad" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.194428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerDied","Data":"70fb88353fc70c3b9dab582d23bbede418372d313927da95f856408d4d739fad"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.199233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c16c5bc4-fe32-4711-a21f-c6e7919d41d3","Type":"ContainerDied","Data":"0e2d70adc03803d4f2fa32b277855266f614ffbd221e6dbc4f92e76fc69947a2"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.199327 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.200152 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.200525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.207427 5030 generic.go:334] "Generic (PLEG): container finished" podID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerID="d64a394c4f184d65be789b464e93507d0d51cff95adc8f21cab504aa94390ded" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.207523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerDied","Data":"d64a394c4f184d65be789b464e93507d0d51cff95adc8f21cab504aa94390ded"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.211085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"27f2c51d-7595-4b7f-bdfb-3e50f0a6c657","Type":"ContainerDied","Data":"108c9acd81ab4f0728753e29bcf10ec789e4a9075dd58deafcfa9b957458cbfd"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.211169 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.214432 5030 generic.go:334] "Generic (PLEG): container finished" podID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerID="494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.214596 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.214679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerDied","Data":"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.215409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data" (OuterVolumeSpecName: "config-data") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.216180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"b175c0c8-ee1a-404d-9a14-d37744a24ece","Type":"ContainerDied","Data":"cec0f26abf2ae33583bcab23496ac4f774202bf153430ca20224a7ba96e017f5"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.218560 5030 generic.go:334] "Generic (PLEG): container finished" podID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerID="88a2387b417e90bfd501bc833d3ec1e5d951fed92f4116f90417ad981562c482" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.218608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerDied","Data":"88a2387b417e90bfd501bc833d3ec1e5d951fed92f4116f90417ad981562c482"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.218635 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" event={"ID":"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0","Type":"ContainerDied","Data":"dd7bc71f52051bd49bf081bebfad56ce7784ffdb02bda65b8f96c08631242c97"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.218647 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7bc71f52051bd49bf081bebfad56ce7784ffdb02bda65b8f96c08631242c97" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.219674 5030 generic.go:334] "Generic (PLEG): container finished" podID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerID="300cae00e1e96594e491715916fe7d890923b30fd9931bc6ed9c3ee0ac8dbeaf" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.219704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerDied","Data":"300cae00e1e96594e491715916fe7d890923b30fd9931bc6ed9c3ee0ac8dbeaf"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.219717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6bb6b715-a2a6-4c25-83c3-315fd30b8d92","Type":"ContainerDied","Data":"7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.219725 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b40e3ba80df28de126301f4970e88e7614fffe14d06733faadcf44b74b9deab" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.221965 5030 generic.go:334] "Generic (PLEG): container finished" podID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerID="6937c695cff1667d0d62ce1c83b0152b926dae8c296b22eca0b03bd5169765f4" exitCode=0 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.222018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerDied","Data":"6937c695cff1667d0d62ce1c83b0152b926dae8c296b22eca0b03bd5169765f4"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.228631 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.232811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd75cb8e-9fcd-4465-88fb-49e3600608dc" (UID: "bd75cb8e-9fcd-4465-88fb-49e3600608dc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.232849 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.232880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"bd75cb8e-9fcd-4465-88fb-49e3600608dc","Type":"ContainerDied","Data":"ff631c9ca80cc4a02c30452998625385af68b600a57971ffca78e17e35a60100"} Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.236352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data" (OuterVolumeSpecName: "config-data") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.248225 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data" (OuterVolumeSpecName: "config-data") pod "b175c0c8-ee1a-404d-9a14-d37744a24ece" (UID: "b175c0c8-ee1a-404d-9a14-d37744a24ece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.258605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.261876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") pod \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\" (UID: \"fa4827b5-38d8-4647-a2bd-bfff1b91d34e\") " Jan 20 23:19:29 crc kubenswrapper[5030]: W0120 23:19:29.262403 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fa4827b5-38d8-4647-a2bd-bfff1b91d34e/volumes/kubernetes.io~secret/combined-ca-bundle Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.262427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263063 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263079 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263089 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263098 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263107 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b175c0c8-ee1a-404d-9a14-d37744a24ece-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263115 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd75cb8e-9fcd-4465-88fb-49e3600608dc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263124 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.263132 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.267096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c16c5bc4-fe32-4711-a21f-c6e7919d41d3" (UID: "c16c5bc4-fe32-4711-a21f-c6e7919d41d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.295240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.315392 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f is running failed: container process not found" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.316696 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f is running failed: container process not found" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.318018 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f is running failed: container process not found" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.318066 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0863eb67-3ead-4744-b338-aaf75284e458" containerName="nova-cell1-conductor-conductor" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.360189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa4827b5-38d8-4647-a2bd-bfff1b91d34e" (UID: "fa4827b5-38d8-4647-a2bd-bfff1b91d34e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.366054 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.366083 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa4827b5-38d8-4647-a2bd-bfff1b91d34e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.366093 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16c5bc4-fe32-4711-a21f-c6e7919d41d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.467221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2mc\" (UniqueName: \"kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.467393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts\") pod \"keystone-3977-account-create-update-skt8h\" (UID: \"147222a1-10a0-4701-81a3-dae6f96c9cd0\") " pod="openstack-kuttl-tests/keystone-3977-account-create-update-skt8h" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.467673 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.467792 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.467775581 +0000 UTC m=+2643.788035869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : configmap "openstack-scripts" not found Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.472248 5030 projected.go:194] Error preparing data for projected volume kube-api-access-px2mc for pod openstack-kuttl-tests/keystone-3977-account-create-update-skt8h: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.472360 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc podName:147222a1-10a0-4701-81a3-dae6f96c9cd0 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.472349341 +0000 UTC m=+2643.792609629 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-px2mc" (UniqueName: "kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc") pod "keystone-3977-account-create-update-skt8h" (UID: "147222a1-10a0-4701-81a3-dae6f96c9cd0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.541645 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.572999 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.612501 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.616391 5030 scope.go:117] "RemoveContainer" containerID="195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.645000 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.671965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47j9g\" (UniqueName: \"kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs\") pod \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom\") pod \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data\") pod \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs\") pod \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs\") pod \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data\") pod \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672302 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672345 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q95j9\" (UniqueName: \"kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9\") pod \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kjtp\" (UniqueName: \"kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp\") pod \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672407 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle\") pod \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\" (UID: \"6bb6b715-a2a6-4c25-83c3-315fd30b8d92\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle\") pod \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\" (UID: \"ea3fc979-4f14-4578-9cae-d44c44ba3e8c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.672510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle\") pod \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\" (UID: \"4dc45225-0ecd-4a1c-9439-dc3c8385e5b0\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.674032 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.674902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs" (OuterVolumeSpecName: "logs") pod "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" (UID: "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.675393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs" (OuterVolumeSpecName: "logs") pod "ea3fc979-4f14-4578-9cae-d44c44ba3e8c" (UID: "ea3fc979-4f14-4578-9cae-d44c44ba3e8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.677350 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.683263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs" (OuterVolumeSpecName: "logs") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.684836 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.691856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp" (OuterVolumeSpecName: "kube-api-access-9kjtp") pod "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" (UID: "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0"). InnerVolumeSpecName "kube-api-access-9kjtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.693335 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9" (OuterVolumeSpecName: "kube-api-access-q95j9") pod "ea3fc979-4f14-4578-9cae-d44c44ba3e8c" (UID: "ea3fc979-4f14-4578-9cae-d44c44ba3e8c"). InnerVolumeSpecName "kube-api-access-q95j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.695009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" (UID: "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.723130 5030 scope.go:117] "RemoveContainer" containerID="63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.728372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g" (OuterVolumeSpecName: "kube-api-access-47j9g") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "kube-api-access-47j9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.728541 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.729031 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd\": container with ID starting with 63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd not found: ID does not exist" containerID="63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.729072 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd"} err="failed to get container status \"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd\": rpc error: code = NotFound desc = could not find container \"63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd\": container with ID starting with 63cf8be9dcc2605f51f2d1da4be3b07a3a302bb4c83d0655aed8497ddf7630fd not found: ID does not exist" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.729096 5030 scope.go:117] "RemoveContainer" containerID="195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.729387 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c\": container with ID starting with 195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c not found: ID does not exist" containerID="195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.729485 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c"} err="failed to get container status \"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c\": rpc error: code = NotFound desc = could not find container \"195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c\": container with ID starting with 195092f4ac72d7d533ff5382326146aa18ffe894548f20c28141674be97ab59c not found: ID does not exist" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.729591 5030 scope.go:117] "RemoveContainer" containerID="50dad30f02e2ad71b34ca576e9d068767ee44b13ac49b20df2f355b3c50adad3" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.730608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.743872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.744694 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.751695 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.767133 5030 scope.go:117] "RemoveContainer" containerID="e4df01fc1632a811eb3b665c2886fe79657be01644364c0ace65e378ddb318ce" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.773468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle\") pod \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdk8\" (UniqueName: \"kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8\") pod \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config\") pod \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpfnk\" (UniqueName: \"kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.774574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data\") pod \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.775243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "8dfaa83b-13e1-4a2f-a722-b0583f2641ed" (UID: "8dfaa83b-13e1-4a2f-a722-b0583f2641ed"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.777000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.781012 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-skt8h"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.782179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmlv\" (UniqueName: \"kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.782216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle\") pod \"74f7986d-211a-4216-b92f-ad900ddcc45f\" (UID: \"74f7986d-211a-4216-b92f-ad900ddcc45f\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.782243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs\") pod \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\" (UID: \"40a68c1e-d515-4dac-a25a-d7ed5da4b15c\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.782265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs\") pod \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\" (UID: \"8dfaa83b-13e1-4a2f-a722-b0583f2641ed\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.782646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs" (OuterVolumeSpecName: "logs") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.783972 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.784035 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts podName:8d6c006d-67ff-426d-9203-5c769b474fa3 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:31.784013664 +0000 UTC m=+2644.104274032 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts") pod "root-account-create-update-jjv69" (UID: "8d6c006d-67ff-426d-9203-5c769b474fa3") : configmap "openstack-scripts" not found Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data" (OuterVolumeSpecName: "config-data") pod "8dfaa83b-13e1-4a2f-a722-b0583f2641ed" (UID: "8dfaa83b-13e1-4a2f-a722-b0583f2641ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785401 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785426 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785440 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785452 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785464 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q95j9\" (UniqueName: \"kubernetes.io/projected/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-kube-api-access-q95j9\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785478 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kjtp\" (UniqueName: \"kubernetes.io/projected/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-kube-api-access-9kjtp\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785489 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785500 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785513 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47j9g\" (UniqueName: \"kubernetes.io/projected/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-kube-api-access-47j9g\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785522 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74f7986d-211a-4216-b92f-ad900ddcc45f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.785531 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.788870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.807781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts" (OuterVolumeSpecName: "scripts") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.812201 5030 scope.go:117] "RemoveContainer" containerID="9a0f15aa4ffcb12861a489b33c1e7515d93c0502d191361b6c6737274054b795" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.814135 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8" (OuterVolumeSpecName: "kube-api-access-5mdk8") pod "8dfaa83b-13e1-4a2f-a722-b0583f2641ed" (UID: "8dfaa83b-13e1-4a2f-a722-b0583f2641ed"). InnerVolumeSpecName "kube-api-access-5mdk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.816385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data" (OuterVolumeSpecName: "config-data") pod "ea3fc979-4f14-4578-9cae-d44c44ba3e8c" (UID: "ea3fc979-4f14-4578-9cae-d44c44ba3e8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.817182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk" (OuterVolumeSpecName: "kube-api-access-cpfnk") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "kube-api-access-cpfnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.817942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.824563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea3fc979-4f14-4578-9cae-d44c44ba3e8c" (UID: "ea3fc979-4f14-4578-9cae-d44c44ba3e8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.825145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv" (OuterVolumeSpecName: "kube-api-access-gwmlv") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "kube-api-access-gwmlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.826365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data" (OuterVolumeSpecName: "config-data") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.827028 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="galera" containerID="cri-o://f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927" gracePeriod=26 Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.834180 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-3977-account-create-update-skt8h"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.839422 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" (UID: "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.840997 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.847135 5030 scope.go:117] "RemoveContainer" containerID="494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.850815 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-847f86bdb-chr55"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.866535 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.868558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.871578 5030 scope.go:117] "RemoveContainer" containerID="3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.872355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.878389 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.887321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts\") pod \"8d6c006d-67ff-426d-9203-5c769b474fa3\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.887396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28ng7\" (UniqueName: \"kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7\") pod \"8d6c006d-67ff-426d-9203-5c769b474fa3\" (UID: \"8d6c006d-67ff-426d-9203-5c769b474fa3\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.887589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztwkh\" (UniqueName: \"kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh\") pod \"0863eb67-3ead-4744-b338-aaf75284e458\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.887692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data\") pod \"0863eb67-3ead-4744-b338-aaf75284e458\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.887762 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle\") pod \"0863eb67-3ead-4744-b338-aaf75284e458\" (UID: \"0863eb67-3ead-4744-b338-aaf75284e458\") " Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888317 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmlv\" (UniqueName: \"kubernetes.io/projected/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-kube-api-access-gwmlv\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888337 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888346 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdk8\" (UniqueName: \"kubernetes.io/projected/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-kube-api-access-5mdk8\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888354 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888363 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888371 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888380 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888391 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888400 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888408 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpfnk\" (UniqueName: \"kubernetes.io/projected/74f7986d-211a-4216-b92f-ad900ddcc45f-kube-api-access-cpfnk\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888417 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888425 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.888434 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.904050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d6c006d-67ff-426d-9203-5c769b474fa3" (UID: "8d6c006d-67ff-426d-9203-5c769b474fa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.927556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh" (OuterVolumeSpecName: "kube-api-access-ztwkh") pod "0863eb67-3ead-4744-b338-aaf75284e458" (UID: "0863eb67-3ead-4744-b338-aaf75284e458"). InnerVolumeSpecName "kube-api-access-ztwkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.928821 5030 scope.go:117] "RemoveContainer" containerID="494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.929712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7" (OuterVolumeSpecName: "kube-api-access-28ng7") pod "8d6c006d-67ff-426d-9203-5c769b474fa3" (UID: "8d6c006d-67ff-426d-9203-5c769b474fa3"). InnerVolumeSpecName "kube-api-access-28ng7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.931270 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9\": container with ID starting with 494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9 not found: ID does not exist" containerID="494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.931368 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9"} err="failed to get container status \"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9\": rpc error: code = NotFound desc = could not find container \"494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9\": container with ID starting with 494d5cde95754e39df3e175cb36cce23395a3273fbb92ee15c97c0a4ba5fcaa9 not found: ID does not exist" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.931471 5030 scope.go:117] "RemoveContainer" containerID="3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.931955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dfaa83b-13e1-4a2f-a722-b0583f2641ed" (UID: "8dfaa83b-13e1-4a2f-a722-b0583f2641ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: E0120 23:19:29.932996 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee\": container with ID starting with 3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee not found: ID does not exist" containerID="3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.933037 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee"} err="failed to get container status \"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee\": rpc error: code = NotFound desc = could not find container \"3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee\": container with ID starting with 3a9460c9a35ed585df222f0319fca4cbe21024e047ae17ac43a0303e76f4c0ee not found: ID does not exist" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.933063 5030 scope.go:117] "RemoveContainer" containerID="7af3824becb0a8b511712e8c071c2a7723b825950e7dda89578b7c6ba4037ce4" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.936009 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.941880 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.962266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs" 
(OuterVolumeSpecName: "internal-tls-certs") pod "6bb6b715-a2a6-4c25-83c3-315fd30b8d92" (UID: "6bb6b715-a2a6-4c25-83c3-315fd30b8d92"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.967301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data" (OuterVolumeSpecName: "config-data") pod "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" (UID: "4dc45225-0ecd-4a1c-9439-dc3c8385e5b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.972284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data" (OuterVolumeSpecName: "config-data") pod "0863eb67-3ead-4744-b338-aaf75284e458" (UID: "0863eb67-3ead-4744-b338-aaf75284e458"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.973148 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147222a1-10a0-4701-81a3-dae6f96c9cd0" path="/var/lib/kubelet/pods/147222a1-10a0-4701-81a3-dae6f96c9cd0/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.973593 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5da8d3-64b4-4c3b-a154-58eaf9379610" path="/var/lib/kubelet/pods/1c5da8d3-64b4-4c3b-a154-58eaf9379610/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.974191 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24c9c78b-da19-4586-ae0c-97d5d63aeee2" path="/var/lib/kubelet/pods/24c9c78b-da19-4586-ae0c-97d5d63aeee2/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.975313 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" path="/var/lib/kubelet/pods/27f2c51d-7595-4b7f-bdfb-3e50f0a6c657/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.975813 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc63968-1e2b-46fb-b8f3-d4cfd459410c" path="/var/lib/kubelet/pods/7fc63968-1e2b-46fb-b8f3-d4cfd459410c/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.976191 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" path="/var/lib/kubelet/pods/b175c0c8-ee1a-404d-9a14-d37744a24ece/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.976692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.976929 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" path="/var/lib/kubelet/pods/bd75cb8e-9fcd-4465-88fb-49e3600608dc/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.978411 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" path="/var/lib/kubelet/pods/c16c5bc4-fe32-4711-a21f-c6e7919d41d3/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.978961 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84d5a65-b170-4bad-aa8a-ddd90f8f48f9" path="/var/lib/kubelet/pods/d84d5a65-b170-4bad-aa8a-ddd90f8f48f9/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.980004 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5bc4584-e61d-47a6-b76f-0ff39b47fe1f" path="/var/lib/kubelet/pods/f5bc4584-e61d-47a6-b76f-0ff39b47fe1f/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.981162 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" path="/var/lib/kubelet/pods/fa4827b5-38d8-4647-a2bd-bfff1b91d34e/volumes" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.982559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "8dfaa83b-13e1-4a2f-a722-b0583f2641ed" (UID: "8dfaa83b-13e1-4a2f-a722-b0583f2641ed"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0863eb67-3ead-4744-b338-aaf75284e458" (UID: "0863eb67-3ead-4744-b338-aaf75284e458"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991658 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991683 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfaa83b-13e1-4a2f-a722-b0583f2641ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991693 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztwkh\" (UniqueName: \"kubernetes.io/projected/0863eb67-3ead-4744-b338-aaf75284e458-kube-api-access-ztwkh\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991704 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991712 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0863eb67-3ead-4744-b338-aaf75284e458-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991720 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991730 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb6b715-a2a6-4c25-83c3-315fd30b8d92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991739 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d6c006d-67ff-426d-9203-5c769b474fa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991747 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28ng7\" (UniqueName: \"kubernetes.io/projected/8d6c006d-67ff-426d-9203-5c769b474fa3-kube-api-access-28ng7\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991756 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147222a1-10a0-4701-81a3-dae6f96c9cd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991764 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px2mc\" (UniqueName: \"kubernetes.io/projected/147222a1-10a0-4701-81a3-dae6f96c9cd0-kube-api-access-px2mc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:29 crc kubenswrapper[5030]: I0120 23:19:29.991772 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.005833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs" 
(OuterVolumeSpecName: "public-tls-certs") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.011215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data" (OuterVolumeSpecName: "config-data") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.015645 5030 scope.go:117] "RemoveContainer" containerID="744df9f6f8632e4bde232c473af1353285e6fa32962770ad640031ef5ec3543a" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.024106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ea3fc979-4f14-4578-9cae-d44c44ba3e8c" (UID: "ea3fc979-4f14-4578-9cae-d44c44ba3e8c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.041453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "40a68c1e-d515-4dac-a25a-d7ed5da4b15c" (UID: "40a68c1e-d515-4dac-a25a-d7ed5da4b15c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.075003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data" (OuterVolumeSpecName: "config-data") pod "74f7986d-211a-4216-b92f-ad900ddcc45f" (UID: "74f7986d-211a-4216-b92f-ad900ddcc45f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.075163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-1" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="galera" containerID="cri-o://6970aa8432c4959152b2e372b5408ff4682d8c4aae0f06b65a11779ac7a8e3a0" gracePeriod=28 Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.093317 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.093351 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.093362 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40a68c1e-d515-4dac-a25a-d7ed5da4b15c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.093372 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f7986d-211a-4216-b92f-ad900ddcc45f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.093384 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea3fc979-4f14-4578-9cae-d44c44ba3e8c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.247225 5030 generic.go:334] "Generic (PLEG): container finished" podID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerID="7e4d3b88e2537910ea5a1bb30373dd22a7907ac81c0ca61d89885761256dedd0" exitCode=0 Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.247435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerDied","Data":"7e4d3b88e2537910ea5a1bb30373dd22a7907ac81c0ca61d89885761256dedd0"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.250992 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.250897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-84bf68df56-874dk" event={"ID":"40a68c1e-d515-4dac-a25a-d7ed5da4b15c","Type":"ContainerDied","Data":"89553af5727ecc437b2d1ea8da1cfc3196096efa610a1336a09881122cfd2e19"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.256009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jjv69" event={"ID":"8d6c006d-67ff-426d-9203-5c769b474fa3","Type":"ContainerDied","Data":"48a2c00e0f645b82fb7c8a947a27dbe27867527fac54e630665e3dd0f242e44b"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.256095 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jjv69" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.264562 5030 generic.go:334] "Generic (PLEG): container finished" podID="0863eb67-3ead-4744-b338-aaf75284e458" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" exitCode=0 Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.264663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0863eb67-3ead-4744-b338-aaf75284e458","Type":"ContainerDied","Data":"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.264945 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.265068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0863eb67-3ead-4744-b338-aaf75284e458","Type":"ContainerDied","Data":"250dde89a7a2f299c5ed56a1c07167f550c75c679d1ddcf673f5d9cb8e071d82"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.269908 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ea3fc979-4f14-4578-9cae-d44c44ba3e8c","Type":"ContainerDied","Data":"4fa48c692de39440b132c3e5aba941e2cdfd78e07ec01fc0b76b37802afc6b7f"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.269989 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.276035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"74f7986d-211a-4216-b92f-ad900ddcc45f","Type":"ContainerDied","Data":"d896aba19503d5dbab8247f5cebe0bfaed0c7e5c6249a0cea78395927026fb0d"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.276134 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.283520 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef744864-0218-4aad-9ffa-a926311790dd" containerID="a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" exitCode=0 Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.283587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerDied","Data":"a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.287421 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.288176 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.290002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"8dfaa83b-13e1-4a2f-a722-b0583f2641ed","Type":"ContainerDied","Data":"81fdaf95be809cb6dffe4401b2865b42b619c06a292132bb661f33734b3abc3f"} Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.290162 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.318115 5030 scope.go:117] "RemoveContainer" containerID="d64a394c4f184d65be789b464e93507d0d51cff95adc8f21cab504aa94390ded" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.379302 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.381851 5030 scope.go:117] "RemoveContainer" containerID="05743f2bd663ee4c6959a35a5be0c01f08b3fea21e5994bca8e2faae3bb81e6a" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.420721 5030 scope.go:117] "RemoveContainer" containerID="3a68c4f1688aff97806a60e8d2e05ae8b38410610f1b708c39ed861bf30a90fc" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.431333 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.438610 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.445505 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.455011 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.463995 5030 scope.go:117] "RemoveContainer" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.469789 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.481600 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.490499 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.498260 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-84bf68df56-874dk"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.500841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.501253 5030 scope.go:117] "RemoveContainer" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.501790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.501837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.501879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54qfw\" (UniqueName: \"kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw\") pod \"ef744864-0218-4aad-9ffa-a926311790dd\" (UID: \"ef744864-0218-4aad-9ffa-a926311790dd\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.502707 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.503507 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f\": container with ID starting with 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f not found: ID does not exist" containerID="7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.503562 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f"} err="failed to get container status \"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f\": rpc error: code = NotFound desc = could not find container \"7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f\": container with ID starting with 7d8d7623e4e887a9ce23663b474c20ba39201d07f14b46c02091fe50f7642f0f not found: ID does not exist" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.503591 5030 scope.go:117] "RemoveContainer" containerID="6937c695cff1667d0d62ce1c83b0152b926dae8c296b22eca0b03bd5169765f4" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.503745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.504279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.504739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.505190 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.506907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw" (OuterVolumeSpecName: "kube-api-access-54qfw") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "kube-api-access-54qfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.509422 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jjv69"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.535677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.540154 5030 scope.go:117] "RemoveContainer" containerID="4acdd1af318211b5450f634e63d8de0e0421722ea1b31f9eda4efdab1cfbcc50" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.541771 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.545683 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "mysql-db") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.550228 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.559585 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.562185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.569143 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-67fcc84c58-mxnkm"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.570974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ef744864-0218-4aad-9ffa-a926311790dd" (UID: "ef744864-0218-4aad-9ffa-a926311790dd"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.575747 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.576518 5030 scope.go:117] "RemoveContainer" containerID="3b40fee42e01b5b734218bca90644c05dc6e1540c1debc8a4a06f30fcb8a9bf8" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.583524 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74rl\" (UniqueName: \"kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.604583 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.606685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts\") pod \"eba5579b-ac6a-439c-894d-25a5c14d6242\" (UID: \"eba5579b-ac6a-439c-894d-25a5c14d6242\") " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607101 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607120 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607131 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef744864-0218-4aad-9ffa-a926311790dd-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607142 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607151 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ef744864-0218-4aad-9ffa-a926311790dd-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607159 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef744864-0218-4aad-9ffa-a926311790dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.607168 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54qfw\" (UniqueName: \"kubernetes.io/projected/ef744864-0218-4aad-9ffa-a926311790dd-kube-api-access-54qfw\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.611689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.611767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.612074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.612892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.615790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl" (OuterVolumeSpecName: "kube-api-access-r74rl") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "kube-api-access-r74rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.619719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.620769 5030 scope.go:117] "RemoveContainer" containerID="70fb88353fc70c3b9dab582d23bbede418372d313927da95f856408d4d739fad" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.627513 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.636057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.641392 5030 scope.go:117] "RemoveContainer" containerID="2b4f10ed0e4a73fb5cad8e46980901e7c694bff893bde9501737b27debf1b796" Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.644325 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.645802 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.647173 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.647232 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerName="nova-scheduler-scheduler" Jan 20 23:19:30 
crc kubenswrapper[5030]: I0120 23:19:30.663450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "eba5579b-ac6a-439c-894d-25a5c14d6242" (UID: "eba5579b-ac6a-439c-894d-25a5c14d6242"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708532 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708573 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708587 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708599 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708612 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74rl\" (UniqueName: \"kubernetes.io/projected/eba5579b-ac6a-439c-894d-25a5c14d6242-kube-api-access-r74rl\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708642 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba5579b-ac6a-439c-894d-25a5c14d6242-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708654 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708666 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba5579b-ac6a-439c-894d-25a5c14d6242-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.708686 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.708758 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:30 crc kubenswrapper[5030]: E0120 23:19:30.708833 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data podName:59bcab9d-b9c0-40fe-8973-df39eeca1dd4 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:38.708814062 +0000 UTC m=+2651.029074350 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data") pod "rabbitmq-server-1" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4") : configmap "rabbitmq-config-data" not found Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.740170 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.809989 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:30 crc kubenswrapper[5030]: I0120 23:19:30.889372 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv9x8\" (UniqueName: \"kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.018957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated\") pod \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\" (UID: \"473c0915-5bed-4dc5-8b2a-7899b05b9afc\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.019447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.019744 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.019820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.019847 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.019838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.019843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.019927 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data podName:56408fa4-b1eb-4a4d-bc8a-9ab82b88957f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:39.019900522 +0000 UTC m=+2651.340160860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data") pod "rabbitmq-server-2" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f") : configmap "rabbitmq-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.029076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8" (OuterVolumeSpecName: "kube-api-access-cv9x8") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "kube-api-access-cv9x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.033711 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.064030 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.065718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.066974 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.069563 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.069712 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.112122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "473c0915-5bed-4dc5-8b2a-7899b05b9afc" (UID: "473c0915-5bed-4dc5-8b2a-7899b05b9afc"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122329 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122384 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122423 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122444 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122465 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/473c0915-5bed-4dc5-8b2a-7899b05b9afc-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122485 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv9x8\" (UniqueName: \"kubernetes.io/projected/473c0915-5bed-4dc5-8b2a-7899b05b9afc-kube-api-access-cv9x8\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.122506 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/473c0915-5bed-4dc5-8b2a-7899b05b9afc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.122722 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.122799 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data podName:ed443431-2b5c-4533-8523-d01380c98c1d nodeName:}" failed. No retries permitted until 2026-01-20 23:19:39.122780369 +0000 UTC m=+2651.443040657 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data") pod "rabbitmq-server-0" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d") : configmap "rabbitmq-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.159121 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.226687 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.305839 5030 generic.go:334] "Generic (PLEG): container finished" podID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerID="f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927" exitCode=0 Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.305931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerDied","Data":"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.305959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"473c0915-5bed-4dc5-8b2a-7899b05b9afc","Type":"ContainerDied","Data":"35fc66e37795e49ff03010bae4833bf0b0a1dba482694a2ee04b6136ab9b892f"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.305977 5030 scope.go:117] "RemoveContainer" containerID="f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.306089 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.315863 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-2" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.316783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-2" event={"ID":"eba5579b-ac6a-439c-894d-25a5c14d6242","Type":"ContainerDied","Data":"0a0f098aefe0f39a03bf9b39e3f6ba567956e1ea0f46f7a85b2ce0124cfeb144"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.328004 5030 generic.go:334] "Generic (PLEG): container finished" podID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerID="e36b6c921d799153e5dd8b22436909b17571e1a429199d1f21643266c33dc577" exitCode=0 Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.328057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerDied","Data":"e36b6c921d799153e5dd8b22436909b17571e1a429199d1f21643266c33dc577"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.329238 5030 generic.go:334] "Generic (PLEG): container finished" podID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerID="a74fc0dc19c36c183a9bd3ffda3fbb852f4467116d2b61c87b1bef566563caba" exitCode=0 Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.329274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerDied","Data":"a74fc0dc19c36c183a9bd3ffda3fbb852f4467116d2b61c87b1bef566563caba"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.332179 5030 generic.go:334] "Generic (PLEG): container finished" podID="09176751-6268-4f1b-9348-5adb2938e4ca" containerID="5335c6175ed4a270a6a0424ed58b7a7123c2bee5625d69ab613e39861259cab6" exitCode=0 Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.332224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" event={"ID":"09176751-6268-4f1b-9348-5adb2938e4ca","Type":"ContainerDied","Data":"5335c6175ed4a270a6a0424ed58b7a7123c2bee5625d69ab613e39861259cab6"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.335708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-1" event={"ID":"ef744864-0218-4aad-9ffa-a926311790dd","Type":"ContainerDied","Data":"bbdc232122010f480fbcaeb0a155ca38341653bfb3c4d7e16f45693381bcb6ec"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.335813 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-1" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.343726 5030 generic.go:334] "Generic (PLEG): container finished" podID="ed443431-2b5c-4533-8523-d01380c98c1d" containerID="07c23d45fe75d1a2499dda70559be5d8b50e9dddf38b74b4c6637c72301d7701" exitCode=0 Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.343763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerDied","Data":"07c23d45fe75d1a2499dda70559be5d8b50e9dddf38b74b4c6637c72301d7701"} Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.460731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.475135 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.476818 5030 scope.go:117] "RemoveContainer" containerID="535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.492106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.502765 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.510908 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.521352 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-2"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.530892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.530936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.530961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrnqm\" (UniqueName: \"kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data\") pod \"09176751-6268-4f1b-9348-5adb2938e4ca\" (UID: \"09176751-6268-4f1b-9348-5adb2938e4ca\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.531938 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.540507 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.540541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts" (OuterVolumeSpecName: "scripts") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.541932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-1"] Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.552441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm" (OuterVolumeSpecName: "kube-api-access-hrnqm") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "kube-api-access-hrnqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.552465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.552605 5030 scope.go:117] "RemoveContainer" containerID="f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.553427 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927\": container with ID starting with f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927 not found: ID does not exist" containerID="f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.553478 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927"} err="failed to get container status \"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927\": rpc error: code = NotFound desc = could not find container \"f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927\": container with ID starting with f0246b91c36db74af56a2f84a2aa1fbfcd276a49ae84c6dacb5c767ebec9f927 not found: ID does not exist" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.553503 5030 scope.go:117] "RemoveContainer" containerID="535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.553943 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d\": container with ID starting with 535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d not found: ID does not exist" containerID="535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.553969 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d"} err="failed to get container status \"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d\": rpc error: code = NotFound desc = could not find container \"535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d\": container with ID starting with 535d478dd02aa4e1a13cb2f9107eab61342d4f44c93fb443e66d18213cd37e3d not found: ID does not exist" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.553990 5030 scope.go:117] "RemoveContainer" containerID="7e4d3b88e2537910ea5a1bb30373dd22a7907ac81c0ca61d89885761256dedd0" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.567257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.584056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data" (OuterVolumeSpecName: "config-data") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.595200 5030 scope.go:117] "RemoveContainer" containerID="5f28b962d9cd9e0f5d731673f323a4101176d3bfafc6679aeb1a997bbdac5c30" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.596741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.606200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "09176751-6268-4f1b-9348-5adb2938e4ca" (UID: "09176751-6268-4f1b-9348-5adb2938e4ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.628782 5030 scope.go:117] "RemoveContainer" containerID="a8bc63dcc4302756d73b4fa0dd16904504481aacc44353be43137f19c1372f23" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632188 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632343 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9tv6\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632366 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.632493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\" (UID: \"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633294 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633309 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633318 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633326 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633335 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrnqm\" (UniqueName: \"kubernetes.io/projected/09176751-6268-4f1b-9348-5adb2938e4ca-kube-api-access-hrnqm\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633344 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633352 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633362 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.633369 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09176751-6268-4f1b-9348-5adb2938e4ca-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.634121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.634315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.635766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info" (OuterVolumeSpecName: "pod-info") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.636988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.637143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "persistence") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.640196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6" (OuterVolumeSpecName: "kube-api-access-w9tv6") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "kube-api-access-w9tv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.641948 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.646151 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.657823 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.659085 5030 scope.go:117] "RemoveContainer" containerID="be373b7a9672d0d64b714b3087d91961119dd1aa33d4a8808c64a1422b4c41fa" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.674303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data" (OuterVolumeSpecName: "config-data") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.679600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf" (OuterVolumeSpecName: "server-conf") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.725933 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" (UID: "56408fa4-b1eb-4a4d-bc8a-9ab82b88957f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rksgn\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734861 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734943 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.734995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqn7m\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735124 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls\") pod \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\" (UID: \"59bcab9d-b9c0-40fe-8973-df39eeca1dd4\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins\") pod \"ed443431-2b5c-4533-8523-d01380c98c1d\" (UID: \"ed443431-2b5c-4533-8523-d01380c98c1d\") " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735672 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735690 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735701 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735711 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735720 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9tv6\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-kube-api-access-w9tv6\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735728 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735736 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735745 5030 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735754 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.735771 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.738361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.738795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.740550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.742669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "persistence") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.742684 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.743015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn" (OuterVolumeSpecName: "kube-api-access-rksgn") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "kube-api-access-rksgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.743248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.743492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.744193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.745004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.745458 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.753754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.756133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m" (OuterVolumeSpecName: "kube-api-access-hqn7m") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "kube-api-access-hqn7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.756793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "persistence") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.760077 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info" (OuterVolumeSpecName: "pod-info") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.760117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info" (OuterVolumeSpecName: "pod-info") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.764947 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.768823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data" (OuterVolumeSpecName: "config-data") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.771141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data" (OuterVolumeSpecName: "config-data") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.780983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf" (OuterVolumeSpecName: "server-conf") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.783676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf" (OuterVolumeSpecName: "server-conf") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.824435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ed443431-2b5c-4533-8523-d01380c98c1d" (UID: "ed443431-2b5c-4533-8523-d01380c98c1d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.827316 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "59bcab9d-b9c0-40fe-8973-df39eeca1dd4" (UID: "59bcab9d-b9c0-40fe-8973-df39eeca1dd4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837748 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837792 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837825 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837840 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837853 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837864 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ed443431-2b5c-4533-8523-d01380c98c1d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837877 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqn7m\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-kube-api-access-hqn7m\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837888 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837898 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837908 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837919 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837935 5030 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837947 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837958 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837969 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837982 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.837993 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838005 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838017 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed443431-2b5c-4533-8523-d01380c98c1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838029 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ed443431-2b5c-4533-8523-d01380c98c1d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838041 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ed443431-2b5c-4533-8523-d01380c98c1d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838053 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rksgn\" (UniqueName: \"kubernetes.io/projected/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-kube-api-access-rksgn\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.838063 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/59bcab9d-b9c0-40fe-8973-df39eeca1dd4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.856441 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.858035 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" 
Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.940081 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.940114 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940202 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940203 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940220 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940277 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data podName:65e9d298-7338-47c3-8591-b7803798203f nodeName:}" failed. No retries permitted until 2026-01-20 23:19:39.940255453 +0000 UTC m=+2652.260515741 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data") pod "rabbitmq-cell1-server-1" (UID: "65e9d298-7338-47c3-8591-b7803798203f") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940336 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data podName:995342f6-fe0b-4188-976c-5151541dd002 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:39.940309074 +0000 UTC m=+2652.260569372 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data") pod "rabbitmq-cell1-server-0" (UID: "995342f6-fe0b-4188-976c-5151541dd002") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: E0120 23:19:31.940393 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data podName:a54757d6-0570-4290-a802-d82ff94dffb9 nodeName:}" failed. No retries permitted until 2026-01-20 23:19:39.940382136 +0000 UTC m=+2652.260642424 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data") pod "rabbitmq-cell1-server-2" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.988289 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0863eb67-3ead-4744-b338-aaf75284e458" path="/var/lib/kubelet/pods/0863eb67-3ead-4744-b338-aaf75284e458/volumes" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.993239 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" path="/var/lib/kubelet/pods/40a68c1e-d515-4dac-a25a-d7ed5da4b15c/volumes" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.997058 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" path="/var/lib/kubelet/pods/473c0915-5bed-4dc5-8b2a-7899b05b9afc/volumes" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.997901 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" path="/var/lib/kubelet/pods/4dc45225-0ecd-4a1c-9439-dc3c8385e5b0/volumes" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.998513 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" path="/var/lib/kubelet/pods/6bb6b715-a2a6-4c25-83c3-315fd30b8d92/volumes" Jan 20 23:19:31 crc kubenswrapper[5030]: I0120 23:19:31.999725 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" path="/var/lib/kubelet/pods/74f7986d-211a-4216-b92f-ad900ddcc45f/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.000508 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" path="/var/lib/kubelet/pods/8d6c006d-67ff-426d-9203-5c769b474fa3/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.001503 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" path="/var/lib/kubelet/pods/8dfaa83b-13e1-4a2f-a722-b0583f2641ed/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.002164 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" path="/var/lib/kubelet/pods/ea3fc979-4f14-4578-9cae-d44c44ba3e8c/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.002894 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" path="/var/lib/kubelet/pods/eba5579b-ac6a-439c-894d-25a5c14d6242/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.004249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef744864-0218-4aad-9ffa-a926311790dd" path="/var/lib/kubelet/pods/ef744864-0218-4aad-9ffa-a926311790dd/volumes" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.077286 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="galera" containerID="cri-o://f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c" gracePeriod=26 Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.362812 5030 generic.go:334] "Generic (PLEG): container finished" podID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" 
containerID="6949cd28ba252efdec3e0d70cae363c20b2cd6eb423def67f259c29ca96e51c2" exitCode=0 Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.362909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerDied","Data":"6949cd28ba252efdec3e0d70cae363c20b2cd6eb423def67f259c29ca96e51c2"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.365518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-2" event={"ID":"56408fa4-b1eb-4a4d-bc8a-9ab82b88957f","Type":"ContainerDied","Data":"94f45336453177298d12d18e1f69aac4ece70b0a27b5fda3a802dcbbe48cf180"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.365556 5030 scope.go:117] "RemoveContainer" containerID="e36b6c921d799153e5dd8b22436909b17571e1a429199d1f21643266c33dc577" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.365822 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-2" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.370489 5030 generic.go:334] "Generic (PLEG): container finished" podID="a54757d6-0570-4290-a802-d82ff94dffb9" containerID="b4810b367aef312cb07a5bc212828530d847353295a4a0029f98ba6de0de7144" exitCode=0 Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.370610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerDied","Data":"b4810b367aef312cb07a5bc212828530d847353295a4a0029f98ba6de0de7144"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.370669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" event={"ID":"a54757d6-0570-4290-a802-d82ff94dffb9","Type":"ContainerDied","Data":"923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.370681 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="923b70d502bfac1e6b9f1928a96a7ee2423a1bf7d60e2576eb292c0b8acaf0c5" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.375818 5030 generic.go:334] "Generic (PLEG): container finished" podID="995342f6-fe0b-4188-976c-5151541dd002" containerID="38eba0f78e9bf8588976acd650a6cbc2107e5878a8dbe1f929ebf5401b2d0697" exitCode=0 Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.375900 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerDied","Data":"38eba0f78e9bf8588976acd650a6cbc2107e5878a8dbe1f929ebf5401b2d0697"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.379867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-1" event={"ID":"59bcab9d-b9c0-40fe-8973-df39eeca1dd4","Type":"ContainerDied","Data":"165efdad72daf753004568da6e026ea0e21356f8eff4ac64ca0eba7376ae6625"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.379968 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-1" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.387898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"ed443431-2b5c-4533-8523-d01380c98c1d","Type":"ContainerDied","Data":"c1ed5f52af5d3a7aea7605a15475d802b60154d0600fbc943b8538b8756caac7"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.388049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.396190 5030 generic.go:334] "Generic (PLEG): container finished" podID="65e9d298-7338-47c3-8591-b7803798203f" containerID="5ec228c62d11e892bccd8b3e7622fe27d4c99380436acc8ab9a70d07539cf3ba" exitCode=0 Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.396235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerDied","Data":"5ec228c62d11e892bccd8b3e7622fe27d4c99380436acc8ab9a70d07539cf3ba"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.396249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" event={"ID":"65e9d298-7338-47c3-8591-b7803798203f","Type":"ContainerDied","Data":"372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.396262 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372aa9e1ff069cb20086ce59b6600724cc05321d10840233fd8eb7b47ac36c9b" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.399554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" event={"ID":"09176751-6268-4f1b-9348-5adb2938e4ca","Type":"ContainerDied","Data":"de8085f07a604d20402ab02d2aa43506f40e06e478b644c7a18f578aa97a45c9"} Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.399614 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7b566b8847-cg4mk" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.591343 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.602582 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.616237 5030 scope.go:117] "RemoveContainer" containerID="784863a42052e57b14121f049e5cd14fe8a4044d16008f175d245e4a44552a4e" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.617585 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.639036 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.657767 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-1"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661503 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: 
\"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.661887 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmb7\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7\") pod \"a54757d6-0570-4290-a802-d82ff94dffb9\" (UID: \"a54757d6-0570-4290-a802-d82ff94dffb9\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.662850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.664132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.664608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.668113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info" (OuterVolumeSpecName: "pod-info") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.671700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.673494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.674417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.679888 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7b566b8847-cg4mk"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.686814 5030 scope.go:117] "RemoveContainer" containerID="a74fc0dc19c36c183a9bd3ffda3fbb852f4467116d2b61c87b1bef566563caba" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.690346 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.691172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.698368 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-2"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.705437 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7" (OuterVolumeSpecName: "kube-api-access-hjmb7") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "kube-api-access-hjmb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.706351 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.715367 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.717729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data" (OuterVolumeSpecName: "config-data") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.737049 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.748632 5030 scope.go:117] "RemoveContainer" containerID="f161398cbd5641c14973f948a7a3f2a59ed72f415935e18bc8af68a8d9fd3226" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jk6q\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.763996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fv8dr\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd\") pod \"65e9d298-7338-47c3-8591-b7803798203f\" (UID: \"65e9d298-7338-47c3-8591-b7803798203f\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764251 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764612 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmb7\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-kube-api-access-hjmb7\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764637 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764655 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764665 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764674 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764683 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764691 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764700 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a54757d6-0570-4290-a802-d82ff94dffb9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.764708 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a54757d6-0570-4290-a802-d82ff94dffb9-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.765873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.765990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.767528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.767859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.768150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.770078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.772949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr" (OuterVolumeSpecName: "kube-api-access-fv8dr") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "kube-api-access-fv8dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.775872 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.776658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.776681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf" (OuterVolumeSpecName: "server-conf") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.776698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.776743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.777405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q" (OuterVolumeSpecName: "kube-api-access-5jk6q") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "kube-api-access-5jk6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.776675 5030 scope.go:117] "RemoveContainer" containerID="07c23d45fe75d1a2499dda70559be5d8b50e9dddf38b74b4c6637c72301d7701" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.778199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.779289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info" (OuterVolumeSpecName: "pod-info") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.779374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.780743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info" (OuterVolumeSpecName: "pod-info") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.792663 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.797169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data" (OuterVolumeSpecName: "config-data") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.799328 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.808809 5030 scope.go:117] "RemoveContainer" containerID="e06d7030eda4e797f3bf0fb208cadb2bda83a4dfc38d8656bdc15ff205433692" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.830804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a54757d6-0570-4290-a802-d82ff94dffb9" (UID: "a54757d6-0570-4290-a802-d82ff94dffb9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.834903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data" (OuterVolumeSpecName: "config-data") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.835042 5030 scope.go:117] "RemoveContainer" containerID="5335c6175ed4a270a6a0424ed58b7a7123c2bee5625d69ab613e39861259cab6" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.838836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf" (OuterVolumeSpecName: "server-conf") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.845732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf" (OuterVolumeSpecName: "server-conf") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom\") pod \"c0e11584-61c4-455a-9943-28b4228c8921\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865184 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqfkk\" (UniqueName: \"kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk\") pod \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data\") pod \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5n4t\" (UniqueName: \"kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t\") pod \"c0e11584-61c4-455a-9943-28b4228c8921\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle\") pod \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\" (UID: \"e8de9d1d-d7de-4313-9b7b-0f07992ede1b\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs\") pod \"c0e11584-61c4-455a-9943-28b4228c8921\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") pod \"995342f6-fe0b-4188-976c-5151541dd002\" (UID: \"995342f6-fe0b-4188-976c-5151541dd002\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle\") pod \"c0e11584-61c4-455a-9943-28b4228c8921\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.865788 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data\") pod \"c0e11584-61c4-455a-9943-28b4228c8921\" (UID: \"c0e11584-61c4-455a-9943-28b4228c8921\") " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866271 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jk6q\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-kube-api-access-5jk6q\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866282 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a54757d6-0570-4290-a802-d82ff94dffb9-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866293 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866302 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866311 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866320 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/65e9d298-7338-47c3-8591-b7803798203f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866338 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866350 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv8dr\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-kube-api-access-fv8dr\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866358 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866368 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866376 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866384 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/995342f6-fe0b-4188-976c-5151541dd002-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866394 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866403 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866411 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866420 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866432 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866441 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a54757d6-0570-4290-a802-d82ff94dffb9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866449 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/995342f6-fe0b-4188-976c-5151541dd002-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866457 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/995342f6-fe0b-4188-976c-5151541dd002-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866466 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65e9d298-7338-47c3-8591-b7803798203f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866473 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.866482 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/65e9d298-7338-47c3-8591-b7803798203f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.867812 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0e11584-61c4-455a-9943-28b4228c8921" (UID: "c0e11584-61c4-455a-9943-28b4228c8921"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.868593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk" (OuterVolumeSpecName: "kube-api-access-cqfkk") pod "e8de9d1d-d7de-4313-9b7b-0f07992ede1b" (UID: "e8de9d1d-d7de-4313-9b7b-0f07992ede1b"). InnerVolumeSpecName "kube-api-access-cqfkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: W0120 23:19:32.868707 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/995342f6-fe0b-4188-976c-5151541dd002/volumes/kubernetes.io~projected/rabbitmq-confd Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.868721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "995342f6-fe0b-4188-976c-5151541dd002" (UID: "995342f6-fe0b-4188-976c-5151541dd002"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.869251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs" (OuterVolumeSpecName: "logs") pod "c0e11584-61c4-455a-9943-28b4228c8921" (UID: "c0e11584-61c4-455a-9943-28b4228c8921"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.872852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t" (OuterVolumeSpecName: "kube-api-access-w5n4t") pod "c0e11584-61c4-455a-9943-28b4228c8921" (UID: "c0e11584-61c4-455a-9943-28b4228c8921"). InnerVolumeSpecName "kube-api-access-w5n4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.883095 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.884615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8de9d1d-d7de-4313-9b7b-0f07992ede1b" (UID: "e8de9d1d-d7de-4313-9b7b-0f07992ede1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.885554 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.892082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data" (OuterVolumeSpecName: "config-data") pod "e8de9d1d-d7de-4313-9b7b-0f07992ede1b" (UID: "e8de9d1d-d7de-4313-9b7b-0f07992ede1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.892876 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0e11584-61c4-455a-9943-28b4228c8921" (UID: "c0e11584-61c4-455a-9943-28b4228c8921"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.896307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "65e9d298-7338-47c3-8591-b7803798203f" (UID: "65e9d298-7338-47c3-8591-b7803798203f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.906337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data" (OuterVolumeSpecName: "config-data") pod "c0e11584-61c4-455a-9943-28b4228c8921" (UID: "c0e11584-61c4-455a-9943-28b4228c8921"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.938805 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969422 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969470 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969488 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqfkk\" (UniqueName: \"kubernetes.io/projected/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-kube-api-access-cqfkk\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969502 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969517 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969529 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5n4t\" (UniqueName: \"kubernetes.io/projected/c0e11584-61c4-455a-9943-28b4228c8921-kube-api-access-w5n4t\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969539 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8de9d1d-d7de-4313-9b7b-0f07992ede1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969550 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0e11584-61c4-455a-9943-28b4228c8921-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969562 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/65e9d298-7338-47c3-8591-b7803798203f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969572 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969584 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/995342f6-fe0b-4188-976c-5151541dd002-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:32 crc kubenswrapper[5030]: I0120 23:19:32.969594 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0e11584-61c4-455a-9943-28b4228c8921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.018939 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.070729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle\") pod \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.070946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data\") pod \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.071078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kshsm\" (UniqueName: \"kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm\") pod \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\" (UID: \"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.075521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm" (OuterVolumeSpecName: "kube-api-access-kshsm") pod "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" (UID: "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d"). InnerVolumeSpecName "kube-api-access-kshsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.097204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" (UID: "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.103028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data" (OuterVolumeSpecName: "config-data") pod "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" (UID: "d59d9bd5-ab07-4c22-83c0-71f88ee1b02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtkl\" (UniqueName: \"kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.172782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs\") pod \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\" (UID: \"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.173079 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kshsm\" (UniqueName: \"kubernetes.io/projected/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-kube-api-access-kshsm\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.173094 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.173103 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.180027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.182431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl" (OuterVolumeSpecName: "kube-api-access-7dtkl") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "kube-api-access-7dtkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.211197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config" (OuterVolumeSpecName: "config") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.218866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.239671 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.240051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.242083 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" (UID: "b77c8d9b-92ac-4cfd-b1cd-fe09be04b782"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274600 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtkl\" (UniqueName: \"kubernetes.io/projected/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-kube-api-access-7dtkl\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274923 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274935 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274944 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274953 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274962 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.274973 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.411481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" event={"ID":"b77c8d9b-92ac-4cfd-b1cd-fe09be04b782","Type":"ContainerDied","Data":"2293d1c739a446e5a0beec33e710dafc58e2b78138f44f5fa202ae92299bbe7c"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.411530 5030 scope.go:117] "RemoveContainer" containerID="e47ac64b526f7c8c1e0183da0c8d577a84461451afc84c5ba2ba9e3c80008b0f" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.411659 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.418845 5030 generic.go:334] "Generic (PLEG): container finished" podID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerID="491a040f8ee5635c6af0dc1b66387a1b9706edbf38f9295a9f919a35c55cb5ba" exitCode=0 Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.418886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerDied","Data":"491a040f8ee5635c6af0dc1b66387a1b9706edbf38f9295a9f919a35c55cb5ba"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.421117 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" exitCode=0 Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.421139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e8de9d1d-d7de-4313-9b7b-0f07992ede1b","Type":"ContainerDied","Data":"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.421168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e8de9d1d-d7de-4313-9b7b-0f07992ede1b","Type":"ContainerDied","Data":"bd99e9f5232c30cbe40654c3425630656b4efdeb865aa25cbf04119baede8ed7"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.421171 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.424832 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.424844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"995342f6-fe0b-4188-976c-5151541dd002","Type":"ContainerDied","Data":"62150a8f853956dd62fa0336f565eec30c91832921b138ff215bebacbfac1c22"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.444418 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.465894 5030 scope.go:117] "RemoveContainer" containerID="6949cd28ba252efdec3e0d70cae363c20b2cd6eb423def67f259c29ca96e51c2" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.466180 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.475411 5030 generic.go:334] "Generic (PLEG): container finished" podID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerID="6970aa8432c4959152b2e372b5408ff4682d8c4aae0f06b65a11779ac7a8e3a0" exitCode=0 Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.475506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerDied","Data":"6970aa8432c4959152b2e372b5408ff4682d8c4aae0f06b65a11779ac7a8e3a0"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.480108 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-cc7b7c5c8-d9whd"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.481328 5030 generic.go:334] "Generic (PLEG): container finished" podID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" exitCode=0 Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.481499 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.481567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d","Type":"ContainerDied","Data":"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.481601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"d59d9bd5-ab07-4c22-83c0-71f88ee1b02d","Type":"ContainerDied","Data":"10eb60b928b07a79fe138ef219d7d32491415f9dc55ca9767af8e1040ab88854"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.487246 5030 generic.go:334] "Generic (PLEG): container finished" podID="c0e11584-61c4-455a-9943-28b4228c8921" containerID="284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698" exitCode=0 Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.487419 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-1" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.489439 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.489757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerDied","Data":"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.489872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5748c46787-5fm27" event={"ID":"c0e11584-61c4-455a-9943-28b4228c8921","Type":"ContainerDied","Data":"98ed6ea2e1ae1d5a798f1cd07fdf870debb021c1e34fba786b210915bcbb5872"} Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.490032 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-2" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.494560 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.498033 5030 scope.go:117] "RemoveContainer" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.500089 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.531971 5030 scope.go:117] "RemoveContainer" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" Jan 20 23:19:33 crc kubenswrapper[5030]: E0120 23:19:33.532456 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c\": container with ID starting with ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c not found: ID does not exist" containerID="ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.532496 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c"} err="failed to get container status \"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c\": rpc error: code = NotFound desc = could not find container \"ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c\": container with ID starting with ad028d448e90a820d4e91a3bed049e2458e0c7f4c6dad84df33d8fda48eed11c not found: ID does not exist" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.532523 5030 scope.go:117] "RemoveContainer" containerID="38eba0f78e9bf8588976acd650a6cbc2107e5878a8dbe1f929ebf5401b2d0697" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.537009 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.546861 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.561122 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.561823 5030 scope.go:117] "RemoveContainer" 
containerID="ce04af762c42921ec01aa3670edc17c959abd025e114d214262bf6b03d163300" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.566438 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5748c46787-5fm27"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.578943 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.582802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.582837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.582866 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbl4\" (UniqueName: \"kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.582905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.582947 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.583037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.584758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.586925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts\") pod \"5039f885-ec1c-48d9-8e06-cf672e3581df\" (UID: \"5039f885-ec1c-48d9-8e06-cf672e3581df\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.588906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod 
"5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.590880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.592331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.592518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.593165 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-1"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.595348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4" (OuterVolumeSpecName: "kube-api-access-vcbl4") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "kube-api-access-vcbl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.610780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "mysql-db") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.612548 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.620828 5030 scope.go:117] "RemoveContainer" containerID="6970aa8432c4959152b2e372b5408ff4682d8c4aae0f06b65a11779ac7a8e3a0" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.625809 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.634018 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-2"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.634582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5039f885-ec1c-48d9-8e06-cf672e3581df" (UID: "5039f885-ec1c-48d9-8e06-cf672e3581df"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.641007 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.647807 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:19:33 crc kubenswrapper[5030]: E0120 23:19:33.648109 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb77c8d9b_92ac_4cfd_b1cd_fe09be04b782.slice/crio-2293d1c739a446e5a0beec33e710dafc58e2b78138f44f5fa202ae92299bbe7c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8de9d1d_d7de_4313_9b7b_0f07992ede1b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0e11584_61c4_455a_9943_28b4228c8921.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.649545 5030 scope.go:117] "RemoveContainer" containerID="5f313fcb94932129478d90e13f8338fba0016c110147ee11418a6747ebcfed4d" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.666994 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.675294 5030 scope.go:117] "RemoveContainer" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688246 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688272 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688304 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688318 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbl4\" (UniqueName: \"kubernetes.io/projected/5039f885-ec1c-48d9-8e06-cf672e3581df-kube-api-access-vcbl4\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688327 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688336 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688345 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5039f885-ec1c-48d9-8e06-cf672e3581df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.688353 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5039f885-ec1c-48d9-8e06-cf672e3581df-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.702737 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.704198 5030 scope.go:117] "RemoveContainer" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" Jan 20 23:19:33 crc kubenswrapper[5030]: E0120 23:19:33.705689 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5\": container with ID starting with a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5 not found: ID does not exist" containerID="a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.705754 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5"} err="failed to get container 
status \"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5\": rpc error: code = NotFound desc = could not find container \"a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5\": container with ID starting with a31e1802ba129e97dc4cba726349613d52e5ebbab78312c1545a66dcfa779bc5 not found: ID does not exist" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.705779 5030 scope.go:117] "RemoveContainer" containerID="284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.726690 5030 scope.go:117] "RemoveContainer" containerID="8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.740870 5030 scope.go:117] "RemoveContainer" containerID="284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698" Jan 20 23:19:33 crc kubenswrapper[5030]: E0120 23:19:33.741365 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698\": container with ID starting with 284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698 not found: ID does not exist" containerID="284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.741415 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698"} err="failed to get container status \"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698\": rpc error: code = NotFound desc = could not find container \"284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698\": container with ID starting with 284d499ae2855fa12179dc7eefb9d028f521319a34d4e38ba00fb9502f35e698 not found: ID does not exist" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.741436 5030 scope.go:117] "RemoveContainer" containerID="8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4" Jan 20 23:19:33 crc kubenswrapper[5030]: E0120 23:19:33.742005 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4\": container with ID starting with 8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4 not found: ID does not exist" containerID="8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.742039 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4"} err="failed to get container status \"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4\": rpc error: code = NotFound desc = could not find container \"8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4\": container with ID starting with 8b68d04d926af53c48ce1eed2f0df585a7425d91dacd626198dde33e72447ad4 not found: ID does not exist" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: 
I0120 23:19:33.789405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789593 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789635 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rwwp\" (UniqueName: \"kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml\") pod \"7da27e47-283b-48aa-9439-f8b3cf0c3426\" (UID: \"7da27e47-283b-48aa-9439-f8b3cf0c3426\") " Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.789944 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.790032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.790966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.794896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts" (OuterVolumeSpecName: "scripts") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.796104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp" (OuterVolumeSpecName: "kube-api-access-5rwwp") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "kube-api-access-5rwwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.816747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.825334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.842354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.865203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data" (OuterVolumeSpecName: "config-data") pod "7da27e47-283b-48aa-9439-f8b3cf0c3426" (UID: "7da27e47-283b-48aa-9439-f8b3cf0c3426"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.891972 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892005 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892016 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892027 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rwwp\" (UniqueName: \"kubernetes.io/projected/7da27e47-283b-48aa-9439-f8b3cf0c3426-kube-api-access-5rwwp\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892055 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892063 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892072 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7da27e47-283b-48aa-9439-f8b3cf0c3426-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.892081 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7da27e47-283b-48aa-9439-f8b3cf0c3426-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.947045 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.111:11211: i/o timeout" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.973104 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09176751-6268-4f1b-9348-5adb2938e4ca" path="/var/lib/kubelet/pods/09176751-6268-4f1b-9348-5adb2938e4ca/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.973977 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" path="/var/lib/kubelet/pods/56408fa4-b1eb-4a4d-bc8a-9ab82b88957f/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.974911 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" path="/var/lib/kubelet/pods/59bcab9d-b9c0-40fe-8973-df39eeca1dd4/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.976531 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e9d298-7338-47c3-8591-b7803798203f" path="/var/lib/kubelet/pods/65e9d298-7338-47c3-8591-b7803798203f/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.978398 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995342f6-fe0b-4188-976c-5151541dd002" path="/var/lib/kubelet/pods/995342f6-fe0b-4188-976c-5151541dd002/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.980056 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" path="/var/lib/kubelet/pods/a54757d6-0570-4290-a802-d82ff94dffb9/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.981889 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" path="/var/lib/kubelet/pods/b77c8d9b-92ac-4cfd-b1cd-fe09be04b782/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.983400 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e11584-61c4-455a-9943-28b4228c8921" path="/var/lib/kubelet/pods/c0e11584-61c4-455a-9943-28b4228c8921/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.984480 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" path="/var/lib/kubelet/pods/d59d9bd5-ab07-4c22-83c0-71f88ee1b02d/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.986434 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" path="/var/lib/kubelet/pods/e8de9d1d-d7de-4313-9b7b-0f07992ede1b/volumes" Jan 20 23:19:33 crc kubenswrapper[5030]: I0120 23:19:33.987895 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" path="/var/lib/kubelet/pods/ed443431-2b5c-4533-8523-d01380c98c1d/volumes" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.258899 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.401448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.401707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.402792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.402847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.402857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.402981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6842r\" (UniqueName: \"kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.403182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.403340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.403382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.403441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.403451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated\") pod \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\" (UID: \"cf776dcd-8af4-45b0-9c39-0e8db2af4879\") " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.404095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.404363 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.404421 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.404450 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.404476 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.407923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r" (OuterVolumeSpecName: "kube-api-access-6842r") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "kube-api-access-6842r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.419363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.436500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.466642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cf776dcd-8af4-45b0-9c39-0e8db2af4879" (UID: "cf776dcd-8af4-45b0-9c39-0e8db2af4879"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.506291 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.506505 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.506742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6842r\" (UniqueName: \"kubernetes.io/projected/cf776dcd-8af4-45b0-9c39-0e8db2af4879-kube-api-access-6842r\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.506931 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf776dcd-8af4-45b0-9c39-0e8db2af4879-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.509241 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-1" event={"ID":"5039f885-ec1c-48d9-8e06-cf672e3581df","Type":"ContainerDied","Data":"8a7c192a3a9525f9af47eb6b8b657e1414c239f6f2e8d44ad7da72e45d07b7e3"} Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.509354 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-1" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.521107 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.521453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"7da27e47-283b-48aa-9439-f8b3cf0c3426","Type":"ContainerDied","Data":"c1a6abc4deb91d85e7c40a9e16991e793b6e49f9b706362617ba43d16c9dda49"} Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.521550 5030 scope.go:117] "RemoveContainer" containerID="cb217008ccf67ac0890f7c0b296c6e0fbe72ac6353cd839700724550dc254a84" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.525790 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerID="f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c" exitCode=0 Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.525834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerDied","Data":"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c"} Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.525865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"cf776dcd-8af4-45b0-9c39-0e8db2af4879","Type":"ContainerDied","Data":"a85ca7b9bd72973fcabfe3800e05c7eebed97699919da36c88f980de7144d275"} Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.525880 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.526103 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.555715 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.567298 5030 scope.go:117] "RemoveContainer" containerID="5cb235fbc11dbb1f4a1b8e0b540880dc076e8771be46d898537d4c8df6eaa4dc" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.572650 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-1"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.580396 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.591784 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.593828 5030 scope.go:117] "RemoveContainer" containerID="491a040f8ee5635c6af0dc1b66387a1b9706edbf38f9295a9f919a35c55cb5ba" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.600260 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.607731 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.608412 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.620755 5030 scope.go:117] "RemoveContainer" containerID="b7437d316b2192f7896af5d93582c95337eb7084b35a8b488d5714c2c0ab610f" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.646394 5030 scope.go:117] "RemoveContainer" containerID="f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.671326 5030 scope.go:117] "RemoveContainer" containerID="8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.695227 5030 scope.go:117] "RemoveContainer" containerID="f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c" Jan 20 23:19:34 crc kubenswrapper[5030]: E0120 23:19:34.695729 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c\": container with ID starting with f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c not found: ID does not exist" containerID="f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.695795 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c"} err="failed to get container status \"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c\": rpc error: code = NotFound desc = could not find container \"f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c\": container with ID 
starting with f70480d2e206ab45874b23b6a0a2ec9950697436b259c4e0ec5114358ea4197c not found: ID does not exist" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.695836 5030 scope.go:117] "RemoveContainer" containerID="8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354" Jan 20 23:19:34 crc kubenswrapper[5030]: E0120 23:19:34.696147 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354\": container with ID starting with 8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354 not found: ID does not exist" containerID="8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354" Jan 20 23:19:34 crc kubenswrapper[5030]: I0120 23:19:34.696181 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354"} err="failed to get container status \"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354\": rpc error: code = NotFound desc = could not find container \"8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354\": container with ID starting with 8c2d034117cf660543507c5a47125dcd72b8b2a6f1553c72d9e7bd79c30f5354 not found: ID does not exist" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.075847 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.159671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.159884 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="dnsmasq-dns" containerID="cri-o://eaa01e4f601a64005981e893cc41a4981cdd97ea4254fbd0d3e015df117736eb" gracePeriod=10 Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.545747 5030 generic.go:334] "Generic (PLEG): container finished" podID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerID="eaa01e4f601a64005981e893cc41a4981cdd97ea4254fbd0d3e015df117736eb" exitCode=0 Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.545807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" event={"ID":"195d8fc3-e36b-4a36-92a4-da935758d9ff","Type":"ContainerDied","Data":"eaa01e4f601a64005981e893cc41a4981cdd97ea4254fbd0d3e015df117736eb"} Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.603278 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.727501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc\") pod \"195d8fc3-e36b-4a36-92a4-da935758d9ff\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.727573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config\") pod \"195d8fc3-e36b-4a36-92a4-da935758d9ff\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.727597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0\") pod \"195d8fc3-e36b-4a36-92a4-da935758d9ff\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.727656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlm47\" (UniqueName: \"kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47\") pod \"195d8fc3-e36b-4a36-92a4-da935758d9ff\" (UID: \"195d8fc3-e36b-4a36-92a4-da935758d9ff\") " Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.734449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47" (OuterVolumeSpecName: "kube-api-access-qlm47") pod "195d8fc3-e36b-4a36-92a4-da935758d9ff" (UID: "195d8fc3-e36b-4a36-92a4-da935758d9ff"). InnerVolumeSpecName "kube-api-access-qlm47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.766133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config" (OuterVolumeSpecName: "config") pod "195d8fc3-e36b-4a36-92a4-da935758d9ff" (UID: "195d8fc3-e36b-4a36-92a4-da935758d9ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.779810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "195d8fc3-e36b-4a36-92a4-da935758d9ff" (UID: "195d8fc3-e36b-4a36-92a4-da935758d9ff"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.787239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "195d8fc3-e36b-4a36-92a4-da935758d9ff" (UID: "195d8fc3-e36b-4a36-92a4-da935758d9ff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.829726 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.829771 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.829791 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/195d8fc3-e36b-4a36-92a4-da935758d9ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.829810 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlm47\" (UniqueName: \"kubernetes.io/projected/195d8fc3-e36b-4a36-92a4-da935758d9ff-kube-api-access-qlm47\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.981809 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" path="/var/lib/kubelet/pods/5039f885-ec1c-48d9-8e06-cf672e3581df/volumes" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.983473 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" path="/var/lib/kubelet/pods/7da27e47-283b-48aa-9439-f8b3cf0c3426/volumes" Jan 20 23:19:35 crc kubenswrapper[5030]: I0120 23:19:35.986073 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" path="/var/lib/kubelet/pods/cf776dcd-8af4-45b0-9c39-0e8db2af4879/volumes" Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.559443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" event={"ID":"195d8fc3-e36b-4a36-92a4-da935758d9ff","Type":"ContainerDied","Data":"3d75ef8c50cd00560e268d9b45702f8d547c037d46c6de51a0e2d5516f80205f"} Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.559801 5030 scope.go:117] "RemoveContainer" containerID="eaa01e4f601a64005981e893cc41a4981cdd97ea4254fbd0d3e015df117736eb" Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.559510 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm" Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.586013 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.595058 5030 scope.go:117] "RemoveContainer" containerID="9d65101e2fecb07ee01f739c10329f54162ad363b50389fe00638cfc9abde483" Jan 20 23:19:36 crc kubenswrapper[5030]: I0120 23:19:36.598342 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d5699597c-kfxwm"] Jan 20 23:19:37 crc kubenswrapper[5030]: I0120 23:19:37.979414 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" path="/var/lib/kubelet/pods/195d8fc3-e36b-4a36-92a4-da935758d9ff/volumes" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.345195 5030 scope.go:117] "RemoveContainer" containerID="4e9a105c7a4566c0283977510c3c81350acb14b6445e2f615f8b8c7cfcd944a8" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.383568 5030 scope.go:117] "RemoveContainer" containerID="76189224976e8521cfe7d9597e16f26ebe7d1e854b742ed413b053b50b1fb456" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.434476 5030 scope.go:117] "RemoveContainer" containerID="2165954ce7991d9c92b9a2535612d5213391c2d1d40a0d59981e1f2a924cd889" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.468972 5030 scope.go:117] "RemoveContainer" containerID="d95e27461c55aafa37a1ed5e780d795b6de8def937c32b8fb3766bda32116d42" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.514173 5030 scope.go:117] "RemoveContainer" containerID="25852a0b06abaef1152c482d4491725ea52efdac6ad8a6d9de42b23da10cf0ee" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.549739 5030 scope.go:117] "RemoveContainer" containerID="ee353b1c49e0f185072e424e0c04f05bea612841a01568ff06beb0fff7400225" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.583099 5030 scope.go:117] "RemoveContainer" containerID="ba3004e2d56e03c4b37e9b30e64983c6fd5e37ab9a10b78ca4fc812d194cf0c1" Jan 20 23:19:48 crc kubenswrapper[5030]: I0120 23:19:48.616661 5030 scope.go:117] "RemoveContainer" containerID="7837b80e02d030e2ce03ff4cee41239d1427301430cd07ae2624f6088414713f" Jan 20 23:19:54 crc kubenswrapper[5030]: I0120 23:19:54.806386 5030 generic.go:334] "Generic (PLEG): container finished" podID="d060c583-6bca-4692-a1e4-84ebc743f826" containerID="31bdfc8eeca403bd17e795b8cf31b719db2f4805a115c8604e8ad198f7327da9" exitCode=137 Jan 20 23:19:54 crc kubenswrapper[5030]: I0120 23:19:54.806435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"31bdfc8eeca403bd17e795b8cf31b719db2f4805a115c8604e8ad198f7327da9"} Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.080285 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.165776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d060c583-6bca-4692-a1e4-84ebc743f826\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.165902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np6h2\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2\") pod \"d060c583-6bca-4692-a1e4-84ebc743f826\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.166024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache\") pod \"d060c583-6bca-4692-a1e4-84ebc743f826\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.166681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache" (OuterVolumeSpecName: "cache") pod "d060c583-6bca-4692-a1e4-84ebc743f826" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.166865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") pod \"d060c583-6bca-4692-a1e4-84ebc743f826\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.167324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock\") pod \"d060c583-6bca-4692-a1e4-84ebc743f826\" (UID: \"d060c583-6bca-4692-a1e4-84ebc743f826\") " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.167612 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.167827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock" (OuterVolumeSpecName: "lock") pod "d060c583-6bca-4692-a1e4-84ebc743f826" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.171127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "d060c583-6bca-4692-a1e4-84ebc743f826" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.171462 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2" (OuterVolumeSpecName: "kube-api-access-np6h2") pod "d060c583-6bca-4692-a1e4-84ebc743f826" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826"). InnerVolumeSpecName "kube-api-access-np6h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.171591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d060c583-6bca-4692-a1e4-84ebc743f826" (UID: "d060c583-6bca-4692-a1e4-84ebc743f826"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.268785 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.268831 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d060c583-6bca-4692-a1e4-84ebc743f826-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.268875 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.268892 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np6h2\" (UniqueName: \"kubernetes.io/projected/d060c583-6bca-4692-a1e4-84ebc743f826-kube-api-access-np6h2\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.281709 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.369988 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.824664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"d060c583-6bca-4692-a1e4-84ebc743f826","Type":"ContainerDied","Data":"49d8e170bf2a0fa87edd1e94b9adb4f397fa36dcb637ef4a5a52449a9620c987"} Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.824720 5030 scope.go:117] "RemoveContainer" containerID="31bdfc8eeca403bd17e795b8cf31b719db2f4805a115c8604e8ad198f7327da9" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.824782 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.848671 5030 scope.go:117] "RemoveContainer" containerID="63b6c6d4bece1d87f4729a39f17170c16567071872bdc73d5549a42bd98e07ba" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.872085 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.880313 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.896044 5030 scope.go:117] "RemoveContainer" containerID="c088151164804d480f3d602813acc6fe8110edbb60819ceaf2f0559e9b333bb1" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.921239 5030 scope.go:117] "RemoveContainer" containerID="d04f17499f96be1bcb743043670578f206129ddc5a8598a1d01aac043f27b4a5" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.940606 5030 scope.go:117] "RemoveContainer" containerID="ff021e7947768f870e170bfceeedcefa9f23164365486fe0e68cf658ba9ee312" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.972132 5030 scope.go:117] "RemoveContainer" containerID="d0baa50e2392f25298e7d0f9821ab644a7212ae9a2f2e398ab1e0dc812fdc4da" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.979499 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" path="/var/lib/kubelet/pods/d060c583-6bca-4692-a1e4-84ebc743f826/volumes" Jan 20 23:19:55 crc kubenswrapper[5030]: I0120 23:19:55.998979 5030 scope.go:117] "RemoveContainer" containerID="5f30fa9f5b0648ade3945014008c72e70964fe50f71f44800cc8bb9e7ffd26fd" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.024442 5030 scope.go:117] "RemoveContainer" containerID="c3edaddce88c0d920a63257712c29ebc2820562a0cebc56bf7063f7045098b45" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.046392 5030 scope.go:117] "RemoveContainer" containerID="c172e50bb94cc43be14453c778229aa3ad0f5285b8f09db985f5f9ca934e07e0" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.066611 5030 scope.go:117] "RemoveContainer" containerID="8420aa4a66cecb302aa6d121c7a14ab12b3529e6eb3b2d23cd341bdb0404c01b" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.091821 5030 scope.go:117] "RemoveContainer" containerID="ddb4980b7a5fddbacfa0f9edafb461b4d52d06ebf1613fcacc617ee6aa6c2f36" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.122256 5030 scope.go:117] "RemoveContainer" containerID="9953d67dbf3825bda9fc957256627f35a45dbc98f02872cfe23353e4f525b038" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.149680 5030 scope.go:117] "RemoveContainer" containerID="2724c2174e7487a3d20c504c8b82914e661dd1586f42ae55254b615ffc14a493" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.186485 5030 scope.go:117] "RemoveContainer" containerID="e7a7f0b5138e5ce84a4aee902bbc98528f273ace28ad5b3739a5c7855fce0f41" Jan 20 23:19:56 crc kubenswrapper[5030]: I0120 23:19:56.209177 5030 scope.go:117] "RemoveContainer" containerID="86a2e3b58648f4c6865951dbe0e5fd173fe948355567c57408f5478873a50cea" Jan 20 23:19:58 crc kubenswrapper[5030]: I0120 23:19:58.000439 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod131dfd04-f9d2-4fb7-b432-4f34f70cc4b7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod131dfd04-f9d2-4fb7-b432-4f34f70cc4b7] : Timed out while waiting for systemd to remove 
kubepods-besteffort-pod131dfd04_f9d2_4fb7_b432_4f34f70cc4b7.slice" Jan 20 23:19:58 crc kubenswrapper[5030]: E0120 23:19:58.000508 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod131dfd04-f9d2-4fb7-b432-4f34f70cc4b7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod131dfd04-f9d2-4fb7-b432-4f34f70cc4b7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod131dfd04_f9d2_4fb7_b432_4f34f70cc4b7.slice" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" podUID="131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" Jan 20 23:19:58 crc kubenswrapper[5030]: I0120 23:19:58.637808 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd84d5a65-b170-4bad-aa8a-ddd90f8f48f9"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd84d5a65-b170-4bad-aa8a-ddd90f8f48f9] : Timed out while waiting for systemd to remove kubepods-besteffort-podd84d5a65_b170_4bad_aa8a_ddd90f8f48f9.slice" Jan 20 23:19:58 crc kubenswrapper[5030]: I0120 23:19:58.871744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q" Jan 20 23:19:58 crc kubenswrapper[5030]: I0120 23:19:58.934517 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:58 crc kubenswrapper[5030]: I0120 23:19:58.944560 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-ea87-account-create-update-rg22q"] Jan 20 23:19:59 crc kubenswrapper[5030]: I0120 23:19:59.974526 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131dfd04-f9d2-4fb7-b432-4f34f70cc4b7" path="/var/lib/kubelet/pods/131dfd04-f9d2-4fb7-b432-4f34f70cc4b7/volumes" Jan 20 23:20:00 crc kubenswrapper[5030]: I0120 23:20:00.363932 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8d6c006d-67ff-426d-9203-5c769b474fa3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8d6c006d-67ff-426d-9203-5c769b474fa3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8d6c006d_67ff_426d_9203_5c769b474fa3.slice" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.118812 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-mwwl4"] Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.125463 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-mwwl4"] Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.257233 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-c75cp"] Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.258028 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.258214 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.258358 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.258496 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.258650 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.258790 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.258920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0863eb67-3ead-4744-b338-aaf75284e458" containerName="nova-cell1-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.259039 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0863eb67-3ead-4744-b338-aaf75284e458" containerName="nova-cell1-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.259166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-central-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.260529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-central-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.260768 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.260905 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.261048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-notification-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.261207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-notification-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.261349 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.261528 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-server" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.261762 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.261906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.262137 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.262295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.262428 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 
23:20:07.262596 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.262779 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.262915 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.263044 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.263379 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.263530 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="dnsmasq-dns" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.263935 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="dnsmasq-dns" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.264071 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.264746 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.265020 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.265150 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.265280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerName="nova-scheduler-scheduler" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.265409 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerName="nova-scheduler-scheduler" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.265540 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.265691 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.265868 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-expirer" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.265999 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-expirer" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.266128 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 
23:20:07.266258 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.266384 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.266498 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.266655 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.266789 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.266931 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.267058 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.267189 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="swift-recon-cron" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.268150 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="swift-recon-cron" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.268297 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.268425 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.268546 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.268697 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.268835 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.268956 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-server" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.269093 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.269212 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.269338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api-log" Jan 20 23:20:07 crc 
kubenswrapper[5030]: I0120 23:20:07.269457 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.269583 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.269783 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.270031 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.270172 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.270302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.270429 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.270556 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.270739 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.270880 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.271007 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.271133 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="proxy-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.271253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="proxy-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.271411 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.271563 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.271754 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.271938 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.272138 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-reaper" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.272320 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-reaper" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.272512 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.272774 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.272948 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.273087 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.273219 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.273408 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.273584 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.273743 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.274026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.274178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.274342 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.274482 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.274650 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.274797 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.275006 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="rsync" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.275143 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="rsync" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.275285 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="probe" Jan 
20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.275409 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="probe" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.275546 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.275705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.275846 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" containerName="kube-state-metrics" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.275976 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" containerName="kube-state-metrics" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.276108 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.276225 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.276340 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.276461 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.276582 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.276751 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.276891 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.277014 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.277144 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.277267 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.277397 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.277564 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.277773 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="cinder-scheduler" Jan 20 
23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.277959 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="cinder-scheduler" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.278108 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.278238 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.278452 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.278659 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.278809 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.278933 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279027 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279102 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279183 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="mysql-bootstrap" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279349 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279425 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279509 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09176751-6268-4f1b-9348-5adb2938e4ca" containerName="keystone-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279586 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09176751-6268-4f1b-9348-5adb2938e4ca" containerName="keystone-api" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279698 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279778 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.279866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerName="memcached" Jan 
20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.279947 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerName="memcached" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280023 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280038 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280054 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="setup-container" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="init" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280077 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="init" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280091 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280098 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-server" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280284 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener-log" Jan 20 23:20:07 crc kubenswrapper[5030]: E0120 23:20:07.280292 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="sg-core" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280300 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="sg-core" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280572 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280586 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="sg-core" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280598 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8de9d1d-d7de-4313-9b7b-0f07992ede1b" containerName="nova-scheduler-scheduler" 
Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280608 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280671 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280681 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="473c0915-5bed-4dc5-8b2a-7899b05b9afc" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280692 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280707 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280723 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="56408fa4-b1eb-4a4d-bc8a-9ab82b88957f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280732 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5039f885-ec1c-48d9-8e06-cf672e3581df" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280743 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280758 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280769 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="probe" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280785 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef744864-0218-4aad-9ffa-a926311790dd" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280792 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb6b715-a2a6-4c25-83c3-315fd30b8d92" containerName="nova-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280802 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280811 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-central-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280824 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-server" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280839 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280848 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280856 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ed443431-2b5c-4533-8523-d01380c98c1d" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280868 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-metadata" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280877 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="rsync" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280900 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="195d8fc3-e36b-4a36-92a4-da935758d9ff" containerName="dnsmasq-dns" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280908 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4827b5-38d8-4647-a2bd-bfff1b91d34e" containerName="placement-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280921 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-reaper" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280933 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3fc979-4f14-4578-9cae-d44c44ba3e8c" containerName="nova-metadata-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280944 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77c8d9b-92ac-4cfd-b1cd-fe09be04b782" containerName="neutron-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280975 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfaa83b-13e1-4a2f-a722-b0583f2641ed" containerName="memcached" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280986 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59d9bd5-ab07-4c22-83c0-71f88ee1b02d" containerName="nova-cell0-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.280996 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a68c1e-d515-4dac-a25a-d7ed5da4b15c" containerName="barbican-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281004 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-auditor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281017 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281030 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc45225-0ecd-4a1c-9439-dc3c8385e5b0" containerName="barbican-keystone-listener" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281043 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b175c0c8-ee1a-404d-9a14-d37744a24ece" containerName="cinder-api-log" Jan 20 
23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281055 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f2c51d-7595-4b7f-bdfb-3e50f0a6c657" containerName="kube-state-metrics" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281063 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf776dcd-8af4-45b0-9c39-0e8db2af4879" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281071 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16c5bc4-fe32-4711-a21f-c6e7919d41d3" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281082 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281091 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="swift-recon-cron" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281104 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09176751-6268-4f1b-9348-5adb2938e4ca" containerName="keystone-api" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba5579b-ac6a-439c-894d-25a5c14d6242" containerName="galera" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281127 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-expirer" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281137 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bcab9d-b9c0-40fe-8973-df39eeca1dd4" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281148 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="object-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281159 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="account-replicator" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281179 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e11584-61c4-455a-9943-28b4228c8921" containerName="barbican-worker" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281187 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e9d298-7338-47c3-8591-b7803798203f" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281196 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d060c583-6bca-4692-a1e4-84ebc743f826" containerName="container-updater" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281206 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="995342f6-fe0b-4188-976c-5151541dd002" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281219 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0863eb67-3ead-4744-b338-aaf75284e458" containerName="nova-cell1-conductor-conductor" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281227 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="ceilometer-notification-agent" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281238 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f7986d-211a-4216-b92f-ad900ddcc45f" containerName="cinder-scheduler" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281246 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd75cb8e-9fcd-4465-88fb-49e3600608dc" containerName="glance-log" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281257 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54757d6-0570-4290-a802-d82ff94dffb9" containerName="rabbitmq" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281269 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da27e47-283b-48aa-9439-f8b3cf0c3426" containerName="proxy-httpd" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281820 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-c75cp"] Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.281933 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.285606 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.285970 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.286210 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.286838 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.364199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.364387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.364433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5zb\" (UniqueName: \"kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.465800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.465918 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.465975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5zb\" (UniqueName: \"kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.466529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.467279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.501936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5zb\" (UniqueName: \"kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb\") pod \"crc-storage-crc-c75cp\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.600657 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:07 crc kubenswrapper[5030]: I0120 23:20:07.980027 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f2b839-4822-4121-a227-334e3a1fb94a" path="/var/lib/kubelet/pods/80f2b839-4822-4121-a227-334e3a1fb94a/volumes" Jan 20 23:20:08 crc kubenswrapper[5030]: I0120 23:20:08.093053 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-c75cp"] Jan 20 23:20:08 crc kubenswrapper[5030]: I0120 23:20:08.106039 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:20:08 crc kubenswrapper[5030]: I0120 23:20:08.981757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c75cp" event={"ID":"f087b6c3-d3ac-4e9c-b575-455cb29fb74e","Type":"ContainerStarted","Data":"c1c72bf7b21f657eff09f5362cbd4a2437cf1b2a108eb738f4a88cf96efe412f"} Jan 20 23:20:09 crc kubenswrapper[5030]: I0120 23:20:09.994872 5030 generic.go:334] "Generic (PLEG): container finished" podID="f087b6c3-d3ac-4e9c-b575-455cb29fb74e" containerID="876076c94c99519acaa7d7328c451ee7dcca84a94dd09787d66a65f9124d1455" exitCode=0 Jan 20 23:20:09 crc kubenswrapper[5030]: I0120 23:20:09.994938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c75cp" event={"ID":"f087b6c3-d3ac-4e9c-b575-455cb29fb74e","Type":"ContainerDied","Data":"876076c94c99519acaa7d7328c451ee7dcca84a94dd09787d66a65f9124d1455"} Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.389116 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.536750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f087b6c3-d3ac-4e9c-b575-455cb29fb74e" (UID: "f087b6c3-d3ac-4e9c-b575-455cb29fb74e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.536605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt\") pod \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.537051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5zb\" (UniqueName: \"kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb\") pod \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.537337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage\") pod \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\" (UID: \"f087b6c3-d3ac-4e9c-b575-455cb29fb74e\") " Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.538704 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.544711 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb" (OuterVolumeSpecName: "kube-api-access-8g5zb") pod "f087b6c3-d3ac-4e9c-b575-455cb29fb74e" (UID: "f087b6c3-d3ac-4e9c-b575-455cb29fb74e"). InnerVolumeSpecName "kube-api-access-8g5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.569106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f087b6c3-d3ac-4e9c-b575-455cb29fb74e" (UID: "f087b6c3-d3ac-4e9c-b575-455cb29fb74e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.640040 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:11 crc kubenswrapper[5030]: I0120 23:20:11.640121 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g5zb\" (UniqueName: \"kubernetes.io/projected/f087b6c3-d3ac-4e9c-b575-455cb29fb74e-kube-api-access-8g5zb\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:12 crc kubenswrapper[5030]: I0120 23:20:12.020864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-c75cp" event={"ID":"f087b6c3-d3ac-4e9c-b575-455cb29fb74e","Type":"ContainerDied","Data":"c1c72bf7b21f657eff09f5362cbd4a2437cf1b2a108eb738f4a88cf96efe412f"} Jan 20 23:20:12 crc kubenswrapper[5030]: I0120 23:20:12.021241 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1c72bf7b21f657eff09f5362cbd4a2437cf1b2a108eb738f4a88cf96efe412f" Jan 20 23:20:12 crc kubenswrapper[5030]: I0120 23:20:12.020956 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-c75cp" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.088686 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-c75cp"] Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.102368 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-c75cp"] Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.218086 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-t7hjf"] Jan 20 23:20:15 crc kubenswrapper[5030]: E0120 23:20:15.218559 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.218590 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:15 crc kubenswrapper[5030]: E0120 23:20:15.218659 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f087b6c3-d3ac-4e9c-b575-455cb29fb74e" containerName="storage" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.218673 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f087b6c3-d3ac-4e9c-b575-455cb29fb74e" containerName="storage" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.218870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6c006d-67ff-426d-9203-5c769b474fa3" containerName="mariadb-account-create-update" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.218901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f087b6c3-d3ac-4e9c-b575-455cb29fb74e" containerName="storage" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.219741 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.223382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7hjf"] Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.223672 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.223707 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.224036 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.224349 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.298811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.298937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lznd7\" (UniqueName: \"kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.298968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.400418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.400589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lznd7\" (UniqueName: \"kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.400767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.401214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " 
pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.402430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.438039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lznd7\" (UniqueName: \"kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7\") pod \"crc-storage-crc-t7hjf\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.547916 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.846426 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7hjf"] Jan 20 23:20:15 crc kubenswrapper[5030]: I0120 23:20:15.972382 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f087b6c3-d3ac-4e9c-b575-455cb29fb74e" path="/var/lib/kubelet/pods/f087b6c3-d3ac-4e9c-b575-455cb29fb74e/volumes" Jan 20 23:20:16 crc kubenswrapper[5030]: I0120 23:20:16.061273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7hjf" event={"ID":"000eeb10-8287-4225-9abb-e925a0ee2c37","Type":"ContainerStarted","Data":"6cd16572a5aa308a6c113df39b4d40bbcac50e85aa90a19e4fbe28065fe21d86"} Jan 20 23:20:17 crc kubenswrapper[5030]: I0120 23:20:17.072405 5030 generic.go:334] "Generic (PLEG): container finished" podID="000eeb10-8287-4225-9abb-e925a0ee2c37" containerID="b43b0407df047d8476ad43c9df0c38db00b8003aee80ca3e76a060dbf6a4e8b7" exitCode=0 Jan 20 23:20:17 crc kubenswrapper[5030]: I0120 23:20:17.072496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7hjf" event={"ID":"000eeb10-8287-4225-9abb-e925a0ee2c37","Type":"ContainerDied","Data":"b43b0407df047d8476ad43c9df0c38db00b8003aee80ca3e76a060dbf6a4e8b7"} Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.474065 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.550997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lznd7\" (UniqueName: \"kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7\") pod \"000eeb10-8287-4225-9abb-e925a0ee2c37\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.551109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage\") pod \"000eeb10-8287-4225-9abb-e925a0ee2c37\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.551186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt\") pod \"000eeb10-8287-4225-9abb-e925a0ee2c37\" (UID: \"000eeb10-8287-4225-9abb-e925a0ee2c37\") " Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.551376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "000eeb10-8287-4225-9abb-e925a0ee2c37" (UID: "000eeb10-8287-4225-9abb-e925a0ee2c37"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.551732 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/000eeb10-8287-4225-9abb-e925a0ee2c37-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.557730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7" (OuterVolumeSpecName: "kube-api-access-lznd7") pod "000eeb10-8287-4225-9abb-e925a0ee2c37" (UID: "000eeb10-8287-4225-9abb-e925a0ee2c37"). InnerVolumeSpecName "kube-api-access-lznd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.573604 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "000eeb10-8287-4225-9abb-e925a0ee2c37" (UID: "000eeb10-8287-4225-9abb-e925a0ee2c37"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.652788 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lznd7\" (UniqueName: \"kubernetes.io/projected/000eeb10-8287-4225-9abb-e925a0ee2c37-kube-api-access-lznd7\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:18 crc kubenswrapper[5030]: I0120 23:20:18.652878 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/000eeb10-8287-4225-9abb-e925a0ee2c37-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:19 crc kubenswrapper[5030]: I0120 23:20:19.098860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7hjf" event={"ID":"000eeb10-8287-4225-9abb-e925a0ee2c37","Type":"ContainerDied","Data":"6cd16572a5aa308a6c113df39b4d40bbcac50e85aa90a19e4fbe28065fe21d86"} Jan 20 23:20:19 crc kubenswrapper[5030]: I0120 23:20:19.098909 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cd16572a5aa308a6c113df39b4d40bbcac50e85aa90a19e4fbe28065fe21d86" Jan 20 23:20:19 crc kubenswrapper[5030]: I0120 23:20:19.098946 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7hjf" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.568588 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:20:31 crc kubenswrapper[5030]: E0120 23:20:31.569602 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000eeb10-8287-4225-9abb-e925a0ee2c37" containerName="storage" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.569644 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="000eeb10-8287-4225-9abb-e925a0ee2c37" containerName="storage" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.569848 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="000eeb10-8287-4225-9abb-e925a0ee2c37" containerName="storage" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.573297 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.576201 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.576311 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.576792 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-hdchh" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.577299 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.577499 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.577506 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.579031 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.590719 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.749793 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74cm9\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.749853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.749886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.749919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 
23:20:31.750148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750333 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.750820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.841038 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.842429 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.845063 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.846249 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.846714 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.846750 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-gh8gp" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.847198 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.847942 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.848279 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.852961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74cm9\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.853237 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.853576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.853717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.853928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc 
kubenswrapper[5030]: I0120 23:20:31.854852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.860541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.862998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.865554 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.867795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.873069 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.874456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.891228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74cm9\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.905974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.954854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 
23:20:31.954993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqkh\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.955434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.956458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.957871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:31 crc kubenswrapper[5030]: I0120 23:20:31.958014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059102 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059237 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqkh\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059377 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.059479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.060473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.061147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.061426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.061761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.063063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.068375 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.071327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.073471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.075150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.078686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqkh\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.084893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.198900 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.260442 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.736104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:20:32 crc kubenswrapper[5030]: W0120 23:20:32.787248 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod640d9b89_a25b_43d4_bf6b_361b56ad9381.slice/crio-1b51c66d93f05b18faeea3741f8c345ead3b42511cad65268bf2fae5f1d6dddd WatchSource:0}: Error finding container 1b51c66d93f05b18faeea3741f8c345ead3b42511cad65268bf2fae5f1d6dddd: Status 404 returned error can't find the container with id 1b51c66d93f05b18faeea3741f8c345ead3b42511cad65268bf2fae5f1d6dddd Jan 20 23:20:32 crc kubenswrapper[5030]: I0120 23:20:32.787451 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.241348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerStarted","Data":"1b51c66d93f05b18faeea3741f8c345ead3b42511cad65268bf2fae5f1d6dddd"} Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.243617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerStarted","Data":"10024960712dd50695d0c5e50d61676789c1dd0b8edb6b5beaf2bef0e0994063"} Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.475355 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.480300 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.484929 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-qc7gv" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.485378 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.486675 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.487437 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.498303 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.513812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mwr2\" (UniqueName: \"kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.584907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 
23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.585227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.585273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mwr2\" (UniqueName: \"kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686348 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.686442 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.687004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.687780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.688008 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.688143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.689230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.694977 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.699456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.717744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mwr2\" (UniqueName: \"kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.728702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:33 crc kubenswrapper[5030]: I0120 23:20:33.799270 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.270819 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:20:34 crc kubenswrapper[5030]: W0120 23:20:34.272057 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bdbab30_d0e5_435f_8b5a_19b1bb9496c5.slice/crio-250488f9a7b916786e685f61c5d2e8800ff8b73d0d965fcaa7ba487550700706 WatchSource:0}: Error finding container 250488f9a7b916786e685f61c5d2e8800ff8b73d0d965fcaa7ba487550700706: Status 404 returned error can't find the container with id 250488f9a7b916786e685f61c5d2e8800ff8b73d0d965fcaa7ba487550700706 Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.959412 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.961803 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.965694 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-86kd6" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.966154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.969605 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.971481 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:20:34 crc kubenswrapper[5030]: I0120 23:20:34.982753 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.107479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.107527 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.107765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.107860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.107974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.108094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.108796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.108888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gsqx\" (UniqueName: \"kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.210712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gsqx\" (UniqueName: \"kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.211168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.211577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.213515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.213699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.215086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.215609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.227951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.228073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.228420 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.241453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gsqx\" (UniqueName: \"kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.262325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.273584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerStarted","Data":"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22"} Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.273643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerStarted","Data":"250488f9a7b916786e685f61c5d2e8800ff8b73d0d965fcaa7ba487550700706"} Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.275367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerStarted","Data":"7c7e9190938d40b95c597cf3c50d0385e343102d0e6ab1a4234a94fd89a2f688"} Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.277731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerStarted","Data":"bc2be2e614785a83e92f8382eb54b760f63601c5e35ecfb23e625e29c1a1fe79"} Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.291590 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.386761 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.387971 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.395610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.395916 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.399863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.407871 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-ztdtq" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.532960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.533174 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.533270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.533306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.533335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hj84\" (UniqueName: \"kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hj84\" (UniqueName: \"kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635388 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.635910 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.636103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.640264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.642265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.651095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hj84\" (UniqueName: \"kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84\") pod \"memcached-0\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.716948 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:35 crc kubenswrapper[5030]: I0120 23:20:35.812455 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:20:35 crc kubenswrapper[5030]: W0120 23:20:35.818267 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0213ca2_a56f_4a81_bfe7_814d2eeeb275.slice/crio-a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5 WatchSource:0}: Error finding container a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5: Status 404 returned error can't find the container with id a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5 Jan 20 23:20:36 crc kubenswrapper[5030]: I0120 23:20:36.142442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:20:36 crc kubenswrapper[5030]: W0120 23:20:36.142918 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e47ba81_d55b_48df_bce7_31220884279a.slice/crio-1d198fefad76f437ed452f5a60cc2a6f2226b2aba4bc29f253596f279ebcbb6c WatchSource:0}: Error finding container 1d198fefad76f437ed452f5a60cc2a6f2226b2aba4bc29f253596f279ebcbb6c: Status 404 returned error can't find the container with id 1d198fefad76f437ed452f5a60cc2a6f2226b2aba4bc29f253596f279ebcbb6c Jan 20 23:20:36 crc kubenswrapper[5030]: I0120 23:20:36.292329 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerStarted","Data":"911be87e425ae0f10d9f79f585e422dfc37027171e5657160647c58cbb384334"} Jan 20 23:20:36 crc kubenswrapper[5030]: I0120 23:20:36.292381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerStarted","Data":"a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5"} Jan 20 23:20:36 crc kubenswrapper[5030]: I0120 23:20:36.295598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9e47ba81-d55b-48df-bce7-31220884279a","Type":"ContainerStarted","Data":"1d198fefad76f437ed452f5a60cc2a6f2226b2aba4bc29f253596f279ebcbb6c"} Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.001274 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.002540 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.004646 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-l9cgk" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.010067 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.163502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx65g\" (UniqueName: \"kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g\") pod \"kube-state-metrics-0\" (UID: \"f2ec21ef-c0b8-4832-a46a-c50fe992fde9\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.266610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx65g\" (UniqueName: \"kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g\") pod \"kube-state-metrics-0\" (UID: \"f2ec21ef-c0b8-4832-a46a-c50fe992fde9\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.285390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx65g\" (UniqueName: \"kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g\") pod \"kube-state-metrics-0\" (UID: \"f2ec21ef-c0b8-4832-a46a-c50fe992fde9\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.304222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9e47ba81-d55b-48df-bce7-31220884279a","Type":"ContainerStarted","Data":"17543e07621bd81516d23441483660efcf74bfd8b130e95eccf5638cbe189c50"} Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.322537 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.331547 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.331516438 podStartE2EDuration="2.331516438s" podCreationTimestamp="2026-01-20 23:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:37.322570573 +0000 UTC m=+2709.642830861" watchObservedRunningTime="2026-01-20 23:20:37.331516438 +0000 UTC m=+2709.651776746" Jan 20 23:20:37 crc kubenswrapper[5030]: I0120 23:20:37.718097 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:20:37 crc kubenswrapper[5030]: W0120 23:20:37.722696 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2ec21ef_c0b8_4832_a46a_c50fe992fde9.slice/crio-6c3292db84d61aeccd8e8321ca97fd1d37ce0aca764548090b67ed23db272f2d WatchSource:0}: Error finding container 6c3292db84d61aeccd8e8321ca97fd1d37ce0aca764548090b67ed23db272f2d: Status 404 returned error can't find the container with id 6c3292db84d61aeccd8e8321ca97fd1d37ce0aca764548090b67ed23db272f2d Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.323972 5030 generic.go:334] "Generic (PLEG): container finished" podID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerID="bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22" exitCode=0 Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.324439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerDied","Data":"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22"} Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.327947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f2ec21ef-c0b8-4832-a46a-c50fe992fde9","Type":"ContainerStarted","Data":"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75"} Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.328002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f2ec21ef-c0b8-4832-a46a-c50fe992fde9","Type":"ContainerStarted","Data":"6c3292db84d61aeccd8e8321ca97fd1d37ce0aca764548090b67ed23db272f2d"} Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.328047 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:38 crc kubenswrapper[5030]: I0120 23:20:38.383923 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.017395366 podStartE2EDuration="2.383902155s" podCreationTimestamp="2026-01-20 23:20:36 +0000 UTC" firstStartedPulling="2026-01-20 23:20:37.726273034 +0000 UTC m=+2710.046533332" lastFinishedPulling="2026-01-20 23:20:38.092779823 +0000 UTC m=+2710.413040121" observedRunningTime="2026-01-20 23:20:38.375829211 +0000 UTC m=+2710.696089509" watchObservedRunningTime="2026-01-20 23:20:38.383902155 +0000 UTC m=+2710.704162453" Jan 20 23:20:39 crc kubenswrapper[5030]: I0120 23:20:39.337994 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" 
containerID="911be87e425ae0f10d9f79f585e422dfc37027171e5657160647c58cbb384334" exitCode=0 Jan 20 23:20:39 crc kubenswrapper[5030]: I0120 23:20:39.338090 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerDied","Data":"911be87e425ae0f10d9f79f585e422dfc37027171e5657160647c58cbb384334"} Jan 20 23:20:39 crc kubenswrapper[5030]: I0120 23:20:39.341939 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerStarted","Data":"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca"} Jan 20 23:20:39 crc kubenswrapper[5030]: I0120 23:20:39.342509 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:39 crc kubenswrapper[5030]: I0120 23:20:39.390913 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.390890913 podStartE2EDuration="7.390890913s" podCreationTimestamp="2026-01-20 23:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:39.388058695 +0000 UTC m=+2711.708318993" watchObservedRunningTime="2026-01-20 23:20:39.390890913 +0000 UTC m=+2711.711151201" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.307026 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.308810 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.313342 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.313498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-vh4r9" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.313842 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.315827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.316572 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.330661 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.357539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerStarted","Data":"b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15"} Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.386731 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.386712164 podStartE2EDuration="7.386712164s" podCreationTimestamp="2026-01-20 23:20:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:40.379279155 +0000 UTC m=+2712.699539483" watchObservedRunningTime="2026-01-20 23:20:40.386712164 +0000 UTC m=+2712.706972462" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.419440 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.419650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.419725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.419842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.419925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgm86\" (UniqueName: \"kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.420024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.420093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.420167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.521779 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.521868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgm86\" (UniqueName: \"kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.521976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.522921 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.523141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.523202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.523456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.528262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.528271 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.529094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.541672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgm86\" (UniqueName: \"kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.561680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:40 crc kubenswrapper[5030]: I0120 23:20:40.627423 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:41 crc kubenswrapper[5030]: I0120 23:20:41.078652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:20:41 crc kubenswrapper[5030]: I0120 23:20:41.372474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerStarted","Data":"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5"} Jan 20 23:20:41 crc kubenswrapper[5030]: I0120 23:20:41.372854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerStarted","Data":"685939fe3cb6d868616254e3278d31a750f5b1768e062fc532327bdb009ad0f1"} Jan 20 23:20:42 crc kubenswrapper[5030]: I0120 23:20:42.382069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerStarted","Data":"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3"} Jan 20 23:20:42 crc kubenswrapper[5030]: I0120 23:20:42.410504 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.410488556 podStartE2EDuration="3.410488556s" podCreationTimestamp="2026-01-20 23:20:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:42.404322038 +0000 UTC m=+2714.724582326" watchObservedRunningTime="2026-01-20 23:20:42.410488556 +0000 UTC m=+2714.730748844" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.210952 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.213023 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.215812 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.217381 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.222317 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-4z7ks" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.222401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.229466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.267417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn87b\" (UniqueName: \"kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn87b\" (UniqueName: \"kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369600 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.369960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.370420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.370609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.376050 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.377713 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.378367 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.387060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn87b\" (UniqueName: \"kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.394368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.549014 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.627956 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.800026 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:43 crc kubenswrapper[5030]: I0120 23:20:43.800333 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:44 crc kubenswrapper[5030]: I0120 23:20:44.019268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:20:44 crc kubenswrapper[5030]: I0120 23:20:44.400785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerStarted","Data":"bd96a3a20843e6f56642dd70f8707c625d3a285b19dc8ee6e4fe00aa44c6b9f9"} Jan 20 23:20:44 crc kubenswrapper[5030]: I0120 23:20:44.401079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerStarted","Data":"6fe18aeb4fe0ded445a9b466df8df29be383fa0340292b59d55d812502344579"} Jan 20 23:20:44 crc kubenswrapper[5030]: I0120 23:20:44.401094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerStarted","Data":"8500dd757a5727ff00695803c642317dfe7c4f2c9c9e647cbdbbd6f3b97a9161"} Jan 20 23:20:44 crc kubenswrapper[5030]: I0120 23:20:44.424613 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.424598786 podStartE2EDuration="2.424598786s" podCreationTimestamp="2026-01-20 23:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:44.422252879 +0000 UTC m=+2716.742513167" watchObservedRunningTime="2026-01-20 23:20:44.424598786 +0000 UTC m=+2716.744859064" Jan 20 23:20:45 crc kubenswrapper[5030]: I0120 23:20:45.292782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:45 crc kubenswrapper[5030]: I0120 23:20:45.293185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:45 crc kubenswrapper[5030]: I0120 23:20:45.628288 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:45 crc kubenswrapper[5030]: I0120 23:20:45.718905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:20:46 crc kubenswrapper[5030]: I0120 23:20:46.176712 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:46 crc kubenswrapper[5030]: I0120 23:20:46.290833 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:20:46 crc kubenswrapper[5030]: I0120 23:20:46.550141 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:46 crc kubenswrapper[5030]: I0120 23:20:46.705300 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:46 crc kubenswrapper[5030]: I0120 23:20:46.773157 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:20:47 crc kubenswrapper[5030]: I0120 23:20:47.328182 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:20:47 crc kubenswrapper[5030]: I0120 23:20:47.707345 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:47 crc kubenswrapper[5030]: I0120 23:20:47.785453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.320959 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.325495 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.327236 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.327757 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.328322 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-mk8g8" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.329154 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.353341 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.459287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.459342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.459413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.459455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.459498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfbkd\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.549680 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfbkd\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.561895 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.562203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: E0120 23:20:48.562315 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:20:48 crc 
kubenswrapper[5030]: E0120 23:20:48.562348 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:20:48 crc kubenswrapper[5030]: E0120 23:20:48.562391 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift podName:b607ce93-5c69-4cf7-a953-3d53d413fcff nodeName:}" failed. No retries permitted until 2026-01-20 23:20:49.062373723 +0000 UTC m=+2721.382634011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift") pod "swift-storage-0" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff") : configmap "swift-ring-files" not found Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.562856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.582560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfbkd\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:48 crc kubenswrapper[5030]: I0120 23:20:48.585362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.067760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:49 crc kubenswrapper[5030]: E0120 23:20:49.067919 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:20:49 crc kubenswrapper[5030]: E0120 23:20:49.068142 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:20:49 crc kubenswrapper[5030]: E0120 23:20:49.068192 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift podName:b607ce93-5c69-4cf7-a953-3d53d413fcff nodeName:}" failed. No retries permitted until 2026-01-20 23:20:50.068175542 +0000 UTC m=+2722.388435830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift") pod "swift-storage-0" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff") : configmap "swift-ring-files" not found Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.609144 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.634834 5030 scope.go:117] "RemoveContainer" containerID="2b0195567e390a19554c983b7344fac252c95ea9a88d9187dcfec9456bdc0669" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.654218 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.659912 5030 scope.go:117] "RemoveContainer" containerID="0799a414ab9bc02a00a60fde72c5c4830fc0204c4a5564607ae804f0a7459cfb" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.932355 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.933746 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.938854 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-xksnq" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.938925 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.939107 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.939175 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.947413 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:49 crc kubenswrapper[5030]: I0120 23:20:49.985887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2km\" (UniqueName: \"kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.086815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.086915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt2km\" (UniqueName: \"kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.087208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: E0120 23:20:50.087394 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:20:50 crc kubenswrapper[5030]: E0120 23:20:50.087418 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:20:50 crc kubenswrapper[5030]: E0120 23:20:50.087467 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift podName:b607ce93-5c69-4cf7-a953-3d53d413fcff nodeName:}" failed. No retries permitted until 2026-01-20 23:20:52.087447684 +0000 UTC m=+2724.407707982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift") pod "swift-storage-0" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff") : configmap "swift-ring-files" not found Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.088011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.088641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.088909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.092497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.093931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.094250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.105666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2km\" (UniqueName: \"kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km\") pod \"ovn-northd-0\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.267560 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:50 crc kubenswrapper[5030]: I0120 23:20:50.597767 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:20:51 crc kubenswrapper[5030]: I0120 23:20:51.469447 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerStarted","Data":"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4"} Jan 20 23:20:51 crc kubenswrapper[5030]: I0120 23:20:51.469780 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerStarted","Data":"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4"} Jan 20 23:20:51 crc kubenswrapper[5030]: I0120 23:20:51.469795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerStarted","Data":"58e9783cfdb470224161412f6131249bf2ab3292c9ce9b4db52d04fdf42b60a7"} Jan 20 23:20:51 crc kubenswrapper[5030]: I0120 23:20:51.469830 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.122119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:52 crc kubenswrapper[5030]: E0120 23:20:52.122406 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:20:52 crc kubenswrapper[5030]: E0120 23:20:52.122576 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:20:52 crc kubenswrapper[5030]: E0120 23:20:52.122729 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift podName:b607ce93-5c69-4cf7-a953-3d53d413fcff nodeName:}" failed. No retries permitted until 2026-01-20 23:20:56.122696832 +0000 UTC m=+2728.442957160 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift") pod "swift-storage-0" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff") : configmap "swift-ring-files" not found Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.213823 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.213797007 podStartE2EDuration="3.213797007s" podCreationTimestamp="2026-01-20 23:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:51.489456156 +0000 UTC m=+2723.809716444" watchObservedRunningTime="2026-01-20 23:20:52.213797007 +0000 UTC m=+2724.534057335" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.221447 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-qwhwh"] Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.222903 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.225130 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.226109 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.229921 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.245824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-qwhwh"] Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.325614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.325709 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.325740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.325793 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzz9\" (UniqueName: \"kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.325879 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.326054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.326219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427728 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzz9\" (UniqueName: \"kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9\") pod 
\"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.427839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.428428 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.429266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.434200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.434351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.435461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.438561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.452376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzz9\" (UniqueName: \"kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9\") pod \"swift-ring-rebalance-qwhwh\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.458640 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b7q65"] Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.468882 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/root-account-create-update-b7q65"] Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.469159 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.475913 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.528890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtkwj\" (UniqueName: \"kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.529274 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.548974 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.630664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.630875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkwj\" (UniqueName: \"kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.632723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.651094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkwj\" (UniqueName: \"kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj\") pod \"root-account-create-update-b7q65\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:52 crc kubenswrapper[5030]: I0120 23:20:52.818577 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.004513 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-qwhwh"] Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.076253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b7q65"] Jan 20 23:20:53 crc kubenswrapper[5030]: W0120 23:20:53.082436 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89f3ce0_5bd9_419f_b115_6e1f0ec0cd61.slice/crio-21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e WatchSource:0}: Error finding container 21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e: Status 404 returned error can't find the container with id 21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.500579 5030 generic.go:334] "Generic (PLEG): container finished" podID="f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" containerID="78e9eb62b758fe4af2f56913ae48a52d46e04bbf281a309ace0c938c99079366" exitCode=0 Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.500640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b7q65" event={"ID":"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61","Type":"ContainerDied","Data":"78e9eb62b758fe4af2f56913ae48a52d46e04bbf281a309ace0c938c99079366"} Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.500697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b7q65" event={"ID":"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61","Type":"ContainerStarted","Data":"21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e"} Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.502044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" event={"ID":"d427c956-ad6c-44b8-b112-ec484902b44c","Type":"ContainerStarted","Data":"db87732f123d106f7254ea1e9dc33ebc44187354473ff54f9218551dfb61ec86"} Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.502075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" event={"ID":"d427c956-ad6c-44b8-b112-ec484902b44c","Type":"ContainerStarted","Data":"e4147da6705b77ab55b503b206fc40b50ae2703f16c560eca48a336d03afb7ec"} Jan 20 23:20:53 crc kubenswrapper[5030]: I0120 23:20:53.551914 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" podStartSLOduration=1.5518930050000002 podStartE2EDuration="1.551893005s" podCreationTimestamp="2026-01-20 23:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:53.541758182 +0000 UTC m=+2725.862018470" watchObservedRunningTime="2026-01-20 23:20:53.551893005 +0000 UTC m=+2725.872153303" Jan 20 23:20:54 crc kubenswrapper[5030]: I0120 23:20:54.987267 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.170763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts\") pod \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.170907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtkwj\" (UniqueName: \"kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj\") pod \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\" (UID: \"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61\") " Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.171489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" (UID: "f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.175913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj" (OuterVolumeSpecName: "kube-api-access-vtkwj") pod "f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" (UID: "f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61"). InnerVolumeSpecName "kube-api-access-vtkwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.246798 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-2tf9j"] Jan 20 23:20:55 crc kubenswrapper[5030]: E0120 23:20:55.247319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" containerName="mariadb-account-create-update" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.247390 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" containerName="mariadb-account-create-update" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.247606 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" containerName="mariadb-account-create-update" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.248348 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.271136 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-2tf9j"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.272838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.272929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.273012 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtkwj\" (UniqueName: \"kubernetes.io/projected/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-kube-api-access-vtkwj\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.273039 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.362608 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-d854n"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.364022 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.366154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.372976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-d854n"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.375349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4cs\" (UniqueName: \"kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.375407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.375506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.375613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.376251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.393493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q\") pod \"keystone-db-create-2tf9j\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.477425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.477552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qq4cs\" (UniqueName: \"kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.478224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.508069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4cs\" (UniqueName: \"kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs\") pod \"keystone-b695-account-create-update-d854n\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.534781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-b7q65" event={"ID":"f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61","Type":"ContainerDied","Data":"21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e"} Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.534977 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c5377be5e638f03fd4ca94ebd983274ca23236a5344c0acbd6c7e886024d9e" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.534842 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-b7q65" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.566025 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.573263 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-4tt5b"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.580238 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.581653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sdb\" (UniqueName: \"kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.581765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.592705 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-4tt5b"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.683963 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.684884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.684960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sdb\" (UniqueName: \"kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.686246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.688682 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.689580 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.697084 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.703333 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.723226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sdb\" (UniqueName: \"kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb\") pod \"placement-db-create-4tt5b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.787395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.787731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zmtn\" (UniqueName: \"kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.874206 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-chmn4"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.876313 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.889577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2424g\" (UniqueName: \"kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.889633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.889696 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.889718 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zmtn\" (UniqueName: \"kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.890366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.898630 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-chmn4"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.910738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zmtn\" (UniqueName: \"kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn\") pod \"placement-4bda-account-create-update-bqzv8\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.921684 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.978015 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-mdz2z"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.979090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-mdz2z"] Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.979190 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.981145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.994676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2424g\" (UniqueName: \"kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.994802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.994826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgxbb\" (UniqueName: \"kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.994889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:55 crc kubenswrapper[5030]: I0120 23:20:55.997427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.020808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2424g\" (UniqueName: \"kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g\") pod \"glance-db-create-chmn4\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.021181 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.096281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgxbb\" (UniqueName: \"kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.096371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.097309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.117063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgxbb\" (UniqueName: \"kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb\") pod \"glance-659a-account-create-update-mdz2z\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.121301 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.179799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-d854n"] Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.186197 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-2tf9j"] Jan 20 23:20:56 crc kubenswrapper[5030]: W0120 23:20:56.191990 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c184fea_5939_43b6_80db_23079b0f6d88.slice/crio-7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e WatchSource:0}: Error finding container 7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e: Status 404 returned error can't find the container with id 7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.197078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:20:56 crc kubenswrapper[5030]: E0120 23:20:56.197218 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:20:56 crc kubenswrapper[5030]: E0120 23:20:56.197236 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:20:56 crc kubenswrapper[5030]: E0120 23:20:56.197283 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift podName:b607ce93-5c69-4cf7-a953-3d53d413fcff nodeName:}" failed. No retries permitted until 2026-01-20 23:21:04.197266304 +0000 UTC m=+2736.517526582 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift") pod "swift-storage-0" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff") : configmap "swift-ring-files" not found Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.198789 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.433779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-4tt5b"] Jan 20 23:20:56 crc kubenswrapper[5030]: W0120 23:20:56.444941 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e882b9a_9de5_451f_8cb3_896629ba3c4b.slice/crio-a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c WatchSource:0}: Error finding container a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c: Status 404 returned error can't find the container with id a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.520684 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8"] Jan 20 23:20:56 crc kubenswrapper[5030]: W0120 23:20:56.541188 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e6054b2_6da1_421a_8364_f6b57c51c01d.slice/crio-60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409 WatchSource:0}: Error finding container 60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409: Status 404 returned error can't find the container with id 60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409 Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.545790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" event={"ID":"2c184fea-5939-43b6-80db-23079b0f6d88","Type":"ContainerStarted","Data":"be58b3a0d8691178b2610831673d15113e2422d314d8e39d2cd7d4b33b03747b"} Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.545835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" event={"ID":"2c184fea-5939-43b6-80db-23079b0f6d88","Type":"ContainerStarted","Data":"7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e"} Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.550275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" event={"ID":"ac03b977-2ac1-4210-9b75-f3e6d87ef13e","Type":"ContainerStarted","Data":"da2491bd3ada4b8716bf4a0515aa6180de24a54d201a7950a35c5b4868a9337b"} Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.550316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" event={"ID":"ac03b977-2ac1-4210-9b75-f3e6d87ef13e","Type":"ContainerStarted","Data":"2d37052a4560989411c6b3bddc345d5dde0cc6599f169e72c9179c58f914bc1e"} Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.551390 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-4tt5b" event={"ID":"5e882b9a-9de5-451f-8cb3-896629ba3c4b","Type":"ContainerStarted","Data":"a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c"} Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.561133 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" podStartSLOduration=1.561116999 podStartE2EDuration="1.561116999s" podCreationTimestamp="2026-01-20 23:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:56.558301062 +0000 UTC m=+2728.878561360" watchObservedRunningTime="2026-01-20 23:20:56.561116999 +0000 UTC m=+2728.881377287" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.573113 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" podStartSLOduration=1.573093786 podStartE2EDuration="1.573093786s" podCreationTimestamp="2026-01-20 23:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:20:56.569882119 +0000 UTC m=+2728.890142407" watchObservedRunningTime="2026-01-20 23:20:56.573093786 +0000 UTC m=+2728.893354064" Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.622758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-mdz2z"] Jan 20 23:20:56 crc kubenswrapper[5030]: W0120 23:20:56.631510 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d70339_6f3a_45dc_858d_2690011172c5.slice/crio-f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b WatchSource:0}: Error finding container f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b: Status 404 returned error can't find the container with id f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b Jan 20 23:20:56 crc kubenswrapper[5030]: I0120 23:20:56.750414 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-chmn4"] Jan 20 23:20:56 crc kubenswrapper[5030]: W0120 23:20:56.755445 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd65fc805_6e3b_4264_9974_e5da232e4f1e.slice/crio-58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a WatchSource:0}: Error finding container 58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a: Status 404 returned error can't find the container with id 58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.566840 5030 generic.go:334] "Generic (PLEG): container finished" podID="d65fc805-6e3b-4264-9974-e5da232e4f1e" containerID="f6e472c12202cebea002a0104ff390df2d1fb5a738489ed64bf7a451c77424dc" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.567000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-chmn4" event={"ID":"d65fc805-6e3b-4264-9974-e5da232e4f1e","Type":"ContainerDied","Data":"f6e472c12202cebea002a0104ff390df2d1fb5a738489ed64bf7a451c77424dc"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.567059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-chmn4" event={"ID":"d65fc805-6e3b-4264-9974-e5da232e4f1e","Type":"ContainerStarted","Data":"58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.569002 5030 generic.go:334] "Generic (PLEG): container finished" podID="9e6054b2-6da1-421a-8364-f6b57c51c01d" containerID="520c14c6afe1bdc4fa9d9f804c73771f4b7427492baf3220765f9e111a1932b6" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.569066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" event={"ID":"9e6054b2-6da1-421a-8364-f6b57c51c01d","Type":"ContainerDied","Data":"520c14c6afe1bdc4fa9d9f804c73771f4b7427492baf3220765f9e111a1932b6"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.569080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" event={"ID":"9e6054b2-6da1-421a-8364-f6b57c51c01d","Type":"ContainerStarted","Data":"60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.571912 5030 generic.go:334] "Generic (PLEG): container finished" podID="2c184fea-5939-43b6-80db-23079b0f6d88" containerID="be58b3a0d8691178b2610831673d15113e2422d314d8e39d2cd7d4b33b03747b" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.571969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" event={"ID":"2c184fea-5939-43b6-80db-23079b0f6d88","Type":"ContainerDied","Data":"be58b3a0d8691178b2610831673d15113e2422d314d8e39d2cd7d4b33b03747b"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.573399 5030 generic.go:334] "Generic (PLEG): container finished" podID="ac03b977-2ac1-4210-9b75-f3e6d87ef13e" containerID="da2491bd3ada4b8716bf4a0515aa6180de24a54d201a7950a35c5b4868a9337b" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.573457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" event={"ID":"ac03b977-2ac1-4210-9b75-f3e6d87ef13e","Type":"ContainerDied","Data":"da2491bd3ada4b8716bf4a0515aa6180de24a54d201a7950a35c5b4868a9337b"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.575419 5030 generic.go:334] "Generic (PLEG): container finished" podID="36d70339-6f3a-45dc-858d-2690011172c5" containerID="ac32d0c374c317b9d33d17588c3e73732c96a11a9f71507f7364903b5fa1f2fa" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.575456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" event={"ID":"36d70339-6f3a-45dc-858d-2690011172c5","Type":"ContainerDied","Data":"ac32d0c374c317b9d33d17588c3e73732c96a11a9f71507f7364903b5fa1f2fa"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.575487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" event={"ID":"36d70339-6f3a-45dc-858d-2690011172c5","Type":"ContainerStarted","Data":"f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b"} Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.576827 5030 generic.go:334] "Generic (PLEG): container finished" podID="5e882b9a-9de5-451f-8cb3-896629ba3c4b" containerID="8281a5190aff5bc9d8e1739e5f4a1d43a6a5875e214c27977cb38115f035a585" exitCode=0 Jan 20 23:20:57 crc kubenswrapper[5030]: I0120 23:20:57.576854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-4tt5b" event={"ID":"5e882b9a-9de5-451f-8cb3-896629ba3c4b","Type":"ContainerDied","Data":"8281a5190aff5bc9d8e1739e5f4a1d43a6a5875e214c27977cb38115f035a585"} Jan 20 23:20:58 crc kubenswrapper[5030]: I0120 23:20:58.942846 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-b7q65"] Jan 20 23:20:58 crc kubenswrapper[5030]: I0120 23:20:58.949399 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/root-account-create-update-b7q65"] Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.051952 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.055070 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4sdb\" (UniqueName: \"kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb\") pod \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.055191 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts\") pod \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\" (UID: \"5e882b9a-9de5-451f-8cb3-896629ba3c4b\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.056035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e882b9a-9de5-451f-8cb3-896629ba3c4b" (UID: "5e882b9a-9de5-451f-8cb3-896629ba3c4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.084857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb" (OuterVolumeSpecName: "kube-api-access-r4sdb") pod "5e882b9a-9de5-451f-8cb3-896629ba3c4b" (UID: "5e882b9a-9de5-451f-8cb3-896629ba3c4b"). InnerVolumeSpecName "kube-api-access-r4sdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.157694 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4sdb\" (UniqueName: \"kubernetes.io/projected/5e882b9a-9de5-451f-8cb3-896629ba3c4b-kube-api-access-r4sdb\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.157770 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e882b9a-9de5-451f-8cb3-896629ba3c4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.281442 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.283934 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.289815 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.304654 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.306384 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts\") pod \"2c184fea-5939-43b6-80db-23079b0f6d88\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361728 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zmtn\" (UniqueName: \"kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn\") pod \"9e6054b2-6da1-421a-8364-f6b57c51c01d\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgxbb\" (UniqueName: \"kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb\") pod \"36d70339-6f3a-45dc-858d-2690011172c5\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2424g\" (UniqueName: \"kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g\") pod \"d65fc805-6e3b-4264-9974-e5da232e4f1e\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361844 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts\") pod \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q\") pod \"2c184fea-5939-43b6-80db-23079b0f6d88\" (UID: \"2c184fea-5939-43b6-80db-23079b0f6d88\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts\") pod \"9e6054b2-6da1-421a-8364-f6b57c51c01d\" (UID: \"9e6054b2-6da1-421a-8364-f6b57c51c01d\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts\") pod \"d65fc805-6e3b-4264-9974-e5da232e4f1e\" (UID: \"d65fc805-6e3b-4264-9974-e5da232e4f1e\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361940 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts\") pod \"36d70339-6f3a-45dc-858d-2690011172c5\" (UID: \"36d70339-6f3a-45dc-858d-2690011172c5\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.361957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4cs\" (UniqueName: 
\"kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs\") pod \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\" (UID: \"ac03b977-2ac1-4210-9b75-f3e6d87ef13e\") " Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.363241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac03b977-2ac1-4210-9b75-f3e6d87ef13e" (UID: "ac03b977-2ac1-4210-9b75-f3e6d87ef13e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.363579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c184fea-5939-43b6-80db-23079b0f6d88" (UID: "2c184fea-5939-43b6-80db-23079b0f6d88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.365345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e6054b2-6da1-421a-8364-f6b57c51c01d" (UID: "9e6054b2-6da1-421a-8364-f6b57c51c01d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.365697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs" (OuterVolumeSpecName: "kube-api-access-qq4cs") pod "ac03b977-2ac1-4210-9b75-f3e6d87ef13e" (UID: "ac03b977-2ac1-4210-9b75-f3e6d87ef13e"). InnerVolumeSpecName "kube-api-access-qq4cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.366066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36d70339-6f3a-45dc-858d-2690011172c5" (UID: "36d70339-6f3a-45dc-858d-2690011172c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.366318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn" (OuterVolumeSpecName: "kube-api-access-8zmtn") pod "9e6054b2-6da1-421a-8364-f6b57c51c01d" (UID: "9e6054b2-6da1-421a-8364-f6b57c51c01d"). InnerVolumeSpecName "kube-api-access-8zmtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.366423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d65fc805-6e3b-4264-9974-e5da232e4f1e" (UID: "d65fc805-6e3b-4264-9974-e5da232e4f1e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.369756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb" (OuterVolumeSpecName: "kube-api-access-mgxbb") pod "36d70339-6f3a-45dc-858d-2690011172c5" (UID: "36d70339-6f3a-45dc-858d-2690011172c5"). InnerVolumeSpecName "kube-api-access-mgxbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.369791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g" (OuterVolumeSpecName: "kube-api-access-2424g") pod "d65fc805-6e3b-4264-9974-e5da232e4f1e" (UID: "d65fc805-6e3b-4264-9974-e5da232e4f1e"). InnerVolumeSpecName "kube-api-access-2424g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.370532 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q" (OuterVolumeSpecName: "kube-api-access-hql8q") pod "2c184fea-5939-43b6-80db-23079b0f6d88" (UID: "2c184fea-5939-43b6-80db-23079b0f6d88"). InnerVolumeSpecName "kube-api-access-hql8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463224 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e6054b2-6da1-421a-8364-f6b57c51c01d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463252 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d65fc805-6e3b-4264-9974-e5da232e4f1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463262 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36d70339-6f3a-45dc-858d-2690011172c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463273 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq4cs\" (UniqueName: \"kubernetes.io/projected/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-kube-api-access-qq4cs\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463287 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c184fea-5939-43b6-80db-23079b0f6d88-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463295 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zmtn\" (UniqueName: \"kubernetes.io/projected/9e6054b2-6da1-421a-8364-f6b57c51c01d-kube-api-access-8zmtn\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463303 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgxbb\" (UniqueName: \"kubernetes.io/projected/36d70339-6f3a-45dc-858d-2690011172c5-kube-api-access-mgxbb\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463311 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2424g\" (UniqueName: 
\"kubernetes.io/projected/d65fc805-6e3b-4264-9974-e5da232e4f1e-kube-api-access-2424g\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463320 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac03b977-2ac1-4210-9b75-f3e6d87ef13e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.463336 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hql8q\" (UniqueName: \"kubernetes.io/projected/2c184fea-5939-43b6-80db-23079b0f6d88-kube-api-access-hql8q\") on node \"crc\" DevicePath \"\"" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.604372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-chmn4" event={"ID":"d65fc805-6e3b-4264-9974-e5da232e4f1e","Type":"ContainerDied","Data":"58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.604415 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58f3d2892db8357eee6e1e6f71ab87935a8219e8decebc42c1b06457493b3a0a" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.604490 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-chmn4" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.606747 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.606742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8" event={"ID":"9e6054b2-6da1-421a-8364-f6b57c51c01d","Type":"ContainerDied","Data":"60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.606862 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60835e8a79d180bb1b3f34d3a815d77eda5f1102db1119dd696f72717e97d409" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.608430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" event={"ID":"2c184fea-5939-43b6-80db-23079b0f6d88","Type":"ContainerDied","Data":"7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.608473 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e472e8c0f46aa8cf58800c2e5a3399245c51ccb6c494f63eb08b90d4ccf641e" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.608479 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-2tf9j" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.609881 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" event={"ID":"ac03b977-2ac1-4210-9b75-f3e6d87ef13e","Type":"ContainerDied","Data":"2d37052a4560989411c6b3bddc345d5dde0cc6599f169e72c9179c58f914bc1e"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.609904 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d37052a4560989411c6b3bddc345d5dde0cc6599f169e72c9179c58f914bc1e" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.609953 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-d854n" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.613409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" event={"ID":"36d70339-6f3a-45dc-858d-2690011172c5","Type":"ContainerDied","Data":"f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.613443 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f290321e130546e21d6c62424cae60d4d1aece725a74cce73874d1542c04011b" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.613492 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-mdz2z" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.618131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-4tt5b" event={"ID":"5e882b9a-9de5-451f-8cb3-896629ba3c4b","Type":"ContainerDied","Data":"a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c"} Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.618174 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e1c6849ff79106c6887f71f151110341fdf15073d4e73b071bcb7d084ddb5c" Jan 20 23:20:59 crc kubenswrapper[5030]: I0120 23:20:59.618226 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-4tt5b" Jan 20 23:21:00 crc kubenswrapper[5030]: I0120 23:21:00.043336 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61" path="/var/lib/kubelet/pods/f89f3ce0-5bd9-419f-b115-6e1f0ec0cd61/volumes" Jan 20 23:21:00 crc kubenswrapper[5030]: I0120 23:21:00.328416 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:21:00 crc kubenswrapper[5030]: I0120 23:21:00.627099 5030 generic.go:334] "Generic (PLEG): container finished" podID="d427c956-ad6c-44b8-b112-ec484902b44c" containerID="db87732f123d106f7254ea1e9dc33ebc44187354473ff54f9218551dfb61ec86" exitCode=0 Jan 20 23:21:00 crc kubenswrapper[5030]: I0120 23:21:00.627181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" event={"ID":"d427c956-ad6c-44b8-b112-ec484902b44c","Type":"ContainerDied","Data":"db87732f123d106f7254ea1e9dc33ebc44187354473ff54f9218551dfb61ec86"} Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271224 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4r5pg"] Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d70339-6f3a-45dc-858d-2690011172c5" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271546 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d70339-6f3a-45dc-858d-2690011172c5" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271556 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c184fea-5939-43b6-80db-23079b0f6d88" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271562 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c184fea-5939-43b6-80db-23079b0f6d88" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e6054b2-6da1-421a-8364-f6b57c51c01d" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271583 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e6054b2-6da1-421a-8364-f6b57c51c01d" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271592 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e882b9a-9de5-451f-8cb3-896629ba3c4b" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271597 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e882b9a-9de5-451f-8cb3-896629ba3c4b" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271605 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65fc805-6e3b-4264-9974-e5da232e4f1e" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271610 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65fc805-6e3b-4264-9974-e5da232e4f1e" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: E0120 23:21:01.271643 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac03b977-2ac1-4210-9b75-f3e6d87ef13e" 
containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271652 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac03b977-2ac1-4210-9b75-f3e6d87ef13e" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271794 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c184fea-5939-43b6-80db-23079b0f6d88" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271806 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e882b9a-9de5-451f-8cb3-896629ba3c4b" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271822 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac03b977-2ac1-4210-9b75-f3e6d87ef13e" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271838 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65fc805-6e3b-4264-9974-e5da232e4f1e" containerName="mariadb-database-create" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271851 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d70339-6f3a-45dc-858d-2690011172c5" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.271869 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e6054b2-6da1-421a-8364-f6b57c51c01d" containerName="mariadb-account-create-update" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.272447 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.274189 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-b75fg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.274615 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.290093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4r5pg"] Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.407710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5hm\" (UniqueName: \"kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.407829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.407935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.407973 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.509360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5hm\" (UniqueName: \"kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.509710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.509875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.510001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.517462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.517607 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.517765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.527308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5hm\" (UniqueName: \"kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm\") pod \"glance-db-sync-4r5pg\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:01 crc kubenswrapper[5030]: I0120 23:21:01.588021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.037947 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.096016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4r5pg"] Jan 20 23:21:02 crc kubenswrapper[5030]: W0120 23:21:02.103703 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb334eade_4a24_4596_b272_d8d3c91940fc.slice/crio-daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed WatchSource:0}: Error finding container daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed: Status 404 returned error can't find the container with id daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzz9\" (UniqueName: \"kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.219803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices\") pod \"d427c956-ad6c-44b8-b112-ec484902b44c\" (UID: \"d427c956-ad6c-44b8-b112-ec484902b44c\") " Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.220781 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.223490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9" (OuterVolumeSpecName: "kube-api-access-6fzz9") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "kube-api-access-6fzz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.224809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.227609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.244496 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.246108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.256932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts" (OuterVolumeSpecName: "scripts") pod "d427c956-ad6c-44b8-b112-ec484902b44c" (UID: "d427c956-ad6c-44b8-b112-ec484902b44c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321780 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321812 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzz9\" (UniqueName: \"kubernetes.io/projected/d427c956-ad6c-44b8-b112-ec484902b44c-kube-api-access-6fzz9\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321821 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321832 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d427c956-ad6c-44b8-b112-ec484902b44c-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321840 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321848 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d427c956-ad6c-44b8-b112-ec484902b44c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.321855 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d427c956-ad6c-44b8-b112-ec484902b44c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.661537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" event={"ID":"d427c956-ad6c-44b8-b112-ec484902b44c","Type":"ContainerDied","Data":"e4147da6705b77ab55b503b206fc40b50ae2703f16c560eca48a336d03afb7ec"} Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.661576 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-qwhwh" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.661585 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4147da6705b77ab55b503b206fc40b50ae2703f16c560eca48a336d03afb7ec" Jan 20 23:21:02 crc kubenswrapper[5030]: I0120 23:21:02.663415 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" event={"ID":"b334eade-4a24-4596-b272-d8d3c91940fc","Type":"ContainerStarted","Data":"daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed"} Jan 20 23:21:03 crc kubenswrapper[5030]: I0120 23:21:03.677200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" event={"ID":"b334eade-4a24-4596-b272-d8d3c91940fc","Type":"ContainerStarted","Data":"b46b3e1e35f4cc0f7f8a1e97882714048b5dd5923a5d10dcfff8a62889c31b12"} Jan 20 23:21:03 crc kubenswrapper[5030]: I0120 23:21:03.704392 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" podStartSLOduration=2.704375729 podStartE2EDuration="2.704375729s" podCreationTimestamp="2026-01-20 23:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:03.700545758 +0000 UTC m=+2736.020806076" watchObservedRunningTime="2026-01-20 23:21:03.704375729 +0000 UTC m=+2736.024636007" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.000856 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jhf96"] Jan 20 23:21:04 crc kubenswrapper[5030]: E0120 23:21:04.001646 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d427c956-ad6c-44b8-b112-ec484902b44c" containerName="swift-ring-rebalance" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.001670 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d427c956-ad6c-44b8-b112-ec484902b44c" containerName="swift-ring-rebalance" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.001874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d427c956-ad6c-44b8-b112-ec484902b44c" containerName="swift-ring-rebalance" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.002539 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.005042 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.014872 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jhf96"] Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.158128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5pt7\" (UniqueName: \"kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.158334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.260079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.260147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5pt7\" (UniqueName: \"kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.260208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.261160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.270901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"swift-storage-0\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.282217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5pt7\" (UniqueName: 
\"kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7\") pod \"root-account-create-update-jhf96\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.319589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.549177 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.633945 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jhf96"] Jan 20 23:21:04 crc kubenswrapper[5030]: W0120 23:21:04.641944 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b4cf7c_7c3e_4537_91dc_aa8e422e749f.slice/crio-a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6 WatchSource:0}: Error finding container a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6: Status 404 returned error can't find the container with id a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6 Jan 20 23:21:04 crc kubenswrapper[5030]: I0120 23:21:04.718397 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jhf96" event={"ID":"33b4cf7c-7c3e-4537-91dc-aa8e422e749f","Type":"ContainerStarted","Data":"a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6"} Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.071329 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.730926 5030 generic.go:334] "Generic (PLEG): container finished" podID="33b4cf7c-7c3e-4537-91dc-aa8e422e749f" containerID="adab7175f57c8246456313c9d8bc528995d62fe1b12e9af3d3a9943092a3af37" exitCode=0 Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.731514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jhf96" event={"ID":"33b4cf7c-7c3e-4537-91dc-aa8e422e749f","Type":"ContainerDied","Data":"adab7175f57c8246456313c9d8bc528995d62fe1b12e9af3d3a9943092a3af37"} Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.737476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c"} Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.737510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a"} Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.737524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c"} Jan 20 23:21:05 crc kubenswrapper[5030]: I0120 23:21:05.737537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"266b590f8256a8c0d5161f4132135e7cfd60b1cb4ad64eb600411af4a34a3979"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.749278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.749529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.749540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.749549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.749558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.751026 5030 generic.go:334] "Generic (PLEG): container finished" podID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerID="7c7e9190938d40b95c597cf3c50d0385e343102d0e6ab1a4234a94fd89a2f688" exitCode=0 Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.751072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerDied","Data":"7c7e9190938d40b95c597cf3c50d0385e343102d0e6ab1a4234a94fd89a2f688"} Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.752791 5030 generic.go:334] "Generic (PLEG): container finished" podID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerID="bc2be2e614785a83e92f8382eb54b760f63601c5e35ecfb23e625e29c1a1fe79" exitCode=0 Jan 20 23:21:06 crc kubenswrapper[5030]: I0120 23:21:06.752864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerDied","Data":"bc2be2e614785a83e92f8382eb54b760f63601c5e35ecfb23e625e29c1a1fe79"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.128202 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.225268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts\") pod \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.225313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5pt7\" (UniqueName: \"kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7\") pod \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\" (UID: \"33b4cf7c-7c3e-4537-91dc-aa8e422e749f\") " Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.226543 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33b4cf7c-7c3e-4537-91dc-aa8e422e749f" (UID: "33b4cf7c-7c3e-4537-91dc-aa8e422e749f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.232757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7" (OuterVolumeSpecName: "kube-api-access-h5pt7") pod "33b4cf7c-7c3e-4537-91dc-aa8e422e749f" (UID: "33b4cf7c-7c3e-4537-91dc-aa8e422e749f"). InnerVolumeSpecName "kube-api-access-h5pt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.327413 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.327449 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5pt7\" (UniqueName: \"kubernetes.io/projected/33b4cf7c-7c3e-4537-91dc-aa8e422e749f-kube-api-access-h5pt7\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.769393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.769675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.769685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.769693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de"} Jan 20 23:21:07 crc 
kubenswrapper[5030]: I0120 23:21:07.769701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.769709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.771207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerStarted","Data":"6bd004035b8c54e12e2ddf388a55c324758667583b1215f689bef97159ec1247"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.771395 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.773282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerStarted","Data":"697fa9aa2472880601d53ece29e19b607f5f96fff9acf555dbdca856864c6ede"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.773457 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.774730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-jhf96" event={"ID":"33b4cf7c-7c3e-4537-91dc-aa8e422e749f","Type":"ContainerDied","Data":"a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6"} Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.774757 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2727acd3b30af35584c3f93e884f9d45d914e8515ff08357606bb3f417f9fa6" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.774802 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-jhf96" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.805399 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.805382055 podStartE2EDuration="37.805382055s" podCreationTimestamp="2026-01-20 23:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:07.80306316 +0000 UTC m=+2740.123323448" watchObservedRunningTime="2026-01-20 23:21:07.805382055 +0000 UTC m=+2740.125642343" Jan 20 23:21:07 crc kubenswrapper[5030]: I0120 23:21:07.826848 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.82682894 podStartE2EDuration="37.82682894s" podCreationTimestamp="2026-01-20 23:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:07.823709804 +0000 UTC m=+2740.143970092" watchObservedRunningTime="2026-01-20 23:21:07.82682894 +0000 UTC m=+2740.147089228" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.795042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerStarted","Data":"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302"} Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.800269 5030 generic.go:334] "Generic (PLEG): container finished" podID="b334eade-4a24-4596-b272-d8d3c91940fc" containerID="b46b3e1e35f4cc0f7f8a1e97882714048b5dd5923a5d10dcfff8a62889c31b12" exitCode=0 Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.800570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" event={"ID":"b334eade-4a24-4596-b272-d8d3c91940fc","Type":"ContainerDied","Data":"b46b3e1e35f4cc0f7f8a1e97882714048b5dd5923a5d10dcfff8a62889c31b12"} Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.831932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.831911552 podStartE2EDuration="21.831911552s" podCreationTimestamp="2026-01-20 23:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:08.822633059 +0000 UTC m=+2741.142893367" watchObservedRunningTime="2026-01-20 23:21:08.831911552 +0000 UTC m=+2741.152171840" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.970172 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:21:08 crc kubenswrapper[5030]: E0120 23:21:08.970998 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b4cf7c-7c3e-4537-91dc-aa8e422e749f" containerName="mariadb-account-create-update" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.971025 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b4cf7c-7c3e-4537-91dc-aa8e422e749f" containerName="mariadb-account-create-update" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.971282 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b4cf7c-7c3e-4537-91dc-aa8e422e749f" containerName="mariadb-account-create-update" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 
23:21:08.972376 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.977138 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:21:08 crc kubenswrapper[5030]: I0120 23:21:08.985818 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.051431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.051698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.051873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.051981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9shc\" (UniqueName: \"kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.153093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.154076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.153991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.154702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.155022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.155109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9shc\" (UniqueName: \"kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.155758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.174180 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9shc\" (UniqueName: \"kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc\") pod \"dnsmasq-dnsmasq-58b7ffd5-9978h\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.285789 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.752493 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:21:09 crc kubenswrapper[5030]: I0120 23:21:09.812287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" event={"ID":"15e6333d-c767-4935-92b0-31ae1c68e475","Type":"ContainerStarted","Data":"bedbc48311a7629c8862bb3584f123ef90f4e1cb9b7d723f4d6dc579051958a0"} Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.112381 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.157551 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.157610 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.168912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data\") pod \"b334eade-4a24-4596-b272-d8d3c91940fc\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.168970 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle\") pod \"b334eade-4a24-4596-b272-d8d3c91940fc\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.169090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data\") pod \"b334eade-4a24-4596-b272-d8d3c91940fc\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.169186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5hm\" (UniqueName: \"kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm\") pod \"b334eade-4a24-4596-b272-d8d3c91940fc\" (UID: \"b334eade-4a24-4596-b272-d8d3c91940fc\") " Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.176774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm" (OuterVolumeSpecName: "kube-api-access-vn5hm") pod "b334eade-4a24-4596-b272-d8d3c91940fc" (UID: "b334eade-4a24-4596-b272-d8d3c91940fc"). InnerVolumeSpecName "kube-api-access-vn5hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.178064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b334eade-4a24-4596-b272-d8d3c91940fc" (UID: "b334eade-4a24-4596-b272-d8d3c91940fc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.203806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b334eade-4a24-4596-b272-d8d3c91940fc" (UID: "b334eade-4a24-4596-b272-d8d3c91940fc"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.220665 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data" (OuterVolumeSpecName: "config-data") pod "b334eade-4a24-4596-b272-d8d3c91940fc" (UID: "b334eade-4a24-4596-b272-d8d3c91940fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.271250 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.271293 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.271313 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334eade-4a24-4596-b272-d8d3c91940fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.271332 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn5hm\" (UniqueName: \"kubernetes.io/projected/b334eade-4a24-4596-b272-d8d3c91940fc-kube-api-access-vn5hm\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.821376 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" event={"ID":"b334eade-4a24-4596-b272-d8d3c91940fc","Type":"ContainerDied","Data":"daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed"} Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.821685 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daae88fa7ae6df0e295a6dd261cac04dae374cfd040c2e2c05440a1bce758eed" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.821422 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-4r5pg" Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.823748 5030 generic.go:334] "Generic (PLEG): container finished" podID="15e6333d-c767-4935-92b0-31ae1c68e475" containerID="1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e" exitCode=0 Jan 20 23:21:10 crc kubenswrapper[5030]: I0120 23:21:10.823792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" event={"ID":"15e6333d-c767-4935-92b0-31ae1c68e475","Type":"ContainerDied","Data":"1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e"} Jan 20 23:21:11 crc kubenswrapper[5030]: I0120 23:21:11.835707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" event={"ID":"15e6333d-c767-4935-92b0-31ae1c68e475","Type":"ContainerStarted","Data":"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0"} Jan 20 23:21:11 crc kubenswrapper[5030]: I0120 23:21:11.835884 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:11 crc kubenswrapper[5030]: I0120 23:21:11.853261 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" podStartSLOduration=3.853228315 podStartE2EDuration="3.853228315s" podCreationTimestamp="2026-01-20 23:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:11.851387121 +0000 UTC m=+2744.171647419" watchObservedRunningTime="2026-01-20 23:21:11.853228315 +0000 UTC m=+2744.173488603" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.286840 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.359359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.359643 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="dnsmasq-dns" containerID="cri-o://9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be" gracePeriod=10 Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.846193 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.925025 5030 generic.go:334] "Generic (PLEG): container finished" podID="f482c09a-b782-4aa8-b776-29055767e21d" containerID="9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be" exitCode=0 Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.925069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" event={"ID":"f482c09a-b782-4aa8-b776-29055767e21d","Type":"ContainerDied","Data":"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be"} Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.925082 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.925105 5030 scope.go:117] "RemoveContainer" containerID="9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.925094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2" event={"ID":"f482c09a-b782-4aa8-b776-29055767e21d","Type":"ContainerDied","Data":"851c266773abfc2753d4ed64389684d7caa4ecd9ef1e7d2e5c40505681e14f9e"} Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.929383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7vf4\" (UniqueName: \"kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4\") pod \"f482c09a-b782-4aa8-b776-29055767e21d\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.929564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config\") pod \"f482c09a-b782-4aa8-b776-29055767e21d\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.929616 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc\") pod \"f482c09a-b782-4aa8-b776-29055767e21d\" (UID: \"f482c09a-b782-4aa8-b776-29055767e21d\") " Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.939505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4" (OuterVolumeSpecName: "kube-api-access-m7vf4") pod "f482c09a-b782-4aa8-b776-29055767e21d" (UID: "f482c09a-b782-4aa8-b776-29055767e21d"). InnerVolumeSpecName "kube-api-access-m7vf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.948561 5030 scope.go:117] "RemoveContainer" containerID="254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.981525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config" (OuterVolumeSpecName: "config") pod "f482c09a-b782-4aa8-b776-29055767e21d" (UID: "f482c09a-b782-4aa8-b776-29055767e21d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:19 crc kubenswrapper[5030]: I0120 23:21:19.998039 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f482c09a-b782-4aa8-b776-29055767e21d" (UID: "f482c09a-b782-4aa8-b776-29055767e21d"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.019265 5030 scope.go:117] "RemoveContainer" containerID="9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be" Jan 20 23:21:20 crc kubenswrapper[5030]: E0120 23:21:20.019907 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be\": container with ID starting with 9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be not found: ID does not exist" containerID="9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.019976 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be"} err="failed to get container status \"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be\": rpc error: code = NotFound desc = could not find container \"9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be\": container with ID starting with 9d45afe79da6bd57dec573c2982765a080c30dc54f857116f2b1ddf166f894be not found: ID does not exist" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.020008 5030 scope.go:117] "RemoveContainer" containerID="254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b" Jan 20 23:21:20 crc kubenswrapper[5030]: E0120 23:21:20.020560 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b\": container with ID starting with 254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b not found: ID does not exist" containerID="254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.020650 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b"} err="failed to get container status \"254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b\": rpc error: code = NotFound desc = could not find container \"254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b\": container with ID starting with 254c8570554910a0333a31f6208a5439b66b73a03bd47e053c8ab95d4bba2a1b not found: ID does not exist" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.031354 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.031385 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f482c09a-b782-4aa8-b776-29055767e21d-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.031398 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7vf4\" (UniqueName: \"kubernetes.io/projected/f482c09a-b782-4aa8-b776-29055767e21d-kube-api-access-m7vf4\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 23:21:20.259109 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:21:20 crc kubenswrapper[5030]: I0120 
23:21:20.269188 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-24hk2"] Jan 20 23:21:21 crc kubenswrapper[5030]: I0120 23:21:21.974507 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f482c09a-b782-4aa8-b776-29055767e21d" path="/var/lib/kubelet/pods/f482c09a-b782-4aa8-b776-29055767e21d/volumes" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.202844 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.264400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.790195 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-6d9ds"] Jan 20 23:21:22 crc kubenswrapper[5030]: E0120 23:21:22.790793 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="dnsmasq-dns" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.790810 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="dnsmasq-dns" Jan 20 23:21:22 crc kubenswrapper[5030]: E0120 23:21:22.790842 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b334eade-4a24-4596-b272-d8d3c91940fc" containerName="glance-db-sync" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.790849 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b334eade-4a24-4596-b272-d8d3c91940fc" containerName="glance-db-sync" Jan 20 23:21:22 crc kubenswrapper[5030]: E0120 23:21:22.790858 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="init" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.790864 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="init" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.791005 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f482c09a-b782-4aa8-b776-29055767e21d" containerName="dnsmasq-dns" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.791020 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b334eade-4a24-4596-b272-d8d3c91940fc" containerName="glance-db-sync" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.791547 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.800951 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.802108 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.804175 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.850539 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.889225 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-6d9ds"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.916462 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t7ft5"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.917508 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.955951 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t7ft5"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.966673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62"] Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.967693 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.970796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.983430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjngx\" (UniqueName: \"kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.983506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.983578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:22 crc kubenswrapper[5030]: I0120 23:21:22.983594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkkrn\" (UniqueName: \"kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.000896 5030 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.049462 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vv9sg"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.050570 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.054597 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.054817 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.054861 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vdlm9" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.054952 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.057675 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vv9sg"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.078017 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-xn8st"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.079169 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.086943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.086980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkkrn\" (UniqueName: \"kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xcgw\" (UniqueName: \"kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087081 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjngx\" (UniqueName: 
\"kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8lp\" (UniqueName: \"kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.087785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.088296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.088314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.098520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-xn8st"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.129579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjngx\" (UniqueName: \"kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx\") pod \"cinder-aecb-account-create-update-7xbt4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.148033 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-prxd6"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.149052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.151192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkkrn\" (UniqueName: \"kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn\") pod \"barbican-db-create-6d9ds\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.157016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-prxd6"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.157038 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xcgw\" (UniqueName: \"kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk42r\" (UniqueName: \"kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8lp\" (UniqueName: \"kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.190947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfczm\" (UniqueName: \"kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.191758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.192341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.206846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8lp\" (UniqueName: \"kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp\") pod \"barbican-64d6-account-create-update-dqw62\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.207711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xcgw\" (UniqueName: \"kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw\") pod \"cinder-db-create-t7ft5\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.236921 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.284877 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlqh\" (UniqueName: \"kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292170 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfczm\" (UniqueName: \"kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk42r\" (UniqueName: \"kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.292416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.293240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.297241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.297311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.308492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfczm\" (UniqueName: \"kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm\") pod \"neutron-db-create-xn8st\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.309257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk42r\" (UniqueName: \"kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r\") pod \"keystone-db-sync-vv9sg\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.374685 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.393785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlqh\" (UniqueName: \"kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.393828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.394466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.405059 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.407913 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.412824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlqh\" (UniqueName: \"kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh\") pod \"neutron-779e-account-create-update-prxd6\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.421753 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.505125 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.671731 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t7ft5"] Jan 20 23:21:23 crc kubenswrapper[5030]: W0120 23:21:23.690724 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86251e4a_e1bd_42f9_9632_3a087bc96e7d.slice/crio-1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f WatchSource:0}: Error finding container 1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f: Status 404 returned error can't find the container with id 1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.810220 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62"] Jan 20 23:21:23 crc kubenswrapper[5030]: W0120 23:21:23.812823 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de8dc6a_16dd_4ba1_bb28_5d9a8df2f821.slice/crio-20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1 WatchSource:0}: Error finding container 20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1: Status 404 returned error can't find the container with id 20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1 Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.902646 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vv9sg"] Jan 20 23:21:23 crc kubenswrapper[5030]: W0120 23:21:23.913005 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0990a32_fa39_48dc_b0f5_c8a366f9632c.slice/crio-4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29 WatchSource:0}: Error finding container 4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29: Status 404 returned error can't find the container with id 4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29 Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.955040 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-6d9ds"] Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.973189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" 
event={"ID":"86251e4a-e1bd-42f9-9632-3a087bc96e7d","Type":"ContainerStarted","Data":"c5d8451ca1df90836c74e02e735263405dd7345ac078ad70ebbc6dc29103b6a3"} Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.973457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" event={"ID":"86251e4a-e1bd-42f9-9632-3a087bc96e7d","Type":"ContainerStarted","Data":"1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f"} Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.977342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" event={"ID":"e0990a32-fa39-48dc-b0f5-c8a366f9632c","Type":"ContainerStarted","Data":"4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29"} Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.979474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" event={"ID":"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821","Type":"ContainerStarted","Data":"20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1"} Jan 20 23:21:23 crc kubenswrapper[5030]: I0120 23:21:23.982863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-xn8st"] Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.010567 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" podStartSLOduration=2.010552637 podStartE2EDuration="2.010552637s" podCreationTimestamp="2026-01-20 23:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:24.009507913 +0000 UTC m=+2756.329768211" watchObservedRunningTime="2026-01-20 23:21:24.010552637 +0000 UTC m=+2756.330812925" Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.022167 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" podStartSLOduration=2.022150746 podStartE2EDuration="2.022150746s" podCreationTimestamp="2026-01-20 23:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:23.99358589 +0000 UTC m=+2756.313846178" watchObservedRunningTime="2026-01-20 23:21:24.022150746 +0000 UTC m=+2756.342411044" Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.079831 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-prxd6"] Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.090035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4"] Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.990846 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7afc48e-6aa6-4499-ae75-fa49a210f4c7" containerID="e5e961cd6d773bbb1889566094203531756a0fa382a0b915ac0416162dfbbe3e" exitCode=0 Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.991095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" event={"ID":"d7afc48e-6aa6-4499-ae75-fa49a210f4c7","Type":"ContainerDied","Data":"e5e961cd6d773bbb1889566094203531756a0fa382a0b915ac0416162dfbbe3e"} Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.991156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-db-create-6d9ds" event={"ID":"d7afc48e-6aa6-4499-ae75-fa49a210f4c7","Type":"ContainerStarted","Data":"3736e7421b2697fceaddaa3a8179a618a670449d045f8a1c2d543b46009218d4"} Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.994583 5030 generic.go:334] "Generic (PLEG): container finished" podID="86251e4a-e1bd-42f9-9632-3a087bc96e7d" containerID="c5d8451ca1df90836c74e02e735263405dd7345ac078ad70ebbc6dc29103b6a3" exitCode=0 Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.994646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" event={"ID":"86251e4a-e1bd-42f9-9632-3a087bc96e7d","Type":"ContainerDied","Data":"c5d8451ca1df90836c74e02e735263405dd7345ac078ad70ebbc6dc29103b6a3"} Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.996360 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c5a0e0f-eb0d-4757-b571-6571a43793c4" containerID="26b1ddd32ac5e65e681ee60619eed541c855aa3b58f3a73c15ea8a35ce058b23" exitCode=0 Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.996400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" event={"ID":"4c5a0e0f-eb0d-4757-b571-6571a43793c4","Type":"ContainerDied","Data":"26b1ddd32ac5e65e681ee60619eed541c855aa3b58f3a73c15ea8a35ce058b23"} Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.996416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" event={"ID":"4c5a0e0f-eb0d-4757-b571-6571a43793c4","Type":"ContainerStarted","Data":"e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693"} Jan 20 23:21:24 crc kubenswrapper[5030]: I0120 23:21:24.998158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" event={"ID":"e0990a32-fa39-48dc-b0f5-c8a366f9632c","Type":"ContainerStarted","Data":"c06a48fe05f35f1049e48b548439a81f0af40a795cf0e6eb9e109fa30620ec29"} Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.000421 5030 generic.go:334] "Generic (PLEG): container finished" podID="83272a1e-0505-4f7e-8db0-93666671cdf2" containerID="8d163dcccbc2d4684e45a63084005401caf188afb4ae50b0ddba3caecc42d9d6" exitCode=0 Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.000459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-xn8st" event={"ID":"83272a1e-0505-4f7e-8db0-93666671cdf2","Type":"ContainerDied","Data":"8d163dcccbc2d4684e45a63084005401caf188afb4ae50b0ddba3caecc42d9d6"} Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.000473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-xn8st" event={"ID":"83272a1e-0505-4f7e-8db0-93666671cdf2","Type":"ContainerStarted","Data":"66f277e8533816651f3e90ff7e8a1f3d71fdbe427e5432a3c49d1255cc9bff63"} Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.003223 5030 generic.go:334] "Generic (PLEG): container finished" podID="9325461a-2d5c-46e1-9041-3e60abe4feb9" containerID="5e7eee36212b875a3c18706283d7234207c6eeca24cbf4c2798aad5488b2afb1" exitCode=0 Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.003366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" event={"ID":"9325461a-2d5c-46e1-9041-3e60abe4feb9","Type":"ContainerDied","Data":"5e7eee36212b875a3c18706283d7234207c6eeca24cbf4c2798aad5488b2afb1"} Jan 20 23:21:25 crc 
kubenswrapper[5030]: I0120 23:21:25.003396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" event={"ID":"9325461a-2d5c-46e1-9041-3e60abe4feb9","Type":"ContainerStarted","Data":"7b51d7b8e6bbc66459862d7213e8a00432aa46ddd8ff03589139e5294a5e63b8"} Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.004700 5030 generic.go:334] "Generic (PLEG): container finished" podID="5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" containerID="bd3d28d20d93dfa760fd440f9b51cd546029af0dd77cca083cc9280b00e90e98" exitCode=0 Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.004729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" event={"ID":"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821","Type":"ContainerDied","Data":"bd3d28d20d93dfa760fd440f9b51cd546029af0dd77cca083cc9280b00e90e98"} Jan 20 23:21:25 crc kubenswrapper[5030]: I0120 23:21:25.093604 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" podStartSLOduration=2.0935831 podStartE2EDuration="2.0935831s" podCreationTimestamp="2026-01-20 23:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:25.088476587 +0000 UTC m=+2757.408736905" watchObservedRunningTime="2026-01-20 23:21:25.0935831 +0000 UTC m=+2757.413843388" Jan 20 23:21:26 crc kubenswrapper[5030]: I0120 23:21:26.464689 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:26 crc kubenswrapper[5030]: I0120 23:21:26.654307 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xcgw\" (UniqueName: \"kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw\") pod \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " Jan 20 23:21:26 crc kubenswrapper[5030]: I0120 23:21:26.654670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts\") pod \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\" (UID: \"86251e4a-e1bd-42f9-9632-3a087bc96e7d\") " Jan 20 23:21:26 crc kubenswrapper[5030]: I0120 23:21:26.654959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86251e4a-e1bd-42f9-9632-3a087bc96e7d" (UID: "86251e4a-e1bd-42f9-9632-3a087bc96e7d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:26 crc kubenswrapper[5030]: I0120 23:21:26.655236 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86251e4a-e1bd-42f9-9632-3a087bc96e7d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.033445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" event={"ID":"86251e4a-e1bd-42f9-9632-3a087bc96e7d","Type":"ContainerDied","Data":"1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f"} Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.033510 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bb6e8e2f59b317e6b3ffab768e5cbd8544784330675ee8d7473fa47154c733f" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.033661 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-t7ft5" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.046266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw" (OuterVolumeSpecName: "kube-api-access-6xcgw") pod "86251e4a-e1bd-42f9-9632-3a087bc96e7d" (UID: "86251e4a-e1bd-42f9-9632-3a087bc96e7d"). InnerVolumeSpecName "kube-api-access-6xcgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.061720 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xcgw\" (UniqueName: \"kubernetes.io/projected/86251e4a-e1bd-42f9-9632-3a087bc96e7d-kube-api-access-6xcgw\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.221261 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.246235 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.253100 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.268652 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.275681 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts\") pod \"83272a1e-0505-4f7e-8db0-93666671cdf2\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjngx\" (UniqueName: \"kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx\") pod \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369890 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkkrn\" (UniqueName: \"kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn\") pod \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m8lp\" (UniqueName: \"kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp\") pod \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83272a1e-0505-4f7e-8db0-93666671cdf2" (UID: "83272a1e-0505-4f7e-8db0-93666671cdf2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.369997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts\") pod \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\" (UID: \"4c5a0e0f-eb0d-4757-b571-6571a43793c4\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370054 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts\") pod \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\" (UID: \"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfczm\" (UniqueName: \"kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm\") pod \"83272a1e-0505-4f7e-8db0-93666671cdf2\" (UID: \"83272a1e-0505-4f7e-8db0-93666671cdf2\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts\") pod \"9325461a-2d5c-46e1-9041-3e60abe4feb9\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts\") pod \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\" (UID: \"d7afc48e-6aa6-4499-ae75-fa49a210f4c7\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" (UID: "5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.370261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmlqh\" (UniqueName: \"kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh\") pod \"9325461a-2d5c-46e1-9041-3e60abe4feb9\" (UID: \"9325461a-2d5c-46e1-9041-3e60abe4feb9\") " Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.371565 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83272a1e-0505-4f7e-8db0-93666671cdf2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.371584 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.373035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c5a0e0f-eb0d-4757-b571-6571a43793c4" (UID: "4c5a0e0f-eb0d-4757-b571-6571a43793c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.373548 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9325461a-2d5c-46e1-9041-3e60abe4feb9" (UID: "9325461a-2d5c-46e1-9041-3e60abe4feb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.374732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp" (OuterVolumeSpecName: "kube-api-access-5m8lp") pod "5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" (UID: "5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821"). InnerVolumeSpecName "kube-api-access-5m8lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.374958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7afc48e-6aa6-4499-ae75-fa49a210f4c7" (UID: "d7afc48e-6aa6-4499-ae75-fa49a210f4c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.378273 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx" (OuterVolumeSpecName: "kube-api-access-gjngx") pod "4c5a0e0f-eb0d-4757-b571-6571a43793c4" (UID: "4c5a0e0f-eb0d-4757-b571-6571a43793c4"). InnerVolumeSpecName "kube-api-access-gjngx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.382049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh" (OuterVolumeSpecName: "kube-api-access-tmlqh") pod "9325461a-2d5c-46e1-9041-3e60abe4feb9" (UID: "9325461a-2d5c-46e1-9041-3e60abe4feb9"). InnerVolumeSpecName "kube-api-access-tmlqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.382142 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn" (OuterVolumeSpecName: "kube-api-access-nkkrn") pod "d7afc48e-6aa6-4499-ae75-fa49a210f4c7" (UID: "d7afc48e-6aa6-4499-ae75-fa49a210f4c7"). InnerVolumeSpecName "kube-api-access-nkkrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.382765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm" (OuterVolumeSpecName: "kube-api-access-lfczm") pod "83272a1e-0505-4f7e-8db0-93666671cdf2" (UID: "83272a1e-0505-4f7e-8db0-93666671cdf2"). InnerVolumeSpecName "kube-api-access-lfczm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473595 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkkrn\" (UniqueName: \"kubernetes.io/projected/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-kube-api-access-nkkrn\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473902 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m8lp\" (UniqueName: \"kubernetes.io/projected/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821-kube-api-access-5m8lp\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473914 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5a0e0f-eb0d-4757-b571-6571a43793c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473923 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfczm\" (UniqueName: \"kubernetes.io/projected/83272a1e-0505-4f7e-8db0-93666671cdf2-kube-api-access-lfczm\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473932 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9325461a-2d5c-46e1-9041-3e60abe4feb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473941 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7afc48e-6aa6-4499-ae75-fa49a210f4c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473953 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmlqh\" (UniqueName: \"kubernetes.io/projected/9325461a-2d5c-46e1-9041-3e60abe4feb9-kube-api-access-tmlqh\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:27 crc kubenswrapper[5030]: I0120 23:21:27.473962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjngx\" (UniqueName: 
\"kubernetes.io/projected/4c5a0e0f-eb0d-4757-b571-6571a43793c4-kube-api-access-gjngx\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.056271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" event={"ID":"4c5a0e0f-eb0d-4757-b571-6571a43793c4","Type":"ContainerDied","Data":"e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.056772 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.056289 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.058978 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0990a32-fa39-48dc-b0f5-c8a366f9632c" containerID="c06a48fe05f35f1049e48b548439a81f0af40a795cf0e6eb9e109fa30620ec29" exitCode=0 Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.059079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" event={"ID":"e0990a32-fa39-48dc-b0f5-c8a366f9632c","Type":"ContainerDied","Data":"c06a48fe05f35f1049e48b548439a81f0af40a795cf0e6eb9e109fa30620ec29"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.064495 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-xn8st" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.065139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-xn8st" event={"ID":"83272a1e-0505-4f7e-8db0-93666671cdf2","Type":"ContainerDied","Data":"66f277e8533816651f3e90ff7e8a1f3d71fdbe427e5432a3c49d1255cc9bff63"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.065187 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f277e8533816651f3e90ff7e8a1f3d71fdbe427e5432a3c49d1255cc9bff63" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.068289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" event={"ID":"9325461a-2d5c-46e1-9041-3e60abe4feb9","Type":"ContainerDied","Data":"7b51d7b8e6bbc66459862d7213e8a00432aa46ddd8ff03589139e5294a5e63b8"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.068410 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b51d7b8e6bbc66459862d7213e8a00432aa46ddd8ff03589139e5294a5e63b8" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.068324 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-prxd6" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.072493 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.072520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62" event={"ID":"5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821","Type":"ContainerDied","Data":"20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.072576 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20f0d3f9641471e5218aaa18bedcec573eab8639122db33795315581101473d1" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.079180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" event={"ID":"d7afc48e-6aa6-4499-ae75-fa49a210f4c7","Type":"ContainerDied","Data":"3736e7421b2697fceaddaa3a8179a618a670449d045f8a1c2d543b46009218d4"} Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.079240 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3736e7421b2697fceaddaa3a8179a618a670449d045f8a1c2d543b46009218d4" Jan 20 23:21:28 crc kubenswrapper[5030]: I0120 23:21:28.079321 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-6d9ds" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.484293 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.617466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle\") pod \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.617775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data\") pod \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.618039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk42r\" (UniqueName: \"kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r\") pod \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\" (UID: \"e0990a32-fa39-48dc-b0f5-c8a366f9632c\") " Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.625001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r" (OuterVolumeSpecName: "kube-api-access-bk42r") pod "e0990a32-fa39-48dc-b0f5-c8a366f9632c" (UID: "e0990a32-fa39-48dc-b0f5-c8a366f9632c"). InnerVolumeSpecName "kube-api-access-bk42r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.646771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0990a32-fa39-48dc-b0f5-c8a366f9632c" (UID: "e0990a32-fa39-48dc-b0f5-c8a366f9632c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.674510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data" (OuterVolumeSpecName: "config-data") pod "e0990a32-fa39-48dc-b0f5-c8a366f9632c" (UID: "e0990a32-fa39-48dc-b0f5-c8a366f9632c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.719977 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk42r\" (UniqueName: \"kubernetes.io/projected/e0990a32-fa39-48dc-b0f5-c8a366f9632c-kube-api-access-bk42r\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.720022 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:29 crc kubenswrapper[5030]: I0120 23:21:29.720042 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0990a32-fa39-48dc-b0f5-c8a366f9632c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.104929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" event={"ID":"e0990a32-fa39-48dc-b0f5-c8a366f9632c","Type":"ContainerDied","Data":"4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29"} Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.104980 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4df1ab9deecf1a18676a2db4c37c214aa086613e7bcb9e5fe1362bd1ce0e4d29" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.105025 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-vv9sg" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.301458 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-h94nz"] Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.301971 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.301990 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302001 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0990a32-fa39-48dc-b0f5-c8a366f9632c" containerName="keystone-db-sync" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302008 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0990a32-fa39-48dc-b0f5-c8a366f9632c" containerName="keystone-db-sync" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302024 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5a0e0f-eb0d-4757-b571-6571a43793c4" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302030 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5a0e0f-eb0d-4757-b571-6571a43793c4" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302049 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7afc48e-6aa6-4499-ae75-fa49a210f4c7" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302054 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7afc48e-6aa6-4499-ae75-fa49a210f4c7" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302066 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86251e4a-e1bd-42f9-9632-3a087bc96e7d" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302073 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86251e4a-e1bd-42f9-9632-3a087bc96e7d" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302081 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83272a1e-0505-4f7e-8db0-93666671cdf2" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302086 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="83272a1e-0505-4f7e-8db0-93666671cdf2" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: E0120 23:21:30.302099 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9325461a-2d5c-46e1-9041-3e60abe4feb9" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302105 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9325461a-2d5c-46e1-9041-3e60abe4feb9" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302282 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5a0e0f-eb0d-4757-b571-6571a43793c4" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302299 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7afc48e-6aa6-4499-ae75-fa49a210f4c7" 
containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302316 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="83272a1e-0505-4f7e-8db0-93666671cdf2" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302331 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86251e4a-e1bd-42f9-9632-3a087bc96e7d" containerName="mariadb-database-create" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302341 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9325461a-2d5c-46e1-9041-3e60abe4feb9" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0990a32-fa39-48dc-b0f5-c8a366f9632c" containerName="keystone-db-sync" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.302361 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" containerName="mariadb-account-create-update" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.304004 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.325134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.325200 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.325521 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.325794 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vdlm9" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.325834 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.326282 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-h94nz"] Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.431467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlh22\" (UniqueName: \"kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.431510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.431539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.431795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.431835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.432063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.489744 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.491513 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.495548 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.499735 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.501377 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlh22\" (UniqueName: \"kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.534427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.541830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.543077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.544093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.547427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.548405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.568756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlh22\" (UniqueName: \"kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22\") pod \"keystone-bootstrap-h94nz\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.592513 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-6smx7"] Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.593558 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.597047 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.597244 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.597424 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-94llg" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.624068 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-6smx7"] Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwcgm\" (UniqueName: \"kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635636 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.635702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.645489 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 
23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t484n\" (UniqueName: \"kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwcgm\" (UniqueName: \"kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.737893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.738589 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.738608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.742784 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.750517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.750806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.751363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data\") pod \"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.759480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwcgm\" (UniqueName: \"kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm\") pod 
\"ceilometer-0\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.806617 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.839311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.839371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.839429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.839451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.839472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t484n\" (UniqueName: \"kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.840199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.844165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.845257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.846786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.855326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t484n\" (UniqueName: \"kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n\") pod \"placement-db-sync-6smx7\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:30 crc kubenswrapper[5030]: I0120 23:21:30.930355 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.120387 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-h94nz"] Jan 20 23:21:31 crc kubenswrapper[5030]: W0120 23:21:31.123034 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ffbef9c_2e37_434b_9a2a_2eda3514d7f0.slice/crio-319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7 WatchSource:0}: Error finding container 319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7: Status 404 returned error can't find the container with id 319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7 Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.240221 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:31 crc kubenswrapper[5030]: W0120 23:21:31.241898 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15b072d_db1f_45d5_a486_d53f31091f29.slice/crio-4146829f3edca7b9071eb5729925a4898dc1a6a57925be3c3f8cb23016dfcae3 WatchSource:0}: Error finding container 4146829f3edca7b9071eb5729925a4898dc1a6a57925be3c3f8cb23016dfcae3: Status 404 returned error can't find the container with id 4146829f3edca7b9071eb5729925a4898dc1a6a57925be3c3f8cb23016dfcae3 Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.361072 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-6smx7"] Jan 20 23:21:31 crc kubenswrapper[5030]: W0120 23:21:31.364560 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e2b764_c26c_4ef7_b6fb_c51e698b8ea1.slice/crio-cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113 WatchSource:0}: Error finding container cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113: Status 404 returned error can't find the container with id cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113 Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.397236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.399881 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.408049 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.408126 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.408176 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.408329 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-b75fg" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.434662 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.457685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.459063 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.463007 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.463185 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.486843 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553547 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbx5\" (UniqueName: \"kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.553771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.554135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.554281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckd6\" (UniqueName: \"kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.554306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.554391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbx5\" (UniqueName: \"kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 
23:21:31.655322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckd6\" (UniqueName: \"kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655521 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.655535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.662051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.662303 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.663719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.664032 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.674046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.674581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.682582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.690286 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.711476 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckd6\" (UniqueName: \"kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.712034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.712725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.712817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.718764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.719230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.721638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.726529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbx5\" (UniqueName: \"kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.747711 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.751298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:31 crc kubenswrapper[5030]: I0120 23:21:31.807115 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.030202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.124011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerStarted","Data":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.124048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerStarted","Data":"4146829f3edca7b9071eb5729925a4898dc1a6a57925be3c3f8cb23016dfcae3"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.125542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" event={"ID":"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0","Type":"ContainerStarted","Data":"e529e28e3eebf95ed91002b404e311beb3c6e1ef97122ba56f112fd06d3bb19e"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.125583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" event={"ID":"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0","Type":"ContainerStarted","Data":"319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.127344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-6smx7" event={"ID":"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1","Type":"ContainerStarted","Data":"8a44f789b6134272d0e93b7ae826cd695547bc33de5fd34f04050d6752912841"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.127371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-6smx7" event={"ID":"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1","Type":"ContainerStarted","Data":"cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113"} Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.163822 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" podStartSLOduration=2.163801229 podStartE2EDuration="2.163801229s" podCreationTimestamp="2026-01-20 23:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:32.144978877 +0000 UTC m=+2764.465239175" watchObservedRunningTime="2026-01-20 23:21:32.163801229 +0000 UTC 
m=+2764.484061517" Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.172244 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-6smx7" podStartSLOduration=2.172222621 podStartE2EDuration="2.172222621s" podCreationTimestamp="2026-01-20 23:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:32.167023616 +0000 UTC m=+2764.487283904" watchObservedRunningTime="2026-01-20 23:21:32.172222621 +0000 UTC m=+2764.492482909" Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.280754 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.434725 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.518077 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.534072 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:32 crc kubenswrapper[5030]: I0120 23:21:32.562300 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.090784 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-6zk9r"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.092283 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.095770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.095819 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.095972 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-kkfd7" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.104976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-6zk9r"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.160946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerStarted","Data":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.169157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerStarted","Data":"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889"} Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.169204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerStarted","Data":"574365f3308d9dc4d6af75a21a9caa1e295cf0f6f30fee451c6e3f7bf53a5fdb"} Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.171560 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerStarted","Data":"d2c02c65725a8deca788bdeba76604a028f1bb73b1d3046fefbaa1490e5c5459"} Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.294902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.294961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.295025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfj8\" (UniqueName: \"kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.295045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.295121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.295136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.388322 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-5px6h"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.389260 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.391481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-7j7b6" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.394663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.396924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.396984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397043 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wfj8\" (UniqueName: \"kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6tw\" (UniqueName: \"kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id\") pod \"cinder-db-sync-6zk9r\" (UID: 
\"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.397700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.402549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.404808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.404990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.405663 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.414461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfj8\" (UniqueName: \"kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8\") pod \"cinder-db-sync-6zk9r\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.416153 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-5px6h"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.430573 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.484132 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qw7ll"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.485127 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.487504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.487589 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.488607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-vkxc5" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.506992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qw7ll"] Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.509111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.509170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6tw\" (UniqueName: \"kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.509223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.516236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.528252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6tw\" (UniqueName: \"kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.529098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data\") pod \"barbican-db-sync-5px6h\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.613276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fc5f\" (UniqueName: \"kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " 
pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.613717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.613776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.709098 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.716044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fc5f\" (UniqueName: \"kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.716149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.716199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.737567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.753171 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fc5f\" (UniqueName: \"kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.753876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config\") pod \"neutron-db-sync-qw7ll\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:33 crc kubenswrapper[5030]: I0120 23:21:33.918577 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.013366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-6zk9r"] Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.192614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerStarted","Data":"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.193070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-log" containerID="cri-o://4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" gracePeriod=30 Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.193514 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-httpd" containerID="cri-o://db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" gracePeriod=30 Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.200691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerStarted","Data":"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.200733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerStarted","Data":"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.200842 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-log" containerID="cri-o://3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" gracePeriod=30 Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.201061 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-httpd" containerID="cri-o://fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" gracePeriod=30 Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.214462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerStarted","Data":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.220956 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.22092068 podStartE2EDuration="4.22092068s" podCreationTimestamp="2026-01-20 23:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:34.210129121 +0000 UTC m=+2766.530389409" 
watchObservedRunningTime="2026-01-20 23:21:34.22092068 +0000 UTC m=+2766.541180968" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.239567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" event={"ID":"78c5052d-ee32-4414-a152-13caff78882f","Type":"ContainerStarted","Data":"d359d0c9fa97ed0bf4815e4b6ab7394d1fd1918177fa72e0332f43b9a85a46a6"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.248347 5030 generic.go:334] "Generic (PLEG): container finished" podID="81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" containerID="8a44f789b6134272d0e93b7ae826cd695547bc33de5fd34f04050d6752912841" exitCode=0 Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.248394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-6smx7" event={"ID":"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1","Type":"ContainerDied","Data":"8a44f789b6134272d0e93b7ae826cd695547bc33de5fd34f04050d6752912841"} Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.268755 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.268736577 podStartE2EDuration="4.268736577s" podCreationTimestamp="2026-01-20 23:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:34.231410121 +0000 UTC m=+2766.551670409" watchObservedRunningTime="2026-01-20 23:21:34.268736577 +0000 UTC m=+2766.588996865" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.307104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-5px6h"] Jan 20 23:21:34 crc kubenswrapper[5030]: W0120 23:21:34.308504 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda504a208_146a_4ba0_8145_b42daf7f5f1a.slice/crio-17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb WatchSource:0}: Error finding container 17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb: Status 404 returned error can't find the container with id 17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.437820 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qw7ll"] Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.929043 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.941978 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlbx5\" (UniqueName: \"kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckd6\" (UniqueName: \"kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: 
\"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.961957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data\") pod \"74c5948e-4858-4efa-8f52-9220f419edf2\" (UID: \"74c5948e-4858-4efa-8f52-9220f419edf2\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle\") pod \"93c13494-312a-4bd5-a24b-032b4889c838\" (UID: \"93c13494-312a-4bd5-a24b-032b4889c838\") " Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs" (OuterVolumeSpecName: "logs") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962569 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962588 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.962657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs" (OuterVolumeSpecName: "logs") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.963935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.980052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts" (OuterVolumeSpecName: "scripts") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.983012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6" (OuterVolumeSpecName: "kube-api-access-rckd6") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "kube-api-access-rckd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.983794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5" (OuterVolumeSpecName: "kube-api-access-mlbx5") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "kube-api-access-mlbx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.985123 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts" (OuterVolumeSpecName: "scripts") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.987761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:21:34 crc kubenswrapper[5030]: I0120 23:21:34.993767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.021744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.038531 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.046776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.049230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data" (OuterVolumeSpecName: "config-data") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.051169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data" (OuterVolumeSpecName: "config-data") pod "93c13494-312a-4bd5-a24b-032b4889c838" (UID: "93c13494-312a-4bd5-a24b-032b4889c838"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.055716 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74c5948e-4858-4efa-8f52-9220f419edf2" (UID: "74c5948e-4858-4efa-8f52-9220f419edf2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.065435 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067055 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rckd6\" (UniqueName: \"kubernetes.io/projected/74c5948e-4858-4efa-8f52-9220f419edf2-kube-api-access-rckd6\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067177 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067259 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067372 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067489 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067552 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c13494-312a-4bd5-a24b-032b4889c838-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067612 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.067712 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5948e-4858-4efa-8f52-9220f419edf2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.069383 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.069723 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.069788 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5948e-4858-4efa-8f52-9220f419edf2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.069865 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlbx5\" (UniqueName: \"kubernetes.io/projected/93c13494-312a-4bd5-a24b-032b4889c838-kube-api-access-mlbx5\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.069937 5030 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93c13494-312a-4bd5-a24b-032b4889c838-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.091044 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.101246 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.171174 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.171473 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.257676 5030 generic.go:334] "Generic (PLEG): container finished" podID="74c5948e-4858-4efa-8f52-9220f419edf2" containerID="db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" exitCode=0 Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.257933 5030 generic.go:334] "Generic (PLEG): container finished" podID="74c5948e-4858-4efa-8f52-9220f419edf2" containerID="4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" exitCode=143 Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.257734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerDied","Data":"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.257763 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.257989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerDied","Data":"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.258017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"74c5948e-4858-4efa-8f52-9220f419edf2","Type":"ContainerDied","Data":"574365f3308d9dc4d6af75a21a9caa1e295cf0f6f30fee451c6e3f7bf53a5fdb"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.258042 5030 scope.go:117] "RemoveContainer" containerID="db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260594 5030 generic.go:334] "Generic (PLEG): container finished" podID="93c13494-312a-4bd5-a24b-032b4889c838" containerID="fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" exitCode=143 Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260639 5030 generic.go:334] "Generic (PLEG): container finished" podID="93c13494-312a-4bd5-a24b-032b4889c838" containerID="3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" exitCode=143 Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerDied","Data":"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerDied","Data":"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"93c13494-312a-4bd5-a24b-032b4889c838","Type":"ContainerDied","Data":"d2c02c65725a8deca788bdeba76604a028f1bb73b1d3046fefbaa1490e5c5459"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.260769 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.270508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" event={"ID":"a504a208-146a-4ba0-8145-b42daf7f5f1a","Type":"ContainerStarted","Data":"8c7558df2cb9e4c2727a9911eeacd806ffc948bf680971ab33112fc7377f785e"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.270547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" event={"ID":"a504a208-146a-4ba0-8145-b42daf7f5f1a","Type":"ContainerStarted","Data":"17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.287320 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" containerID="e529e28e3eebf95ed91002b404e311beb3c6e1ef97122ba56f112fd06d3bb19e" exitCode=0 Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.287373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" event={"ID":"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0","Type":"ContainerDied","Data":"e529e28e3eebf95ed91002b404e311beb3c6e1ef97122ba56f112fd06d3bb19e"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.289835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" event={"ID":"a403eee4-765c-41f2-9ae3-fa3151068a29","Type":"ContainerStarted","Data":"d0d3f005943a446b84278210290d3b4a01364c1349a853ce14c5fbeadf1bf75c"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.289857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" event={"ID":"a403eee4-765c-41f2-9ae3-fa3151068a29","Type":"ContainerStarted","Data":"92208947e04c6b0d7d351c25396e5784683eed9af973bc28b2bacc1fd66b4d5b"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.299738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" event={"ID":"78c5052d-ee32-4414-a152-13caff78882f","Type":"ContainerStarted","Data":"f9dc97af04fb3932c7410e9c9a358e795dd6cf3a504e15146011c5956745a3ca"} Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.304464 5030 scope.go:117] "RemoveContainer" containerID="4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.305343 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" podStartSLOduration=2.305328145 podStartE2EDuration="2.305328145s" podCreationTimestamp="2026-01-20 23:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:35.302390714 +0000 UTC m=+2767.622651002" watchObservedRunningTime="2026-01-20 23:21:35.305328145 +0000 UTC m=+2767.625588423" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.362787 5030 scope.go:117] "RemoveContainer" containerID="db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.363211 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" podStartSLOduration=2.363201353 podStartE2EDuration="2.363201353s" podCreationTimestamp="2026-01-20 23:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:35.361200085 +0000 UTC m=+2767.681460373" watchObservedRunningTime="2026-01-20 23:21:35.363201353 +0000 UTC m=+2767.683461641" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.371741 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044\": container with ID starting with db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044 not found: ID does not exist" containerID="db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.371783 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044"} err="failed to get container status \"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044\": rpc error: code = NotFound desc = could not find container \"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044\": container with ID starting with db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.371805 5030 scope.go:117] "RemoveContainer" containerID="4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.382198 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889\": container with ID starting with 4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889 not found: ID does not exist" containerID="4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.382244 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889"} err="failed to get container status \"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889\": rpc error: code = NotFound desc = could not find container \"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889\": container with ID starting with 4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.382266 5030 scope.go:117] "RemoveContainer" containerID="db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.388989 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044"} err="failed to get container status \"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044\": rpc error: code = NotFound desc = could not find container \"db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044\": container with ID starting with db8e8cdc588abddbcd921de9c23a619e42dbacc6020f148f02acddb96debd044 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.389027 5030 scope.go:117] "RemoveContainer" containerID="4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.395611 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889"} err="failed to get container status \"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889\": rpc error: code = NotFound desc = could not find container \"4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889\": container with ID starting with 4f2acb30b5ff42b5ab4609cd5aeceae725cf86b49227b90dadeab3a4d55fb889 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.395659 5030 scope.go:117] "RemoveContainer" containerID="fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.402460 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.415654 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.443691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.467237 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.500662 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.501009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501020 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.501039 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-httpd" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-httpd" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.501059 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501066 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.501086 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-httpd" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501091 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-httpd" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501237 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501248 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-httpd" Jan 20 
23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501258 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" containerName="glance-log" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.501274 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c13494-312a-4bd5-a24b-032b4889c838" containerName="glance-httpd" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.502097 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.514130 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-b75fg" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.514316 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.514591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.536835 5030 scope.go:117] "RemoveContainer" containerID="3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.537641 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwv9p\" (UniqueName: \"kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593316 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.593348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.689818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.697325 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.706033 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.706383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwv9p\" (UniqueName: \"kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.708572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.709123 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.717120 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.729056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.729917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.734662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") 
" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.752949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.766025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.795471 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" podStartSLOduration=2.795447758 podStartE2EDuration="2.795447758s" podCreationTimestamp="2026-01-20 23:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:35.456096331 +0000 UTC m=+2767.776356619" watchObservedRunningTime="2026-01-20 23:21:35.795447758 +0000 UTC m=+2768.115708046" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.801816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwv9p\" (UniqueName: \"kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813526 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813649 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldzpn\" (UniqueName: \"kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.813698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.815735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.824196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.843946 5030 scope.go:117] "RemoveContainer" containerID="fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.844115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.848755 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36\": container with ID starting with fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36 not found: ID does not exist" containerID="fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.848807 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36"} err="failed to get container status \"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36\": rpc error: code = NotFound desc = could not find container \"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36\": container with ID starting with fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.848833 5030 scope.go:117] "RemoveContainer" 
containerID="3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" Jan 20 23:21:35 crc kubenswrapper[5030]: E0120 23:21:35.852422 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7\": container with ID starting with 3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7 not found: ID does not exist" containerID="3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.852449 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7"} err="failed to get container status \"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7\": rpc error: code = NotFound desc = could not find container \"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7\": container with ID starting with 3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.852465 5030 scope.go:117] "RemoveContainer" containerID="fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.854851 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36"} err="failed to get container status \"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36\": rpc error: code = NotFound desc = could not find container \"fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36\": container with ID starting with fb933338a5256af0bee365c28fa90e4436cd1d19bfa24277b304b51ebbd7cd36 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.854874 5030 scope.go:117] "RemoveContainer" containerID="3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.869281 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.877041 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7"} err="failed to get container status \"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7\": rpc error: code = NotFound desc = could not find container \"3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7\": container with ID starting with 3e72152077961282387c7036b03ed57d6f7634f4a0ba5a9934f6b634280e0ab7 not found: ID does not exist" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.897490 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.920969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldzpn\" (UniqueName: \"kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.921273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.928019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.928247 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.928514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.929218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.944989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.950196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.950813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.953066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldzpn\" (UniqueName: \"kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.960555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.985788 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c5948e-4858-4efa-8f52-9220f419edf2" path="/var/lib/kubelet/pods/74c5948e-4858-4efa-8f52-9220f419edf2/volumes" Jan 20 23:21:35 crc kubenswrapper[5030]: I0120 23:21:35.987109 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c13494-312a-4bd5-a24b-032b4889c838" path="/var/lib/kubelet/pods/93c13494-312a-4bd5-a24b-032b4889c838/volumes" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.022289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t484n\" (UniqueName: \"kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n\") pod \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.022380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts\") pod \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.022411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data\") pod \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.022437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle\") pod \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.022456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs\") pod \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\" (UID: \"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.024244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs" (OuterVolumeSpecName: "logs") pod "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" (UID: "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.026073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n" (OuterVolumeSpecName: "kube-api-access-t484n") pod "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" (UID: "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1"). InnerVolumeSpecName "kube-api-access-t484n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.027406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts" (OuterVolumeSpecName: "scripts") pod "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" (UID: "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.049927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" (UID: "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.050322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data" (OuterVolumeSpecName: "config-data") pod "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" (UID: "81e2b764-c26c-4ef7-b6fb-c51e698b8ea1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.112792 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.124126 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.124159 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.124171 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.124180 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.124191 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t484n\" (UniqueName: \"kubernetes.io/projected/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1-kube-api-access-t484n\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.141414 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.316834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerStarted","Data":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.317093 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.316888 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-central-agent" containerID="cri-o://4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" gracePeriod=30 Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.317106 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="proxy-httpd" containerID="cri-o://ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" gracePeriod=30 Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.317188 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-notification-agent" containerID="cri-o://5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" gracePeriod=30 Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.317192 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="sg-core" containerID="cri-o://0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" gracePeriod=30 Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.323547 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-6smx7" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.323645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-6smx7" event={"ID":"81e2b764-c26c-4ef7-b6fb-c51e698b8ea1","Type":"ContainerDied","Data":"cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113"} Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.323672 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdfe5a4a641fff7540d46b0b8d7ee52f9f1164b9a673083f5dc397bfbb042113" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.344210 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.049759964 podStartE2EDuration="6.344194258s" podCreationTimestamp="2026-01-20 23:21:30 +0000 UTC" firstStartedPulling="2026-01-20 23:21:31.244021582 +0000 UTC m=+2763.564281870" lastFinishedPulling="2026-01-20 23:21:35.538455876 +0000 UTC m=+2767.858716164" observedRunningTime="2026-01-20 23:21:36.334769801 +0000 UTC m=+2768.655030099" watchObservedRunningTime="2026-01-20 23:21:36.344194258 +0000 UTC m=+2768.664454546" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.444567 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:21:36 crc kubenswrapper[5030]: E0120 23:21:36.445012 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" containerName="placement-db-sync" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.445075 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" containerName="placement-db-sync" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.445292 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" containerName="placement-db-sync" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.446221 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.450115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.450386 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.450487 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.450596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-94llg" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.450781 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.466068 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:21:36 crc kubenswrapper[5030]: E0120 23:21:36.477419 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15b072d_db1f_45d5_a486_d53f31091f29.slice/crio-0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15b072d_db1f_45d5_a486_d53f31091f29.slice/crio-conmon-ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.528820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.528860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.528887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnnw\" (UniqueName: \"kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.528908 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.528944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.529074 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.529094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: 
I0120 23:21:36.630875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnnw\" (UniqueName: \"kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.630902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.631023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.636508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.641828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.644051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.645151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.645192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.659756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnnw\" (UniqueName: \"kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw\") pod \"placement-7597899ccc-qc79q\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.680566 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:21:36 crc kubenswrapper[5030]: W0120 23:21:36.693203 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03270088_0dfc_4646_9599_0700fd991275.slice/crio-e3d841ad80fce68034329e6e9ec7acea58fa7d3a8f80743be11d33b14a112b13 WatchSource:0}: Error finding container e3d841ad80fce68034329e6e9ec7acea58fa7d3a8f80743be11d33b14a112b13: Status 404 returned error can't find the container with id e3d841ad80fce68034329e6e9ec7acea58fa7d3a8f80743be11d33b14a112b13 Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.769753 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.821686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.883182 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlh22\" (UniqueName: \"kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.938400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle\") pod \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\" (UID: \"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0\") " Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.944930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22" 
(OuterVolumeSpecName: "kube-api-access-wlh22") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "kube-api-access-wlh22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.948389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.951133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts" (OuterVolumeSpecName: "scripts") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.955747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:36 crc kubenswrapper[5030]: I0120 23:21:36.997046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data" (OuterVolumeSpecName: "config-data") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.020732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" (UID: "2ffbef9c-2e37-434b-9a2a-2eda3514d7f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041192 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041222 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041233 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlh22\" (UniqueName: \"kubernetes.io/projected/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-kube-api-access-wlh22\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041244 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041254 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.041262 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.084789 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145641 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: 
\"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwcgm\" (UniqueName: \"kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.145976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd\") pod \"e15b072d-db1f-45d5-a486-d53f31091f29\" (UID: \"e15b072d-db1f-45d5-a486-d53f31091f29\") " Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.147782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.154388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.161505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts" (OuterVolumeSpecName: "scripts") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.166351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm" (OuterVolumeSpecName: "kube-api-access-zwcgm") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "kube-api-access-zwcgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.183435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.242262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253284 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253324 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253334 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253342 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253353 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwcgm\" (UniqueName: \"kubernetes.io/projected/e15b072d-db1f-45d5-a486-d53f31091f29-kube-api-access-zwcgm\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.253363 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e15b072d-db1f-45d5-a486-d53f31091f29-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.259808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data" (OuterVolumeSpecName: "config-data") pod "e15b072d-db1f-45d5-a486-d53f31091f29" (UID: "e15b072d-db1f-45d5-a486-d53f31091f29"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.268198 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.354544 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15b072d-db1f-45d5-a486-d53f31091f29-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360836 5030 generic.go:334] "Generic (PLEG): container finished" podID="e15b072d-db1f-45d5-a486-d53f31091f29" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" exitCode=0 Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360864 5030 generic.go:334] "Generic (PLEG): container finished" podID="e15b072d-db1f-45d5-a486-d53f31091f29" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" exitCode=2 Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360873 5030 generic.go:334] "Generic (PLEG): container finished" podID="e15b072d-db1f-45d5-a486-d53f31091f29" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" exitCode=0 Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360880 5030 generic.go:334] "Generic (PLEG): container finished" podID="e15b072d-db1f-45d5-a486-d53f31091f29" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" exitCode=0 Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerDied","Data":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerDied","Data":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360953 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerDied","Data":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerDied","Data":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e15b072d-db1f-45d5-a486-d53f31091f29","Type":"ContainerDied","Data":"4146829f3edca7b9071eb5729925a4898dc1a6a57925be3c3f8cb23016dfcae3"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.360987 5030 scope.go:117] "RemoveContainer" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.361113 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.366723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerStarted","Data":"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.366767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerStarted","Data":"e3d841ad80fce68034329e6e9ec7acea58fa7d3a8f80743be11d33b14a112b13"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.369744 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.370013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-h94nz" event={"ID":"2ffbef9c-2e37-434b-9a2a-2eda3514d7f0","Type":"ContainerDied","Data":"319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.370146 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="319fc93a625af6f8d4b99627646662478779d050624488d36306212884e5a3f7" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.372461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerStarted","Data":"ee918c210dd8333de30c29f540d427e2dde9918e4e84fb352a3ad9f06c906777"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.373848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerStarted","Data":"61ba3904a386f332e5734ea5145e129d883e0fe2480ec1a1ef7f47ca0c96a6de"} Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.427136 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.432318 5030 scope.go:117] "RemoveContainer" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.444193 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.455840 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.456198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-central-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456213 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-central-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.456229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="proxy-httpd" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" 
containerName="proxy-httpd" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.456249 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-notification-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456256 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-notification-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.456271 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="sg-core" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456277 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="sg-core" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.456288 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" containerName="keystone-bootstrap" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" containerName="keystone-bootstrap" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456460 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="proxy-httpd" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456474 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-central-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456487 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" containerName="keystone-bootstrap" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456497 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="ceilometer-notification-agent" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.456517 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" containerName="sg-core" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.458046 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.461231 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.462452 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.469499 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.486703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-h94nz"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.489500 5030 scope.go:117] "RemoveContainer" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.491052 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-h94nz"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.525021 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q284r"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.526014 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.529635 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vdlm9" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.529739 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.529866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.529944 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.529978 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.532244 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q284r"] Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.554533 5030 scope.go:117] "RemoveContainer" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558164 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558201 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k9wk\" (UniqueName: \"kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.558282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k9wk\" (UniqueName: \"kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.659983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwcq\" (UniqueName: \"kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660364 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd\") pod 
\"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.660732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.663998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.667011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.667376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.673494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.676100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k9wk\" (UniqueName: \"kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk\") pod \"ceilometer-0\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.761781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.761831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.761854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwcq\" (UniqueName: \"kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.762403 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.762446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.762464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.765339 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.769125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.769162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.773637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.776336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.778363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwcq\" (UniqueName: \"kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq\") pod \"keystone-bootstrap-q284r\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.804609 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.839792 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.869215 5030 scope.go:117] "RemoveContainer" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.872216 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": container with ID starting with ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c not found: ID does not exist" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.872259 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} err="failed to get container status \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": rpc error: code = NotFound desc = could not find container \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": container with ID starting with ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.872280 5030 scope.go:117] "RemoveContainer" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.872957 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": container with ID starting with 0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66 not found: ID does not exist" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.873013 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} err="failed to get container status \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": rpc error: code = NotFound desc = could not find container \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": container with ID starting with 0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.873044 5030 scope.go:117] "RemoveContainer" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.873315 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": container with ID starting with 5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2 not found: ID does not exist" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.873340 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} err="failed to get container status \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": rpc error: code = NotFound desc = could not find container \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": container with ID starting with 5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.873357 5030 scope.go:117] "RemoveContainer" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: E0120 23:21:37.875486 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": container with ID starting with 4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a not found: ID does not exist" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.875527 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} err="failed to get container status \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": rpc error: code = NotFound desc = could not find container \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": container with ID starting with 4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.875551 5030 scope.go:117] "RemoveContainer" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876052 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} err="failed to get container status \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": rpc error: code = NotFound desc = could not find container \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": container with ID starting with ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876106 5030 scope.go:117] "RemoveContainer" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876388 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} err="failed to get container status \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": rpc error: code = NotFound desc = could not find container \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": container with ID starting with 0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876408 5030 scope.go:117] "RemoveContainer" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876648 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} err="failed to get container status \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": rpc error: code = NotFound desc = could not find container \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": container with ID starting with 5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876665 5030 scope.go:117] "RemoveContainer" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876919 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} err="failed to get container status \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": rpc error: code = NotFound desc = could not find container \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": container with ID starting with 4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.876939 5030 scope.go:117] "RemoveContainer" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877160 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} err="failed to get container status \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": rpc error: code = NotFound desc = could not find container \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": container with ID starting with ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877194 5030 scope.go:117] "RemoveContainer" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877600 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} err="failed to get container status \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": rpc error: code = NotFound desc = could not find container \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": container with ID starting with 0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877631 5030 scope.go:117] "RemoveContainer" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877919 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} err="failed to get container status \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": rpc error: code = NotFound desc = could not find container \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": container with ID starting with 5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2 not found: ID does not exist" Jan 
20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.877936 5030 scope.go:117] "RemoveContainer" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878121 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} err="failed to get container status \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": rpc error: code = NotFound desc = could not find container \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": container with ID starting with 4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878150 5030 scope.go:117] "RemoveContainer" containerID="ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878341 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c"} err="failed to get container status \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": rpc error: code = NotFound desc = could not find container \"ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c\": container with ID starting with ed38bdcd55bbecb3bd433b3cfe110340033449813260b421a7fd56fa1e1ba72c not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878370 5030 scope.go:117] "RemoveContainer" containerID="0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878547 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66"} err="failed to get container status \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": rpc error: code = NotFound desc = could not find container \"0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66\": container with ID starting with 0a316fed309a4dc26f8f9110bb485ee25466073dc030474f7c975f7f2d79ba66 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.878574 5030 scope.go:117] "RemoveContainer" containerID="5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.879082 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2"} err="failed to get container status \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": rpc error: code = NotFound desc = could not find container \"5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2\": container with ID starting with 5cde2241729ab9d07a55f24a458a5a07bf5f4673d215e66af6de2ba8bd1b6cd2 not found: ID does not exist" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.879110 5030 scope.go:117] "RemoveContainer" containerID="4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a" Jan 20 23:21:37 crc kubenswrapper[5030]: I0120 23:21:37.879409 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a"} err="failed to get container status 
\"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": rpc error: code = NotFound desc = could not find container \"4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a\": container with ID starting with 4c428985009978c458907d9bec1cc78bbea2f01dbe0239a562f21aa6a7cdc52a not found: ID does not exist" Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.031645 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ffbef9c-2e37-434b-9a2a-2eda3514d7f0" path="/var/lib/kubelet/pods/2ffbef9c-2e37-434b-9a2a-2eda3514d7f0/volumes" Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.032722 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15b072d-db1f-45d5-a486-d53f31091f29" path="/var/lib/kubelet/pods/e15b072d-db1f-45d5-a486-d53f31091f29/volumes" Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.301716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:21:38 crc kubenswrapper[5030]: W0120 23:21:38.304400 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5d3183b_1f93_402d_b6ae_04a7140350e1.slice/crio-9fc0eaf21d348e9e9a56aed10565687e90217773a031ea2cefdb91ab9793f90b WatchSource:0}: Error finding container 9fc0eaf21d348e9e9a56aed10565687e90217773a031ea2cefdb91ab9793f90b: Status 404 returned error can't find the container with id 9fc0eaf21d348e9e9a56aed10565687e90217773a031ea2cefdb91ab9793f90b Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.385808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerStarted","Data":"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.387802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerStarted","Data":"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.387835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerStarted","Data":"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.391357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerStarted","Data":"7ce22c372aaae61f2dea8e280dd96ecd34e932c3f390c86324eb4da31bd300fa"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.391389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerStarted","Data":"b7155efd558f19bbb3a86a83a98f9fbd59c3dfaff24a0ec4ba84d50b5ccae084"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.391457 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.391520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:21:38 crc 
kubenswrapper[5030]: I0120 23:21:38.394451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerStarted","Data":"9fc0eaf21d348e9e9a56aed10565687e90217773a031ea2cefdb91ab9793f90b"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.396088 5030 generic.go:334] "Generic (PLEG): container finished" podID="a504a208-146a-4ba0-8145-b42daf7f5f1a" containerID="8c7558df2cb9e4c2727a9911eeacd806ffc948bf680971ab33112fc7377f785e" exitCode=0 Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.396142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" event={"ID":"a504a208-146a-4ba0-8145-b42daf7f5f1a","Type":"ContainerDied","Data":"8c7558df2cb9e4c2727a9911eeacd806ffc948bf680971ab33112fc7377f785e"} Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.407971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q284r"] Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.411729 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.411708879 podStartE2EDuration="3.411708879s" podCreationTimestamp="2026-01-20 23:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:38.406395151 +0000 UTC m=+2770.726655439" watchObservedRunningTime="2026-01-20 23:21:38.411708879 +0000 UTC m=+2770.731969167" Jan 20 23:21:38 crc kubenswrapper[5030]: W0120 23:21:38.411750 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a0bd3e_1c17_41ac_b449_3392e10dccc8.slice/crio-8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd WatchSource:0}: Error finding container 8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd: Status 404 returned error can't find the container with id 8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.431013 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" podStartSLOduration=2.430993421 podStartE2EDuration="2.430993421s" podCreationTimestamp="2026-01-20 23:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:38.425895249 +0000 UTC m=+2770.746155547" watchObservedRunningTime="2026-01-20 23:21:38.430993421 +0000 UTC m=+2770.751253709" Jan 20 23:21:38 crc kubenswrapper[5030]: I0120 23:21:38.451412 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.45138616 podStartE2EDuration="3.45138616s" podCreationTimestamp="2026-01-20 23:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:38.44682314 +0000 UTC m=+2770.767083428" watchObservedRunningTime="2026-01-20 23:21:38.45138616 +0000 UTC m=+2770.771646448" Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.409272 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" 
event={"ID":"59a0bd3e-1c17-41ac-b449-3392e10dccc8","Type":"ContainerStarted","Data":"e0404281b5a4570801be1942e93ace165a65a028f86411ac0d3d3c68f979b1f5"} Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.409604 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" event={"ID":"59a0bd3e-1c17-41ac-b449-3392e10dccc8","Type":"ContainerStarted","Data":"8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd"} Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.412676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerStarted","Data":"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3"} Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.434548 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" podStartSLOduration=2.434523746 podStartE2EDuration="2.434523746s" podCreationTimestamp="2026-01-20 23:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:39.426492334 +0000 UTC m=+2771.746752632" watchObservedRunningTime="2026-01-20 23:21:39.434523746 +0000 UTC m=+2771.754784034" Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.807583 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.900175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle\") pod \"a504a208-146a-4ba0-8145-b42daf7f5f1a\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.900262 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data\") pod \"a504a208-146a-4ba0-8145-b42daf7f5f1a\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.900399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn6tw\" (UniqueName: \"kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw\") pod \"a504a208-146a-4ba0-8145-b42daf7f5f1a\" (UID: \"a504a208-146a-4ba0-8145-b42daf7f5f1a\") " Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.905528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw" (OuterVolumeSpecName: "kube-api-access-qn6tw") pod "a504a208-146a-4ba0-8145-b42daf7f5f1a" (UID: "a504a208-146a-4ba0-8145-b42daf7f5f1a"). InnerVolumeSpecName "kube-api-access-qn6tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.912560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a504a208-146a-4ba0-8145-b42daf7f5f1a" (UID: "a504a208-146a-4ba0-8145-b42daf7f5f1a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:39 crc kubenswrapper[5030]: I0120 23:21:39.936903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a504a208-146a-4ba0-8145-b42daf7f5f1a" (UID: "a504a208-146a-4ba0-8145-b42daf7f5f1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.002642 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.002955 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a504a208-146a-4ba0-8145-b42daf7f5f1a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.002967 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn6tw\" (UniqueName: \"kubernetes.io/projected/a504a208-146a-4ba0-8145-b42daf7f5f1a-kube-api-access-qn6tw\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.156920 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.156981 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.422520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" event={"ID":"a504a208-146a-4ba0-8145-b42daf7f5f1a","Type":"ContainerDied","Data":"17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb"} Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.422571 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17918297b7c987d47aeabc0bd0a32a79233ee36ecdcbe784503db484eaea1adb" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.422576 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-5px6h" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.425737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerStarted","Data":"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155"} Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.425817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerStarted","Data":"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1"} Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.663576 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:21:40 crc kubenswrapper[5030]: E0120 23:21:40.663961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a504a208-146a-4ba0-8145-b42daf7f5f1a" containerName="barbican-db-sync" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.663972 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a504a208-146a-4ba0-8145-b42daf7f5f1a" containerName="barbican-db-sync" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.664130 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a504a208-146a-4ba0-8145-b42daf7f5f1a" containerName="barbican-db-sync" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.664936 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.669195 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-7j7b6" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.669381 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.675530 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.685296 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.686813 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.691944 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.698681 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkf6\" (UniqueName: \"kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.714878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d882t\" (UniqueName: \"kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc 
kubenswrapper[5030]: I0120 23:21:40.714905 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.715368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.715400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.724307 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.788876 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.790195 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.792345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.802882 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817482 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817730 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzkf6\" (UniqueName: \"kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817770 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d882t\" (UniqueName: \"kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817827 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.817975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom\") pod 
\"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.818059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.818071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66zl\" (UniqueName: \"kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.818142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.818198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.819028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.823861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.824669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.826315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.828531 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.834248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.834343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.836743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkf6\" (UniqueName: \"kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6\") pod \"barbican-keystone-listener-7587c489b5-vzmf2\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.837740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d882t\" (UniqueName: \"kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t\") pod \"barbican-worker-77849f758-fppxg\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66zl\" (UniqueName: \"kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920132 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " 
pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.920787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.923797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.924757 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.925297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:40 crc kubenswrapper[5030]: I0120 23:21:40.940303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z66zl\" (UniqueName: \"kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl\") pod \"barbican-api-595867cbd5-4xbnv\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.007842 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.024700 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.106182 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.466968 5030 generic.go:334] "Generic (PLEG): container finished" podID="78c5052d-ee32-4414-a152-13caff78882f" containerID="f9dc97af04fb3932c7410e9c9a358e795dd6cf3a504e15146011c5956745a3ca" exitCode=0 Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.467138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" event={"ID":"78c5052d-ee32-4414-a152-13caff78882f","Type":"ContainerDied","Data":"f9dc97af04fb3932c7410e9c9a358e795dd6cf3a504e15146011c5956745a3ca"} Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.531969 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.538799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:21:41 crc kubenswrapper[5030]: I0120 23:21:41.691642 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:41 crc kubenswrapper[5030]: W0120 23:21:41.694558 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2733db1e_741d_49c1_bd1e_8a3b91cb5069.slice/crio-4926d827b942cd4f7318deba7ced99161643201328b8d702b7e568b035e8ea46 WatchSource:0}: Error finding container 4926d827b942cd4f7318deba7ced99161643201328b8d702b7e568b035e8ea46: Status 404 returned error can't find the container with id 4926d827b942cd4f7318deba7ced99161643201328b8d702b7e568b035e8ea46 Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.478238 5030 generic.go:334] "Generic (PLEG): container finished" podID="59a0bd3e-1c17-41ac-b449-3392e10dccc8" containerID="e0404281b5a4570801be1942e93ace165a65a028f86411ac0d3d3c68f979b1f5" exitCode=0 Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.478312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" event={"ID":"59a0bd3e-1c17-41ac-b449-3392e10dccc8","Type":"ContainerDied","Data":"e0404281b5a4570801be1942e93ace165a65a028f86411ac0d3d3c68f979b1f5"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.481768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerStarted","Data":"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.481967 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.484231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerStarted","Data":"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.484277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerStarted","Data":"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.484294 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerStarted","Data":"46ca8187512a0c20917af667d8938f43bde4cd8b7c114e3befce9ac1013597e4"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.486611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerStarted","Data":"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.486678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerStarted","Data":"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.486695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerStarted","Data":"852edb6beb66476327455a282ca0f6f5b5a2e520521c33fe0315cf8b2482dd13"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.489354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerStarted","Data":"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.489395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerStarted","Data":"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.489406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerStarted","Data":"4926d827b942cd4f7318deba7ced99161643201328b8d702b7e568b035e8ea46"} Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.541533 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.62569952 podStartE2EDuration="5.541518944s" podCreationTimestamp="2026-01-20 23:21:37 +0000 UTC" firstStartedPulling="2026-01-20 23:21:38.306999007 +0000 UTC m=+2770.627259295" lastFinishedPulling="2026-01-20 23:21:41.222818431 +0000 UTC m=+2773.543078719" observedRunningTime="2026-01-20 23:21:42.53550974 +0000 UTC m=+2774.855770028" watchObservedRunningTime="2026-01-20 23:21:42.541518944 +0000 UTC m=+2774.861779232" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.572131 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" podStartSLOduration=2.572116518 podStartE2EDuration="2.572116518s" podCreationTimestamp="2026-01-20 23:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:42.561911853 +0000 UTC m=+2774.882172141" watchObservedRunningTime="2026-01-20 23:21:42.572116518 +0000 UTC m=+2774.892376806" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.589805 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" podStartSLOduration=2.589782342 podStartE2EDuration="2.589782342s" podCreationTimestamp="2026-01-20 23:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:42.580838818 +0000 UTC m=+2774.901099106" watchObservedRunningTime="2026-01-20 23:21:42.589782342 +0000 UTC m=+2774.910042630" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.608348 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" podStartSLOduration=2.608329216 podStartE2EDuration="2.608329216s" podCreationTimestamp="2026-01-20 23:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:42.603317797 +0000 UTC m=+2774.923578095" watchObservedRunningTime="2026-01-20 23:21:42.608329216 +0000 UTC m=+2774.928589504" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.951939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.953448 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.955154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.955154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.967432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:21:42 crc kubenswrapper[5030]: I0120 23:21:42.982744 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066662 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wfj8\" (UniqueName: \"kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.066914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data\") pod \"78c5052d-ee32-4414-a152-13caff78882f\" (UID: \"78c5052d-ee32-4414-a152-13caff78882f\") " Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067217 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067308 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067367 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdshl\" (UniqueName: \"kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.067492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.068115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.068190 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78c5052d-ee32-4414-a152-13caff78882f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.071452 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.072800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8" (OuterVolumeSpecName: "kube-api-access-4wfj8") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). InnerVolumeSpecName "kube-api-access-4wfj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.088308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts" (OuterVolumeSpecName: "scripts") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.097062 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.116046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data" (OuterVolumeSpecName: "config-data") pod "78c5052d-ee32-4414-a152-13caff78882f" (UID: "78c5052d-ee32-4414-a152-13caff78882f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdshl\" (UniqueName: \"kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.169998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.170066 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.170082 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.170096 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wfj8\" (UniqueName: \"kubernetes.io/projected/78c5052d-ee32-4414-a152-13caff78882f-kube-api-access-4wfj8\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.170109 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.170122 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c5052d-ee32-4414-a152-13caff78882f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.171358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.174737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.178277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.178479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.178568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc 
kubenswrapper[5030]: I0120 23:21:43.179673 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.187830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdshl\" (UniqueName: \"kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl\") pod \"barbican-api-6687fdd977-92b7s\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.295670 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.513428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" event={"ID":"78c5052d-ee32-4414-a152-13caff78882f","Type":"ContainerDied","Data":"d359d0c9fa97ed0bf4815e4b6ab7394d1fd1918177fa72e0332f43b9a85a46a6"} Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.513699 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d359d0c9fa97ed0bf4815e4b6ab7394d1fd1918177fa72e0332f43b9a85a46a6" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.513748 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.513807 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.518317 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-6zk9r" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.722548 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:43 crc kubenswrapper[5030]: E0120 23:21:43.722928 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c5052d-ee32-4414-a152-13caff78882f" containerName="cinder-db-sync" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.722945 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c5052d-ee32-4414-a152-13caff78882f" containerName="cinder-db-sync" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.723102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c5052d-ee32-4414-a152-13caff78882f" containerName="cinder-db-sync" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.730122 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.738941 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.739349 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.739462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-kkfd7" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.742482 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.744354 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwt4\" (UniqueName: \"kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.792855 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwt4\" (UniqueName: \"kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896549 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.896595 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.905643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.917256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.919267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.921370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.927398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:43 crc kubenswrapper[5030]: I0120 23:21:43.942187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwt4\" (UniqueName: \"kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4\") pod \"cinder-scheduler-0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.010448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.062718 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.147906 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.149305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.155117 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.167887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.199895 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.216732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.216992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.217025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbnk\" (UniqueName: \"kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.217114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.217161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.217182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.217196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319515 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvwcq\" (UniqueName: \"kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.319677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle\") pod \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\" (UID: \"59a0bd3e-1c17-41ac-b449-3392e10dccc8\") " Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbnk\" (UniqueName: \"kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 
20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.320577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.326859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.328327 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.328859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.328987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq" (OuterVolumeSpecName: "kube-api-access-fvwcq") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "kube-api-access-fvwcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.329006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.331042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.331168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.336780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts" (OuterVolumeSpecName: "scripts") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.337095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.347523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbnk\" (UniqueName: \"kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk\") pod \"cinder-api-0\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.353845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.357021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data" (OuterVolumeSpecName: "config-data") pod "59a0bd3e-1c17-41ac-b449-3392e10dccc8" (UID: "59a0bd3e-1c17-41ac-b449-3392e10dccc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422326 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422360 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvwcq\" (UniqueName: \"kubernetes.io/projected/59a0bd3e-1c17-41ac-b449-3392e10dccc8-kube-api-access-fvwcq\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422371 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422381 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422390 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.422398 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59a0bd3e-1c17-41ac-b449-3392e10dccc8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.491178 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.515342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerStarted","Data":"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89"} Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.515400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerStarted","Data":"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1"} Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.515412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerStarted","Data":"1c10f1e81f0dadeab5729eeb2e32949e39106d9a10fee0ca69f392fefd7dc33a"} Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.515556 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.515907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.517328 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.517376 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-q284r" event={"ID":"59a0bd3e-1c17-41ac-b449-3392e10dccc8","Type":"ContainerDied","Data":"8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd"} Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.517399 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b9d15dc7b5afc80d2fecb59444efc9b2a748ed2c75fc7d346f18e7d7ce564fd" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.553231 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" podStartSLOduration=2.5532044259999997 podStartE2EDuration="2.553204426s" podCreationTimestamp="2026-01-20 23:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:44.54337595 +0000 UTC m=+2776.863636248" watchObservedRunningTime="2026-01-20 23:21:44.553204426 +0000 UTC m=+2776.873464744" Jan 20 23:21:44 crc kubenswrapper[5030]: W0120 23:21:44.596017 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode174f9d2_5b44_468a_8269_7b7fbfca6fa0.slice/crio-4ffff958fedff990ba58a7c40a24d4a872cf9dc7ce36d9494861bd667fff70cd WatchSource:0}: Error finding container 4ffff958fedff990ba58a7c40a24d4a872cf9dc7ce36d9494861bd667fff70cd: Status 404 returned error can't find the container with id 4ffff958fedff990ba58a7c40a24d4a872cf9dc7ce36d9494861bd667fff70cd Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.600574 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.714050 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:21:44 crc kubenswrapper[5030]: E0120 23:21:44.714505 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59a0bd3e-1c17-41ac-b449-3392e10dccc8" containerName="keystone-bootstrap" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.714518 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a0bd3e-1c17-41ac-b449-3392e10dccc8" containerName="keystone-bootstrap" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.714677 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59a0bd3e-1c17-41ac-b449-3392e10dccc8" containerName="keystone-bootstrap" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.715283 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.718334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.718983 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.719078 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.719173 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.722880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-vdlm9" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.723565 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.736521 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv7j9\" (UniqueName: \"kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.828954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.829006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.930826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931337 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.931364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv7j9\" (UniqueName: \"kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.937712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.940426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.940716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.940823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.943673 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.957685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.957777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:44 crc kubenswrapper[5030]: I0120 23:21:44.962017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv7j9\" (UniqueName: 
\"kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9\") pod \"keystone-5c77db7594-ssxbs\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.058466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:45 crc kubenswrapper[5030]: W0120 23:21:45.067293 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf2935f2_3c77_4aa7_81f7_eb34b767e043.slice/crio-25c6f58a443d670b049b023c07ad235b873f0d7f46822e004a0241dc12c10a70 WatchSource:0}: Error finding container 25c6f58a443d670b049b023c07ad235b873f0d7f46822e004a0241dc12c10a70: Status 404 returned error can't find the container with id 25c6f58a443d670b049b023c07ad235b873f0d7f46822e004a0241dc12c10a70 Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.080743 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.527873 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:21:45 crc kubenswrapper[5030]: W0120 23:21:45.541212 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e6b8546_b932_4a4a_a2a0_d34aa5399cf6.slice/crio-df65c38c9b137bc52c4c0c7fdc204b407a3bc5d7df6e4766acbdc1e71d56baf1 WatchSource:0}: Error finding container df65c38c9b137bc52c4c0c7fdc204b407a3bc5d7df6e4766acbdc1e71d56baf1: Status 404 returned error can't find the container with id df65c38c9b137bc52c4c0c7fdc204b407a3bc5d7df6e4766acbdc1e71d56baf1 Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.542944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerStarted","Data":"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28"} Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.542985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerStarted","Data":"4ffff958fedff990ba58a7c40a24d4a872cf9dc7ce36d9494861bd667fff70cd"} Jan 20 23:21:45 crc kubenswrapper[5030]: I0120 23:21:45.545026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerStarted","Data":"25c6f58a443d670b049b023c07ad235b873f0d7f46822e004a0241dc12c10a70"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.113974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.114614 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.144150 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.144196 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.180381 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.200806 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.209231 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.218111 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.557918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" event={"ID":"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6","Type":"ContainerStarted","Data":"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.557965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" event={"ID":"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6","Type":"ContainerStarted","Data":"df65c38c9b137bc52c4c0c7fdc204b407a3bc5d7df6e4766acbdc1e71d56baf1"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.558954 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.565947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerStarted","Data":"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.583139 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" podStartSLOduration=2.5831188149999997 podStartE2EDuration="2.583118815s" podCreationTimestamp="2026-01-20 23:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:46.580010941 +0000 UTC m=+2778.900271229" watchObservedRunningTime="2026-01-20 23:21:46.583118815 +0000 UTC m=+2778.903379103" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.609774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerStarted","Data":"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.609844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerStarted","Data":"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499"} Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.611179 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.611236 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.611247 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.611268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.611401 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.644763 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.644743253 podStartE2EDuration="3.644743253s" podCreationTimestamp="2026-01-20 23:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:46.613412722 +0000 UTC m=+2778.933673010" watchObservedRunningTime="2026-01-20 23:21:46.644743253 +0000 UTC m=+2778.965003541" Jan 20 23:21:46 crc kubenswrapper[5030]: I0120 23:21:46.652904 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=2.652883768 podStartE2EDuration="2.652883768s" podCreationTimestamp="2026-01-20 23:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:46.634964298 +0000 UTC m=+2778.955224586" watchObservedRunningTime="2026-01-20 23:21:46.652883768 +0000 UTC m=+2778.973144056" Jan 20 23:21:46 crc kubenswrapper[5030]: E0120 23:21:46.802680 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:21:47 crc kubenswrapper[5030]: I0120 23:21:47.184645 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.398645 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.626750 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.626783 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.626818 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.626849 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.627353 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api-log" 
containerID="cri-o://44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" gracePeriod=30 Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.627422 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api" containerID="cri-o://110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" gracePeriod=30 Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.806156 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:48 crc kubenswrapper[5030]: I0120 23:21:48.821193 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.063741 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.144976 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.148407 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.294446 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gbnk\" (UniqueName: \"kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.428504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data\") pod \"df2935f2-3c77-4aa7-81f7-eb34b767e043\" (UID: \"df2935f2-3c77-4aa7-81f7-eb34b767e043\") " Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.429870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.430290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs" (OuterVolumeSpecName: "logs") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.451389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.453404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk" (OuterVolumeSpecName: "kube-api-access-5gbnk") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "kube-api-access-5gbnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.453816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts" (OuterVolumeSpecName: "scripts") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.466771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.499707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data" (OuterVolumeSpecName: "config-data") pod "df2935f2-3c77-4aa7-81f7-eb34b767e043" (UID: "df2935f2-3c77-4aa7-81f7-eb34b767e043"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530873 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df2935f2-3c77-4aa7-81f7-eb34b767e043-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530908 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530917 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530926 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df2935f2-3c77-4aa7-81f7-eb34b767e043-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530936 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530946 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gbnk\" (UniqueName: \"kubernetes.io/projected/df2935f2-3c77-4aa7-81f7-eb34b767e043-kube-api-access-5gbnk\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.530956 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df2935f2-3c77-4aa7-81f7-eb34b767e043-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.641705 5030 generic.go:334] "Generic (PLEG): container finished" podID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerID="110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" exitCode=0 Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.641740 5030 generic.go:334] "Generic (PLEG): container finished" podID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerID="44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" exitCode=143 Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.642640 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.642948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerDied","Data":"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388"} Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.643007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerDied","Data":"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499"} Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.643019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"df2935f2-3c77-4aa7-81f7-eb34b767e043","Type":"ContainerDied","Data":"25c6f58a443d670b049b023c07ad235b873f0d7f46822e004a0241dc12c10a70"} Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.643037 5030 scope.go:117] "RemoveContainer" containerID="110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.682707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.692432 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.705181 5030 scope.go:117] "RemoveContainer" containerID="44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.725536 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:49 crc kubenswrapper[5030]: E0120 23:21:49.725950 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.725967 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api" Jan 20 23:21:49 crc kubenswrapper[5030]: E0120 23:21:49.725994 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api-log" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.726001 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api-log" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.726171 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.726184 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" containerName="cinder-api-log" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.727167 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.730808 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.730979 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.736508 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.741409 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.748443 5030 scope.go:117] "RemoveContainer" containerID="110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" Jan 20 23:21:49 crc kubenswrapper[5030]: E0120 23:21:49.758747 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388\": container with ID starting with 110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388 not found: ID does not exist" containerID="110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.758794 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388"} err="failed to get container status \"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388\": rpc error: code = NotFound desc = could not find container \"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388\": container with ID starting with 110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388 not found: ID does not exist" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.758817 5030 scope.go:117] "RemoveContainer" containerID="44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" Jan 20 23:21:49 crc kubenswrapper[5030]: E0120 23:21:49.760057 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499\": container with ID starting with 44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499 not found: ID does not exist" containerID="44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.760100 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499"} err="failed to get container status \"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499\": rpc error: code = NotFound desc = could not find container \"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499\": container with ID starting with 44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499 not found: ID does not exist" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.760128 5030 scope.go:117] "RemoveContainer" containerID="110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.760520 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388"} err="failed to get container status \"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388\": rpc error: code = NotFound desc = could not find container \"110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388\": container with ID starting with 110c58e1a9d249ca6e07491cf7f25dbfee9e6ad1421a80359e8d3a4a6c17d388 not found: ID does not exist" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.760541 5030 scope.go:117] "RemoveContainer" containerID="44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.762097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499"} err="failed to get container status \"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499\": rpc error: code = NotFound desc = could not find container \"44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499\": container with ID starting with 44d92adc39adb5ff6f96d3c2d5df8008980bb880cdd2f04c0b37c2fa67fcd499 not found: ID does not exist" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.778401 5030 scope.go:117] "RemoveContainer" containerID="7d5497c08cbd622e49fcc51b99641a1df34a35f61b4bc17ebbc6a57130accf3b" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.808904 5030 scope.go:117] "RemoveContainer" containerID="752b6a46f4ce2e600f72af503766db84caedf6a5b37e0be36eaa583d522b93d0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839784 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrmx\" (UniqueName: \"kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 
23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.839970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.840041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.840108 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.840167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.850706 5030 scope.go:117] "RemoveContainer" containerID="5ec228c62d11e892bccd8b3e7622fe27d4c99380436acc8ab9a70d07539cf3ba" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.884631 5030 scope.go:117] "RemoveContainer" containerID="b4464c2867c3edb0059011581cf4f977a29af49058830c50e8ad505fb9d5fbc6" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.908747 5030 scope.go:117] "RemoveContainer" containerID="23ab3d19a3162b3c362222b9220b0da88a73e77ca2fa8dc2831bc0408c8efb1d" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.941801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.941857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.941897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.941929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc 
kubenswrapper[5030]: I0120 23:21:49.941946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.941969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrmx\" (UniqueName: \"kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.942004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.942030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.942045 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.944192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.949223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.949317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.949824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.952150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") 
" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.952574 5030 scope.go:117] "RemoveContainer" containerID="ef9e28371f66994ada583894a1e45f03b589a60920c442f3c6622eb45d5ee18b" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.954253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.956124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.957016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.962513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrmx\" (UniqueName: \"kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx\") pod \"cinder-api-0\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.988001 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2935f2-3c77-4aa7-81f7-eb34b767e043" path="/var/lib/kubelet/pods/df2935f2-3c77-4aa7-81f7-eb34b767e043/volumes" Jan 20 23:21:49 crc kubenswrapper[5030]: I0120 23:21:49.999911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.028773 5030 scope.go:117] "RemoveContainer" containerID="f8d50c0f4240936dce63e32dc4fe73758a74f2c14533a746b39d074c52eb8285" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.051574 5030 scope.go:117] "RemoveContainer" containerID="716936d721449a258bd4a806a8716ac0e4856d04d567e559f47e27a903f4413e" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.118034 5030 scope.go:117] "RemoveContainer" containerID="e57aabed19e5fba72fb49b809d6dcc31f2a0753495ac995fd3a0afcefc01d277" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.136661 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.137079 5030 scope.go:117] "RemoveContainer" containerID="792cb958e3010513ca28d399f852b0e27ce52b1f4fb7f79b6e5a138bf32c9351" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.185566 5030 scope.go:117] "RemoveContainer" containerID="b4810b367aef312cb07a5bc212828530d847353295a4a0029f98ba6de0de7144" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.215257 5030 scope.go:117] "RemoveContainer" containerID="d1a2aa936166935c4ce2906d91846b385b067a38b360d28d89c7fd9c07cc9408" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.250547 5030 scope.go:117] "RemoveContainer" containerID="194f5ba11638676e1a1ddfbc95a8db6d66d354d3f02a3aca30a220cde44f7c5e" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.287929 5030 scope.go:117] "RemoveContainer" containerID="6f5089b9d57c3414c07cfe3a78f2c07bcbe1e7a78e9ee5624df9d18e36a5fd93" Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.618432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:21:50 crc kubenswrapper[5030]: W0120 23:21:50.629364 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9890aef_9ec6_4ee3_a4f5_3369f857fe91.slice/crio-d43d7c27d3e271ea822d5e5d2522ae40eccbac1dae312780c02d46e43a252c81 WatchSource:0}: Error finding container d43d7c27d3e271ea822d5e5d2522ae40eccbac1dae312780c02d46e43a252c81: Status 404 returned error can't find the container with id d43d7c27d3e271ea822d5e5d2522ae40eccbac1dae312780c02d46e43a252c81 Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.651568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerStarted","Data":"d43d7c27d3e271ea822d5e5d2522ae40eccbac1dae312780c02d46e43a252c81"} Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.653159 5030 generic.go:334] "Generic (PLEG): container finished" podID="a403eee4-765c-41f2-9ae3-fa3151068a29" containerID="d0d3f005943a446b84278210290d3b4a01364c1349a853ce14c5fbeadf1bf75c" exitCode=0 Jan 20 23:21:50 crc kubenswrapper[5030]: I0120 23:21:50.653322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" event={"ID":"a403eee4-765c-41f2-9ae3-fa3151068a29","Type":"ContainerDied","Data":"d0d3f005943a446b84278210290d3b4a01364c1349a853ce14c5fbeadf1bf75c"} Jan 20 23:21:51 crc kubenswrapper[5030]: I0120 23:21:51.673358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerStarted","Data":"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474"} Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.090695 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.182566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config\") pod \"a403eee4-765c-41f2-9ae3-fa3151068a29\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.182651 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fc5f\" (UniqueName: \"kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f\") pod \"a403eee4-765c-41f2-9ae3-fa3151068a29\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.182699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle\") pod \"a403eee4-765c-41f2-9ae3-fa3151068a29\" (UID: \"a403eee4-765c-41f2-9ae3-fa3151068a29\") " Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.187868 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f" (OuterVolumeSpecName: "kube-api-access-2fc5f") pod "a403eee4-765c-41f2-9ae3-fa3151068a29" (UID: "a403eee4-765c-41f2-9ae3-fa3151068a29"). InnerVolumeSpecName "kube-api-access-2fc5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.206284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config" (OuterVolumeSpecName: "config") pod "a403eee4-765c-41f2-9ae3-fa3151068a29" (UID: "a403eee4-765c-41f2-9ae3-fa3151068a29"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.209305 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a403eee4-765c-41f2-9ae3-fa3151068a29" (UID: "a403eee4-765c-41f2-9ae3-fa3151068a29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.284970 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.285006 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fc5f\" (UniqueName: \"kubernetes.io/projected/a403eee4-765c-41f2-9ae3-fa3151068a29-kube-api-access-2fc5f\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.285018 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a403eee4-765c-41f2-9ae3-fa3151068a29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.697287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" event={"ID":"a403eee4-765c-41f2-9ae3-fa3151068a29","Type":"ContainerDied","Data":"92208947e04c6b0d7d351c25396e5784683eed9af973bc28b2bacc1fd66b4d5b"} Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.697763 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92208947e04c6b0d7d351c25396e5784683eed9af973bc28b2bacc1fd66b4d5b" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.697865 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-qw7ll" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.703416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerStarted","Data":"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791"} Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.704897 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.748823 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.748763522 podStartE2EDuration="3.748763522s" podCreationTimestamp="2026-01-20 23:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:52.734375267 +0000 UTC m=+2785.054635575" watchObservedRunningTime="2026-01-20 23:21:52.748763522 +0000 UTC m=+2785.069023840" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.814856 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:21:52 crc kubenswrapper[5030]: E0120 23:21:52.815217 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a403eee4-765c-41f2-9ae3-fa3151068a29" containerName="neutron-db-sync" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.815234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a403eee4-765c-41f2-9ae3-fa3151068a29" containerName="neutron-db-sync" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.815408 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a403eee4-765c-41f2-9ae3-fa3151068a29" containerName="neutron-db-sync" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.820351 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.822265 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-vkxc5" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.822498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.825050 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.825078 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.829089 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.901939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pttsj\" (UniqueName: \"kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.901987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.902161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.902195 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:52 crc kubenswrapper[5030]: I0120 23:21:52.902260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.003562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.003709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pttsj\" (UniqueName: \"kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.003738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.003784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.003798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.012686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.013376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.016393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.026782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.036563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pttsj\" (UniqueName: \"kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj\") pod \"neutron-7fbf45bdf4-vh8hk\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.141290 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.586379 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:21:53 crc kubenswrapper[5030]: W0120 23:21:53.597745 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod679648c6_8923_42fc_9226_c78458125e4c.slice/crio-a39049e7fbb6b796b10f056fe686f6a0bc27e97a257a7c5001a89a0deeb063fb WatchSource:0}: Error finding container a39049e7fbb6b796b10f056fe686f6a0bc27e97a257a7c5001a89a0deeb063fb: Status 404 returned error can't find the container with id a39049e7fbb6b796b10f056fe686f6a0bc27e97a257a7c5001a89a0deeb063fb Jan 20 23:21:53 crc kubenswrapper[5030]: I0120 23:21:53.712122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerStarted","Data":"a39049e7fbb6b796b10f056fe686f6a0bc27e97a257a7c5001a89a0deeb063fb"} Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.298493 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.354609 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.550774 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.720502 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.726215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerStarted","Data":"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694"} Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.726260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerStarted","Data":"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd"} Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.727365 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="cinder-scheduler" containerID="cri-o://f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28" gracePeriod=30 Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.727516 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="probe" containerID="cri-o://d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84" gracePeriod=30 Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.781367 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.781650 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api-log" containerID="cri-o://d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf" gracePeriod=30 Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.781707 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api" containerID="cri-o://d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98" gracePeriod=30 Jan 20 23:21:54 crc kubenswrapper[5030]: I0120 23:21:54.784601 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" podStartSLOduration=2.784581653 podStartE2EDuration="2.784581653s" podCreationTimestamp="2026-01-20 23:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:54.774991033 +0000 UTC m=+2787.095251321" watchObservedRunningTime="2026-01-20 23:21:54.784581653 +0000 UTC m=+2787.104841941" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.298947 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.301390 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.303864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.304266 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.320450 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.454456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ng7z\" (UniqueName: \"kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.454550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.454603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.454873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.454967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.455008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.455179 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.556849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557110 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.557633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ng7z\" (UniqueName: \"kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.562638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.562691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.563282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.565560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.566258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.581072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ng7z\" (UniqueName: \"kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.581125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config\") pod \"neutron-6f7b76b9f9-kb5m8\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.629137 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.737461 5030 generic.go:334] "Generic (PLEG): container finished" podID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerID="d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84" exitCode=0 Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.737509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerDied","Data":"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84"} Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.739485 5030 generic.go:334] "Generic (PLEG): container finished" podID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerID="d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf" exitCode=143 Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.740280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerDied","Data":"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf"} Jan 20 23:21:55 crc kubenswrapper[5030]: I0120 23:21:55.740307 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.104656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:21:56 crc kubenswrapper[5030]: W0120 23:21:56.106518 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a8a7627_29f1_4e20_a1ea_5a2b45716e6e.slice/crio-518d26b9cc16a171da5a9825e14f7170edc48d0e6e2c8ea887fa51a78620f321 WatchSource:0}: Error finding container 518d26b9cc16a171da5a9825e14f7170edc48d0e6e2c8ea887fa51a78620f321: Status 404 returned error can't find the container with id 518d26b9cc16a171da5a9825e14f7170edc48d0e6e2c8ea887fa51a78620f321 Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.689468 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.769105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerStarted","Data":"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7"} Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.769159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerStarted","Data":"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65"} Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.769182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerStarted","Data":"518d26b9cc16a171da5a9825e14f7170edc48d0e6e2c8ea887fa51a78620f321"} Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.769235 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.773030 5030 generic.go:334] "Generic (PLEG): container finished" podID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerID="f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28" exitCode=0 Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.775043 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.775552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerDied","Data":"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28"} Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.775583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e174f9d2-5b44-468a-8269-7b7fbfca6fa0","Type":"ContainerDied","Data":"4ffff958fedff990ba58a7c40a24d4a872cf9dc7ce36d9494861bd667fff70cd"} Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.775601 5030 scope.go:117] "RemoveContainer" containerID="d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnwt4\" (UniqueName: \"kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779170 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 
23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data\") pod \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\" (UID: \"e174f9d2-5b44-468a-8269-7b7fbfca6fa0\") " Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.779455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.780361 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.784243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts" (OuterVolumeSpecName: "scripts") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.785692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.786783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4" (OuterVolumeSpecName: "kube-api-access-jnwt4") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "kube-api-access-jnwt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.819358 5030 scope.go:117] "RemoveContainer" containerID="f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.870773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.872168 5030 scope.go:117] "RemoveContainer" containerID="d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84" Jan 20 23:21:56 crc kubenswrapper[5030]: E0120 23:21:56.873297 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84\": container with ID starting with d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84 not found: ID does not exist" containerID="d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.873323 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84"} err="failed to get container status \"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84\": rpc error: code = NotFound desc = could not find container \"d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84\": container with ID starting with d9701cad4c6ceac072f56c95fa9fdb970a90d205a731f079f8ef4408544d5c84 not found: ID does not exist" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.873342 5030 scope.go:117] "RemoveContainer" containerID="f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28" Jan 20 23:21:56 crc kubenswrapper[5030]: E0120 23:21:56.873554 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28\": container with ID starting with f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28 not found: ID does not exist" containerID="f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.873569 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28"} err="failed to get container status \"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28\": rpc error: code = NotFound desc = could not find container \"f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28\": container with ID starting with f4b76bed2869038580868e9775d5e2b88278960b15c8168d308de8d0e406be28 not found: ID does not exist" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.881692 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.884223 5030 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.884244 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnwt4\" (UniqueName: \"kubernetes.io/projected/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-kube-api-access-jnwt4\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.884254 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.915259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data" (OuterVolumeSpecName: "config-data") pod "e174f9d2-5b44-468a-8269-7b7fbfca6fa0" (UID: "e174f9d2-5b44-468a-8269-7b7fbfca6fa0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:56 crc kubenswrapper[5030]: I0120 23:21:56.986339 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e174f9d2-5b44-468a-8269-7b7fbfca6fa0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:57 crc kubenswrapper[5030]: E0120 23:21:57.042312 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.132179 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" podStartSLOduration=2.132162049 podStartE2EDuration="2.132162049s" podCreationTimestamp="2026-01-20 23:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:56.795064066 +0000 UTC m=+2789.115324354" watchObservedRunningTime="2026-01-20 23:21:57.132162049 +0000 UTC m=+2789.452422337" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.153441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.171416 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.179673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:57 crc kubenswrapper[5030]: E0120 23:21:57.180161 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="probe" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.180186 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="probe" Jan 20 23:21:57 crc kubenswrapper[5030]: E0120 23:21:57.180198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="cinder-scheduler" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.180207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="cinder-scheduler" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.180408 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="cinder-scheduler" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.180439 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" containerName="probe" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.181542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.184931 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.184984 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hzv\" (UniqueName: \"kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306675 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.306764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") 
" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.407999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hzv\" (UniqueName: \"kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.414736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.421165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.421425 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.422569 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.424699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hzv\" (UniqueName: \"kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv\") pod \"cinder-scheduler-0\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.524676 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.935650 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.251:9311/healthcheck\": read tcp 10.217.0.2:56056->10.217.0.251:9311: read: connection reset by peer" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.935727 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.251:9311/healthcheck\": read tcp 10.217.0.2:56070->10.217.0.251:9311: read: connection reset by peer" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.974030 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e174f9d2-5b44-468a-8269-7b7fbfca6fa0" path="/var/lib/kubelet/pods/e174f9d2-5b44-468a-8269-7b7fbfca6fa0/volumes" Jan 20 23:21:57 crc kubenswrapper[5030]: I0120 23:21:57.987325 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:21:58 crc kubenswrapper[5030]: W0120 23:21:58.014771 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c882b06_04ce_4893_af53_f8dfb97dfcdc.slice/crio-7971bde154627c987da26516c0826cbdb1889c10b8234541523b58d8bfb5dc87 WatchSource:0}: Error finding container 7971bde154627c987da26516c0826cbdb1889c10b8234541523b58d8bfb5dc87: Status 404 returned error can't find the container with id 7971bde154627c987da26516c0826cbdb1889c10b8234541523b58d8bfb5dc87 Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.516642 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.625855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom\") pod \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.625928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs\") pod \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.625976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data\") pod \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.625994 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle\") pod \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.626086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z66zl\" (UniqueName: \"kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl\") pod \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\" (UID: \"2733db1e-741d-49c1-bd1e-8a3b91cb5069\") " Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.626568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs" (OuterVolumeSpecName: "logs") pod "2733db1e-741d-49c1-bd1e-8a3b91cb5069" (UID: "2733db1e-741d-49c1-bd1e-8a3b91cb5069"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.631340 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2733db1e-741d-49c1-bd1e-8a3b91cb5069" (UID: "2733db1e-741d-49c1-bd1e-8a3b91cb5069"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.641932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl" (OuterVolumeSpecName: "kube-api-access-z66zl") pod "2733db1e-741d-49c1-bd1e-8a3b91cb5069" (UID: "2733db1e-741d-49c1-bd1e-8a3b91cb5069"). InnerVolumeSpecName "kube-api-access-z66zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.658819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2733db1e-741d-49c1-bd1e-8a3b91cb5069" (UID: "2733db1e-741d-49c1-bd1e-8a3b91cb5069"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.682928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data" (OuterVolumeSpecName: "config-data") pod "2733db1e-741d-49c1-bd1e-8a3b91cb5069" (UID: "2733db1e-741d-49c1-bd1e-8a3b91cb5069"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.728248 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.728299 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2733db1e-741d-49c1-bd1e-8a3b91cb5069-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.728320 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.728338 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2733db1e-741d-49c1-bd1e-8a3b91cb5069-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.728356 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z66zl\" (UniqueName: \"kubernetes.io/projected/2733db1e-741d-49c1-bd1e-8a3b91cb5069-kube-api-access-z66zl\") on node \"crc\" DevicePath \"\"" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.804463 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerStarted","Data":"914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e"} Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.804904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerStarted","Data":"7971bde154627c987da26516c0826cbdb1889c10b8234541523b58d8bfb5dc87"} Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.807207 5030 generic.go:334] "Generic (PLEG): container finished" podID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerID="d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98" exitCode=0 Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.807245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerDied","Data":"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98"} Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.807282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" event={"ID":"2733db1e-741d-49c1-bd1e-8a3b91cb5069","Type":"ContainerDied","Data":"4926d827b942cd4f7318deba7ced99161643201328b8d702b7e568b035e8ea46"} Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.807304 5030 scope.go:117] "RemoveContainer" 
containerID="d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.807248 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.844278 5030 scope.go:117] "RemoveContainer" containerID="d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.848597 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.862822 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-595867cbd5-4xbnv"] Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.875441 5030 scope.go:117] "RemoveContainer" containerID="d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98" Jan 20 23:21:58 crc kubenswrapper[5030]: E0120 23:21:58.875922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98\": container with ID starting with d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98 not found: ID does not exist" containerID="d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.875969 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98"} err="failed to get container status \"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98\": rpc error: code = NotFound desc = could not find container \"d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98\": container with ID starting with d476fb02a69a8fc92e3c7cca72b9a5effa93125b84bdbaea04da1b75d99fdc98 not found: ID does not exist" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.876027 5030 scope.go:117] "RemoveContainer" containerID="d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf" Jan 20 23:21:58 crc kubenswrapper[5030]: E0120 23:21:58.876968 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf\": container with ID starting with d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf not found: ID does not exist" containerID="d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf" Jan 20 23:21:58 crc kubenswrapper[5030]: I0120 23:21:58.877029 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf"} err="failed to get container status \"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf\": rpc error: code = NotFound desc = could not find container \"d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf\": container with ID starting with d3123491bd9a3c29ded560593841312b16cfff97d4b34fce55c2e8a76b57badf not found: ID does not exist" Jan 20 23:21:59 crc kubenswrapper[5030]: I0120 23:21:59.819081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerStarted","Data":"af8c2994eb7f6ea1db7330e62227663ddd51844eb30651d6420cdeedd6189e70"} Jan 20 23:21:59 crc kubenswrapper[5030]: I0120 23:21:59.846119 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.846103712 podStartE2EDuration="2.846103712s" podCreationTimestamp="2026-01-20 23:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:21:59.844896383 +0000 UTC m=+2792.165156681" watchObservedRunningTime="2026-01-20 23:21:59.846103712 +0000 UTC m=+2792.166364010" Jan 20 23:21:59 crc kubenswrapper[5030]: I0120 23:21:59.973817 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" path="/var/lib/kubelet/pods/2733db1e-741d-49c1-bd1e-8a3b91cb5069/volumes" Jan 20 23:22:01 crc kubenswrapper[5030]: I0120 23:22:01.943559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:22:02 crc kubenswrapper[5030]: I0120 23:22:02.525347 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:22:07 crc kubenswrapper[5030]: E0120 23:22:07.283053 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache]" Jan 20 23:22:07 crc kubenswrapper[5030]: I0120 23:22:07.737406 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:22:07 crc kubenswrapper[5030]: I0120 23:22:07.794291 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:22:07 crc kubenswrapper[5030]: I0120 23:22:07.796213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:22:07 crc kubenswrapper[5030]: I0120 23:22:07.816797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.158016 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.158397 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.158459 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.159360 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.159436 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114" gracePeriod=600 Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.959261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114"} Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.959301 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114" exitCode=0 Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.959602 5030 scope.go:117] "RemoveContainer" containerID="94e385f64fd17ded732a84e7422ca644f2dcc56f5446d2243e1cffc2fbd979b3" Jan 20 23:22:10 crc kubenswrapper[5030]: I0120 23:22:10.959684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f"} Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.553365 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.718472 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: E0120 23:22:16.718883 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api-log" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.718901 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api-log" Jan 20 23:22:16 crc kubenswrapper[5030]: E0120 23:22:16.718926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.718933 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.719108 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api-log" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.719127 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2733db1e-741d-49c1-bd1e-8a3b91cb5069" containerName="barbican-api" Jan 20 
23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.719726 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.721227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.721277 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.721816 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-b2h9n" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.728638 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.749331 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: E0120 23:22:16.751045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-ghjhh openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-ghjhh openstack-config openstack-config-secret]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="43f614f0-da36-4500-9556-12c275847cd0" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.756926 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.772095 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.773218 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.785892 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.882119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.882397 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.882514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.882639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9wrp\" (UniqueName: \"kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.984115 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.984452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.984604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.984754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wrp\" (UniqueName: \"kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.985000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:16 crc kubenswrapper[5030]: I0120 23:22:16.990807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.005311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.016062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wrp\" (UniqueName: \"kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp\") pod \"openstackclient\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.028037 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.034154 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="43f614f0-da36-4500-9556-12c275847cd0" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.089055 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.090237 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:17 crc kubenswrapper[5030]: E0120 23:22:17.540016 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache]" Jan 20 23:22:17 crc kubenswrapper[5030]: W0120 23:22:17.567673 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa5677f_cdb3_4391_add0_360b9b188cd3.slice/crio-48bf90a8244c4fc3aa633f1f63f3090f040cacf7f87506d18f4193461ee2ae5a WatchSource:0}: Error finding container 48bf90a8244c4fc3aa633f1f63f3090f040cacf7f87506d18f4193461ee2ae5a: Status 404 returned error can't find the container with id 48bf90a8244c4fc3aa633f1f63f3090f040cacf7f87506d18f4193461ee2ae5a Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.581026 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:22:17 crc kubenswrapper[5030]: I0120 23:22:17.973808 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f614f0-da36-4500-9556-12c275847cd0" path="/var/lib/kubelet/pods/43f614f0-da36-4500-9556-12c275847cd0/volumes" Jan 20 23:22:18 crc kubenswrapper[5030]: I0120 23:22:18.052868 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:22:18 crc kubenswrapper[5030]: I0120 23:22:18.052861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"1fa5677f-cdb3-4391-add0-360b9b188cd3","Type":"ContainerStarted","Data":"f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53"} Jan 20 23:22:18 crc kubenswrapper[5030]: I0120 23:22:18.053099 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"1fa5677f-cdb3-4391-add0-360b9b188cd3","Type":"ContainerStarted","Data":"48bf90a8244c4fc3aa633f1f63f3090f040cacf7f87506d18f4193461ee2ae5a"} Jan 20 23:22:18 crc kubenswrapper[5030]: I0120 23:22:18.069549 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="43f614f0-da36-4500-9556-12c275847cd0" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" Jan 20 23:22:18 crc kubenswrapper[5030]: I0120 23:22:18.071963 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.071947382 podStartE2EDuration="2.071947382s" podCreationTimestamp="2026-01-20 23:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:18.066522522 +0000 UTC m=+2810.386782800" watchObservedRunningTime="2026-01-20 23:22:18.071947382 +0000 UTC m=+2810.392207670" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.080311 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.082695 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.085024 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.085119 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.085990 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.097736 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.241730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.241842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.241923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.241950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.242001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.242033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.242151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5wt\" (UniqueName: 
\"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.242291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5wt\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.344805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.345182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.345399 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.345612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.346099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.350092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.350743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.354548 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.362460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.363385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5wt\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.368642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle\") pod \"swift-proxy-55d95d9dc5-4zf8z\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.401832 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:20 crc kubenswrapper[5030]: I0120 23:22:20.846204 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.080277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerStarted","Data":"85dab81c3ed675ac6bfcead4abc4422ebb5e103eb112882198a33167ac28209d"} Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.611467 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-lspsv"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.612970 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.624121 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-lspsv"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.704102 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-btqm2"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.705175 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.716186 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-btqm2"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.769550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.769675 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndn6h\" (UniqueName: \"kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.821877 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.823203 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.824838 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.843696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.855458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.855779 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-central-agent" containerID="cri-o://227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3" gracePeriod=30 Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.855932 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="proxy-httpd" containerID="cri-o://06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768" gracePeriod=30 Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.855985 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="sg-core" containerID="cri-o://71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155" gracePeriod=30 Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.856032 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-notification-agent" containerID="cri-o://460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1" gracePeriod=30 Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.871301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.871446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzlz\" (UniqueName: \"kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.871538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndn6h\" (UniqueName: \"kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.871564 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.872210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.891529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndn6h\" (UniqueName: \"kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h\") pod \"nova-api-db-create-lspsv\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.911734 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-lvkk8"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.913142 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.930424 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.932607 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-lvkk8"] Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.973037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.973136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcfv\" (UniqueName: \"kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.973185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzlz\" (UniqueName: \"kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.973234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.974028 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:21 crc kubenswrapper[5030]: I0120 23:22:21.991352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzlz\" (UniqueName: \"kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz\") pod \"nova-cell0-db-create-btqm2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.019162 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.029124 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9"] Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.030409 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.034062 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.038401 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9"] Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.075792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.075880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w955w\" (UniqueName: \"kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.075901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.076042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcfv\" (UniqueName: \"kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.079285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.105755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcfv\" (UniqueName: \"kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv\") pod \"nova-api-ebab-account-create-update-cktxf\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.118000 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerID="06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768" exitCode=0 Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.118029 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerID="71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155" exitCode=2 Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.118069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerDied","Data":"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768"} Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.118094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerDied","Data":"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155"} Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.122251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerStarted","Data":"cbbd55ec11f922f70da8495be9037539e1d761154559e6b2075472c7deb21023"} Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.122275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerStarted","Data":"2e3c443886e08309e23e74c3eb26e87aeed4b515043573a5b997072fad41ac81"} Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.122608 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.122919 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.152266 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.152733 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" podStartSLOduration=2.152703982 podStartE2EDuration="2.152703982s" podCreationTimestamp="2026-01-20 23:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:22.141712138 +0000 UTC m=+2814.461972426" watchObservedRunningTime="2026-01-20 23:22:22.152703982 +0000 UTC m=+2814.472964270" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.177440 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.177482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w955w\" (UniqueName: \"kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.177506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.177655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qt6n\" (UniqueName: \"kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.178424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.195970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w955w\" (UniqueName: \"kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w\") pod \"nova-cell1-db-create-lvkk8\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.215382 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b"] Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.223118 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.226114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b"] Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.228061 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.241254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.281106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qt6n\" (UniqueName: \"kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.281370 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.282493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.305385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qt6n\" (UniqueName: \"kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n\") pod \"nova-cell0-5d3e-account-create-update-8dft9\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.385597 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.385673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvz7f\" (UniqueName: \"kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.389685 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.495039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.495327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvz7f\" (UniqueName: \"kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.495874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.524347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvz7f\" (UniqueName: \"kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f\") pod \"nova-cell1-5ca2-account-create-update-5xh6b\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.546052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.586511 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-lspsv"] Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.598051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-btqm2"] Jan 20 23:22:22 crc kubenswrapper[5030]: W0120 23:22:22.598389 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcba4b91_f476_4404_b317_6b0cf9e72659.slice/crio-5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd WatchSource:0}: Error finding container 5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd: Status 404 returned error can't find the container with id 5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.825779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf"] Jan 20 23:22:22 crc kubenswrapper[5030]: W0120 23:22:22.832839 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1a598a_5986_49a8_ba34_f423759d1d62.slice/crio-9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125 WatchSource:0}: Error finding container 9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125: Status 404 returned error can't find the container with id 9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125 Jan 20 23:22:22 crc kubenswrapper[5030]: I0120 23:22:22.955758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-lvkk8"] Jan 20 23:22:22 crc kubenswrapper[5030]: W0120 23:22:22.957039 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce7d325_d563_4683_80fc_8626524f2631.slice/crio-db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11 WatchSource:0}: Error finding container db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11: Status 404 returned error can't find the container with id db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11 Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.070269 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9"] Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.077954 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b"] Jan 20 23:22:23 crc kubenswrapper[5030]: W0120 23:22:23.079903 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod289609c1_c2d1_448d_b4ff_ed7127995341.slice/crio-6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a WatchSource:0}: Error finding container 6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a: Status 404 returned error can't find the container with id 6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.134771 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5d3183b-1f93-402d-b6ae-04a7140350e1" 
containerID="227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3" exitCode=0 Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.134832 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerDied","Data":"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.136023 5030 generic.go:334] "Generic (PLEG): container finished" podID="11c5d200-30f8-4559-ba0d-f72cc9180ea2" containerID="f1698982fca2a79b6548ac198d70a617d97cb084ded4cd799e4664d8686aea9d" exitCode=0 Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.136066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" event={"ID":"11c5d200-30f8-4559-ba0d-f72cc9180ea2","Type":"ContainerDied","Data":"f1698982fca2a79b6548ac198d70a617d97cb084ded4cd799e4664d8686aea9d"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.136081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" event={"ID":"11c5d200-30f8-4559-ba0d-f72cc9180ea2","Type":"ContainerStarted","Data":"b184c92aac258089df377731ae3ae254b39b5da2b85939336bee3313c9ad7b10"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.138583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" event={"ID":"289609c1-c2d1-448d-b4ff-ed7127995341","Type":"ContainerStarted","Data":"6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.150024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" event={"ID":"43fbd9c6-34c8-4fe9-9514-bc2932bbb632","Type":"ContainerStarted","Data":"afea8b10929802f137e277c1f7632254d5dccf6ce67d9ebecae946177e7206c9"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.154328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" event={"ID":"7f1a598a-5986-49a8-ba34-f423759d1d62","Type":"ContainerStarted","Data":"201653aad7d55159137ac56d6010f5c74db5484c6c17c8f3dc1ea1fcfaa1b109"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.154364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" event={"ID":"7f1a598a-5986-49a8-ba34-f423759d1d62","Type":"ContainerStarted","Data":"9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.155292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.156165 5030 generic.go:334] "Generic (PLEG): container finished" podID="dcba4b91-f476-4404-b317-6b0cf9e72659" containerID="f2f845517a574b3a451d614fd6d8f871fedece18d4e8adc83b716896bf2af1fc" exitCode=0 Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.156217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" event={"ID":"dcba4b91-f476-4404-b317-6b0cf9e72659","Type":"ContainerDied","Data":"f2f845517a574b3a451d614fd6d8f871fedece18d4e8adc83b716896bf2af1fc"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.156235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-db-create-lspsv" event={"ID":"dcba4b91-f476-4404-b317-6b0cf9e72659","Type":"ContainerStarted","Data":"5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.158161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" event={"ID":"0ce7d325-d563-4683-80fc-8626524f2631","Type":"ContainerStarted","Data":"db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11"} Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.172662 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" podStartSLOduration=2.172647731 podStartE2EDuration="2.172647731s" podCreationTimestamp="2026-01-20 23:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:23.169088726 +0000 UTC m=+2815.489349014" watchObservedRunningTime="2026-01-20 23:22:23.172647731 +0000 UTC m=+2815.492908019" Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.191114 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" podStartSLOduration=2.191096634 podStartE2EDuration="2.191096634s" podCreationTimestamp="2026-01-20 23:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:23.182372135 +0000 UTC m=+2815.502632433" watchObservedRunningTime="2026-01-20 23:22:23.191096634 +0000 UTC m=+2815.511356942" Jan 20 23:22:23 crc kubenswrapper[5030]: I0120 23:22:23.912052 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032534 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k9wk\" (UniqueName: \"kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032637 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.032746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml\") pod \"a5d3183b-1f93-402d-b6ae-04a7140350e1\" (UID: \"a5d3183b-1f93-402d-b6ae-04a7140350e1\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.033127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.033216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.033260 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.042799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts" (OuterVolumeSpecName: "scripts") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.054315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk" (OuterVolumeSpecName: "kube-api-access-8k9wk") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "kube-api-access-8k9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.064811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.118926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.127547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data" (OuterVolumeSpecName: "config-data") pod "a5d3183b-1f93-402d-b6ae-04a7140350e1" (UID: "a5d3183b-1f93-402d-b6ae-04a7140350e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134654 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134682 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134693 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134704 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k9wk\" (UniqueName: \"kubernetes.io/projected/a5d3183b-1f93-402d-b6ae-04a7140350e1-kube-api-access-8k9wk\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134715 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5d3183b-1f93-402d-b6ae-04a7140350e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.134724 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a5d3183b-1f93-402d-b6ae-04a7140350e1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.167087 5030 generic.go:334] "Generic (PLEG): container finished" podID="7f1a598a-5986-49a8-ba34-f423759d1d62" containerID="201653aad7d55159137ac56d6010f5c74db5484c6c17c8f3dc1ea1fcfaa1b109" exitCode=0 Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.167138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" event={"ID":"7f1a598a-5986-49a8-ba34-f423759d1d62","Type":"ContainerDied","Data":"201653aad7d55159137ac56d6010f5c74db5484c6c17c8f3dc1ea1fcfaa1b109"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.169712 5030 generic.go:334] "Generic (PLEG): container finished" podID="0ce7d325-d563-4683-80fc-8626524f2631" containerID="98640ff698c625ccf0185788c13cd5b2bbbf3afee5b419bbd329aebbce1e377c" exitCode=0 Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.169844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" event={"ID":"0ce7d325-d563-4683-80fc-8626524f2631","Type":"ContainerDied","Data":"98640ff698c625ccf0185788c13cd5b2bbbf3afee5b419bbd329aebbce1e377c"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.172714 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerID="460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1" exitCode=0 Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.172753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerDied","Data":"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.173016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"a5d3183b-1f93-402d-b6ae-04a7140350e1","Type":"ContainerDied","Data":"9fc0eaf21d348e9e9a56aed10565687e90217773a031ea2cefdb91ab9793f90b"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.172796 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.173051 5030 scope.go:117] "RemoveContainer" containerID="06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.187313 5030 generic.go:334] "Generic (PLEG): container finished" podID="289609c1-c2d1-448d-b4ff-ed7127995341" containerID="3a8097c840afb5350c719593f5cd228865ad457dac867bc99f215f26394d9899" exitCode=0 Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.187446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" event={"ID":"289609c1-c2d1-448d-b4ff-ed7127995341","Type":"ContainerDied","Data":"3a8097c840afb5350c719593f5cd228865ad457dac867bc99f215f26394d9899"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.189673 5030 generic.go:334] "Generic (PLEG): container finished" podID="43fbd9c6-34c8-4fe9-9514-bc2932bbb632" containerID="5a4e6c117dd4789ab400c322af2053b5e1adf4bfb2237f771b9a8335839061f4" exitCode=0 Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.189732 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" event={"ID":"43fbd9c6-34c8-4fe9-9514-bc2932bbb632","Type":"ContainerDied","Data":"5a4e6c117dd4789ab400c322af2053b5e1adf4bfb2237f771b9a8335839061f4"} Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.205283 5030 scope.go:117] "RemoveContainer" containerID="71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.248797 5030 scope.go:117] "RemoveContainer" containerID="460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.250665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.262236 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.284882 5030 scope.go:117] "RemoveContainer" containerID="227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285035 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.285398 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-central-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285409 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-central-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.285434 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="sg-core" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285440 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="sg-core" Jan 20 23:22:24 crc kubenswrapper[5030]: 
E0120 23:22:24.285450 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="proxy-httpd" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285457 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="proxy-httpd" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.285469 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-notification-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285475 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-notification-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285652 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-central-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285663 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="proxy-httpd" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285680 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="ceilometer-notification-agent" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.285689 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" containerName="sg-core" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.287961 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.289968 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.290558 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.295179 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.322527 5030 scope.go:117] "RemoveContainer" containerID="06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.323402 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768\": container with ID starting with 06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768 not found: ID does not exist" containerID="06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.323432 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768"} err="failed to get container status \"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768\": rpc error: code = NotFound desc = could not find container \"06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768\": container with ID starting with 06cdb5c56809512e8ffd248f82b0ba3a5e54d10a1de1554591dbe53bd608d768 not found: ID does not exist" Jan 20 23:22:24 crc 
kubenswrapper[5030]: I0120 23:22:24.323451 5030 scope.go:117] "RemoveContainer" containerID="71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.324244 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155\": container with ID starting with 71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155 not found: ID does not exist" containerID="71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.324264 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155"} err="failed to get container status \"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155\": rpc error: code = NotFound desc = could not find container \"71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155\": container with ID starting with 71bc36dcfa8577f5ae6ac65dbde97d07ab322d996ae5b70285870910a674c155 not found: ID does not exist" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.324275 5030 scope.go:117] "RemoveContainer" containerID="460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.325774 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1\": container with ID starting with 460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1 not found: ID does not exist" containerID="460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.325816 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1"} err="failed to get container status \"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1\": rpc error: code = NotFound desc = could not find container \"460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1\": container with ID starting with 460c9cd55ccea5d7179492b6d518235c196e787b6d196ec583152b74f9e066a1 not found: ID does not exist" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.325844 5030 scope.go:117] "RemoveContainer" containerID="227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3" Jan 20 23:22:24 crc kubenswrapper[5030]: E0120 23:22:24.326171 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3\": container with ID starting with 227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3 not found: ID does not exist" containerID="227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.326192 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3"} err="failed to get container status \"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3\": rpc error: code = NotFound desc = could not find container 
\"227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3\": container with ID starting with 227e0b093c901babf978c1dc04c9783f7c8ea08ded72d6934b4f9c5424d4b2c3 not found: ID does not exist" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438855 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vgd\" (UniqueName: \"kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.438967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541024 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data\") pod \"ceilometer-0\" (UID: 
\"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vgd\" (UniqueName: \"kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.541876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.542110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.544223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.544531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.546104 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.548562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.557176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vgd\" (UniqueName: \"kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd\") pod \"ceilometer-0\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.618597 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.630393 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.634751 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.743273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndn6h\" (UniqueName: \"kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h\") pod \"dcba4b91-f476-4404-b317-6b0cf9e72659\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.743668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts\") pod \"dcba4b91-f476-4404-b317-6b0cf9e72659\" (UID: \"dcba4b91-f476-4404-b317-6b0cf9e72659\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.743695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvzlz\" (UniqueName: \"kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz\") pod \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.743777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts\") pod \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\" (UID: \"11c5d200-30f8-4559-ba0d-f72cc9180ea2\") " Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.744120 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcba4b91-f476-4404-b317-6b0cf9e72659" (UID: "dcba4b91-f476-4404-b317-6b0cf9e72659"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.744307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11c5d200-30f8-4559-ba0d-f72cc9180ea2" (UID: "11c5d200-30f8-4559-ba0d-f72cc9180ea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.746822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h" (OuterVolumeSpecName: "kube-api-access-ndn6h") pod "dcba4b91-f476-4404-b317-6b0cf9e72659" (UID: "dcba4b91-f476-4404-b317-6b0cf9e72659"). InnerVolumeSpecName "kube-api-access-ndn6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.747150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz" (OuterVolumeSpecName: "kube-api-access-nvzlz") pod "11c5d200-30f8-4559-ba0d-f72cc9180ea2" (UID: "11c5d200-30f8-4559-ba0d-f72cc9180ea2"). InnerVolumeSpecName "kube-api-access-nvzlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.846999 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11c5d200-30f8-4559-ba0d-f72cc9180ea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.847035 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndn6h\" (UniqueName: \"kubernetes.io/projected/dcba4b91-f476-4404-b317-6b0cf9e72659-kube-api-access-ndn6h\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.847046 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcba4b91-f476-4404-b317-6b0cf9e72659-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:24 crc kubenswrapper[5030]: I0120 23:22:24.847055 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvzlz\" (UniqueName: \"kubernetes.io/projected/11c5d200-30f8-4559-ba0d-f72cc9180ea2-kube-api-access-nvzlz\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: W0120 23:22:25.070431 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf86d0309_1b60_4774_92ea_cb2ff2d57da1.slice/crio-9f85bdb45e18f56e7c39e7ff11df2175fb1a5d964d8585983e33398dddaaefc5 WatchSource:0}: Error finding container 9f85bdb45e18f56e7c39e7ff11df2175fb1a5d964d8585983e33398dddaaefc5: Status 404 returned error can't find the container with id 9f85bdb45e18f56e7c39e7ff11df2175fb1a5d964d8585983e33398dddaaefc5 Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.071646 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.203322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" 
event={"ID":"11c5d200-30f8-4559-ba0d-f72cc9180ea2","Type":"ContainerDied","Data":"b184c92aac258089df377731ae3ae254b39b5da2b85939336bee3313c9ad7b10"} Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.203646 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b184c92aac258089df377731ae3ae254b39b5da2b85939336bee3313c9ad7b10" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.203745 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-btqm2" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.207880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerStarted","Data":"9f85bdb45e18f56e7c39e7ff11df2175fb1a5d964d8585983e33398dddaaefc5"} Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.209165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" event={"ID":"dcba4b91-f476-4404-b317-6b0cf9e72659","Type":"ContainerDied","Data":"5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd"} Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.209210 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dafa64333925f2c82f6d23631e4d9e4316c8840ea7f581ed3f96769f23c07cd" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.209382 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-lspsv" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.597562 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.660393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.729216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.729648 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-httpd" containerID="cri-o://c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694" gracePeriod=30 Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.729493 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-api" containerID="cri-o://44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd" gracePeriod=30 Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.766282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts\") pod \"0ce7d325-d563-4683-80fc-8626524f2631\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.766403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w955w\" (UniqueName: \"kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w\") pod 
\"0ce7d325-d563-4683-80fc-8626524f2631\" (UID: \"0ce7d325-d563-4683-80fc-8626524f2631\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.767554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ce7d325-d563-4683-80fc-8626524f2631" (UID: "0ce7d325-d563-4683-80fc-8626524f2631"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.788641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w" (OuterVolumeSpecName: "kube-api-access-w955w") pod "0ce7d325-d563-4683-80fc-8626524f2631" (UID: "0ce7d325-d563-4683-80fc-8626524f2631"). InnerVolumeSpecName "kube-api-access-w955w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.820601 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.854771 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.868308 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w955w\" (UniqueName: \"kubernetes.io/projected/0ce7d325-d563-4683-80fc-8626524f2631-kube-api-access-w955w\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.868337 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ce7d325-d563-4683-80fc-8626524f2631-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.873433 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.968732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts\") pod \"7f1a598a-5986-49a8-ba34-f423759d1d62\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.968883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qt6n\" (UniqueName: \"kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n\") pod \"289609c1-c2d1-448d-b4ff-ed7127995341\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969007 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts\") pod \"289609c1-c2d1-448d-b4ff-ed7127995341\" (UID: \"289609c1-c2d1-448d-b4ff-ed7127995341\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbcfv\" (UniqueName: \"kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv\") pod \"7f1a598a-5986-49a8-ba34-f423759d1d62\" (UID: \"7f1a598a-5986-49a8-ba34-f423759d1d62\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts\") pod \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvz7f\" (UniqueName: \"kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f\") pod \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\" (UID: \"43fbd9c6-34c8-4fe9-9514-bc2932bbb632\") " Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "289609c1-c2d1-448d-b4ff-ed7127995341" (UID: "289609c1-c2d1-448d-b4ff-ed7127995341"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43fbd9c6-34c8-4fe9-9514-bc2932bbb632" (UID: "43fbd9c6-34c8-4fe9-9514-bc2932bbb632"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f1a598a-5986-49a8-ba34-f423759d1d62" (UID: "7f1a598a-5986-49a8-ba34-f423759d1d62"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969884 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969899 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f1a598a-5986-49a8-ba34-f423759d1d62-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.969909 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289609c1-c2d1-448d-b4ff-ed7127995341-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.975030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv" (OuterVolumeSpecName: "kube-api-access-bbcfv") pod "7f1a598a-5986-49a8-ba34-f423759d1d62" (UID: "7f1a598a-5986-49a8-ba34-f423759d1d62"). InnerVolumeSpecName "kube-api-access-bbcfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.975087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n" (OuterVolumeSpecName: "kube-api-access-7qt6n") pod "289609c1-c2d1-448d-b4ff-ed7127995341" (UID: "289609c1-c2d1-448d-b4ff-ed7127995341"). InnerVolumeSpecName "kube-api-access-7qt6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.975449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f" (OuterVolumeSpecName: "kube-api-access-zvz7f") pod "43fbd9c6-34c8-4fe9-9514-bc2932bbb632" (UID: "43fbd9c6-34c8-4fe9-9514-bc2932bbb632"). InnerVolumeSpecName "kube-api-access-zvz7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:25 crc kubenswrapper[5030]: I0120 23:22:25.979108 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d3183b-1f93-402d-b6ae-04a7140350e1" path="/var/lib/kubelet/pods/a5d3183b-1f93-402d-b6ae-04a7140350e1/volumes" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.071946 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qt6n\" (UniqueName: \"kubernetes.io/projected/289609c1-c2d1-448d-b4ff-ed7127995341-kube-api-access-7qt6n\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.071979 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbcfv\" (UniqueName: \"kubernetes.io/projected/7f1a598a-5986-49a8-ba34-f423759d1d62-kube-api-access-bbcfv\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.071988 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvz7f\" (UniqueName: \"kubernetes.io/projected/43fbd9c6-34c8-4fe9-9514-bc2932bbb632-kube-api-access-zvz7f\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.220014 5030 generic.go:334] "Generic (PLEG): container finished" podID="679648c6-8923-42fc-9226-c78458125e4c" containerID="c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694" exitCode=0 Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.220102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerDied","Data":"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.222322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerStarted","Data":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.224019 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.224016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9" event={"ID":"289609c1-c2d1-448d-b4ff-ed7127995341","Type":"ContainerDied","Data":"6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.224138 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f7bd414bb98539e77fdf0c9c6637ad7cd08f64090da8350c0688ca16118266a" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.226129 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.226148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b" event={"ID":"43fbd9c6-34c8-4fe9-9514-bc2932bbb632","Type":"ContainerDied","Data":"afea8b10929802f137e277c1f7632254d5dccf6ce67d9ebecae946177e7206c9"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.226281 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afea8b10929802f137e277c1f7632254d5dccf6ce67d9ebecae946177e7206c9" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.228004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" event={"ID":"7f1a598a-5986-49a8-ba34-f423759d1d62","Type":"ContainerDied","Data":"9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.228027 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9750f11310c1780f5604e7af04af97dffcc065ebfcf9dd20c5d5553f020e2125" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.228074 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.229739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" event={"ID":"0ce7d325-d563-4683-80fc-8626524f2631","Type":"ContainerDied","Data":"db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11"} Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.229764 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db67b46ad57425ea2516b4da5955c971d03a3aa3e5bc1dc5bef4b7bf08899c11" Jan 20 23:22:26 crc kubenswrapper[5030]: I0120 23:22:26.229943 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-lvkk8" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.411062 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice/crio-e27e4401e62fa285d5d1e9936dacf60031667ebe0fe7152d85d42ed3a9166693\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5a0e0f_eb0d_4757_b571_6571a43793c4.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.484961 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9"] Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485315 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fbd9c6-34c8-4fe9-9514-bc2932bbb632" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485331 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fbd9c6-34c8-4fe9-9514-bc2932bbb632" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1a598a-5986-49a8-ba34-f423759d1d62" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485361 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1a598a-5986-49a8-ba34-f423759d1d62" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485372 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcba4b91-f476-4404-b317-6b0cf9e72659" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485378 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcba4b91-f476-4404-b317-6b0cf9e72659" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485400 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289609c1-c2d1-448d-b4ff-ed7127995341" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485405 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="289609c1-c2d1-448d-b4ff-ed7127995341" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485418 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c5d200-30f8-4559-ba0d-f72cc9180ea2" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485426 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c5d200-30f8-4559-ba0d-f72cc9180ea2" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: E0120 23:22:28.485440 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce7d325-d563-4683-80fc-8626524f2631" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485445 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce7d325-d563-4683-80fc-8626524f2631" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485593 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="289609c1-c2d1-448d-b4ff-ed7127995341" 
containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485603 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fbd9c6-34c8-4fe9-9514-bc2932bbb632" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485630 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcba4b91-f476-4404-b317-6b0cf9e72659" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485641 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce7d325-d563-4683-80fc-8626524f2631" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485656 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1a598a-5986-49a8-ba34-f423759d1d62" containerName="mariadb-account-create-update" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.485665 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c5d200-30f8-4559-ba0d-f72cc9180ea2" containerName="mariadb-database-create" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.486211 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.489269 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.489528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-qp6jt" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.489441 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.499900 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9"] Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.554701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shr4q\" (UniqueName: \"kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.554785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.554881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.554901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.656984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.657140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.657164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.657209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shr4q\" (UniqueName: \"kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.662360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.664121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.664440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 23:22:28.674600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shr4q\" (UniqueName: \"kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q\") pod \"nova-cell0-conductor-db-sync-c97w9\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:28 crc kubenswrapper[5030]: I0120 
23:22:28.805133 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:29 crc kubenswrapper[5030]: I0120 23:22:29.162499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerStarted","Data":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} Jan 20 23:22:29 crc kubenswrapper[5030]: I0120 23:22:29.162766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerStarted","Data":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} Jan 20 23:22:29 crc kubenswrapper[5030]: I0120 23:22:29.245264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9"] Jan 20 23:22:30 crc kubenswrapper[5030]: I0120 23:22:30.172554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" event={"ID":"5b62b69c-8413-4732-9816-9d2f635d47b5","Type":"ContainerStarted","Data":"50246376aa5e4e5507db17c42856a6a995ac78ab49646b34f1fdd90c7caa4584"} Jan 20 23:22:30 crc kubenswrapper[5030]: I0120 23:22:30.173185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" event={"ID":"5b62b69c-8413-4732-9816-9d2f635d47b5","Type":"ContainerStarted","Data":"06c473ea0cce08a788c24e1fabddf717b234fd8b18ec1b246e7c48684c5f0006"} Jan 20 23:22:30 crc kubenswrapper[5030]: I0120 23:22:30.189637 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" podStartSLOduration=2.189603943 podStartE2EDuration="2.189603943s" podCreationTimestamp="2026-01-20 23:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:30.188532788 +0000 UTC m=+2822.508793086" watchObservedRunningTime="2026-01-20 23:22:30.189603943 +0000 UTC m=+2822.509864231" Jan 20 23:22:30 crc kubenswrapper[5030]: I0120 23:22:30.406133 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:30 crc kubenswrapper[5030]: I0120 23:22:30.406672 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:22:31 crc kubenswrapper[5030]: I0120 23:22:31.182935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerStarted","Data":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} Jan 20 23:22:31 crc kubenswrapper[5030]: I0120 23:22:31.184171 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:31 crc kubenswrapper[5030]: I0120 23:22:31.217335 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.132469419 podStartE2EDuration="7.217314038s" podCreationTimestamp="2026-01-20 23:22:24 +0000 UTC" firstStartedPulling="2026-01-20 23:22:25.073074025 +0000 UTC m=+2817.393334343" lastFinishedPulling="2026-01-20 23:22:30.157918674 +0000 UTC m=+2822.478178962" 
observedRunningTime="2026-01-20 23:22:31.203762164 +0000 UTC m=+2823.524022462" watchObservedRunningTime="2026-01-20 23:22:31.217314038 +0000 UTC m=+2823.537574346" Jan 20 23:22:31 crc kubenswrapper[5030]: I0120 23:22:31.328269 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:33 crc kubenswrapper[5030]: I0120 23:22:33.201692 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="sg-core" containerID="cri-o://51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" gracePeriod=30 Jan 20 23:22:33 crc kubenswrapper[5030]: I0120 23:22:33.201721 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-notification-agent" containerID="cri-o://f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" gracePeriod=30 Jan 20 23:22:33 crc kubenswrapper[5030]: I0120 23:22:33.202078 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-central-agent" containerID="cri-o://161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" gracePeriod=30 Jan 20 23:22:33 crc kubenswrapper[5030]: I0120 23:22:33.201757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="proxy-httpd" containerID="cri-o://42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" gracePeriod=30 Jan 20 23:22:33 crc kubenswrapper[5030]: I0120 23:22:33.952010 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vgd\" (UniqueName: \"kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.055831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml\") pod \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\" (UID: \"f86d0309-1b60-4774-92ea-cb2ff2d57da1\") " Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.056030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.056268 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.056720 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.056746 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f86d0309-1b60-4774-92ea-cb2ff2d57da1-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.061261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts" (OuterVolumeSpecName: "scripts") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.061335 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd" (OuterVolumeSpecName: "kube-api-access-44vgd") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "kube-api-access-44vgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.088153 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.136374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.158634 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vgd\" (UniqueName: \"kubernetes.io/projected/f86d0309-1b60-4774-92ea-cb2ff2d57da1-kube-api-access-44vgd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.158671 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.158683 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.158692 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.164525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data" (OuterVolumeSpecName: "config-data") pod "f86d0309-1b60-4774-92ea-cb2ff2d57da1" (UID: "f86d0309-1b60-4774-92ea-cb2ff2d57da1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212698 5030 generic.go:334] "Generic (PLEG): container finished" podID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" exitCode=0 Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212741 5030 generic.go:334] "Generic (PLEG): container finished" podID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" exitCode=2 Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212753 5030 generic.go:334] "Generic (PLEG): container finished" podID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" exitCode=0 Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212764 5030 generic.go:334] "Generic (PLEG): container finished" podID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" exitCode=0 Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212762 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerDied","Data":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerDied","Data":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerDied","Data":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerDied","Data":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f86d0309-1b60-4774-92ea-cb2ff2d57da1","Type":"ContainerDied","Data":"9f85bdb45e18f56e7c39e7ff11df2175fb1a5d964d8585983e33398dddaaefc5"} Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.212883 5030 scope.go:117] "RemoveContainer" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.241330 5030 scope.go:117] "RemoveContainer" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.252135 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.260900 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f86d0309-1b60-4774-92ea-cb2ff2d57da1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.275886 5030 scope.go:117] "RemoveContainer" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.284758 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.300542 5030 scope.go:117] "RemoveContainer" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.319299 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.320454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="sg-core" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.320492 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="sg-core" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.320530 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-central-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.320538 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-central-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.320550 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="proxy-httpd" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.320558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="proxy-httpd" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.320757 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-notification-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.320768 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-notification-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.321219 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="proxy-httpd" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.321260 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-notification-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.321275 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="sg-core" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.321293 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" containerName="ceilometer-central-agent" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.325285 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.327053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.327742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.338425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.384706 5030 scope.go:117] "RemoveContainer" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.385139 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": container with ID starting with 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 not found: ID does not exist" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.385175 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} err="failed to get container status \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": rpc error: code = NotFound desc = could not find container \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": container with ID starting with 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.385196 5030 scope.go:117] "RemoveContainer" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.385568 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": container with ID starting with 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 not found: ID does not exist" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.385594 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} err="failed to get container status \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": rpc error: code = NotFound desc = could not find container \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": container with ID starting with 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.385611 5030 scope.go:117] "RemoveContainer" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.388301 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": container with ID starting with 
f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 not found: ID does not exist" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.388343 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} err="failed to get container status \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": rpc error: code = NotFound desc = could not find container \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": container with ID starting with f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.388369 5030 scope.go:117] "RemoveContainer" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: E0120 23:22:34.388902 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": container with ID starting with 161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58 not found: ID does not exist" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.388925 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} err="failed to get container status \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": rpc error: code = NotFound desc = could not find container \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": container with ID starting with 161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.388939 5030 scope.go:117] "RemoveContainer" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.389257 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} err="failed to get container status \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": rpc error: code = NotFound desc = could not find container \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": container with ID starting with 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.389296 5030 scope.go:117] "RemoveContainer" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.389867 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} err="failed to get container status \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": rpc error: code = NotFound desc = could not find container \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": container with ID starting with 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 not found: ID does not exist" Jan 20 
23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.389911 5030 scope.go:117] "RemoveContainer" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390262 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} err="failed to get container status \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": rpc error: code = NotFound desc = could not find container \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": container with ID starting with f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390284 5030 scope.go:117] "RemoveContainer" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390559 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} err="failed to get container status \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": rpc error: code = NotFound desc = could not find container \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": container with ID starting with 161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390588 5030 scope.go:117] "RemoveContainer" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390872 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} err="failed to get container status \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": rpc error: code = NotFound desc = could not find container \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": container with ID starting with 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.390898 5030 scope.go:117] "RemoveContainer" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.391162 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} err="failed to get container status \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": rpc error: code = NotFound desc = could not find container \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": container with ID starting with 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.391184 5030 scope.go:117] "RemoveContainer" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.391477 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} err="failed to get container status 
\"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": rpc error: code = NotFound desc = could not find container \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": container with ID starting with f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.391498 5030 scope.go:117] "RemoveContainer" containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.392739 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} err="failed to get container status \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": rpc error: code = NotFound desc = could not find container \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": container with ID starting with 161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.392763 5030 scope.go:117] "RemoveContainer" containerID="42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393050 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63"} err="failed to get container status \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": rpc error: code = NotFound desc = could not find container \"42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63\": container with ID starting with 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393076 5030 scope.go:117] "RemoveContainer" containerID="51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393330 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1"} err="failed to get container status \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": rpc error: code = NotFound desc = could not find container \"51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1\": container with ID starting with 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393387 5030 scope.go:117] "RemoveContainer" containerID="f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393887 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21"} err="failed to get container status \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": rpc error: code = NotFound desc = could not find container \"f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21\": container with ID starting with f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.393909 5030 scope.go:117] "RemoveContainer" 
containerID="161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.394993 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58"} err="failed to get container status \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": rpc error: code = NotFound desc = could not find container \"161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58\": container with ID starting with 161b3b6bc490f8f585bf79f27a70125227e50a0ba92a584eaed367640e931a58 not found: ID does not exist" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.463835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.463912 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.463937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.463957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.463991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gggr7\" (UniqueName: \"kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.464169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.464230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.565573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gggr7\" (UniqueName: \"kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.566230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.569378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.571471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.571601 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.572196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.579000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.583828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gggr7\" (UniqueName: \"kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7\") pod \"ceilometer-0\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:34 crc kubenswrapper[5030]: I0120 23:22:34.683354 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.270178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:35 crc kubenswrapper[5030]: W0120 23:22:35.539543 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf86d0309_1b60_4774_92ea_cb2ff2d57da1.slice/crio-f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21.scope WatchSource:0}: Error finding container f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21: Status 404 returned error can't find the container with id f873bd17c6014b30119e008fbcf0b17f2017d32c516d514c71cfa43b4c944d21 Jan 20 23:22:35 crc kubenswrapper[5030]: W0120 23:22:35.541948 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf86d0309_1b60_4774_92ea_cb2ff2d57da1.slice/crio-51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1.scope WatchSource:0}: Error finding container 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1: Status 404 returned error can't find the container with id 51b6e92637bc88d4d073efd7e5a6ce52a5fee489b2e2b7dff8371579b578f5d1 Jan 20 23:22:35 crc kubenswrapper[5030]: W0120 23:22:35.545292 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf86d0309_1b60_4774_92ea_cb2ff2d57da1.slice/crio-42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63.scope WatchSource:0}: Error finding container 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63: Status 404 returned error can't find the container with id 42a1a1fe8101f4442bdf17982f2d902c155cd826741d72345eb5f58c3b6dfe63 Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.844787 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.885856 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.885905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.885973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.886003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.886054 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pttsj\" (UniqueName: \"kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.892207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.893861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj" (OuterVolumeSpecName: "kube-api-access-pttsj") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "kube-api-access-pttsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.957347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config" (OuterVolumeSpecName: "config") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.971176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.974957 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86d0309-1b60-4774-92ea-cb2ff2d57da1" path="/var/lib/kubelet/pods/f86d0309-1b60-4774-92ea-cb2ff2d57da1/volumes" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.987388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.987584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") pod \"679648c6-8923-42fc-9226-c78458125e4c\" (UID: \"679648c6-8923-42fc-9226-c78458125e4c\") " Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.988243 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.988266 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.988345 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pttsj\" (UniqueName: \"kubernetes.io/projected/679648c6-8923-42fc-9226-c78458125e4c-kube-api-access-pttsj\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.988360 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:35 crc kubenswrapper[5030]: W0120 23:22:35.988460 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/679648c6-8923-42fc-9226-c78458125e4c/volumes/kubernetes.io~secret/ovndb-tls-certs Jan 20 23:22:35 crc kubenswrapper[5030]: I0120 23:22:35.988482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "679648c6-8923-42fc-9226-c78458125e4c" (UID: "679648c6-8923-42fc-9226-c78458125e4c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.090661 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/679648c6-8923-42fc-9226-c78458125e4c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.235789 5030 generic.go:334] "Generic (PLEG): container finished" podID="679648c6-8923-42fc-9226-c78458125e4c" containerID="44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd" exitCode=0 Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.235883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerDied","Data":"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd"} Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.235921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" event={"ID":"679648c6-8923-42fc-9226-c78458125e4c","Type":"ContainerDied","Data":"a39049e7fbb6b796b10f056fe686f6a0bc27e97a257a7c5001a89a0deeb063fb"} Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.235949 5030 scope.go:117] "RemoveContainer" containerID="c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.236097 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.249996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerStarted","Data":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.250040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerStarted","Data":"ab4b209301a8942f5f177351fb8af030eedfecb164109087a15cd3d30ab2b66d"} Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.265855 5030 scope.go:117] "RemoveContainer" containerID="44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.287144 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.297177 5030 scope.go:117] "RemoveContainer" containerID="c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694" Jan 20 23:22:36 crc kubenswrapper[5030]: E0120 23:22:36.297711 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694\": container with ID starting with c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694 not found: ID does not exist" containerID="c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.297747 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694"} err="failed to get container status \"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694\": rpc 
error: code = NotFound desc = could not find container \"c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694\": container with ID starting with c189266c2bcb621a50e9f7fd43bbd9ba20e9df922f57e929a34c1b43196e2694 not found: ID does not exist" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.297773 5030 scope.go:117] "RemoveContainer" containerID="44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd" Jan 20 23:22:36 crc kubenswrapper[5030]: E0120 23:22:36.298261 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd\": container with ID starting with 44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd not found: ID does not exist" containerID="44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.298282 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd"} err="failed to get container status \"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd\": rpc error: code = NotFound desc = could not find container \"44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd\": container with ID starting with 44457fa640b84e298c9bd24570c134e0e6d163e0861bd161428b3fa40406b8dd not found: ID does not exist" Jan 20 23:22:36 crc kubenswrapper[5030]: I0120 23:22:36.299737 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7fbf45bdf4-vh8hk"] Jan 20 23:22:37 crc kubenswrapper[5030]: I0120 23:22:37.267499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerStarted","Data":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} Jan 20 23:22:37 crc kubenswrapper[5030]: I0120 23:22:37.972011 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679648c6-8923-42fc-9226-c78458125e4c" path="/var/lib/kubelet/pods/679648c6-8923-42fc-9226-c78458125e4c/volumes" Jan 20 23:22:38 crc kubenswrapper[5030]: I0120 23:22:38.279280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerStarted","Data":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} Jan 20 23:22:39 crc kubenswrapper[5030]: I0120 23:22:39.294585 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b62b69c-8413-4732-9816-9d2f635d47b5" containerID="50246376aa5e4e5507db17c42856a6a995ac78ab49646b34f1fdd90c7caa4584" exitCode=0 Jan 20 23:22:39 crc kubenswrapper[5030]: I0120 23:22:39.294691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" event={"ID":"5b62b69c-8413-4732-9816-9d2f635d47b5","Type":"ContainerDied","Data":"50246376aa5e4e5507db17c42856a6a995ac78ab49646b34f1fdd90c7caa4584"} Jan 20 23:22:39 crc kubenswrapper[5030]: I0120 23:22:39.300318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerStarted","Data":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} Jan 20 23:22:39 crc kubenswrapper[5030]: I0120 23:22:39.301930 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:39 crc kubenswrapper[5030]: I0120 23:22:39.355884 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.824321947 podStartE2EDuration="5.355868127s" podCreationTimestamp="2026-01-20 23:22:34 +0000 UTC" firstStartedPulling="2026-01-20 23:22:35.272065634 +0000 UTC m=+2827.592325922" lastFinishedPulling="2026-01-20 23:22:38.803611804 +0000 UTC m=+2831.123872102" observedRunningTime="2026-01-20 23:22:39.351912522 +0000 UTC m=+2831.672172850" watchObservedRunningTime="2026-01-20 23:22:39.355868127 +0000 UTC m=+2831.676128425" Jan 20 23:22:40 crc kubenswrapper[5030]: I0120 23:22:40.802277 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:40 crc kubenswrapper[5030]: I0120 23:22:40.992869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shr4q\" (UniqueName: \"kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q\") pod \"5b62b69c-8413-4732-9816-9d2f635d47b5\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " Jan 20 23:22:40 crc kubenswrapper[5030]: I0120 23:22:40.992930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle\") pod \"5b62b69c-8413-4732-9816-9d2f635d47b5\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " Jan 20 23:22:40 crc kubenswrapper[5030]: I0120 23:22:40.993109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data\") pod \"5b62b69c-8413-4732-9816-9d2f635d47b5\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " Jan 20 23:22:40 crc kubenswrapper[5030]: I0120 23:22:40.993164 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts\") pod \"5b62b69c-8413-4732-9816-9d2f635d47b5\" (UID: \"5b62b69c-8413-4732-9816-9d2f635d47b5\") " Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.007429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts" (OuterVolumeSpecName: "scripts") pod "5b62b69c-8413-4732-9816-9d2f635d47b5" (UID: "5b62b69c-8413-4732-9816-9d2f635d47b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.007523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q" (OuterVolumeSpecName: "kube-api-access-shr4q") pod "5b62b69c-8413-4732-9816-9d2f635d47b5" (UID: "5b62b69c-8413-4732-9816-9d2f635d47b5"). InnerVolumeSpecName "kube-api-access-shr4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.024973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b62b69c-8413-4732-9816-9d2f635d47b5" (UID: "5b62b69c-8413-4732-9816-9d2f635d47b5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.028522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data" (OuterVolumeSpecName: "config-data") pod "5b62b69c-8413-4732-9816-9d2f635d47b5" (UID: "5b62b69c-8413-4732-9816-9d2f635d47b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.094839 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.094877 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shr4q\" (UniqueName: \"kubernetes.io/projected/5b62b69c-8413-4732-9816-9d2f635d47b5-kube-api-access-shr4q\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.094887 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.094896 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b62b69c-8413-4732-9816-9d2f635d47b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.331184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" event={"ID":"5b62b69c-8413-4732-9816-9d2f635d47b5","Type":"ContainerDied","Data":"06c473ea0cce08a788c24e1fabddf717b234fd8b18ec1b246e7c48684c5f0006"} Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.331245 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.331256 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c473ea0cce08a788c24e1fabddf717b234fd8b18ec1b246e7c48684c5f0006" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.423892 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:22:41 crc kubenswrapper[5030]: E0120 23:22:41.424204 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-httpd" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424219 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-httpd" Jan 20 23:22:41 crc kubenswrapper[5030]: E0120 23:22:41.424241 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-api" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424248 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-api" Jan 20 23:22:41 crc kubenswrapper[5030]: E0120 23:22:41.424259 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b62b69c-8413-4732-9816-9d2f635d47b5" containerName="nova-cell0-conductor-db-sync" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424266 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b62b69c-8413-4732-9816-9d2f635d47b5" containerName="nova-cell0-conductor-db-sync" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424427 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-api" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424438 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b62b69c-8413-4732-9816-9d2f635d47b5" containerName="nova-cell0-conductor-db-sync" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.424455 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="679648c6-8923-42fc-9226-c78458125e4c" containerName="neutron-httpd" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.425012 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.427230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.427259 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-qp6jt" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.440108 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.503193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.503366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7mv\" (UniqueName: \"kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.503395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.604510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.604656 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7mv\" (UniqueName: \"kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.604685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.607936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.608718 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.623690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7mv\" (UniqueName: \"kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv\") pod \"nova-cell0-conductor-0\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:41 crc kubenswrapper[5030]: I0120 23:22:41.752309 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:42 crc kubenswrapper[5030]: I0120 23:22:42.184597 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:22:42 crc kubenswrapper[5030]: I0120 23:22:42.206845 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:22:42 crc kubenswrapper[5030]: W0120 23:22:42.207998 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf61ebcb_4ade_4329_8038_0306cc15e260.slice/crio-ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e WatchSource:0}: Error finding container ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e: Status 404 returned error can't find the container with id ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e Jan 20 23:22:42 crc kubenswrapper[5030]: I0120 23:22:42.349429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"bf61ebcb-4ade-4329-8038-0306cc15e260","Type":"ContainerStarted","Data":"ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e"} Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.312441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.312935 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-central-agent" containerID="cri-o://efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" gracePeriod=30 Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.313030 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="sg-core" containerID="cri-o://ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" gracePeriod=30 Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.313072 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-notification-agent" containerID="cri-o://f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" gracePeriod=30 Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.313313 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="proxy-httpd" 
containerID="cri-o://51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" gracePeriod=30 Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.359696 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"bf61ebcb-4ade-4329-8038-0306cc15e260","Type":"ContainerStarted","Data":"8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981"} Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.359854 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" gracePeriod=30 Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.359880 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.375248 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.375230695 podStartE2EDuration="2.375230695s" podCreationTimestamp="2026-01-20 23:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:43.372821567 +0000 UTC m=+2835.693081855" watchObservedRunningTime="2026-01-20 23:22:43.375230695 +0000 UTC m=+2835.695490993" Jan 20 23:22:43 crc kubenswrapper[5030]: I0120 23:22:43.929870 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.041693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.041774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.041822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.042576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.042668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 
23:22:44.042690 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.043151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.043214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gggr7\" (UniqueName: \"kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7\") pod \"edb83d37-768f-454d-b072-71e8c52dcd10\" (UID: \"edb83d37-768f-454d-b072-71e8c52dcd10\") " Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.043658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.044870 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.044909 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edb83d37-768f-454d-b072-71e8c52dcd10-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.048808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts" (OuterVolumeSpecName: "scripts") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.049014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7" (OuterVolumeSpecName: "kube-api-access-gggr7") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "kube-api-access-gggr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.067557 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.128647 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.146056 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.146089 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.146101 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.146115 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gggr7\" (UniqueName: \"kubernetes.io/projected/edb83d37-768f-454d-b072-71e8c52dcd10-kube-api-access-gggr7\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.147490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data" (OuterVolumeSpecName: "config-data") pod "edb83d37-768f-454d-b072-71e8c52dcd10" (UID: "edb83d37-768f-454d-b072-71e8c52dcd10"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.253576 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edb83d37-768f-454d-b072-71e8c52dcd10-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372464 5030 generic.go:334] "Generic (PLEG): container finished" podID="edb83d37-768f-454d-b072-71e8c52dcd10" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" exitCode=0 Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372503 5030 generic.go:334] "Generic (PLEG): container finished" podID="edb83d37-768f-454d-b072-71e8c52dcd10" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" exitCode=2 Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372510 5030 generic.go:334] "Generic (PLEG): container finished" podID="edb83d37-768f-454d-b072-71e8c52dcd10" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" exitCode=0 Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372518 5030 generic.go:334] "Generic (PLEG): container finished" podID="edb83d37-768f-454d-b072-71e8c52dcd10" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" exitCode=0 Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerDied","Data":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerDied","Data":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerDied","Data":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372585 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerDied","Data":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"edb83d37-768f-454d-b072-71e8c52dcd10","Type":"ContainerDied","Data":"ab4b209301a8942f5f177351fb8af030eedfecb164109087a15cd3d30ab2b66d"} Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372599 5030 scope.go:117] "RemoveContainer" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.372578 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.401257 5030 scope.go:117] "RemoveContainer" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.406141 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.413182 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.421110 5030 scope.go:117] "RemoveContainer" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.430674 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="sg-core" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430689 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="sg-core" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.430703 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-notification-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430709 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-notification-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.430742 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-central-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430749 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-central-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.430757 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="proxy-httpd" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430763 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="proxy-httpd" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430914 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="proxy-httpd" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430928 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-notification-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430941 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="sg-core" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.430951 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" containerName="ceilometer-central-agent" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.432401 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.435932 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.436762 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.443268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.456520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbdbq\" (UniqueName: \"kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.456582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.456610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.456646 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.460717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.460807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.460849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.506241 5030 scope.go:117] "RemoveContainer" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.526226 5030 scope.go:117] "RemoveContainer" 
containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.526737 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": container with ID starting with 51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d not found: ID does not exist" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.526771 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} err="failed to get container status \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": rpc error: code = NotFound desc = could not find container \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": container with ID starting with 51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.526793 5030 scope.go:117] "RemoveContainer" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.527153 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": container with ID starting with ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099 not found: ID does not exist" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.527194 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} err="failed to get container status \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": rpc error: code = NotFound desc = could not find container \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": container with ID starting with ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.527210 5030 scope.go:117] "RemoveContainer" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.527543 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": container with ID starting with f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13 not found: ID does not exist" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.527583 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} err="failed to get container status \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": rpc error: code = NotFound desc = could not find container \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": container with ID starting with 
f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.527610 5030 scope.go:117] "RemoveContainer" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: E0120 23:22:44.527968 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": container with ID starting with efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7 not found: ID does not exist" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528010 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} err="failed to get container status \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": rpc error: code = NotFound desc = could not find container \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": container with ID starting with efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528038 5030 scope.go:117] "RemoveContainer" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528523 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} err="failed to get container status \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": rpc error: code = NotFound desc = could not find container \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": container with ID starting with 51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528552 5030 scope.go:117] "RemoveContainer" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528889 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} err="failed to get container status \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": rpc error: code = NotFound desc = could not find container \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": container with ID starting with ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.528913 5030 scope.go:117] "RemoveContainer" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529177 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} err="failed to get container status \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": rpc error: code = NotFound desc = could not find container \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": container with ID starting with 
f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529197 5030 scope.go:117] "RemoveContainer" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529395 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} err="failed to get container status \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": rpc error: code = NotFound desc = could not find container \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": container with ID starting with efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529412 5030 scope.go:117] "RemoveContainer" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529971 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} err="failed to get container status \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": rpc error: code = NotFound desc = could not find container \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": container with ID starting with 51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.529995 5030 scope.go:117] "RemoveContainer" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.530329 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} err="failed to get container status \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": rpc error: code = NotFound desc = could not find container \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": container with ID starting with ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.530347 5030 scope.go:117] "RemoveContainer" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.530729 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} err="failed to get container status \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": rpc error: code = NotFound desc = could not find container \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": container with ID starting with f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.530789 5030 scope.go:117] "RemoveContainer" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531081 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} err="failed to get container status \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": rpc error: code = NotFound desc = could not find container \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": container with ID starting with efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531104 5030 scope.go:117] "RemoveContainer" containerID="51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531326 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d"} err="failed to get container status \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": rpc error: code = NotFound desc = could not find container \"51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d\": container with ID starting with 51361b313be6df3432a0208535f0a181b05936eac6ef957d6876cd04c3281f2d not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531347 5030 scope.go:117] "RemoveContainer" containerID="ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531539 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099"} err="failed to get container status \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": rpc error: code = NotFound desc = could not find container \"ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099\": container with ID starting with ab749dc227642e507d8d2d8aa99f5954fad6d1a78b88f7b34a6a5bf871f8e099 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531555 5030 scope.go:117] "RemoveContainer" containerID="f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13"} err="failed to get container status \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": rpc error: code = NotFound desc = could not find container \"f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13\": container with ID starting with f00fda454204e0bb4dbf46bc9d151800f93b8515515d4c8e17ea3d79e7a58a13 not found: ID does not exist" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531802 5030 scope.go:117] "RemoveContainer" containerID="efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.531968 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7"} err="failed to get container status \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": rpc error: code = NotFound desc = could not find container \"efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7\": container with ID starting with efdcdc1b0667b8d67960fd9d77ca3b813dc68b21b48a6b1db0895a27e4d62af7 not found: ID does not exist" Jan 
20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbdbq\" (UniqueName: \"kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.562993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.563020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.563244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.563344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.566753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.569241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.569362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.570060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.584724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbdbq\" (UniqueName: \"kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq\") pod \"ceilometer-0\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:44 crc kubenswrapper[5030]: I0120 23:22:44.808163 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.051312 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.315222 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:45 crc kubenswrapper[5030]: W0120 23:22:45.323538 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70fa23d5_94e0_4700_bdbe_9ec06a2653ae.slice/crio-a39e31912d3bbed1fb4b2f12ebcdf3c86ca3a7d2e4e7852857ff2379a859b70d WatchSource:0}: Error finding container a39e31912d3bbed1fb4b2f12ebcdf3c86ca3a7d2e4e7852857ff2379a859b70d: Status 404 returned error can't find the container with id a39e31912d3bbed1fb4b2f12ebcdf3c86ca3a7d2e4e7852857ff2379a859b70d Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.384098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerStarted","Data":"a39e31912d3bbed1fb4b2f12ebcdf3c86ca3a7d2e4e7852857ff2379a859b70d"} Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.549375 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.549910 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-log" containerID="cri-o://54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f" gracePeriod=30 Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.550040 5030 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-httpd" containerID="cri-o://b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb" gracePeriod=30 Jan 20 23:22:45 crc kubenswrapper[5030]: E0120 23:22:45.741342 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e2ae2a_382b_46da_a95d_b65c49e1298b.slice/crio-conmon-54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70e2ae2a_382b_46da_a95d_b65c49e1298b.slice/crio-54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:22:45 crc kubenswrapper[5030]: I0120 23:22:45.973016 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb83d37-768f-454d-b072-71e8c52dcd10" path="/var/lib/kubelet/pods/edb83d37-768f-454d-b072-71e8c52dcd10/volumes" Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.393714 5030 generic.go:334] "Generic (PLEG): container finished" podID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerID="54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f" exitCode=143 Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.393783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerDied","Data":"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f"} Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.395773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerStarted","Data":"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1"} Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.938258 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.938771 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-log" containerID="cri-o://42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852" gracePeriod=30 Jan 20 23:22:46 crc kubenswrapper[5030]: I0120 23:22:46.938875 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-httpd" containerID="cri-o://9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3" gracePeriod=30 Jan 20 23:22:47 crc kubenswrapper[5030]: I0120 23:22:47.408303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerStarted","Data":"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f"} Jan 20 23:22:47 crc kubenswrapper[5030]: I0120 23:22:47.411309 5030 generic.go:334] "Generic (PLEG): container finished" podID="03270088-0dfc-4646-9599-0700fd991275" containerID="42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852" exitCode=143 Jan 20 
23:22:47 crc kubenswrapper[5030]: I0120 23:22:47.411472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerDied","Data":"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852"} Jan 20 23:22:48 crc kubenswrapper[5030]: I0120 23:22:48.423687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerStarted","Data":"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933"} Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.129499 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.236809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.236899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.236965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldzpn\" (UniqueName: \"kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: \"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237210 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"70e2ae2a-382b-46da-a95d-b65c49e1298b\" (UID: 
\"70e2ae2a-382b-46da-a95d-b65c49e1298b\") " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs" (OuterVolumeSpecName: "logs") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.237730 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.243929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts" (OuterVolumeSpecName: "scripts") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.244116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.248268 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn" (OuterVolumeSpecName: "kube-api-access-ldzpn") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "kube-api-access-ldzpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.275838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.288343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.292595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data" (OuterVolumeSpecName: "config-data") pod "70e2ae2a-382b-46da-a95d-b65c49e1298b" (UID: "70e2ae2a-382b-46da-a95d-b65c49e1298b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.339910 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldzpn\" (UniqueName: \"kubernetes.io/projected/70e2ae2a-382b-46da-a95d-b65c49e1298b-kube-api-access-ldzpn\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.339949 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.339959 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.339970 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.339978 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/70e2ae2a-382b-46da-a95d-b65c49e1298b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.340003 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.340012 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70e2ae2a-382b-46da-a95d-b65c49e1298b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.359947 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.435192 5030 generic.go:334] "Generic (PLEG): container finished" podID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerID="b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb" exitCode=0 Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.435259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerDied","Data":"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb"} Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.435291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"70e2ae2a-382b-46da-a95d-b65c49e1298b","Type":"ContainerDied","Data":"ee918c210dd8333de30c29f540d427e2dde9918e4e84fb352a3ad9f06c906777"} Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.435312 5030 scope.go:117] 
"RemoveContainer" containerID="b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.435469 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.441148 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.441902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerStarted","Data":"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204"} Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.442040 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-central-agent" containerID="cri-o://9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1" gracePeriod=30 Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.442252 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.442501 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="proxy-httpd" containerID="cri-o://12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204" gracePeriod=30 Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.442573 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="sg-core" containerID="cri-o://d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933" gracePeriod=30 Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.442582 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-notification-agent" containerID="cri-o://307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f" gracePeriod=30 Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.462686 5030 scope.go:117] "RemoveContainer" containerID="54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.468117 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.07428498 podStartE2EDuration="5.468100396s" podCreationTimestamp="2026-01-20 23:22:44 +0000 UTC" firstStartedPulling="2026-01-20 23:22:45.326500327 +0000 UTC m=+2837.646760615" lastFinishedPulling="2026-01-20 23:22:48.720315743 +0000 UTC m=+2841.040576031" observedRunningTime="2026-01-20 23:22:49.465719419 +0000 UTC m=+2841.785979717" watchObservedRunningTime="2026-01-20 23:22:49.468100396 +0000 UTC m=+2841.788360704" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.483011 5030 scope.go:117] "RemoveContainer" containerID="b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb" Jan 20 23:22:49 crc kubenswrapper[5030]: E0120 23:22:49.483462 5030 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb\": container with ID starting with b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb not found: ID does not exist" containerID="b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.483513 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb"} err="failed to get container status \"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb\": rpc error: code = NotFound desc = could not find container \"b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb\": container with ID starting with b3d5a25fb3cb3fb071b846be57fec63c5e5ee368d213f3055cf1391136ae34fb not found: ID does not exist" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.483539 5030 scope.go:117] "RemoveContainer" containerID="54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f" Jan 20 23:22:49 crc kubenswrapper[5030]: E0120 23:22:49.483851 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f\": container with ID starting with 54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f not found: ID does not exist" containerID="54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.483893 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f"} err="failed to get container status \"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f\": rpc error: code = NotFound desc = could not find container \"54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f\": container with ID starting with 54f0ac5d1e7d7b41fdcf95c5129814940a84db11cc387f888d609ea5ae6bcb7f not found: ID does not exist" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.495872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.503737 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.519664 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:49 crc kubenswrapper[5030]: E0120 23:22:49.520088 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-log" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.520108 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-log" Jan 20 23:22:49 crc kubenswrapper[5030]: E0120 23:22:49.520119 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-httpd" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.520129 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-httpd" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 
23:22:49.520380 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-log" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.520409 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" containerName="glance-httpd" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.521538 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.523106 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.524485 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.535291 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.643967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28df\" (UniqueName: \"kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644476 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.644583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28df\" (UniqueName: \"kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.746338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.747311 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.747837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.747901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.751210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.751965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.752305 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.754495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.788480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28df\" (UniqueName: \"kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df\") pod 
\"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.807405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.854100 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:49 crc kubenswrapper[5030]: I0120 23:22:49.978916 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e2ae2a-382b-46da-a95d-b65c49e1298b" path="/var/lib/kubelet/pods/70e2ae2a-382b-46da-a95d-b65c49e1298b/volumes" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.302889 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.439657 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.451776 5030 generic.go:334] "Generic (PLEG): container finished" podID="03270088-0dfc-4646-9599-0700fd991275" containerID="9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3" exitCode=0 Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.451825 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerDied","Data":"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.451849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03270088-0dfc-4646-9599-0700fd991275","Type":"ContainerDied","Data":"e3d841ad80fce68034329e6e9ec7acea58fa7d3a8f80743be11d33b14a112b13"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.451864 5030 scope.go:117] "RemoveContainer" containerID="9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.451957 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.462994 5030 generic.go:334] "Generic (PLEG): container finished" podID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerID="12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204" exitCode=0 Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.463018 5030 generic.go:334] "Generic (PLEG): container finished" podID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerID="d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933" exitCode=2 Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.463025 5030 generic.go:334] "Generic (PLEG): container finished" podID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerID="307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f" exitCode=0 Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.463058 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerDied","Data":"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.463075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerDied","Data":"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.463084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerDied","Data":"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.466347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerStarted","Data":"602bd6ed5c21fdad5d228277398f647b30a1961fac5fab87b3f82c5830d574dd"} Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.494931 5030 scope.go:117] "RemoveContainer" containerID="42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.530568 5030 scope.go:117] "RemoveContainer" containerID="9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3" Jan 20 23:22:50 crc kubenswrapper[5030]: E0120 23:22:50.539444 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3\": container with ID starting with 9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3 not found: ID does not exist" containerID="9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.539491 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3"} err="failed to get container status \"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3\": rpc error: code = NotFound desc = could not find container \"9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3\": container with ID starting with 9bd137d1669b6127c69101e3cff47c37369bbd0fb2dc8cd722bee2c457d4eeb3 not found: ID does not exist" Jan 20 23:22:50 crc 
kubenswrapper[5030]: I0120 23:22:50.539517 5030 scope.go:117] "RemoveContainer" containerID="42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852" Jan 20 23:22:50 crc kubenswrapper[5030]: E0120 23:22:50.540095 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852\": container with ID starting with 42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852 not found: ID does not exist" containerID="42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.540155 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852"} err="failed to get container status \"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852\": rpc error: code = NotFound desc = could not find container \"42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852\": container with ID starting with 42c8c068805b8becc88018dc6e8647a0ecdaa3cf1b06f1ed91892d7203494852 not found: ID does not exist" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561199 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwv9p\" (UniqueName: \"kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p\") pod 
\"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.561487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data\") pod \"03270088-0dfc-4646-9599-0700fd991275\" (UID: \"03270088-0dfc-4646-9599-0700fd991275\") " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.562300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.562359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs" (OuterVolumeSpecName: "logs") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.565816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.566759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p" (OuterVolumeSpecName: "kube-api-access-wwv9p") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "kube-api-access-wwv9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.567140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts" (OuterVolumeSpecName: "scripts") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.597977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.607587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data" (OuterVolumeSpecName: "config-data") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.609414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03270088-0dfc-4646-9599-0700fd991275" (UID: "03270088-0dfc-4646-9599-0700fd991275"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.651454 5030 scope.go:117] "RemoveContainer" containerID="2c66ada43959a170fa4404aa8228c36bf2d04ad637119ab68bac6365e9aa4160" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663358 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663404 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663421 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwv9p\" (UniqueName: \"kubernetes.io/projected/03270088-0dfc-4646-9599-0700fd991275-kube-api-access-wwv9p\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663434 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663445 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663459 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03270088-0dfc-4646-9599-0700fd991275-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663471 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03270088-0dfc-4646-9599-0700fd991275-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.663516 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.688051 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.690966 5030 scope.go:117] "RemoveContainer" containerID="defefea15dd6535180f7c0202106736a984c14cefca0c9d1fcc0e37e5e341a5c" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.717759 5030 scope.go:117] "RemoveContainer" containerID="4b519540152f7b8b3ca7a036718809383dc33dcff141d1828031db8b17e23383" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.743097 5030 scope.go:117] "RemoveContainer" containerID="2c9729c566b30fd83e8dc16f913064627dab1c0d4147acc8dfa17e723cc5653a" Jan 20 
23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.764667 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.806831 5030 scope.go:117] "RemoveContainer" containerID="2fffc0464c76dcd205b4adc0bbf5e64f2ccf1ded597857e6c78cdfd2bd0884e3" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.863594 5030 scope.go:117] "RemoveContainer" containerID="88a2387b417e90bfd501bc833d3ec1e5d951fed92f4116f90417ad981562c482" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.870338 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.886034 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.911169 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:50 crc kubenswrapper[5030]: E0120 23:22:50.911647 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-log" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.911669 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-log" Jan 20 23:22:50 crc kubenswrapper[5030]: E0120 23:22:50.911711 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-httpd" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.911720 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-httpd" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.911925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-log" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.911969 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03270088-0dfc-4646-9599-0700fd991275" containerName="glance-httpd" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.913290 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.916240 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.916498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:22:50 crc kubenswrapper[5030]: I0120 23:22:50.942570 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.016224 5030 scope.go:117] "RemoveContainer" containerID="becb9ebf78886f627f252d5d9efae37f435c9773f0c72636e2a16f48b81f3c65" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.041658 5030 scope.go:117] "RemoveContainer" containerID="b52e95dd143ed5cf453a5cbefbdce934cf06f59d7f7dc81da3443df81759cf02" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.075786 5030 scope.go:117] "RemoveContainer" containerID="a63c3e2c0fa9c632fa78a3a408312008d457ce68e6f245d77409731f31a4fec8" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc88m\" (UniqueName: \"kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.084984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.107171 5030 scope.go:117] "RemoveContainer" containerID="64e8ddea6808b444003252fcc52a6d1bf1a5cedf7605c158182a33e5e1061381" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.140501 5030 scope.go:117] "RemoveContainer" containerID="2fc8ca77520abb462f511cdc67fcb4d4b532e52fdeff91ec65e80699f856a33b" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.166753 5030 scope.go:117] "RemoveContainer" containerID="12ac3c08c01fa3165cd6cab4fc0e255fb705e419bf03d4f6f9041ab8c246bd09" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186905 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc88m\" (UniqueName: \"kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186932 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 
23:22:51.186953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.186971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.187011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.187798 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.187981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.188391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.190519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.193051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.193577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.200167 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.211120 5030 scope.go:117] "RemoveContainer" containerID="6d9a540904da26e50af38624029f0074073b55fb95f2afb13be3d84f6f8527ef" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.211225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc88m\" (UniqueName: \"kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.217923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.247772 5030 scope.go:117] "RemoveContainer" containerID="1cd43d16a69bd38005f4c564af5a563ffb74b009be30301f57db13f1b6bfaf60" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.280740 5030 scope.go:117] "RemoveContainer" containerID="962d321b449e41fd746f074eb545941f90420af47ecf8de63641e62ace89067f" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.305822 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.488869 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerStarted","Data":"5bfcfd97530429318df3c1456d709733df8f61edbb7300154b1b5a2dc34d7062"} Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.489125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerStarted","Data":"af4b4b10add63da920b6c0c9dfac114e8ff71804b575c92731ad570bf240f7e1"} Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.516292 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.516267782 podStartE2EDuration="2.516267782s" podCreationTimestamp="2026-01-20 23:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:51.50655423 +0000 UTC m=+2843.826814518" watchObservedRunningTime="2026-01-20 23:22:51.516267782 +0000 UTC m=+2843.836528070" Jan 20 23:22:51 crc kubenswrapper[5030]: E0120 23:22:51.778722 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:51 crc kubenswrapper[5030]: E0120 23:22:51.782805 5030 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:51 crc kubenswrapper[5030]: E0120 23:22:51.795387 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:51 crc kubenswrapper[5030]: E0120 23:22:51.795453 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.854539 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:22:51 crc kubenswrapper[5030]: I0120 23:22:51.978047 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03270088-0dfc-4646-9599-0700fd991275" path="/var/lib/kubelet/pods/03270088-0dfc-4646-9599-0700fd991275/volumes" Jan 20 23:22:52 crc kubenswrapper[5030]: I0120 23:22:52.500303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerStarted","Data":"7d33a9dd6b2ff81227dfeb64ea475066ca5a849d6f4973770fd37aee303e0b58"} Jan 20 23:22:52 crc kubenswrapper[5030]: I0120 23:22:52.500720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerStarted","Data":"12b36267d2b888f2e81d444f6a384cba4e2f8ef616c88070e3918f6855caf7c2"} Jan 20 23:22:53 crc kubenswrapper[5030]: I0120 23:22:53.518533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerStarted","Data":"bbe8780ef4464c2c88e1e7f47c8a400ceb8bf0f5eb423d5f85696f78d14df1f9"} Jan 20 23:22:53 crc kubenswrapper[5030]: I0120 23:22:53.552308 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.552280477 podStartE2EDuration="3.552280477s" podCreationTimestamp="2026-01-20 23:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:22:53.541561071 +0000 UTC m=+2845.861821389" watchObservedRunningTime="2026-01-20 23:22:53.552280477 +0000 UTC m=+2845.872540805" Jan 20 23:22:53 crc kubenswrapper[5030]: I0120 23:22:53.933334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:22:53 crc kubenswrapper[5030]: I0120 23:22:53.937700 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:53 crc kubenswrapper[5030]: I0120 23:22:53.959895 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.041272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.041403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94txk\" (UniqueName: \"kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.041599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.143378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94txk\" (UniqueName: \"kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.143561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.143607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.144075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.144275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.164421 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-94txk\" (UniqueName: \"kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk\") pod \"redhat-marketplace-mk5qx\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.267046 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.729226 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:22:54 crc kubenswrapper[5030]: W0120 23:22:54.733831 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22be905b_8370_4260_b4a8_9ed7fe5995a6.slice/crio-c134b73ba90b7f221f95fd14289a7e5ba24ce7f44dda9769f3fcedbb496939c9 WatchSource:0}: Error finding container c134b73ba90b7f221f95fd14289a7e5ba24ce7f44dda9769f3fcedbb496939c9: Status 404 returned error can't find the container with id c134b73ba90b7f221f95fd14289a7e5ba24ce7f44dda9769f3fcedbb496939c9 Jan 20 23:22:54 crc kubenswrapper[5030]: I0120 23:22:54.952761 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.061941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062410 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qbdbq\" (UniqueName: \"kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq\") pod \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\" (UID: \"70fa23d5-94e0-4700-bdbe-9ec06a2653ae\") " Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.062760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.063091 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.063264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.068661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq" (OuterVolumeSpecName: "kube-api-access-qbdbq") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "kube-api-access-qbdbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.068676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts" (OuterVolumeSpecName: "scripts") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.096378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.161903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.165327 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.165365 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.165383 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbdbq\" (UniqueName: \"kubernetes.io/projected/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-kube-api-access-qbdbq\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.165398 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.165409 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.181360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data" (OuterVolumeSpecName: "config-data") pod "70fa23d5-94e0-4700-bdbe-9ec06a2653ae" (UID: "70fa23d5-94e0-4700-bdbe-9ec06a2653ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.267744 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa23d5-94e0-4700-bdbe-9ec06a2653ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.541852 5030 generic.go:334] "Generic (PLEG): container finished" podID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerID="ff1898650e81f71618798f8630357dd295a63eefc629c41319a839a7e3068d32" exitCode=0 Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.541936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerDied","Data":"ff1898650e81f71618798f8630357dd295a63eefc629c41319a839a7e3068d32"} Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.542393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerStarted","Data":"c134b73ba90b7f221f95fd14289a7e5ba24ce7f44dda9769f3fcedbb496939c9"} Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.546950 5030 generic.go:334] "Generic (PLEG): container finished" podID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerID="9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1" exitCode=0 Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.546994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerDied","Data":"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1"} Jan 20 23:22:55 
crc kubenswrapper[5030]: I0120 23:22:55.547058 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.547080 5030 scope.go:117] "RemoveContainer" containerID="12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.547062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"70fa23d5-94e0-4700-bdbe-9ec06a2653ae","Type":"ContainerDied","Data":"a39e31912d3bbed1fb4b2f12ebcdf3c86ca3a7d2e4e7852857ff2379a859b70d"} Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.593955 5030 scope.go:117] "RemoveContainer" containerID="d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.606881 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.630376 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.644607 5030 scope.go:117] "RemoveContainer" containerID="307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649221 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.649596 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-central-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-central-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.649652 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="sg-core" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649661 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="sg-core" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.649675 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="proxy-httpd" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649683 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="proxy-httpd" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.649717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-notification-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649725 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-notification-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649929 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-central-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649956 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="proxy-httpd" Jan 20 23:22:55 crc kubenswrapper[5030]: 
I0120 23:22:55.649968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="sg-core" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.649984 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" containerName="ceilometer-notification-agent" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.651940 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.656695 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.656805 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.659477 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.684420 5030 scope.go:117] "RemoveContainer" containerID="9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.717439 5030 scope.go:117] "RemoveContainer" containerID="12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.717920 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204\": container with ID starting with 12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204 not found: ID does not exist" containerID="12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.717972 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204"} err="failed to get container status \"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204\": rpc error: code = NotFound desc = could not find container \"12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204\": container with ID starting with 12c935e5d06c1c85c6ae4500bfb8d1c761eae997fe174438d7cbe8c93d4fc204 not found: ID does not exist" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.718003 5030 scope.go:117] "RemoveContainer" containerID="d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.718316 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933\": container with ID starting with d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933 not found: ID does not exist" containerID="d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.718346 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933"} err="failed to get container status \"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933\": rpc error: code = NotFound desc = could not find container 
\"d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933\": container with ID starting with d4db1e5c28113d674612d715717060efedb67b2d2c59368a248e71762a4da933 not found: ID does not exist" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.718370 5030 scope.go:117] "RemoveContainer" containerID="307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.718588 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f\": container with ID starting with 307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f not found: ID does not exist" containerID="307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.719046 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f"} err="failed to get container status \"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f\": rpc error: code = NotFound desc = could not find container \"307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f\": container with ID starting with 307e7fcab066c26737977af21851a91a7fce8c9a63ea22ccd1a02a61aae7234f not found: ID does not exist" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.719087 5030 scope.go:117] "RemoveContainer" containerID="9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1" Jan 20 23:22:55 crc kubenswrapper[5030]: E0120 23:22:55.719431 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1\": container with ID starting with 9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1 not found: ID does not exist" containerID="9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.719461 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1"} err="failed to get container status \"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1\": rpc error: code = NotFound desc = could not find container \"9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1\": container with ID starting with 9700105b48303ed87d20f15125744276f543cd829322b0cf3045c232e4faaaa1 not found: ID does not exist" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780197 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pp9b\" (UniqueName: \"kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780297 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.780425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.881965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pp9b\" (UniqueName: \"kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.882980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.888783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.889239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.899371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.900538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.902128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pp9b\" (UniqueName: \"kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b\") pod \"ceilometer-0\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.975941 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:22:55 crc kubenswrapper[5030]: I0120 23:22:55.978318 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fa23d5-94e0-4700-bdbe-9ec06a2653ae" path="/var/lib/kubelet/pods/70fa23d5-94e0-4700-bdbe-9ec06a2653ae/volumes" Jan 20 23:22:56 crc kubenswrapper[5030]: I0120 23:22:56.475439 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:22:56 crc kubenswrapper[5030]: I0120 23:22:56.561209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerStarted","Data":"38b90920ffd9b22e85cf35dce4d39c76c6e383d266994450394fa70cbed3614b"} Jan 20 23:22:56 crc kubenswrapper[5030]: E0120 23:22:56.758724 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:56 crc kubenswrapper[5030]: E0120 23:22:56.760290 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:56 crc kubenswrapper[5030]: E0120 23:22:56.761611 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:22:56 crc kubenswrapper[5030]: E0120 23:22:56.761749 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:22:57 crc kubenswrapper[5030]: I0120 23:22:57.580358 5030 generic.go:334] "Generic (PLEG): container finished" podID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerID="b5506cabba75bf7e05d8db87d86c3eac91db00d34f34bfd2358c716aa588e450" exitCode=0 Jan 20 23:22:57 crc kubenswrapper[5030]: I0120 23:22:57.580441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerDied","Data":"b5506cabba75bf7e05d8db87d86c3eac91db00d34f34bfd2358c716aa588e450"} Jan 20 23:22:57 crc kubenswrapper[5030]: I0120 23:22:57.584031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerStarted","Data":"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01"} Jan 20 23:22:58 crc kubenswrapper[5030]: I0120 23:22:58.610215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" 
event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerStarted","Data":"01e73d8538d627c7dbb4df4cf4dbae2a423b40dc08e400431c26c002ffb63152"} Jan 20 23:22:58 crc kubenswrapper[5030]: I0120 23:22:58.613013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerStarted","Data":"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6"} Jan 20 23:22:58 crc kubenswrapper[5030]: I0120 23:22:58.649006 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mk5qx" podStartSLOduration=3.149137571 podStartE2EDuration="5.64898049s" podCreationTimestamp="2026-01-20 23:22:53 +0000 UTC" firstStartedPulling="2026-01-20 23:22:55.544427001 +0000 UTC m=+2847.864687289" lastFinishedPulling="2026-01-20 23:22:58.04426991 +0000 UTC m=+2850.364530208" observedRunningTime="2026-01-20 23:22:58.629503564 +0000 UTC m=+2850.949763862" watchObservedRunningTime="2026-01-20 23:22:58.64898049 +0000 UTC m=+2850.969240798" Jan 20 23:22:59 crc kubenswrapper[5030]: I0120 23:22:59.633420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerStarted","Data":"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e"} Jan 20 23:22:59 crc kubenswrapper[5030]: I0120 23:22:59.858012 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:59 crc kubenswrapper[5030]: I0120 23:22:59.858089 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:59 crc kubenswrapper[5030]: I0120 23:22:59.907586 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:22:59 crc kubenswrapper[5030]: I0120 23:22:59.914289 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:23:00 crc kubenswrapper[5030]: I0120 23:23:00.650658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerStarted","Data":"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92"} Jan 20 23:23:00 crc kubenswrapper[5030]: I0120 23:23:00.651218 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:23:00 crc kubenswrapper[5030]: I0120 23:23:00.651267 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:23:00 crc kubenswrapper[5030]: I0120 23:23:00.697455 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.079682977 podStartE2EDuration="5.697420914s" podCreationTimestamp="2026-01-20 23:22:55 +0000 UTC" firstStartedPulling="2026-01-20 23:22:56.479271969 +0000 UTC m=+2848.799532297" lastFinishedPulling="2026-01-20 23:23:00.097009936 +0000 UTC m=+2852.417270234" observedRunningTime="2026-01-20 23:23:00.6934877 +0000 UTC m=+2853.013748018" watchObservedRunningTime="2026-01-20 23:23:00.697420914 +0000 UTC m=+2853.017681232" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 
23:23:01.306438 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.306827 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.356951 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.378120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.660783 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.662903 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:01 crc kubenswrapper[5030]: I0120 23:23:01.663104 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:01 crc kubenswrapper[5030]: E0120 23:23:01.754598 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:01 crc kubenswrapper[5030]: E0120 23:23:01.756544 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:01 crc kubenswrapper[5030]: E0120 23:23:01.757892 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:01 crc kubenswrapper[5030]: E0120 23:23:01.757953 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:02 crc kubenswrapper[5030]: I0120 23:23:02.398515 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:23:02 crc kubenswrapper[5030]: I0120 23:23:02.400252 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:23:03 crc kubenswrapper[5030]: I0120 23:23:03.474368 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:03 crc kubenswrapper[5030]: I0120 23:23:03.491988 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:23:04 crc kubenswrapper[5030]: I0120 23:23:04.268295 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:04 crc kubenswrapper[5030]: I0120 23:23:04.268370 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:04 crc kubenswrapper[5030]: I0120 23:23:04.358669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:04 crc kubenswrapper[5030]: I0120 23:23:04.767142 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:04 crc kubenswrapper[5030]: I0120 23:23:04.836288 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:23:06 crc kubenswrapper[5030]: I0120 23:23:06.724704 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mk5qx" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="registry-server" containerID="cri-o://01e73d8538d627c7dbb4df4cf4dbae2a423b40dc08e400431c26c002ffb63152" gracePeriod=2 Jan 20 23:23:06 crc kubenswrapper[5030]: E0120 23:23:06.757044 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:06 crc kubenswrapper[5030]: E0120 23:23:06.760170 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:06 crc kubenswrapper[5030]: E0120 23:23:06.762304 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:06 crc kubenswrapper[5030]: E0120 23:23:06.762403 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.735834 5030 generic.go:334] "Generic (PLEG): container finished" podID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerID="01e73d8538d627c7dbb4df4cf4dbae2a423b40dc08e400431c26c002ffb63152" exitCode=0 Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.735877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerDied","Data":"01e73d8538d627c7dbb4df4cf4dbae2a423b40dc08e400431c26c002ffb63152"} 
Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.828895 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.924817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities\") pod \"22be905b-8370-4260-b4a8-9ed7fe5995a6\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.924976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content\") pod \"22be905b-8370-4260-b4a8-9ed7fe5995a6\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.925265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94txk\" (UniqueName: \"kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk\") pod \"22be905b-8370-4260-b4a8-9ed7fe5995a6\" (UID: \"22be905b-8370-4260-b4a8-9ed7fe5995a6\") " Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.926580 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities" (OuterVolumeSpecName: "utilities") pod "22be905b-8370-4260-b4a8-9ed7fe5995a6" (UID: "22be905b-8370-4260-b4a8-9ed7fe5995a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.930606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk" (OuterVolumeSpecName: "kube-api-access-94txk") pod "22be905b-8370-4260-b4a8-9ed7fe5995a6" (UID: "22be905b-8370-4260-b4a8-9ed7fe5995a6"). InnerVolumeSpecName "kube-api-access-94txk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:07 crc kubenswrapper[5030]: I0120 23:23:07.945968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22be905b-8370-4260-b4a8-9ed7fe5995a6" (UID: "22be905b-8370-4260-b4a8-9ed7fe5995a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.026527 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.026567 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22be905b-8370-4260-b4a8-9ed7fe5995a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.026577 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94txk\" (UniqueName: \"kubernetes.io/projected/22be905b-8370-4260-b4a8-9ed7fe5995a6-kube-api-access-94txk\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.752504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mk5qx" event={"ID":"22be905b-8370-4260-b4a8-9ed7fe5995a6","Type":"ContainerDied","Data":"c134b73ba90b7f221f95fd14289a7e5ba24ce7f44dda9769f3fcedbb496939c9"} Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.752615 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mk5qx" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.752951 5030 scope.go:117] "RemoveContainer" containerID="01e73d8538d627c7dbb4df4cf4dbae2a423b40dc08e400431c26c002ffb63152" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.790533 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.794015 5030 scope.go:117] "RemoveContainer" containerID="b5506cabba75bf7e05d8db87d86c3eac91db00d34f34bfd2358c716aa588e450" Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.807571 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mk5qx"] Jan 20 23:23:08 crc kubenswrapper[5030]: I0120 23:23:08.830785 5030 scope.go:117] "RemoveContainer" containerID="ff1898650e81f71618798f8630357dd295a63eefc629c41319a839a7e3068d32" Jan 20 23:23:09 crc kubenswrapper[5030]: I0120 23:23:09.977752 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" path="/var/lib/kubelet/pods/22be905b-8370-4260-b4a8-9ed7fe5995a6/volumes" Jan 20 23:23:11 crc kubenswrapper[5030]: E0120 23:23:11.755714 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:11 crc kubenswrapper[5030]: E0120 23:23:11.757613 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:11 crc kubenswrapper[5030]: E0120 23:23:11.759982 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:23:11 crc kubenswrapper[5030]: E0120 23:23:11.760058 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.824536 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" exitCode=137 Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.824682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"bf61ebcb-4ade-4329-8038-0306cc15e260","Type":"ContainerDied","Data":"8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981"} Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.825168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"bf61ebcb-4ade-4329-8038-0306cc15e260","Type":"ContainerDied","Data":"ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e"} Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.825203 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebcc13f77839c030b7ef2e3cec6e70f3f3b0e1b34845dfeef83641c68966ad9e" Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.846616 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.957750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data\") pod \"bf61ebcb-4ade-4329-8038-0306cc15e260\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.957883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f7mv\" (UniqueName: \"kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv\") pod \"bf61ebcb-4ade-4329-8038-0306cc15e260\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.957907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle\") pod \"bf61ebcb-4ade-4329-8038-0306cc15e260\" (UID: \"bf61ebcb-4ade-4329-8038-0306cc15e260\") " Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.965281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv" (OuterVolumeSpecName: "kube-api-access-2f7mv") pod "bf61ebcb-4ade-4329-8038-0306cc15e260" (UID: "bf61ebcb-4ade-4329-8038-0306cc15e260"). InnerVolumeSpecName "kube-api-access-2f7mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:13 crc kubenswrapper[5030]: I0120 23:23:13.999237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data" (OuterVolumeSpecName: "config-data") pod "bf61ebcb-4ade-4329-8038-0306cc15e260" (UID: "bf61ebcb-4ade-4329-8038-0306cc15e260"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.010970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf61ebcb-4ade-4329-8038-0306cc15e260" (UID: "bf61ebcb-4ade-4329-8038-0306cc15e260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.059311 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.059352 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f7mv\" (UniqueName: \"kubernetes.io/projected/bf61ebcb-4ade-4329-8038-0306cc15e260-kube-api-access-2f7mv\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.059368 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf61ebcb-4ade-4329-8038-0306cc15e260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.837191 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.893668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.916169 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.934048 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:23:14 crc kubenswrapper[5030]: E0120 23:23:14.934653 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="extract-utilities" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.934681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="extract-utilities" Jan 20 23:23:14 crc kubenswrapper[5030]: E0120 23:23:14.934717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="extract-content" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.934731 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="extract-content" Jan 20 23:23:14 crc kubenswrapper[5030]: E0120 23:23:14.934762 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.934776 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:14 crc kubenswrapper[5030]: E0120 23:23:14.934806 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="registry-server" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.934819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="registry-server" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.935162 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" containerName="nova-cell0-conductor-conductor" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.935197 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="22be905b-8370-4260-b4a8-9ed7fe5995a6" containerName="registry-server" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.936231 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.952205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.954266 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-qp6jt" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.978280 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.981588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.981667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqg64\" (UniqueName: \"kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:14 crc kubenswrapper[5030]: I0120 23:23:14.981711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.083138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.083191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqg64\" (UniqueName: \"kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.083221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.089221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.090432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.099874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqg64\" (UniqueName: \"kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64\") pod \"nova-cell0-conductor-0\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.285485 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.586159 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.848044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"c458e9a2-dd7c-4835-88f2-dcd41887e658","Type":"ContainerStarted","Data":"e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54"} Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.848092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"c458e9a2-dd7c-4835-88f2-dcd41887e658","Type":"ContainerStarted","Data":"502a28c098660efaba5078cf04781ee92fe3f9947c2fe0dc97bcfa39d2e6d39f"} Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.848213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.867122 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.867103294 podStartE2EDuration="1.867103294s" podCreationTimestamp="2026-01-20 23:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:15.866479429 +0000 UTC m=+2868.186739727" watchObservedRunningTime="2026-01-20 23:23:15.867103294 +0000 UTC m=+2868.187363582" Jan 20 23:23:15 crc kubenswrapper[5030]: I0120 23:23:15.978411 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf61ebcb-4ade-4329-8038-0306cc15e260" path="/var/lib/kubelet/pods/bf61ebcb-4ade-4329-8038-0306cc15e260/volumes" Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.325847 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.891817 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v"] Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.893651 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.896547 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.896999 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:23:20 crc kubenswrapper[5030]: I0120 23:23:20.914465 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.003210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.003401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.003429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x95zs\" (UniqueName: \"kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.003453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.006206 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.008569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.022193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.023196 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.078205 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.083021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.088183 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.103605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.104364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x95zs\" (UniqueName: \"kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.104399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.104450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.105992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.121774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.127359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.127679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.135100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x95zs\" (UniqueName: \"kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs\") pod \"nova-cell0-cell-mapping-rbr7v\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.150939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.152233 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.169486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.182082 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvbn\" (UniqueName: \"kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 
crc kubenswrapper[5030]: I0120 23:23:21.208775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcjd\" (UniqueName: \"kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.208998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.224996 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.227970 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.229102 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.231778 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.262639 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.313464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.313572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.313608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.325170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.329897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6vj\" (UniqueName: \"kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvbn\" (UniqueName: \"kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcjd\" (UniqueName: \"kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.341934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.346269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data\") pod \"nova-metadata-0\" (UID: 
\"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.354827 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.358818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.360200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.360979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.375317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvbn\" (UniqueName: \"kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.375824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcjd\" (UniqueName: \"kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd\") pod \"nova-metadata-0\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.378061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk\") pod \"nova-api-0\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.401014 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.443381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.444144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.444212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6vj\" (UniqueName: \"kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.455285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.457721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.461942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6vj\" (UniqueName: \"kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj\") pod \"nova-scheduler-0\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.515015 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.629222 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.675341 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.798335 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v"] Jan 20 23:23:21 crc kubenswrapper[5030]: W0120 23:23:21.811708 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22cb8656_1df5_4761_92bd_b91c969df165.slice/crio-6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2 WatchSource:0}: Error finding container 6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2: Status 404 returned error can't find the container with id 6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2 Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.918424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" event={"ID":"22cb8656-1df5-4761-92bd-b91c969df165","Type":"ContainerStarted","Data":"6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2"} Jan 20 23:23:21 crc kubenswrapper[5030]: I0120 23:23:21.920652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:21 crc kubenswrapper[5030]: W0120 23:23:21.922651 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9efad63e_c6bc_42da_bec7_44c26db293da.slice/crio-49994ea8faddcfe2d7c2e1f49e66d37f96a73f4f53a9a7ad2d8374d1048a99f5 WatchSource:0}: Error finding container 49994ea8faddcfe2d7c2e1f49e66d37f96a73f4f53a9a7ad2d8374d1048a99f5: Status 404 returned error can't find the container with id 49994ea8faddcfe2d7c2e1f49e66d37f96a73f4f53a9a7ad2d8374d1048a99f5 Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.005534 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.006742 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.008982 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.009243 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.014278 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.047467 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.063414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.063883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.063991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktlf9\" (UniqueName: \"kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.064144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: W0120 23:23:22.067001 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4339423_65e5_448c_9892_ed409c4b7d97.slice/crio-3eee995c288d6fe29c00c56bec4f893ff1190100bed4d520d694fe1df3463883 WatchSource:0}: Error finding container 3eee995c288d6fe29c00c56bec4f893ff1190100bed4d520d694fe1df3463883: Status 404 returned error can't find the container with id 3eee995c288d6fe29c00c56bec4f893ff1190100bed4d520d694fe1df3463883 Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.165897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.166202 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.166227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktlf9\" (UniqueName: \"kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.166270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.174824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.177986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.182151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.186534 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.187241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktlf9\" (UniqueName: \"kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9\") pod \"nova-cell1-conductor-db-sync-t7hls\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.223312 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.393908 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:22 crc kubenswrapper[5030]: W0120 23:23:22.828891 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0795a100_bef0_4357_94fb_4cc0d177d748.slice/crio-b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d WatchSource:0}: Error finding container b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d: Status 404 returned error can't find the container with id b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.829769 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls"] Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.929796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" event={"ID":"0795a100-bef0-4357-94fb-4cc0d177d748","Type":"ContainerStarted","Data":"b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.931343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" event={"ID":"22cb8656-1df5-4761-92bd-b91c969df165","Type":"ContainerStarted","Data":"209df5add56e7505cd5ecd36454b3d349bbf9289e03c5cbea49a93aea2a026ed"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.933823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b4339423-65e5-448c-9892-ed409c4b7d97","Type":"ContainerStarted","Data":"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.933895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b4339423-65e5-448c-9892-ed409c4b7d97","Type":"ContainerStarted","Data":"3eee995c288d6fe29c00c56bec4f893ff1190100bed4d520d694fe1df3463883"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.935826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerStarted","Data":"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.935858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerStarted","Data":"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.935871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerStarted","Data":"093e53f846f24852140607a48d509b57c168d560375e9d43569a3cb4294e4183"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.938043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"52ae645c-5c85-4557-b829-695539abf517","Type":"ContainerStarted","Data":"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.938163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" 
event={"ID":"52ae645c-5c85-4557-b829-695539abf517","Type":"ContainerStarted","Data":"3d8e396501064b6dac020207aca15904d48c810dce0cd0f1632943e795c5cff3"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.939660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerStarted","Data":"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.939773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerStarted","Data":"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.939860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerStarted","Data":"49994ea8faddcfe2d7c2e1f49e66d37f96a73f4f53a9a7ad2d8374d1048a99f5"} Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.950312 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" podStartSLOduration=2.950293395 podStartE2EDuration="2.950293395s" podCreationTimestamp="2026-01-20 23:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:22.949810453 +0000 UTC m=+2875.270070741" watchObservedRunningTime="2026-01-20 23:23:22.950293395 +0000 UTC m=+2875.270553673" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.974233 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.9742180280000001 podStartE2EDuration="1.974218028s" podCreationTimestamp="2026-01-20 23:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:22.967848365 +0000 UTC m=+2875.288108653" watchObservedRunningTime="2026-01-20 23:23:22.974218028 +0000 UTC m=+2875.294478316" Jan 20 23:23:22 crc kubenswrapper[5030]: I0120 23:23:22.987278 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.987259741 podStartE2EDuration="1.987259741s" podCreationTimestamp="2026-01-20 23:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:22.982477947 +0000 UTC m=+2875.302738235" watchObservedRunningTime="2026-01-20 23:23:22.987259741 +0000 UTC m=+2875.307520029" Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.001435 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.00141638 podStartE2EDuration="2.00141638s" podCreationTimestamp="2026-01-20 23:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:22.998174643 +0000 UTC m=+2875.318434931" watchObservedRunningTime="2026-01-20 23:23:23.00141638 +0000 UTC m=+2875.321676668" Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.015596 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.01557474 podStartE2EDuration="3.01557474s" podCreationTimestamp="2026-01-20 23:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:23.010586531 +0000 UTC m=+2875.330846819" watchObservedRunningTime="2026-01-20 23:23:23.01557474 +0000 UTC m=+2875.335835028" Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.366861 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.376877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.952099 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" event={"ID":"0795a100-bef0-4357-94fb-4cc0d177d748","Type":"ContainerStarted","Data":"60201df21b4c6bccb15761b5b261ceea64ffcc124083582fca6921054cb76f4d"} Jan 20 23:23:23 crc kubenswrapper[5030]: I0120 23:23:23.982038 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" podStartSLOduration=2.982018146 podStartE2EDuration="2.982018146s" podCreationTimestamp="2026-01-20 23:23:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:23.9784136 +0000 UTC m=+2876.298673888" watchObservedRunningTime="2026-01-20 23:23:23.982018146 +0000 UTC m=+2876.302278444" Jan 20 23:23:24 crc kubenswrapper[5030]: I0120 23:23:24.964546 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-log" containerID="cri-o://9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" gracePeriod=30 Jan 20 23:23:24 crc kubenswrapper[5030]: I0120 23:23:24.964697 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-metadata" containerID="cri-o://e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" gracePeriod=30 Jan 20 23:23:24 crc kubenswrapper[5030]: I0120 23:23:24.965023 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="b4339423-65e5-448c-9892-ed409c4b7d97" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63" gracePeriod=30 Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.553514 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.633028 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.653715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data\") pod \"9efad63e-c6bc-42da-bec7-44c26db293da\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.653856 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs\") pod \"9efad63e-c6bc-42da-bec7-44c26db293da\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.653900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle\") pod \"9efad63e-c6bc-42da-bec7-44c26db293da\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.653949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcjd\" (UniqueName: \"kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd\") pod \"9efad63e-c6bc-42da-bec7-44c26db293da\" (UID: \"9efad63e-c6bc-42da-bec7-44c26db293da\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.655201 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs" (OuterVolumeSpecName: "logs") pod "9efad63e-c6bc-42da-bec7-44c26db293da" (UID: "9efad63e-c6bc-42da-bec7-44c26db293da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.666326 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd" (OuterVolumeSpecName: "kube-api-access-kpcjd") pod "9efad63e-c6bc-42da-bec7-44c26db293da" (UID: "9efad63e-c6bc-42da-bec7-44c26db293da"). InnerVolumeSpecName "kube-api-access-kpcjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.688165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9efad63e-c6bc-42da-bec7-44c26db293da" (UID: "9efad63e-c6bc-42da-bec7-44c26db293da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.688546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data" (OuterVolumeSpecName: "config-data") pod "9efad63e-c6bc-42da-bec7-44c26db293da" (UID: "9efad63e-c6bc-42da-bec7-44c26db293da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.755255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlvbn\" (UniqueName: \"kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn\") pod \"b4339423-65e5-448c-9892-ed409c4b7d97\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.755556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle\") pod \"b4339423-65e5-448c-9892-ed409c4b7d97\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.755708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data\") pod \"b4339423-65e5-448c-9892-ed409c4b7d97\" (UID: \"b4339423-65e5-448c-9892-ed409c4b7d97\") " Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.756083 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.756143 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9efad63e-c6bc-42da-bec7-44c26db293da-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.756198 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9efad63e-c6bc-42da-bec7-44c26db293da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.756265 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcjd\" (UniqueName: \"kubernetes.io/projected/9efad63e-c6bc-42da-bec7-44c26db293da-kube-api-access-kpcjd\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.759438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn" (OuterVolumeSpecName: "kube-api-access-mlvbn") pod "b4339423-65e5-448c-9892-ed409c4b7d97" (UID: "b4339423-65e5-448c-9892-ed409c4b7d97"). InnerVolumeSpecName "kube-api-access-mlvbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.778544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data" (OuterVolumeSpecName: "config-data") pod "b4339423-65e5-448c-9892-ed409c4b7d97" (UID: "b4339423-65e5-448c-9892-ed409c4b7d97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.779026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4339423-65e5-448c-9892-ed409c4b7d97" (UID: "b4339423-65e5-448c-9892-ed409c4b7d97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.858062 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.858295 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlvbn\" (UniqueName: \"kubernetes.io/projected/b4339423-65e5-448c-9892-ed409c4b7d97-kube-api-access-mlvbn\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.858384 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4339423-65e5-448c-9892-ed409c4b7d97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.978413 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4339423-65e5-448c-9892-ed409c4b7d97" containerID="8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63" exitCode=0 Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.978574 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.979592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b4339423-65e5-448c-9892-ed409c4b7d97","Type":"ContainerDied","Data":"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63"} Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.979665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b4339423-65e5-448c-9892-ed409c4b7d97","Type":"ContainerDied","Data":"3eee995c288d6fe29c00c56bec4f893ff1190100bed4d520d694fe1df3463883"} Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.979688 5030 scope.go:117] "RemoveContainer" containerID="8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986024 5030 generic.go:334] "Generic (PLEG): container finished" podID="9efad63e-c6bc-42da-bec7-44c26db293da" containerID="e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" exitCode=0 Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986052 5030 generic.go:334] "Generic (PLEG): container finished" podID="9efad63e-c6bc-42da-bec7-44c26db293da" containerID="9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" exitCode=143 Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerDied","Data":"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52"} Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986138 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerDied","Data":"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3"} Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986160 5030 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:25 crc kubenswrapper[5030]: I0120 23:23:25.986169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9efad63e-c6bc-42da-bec7-44c26db293da","Type":"ContainerDied","Data":"49994ea8faddcfe2d7c2e1f49e66d37f96a73f4f53a9a7ad2d8374d1048a99f5"} Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.047226 5030 scope.go:117] "RemoveContainer" containerID="8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63" Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.050475 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63\": container with ID starting with 8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63 not found: ID does not exist" containerID="8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.050530 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63"} err="failed to get container status \"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63\": rpc error: code = NotFound desc = could not find container \"8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63\": container with ID starting with 8a7ebd49336571b326167929b60504a36eeb7e60aa5afed8a8770983306bda63 not found: ID does not exist" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.050561 5030 scope.go:117] "RemoveContainer" containerID="e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.066323 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.115917 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.118181 5030 scope.go:117] "RemoveContainer" containerID="9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.122842 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.134935 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.143229 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.144577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-log" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.144609 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-log" Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.144637 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4339423-65e5-448c-9892-ed409c4b7d97" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.144645 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b4339423-65e5-448c-9892-ed409c4b7d97" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.144658 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-metadata" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.144665 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-metadata" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.145014 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4339423-65e5-448c-9892-ed409c4b7d97" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.145036 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-log" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.145051 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" containerName="nova-metadata-metadata" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.145695 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.148160 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.148478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.148706 5030 scope.go:117] "RemoveContainer" containerID="e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.148946 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.149512 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52\": container with ID starting with e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52 not found: ID does not exist" containerID="e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.149554 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52"} err="failed to get container status \"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52\": rpc error: code = NotFound desc = could not find container \"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52\": container with ID starting with e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52 not found: ID does not exist" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.149575 5030 scope.go:117] "RemoveContainer" containerID="9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" Jan 20 23:23:26 crc kubenswrapper[5030]: E0120 23:23:26.150187 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3\": container with ID starting with 9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3 not found: ID does not exist" containerID="9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.150209 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3"} err="failed to get container status \"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3\": rpc error: code = NotFound desc = could not find container \"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3\": container with ID starting with 9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3 not found: ID does not exist" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.150223 5030 scope.go:117] "RemoveContainer" containerID="e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.150543 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52"} err="failed to get container status \"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52\": rpc error: code = NotFound desc = could not find container \"e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52\": container with ID starting with e4e078d3bb263e745ad924ee12720b15c784dc79fe139aeb2aef298ffcc56f52 not found: ID does not exist" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.150570 5030 scope.go:117] "RemoveContainer" containerID="9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.150820 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3"} err="failed to get container status \"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3\": rpc error: code = NotFound desc = could not find container \"9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3\": container with ID starting with 9929878d951901b85b65727387e11a9bfdf4ac5d9cb361bf4b989ee179991ad3 not found: ID does not exist" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.159686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.167859 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.169486 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.172573 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.173762 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.175579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg58t\" (UniqueName: \"kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.264923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.265007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.265119 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.265220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrf8\" (UniqueName: \"kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.265318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.366941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.366989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrf8\" (UniqueName: \"kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg58t\" (UniqueName: \"kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.367930 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.368153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.368215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.374060 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.374525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.374938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.375814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.375828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.377565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.384647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg58t\" (UniqueName: \"kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.388417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.393993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrf8\" (UniqueName: \"kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.459667 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.482239 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.675939 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:26 crc kubenswrapper[5030]: I0120 23:23:26.956910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:23:26 crc kubenswrapper[5030]: W0120 23:23:26.958725 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0ffc1d_cef9_4431_8f48_012c129e75b5.slice/crio-de86ae1d4e3bac06b48ef5982b4e4ea557972eb3a4c0b2b46eac9c82860798d4 WatchSource:0}: Error finding container de86ae1d4e3bac06b48ef5982b4e4ea557972eb3a4c0b2b46eac9c82860798d4: Status 404 returned error can't find the container with id de86ae1d4e3bac06b48ef5982b4e4ea557972eb3a4c0b2b46eac9c82860798d4 Jan 20 23:23:27 crc kubenswrapper[5030]: I0120 23:23:27.010971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"aa0ffc1d-cef9-4431-8f48-012c129e75b5","Type":"ContainerStarted","Data":"de86ae1d4e3bac06b48ef5982b4e4ea557972eb3a4c0b2b46eac9c82860798d4"} Jan 20 23:23:27 crc kubenswrapper[5030]: I0120 23:23:27.018902 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:27 crc kubenswrapper[5030]: W0120 23:23:27.040761 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f30f5e5_c42e_4834_a13e_55b8ea08c459.slice/crio-bba3a4772c6810fa2687347e1e0588bc2e5a2e23ec2dc16f052760b76dfd38b0 WatchSource:0}: Error finding container bba3a4772c6810fa2687347e1e0588bc2e5a2e23ec2dc16f052760b76dfd38b0: Status 404 returned error can't find the container with id bba3a4772c6810fa2687347e1e0588bc2e5a2e23ec2dc16f052760b76dfd38b0 Jan 20 23:23:27 crc kubenswrapper[5030]: I0120 23:23:27.977018 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efad63e-c6bc-42da-bec7-44c26db293da" path="/var/lib/kubelet/pods/9efad63e-c6bc-42da-bec7-44c26db293da/volumes" Jan 20 23:23:27 crc kubenswrapper[5030]: I0120 23:23:27.978179 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4339423-65e5-448c-9892-ed409c4b7d97" path="/var/lib/kubelet/pods/b4339423-65e5-448c-9892-ed409c4b7d97/volumes" Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.030446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerStarted","Data":"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb"} Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.031398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerStarted","Data":"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a"} Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.031640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerStarted","Data":"bba3a4772c6810fa2687347e1e0588bc2e5a2e23ec2dc16f052760b76dfd38b0"} Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.032390 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"aa0ffc1d-cef9-4431-8f48-012c129e75b5","Type":"ContainerStarted","Data":"8a48f49c06913d02cc09946d1fe836b60218c213d31a68d5e74ac65345df2ebc"} Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.077041 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.077019857 podStartE2EDuration="2.077019857s" podCreationTimestamp="2026-01-20 23:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:28.072479338 +0000 UTC m=+2880.392739626" watchObservedRunningTime="2026-01-20 23:23:28.077019857 +0000 UTC m=+2880.397280145" Jan 20 23:23:28 crc kubenswrapper[5030]: I0120 23:23:28.077632 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.077611822 podStartE2EDuration="2.077611822s" podCreationTimestamp="2026-01-20 23:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:28.049856946 +0000 UTC m=+2880.370117254" watchObservedRunningTime="2026-01-20 23:23:28.077611822 +0000 UTC m=+2880.397872110" Jan 20 23:23:29 crc kubenswrapper[5030]: I0120 23:23:29.053434 5030 generic.go:334] "Generic (PLEG): container finished" podID="0795a100-bef0-4357-94fb-4cc0d177d748" containerID="60201df21b4c6bccb15761b5b261ceea64ffcc124083582fca6921054cb76f4d" exitCode=0 Jan 20 23:23:29 crc kubenswrapper[5030]: I0120 23:23:29.053790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" event={"ID":"0795a100-bef0-4357-94fb-4cc0d177d748","Type":"ContainerDied","Data":"60201df21b4c6bccb15761b5b261ceea64ffcc124083582fca6921054cb76f4d"} Jan 20 23:23:29 crc kubenswrapper[5030]: I0120 23:23:29.056777 5030 generic.go:334] "Generic (PLEG): container finished" podID="22cb8656-1df5-4761-92bd-b91c969df165" containerID="209df5add56e7505cd5ecd36454b3d349bbf9289e03c5cbea49a93aea2a026ed" exitCode=0 Jan 20 23:23:29 crc kubenswrapper[5030]: I0120 23:23:29.056842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" event={"ID":"22cb8656-1df5-4761-92bd-b91c969df165","Type":"ContainerDied","Data":"209df5add56e7505cd5ecd36454b3d349bbf9289e03c5cbea49a93aea2a026ed"} Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.397315 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.397592 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" containerName="kube-state-metrics" containerID="cri-o://5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75" gracePeriod=30 Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.622614 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.629137 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts\") pod \"22cb8656-1df5-4761-92bd-b91c969df165\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data\") pod \"0795a100-bef0-4357-94fb-4cc0d177d748\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktlf9\" (UniqueName: \"kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9\") pod \"0795a100-bef0-4357-94fb-4cc0d177d748\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x95zs\" (UniqueName: \"kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs\") pod \"22cb8656-1df5-4761-92bd-b91c969df165\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle\") pod \"0795a100-bef0-4357-94fb-4cc0d177d748\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660947 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle\") pod \"22cb8656-1df5-4761-92bd-b91c969df165\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.660992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data\") pod \"22cb8656-1df5-4761-92bd-b91c969df165\" (UID: \"22cb8656-1df5-4761-92bd-b91c969df165\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.661054 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts\") pod \"0795a100-bef0-4357-94fb-4cc0d177d748\" (UID: \"0795a100-bef0-4357-94fb-4cc0d177d748\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.668106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs" (OuterVolumeSpecName: "kube-api-access-x95zs") pod "22cb8656-1df5-4761-92bd-b91c969df165" (UID: "22cb8656-1df5-4761-92bd-b91c969df165"). InnerVolumeSpecName "kube-api-access-x95zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.668992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9" (OuterVolumeSpecName: "kube-api-access-ktlf9") pod "0795a100-bef0-4357-94fb-4cc0d177d748" (UID: "0795a100-bef0-4357-94fb-4cc0d177d748"). InnerVolumeSpecName "kube-api-access-ktlf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.670751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts" (OuterVolumeSpecName: "scripts") pod "0795a100-bef0-4357-94fb-4cc0d177d748" (UID: "0795a100-bef0-4357-94fb-4cc0d177d748"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.683952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts" (OuterVolumeSpecName: "scripts") pod "22cb8656-1df5-4761-92bd-b91c969df165" (UID: "22cb8656-1df5-4761-92bd-b91c969df165"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.714656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data" (OuterVolumeSpecName: "config-data") pod "22cb8656-1df5-4761-92bd-b91c969df165" (UID: "22cb8656-1df5-4761-92bd-b91c969df165"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.722183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0795a100-bef0-4357-94fb-4cc0d177d748" (UID: "0795a100-bef0-4357-94fb-4cc0d177d748"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.723193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22cb8656-1df5-4761-92bd-b91c969df165" (UID: "22cb8656-1df5-4761-92bd-b91c969df165"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.733927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data" (OuterVolumeSpecName: "config-data") pod "0795a100-bef0-4357-94fb-4cc0d177d748" (UID: "0795a100-bef0-4357-94fb-4cc0d177d748"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763202 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x95zs\" (UniqueName: \"kubernetes.io/projected/22cb8656-1df5-4761-92bd-b91c969df165-kube-api-access-x95zs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763238 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763248 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763258 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763270 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763278 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22cb8656-1df5-4761-92bd-b91c969df165-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763286 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0795a100-bef0-4357-94fb-4cc0d177d748-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.763294 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktlf9\" (UniqueName: \"kubernetes.io/projected/0795a100-bef0-4357-94fb-4cc0d177d748-kube-api-access-ktlf9\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.774777 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.863958 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx65g\" (UniqueName: \"kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g\") pod \"f2ec21ef-c0b8-4832-a46a-c50fe992fde9\" (UID: \"f2ec21ef-c0b8-4832-a46a-c50fe992fde9\") " Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.868170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g" (OuterVolumeSpecName: "kube-api-access-jx65g") pod "f2ec21ef-c0b8-4832-a46a-c50fe992fde9" (UID: "f2ec21ef-c0b8-4832-a46a-c50fe992fde9"). InnerVolumeSpecName "kube-api-access-jx65g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:30 crc kubenswrapper[5030]: I0120 23:23:30.966554 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx65g\" (UniqueName: \"kubernetes.io/projected/f2ec21ef-c0b8-4832-a46a-c50fe992fde9-kube-api-access-jx65g\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.083491 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" containerID="5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75" exitCode=2 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.083573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f2ec21ef-c0b8-4832-a46a-c50fe992fde9","Type":"ContainerDied","Data":"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75"} Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.083630 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f2ec21ef-c0b8-4832-a46a-c50fe992fde9","Type":"ContainerDied","Data":"6c3292db84d61aeccd8e8321ca97fd1d37ce0aca764548090b67ed23db272f2d"} Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.083650 5030 scope.go:117] "RemoveContainer" containerID="5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.084346 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.085747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" event={"ID":"0795a100-bef0-4357-94fb-4cc0d177d748","Type":"ContainerDied","Data":"b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d"} Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.085813 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4285d41cecf3ba5ecb53dd6c3c23366d6eceb17a3c0ccb9c903e147899c5a1d" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.085764 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.087913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" event={"ID":"22cb8656-1df5-4761-92bd-b91c969df165","Type":"ContainerDied","Data":"6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2"} Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.087948 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b2a4be85cb0bf26f70701cad6c2a8ae962b68788990f88441ea26e1eade6dd2" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.087955 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.118866 5030 scope.go:117] "RemoveContainer" containerID="5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75" Jan 20 23:23:31 crc kubenswrapper[5030]: E0120 23:23:31.123324 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75\": container with ID starting with 5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75 not found: ID does not exist" containerID="5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.123389 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75"} err="failed to get container status \"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75\": rpc error: code = NotFound desc = could not find container \"5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75\": container with ID starting with 5d18a13475f50e3d0a56e4d7838890360d98df9d2995559fe889ad74e97e6e75 not found: ID does not exist" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.150467 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.169682 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202153 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: E0120 23:23:31.202613 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22cb8656-1df5-4761-92bd-b91c969df165" containerName="nova-manage" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202647 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="22cb8656-1df5-4761-92bd-b91c969df165" containerName="nova-manage" Jan 20 23:23:31 crc kubenswrapper[5030]: E0120 23:23:31.202663 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" containerName="kube-state-metrics" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202671 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" containerName="kube-state-metrics" Jan 20 23:23:31 crc kubenswrapper[5030]: E0120 23:23:31.202694 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0795a100-bef0-4357-94fb-4cc0d177d748" containerName="nova-cell1-conductor-db-sync" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202700 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795a100-bef0-4357-94fb-4cc0d177d748" containerName="nova-cell1-conductor-db-sync" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202866 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" containerName="kube-state-metrics" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202885 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="22cb8656-1df5-4761-92bd-b91c969df165" containerName="nova-manage" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.202899 5030 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0795a100-bef0-4357-94fb-4cc0d177d748" containerName="nova-cell1-conductor-db-sync" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.203541 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.206059 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.209886 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.210977 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.212308 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.213551 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.218264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.229699 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.272972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273098 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff5b\" (UniqueName: \"kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gzd\" (UniqueName: \"kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd\") pod \"kube-state-metrics-0\" (UID: 
\"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.273391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ff5b\" (UniqueName: \"kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.375377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gzd\" (UniqueName: 
\"kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.380875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.381856 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.382681 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.383029 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-api" containerID="cri-o://e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" gracePeriod=30 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.383300 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-log" containerID="cri-o://7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" gracePeriod=30 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.384129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.385430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.385439 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.391664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.391929 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="52ae645c-5c85-4557-b829-695539abf517" containerName="nova-scheduler-scheduler" containerID="cri-o://90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd" 
gracePeriod=30 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.401895 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.403497 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-log" containerID="cri-o://4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" gracePeriod=30 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.404451 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-metadata" containerID="cri-o://e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" gracePeriod=30 Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.405421 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ff5b\" (UniqueName: \"kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b\") pod \"nova-cell1-conductor-0\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.405430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gzd\" (UniqueName: \"kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd\") pod \"kube-state-metrics-0\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.459954 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.482914 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.482968 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.531681 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.543593 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.924390 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.966237 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.978719 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ec21ef-c0b8-4832-a46a-c50fe992fde9" path="/var/lib/kubelet/pods/f2ec21ef-c0b8-4832-a46a-c50fe992fde9/volumes" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data\") pod \"b242b52e-0bb1-4d87-9138-0ca250dd5544\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle\") pod \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs\") pod \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data\") pod \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs\") pod \"b242b52e-0bb1-4d87-9138-0ca250dd5544\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle\") pod \"b242b52e-0bb1-4d87-9138-0ca250dd5544\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs\") pod \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg58t\" (UniqueName: \"kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t\") pod \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\" (UID: \"9f30f5e5-c42e-4834-a13e-55b8ea08c459\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.983771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk\") pod \"b242b52e-0bb1-4d87-9138-0ca250dd5544\" (UID: \"b242b52e-0bb1-4d87-9138-0ca250dd5544\") " Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 
23:23:31.984414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs" (OuterVolumeSpecName: "logs") pod "9f30f5e5-c42e-4834-a13e-55b8ea08c459" (UID: "9f30f5e5-c42e-4834-a13e-55b8ea08c459"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.984549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs" (OuterVolumeSpecName: "logs") pod "b242b52e-0bb1-4d87-9138-0ca250dd5544" (UID: "b242b52e-0bb1-4d87-9138-0ca250dd5544"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.989358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t" (OuterVolumeSpecName: "kube-api-access-mg58t") pod "9f30f5e5-c42e-4834-a13e-55b8ea08c459" (UID: "9f30f5e5-c42e-4834-a13e-55b8ea08c459"). InnerVolumeSpecName "kube-api-access-mg58t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:31 crc kubenswrapper[5030]: I0120 23:23:31.990106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk" (OuterVolumeSpecName: "kube-api-access-6rvwk") pod "b242b52e-0bb1-4d87-9138-0ca250dd5544" (UID: "b242b52e-0bb1-4d87-9138-0ca250dd5544"). InnerVolumeSpecName "kube-api-access-6rvwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.016590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data" (OuterVolumeSpecName: "config-data") pod "b242b52e-0bb1-4d87-9138-0ca250dd5544" (UID: "b242b52e-0bb1-4d87-9138-0ca250dd5544"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.018102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f30f5e5-c42e-4834-a13e-55b8ea08c459" (UID: "9f30f5e5-c42e-4834-a13e-55b8ea08c459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.018649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data" (OuterVolumeSpecName: "config-data") pod "9f30f5e5-c42e-4834-a13e-55b8ea08c459" (UID: "9f30f5e5-c42e-4834-a13e-55b8ea08c459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.034450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b242b52e-0bb1-4d87-9138-0ca250dd5544" (UID: "b242b52e-0bb1-4d87-9138-0ca250dd5544"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.045704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9f30f5e5-c42e-4834-a13e-55b8ea08c459" (UID: "9f30f5e5-c42e-4834-a13e-55b8ea08c459"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086536 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg58t\" (UniqueName: \"kubernetes.io/projected/9f30f5e5-c42e-4834-a13e-55b8ea08c459-kube-api-access-mg58t\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086596 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rvwk\" (UniqueName: \"kubernetes.io/projected/b242b52e-0bb1-4d87-9138-0ca250dd5544-kube-api-access-6rvwk\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086615 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086657 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086673 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f30f5e5-c42e-4834-a13e-55b8ea08c459-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086689 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086704 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b242b52e-0bb1-4d87-9138-0ca250dd5544-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086719 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b242b52e-0bb1-4d87-9138-0ca250dd5544-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.086734 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f30f5e5-c42e-4834-a13e-55b8ea08c459-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107370 5030 generic.go:334] "Generic (PLEG): container finished" podID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerID="e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" exitCode=0 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107402 5030 generic.go:334] "Generic (PLEG): container finished" podID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerID="4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" exitCode=143 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107443 5030 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerDied","Data":"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerDied","Data":"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9f30f5e5-c42e-4834-a13e-55b8ea08c459","Type":"ContainerDied","Data":"bba3a4772c6810fa2687347e1e0588bc2e5a2e23ec2dc16f052760b76dfd38b0"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.107544 5030 scope.go:117] "RemoveContainer" containerID="e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.118821 5030 generic.go:334] "Generic (PLEG): container finished" podID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerID="e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" exitCode=0 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.118851 5030 generic.go:334] "Generic (PLEG): container finished" podID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerID="7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" exitCode=143 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.119030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerDied","Data":"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.119062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerDied","Data":"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.119072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b242b52e-0bb1-4d87-9138-0ca250dd5544","Type":"ContainerDied","Data":"093e53f846f24852140607a48d509b57c168d560375e9d43569a3cb4294e4183"} Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.119128 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.122918 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: W0120 23:23:32.124581 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0194cff7_fd9f_4d47_8edd_eae7cd25c987.slice/crio-96383fc6688c12fc4ca8d54b8a6a421d53e687489e44cbc82ef0a736e985cfc1 WatchSource:0}: Error finding container 96383fc6688c12fc4ca8d54b8a6a421d53e687489e44cbc82ef0a736e985cfc1: Status 404 returned error can't find the container with id 96383fc6688c12fc4ca8d54b8a6a421d53e687489e44cbc82ef0a736e985cfc1 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.140772 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.145440 5030 scope.go:117] "RemoveContainer" containerID="4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.151811 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.160729 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.170571 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.177907 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.186678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.187126 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-log" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-log" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.187155 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-api" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187162 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-api" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.187175 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-metadata" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187182 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-metadata" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.187195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-log" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187200 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-log" Jan 20 23:23:32 crc 
kubenswrapper[5030]: I0120 23:23:32.187385 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-metadata" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-log" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187413 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" containerName="nova-api-api" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187425 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" containerName="nova-metadata-log" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.187921 5030 scope.go:117] "RemoveContainer" containerID="e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.188326 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.189477 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb\": container with ID starting with e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb not found: ID does not exist" containerID="e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.189530 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb"} err="failed to get container status \"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb\": rpc error: code = NotFound desc = could not find container \"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb\": container with ID starting with e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.189569 5030 scope.go:117] "RemoveContainer" containerID="4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.189955 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a\": container with ID starting with 4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a not found: ID does not exist" containerID="4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.189995 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a"} err="failed to get container status \"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a\": rpc error: code = NotFound desc = could not find container \"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a\": container with ID starting with 4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.190019 5030 scope.go:117] "RemoveContainer" 
containerID="e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.190682 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.190944 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb"} err="failed to get container status \"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb\": rpc error: code = NotFound desc = could not find container \"e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb\": container with ID starting with e82492fe6e50412a679be450f5204167b8933bd04df56a655467fee380ce2cbb not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.190979 5030 scope.go:117] "RemoveContainer" containerID="4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.191126 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.192805 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a"} err="failed to get container status \"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a\": rpc error: code = NotFound desc = could not find container \"4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a\": container with ID starting with 4f0989d747caef50f62940bac800ca9d687dcb6e491abd2f7dad5876959d568a not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.192845 5030 scope.go:117] "RemoveContainer" containerID="e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.196758 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.198389 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.200961 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.207949 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.229349 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.272816 5030 scope.go:117] "RemoveContainer" containerID="7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.280508 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.280913 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="proxy-httpd" containerID="cri-o://aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92" gracePeriod=30 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.281016 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="sg-core" containerID="cri-o://13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e" gracePeriod=30 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.281054 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-notification-agent" containerID="cri-o://eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6" gracePeriod=30 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.280780 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-central-agent" containerID="cri-o://70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01" gracePeriod=30 Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290217 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtfd5\" (UniqueName: \"kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: 
I0120 23:23:32.290276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjphd\" (UniqueName: \"kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.290426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.332018 5030 scope.go:117] "RemoveContainer" containerID="e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.346016 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2\": container with ID starting with e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2 not found: ID does not exist" containerID="e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.346263 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2"} err="failed to get container status \"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2\": rpc error: code = NotFound desc = could not find container \"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2\": container with ID starting with e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2 not found: ID does not exist" Jan 20 23:23:32 crc 
kubenswrapper[5030]: I0120 23:23:32.346348 5030 scope.go:117] "RemoveContainer" containerID="7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" Jan 20 23:23:32 crc kubenswrapper[5030]: E0120 23:23:32.346608 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97\": container with ID starting with 7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97 not found: ID does not exist" containerID="7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.346751 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97"} err="failed to get container status \"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97\": rpc error: code = NotFound desc = could not find container \"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97\": container with ID starting with 7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97 not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.346827 5030 scope.go:117] "RemoveContainer" containerID="e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.347053 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2"} err="failed to get container status \"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2\": rpc error: code = NotFound desc = could not find container \"e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2\": container with ID starting with e2f4982cb4acd9b6d85664ad25fef764c48dc405233247cb73fc8e7ed86065b2 not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.347134 5030 scope.go:117] "RemoveContainer" containerID="7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.347349 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97"} err="failed to get container status \"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97\": rpc error: code = NotFound desc = could not find container \"7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97\": container with ID starting with 7996510e100cc333c49fe24caf732eb1cd6c6375b59113517ae0326a2131ff97 not found: ID does not exist" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.391755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.391822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.392901 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtfd5\" (UniqueName: \"kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.393128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.393987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjphd\" (UniqueName: \"kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.394387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.394904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.395079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.395251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.396304 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.396893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.398065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.400797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.403413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.405070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.408676 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.422567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjphd\" (UniqueName: \"kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd\") pod \"nova-metadata-0\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.428311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtfd5\" (UniqueName: \"kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5\") pod \"nova-api-0\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.524342 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:32 crc kubenswrapper[5030]: I0120 23:23:32.534953 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130433 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerID="aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92" exitCode=0 Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130764 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerID="13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e" exitCode=2 Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130772 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerID="70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01" exitCode=0 Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerDied","Data":"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerDied","Data":"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.130865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerDied","Data":"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.135885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0194cff7-fd9f-4d47-8edd-eae7cd25c987","Type":"ContainerStarted","Data":"62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.135931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0194cff7-fd9f-4d47-8edd-eae7cd25c987","Type":"ContainerStarted","Data":"96383fc6688c12fc4ca8d54b8a6a421d53e687489e44cbc82ef0a736e985cfc1"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.136745 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.138616 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7295c1fa-5222-4dd7-83de-c6cbc3d10d08","Type":"ContainerStarted","Data":"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.138689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7295c1fa-5222-4dd7-83de-c6cbc3d10d08","Type":"ContainerStarted","Data":"907f1d49209d932c69c4de1af622616dc4250ac29fead804f280cdde35f3f790"} Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.138718 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.160242 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.160214156 podStartE2EDuration="2.160214156s" podCreationTimestamp="2026-01-20 23:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:33.156686841 +0000 UTC m=+2885.476947169" watchObservedRunningTime="2026-01-20 23:23:33.160214156 +0000 UTC m=+2885.480474484" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.187970 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.730392089 podStartE2EDuration="2.187942762s" podCreationTimestamp="2026-01-20 23:23:31 +0000 UTC" firstStartedPulling="2026-01-20 23:23:32.145700017 +0000 UTC m=+2884.465960305" lastFinishedPulling="2026-01-20 23:23:32.60325069 +0000 UTC m=+2884.923510978" observedRunningTime="2026-01-20 23:23:33.176957768 +0000 UTC m=+2885.497218116" watchObservedRunningTime="2026-01-20 23:23:33.187942762 +0000 UTC m=+2885.508203080" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.543791 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.626766 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:33 crc kubenswrapper[5030]: W0120 23:23:33.627233 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19291dc7_4b98_41df_ad76_2a3607e32740.slice/crio-906ebc84be26fac5300539af3617e3711febf7d2c190b38e19f40cc7d89a0fdc WatchSource:0}: Error finding container 906ebc84be26fac5300539af3617e3711febf7d2c190b38e19f40cc7d89a0fdc: Status 404 returned error can't find the container with id 906ebc84be26fac5300539af3617e3711febf7d2c190b38e19f40cc7d89a0fdc Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.785345 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.864795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pp9b\" (UniqueName: \"kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.865419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle\") pod \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\" (UID: \"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.866165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.871596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.872173 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.872200 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.873958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b" (OuterVolumeSpecName: "kube-api-access-9pp9b") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "kube-api-access-9pp9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.876404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts" (OuterVolumeSpecName: "scripts") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.890653 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.900537 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.970420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn6vj\" (UniqueName: \"kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj\") pod \"52ae645c-5c85-4557-b829-695539abf517\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle\") pod \"52ae645c-5c85-4557-b829-695539abf517\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data\") pod \"52ae645c-5c85-4557-b829-695539abf517\" (UID: \"52ae645c-5c85-4557-b829-695539abf517\") " Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974801 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pp9b\" (UniqueName: \"kubernetes.io/projected/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-kube-api-access-9pp9b\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974817 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974826 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.974836 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.978291 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f30f5e5-c42e-4834-a13e-55b8ea08c459" path="/var/lib/kubelet/pods/9f30f5e5-c42e-4834-a13e-55b8ea08c459/volumes" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.979046 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b242b52e-0bb1-4d87-9138-0ca250dd5544" path="/var/lib/kubelet/pods/b242b52e-0bb1-4d87-9138-0ca250dd5544/volumes" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.989685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data" (OuterVolumeSpecName: "config-data") pod "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" (UID: "a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:33 crc kubenswrapper[5030]: I0120 23:23:33.990751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj" (OuterVolumeSpecName: "kube-api-access-hn6vj") pod "52ae645c-5c85-4557-b829-695539abf517" (UID: "52ae645c-5c85-4557-b829-695539abf517"). InnerVolumeSpecName "kube-api-access-hn6vj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.016880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data" (OuterVolumeSpecName: "config-data") pod "52ae645c-5c85-4557-b829-695539abf517" (UID: "52ae645c-5c85-4557-b829-695539abf517"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.018590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52ae645c-5c85-4557-b829-695539abf517" (UID: "52ae645c-5c85-4557-b829-695539abf517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.077018 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.077046 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn6vj\" (UniqueName: \"kubernetes.io/projected/52ae645c-5c85-4557-b829-695539abf517-kube-api-access-hn6vj\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.077056 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.077065 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52ae645c-5c85-4557-b829-695539abf517-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.151546 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerID="eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6" exitCode=0 Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.151608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.151639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerDied","Data":"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.151715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3","Type":"ContainerDied","Data":"38b90920ffd9b22e85cf35dce4d39c76c6e383d266994450394fa70cbed3614b"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.151774 5030 scope.go:117] "RemoveContainer" containerID="aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.153860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerStarted","Data":"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.153891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerStarted","Data":"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.153903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerStarted","Data":"36321fdd62d8f1e03fbf4911633f9ec672be57a5b6e82803e6fd17f102a711d4"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.156434 5030 generic.go:334] "Generic (PLEG): container finished" podID="52ae645c-5c85-4557-b829-695539abf517" containerID="90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd" exitCode=0 Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.156566 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.156665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"52ae645c-5c85-4557-b829-695539abf517","Type":"ContainerDied","Data":"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.156689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"52ae645c-5c85-4557-b829-695539abf517","Type":"ContainerDied","Data":"3d8e396501064b6dac020207aca15904d48c810dce0cd0f1632943e795c5cff3"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.159073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerStarted","Data":"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.159108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerStarted","Data":"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.159119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerStarted","Data":"906ebc84be26fac5300539af3617e3711febf7d2c190b38e19f40cc7d89a0fdc"} Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.173827 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.173810723 podStartE2EDuration="2.173810723s" podCreationTimestamp="2026-01-20 23:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:34.171730973 +0000 UTC m=+2886.491991261" watchObservedRunningTime="2026-01-20 23:23:34.173810723 +0000 UTC m=+2886.494071011" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.180205 5030 scope.go:117] "RemoveContainer" containerID="13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.192368 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.192349328 podStartE2EDuration="2.192349328s" podCreationTimestamp="2026-01-20 23:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:34.18784216 +0000 UTC m=+2886.508102448" watchObservedRunningTime="2026-01-20 23:23:34.192349328 +0000 UTC m=+2886.512609616" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.216754 5030 scope.go:117] "RemoveContainer" containerID="eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.217431 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.231406 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.244963 5030 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.246257 5030 scope.go:117] "RemoveContainer" containerID="70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.258493 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278062 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.278481 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="sg-core" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278497 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="sg-core" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.278510 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ae645c-5c85-4557-b829-695539abf517" containerName="nova-scheduler-scheduler" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278518 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ae645c-5c85-4557-b829-695539abf517" containerName="nova-scheduler-scheduler" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.278531 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-central-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278537 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-central-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.278546 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="proxy-httpd" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278552 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="proxy-httpd" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.278565 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-notification-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-notification-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278771 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-notification-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278787 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="sg-core" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278802 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ae645c-5c85-4557-b829-695539abf517" containerName="nova-scheduler-scheduler" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278810 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="ceilometer-central-agent" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.278818 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" containerName="proxy-httpd" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.279976 5030 scope.go:117] "RemoveContainer" containerID="aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.280498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.281517 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92\": container with ID starting with aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92 not found: ID does not exist" containerID="aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.281546 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92"} err="failed to get container status \"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92\": rpc error: code = NotFound desc = could not find container \"aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92\": container with ID starting with aa8f7a6c20d8f634ac8b7433018b7e871b0f67493726e84457b73dd8d881cf92 not found: ID does not exist" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.281568 5030 scope.go:117] "RemoveContainer" containerID="13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.282669 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e\": container with ID starting with 13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e not found: ID does not exist" containerID="13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.282713 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e"} err="failed to get container status \"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e\": rpc error: code = NotFound desc = could not find container \"13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e\": container with ID starting with 13ce826b149e461c22634b916c2f5b5b22878cd99df03e540c947bf05fb5e76e not found: ID does not exist" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.282737 5030 scope.go:117] "RemoveContainer" containerID="eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.286980 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.289378 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.289543 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6\": container with ID starting with 
eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6 not found: ID does not exist" containerID="eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.289591 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6"} err="failed to get container status \"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6\": rpc error: code = NotFound desc = could not find container \"eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6\": container with ID starting with eac23441c4d29e590b418ed30378595be267c553201f8bd106d1addebfb0c2c6 not found: ID does not exist" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.289640 5030 scope.go:117] "RemoveContainer" containerID="70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.289757 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.290099 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.290514 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01\": container with ID starting with 70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01 not found: ID does not exist" containerID="70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.290592 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01"} err="failed to get container status \"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01\": rpc error: code = NotFound desc = could not find container \"70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01\": container with ID starting with 70c83c11f5c9f3f1223c756a99e10679fdfe1bf9f31f53965a52137798774c01 not found: ID does not exist" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.290685 5030 scope.go:117] "RemoveContainer" containerID="90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.304823 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.306053 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.310823 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.320249 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.328449 5030 scope.go:117] "RemoveContainer" containerID="90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd" Jan 20 23:23:34 crc kubenswrapper[5030]: E0120 23:23:34.330088 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd\": container with ID starting with 90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd not found: ID does not exist" containerID="90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.330115 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd"} err="failed to get container status \"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd\": rpc error: code = NotFound desc = could not find container \"90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd\": container with ID starting with 90e35ced751fa819a5e2825e5898a07e673e9d6fa95fed7cbcac27fb6cf37afd not found: ID does not exist" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6hk\" (UniqueName: \"kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383521 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383598 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkx7\" (UniqueName: \"kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.383994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.384067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.485826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6hk\" (UniqueName: \"kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc 
kubenswrapper[5030]: I0120 23:23:34.486179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkx7\" (UniqueName: \"kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.486710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.489700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.490414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.490728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.491243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.492070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.492174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.493083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.493116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.493407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.508201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkx7\" (UniqueName: \"kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7\") pod \"nova-scheduler-0\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.515056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6hk\" (UniqueName: \"kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk\") pod \"ceilometer-0\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.604727 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:34 crc kubenswrapper[5030]: I0120 23:23:34.632527 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.060429 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:23:35 crc kubenswrapper[5030]: W0120 23:23:35.065078 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0605074_8297_4759_ba52_2cbc96e57d4f.slice/crio-7a659ecfd13147610934d8d41ab5f9d878530ab97d39017e633cd8ee65d55ed2 WatchSource:0}: Error finding container 7a659ecfd13147610934d8d41ab5f9d878530ab97d39017e633cd8ee65d55ed2: Status 404 returned error can't find the container with id 7a659ecfd13147610934d8d41ab5f9d878530ab97d39017e633cd8ee65d55ed2 Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.152714 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:35 crc kubenswrapper[5030]: W0120 23:23:35.156241 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd236fac1_4bfa_4690_8d83_3ce53855d72a.slice/crio-7ba655585edf27d825188874521cf688a67ac60e0137261037bae9c310a2e470 WatchSource:0}: Error finding container 7ba655585edf27d825188874521cf688a67ac60e0137261037bae9c310a2e470: Status 404 returned error can't find the container with id 7ba655585edf27d825188874521cf688a67ac60e0137261037bae9c310a2e470 Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.174287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerStarted","Data":"7a659ecfd13147610934d8d41ab5f9d878530ab97d39017e633cd8ee65d55ed2"} Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.176689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d236fac1-4bfa-4690-8d83-3ce53855d72a","Type":"ContainerStarted","Data":"7ba655585edf27d825188874521cf688a67ac60e0137261037bae9c310a2e470"} Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.975368 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ae645c-5c85-4557-b829-695539abf517" path="/var/lib/kubelet/pods/52ae645c-5c85-4557-b829-695539abf517/volumes" Jan 20 23:23:35 crc kubenswrapper[5030]: I0120 23:23:35.976655 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3" path="/var/lib/kubelet/pods/a6d8c9b6-0ac9-4c96-8389-7cb23f242dc3/volumes" Jan 20 23:23:36 crc kubenswrapper[5030]: I0120 23:23:36.198304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerStarted","Data":"ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036"} Jan 20 23:23:36 crc kubenswrapper[5030]: I0120 23:23:36.201003 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d236fac1-4bfa-4690-8d83-3ce53855d72a","Type":"ContainerStarted","Data":"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e"} Jan 20 23:23:36 crc kubenswrapper[5030]: I0120 23:23:36.233732 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.233705641 podStartE2EDuration="2.233705641s" podCreationTimestamp="2026-01-20 23:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:36.221312004 +0000 UTC m=+2888.541572332" watchObservedRunningTime="2026-01-20 23:23:36.233705641 +0000 UTC m=+2888.553965969" Jan 20 23:23:36 crc kubenswrapper[5030]: I0120 23:23:36.460790 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:36 crc kubenswrapper[5030]: I0120 23:23:36.484559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:37 crc kubenswrapper[5030]: I0120 23:23:37.214074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerStarted","Data":"10ef44deca0b091442d9bd085dfaf88bf180d6ad8e1a2fd04aadb048c1aee53c"} Jan 20 23:23:37 crc kubenswrapper[5030]: I0120 23:23:37.214416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerStarted","Data":"df337dafc7294a186de3d93e713b23dca245c8b6742d08b48db6f7c8ff32715c"} Jan 20 23:23:37 crc kubenswrapper[5030]: I0120 23:23:37.243124 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:23:37 crc kubenswrapper[5030]: I0120 23:23:37.525188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:37 crc kubenswrapper[5030]: I0120 23:23:37.525244 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:39 crc kubenswrapper[5030]: I0120 23:23:39.238372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerStarted","Data":"66e7ebe94283fcbc8440fa4f1469cab59fa8b458e04ffcf085ff2b41631d7f73"} Jan 20 23:23:39 crc kubenswrapper[5030]: I0120 23:23:39.240669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:23:39 crc kubenswrapper[5030]: I0120 23:23:39.277507 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.633023627 podStartE2EDuration="5.277471023s" podCreationTimestamp="2026-01-20 23:23:34 +0000 UTC" firstStartedPulling="2026-01-20 23:23:35.067079535 +0000 UTC m=+2887.387339823" lastFinishedPulling="2026-01-20 23:23:38.711526931 +0000 UTC m=+2891.031787219" observedRunningTime="2026-01-20 23:23:39.271819127 +0000 UTC m=+2891.592079485" watchObservedRunningTime="2026-01-20 23:23:39.277471023 +0000 UTC m=+2891.597731351" Jan 20 23:23:39 crc kubenswrapper[5030]: I0120 23:23:39.632670 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:41 crc kubenswrapper[5030]: I0120 23:23:41.567869 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:23:41 crc kubenswrapper[5030]: I0120 23:23:41.584526 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.162468 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd"] Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.164455 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.167594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.168240 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.175702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd"] Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.256837 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.256908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54ww\" (UniqueName: \"kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.256999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.257137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.358725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.359001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.359113 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54ww\" (UniqueName: \"kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.359308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.367454 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.368278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.375405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.382568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54ww\" (UniqueName: \"kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww\") pod \"nova-cell1-cell-mapping-rcbbd\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.488854 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.525884 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.525954 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.535743 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.535791 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:42 crc kubenswrapper[5030]: I0120 23:23:42.992928 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd"] Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.287147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" event={"ID":"0f6689ec-afe0-4ab0-8744-8586baafe323","Type":"ContainerStarted","Data":"56c55391df4a34d7b83ce2d3b282ed437cb6c3b3ba656f105786ceea8b6a70ac"} Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.287512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" event={"ID":"0f6689ec-afe0-4ab0-8744-8586baafe323","Type":"ContainerStarted","Data":"ee3fea33688dd81020b190a9c8704e7b15e4fc5cca4ff3a8adb0f0391c003654"} Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.311452 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" podStartSLOduration=1.311433601 podStartE2EDuration="1.311433601s" podCreationTimestamp="2026-01-20 23:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:43.308829248 +0000 UTC m=+2895.629089556" watchObservedRunningTime="2026-01-20 23:23:43.311433601 +0000 UTC m=+2895.631693909" Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.530950 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.33:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.578842 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.34:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.619774 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.33:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:23:43 crc kubenswrapper[5030]: I0120 23:23:43.620206 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.34:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:23:44 crc kubenswrapper[5030]: I0120 23:23:44.633839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:44 crc kubenswrapper[5030]: I0120 23:23:44.676469 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:45 crc kubenswrapper[5030]: I0120 23:23:45.349370 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:48 crc kubenswrapper[5030]: I0120 23:23:48.349213 5030 generic.go:334] "Generic (PLEG): container finished" podID="0f6689ec-afe0-4ab0-8744-8586baafe323" containerID="56c55391df4a34d7b83ce2d3b282ed437cb6c3b3ba656f105786ceea8b6a70ac" exitCode=0 Jan 20 23:23:48 crc kubenswrapper[5030]: I0120 23:23:48.349327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" event={"ID":"0f6689ec-afe0-4ab0-8744-8586baafe323","Type":"ContainerDied","Data":"56c55391df4a34d7b83ce2d3b282ed437cb6c3b3ba656f105786ceea8b6a70ac"} Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.821161 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.934456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data\") pod \"0f6689ec-afe0-4ab0-8744-8586baafe323\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.934726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts\") pod \"0f6689ec-afe0-4ab0-8744-8586baafe323\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.934813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle\") pod \"0f6689ec-afe0-4ab0-8744-8586baafe323\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.934865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j54ww\" (UniqueName: \"kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww\") pod \"0f6689ec-afe0-4ab0-8744-8586baafe323\" (UID: \"0f6689ec-afe0-4ab0-8744-8586baafe323\") " Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.940325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts" (OuterVolumeSpecName: "scripts") pod "0f6689ec-afe0-4ab0-8744-8586baafe323" (UID: "0f6689ec-afe0-4ab0-8744-8586baafe323"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.941591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww" (OuterVolumeSpecName: "kube-api-access-j54ww") pod "0f6689ec-afe0-4ab0-8744-8586baafe323" (UID: "0f6689ec-afe0-4ab0-8744-8586baafe323"). InnerVolumeSpecName "kube-api-access-j54ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.970957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f6689ec-afe0-4ab0-8744-8586baafe323" (UID: "0f6689ec-afe0-4ab0-8744-8586baafe323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:49 crc kubenswrapper[5030]: I0120 23:23:49.985178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data" (OuterVolumeSpecName: "config-data") pod "0f6689ec-afe0-4ab0-8744-8586baafe323" (UID: "0f6689ec-afe0-4ab0-8744-8586baafe323"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.037960 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.038026 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.038044 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j54ww\" (UniqueName: \"kubernetes.io/projected/0f6689ec-afe0-4ab0-8744-8586baafe323-kube-api-access-j54ww\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.038058 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f6689ec-afe0-4ab0-8744-8586baafe323-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.386713 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" event={"ID":"0f6689ec-afe0-4ab0-8744-8586baafe323","Type":"ContainerDied","Data":"ee3fea33688dd81020b190a9c8704e7b15e4fc5cca4ff3a8adb0f0391c003654"} Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.386766 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee3fea33688dd81020b190a9c8704e7b15e4fc5cca4ff3a8adb0f0391c003654" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.386796 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd" Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.578902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.579259 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d236fac1-4bfa-4690-8d83-3ce53855d72a" containerName="nova-scheduler-scheduler" containerID="cri-o://cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e" gracePeriod=30 Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.599146 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.599800 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-api" containerID="cri-o://08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b" gracePeriod=30 Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.600067 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-log" containerID="cri-o://ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761" gracePeriod=30 Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.692977 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.693660 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-log" containerID="cri-o://a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e" gracePeriod=30 Jan 20 23:23:50 crc kubenswrapper[5030]: I0120 23:23:50.693738 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-metadata" containerID="cri-o://7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823" gracePeriod=30 Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.399798 5030 generic.go:334] "Generic (PLEG): container finished" podID="19291dc7-4b98-41df-ad76-2a3607e32740" containerID="a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e" exitCode=143 Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.399887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerDied","Data":"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e"} Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.402716 5030 generic.go:334] "Generic (PLEG): container finished" podID="f6367653-9b33-4126-811a-80088b201f5f" containerID="ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761" exitCode=143 Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.402768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerDied","Data":"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761"} Jan 20 23:23:51 crc kubenswrapper[5030]: 
I0120 23:23:51.896914 5030 scope.go:117] "RemoveContainer" containerID="03f6942764b2fc6914994b80ca01b57c398168e809ec4912feb82cbb3398d604" Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.928416 5030 scope.go:117] "RemoveContainer" containerID="86f2bd833f651a2a67a1766f97e95fffb23c70a4b05e1a965f5196f9e0a203d2" Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.957377 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:51 crc kubenswrapper[5030]: I0120 23:23:51.974029 5030 scope.go:117] "RemoveContainer" containerID="692e91461f0a4c20cf2df5e35e786a30067f875df78474941822ccf0e5c4873d" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.022998 5030 scope.go:117] "RemoveContainer" containerID="686aad5aa9f2af3ead8f0f1957fe2fbc06afa5bd392f0df0060aebc4b8dcab80" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.077606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data\") pod \"d236fac1-4bfa-4690-8d83-3ce53855d72a\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.077826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkx7\" (UniqueName: \"kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7\") pod \"d236fac1-4bfa-4690-8d83-3ce53855d72a\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.077922 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle\") pod \"d236fac1-4bfa-4690-8d83-3ce53855d72a\" (UID: \"d236fac1-4bfa-4690-8d83-3ce53855d72a\") " Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.085858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7" (OuterVolumeSpecName: "kube-api-access-ptkx7") pod "d236fac1-4bfa-4690-8d83-3ce53855d72a" (UID: "d236fac1-4bfa-4690-8d83-3ce53855d72a"). InnerVolumeSpecName "kube-api-access-ptkx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.112947 5030 scope.go:117] "RemoveContainer" containerID="a1d23fe2b7adbf63edbabfdc30bb528d1409be3a65c700d226dbc7625861d5ec" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.118649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d236fac1-4bfa-4690-8d83-3ce53855d72a" (UID: "d236fac1-4bfa-4690-8d83-3ce53855d72a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.123264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data" (OuterVolumeSpecName: "config-data") pod "d236fac1-4bfa-4690-8d83-3ce53855d72a" (UID: "d236fac1-4bfa-4690-8d83-3ce53855d72a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.143100 5030 scope.go:117] "RemoveContainer" containerID="c6d102c5c2ff29bc487b0f86be647e928791c6f33823052475caed180c2ca557" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.170436 5030 scope.go:117] "RemoveContainer" containerID="6a7ba4f0dd84f46b89a429cd2c20cef0d3ba99a5eff0dfc1ffb1c070547beeff" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.179583 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkx7\" (UniqueName: \"kubernetes.io/projected/d236fac1-4bfa-4690-8d83-3ce53855d72a-kube-api-access-ptkx7\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.179738 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.179827 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d236fac1-4bfa-4690-8d83-3ce53855d72a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.201585 5030 scope.go:117] "RemoveContainer" containerID="e04fbae0a14def52a1ca6c867992be7bd258bd9de533d50626669a6d7a7ffc2c" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.235149 5030 scope.go:117] "RemoveContainer" containerID="88572615c9470cc1f53e5fe138aec4921f5f4bbf0c0b6644397ebffec41a7164" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.287374 5030 scope.go:117] "RemoveContainer" containerID="9999471141c83c0e8b093977a735c1bb6794b7b632975ee7ff219513491befd1" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.324125 5030 scope.go:117] "RemoveContainer" containerID="6e48555d7f59d362f76f98e2a005a7ec8e530db62c2612fee273f24639d73fc1" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.418365 5030 generic.go:334] "Generic (PLEG): container finished" podID="d236fac1-4bfa-4690-8d83-3ce53855d72a" containerID="cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e" exitCode=0 Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.418453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d236fac1-4bfa-4690-8d83-3ce53855d72a","Type":"ContainerDied","Data":"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e"} Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.418490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d236fac1-4bfa-4690-8d83-3ce53855d72a","Type":"ContainerDied","Data":"7ba655585edf27d825188874521cf688a67ac60e0137261037bae9c310a2e470"} Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.418520 5030 scope.go:117] "RemoveContainer" containerID="cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.418729 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.482235 5030 scope.go:117] "RemoveContainer" containerID="cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e" Jan 20 23:23:52 crc kubenswrapper[5030]: E0120 23:23:52.484023 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e\": container with ID starting with cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e not found: ID does not exist" containerID="cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.484149 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e"} err="failed to get container status \"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e\": rpc error: code = NotFound desc = could not find container \"cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e\": container with ID starting with cb6922dbe7652e99cbdcc19860eb90ace92fa617322c1eeeb5be7f1bcff8e73e not found: ID does not exist" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.484251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.501252 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.520258 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:52 crc kubenswrapper[5030]: E0120 23:23:52.520689 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d236fac1-4bfa-4690-8d83-3ce53855d72a" containerName="nova-scheduler-scheduler" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.520707 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d236fac1-4bfa-4690-8d83-3ce53855d72a" containerName="nova-scheduler-scheduler" Jan 20 23:23:52 crc kubenswrapper[5030]: E0120 23:23:52.520725 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6689ec-afe0-4ab0-8744-8586baafe323" containerName="nova-manage" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.520731 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6689ec-afe0-4ab0-8744-8586baafe323" containerName="nova-manage" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.520962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d236fac1-4bfa-4690-8d83-3ce53855d72a" containerName="nova-scheduler-scheduler" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.520986 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6689ec-afe0-4ab0-8744-8586baafe323" containerName="nova-manage" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.521698 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.523901 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.529265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.690189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.690677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.690970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64rd\" (UniqueName: \"kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.792895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.793039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.793127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64rd\" (UniqueName: \"kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.799651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.799868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.810079 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64rd\" (UniqueName: \"kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd\") pod \"nova-scheduler-0\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:52 crc kubenswrapper[5030]: I0120 23:23:52.847793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.379357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.450533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c9f188c9-7a3c-49eb-9f90-553a9742b023","Type":"ContainerStarted","Data":"b2682e9c9d3aab53d8b218ad2df9fd947935d8115849e7f408ec769603688867"} Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.472603 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.474636 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.481750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.610258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdlzj\" (UniqueName: \"kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.610711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.610746 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.712720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.712765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 
20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.712865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdlzj\" (UniqueName: \"kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.713561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.713654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.735018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdlzj\" (UniqueName: \"kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj\") pod \"community-operators-r7nfz\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.833673 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:23:53 crc kubenswrapper[5030]: I0120 23:23:53.991689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d236fac1-4bfa-4690-8d83-3ce53855d72a" path="/var/lib/kubelet/pods/d236fac1-4bfa-4690-8d83-3ce53855d72a/volumes" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.233276 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.329473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data\") pod \"f6367653-9b33-4126-811a-80088b201f5f\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.329546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") pod \"f6367653-9b33-4126-811a-80088b201f5f\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.329592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtfd5\" (UniqueName: \"kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5\") pod \"f6367653-9b33-4126-811a-80088b201f5f\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.329747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs\") pod \"f6367653-9b33-4126-811a-80088b201f5f\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.330660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs" (OuterVolumeSpecName: "logs") pod "f6367653-9b33-4126-811a-80088b201f5f" (UID: "f6367653-9b33-4126-811a-80088b201f5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.335458 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5" (OuterVolumeSpecName: "kube-api-access-xtfd5") pod "f6367653-9b33-4126-811a-80088b201f5f" (UID: "f6367653-9b33-4126-811a-80088b201f5f"). InnerVolumeSpecName "kube-api-access-xtfd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.367317 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle podName:f6367653-9b33-4126-811a-80088b201f5f nodeName:}" failed. No retries permitted until 2026-01-20 23:23:54.867273749 +0000 UTC m=+2907.187534037 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle") pod "f6367653-9b33-4126-811a-80088b201f5f" (UID: "f6367653-9b33-4126-811a-80088b201f5f") : error deleting /var/lib/kubelet/pods/f6367653-9b33-4126-811a-80088b201f5f/volume-subpaths: remove /var/lib/kubelet/pods/f6367653-9b33-4126-811a-80088b201f5f/volume-subpaths: no such file or directory Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.372918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data" (OuterVolumeSpecName: "config-data") pod "f6367653-9b33-4126-811a-80088b201f5f" (UID: "f6367653-9b33-4126-811a-80088b201f5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.389473 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.404589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.431464 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.431861 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtfd5\" (UniqueName: \"kubernetes.io/projected/f6367653-9b33-4126-811a-80088b201f5f-kube-api-access-xtfd5\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.431871 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6367653-9b33-4126-811a-80088b201f5f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.464559 5030 generic.go:334] "Generic (PLEG): container finished" podID="19291dc7-4b98-41df-ad76-2a3607e32740" containerID="7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823" exitCode=0 Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.464667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerDied","Data":"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.464682 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.464719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"19291dc7-4b98-41df-ad76-2a3607e32740","Type":"ContainerDied","Data":"906ebc84be26fac5300539af3617e3711febf7d2c190b38e19f40cc7d89a0fdc"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.464736 5030 scope.go:117] "RemoveContainer" containerID="7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.468200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerStarted","Data":"ec247ae36eaa8513051ba1725ff78fced38f631c5d4641f7e1a1315548635c37"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.470911 5030 generic.go:334] "Generic (PLEG): container finished" podID="f6367653-9b33-4126-811a-80088b201f5f" containerID="08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b" exitCode=0 Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.470979 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.471005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerDied","Data":"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.471057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"f6367653-9b33-4126-811a-80088b201f5f","Type":"ContainerDied","Data":"36321fdd62d8f1e03fbf4911633f9ec672be57a5b6e82803e6fd17f102a711d4"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.473337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c9f188c9-7a3c-49eb-9f90-553a9742b023","Type":"ContainerStarted","Data":"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff"} Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.491190 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.49117312 podStartE2EDuration="2.49117312s" podCreationTimestamp="2026-01-20 23:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:54.487955152 +0000 UTC m=+2906.808215450" watchObservedRunningTime="2026-01-20 23:23:54.49117312 +0000 UTC m=+2906.811433418" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.514175 5030 scope.go:117] "RemoveContainer" containerID="a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.533065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjphd\" (UniqueName: \"kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd\") pod \"19291dc7-4b98-41df-ad76-2a3607e32740\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.533127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs\") pod \"19291dc7-4b98-41df-ad76-2a3607e32740\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.533182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data\") pod \"19291dc7-4b98-41df-ad76-2a3607e32740\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.533299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle\") pod \"19291dc7-4b98-41df-ad76-2a3607e32740\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.533375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs\") pod \"19291dc7-4b98-41df-ad76-2a3607e32740\" (UID: \"19291dc7-4b98-41df-ad76-2a3607e32740\") " Jan 20 23:23:54 crc kubenswrapper[5030]: 
I0120 23:23:54.534692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs" (OuterVolumeSpecName: "logs") pod "19291dc7-4b98-41df-ad76-2a3607e32740" (UID: "19291dc7-4b98-41df-ad76-2a3607e32740"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.540427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd" (OuterVolumeSpecName: "kube-api-access-xjphd") pod "19291dc7-4b98-41df-ad76-2a3607e32740" (UID: "19291dc7-4b98-41df-ad76-2a3607e32740"). InnerVolumeSpecName "kube-api-access-xjphd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.561413 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19291dc7-4b98-41df-ad76-2a3607e32740" (UID: "19291dc7-4b98-41df-ad76-2a3607e32740"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.565415 5030 scope.go:117] "RemoveContainer" containerID="7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.565897 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823\": container with ID starting with 7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823 not found: ID does not exist" containerID="7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.565954 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823"} err="failed to get container status \"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823\": rpc error: code = NotFound desc = could not find container \"7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823\": container with ID starting with 7bed15ec2bee6eda37933f7a4da995781304c9b328143816ee26bf8cc0576823 not found: ID does not exist" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.565983 5030 scope.go:117] "RemoveContainer" containerID="a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.566422 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e\": container with ID starting with a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e not found: ID does not exist" containerID="a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.566458 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e"} err="failed to get container status \"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e\": rpc error: code = NotFound desc = could not find 
container \"a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e\": container with ID starting with a702fec726da5013591998426bc04fc9855de518a659e393c8cadad15d59d23e not found: ID does not exist" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.566483 5030 scope.go:117] "RemoveContainer" containerID="08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.569109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data" (OuterVolumeSpecName: "config-data") pod "19291dc7-4b98-41df-ad76-2a3607e32740" (UID: "19291dc7-4b98-41df-ad76-2a3607e32740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.588022 5030 scope.go:117] "RemoveContainer" containerID="ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.597600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "19291dc7-4b98-41df-ad76-2a3607e32740" (UID: "19291dc7-4b98-41df-ad76-2a3607e32740"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.615314 5030 scope.go:117] "RemoveContainer" containerID="08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.615855 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b\": container with ID starting with 08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b not found: ID does not exist" containerID="08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.615901 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b"} err="failed to get container status \"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b\": rpc error: code = NotFound desc = could not find container \"08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b\": container with ID starting with 08f4819e5cd3873c1396c66fb5bd78d7ebc51cfed7b5ee84e52a767063d9bc9b not found: ID does not exist" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.615935 5030 scope.go:117] "RemoveContainer" containerID="ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.616169 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761\": container with ID starting with ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761 not found: ID does not exist" containerID="ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.616207 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761"} err="failed to get container status \"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761\": rpc error: code = NotFound desc = could not find container \"ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761\": container with ID starting with ee77ade8b83c0634c6a6ef3106a934b6f387b3b4a999e381f1caa4c80e261761 not found: ID does not exist" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.635947 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.635991 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjphd\" (UniqueName: \"kubernetes.io/projected/19291dc7-4b98-41df-ad76-2a3607e32740-kube-api-access-xjphd\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.636004 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19291dc7-4b98-41df-ad76-2a3607e32740-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.636016 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.636028 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19291dc7-4b98-41df-ad76-2a3607e32740-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.830244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.845676 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.859682 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.860125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-metadata" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-metadata" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.860178 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-api" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860187 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-api" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.860202 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-log" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-log" Jan 20 23:23:54 crc kubenswrapper[5030]: E0120 23:23:54.860225 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-log" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860233 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-log" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860444 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-log" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860470 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" containerName="nova-metadata-metadata" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860489 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-log" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.860509 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6367653-9b33-4126-811a-80088b201f5f" containerName="nova-api-api" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.861701 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.872536 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.873779 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.907388 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.941858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") pod \"f6367653-9b33-4126-811a-80088b201f5f\" (UID: \"f6367653-9b33-4126-811a-80088b201f5f\") " Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.942287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.942393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql4bz\" (UniqueName: \"kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.942487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.942791 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.942846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:54 crc kubenswrapper[5030]: I0120 23:23:54.946304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6367653-9b33-4126-811a-80088b201f5f" (UID: "f6367653-9b33-4126-811a-80088b201f5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.045814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.046902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.047074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.047268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.047392 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.047702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql4bz\" (UniqueName: \"kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.048004 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f6367653-9b33-4126-811a-80088b201f5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.053867 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.054501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.056650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.070260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql4bz\" (UniqueName: \"kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz\") pod \"nova-metadata-0\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.185000 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.200645 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.211839 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.215018 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.217984 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.218824 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.228799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.360002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.360472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.360540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.360657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbb2\" (UniqueName: \"kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.462744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.463044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.463097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.463677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.463706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbb2\" (UniqueName: 
\"kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.468709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.468989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.480701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbb2\" (UniqueName: \"kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2\") pod \"nova-api-0\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.485367 5030 generic.go:334] "Generic (PLEG): container finished" podID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerID="d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a" exitCode=0 Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.485672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerDied","Data":"d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a"} Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.540311 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.684006 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: W0120 23:23:55.693361 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebeef67e_90a0_4fb5_8f63_1f5d46c353be.slice/crio-161cf6150e89149dcb7115e8e89beaa1a7af91001e20221849fa748644aaa665 WatchSource:0}: Error finding container 161cf6150e89149dcb7115e8e89beaa1a7af91001e20221849fa748644aaa665: Status 404 returned error can't find the container with id 161cf6150e89149dcb7115e8e89beaa1a7af91001e20221849fa748644aaa665 Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.976455 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19291dc7-4b98-41df-ad76-2a3607e32740" path="/var/lib/kubelet/pods/19291dc7-4b98-41df-ad76-2a3607e32740/volumes" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.977977 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6367653-9b33-4126-811a-80088b201f5f" path="/var/lib/kubelet/pods/f6367653-9b33-4126-811a-80088b201f5f/volumes" Jan 20 23:23:55 crc kubenswrapper[5030]: I0120 23:23:55.978806 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:23:55 crc kubenswrapper[5030]: W0120 23:23:55.991475 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca7032aa_55b1_4610_bf4c_00870b8e7813.slice/crio-af638d6bc4d06f1130e30ab9f1b619814decd5b68c07fedd3897678b85f42e27 WatchSource:0}: Error finding container af638d6bc4d06f1130e30ab9f1b619814decd5b68c07fedd3897678b85f42e27: Status 404 returned error can't find the container with id af638d6bc4d06f1130e30ab9f1b619814decd5b68c07fedd3897678b85f42e27 Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.497051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerStarted","Data":"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.499139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerStarted","Data":"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.499186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerStarted","Data":"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.499200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerStarted","Data":"af638d6bc4d06f1130e30ab9f1b619814decd5b68c07fedd3897678b85f42e27"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.501330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerStarted","Data":"bc6613c08618d52d6b28812d72705e495fee861c8d3eb3f7d3abf86b5852deef"} Jan 20 23:23:56 crc 
kubenswrapper[5030]: I0120 23:23:56.501365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerStarted","Data":"479d023bb035a62ed100978c00310340f751bb7c864f927f888af7f5537f011e"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.501375 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerStarted","Data":"161cf6150e89149dcb7115e8e89beaa1a7af91001e20221849fa748644aaa665"} Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.559102 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.5590789000000003 podStartE2EDuration="2.5590789s" podCreationTimestamp="2026-01-20 23:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:56.554380517 +0000 UTC m=+2908.874640795" watchObservedRunningTime="2026-01-20 23:23:56.5590789 +0000 UTC m=+2908.879339188" Jan 20 23:23:56 crc kubenswrapper[5030]: I0120 23:23:56.576494 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.5764748769999999 podStartE2EDuration="1.576474877s" podCreationTimestamp="2026-01-20 23:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:23:56.572789668 +0000 UTC m=+2908.893049956" watchObservedRunningTime="2026-01-20 23:23:56.576474877 +0000 UTC m=+2908.896735165" Jan 20 23:23:57 crc kubenswrapper[5030]: I0120 23:23:57.523860 5030 generic.go:334] "Generic (PLEG): container finished" podID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerID="57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32" exitCode=0 Jan 20 23:23:57 crc kubenswrapper[5030]: I0120 23:23:57.524283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerDied","Data":"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32"} Jan 20 23:23:57 crc kubenswrapper[5030]: I0120 23:23:57.848199 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:23:58 crc kubenswrapper[5030]: I0120 23:23:58.543335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerStarted","Data":"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a"} Jan 20 23:23:58 crc kubenswrapper[5030]: I0120 23:23:58.582295 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7nfz" podStartSLOduration=3.06075829 podStartE2EDuration="5.582266508s" podCreationTimestamp="2026-01-20 23:23:53 +0000 UTC" firstStartedPulling="2026-01-20 23:23:55.488596629 +0000 UTC m=+2907.808856917" lastFinishedPulling="2026-01-20 23:23:58.010104807 +0000 UTC m=+2910.330365135" observedRunningTime="2026-01-20 23:23:58.568746233 +0000 UTC m=+2910.889006591" watchObservedRunningTime="2026-01-20 23:23:58.582266508 +0000 UTC m=+2910.902526816" Jan 20 23:24:00 crc kubenswrapper[5030]: I0120 23:24:00.218814 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:00 crc kubenswrapper[5030]: I0120 23:24:00.219277 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:02 crc kubenswrapper[5030]: I0120 23:24:02.849236 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:24:02 crc kubenswrapper[5030]: I0120 23:24:02.896399 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:24:03 crc kubenswrapper[5030]: I0120 23:24:03.643381 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:24:03 crc kubenswrapper[5030]: I0120 23:24:03.834600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:03 crc kubenswrapper[5030]: I0120 23:24:03.834729 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:03 crc kubenswrapper[5030]: I0120 23:24:03.915188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:04 crc kubenswrapper[5030]: I0120 23:24:04.621973 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:04 crc kubenswrapper[5030]: I0120 23:24:04.701544 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:04 crc kubenswrapper[5030]: I0120 23:24:04.754578 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:24:05 crc kubenswrapper[5030]: I0120 23:24:05.219520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:05 crc kubenswrapper[5030]: I0120 23:24:05.219571 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:05 crc kubenswrapper[5030]: I0120 23:24:05.541329 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:05 crc kubenswrapper[5030]: I0120 23:24:05.542695 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:06 crc kubenswrapper[5030]: I0120 23:24:06.232811 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.40:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:06 crc kubenswrapper[5030]: I0120 23:24:06.232824 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.40:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:06 crc kubenswrapper[5030]: I0120 23:24:06.624109 5030 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack-kuttl-tests/nova-api-0" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.41:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:06 crc kubenswrapper[5030]: I0120 23:24:06.624109 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.41:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:06 crc kubenswrapper[5030]: I0120 23:24:06.627910 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7nfz" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="registry-server" containerID="cri-o://f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a" gracePeriod=2 Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.114154 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.121748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities\") pod \"fed58c55-4c24-455b-a184-a47d4a2cd44f\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.121809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdlzj\" (UniqueName: \"kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj\") pod \"fed58c55-4c24-455b-a184-a47d4a2cd44f\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.121882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content\") pod \"fed58c55-4c24-455b-a184-a47d4a2cd44f\" (UID: \"fed58c55-4c24-455b-a184-a47d4a2cd44f\") " Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.123178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities" (OuterVolumeSpecName: "utilities") pod "fed58c55-4c24-455b-a184-a47d4a2cd44f" (UID: "fed58c55-4c24-455b-a184-a47d4a2cd44f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.129572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj" (OuterVolumeSpecName: "kube-api-access-zdlzj") pod "fed58c55-4c24-455b-a184-a47d4a2cd44f" (UID: "fed58c55-4c24-455b-a184-a47d4a2cd44f"). InnerVolumeSpecName "kube-api-access-zdlzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.224805 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.224861 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdlzj\" (UniqueName: \"kubernetes.io/projected/fed58c55-4c24-455b-a184-a47d4a2cd44f-kube-api-access-zdlzj\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.268371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fed58c55-4c24-455b-a184-a47d4a2cd44f" (UID: "fed58c55-4c24-455b-a184-a47d4a2cd44f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.327463 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fed58c55-4c24-455b-a184-a47d4a2cd44f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.640295 5030 generic.go:334] "Generic (PLEG): container finished" podID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerID="f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a" exitCode=0 Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.640335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerDied","Data":"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a"} Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.640359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7nfz" event={"ID":"fed58c55-4c24-455b-a184-a47d4a2cd44f","Type":"ContainerDied","Data":"ec247ae36eaa8513051ba1725ff78fced38f631c5d4641f7e1a1315548635c37"} Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.640376 5030 scope.go:117] "RemoveContainer" containerID="f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.640483 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7nfz" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.663668 5030 scope.go:117] "RemoveContainer" containerID="57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.678971 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.689092 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7nfz"] Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.704478 5030 scope.go:117] "RemoveContainer" containerID="d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.729464 5030 scope.go:117] "RemoveContainer" containerID="f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a" Jan 20 23:24:07 crc kubenswrapper[5030]: E0120 23:24:07.729828 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a\": container with ID starting with f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a not found: ID does not exist" containerID="f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.729878 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a"} err="failed to get container status \"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a\": rpc error: code = NotFound desc = could not find container \"f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a\": container with ID starting with f90faff5fc420849c662e52f22154a4adc4a3b088dada6f769cdeae8eeafd35a not found: ID does not exist" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.729905 5030 scope.go:117] "RemoveContainer" containerID="57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32" Jan 20 23:24:07 crc kubenswrapper[5030]: E0120 23:24:07.730215 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32\": container with ID starting with 57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32 not found: ID does not exist" containerID="57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.730245 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32"} err="failed to get container status \"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32\": rpc error: code = NotFound desc = could not find container \"57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32\": container with ID starting with 57cf5b8943ce7ff9d1a0f21a16a10c7ea582823102d00ea3470125b524f4ea32 not found: ID does not exist" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.730269 5030 scope.go:117] "RemoveContainer" containerID="d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a" Jan 20 23:24:07 crc kubenswrapper[5030]: E0120 23:24:07.730487 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a\": container with ID starting with d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a not found: ID does not exist" containerID="d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.730512 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a"} err="failed to get container status \"d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a\": rpc error: code = NotFound desc = could not find container \"d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a\": container with ID starting with d6f4b8d633a41c2edf5a7d5a1b8e69bd2212b92ef202df6bdf7e93b46428dd7a not found: ID does not exist" Jan 20 23:24:07 crc kubenswrapper[5030]: I0120 23:24:07.994502 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" path="/var/lib/kubelet/pods/fed58c55-4c24-455b-a184-a47d4a2cd44f/volumes" Jan 20 23:24:10 crc kubenswrapper[5030]: I0120 23:24:10.157257 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:24:10 crc kubenswrapper[5030]: I0120 23:24:10.157344 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.225944 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.227937 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.232437 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.545365 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.546397 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.546976 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.550230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.740153 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc kubenswrapper[5030]: I0120 23:24:15.744740 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:15 crc 
kubenswrapper[5030]: I0120 23:24:15.749716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.466525 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.468101 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-central-agent" containerID="cri-o://ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036" gracePeriod=30 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.468256 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="sg-core" containerID="cri-o://10ef44deca0b091442d9bd085dfaf88bf180d6ad8e1a2fd04aadb048c1aee53c" gracePeriod=30 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.468273 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="proxy-httpd" containerID="cri-o://66e7ebe94283fcbc8440fa4f1469cab59fa8b458e04ffcf085ff2b41631d7f73" gracePeriod=30 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.468310 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-notification-agent" containerID="cri-o://df337dafc7294a186de3d93e713b23dca245c8b6742d08b48db6f7c8ff32715c" gracePeriod=30 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.759696 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerID="66e7ebe94283fcbc8440fa4f1469cab59fa8b458e04ffcf085ff2b41631d7f73" exitCode=0 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.759739 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerID="10ef44deca0b091442d9bd085dfaf88bf180d6ad8e1a2fd04aadb048c1aee53c" exitCode=2 Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.759796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerDied","Data":"66e7ebe94283fcbc8440fa4f1469cab59fa8b458e04ffcf085ff2b41631d7f73"} Jan 20 23:24:17 crc kubenswrapper[5030]: I0120 23:24:17.759855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerDied","Data":"10ef44deca0b091442d9bd085dfaf88bf180d6ad8e1a2fd04aadb048c1aee53c"} Jan 20 23:24:17 crc kubenswrapper[5030]: E0120 23:24:17.981099 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0605074_8297_4759_ba52_2cbc96e57d4f.slice/crio-ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0605074_8297_4759_ba52_2cbc96e57d4f.slice/crio-conmon-ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036.scope\": RecentStats: unable to find data in memory 
cache]" Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.368645 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.773348 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerID="df337dafc7294a186de3d93e713b23dca245c8b6742d08b48db6f7c8ff32715c" exitCode=0 Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.773784 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerID="ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036" exitCode=0 Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.773414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerDied","Data":"df337dafc7294a186de3d93e713b23dca245c8b6742d08b48db6f7c8ff32715c"} Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.773940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerDied","Data":"ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036"} Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.774076 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-log" containerID="cri-o://20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87" gracePeriod=30 Jan 20 23:24:18 crc kubenswrapper[5030]: I0120 23:24:18.774173 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-api" containerID="cri-o://07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf" gracePeriod=30 Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.167689 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.264733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd6hk\" (UniqueName: \"kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk\") pod \"d0605074-8297-4759-ba52-2cbc96e57d4f\" (UID: \"d0605074-8297-4759-ba52-2cbc96e57d4f\") " Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.265847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.266203 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.266225 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0605074-8297-4759-ba52-2cbc96e57d4f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.273722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts" (OuterVolumeSpecName: "scripts") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.273789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk" (OuterVolumeSpecName: "kube-api-access-fd6hk") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "kube-api-access-fd6hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.295251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.325714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.368762 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.368796 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd6hk\" (UniqueName: \"kubernetes.io/projected/d0605074-8297-4759-ba52-2cbc96e57d4f-kube-api-access-fd6hk\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.368811 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.368824 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.371075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.388667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data" (OuterVolumeSpecName: "config-data") pod "d0605074-8297-4759-ba52-2cbc96e57d4f" (UID: "d0605074-8297-4759-ba52-2cbc96e57d4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.470324 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.470369 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0605074-8297-4759-ba52-2cbc96e57d4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.783252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d0605074-8297-4759-ba52-2cbc96e57d4f","Type":"ContainerDied","Data":"7a659ecfd13147610934d8d41ab5f9d878530ab97d39017e633cd8ee65d55ed2"} Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.783306 5030 scope.go:117] "RemoveContainer" containerID="66e7ebe94283fcbc8440fa4f1469cab59fa8b458e04ffcf085ff2b41631d7f73" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.784339 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.785496 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerID="20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87" exitCode=143 Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.785546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerDied","Data":"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87"} Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.809259 5030 scope.go:117] "RemoveContainer" containerID="10ef44deca0b091442d9bd085dfaf88bf180d6ad8e1a2fd04aadb048c1aee53c" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.817217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.827151 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.839924 5030 scope.go:117] "RemoveContainer" containerID="df337dafc7294a186de3d93e713b23dca245c8b6742d08b48db6f7c8ff32715c" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.860680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="sg-core" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861148 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="sg-core" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861161 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="extract-content" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861167 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="extract-content" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861190 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="extract-utilities" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861198 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="extract-utilities" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861210 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-notification-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861217 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-notification-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-central-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861236 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-central-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861257 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="registry-server" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861264 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="registry-server" Jan 20 23:24:19 crc kubenswrapper[5030]: E0120 23:24:19.861279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="proxy-httpd" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="proxy-httpd" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861482 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="sg-core" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861509 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="proxy-httpd" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861530 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fed58c55-4c24-455b-a184-a47d4a2cd44f" containerName="registry-server" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861545 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-notification-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.861556 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" containerName="ceilometer-central-agent" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.863516 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.867868 5030 scope.go:117] "RemoveContainer" containerID="ff745a897c82b7c9ac1c81a3dcc45f5d44c30d9509ebc08a74b1b89fae4ab036" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.868128 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.868352 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.868365 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.893905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.978764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.978856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.978918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwqn\" (UniqueName: \"kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.978963 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.979002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.979049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.979086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts\") pod 
\"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:19 crc kubenswrapper[5030]: I0120 23:24:19.979161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.003481 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0605074-8297-4759-ba52-2cbc96e57d4f" path="/var/lib/kubelet/pods/d0605074-8297-4759-ba52-2cbc96e57d4f/volumes" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.080931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.081295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwqn\" (UniqueName: 
\"kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.083344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.083725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.087890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.095398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.095714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.098536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.103030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwqn\" (UniqueName: \"kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.107613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data\") pod \"ceilometer-0\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.262763 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.760010 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:20 crc kubenswrapper[5030]: W0120 23:24:20.771380 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28de376f_b687_40db_a6ee_c1f3c5297d05.slice/crio-6d284c19a5dc6c4d38e6659ec21b9a7a217287dcf310a7276a93d8a9d30b31ce WatchSource:0}: Error finding container 6d284c19a5dc6c4d38e6659ec21b9a7a217287dcf310a7276a93d8a9d30b31ce: Status 404 returned error can't find the container with id 6d284c19a5dc6c4d38e6659ec21b9a7a217287dcf310a7276a93d8a9d30b31ce Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.800221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerStarted","Data":"6d284c19a5dc6c4d38e6659ec21b9a7a217287dcf310a7276a93d8a9d30b31ce"} Jan 20 23:24:20 crc kubenswrapper[5030]: I0120 23:24:20.989081 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:21 crc kubenswrapper[5030]: I0120 23:24:21.817110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerStarted","Data":"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f"} Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.240066 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.324873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs\") pod \"ca7032aa-55b1-4610-bf4c-00870b8e7813\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.324930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data\") pod \"ca7032aa-55b1-4610-bf4c-00870b8e7813\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.325057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle\") pod \"ca7032aa-55b1-4610-bf4c-00870b8e7813\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.325111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llbb2\" (UniqueName: \"kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2\") pod \"ca7032aa-55b1-4610-bf4c-00870b8e7813\" (UID: \"ca7032aa-55b1-4610-bf4c-00870b8e7813\") " Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.326695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs" (OuterVolumeSpecName: "logs") pod "ca7032aa-55b1-4610-bf4c-00870b8e7813" (UID: "ca7032aa-55b1-4610-bf4c-00870b8e7813"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.330524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2" (OuterVolumeSpecName: "kube-api-access-llbb2") pod "ca7032aa-55b1-4610-bf4c-00870b8e7813" (UID: "ca7032aa-55b1-4610-bf4c-00870b8e7813"). InnerVolumeSpecName "kube-api-access-llbb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.351753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca7032aa-55b1-4610-bf4c-00870b8e7813" (UID: "ca7032aa-55b1-4610-bf4c-00870b8e7813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.359272 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data" (OuterVolumeSpecName: "config-data") pod "ca7032aa-55b1-4610-bf4c-00870b8e7813" (UID: "ca7032aa-55b1-4610-bf4c-00870b8e7813"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.427676 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.427706 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llbb2\" (UniqueName: \"kubernetes.io/projected/ca7032aa-55b1-4610-bf4c-00870b8e7813-kube-api-access-llbb2\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.427722 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca7032aa-55b1-4610-bf4c-00870b8e7813-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.427732 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7032aa-55b1-4610-bf4c-00870b8e7813-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.835734 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerID="07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf" exitCode=0 Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.835811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerDied","Data":"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf"} Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.835840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"ca7032aa-55b1-4610-bf4c-00870b8e7813","Type":"ContainerDied","Data":"af638d6bc4d06f1130e30ab9f1b619814decd5b68c07fedd3897678b85f42e27"} Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.835860 5030 scope.go:117] "RemoveContainer" containerID="07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 
23:24:22.835982 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.839978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerStarted","Data":"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c"} Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.879758 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.882386 5030 scope.go:117] "RemoveContainer" containerID="20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.889613 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.914166 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:22 crc kubenswrapper[5030]: E0120 23:24:22.914610 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-api" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.914648 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-api" Jan 20 23:24:22 crc kubenswrapper[5030]: E0120 23:24:22.914678 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-log" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.914686 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-log" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.914928 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-api" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.914963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" containerName="nova-api-log" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.916077 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.919453 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.920132 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.923061 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.934541 5030 scope.go:117] "RemoveContainer" containerID="07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.943328 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:22 crc kubenswrapper[5030]: E0120 23:24:22.947210 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf\": container with ID starting with 07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf not found: ID does not exist" containerID="07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.947257 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf"} err="failed to get container status \"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf\": rpc error: code = NotFound desc = could not find container \"07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf\": container with ID starting with 07064b562bbc7f1438f96fbc52915c6768ed3d96a4916de90f042357f25c66cf not found: ID does not exist" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.947283 5030 scope.go:117] "RemoveContainer" containerID="20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87" Jan 20 23:24:22 crc kubenswrapper[5030]: E0120 23:24:22.947785 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87\": container with ID starting with 20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87 not found: ID does not exist" containerID="20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87" Jan 20 23:24:22 crc kubenswrapper[5030]: I0120 23:24:22.947806 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87"} err="failed to get container status \"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87\": rpc error: code = NotFound desc = could not find container \"20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87\": container with ID starting with 20d324404b605058f19257e06abfa2f7487a6d9ad038c229db64679f28283b87 not found: ID does not exist" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz8p9\" (UniqueName: \"kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.039927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tz8p9\" (UniqueName: \"kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.141375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.142728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.147109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.147154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.147811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.148721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.158754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz8p9\" (UniqueName: \"kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9\") pod \"nova-api-0\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.315310 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.784031 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:24:23 crc kubenswrapper[5030]: W0120 23:24:23.788440 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef2b963_9ddd_4c61_9e8f_59a28f39afc7.slice/crio-eb6b92e8d09e0f8c49dd56746b572595ebb6115e1606402f3e37552a8a7d8eca WatchSource:0}: Error finding container eb6b92e8d09e0f8c49dd56746b572595ebb6115e1606402f3e37552a8a7d8eca: Status 404 returned error can't find the container with id eb6b92e8d09e0f8c49dd56746b572595ebb6115e1606402f3e37552a8a7d8eca Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.859121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerStarted","Data":"eb6b92e8d09e0f8c49dd56746b572595ebb6115e1606402f3e37552a8a7d8eca"} Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.879195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerStarted","Data":"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8"} Jan 20 23:24:23 crc kubenswrapper[5030]: I0120 23:24:23.976893 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7032aa-55b1-4610-bf4c-00870b8e7813" path="/var/lib/kubelet/pods/ca7032aa-55b1-4610-bf4c-00870b8e7813/volumes" Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.903420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerStarted","Data":"2e891cea972422856bd8ef4b90e6772ca327c12ee091c5085fee82919599acc9"} Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.903789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerStarted","Data":"845f94500fe5b956e4b26d202cbb8038a24b9128d9af1a24f8782e2d9fce48ad"} Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerStarted","Data":"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4"} Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907401 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907447 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="proxy-httpd" containerID="cri-o://ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4" gracePeriod=30 Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907418 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-central-agent" containerID="cri-o://ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f" gracePeriod=30 Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907544 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-notification-agent" containerID="cri-o://0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c" gracePeriod=30 Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.907593 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="sg-core" containerID="cri-o://af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8" gracePeriod=30 Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.940347 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.940311854 podStartE2EDuration="2.940311854s" podCreationTimestamp="2026-01-20 23:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:24:24.929862113 +0000 UTC m=+2937.250122401" watchObservedRunningTime="2026-01-20 23:24:24.940311854 +0000 UTC m=+2937.260572182" Jan 20 23:24:24 crc kubenswrapper[5030]: I0120 23:24:24.966523 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.33476897 podStartE2EDuration="5.966505142s" podCreationTimestamp="2026-01-20 23:24:19 +0000 UTC" firstStartedPulling="2026-01-20 23:24:20.774180786 +0000 UTC m=+2933.094441074" lastFinishedPulling="2026-01-20 23:24:24.405916948 +0000 UTC m=+2936.726177246" observedRunningTime="2026-01-20 23:24:24.959491134 +0000 UTC m=+2937.279751452" watchObservedRunningTime="2026-01-20 23:24:24.966505142 +0000 UTC m=+2937.286765430" Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921179 5030 generic.go:334] "Generic (PLEG): container finished" podID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerID="ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4" exitCode=0 Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921491 5030 generic.go:334] "Generic (PLEG): container finished" podID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerID="af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8" exitCode=2 Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921499 5030 generic.go:334] "Generic (PLEG): container finished" podID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerID="0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c" exitCode=0 Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerDied","Data":"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4"} Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerDied","Data":"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8"} Jan 20 23:24:25 crc kubenswrapper[5030]: I0120 23:24:25.921575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerDied","Data":"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c"} Jan 20 23:24:28 crc 
kubenswrapper[5030]: I0120 23:24:28.426564 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.579697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.579945 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knwqn\" (UniqueName: \"kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn\") pod \"28de376f-b687-40db-a6ee-c1f3c5297d05\" (UID: \"28de376f-b687-40db-a6ee-c1f3c5297d05\") " Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.580637 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.581473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.586869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn" (OuterVolumeSpecName: "kube-api-access-knwqn") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "kube-api-access-knwqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.592910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts" (OuterVolumeSpecName: "scripts") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.605608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.649803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.682657 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.682691 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.682702 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.682710 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28de376f-b687-40db-a6ee-c1f3c5297d05-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.682720 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knwqn\" (UniqueName: \"kubernetes.io/projected/28de376f-b687-40db-a6ee-c1f3c5297d05-kube-api-access-knwqn\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.688541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.719986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data" (OuterVolumeSpecName: "config-data") pod "28de376f-b687-40db-a6ee-c1f3c5297d05" (UID: "28de376f-b687-40db-a6ee-c1f3c5297d05"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.784325 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.784361 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28de376f-b687-40db-a6ee-c1f3c5297d05-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.964322 5030 generic.go:334] "Generic (PLEG): container finished" podID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerID="ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f" exitCode=0 Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.964383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerDied","Data":"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f"} Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.964431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"28de376f-b687-40db-a6ee-c1f3c5297d05","Type":"ContainerDied","Data":"6d284c19a5dc6c4d38e6659ec21b9a7a217287dcf310a7276a93d8a9d30b31ce"} Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.964461 5030 scope.go:117] "RemoveContainer" containerID="ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4" Jan 20 23:24:28 crc kubenswrapper[5030]: I0120 23:24:28.964514 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.011335 5030 scope.go:117] "RemoveContainer" containerID="af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.024154 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.041547 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.053675 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.054130 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="proxy-httpd" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054152 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="proxy-httpd" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.054168 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-notification-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-notification-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.054200 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-central-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 
23:24:29.054208 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-central-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.054227 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="sg-core" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054236 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="sg-core" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054446 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="sg-core" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054458 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-notification-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054486 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="proxy-httpd" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.054496 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" containerName="ceilometer-central-agent" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.056498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.059489 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.060060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.060822 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.095696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.113745 5030 scope.go:117] "RemoveContainer" containerID="0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.133463 5030 scope.go:117] "RemoveContainer" containerID="ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.158172 5030 scope.go:117] "RemoveContainer" containerID="ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.158764 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4\": container with ID starting with ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4 not found: ID does not exist" containerID="ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.158809 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4"} err="failed to get container status 
\"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4\": rpc error: code = NotFound desc = could not find container \"ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4\": container with ID starting with ffcd8bf27bab9456122dc23dd68f70f7670cd0e0b388763064ccdca75cfd64b4 not found: ID does not exist" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.158836 5030 scope.go:117] "RemoveContainer" containerID="af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.159238 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8\": container with ID starting with af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8 not found: ID does not exist" containerID="af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.159342 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8"} err="failed to get container status \"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8\": rpc error: code = NotFound desc = could not find container \"af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8\": container with ID starting with af385196233360654b60361b7d104b5701f83f777001a505069b4963855ff6b8 not found: ID does not exist" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.159418 5030 scope.go:117] "RemoveContainer" containerID="0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.159947 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c\": container with ID starting with 0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c not found: ID does not exist" containerID="0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.159976 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c"} err="failed to get container status \"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c\": rpc error: code = NotFound desc = could not find container \"0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c\": container with ID starting with 0057d04ce61f03543650dc8be1629b40d289999aded91a5e94e95e0120d3f43c not found: ID does not exist" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.159996 5030 scope.go:117] "RemoveContainer" containerID="ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f" Jan 20 23:24:29 crc kubenswrapper[5030]: E0120 23:24:29.160217 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f\": container with ID starting with ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f not found: ID does not exist" containerID="ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.160314 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f"} err="failed to get container status \"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f\": rpc error: code = NotFound desc = could not find container \"ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f\": container with ID starting with ee51aafe12c9d5df495ff9f9551a1985cb09d0adace380058ff113ca1b2ae81f not found: ID does not exist" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.198838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.198884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.198933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.198958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.198975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.199057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crtlr\" (UniqueName: \"kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.199089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.199119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301458 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crtlr\" (UniqueName: \"kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.301661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.302415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.303437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.307291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.307567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.307725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.310191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.321473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.321497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crtlr\" (UniqueName: \"kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr\") pod \"ceilometer-0\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.414862 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.927404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:24:29 crc kubenswrapper[5030]: W0120 23:24:29.938038 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod702e90be_f218_4875_842f_4ab9128c07af.slice/crio-4899972a23c54e8753a2ef0fef3ead6828ecee92ceedbfe44f03c034a2bdd04e WatchSource:0}: Error finding container 4899972a23c54e8753a2ef0fef3ead6828ecee92ceedbfe44f03c034a2bdd04e: Status 404 returned error can't find the container with id 4899972a23c54e8753a2ef0fef3ead6828ecee92ceedbfe44f03c034a2bdd04e Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.974536 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28de376f-b687-40db-a6ee-c1f3c5297d05" path="/var/lib/kubelet/pods/28de376f-b687-40db-a6ee-c1f3c5297d05/volumes" Jan 20 23:24:29 crc kubenswrapper[5030]: I0120 23:24:29.979647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerStarted","Data":"4899972a23c54e8753a2ef0fef3ead6828ecee92ceedbfe44f03c034a2bdd04e"} Jan 20 23:24:31 crc kubenswrapper[5030]: I0120 23:24:31.001419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerStarted","Data":"c93387d8e2c2a1f902f1c328944016965de643e24cca1bada095e1b5ea0c9dce"} Jan 20 23:24:32 crc kubenswrapper[5030]: I0120 23:24:32.014055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerStarted","Data":"229ac6ca1e152dd47107a09cb29f43d3a0fefdd1d5a01a30a06f366d1cb97e6c"} Jan 20 23:24:33 crc kubenswrapper[5030]: I0120 23:24:33.027709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerStarted","Data":"a419b0df1ac907972cfda436d9e2567302488fb7494ade3b518c011687d6c8f1"} Jan 20 23:24:33 crc kubenswrapper[5030]: I0120 23:24:33.316841 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:33 crc kubenswrapper[5030]: I0120 23:24:33.316916 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:34 crc kubenswrapper[5030]: I0120 23:24:34.330860 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.43:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:34 crc kubenswrapper[5030]: I0120 23:24:34.330884 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.43:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:24:35 crc kubenswrapper[5030]: I0120 23:24:35.057944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerStarted","Data":"d584867670f8f2d7ca7dd30d194d8a01069a7df39afdda8685e325b3db6c256f"} Jan 20 23:24:35 crc kubenswrapper[5030]: I0120 23:24:35.058851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:24:35 crc kubenswrapper[5030]: I0120 23:24:35.112572 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.225363705 podStartE2EDuration="6.112540952s" podCreationTimestamp="2026-01-20 23:24:29 +0000 UTC" firstStartedPulling="2026-01-20 23:24:29.941160209 +0000 UTC m=+2942.261420497" lastFinishedPulling="2026-01-20 23:24:33.828337416 +0000 UTC m=+2946.148597744" observedRunningTime="2026-01-20 23:24:35.099028618 +0000 UTC m=+2947.419288946" watchObservedRunningTime="2026-01-20 23:24:35.112540952 +0000 UTC m=+2947.432801270" Jan 20 23:24:40 crc kubenswrapper[5030]: I0120 23:24:40.157077 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:24:40 crc kubenswrapper[5030]: I0120 23:24:40.157150 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:24:43 crc kubenswrapper[5030]: I0120 23:24:43.327261 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:43 crc kubenswrapper[5030]: I0120 23:24:43.328343 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:43 crc kubenswrapper[5030]: I0120 23:24:43.329209 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:43 crc kubenswrapper[5030]: I0120 23:24:43.338753 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:44 crc kubenswrapper[5030]: I0120 23:24:44.169034 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:44 crc kubenswrapper[5030]: I0120 23:24:44.181437 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:24:52 crc kubenswrapper[5030]: I0120 23:24:52.741788 5030 scope.go:117] "RemoveContainer" containerID="300cae00e1e96594e491715916fe7d890923b30fd9931bc6ed9c3ee0ac8dbeaf" Jan 20 23:24:52 crc kubenswrapper[5030]: I0120 23:24:52.780699 5030 scope.go:117] "RemoveContainer" containerID="fe1a9151522425bf756d729b22560d3e26a3f05df618b4a2133b2f1cf4ed87cf" Jan 20 23:24:52 crc kubenswrapper[5030]: I0120 23:24:52.812227 5030 scope.go:117] "RemoveContainer" containerID="28ed1595451c9dbd0de380e299bb4915d4ae4edad80751f4c795f728b868d5b4" Jan 20 23:24:59 crc kubenswrapper[5030]: I0120 23:24:59.426941 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.849375 5030 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.877656 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.927341 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.946019 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.946220 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="9e47ba81-d55b-48df-bce7-31220884279a" containerName="memcached" containerID="cri-o://17543e07621bd81516d23441483660efcf74bfd8b130e95eccf5638cbe189c50" gracePeriod=30 Jan 20 23:25:02 crc kubenswrapper[5030]: I0120 23:25:02.973347 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.090995 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-6smx7"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.101003 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-6smx7"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.166118 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vv9sg"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.168981 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="galera" containerID="cri-o://b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" gracePeriod=30 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.174361 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-vv9sg"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.222935 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xl95q"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.224052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.247173 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.247771 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="openstack-network-exporter" containerID="cri-o://ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3" gracePeriod=300 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.260535 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xl95q"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.281082 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-6zk9r"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.308725 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-6zk9r"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.319924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.320241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.320347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.320445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqbxb\" (UniqueName: \"kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.320543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.335294 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4r5pg"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.350671 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-4r5pg"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.357397 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.357723 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" containerID="cri-o://1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" gracePeriod=30 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.357865 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="openstack-network-exporter" containerID="cri-o://b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4" gracePeriod=30 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.367790 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-x8hpn"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.369592 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.370085 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-x8hpn"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.379064 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qw7ll"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.386763 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-qw7ll"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.403027 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="ovsdbserver-nb" containerID="cri-o://43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5" gracePeriod=300 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.420634 5030 generic.go:334] "Generic (PLEG): container finished" podID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerID="ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3" exitCode=2 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.420679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerDied","Data":"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3"} Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.421669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqbxb\" (UniqueName: \"kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.421725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.421809 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.421859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.421890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.426403 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.459724 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dtsgt"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.460943 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.469044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.469936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.470005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.470154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.485326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqbxb\" (UniqueName: \"kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb\") pod \"placement-db-sync-xl95q\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.503371 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/cinder-db-sync-fs64c"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.505148 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.517700 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dtsgt"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86j2\" (UniqueName: \"kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526183 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.526253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ch9\" (UniqueName: \"kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.540415 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.551742 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-fs64c"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.563805 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tgzmq"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.565053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.587765 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tgzmq"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.613700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.614397 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="openstack-network-exporter" containerID="cri-o://bd96a3a20843e6f56642dd70f8707c625d3a285b19dc8ee6e4fe00aa44c6b9f9" gracePeriod=300 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w86j2\" (UniqueName: \"kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ch9\" (UniqueName: \"kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxx9b\" (UniqueName: \"kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.630504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.644368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.647747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.648436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.649706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.651659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.657001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ch9\" (UniqueName: \"kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9\") pod \"glance-db-sync-dtsgt\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.676441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86j2\" (UniqueName: \"kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2\") pod \"keystone-db-sync-x8hpn\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.682058 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" 
containerName="galera" containerID="cri-o://d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" gracePeriod=30 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.704912 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.716641 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735237 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxx9b\" (UniqueName: \"kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735348 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " 
pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.735406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.741476 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="ovsdbserver-sb" containerID="cri-o://6fe18aeb4fe0ded445a9b466df8df29be383fa0340292b59d55d812502344579" gracePeriod=300 Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.741541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.747123 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.758330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.763225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.767848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.769524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxx9b\" (UniqueName: \"kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.772235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.775131 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr\") pod \"neutron-db-sync-tgzmq\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.790409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts\") pod \"cinder-db-sync-fs64c\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:03 crc kubenswrapper[5030]: E0120 23:25:03.814568 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:03 crc kubenswrapper[5030]: E0120 23:25:03.825123 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:03 crc kubenswrapper[5030]: E0120 23:25:03.827214 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:03 crc kubenswrapper[5030]: E0120 23:25:03.827282 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="galera" Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.943424 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:03 crc kubenswrapper[5030]: I0120 23:25:03.943714 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8a48f49c06913d02cc09946d1fe836b60218c213d31a68d5e74ac65345df2ebc" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.031970 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.046201 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.048075 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c5052d-ee32-4414-a152-13caff78882f" path="/var/lib/kubelet/pods/78c5052d-ee32-4414-a152-13caff78882f/volumes" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.050894 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e2b764-c26c-4ef7-b6fb-c51e698b8ea1" path="/var/lib/kubelet/pods/81e2b764-c26c-4ef7-b6fb-c51e698b8ea1/volumes" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.058893 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a403eee4-765c-41f2-9ae3-fa3151068a29" path="/var/lib/kubelet/pods/a403eee4-765c-41f2-9ae3-fa3151068a29/volumes" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.059646 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b334eade-4a24-4596-b272-d8d3c91940fc" path="/var/lib/kubelet/pods/b334eade-4a24-4596-b272-d8d3c91940fc/volumes" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.060297 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0990a32-fa39-48dc-b0f5-c8a366f9632c" path="/var/lib/kubelet/pods/e0990a32-fa39-48dc-b0f5-c8a366f9632c/volumes" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.075532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-5px6h"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.083804 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-5px6h"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.122044 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-qwhwh"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.137251 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-qwhwh"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.183263 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-t9frk"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.184774 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.190059 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.190264 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.208190 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-t9frk"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.211879 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb/ovsdbserver-nb/0.log" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.211948 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.248229 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-nchp6"] Jan 20 23:25:04 crc kubenswrapper[5030]: E0120 23:25:04.248599 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="openstack-network-exporter" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.248672 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="openstack-network-exporter" Jan 20 23:25:04 crc kubenswrapper[5030]: E0120 23:25:04.248700 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="ovsdbserver-nb" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.248706 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="ovsdbserver-nb" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.248898 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="openstack-network-exporter" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.248908 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerName="ovsdbserver-nb" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.249569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.264027 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-nchp6"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.280409 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.280869 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-server" containerID="cri-o://e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.280973 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="swift-recon-cron" containerID="cri-o://e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281005 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="rsync" containerID="cri-o://ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281037 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-expirer" containerID="cri-o://d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281083 5030 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-updater" containerID="cri-o://898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281129 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-auditor" containerID="cri-o://de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-replicator" containerID="cri-o://43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281241 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-server" containerID="cri-o://2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281269 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-updater" containerID="cri-o://6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281298 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-auditor" containerID="cri-o://0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281327 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-replicator" containerID="cri-o://a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281353 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-server" containerID="cri-o://8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281381 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-reaper" containerID="cri-o://8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.281411 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-auditor" containerID="cri-o://4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 
23:25:04.281450 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-replicator" containerID="cri-o://2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.348772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgm86\" (UniqueName: \"kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.348829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.348860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.349578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.349919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.349966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.350008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.350088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts\") pod \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\" (UID: \"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb\") " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.350594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" 
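The records above show the kubelet tearing down the openstack-kuttl-tests pods: each SyncLoop DELETE is followed by kuberuntime_container.go:808 "Killing container with a grace period" messages carrying the pod, container name and gracePeriod (30s for most services, 300s for the ovsdbserver containers). As a minimal sketch for pulling those termination records out of a saved copy of this journal (the input file name is an assumption, and the regular expression simply follows the field order visible in the messages above), a stdlib-only Go scanner could look like this:

package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"regexp"
)

func main() {
	// Hypothetical path to a saved copy of this journal,
	// e.g. taken with: journalctl -u kubelet > kubelet-journal.log
	f, err := os.Open("kubelet-journal.log")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Field order as it appears in the kuberuntime_container.go:808 records above:
	// pod="..." podUID="..." containerName="..." containerID="..." gracePeriod=N
	re := regexp.MustCompile(`Killing container with a grace period" pod="([^"]+)" podUID="[^"]+" containerName="([^"]+)" containerID="[^"]+" gracePeriod=(\d+)`)

	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // physical lines in this dump are very long
	for sc.Scan() {
		// A single physical line may hold several records, so collect every match on it.
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			fmt.Printf("%-50s %-30s gracePeriod=%ss\n", m[1], m[2], m[3])
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
}

Run against the records above, this would list, for example, openstack-kuttl-tests/memcached-0 (memcached) at gracePeriod=30s and openstack-kuttl-tests/ovsdbserver-nb-0 (ovsdbserver-nb) at gracePeriod=300s.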
Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.350662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.350896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6fc\" (UniqueName: \"kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtdm\" (UniqueName: \"kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.351535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.354676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.354961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts" (OuterVolumeSpecName: "scripts") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.354947 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config" (OuterVolumeSpecName: "config") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.362477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.380223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86" (OuterVolumeSpecName: "kube-api-access-fgm86") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "kube-api-access-fgm86". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.385870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473574 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6fc\" (UniqueName: \"kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtdm\" (UniqueName: \"kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473817 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473833 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473844 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473853 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgm86\" (UniqueName: \"kubernetes.io/projected/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-kube-api-access-fgm86\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473863 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.473880 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.475404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.477331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.479503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.480118 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " 
pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.480920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.500995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.509903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.513090 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.520966 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" containerName="kube-state-metrics" containerID="cri-o://1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a" gracePeriod=30 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.526248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xl95q"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.540338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtdm\" (UniqueName: \"kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm\") pod \"barbican-db-sync-nchp6\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.555904 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_548fd8df-abc4-45a7-9967-72fab91a66a8/ovsdbserver-sb/0.log" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.555965 5030 generic.go:334] "Generic (PLEG): container finished" podID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerID="bd96a3a20843e6f56642dd70f8707c625d3a285b19dc8ee6e4fe00aa44c6b9f9" exitCode=2 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.555985 5030 generic.go:334] "Generic (PLEG): container finished" podID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerID="6fe18aeb4fe0ded445a9b466df8df29be383fa0340292b59d55d812502344579" exitCode=143 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.556092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerDied","Data":"bd96a3a20843e6f56642dd70f8707c625d3a285b19dc8ee6e4fe00aa44c6b9f9"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.556178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" 
event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerDied","Data":"6fe18aeb4fe0ded445a9b466df8df29be383fa0340292b59d55d812502344579"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.565656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6fc\" (UniqueName: \"kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568591 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb/ovsdbserver-nb/0.log" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568683 5030 generic.go:334] "Generic (PLEG): container finished" podID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" containerID="43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5" exitCode=143 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerDied","Data":"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb","Type":"ContainerDied","Data":"685939fe3cb6d868616254e3278d31a750f5b1768e062fc532327bdb009ad0f1"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568778 5030 scope.go:117] "RemoveContainer" containerID="ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.568902 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.575990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf\") pod \"swift-ring-rebalance-t9frk\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.581119 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.591879 5030 generic.go:334] "Generic (PLEG): container finished" podID="faa5c453-046f-4c27-93cf-c291aaee541e" containerID="b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4" exitCode=2 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.592018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerDied","Data":"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.609015 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.614406 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.624828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659680 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f" exitCode=0 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659718 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770" exitCode=0 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659730 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6" exitCode=0 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659746 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a" exitCode=0 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.659875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.665215 5030 generic.go:334] "Generic (PLEG): container finished" podID="9e47ba81-d55b-48df-bce7-31220884279a" containerID="17543e07621bd81516d23441483660efcf74bfd8b130e95eccf5638cbe189c50" exitCode=0 Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.665251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" 
event={"ID":"9e47ba81-d55b-48df-bce7-31220884279a","Type":"ContainerDied","Data":"17543e07621bd81516d23441483660efcf74bfd8b130e95eccf5638cbe189c50"} Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.681984 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.682027 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.695843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" (UID: "ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.703985 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.716602 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-t7hls"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.732558 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.740817 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-c97w9"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.784774 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.823304 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.824666 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.847916 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.877101 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.923146 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.924321 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.930489 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.948080 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4"] Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.991560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.993254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.993327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:04 crc kubenswrapper[5030]: I0120 23:25:04.993380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hrrp\" (UniqueName: \"kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.052839 5030 scope.go:117] "RemoveContainer" containerID="43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.091748 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dtsgt"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxdxw\" 
(UniqueName: \"kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hrrp\" (UniqueName: \"kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.096302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.113676 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.113936 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-central-agent" containerID="cri-o://c93387d8e2c2a1f902f1c328944016965de643e24cca1bada095e1b5ea0c9dce" gracePeriod=30 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.114040 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="proxy-httpd" containerID="cri-o://d584867670f8f2d7ca7dd30d194d8a01069a7df39afdda8685e325b3db6c256f" gracePeriod=30 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.114085 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="sg-core" 
containerID="cri-o://a419b0df1ac907972cfda436d9e2567302488fb7494ade3b518c011687d6c8f1" gracePeriod=30 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.114117 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-notification-agent" containerID="cri-o://229ac6ca1e152dd47107a09cb29f43d3a0fefdd1d5a01a30a06f366d1cb97e6c" gracePeriod=30 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.116764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.130189 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.138543 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.138944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.140048 5030 scope.go:117] "RemoveContainer" containerID="ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.141082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.149239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hrrp\" (UniqueName: \"kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp\") pod \"nova-cell1-conductor-db-sync-676fq\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.150867 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.152378 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.169800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.172943 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.173483 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.174379 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3\": container with ID starting with ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3 not found: ID does not exist" containerID="ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.174410 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3"} err="failed to get container status \"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3\": rpc error: code = NotFound desc = could not find container \"ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3\": container with ID starting with ecbd8b4feb5fd36330625ded1056eb0e060128772f26a3acce331c05cfe6ccf3 not found: ID does not exist" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.174504 5030 scope.go:117] "RemoveContainer" containerID="43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.177468 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-vh4r9" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.179810 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.185782 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5\": container with ID starting with 43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5 not found: ID does not exist" containerID="43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.185857 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5"} err="failed to get container status \"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5\": rpc error: code = NotFound desc = could not find container \"43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5\": container with ID starting with 43082a76d23b8074e1ed19a5330bef2bfc3758f17d62bee8b5fae165abbbefd5 not found: ID does not exist" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.190091 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_548fd8df-abc4-45a7-9967-72fab91a66a8/ovsdbserver-sb/0.log" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 
23:25:05.190298 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.227228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.230304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxdxw\" (UniqueName: \"kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.230506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.230597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.247292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.264599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.265089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.271992 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxdxw\" (UniqueName: \"kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw\") pod \"nova-cell0-conductor-db-sync-m5mr4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.275397 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.287202 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.291818 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.291951 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.300132 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15 is running failed: container process not found" containerID="b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.307514 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15 is running failed: container process not found" containerID="b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.313706 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15 is running failed: container process not found" containerID="b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:25:05 crc kubenswrapper[5030]: E0120 23:25:05.313764 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="galera" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338170 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn87b\" (UniqueName: \"kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config\") pod \"548fd8df-abc4-45a7-9967-72fab91a66a8\" (UID: \"548fd8df-abc4-45a7-9967-72fab91a66a8\") " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338894 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g9nn\" (UniqueName: \"kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.338997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.339046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.339078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.344706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.345794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config" (OuterVolumeSpecName: "config") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.358223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts" (OuterVolumeSpecName: "scripts") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.365655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.366475 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.373791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b" (OuterVolumeSpecName: "kube-api-access-rn87b") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "kube-api-access-rn87b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.456723 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.457892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g9nn\" (UniqueName: \"kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.457966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.457981 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458242 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458257 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458269 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn87b\" (UniqueName: \"kubernetes.io/projected/548fd8df-abc4-45a7-9967-72fab91a66a8-kube-api-access-rn87b\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458279 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548fd8df-abc4-45a7-9967-72fab91a66a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.458296 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.460604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.461292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.461829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.466356 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.494762 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.494908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.496579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.498480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g9nn\" (UniqueName: \"kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.509473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-x8hpn"] Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.611700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.617813 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.664654 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.697228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.731405 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerID="b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" exitCode=0 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.731472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerDied","Data":"b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.731502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"e0213ca2-a56f-4a81-bfe7-814d2eeeb275","Type":"ContainerDied","Data":"a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.731515 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11f5559db2164e8b385540e9ce8119882e0b0d3af1526d64480ce2cc73b6bd5" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.743343 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_548fd8df-abc4-45a7-9967-72fab91a66a8/ovsdbserver-sb/0.log" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.743494 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.744211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"548fd8df-abc4-45a7-9967-72fab91a66a8","Type":"ContainerDied","Data":"8500dd757a5727ff00695803c642317dfe7c4f2c9c9e647cbdbbd6f3b97a9161"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.744252 5030 scope.go:117] "RemoveContainer" containerID="bd96a3a20843e6f56642dd70f8707c625d3a285b19dc8ee6e4fe00aa44c6b9f9" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.753813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" event={"ID":"43181c01-0cec-4ad9-b037-880357999794","Type":"ContainerStarted","Data":"b964da6f15dcddd67731a7aaf5812e4acf7bb29d5705242453682a8378c0b7f8"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.756002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.791933 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.792042 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.793724 5030 generic.go:334] "Generic (PLEG): container finished" podID="702e90be-f218-4875-842f-4ab9128c07af" containerID="d584867670f8f2d7ca7dd30d194d8a01069a7df39afdda8685e325b3db6c256f" exitCode=0 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.793985 5030 generic.go:334] "Generic (PLEG): container finished" podID="702e90be-f218-4875-842f-4ab9128c07af" containerID="a419b0df1ac907972cfda436d9e2567302488fb7494ade3b518c011687d6c8f1" exitCode=2 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.794069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerDied","Data":"d584867670f8f2d7ca7dd30d194d8a01069a7df39afdda8685e325b3db6c256f"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.794096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerDied","Data":"a419b0df1ac907972cfda436d9e2567302488fb7494ade3b518c011687d6c8f1"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.794179 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.827484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "548fd8df-abc4-45a7-9967-72fab91a66a8" (UID: "548fd8df-abc4-45a7-9967-72fab91a66a8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.847928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" event={"ID":"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063","Type":"ContainerStarted","Data":"175185716a54fc745b8363fbda184eb388e6692213ad6b5915fa1863322d7db2"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.863747 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" containerID="8a48f49c06913d02cc09946d1fe836b60218c213d31a68d5e74ac65345df2ebc" exitCode=0 Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.863798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"aa0ffc1d-cef9-4431-8f48-012c129e75b5","Type":"ContainerDied","Data":"8a48f49c06913d02cc09946d1fe836b60218c213d31a68d5e74ac65345df2ebc"} Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.893867 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548fd8df-abc4-45a7-9967-72fab91a66a8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:05 crc kubenswrapper[5030]: I0120 23:25:05.908718 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:05.998100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs\") pod \"9e47ba81-d55b-48df-bce7-31220884279a\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:05.998156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data\") pod \"9e47ba81-d55b-48df-bce7-31220884279a\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:05.998183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hj84\" (UniqueName: \"kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84\") pod \"9e47ba81-d55b-48df-bce7-31220884279a\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:05.998222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config\") pod \"9e47ba81-d55b-48df-bce7-31220884279a\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:05.998270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle\") pod \"9e47ba81-d55b-48df-bce7-31220884279a\" (UID: \"9e47ba81-d55b-48df-bce7-31220884279a\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.005816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data" (OuterVolumeSpecName: "config-data") pod "9e47ba81-d55b-48df-bce7-31220884279a" (UID: "9e47ba81-d55b-48df-bce7-31220884279a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.008733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9e47ba81-d55b-48df-bce7-31220884279a" (UID: "9e47ba81-d55b-48df-bce7-31220884279a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.036756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84" (OuterVolumeSpecName: "kube-api-access-7hj84") pod "9e47ba81-d55b-48df-bce7-31220884279a" (UID: "9e47ba81-d55b-48df-bce7-31220884279a"). InnerVolumeSpecName "kube-api-access-7hj84". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.037829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0795a100-bef0-4357-94fb-4cc0d177d748" path="/var/lib/kubelet/pods/0795a100-bef0-4357-94fb-4cc0d177d748/volumes" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.039544 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b62b69c-8413-4732-9816-9d2f635d47b5" path="/var/lib/kubelet/pods/5b62b69c-8413-4732-9816-9d2f635d47b5/volumes" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.043354 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a504a208-146a-4ba0-8145-b42daf7f5f1a" path="/var/lib/kubelet/pods/a504a208-146a-4ba0-8145-b42daf7f5f1a/volumes" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.046498 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb" path="/var/lib/kubelet/pods/ccbcbef9-2e99-4e41-9b5f-ee87d2f812cb/volumes" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.053045 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.053685 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d427c956-ad6c-44b8-b112-ec484902b44c" path="/var/lib/kubelet/pods/d427c956-ad6c-44b8-b112-ec484902b44c/volumes" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.074979 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tgzmq"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.076871 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078112 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078135 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078142 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078149 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078157 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078164 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078171 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078178 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078184 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078191 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c" exitCode=0 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078315 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.078332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.094782 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-t9frk"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.103225 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.103254 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9e47ba81-d55b-48df-bce7-31220884279a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.103263 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hj84\" (UniqueName: \"kubernetes.io/projected/9e47ba81-d55b-48df-bce7-31220884279a-kube-api-access-7hj84\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.120368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" 
event={"ID":"9e47ba81-d55b-48df-bce7-31220884279a","Type":"ContainerDied","Data":"1d198fefad76f437ed452f5a60cc2a6f2226b2aba4bc29f253596f279ebcbb6c"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.120463 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.138933 5030 generic.go:334] "Generic (PLEG): container finished" podID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" containerID="1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a" exitCode=2 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.139077 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.139626 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7295c1fa-5222-4dd7-83de-c6cbc3d10d08","Type":"ContainerDied","Data":"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.139670 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"7295c1fa-5222-4dd7-83de-c6cbc3d10d08","Type":"ContainerDied","Data":"907f1d49209d932c69c4de1af622616dc4250ac29fead804f280cdde35f3f790"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.145638 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.161582 5030 scope.go:117] "RemoveContainer" containerID="6fe18aeb4fe0ded445a9b466df8df29be383fa0340292b59d55d812502344579" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.166034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xl95q" event={"ID":"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5","Type":"ContainerStarted","Data":"51142efc7916ed03c6153a5dcb3fe231b88d5aa152a4410501c160c9b0963e66"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.166085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xl95q" event={"ID":"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5","Type":"ContainerStarted","Data":"c4f4ba8b2379bebb32ac1c3eda6f53fc24741019d3d91a4919cd7e061982d91c"} Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9gzd\" (UniqueName: \"kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd\") pod \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle\") pod \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " Jan 20 23:25:06 crc 
kubenswrapper[5030]: I0120 23:25:06.204416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config\") pod \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs\") pod \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\" (UID: \"7295c1fa-5222-4dd7-83de-c6cbc3d10d08\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.204789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gsqx\" (UniqueName: \"kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx\") pod \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\" (UID: \"e0213ca2-a56f-4a81-bfe7-814d2eeeb275\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.207236 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: 
"e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.213149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.213246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.213959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.247879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd" (OuterVolumeSpecName: "kube-api-access-x9gzd") pod "7295c1fa-5222-4dd7-83de-c6cbc3d10d08" (UID: "7295c1fa-5222-4dd7-83de-c6cbc3d10d08"). InnerVolumeSpecName "kube-api-access-x9gzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.251313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx" (OuterVolumeSpecName: "kube-api-access-2gsqx") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "kube-api-access-2gsqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.284111 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-fs64c"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.302439 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-nchp6"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.303353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.306481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mrf8\" (UniqueName: \"kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8\") pod \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.306534 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data\") pod \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.306677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle\") pod \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.306825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs\") pod \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.306865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs\") pod \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\" (UID: \"aa0ffc1d-cef9-4431-8f48-012c129e75b5\") " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307353 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307365 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307399 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307412 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307424 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.307433 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gsqx\" (UniqueName: \"kubernetes.io/projected/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-kube-api-access-2gsqx\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 
23:25:06.307442 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9gzd\" (UniqueName: \"kubernetes.io/projected/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-api-access-x9gzd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.331880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8" (OuterVolumeSpecName: "kube-api-access-9mrf8") pod "aa0ffc1d-cef9-4431-8f48-012c129e75b5" (UID: "aa0ffc1d-cef9-4431-8f48-012c129e75b5"). InnerVolumeSpecName "kube-api-access-9mrf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.338119 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.355898 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.368670 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.368813 5030 scope.go:117] "RemoveContainer" containerID="17543e07621bd81516d23441483660efcf74bfd8b130e95eccf5638cbe189c50" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369150 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="ovsdbserver-sb" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369170 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="ovsdbserver-sb" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369183 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369190 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369206 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="openstack-network-exporter" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369212 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="openstack-network-exporter" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369224 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" containerName="kube-state-metrics" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369230 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" containerName="kube-state-metrics" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369243 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e47ba81-d55b-48df-bce7-31220884279a" containerName="memcached" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369251 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e47ba81-d55b-48df-bce7-31220884279a" containerName="memcached" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369278 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="mysql-bootstrap" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369284 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="mysql-bootstrap" Jan 20 23:25:06 crc kubenswrapper[5030]: E0120 23:25:06.369296 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="galera" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369301 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="galera" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369481 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369496 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" containerName="kube-state-metrics" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="ovsdbserver-sb" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369514 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" containerName="openstack-network-exporter" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369525 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" containerName="galera" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.369535 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e47ba81-d55b-48df-bce7-31220884279a" containerName="memcached" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.371589 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-xl95q" podStartSLOduration=3.37157163 podStartE2EDuration="3.37157163s" podCreationTimestamp="2026-01-20 23:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:06.24843488 +0000 UTC m=+2978.568695168" watchObservedRunningTime="2026-01-20 23:25:06.37157163 +0000 UTC m=+2978.691831918" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.382849 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.387942 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.388771 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-4z7ks" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.388900 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.388999 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:25:06 crc kubenswrapper[5030]: W0120 23:25:06.398330 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc26a88_4c14_43cd_8255_cf5bb3e15931.slice/crio-9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11 WatchSource:0}: Error finding container 9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11: Status 404 returned error can't find the container with id 9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.410808 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mrf8\" (UniqueName: \"kubernetes.io/projected/aa0ffc1d-cef9-4431-8f48-012c129e75b5-kube-api-access-9mrf8\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.411378 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.442773 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.474768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e47ba81-d55b-48df-bce7-31220884279a" (UID: "9e47ba81-d55b-48df-bce7-31220884279a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.509592 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "9e47ba81-d55b-48df-bce7-31220884279a" (UID: "9e47ba81-d55b-48df-bce7-31220884279a"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.513427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.513698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.513782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.513982 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.514323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.514389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbbm\" (UniqueName: \"kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.514452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.515034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.515145 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.515167 
5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.515181 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e47ba81-d55b-48df-bce7-31220884279a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.528184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.594641 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619640 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbbm\" (UniqueName: \"kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.619764 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.621457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.624346 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.627056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.627688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.635972 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data" (OuterVolumeSpecName: "config-data") pod "aa0ffc1d-cef9-4431-8f48-012c129e75b5" (UID: "aa0ffc1d-cef9-4431-8f48-012c129e75b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.640796 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.650906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa0ffc1d-cef9-4431-8f48-012c129e75b5" (UID: "aa0ffc1d-cef9-4431-8f48-012c129e75b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.653299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.667894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.668003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.680501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbbm\" (UniqueName: \"kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.715768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "7295c1fa-5222-4dd7-83de-c6cbc3d10d08" (UID: "7295c1fa-5222-4dd7-83de-c6cbc3d10d08"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.726994 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.727037 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.727055 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.793915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ovsdbserver-sb-0\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.881720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "7295c1fa-5222-4dd7-83de-c6cbc3d10d08" (UID: "7295c1fa-5222-4dd7-83de-c6cbc3d10d08"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: W0120 23:25:06.895716 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3358fbb1_638c_4f20_804e_b672441288e4.slice/crio-026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714 WatchSource:0}: Error finding container 026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714: Status 404 returned error can't find the container with id 026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714 Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.898585 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4"] Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.918839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7295c1fa-5222-4dd7-83de-c6cbc3d10d08" (UID: "7295c1fa-5222-4dd7-83de-c6cbc3d10d08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.920729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "aa0ffc1d-cef9-4431-8f48-012c129e75b5" (UID: "aa0ffc1d-cef9-4431-8f48-012c129e75b5"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.946897 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.946924 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.946935 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7295c1fa-5222-4dd7-83de-c6cbc3d10d08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.956310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e0213ca2-a56f-4a81-bfe7-814d2eeeb275" (UID: "e0213ca2-a56f-4a81-bfe7-814d2eeeb275"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:06 crc kubenswrapper[5030]: I0120 23:25:06.967743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "aa0ffc1d-cef9-4431-8f48-012c129e75b5" (UID: "aa0ffc1d-cef9-4431-8f48-012c129e75b5"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.048897 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0213ca2-a56f-4a81-bfe7-814d2eeeb275-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.050058 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0ffc1d-cef9-4431-8f48-012c129e75b5-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.067055 5030 scope.go:117] "RemoveContainer" containerID="1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.100215 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.164224 5030 scope.go:117] "RemoveContainer" containerID="1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a" Jan 20 23:25:07 crc kubenswrapper[5030]: E0120 23:25:07.165105 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a\": container with ID starting with 1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a not found: ID does not exist" containerID="1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.165145 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a"} err="failed to get container status \"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a\": rpc error: code = NotFound desc = could not find container \"1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a\": container with ID starting with 1ddbb073ac679c1112026d39d87c0c538ebb786b29bcf41c39d15c48568ec20a not found: ID does not exist" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.197373 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.221154 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.260940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" event={"ID":"3358fbb1-638c-4f20-804e-b672441288e4","Type":"ContainerStarted","Data":"026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.298873 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.318007 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.319327 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.326878 5030 generic.go:334] "Generic (PLEG): container finished" podID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" exitCode=0 Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.326945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerDied","Data":"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.326966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5","Type":"ContainerDied","Data":"250488f9a7b916786e685f61c5d2e8800ff8b73d0d965fcaa7ba487550700706"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.326982 5030 scope.go:117] "RemoveContainer" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.337891 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.371586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" event={"ID":"d9865414-bdc0-4dd1-baee-d04985de4dd0","Type":"ContainerStarted","Data":"499cc1f0a31d5544b98226ce482c0fd77880d2910ee01a1c3f5bbfccfa9db9a6"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.377047 5030 generic.go:334] "Generic (PLEG): container finished" podID="702e90be-f218-4875-842f-4ab9128c07af" containerID="229ac6ca1e152dd47107a09cb29f43d3a0fefdd1d5a01a30a06f366d1cb97e6c" exitCode=0 Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.377068 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="702e90be-f218-4875-842f-4ab9128c07af" containerID="c93387d8e2c2a1f902f1c328944016965de643e24cca1bada095e1b5ea0c9dce" exitCode=0 Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.377100 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerDied","Data":"229ac6ca1e152dd47107a09cb29f43d3a0fefdd1d5a01a30a06f366d1cb97e6c"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.377120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerDied","Data":"c93387d8e2c2a1f902f1c328944016965de643e24cca1bada095e1b5ea0c9dce"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.383074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" event={"ID":"3bc26a88-4c14-43cd-8255-cf5bb3e15931","Type":"ContainerStarted","Data":"9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.390842 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: E0120 23:25:07.391220 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="mysql-bootstrap" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.391232 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="mysql-bootstrap" Jan 20 23:25:07 crc kubenswrapper[5030]: E0120 23:25:07.391248 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="galera" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.391254 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="galera" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.391471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" containerName="galera" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.392073 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.397956 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.398277 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.402550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" event={"ID":"3cb2c9c5-64c5-4d21-b370-4fb3718220c5","Type":"ContainerStarted","Data":"f6359ed0b74fc12aa33ee4212fc0ea850cc6f282ea26ce2b047001c64e9c3f8e"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.406113 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-ztdtq" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.422088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"aa0ffc1d-cef9-4431-8f48-012c129e75b5","Type":"ContainerDied","Data":"de86ae1d4e3bac06b48ef5982b4e4ea557972eb3a4c0b2b46eac9c82860798d4"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.422187 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.439387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" event={"ID":"31020647-e3cc-44ab-a754-439b424e50cd","Type":"ContainerStarted","Data":"fed6dc3be2d915fcc3b39523184f5ef9745dab7ab7b1d39074cb7735879d32a6"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.468308 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.469442 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.472759 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.472962 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mwr2\" (UniqueName: \"kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts\") pod \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\" (UID: \"2bdbab30-d0e5-435f-8b5a-19b1bb9496c5\") " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.474895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.475007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xzp\" (UniqueName: \"kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.475066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.475111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.475163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.476382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.476435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.476753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.477608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.498493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerStarted","Data":"4c63500705386fcb4fea39b9804da485e5f9b1b682d80c064214b0d7849248ab"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.512235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" event={"ID":"43181c01-0cec-4ad9-b037-880357999794","Type":"ContainerStarted","Data":"aebde48ffd7b79ddf812bbd9fe02d63be7d61af3719098b682a1f12190844949"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.524582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.526465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2" (OuterVolumeSpecName: "kube-api-access-7mwr2") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "kube-api-access-7mwr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.546701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.555150 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.557393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" event={"ID":"27233b7c-c856-4a41-a674-7399905591de","Type":"ContainerStarted","Data":"680a753b3a2fc91ad4ba12b423c86df8b9406533da21fafde32fa4e13dd16387"} Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55pw\" (UniqueName: \"kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 
23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xzp\" (UniqueName: \"kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.576978 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577045 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577056 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mwr2\" (UniqueName: \"kubernetes.io/projected/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kube-api-access-7mwr2\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577065 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577076 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577086 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.577873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.578342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.631735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.632469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.632761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xzp\" (UniqueName: \"kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.632950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.640864 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" podStartSLOduration=4.6408420360000004 podStartE2EDuration="4.640842036s" podCreationTimestamp="2026-01-20 23:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:07.54931307 +0000 UTC m=+2979.869573358" watchObservedRunningTime="2026-01-20 23:25:07.640842036 +0000 UTC m=+2979.961102324" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.684538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.685410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.685695 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.685785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55pw\" (UniqueName: \"kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.685920 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.689949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.702008 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.703425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55pw\" (UniqueName: \"kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.705939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.897004 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.967199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.991252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" (UID: "2bdbab30-d0e5-435f-8b5a-19b1bb9496c5"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.993911 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.993933 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:07 crc kubenswrapper[5030]: I0120 23:25:07.993943 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.026186 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548fd8df-abc4-45a7-9967-72fab91a66a8" path="/var/lib/kubelet/pods/548fd8df-abc4-45a7-9967-72fab91a66a8/volumes" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.029112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7295c1fa-5222-4dd7-83de-c6cbc3d10d08" path="/var/lib/kubelet/pods/7295c1fa-5222-4dd7-83de-c6cbc3d10d08/volumes" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.029648 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e47ba81-d55b-48df-bce7-31220884279a" path="/var/lib/kubelet/pods/9e47ba81-d55b-48df-bce7-31220884279a/volumes" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.084914 5030 scope.go:117] "RemoveContainer" containerID="bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.119083 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.216803 5030 scope.go:117] "RemoveContainer" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.217602 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca\": container with ID starting with d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca not found: ID does not exist" containerID="d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.217672 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca"} err="failed to get container status \"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca\": rpc error: code = NotFound desc = could not find container \"d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca\": container with ID starting with 
d1f4c5c1f33809a2b042032b851a9703ffd7c394082b52bc534b5c5c65ad46ca not found: ID does not exist" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.217701 5030 scope.go:117] "RemoveContainer" containerID="bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22" Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.222819 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22\": container with ID starting with bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22 not found: ID does not exist" containerID="bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.222855 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22"} err="failed to get container status \"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22\": rpc error: code = NotFound desc = could not find container \"bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22\": container with ID starting with bbc4b865330a5660417c7e9fa6e4fcc8d10892e7dfea8227af6e11c00479ab22 not found: ID does not exist" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.222883 5030 scope.go:117] "RemoveContainer" containerID="8a48f49c06913d02cc09946d1fe836b60218c213d31a68d5e74ac65345df2ebc" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.239841 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.241991 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.260682 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.261419 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.293831 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.303285 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.303921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crtlr\" (UniqueName: \"kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304242 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.304276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts\") pod \"702e90be-f218-4875-842f-4ab9128c07af\" (UID: \"702e90be-f218-4875-842f-4ab9128c07af\") " Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.308280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.308516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.341973 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342026 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.342339 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-central-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342350 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-central-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.342385 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="sg-core" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342392 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="sg-core" Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.342419 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="proxy-httpd" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342426 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="proxy-httpd" Jan 20 23:25:08 crc kubenswrapper[5030]: E0120 23:25:08.342436 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-notification-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342441 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-notification-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342605 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="sg-core" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342613 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="proxy-httpd" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342641 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-notification-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.342651 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="702e90be-f218-4875-842f-4ab9128c07af" containerName="ceilometer-central-agent" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.343145 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.343214 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.344892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts" (OuterVolumeSpecName: "scripts") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.346100 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.348016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.349257 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.369191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr" (OuterVolumeSpecName: "kube-api-access-crtlr") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "kube-api-access-crtlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.406254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.406430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbjbt\" (UniqueName: \"kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.406473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.406644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.406913 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.407011 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.407025 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crtlr\" (UniqueName: \"kubernetes.io/projected/702e90be-f218-4875-842f-4ab9128c07af-kube-api-access-crtlr\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.407036 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/702e90be-f218-4875-842f-4ab9128c07af-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.407049 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.407100 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.414470 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.418967 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.419123 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-86kd6" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.419285 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.419292 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.419837 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.508809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.508888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.508920 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.508950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fr6n\" (UniqueName: \"kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbjbt\" (UniqueName: \"kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.509295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.517842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.543594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbjbt\" (UniqueName: \"kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.548248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.552253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.554205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.584916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" event={"ID":"3cb2c9c5-64c5-4d21-b370-4fb3718220c5","Type":"ContainerStarted","Data":"41f49fd7ec11dd9a9a557c1bb592ba6c7982e9eb0f90238a36c0c561df24cec1"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613189 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc 
kubenswrapper[5030]: I0120 23:25:08.613267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fr6n\" (UniqueName: \"kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.613936 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" podStartSLOduration=5.6139236409999995 podStartE2EDuration="5.613923641s" podCreationTimestamp="2026-01-20 23:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.607456575 +0000 UTC m=+2980.927716863" watchObservedRunningTime="2026-01-20 23:25:08.613923641 +0000 UTC m=+2980.934183929" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.614740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.615077 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.620059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.620724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.624315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.624631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"702e90be-f218-4875-842f-4ab9128c07af","Type":"ContainerDied","Data":"4899972a23c54e8753a2ef0fef3ead6828ecee92ceedbfe44f03c034a2bdd04e"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.624685 5030 scope.go:117] "RemoveContainer" containerID="d584867670f8f2d7ca7dd30d194d8a01069a7df39afdda8685e325b3db6c256f" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.624715 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.631674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.633842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" event={"ID":"3358fbb1-638c-4f20-804e-b672441288e4","Type":"ContainerStarted","Data":"5c3963eec2dd418b836709714d9c6ad213a8435b737aa8a3604f4191d504f0a0"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.635646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.664958 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.664669 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" podStartSLOduration=5.664654739 podStartE2EDuration="5.664654739s" podCreationTimestamp="2026-01-20 23:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.640081145 +0000 UTC m=+2980.960341433" watchObservedRunningTime="2026-01-20 23:25:08.664654739 +0000 UTC m=+2980.984915027" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.679961 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="rabbitmq" containerID="cri-o://697fa9aa2472880601d53ece29e19b607f5f96fff9acf555dbdca856864c6ede" gracePeriod=604795 Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.683880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fr6n\" (UniqueName: \"kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.695730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerStarted","Data":"17a21a1997b5aa160923c8f2a8f47c74ef600585b2bfc9e4e742fece46b9bb39"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.697668 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" podStartSLOduration=4.697650578 podStartE2EDuration="4.697650578s" podCreationTimestamp="2026-01-20 23:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.680966114 +0000 UTC m=+2981.001226402" 
watchObservedRunningTime="2026-01-20 23:25:08.697650578 +0000 UTC m=+2981.017910866" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.715732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.715936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" event={"ID":"27233b7c-c856-4a41-a674-7399905591de","Type":"ContainerStarted","Data":"616bdc366eb8dd795e7fbeb8202645472cb589464fac89005bdee7e84a582a16"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.716495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.717522 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.731969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" event={"ID":"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063","Type":"ContainerStarted","Data":"e3bc077b333e186d58c38afabbc39ee5da09f71b50da1b5bffef05ed628247e9"} Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.788684 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" podStartSLOduration=4.788662471 podStartE2EDuration="4.788662471s" podCreationTimestamp="2026-01-20 23:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.756023851 +0000 UTC m=+2981.076284139" watchObservedRunningTime="2026-01-20 23:25:08.788662471 +0000 UTC m=+2981.108922759" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.809399 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" podStartSLOduration=5.809380423 podStartE2EDuration="5.809380423s" podCreationTimestamp="2026-01-20 23:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.77745869 +0000 UTC m=+2981.097718978" watchObservedRunningTime="2026-01-20 23:25:08.809380423 +0000 UTC m=+2981.129640711" Jan 20 23:25:08 crc kubenswrapper[5030]: W0120 23:25:08.833790 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a13994b_a11d_4798_9aae_b91a3d73d440.slice/crio-bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b WatchSource:0}: Error finding container bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b: Status 404 returned error can't find the container with id bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b Jan 20 23:25:08 crc 
kubenswrapper[5030]: I0120 23:25:08.853063 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" podStartSLOduration=4.8530478299999995 podStartE2EDuration="4.85304783s" podCreationTimestamp="2026-01-20 23:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:08.819096328 +0000 UTC m=+2981.139356606" watchObservedRunningTime="2026-01-20 23:25:08.85304783 +0000 UTC m=+2981.173308118" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.858775 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.871053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data" (OuterVolumeSpecName: "config-data") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.920349 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.928934 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.951842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.979840 5030 scope.go:117] "RemoveContainer" containerID="a419b0df1ac907972cfda436d9e2567302488fb7494ade3b518c011687d6c8f1" Jan 20 23:25:08 crc kubenswrapper[5030]: I0120 23:25:08.984587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.032789 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.071298 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.087776 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.099717 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.124054 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.136080 5030 scope.go:117] "RemoveContainer" containerID="229ac6ca1e152dd47107a09cb29f43d3a0fefdd1d5a01a30a06f366d1cb97e6c" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.140349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "702e90be-f218-4875-842f-4ab9128c07af" (UID: "702e90be-f218-4875-842f-4ab9128c07af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.144575 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.151090 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.160940 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.161507 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-qc7gv" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.162135 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.163182 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.185319 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.245237 5030 scope.go:117] "RemoveContainer" containerID="c93387d8e2c2a1f902f1c328944016965de643e24cca1bada095e1b5ea0c9dce" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.250871 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702e90be-f218-4875-842f-4ab9128c07af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.375862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.376454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv9mt\" (UniqueName: 
\"kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.376660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.378997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.386346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.386448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.386544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.386585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.464364 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490259 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv9mt\" (UniqueName: \"kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.490784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.492480 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.493120 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.493214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.493529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.493794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.537089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.537861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.549080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv9mt\" (UniqueName: \"kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.556314 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.559349 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.578165 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.578363 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.578529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.581330 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.601444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.653383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.693905 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfgds\" (UniqueName: \"kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.693981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.694194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.745180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.793231 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc 
kubenswrapper[5030]: I0120 23:25:09.799496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfgds\" (UniqueName: \"kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.799533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.802724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.802982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.811878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.812842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.815279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.829499 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="rabbitmq" containerID="cri-o://6bd004035b8c54e12e2ddf388a55c324758667583b1215f689bef97159ec1247" gracePeriod=604794 Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.833036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.834929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerStarted","Data":"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.834992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerStarted","Data":"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.840490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.850926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfgds\" (UniqueName: \"kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds\") pod \"ceilometer-0\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.880534 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=4.880514442 podStartE2EDuration="4.880514442s" podCreationTimestamp="2026-01-20 23:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:09.879812275 +0000 UTC m=+2982.200072563" watchObservedRunningTime="2026-01-20 23:25:09.880514442 +0000 UTC m=+2982.200774730" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.938205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"3a13994b-a11d-4798-9aae-b91a3d73d440","Type":"ContainerStarted","Data":"847a1cdf293cd94ff36a98ec31b6492efbc1f13b9fd3fea3d0454c780f44cc6f"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.938257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"3a13994b-a11d-4798-9aae-b91a3d73d440","Type":"ContainerStarted","Data":"bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.939025 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.951081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7467fbb5-44e9-42cf-874d-16fd1332885e","Type":"ContainerStarted","Data":"b4d25f2e1c4c075148c980fe0bbb316546c8543b18687fc473c810f3c38fa32a"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.990336 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.99031623 podStartE2EDuration="2.99031623s" podCreationTimestamp="2026-01-20 23:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:09.960393236 +0000 UTC m=+2982.280653534" watchObservedRunningTime="2026-01-20 23:25:09.99031623 +0000 UTC m=+2982.310576518" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.992537 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2bdbab30-d0e5-435f-8b5a-19b1bb9496c5" path="/var/lib/kubelet/pods/2bdbab30-d0e5-435f-8b5a-19b1bb9496c5/volumes" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.994103 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702e90be-f218-4875-842f-4ab9128c07af" path="/var/lib/kubelet/pods/702e90be-f218-4875-842f-4ab9128c07af/volumes" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.995840 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0ffc1d-cef9-4431-8f48-012c129e75b5" path="/var/lib/kubelet/pods/aa0ffc1d-cef9-4431-8f48-012c129e75b5/volumes" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.997090 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0213ca2-a56f-4a81-bfe7-814d2eeeb275" path="/var/lib/kubelet/pods/e0213ca2-a56f-4a81-bfe7-814d2eeeb275/volumes" Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.997777 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" event={"ID":"31020647-e3cc-44ab-a754-439b424e50cd","Type":"ContainerStarted","Data":"9502d696bd2796522f2685c308038851864d146377ed5acba22931664cdb58d8"} Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.997810 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:25:09 crc kubenswrapper[5030]: I0120 23:25:09.999439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerStarted","Data":"fa2232813d49b6cabe6b5dcbf6c407fdaa8033be9b122e13bf90968f56727a69"} Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.002431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb","Type":"ContainerStarted","Data":"94347de6b0123b260b344d05156ff2e06368833265f0cee2a5bd2b0f4d48cd5d"} Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.007322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" event={"ID":"3bc26a88-4c14-43cd-8255-cf5bb3e15931","Type":"ContainerStarted","Data":"3657580e710d34e6ab5e47b4e7f1bff1e8c70505ca73a3c36c83d6889bcae7b8"} Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.014115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" event={"ID":"d9865414-bdc0-4dd1-baee-d04985de4dd0","Type":"ContainerStarted","Data":"a74062eb7e3b260098a20f5b7d284abc8950cfd493a9c09773fcfc9d72a97d3c"} Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.032074 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" podStartSLOduration=6.03204965 podStartE2EDuration="6.03204965s" podCreationTimestamp="2026-01-20 23:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:10.025313967 +0000 UTC m=+2982.345574255" watchObservedRunningTime="2026-01-20 23:25:10.03204965 +0000 UTC m=+2982.352309938" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.115394 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.173270 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.173569 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.173606 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.174070 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.174115 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" gracePeriod=600 Jan 20 23:25:10 crc kubenswrapper[5030]: E0120 23:25:10.289815 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:10 crc kubenswrapper[5030]: E0120 23:25:10.293709 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:10 crc kubenswrapper[5030]: E0120 23:25:10.295118 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:25:10 crc kubenswrapper[5030]: E0120 23:25:10.295237 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" Jan 20 23:25:10 crc kubenswrapper[5030]: E0120 23:25:10.337701 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.483114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.721752 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="9e47ba81-d55b-48df-bce7-31220884279a" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.212:11211: i/o timeout" Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.726324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:10 crc kubenswrapper[5030]: W0120 23:25:10.733093 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a47f2f2_2d2d_4c48_b35f_966132c0f842.slice/crio-25d4d69db82e1bee348051f0c99fd1e15a0f40af8c07933cc2bda5c1a2ce0d80 WatchSource:0}: Error finding container 25d4d69db82e1bee348051f0c99fd1e15a0f40af8c07933cc2bda5c1a2ce0d80: Status 404 returned error can't find the container with id 25d4d69db82e1bee348051f0c99fd1e15a0f40af8c07933cc2bda5c1a2ce0d80 Jan 20 23:25:10 crc kubenswrapper[5030]: I0120 23:25:10.797325 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.023431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerStarted","Data":"7f07936da6b67937a65d8f198644e8a821f77c69a033105597f6a81f498d252f"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.024698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerStarted","Data":"25d4d69db82e1bee348051f0c99fd1e15a0f40af8c07933cc2bda5c1a2ce0d80"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.026943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb","Type":"ContainerStarted","Data":"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.027112 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.029144 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" exitCode=0 Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.029256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.029285 5030 scope.go:117] "RemoveContainer" 
containerID="1dc0601959289421c735475a4194e91b57087f23f408901601c681f937cf2114" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.030004 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:25:11 crc kubenswrapper[5030]: E0120 23:25:11.030287 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.035365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerStarted","Data":"a492320c925e809dce4d02a24f5e2910ff9256a4363f1e4a7c5bd831814a12db"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.035408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerStarted","Data":"da1429337d8d0557f1d287751d35aed815aa1a3fe7afc2c2f1417ba4fc849590"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.037926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerStarted","Data":"872d237eda2119aaa6d6a2d8792e96fe1f8b4dac40e197593e2387ae4ae6bf94"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.037951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerStarted","Data":"fafd5ba47410f6fedd6499c7232ec2d2d22b86d2d2aae3c9cc4c3d6841b8ea5f"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.049684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7467fbb5-44e9-42cf-874d-16fd1332885e","Type":"ContainerStarted","Data":"493a4b5159ec26b105e62e8cb2c2af22e9255bbb11250b3a8cbcc06e1bbd150a"} Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.053058 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=5.053042605 podStartE2EDuration="5.053042605s" podCreationTimestamp="2026-01-20 23:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:11.048098386 +0000 UTC m=+2983.368358674" watchObservedRunningTime="2026-01-20 23:25:11.053042605 +0000 UTC m=+2983.373302893" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.110592 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=3.621068229 podStartE2EDuration="4.110574478s" podCreationTimestamp="2026-01-20 23:25:07 +0000 UTC" firstStartedPulling="2026-01-20 23:25:09.087521936 +0000 UTC m=+2981.407782224" lastFinishedPulling="2026-01-20 23:25:09.577028195 +0000 UTC m=+2981.897288473" observedRunningTime="2026-01-20 23:25:11.102034261 +0000 UTC m=+2983.422294549" watchObservedRunningTime="2026-01-20 23:25:11.110574478 +0000 UTC m=+2983.430834766" 
Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.143054 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.143041225 podStartE2EDuration="3.143041225s" podCreationTimestamp="2026-01-20 23:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:11.140048801 +0000 UTC m=+2983.460309089" watchObservedRunningTime="2026-01-20 23:25:11.143041225 +0000 UTC m=+2983.463301513" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.795195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:11 crc kubenswrapper[5030]: I0120 23:25:11.848881 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.077152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerStarted","Data":"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226"} Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.200145 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.208:5671: connect: connection refused" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.260900 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.209:5671: connect: connection refused" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.319594 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.359424 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_faa5c453-046f-4c27-93cf-c291aaee541e/ovn-northd/0.log" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.359500 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.472138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.472277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt2km\" (UniqueName: \"kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.472300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.472379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.472934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config" (OuterVolumeSpecName: "config") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.473020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.473050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.473414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts" (OuterVolumeSpecName: "scripts") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.473494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.473601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs\") pod \"faa5c453-046f-4c27-93cf-c291aaee541e\" (UID: \"faa5c453-046f-4c27-93cf-c291aaee541e\") " Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.474107 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.474130 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/faa5c453-046f-4c27-93cf-c291aaee541e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.474142 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.479513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km" (OuterVolumeSpecName: "kube-api-access-nt2km") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "kube-api-access-nt2km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.498650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.553191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.553741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "faa5c453-046f-4c27-93cf-c291aaee541e" (UID: "faa5c453-046f-4c27-93cf-c291aaee541e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.560105 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.560170 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.563579 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e"} pod="openstack-kuttl-tests/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.563906 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" containerID="cri-o://914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e" gracePeriod=30 Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.576899 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt2km\" (UniqueName: \"kubernetes.io/projected/faa5c453-046f-4c27-93cf-c291aaee541e-kube-api-access-nt2km\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.576942 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.576955 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:12 crc kubenswrapper[5030]: I0120 23:25:12.576969 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa5c453-046f-4c27-93cf-c291aaee541e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.094121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerStarted","Data":"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.094183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerStarted","Data":"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.096094 5030 generic.go:334] "Generic (PLEG): container finished" podID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerID="a492320c925e809dce4d02a24f5e2910ff9256a4363f1e4a7c5bd831814a12db" exitCode=0 Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.096160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerDied","Data":"a492320c925e809dce4d02a24f5e2910ff9256a4363f1e4a7c5bd831814a12db"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099337 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_faa5c453-046f-4c27-93cf-c291aaee541e/ovn-northd/0.log" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099367 5030 generic.go:334] "Generic (PLEG): container finished" podID="faa5c453-046f-4c27-93cf-c291aaee541e" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" exitCode=139 Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099403 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerDied","Data":"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"faa5c453-046f-4c27-93cf-c291aaee541e","Type":"ContainerDied","Data":"58e9783cfdb470224161412f6131249bf2ab3292c9ce9b4db52d04fdf42b60a7"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099438 5030 scope.go:117] "RemoveContainer" containerID="b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.099516 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.103089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerDied","Data":"872d237eda2119aaa6d6a2d8792e96fe1f8b4dac40e197593e2387ae4ae6bf94"} Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.103067 5030 generic.go:334] "Generic (PLEG): container finished" podID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerID="872d237eda2119aaa6d6a2d8792e96fe1f8b4dac40e197593e2387ae4ae6bf94" exitCode=0 Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.142977 5030 scope.go:117] "RemoveContainer" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.158181 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.225881 5030 scope.go:117] "RemoveContainer" containerID="b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4" Jan 20 23:25:13 crc kubenswrapper[5030]: E0120 23:25:13.234031 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4\": container with ID starting with b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4 not found: ID does not exist" containerID="b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.234068 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4"} err="failed to get container status \"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4\": rpc error: code = NotFound desc 
= could not find container \"b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4\": container with ID starting with b65351c43ee430bba2d10de676fa2218e795346d4c6fb18b21ab0f22049ea7a4 not found: ID does not exist" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.234092 5030 scope.go:117] "RemoveContainer" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.235722 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:13 crc kubenswrapper[5030]: E0120 23:25:13.236865 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4\": container with ID starting with 1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4 not found: ID does not exist" containerID="1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.236909 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4"} err="failed to get container status \"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4\": rpc error: code = NotFound desc = could not find container \"1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4\": container with ID starting with 1e123810407dbc28041d5a270441251a62b0c9df1d6352e3e1587152d31680a4 not found: ID does not exist" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.259679 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.273991 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:13 crc kubenswrapper[5030]: E0120 23:25:13.274518 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="openstack-network-exporter" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.274558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="openstack-network-exporter" Jan 20 23:25:13 crc kubenswrapper[5030]: E0120 23:25:13.274587 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.274593 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.274913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="openstack-network-exporter" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.274961 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" containerName="ovn-northd" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.276161 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.278511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-xksnq" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.278591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.278874 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.279704 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.308095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.319989 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.369449 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.404879 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95fbd\" (UniqueName: \"kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd\") pod \"ovn-northd-0\" (UID: 
\"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.405416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.507792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95fbd\" (UniqueName: \"kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.508875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.509114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.509729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.511666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.511966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.512799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.529534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95fbd\" (UniqueName: \"kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd\") pod \"ovn-northd-0\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.614877 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.921589 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:13 crc kubenswrapper[5030]: I0120 23:25:13.972041 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa5c453-046f-4c27-93cf-c291aaee541e" path="/var/lib/kubelet/pods/faa5c453-046f-4c27-93cf-c291aaee541e/volumes" Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.079927 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.117717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerStarted","Data":"7cd441aae70f6c395bd5a42b94d16ef7f6f319a8b61584ad5fc4661b28a3ff4d"} Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.123409 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9865414-bdc0-4dd1-baee-d04985de4dd0" containerID="a74062eb7e3b260098a20f5b7d284abc8950cfd493a9c09773fcfc9d72a97d3c" exitCode=0 Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.123494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" event={"ID":"d9865414-bdc0-4dd1-baee-d04985de4dd0","Type":"ContainerDied","Data":"a74062eb7e3b260098a20f5b7d284abc8950cfd493a9c09773fcfc9d72a97d3c"} Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.127080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerStarted","Data":"489bdd58f58a414c8771258a9930b2d1895870bb46f3f6c283d07c1077dd05a2"} Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.130050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerStarted","Data":"5e702a9647c237b7ea176fced7e9b9708645df2df43294f18dad118f506f7fee"} Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.171874 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.171846164 podStartE2EDuration="5.171846164s" podCreationTimestamp="2026-01-20 23:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:14.152463234 +0000 UTC m=+2986.472723522" watchObservedRunningTime="2026-01-20 23:25:14.171846164 +0000 UTC m=+2986.492106492" Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.186830 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.186803725 podStartE2EDuration="6.186803725s" podCreationTimestamp="2026-01-20 23:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:14.176335562 +0000 UTC m=+2986.496595850" watchObservedRunningTime="2026-01-20 23:25:14.186803725 +0000 UTC m=+2986.507064013" Jan 20 23:25:14 crc kubenswrapper[5030]: I0120 23:25:14.209408 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:25:15 crc 
kubenswrapper[5030]: I0120 23:25:15.145632 5030 generic.go:334] "Generic (PLEG): container finished" podID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerID="697fa9aa2472880601d53ece29e19b607f5f96fff9acf555dbdca856864c6ede" exitCode=0 Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.145674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerDied","Data":"697fa9aa2472880601d53ece29e19b607f5f96fff9acf555dbdca856864c6ede"} Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.152201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerStarted","Data":"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99"} Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.152538 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.155962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerStarted","Data":"5f43545399090228177f04523ac5a317a6277e22050988ab543b21064445e7de"} Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.156002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerStarted","Data":"352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4"} Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.183995 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.12216942 podStartE2EDuration="6.183977725s" podCreationTimestamp="2026-01-20 23:25:09 +0000 UTC" firstStartedPulling="2026-01-20 23:25:10.73553989 +0000 UTC m=+2983.055800178" lastFinishedPulling="2026-01-20 23:25:14.797348185 +0000 UTC m=+2987.117608483" observedRunningTime="2026-01-20 23:25:15.182638282 +0000 UTC m=+2987.502898590" watchObservedRunningTime="2026-01-20 23:25:15.183977725 +0000 UTC m=+2987.504238013" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.217214 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.217197198 podStartE2EDuration="2.217197198s" podCreationTimestamp="2026-01-20 23:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:15.210760943 +0000 UTC m=+2987.531021231" watchObservedRunningTime="2026-01-20 23:25:15.217197198 +0000 UTC m=+2987.537457476" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.269685 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.342902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74cm9\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343106 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret\") pod 
\"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.343368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"562410c1-4193-4bdb-84a5-32bd001f29b7\" (UID: \"562410c1-4193-4bdb-84a5-32bd001f29b7\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.344883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.345906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.346336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.348852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.355756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info" (OuterVolumeSpecName: "pod-info") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.369583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.369782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9" (OuterVolumeSpecName: "kube-api-access-74cm9") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "kube-api-access-74cm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.369893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.389554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data" (OuterVolumeSpecName: "config-data") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.445889 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74cm9\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-kube-api-access-74cm9\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446103 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446187 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/562410c1-4193-4bdb-84a5-32bd001f29b7-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446243 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446294 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446356 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446415 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446503 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/562410c1-4193-4bdb-84a5-32bd001f29b7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.446574 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.529971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf" 
(OuterVolumeSpecName: "server-conf") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.537956 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.559961 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.560003 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/562410c1-4193-4bdb-84a5-32bd001f29b7-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.596907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.680104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "562410c1-4193-4bdb-84a5-32bd001f29b7" (UID: "562410c1-4193-4bdb-84a5-32bd001f29b7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f6fc\" (UniqueName: \"kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.762956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift\") pod \"d9865414-bdc0-4dd1-baee-d04985de4dd0\" (UID: \"d9865414-bdc0-4dd1-baee-d04985de4dd0\") " Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.763365 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/562410c1-4193-4bdb-84a5-32bd001f29b7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.764186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.764401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.768547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc" (OuterVolumeSpecName: "kube-api-access-2f6fc") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "kube-api-access-2f6fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.787861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.788277 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.790557 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts" (OuterVolumeSpecName: "scripts") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.791744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d9865414-bdc0-4dd1-baee-d04985de4dd0" (UID: "d9865414-bdc0-4dd1-baee-d04985de4dd0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865428 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865472 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865502 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865512 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9865414-bdc0-4dd1-baee-d04985de4dd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865523 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9865414-bdc0-4dd1-baee-d04985de4dd0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865532 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d9865414-bdc0-4dd1-baee-d04985de4dd0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:15 crc kubenswrapper[5030]: I0120 23:25:15.865543 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f6fc\" (UniqueName: \"kubernetes.io/projected/d9865414-bdc0-4dd1-baee-d04985de4dd0-kube-api-access-2f6fc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.172134 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.172112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-t9frk" event={"ID":"d9865414-bdc0-4dd1-baee-d04985de4dd0","Type":"ContainerDied","Data":"499cc1f0a31d5544b98226ce482c0fd77880d2910ee01a1c3f5bbfccfa9db9a6"} Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.172295 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499cc1f0a31d5544b98226ce482c0fd77880d2910ee01a1c3f5bbfccfa9db9a6" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.174768 5030 generic.go:334] "Generic (PLEG): container finished" podID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerID="6bd004035b8c54e12e2ddf388a55c324758667583b1215f689bef97159ec1247" exitCode=0 Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.174834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerDied","Data":"6bd004035b8c54e12e2ddf388a55c324758667583b1215f689bef97159ec1247"} Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.180249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"562410c1-4193-4bdb-84a5-32bd001f29b7","Type":"ContainerDied","Data":"10024960712dd50695d0c5e50d61676789c1dd0b8edb6b5beaf2bef0e0994063"} Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.180310 5030 scope.go:117] "RemoveContainer" containerID="697fa9aa2472880601d53ece29e19b607f5f96fff9acf555dbdca856864c6ede" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.180527 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.181663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.209812 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.230040 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.242638 5030 scope.go:117] "RemoveContainer" containerID="bc2be2e614785a83e92f8382eb54b760f63601c5e35ecfb23e625e29c1a1fe79" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.261493 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:16 crc kubenswrapper[5030]: E0120 23:25:16.262010 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9865414-bdc0-4dd1-baee-d04985de4dd0" containerName="swift-ring-rebalance" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.262026 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9865414-bdc0-4dd1-baee-d04985de4dd0" containerName="swift-ring-rebalance" Jan 20 23:25:16 crc kubenswrapper[5030]: E0120 23:25:16.262036 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="setup-container" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.262044 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="setup-container" Jan 20 23:25:16 crc kubenswrapper[5030]: E0120 23:25:16.262066 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="rabbitmq" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.262074 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="rabbitmq" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.262301 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" containerName="rabbitmq" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.262333 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9865414-bdc0-4dd1-baee-d04985de4dd0" containerName="swift-ring-rebalance" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.263532 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.267978 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.267997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.268128 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.268220 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-hdchh" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.268347 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.268447 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.268550 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.283642 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.377863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.377902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.377964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.377995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378065 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378156 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xch\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378374 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.378597 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.479791 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480218 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.480967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481052 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xch\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.481581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.482558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.486155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.486601 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.488591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.488963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.497729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xch\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.519011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.578061 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.605060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkqkh\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: 
\"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.687680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info\") pod \"640d9b89-a25b-43d4-bf6b-361b56ad9381\" (UID: \"640d9b89-a25b-43d4-bf6b-361b56ad9381\") " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.689485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.691083 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.692074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh" (OuterVolumeSpecName: "kube-api-access-jkqkh") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "kube-api-access-jkqkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.692497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info" (OuterVolumeSpecName: "pod-info") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.693200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.698373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.699430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.700838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.732346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data" (OuterVolumeSpecName: "config-data") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.753124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf" (OuterVolumeSpecName: "server-conf") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790458 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790527 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790537 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/640d9b89-a25b-43d4-bf6b-361b56ad9381-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790547 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790555 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790564 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790573 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790581 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/640d9b89-a25b-43d4-bf6b-361b56ad9381-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790591 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/640d9b89-a25b-43d4-bf6b-361b56ad9381-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.790602 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkqkh\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-kube-api-access-jkqkh\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.811730 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.817316 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "640d9b89-a25b-43d4-bf6b-361b56ad9381" (UID: "640d9b89-a25b-43d4-bf6b-361b56ad9381"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.892287 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:16 crc kubenswrapper[5030]: I0120 23:25:16.892316 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/640d9b89-a25b-43d4-bf6b-361b56ad9381-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.082123 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:25:17 crc kubenswrapper[5030]: W0120 23:25:17.082873 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd756a967_e1dd_4c74_8bff_cd1fab21b251.slice/crio-f7bdd4ff06612594fb30a7ef26696dafe543525cbc7978427d22fa3d8358ca09 WatchSource:0}: Error finding container f7bdd4ff06612594fb30a7ef26696dafe543525cbc7978427d22fa3d8358ca09: Status 404 returned error can't find the container with id f7bdd4ff06612594fb30a7ef26696dafe543525cbc7978427d22fa3d8358ca09 Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.195500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.195538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"640d9b89-a25b-43d4-bf6b-361b56ad9381","Type":"ContainerDied","Data":"1b51c66d93f05b18faeea3741f8c345ead3b42511cad65268bf2fae5f1d6dddd"} Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.195904 5030 scope.go:117] "RemoveContainer" containerID="6bd004035b8c54e12e2ddf388a55c324758667583b1215f689bef97159ec1247" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.198179 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerStarted","Data":"f7bdd4ff06612594fb30a7ef26696dafe543525cbc7978427d22fa3d8358ca09"} Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.217606 5030 scope.go:117] "RemoveContainer" containerID="7c7e9190938d40b95c597cf3c50d0385e343102d0e6ab1a4234a94fd89a2f688" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.247589 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.258934 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.287156 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:17 crc kubenswrapper[5030]: E0120 23:25:17.287609 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="setup-container" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.287639 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="setup-container" Jan 20 23:25:17 crc kubenswrapper[5030]: E0120 23:25:17.287662 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" 
containerName="rabbitmq" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.287669 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="rabbitmq" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.287857 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" containerName="rabbitmq" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.289183 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.291765 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.292046 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.292571 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.293161 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-gh8gp" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.293425 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.294069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.294403 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.310265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.401864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.401923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.401958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzfw\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.402254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503628 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503666 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzfw\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.503928 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.504300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.504472 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.511379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.511823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.512819 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.513757 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.515532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.517986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.523366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.524514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.537734 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzfw\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.545349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.626449 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.978279 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562410c1-4193-4bdb-84a5-32bd001f29b7" path="/var/lib/kubelet/pods/562410c1-4193-4bdb-84a5-32bd001f29b7/volumes" Jan 20 23:25:17 crc kubenswrapper[5030]: I0120 23:25:17.979505 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640d9b89-a25b-43d4-bf6b-361b56ad9381" path="/var/lib/kubelet/pods/640d9b89-a25b-43d4-bf6b-361b56ad9381/volumes" Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.101397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:25:18 crc kubenswrapper[5030]: W0120 23:25:18.146965 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb76fff8_05c4_4ec5_b00b_17dfd94e5dea.slice/crio-85e036d5c8b7cd237180c3996382ff3a3fd4c46e036d082f84bf307b06c9a89d WatchSource:0}: Error finding container 85e036d5c8b7cd237180c3996382ff3a3fd4c46e036d082f84bf307b06c9a89d: Status 404 returned error can't find the container with id 85e036d5c8b7cd237180c3996382ff3a3fd4c46e036d082f84bf307b06c9a89d Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.210395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerStarted","Data":"85e036d5c8b7cd237180c3996382ff3a3fd4c46e036d082f84bf307b06c9a89d"} Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.241802 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.261870 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.922363 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:18 crc kubenswrapper[5030]: I0120 23:25:18.953017 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.072103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.072499 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.155375 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.223648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerStarted","Data":"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a"} Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.245867 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.352316 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.794436 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:19 crc kubenswrapper[5030]: I0120 23:25:19.794480 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:20 crc kubenswrapper[5030]: I0120 23:25:20.004057 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:20 crc kubenswrapper[5030]: I0120 23:25:20.329698 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:25:21 crc kubenswrapper[5030]: I0120 23:25:21.246987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerStarted","Data":"9940bbb038d38a74af92d31f66d01f141a002adaecb98d66837c57666c0f664f"} Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.257977 5030 generic.go:334] "Generic (PLEG): container finished" podID="3cb2c9c5-64c5-4d21-b370-4fb3718220c5" containerID="41f49fd7ec11dd9a9a557c1bb592ba6c7982e9eb0f90238a36c0c561df24cec1" exitCode=0 Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.258119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" event={"ID":"3cb2c9c5-64c5-4d21-b370-4fb3718220c5","Type":"ContainerDied","Data":"41f49fd7ec11dd9a9a557c1bb592ba6c7982e9eb0f90238a36c0c561df24cec1"} Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.262446 5030 generic.go:334] "Generic (PLEG): container finished" podID="31020647-e3cc-44ab-a754-439b424e50cd" containerID="9502d696bd2796522f2685c308038851864d146377ed5acba22931664cdb58d8" exitCode=0 Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.262669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/cinder-db-sync-fs64c" event={"ID":"31020647-e3cc-44ab-a754-439b424e50cd","Type":"ContainerDied","Data":"9502d696bd2796522f2685c308038851864d146377ed5acba22931664cdb58d8"} Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.265131 5030 generic.go:334] "Generic (PLEG): container finished" podID="27233b7c-c856-4a41-a674-7399905591de" containerID="616bdc366eb8dd795e7fbeb8202645472cb589464fac89005bdee7e84a582a16" exitCode=0 Jan 20 23:25:22 crc kubenswrapper[5030]: I0120 23:25:22.265233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" event={"ID":"27233b7c-c856-4a41-a674-7399905591de","Type":"ContainerDied","Data":"616bdc366eb8dd795e7fbeb8202645472cb589464fac89005bdee7e84a582a16"} Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.018344 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.018702 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-central-agent" containerID="cri-o://ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" gracePeriod=30 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.019139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="proxy-httpd" containerID="cri-o://d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" gracePeriod=30 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.019197 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="sg-core" containerID="cri-o://f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" gracePeriod=30 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.019237 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-notification-agent" containerID="cri-o://353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" gracePeriod=30 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.283643 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerID="d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" exitCode=0 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.283950 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerID="f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" exitCode=2 Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.283723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerDied","Data":"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99"} Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.284005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerDied","Data":"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875"} Jan 20 23:25:23 crc 
kubenswrapper[5030]: I0120 23:25:23.647104 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.741631 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.788321 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data\") pod \"27233b7c-c856-4a41-a674-7399905591de\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts\") pod \"27233b7c-c856-4a41-a674-7399905591de\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle\") pod \"27233b7c-c856-4a41-a674-7399905591de\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hrrp\" 
(UniqueName: \"kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp\") pod \"27233b7c-c856-4a41-a674-7399905591de\" (UID: \"27233b7c-c856-4a41-a674-7399905591de\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.838694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxx9b\" (UniqueName: \"kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b\") pod \"31020647-e3cc-44ab-a754-439b424e50cd\" (UID: \"31020647-e3cc-44ab-a754-439b424e50cd\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.839491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.848001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts" (OuterVolumeSpecName: "scripts") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.856849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.860167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp" (OuterVolumeSpecName: "kube-api-access-2hrrp") pod "27233b7c-c856-4a41-a674-7399905591de" (UID: "27233b7c-c856-4a41-a674-7399905591de"). InnerVolumeSpecName "kube-api-access-2hrrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.865052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b" (OuterVolumeSpecName: "kube-api-access-cxx9b") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "kube-api-access-cxx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.865095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts" (OuterVolumeSpecName: "scripts") pod "27233b7c-c856-4a41-a674-7399905591de" (UID: "27233b7c-c856-4a41-a674-7399905591de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.890113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data" (OuterVolumeSpecName: "config-data") pod "27233b7c-c856-4a41-a674-7399905591de" (UID: "27233b7c-c856-4a41-a674-7399905591de"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.898367 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.898813 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.902198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27233b7c-c856-4a41-a674-7399905591de" (UID: "27233b7c-c856-4a41-a674-7399905591de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.930838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data" (OuterVolumeSpecName: "config-data") pod "31020647-e3cc-44ab-a754-439b424e50cd" (UID: "31020647-e3cc-44ab-a754-439b424e50cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.940701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle\") pod \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.940893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config\") pod \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.940950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr\") pod \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\" (UID: \"3cb2c9c5-64c5-4d21-b370-4fb3718220c5\") " Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941358 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941375 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941386 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941395 5030 reconciler_common.go:293] "Volume detached for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941405 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941414 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31020647-e3cc-44ab-a754-439b424e50cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941422 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hrrp\" (UniqueName: \"kubernetes.io/projected/27233b7c-c856-4a41-a674-7399905591de-kube-api-access-2hrrp\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941431 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxx9b\" (UniqueName: \"kubernetes.io/projected/31020647-e3cc-44ab-a754-439b424e50cd-kube-api-access-cxx9b\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941439 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27233b7c-c856-4a41-a674-7399905591de-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.941448 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31020647-e3cc-44ab-a754-439b424e50cd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.946823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr" (OuterVolumeSpecName: "kube-api-access-fv4cr") pod "3cb2c9c5-64c5-4d21-b370-4fb3718220c5" (UID: "3cb2c9c5-64c5-4d21-b370-4fb3718220c5"). InnerVolumeSpecName "kube-api-access-fv4cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.976731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config" (OuterVolumeSpecName: "config") pod "3cb2c9c5-64c5-4d21-b370-4fb3718220c5" (UID: "3cb2c9c5-64c5-4d21-b370-4fb3718220c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:23 crc kubenswrapper[5030]: I0120 23:25:23.985596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cb2c9c5-64c5-4d21-b370-4fb3718220c5" (UID: "3cb2c9c5-64c5-4d21-b370-4fb3718220c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042359 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfgds\" (UniqueName: \"kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.042779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml\") pod \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\" (UID: \"8a47f2f2-2d2d-4c48-b35f-966132c0f842\") " Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043319 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043700 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043805 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043827 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043836 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a47f2f2-2d2d-4c48-b35f-966132c0f842-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.043848 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv4cr\" (UniqueName: \"kubernetes.io/projected/3cb2c9c5-64c5-4d21-b370-4fb3718220c5-kube-api-access-fv4cr\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.049747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts" (OuterVolumeSpecName: "scripts") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.049761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds" (OuterVolumeSpecName: "kube-api-access-cfgds") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "kube-api-access-cfgds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.079692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.091051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.122375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.134144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data" (OuterVolumeSpecName: "config-data") pod "8a47f2f2-2d2d-4c48-b35f-966132c0f842" (UID: "8a47f2f2-2d2d-4c48-b35f-966132c0f842"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.145937 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.145978 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.145990 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.146002 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.146010 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfgds\" (UniqueName: \"kubernetes.io/projected/8a47f2f2-2d2d-4c48-b35f-966132c0f842-kube-api-access-cfgds\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.146022 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a47f2f2-2d2d-4c48-b35f-966132c0f842-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.293496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" event={"ID":"3cb2c9c5-64c5-4d21-b370-4fb3718220c5","Type":"ContainerDied","Data":"f6359ed0b74fc12aa33ee4212fc0ea850cc6f282ea26ce2b047001c64e9c3f8e"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.293530 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6359ed0b74fc12aa33ee4212fc0ea850cc6f282ea26ce2b047001c64e9c3f8e" Jan 20 23:25:24 crc 
kubenswrapper[5030]: I0120 23:25:24.293537 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-tgzmq" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.295074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" event={"ID":"31020647-e3cc-44ab-a754-439b424e50cd","Type":"ContainerDied","Data":"fed6dc3be2d915fcc3b39523184f5ef9745dab7ab7b1d39074cb7735879d32a6"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.295094 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed6dc3be2d915fcc3b39523184f5ef9745dab7ab7b1d39074cb7735879d32a6" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.295147 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-fs64c" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.300964 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerID="353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" exitCode=0 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.300989 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerID="ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" exitCode=0 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.301025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerDied","Data":"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.301052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerDied","Data":"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.301063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8a47f2f2-2d2d-4c48-b35f-966132c0f842","Type":"ContainerDied","Data":"25d4d69db82e1bee348051f0c99fd1e15a0f40af8c07933cc2bda5c1a2ce0d80"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.301079 5030 scope.go:117] "RemoveContainer" containerID="d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.301205 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.313988 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.313988 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq" event={"ID":"27233b7c-c856-4a41-a674-7399905591de","Type":"ContainerDied","Data":"680a753b3a2fc91ad4ba12b423c86df8b9406533da21fafde32fa4e13dd16387"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.314468 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="680a753b3a2fc91ad4ba12b423c86df8b9406533da21fafde32fa4e13dd16387" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.316717 5030 generic.go:334] "Generic (PLEG): container finished" podID="3358fbb1-638c-4f20-804e-b672441288e4" containerID="5c3963eec2dd418b836709714d9c6ad213a8435b737aa8a3604f4191d504f0a0" exitCode=0 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.316754 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" event={"ID":"3358fbb1-638c-4f20-804e-b672441288e4","Type":"ContainerDied","Data":"5c3963eec2dd418b836709714d9c6ad213a8435b737aa8a3604f4191d504f0a0"} Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.352508 5030 scope.go:117] "RemoveContainer" containerID="f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.379596 5030 scope.go:117] "RemoveContainer" containerID="353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.386172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.393495 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.404551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.408588 5030 scope.go:117] "RemoveContainer" containerID="ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.411677 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" containerID="cri-o://62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" gracePeriod=30 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413145 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413533 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27233b7c-c856-4a41-a674-7399905591de" containerName="nova-cell1-conductor-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413547 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27233b7c-c856-4a41-a674-7399905591de" containerName="nova-cell1-conductor-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413562 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb2c9c5-64c5-4d21-b370-4fb3718220c5" containerName="neutron-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413568 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3cb2c9c5-64c5-4d21-b370-4fb3718220c5" containerName="neutron-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413593 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-notification-agent" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413599 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-notification-agent" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413613 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31020647-e3cc-44ab-a754-439b424e50cd" containerName="cinder-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413719 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="31020647-e3cc-44ab-a754-439b424e50cd" containerName="cinder-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-central-agent" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413738 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-central-agent" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413751 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="sg-core" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413757 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="sg-core" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.413768 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="proxy-httpd" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413774 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="proxy-httpd" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413931 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="proxy-httpd" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413950 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="sg-core" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="31020647-e3cc-44ab-a754-439b424e50cd" containerName="cinder-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb2c9c5-64c5-4d21-b370-4fb3718220c5" containerName="neutron-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413984 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27233b7c-c856-4a41-a674-7399905591de" containerName="nova-cell1-conductor-db-sync" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.413995 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-notification-agent" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.414005 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" containerName="ceilometer-central-agent" Jan 20 23:25:24 crc 
kubenswrapper[5030]: I0120 23:25:24.415569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.417899 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.417941 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.418170 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.437315 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.455518 5030 scope.go:117] "RemoveContainer" containerID="d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.455966 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99\": container with ID starting with d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99 not found: ID does not exist" containerID="d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.455995 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99"} err="failed to get container status \"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99\": rpc error: code = NotFound desc = could not find container \"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99\": container with ID starting with d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.456015 5030 scope.go:117] "RemoveContainer" containerID="f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.456213 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875\": container with ID starting with f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875 not found: ID does not exist" containerID="f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.456232 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875"} err="failed to get container status \"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875\": rpc error: code = NotFound desc = could not find container \"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875\": container with ID starting with f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.456243 5030 scope.go:117] "RemoveContainer" containerID="353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" Jan 20 23:25:24 crc kubenswrapper[5030]: 
E0120 23:25:24.457064 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c\": container with ID starting with 353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c not found: ID does not exist" containerID="353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457108 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c"} err="failed to get container status \"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c\": rpc error: code = NotFound desc = could not find container \"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c\": container with ID starting with 353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457136 5030 scope.go:117] "RemoveContainer" containerID="ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" Jan 20 23:25:24 crc kubenswrapper[5030]: E0120 23:25:24.457404 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226\": container with ID starting with ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226 not found: ID does not exist" containerID="ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226"} err="failed to get container status \"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226\": rpc error: code = NotFound desc = could not find container \"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226\": container with ID starting with ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457491 5030 scope.go:117] "RemoveContainer" containerID="d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457770 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99"} err="failed to get container status \"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99\": rpc error: code = NotFound desc = could not find container \"d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99\": container with ID starting with d4413354649b435f4fd34b58dbbdc0e52fc50674db357316e86952e327b89b99 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.457789 5030 scope.go:117] "RemoveContainer" containerID="f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.458089 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875"} err="failed to get container status \"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875\": rpc error: 
code = NotFound desc = could not find container \"f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875\": container with ID starting with f40316f6e23ab9c343b0faa34ef742ed4616cfde71fa9c45bcbfbd3bd9c8d875 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.458122 5030 scope.go:117] "RemoveContainer" containerID="353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.458296 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c"} err="failed to get container status \"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c\": rpc error: code = NotFound desc = could not find container \"353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c\": container with ID starting with 353cb23db8c56d1c98e5a38b81465c725c4ccf3ec7cd1dd532a16876bd95799c not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.458314 5030 scope.go:117] "RemoveContainer" containerID="ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.458610 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226"} err="failed to get container status \"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226\": rpc error: code = NotFound desc = could not find container \"ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226\": container with ID starting with ba8c4c5551647c2c67b248e19183dd340f592594b79f1038512473a2c9479226 not found: ID does not exist" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.487383 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.515085 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.525479 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.525588 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.541542 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.541758 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api-log" containerID="cri-o://4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474" gracePeriod=30 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.541880 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api" containerID="cri-o://1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791" gracePeriod=30 Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf8g4\" (UniqueName: \"kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.554980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.656903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.656959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf8g4\" (UniqueName: \"kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6q6\" (UniqueName: \"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.657309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.658234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.658424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.662602 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.663457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.663975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.664540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.666324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.674016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf8g4\" (UniqueName: \"kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4\") pod \"ceilometer-0\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.749778 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.758956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6q6\" (UniqueName: \"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.759158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.762884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.764831 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " 
pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.765646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.766462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.767070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.772857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.780471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6q6\" (UniqueName: \"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6\") pod \"neutron-7bd69c474b-2t4sm\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:24 crc kubenswrapper[5030]: I0120 23:25:24.844198 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.185731 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:25:25 crc kubenswrapper[5030]: W0120 23:25:25.186234 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2840a28_0f62_4378_bbd0_a07833eb25c7.slice/crio-4598b0245916d45a6b4a1919da72ecb6246f72eb1ec695ac241706e0cd2d3cfd WatchSource:0}: Error finding container 4598b0245916d45a6b4a1919da72ecb6246f72eb1ec695ac241706e0cd2d3cfd: Status 404 returned error can't find the container with id 4598b0245916d45a6b4a1919da72ecb6246f72eb1ec695ac241706e0cd2d3cfd Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.357365 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.399356 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerID="4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474" exitCode=143 Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.399446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerDied","Data":"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474"} Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.412161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerStarted","Data":"4598b0245916d45a6b4a1919da72ecb6246f72eb1ec695ac241706e0cd2d3cfd"} Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.750613 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.889798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle\") pod \"3358fbb1-638c-4f20-804e-b672441288e4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.889987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data\") pod \"3358fbb1-638c-4f20-804e-b672441288e4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.890046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxdxw\" (UniqueName: \"kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw\") pod \"3358fbb1-638c-4f20-804e-b672441288e4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.890074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts\") pod \"3358fbb1-638c-4f20-804e-b672441288e4\" (UID: \"3358fbb1-638c-4f20-804e-b672441288e4\") " Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.894685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw" (OuterVolumeSpecName: "kube-api-access-fxdxw") pod "3358fbb1-638c-4f20-804e-b672441288e4" (UID: "3358fbb1-638c-4f20-804e-b672441288e4"). InnerVolumeSpecName "kube-api-access-fxdxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.895703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts" (OuterVolumeSpecName: "scripts") pod "3358fbb1-638c-4f20-804e-b672441288e4" (UID: "3358fbb1-638c-4f20-804e-b672441288e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.917582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data" (OuterVolumeSpecName: "config-data") pod "3358fbb1-638c-4f20-804e-b672441288e4" (UID: "3358fbb1-638c-4f20-804e-b672441288e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.933180 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3358fbb1-638c-4f20-804e-b672441288e4" (UID: "3358fbb1-638c-4f20-804e-b672441288e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.962115 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:25:25 crc kubenswrapper[5030]: E0120 23:25:25.962355 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.971828 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a47f2f2-2d2d-4c48-b35f-966132c0f842" path="/var/lib/kubelet/pods/8a47f2f2-2d2d-4c48-b35f-966132c0f842/volumes" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.993949 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.993990 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.994006 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxdxw\" (UniqueName: \"kubernetes.io/projected/3358fbb1-638c-4f20-804e-b672441288e4-kube-api-access-fxdxw\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:25 crc kubenswrapper[5030]: I0120 23:25:25.994020 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3358fbb1-638c-4f20-804e-b672441288e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.445414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerStarted","Data":"73d55bbb51e734b228c4354b255f4da8a22be37140d690a89e16210c155c5407"} Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.445499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerStarted","Data":"b5a3121b46f09ab54fc42a4e60030abec28c4ebcaabff01ecef271e79eda8bce"} Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.445567 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.445584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerStarted","Data":"7c4d557daa2ddc003d00ec3c03074a00cac4f35a5527a4697c964d1d28f8f0cd"} Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.448920 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.449434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4" event={"ID":"3358fbb1-638c-4f20-804e-b672441288e4","Type":"ContainerDied","Data":"026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714"} Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.449534 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026c9397190e63d942e81db9fd41b91996d94d9e63c1454235faa6f7aa7b2714" Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.451147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerStarted","Data":"3fa1f8af90d752f9f412358e748ffa6b26b530af7abe3410789eb3720481f96a"} Jan 20 23:25:26 crc kubenswrapper[5030]: I0120 23:25:26.476452 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" podStartSLOduration=2.476433493 podStartE2EDuration="2.476433493s" podCreationTimestamp="2026-01-20 23:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:26.470760456 +0000 UTC m=+2998.791020914" watchObservedRunningTime="2026-01-20 23:25:26.476433493 +0000 UTC m=+2998.796693781" Jan 20 23:25:26 crc kubenswrapper[5030]: E0120 23:25:26.534833 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:26 crc kubenswrapper[5030]: E0120 23:25:26.537767 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:26 crc kubenswrapper[5030]: E0120 23:25:26.541946 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:26 crc kubenswrapper[5030]: E0120 23:25:26.542019 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" Jan 20 23:25:27 crc kubenswrapper[5030]: I0120 23:25:27.283877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:27 crc kubenswrapper[5030]: I0120 23:25:27.284434 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" 
containerName="nova-cell0-conductor-conductor" containerID="cri-o://e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" gracePeriod=30 Jan 20 23:25:27 crc kubenswrapper[5030]: I0120 23:25:27.465987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerStarted","Data":"bfd7a2900f29501d274a01d822ba9fcab833dd0ab9306f38cbde8afbe0832f0e"} Jan 20 23:25:27 crc kubenswrapper[5030]: I0120 23:25:27.722070 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.0:8776/healthcheck\": read tcp 10.217.0.2:49420->10.217.1.0:8776: read: connection reset by peer" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.139852 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.343892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 
23:25:28.344469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zrmx\" (UniqueName: \"kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.344495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts\") pod \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\" (UID: \"d9890aef-9ec6-4ee3-a4f5-3369f857fe91\") " Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.345766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs" (OuterVolumeSpecName: "logs") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.345826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.352554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx" (OuterVolumeSpecName: "kube-api-access-6zrmx") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "kube-api-access-6zrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.353754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts" (OuterVolumeSpecName: "scripts") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.363295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.376137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.394498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.402665 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data" (OuterVolumeSpecName: "config-data") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.405510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d9890aef-9ec6-4ee3-a4f5-3369f857fe91" (UID: "d9890aef-9ec6-4ee3-a4f5-3369f857fe91"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447104 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447358 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447418 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447472 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447523 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447574 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447771 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.447852 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zrmx\" (UniqueName: \"kubernetes.io/projected/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-kube-api-access-6zrmx\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc 
kubenswrapper[5030]: I0120 23:25:28.447910 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9890aef-9ec6-4ee3-a4f5-3369f857fe91-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.488681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerStarted","Data":"115ce5a1a1b811f563c605b90097f6b0f7096471f6c5040cf6697c80d762190f"} Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.491977 5030 generic.go:334] "Generic (PLEG): container finished" podID="43181c01-0cec-4ad9-b037-880357999794" containerID="aebde48ffd7b79ddf812bbd9fe02d63be7d61af3719098b682a1f12190844949" exitCode=0 Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.492074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" event={"ID":"43181c01-0cec-4ad9-b037-880357999794","Type":"ContainerDied","Data":"aebde48ffd7b79ddf812bbd9fe02d63be7d61af3719098b682a1f12190844949"} Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.494163 5030 generic.go:334] "Generic (PLEG): container finished" podID="b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" containerID="51142efc7916ed03c6153a5dcb3fe231b88d5aa152a4410501c160c9b0963e66" exitCode=0 Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.494253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xl95q" event={"ID":"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5","Type":"ContainerDied","Data":"51142efc7916ed03c6153a5dcb3fe231b88d5aa152a4410501c160c9b0963e66"} Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.497006 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerID="1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791" exitCode=0 Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.497112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerDied","Data":"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791"} Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.497215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"d9890aef-9ec6-4ee3-a4f5-3369f857fe91","Type":"ContainerDied","Data":"d43d7c27d3e271ea822d5e5d2522ae40eccbac1dae312780c02d46e43a252c81"} Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.497218 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.497245 5030 scope.go:117] "RemoveContainer" containerID="1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.562037 5030 scope.go:117] "RemoveContainer" containerID="4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.569891 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.579595 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589176 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:28 crc kubenswrapper[5030]: E0120 23:25:28.589610 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api-log" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589647 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api-log" Jan 20 23:25:28 crc kubenswrapper[5030]: E0120 23:25:28.589677 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589684 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api" Jan 20 23:25:28 crc kubenswrapper[5030]: E0120 23:25:28.589692 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3358fbb1-638c-4f20-804e-b672441288e4" containerName="nova-cell0-conductor-db-sync" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589698 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3358fbb1-638c-4f20-804e-b672441288e4" containerName="nova-cell0-conductor-db-sync" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589862 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api-log" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589890 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3358fbb1-638c-4f20-804e-b672441288e4" containerName="nova-cell0-conductor-db-sync" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.589905 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" containerName="cinder-api" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.597974 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.600685 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.601486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.601611 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.601721 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.614096 5030 scope.go:117] "RemoveContainer" containerID="1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791" Jan 20 23:25:28 crc kubenswrapper[5030]: E0120 23:25:28.616303 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791\": container with ID starting with 1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791 not found: ID does not exist" containerID="1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.616351 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791"} err="failed to get container status \"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791\": rpc error: code = NotFound desc = could not find container \"1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791\": container with ID starting with 1e92ba7e3b5a44975fbeae958e367e9b9ace413336e57249472fca878dfc9791 not found: ID does not exist" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.616380 5030 scope.go:117] "RemoveContainer" containerID="4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474" Jan 20 23:25:28 crc kubenswrapper[5030]: E0120 23:25:28.619317 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474\": container with ID starting with 4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474 not found: ID does not exist" containerID="4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.619364 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474"} err="failed to get container status \"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474\": rpc error: code = NotFound desc = could not find container \"4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474\": container with ID starting with 4a372dde5599973c143827ff58942c5bc6bf6c2a7036dc2f4b50d4bb53c3b474 not found: ID does not exist" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.681974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753275 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6fw8\" (UniqueName: \"kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.753985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.754003 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.754051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.754081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.855599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.855703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.855743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6fw8\" (UniqueName: \"kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.856133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.856332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.856885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.856935 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.856959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.857247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.857289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.858041 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.861568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.861837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.861888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.863081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.863167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.870058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.873924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6fw8\" (UniqueName: \"kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8\") pod \"cinder-api-0\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:28 crc kubenswrapper[5030]: I0120 23:25:28.918074 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.408713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.526031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerStarted","Data":"79b477de7360a1b389230a49bf03141e246b436af45aecf9fc9c9289c4283a3d"} Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.526195 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.527576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerStarted","Data":"9ecbaad4cafc0c70aaa295183019d98d4b5571eb21b267640e04c91eae93a431"} Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.568992 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9983103 podStartE2EDuration="5.568970025s" podCreationTimestamp="2026-01-20 23:25:24 +0000 UTC" firstStartedPulling="2026-01-20 23:25:25.188064336 +0000 UTC m=+2997.508324624" lastFinishedPulling="2026-01-20 23:25:28.758724061 +0000 UTC m=+3001.078984349" observedRunningTime="2026-01-20 23:25:29.560219763 +0000 UTC m=+3001.880480061" watchObservedRunningTime="2026-01-20 23:25:29.568970025 +0000 UTC m=+3001.889230323" Jan 20 23:25:29 crc kubenswrapper[5030]: I0120 23:25:29.972803 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9890aef-9ec6-4ee3-a4f5-3369f857fe91" path="/var/lib/kubelet/pods/d9890aef-9ec6-4ee3-a4f5-3369f857fe91/volumes" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.068028 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.080946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w86j2\" (UniqueName: \"kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2\") pod \"43181c01-0cec-4ad9-b037-880357999794\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.080991 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle\") pod \"43181c01-0cec-4ad9-b037-880357999794\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.081028 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data\") pod \"43181c01-0cec-4ad9-b037-880357999794\" (UID: \"43181c01-0cec-4ad9-b037-880357999794\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.083718 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.099495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2" (OuterVolumeSpecName: "kube-api-access-w86j2") pod "43181c01-0cec-4ad9-b037-880357999794" (UID: "43181c01-0cec-4ad9-b037-880357999794"). InnerVolumeSpecName "kube-api-access-w86j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.132055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43181c01-0cec-4ad9-b037-880357999794" (UID: "43181c01-0cec-4ad9-b037-880357999794"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts\") pod \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqbxb\" (UniqueName: \"kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb\") pod \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle\") pod \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs\") pod \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data\") pod \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\" (UID: \"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5\") " Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182786 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w86j2\" (UniqueName: \"kubernetes.io/projected/43181c01-0cec-4ad9-b037-880357999794-kube-api-access-w86j2\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.182804 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.184760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs" (OuterVolumeSpecName: "logs") pod 
"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" (UID: "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.198306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts" (OuterVolumeSpecName: "scripts") pod "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" (UID: "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.198522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb" (OuterVolumeSpecName: "kube-api-access-xqbxb") pod "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" (UID: "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5"). InnerVolumeSpecName "kube-api-access-xqbxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.231125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data" (OuterVolumeSpecName: "config-data") pod "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" (UID: "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.236597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" (UID: "b66b2ce3-1462-4db2-b1b4-6ca42402d7a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.252105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data" (OuterVolumeSpecName: "config-data") pod "43181c01-0cec-4ad9-b037-880357999794" (UID: "43181c01-0cec-4ad9-b037-880357999794"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284087 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284330 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43181c01-0cec-4ad9-b037-880357999794-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284339 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284349 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqbxb\" (UniqueName: \"kubernetes.io/projected/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-kube-api-access-xqbxb\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284361 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.284370 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.291984 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.295000 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.295389 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54 is running failed: container process not found" containerID="e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.295454 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" containerName="nova-cell0-conductor-conductor" Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.360559 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc458e9a2_dd7c_4835_88f2_dcd41887e658.slice/crio-conmon-e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.613471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerStarted","Data":"d8ebde37b10c53ca483ade68635a4736300b159d4b22e45b3f8dec9db7455d00"} Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.620191 5030 generic.go:334] "Generic (PLEG): container finished" podID="c458e9a2-dd7c-4835-88f2-dcd41887e658" containerID="e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" exitCode=0 Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.620292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"c458e9a2-dd7c-4835-88f2-dcd41887e658","Type":"ContainerDied","Data":"e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54"} Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.625842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" event={"ID":"43181c01-0cec-4ad9-b037-880357999794","Type":"ContainerDied","Data":"b964da6f15dcddd67731a7aaf5812e4acf7bb29d5705242453682a8378c0b7f8"} Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.625883 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b964da6f15dcddd67731a7aaf5812e4acf7bb29d5705242453682a8378c0b7f8" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.625962 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-x8hpn" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.642047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-xl95q" event={"ID":"b66b2ce3-1462-4db2-b1b4-6ca42402d7a5","Type":"ContainerDied","Data":"c4f4ba8b2379bebb32ac1c3eda6f53fc24741019d3d91a4919cd7e061982d91c"} Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.642287 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4f4ba8b2379bebb32ac1c3eda6f53fc24741019d3d91a4919cd7e061982d91c" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.642377 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-xl95q" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.694423 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q284r"] Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.706795 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-q284r"] Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.722736 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.723131 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43181c01-0cec-4ad9-b037-880357999794" containerName="keystone-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.723142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43181c01-0cec-4ad9-b037-880357999794" containerName="keystone-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: E0120 23:25:30.723175 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" containerName="placement-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.723183 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" containerName="placement-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.723353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" containerName="placement-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.723364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43181c01-0cec-4ad9-b037-880357999794" containerName="keystone-db-sync" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.724309 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.726177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.802749 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2nmjb"] Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.804218 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.806212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.812599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2nmjb"] Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjglk\" (UniqueName: \"kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.827536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjglk\" (UniqueName: 
\"kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930352 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8pvc\" (UniqueName: \"kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930467 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.930846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.931503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.935075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.939234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.939347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.951280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.952402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bjglk\" (UniqueName: \"kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:30 crc kubenswrapper[5030]: I0120 23:25:30.956415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data\") pod \"placement-5946fbdfbb-6s7qt\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8pvc\" (UniqueName: \"kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.034591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.044186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.046151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.047194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.049174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.050324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8pvc\" (UniqueName: \"kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.052824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys\") pod \"keystone-bootstrap-2nmjb\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.065083 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.133000 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.140363 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.339852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle\") pod \"c458e9a2-dd7c-4835-88f2-dcd41887e658\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.340129 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqg64\" (UniqueName: \"kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64\") pod \"c458e9a2-dd7c-4835-88f2-dcd41887e658\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.340157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data\") pod \"c458e9a2-dd7c-4835-88f2-dcd41887e658\" (UID: \"c458e9a2-dd7c-4835-88f2-dcd41887e658\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.344695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64" (OuterVolumeSpecName: "kube-api-access-wqg64") pod "c458e9a2-dd7c-4835-88f2-dcd41887e658" (UID: "c458e9a2-dd7c-4835-88f2-dcd41887e658"). InnerVolumeSpecName "kube-api-access-wqg64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.378115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c458e9a2-dd7c-4835-88f2-dcd41887e658" (UID: "c458e9a2-dd7c-4835-88f2-dcd41887e658"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.381869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data" (OuterVolumeSpecName: "config-data") pod "c458e9a2-dd7c-4835-88f2-dcd41887e658" (UID: "c458e9a2-dd7c-4835-88f2-dcd41887e658"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.442262 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.442289 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqg64\" (UniqueName: \"kubernetes.io/projected/c458e9a2-dd7c-4835-88f2-dcd41887e658-kube-api-access-wqg64\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.442301 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c458e9a2-dd7c-4835-88f2-dcd41887e658-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.533226 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574 is running failed: container process not found" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.533671 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574 is running failed: container process not found" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.534006 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574 is running failed: container process not found" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.534043 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.592612 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:25:31 crc kubenswrapper[5030]: W0120 23:25:31.607605 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode89f10b6_405e_4c42_a2f5_97111102ec40.slice/crio-9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26 WatchSource:0}: Error finding container 9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26: Status 404 returned error can't find the container with id 9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26 Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.665499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"c458e9a2-dd7c-4835-88f2-dcd41887e658","Type":"ContainerDied","Data":"502a28c098660efaba5078cf04781ee92fe3f9947c2fe0dc97bcfa39d2e6d39f"} Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.665787 5030 scope.go:117] "RemoveContainer" containerID="e479ed4455261ff95dbf70f650e6de7b4c94add8a7518effc7599b3e30ce8a54" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.665915 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.696721 5030 generic.go:334] "Generic (PLEG): container finished" podID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" exitCode=0 Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.696833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0194cff7-fd9f-4d47-8edd-eae7cd25c987","Type":"ContainerDied","Data":"62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574"} Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.697576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2nmjb"] Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.704979 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerStarted","Data":"9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26"} Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.714857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerStarted","Data":"9b87dbe57785654f140e071cbe49ac4f1c7e04524db51c2dad9594ac25b3bdad"} Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.716138 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.719334 5030 generic.go:334] "Generic (PLEG): container finished" podID="3bc26a88-4c14-43cd-8255-cf5bb3e15931" containerID="3657580e710d34e6ab5e47b4e7f1bff1e8c70505ca73a3c36c83d6889bcae7b8" exitCode=0 Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.719405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" event={"ID":"3bc26a88-4c14-43cd-8255-cf5bb3e15931","Type":"ContainerDied","Data":"3657580e710d34e6ab5e47b4e7f1bff1e8c70505ca73a3c36c83d6889bcae7b8"} Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.758429 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.758412365 podStartE2EDuration="3.758412365s" podCreationTimestamp="2026-01-20 23:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:31.743679459 +0000 UTC m=+3004.063939757" watchObservedRunningTime="2026-01-20 23:25:31.758412365 +0000 UTC m=+3004.078672653" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.806366 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.856492 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.880805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.925034 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.925451 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.925469 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: E0120 23:25:31.925486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" containerName="nova-cell0-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.925495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" containerName="nova-cell0-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.925703 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" containerName="nova-cell1-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.925728 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" containerName="nova-cell0-conductor-conductor" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.926303 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.928561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.949606 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.959022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ff5b\" (UniqueName: \"kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b\") pod \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.959114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle\") pod \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.959175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data\") pod \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\" (UID: \"0194cff7-fd9f-4d47-8edd-eae7cd25c987\") " Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.963523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b" (OuterVolumeSpecName: "kube-api-access-7ff5b") pod "0194cff7-fd9f-4d47-8edd-eae7cd25c987" (UID: "0194cff7-fd9f-4d47-8edd-eae7cd25c987"). InnerVolumeSpecName "kube-api-access-7ff5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.975731 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a0bd3e-1c17-41ac-b449-3392e10dccc8" path="/var/lib/kubelet/pods/59a0bd3e-1c17-41ac-b449-3392e10dccc8/volumes" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.976286 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c458e9a2-dd7c-4835-88f2-dcd41887e658" path="/var/lib/kubelet/pods/c458e9a2-dd7c-4835-88f2-dcd41887e658/volumes" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.986773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data" (OuterVolumeSpecName: "config-data") pod "0194cff7-fd9f-4d47-8edd-eae7cd25c987" (UID: "0194cff7-fd9f-4d47-8edd-eae7cd25c987"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:31 crc kubenswrapper[5030]: I0120 23:25:31.986841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0194cff7-fd9f-4d47-8edd-eae7cd25c987" (UID: "0194cff7-fd9f-4d47-8edd-eae7cd25c987"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.060976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.061404 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.061460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nhdd\" (UniqueName: \"kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.061531 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ff5b\" (UniqueName: \"kubernetes.io/projected/0194cff7-fd9f-4d47-8edd-eae7cd25c987-kube-api-access-7ff5b\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.061544 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.061553 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0194cff7-fd9f-4d47-8edd-eae7cd25c987-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.163397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.163481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.163531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nhdd\" (UniqueName: \"kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.169385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.169381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.181802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nhdd\" (UniqueName: \"kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd\") pod \"nova-cell0-conductor-0\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.247381 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.708321 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.736083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"0194cff7-fd9f-4d47-8edd-eae7cd25c987","Type":"ContainerDied","Data":"96383fc6688c12fc4ca8d54b8a6a421d53e687489e44cbc82ef0a736e985cfc1"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.736147 5030 scope.go:117] "RemoveContainer" containerID="62c39b67fe1c9efcc1574e1aedc22e3804ac0161268c3772dd50ef2e324f6574" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.736152 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.738891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerStarted","Data":"4141048d16b87ba820575f3c2ed1b798652fb89491b0f81f3244fea3383c5fb5"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.738931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerStarted","Data":"39dedca045f311d6169ac05c06012585bdf93027adb63134aae42af056394de6"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.739812 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.739942 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.745068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" event={"ID":"72433a7b-fb29-44be-90d0-485aadbb3d7e","Type":"ContainerStarted","Data":"bbd14ee795fa04a58c750c6baa61b64963cd12c1c88def18d5dedbd8c96d79f2"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.745115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" event={"ID":"72433a7b-fb29-44be-90d0-485aadbb3d7e","Type":"ContainerStarted","Data":"b4215dea61223996a35d900c15ea536818366fcff1d3caf86146861ef6ec22b2"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.750854 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" containerID="e3bc077b333e186d58c38afabbc39ee5da09f71b50da1b5bffef05ed628247e9" exitCode=0 Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.750915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" event={"ID":"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063","Type":"ContainerDied","Data":"e3bc077b333e186d58c38afabbc39ee5da09f71b50da1b5bffef05ed628247e9"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.752509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e33b23a7-cf35-4fc2-8237-9633f3d807f3","Type":"ContainerStarted","Data":"147bc330fa3f24ee5067c7881ac9e25763bbdb0fa4b4e37574f9cfdb8c785d62"} Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.790935 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" podStartSLOduration=2.7909189789999997 podStartE2EDuration="2.790918979s" podCreationTimestamp="2026-01-20 23:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:32.779320849 +0000 UTC m=+3005.099581147" watchObservedRunningTime="2026-01-20 23:25:32.790918979 +0000 UTC m=+3005.111179267" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.823181 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" podStartSLOduration=2.82315908 podStartE2EDuration="2.82315908s" podCreationTimestamp="2026-01-20 23:25:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:32.810563385 +0000 UTC m=+3005.130823673" watchObservedRunningTime="2026-01-20 23:25:32.82315908 +0000 UTC m=+3005.143419368" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.876062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.889748 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.898211 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.899472 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.903199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:32 crc kubenswrapper[5030]: I0120 23:25:32.903651 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.073761 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.094966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx9h\" (UniqueName: \"kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.095124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.095162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.195831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data\") pod \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.195983 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbtdm\" (UniqueName: \"kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm\") pod \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.196064 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle\") pod \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\" (UID: \"3bc26a88-4c14-43cd-8255-cf5bb3e15931\") " Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.196881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.196930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.196987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx9h\" (UniqueName: \"kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.201136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.201790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm" (OuterVolumeSpecName: "kube-api-access-hbtdm") pod "3bc26a88-4c14-43cd-8255-cf5bb3e15931" (UID: "3bc26a88-4c14-43cd-8255-cf5bb3e15931"). InnerVolumeSpecName "kube-api-access-hbtdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.202833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3bc26a88-4c14-43cd-8255-cf5bb3e15931" (UID: "3bc26a88-4c14-43cd-8255-cf5bb3e15931"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.204378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.221211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx9h\" (UniqueName: \"kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h\") pod \"nova-cell1-conductor-0\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.247417 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.247728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bc26a88-4c14-43cd-8255-cf5bb3e15931" (UID: "3bc26a88-4c14-43cd-8255-cf5bb3e15931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.298694 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.298725 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbtdm\" (UniqueName: \"kubernetes.io/projected/3bc26a88-4c14-43cd-8255-cf5bb3e15931-kube-api-access-hbtdm\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.298735 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bc26a88-4c14-43cd-8255-cf5bb3e15931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.698019 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:25:33 crc kubenswrapper[5030]: W0120 23:25:33.703159 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod747ac61a_5b10_46ef_8a34_68b43c777d96.slice/crio-689fb66d9fd62d419d62306c65d758bc0231fafd80e34a96ff076ba5c12fc94c WatchSource:0}: Error finding container 689fb66d9fd62d419d62306c65d758bc0231fafd80e34a96ff076ba5c12fc94c: Status 404 returned error can't find the container with id 689fb66d9fd62d419d62306c65d758bc0231fafd80e34a96ff076ba5c12fc94c Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.767898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" event={"ID":"3bc26a88-4c14-43cd-8255-cf5bb3e15931","Type":"ContainerDied","Data":"9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11"} Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.767933 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-nchp6" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.767938 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9878da5f1dd417bce0b2d70887d8d0b27e99c1128c4c30ef56eda403f6567a11" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.787856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"747ac61a-5b10-46ef-8a34-68b43c777d96","Type":"ContainerStarted","Data":"689fb66d9fd62d419d62306c65d758bc0231fafd80e34a96ff076ba5c12fc94c"} Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.801857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e33b23a7-cf35-4fc2-8237-9633f3d807f3","Type":"ContainerStarted","Data":"60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c"} Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.802854 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.825735 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.825717039 podStartE2EDuration="2.825717039s" podCreationTimestamp="2026-01-20 23:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:33.816367842 +0000 UTC m=+3006.136628150" watchObservedRunningTime="2026-01-20 23:25:33.825717039 +0000 UTC m=+3006.145977327" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.980472 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0194cff7-fd9f-4d47-8edd-eae7cd25c987" path="/var/lib/kubelet/pods/0194cff7-fd9f-4d47-8edd-eae7cd25c987/volumes" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.995972 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:25:33 crc kubenswrapper[5030]: E0120 23:25:33.996382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc26a88-4c14-43cd-8255-cf5bb3e15931" containerName="barbican-db-sync" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.996393 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc26a88-4c14-43cd-8255-cf5bb3e15931" containerName="barbican-db-sync" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.996569 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc26a88-4c14-43cd-8255-cf5bb3e15931" containerName="barbican-db-sync" Jan 20 23:25:33 crc kubenswrapper[5030]: I0120 23:25:33.997484 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.040563 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.042152 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.061701 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.069889 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.077159 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.097726 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.122078 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.124499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.124577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.124736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.124842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.124898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.125033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" 
Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.125207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5f9\" (UniqueName: \"kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226629 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcg4p\" (UniqueName: \"kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5f9\" (UniqueName: \"kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgqk7\" (UniqueName: \"kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: 
\"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226874 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.226892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.227892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.231581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.232708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.233074 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.233770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.238880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.245269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5f9\" (UniqueName: \"kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9\") pod \"barbican-api-5ccdc9797b-mz9r7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 
23:25:34.328644 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle\") pod \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.329001 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ch9\" (UniqueName: \"kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9\") pod \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.329964 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data\") pod \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.330729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data\") pod \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\" (UID: \"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.330951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcg4p\" (UniqueName: \"kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331654 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgqk7\" (UniqueName: \"kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.331907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.332028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.333437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.333519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.335719 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.338020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" (UID: "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.341979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.346028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.347687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgqk7\" (UniqueName: \"kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.352027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9" (OuterVolumeSpecName: "kube-api-access-c6ch9") pod "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" (UID: "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063"). InnerVolumeSpecName "kube-api-access-c6ch9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.352374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.355527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.357771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.359502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcg4p\" (UniqueName: \"kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p\") pod \"barbican-worker-6c8cb8dc59-t28xk\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.362294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data\") pod \"barbican-keystone-listener-84cc57bc46-58glj\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.380667 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.388914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" (UID: "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.397299 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.406138 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data" (OuterVolumeSpecName: "config-data") pod "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" (UID: "6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.434657 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.434681 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.434690 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.434702 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ch9\" (UniqueName: \"kubernetes.io/projected/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063-kube-api-access-c6ch9\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.767110 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.830790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" event={"ID":"6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063","Type":"ContainerDied","Data":"175185716a54fc745b8363fbda184eb388e6692213ad6b5915fa1863322d7db2"} Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.830827 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="175185716a54fc745b8363fbda184eb388e6692213ad6b5915fa1863322d7db2" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.830884 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-dtsgt" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.835302 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"747ac61a-5b10-46ef-8a34-68b43c777d96","Type":"ContainerStarted","Data":"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69"} Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.835508 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.841028 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") pod \"b607ce93-5c69-4cf7-a953-3d53d413fcff\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.841095 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfbkd\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd\") pod \"b607ce93-5c69-4cf7-a953-3d53d413fcff\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.841157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache\") pod \"b607ce93-5c69-4cf7-a953-3d53d413fcff\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.841178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock\") pod \"b607ce93-5c69-4cf7-a953-3d53d413fcff\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.841310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"b607ce93-5c69-4cf7-a953-3d53d413fcff\" (UID: \"b607ce93-5c69-4cf7-a953-3d53d413fcff\") " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.845227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock" (OuterVolumeSpecName: "lock") pod "b607ce93-5c69-4cf7-a953-3d53d413fcff" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.848167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache" (OuterVolumeSpecName: "cache") pod "b607ce93-5c69-4cf7-a953-3d53d413fcff" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.849910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b607ce93-5c69-4cf7-a953-3d53d413fcff" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.854870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "swift") pod "b607ce93-5c69-4cf7-a953-3d53d413fcff" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.873750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd" (OuterVolumeSpecName: "kube-api-access-mfbkd") pod "b607ce93-5c69-4cf7-a953-3d53d413fcff" (UID: "b607ce93-5c69-4cf7-a953-3d53d413fcff"). InnerVolumeSpecName "kube-api-access-mfbkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.875710 5030 generic.go:334] "Generic (PLEG): container finished" podID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerID="e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302" exitCode=137 Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.876011 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.876574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302"} Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.876605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"b607ce93-5c69-4cf7-a953-3d53d413fcff","Type":"ContainerDied","Data":"266b590f8256a8c0d5161f4132135e7cfd60b1cb4ad64eb600411af4a34a3979"} Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.876639 5030 scope.go:117] "RemoveContainer" containerID="e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.900166 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.900149118 podStartE2EDuration="2.900149118s" podCreationTimestamp="2026-01-20 23:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:34.855278021 +0000 UTC m=+3007.175538309" watchObservedRunningTime="2026-01-20 23:25:34.900149118 +0000 UTC m=+3007.220409396" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.952235 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.952270 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.952282 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfbkd\" (UniqueName: \"kubernetes.io/projected/b607ce93-5c69-4cf7-a953-3d53d413fcff-kube-api-access-mfbkd\") on node \"crc\" DevicePath \"\"" Jan 20 
23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.952291 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.952302 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b607ce93-5c69-4cf7-a953-3d53d413fcff-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:34 crc kubenswrapper[5030]: I0120 23:25:34.977787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.024721 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.045719 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.048076 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.055046 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.058979 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.069054 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.069372 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-log" containerID="cri-o://af4b4b10add63da920b6c0c9dfac114e8ff71804b575c92731ad570bf240f7e1" gracePeriod=30 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.069922 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-httpd" containerID="cri-o://5bfcfd97530429318df3c1456d709733df8f61edbb7300154b1b5a2dc34d7062" gracePeriod=30 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.104566 5030 scope.go:117] "RemoveContainer" containerID="ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131398 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131849 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" containerName="glance-db-sync" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131863 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" containerName="glance-db-sync" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131881 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 
23:25:35.131888 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131900 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-server" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131919 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131926 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131936 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131950 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="swift-recon-cron" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131956 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="swift-recon-cron" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131966 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131972 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.131990 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.131995 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132005 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132011 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132024 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132035 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-expirer" 
Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132041 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-expirer" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132054 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-reaper" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132060 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-reaper" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132078 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132084 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-server" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132096 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="rsync" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132101 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="rsync" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132117 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-server" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.132130 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132295 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132304 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132316 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132324 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132333 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132344 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="rsync" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132357 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" 
containerName="object-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-server" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132372 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="swift-recon-cron" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132380 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" containerName="glance-db-sync" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132396 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-replicator" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132407 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-auditor" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132417 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="container-updater" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132425 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="account-reaper" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.132433 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" containerName="object-expirer" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.158362 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.168736 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.168952 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-log" containerID="cri-o://7d33a9dd6b2ff81227dfeb64ea475066ca5a849d6f4973770fd37aee303e0b58" gracePeriod=30 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.169090 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-httpd" containerID="cri-o://bbe8780ef4464c2c88e1e7f47c8a400ceb8bf0f5eb423d5f85696f78d14df1f9" gracePeriod=30 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.190266 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.190592 5030 scope.go:117] "RemoveContainer" containerID="d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.202710 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.224103 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.262522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.262780 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.262871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.263025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.263148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrtpp\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp\") pod \"swift-storage-0\" (UID: 
\"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.340773 5030 scope.go:117] "RemoveContainer" containerID="898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.365281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.365402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.365453 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.365503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrtpp\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.365642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.366176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.366596 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") device mount path \"/mnt/openstack/pv15\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.367025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.376207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc 
kubenswrapper[5030]: I0120 23:25:35.405435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrtpp\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.411001 5030 scope.go:117] "RemoveContainer" containerID="de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.412983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.477101 5030 scope.go:117] "RemoveContainer" containerID="43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.506791 5030 scope.go:117] "RemoveContainer" containerID="2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.511618 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.530943 5030 scope.go:117] "RemoveContainer" containerID="6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.574584 5030 scope.go:117] "RemoveContainer" containerID="0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.726705 5030 scope.go:117] "RemoveContainer" containerID="a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.799305 5030 scope.go:117] "RemoveContainer" containerID="8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.827670 5030 scope.go:117] "RemoveContainer" containerID="8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.869745 5030 scope.go:117] "RemoveContainer" containerID="4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.889478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerStarted","Data":"83f9fdc7aa98bed3c6026ba325295aaed4096990435ab42d93cc5a1829c1757c"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.889531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerStarted","Data":"b8717a3cdbefcbd0c0ff07aabc3b13b4414654b84eeec748b31cb1d195d654c9"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.889542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerStarted","Data":"f147cd38fce61595e1d4a805fcbbb6b04012132e47915a6b7f4270822781d6c7"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.890607 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.890633 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.898115 5030 scope.go:117] "RemoveContainer" containerID="2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.898532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerStarted","Data":"75c9449a8a21d460c37176286c18c54720a652f60cdf7d26b5616d4f02353a1a"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.898563 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerStarted","Data":"6b496bef4ca4d0433913ed3347b495cb73325fdbef015e1d05cc61795bb60789"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.898574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerStarted","Data":"960095848ea85050d361d6661010f9e4804b844498acf033464e6e06806fde20"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.909164 5030 generic.go:334] "Generic (PLEG): container finished" podID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerID="af4b4b10add63da920b6c0c9dfac114e8ff71804b575c92731ad570bf240f7e1" exitCode=143 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.909225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerDied","Data":"af4b4b10add63da920b6c0c9dfac114e8ff71804b575c92731ad570bf240f7e1"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.916651 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" podStartSLOduration=2.916621024 podStartE2EDuration="2.916621024s" podCreationTimestamp="2026-01-20 23:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:35.909973262 +0000 UTC m=+3008.230233550" watchObservedRunningTime="2026-01-20 23:25:35.916621024 +0000 UTC m=+3008.236881312" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.922651 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a280c54-785f-4a4f-895b-231aaed18db5" containerID="7d33a9dd6b2ff81227dfeb64ea475066ca5a849d6f4973770fd37aee303e0b58" exitCode=143 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.922722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerDied","Data":"7d33a9dd6b2ff81227dfeb64ea475066ca5a849d6f4973770fd37aee303e0b58"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.931318 5030 scope.go:117] "RemoveContainer" containerID="e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.937115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerStarted","Data":"d30b1e443b75aaab1d0e162daa649e2952963350a5285a88628501469adfbea4"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.937188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerStarted","Data":"f3143e4041bdf06c6947b874cc740775ab9c9ec6001b798510175cc97e744a7e"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.937395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerStarted","Data":"dfb375b6b63fd96c35e67db267a5ee7abdae860b615804420073a47dcdb89a27"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.941006 5030 generic.go:334] "Generic (PLEG): container finished" podID="72433a7b-fb29-44be-90d0-485aadbb3d7e" containerID="bbd14ee795fa04a58c750c6baa61b64963cd12c1c88def18d5dedbd8c96d79f2" exitCode=0 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.941193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" event={"ID":"72433a7b-fb29-44be-90d0-485aadbb3d7e","Type":"ContainerDied","Data":"bbd14ee795fa04a58c750c6baa61b64963cd12c1c88def18d5dedbd8c96d79f2"} Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.950030 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" podStartSLOduration=2.950007882 podStartE2EDuration="2.950007882s" podCreationTimestamp="2026-01-20 23:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:35.931217518 +0000 UTC m=+3008.251477806" watchObservedRunningTime="2026-01-20 23:25:35.950007882 +0000 UTC m=+3008.270268170" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.976964 5030 scope.go:117] "RemoveContainer" containerID="e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.977506 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302\": container with ID starting with e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302 not found: ID does not exist" containerID="e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.977535 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302"} err="failed to get container status \"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302\": rpc error: code = NotFound desc = could not find container \"e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302\": container with ID starting with e68bf3c747c74247d5fd94bc51c9d1b79d386d40e704eeba0fc6a904567c5302 not found: ID does not exist" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.977561 5030 scope.go:117] "RemoveContainer" containerID="ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.978369 5030 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580\": container with ID starting with ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580 not found: ID does not exist" containerID="ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.978430 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580"} err="failed to get container status \"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580\": rpc error: code = NotFound desc = could not find container \"ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580\": container with ID starting with ca4c0372f3752ba9b4ca5958091f96b803487120243179b1402233e986a6b580 not found: ID does not exist" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.978458 5030 scope.go:117] "RemoveContainer" containerID="d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.981177 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46\": container with ID starting with d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46 not found: ID does not exist" containerID="d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.981258 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46"} err="failed to get container status \"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46\": rpc error: code = NotFound desc = could not find container \"d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46\": container with ID starting with d4e9bd5a5fdcfcd937af2ad5f261b5f163beb79f9debd2bf84fdf18248e8fd46 not found: ID does not exist" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.981274 5030 scope.go:117] "RemoveContainer" containerID="898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.986879 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" podStartSLOduration=2.986861434 podStartE2EDuration="2.986861434s" podCreationTimestamp="2026-01-20 23:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:35.962204897 +0000 UTC m=+3008.282465185" watchObservedRunningTime="2026-01-20 23:25:35.986861434 +0000 UTC m=+3008.307121722" Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.990130 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f\": container with ID starting with 898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f not found: ID does not exist" containerID="898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.990171 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f"} err="failed to get container status \"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f\": rpc error: code = NotFound desc = could not find container \"898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f\": container with ID starting with 898ef2488ff154992f84a4fc6380c82430045a8ed810e3879ec78e5a246f7a3f not found: ID does not exist" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.990195 5030 scope.go:117] "RemoveContainer" containerID="de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.992259 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b607ce93-5c69-4cf7-a953-3d53d413fcff" path="/var/lib/kubelet/pods/b607ce93-5c69-4cf7-a953-3d53d413fcff/volumes" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.994725 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:25:35 crc kubenswrapper[5030]: E0120 23:25:35.996789 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de\": container with ID starting with de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de not found: ID does not exist" containerID="de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.996835 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de"} err="failed to get container status \"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de\": rpc error: code = NotFound desc = could not find container \"de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de\": container with ID starting with de256107cf63fcb2b870e65d29e45e6ba9a4694e730e0ec20ce025d3f43b07de not found: ID does not exist" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.996862 5030 scope.go:117] "RemoveContainer" containerID="43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7" Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.997180 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker-log" containerID="cri-o://85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a" gracePeriod=30 Jan 20 23:25:35 crc kubenswrapper[5030]: I0120 23:25:35.997528 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker" containerID="cri-o://c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8" gracePeriod=30 Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.000424 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7\": container with ID starting with 43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7 not found: ID does not exist" 
containerID="43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.000457 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7"} err="failed to get container status \"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7\": rpc error: code = NotFound desc = could not find container \"43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7\": container with ID starting with 43c9302de660f515dd0eebe873e9bc2a05c872d8d24b28b741b9f2ffa3a812d7 not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.000479 5030 scope.go:117] "RemoveContainer" containerID="2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.001974 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7\": container with ID starting with 2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7 not found: ID does not exist" containerID="2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.002001 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7"} err="failed to get container status \"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7\": rpc error: code = NotFound desc = could not find container \"2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7\": container with ID starting with 2ba800c34c07d475f585e0881947ab835b0bf1d733e21efd9aa4df55007828d7 not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.002017 5030 scope.go:117] "RemoveContainer" containerID="6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.004926 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770\": container with ID starting with 6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770 not found: ID does not exist" containerID="6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.004966 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770"} err="failed to get container status \"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770\": rpc error: code = NotFound desc = could not find container \"6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770\": container with ID starting with 6b600011c7f8d67f6e7566ea6b3b1b2dfaeb40569b60470daf1e602fda755770 not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.005003 5030 scope.go:117] "RemoveContainer" containerID="0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.006876 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:25:36 crc 
kubenswrapper[5030]: E0120 23:25:36.006990 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6\": container with ID starting with 0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6 not found: ID does not exist" containerID="0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.007040 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6"} err="failed to get container status \"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6\": rpc error: code = NotFound desc = could not find container \"0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6\": container with ID starting with 0a2cf07616a39fdc6f3573ae37b852a81c3be261b67e73368904e2e6310386c6 not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.007082 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener-log" containerID="cri-o://6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d" gracePeriod=30 Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.007133 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener" containerID="cri-o://6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1" gracePeriod=30 Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.007095 5030 scope.go:117] "RemoveContainer" containerID="a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.009915 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da\": container with ID starting with a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da not found: ID does not exist" containerID="a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.009947 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da"} err="failed to get container status \"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da\": rpc error: code = NotFound desc = could not find container \"a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da\": container with ID starting with a628f7b623ff9c200508b4770aedb01dcdc16a5616b4c3c7e2c9ee2e72f314da not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.009971 5030 scope.go:117] "RemoveContainer" containerID="8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.011488 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c\": container with ID starting with 
8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c not found: ID does not exist" containerID="8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.011508 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c"} err="failed to get container status \"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c\": rpc error: code = NotFound desc = could not find container \"8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c\": container with ID starting with 8f90b5530e219e8189c33b09de69a04843468fdd0affbcabac025c6866f2785c not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.011522 5030 scope.go:117] "RemoveContainer" containerID="8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.021418 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93\": container with ID starting with 8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93 not found: ID does not exist" containerID="8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.021463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93"} err="failed to get container status \"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93\": rpc error: code = NotFound desc = could not find container \"8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93\": container with ID starting with 8d392111f161bad470eb94c639b23a6c545dd65432681f84c5e1ea62cf270e93 not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.021502 5030 scope.go:117] "RemoveContainer" containerID="4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.027720 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c\": container with ID starting with 4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c not found: ID does not exist" containerID="4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.027760 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c"} err="failed to get container status \"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c\": rpc error: code = NotFound desc = could not find container \"4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c\": container with ID starting with 4c756d7b8db1a727410cc6f3d2706156889222c800b5aa7f19e74bcaae74d60c not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.027788 5030 scope.go:117] "RemoveContainer" containerID="2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.028261 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a\": container with ID starting with 2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a not found: ID does not exist" containerID="2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.028288 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a"} err="failed to get container status \"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a\": rpc error: code = NotFound desc = could not find container \"2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a\": container with ID starting with 2461fa19d045db13a97b7b0b5638ffc5a653019b3da567b4db09dfdfa385937a not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.028302 5030 scope.go:117] "RemoveContainer" containerID="e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.028715 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c\": container with ID starting with e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c not found: ID does not exist" containerID="e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.028760 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c"} err="failed to get container status \"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c\": rpc error: code = NotFound desc = could not find container \"e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c\": container with ID starting with e7b7de68994a8284266afd34bc3a79c36d36d9259d1a0d8a634f2a7d8363e58c not found: ID does not exist" Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.068277 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.951083 5030 generic.go:334] "Generic (PLEG): container finished" podID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerID="85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a" exitCode=143 Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.951117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerDied","Data":"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.954398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.954426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8"} Jan 20 23:25:36 
crc kubenswrapper[5030]: I0120 23:25:36.954437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.954446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.954454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.954462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"b6f075fb50ad4968fe34a14726ac474d4e470b514b1278364c1229eef793355d"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.956270 5030 generic.go:334] "Generic (PLEG): container finished" podID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerID="6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d" exitCode=143 Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.956311 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerDied","Data":"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d"} Jan 20 23:25:36 crc kubenswrapper[5030]: I0120 23:25:36.963828 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:25:36 crc kubenswrapper[5030]: E0120 23:25:36.964140 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.236534 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.292477 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.306735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8pvc\" (UniqueName: \"kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.306836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.306868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.306906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.307015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.307120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys\") pod \"72433a7b-fb29-44be-90d0-485aadbb3d7e\" (UID: \"72433a7b-fb29-44be-90d0-485aadbb3d7e\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.315688 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc" (OuterVolumeSpecName: "kube-api-access-v8pvc") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "kube-api-access-v8pvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.320356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts" (OuterVolumeSpecName: "scripts") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.321572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.333913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.360206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data" (OuterVolumeSpecName: "config-data") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.380895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72433a7b-fb29-44be-90d0-485aadbb3d7e" (UID: "72433a7b-fb29-44be-90d0-485aadbb3d7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.408982 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.409002 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.409013 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8pvc\" (UniqueName: \"kubernetes.io/projected/72433a7b-fb29-44be-90d0-485aadbb3d7e-kube-api-access-v8pvc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.409023 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.409031 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.409039 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72433a7b-fb29-44be-90d0-485aadbb3d7e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.687703 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.809239 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.856617 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom\") pod \"8b43301d-1654-48e3-939b-48b8b7a01d08\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.856769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs\") pod \"8b43301d-1654-48e3-939b-48b8b7a01d08\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.856832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d882t\" (UniqueName: \"kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t\") pod \"8b43301d-1654-48e3-939b-48b8b7a01d08\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.856879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data\") pod \"8b43301d-1654-48e3-939b-48b8b7a01d08\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.856910 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle\") pod \"8b43301d-1654-48e3-939b-48b8b7a01d08\" (UID: \"8b43301d-1654-48e3-939b-48b8b7a01d08\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.858161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs" (OuterVolumeSpecName: "logs") pod "8b43301d-1654-48e3-939b-48b8b7a01d08" (UID: "8b43301d-1654-48e3-939b-48b8b7a01d08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.866629 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t" (OuterVolumeSpecName: "kube-api-access-d882t") pod "8b43301d-1654-48e3-939b-48b8b7a01d08" (UID: "8b43301d-1654-48e3-939b-48b8b7a01d08"). InnerVolumeSpecName "kube-api-access-d882t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.866725 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8b43301d-1654-48e3-939b-48b8b7a01d08" (UID: "8b43301d-1654-48e3-939b-48b8b7a01d08"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.895822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b43301d-1654-48e3-939b-48b8b7a01d08" (UID: "8b43301d-1654-48e3-939b-48b8b7a01d08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.920823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data" (OuterVolumeSpecName: "config-data") pod "8b43301d-1654-48e3-939b-48b8b7a01d08" (UID: "8b43301d-1654-48e3-939b-48b8b7a01d08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs\") pod \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom\") pod \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data\") pod \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzkf6\" (UniqueName: \"kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6\") pod \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle\") pod \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\" (UID: \"08fe9daf-c9de-4513-8f1d-4de9845d68d0\") " Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs" (OuterVolumeSpecName: "logs") pod "08fe9daf-c9de-4513-8f1d-4de9845d68d0" (UID: "08fe9daf-c9de-4513-8f1d-4de9845d68d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958827 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08fe9daf-c9de-4513-8f1d-4de9845d68d0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958840 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958851 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b43301d-1654-48e3-939b-48b8b7a01d08-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958861 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d882t\" (UniqueName: \"kubernetes.io/projected/8b43301d-1654-48e3-939b-48b8b7a01d08-kube-api-access-d882t\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958871 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.958880 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b43301d-1654-48e3-939b-48b8b7a01d08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.962148 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6" (OuterVolumeSpecName: "kube-api-access-xzkf6") pod "08fe9daf-c9de-4513-8f1d-4de9845d68d0" (UID: "08fe9daf-c9de-4513-8f1d-4de9845d68d0"). InnerVolumeSpecName "kube-api-access-xzkf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.963129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08fe9daf-c9de-4513-8f1d-4de9845d68d0" (UID: "08fe9daf-c9de-4513-8f1d-4de9845d68d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.987057 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.992851 5030 generic.go:334] "Generic (PLEG): container finished" podID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerID="6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1" exitCode=0 Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.993193 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.998915 5030 generic.go:334] "Generic (PLEG): container finished" podID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerID="c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8" exitCode=0 Jan 20 23:25:37 crc kubenswrapper[5030]: I0120 23:25:37.999078 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-2nmjb" event={"ID":"72433a7b-fb29-44be-90d0-485aadbb3d7e","Type":"ContainerDied","Data":"b4215dea61223996a35d900c15ea536818366fcff1d3caf86146861ef6ec22b2"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002898 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4215dea61223996a35d900c15ea536818366fcff1d3caf86146861ef6ec22b2" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerDied","Data":"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2" event={"ID":"08fe9daf-c9de-4513-8f1d-4de9845d68d0","Type":"ContainerDied","Data":"46ca8187512a0c20917af667d8938f43bde4cd8b7c114e3befce9ac1013597e4"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerDied","Data":"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002975 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-77849f758-fppxg" event={"ID":"8b43301d-1654-48e3-939b-48b8b7a01d08","Type":"ContainerDied","Data":"852edb6beb66476327455a282ca0f6f5b5a2e520521c33fe0315cf8b2482dd13"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.002997 5030 scope.go:117] "RemoveContainer" containerID="6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.032896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.032953 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.032969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2"} Jan 
20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.032980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da"} Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.040838 5030 scope.go:117] "RemoveContainer" containerID="6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.044229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08fe9daf-c9de-4513-8f1d-4de9845d68d0" (UID: "08fe9daf-c9de-4513-8f1d-4de9845d68d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.061289 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.061328 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.061690 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzkf6\" (UniqueName: \"kubernetes.io/projected/08fe9daf-c9de-4513-8f1d-4de9845d68d0-kube-api-access-xzkf6\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.063765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data" (OuterVolumeSpecName: "config-data") pod "08fe9daf-c9de-4513-8f1d-4de9845d68d0" (UID: "08fe9daf-c9de-4513-8f1d-4de9845d68d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.064654 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.065946 5030 scope.go:117] "RemoveContainer" containerID="6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.066621 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1\": container with ID starting with 6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1 not found: ID does not exist" containerID="6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.066748 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1"} err="failed to get container status \"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1\": rpc error: code = NotFound desc = could not find container \"6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1\": container with ID starting with 6e3cad28bd1101a150e15cb12ca038b8e32ad2b04372748ee44338393d3a39d1 not found: ID does not exist" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.066780 5030 scope.go:117] "RemoveContainer" containerID="6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.067322 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d\": container with ID starting with 6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d not found: ID does not exist" containerID="6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.067358 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d"} err="failed to get container status \"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d\": rpc error: code = NotFound desc = could not find container \"6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d\": container with ID starting with 6eaa4dde275842ef0beb5f7d8e6e13309e1e063e95acedde22a48a72efc53e6d not found: ID does not exist" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.067374 5030 scope.go:117] "RemoveContainer" containerID="c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.075459 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-77849f758-fppxg"] Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.144537 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.145009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener-log" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145028 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener-log" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.145041 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145048 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.145062 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker-log" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145072 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker-log" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.145169 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72433a7b-fb29-44be-90d0-485aadbb3d7e" containerName="keystone-bootstrap" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145177 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="72433a7b-fb29-44be-90d0-485aadbb3d7e" containerName="keystone-bootstrap" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.145205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145212 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145450 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="72433a7b-fb29-44be-90d0-485aadbb3d7e" containerName="keystone-bootstrap" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145485 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145498 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener-log" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145508 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" containerName="barbican-worker-log" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.145518 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" containerName="barbican-keystone-listener" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.146210 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.148573 5030 scope.go:117] "RemoveContainer" containerID="85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.159592 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.165550 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08fe9daf-c9de-4513-8f1d-4de9845d68d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.180869 5030 scope.go:117] "RemoveContainer" containerID="c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.183566 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8\": container with ID starting with c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8 not found: ID does not exist" containerID="c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.183608 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8"} err="failed to get container status \"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8\": rpc error: code = NotFound desc = could not find container \"c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8\": container with ID starting with c93bbc1ca0dc3d090bb51dc932be0176d62ec001738ed2e88b8fd6a5633177f8 not found: ID does not exist" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.183647 5030 scope.go:117] "RemoveContainer" containerID="85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a" Jan 20 23:25:38 crc kubenswrapper[5030]: E0120 23:25:38.185521 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a\": container with ID starting with 85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a not found: ID does not exist" containerID="85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.185547 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a"} err="failed to get container status \"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a\": rpc error: code = NotFound desc = could not find container \"85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a\": container with ID starting with 85abd1af505db3e1ef7ffda4c8e402a9aaa678d02683c109145cde03d314e60a not found: ID does not exist" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " 
pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grnkc\" (UniqueName: \"kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.267926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.268157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.268291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.333182 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.342101 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7587c489b5-vzmf2"] Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grnkc\" (UniqueName: \"kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.369945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.375856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.377031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.378055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.380976 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.383431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.384660 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.384986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.386849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grnkc\" (UniqueName: \"kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc\") pod \"keystone-75f85d6df9-6nmsl\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.476460 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:38 crc kubenswrapper[5030]: I0120 23:25:38.962110 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:25:38 crc kubenswrapper[5030]: W0120 23:25:38.980168 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b215df_05ff_45fe_ac23_b5605150449b.slice/crio-bdf0cc7dddd8060bd93d13cdd78e48bd513874d88254877d97143986099ca496 WatchSource:0}: Error finding container bdf0cc7dddd8060bd93d13cdd78e48bd513874d88254877d97143986099ca496: Status 404 returned error can't find the container with id bdf0cc7dddd8060bd93d13cdd78e48bd513874d88254877d97143986099ca496 Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.075007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.075291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.075302 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.075312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.075321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.076770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" event={"ID":"f9b215df-05ff-45fe-ac23-b5605150449b","Type":"ContainerStarted","Data":"bdf0cc7dddd8060bd93d13cdd78e48bd513874d88254877d97143986099ca496"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.078816 5030 generic.go:334] "Generic (PLEG): container finished" podID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerID="5bfcfd97530429318df3c1456d709733df8f61edbb7300154b1b5a2dc34d7062" exitCode=0 Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.078905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerDied","Data":"5bfcfd97530429318df3c1456d709733df8f61edbb7300154b1b5a2dc34d7062"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.083400 5030 generic.go:334] "Generic (PLEG): container finished" podID="8a280c54-785f-4a4f-895b-231aaed18db5" containerID="bbe8780ef4464c2c88e1e7f47c8a400ceb8bf0f5eb423d5f85696f78d14df1f9" exitCode=0 Jan 20 23:25:39 crc 
kubenswrapper[5030]: I0120 23:25:39.083457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerDied","Data":"bbe8780ef4464c2c88e1e7f47c8a400ceb8bf0f5eb423d5f85696f78d14df1f9"} Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.152017 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.215374 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.284612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.284716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.284784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.284807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.284824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28df\" (UniqueName: \"kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.285221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.285261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.285349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data\") pod \"198942d4-7d99-4cd0-8e85-b9a7497b4343\" (UID: \"198942d4-7d99-4cd0-8e85-b9a7497b4343\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.285926 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs" (OuterVolumeSpecName: "logs") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.285989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.286314 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.286331 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/198942d4-7d99-4cd0-8e85-b9a7497b4343-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.289752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts" (OuterVolumeSpecName: "scripts") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.290238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.290798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df" (OuterVolumeSpecName: "kube-api-access-l28df") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "kube-api-access-l28df". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.309973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.335068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.338416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data" (OuterVolumeSpecName: "config-data") pod "198942d4-7d99-4cd0-8e85-b9a7497b4343" (UID: "198942d4-7d99-4cd0-8e85-b9a7497b4343"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc88m\" (UniqueName: \"kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.387888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs\") pod \"8a280c54-785f-4a4f-895b-231aaed18db5\" (UID: \"8a280c54-785f-4a4f-895b-231aaed18db5\") " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388300 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 
23:25:39.388319 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388329 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388340 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28df\" (UniqueName: \"kubernetes.io/projected/198942d4-7d99-4cd0-8e85-b9a7497b4343-kube-api-access-l28df\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388349 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388357 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198942d4-7d99-4cd0-8e85-b9a7497b4343-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.388926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs" (OuterVolumeSpecName: "logs") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.389046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.392015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts" (OuterVolumeSpecName: "scripts") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.392556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m" (OuterVolumeSpecName: "kube-api-access-pc88m") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "kube-api-access-pc88m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.393943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.409841 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.425581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.441195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data" (OuterVolumeSpecName: "config-data") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.442187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a280c54-785f-4a4f-895b-231aaed18db5" (UID: "8a280c54-785f-4a4f-895b-231aaed18db5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490440 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490496 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490506 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490517 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490546 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc88m\" (UniqueName: \"kubernetes.io/projected/8a280c54-785f-4a4f-895b-231aaed18db5-kube-api-access-pc88m\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490556 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490565 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a280c54-785f-4a4f-895b-231aaed18db5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490573 5030 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a280c54-785f-4a4f-895b-231aaed18db5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.490596 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.512051 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.592034 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.972999 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fe9daf-c9de-4513-8f1d-4de9845d68d0" path="/var/lib/kubelet/pods/08fe9daf-c9de-4513-8f1d-4de9845d68d0/volumes" Jan 20 23:25:39 crc kubenswrapper[5030]: I0120 23:25:39.973800 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b43301d-1654-48e3-939b-48b8b7a01d08" path="/var/lib/kubelet/pods/8b43301d-1654-48e3-939b-48b8b7a01d08/volumes" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.122744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"198942d4-7d99-4cd0-8e85-b9a7497b4343","Type":"ContainerDied","Data":"602bd6ed5c21fdad5d228277398f647b30a1961fac5fab87b3f82c5830d574dd"} Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.122764 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.122799 5030 scope.go:117] "RemoveContainer" containerID="5bfcfd97530429318df3c1456d709733df8f61edbb7300154b1b5a2dc34d7062" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.125647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"8a280c54-785f-4a4f-895b-231aaed18db5","Type":"ContainerDied","Data":"12b36267d2b888f2e81d444f6a384cba4e2f8ef616c88070e3918f6855caf7c2"} Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.125681 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.133533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerStarted","Data":"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857"} Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.135671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" event={"ID":"f9b215df-05ff-45fe-ac23-b5605150449b","Type":"ContainerStarted","Data":"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe"} Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.135824 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.150355 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.153719 5030 scope.go:117] "RemoveContainer" containerID="af4b4b10add63da920b6c0c9dfac114e8ff71804b575c92731ad570bf240f7e1" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.160569 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.171922 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: E0120 23:25:40.172295 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: E0120 23:25:40.172326 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172333 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: E0120 23:25:40.172356 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172362 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: E0120 23:25:40.172370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172375 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172546 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172561 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" 
containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172581 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-log" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.172597 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" containerName="glance-httpd" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.174661 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.178400 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-b75fg" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.178440 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.178663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.178726 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.190649 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.194044 5030 scope.go:117] "RemoveContainer" containerID="bbe8780ef4464c2c88e1e7f47c8a400ceb8bf0f5eb423d5f85696f78d14df1f9" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.194026 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" podStartSLOduration=2.194014487 podStartE2EDuration="2.194014487s" podCreationTimestamp="2026-01-20 23:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:40.178231356 +0000 UTC m=+3012.498491644" watchObservedRunningTime="2026-01-20 23:25:40.194014487 +0000 UTC m=+3012.514274775" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.221342 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.233464 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.239255 5030 scope.go:117] "RemoveContainer" containerID="7d33a9dd6b2ff81227dfeb64ea475066ca5a849d6f4973770fd37aee303e0b58" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.262698 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.264329 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.266371 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.266453 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.274817 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.284358 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=5.284342105 podStartE2EDuration="5.284342105s" podCreationTimestamp="2026-01-20 23:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:40.24822053 +0000 UTC m=+3012.568480818" watchObservedRunningTime="2026-01-20 23:25:40.284342105 +0000 UTC m=+3012.604602393" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftgg\" (UniqueName: \"kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg\") 
pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.306389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.375650 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.377083 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.388522 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f4s2\" (UniqueName: \"kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.407989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.408022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftgg\" (UniqueName: 
\"kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.408071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.408102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.412544 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.413006 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.414103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.414546 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.415766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.425669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.429132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.430776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftgg\" (UniqueName: \"kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.438394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.501344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprsl\" (UniqueName: \"kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509636 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f4s2\" (UniqueName: \"kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.509813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.510396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.510709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.511454 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.514962 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.515620 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.516552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.523412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.528372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f4s2\" (UniqueName: \"kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.547039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.585876 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.611482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dprsl\" (UniqueName: \"kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.611742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.611771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.612259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.613698 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.616532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.617185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.632763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprsl\" (UniqueName: \"kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl\") pod \"dnsmasq-dnsmasq-6cc7959f89-zcnnh\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.690205 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.982637 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:25:40 crc kubenswrapper[5030]: I0120 23:25:40.999049 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:25:41 crc kubenswrapper[5030]: I0120 23:25:41.134617 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:25:41 crc kubenswrapper[5030]: W0120 23:25:41.141761 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd37b566e_a870_4b0d_af53_29da770c8a0a.slice/crio-e9709736009f50137989f22243681ea92a35dc763797d6191c771771f8fa156a WatchSource:0}: Error finding container e9709736009f50137989f22243681ea92a35dc763797d6191c771771f8fa156a: Status 404 returned error can't find the container with id e9709736009f50137989f22243681ea92a35dc763797d6191c771771f8fa156a Jan 20 23:25:41 crc kubenswrapper[5030]: I0120 23:25:41.147680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerStarted","Data":"1927c255e00e53c76852c6db4b5c544361a7692c44080d16e7b45d2dd23c4bc3"} Jan 20 23:25:41 crc kubenswrapper[5030]: W0120 23:25:41.246563 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398a1488_8639_4110_8d25_2f5ff7e73a30.slice/crio-ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41 WatchSource:0}: Error finding container ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41: Status 404 returned error can't find the container with id ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41 Jan 20 23:25:41 crc kubenswrapper[5030]: I0120 23:25:41.249650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:25:41 crc kubenswrapper[5030]: I0120 23:25:41.981308 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198942d4-7d99-4cd0-8e85-b9a7497b4343" path="/var/lib/kubelet/pods/198942d4-7d99-4cd0-8e85-b9a7497b4343/volumes" Jan 20 23:25:41 crc kubenswrapper[5030]: I0120 23:25:41.982356 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a280c54-785f-4a4f-895b-231aaed18db5" path="/var/lib/kubelet/pods/8a280c54-785f-4a4f-895b-231aaed18db5/volumes" Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.163269 5030 generic.go:334] "Generic (PLEG): container finished" podID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerID="bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46" exitCode=0 Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.163340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" event={"ID":"398a1488-8639-4110-8d25-2f5ff7e73a30","Type":"ContainerDied","Data":"bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.163369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" 
event={"ID":"398a1488-8639-4110-8d25-2f5ff7e73a30","Type":"ContainerStarted","Data":"ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.178478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerStarted","Data":"ee502ff2226c831285614d809d297680cd85514fa73adcd319bae1949aba7123"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.178519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerStarted","Data":"e9709736009f50137989f22243681ea92a35dc763797d6191c771771f8fa156a"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.188061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerStarted","Data":"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.188117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerStarted","Data":"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a"} Jan 20 23:25:42 crc kubenswrapper[5030]: I0120 23:25:42.222246 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.222226005 podStartE2EDuration="2.222226005s" podCreationTimestamp="2026-01-20 23:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:42.218420293 +0000 UTC m=+3014.538680591" watchObservedRunningTime="2026-01-20 23:25:42.222226005 +0000 UTC m=+3014.542486293" Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.202336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" event={"ID":"398a1488-8639-4110-8d25-2f5ff7e73a30","Type":"ContainerStarted","Data":"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2"} Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.202695 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.205096 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerID="914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e" exitCode=137 Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.205177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerDied","Data":"914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e"} Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.205202 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerStarted","Data":"ba0403331a7a709650df4336818fc505067385786c8b900b92e2db5d3c4245ae"} Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.205224 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="probe" containerID="cri-o://af8c2994eb7f6ea1db7330e62227663ddd51844eb30651d6420cdeedd6189e70" gracePeriod=30 Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.205243 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" containerID="cri-o://ba0403331a7a709650df4336818fc505067385786c8b900b92e2db5d3c4245ae" gracePeriod=30 Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.208701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerStarted","Data":"f9817a66ae09a052b512e6a6f843fe20c1e5f1eeabb2fd55c4ccbcf13f8f066c"} Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.227141 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" podStartSLOduration=3.22712274 podStartE2EDuration="3.22712274s" podCreationTimestamp="2026-01-20 23:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:43.220182493 +0000 UTC m=+3015.540442771" watchObservedRunningTime="2026-01-20 23:25:43.22712274 +0000 UTC m=+3015.547383028" Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.250733 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.250715752 podStartE2EDuration="3.250715752s" podCreationTimestamp="2026-01-20 23:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:43.247283469 +0000 UTC m=+3015.567543757" watchObservedRunningTime="2026-01-20 23:25:43.250715752 +0000 UTC m=+3015.570976040" Jan 20 23:25:43 crc kubenswrapper[5030]: I0120 23:25:43.286247 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:25:44 crc kubenswrapper[5030]: I0120 23:25:44.220064 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerID="af8c2994eb7f6ea1db7330e62227663ddd51844eb30651d6420cdeedd6189e70" exitCode=0 Jan 20 23:25:44 crc kubenswrapper[5030]: I0120 23:25:44.220156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerDied","Data":"af8c2994eb7f6ea1db7330e62227663ddd51844eb30651d6420cdeedd6189e70"} Jan 20 23:25:45 crc kubenswrapper[5030]: I0120 23:25:45.781758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:45 crc kubenswrapper[5030]: I0120 23:25:45.911412 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:25:45 crc kubenswrapper[5030]: I0120 23:25:45.986744 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:25:45 crc kubenswrapper[5030]: I0120 23:25:45.987465 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api-log" containerID="cri-o://1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1" gracePeriod=30 Jan 20 23:25:45 crc kubenswrapper[5030]: I0120 23:25:45.987585 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api" containerID="cri-o://7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89" gracePeriod=30 Jan 20 23:25:46 crc kubenswrapper[5030]: I0120 23:25:46.239300 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerID="1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1" exitCode=143 Jan 20 23:25:46 crc kubenswrapper[5030]: I0120 23:25:46.239409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerDied","Data":"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1"} Jan 20 23:25:47 crc kubenswrapper[5030]: I0120 23:25:47.974468 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:25:47 crc kubenswrapper[5030]: E0120 23:25:47.975230 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.329636 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.252:9311/healthcheck\": read tcp 10.217.0.2:55024->10.217.0.252:9311: read: connection reset by peer" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.329724 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.252:9311/healthcheck\": read tcp 10.217.0.2:55040->10.217.0.252:9311: read: connection reset by peer" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.708733 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.814953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdshl\" (UniqueName: \"kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl\") pod \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\" (UID: \"6dd505e0-dffa-4321-8f26-f8b6b70c98b4\") " Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.815877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs" (OuterVolumeSpecName: "logs") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.820920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl" (OuterVolumeSpecName: "kube-api-access-zdshl") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "kube-api-access-zdshl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.824691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.842963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.869677 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data" (OuterVolumeSpecName: "config-data") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.875237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.892847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6dd505e0-dffa-4321-8f26-f8b6b70c98b4" (UID: "6dd505e0-dffa-4321-8f26-f8b6b70c98b4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918849 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918897 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918910 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918921 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918940 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdshl\" (UniqueName: \"kubernetes.io/projected/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-kube-api-access-zdshl\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918955 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:49 crc kubenswrapper[5030]: I0120 23:25:49.918968 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd505e0-dffa-4321-8f26-f8b6b70c98b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.299470 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerID="7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89" exitCode=0 Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.299524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerDied","Data":"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89"} Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.299903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" event={"ID":"6dd505e0-dffa-4321-8f26-f8b6b70c98b4","Type":"ContainerDied","Data":"1c10f1e81f0dadeab5729eeb2e32949e39106d9a10fee0ca69f392fefd7dc33a"} Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.299929 5030 scope.go:117] "RemoveContainer" containerID="7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.299560 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6687fdd977-92b7s" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.332902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.350798 5030 scope.go:117] "RemoveContainer" containerID="1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.351295 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6687fdd977-92b7s"] Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.382865 5030 scope.go:117] "RemoveContainer" containerID="7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89" Jan 20 23:25:50 crc kubenswrapper[5030]: E0120 23:25:50.383774 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89\": container with ID starting with 7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89 not found: ID does not exist" containerID="7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.383833 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89"} err="failed to get container status \"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89\": rpc error: code = NotFound desc = could not find container \"7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89\": container with ID starting with 7e4d409ce4a99db2a49a40a4b97266a277d67eeee9215613620c200ffe681d89 not found: ID does not exist" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.383897 5030 scope.go:117] "RemoveContainer" containerID="1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1" Jan 20 23:25:50 crc kubenswrapper[5030]: E0120 23:25:50.384426 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1\": container with ID starting with 1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1 not found: ID does not exist" containerID="1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.384486 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1"} err="failed to get container status \"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1\": rpc error: code = NotFound desc = could not find container \"1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1\": container with ID starting with 1f5e01f7919378443faa99f13231598d5d66cb4e91ba1a20c77e2a68d9a547b1 not found: ID does not exist" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.502393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.502438 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.541011 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.566297 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.587113 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.587160 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.616331 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.651007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.692165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.762048 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:25:50 crc kubenswrapper[5030]: I0120 23:25:50.762279 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="dnsmasq-dns" containerID="cri-o://769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0" gracePeriod=10 Jan 20 23:25:50 crc kubenswrapper[5030]: E0120 23:25:50.985665 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd756a967_e1dd_4c74_8bff_cd1fab21b251.slice/crio-conmon-d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e6333d_c767_4935_92b0_31ae1c68e475.slice/crio-conmon-769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd756a967_e1dd_4c74_8bff_cd1fab21b251.slice/crio-d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.235366 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.246860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0\") pod \"15e6333d-c767-4935-92b0-31ae1c68e475\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.246992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc\") pod \"15e6333d-c767-4935-92b0-31ae1c68e475\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.247046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9shc\" (UniqueName: \"kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc\") pod \"15e6333d-c767-4935-92b0-31ae1c68e475\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.247107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config\") pod \"15e6333d-c767-4935-92b0-31ae1c68e475\" (UID: \"15e6333d-c767-4935-92b0-31ae1c68e475\") " Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.252154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc" (OuterVolumeSpecName: "kube-api-access-f9shc") pod "15e6333d-c767-4935-92b0-31ae1c68e475" (UID: "15e6333d-c767-4935-92b0-31ae1c68e475"). InnerVolumeSpecName "kube-api-access-f9shc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.317320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "15e6333d-c767-4935-92b0-31ae1c68e475" (UID: "15e6333d-c767-4935-92b0-31ae1c68e475"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.325284 5030 generic.go:334] "Generic (PLEG): container finished" podID="15e6333d-c767-4935-92b0-31ae1c68e475" containerID="769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0" exitCode=0 Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.325397 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.325425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" event={"ID":"15e6333d-c767-4935-92b0-31ae1c68e475","Type":"ContainerDied","Data":"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0"} Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.325505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h" event={"ID":"15e6333d-c767-4935-92b0-31ae1c68e475","Type":"ContainerDied","Data":"bedbc48311a7629c8862bb3584f123ef90f4e1cb9b7d723f4d6dc579051958a0"} Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.325529 5030 scope.go:117] "RemoveContainer" containerID="769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.326777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config" (OuterVolumeSpecName: "config") pod "15e6333d-c767-4935-92b0-31ae1c68e475" (UID: "15e6333d-c767-4935-92b0-31ae1c68e475"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.328862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "15e6333d-c767-4935-92b0-31ae1c68e475" (UID: "15e6333d-c767-4935-92b0-31ae1c68e475"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.329902 5030 generic.go:334] "Generic (PLEG): container finished" podID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerID="d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a" exitCode=0 Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.331603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerDied","Data":"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a"} Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.331721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.332169 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.332310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.332340 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.347695 5030 scope.go:117] "RemoveContainer" containerID="1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.350231 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dnsmasq-svc\") on node \"crc\" DevicePath 
\"\"" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.350257 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9shc\" (UniqueName: \"kubernetes.io/projected/15e6333d-c767-4935-92b0-31ae1c68e475-kube-api-access-f9shc\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.350269 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.350280 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15e6333d-c767-4935-92b0-31ae1c68e475-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.374931 5030 scope.go:117] "RemoveContainer" containerID="769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0" Jan 20 23:25:51 crc kubenswrapper[5030]: E0120 23:25:51.376400 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0\": container with ID starting with 769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0 not found: ID does not exist" containerID="769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.376437 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0"} err="failed to get container status \"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0\": rpc error: code = NotFound desc = could not find container \"769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0\": container with ID starting with 769c67bc7c6d1c6a9063e4f1a20c3b95fc566fdb302450c00d444864a08a3ed0 not found: ID does not exist" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.376458 5030 scope.go:117] "RemoveContainer" containerID="1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e" Jan 20 23:25:51 crc kubenswrapper[5030]: E0120 23:25:51.377047 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e\": container with ID starting with 1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e not found: ID does not exist" containerID="1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.377079 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e"} err="failed to get container status \"1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e\": rpc error: code = NotFound desc = could not find container \"1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e\": container with ID starting with 1db86ad8f305c4dbd40634fef04c4f9149939bdf384a3f0d03577207e902e64e not found: ID does not exist" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.658841 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.668838 5030 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58b7ffd5-9978h"] Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.974861 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" path="/var/lib/kubelet/pods/15e6333d-c767-4935-92b0-31ae1c68e475/volumes" Jan 20 23:25:51 crc kubenswrapper[5030]: I0120 23:25:51.976134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" path="/var/lib/kubelet/pods/6dd505e0-dffa-4321-8f26-f8b6b70c98b4/volumes" Jan 20 23:25:52 crc kubenswrapper[5030]: I0120 23:25:52.344732 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerStarted","Data":"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5"} Jan 20 23:25:52 crc kubenswrapper[5030]: I0120 23:25:52.345352 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:25:52 crc kubenswrapper[5030]: I0120 23:25:52.377583 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.377565358 podStartE2EDuration="36.377565358s" podCreationTimestamp="2026-01-20 23:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:52.372649938 +0000 UTC m=+3024.692910226" watchObservedRunningTime="2026-01-20 23:25:52.377565358 +0000 UTC m=+3024.697825646" Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.011657 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.046390 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.048473 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.198964 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.353453 5030 generic.go:334] "Generic (PLEG): container finished" podID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerID="9940bbb038d38a74af92d31f66d01f141a002adaecb98d66837c57666c0f664f" exitCode=0 Jan 20 23:25:53 crc kubenswrapper[5030]: I0120 23:25:53.353877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerDied","Data":"9940bbb038d38a74af92d31f66d01f141a002adaecb98d66837c57666c0f664f"} Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.365164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerStarted","Data":"38aa5aa26bdb0f07041118403707547bc1981c26790c4adec8154899b480eb0b"} Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.366270 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:25:54 crc 
kubenswrapper[5030]: I0120 23:25:54.408989 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.408961543 podStartE2EDuration="37.408961543s" podCreationTimestamp="2026-01-20 23:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:25:54.396850169 +0000 UTC m=+3026.717110487" watchObservedRunningTime="2026-01-20 23:25:54.408961543 +0000 UTC m=+3026.729221861" Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.763364 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.859434 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.940086 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.940594 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-api" containerID="cri-o://5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65" gracePeriod=30 Jan 20 23:25:54 crc kubenswrapper[5030]: I0120 23:25:54.941098 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-httpd" containerID="cri-o://0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7" gracePeriod=30 Jan 20 23:25:55 crc kubenswrapper[5030]: I0120 23:25:55.376040 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerID="0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7" exitCode=0 Jan 20 23:25:55 crc kubenswrapper[5030]: I0120 23:25:55.376125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerDied","Data":"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7"} Jan 20 23:25:55 crc kubenswrapper[5030]: I0120 23:25:55.633374 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.2:9696/\": dial tcp 10.217.1.2:9696: connect: connection refused" Jan 20 23:26:01 crc kubenswrapper[5030]: I0120 23:26:01.963320 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:26:01 crc kubenswrapper[5030]: E0120 23:26:01.964598 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.070069 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.094314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.175752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.176233 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-log" containerID="cri-o://b7155efd558f19bbb3a86a83a98f9fbd59c3dfaff24a0ec4ba84d50b5ccae084" gracePeriod=30 Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.176314 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-api" containerID="cri-o://7ce22c372aaae61f2dea8e280dd96ecd34e932c3f390c86324eb4da31bd300fa" gracePeriod=30 Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.437300 5030 generic.go:334] "Generic (PLEG): container finished" podID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerID="b7155efd558f19bbb3a86a83a98f9fbd59c3dfaff24a0ec4ba84d50b5ccae084" exitCode=143 Jan 20 23:26:02 crc kubenswrapper[5030]: I0120 23:26:02.437334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerDied","Data":"b7155efd558f19bbb3a86a83a98f9fbd59c3dfaff24a0ec4ba84d50b5ccae084"} Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.477308 5030 generic.go:334] "Generic (PLEG): container finished" podID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerID="7ce22c372aaae61f2dea8e280dd96ecd34e932c3f390c86324eb4da31bd300fa" exitCode=0 Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.477396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerDied","Data":"7ce22c372aaae61f2dea8e280dd96ecd34e932c3f390c86324eb4da31bd300fa"} Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.753782 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnnw\" (UniqueName: \"kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.844875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data\") pod \"86fc7598-ee04-466c-9814-31b5cb28d7f9\" (UID: \"86fc7598-ee04-466c-9814-31b5cb28d7f9\") " Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.845706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs" (OuterVolumeSpecName: "logs") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.853055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw" (OuterVolumeSpecName: "kube-api-access-zqnnw") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "kube-api-access-zqnnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.853056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts" (OuterVolumeSpecName: "scripts") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.903470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.916389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data" (OuterVolumeSpecName: "config-data") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.946788 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.946827 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqnnw\" (UniqueName: \"kubernetes.io/projected/86fc7598-ee04-466c-9814-31b5cb28d7f9-kube-api-access-zqnnw\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.946841 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.946855 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86fc7598-ee04-466c-9814-31b5cb28d7f9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.946869 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.965951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:05 crc kubenswrapper[5030]: I0120 23:26:05.991982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86fc7598-ee04-466c-9814-31b5cb28d7f9" (UID: "86fc7598-ee04-466c-9814-31b5cb28d7f9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.048814 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.048848 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86fc7598-ee04-466c-9814-31b5cb28d7f9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.494527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" event={"ID":"86fc7598-ee04-466c-9814-31b5cb28d7f9","Type":"ContainerDied","Data":"61ba3904a386f332e5734ea5145e129d883e0fe2480ec1a1ef7f47ca0c96a6de"} Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.494582 5030 scope.go:117] "RemoveContainer" containerID="7ce22c372aaae61f2dea8e280dd96ecd34e932c3f390c86324eb4da31bd300fa" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.494692 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7597899ccc-qc79q" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.524215 5030 scope.go:117] "RemoveContainer" containerID="b7155efd558f19bbb3a86a83a98f9fbd59c3dfaff24a0ec4ba84d50b5ccae084" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.564424 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.576047 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7597899ccc-qc79q"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.610435 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.849341 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.856768 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-rbr7v"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.875957 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.876329 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-api" containerID="cri-o://2e891cea972422856bd8ef4b90e6772ca327c12ee091c5085fee82919599acc9" gracePeriod=30 Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.876294 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-log" containerID="cri-o://845f94500fe5b956e4b26d202cbb8038a24b9128d9af1a24f8782e2d9fce48ad" gracePeriod=30 Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.893125 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.893391 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerName="nova-scheduler-scheduler" containerID="cri-o://38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" gracePeriod=30 Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.973799 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.974028 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" containerID="cri-o://479d023bb035a62ed100978c00310340f751bb7c864f927f888af7f5537f011e" gracePeriod=30 Jan 20 23:26:06 crc kubenswrapper[5030]: I0120 23:26:06.974139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" containerID="cri-o://bc6613c08618d52d6b28812d72705e495fee861c8d3eb3f7d3abf86b5852deef" gracePeriod=30 Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091323 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz"] Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="init" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091751 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="init" Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091778 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-api" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-api" Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091797 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-log" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-log" Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091825 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091830 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api" Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091844 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="dnsmasq-dns" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091850 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="dnsmasq-dns" Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.091862 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api-log" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.091868 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api-log" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092041 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e6333d-c767-4935-92b0-31ae1c68e475" containerName="dnsmasq-dns" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092052 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api-log" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092068 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-api" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092084 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd505e0-dffa-4321-8f26-f8b6b70c98b4" containerName="barbican-api" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092095 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" containerName="placement-log" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.092740 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.094747 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.095791 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.114747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz"] Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.171540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmfh\" (UniqueName: \"kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.171836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.171906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.171944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc 
kubenswrapper[5030]: I0120 23:26:07.274707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.275018 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.275051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.275192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmfh\" (UniqueName: \"kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.279765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.279814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.284409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.290833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmfh\" (UniqueName: \"kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh\") pod \"nova-cell0-cell-mapping-mtbkz\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.469183 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.507891 5030 generic.go:334] "Generic (PLEG): container finished" podID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerID="845f94500fe5b956e4b26d202cbb8038a24b9128d9af1a24f8782e2d9fce48ad" exitCode=143 Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.507969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerDied","Data":"845f94500fe5b956e4b26d202cbb8038a24b9128d9af1a24f8782e2d9fce48ad"} Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.510334 5030 generic.go:334] "Generic (PLEG): container finished" podID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerID="479d023bb035a62ed100978c00310340f751bb7c864f927f888af7f5537f011e" exitCode=143 Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.510356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerDied","Data":"479d023bb035a62ed100978c00310340f751bb7c864f927f888af7f5537f011e"} Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.630804 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.726218 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd"] Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.739149 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-rcbbd"] Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.829937 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4"] Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.831430 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.833466 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.834059 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.843971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4"] Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.850908 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.852237 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.856721 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:26:07 crc kubenswrapper[5030]: E0120 23:26:07.856766 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerName="nova-scheduler-scheduler" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.889536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.889652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.889676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjl5q\" (UniqueName: \"kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.889858 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.906958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz"] Jan 20 23:26:07 crc kubenswrapper[5030]: W0120 23:26:07.915839 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21862ab8_6630_45ef_9687_a7dad47777bc.slice/crio-9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff WatchSource:0}: Error finding container 9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff: Status 404 returned error can't find the container with id 9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.982214 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6689ec-afe0-4ab0-8744-8586baafe323" path="/var/lib/kubelet/pods/0f6689ec-afe0-4ab0-8744-8586baafe323/volumes" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.986915 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22cb8656-1df5-4761-92bd-b91c969df165" path="/var/lib/kubelet/pods/22cb8656-1df5-4761-92bd-b91c969df165/volumes" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.987773 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86fc7598-ee04-466c-9814-31b5cb28d7f9" path="/var/lib/kubelet/pods/86fc7598-ee04-466c-9814-31b5cb28d7f9/volumes" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.993261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.993314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjl5q\" (UniqueName: \"kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.993407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.994217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.997368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.997578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:07 crc kubenswrapper[5030]: I0120 23:26:07.999756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.015021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjl5q\" (UniqueName: \"kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q\") pod \"nova-cell1-cell-mapping-s9rd4\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.158219 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.520383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" event={"ID":"21862ab8-6630-45ef-9687-a7dad47777bc","Type":"ContainerStarted","Data":"db7c736000c2c144ffc7cd96fd2cc41eeb6b34212f1f572dd0f34ffe24b8f195"} Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.520424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" event={"ID":"21862ab8-6630-45ef-9687-a7dad47777bc","Type":"ContainerStarted","Data":"9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff"} Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.539687 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" podStartSLOduration=1.5396660070000001 podStartE2EDuration="1.539666007s" podCreationTimestamp="2026-01-20 23:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:08.537136906 +0000 UTC m=+3040.857397194" watchObservedRunningTime="2026-01-20 23:26:08.539666007 +0000 UTC m=+3040.859926285" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.600304 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4"] Jan 20 23:26:08 crc kubenswrapper[5030]: W0120 23:26:08.624422 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8dc0fd0_0262_4c69_a4ca_392220e30132.slice/crio-b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9 WatchSource:0}: Error finding container b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9: Status 404 returned error can't find the container with id 
b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9 Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.841154 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.905712 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.916956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ng7z\" (UniqueName: \"kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.917283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs\") pod \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\" (UID: \"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e\") " Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.924769 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.930048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z" (OuterVolumeSpecName: "kube-api-access-4ng7z") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "kube-api-access-4ng7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:08 crc kubenswrapper[5030]: I0120 23:26:08.994873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.000404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.009132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.009985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config" (OuterVolumeSpecName: "config") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle\") pod \"c9f188c9-7a3c-49eb-9f90-553a9742b023\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data\") pod \"c9f188c9-7a3c-49eb-9f90-553a9742b023\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q64rd\" (UniqueName: \"kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd\") pod \"c9f188c9-7a3c-49eb-9f90-553a9742b023\" (UID: \"c9f188c9-7a3c-49eb-9f90-553a9742b023\") " Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018900 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018917 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018929 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018939 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018948 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ng7z\" (UniqueName: \"kubernetes.io/projected/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-kube-api-access-4ng7z\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.018957 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.024359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd" (OuterVolumeSpecName: "kube-api-access-q64rd") pod "c9f188c9-7a3c-49eb-9f90-553a9742b023" (UID: "c9f188c9-7a3c-49eb-9f90-553a9742b023"). InnerVolumeSpecName "kube-api-access-q64rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.030699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" (UID: "1a8a7627-29f1-4e20-a1ea-5a2b45716e6e"). 
InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.048400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9f188c9-7a3c-49eb-9f90-553a9742b023" (UID: "c9f188c9-7a3c-49eb-9f90-553a9742b023"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.048831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data" (OuterVolumeSpecName: "config-data") pod "c9f188c9-7a3c-49eb-9f90-553a9742b023" (UID: "c9f188c9-7a3c-49eb-9f90-553a9742b023"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.120663 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.120719 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q64rd\" (UniqueName: \"kubernetes.io/projected/c9f188c9-7a3c-49eb-9f90-553a9742b023-kube-api-access-q64rd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.120734 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.120747 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f188c9-7a3c-49eb-9f90-553a9742b023-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.530271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" event={"ID":"a8dc0fd0-0262-4c69-a4ca-392220e30132","Type":"ContainerStarted","Data":"79d280e4be818945ffca4e232f42bd16e3dc992a372a897fa7e7491e33afe638"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.530750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" event={"ID":"a8dc0fd0-0262-4c69-a4ca-392220e30132","Type":"ContainerStarted","Data":"b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.532544 5030 generic.go:334] "Generic (PLEG): container finished" podID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" exitCode=0 Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.532607 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.532612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c9f188c9-7a3c-49eb-9f90-553a9742b023","Type":"ContainerDied","Data":"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.532739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c9f188c9-7a3c-49eb-9f90-553a9742b023","Type":"ContainerDied","Data":"b2682e9c9d3aab53d8b218ad2df9fd947935d8115849e7f408ec769603688867"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.532759 5030 scope.go:117] "RemoveContainer" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.542152 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerID="5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65" exitCode=0 Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.542863 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.542986 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerDied","Data":"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.543029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8" event={"ID":"1a8a7627-29f1-4e20-a1ea-5a2b45716e6e","Type":"ContainerDied","Data":"518d26b9cc16a171da5a9825e14f7170edc48d0e6e2c8ea887fa51a78620f321"} Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.571168 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" podStartSLOduration=2.571148777 podStartE2EDuration="2.571148777s" podCreationTimestamp="2026-01-20 23:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:09.554940324 +0000 UTC m=+3041.875200602" watchObservedRunningTime="2026-01-20 23:26:09.571148777 +0000 UTC m=+3041.891409065" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.612716 5030 scope.go:117] "RemoveContainer" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.613910 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff\": container with ID starting with 38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff not found: ID does not exist" containerID="38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.613947 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff"} err="failed to get container status \"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff\": rpc error: code = 
NotFound desc = could not find container \"38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff\": container with ID starting with 38f4d2a53cd0e2958524bafcd4474b72ca4fd49864b2ddb4f47e7d73f5496aff not found: ID does not exist" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.613977 5030 scope.go:117] "RemoveContainer" containerID="0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.662635 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.681066 5030 scope.go:117] "RemoveContainer" containerID="5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.686384 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.697905 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.704970 5030 scope.go:117] "RemoveContainer" containerID="0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.705089 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6f7b76b9f9-kb5m8"] Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.705571 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7\": container with ID starting with 0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7 not found: ID does not exist" containerID="0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.705606 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7"} err="failed to get container status \"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7\": rpc error: code = NotFound desc = could not find container \"0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7\": container with ID starting with 0411531d53355361b046e2e32bfef7f5b3573b390f4b771c528b43dc76aea6b7 not found: ID does not exist" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.705643 5030 scope.go:117] "RemoveContainer" containerID="5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65" Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.706046 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65\": container with ID starting with 5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65 not found: ID does not exist" containerID="5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.706121 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65"} err="failed to get container status \"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65\": rpc error: code = NotFound desc = could not find container 
\"5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65\": container with ID starting with 5cc1be7c0a82dbd5a841390d758773fe97e15ee408f76536d4e338afca6e0a65 not found: ID does not exist" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.711397 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.711961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-httpd" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712028 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-httpd" Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.712093 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-api" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-api" Jan 20 23:26:09 crc kubenswrapper[5030]: E0120 23:26:09.712276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerName="nova-scheduler-scheduler" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712393 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerName="nova-scheduler-scheduler" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712679 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" containerName="nova-scheduler-scheduler" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712882 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-httpd" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.712951 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" containerName="neutron-api" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.713815 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.717528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.718913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.837360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.837484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.837555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjh6\" (UniqueName: \"kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.939274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.939400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.939488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjh6\" (UniqueName: \"kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.951276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.964988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.977206 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjh6\" (UniqueName: \"kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6\") pod \"nova-scheduler-0\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.990649 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8a7627-29f1-4e20-a1ea-5a2b45716e6e" path="/var/lib/kubelet/pods/1a8a7627-29f1-4e20-a1ea-5a2b45716e6e/volumes" Jan 20 23:26:09 crc kubenswrapper[5030]: I0120 23:26:09.991227 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f188c9-7a3c-49eb-9f90-553a9742b023" path="/var/lib/kubelet/pods/c9f188c9-7a3c-49eb-9f90-553a9742b023/volumes" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.023036 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.034316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.141070 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.141273 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" podUID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" containerName="keystone-api" containerID="cri-o://0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303" gracePeriod=30 Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.248231 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.40:8775/\": read tcp 10.217.0.2:45800->10.217.1.40:8775: read: connection reset by peer" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.248568 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.40:8775/\": read tcp 10.217.0.2:45802->10.217.1.40:8775: read: connection reset by peer" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.556880 5030 generic.go:334] "Generic (PLEG): container finished" podID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerID="2e891cea972422856bd8ef4b90e6772ca327c12ee091c5085fee82919599acc9" exitCode=0 Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.556919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerDied","Data":"2e891cea972422856bd8ef4b90e6772ca327c12ee091c5085fee82919599acc9"} Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.565496 5030 generic.go:334] "Generic (PLEG): container finished" podID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerID="bc6613c08618d52d6b28812d72705e495fee861c8d3eb3f7d3abf86b5852deef" exitCode=0 Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.565728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerDied","Data":"bc6613c08618d52d6b28812d72705e495fee861c8d3eb3f7d3abf86b5852deef"} Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.665813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.834482 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.839369 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.859245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.859326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.859350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz8p9\" (UniqueName: \"kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data\") pod \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle\") pod \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\" (UID: \"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860594 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle\") pod \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " Jan 20 23:26:10 crc kubenswrapper[5030]: 
I0120 23:26:10.860631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs\") pod \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql4bz\" (UniqueName: \"kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz\") pod \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.860695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs\") pod \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\" (UID: \"ebeef67e-90a0-4fb5-8f63-1f5d46c353be\") " Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.861403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs" (OuterVolumeSpecName: "logs") pod "ebeef67e-90a0-4fb5-8f63-1f5d46c353be" (UID: "ebeef67e-90a0-4fb5-8f63-1f5d46c353be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.864740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs" (OuterVolumeSpecName: "logs") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.873177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz" (OuterVolumeSpecName: "kube-api-access-ql4bz") pod "ebeef67e-90a0-4fb5-8f63-1f5d46c353be" (UID: "ebeef67e-90a0-4fb5-8f63-1f5d46c353be"). InnerVolumeSpecName "kube-api-access-ql4bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.894536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9" (OuterVolumeSpecName: "kube-api-access-tz8p9") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "kube-api-access-tz8p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.941944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data" (OuterVolumeSpecName: "config-data") pod "ebeef67e-90a0-4fb5-8f63-1f5d46c353be" (UID: "ebeef67e-90a0-4fb5-8f63-1f5d46c353be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.963681 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql4bz\" (UniqueName: \"kubernetes.io/projected/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-kube-api-access-ql4bz\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.963710 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.963719 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.963729 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz8p9\" (UniqueName: \"kubernetes.io/projected/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-kube-api-access-tz8p9\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.963745 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.984357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebeef67e-90a0-4fb5-8f63-1f5d46c353be" (UID: "ebeef67e-90a0-4fb5-8f63-1f5d46c353be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:10 crc kubenswrapper[5030]: I0120 23:26:10.991702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data" (OuterVolumeSpecName: "config-data") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.004313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.007650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ebeef67e-90a0-4fb5-8f63-1f5d46c353be" (UID: "ebeef67e-90a0-4fb5-8f63-1f5d46c353be"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.017298 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.023008 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" (UID: "9ef2b963-9ddd-4c61-9e8f-59a28f39afc7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065538 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065575 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065585 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065596 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065607 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.065628 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebeef67e-90a0-4fb5-8f63-1f5d46c353be-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.579816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0814026f-5519-40a1-94e8-379b5abac55f","Type":"ContainerStarted","Data":"8d8857a54e3051d21b85864143f114d8c2faafa007277a85b349682cb0dd2946"} Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.579867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0814026f-5519-40a1-94e8-379b5abac55f","Type":"ContainerStarted","Data":"c4b7ffb8cb725c0415939e88cb48fa4d9ffd933d29c2375c642c3a6f7b5e1d1a"} Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.581827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ef2b963-9ddd-4c61-9e8f-59a28f39afc7","Type":"ContainerDied","Data":"eb6b92e8d09e0f8c49dd56746b572595ebb6115e1606402f3e37552a8a7d8eca"} Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.581897 5030 scope.go:117] "RemoveContainer" containerID="2e891cea972422856bd8ef4b90e6772ca327c12ee091c5085fee82919599acc9" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.582034 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.586249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ebeef67e-90a0-4fb5-8f63-1f5d46c353be","Type":"ContainerDied","Data":"161cf6150e89149dcb7115e8e89beaa1a7af91001e20221849fa748644aaa665"} Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.586565 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.605203 5030 scope.go:117] "RemoveContainer" containerID="845f94500fe5b956e4b26d202cbb8038a24b9128d9af1a24f8782e2d9fce48ad" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.606657 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.606640451 podStartE2EDuration="2.606640451s" podCreationTimestamp="2026-01-20 23:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:11.598990125 +0000 UTC m=+3043.919250413" watchObservedRunningTime="2026-01-20 23:26:11.606640451 +0000 UTC m=+3043.926900739" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.632574 5030 scope.go:117] "RemoveContainer" containerID="bc6613c08618d52d6b28812d72705e495fee861c8d3eb3f7d3abf86b5852deef" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.638462 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.663487 5030 scope.go:117] "RemoveContainer" containerID="479d023bb035a62ed100978c00310340f751bb7c864f927f888af7f5537f011e" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.698335 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.762293 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.771582 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780315 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: E0120 23:26:11.780673 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" Jan 20 23:26:11 crc kubenswrapper[5030]: E0120 23:26:11.780717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-api" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780723 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-api" Jan 20 23:26:11 crc kubenswrapper[5030]: E0120 23:26:11.780732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-log" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780737 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-log" Jan 20 23:26:11 crc kubenswrapper[5030]: E0120 23:26:11.780747 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780753 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780907 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-log" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780931 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-api" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780940 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" containerName="nova-api-log" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.780951 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" containerName="nova-metadata-metadata" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.781911 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.784283 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.785334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.786400 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.791396 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.803007 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.804703 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.809287 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.809504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.809716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.974009 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef2b963-9ddd-4c61-9e8f-59a28f39afc7" path="/var/lib/kubelet/pods/9ef2b963-9ddd-4c61-9e8f-59a28f39afc7/volumes" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.974643 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebeef67e-90a0-4fb5-8f63-1f5d46c353be" path="/var/lib/kubelet/pods/ebeef67e-90a0-4fb5-8f63-1f5d46c353be/volumes" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.982293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.982360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlf4t\" (UniqueName: \"kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.982431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.982800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.982951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983087 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:11 crc kubenswrapper[5030]: I0120 23:26:11.983339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-262j8\" (UniqueName: \"kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-262j8\" (UniqueName: 
\"kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlf4t\" (UniqueName: \"kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.085996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.086191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.087940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.091516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.091745 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.092744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.093397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.095005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.100379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.105226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.119423 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlf4t\" (UniqueName: \"kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t\") pod \"nova-api-0\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.120227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-262j8\" (UniqueName: \"kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8\") pod \"nova-metadata-0\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.150103 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.413255 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.620162 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:12 crc kubenswrapper[5030]: W0120 23:26:12.623783 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ce0823_f23e_4e5f_a8ee_aeecbdb7efcf.slice/crio-6c76f3c7c64099b7ee877fe97a9e61fbe3bb63c5803e888fb3eae340dcb08732 WatchSource:0}: Error finding container 6c76f3c7c64099b7ee877fe97a9e61fbe3bb63c5803e888fb3eae340dcb08732: Status 404 returned error can't find the container with id 6c76f3c7c64099b7ee877fe97a9e61fbe3bb63c5803e888fb3eae340dcb08732 Jan 20 23:26:12 crc kubenswrapper[5030]: I0120 23:26:12.841960 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:12 crc kubenswrapper[5030]: W0120 23:26:12.846039 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5276d7d_9906_480d_90c6_4f48c6f126e2.slice/crio-e5fc5981d432357c3c6b1bf46ab30c0c33deac8dd8384747436ef962d6aafb34 WatchSource:0}: Error finding container e5fc5981d432357c3c6b1bf46ab30c0c33deac8dd8384747436ef962d6aafb34: Status 404 returned error can't find the container with id e5fc5981d432357c3c6b1bf46ab30c0c33deac8dd8384747436ef962d6aafb34 Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.612040 5030 generic.go:334] "Generic (PLEG): container finished" podID="a8dc0fd0-0262-4c69-a4ca-392220e30132" containerID="79d280e4be818945ffca4e232f42bd16e3dc992a372a897fa7e7491e33afe638" exitCode=0 Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.612098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" event={"ID":"a8dc0fd0-0262-4c69-a4ca-392220e30132","Type":"ContainerDied","Data":"79d280e4be818945ffca4e232f42bd16e3dc992a372a897fa7e7491e33afe638"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.621695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerStarted","Data":"616e09206b760dcb070e95678184dd06e6e1027d0d51ca57dde5f3bc0847a277"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.621751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerStarted","Data":"ba64c3f6fd654c175099565552c0c0d6fd00f883cbac81faa3e5a76d3ce4b5e9"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.621765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerStarted","Data":"6c76f3c7c64099b7ee877fe97a9e61fbe3bb63c5803e888fb3eae340dcb08732"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.625324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerStarted","Data":"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.625364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerStarted","Data":"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.625377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerStarted","Data":"e5fc5981d432357c3c6b1bf46ab30c0c33deac8dd8384747436ef962d6aafb34"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.628475 5030 generic.go:334] "Generic (PLEG): container finished" podID="21862ab8-6630-45ef-9687-a7dad47777bc" containerID="db7c736000c2c144ffc7cd96fd2cc41eeb6b34212f1f572dd0f34ffe24b8f195" exitCode=0 Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.628532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" event={"ID":"21862ab8-6630-45ef-9687-a7dad47777bc","Type":"ContainerDied","Data":"db7c736000c2c144ffc7cd96fd2cc41eeb6b34212f1f572dd0f34ffe24b8f195"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.635308 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerID="ba0403331a7a709650df4336818fc505067385786c8b900b92e2db5d3c4245ae" exitCode=137 Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.635363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerDied","Data":"ba0403331a7a709650df4336818fc505067385786c8b900b92e2db5d3c4245ae"} Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.635402 5030 scope.go:117] "RemoveContainer" containerID="914f5f3e09e41c57acffc427aa985037c43c40970debda4d1453d04f95ceac1e" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.691170 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.691395 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" containerName="openstackclient" containerID="cri-o://f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53" gracePeriod=2 Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.707989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.715157 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.715139361 podStartE2EDuration="2.715139361s" podCreationTimestamp="2026-01-20 23:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:13.671509955 +0000 UTC m=+3045.991770253" watchObservedRunningTime="2026-01-20 23:26:13.715139361 +0000 UTC m=+3046.035399649" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.722727 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.722711475 podStartE2EDuration="2.722711475s" podCreationTimestamp="2026-01-20 23:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:13.695246229 +0000 UTC m=+3046.015506527" 
watchObservedRunningTime="2026-01-20 23:26:13.722711475 +0000 UTC m=+3046.042971763" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.750813 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: E0120 23:26:13.751269 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" containerName="openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.751286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" containerName="openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.751482 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" containerName="openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.752098 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.757551 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.760939 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.760994 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.769991 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: E0120 23:26:13.772022 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-cb4r6 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-cb4r6 openstack-config openstack-config-secret]: context canceled" pod="openstack-kuttl-tests/openstackclient" podUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.778577 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.788818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.792955 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.818003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.826658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.826717 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp7fs\" (UniqueName: \"kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.826744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.826784 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.929178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.929236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp7fs\" (UniqueName: \"kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.929258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.929296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.930716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.933992 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.935887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.947614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp7fs\" (UniqueName: \"kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs\") pod \"openstackclient\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.964062 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:26:13 crc kubenswrapper[5030]: E0120 23:26:13.964505 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:26:13 crc kubenswrapper[5030]: I0120 23:26:13.975359 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" path="/var/lib/kubelet/pods/6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f/volumes" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.063358 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.066088 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.110085 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv7j9\" (UniqueName: \"kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.133995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hzv\" (UniqueName: \"kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.134030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: 
\"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle\") pod \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\" (UID: \"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.135389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle\") pod \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\" (UID: \"3c882b06-04ce-4893-af53-f8dfb97dfcdc\") " Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.136956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.138351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9" (OuterVolumeSpecName: "kube-api-access-fv7j9") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "kube-api-access-fv7j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.138407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts" (OuterVolumeSpecName: "scripts") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.139480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts" (OuterVolumeSpecName: "scripts") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.141399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.143134 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv" (OuterVolumeSpecName: "kube-api-access-n8hzv") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "kube-api-access-n8hzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.143189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.144825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.166137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.182556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data" (OuterVolumeSpecName: "config-data") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.197717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.207207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.220603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" (UID: "2e6b8546-b932-4a4a-a2a0-d34aa5399cf6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237700 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3c882b06-04ce-4893-af53-f8dfb97dfcdc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237729 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237739 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237748 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237757 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.237766 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv7j9\" (UniqueName: \"kubernetes.io/projected/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-kube-api-access-fv7j9\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238215 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8hzv\" (UniqueName: \"kubernetes.io/projected/3c882b06-04ce-4893-af53-f8dfb97dfcdc-kube-api-access-n8hzv\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238235 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238243 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238251 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238261 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238269 5030 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.238277 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.256395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data" (OuterVolumeSpecName: "config-data") pod "3c882b06-04ce-4893-af53-f8dfb97dfcdc" (UID: "3c882b06-04ce-4893-af53-f8dfb97dfcdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.340375 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c882b06-04ce-4893-af53-f8dfb97dfcdc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:14 crc kubenswrapper[5030]: W0120 23:26:14.604222 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5995ef35_8891_4ffb_8e20_86add9e274bc.slice/crio-38add8ec68f77d45000d5875bb7b2f5b3e46a7fe1ee7029b288b7f66331fa1e1 WatchSource:0}: Error finding container 38add8ec68f77d45000d5875bb7b2f5b3e46a7fe1ee7029b288b7f66331fa1e1: Status 404 returned error can't find the container with id 38add8ec68f77d45000d5875bb7b2f5b3e46a7fe1ee7029b288b7f66331fa1e1 Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.608842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.653325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"5995ef35-8891-4ffb-8e20-86add9e274bc","Type":"ContainerStarted","Data":"38add8ec68f77d45000d5875bb7b2f5b3e46a7fe1ee7029b288b7f66331fa1e1"} Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.658957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"3c882b06-04ce-4893-af53-f8dfb97dfcdc","Type":"ContainerDied","Data":"7971bde154627c987da26516c0826cbdb1889c10b8234541523b58d8bfb5dc87"} Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.659026 5030 scope.go:117] "RemoveContainer" containerID="ba0403331a7a709650df4336818fc505067385786c8b900b92e2db5d3c4245ae" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.659493 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.662965 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" containerID="0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303" exitCode=0 Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.663021 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.663119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" event={"ID":"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6","Type":"ContainerDied","Data":"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303"} Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.663171 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.663192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c77db7594-ssxbs" event={"ID":"2e6b8546-b932-4a4a-a2a0-d34aa5399cf6","Type":"ContainerDied","Data":"df65c38c9b137bc52c4c0c7fdc204b407a3bc5d7df6e4766acbdc1e71d56baf1"} Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.673851 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.722658 5030 scope.go:117] "RemoveContainer" containerID="af8c2994eb7f6ea1db7330e62227663ddd51844eb30651d6420cdeedd6189e70" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.758514 5030 scope.go:117] "RemoveContainer" containerID="0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.813184 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.817042 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.852983 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.855031 5030 scope.go:117] "RemoveContainer" containerID="0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.864001 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5c77db7594-ssxbs"] Jan 20 23:26:14 crc kubenswrapper[5030]: E0120 23:26:14.867453 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303\": container with ID starting with 0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303 not found: ID does not exist" containerID="0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.867508 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303"} err="failed to get container status \"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303\": rpc error: code = NotFound desc = could not find container \"0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303\": container with ID starting with 
0387f5fa43ea27d1f6121946087bb3ad465a91835672600e0ae5cf2b14eb9303 not found: ID does not exist" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.898535 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.910820 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.919666 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:14 crc kubenswrapper[5030]: E0120 23:26:14.920075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920091 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: E0120 23:26:14.920113 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: E0120 23:26:14.920140 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="probe" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920147 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="probe" Jan 20 23:26:14 crc kubenswrapper[5030]: E0120 23:26:14.920159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" containerName="keystone-api" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" containerName="keystone-api" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="probe" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920404 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" containerName="keystone-api" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.920819 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" containerName="cinder-scheduler" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.921443 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.924235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.941763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.953994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr6c9\" (UniqueName: \"kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.954082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.954155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.954178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.954196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:14 crc kubenswrapper[5030]: I0120 23:26:14.954228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.015468 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.021913 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.034966 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle\") pod \"a8dc0fd0-0262-4c69-a4ca-392220e30132\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsmfh\" (UniqueName: \"kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh\") pod \"21862ab8-6630-45ef-9687-a7dad47777bc\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts\") pod \"a8dc0fd0-0262-4c69-a4ca-392220e30132\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts\") pod \"21862ab8-6630-45ef-9687-a7dad47777bc\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjl5q\" (UniqueName: \"kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q\") pod \"a8dc0fd0-0262-4c69-a4ca-392220e30132\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data\") pod \"a8dc0fd0-0262-4c69-a4ca-392220e30132\" (UID: \"a8dc0fd0-0262-4c69-a4ca-392220e30132\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data\") pod \"21862ab8-6630-45ef-9687-a7dad47777bc\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle\") pod \"21862ab8-6630-45ef-9687-a7dad47777bc\" (UID: \"21862ab8-6630-45ef-9687-a7dad47777bc\") " Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055775 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr6c9\" (UniqueName: \"kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.055953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.060145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.061500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts" (OuterVolumeSpecName: "scripts") pod "a8dc0fd0-0262-4c69-a4ca-392220e30132" (UID: "a8dc0fd0-0262-4c69-a4ca-392220e30132"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.062371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh" (OuterVolumeSpecName: "kube-api-access-nsmfh") pod "21862ab8-6630-45ef-9687-a7dad47777bc" (UID: "21862ab8-6630-45ef-9687-a7dad47777bc"). InnerVolumeSpecName "kube-api-access-nsmfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.062668 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.062700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.064775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts" (OuterVolumeSpecName: "scripts") pod "21862ab8-6630-45ef-9687-a7dad47777bc" (UID: "21862ab8-6630-45ef-9687-a7dad47777bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.068436 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q" (OuterVolumeSpecName: "kube-api-access-cjl5q") pod "a8dc0fd0-0262-4c69-a4ca-392220e30132" (UID: "a8dc0fd0-0262-4c69-a4ca-392220e30132"). InnerVolumeSpecName "kube-api-access-cjl5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.070383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.075073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.079038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr6c9\" (UniqueName: \"kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9\") pod \"cinder-scheduler-0\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.089602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data" (OuterVolumeSpecName: "config-data") pod "21862ab8-6630-45ef-9687-a7dad47777bc" (UID: "21862ab8-6630-45ef-9687-a7dad47777bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.093490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21862ab8-6630-45ef-9687-a7dad47777bc" (UID: "21862ab8-6630-45ef-9687-a7dad47777bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.102696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data" (OuterVolumeSpecName: "config-data") pod "a8dc0fd0-0262-4c69-a4ca-392220e30132" (UID: "a8dc0fd0-0262-4c69-a4ca-392220e30132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.103151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8dc0fd0-0262-4c69-a4ca-392220e30132" (UID: "a8dc0fd0-0262-4c69-a4ca-392220e30132"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.157790 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.157969 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158031 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158088 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158143 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsmfh\" (UniqueName: \"kubernetes.io/projected/21862ab8-6630-45ef-9687-a7dad47777bc-kube-api-access-nsmfh\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158204 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8dc0fd0-0262-4c69-a4ca-392220e30132-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158261 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21862ab8-6630-45ef-9687-a7dad47777bc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.158315 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjl5q\" (UniqueName: \"kubernetes.io/projected/a8dc0fd0-0262-4c69-a4ca-392220e30132-kube-api-access-cjl5q\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 
23:26:15.243249 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:15 crc kubenswrapper[5030]: W0120 23:26:15.586367 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cc0aba0_a0c7_4b7b_9b91_031765f25948.slice/crio-405be7b843edf098a15353de75dc7e5075b85eb3553f3e422355e25bd6872214 WatchSource:0}: Error finding container 405be7b843edf098a15353de75dc7e5075b85eb3553f3e422355e25bd6872214: Status 404 returned error can't find the container with id 405be7b843edf098a15353de75dc7e5075b85eb3553f3e422355e25bd6872214 Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.592178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.697393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.698119 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" Jan 20 23:26:15 crc kubenswrapper[5030]: E0120 23:26:15.698232 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21862ab8-6630-45ef-9687-a7dad47777bc" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.698249 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="21862ab8-6630-45ef-9687-a7dad47777bc" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: E0120 23:26:15.698278 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8dc0fd0-0262-4c69-a4ca-392220e30132" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.698286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8dc0fd0-0262-4c69-a4ca-392220e30132" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.698504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8dc0fd0-0262-4c69-a4ca-392220e30132" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.698524 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="21862ab8-6630-45ef-9687-a7dad47777bc" containerName="nova-manage" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.699437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4" event={"ID":"a8dc0fd0-0262-4c69-a4ca-392220e30132","Type":"ContainerDied","Data":"b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9"} Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.699466 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56d30fe55e428d67ad2d854096bf9936ceea894c022d7b3efccb2e85b709da9" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.699533 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.709064 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.712145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" event={"ID":"21862ab8-6630-45ef-9687-a7dad47777bc","Type":"ContainerDied","Data":"9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff"} Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.712179 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae6aa4c3d382ce0937af7995a5fe046799a4966695f33225269c236841a85ff" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.712249 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.729198 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.748383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerStarted","Data":"405be7b843edf098a15353de75dc7e5075b85eb3553f3e422355e25bd6872214"} Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.766053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqcnr\" (UniqueName: \"kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.767449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.766339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"5995ef35-8891-4ffb-8e20-86add9e274bc","Type":"ContainerStarted","Data":"8b8b11c6ac6050ac18676092493bc35bbf1121d7672ace5bea22fe163dd6aca3"} Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.784762 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.811495 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jhf96"] Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.817089 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.876550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqcnr\" (UniqueName: \"kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.894562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.895694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.905943 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="6b4c0a0d-6ad4-4721-8644-dbc6f77b4e1f" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" Jan 20 23:26:15 crc kubenswrapper[5030]: I0120 23:26:15.924450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqcnr\" (UniqueName: \"kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr\") pod \"root-account-create-update-q48f7\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.042506 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6b8546-b932-4a4a-a2a0-d34aa5399cf6" path="/var/lib/kubelet/pods/2e6b8546-b932-4a4a-a2a0-d34aa5399cf6/volumes" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.059982 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c882b06-04ce-4893-af53-f8dfb97dfcdc" path="/var/lib/kubelet/pods/3c882b06-04ce-4893-af53-f8dfb97dfcdc/volumes" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.066607 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-jhf96"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.066839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.099021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.177817 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.178985 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.190454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.204085 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.204395 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data podName:d756a967-e1dd-4c74-8bff-cd1fab21b251 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:16.704378658 +0000 UTC m=+3049.024638946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data") pod "rabbitmq-server-0" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251") : configmap "rabbitmq-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.236759 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.238326 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.256659 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.257839 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.272984 5030 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.9:36932->38.102.83.9:37955: write tcp 38.102.83.9:36932->38.102.83.9:37955: write: broken pipe Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.306711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4j4l\" (UniqueName: \"kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.306801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.308881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.325059 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.393781 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkqr\" (UniqueName: \"kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr\") pod \"glance-db-create-ndbv8\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4j4l\" (UniqueName: \"kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412206 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc 
kubenswrapper[5030]: I0120 23:26:16.412233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts\") pod \"glance-db-create-ndbv8\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.412352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.413201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.456353 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.457869 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.459179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4j4l\" (UniqueName: \"kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l\") pod \"glance-659a-account-create-update-nthp7\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.507812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkqr\" (UniqueName: \"kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr\") pod \"glance-db-create-ndbv8\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.514722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts\") pod \"glance-db-create-ndbv8\" (UID: 
\"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.514848 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.514919 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.014898695 +0000 UTC m=+3049.335158983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "combined-ca-bundle" not found Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.521165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts\") pod \"glance-db-create-ndbv8\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.521291 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.521337 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.021321501 +0000 UTC m=+3049.341581789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "barbican-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.525382 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pc858 for pod openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.525452 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858 podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.025435111 +0000 UTC m=+3049.345695399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pc858" (UniqueName: "kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.532003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.563166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.565310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.568217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkqr\" (UniqueName: \"kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr\") pod \"glance-db-create-ndbv8\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.570123 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=3.570104082 podStartE2EDuration="3.570104082s" podCreationTimestamp="2026-01-20 23:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:15.8941996 +0000 UTC m=+3048.214459888" watchObservedRunningTime="2026-01-20 23:26:16.570104082 +0000 UTC m=+3048.890364370" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.617816 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-mdz2z"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.618939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.618963 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.618981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.619011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.619118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.637369 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.675967 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-mdz2z"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.719056 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720348 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.720385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.720493 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.720532 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.220519113 +0000 UTC m=+3049.540779401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "combined-ca-bundle" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.720579 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.720658 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data podName:d756a967-e1dd-4c74-8bff-cd1fab21b251 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.720637716 +0000 UTC m=+3050.040898004 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data") pod "rabbitmq-server-0" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251") : configmap "rabbitmq-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.721114 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.721415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.731588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.741450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.764836 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.221142858 +0000 UTC m=+3049.541403146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "barbican-config-data" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.791924 5030 projected.go:194] Error preparing data for projected volume kube-api-access-g97xg for pod openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:16 crc kubenswrapper[5030]: E0120 23:26:16.791987 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.291971963 +0000 UTC m=+3049.612232251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g97xg" (UniqueName: "kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.807531 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.826728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.826835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp4qp\" (UniqueName: \"kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.844440 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.877414 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-chmn4"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.890197 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.898038 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-7xbt4"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.918246 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-chmn4"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.936059 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.937740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.937882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp4qp\" (UniqueName: \"kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.938754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.948764 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.957807 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.958378 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="openstack-network-exporter" containerID="cri-o://72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3" gracePeriod=300 Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.977827 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t7ft5"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.985064 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-t7ft5"] Jan 20 23:26:16 crc kubenswrapper[5030]: I0120 23:26:16.987571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp4qp\" (UniqueName: \"kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp\") pod \"cinder-aecb-account-create-update-lwrdm\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.005772 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 
23:26:17.006423 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="openstack-network-exporter" containerID="cri-o://7f07936da6b67937a65d8f198644e8a821f77c69a033105597f6a81f498d252f" gracePeriod=300 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.010648 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa5677f-cdb3-4391-add0-360b9b188cd3" containerID="f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53" exitCode=137 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.011078 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.011577 5030 scope.go:117] "RemoveContainer" containerID="f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.012138 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/openstackclient" secret="" err="secret \"openstackclient-openstackclient-dockercfg-b2h9n\" not found" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.046677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret\") pod \"1fa5677f-cdb3-4391-add0-360b9b188cd3\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.046713 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config\") pod \"1fa5677f-cdb3-4391-add0-360b9b188cd3\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.046750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9wrp\" (UniqueName: \"kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp\") pod \"1fa5677f-cdb3-4391-add0-360b9b188cd3\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.046778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle\") pod \"1fa5677f-cdb3-4391-add0-360b9b188cd3\" (UID: \"1fa5677f-cdb3-4391-add0-360b9b188cd3\") " Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.046984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.047011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.047042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.054749 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.054831 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.054813836 +0000 UTC m=+3050.375074124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.055411 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.055442 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.05543469 +0000 UTC m=+3050.375694978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "barbican-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.060815 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pc858 for pod openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.060878 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858 podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.060862432 +0000 UTC m=+3050.381122720 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pc858" (UniqueName: "kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.080237 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-prxd6"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.115213 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-prxd6"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.137461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp" (OuterVolumeSpecName: "kube-api-access-m9wrp") pod "1fa5677f-cdb3-4391-add0-360b9b188cd3" (UID: "1fa5677f-cdb3-4391-add0-360b9b188cd3"). InnerVolumeSpecName "kube-api-access-m9wrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.152036 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.152221 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.157536 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9wrp\" (UniqueName: \"kubernetes.io/projected/1fa5677f-cdb3-4391-add0-360b9b188cd3-kube-api-access-m9wrp\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.158026 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/nova-metadata-0" secret="" err="secret \"nova-nova-dockercfg-qp6jt\" not found" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.222701 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-xn8st"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.243581 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.271879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.272199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.273532 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.273578 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.273611 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.273591741 +0000 UTC m=+3050.593852029 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.303042 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:17.803001153 +0000 UTC m=+3050.123261441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.296688 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-xn8st"] Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.274027 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.303169 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.303152857 +0000 UTC m=+3050.623413145 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "barbican-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.325253 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="ovsdbserver-nb" containerID="cri-o://b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e" gracePeriod=300 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.390456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.417359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dtsgt"] Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.438720 5030 projected.go:194] Error preparing data for projected volume kube-api-access-g97xg for pod openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.438780 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.43876079 +0000 UTC m=+3050.759021078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97xg" (UniqueName: "kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.439053 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.439263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1fa5677f-cdb3-4391-add0-360b9b188cd3" (UID: "1fa5677f-cdb3-4391-add0-360b9b188cd3"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.449541 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" containerName="openstackclient" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.449578 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" containerName="openstackclient" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.449867 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" containerName="openstackclient" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.450924 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.453725 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.458692 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-dtsgt"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.488832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fa5677f-cdb3-4391-add0-360b9b188cd3" (UID: "1fa5677f-cdb3-4391-add0-360b9b188cd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.506223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.511910 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.511942 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.521226 5030 scope.go:117] "RemoveContainer" containerID="f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.522389 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:17 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:26:17 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:26:17 
crc kubenswrapper[5030]: else Jan 20 23:26:17 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:17 crc kubenswrapper[5030]: fi Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:17 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:17 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:17 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:17 crc kubenswrapper[5030]: # support updates Jan 20 23:26:17 crc kubenswrapper[5030]: Jan 20 23:26:17 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.523090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.523776 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53\": container with ID starting with f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53 not found: ID does not exist" containerID="f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.523828 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53"} err="failed to get container status \"f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53\": rpc error: code = NotFound desc = could not find container \"f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53\": container with ID starting with f09acdc60f1f81bba1a8d3368f953c5127816f9d9a3d73cbef68294606305e53 not found: ID does not exist" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.523895 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-q48f7" podUID="261b1067-9a6b-4835-8b8c-e0d2c68a4a99" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.539258 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.553117 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="ovsdbserver-sb" containerID="cri-o://fa2232813d49b6cabe6b5dcbf6c407fdaa8033be9b122e13bf90968f56727a69" gracePeriod=300 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.553241 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-lspsv"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.564864 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-lspsv"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.577940 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.590934 
5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-dqw62"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.601848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-cktxf"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.604348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1fa5677f-cdb3-4391-add0-360b9b188cd3" (UID: "1fa5677f-cdb3-4391-add0-360b9b188cd3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.613546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzbs7\" (UniqueName: \"kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.613591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.613653 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1fa5677f-cdb3-4391-add0-360b9b188cd3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.615197 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.615243 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data podName:fb76fff8-05c4-4ec5-b00b-17dfd94e5dea nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.115230572 +0000 UTC m=+3050.435490860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data") pod "rabbitmq-cell1-server-0" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.615852 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.625288 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.626305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.627374 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.631241 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.631401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.649501 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.659913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.683040 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-fs64c"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.693244 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-fs64c"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.701498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.701798 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" containerID="cri-o://352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" gracePeriod=30 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.706470 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="openstack-network-exporter" containerID="cri-o://5f43545399090228177f04523ac5a317a6277e22050988ab543b21064445e7de" gracePeriod=30 Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzbs7\" (UniqueName: \"kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9cg\" (UniqueName: \"kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715891 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.715993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ws78\" (UniqueName: \"kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.717046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.731319 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.748090 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-4bda-account-create-update-bqzv8"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.754676 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzbs7\" (UniqueName: \"kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7\") pod \"neutron-779e-account-create-update-5p9c4\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.759360 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-4tt5b"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.766948 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-4tt5b"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.777527 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.785683 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-5ca2-account-create-update-5xh6b"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.790573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.793247 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tgzmq"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.799226 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xl95q"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.826567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-tgzmq"] Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.833417 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.833501 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.833482335 +0000 UTC m=+3051.153742623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "combined-ca-bundle" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.833912 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-xl95q"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.834313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v9cg\" (UniqueName: \"kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.834434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.834533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.834599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ws78\" (UniqueName: \"kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.835373 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 
23:26:17 crc kubenswrapper[5030]: E0120 23:26:17.835408 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data podName:d756a967-e1dd-4c74-8bff-cd1fab21b251 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.835398361 +0000 UTC m=+3052.155658649 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data") pod "rabbitmq-server-0" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251") : configmap "rabbitmq-config-data" not found Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.835948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.836368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.869280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ws78\" (UniqueName: \"kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78\") pod \"barbican-64d6-account-create-update-xvvgg\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.880337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9cg\" (UniqueName: \"kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg\") pod \"nova-api-ebab-account-create-update-6gbzs\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.893605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-nchp6"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.937594 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-nchp6"] Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.957318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:17 crc kubenswrapper[5030]: I0120 23:26:17.987086 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.061543 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa5677f-cdb3-4391-add0-360b9b188cd3" path="/var/lib/kubelet/pods/1fa5677f-cdb3-4391-add0-360b9b188cd3/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.063359 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31020647-e3cc-44ab-a754-439b424e50cd" path="/var/lib/kubelet/pods/31020647-e3cc-44ab-a754-439b424e50cd/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.064554 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b4cf7c-7c3e-4537-91dc-aa8e422e749f" path="/var/lib/kubelet/pods/33b4cf7c-7c3e-4537-91dc-aa8e422e749f/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.065614 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d70339-6f3a-45dc-858d-2690011172c5" path="/var/lib/kubelet/pods/36d70339-6f3a-45dc-858d-2690011172c5/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.066198 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc26a88-4c14-43cd-8255-cf5bb3e15931" path="/var/lib/kubelet/pods/3bc26a88-4c14-43cd-8255-cf5bb3e15931/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.066839 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb2c9c5-64c5-4d21-b370-4fb3718220c5" path="/var/lib/kubelet/pods/3cb2c9c5-64c5-4d21-b370-4fb3718220c5/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.067493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fbd9c6-34c8-4fe9-9514-bc2932bbb632" path="/var/lib/kubelet/pods/43fbd9c6-34c8-4fe9-9514-bc2932bbb632/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.075248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.075313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.075364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.076913 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5a0e0f-eb0d-4757-b571-6571a43793c4" path="/var/lib/kubelet/pods/4c5a0e0f-eb0d-4757-b571-6571a43793c4/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.079560 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret 
"combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.082929 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821" path="/var/lib/kubelet/pods/5de8dc6a-16dd-4ba1-bb28-5d9a8df2f821/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.083697 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.079613983 +0000 UTC m=+3052.399874371 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.084379 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e882b9a-9de5-451f-8cb3-896629ba3c4b" path="/var/lib/kubelet/pods/5e882b9a-9de5-451f-8cb3-896629ba3c4b/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.087318 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063" path="/var/lib/kubelet/pods/6f8fea5f-9b78-4b22-b5b0-e2c8f38c7063/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.089190 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.089231 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.089221175 +0000 UTC m=+3052.409481463 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "barbican-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.091418 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1a598a-5986-49a8-ba34-f423759d1d62" path="/var/lib/kubelet/pods/7f1a598a-5986-49a8-ba34-f423759d1d62/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.092087 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83272a1e-0505-4f7e-8db0-93666671cdf2" path="/var/lib/kubelet/pods/83272a1e-0505-4f7e-8db0-93666671cdf2/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.092602 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86251e4a-e1bd-42f9-9632-3a087bc96e7d" path="/var/lib/kubelet/pods/86251e4a-e1bd-42f9-9632-3a087bc96e7d/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.095347 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pc858 for pod openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.095394 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858 podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.095380595 +0000 UTC m=+3052.415640883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pc858" (UniqueName: "kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.096646 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9325461a-2d5c-46e1-9041-3e60abe4feb9" path="/var/lib/kubelet/pods/9325461a-2d5c-46e1-9041-3e60abe4feb9/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.098553 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e6054b2-6da1-421a-8364-f6b57c51c01d" path="/var/lib/kubelet/pods/9e6054b2-6da1-421a-8364-f6b57c51c01d/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.101493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66b2ce3-1462-4db2-b1b4-6ca42402d7a5" path="/var/lib/kubelet/pods/b66b2ce3-1462-4db2-b1b4-6ca42402d7a5/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.102940 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65fc805-6e3b-4264-9974-e5da232e4f1e" path="/var/lib/kubelet/pods/d65fc805-6e3b-4264-9974-e5da232e4f1e/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.103432 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcba4b91-f476-4404-b317-6b0cf9e72659" path="/var/lib/kubelet/pods/dcba4b91-f476-4404-b317-6b0cf9e72659/volumes" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.105544 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.105582 5030 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-5d3e-account-create-update-8dft9"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.108753 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-6d9ds"] Jan 20 23:26:18 crc kubenswrapper[5030]: W0120 23:26:18.116678 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07cdd7a_1d57_4d2a_9910_54d0e6e186f5.slice/crio-2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78 WatchSource:0}: Error finding container 2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78: Status 404 returned error can't find the container with id 2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.118307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-q48f7" event={"ID":"261b1067-9a6b-4835-8b8c-e0d2c68a4a99","Type":"ContainerStarted","Data":"643f9c5fc5870c70f78b08664982f8d0a19e355c041c22eaf4cb9fc62357470b"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.126210 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-q48f7" secret="" err="secret \"galera-openstack-cell1-dockercfg-86kd6\" not found" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.135430 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:18 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:26:18 crc kubenswrapper[5030]: else Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:18 crc kubenswrapper[5030]: fi Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:18 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:18 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:18 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:18 crc kubenswrapper[5030]: # support updates Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.136115 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:18 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:26:18 crc kubenswrapper[5030]: else Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:18 crc kubenswrapper[5030]: fi Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:18 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:18 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:18 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:18 crc kubenswrapper[5030]: # support updates Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.137715 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-q48f7" podUID="261b1067-9a6b-4835-8b8c-e0d2c68a4a99" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.137767 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" podUID="c07cdd7a-1d57-4d2a-9910-54d0e6e186f5" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.150093 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_1130f389-005b-4aee-ad77-913385f27c6b/ovsdbserver-nb/0.log" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.150389 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.152723 5030 generic.go:334] "Generic (PLEG): container finished" podID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerID="5f43545399090228177f04523ac5a317a6277e22050988ab543b21064445e7de" exitCode=2 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.153756 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-6d9ds"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.153822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerDied","Data":"5f43545399090228177f04523ac5a317a6277e22050988ab543b21064445e7de"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183630 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g9nn\" (UniqueName: \"kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn\") pod \"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.183743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"1130f389-005b-4aee-ad77-913385f27c6b\" (UID: \"1130f389-005b-4aee-ad77-913385f27c6b\") " Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.184382 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.184436 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts podName:261b1067-9a6b-4835-8b8c-e0d2c68a4a99 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.68442165 +0000 UTC m=+3051.004681938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts") pod "root-account-create-update-q48f7" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99") : configmap "openstack-cell1-scripts" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.185558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.188043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts" (OuterVolumeSpecName: "scripts") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.194650 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a/ovsdbserver-sb/0.log" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.194694 5030 generic.go:334] "Generic (PLEG): container finished" podID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerID="7f07936da6b67937a65d8f198644e8a821f77c69a033105597f6a81f498d252f" exitCode=2 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.194712 5030 generic.go:334] "Generic (PLEG): container finished" podID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerID="fa2232813d49b6cabe6b5dcbf6c407fdaa8033be9b122e13bf90968f56727a69" exitCode=143 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.194775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerDied","Data":"7f07936da6b67937a65d8f198644e8a821f77c69a033105597f6a81f498d252f"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.194800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerDied","Data":"fa2232813d49b6cabe6b5dcbf6c407fdaa8033be9b122e13bf90968f56727a69"} Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.194887 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.194930 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data podName:fb76fff8-05c4-4ec5-b00b-17dfd94e5dea nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.194913214 +0000 UTC m=+3051.515173492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data") pod "rabbitmq-cell1-server-0" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.205903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config" (OuterVolumeSpecName: "config") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.211722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn" (OuterVolumeSpecName: "kube-api-access-5g9nn") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "kube-api-access-5g9nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253296 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_1130f389-005b-4aee-ad77-913385f27c6b/ovsdbserver-nb/0.log" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253342 5030 generic.go:334] "Generic (PLEG): container finished" podID="1130f389-005b-4aee-ad77-913385f27c6b" containerID="72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3" exitCode=2 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253376 5030 generic.go:334] "Generic (PLEG): container finished" podID="1130f389-005b-4aee-ad77-913385f27c6b" containerID="b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e" exitCode=143 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerDied","Data":"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerDied","Data":"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253483 5030 scope.go:117] "RemoveContainer" containerID="72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.253689 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.263776 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" containerName="openstackclient" containerID="cri-o://8b8b11c6ac6050ac18676092493bc35bbf1121d7672ace5bea22fe163dd6aca3" gracePeriod=2 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.263855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerStarted","Data":"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac"} Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.268061 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/nova-metadata-0" secret="" err="secret \"nova-nova-dockercfg-qp6jt\" not found" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.274806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.289995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.311877 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.311894 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1130f389-005b-4aee-ad77-913385f27c6b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.311904 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1130f389-005b-4aee-ad77-913385f27c6b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.311917 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g9nn\" (UniqueName: \"kubernetes.io/projected/1130f389-005b-4aee-ad77-913385f27c6b-kube-api-access-5g9nn\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.311943 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.303442 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-metadata-config-data: secret "nova-metadata-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.315780 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data 
podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:18.815753969 +0000 UTC m=+3051.136014257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "nova-metadata-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.304075 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.315978 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.315968414 +0000 UTC m=+3052.636228702 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.328085 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.328368 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-log" containerID="cri-o://ee502ff2226c831285614d809d297680cd85514fa73adcd319bae1949aba7123" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.328807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-httpd" containerID="cri-o://f9817a66ae09a052b512e6a6f843fe20c1e5f1eeabb2fd55c4ccbcf13f8f066c" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.389395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.414606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.414887 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.414963 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.415005 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.414991341 +0000 UTC m=+3052.735251629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "barbican-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.428058 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-t9frk"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.461700 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-t9frk"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.501587 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-lvkk8"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.513818 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.516564 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a/ovsdbserver-sb/0.log" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.516657 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.525006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.525355 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.531809 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-lvkk8"] Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.534642 5030 projected.go:194] Error preparing data for projected volume kube-api-access-g97xg for pod openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.534701 5030 scope.go:117] "RemoveContainer" containerID="b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.534731 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.53471005 +0000 UTC m=+3052.854970338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-g97xg" (UniqueName: "kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.592938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.598817 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4"] Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.624114 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.626393 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:18 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:26:18 crc kubenswrapper[5030]: else Jan 20 23:26:18 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:18 crc kubenswrapper[5030]: fi Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:18 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:18 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:18 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:18 crc kubenswrapper[5030]: # support updates Jan 20 23:26:18 crc kubenswrapper[5030]: Jan 20 23:26:18 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.627880 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" podUID="0e1cc829-8d44-48c6-ab24-a8ac60596d08" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631417 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwbbm\" (UniqueName: \"kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.631582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config\") pod \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\" (UID: \"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a\") " Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.632159 5030 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.632552 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.632992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config" (OuterVolumeSpecName: "config") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.633455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts" (OuterVolumeSpecName: "scripts") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.634349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.637836 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.637973 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.638599 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-s9rd4"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.654552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.654664 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "1130f389-005b-4aee-ad77-913385f27c6b" (UID: "1130f389-005b-4aee-ad77-913385f27c6b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.656045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm" (OuterVolumeSpecName: "kube-api-access-bwbbm") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "kube-api-access-bwbbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.692677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-btqm2"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.711314 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-btqm2"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740667 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwbbm\" (UniqueName: \"kubernetes.io/projected/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-kube-api-access-bwbbm\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740699 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1130f389-005b-4aee-ad77-913385f27c6b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740712 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740723 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740733 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.740761 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.741096 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.741233 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts podName:261b1067-9a6b-4835-8b8c-e0d2c68a4a99 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.741179818 +0000 UTC m=+3052.061440296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts") pod "root-account-create-update-q48f7" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99") : configmap "openstack-cell1-scripts" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.771075 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.799327 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.801611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.810346 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-mtbkz"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.828410 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.830937 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-server" containerID="cri-o://9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831131 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-updater" containerID="cri-o://603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831196 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-server" containerID="cri-o://be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831210 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-auditor" containerID="cri-o://e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831294 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-replicator" containerID="cri-o://a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831359 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-server" containerID="cri-o://3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831412 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-reaper" containerID="cri-o://e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831452 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-auditor" containerID="cri-o://7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831519 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-replicator" containerID="cri-o://2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831684 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-expirer" containerID="cri-o://3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831763 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="swift-recon-cron" containerID="cri-o://052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831889 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-auditor" containerID="cri-o://5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831928 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-replicator" containerID="cri-o://8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.831981 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="rsync" containerID="cri-o://df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.832052 5030 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-updater" containerID="cri-o://d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.838658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" (UID: "74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.844218 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.844263 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.844278 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.844415 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-metadata-config-data: secret "nova-metadata-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.844480 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.844458358 +0000 UTC m=+3052.164718646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "nova-metadata-config-data" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.844838 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.844876 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.844866797 +0000 UTC m=+3053.165127295 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "combined-ca-bundle" not found Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.870017 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.870283 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-log" containerID="cri-o://7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.870787 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-httpd" containerID="cri-o://71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.878511 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.886754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.891897 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-log" containerID="cri-o://39dedca045f311d6169ac05c06012585bdf93027adb63134aae42af056394de6" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.892656 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-api" containerID="cri-o://4141048d16b87ba820575f3c2ed1b798652fb89491b0f81f3244fea3383c5fb5" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.895530 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.926138 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.926502 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api-log" containerID="cri-o://d8ebde37b10c53ca483ade68635a4736300b159d4b22e45b3f8dec9db7455d00" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.926937 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api" containerID="cri-o://9b87dbe57785654f140e071cbe49ac4f1c7e04524db51c2dad9594ac25b3bdad" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.933987 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" 
podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.67:8776/healthcheck\": EOF" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.940094 5030 scope.go:117] "RemoveContainer" containerID="72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.946227 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3\": container with ID starting with 72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3 not found: ID does not exist" containerID="72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.946286 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3"} err="failed to get container status \"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3\": rpc error: code = NotFound desc = could not find container \"72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3\": container with ID starting with 72e0ced67c0808f179d97b548a3e94e7851dc22db2b730a9e88fc584185749d3 not found: ID does not exist" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.946321 5030 scope.go:117] "RemoveContainer" containerID="b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.948149 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e\": container with ID starting with b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e not found: ID does not exist" containerID="b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.948172 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e"} err="failed to get container status \"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e\": rpc error: code = NotFound desc = could not find container \"b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e\": container with ID starting with b70eca9805696a2d6eb3718f5825256ff477ee747e614ba956e5589f37b54e0e not found: ID does not exist" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.954844 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.971817 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.972237 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="ovsdbserver-sb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972250 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="ovsdbserver-sb" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.972264 5030 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972270 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.972307 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972314 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: E0120 23:26:18.972349 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="ovsdbserver-nb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="ovsdbserver-nb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972533 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="ovsdbserver-nb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972549 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1130f389-005b-4aee-ad77-913385f27c6b" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972560 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="ovsdbserver-sb" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.972574 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" containerName="openstack-network-exporter" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.974986 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.982549 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.982807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-api" containerID="cri-o://b5a3121b46f09ab54fc42a4e60030abec28c4ebcaabff01ecef271e79eda8bce" gracePeriod=30 Jan 20 23:26:18 crc kubenswrapper[5030]: I0120 23:26:18.982962 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-httpd" containerID="cri-o://73d55bbb51e734b228c4354b255f4da8a22be37140d690a89e16210c155c5407" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.014704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.039451 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.057531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tgr\" (UniqueName: \"kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.057597 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.057697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.081244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.097902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.104098 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.122312 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.122560 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="0814026f-5519-40a1-94e8-379b5abac55f" 
containerName="nova-scheduler-scheduler" containerID="cri-o://8d8857a54e3051d21b85864143f114d8c2faafa007277a85b349682cb0dd2946" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.134700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.135372 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-log" containerID="cri-o://bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.135898 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-api" containerID="cri-o://4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.142281 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="rabbitmq" containerID="cri-o://799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5" gracePeriod=604800 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.159006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tgr\" (UniqueName: \"kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.159064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.159119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.159974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.160832 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.156612 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:19 crc 
kubenswrapper[5030]: E0120 23:26:19.179381 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.179733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.181396 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" podUID="d6231555-bba1-41e1-8c8b-5ba3349cc6a4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.188354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tgr\" (UniqueName: \"kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr\") pod \"dnsmasq-dnsmasq-84b9f45d47-q6nw4\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.194754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.204449 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.204867 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc 
kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.206908 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" podUID="68e23a57-b3b1-4e99-8171-53da821ba0d8" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.230847 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.231205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api-log" containerID="cri-o://b8717a3cdbefcbd0c0ff07aabc3b13b4414654b84eeec748b31cb1d195d654c9" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.231353 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api" containerID="cri-o://83f9fdc7aa98bed3c6026ba325295aaed4096990435ab42d93cc5a1829c1757c" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.240293 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:19 crc kubenswrapper[5030]: W0120 23:26:19.249419 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c196d8_2e0d_4bbf_9f4c_c6754359f6f3.slice/crio-4014fb9782a25f4d714642ff7460c3cd24cfab8b9ae68ff0c2d7c55bed09821a WatchSource:0}: Error finding container 4014fb9782a25f4d714642ff7460c3cd24cfab8b9ae68ff0c2d7c55bed09821a: Status 404 returned error can't find the container with id 4014fb9782a25f4d714642ff7460c3cd24cfab8b9ae68ff0c2d7c55bed09821a Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.254187 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.255732 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-pc858], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" podUID="64de676f-6fb5-4e1f-ba48-9c6ba84b1e89" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.259098 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.259296 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.259518 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker-log" containerID="cri-o://6b496bef4ca4d0433913ed3347b495cb73325fdbef015e1d05cc61795bb60789" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.260049 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker" containerID="cri-o://75c9449a8a21d460c37176286c18c54720a652f60cdf7d26b5616d4f02353a1a" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.260493 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" podUID="29c196d8-2e0d-4bbf-9f4c-c6754359f6f3" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.265745 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.265854 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data podName:fb76fff8-05c4-4ec5-b00b-17dfd94e5dea nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.265815658 +0000 UTC m=+3053.586075936 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data") pod "rabbitmq-cell1-server-0" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.270810 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.272372 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-g97xg], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" podUID="c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.278282 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.278594 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener-log" containerID="cri-o://f3143e4041bdf06c6947b874cc740775ab9c9ec6001b798510175cc97e744a7e" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.279362 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener" containerID="cri-o://d30b1e443b75aaab1d0e162daa649e2952963350a5285a88628501469adfbea4" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.299714 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.300111 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="7467fbb5-44e9-42cf-874d-16fd1332885e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://493a4b5159ec26b105e62e8cb2c2af22e9255bbb11250b3a8cbcc06e1bbd150a" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.310479 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.314139 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerID="01b284d1f2373249d8d8c7a1c2c3ee891c1872f2d5ad6c4ea03f3ffcbbd09c35" exitCode=1 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.314647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-ndbv8" event={"ID":"3f82fbd9-cd1e-4d71-8d61-02126748c546","Type":"ContainerDied","Data":"01b284d1f2373249d8d8c7a1c2c3ee891c1872f2d5ad6c4ea03f3ffcbbd09c35"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.314737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-ndbv8" event={"ID":"3f82fbd9-cd1e-4d71-8d61-02126748c546","Type":"ContainerStarted","Data":"585ace1beb3985a3db038b8d501092d73f30b14956d24ba1eca80ca872ff3dfa"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.315602 5030 kubelet_pods.go:1007] 
"Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/glance-db-create-ndbv8" secret="" err="secret \"galera-openstack-dockercfg-qc7gv\" not found" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.315707 5030 scope.go:117] "RemoveContainer" containerID="01b284d1f2373249d8d8c7a1c2c3ee891c1872f2d5ad6c4ea03f3ffcbbd09c35" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.317514 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.321346 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" event={"ID":"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3","Type":"ContainerStarted","Data":"4014fb9782a25f4d714642ff7460c3cd24cfab8b9ae68ff0c2d7c55bed09821a"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.329853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" event={"ID":"68e23a57-b3b1-4e99-8171-53da821ba0d8","Type":"ContainerStarted","Data":"733f59c56639f824d80f2a04d03b12eb127d7fa3ff3693d43ce54ffdf366a49e"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.333658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" event={"ID":"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5","Type":"ContainerStarted","Data":"2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.334349 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" secret="" err="secret \"galera-openstack-dockercfg-qc7gv\" not found" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.348643 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.348758 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.350458 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" podUID="c07cdd7a-1d57-4d2a-9910-54d0e6e186f5" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.353161 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-676fq"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.373880 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.374093 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.377781 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.377839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.377921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.377885 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.377994 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378010 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378017 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378024 5030 
generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378055 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378062 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378070 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378076 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.378243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.394104 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.397965 5030 generic.go:334] "Generic (PLEG): container finished" podID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerID="bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9" exitCode=143 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.398107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerDied","Data":"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.404404 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-m5mr4"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.404921 5030 generic.go:334] "Generic (PLEG): container finished" podID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerID="ee502ff2226c831285614d809d297680cd85514fa73adcd319bae1949aba7123" exitCode=143 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.405054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerDied","Data":"ee502ff2226c831285614d809d297680cd85514fa73adcd319bae1949aba7123"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.413863 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.424211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.430147 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a/ovsdbserver-sb/0.log" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.430257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a","Type":"ContainerDied","Data":"17a21a1997b5aa160923c8f2a8f47c74ef600585b2bfc9e4e742fece46b9bb39"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.430303 5030 scope.go:117] "RemoveContainer" containerID="7f07936da6b67937a65d8f198644e8a821f77c69a033105597f6a81f498d252f" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.430442 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.434109 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.434356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.444494 5030 generic.go:334] "Generic (PLEG): container finished" podID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerID="39dedca045f311d6169ac05c06012585bdf93027adb63134aae42af056394de6" exitCode=143 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.444658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerDied","Data":"39dedca045f311d6169ac05c06012585bdf93027adb63134aae42af056394de6"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.448148 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.456238 5030 generic.go:334] "Generic (PLEG): container finished" podID="c991f145-efc7-4968-b73a-d4cf085acee0" containerID="73d55bbb51e734b228c4354b255f4da8a22be37140d690a89e16210c155c5407" exitCode=0 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.456314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerDied","Data":"73d55bbb51e734b228c4354b255f4da8a22be37140d690a89e16210c155c5407"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.470290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerStarted","Data":"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a"} Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.471413 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.471472 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts podName:c07cdd7a-1d57-4d2a-9910-54d0e6e186f5 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.971453646 +0000 UTC m=+3052.291713934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts") pod "glance-659a-account-create-update-nthp7" (UID: "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.472542 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.472581 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts podName:3f82fbd9-cd1e-4d71-8d61-02126748c546 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:19.972572113 +0000 UTC m=+3052.292832401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts") pod "glance-db-create-ndbv8" (UID: "3f82fbd9-cd1e-4d71-8d61-02126748c546") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.477177 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="cinder-scheduler" containerID="cri-o://dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.478298 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="probe" containerID="cri-o://68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.504156 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerName="galera" containerID="cri-o://5e702a9647c237b7ea176fced7e9b9708645df2df43294f18dad118f506f7fee" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.507311 5030 scope.go:117] "RemoveContainer" containerID="fa2232813d49b6cabe6b5dcbf6c407fdaa8033be9b122e13bf90968f56727a69" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.527855 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.528295 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-central-agent" containerID="cri-o://3fa1f8af90d752f9f412358e748ffa6b26b530af7abe3410789eb3720481f96a" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.529946 5030 generic.go:334] "Generic (PLEG): container finished" podID="c862e826-c66e-4188-9d2a-096e6039a1af" containerID="7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a" exitCode=143 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.530030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerDied","Data":"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a"} Jan 20 23:26:19 crc kubenswrapper[5030]: 
I0120 23:26:19.530073 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="proxy-httpd" containerID="cri-o://79b477de7360a1b389230a49bf03141e246b436af45aecf9fc9c9289c4283a3d" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.530134 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-notification-agent" containerID="cri-o://bfd7a2900f29501d274a01d822ba9fcab833dd0ab9306f38cbde8afbe0832f0e" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.530189 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="sg-core" containerID="cri-o://115ce5a1a1b811f563c605b90097f6b0f7096471f6c5040cf6697c80d762190f" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.552904 5030 generic.go:334] "Generic (PLEG): container finished" podID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerID="d8ebde37b10c53ca483ade68635a4736300b159d4b22e45b3f8dec9db7455d00" exitCode=143 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.552978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerDied","Data":"d8ebde37b10c53ca483ade68635a4736300b159d4b22e45b3f8dec9db7455d00"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.553210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.553460 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" containerName="kube-state-metrics" containerID="cri-o://1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.573405 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="rabbitmq" containerID="cri-o://38aa5aa26bdb0f07041118403707547bc1981c26790c4adec8154899b480eb0b" gracePeriod=604800 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.573653 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"1130f389-005b-4aee-ad77-913385f27c6b","Type":"ContainerDied","Data":"4c63500705386fcb4fea39b9804da485e5f9b1b682d80c064214b0d7849248ab"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.574418 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.574740 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-httpd" containerID="cri-o://2e3c443886e08309e23e74c3eb26e87aeed4b515043573a5b997072fad41ac81" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.574917 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-server" containerID="cri-o://cbbd55ec11f922f70da8495be9037539e1d761154559e6b2075472c7deb21023" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.587655 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-77jzh"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.589144 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.591432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" event={"ID":"d6231555-bba1-41e1-8c8b-5ba3349cc6a4","Type":"ContainerStarted","Data":"f31e8dca3a24798d247e32d5d63e5006a018384b067281e48c8b0ff9ab3cbfd8"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.592027 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" secret="" err="secret \"galera-openstack-dockercfg-qc7gv\" not found" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.592317 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.595877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-77jzh"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.602871 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.603296 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.603766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.603802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" event={"ID":"0e1cc829-8d44-48c6-ab24-a8ac60596d08","Type":"ContainerStarted","Data":"9e3ebb3ca2a5c90cc6024193a172242acd2f2c35f608cc83deaf99b69d9da458"} Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.604142 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-log" containerID="cri-o://ba64c3f6fd654c175099565552c0c0d6fd00f883cbac81faa3e5a76d3ce4b5e9" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.604459 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" secret="" err="secret \"galera-openstack-dockercfg-qc7gv\" not found" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.604547 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-metadata" containerID="cri-o://616e09206b760dcb070e95678184dd06e6e1027d0d51ca57dde5f3bc0847a277" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.605572 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" podUID="d6231555-bba1-41e1-8c8b-5ba3349cc6a4" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.608162 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-q48f7" secret="" err="secret \"galera-openstack-cell1-dockercfg-86kd6\" not found" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.609911 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-k5hcd"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.610436 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.611318 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.611711 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:26:19 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:26:19 crc kubenswrapper[5030]: else Jan 20 23:26:19 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:26:19 crc kubenswrapper[5030]: fi Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:26:19 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:26:19 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:26:19 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:26:19 crc kubenswrapper[5030]: # support updates Jan 20 23:26:19 crc kubenswrapper[5030]: Jan 20 23:26:19 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.614949 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-q48f7" podUID="261b1067-9a6b-4835-8b8c-e0d2c68a4a99" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.615005 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" podUID="0e1cc829-8d44-48c6-ab24-a8ac60596d08" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.618378 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-k5hcd"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.627118 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-d854n"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.640064 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-d854n"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.659818 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-2tf9j"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.679050 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-2tf9j"] Jan 20 23:26:19 crc 
kubenswrapper[5030]: I0120 23:26:19.683335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8nd2\" (UniqueName: \"kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.683596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.683750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbmd\" (UniqueName: \"kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.683892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.685119 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.685221 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts podName:0e1cc829-8d44-48c6-ab24-a8ac60596d08 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.18520709 +0000 UTC m=+3052.505467378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts") pod "cinder-aecb-account-create-update-lwrdm" (UID: "0e1cc829-8d44-48c6-ab24-a8ac60596d08") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.685328 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.686130 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts podName:d6231555-bba1-41e1-8c8b-5ba3349cc6a4 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.185421966 +0000 UTC m=+3052.505682254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts") pod "barbican-64d6-account-create-update-xvvgg" (UID: "d6231555-bba1-41e1-8c8b-5ba3349cc6a4") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.694106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.695321 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-x8hpn"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.695565 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="3a13994b-a11d-4798-9aae-b91a3d73d440" containerName="memcached" containerID="cri-o://847a1cdf293cd94ff36a98ec31b6492efbc1f13b9fd3fea3d0454c780f44cc6f" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.701261 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-x8hpn"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.708681 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2nmjb"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.711863 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-2nmjb"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.751894 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.752356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" podUID="f9b215df-05ff-45fe-ac23-b5605150449b" containerName="keystone-api" containerID="cri-o://c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe" gracePeriod=30 Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.785463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbmd\" (UniqueName: \"kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.785586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.785591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.785741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nd2\" (UniqueName: \"kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.785820 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.785916 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.785966 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.285951158 +0000 UTC m=+3052.606211446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.786260 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.786291 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts podName:261b1067-9a6b-4835-8b8c-e0d2c68a4a99 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.786282457 +0000 UTC m=+3054.106542745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts") pod "root-account-create-update-q48f7" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99") : configmap "openstack-cell1-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.786324 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.786343 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.286335988 +0000 UTC m=+3052.606596276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.791127 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hdbmd for pod openstack-kuttl-tests/keystone-b695-account-create-update-77jzh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.791186 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:26:20.291168645 +0000 UTC m=+3052.611428933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hdbmd" (UniqueName: "kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.792427 5030 projected.go:194] Error preparing data for projected volume kube-api-access-r8nd2 for pod openstack-kuttl-tests/keystone-db-create-k5hcd: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.792460 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2 podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.292451546 +0000 UTC m=+3052.612711834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r8nd2" (UniqueName: "kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.792734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-77jzh"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.801532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-k5hcd"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.821234 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.828181 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.832798 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.841137 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.846669 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.887444 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.887553 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data podName:d756a967-e1dd-4c74-8bff-cd1fab21b251 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.887515487 +0000 UTC m=+3056.207775775 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data") pod "rabbitmq-server-0" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251") : configmap "rabbitmq-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.888025 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/nova-metadata-config-data: secret "nova-metadata-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.888084 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.88804976 +0000 UTC m=+3054.208310048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "nova-metadata-config-data" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.899552 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.909928 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.913023 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=5.913007934 podStartE2EDuration="5.913007934s" podCreationTimestamp="2026-01-20 23:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:19.508509772 +0000 UTC m=+3051.828770080" watchObservedRunningTime="2026-01-20 23:26:19.913007934 +0000 UTC m=+3052.233268222" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.931219 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-hdbmd operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" podUID="8eedd3c6-0b24-49e4-81eb-0b21419f8739" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.937003 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.988138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom\") pod \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.989069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs\") pod \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.989811 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.989859 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts podName:c07cdd7a-1d57-4d2a-9910-54d0e6e186f5 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.989842344 +0000 UTC m=+3053.310102632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts") pod "glance-659a-account-create-update-nthp7" (UID: "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5") : configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: I0120 23:26:19.990125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs" (OuterVolumeSpecName: "logs") pod "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.990201 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:19 crc kubenswrapper[5030]: E0120 23:26:19.990225 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts podName:3f82fbd9-cd1e-4d71-8d61-02126748c546 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:20.990218403 +0000 UTC m=+3053.310478691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts") pod "glance-db-create-ndbv8" (UID: "3f82fbd9-cd1e-4d71-8d61-02126748c546") : configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.012778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.068802 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce7d325-d563-4683-80fc-8626524f2631" path="/var/lib/kubelet/pods/0ce7d325-d563-4683-80fc-8626524f2631/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.071872 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1130f389-005b-4aee-ad77-913385f27c6b" path="/var/lib/kubelet/pods/1130f389-005b-4aee-ad77-913385f27c6b/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.073207 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c5d200-30f8-4559-ba0d-f72cc9180ea2" path="/var/lib/kubelet/pods/11c5d200-30f8-4559-ba0d-f72cc9180ea2/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.082330 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21862ab8-6630-45ef-9687-a7dad47777bc" path="/var/lib/kubelet/pods/21862ab8-6630-45ef-9687-a7dad47777bc/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.084228 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27233b7c-c856-4a41-a674-7399905591de" path="/var/lib/kubelet/pods/27233b7c-c856-4a41-a674-7399905591de/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.086902 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289609c1-c2d1-448d-b4ff-ed7127995341" path="/var/lib/kubelet/pods/289609c1-c2d1-448d-b4ff-ed7127995341/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.088544 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c184fea-5939-43b6-80db-23079b0f6d88" path="/var/lib/kubelet/pods/2c184fea-5939-43b6-80db-23079b0f6d88/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.091285 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3358fbb1-638c-4f20-804e-b672441288e4" path="/var/lib/kubelet/pods/3358fbb1-638c-4f20-804e-b672441288e4/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.092037 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43181c01-0cec-4ad9-b037-880357999794" path="/var/lib/kubelet/pods/43181c01-0cec-4ad9-b037-880357999794/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.093683 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72433a7b-fb29-44be-90d0-485aadbb3d7e" path="/var/lib/kubelet/pods/72433a7b-fb29-44be-90d0-485aadbb3d7e/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.096272 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a" path="/var/lib/kubelet/pods/74c8ae4e-73fe-49aa-ab87-cbfe7bd29e7a/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.098137 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8dc0fd0-0262-4c69-a4ca-392220e30132" path="/var/lib/kubelet/pods/a8dc0fd0-0262-4c69-a4ca-392220e30132/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.098798 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac03b977-2ac1-4210-9b75-f3e6d87ef13e" path="/var/lib/kubelet/pods/ac03b977-2ac1-4210-9b75-f3e6d87ef13e/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.099427 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7afc48e-6aa6-4499-ae75-fa49a210f4c7" path="/var/lib/kubelet/pods/d7afc48e-6aa6-4499-ae75-fa49a210f4c7/volumes" 
Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.100049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.100107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.100186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") pod \"barbican-keystone-listener-5547ddfddd-hx7pb\" (UID: \"64de676f-6fb5-4e1f-ba48-9c6ba84b1e89\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.100783 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.100830 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9865414-bdc0-4dd1-baee-d04985de4dd0" path="/var/lib/kubelet/pods/d9865414-bdc0-4dd1-baee-d04985de4dd0/volumes" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.101370 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.101400 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.101458 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.101434546 +0000 UTC m=+3056.421694834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "combined-ca-bundle" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.104767 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pc858 for pod openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.104848 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858 podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:26:24.104823808 +0000 UTC m=+3056.425084096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pc858" (UniqueName: "kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.101041 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.107101 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data podName:64de676f-6fb5-4e1f-ba48-9c6ba84b1e89 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.106871038 +0000 UTC m=+3056.427131326 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data") pod "barbican-keystone-listener-5547ddfddd-hx7pb" (UID: "64de676f-6fb5-4e1f-ba48-9c6ba84b1e89") : secret "barbican-config-data" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.172487 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="galera" containerID="cri-o://7cd441aae70f6c395bd5a42b94d16ef7f6f319a8b61584ad5fc4661b28a3ff4d" gracePeriod=30 Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.196323 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-r8nd2 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-db-create-k5hcd" podUID="c8d05314-fc56-482b-b1a9-d7f5e40b6336" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.201433 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.204307 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.204440 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts podName:0e1cc829-8d44-48c6-ab24-a8ac60596d08 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.204423348 +0000 UTC m=+3053.524683626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts") pod "cinder-aecb-account-create-update-lwrdm" (UID: "0e1cc829-8d44-48c6-ab24-a8ac60596d08") : configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.205332 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.205388 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts podName:d6231555-bba1-41e1-8c8b-5ba3349cc6a4 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:26:21.205378712 +0000 UTC m=+3053.525639000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts") pod "barbican-64d6-account-create-update-xvvgg" (UID: "d6231555-bba1-41e1-8c8b-5ba3349cc6a4") : configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.211689 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.261796 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v9cg\" (UniqueName: \"kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg\") pod \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts\") pod \"68e23a57-b3b1-4e99-8171-53da821ba0d8\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306446 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts\") pod \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\" (UID: \"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzbs7\" (UniqueName: \"kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7\") pod \"68e23a57-b3b1-4e99-8171-53da821ba0d8\" (UID: \"68e23a57-b3b1-4e99-8171-53da821ba0d8\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nd2\" (UniqueName: \"kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.306952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.307008 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbmd\" (UniqueName: \"kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:20 crc 
kubenswrapper[5030]: I0120 23:26:20.307082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.307174 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.307220 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.307205627 +0000 UTC m=+3053.627465915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.307415 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.307439 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.307429742 +0000 UTC m=+3053.627690030 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : configmap "openstack-scripts" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.307950 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68e23a57-b3b1-4e99-8171-53da821ba0d8" (UID: "68e23a57-b3b1-4e99-8171-53da821ba0d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.311878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29c196d8-2e0d-4bbf-9f4c-c6754359f6f3" (UID: "29c196d8-2e0d-4bbf-9f4c-c6754359f6f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.312857 5030 projected.go:194] Error preparing data for projected volume kube-api-access-r8nd2 for pod openstack-kuttl-tests/keystone-db-create-k5hcd: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.312917 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2 podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:26:21.312899715 +0000 UTC m=+3053.633160003 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r8nd2" (UniqueName: "kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.312967 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hdbmd for pod openstack-kuttl-tests/keystone-b695-account-create-update-77jzh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.312998 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:21.312992527 +0000 UTC m=+3053.633252815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hdbmd" (UniqueName: "kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.316361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg" (OuterVolumeSpecName: "kube-api-access-2v9cg") pod "29c196d8-2e0d-4bbf-9f4c-c6754359f6f3" (UID: "29c196d8-2e0d-4bbf-9f4c-c6754359f6f3"). InnerVolumeSpecName "kube-api-access-2v9cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.322087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7" (OuterVolumeSpecName: "kube-api-access-vzbs7") pod "68e23a57-b3b1-4e99-8171-53da821ba0d8" (UID: "68e23a57-b3b1-4e99-8171-53da821ba0d8"). InnerVolumeSpecName "kube-api-access-vzbs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.359663 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.410672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlf4t\" (UniqueName: \"kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.410999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data\") pod \"e5276d7d-9906-480d-90c6-4f48c6f126e2\" (UID: \"e5276d7d-9906-480d-90c6-4f48c6f126e2\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411671 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v9cg\" (UniqueName: \"kubernetes.io/projected/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-kube-api-access-2v9cg\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411683 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e23a57-b3b1-4e99-8171-53da821ba0d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411692 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.411702 5030 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-vzbs7\" (UniqueName: \"kubernetes.io/projected/68e23a57-b3b1-4e99-8171-53da821ba0d8-kube-api-access-vzbs7\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.411860 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.411907 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.411892971 +0000 UTC m=+3056.732153259 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "combined-ca-bundle" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.413024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs" (OuterVolumeSpecName: "logs") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.417594 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t" (OuterVolumeSpecName: "kube-api-access-mlf4t") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "kube-api-access-mlf4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.453592 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.475853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data" (OuterVolumeSpecName: "config-data") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.498483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.502495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5276d7d-9906-480d-90c6-4f48c6f126e2" (UID: "e5276d7d-9906-480d-90c6-4f48c6f126e2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs\") pod \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z55pw\" (UniqueName: \"kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw\") pod \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config\") pod \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle\") pod \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\" (UID: \"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb\") " Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514876 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlf4t\" (UniqueName: \"kubernetes.io/projected/e5276d7d-9906-480d-90c6-4f48c6f126e2-kube-api-access-mlf4t\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514892 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514902 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514910 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514920 5030 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5276d7d-9906-480d-90c6-4f48c6f126e2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.514929 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5276d7d-9906-480d-90c6-4f48c6f126e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.515009 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.515055 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.515038338 +0000 UTC m=+3056.835298626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : secret "barbican-config-data" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.540003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw" (OuterVolumeSpecName: "kube-api-access-z55pw") pod "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" (UID: "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb"). InnerVolumeSpecName "kube-api-access-z55pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.579744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" (UID: "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.585428 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" (UID: "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.599565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" (UID: "0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.616595 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") pod \"barbican-worker-58bb697bc9-t6twt\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.616690 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z55pw\" (UniqueName: \"kubernetes.io/projected/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-api-access-z55pw\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.616703 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.616713 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.616724 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.618222 5030 generic.go:334] "Generic (PLEG): container finished" podID="2b796eed-a508-43ca-9b25-c097b39474b7" containerID="b8717a3cdbefcbd0c0ff07aabc3b13b4414654b84eeec748b31cb1d195d654c9" exitCode=143 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.618275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerDied","Data":"b8717a3cdbefcbd0c0ff07aabc3b13b4414654b84eeec748b31cb1d195d654c9"} Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.619795 5030 projected.go:194] Error preparing data for projected volume kube-api-access-g97xg for pod openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.619845 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg podName:c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.619829045 +0000 UTC m=+3056.940089333 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g97xg" (UniqueName: "kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg") pod "barbican-worker-58bb697bc9-t6twt" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.631564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" event={"ID":"29c196d8-2e0d-4bbf-9f4c-c6754359f6f3","Type":"ContainerDied","Data":"4014fb9782a25f4d714642ff7460c3cd24cfab8b9ae68ff0c2d7c55bed09821a"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.631651 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.640647 5030 generic.go:334] "Generic (PLEG): container finished" podID="7467fbb5-44e9-42cf-874d-16fd1332885e" containerID="493a4b5159ec26b105e62e8cb2c2af22e9255bbb11250b3a8cbcc06e1bbd150a" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.640727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7467fbb5-44e9-42cf-874d-16fd1332885e","Type":"ContainerDied","Data":"493a4b5159ec26b105e62e8cb2c2af22e9255bbb11250b3a8cbcc06e1bbd150a"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.640751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"7467fbb5-44e9-42cf-874d-16fd1332885e","Type":"ContainerDied","Data":"b4d25f2e1c4c075148c980fe0bbb316546c8543b18687fc473c810f3c38fa32a"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.640761 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d25f2e1c4c075148c980fe0bbb316546c8543b18687fc473c810f3c38fa32a" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.644484 5030 generic.go:334] "Generic (PLEG): container finished" podID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerID="f3143e4041bdf06c6947b874cc740775ab9c9ec6001b798510175cc97e744a7e" exitCode=143 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.644532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerDied","Data":"f3143e4041bdf06c6947b874cc740775ab9c9ec6001b798510175cc97e744a7e"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646535 5030 generic.go:334] "Generic (PLEG): container finished" podID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerID="79b477de7360a1b389230a49bf03141e246b436af45aecf9fc9c9289c4283a3d" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646549 5030 generic.go:334] "Generic (PLEG): container finished" podID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerID="115ce5a1a1b811f563c605b90097f6b0f7096471f6c5040cf6697c80d762190f" exitCode=2 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646559 5030 generic.go:334] "Generic (PLEG): container finished" podID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerID="bfd7a2900f29501d274a01d822ba9fcab833dd0ab9306f38cbde8afbe0832f0e" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646565 5030 generic.go:334] "Generic (PLEG): container finished" podID="e2840a28-0f62-4378-bbd0-a07833eb25c7" 
containerID="3fa1f8af90d752f9f412358e748ffa6b26b530af7abe3410789eb3720481f96a" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerDied","Data":"79b477de7360a1b389230a49bf03141e246b436af45aecf9fc9c9289c4283a3d"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerDied","Data":"115ce5a1a1b811f563c605b90097f6b0f7096471f6c5040cf6697c80d762190f"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerDied","Data":"bfd7a2900f29501d274a01d822ba9fcab833dd0ab9306f38cbde8afbe0832f0e"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.646643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerDied","Data":"3fa1f8af90d752f9f412358e748ffa6b26b530af7abe3410789eb3720481f96a"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.647728 5030 generic.go:334] "Generic (PLEG): container finished" podID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerID="616e09206b760dcb070e95678184dd06e6e1027d0d51ca57dde5f3bc0847a277" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.647739 5030 generic.go:334] "Generic (PLEG): container finished" podID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerID="ba64c3f6fd654c175099565552c0c0d6fd00f883cbac81faa3e5a76d3ce4b5e9" exitCode=143 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.647765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerDied","Data":"616e09206b760dcb070e95678184dd06e6e1027d0d51ca57dde5f3bc0847a277"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.647778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerDied","Data":"ba64c3f6fd654c175099565552c0c0d6fd00f883cbac81faa3e5a76d3ce4b5e9"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.648948 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerID="b212f24fbc91bb05579efc68cd38cf0e1aaabcd0a37df17c85f251b376bf39f4" exitCode=1 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.648979 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-ndbv8" event={"ID":"3f82fbd9-cd1e-4d71-8d61-02126748c546","Type":"ContainerDied","Data":"b212f24fbc91bb05579efc68cd38cf0e1aaabcd0a37df17c85f251b376bf39f4"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.649000 5030 scope.go:117] "RemoveContainer" containerID="01b284d1f2373249d8d8c7a1c2c3ee891c1872f2d5ad6c4ea03f3ffcbbd09c35" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.656143 5030 generic.go:334] "Generic (PLEG): container finished" podID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerID="4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.656200 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerDied","Data":"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.656224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"e5276d7d-9906-480d-90c6-4f48c6f126e2","Type":"ContainerDied","Data":"e5fc5981d432357c3c6b1bf46ab30c0c33deac8dd8384747436ef962d6aafb34"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.656279 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.671942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" event={"ID":"68e23a57-b3b1-4e99-8171-53da821ba0d8","Type":"ContainerDied","Data":"733f59c56639f824d80f2a04d03b12eb127d7fa3ff3693d43ce54ffdf366a49e"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.672014 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.683814 5030 generic.go:334] "Generic (PLEG): container finished" podID="5995ef35-8891-4ffb-8e20-86add9e274bc" containerID="8b8b11c6ac6050ac18676092493bc35bbf1121d7672ace5bea22fe163dd6aca3" exitCode=137 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.688370 5030 generic.go:334] "Generic (PLEG): container finished" podID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerID="5e702a9647c237b7ea176fced7e9b9708645df2df43294f18dad118f506f7fee" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.688464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerDied","Data":"5e702a9647c237b7ea176fced7e9b9708645df2df43294f18dad118f506f7fee"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.737932 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.737969 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.737977 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.737986 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.737991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.738044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.738055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.738063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.744085 5030 generic.go:334] "Generic (PLEG): container finished" podID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerID="6b496bef4ca4d0433913ed3347b495cb73325fdbef015e1d05cc61795bb60789" exitCode=143 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.744155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerDied","Data":"6b496bef4ca4d0433913ed3347b495cb73325fdbef015e1d05cc61795bb60789"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.750911 5030 generic.go:334] "Generic (PLEG): container finished" podID="c256a299-ef10-4eab-ae00-d4116a91a108" containerID="cbbd55ec11f922f70da8495be9037539e1d761154559e6b2075472c7deb21023" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.750958 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerDied","Data":"cbbd55ec11f922f70da8495be9037539e1d761154559e6b2075472c7deb21023"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.750985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerDied","Data":"2e3c443886e08309e23e74c3eb26e87aeed4b515043573a5b997072fad41ac81"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.750965 5030 generic.go:334] "Generic (PLEG): container finished" podID="c256a299-ef10-4eab-ae00-d4116a91a108" containerID="2e3c443886e08309e23e74c3eb26e87aeed4b515043573a5b997072fad41ac81" exitCode=0 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.751045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" event={"ID":"c256a299-ef10-4eab-ae00-d4116a91a108","Type":"ContainerDied","Data":"85dab81c3ed675ac6bfcead4abc4422ebb5e103eb112882198a33167ac28209d"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.751063 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85dab81c3ed675ac6bfcead4abc4422ebb5e103eb112882198a33167ac28209d" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.753084 5030 generic.go:334] "Generic (PLEG): container finished" podID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" containerID="1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914" exitCode=2 Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.753196 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.753754 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.754197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb","Type":"ContainerDied","Data":"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.754239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb","Type":"ContainerDied","Data":"94347de6b0123b260b344d05156ff2e06368833265f0cee2a5bd2b0f4d48cd5d"} Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.754298 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.754485 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.754894 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb" Jan 20 23:26:20 crc kubenswrapper[5030]: I0120 23:26:20.869859 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.926710 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:26:20 crc kubenswrapper[5030]: E0120 23:26:20.926922 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle podName:23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf nodeName:}" failed. No retries permitted until 2026-01-20 23:26:24.926907488 +0000 UTC m=+3057.247167776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle") pod "nova-metadata-0" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf") : secret "combined-ca-bundle" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.029593 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.029740 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts podName:c07cdd7a-1d57-4d2a-9910-54d0e6e186f5 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.029723537 +0000 UTC m=+3055.349983825 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts") pod "glance-659a-account-create-update-nthp7" (UID: "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.030154 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.030187 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts podName:3f82fbd9-cd1e-4d71-8d61-02126748c546 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.030179178 +0000 UTC m=+3055.350439466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts") pod "glance-db-create-ndbv8" (UID: "3f82fbd9-cd1e-4d71-8d61-02126748c546") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.039691 5030 scope.go:117] "RemoveContainer" containerID="4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.112605 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.124645 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.139481 5030 scope.go:117] "RemoveContainer" containerID="bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.167774 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.185840 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.186386 5030 scope.go:117] "RemoveContainer" containerID="4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.190499 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb\": container with ID starting with 4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb not found: ID does not exist" containerID="4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.190541 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb"} err="failed to get container status \"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb\": rpc error: code = NotFound desc = could not find container \"4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb\": container with ID starting with 4fcb730cfefb582fb08c7ccfc4f44f7eca229fb8f1e6d2c66c0bb98cd59694cb not found: ID does not exist" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.190570 5030 scope.go:117] "RemoveContainer" containerID="bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.191147 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9\": container with ID starting with bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9 not found: ID does not exist" containerID="bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.191225 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9"} err="failed to get container status \"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9\": rpc error: code = NotFound desc = could not find container \"bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9\": container with ID starting with bfd8c73cd8e2801b4cb689a42728e83fbff428b8ac539964c23cc5f093ce8bf9 not found: ID does not exist" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.191253 5030 scope.go:117] "RemoveContainer" containerID="1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.197751 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.204534 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.220745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.233098 5030 scope.go:117] "RemoveContainer" containerID="1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.235326 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914\": container with ID starting with 1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914 not found: ID does not exist" containerID="1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.235374 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914"} err="failed to get container status \"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914\": rpc error: code = NotFound desc = could not find container \"1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914\": container with ID starting with 1b112a374f9b05a0f9a2eb5a5ade4d52d82f6111e4094f010c5765691ff6c914 not found: ID does not exist" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.238916 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.240216 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-ebab-account-create-update-6gbzs"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbjbt\" (UniqueName: \"kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt\") pod \"7467fbb5-44e9-42cf-874d-16fd1332885e\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom\") pod \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data\") 
pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data\") pod \"7467fbb5-44e9-42cf-874d-16fd1332885e\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df5wt\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242636 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242659 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs\") pod \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\" (UID: \"c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle\") pod \"7467fbb5-44e9-42cf-874d-16fd1332885e\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs\") pod \"7467fbb5-44e9-42cf-874d-16fd1332885e\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift\") pod \"c256a299-ef10-4eab-ae00-d4116a91a108\" (UID: \"c256a299-ef10-4eab-ae00-d4116a91a108\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.242830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs\") pod 
\"7467fbb5-44e9-42cf-874d-16fd1332885e\" (UID: \"7467fbb5-44e9-42cf-874d-16fd1332885e\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.244542 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.247979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.249166 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.253821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs" (OuterVolumeSpecName: "logs") pod "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.254072 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.254104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.254417 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.254578 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts podName:d6231555-bba1-41e1-8c8b-5ba3349cc6a4 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.254549779 +0000 UTC m=+3055.574810247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts") pod "barbican-64d6-account-create-update-xvvgg" (UID: "d6231555-bba1-41e1-8c8b-5ba3349cc6a4") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.255271 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.255411 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts podName:0e1cc829-8d44-48c6-ab24-a8ac60596d08 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.25539869 +0000 UTC m=+3055.575658978 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts") pod "cinder-aecb-account-create-update-lwrdm" (UID: "0e1cc829-8d44-48c6-ab24-a8ac60596d08") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.263153 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.265981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.275422 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.279939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9" (UID: "c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.280052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt" (OuterVolumeSpecName: "kube-api-access-df5wt") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "kube-api-access-df5wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.280851 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-779e-account-create-update-5p9c4"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.281780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt" (OuterVolumeSpecName: "kube-api-access-kbjbt") pod "7467fbb5-44e9-42cf-874d-16fd1332885e" (UID: "7467fbb5-44e9-42cf-874d-16fd1332885e"). InnerVolumeSpecName "kube-api-access-kbjbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.284728 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.299588 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.305711 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.326536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data" (OuterVolumeSpecName: "config-data") pod "7467fbb5-44e9-42cf-874d-16fd1332885e" (UID: "7467fbb5-44e9-42cf-874d-16fd1332885e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.331941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7467fbb5-44e9-42cf-874d-16fd1332885e" (UID: "7467fbb5-44e9-42cf-874d-16fd1332885e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.349953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "7467fbb5-44e9-42cf-874d-16fd1332885e" (UID: "7467fbb5-44e9-42cf-874d-16fd1332885e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.365902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs\") pod \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data\") pod \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-262j8\" (UniqueName: \"kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8\") pod \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp7fs\" (UniqueName: \"kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs\") pod \"5995ef35-8891-4ffb-8e20-86add9e274bc\" (UID: 
\"5995ef35-8891-4ffb-8e20-86add9e274bc\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.367578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs\") pod \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.371814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.374576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.375210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs" (OuterVolumeSpecName: "logs") pod "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.378862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret\") pod \"5995ef35-8891-4ffb-8e20-86add9e274bc\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fr6n\" (UniqueName: \"kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle\") pod \"5995ef35-8891-4ffb-8e20-86add9e274bc\" (UID: \"5995ef35-8891-4ffb-8e20-86add9e274bc\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383916 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config\") pod \"5995ef35-8891-4ffb-8e20-86add9e274bc\" (UID: 
\"5995ef35-8891-4ffb-8e20-86add9e274bc\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.383958 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config\") pod \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\" (UID: \"f239887a-e1d2-484e-96ce-e4bb56b1bc75\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle\") pod \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\" (UID: \"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384252 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf8g4\" (UniqueName: \"kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle\") pod \"e2840a28-0f62-4378-bbd0-a07833eb25c7\" (UID: \"e2840a28-0f62-4378-bbd0-a07833eb25c7\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nd2\" (UniqueName: \"kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.384946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbmd\" (UniqueName: \"kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd\") pod \"keystone-b695-account-create-update-77jzh\" (UID: \"8eedd3c6-0b24-49e4-81eb-0b21419f8739\") " pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385155 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts\") pod \"keystone-db-create-k5hcd\" (UID: \"c8d05314-fc56-482b-b1a9-d7f5e40b6336\") " pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385339 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385352 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385362 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df5wt\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-kube-api-access-df5wt\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385373 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c256a299-ef10-4eab-ae00-d4116a91a108-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385385 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385395 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385405 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385414 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385422 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385431 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385440 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c256a299-ef10-4eab-ae00-d4116a91a108-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385450 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: 
I0120 23:26:21.385460 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbjbt\" (UniqueName: \"kubernetes.io/projected/7467fbb5-44e9-42cf-874d-16fd1332885e-kube-api-access-kbjbt\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.385469 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.385532 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.385588 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.385570461 +0000 UTC m=+3055.705830749 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.387343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.387425 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.387454 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data podName:fb76fff8-05c4-4ec5-b00b-17dfd94e5dea nodeName:}" failed. No retries permitted until 2026-01-20 23:26:25.387445606 +0000 UTC m=+3057.707705894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data") pod "rabbitmq-cell1-server-0" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.387640 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.387663 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.387655831 +0000 UTC m=+3055.707916119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : configmap "openstack-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.387785 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.389035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "7467fbb5-44e9-42cf-874d-16fd1332885e" (UID: "7467fbb5-44e9-42cf-874d-16fd1332885e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.389121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.389358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.400036 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hdbmd for pod openstack-kuttl-tests/keystone-b695-account-create-update-77jzh: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.401262 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd podName:8eedd3c6-0b24-49e4-81eb-0b21419f8739 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.40124439 +0000 UTC m=+3055.721504678 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hdbmd" (UniqueName: "kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd") pod "keystone-b695-account-create-update-77jzh" (UID: "8eedd3c6-0b24-49e4-81eb-0b21419f8739") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.401317 5030 projected.go:194] Error preparing data for projected volume kube-api-access-r8nd2 for pod openstack-kuttl-tests/keystone-db-create-k5hcd: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.401350 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2 podName:c8d05314-fc56-482b-b1a9-d7f5e40b6336 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:23.401341453 +0000 UTC m=+3055.721601741 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r8nd2" (UniqueName: "kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2") pod "keystone-db-create-k5hcd" (UID: "c8d05314-fc56-482b-b1a9-d7f5e40b6336") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.405018 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5547ddfddd-hx7pb"] Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.408714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n" (OuterVolumeSpecName: "kube-api-access-2fr6n") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "kube-api-access-2fr6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.417595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4" (OuterVolumeSpecName: "kube-api-access-vf8g4") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "kube-api-access-vf8g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.417741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs" (OuterVolumeSpecName: "kube-api-access-hp7fs") pod "5995ef35-8891-4ffb-8e20-86add9e274bc" (UID: "5995ef35-8891-4ffb-8e20-86add9e274bc"). InnerVolumeSpecName "kube-api-access-hp7fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.418696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8" (OuterVolumeSpecName: "kube-api-access-262j8") pod "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf"). InnerVolumeSpecName "kube-api-access-262j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.475012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts" (OuterVolumeSpecName: "scripts") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.480167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508115 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-262j8\" (UniqueName: \"kubernetes.io/projected/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-kube-api-access-262j8\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508150 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp7fs\" (UniqueName: \"kubernetes.io/projected/5995ef35-8891-4ffb-8e20-86add9e274bc-kube-api-access-hp7fs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508162 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508173 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fr6n\" (UniqueName: \"kubernetes.io/projected/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kube-api-access-2fr6n\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508187 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7467fbb5-44e9-42cf-874d-16fd1332885e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508199 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508209 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508218 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2840a28-0f62-4378-bbd0-a07833eb25c7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508232 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f239887a-e1d2-484e-96ce-e4bb56b1bc75-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.508242 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf8g4\" (UniqueName: \"kubernetes.io/projected/e2840a28-0f62-4378-bbd0-a07833eb25c7-kube-api-access-vf8g4\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.514572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.553791 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.612076 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.612274 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.612393 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.612472 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc858\" (UniqueName: \"kubernetes.io/projected/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89-kube-api-access-pc858\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.634242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.639327 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data" (OuterVolumeSpecName: "config-data") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.648768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data" (OuterVolumeSpecName: "config-data") pod "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.661918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5995ef35-8891-4ffb-8e20-86add9e274bc" (UID: "5995ef35-8891-4ffb-8e20-86add9e274bc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.687186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.690101 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.699072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5995ef35-8891-4ffb-8e20-86add9e274bc" (UID: "5995ef35-8891-4ffb-8e20-86add9e274bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.714886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts\") pod \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.715486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e1cc829-8d44-48c6-ab24-a8ac60596d08" (UID: "0e1cc829-8d44-48c6-ab24-a8ac60596d08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.719840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp4qp\" (UniqueName: \"kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp\") pod \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\" (UID: \"0e1cc829-8d44-48c6-ab24-a8ac60596d08\") " Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720785 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720808 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720819 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720829 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720840 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1cc829-8d44-48c6-ab24-a8ac60596d08-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720852 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720864 5030 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.720876 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.725152 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp" (OuterVolumeSpecName: "kube-api-access-wp4qp") pod "0e1cc829-8d44-48c6-ab24-a8ac60596d08" (UID: "0e1cc829-8d44-48c6-ab24-a8ac60596d08"). InnerVolumeSpecName "kube-api-access-wp4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.727825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.738729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c256a299-ef10-4eab-ae00-d4116a91a108" (UID: "c256a299-ef10-4eab-ae00-d4116a91a108"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.746787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5995ef35-8891-4ffb-8e20-86add9e274bc" (UID: "5995ef35-8891-4ffb-8e20-86add9e274bc"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.754977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f239887a-e1d2-484e-96ce-e4bb56b1bc75" (UID: "f239887a-e1d2-484e-96ce-e4bb56b1bc75"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.772458 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" event={"ID":"0e1cc829-8d44-48c6-ab24-a8ac60596d08","Type":"ContainerDied","Data":"9e3ebb3ca2a5c90cc6024193a172242acd2f2c35f608cc83deaf99b69d9da458"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.772907 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.784922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f239887a-e1d2-484e-96ce-e4bb56b1bc75","Type":"ContainerDied","Data":"fafd5ba47410f6fedd6499c7232ec2d2d22b86d2d2aae3c9cc4c3d6841b8ea5f"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.785471 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.788149 5030 scope.go:117] "RemoveContainer" containerID="5e702a9647c237b7ea176fced7e9b9708645df2df43294f18dad118f506f7fee" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.803277 5030 generic.go:334] "Generic (PLEG): container finished" podID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerID="c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d" exitCode=0 Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.803425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" event={"ID":"e70c799f-11cc-4de5-87b7-c9fb919df4b8","Type":"ContainerDied","Data":"c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.804043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" event={"ID":"e70c799f-11cc-4de5-87b7-c9fb919df4b8","Type":"ContainerStarted","Data":"2b484d297f125d231acee7c20314852a50fb43af5d47656b8b6023213e14bd3b"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.806554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" (UID: "23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.806669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.808911 5030 generic.go:334] "Generic (PLEG): container finished" podID="3a13994b-a11d-4798-9aae-b91a3d73d440" containerID="847a1cdf293cd94ff36a98ec31b6492efbc1f13b9fd3fea3d0454c780f44cc6f" exitCode=0 Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.809237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"3a13994b-a11d-4798-9aae-b91a3d73d440","Type":"ContainerDied","Data":"847a1cdf293cd94ff36a98ec31b6492efbc1f13b9fd3fea3d0454c780f44cc6f"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.809373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"3a13994b-a11d-4798-9aae-b91a3d73d440","Type":"ContainerDied","Data":"bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.809387 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc44f82bfdea244db33f854331c5bc77af0d5a17f0fb4ee14c74bfacb210752b" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.811955 5030 generic.go:334] "Generic (PLEG): container finished" podID="0814026f-5519-40a1-94e8-379b5abac55f" containerID="8d8857a54e3051d21b85864143f114d8c2faafa007277a85b349682cb0dd2946" exitCode=0 Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.811994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0814026f-5519-40a1-94e8-379b5abac55f","Type":"ContainerDied","Data":"8d8857a54e3051d21b85864143f114d8c2faafa007277a85b349682cb0dd2946"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.812128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"0814026f-5519-40a1-94e8-379b5abac55f","Type":"ContainerDied","Data":"c4b7ffb8cb725c0415939e88cb48fa4d9ffd933d29c2375c642c3a6f7b5e1d1a"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.812143 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4b7ffb8cb725c0415939e88cb48fa4d9ffd933d29c2375c642c3a6f7b5e1d1a" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822492 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5995ef35-8891-4ffb-8e20-86add9e274bc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822542 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f239887a-e1d2-484e-96ce-e4bb56b1bc75-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822554 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c256a299-ef10-4eab-ae00-d4116a91a108-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822564 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822573 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822582 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp4qp\" (UniqueName: \"kubernetes.io/projected/0e1cc829-8d44-48c6-ab24-a8ac60596d08-kube-api-access-wp4qp\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.822614 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.823431 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: E0120 23:26:21.823945 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts podName:261b1067-9a6b-4835-8b8c-e0d2c68a4a99 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:25.823922452 +0000 UTC m=+3058.144182740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts") pod "root-account-create-update-q48f7" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99") : configmap "openstack-cell1-scripts" not found Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.838715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf","Type":"ContainerDied","Data":"6c76f3c7c64099b7ee877fe97a9e61fbe3bb63c5803e888fb3eae340dcb08732"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.838790 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.844714 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-ndbv8" event={"ID":"3f82fbd9-cd1e-4d71-8d61-02126748c546","Type":"ContainerDied","Data":"585ace1beb3985a3db038b8d501092d73f30b14956d24ba1eca80ca872ff3dfa"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.844772 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="585ace1beb3985a3db038b8d501092d73f30b14956d24ba1eca80ca872ff3dfa" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.851901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data" (OuterVolumeSpecName: "config-data") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.852714 5030 generic.go:334] "Generic (PLEG): container finished" podID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerID="7cd441aae70f6c395bd5a42b94d16ef7f6f319a8b61584ad5fc4661b28a3ff4d" exitCode=0 Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.852773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerDied","Data":"7cd441aae70f6c395bd5a42b94d16ef7f6f319a8b61584ad5fc4661b28a3ff4d"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.852796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66881282-2b6f-484d-a7e0-2b968306c5ae","Type":"ContainerDied","Data":"da1429337d8d0557f1d287751d35aed815aa1a3fe7afc2c2f1417ba4fc849590"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.852807 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1429337d8d0557f1d287751d35aed815aa1a3fe7afc2c2f1417ba4fc849590" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.856516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-q48f7" event={"ID":"261b1067-9a6b-4835-8b8c-e0d2c68a4a99","Type":"ContainerDied","Data":"643f9c5fc5870c70f78b08664982f8d0a19e355c041c22eaf4cb9fc62357470b"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.856539 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="643f9c5fc5870c70f78b08664982f8d0a19e355c041c22eaf4cb9fc62357470b" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.859380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e2840a28-0f62-4378-bbd0-a07833eb25c7","Type":"ContainerDied","Data":"4598b0245916d45a6b4a1919da72ecb6246f72eb1ec695ac241706e0cd2d3cfd"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.859438 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.862406 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.863618 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" event={"ID":"d6231555-bba1-41e1-8c8b-5ba3349cc6a4","Type":"ContainerDied","Data":"f31e8dca3a24798d247e32d5d63e5006a018384b067281e48c8b0ff9ab3cbfd8"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.863659 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f31e8dca3a24798d247e32d5d63e5006a018384b067281e48c8b0ff9ab3cbfd8" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.865687 5030 generic.go:334] "Generic (PLEG): container finished" podID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerID="f9817a66ae09a052b512e6a6f843fe20c1e5f1eeabb2fd55c4ccbcf13f8f066c" exitCode=0 Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.865735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerDied","Data":"f9817a66ae09a052b512e6a6f843fe20c1e5f1eeabb2fd55c4ccbcf13f8f066c"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.867226 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b695-account-create-update-77jzh" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.871879 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.871946 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-k5hcd" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.872891 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.874254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" event={"ID":"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5","Type":"ContainerDied","Data":"2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78"} Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.874294 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2976d7722e6f1dc7eaccf8c68c191e62af00d654d0c2c4a0697aadeb739bbf78" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.874368 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.886983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2840a28-0f62-4378-bbd0-a07833eb25c7" (UID: "e2840a28-0f62-4378-bbd0-a07833eb25c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.924348 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.924393 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2840a28-0f62-4378-bbd0-a07833eb25c7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.928270 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.940341 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.962212 5030 scope.go:117] "RemoveContainer" containerID="872d237eda2119aaa6d6a2d8792e96fe1f8b4dac40e197593e2387ae4ae6bf94" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.978089 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.986394 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" path="/var/lib/kubelet/pods/0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb/volumes" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.991426 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29c196d8-2e0d-4bbf-9f4c-c6754359f6f3" path="/var/lib/kubelet/pods/29c196d8-2e0d-4bbf-9f4c-c6754359f6f3/volumes" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.992322 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5995ef35-8891-4ffb-8e20-86add9e274bc" path="/var/lib/kubelet/pods/5995ef35-8891-4ffb-8e20-86add9e274bc/volumes" Jan 20 23:26:21 crc kubenswrapper[5030]: I0120 23:26:21.993464 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64de676f-6fb5-4e1f-ba48-9c6ba84b1e89" path="/var/lib/kubelet/pods/64de676f-6fb5-4e1f-ba48-9c6ba84b1e89/volumes" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:21.998018 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e23a57-b3b1-4e99-8171-53da821ba0d8" path="/var/lib/kubelet/pods/68e23a57-b3b1-4e99-8171-53da821ba0d8/volumes" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:21.998664 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" path="/var/lib/kubelet/pods/e5276d7d-9906-480d-90c6-4f48c6f126e2/volumes" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.014988 5030 scope.go:117] "RemoveContainer" containerID="616e09206b760dcb070e95678184dd06e6e1027d0d51ca57dde5f3bc0847a277" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs\") pod \"3a13994b-a11d-4798-9aae-b91a3d73d440\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-m4j4l\" (UniqueName: \"kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l\") pod \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts\") pod \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\" (UID: \"c07cdd7a-1d57-4d2a-9910-54d0e6e186f5\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts\") pod \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026779 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle\") pod \"3a13994b-a11d-4798-9aae-b91a3d73d440\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data\") pod \"3a13994b-a11d-4798-9aae-b91a3d73d440\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config\") pod \"3a13994b-a11d-4798-9aae-b91a3d73d440\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqcnr\" (UniqueName: \"kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr\") pod \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\" (UID: \"261b1067-9a6b-4835-8b8c-e0d2c68a4a99\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.026901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xzp\" (UniqueName: \"kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp\") pod \"3a13994b-a11d-4798-9aae-b91a3d73d440\" (UID: \"3a13994b-a11d-4798-9aae-b91a3d73d440\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.034078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data" (OuterVolumeSpecName: "config-data") pod "3a13994b-a11d-4798-9aae-b91a3d73d440" (UID: "3a13994b-a11d-4798-9aae-b91a3d73d440"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.034676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "261b1067-9a6b-4835-8b8c-e0d2c68a4a99" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.036470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3a13994b-a11d-4798-9aae-b91a3d73d440" (UID: "3a13994b-a11d-4798-9aae-b91a3d73d440"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.038072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5" (UID: "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.038525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp" (OuterVolumeSpecName: "kube-api-access-h5xzp") pod "3a13994b-a11d-4798-9aae-b91a3d73d440" (UID: "3a13994b-a11d-4798-9aae-b91a3d73d440"). InnerVolumeSpecName "kube-api-access-h5xzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.039785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l" (OuterVolumeSpecName: "kube-api-access-m4j4l") pod "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5" (UID: "c07cdd7a-1d57-4d2a-9910-54d0e6e186f5"). InnerVolumeSpecName "kube-api-access-m4j4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.044974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr" (OuterVolumeSpecName: "kube-api-access-vqcnr") pod "261b1067-9a6b-4835-8b8c-e0d2c68a4a99" (UID: "261b1067-9a6b-4835-8b8c-e0d2c68a4a99"). InnerVolumeSpecName "kube-api-access-vqcnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.073225 5030 scope.go:117] "RemoveContainer" containerID="ba64c3f6fd654c175099565552c0c0d6fd00f883cbac81faa3e5a76d3ce4b5e9" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.104522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a13994b-a11d-4798-9aae-b91a3d73d440" (UID: "3a13994b-a11d-4798-9aae-b91a3d73d440"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.141537 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "3a13994b-a11d-4798-9aae-b91a3d73d440" (UID: "3a13994b-a11d-4798-9aae-b91a3d73d440"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144514 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4j4l\" (UniqueName: \"kubernetes.io/projected/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-kube-api-access-m4j4l\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144557 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144571 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144584 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144604 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144632 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a13994b-a11d-4798-9aae-b91a3d73d440-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144646 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqcnr\" (UniqueName: \"kubernetes.io/projected/261b1067-9a6b-4835-8b8c-e0d2c68a4a99-kube-api-access-vqcnr\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144664 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xzp\" (UniqueName: \"kubernetes.io/projected/3a13994b-a11d-4798-9aae-b91a3d73d440-kube-api-access-h5xzp\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.144679 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a13994b-a11d-4798-9aae-b91a3d73d440-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: E0120 23:26:22.250898 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:22 crc kubenswrapper[5030]: E0120 23:26:22.253057 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:22 crc kubenswrapper[5030]: E0120 23:26:22.264094 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:22 crc kubenswrapper[5030]: E0120 23:26:22.264169 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" containerName="nova-cell0-conductor-conductor" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295478 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-k5hcd"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295513 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-k5hcd"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295534 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295546 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-aecb-account-create-update-lwrdm"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295558 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295570 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295588 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-77jzh"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295599 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b695-account-create-update-77jzh"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295610 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295635 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295646 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295659 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295674 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.295684 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-58bb697bc9-t6twt"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.300908 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.313810 5030 scope.go:117] "RemoveContainer" containerID="79b477de7360a1b389230a49bf03141e246b436af45aecf9fc9c9289c4283a3d" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.316218 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.332339 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts\") pod \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348781 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ws78\" (UniqueName: \"kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78\") pod \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\" (UID: \"d6231555-bba1-41e1-8c8b-5ba3349cc6a4\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv9mt\" (UniqueName: \"kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.348886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 
crc kubenswrapper[5030]: I0120 23:26:22.348931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"66881282-2b6f-484d-a7e0-2b968306c5ae\" (UID: \"66881282-2b6f-484d-a7e0-2b968306c5ae\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349376 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8eedd3c6-0b24-49e4-81eb-0b21419f8739-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349396 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349405 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g97xg\" (UniqueName: \"kubernetes.io/projected/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-kube-api-access-g97xg\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349416 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8nd2\" (UniqueName: \"kubernetes.io/projected/c8d05314-fc56-482b-b1a9-d7f5e40b6336-kube-api-access-r8nd2\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349425 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdbmd\" (UniqueName: \"kubernetes.io/projected/8eedd3c6-0b24-49e4-81eb-0b21419f8739-kube-api-access-hdbmd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349433 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8d05314-fc56-482b-b1a9-d7f5e40b6336-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.349444 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.362425 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.362770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.364471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6231555-bba1-41e1-8c8b-5ba3349cc6a4" (UID: "d6231555-bba1-41e1-8c8b-5ba3349cc6a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.364785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.365065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.367081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78" (OuterVolumeSpecName: "kube-api-access-6ws78") pod "d6231555-bba1-41e1-8c8b-5ba3349cc6a4" (UID: "d6231555-bba1-41e1-8c8b-5ba3349cc6a4"). InnerVolumeSpecName "kube-api-access-6ws78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.372382 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.372776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.372790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt" (OuterVolumeSpecName: "kube-api-access-vv9mt") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "kube-api-access-vv9mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.397252 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.402214 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.402796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.403410 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.409659 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.413515 5030 scope.go:117] "RemoveContainer" containerID="115ce5a1a1b811f563c605b90097f6b0f7096471f6c5040cf6697c80d762190f" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.447639 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.72:9311/healthcheck\": read tcp 10.217.0.2:40870->10.217.1.72:9311: read: connection reset by peer" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.448942 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.72:9311/healthcheck\": read tcp 10.217.0.2:40856->10.217.1.72:9311: read: connection reset by peer" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.450751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.450905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451001 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjh6\" (UniqueName: \"kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6\") pod \"0814026f-5519-40a1-94e8-379b5abac55f\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts\") pod \"3f82fbd9-cd1e-4d71-8d61-02126748c546\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f4s2\" (UniqueName: \"kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts\") pod \"d37b566e-a870-4b0d-af53-29da770c8a0a\" (UID: \"d37b566e-a870-4b0d-af53-29da770c8a0a\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data\") pod \"0814026f-5519-40a1-94e8-379b5abac55f\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle\") pod \"0814026f-5519-40a1-94e8-379b5abac55f\" (UID: \"0814026f-5519-40a1-94e8-379b5abac55f\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.451435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkqr\" (UniqueName: \"kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr\") pod \"3f82fbd9-cd1e-4d71-8d61-02126748c546\" (UID: \"3f82fbd9-cd1e-4d71-8d61-02126748c546\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452157 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452182 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452196 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452208 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452219 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ws78\" (UniqueName: 
\"kubernetes.io/projected/d6231555-bba1-41e1-8c8b-5ba3349cc6a4-kube-api-access-6ws78\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452229 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452239 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv9mt\" (UniqueName: \"kubernetes.io/projected/66881282-2b6f-484d-a7e0-2b968306c5ae-kube-api-access-vv9mt\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.452248 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66881282-2b6f-484d-a7e0-2b968306c5ae-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.457727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f82fbd9-cd1e-4d71-8d61-02126748c546" (UID: "3f82fbd9-cd1e-4d71-8d61-02126748c546"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.460152 5030 scope.go:117] "RemoveContainer" containerID="bfd7a2900f29501d274a01d822ba9fcab833dd0ab9306f38cbde8afbe0832f0e" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.461895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.462453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs" (OuterVolumeSpecName: "logs") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.464706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.469351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6" (OuterVolumeSpecName: "kube-api-access-rcjh6") pod "0814026f-5519-40a1-94e8-379b5abac55f" (UID: "0814026f-5519-40a1-94e8-379b5abac55f"). InnerVolumeSpecName "kube-api-access-rcjh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.487845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr" (OuterVolumeSpecName: "kube-api-access-vqkqr") pod "3f82fbd9-cd1e-4d71-8d61-02126748c546" (UID: "3f82fbd9-cd1e-4d71-8d61-02126748c546"). InnerVolumeSpecName "kube-api-access-vqkqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.492409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.494902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2" (OuterVolumeSpecName: "kube-api-access-8f4s2") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "kube-api-access-8f4s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.496395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts" (OuterVolumeSpecName: "scripts") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.497866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "66881282-2b6f-484d-a7e0-2b968306c5ae" (UID: "66881282-2b6f-484d-a7e0-2b968306c5ae"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.524607 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.538062 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.545386 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.549276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data" (OuterVolumeSpecName: "config-data") pod "0814026f-5519-40a1-94e8-379b5abac55f" (UID: "0814026f-5519-40a1-94e8-379b5abac55f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554288 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554317 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f82fbd9-cd1e-4d71-8d61-02126748c546-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554354 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554365 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554376 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f4s2\" (UniqueName: \"kubernetes.io/projected/d37b566e-a870-4b0d-af53-29da770c8a0a-kube-api-access-8f4s2\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.554386 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.557480 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.557587 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.558850 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.558938 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqkqr\" (UniqueName: \"kubernetes.io/projected/3f82fbd9-cd1e-4d71-8d61-02126748c546-kube-api-access-vqkqr\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.559016 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37b566e-a870-4b0d-af53-29da770c8a0a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.559098 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66881282-2b6f-484d-a7e0-2b968306c5ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.559165 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjh6\" (UniqueName: \"kubernetes.io/projected/0814026f-5519-40a1-94e8-379b5abac55f-kube-api-access-rcjh6\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 
23:26:22.566330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0814026f-5519-40a1-94e8-379b5abac55f" (UID: "0814026f-5519-40a1-94e8-379b5abac55f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.584059 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.594506 5030 scope.go:117] "RemoveContainer" containerID="3fa1f8af90d752f9f412358e748ffa6b26b530af7abe3410789eb3720481f96a" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.618959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.624059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data" (OuterVolumeSpecName: "config-data") pod "d37b566e-a870-4b0d-af53-29da770c8a0a" (UID: "d37b566e-a870-4b0d-af53-29da770c8a0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.660811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wftgg\" (UniqueName: \"kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.661912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data\") pod \"c862e826-c66e-4188-9d2a-096e6039a1af\" (UID: \"c862e826-c66e-4188-9d2a-096e6039a1af\") " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662453 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662469 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d37b566e-a870-4b0d-af53-29da770c8a0a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662479 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662487 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662496 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0814026f-5519-40a1-94e8-379b5abac55f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.662685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs" (OuterVolumeSpecName: "logs") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.670033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.678839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts" (OuterVolumeSpecName: "scripts") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.681113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg" (OuterVolumeSpecName: "kube-api-access-wftgg") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "kube-api-access-wftgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.727901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.770020 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.770065 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.770075 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.770083 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c862e826-c66e-4188-9d2a-096e6039a1af-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.770094 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wftgg\" (UniqueName: \"kubernetes.io/projected/c862e826-c66e-4188-9d2a-096e6039a1af-kube-api-access-wftgg\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.783842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data" (OuterVolumeSpecName: "config-data") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.796991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c862e826-c66e-4188-9d2a-096e6039a1af" (UID: "c862e826-c66e-4188-9d2a-096e6039a1af"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.800857 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.871349 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.871376 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.871387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c862e826-c66e-4188-9d2a-096e6039a1af-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.893675 5030 generic.go:334] "Generic (PLEG): container finished" podID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerID="75c9449a8a21d460c37176286c18c54720a652f60cdf7d26b5616d4f02353a1a" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.893735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerDied","Data":"75c9449a8a21d460c37176286c18c54720a652f60cdf7d26b5616d4f02353a1a"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.918338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d37b566e-a870-4b0d-af53-29da770c8a0a","Type":"ContainerDied","Data":"e9709736009f50137989f22243681ea92a35dc763797d6191c771771f8fa156a"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.918394 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.931710 5030 generic.go:334] "Generic (PLEG): container finished" podID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" containerID="60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.931775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e33b23a7-cf35-4fc2-8237-9633f3d807f3","Type":"ContainerDied","Data":"60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.934304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" event={"ID":"e70c799f-11cc-4de5-87b7-c9fb919df4b8","Type":"ContainerStarted","Data":"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.934852 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.940776 5030 generic.go:334] "Generic (PLEG): container finished" podID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerID="4141048d16b87ba820575f3c2ed1b798652fb89491b0f81f3244fea3383c5fb5" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.940835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerDied","Data":"4141048d16b87ba820575f3c2ed1b798652fb89491b0f81f3244fea3383c5fb5"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.940852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" event={"ID":"e89f10b6-405e-4c42-a2f5-97111102ec40","Type":"ContainerDied","Data":"9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.940862 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e3a01f8ac397d24ef807fbdfa0da4b9784248eeaa7dab52c8e9f6796bd2fd26" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.945474 5030 generic.go:334] "Generic (PLEG): container finished" podID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerID="d30b1e443b75aaab1d0e162daa649e2952963350a5285a88628501469adfbea4" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.945527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerDied","Data":"d30b1e443b75aaab1d0e162daa649e2952963350a5285a88628501469adfbea4"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.953970 5030 generic.go:334] "Generic (PLEG): container finished" podID="c862e826-c66e-4188-9d2a-096e6039a1af" containerID="71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.954124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerDied","Data":"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.954170 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c862e826-c66e-4188-9d2a-096e6039a1af","Type":"ContainerDied","Data":"1927c255e00e53c76852c6db4b5c544361a7692c44080d16e7b45d2dd23c4bc3"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.954257 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.959672 5030 generic.go:334] "Generic (PLEG): container finished" podID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerID="9b87dbe57785654f140e071cbe49ac4f1c7e04524db51c2dad9594ac25b3bdad" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.959729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerDied","Data":"9b87dbe57785654f140e071cbe49ac4f1c7e04524db51c2dad9594ac25b3bdad"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962047 5030 generic.go:334] "Generic (PLEG): container finished" podID="2b796eed-a508-43ca-9b25-c097b39474b7" containerID="83f9fdc7aa98bed3c6026ba325295aaed4096990435ab42d93cc5a1829c1757c" exitCode=0 Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962132 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962178 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerDied","Data":"83f9fdc7aa98bed3c6026ba325295aaed4096990435ab42d93cc5a1829c1757c"} Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962271 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-659a-account-create-update-nthp7" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962413 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-ndbv8" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962467 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-q48f7" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962501 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.962688 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" podStartSLOduration=5.962677478 podStartE2EDuration="5.962677478s" podCreationTimestamp="2026-01-20 23:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:26:22.954373117 +0000 UTC m=+3055.274633405" watchObservedRunningTime="2026-01-20 23:26:22.962677478 +0000 UTC m=+3055.282937766" Jan 20 23:26:22 crc kubenswrapper[5030]: I0120 23:26:22.963493 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.095048 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.109908 5030 scope.go:117] "RemoveContainer" containerID="8b8b11c6ac6050ac18676092493bc35bbf1121d7672ace5bea22fe163dd6aca3" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.147997 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.172266 5030 scope.go:117] "RemoveContainer" containerID="f9817a66ae09a052b512e6a6f843fe20c1e5f1eeabb2fd55c4ccbcf13f8f066c" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.174068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjglk\" (UniqueName: \"kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.179992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.180057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data\") pod \"e89f10b6-405e-4c42-a2f5-97111102ec40\" (UID: \"e89f10b6-405e-4c42-a2f5-97111102ec40\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.180269 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs" (OuterVolumeSpecName: "logs") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.180682 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e89f10b6-405e-4c42-a2f5-97111102ec40-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.182859 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-64d6-account-create-update-xvvgg"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.186009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts" (OuterVolumeSpecName: "scripts") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.186937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk" (OuterVolumeSpecName: "kube-api-access-bjglk") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "kube-api-access-bjglk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.191742 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.193969 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.220898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.232887 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.243533 5030 scope.go:117] "RemoveContainer" containerID="ee502ff2226c831285614d809d297680cd85514fa73adcd319bae1949aba7123" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.255261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.258566 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 is running failed: container process not found" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.263274 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 is running failed: container process not found" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.263860 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 is running failed: container process not found" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.263883 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerName="nova-cell1-conductor-conductor" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6fw8\" (UniqueName: 
\"kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.285893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs\") pod \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\" (UID: \"644cc3b2-e3d2-441c-8aa5-c3bec683e86c\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.286768 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.286784 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjglk\" (UniqueName: \"kubernetes.io/projected/e89f10b6-405e-4c42-a2f5-97111102ec40-kube-api-access-bjglk\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.286800 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.287264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs" (OuterVolumeSpecName: "logs") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.287347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.301499 5030 scope.go:117] "RemoveContainer" containerID="71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.317165 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.321561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data" (OuterVolumeSpecName: "config-data") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.323378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.329159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts" (OuterVolumeSpecName: "scripts") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.329844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8" (OuterVolumeSpecName: "kube-api-access-g6fw8") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "kube-api-access-g6fw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.337860 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.356835 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-659a-account-create-update-nthp7"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.372930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.373149 5030 scope.go:117] "RemoveContainer" containerID="7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.380210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.387946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom\") pod \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.387998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgqk7\" (UniqueName: \"kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7\") pod \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data\") pod \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388123 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle\") pod \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs\") pod \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\" (UID: \"17e14da0-805b-4832-b6dc-7dc6b829ee4f\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388711 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388730 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388740 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388749 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388761 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388771 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6fw8\" (UniqueName: \"kubernetes.io/projected/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-kube-api-access-g6fw8\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388779 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.388788 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.389075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs" (OuterVolumeSpecName: "logs") pod "17e14da0-805b-4832-b6dc-7dc6b829ee4f" (UID: "17e14da0-805b-4832-b6dc-7dc6b829ee4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.393884 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.396104 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.405725 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-q48f7"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.406040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.414669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7" (OuterVolumeSpecName: "kube-api-access-dgqk7") pod "17e14da0-805b-4832-b6dc-7dc6b829ee4f" (UID: "17e14da0-805b-4832-b6dc-7dc6b829ee4f"). InnerVolumeSpecName "kube-api-access-dgqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.417539 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.417949 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.421610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "17e14da0-805b-4832-b6dc-7dc6b829ee4f" (UID: "17e14da0-805b-4832-b6dc-7dc6b829ee4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.426173 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.428524 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.430787 5030 scope.go:117] "RemoveContainer" containerID="71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67" Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.431146 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67\": container with ID starting with 71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67 not found: ID does not exist" containerID="71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.431199 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67"} err="failed to get container status \"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67\": rpc error: code = NotFound desc = could not find container \"71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67\": container with ID starting with 71678e28fd03649c9a6db5bde20993e36008b33bbe2167361630434ed0b68f67 not found: ID does not exist" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.431222 5030 scope.go:117] "RemoveContainer" containerID="7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a" Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.431406 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a\": container with ID starting with 7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a not found: ID does not exist" containerID="7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.431426 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a"} err="failed to get container status \"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a\": rpc error: code = NotFound desc = could not find container \"7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a\": container with ID starting with 7f68f7419d2718f62707e650ef6382b95f3ad993626d8918bec9a05262072f2a not found: ID does not exist" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.432811 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.444437 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-ndbv8"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.453386 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.459339 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.462549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17e14da0-805b-4832-b6dc-7dc6b829ee4f" (UID: 
"17e14da0-805b-4832-b6dc-7dc6b829ee4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.469704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.474212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data" (OuterVolumeSpecName: "config-data") pod "644cc3b2-e3d2-441c-8aa5-c3bec683e86c" (UID: "644cc3b2-e3d2-441c-8aa5-c3bec683e86c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.474707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.480119 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.489483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle\") pod \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.489539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.489663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle\") pod \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.489698 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom\") pod \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490345 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcg4p\" (UniqueName: \"kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p\") pod \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 
23:26:23.490424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nhdd\" (UniqueName: \"kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd\") pod \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490594 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data\") pod \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\" (UID: \"e33b23a7-cf35-4fc2-8237-9633f3d807f3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr5f9\" (UniqueName: \"kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs\") pod \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom\") pod \"2b796eed-a508-43ca-9b25-c097b39474b7\" (UID: \"2b796eed-a508-43ca-9b25-c097b39474b7\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.490753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data\") pod \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\" (UID: \"b334c7e2-1209-4a89-b56d-72d97b6f16b3\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491332 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491353 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgqk7\" 
(UniqueName: \"kubernetes.io/projected/17e14da0-805b-4832-b6dc-7dc6b829ee4f-kube-api-access-dgqk7\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491370 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491383 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491396 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491409 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17e14da0-805b-4832-b6dc-7dc6b829ee4f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.491422 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/644cc3b2-e3d2-441c-8aa5-c3bec683e86c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.492299 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs" (OuterVolumeSpecName: "logs") pod "b334c7e2-1209-4a89-b56d-72d97b6f16b3" (UID: "b334c7e2-1209-4a89-b56d-72d97b6f16b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.494731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs" (OuterVolumeSpecName: "logs") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.497192 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.506029 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e89f10b6-405e-4c42-a2f5-97111102ec40" (UID: "e89f10b6-405e-4c42-a2f5-97111102ec40"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.507221 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.507656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p" (OuterVolumeSpecName: "kube-api-access-bcg4p") pod "b334c7e2-1209-4a89-b56d-72d97b6f16b3" (UID: "b334c7e2-1209-4a89-b56d-72d97b6f16b3"). InnerVolumeSpecName "kube-api-access-bcg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.507887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9" (OuterVolumeSpecName: "kube-api-access-mr5f9") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). InnerVolumeSpecName "kube-api-access-mr5f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.508167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd" (OuterVolumeSpecName: "kube-api-access-7nhdd") pod "e33b23a7-cf35-4fc2-8237-9633f3d807f3" (UID: "e33b23a7-cf35-4fc2-8237-9633f3d807f3"). InnerVolumeSpecName "kube-api-access-7nhdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.508783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b334c7e2-1209-4a89-b56d-72d97b6f16b3" (UID: "b334c7e2-1209-4a89-b56d-72d97b6f16b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.509850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data" (OuterVolumeSpecName: "config-data") pod "17e14da0-805b-4832-b6dc-7dc6b829ee4f" (UID: "17e14da0-805b-4832-b6dc-7dc6b829ee4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.571609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data" (OuterVolumeSpecName: "config-data") pod "e33b23a7-cf35-4fc2-8237-9633f3d807f3" (UID: "e33b23a7-cf35-4fc2-8237-9633f3d807f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.574903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e33b23a7-cf35-4fc2-8237-9633f3d807f3" (UID: "e33b23a7-cf35-4fc2-8237-9633f3d807f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.583467 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grnkc\" (UniqueName: \"kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.592814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle\") pod \"f9b215df-05ff-45fe-ac23-b5605150449b\" (UID: \"f9b215df-05ff-45fe-ac23-b5605150449b\") " Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593136 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593148 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17e14da0-805b-4832-b6dc-7dc6b829ee4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593158 5030 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593167 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593176 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e89f10b6-405e-4c42-a2f5-97111102ec40-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593185 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b796eed-a508-43ca-9b25-c097b39474b7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593195 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcg4p\" (UniqueName: \"kubernetes.io/projected/b334c7e2-1209-4a89-b56d-72d97b6f16b3-kube-api-access-bcg4p\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593205 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nhdd\" (UniqueName: \"kubernetes.io/projected/e33b23a7-cf35-4fc2-8237-9633f3d807f3-kube-api-access-7nhdd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593215 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e33b23a7-cf35-4fc2-8237-9633f3d807f3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593223 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr5f9\" (UniqueName: \"kubernetes.io/projected/2b796eed-a508-43ca-9b25-c097b39474b7-kube-api-access-mr5f9\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593232 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b334c7e2-1209-4a89-b56d-72d97b6f16b3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.593241 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.594597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b334c7e2-1209-4a89-b56d-72d97b6f16b3" (UID: "b334c7e2-1209-4a89-b56d-72d97b6f16b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.600243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data" (OuterVolumeSpecName: "config-data") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.602210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts" (OuterVolumeSpecName: "scripts") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.616281 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4 is running failed: container process not found" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.616682 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4 is running failed: container process not found" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.620056 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4 is running failed: container process not found" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.620107 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.621597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc" (OuterVolumeSpecName: "kube-api-access-grnkc") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "kube-api-access-grnkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.622163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.629256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.630800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.630881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data" (OuterVolumeSpecName: "config-data") pod "b334c7e2-1209-4a89-b56d-72d97b6f16b3" (UID: "b334c7e2-1209-4a89-b56d-72d97b6f16b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.633509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2b796eed-a508-43ca-9b25-c097b39474b7" (UID: "2b796eed-a508-43ca-9b25-c097b39474b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.638449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data" (OuterVolumeSpecName: "config-data") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.645131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.658917 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.661260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f9b215df-05ff-45fe-ac23-b5605150449b" (UID: "f9b215df-05ff-45fe-ac23-b5605150449b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694781 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694811 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694823 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grnkc\" (UniqueName: \"kubernetes.io/projected/f9b215df-05ff-45fe-ac23-b5605150449b-kube-api-access-grnkc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694834 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694842 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b796eed-a508-43ca-9b25-c097b39474b7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694851 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694860 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694869 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694880 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694889 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b334c7e2-1209-4a89-b56d-72d97b6f16b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694898 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694906 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.694915 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b215df-05ff-45fe-ac23-b5605150449b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:23 crc 
kubenswrapper[5030]: I0120 23:26:23.893435 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.902757 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:26:23 crc kubenswrapper[5030]: E0120 23:26:23.902835 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data podName:d756a967-e1dd-4c74-8bff-cd1fab21b251 nodeName:}" failed. No retries permitted until 2026-01-20 23:26:31.902818977 +0000 UTC m=+3064.223079265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data") pod "rabbitmq-server-0" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251") : configmap "rabbitmq-config-data" not found Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.972701 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0814026f-5519-40a1-94e8-379b5abac55f" path="/var/lib/kubelet/pods/0814026f-5519-40a1-94e8-379b5abac55f/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.973196 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1cc829-8d44-48c6-ab24-a8ac60596d08" path="/var/lib/kubelet/pods/0e1cc829-8d44-48c6-ab24-a8ac60596d08/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.973568 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" path="/var/lib/kubelet/pods/23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.974093 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261b1067-9a6b-4835-8b8c-e0d2c68a4a99" path="/var/lib/kubelet/pods/261b1067-9a6b-4835-8b8c-e0d2c68a4a99/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.975024 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a13994b-a11d-4798-9aae-b91a3d73d440" path="/var/lib/kubelet/pods/3a13994b-a11d-4798-9aae-b91a3d73d440/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.975474 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" path="/var/lib/kubelet/pods/3f82fbd9-cd1e-4d71-8d61-02126748c546/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.976300 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" path="/var/lib/kubelet/pods/66881282-2b6f-484d-a7e0-2b968306c5ae/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.977227 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.977355 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7467fbb5-44e9-42cf-874d-16fd1332885e" path="/var/lib/kubelet/pods/7467fbb5-44e9-42cf-874d-16fd1332885e/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.977719 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eedd3c6-0b24-49e4-81eb-0b21419f8739" path="/var/lib/kubelet/pods/8eedd3c6-0b24-49e4-81eb-0b21419f8739/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.977996 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07cdd7a-1d57-4d2a-9910-54d0e6e186f5" path="/var/lib/kubelet/pods/c07cdd7a-1d57-4d2a-9910-54d0e6e186f5/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.979588 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" path="/var/lib/kubelet/pods/c256a299-ef10-4eab-ae00-d4116a91a108/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.980878 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9" path="/var/lib/kubelet/pods/c2c7cfe3-e810-4db4-b2b4-4821bd27d7c9/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.981595 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" path="/var/lib/kubelet/pods/c862e826-c66e-4188-9d2a-096e6039a1af/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.982123 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.982570 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8d05314-fc56-482b-b1a9-d7f5e40b6336" path="/var/lib/kubelet/pods/c8d05314-fc56-482b-b1a9-d7f5e40b6336/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.983112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" path="/var/lib/kubelet/pods/d37b566e-a870-4b0d-af53-29da770c8a0a/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.984030 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6231555-bba1-41e1-8c8b-5ba3349cc6a4" path="/var/lib/kubelet/pods/d6231555-bba1-41e1-8c8b-5ba3349cc6a4/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.984453 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" path="/var/lib/kubelet/pods/e2840a28-0f62-4378-bbd0-a07833eb25c7/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.984996 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.985740 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" path="/var/lib/kubelet/pods/f239887a-e1d2-484e-96ce-e4bb56b1bc75/volumes" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.993143 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_d380087b-96be-407d-8c8c-f00ecaa3a0e8/ovn-northd/0.log" Jan 20 23:26:23 crc kubenswrapper[5030]: I0120 23:26:23.993182 5030 generic.go:334] "Generic (PLEG): container finished" podID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" exitCode=139 Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.003130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle\") pod \"747ac61a-5b10-46ef-8a34-68b43c777d96\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.003221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data\") pod \"747ac61a-5b10-46ef-8a34-68b43c777d96\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.003401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffx9h\" (UniqueName: \"kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h\") pod \"747ac61a-5b10-46ef-8a34-68b43c777d96\" (UID: \"747ac61a-5b10-46ef-8a34-68b43c777d96\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.008549 5030 generic.go:334] "Generic (PLEG): container finished" podID="f9b215df-05ff-45fe-ac23-b5605150449b" containerID="c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe" exitCode=0 Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.008699 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.019811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h" (OuterVolumeSpecName: "kube-api-access-ffx9h") pod "747ac61a-5b10-46ef-8a34-68b43c777d96" (UID: "747ac61a-5b10-46ef-8a34-68b43c777d96"). InnerVolumeSpecName "kube-api-access-ffx9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.021958 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.028114 5030 generic.go:334] "Generic (PLEG): container finished" podID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" exitCode=0 Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.028221 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.030975 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.032578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data" (OuterVolumeSpecName: "config-data") pod "747ac61a-5b10-46ef-8a34-68b43c777d96" (UID: "747ac61a-5b10-46ef-8a34-68b43c777d96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.033503 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5946fbdfbb-6s7qt" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.037516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "747ac61a-5b10-46ef-8a34-68b43c777d96" (UID: "747ac61a-5b10-46ef-8a34-68b43c777d96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj" event={"ID":"17e14da0-805b-4832-b6dc-7dc6b829ee4f","Type":"ContainerDied","Data":"dfb375b6b63fd96c35e67db267a5ee7abdae860b615804420073a47dcdb89a27"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7" event={"ID":"2b796eed-a508-43ca-9b25-c097b39474b7","Type":"ContainerDied","Data":"f147cd38fce61595e1d4a805fcbbb6b04012132e47915a6b7f4270822781d6c7"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk" event={"ID":"b334c7e2-1209-4a89-b56d-72d97b6f16b3","Type":"ContainerDied","Data":"960095848ea85050d361d6661010f9e4804b844498acf033464e6e06806fde20"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerDied","Data":"352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"d380087b-96be-407d-8c8c-f00ecaa3a0e8","Type":"ContainerDied","Data":"489bdd58f58a414c8771258a9930b2d1895870bb46f3f6c283d07c1077dd05a2"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071264 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489bdd58f58a414c8771258a9930b2d1895870bb46f3f6c283d07c1077dd05a2" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" event={"ID":"f9b215df-05ff-45fe-ac23-b5605150449b","Type":"ContainerDied","Data":"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 
23:26:24.071292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-75f85d6df9-6nmsl" event={"ID":"f9b215df-05ff-45fe-ac23-b5605150449b","Type":"ContainerDied","Data":"bdf0cc7dddd8060bd93d13cdd78e48bd513874d88254877d97143986099ca496"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"644cc3b2-e3d2-441c-8aa5-c3bec683e86c","Type":"ContainerDied","Data":"9ecbaad4cafc0c70aaa295183019d98d4b5571eb21b267640e04c91eae93a431"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"747ac61a-5b10-46ef-8a34-68b43c777d96","Type":"ContainerDied","Data":"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"747ac61a-5b10-46ef-8a34-68b43c777d96","Type":"ContainerDied","Data":"689fb66d9fd62d419d62306c65d758bc0231fafd80e34a96ff076ba5c12fc94c"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071341 5030 scope.go:117] "RemoveContainer" containerID="d30b1e443b75aaab1d0e162daa649e2952963350a5285a88628501469adfbea4" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.071354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e33b23a7-cf35-4fc2-8237-9633f3d807f3","Type":"ContainerDied","Data":"147bc330fa3f24ee5067c7881ac9e25763bbdb0fa4b4e37574f9cfdb8c785d62"} Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.090607 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_d380087b-96be-407d-8c8c-f00ecaa3a0e8/ovn-northd/0.log" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.090710 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.097359 5030 scope.go:117] "RemoveContainer" containerID="f3143e4041bdf06c6947b874cc740775ab9c9ec6001b798510175cc97e744a7e" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.105601 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.105637 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/747ac61a-5b10-46ef-8a34-68b43c777d96-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.105649 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffx9h\" (UniqueName: \"kubernetes.io/projected/747ac61a-5b10-46ef-8a34-68b43c777d96-kube-api-access-ffx9h\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.119720 5030 scope.go:117] "RemoveContainer" containerID="83f9fdc7aa98bed3c6026ba325295aaed4096990435ab42d93cc5a1829c1757c" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.156286 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.157364 5030 scope.go:117] "RemoveContainer" containerID="b8717a3cdbefcbd0c0ff07aabc3b13b4414654b84eeec748b31cb1d195d654c9" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.167506 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-84cc57bc46-58glj"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.175806 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.182287 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.188851 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.192420 5030 scope.go:117] "RemoveContainer" containerID="75c9449a8a21d460c37176286c18c54720a652f60cdf7d26b5616d4f02353a1a" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.197477 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.204139 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95fbd\" (UniqueName: \"kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc 
kubenswrapper[5030]: I0120 23:26:24.206371 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.206587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts\") pod \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\" (UID: \"d380087b-96be-407d-8c8c-f00ecaa3a0e8\") " Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.207723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts" (OuterVolumeSpecName: "scripts") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.209446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.209587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config" (OuterVolumeSpecName: "config") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.212345 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-6c8cb8dc59-t28xk"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.213771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd" (OuterVolumeSpecName: "kube-api-access-95fbd") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "kube-api-access-95fbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.218817 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.224613 5030 scope.go:117] "RemoveContainer" containerID="6b496bef4ca4d0433913ed3347b495cb73325fdbef015e1d05cc61795bb60789" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.225726 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5ccdc9797b-mz9r7"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.231290 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.236087 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-75f85d6df9-6nmsl"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.241922 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.245050 5030 scope.go:117] "RemoveContainer" containerID="c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.246159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.246936 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5946fbdfbb-6s7qt"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.271941 5030 scope.go:117] "RemoveContainer" containerID="c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe" Jan 20 23:26:24 crc kubenswrapper[5030]: E0120 23:26:24.273506 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe\": container with ID starting with c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe not found: ID does not exist" containerID="c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.273570 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe"} err="failed to get container status \"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe\": rpc error: code = NotFound desc = could not find container \"c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe\": container with ID starting with c61aa87934994b07e39822ecebbaf1cc3cb4d84929e81efdf4034ad0fdad39fe not found: ID does not exist" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.273595 5030 scope.go:117] "RemoveContainer" containerID="9b87dbe57785654f140e071cbe49ac4f1c7e04524db51c2dad9594ac25b3bdad" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.287471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.288009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d380087b-96be-407d-8c8c-f00ecaa3a0e8" (UID: "d380087b-96be-407d-8c8c-f00ecaa3a0e8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.297431 5030 scope.go:117] "RemoveContainer" containerID="d8ebde37b10c53ca483ade68635a4736300b159d4b22e45b3f8dec9db7455d00" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.321436 5030 scope.go:117] "RemoveContainer" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322091 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322330 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95fbd\" (UniqueName: \"kubernetes.io/projected/d380087b-96be-407d-8c8c-f00ecaa3a0e8-kube-api-access-95fbd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322352 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322387 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d380087b-96be-407d-8c8c-f00ecaa3a0e8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322398 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322409 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.322424 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d380087b-96be-407d-8c8c-f00ecaa3a0e8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.340918 5030 scope.go:117] "RemoveContainer" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" Jan 20 23:26:24 crc kubenswrapper[5030]: E0120 23:26:24.342422 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69\": container with ID starting with fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 not found: ID does not exist" containerID="fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.342454 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69"} err="failed to get container status \"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69\": rpc error: code = NotFound desc = could not find container \"fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69\": container with ID starting with fdf9219612cfa5c3c5c7f57285d8cd6648a0c005e87ba80ca08f326ec322bb69 not found: ID does not exist" Jan 20 23:26:24 crc 
kubenswrapper[5030]: I0120 23:26:24.342475 5030 scope.go:117] "RemoveContainer" containerID="60ba47fc9818c41be393330baa0d4357fed4f1a59bdc3aa0b5a186691e89940c" Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.362717 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.367855 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:26:24 crc kubenswrapper[5030]: I0120 23:26:24.849983 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.66:9696/\": dial tcp 10.217.1.66:9696: connect: connection refused" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.072947 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.129337 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.142820 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.403556 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.6:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.403589 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-55d95d9dc5-4zf8z" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.1.6:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:26:25 crc kubenswrapper[5030]: E0120 23:26:25.448469 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:25 crc kubenswrapper[5030]: E0120 23:26:25.448560 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data podName:fb76fff8-05c4-4ec5-b00b-17dfd94e5dea nodeName:}" failed. No retries permitted until 2026-01-20 23:26:33.448540675 +0000 UTC m=+3065.768800963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data") pod "rabbitmq-cell1-server-0" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.841501 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956512 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4xch\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956552 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: 
\"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.956930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls\") pod \"d756a967-e1dd-4c74-8bff-cd1fab21b251\" (UID: \"d756a967-e1dd-4c74-8bff-cd1fab21b251\") " Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.957258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.958349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.959164 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.962885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.963000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch" (OuterVolumeSpecName: "kube-api-access-q4xch") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "kube-api-access-q4xch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.963444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.964997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.970303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info" (OuterVolumeSpecName: "pod-info") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.981447 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" path="/var/lib/kubelet/pods/17e14da0-805b-4832-b6dc-7dc6b829ee4f/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.982890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data" (OuterVolumeSpecName: "config-data") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.983325 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" path="/var/lib/kubelet/pods/2b796eed-a508-43ca-9b25-c097b39474b7/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.984673 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" path="/var/lib/kubelet/pods/644cc3b2-e3d2-441c-8aa5-c3bec683e86c/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.986281 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" path="/var/lib/kubelet/pods/747ac61a-5b10-46ef-8a34-68b43c777d96/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.988724 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" path="/var/lib/kubelet/pods/b334c7e2-1209-4a89-b56d-72d97b6f16b3/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.990450 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" path="/var/lib/kubelet/pods/d380087b-96be-407d-8c8c-f00ecaa3a0e8/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.991751 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" path="/var/lib/kubelet/pods/e33b23a7-cf35-4fc2-8237-9633f3d807f3/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.993782 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" path="/var/lib/kubelet/pods/e89f10b6-405e-4c42-a2f5-97111102ec40/volumes" Jan 20 23:26:25 crc kubenswrapper[5030]: I0120 23:26:25.996427 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b215df-05ff-45fe-ac23-b5605150449b" path="/var/lib/kubelet/pods/f9b215df-05ff-45fe-ac23-b5605150449b/volumes" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.011744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf" (OuterVolumeSpecName: "server-conf") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.058489 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.058743 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.060507 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d756a967-e1dd-4c74-8bff-cd1fab21b251-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.060653 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.060795 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d756a967-e1dd-4c74-8bff-cd1fab21b251-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.060891 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.060981 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4xch\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-kube-api-access-q4xch\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.061069 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.061157 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.061244 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d756a967-e1dd-4c74-8bff-cd1fab21b251-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.074296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d756a967-e1dd-4c74-8bff-cd1fab21b251" (UID: "d756a967-e1dd-4c74-8bff-cd1fab21b251"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.086084 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.089174 5030 generic.go:334] "Generic (PLEG): container finished" podID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerID="799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5" exitCode=0 Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.089286 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.095123 5030 generic.go:334] "Generic (PLEG): container finished" podID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerID="38aa5aa26bdb0f07041118403707547bc1981c26790c4adec8154899b480eb0b" exitCode=0 Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.109341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerDied","Data":"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5"} Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.109496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"d756a967-e1dd-4c74-8bff-cd1fab21b251","Type":"ContainerDied","Data":"f7bdd4ff06612594fb30a7ef26696dafe543525cbc7978427d22fa3d8358ca09"} Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.109694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerDied","Data":"38aa5aa26bdb0f07041118403707547bc1981c26790c4adec8154899b480eb0b"} Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.109795 5030 scope.go:117] "RemoveContainer" containerID="799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.132042 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.134018 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.139231 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.144060 5030 scope.go:117] "RemoveContainer" containerID="d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.163645 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d756a967-e1dd-4c74-8bff-cd1fab21b251-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.163681 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.172614 5030 scope.go:117] "RemoveContainer" containerID="799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5" Jan 20 23:26:26 crc kubenswrapper[5030]: E0120 23:26:26.173185 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5\": container with ID starting with 799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5 not found: ID does not exist" containerID="799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.173247 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5"} err="failed to get container status \"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5\": rpc error: code = NotFound desc = could not find container \"799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5\": container with ID starting with 799435f2bd96b95034bb6179f636afcf29a888ed0516e8085ed21b16561d9ed5 not found: ID does not exist" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.173272 5030 scope.go:117] "RemoveContainer" containerID="d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a" Jan 20 23:26:26 crc kubenswrapper[5030]: E0120 23:26:26.173611 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a\": container with ID starting with d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a not found: ID does not exist" containerID="d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.173662 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a"} err="failed to get container status \"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a\": rpc error: code = NotFound desc = could not find container \"d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a\": container with ID starting with d693603358f10c03e2888a431cc7e41f22f486590841957c05e5d48b491a918a not found: ID does not exist" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.264864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.264948 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrzfw\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd\") pod \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\" (UID: \"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea\") " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265821 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.265846 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.267179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.269563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.269568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.269999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw" (OuterVolumeSpecName: "kube-api-access-lrzfw") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "kube-api-access-lrzfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.270014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.270241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info" (OuterVolumeSpecName: "pod-info") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.283766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data" (OuterVolumeSpecName: "config-data") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.309449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf" (OuterVolumeSpecName: "server-conf") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367689 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrzfw\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-kube-api-access-lrzfw\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367787 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367807 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367827 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367844 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367861 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.367879 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.374510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" (UID: "fb76fff8-05c4-4ec5-b00b-17dfd94e5dea"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.399729 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.469314 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:26 crc kubenswrapper[5030]: I0120 23:26:26.469366 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.112591 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.112600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"fb76fff8-05c4-4ec5-b00b-17dfd94e5dea","Type":"ContainerDied","Data":"85e036d5c8b7cd237180c3996382ff3a3fd4c46e036d082f84bf307b06c9a89d"} Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.113543 5030 scope.go:117] "RemoveContainer" containerID="38aa5aa26bdb0f07041118403707547bc1981c26790c4adec8154899b480eb0b" Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.159946 5030 scope.go:117] "RemoveContainer" containerID="9940bbb038d38a74af92d31f66d01f141a002adaecb98d66837c57666c0f664f" Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.180751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:26:27 crc kubenswrapper[5030]: I0120 23:26:27.192298 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:26:28 crc kubenswrapper[5030]: I0120 23:26:28.004172 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" path="/var/lib/kubelet/pods/d756a967-e1dd-4c74-8bff-cd1fab21b251/volumes" Jan 20 23:26:28 crc kubenswrapper[5030]: I0120 23:26:28.006862 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" path="/var/lib/kubelet/pods/fb76fff8-05c4-4ec5-b00b-17dfd94e5dea/volumes" Jan 20 23:26:28 crc kubenswrapper[5030]: I0120 23:26:28.962323 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:26:28 crc kubenswrapper[5030]: E0120 23:26:28.962782 5030 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.350920 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.433141 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.433429 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="dnsmasq-dns" containerID="cri-o://a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2" gracePeriod=10 Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.901557 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.948261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc\") pod \"398a1488-8639-4110-8d25-2f5ff7e73a30\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.948319 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config\") pod \"398a1488-8639-4110-8d25-2f5ff7e73a30\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.948428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dprsl\" (UniqueName: \"kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl\") pod \"398a1488-8639-4110-8d25-2f5ff7e73a30\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.950401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0\") pod \"398a1488-8639-4110-8d25-2f5ff7e73a30\" (UID: \"398a1488-8639-4110-8d25-2f5ff7e73a30\") " Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.956763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl" (OuterVolumeSpecName: "kube-api-access-dprsl") pod "398a1488-8639-4110-8d25-2f5ff7e73a30" (UID: "398a1488-8639-4110-8d25-2f5ff7e73a30"). InnerVolumeSpecName "kube-api-access-dprsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.997174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "398a1488-8639-4110-8d25-2f5ff7e73a30" (UID: "398a1488-8639-4110-8d25-2f5ff7e73a30"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:29 crc kubenswrapper[5030]: I0120 23:26:29.999468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "398a1488-8639-4110-8d25-2f5ff7e73a30" (UID: "398a1488-8639-4110-8d25-2f5ff7e73a30"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.006339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config" (OuterVolumeSpecName: "config") pod "398a1488-8639-4110-8d25-2f5ff7e73a30" (UID: "398a1488-8639-4110-8d25-2f5ff7e73a30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.051826 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dprsl\" (UniqueName: \"kubernetes.io/projected/398a1488-8639-4110-8d25-2f5ff7e73a30-kube-api-access-dprsl\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.051867 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.051878 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.051888 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/398a1488-8639-4110-8d25-2f5ff7e73a30-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.155119 5030 generic.go:334] "Generic (PLEG): container finished" podID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerID="a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2" exitCode=0 Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.155173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" event={"ID":"398a1488-8639-4110-8d25-2f5ff7e73a30","Type":"ContainerDied","Data":"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2"} Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.155208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" event={"ID":"398a1488-8639-4110-8d25-2f5ff7e73a30","Type":"ContainerDied","Data":"ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41"} Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.155232 5030 scope.go:117] "RemoveContainer" containerID="a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2" Jan 20 23:26:30 crc 
kubenswrapper[5030]: I0120 23:26:30.155363 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.190573 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.196298 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cc7959f89-zcnnh"] Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.196472 5030 scope.go:117] "RemoveContainer" containerID="bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.219744 5030 scope.go:117] "RemoveContainer" containerID="a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2" Jan 20 23:26:30 crc kubenswrapper[5030]: E0120 23:26:30.220159 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2\": container with ID starting with a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2 not found: ID does not exist" containerID="a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.220196 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2"} err="failed to get container status \"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2\": rpc error: code = NotFound desc = could not find container \"a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2\": container with ID starting with a90ceeb6bf9c370a55379d97e1f6e03ab4adc0d905acdce8f70731f3ec2d5aa2 not found: ID does not exist" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.220218 5030 scope.go:117] "RemoveContainer" containerID="bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46" Jan 20 23:26:30 crc kubenswrapper[5030]: E0120 23:26:30.220428 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46\": container with ID starting with bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46 not found: ID does not exist" containerID="bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46" Jan 20 23:26:30 crc kubenswrapper[5030]: I0120 23:26:30.220472 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46"} err="failed to get container status \"bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46\": rpc error: code = NotFound desc = could not find container \"bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46\": container with ID starting with bf5c5b86b57ae306e665f1225382e53638a0f5f9e6613b8e0eb8cc33cff28a46 not found: ID does not exist" Jan 20 23:26:31 crc kubenswrapper[5030]: I0120 23:26:31.977028 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" path="/var/lib/kubelet/pods/398a1488-8639-4110-8d25-2f5ff7e73a30/volumes" Jan 20 23:26:39 crc kubenswrapper[5030]: I0120 23:26:39.962186 5030 scope.go:117] "RemoveContainer" 
containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:26:39 crc kubenswrapper[5030]: E0120 23:26:39.963098 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:26:42 crc kubenswrapper[5030]: E0120 23:26:42.304664 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/c4ac2833e5fd0c6c273da68ea801b9f12c862b77ee34f063259a6c479a734b23/diff" to get inode usage: stat /var/lib/containers/storage/overlay/c4ac2833e5fd0c6c273da68ea801b9f12c862b77ee34f063259a6c479a734b23/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-6cc7959f89-zcnnh_398a1488-8639-4110-8d25-2f5ff7e73a30/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-6cc7959f89-zcnnh_398a1488-8639-4110-8d25-2f5ff7e73a30/dnsmasq-dns/0.log: no such file or directory Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.244896 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.321247 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache\") pod \"1c1f0eab-495a-4664-9852-3e6488194586\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.321335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"1c1f0eab-495a-4664-9852-3e6488194586\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.321379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock\") pod \"1c1f0eab-495a-4664-9852-3e6488194586\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.321466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrtpp\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp\") pod \"1c1f0eab-495a-4664-9852-3e6488194586\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.321539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift\") pod \"1c1f0eab-495a-4664-9852-3e6488194586\" (UID: \"1c1f0eab-495a-4664-9852-3e6488194586\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.322099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache" (OuterVolumeSpecName: "cache") pod "1c1f0eab-495a-4664-9852-3e6488194586" (UID: "1c1f0eab-495a-4664-9852-3e6488194586"). 
InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.322281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock" (OuterVolumeSpecName: "lock") pod "1c1f0eab-495a-4664-9852-3e6488194586" (UID: "1c1f0eab-495a-4664-9852-3e6488194586"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.327181 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "swift") pod "1c1f0eab-495a-4664-9852-3e6488194586" (UID: "1c1f0eab-495a-4664-9852-3e6488194586"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.328059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp" (OuterVolumeSpecName: "kube-api-access-wrtpp") pod "1c1f0eab-495a-4664-9852-3e6488194586" (UID: "1c1f0eab-495a-4664-9852-3e6488194586"). InnerVolumeSpecName "kube-api-access-wrtpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.328184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1c1f0eab-495a-4664-9852-3e6488194586" (UID: "1c1f0eab-495a-4664-9852-3e6488194586"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.413339 5030 generic.go:334] "Generic (PLEG): container finished" podID="1c1f0eab-495a-4664-9852-3e6488194586" containerID="052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857" exitCode=137 Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.413425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857"} Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.413456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1c1f0eab-495a-4664-9852-3e6488194586","Type":"ContainerDied","Data":"b6f075fb50ad4968fe34a14726ac474d4e470b514b1278364c1229eef793355d"} Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.413487 5030 scope.go:117] "RemoveContainer" containerID="052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.413701 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.419352 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7bd69c474b-2t4sm_c991f145-efc7-4968-b73a-d4cf085acee0/neutron-api/0.log" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.419411 5030 generic.go:334] "Generic (PLEG): container finished" podID="c991f145-efc7-4968-b73a-d4cf085acee0" containerID="b5a3121b46f09ab54fc42a4e60030abec28c4ebcaabff01ecef271e79eda8bce" exitCode=137 Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.419445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerDied","Data":"b5a3121b46f09ab54fc42a4e60030abec28c4ebcaabff01ecef271e79eda8bce"} Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.423448 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrtpp\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-kube-api-access-wrtpp\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.423483 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1c1f0eab-495a-4664-9852-3e6488194586-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.423493 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.423529 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.423539 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1c1f0eab-495a-4664-9852-3e6488194586-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.445875 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.452964 5030 scope.go:117] "RemoveContainer" containerID="df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.454071 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.461134 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.470734 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7bd69c474b-2t4sm_c991f145-efc7-4968-b73a-d4cf085acee0/neutron-api/0.log" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.470807 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.476398 5030 scope.go:117] "RemoveContainer" containerID="3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.495753 5030 scope.go:117] "RemoveContainer" containerID="d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.518185 5030 scope.go:117] "RemoveContainer" containerID="5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6q6\" (UniqueName: \"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.524759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.525068 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs\") pod \"c991f145-efc7-4968-b73a-d4cf085acee0\" (UID: \"c991f145-efc7-4968-b73a-d4cf085acee0\") " Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.525319 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.528044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6" (OuterVolumeSpecName: "kube-api-access-2h6q6") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "kube-api-access-2h6q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.528576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.537792 5030 scope.go:117] "RemoveContainer" containerID="8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.561916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config" (OuterVolumeSpecName: "config") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.569686 5030 scope.go:117] "RemoveContainer" containerID="be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.573155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.573755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.574170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.599253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c991f145-efc7-4968-b73a-d4cf085acee0" (UID: "c991f145-efc7-4968-b73a-d4cf085acee0"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.616866 5030 scope.go:117] "RemoveContainer" containerID="603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.626982 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627012 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627050 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627060 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627069 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627081 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6q6\" (UniqueName: \"kubernetes.io/projected/c991f145-efc7-4968-b73a-d4cf085acee0-kube-api-access-2h6q6\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.627090 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c991f145-efc7-4968-b73a-d4cf085acee0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.642128 5030 scope.go:117] "RemoveContainer" containerID="e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.658016 5030 scope.go:117] "RemoveContainer" containerID="a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.673748 5030 scope.go:117] "RemoveContainer" containerID="3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.693049 5030 scope.go:117] "RemoveContainer" containerID="e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.708918 5030 scope.go:117] "RemoveContainer" containerID="7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.725484 5030 scope.go:117] "RemoveContainer" containerID="2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.746530 5030 scope.go:117] "RemoveContainer" containerID="9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.777170 5030 scope.go:117] "RemoveContainer" containerID="052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857" Jan 20 23:26:49 
crc kubenswrapper[5030]: E0120 23:26:49.777503 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857\": container with ID starting with 052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857 not found: ID does not exist" containerID="052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.777540 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857"} err="failed to get container status \"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857\": rpc error: code = NotFound desc = could not find container \"052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857\": container with ID starting with 052f7253ce8f20b7006896c49be8c049c88f69895a3e3717387a4a8776731857 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.777559 5030 scope.go:117] "RemoveContainer" containerID="df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.777941 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893\": container with ID starting with df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893 not found: ID does not exist" containerID="df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.777999 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893"} err="failed to get container status \"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893\": rpc error: code = NotFound desc = could not find container \"df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893\": container with ID starting with df6ef689e710dba22c29a499696e2c2009cc0478473059ce8d655ee5a1654893 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778032 5030 scope.go:117] "RemoveContainer" containerID="3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.778289 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b\": container with ID starting with 3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b not found: ID does not exist" containerID="3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778317 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b"} err="failed to get container status \"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b\": rpc error: code = NotFound desc = could not find container \"3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b\": container with ID starting with 3896b811d93df67a1c086c1bade3178d6f587e711b21f8ed76c06b9949f5320b not found: ID does not exist" Jan 20 23:26:49 crc 
kubenswrapper[5030]: I0120 23:26:49.778333 5030 scope.go:117] "RemoveContainer" containerID="d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.778555 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af\": container with ID starting with d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af not found: ID does not exist" containerID="d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778576 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af"} err="failed to get container status \"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af\": rpc error: code = NotFound desc = could not find container \"d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af\": container with ID starting with d33aa21651847a416850ac9ea366856f99b5efa24607b2c97ee28cb5316792af not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778589 5030 scope.go:117] "RemoveContainer" containerID="5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.778819 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a\": container with ID starting with 5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a not found: ID does not exist" containerID="5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778879 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a"} err="failed to get container status \"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a\": rpc error: code = NotFound desc = could not find container \"5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a\": container with ID starting with 5b32a554719001bcf616841e4db5c2484999d2d6bb3cce476354aa6956e8471a not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.778904 5030 scope.go:117] "RemoveContainer" containerID="8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.783032 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a\": container with ID starting with 8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a not found: ID does not exist" containerID="8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.783062 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a"} err="failed to get container status \"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a\": rpc error: code = NotFound desc = could not find container 
\"8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a\": container with ID starting with 8d7dbc2d97335bd333e92228100d852a9f983098e53579130b62dd6a10f6e90a not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.783080 5030 scope.go:117] "RemoveContainer" containerID="be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.783869 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda\": container with ID starting with be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda not found: ID does not exist" containerID="be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.783892 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda"} err="failed to get container status \"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda\": rpc error: code = NotFound desc = could not find container \"be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda\": container with ID starting with be5d70fece74ed23d3912dcecfead32af8e6a34f87d9ad333fc71f20d7ff9eda not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.783906 5030 scope.go:117] "RemoveContainer" containerID="603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.784223 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4\": container with ID starting with 603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4 not found: ID does not exist" containerID="603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.784270 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4"} err="failed to get container status \"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4\": rpc error: code = NotFound desc = could not find container \"603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4\": container with ID starting with 603531bd19dc85d08fbb782f3b219e34a57b5c3b586d9e1cf64a8141a3e628f4 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.784303 5030 scope.go:117] "RemoveContainer" containerID="e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.784565 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398a1488_8639_4110_8d25_2f5ff7e73a30.slice/crio-ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41\": RecentStats: unable to find data in memory cache]" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.784810 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2\": container with ID starting with 
e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2 not found: ID does not exist" containerID="e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.784844 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2"} err="failed to get container status \"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2\": rpc error: code = NotFound desc = could not find container \"e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2\": container with ID starting with e9dc24048dc188535b8295f3ab283fcdad761f4bfaf1aac43f23a6f6b734faf2 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.784862 5030 scope.go:117] "RemoveContainer" containerID="a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.785163 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da\": container with ID starting with a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da not found: ID does not exist" containerID="a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.785197 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da"} err="failed to get container status \"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da\": rpc error: code = NotFound desc = could not find container \"a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da\": container with ID starting with a9cb444112598e53947461f42b9f15ca62053dfe7dfd69670749b0326697f5da not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.785220 5030 scope.go:117] "RemoveContainer" containerID="3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.785488 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3\": container with ID starting with 3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3 not found: ID does not exist" containerID="3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.785519 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3"} err="failed to get container status \"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3\": rpc error: code = NotFound desc = could not find container \"3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3\": container with ID starting with 3d951ed926c631ff4318b93467ee0df5ca43db2435a2c42f8b6dc53ec4fc91c3 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.785539 5030 scope.go:117] "RemoveContainer" containerID="e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.785991 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8\": container with ID starting with e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8 not found: ID does not exist" containerID="e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786030 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8"} err="failed to get container status \"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8\": rpc error: code = NotFound desc = could not find container \"e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8\": container with ID starting with e0b30913b14fa56da284823ebc9d76041b474ec2b3ea78132cd310dd05b97ce8 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786052 5030 scope.go:117] "RemoveContainer" containerID="7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.786407 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c\": container with ID starting with 7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c not found: ID does not exist" containerID="7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786437 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c"} err="failed to get container status \"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c\": rpc error: code = NotFound desc = could not find container \"7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c\": container with ID starting with 7cd3a2e6ea294102d34df6f1bba4990edb60bc783f665abcaa08bc642403453c not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786453 5030 scope.go:117] "RemoveContainer" containerID="2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.786688 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374\": container with ID starting with 2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374 not found: ID does not exist" containerID="2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786716 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374"} err="failed to get container status \"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374\": rpc error: code = NotFound desc = could not find container \"2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374\": container with ID starting with 2683362cd99b8235f265be3f4440d62cdb3c454faec2d39def79a9949930d374 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.786740 5030 scope.go:117] "RemoveContainer" 
containerID="9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9" Jan 20 23:26:49 crc kubenswrapper[5030]: E0120 23:26:49.787002 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9\": container with ID starting with 9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9 not found: ID does not exist" containerID="9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.787038 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9"} err="failed to get container status \"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9\": rpc error: code = NotFound desc = could not find container \"9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9\": container with ID starting with 9181d3d458c39873908d9b62be1f43c09bde580a5a92747fa92c7120b3c67bd9 not found: ID does not exist" Jan 20 23:26:49 crc kubenswrapper[5030]: I0120 23:26:49.975764 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1f0eab-495a-4664-9852-3e6488194586" path="/var/lib/kubelet/pods/1c1f0eab-495a-4664-9852-3e6488194586/volumes" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.395566 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436777 5030 generic.go:334] "Generic (PLEG): container finished" podID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerID="68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" exitCode=137 Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerDied","Data":"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a"} Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436887 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436911 5030 scope.go:117] "RemoveContainer" containerID="68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436824 5030 generic.go:334] "Generic (PLEG): container finished" podID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerID="dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" exitCode=137 Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436909 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.436893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerDied","Data":"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac"} Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6cc0aba0-a0c7-4b7b-9b91-031765f25948","Type":"ContainerDied","Data":"405be7b843edf098a15353de75dc7e5075b85eb3553f3e422355e25bd6872214"} Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr6c9\" (UniqueName: \"kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts\") pod \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\" (UID: \"6cc0aba0-a0c7-4b7b-9b91-031765f25948\") " Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.437420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.438376 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cc0aba0-a0c7-4b7b-9b91-031765f25948-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.444617 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.444782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts" (OuterVolumeSpecName: "scripts") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.444883 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7bd69c474b-2t4sm_c991f145-efc7-4968-b73a-d4cf085acee0/neutron-api/0.log" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.444945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" event={"ID":"c991f145-efc7-4968-b73a-d4cf085acee0","Type":"ContainerDied","Data":"7c4d557daa2ddc003d00ec3c03074a00cac4f35a5527a4697c964d1d28f8f0cd"} Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.445072 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7bd69c474b-2t4sm" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.445116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9" (OuterVolumeSpecName: "kube-api-access-vr6c9") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "kube-api-access-vr6c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.501983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.539380 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.539686 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.539806 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr6c9\" (UniqueName: \"kubernetes.io/projected/6cc0aba0-a0c7-4b7b-9b91-031765f25948-kube-api-access-vr6c9\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.539891 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.545744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data" (OuterVolumeSpecName: "config-data") pod "6cc0aba0-a0c7-4b7b-9b91-031765f25948" (UID: "6cc0aba0-a0c7-4b7b-9b91-031765f25948"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.547580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.550030 5030 scope.go:117] "RemoveContainer" containerID="dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.553637 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7bd69c474b-2t4sm"] Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.572243 5030 scope.go:117] "RemoveContainer" containerID="68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" Jan 20 23:26:50 crc kubenswrapper[5030]: E0120 23:26:50.572755 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a\": container with ID starting with 68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a not found: ID does not exist" containerID="68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.572797 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a"} err="failed to get container status \"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a\": rpc error: code = NotFound desc = could not find container \"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a\": container with ID starting with 68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a not found: ID does not exist" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.572826 5030 scope.go:117] "RemoveContainer" containerID="dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" Jan 20 23:26:50 crc kubenswrapper[5030]: E0120 
23:26:50.573254 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac\": container with ID starting with dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac not found: ID does not exist" containerID="dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573289 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac"} err="failed to get container status \"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac\": rpc error: code = NotFound desc = could not find container \"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac\": container with ID starting with dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac not found: ID does not exist" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573314 5030 scope.go:117] "RemoveContainer" containerID="68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573675 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a"} err="failed to get container status \"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a\": rpc error: code = NotFound desc = could not find container \"68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a\": container with ID starting with 68ac9fffda3fab1472d6e526de63038a1195841075a107763a3f3f6efd02648a not found: ID does not exist" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573703 5030 scope.go:117] "RemoveContainer" containerID="dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573947 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac"} err="failed to get container status \"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac\": rpc error: code = NotFound desc = could not find container \"dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac\": container with ID starting with dfddcfbadbbb7114ef30ab38cbbcd1581e5adc95cbf1343bc1db7f461a6e01ac not found: ID does not exist" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.573968 5030 scope.go:117] "RemoveContainer" containerID="73d55bbb51e734b228c4354b255f4da8a22be37140d690a89e16210c155c5407" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.599307 5030 scope.go:117] "RemoveContainer" containerID="b5a3121b46f09ab54fc42a4e60030abec28c4ebcaabff01ecef271e79eda8bce" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.641114 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cc0aba0-a0c7-4b7b-9b91-031765f25948-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.777745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:50 crc kubenswrapper[5030]: I0120 23:26:50.789154 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:26:51 crc kubenswrapper[5030]: I0120 
23:26:51.999475 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" path="/var/lib/kubelet/pods/6cc0aba0-a0c7-4b7b-9b91-031765f25948/volumes" Jan 20 23:26:52 crc kubenswrapper[5030]: I0120 23:26:52.001444 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" path="/var/lib/kubelet/pods/c991f145-efc7-4968-b73a-d4cf085acee0/volumes" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.510889 5030 scope.go:117] "RemoveContainer" containerID="876076c94c99519acaa7d7328c451ee7dcca84a94dd09787d66a65f9124d1455" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.550614 5030 scope.go:117] "RemoveContainer" containerID="78e9eb62b758fe4af2f56913ae48a52d46e04bbf281a309ace0c938c99079366" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.581733 5030 scope.go:117] "RemoveContainer" containerID="db87732f123d106f7254ea1e9dc33ebc44187354473ff54f9218551dfb61ec86" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.624245 5030 scope.go:117] "RemoveContainer" containerID="911be87e425ae0f10d9f79f585e422dfc37027171e5657160647c58cbb384334" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.654026 5030 scope.go:117] "RemoveContainer" containerID="b6df586aebc68d2efc6af2bcf7cd3114ec3dd0cbdf0cfbc45b847f60af0e5c15" Jan 20 23:26:53 crc kubenswrapper[5030]: I0120 23:26:53.962315 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:26:53 crc kubenswrapper[5030]: E0120 23:26:53.962736 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:26:59 crc kubenswrapper[5030]: E0120 23:26:59.982485 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398a1488_8639_4110_8d25_2f5ff7e73a30.slice/crio-ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41\": RecentStats: unable to find data in memory cache]" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.705892 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-t7hjf"] Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.716274 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-t7hjf"] Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.860831 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-fvvzm"] Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861439 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861484 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861517 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-httpd" Jan 20 23:27:01 crc 
kubenswrapper[5030]: I0120 23:27:01.861534 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861560 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861577 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861610 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-server" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861684 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861701 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861735 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861752 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861771 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="setup-container" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861786 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="setup-container" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861820 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a13994b-a11d-4798-9aae-b91a3d73d440" containerName="memcached" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a13994b-a11d-4798-9aae-b91a3d73d440" containerName="memcached" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861858 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861875 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861905 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861923 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" 
containerName="mysql-bootstrap" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861960 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerName="mysql-bootstrap" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.861980 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="openstack-network-exporter" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.861996 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="openstack-network-exporter" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862018 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862034 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862062 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862080 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862127 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862149 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" containerName="nova-cell0-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" containerName="nova-cell0-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862218 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerName="nova-cell1-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862238 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerName="nova-cell1-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862259 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862276 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862322 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862344 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-metadata" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862361 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-metadata" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862384 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862400 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-server" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862425 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-reaper" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862441 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-reaper" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862465 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="init" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862481 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="init" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862512 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862558 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862575 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862593 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862609 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862670 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862718 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" containerName="kube-state-metrics" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862734 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" containerName="kube-state-metrics" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 
23:27:01.862765 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-server" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862808 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b215df-05ff-45fe-ac23-b5605150449b" containerName="keystone-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862826 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b215df-05ff-45fe-ac23-b5605150449b" containerName="keystone-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862844 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-central-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862860 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-central-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862882 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="cinder-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862898 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="cinder-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862930 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862947 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.862966 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.862982 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863004 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863020 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863046 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863061 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863079 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863113 
5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="setup-container" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863162 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="setup-container" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863231 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863246 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863270 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863334 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863359 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-expirer" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863376 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-expirer" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863408 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="sg-core" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863456 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="sg-core" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863503 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863528 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863546 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 
23:27:01.863571 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863589 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-server" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863614 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863666 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863688 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863729 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="probe" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863745 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="probe" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863763 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863803 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863843 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863860 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863885 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="mysql-bootstrap" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863902 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="mysql-bootstrap" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.863927 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-notification-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-notification-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: 
E0120 23:27:01.863970 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0814026f-5519-40a1-94e8-379b5abac55f" containerName="nova-scheduler-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.863986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0814026f-5519-40a1-94e8-379b5abac55f" containerName="nova-scheduler-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864013 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="rsync" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="rsync" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864069 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864094 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7467fbb5-44e9-42cf-874d-16fd1332885e" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864110 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7467fbb5-44e9-42cf-874d-16fd1332885e" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864138 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864155 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="dnsmasq-dns" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864192 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="dnsmasq-dns" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864219 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864235 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864255 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="proxy-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864272 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="proxy-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: E0120 23:27:01.864296 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="swift-recon-cron" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="swift-recon-cron" Jan 20 
23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864677 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="sg-core" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864702 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="747ac61a-5b10-46ef-8a34-68b43c777d96" containerName="nova-cell1-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864730 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864761 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864794 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864813 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b334c7e2-1209-4a89-b56d-72d97b6f16b3" containerName="barbican-worker-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864829 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-notification-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-expirer" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864873 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0814026f-5519-40a1-94e8-379b5abac55f" containerName="nova-scheduler-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864889 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-metadata" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864907 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="rsync" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864926 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="66881282-2b6f-484d-a7e0-2b968306c5ae" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864956 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="swift-recon-cron" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.864977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865010 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="openstack-network-exporter" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865037 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865057 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865074 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b215df-05ff-45fe-ac23-b5605150449b" containerName="keystone-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865095 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865113 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="ceilometer-central-agent" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865133 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ce0823-f23e-4e5f-a8ee-aeecbdb7efcf" containerName="nova-metadata-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865150 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865174 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865201 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865222 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="probe" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865269 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d756a967-e1dd-4c74-8bff-cd1fab21b251" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865296 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c991f145-efc7-4968-b73a-d4cf085acee0" containerName="neutron-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865317 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37b566e-a870-4b0d-af53-29da770c8a0a" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865339 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d380087b-96be-407d-8c8c-f00ecaa3a0e8" containerName="ovn-northd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865369 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865431 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865455 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="644cc3b2-e3d2-441c-8aa5-c3bec683e86c" containerName="cinder-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865473 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33b23a7-cf35-4fc2-8237-9633f3d807f3" 
containerName="nova-cell0-conductor-conductor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865494 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2a230e-0da5-4c3d-8adf-dfecffcdf0cb" containerName="kube-state-metrics" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865516 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e14da0-805b-4832-b6dc-7dc6b829ee4f" containerName="barbican-keystone-listener" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865536 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2840a28-0f62-4378-bbd0-a07833eb25c7" containerName="proxy-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865557 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb76fff8-05c4-4ec5-b00b-17dfd94e5dea" containerName="rabbitmq" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865582 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c862e826-c66e-4188-9d2a-096e6039a1af" containerName="glance-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865606 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc0aba0-a0c7-4b7b-9b91-031765f25948" containerName="cinder-scheduler" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865662 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c256a299-ef10-4eab-ae00-d4116a91a108" containerName="proxy-httpd" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865694 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f239887a-e1d2-484e-96ce-e4bb56b1bc75" containerName="galera" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865709 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-updater" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865732 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865751 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7467fbb5-44e9-42cf-874d-16fd1332885e" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865776 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-reaper" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865802 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b796eed-a508-43ca-9b25-c097b39474b7" containerName="barbican-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865828 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="398a1488-8639-4110-8d25-2f5ff7e73a30" containerName="dnsmasq-dns" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865880 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-replicator" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865906 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="account-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 
23:27:01.865928 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865948 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f82fbd9-cd1e-4d71-8d61-02126748c546" containerName="mariadb-database-create" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865974 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5276d7d-9906-480d-90c6-4f48c6f126e2" containerName="nova-api-log" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.865996 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="object-auditor" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.866018 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89f10b6-405e-4c42-a2f5-97111102ec40" containerName="placement-api" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.866044 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a13994b-a11d-4798-9aae-b91a3d73d440" containerName="memcached" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.866070 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1f0eab-495a-4664-9852-3e6488194586" containerName="container-server" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.867104 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.871197 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.872116 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.872550 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.872714 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.874494 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fvvzm"] Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.932756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pmxz\" (UniqueName: \"kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.932864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.932963 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " 
pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:01 crc kubenswrapper[5030]: I0120 23:27:01.979274 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000eeb10-8287-4225-9abb-e925a0ee2c37" path="/var/lib/kubelet/pods/000eeb10-8287-4225-9abb-e925a0ee2c37/volumes" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.035225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.035524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.036181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.036260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pmxz\" (UniqueName: \"kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.037138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.073970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pmxz\" (UniqueName: \"kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz\") pod \"crc-storage-crc-fvvzm\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:02 crc kubenswrapper[5030]: I0120 23:27:02.203956 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:03 crc kubenswrapper[5030]: I0120 23:27:02.521125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-fvvzm"] Jan 20 23:27:03 crc kubenswrapper[5030]: I0120 23:27:02.605338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fvvzm" event={"ID":"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f","Type":"ContainerStarted","Data":"b4bad18f46731398263596f826e6ca12d97c716624f69393c3b659f604360222"} Jan 20 23:27:03 crc kubenswrapper[5030]: I0120 23:27:03.616987 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" containerID="14499d52e239bfb767a6c19e58c3b9a73416281ee10917e9bef5a62a3598d5b0" exitCode=0 Jan 20 23:27:03 crc kubenswrapper[5030]: I0120 23:27:03.617043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fvvzm" event={"ID":"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f","Type":"ContainerDied","Data":"14499d52e239bfb767a6c19e58c3b9a73416281ee10917e9bef5a62a3598d5b0"} Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.008650 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.079131 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage\") pod \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.079306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pmxz\" (UniqueName: \"kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz\") pod \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.079348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt\") pod \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\" (UID: \"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f\") " Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.080587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" (UID: "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.088287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz" (OuterVolumeSpecName: "kube-api-access-4pmxz") pod "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" (UID: "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f"). InnerVolumeSpecName "kube-api-access-4pmxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.114907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" (UID: "ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.181862 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.181913 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pmxz\" (UniqueName: \"kubernetes.io/projected/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-kube-api-access-4pmxz\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.181930 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.645281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-fvvzm" event={"ID":"ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f","Type":"ContainerDied","Data":"b4bad18f46731398263596f826e6ca12d97c716624f69393c3b659f604360222"} Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.645613 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4bad18f46731398263596f826e6ca12d97c716624f69393c3b659f604360222" Jan 20 23:27:05 crc kubenswrapper[5030]: I0120 23:27:05.645403 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-fvvzm" Jan 20 23:27:06 crc kubenswrapper[5030]: I0120 23:27:06.962059 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:27:06 crc kubenswrapper[5030]: E0120 23:27:06.962328 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.543213 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-fvvzm"] Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.555291 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-fvvzm"] Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.658838 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-vnjr9"] Jan 20 23:27:08 crc kubenswrapper[5030]: E0120 23:27:08.659169 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" containerName="storage" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.659191 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" containerName="storage" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.659364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" containerName="storage" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.659957 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.662390 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.663406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.663516 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.663794 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.667759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vnjr9"] Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.739763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.739852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzczr\" (UniqueName: \"kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.739917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.841907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.842155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.842218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzczr\" (UniqueName: \"kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.842404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " 
pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.843337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.873015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzczr\" (UniqueName: \"kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr\") pod \"crc-storage-crc-vnjr9\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:08 crc kubenswrapper[5030]: I0120 23:27:08.980835 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:09 crc kubenswrapper[5030]: I0120 23:27:09.298901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-vnjr9"] Jan 20 23:27:09 crc kubenswrapper[5030]: I0120 23:27:09.692106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnjr9" event={"ID":"3e110f4c-8a24-42d1-b480-87fa2452f2bd","Type":"ContainerStarted","Data":"40dd39b236f692e4a562bc16c5ca4cb068a51e0a8a66d9534c8d3e81559597d9"} Jan 20 23:27:09 crc kubenswrapper[5030]: I0120 23:27:09.979250 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f" path="/var/lib/kubelet/pods/ce3b7dc7-48f8-4221-ba59-e340c7ec2a6f/volumes" Jan 20 23:27:10 crc kubenswrapper[5030]: E0120 23:27:10.189997 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398a1488_8639_4110_8d25_2f5ff7e73a30.slice/crio-ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41\": RecentStats: unable to find data in memory cache]" Jan 20 23:27:10 crc kubenswrapper[5030]: I0120 23:27:10.705044 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e110f4c-8a24-42d1-b480-87fa2452f2bd" containerID="932dcfdf35065219a5e818e5ee79e925e511ed060334c88dc1016384aecb5e71" exitCode=0 Jan 20 23:27:10 crc kubenswrapper[5030]: I0120 23:27:10.705135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnjr9" event={"ID":"3e110f4c-8a24-42d1-b480-87fa2452f2bd","Type":"ContainerDied","Data":"932dcfdf35065219a5e818e5ee79e925e511ed060334c88dc1016384aecb5e71"} Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.098953 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.198177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage\") pod \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.198369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzczr\" (UniqueName: \"kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr\") pod \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.198450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt\") pod \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\" (UID: \"3e110f4c-8a24-42d1-b480-87fa2452f2bd\") " Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.198564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3e110f4c-8a24-42d1-b480-87fa2452f2bd" (UID: "3e110f4c-8a24-42d1-b480-87fa2452f2bd"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.198869 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3e110f4c-8a24-42d1-b480-87fa2452f2bd-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.205586 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr" (OuterVolumeSpecName: "kube-api-access-gzczr") pod "3e110f4c-8a24-42d1-b480-87fa2452f2bd" (UID: "3e110f4c-8a24-42d1-b480-87fa2452f2bd"). InnerVolumeSpecName "kube-api-access-gzczr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.219215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3e110f4c-8a24-42d1-b480-87fa2452f2bd" (UID: "3e110f4c-8a24-42d1-b480-87fa2452f2bd"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.299785 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3e110f4c-8a24-42d1-b480-87fa2452f2bd-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.299818 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzczr\" (UniqueName: \"kubernetes.io/projected/3e110f4c-8a24-42d1-b480-87fa2452f2bd-kube-api-access-gzczr\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.740142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-vnjr9" event={"ID":"3e110f4c-8a24-42d1-b480-87fa2452f2bd","Type":"ContainerDied","Data":"40dd39b236f692e4a562bc16c5ca4cb068a51e0a8a66d9534c8d3e81559597d9"} Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.740223 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40dd39b236f692e4a562bc16c5ca4cb068a51e0a8a66d9534c8d3e81559597d9" Jan 20 23:27:12 crc kubenswrapper[5030]: I0120 23:27:12.740284 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-vnjr9" Jan 20 23:27:17 crc kubenswrapper[5030]: I0120 23:27:17.970931 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:27:17 crc kubenswrapper[5030]: E0120 23:27:17.971907 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:27:20 crc kubenswrapper[5030]: E0120 23:27:20.379937 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod398a1488_8639_4110_8d25_2f5ff7e73a30.slice/crio-ddba0e5fe10a3fac4953b1a468593293a0f89d7234d4f47d94e89fcc660bae41\": RecentStats: unable to find data in memory cache]" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.065150 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:27:22 crc kubenswrapper[5030]: E0120 23:27:22.065939 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e110f4c-8a24-42d1-b480-87fa2452f2bd" containerName="storage" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.065962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e110f4c-8a24-42d1-b480-87fa2452f2bd" containerName="storage" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.066240 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e110f4c-8a24-42d1-b480-87fa2452f2bd" containerName="storage" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.067475 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.070203 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.070471 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.070774 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.071424 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-cjd45" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.071649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.078911 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.145556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.145719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.145840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.145883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.145915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl5td\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.146052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 
crc kubenswrapper[5030]: I0120 23:27:22.146138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.146192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.146252 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.247961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl5td\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248104 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248202 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248339 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.248697 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.249028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.249274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.249942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.250111 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.256307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.256408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.256655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.286932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl5td\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.294440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.298888 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.301210 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.306526 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.306573 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-bzk7k" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.307252 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.308610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.310040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.316454 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.349550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.349880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350033 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.350779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb62f\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.351061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.392522 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452516 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452638 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb62f\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.452803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.453608 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.457741 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.458805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.458954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.459152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.460017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.460266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.467529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.486309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb62f\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 
23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.494460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.670166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.712609 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:27:22 crc kubenswrapper[5030]: W0120 23:27:22.714232 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9def69e2_10b9_443b_8b38_7d561e39bb7d.slice/crio-758ba39bd07720b4de6d36d8e189384dff774f7d39eed8a00c5685f17ab1ff29 WatchSource:0}: Error finding container 758ba39bd07720b4de6d36d8e189384dff774f7d39eed8a00c5685f17ab1ff29: Status 404 returned error can't find the container with id 758ba39bd07720b4de6d36d8e189384dff774f7d39eed8a00c5685f17ab1ff29 Jan 20 23:27:22 crc kubenswrapper[5030]: I0120 23:27:22.850984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerStarted","Data":"758ba39bd07720b4de6d36d8e189384dff774f7d39eed8a00c5685f17ab1ff29"} Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.109879 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:27:23 crc kubenswrapper[5030]: W0120 23:27:23.112144 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535e149a_723c_4f6b_9247_9a6b58eff6bb.slice/crio-c09c53e881513e8e6b8114aeaeb7098dd0c9feec9d99db5af552448a7aea670b WatchSource:0}: Error finding container c09c53e881513e8e6b8114aeaeb7098dd0c9feec9d99db5af552448a7aea670b: Status 404 returned error can't find the container with id c09c53e881513e8e6b8114aeaeb7098dd0c9feec9d99db5af552448a7aea670b Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.685129 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.687527 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.689931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.695365 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.696536 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-5fq4w" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.701320 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.702443 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.709083 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.862335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerStarted","Data":"c09c53e881513e8e6b8114aeaeb7098dd0c9feec9d99db5af552448a7aea670b"} Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.886890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkpzd\" (UniqueName: \"kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887682 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.887900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989473 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hkpzd\" (UniqueName: \"kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.989496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:23 crc kubenswrapper[5030]: I0120 23:27:23.990196 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.015220 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.016310 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.020572 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n56b6" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.023320 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.043403 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.043970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.046747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.047130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.060917 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.065305 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.149399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.153318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkpzd\" (UniqueName: \"kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.186524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.193562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8sx\" (UniqueName: \"kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.193604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.193703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.295117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.295198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8sx\" (UniqueName: \"kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.295223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.296666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.296714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.316731 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.331931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8sx\" (UniqueName: \"kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx\") pod \"memcached-0\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.344203 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.874781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerStarted","Data":"15c82f83871ae8a50514e3320b06975949c2d6131b50863a977396f67100c222"} Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.877221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerStarted","Data":"1778eb3dac897ab781625cc651ba606b63a5dd6829ad139e3551fa0e3df843c2"} Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.919093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:27:24 crc kubenswrapper[5030]: W0120 23:27:24.925229 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod295f9631_4bac_43a2_9b71_f94209fa0046.slice/crio-2f110cca5615a048bca2ed2ff0d4c1eb389f92ad830ae59ef99f3a1cff681dd8 WatchSource:0}: Error finding container 2f110cca5615a048bca2ed2ff0d4c1eb389f92ad830ae59ef99f3a1cff681dd8: Status 404 returned error can't find the container with id 2f110cca5615a048bca2ed2ff0d4c1eb389f92ad830ae59ef99f3a1cff681dd8 Jan 20 23:27:24 crc kubenswrapper[5030]: I0120 23:27:24.929954 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:27:24 crc kubenswrapper[5030]: W0120 23:27:24.930266 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ad6a39_d03a_4db8_9f47_9c461325cf0b.slice/crio-5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48 WatchSource:0}: Error finding container 
5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48: Status 404 returned error can't find the container with id 5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48 Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.030684 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.031591 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.034120 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-sw9j8" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.048992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.210460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjqz\" (UniqueName: \"kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz\") pod \"kube-state-metrics-0\" (UID: \"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.313119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjqz\" (UniqueName: \"kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz\") pod \"kube-state-metrics-0\" (UID: \"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.332898 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjqz\" (UniqueName: \"kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz\") pod \"kube-state-metrics-0\" (UID: \"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.345703 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.473274 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.474742 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.477580 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-st4ds" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.477992 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.478251 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.478415 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.496046 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.530783 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.535335 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.538511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-54dw7" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.539806 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.540007 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.560263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.604462 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618501 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618579 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xmh\" (UniqueName: \"kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6m2v\" (UniqueName: \"kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.618947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.619035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 
23:27:25.720610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xmh\" (UniqueName: \"kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6m2v\" (UniqueName: \"kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.720754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.721201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.721511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.722128 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.722133 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.724582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.725127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.726954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.727285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.727440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.727527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.727877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.728069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.742526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xmh\" (UniqueName: \"kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.743943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6m2v\" (UniqueName: \"kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.750190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.751740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.800717 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.816781 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.817994 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.820655 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.820842 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.826014 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.830716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.856346 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.902851 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerStarted","Data":"1fbf13e88e2f10e605533393bd27956cf73249bb24c679cb4a83c66136518a75"} Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.902894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerStarted","Data":"5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48"} Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.904501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"295f9631-4bac-43a2-9b71-f94209fa0046","Type":"ContainerStarted","Data":"99c989c805531e84aa4649956b12836dbd4fb418611a818efb7cce3c72c5eedd"} Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.904555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"295f9631-4bac-43a2-9b71-f94209fa0046","Type":"ContainerStarted","Data":"2f110cca5615a048bca2ed2ff0d4c1eb389f92ad830ae59ef99f3a1cff681dd8"} Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.904801 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.906690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c","Type":"ContainerStarted","Data":"89c3dd8b3e7fc91cc889f77f886cdedfdb0463cbc962b4028ce5a168ff6ec615"} Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931638 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9wf\" (UniqueName: \"kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.931821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:25 crc kubenswrapper[5030]: I0120 23:27:25.949510 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.949488864 podStartE2EDuration="2.949488864s" podCreationTimestamp="2026-01-20 23:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:25.947083727 +0000 UTC m=+3118.267344025" watchObservedRunningTime="2026-01-20 23:27:25.949488864 +0000 UTC m=+3118.269749152" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.033871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.034113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.034337 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.034504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.034851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb9wf\" (UniqueName: \"kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.034956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.035201 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.038699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.040019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.040090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.048780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.052755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9wf\" (UniqueName: \"kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.069584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.133479 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.244917 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:27:26 crc kubenswrapper[5030]: W0120 23:27:26.326755 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95a6a20e_e4cf_4c12_be69_33ace96481c7.slice/crio-b0c5d2cc201d083fc84c7006cdf08cd364054e065afe9db1a9548962a8c189c4 WatchSource:0}: Error finding container b0c5d2cc201d083fc84c7006cdf08cd364054e065afe9db1a9548962a8c189c4: Status 404 returned error can't find the container with id b0c5d2cc201d083fc84c7006cdf08cd364054e065afe9db1a9548962a8c189c4 Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.327154 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.550009 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:27:26 crc kubenswrapper[5030]: W0120 23:27:26.854970 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f0e49bd_aaee_440e_a6ec_7498907c5c7c.slice/crio-f7298c10b79114bd49af1af229aa411b4ff244a5cf35c68ef5ab0092bece7705 WatchSource:0}: Error finding container f7298c10b79114bd49af1af229aa411b4ff244a5cf35c68ef5ab0092bece7705: Status 404 returned error can't find the container with id f7298c10b79114bd49af1af229aa411b4ff244a5cf35c68ef5ab0092bece7705 Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.917051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerStarted","Data":"f413a83947a3f641c49ef1fe9149ee0fb63df1dd4f02e577126231ff5df950cd"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.917411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerStarted","Data":"b0c5d2cc201d083fc84c7006cdf08cd364054e065afe9db1a9548962a8c189c4"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.919884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c","Type":"ContainerStarted","Data":"3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.920644 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.922778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerStarted","Data":"f7298c10b79114bd49af1af229aa411b4ff244a5cf35c68ef5ab0092bece7705"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.924704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerStarted","Data":"35642e491b70667a14dd98edd5926927ca38efc248b37af045f7466d342c728b"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.924741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerStarted","Data":"cbdacca4fc0f1f65be053152d8f0d255499ad9178bd0d5cdd9bf85b7311a1d52"} Jan 20 23:27:26 crc kubenswrapper[5030]: I0120 23:27:26.949865 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.564793039 podStartE2EDuration="1.94984195s" podCreationTimestamp="2026-01-20 23:27:25 +0000 UTC" firstStartedPulling="2026-01-20 23:27:25.61471803 +0000 UTC m=+3117.934978318" lastFinishedPulling="2026-01-20 23:27:25.999766941 +0000 UTC m=+3118.320027229" observedRunningTime="2026-01-20 23:27:26.945367272 +0000 UTC m=+3119.265627600" watchObservedRunningTime="2026-01-20 23:27:26.94984195 +0000 UTC m=+3119.270102248" Jan 20 23:27:27 crc kubenswrapper[5030]: I0120 23:27:27.938257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerStarted","Data":"bd45d6d17dbbe4a1ba70c5139b2c784f386f57335e21bcbdfe022af808f6aec3"} Jan 20 23:27:27 crc kubenswrapper[5030]: I0120 23:27:27.941841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerStarted","Data":"a438c41191f2f0394a1ab37c20bc2ba8dd55038f659864717d061a59d37cf44a"} Jan 20 23:27:27 crc kubenswrapper[5030]: I0120 23:27:27.941887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerStarted","Data":"1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40"} Jan 20 23:27:27 crc kubenswrapper[5030]: I0120 23:27:27.974336 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.974314699 podStartE2EDuration="3.974314699s" podCreationTimestamp="2026-01-20 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:27.963895258 +0000 UTC m=+3120.284155586" watchObservedRunningTime="2026-01-20 23:27:27.974314699 +0000 UTC m=+3120.294574987" Jan 20 23:27:28 crc kubenswrapper[5030]: I0120 23:27:28.014757 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=4.014674767 podStartE2EDuration="4.014674767s" podCreationTimestamp="2026-01-20 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:28.014093163 +0000 UTC m=+3120.334353451" watchObservedRunningTime="2026-01-20 23:27:28.014674767 +0000 UTC m=+3120.334935055" Jan 20 23:27:28 crc kubenswrapper[5030]: I0120 23:27:28.856888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:28 crc kubenswrapper[5030]: I0120 23:27:28.954047 5030 generic.go:334] "Generic (PLEG): container finished" podID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerID="1fbf13e88e2f10e605533393bd27956cf73249bb24c679cb4a83c66136518a75" exitCode=0 Jan 20 23:27:28 crc kubenswrapper[5030]: I0120 23:27:28.954694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerDied","Data":"1fbf13e88e2f10e605533393bd27956cf73249bb24c679cb4a83c66136518a75"} Jan 20 23:27:29 crc kubenswrapper[5030]: I0120 23:27:29.134032 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:29 crc kubenswrapper[5030]: I0120 23:27:29.966257 5030 generic.go:334] "Generic (PLEG): container finished" podID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerID="35642e491b70667a14dd98edd5926927ca38efc248b37af045f7466d342c728b" exitCode=0 Jan 20 23:27:29 crc kubenswrapper[5030]: I0120 23:27:29.974945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerDied","Data":"35642e491b70667a14dd98edd5926927ca38efc248b37af045f7466d342c728b"} Jan 20 23:27:29 crc kubenswrapper[5030]: I0120 23:27:29.974995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerStarted","Data":"aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812"} Jan 20 23:27:30 crc kubenswrapper[5030]: I0120 23:27:30.009782 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.009766542 podStartE2EDuration="8.009766542s" podCreationTimestamp="2026-01-20 23:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:30.009582928 +0000 UTC m=+3122.329843306" watchObservedRunningTime="2026-01-20 23:27:30.009766542 +0000 UTC m=+3122.330026830" Jan 20 23:27:30 crc kubenswrapper[5030]: I0120 23:27:30.856738 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:30 crc kubenswrapper[5030]: I0120 23:27:30.962782 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:27:30 crc kubenswrapper[5030]: E0120 23:27:30.963351 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:27:30 crc kubenswrapper[5030]: I0120 23:27:30.984434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerStarted","Data":"e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef"} Jan 20 23:27:31 crc kubenswrapper[5030]: I0120 23:27:31.019535 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.019509675 podStartE2EDuration="7.019509675s" podCreationTimestamp="2026-01-20 23:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:31.014080624 +0000 UTC m=+3123.334340942" watchObservedRunningTime="2026-01-20 23:27:31.019509675 +0000 UTC m=+3123.339770003" Jan 20 
23:27:31 crc kubenswrapper[5030]: I0120 23:27:31.133871 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:31 crc kubenswrapper[5030]: I0120 23:27:31.933138 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.058375 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.185712 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.237140 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.447919 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.449605 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.451387 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.452406 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.452423 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.463594 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.553890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8sn5\" (UniqueName: \"kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.554019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.554049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.554085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.554119 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.655400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.655441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.655475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.655501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.655583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8sn5\" (UniqueName: \"kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.656074 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.656445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.656586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.668358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 
23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.675146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8sn5\" (UniqueName: \"kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5\") pod \"ovn-northd-0\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:32 crc kubenswrapper[5030]: I0120 23:27:32.774307 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:33 crc kubenswrapper[5030]: I0120 23:27:33.200016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:27:33 crc kubenswrapper[5030]: W0120 23:27:33.210425 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod809c9ceb_bda3_4971_88b2_3fb7bd4cee02.slice/crio-cf1f2eab93c195c5b1e4618d45aa7077ef4a1cba677870047e5207b1ed666494 WatchSource:0}: Error finding container cf1f2eab93c195c5b1e4618d45aa7077ef4a1cba677870047e5207b1ed666494: Status 404 returned error can't find the container with id cf1f2eab93c195c5b1e4618d45aa7077ef4a1cba677870047e5207b1ed666494 Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.013964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerStarted","Data":"1f5c17ad369f742c3c741a76fb19d1dc31e60772125f2a6a42307522b80a2db3"} Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.014288 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.014299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerStarted","Data":"41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab"} Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.014308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerStarted","Data":"cf1f2eab93c195c5b1e4618d45aa7077ef4a1cba677870047e5207b1ed666494"} Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.031055 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.031030636 podStartE2EDuration="2.031030636s" podCreationTimestamp="2026-01-20 23:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:34.027355307 +0000 UTC m=+3126.347615625" watchObservedRunningTime="2026-01-20 23:27:34.031030636 +0000 UTC m=+3126.351290944" Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.317920 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.318000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.345712 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:27:34 crc kubenswrapper[5030]: I0120 23:27:34.445988 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:35 crc kubenswrapper[5030]: I0120 23:27:35.169872 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:27:35 crc kubenswrapper[5030]: I0120 23:27:35.351522 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:27:35 crc kubenswrapper[5030]: I0120 23:27:35.801085 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:35 crc kubenswrapper[5030]: I0120 23:27:35.801155 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.374048 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.385123 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.388588 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.388601 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.389439 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.389493 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-5chxt" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.396720 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.419840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.420066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.420135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggld\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.420210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " 
pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.420267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.521838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.521978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.522006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggld\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.522058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.522091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: E0120 23:27:36.522246 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:27:36 crc kubenswrapper[5030]: E0120 23:27:36.522264 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:27:36 crc kubenswrapper[5030]: E0120 23:27:36.522329 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift podName:bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4 nodeName:}" failed. No retries permitted until 2026-01-20 23:27:37.022307403 +0000 UTC m=+3129.342567711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift") pod "swift-storage-0" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4") : configmap "swift-ring-files" not found Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.522353 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.523072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.523082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.550490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggld\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.571799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.747225 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mktxf"] Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.748822 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.751140 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.752128 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.752247 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.757443 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mktxf"] Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.825731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.825781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgh4p\" (UniqueName: \"kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.825809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.825926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.826078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.826211 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.826362 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf\") pod 
\"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928276 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgh4p\" (UniqueName: \"kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.928699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.929317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.929544 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts\") pod \"swift-ring-rebalance-mktxf\" (UID: 
\"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.929556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.932222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.934258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.934485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:36 crc kubenswrapper[5030]: I0120 23:27:36.959831 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgh4p\" (UniqueName: \"kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p\") pod \"swift-ring-rebalance-mktxf\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:37 crc kubenswrapper[5030]: I0120 23:27:37.029875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:37 crc kubenswrapper[5030]: E0120 23:27:37.030861 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:27:37 crc kubenswrapper[5030]: E0120 23:27:37.030919 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:27:37 crc kubenswrapper[5030]: E0120 23:27:37.031010 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift podName:bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4 nodeName:}" failed. No retries permitted until 2026-01-20 23:27:38.030981457 +0000 UTC m=+3130.351241775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift") pod "swift-storage-0" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4") : configmap "swift-ring-files" not found Jan 20 23:27:37 crc kubenswrapper[5030]: I0120 23:27:37.071383 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:37 crc kubenswrapper[5030]: I0120 23:27:37.605881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mktxf"] Jan 20 23:27:37 crc kubenswrapper[5030]: W0120 23:27:37.618912 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38852246_6c24_4866_921a_ce61995931d7.slice/crio-4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625 WatchSource:0}: Error finding container 4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625: Status 404 returned error can't find the container with id 4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625 Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.045269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:38 crc kubenswrapper[5030]: E0120 23:27:38.053671 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:27:38 crc kubenswrapper[5030]: E0120 23:27:38.053724 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:27:38 crc kubenswrapper[5030]: E0120 23:27:38.053818 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift podName:bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4 nodeName:}" failed. No retries permitted until 2026-01-20 23:27:40.053790876 +0000 UTC m=+3132.374051194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift") pod "swift-storage-0" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4") : configmap "swift-ring-files" not found Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.076717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" event={"ID":"38852246-6c24-4866-921a-ce61995931d7","Type":"ContainerStarted","Data":"9fd232b7659b789db5ad4d7e43b1a3850e70a4da7a82b8d138d56c50052f9690"} Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.076808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" event={"ID":"38852246-6c24-4866-921a-ce61995931d7","Type":"ContainerStarted","Data":"4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625"} Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.104285 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" podStartSLOduration=2.104264687 podStartE2EDuration="2.104264687s" podCreationTimestamp="2026-01-20 23:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:38.100929747 +0000 UTC m=+3130.421190075" watchObservedRunningTime="2026-01-20 23:27:38.104264687 +0000 UTC m=+3130.424524985" Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.242782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:38 crc kubenswrapper[5030]: I0120 23:27:38.348235 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:27:40 crc kubenswrapper[5030]: I0120 23:27:40.078107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:40 crc kubenswrapper[5030]: E0120 23:27:40.078454 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:27:40 crc kubenswrapper[5030]: E0120 23:27:40.078494 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:27:40 crc kubenswrapper[5030]: E0120 23:27:40.078573 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift podName:bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4 nodeName:}" failed. No retries permitted until 2026-01-20 23:27:44.078544619 +0000 UTC m=+3136.398804937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift") pod "swift-storage-0" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4") : configmap "swift-ring-files" not found Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.698656 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8b8rc"] Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.700741 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.703909 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.711997 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8b8rc"] Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.837131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjg9\" (UniqueName: \"kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.837309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.939036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjg9\" (UniqueName: \"kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.939387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.941230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:42 crc kubenswrapper[5030]: I0120 23:27:42.966071 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjg9\" (UniqueName: \"kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9\") pod \"root-account-create-update-8b8rc\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:43 crc kubenswrapper[5030]: I0120 23:27:43.027583 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:43 crc kubenswrapper[5030]: I0120 23:27:43.519082 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8b8rc"] Jan 20 23:27:43 crc kubenswrapper[5030]: W0120 23:27:43.526609 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4dadc6_d4e8_46dd_8c1f_9ad6ca1931c3.slice/crio-90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86 WatchSource:0}: Error finding container 90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86: Status 404 returned error can't find the container with id 90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86 Jan 20 23:27:43 crc kubenswrapper[5030]: I0120 23:27:43.952404 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8cm8v"] Jan 20 23:27:43 crc kubenswrapper[5030]: I0120 23:27:43.953822 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:43 crc kubenswrapper[5030]: I0120 23:27:43.971023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8cm8v"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.058115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.058364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5c68\" (UniqueName: \"kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.059306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.060487 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.064900 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.073276 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.151125 5030 generic.go:334] "Generic (PLEG): container finished" podID="3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" containerID="a797b28e8252946d1e122993d906dd040128dd30683045cbacafc029e5f91773" exitCode=0 Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.151178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" event={"ID":"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3","Type":"ContainerDied","Data":"a797b28e8252946d1e122993d906dd040128dd30683045cbacafc029e5f91773"} Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.151224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" event={"ID":"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3","Type":"ContainerStarted","Data":"90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86"} Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.162154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.162299 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mqb\" (UniqueName: \"kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.162406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.162483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: E0120 23:27:44.162590 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:27:44 crc kubenswrapper[5030]: E0120 23:27:44.162668 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:27:44 crc kubenswrapper[5030]: E0120 23:27:44.162758 5030 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift podName:bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4 nodeName:}" failed. No retries permitted until 2026-01-20 23:27:52.162732486 +0000 UTC m=+3144.482992784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift") pod "swift-storage-0" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4") : configmap "swift-ring-files" not found Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.162649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c68\" (UniqueName: \"kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.164493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.197614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5c68\" (UniqueName: \"kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68\") pod \"keystone-db-create-8cm8v\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.265224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.265300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mqb\" (UniqueName: \"kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.266533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.284320 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.289382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mqb\" (UniqueName: \"kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb\") pod \"keystone-6f8f-account-create-update-s4zjk\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.386041 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.406286 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-gl9ft"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.407509 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.425726 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-gl9ft"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.506860 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-gp87c"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.508102 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.529866 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-2712-account-create-update-kvv69"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.531141 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.534747 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.537561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gp87c"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.549082 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-2712-account-create-update-kvv69"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.571294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.571365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxtb\" (UniqueName: \"kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.614542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-0855-account-create-update-x55lz"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.616120 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.618519 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.621884 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-0855-account-create-update-x55lz"] Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.673050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.673114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9m8f\" (UniqueName: \"kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.673457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 
23:27:44.673598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxtb\" (UniqueName: \"kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.673733 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv5jq\" (UniqueName: \"kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.673860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.675308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.697950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxtb\" (UniqueName: \"kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb\") pod \"placement-db-create-gl9ft\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.738768 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.759265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8cm8v"] Jan 20 23:27:44 crc kubenswrapper[5030]: W0120 23:27:44.764017 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a89983_ae51_4585_9d9a_4309df4ac15b.slice/crio-1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c WatchSource:0}: Error finding container 1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c: Status 404 returned error can't find the container with id 1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.776974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.777166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9m8f\" (UniqueName: \"kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.777246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.777505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv5jq\" (UniqueName: \"kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.777661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xkj\" (UniqueName: \"kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.777797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.778828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.783139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.802433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9m8f\" (UniqueName: \"kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f\") pod \"placement-2712-account-create-update-kvv69\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.804317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv5jq\" (UniqueName: \"kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq\") pod \"glance-db-create-gp87c\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.833533 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.848410 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.879171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xkj\" (UniqueName: \"kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.879266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.879872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.913561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xkj\" (UniqueName: \"kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj\") pod \"glance-0855-account-create-update-x55lz\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " 
pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.923430 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk"] Jan 20 23:27:44 crc kubenswrapper[5030]: W0120 23:27:44.929211 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0b529e_0a74_46b0_a364_0c6edc0e59f4.slice/crio-39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123 WatchSource:0}: Error finding container 39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123: Status 404 returned error can't find the container with id 39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123 Jan 20 23:27:44 crc kubenswrapper[5030]: I0120 23:27:44.934548 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.165785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" event={"ID":"aa0b529e-0a74-46b0-a364-0c6edc0e59f4","Type":"ContainerStarted","Data":"1e5eb2657b6e29da14a81671a285de70380a94c6019bdd99e6633fb362695c1d"} Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.166159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" event={"ID":"aa0b529e-0a74-46b0-a364-0c6edc0e59f4","Type":"ContainerStarted","Data":"39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123"} Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.168574 5030 generic.go:334] "Generic (PLEG): container finished" podID="34a89983-ae51-4585-9d9a-4309df4ac15b" containerID="eb9b1c8eb31c7ba78de3e7a43882ff05111f1008063e4d7ca427acbb295af381" exitCode=0 Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.168656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" event={"ID":"34a89983-ae51-4585-9d9a-4309df4ac15b","Type":"ContainerDied","Data":"eb9b1c8eb31c7ba78de3e7a43882ff05111f1008063e4d7ca427acbb295af381"} Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.168683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" event={"ID":"34a89983-ae51-4585-9d9a-4309df4ac15b","Type":"ContainerStarted","Data":"1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c"} Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.170653 5030 generic.go:334] "Generic (PLEG): container finished" podID="38852246-6c24-4866-921a-ce61995931d7" containerID="9fd232b7659b789db5ad4d7e43b1a3850e70a4da7a82b8d138d56c50052f9690" exitCode=0 Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.170746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" event={"ID":"38852246-6c24-4866-921a-ce61995931d7","Type":"ContainerDied","Data":"9fd232b7659b789db5ad4d7e43b1a3850e70a4da7a82b8d138d56c50052f9690"} Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.189865 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-gl9ft"] Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.195090 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" 
podStartSLOduration=1.195067107 podStartE2EDuration="1.195067107s" podCreationTimestamp="2026-01-20 23:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:45.180169876 +0000 UTC m=+3137.500430174" watchObservedRunningTime="2026-01-20 23:27:45.195067107 +0000 UTC m=+3137.515327395" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.321461 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gp87c"] Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.331595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-2712-account-create-update-kvv69"] Jan 20 23:27:45 crc kubenswrapper[5030]: W0120 23:27:45.334943 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3228c2_a636_4f9a_8422_8bc110eec2a8.slice/crio-2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196 WatchSource:0}: Error finding container 2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196: Status 404 returned error can't find the container with id 2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196 Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.492410 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-0855-account-create-update-x55lz"] Jan 20 23:27:45 crc kubenswrapper[5030]: W0120 23:27:45.495263 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2247a835_f44c_4a87_8d10_9d93b86d6fb9.slice/crio-4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2 WatchSource:0}: Error finding container 4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2: Status 404 returned error can't find the container with id 4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2 Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.502207 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.591051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts\") pod \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.591103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdjg9\" (UniqueName: \"kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9\") pod \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\" (UID: \"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3\") " Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.591508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" (UID: "3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.599147 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9" (OuterVolumeSpecName: "kube-api-access-cdjg9") pod "3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" (UID: "3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3"). InnerVolumeSpecName "kube-api-access-cdjg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.692955 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdjg9\" (UniqueName: \"kubernetes.io/projected/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-kube-api-access-cdjg9\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.692991 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:45 crc kubenswrapper[5030]: I0120 23:27:45.962492 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:27:45 crc kubenswrapper[5030]: E0120 23:27:45.962830 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.187374 5030 generic.go:334] "Generic (PLEG): container finished" podID="af3228c2-a636-4f9a-8422-8bc110eec2a8" containerID="34da10e01a3101e7713caaff1ac93588e6e463c974c61ea8cd351045e44cfd22" exitCode=0 Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.187456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" event={"ID":"af3228c2-a636-4f9a-8422-8bc110eec2a8","Type":"ContainerDied","Data":"34da10e01a3101e7713caaff1ac93588e6e463c974c61ea8cd351045e44cfd22"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.187489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" event={"ID":"af3228c2-a636-4f9a-8422-8bc110eec2a8","Type":"ContainerStarted","Data":"2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.189952 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" event={"ID":"3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3","Type":"ContainerDied","Data":"90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.189984 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90be8e828ba33a7d671ef590c8ac1c12874d7aa609839b5a9ff7cc1b36e3ae86" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.190042 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8b8rc" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.194402 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa0b529e-0a74-46b0-a364-0c6edc0e59f4" containerID="1e5eb2657b6e29da14a81671a285de70380a94c6019bdd99e6633fb362695c1d" exitCode=0 Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.194464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" event={"ID":"aa0b529e-0a74-46b0-a364-0c6edc0e59f4","Type":"ContainerDied","Data":"1e5eb2657b6e29da14a81671a285de70380a94c6019bdd99e6633fb362695c1d"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.196426 5030 generic.go:334] "Generic (PLEG): container finished" podID="bab5304f-ca88-442f-b4ca-823d8d0be728" containerID="e68403552d73e628a0c63064a45243baf0e1b9e7e0d20930a30f25654c3a5e58" exitCode=0 Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.196486 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gp87c" event={"ID":"bab5304f-ca88-442f-b4ca-823d8d0be728","Type":"ContainerDied","Data":"e68403552d73e628a0c63064a45243baf0e1b9e7e0d20930a30f25654c3a5e58"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.196605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gp87c" event={"ID":"bab5304f-ca88-442f-b4ca-823d8d0be728","Type":"ContainerStarted","Data":"e47d1d8cd9633a1079c583740f1b6a0acd8ed28fc658cd75716ac0a1ddad3be2"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.198945 5030 generic.go:334] "Generic (PLEG): container finished" podID="2dc7c61a-28d8-491f-822f-dd544043c6f7" containerID="06deebca81c94f49e91bdb53421c7e64e98a23a1d53f80d9395798d2657ef569" exitCode=0 Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.199008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-gl9ft" event={"ID":"2dc7c61a-28d8-491f-822f-dd544043c6f7","Type":"ContainerDied","Data":"06deebca81c94f49e91bdb53421c7e64e98a23a1d53f80d9395798d2657ef569"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.199025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-gl9ft" event={"ID":"2dc7c61a-28d8-491f-822f-dd544043c6f7","Type":"ContainerStarted","Data":"e0ea03d8e5ce262631c1e953efe1c2042e0319a5383a75ee76ddc6789c901f58"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.201196 5030 generic.go:334] "Generic (PLEG): container finished" podID="2247a835-f44c-4a87-8d10-9d93b86d6fb9" containerID="04d68748733c2310e34f95b5220a0dfe6edb41cdacc8760a24872f073e4c79f0" exitCode=0 Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.201894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" event={"ID":"2247a835-f44c-4a87-8d10-9d93b86d6fb9","Type":"ContainerDied","Data":"04d68748733c2310e34f95b5220a0dfe6edb41cdacc8760a24872f073e4c79f0"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.201990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" event={"ID":"2247a835-f44c-4a87-8d10-9d93b86d6fb9","Type":"ContainerStarted","Data":"4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2"} Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.712281 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.716612 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5c68\" (UniqueName: \"kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68\") pod \"34a89983-ae51-4585-9d9a-4309df4ac15b\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgh4p\" (UniqueName: \"kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts\") pod \"34a89983-ae51-4585-9d9a-4309df4ac15b\" (UID: \"34a89983-ae51-4585-9d9a-4309df4ac15b\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.821570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift\") pod \"38852246-6c24-4866-921a-ce61995931d7\" (UID: \"38852246-6c24-4866-921a-ce61995931d7\") " Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.822666 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.822809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34a89983-ae51-4585-9d9a-4309df4ac15b" (UID: "34a89983-ae51-4585-9d9a-4309df4ac15b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.823347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.827264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68" (OuterVolumeSpecName: "kube-api-access-f5c68") pod "34a89983-ae51-4585-9d9a-4309df4ac15b" (UID: "34a89983-ae51-4585-9d9a-4309df4ac15b"). InnerVolumeSpecName "kube-api-access-f5c68". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.828361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p" (OuterVolumeSpecName: "kube-api-access-fgh4p") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "kube-api-access-fgh4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.830937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.845922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts" (OuterVolumeSpecName: "scripts") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.846103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.859967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "38852246-6c24-4866-921a-ce61995931d7" (UID: "38852246-6c24-4866-921a-ce61995931d7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.923912 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38852246-6c24-4866-921a-ce61995931d7-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924184 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5c68\" (UniqueName: \"kubernetes.io/projected/34a89983-ae51-4585-9d9a-4309df4ac15b-kube-api-access-f5c68\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924282 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgh4p\" (UniqueName: \"kubernetes.io/projected/38852246-6c24-4866-921a-ce61995931d7-kube-api-access-fgh4p\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924407 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924492 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a89983-ae51-4585-9d9a-4309df4ac15b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924582 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38852246-6c24-4866-921a-ce61995931d7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924739 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.924942 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:46 crc kubenswrapper[5030]: I0120 23:27:46.925025 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38852246-6c24-4866-921a-ce61995931d7-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.214094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" event={"ID":"34a89983-ae51-4585-9d9a-4309df4ac15b","Type":"ContainerDied","Data":"1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c"} Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.214153 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b40d2c6ec652f1c77a5aa39306372978bff98c663ce497d5f95dd51422e079c" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.214120 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-8cm8v" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.217231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" event={"ID":"38852246-6c24-4866-921a-ce61995931d7","Type":"ContainerDied","Data":"4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625"} Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.217487 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8ecbfec77b6661b1da6abd97d017fc7b35e8a0d71f35f84b04ff7bf5ab4625" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.217568 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mktxf" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.621533 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.743569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts\") pod \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.743883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xkj\" (UniqueName: \"kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj\") pod \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\" (UID: \"2247a835-f44c-4a87-8d10-9d93b86d6fb9\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.744615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2247a835-f44c-4a87-8d10-9d93b86d6fb9" (UID: "2247a835-f44c-4a87-8d10-9d93b86d6fb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.764821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj" (OuterVolumeSpecName: "kube-api-access-w2xkj") pod "2247a835-f44c-4a87-8d10-9d93b86d6fb9" (UID: "2247a835-f44c-4a87-8d10-9d93b86d6fb9"). InnerVolumeSpecName "kube-api-access-w2xkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.823917 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.827771 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.831832 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.840063 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.845892 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2247a835-f44c-4a87-8d10-9d93b86d6fb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.845919 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xkj\" (UniqueName: \"kubernetes.io/projected/2247a835-f44c-4a87-8d10-9d93b86d6fb9-kube-api-access-w2xkj\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.852883 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9m8f\" (UniqueName: \"kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f\") pod \"af3228c2-a636-4f9a-8422-8bc110eec2a8\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4mqb\" (UniqueName: \"kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb\") pod \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts\") pod \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\" (UID: \"aa0b529e-0a74-46b0-a364-0c6edc0e59f4\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts\") pod \"bab5304f-ca88-442f-b4ca-823d8d0be728\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947236 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv5jq\" (UniqueName: \"kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq\") pod \"bab5304f-ca88-442f-b4ca-823d8d0be728\" (UID: \"bab5304f-ca88-442f-b4ca-823d8d0be728\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts\") pod \"2dc7c61a-28d8-491f-822f-dd544043c6f7\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts\") pod \"af3228c2-a636-4f9a-8422-8bc110eec2a8\" (UID: \"af3228c2-a636-4f9a-8422-8bc110eec2a8\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.947380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxtb\" (UniqueName: 
\"kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb\") pod \"2dc7c61a-28d8-491f-822f-dd544043c6f7\" (UID: \"2dc7c61a-28d8-491f-822f-dd544043c6f7\") " Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.948005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af3228c2-a636-4f9a-8422-8bc110eec2a8" (UID: "af3228c2-a636-4f9a-8422-8bc110eec2a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.948077 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa0b529e-0a74-46b0-a364-0c6edc0e59f4" (UID: "aa0b529e-0a74-46b0-a364-0c6edc0e59f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.948822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bab5304f-ca88-442f-b4ca-823d8d0be728" (UID: "bab5304f-ca88-442f-b4ca-823d8d0be728"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.949125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dc7c61a-28d8-491f-822f-dd544043c6f7" (UID: "2dc7c61a-28d8-491f-822f-dd544043c6f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.950661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb" (OuterVolumeSpecName: "kube-api-access-dvxtb") pod "2dc7c61a-28d8-491f-822f-dd544043c6f7" (UID: "2dc7c61a-28d8-491f-822f-dd544043c6f7"). InnerVolumeSpecName "kube-api-access-dvxtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.951185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb" (OuterVolumeSpecName: "kube-api-access-f4mqb") pod "aa0b529e-0a74-46b0-a364-0c6edc0e59f4" (UID: "aa0b529e-0a74-46b0-a364-0c6edc0e59f4"). InnerVolumeSpecName "kube-api-access-f4mqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.951444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f" (OuterVolumeSpecName: "kube-api-access-f9m8f") pod "af3228c2-a636-4f9a-8422-8bc110eec2a8" (UID: "af3228c2-a636-4f9a-8422-8bc110eec2a8"). InnerVolumeSpecName "kube-api-access-f9m8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:47 crc kubenswrapper[5030]: I0120 23:27:47.951679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq" (OuterVolumeSpecName: "kube-api-access-sv5jq") pod "bab5304f-ca88-442f-b4ca-823d8d0be728" (UID: "bab5304f-ca88-442f-b4ca-823d8d0be728"). InnerVolumeSpecName "kube-api-access-sv5jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049589 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049653 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bab5304f-ca88-442f-b4ca-823d8d0be728-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049671 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv5jq\" (UniqueName: \"kubernetes.io/projected/bab5304f-ca88-442f-b4ca-823d8d0be728-kube-api-access-sv5jq\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049687 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc7c61a-28d8-491f-822f-dd544043c6f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049702 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3228c2-a636-4f9a-8422-8bc110eec2a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049713 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvxtb\" (UniqueName: \"kubernetes.io/projected/2dc7c61a-28d8-491f-822f-dd544043c6f7-kube-api-access-dvxtb\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049725 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9m8f\" (UniqueName: \"kubernetes.io/projected/af3228c2-a636-4f9a-8422-8bc110eec2a8-kube-api-access-f9m8f\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.049736 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4mqb\" (UniqueName: \"kubernetes.io/projected/aa0b529e-0a74-46b0-a364-0c6edc0e59f4-kube-api-access-f4mqb\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.240816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-gl9ft" event={"ID":"2dc7c61a-28d8-491f-822f-dd544043c6f7","Type":"ContainerDied","Data":"e0ea03d8e5ce262631c1e953efe1c2042e0319a5383a75ee76ddc6789c901f58"} Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.242515 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ea03d8e5ce262631c1e953efe1c2042e0319a5383a75ee76ddc6789c901f58" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.240840 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-gl9ft" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.243669 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.243707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-0855-account-create-update-x55lz" event={"ID":"2247a835-f44c-4a87-8d10-9d93b86d6fb9","Type":"ContainerDied","Data":"4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2"} Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.243760 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1a77e3657973b8d30e67d6cd96a5df7ab4387805d4d3eaecdeaa52e7b5e6b2" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.246658 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.246698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-2712-account-create-update-kvv69" event={"ID":"af3228c2-a636-4f9a-8422-8bc110eec2a8","Type":"ContainerDied","Data":"2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196"} Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.246741 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa07c1766f1a1db5a3ec30ea07f2c0e1206fe422330c43bcc07caf0ffdce196" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.250000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" event={"ID":"aa0b529e-0a74-46b0-a364-0c6edc0e59f4","Type":"ContainerDied","Data":"39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123"} Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.250060 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39269232e8fd49c7af2fbdc7f88eb87ee553b649d908d602d4e14f0454351123" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.250170 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.255371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-gp87c" event={"ID":"bab5304f-ca88-442f-b4ca-823d8d0be728","Type":"ContainerDied","Data":"e47d1d8cd9633a1079c583740f1b6a0acd8ed28fc658cd75716ac0a1ddad3be2"} Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.255408 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e47d1d8cd9633a1079c583740f1b6a0acd8ed28fc658cd75716ac0a1ddad3be2" Jan 20 23:27:48 crc kubenswrapper[5030]: I0120 23:27:48.255503 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-gp87c" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.337458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8b8rc"] Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.348154 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8b8rc"] Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.856144 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-fbr4k"] Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.857235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc7c61a-28d8-491f-822f-dd544043c6f7" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.857308 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc7c61a-28d8-491f-822f-dd544043c6f7" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.857809 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38852246-6c24-4866-921a-ce61995931d7" containerName="swift-ring-rebalance" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.857914 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="38852246-6c24-4866-921a-ce61995931d7" containerName="swift-ring-rebalance" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.858019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3228c2-a636-4f9a-8422-8bc110eec2a8" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.858125 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3228c2-a636-4f9a-8422-8bc110eec2a8" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.858229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2247a835-f44c-4a87-8d10-9d93b86d6fb9" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.858447 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2247a835-f44c-4a87-8d10-9d93b86d6fb9" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.858538 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab5304f-ca88-442f-b4ca-823d8d0be728" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.858637 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab5304f-ca88-442f-b4ca-823d8d0be728" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.858712 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a89983-ae51-4585-9d9a-4309df4ac15b" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.858837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a89983-ae51-4585-9d9a-4309df4ac15b" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: E0120 23:27:49.858902 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.859239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc 
kubenswrapper[5030]: E0120 23:27:49.859295 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0b529e-0a74-46b0-a364-0c6edc0e59f4" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.859355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0b529e-0a74-46b0-a364-0c6edc0e59f4" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.860806 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc7c61a-28d8-491f-822f-dd544043c6f7" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.860894 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3228c2-a636-4f9a-8422-8bc110eec2a8" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.860968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a89983-ae51-4585-9d9a-4309df4ac15b" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.861035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0b529e-0a74-46b0-a364-0c6edc0e59f4" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.861089 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.861177 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab5304f-ca88-442f-b4ca-823d8d0be728" containerName="mariadb-database-create" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.861265 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="38852246-6c24-4866-921a-ce61995931d7" containerName="swift-ring-rebalance" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.861327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2247a835-f44c-4a87-8d10-9d93b86d6fb9" containerName="mariadb-account-create-update" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.862161 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.865024 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.865421 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-6vjfc" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.880557 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-fbr4k"] Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.975612 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3" path="/var/lib/kubelet/pods/3a4dadc6-d4e8-46dd-8c1f-9ad6ca1931c3/volumes" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.985210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.985361 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.985444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mlqd\" (UniqueName: \"kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:49 crc kubenswrapper[5030]: I0120 23:27:49.985479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.086849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.087293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.087343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mlqd\" (UniqueName: \"kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd\") pod \"glance-db-sync-fbr4k\" 
(UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.087376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.092598 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.095243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.095316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.108701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mlqd\" (UniqueName: \"kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd\") pod \"glance-db-sync-fbr4k\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.195095 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:50 crc kubenswrapper[5030]: I0120 23:27:50.821404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-fbr4k"] Jan 20 23:27:51 crc kubenswrapper[5030]: I0120 23:27:51.312686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" event={"ID":"dd6294ee-f735-499f-b340-952f3f7b2ded","Type":"ContainerStarted","Data":"f91c21eeb665d073e3d594005511db69bae64ddb8e8de53ad099e767b3acc02b"} Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.239766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.257370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"swift-storage-0\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.310030 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.324225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" event={"ID":"dd6294ee-f735-499f-b340-952f3f7b2ded","Type":"ContainerStarted","Data":"3a8a67f49b51b49c3c0887455162bd0a450f3ed1781f0832075d5df233e94c00"} Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.355440 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" podStartSLOduration=3.355417119 podStartE2EDuration="3.355417119s" podCreationTimestamp="2026-01-20 23:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:52.351989815 +0000 UTC m=+3144.672250113" watchObservedRunningTime="2026-01-20 23:27:52.355417119 +0000 UTC m=+3144.675677407" Jan 20 23:27:52 crc kubenswrapper[5030]: I0120 23:27:52.831747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:27:53 crc kubenswrapper[5030]: I0120 23:27:53.336309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"fc30b61ad3b670e8566e451c0bf77bed40bb4ae683b1002df810eb3e8a7c7e56"} Jan 20 23:27:53 crc kubenswrapper[5030]: I0120 23:27:53.336354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"c023313c2dc4acfa642739fd15b6dce1cdb2384c9ddf2f534c8c7e3a6317cc61"} Jan 20 23:27:53 crc kubenswrapper[5030]: I0120 23:27:53.336366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"bc4cce11a2f9326a3c8c9f33ccadc33a156ee68b356f675f96fecd75cc9144e0"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.367721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8q5qn"] Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.369548 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.372154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"6a5682b6b00c488de2f354ed3629a706b621d15c96ef14b746c170ab2da520e2"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378717 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"234735c49802543dffa54027fc58edd4cb277d91c2216d680f26f1186b80560f"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"b7ae32b1a4f1e7f7fe7aad12eac5a1d984d90e41f390bb645d6ca4b2f630627a"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"b60f5caf9f00c875e2dfdf1fc8cd6af041235cd0e56388a3a11dd402988b4307"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"49574068fdaf73188ebe3b005b4e8a44cf73f4cfe484defd630ff27da3cfb6e0"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.378797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"e61677d1c4df154307a801a3e62190f8fa8c59cdfba25fbc0e4949bc2af510ce"} Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.379279 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8q5qn"] Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.484004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqnm\" (UniqueName: \"kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.484094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.574362 5030 scope.go:117] "RemoveContainer" containerID="e529e28e3eebf95ed91002b404e311beb3c6e1ef97122ba56f112fd06d3bb19e" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.585667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nxqnm\" (UniqueName: \"kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.585721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.586582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.609174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqnm\" (UniqueName: \"kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm\") pod \"root-account-create-update-8q5qn\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.647015 5030 scope.go:117] "RemoveContainer" containerID="26b1ddd32ac5e65e681ee60619eed541c855aa3b58f3a73c15ea8a35ce058b23" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.669171 5030 scope.go:117] "RemoveContainer" containerID="f6e472c12202cebea002a0104ff390df2d1fb5a738489ed64bf7a451c77424dc" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.709035 5030 scope.go:117] "RemoveContainer" containerID="b43b0407df047d8476ad43c9df0c38db00b8003aee80ca3e76a060dbf6a4e8b7" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.732489 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.747509 5030 scope.go:117] "RemoveContainer" containerID="8d163dcccbc2d4684e45a63084005401caf188afb4ae50b0ddba3caecc42d9d6" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.788528 5030 scope.go:117] "RemoveContainer" containerID="be58b3a0d8691178b2610831673d15113e2422d314d8e39d2cd7d4b33b03747b" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.823978 5030 scope.go:117] "RemoveContainer" containerID="f9dc97af04fb3932c7410e9c9a358e795dd6cf3a504e15146011c5956745a3ca" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.904356 5030 scope.go:117] "RemoveContainer" containerID="8a44f789b6134272d0e93b7ae826cd695547bc33de5fd34f04050d6752912841" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.962505 5030 scope.go:117] "RemoveContainer" containerID="c06a48fe05f35f1049e48b548439a81f0af40a795cf0e6eb9e109fa30620ec29" Jan 20 23:27:54 crc kubenswrapper[5030]: I0120 23:27:54.992261 5030 scope.go:117] "RemoveContainer" containerID="e0404281b5a4570801be1942e93ace165a65a028f86411ac0d3d3c68f979b1f5" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.040352 5030 scope.go:117] "RemoveContainer" containerID="8c7558df2cb9e4c2727a9911eeacd806ffc948bf680971ab33112fc7377f785e" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.072671 5030 scope.go:117] "RemoveContainer" containerID="e5e961cd6d773bbb1889566094203531756a0fa382a0b915ac0416162dfbbe3e" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.116279 5030 scope.go:117] "RemoveContainer" containerID="adab7175f57c8246456313c9d8bc528995d62fe1b12e9af3d3a9943092a3af37" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.148395 5030 scope.go:117] "RemoveContainer" containerID="ac32d0c374c317b9d33d17588c3e73732c96a11a9f71507f7364903b5fa1f2fa" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.177387 5030 scope.go:117] "RemoveContainer" containerID="520c14c6afe1bdc4fa9d9f804c73771f4b7427492baf3220765f9e111a1932b6" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.196486 5030 scope.go:117] "RemoveContainer" containerID="d0d3f005943a446b84278210290d3b4a01364c1349a853ce14c5fbeadf1bf75c" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.241549 5030 scope.go:117] "RemoveContainer" containerID="8281a5190aff5bc9d8e1739e5f4a1d43a6a5875e214c27977cb38115f035a585" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.265416 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8q5qn"] Jan 20 23:27:55 crc kubenswrapper[5030]: W0120 23:27:55.274393 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6b6dfdf_c86c_4d99_913b_0c28fb0c1811.slice/crio-3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7 WatchSource:0}: Error finding container 3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7: Status 404 returned error can't find the container with id 3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7 Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.388062 5030 scope.go:117] "RemoveContainer" containerID="bd3d28d20d93dfa760fd440f9b51cd546029af0dd77cca083cc9280b00e90e98" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.403670 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"d264c92684da056e428a6c733961f8fe92aa28e300c81f22e991287ca3bdb9ab"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.403724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"fb5859be41c2b9cb5338526464383b63d74381d26c7f59f26d24b5ff23c949cc"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.403735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"60cb27177b8ee600ca0fa9b124dfd7d1119967c490af0614c1bb464ddeaa2707"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.403746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"59c21e5e34c6d5dbf8de4d3d91b825d296eab38936e7d83529e745d3d2b3e2a5"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.403754 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"9308c7e5305215685af92a1087be86f06b8582716c763d7b3d73399d891e23fb"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.419306 5030 scope.go:117] "RemoveContainer" containerID="c5d8451ca1df90836c74e02e735263405dd7345ac078ad70ebbc6dc29103b6a3" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.427160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" event={"ID":"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811","Type":"ContainerStarted","Data":"3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7"} Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.461545 5030 scope.go:117] "RemoveContainer" containerID="b46b3e1e35f4cc0f7f8a1e97882714048b5dd5923a5d10dcfff8a62889c31b12" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.510174 5030 scope.go:117] "RemoveContainer" containerID="5e7eee36212b875a3c18706283d7234207c6eeca24cbf4c2798aad5488b2afb1" Jan 20 23:27:55 crc kubenswrapper[5030]: I0120 23:27:55.530630 5030 scope.go:117] "RemoveContainer" containerID="da2491bd3ada4b8716bf4a0515aa6180de24a54d201a7950a35c5b4868a9337b" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.446298 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" containerID="9fd52bc2e65925f44682cc313398c8fe0670f551130374c8775dcfb43a8ae0b8" exitCode=0 Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.447323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" event={"ID":"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811","Type":"ContainerDied","Data":"9fd52bc2e65925f44682cc313398c8fe0670f551130374c8775dcfb43a8ae0b8"} Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.454744 5030 generic.go:334] "Generic (PLEG): container finished" podID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerID="1778eb3dac897ab781625cc651ba606b63a5dd6829ad139e3551fa0e3df843c2" exitCode=0 Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.454830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" 
event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerDied","Data":"1778eb3dac897ab781625cc651ba606b63a5dd6829ad139e3551fa0e3df843c2"} Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.472395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"25c15d7715d9e340868a2859dd1a58b1e7573ed255f28afe95351da547d45981"} Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.472461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerStarted","Data":"53566cbe13e0554a3b86a64ff88413e4e2971a75f558b615fb405505c102f2c8"} Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.544410 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.544386012 podStartE2EDuration="21.544386012s" podCreationTimestamp="2026-01-20 23:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:56.537637809 +0000 UTC m=+3148.857898107" watchObservedRunningTime="2026-01-20 23:27:56.544386012 +0000 UTC m=+3148.864646310" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.727236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.730086 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.732192 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.746069 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.830483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.830698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.830783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.830820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrs4k\" 
(UniqueName: \"kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.932983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.933055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.933082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrs4k\" (UniqueName: \"kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.933159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.933819 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.933886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.934528 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:56 crc kubenswrapper[5030]: I0120 23:27:56.960185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrs4k\" (UniqueName: \"kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k\") pod \"dnsmasq-dnsmasq-8687b85765-4kxkb\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 
23:27:57.045437 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.482979 5030 generic.go:334] "Generic (PLEG): container finished" podID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerID="15c82f83871ae8a50514e3320b06975949c2d6131b50863a977396f67100c222" exitCode=0 Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.483046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerDied","Data":"15c82f83871ae8a50514e3320b06975949c2d6131b50863a977396f67100c222"} Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.488524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerStarted","Data":"b96f29c6412312302df952900fcc3ad294f799efc164332681b4bdeaa2e0b9e6"} Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.489338 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.563255 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:27:57 crc kubenswrapper[5030]: W0120 23:27:57.563673 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5a9211_0407_48f1_9d3e_8fb089aa2b56.slice/crio-39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce WatchSource:0}: Error finding container 39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce: Status 404 returned error can't find the container with id 39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.566034 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.565971551 podStartE2EDuration="36.565971551s" podCreationTimestamp="2026-01-20 23:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:57.530987824 +0000 UTC m=+3149.851248122" watchObservedRunningTime="2026-01-20 23:27:57.565971551 +0000 UTC m=+3149.886231839" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.870772 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.956100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts\") pod \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.956244 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxqnm\" (UniqueName: \"kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm\") pod \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\" (UID: \"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811\") " Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.956780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" (UID: "a6b6dfdf-c86c-4d99-913b-0c28fb0c1811"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.960957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm" (OuterVolumeSpecName: "kube-api-access-nxqnm") pod "a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" (UID: "a6b6dfdf-c86c-4d99-913b-0c28fb0c1811"). InnerVolumeSpecName "kube-api-access-nxqnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:27:57 crc kubenswrapper[5030]: I0120 23:27:57.969719 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:27:57 crc kubenswrapper[5030]: E0120 23:27:57.970098 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.058822 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.058895 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxqnm\" (UniqueName: \"kubernetes.io/projected/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811-kube-api-access-nxqnm\") on node \"crc\" DevicePath \"\"" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.501063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerStarted","Data":"a5f50faa8925170ab9b2766c285e845c40fa14a592054ea95170b7280822dae6"} Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.503047 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd6294ee-f735-499f-b340-952f3f7b2ded" containerID="3a8a67f49b51b49c3c0887455162bd0a450f3ed1781f0832075d5df233e94c00" exitCode=0 Jan 20 23:27:58 crc 
kubenswrapper[5030]: I0120 23:27:58.503121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" event={"ID":"dd6294ee-f735-499f-b340-952f3f7b2ded","Type":"ContainerDied","Data":"3a8a67f49b51b49c3c0887455162bd0a450f3ed1781f0832075d5df233e94c00"} Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.503475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.505846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" event={"ID":"7d5a9211-0407-48f1-9d3e-8fb089aa2b56","Type":"ContainerDied","Data":"50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe"} Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.505772 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerID="50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe" exitCode=0 Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.506440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" event={"ID":"7d5a9211-0407-48f1-9d3e-8fb089aa2b56","Type":"ContainerStarted","Data":"39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce"} Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.525227 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.525228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-8q5qn" event={"ID":"a6b6dfdf-c86c-4d99-913b-0c28fb0c1811","Type":"ContainerDied","Data":"3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7"} Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.525310 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3641d3391147390e9803c367d5e8d6806335519f3b2d213668b529858da899b7" Jan 20 23:27:58 crc kubenswrapper[5030]: I0120 23:27:58.562884 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.562864683 podStartE2EDuration="37.562864683s" podCreationTimestamp="2026-01-20 23:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:58.555911065 +0000 UTC m=+3150.876171363" watchObservedRunningTime="2026-01-20 23:27:58.562864683 +0000 UTC m=+3150.883124971" Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.538253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" event={"ID":"7d5a9211-0407-48f1-9d3e-8fb089aa2b56","Type":"ContainerStarted","Data":"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b"} Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.564011 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" podStartSLOduration=3.563993348 podStartE2EDuration="3.563993348s" podCreationTimestamp="2026-01-20 23:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:27:59.56032462 +0000 UTC 
m=+3151.880584898" watchObservedRunningTime="2026-01-20 23:27:59.563993348 +0000 UTC m=+3151.884253636" Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.961839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.990692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data\") pod \"dd6294ee-f735-499f-b340-952f3f7b2ded\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.990763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data\") pod \"dd6294ee-f735-499f-b340-952f3f7b2ded\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.990824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mlqd\" (UniqueName: \"kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd\") pod \"dd6294ee-f735-499f-b340-952f3f7b2ded\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.990886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle\") pod \"dd6294ee-f735-499f-b340-952f3f7b2ded\" (UID: \"dd6294ee-f735-499f-b340-952f3f7b2ded\") " Jan 20 23:27:59 crc kubenswrapper[5030]: I0120 23:27:59.996829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd" (OuterVolumeSpecName: "kube-api-access-7mlqd") pod "dd6294ee-f735-499f-b340-952f3f7b2ded" (UID: "dd6294ee-f735-499f-b340-952f3f7b2ded"). InnerVolumeSpecName "kube-api-access-7mlqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.002751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd6294ee-f735-499f-b340-952f3f7b2ded" (UID: "dd6294ee-f735-499f-b340-952f3f7b2ded"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.042865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd6294ee-f735-499f-b340-952f3f7b2ded" (UID: "dd6294ee-f735-499f-b340-952f3f7b2ded"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.045634 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data" (OuterVolumeSpecName: "config-data") pod "dd6294ee-f735-499f-b340-952f3f7b2ded" (UID: "dd6294ee-f735-499f-b340-952f3f7b2ded"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.094422 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.094461 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.094476 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mlqd\" (UniqueName: \"kubernetes.io/projected/dd6294ee-f735-499f-b340-952f3f7b2ded-kube-api-access-7mlqd\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.094487 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6294ee-f735-499f-b340-952f3f7b2ded-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.553055 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.553056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-fbr4k" event={"ID":"dd6294ee-f735-499f-b340-952f3f7b2ded","Type":"ContainerDied","Data":"f91c21eeb665d073e3d594005511db69bae64ddb8e8de53ad099e767b3acc02b"} Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.553111 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f91c21eeb665d073e3d594005511db69bae64ddb8e8de53ad099e767b3acc02b" Jan 20 23:28:00 crc kubenswrapper[5030]: I0120 23:28:00.553593 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.048264 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.130385 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.131022 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="dnsmasq-dns" containerID="cri-o://94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50" gracePeriod=10 Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.597004 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.619033 5030 generic.go:334] "Generic (PLEG): container finished" podID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerID="94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50" exitCode=0 Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.619083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" event={"ID":"e70c799f-11cc-4de5-87b7-c9fb919df4b8","Type":"ContainerDied","Data":"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50"} Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.619121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" event={"ID":"e70c799f-11cc-4de5-87b7-c9fb919df4b8","Type":"ContainerDied","Data":"2b484d297f125d231acee7c20314852a50fb43af5d47656b8b6023213e14bd3b"} Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.619126 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.619167 5030 scope.go:117] "RemoveContainer" containerID="94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.620882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config\") pod \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.620953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc\") pod \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.621015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6tgr\" (UniqueName: \"kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr\") pod \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\" (UID: \"e70c799f-11cc-4de5-87b7-c9fb919df4b8\") " Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.629500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr" (OuterVolumeSpecName: "kube-api-access-r6tgr") pod "e70c799f-11cc-4de5-87b7-c9fb919df4b8" (UID: "e70c799f-11cc-4de5-87b7-c9fb919df4b8"). InnerVolumeSpecName "kube-api-access-r6tgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.686145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config" (OuterVolumeSpecName: "config") pod "e70c799f-11cc-4de5-87b7-c9fb919df4b8" (UID: "e70c799f-11cc-4de5-87b7-c9fb919df4b8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.687110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "e70c799f-11cc-4de5-87b7-c9fb919df4b8" (UID: "e70c799f-11cc-4de5-87b7-c9fb919df4b8"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.703533 5030 scope.go:117] "RemoveContainer" containerID="c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.722056 5030 scope.go:117] "RemoveContainer" containerID="94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50" Jan 20 23:28:07 crc kubenswrapper[5030]: E0120 23:28:07.722452 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50\": container with ID starting with 94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50 not found: ID does not exist" containerID="94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.722504 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50"} err="failed to get container status \"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50\": rpc error: code = NotFound desc = could not find container \"94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50\": container with ID starting with 94fd76e4c952cda5c478e27fd2eeb648b29babe915e29e5841ddb5e6d603cd50 not found: ID does not exist" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.722539 5030 scope.go:117] "RemoveContainer" containerID="c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d" Jan 20 23:28:07 crc kubenswrapper[5030]: E0120 23:28:07.722878 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d\": container with ID starting with c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d not found: ID does not exist" containerID="c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.722919 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d"} err="failed to get container status \"c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d\": rpc error: code = NotFound desc = could not find container \"c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d\": container with ID starting with c672e76b107c7e32dc030b9ed48f0dabafda407a97c9dd170a2de01ae5711b7d not found: ID does not exist" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.723133 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.723157 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: 
\"kubernetes.io/configmap/e70c799f-11cc-4de5-87b7-c9fb919df4b8-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.723184 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6tgr\" (UniqueName: \"kubernetes.io/projected/e70c799f-11cc-4de5-87b7-c9fb919df4b8-kube-api-access-r6tgr\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.954231 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:28:07 crc kubenswrapper[5030]: I0120 23:28:07.973208 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-q6nw4"] Jan 20 23:28:09 crc kubenswrapper[5030]: I0120 23:28:09.971563 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" path="/var/lib/kubelet/pods/e70c799f-11cc-4de5-87b7-c9fb919df4b8/volumes" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.396905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.675062 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.707993 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cxftt"] Jan 20 23:28:12 crc kubenswrapper[5030]: E0120 23:28:12.708328 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" containerName="mariadb-account-create-update" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" containerName="mariadb-account-create-update" Jan 20 23:28:12 crc kubenswrapper[5030]: E0120 23:28:12.708386 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="init" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708393 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="init" Jan 20 23:28:12 crc kubenswrapper[5030]: E0120 23:28:12.708401 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6294ee-f735-499f-b340-952f3f7b2ded" containerName="glance-db-sync" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708407 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6294ee-f735-499f-b340-952f3f7b2ded" containerName="glance-db-sync" Jan 20 23:28:12 crc kubenswrapper[5030]: E0120 23:28:12.708415 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="dnsmasq-dns" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708421 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="dnsmasq-dns" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708568 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" containerName="mariadb-account-create-update" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.708584 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70c799f-11cc-4de5-87b7-c9fb919df4b8" containerName="dnsmasq-dns" Jan 20 
23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.712181 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6294ee-f735-499f-b340-952f3f7b2ded" containerName="glance-db-sync" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.712784 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.726523 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cxftt"] Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.808465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pw4j\" (UniqueName: \"kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.808527 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.897612 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jx4gj"] Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.898562 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.909226 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc"] Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.909929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pw4j\" (UniqueName: \"kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.909980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.910297 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.910814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.911942 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.922919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jx4gj"] Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.931122 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc"] Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.948473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pw4j\" (UniqueName: \"kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j\") pod \"barbican-db-create-cxftt\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:12 crc kubenswrapper[5030]: I0120 23:28:12.962195 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:28:12 crc kubenswrapper[5030]: E0120 23:28:12.962364 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.011341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.011416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7mwh\" (UniqueName: \"kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.011457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29d2p\" (UniqueName: \"kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.011481 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.032139 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.103557 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jqzcn"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.104493 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.112792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.112926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7mwh\" (UniqueName: \"kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.112988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29d2p\" (UniqueName: \"kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.113036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.113656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.114069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.117506 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p"] Jan 20 23:28:13 crc 
kubenswrapper[5030]: I0120 23:28:13.118653 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.121345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.125263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jqzcn"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.137216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7mwh\" (UniqueName: \"kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh\") pod \"barbican-6a16-account-create-update-t6wkc\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.138916 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29d2p\" (UniqueName: \"kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p\") pod \"cinder-db-create-jx4gj\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.146549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.213127 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.214284 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.214403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wwp\" (UniqueName: \"kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.214493 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4bv\" (UniqueName: \"kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.214575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.214645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.213131 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.217054 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.232108 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.236272 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.296649 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pd9lv"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.297955 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.303391 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.303736 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-c75q4" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.303846 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.303955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pddd\" (UniqueName: \"kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wwp\" (UniqueName: \"kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.318675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4bv\" (UniqueName: \"kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.319471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.319729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.342176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wwp\" (UniqueName: \"kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp\") pod \"neutron-7c7c-account-create-update-mwr5p\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.343429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4bv\" (UniqueName: \"kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv\") pod \"neutron-db-create-jqzcn\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.354968 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pd9lv"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.423275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.423417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr8s9\" (UniqueName: \"kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.423797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.423869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.423931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pddd\" (UniqueName: 
\"kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.424192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.430085 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.454800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pddd\" (UniqueName: \"kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd\") pod \"cinder-c1a3-account-create-update-kqpvq\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.494583 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.527697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr8s9\" (UniqueName: \"kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.527783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.527818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.532297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.534048 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.546315 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.549273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr8s9\" (UniqueName: \"kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9\") pod \"keystone-db-sync-pd9lv\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.616576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cxftt"] Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.627158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:13 crc kubenswrapper[5030]: W0120 23:28:13.628976 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106f4c1b_d73d_4dac_988b_0b08d0c66b69.slice/crio-5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c WatchSource:0}: Error finding container 5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c: Status 404 returned error can't find the container with id 5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.714021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cxftt" event={"ID":"106f4c1b-d73d-4dac-988b-0b08d0c66b69","Type":"ContainerStarted","Data":"5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c"} Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.759566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jx4gj"] Jan 20 23:28:13 crc kubenswrapper[5030]: W0120 23:28:13.769906 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6d04f3_a997_4d4a_85cc_b37c1c89ab5e.slice/crio-402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b WatchSource:0}: Error finding container 402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b: Status 404 returned error can't find the container with id 402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.813482 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc"] Jan 20 23:28:13 crc kubenswrapper[5030]: W0120 23:28:13.820123 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda99373ff_558b_4e95_a48e_1b457b727b99.slice/crio-8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30 WatchSource:0}: Error finding container 8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30: Status 404 returned error can't find the container with id 8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30 Jan 20 23:28:13 crc kubenswrapper[5030]: I0120 23:28:13.934747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jqzcn"] Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.044665 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p"] Jan 20 23:28:14 crc kubenswrapper[5030]: 
W0120 23:28:14.046924 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8fbce55_4d90_46db_a2ee_7a17b043c246.slice/crio-ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea WatchSource:0}: Error finding container ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea: Status 404 returned error can't find the container with id ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.114862 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq"] Jan 20 23:28:14 crc kubenswrapper[5030]: W0120 23:28:14.123108 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode34e0e3b_f240_482d_925b_0c746ed1346e.slice/crio-0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7 WatchSource:0}: Error finding container 0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7: Status 404 returned error can't find the container with id 0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7 Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.211977 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pd9lv"] Jan 20 23:28:14 crc kubenswrapper[5030]: W0120 23:28:14.337767 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a72a1f_d1f6_4a23_ba6c_9baf90e14fc1.slice/crio-2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284 WatchSource:0}: Error finding container 2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284: Status 404 returned error can't find the container with id 2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284 Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.726228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" event={"ID":"e34e0e3b-f240-482d-925b-0c746ed1346e","Type":"ContainerStarted","Data":"0bf73260889f3156f80b4c12bb900ddb99d6d59bc8af041c63fd69a91e15ccea"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.726561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" event={"ID":"e34e0e3b-f240-482d-925b-0c746ed1346e","Type":"ContainerStarted","Data":"0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.729472 5030 generic.go:334] "Generic (PLEG): container finished" podID="4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" containerID="75bdd4500474fa14aff48f8007f5e9d67c778fd39b444260ca7c21a9d0b54539" exitCode=0 Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.729514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" event={"ID":"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e","Type":"ContainerDied","Data":"75bdd4500474fa14aff48f8007f5e9d67c778fd39b444260ca7c21a9d0b54539"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.729532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" event={"ID":"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e","Type":"ContainerStarted","Data":"402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b"} Jan 20 23:28:14 crc kubenswrapper[5030]: 
I0120 23:28:14.733286 5030 generic.go:334] "Generic (PLEG): container finished" podID="a99373ff-558b-4e95-a48e-1b457b727b99" containerID="769e9e12ef6c3bc11d8c7227a59831aaf4beb7eb11f3328fed8e63f694d38681" exitCode=0 Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.733325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" event={"ID":"a99373ff-558b-4e95-a48e-1b457b727b99","Type":"ContainerDied","Data":"769e9e12ef6c3bc11d8c7227a59831aaf4beb7eb11f3328fed8e63f694d38681"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.733339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" event={"ID":"a99373ff-558b-4e95-a48e-1b457b727b99","Type":"ContainerStarted","Data":"8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.745142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" event={"ID":"f8fbce55-4d90-46db-a2ee-7a17b043c246","Type":"ContainerStarted","Data":"c54f67d7ddbd7c4c1aee630494153ae6c202ce4616219d0db07a4f4bb1e257f7"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.745188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" event={"ID":"f8fbce55-4d90-46db-a2ee-7a17b043c246","Type":"ContainerStarted","Data":"ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.753797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" event={"ID":"e7627c98-1b9d-43d6-9479-0b36228b1288","Type":"ContainerStarted","Data":"6d02ba4f740c40f15ed8b65911dd184538b19b9aa75c2e5567cbec252abecfc4"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.753837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" event={"ID":"e7627c98-1b9d-43d6-9479-0b36228b1288","Type":"ContainerStarted","Data":"4636fb815a14ce46beca6cc1ff3191bcb5286b2037dd9076d43775bb432da24a"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.761186 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" podStartSLOduration=1.761166839 podStartE2EDuration="1.761166839s" podCreationTimestamp="2026-01-20 23:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:14.749003615 +0000 UTC m=+3167.069263903" watchObservedRunningTime="2026-01-20 23:28:14.761166839 +0000 UTC m=+3167.081427117" Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.771938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" event={"ID":"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1","Type":"ContainerStarted","Data":"d972a5ba5335c79ee7f0e9a2ee9867a6a5817c5df455505e757a4059fd140323"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.771987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" event={"ID":"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1","Type":"ContainerStarted","Data":"2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.775958 5030 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" podStartSLOduration=1.775943737 podStartE2EDuration="1.775943737s" podCreationTimestamp="2026-01-20 23:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:14.772907424 +0000 UTC m=+3167.093167722" watchObservedRunningTime="2026-01-20 23:28:14.775943737 +0000 UTC m=+3167.096204025" Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.801983 5030 generic.go:334] "Generic (PLEG): container finished" podID="106f4c1b-d73d-4dac-988b-0b08d0c66b69" containerID="cba3263a6508fedaf28776583fb352f8fcd4acf74a5a4acef1cadfc0e9b2be2d" exitCode=0 Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.802036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cxftt" event={"ID":"106f4c1b-d73d-4dac-988b-0b08d0c66b69","Type":"ContainerDied","Data":"cba3263a6508fedaf28776583fb352f8fcd4acf74a5a4acef1cadfc0e9b2be2d"} Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.948195 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" podStartSLOduration=1.948175436 podStartE2EDuration="1.948175436s" podCreationTimestamp="2026-01-20 23:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:14.943194155 +0000 UTC m=+3167.263454443" watchObservedRunningTime="2026-01-20 23:28:14.948175436 +0000 UTC m=+3167.268435724" Jan 20 23:28:14 crc kubenswrapper[5030]: I0120 23:28:14.957396 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" podStartSLOduration=1.9573781590000001 podStartE2EDuration="1.957378159s" podCreationTimestamp="2026-01-20 23:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:14.916175092 +0000 UTC m=+3167.236435380" watchObservedRunningTime="2026-01-20 23:28:14.957378159 +0000 UTC m=+3167.277638447" Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.815644 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8fbce55-4d90-46db-a2ee-7a17b043c246" containerID="c54f67d7ddbd7c4c1aee630494153ae6c202ce4616219d0db07a4f4bb1e257f7" exitCode=0 Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.815768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" event={"ID":"f8fbce55-4d90-46db-a2ee-7a17b043c246","Type":"ContainerDied","Data":"c54f67d7ddbd7c4c1aee630494153ae6c202ce4616219d0db07a4f4bb1e257f7"} Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.821279 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7627c98-1b9d-43d6-9479-0b36228b1288" containerID="6d02ba4f740c40f15ed8b65911dd184538b19b9aa75c2e5567cbec252abecfc4" exitCode=0 Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.821493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" event={"ID":"e7627c98-1b9d-43d6-9479-0b36228b1288","Type":"ContainerDied","Data":"6d02ba4f740c40f15ed8b65911dd184538b19b9aa75c2e5567cbec252abecfc4"} Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.824721 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="e34e0e3b-f240-482d-925b-0c746ed1346e" containerID="0bf73260889f3156f80b4c12bb900ddb99d6d59bc8af041c63fd69a91e15ccea" exitCode=0 Jan 20 23:28:15 crc kubenswrapper[5030]: I0120 23:28:15.824985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" event={"ID":"e34e0e3b-f240-482d-925b-0c746ed1346e","Type":"ContainerDied","Data":"0bf73260889f3156f80b4c12bb900ddb99d6d59bc8af041c63fd69a91e15ccea"} Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.313392 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.318731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.329732 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pw4j\" (UniqueName: \"kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j\") pod \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts\") pod \"a99373ff-558b-4e95-a48e-1b457b727b99\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts\") pod \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29d2p\" (UniqueName: \"kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p\") pod \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\" (UID: \"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477343 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7mwh\" (UniqueName: \"kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh\") pod \"a99373ff-558b-4e95-a48e-1b457b727b99\" (UID: \"a99373ff-558b-4e95-a48e-1b457b727b99\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.477370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts\") pod \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\" (UID: \"106f4c1b-d73d-4dac-988b-0b08d0c66b69\") " Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.478162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"106f4c1b-d73d-4dac-988b-0b08d0c66b69" (UID: "106f4c1b-d73d-4dac-988b-0b08d0c66b69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.478207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a99373ff-558b-4e95-a48e-1b457b727b99" (UID: "a99373ff-558b-4e95-a48e-1b457b727b99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.478218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" (UID: "4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.482550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh" (OuterVolumeSpecName: "kube-api-access-x7mwh") pod "a99373ff-558b-4e95-a48e-1b457b727b99" (UID: "a99373ff-558b-4e95-a48e-1b457b727b99"). InnerVolumeSpecName "kube-api-access-x7mwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.482665 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p" (OuterVolumeSpecName: "kube-api-access-29d2p") pod "4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" (UID: "4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e"). InnerVolumeSpecName "kube-api-access-29d2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.482757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j" (OuterVolumeSpecName: "kube-api-access-6pw4j") pod "106f4c1b-d73d-4dac-988b-0b08d0c66b69" (UID: "106f4c1b-d73d-4dac-988b-0b08d0c66b69"). InnerVolumeSpecName "kube-api-access-6pw4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579464 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pw4j\" (UniqueName: \"kubernetes.io/projected/106f4c1b-d73d-4dac-988b-0b08d0c66b69-kube-api-access-6pw4j\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579512 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a99373ff-558b-4e95-a48e-1b457b727b99-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579525 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579536 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29d2p\" (UniqueName: \"kubernetes.io/projected/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e-kube-api-access-29d2p\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579547 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7mwh\" (UniqueName: \"kubernetes.io/projected/a99373ff-558b-4e95-a48e-1b457b727b99-kube-api-access-x7mwh\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.579556 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106f4c1b-d73d-4dac-988b-0b08d0c66b69-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.839745 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-cxftt" event={"ID":"106f4c1b-d73d-4dac-988b-0b08d0c66b69","Type":"ContainerDied","Data":"5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c"} Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.839795 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-cxftt" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.839995 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eb244ee5229d5a174eefa99b99390d433ea4c237b18039c907ebfa274210c9c" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.843210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" event={"ID":"4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e","Type":"ContainerDied","Data":"402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b"} Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.843245 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jx4gj" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.843275 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402669c0b8c00aaae9627472fd899be79adc610dbd8c6e8b99a77d13a654473b" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.846828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" event={"ID":"a99373ff-558b-4e95-a48e-1b457b727b99","Type":"ContainerDied","Data":"8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30"} Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.846877 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc" Jan 20 23:28:16 crc kubenswrapper[5030]: I0120 23:28:16.846894 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b1bb35de8d62b3d7efa380e21096248589833575a54245fbec7d7a8c6965e30" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.096254 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.190851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pddd\" (UniqueName: \"kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd\") pod \"e34e0e3b-f240-482d-925b-0c746ed1346e\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.191009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts\") pod \"e34e0e3b-f240-482d-925b-0c746ed1346e\" (UID: \"e34e0e3b-f240-482d-925b-0c746ed1346e\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.191481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e34e0e3b-f240-482d-925b-0c746ed1346e" (UID: "e34e0e3b-f240-482d-925b-0c746ed1346e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.196772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd" (OuterVolumeSpecName: "kube-api-access-9pddd") pod "e34e0e3b-f240-482d-925b-0c746ed1346e" (UID: "e34e0e3b-f240-482d-925b-0c746ed1346e"). InnerVolumeSpecName "kube-api-access-9pddd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.292914 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pddd\" (UniqueName: \"kubernetes.io/projected/e34e0e3b-f240-482d-925b-0c746ed1346e-kube-api-access-9pddd\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.292948 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e34e0e3b-f240-482d-925b-0c746ed1346e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.369971 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.375305 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.496339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9wwp\" (UniqueName: \"kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp\") pod \"f8fbce55-4d90-46db-a2ee-7a17b043c246\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.496464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4bv\" (UniqueName: \"kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv\") pod \"e7627c98-1b9d-43d6-9479-0b36228b1288\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.496551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts\") pod \"e7627c98-1b9d-43d6-9479-0b36228b1288\" (UID: \"e7627c98-1b9d-43d6-9479-0b36228b1288\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.496581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts\") pod \"f8fbce55-4d90-46db-a2ee-7a17b043c246\" (UID: \"f8fbce55-4d90-46db-a2ee-7a17b043c246\") " Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.497715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7627c98-1b9d-43d6-9479-0b36228b1288" (UID: "e7627c98-1b9d-43d6-9479-0b36228b1288"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.497862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8fbce55-4d90-46db-a2ee-7a17b043c246" (UID: "f8fbce55-4d90-46db-a2ee-7a17b043c246"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.501845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv" (OuterVolumeSpecName: "kube-api-access-7x4bv") pod "e7627c98-1b9d-43d6-9479-0b36228b1288" (UID: "e7627c98-1b9d-43d6-9479-0b36228b1288"). InnerVolumeSpecName "kube-api-access-7x4bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.502851 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp" (OuterVolumeSpecName: "kube-api-access-f9wwp") pod "f8fbce55-4d90-46db-a2ee-7a17b043c246" (UID: "f8fbce55-4d90-46db-a2ee-7a17b043c246"). InnerVolumeSpecName "kube-api-access-f9wwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.598818 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9wwp\" (UniqueName: \"kubernetes.io/projected/f8fbce55-4d90-46db-a2ee-7a17b043c246-kube-api-access-f9wwp\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.598877 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x4bv\" (UniqueName: \"kubernetes.io/projected/e7627c98-1b9d-43d6-9479-0b36228b1288-kube-api-access-7x4bv\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.598899 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7627c98-1b9d-43d6-9479-0b36228b1288-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.598918 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8fbce55-4d90-46db-a2ee-7a17b043c246-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.859387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" event={"ID":"f8fbce55-4d90-46db-a2ee-7a17b043c246","Type":"ContainerDied","Data":"ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea"} Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.859431 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.859446 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff464a6bde45d8346face36744845e8a1bb628f1be7dd08a0f18b4bf1c4dd0ea" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.864146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" event={"ID":"e7627c98-1b9d-43d6-9479-0b36228b1288","Type":"ContainerDied","Data":"4636fb815a14ce46beca6cc1ff3191bcb5286b2037dd9076d43775bb432da24a"} Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.864487 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4636fb815a14ce46beca6cc1ff3191bcb5286b2037dd9076d43775bb432da24a" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.864179 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-jqzcn" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.869860 5030 generic.go:334] "Generic (PLEG): container finished" podID="23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" containerID="d972a5ba5335c79ee7f0e9a2ee9867a6a5817c5df455505e757a4059fd140323" exitCode=0 Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.869998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" event={"ID":"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1","Type":"ContainerDied","Data":"d972a5ba5335c79ee7f0e9a2ee9867a6a5817c5df455505e757a4059fd140323"} Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.874012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" event={"ID":"e34e0e3b-f240-482d-925b-0c746ed1346e","Type":"ContainerDied","Data":"0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7"} Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.874060 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1b35c8464f3d6cfa0f1b8cf6959a9b30c0a4c63ef8291769669960c06329a7" Jan 20 23:28:17 crc kubenswrapper[5030]: I0120 23:28:17.874129 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.293383 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.428347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr8s9\" (UniqueName: \"kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9\") pod \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.428433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle\") pod \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.428470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data\") pod \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\" (UID: \"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1\") " Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.434549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9" (OuterVolumeSpecName: "kube-api-access-lr8s9") pod "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" (UID: "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1"). InnerVolumeSpecName "kube-api-access-lr8s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.469241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" (UID: "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.473778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data" (OuterVolumeSpecName: "config-data") pod "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" (UID: "23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.530409 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr8s9\" (UniqueName: \"kubernetes.io/projected/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-kube-api-access-lr8s9\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.530779 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.530823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.907153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" event={"ID":"23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1","Type":"ContainerDied","Data":"2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284"} Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.907201 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d754963ab042de9679c2591e8f11e45148366aaeaab41a7393435f6859a9284" Jan 20 23:28:19 crc kubenswrapper[5030]: I0120 23:28:19.907265 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pd9lv" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.059703 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fk65m"] Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060678 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106f4c1b-d73d-4dac-988b-0b08d0c66b69" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060701 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="106f4c1b-d73d-4dac-988b-0b08d0c66b69" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060725 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7627c98-1b9d-43d6-9479-0b36228b1288" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060732 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7627c98-1b9d-43d6-9479-0b36228b1288" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99373ff-558b-4e95-a48e-1b457b727b99" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060757 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99373ff-558b-4e95-a48e-1b457b727b99" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060785 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060792 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060805 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34e0e3b-f240-482d-925b-0c746ed1346e" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060811 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34e0e3b-f240-482d-925b-0c746ed1346e" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060824 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" containerName="keystone-db-sync" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060832 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" containerName="keystone-db-sync" Jan 20 23:28:20 crc kubenswrapper[5030]: E0120 23:28:20.060848 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fbce55-4d90-46db-a2ee-7a17b043c246" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.060854 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fbce55-4d90-46db-a2ee-7a17b043c246" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061382 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fbce55-4d90-46db-a2ee-7a17b043c246" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061410 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7627c98-1b9d-43d6-9479-0b36228b1288" 
containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061424 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" containerName="keystone-db-sync" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061435 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="106f4c1b-d73d-4dac-988b-0b08d0c66b69" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061450 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34e0e3b-f240-482d-925b-0c746ed1346e" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061461 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" containerName="mariadb-database-create" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.061474 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99373ff-558b-4e95-a48e-1b457b727b99" containerName="mariadb-account-create-update" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.073979 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.083020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.083087 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.083289 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.083375 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.083966 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-c75q4" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.153849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fk65m"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltltn\" (UniqueName: \"kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.265293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.281719 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-7fxnx"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.282763 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.297252 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-wtz2q" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.297295 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.297252 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.310494 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-7fxnx"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.330450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4x6fd"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.331464 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.339827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.339981 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.340096 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-bp47s" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqdm\" (UniqueName: \"kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368632 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltltn\" (UniqueName: \"kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.368726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.386443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.390294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.398357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.399421 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4x6fd"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.400751 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.408941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.409007 5030 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qvzhj"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.410112 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.414310 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-lkwbp" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.414901 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.432316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltltn\" (UniqueName: \"kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn\") pod \"keystone-bootstrap-fk65m\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.449087 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.470197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.470833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.470926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 
20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7pv\" (UniqueName: \"kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.471755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqdm\" (UniqueName: \"kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.484844 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qvzhj"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.488283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.488567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.491302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.502018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqdm\" (UniqueName: \"kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm\") pod \"placement-db-sync-7fxnx\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.573892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.573947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5nf\" (UniqueName: \"kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7pv\" (UniqueName: \"kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.574885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.585510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.586162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.590440 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.592161 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.593163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.605376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.608997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.609757 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.613212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7pv\" (UniqueName: \"kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv\") pod \"cinder-db-sync-4x6fd\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.617905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.651243 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmxx6\" (UniqueName: \"kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.676989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5nf\" (UniqueName: \"kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.677018 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.677067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.677089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.677115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.686841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.699545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.703333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5nf\" (UniqueName: \"kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf\") pod \"barbican-db-sync-qvzhj\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.756142 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778310 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmxx6\" (UniqueName: \"kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6\") pod \"ceilometer-0\" (UID: 
\"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.778532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.779494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.779794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.783415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.783691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.789037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.799001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.804936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmxx6\" (UniqueName: \"kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6\") pod \"ceilometer-0\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.859507 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.861980 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fk65m"] Jan 20 23:28:20 crc kubenswrapper[5030]: W0120 23:28:20.876178 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd0de19f_f83e_411a_bd86_8c989dd2723f.slice/crio-cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d WatchSource:0}: Error finding container cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d: Status 404 returned error can't find the container with id cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.921248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" event={"ID":"dd0de19f-f83e-411a-bd86-8c989dd2723f","Type":"ContainerStarted","Data":"cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d"} Jan 20 23:28:20 crc kubenswrapper[5030]: I0120 23:28:20.952042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.197780 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.199676 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.206203 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.206410 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-6vjfc" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.207139 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.215755 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.229565 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.230965 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.237090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-7fxnx"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.240919 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.260179 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkv29\" (UniqueName: \"kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294515 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.294599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc 
kubenswrapper[5030]: I0120 23:28:21.300935 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: E0120 23:28:21.301476 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-bkv29 logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="da00bb6b-bc6b-453c-91b7-f1ede664c454" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.348029 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4x6fd"] Jan 20 23:28:21 crc kubenswrapper[5030]: W0120 23:28:21.357562 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b2c52ab_999e_420d_836c_dc6ea4055ff6.slice/crio-8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80 WatchSource:0}: Error finding container 8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80: Status 404 returned error can't find the container with id 8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80 Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.387651 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qvzhj"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkv29\" (UniqueName: \"kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396593 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396701 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396760 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl2wr\" (UniqueName: \"kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.396990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.397014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 
23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.397535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.403348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.404152 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.408188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.415591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.418026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.419996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkv29\" (UniqueName: \"kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.430574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.498896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 
23:28:21.498986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.499011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.499040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.499080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.499116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl2wr\" (UniqueName: \"kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.499159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.500161 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.500401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.501017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 
23:28:21.506567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.508648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.518576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.519783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.521981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl2wr\" (UniqueName: \"kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.558256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.578580 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.942054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" event={"ID":"7b2c52ab-999e-420d-836c-dc6ea4055ff6","Type":"ContainerStarted","Data":"8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.944544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerStarted","Data":"746dc54c0edd3981c6544a2c6c7541b75b6cec8d496d579ebbba9b28e3dd2027"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.950065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" event={"ID":"dd0de19f-f83e-411a-bd86-8c989dd2723f","Type":"ContainerStarted","Data":"511cd6ffef13ece96e755ef9d69349d5ab4a8423bbfa256860fb87cc018a49db"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.953834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" event={"ID":"56cbfa09-ec86-4e48-84e6-7452ae29b778","Type":"ContainerStarted","Data":"51aa64d4069629fa3a6cef0e85c67b56680900af8f99e9f962991a0d0dfaca45"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.953883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" event={"ID":"56cbfa09-ec86-4e48-84e6-7452ae29b778","Type":"ContainerStarted","Data":"0de70b3777356d8b347817bf1ac6ea5366c55483e1ee9c099316196bfec1b5ba"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.960556 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.962057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" event={"ID":"17be7dc5-b32e-47c3-a083-823f4300b700","Type":"ContainerStarted","Data":"d9107a0eac389bd83b0b5be0fa014626ef904c92ed31eaab47cade0dbda1a139"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.962113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" event={"ID":"17be7dc5-b32e-47c3-a083-823f4300b700","Type":"ContainerStarted","Data":"28a15683af1fd70db5f0d0017576e205db7681ff3493078bdd2472842712b1a7"} Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.977371 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" podStartSLOduration=1.977355483 podStartE2EDuration="1.977355483s" podCreationTimestamp="2026-01-20 23:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:21.971511922 +0000 UTC m=+3174.291772200" watchObservedRunningTime="2026-01-20 23:28:21.977355483 +0000 UTC m=+3174.297615771" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.979864 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:21 crc kubenswrapper[5030]: I0120 23:28:21.993808 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" podStartSLOduration=1.993780581 podStartE2EDuration="1.993780581s" podCreationTimestamp="2026-01-20 23:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:21.987283153 +0000 UTC m=+3174.307543441" watchObservedRunningTime="2026-01-20 23:28:21.993780581 +0000 UTC m=+3174.314040869" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.013717 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" podStartSLOduration=2.013697833 podStartE2EDuration="2.013697833s" podCreationTimestamp="2026-01-20 23:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:22.011427827 +0000 UTC m=+3174.331688115" watchObservedRunningTime="2026-01-20 23:28:22.013697833 +0000 UTC m=+3174.333958121" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.046722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:28:22 crc kubenswrapper[5030]: W0120 23:28:22.056981 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0102d8ca_9bb5_417f_8e32_8efab78e9f22.slice/crio-e7f5d5a19f7e1822725c383973ce208af0be43246c5ba0eeb232eeb1959f3297 WatchSource:0}: Error finding container e7f5d5a19f7e1822725c383973ce208af0be43246c5ba0eeb232eeb1959f3297: Status 404 returned error can't find the container with id e7f5d5a19f7e1822725c383973ce208af0be43246c5ba0eeb232eeb1959f3297 Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkv29\" (UniqueName: \"kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118908 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.118997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data\") pod \"da00bb6b-bc6b-453c-91b7-f1ede664c454\" (UID: \"da00bb6b-bc6b-453c-91b7-f1ede664c454\") " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.119633 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs" (OuterVolumeSpecName: "logs") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.119724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.130001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.131309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.138854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data" (OuterVolumeSpecName: "config-data") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.142124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29" (OuterVolumeSpecName: "kube-api-access-bkv29") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "kube-api-access-bkv29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.142783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts" (OuterVolumeSpecName: "scripts") pod "da00bb6b-bc6b-453c-91b7-f1ede664c454" (UID: "da00bb6b-bc6b-453c-91b7-f1ede664c454"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.221972 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222010 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222020 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222029 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222040 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkv29\" (UniqueName: \"kubernetes.io/projected/da00bb6b-bc6b-453c-91b7-f1ede664c454-kube-api-access-bkv29\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222051 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da00bb6b-bc6b-453c-91b7-f1ede664c454-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.222061 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da00bb6b-bc6b-453c-91b7-f1ede664c454-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.249066 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.324080 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.978327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerStarted","Data":"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564"} Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.978576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerStarted","Data":"e7f5d5a19f7e1822725c383973ce208af0be43246c5ba0eeb232eeb1959f3297"} Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.981412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" event={"ID":"7b2c52ab-999e-420d-836c-dc6ea4055ff6","Type":"ContainerStarted","Data":"4ebfc1d04aa89c829d06fdef5b71f0d7b4ef8876a9b3d81c860e269d91b40c9c"} Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.986163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerStarted","Data":"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50"} Jan 20 23:28:22 crc kubenswrapper[5030]: I0120 23:28:22.986206 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.006579 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" podStartSLOduration=3.006539617 podStartE2EDuration="3.006539617s" podCreationTimestamp="2026-01-20 23:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:22.997189221 +0000 UTC m=+3175.317449509" watchObservedRunningTime="2026-01-20 23:28:23.006539617 +0000 UTC m=+3175.326799905" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.053406 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.076100 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.101052 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.102439 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.105118 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.109816 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9ms6\" (UniqueName: \"kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.242511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.328989 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-f9brm"] Jan 20 23:28:23 crc kubenswrapper[5030]: 
I0120 23:28:23.329929 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.343060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.343147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-s754x" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.343435 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ms6\" (UniqueName: \"kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.346965 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.347891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.351405 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.354285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.355417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.359160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.360315 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-f9brm"] Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.386894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ms6\" (UniqueName: \"kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.410890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.425122 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.448315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcrfj\" (UniqueName: \"kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.448422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.448483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.555072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.555768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.555855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcrfj\" (UniqueName: \"kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.563511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.563568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.579878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcrfj\" (UniqueName: \"kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj\") pod \"neutron-db-sync-f9brm\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " 
pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.654099 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.872682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:28:23 crc kubenswrapper[5030]: W0120 23:28:23.893840 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ccb98e_eb94_4f38_8f47_ba9f45d0ce39.slice/crio-20728bcc4474c83b22c1dc9e7ad146fcc4e58fac17397ef097e97e6a929ab012 WatchSource:0}: Error finding container 20728bcc4474c83b22c1dc9e7ad146fcc4e58fac17397ef097e97e6a929ab012: Status 404 returned error can't find the container with id 20728bcc4474c83b22c1dc9e7ad146fcc4e58fac17397ef097e97e6a929ab012 Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.897196 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-f9brm"] Jan 20 23:28:23 crc kubenswrapper[5030]: W0120 23:28:23.912590 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c75021_4a7f_4851_aada_24da4ddb3387.slice/crio-98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119 WatchSource:0}: Error finding container 98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119: Status 404 returned error can't find the container with id 98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119 Jan 20 23:28:23 crc kubenswrapper[5030]: I0120 23:28:23.973753 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da00bb6b-bc6b-453c-91b7-f1ede664c454" path="/var/lib/kubelet/pods/da00bb6b-bc6b-453c-91b7-f1ede664c454/volumes" Jan 20 23:28:24 crc kubenswrapper[5030]: I0120 23:28:24.006186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerStarted","Data":"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160"} Jan 20 23:28:24 crc kubenswrapper[5030]: I0120 23:28:24.007906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" event={"ID":"94c75021-4a7f-4851-aada-24da4ddb3387","Type":"ContainerStarted","Data":"98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119"} Jan 20 23:28:24 crc kubenswrapper[5030]: I0120 23:28:24.016371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerStarted","Data":"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc"} Jan 20 23:28:24 crc kubenswrapper[5030]: I0120 23:28:24.020709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerStarted","Data":"20728bcc4474c83b22c1dc9e7ad146fcc4e58fac17397ef097e97e6a929ab012"} Jan 20 23:28:24 crc kubenswrapper[5030]: I0120 23:28:24.041409 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.041384187 podStartE2EDuration="3.041384187s" podCreationTimestamp="2026-01-20 23:28:21 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:24.027117602 +0000 UTC m=+3176.347377890" watchObservedRunningTime="2026-01-20 23:28:24.041384187 +0000 UTC m=+3176.361644485" Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.036395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" event={"ID":"94c75021-4a7f-4851-aada-24da4ddb3387","Type":"ContainerStarted","Data":"1c6850eecdc5fa4c5a36b12834246a3043eb86f02c48821721d9c04dc239804a"} Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.041261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerStarted","Data":"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291"} Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.047392 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd0de19f-f83e-411a-bd86-8c989dd2723f" containerID="511cd6ffef13ece96e755ef9d69349d5ab4a8423bbfa256860fb87cc018a49db" exitCode=0 Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.047465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" event={"ID":"dd0de19f-f83e-411a-bd86-8c989dd2723f","Type":"ContainerDied","Data":"511cd6ffef13ece96e755ef9d69349d5ab4a8423bbfa256860fb87cc018a49db"} Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.048725 5030 generic.go:334] "Generic (PLEG): container finished" podID="56cbfa09-ec86-4e48-84e6-7452ae29b778" containerID="51aa64d4069629fa3a6cef0e85c67b56680900af8f99e9f962991a0d0dfaca45" exitCode=0 Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.048766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" event={"ID":"56cbfa09-ec86-4e48-84e6-7452ae29b778","Type":"ContainerDied","Data":"51aa64d4069629fa3a6cef0e85c67b56680900af8f99e9f962991a0d0dfaca45"} Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.051736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerStarted","Data":"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea"} Jan 20 23:28:25 crc kubenswrapper[5030]: I0120 23:28:25.079249 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" podStartSLOduration=2.079226441 podStartE2EDuration="2.079226441s" podCreationTimestamp="2026-01-20 23:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:25.056914201 +0000 UTC m=+3177.377174499" watchObservedRunningTime="2026-01-20 23:28:25.079226441 +0000 UTC m=+3177.399486729" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.061283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerStarted","Data":"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2"} Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.065433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerStarted","Data":"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80"} Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.065573 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.066977 5030 generic.go:334] "Generic (PLEG): container finished" podID="17be7dc5-b32e-47c3-a083-823f4300b700" containerID="d9107a0eac389bd83b0b5be0fa014626ef904c92ed31eaab47cade0dbda1a139" exitCode=0 Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.067089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" event={"ID":"17be7dc5-b32e-47c3-a083-823f4300b700","Type":"ContainerDied","Data":"d9107a0eac389bd83b0b5be0fa014626ef904c92ed31eaab47cade0dbda1a139"} Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.112183 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.112159465 podStartE2EDuration="3.112159465s" podCreationTimestamp="2026-01-20 23:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:26.096283831 +0000 UTC m=+3178.416544129" watchObservedRunningTime="2026-01-20 23:28:26.112159465 +0000 UTC m=+3178.432419763" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.128003 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.562602052 podStartE2EDuration="6.127979039s" podCreationTimestamp="2026-01-20 23:28:20 +0000 UTC" firstStartedPulling="2026-01-20 23:28:21.539992816 +0000 UTC m=+3173.860253104" lastFinishedPulling="2026-01-20 23:28:25.105369803 +0000 UTC m=+3177.425630091" observedRunningTime="2026-01-20 23:28:26.122122866 +0000 UTC m=+3178.442383164" watchObservedRunningTime="2026-01-20 23:28:26.127979039 +0000 UTC m=+3178.448239327" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.704645 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.709727 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.812763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.812869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.812896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.812978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts\") pod \"56cbfa09-ec86-4e48-84e6-7452ae29b778\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data\") pod \"56cbfa09-ec86-4e48-84e6-7452ae29b778\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs\") pod \"56cbfa09-ec86-4e48-84e6-7452ae29b778\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813129 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813184 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltltn\" (UniqueName: \"kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn\") pod \"dd0de19f-f83e-411a-bd86-8c989dd2723f\" (UID: \"dd0de19f-f83e-411a-bd86-8c989dd2723f\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle\") pod \"56cbfa09-ec86-4e48-84e6-7452ae29b778\" (UID: 
\"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.813232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxqdm\" (UniqueName: \"kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm\") pod \"56cbfa09-ec86-4e48-84e6-7452ae29b778\" (UID: \"56cbfa09-ec86-4e48-84e6-7452ae29b778\") " Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.815317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs" (OuterVolumeSpecName: "logs") pod "56cbfa09-ec86-4e48-84e6-7452ae29b778" (UID: "56cbfa09-ec86-4e48-84e6-7452ae29b778"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.819567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.819934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn" (OuterVolumeSpecName: "kube-api-access-ltltn") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "kube-api-access-ltltn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.820689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts" (OuterVolumeSpecName: "scripts") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.821127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts" (OuterVolumeSpecName: "scripts") pod "56cbfa09-ec86-4e48-84e6-7452ae29b778" (UID: "56cbfa09-ec86-4e48-84e6-7452ae29b778"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.821430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm" (OuterVolumeSpecName: "kube-api-access-vxqdm") pod "56cbfa09-ec86-4e48-84e6-7452ae29b778" (UID: "56cbfa09-ec86-4e48-84e6-7452ae29b778"). InnerVolumeSpecName "kube-api-access-vxqdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.825566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.841158 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56cbfa09-ec86-4e48-84e6-7452ae29b778" (UID: "56cbfa09-ec86-4e48-84e6-7452ae29b778"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.846519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data" (OuterVolumeSpecName: "config-data") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.854642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd0de19f-f83e-411a-bd86-8c989dd2723f" (UID: "dd0de19f-f83e-411a-bd86-8c989dd2723f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.854697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data" (OuterVolumeSpecName: "config-data") pod "56cbfa09-ec86-4e48-84e6-7452ae29b778" (UID: "56cbfa09-ec86-4e48-84e6-7452ae29b778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915274 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915523 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915638 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915705 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56cbfa09-ec86-4e48-84e6-7452ae29b778-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915758 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915823 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltltn\" (UniqueName: \"kubernetes.io/projected/dd0de19f-f83e-411a-bd86-8c989dd2723f-kube-api-access-ltltn\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915879 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56cbfa09-ec86-4e48-84e6-7452ae29b778-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxqdm\" (UniqueName: \"kubernetes.io/projected/56cbfa09-ec86-4e48-84e6-7452ae29b778-kube-api-access-vxqdm\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.915988 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.916039 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:26 crc kubenswrapper[5030]: I0120 23:28:26.916095 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0de19f-f83e-411a-bd86-8c989dd2723f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.077795 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.077779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-fk65m" event={"ID":"dd0de19f-f83e-411a-bd86-8c989dd2723f","Type":"ContainerDied","Data":"cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d"} Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.077972 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf0203ea016cd7cc18bedd007b7662acc4f5af19ca930a72ab51ac32169d996d" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.079258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" event={"ID":"56cbfa09-ec86-4e48-84e6-7452ae29b778","Type":"ContainerDied","Data":"0de70b3777356d8b347817bf1ac6ea5366c55483e1ee9c099316196bfec1b5ba"} Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.079292 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de70b3777356d8b347817bf1ac6ea5366c55483e1ee9c099316196bfec1b5ba" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.079354 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-7fxnx" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.082536 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b2c52ab-999e-420d-836c-dc6ea4055ff6" containerID="4ebfc1d04aa89c829d06fdef5b71f0d7b4ef8876a9b3d81c860e269d91b40c9c" exitCode=0 Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.082668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" event={"ID":"7b2c52ab-999e-420d-836c-dc6ea4055ff6","Type":"ContainerDied","Data":"4ebfc1d04aa89c829d06fdef5b71f0d7b4ef8876a9b3d81c860e269d91b40c9c"} Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.184981 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fk65m"] Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.194327 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-fk65m"] Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.266871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6ldn5"] Jan 20 23:28:27 crc kubenswrapper[5030]: E0120 23:28:27.267205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0de19f-f83e-411a-bd86-8c989dd2723f" containerName="keystone-bootstrap" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.267221 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0de19f-f83e-411a-bd86-8c989dd2723f" containerName="keystone-bootstrap" Jan 20 23:28:27 crc kubenswrapper[5030]: E0120 23:28:27.267242 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cbfa09-ec86-4e48-84e6-7452ae29b778" containerName="placement-db-sync" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.267248 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cbfa09-ec86-4e48-84e6-7452ae29b778" containerName="placement-db-sync" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.267390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0de19f-f83e-411a-bd86-8c989dd2723f" containerName="keystone-bootstrap" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.267407 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="56cbfa09-ec86-4e48-84e6-7452ae29b778" containerName="placement-db-sync" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.267935 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.275308 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6ldn5"] Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.279064 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.279151 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.279164 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.279193 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-c75q4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.279235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.299035 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.310513 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.313150 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-wtz2q" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.313392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.313511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.325932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.423970 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7whh\" (UniqueName: \"kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-467rv\" (UniqueName: 
\"kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.424707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.525508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle\") pod \"17be7dc5-b32e-47c3-a083-823f4300b700\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.525658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl5nf\" (UniqueName: \"kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf\") pod \"17be7dc5-b32e-47c3-a083-823f4300b700\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.525777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data\") pod \"17be7dc5-b32e-47c3-a083-823f4300b700\" (UID: \"17be7dc5-b32e-47c3-a083-823f4300b700\") " Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7whh\" (UniqueName: \"kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-467rv\" (UniqueName: \"kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526202 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.526699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.530467 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data" (OuterVolumeSpecName: 
"db-sync-config-data") pod "17be7dc5-b32e-47c3-a083-823f4300b700" (UID: "17be7dc5-b32e-47c3-a083-823f4300b700"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.534669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.535635 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.535656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.536802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.537122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.541519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.541726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.542481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.542974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf" (OuterVolumeSpecName: "kube-api-access-nl5nf") pod 
"17be7dc5-b32e-47c3-a083-823f4300b700" (UID: "17be7dc5-b32e-47c3-a083-823f4300b700"). InnerVolumeSpecName "kube-api-access-nl5nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.543218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7whh\" (UniqueName: \"kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh\") pod \"placement-7d684cd5bb-rtqs4\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.548209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-467rv\" (UniqueName: \"kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv\") pod \"keystone-bootstrap-6ldn5\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.559066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17be7dc5-b32e-47c3-a083-823f4300b700" (UID: "17be7dc5-b32e-47c3-a083-823f4300b700"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.595585 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.627474 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.627840 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17be7dc5-b32e-47c3-a083-823f4300b700-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.627854 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl5nf\" (UniqueName: \"kubernetes.io/projected/17be7dc5-b32e-47c3-a083-823f4300b700-kube-api-access-nl5nf\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.645049 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.975589 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:28:27 crc kubenswrapper[5030]: E0120 23:28:27.976045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:28:27 crc kubenswrapper[5030]: I0120 23:28:27.976369 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0de19f-f83e-411a-bd86-8c989dd2723f" path="/var/lib/kubelet/pods/dd0de19f-f83e-411a-bd86-8c989dd2723f/volumes" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.097032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6ldn5"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.098525 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.098485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-qvzhj" event={"ID":"17be7dc5-b32e-47c3-a083-823f4300b700","Type":"ContainerDied","Data":"28a15683af1fd70db5f0d0017576e205db7681ff3493078bdd2472842712b1a7"} Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.099005 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a15683af1fd70db5f0d0017576e205db7681ff3493078bdd2472842712b1a7" Jan 20 23:28:28 crc kubenswrapper[5030]: W0120 23:28:28.105264 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd28027_576f_4b84_aed2_3ddcedb5bab3.slice/crio-18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da WatchSource:0}: Error finding container 18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da: Status 404 returned error can't find the container with id 18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.111759 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.147395 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.401702 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:28:28 crc kubenswrapper[5030]: E0120 23:28:28.402392 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17be7dc5-b32e-47c3-a083-823f4300b700" containerName="barbican-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.402408 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="17be7dc5-b32e-47c3-a083-823f4300b700" containerName="barbican-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.402611 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="17be7dc5-b32e-47c3-a083-823f4300b700" 
containerName="barbican-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.403811 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.410689 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.411022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-lkwbp" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.411241 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.411787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.412141 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.417815 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.435595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.457086 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.496023 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.551140 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:28:28 crc kubenswrapper[5030]: E0120 23:28:28.551516 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2c52ab-999e-420d-836c-dc6ea4055ff6" containerName="cinder-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.551529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2c52ab-999e-420d-836c-dc6ea4055ff6" containerName="cinder-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.551708 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2c52ab-999e-420d-836c-dc6ea4055ff6" containerName="cinder-db-sync" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.552688 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.564921 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.565890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.565937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.565958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.565992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thmbg\" (UniqueName: \"kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm57c\" (UniqueName: \"kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.566137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.600062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667210 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7pv\" (UniqueName: \"kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667381 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data\") pod \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\" (UID: \"7b2c52ab-999e-420d-836c-dc6ea4055ff6\") " Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thmbg\" (UniqueName: \"kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.667996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm57c\" (UniqueName: \"kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqv9q\" (UniqueName: \"kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668161 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668206 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b2c52ab-999e-420d-836c-dc6ea4055ff6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.668390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.669103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.672793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.673409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.677259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts" (OuterVolumeSpecName: "scripts") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.680379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.683020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv" (OuterVolumeSpecName: "kube-api-access-mp7pv") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "kube-api-access-mp7pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.684242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.686256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.686408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm57c\" (UniqueName: \"kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.689375 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data\") pod \"barbican-keystone-listener-6bfc98675d-dsbwb\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.689563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.718155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thmbg\" (UniqueName: \"kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg\") pod \"barbican-worker-697dff9875-7clld\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.735510 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.755541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.764810 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769728 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqv9q\" (UniqueName: \"kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769810 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769823 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769833 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7pv\" (UniqueName: \"kubernetes.io/projected/7b2c52ab-999e-420d-836c-dc6ea4055ff6-kube-api-access-mp7pv\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.769843 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.775064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 
20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.781249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.787352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.823611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.837225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqv9q\" (UniqueName: \"kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q\") pod \"barbican-api-9cf5cf68d-sff9w\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.858767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data" (OuterVolumeSpecName: "config-data") pod "7b2c52ab-999e-420d-836c-dc6ea4055ff6" (UID: "7b2c52ab-999e-420d-836c-dc6ea4055ff6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.877747 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2c52ab-999e-420d-836c-dc6ea4055ff6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:28 crc kubenswrapper[5030]: I0120 23:28:28.927027 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.152645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" event={"ID":"ddd28027-576f-4b84-aed2-3ddcedb5bab3","Type":"ContainerStarted","Data":"ac3990bd9655b2eaf6d0a47482dc86ec1760584475b052542bdff00f237ec4ac"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.152886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" event={"ID":"ddd28027-576f-4b84-aed2-3ddcedb5bab3","Type":"ContainerStarted","Data":"18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.159297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerStarted","Data":"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.159328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerStarted","Data":"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.159338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerStarted","Data":"394e938f5e1ab967db58f7f06b9d758bfb040635991db16883733f455c6b5a6e"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.168168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.168208 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.199759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" event={"ID":"7b2c52ab-999e-420d-836c-dc6ea4055ff6","Type":"ContainerDied","Data":"8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80"} Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.199799 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8369badc4c4d1072a48ddfe5b9580da2918bdabe43ea365c58099c942ebe3a80" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.199848 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4x6fd" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.203290 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" podStartSLOduration=2.203273863 podStartE2EDuration="2.203273863s" podCreationTimestamp="2026-01-20 23:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:29.176454394 +0000 UTC m=+3181.496714682" watchObservedRunningTime="2026-01-20 23:28:29.203273863 +0000 UTC m=+3181.523534151" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.360706 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" podStartSLOduration=2.360684433 podStartE2EDuration="2.360684433s" podCreationTimestamp="2026-01-20 23:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:29.199951363 +0000 UTC m=+3181.520211651" watchObservedRunningTime="2026-01-20 23:28:29.360684433 +0000 UTC m=+3181.680944721" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.361312 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.362566 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.366828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.367091 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-bp47s" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.367219 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.367302 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.406467 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.479771 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.489685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.492886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.493006 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.493042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7zv\" (UniqueName: \"kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.493205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.493273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.547301 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.551400 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.556596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.558636 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.566039 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7zv\" (UniqueName: \"kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.602915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.604087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.609492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.619270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.619332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.619594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.623087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7zv\" (UniqueName: \"kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv\") pod \"cinder-scheduler-0\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.699925 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbwq\" (UniqueName: \"kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.706913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.707609 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.808377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbwq\" (UniqueName: \"kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810257 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.810372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.811020 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.811580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.815057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.816984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.826194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.831163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbwq\" (UniqueName: \"kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:29 crc kubenswrapper[5030]: I0120 23:28:29.831232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts\") pod \"cinder-api-0\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.043551 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.197890 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.226888 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerStarted","Data":"ec590bd73705be339baa9b7c1b90768301a4f289003bd0b2191fc3d357538be2"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.227034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerStarted","Data":"39015b910f2f39e56b985ca85a7e5d5c7980fd4ff906f21ce55154bbcc3bfbc9"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.227094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerStarted","Data":"f265d8dcfee99cc0764d4a6b166d08346ad9871953325de827ceb130b513203e"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.248214 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" podStartSLOduration=2.248197357 podStartE2EDuration="2.248197357s" podCreationTimestamp="2026-01-20 23:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:30.241510906 +0000 UTC m=+3182.561771194" watchObservedRunningTime="2026-01-20 23:28:30.248197357 +0000 UTC m=+3182.568457645" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.251193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerStarted","Data":"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.251326 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerStarted","Data":"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.251422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerStarted","Data":"ed07673a5867863998fecb8762dabc9a52894bd61d5f540489349b856506ba08"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.251519 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.251603 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.290874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerStarted","Data":"51cd04a97de01286d78837802f843f6e120edea93900487da629dcee88840ff1"} Jan 20 
23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.290929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerStarted","Data":"dcdfc1820f080b0e8b90f9417e65df736b83363b67ea72b8806323bba24c5a86"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.290942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerStarted","Data":"60c6adbd497cc5e458d71143c0ae628b469af0691d95ef0fcf275c509bb46bd3"} Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.311092 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" podStartSLOduration=2.311076229 podStartE2EDuration="2.311076229s" podCreationTimestamp="2026-01-20 23:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:30.310288161 +0000 UTC m=+3182.630548439" watchObservedRunningTime="2026-01-20 23:28:30.311076229 +0000 UTC m=+3182.631336517" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.313565 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" podStartSLOduration=2.31355785 podStartE2EDuration="2.31355785s" podCreationTimestamp="2026-01-20 23:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:30.287107669 +0000 UTC m=+3182.607367957" watchObservedRunningTime="2026-01-20 23:28:30.31355785 +0000 UTC m=+3182.633818138" Jan 20 23:28:30 crc kubenswrapper[5030]: I0120 23:28:30.520583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:28:30 crc kubenswrapper[5030]: W0120 23:28:30.529988 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36edc87c_d811_414a_8630_86c81093bdbf.slice/crio-6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf WatchSource:0}: Error finding container 6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf: Status 404 returned error can't find the container with id 6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.307998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerStarted","Data":"b8f27ffed3eee51863138cee9e48d260bc090210cbb06cbfeed0342e5a7050be"} Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.308258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerStarted","Data":"3abf721ba5637f0d56a9b0207f2737ad4bcd5e9e876b5279f33392be61081b9a"} Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.310015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerStarted","Data":"afe9fcbff8775eba20651455c79c527a300dd3b3f856f1adcfb7be13f0d24a0b"} Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.310059 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerStarted","Data":"6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf"} Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.579730 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.579773 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.616586 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:31 crc kubenswrapper[5030]: I0120 23:28:31.660189 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.318774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerStarted","Data":"51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b"} Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.319015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.321078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerStarted","Data":"585ae5ff44f886ced04efde6a25158e4b15847e87657330c8e6e321d18ce82b3"} Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.322725 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddd28027-576f-4b84-aed2-3ddcedb5bab3" containerID="ac3990bd9655b2eaf6d0a47482dc86ec1760584475b052542bdff00f237ec4ac" exitCode=0 Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.322805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" event={"ID":"ddd28027-576f-4b84-aed2-3ddcedb5bab3","Type":"ContainerDied","Data":"ac3990bd9655b2eaf6d0a47482dc86ec1760584475b052542bdff00f237ec4ac"} Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.322996 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.323027 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.351291 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.3512729869999998 podStartE2EDuration="3.351272987s" podCreationTimestamp="2026-01-20 23:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:32.348852929 +0000 UTC m=+3184.669113217" watchObservedRunningTime="2026-01-20 23:28:32.351272987 +0000 UTC m=+3184.671533275" Jan 20 23:28:32 crc kubenswrapper[5030]: I0120 23:28:32.374568 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.37455258 
podStartE2EDuration="3.37455258s" podCreationTimestamp="2026-01-20 23:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:32.372539272 +0000 UTC m=+3184.692799570" watchObservedRunningTime="2026-01-20 23:28:32.37455258 +0000 UTC m=+3184.694812868" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.426144 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.426186 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.468567 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.481216 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.740286 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.902724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-467rv\" (UniqueName: \"kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv\") pod \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\" (UID: \"ddd28027-576f-4b84-aed2-3ddcedb5bab3\") " Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.908146 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts" (OuterVolumeSpecName: "scripts") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.920779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.920858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.929112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.936324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv" (OuterVolumeSpecName: "kube-api-access-467rv") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "kube-api-access-467rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:33 crc kubenswrapper[5030]: I0120 23:28:33.949206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data" (OuterVolumeSpecName: "config-data") pod "ddd28027-576f-4b84-aed2-3ddcedb5bab3" (UID: "ddd28027-576f-4b84-aed2-3ddcedb5bab3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.004959 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.004989 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.004999 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.005008 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.005019 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ddd28027-576f-4b84-aed2-3ddcedb5bab3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.005030 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-467rv\" (UniqueName: \"kubernetes.io/projected/ddd28027-576f-4b84-aed2-3ddcedb5bab3-kube-api-access-467rv\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.327424 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.327800 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.351912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" event={"ID":"ddd28027-576f-4b84-aed2-3ddcedb5bab3","Type":"ContainerDied","Data":"18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da"} Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.351972 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d8f44d0fa7b69c37f9acffb4bdbbc4ec1dcfe7473496bb6166ab67dac756da" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.352047 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6ldn5" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.352798 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.352855 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.579144 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:28:34 crc kubenswrapper[5030]: E0120 23:28:34.579584 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd28027-576f-4b84-aed2-3ddcedb5bab3" containerName="keystone-bootstrap" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.579603 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd28027-576f-4b84-aed2-3ddcedb5bab3" containerName="keystone-bootstrap" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.579818 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd28027-576f-4b84-aed2-3ddcedb5bab3" containerName="keystone-bootstrap" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.580545 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.584211 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.586574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.586893 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.586948 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-c75q4" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.599322 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.708673 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.718858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.719145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.719173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.719235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcj6v\" (UniqueName: \"kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.719420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.719549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.820779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.821542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.822186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.822600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.822648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.822670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qcj6v\" (UniqueName: \"kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.824066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.825834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.826812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.828015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.834346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.842173 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcj6v\" (UniqueName: \"kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v\") pod \"keystone-65885c5c7c-j7j2s\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:34 crc kubenswrapper[5030]: I0120 23:28:34.898939 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:35 crc kubenswrapper[5030]: I0120 23:28:35.367686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:28:36 crc kubenswrapper[5030]: I0120 23:28:36.281266 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:36 crc kubenswrapper[5030]: I0120 23:28:36.286398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:28:36 crc kubenswrapper[5030]: I0120 23:28:36.373228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" event={"ID":"68408e1c-9c9a-465d-9abd-ea36b270db22","Type":"ContainerStarted","Data":"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e"} Jan 20 23:28:36 crc kubenswrapper[5030]: I0120 23:28:36.373516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" event={"ID":"68408e1c-9c9a-465d-9abd-ea36b270db22","Type":"ContainerStarted","Data":"4e61e017fdd7a44394deca0298a839476bfd215ec5ecaf9e15a97b23f9e1e533"} Jan 20 23:28:36 crc kubenswrapper[5030]: I0120 23:28:36.395640 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" podStartSLOduration=2.395605189 podStartE2EDuration="2.395605189s" podCreationTimestamp="2026-01-20 23:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:36.393312913 +0000 UTC m=+3188.713573211" watchObservedRunningTime="2026-01-20 23:28:36.395605189 +0000 UTC m=+3188.715865477" Jan 20 23:28:37 crc kubenswrapper[5030]: I0120 23:28:37.383742 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:28:38 crc kubenswrapper[5030]: I0120 23:28:38.413189 5030 generic.go:334] "Generic (PLEG): container finished" podID="94c75021-4a7f-4851-aada-24da4ddb3387" containerID="1c6850eecdc5fa4c5a36b12834246a3043eb86f02c48821721d9c04dc239804a" exitCode=0 Jan 20 23:28:38 crc kubenswrapper[5030]: I0120 23:28:38.413318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" event={"ID":"94c75021-4a7f-4851-aada-24da4ddb3387","Type":"ContainerDied","Data":"1c6850eecdc5fa4c5a36b12834246a3043eb86f02c48821721d9c04dc239804a"} Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.814886 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.888110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.924703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config\") pod \"94c75021-4a7f-4851-aada-24da4ddb3387\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.924745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle\") pod \"94c75021-4a7f-4851-aada-24da4ddb3387\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.924927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcrfj\" (UniqueName: \"kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj\") pod \"94c75021-4a7f-4851-aada-24da4ddb3387\" (UID: \"94c75021-4a7f-4851-aada-24da4ddb3387\") " Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.930157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj" (OuterVolumeSpecName: "kube-api-access-zcrfj") pod "94c75021-4a7f-4851-aada-24da4ddb3387" (UID: "94c75021-4a7f-4851-aada-24da4ddb3387"). InnerVolumeSpecName "kube-api-access-zcrfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.969473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94c75021-4a7f-4851-aada-24da4ddb3387" (UID: "94c75021-4a7f-4851-aada-24da4ddb3387"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:39 crc kubenswrapper[5030]: I0120 23:28:39.971091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config" (OuterVolumeSpecName: "config") pod "94c75021-4a7f-4851-aada-24da4ddb3387" (UID: "94c75021-4a7f-4851-aada-24da4ddb3387"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.027392 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcrfj\" (UniqueName: \"kubernetes.io/projected/94c75021-4a7f-4851-aada-24da4ddb3387-kube-api-access-zcrfj\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.027428 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.027443 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c75021-4a7f-4851-aada-24da4ddb3387-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.271143 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.335024 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.444724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" event={"ID":"94c75021-4a7f-4851-aada-24da4ddb3387","Type":"ContainerDied","Data":"98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119"} Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.444776 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98e99d783c8e85c6a5cb75837f8ddbd2d005b3019e3d5cf6e7db2ecea4f7a119" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.444744 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-f9brm" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.652206 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:28:40 crc kubenswrapper[5030]: E0120 23:28:40.652532 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c75021-4a7f-4851-aada-24da4ddb3387" containerName="neutron-db-sync" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.652548 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c75021-4a7f-4851-aada-24da4ddb3387" containerName="neutron-db-sync" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.654816 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c75021-4a7f-4851-aada-24da4ddb3387" containerName="neutron-db-sync" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.655769 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.658038 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-s754x" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.659262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.664014 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.667383 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.738799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.739072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.739101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rmsj\" (UniqueName: \"kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.739138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.840510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.840554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.840582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rmsj\" (UniqueName: \"kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " 
pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.840614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.844530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.844811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.846152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.863291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rmsj\" (UniqueName: \"kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj\") pod \"neutron-78c57799cd-6f6m4\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.962276 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:28:40 crc kubenswrapper[5030]: E0120 23:28:40.962547 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:28:40 crc kubenswrapper[5030]: I0120 23:28:40.976961 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:41 crc kubenswrapper[5030]: I0120 23:28:41.428060 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:28:41 crc kubenswrapper[5030]: W0120 23:28:41.436460 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5910c0d_bfc0_4b55_8b1b_ded9285b8b80.slice/crio-529b7c5956ff76bd7df792dcf6277eaa728c5dfc90f8ce1528d08fcae5c529c1 WatchSource:0}: Error finding container 529b7c5956ff76bd7df792dcf6277eaa728c5dfc90f8ce1528d08fcae5c529c1: Status 404 returned error can't find the container with id 529b7c5956ff76bd7df792dcf6277eaa728c5dfc90f8ce1528d08fcae5c529c1 Jan 20 23:28:41 crc kubenswrapper[5030]: I0120 23:28:41.460022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerStarted","Data":"529b7c5956ff76bd7df792dcf6277eaa728c5dfc90f8ce1528d08fcae5c529c1"} Jan 20 23:28:41 crc kubenswrapper[5030]: I0120 23:28:41.748065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:28:42 crc kubenswrapper[5030]: I0120 23:28:42.471936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerStarted","Data":"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4"} Jan 20 23:28:42 crc kubenswrapper[5030]: I0120 23:28:42.472287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:28:42 crc kubenswrapper[5030]: I0120 23:28:42.472310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerStarted","Data":"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f"} Jan 20 23:28:42 crc kubenswrapper[5030]: I0120 23:28:42.496271 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" podStartSLOduration=2.496255389 podStartE2EDuration="2.496255389s" podCreationTimestamp="2026-01-20 23:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:28:42.491813211 +0000 UTC m=+3194.812073579" watchObservedRunningTime="2026-01-20 23:28:42.496255389 +0000 UTC m=+3194.816515677" Jan 20 23:28:50 crc kubenswrapper[5030]: I0120 23:28:50.960348 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:28:55 crc kubenswrapper[5030]: I0120 23:28:55.924205 5030 scope.go:117] "RemoveContainer" containerID="201653aad7d55159137ac56d6010f5c74db5484c6c17c8f3dc1ea1fcfaa1b109" Jan 20 23:28:55 crc kubenswrapper[5030]: I0120 23:28:55.962228 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:28:55 crc kubenswrapper[5030]: E0120 23:28:55.962810 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:28:55 crc kubenswrapper[5030]: I0120 23:28:55.978105 5030 scope.go:117] "RemoveContainer" containerID="5a4e6c117dd4789ab400c322af2053b5e1adf4bfb2237f771b9a8335839061f4" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.013467 5030 scope.go:117] "RemoveContainer" containerID="2e3c443886e08309e23e74c3eb26e87aeed4b515043573a5b997072fad41ac81" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.043521 5030 scope.go:117] "RemoveContainer" containerID="cbbd55ec11f922f70da8495be9037539e1d761154559e6b2075472c7deb21023" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.067441 5030 scope.go:117] "RemoveContainer" containerID="8d73936322fff9c039c7ecccb4057f7aa50329948c3517040a4a262e2dd60981" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.098839 5030 scope.go:117] "RemoveContainer" containerID="50246376aa5e4e5507db17c42856a6a995ac78ab49646b34f1fdd90c7caa4584" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.149779 5030 scope.go:117] "RemoveContainer" containerID="f1698982fca2a79b6548ac198d70a617d97cb084ded4cd799e4664d8686aea9d" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.178538 5030 scope.go:117] "RemoveContainer" containerID="f2f845517a574b3a451d614fd6d8f871fedece18d4e8adc83b716896bf2af1fc" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.242752 5030 scope.go:117] "RemoveContainer" containerID="98640ff698c625ccf0185788c13cd5b2bbbf3afee5b419bbd329aebbce1e377c" Jan 20 23:28:56 crc kubenswrapper[5030]: I0120 23:28:56.277894 5030 scope.go:117] "RemoveContainer" containerID="3a8097c840afb5350c719593f5cd228865ad457dac867bc99f215f26394d9899" Jan 20 23:28:58 crc kubenswrapper[5030]: I0120 23:28:58.673327 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:28:58 crc kubenswrapper[5030]: I0120 23:28:58.674649 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.042795 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.045397 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.056928 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.161043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.161091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jb8v\" (UniqueName: \"kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.161191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.262862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.262930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jb8v\" (UniqueName: \"kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.263055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.263365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.264665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.282589 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6jb8v\" (UniqueName: \"kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v\") pod \"certified-operators-h8bbx\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.375349 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:02 crc kubenswrapper[5030]: W0120 23:29:02.914565 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ece861_114a_4566_ab6a_a0488943ac92.slice/crio-0a7deb235f037ab55759560ee0e1e086d71b931ac5b8938b3c6274ffdb92c438 WatchSource:0}: Error finding container 0a7deb235f037ab55759560ee0e1e086d71b931ac5b8938b3c6274ffdb92c438: Status 404 returned error can't find the container with id 0a7deb235f037ab55759560ee0e1e086d71b931ac5b8938b3c6274ffdb92c438 Jan 20 23:29:02 crc kubenswrapper[5030]: I0120 23:29:02.917722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:03 crc kubenswrapper[5030]: I0120 23:29:03.720120 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1ece861-114a-4566-ab6a-a0488943ac92" containerID="d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728" exitCode=0 Jan 20 23:29:03 crc kubenswrapper[5030]: I0120 23:29:03.720233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerDied","Data":"d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728"} Jan 20 23:29:03 crc kubenswrapper[5030]: I0120 23:29:03.720500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerStarted","Data":"0a7deb235f037ab55759560ee0e1e086d71b931ac5b8938b3c6274ffdb92c438"} Jan 20 23:29:04 crc kubenswrapper[5030]: I0120 23:29:04.736991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerStarted","Data":"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0"} Jan 20 23:29:05 crc kubenswrapper[5030]: I0120 23:29:05.773144 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1ece861-114a-4566-ab6a-a0488943ac92" containerID="d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0" exitCode=0 Jan 20 23:29:05 crc kubenswrapper[5030]: I0120 23:29:05.773359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerDied","Data":"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0"} Jan 20 23:29:06 crc kubenswrapper[5030]: I0120 23:29:06.294893 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:29:06 crc kubenswrapper[5030]: I0120 23:29:06.786076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerStarted","Data":"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2"} Jan 20 23:29:06 crc 
kubenswrapper[5030]: I0120 23:29:06.811567 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8bbx" podStartSLOduration=2.1832326 podStartE2EDuration="4.811553104s" podCreationTimestamp="2026-01-20 23:29:02 +0000 UTC" firstStartedPulling="2026-01-20 23:29:03.723589214 +0000 UTC m=+3216.043849532" lastFinishedPulling="2026-01-20 23:29:06.351909728 +0000 UTC m=+3218.672170036" observedRunningTime="2026-01-20 23:29:06.807086836 +0000 UTC m=+3219.127347124" watchObservedRunningTime="2026-01-20 23:29:06.811553104 +0000 UTC m=+3219.131813382" Jan 20 23:29:08 crc kubenswrapper[5030]: I0120 23:29:08.963006 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:29:08 crc kubenswrapper[5030]: E0120 23:29:08.963798 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.495189 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.497670 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.499584 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-4lh4d" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.501032 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.501307 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.515759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.605511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.605589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.605752 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588ff\" (UniqueName: \"kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.605870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.707347 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.707409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.707454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588ff\" (UniqueName: \"kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.707498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.708902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.718942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.729446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.732473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588ff\" (UniqueName: \"kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff\") pod \"openstackclient\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:09 crc kubenswrapper[5030]: I0120 23:29:09.830755 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:29:10 crc kubenswrapper[5030]: I0120 23:29:10.297535 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:29:10 crc kubenswrapper[5030]: W0120 23:29:10.303781 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea67ae4_7b1c_4bc1_afe4_eacfc66d0081.slice/crio-9032d48a07c90b7a5d2e0d3805cc787da850de386bbfae8911a63712b51e50ed WatchSource:0}: Error finding container 9032d48a07c90b7a5d2e0d3805cc787da850de386bbfae8911a63712b51e50ed: Status 404 returned error can't find the container with id 9032d48a07c90b7a5d2e0d3805cc787da850de386bbfae8911a63712b51e50ed Jan 20 23:29:10 crc kubenswrapper[5030]: I0120 23:29:10.834266 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081","Type":"ContainerStarted","Data":"7cb774b7ad67df28cae541d2b5fd9915c0040832858e0d9cbf4f702324dd16b6"} Jan 20 23:29:10 crc kubenswrapper[5030]: I0120 23:29:10.834764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081","Type":"ContainerStarted","Data":"9032d48a07c90b7a5d2e0d3805cc787da850de386bbfae8911a63712b51e50ed"} Jan 20 23:29:10 crc kubenswrapper[5030]: I0120 23:29:10.858802 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.858781387 podStartE2EDuration="1.858781387s" podCreationTimestamp="2026-01-20 23:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:10.857544947 +0000 UTC m=+3223.177805245" watchObservedRunningTime="2026-01-20 23:29:10.858781387 +0000 UTC m=+3223.179041685" Jan 20 23:29:10 crc kubenswrapper[5030]: I0120 23:29:10.988828 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:29:12 crc kubenswrapper[5030]: I0120 23:29:12.375513 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:12 crc kubenswrapper[5030]: I0120 23:29:12.378502 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:12 crc kubenswrapper[5030]: I0120 23:29:12.455092 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:12 crc kubenswrapper[5030]: I0120 23:29:12.901451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:12 crc kubenswrapper[5030]: I0120 23:29:12.947960 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.113933 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.115683 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.120230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.126738 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188752 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.188968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293618 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.293921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.294443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.300094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.304139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.308537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.314401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.334341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn\") pod \"swift-proxy-7669cfdd4b-6jmf7\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.473728 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.534093 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.534376 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-central-agent" containerID="cri-o://cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50" gracePeriod=30 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.534438 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="sg-core" containerID="cri-o://c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291" gracePeriod=30 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.534443 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="proxy-httpd" containerID="cri-o://3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80" gracePeriod=30 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.534535 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-notification-agent" containerID="cri-o://12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc" gracePeriod=30 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.872928 5030 generic.go:334] "Generic (PLEG): container finished" podID="d72149d3-95a7-48e4-8016-a8029e658356" containerID="3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80" exitCode=0 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.873253 5030 generic.go:334] "Generic (PLEG): container finished" podID="d72149d3-95a7-48e4-8016-a8029e658356" containerID="c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291" exitCode=2 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.873263 5030 generic.go:334] "Generic (PLEG): container finished" podID="d72149d3-95a7-48e4-8016-a8029e658356" containerID="cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50" exitCode=0 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.873018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerDied","Data":"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80"} Jan 20 23:29:14 crc kubenswrapper[5030]: 
I0120 23:29:14.873335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerDied","Data":"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291"} Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.873362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerDied","Data":"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50"} Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.873447 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8bbx" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="registry-server" containerID="cri-o://59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2" gracePeriod=2 Jan 20 23:29:14 crc kubenswrapper[5030]: I0120 23:29:14.929713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.249346 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.307591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities\") pod \"c1ece861-114a-4566-ab6a-a0488943ac92\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.307717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content\") pod \"c1ece861-114a-4566-ab6a-a0488943ac92\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.307855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jb8v\" (UniqueName: \"kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v\") pod \"c1ece861-114a-4566-ab6a-a0488943ac92\" (UID: \"c1ece861-114a-4566-ab6a-a0488943ac92\") " Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.309065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities" (OuterVolumeSpecName: "utilities") pod "c1ece861-114a-4566-ab6a-a0488943ac92" (UID: "c1ece861-114a-4566-ab6a-a0488943ac92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.318450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v" (OuterVolumeSpecName: "kube-api-access-6jb8v") pod "c1ece861-114a-4566-ab6a-a0488943ac92" (UID: "c1ece861-114a-4566-ab6a-a0488943ac92"). InnerVolumeSpecName "kube-api-access-6jb8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.409580 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jb8v\" (UniqueName: \"kubernetes.io/projected/c1ece861-114a-4566-ab6a-a0488943ac92-kube-api-access-6jb8v\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.409606 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.588914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1ece861-114a-4566-ab6a-a0488943ac92" (UID: "c1ece861-114a-4566-ab6a-a0488943ac92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.614173 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1ece861-114a-4566-ab6a-a0488943ac92-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.896206 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1ece861-114a-4566-ab6a-a0488943ac92" containerID="59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2" exitCode=0 Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.896261 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8bbx" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.896279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerDied","Data":"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2"} Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.896720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8bbx" event={"ID":"c1ece861-114a-4566-ab6a-a0488943ac92","Type":"ContainerDied","Data":"0a7deb235f037ab55759560ee0e1e086d71b931ac5b8938b3c6274ffdb92c438"} Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.896738 5030 scope.go:117] "RemoveContainer" containerID="59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.906207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerStarted","Data":"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef"} Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.906255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerStarted","Data":"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314"} Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.906273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" 
event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerStarted","Data":"137df26c6bb4d8670f11ce9fa1ef37b015be34b9b5f49f4140a64e683476e6d7"} Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.907811 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.908028 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.934559 5030 scope.go:117] "RemoveContainer" containerID="d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.943051 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" podStartSLOduration=1.943029742 podStartE2EDuration="1.943029742s" podCreationTimestamp="2026-01-20 23:29:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:15.931882482 +0000 UTC m=+3228.252142760" watchObservedRunningTime="2026-01-20 23:29:15.943029742 +0000 UTC m=+3228.263290050" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.955472 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.965575 5030 scope.go:117] "RemoveContainer" containerID="d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728" Jan 20 23:29:15 crc kubenswrapper[5030]: I0120 23:29:15.975005 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8bbx"] Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.002394 5030 scope.go:117] "RemoveContainer" containerID="59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2" Jan 20 23:29:16 crc kubenswrapper[5030]: E0120 23:29:16.002919 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2\": container with ID starting with 59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2 not found: ID does not exist" containerID="59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2" Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.002959 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2"} err="failed to get container status \"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2\": rpc error: code = NotFound desc = could not find container \"59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2\": container with ID starting with 59cc829e121560281eeef3334c8d78cd7b609c7fe07a52623f95e884e4b274b2 not found: ID does not exist" Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.002983 5030 scope.go:117] "RemoveContainer" containerID="d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0" Jan 20 23:29:16 crc kubenswrapper[5030]: E0120 23:29:16.003273 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0\": container with ID starting with 
d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0 not found: ID does not exist" containerID="d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.003293 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0"} err="failed to get container status \"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0\": rpc error: code = NotFound desc = could not find container \"d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0\": container with ID starting with d2679d6810a09b1e800c74f2e6c0119f9097ba8fb3ef6591be6abb902ad883d0 not found: ID does not exist"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.003305 5030 scope.go:117] "RemoveContainer" containerID="d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728"
Jan 20 23:29:16 crc kubenswrapper[5030]: E0120 23:29:16.003550 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728\": container with ID starting with d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728 not found: ID does not exist" containerID="d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.003694 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728"} err="failed to get container status \"d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728\": rpc error: code = NotFound desc = could not find container \"d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728\": container with ID starting with d3cfcba4ccb5149c7d961b4d7bdd22020869d2e34231e12f3dee07abe05ef728 not found: ID does not exist"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.826431 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.919440 5030 generic.go:334] "Generic (PLEG): container finished" podID="d72149d3-95a7-48e4-8016-a8029e658356" containerID="12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc" exitCode=0
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.919507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerDied","Data":"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc"}
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.919535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d72149d3-95a7-48e4-8016-a8029e658356","Type":"ContainerDied","Data":"746dc54c0edd3981c6544a2c6c7541b75b6cec8d496d579ebbba9b28e3dd2027"}
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.919557 5030 scope.go:117] "RemoveContainer" containerID="3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80"
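The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" entries above are the benign tail of the certified-operators-h8bbx and ceilometer-0 cleanup: by the time the RemoveContainer retry fires, CRI-O has already deleted the container, so every status and delete call comes back as gRPC NotFound and the kubelet records it and moves on. The sketch below is a minimal illustration of how that race is usually tolerated when driving a CRI runtime directly; the runtimeService interface and the removeIfPresent helper are inventions of this note, not kubelet code.

    package criutil

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // runtimeService is a deliberately tiny stand-in for the CRI RuntimeService
    // surface used in this sketch; the real interface is much larger.
    type runtimeService interface {
        RemoveContainer(ctx context.Context, containerID string) error
    }

    // removeIfPresent treats gRPC NotFound ("could not find container ...: ID
    // does not exist", as logged above) as success: the container is already
    // gone, so there is nothing left to delete.
    func removeIfPresent(ctx context.Context, rt runtimeService, containerID string) error {
        err := rt.RemoveContainer(ctx, containerID)
        if err == nil {
            return nil
        }
        if st, ok := status.FromError(err); ok && st.Code() == codes.NotFound {
            return nil // already removed by an earlier attempt or by the runtime itself
        }
        return fmt.Errorf("removing container %s: %w", containerID, err)
    }

That appears to match what this excerpt shows the kubelet doing: it logs the NotFound result and continues with the rest of the pod cleanup rather than failing it.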
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.919692 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.933711 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.933820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmxx6\" (UniqueName: \"kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.933953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.934079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.934112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.934171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.934600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.934808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd\") pod \"d72149d3-95a7-48e4-8016-a8029e658356\" (UID: \"d72149d3-95a7-48e4-8016-a8029e658356\") "
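The burst above is the volume manager unwinding everything the old ceilometer-0 (UID d72149d3-95a7-48e4-8016-a8029e658356) had mounted: each "operationExecutor.UnmountVolume started" is answered by an "UnmountVolume.TearDown succeeded", then by the "Volume detached" lines that follow, and eventually by the "Cleaned up orphaned pod volumes dir" entries further down. When auditing a teardown like this it can help to fold the journal into a per-pod summary; the helper below is a rough sketch keyed off the exact phrases in these messages. The program name, phase list and output format are choices of this note, and it expects one journal record per line, as produced by journalctl -u kubelet.

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    // uidRe pulls the pod UID out of a record; UIDs in this journal are plain UUIDs.
    var uidRe = regexp.MustCompile(`[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}`)

    // phases are volume lifecycle messages that appear verbatim in the log above.
    var phases = []string{
        "UnmountVolume started",
        "UnmountVolume.TearDown succeeded",
        "Volume detached",
        "VerifyControllerAttachedVolume started",
        "MountVolume started",
        "MountVolume.SetUp succeeded",
    }

    func main() {
        counts := map[string]map[string]int{} // pod UID -> phase -> count
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet records are long
        for sc.Scan() {
            line := sc.Text()
            for _, ph := range phases {
                if !strings.Contains(line, ph) {
                    continue
                }
                uid := uidRe.FindString(line)
                if uid == "" {
                    uid = "unknown"
                }
                if counts[uid] == nil {
                    counts[uid] = map[string]int{}
                }
                counts[uid][ph]++
            }
        }
        for uid, byPhase := range counts {
            fmt.Println(uid)
            for ph, n := range byPhase {
                fmt.Printf("  %-40s %d\n", ph, n)
            }
        }
    }

Fed a saved excerpt on stdin, it prints per-UID counts, which makes it easy to confirm that every volume unmounted for d72149d3-... shows up again as a mount for the replacement UID 1dce7135-... later in this log.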
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.935129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.935479 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.935496 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d72149d3-95a7-48e4-8016-a8029e658356-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.952304 5030 scope.go:117] "RemoveContainer" containerID="c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291"
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.952308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6" (OuterVolumeSpecName: "kube-api-access-dmxx6") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "kube-api-access-dmxx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.952461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts" (OuterVolumeSpecName: "scripts") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 23:29:16 crc kubenswrapper[5030]: I0120 23:29:16.958760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.009573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.033116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data" (OuterVolumeSpecName: "config-data") pod "d72149d3-95a7-48e4-8016-a8029e658356" (UID: "d72149d3-95a7-48e4-8016-a8029e658356"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.040919 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.040953 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.040980 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.040993 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmxx6\" (UniqueName: \"kubernetes.io/projected/d72149d3-95a7-48e4-8016-a8029e658356-kube-api-access-dmxx6\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.041006 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72149d3-95a7-48e4-8016-a8029e658356-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.043008 5030 scope.go:117] "RemoveContainer" containerID="12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.064051 5030 scope.go:117] "RemoveContainer" containerID="cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.084195 5030 scope.go:117] "RemoveContainer" containerID="3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.084787 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80\": container with ID starting with 3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80 not found: ID does not exist" containerID="3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.084817 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80"} err="failed to get container status \"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80\": rpc error: code = NotFound desc = could not find container \"3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80\": container with ID starting with 3aa4e31f19cbf1f889f6c7fa026765ddf9e403581f09b4802d8ac3d7c287bf80 not found: ID does not exist" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.084839 5030 scope.go:117] "RemoveContainer" containerID="c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.085115 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291\": container with ID starting with c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291 not found: ID does not exist" 
containerID="c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.085139 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291"} err="failed to get container status \"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291\": rpc error: code = NotFound desc = could not find container \"c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291\": container with ID starting with c3f39f4dcfea9ce39a1b2f55f1629c023a92601135383bf7a408f83fadeaa291 not found: ID does not exist" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.085152 5030 scope.go:117] "RemoveContainer" containerID="12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.085812 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc\": container with ID starting with 12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc not found: ID does not exist" containerID="12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.085834 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc"} err="failed to get container status \"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc\": rpc error: code = NotFound desc = could not find container \"12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc\": container with ID starting with 12bb23d36b3f804fce0bab26de1f6d7e35de2f78a62878d22bc7ae1bd6b04acc not found: ID does not exist" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.085847 5030 scope.go:117] "RemoveContainer" containerID="cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.086335 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50\": container with ID starting with cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50 not found: ID does not exist" containerID="cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.086382 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50"} err="failed to get container status \"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50\": rpc error: code = NotFound desc = could not find container \"cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50\": container with ID starting with cb1b71cee4930c80b7a9c83f52bd0374b4d41c4e164fe20693ffe4c6cb6e9e50 not found: ID does not exist" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.253380 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.267445 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.292599 5030 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293018 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="proxy-httpd" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="proxy-httpd" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293062 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="extract-content" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293068 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="extract-content" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293077 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="sg-core" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293083 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="sg-core" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293092 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="registry-server" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293097 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="registry-server" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293112 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="extract-utilities" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293118 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="extract-utilities" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293130 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-notification-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293135 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-notification-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: E0120 23:29:17.293153 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-central-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-central-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-central-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" containerName="registry-server" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293338 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="proxy-httpd" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293350 
5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="sg-core" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.293360 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72149d3-95a7-48e4-8016-a8029e658356" containerName="ceilometer-notification-agent" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.294918 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.299866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.299881 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.308499 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.349810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.349872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.350010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.350189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqvdh\" (UniqueName: \"kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.350253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.350399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.350494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqvdh\" (UniqueName: \"kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.452945 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.453107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.457046 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.457772 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.458002 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.458147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.478807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqvdh\" (UniqueName: \"kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh\") pod \"ceilometer-0\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.609972 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.971678 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ece861-114a-4566-ab6a-a0488943ac92" path="/var/lib/kubelet/pods/c1ece861-114a-4566-ab6a-a0488943ac92/volumes" Jan 20 23:29:17 crc kubenswrapper[5030]: I0120 23:29:17.972521 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72149d3-95a7-48e4-8016-a8029e658356" path="/var/lib/kubelet/pods/d72149d3-95a7-48e4-8016-a8029e658356/volumes" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.091316 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:18 crc kubenswrapper[5030]: W0120 23:29:18.097357 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dce7135_b29a_4e44_8da2_643520d68a10.slice/crio-cb2a56ce443f698372e774dc9e1b5f5b90bc8a62075007ef9d0422d72c8a893d WatchSource:0}: Error finding container cb2a56ce443f698372e774dc9e1b5f5b90bc8a62075007ef9d0422d72c8a893d: Status 404 returned error can't find the container with id cb2a56ce443f698372e774dc9e1b5f5b90bc8a62075007ef9d0422d72c8a893d Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.248030 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-d56mx"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.250531 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.285978 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-d56mx"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.351819 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9kt22"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.353237 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.362686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9kt22"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.369779 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.373815 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.374537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhg9b\" (UniqueName: \"kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.374667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.376640 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.379186 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.451920 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qxfmd"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.453549 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.473503 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qxfmd"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twrzw\" (UniqueName: \"kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5q29\" (UniqueName: \"kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.477238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhg9b\" (UniqueName: \"kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.478254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.497572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhg9b\" (UniqueName: \"kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b\") pod \"nova-api-db-create-d56mx\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " 
pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.555578 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.557025 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.563879 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.564594 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rpw8\" (UniqueName: \"kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twrzw\" (UniqueName: \"kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5q29\" (UniqueName: \"kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.578683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.579138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.579428 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.585785 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.597249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twrzw\" (UniqueName: \"kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw\") pod \"nova-api-e6f9-account-create-update-cd46q\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.598274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5q29\" (UniqueName: \"kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29\") pod \"nova-cell0-db-create-9kt22\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.673314 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.680720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.680797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rpw8\" (UniqueName: \"kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.680849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtb8\" (UniqueName: \"kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.680909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.681726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.689285 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.699205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rpw8\" (UniqueName: \"kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8\") pod \"nova-cell1-db-create-qxfmd\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.765161 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.766361 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.769159 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.769864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.783588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtb8\" (UniqueName: \"kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.783757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.784393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.792484 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"] Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.805825 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtb8\" (UniqueName: \"kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8\") pod \"nova-cell0-7b33-account-create-update-rwxx4\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.873337 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.885733 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5m8\" (UniqueName: \"kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.885814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.950543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerStarted","Data":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.950600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerStarted","Data":"cb2a56ce443f698372e774dc9e1b5f5b90bc8a62075007ef9d0422d72c8a893d"} Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.987981 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5m8\" (UniqueName: \"kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.988065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:18 crc kubenswrapper[5030]: I0120 23:29:18.988865 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.007906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5m8\" (UniqueName: \"kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8\") pod \"nova-cell1-eba6-account-create-update-lz769\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.100581 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"
Jan 20 23:29:19 crc kubenswrapper[5030]: W0120 23:29:19.103901 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe0edb2_3db0_4578_ba0f_22e9f7209b5a.slice/crio-3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786 WatchSource:0}: Error finding container 3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786: Status 404 returned error can't find the container with id 3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.108403 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-d56mx"]
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.228575 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9kt22"]
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.237533 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q"]
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.335644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qxfmd"]
Jan 20 23:29:19 crc kubenswrapper[5030]: W0120 23:29:19.370815 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6003f418_8606_4a88_b48b_2fb2bd06f465.slice/crio-21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516 WatchSource:0}: Error finding container 21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516: Status 404 returned error can't find the container with id 21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.394734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4"]
Jan 20 23:29:19 crc kubenswrapper[5030]: W0120 23:29:19.404776 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ab0bd4_f8be_4a99_87c5_44b8efbaa33d.slice/crio-a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f WatchSource:0}: Error finding container a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f: Status 404 returned error can't find the container with id a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f
Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.514549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"]
Jan 20 23:29:19 crc kubenswrapper[5030]: W0120 23:29:19.517145 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e2d7a8b_c587_4109_ac49_11dbec78bbba.slice/crio-f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8 WatchSource:0}: Error finding container f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8: Status 404 returned error can't find the container with id f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8
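The W0120 "Failed to process watch event ... Status 404" entries above appear to be cAdvisor racing with container creation: it sees the freshly created crio-... cgroups for the nova db-create and account-create pods before the corresponding containers are registered, so the lookup returns 404; during a burst of pod starts like this it is typically transient noise. The records that follow show those one-shot pods running and their containers finishing with exitCode=0. From a test or operator point of view the equivalent check is simply waiting for each pod to reach phase Succeeded; the sketch below does that with client-go. The pod names are copied from the SyncLoop ADD/UPDATE entries in this log, while the KUBECONFIG lookup, the five-minute timeout and the two-second poll are assumptions of this example (it also assumes a client-go version whose Get takes a context).

    package main

    import (
        "context"
        "fmt"
        "os"
        "time"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // waitSucceeded polls until the named pod reaches phase Succeeded, or errors
    // out if it fails or the context expires.
    func waitSucceeded(ctx context.Context, cs *kubernetes.Clientset, ns, name string) error {
        for {
            pod, err := cs.CoreV1().Pods(ns).Get(ctx, name, metav1.GetOptions{})
            if err == nil {
                switch pod.Status.Phase {
                case corev1.PodSucceeded:
                    return nil
                case corev1.PodFailed:
                    return fmt.Errorf("%s/%s failed", ns, name)
                }
            }
            select {
            case <-ctx.Done():
                return ctx.Err()
            case <-time.After(2 * time.Second):
            }
        }
    }

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
        defer cancel()
        for _, name := range []string{
            "nova-api-db-create-d56mx",
            "nova-cell0-db-create-9kt22",
            "nova-cell1-db-create-qxfmd",
            "nova-cell0-7b33-account-create-update-rwxx4",
        } {
            if err := waitSucceeded(ctx, cs, "openstack-kuttl-tests", name); err != nil {
                panic(err)
            }
            fmt.Println(name, "completed")
        }
    }

The same wait applies to the remaining account-create pods, whose containers finish slightly later in this log. Plain polling keeps the sketch dependency-free; a real harness would more likely rely on a watch or on the kuttl assert machinery.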
containerID="58c6f323880abc0cbf7d43a0533adcf8ff481ae18da8b64a1cdce69e82b6174f" exitCode=0 Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.967962 5030 generic.go:334] "Generic (PLEG): container finished" podID="6003f418-8606-4a88-b48b-2fb2bd06f465" containerID="c8a3ef2fef70b1b939dd1d84f145e5e6434e7cc730bbddeeb9840fb2dac557ec" exitCode=0 Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.974680 5030 generic.go:334] "Generic (PLEG): container finished" podID="4572984e-7cd5-46d9-8d0b-627ab9438173" containerID="7085afa79dd0d98549d5a88ded95ec6171a3544ee9fd3de70e24d0a912577f79" exitCode=0 Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.981479 5030 generic.go:334] "Generic (PLEG): container finished" podID="75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" containerID="1e78519247299b9a68e9bc1aea8273f93f583e1bcef928b636dee432adb3ccd8" exitCode=0 Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.982774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" event={"ID":"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a","Type":"ContainerDied","Data":"58c6f323880abc0cbf7d43a0533adcf8ff481ae18da8b64a1cdce69e82b6174f"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.982876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" event={"ID":"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a","Type":"ContainerStarted","Data":"3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.982942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" event={"ID":"6003f418-8606-4a88-b48b-2fb2bd06f465","Type":"ContainerDied","Data":"c8a3ef2fef70b1b939dd1d84f145e5e6434e7cc730bbddeeb9840fb2dac557ec"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.983020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" event={"ID":"6003f418-8606-4a88-b48b-2fb2bd06f465","Type":"ContainerStarted","Data":"21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.983088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" event={"ID":"4572984e-7cd5-46d9-8d0b-627ab9438173","Type":"ContainerDied","Data":"7085afa79dd0d98549d5a88ded95ec6171a3544ee9fd3de70e24d0a912577f79"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.983146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" event={"ID":"4572984e-7cd5-46d9-8d0b-627ab9438173","Type":"ContainerStarted","Data":"ea7c64ab35eb4e08a07f8ed1205fc3a119227ab7fffc4c9bbf906c62f4d5c63b"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.983264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" event={"ID":"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d","Type":"ContainerDied","Data":"1e78519247299b9a68e9bc1aea8273f93f583e1bcef928b636dee432adb3ccd8"} Jan 20 23:29:19 crc kubenswrapper[5030]: I0120 23:29:19.983334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" event={"ID":"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d","Type":"ContainerStarted","Data":"a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f"} Jan 20 23:29:20 crc kubenswrapper[5030]: I0120 23:29:20.005364 5030 
generic.go:334] "Generic (PLEG): container finished" podID="46aa31eb-445e-4c7b-ab98-6326b68c20ac" containerID="fd760ce4c516eadaa45a00279c67df1a2875779ca50c76f040476cc90657dcdd" exitCode=0 Jan 20 23:29:20 crc kubenswrapper[5030]: I0120 23:29:20.005494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" event={"ID":"46aa31eb-445e-4c7b-ab98-6326b68c20ac","Type":"ContainerDied","Data":"fd760ce4c516eadaa45a00279c67df1a2875779ca50c76f040476cc90657dcdd"} Jan 20 23:29:20 crc kubenswrapper[5030]: I0120 23:29:20.005521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" event={"ID":"46aa31eb-445e-4c7b-ab98-6326b68c20ac","Type":"ContainerStarted","Data":"db58d31443fedcdd0322dddf0a09bf5de65255d4fedf652a83b0657ecd698bfd"} Jan 20 23:29:20 crc kubenswrapper[5030]: I0120 23:29:20.006488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" event={"ID":"6e2d7a8b-c587-4109-ac49-11dbec78bbba","Type":"ContainerStarted","Data":"f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8"} Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.018934 5030 generic.go:334] "Generic (PLEG): container finished" podID="6e2d7a8b-c587-4109-ac49-11dbec78bbba" containerID="05d2c8ca19d2f726e4f7a14f77d3b2e23274d5e5e2654c81a841c51062f6366e" exitCode=0 Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.019136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" event={"ID":"6e2d7a8b-c587-4109-ac49-11dbec78bbba","Type":"ContainerDied","Data":"05d2c8ca19d2f726e4f7a14f77d3b2e23274d5e5e2654c81a841c51062f6366e"} Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.024977 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerStarted","Data":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.025014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerStarted","Data":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.385957 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.425917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts\") pod \"6003f418-8606-4a88-b48b-2fb2bd06f465\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.426053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rpw8\" (UniqueName: \"kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8\") pod \"6003f418-8606-4a88-b48b-2fb2bd06f465\" (UID: \"6003f418-8606-4a88-b48b-2fb2bd06f465\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.427196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6003f418-8606-4a88-b48b-2fb2bd06f465" (UID: "6003f418-8606-4a88-b48b-2fb2bd06f465"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.432483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8" (OuterVolumeSpecName: "kube-api-access-6rpw8") pod "6003f418-8606-4a88-b48b-2fb2bd06f465" (UID: "6003f418-8606-4a88-b48b-2fb2bd06f465"). InnerVolumeSpecName "kube-api-access-6rpw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.477115 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.489093 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.489172 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.509420 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.527014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts\") pod \"4572984e-7cd5-46d9-8d0b-627ab9438173\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.528608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5q29\" (UniqueName: \"kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29\") pod \"4572984e-7cd5-46d9-8d0b-627ab9438173\" (UID: \"4572984e-7cd5-46d9-8d0b-627ab9438173\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.528752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts\") pod \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.528884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhg9b\" (UniqueName: \"kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b\") pod \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\" (UID: \"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.528973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qtb8\" (UniqueName: \"kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8\") pod \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.529042 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts\") pod \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\" (UID: \"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.529502 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6003f418-8606-4a88-b48b-2fb2bd06f465-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.529572 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rpw8\" (UniqueName: \"kubernetes.io/projected/6003f418-8606-4a88-b48b-2fb2bd06f465-kube-api-access-6rpw8\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.527784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4572984e-7cd5-46d9-8d0b-627ab9438173" (UID: "4572984e-7cd5-46d9-8d0b-627ab9438173"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.530490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" (UID: "75ab0bd4-f8be-4a99-87c5-44b8efbaa33d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.531244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" (UID: "cbe0edb2-3db0-4578-ba0f-22e9f7209b5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.531839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b" (OuterVolumeSpecName: "kube-api-access-mhg9b") pod "cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" (UID: "cbe0edb2-3db0-4578-ba0f-22e9f7209b5a"). InnerVolumeSpecName "kube-api-access-mhg9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.536322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29" (OuterVolumeSpecName: "kube-api-access-t5q29") pod "4572984e-7cd5-46d9-8d0b-627ab9438173" (UID: "4572984e-7cd5-46d9-8d0b-627ab9438173"). InnerVolumeSpecName "kube-api-access-t5q29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.539157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8" (OuterVolumeSpecName: "kube-api-access-9qtb8") pod "75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" (UID: "75ab0bd4-f8be-4a99-87c5-44b8efbaa33d"). InnerVolumeSpecName "kube-api-access-9qtb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.631255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts\") pod \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.631318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twrzw\" (UniqueName: \"kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw\") pod \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\" (UID: \"46aa31eb-445e-4c7b-ab98-6326b68c20ac\") " Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.631709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46aa31eb-445e-4c7b-ab98-6326b68c20ac" (UID: "46aa31eb-445e-4c7b-ab98-6326b68c20ac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632378 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5q29\" (UniqueName: \"kubernetes.io/projected/4572984e-7cd5-46d9-8d0b-627ab9438173-kube-api-access-t5q29\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632397 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632410 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhg9b\" (UniqueName: \"kubernetes.io/projected/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a-kube-api-access-mhg9b\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632420 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qtb8\" (UniqueName: \"kubernetes.io/projected/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-kube-api-access-9qtb8\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632429 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632438 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4572984e-7cd5-46d9-8d0b-627ab9438173-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.632446 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46aa31eb-445e-4c7b-ab98-6326b68c20ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.634993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw" (OuterVolumeSpecName: "kube-api-access-twrzw") pod "46aa31eb-445e-4c7b-ab98-6326b68c20ac" (UID: "46aa31eb-445e-4c7b-ab98-6326b68c20ac"). InnerVolumeSpecName "kube-api-access-twrzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:21 crc kubenswrapper[5030]: I0120 23:29:21.734186 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twrzw\" (UniqueName: \"kubernetes.io/projected/46aa31eb-445e-4c7b-ab98-6326b68c20ac-kube-api-access-twrzw\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.036104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" event={"ID":"cbe0edb2-3db0-4578-ba0f-22e9f7209b5a","Type":"ContainerDied","Data":"3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786"} Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.036439 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c841fe0735533b1d4fc1a64fe1676953a3bb76aa050ec112807254d96fb6786" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.036230 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-d56mx" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.037612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" event={"ID":"6003f418-8606-4a88-b48b-2fb2bd06f465","Type":"ContainerDied","Data":"21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516"} Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.037679 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21091977493e34fa7c0a46010f12316512cdc3295f4228761503661c923be516" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.037643 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qxfmd" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.039873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" event={"ID":"4572984e-7cd5-46d9-8d0b-627ab9438173","Type":"ContainerDied","Data":"ea7c64ab35eb4e08a07f8ed1205fc3a119227ab7fffc4c9bbf906c62f4d5c63b"} Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.039927 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7c64ab35eb4e08a07f8ed1205fc3a119227ab7fffc4c9bbf906c62f4d5c63b" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.040167 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-9kt22" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.041987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" event={"ID":"75ab0bd4-f8be-4a99-87c5-44b8efbaa33d","Type":"ContainerDied","Data":"a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f"} Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.042060 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0edb4d35736dd06269c5e13d4b7868838574c90ae4cdc4d84ba64d696db4a4f" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.042009 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.043730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.044475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q" event={"ID":"46aa31eb-445e-4c7b-ab98-6326b68c20ac","Type":"ContainerDied","Data":"db58d31443fedcdd0322dddf0a09bf5de65255d4fedf652a83b0657ecd698bfd"} Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.044514 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db58d31443fedcdd0322dddf0a09bf5de65255d4fedf652a83b0657ecd698bfd" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.412469 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.447181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts\") pod \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.447325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc5m8\" (UniqueName: \"kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8\") pod \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\" (UID: \"6e2d7a8b-c587-4109-ac49-11dbec78bbba\") " Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.447650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e2d7a8b-c587-4109-ac49-11dbec78bbba" (UID: "6e2d7a8b-c587-4109-ac49-11dbec78bbba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.447931 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e2d7a8b-c587-4109-ac49-11dbec78bbba-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.453422 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8" (OuterVolumeSpecName: "kube-api-access-wc5m8") pod "6e2d7a8b-c587-4109-ac49-11dbec78bbba" (UID: "6e2d7a8b-c587-4109-ac49-11dbec78bbba"). InnerVolumeSpecName "kube-api-access-wc5m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:22 crc kubenswrapper[5030]: I0120 23:29:22.549495 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc5m8\" (UniqueName: \"kubernetes.io/projected/6e2d7a8b-c587-4109-ac49-11dbec78bbba-kube-api-access-wc5m8\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.059596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerStarted","Data":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.059997 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.061568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" event={"ID":"6e2d7a8b-c587-4109-ac49-11dbec78bbba","Type":"ContainerDied","Data":"f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8"} Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.061610 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12c223c647a93d6c6d10a585551605ddd0f6440c22298e439f17dc5c18953a8" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.061677 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.097288 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.62557475 podStartE2EDuration="6.097265667s" podCreationTimestamp="2026-01-20 23:29:17 +0000 UTC" firstStartedPulling="2026-01-20 23:29:18.100300774 +0000 UTC m=+3230.420561062" lastFinishedPulling="2026-01-20 23:29:22.571991691 +0000 UTC m=+3234.892251979" observedRunningTime="2026-01-20 23:29:23.080418919 +0000 UTC m=+3235.400679207" watchObservedRunningTime="2026-01-20 23:29:23.097265667 +0000 UTC m=+3235.417525955" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz"] Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784679 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784695 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6003f418-8606-4a88-b48b-2fb2bd06f465" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784724 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6003f418-8606-4a88-b48b-2fb2bd06f465" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784741 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aa31eb-445e-4c7b-ab98-6326b68c20ac" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784748 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aa31eb-445e-4c7b-ab98-6326b68c20ac" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784761 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2d7a8b-c587-4109-ac49-11dbec78bbba" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784766 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2d7a8b-c587-4109-ac49-11dbec78bbba" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784779 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.784798 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4572984e-7cd5-46d9-8d0b-627ab9438173" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784803 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4572984e-7cd5-46d9-8d0b-627ab9438173" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784947 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6003f418-8606-4a88-b48b-2fb2bd06f465" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2d7a8b-c587-4109-ac49-11dbec78bbba" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.784985 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="46aa31eb-445e-4c7b-ab98-6326b68c20ac" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.785003 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" containerName="mariadb-account-create-update" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.785012 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4572984e-7cd5-46d9-8d0b-627ab9438173" containerName="mariadb-database-create" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.785551 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.790483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.790616 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-jczwg" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.790788 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.796370 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz"] Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.872445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs79c\" (UniqueName: \"kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.872681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.872706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.872743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.962545 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:29:23 crc kubenswrapper[5030]: E0120 23:29:23.962882 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.973543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.973720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.973818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.973934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs79c\" (UniqueName: \"kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.978773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.978961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.979884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:23 crc kubenswrapper[5030]: I0120 23:29:23.995834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs79c\" (UniqueName: \"kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c\") pod \"nova-cell0-conductor-db-sync-2pwxz\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.104648 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.476286 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.478167 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.562324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz"] Jan 20 23:29:24 crc kubenswrapper[5030]: W0120 23:29:24.569719 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd48950b_220e_43d8_a96e_ad983ee50c2c.slice/crio-37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001 WatchSource:0}: Error finding container 37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001: Status 404 returned error can't find the container with id 37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001 Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.759721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.931366 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.932267 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-log" containerID="cri-o://8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea" gracePeriod=30 Jan 20 23:29:24 crc kubenswrapper[5030]: I0120 23:29:24.932422 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-httpd" containerID="cri-o://09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2" gracePeriod=30 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.083670 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerID="8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea" exitCode=143 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.083751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" 
event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerDied","Data":"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea"} Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" event={"ID":"cd48950b-220e-43d8-a96e-ad983ee50c2c","Type":"ContainerStarted","Data":"76fbcef65dc0a7b0baa96f8d4bc2c60c0c4628b98902d125ac94e7d82e8fe890"} Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" event={"ID":"cd48950b-220e-43d8-a96e-ad983ee50c2c","Type":"ContainerStarted","Data":"37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001"} Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085650 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-central-agent" containerID="cri-o://559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" gracePeriod=30 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085752 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-notification-agent" containerID="cri-o://3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" gracePeriod=30 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085747 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="sg-core" containerID="cri-o://fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" gracePeriod=30 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.085894 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="proxy-httpd" containerID="cri-o://4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" gracePeriod=30 Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.114568 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" podStartSLOduration=2.1145483 podStartE2EDuration="2.1145483s" podCreationTimestamp="2026-01-20 23:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:25.107878238 +0000 UTC m=+3237.428138526" watchObservedRunningTime="2026-01-20 23:29:25.1145483 +0000 UTC m=+3237.434808588" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.771463 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.802296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqvdh\" (UniqueName: \"kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts\") pod \"1dce7135-b29a-4e44-8da2-643520d68a10\" (UID: \"1dce7135-b29a-4e44-8da2-643520d68a10\") " Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.803968 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.804000 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dce7135-b29a-4e44-8da2-643520d68a10-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.814767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts" (OuterVolumeSpecName: "scripts") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.814960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh" (OuterVolumeSpecName: "kube-api-access-gqvdh") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "kube-api-access-gqvdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.839753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.896259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.905088 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqvdh\" (UniqueName: \"kubernetes.io/projected/1dce7135-b29a-4e44-8da2-643520d68a10-kube-api-access-gqvdh\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.905115 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.905125 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.905134 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:25 crc kubenswrapper[5030]: I0120 23:29:25.917222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data" (OuterVolumeSpecName: "config-data") pod "1dce7135-b29a-4e44-8da2-643520d68a10" (UID: "1dce7135-b29a-4e44-8da2-643520d68a10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.007052 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dce7135-b29a-4e44-8da2-643520d68a10-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096011 5030 generic.go:334] "Generic (PLEG): container finished" podID="1dce7135-b29a-4e44-8da2-643520d68a10" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" exitCode=0 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096048 5030 generic.go:334] "Generic (PLEG): container finished" podID="1dce7135-b29a-4e44-8da2-643520d68a10" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" exitCode=2 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096056 5030 generic.go:334] "Generic (PLEG): container finished" podID="1dce7135-b29a-4e44-8da2-643520d68a10" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" exitCode=0 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096062 5030 generic.go:334] "Generic (PLEG): container finished" podID="1dce7135-b29a-4e44-8da2-643520d68a10" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" exitCode=0 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerDied","Data":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerDied","Data":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096163 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerDied","Data":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerDied","Data":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1dce7135-b29a-4e44-8da2-643520d68a10","Type":"ContainerDied","Data":"cb2a56ce443f698372e774dc9e1b5f5b90bc8a62075007ef9d0422d72c8a893d"} Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096204 5030 scope.go:117] "RemoveContainer" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.096371 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.119535 5030 scope.go:117] "RemoveContainer" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.126667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.139082 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.139110 5030 scope.go:117] "RemoveContainer" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158150 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.158567 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-notification-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158584 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-notification-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.158605 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="proxy-httpd" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158612 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="proxy-httpd" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.158659 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-central-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158667 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-central-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.158692 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="sg-core" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158699 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="sg-core" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158859 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-central-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="sg-core" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158880 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="ceilometer-notification-agent" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.158896 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" containerName="proxy-httpd" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.160425 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.163003 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.163234 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.176461 5030 scope.go:117] "RemoveContainer" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.185408 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.203355 5030 scope.go:117] "RemoveContainer" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.203889 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": container with ID starting with 4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903 not found: ID does not exist" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.203932 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} err="failed to get container status \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": rpc error: code = NotFound desc = could not find container \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": container with ID starting with 4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.203957 5030 scope.go:117] "RemoveContainer" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.204354 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": container with ID starting with 
fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d not found: ID does not exist" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204394 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} err="failed to get container status \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": rpc error: code = NotFound desc = could not find container \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": container with ID starting with fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204421 5030 scope.go:117] "RemoveContainer" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.204642 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": container with ID starting with 3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf not found: ID does not exist" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204662 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} err="failed to get container status \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": rpc error: code = NotFound desc = could not find container \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": container with ID starting with 3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204675 5030 scope.go:117] "RemoveContainer" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: E0120 23:29:26.204922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": container with ID starting with 559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08 not found: ID does not exist" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204942 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} err="failed to get container status \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": rpc error: code = NotFound desc = could not find container \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": container with ID starting with 559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.204955 5030 scope.go:117] "RemoveContainer" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.205294 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} err="failed to get container status \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": rpc error: code = NotFound desc = could not find container \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": container with ID starting with 4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.205312 5030 scope.go:117] "RemoveContainer" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.205876 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} err="failed to get container status \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": rpc error: code = NotFound desc = could not find container \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": container with ID starting with fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.205901 5030 scope.go:117] "RemoveContainer" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206124 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} err="failed to get container status \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": rpc error: code = NotFound desc = could not find container \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": container with ID starting with 3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206143 5030 scope.go:117] "RemoveContainer" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206377 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} err="failed to get container status \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": rpc error: code = NotFound desc = could not find container \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": container with ID starting with 559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206428 5030 scope.go:117] "RemoveContainer" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206706 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} err="failed to get container status \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": rpc error: code = NotFound desc = could not find container \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": container with ID starting with 4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903 not found: ID does not exist" Jan 
20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206727 5030 scope.go:117] "RemoveContainer" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.206997 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} err="failed to get container status \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": rpc error: code = NotFound desc = could not find container \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": container with ID starting with fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207019 5030 scope.go:117] "RemoveContainer" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207246 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} err="failed to get container status \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": rpc error: code = NotFound desc = could not find container \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": container with ID starting with 3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207265 5030 scope.go:117] "RemoveContainer" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207568 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} err="failed to get container status \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": rpc error: code = NotFound desc = could not find container \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": container with ID starting with 559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207587 5030 scope.go:117] "RemoveContainer" containerID="4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207843 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903"} err="failed to get container status \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": rpc error: code = NotFound desc = could not find container \"4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903\": container with ID starting with 4d147bbf1e2d1341ab99b2baf6ed972ee272ecfb60fee832ca59f2e651525903 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.207870 5030 scope.go:117] "RemoveContainer" containerID="fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.208085 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d"} err="failed to get container status 
\"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": rpc error: code = NotFound desc = could not find container \"fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d\": container with ID starting with fa13c0997b4e6dd89fb97c2e5cfa719bc7bfe7f8b9b780d50b38a82601184f5d not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.208103 5030 scope.go:117] "RemoveContainer" containerID="3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.208532 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf"} err="failed to get container status \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": rpc error: code = NotFound desc = could not find container \"3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf\": container with ID starting with 3f6b3cebbee949cd8a5cef9f065c860ca399c3d451f5f177734893d9e6776ebf not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.208581 5030 scope.go:117] "RemoveContainer" containerID="559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.208817 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08"} err="failed to get container status \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": rpc error: code = NotFound desc = could not find container \"559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08\": container with ID starting with 559442b973af9ab3421903cac94d12aca0e8726e82b04a16dc715147c9f5ae08 not found: ID does not exist" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.210871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.210951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.211016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.211051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.211078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4rnmt\" (UniqueName: \"kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.211127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.211169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.297013 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.297267 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-log" containerID="cri-o://749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564" gracePeriod=30 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.297383 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-httpd" containerID="cri-o://9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160" gracePeriod=30 Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313297 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.313499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rnmt\" (UniqueName: \"kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.314052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.314070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.317220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.317262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.318797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.325093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.332447 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rnmt\" (UniqueName: \"kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt\") pod \"ceilometer-0\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.486920 5030 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.841458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: I0120 23:29:26.921216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:26 crc kubenswrapper[5030]: W0120 23:29:26.923156 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf81edba_3554_4020_a0c8_ceabf27e3558.slice/crio-fe3bccead818d56382c9e58fcd7380a0850c4dcac8fd43f3acaee77bed4cc5c1 WatchSource:0}: Error finding container fe3bccead818d56382c9e58fcd7380a0850c4dcac8fd43f3acaee77bed4cc5c1: Status 404 returned error can't find the container with id fe3bccead818d56382c9e58fcd7380a0850c4dcac8fd43f3acaee77bed4cc5c1 Jan 20 23:29:27 crc kubenswrapper[5030]: I0120 23:29:27.109156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerStarted","Data":"fe3bccead818d56382c9e58fcd7380a0850c4dcac8fd43f3acaee77bed4cc5c1"} Jan 20 23:29:27 crc kubenswrapper[5030]: I0120 23:29:27.111567 5030 generic.go:334] "Generic (PLEG): container finished" podID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerID="749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564" exitCode=143 Jan 20 23:29:27 crc kubenswrapper[5030]: I0120 23:29:27.111596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerDied","Data":"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564"} Jan 20 23:29:27 crc kubenswrapper[5030]: I0120 23:29:27.978501 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dce7135-b29a-4e44-8da2-643520d68a10" path="/var/lib/kubelet/pods/1dce7135-b29a-4e44-8da2-643520d68a10/volumes" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.150091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerStarted","Data":"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac"} Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.664372 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.713590 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9ms6\" (UniqueName: \"kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714252 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run\") pod \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\" (UID: \"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39\") " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs" (OuterVolumeSpecName: "logs") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.714934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.717890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.721535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts" (OuterVolumeSpecName: "scripts") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.733124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6" (OuterVolumeSpecName: "kube-api-access-k9ms6") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "kube-api-access-k9ms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.740867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.770349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data" (OuterVolumeSpecName: "config-data") pod "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" (UID: "a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.815834 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816066 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816158 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816242 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9ms6\" (UniqueName: \"kubernetes.io/projected/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-kube-api-access-k9ms6\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816304 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816366 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.816438 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.832925 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:29:28 crc kubenswrapper[5030]: I0120 23:29:28.920825 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.163210 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerID="09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2" exitCode=0 Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.163264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerDied","Data":"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2"} Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.163310 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.163340 5030 scope.go:117] "RemoveContainer" containerID="09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.163325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39","Type":"ContainerDied","Data":"20728bcc4474c83b22c1dc9e7ad146fcc4e58fac17397ef097e97e6a929ab012"} Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.169825 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerStarted","Data":"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f"} Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.169855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerStarted","Data":"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df"} Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.183665 5030 scope.go:117] "RemoveContainer" containerID="8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.231576 5030 scope.go:117] "RemoveContainer" containerID="09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2" Jan 20 23:29:29 crc kubenswrapper[5030]: E0120 23:29:29.232057 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2\": container with ID starting with 09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2 not found: ID does not exist" containerID="09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.232101 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2"} err="failed to get container status \"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2\": rpc error: code = NotFound desc = could not find container \"09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2\": container with ID starting with 09a49037e83033c54cae83c7b3f4e49aed1949d7ad0788fb8288bb11b66f88d2 not found: ID does not exist" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.232126 5030 scope.go:117] "RemoveContainer" containerID="8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea" Jan 20 23:29:29 crc kubenswrapper[5030]: E0120 23:29:29.232499 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea\": container with ID starting with 8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea not found: ID does not exist" containerID="8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.232533 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea"} err="failed to get container status 
\"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea\": rpc error: code = NotFound desc = could not find container \"8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea\": container with ID starting with 8868c8dbb49e9f936230d6a050a8159c9fa35eae21d95f7e496666e0266f14ea not found: ID does not exist" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.250395 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.258258 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.265683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:29 crc kubenswrapper[5030]: E0120 23:29:29.266048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-log" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.266065 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-log" Jan 20 23:29:29 crc kubenswrapper[5030]: E0120 23:29:29.266091 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-httpd" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.266098 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-httpd" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.266252 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-httpd" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.266271 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" containerName="glance-log" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.267133 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.269479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.285898 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.326990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khbk\" (UniqueName: \"kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.327255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khbk\" (UniqueName: \"kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429651 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.429836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.430549 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.430718 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.435823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.435943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.436429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.447413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khbk\" (UniqueName: \"kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.465715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.582796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.846483 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl2wr\" (UniqueName: \"kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.941511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data\") pod \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\" (UID: \"0102d8ca-9bb5-417f-8e32-8efab78e9f22\") " Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.942120 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.942349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs" (OuterVolumeSpecName: "logs") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.942794 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.942813 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0102d8ca-9bb5-417f-8e32-8efab78e9f22-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.945085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr" (OuterVolumeSpecName: "kube-api-access-wl2wr") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "kube-api-access-wl2wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.945739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.947012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts" (OuterVolumeSpecName: "scripts") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.972785 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39" path="/var/lib/kubelet/pods/a5ccb98e-eb94-4f38-8f47-ba9f45d0ce39/volumes" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.978429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:29 crc kubenswrapper[5030]: I0120 23:29:29.995673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data" (OuterVolumeSpecName: "config-data") pod "0102d8ca-9bb5-417f-8e32-8efab78e9f22" (UID: "0102d8ca-9bb5-417f-8e32-8efab78e9f22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.044121 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.044153 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.044164 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl2wr\" (UniqueName: \"kubernetes.io/projected/0102d8ca-9bb5-417f-8e32-8efab78e9f22-kube-api-access-wl2wr\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.044175 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102d8ca-9bb5-417f-8e32-8efab78e9f22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.044199 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.065269 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.067799 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:29:30 crc kubenswrapper[5030]: W0120 23:29:30.067957 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8324c6f6_da57_40dc_a9f3_ce99a41078e6.slice/crio-f24b402cd5da637624318996811378eb99d57b35a3ed06639b327625181fe635 WatchSource:0}: Error finding container f24b402cd5da637624318996811378eb99d57b35a3ed06639b327625181fe635: Status 404 returned error can't find the container with id f24b402cd5da637624318996811378eb99d57b35a3ed06639b327625181fe635 Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.145877 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.182520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerStarted","Data":"f24b402cd5da637624318996811378eb99d57b35a3ed06639b327625181fe635"} Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.186711 5030 generic.go:334] "Generic (PLEG): container finished" podID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerID="9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160" exitCode=0 Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.186781 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.186795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerDied","Data":"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160"} Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.186836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"0102d8ca-9bb5-417f-8e32-8efab78e9f22","Type":"ContainerDied","Data":"e7f5d5a19f7e1822725c383973ce208af0be43246c5ba0eeb232eeb1959f3297"} Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.186888 5030 scope.go:117] "RemoveContainer" containerID="9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.254925 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.262013 5030 scope.go:117] "RemoveContainer" containerID="749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.262529 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.278855 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:30 crc kubenswrapper[5030]: E0120 23:29:30.279286 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-log" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.279309 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-log" Jan 20 23:29:30 crc kubenswrapper[5030]: E0120 23:29:30.279338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-httpd" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.279345 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-httpd" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.279531 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-log" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.279549 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" containerName="glance-httpd" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.280545 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.285963 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.300095 5030 scope.go:117] "RemoveContainer" containerID="9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160" Jan 20 23:29:30 crc kubenswrapper[5030]: E0120 23:29:30.301163 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160\": container with ID starting with 9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160 not found: ID does not exist" containerID="9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.301222 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160"} err="failed to get container status \"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160\": rpc error: code = NotFound desc = could not find container \"9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160\": container with ID starting with 9af0e8c69a3180f95cf8b7b15e0a513b2b8f18bee9d34db8bc2221876cd8f160 not found: ID does not exist" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.301260 5030 scope.go:117] "RemoveContainer" containerID="749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564" Jan 20 23:29:30 crc kubenswrapper[5030]: E0120 23:29:30.311115 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564\": container with ID starting with 749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564 not found: ID does not exist" containerID="749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.311163 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564"} err="failed to get container status \"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564\": rpc error: code = NotFound desc = could not find container \"749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564\": container with ID starting with 749cba0760492510ca64904082a3aae98d7031dc2ff515200ba3191109955564 not found: ID does not exist" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.322095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.348428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450605 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.450782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.451012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.451091 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.451275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.454692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.455641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.456882 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.470003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.483791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:30 crc kubenswrapper[5030]: I0120 23:29:30.610886 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.052701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:29:31 crc kubenswrapper[5030]: W0120 23:29:31.073051 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03f8ffc8_cadf_4aab_ab46_71163f0339d9.slice/crio-a41a6866c28b712a4ddfd1b80a16b93b2eb82d94b1aee8b4feddb018dd384238 WatchSource:0}: Error finding container a41a6866c28b712a4ddfd1b80a16b93b2eb82d94b1aee8b4feddb018dd384238: Status 404 returned error can't find the container with id a41a6866c28b712a4ddfd1b80a16b93b2eb82d94b1aee8b4feddb018dd384238 Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.200423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerStarted","Data":"16c5cb22986bc3b811996c0ba29d425238d57e0b2b669a5b412626aeb83f8fda"} Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.200730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerStarted","Data":"64bde24ce641ac620803ec772b33819c4f11068e6abd5447571283350997935d"} Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.213592 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerStarted","Data":"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0"} Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.213761 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-central-agent" containerID="cri-o://9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac" gracePeriod=30 Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.213985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:31 crc kubenswrapper[5030]: 
I0120 23:29:31.214028 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="proxy-httpd" containerID="cri-o://c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0" gracePeriod=30 Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.214083 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="sg-core" containerID="cri-o://e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f" gracePeriod=30 Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.214115 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-notification-agent" containerID="cri-o://ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df" gracePeriod=30 Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.222874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerStarted","Data":"a41a6866c28b712a4ddfd1b80a16b93b2eb82d94b1aee8b4feddb018dd384238"} Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.251708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.849988786 podStartE2EDuration="5.251695213s" podCreationTimestamp="2026-01-20 23:29:26 +0000 UTC" firstStartedPulling="2026-01-20 23:29:26.925799714 +0000 UTC m=+3239.246060002" lastFinishedPulling="2026-01-20 23:29:30.327506141 +0000 UTC m=+3242.647766429" observedRunningTime="2026-01-20 23:29:31.248491015 +0000 UTC m=+3243.568758343" watchObservedRunningTime="2026-01-20 23:29:31.251695213 +0000 UTC m=+3243.571955501" Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.257590 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.257579635 podStartE2EDuration="2.257579635s" podCreationTimestamp="2026-01-20 23:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:31.223166641 +0000 UTC m=+3243.543426929" watchObservedRunningTime="2026-01-20 23:29:31.257579635 +0000 UTC m=+3243.577839923" Jan 20 23:29:31 crc kubenswrapper[5030]: I0120 23:29:31.974204 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0102d8ca-9bb5-417f-8e32-8efab78e9f22" path="/var/lib/kubelet/pods/0102d8ca-9bb5-417f-8e32-8efab78e9f22/volumes" Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.233750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerStarted","Data":"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0"} Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.233794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerStarted","Data":"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0"} Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237162 5030 
generic.go:334] "Generic (PLEG): container finished" podID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerID="c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0" exitCode=0 Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237199 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerID="e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f" exitCode=2 Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237210 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerID="ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df" exitCode=0 Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerDied","Data":"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0"} Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerDied","Data":"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f"} Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.237273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerDied","Data":"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df"} Jan 20 23:29:32 crc kubenswrapper[5030]: I0120 23:29:32.260846 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.260825951 podStartE2EDuration="2.260825951s" podCreationTimestamp="2026-01-20 23:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:32.25584389 +0000 UTC m=+3244.576104178" watchObservedRunningTime="2026-01-20 23:29:32.260825951 +0000 UTC m=+3244.581086239" Jan 20 23:29:34 crc kubenswrapper[5030]: I0120 23:29:34.258017 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd48950b-220e-43d8-a96e-ad983ee50c2c" containerID="76fbcef65dc0a7b0baa96f8d4bc2c60c0c4628b98902d125ac94e7d82e8fe890" exitCode=0 Jan 20 23:29:34 crc kubenswrapper[5030]: I0120 23:29:34.258384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" event={"ID":"cd48950b-220e-43d8-a96e-ad983ee50c2c","Type":"ContainerDied","Data":"76fbcef65dc0a7b0baa96f8d4bc2c60c0c4628b98902d125ac94e7d82e8fe890"} Jan 20 23:29:34 crc kubenswrapper[5030]: I0120 23:29:34.961984 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:29:34 crc kubenswrapper[5030]: E0120 23:29:34.962444 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.594220 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.742490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle\") pod \"cd48950b-220e-43d8-a96e-ad983ee50c2c\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.742688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs79c\" (UniqueName: \"kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c\") pod \"cd48950b-220e-43d8-a96e-ad983ee50c2c\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.742740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts\") pod \"cd48950b-220e-43d8-a96e-ad983ee50c2c\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.742786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data\") pod \"cd48950b-220e-43d8-a96e-ad983ee50c2c\" (UID: \"cd48950b-220e-43d8-a96e-ad983ee50c2c\") " Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.754351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts" (OuterVolumeSpecName: "scripts") pod "cd48950b-220e-43d8-a96e-ad983ee50c2c" (UID: "cd48950b-220e-43d8-a96e-ad983ee50c2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.754485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c" (OuterVolumeSpecName: "kube-api-access-qs79c") pod "cd48950b-220e-43d8-a96e-ad983ee50c2c" (UID: "cd48950b-220e-43d8-a96e-ad983ee50c2c"). InnerVolumeSpecName "kube-api-access-qs79c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.786644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data" (OuterVolumeSpecName: "config-data") pod "cd48950b-220e-43d8-a96e-ad983ee50c2c" (UID: "cd48950b-220e-43d8-a96e-ad983ee50c2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.787675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd48950b-220e-43d8-a96e-ad983ee50c2c" (UID: "cd48950b-220e-43d8-a96e-ad983ee50c2c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.845147 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs79c\" (UniqueName: \"kubernetes.io/projected/cd48950b-220e-43d8-a96e-ad983ee50c2c-kube-api-access-qs79c\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.845498 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.845513 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:35 crc kubenswrapper[5030]: I0120 23:29:35.845525 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd48950b-220e-43d8-a96e-ad983ee50c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.279211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" event={"ID":"cd48950b-220e-43d8-a96e-ad983ee50c2c","Type":"ContainerDied","Data":"37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001"} Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.279254 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37c0e1a634d42d58cceb4adbaa466e32d68c8924f598f086be1feabb9c306001" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.279317 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.350885 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:29:36 crc kubenswrapper[5030]: E0120 23:29:36.351225 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd48950b-220e-43d8-a96e-ad983ee50c2c" containerName="nova-cell0-conductor-db-sync" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.351241 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd48950b-220e-43d8-a96e-ad983ee50c2c" containerName="nova-cell0-conductor-db-sync" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.351447 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd48950b-220e-43d8-a96e-ad983ee50c2c" containerName="nova-cell0-conductor-db-sync" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.352030 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.353413 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.353601 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-jczwg" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.376604 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.455845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc29f\" (UniqueName: \"kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.455910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.455960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.557158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.557228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.557325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc29f\" (UniqueName: \"kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.562237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.564501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.575823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc29f\" (UniqueName: \"kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f\") pod \"nova-cell0-conductor-0\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.676281 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.773892 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865698 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rnmt\" (UniqueName: \"kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.865831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data\") pod \"cf81edba-3554-4020-a0c8-ceabf27e3558\" (UID: \"cf81edba-3554-4020-a0c8-ceabf27e3558\") " Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.866063 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.866213 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.866966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.872819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt" (OuterVolumeSpecName: "kube-api-access-4rnmt") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "kube-api-access-4rnmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.873081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts" (OuterVolumeSpecName: "scripts") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.898451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.940908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.954789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data" (OuterVolumeSpecName: "config-data") pod "cf81edba-3554-4020-a0c8-ceabf27e3558" (UID: "cf81edba-3554-4020-a0c8-ceabf27e3558"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968060 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rnmt\" (UniqueName: \"kubernetes.io/projected/cf81edba-3554-4020-a0c8-ceabf27e3558-kube-api-access-4rnmt\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968090 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf81edba-3554-4020-a0c8-ceabf27e3558-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968101 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968111 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968120 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:36 crc kubenswrapper[5030]: I0120 23:29:36.968129 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf81edba-3554-4020-a0c8-ceabf27e3558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.139489 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.307101 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerID="9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac" exitCode=0 Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.307223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerDied","Data":"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac"} Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.307464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"cf81edba-3554-4020-a0c8-ceabf27e3558","Type":"ContainerDied","Data":"fe3bccead818d56382c9e58fcd7380a0850c4dcac8fd43f3acaee77bed4cc5c1"} Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.307491 5030 scope.go:117] "RemoveContainer" containerID="c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.307247 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.310899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ed0cd39a-1c2a-4c57-952f-9f389530459d","Type":"ContainerStarted","Data":"1b49fdd0b65a50cd457af6f8430af339083aebd08aaccaa4a459230e29b498ce"} Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.340551 5030 scope.go:117] "RemoveContainer" containerID="e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.353472 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.369739 5030 scope.go:117] "RemoveContainer" containerID="ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.373131 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.389419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.389953 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-central-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.389981 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-central-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.390007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-notification-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390017 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-notification-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.390048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="sg-core" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390056 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="sg-core" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.390071 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="proxy-httpd" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390079 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="proxy-httpd" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390281 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="proxy-httpd" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390307 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-central-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="sg-core" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.390343 5030 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" containerName="ceilometer-notification-agent" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.392498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.396805 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.397130 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.404080 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.406794 5030 scope.go:117] "RemoveContainer" containerID="9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.433444 5030 scope.go:117] "RemoveContainer" containerID="c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.433859 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0\": container with ID starting with c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0 not found: ID does not exist" containerID="c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.433900 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0"} err="failed to get container status \"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0\": rpc error: code = NotFound desc = could not find container \"c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0\": container with ID starting with c6b7e845bd06634d957235c624c20f150acb117778effe670708b0b64b0caee0 not found: ID does not exist" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.433924 5030 scope.go:117] "RemoveContainer" containerID="e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.434408 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f\": container with ID starting with e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f not found: ID does not exist" containerID="e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.434450 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f"} err="failed to get container status \"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f\": rpc error: code = NotFound desc = could not find container \"e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f\": container with ID starting with e5f82a19369a6a7c05bd265d0da4db7f3ccb8cbcaa4716a63f45bacaef27cd0f not found: ID does not exist" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.434483 5030 scope.go:117] "RemoveContainer" 
containerID="ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.434922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df\": container with ID starting with ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df not found: ID does not exist" containerID="ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.434949 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df"} err="failed to get container status \"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df\": rpc error: code = NotFound desc = could not find container \"ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df\": container with ID starting with ed0b8b4851425e35da342fdd890436e03bd77bc0f6b419e327a46fd125c0f8df not found: ID does not exist" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.434963 5030 scope.go:117] "RemoveContainer" containerID="9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac" Jan 20 23:29:37 crc kubenswrapper[5030]: E0120 23:29:37.435181 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac\": container with ID starting with 9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac not found: ID does not exist" containerID="9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.435201 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac"} err="failed to get container status \"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac\": rpc error: code = NotFound desc = could not find container \"9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac\": container with ID starting with 9df7a24b586a4514cc650e7cbd657c67fc091b31bd5524823eb9ccda99406fac not found: ID does not exist" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97cj\" (UniqueName: \"kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580770 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.580899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.683010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.685074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.685760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.685883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.685968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.684072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.686178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q97cj\" (UniqueName: \"kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.686277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.686749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.689010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.689939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.691274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.699556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.717110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97cj\" (UniqueName: \"kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj\") pod \"ceilometer-0\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:37 crc kubenswrapper[5030]: I0120 23:29:37.977571 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf81edba-3554-4020-a0c8-ceabf27e3558" path="/var/lib/kubelet/pods/cf81edba-3554-4020-a0c8-ceabf27e3558/volumes" Jan 20 23:29:38 crc kubenswrapper[5030]: I0120 23:29:38.017243 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:38 crc kubenswrapper[5030]: I0120 23:29:38.319915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ed0cd39a-1c2a-4c57-952f-9f389530459d","Type":"ContainerStarted","Data":"4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018"} Jan 20 23:29:38 crc kubenswrapper[5030]: I0120 23:29:38.320276 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:38 crc kubenswrapper[5030]: I0120 23:29:38.339050 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.339029867 podStartE2EDuration="2.339029867s" podCreationTimestamp="2026-01-20 23:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:38.336476185 +0000 UTC m=+3250.656736483" watchObservedRunningTime="2026-01-20 23:29:38.339029867 +0000 UTC m=+3250.659290175" Jan 20 23:29:38 crc kubenswrapper[5030]: W0120 23:29:38.460025 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19e6b507_ad9e_4d7e_a74b_8157b8add97a.slice/crio-d3c3a853636360a860f1aef7d1bb37fe24687d234258f2174c384fbfe62dd80d WatchSource:0}: Error finding container d3c3a853636360a860f1aef7d1bb37fe24687d234258f2174c384fbfe62dd80d: Status 404 returned error can't find the container with id d3c3a853636360a860f1aef7d1bb37fe24687d234258f2174c384fbfe62dd80d Jan 20 23:29:38 crc kubenswrapper[5030]: I0120 23:29:38.460917 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:29:39 crc kubenswrapper[5030]: I0120 23:29:39.330166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerStarted","Data":"d3c3a853636360a860f1aef7d1bb37fe24687d234258f2174c384fbfe62dd80d"} Jan 20 23:29:39 crc kubenswrapper[5030]: I0120 23:29:39.583862 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:39 crc kubenswrapper[5030]: I0120 23:29:39.583916 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:39 crc kubenswrapper[5030]: I0120 23:29:39.608988 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:39 crc kubenswrapper[5030]: I0120 23:29:39.620237 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.338380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerStarted","Data":"48cb6ab03283c40b54928b82b2dc7d88ccc380ff74d227332416a5477d39bb1b"} Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.338684 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.338698 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.338708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerStarted","Data":"ace880f4a825d6a95c30a4f0e9a310b328e92780df913bc19ae82a8b7b47d214"} Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.611043 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.611103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.653198 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:40 crc kubenswrapper[5030]: I0120 23:29:40.682610 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:41 crc kubenswrapper[5030]: I0120 23:29:41.348539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerStarted","Data":"53a4d99532461ab3bfdb793a4b54d0f3bff57141dd53a54b9d2557354a5473bd"} Jan 20 23:29:41 crc kubenswrapper[5030]: I0120 23:29:41.348807 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:41 crc kubenswrapper[5030]: I0120 23:29:41.349518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.077397 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.360023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerStarted","Data":"c44dbd4030758ff38d58724eca64dbedfe88fccad979f265543c57147bde8f99"} Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.360093 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.360264 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.396192 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9966183549999998 podStartE2EDuration="5.396176269s" podCreationTimestamp="2026-01-20 23:29:37 +0000 UTC" firstStartedPulling="2026-01-20 23:29:38.462044555 +0000 UTC m=+3250.782304843" lastFinishedPulling="2026-01-20 23:29:41.861602479 +0000 UTC m=+3254.181862757" observedRunningTime="2026-01-20 23:29:42.388826501 +0000 UTC m=+3254.709086809" watchObservedRunningTime="2026-01-20 23:29:42.396176269 +0000 UTC m=+3254.716436557" Jan 20 23:29:42 crc kubenswrapper[5030]: I0120 23:29:42.424418 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:29:43 crc 
kubenswrapper[5030]: I0120 23:29:43.254882 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:43 crc kubenswrapper[5030]: I0120 23:29:43.266843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:29:46 crc kubenswrapper[5030]: I0120 23:29:46.712287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:29:46 crc kubenswrapper[5030]: I0120 23:29:46.962931 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:29:46 crc kubenswrapper[5030]: E0120 23:29:46.963225 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.198074 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.199816 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.201747 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.201932 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.228805 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.262642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.262756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.262799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hthv9\" (UniqueName: \"kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.262874 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.300535 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.304030 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.306283 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.331280 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.341753 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.342873 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.345240 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.363818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bc8p\" (UniqueName: \"kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.363858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jt2\" (UniqueName: \"kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364018 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364095 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 
23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hthv9\" (UniqueName: \"kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.364343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.369380 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.369572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.370720 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.375748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts\") pod 
\"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.402359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hthv9\" (UniqueName: \"kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9\") pod \"nova-cell0-cell-mapping-cpcvt\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.440447 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.464841 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bc8p\" (UniqueName: \"kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jt2\" (UniqueName: \"kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.468290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc 
kubenswrapper[5030]: I0120 23:29:47.491098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.497487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.497604 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.508040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.527248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.547829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.548392 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.549281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.551989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bc8p\" (UniqueName: \"kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p\") pod \"nova-cell1-novncproxy-0\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.566680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.570555 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.596769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jt2\" (UniqueName: \"kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2\") pod \"nova-api-0\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.597366 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.638289 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.645919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.659926 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700132 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nldlf\" (UniqueName: \"kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.700190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: 
I0120 23:29:47.700244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdwp\" (UniqueName: \"kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nldlf\" (UniqueName: \"kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdwp\" (UniqueName: \"kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.803921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.805213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.814406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.827880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdwp\" (UniqueName: \"kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.828130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.831205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.831943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nldlf\" (UniqueName: \"kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf\") pod \"nova-scheduler-0\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.835194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data\") pod \"nova-metadata-0\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.960305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:47 crc kubenswrapper[5030]: I0120 23:29:47.970828 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.108292 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt"] Jan 20 23:29:48 crc kubenswrapper[5030]: W0120 23:29:48.115257 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod569f5821_3389_4731_95ab_a39e17508f8e.slice/crio-ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e WatchSource:0}: Error finding container ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e: Status 404 returned error can't find the container with id ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.193094 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.200640 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.318271 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z"] Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.319391 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.324369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.325608 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.339885 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z"] Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.414173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spw9f\" (UniqueName: \"kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.414219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.414246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.414387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.442503 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:29:48 crc kubenswrapper[5030]: W0120 23:29:48.442552 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb4d5046_d68a_43d1_a12c_9ea876e906ad.slice/crio-67c6abed01f3402399ce8f8b9068f144c2b1659e1f05ece2608f528bba4554c1 WatchSource:0}: Error finding container 67c6abed01f3402399ce8f8b9068f144c2b1659e1f05ece2608f528bba4554c1: Status 404 returned error can't find the container with id 67c6abed01f3402399ce8f8b9068f144c2b1659e1f05ece2608f528bba4554c1 Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.455642 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6860c5ea-d056-44a7-baf9-89c835744a86","Type":"ContainerStarted","Data":"0bf03ac9e9e90dc2573d5bf2575fcda2640adffdd87fe54b7c3a07153fbde546"} Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.457901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerStarted","Data":"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4"} Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.457943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerStarted","Data":"03a328db0d80c2653719f66f4eeea681f5b201065c4568d66b4a29a6dbfae698"} Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.460104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" event={"ID":"569f5821-3389-4731-95ab-a39e17508f8e","Type":"ContainerStarted","Data":"72a954b6da79217b10f1275cd3e8dd3a0eb5ed45f587ffd21e26c1f49bee64c7"} Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.460141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" event={"ID":"569f5821-3389-4731-95ab-a39e17508f8e","Type":"ContainerStarted","Data":"ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e"} Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.489607 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" podStartSLOduration=1.489586303 podStartE2EDuration="1.489586303s" podCreationTimestamp="2026-01-20 23:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:48.47909483 +0000 UTC m=+3260.799355118" watchObservedRunningTime="2026-01-20 23:29:48.489586303 +0000 UTC m=+3260.809846592" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.508040 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.517655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spw9f\" (UniqueName: \"kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f\") pod 
\"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.517724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.517763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.517814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.523577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.526029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.540322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.548289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spw9f\" (UniqueName: \"kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f\") pod \"nova-cell1-conductor-db-sync-2fk4z\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:48 crc kubenswrapper[5030]: I0120 23:29:48.648945 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.143780 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z"] Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.504232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8","Type":"ContainerStarted","Data":"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.504757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8","Type":"ContainerStarted","Data":"ecf2c74eb645307215adb3353507eab372ed6d42b129b1d10085bc1c6eb4fcf9"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.549373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6860c5ea-d056-44a7-baf9-89c835744a86","Type":"ContainerStarted","Data":"6ebff1ddfc2fee47dd48f67a0e30fc5e637d56329096440d4d71adb702b7d8f6"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.586517 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.586498757 podStartE2EDuration="2.586498757s" podCreationTimestamp="2026-01-20 23:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:49.551328086 +0000 UTC m=+3261.871588374" watchObservedRunningTime="2026-01-20 23:29:49.586498757 +0000 UTC m=+3261.906759045" Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.600906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerStarted","Data":"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.601433 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.601413458 podStartE2EDuration="2.601413458s" podCreationTimestamp="2026-01-20 23:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:49.588003734 +0000 UTC m=+3261.908264032" watchObservedRunningTime="2026-01-20 23:29:49.601413458 +0000 UTC m=+3261.921673746" Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.609385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" event={"ID":"73af7840-4469-4d06-a6a4-23396532a060","Type":"ContainerStarted","Data":"a0185081b3092b8a631eab11382590320aaebf5ec5045325334882084bebba3c"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.627943 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.627925039 podStartE2EDuration="2.627925039s" podCreationTimestamp="2026-01-20 23:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:49.627362816 +0000 UTC m=+3261.947623104" watchObservedRunningTime="2026-01-20 23:29:49.627925039 +0000 
UTC m=+3261.948185327" Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.628261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerStarted","Data":"31c42f176ac6f27debe6898cfd918e7789ef66851ebe784b5c6a21ae123b90a6"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.628321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerStarted","Data":"27b8ba9a4e872283ee4de503b07fa9e3fc2e6b3b7ad61a9b2470b666804d0090"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.628333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerStarted","Data":"67c6abed01f3402399ce8f8b9068f144c2b1659e1f05ece2608f528bba4554c1"} Jan 20 23:29:49 crc kubenswrapper[5030]: I0120 23:29:49.650436 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.650417594 podStartE2EDuration="2.650417594s" podCreationTimestamp="2026-01-20 23:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:49.647678558 +0000 UTC m=+3261.967938846" watchObservedRunningTime="2026-01-20 23:29:49.650417594 +0000 UTC m=+3261.970677882" Jan 20 23:29:50 crc kubenswrapper[5030]: I0120 23:29:50.639419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" event={"ID":"73af7840-4469-4d06-a6a4-23396532a060","Type":"ContainerStarted","Data":"04816dfe20b62b8e232c335ff1b70ffffcc8b5ac48f2109858f796af3749054b"} Jan 20 23:29:50 crc kubenswrapper[5030]: I0120 23:29:50.657841 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" podStartSLOduration=2.657826921 podStartE2EDuration="2.657826921s" podCreationTimestamp="2026-01-20 23:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:29:50.656904668 +0000 UTC m=+3262.977164976" watchObservedRunningTime="2026-01-20 23:29:50.657826921 +0000 UTC m=+3262.978087219" Jan 20 23:29:52 crc kubenswrapper[5030]: I0120 23:29:52.660984 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:52 crc kubenswrapper[5030]: I0120 23:29:52.960560 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:52 crc kubenswrapper[5030]: I0120 23:29:52.960865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:52 crc kubenswrapper[5030]: I0120 23:29:52.971034 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:55 crc kubenswrapper[5030]: I0120 23:29:55.686417 5030 generic.go:334] "Generic (PLEG): container finished" podID="569f5821-3389-4731-95ab-a39e17508f8e" containerID="72a954b6da79217b10f1275cd3e8dd3a0eb5ed45f587ffd21e26c1f49bee64c7" exitCode=0 Jan 20 23:29:55 crc kubenswrapper[5030]: I0120 23:29:55.686464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" event={"ID":"569f5821-3389-4731-95ab-a39e17508f8e","Type":"ContainerDied","Data":"72a954b6da79217b10f1275cd3e8dd3a0eb5ed45f587ffd21e26c1f49bee64c7"} Jan 20 23:29:55 crc kubenswrapper[5030]: I0120 23:29:55.689164 5030 generic.go:334] "Generic (PLEG): container finished" podID="73af7840-4469-4d06-a6a4-23396532a060" containerID="04816dfe20b62b8e232c335ff1b70ffffcc8b5ac48f2109858f796af3749054b" exitCode=0 Jan 20 23:29:55 crc kubenswrapper[5030]: I0120 23:29:55.689207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" event={"ID":"73af7840-4469-4d06-a6a4-23396532a060","Type":"ContainerDied","Data":"04816dfe20b62b8e232c335ff1b70ffffcc8b5ac48f2109858f796af3749054b"} Jan 20 23:29:56 crc kubenswrapper[5030]: I0120 23:29:56.525381 5030 scope.go:117] "RemoveContainer" containerID="209df5add56e7505cd5ecd36454b3d349bbf9289e03c5cbea49a93aea2a026ed" Jan 20 23:29:56 crc kubenswrapper[5030]: I0120 23:29:56.586377 5030 scope.go:117] "RemoveContainer" containerID="60201df21b4c6bccb15761b5b261ceea64ffcc124083582fca6921054cb76f4d" Jan 20 23:29:56 crc kubenswrapper[5030]: I0120 23:29:56.646577 5030 scope.go:117] "RemoveContainer" containerID="56c55391df4a34d7b83ce2d3b282ed437cb6c3b3ba656f105786ceea8b6a70ac" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.060276 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.062169 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.117000 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.152788 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.174936 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.209827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzj5\" (UniqueName: \"kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.210191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.210412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts\") pod \"569f5821-3389-4731-95ab-a39e17508f8e\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data\") pod \"73af7840-4469-4d06-a6a4-23396532a060\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hthv9\" (UniqueName: \"kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9\") pod \"569f5821-3389-4731-95ab-a39e17508f8e\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spw9f\" (UniqueName: \"kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f\") pod \"73af7840-4469-4d06-a6a4-23396532a060\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data\") pod \"569f5821-3389-4731-95ab-a39e17508f8e\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle\") pod \"73af7840-4469-4d06-a6a4-23396532a060\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.311929 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts\") pod \"73af7840-4469-4d06-a6a4-23396532a060\" (UID: \"73af7840-4469-4d06-a6a4-23396532a060\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.312052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle\") pod \"569f5821-3389-4731-95ab-a39e17508f8e\" (UID: \"569f5821-3389-4731-95ab-a39e17508f8e\") " Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.312381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzj5\" (UniqueName: \"kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.312428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.312480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.312883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.313139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.317504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f" (OuterVolumeSpecName: "kube-api-access-spw9f") pod "73af7840-4469-4d06-a6a4-23396532a060" (UID: "73af7840-4469-4d06-a6a4-23396532a060"). InnerVolumeSpecName "kube-api-access-spw9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.317682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts" (OuterVolumeSpecName: "scripts") pod "569f5821-3389-4731-95ab-a39e17508f8e" (UID: "569f5821-3389-4731-95ab-a39e17508f8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.317955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts" (OuterVolumeSpecName: "scripts") pod "73af7840-4469-4d06-a6a4-23396532a060" (UID: "73af7840-4469-4d06-a6a4-23396532a060"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.325084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9" (OuterVolumeSpecName: "kube-api-access-hthv9") pod "569f5821-3389-4731-95ab-a39e17508f8e" (UID: "569f5821-3389-4731-95ab-a39e17508f8e"). InnerVolumeSpecName "kube-api-access-hthv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.329253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzj5\" (UniqueName: \"kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5\") pod \"redhat-operators-sgfz6\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.339246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73af7840-4469-4d06-a6a4-23396532a060" (UID: "73af7840-4469-4d06-a6a4-23396532a060"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.339256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "569f5821-3389-4731-95ab-a39e17508f8e" (UID: "569f5821-3389-4731-95ab-a39e17508f8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.342178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data" (OuterVolumeSpecName: "config-data") pod "569f5821-3389-4731-95ab-a39e17508f8e" (UID: "569f5821-3389-4731-95ab-a39e17508f8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.346150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data" (OuterVolumeSpecName: "config-data") pod "73af7840-4469-4d06-a6a4-23396532a060" (UID: "73af7840-4469-4d06-a6a4-23396532a060"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414169 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414205 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414215 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414227 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hthv9\" (UniqueName: \"kubernetes.io/projected/569f5821-3389-4731-95ab-a39e17508f8e-kube-api-access-hthv9\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414236 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spw9f\" (UniqueName: \"kubernetes.io/projected/73af7840-4469-4d06-a6a4-23396532a060-kube-api-access-spw9f\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414245 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569f5821-3389-4731-95ab-a39e17508f8e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414254 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.414262 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73af7840-4469-4d06-a6a4-23396532a060-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.473077 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.640812 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.640858 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.660448 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.679792 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.720383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" event={"ID":"569f5821-3389-4731-95ab-a39e17508f8e","Type":"ContainerDied","Data":"ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e"} Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.720434 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff21cc23426e59d692bf9e7867cc3f202804f66c5aed9aaf57680c37da00277e" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.720501 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.722978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.723044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z" event={"ID":"73af7840-4469-4d06-a6a4-23396532a060","Type":"ContainerDied","Data":"a0185081b3092b8a631eab11382590320aaebf5ec5045325334882084bebba3c"} Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.723065 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0185081b3092b8a631eab11382590320aaebf5ec5045325334882084bebba3c" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.734880 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.814743 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:29:57 crc kubenswrapper[5030]: E0120 23:29:57.815339 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73af7840-4469-4d06-a6a4-23396532a060" containerName="nova-cell1-conductor-db-sync" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.815422 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="73af7840-4469-4d06-a6a4-23396532a060" containerName="nova-cell1-conductor-db-sync" Jan 20 23:29:57 crc kubenswrapper[5030]: E0120 23:29:57.815508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569f5821-3389-4731-95ab-a39e17508f8e" containerName="nova-manage" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.815560 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="569f5821-3389-4731-95ab-a39e17508f8e" containerName="nova-manage" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.815799 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="569f5821-3389-4731-95ab-a39e17508f8e" containerName="nova-manage" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.815876 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="73af7840-4469-4d06-a6a4-23396532a060" containerName="nova-cell1-conductor-db-sync" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.816701 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.823297 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.832221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfpr\" (UniqueName: \"kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.832534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.832565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.839090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.904801 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.905051 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-log" containerID="cri-o://f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4" gracePeriod=30 Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.905219 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-api" containerID="cri-o://def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319" gracePeriod=30 Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.913149 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.170:8774/\": EOF" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.913569 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-api" probeResult="failure" 
output="Get \"http://10.217.1.170:8774/\": EOF" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.934111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfpr\" (UniqueName: \"kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.934311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.934394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.936528 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.945419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.948280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.951098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfpr\" (UniqueName: \"kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr\") pod \"nova-cell1-conductor-0\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.960958 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.976568 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.976600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:57 crc kubenswrapper[5030]: I0120 23:29:57.978250 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.032463 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.055713 5030 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.134951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:58 crc kubenswrapper[5030]: W0120 23:29:58.608965 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9936c7df_3635_4a9b_acec_a8c178738415.slice/crio-8e47adb9603263dec83e5e3cc5ab6c6c8ae22def4e5ad55b13e29694ce1bb22f WatchSource:0}: Error finding container 8e47adb9603263dec83e5e3cc5ab6c6c8ae22def4e5ad55b13e29694ce1bb22f: Status 404 returned error can't find the container with id 8e47adb9603263dec83e5e3cc5ab6c6c8ae22def4e5ad55b13e29694ce1bb22f Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.618455 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.732746 5030 generic.go:334] "Generic (PLEG): container finished" podID="905b9300-659f-4490-b8c1-52e5874b23b7" containerID="08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c" exitCode=0 Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.733927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerDied","Data":"08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c"} Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.733956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerStarted","Data":"265a66a7c8f9c28844f273eabe7c1d4029c1aafd39d840ad898ee4d144893fc1"} Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.736479 5030 generic.go:334] "Generic (PLEG): container finished" podID="430ba0db-7327-43e8-a350-455f10b27f90" containerID="f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4" exitCode=143 Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.736515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerDied","Data":"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4"} Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.739825 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9936c7df-3635-4a9b-acec-a8c178738415","Type":"ContainerStarted","Data":"8e47adb9603263dec83e5e3cc5ab6c6c8ae22def4e5ad55b13e29694ce1bb22f"} Jan 20 23:29:58 crc kubenswrapper[5030]: I0120 23:29:58.740064 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerName="nova-scheduler-scheduler" containerID="cri-o://0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" gracePeriod=30 Jan 20 23:29:58 crc kubenswrapper[5030]: E0120 23:29:58.744860 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:29:58 crc kubenswrapper[5030]: E0120 
23:29:58.752004 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:29:58 crc kubenswrapper[5030]: E0120 23:29:58.761950 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:29:58 crc kubenswrapper[5030]: E0120 23:29:58.762009 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerName="nova-scheduler-scheduler" Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.043806 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.172:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.044545 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.172:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.761915 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-log" containerID="cri-o://27b8ba9a4e872283ee4de503b07fa9e3fc2e6b3b7ad61a9b2470b666804d0090" gracePeriod=30 Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.762962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9936c7df-3635-4a9b-acec-a8c178738415","Type":"ContainerStarted","Data":"fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9"} Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.762993 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:29:59 crc kubenswrapper[5030]: I0120 23:29:59.763250 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-metadata" containerID="cri-o://31c42f176ac6f27debe6898cfd918e7789ef66851ebe784b5c6a21ae123b90a6" gracePeriod=30 Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.126472 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=3.1264539510000002 podStartE2EDuration="3.126453951s" podCreationTimestamp="2026-01-20 23:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:29:59.784156234 +0000 UTC m=+3272.104416522" watchObservedRunningTime="2026-01-20 23:30:00.126453951 +0000 UTC m=+3272.446714239" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.139730 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.141188 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.143778 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.143952 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.149367 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.177294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.177370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt4w6\" (UniqueName: \"kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.177442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.279171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt4w6\" (UniqueName: \"kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.279285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.279382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.282186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.297942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt4w6\" (UniqueName: \"kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.299208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume\") pod \"collect-profiles-29482530-pd5cm\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.393784 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.477522 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.483198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle\") pod \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.483270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nldlf\" (UniqueName: \"kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf\") pod \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.483394 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data\") pod \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\" (UID: \"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8\") " Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.486506 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf" (OuterVolumeSpecName: "kube-api-access-nldlf") pod "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" (UID: "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8"). InnerVolumeSpecName "kube-api-access-nldlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.510307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" (UID: "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.517016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data" (OuterVolumeSpecName: "config-data") pod "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" (UID: "c1184999-f4e1-4ac3-9b82-ce2ba004a8a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.586491 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.586523 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.586538 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nldlf\" (UniqueName: \"kubernetes.io/projected/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8-kube-api-access-nldlf\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.776600 5030 generic.go:334] "Generic (PLEG): container finished" podID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerID="27b8ba9a4e872283ee4de503b07fa9e3fc2e6b3b7ad61a9b2470b666804d0090" exitCode=143 Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.776654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerDied","Data":"27b8ba9a4e872283ee4de503b07fa9e3fc2e6b3b7ad61a9b2470b666804d0090"} Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.779489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerStarted","Data":"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25"} Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.781471 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" exitCode=0 Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.781525 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.781579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8","Type":"ContainerDied","Data":"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329"} Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.781639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c1184999-f4e1-4ac3-9b82-ce2ba004a8a8","Type":"ContainerDied","Data":"ecf2c74eb645307215adb3353507eab372ed6d42b129b1d10085bc1c6eb4fcf9"} Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.781661 5030 scope.go:117] "RemoveContainer" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.809593 5030 scope.go:117] "RemoveContainer" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" Jan 20 23:30:00 crc kubenswrapper[5030]: E0120 23:30:00.810118 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329\": container with ID starting with 0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329 not found: ID does not exist" containerID="0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.810183 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329"} err="failed to get container status \"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329\": rpc error: code = NotFound desc = could not find container \"0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329\": container with ID starting with 0515643db2a4ec36f3e956089dca9e730064db8090f617649dc14e67775c0329 not found: ID does not exist" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.827575 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.838995 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.846694 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:00 crc kubenswrapper[5030]: E0120 23:30:00.847074 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerName="nova-scheduler-scheduler" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.847085 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerName="nova-scheduler-scheduler" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.847257 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" containerName="nova-scheduler-scheduler" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.847824 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.850530 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.856301 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.891159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.891224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzvm\" (UniqueName: \"kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.891270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.923730 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm"] Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.962096 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:30:00 crc kubenswrapper[5030]: E0120 23:30:00.962333 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.993322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.993391 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzvm\" (UniqueName: \"kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.993451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:00 crc kubenswrapper[5030]: I0120 23:30:00.999040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.005160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.013718 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzvm\" (UniqueName: \"kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm\") pod \"nova-scheduler-0\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.169533 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.639223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:01 crc kubenswrapper[5030]: W0120 23:30:01.639222 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode68f755c_e882_4722_9d23_1cd24541f37a.slice/crio-be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d WatchSource:0}: Error finding container be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d: Status 404 returned error can't find the container with id be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.791222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e68f755c-e882-4722-9d23-1cd24541f37a","Type":"ContainerStarted","Data":"be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d"} Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.793810 5030 generic.go:334] "Generic (PLEG): container finished" podID="a201bafe-f06d-4f32-a740-c15f9ea69d8d" containerID="7ef770f8b6fb8a9dfbf7cf9dbe9f107c4a5dcb6df01d2136ef60b4b74a34b569" exitCode=0 Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.793905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" event={"ID":"a201bafe-f06d-4f32-a740-c15f9ea69d8d","Type":"ContainerDied","Data":"7ef770f8b6fb8a9dfbf7cf9dbe9f107c4a5dcb6df01d2136ef60b4b74a34b569"} Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.793977 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" event={"ID":"a201bafe-f06d-4f32-a740-c15f9ea69d8d","Type":"ContainerStarted","Data":"aace32478ebd3a95f3ac7ad5841c19b8199890b572b28d137010a27a3df52d02"} Jan 20 23:30:01 crc kubenswrapper[5030]: I0120 23:30:01.973001 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1184999-f4e1-4ac3-9b82-ce2ba004a8a8" 
path="/var/lib/kubelet/pods/c1184999-f4e1-4ac3-9b82-ce2ba004a8a8/volumes" Jan 20 23:30:02 crc kubenswrapper[5030]: I0120 23:30:02.814684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e68f755c-e882-4722-9d23-1cd24541f37a","Type":"ContainerStarted","Data":"9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a"} Jan 20 23:30:02 crc kubenswrapper[5030]: I0120 23:30:02.842990 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.84296407 podStartE2EDuration="2.84296407s" podCreationTimestamp="2026-01-20 23:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:02.837421685 +0000 UTC m=+3275.157682003" watchObservedRunningTime="2026-01-20 23:30:02.84296407 +0000 UTC m=+3275.163224388" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.171405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.241556 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.348573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume\") pod \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.348700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume\") pod \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.348748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt4w6\" (UniqueName: \"kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6\") pod \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\" (UID: \"a201bafe-f06d-4f32-a740-c15f9ea69d8d\") " Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.349252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a201bafe-f06d-4f32-a740-c15f9ea69d8d" (UID: "a201bafe-f06d-4f32-a740-c15f9ea69d8d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.353328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a201bafe-f06d-4f32-a740-c15f9ea69d8d" (UID: "a201bafe-f06d-4f32-a740-c15f9ea69d8d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.353562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6" (OuterVolumeSpecName: "kube-api-access-pt4w6") pod "a201bafe-f06d-4f32-a740-c15f9ea69d8d" (UID: "a201bafe-f06d-4f32-a740-c15f9ea69d8d"). InnerVolumeSpecName "kube-api-access-pt4w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.450801 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a201bafe-f06d-4f32-a740-c15f9ea69d8d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.450840 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt4w6\" (UniqueName: \"kubernetes.io/projected/a201bafe-f06d-4f32-a740-c15f9ea69d8d-kube-api-access-pt4w6\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.450873 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a201bafe-f06d-4f32-a740-c15f9ea69d8d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.830443 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt"] Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.831814 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" Jan 20 23:30:03 crc kubenswrapper[5030]: E0120 23:30:03.832534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a201bafe-f06d-4f32-a740-c15f9ea69d8d" containerName="collect-profiles" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.832705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a201bafe-f06d-4f32-a740-c15f9ea69d8d" containerName="collect-profiles" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.833041 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a201bafe-f06d-4f32-a740-c15f9ea69d8d" containerName="collect-profiles" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.833856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm" event={"ID":"a201bafe-f06d-4f32-a740-c15f9ea69d8d","Type":"ContainerDied","Data":"aace32478ebd3a95f3ac7ad5841c19b8199890b572b28d137010a27a3df52d02"} Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.833971 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aace32478ebd3a95f3ac7ad5841c19b8199890b572b28d137010a27a3df52d02" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.834024 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.836878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.837109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.844003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt"] Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.860232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.860288 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.860425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.860481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcg7b\" (UniqueName: \"kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.963584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.963834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.963914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcg7b\" (UniqueName: \"kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 
23:30:03.964029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.968908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.969385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.970779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:03 crc kubenswrapper[5030]: I0120 23:30:03.986252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcg7b\" (UniqueName: \"kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b\") pod \"nova-cell1-cell-mapping-srmvt\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.039274 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp"] Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.048976 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482485-74cfp"] Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.167072 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.580678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt"] Jan 20 23:30:04 crc kubenswrapper[5030]: W0120 23:30:04.587756 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod021b4577_29eb_4118_90a9_e4d409ec342a.slice/crio-2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7 WatchSource:0}: Error finding container 2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7: Status 404 returned error can't find the container with id 2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7 Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.857183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" event={"ID":"021b4577-29eb-4118-90a9-e4d409ec342a","Type":"ContainerStarted","Data":"2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7"} Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.869350 5030 generic.go:334] "Generic (PLEG): container finished" podID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerID="31c42f176ac6f27debe6898cfd918e7789ef66851ebe784b5c6a21ae123b90a6" exitCode=0 Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.869409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerDied","Data":"31c42f176ac6f27debe6898cfd918e7789ef66851ebe784b5c6a21ae123b90a6"} Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.872997 5030 generic.go:334] "Generic (PLEG): container finished" podID="905b9300-659f-4490-b8c1-52e5874b23b7" containerID="ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25" exitCode=0 Jan 20 23:30:04 crc kubenswrapper[5030]: I0120 23:30:04.873064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerDied","Data":"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.681873 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.687065 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.711845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data\") pod \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.711998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle\") pod \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.712179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs\") pod \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.712386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzdwp\" (UniqueName: \"kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp\") pod \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\" (UID: \"cb4d5046-d68a-43d1-a12c-9ea876e906ad\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.713088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs" (OuterVolumeSpecName: "logs") pod "cb4d5046-d68a-43d1-a12c-9ea876e906ad" (UID: "cb4d5046-d68a-43d1-a12c-9ea876e906ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.736646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp" (OuterVolumeSpecName: "kube-api-access-vzdwp") pod "cb4d5046-d68a-43d1-a12c-9ea876e906ad" (UID: "cb4d5046-d68a-43d1-a12c-9ea876e906ad"). InnerVolumeSpecName "kube-api-access-vzdwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.746649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data" (OuterVolumeSpecName: "config-data") pod "cb4d5046-d68a-43d1-a12c-9ea876e906ad" (UID: "cb4d5046-d68a-43d1-a12c-9ea876e906ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.770985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb4d5046-d68a-43d1-a12c-9ea876e906ad" (UID: "cb4d5046-d68a-43d1-a12c-9ea876e906ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data\") pod \"430ba0db-7327-43e8-a350-455f10b27f90\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs\") pod \"430ba0db-7327-43e8-a350-455f10b27f90\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814302 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle\") pod \"430ba0db-7327-43e8-a350-455f10b27f90\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jt2\" (UniqueName: \"kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2\") pod \"430ba0db-7327-43e8-a350-455f10b27f90\" (UID: \"430ba0db-7327-43e8-a350-455f10b27f90\") " Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814734 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814745 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb4d5046-d68a-43d1-a12c-9ea876e906ad-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814756 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzdwp\" (UniqueName: \"kubernetes.io/projected/cb4d5046-d68a-43d1-a12c-9ea876e906ad-kube-api-access-vzdwp\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.814766 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb4d5046-d68a-43d1-a12c-9ea876e906ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.817873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2" (OuterVolumeSpecName: "kube-api-access-j7jt2") pod "430ba0db-7327-43e8-a350-455f10b27f90" (UID: "430ba0db-7327-43e8-a350-455f10b27f90"). InnerVolumeSpecName "kube-api-access-j7jt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.818193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs" (OuterVolumeSpecName: "logs") pod "430ba0db-7327-43e8-a350-455f10b27f90" (UID: "430ba0db-7327-43e8-a350-455f10b27f90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.841541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "430ba0db-7327-43e8-a350-455f10b27f90" (UID: "430ba0db-7327-43e8-a350-455f10b27f90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.842935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data" (OuterVolumeSpecName: "config-data") pod "430ba0db-7327-43e8-a350-455f10b27f90" (UID: "430ba0db-7327-43e8-a350-455f10b27f90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.894579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" event={"ID":"021b4577-29eb-4118-90a9-e4d409ec342a","Type":"ContainerStarted","Data":"9d227bee93ee3810f8bf8c42e3725c916f0abced2be65bffcd089b41e0c07bac"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.898649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cb4d5046-d68a-43d1-a12c-9ea876e906ad","Type":"ContainerDied","Data":"67c6abed01f3402399ce8f8b9068f144c2b1659e1f05ece2608f528bba4554c1"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.898691 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.898701 5030 scope.go:117] "RemoveContainer" containerID="31c42f176ac6f27debe6898cfd918e7789ef66851ebe784b5c6a21ae123b90a6" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.904314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerStarted","Data":"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.907197 5030 generic.go:334] "Generic (PLEG): container finished" podID="430ba0db-7327-43e8-a350-455f10b27f90" containerID="def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319" exitCode=0 Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.907301 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.907308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerDied","Data":"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.907534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"430ba0db-7327-43e8-a350-455f10b27f90","Type":"ContainerDied","Data":"03a328db0d80c2653719f66f4eeea681f5b201065c4568d66b4a29a6dbfae698"} Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.915833 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" podStartSLOduration=2.915814074 podStartE2EDuration="2.915814074s" podCreationTimestamp="2026-01-20 23:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:05.909921622 +0000 UTC m=+3278.230181910" watchObservedRunningTime="2026-01-20 23:30:05.915814074 +0000 UTC m=+3278.236074362" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.916068 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.916095 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430ba0db-7327-43e8-a350-455f10b27f90-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.916105 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430ba0db-7327-43e8-a350-455f10b27f90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.916118 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jt2\" (UniqueName: \"kubernetes.io/projected/430ba0db-7327-43e8-a350-455f10b27f90-kube-api-access-j7jt2\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.932494 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sgfz6" podStartSLOduration=2.355369984 podStartE2EDuration="8.932478548s" podCreationTimestamp="2026-01-20 23:29:57 +0000 UTC" firstStartedPulling="2026-01-20 23:29:58.734016073 +0000 UTC m=+3271.054276361" lastFinishedPulling="2026-01-20 23:30:05.311124637 +0000 UTC m=+3277.631384925" observedRunningTime="2026-01-20 23:30:05.931135395 +0000 UTC m=+3278.251395683" watchObservedRunningTime="2026-01-20 23:30:05.932478548 +0000 UTC m=+3278.252738836" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.934350 5030 scope.go:117] "RemoveContainer" containerID="27b8ba9a4e872283ee4de503b07fa9e3fc2e6b3b7ad61a9b2470b666804d0090" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.968522 5030 scope.go:117] "RemoveContainer" containerID="def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.979809 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d932ea97-0b18-4e73-8f4d-024982fbbd5c" 
path="/var/lib/kubelet/pods/d932ea97-0b18-4e73-8f4d-024982fbbd5c/volumes" Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.980893 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.980920 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:05 crc kubenswrapper[5030]: I0120 23:30:05.989438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.007498 5030 scope.go:117] "RemoveContainer" containerID="f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.017158 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.031903 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.032314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-api" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032333 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-api" Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.032360 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-log" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032368 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-log" Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.032379 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-log" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032386 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-log" Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.032396 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-metadata" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032402 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-metadata" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032570 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-log" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032584 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-log" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032593 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" containerName="nova-metadata-metadata" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.032601 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="430ba0db-7327-43e8-a350-455f10b27f90" containerName="nova-api-api" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.033548 5030 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.037247 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.038973 5030 scope.go:117] "RemoveContainer" containerID="def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319" Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.039518 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319\": container with ID starting with def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319 not found: ID does not exist" containerID="def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.039565 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319"} err="failed to get container status \"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319\": rpc error: code = NotFound desc = could not find container \"def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319\": container with ID starting with def134f3754d2d42b8149296707e23527e69e2133f1f0e44476b5aca3249d319 not found: ID does not exist" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.039664 5030 scope.go:117] "RemoveContainer" containerID="f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4" Jan 20 23:30:06 crc kubenswrapper[5030]: E0120 23:30:06.044093 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4\": container with ID starting with f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4 not found: ID does not exist" containerID="f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.044148 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4"} err="failed to get container status \"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4\": rpc error: code = NotFound desc = could not find container \"f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4\": container with ID starting with f4eead8ff3e6ee6d16f007081c5bdd57af05de2cc8223708c5a2492e3280fee4 not found: ID does not exist" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.057935 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.065750 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.068145 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.071777 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.074790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.119895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqh94\" (UniqueName: \"kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.119961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxl8z\" (UniqueName: \"kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.120369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.170358 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.221669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.221745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqh94\" (UniqueName: \"kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.221770 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxl8z\" (UniqueName: \"kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.222929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.223529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.226302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.226829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.226855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.228921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.241045 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqh94\" (UniqueName: \"kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94\") pod \"nova-api-0\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.262215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxl8z\" (UniqueName: \"kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z\") pod \"nova-metadata-0\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.364582 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.390697 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:06 crc kubenswrapper[5030]: W0120 23:30:06.848062 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa573ced_61f2_4fad_8b68_66936ebb08b7.slice/crio-cde9d4b456e80472951636c38abd9fef21504ed985cb17477abb76be02ac9891 WatchSource:0}: Error finding container cde9d4b456e80472951636c38abd9fef21504ed985cb17477abb76be02ac9891: Status 404 returned error can't find the container with id cde9d4b456e80472951636c38abd9fef21504ed985cb17477abb76be02ac9891 Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.849793 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.932661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerStarted","Data":"cde9d4b456e80472951636c38abd9fef21504ed985cb17477abb76be02ac9891"} Jan 20 23:30:06 crc kubenswrapper[5030]: I0120 23:30:06.935364 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:06 crc kubenswrapper[5030]: W0120 23:30:06.943211 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1929e21b_87de_4c5f_838a_8351a97185d8.slice/crio-661d897973355e41d139e797e58b27436a48e298b49f8939715af7f88b9f5e3f WatchSource:0}: Error finding container 661d897973355e41d139e797e58b27436a48e298b49f8939715af7f88b9f5e3f: Status 404 returned error can't find the container with id 661d897973355e41d139e797e58b27436a48e298b49f8939715af7f88b9f5e3f Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.473441 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.473708 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.950676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerStarted","Data":"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8"} Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.950772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerStarted","Data":"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f"} Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.950799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerStarted","Data":"661d897973355e41d139e797e58b27436a48e298b49f8939715af7f88b9f5e3f"} Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.955781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerStarted","Data":"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd"} Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.955861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerStarted","Data":"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51"} Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.977198 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.977181795 podStartE2EDuration="2.977181795s" podCreationTimestamp="2026-01-20 23:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:07.97366527 +0000 UTC m=+3280.293925568" watchObservedRunningTime="2026-01-20 23:30:07.977181795 +0000 UTC m=+3280.297442083" Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.983865 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430ba0db-7327-43e8-a350-455f10b27f90" path="/var/lib/kubelet/pods/430ba0db-7327-43e8-a350-455f10b27f90/volumes" Jan 20 23:30:07 crc kubenswrapper[5030]: I0120 23:30:07.985318 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4d5046-d68a-43d1-a12c-9ea876e906ad" path="/var/lib/kubelet/pods/cb4d5046-d68a-43d1-a12c-9ea876e906ad/volumes" Jan 20 23:30:08 crc kubenswrapper[5030]: I0120 23:30:08.019463 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.019438898 podStartE2EDuration="3.019438898s" podCreationTimestamp="2026-01-20 23:30:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:08.005966511 +0000 UTC m=+3280.326226800" watchObservedRunningTime="2026-01-20 23:30:08.019438898 +0000 UTC m=+3280.339699186" Jan 20 23:30:08 crc kubenswrapper[5030]: I0120 23:30:08.024001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:08 crc kubenswrapper[5030]: I0120 23:30:08.533655 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sgfz6" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="registry-server" probeResult="failure" output=< Jan 20 23:30:08 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:30:08 crc kubenswrapper[5030]: > Jan 20 23:30:09 crc kubenswrapper[5030]: I0120 23:30:09.977580 5030 generic.go:334] "Generic (PLEG): container finished" podID="021b4577-29eb-4118-90a9-e4d409ec342a" containerID="9d227bee93ee3810f8bf8c42e3725c916f0abced2be65bffcd089b41e0c07bac" exitCode=0 Jan 20 23:30:09 crc kubenswrapper[5030]: I0120 23:30:09.987819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" event={"ID":"021b4577-29eb-4118-90a9-e4d409ec342a","Type":"ContainerDied","Data":"9d227bee93ee3810f8bf8c42e3725c916f0abced2be65bffcd089b41e0c07bac"} Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.172250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.218234 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.365526 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 
23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.366407 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.389664 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.515437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts\") pod \"021b4577-29eb-4118-90a9-e4d409ec342a\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.515592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcg7b\" (UniqueName: \"kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b\") pod \"021b4577-29eb-4118-90a9-e4d409ec342a\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.515761 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data\") pod \"021b4577-29eb-4118-90a9-e4d409ec342a\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.515794 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle\") pod \"021b4577-29eb-4118-90a9-e4d409ec342a\" (UID: \"021b4577-29eb-4118-90a9-e4d409ec342a\") " Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.520825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts" (OuterVolumeSpecName: "scripts") pod "021b4577-29eb-4118-90a9-e4d409ec342a" (UID: "021b4577-29eb-4118-90a9-e4d409ec342a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.520920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b" (OuterVolumeSpecName: "kube-api-access-tcg7b") pod "021b4577-29eb-4118-90a9-e4d409ec342a" (UID: "021b4577-29eb-4118-90a9-e4d409ec342a"). InnerVolumeSpecName "kube-api-access-tcg7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.538696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data" (OuterVolumeSpecName: "config-data") pod "021b4577-29eb-4118-90a9-e4d409ec342a" (UID: "021b4577-29eb-4118-90a9-e4d409ec342a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.558796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "021b4577-29eb-4118-90a9-e4d409ec342a" (UID: "021b4577-29eb-4118-90a9-e4d409ec342a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.618237 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.618285 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.618301 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/021b4577-29eb-4118-90a9-e4d409ec342a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:11 crc kubenswrapper[5030]: I0120 23:30:11.618316 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcg7b\" (UniqueName: \"kubernetes.io/projected/021b4577-29eb-4118-90a9-e4d409ec342a-kube-api-access-tcg7b\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.005495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" event={"ID":"021b4577-29eb-4118-90a9-e4d409ec342a","Type":"ContainerDied","Data":"2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7"} Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.005549 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d15ebcc8ee7d6f70e8c9fbfac0a0c2ad83b4db73280724baafe0e85c2fb1dc7" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.005815 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.051380 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.196738 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.196952 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-log" containerID="cri-o://a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" gracePeriod=30 Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.197065 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-api" containerID="cri-o://9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" gracePeriod=30 Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.282516 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.473566 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.756772 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle\") pod \"aa573ced-61f2-4fad-8b68-66936ebb08b7\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs\") pod \"aa573ced-61f2-4fad-8b68-66936ebb08b7\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data\") pod \"aa573ced-61f2-4fad-8b68-66936ebb08b7\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839170 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqh94\" (UniqueName: \"kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94\") pod \"aa573ced-61f2-4fad-8b68-66936ebb08b7\" (UID: \"aa573ced-61f2-4fad-8b68-66936ebb08b7\") " Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs" (OuterVolumeSpecName: "logs") pod "aa573ced-61f2-4fad-8b68-66936ebb08b7" (UID: "aa573ced-61f2-4fad-8b68-66936ebb08b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.839947 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa573ced-61f2-4fad-8b68-66936ebb08b7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.844578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94" (OuterVolumeSpecName: "kube-api-access-sqh94") pod "aa573ced-61f2-4fad-8b68-66936ebb08b7" (UID: "aa573ced-61f2-4fad-8b68-66936ebb08b7"). InnerVolumeSpecName "kube-api-access-sqh94". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.865700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa573ced-61f2-4fad-8b68-66936ebb08b7" (UID: "aa573ced-61f2-4fad-8b68-66936ebb08b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.875989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data" (OuterVolumeSpecName: "config-data") pod "aa573ced-61f2-4fad-8b68-66936ebb08b7" (UID: "aa573ced-61f2-4fad-8b68-66936ebb08b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.941053 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.941098 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa573ced-61f2-4fad-8b68-66936ebb08b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.941109 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqh94\" (UniqueName: \"kubernetes.io/projected/aa573ced-61f2-4fad-8b68-66936ebb08b7-kube-api-access-sqh94\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:12 crc kubenswrapper[5030]: I0120 23:30:12.962334 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.017977 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerID="9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" exitCode=0 Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018022 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerID="a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" exitCode=143 Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerDied","Data":"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd"} Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerDied","Data":"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51"} Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"aa573ced-61f2-4fad-8b68-66936ebb08b7","Type":"ContainerDied","Data":"cde9d4b456e80472951636c38abd9fef21504ed985cb17477abb76be02ac9891"} Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018110 5030 scope.go:117] "RemoveContainer" containerID="9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018245 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018322 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-log" containerID="cri-o://862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" gracePeriod=30 Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.018395 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-metadata" containerID="cri-o://e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" gracePeriod=30 Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.048854 5030 scope.go:117] "RemoveContainer" containerID="a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.076532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.087334 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.088793 5030 scope.go:117] "RemoveContainer" containerID="9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" Jan 20 23:30:13 crc kubenswrapper[5030]: E0120 23:30:13.089424 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd\": container with ID starting with 9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd not found: ID does not exist" containerID="9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.089482 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd"} err="failed to get container status \"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd\": rpc error: code = NotFound desc = could not find container \"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd\": container with ID starting with 9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd not found: ID does not exist" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.089516 5030 scope.go:117] "RemoveContainer" containerID="a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" Jan 20 23:30:13 crc kubenswrapper[5030]: E0120 23:30:13.090020 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51\": container with ID starting with a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51 not found: ID does not exist" containerID="a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.090076 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51"} err="failed to get container status \"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51\": rpc error: code = NotFound desc = could not 
find container \"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51\": container with ID starting with a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51 not found: ID does not exist" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.090113 5030 scope.go:117] "RemoveContainer" containerID="9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.090523 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd"} err="failed to get container status \"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd\": rpc error: code = NotFound desc = could not find container \"9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd\": container with ID starting with 9ddff13f5fbb7b57944f4a68d5d187f093ab200483b81e9d64431394fef5decd not found: ID does not exist" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.090564 5030 scope.go:117] "RemoveContainer" containerID="a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.091035 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51"} err="failed to get container status \"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51\": rpc error: code = NotFound desc = could not find container \"a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51\": container with ID starting with a71c2eb2185e5cc3b7a41c038c8b569199e5a9e037f08db6f8cc6b36a3e08c51 not found: ID does not exist" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100241 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:13 crc kubenswrapper[5030]: E0120 23:30:13.100665 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-log" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100682 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-log" Jan 20 23:30:13 crc kubenswrapper[5030]: E0120 23:30:13.100697 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-api" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100704 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-api" Jan 20 23:30:13 crc kubenswrapper[5030]: E0120 23:30:13.100714 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b4577-29eb-4118-90a9-e4d409ec342a" containerName="nova-manage" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100720 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b4577-29eb-4118-90a9-e4d409ec342a" containerName="nova-manage" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="021b4577-29eb-4118-90a9-e4d409ec342a" containerName="nova-manage" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100887 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-log" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.100902 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" containerName="nova-api-api" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.101841 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.104727 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.116704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.245153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.245212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.245265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.245338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9qk\" (UniqueName: \"kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.346736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.347056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.347086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.347127 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n9qk\" (UniqueName: \"kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk\") pod \"nova-api-0\" (UID: 
\"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.348975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.351830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.352299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.364001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9qk\" (UniqueName: \"kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk\") pod \"nova-api-0\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.421613 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.545343 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.650579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs\") pod \"1929e21b-87de-4c5f-838a-8351a97185d8\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.650805 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data\") pod \"1929e21b-87de-4c5f-838a-8351a97185d8\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.650843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxl8z\" (UniqueName: \"kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z\") pod \"1929e21b-87de-4c5f-838a-8351a97185d8\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.650868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle\") pod \"1929e21b-87de-4c5f-838a-8351a97185d8\" (UID: \"1929e21b-87de-4c5f-838a-8351a97185d8\") " Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.651142 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs" (OuterVolumeSpecName: "logs") pod "1929e21b-87de-4c5f-838a-8351a97185d8" (UID: "1929e21b-87de-4c5f-838a-8351a97185d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.651256 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1929e21b-87de-4c5f-838a-8351a97185d8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.656455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z" (OuterVolumeSpecName: "kube-api-access-bxl8z") pod "1929e21b-87de-4c5f-838a-8351a97185d8" (UID: "1929e21b-87de-4c5f-838a-8351a97185d8"). InnerVolumeSpecName "kube-api-access-bxl8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.679322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data" (OuterVolumeSpecName: "config-data") pod "1929e21b-87de-4c5f-838a-8351a97185d8" (UID: "1929e21b-87de-4c5f-838a-8351a97185d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.680055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1929e21b-87de-4c5f-838a-8351a97185d8" (UID: "1929e21b-87de-4c5f-838a-8351a97185d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.752698 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.752730 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxl8z\" (UniqueName: \"kubernetes.io/projected/1929e21b-87de-4c5f-838a-8351a97185d8-kube-api-access-bxl8z\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.752741 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1929e21b-87de-4c5f-838a-8351a97185d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.893678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:30:13 crc kubenswrapper[5030]: I0120 23:30:13.982230 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa573ced-61f2-4fad-8b68-66936ebb08b7" path="/var/lib/kubelet/pods/aa573ced-61f2-4fad-8b68-66936ebb08b7/volumes" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.028870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerStarted","Data":"9f48055581cfa836ec976d39c83be75bc5d8e68e69f74bd43ee85c063d08641f"} Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.032914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea"} Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037731 5030 generic.go:334] "Generic (PLEG): container finished" podID="1929e21b-87de-4c5f-838a-8351a97185d8" containerID="e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" exitCode=0 Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037763 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037768 5030 generic.go:334] "Generic (PLEG): container finished" podID="1929e21b-87de-4c5f-838a-8351a97185d8" containerID="862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" exitCode=143 Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerDied","Data":"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8"} Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerDied","Data":"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f"} Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"1929e21b-87de-4c5f-838a-8351a97185d8","Type":"ContainerDied","Data":"661d897973355e41d139e797e58b27436a48e298b49f8939715af7f88b9f5e3f"} Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.037893 5030 scope.go:117] "RemoveContainer" containerID="e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.038138 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" containerName="nova-scheduler-scheduler" containerID="cri-o://9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" gracePeriod=30 Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.072859 5030 scope.go:117] "RemoveContainer" containerID="862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.099013 5030 scope.go:117] "RemoveContainer" containerID="e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" Jan 20 23:30:14 crc kubenswrapper[5030]: E0120 23:30:14.101839 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8\": container with ID starting with e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8 not found: ID does not exist" containerID="e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.101897 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8"} err="failed to get container status \"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8\": rpc error: code = NotFound desc = could not find container \"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8\": container with ID starting with e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8 not found: ID does not exist" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.101928 5030 scope.go:117] "RemoveContainer" containerID="862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" Jan 20 23:30:14 crc kubenswrapper[5030]: E0120 23:30:14.102338 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f\": container with ID starting with 862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f not found: ID does not exist" containerID="862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.102370 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f"} err="failed to get container status \"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f\": rpc error: code = NotFound desc = could not find container \"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f\": container with ID starting with 862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f not found: ID does not exist" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.102386 5030 scope.go:117] "RemoveContainer" containerID="e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.102738 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8"} err="failed to get container status \"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8\": rpc error: code = NotFound desc = could not find container \"e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8\": container with ID starting with e7444f3c76414bb29c4c285b2772dc4622525d65e72e12328a3dbeabd71dd6d8 not found: ID does not exist" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.102752 5030 scope.go:117] "RemoveContainer" containerID="862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.103192 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f"} err="failed to get container status \"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f\": rpc error: code = NotFound desc = could not find container \"862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f\": container with ID starting with 862178fabfc90bc71729c9c182a4bb5b8402fc603710eb90e8f8c1c96e9f0f3f not found: ID does not exist" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.133750 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.154747 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.165088 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:14 crc kubenswrapper[5030]: E0120 23:30:14.165617 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-log" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.165666 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-log" Jan 20 23:30:14 crc kubenswrapper[5030]: E0120 23:30:14.165696 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-metadata" 
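The PLEG entries in this capture ("SyncLoop (PLEG): event for pod" ... event={"ID":...,"Type":"ContainerDied"/"ContainerStarted","Data":...}) record the delete-and-recreate cycle for nova-api-0 and nova-metadata-0 as bare container events. A minimal sketch of pulling those events out of a journal capture formatted exactly like this one, assuming the text is piped in on stdin; the regular expression and the pleg_events/timeline names are illustrative helpers, not part of kubelet or CRI-O:

```python
import json
import re
import sys
from collections import defaultdict

# Matches kubelet PLEG entries in the form seen in this journal excerpt, e.g.
#   kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="ns/name"
#   event={"ID":"<pod-uid>","Type":"ContainerDied","Data":"<container-or-sandbox-id>"}
# The pattern is tuned to this capture; other klog layouts may need adjusting.
PLEG_RE = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" event=(?P<event>\{[^}]*\})'
)

def pleg_events(text_lines):
    """Yield (pod, pod-uid, event-type, container-or-sandbox id) from journal text."""
    for line in text_lines:
        for match in PLEG_RE.finditer(line):
            event = json.loads(match.group("event"))
            yield match.group("pod"), event.get("ID"), event.get("Type"), event.get("Data")

if __name__ == "__main__":
    timeline = defaultdict(list)
    for pod, uid, event_type, ref in pleg_events(sys.stdin):
        timeline[pod].append((event_type, uid, ref))
    for pod, events in sorted(timeline.items()):
        print(pod)
        for event_type, uid, ref in events:
            print(f"  {event_type:>16}  uid={uid}  {ref}")
```

Fed this excerpt (for example via journalctl piped into a hypothetical pleg_timeline.py holding the sketch, assuming the node exposes kubelet logs through journald as shown here), it groups the ContainerDied events for the old nova-metadata-0 pod UID and the ContainerStarted events of its replacement under the same pod name, which makes the kill/recreate sequence easier to follow than the interleaved raw entries.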
Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.165705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-metadata" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.165952 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-log" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.165980 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" containerName="nova-metadata-metadata" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.167299 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.169000 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.176162 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.260054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.260820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.261328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.261446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssg2g\" (UniqueName: \"kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.362711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.362776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.362923 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.362961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssg2g\" (UniqueName: \"kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.364289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.372396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.372491 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.377909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssg2g\" (UniqueName: \"kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g\") pod \"nova-metadata-0\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:14 crc kubenswrapper[5030]: I0120 23:30:14.485917 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:15 crc kubenswrapper[5030]: I0120 23:30:15.091838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerStarted","Data":"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3"} Jan 20 23:30:15 crc kubenswrapper[5030]: I0120 23:30:15.092310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerStarted","Data":"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118"} Jan 20 23:30:15 crc kubenswrapper[5030]: I0120 23:30:15.115648 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.115610096 podStartE2EDuration="2.115610096s" podCreationTimestamp="2026-01-20 23:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:15.115584916 +0000 UTC m=+3287.435845204" watchObservedRunningTime="2026-01-20 23:30:15.115610096 +0000 UTC m=+3287.435870384" Jan 20 23:30:15 crc kubenswrapper[5030]: I0120 23:30:15.762984 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:30:15 crc kubenswrapper[5030]: W0120 23:30:15.763836 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice/crio-27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e WatchSource:0}: Error finding container 27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e: Status 404 returned error can't find the container with id 27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e Jan 20 23:30:15 crc kubenswrapper[5030]: I0120 23:30:15.984166 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1929e21b-87de-4c5f-838a-8351a97185d8" path="/var/lib/kubelet/pods/1929e21b-87de-4c5f-838a-8351a97185d8/volumes" Jan 20 23:30:16 crc kubenswrapper[5030]: I0120 23:30:16.107942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerStarted","Data":"d465c7355fb972c93c904f55d8b5006846ff351d0ce18969da3c146b863c1b94"} Jan 20 23:30:16 crc kubenswrapper[5030]: I0120 23:30:16.108005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerStarted","Data":"27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e"} Jan 20 23:30:16 crc kubenswrapper[5030]: E0120 23:30:16.173215 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:30:16 crc kubenswrapper[5030]: E0120 23:30:16.175142 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:30:16 crc kubenswrapper[5030]: E0120 23:30:16.176974 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:30:16 crc kubenswrapper[5030]: E0120 23:30:16.177012 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" containerName="nova-scheduler-scheduler" Jan 20 23:30:17 crc kubenswrapper[5030]: I0120 23:30:17.122749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerStarted","Data":"25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df"} Jan 20 23:30:17 crc kubenswrapper[5030]: I0120 23:30:17.161965 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.161939833 podStartE2EDuration="3.161939833s" podCreationTimestamp="2026-01-20 23:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:17.152887923 +0000 UTC m=+3289.473148241" watchObservedRunningTime="2026-01-20 23:30:17.161939833 +0000 UTC m=+3289.482200151" Jan 20 23:30:17 crc kubenswrapper[5030]: I0120 23:30:17.555169 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:17 crc kubenswrapper[5030]: I0120 23:30:17.608571 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:17 crc kubenswrapper[5030]: I0120 23:30:17.794119 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.148691 5030 generic.go:334] "Generic (PLEG): container finished" podID="e68f755c-e882-4722-9d23-1cd24541f37a" containerID="9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" exitCode=0 Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.149097 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sgfz6" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="registry-server" containerID="cri-o://a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39" gracePeriod=2 Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.149164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"e68f755c-e882-4722-9d23-1cd24541f37a","Type":"ContainerDied","Data":"9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a"} Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.149185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" 
event={"ID":"e68f755c-e882-4722-9d23-1cd24541f37a","Type":"ContainerDied","Data":"be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d"} Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.149194 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be471bec5632700e47a6ea9567eabe0438acec90c4db65e184ed7d7532a7ac3d" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.151463 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.259073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle\") pod \"e68f755c-e882-4722-9d23-1cd24541f37a\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.259464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data\") pod \"e68f755c-e882-4722-9d23-1cd24541f37a\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.259651 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzvm\" (UniqueName: \"kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm\") pod \"e68f755c-e882-4722-9d23-1cd24541f37a\" (UID: \"e68f755c-e882-4722-9d23-1cd24541f37a\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.265662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm" (OuterVolumeSpecName: "kube-api-access-6fzvm") pod "e68f755c-e882-4722-9d23-1cd24541f37a" (UID: "e68f755c-e882-4722-9d23-1cd24541f37a"). InnerVolumeSpecName "kube-api-access-6fzvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.289744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data" (OuterVolumeSpecName: "config-data") pod "e68f755c-e882-4722-9d23-1cd24541f37a" (UID: "e68f755c-e882-4722-9d23-1cd24541f37a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.291790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e68f755c-e882-4722-9d23-1cd24541f37a" (UID: "e68f755c-e882-4722-9d23-1cd24541f37a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.361958 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.361998 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e68f755c-e882-4722-9d23-1cd24541f37a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.362013 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzvm\" (UniqueName: \"kubernetes.io/projected/e68f755c-e882-4722-9d23-1cd24541f37a-kube-api-access-6fzvm\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.486983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.487651 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.565414 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.666466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content\") pod \"905b9300-659f-4490-b8c1-52e5874b23b7\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.666563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities\") pod \"905b9300-659f-4490-b8c1-52e5874b23b7\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.666612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gzj5\" (UniqueName: \"kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5\") pod \"905b9300-659f-4490-b8c1-52e5874b23b7\" (UID: \"905b9300-659f-4490-b8c1-52e5874b23b7\") " Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.667470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities" (OuterVolumeSpecName: "utilities") pod "905b9300-659f-4490-b8c1-52e5874b23b7" (UID: "905b9300-659f-4490-b8c1-52e5874b23b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.670960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5" (OuterVolumeSpecName: "kube-api-access-5gzj5") pod "905b9300-659f-4490-b8c1-52e5874b23b7" (UID: "905b9300-659f-4490-b8c1-52e5874b23b7"). InnerVolumeSpecName "kube-api-access-5gzj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.769441 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.769500 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gzj5\" (UniqueName: \"kubernetes.io/projected/905b9300-659f-4490-b8c1-52e5874b23b7-kube-api-access-5gzj5\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.786028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "905b9300-659f-4490-b8c1-52e5874b23b7" (UID: "905b9300-659f-4490-b8c1-52e5874b23b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:19 crc kubenswrapper[5030]: I0120 23:30:19.870957 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/905b9300-659f-4490-b8c1-52e5874b23b7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.167985 5030 generic.go:334] "Generic (PLEG): container finished" podID="905b9300-659f-4490-b8c1-52e5874b23b7" containerID="a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39" exitCode=0 Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.168078 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.168082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerDied","Data":"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39"} Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.168161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgfz6" event={"ID":"905b9300-659f-4490-b8c1-52e5874b23b7","Type":"ContainerDied","Data":"265a66a7c8f9c28844f273eabe7c1d4029c1aafd39d840ad898ee4d144893fc1"} Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.168194 5030 scope.go:117] "RemoveContainer" containerID="a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.169668 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sgfz6" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.207717 5030 scope.go:117] "RemoveContainer" containerID="ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.207942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.224133 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.247882 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.251385 5030 scope.go:117] "RemoveContainer" containerID="08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.261841 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sgfz6"] Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.277174 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.277773 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="extract-content" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.277806 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="extract-content" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.277833 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" containerName="nova-scheduler-scheduler" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.277844 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" containerName="nova-scheduler-scheduler" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.277870 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="extract-utilities" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.277878 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="extract-utilities" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.277907 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="registry-server" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.277915 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="registry-server" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.278173 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" containerName="nova-scheduler-scheduler" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.278199 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" containerName="registry-server" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.279288 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.282348 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.291235 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.301712 5030 scope.go:117] "RemoveContainer" containerID="a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.302277 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39\": container with ID starting with a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39 not found: ID does not exist" containerID="a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.302329 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39"} err="failed to get container status \"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39\": rpc error: code = NotFound desc = could not find container \"a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39\": container with ID starting with a1f98313bdc9737123229e8d5af02e60151563f266aa18bebce4c2cd65e40a39 not found: ID does not exist" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.302361 5030 scope.go:117] "RemoveContainer" containerID="ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.302769 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25\": container with ID starting with ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25 not found: ID does not exist" containerID="ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.302817 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25"} err="failed to get container status \"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25\": rpc error: code = NotFound desc = could not find container \"ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25\": container with ID starting with ce0586a5474a6cc2de7fdd01bb27bb5cdc8ccc33deeb51efd71e9fe9e9551a25 not found: ID does not exist" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.302847 5030 scope.go:117] "RemoveContainer" containerID="08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c" Jan 20 23:30:20 crc kubenswrapper[5030]: E0120 23:30:20.303250 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c\": container with ID starting with 08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c not found: ID does not exist" containerID="08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c" Jan 20 23:30:20 crc 
kubenswrapper[5030]: I0120 23:30:20.303312 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c"} err="failed to get container status \"08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c\": rpc error: code = NotFound desc = could not find container \"08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c\": container with ID starting with 08325065dcc4631e5da5a30b800e1b1193f67cf5d051d623c4924d3a56eb756c not found: ID does not exist" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.380268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.380388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58bdc\" (UniqueName: \"kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.380561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.482642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.482774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58bdc\" (UniqueName: \"kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.482829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.488091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.489315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle\") pod \"nova-scheduler-0\" 
(UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.500314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58bdc\" (UniqueName: \"kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc\") pod \"nova-scheduler-0\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:20 crc kubenswrapper[5030]: I0120 23:30:20.634228 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:21 crc kubenswrapper[5030]: I0120 23:30:21.261172 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:30:21 crc kubenswrapper[5030]: I0120 23:30:21.976381 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905b9300-659f-4490-b8c1-52e5874b23b7" path="/var/lib/kubelet/pods/905b9300-659f-4490-b8c1-52e5874b23b7/volumes" Jan 20 23:30:21 crc kubenswrapper[5030]: I0120 23:30:21.978654 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e68f755c-e882-4722-9d23-1cd24541f37a" path="/var/lib/kubelet/pods/e68f755c-e882-4722-9d23-1cd24541f37a/volumes" Jan 20 23:30:22 crc kubenswrapper[5030]: I0120 23:30:22.199066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6bf49fb-3a1f-4a62-9340-a40608e2d844","Type":"ContainerStarted","Data":"b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839"} Jan 20 23:30:22 crc kubenswrapper[5030]: I0120 23:30:22.199137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6bf49fb-3a1f-4a62-9340-a40608e2d844","Type":"ContainerStarted","Data":"c67cad9dedfea3af06bf566a87b5352eaf9bd53006d7cdeb5d315d7a96a253d6"} Jan 20 23:30:22 crc kubenswrapper[5030]: I0120 23:30:22.230701 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.230669483 podStartE2EDuration="2.230669483s" podCreationTimestamp="2026-01-20 23:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:30:22.220896946 +0000 UTC m=+3294.541157264" watchObservedRunningTime="2026-01-20 23:30:22.230669483 +0000 UTC m=+3294.550929801" Jan 20 23:30:23 crc kubenswrapper[5030]: I0120 23:30:23.423015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:23 crc kubenswrapper[5030]: I0120 23:30:23.423100 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:24 crc kubenswrapper[5030]: I0120 23:30:24.486292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:24 crc kubenswrapper[5030]: I0120 23:30:24.486354 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:24 crc kubenswrapper[5030]: I0120 23:30:24.505821 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.182:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:30:24 crc kubenswrapper[5030]: I0120 23:30:24.505860 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.182:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:30:25 crc kubenswrapper[5030]: I0120 23:30:25.527854 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.183:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:30:25 crc kubenswrapper[5030]: I0120 23:30:25.568894 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.183:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:30:25 crc kubenswrapper[5030]: I0120 23:30:25.634850 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:30 crc kubenswrapper[5030]: I0120 23:30:30.634608 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:30 crc kubenswrapper[5030]: I0120 23:30:30.677536 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:31 crc kubenswrapper[5030]: I0120 23:30:31.370593 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.426900 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.428075 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.428768 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.429170 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.431647 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:33 crc kubenswrapper[5030]: I0120 23:30:33.433245 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:30:34 crc kubenswrapper[5030]: I0120 23:30:34.490191 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:34 crc kubenswrapper[5030]: I0120 23:30:34.493298 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:34 crc kubenswrapper[5030]: I0120 23:30:34.494878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:35 crc 
kubenswrapper[5030]: I0120 23:30:35.204700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.205244 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-central-agent" containerID="cri-o://ace880f4a825d6a95c30a4f0e9a310b328e92780df913bc19ae82a8b7b47d214" gracePeriod=30 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.205389 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-notification-agent" containerID="cri-o://48cb6ab03283c40b54928b82b2dc7d88ccc380ff74d227332416a5477d39bb1b" gracePeriod=30 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.205389 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="sg-core" containerID="cri-o://53a4d99532461ab3bfdb793a4b54d0f3bff57141dd53a54b9d2557354a5473bd" gracePeriod=30 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.205649 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="proxy-httpd" containerID="cri-o://c44dbd4030758ff38d58724eca64dbedfe88fccad979f265543c57147bde8f99" gracePeriod=30 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.365508 5030 generic.go:334] "Generic (PLEG): container finished" podID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerID="c44dbd4030758ff38d58724eca64dbedfe88fccad979f265543c57147bde8f99" exitCode=0 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.365538 5030 generic.go:334] "Generic (PLEG): container finished" podID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerID="53a4d99532461ab3bfdb793a4b54d0f3bff57141dd53a54b9d2557354a5473bd" exitCode=2 Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.367085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerDied","Data":"c44dbd4030758ff38d58724eca64dbedfe88fccad979f265543c57147bde8f99"} Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.367116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerDied","Data":"53a4d99532461ab3bfdb793a4b54d0f3bff57141dd53a54b9d2557354a5473bd"} Jan 20 23:30:35 crc kubenswrapper[5030]: I0120 23:30:35.370519 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:30:36 crc kubenswrapper[5030]: I0120 23:30:36.379447 5030 generic.go:334] "Generic (PLEG): container finished" podID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerID="ace880f4a825d6a95c30a4f0e9a310b328e92780df913bc19ae82a8b7b47d214" exitCode=0 Jan 20 23:30:36 crc kubenswrapper[5030]: I0120 23:30:36.379564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerDied","Data":"ace880f4a825d6a95c30a4f0e9a310b328e92780df913bc19ae82a8b7b47d214"} Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.392917 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerID="48cb6ab03283c40b54928b82b2dc7d88ccc380ff74d227332416a5477d39bb1b" exitCode=0 Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.394023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerDied","Data":"48cb6ab03283c40b54928b82b2dc7d88ccc380ff74d227332416a5477d39bb1b"} Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.631956 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.756567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q97cj\" (UniqueName: \"kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.757882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd\") pod \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\" (UID: \"19e6b507-ad9e-4d7e-a74b-8157b8add97a\") " Jan 20 23:30:37 
crc kubenswrapper[5030]: I0120 23:30:37.758596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.758799 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.763098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj" (OuterVolumeSpecName: "kube-api-access-q97cj") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "kube-api-access-q97cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.771915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts" (OuterVolumeSpecName: "scripts") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.789779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.847838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data" (OuterVolumeSpecName: "config-data") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.850502 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19e6b507-ad9e-4d7e-a74b-8157b8add97a" (UID: "19e6b507-ad9e-4d7e-a74b-8157b8add97a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860740 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860794 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860816 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q97cj\" (UniqueName: \"kubernetes.io/projected/19e6b507-ad9e-4d7e-a74b-8157b8add97a-kube-api-access-q97cj\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860838 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19e6b507-ad9e-4d7e-a74b-8157b8add97a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860883 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:37 crc kubenswrapper[5030]: I0120 23:30:37.860901 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19e6b507-ad9e-4d7e-a74b-8157b8add97a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.411728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"19e6b507-ad9e-4d7e-a74b-8157b8add97a","Type":"ContainerDied","Data":"d3c3a853636360a860f1aef7d1bb37fe24687d234258f2174c384fbfe62dd80d"} Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.411880 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.412683 5030 scope.go:117] "RemoveContainer" containerID="c44dbd4030758ff38d58724eca64dbedfe88fccad979f265543c57147bde8f99" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.450195 5030 scope.go:117] "RemoveContainer" containerID="53a4d99532461ab3bfdb793a4b54d0f3bff57141dd53a54b9d2557354a5473bd" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.453958 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.467989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.479554 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:38 crc kubenswrapper[5030]: E0120 23:30:38.480047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-central-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480064 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-central-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: E0120 23:30:38.480088 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-notification-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480096 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-notification-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: E0120 23:30:38.480107 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="sg-core" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="sg-core" Jan 20 23:30:38 crc kubenswrapper[5030]: E0120 23:30:38.480135 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="proxy-httpd" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480141 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="proxy-httpd" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480528 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-central-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480547 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="proxy-httpd" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480569 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="sg-core" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.480599 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" containerName="ceilometer-notification-agent" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.482147 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.483944 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.488544 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.510339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.510386 5030 scope.go:117] "RemoveContainer" containerID="48cb6ab03283c40b54928b82b2dc7d88ccc380ff74d227332416a5477d39bb1b" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.554068 5030 scope.go:117] "RemoveContainer" containerID="ace880f4a825d6a95c30a4f0e9a310b328e92780df913bc19ae82a8b7b47d214" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.577730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxbw\" (UniqueName: \"kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578617 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.578741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd\") pod \"ceilometer-0\" (UID: 
\"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.683311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.683659 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.684182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.684478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.684480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxbw\" (UniqueName: \"kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.684980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.685224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.685341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.685549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.689405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.690012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.692506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.692707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.709600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxbw\" (UniqueName: \"kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw\") pod \"ceilometer-0\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:38 crc kubenswrapper[5030]: I0120 23:30:38.830759 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:39 crc kubenswrapper[5030]: I0120 23:30:39.317807 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:30:39 crc kubenswrapper[5030]: W0120 23:30:39.323767 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a35bcf9_e354_4385_8e10_2bb69352fc70.slice/crio-b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990 WatchSource:0}: Error finding container b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990: Status 404 returned error can't find the container with id b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990 Jan 20 23:30:39 crc kubenswrapper[5030]: I0120 23:30:39.325581 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:30:39 crc kubenswrapper[5030]: I0120 23:30:39.457740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerStarted","Data":"b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990"} Jan 20 23:30:39 crc kubenswrapper[5030]: I0120 23:30:39.974579 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e6b507-ad9e-4d7e-a74b-8157b8add97a" path="/var/lib/kubelet/pods/19e6b507-ad9e-4d7e-a74b-8157b8add97a/volumes" Jan 20 23:30:40 crc kubenswrapper[5030]: I0120 23:30:40.467702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerStarted","Data":"3bfb3d28c0e12ae18efaa28d415459489a4a00796c95465b198516c16c2dc578"} Jan 20 23:30:41 crc kubenswrapper[5030]: I0120 23:30:41.477476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerStarted","Data":"d8018bf3bade68fa64b2765cfd6560c427fb070f2f81ee36252a4716114b59ee"} Jan 20 23:30:42 crc kubenswrapper[5030]: I0120 23:30:42.492205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerStarted","Data":"7185c5fb05b2a58b278c176c1f7c6feba52b9f23788ceb4dae8035f37f075ebe"} Jan 20 23:30:44 crc kubenswrapper[5030]: I0120 23:30:44.513959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerStarted","Data":"e7c6be4bd549d34d96b35ea32ddf2d9e56b7140e18ff6672de159cff243a514f"} Jan 20 23:30:44 crc kubenswrapper[5030]: I0120 23:30:44.514836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:30:44 crc kubenswrapper[5030]: I0120 23:30:44.551056 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.41771165 podStartE2EDuration="6.551030946s" podCreationTimestamp="2026-01-20 23:30:38 +0000 UTC" firstStartedPulling="2026-01-20 23:30:39.325383538 +0000 UTC m=+3311.645643826" lastFinishedPulling="2026-01-20 23:30:43.458702794 +0000 UTC m=+3315.778963122" observedRunningTime="2026-01-20 23:30:44.541513055 +0000 UTC m=+3316.861773333" watchObservedRunningTime="2026-01-20 23:30:44.551030946 +0000 UTC m=+3316.871291244" Jan 20 23:30:56 crc kubenswrapper[5030]: I0120 23:30:56.915690 5030 scope.go:117] "RemoveContainer" containerID="a06bb76ff4a3825e1075099128adc0c7f62a34e84ae7348cfcb05f086f4b2136" Jan 20 23:31:08 crc kubenswrapper[5030]: I0120 23:31:08.839107 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.698811 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.717672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt"] Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.728850 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-srmvt"] Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.796675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.796866 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="9936c7df-3635-4a9b-acec-a8c178738415" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" gracePeriod=30 Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.830505 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:20 crc kubenswrapper[5030]: I0120 23:31:20.830704 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6860c5ea-d056-44a7-baf9-89c835744a86" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6ebff1ddfc2fee47dd48f67a0e30fc5e637d56329096440d4d71adb702b7d8f6" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.340557 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.465010 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.466459 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.480001 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.481498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.515326 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.523554 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.565196 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.566415 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.583459 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.583735 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api-log" containerID="cri-o://afe9fcbff8775eba20651455c79c527a300dd3b3f856f1adcfb7be13f0d24a0b" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.583787 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api" containerID="cri-o://51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.632846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzpp\" (UniqueName: \"kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.632901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.632927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7hsz\" (UniqueName: \"kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.632956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " 
pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633104 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633122 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.633163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.637664 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.651754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.652018 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="cinder-scheduler" containerID="cri-o://b8f27ffed3eee51863138cee9e48d260bc090210cbb06cbfeed0342e5a7050be" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.652156 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="probe" containerID="cri-o://585ae5ff44f886ced04efde6a25158e4b15847e87657330c8e6e321d18ce82b3" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7hsz\" (UniqueName: \"kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjtt\" (UniqueName: 
\"kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.738970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 
23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkzpp\" (UniqueName: \"kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739211 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.739232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.740717 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.755377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.755394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.756860 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.757429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.757487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.759651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.761798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.770801 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tw5\" (UniqueName: \"kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjtt\" (UniqueName: \"kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.863758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.864247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.871321 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.871760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7hsz\" (UniqueName: \"kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz\") pod \"barbican-keystone-listener-5868d649cd-4hxqk\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.872061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.872648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.876471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkzpp\" (UniqueName: \"kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp\") pod \"barbican-worker-9d499c945-ffpkv\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.874938 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.885027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.892668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.892983 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-log" containerID="cri-o://d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.893868 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-api" containerID="cri-o://50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.896876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjtt\" (UniqueName: 
\"kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.915368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data\") pod \"keystone-6676986dff-v2h4r\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.937309 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.937794 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerName="nova-scheduler-scheduler" containerID="cri-o://b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" gracePeriod=30 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.971009 5030 generic.go:334] "Generic (PLEG): container finished" podID="6860c5ea-d056-44a7-baf9-89c835744a86" containerID="6ebff1ddfc2fee47dd48f67a0e30fc5e637d56329096440d4d71adb702b7d8f6" exitCode=0 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.972677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.972783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.972900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.973025 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tw5\" (UniqueName: \"kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.973160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.975404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.980002 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021b4577-29eb-4118-90a9-e4d409ec342a" path="/var/lib/kubelet/pods/021b4577-29eb-4118-90a9-e4d409ec342a/volumes" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.981312 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.982782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.986319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6860c5ea-d056-44a7-baf9-89c835744a86","Type":"ContainerDied","Data":"6ebff1ddfc2fee47dd48f67a0e30fc5e637d56329096440d4d71adb702b7d8f6"} Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.986406 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.986746 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.987021 5030 generic.go:334] "Generic (PLEG): container finished" podID="36edc87c-d811-414a-8630-86c81093bdbf" containerID="afe9fcbff8775eba20651455c79c527a300dd3b3f856f1adcfb7be13f0d24a0b" exitCode=143 Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.987046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerDied","Data":"afe9fcbff8775eba20651455c79c527a300dd3b3f856f1adcfb7be13f0d24a0b"} Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.989888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.998240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:21 crc kubenswrapper[5030]: I0120 23:31:21.999798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tw5\" (UniqueName: \"kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5\") pod \"barbican-api-67bdd6745b-xntdb\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 
23:31:22.002984 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.003262 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" containerID="cri-o://d465c7355fb972c93c904f55d8b5006846ff351d0ce18969da3c146b863c1b94" gracePeriod=30 Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.003440 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" containerID="cri-o://25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df" gracePeriod=30 Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.069380 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle\") pod \"6860c5ea-d056-44a7-baf9-89c835744a86\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bc8p\" (UniqueName: \"kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p\") pod \"6860c5ea-d056-44a7-baf9-89c835744a86\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data\") pod \"6860c5ea-d056-44a7-baf9-89c835744a86\" (UID: \"6860c5ea-d056-44a7-baf9-89c835744a86\") " Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074321 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdvrq\" (UniqueName: \"kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.074447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.079936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p" (OuterVolumeSpecName: "kube-api-access-4bc8p") pod "6860c5ea-d056-44a7-baf9-89c835744a86" (UID: "6860c5ea-d056-44a7-baf9-89c835744a86"). InnerVolumeSpecName "kube-api-access-4bc8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.081993 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.096584 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.134123 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data" (OuterVolumeSpecName: "config-data") pod "6860c5ea-d056-44a7-baf9-89c835744a86" (UID: "6860c5ea-d056-44a7-baf9-89c835744a86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.149031 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6860c5ea-d056-44a7-baf9-89c835744a86" (UID: "6860c5ea-d056-44a7-baf9-89c835744a86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdvrq\" (UniqueName: \"kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175943 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175953 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6860c5ea-d056-44a7-baf9-89c835744a86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.175963 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bc8p\" (UniqueName: \"kubernetes.io/projected/6860c5ea-d056-44a7-baf9-89c835744a86-kube-api-access-4bc8p\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.184396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.184694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.185097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " 
pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.206042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.211498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdvrq\" (UniqueName: \"kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq\") pod \"neutron-7d99d84474-qxm76\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.244529 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.324392 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.780283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.864038 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:22 crc kubenswrapper[5030]: I0120 23:31:22.864197 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" containerName="memcached" containerID="cri-o://99c989c805531e84aa4649956b12836dbd4fb418611a818efb7cce3c72c5eedd" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.002016 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.002214 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-log" containerID="cri-o://64bde24ce641ac620803ec772b33819c4f11068e6abd5447571283350997935d" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.004810 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-httpd" containerID="cri-o://16c5cb22986bc3b811996c0ba29d425238d57e0b2b669a5b412626aeb83f8fda" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.040862 5030 generic.go:334] "Generic (PLEG): container finished" podID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerID="d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118" exitCode=143 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.040932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerDied","Data":"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118"} Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.056855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.072628 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerID="585ae5ff44f886ced04efde6a25158e4b15847e87657330c8e6e321d18ce82b3" exitCode=0 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.072709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerDied","Data":"585ae5ff44f886ced04efde6a25158e4b15847e87657330c8e6e321d18ce82b3"} Jan 20 23:31:23 crc kubenswrapper[5030]: W0120 23:31:23.079757 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20f9105_8569_4c92_98f9_dd35cc9d43a9.slice/crio-3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4 WatchSource:0}: Error finding container 3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4: Status 404 returned error can't find the container with id 3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.084814 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerStarted","Data":"382ecae1ec29df3c8334fa5c814576e52cef4b516be068ee7b882aca010b9afc"} Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.091485 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerID="d465c7355fb972c93c904f55d8b5006846ff351d0ce18969da3c146b863c1b94" exitCode=143 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.091796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerDied","Data":"d465c7355fb972c93c904f55d8b5006846ff351d0ce18969da3c146b863c1b94"} Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.099596 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.099840 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-log" containerID="cri-o://f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.100256 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-httpd" containerID="cri-o://53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.107092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6860c5ea-d056-44a7-baf9-89c835744a86","Type":"ContainerDied","Data":"0bf03ac9e9e90dc2573d5bf2575fcda2640adffdd87fe54b7c3a07153fbde546"} Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.107144 5030 scope.go:117] "RemoveContainer" containerID="6ebff1ddfc2fee47dd48f67a0e30fc5e637d56329096440d4d71adb702b7d8f6" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.107266 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.133148 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.134263 5030 generic.go:334] "Generic (PLEG): container finished" podID="9936c7df-3635-4a9b-acec-a8c178738415" containerID="fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" exitCode=0 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.134295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9936c7df-3635-4a9b-acec-a8c178738415","Type":"ContainerDied","Data":"fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9"} Jan 20 23:31:23 crc kubenswrapper[5030]: E0120 23:31:23.139248 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9 is running failed: container process not found" containerID="fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:23 crc kubenswrapper[5030]: E0120 23:31:23.148756 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9 is running failed: container process not found" containerID="fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:23 crc kubenswrapper[5030]: E0120 23:31:23.158746 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9 is running failed: container process not found" containerID="fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:23 crc kubenswrapper[5030]: E0120 23:31:23.158814 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="9936c7df-3635-4a9b-acec-a8c178738415" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.160476 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.210936 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.227696 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228151 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-server" containerID="cri-o://c023313c2dc4acfa642739fd15b6dce1cdb2384c9ddf2f534c8c7e3a6317cc61" 
gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228533 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="swift-recon-cron" containerID="cri-o://25c15d7715d9e340868a2859dd1a58b1e7573ed255f28afe95351da547d45981" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228572 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="rsync" containerID="cri-o://53566cbe13e0554a3b86a64ff88413e4e2971a75f558b615fb405505c102f2c8" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228600 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-expirer" containerID="cri-o://d264c92684da056e428a6c733961f8fe92aa28e300c81f22e991287ca3bdb9ab" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228648 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-updater" containerID="cri-o://fb5859be41c2b9cb5338526464383b63d74381d26c7f59f26d24b5ff23c949cc" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228680 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-auditor" containerID="cri-o://60cb27177b8ee600ca0fa9b124dfd7d1119967c490af0614c1bb464ddeaa2707" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228709 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-replicator" containerID="cri-o://59c21e5e34c6d5dbf8de4d3d91b825d296eab38936e7d83529e745d3d2b3e2a5" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228736 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-server" containerID="cri-o://9308c7e5305215685af92a1087be86f06b8582716c763d7b3d73399d891e23fb" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228766 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-updater" containerID="cri-o://6a5682b6b00c488de2f354ed3629a706b621d15c96ef14b746c170ab2da520e2" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228797 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-auditor" containerID="cri-o://234735c49802543dffa54027fc58edd4cb277d91c2216d680f26f1186b80560f" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228826 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-replicator" 
containerID="cri-o://b7ae32b1a4f1e7f7fe7aad12eac5a1d984d90e41f390bb645d6ca4b2f630627a" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228867 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-server" containerID="cri-o://b60f5caf9f00c875e2dfdf1fc8cd6af041235cd0e56388a3a11dd402988b4307" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228897 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-reaper" containerID="cri-o://49574068fdaf73188ebe3b005b4e8a44cf73f4cfe484defd630ff27da3cfb6e0" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228928 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-auditor" containerID="cri-o://e61677d1c4df154307a801a3e62190f8fa8c59cdfba25fbc0e4949bc2af510ce" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.228955 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-replicator" containerID="cri-o://fc30b61ad3b670e8566e451c0bf77bed40bb4ae683b1002df810eb3e8a7c7e56" gracePeriod=30 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.232994 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.269441 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.302592 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: E0120 23:31:23.303048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6860c5ea-d056-44a7-baf9-89c835744a86" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.303062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6860c5ea-d056-44a7-baf9-89c835744a86" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.303258 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6860c5ea-d056-44a7-baf9-89c835744a86" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.303992 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.310226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.327686 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.329219 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.346508 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.371611 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.408580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409632 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvddh\" (UniqueName: \"kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.409832 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b49\" (UniqueName: \"kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.512359 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.555091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.555302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.555498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b49\" (UniqueName: \"kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.555562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.556056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.556120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvddh\" (UniqueName: \"kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.558782 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.561009 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.576654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.576703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.576742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.576750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.578085 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="rabbitmq" containerID="cri-o://a5f50faa8925170ab9b2766c285e845c40fa14a592054ea95170b7280822dae6" gracePeriod=604798 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.580600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvddh\" (UniqueName: \"kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.584941 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.599316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config\") pod \"neutron-7fc9646994-sp2l9\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.601396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b49\" (UniqueName: \"kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49\") pod \"nova-cell1-novncproxy-0\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.604702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.654945 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ltf\" (UniqueName: \"kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.658577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.693542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.717457 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle\") pod \"9936c7df-3635-4a9b-acec-a8c178738415\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfpr\" (UniqueName: \"kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr\") pod \"9936c7df-3635-4a9b-acec-a8c178738415\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data\") pod \"9936c7df-3635-4a9b-acec-a8c178738415\" (UID: \"9936c7df-3635-4a9b-acec-a8c178738415\") " Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.760939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.761040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.761088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.761139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ltf\" (UniqueName: \"kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 
23:31:23.779099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.783689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.783997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.785231 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.785662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.787087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ltf\" (UniqueName: \"kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf\") pod \"keystone-84588d847c-cnh5j\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.793845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr" (OuterVolumeSpecName: "kube-api-access-6zfpr") pod "9936c7df-3635-4a9b-acec-a8c178738415" (UID: "9936c7df-3635-4a9b-acec-a8c178738415"). InnerVolumeSpecName "kube-api-access-6zfpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.821430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9936c7df-3635-4a9b-acec-a8c178738415" (UID: "9936c7df-3635-4a9b-acec-a8c178738415"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.827033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data" (OuterVolumeSpecName: "config-data") pod "9936c7df-3635-4a9b-acec-a8c178738415" (UID: "9936c7df-3635-4a9b-acec-a8c178738415"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.863531 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfpr\" (UniqueName: \"kubernetes.io/projected/9936c7df-3635-4a9b-acec-a8c178738415-kube-api-access-6zfpr\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.863864 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.863878 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9936c7df-3635-4a9b-acec-a8c178738415-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.962805 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="rabbitmq" containerID="cri-o://b96f29c6412312302df952900fcc3ad294f799efc164332681b4bdeaa2e0b9e6" gracePeriod=604798 Jan 20 23:31:23 crc kubenswrapper[5030]: I0120 23:31:23.997545 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.013076 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6860c5ea-d056-44a7-baf9-89c835744a86" path="/var/lib/kubelet/pods/6860c5ea-d056-44a7-baf9-89c835744a86/volumes" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.187661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"9936c7df-3635-4a9b-acec-a8c178738415","Type":"ContainerDied","Data":"8e47adb9603263dec83e5e3cc5ab6c6c8ae22def4e5ad55b13e29694ce1bb22f"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.187705 5030 scope.go:117] "RemoveContainer" containerID="fdfa491c7a2eb603bdceb5aaf6cc99fee2799e677de3bd0ed64998869774b6c9" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.187812 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.200497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerStarted","Data":"cfb105ff10aa6a2752e27b961832094a2cba935c02c88987ebabcede7da0e5ca"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.200533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerStarted","Data":"74ff37a44373e19ae35efe5753ddbebcab1fa9f608cbdb06b3b70cdf0a4e7190"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.216315 5030 generic.go:334] "Generic (PLEG): container finished" podID="295f9631-4bac-43a2-9b71-f94209fa0046" containerID="99c989c805531e84aa4649956b12836dbd4fb418611a818efb7cce3c72c5eedd" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.216656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"295f9631-4bac-43a2-9b71-f94209fa0046","Type":"ContainerDied","Data":"99c989c805531e84aa4649956b12836dbd4fb418611a818efb7cce3c72c5eedd"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.236028 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.244913 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.248963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerStarted","Data":"c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.249009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerStarted","Data":"6c650f247290a40ecefbd23466ce8a8a0175d13bde2a6888e5bd093543ff4b53"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.272363 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:24 crc kubenswrapper[5030]: E0120 23:31:24.272867 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9936c7df-3635-4a9b-acec-a8c178738415" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.272880 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9936c7df-3635-4a9b-acec-a8c178738415" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.273080 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9936c7df-3635-4a9b-acec-a8c178738415" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.274080 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.284817 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.287282 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.291276 5030 generic.go:334] "Generic (PLEG): container finished" podID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerID="f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0" exitCode=143 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.291384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerDied","Data":"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.301358 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" podStartSLOduration=3.301343071 podStartE2EDuration="3.301343071s" podCreationTimestamp="2026-01-20 23:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:24.28763846 +0000 UTC m=+3356.607898738" watchObservedRunningTime="2026-01-20 23:31:24.301343071 +0000 UTC m=+3356.621603359" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.371813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374074 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="53566cbe13e0554a3b86a64ff88413e4e2971a75f558b615fb405505c102f2c8" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374096 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="d264c92684da056e428a6c733961f8fe92aa28e300c81f22e991287ca3bdb9ab" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374107 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="fb5859be41c2b9cb5338526464383b63d74381d26c7f59f26d24b5ff23c949cc" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374115 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="60cb27177b8ee600ca0fa9b124dfd7d1119967c490af0614c1bb464ddeaa2707" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374122 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="59c21e5e34c6d5dbf8de4d3d91b825d296eab38936e7d83529e745d3d2b3e2a5" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374130 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="9308c7e5305215685af92a1087be86f06b8582716c763d7b3d73399d891e23fb" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374137 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" 
containerID="6a5682b6b00c488de2f354ed3629a706b621d15c96ef14b746c170ab2da520e2" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374143 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="234735c49802543dffa54027fc58edd4cb277d91c2216d680f26f1186b80560f" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374150 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="b7ae32b1a4f1e7f7fe7aad12eac5a1d984d90e41f390bb645d6ca4b2f630627a" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374156 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="b60f5caf9f00c875e2dfdf1fc8cd6af041235cd0e56388a3a11dd402988b4307" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374163 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="49574068fdaf73188ebe3b005b4e8a44cf73f4cfe484defd630ff27da3cfb6e0" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374169 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="e61677d1c4df154307a801a3e62190f8fa8c59cdfba25fbc0e4949bc2af510ce" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374175 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="fc30b61ad3b670e8566e451c0bf77bed40bb4ae683b1002df810eb3e8a7c7e56" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374181 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="c023313c2dc4acfa642739fd15b6dce1cdb2384c9ddf2f534c8c7e3a6317cc61" exitCode=0 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374212 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"53566cbe13e0554a3b86a64ff88413e4e2971a75f558b615fb405505c102f2c8"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"d264c92684da056e428a6c733961f8fe92aa28e300c81f22e991287ca3bdb9ab"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"fb5859be41c2b9cb5338526464383b63d74381d26c7f59f26d24b5ff23c949cc"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"60cb27177b8ee600ca0fa9b124dfd7d1119967c490af0614c1bb464ddeaa2707"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"59c21e5e34c6d5dbf8de4d3d91b825d296eab38936e7d83529e745d3d2b3e2a5"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"9308c7e5305215685af92a1087be86f06b8582716c763d7b3d73399d891e23fb"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"6a5682b6b00c488de2f354ed3629a706b621d15c96ef14b746c170ab2da520e2"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"234735c49802543dffa54027fc58edd4cb277d91c2216d680f26f1186b80560f"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374294 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"b7ae32b1a4f1e7f7fe7aad12eac5a1d984d90e41f390bb645d6ca4b2f630627a"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"b60f5caf9f00c875e2dfdf1fc8cd6af041235cd0e56388a3a11dd402988b4307"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374311 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"49574068fdaf73188ebe3b005b4e8a44cf73f4cfe484defd630ff27da3cfb6e0"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"e61677d1c4df154307a801a3e62190f8fa8c59cdfba25fbc0e4949bc2af510ce"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"fc30b61ad3b670e8566e451c0bf77bed40bb4ae683b1002df810eb3e8a7c7e56"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.374340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"c023313c2dc4acfa642739fd15b6dce1cdb2384c9ddf2f534c8c7e3a6317cc61"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.375935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" event={"ID":"c20f9105-8569-4c92-98f9-dd35cc9d43a9","Type":"ContainerStarted","Data":"1275f690feea00793157e12c75f8deef238efc5fe1e7db14cbe0b3879432536f"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.375957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" event={"ID":"c20f9105-8569-4c92-98f9-dd35cc9d43a9","Type":"ContainerStarted","Data":"3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.376053 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" 
podUID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" containerName="keystone-api" containerID="cri-o://1275f690feea00793157e12c75f8deef238efc5fe1e7db14cbe0b3879432536f" gracePeriod=30 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.376529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.384937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wp96\" (UniqueName: \"kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.385094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.385118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.401563 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerStarted","Data":"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.401601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerStarted","Data":"b78656f45017feef1a2b7ad4852c2dc7a0590f527f6edad24908c341f2f9f7cc"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.402493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerStarted","Data":"980840455d95fe822b68afa33dbee4e1d193ae82fd6f5af390afd022819a58a3"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.402508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerStarted","Data":"843eec708f845d1a5dd4e5589d6ea3b37d05f1d00054a95c32e47588c8472c8d"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.407869 5030 generic.go:334] "Generic (PLEG): container finished" podID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerID="64bde24ce641ac620803ec772b33819c4f11068e6abd5447571283350997935d" exitCode=143 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.407912 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerDied","Data":"64bde24ce641ac620803ec772b33819c4f11068e6abd5447571283350997935d"} Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.444690 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.444981 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener-log" containerID="cri-o://39015b910f2f39e56b985ca85a7e5d5c7980fd4ff906f21ce55154bbcc3bfbc9" gracePeriod=30 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.445362 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener" containerID="cri-o://ec590bd73705be339baa9b7c1b90768301a4f289003bd0b2191fc3d357538be2" gracePeriod=30 Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.479597 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" podStartSLOduration=3.479582536 podStartE2EDuration="3.479582536s" podCreationTimestamp="2026-01-20 23:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:24.389377432 +0000 UTC m=+3356.709637720" watchObservedRunningTime="2026-01-20 23:31:24.479582536 +0000 UTC m=+3356.799842824" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.487673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.487798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wp96\" (UniqueName: \"kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.487868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.495981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.497157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.521934 5030 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6ldn5"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.536114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wp96\" (UniqueName: \"kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96\") pod \"nova-cell1-conductor-0\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.556877 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6ldn5"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.636439 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.654224 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-sf2nd"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.655504 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dz7\" (UniqueName: \"kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.694852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: 
\"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.710743 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.758324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-sf2nd"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.797787 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.799064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.801825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.801956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.802019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.802042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.802084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dz7\" (UniqueName: \"kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.808430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.810341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.811249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.816381 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.817370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.831670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.838329 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.844782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dz7\" (UniqueName: \"kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7\") pod \"keystone-bootstrap-sf2nd\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.903122 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.904294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.904373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.904413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.904448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lt2w\" (UniqueName: \"kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.904480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.905009 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:24 crc kubenswrapper[5030]: I0120 23:31:24.978998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.007354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data\") pod \"295f9631-4bac-43a2-9b71-f94209fa0046\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.007560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx8sx\" (UniqueName: \"kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx\") pod \"295f9631-4bac-43a2-9b71-f94209fa0046\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.007585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config\") pod \"295f9631-4bac-43a2-9b71-f94209fa0046\" (UID: \"295f9631-4bac-43a2-9b71-f94209fa0046\") " Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.007913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.007967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.008009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lt2w\" (UniqueName: \"kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.008058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.008171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.011979 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data" (OuterVolumeSpecName: "config-data") pod "295f9631-4bac-43a2-9b71-f94209fa0046" (UID: "295f9631-4bac-43a2-9b71-f94209fa0046"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.016337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.023608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.024193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.024353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx" (OuterVolumeSpecName: "kube-api-access-jx8sx") pod "295f9631-4bac-43a2-9b71-f94209fa0046" (UID: "295f9631-4bac-43a2-9b71-f94209fa0046"). InnerVolumeSpecName "kube-api-access-jx8sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.024890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "295f9631-4bac-43a2-9b71-f94209fa0046" (UID: "295f9631-4bac-43a2-9b71-f94209fa0046"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.035212 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:31:25 crc kubenswrapper[5030]: E0120 23:31:25.035815 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" containerName="memcached" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.035831 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" containerName="memcached" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.036065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" containerName="memcached" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.037471 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.041346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.081012 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.097498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lt2w\" (UniqueName: \"kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w\") pod \"barbican-keystone-listener-6f8d945766-px9fh\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56pl2\" (UniqueName: \"kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110330 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:25 crc 
kubenswrapper[5030]: I0120 23:31:25.110342 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx8sx\" (UniqueName: \"kubernetes.io/projected/295f9631-4bac-43a2-9b71-f94209fa0046-kube-api-access-jx8sx\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.110352 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/295f9631-4bac-43a2-9b71-f94209fa0046-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.128764 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.130371 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.159941 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.216786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56pl2\" (UniqueName: \"kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.219868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.220027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.220542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.229579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.230112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.246476 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.248829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.334757 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.336674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.348813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.349037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.349090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.349207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs95g\" (UniqueName: \"kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.370931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.371394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56pl2\" (UniqueName: \"kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " 
pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.382928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle\") pod \"barbican-worker-7cdcf65d89-xwjmn\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.393284 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.451425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.451749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.451812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs95g\" (UniqueName: \"kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.451861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.451904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.453394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.464603 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.465263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.465311 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.468269 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.482716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.483460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs95g\" (UniqueName: \"kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g\") pod \"placement-799d455fb6-7jld8\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.495354 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.508116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6c37973c-d677-4481-b1cf-7a5183b904bc","Type":"ContainerStarted","Data":"0e54723f6444752205278a26f0fdd96906e1fd825eb4831b37817b13229742fc"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.508681 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.520349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" event={"ID":"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4","Type":"ContainerStarted","Data":"7f8aa711319815580fd9a2344e5e5ec25bc08dbcc14cc5368c5872a1bee7b59f"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.531526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.531919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerStarted","Data":"3a86e44dec0c7e253766c39f87b1ea44710eab1fd6d34ec1fc85a0990e2a59f0"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.532099 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.532139 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.541255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"295f9631-4bac-43a2-9b71-f94209fa0046","Type":"ContainerDied","Data":"2f110cca5615a048bca2ed2ff0d4c1eb389f92ad830ae59ef99f3a1cff681dd8"} Jan 20 23:31:25 
crc kubenswrapper[5030]: I0120 23:31:25.541298 5030 scope.go:117] "RemoveContainer" containerID="99c989c805531e84aa4649956b12836dbd4fb418611a818efb7cce3c72c5eedd" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.541416 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.548867 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.550441 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.557714 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.575763 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b822da8-631f-47ee-8874-dfa862386ad2" containerID="39015b910f2f39e56b985ca85a7e5d5c7980fd4ff906f21ce55154bbcc3bfbc9" exitCode=143 Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.575827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerDied","Data":"39015b910f2f39e56b985ca85a7e5d5c7980fd4ff906f21ce55154bbcc3bfbc9"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.584733 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.183:8775/\": read tcp 10.217.0.2:40058->10.217.1.183:8775: read: connection reset by peer" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.584907 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.183:8775/\": read tcp 10.217.0.2:40052->10.217.1.183:8775: read: connection reset by peer" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.584939 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.607892 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.145:8776/healthcheck\": read tcp 10.217.0.2:39148->10.217.1.145:8776: read: connection reset by peer" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.613339 5030 generic.go:334] "Generic (PLEG): container finished" podID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerID="b8f27ffed3eee51863138cee9e48d260bc090210cbb06cbfeed0342e5a7050be" exitCode=0 Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.613410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerDied","Data":"b8f27ffed3eee51863138cee9e48d260bc090210cbb06cbfeed0342e5a7050be"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.615265 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.619411 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerID="b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" exitCode=0 Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.619462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6bf49fb-3a1f-4a62-9340-a40608e2d844","Type":"ContainerDied","Data":"b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.631698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerStarted","Data":"116e40c4f860de9e142f95412a9597d3a3d7a81c4b7919eb25a5a0d68703a63f"} Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.643098 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:31:25 crc kubenswrapper[5030]: E0120 23:31:25.651822 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839 is running failed: container process not found" containerID="b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.654716 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" podStartSLOduration=4.654704152 podStartE2EDuration="4.654704152s" podCreationTimestamp="2026-01-20 23:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:25.575020403 +0000 UTC m=+3357.895280691" watchObservedRunningTime="2026-01-20 23:31:25.654704152 +0000 UTC m=+3357.974964440" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.656922 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.656988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjl7\" (UniqueName: \"kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.657048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658289 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtfm\" (UniqueName: \"kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658337 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.658427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: E0120 23:31:25.668976 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839 is running failed: container process not found" containerID="b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:31:25 crc kubenswrapper[5030]: E0120 23:31:25.672227 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839 is running failed: container process not found" containerID="b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:31:25 crc kubenswrapper[5030]: E0120 23:31:25.672269 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839 is running 
failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerName="nova-scheduler-scheduler" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjl7\" (UniqueName: \"kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtfm\" (UniqueName: \"kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.760504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.766049 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.766762 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.768258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.780983 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.790673 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.790712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.793666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data\") pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.794278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjl7\" (UniqueName: \"kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7\") pod \"neutron-7b5f97974b-77qrc\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.800196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtfm\" (UniqueName: \"kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm\") 
pod \"barbican-api-dd9dbdd6-tcp92\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.989139 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9936c7df-3635-4a9b-acec-a8c178738415" path="/var/lib/kubelet/pods/9936c7df-3635-4a9b-acec-a8c178738415/volumes" Jan 20 23:31:25 crc kubenswrapper[5030]: I0120 23:31:25.990106 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd28027-576f-4b84-aed2-3ddcedb5bab3" path="/var/lib/kubelet/pods/ddd28027-576f-4b84-aed2-3ddcedb5bab3/volumes" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.253524 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.253876 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-central-agent" containerID="cri-o://3bfb3d28c0e12ae18efaa28d415459489a4a00796c95465b198516c16c2dc578" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.254346 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="proxy-httpd" containerID="cri-o://e7c6be4bd549d34d96b35ea32ddf2d9e56b7140e18ff6672de159cff243a514f" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.254486 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="sg-core" containerID="cri-o://7185c5fb05b2a58b278c176c1f7c6feba52b9f23788ceb4dae8035f37f075ebe" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.254592 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-notification-agent" containerID="cri-o://d8018bf3bade68fa64b2765cfd6560c427fb070f2f81ee36252a4716114b59ee" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: E0120 23:31:26.465792 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice/crio-conmon-25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice/crio-25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36edc87c_d811_414a_8630_86c81093bdbf.slice/crio-51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.654590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d6bf49fb-3a1f-4a62-9340-a40608e2d844","Type":"ContainerDied","Data":"c67cad9dedfea3af06bf566a87b5352eaf9bd53006d7cdeb5d315d7a96a253d6"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.657009 5030 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c67cad9dedfea3af06bf566a87b5352eaf9bd53006d7cdeb5d315d7a96a253d6" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.677801 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.683690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerStarted","Data":"60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.683831 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker-log" containerID="cri-o://980840455d95fe822b68afa33dbee4e1d193ae82fd6f5af390afd022819a58a3" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.684207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker" containerID="cri-o://60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599" gracePeriod=30 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.694542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.705383 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.709685 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerID="25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df" exitCode=0 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.709738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerDied","Data":"25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.709759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"ff299fc0-074c-4420-8ac5-030bebd0c385","Type":"ContainerDied","Data":"27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.709771 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.730665 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.746248 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.747545 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.749148 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.750271 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.750971 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n56b6" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.756253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.764656 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" podStartSLOduration=5.764637991 podStartE2EDuration="5.764637991s" podCreationTimestamp="2026-01-20 23:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:26.70885778 +0000 UTC m=+3359.029118068" watchObservedRunningTime="2026-01-20 23:31:26.764637991 +0000 UTC m=+3359.084898279" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.800581 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.800728 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.800878 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.800901 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.801455 5030 generic.go:334] "Generic (PLEG): container finished" podID="36edc87c-d811-414a-8630-86c81093bdbf" containerID="51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b" exitCode=0 Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.801530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerDied","Data":"51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.801556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"36edc87c-d811-414a-8630-86c81093bdbf","Type":"ContainerDied","Data":"6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf"} Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.801569 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d70c12ba776d0c0aead021878157f3d30644db9645995dc386797eda4d22bbf" Jan 20 23:31:26 crc kubenswrapper[5030]: I0120 23:31:26.801817 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.812324 5030 generic.go:334] "Generic (PLEG): container finished" podID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerID="50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3" exitCode=0 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.812441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerDied","Data":"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.812465 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.812483 5030 scope.go:117] "RemoveContainer" containerID="50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.812471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1","Type":"ContainerDied","Data":"9f48055581cfa836ec976d39c83be75bc5d8e68e69f74bd43ee85c063d08641f"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.850050 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.896692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data\") pod \"ff299fc0-074c-4420-8ac5-030bebd0c385\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.896725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs\") pod \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898515 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs\") pod \"ff299fc0-074c-4420-8ac5-030bebd0c385\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle\") pod \"ff299fc0-074c-4420-8ac5-030bebd0c385\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " 
Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898666 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs" (OuterVolumeSpecName: "logs") pod "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" (UID: "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data\") pod \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7zv\" (UniqueName: \"kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.898842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle\") pod \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbwq\" (UniqueName: \"kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899161 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899188 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n9qk\" (UniqueName: \"kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk\") pod 
\"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data\") pod \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\" (UID: \"9fdd777a-3f61-4d1d-80e6-b508ce72ffc1\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom\") pod \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\" (UID: \"1df9d9fc-b1ae-420d-8d8a-1c453cb77106\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssg2g\" (UniqueName: \"kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g\") pod \"ff299fc0-074c-4420-8ac5-030bebd0c385\" (UID: \"ff299fc0-074c-4420-8ac5-030bebd0c385\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58bdc\" (UniqueName: \"kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc\") pod \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle\") pod \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\" (UID: \"d6bf49fb-3a1f-4a62-9340-a40608e2d844\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.899501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts\") pod \"36edc87c-d811-414a-8630-86c81093bdbf\" (UID: \"36edc87c-d811-414a-8630-86c81093bdbf\") " Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data\") pod \"memcached-0\" 
(UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx8n\" (UniqueName: \"kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.900884 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.902350 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.902578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"1df9d9fc-b1ae-420d-8d8a-1c453cb77106","Type":"ContainerDied","Data":"3abf721ba5637f0d56a9b0207f2737ad4bcd5e9e876b5279f33392be61081b9a"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.902752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.904310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.909527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs" (OuterVolumeSpecName: "logs") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.911603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs" (OuterVolumeSpecName: "logs") pod "ff299fc0-074c-4420-8ac5-030bebd0c385" (UID: "ff299fc0-074c-4420-8ac5-030bebd0c385"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.913460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq" (OuterVolumeSpecName: "kube-api-access-gqbwq") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "kube-api-access-gqbwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.916699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.916882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk" (OuterVolumeSpecName: "kube-api-access-2n9qk") pod "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" (UID: "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1"). InnerVolumeSpecName "kube-api-access-2n9qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.936482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerStarted","Data":"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.936685 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-api" containerID="cri-o://c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32" gracePeriod=30 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.937989 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.938027 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-httpd" containerID="cri-o://494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568" gracePeriod=30 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.953722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.953773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc" (OuterVolumeSpecName: "kube-api-access-58bdc") pod "d6bf49fb-3a1f-4a62-9340-a40608e2d844" (UID: "d6bf49fb-3a1f-4a62-9340-a40608e2d844"). InnerVolumeSpecName "kube-api-access-58bdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.977075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv" (OuterVolumeSpecName: "kube-api-access-2b7zv") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "kube-api-access-2b7zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.984104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff299fc0-074c-4420-8ac5-030bebd0c385" (UID: "ff299fc0-074c-4420-8ac5-030bebd0c385"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.985452 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.985502 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts" (OuterVolumeSpecName: "scripts") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.985583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g" (OuterVolumeSpecName: "kube-api-access-ssg2g") pod "ff299fc0-074c-4420-8ac5-030bebd0c385" (UID: "ff299fc0-074c-4420-8ac5-030bebd0c385"). InnerVolumeSpecName "kube-api-access-ssg2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.985907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts" (OuterVolumeSpecName: "scripts") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.989423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6c37973c-d677-4481-b1cf-7a5183b904bc","Type":"ContainerStarted","Data":"e70c54687a33367df40bfc627eae4a59f6c05998f88b62c110f7442a47b30c24"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:26.994469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6bf49fb-3a1f-4a62-9340-a40608e2d844" (UID: "d6bf49fb-3a1f-4a62-9340-a40608e2d844"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.003402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.003894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.003929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.003956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx8n\" (UniqueName: \"kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.003996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004011 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004083 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004094 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssg2g\" (UniqueName: \"kubernetes.io/projected/ff299fc0-074c-4420-8ac5-030bebd0c385-kube-api-access-ssg2g\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004105 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58bdc\" (UniqueName: \"kubernetes.io/projected/d6bf49fb-3a1f-4a62-9340-a40608e2d844-kube-api-access-58bdc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004115 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36edc87c-d811-414a-8630-86c81093bdbf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004125 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004136 5030 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36edc87c-d811-414a-8630-86c81093bdbf-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004143 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004151 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff299fc0-074c-4420-8ac5-030bebd0c385-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004161 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004169 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004177 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004185 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004193 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7zv\" (UniqueName: \"kubernetes.io/projected/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-kube-api-access-2b7zv\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004201 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbwq\" (UniqueName: \"kubernetes.io/projected/36edc87c-d811-414a-8630-86c81093bdbf-kube-api-access-gqbwq\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.004210 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n9qk\" (UniqueName: \"kubernetes.io/projected/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-kube-api-access-2n9qk\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.005982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.006444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.016373 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-sf2nd"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.023664 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data" (OuterVolumeSpecName: "config-data") pod "ff299fc0-074c-4420-8ac5-030bebd0c385" (UID: "ff299fc0-074c-4420-8ac5-030bebd0c385"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026371 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerID="e7c6be4bd549d34d96b35ea32ddf2d9e56b7140e18ff6672de159cff243a514f" exitCode=0 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026392 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerID="7185c5fb05b2a58b278c176c1f7c6feba52b9f23788ceb4dae8035f37f075ebe" exitCode=2 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerDied","Data":"e7c6be4bd549d34d96b35ea32ddf2d9e56b7140e18ff6672de159cff243a514f"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerDied","Data":"7185c5fb05b2a58b278c176c1f7c6feba52b9f23788ceb4dae8035f37f075ebe"} Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026558 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api-log" containerID="cri-o://cfb105ff10aa6a2752e27b961832094a2cba935c02c88987ebabcede7da0e5ca" gracePeriod=30 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.026651 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api" containerID="cri-o://3a86e44dec0c7e253766c39f87b1ea44710eab1fd6d34ec1fc85a0990e2a59f0" gracePeriod=30 Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.044032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.049579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.060754 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" podStartSLOduration=6.060735098 podStartE2EDuration="6.060735098s" podCreationTimestamp="2026-01-20 23:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:27.029435811 +0000 UTC m=+3359.349696099" watchObservedRunningTime="2026-01-20 23:31:27.060735098 +0000 UTC m=+3359.380995376" Jan 20 
23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.081529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx8n\" (UniqueName: \"kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n\") pod \"memcached-0\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.089011 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=4.088994552 podStartE2EDuration="4.088994552s" podCreationTimestamp="2026-01-20 23:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:27.059268973 +0000 UTC m=+3359.379529261" watchObservedRunningTime="2026-01-20 23:31:27.088994552 +0000 UTC m=+3359.409254840" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.107028 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff299fc0-074c-4420-8ac5-030bebd0c385-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.125707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.140139 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.175668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" (UID: "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176128 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerName="nova-scheduler-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176503 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerName="nova-scheduler-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176528 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="cinder-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176534 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="cinder-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176545 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176551 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176559 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176564 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176585 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176596 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176602 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176634 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176640 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176654 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="probe" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176660 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="probe" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.176669 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-api" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176674 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-api" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176851 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="probe" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176867 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176877 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176889 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" containerName="nova-api-api" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176899 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" containerName="nova-scheduler-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176914 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36edc87c-d811-414a-8630-86c81093bdbf" containerName="cinder-api" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176924 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-metadata" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176934 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" containerName="cinder-scheduler" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.176944 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" containerName="nova-metadata-log" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.177885 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.187978 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.188183 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.205330 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.221162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data" (OuterVolumeSpecName: "config-data") pod "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" (UID: "9fdd777a-3f61-4d1d-80e6-b508ce72ffc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.226779 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.226801 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.262717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data" (OuterVolumeSpecName: "config-data") pod "d6bf49fb-3a1f-4a62-9340-a40608e2d844" (UID: "d6bf49fb-3a1f-4a62-9340-a40608e2d844"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.272746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.316854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxtf\" (UniqueName: \"kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329413 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6bf49fb-3a1f-4a62-9340-a40608e2d844-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329424 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.329433 5030 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.334364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data" (OuterVolumeSpecName: "config-data") pod "36edc87c-d811-414a-8630-86c81093bdbf" (UID: "36edc87c-d811-414a-8630-86c81093bdbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.369831 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxtf\" (UniqueName: \"kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.431999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" 
Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.432046 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36edc87c-d811-414a-8630-86c81093bdbf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.433278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.437457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data" (OuterVolumeSpecName: "config-data") pod "1df9d9fc-b1ae-420d-8d8a-1c453cb77106" (UID: "1df9d9fc-b1ae-420d-8d8a-1c453cb77106"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.440240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.451261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.461325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.470256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.474240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxtf\" (UniqueName: \"kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.476147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle\") pod \"placement-d9f464b66-s2shz\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.536831 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1df9d9fc-b1ae-420d-8d8a-1c453cb77106-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.857267 5030 scope.go:117] "RemoveContainer" containerID="d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.912878 5030 scope.go:117] "RemoveContainer" containerID="50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.913791 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3\": container with ID starting with 50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3 not found: ID does not exist" containerID="50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.913832 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3"} err="failed to get container status \"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3\": rpc error: code = NotFound desc = could not find container \"50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3\": container with ID starting with 50f7830c863df6553b3aedb3541412b57d1b96e325097a32a008245e2d15caf3 not found: ID does not exist" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.913859 5030 scope.go:117] "RemoveContainer" containerID="d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118" Jan 20 23:31:27 crc kubenswrapper[5030]: E0120 23:31:27.914201 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118\": container with ID starting with d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118 not found: ID does not exist" containerID="d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.914224 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118"} err="failed to get container status \"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118\": rpc error: code = NotFound desc = could not find container \"d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118\": container with ID starting with d2040a6e25a52cf56fd194763cb8ba952aafe253c5f4e0b2ef114a3c8448c118 not found: ID does not exist" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.914237 5030 scope.go:117] "RemoveContainer" containerID="585ae5ff44f886ced04efde6a25158e4b15847e87657330c8e6e321d18ce82b3" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.930587 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:27 crc kubenswrapper[5030]: I0120 23:31:27.931614 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.018507 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" path="/var/lib/kubelet/pods/295f9631-4bac-43a2-9b71-f94209fa0046/volumes" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.050415 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.050490 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.050507 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.050519 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.052513 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.052536 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.052548 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.052555 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.053005 5030 scope.go:117] "RemoveContainer" containerID="b8f27ffed3eee51863138cee9e48d260bc090210cbb06cbfeed0342e5a7050be" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.054240 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.054273 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.070234 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.074515 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.083577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.083729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.083795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.083846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.083957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.084001 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.084059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.088376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.088488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs" (OuterVolumeSpecName: "logs") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.093364 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.111185 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.148725 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.150420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.155659 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74" (OuterVolumeSpecName: "kube-api-access-7bx74") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "kube-api-access-7bx74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.157369 5030 generic.go:334] "Generic (PLEG): container finished" podID="52fa921e-8a73-400d-9f48-50876a4765f4" containerID="980840455d95fe822b68afa33dbee4e1d193ae82fd6f5af390afd022819a58a3" exitCode=143 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.172756 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerDied","Data":"980840455d95fe822b68afa33dbee4e1d193ae82fd6f5af390afd022819a58a3"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.172905 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.181721 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.191435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts" (OuterVolumeSpecName: "scripts") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.204253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.204567 5030 generic.go:334] "Generic (PLEG): container finished" podID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" containerID="53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0" exitCode=0 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.204684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerDied","Data":"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.204712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"03f8ffc8-cadf-4aab-ab46-71163f0339d9","Type":"ContainerDied","Data":"a41a6866c28b712a4ddfd1b80a16b93b2eb82d94b1aee8b4feddb018dd384238"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.204822 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.209309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: W0120 23:31:28.209533 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/03f8ffc8-cadf-4aab-ab46-71163f0339d9/volumes/kubernetes.io~projected/kube-api-access-7bx74 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.209892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74" (OuterVolumeSpecName: "kube-api-access-7bx74") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "kube-api-access-7bx74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.223212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: W0120 23:31:28.225164 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/03f8ffc8-cadf-4aab-ab46-71163f0339d9/volumes/kubernetes.io~secret/scripts Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts" (OuterVolumeSpecName: "scripts") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\" (UID: \"03f8ffc8-cadf-4aab-ab46-71163f0339d9\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4xn\" (UniqueName: \"kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.225997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.226042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.226060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2sfm\" (UniqueName: \"kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.226078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 
23:31:28.227883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.232930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: W0120 23:31:28.234138 5030 mount_helper_common.go:34] Warning: mount cleanup skipped because path does not exist: /var/lib/kubelet/pods/03f8ffc8-cadf-4aab-ab46-71163f0339d9/volumes/kubernetes.io~local-volume/local-storage06-crc Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.234151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236780 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236817 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236831 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236845 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bx74\" (UniqueName: \"kubernetes.io/projected/03f8ffc8-cadf-4aab-ab46-71163f0339d9-kube-api-access-7bx74\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236855 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03f8ffc8-cadf-4aab-ab46-71163f0339d9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.236863 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.237293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" event={"ID":"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4","Type":"ContainerStarted","Data":"d33518e830a3f7bd93cdb4cab2e8d31c3450485a686941e871eb32f2b839ba27"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.237355 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.241943 5030 generic.go:334] "Generic (PLEG): container finished" podID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerID="494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568" exitCode=0 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.241996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerDied","Data":"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.253013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerStarted","Data":"5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.255724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerStarted","Data":"9b7f98334b8d850db2bf7cda190144b8a285e2d255aae8325a62953d1f4ca1dd"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.264128 5030 generic.go:334] "Generic (PLEG): container finished" podID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerID="16c5cb22986bc3b811996c0ba29d425238d57e0b2b669a5b412626aeb83f8fda" exitCode=0 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.264205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerDied","Data":"16c5cb22986bc3b811996c0ba29d425238d57e0b2b669a5b412626aeb83f8fda"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.268451 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerID="3a86e44dec0c7e253766c39f87b1ea44710eab1fd6d34ec1fc85a0990e2a59f0" exitCode=0 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.268493 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerID="cfb105ff10aa6a2752e27b961832094a2cba935c02c88987ebabcede7da0e5ca" exitCode=143 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.268558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerDied","Data":"3a86e44dec0c7e253766c39f87b1ea44710eab1fd6d34ec1fc85a0990e2a59f0"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.268584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerDied","Data":"cfb105ff10aa6a2752e27b961832094a2cba935c02c88987ebabcede7da0e5ca"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.269888 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.283137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data" (OuterVolumeSpecName: "config-data") pod "03f8ffc8-cadf-4aab-ab46-71163f0339d9" (UID: "03f8ffc8-cadf-4aab-ab46-71163f0339d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.287760 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerID="3bfb3d28c0e12ae18efaa28d415459489a4a00796c95465b198516c16c2dc578" exitCode=0 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.287867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerDied","Data":"3bfb3d28c0e12ae18efaa28d415459489a4a00796c95465b198516c16c2dc578"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.292234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138","Type":"ContainerStarted","Data":"84edb744cac90df709d3e319b064c1e917012698257aa60d1a82f95a44480254"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.295833 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.299346 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerStarted","Data":"8b8128a3668934c19fbac0e0ca235b70c80be6230b83cb5d307540df05d8ebb6"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.304086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" event={"ID":"0db48c6e-e94b-4a4c-bc75-70ee65bded48","Type":"ContainerStarted","Data":"1b08250fb71c5960b241c61b26a731ce793dea6ee47186122cb8df37bec45e0d"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.310472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerStarted","Data":"a2f1a7881e425dd7a9753c2d65901ac704440ad9d401de953b810127bc0896a5"} Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.310528 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.310565 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.310700 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6c37973c-d677-4481-b1cf-7a5183b904bc" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e70c54687a33367df40bfc627eae4a59f6c05998f88b62c110f7442a47b30c24" gracePeriod=30 Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.311708 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2sfm\" (UniqueName: \"kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4xn\" (UniqueName: \"kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.338983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.339036 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.339048 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03f8ffc8-cadf-4aab-ab46-71163f0339d9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.340507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.341957 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.343734 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.351183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.351368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.352354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.354732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.355791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.356009 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.357352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4xn\" (UniqueName: \"kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn\") pod \"nova-api-0\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.362943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2sfm\" (UniqueName: \"kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm\") pod \"cinder-scheduler-0\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.364686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.469024 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" podStartSLOduration=5.469005688 podStartE2EDuration="5.469005688s" podCreationTimestamp="2026-01-20 23:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:28.298941372 +0000 UTC m=+3360.619201650" watchObservedRunningTime="2026-01-20 23:31:28.469005688 +0000 UTC m=+3360.789265976" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.491164 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.656151 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.729936 5030 scope.go:117] "RemoveContainer" containerID="53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.756328 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.781195 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.860592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.860677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.860789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6khbk\" (UniqueName: \"kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.860872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.860900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.861007 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.861048 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\" (UID: \"8324c6f6-da57-40dc-a9f3-ce99a41078e6\") " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.862334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.862607 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs" (OuterVolumeSpecName: "logs") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.866313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts" (OuterVolumeSpecName: "scripts") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.867349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.868046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk" (OuterVolumeSpecName: "kube-api-access-6khbk") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "kube-api-access-6khbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.881473 5030 scope.go:117] "RemoveContainer" containerID="f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.903826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.927849 5030 scope.go:117] "RemoveContainer" containerID="53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.930574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.933001 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0\": container with ID starting with 53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0 not found: ID does not exist" containerID="53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.933050 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0"} err="failed to get container status \"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0\": rpc error: code = NotFound desc = could not find container \"53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0\": container with ID starting with 53982c449caa05c2e8194b1dac6d69dabe8a0b162f8d8436686b33b0358d19c0 not found: ID does not exist" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.933072 5030 scope.go:117] "RemoveContainer" containerID="f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.936110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data" (OuterVolumeSpecName: "config-data") pod "8324c6f6-da57-40dc-a9f3-ce99a41078e6" (UID: "8324c6f6-da57-40dc-a9f3-ce99a41078e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.938484 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0\": container with ID starting with f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0 not found: ID does not exist" containerID="f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.938520 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0"} err="failed to get container status \"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0\": rpc error: code = NotFound desc = could not find container \"f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0\": container with ID starting with f8e64917c4556d4d40e2c4d864f098fcf9cbc495e12808995f36687c0b2205f0 not found: ID does not exist" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.960847 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963019 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963067 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963082 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6khbk\" (UniqueName: \"kubernetes.io/projected/8324c6f6-da57-40dc-a9f3-ce99a41078e6-kube-api-access-6khbk\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963092 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8324c6f6-da57-40dc-a9f3-ce99a41078e6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963102 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963112 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8324c6f6-da57-40dc-a9f3-ce99a41078e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.963140 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.970848 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.971256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.971273 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: E0120 23:31:28.971291 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.971297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.971488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-httpd" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.971515 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" containerName="glance-log" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.972518 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.975263 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.993320 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:28 crc kubenswrapper[5030]: I0120 23:31:28.995681 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.009987 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.013764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.031073 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.055676 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom\") pod \"a9545469-c146-486f-9c79-b4a97f0f5f83\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data\") pod \"a9545469-c146-486f-9c79-b4a97f0f5f83\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs\") pod \"a9545469-c146-486f-9c79-b4a97f0f5f83\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066378 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle\") pod \"a9545469-c146-486f-9c79-b4a97f0f5f83\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7tw5\" (UniqueName: \"kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5\") pod \"a9545469-c146-486f-9c79-b4a97f0f5f83\" (UID: \"a9545469-c146-486f-9c79-b4a97f0f5f83\") " Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcpmc\" (UniqueName: \"kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066879 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.066947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.067071 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.068563 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.069476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs" (OuterVolumeSpecName: "logs") pod "a9545469-c146-486f-9c79-b4a97f0f5f83" (UID: "a9545469-c146-486f-9c79-b4a97f0f5f83"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.108940 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: E0120 23:31:29.110221 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api-log" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.110352 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api-log" Jan 20 23:31:29 crc kubenswrapper[5030]: E0120 23:31:29.110454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.110533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.111069 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.111166 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" containerName="barbican-api-log" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.117971 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.121866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.145439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9545469-c146-486f-9c79-b4a97f0f5f83" (UID: "a9545469-c146-486f-9c79-b4a97f0f5f83"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.145823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5" (OuterVolumeSpecName: "kube-api-access-t7tw5") pod "a9545469-c146-486f-9c79-b4a97f0f5f83" (UID: "a9545469-c146-486f-9c79-b4a97f0f5f83"). InnerVolumeSpecName "kube-api-access-t7tw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.149673 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.166904 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168179 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168263 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8rgw\" (UniqueName: \"kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcpmc\" (UniqueName: \"kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168394 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 
23:31:29.168460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168569 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9545469-c146-486f-9c79-b4a97f0f5f83-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168583 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7tw5\" (UniqueName: \"kubernetes.io/projected/a9545469-c146-486f-9c79-b4a97f0f5f83-kube-api-access-t7tw5\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.168596 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.169228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.176391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.180401 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.185256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.201733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.213678 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.215333 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.218144 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.220703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.244206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcpmc\" (UniqueName: \"kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc\") pod \"nova-metadata-0\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.263972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcjxx\" (UniqueName: \"kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8rgw\" (UniqueName: \"kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274328 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274437 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.274460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.280882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.280949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data\") pod \"cinder-api-0\" (UID: 
\"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.291183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.294430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.300387 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.304770 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.309952 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.313589 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.314786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8rgw\" (UniqueName: \"kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.321167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts\") pod \"cinder-api-0\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.325254 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.347469 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="295f9631-4bac-43a2-9b71-f94209fa0046" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.105:11211: i/o timeout" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.355098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" event={"ID":"a9545469-c146-486f-9c79-b4a97f0f5f83","Type":"ContainerDied","Data":"74ff37a44373e19ae35efe5753ddbebcab1fa9f608cbdb06b3b70cdf0a4e7190"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.355141 5030 scope.go:117] "RemoveContainer" containerID="3a86e44dec0c7e253766c39f87b1ea44710eab1fd6d34ec1fc85a0990e2a59f0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.355291 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.360001 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerStarted","Data":"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.368633 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerStarted","Data":"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.369684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138","Type":"ContainerStarted","Data":"67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.371213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.371661 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcjxx\" (UniqueName: \"kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.375989 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4lg\" (UniqueName: \"kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.376020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.376044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.376071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.376092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.376904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.377229 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.379604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.380798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"4dc2869a-ba69-4466-aa02-b6fe06035daa","Type":"ContainerStarted","Data":"73c74e1285256165f30db19767d9beb46653b00b7cf33280e3fa0754345416e3"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.389703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" 
event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerStarted","Data":"6d1376b2e091462d5adf754a73c025d2355de04478a28e9cb17b61bbda8c4846"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.393273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.397254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.397832 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.404253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerStarted","Data":"c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.404291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerStarted","Data":"c331aa3fbfa5def7fb3bb9752bb98a11aef8a753d2bcf269b8f711288695298c"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.407139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcjxx\" (UniqueName: \"kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.410618 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" event={"ID":"0db48c6e-e94b-4a4c-bc75-70ee65bded48","Type":"ContainerStarted","Data":"a41bbbe631becdc2663227271247bffebe52245317b5d494313c36b84c0010bf"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.415195 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=5.415178093 podStartE2EDuration="5.415178093s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:29.387873442 +0000 UTC m=+3361.708133730" watchObservedRunningTime="2026-01-20 23:31:29.415178093 +0000 UTC m=+3361.735438381" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.415574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" 
event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerStarted","Data":"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.420427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8324c6f6-da57-40dc-a9f3-ce99a41078e6","Type":"ContainerDied","Data":"f24b402cd5da637624318996811378eb99d57b35a3ed06639b327625181fe635"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.420518 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.432455 5030 generic.go:334] "Generic (PLEG): container finished" podID="6c37973c-d677-4481-b1cf-7a5183b904bc" containerID="e70c54687a33367df40bfc627eae4a59f6c05998f88b62c110f7442a47b30c24" exitCode=0 Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.432507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6c37973c-d677-4481-b1cf-7a5183b904bc","Type":"ContainerDied","Data":"e70c54687a33367df40bfc627eae4a59f6c05998f88b62c110f7442a47b30c24"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.446606 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" podStartSLOduration=5.446591303 podStartE2EDuration="5.446591303s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:29.427292717 +0000 UTC m=+3361.747553005" watchObservedRunningTime="2026-01-20 23:31:29.446591303 +0000 UTC m=+3361.766851591" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.448421 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-api" containerID="cri-o://5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2" gracePeriod=30 Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.448483 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-httpd" containerID="cri-o://d818d37af6dd495c07c90156f3acd2271b27602fefb571a1c777bca760f33f38" gracePeriod=30 Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.448297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerStarted","Data":"d818d37af6dd495c07c90156f3acd2271b27602fefb571a1c777bca760f33f38"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.448608 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.450530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerStarted","Data":"19b2b71612e3a2a85e00ffaa0364277bb531f46af7bc4f89dedf5363f5ce15ed"} Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.479738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.479781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv4lg\" (UniqueName: \"kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.479818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.480350 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.498270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.503310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.516094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv4lg\" (UniqueName: \"kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg\") pod \"nova-scheduler-0\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.547894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9545469-c146-486f-9c79-b4a97f0f5f83" (UID: "a9545469-c146-486f-9c79-b4a97f0f5f83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.582100 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.618806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.629072 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.658425 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" podStartSLOduration=6.658399921 podStartE2EDuration="6.658399921s" podCreationTimestamp="2026-01-20 23:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:29.478048225 +0000 UTC m=+3361.798308513" watchObservedRunningTime="2026-01-20 23:31:29.658399921 +0000 UTC m=+3361.978660209" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.676899 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.748779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data" (OuterVolumeSpecName: "config-data") pod "a9545469-c146-486f-9c79-b4a97f0f5f83" (UID: "a9545469-c146-486f-9c79-b4a97f0f5f83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.816827 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9545469-c146-486f-9c79-b4a97f0f5f83-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.826366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:29 crc kubenswrapper[5030]: I0120 23:31:29.845907 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.070112 5030 scope.go:117] "RemoveContainer" containerID="cfb105ff10aa6a2752e27b961832094a2cba935c02c88987ebabcede7da0e5ca" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.144354 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f8ffc8-cadf-4aab-ab46-71163f0339d9" path="/var/lib/kubelet/pods/03f8ffc8-cadf-4aab-ab46-71163f0339d9/volumes" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.145144 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df9d9fc-b1ae-420d-8d8a-1c453cb77106" path="/var/lib/kubelet/pods/1df9d9fc-b1ae-420d-8d8a-1c453cb77106/volumes" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.145872 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36edc87c-d811-414a-8630-86c81093bdbf" path="/var/lib/kubelet/pods/36edc87c-d811-414a-8630-86c81093bdbf/volumes" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.147139 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fdd777a-3f61-4d1d-80e6-b508ce72ffc1" path="/var/lib/kubelet/pods/9fdd777a-3f61-4d1d-80e6-b508ce72ffc1/volumes" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.147699 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6bf49fb-3a1f-4a62-9340-a40608e2d844" path="/var/lib/kubelet/pods/d6bf49fb-3a1f-4a62-9340-a40608e2d844/volumes" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.148178 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff299fc0-074c-4420-8ac5-030bebd0c385" path="/var/lib/kubelet/pods/ff299fc0-074c-4420-8ac5-030bebd0c385/volumes" Jan 20 23:31:30 crc 
kubenswrapper[5030]: I0120 23:31:30.221225 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.236526 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.349052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle\") pod \"6c37973c-d677-4481-b1cf-7a5183b904bc\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.349465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data\") pod \"6c37973c-d677-4481-b1cf-7a5183b904bc\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.349519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b49\" (UniqueName: \"kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49\") pod \"6c37973c-d677-4481-b1cf-7a5183b904bc\" (UID: \"6c37973c-d677-4481-b1cf-7a5183b904bc\") " Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.352821 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.368980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49" (OuterVolumeSpecName: "kube-api-access-65b49") pod "6c37973c-d677-4481-b1cf-7a5183b904bc" (UID: "6c37973c-d677-4481-b1cf-7a5183b904bc"). InnerVolumeSpecName "kube-api-access-65b49". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.369851 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.380401 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: E0120 23:31:30.380898 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c37973c-d677-4481-b1cf-7a5183b904bc" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.380916 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c37973c-d677-4481-b1cf-7a5183b904bc" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.381129 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c37973c-d677-4481-b1cf-7a5183b904bc" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.382300 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.384210 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.405078 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.416599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c37973c-d677-4481-b1cf-7a5183b904bc" (UID: "6c37973c-d677-4481-b1cf-7a5183b904bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.416866 5030 scope.go:117] "RemoveContainer" containerID="16c5cb22986bc3b811996c0ba29d425238d57e0b2b669a5b412626aeb83f8fda" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.446844 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-67bdd6745b-xntdb"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.452675 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.452710 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b49\" (UniqueName: \"kubernetes.io/projected/6c37973c-d677-4481-b1cf-7a5183b904bc-kube-api-access-65b49\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.454767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data" (OuterVolumeSpecName: "config-data") pod "6c37973c-d677-4481-b1cf-7a5183b904bc" (UID: "6c37973c-d677-4481-b1cf-7a5183b904bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.500380 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.519243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerStarted","Data":"174175d4a37a36cc56c6f4cbaeb3c45aab32225280ac671d0de8d047a6cdb780"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.521547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerStarted","Data":"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.524844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerStarted","Data":"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.524961 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-log" containerID="cri-o://f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.525152 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.525172 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.525200 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-api" containerID="cri-o://25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.531463 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6c37973c-d677-4481-b1cf-7a5183b904bc","Type":"ContainerDied","Data":"0e54723f6444752205278a26f0fdd96906e1fd825eb4831b37817b13229742fc"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.531532 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.540947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"4dc2869a-ba69-4466-aa02-b6fe06035daa","Type":"ContainerStarted","Data":"019f85f2b7928eefb30edadc94479be6aefb358db65c713d1e1e7a186e7be788"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.541694 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.543382 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" podStartSLOduration=6.543371484 podStartE2EDuration="6.543371484s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:30.537719807 +0000 UTC m=+3362.857980105" watchObservedRunningTime="2026-01-20 23:31:30.543371484 +0000 UTC m=+3362.863631772" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.554924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.554958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpb5x\" (UniqueName: \"kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.555214 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c37973c-d677-4481-b1cf-7a5183b904bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.567238 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" podStartSLOduration=6.567219711 podStartE2EDuration="6.567219711s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:30.554447912 +0000 UTC m=+3362.874708200" watchObservedRunningTime="2026-01-20 23:31:30.567219711 +0000 UTC m=+3362.887479999" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.571644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerStarted","Data":"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.618967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerStarted","Data":"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.626156 5030 scope.go:117] "RemoveContainer" containerID="64bde24ce641ac620803ec772b33819c4f11068e6abd5447571283350997935d" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.636711 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.636988 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker-log" containerID="cri-o://dcdfc1820f080b0e8b90f9417e65df736b83363b67ea72b8806323bba24c5a86" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.637093 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker" containerID="cri-o://51cd04a97de01286d78837802f843f6e120edea93900487da629dcee88840ff1" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.647742 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=4.647722289 podStartE2EDuration="4.647722289s" podCreationTimestamp="2026-01-20 23:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 
23:31:30.603212831 +0000 UTC m=+3362.923473109" watchObservedRunningTime="2026-01-20 23:31:30.647722289 +0000 UTC m=+3362.967982577" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpb5x\" (UniqueName: \"kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.672520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.674856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.676976 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: 
\"1e5f8238-69f5-43a5-843f-82e5195fde82\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.679361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.682684 5030 generic.go:334] "Generic (PLEG): container finished" podID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerID="b96f29c6412312302df952900fcc3ad294f799efc164332681b4bdeaa2e0b9e6" exitCode=0 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.682898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerDied","Data":"b96f29c6412312302df952900fcc3ad294f799efc164332681b4bdeaa2e0b9e6"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.695733 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.695893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.707896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerStarted","Data":"c5b38855c50b837a2fe620b8319b6db6b3aec970de01e6ba699b9ec3b5ce1658"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.709241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.712817 5030 generic.go:334] "Generic (PLEG): container finished" podID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerID="d818d37af6dd495c07c90156f3acd2271b27602fefb571a1c777bca760f33f38" exitCode=0 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.712869 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerDied","Data":"d818d37af6dd495c07c90156f3acd2271b27602fefb571a1c777bca760f33f38"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.713906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpb5x\" (UniqueName: \"kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.715674 5030 generic.go:334] "Generic (PLEG): container finished" podID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerID="a5f50faa8925170ab9b2766c285e845c40fa14a592054ea95170b7280822dae6" exitCode=0 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.715731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerDied","Data":"a5f50faa8925170ab9b2766c285e845c40fa14a592054ea95170b7280822dae6"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.720631 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerID="d8018bf3bade68fa64b2765cfd6560c427fb070f2f81ee36252a4716114b59ee" exitCode=0 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.720693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerDied","Data":"d8018bf3bade68fa64b2765cfd6560c427fb070f2f81ee36252a4716114b59ee"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.724251 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.729405 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" podStartSLOduration=6.729387036 podStartE2EDuration="6.729387036s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:30.662190999 +0000 UTC m=+3362.982451287" watchObservedRunningTime="2026-01-20 23:31:30.729387036 +0000 UTC m=+3363.049647324" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.737816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerStarted","Data":"13aa8ff0f386c218e015550ef1b633540bd0a0b402aa584614806fad23141ea8"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.740814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.751988 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.753185 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.753923 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener-log" containerID="cri-o://6c650f247290a40ecefbd23466ce8a8a0175d13bde2a6888e5bd093543ff4b53" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.754075 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener" containerID="cri-o://c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d" gracePeriod=30 Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.769606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerStarted","Data":"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1"} Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.814684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.814744 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.824696 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.825974 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.828086 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.828453 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:31:30 crc kubenswrapper[5030]: I0120 23:31:30.928525 5030 scope.go:117] "RemoveContainer" containerID="e70c54687a33367df40bfc627eae4a59f6c05998f88b62c110f7442a47b30c24" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.006461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.006557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjp8l\" (UniqueName: \"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.006741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.108949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.109017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.109083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjp8l\" (UniqueName: \"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.116144 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.116295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.125306 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.136731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjp8l\" (UniqueName: \"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.142100 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.161890 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210319 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb62f\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.210764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf\") pod \"535e149a-723c-4f6b-9247-9a6b58eff6bb\" (UID: \"535e149a-723c-4f6b-9247-9a6b58eff6bb\") " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.211708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.212063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.212133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.228051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f" (OuterVolumeSpecName: "kube-api-access-vb62f") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "kube-api-access-vb62f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.228961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info" (OuterVolumeSpecName: "pod-info") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.229359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.230173 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.297514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf" (OuterVolumeSpecName: "server-conf") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.316878 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317181 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317191 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317227 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317237 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/535e149a-723c-4f6b-9247-9a6b58eff6bb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317246 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/535e149a-723c-4f6b-9247-9a6b58eff6bb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317257 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb62f\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-kube-api-access-vb62f\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.317271 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/535e149a-723c-4f6b-9247-9a6b58eff6bb-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc 
kubenswrapper[5030]: I0120 23:31:31.355681 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.423087 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.613855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.753807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "535e149a-723c-4f6b-9247-9a6b58eff6bb" (UID: "535e149a-723c-4f6b-9247-9a6b58eff6bb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.834765 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerID="dcdfc1820f080b0e8b90f9417e65df736b83363b67ea72b8806323bba24c5a86" exitCode=143 Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.834826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerDied","Data":"dcdfc1820f080b0e8b90f9417e65df736b83363b67ea72b8806323bba24c5a86"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.839940 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/535e149a-723c-4f6b-9247-9a6b58eff6bb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.849801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerStarted","Data":"2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.857519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4a35bcf9-e354-4385-8e10-2bb69352fc70","Type":"ContainerDied","Data":"b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.857554 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05bddb60c5402e5115fab37c48ab9c491f3ed3ccf925d08d74606f471c12990" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.895012 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.903334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerStarted","Data":"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.904165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.913982 5030 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.914444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerStarted","Data":"680837b7360badfe79509cf11a55879d38791a680f79da2a441b410e582e65aa"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.920540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"535e149a-723c-4f6b-9247-9a6b58eff6bb","Type":"ContainerDied","Data":"c09c53e881513e8e6b8114aeaeb7098dd0c9feec9d99db5af552448a7aea670b"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.920569 5030 scope.go:117] "RemoveContainer" containerID="a5f50faa8925170ab9b2766c285e845c40fa14a592054ea95170b7280822dae6" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.920700 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.939305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerStarted","Data":"42aa7635e18fa04ed8e3a79e886ed851b1cd7dd91cd9b0fa6211ef4a73541160"} Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.949260 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" podStartSLOduration=7.949244856 podStartE2EDuration="7.949244856s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:31.931004624 +0000 UTC m=+3364.251264912" watchObservedRunningTime="2026-01-20 23:31:31.949244856 +0000 UTC m=+3364.269505144" Jan 20 23:31:31 crc kubenswrapper[5030]: I0120 23:31:31.991061 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" podStartSLOduration=4.991044288 podStartE2EDuration="4.991044288s" podCreationTimestamp="2026-01-20 23:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:31.986001715 +0000 UTC m=+3364.306262003" watchObservedRunningTime="2026-01-20 23:31:31.991044288 +0000 UTC m=+3364.311304576" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.016247 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c37973c-d677-4481-b1cf-7a5183b904bc" path="/var/lib/kubelet/pods/6c37973c-d677-4481-b1cf-7a5183b904bc/volumes" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.017367 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8324c6f6-da57-40dc-a9f3-ce99a41078e6" path="/var/lib/kubelet/pods/8324c6f6-da57-40dc-a9f3-ce99a41078e6/volumes" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.018344 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9545469-c146-486f-9c79-b4a97f0f5f83" path="/var/lib/kubelet/pods/a9545469-c146-486f-9c79-b4a97f0f5f83/volumes" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.028118 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b822da8-631f-47ee-8874-dfa862386ad2" 
containerID="ec590bd73705be339baa9b7c1b90768301a4f289003bd0b2191fc3d357538be2" exitCode=0 Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032430 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerStarted","Data":"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032784 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerDied","Data":"ec590bd73705be339baa9b7c1b90768301a4f289003bd0b2191fc3d357538be2"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" event={"ID":"5b822da8-631f-47ee-8874-dfa862386ad2","Type":"ContainerDied","Data":"f265d8dcfee99cc0764d4a6b166d08346ad9871953325de827ceb130b513203e"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032822 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f265d8dcfee99cc0764d4a6b166d08346ad9871953325de827ceb130b513203e" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032844 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.032881 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.033331 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.047705 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.051959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.052050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.052078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.053568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.059091 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.060715 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-notification-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060735 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-notification-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.060744 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-log" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060750 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-log" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.060759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060765 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.060775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener-log" Jan 
20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener-log" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.060790 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="rabbitmq" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl5td\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf\") pod \"9def69e2-10b9-443b-8b38-7d561e39bb7d\" (UID: \"9def69e2-10b9-443b-8b38-7d561e39bb7d\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.060797 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="rabbitmq" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061358 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="setup-container" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061366 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="setup-container" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061381 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="setup-container" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061389 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="setup-container" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061404 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="rabbitmq" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="rabbitmq" Jan 20 
23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061435 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-api" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061441 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-api" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061448 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="sg-core" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061454 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="sg-core" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061461 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="proxy-httpd" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="proxy-httpd" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.061477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-central-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061482 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-central-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061671 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-api" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061681 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-notification-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061692 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="proxy-httpd" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061702 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="ceilometer-central-agent" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061715 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" containerName="rabbitmq" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061723 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061731 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" containerName="sg-core" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061738 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" containerName="rabbitmq" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061748 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" containerName="barbican-keystone-listener-log" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.061757 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerName="placement-log" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.062191 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.062748 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.070407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.078863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.079502 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerStarted","Data":"ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.079865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.079903 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087130 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087327 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087438 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087564 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087691 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-bzk7k" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.087926 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.105594 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="08962f16-0341-46c7-b0df-9972861083df" containerID="6c650f247290a40ecefbd23466ce8a8a0175d13bde2a6888e5bd093543ff4b53" exitCode=143 Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.105680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerDied","Data":"6c650f247290a40ecefbd23466ce8a8a0175d13bde2a6888e5bd093543ff4b53"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.108927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.113206 5030 scope.go:117] "RemoveContainer" containerID="15c82f83871ae8a50514e3320b06975949c2d6131b50863a977396f67100c222" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.121039 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125420 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerID="25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" exitCode=0 Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125452 5030 generic.go:334] "Generic (PLEG): container finished" podID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" containerID="f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" exitCode=143 Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerDied","Data":"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerDied","Data":"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" event={"ID":"6ba68c0c-821f-4bb0-848d-cfdc09051ae9","Type":"ContainerDied","Data":"9b7f98334b8d850db2bf7cda190144b8a285e2d255aae8325a62953d1f4ca1dd"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.125629 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-799d455fb6-7jld8" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.132830 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.138775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerStarted","Data":"22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.163778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"58c0a209-c9c9-4d67-9c4f-bc086ec04be5","Type":"ContainerStarted","Data":"dd62ecca2e723bec7d0ae9393501f468d6032d9fe6b0b7ae8df93327c50ef9f3"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrxbw\" (UniqueName: \"kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts\") pod \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom\") pod \"5b822da8-631f-47ee-8874-dfa862386ad2\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs95g\" (UniqueName: \"kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g\") pod \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle\") pod \"5b822da8-631f-47ee-8874-dfa862386ad2\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm57c\" (UniqueName: \"kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c\") pod \"5b822da8-631f-47ee-8874-dfa862386ad2\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.164983 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle\") pod \"4a35bcf9-e354-4385-8e10-2bb69352fc70\" (UID: \"4a35bcf9-e354-4385-8e10-2bb69352fc70\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165132 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs\") pod \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data\") pod \"5b822da8-631f-47ee-8874-dfa862386ad2\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs\") pod \"5b822da8-631f-47ee-8874-dfa862386ad2\" (UID: \"5b822da8-631f-47ee-8874-dfa862386ad2\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data\") pod \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle\") pod \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\" (UID: \"6ba68c0c-821f-4bb0-848d-cfdc09051ae9\") " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.165993 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-plugins-conf\") on node 
\"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.166011 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9def69e2-10b9-443b-8b38-7d561e39bb7d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.166031 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.166040 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.166382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs" (OuterVolumeSpecName: "logs") pod "6ba68c0c-821f-4bb0-848d-cfdc09051ae9" (UID: "6ba68c0c-821f-4bb0-848d-cfdc09051ae9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.170111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs" (OuterVolumeSpecName: "logs") pod "5b822da8-631f-47ee-8874-dfa862386ad2" (UID: "5b822da8-631f-47ee-8874-dfa862386ad2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.174667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info" (OuterVolumeSpecName: "pod-info") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.179122 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.179213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"9def69e2-10b9-443b-8b38-7d561e39bb7d","Type":"ContainerDied","Data":"758ba39bd07720b4de6d36d8e189384dff774f7d39eed8a00c5685f17ab1ff29"} Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.182473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td" (OuterVolumeSpecName: "kube-api-access-cl5td") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "kube-api-access-cl5td". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.193943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.194178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.217790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g" (OuterVolumeSpecName: "kube-api-access-vs95g") pod "6ba68c0c-821f-4bb0-848d-cfdc09051ae9" (UID: "6ba68c0c-821f-4bb0-848d-cfdc09051ae9"). InnerVolumeSpecName "kube-api-access-vs95g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.218542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts" (OuterVolumeSpecName: "scripts") pod "6ba68c0c-821f-4bb0-848d-cfdc09051ae9" (UID: "6ba68c0c-821f-4bb0-848d-cfdc09051ae9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.224151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts" (OuterVolumeSpecName: "scripts") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.224960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw" (OuterVolumeSpecName: "kube-api-access-nrxbw") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "kube-api-access-nrxbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.231251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c" (OuterVolumeSpecName: "kube-api-access-vm57c") pod "5b822da8-631f-47ee-8874-dfa862386ad2" (UID: "5b822da8-631f-47ee-8874-dfa862386ad2"). InnerVolumeSpecName "kube-api-access-vm57c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.245672 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.253730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5b822da8-631f-47ee-8874-dfa862386ad2" (UID: "5b822da8-631f-47ee-8874-dfa862386ad2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nlmg\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.268955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.269101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.276884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.276951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277208 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277225 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277238 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs95g\" (UniqueName: \"kubernetes.io/projected/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-kube-api-access-vs95g\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277251 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm57c\" (UniqueName: \"kubernetes.io/projected/5b822da8-631f-47ee-8874-dfa862386ad2-kube-api-access-vm57c\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277260 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277269 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a35bcf9-e354-4385-8e10-2bb69352fc70-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277280 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277292 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl5td\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-kube-api-access-cl5td\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277304 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9def69e2-10b9-443b-8b38-7d561e39bb7d-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277316 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5b822da8-631f-47ee-8874-dfa862386ad2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277330 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrxbw\" (UniqueName: \"kubernetes.io/projected/4a35bcf9-e354-4385-8e10-2bb69352fc70-kube-api-access-nrxbw\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.277342 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.284981 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" podStartSLOduration=8.284964882 podStartE2EDuration="8.284964882s" podCreationTimestamp="2026-01-20 23:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:32.211361711 +0000 UTC m=+3364.531621999" watchObservedRunningTime="2026-01-20 23:31:32.284964882 +0000 UTC m=+3364.605225170" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.315348 5030 scope.go:117] "RemoveContainer" containerID="25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.373114 5030 scope.go:117] "RemoveContainer" containerID="f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378546 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nlmg\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc 
kubenswrapper[5030]: I0120 23:31:32.378634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378699 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.378737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.380494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.380892 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.381324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 
23:31:32.383084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.383390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.384360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.393897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.395458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.404139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.404128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.407607 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nlmg\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.436576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf" (OuterVolumeSpecName: "server-conf") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.483071 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9def69e2-10b9-443b-8b38-7d561e39bb7d-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.549081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.585128 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.595093 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.596138 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b822da8-631f-47ee-8874-dfa862386ad2" (UID: "5b822da8-631f-47ee-8874-dfa862386ad2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.656401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.674193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba68c0c-821f-4bb0-848d-cfdc09051ae9" (UID: "6ba68c0c-821f-4bb0-848d-cfdc09051ae9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.688763 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.688792 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.688802 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.736395 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.767768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data" (OuterVolumeSpecName: "config-data") pod "5b822da8-631f-47ee-8874-dfa862386ad2" (UID: "5b822da8-631f-47ee-8874-dfa862386ad2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.771920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data" (OuterVolumeSpecName: "config-data") pod "6ba68c0c-821f-4bb0-848d-cfdc09051ae9" (UID: "6ba68c0c-821f-4bb0-848d-cfdc09051ae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.802814 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b822da8-631f-47ee-8874-dfa862386ad2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.802850 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba68c0c-821f-4bb0-848d-cfdc09051ae9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.829214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9def69e2-10b9-443b-8b38-7d561e39bb7d" (UID: "9def69e2-10b9-443b-8b38-7d561e39bb7d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.830736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.851459 5030 scope.go:117] "RemoveContainer" containerID="25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.853811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data" (OuterVolumeSpecName: "config-data") pod "4a35bcf9-e354-4385-8e10-2bb69352fc70" (UID: "4a35bcf9-e354-4385-8e10-2bb69352fc70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.856778 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659\": container with ID starting with 25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659 not found: ID does not exist" containerID="25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.856829 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659"} err="failed to get container status \"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659\": rpc error: code = NotFound desc = could not find container \"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659\": container with ID starting with 25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659 not found: ID does not exist" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.856857 5030 scope.go:117] "RemoveContainer" containerID="f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" Jan 20 23:31:32 crc kubenswrapper[5030]: E0120 23:31:32.864720 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382\": container with ID starting with f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382 not found: ID does not exist" containerID="f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.864753 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382"} err="failed to get container status \"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382\": rpc error: code = NotFound desc = could not find container \"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382\": container with ID starting with f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382 not found: ID does not exist" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.864772 5030 scope.go:117] "RemoveContainer" containerID="25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.868693 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659"} err="failed to get container status \"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659\": rpc error: code = NotFound desc = could not find container \"25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659\": container with ID starting with 25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659 not found: ID does not exist" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.868724 5030 scope.go:117] "RemoveContainer" containerID="f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.872678 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382"} err="failed to get 
container status \"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382\": rpc error: code = NotFound desc = could not find container \"f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382\": container with ID starting with f1b468e4d7277eeca3c7cc6b874bb3f41f69f5a99916ac3be1642d0205ca3382 not found: ID does not exist" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.872702 5030 scope.go:117] "RemoveContainer" containerID="b96f29c6412312302df952900fcc3ad294f799efc164332681b4bdeaa2e0b9e6" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.904196 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.904221 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9def69e2-10b9-443b-8b38-7d561e39bb7d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.904232 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a35bcf9-e354-4385-8e10-2bb69352fc70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:32 crc kubenswrapper[5030]: I0120 23:31:32.978735 5030 scope.go:117] "RemoveContainer" containerID="1778eb3dac897ab781625cc651ba606b63a5dd6829ad139e3551fa0e3df843c2" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.087967 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.100963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-799d455fb6-7jld8"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.166893 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.186932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.205688 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.207782 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.212120 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.213248 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.213372 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.213509 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.213659 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.213765 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-cjd45" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.221861 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.225185 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.279013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerStarted","Data":"faedcaf04f2f97afdb74b95ee3377cf5cd8b168accbb420e066401fd1a93fb51"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.286736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerStarted","Data":"940e781c7a85b198a4d2b6b1b3c4b7082bf01ab6f893127baf96e3dcc162635e"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.291813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerStarted","Data":"39bccae4387f02af6fa7a5e024ae31aae72dd0997152fc9e4dd93c999c710c26"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.299605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerStarted","Data":"1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.313792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"e17cbb6a-58ff-46e9-91d7-d92d52b77b55","Type":"ContainerStarted","Data":"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.313837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"e17cbb6a-58ff-46e9-91d7-d92d52b77b55","Type":"ContainerStarted","Data":"cbf2dd2974b420b1c99104903866e2f9cb223cd17877d633d71540a17d53b366"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.316244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerStarted","Data":"ea6e94f88e69a2c72c358c2a248a3a4b1216e97c838cc4ea35f480cab4e9861f"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.319967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.320086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.320113 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.320163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.320296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8g5\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.340134 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.340087404 podStartE2EDuration="3.340087404s" podCreationTimestamp="2026-01-20 23:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:33.336935468 +0000 UTC m=+3365.657195756" watchObservedRunningTime="2026-01-20 23:31:33.340087404 +0000 UTC m=+3365.660347692" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.362950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"58c0a209-c9c9-4d67-9c4f-bc086ec04be5","Type":"ContainerStarted","Data":"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.367161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerStarted","Data":"20807d44d96cf2571ff139187e55cdd065bf126b9fb85662d43174da48227d54"} Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.369680 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.369984 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.387436 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=5.38741806 podStartE2EDuration="5.38741806s" podCreationTimestamp="2026-01-20 23:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:33.354686308 +0000 UTC m=+3365.674946596" watchObservedRunningTime="2026-01-20 23:31:33.38741806 +0000 UTC m=+3365.707678338" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.397607 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=4.397591096 podStartE2EDuration="4.397591096s" podCreationTimestamp="2026-01-20 23:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:33.374974099 +0000 UTC m=+3365.695234387" watchObservedRunningTime="2026-01-20 23:31:33.397591096 +0000 UTC m=+3365.717851384" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.425707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8g5\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.426989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.427019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.427048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.427084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.427119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.427134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.428279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.428955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.432087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.433642 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") device mount path \"/mnt/openstack/pv13\"" 
pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.434398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.436444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.437591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.438154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.442937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.444495 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=6.444473212 podStartE2EDuration="6.444473212s" podCreationTimestamp="2026-01-20 23:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:33.411334139 +0000 UTC m=+3365.731594427" watchObservedRunningTime="2026-01-20 23:31:33.444473212 +0000 UTC m=+3365.764733500" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.445359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.452927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8g5\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.491734 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " pod="openstack-kuttl-tests/rabbitmq-server-0" 
Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.570259 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.739749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.769193 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.812427 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.825877 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6bfc98675d-dsbwb"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.913850 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.916319 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.919122 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.920259 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:31:33 crc kubenswrapper[5030]: I0120 23:31:33.927156 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:33.995744 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a35bcf9-e354-4385-8e10-2bb69352fc70" path="/var/lib/kubelet/pods/4a35bcf9-e354-4385-8e10-2bb69352fc70/volumes" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:33.996793 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535e149a-723c-4f6b-9247-9a6b58eff6bb" path="/var/lib/kubelet/pods/535e149a-723c-4f6b-9247-9a6b58eff6bb/volumes" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:33.999836 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b822da8-631f-47ee-8874-dfa862386ad2" path="/var/lib/kubelet/pods/5b822da8-631f-47ee-8874-dfa862386ad2/volumes" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.000400 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba68c0c-821f-4bb0-848d-cfdc09051ae9" path="/var/lib/kubelet/pods/6ba68c0c-821f-4bb0-848d-cfdc09051ae9/volumes" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.001372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9def69e2-10b9-443b-8b38-7d561e39bb7d" path="/var/lib/kubelet/pods/9def69e2-10b9-443b-8b38-7d561e39bb7d/volumes" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.049586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.049666 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.049699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.050237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.050311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.050351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxbs\" (UniqueName: \"kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.050404 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxbs\" (UniqueName: \"kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.151868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.154741 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.155092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.163851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.164513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.165874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.168875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts\") pod \"ceilometer-0\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.171354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxbs\" (UniqueName: \"kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs\") pod \"ceilometer-0\" (UID: 
\"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.252192 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.345718 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.372129 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.372333 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.405668 5030 generic.go:334] "Generic (PLEG): container finished" podID="0db48c6e-e94b-4a4c-bc75-70ee65bded48" containerID="a41bbbe631becdc2663227271247bffebe52245317b5d494313c36b84c0010bf" exitCode=0 Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.405775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" event={"ID":"0db48c6e-e94b-4a4c-bc75-70ee65bded48","Type":"ContainerDied","Data":"a41bbbe631becdc2663227271247bffebe52245317b5d494313c36b84c0010bf"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.452014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerStarted","Data":"6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.481029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerStarted","Data":"d81d866e2fdb7ee31c791a88789cae06990472531241a26ed1edacf53c12a8d5"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.485245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerStarted","Data":"4c3206d4e2b2e6b3beae0608df2af062fdb038791006f7d48e30d5b6ee108603"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.486379 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.486520 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=7.486499416 podStartE2EDuration="7.486499416s" podCreationTimestamp="2026-01-20 23:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:34.483150844 +0000 UTC m=+3366.803411142" watchObservedRunningTime="2026-01-20 23:31:34.486499416 +0000 UTC m=+3366.806759704" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.502995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerStarted","Data":"825b5af702ffdf5de2be5f22ddc61eac516d5cbe358b33e13e145f3e929fbc57"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.503024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerStarted","Data":"5119d0d0cc4f6a962474254571f2c6fcb5f0d2fbf5b431abae2990241b7a16de"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.517279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerStarted","Data":"b3c9339bc8d74ca4a63b31adbe8e69052f2ce6fb016f4626733fb1fb227b5639"} Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.521927 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.521910883 podStartE2EDuration="5.521910883s" podCreationTimestamp="2026-01-20 23:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:34.504599444 +0000 UTC m=+3366.824859732" watchObservedRunningTime="2026-01-20 23:31:34.521910883 +0000 UTC m=+3366.842171171" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.537050 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.537014629 podStartE2EDuration="4.537014629s" podCreationTimestamp="2026-01-20 23:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:34.525155641 +0000 UTC m=+3366.845415929" watchObservedRunningTime="2026-01-20 23:31:34.537014629 +0000 UTC m=+3366.857274917" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.630701 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.666359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.686252 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=5.686237431 podStartE2EDuration="5.686237431s" podCreationTimestamp="2026-01-20 23:31:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:34.546890838 +0000 UTC m=+3366.867151136" watchObservedRunningTime="2026-01-20 23:31:34.686237431 +0000 UTC m=+3367.006497719" Jan 20 23:31:34 crc kubenswrapper[5030]: I0120 23:31:34.751471 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:31:35 crc kubenswrapper[5030]: I0120 23:31:35.533421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerStarted","Data":"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e"} Jan 20 23:31:35 crc kubenswrapper[5030]: I0120 23:31:35.537419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerStarted","Data":"746414e59b0a1e3e1f015ec60e5aebed410f00b7ae6cc7b7630090e1aad843de"} Jan 20 23:31:35 crc kubenswrapper[5030]: I0120 23:31:35.537470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerStarted","Data":"d8b46dfb5649b1be74f2b09d2ae386d92d01893a4d73c02b981ebf3cea1a79fe"} Jan 20 23:31:35 crc kubenswrapper[5030]: I0120 23:31:35.541194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerStarted","Data":"13613ed8fdeb75e14a275c5b8e8136860ac61830c6715d8c58f9e95416b9de84"} Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.077038 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.162629 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dz7\" (UniqueName: \"kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.203577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys\") pod \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\" (UID: \"0db48c6e-e94b-4a4c-bc75-70ee65bded48\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.211766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.213547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7" (OuterVolumeSpecName: "kube-api-access-w8dz7") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "kube-api-access-w8dz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.214021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts" (OuterVolumeSpecName: "scripts") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.219789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.245160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data" (OuterVolumeSpecName: "config-data") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.257787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0db48c6e-e94b-4a4c-bc75-70ee65bded48" (UID: "0db48c6e-e94b-4a4c-bc75-70ee65bded48"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307087 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307125 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307135 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307146 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dz7\" (UniqueName: \"kubernetes.io/projected/0db48c6e-e94b-4a4c-bc75-70ee65bded48-kube-api-access-w8dz7\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307157 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.307167 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db48c6e-e94b-4a4c-bc75-70ee65bded48-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.557394 5030 generic.go:334] "Generic (PLEG): container finished" podID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerID="51cd04a97de01286d78837802f843f6e120edea93900487da629dcee88840ff1" exitCode=0 Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.557496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerDied","Data":"51cd04a97de01286d78837802f843f6e120edea93900487da629dcee88840ff1"} Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.577206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" event={"ID":"0db48c6e-e94b-4a4c-bc75-70ee65bded48","Type":"ContainerDied","Data":"1b08250fb71c5960b241c61b26a731ce793dea6ee47186122cb8df37bec45e0d"} Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.577243 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b08250fb71c5960b241c61b26a731ce793dea6ee47186122cb8df37bec45e0d" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.577359 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-sf2nd" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.585599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerStarted","Data":"c010e2926c7e322751386b3c6ca4213ac366a66e780234a4f8bfc756ce46c955"} Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.590578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerStarted","Data":"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c"} Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.851080 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs\") pod \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thmbg\" (UniqueName: \"kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg\") pod \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs" (OuterVolumeSpecName: "logs") pod "0c16760b-b3ab-42a0-90a7-5cba1f6b889a" (UID: "0c16760b-b3ab-42a0-90a7-5cba1f6b889a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data\") pod \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom\") pod \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.920785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle\") pod \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\" (UID: \"0c16760b-b3ab-42a0-90a7-5cba1f6b889a\") " Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.921238 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.927515 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg" (OuterVolumeSpecName: "kube-api-access-thmbg") pod "0c16760b-b3ab-42a0-90a7-5cba1f6b889a" (UID: "0c16760b-b3ab-42a0-90a7-5cba1f6b889a"). InnerVolumeSpecName "kube-api-access-thmbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.927755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0c16760b-b3ab-42a0-90a7-5cba1f6b889a" (UID: "0c16760b-b3ab-42a0-90a7-5cba1f6b889a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.955837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c16760b-b3ab-42a0-90a7-5cba1f6b889a" (UID: "0c16760b-b3ab-42a0-90a7-5cba1f6b889a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:36 crc kubenswrapper[5030]: I0120 23:31:36.972119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data" (OuterVolumeSpecName: "config-data") pod "0c16760b-b3ab-42a0-90a7-5cba1f6b889a" (UID: "0c16760b-b3ab-42a0-90a7-5cba1f6b889a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.024410 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thmbg\" (UniqueName: \"kubernetes.io/projected/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-kube-api-access-thmbg\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.024454 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.024465 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.024475 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16760b-b3ab-42a0-90a7-5cba1f6b889a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.185435 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.185708 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="ovn-northd" containerID="cri-o://41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.185771 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="openstack-network-exporter" containerID="cri-o://1f5c17ad369f742c3c741a76fb19d1dc31e60772125f2a6a42307522b80a2db3" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.371804 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.552049 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.552272 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.581438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.581673 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" containerName="nova-scheduler-scheduler" containerID="cri-o://965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.602979 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.603218 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerName="keystone-api" containerID="cri-o://d33518e830a3f7bd93cdb4cab2e8d31c3450485a686941e871eb32f2b839ba27" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649140 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.649550 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649565 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker" Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.649589 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker-log" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649596 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker-log" Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.649614 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db48c6e-e94b-4a4c-bc75-70ee65bded48" containerName="keystone-bootstrap" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649709 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db48c6e-e94b-4a4c-bc75-70ee65bded48" containerName="keystone-bootstrap" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db48c6e-e94b-4a4c-bc75-70ee65bded48" containerName="keystone-bootstrap" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.649922 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" containerName="barbican-worker-log" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.650581 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.655190 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.655473 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.656503 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_809c9ceb-bda3-4971-88b2-3fb7bd4cee02/ovn-northd/0.log" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.656541 5030 generic.go:334] "Generic (PLEG): container finished" podID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerID="1f5c17ad369f742c3c741a76fb19d1dc31e60772125f2a6a42307522b80a2db3" exitCode=2 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.656575 5030 generic.go:334] "Generic (PLEG): container finished" podID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerID="41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" exitCode=143 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.656648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerDied","Data":"1f5c17ad369f742c3c741a76fb19d1dc31e60772125f2a6a42307522b80a2db3"} Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.656669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerDied","Data":"41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab"} Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.664488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.668721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerStarted","Data":"c7e043b0b8441b0f27569b6c1a47553f98a4e898b7a8ab452dec340014f4532c"} Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.674420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.674827 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-log" containerID="cri-o://faedcaf04f2f97afdb74b95ee3377cf5cd8b168accbb420e066401fd1a93fb51" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.675010 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-httpd" containerID="cri-o://b3c9339bc8d74ca4a63b31adbe8e69052f2ce6fb016f4626733fb1fb227b5639" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.683155 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.688694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-697dff9875-7clld" event={"ID":"0c16760b-b3ab-42a0-90a7-5cba1f6b889a","Type":"ContainerDied","Data":"60c6adbd497cc5e458d71143c0ae628b469af0691d95ef0fcf275c509bb46bd3"} Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.688747 5030 scope.go:117] "RemoveContainer" containerID="51cd04a97de01286d78837802f843f6e120edea93900487da629dcee88840ff1" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.745605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.745806 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api-log" containerID="cri-o://940e781c7a85b198a4d2b6b1b3c4b7082bf01ab6f893127baf96e3dcc162635e" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.746135 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api" containerID="cri-o://4c3206d4e2b2e6b3beae0608df2af062fdb038791006f7d48e30d5b6ee108603" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.746871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.746909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.746927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.746992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.747055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbd9\" (UniqueName: \"kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 
20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.747070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.747123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.747168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.768858 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.769052 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" containerID="cri-o://67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.778875 5030 scope.go:117] "RemoveContainer" containerID="dcdfc1820f080b0e8b90f9417e65df736b83363b67ea72b8806323bba24c5a86" Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.779661 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab is running failed: container process not found" containerID="41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.780015 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab is running failed: container process not found" containerID="41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.782902 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab is running failed: container process not found" containerID="41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:31:37 crc kubenswrapper[5030]: E0120 23:31:37.782936 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="ovn-northd" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.805985 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.829971 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-697dff9875-7clld"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.843196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.843401 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-log" containerID="cri-o://5119d0d0cc4f6a962474254571f2c6fcb5f0d2fbf5b431abae2990241b7a16de" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.843869 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-httpd" containerID="cri-o://825b5af702ffdf5de2be5f22ddc61eac516d5cbe358b33e13e145f3e929fbc57" gracePeriod=30 Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.848862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.848938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.848971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.848993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.849015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.849048 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.849100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbd9\" (UniqueName: \"kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.849118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.859777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.861568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.861702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.862892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.870066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.883436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.889467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-lvbd9\" (UniqueName: \"kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.892696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle\") pod \"keystone-595565b585-58jwl\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.982440 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:37 crc kubenswrapper[5030]: I0120 23:31:37.997750 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c16760b-b3ab-42a0-90a7-5cba1f6b889a" path="/var/lib/kubelet/pods/0c16760b-b3ab-42a0-90a7-5cba1f6b889a/volumes" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.048015 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_809c9ceb-bda3-4971-88b2-3fb7bd4cee02/ovn-northd/0.log" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.048102 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.050720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir\") pod \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.050774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle\") pod \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.050805 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config\") pod \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.050845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8sn5\" (UniqueName: \"kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5\") pod \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.050870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts\") pod \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\" (UID: \"809c9ceb-bda3-4971-88b2-3fb7bd4cee02\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.052085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod 
"809c9ceb-bda3-4971-88b2-3fb7bd4cee02" (UID: "809c9ceb-bda3-4971-88b2-3fb7bd4cee02"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.061471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts" (OuterVolumeSpecName: "scripts") pod "809c9ceb-bda3-4971-88b2-3fb7bd4cee02" (UID: "809c9ceb-bda3-4971-88b2-3fb7bd4cee02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.061547 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config" (OuterVolumeSpecName: "config") pod "809c9ceb-bda3-4971-88b2-3fb7bd4cee02" (UID: "809c9ceb-bda3-4971-88b2-3fb7bd4cee02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.064382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5" (OuterVolumeSpecName: "kube-api-access-l8sn5") pod "809c9ceb-bda3-4971-88b2-3fb7bd4cee02" (UID: "809c9ceb-bda3-4971-88b2-3fb7bd4cee02"). InnerVolumeSpecName "kube-api-access-l8sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.108960 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.112230 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="openstack-network-exporter" containerID="cri-o://bd45d6d17dbbe4a1ba70c5139b2c784f386f57335e21bcbdfe022af808f6aec3" gracePeriod=300 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.159573 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.159904 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.159930 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8sn5\" (UniqueName: \"kubernetes.io/projected/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-kube-api-access-l8sn5\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.159942 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.200842 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerName="keystone-api" probeResult="failure" output="Get \"http://10.217.1.193:5000/v3\": read tcp 10.217.0.2:46938->10.217.1.193:5000: read: connection reset by peer" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.204402 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "809c9ceb-bda3-4971-88b2-3fb7bd4cee02" (UID: "809c9ceb-bda3-4971-88b2-3fb7bd4cee02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.221302 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="ovsdbserver-nb" containerID="cri-o://f413a83947a3f641c49ef1fe9149ee0fb63df1dd4f02e577126231ff5df950cd" gracePeriod=300 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.261694 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809c9ceb-bda3-4971-88b2-3fb7bd4cee02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.435688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.440318 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-api" containerID="cri-o://ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00" gracePeriod=30 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.440438 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-httpd" containerID="cri-o://c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20" gracePeriod=30 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.471278 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:38 crc kubenswrapper[5030]: E0120 23:31:38.471722 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="ovn-northd" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.471738 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="ovn-northd" Jan 20 23:31:38 crc kubenswrapper[5030]: E0120 23:31:38.471763 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="openstack-network-exporter" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.471770 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="openstack-network-exporter" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.472021 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="openstack-network-exporter" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.472058 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" containerName="ovn-northd" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.473213 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.487466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.546051 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.1.200:9696/\": read tcp 10.217.0.2:38244->10.217.1.200:9696: read: connection reset by peer" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.567726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.567857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.567893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfxbg\" (UniqueName: \"kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.567924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.627454 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.669432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.669487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfxbg\" (UniqueName: \"kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.669518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: 
\"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.669594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.676853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.680248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.681266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.686849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfxbg\" (UniqueName: \"kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg\") pod \"neutron-68cdcdb79d-vdlkj\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.739883 5030 generic.go:334] "Generic (PLEG): container finished" podID="09bdbd07-9917-40c2-9583-b9633d529373" containerID="4c3206d4e2b2e6b3beae0608df2af062fdb038791006f7d48e30d5b6ee108603" exitCode=0 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.739927 5030 generic.go:334] "Generic (PLEG): container finished" podID="09bdbd07-9917-40c2-9583-b9633d529373" containerID="940e781c7a85b198a4d2b6b1b3c4b7082bf01ab6f893127baf96e3dcc162635e" exitCode=143 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.739949 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerDied","Data":"4c3206d4e2b2e6b3beae0608df2af062fdb038791006f7d48e30d5b6ee108603"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.740002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerDied","Data":"940e781c7a85b198a4d2b6b1b3c4b7082bf01ab6f893127baf96e3dcc162635e"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742048 5030 generic.go:334] "Generic (PLEG): container finished" podID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerID="825b5af702ffdf5de2be5f22ddc61eac516d5cbe358b33e13e145f3e929fbc57" exitCode=0 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742067 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerID="5119d0d0cc4f6a962474254571f2c6fcb5f0d2fbf5b431abae2990241b7a16de" exitCode=143 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerDied","Data":"825b5af702ffdf5de2be5f22ddc61eac516d5cbe358b33e13e145f3e929fbc57"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerDied","Data":"5119d0d0cc4f6a962474254571f2c6fcb5f0d2fbf5b431abae2990241b7a16de"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"1e5f8238-69f5-43a5-843f-82e5195fde82","Type":"ContainerDied","Data":"39bccae4387f02af6fa7a5e024ae31aae72dd0997152fc9e4dd93c999c710c26"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.742168 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39bccae4387f02af6fa7a5e024ae31aae72dd0997152fc9e4dd93c999c710c26" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.744951 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_809c9ceb-bda3-4971-88b2-3fb7bd4cee02/ovn-northd/0.log" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.744999 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"809c9ceb-bda3-4971-88b2-3fb7bd4cee02","Type":"ContainerDied","Data":"cf1f2eab93c195c5b1e4618d45aa7077ef4a1cba677870047e5207b1ed666494"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.745021 5030 scope.go:117] "RemoveContainer" containerID="1f5c17ad369f742c3c741a76fb19d1dc31e60772125f2a6a42307522b80a2db3" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.745161 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.766358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerStarted","Data":"1ea957f41ba227c3e41f1638e15e663ffad55ccf95f6a36654fdbd2571340e9c"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.766560 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.783765 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_95a6a20e-e4cf-4c12-be69-33ace96481c7/ovsdbserver-nb/0.log" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.784055 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerID="bd45d6d17dbbe4a1ba70c5139b2c784f386f57335e21bcbdfe022af808f6aec3" exitCode=2 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.784110 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerID="f413a83947a3f641c49ef1fe9149ee0fb63df1dd4f02e577126231ff5df950cd" exitCode=143 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.784129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerDied","Data":"bd45d6d17dbbe4a1ba70c5139b2c784f386f57335e21bcbdfe022af808f6aec3"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.784159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerDied","Data":"f413a83947a3f641c49ef1fe9149ee0fb63df1dd4f02e577126231ff5df950cd"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.785224 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.789698 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.796736 5030 scope.go:117] "RemoveContainer" containerID="41a4453c6599195810c26e314fe2f606cb5655a663af8cbb23e523f3df4773ab" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.796937 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.814233 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerID="d33518e830a3f7bd93cdb4cab2e8d31c3450485a686941e871eb32f2b839ba27" exitCode=0 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.814340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" event={"ID":"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4","Type":"ContainerDied","Data":"d33518e830a3f7bd93cdb4cab2e8d31c3450485a686941e871eb32f2b839ba27"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.817850 5030 generic.go:334] "Generic (PLEG): container finished" podID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerID="b3c9339bc8d74ca4a63b31adbe8e69052f2ce6fb016f4626733fb1fb227b5639" exitCode=0 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.817881 5030 generic.go:334] "Generic (PLEG): container finished" podID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerID="faedcaf04f2f97afdb74b95ee3377cf5cd8b168accbb420e066401fd1a93fb51" exitCode=143 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.818892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerDied","Data":"b3c9339bc8d74ca4a63b31adbe8e69052f2ce6fb016f4626733fb1fb227b5639"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.818929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerDied","Data":"faedcaf04f2f97afdb74b95ee3377cf5cd8b168accbb420e066401fd1a93fb51"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.835260 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerID="c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20" exitCode=0 Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.836288 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerDied","Data":"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20"} Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.847362 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.883970 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.911688 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:38 crc kubenswrapper[5030]: E0120 23:31:38.912385 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-log" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.912398 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-log" Jan 20 23:31:38 crc kubenswrapper[5030]: E0120 23:31:38.912442 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-httpd" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.912450 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-httpd" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.913208 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-log" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.913247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" containerName="glance-httpd" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.914370 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.919826 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.919955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.919994 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.920368 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.920408 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.932809 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.770273032 podStartE2EDuration="5.932786928s" podCreationTimestamp="2026-01-20 23:31:33 +0000 UTC" firstStartedPulling="2026-01-20 23:31:34.7547882 +0000 UTC m=+3367.075048488" lastFinishedPulling="2026-01-20 23:31:37.917302106 +0000 UTC m=+3370.237562384" observedRunningTime="2026-01-20 23:31:38.808235384 +0000 UTC m=+3371.128495672" watchObservedRunningTime="2026-01-20 23:31:38.932786928 +0000 UTC m=+3371.253047216" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.972601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.976123 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.976182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.976212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.976261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpb5x\" (UniqueName: \"kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.977906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.977931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.977960 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle\") pod \"1e5f8238-69f5-43a5-843f-82e5195fde82\" (UID: \"1e5f8238-69f5-43a5-843f-82e5195fde82\") " Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpt4g\" (UniqueName: \"kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc 
kubenswrapper[5030]: I0120 23:31:38.978308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.978371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.979224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.980133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs" (OuterVolumeSpecName: "logs") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.994963 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.995004 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.996062 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:38 crc kubenswrapper[5030]: I0120 23:31:38.996369 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.007263 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_95a6a20e-e4cf-4c12-be69-33ace96481c7/ovsdbserver-nb/0.log" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.007328 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.014109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts" (OuterVolumeSpecName: "scripts") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.018441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x" (OuterVolumeSpecName: "kube-api-access-cpb5x") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "kube-api-access-cpb5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.023891 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.031919 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.052855 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.053105 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api-log" containerID="cri-o://89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435" gracePeriod=30 Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.053229 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api" containerID="cri-o://50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532" gracePeriod=30 Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.052802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ltf\" (UniqueName: \"kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085275 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.085574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8xmh\" (UniqueName: \"kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8rgw\" (UniqueName: \"kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: 
\"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.086847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpt4g\" (UniqueName: \"kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087276 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.087908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id" (OuterVolumeSpecName: 
"etc-machine-id") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088074 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088171 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e5f8238-69f5-43a5-843f-82e5195fde82-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088259 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088363 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088479 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088556 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpb5x\" (UniqueName: \"kubernetes.io/projected/1e5f8238-69f5-43a5-843f-82e5195fde82-kube-api-access-cpb5x\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088655 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/09bdbd07-9917-40c2-9583-b9633d529373-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.088966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs" (OuterVolumeSpecName: "logs") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.091177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.091254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.096143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.098955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.102854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.106943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.107093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh" (OuterVolumeSpecName: "kube-api-access-d8xmh") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "kube-api-access-d8xmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.107129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.107308 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.108880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config" (OuterVolumeSpecName: "config") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.110981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts" (OuterVolumeSpecName: "scripts") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.113789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts" (OuterVolumeSpecName: "scripts") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.113859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf" (OuterVolumeSpecName: "kube-api-access-67ltf") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "kube-api-access-67ltf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.121843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw" (OuterVolumeSpecName: "kube-api-access-q8rgw") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "kube-api-access-q8rgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.127129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.127136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts" (OuterVolumeSpecName: "scripts") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.134390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpt4g\" (UniqueName: \"kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g\") pod \"ovn-northd-0\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.181559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data" (OuterVolumeSpecName: "config-data") pod "1e5f8238-69f5-43a5-843f-82e5195fde82" (UID: "1e5f8238-69f5-43a5-843f-82e5195fde82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.190712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.191091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcjxx\" (UniqueName: \"kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.191699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.191808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.191906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle\") pod \"09bdbd07-9917-40c2-9583-b9633d529373\" (UID: \"09bdbd07-9917-40c2-9583-b9633d529373\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 
20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data\") pod \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\" (UID: \"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"95a6a20e-e4cf-4c12-be69-33ace96481c7\" (UID: \"95a6a20e-e4cf-4c12-be69-33ace96481c7\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.192667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.193948 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194071 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8rgw\" (UniqueName: \"kubernetes.io/projected/09bdbd07-9917-40c2-9583-b9633d529373-kube-api-access-q8rgw\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194139 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194210 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194264 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a6a20e-e4cf-4c12-be69-33ace96481c7-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194334 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e5f8238-69f5-43a5-843f-82e5195fde82-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194390 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ltf\" (UniqueName: \"kubernetes.io/projected/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-kube-api-access-67ltf\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194466 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09bdbd07-9917-40c2-9583-b9633d529373-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194523 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95a6a20e-e4cf-4c12-be69-33ace96481c7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194598 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8xmh\" (UniqueName: 
\"kubernetes.io/projected/95a6a20e-e4cf-4c12-be69-33ace96481c7-kube-api-access-d8xmh\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.194699 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.196091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.201815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.205398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs" (OuterVolumeSpecName: "logs") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.209516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts" (OuterVolumeSpecName: "scripts") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.210573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.211506 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.213032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.213197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx" (OuterVolumeSpecName: "kube-api-access-hcjxx") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "kube-api-access-hcjxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.217084 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.263852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.273154 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.275321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data" (OuterVolumeSpecName: "config-data") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data\") pod \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\" (UID: \"07cd67f8-2bd9-448c-a418-5d1aa37808d6\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295793 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295817 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcjxx\" (UniqueName: \"kubernetes.io/projected/07cd67f8-2bd9-448c-a418-5d1aa37808d6-kube-api-access-hcjxx\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295842 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295852 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295861 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295871 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295881 5030 reconciler_common.go:293] "Volume detached for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07cd67f8-2bd9-448c-a418-5d1aa37808d6-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295895 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295904 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.295913 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.317864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.338578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a6a20e-e4cf-4c12-be69-33ace96481c7" (UID: "95a6a20e-e4cf-4c12-be69-33ace96481c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.345032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.345580 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data" (OuterVolumeSpecName: "config-data") pod "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" (UID: "dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.350781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09bdbd07-9917-40c2-9583-b9633d529373" (UID: "09bdbd07-9917-40c2-9583-b9633d529373"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.357425 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.370955 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.376012 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.376475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.376498 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396773 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a6a20e-e4cf-4c12-be69-33ace96481c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396800 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396810 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396819 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396827 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396835 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.396844 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09bdbd07-9917-40c2-9583-b9633d529373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.420545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data" (OuterVolumeSpecName: "config-data") pod "07cd67f8-2bd9-448c-a418-5d1aa37808d6" (UID: "07cd67f8-2bd9-448c-a418-5d1aa37808d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.498496 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07cd67f8-2bd9-448c-a418-5d1aa37808d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.636799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:39 crc kubenswrapper[5030]: E0120 23:31:39.679370 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:39 crc kubenswrapper[5030]: E0120 23:31:39.684452 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:39 crc kubenswrapper[5030]: E0120 23:31:39.699125 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:39 crc kubenswrapper[5030]: E0120 23:31:39.699185 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.806185 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.852236 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_95a6a20e-e4cf-4c12-be69-33ace96481c7/ovsdbserver-nb/0.log" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.852306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"95a6a20e-e4cf-4c12-be69-33ace96481c7","Type":"ContainerDied","Data":"b0c5d2cc201d083fc84c7006cdf08cd364054e065afe9db1a9548962a8c189c4"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.852338 5030 scope.go:117] "RemoveContainer" containerID="bd45d6d17dbbe4a1ba70c5139b2c784f386f57335e21bcbdfe022af808f6aec3" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.852462 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.859918 5030 generic.go:334] "Generic (PLEG): container finished" podID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerID="89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435" exitCode=143 Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.859998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerDied","Data":"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.861524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" event={"ID":"dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4","Type":"ContainerDied","Data":"7f8aa711319815580fd9a2344e5e5ec25bc08dbcc14cc5368c5872a1bee7b59f"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.861669 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-84588d847c-cnh5j" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.875920 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"09bdbd07-9917-40c2-9583-b9633d529373","Type":"ContainerDied","Data":"680837b7360badfe79509cf11a55879d38791a680f79da2a441b410e582e65aa"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.876087 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.891760 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerID="ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00" exitCode=0 Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.892239 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.892707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerDied","Data":"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.892758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7b5f97974b-77qrc" event={"ID":"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31","Type":"ContainerDied","Data":"19b2b71612e3a2a85e00ffaa0364277bb531f46af7bc4f89dedf5363f5ce15ed"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.902534 5030 scope.go:117] "RemoveContainer" containerID="f413a83947a3f641c49ef1fe9149ee0fb63df1dd4f02e577126231ff5df950cd" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.903576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" event={"ID":"5fffcdc3-dbc6-49d2-9e77-96c41651ee20","Type":"ContainerStarted","Data":"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.903608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" event={"ID":"5fffcdc3-dbc6-49d2-9e77-96c41651ee20","Type":"ContainerStarted","Data":"134d7060918f615b30082a51ebe9b97f6a9e6095e61ab8e55495cb0466c2d9b2"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.905747 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.907488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckjl7\" (UniqueName: \"kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7\") pod \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.907560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config\") pod \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.907595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle\") pod \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.907660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config\") pod \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\" (UID: \"6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31\") " Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.912107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"07cd67f8-2bd9-448c-a418-5d1aa37808d6","Type":"ContainerDied","Data":"42aa7635e18fa04ed8e3a79e886ed851b1cd7dd91cd9b0fa6211ef4a73541160"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.912147 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.915895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7" (OuterVolumeSpecName: "kube-api-access-ckjl7") pod "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" (UID: "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31"). InnerVolumeSpecName "kube-api-access-ckjl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.920697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerStarted","Data":"53b6e46f0f2606d32f4134ae76e23a2e80e6de9eb52d26d3a088d0057324ab6b"} Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.921084 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:39 crc kubenswrapper[5030]: I0120 23:31:39.938736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" (UID: "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:39.973090 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" podStartSLOduration=2.9730679 podStartE2EDuration="2.9730679s" podCreationTimestamp="2026-01-20 23:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:39.959786279 +0000 UTC m=+3372.280046567" watchObservedRunningTime="2026-01-20 23:31:39.9730679 +0000 UTC m=+3372.293328188" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:39.993392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" (UID: "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.010588 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.010613 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.010642 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckjl7\" (UniqueName: \"kubernetes.io/projected/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-kube-api-access-ckjl7\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.027447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config" (OuterVolumeSpecName: "config") pod "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" (UID: "6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.081132 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.081700 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.118786 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.270201 5030 scope.go:117] "RemoveContainer" containerID="d33518e830a3f7bd93cdb4cab2e8d31c3450485a686941e871eb32f2b839ba27" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.330778 5030 scope.go:117] "RemoveContainer" containerID="4c3206d4e2b2e6b3beae0608df2af062fdb038791006f7d48e30d5b6ee108603" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.345869 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809c9ceb-bda3-4971-88b2-3fb7bd4cee02" path="/var/lib/kubelet/pods/809c9ceb-bda3-4971-88b2-3fb7bd4cee02/volumes" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.362467 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.362513 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.362529 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.362548 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:40 crc 
kubenswrapper[5030]: I0120 23:31:40.362561 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.363519 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="openstack-network-exporter" containerID="cri-o://a438c41191f2f0394a1ab37c20bc2ba8dd55038f659864717d061a59d37cf44a" gracePeriod=300 Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.395739 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-84588d847c-cnh5j"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.406249 5030 scope.go:117] "RemoveContainer" containerID="940e781c7a85b198a4d2b6b1b3c4b7082bf01ab6f893127baf96e3dcc162635e" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.428670 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.470092 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.470473 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.205:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.485895 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.530664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.546950 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7b5f97974b-77qrc"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.559882 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560337 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api-log" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api-log" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560375 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-api" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560393 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="ovsdbserver-nb" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560399 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="ovsdbserver-nb" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560412 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="openstack-network-exporter" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560418 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="openstack-network-exporter" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560429 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerName="keystone-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560434 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerName="keystone-api" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-log" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560460 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-log" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560469 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560476 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560484 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560491 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.560502 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560508 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560692 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="openstack-network-exporter" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560700 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560711 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api-log" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560720 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560729 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" containerName="neutron-httpd" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560746 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" containerName="ovsdbserver-nb" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560753 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" containerName="keystone-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560761 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" containerName="glance-log" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.560773 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09bdbd07-9917-40c2-9583-b9633d529373" containerName="cinder-api" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.561683 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.566762 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="ovsdbserver-sb" containerID="cri-o://1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" gracePeriod=300 Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.569523 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.569778 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.570138 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.574108 5030 scope.go:117] "RemoveContainer" containerID="c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.575983 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.586332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.594641 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.603806 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.610670 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.620379 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.622124 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.631668 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.640352 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.651944 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.654067 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.656226 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.675401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.676036 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.676581 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-54dw7" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.681924 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.724137 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.726749 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.731359 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.731603 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.731764 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.731897 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-6vjfc" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.755204 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756482 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lt4m\" (UniqueName: \"kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8tm\" (UniqueName: \"kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.756769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.762052 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.777008 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.783005 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.784662 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.786910 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.787134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.794290 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lt4m\" (UniqueName: \"kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858526 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle\") pod 
\"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858647 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8tm\" (UniqueName: \"kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858788 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858844 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.858990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.859006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4t5j\" (UniqueName: \"kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.859025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n6vx\" (UniqueName: \"kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.859450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.860911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.867967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.870470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.872819 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.874970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" 
Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.876117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.877086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.877355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.877449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.879449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.883741 5030 scope.go:117] "RemoveContainer" containerID="ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.888681 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8tm\" (UniqueName: \"kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm\") pod \"neutron-5c8797d4c8-svfrs\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.890134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lt4m\" (UniqueName: \"kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m\") pod \"cinder-api-0\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.932099 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8f0e49bd-aaee-440e-a6ec-7498907c5c7c/ovsdbserver-sb/0.log" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.932138 5030 generic.go:334] "Generic (PLEG): container finished" podID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerID="a438c41191f2f0394a1ab37c20bc2ba8dd55038f659864717d061a59d37cf44a" exitCode=2 Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.932155 5030 generic.go:334] "Generic (PLEG): container finished" podID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerID="1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" exitCode=143 Jan 20 23:31:40 crc 
kubenswrapper[5030]: I0120 23:31:40.932182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerDied","Data":"a438c41191f2f0394a1ab37c20bc2ba8dd55038f659864717d061a59d37cf44a"} Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.932202 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerDied","Data":"1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40"} Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.938008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerStarted","Data":"9f22df28708507c4cb11d52f25b980dfed79f069a0a1720d14cfc940f5eae877"} Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.940454 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="cinder-scheduler" containerID="cri-o://1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696" gracePeriod=30 Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.940481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerStarted","Data":"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db"} Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.942149 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="probe" containerID="cri-o://6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e" gracePeriod=30 Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.956959 5030 scope.go:117] "RemoveContainer" containerID="c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.960129 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20\": container with ID starting with c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20 not found: ID does not exist" containerID="c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.960177 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20"} err="failed to get container status \"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20\": rpc error: code = NotFound desc = could not find container \"c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20\": container with ID starting with c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20 not found: ID does not exist" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.960271 5030 scope.go:117] "RemoveContainer" containerID="ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00" Jan 20 23:31:40 crc kubenswrapper[5030]: E0120 23:31:40.961988 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00\": container with ID starting with ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00 not found: ID does not exist" containerID="ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.962009 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00"} err="failed to get container status \"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00\": rpc error: code = NotFound desc = could not find container \"ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00\": container with ID starting with ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00 not found: ID does not exist" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.962026 5030 scope.go:117] "RemoveContainer" containerID="b3c9339bc8d74ca4a63b31adbe8e69052f2ce6fb016f4626733fb1fb227b5639" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n6vx\" (UniqueName: \"kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963476 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.963965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.964698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.964772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.964833 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.964882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.964949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965035 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvst\" (UniqueName: \"kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4t5j\" (UniqueName: \"kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.965306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: 
I0120 23:31:40.965757 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.966384 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.967523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.968164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.968341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.968392 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.969459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.979383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.980740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.981471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.984898 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.985112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.985693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.987151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.993876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4t5j\" (UniqueName: \"kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:40 crc kubenswrapper[5030]: I0120 23:31:40.995251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n6vx\" (UniqueName: \"kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.009965 5030 scope.go:117] "RemoveContainer" containerID="faedcaf04f2f97afdb74b95ee3377cf5cd8b168accbb420e066401fd1a93fb51" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.030053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.034380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.069260 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.069685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.069845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.079163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.079931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvst\" (UniqueName: \"kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.080200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.080248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.080310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.080465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.080581 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.083108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.084866 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.085408 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.087473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.102183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.108409 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.109703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvst\" (UniqueName: \"kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.129099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.134551 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40 is running failed: container process not found" containerID="1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.135019 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40 is running failed: container process not found" containerID="1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.135537 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40 is running failed: container process not found" containerID="1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.135575 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="ovsdbserver-sb" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.156201 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.163188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.189784 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.206683 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.220739 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.222744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.254000 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8f0e49bd-aaee-440e-a6ec-7498907c5c7c/ovsdbserver-sb/0.log" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.254065 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.385448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb9wf\" (UniqueName: \"kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.385732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.385758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.385790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.385951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.386090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts\") pod \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\" (UID: \"8f0e49bd-aaee-440e-a6ec-7498907c5c7c\") " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.386946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.387343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts" (OuterVolumeSpecName: "scripts") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.387940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config" (OuterVolumeSpecName: "config") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.396753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.397427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf" (OuterVolumeSpecName: "kube-api-access-wb9wf") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). InnerVolumeSpecName "kube-api-access-wb9wf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.435815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f0e49bd-aaee-440e-a6ec-7498907c5c7c" (UID: "8f0e49bd-aaee-440e-a6ec-7498907c5c7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.488372 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.488615 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb9wf\" (UniqueName: \"kubernetes.io/projected/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-kube-api-access-wb9wf\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.488865 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.488924 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.488996 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.489049 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f0e49bd-aaee-440e-a6ec-7498907c5c7c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.519843 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.591146 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:41 crc kubenswrapper[5030]: I0120 23:31:41.637525 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.683017 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.686775 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.694315 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:41 crc kubenswrapper[5030]: E0120 23:31:41.694385 5030 prober.go:104] "Probe errored" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerName="nova-cell0-conductor-conductor" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.033359 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_8f0e49bd-aaee-440e-a6ec-7498907c5c7c/ovsdbserver-sb/0.log" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.033460 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.033630 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07cd67f8-2bd9-448c-a418-5d1aa37808d6" path="/var/lib/kubelet/pods/07cd67f8-2bd9-448c-a418-5d1aa37808d6/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.034495 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09bdbd07-9917-40c2-9583-b9633d529373" path="/var/lib/kubelet/pods/09bdbd07-9917-40c2-9583-b9633d529373/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.042259 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5f8238-69f5-43a5-843f-82e5195fde82" path="/var/lib/kubelet/pods/1e5f8238-69f5-43a5-843f-82e5195fde82/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.047053 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31" path="/var/lib/kubelet/pods/6a1bd7b8-3e6a-4cf2-8387-40e92fa0ab31/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.047955 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a6a20e-e4cf-4c12-be69-33ace96481c7" path="/var/lib/kubelet/pods/95a6a20e-e4cf-4c12-be69-33ace96481c7/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.048535 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4" path="/var/lib/kubelet/pods/dd7f96cb-d8e2-47d9-b7e6-d1b13a7394f4/volumes" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.049747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"8f0e49bd-aaee-440e-a6ec-7498907c5c7c","Type":"ContainerDied","Data":"f7298c10b79114bd49af1af229aa411b4ff244a5cf35c68ef5ab0092bece7705"} Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.052109 5030 scope.go:117] "RemoveContainer" containerID="a438c41191f2f0394a1ab37c20bc2ba8dd55038f659864717d061a59d37cf44a" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.056346 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.072979 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerStarted","Data":"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa"} Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.073020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerStarted","Data":"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7"} Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 
23:31:42.073196 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="ovn-northd" containerID="cri-o://efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7" gracePeriod=30 Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.073405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.073651 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="openstack-network-exporter" containerID="cri-o://cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa" gracePeriod=30 Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.095059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerStarted","Data":"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f"} Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.095479 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-api" containerID="cri-o://c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db" gracePeriod=30 Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.095815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.095859 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-httpd" containerID="cri-o://6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f" gracePeriod=30 Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.101708 5030 scope.go:117] "RemoveContainer" containerID="1d3442a0a542586be3dae25255dace0589b3705afc3accc9a2bd41c8606fed40" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.118793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerStarted","Data":"651d309731d23d217163f62c311b1286133380d551c3616274dabbbe3072b8ab"} Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.123825 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.141353 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.149052 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.278369 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=4.278350435 podStartE2EDuration="4.278350435s" podCreationTimestamp="2026-01-20 23:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:31:42.104473586 +0000 UTC m=+3374.424733874" watchObservedRunningTime="2026-01-20 23:31:42.278350435 +0000 UTC m=+3374.598610723" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.314099 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.323086 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" podStartSLOduration=4.322675548 podStartE2EDuration="4.322675548s" podCreationTimestamp="2026-01-20 23:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:42.135826205 +0000 UTC m=+3374.456086493" watchObservedRunningTime="2026-01-20 23:31:42.322675548 +0000 UTC m=+3374.642935836" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.344754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.354675 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.369959 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: E0120 23:31:42.371226 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="openstack-network-exporter" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.371241 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="openstack-network-exporter" Jan 20 23:31:42 crc kubenswrapper[5030]: E0120 23:31:42.371305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="ovsdbserver-sb" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.371312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="ovsdbserver-sb" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.371736 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="openstack-network-exporter" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.371752 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" containerName="ovsdbserver-sb" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.373408 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.378169 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.379471 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.379956 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.380133 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.380386 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.456257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.456779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.457318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.457516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.457692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.457789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.457867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.458163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrphn\" (UniqueName: \"kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrphn\" (UniqueName: \"kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559417 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.559733 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.563119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.564445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.571136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.572141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.573857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.583881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.593806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrphn\" (UniqueName: \"kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.682442 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:42 crc kubenswrapper[5030]: I0120 23:31:42.703949 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.006841 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.180505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom\") pod \"36dbe7a0-9145-4358-9b8d-50341f70b132\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.181256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data\") pod \"36dbe7a0-9145-4358-9b8d-50341f70b132\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.181313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqv9q\" (UniqueName: \"kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q\") pod \"36dbe7a0-9145-4358-9b8d-50341f70b132\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.181346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle\") pod \"36dbe7a0-9145-4358-9b8d-50341f70b132\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.181382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs\") pod \"36dbe7a0-9145-4358-9b8d-50341f70b132\" (UID: \"36dbe7a0-9145-4358-9b8d-50341f70b132\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.183059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs" (OuterVolumeSpecName: "logs") pod "36dbe7a0-9145-4358-9b8d-50341f70b132" (UID: "36dbe7a0-9145-4358-9b8d-50341f70b132"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.207513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q" (OuterVolumeSpecName: "kube-api-access-kqv9q") pod "36dbe7a0-9145-4358-9b8d-50341f70b132" (UID: "36dbe7a0-9145-4358-9b8d-50341f70b132"). InnerVolumeSpecName "kube-api-access-kqv9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.209349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "36dbe7a0-9145-4358-9b8d-50341f70b132" (UID: "36dbe7a0-9145-4358-9b8d-50341f70b132"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.223159 5030 generic.go:334] "Generic (PLEG): container finished" podID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerID="50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532" exitCode=0 Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.223213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerDied","Data":"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.223238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" event={"ID":"36dbe7a0-9145-4358-9b8d-50341f70b132","Type":"ContainerDied","Data":"ed07673a5867863998fecb8762dabc9a52894bd61d5f540489349b856506ba08"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.223252 5030 scope.go:117] "RemoveContainer" containerID="50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.223343 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.259800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36dbe7a0-9145-4358-9b8d-50341f70b132" (UID: "36dbe7a0-9145-4358-9b8d-50341f70b132"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.263910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerStarted","Data":"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.263956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerStarted","Data":"64151782faeffac0b0b3a84c40974698501a2092d9149ec0d69ca210c8027e09"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.282184 5030 generic.go:334] "Generic (PLEG): container finished" podID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerID="cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa" exitCode=2 Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.282255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerDied","Data":"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.284900 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.284937 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqv9q\" (UniqueName: \"kubernetes.io/projected/36dbe7a0-9145-4358-9b8d-50341f70b132-kube-api-access-kqv9q\") on node \"crc\" DevicePath \"\"" Jan 
20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.284950 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.284961 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36dbe7a0-9145-4358-9b8d-50341f70b132-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.317306 5030 generic.go:334] "Generic (PLEG): container finished" podID="933d538e-e825-4673-a66a-6eedb28088a8" containerID="6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f" exitCode=0 Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.321695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerDied","Data":"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.373366 5030 generic.go:334] "Generic (PLEG): container finished" podID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerID="4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" exitCode=0 Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.373686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ed0cd39a-1c2a-4c57-952f-9f389530459d","Type":"ContainerDied","Data":"4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.383453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data" (OuterVolumeSpecName: "config-data") pod "36dbe7a0-9145-4358-9b8d-50341f70b132" (UID: "36dbe7a0-9145-4358-9b8d-50341f70b132"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.386179 5030 generic.go:334] "Generic (PLEG): container finished" podID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerID="6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e" exitCode=0 Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.386236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerDied","Data":"6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.386771 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36dbe7a0-9145-4358-9b8d-50341f70b132-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.419856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerStarted","Data":"468f7ecd23883ebc707e2adea47de8e5894f76a5861f669d19466c6da8e55817"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.423494 5030 scope.go:117] "RemoveContainer" containerID="89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.447105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerStarted","Data":"b381ea0333a3d05f431a79b7595daed3dba7f5af1c7a605d46c79bd6f10c58d3"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.458103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerStarted","Data":"2c8b30fe3bc11e57c9985ae029d78217dff93a87b0704952f4f312545e5171c2"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.461901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerStarted","Data":"5acce875889d65a3f586a3832b99f2d8bced89596046689298d3df1e58194783"} Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.665763 5030 scope.go:117] "RemoveContainer" containerID="50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.666029 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:43 crc kubenswrapper[5030]: E0120 23:31:43.680377 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532\": container with ID starting with 50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532 not found: ID does not exist" containerID="50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.680422 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532"} err="failed to get container status \"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532\": rpc error: code = NotFound desc = could not find container \"50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532\": container with ID starting with 50df2ae8942ce51ed05a0430d679194edeabc2f16286a07b4ea6bf95d741b532 not found: ID does not exist" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.680447 5030 scope.go:117] "RemoveContainer" containerID="89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435" Jan 20 23:31:43 crc kubenswrapper[5030]: E0120 23:31:43.686829 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435\": container with ID starting with 89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435 not found: ID does not exist" containerID="89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.686892 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435"} err="failed to get container status \"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435\": rpc error: code = NotFound desc = could not find container \"89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435\": container with ID starting with 89a6eaffa75f7ed5cdb5cb31c7a58a9e4133282fc31de2764c164ad8febfa435 not found: ID does not exist" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.701442 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.744684 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-9cf5cf68d-sff9w"] Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.770497 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.804554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc29f\" (UniqueName: \"kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f\") pod \"ed0cd39a-1c2a-4c57-952f-9f389530459d\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.804709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle\") pod 
\"ed0cd39a-1c2a-4c57-952f-9f389530459d\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.804747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data\") pod \"ed0cd39a-1c2a-4c57-952f-9f389530459d\" (UID: \"ed0cd39a-1c2a-4c57-952f-9f389530459d\") " Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.845445 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:31:43 crc kubenswrapper[5030]: E0120 23:31:43.845862 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.845878 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api" Jan 20 23:31:43 crc kubenswrapper[5030]: E0120 23:31:43.845903 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api-log" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.845912 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api-log" Jan 20 23:31:43 crc kubenswrapper[5030]: E0120 23:31:43.845922 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerName="nova-cell0-conductor-conductor" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.845928 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerName="nova-cell0-conductor-conductor" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.846111 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" containerName="nova-cell0-conductor-conductor" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.846134 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.846145 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" containerName="barbican-api-log" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.847077 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.850727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f" (OuterVolumeSpecName: "kube-api-access-gc29f") pod "ed0cd39a-1c2a-4c57-952f-9f389530459d" (UID: "ed0cd39a-1c2a-4c57-952f-9f389530459d"). InnerVolumeSpecName "kube-api-access-gc29f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.852064 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.852232 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.852428 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.870263 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.912166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.912803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scg64\" (UniqueName: \"kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.912831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.913530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.913782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.913914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.914138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: 
\"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.914544 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc29f\" (UniqueName: \"kubernetes.io/projected/ed0cd39a-1c2a-4c57-952f-9f389530459d-kube-api-access-gc29f\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:43 crc kubenswrapper[5030]: I0120 23:31:43.916750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.024824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.024897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.024901 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36dbe7a0-9145-4358-9b8d-50341f70b132" path="/var/lib/kubelet/pods/36dbe7a0-9145-4358-9b8d-50341f70b132/volumes" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.024932 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scg64\" (UniqueName: \"kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.024951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.025035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.025064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.025082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 
23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.026042 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0e49bd-aaee-440e-a6ec-7498907c5c7c" path="/var/lib/kubelet/pods/8f0e49bd-aaee-440e-a6ec-7498907c5c7c/volumes" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.079380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.094274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.095869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.096297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.097037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.101101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.101211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scg64\" (UniqueName: \"kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64\") pod \"neutron-5ffdf8f866-wq6lw\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.132684 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data" (OuterVolumeSpecName: "config-data") pod "ed0cd39a-1c2a-4c57-952f-9f389530459d" (UID: "ed0cd39a-1c2a-4c57-952f-9f389530459d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.163605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed0cd39a-1c2a-4c57-952f-9f389530459d" (UID: "ed0cd39a-1c2a-4c57-952f-9f389530459d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.196166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.237451 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.237487 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0cd39a-1c2a-4c57-952f-9f389530459d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.482150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerStarted","Data":"ef6976dc418e8fc75f18940eec16d6ef87c71cd9327cf28588a9a353d0afb920"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.483936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerStarted","Data":"7500d10c05813e90a38f8a26285eef2c34b7e71479f0011b90857ee52f9a3d74"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.484029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerStarted","Data":"9610c64c6d13e545efd2ca85d32f3b7947e2a7fa0179fb1ad72d312779b314e3"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.489340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerStarted","Data":"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.489485 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-api" containerID="cri-o://8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70" gracePeriod=30 Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.490039 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-httpd" containerID="cri-o://94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b" gracePeriod=30 Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.490078 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.507191 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.507647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"ed0cd39a-1c2a-4c57-952f-9f389530459d","Type":"ContainerDied","Data":"1b49fdd0b65a50cd457af6f8430af339083aebd08aaccaa4a459230e29b498ce"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.514119 5030 scope.go:117] "RemoveContainer" containerID="4e409b8ed77abff8688fbca2a42ce4b6681d55a94a5e7bf4e50db39b16315018" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.526342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerStarted","Data":"3fa7964cdb8712e5bc2875ed3c827e06a24e37a9f1c57a93f9ae63ab718c78b7"} Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.533986 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=4.533954277 podStartE2EDuration="4.533954277s" podCreationTimestamp="2026-01-20 23:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:44.501881111 +0000 UTC m=+3376.822141399" watchObservedRunningTime="2026-01-20 23:31:44.533954277 +0000 UTC m=+3376.854214565" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.594190 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" podStartSLOduration=4.594168085 podStartE2EDuration="4.594168085s" podCreationTimestamp="2026-01-20 23:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:44.525940953 +0000 UTC m=+3376.846201241" watchObservedRunningTime="2026-01-20 23:31:44.594168085 +0000 UTC m=+3376.914428373" Jan 20 23:31:44 crc kubenswrapper[5030]: E0120 23:31:44.670353 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:44 crc kubenswrapper[5030]: E0120 23:31:44.722954 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.737858 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.750613 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.760672 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.768134 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 
23:31:44.769414 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.773894 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.777381 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:44 crc kubenswrapper[5030]: E0120 23:31:44.781839 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:44 crc kubenswrapper[5030]: E0120 23:31:44.781936 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.856716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88lf\" (UniqueName: \"kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.856876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.856971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.959269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.959341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j88lf\" (UniqueName: \"kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.959425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.964699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.966571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:44 crc kubenswrapper[5030]: I0120 23:31:44.977785 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88lf\" (UniqueName: \"kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf\") pod \"nova-cell0-conductor-0\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.099028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.577504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerStarted","Data":"e15443f82953fd9286b3c4ae0ad6d0ae04bde20b646f4ec3be8e39a140ec4ff8"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.577957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerStarted","Data":"fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.581384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerStarted","Data":"dba39c972f09e29c1e7af7710cba14d22bb01635d4706b36515e50f46d419e23"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.581416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerStarted","Data":"e9b169497221304dbe6adee831f056e749c90f520798d74f0635cb2f60436f1d"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.583218 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerStarted","Data":"42893f23058587a6ea7bb91a49056a29e0ef50a9501774456a34ef1a79090ca0"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.588309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerStarted","Data":"5577e422daaa40c7d4a3a4b146deed7f44ec2457dac0430688594f2eba2c7186"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.589154 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.591300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerStarted","Data":"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.591325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerStarted","Data":"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.591335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerStarted","Data":"916135fad9cef7b09807c2cb58922e481b601cc7c9c43b0f59be62f4506292d8"} Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.591584 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.604174 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.604155793 podStartE2EDuration="3.604155793s" podCreationTimestamp="2026-01-20 23:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:45.599410719 +0000 UTC m=+3377.919671007" watchObservedRunningTime="2026-01-20 23:31:45.604155793 +0000 UTC m=+3377.924416081" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.625341 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=5.625323426 podStartE2EDuration="5.625323426s" podCreationTimestamp="2026-01-20 23:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:45.620770376 +0000 UTC m=+3377.941030654" watchObservedRunningTime="2026-01-20 23:31:45.625323426 +0000 UTC m=+3377.945583714" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.648023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.650943 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.650924206 podStartE2EDuration="5.650924206s" podCreationTimestamp="2026-01-20 23:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:45.643605588 +0000 UTC m=+3377.963865876" watchObservedRunningTime="2026-01-20 23:31:45.650924206 +0000 UTC m=+3377.971184494" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.662295 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" podStartSLOduration=2.662277381 podStartE2EDuration="2.662277381s" podCreationTimestamp="2026-01-20 23:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-20 23:31:45.659571525 +0000 UTC m=+3377.979831813" watchObservedRunningTime="2026-01-20 23:31:45.662277381 +0000 UTC m=+3377.982537669" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.704403 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.704803 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=5.704785789 podStartE2EDuration="5.704785789s" podCreationTimestamp="2026-01-20 23:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:45.689473919 +0000 UTC m=+3378.009734197" watchObservedRunningTime="2026-01-20 23:31:45.704785789 +0000 UTC m=+3378.025046077" Jan 20 23:31:45 crc kubenswrapper[5030]: I0120 23:31:45.978423 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0cd39a-1c2a-4c57-952f-9f389530459d" path="/var/lib/kubelet/pods/ed0cd39a-1c2a-4c57-952f-9f389530459d/volumes" Jan 20 23:31:46 crc kubenswrapper[5030]: I0120 23:31:46.207661 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:46 crc kubenswrapper[5030]: I0120 23:31:46.605678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f7fef1ff-a583-42c4-bb53-b6a684d9afad","Type":"ContainerStarted","Data":"9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99"} Jan 20 23:31:46 crc kubenswrapper[5030]: I0120 23:31:46.605733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f7fef1ff-a583-42c4-bb53-b6a684d9afad","Type":"ContainerStarted","Data":"f88ba205a7043255fab8362632c57445cbf0e88f7e203535c2a66d85faa0b12d"} Jan 20 23:31:46 crc kubenswrapper[5030]: I0120 23:31:46.630409 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.630387086 podStartE2EDuration="2.630387086s" podCreationTimestamp="2026-01-20 23:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:46.62230655 +0000 UTC m=+3378.942566828" watchObservedRunningTime="2026-01-20 23:31:46.630387086 +0000 UTC m=+3378.950647374" Jan 20 23:31:47 crc kubenswrapper[5030]: I0120 23:31:47.208063 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:47 crc kubenswrapper[5030]: I0120 23:31:47.280019 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:47 crc kubenswrapper[5030]: I0120 23:31:47.623599 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:47 crc kubenswrapper[5030]: I0120 23:31:47.705146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.084763 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_106c4d71-621c-48e5-93b0-52a932f7ec8f/ovn-northd/0.log" Jan 20 23:31:48 crc 
kubenswrapper[5030]: I0120 23:31:48.084836 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpt4g\" (UniqueName: \"kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224908 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224938 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.224976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.225051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config\") pod \"106c4d71-621c-48e5-93b0-52a932f7ec8f\" (UID: \"106c4d71-621c-48e5-93b0-52a932f7ec8f\") " Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.226015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config" (OuterVolumeSpecName: "config") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.228214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.229131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts" (OuterVolumeSpecName: "scripts") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.231491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g" (OuterVolumeSpecName: "kube-api-access-dpt4g") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "kube-api-access-dpt4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.276204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.310460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327325 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327364 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327377 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpt4g\" (UniqueName: \"kubernetes.io/projected/106c4d71-621c-48e5-93b0-52a932f7ec8f-kube-api-access-dpt4g\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327389 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327400 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.327410 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106c4d71-621c-48e5-93b0-52a932f7ec8f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.336978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "106c4d71-621c-48e5-93b0-52a932f7ec8f" (UID: "106c4d71-621c-48e5-93b0-52a932f7ec8f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.429513 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/106c4d71-621c-48e5-93b0-52a932f7ec8f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640609 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_106c4d71-621c-48e5-93b0-52a932f7ec8f/ovn-northd/0.log" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640686 5030 generic.go:334] "Generic (PLEG): container finished" podID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerID="efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7" exitCode=139 Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640797 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerDied","Data":"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7"} Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"106c4d71-621c-48e5-93b0-52a932f7ec8f","Type":"ContainerDied","Data":"9f22df28708507c4cb11d52f25b980dfed79f069a0a1720d14cfc940f5eae877"} Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.640932 5030 scope.go:117] "RemoveContainer" containerID="cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.713059 5030 scope.go:117] "RemoveContainer" containerID="efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.733393 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.742730 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.757876 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.768742 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:48 crc kubenswrapper[5030]: E0120 23:31:48.769221 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="openstack-network-exporter" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.769242 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="openstack-network-exporter" Jan 20 23:31:48 crc kubenswrapper[5030]: E0120 23:31:48.769256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="ovn-northd" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.769265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="ovn-northd" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.769535 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="openstack-network-exporter" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.769557 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" containerName="ovn-northd" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.771050 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.775985 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.776186 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.776419 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.776602 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.784232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.789853 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbtbw\" (UniqueName: \"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.837934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.838091 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.848765 5030 scope.go:117] "RemoveContainer" containerID="cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa" Jan 20 23:31:48 crc kubenswrapper[5030]: E0120 23:31:48.849146 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa\": container with ID starting with cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa not found: ID does not exist" containerID="cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.849170 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa"} err="failed to get container status \"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa\": rpc error: code = NotFound desc = could not find container \"cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa\": container with ID starting with cc67c82c8df000970b4455986ac36b806a2231abd3274bc2b7d5e93b9f142daa not found: ID does not exist" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.849189 5030 scope.go:117] "RemoveContainer" containerID="efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7" Jan 20 23:31:48 crc kubenswrapper[5030]: E0120 23:31:48.849466 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7\": container with ID starting with efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7 not found: ID does not exist" containerID="efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.849489 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7"} err="failed to get container status \"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7\": rpc error: code = NotFound desc = could not find container \"efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7\": container with ID starting with efea214823ed9d6aa4284654980e5f98e08e924c721db4c512b861b509986df7 not found: ID does not exist" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.939466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.939890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbtbw\" (UniqueName: \"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 
23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.939970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.940054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.940285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.940459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.940568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.940795 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.941353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.943280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.943459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.944414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.945410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:48 crc kubenswrapper[5030]: I0120 23:31:48.954765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbtbw\" (UniqueName: \"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw\") pod \"ovn-northd-0\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:48.999907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.000440 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.001886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.002378 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.135877 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68cdcdb79d-vdlkj_933d538e-e825-4673-a66a-6eedb28088a8/neutron-api/0.log" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.135942 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.153017 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.246037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config\") pod \"933d538e-e825-4673-a66a-6eedb28088a8\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.246083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle\") pod \"933d538e-e825-4673-a66a-6eedb28088a8\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.246249 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfxbg\" (UniqueName: \"kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg\") pod \"933d538e-e825-4673-a66a-6eedb28088a8\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.246277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config\") pod \"933d538e-e825-4673-a66a-6eedb28088a8\" (UID: \"933d538e-e825-4673-a66a-6eedb28088a8\") " Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.250830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg" (OuterVolumeSpecName: "kube-api-access-wfxbg") pod "933d538e-e825-4673-a66a-6eedb28088a8" (UID: "933d538e-e825-4673-a66a-6eedb28088a8"). InnerVolumeSpecName "kube-api-access-wfxbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.252352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "933d538e-e825-4673-a66a-6eedb28088a8" (UID: "933d538e-e825-4673-a66a-6eedb28088a8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.298860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config" (OuterVolumeSpecName: "config") pod "933d538e-e825-4673-a66a-6eedb28088a8" (UID: "933d538e-e825-4673-a66a-6eedb28088a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.302926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "933d538e-e825-4673-a66a-6eedb28088a8" (UID: "933d538e-e825-4673-a66a-6eedb28088a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.348718 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.348748 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.348760 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfxbg\" (UniqueName: \"kubernetes.io/projected/933d538e-e825-4673-a66a-6eedb28088a8-kube-api-access-wfxbg\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.348768 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/933d538e-e825-4673-a66a-6eedb28088a8-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.375981 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.377610 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.382561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.581641 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.639824 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.641148 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.642986 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.643045 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.650940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerStarted","Data":"962a6cfb18b7fa8e5ad9dcf337905f3245c7a5968d387512e6db9b76e1dd3a2b"} Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654109 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-68cdcdb79d-vdlkj_933d538e-e825-4673-a66a-6eedb28088a8/neutron-api/0.log" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654161 5030 generic.go:334] "Generic (PLEG): container finished" podID="933d538e-e825-4673-a66a-6eedb28088a8" containerID="c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db" exitCode=1 Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654212 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654250 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerDied","Data":"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db"} Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj" event={"ID":"933d538e-e825-4673-a66a-6eedb28088a8","Type":"ContainerDied","Data":"53b6e46f0f2606d32f4134ae76e23a2e80e6de9eb52d26d3a088d0057324ab6b"} Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.654323 5030 scope.go:117] "RemoveContainer" containerID="6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.655103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.660529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.662153 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.715717 5030 scope.go:117] "RemoveContainer" containerID="c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.729379 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.740919 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.755484 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-68cdcdb79d-vdlkj"] Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.796121 5030 scope.go:117] "RemoveContainer" containerID="6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f" Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.796850 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f\": container with ID starting with 6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f not found: ID does not exist" 
containerID="6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.796886 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f"} err="failed to get container status \"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f\": rpc error: code = NotFound desc = could not find container \"6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f\": container with ID starting with 6d2c18d4a36183671c3763d395c44aa36c2a831486d57039da0c5c11b9df222f not found: ID does not exist" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.796910 5030 scope.go:117] "RemoveContainer" containerID="c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db" Jan 20 23:31:49 crc kubenswrapper[5030]: E0120 23:31:49.797539 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db\": container with ID starting with c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db not found: ID does not exist" containerID="c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.797585 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db"} err="failed to get container status \"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db\": rpc error: code = NotFound desc = could not find container \"c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db\": container with ID starting with c6b155122882e1ad33c2f41e132bab61dfd832b47c24b4a08122f551206e54db not found: ID does not exist" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.975372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106c4d71-621c-48e5-93b0-52a932f7ec8f" path="/var/lib/kubelet/pods/106c4d71-621c-48e5-93b0-52a932f7ec8f/volumes" Jan 20 23:31:49 crc kubenswrapper[5030]: I0120 23:31:49.976441 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="933d538e-e825-4673-a66a-6eedb28088a8" path="/var/lib/kubelet/pods/933d538e-e825-4673-a66a-6eedb28088a8/volumes" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.135103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.599824 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:31:50 crc kubenswrapper[5030]: E0120 23:31:50.600249 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-api" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.600263 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-api" Jan 20 23:31:50 crc kubenswrapper[5030]: E0120 23:31:50.600312 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-httpd" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.600320 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-httpd" 
Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.600537 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-api" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.600578 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="933d538e-e825-4673-a66a-6eedb28088a8" containerName="neutron-httpd" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.601954 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.605395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd4v7\" (UniqueName: \"kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.605450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.605478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.605684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.605772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.606043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.606070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" 
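The reconciler entries for barbican-api-58d5d86f6d-bhw7w above come in pairs: "operationExecutor.VerifyControllerAttachedVolume started" for each declared volume, followed by "MountVolume.SetUp succeeded" once the mount completes. A hedged sketch for cross-checking that pairing offline follows; it is illustrative only, assumes a saved journal named kubelet.log, and assumes the backslash-escaped quoting used inside the log messages shown here.

#!/usr/bin/env python3
"""Minimal, illustrative sketch: pair the volume reconciler entries in a
saved kubelet journal (assumed file name kubelet.log). For each pod it
collects the volumes named in "VerifyControllerAttachedVolume started"
entries and flags any that never show a matching "MountVolume.SetUp
succeeded" entry. Patterns assume the backslash-escaped quoting used
inside the quoted log messages here."""
import re
import sys
from collections import defaultdict

ATTACHED = re.compile(
    r'VerifyControllerAttachedVolume started for volume \\"(?P<vol>[^"\\]+)\\"'
    r'.*?pod="(?P<pod>[^"]+)"')
MOUNTED = re.compile(
    r'MountVolume\.SetUp succeeded for volume \\"(?P<vol>[^"\\]+)\\"'
    r'.*?pod="(?P<pod>[^"]+)"')


def unmounted(path: str) -> None:
    started = defaultdict(set)   # pod -> volumes the reconciler began attaching
    mounted = defaultdict(set)   # pod -> volumes whose SetUp succeeded
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            for rx, bucket in ((ATTACHED, started), (MOUNTED, mounted)):
                for m in rx.finditer(line):
                    bucket[m.group("pod")].add(m.group("vol"))
    for pod, vols in sorted(started.items()):
        missing = vols - mounted[pod]
        if missing:
            print(f"{pod}: no MountVolume.SetUp success for {sorted(missing)}")


if __name__ == "__main__":
    unmounted(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")

In this window every attached volume for nova-cell0-conductor-0, ovn-northd-0 and barbican-api-58d5d86f6d-bhw7w reports a matching SetUp success, so such a check would print nothing here.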
Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.607609 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.610398 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.616055 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.665171 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c8797d4c8-svfrs_1ca5f6c1-a17b-4405-93e9-fa2fab576c91/neutron-api/0.log" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.665226 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerID="8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70" exitCode=1 Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.665281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerDied","Data":"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70"} Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.668400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerStarted","Data":"df6cb8bdfcb5d92a6cd5e2e793e2b41a4b1b3fe7648144b68448df9085f79bf1"} Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.668444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerStarted","Data":"59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2"} Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.694441 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.6944226650000003 podStartE2EDuration="2.694422665s" podCreationTimestamp="2026-01-20 23:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:50.68717046 +0000 UTC m=+3383.007430748" watchObservedRunningTime="2026-01-20 23:31:50.694422665 +0000 UTC m=+3383.014682953" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.707743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd4v7\" (UniqueName: \"kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.707790 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.707809 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.707883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.707924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.708046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.708064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.708950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.713191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.713694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.713764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.713986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.714995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.724558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd4v7\" (UniqueName: \"kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7\") pod \"barbican-api-58d5d86f6d-bhw7w\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:50 crc kubenswrapper[5030]: I0120 23:31:50.920688 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.218105 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.218570 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.223974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.224033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.250238 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.273707 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.280049 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.283570 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.378993 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.694287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerStarted","Data":"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca"} Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.694814 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" 
event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerStarted","Data":"2a887328275e6def3c5749cf008ab14a4de46e1fc9a87d3cfdcb183be08841f1"} Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.697104 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.697159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.697180 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.697197 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:51 crc kubenswrapper[5030]: I0120 23:31:51.697215 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:52 crc kubenswrapper[5030]: I0120 23:31:52.327047 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.1.190:9696/\": dial tcp 10.217.1.190:9696: connect: connection refused" Jan 20 23:31:52 crc kubenswrapper[5030]: I0120 23:31:52.706130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerStarted","Data":"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e"} Jan 20 23:31:52 crc kubenswrapper[5030]: I0120 23:31:52.756827 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" podStartSLOduration=2.75680829 podStartE2EDuration="2.75680829s" podCreationTimestamp="2026-01-20 23:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:52.748074358 +0000 UTC m=+3385.068334646" watchObservedRunningTime="2026-01-20 23:31:52.75680829 +0000 UTC m=+3385.077068578" Jan 20 23:31:52 crc kubenswrapper[5030]: I0120 23:31:52.824877 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.451309 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.453707 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.555436 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.628966 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.632595 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:31:53 crc kubenswrapper[5030]: 
I0120 23:31:53.696107 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.1.192:9696/\": dial tcp 10.217.1.192:9696: connect: connection refused" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.733525 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerID="25c15d7715d9e340868a2859dd1a58b1e7573ed255f28afe95351da547d45981" exitCode=137 Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.733723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"25c15d7715d9e340868a2859dd1a58b1e7573ed255f28afe95351da547d45981"} Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.734569 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.735006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.806181 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.890251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ggld\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld\") pod \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.890335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") pod \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.890374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock\") pod \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.890482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache\") pod \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.890599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\" (UID: \"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4\") " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.891217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache" (OuterVolumeSpecName: "cache") pod "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.891441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock" (OuterVolumeSpecName: "lock") pod "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.891549 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.891566 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.895742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld" (OuterVolumeSpecName: "kube-api-access-2ggld") pod "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4"). InnerVolumeSpecName "kube-api-access-2ggld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.896734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.909920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "swift") pod "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" (UID: "bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.993646 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.993683 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ggld\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-kube-api-access-2ggld\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:53 crc kubenswrapper[5030]: I0120 23:31:53.993694 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.012332 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.095431 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.493520 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba68c0c_821f_4bb0_848d_cfdc09051ae9.slice/crio-25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659.scope WatchSource:0}: Error finding container 25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659: Status 404 returned error can't find the container with id 25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659 Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.494000 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00.scope: no such file or directory Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.495580 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bdbd07_9917_40c2_9583_b9633d529373.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09bdbd07_9917_40c2_9583_b9633d529373.slice: no such file or directory Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.495722 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07cd67f8_2bd9_448c_a418_5d1aa37808d6.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07cd67f8_2bd9_448c_a418_5d1aa37808d6.slice: no such file or directory Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.502067 5030 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-conmon-c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-conmon-c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20.scope: no such file or directory Jan 20 23:31:54 crc kubenswrapper[5030]: W0120 23:31:54.516962 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-c00fd1a1c8dcf4dab7e03eea20d30b7c3c5a53fc69fd8a93e3f58aecedbd5f20.scope: no such file or directory Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.645324 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.648649 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.651217 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.651277 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.670300 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice/crio-27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-19b2b71612e3a2a85e00ffaa0364277bb531f46af7bc4f89dedf5363f5ce15ed\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1bd7b8_3e6a_4cf2_8387_40e92fa0ab31.slice/crio-conmon-ef3a9d42d0c9f411a4a6ac538f08e00a2d51a4f08ab3b8e9dbe2422848ec6c00.scope\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba68c0c_821f_4bb0_848d_cfdc09051ae9.slice/crio-conmon-25f06c130c645e53862e319929f809023a159e1d46a2b11a3a3ff497784b8659.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.753611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4","Type":"ContainerDied","Data":"bc4cce11a2f9326a3c8c9f33ccadc33a156ee68b356f675f96fecd75cc9144e0"} Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.753677 5030 scope.go:117] "RemoveContainer" containerID="25c15d7715d9e340868a2859dd1a58b1e7573ed255f28afe95351da547d45981" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.753764 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.755773 5030 generic.go:334] "Generic (PLEG): container finished" podID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" containerID="1275f690feea00793157e12c75f8deef238efc5fe1e7db14cbe0b3879432536f" exitCode=137 Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.755902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" event={"ID":"c20f9105-8569-4c92-98f9-dd35cc9d43a9","Type":"ContainerDied","Data":"1275f690feea00793157e12c75f8deef238efc5fe1e7db14cbe0b3879432536f"} Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.793927 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.803638 5030 scope.go:117] "RemoveContainer" containerID="53566cbe13e0554a3b86a64ff88413e4e2971a75f558b615fb405505c102f2c8" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.819799 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845418 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845839 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845853 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845872 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845884 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845892 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-auditor" Jan 
20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845902 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845908 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845914 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="rsync" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845920 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="rsync" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845930 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845936 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-server" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845943 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-expirer" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845949 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-expirer" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845960 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845966 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845981 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.845989 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.845994 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.846012 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846017 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.846026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-reaper" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846032 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" 
containerName="account-reaper" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.846041 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-server" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.846061 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="swift-recon-cron" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="swift-recon-cron" Jan 20 23:31:54 crc kubenswrapper[5030]: E0120 23:31:54.846079 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846084 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-expirer" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846256 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846266 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846277 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846288 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="rsync" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846303 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846313 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846323 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-replicator" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846332 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-updater" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846341 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="object-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846351 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="container-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846362 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-auditor" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846367 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="swift-recon-cron" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846375 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-reaper" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.846382 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" containerName="account-server" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.853847 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.853968 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.860543 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.909946 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.914004 5030 scope.go:117] "RemoveContainer" containerID="d264c92684da056e428a6c733961f8fe92aa28e300c81f22e991287ca3bdb9ab" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.962910 5030 scope.go:117] "RemoveContainer" containerID="fb5859be41c2b9cb5338526464383b63d74381d26c7f59f26d24b5ff23c949cc" Jan 20 23:31:54 crc kubenswrapper[5030]: I0120 23:31:54.981835 5030 scope.go:117] "RemoveContainer" containerID="60cb27177b8ee600ca0fa9b124dfd7d1119967c490af0614c1bb464ddeaa2707" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.006944 5030 scope.go:117] "RemoveContainer" containerID="59c21e5e34c6d5dbf8de4d3d91b825d296eab38936e7d83529e745d3d2b3e2a5" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.012606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.012855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.012959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.013048 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqjtt\" (UniqueName: \"kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: 
\"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.013148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.013246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys\") pod \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\" (UID: \"c20f9105-8569-4c92-98f9-dd35cc9d43a9\") " Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.014321 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.016539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkswp\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.016590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.016772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.016795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.018904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts" (OuterVolumeSpecName: "scripts") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.018997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt" (OuterVolumeSpecName: "kube-api-access-lqjtt") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "kube-api-access-lqjtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.020362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.023602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.037838 5030 scope.go:117] "RemoveContainer" containerID="9308c7e5305215685af92a1087be86f06b8582716c763d7b3d73399d891e23fb" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.045508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data" (OuterVolumeSpecName: "config-data") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.056717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c20f9105-8569-4c92-98f9-dd35cc9d43a9" (UID: "c20f9105-8569-4c92-98f9-dd35cc9d43a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.120865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.120926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkswp\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121264 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121283 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqjtt\" (UniqueName: \"kubernetes.io/projected/c20f9105-8569-4c92-98f9-dd35cc9d43a9-kube-api-access-lqjtt\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121297 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121308 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121318 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.121330 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c20f9105-8569-4c92-98f9-dd35cc9d43a9-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.122525 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.122594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.122888 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") device mount path \"/mnt/openstack/pv17\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.126987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.149271 5030 scope.go:117] "RemoveContainer" containerID="6a5682b6b00c488de2f354ed3629a706b621d15c96ef14b746c170ab2da520e2" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.153890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkswp\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.158521 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"swift-storage-0\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.177411 5030 scope.go:117] "RemoveContainer" containerID="234735c49802543dffa54027fc58edd4cb277d91c2216d680f26f1186b80560f" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.200952 5030 scope.go:117] "RemoveContainer" containerID="b7ae32b1a4f1e7f7fe7aad12eac5a1d984d90e41f390bb645d6ca4b2f630627a" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.222438 5030 scope.go:117] "RemoveContainer" containerID="b60f5caf9f00c875e2dfdf1fc8cd6af041235cd0e56388a3a11dd402988b4307" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.242410 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.255405 5030 scope.go:117] "RemoveContainer" containerID="49574068fdaf73188ebe3b005b4e8a44cf73f4cfe484defd630ff27da3cfb6e0" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.286245 5030 scope.go:117] "RemoveContainer" containerID="e61677d1c4df154307a801a3e62190f8fa8c59cdfba25fbc0e4949bc2af510ce" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.313261 5030 scope.go:117] "RemoveContainer" containerID="fc30b61ad3b670e8566e451c0bf77bed40bb4ae683b1002df810eb3e8a7c7e56" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.342737 5030 scope.go:117] "RemoveContainer" containerID="c023313c2dc4acfa642739fd15b6dce1cdb2384c9ddf2f534c8c7e3a6317cc61" Jan 20 23:31:55 crc kubenswrapper[5030]: E0120 23:31:55.598356 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/becb122dfac0ee9bbbeaf7acd1c77364424f81098d3ff89cdefae9ecdceb797f/diff" to get inode usage: stat /var/lib/containers/storage/overlay/becb122dfac0ee9bbbeaf7acd1c77364424f81098d3ff89cdefae9ecdceb797f/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_swift-storage-0_bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4/swift-recon-cron/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_swift-storage-0_bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4/swift-recon-cron/0.log: no such file or directory Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.759424 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.775406 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.775436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6676986dff-v2h4r" event={"ID":"c20f9105-8569-4c92-98f9-dd35cc9d43a9","Type":"ContainerDied","Data":"3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4"} Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.775492 5030 scope.go:117] "RemoveContainer" containerID="1275f690feea00793157e12c75f8deef238efc5fe1e7db14cbe0b3879432536f" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.864406 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.877860 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6676986dff-v2h4r"] Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.979504 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4" path="/var/lib/kubelet/pods/bfb5c6fe-b2dc-4f4c-9b95-7e775d719cc4/volumes" Jan 20 23:31:55 crc kubenswrapper[5030]: I0120 23:31:55.982145 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" path="/var/lib/kubelet/pods/c20f9105-8569-4c92-98f9-dd35cc9d43a9/volumes" Jan 20 23:31:56 crc kubenswrapper[5030]: E0120 23:31:56.652688 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/5b2e921a8ad90a1979813315a365bb69e1d3661f95d724aafcb283e87400c75b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/5b2e921a8ad90a1979813315a365bb69e1d3661f95d724aafcb283e87400c75b/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_rabbitmq-server-0_9def69e2-10b9-443b-8b38-7d561e39bb7d/rabbitmq/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_rabbitmq-server-0_9def69e2-10b9-443b-8b38-7d561e39bb7d/rabbitmq/0.log: no such file or directory Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.837258 5030 generic.go:334] "Generic (PLEG): container finished" podID="52fa921e-8a73-400d-9f48-50876a4765f4" containerID="60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599" exitCode=137 Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.837356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerDied","Data":"60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599"} Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.843450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"0478c6b5a2aa7d3114b8aba566b34095474deb6aee07d02a4122007b8a828961"} Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.843493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"6fa5a926460488785a7ffba60bb9b7fadfcf332e45c0b16ee382744faf654dce"} Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.843508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"68a3e13cf2691b33d5c70e2cd4ee154f6c6ade16bd21a3ed1c5508abb12bf1aa"} Jan 20 23:31:56 crc kubenswrapper[5030]: I0120 23:31:56.843519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"ee86ab1790b689950fea354926e10566d74ac66d70330658ee9e95ec735cc44f"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.134394 5030 scope.go:117] "RemoveContainer" containerID="bbd14ee795fa04a58c750c6baa61b64963cd12c1c88def18d5dedbd8c96d79f2" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.195200 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.240519 5030 scope.go:117] "RemoveContainer" containerID="5f43545399090228177f04523ac5a317a6277e22050988ab543b21064445e7de" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.259833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data\") pod \"52fa921e-8a73-400d-9f48-50876a4765f4\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.259914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkzpp\" (UniqueName: \"kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp\") pod \"52fa921e-8a73-400d-9f48-50876a4765f4\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.260078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs\") pod \"52fa921e-8a73-400d-9f48-50876a4765f4\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.260130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle\") pod \"52fa921e-8a73-400d-9f48-50876a4765f4\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.260186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom\") pod \"52fa921e-8a73-400d-9f48-50876a4765f4\" (UID: \"52fa921e-8a73-400d-9f48-50876a4765f4\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.264465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs" (OuterVolumeSpecName: "logs") pod "52fa921e-8a73-400d-9f48-50876a4765f4" (UID: "52fa921e-8a73-400d-9f48-50876a4765f4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.274533 5030 scope.go:117] "RemoveContainer" containerID="847a1cdf293cd94ff36a98ec31b6492efbc1f13b9fd3fea3d0454c780f44cc6f" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.279750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52fa921e-8a73-400d-9f48-50876a4765f4" (UID: "52fa921e-8a73-400d-9f48-50876a4765f4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.284107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp" (OuterVolumeSpecName: "kube-api-access-tkzpp") pod "52fa921e-8a73-400d-9f48-50876a4765f4" (UID: "52fa921e-8a73-400d-9f48-50876a4765f4"). InnerVolumeSpecName "kube-api-access-tkzpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.328067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52fa921e-8a73-400d-9f48-50876a4765f4" (UID: "52fa921e-8a73-400d-9f48-50876a4765f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.331772 5030 scope.go:117] "RemoveContainer" containerID="e3bc077b333e186d58c38afabbc39ee5da09f71b50da1b5bffef05ed628247e9" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.342708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data" (OuterVolumeSpecName: "config-data") pod "52fa921e-8a73-400d-9f48-50876a4765f4" (UID: "52fa921e-8a73-400d-9f48-50876a4765f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.362442 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkzpp\" (UniqueName: \"kubernetes.io/projected/52fa921e-8a73-400d-9f48-50876a4765f4-kube-api-access-tkzpp\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.362474 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52fa921e-8a73-400d-9f48-50876a4765f4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.362487 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.362496 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.362505 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52fa921e-8a73-400d-9f48-50876a4765f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.385475 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7d99d84474-qxm76_9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0/neutron-api/0.log" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.385550 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.385792 5030 scope.go:117] "RemoveContainer" containerID="aebde48ffd7b79ddf812bbd9fe02d63be7d61af3719098b682a1f12190844949" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.436762 5030 scope.go:117] "RemoveContainer" containerID="a492320c925e809dce4d02a24f5e2910ff9256a4363f1e4a7c5bd831814a12db" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.463931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdvrq\" (UniqueName: \"kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq\") pod \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.464089 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config\") pod \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.464183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle\") pod \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\" (UID: \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.464257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config\") pod \"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\" (UID: 
\"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0\") " Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.466335 5030 scope.go:117] "RemoveContainer" containerID="41f49fd7ec11dd9a9a557c1bb592ba6c7982e9eb0f90238a36c0c561df24cec1" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.467882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" (UID: "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.468209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq" (OuterVolumeSpecName: "kube-api-access-rdvrq") pod "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" (UID: "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0"). InnerVolumeSpecName "kube-api-access-rdvrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.517116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config" (OuterVolumeSpecName: "config") pod "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" (UID: "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.536905 5030 scope.go:117] "RemoveContainer" containerID="352f738bc011e320357be8465b773ae7a2714b7a4e9f08dfc7440ae1c2f120a4" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.554189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" (UID: "9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.566251 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.566278 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.566293 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdvrq\" (UniqueName: \"kubernetes.io/projected/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-kube-api-access-rdvrq\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.566304 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:31:57 crc kubenswrapper[5030]: E0120 23:31:57.666844 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/cd84ba3264594a03b07c5beec9bba630b8e4e250fa1caf5fcba4391334b028f2/diff" to get inode usage: stat /var/lib/containers/storage/overlay/cd84ba3264594a03b07c5beec9bba630b8e4e250fa1caf5fcba4391334b028f2/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_rabbitmq-cell1-server-0_535e149a-723c-4f6b-9247-9a6b58eff6bb/rabbitmq/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_rabbitmq-cell1-server-0_535e149a-723c-4f6b-9247-9a6b58eff6bb/rabbitmq/0.log: no such file or directory Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.699988 5030 scope.go:117] "RemoveContainer" containerID="4141048d16b87ba820575f3c2ed1b798652fb89491b0f81f3244fea3383c5fb5" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.725445 5030 scope.go:117] "RemoveContainer" containerID="616bdc366eb8dd795e7fbeb8202645472cb589464fac89005bdee7e84a582a16" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.768253 5030 scope.go:117] "RemoveContainer" containerID="3657580e710d34e6ab5e47b4e7f1bff1e8c70505ca73a3c36c83d6889bcae7b8" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.827893 5030 scope.go:117] "RemoveContainer" containerID="5c3963eec2dd418b836709714d9c6ad213a8435b737aa8a3604f4191d504f0a0" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.869399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" event={"ID":"52fa921e-8a73-400d-9f48-50876a4765f4","Type":"ContainerDied","Data":"843eec708f845d1a5dd4e5589d6ea3b37d05f1d00054a95c32e47588c8472c8d"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.869461 5030 scope.go:117] "RemoveContainer" containerID="60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.870804 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.902804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"e1b5f2a13f3163554624db7b1aafe91f5aee9ab1131474fe0369c5f0f4a9dee1"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.902947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"142f3acf1108276cc11ce9ff0213af6ed4a6eb46e60c66d8d0e27543458e4c19"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.902959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"aef8219014260e57e40ee40900cc995c3ff708186b3f043e7917e6d02c4cb21e"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.902967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"611008d575119572dd79f498dbab28ea12bc998a1001a31f1ecbe0cfe79cdd53"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.902976 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"a1d0030d1db7eafecbee2fd3e7074cf5fa995ab67d4254165b2fedc019d55a78"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.924196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.928129 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7d99d84474-qxm76_9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0/neutron-api/0.log" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.928170 5030 generic.go:334] "Generic (PLEG): container finished" podID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerID="c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32" exitCode=137 Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.928198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerDied","Data":"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.928223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" event={"ID":"9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0","Type":"ContainerDied","Data":"b78656f45017feef1a2b7ad4852c2dc7a0590f527f6edad24908c341f2f9f7cc"} Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.928289 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7d99d84474-qxm76" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.930399 5030 scope.go:117] "RemoveContainer" containerID="7cd441aae70f6c395bd5a42b94d16ef7f6f319a8b61584ad5fc4661b28a3ff4d" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.930897 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-9d499c945-ffpkv"] Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.946268 5030 scope.go:117] "RemoveContainer" containerID="980840455d95fe822b68afa33dbee4e1d193ae82fd6f5af390afd022819a58a3" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.967532 5030 scope.go:117] "RemoveContainer" containerID="9502d696bd2796522f2685c308038851864d146377ed5acba22931664cdb58d8" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.979592 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" path="/var/lib/kubelet/pods/52fa921e-8a73-400d-9f48-50876a4765f4/volumes" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.980460 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.982982 5030 scope.go:117] "RemoveContainer" containerID="494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568" Jan 20 23:31:57 crc kubenswrapper[5030]: I0120 23:31:57.983484 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7d99d84474-qxm76"] Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.031237 5030 scope.go:117] "RemoveContainer" containerID="c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.044234 5030 scope.go:117] "RemoveContainer" containerID="a74062eb7e3b260098a20f5b7d284abc8950cfd493a9c09773fcfc9d72a97d3c" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.056901 5030 scope.go:117] "RemoveContainer" containerID="494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568" Jan 20 23:31:58 crc kubenswrapper[5030]: E0120 23:31:58.057428 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568\": container with ID starting with 494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568 not found: ID does not exist" containerID="494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.057457 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568"} err="failed to get container status \"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568\": rpc error: code = NotFound desc = could not find container \"494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568\": container with ID starting with 494d3ef6c648973849b47baaa186ccd729b934284fa3f94433ae78272e1c1568 not found: ID does not exist" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.057477 5030 scope.go:117] "RemoveContainer" containerID="c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32" Jan 20 23:31:58 crc kubenswrapper[5030]: E0120 23:31:58.057935 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32\": container with ID starting with c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32 not found: ID does not exist" containerID="c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.057976 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32"} err="failed to get container status \"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32\": rpc error: code = NotFound desc = could not find container \"c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32\": container with ID starting with c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32 not found: ID does not exist" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.075143 5030 scope.go:117] "RemoveContainer" containerID="39dedca045f311d6169ac05c06012585bdf93027adb63134aae42af056394de6" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.108628 5030 scope.go:117] "RemoveContainer" containerID="493a4b5159ec26b105e62e8cb2c2af22e9255bbb11250b3a8cbcc06e1bbd150a" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.131340 5030 scope.go:117] "RemoveContainer" containerID="51142efc7916ed03c6153a5dcb3fe231b88d5aa152a4410501c160c9b0963e66" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.841892 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod36edc87c-d811-414a-8630-86c81093bdbf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod36edc87c-d811-414a-8630-86c81093bdbf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod36edc87c_d811_414a_8630_86c81093bdbf.slice" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.910274 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.916099 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.950766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"5255250b357a4ac7681649a1e5c0a926b39fbee907e1ba4000a642da16130fdd"} Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.950817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"69b7c21f321b09445606557c0e6984dc6d0949ad2a3b0ee72bd8c7a27d07ecc7"} Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.950834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"02c333bfad293ec86e408a8920ec51d1c9268264d8402be95b6a1751dcb8d930"} Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.950849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"7d9c91edb4a35a4bf32992a5fc61b2174f916fe1a309ebe2c3715d15e2bd931f"} Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.994280 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.994559 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-log" containerID="cri-o://8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8" gracePeriod=30 Jan 20 23:31:58 crc kubenswrapper[5030]: I0120 23:31:58.994734 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-api" containerID="cri-o://6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7" gracePeriod=30 Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.221597 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:31:59 crc kubenswrapper[5030]: E0120 23:31:59.640531 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:59 crc kubenswrapper[5030]: E0120 23:31:59.642808 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:59 crc kubenswrapper[5030]: E0120 23:31:59.644540 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:31:59 crc kubenswrapper[5030]: E0120 23:31:59.644607 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.960919 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerID="8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8" exitCode=143 Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.962889 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7fc9646994-sp2l9_1cef2808-064a-47dd-8abb-9f6b08d260be/neutron-api/0.log" Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.962943 5030 generic.go:334] "Generic (PLEG): container finished" podID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerID="5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2" exitCode=137 Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.983311 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" 
path="/var/lib/kubelet/pods/9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0/volumes" Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerDied","Data":"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8"} Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerDied","Data":"5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2"} Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" event={"ID":"1cef2808-064a-47dd-8abb-9f6b08d260be","Type":"ContainerDied","Data":"116e40c4f860de9e142f95412a9597d3a3d7a81c4b7919eb25a5a0d68703a63f"} Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984191 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="116e40c4f860de9e142f95412a9597d3a3d7a81c4b7919eb25a5a0d68703a63f" Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"5c43afbc3bd855975b6195fe0ab81ffca022d863f4ec31ef3dcc4088f902abf7"} Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"d8020553ea006945bae8427811e7617cdae1effe1553b99640376e2be8359797"} Jan 20 23:31:59 crc kubenswrapper[5030]: I0120 23:31:59.984232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerStarted","Data":"10f2ad934ade0755d3b8fbad653de62253c2374bf376c3091a96065c690f7603"} Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.006398 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=6.006376702 podStartE2EDuration="6.006376702s" podCreationTimestamp="2026-01-20 23:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:31:59.999261429 +0000 UTC m=+3392.319521737" watchObservedRunningTime="2026-01-20 23:32:00.006376702 +0000 UTC m=+3392.326636990" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.013368 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-7fc9646994-sp2l9_1cef2808-064a-47dd-8abb-9f6b08d260be/neutron-api/0.log" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.013454 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.117965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle\") pod \"1cef2808-064a-47dd-8abb-9f6b08d260be\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.118388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvddh\" (UniqueName: \"kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh\") pod \"1cef2808-064a-47dd-8abb-9f6b08d260be\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.118412 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config\") pod \"1cef2808-064a-47dd-8abb-9f6b08d260be\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.118507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config\") pod \"1cef2808-064a-47dd-8abb-9f6b08d260be\" (UID: \"1cef2808-064a-47dd-8abb-9f6b08d260be\") " Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.128438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh" (OuterVolumeSpecName: "kube-api-access-zvddh") pod "1cef2808-064a-47dd-8abb-9f6b08d260be" (UID: "1cef2808-064a-47dd-8abb-9f6b08d260be"). InnerVolumeSpecName "kube-api-access-zvddh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.128438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1cef2808-064a-47dd-8abb-9f6b08d260be" (UID: "1cef2808-064a-47dd-8abb-9f6b08d260be"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.134968 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135472 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135510 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135518 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135538 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135544 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135557 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135581 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135594 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" containerName="keystone-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135601 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" containerName="keystone-api" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135650 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135658 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: E0120 23:32:00.135680 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker-log" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker-log" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker-log" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135988 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.135999 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.136008 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c20f9105-8569-4c92-98f9-dd35cc9d43a9" containerName="keystone-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.136016 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0" containerName="neutron-api" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.136048 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52fa921e-8a73-400d-9f48-50876a4765f4" containerName="barbican-worker" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.136066 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" containerName="neutron-httpd" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.137237 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.160697 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.204829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cef2808-064a-47dd-8abb-9f6b08d260be" (UID: "1cef2808-064a-47dd-8abb-9f6b08d260be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.213852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config" (OuterVolumeSpecName: "config") pod "1cef2808-064a-47dd-8abb-9f6b08d260be" (UID: "1cef2808-064a-47dd-8abb-9f6b08d260be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.223927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8jtr\" (UniqueName: \"kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224435 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224451 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224467 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvddh\" (UniqueName: \"kubernetes.io/projected/1cef2808-064a-47dd-8abb-9f6b08d260be-kube-api-access-zvddh\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.224478 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1cef2808-064a-47dd-8abb-9f6b08d260be-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.327227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.327281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: 
\"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.327414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.327443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8jtr\" (UniqueName: \"kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.328465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.328525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.328810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.354639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8jtr\" (UniqueName: \"kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr\") pod \"dnsmasq-dnsmasq-796955bb77-dbf69\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.548931 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.983367 5030 generic.go:334] "Generic (PLEG): container finished" podID="08962f16-0341-46c7-b0df-9972861083df" containerID="c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d" exitCode=137 Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.983452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerDied","Data":"c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d"} Jan 20 23:32:00 crc kubenswrapper[5030]: I0120 23:32:00.984325 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-7fc9646994-sp2l9" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.039822 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.054004 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7fc9646994-sp2l9"] Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.062376 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.174923 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.245715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle\") pod \"08962f16-0341-46c7-b0df-9972861083df\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.245833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs\") pod \"08962f16-0341-46c7-b0df-9972861083df\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.245869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data\") pod \"08962f16-0341-46c7-b0df-9972861083df\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.245893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom\") pod \"08962f16-0341-46c7-b0df-9972861083df\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.245968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7hsz\" (UniqueName: \"kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz\") pod \"08962f16-0341-46c7-b0df-9972861083df\" (UID: \"08962f16-0341-46c7-b0df-9972861083df\") " Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.246784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs" (OuterVolumeSpecName: "logs") pod "08962f16-0341-46c7-b0df-9972861083df" (UID: "08962f16-0341-46c7-b0df-9972861083df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.251775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08962f16-0341-46c7-b0df-9972861083df" (UID: "08962f16-0341-46c7-b0df-9972861083df"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.252535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz" (OuterVolumeSpecName: "kube-api-access-q7hsz") pod "08962f16-0341-46c7-b0df-9972861083df" (UID: "08962f16-0341-46c7-b0df-9972861083df"). InnerVolumeSpecName "kube-api-access-q7hsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.271170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08962f16-0341-46c7-b0df-9972861083df" (UID: "08962f16-0341-46c7-b0df-9972861083df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.290644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data" (OuterVolumeSpecName: "config-data") pod "08962f16-0341-46c7-b0df-9972861083df" (UID: "08962f16-0341-46c7-b0df-9972861083df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.348425 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08962f16-0341-46c7-b0df-9972861083df-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.348460 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.348471 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.348483 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7hsz\" (UniqueName: \"kubernetes.io/projected/08962f16-0341-46c7-b0df-9972861083df-kube-api-access-q7hsz\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.348494 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08962f16-0341-46c7-b0df-9972861083df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.980202 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cef2808-064a-47dd-8abb-9f6b08d260be" path="/var/lib/kubelet/pods/1cef2808-064a-47dd-8abb-9f6b08d260be/volumes" Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.998071 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerID="3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba" exitCode=0 Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.998340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" 
event={"ID":"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0","Type":"ContainerDied","Data":"3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba"} Jan 20 23:32:01 crc kubenswrapper[5030]: I0120 23:32:01.998409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" event={"ID":"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0","Type":"ContainerStarted","Data":"0883728b834863e289f76f52e7bae14541a8e0c72081e58bd13bfada03b04d47"} Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.008177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" event={"ID":"08962f16-0341-46c7-b0df-9972861083df","Type":"ContainerDied","Data":"382ecae1ec29df3c8334fa5c814576e52cef4b516be068ee7b882aca010b9afc"} Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.008249 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.008273 5030 scope.go:117] "RemoveContainer" containerID="c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.160454 5030 scope.go:117] "RemoveContainer" containerID="6c650f247290a40ecefbd23466ce8a8a0175d13bde2a6888e5bd093543ff4b53" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.179541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.192643 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5868d649cd-4hxqk"] Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.258796 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.325707 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.398698 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.398972 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api-log" containerID="cri-o://c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596" gracePeriod=30 Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.399126 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api" containerID="cri-o://ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013" gracePeriod=30 Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.578484 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.682820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle\") pod \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.682879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs\") pod \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.682920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7whh\" (UniqueName: \"kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh\") pod \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.683058 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data\") pod \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.683174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts\") pod \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\" (UID: \"5dacb533-2f94-4433-ab8e-2efc4ee6e370\") " Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.683330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs" (OuterVolumeSpecName: "logs") pod "5dacb533-2f94-4433-ab8e-2efc4ee6e370" (UID: "5dacb533-2f94-4433-ab8e-2efc4ee6e370"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.683678 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dacb533-2f94-4433-ab8e-2efc4ee6e370-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.688287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts" (OuterVolumeSpecName: "scripts") pod "5dacb533-2f94-4433-ab8e-2efc4ee6e370" (UID: "5dacb533-2f94-4433-ab8e-2efc4ee6e370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.694790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh" (OuterVolumeSpecName: "kube-api-access-l7whh") pod "5dacb533-2f94-4433-ab8e-2efc4ee6e370" (UID: "5dacb533-2f94-4433-ab8e-2efc4ee6e370"). InnerVolumeSpecName "kube-api-access-l7whh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.730954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data" (OuterVolumeSpecName: "config-data") pod "5dacb533-2f94-4433-ab8e-2efc4ee6e370" (UID: "5dacb533-2f94-4433-ab8e-2efc4ee6e370"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.737562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dacb533-2f94-4433-ab8e-2efc4ee6e370" (UID: "5dacb533-2f94-4433-ab8e-2efc4ee6e370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.785080 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.785118 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7whh\" (UniqueName: \"kubernetes.io/projected/5dacb533-2f94-4433-ab8e-2efc4ee6e370-kube-api-access-l7whh\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.785133 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:02 crc kubenswrapper[5030]: I0120 23:32:02.785141 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dacb533-2f94-4433-ab8e-2efc4ee6e370-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.028256 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7c50398-f0ad-450d-b494-be906da18af9" containerID="c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596" exitCode=143 Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.028358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerDied","Data":"c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596"} Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.033356 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerID="6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7" exitCode=0 Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.033440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerDied","Data":"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7"} Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.033480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" event={"ID":"5dacb533-2f94-4433-ab8e-2efc4ee6e370","Type":"ContainerDied","Data":"394e938f5e1ab967db58f7f06b9d758bfb040635991db16883733f455c6b5a6e"} Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.033510 5030 
scope.go:117] "RemoveContainer" containerID="6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.033611 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7d684cd5bb-rtqs4" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.039816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" event={"ID":"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0","Type":"ContainerStarted","Data":"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35"} Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.040161 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.063218 5030 scope.go:117] "RemoveContainer" containerID="8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.070271 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" podStartSLOduration=3.070250289 podStartE2EDuration="3.070250289s" podCreationTimestamp="2026-01-20 23:32:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:03.067109184 +0000 UTC m=+3395.387369482" watchObservedRunningTime="2026-01-20 23:32:03.070250289 +0000 UTC m=+3395.390510597" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.101689 5030 scope.go:117] "RemoveContainer" containerID="6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7" Jan 20 23:32:03 crc kubenswrapper[5030]: E0120 23:32:03.102324 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7\": container with ID starting with 6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7 not found: ID does not exist" containerID="6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.102391 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7"} err="failed to get container status \"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7\": rpc error: code = NotFound desc = could not find container \"6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7\": container with ID starting with 6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7 not found: ID does not exist" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.102429 5030 scope.go:117] "RemoveContainer" containerID="8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8" Jan 20 23:32:03 crc kubenswrapper[5030]: E0120 23:32:03.102843 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8\": container with ID starting with 8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8 not found: ID does not exist" containerID="8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.102897 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8"} err="failed to get container status \"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8\": rpc error: code = NotFound desc = could not find container \"8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8\": container with ID starting with 8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8 not found: ID does not exist" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.102991 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.123398 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7d684cd5bb-rtqs4"] Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.992786 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08962f16-0341-46c7-b0df-9972861083df" path="/var/lib/kubelet/pods/08962f16-0341-46c7-b0df-9972861083df/volumes" Jan 20 23:32:03 crc kubenswrapper[5030]: I0120 23:32:03.995084 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" path="/var/lib/kubelet/pods/5dacb533-2f94-4433-ab8e-2efc4ee6e370/volumes" Jan 20 23:32:04 crc kubenswrapper[5030]: I0120 23:32:04.258781 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:04 crc kubenswrapper[5030]: E0120 23:32:04.639583 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:04 crc kubenswrapper[5030]: E0120 23:32:04.642246 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:04 crc kubenswrapper[5030]: E0120 23:32:04.644218 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:04 crc kubenswrapper[5030]: E0120 23:32:04.644261 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.929932 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.930475 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" 
containerName="kube-state-metrics" containerID="cri-o://3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb" gracePeriod=30 Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.985533 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.986175 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-central-agent" containerID="cri-o://746414e59b0a1e3e1f015ec60e5aebed410f00b7ae6cc7b7630090e1aad843de" gracePeriod=30 Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.986229 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="proxy-httpd" containerID="cri-o://1ea957f41ba227c3e41f1638e15e663ffad55ccf95f6a36654fdbd2571340e9c" gracePeriod=30 Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.986263 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-notification-agent" containerID="cri-o://c010e2926c7e322751386b3c6ca4213ac366a66e780234a4f8bfc756ce46c955" gracePeriod=30 Jan 20 23:32:05 crc kubenswrapper[5030]: I0120 23:32:05.986311 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="sg-core" containerID="cri-o://c7e043b0b8441b0f27569b6c1a47553f98a4e898b7a8ab452dec340014f4532c" gracePeriod=30 Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.093463 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7c50398-f0ad-450d-b494-be906da18af9" containerID="ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013" exitCode=0 Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.093552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerDied","Data":"ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013"} Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.093599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" event={"ID":"f7c50398-f0ad-450d-b494-be906da18af9","Type":"ContainerDied","Data":"c331aa3fbfa5def7fb3bb9752bb98a11aef8a753d2bcf269b8f711288695298c"} Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.093628 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c331aa3fbfa5def7fb3bb9752bb98a11aef8a753d2bcf269b8f711288695298c" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.095886 5030 generic.go:334] "Generic (PLEG): container finished" podID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" containerID="3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb" exitCode=2 Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.095924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c","Type":"ContainerDied","Data":"3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb"} Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.185961 5030 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.250827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom\") pod \"f7c50398-f0ad-450d-b494-be906da18af9\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.250937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data\") pod \"f7c50398-f0ad-450d-b494-be906da18af9\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.251063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtfm\" (UniqueName: \"kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm\") pod \"f7c50398-f0ad-450d-b494-be906da18af9\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.251155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle\") pod \"f7c50398-f0ad-450d-b494-be906da18af9\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.251218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs\") pod \"f7c50398-f0ad-450d-b494-be906da18af9\" (UID: \"f7c50398-f0ad-450d-b494-be906da18af9\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.252185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs" (OuterVolumeSpecName: "logs") pod "f7c50398-f0ad-450d-b494-be906da18af9" (UID: "f7c50398-f0ad-450d-b494-be906da18af9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.252318 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c50398-f0ad-450d-b494-be906da18af9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.257762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f7c50398-f0ad-450d-b494-be906da18af9" (UID: "f7c50398-f0ad-450d-b494-be906da18af9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.257806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm" (OuterVolumeSpecName: "kube-api-access-7jtfm") pod "f7c50398-f0ad-450d-b494-be906da18af9" (UID: "f7c50398-f0ad-450d-b494-be906da18af9"). InnerVolumeSpecName "kube-api-access-7jtfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.298973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data" (OuterVolumeSpecName: "config-data") pod "f7c50398-f0ad-450d-b494-be906da18af9" (UID: "f7c50398-f0ad-450d-b494-be906da18af9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.299698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7c50398-f0ad-450d-b494-be906da18af9" (UID: "f7c50398-f0ad-450d-b494-be906da18af9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.354226 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.354259 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.354271 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtfm\" (UniqueName: \"kubernetes.io/projected/f7c50398-f0ad-450d-b494-be906da18af9-kube-api-access-7jtfm\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.354282 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c50398-f0ad-450d-b494-be906da18af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.787769 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.862874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rjqz\" (UniqueName: \"kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz\") pod \"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c\" (UID: \"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c\") " Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.879401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz" (OuterVolumeSpecName: "kube-api-access-7rjqz") pod "4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" (UID: "4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c"). InnerVolumeSpecName "kube-api-access-7rjqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:06 crc kubenswrapper[5030]: I0120 23:32:06.966124 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rjqz\" (UniqueName: \"kubernetes.io/projected/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c-kube-api-access-7rjqz\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.106060 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.106078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c","Type":"ContainerDied","Data":"89c3dd8b3e7fc91cc889f77f886cdedfdb0463cbc962b4028ce5a168ff6ec615"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.106149 5030 scope.go:117] "RemoveContainer" containerID="3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109549 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerID="1ea957f41ba227c3e41f1638e15e663ffad55ccf95f6a36654fdbd2571340e9c" exitCode=0 Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109573 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerID="c7e043b0b8441b0f27569b6c1a47553f98a4e898b7a8ab452dec340014f4532c" exitCode=2 Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109584 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerID="c010e2926c7e322751386b3c6ca4213ac366a66e780234a4f8bfc756ce46c955" exitCode=0 Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109592 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerID="746414e59b0a1e3e1f015ec60e5aebed410f00b7ae6cc7b7630090e1aad843de" exitCode=0 Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerDied","Data":"1ea957f41ba227c3e41f1638e15e663ffad55ccf95f6a36654fdbd2571340e9c"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerDied","Data":"c7e043b0b8441b0f27569b6c1a47553f98a4e898b7a8ab452dec340014f4532c"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerDied","Data":"c010e2926c7e322751386b3c6ca4213ac366a66e780234a4f8bfc756ce46c955"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.109700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerDied","Data":"746414e59b0a1e3e1f015ec60e5aebed410f00b7ae6cc7b7630090e1aad843de"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.111032 5030 generic.go:334] "Generic (PLEG): container finished" podID="707515d1-17d7-47ff-be95-73414a6a2a95" containerID="18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e" exitCode=0 Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.111115 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.113249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerDied","Data":"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e"} Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.204020 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.242930 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.258458 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282163 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282644 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282660 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener-log" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282677 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="proxy-httpd" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282685 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="proxy-httpd" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282702 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-api" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282709 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-api" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282722 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282729 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api-log" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282736 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282743 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282752 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-central-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282759 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-central-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 
23:32:07.282780 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-notification-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282787 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-notification-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282798 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282804 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-log" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282827 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" containerName="kube-state-metrics" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282836 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" containerName="kube-state-metrics" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282852 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282859 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener" Jan 20 23:32:07 crc kubenswrapper[5030]: E0120 23:32:07.282873 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="sg-core" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.282880 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="sg-core" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283083 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" containerName="kube-state-metrics" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283100 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="sg-core" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283117 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="proxy-httpd" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283130 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283141 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283157 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-central-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283168 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" containerName="ceilometer-notification-agent" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283184 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="08962f16-0341-46c7-b0df-9972861083df" containerName="barbican-keystone-listener" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283200 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283210 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c50398-f0ad-450d-b494-be906da18af9" containerName="barbican-api-log" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.283221 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dacb533-2f94-4433-ab8e-2efc4ee6e370" containerName="placement-api" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.284697 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.288823 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.289230 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.295607 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.319389 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.333972 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-dd9dbdd6-tcp92"] Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.373684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.373776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.373830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.373857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.373917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztxbs\" (UniqueName: \"kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc 
kubenswrapper[5030]: I0120 23:32:07.373949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle\") pod \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\" (UID: \"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52\") " Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374236 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7257g\" (UniqueName: \"kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374428 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.374586 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.378090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts" (OuterVolumeSpecName: "scripts") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.378885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs" (OuterVolumeSpecName: "kube-api-access-ztxbs") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "kube-api-access-ztxbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.399898 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.441098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.462104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data" (OuterVolumeSpecName: "config-data") pod "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" (UID: "1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7257g\" (UniqueName: \"kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475872 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475884 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475893 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475904 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztxbs\" (UniqueName: \"kubernetes.io/projected/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-kube-api-access-ztxbs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475912 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.475921 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.479282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.479578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.480176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.491884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7257g\" (UniqueName: \"kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g\") pod \"kube-state-metrics-0\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.625211 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.972848 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c" path="/var/lib/kubelet/pods/4fd2760d-ec53-4e6b-bdc8-28e0d366fa3c/volumes" Jan 20 23:32:07 crc kubenswrapper[5030]: I0120 23:32:07.973672 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c50398-f0ad-450d-b494-be906da18af9" path="/var/lib/kubelet/pods/f7c50398-f0ad-450d-b494-be906da18af9/volumes" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.081276 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.123778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52","Type":"ContainerDied","Data":"d8b46dfb5649b1be74f2b09d2ae386d92d01893a4d73c02b981ebf3cea1a79fe"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.123825 5030 scope.go:117] "RemoveContainer" containerID="1ea957f41ba227c3e41f1638e15e663ffad55ccf95f6a36654fdbd2571340e9c" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.123964 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.128767 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerID="28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c" exitCode=0 Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.128854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerDied","Data":"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.130790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerStarted","Data":"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.131074 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.133204 5030 generic.go:334] "Generic (PLEG): container finished" podID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" exitCode=137 Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.133253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138","Type":"ContainerDied","Data":"67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.148190 5030 generic.go:334] "Generic (PLEG): container finished" podID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" containerID="965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db" exitCode=137 Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.148248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"58c0a209-c9c9-4d67-9c4f-bc086ec04be5","Type":"ContainerDied","Data":"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.148279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"58c0a209-c9c9-4d67-9c4f-bc086ec04be5","Type":"ContainerDied","Data":"dd62ecca2e723bec7d0ae9393501f468d6032d9fe6b0b7ae8df93327c50ef9f3"} Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.148339 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.173936 5030 scope.go:117] "RemoveContainer" containerID="c7e043b0b8441b0f27569b6c1a47553f98a4e898b7a8ab452dec340014f4532c" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.188474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv4lg\" (UniqueName: \"kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg\") pod \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.188521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data\") pod \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.188661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle\") pod \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\" (UID: \"58c0a209-c9c9-4d67-9c4f-bc086ec04be5\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.192384 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.201662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg" (OuterVolumeSpecName: "kube-api-access-bv4lg") pod "58c0a209-c9c9-4d67-9c4f-bc086ec04be5" (UID: "58c0a209-c9c9-4d67-9c4f-bc086ec04be5"). InnerVolumeSpecName "kube-api-access-bv4lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.212397 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.216584 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.221815 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: E0120 23:32:08.222262 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.222274 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:08 crc kubenswrapper[5030]: E0120 23:32:08.222293 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" containerName="nova-scheduler-scheduler" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.222299 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" containerName="nova-scheduler-scheduler" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.222584 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" containerName="nova-scheduler-scheduler" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.222604 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.224549 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.227202 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.227337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.228904 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.264828 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.273808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data" (OuterVolumeSpecName: "config-data") pod "58c0a209-c9c9-4d67-9c4f-bc086ec04be5" (UID: "58c0a209-c9c9-4d67-9c4f-bc086ec04be5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.279789 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.282330 5030 scope.go:117] "RemoveContainer" containerID="c010e2926c7e322751386b3c6ca4213ac366a66e780234a4f8bfc756ce46c955" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.287644 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.287612587 podStartE2EDuration="36.287612587s" podCreationTimestamp="2026-01-20 23:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:08.207767584 +0000 UTC m=+3400.528027872" watchObservedRunningTime="2026-01-20 23:32:08.287612587 +0000 UTC m=+3400.607872875" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.291454 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv4lg\" (UniqueName: \"kubernetes.io/projected/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-kube-api-access-bv4lg\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.291496 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.298574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58c0a209-c9c9-4d67-9c4f-bc086ec04be5" (UID: "58c0a209-c9c9-4d67-9c4f-bc086ec04be5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.321970 5030 scope.go:117] "RemoveContainer" containerID="746414e59b0a1e3e1f015ec60e5aebed410f00b7ae6cc7b7630090e1aad843de" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.344415 5030 scope.go:117] "RemoveContainer" containerID="965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.375426 5030 scope.go:117] "RemoveContainer" containerID="965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db" Jan 20 23:32:08 crc kubenswrapper[5030]: E0120 23:32:08.375884 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db\": container with ID starting with 965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db not found: ID does not exist" containerID="965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.375993 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db"} err="failed to get container status \"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db\": rpc error: code = NotFound desc = could not find container \"965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db\": container with ID starting with 965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db not found: ID does not exist" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.392255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wp96\" (UniqueName: \"kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96\") pod \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.392481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data\") pod \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.392716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle\") pod \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\" (UID: \"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138\") " Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393231 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpjb\" (UniqueName: \"kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.393997 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58c0a209-c9c9-4d67-9c4f-bc086ec04be5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.397117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96" (OuterVolumeSpecName: "kube-api-access-7wp96") pod "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" (UID: "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138"). InnerVolumeSpecName "kube-api-access-7wp96". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.421355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" (UID: "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.426532 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data" (OuterVolumeSpecName: "config-data") pod "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" (UID: "43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.479532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.487290 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpjb\" (UniqueName: \"kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495715 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495728 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.495739 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wp96\" (UniqueName: \"kubernetes.io/projected/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138-kube-api-access-7wp96\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.496748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.498597 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.502024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.503286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.503928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.504790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.506664 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: 
I0120 23:32:08.510791 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.511987 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.516060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.517595 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpjb\" (UniqueName: \"kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb\") pod \"ceilometer-0\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.520861 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.548995 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.597592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.597931 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.598029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwjzf\" (UniqueName: \"kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.699257 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwjzf\" (UniqueName: \"kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.699361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.699414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 
23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.703278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.703729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.721915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwjzf\" (UniqueName: \"kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf\") pod \"nova-scheduler-0\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.816795 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:08 crc kubenswrapper[5030]: I0120 23:32:08.827931 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.185785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerStarted","Data":"85fda7f6420a6392ed5af0a51ae368a02ff4b88b53768c801f7a41b0554638c1"} Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.189791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d3774e65-1783-48d9-af7d-3a9964b609ad","Type":"ContainerStarted","Data":"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4"} Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.189901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d3774e65-1783-48d9-af7d-3a9964b609ad","Type":"ContainerStarted","Data":"a948a61b5bd35929d797fe8333c7b5ea55cd36a76861fe7af8012a56b49c1e28"} Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.190135 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.212950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerStarted","Data":"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47"} Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.213595 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.221279 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.222223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138","Type":"ContainerDied","Data":"84edb744cac90df709d3e319b064c1e917012698257aa60d1a82f95a44480254"} Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.222359 5030 scope.go:117] "RemoveContainer" containerID="67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.252496 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.810364201 podStartE2EDuration="2.252470254s" podCreationTimestamp="2026-01-20 23:32:07 +0000 UTC" firstStartedPulling="2026-01-20 23:32:08.281790616 +0000 UTC m=+3400.602050904" lastFinishedPulling="2026-01-20 23:32:08.723896679 +0000 UTC m=+3401.044156957" observedRunningTime="2026-01-20 23:32:09.212917216 +0000 UTC m=+3401.533177504" watchObservedRunningTime="2026-01-20 23:32:09.252470254 +0000 UTC m=+3401.572730552" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.255661 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.25563996 podStartE2EDuration="36.25563996s" podCreationTimestamp="2026-01-20 23:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:09.247232567 +0000 UTC m=+3401.567492875" watchObservedRunningTime="2026-01-20 23:32:09.25563996 +0000 UTC m=+3401.575900248" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.289780 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.307510 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.328364 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.339712 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.341155 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.344372 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.345117 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.410610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.410756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mw7c\" (UniqueName: \"kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.410801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.512385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mw7c\" (UniqueName: \"kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.512435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.512532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.516391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.516909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.536865 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mw7c\" (UniqueName: \"kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c\") pod \"nova-cell1-conductor-0\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.649231 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.658489 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.707404 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.707843 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" podUID="68408e1c-9c9a-465d-9abd-ea36b270db22" containerName="keystone-api" containerID="cri-o://388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e" gracePeriod=30 Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.976432 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52" path="/var/lib/kubelet/pods/1d051a94-6a4a-4fc7-a0d0-d4a7e1a5ab52/volumes" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.977712 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138" path="/var/lib/kubelet/pods/43aac3a1-0cc6-4a8c-ad4a-0fc0f681d138/volumes" Jan 20 23:32:09 crc kubenswrapper[5030]: I0120 23:32:09.978373 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c0a209-c9c9-4d67-9c4f-bc086ec04be5" path="/var/lib/kubelet/pods/58c0a209-c9c9-4d67-9c4f-bc086ec04be5/volumes" Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.163500 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.230453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerStarted","Data":"9933c1796e12a1821918b7663a5fc5bf0366cd5c6ed549d56583f91cd777a7a0"} Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.233421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerStarted","Data":"b2a6251f95696d843b8df69fd2ffb1338c7d7f9c951e93808aaa29c34d6887ed"} Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.233577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerStarted","Data":"61544cfe2b823daee8ed0864944e43bd2a8cd1710711670204ece0993a1f4eb6"} Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.234757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"bd7a6b3c-a038-4385-83a9-16177204e200","Type":"ContainerStarted","Data":"2e9b520770d2dceb5700f36e96093a6b52cf9b6214a65c5c6a9bccaea19b2bbe"} Jan 
20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.246102 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.246087397 podStartE2EDuration="2.246087397s" podCreationTimestamp="2026-01-20 23:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:10.245384139 +0000 UTC m=+3402.565644417" watchObservedRunningTime="2026-01-20 23:32:10.246087397 +0000 UTC m=+3402.566347685" Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.549793 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.604190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:32:10 crc kubenswrapper[5030]: I0120 23:32:10.604430 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="dnsmasq-dns" containerID="cri-o://9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b" gracePeriod=10 Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.037756 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice/crio-ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice/crio-ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013.scope: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.038109 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e5f8238_69f5_43a5_843f_82e5195fde82.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e5f8238_69f5_43a5_843f_82e5195fde82.slice: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.039905 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-conmon-1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-conmon-1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696.scope: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.040151 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696.scope: no such file or directory Jan 
20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.054965 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-conmon-6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-conmon-6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e.scope: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.055038 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e.scope: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.055058 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d051a94_6a4a_4fc7_a0d0_d4a7e1a5ab52.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d051a94_6a4a_4fc7_a0d0_d4a7e1a5ab52.slice: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.060696 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod933d538e_e825_4673_a66a_6eedb28088a8.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod933d538e_e825_4673_a66a_6eedb28088a8.slice: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.063140 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106c4d71_621c_48e5_93b0_52a932f7ec8f.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod106c4d71_621c_48e5_93b0_52a932f7ec8f.slice: no such file or directory Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.099901 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.152289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") pod \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.152418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0\") pod \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.152472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrs4k\" (UniqueName: \"kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k\") pod \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.152631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config\") pod \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.159900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k" (OuterVolumeSpecName: "kube-api-access-qrs4k") pod "7d5a9211-0407-48f1-9d3e-8fb089aa2b56" (UID: "7d5a9211-0407-48f1-9d3e-8fb089aa2b56"). InnerVolumeSpecName "kube-api-access-qrs4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.160830 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.242409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config" (OuterVolumeSpecName: "config") pod "7d5a9211-0407-48f1-9d3e-8fb089aa2b56" (UID: "7d5a9211-0407-48f1-9d3e-8fb089aa2b56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.254388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "7d5a9211-0407-48f1-9d3e-8fb089aa2b56" (UID: "7d5a9211-0407-48f1-9d3e-8fb089aa2b56"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.254739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") pod \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\" (UID: \"7d5a9211-0407-48f1-9d3e-8fb089aa2b56\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.255094 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrs4k\" (UniqueName: \"kubernetes.io/projected/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-kube-api-access-qrs4k\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.255106 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: W0120 23:32:11.255193 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7d5a9211-0407-48f1-9d3e-8fb089aa2b56/volumes/kubernetes.io~configmap/dnsmasq-svc Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.255204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "7d5a9211-0407-48f1-9d3e-8fb089aa2b56" (UID: "7d5a9211-0407-48f1-9d3e-8fb089aa2b56"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.266246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerStarted","Data":"1829fc165f134d3d35e6a4275330f06aec81c9eba4e63208b4c9653f26fa4906"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.266289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerStarted","Data":"78e4e6e095afd325b5fc01623dc7035951add02e1372618dfb0a76729e6c5778"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.266928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7d5a9211-0407-48f1-9d3e-8fb089aa2b56" (UID: "7d5a9211-0407-48f1-9d3e-8fb089aa2b56"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.298733 5030 generic.go:334] "Generic (PLEG): container finished" podID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerID="1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696" exitCode=137 Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.299052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerDied","Data":"1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.303031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"bd7a6b3c-a038-4385-83a9-16177204e200","Type":"ContainerStarted","Data":"bb485dcb26d38ffb2a84701b1fb131d5a4d1deabd2757ae3aa97eb607aca774a"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.305684 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.311878 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerID="9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b" exitCode=0 Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.312251 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.312162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" event={"ID":"7d5a9211-0407-48f1-9d3e-8fb089aa2b56","Type":"ContainerDied","Data":"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.312671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb" event={"ID":"7d5a9211-0407-48f1-9d3e-8fb089aa2b56","Type":"ContainerDied","Data":"39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce"} Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.312704 5030 scope.go:117] "RemoveContainer" containerID="9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.323806 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.3237906649999998 podStartE2EDuration="2.323790665s" podCreationTimestamp="2026-01-20 23:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:11.321369546 +0000 UTC m=+3403.641629834" watchObservedRunningTime="2026-01-20 23:32:11.323790665 +0000 UTC m=+3403.644050953" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.350811 5030 scope.go:117] "RemoveContainer" containerID="50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.361504 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.361609 5030 reconciler_common.go:293] "Volume 
detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d5a9211-0407-48f1-9d3e-8fb089aa2b56-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.383995 5030 scope.go:117] "RemoveContainer" containerID="9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b" Jan 20 23:32:11 crc kubenswrapper[5030]: E0120 23:32:11.388657 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b\": container with ID starting with 9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b not found: ID does not exist" containerID="9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.388726 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b"} err="failed to get container status \"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b\": rpc error: code = NotFound desc = could not find container \"9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b\": container with ID starting with 9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b not found: ID does not exist" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.388756 5030 scope.go:117] "RemoveContainer" containerID="50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe" Jan 20 23:32:11 crc kubenswrapper[5030]: E0120 23:32:11.390258 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe\": container with ID starting with 50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe not found: ID does not exist" containerID="50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.390326 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe"} err="failed to get container status \"50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe\": rpc error: code = NotFound desc = could not find container \"50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe\": container with ID starting with 50588e92c6d2ae18b5d0cda787251d31257c61cb2c33dda23324d2a5c33f60fe not found: ID does not exist" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.414908 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.422185 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.424661 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-8687b85765-4kxkb"] Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2sfm\" (UniqueName: \"kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569497 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.569568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data\") pod \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\" (UID: \"c77f3ffb-97b9-4a1f-9610-7705ae4aee88\") " Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.575247 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.578799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.578829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts" (OuterVolumeSpecName: "scripts") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.582848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm" (OuterVolumeSpecName: "kube-api-access-p2sfm") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "kube-api-access-p2sfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.650062 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.667417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data" (OuterVolumeSpecName: "config-data") pod "c77f3ffb-97b9-4a1f-9610-7705ae4aee88" (UID: "c77f3ffb-97b9-4a1f-9610-7705ae4aee88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671546 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671579 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671591 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671600 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671610 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.671673 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2sfm\" (UniqueName: \"kubernetes.io/projected/c77f3ffb-97b9-4a1f-9610-7705ae4aee88-kube-api-access-p2sfm\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:11 crc kubenswrapper[5030]: I0120 23:32:11.972870 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" path="/var/lib/kubelet/pods/7d5a9211-0407-48f1-9d3e-8fb089aa2b56/volumes" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.324727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"c77f3ffb-97b9-4a1f-9610-7705ae4aee88","Type":"ContainerDied","Data":"c5b38855c50b837a2fe620b8319b6db6b3aec970de01e6ba699b9ec3b5ce1658"} Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.324768 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.324810 5030 scope.go:117] "RemoveContainer" containerID="6e686278be52fa85b6babe42f39d57705fd3b25a80bb2213f78c824fd28e3f1e" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.326905 5030 generic.go:334] "Generic (PLEG): container finished" podID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerID="b2a6251f95696d843b8df69fd2ffb1338c7d7f9c951e93808aaa29c34d6887ed" exitCode=1 Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.327018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerDied","Data":"b2a6251f95696d843b8df69fd2ffb1338c7d7f9c951e93808aaa29c34d6887ed"} Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.327574 5030 scope.go:117] "RemoveContainer" containerID="b2a6251f95696d843b8df69fd2ffb1338c7d7f9c951e93808aaa29c34d6887ed" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.352865 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.355588 5030 scope.go:117] "RemoveContainer" containerID="1bd3eefd66df110f1bde6987f3ca5ec62e35e4907fb703cbc406ac9b3fda7696" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.363469 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.394211 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:32:12 crc kubenswrapper[5030]: E0120 23:32:12.394784 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="dnsmasq-dns" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.394811 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="dnsmasq-dns" Jan 20 23:32:12 crc kubenswrapper[5030]: E0120 23:32:12.394848 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="probe" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.394862 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="probe" Jan 20 23:32:12 crc kubenswrapper[5030]: E0120 23:32:12.394908 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="cinder-scheduler" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.394921 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="cinder-scheduler" Jan 20 23:32:12 crc kubenswrapper[5030]: E0120 23:32:12.394955 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="init" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.394967 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="init" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.395307 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="probe" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.395340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5a9211-0407-48f1-9d3e-8fb089aa2b56" containerName="dnsmasq-dns" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.395384 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" containerName="cinder-scheduler" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.397923 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.404449 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.412669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.492817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.492907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.492952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.492977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.492999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvp7\" (UniqueName: \"kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.493065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.594431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvp7\" (UniqueName: \"kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.595541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.598670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.598672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.599137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.609132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.614169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvp7\" (UniqueName: \"kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7\") pod \"cinder-scheduler-0\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:12 crc kubenswrapper[5030]: I0120 23:32:12.771332 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.229345 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.289064 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.346519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerStarted","Data":"4192700693d1637b6b90305448260e5aec12e2ef6a95aa8b5a4302d3b91fa7ad"} Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.355442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerStarted","Data":"bef839d4347f778186d61e62d0225190937025ed64ee682d8132b9242ecbb895"} Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.355935 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.368722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerStarted","Data":"47daa988a3c5ccca1761a0cec44bd26b3e43c7826edfa50d76aad6620005d357"} Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.387977 5030 generic.go:334] "Generic (PLEG): container finished" podID="68408e1c-9c9a-465d-9abd-ea36b270db22" containerID="388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e" exitCode=0 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.388078 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.388105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" event={"ID":"68408e1c-9c9a-465d-9abd-ea36b270db22","Type":"ContainerDied","Data":"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e"} Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.388144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65885c5c7c-j7j2s" event={"ID":"68408e1c-9c9a-465d-9abd-ea36b270db22","Type":"ContainerDied","Data":"4e61e017fdd7a44394deca0298a839476bfd215ec5ecaf9e15a97b23f9e1e533"} Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.388162 5030 scope.go:117] "RemoveContainer" containerID="388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.410863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcj6v\" (UniqueName: \"kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.411135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.411293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.411463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.411557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.411687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.419406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v" (OuterVolumeSpecName: "kube-api-access-qcj6v") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "kube-api-access-qcj6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.420549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts" (OuterVolumeSpecName: "scripts") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.427844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.430733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.442238 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.785282623 podStartE2EDuration="5.442215687s" podCreationTimestamp="2026-01-20 23:32:08 +0000 UTC" firstStartedPulling="2026-01-20 23:32:08.818523689 +0000 UTC m=+3401.138783977" lastFinishedPulling="2026-01-20 23:32:12.475456753 +0000 UTC m=+3404.795717041" observedRunningTime="2026-01-20 23:32:13.428172687 +0000 UTC m=+3405.748432975" watchObservedRunningTime="2026-01-20 23:32:13.442215687 +0000 UTC m=+3405.762475975" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.452083 5030 scope.go:117] "RemoveContainer" containerID="388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e" Jan 20 23:32:13 crc kubenswrapper[5030]: E0120 23:32:13.468950 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle podName:68408e1c-9c9a-465d-9abd-ea36b270db22 nodeName:}" failed. No retries permitted until 2026-01-20 23:32:13.968921903 +0000 UTC m=+3406.289182191 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22") : error deleting /var/lib/kubelet/pods/68408e1c-9c9a-465d-9abd-ea36b270db22/volume-subpaths: remove /var/lib/kubelet/pods/68408e1c-9c9a-465d-9abd-ea36b270db22/volume-subpaths: no such file or directory Jan 20 23:32:13 crc kubenswrapper[5030]: E0120 23:32:13.469249 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e\": container with ID starting with 388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e not found: ID does not exist" containerID="388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.469336 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e"} err="failed to get container status \"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e\": rpc error: code = NotFound desc = could not find container \"388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e\": container with ID starting with 388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e not found: ID does not exist" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.477722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data" (OuterVolumeSpecName: "config-data") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.518327 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcj6v\" (UniqueName: \"kubernetes.io/projected/68408e1c-9c9a-465d-9abd-ea36b270db22-kube-api-access-qcj6v\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.518361 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.518372 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.518380 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.518389 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.828659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.837507 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.837739 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.850357 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.866190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.866466 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" containerID="cri-o://22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.867004 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" containerID="cri-o://ea6e94f88e69a2c72c358c2a248a3a4b1216e97c838cc4ea35f480cab4e9861f" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.879136 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.879356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" 
podUID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.906365 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.906649 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-log" containerID="cri-o://2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.907145 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-api" containerID="cri-o://20807d44d96cf2571ff139187e55cdd065bf126b9fb85662d43174da48227d54" gracePeriod=30 Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.954325 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:13 crc kubenswrapper[5030]: I0120 23:32:13.988238 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77f3ffb-97b9-4a1f-9610-7705ae4aee88" path="/var/lib/kubelet/pods/c77f3ffb-97b9-4a1f-9610-7705ae4aee88/volumes" Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.027015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") pod \"68408e1c-9c9a-465d-9abd-ea36b270db22\" (UID: \"68408e1c-9c9a-465d-9abd-ea36b270db22\") " Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.048191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68408e1c-9c9a-465d-9abd-ea36b270db22" (UID: "68408e1c-9c9a-465d-9abd-ea36b270db22"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.130736 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68408e1c-9c9a-465d-9abd-ea36b270db22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.220963 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.304899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.305139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-api" containerID="cri-o://0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f" gracePeriod=30 Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.305291 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-httpd" containerID="cri-o://f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4" gracePeriod=30 Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.341566 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.345882 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-65885c5c7c-j7j2s"] Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.412683 5030 generic.go:334] "Generic (PLEG): container finished" podID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerID="22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3" exitCode=143 Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.412743 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerDied","Data":"22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3"} Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.416222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerStarted","Data":"61f13c629eb50d8806b59af161a0e2ebd559836747a102e58d63db7089892566"} Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.419700 5030 generic.go:334] "Generic (PLEG): container finished" podID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerID="2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba" exitCode=143 Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.419749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerDied","Data":"2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba"} Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.433038 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd7a6b3c-a038-4385-83a9-16177204e200" containerID="bb485dcb26d38ffb2a84701b1fb131d5a4d1deabd2757ae3aa97eb607aca774a" exitCode=1 Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.433989 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"bd7a6b3c-a038-4385-83a9-16177204e200","Type":"ContainerDied","Data":"bb485dcb26d38ffb2a84701b1fb131d5a4d1deabd2757ae3aa97eb607aca774a"} Jan 20 23:32:14 crc kubenswrapper[5030]: W0120 23:32:14.574157 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/system.slice/system-systemd\\x2dcoredump.slice/systemd-coredump@6-265072-0.service": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/system.slice/system-systemd\x2dcoredump.slice/systemd-coredump@6-265072-0.service: no such file or directory Jan 20 23:32:14 crc kubenswrapper[5030]: I0120 23:32:14.927814 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.039226 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.042949 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c8797d4c8-svfrs_1ca5f6c1-a17b-4405-93e9-fa2fab576c91/neutron-httpd/0.log" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.043711 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c8797d4c8-svfrs_1ca5f6c1-a17b-4405-93e9-fa2fab576c91/neutron-api/0.log" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.043787 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.050216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mw7c\" (UniqueName: \"kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c\") pod \"bd7a6b3c-a038-4385-83a9-16177204e200\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.050342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle\") pod \"bd7a6b3c-a038-4385-83a9-16177204e200\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.050480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data\") pod \"bd7a6b3c-a038-4385-83a9-16177204e200\" (UID: \"bd7a6b3c-a038-4385-83a9-16177204e200\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.057845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c" (OuterVolumeSpecName: "kube-api-access-5mw7c") pod "bd7a6b3c-a038-4385-83a9-16177204e200" (UID: "bd7a6b3c-a038-4385-83a9-16177204e200"). InnerVolumeSpecName "kube-api-access-5mw7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.091251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd7a6b3c-a038-4385-83a9-16177204e200" (UID: "bd7a6b3c-a038-4385-83a9-16177204e200"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.106669 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.112429 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.115661 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.115699 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.116400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data" (OuterVolumeSpecName: "config-data") pod "bd7a6b3c-a038-4385-83a9-16177204e200" (UID: "bd7a6b3c-a038-4385-83a9-16177204e200"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjp8l\" (UniqueName: \"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l\") pod \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config\") pod \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle\") pod \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf8tm\" (UniqueName: \"kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm\") pod \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle\") pod \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data\") pod \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\" (UID: \"e17cbb6a-58ff-46e9-91d7-d92d52b77b55\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.152875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config\") pod \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\" (UID: \"1ca5f6c1-a17b-4405-93e9-fa2fab576c91\") " Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.153395 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.153413 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd7a6b3c-a038-4385-83a9-16177204e200-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.153422 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mw7c\" (UniqueName: \"kubernetes.io/projected/bd7a6b3c-a038-4385-83a9-16177204e200-kube-api-access-5mw7c\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.156429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l" (OuterVolumeSpecName: "kube-api-access-jjp8l") pod "e17cbb6a-58ff-46e9-91d7-d92d52b77b55" (UID: "e17cbb6a-58ff-46e9-91d7-d92d52b77b55"). InnerVolumeSpecName "kube-api-access-jjp8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.157853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1ca5f6c1-a17b-4405-93e9-fa2fab576c91" (UID: "1ca5f6c1-a17b-4405-93e9-fa2fab576c91"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.160775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm" (OuterVolumeSpecName: "kube-api-access-rf8tm") pod "1ca5f6c1-a17b-4405-93e9-fa2fab576c91" (UID: "1ca5f6c1-a17b-4405-93e9-fa2fab576c91"). InnerVolumeSpecName "kube-api-access-rf8tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.188504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data" (OuterVolumeSpecName: "config-data") pod "e17cbb6a-58ff-46e9-91d7-d92d52b77b55" (UID: "e17cbb6a-58ff-46e9-91d7-d92d52b77b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.206170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e17cbb6a-58ff-46e9-91d7-d92d52b77b55" (UID: "e17cbb6a-58ff-46e9-91d7-d92d52b77b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.216099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ca5f6c1-a17b-4405-93e9-fa2fab576c91" (UID: "1ca5f6c1-a17b-4405-93e9-fa2fab576c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.225804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config" (OuterVolumeSpecName: "config") pod "1ca5f6c1-a17b-4405-93e9-fa2fab576c91" (UID: "1ca5f6c1-a17b-4405-93e9-fa2fab576c91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255250 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255294 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255307 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf8tm\" (UniqueName: \"kubernetes.io/projected/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-kube-api-access-rf8tm\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255318 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255333 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255345 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ca5f6c1-a17b-4405-93e9-fa2fab576c91-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.255359 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjp8l\" (UniqueName: \"kubernetes.io/projected/e17cbb6a-58ff-46e9-91d7-d92d52b77b55-kube-api-access-jjp8l\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.443496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerStarted","Data":"24f4bfb73431d6bcf7cc305370aec071e7e270a5776bad4e062eabb3f4ff4ed1"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.446914 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c8797d4c8-svfrs_1ca5f6c1-a17b-4405-93e9-fa2fab576c91/neutron-httpd/0.log" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447749 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5c8797d4c8-svfrs_1ca5f6c1-a17b-4405-93e9-fa2fab576c91/neutron-api/0.log" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447781 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerID="94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b" exitCode=137 Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerDied","Data":"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" 
event={"ID":"1ca5f6c1-a17b-4405-93e9-fa2fab576c91","Type":"ContainerDied","Data":"64151782faeffac0b0b3a84c40974698501a2092d9149ec0d69ca210c8027e09"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447853 5030 scope.go:117] "RemoveContainer" containerID="94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.447947 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5c8797d4c8-svfrs" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.452576 5030 generic.go:334] "Generic (PLEG): container finished" podID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" containerID="cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50" exitCode=0 Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.452603 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.452672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"e17cbb6a-58ff-46e9-91d7-d92d52b77b55","Type":"ContainerDied","Data":"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.452702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"e17cbb6a-58ff-46e9-91d7-d92d52b77b55","Type":"ContainerDied","Data":"cbf2dd2974b420b1c99104903866e2f9cb223cd17877d633d71540a17d53b366"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.455554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"bd7a6b3c-a038-4385-83a9-16177204e200","Type":"ContainerDied","Data":"2e9b520770d2dceb5700f36e96093a6b52cf9b6214a65c5c6a9bccaea19b2bbe"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.455610 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.461318 5030 generic.go:334] "Generic (PLEG): container finished" podID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerID="f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4" exitCode=0 Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.461489 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" containerID="cri-o://4192700693d1637b6b90305448260e5aec12e2ef6a95aa8b5a4302d3b91fa7ad" gracePeriod=30 Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.461695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerDied","Data":"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4"} Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.465316 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.465299589 podStartE2EDuration="3.465299589s" podCreationTimestamp="2026-01-20 23:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:15.460265928 +0000 UTC m=+3407.780526216" watchObservedRunningTime="2026-01-20 23:32:15.465299589 +0000 UTC m=+3407.785559877" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.481420 5030 scope.go:117] "RemoveContainer" containerID="8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.511124 5030 scope.go:117] "RemoveContainer" containerID="94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.511805 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b\": container with ID starting with 94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b not found: ID does not exist" containerID="94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.511850 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b"} err="failed to get container status \"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b\": rpc error: code = NotFound desc = could not find container \"94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b\": container with ID starting with 94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b not found: ID does not exist" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.511874 5030 scope.go:117] "RemoveContainer" containerID="8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.512276 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70\": container with ID starting with 8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70 not 
found: ID does not exist" containerID="8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.512322 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70"} err="failed to get container status \"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70\": rpc error: code = NotFound desc = could not find container \"8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70\": container with ID starting with 8fcf93bf7f94a7ddab1cf7ab5a1c43d40b4dd3a1caa57e86b368f3b81fec5d70 not found: ID does not exist" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.512351 5030 scope.go:117] "RemoveContainer" containerID="cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.525317 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.539703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.545031 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.545860 5030 scope.go:117] "RemoveContainer" containerID="cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.547957 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50\": container with ID starting with cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50 not found: ID does not exist" containerID="cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.547992 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50"} err="failed to get container status \"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50\": rpc error: code = NotFound desc = could not find container \"cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50\": container with ID starting with cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50 not found: ID does not exist" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.548013 5030 scope.go:117] "RemoveContainer" containerID="bb485dcb26d38ffb2a84701b1fb131d5a4d1deabd2757ae3aa97eb607aca774a" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564483 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.564837 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7a6b3c-a038-4385-83a9-16177204e200" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564847 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7a6b3c-a038-4385-83a9-16177204e200" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.564874 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-api" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564879 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-api" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.564895 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564902 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.564920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68408e1c-9c9a-465d-9abd-ea36b270db22" containerName="keystone-api" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564926 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68408e1c-9c9a-465d-9abd-ea36b270db22" containerName="keystone-api" Jan 20 23:32:15 crc kubenswrapper[5030]: E0120 23:32:15.564934 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-httpd" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.564940 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-httpd" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565112 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-api" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565122 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68408e1c-9c9a-465d-9abd-ea36b270db22" containerName="keystone-api" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565143 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" containerName="neutron-httpd" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565153 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7a6b3c-a038-4385-83a9-16177204e200" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.565738 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.569954 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.587084 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5c8797d4c8-svfrs"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.597908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.604257 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.615654 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.634809 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.635929 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.644981 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.651086 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.660917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.660974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.660998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7c6p\" (UniqueName: \"kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762280 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762367 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8s9\" (UniqueName: \"kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.762503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7c6p\" (UniqueName: \"kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.767006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.779133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.780033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7c6p\" (UniqueName: \"kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.863937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.864392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8s9\" (UniqueName: 
\"kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.864575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.867671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.869202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.881583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8s9\" (UniqueName: \"kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9\") pod \"nova-cell1-conductor-0\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.884763 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.955075 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.995305 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca5f6c1-a17b-4405-93e9-fa2fab576c91" path="/var/lib/kubelet/pods/1ca5f6c1-a17b-4405-93e9-fa2fab576c91/volumes" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.996029 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68408e1c-9c9a-465d-9abd-ea36b270db22" path="/var/lib/kubelet/pods/68408e1c-9c9a-465d-9abd-ea36b270db22/volumes" Jan 20 23:32:15 crc kubenswrapper[5030]: I0120 23:32:15.996674 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7a6b3c-a038-4385-83a9-16177204e200" path="/var/lib/kubelet/pods/bd7a6b3c-a038-4385-83a9-16177204e200/volumes" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:15.997796 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17cbb6a-58ff-46e9-91d7-d92d52b77b55" path="/var/lib/kubelet/pods/e17cbb6a-58ff-46e9-91d7-d92d52b77b55/volumes" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.325498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.394307 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.459501 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.481292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36","Type":"ContainerStarted","Data":"3bc3fb5407dd3c6c5032c7bba1d7112685f8f092f245656031aa75ee63e71784"} Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.512870 5030 generic.go:334] "Generic (PLEG): container finished" podID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerID="4192700693d1637b6b90305448260e5aec12e2ef6a95aa8b5a4302d3b91fa7ad" exitCode=1 Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.512919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerDied","Data":"4192700693d1637b6b90305448260e5aec12e2ef6a95aa8b5a4302d3b91fa7ad"} Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.512949 5030 scope.go:117] "RemoveContainer" containerID="b2a6251f95696d843b8df69fd2ffb1338c7d7f9c951e93808aaa29c34d6887ed" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.556083 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.558105 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.561793 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.562097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.585439 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.624262 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.689946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwjzf\" (UniqueName: \"kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf\") pod \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.690367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle\") pod \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.690448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data\") pod \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\" (UID: \"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c\") " Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.690817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.691446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.691545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntt9\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.691683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " 
pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.691829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.691939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.692019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.692092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.713588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf" (OuterVolumeSpecName: "kube-api-access-qwjzf") pod "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" (UID: "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c"). InnerVolumeSpecName "kube-api-access-qwjzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.744832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" (UID: "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.751831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data" (OuterVolumeSpecName: "config-data") pod "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" (UID: "d30c7a9f-0dab-4a96-8e99-d85cf8961b3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794610 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cntt9\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794688 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794825 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:16 crc 
kubenswrapper[5030]: I0120 23:32:16.794846 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794863 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwjzf\" (UniqueName: \"kubernetes.io/projected/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c-kube-api-access-qwjzf\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.794953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.795223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.798493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.799088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.800267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.800605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.804437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.814220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntt9\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9\") pod \"swift-proxy-6864c8c87d-g8v64\" (UID: 
\"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.886543 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.966198 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.966470 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-central-agent" containerID="cri-o://9933c1796e12a1821918b7663a5fc5bf0366cd5c6ed549d56583f91cd777a7a0" gracePeriod=30 Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.966502 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="proxy-httpd" containerID="cri-o://bef839d4347f778186d61e62d0225190937025ed64ee682d8132b9242ecbb895" gracePeriod=30 Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.966505 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-notification-agent" containerID="cri-o://78e4e6e095afd325b5fc01623dc7035951add02e1372618dfb0a76729e6c5778" gracePeriod=30 Jan 20 23:32:16 crc kubenswrapper[5030]: I0120 23:32:16.966499 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="sg-core" containerID="cri-o://1829fc165f134d3d35e6a4275330f06aec81c9eba4e63208b4c9653f26fa4906" gracePeriod=30 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.305068 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.205:8775/\": read tcp 10.217.0.2:41742->10.217.1.205:8775: read: connection reset by peer" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.305072 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.205:8775/\": read tcp 10.217.0.2:41756->10.217.1.205:8775: read: connection reset by peer" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.321644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.375522 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.540128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"62366776-47f0-4d58-87fc-e76f904677bc","Type":"ContainerStarted","Data":"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.540183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" 
event={"ID":"62366776-47f0-4d58-87fc-e76f904677bc","Type":"ContainerStarted","Data":"695839cee1703cd16648cb2100e7c8e05f0f4ba151c74b46ff61ff564b788ab2"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.541102 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.547368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36","Type":"ContainerStarted","Data":"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.547514 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735" gracePeriod=30 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.556028 5030 generic.go:334] "Generic (PLEG): container finished" podID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerID="20807d44d96cf2571ff139187e55cdd065bf126b9fb85662d43174da48227d54" exitCode=0 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.556038 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.55601915 podStartE2EDuration="2.55601915s" podCreationTimestamp="2026-01-20 23:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:17.553752275 +0000 UTC m=+3409.874012563" watchObservedRunningTime="2026-01-20 23:32:17.55601915 +0000 UTC m=+3409.876279428" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.556091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerDied","Data":"20807d44d96cf2571ff139187e55cdd065bf126b9fb85662d43174da48227d54"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.558971 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.559001 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d30c7a9f-0dab-4a96-8e99-d85cf8961b3c","Type":"ContainerDied","Data":"61544cfe2b823daee8ed0864944e43bd2a8cd1710711670204ece0993a1f4eb6"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.559061 5030 scope.go:117] "RemoveContainer" containerID="4192700693d1637b6b90305448260e5aec12e2ef6a95aa8b5a4302d3b91fa7ad" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.566034 5030 generic.go:334] "Generic (PLEG): container finished" podID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerID="ea6e94f88e69a2c72c358c2a248a3a4b1216e97c838cc4ea35f480cab4e9861f" exitCode=0 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.566119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerDied","Data":"ea6e94f88e69a2c72c358c2a248a3a4b1216e97c838cc4ea35f480cab4e9861f"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.569034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerStarted","Data":"d71a72fc88c049637f628543d92a183999d12f38fd28e487b1624442390391f8"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571040 5030 generic.go:334] "Generic (PLEG): container finished" podID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerID="bef839d4347f778186d61e62d0225190937025ed64ee682d8132b9242ecbb895" exitCode=0 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571073 5030 generic.go:334] "Generic (PLEG): container finished" podID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerID="1829fc165f134d3d35e6a4275330f06aec81c9eba4e63208b4c9653f26fa4906" exitCode=2 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571084 5030 generic.go:334] "Generic (PLEG): container finished" podID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerID="78e4e6e095afd325b5fc01623dc7035951add02e1372618dfb0a76729e6c5778" exitCode=0 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571094 5030 generic.go:334] "Generic (PLEG): container finished" podID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerID="9933c1796e12a1821918b7663a5fc5bf0366cd5c6ed549d56583f91cd777a7a0" exitCode=0 Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerDied","Data":"bef839d4347f778186d61e62d0225190937025ed64ee682d8132b9242ecbb895"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerDied","Data":"1829fc165f134d3d35e6a4275330f06aec81c9eba4e63208b4c9653f26fa4906"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerDied","Data":"78e4e6e095afd325b5fc01623dc7035951add02e1372618dfb0a76729e6c5778"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.571150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerDied","Data":"9933c1796e12a1821918b7663a5fc5bf0366cd5c6ed549d56583f91cd777a7a0"} Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.572833 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.572817767 podStartE2EDuration="2.572817767s" podCreationTimestamp="2026-01-20 23:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:17.565024638 +0000 UTC m=+3409.885284926" watchObservedRunningTime="2026-01-20 23:32:17.572817767 +0000 UTC m=+3409.893078055" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.613217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.622769 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.638675 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:17 crc kubenswrapper[5030]: E0120 23:32:17.639094 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.639112 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: E0120 23:32:17.639138 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.639145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.639321 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.639340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" containerName="nova-scheduler-scheduler" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.639989 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.644185 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.645437 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.656188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.712589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsknb\" (UniqueName: \"kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.712726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.712893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.772020 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.814452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.814507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsknb\" (UniqueName: \"kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.814569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.821192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.822964 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.842099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsknb\" (UniqueName: \"kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb\") pod \"nova-scheduler-0\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.914280 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.985018 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30c7a9f-0dab-4a96-8e99-d85cf8961b3c" path="/var/lib/kubelet/pods/d30c7a9f-0dab-4a96-8e99-d85cf8961b3c/volumes" Jan 20 23:32:17 crc kubenswrapper[5030]: I0120 23:32:17.997541 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.011973 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.031006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.036193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcpmc\" (UniqueName: \"kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc\") pod \"762f567e-ffe6-41fa-bde1-18775c7052b7\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.036338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data\") pod \"762f567e-ffe6-41fa-bde1-18775c7052b7\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.036369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle\") pod \"762f567e-ffe6-41fa-bde1-18775c7052b7\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.036523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs\") pod \"762f567e-ffe6-41fa-bde1-18775c7052b7\" (UID: \"762f567e-ffe6-41fa-bde1-18775c7052b7\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.051765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs" (OuterVolumeSpecName: "logs") pod "762f567e-ffe6-41fa-bde1-18775c7052b7" (UID: "762f567e-ffe6-41fa-bde1-18775c7052b7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.083864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc" (OuterVolumeSpecName: "kube-api-access-mcpmc") pod "762f567e-ffe6-41fa-bde1-18775c7052b7" (UID: "762f567e-ffe6-41fa-bde1-18775c7052b7"). InnerVolumeSpecName "kube-api-access-mcpmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.115275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data" (OuterVolumeSpecName: "config-data") pod "762f567e-ffe6-41fa-bde1-18775c7052b7" (UID: "762f567e-ffe6-41fa-bde1-18775c7052b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.135773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "762f567e-ffe6-41fa-bde1-18775c7052b7" (UID: "762f567e-ffe6-41fa-bde1-18775c7052b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzpjb\" (UniqueName: \"kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: 
I0120 23:32:18.139427 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw4xn\" (UniqueName: \"kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn\") pod \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs\") pod \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle\") pod \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data\") pod \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\" (UID: \"59347f0b-d425-4d66-b2a0-7381c5fa0c3f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.139566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml\") pod \"feadf8b7-c4d1-486b-892d-2de53cf8144f\" (UID: \"feadf8b7-c4d1-486b-892d-2de53cf8144f\") " Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.140288 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.140302 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762f567e-ffe6-41fa-bde1-18775c7052b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.140313 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762f567e-ffe6-41fa-bde1-18775c7052b7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.140322 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcpmc\" (UniqueName: \"kubernetes.io/projected/762f567e-ffe6-41fa-bde1-18775c7052b7-kube-api-access-mcpmc\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.142569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.150762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn" (OuterVolumeSpecName: "kube-api-access-cw4xn") pod "59347f0b-d425-4d66-b2a0-7381c5fa0c3f" (UID: "59347f0b-d425-4d66-b2a0-7381c5fa0c3f"). InnerVolumeSpecName "kube-api-access-cw4xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.150954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.152327 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs" (OuterVolumeSpecName: "logs") pod "59347f0b-d425-4d66-b2a0-7381c5fa0c3f" (UID: "59347f0b-d425-4d66-b2a0-7381c5fa0c3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.156214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb" (OuterVolumeSpecName: "kube-api-access-tzpjb") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "kube-api-access-tzpjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.174313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts" (OuterVolumeSpecName: "scripts") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.196563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.207678 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data" (OuterVolumeSpecName: "config-data") pod "59347f0b-d425-4d66-b2a0-7381c5fa0c3f" (UID: "59347f0b-d425-4d66-b2a0-7381c5fa0c3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.249629 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59347f0b-d425-4d66-b2a0-7381c5fa0c3f" (UID: "59347f0b-d425-4d66-b2a0-7381c5fa0c3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251019 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzpjb\" (UniqueName: \"kubernetes.io/projected/feadf8b7-c4d1-486b-892d-2de53cf8144f-kube-api-access-tzpjb\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251072 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw4xn\" (UniqueName: \"kubernetes.io/projected/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-kube-api-access-cw4xn\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251085 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251101 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251115 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59347f0b-d425-4d66-b2a0-7381c5fa0c3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251126 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251135 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251143 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.251152 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/feadf8b7-c4d1-486b-892d-2de53cf8144f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.260288 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.292915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.320936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data" (OuterVolumeSpecName: "config-data") pod "feadf8b7-c4d1-486b-892d-2de53cf8144f" (UID: "feadf8b7-c4d1-486b-892d-2de53cf8144f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.353795 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.353825 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.353860 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feadf8b7-c4d1-486b-892d-2de53cf8144f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.543150 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.561300 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.585927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"762f567e-ffe6-41fa-bde1-18775c7052b7","Type":"ContainerDied","Data":"13aa8ff0f386c218e015550ef1b633540bd0a0b402aa584614806fad23141ea8"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.585977 5030 scope.go:117] "RemoveContainer" containerID="ea6e94f88e69a2c72c358c2a248a3a4b1216e97c838cc4ea35f480cab4e9861f" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.586106 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.593835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerStarted","Data":"9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.593878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerStarted","Data":"3ca1567f269cf2f2bd9e348bfea75b6cd1bfe32af33811e55a8d700adc4b3be8"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.594882 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.594910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.597673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"feadf8b7-c4d1-486b-892d-2de53cf8144f","Type":"ContainerDied","Data":"85fda7f6420a6392ed5af0a51ae368a02ff4b88b53768c801f7a41b0554638c1"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.597749 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.602030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8922a316-0627-47b3-81ad-b3f7bff4da38","Type":"ContainerStarted","Data":"9244e2541f003ba811aef956752d3a19d91534c85875b25cd48c5bce7f6bd6b8"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.609645 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" gracePeriod=30 Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.609700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"59347f0b-d425-4d66-b2a0-7381c5fa0c3f","Type":"ContainerDied","Data":"174175d4a37a36cc56c6f4cbaeb3c45aab32225280ac671d0de8d047a6cdb780"} Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.609877 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.629601 5030 scope.go:117] "RemoveContainer" containerID="22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.649279 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" podStartSLOduration=2.649256795 podStartE2EDuration="2.649256795s" podCreationTimestamp="2026-01-20 23:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:18.611755077 +0000 UTC m=+3410.932015365" watchObservedRunningTime="2026-01-20 23:32:18.649256795 +0000 UTC m=+3410.969517083" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.688448 5030 scope.go:117] "RemoveContainer" containerID="bef839d4347f778186d61e62d0225190937025ed64ee682d8132b9242ecbb895" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.701153 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.717387 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.745881 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.753835 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754314 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754342 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754373 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754386 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="sg-core" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754392 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="sg-core" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754415 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="proxy-httpd" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754423 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="proxy-httpd" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754432 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-log" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754437 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-log" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-notification-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754459 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-notification-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754467 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-central-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754473 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-central-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: E0120 23:32:18.754483 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-api" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754488 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-api" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754685 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-notification-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754854 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="sg-core" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754872 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="ceilometer-central-agent" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754881 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-log" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754891 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" containerName="nova-api-api" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754905 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-log" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754915 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" containerName="proxy-httpd" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.754926 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" containerName="nova-metadata-metadata" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.765663 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.769893 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.770068 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.770254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.773418 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.785348 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.797861 5030 scope.go:117] "RemoveContainer" containerID="1829fc165f134d3d35e6a4275330f06aec81c9eba4e63208b4c9653f26fa4906" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.802640 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.804720 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.806262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.812498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.820799 5030 scope.go:117] "RemoveContainer" containerID="78e4e6e095afd325b5fc01623dc7035951add02e1372618dfb0a76729e6c5778" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.825963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.837709 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.845826 5030 scope.go:117] "RemoveContainer" containerID="9933c1796e12a1821918b7663a5fc5bf0366cd5c6ed549d56583f91cd777a7a0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.852168 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.853792 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.861004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.862271 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cc29\" (UniqueName: \"kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2kr\" (UniqueName: \"kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc 
kubenswrapper[5030]: I0120 23:32:18.877398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.877461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.884479 5030 scope.go:117] "RemoveContainer" containerID="20807d44d96cf2571ff139187e55cdd065bf126b9fb85662d43174da48227d54" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.905691 5030 scope.go:117] "RemoveContainer" containerID="2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.978796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979417 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cc29\" (UniqueName: \"kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2kr\" (UniqueName: \"kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979885 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.979896 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.980010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.980129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.980910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnqt\" (UniqueName: \"kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.980960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.987290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.987347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.987927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.990459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.991236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.992207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:18 crc kubenswrapper[5030]: I0120 23:32:18.992590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.016156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2kr\" (UniqueName: \"kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr\") pod \"nova-metadata-0\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.017003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cc29\" (UniqueName: \"kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29\") pod \"ceilometer-0\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.082655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.082801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.082841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnqt\" (UniqueName: \"kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.082883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.084212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.091172 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.095179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.102689 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.102846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnqt\" (UniqueName: \"kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt\") pod \"nova-api-0\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.124078 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.173980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.638901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8922a316-0627-47b3-81ad-b3f7bff4da38","Type":"ContainerStarted","Data":"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee"} Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.639291 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="8922a316-0627-47b3-81ad-b3f7bff4da38" containerName="nova-scheduler-scheduler" containerID="cri-o://5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee" gracePeriod=30 Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.672011 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.671990612 podStartE2EDuration="2.671990612s" podCreationTimestamp="2026-01-20 23:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:19.658584047 +0000 UTC m=+3411.978844335" watchObservedRunningTime="2026-01-20 23:32:19.671990612 +0000 UTC m=+3411.992250900" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.683977 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.739933 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.751098 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.881837 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.895349 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: 
I0120 23:32:19.950122 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.981137 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59347f0b-d425-4d66-b2a0-7381c5fa0c3f" path="/var/lib/kubelet/pods/59347f0b-d425-4d66-b2a0-7381c5fa0c3f/volumes" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.982189 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762f567e-ffe6-41fa-bde1-18775c7052b7" path="/var/lib/kubelet/pods/762f567e-ffe6-41fa-bde1-18775c7052b7/volumes" Jan 20 23:32:19 crc kubenswrapper[5030]: I0120 23:32:19.982813 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feadf8b7-c4d1-486b-892d-2de53cf8144f" path="/var/lib/kubelet/pods/feadf8b7-c4d1-486b-892d-2de53cf8144f/volumes" Jan 20 23:32:20 crc kubenswrapper[5030]: E0120 23:32:20.105013 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:20 crc kubenswrapper[5030]: E0120 23:32:20.112890 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:20 crc kubenswrapper[5030]: E0120 23:32:20.114648 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:20 crc kubenswrapper[5030]: E0120 23:32:20.114696 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.657296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerStarted","Data":"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.657349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerStarted","Data":"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.657343 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-log" containerID="cri-o://05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" gracePeriod=30 Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.657362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerStarted","Data":"9bfeda2a713eccb9439290ac047fcfc4d26c02b4fd41a2060741b99a5da742bb"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.657399 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-api" containerID="cri-o://8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" gracePeriod=30 Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.668133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerStarted","Data":"949ac4be0e8dfa966768eb1d9a84f99b2add5daf9b84ac534d860118dc091fa8"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.668793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerStarted","Data":"efeda1b63b650f05ed990fe45e76d00f47c7d1c3650354965fcf23b6b5b0f1aa"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.672675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerStarted","Data":"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.672733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerStarted","Data":"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.672744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerStarted","Data":"a67006b4d0fad8e4e10d2951bdbbfcc61a90439d90cfdc8867db3b6b5b2e1cc5"} Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.672818 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-log" containerID="cri-o://8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" gracePeriod=30 Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.672912 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-metadata" containerID="cri-o://67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" gracePeriod=30 Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.685415 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.685393634 podStartE2EDuration="2.685393634s" podCreationTimestamp="2026-01-20 23:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:20.680027824 +0000 UTC m=+3413.000288162" watchObservedRunningTime="2026-01-20 23:32:20.685393634 +0000 UTC m=+3413.005653922" Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.703218 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.703192524 podStartE2EDuration="2.703192524s" podCreationTimestamp="2026-01-20 23:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:20.699805882 +0000 UTC m=+3413.020066170" watchObservedRunningTime="2026-01-20 23:32:20.703192524 +0000 UTC m=+3413.023452842" Jan 20 23:32:20 crc kubenswrapper[5030]: I0120 23:32:20.901993 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.282973 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.289328 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnqt\" (UniqueName: \"kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt\") pod \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425537 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs\") pod \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr2kr\" (UniqueName: \"kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr\") pod \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data\") pod \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle\") pod \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data\") pod \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\" (UID: \"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle\") pod \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.425991 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs\") pod \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\" (UID: \"129d8b83-1cbd-4f36-b30f-bfa23bb0581d\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.426838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs" (OuterVolumeSpecName: "logs") pod "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" (UID: "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.427189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs" (OuterVolumeSpecName: "logs") pod "129d8b83-1cbd-4f36-b30f-bfa23bb0581d" (UID: "129d8b83-1cbd-4f36-b30f-bfa23bb0581d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.432852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt" (OuterVolumeSpecName: "kube-api-access-zgnqt") pod "129d8b83-1cbd-4f36-b30f-bfa23bb0581d" (UID: "129d8b83-1cbd-4f36-b30f-bfa23bb0581d"). InnerVolumeSpecName "kube-api-access-zgnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.436242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr" (OuterVolumeSpecName: "kube-api-access-lr2kr") pod "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" (UID: "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b"). InnerVolumeSpecName "kube-api-access-lr2kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.462156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data" (OuterVolumeSpecName: "config-data") pod "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" (UID: "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.467107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "129d8b83-1cbd-4f36-b30f-bfa23bb0581d" (UID: "129d8b83-1cbd-4f36-b30f-bfa23bb0581d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.471749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" (UID: "dc6fbe32-ba1a-4466-84d8-ce5e7feb010b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.483746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data" (OuterVolumeSpecName: "config-data") pod "129d8b83-1cbd-4f36-b30f-bfa23bb0581d" (UID: "129d8b83-1cbd-4f36-b30f-bfa23bb0581d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528243 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528294 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528303 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528312 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528321 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgnqt\" (UniqueName: \"kubernetes.io/projected/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-kube-api-access-zgnqt\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528332 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528340 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr2kr\" (UniqueName: \"kubernetes.io/projected/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b-kube-api-access-lr2kr\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.528348 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/129d8b83-1cbd-4f36-b30f-bfa23bb0581d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.661581 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.714956 5030 generic.go:334] "Generic (PLEG): container finished" podID="8922a316-0627-47b3-81ad-b3f7bff4da38" containerID="5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee" exitCode=1 Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.715168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8922a316-0627-47b3-81ad-b3f7bff4da38","Type":"ContainerDied","Data":"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.715197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8922a316-0627-47b3-81ad-b3f7bff4da38","Type":"ContainerDied","Data":"9244e2541f003ba811aef956752d3a19d91534c85875b25cd48c5bce7f6bd6b8"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.715213 5030 scope.go:117] "RemoveContainer" containerID="5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.715373 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726650 5030 generic.go:334] "Generic (PLEG): container finished" podID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerID="8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" exitCode=0 Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726674 5030 generic.go:334] "Generic (PLEG): container finished" podID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerID="05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" exitCode=143 Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726752 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726754 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerDied","Data":"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerDied","Data":"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.726841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"129d8b83-1cbd-4f36-b30f-bfa23bb0581d","Type":"ContainerDied","Data":"9bfeda2a713eccb9439290ac047fcfc4d26c02b4fd41a2060741b99a5da742bb"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.731060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data\") pod \"8922a316-0627-47b3-81ad-b3f7bff4da38\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.731120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsknb\" (UniqueName: \"kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb\") pod \"8922a316-0627-47b3-81ad-b3f7bff4da38\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.731237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle\") pod \"8922a316-0627-47b3-81ad-b3f7bff4da38\" (UID: \"8922a316-0627-47b3-81ad-b3f7bff4da38\") " Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.741169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerStarted","Data":"d0a7ae091b8e99c409e9f71e8676dec7b704731a453b2a67b9cabfc59f9230c5"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.757578 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerID="67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" exitCode=0 Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.757607 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerID="8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" exitCode=143 Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.757649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerDied","Data":"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.757682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerDied","Data":"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 
23:32:21.757693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"dc6fbe32-ba1a-4466-84d8-ce5e7feb010b","Type":"ContainerDied","Data":"a67006b4d0fad8e4e10d2951bdbbfcc61a90439d90cfdc8867db3b6b5b2e1cc5"} Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.757832 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.771949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb" (OuterVolumeSpecName: "kube-api-access-tsknb") pod "8922a316-0627-47b3-81ad-b3f7bff4da38" (UID: "8922a316-0627-47b3-81ad-b3f7bff4da38"). InnerVolumeSpecName "kube-api-access-tsknb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.772844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data" (OuterVolumeSpecName: "config-data") pod "8922a316-0627-47b3-81ad-b3f7bff4da38" (UID: "8922a316-0627-47b3-81ad-b3f7bff4da38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.780719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8922a316-0627-47b3-81ad-b3f7bff4da38" (UID: "8922a316-0627-47b3-81ad-b3f7bff4da38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.833415 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.833455 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsknb\" (UniqueName: \"kubernetes.io/projected/8922a316-0627-47b3-81ad-b3f7bff4da38-kube-api-access-tsknb\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.833467 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8922a316-0627-47b3-81ad-b3f7bff4da38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.862028 5030 scope.go:117] "RemoveContainer" containerID="5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.862516 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee\": container with ID starting with 5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee not found: ID does not exist" containerID="5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.862563 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee"} err="failed to get container status 
\"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee\": rpc error: code = NotFound desc = could not find container \"5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee\": container with ID starting with 5c00320f3cd7578f1299e03d39a202a0074a2fde41b4b4782b08379f7ae774ee not found: ID does not exist" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.862590 5030 scope.go:117] "RemoveContainer" containerID="8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.886435 5030 scope.go:117] "RemoveContainer" containerID="05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.898382 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.926840 5030 scope.go:117] "RemoveContainer" containerID="8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.927685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.931890 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463\": container with ID starting with 8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463 not found: ID does not exist" containerID="8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.931930 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463"} err="failed to get container status \"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463\": rpc error: code = NotFound desc = could not find container \"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463\": container with ID starting with 8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463 not found: ID does not exist" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.931955 5030 scope.go:117] "RemoveContainer" containerID="05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.933553 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab\": container with ID starting with 05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab not found: ID does not exist" containerID="05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.933593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab"} err="failed to get container status \"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab\": rpc error: code = NotFound desc = could not find container \"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab\": container with ID starting with 05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab not found: ID does not exist" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.933631 5030 
scope.go:117] "RemoveContainer" containerID="8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.935663 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463"} err="failed to get container status \"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463\": rpc error: code = NotFound desc = could not find container \"8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463\": container with ID starting with 8554470d4dbde7c1960fa7503e4acb8cda9beaa9f975f277eed7bebd6a9b9463 not found: ID does not exist" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.935695 5030 scope.go:117] "RemoveContainer" containerID="05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.937572 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab"} err="failed to get container status \"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab\": rpc error: code = NotFound desc = could not find container \"05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab\": container with ID starting with 05ef0a5171a222a63350a5b7896dc6f2c19daa97646c112753c1a36d9bcd60ab not found: ID does not exist" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.937607 5030 scope.go:117] "RemoveContainer" containerID="67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.943042 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.970497 5030 scope.go:117] "RemoveContainer" containerID="8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.973782 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" path="/var/lib/kubelet/pods/dc6fbe32-ba1a-4466-84d8-ce5e7feb010b/volumes" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.974889 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976125 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.976572 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-log" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976586 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-log" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.976605 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-metadata" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976611 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-metadata" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.976638 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" 
containerName="nova-api-log" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976644 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-log" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.976659 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8922a316-0627-47b3-81ad-b3f7bff4da38" containerName="nova-scheduler-scheduler" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976665 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8922a316-0627-47b3-81ad-b3f7bff4da38" containerName="nova-scheduler-scheduler" Jan 20 23:32:21 crc kubenswrapper[5030]: E0120 23:32:21.976685 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-api" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976690 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-api" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976851 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-log" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976868 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-metadata" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976879 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8922a316-0627-47b3-81ad-b3f7bff4da38" containerName="nova-scheduler-scheduler" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976892 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" containerName="nova-api-api" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.976909 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc6fbe32-ba1a-4466-84d8-ce5e7feb010b" containerName="nova-metadata-log" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.977908 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.980277 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.993052 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:21 crc kubenswrapper[5030]: I0120 23:32:21.995646 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:21.998234 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.002536 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.005831 5030 scope.go:117] "RemoveContainer" containerID="67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" Jan 20 23:32:22 crc kubenswrapper[5030]: E0120 23:32:22.006676 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36\": container with ID starting with 67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36 not found: ID does not exist" containerID="67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.006716 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36"} err="failed to get container status \"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36\": rpc error: code = NotFound desc = could not find container \"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36\": container with ID starting with 67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36 not found: ID does not exist" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.006741 5030 scope.go:117] "RemoveContainer" containerID="8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" Jan 20 23:32:22 crc kubenswrapper[5030]: E0120 23:32:22.007181 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8\": container with ID starting with 8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8 not found: ID does not exist" containerID="8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.007199 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8"} err="failed to get container status \"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8\": rpc error: code = NotFound desc = could not find container \"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8\": container with ID starting with 8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8 not found: ID does not exist" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.007214 5030 scope.go:117] "RemoveContainer" containerID="67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.007736 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36"} err="failed to get container status \"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36\": rpc error: code = NotFound desc = could not find container \"67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36\": container with ID starting with 
67871a201224ccc342b9f604929ce6833006df8b7fa4f49c225526ed87b74e36 not found: ID does not exist" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.007753 5030 scope.go:117] "RemoveContainer" containerID="8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.009145 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8"} err="failed to get container status \"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8\": rpc error: code = NotFound desc = could not find container \"8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8\": container with ID starting with 8e3b6160193ad9a6b5f02ca69e2679e13f3e9f32f6a594f6c43f86c72d48fbd8 not found: ID does not exist" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.032778 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.036434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.036956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6kc\" (UniqueName: \"kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.036994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.037042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89d9j\" (UniqueName: \"kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.037063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.037086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.037155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.037171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.073989 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.078358 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.085208 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.089929 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.092925 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.122912 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6kc\" (UniqueName: \"kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89d9j\" (UniqueName: \"kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc 
kubenswrapper[5030]: I0120 23:32:22.139511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ps7c\" (UniqueName: \"kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.139818 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.140174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.140271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.140781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.140802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.140917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.141581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.144361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.144795 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.145071 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.148075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.153782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6kc\" (UniqueName: \"kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc\") pod \"nova-metadata-0\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.156566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89d9j\" (UniqueName: \"kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j\") pod \"nova-api-0\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.241926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.242180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.242300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ps7c\" (UniqueName: \"kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.246606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.247178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.258099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8ps7c\" (UniqueName: \"kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c\") pod \"nova-scheduler-0\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.309848 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.320412 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.413148 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.739820 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.802583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerStarted","Data":"07f8f4343762bc08f533b68567863fa6d796dea19f1af42572b0b8beeb71f1f3"} Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.808206 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.826921 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:22 crc kubenswrapper[5030]: I0120 23:32:22.927761 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.073163 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:32:23 crc kubenswrapper[5030]: E0120 23:32:23.225603 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/eb73eef74a077e97d38e8caeae07376fa77c640ae3476021c1836b26fe258a54/diff" to get inode usage: stat /var/lib/containers/storage/overlay/eb73eef74a077e97d38e8caeae07376fa77c640ae3476021c1836b26fe258a54/diff: no such file or directory, extraDiskErr: Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.573759 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:32:23 crc kubenswrapper[5030]: E0120 23:32:23.636010 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/aa1a5d185f77cd2ddd904b172ed5f2e5f19c15015395b18d4d24278236bb508e/diff" to get inode usage: stat /var/lib/containers/storage/overlay/aa1a5d185f77cd2ddd904b172ed5f2e5f19c15015395b18d4d24278236bb508e/diff: no such file or directory, extraDiskErr: Jan 20 23:32:23 crc kubenswrapper[5030]: E0120 23:32:23.677780 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/d201a5dba911385717df34245d1077ed9544bf5b531fe58a60ea205b2ff7108a/diff" to get inode usage: stat /var/lib/containers/storage/overlay/d201a5dba911385717df34245d1077ed9544bf5b531fe58a60ea205b2ff7108a/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack-kuttl-tests_keystone-6676986dff-v2h4r_c20f9105-8569-4c92-98f9-dd35cc9d43a9/keystone-api/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_keystone-6676986dff-v2h4r_c20f9105-8569-4c92-98f9-dd35cc9d43a9/keystone-api/0.log: no such file or directory Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.805190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt"] Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.813176 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-cpcvt"] Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.828340 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f","Type":"ContainerStarted","Data":"cdb6d2f9db6194b1ea4151e2e5ad354315daea751a20f7a37d579ccb89c0b8b0"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.828387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f","Type":"ContainerStarted","Data":"9a45d5c19874a4894842110ca7897b77225f5553648cbcb6877dccdf65e51e86"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.830774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerStarted","Data":"d90bb2c84a3cd15fd33a8899f38b282b8121c83786d060fd74c4c4b2c774cbfe"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.830877 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-central-agent" containerID="cri-o://949ac4be0e8dfa966768eb1d9a84f99b2add5daf9b84ac534d860118dc091fa8" gracePeriod=30 Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.830915 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.830955 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="proxy-httpd" containerID="cri-o://d90bb2c84a3cd15fd33a8899f38b282b8121c83786d060fd74c4c4b2c774cbfe" gracePeriod=30 Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.830985 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-notification-agent" containerID="cri-o://d0a7ae091b8e99c409e9f71e8676dec7b704731a453b2a67b9cabfc59f9230c5" gracePeriod=30 Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.831060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="sg-core" containerID="cri-o://07f8f4343762bc08f533b68567863fa6d796dea19f1af42572b0b8beeb71f1f3" gracePeriod=30 Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.839637 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerStarted","Data":"c24dea5d8d838a9dbdf953c08617234b342ae4bd0aee7b36ee5a69b0bf99fee6"} Jan 20 23:32:23 crc kubenswrapper[5030]: 
I0120 23:32:23.839678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerStarted","Data":"7b306056aeb3ce3bc150f2c02b655a5f02e9143719bfda942c7d8484334555bd"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.839687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerStarted","Data":"c70925694a9d9536009bbd7a6337c10f29110ab7d81e95a973a7668f714dc648"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.847007 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.846991727 podStartE2EDuration="1.846991727s" podCreationTimestamp="2026-01-20 23:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:23.841421402 +0000 UTC m=+3416.161681690" watchObservedRunningTime="2026-01-20 23:32:23.846991727 +0000 UTC m=+3416.167252015" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.878951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"829becc4-31bd-41a4-b235-ad48f57dbee1","Type":"ContainerStarted","Data":"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.878995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"829becc4-31bd-41a4-b235-ad48f57dbee1","Type":"ContainerStarted","Data":"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.879005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"829becc4-31bd-41a4-b235-ad48f57dbee1","Type":"ContainerStarted","Data":"5ffbeec0396e29632c366f67d0b7b96cc67dddd7d380d7c870e788e1c406a0ba"} Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.908193 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.917356 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.329515179 podStartE2EDuration="5.91733881s" podCreationTimestamp="2026-01-20 23:32:18 +0000 UTC" firstStartedPulling="2026-01-20 23:32:19.746209868 +0000 UTC m=+3412.066470156" lastFinishedPulling="2026-01-20 23:32:23.334033499 +0000 UTC m=+3415.654293787" observedRunningTime="2026-01-20 23:32:23.88346564 +0000 UTC m=+3416.203725928" watchObservedRunningTime="2026-01-20 23:32:23.91733881 +0000 UTC m=+3416.237599098" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.933658 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.947874 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.947851628 podStartE2EDuration="2.947851628s" podCreationTimestamp="2026-01-20 23:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:23.91900799 +0000 UTC m=+3416.239268278" watchObservedRunningTime="2026-01-20 
23:32:23.947851628 +0000 UTC m=+3416.268111916" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.981152 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129d8b83-1cbd-4f36-b30f-bfa23bb0581d" path="/var/lib/kubelet/pods/129d8b83-1cbd-4f36-b30f-bfa23bb0581d/volumes" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.982019 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569f5821-3389-4731-95ab-a39e17508f8e" path="/var/lib/kubelet/pods/569f5821-3389-4731-95ab-a39e17508f8e/volumes" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.982565 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8922a316-0627-47b3-81ad-b3f7bff4da38" path="/var/lib/kubelet/pods/8922a316-0627-47b3-81ad-b3f7bff4da38/volumes" Jan 20 23:32:23 crc kubenswrapper[5030]: I0120 23:32:23.983216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:23 crc kubenswrapper[5030]: E0120 23:32:23.994743 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/72453b952f30b945bb67a07340974030198a229810707bd3a43b2e8f33ce034b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/72453b952f30b945bb67a07340974030198a229810707bd3a43b2e8f33ce034b/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_neutron-7d99d84474-qxm76_9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_neutron-7d99d84474-qxm76_9f88af0a-ee00-4d06-bc82-6a34a5dfe6d0/neutron-api/0.log: no such file or directory Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.004532 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.00451065 podStartE2EDuration="3.00451065s" podCreationTimestamp="2026-01-20 23:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:23.968634262 +0000 UTC m=+3416.288894550" watchObservedRunningTime="2026-01-20 23:32:24.00451065 +0000 UTC m=+3416.324770938" Jan 20 23:32:24 crc kubenswrapper[5030]: E0120 23:32:24.013707 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/383762f409996daec58226d33f9e6dfa63508f29c23d93fb34675c1661850451/diff" to get inode usage: stat /var/lib/containers/storage/overlay/383762f409996daec58226d33f9e6dfa63508f29c23d93fb34675c1661850451/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_barbican-keystone-listener-5868d649cd-4hxqk_08962f16-0341-46c7-b0df-9972861083df/barbican-keystone-listener/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_barbican-keystone-listener-5868d649cd-4hxqk_08962f16-0341-46c7-b0df-9972861083df/barbican-keystone-listener/0.log: no such file or directory Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.132558 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-4h926"] Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.133748 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.136446 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.136666 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.158305 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-4h926"] Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.287099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.287356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.287507 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.287710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ptwb\" (UniqueName: \"kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.389815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.389888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.389916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 
23:32:24.389970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ptwb\" (UniqueName: \"kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.395926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.396076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.396683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.412888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ptwb\" (UniqueName: \"kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb\") pod \"nova-cell0-cell-mapping-4h926\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.456356 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:24 crc kubenswrapper[5030]: W0120 23:32:24.810844 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeadf8b7_c4d1_486b_892d_2de53cf8144f.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeadf8b7_c4d1_486b_892d_2de53cf8144f.slice: no such file or directory Jan 20 23:32:24 crc kubenswrapper[5030]: W0120 23:32:24.811105 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30c7a9f_0dab_4a96_8e99_d85cf8961b3c.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30c7a9f_0dab_4a96_8e99_d85cf8961b3c.slice: no such file or directory Jan 20 23:32:24 crc kubenswrapper[5030]: W0120 23:32:24.811140 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd7a6b3c_a038_4385_83a9_16177204e200.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd7a6b3c_a038_4385_83a9_16177204e200.slice: no such file or directory Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.900750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-4h926"] Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.911539 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" exitCode=0 Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.911949 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f7fef1ff-a583-42c4-bb53-b6a684d9afad","Type":"ContainerDied","Data":"9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99"} Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.916278 5030 generic.go:334] "Generic (PLEG): container finished" podID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerID="d90bb2c84a3cd15fd33a8899f38b282b8121c83786d060fd74c4c4b2c774cbfe" exitCode=0 Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.916310 5030 generic.go:334] "Generic (PLEG): container finished" podID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerID="07f8f4343762bc08f533b68567863fa6d796dea19f1af42572b0b8beeb71f1f3" exitCode=2 Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.916319 5030 generic.go:334] "Generic (PLEG): container finished" podID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerID="d0a7ae091b8e99c409e9f71e8676dec7b704731a453b2a67b9cabfc59f9230c5" exitCode=0 Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.916584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerDied","Data":"d90bb2c84a3cd15fd33a8899f38b282b8121c83786d060fd74c4c4b2c774cbfe"} Jan 20 23:32:24 crc kubenswrapper[5030]: I0120 23:32:24.916722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerDied","Data":"07f8f4343762bc08f533b68567863fa6d796dea19f1af42572b0b8beeb71f1f3"} Jan 20 23:32:24 
crc kubenswrapper[5030]: I0120 23:32:24.916765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerDied","Data":"d0a7ae091b8e99c409e9f71e8676dec7b704731a453b2a67b9cabfc59f9230c5"} Jan 20 23:32:24 crc kubenswrapper[5030]: E0120 23:32:24.985853 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice/crio-27d6ada5afe73f9dfe769965c8eb20db23b4135948bb0c76cf36d211eb5ffc7e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca5f6c1_a17b_4405_93e9_fa2fab576c91.slice/crio-conmon-94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dacb533_2f94_4433_ab8e_2efc4ee6e370.slice/crio-394e938f5e1ab967db58f7f06b9d758bfb040635991db16883733f455c6b5a6e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08962f16_0341_46c7_b0df_9972861083df.slice/crio-382ecae1ec29df3c8334fa5c814576e52cef4b516be068ee7b882aca010b9afc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c0a209_c9c9_4d67_9c4f_bc086ec04be5.slice/crio-dd62ecca2e723bec7d0ae9393501f468d6032d9fe6b0b7ae8df93327c50ef9f3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762f567e_ffe6_41fa_bde1_18775c7052b7.slice/crio-22f47e79ac62c2871d4a7b59223d44f759f9d8b75d4c1bbf6a9e584a19c34da3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68408e1c_9c9a_465d_9abd_ea36b270db22.slice/crio-4e61e017fdd7a44394deca0298a839476bfd215ec5ecaf9e15a97b23f9e1e533\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd2760d_ec53_4e6b_bdc8_28e0d366fa3c.slice/crio-3810644d3ad4dbd618d30b4b916ee09f4265c94267ee6ea6b39193711564b3fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20f9105_8569_4c92_98f9_dd35cc9d43a9.slice/crio-3fe947ee9f0bbbedd35dd0ede808e4e4700dfcee91938cbfaad6dac6667227b4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cef2808_064a_47dd_8abb_9f6b08d260be.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc20f9105_8569_4c92_98f9_dd35cc9d43a9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59347f0b_d425_4d66_b2a0_7381c5fa0c3f.slice/crio-2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17cbb6a_58ff_46e9_91d7_d92d52b77b55.slice/crio-cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50.scope\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52fa921e_8a73_400d_9f48_50876a4765f4.slice/crio-conmon-60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52fa921e_8a73_400d_9f48_50876a4765f4.slice/crio-60bdee87e6e2052c44319ee5200fc4ba267eec8e3cf7f7c7482f4b470bee7599.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17cbb6a_58ff_46e9_91d7_d92d52b77b55.slice/crio-conmon-cfe94759062286893ce7379a1dcf043d50382f54749656ee9889c3d05dcb6f50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca5f6c1_a17b_4405_93e9_fa2fab576c91.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f88af0a_ee00_4d06_bc82_6a34a5dfe6d0.slice/crio-conmon-c27061aa78b8d4b16debaf8ab0665638e5f5bcbfe742795f3779e3b600a81d32.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5a9211_0407_48f1_9d3e_8fb089aa2b56.slice/crio-39e392ea69997447b0c3104564e074803eff954643c5d17a12a8dd50622411ce\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43aac3a1_0cc6_4a8c_ad4a_0fc0f681d138.slice/crio-67accc64f7359987e8142ff3b360f4221522f2d2d92c9fbf5c93961817da4971.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17cbb6a_58ff_46e9_91d7_d92d52b77b55.slice/crio-cbf2dd2974b420b1c99104903866e2f9cb223cd17877d633d71540a17d53b366\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43aac3a1_0cc6_4a8c_ad4a_0fc0f681d138.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5a9211_0407_48f1_9d3e_8fb089aa2b56.slice/crio-9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f88af0a_ee00_4d06_bc82_6a34a5dfe6d0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca5f6c1_a17b_4405_93e9_fa2fab576c91.slice/crio-64151782faeffac0b0b3a84c40974698501a2092d9149ec0d69ca210c8027e09\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff299fc0_074c_4420_8ac5_030bebd0c385.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd2760d_ec53_4e6b_bdc8_28e0d366fa3c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cef2808_064a_47dd_8abb_9f6b08d260be.slice/crio-conmon-5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dacb533_2f94_4433_ab8e_2efc4ee6e370.slice/crio-6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5910c0d_bfc0_4b55_8b1b_ded9285b8b80.slice/crio-conmon-f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08962f16_0341_46c7_b0df_9972861083df.slice/crio-c2d0233a880b88e93dbeabd9bcdcab1ad13499f0f2deb9a9aa140faad87db60d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43aac3a1_0cc6_4a8c_ad4a_0fc0f681d138.slice/crio-84edb744cac90df709d3e319b064c1e917012698257aa60d1a82f95a44480254\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dacb533_2f94_4433_ab8e_2efc4ee6e370.slice/crio-conmon-8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17cbb6a_58ff_46e9_91d7_d92d52b77b55.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice/crio-c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5a9211_0407_48f1_9d3e_8fb089aa2b56.slice/crio-conmon-9be471d7edca57b5bc0c6085a96b2c752b0a5617225b4842ba9f8e0453f1fb0b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fd2760d_ec53_4e6b_bdc8_28e0d366fa3c.slice/crio-89c3dd8b3e7fc91cc889f77f886cdedfdb0463cbc962b4028ce5a168ff6ec615\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58c0a209_c9c9_4d67_9c4f_bc086ec04be5.slice/crio-965f8129c0827e86f36346b8ea27c6631c7772c9bb570f3f295be8fe1e4322db.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice/crio-conmon-ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dacb533_2f94_4433_ab8e_2efc4ee6e370.slice/crio-8c3f61c87d162d5598e178ac51202d42ad931e959d58c0518cd748182ab58ea8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5910c0d_bfc0_4b55_8b1b_ded9285b8b80.slice/crio-f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59347f0b_d425_4d66_b2a0_7381c5fa0c3f.slice/crio-conmon-2175f12451ad8dc33d5ba3fc27ace3cfecef118535575d9b83d785039ff027ba.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dacb533_2f94_4433_ab8e_2efc4ee6e370.slice/crio-conmon-6027748eb0f602c09078efd4752c9cce7fdd41807e1b3dc3cd7b2d14753738a7.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77f3ffb_97b9_4a1f_9610_7705ae4aee88.slice/crio-c5b38855c50b837a2fe620b8319b6db6b3aec970de01e6ba699b9ec3b5ce1658\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ca5f6c1_a17b_4405_93e9_fa2fab576c91.slice/crio-94b82bc15de57b2d77f2a173df68d1c3bc2b2bec3f7ea2a42876633d1d1d489b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c50398_f0ad_450d_b494_be906da18af9.slice/crio-conmon-c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68408e1c_9c9a_465d_9abd_ea36b270db22.slice/crio-conmon-388886d50609f46ab30782d021167b8d67acd9933784f50626f2481e348f1a4e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5a9211_0407_48f1_9d3e_8fb089aa2b56.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.100860 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99 is running failed: container process not found" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.101562 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99 is running failed: container process not found" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.102099 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99 is running failed: container process not found" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.102164 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.531083 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.716468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j88lf\" (UniqueName: \"kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf\") pod \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.716539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data\") pod \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.716722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle\") pod \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\" (UID: \"f7fef1ff-a583-42c4-bb53-b6a684d9afad\") " Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.723027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf" (OuterVolumeSpecName: "kube-api-access-j88lf") pod "f7fef1ff-a583-42c4-bb53-b6a684d9afad" (UID: "f7fef1ff-a583-42c4-bb53-b6a684d9afad"). InnerVolumeSpecName "kube-api-access-j88lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.742759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data" (OuterVolumeSpecName: "config-data") pod "f7fef1ff-a583-42c4-bb53-b6a684d9afad" (UID: "f7fef1ff-a583-42c4-bb53-b6a684d9afad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.744740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7fef1ff-a583-42c4-bb53-b6a684d9afad" (UID: "f7fef1ff-a583-42c4-bb53-b6a684d9afad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.819375 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j88lf\" (UniqueName: \"kubernetes.io/projected/f7fef1ff-a583-42c4-bb53-b6a684d9afad-kube-api-access-j88lf\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.819418 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.819432 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7fef1ff-a583-42c4-bb53-b6a684d9afad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.926347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"f7fef1ff-a583-42c4-bb53-b6a684d9afad","Type":"ContainerDied","Data":"f88ba205a7043255fab8362632c57445cbf0e88f7e203535c2a66d85faa0b12d"} Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.926411 5030 scope.go:117] "RemoveContainer" containerID="9c0b415fcd9427d7d542445f1845d68febeca8e97179f236e4de243f8fceda99" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.926364 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.928063 5030 generic.go:334] "Generic (PLEG): container finished" podID="7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" containerID="cdb6d2f9db6194b1ea4151e2e5ad354315daea751a20f7a37d579ccb89c0b8b0" exitCode=1 Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.928122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f","Type":"ContainerDied","Data":"cdb6d2f9db6194b1ea4151e2e5ad354315daea751a20f7a37d579ccb89c0b8b0"} Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.931071 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-log" containerID="cri-o://1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25" gracePeriod=30 Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.932867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" event={"ID":"e290163a-2fc9-492f-b113-1e1a42ff37f9","Type":"ContainerStarted","Data":"114affce9b800cbe0857f605af142f1a72a5d890f7e1f80920fbac277a72b8f3"} Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.932924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" event={"ID":"e290163a-2fc9-492f-b113-1e1a42ff37f9","Type":"ContainerStarted","Data":"5ae3a349ff9649bc0c317dc43094719c687e5372c28f5c066d01b755692d15ed"} Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.933145 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-metadata" containerID="cri-o://f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d" gracePeriod=30 Jan 20 23:32:25 crc 
kubenswrapper[5030]: I0120 23:32:25.933201 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-api" containerID="cri-o://c24dea5d8d838a9dbdf953c08617234b342ae4bd0aee7b36ee5a69b0bf99fee6" gracePeriod=30
Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.933200 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-log" containerID="cri-o://7b306056aeb3ce3bc150f2c02b655a5f02e9143719bfda942c7d8484334555bd" gracePeriod=30
Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.966909 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.968464 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.971454 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 20 23:32:25 crc kubenswrapper[5030]: E0120 23:32:25.977676 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor"
Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.977484 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" podStartSLOduration=1.977464639 podStartE2EDuration="1.977464639s" podCreationTimestamp="2026-01-20 23:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:25.972592392 +0000 UTC m=+3418.292852680" watchObservedRunningTime="2026-01-20 23:32:25.977464639 +0000 UTC m=+3418.297724937"
Jan 20 23:32:25 crc kubenswrapper[5030]: I0120 23:32:25.999119 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"]
Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.008642 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"]
Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.022948 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"]
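The repeated "ExecSync cmd from runtime service failed" entries and the "Probe errored" entry above come from the kubelet's readiness prober: it execs /usr/bin/pgrep -r DRST nova-conductor inside a conductor container that is already being stopped, so the runtime cannot register the exec and the probe errors out, which is expected while the container shuts down. The pod spec itself is not part of this log, but an exec readiness probe of that shape would be declared roughly as in the following sketch using recent k8s.io/api Go types; only the command is taken from the log, and the period and failure-threshold values are illustrative placeholders.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// Sketch of an exec readiness probe matching the command the kubelet runs
// in the entries above. Only the command comes from the log; the timing
// fields below are assumed placeholders, not values read from the pod spec.
var conductorReadinessProbe = corev1.Probe{
	ProbeHandler: corev1.ProbeHandler{
		Exec: &corev1.ExecAction{
			Command: []string{"/usr/bin/pgrep", "-r", "DRST", "nova-conductor"},
		},
	},
	PeriodSeconds:    10, // placeholder
	FailureThreshold: 3,  // placeholder
}

func main() {
	fmt.Println("readiness check:", conductorReadinessProbe.Exec.Command)
}

The pod is marked ready only while that exec succeeds, so once the container enters its stop sequence the failures above are logged until the container is gone and the probe stops running.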
containerName="nova-cell0-conductor-conductor" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.023340 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.023511 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" containerName="nova-cell0-conductor-conductor" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.024092 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.026082 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.058665 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.123451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.123540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49zh\" (UniqueName: \"kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.123586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.225150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d49zh\" (UniqueName: \"kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.225205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.225290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.230228 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.230816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.308133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49zh\" (UniqueName: \"kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh\") pod \"nova-cell0-conductor-0\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.342541 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.438201 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.530801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle\") pod \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.530903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ps7c\" (UniqueName: \"kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c\") pod \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.530978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data\") pod \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\" (UID: \"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f\") " Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.535873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c" (OuterVolumeSpecName: "kube-api-access-8ps7c") pod "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" (UID: "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f"). InnerVolumeSpecName "kube-api-access-8ps7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.554974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data" (OuterVolumeSpecName: "config-data") pod "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" (UID: "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.559442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" (UID: "7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.633592 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ps7c\" (UniqueName: \"kubernetes.io/projected/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-kube-api-access-8ps7c\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.633645 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.633656 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.777836 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.906449 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.913489 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.973682 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:26 crc kubenswrapper[5030]: I0120 23:32:26.980831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e443f370-de90-4c56-ba15-f33683726237","Type":"ContainerStarted","Data":"2d20e74e91df826a223972321541ae454de238264babe509a656b84bf49f54e8"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002195 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002427 5030 generic.go:334] "Generic (PLEG): container finished" podID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerID="f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d" exitCode=0 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002457 5030 generic.go:334] "Generic (PLEG): container finished" podID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerID="1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25" exitCode=143 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002446 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-httpd" containerID="cri-o://86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" gracePeriod=30 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"829becc4-31bd-41a4-b235-ad48f57dbee1","Type":"ContainerDied","Data":"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"829becc4-31bd-41a4-b235-ad48f57dbee1","Type":"ContainerDied","Data":"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002649 5030 scope.go:117] "RemoveContainer" containerID="f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002651 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-server" containerID="cri-o://38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" gracePeriod=30 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.002687 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.028522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f","Type":"ContainerDied","Data":"9a45d5c19874a4894842110ca7897b77225f5553648cbcb6877dccdf65e51e86"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.028655 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.036741 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerID="c24dea5d8d838a9dbdf953c08617234b342ae4bd0aee7b36ee5a69b0bf99fee6" exitCode=0 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.036770 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerID="7b306056aeb3ce3bc150f2c02b655a5f02e9143719bfda942c7d8484334555bd" exitCode=143 Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.036898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerDied","Data":"c24dea5d8d838a9dbdf953c08617234b342ae4bd0aee7b36ee5a69b0bf99fee6"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.036969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerDied","Data":"7b306056aeb3ce3bc150f2c02b655a5f02e9143719bfda942c7d8484334555bd"} Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.041460 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.049157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs\") pod \"829becc4-31bd-41a4-b235-ad48f57dbee1\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.049977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6kc\" (UniqueName: \"kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc\") pod \"829becc4-31bd-41a4-b235-ad48f57dbee1\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.049903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs" (OuterVolumeSpecName: "logs") pod "829becc4-31bd-41a4-b235-ad48f57dbee1" (UID: "829becc4-31bd-41a4-b235-ad48f57dbee1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.054576 5030 scope.go:117] "RemoveContainer" containerID="1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.057976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc" (OuterVolumeSpecName: "kube-api-access-7c6kc") pod "829becc4-31bd-41a4-b235-ad48f57dbee1" (UID: "829becc4-31bd-41a4-b235-ad48f57dbee1"). InnerVolumeSpecName "kube-api-access-7c6kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.058961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle\") pod \"829becc4-31bd-41a4-b235-ad48f57dbee1\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.059151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data\") pod \"829becc4-31bd-41a4-b235-ad48f57dbee1\" (UID: \"829becc4-31bd-41a4-b235-ad48f57dbee1\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.060030 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/829becc4-31bd-41a4-b235-ad48f57dbee1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.060056 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6kc\" (UniqueName: \"kubernetes.io/projected/829becc4-31bd-41a4-b235-ad48f57dbee1-kube-api-access-7c6kc\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.098880 5030 scope.go:117] "RemoveContainer" containerID="f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.101324 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d\": container with ID starting with f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d not found: ID does not exist" containerID="f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.101371 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d"} err="failed to get container status \"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d\": rpc error: code = NotFound desc = could not find container \"f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d\": container with ID starting with f10b5944ce5d36e7efa431dc68a535e4bf1305dd15aa9302dcb64353989f638d not found: ID does not exist" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.101394 5030 scope.go:117] "RemoveContainer" containerID="1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.103441 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25\": container with ID starting with 1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25 not found: ID does not exist" containerID="1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.103485 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25"} err="failed to get container status \"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25\": rpc error: code = NotFound desc = could not 
find container \"1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25\": container with ID starting with 1f0af1bb554adf26fc05657d277f62fef7d8a6b83f95128281d44a82cffa4c25 not found: ID does not exist" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.103513 5030 scope.go:117] "RemoveContainer" containerID="cdb6d2f9db6194b1ea4151e2e5ad354315daea751a20f7a37d579ccb89c0b8b0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.113090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data" (OuterVolumeSpecName: "config-data") pod "829becc4-31bd-41a4-b235-ad48f57dbee1" (UID: "829becc4-31bd-41a4-b235-ad48f57dbee1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.117893 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.127848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.135683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.136132 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-metadata" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136144 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-metadata" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.136160 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-log" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136166 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-log" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.136175 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" containerName="nova-scheduler-scheduler" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136184 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" containerName="nova-scheduler-scheduler" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.136198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-log" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136204 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-log" Jan 20 23:32:27 crc kubenswrapper[5030]: E0120 23:32:27.136228 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-api" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-api" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136400 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-metadata" Jan 20 23:32:27 crc 
kubenswrapper[5030]: I0120 23:32:27.136411 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" containerName="nova-scheduler-scheduler" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136420 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-api" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136428 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" containerName="nova-api-log" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.136441 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" containerName="nova-metadata-log" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.137127 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.137788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "829becc4-31bd-41a4-b235-ad48f57dbee1" (UID: "829becc4-31bd-41a4-b235-ad48f57dbee1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.139745 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.149215 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.161189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs\") pod \"1d85642b-b230-46f2-8eed-f66a85e61c1b\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.161573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89d9j\" (UniqueName: \"kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j\") pod \"1d85642b-b230-46f2-8eed-f66a85e61c1b\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.161781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs" (OuterVolumeSpecName: "logs") pod "1d85642b-b230-46f2-8eed-f66a85e61c1b" (UID: "1d85642b-b230-46f2-8eed-f66a85e61c1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.162495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle\") pod \"1d85642b-b230-46f2-8eed-f66a85e61c1b\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.162686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data\") pod \"1d85642b-b230-46f2-8eed-f66a85e61c1b\" (UID: \"1d85642b-b230-46f2-8eed-f66a85e61c1b\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.163528 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d85642b-b230-46f2-8eed-f66a85e61c1b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.164150 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.164170 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829becc4-31bd-41a4-b235-ad48f57dbee1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.164601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j" (OuterVolumeSpecName: "kube-api-access-89d9j") pod "1d85642b-b230-46f2-8eed-f66a85e61c1b" (UID: "1d85642b-b230-46f2-8eed-f66a85e61c1b"). InnerVolumeSpecName "kube-api-access-89d9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.207897 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data" (OuterVolumeSpecName: "config-data") pod "1d85642b-b230-46f2-8eed-f66a85e61c1b" (UID: "1d85642b-b230-46f2-8eed-f66a85e61c1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.208048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d85642b-b230-46f2-8eed-f66a85e61c1b" (UID: "1d85642b-b230-46f2-8eed-f66a85e61c1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.265880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.266014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5dl\" (UniqueName: \"kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.266046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.266190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89d9j\" (UniqueName: \"kubernetes.io/projected/1d85642b-b230-46f2-8eed-f66a85e61c1b-kube-api-access-89d9j\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.266221 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.266231 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d85642b-b230-46f2-8eed-f66a85e61c1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.332375 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.341044 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.354580 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.356116 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.358473 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.358543 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.364043 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.367537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5dl\" (UniqueName: \"kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.367587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.367705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.372016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.374000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.389177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5dl\" (UniqueName: \"kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl\") pod \"nova-scheduler-0\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.463996 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.469388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.469553 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.469628 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.469667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.469724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wnh\" (UniqueName: \"kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.571887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.572310 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.572333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.572362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 
crc kubenswrapper[5030]: I0120 23:32:27.572418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wnh\" (UniqueName: \"kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.574419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.579440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.579773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.586351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.590853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wnh\" (UniqueName: \"kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh\") pod \"nova-metadata-0\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.669417 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.844660 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.974373 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.980987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd\") pod \"4441979e-fb8d-4588-9180-8b1b74b3ecae\" (UID: \"4441979e-fb8d-4588-9180-8b1b74b3ecae\") " Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.981611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.982551 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:27 crc kubenswrapper[5030]: I0120 23:32:27.989017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn" (OuterVolumeSpecName: "kube-api-access-jchqn") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "kube-api-access-jchqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.003492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.017927 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f" path="/var/lib/kubelet/pods/7eb4b558-b9f0-476e-bbbe-2f5307bd9e4f/volumes" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.020229 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829becc4-31bd-41a4-b235-ad48f57dbee1" path="/var/lib/kubelet/pods/829becc4-31bd-41a4-b235-ad48f57dbee1/volumes" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.020964 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7fef1ff-a583-42c4-bb53-b6a684d9afad" path="/var/lib/kubelet/pods/f7fef1ff-a583-42c4-bb53-b6a684d9afad/volumes" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.051835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.052876 5030 generic.go:334] "Generic (PLEG): container finished" podID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerID="38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" exitCode=0 Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.052993 5030 generic.go:334] "Generic (PLEG): container finished" podID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerID="86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" exitCode=0 Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.052989 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.065495 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.066958 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.06694088 podStartE2EDuration="3.06694088s" podCreationTimestamp="2026-01-20 23:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:28.062190085 +0000 UTC m=+3420.382450383" watchObservedRunningTime="2026-01-20 23:32:28.06694088 +0000 UTC m=+3420.387201168" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.073748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data" (OuterVolumeSpecName: "config-data") pod "4441979e-fb8d-4588-9180-8b1b74b3ecae" (UID: "4441979e-fb8d-4588-9180-8b1b74b3ecae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.083464 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.083598 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.089533 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4441979e-fb8d-4588-9180-8b1b74b3ecae-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.089628 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.089643 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4441979e-fb8d-4588-9180-8b1b74b3ecae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.089661 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/4441979e-fb8d-4588-9180-8b1b74b3ecae-kube-api-access-jchqn\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092647 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e443f370-de90-4c56-ba15-f33683726237","Type":"ContainerStarted","Data":"ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerDied","Data":"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092838 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerDied","Data":"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092851 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7" event={"ID":"4441979e-fb8d-4588-9180-8b1b74b3ecae","Type":"ContainerDied","Data":"137df26c6bb4d8670f11ce9fa1ef37b015be34b9b5f49f4140a64e683476e6d7"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"1d85642b-b230-46f2-8eed-f66a85e61c1b","Type":"ContainerDied","Data":"c70925694a9d9536009bbd7a6337c10f29110ab7d81e95a973a7668f714dc648"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.092885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"be1b03a7-ded3-477f-b573-e021cd6141c6","Type":"ContainerStarted","Data":"1b35bdca6567f37338a555342e246159e645960e78dabba2ea63a83b98265898"} Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.093095 5030 scope.go:117] "RemoveContainer" containerID="38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.158654 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.159932 5030 scope.go:117] "RemoveContainer" containerID="86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.192658 5030 scope.go:117] "RemoveContainer" containerID="38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" Jan 20 23:32:28 crc kubenswrapper[5030]: E0120 23:32:28.193540 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef\": container with ID starting with 38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef not found: ID does not exist" containerID="38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.193572 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef"} err="failed to get container status \"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef\": rpc error: code = NotFound desc = could not find container \"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef\": container with ID starting with 38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef not found: ID does not exist" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.193592 5030 scope.go:117] "RemoveContainer" containerID="86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" Jan 20 23:32:28 crc kubenswrapper[5030]: E0120 23:32:28.194580 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314\": container with ID starting with 86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314 not found: ID does not exist" 
containerID="86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.194604 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314"} err="failed to get container status \"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314\": rpc error: code = NotFound desc = could not find container \"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314\": container with ID starting with 86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314 not found: ID does not exist" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.194662 5030 scope.go:117] "RemoveContainer" containerID="38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.195694 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef"} err="failed to get container status \"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef\": rpc error: code = NotFound desc = could not find container \"38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef\": container with ID starting with 38937736ddc0091dcd552a6a9dc31e776ebab95f90327452a88269fccc0cd1ef not found: ID does not exist" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.195718 5030 scope.go:117] "RemoveContainer" containerID="86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.196124 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314"} err="failed to get container status \"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314\": rpc error: code = NotFound desc = could not find container \"86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314\": container with ID starting with 86713f067927c211e620da0971b5cab91069e8bd851b3d57a900421c967e1314 not found: ID does not exist" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.196269 5030 scope.go:117] "RemoveContainer" containerID="c24dea5d8d838a9dbdf953c08617234b342ae4bd0aee7b36ee5a69b0bf99fee6" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.206829 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.216699 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:28 crc kubenswrapper[5030]: E0120 23:32:28.217399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-httpd" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.217465 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-httpd" Jan 20 23:32:28 crc kubenswrapper[5030]: E0120 23:32:28.217552 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-server" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.217606 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-server" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.217828 
5030 scope.go:117] "RemoveContainer" containerID="7b306056aeb3ce3bc150f2c02b655a5f02e9143719bfda942c7d8484334555bd" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.217988 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-httpd" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.218067 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" containerName="proxy-server" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.219300 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.223522 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.223750 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.223905 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.224932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.239849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8zm6\" (UniqueName: \"kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293634 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.293738 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.387532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8zm6\" (UniqueName: \"kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.394923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.395246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.396886 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7669cfdd4b-6jmf7"] Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.399680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" 
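
The entries above trace the volume lifecycle for openstack-kuttl-tests/nova-api-0: `VerifyControllerAttachedVolume started`, then `MountVolume started`, then `MountVolume.SetUp succeeded` for each volume of pod UID 236920b9-1246-4f61-bb2e-4bc11bbcdc08. The sketch below shows one way to pull that timeline out of a saved kubelet journal; it is a minimal illustration, assuming the log has been written to a plain-text file (for example via `journalctl -u kubelet`), and the file path and pod UID passed on the command line are placeholders, not values fixed by this log.

```python
#!/usr/bin/env python3
"""Minimal sketch: reconstruct a pod's volume attach/mount timeline from
kubelet journal lines like the ones above. Assumes the journal has been saved
as plain text (e.g. `journalctl -u kubelet > kubelet.log`); the log path and
pod UID are illustrative placeholders supplied by the caller."""

import re
import sys

# Matches the klog timestamp (HH:MM:SS.micros) plus the volume name quoted in
# the "operationExecutor.*Volume started" / "MountVolume.SetUp succeeded"
# messages; the \\? allows for the escaped quotes seen in the raw entries.
LINE_RE = re.compile(
    r'(?P<ts>\d{2}:\d{2}:\d{2}\.\d+).*'
    r'(?P<event>VerifyControllerAttachedVolume started'
    r'|MountVolume started'
    r'|MountVolume\.SetUp succeeded)'
    r'.*?volume \\?"(?P<volume>[^"\\]+)\\?"'
)


def mount_timeline(path: str, pod_uid: str):
    """Yield (timestamp, event, volume) tuples for one pod UID, in log order."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if pod_uid not in line:
                continue
            m = LINE_RE.search(line)
            if m:
                yield m.group("ts"), m.group("event"), m.group("volume")


if __name__ == "__main__":
    # Usage: ./mount_timeline.py kubelet.log 236920b9-1246-4f61-bb2e-4bc11bbcdc08
    log_path, uid = sys.argv[1], sys.argv[2]
    for ts, event, volume in mount_timeline(log_path, uid):
        print(f"{ts}  {event:<42}  {volume}")
```

Run against a capture containing the entries above, this would print one line per volume event (kube-api-access-g8zm6, internal-tls-certs, logs, combined-ca-bundle, config-data, public-tls-certs), making it easy to see how long each volume took to go from "started" to "SetUp succeeded".
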
Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.399702 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.401966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.401992 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.411707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8zm6\" (UniqueName: \"kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6\") pod \"nova-api-0\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.543236 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:28 crc kubenswrapper[5030]: I0120 23:32:28.974809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.080506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerStarted","Data":"1b7b5f74499c98105149ce03214e0bf559a3366406175e45d3602c7695472d30"} Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.082860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerStarted","Data":"aa5fa8d6d3684ec051cffc9d6f9e2a0270374a2dfed62649f32932efff1cec2b"} Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.082891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerStarted","Data":"785f3390655955d663a5f7c707be245ffc609cbc1e023d3090c7bf9faf70f823"} Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.082901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerStarted","Data":"3a2a62c71ec55e37df5e75c8d20fa43e4babe542baa75bee0eb164905ac46fb7"} Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.087351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"be1b03a7-ded3-477f-b573-e021cd6141c6","Type":"ContainerStarted","Data":"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407"} Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.109362 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.109339143 
podStartE2EDuration="2.109339143s" podCreationTimestamp="2026-01-20 23:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:29.098808769 +0000 UTC m=+3421.419069057" watchObservedRunningTime="2026-01-20 23:32:29.109339143 +0000 UTC m=+3421.429599431" Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.127894 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.127878442 podStartE2EDuration="2.127878442s" podCreationTimestamp="2026-01-20 23:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:29.124525511 +0000 UTC m=+3421.444785799" watchObservedRunningTime="2026-01-20 23:32:29.127878442 +0000 UTC m=+3421.448138730" Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.978249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d85642b-b230-46f2-8eed-f66a85e61c1b" path="/var/lib/kubelet/pods/1d85642b-b230-46f2-8eed-f66a85e61c1b/volumes" Jan 20 23:32:29 crc kubenswrapper[5030]: I0120 23:32:29.980667 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4441979e-fb8d-4588-9180-8b1b74b3ecae" path="/var/lib/kubelet/pods/4441979e-fb8d-4588-9180-8b1b74b3ecae/volumes" Jan 20 23:32:30 crc kubenswrapper[5030]: I0120 23:32:30.100326 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerStarted","Data":"2de8fa2a037b67168804e69f75199eca84ad18411f17e7b89cf62579ebc1392c"} Jan 20 23:32:30 crc kubenswrapper[5030]: I0120 23:32:30.100393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerStarted","Data":"6e9ead31f5362c40cf442b019ebe6d5c57c2a7f054c096d807665f657db342a5"} Jan 20 23:32:30 crc kubenswrapper[5030]: I0120 23:32:30.103329 5030 generic.go:334] "Generic (PLEG): container finished" podID="e290163a-2fc9-492f-b113-1e1a42ff37f9" containerID="114affce9b800cbe0857f605af142f1a72a5d890f7e1f80920fbac277a72b8f3" exitCode=0 Jan 20 23:32:30 crc kubenswrapper[5030]: I0120 23:32:30.104162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" event={"ID":"e290163a-2fc9-492f-b113-1e1a42ff37f9","Type":"ContainerDied","Data":"114affce9b800cbe0857f605af142f1a72a5d890f7e1f80920fbac277a72b8f3"} Jan 20 23:32:30 crc kubenswrapper[5030]: I0120 23:32:30.128070 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.128048734 podStartE2EDuration="2.128048734s" podCreationTimestamp="2026-01-20 23:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:30.119564549 +0000 UTC m=+3422.439824837" watchObservedRunningTime="2026-01-20 23:32:30.128048734 +0000 UTC m=+3422.448309022" Jan 20 23:32:30 crc kubenswrapper[5030]: E0120 23:32:30.958602 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:30 crc kubenswrapper[5030]: E0120 23:32:30.960502 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:30 crc kubenswrapper[5030]: E0120 23:32:30.962406 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:30 crc kubenswrapper[5030]: E0120 23:32:30.962456 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.534493 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.677162 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle\") pod \"e290163a-2fc9-492f-b113-1e1a42ff37f9\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.677309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data\") pod \"e290163a-2fc9-492f-b113-1e1a42ff37f9\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.677358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ptwb\" (UniqueName: \"kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb\") pod \"e290163a-2fc9-492f-b113-1e1a42ff37f9\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.677388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts\") pod \"e290163a-2fc9-492f-b113-1e1a42ff37f9\" (UID: \"e290163a-2fc9-492f-b113-1e1a42ff37f9\") " Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.682397 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb" (OuterVolumeSpecName: "kube-api-access-8ptwb") pod "e290163a-2fc9-492f-b113-1e1a42ff37f9" (UID: "e290163a-2fc9-492f-b113-1e1a42ff37f9"). InnerVolumeSpecName "kube-api-access-8ptwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.682548 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts" (OuterVolumeSpecName: "scripts") pod "e290163a-2fc9-492f-b113-1e1a42ff37f9" (UID: "e290163a-2fc9-492f-b113-1e1a42ff37f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.706969 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data" (OuterVolumeSpecName: "config-data") pod "e290163a-2fc9-492f-b113-1e1a42ff37f9" (UID: "e290163a-2fc9-492f-b113-1e1a42ff37f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.711481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e290163a-2fc9-492f-b113-1e1a42ff37f9" (UID: "e290163a-2fc9-492f-b113-1e1a42ff37f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.779464 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.779502 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ptwb\" (UniqueName: \"kubernetes.io/projected/e290163a-2fc9-492f-b113-1e1a42ff37f9-kube-api-access-8ptwb\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.779512 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.779523 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e290163a-2fc9-492f-b113-1e1a42ff37f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:31 crc kubenswrapper[5030]: I0120 23:32:31.909653 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.084481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config\") pod \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.084811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config\") pod \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.084852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle\") pod \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.084888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rmsj\" (UniqueName: \"kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj\") pod \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\" (UID: \"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.089408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj" (OuterVolumeSpecName: "kube-api-access-8rmsj") pod "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" (UID: "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80"). InnerVolumeSpecName "kube-api-access-8rmsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.089869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" (UID: "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.126124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" event={"ID":"e290163a-2fc9-492f-b113-1e1a42ff37f9","Type":"ContainerDied","Data":"5ae3a349ff9649bc0c317dc43094719c687e5372c28f5c066d01b755692d15ed"} Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.126193 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae3a349ff9649bc0c317dc43094719c687e5372c28f5c066d01b755692d15ed" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.126188 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-4h926" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.129108 5030 generic.go:334] "Generic (PLEG): container finished" podID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerID="949ac4be0e8dfa966768eb1d9a84f99b2add5daf9b84ac534d860118dc091fa8" exitCode=0 Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.129175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerDied","Data":"949ac4be0e8dfa966768eb1d9a84f99b2add5daf9b84ac534d860118dc091fa8"} Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.139800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config" (OuterVolumeSpecName: "config") pod "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" (UID: "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.142062 5030 generic.go:334] "Generic (PLEG): container finished" podID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerID="0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f" exitCode=0 Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.142167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerDied","Data":"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f"} Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.142255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" event={"ID":"f5910c0d-bfc0-4b55-8b1b-ded9285b8b80","Type":"ContainerDied","Data":"529b7c5956ff76bd7df792dcf6277eaa728c5dfc90f8ce1528d08fcae5c529c1"} Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.142288 5030 scope.go:117] "RemoveContainer" containerID="f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.142811 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78c57799cd-6f6m4" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.158309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" (UID: "f5910c0d-bfc0-4b55-8b1b-ded9285b8b80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.166074 5030 scope.go:117] "RemoveContainer" containerID="0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.187127 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.187165 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.187176 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.187187 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rmsj\" (UniqueName: \"kubernetes.io/projected/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80-kube-api-access-8rmsj\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.193047 5030 scope.go:117] "RemoveContainer" containerID="f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4" Jan 20 23:32:32 crc kubenswrapper[5030]: E0120 23:32:32.193466 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4\": container with ID starting with f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4 not found: ID does not exist" containerID="f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.193508 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4"} err="failed to get container status \"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4\": rpc error: code = NotFound desc = could not find container \"f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4\": container with ID starting with f57edd0f56b30083d7dbd05b95a68c2e71b862cbddeb9d855eb55a1a661412c4 not found: ID does not exist" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.193531 5030 scope.go:117] "RemoveContainer" containerID="0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f" Jan 20 23:32:32 crc kubenswrapper[5030]: E0120 23:32:32.193909 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f\": container with ID starting with 0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f not found: ID does not exist" containerID="0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.194003 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f"} err="failed to get container status \"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f\": rpc error: code = NotFound desc = could 
not find container \"0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f\": container with ID starting with 0479d8ca3c7e55d792916ca93b8115469fdc13f431eb12a4779cf505b8f82e6f not found: ID does not exist" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.205034 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cc29\" (UniqueName: \"kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.288526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs\") pod \"e05505c3-5757-4e56-bd28-6ea231262b0d\" (UID: \"e05505c3-5757-4e56-bd28-6ea231262b0d\") " Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.289175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: 
"e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.289354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.293000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29" (OuterVolumeSpecName: "kube-api-access-4cc29") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "kube-api-access-4cc29". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.293064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts" (OuterVolumeSpecName: "scripts") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.315216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.338838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.382076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390141 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390166 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390178 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390189 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cc29\" (UniqueName: \"kubernetes.io/projected/e05505c3-5757-4e56-bd28-6ea231262b0d-kube-api-access-4cc29\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390200 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390208 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e05505c3-5757-4e56-bd28-6ea231262b0d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.390216 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.391738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data" (OuterVolumeSpecName: "config-data") pod "e05505c3-5757-4e56-bd28-6ea231262b0d" (UID: "e05505c3-5757-4e56-bd28-6ea231262b0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.464980 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.491387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e05505c3-5757-4e56-bd28-6ea231262b0d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.504191 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.511608 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-78c57799cd-6f6m4"] Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.670668 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:32 crc kubenswrapper[5030]: I0120 23:32:32.670756 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.164127 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"e05505c3-5757-4e56-bd28-6ea231262b0d","Type":"ContainerDied","Data":"efeda1b63b650f05ed990fe45e76d00f47c7d1c3650354965fcf23b6b5b0f1aa"} Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.164770 5030 scope.go:117] "RemoveContainer" containerID="d90bb2c84a3cd15fd33a8899f38b282b8121c83786d060fd74c4c4b2c774cbfe" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.164228 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.200849 5030 scope.go:117] "RemoveContainer" containerID="07f8f4343762bc08f533b68567863fa6d796dea19f1af42572b0b8beeb71f1f3" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.206262 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.217123 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.243051 5030 scope.go:117] "RemoveContainer" containerID="d0a7ae091b8e99c409e9f71e8676dec7b704731a453b2a67b9cabfc59f9230c5" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.249659 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250483 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250518 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250544 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-api" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250555 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-api" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="sg-core" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250588 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="sg-core" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250602 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="proxy-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="proxy-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250668 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-central-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250680 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-central-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250703 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e290163a-2fc9-492f-b113-1e1a42ff37f9" containerName="nova-manage" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250714 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e290163a-2fc9-492f-b113-1e1a42ff37f9" containerName="nova-manage" Jan 20 23:32:33 crc kubenswrapper[5030]: E0120 23:32:33.250763 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-notification-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.250776 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-notification-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251059 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e290163a-2fc9-492f-b113-1e1a42ff37f9" containerName="nova-manage" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251093 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-central-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251121 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251158 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="ceilometer-notification-agent" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251182 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="sg-core" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251202 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" containerName="neutron-api" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.251222 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" containerName="proxy-httpd" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.254382 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.258306 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.258607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.258793 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.263977 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.288206 5030 scope.go:117] "RemoveContainer" containerID="949ac4be0e8dfa966768eb1d9a84f99b2add5daf9b84ac534d860118dc091fa8" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.411242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.411383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.411419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.411783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.411914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.412086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.412334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.412413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg5r5\" (UniqueName: \"kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515470 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515617 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.515967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg5r5\" (UniqueName: \"kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.518261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.519129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.521614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.524256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.526691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.529723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.529627 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.537996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg5r5\" (UniqueName: \"kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5\") pod \"ceilometer-0\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.580547 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.972210 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05505c3-5757-4e56-bd28-6ea231262b0d" path="/var/lib/kubelet/pods/e05505c3-5757-4e56-bd28-6ea231262b0d/volumes" Jan 20 23:32:33 crc kubenswrapper[5030]: I0120 23:32:33.973412 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5910c0d-bfc0-4b55-8b1b-ded9285b8b80" path="/var/lib/kubelet/pods/f5910c0d-bfc0-4b55-8b1b-ded9285b8b80/volumes" Jan 20 23:32:34 crc kubenswrapper[5030]: I0120 23:32:34.097505 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:32:34 crc kubenswrapper[5030]: I0120 23:32:34.213157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerStarted","Data":"db936a98cba652d89112b07f95414cb18b1e11db1909089db24be980234e5573"} Jan 20 23:32:35 crc kubenswrapper[5030]: I0120 23:32:35.232209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerStarted","Data":"839cc89adae584c41efedb5d0da7286f639a42f63dac0d84d40646b3fef94436"} Jan 20 23:32:35 crc kubenswrapper[5030]: E0120 23:32:35.958394 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:35 crc kubenswrapper[5030]: E0120 23:32:35.960666 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:35 crc kubenswrapper[5030]: E0120 23:32:35.971498 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:35 crc kubenswrapper[5030]: E0120 23:32:35.971578 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.247079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerStarted","Data":"89a0d19a8c65efcfc7faf4ed10346e6ccbc4f9606df7b694da7901505cafdb19"} Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.398661 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.892532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.893116 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="be1b03a7-ded3-477f-b573-e021cd6141c6" containerName="nova-scheduler-scheduler" containerID="cri-o://670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407" gracePeriod=30 Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.908275 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.908564 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-log" containerID="cri-o://6e9ead31f5362c40cf442b019ebe6d5c57c2a7f054c096d807665f657db342a5" gracePeriod=30 Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.908609 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-api" containerID="cri-o://2de8fa2a037b67168804e69f75199eca84ad18411f17e7b89cf62579ebc1392c" gracePeriod=30 Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.926899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.927163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-log" containerID="cri-o://785f3390655955d663a5f7c707be245ffc609cbc1e023d3090c7bf9faf70f823" gracePeriod=30 Jan 20 23:32:36 crc kubenswrapper[5030]: I0120 23:32:36.927284 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-metadata" containerID="cri-o://aa5fa8d6d3684ec051cffc9d6f9e2a0270374a2dfed62649f32932efff1cec2b" gracePeriod=30 Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.262522 5030 generic.go:334] "Generic (PLEG): container finished" podID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerID="aa5fa8d6d3684ec051cffc9d6f9e2a0270374a2dfed62649f32932efff1cec2b" exitCode=0 Jan 20 23:32:37 crc kubenswrapper[5030]: 
I0120 23:32:37.262786 5030 generic.go:334] "Generic (PLEG): container finished" podID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerID="785f3390655955d663a5f7c707be245ffc609cbc1e023d3090c7bf9faf70f823" exitCode=143 Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.262588 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerDied","Data":"aa5fa8d6d3684ec051cffc9d6f9e2a0270374a2dfed62649f32932efff1cec2b"} Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.262898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerDied","Data":"785f3390655955d663a5f7c707be245ffc609cbc1e023d3090c7bf9faf70f823"} Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.266358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerStarted","Data":"ff79c2278d13b526a6c8be59f9d6e9324d01879e24402a230ce9cf77aa025730"} Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.267989 5030 generic.go:334] "Generic (PLEG): container finished" podID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerID="2de8fa2a037b67168804e69f75199eca84ad18411f17e7b89cf62579ebc1392c" exitCode=0 Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.268029 5030 generic.go:334] "Generic (PLEG): container finished" podID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerID="6e9ead31f5362c40cf442b019ebe6d5c57c2a7f054c096d807665f657db342a5" exitCode=143 Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.268055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerDied","Data":"2de8fa2a037b67168804e69f75199eca84ad18411f17e7b89cf62579ebc1392c"} Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.268087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerDied","Data":"6e9ead31f5362c40cf442b019ebe6d5c57c2a7f054c096d807665f657db342a5"} Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.559423 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.566749 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs\") pod \"3da609ec-4cd1-486f-9cf7-97c8835f558a\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wnh\" (UniqueName: \"kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh\") pod \"3da609ec-4cd1-486f-9cf7-97c8835f558a\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.710978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8zm6\" (UniqueName: \"kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs\") pod \"3da609ec-4cd1-486f-9cf7-97c8835f558a\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data\") pod \"3da609ec-4cd1-486f-9cf7-97c8835f558a\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle\") pod 
\"3da609ec-4cd1-486f-9cf7-97c8835f558a\" (UID: \"3da609ec-4cd1-486f-9cf7-97c8835f558a\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs" (OuterVolumeSpecName: "logs") pod "3da609ec-4cd1-486f-9cf7-97c8835f558a" (UID: "3da609ec-4cd1-486f-9cf7-97c8835f558a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.711823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs" (OuterVolumeSpecName: "logs") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.712416 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3da609ec-4cd1-486f-9cf7-97c8835f558a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.712446 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236920b9-1246-4f61-bb2e-4bc11bbcdc08-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.715055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh" (OuterVolumeSpecName: "kube-api-access-z4wnh") pod "3da609ec-4cd1-486f-9cf7-97c8835f558a" (UID: "3da609ec-4cd1-486f-9cf7-97c8835f558a"). InnerVolumeSpecName "kube-api-access-z4wnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.725794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6" (OuterVolumeSpecName: "kube-api-access-g8zm6") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "kube-api-access-g8zm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.737628 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data" (OuterVolumeSpecName: "config-data") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.740728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data" (OuterVolumeSpecName: "config-data") pod "3da609ec-4cd1-486f-9cf7-97c8835f558a" (UID: "3da609ec-4cd1-486f-9cf7-97c8835f558a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.746385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.746957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3da609ec-4cd1-486f-9cf7-97c8835f558a" (UID: "3da609ec-4cd1-486f-9cf7-97c8835f558a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: E0120 23:32:37.764075 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs podName:236920b9-1246-4f61-bb2e-4bc11bbcdc08 nodeName:}" failed. No retries permitted until 2026-01-20 23:32:38.26404772 +0000 UTC m=+3430.584308008 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08") : error deleting /var/lib/kubelet/pods/236920b9-1246-4f61-bb2e-4bc11bbcdc08/volume-subpaths: remove /var/lib/kubelet/pods/236920b9-1246-4f61-bb2e-4bc11bbcdc08/volume-subpaths: no such file or directory Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.768583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.771203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3da609ec-4cd1-486f-9cf7-97c8835f558a" (UID: "3da609ec-4cd1-486f-9cf7-97c8835f558a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814214 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814249 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814259 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3da609ec-4cd1-486f-9cf7-97c8835f558a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814268 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814278 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814290 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4wnh\" (UniqueName: \"kubernetes.io/projected/3da609ec-4cd1-486f-9cf7-97c8835f558a-kube-api-access-z4wnh\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814299 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:37 crc kubenswrapper[5030]: I0120 23:32:37.814308 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8zm6\" (UniqueName: \"kubernetes.io/projected/236920b9-1246-4f61-bb2e-4bc11bbcdc08-kube-api-access-g8zm6\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.280844 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.281141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"3da609ec-4cd1-486f-9cf7-97c8835f558a","Type":"ContainerDied","Data":"3a2a62c71ec55e37df5e75c8d20fa43e4babe542baa75bee0eb164905ac46fb7"} Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.281200 5030 scope.go:117] "RemoveContainer" containerID="aa5fa8d6d3684ec051cffc9d6f9e2a0270374a2dfed62649f32932efff1cec2b" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.288817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerStarted","Data":"ddcfcc6a8d5fcd86438c7dd6b86c04c164ea7cb885b084b0dbc89d752fbc650e"} Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.288888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.301151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"236920b9-1246-4f61-bb2e-4bc11bbcdc08","Type":"ContainerDied","Data":"1b7b5f74499c98105149ce03214e0bf559a3366406175e45d3602c7695472d30"} Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.301333 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.314719 5030 scope.go:117] "RemoveContainer" containerID="785f3390655955d663a5f7c707be245ffc609cbc1e023d3090c7bf9faf70f823" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.325092 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") pod \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\" (UID: \"236920b9-1246-4f61-bb2e-4bc11bbcdc08\") " Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.346464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "236920b9-1246-4f61-bb2e-4bc11bbcdc08" (UID: "236920b9-1246-4f61-bb2e-4bc11bbcdc08"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.354598 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.943112723 podStartE2EDuration="5.354572215s" podCreationTimestamp="2026-01-20 23:32:33 +0000 UTC" firstStartedPulling="2026-01-20 23:32:34.102617497 +0000 UTC m=+3426.422877785" lastFinishedPulling="2026-01-20 23:32:37.514076989 +0000 UTC m=+3429.834337277" observedRunningTime="2026-01-20 23:32:38.323705608 +0000 UTC m=+3430.643965936" watchObservedRunningTime="2026-01-20 23:32:38.354572215 +0000 UTC m=+3430.674832513" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.365163 5030 scope.go:117] "RemoveContainer" containerID="2de8fa2a037b67168804e69f75199eca84ad18411f17e7b89cf62579ebc1392c" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.382917 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.395848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.397352 5030 scope.go:117] "RemoveContainer" containerID="6e9ead31f5362c40cf442b019ebe6d5c57c2a7f054c096d807665f657db342a5" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.406723 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: E0120 23:32:38.407118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-log" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407135 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-log" Jan 20 23:32:38 crc kubenswrapper[5030]: E0120 23:32:38.407163 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-metadata" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407170 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-metadata" Jan 20 23:32:38 crc kubenswrapper[5030]: E0120 23:32:38.407182 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-api" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407188 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-api" Jan 20 23:32:38 crc kubenswrapper[5030]: E0120 23:32:38.407202 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-log" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407208 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-log" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407363 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-api" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407371 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-log" Jan 20 23:32:38 crc 
kubenswrapper[5030]: I0120 23:32:38.407390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" containerName="nova-metadata-metadata" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.407401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" containerName="nova-api-log" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.410862 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.412840 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.413813 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.414286 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.430292 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/236920b9-1246-4f61-bb2e-4bc11bbcdc08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.531528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbz8j\" (UniqueName: \"kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.531583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.531601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.531959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.532106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.630172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 
23:32:38.633489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.633548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.633612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbz8j\" (UniqueName: \"kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.633648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.633665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.633877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.636736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.638406 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.639069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.639293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.653202 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wbz8j\" (UniqueName: \"kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j\") pod \"nova-metadata-0\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.660204 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.661843 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.663326 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.664682 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.664905 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.672949 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.735772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpznm\" (UniqueName: \"kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.838504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.940872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.941097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpznm\" (UniqueName: \"kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.941162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.941212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.941304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.941354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.942007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.945258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.945557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.947705 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.947859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:38 crc kubenswrapper[5030]: I0120 23:32:38.968384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpznm\" (UniqueName: \"kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm\") pod \"nova-api-0\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.049762 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.194824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.317606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerStarted","Data":"26773ef79223fda2e356ee246916ec7de8b2bf3e31aa9e8d7aac4fc8e6a4df87"} Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.541797 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.569615 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.760688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5dl\" (UniqueName: \"kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl\") pod \"be1b03a7-ded3-477f-b573-e021cd6141c6\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.760791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data\") pod \"be1b03a7-ded3-477f-b573-e021cd6141c6\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.760841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle\") pod \"be1b03a7-ded3-477f-b573-e021cd6141c6\" (UID: \"be1b03a7-ded3-477f-b573-e021cd6141c6\") " Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.764923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl" (OuterVolumeSpecName: "kube-api-access-6x5dl") pod "be1b03a7-ded3-477f-b573-e021cd6141c6" (UID: "be1b03a7-ded3-477f-b573-e021cd6141c6"). InnerVolumeSpecName "kube-api-access-6x5dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.789732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be1b03a7-ded3-477f-b573-e021cd6141c6" (UID: "be1b03a7-ded3-477f-b573-e021cd6141c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.792029 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data" (OuterVolumeSpecName: "config-data") pod "be1b03a7-ded3-477f-b573-e021cd6141c6" (UID: "be1b03a7-ded3-477f-b573-e021cd6141c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.863232 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x5dl\" (UniqueName: \"kubernetes.io/projected/be1b03a7-ded3-477f-b573-e021cd6141c6-kube-api-access-6x5dl\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.863263 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.863275 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be1b03a7-ded3-477f-b573-e021cd6141c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.984377 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236920b9-1246-4f61-bb2e-4bc11bbcdc08" path="/var/lib/kubelet/pods/236920b9-1246-4f61-bb2e-4bc11bbcdc08/volumes" Jan 20 23:32:39 crc kubenswrapper[5030]: I0120 23:32:39.985847 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da609ec-4cd1-486f-9cf7-97c8835f558a" path="/var/lib/kubelet/pods/3da609ec-4cd1-486f-9cf7-97c8835f558a/volumes" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.157507 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.157576 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.335740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerStarted","Data":"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.335781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerStarted","Data":"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.339358 5030 generic.go:334] "Generic (PLEG): container finished" podID="be1b03a7-ded3-477f-b573-e021cd6141c6" containerID="670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407" exitCode=0 Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.339550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"be1b03a7-ded3-477f-b573-e021cd6141c6","Type":"ContainerDied","Data":"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.339651 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"be1b03a7-ded3-477f-b573-e021cd6141c6","Type":"ContainerDied","Data":"1b35bdca6567f37338a555342e246159e645960e78dabba2ea63a83b98265898"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.339755 5030 scope.go:117] "RemoveContainer" containerID="670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.339941 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.345350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerStarted","Data":"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.345608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerStarted","Data":"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.345721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerStarted","Data":"07f90b9775e15b482141d3af8398c817b5b9f11f55357742d8c62bc5f6b3d23e"} Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.379222 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.379199955 podStartE2EDuration="2.379199955s" podCreationTimestamp="2026-01-20 23:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:40.361662291 +0000 UTC m=+3432.681922609" watchObservedRunningTime="2026-01-20 23:32:40.379199955 +0000 UTC m=+3432.699460263" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.387930 5030 scope.go:117] "RemoveContainer" containerID="670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407" Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.388525 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407\": container with ID starting with 670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407 not found: ID does not exist" containerID="670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407" Jan 20 
23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.388565 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407"} err="failed to get container status \"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407\": rpc error: code = NotFound desc = could not find container \"670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407\": container with ID starting with 670fa6f9dccf2f9d72088d97bae231dbd4a6bbfdace49d8e49a23adf96196407 not found: ID does not exist" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.403734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.417371 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.427745 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.428464 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1b03a7-ded3-477f-b573-e021cd6141c6" containerName="nova-scheduler-scheduler" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.428558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1b03a7-ded3-477f-b573-e021cd6141c6" containerName="nova-scheduler-scheduler" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.428844 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1b03a7-ded3-477f-b573-e021cd6141c6" containerName="nova-scheduler-scheduler" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.429828 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.434382 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.433357 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.433337876 podStartE2EDuration="2.433337876s" podCreationTimestamp="2026-01-20 23:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:40.399086047 +0000 UTC m=+3432.719346345" watchObservedRunningTime="2026-01-20 23:32:40.433337876 +0000 UTC m=+3432.753598174" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.474064 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.578472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8pm\" (UniqueName: \"kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.578525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.578579 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.680692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8pm\" (UniqueName: \"kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.680763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.680792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.688105 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.689086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.699441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8pm\" (UniqueName: \"kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm\") pod \"nova-scheduler-0\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: I0120 23:32:40.766758 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.961789 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.964952 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.967173 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:40 crc kubenswrapper[5030]: E0120 23:32:40.967205 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:41 crc kubenswrapper[5030]: I0120 23:32:41.255249 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:32:41 crc kubenswrapper[5030]: I0120 23:32:41.366262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55","Type":"ContainerStarted","Data":"75cefa19df80f7c69578b47dcc8c1d1893e327cf9990e5d6d398a3297ab5a566"} Jan 20 23:32:41 crc kubenswrapper[5030]: E0120 23:32:41.558208 5030 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/0784772005f4d5de173d0f6c1f2f81b4fabe93b93c7c2f3cfbcb6e8659659179/diff" to get inode usage: stat 
/var/lib/containers/storage/overlay/0784772005f4d5de173d0f6c1f2f81b4fabe93b93c7c2f3cfbcb6e8659659179/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack-kuttl-tests_neutron-78c57799cd-6f6m4_f5910c0d-bfc0-4b55-8b1b-ded9285b8b80/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack-kuttl-tests_neutron-78c57799cd-6f6m4_f5910c0d-bfc0-4b55-8b1b-ded9285b8b80/neutron-api/0.log: no such file or directory Jan 20 23:32:41 crc kubenswrapper[5030]: I0120 23:32:41.978400 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1b03a7-ded3-477f-b573-e021cd6141c6" path="/var/lib/kubelet/pods/be1b03a7-ded3-477f-b573-e021cd6141c6/volumes" Jan 20 23:32:42 crc kubenswrapper[5030]: I0120 23:32:42.381309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55","Type":"ContainerStarted","Data":"d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9"} Jan 20 23:32:42 crc kubenswrapper[5030]: I0120 23:32:42.405850 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.405832895 podStartE2EDuration="2.405832895s" podCreationTimestamp="2026-01-20 23:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:42.400936616 +0000 UTC m=+3434.721196904" watchObservedRunningTime="2026-01-20 23:32:42.405832895 +0000 UTC m=+3434.726093193" Jan 20 23:32:43 crc kubenswrapper[5030]: I0120 23:32:43.737274 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:43 crc kubenswrapper[5030]: I0120 23:32:43.737425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:45 crc kubenswrapper[5030]: I0120 23:32:45.766982 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:45 crc kubenswrapper[5030]: E0120 23:32:45.958726 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:45 crc kubenswrapper[5030]: E0120 23:32:45.961131 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:45 crc kubenswrapper[5030]: E0120 23:32:45.963358 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:32:45 crc kubenswrapper[5030]: E0120 23:32:45.963459 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:47 crc kubenswrapper[5030]: W0120 23:32:47.620294 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8922a316_0627_47b3_81ad_b3f7bff4da38.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8922a316_0627_47b3_81ad_b3f7bff4da38.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.032677 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.164268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle\") pod \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.164729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data\") pod \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.164807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7c6p\" (UniqueName: \"kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p\") pod \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\" (UID: \"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36\") " Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.174183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p" (OuterVolumeSpecName: "kube-api-access-g7c6p") pod "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" (UID: "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36"). InnerVolumeSpecName "kube-api-access-g7c6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.216090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data" (OuterVolumeSpecName: "config-data") pod "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" (UID: "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.219063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" (UID: "4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.267678 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.267764 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.267788 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7c6p\" (UniqueName: \"kubernetes.io/projected/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36-kube-api-access-g7c6p\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.480287 5030 generic.go:334] "Generic (PLEG): container finished" podID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" containerID="b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735" exitCode=137 Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.480355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36","Type":"ContainerDied","Data":"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735"} Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.480396 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.480419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36","Type":"ContainerDied","Data":"3bc3fb5407dd3c6c5032c7bba1d7112685f8f092f245656031aa75ee63e71784"} Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.480443 5030 scope.go:117] "RemoveContainer" containerID="b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.534466 5030 scope.go:117] "RemoveContainer" containerID="b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735" Jan 20 23:32:48 crc kubenswrapper[5030]: E0120 23:32:48.535173 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735\": container with ID starting with b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735 not found: ID does not exist" containerID="b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.535225 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735"} err="failed to get container status \"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735\": rpc error: code = NotFound desc = could not find container \"b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735\": container with ID starting with b22012f1b40823409d6ada919f03d5a1334f1623994ec457ac7f6893cf60c735 not found: ID does not exist" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.543338 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 
23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.561432 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.575693 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:48 crc kubenswrapper[5030]: E0120 23:32:48.576261 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.576285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.576551 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.577434 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.583040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.583580 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.583930 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.595781 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.676184 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode05505c3_5757_4e56_bd28_6ea231262b0d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode05505c3_5757_4e56_bd28_6ea231262b0d.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.676360 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc6fbe32_ba1a_4466_84d8_ce5e7feb010b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc6fbe32_ba1a_4466_84d8_ce5e7feb010b.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.676702 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129d8b83_1cbd_4f36_b30f_bfa23bb0581d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod129d8b83_1cbd_4f36_b30f_bfa23bb0581d.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.676842 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod829becc4_31bd_41a4_b235_ad48f57dbee1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod829becc4_31bd_41a4_b235_ad48f57dbee1.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.676874 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d85642b_b230_46f2_8eed_f66a85e61c1b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d85642b_b230_46f2_8eed_f66a85e61c1b.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.678238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznx8\" (UniqueName: \"kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.678291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.678350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.678377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.678453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.683714 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb4b558_b9f0_476e_bbbe_2f5307bd9e4f.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb4b558_b9f0_476e_bbbe_2f5307bd9e4f.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.683802 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode290163a_2fc9_492f_b113_1e1a42ff37f9.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode290163a_2fc9_492f_b113_1e1a42ff37f9.slice: no such file or directory 
Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.703922 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe1b03a7_ded3_477f_b573_e021cd6141c6.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe1b03a7_ded3_477f_b573_e021cd6141c6.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.703963 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da609ec_4cd1_486f_9cf7_97c8835f558a.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da609ec_4cd1_486f_9cf7_97c8835f558a.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: W0120 23:32:48.709494 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236920b9_1246_4f61_bb2e_4bc11bbcdc08.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod236920b9_1246_4f61_bb2e_4bc11bbcdc08.slice: no such file or directory Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.736754 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.736795 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.781513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.781566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.781714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.781750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mznx8\" (UniqueName: \"kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.781785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.785451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.787758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.796408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.816296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.818835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mznx8\" (UniqueName: \"kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.903254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:48 crc kubenswrapper[5030]: I0120 23:32:48.965496 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.050563 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.050748 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.087342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle\") pod \"62366776-47f0-4d58-87fc-e76f904677bc\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.087399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg8s9\" (UniqueName: \"kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9\") pod \"62366776-47f0-4d58-87fc-e76f904677bc\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.087561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data\") pod \"62366776-47f0-4d58-87fc-e76f904677bc\" (UID: \"62366776-47f0-4d58-87fc-e76f904677bc\") " Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.093505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9" (OuterVolumeSpecName: "kube-api-access-gg8s9") pod "62366776-47f0-4d58-87fc-e76f904677bc" (UID: "62366776-47f0-4d58-87fc-e76f904677bc"). InnerVolumeSpecName "kube-api-access-gg8s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.118288 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62366776-47f0-4d58-87fc-e76f904677bc" (UID: "62366776-47f0-4d58-87fc-e76f904677bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.120075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data" (OuterVolumeSpecName: "config-data") pod "62366776-47f0-4d58-87fc-e76f904677bc" (UID: "62366776-47f0-4d58-87fc-e76f904677bc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.189731 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.189768 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62366776-47f0-4d58-87fc-e76f904677bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.189786 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg8s9\" (UniqueName: \"kubernetes.io/projected/62366776-47f0-4d58-87fc-e76f904677bc-kube-api-access-gg8s9\") on node \"crc\" DevicePath \"\"" Jan 20 23:32:49 crc kubenswrapper[5030]: W0120 23:32:49.339198 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8865fb5_d3c6_4178_9f1f_43e29f24e345.slice/crio-611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2 WatchSource:0}: Error finding container 611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2: Status 404 returned error can't find the container with id 611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2 Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.342847 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.495120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a8865fb5-d3c6-4178-9f1f-43e29f24e345","Type":"ContainerStarted","Data":"611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2"} Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.497519 5030 generic.go:334] "Generic (PLEG): container finished" podID="62366776-47f0-4d58-87fc-e76f904677bc" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" exitCode=137 Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.497595 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.497603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"62366776-47f0-4d58-87fc-e76f904677bc","Type":"ContainerDied","Data":"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4"} Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.497672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"62366776-47f0-4d58-87fc-e76f904677bc","Type":"ContainerDied","Data":"695839cee1703cd16648cb2100e7c8e05f0f4ba151c74b46ff61ff564b788ab2"} Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.497697 5030 scope.go:117] "RemoveContainer" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.527400 5030 scope.go:117] "RemoveContainer" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" Jan 20 23:32:49 crc kubenswrapper[5030]: E0120 23:32:49.527866 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4\": container with ID starting with a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4 not found: ID does not exist" containerID="a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.527908 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4"} err="failed to get container status \"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4\": rpc error: code = NotFound desc = could not find container \"a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4\": container with ID starting with a4b4301bffa946bf59bf93abbf1be57955fb7e2ec8763ca7b13e1efd1aac1ee4 not found: ID does not exist" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.541243 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.549320 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.577008 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:49 crc kubenswrapper[5030]: E0120 23:32:49.577608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.577706 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.579463 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="62366776-47f0-4d58-87fc-e76f904677bc" containerName="nova-cell1-conductor-conductor" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.580198 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.580373 5030 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.586920 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.698239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7kb7\" (UniqueName: \"kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.698303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.698371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.757302 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.250:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.757678 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.250:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.800759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7kb7\" (UniqueName: \"kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.801015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.801069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.808393 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.808671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.823576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7kb7\" (UniqueName: \"kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7\") pod \"nova-cell1-conductor-0\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.898303 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.980689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36" path="/var/lib/kubelet/pods/4fa9b5f3-dde8-4d90-89fe-ca635a5dcd36/volumes" Jan 20 23:32:49 crc kubenswrapper[5030]: I0120 23:32:49.981443 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62366776-47f0-4d58-87fc-e76f904677bc" path="/var/lib/kubelet/pods/62366776-47f0-4d58-87fc-e76f904677bc/volumes" Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.066739 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.251:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.067023 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.251:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.401882 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:32:50 crc kubenswrapper[5030]: W0120 23:32:50.407918 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode905a9f8_849b_45bb_be21_2293c96cd0e1.slice/crio-22eb344454fb2717588cb3420815db293ac7fc199f8a1cebf60d9bebf3439f8c WatchSource:0}: Error finding container 22eb344454fb2717588cb3420815db293ac7fc199f8a1cebf60d9bebf3439f8c: Status 404 returned error can't find the container with id 22eb344454fb2717588cb3420815db293ac7fc199f8a1cebf60d9bebf3439f8c Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.517874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a8865fb5-d3c6-4178-9f1f-43e29f24e345","Type":"ContainerStarted","Data":"5ae7bb9ba42eb770778d66a021e4203f2b42a0265e7c087325892501138ebe93"} Jan 20 23:32:50 crc 
kubenswrapper[5030]: I0120 23:32:50.519892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e905a9f8-849b-45bb-be21-2293c96cd0e1","Type":"ContainerStarted","Data":"22eb344454fb2717588cb3420815db293ac7fc199f8a1cebf60d9bebf3439f8c"} Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.537788 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.537769437 podStartE2EDuration="2.537769437s" podCreationTimestamp="2026-01-20 23:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:50.530750117 +0000 UTC m=+3442.851010405" watchObservedRunningTime="2026-01-20 23:32:50.537769437 +0000 UTC m=+3442.858029715" Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.767392 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:50 crc kubenswrapper[5030]: I0120 23:32:50.809923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:51 crc kubenswrapper[5030]: I0120 23:32:51.534421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e905a9f8-849b-45bb-be21-2293c96cd0e1","Type":"ContainerStarted","Data":"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9"} Jan 20 23:32:51 crc kubenswrapper[5030]: I0120 23:32:51.535119 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:32:51 crc kubenswrapper[5030]: I0120 23:32:51.554905 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.554883308 podStartE2EDuration="2.554883308s" podCreationTimestamp="2026-01-20 23:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:32:51.552119121 +0000 UTC m=+3443.872379439" watchObservedRunningTime="2026-01-20 23:32:51.554883308 +0000 UTC m=+3443.875143616" Jan 20 23:32:51 crc kubenswrapper[5030]: I0120 23:32:51.570959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:32:53 crc kubenswrapper[5030]: I0120 23:32:53.904342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:58 crc kubenswrapper[5030]: I0120 23:32:58.744719 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:58 crc kubenswrapper[5030]: I0120 23:32:58.747314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:58 crc kubenswrapper[5030]: I0120 23:32:58.753182 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:58 crc kubenswrapper[5030]: I0120 23:32:58.903413 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:58 crc kubenswrapper[5030]: I0120 23:32:58.924430 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.058103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.058446 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.059117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.065862 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.071532 5030 scope.go:117] "RemoveContainer" containerID="8d8857a54e3051d21b85864143f114d8c2faafa007277a85b349682cb0dd2946" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.100546 5030 scope.go:117] "RemoveContainer" containerID="db7c736000c2c144ffc7cd96fd2cc41eeb6b34212f1f572dd0f34ffe24b8f195" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.148178 5030 scope.go:117] "RemoveContainer" containerID="b212f24fbc91bb05579efc68cd38cf0e1aaabcd0a37df17c85f251b376bf39f4" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.183614 5030 scope.go:117] "RemoveContainer" containerID="79d280e4be818945ffca4e232f42bd16e3dc992a372a897fa7e7491e33afe638" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.632670 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.638145 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.640538 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.649444 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:32:59 crc kubenswrapper[5030]: I0120 23:32:59.925705 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.474790 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp"] Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.477516 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.483177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.483285 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.487841 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp"] Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.634843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.634884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqwpk\" (UniqueName: \"kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.634944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.635015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.736123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.736250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.736280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwpk\" (UniqueName: \"kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 
23:33:00.736355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.745007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.758786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.763486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.764374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwpk\" (UniqueName: \"kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk\") pod \"nova-cell1-cell-mapping-zlfhp\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:00 crc kubenswrapper[5030]: I0120 23:33:00.796109 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.264903 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp"] Jan 20 23:33:01 crc kubenswrapper[5030]: W0120 23:33:01.268306 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9db94d9e_2b96_4043_bbc7_cf41ae5ce9b6.slice/crio-e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955 WatchSource:0}: Error finding container e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955: Status 404 returned error can't find the container with id e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955 Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.649638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" event={"ID":"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6","Type":"ContainerStarted","Data":"e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955"} Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.697266 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.697691 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-central-agent" containerID="cri-o://839cc89adae584c41efedb5d0da7286f639a42f63dac0d84d40646b3fef94436" gracePeriod=30 Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.698882 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" containerID="cri-o://ddcfcc6a8d5fcd86438c7dd6b86c04c164ea7cb885b084b0dbc89d752fbc650e" gracePeriod=30 Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.699062 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-notification-agent" containerID="cri-o://89a0d19a8c65efcfc7faf4ed10346e6ccbc4f9606df7b694da7901505cafdb19" gracePeriod=30 Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.699108 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="sg-core" containerID="cri-o://ff79c2278d13b526a6c8be59f9d6e9324d01879e24402a230ce9cf77aa025730" gracePeriod=30 Jan 20 23:33:01 crc kubenswrapper[5030]: I0120 23:33:01.706321 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.249:3000/\": EOF" Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662493 5030 generic.go:334] "Generic (PLEG): container finished" podID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerID="ddcfcc6a8d5fcd86438c7dd6b86c04c164ea7cb885b084b0dbc89d752fbc650e" exitCode=0 Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662525 5030 generic.go:334] "Generic (PLEG): container finished" podID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerID="ff79c2278d13b526a6c8be59f9d6e9324d01879e24402a230ce9cf77aa025730" exitCode=2 
Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662533 5030 generic.go:334] "Generic (PLEG): container finished" podID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerID="839cc89adae584c41efedb5d0da7286f639a42f63dac0d84d40646b3fef94436" exitCode=0 Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerDied","Data":"ddcfcc6a8d5fcd86438c7dd6b86c04c164ea7cb885b084b0dbc89d752fbc650e"} Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerDied","Data":"ff79c2278d13b526a6c8be59f9d6e9324d01879e24402a230ce9cf77aa025730"} Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.662711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerDied","Data":"839cc89adae584c41efedb5d0da7286f639a42f63dac0d84d40646b3fef94436"} Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.665207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" event={"ID":"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6","Type":"ContainerStarted","Data":"90cc47d6fbfec80696ca78278e6616158c0acbacf6dd0460eedd37b9b8007991"} Jan 20 23:33:02 crc kubenswrapper[5030]: I0120 23:33:02.678549 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" podStartSLOduration=2.678528312 podStartE2EDuration="2.678528312s" podCreationTimestamp="2026-01-20 23:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:02.677865425 +0000 UTC m=+3454.998125723" watchObservedRunningTime="2026-01-20 23:33:02.678528312 +0000 UTC m=+3454.998788600" Jan 20 23:33:03 crc kubenswrapper[5030]: I0120 23:33:03.582260 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.249:3000/\": dial tcp 10.217.1.249:3000: connect: connection refused" Jan 20 23:33:05 crc kubenswrapper[5030]: I0120 23:33:05.704589 5030 generic.go:334] "Generic (PLEG): container finished" podID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerID="89a0d19a8c65efcfc7faf4ed10346e6ccbc4f9606df7b694da7901505cafdb19" exitCode=0 Jan 20 23:33:05 crc kubenswrapper[5030]: I0120 23:33:05.704695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerDied","Data":"89a0d19a8c65efcfc7faf4ed10346e6ccbc4f9606df7b694da7901505cafdb19"} Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.020874 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg5r5\" (UniqueName: \"kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.160533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data\") pod \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\" (UID: \"53dd7d1d-6d16-400b-90b7-ac3c9a96127d\") " Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.161466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.161896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.168222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5" (OuterVolumeSpecName: "kube-api-access-rg5r5") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "kube-api-access-rg5r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.168977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts" (OuterVolumeSpecName: "scripts") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.196726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.254858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262854 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg5r5\" (UniqueName: \"kubernetes.io/projected/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-kube-api-access-rg5r5\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262888 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262897 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262909 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262918 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.262926 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.284390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.296544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data" (OuterVolumeSpecName: "config-data") pod "53dd7d1d-6d16-400b-90b7-ac3c9a96127d" (UID: "53dd7d1d-6d16-400b-90b7-ac3c9a96127d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.364709 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.364760 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53dd7d1d-6d16-400b-90b7-ac3c9a96127d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.718323 5030 generic.go:334] "Generic (PLEG): container finished" podID="9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" containerID="90cc47d6fbfec80696ca78278e6616158c0acbacf6dd0460eedd37b9b8007991" exitCode=0 Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.718524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" event={"ID":"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6","Type":"ContainerDied","Data":"90cc47d6fbfec80696ca78278e6616158c0acbacf6dd0460eedd37b9b8007991"} Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.724914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"53dd7d1d-6d16-400b-90b7-ac3c9a96127d","Type":"ContainerDied","Data":"db936a98cba652d89112b07f95414cb18b1e11db1909089db24be980234e5573"} Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.724980 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.725115 5030 scope.go:117] "RemoveContainer" containerID="ddcfcc6a8d5fcd86438c7dd6b86c04c164ea7cb885b084b0dbc89d752fbc650e" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.786964 5030 scope.go:117] "RemoveContainer" containerID="ff79c2278d13b526a6c8be59f9d6e9324d01879e24402a230ce9cf77aa025730" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.796123 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.805754 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.821117 5030 scope.go:117] "RemoveContainer" containerID="89a0d19a8c65efcfc7faf4ed10346e6ccbc4f9606df7b694da7901505cafdb19" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.829117 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:06 crc kubenswrapper[5030]: E0120 23:33:06.829727 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-central-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.829760 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-central-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: E0120 23:33:06.829778 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-notification-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.829787 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-notification-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: 
E0120 23:33:06.829819 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.829827 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" Jan 20 23:33:06 crc kubenswrapper[5030]: E0120 23:33:06.829847 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="sg-core" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.829855 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="sg-core" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.830108 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="sg-core" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.830131 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-notification-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.830144 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="proxy-httpd" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.830166 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" containerName="ceilometer-central-agent" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.832253 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.834960 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.835090 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.835176 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.847653 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.860686 5030 scope.go:117] "RemoveContainer" containerID="839cc89adae584c41efedb5d0da7286f639a42f63dac0d84d40646b3fef94436" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975784 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bs57\" (UniqueName: \"kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975837 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:06 crc kubenswrapper[5030]: I0120 23:33:06.975862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.077938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data\") pod \"ceilometer-0\" (UID: 
\"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bs57\" (UniqueName: \"kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.078268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.079183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.081156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.083651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.084175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.085066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.107169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.107987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.123926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bs57\" (UniqueName: \"kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57\") pod \"ceilometer-0\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.161442 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.482582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.763796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerStarted","Data":"9d012a1b3e031d39ce2f719f0c80af63eaa499061275413e5c9cc4abcc05c8fc"} Jan 20 23:33:07 crc kubenswrapper[5030]: I0120 23:33:07.992642 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53dd7d1d-6d16-400b-90b7-ac3c9a96127d" path="/var/lib/kubelet/pods/53dd7d1d-6d16-400b-90b7-ac3c9a96127d/volumes" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.148969 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.310468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts\") pod \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.310611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqwpk\" (UniqueName: \"kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk\") pod \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.310663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data\") pod \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.310814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle\") pod \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\" (UID: \"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6\") " Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.315609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts" (OuterVolumeSpecName: "scripts") pod "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" (UID: "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.316240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk" (OuterVolumeSpecName: "kube-api-access-kqwpk") pod "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" (UID: "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6"). InnerVolumeSpecName "kube-api-access-kqwpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.344833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data" (OuterVolumeSpecName: "config-data") pod "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" (UID: "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.345783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" (UID: "9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.413126 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.413186 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqwpk\" (UniqueName: \"kubernetes.io/projected/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-kube-api-access-kqwpk\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.413209 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.413227 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.787577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerStarted","Data":"5f9274ccadd108b7edd55790f7101b6f78736bb7058b62758a8e76d34b967ae3"} Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.793545 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" event={"ID":"9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6","Type":"ContainerDied","Data":"e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955"} Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.793612 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.793614 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e132b3a9a664871250a87ea10c0612172955e15e48c3b421e1f1c2913ceea955" Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.936670 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.936945 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" containerID="cri-o://cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5" gracePeriod=30 Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.936999 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" containerID="cri-o://222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351" gracePeriod=30 Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.982372 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:08 crc kubenswrapper[5030]: I0120 23:33:08.982574 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerName="nova-scheduler-scheduler" containerID="cri-o://d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" gracePeriod=30 Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.005748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.005967 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-log" containerID="cri-o://47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062" gracePeriod=30 Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.006090 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-metadata" containerID="cri-o://87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839" gracePeriod=30 Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.803170 5030 generic.go:334] "Generic (PLEG): container finished" podID="49ede297-9008-4ab6-87b2-a848861f456c" containerID="47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062" exitCode=143 Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.803300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerDied","Data":"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062"} Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.806187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerStarted","Data":"1364f3685e72aa6c1fbe3584228fe8fd80440e7f23b6bca2e91fbdaa8ae36662"} Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.808258 5030 
generic.go:334] "Generic (PLEG): container finished" podID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerID="cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5" exitCode=143 Jan 20 23:33:09 crc kubenswrapper[5030]: I0120 23:33:09.808285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerDied","Data":"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5"} Jan 20 23:33:10 crc kubenswrapper[5030]: I0120 23:33:10.157851 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:33:10 crc kubenswrapper[5030]: I0120 23:33:10.157924 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:33:10 crc kubenswrapper[5030]: E0120 23:33:10.770303 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:10 crc kubenswrapper[5030]: E0120 23:33:10.773093 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:10 crc kubenswrapper[5030]: E0120 23:33:10.776106 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:10 crc kubenswrapper[5030]: E0120 23:33:10.776169 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerName="nova-scheduler-scheduler" Jan 20 23:33:10 crc kubenswrapper[5030]: I0120 23:33:10.823789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerStarted","Data":"23f3dcb0553f3089e12b860e39bdceefecb58a03add17ca1ec21c1cfd51434a4"} Jan 20 23:33:11 crc kubenswrapper[5030]: I0120 23:33:11.837413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerStarted","Data":"a817fccb1d8108fa564f78fe7c39689a51604fe6b3c5f2955526baac928c24e3"} Jan 20 23:33:11 crc kubenswrapper[5030]: I0120 23:33:11.837823 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:11 crc kubenswrapper[5030]: I0120 23:33:11.879240 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.422431175 podStartE2EDuration="5.879216265s" podCreationTimestamp="2026-01-20 23:33:06 +0000 UTC" firstStartedPulling="2026-01-20 23:33:07.498408597 +0000 UTC m=+3459.818668885" lastFinishedPulling="2026-01-20 23:33:10.955193637 +0000 UTC m=+3463.275453975" observedRunningTime="2026-01-20 23:33:11.869384417 +0000 UTC m=+3464.189644725" watchObservedRunningTime="2026-01-20 23:33:11.879216265 +0000 UTC m=+3464.199476563" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.104080 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.251:8774/\": read tcp 10.217.0.2:44410->10.217.1.251:8774: read: connection reset by peer" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.104662 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.251:8774/\": read tcp 10.217.0.2:44412->10.217.1.251:8774: read: connection reset by peer" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.727239 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.739214 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpznm\" (UniqueName: \"kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: 
\"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.787973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs\") pod \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\" (UID: \"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.790080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs" (OuterVolumeSpecName: "logs") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.799839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm" (OuterVolumeSpecName: "kube-api-access-rpznm") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "kube-api-access-rpznm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.832842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.853071 5030 generic.go:334] "Generic (PLEG): container finished" podID="49ede297-9008-4ab6-87b2-a848861f456c" containerID="87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839" exitCode=0 Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.853132 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerDied","Data":"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839"} Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.853159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"49ede297-9008-4ab6-87b2-a848861f456c","Type":"ContainerDied","Data":"26773ef79223fda2e356ee246916ec7de8b2bf3e31aa9e8d7aac4fc8e6a4df87"} Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.853202 5030 scope.go:117] "RemoveContainer" containerID="87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.853322 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.857706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data" (OuterVolumeSpecName: "config-data") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.861517 5030 generic.go:334] "Generic (PLEG): container finished" podID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerID="222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351" exitCode=0 Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.861656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerDied","Data":"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351"} Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.861728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b","Type":"ContainerDied","Data":"07f90b9775e15b482141d3af8398c817b5b9f11f55357742d8c62bc5f6b3d23e"} Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.861833 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.871159 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerID="d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" exitCode=0 Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.871396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55","Type":"ContainerDied","Data":"d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9"} Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.878140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.883070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" (UID: "7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.886576 5030 scope.go:117] "RemoveContainer" containerID="47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.892314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs\") pod \"49ede297-9008-4ab6-87b2-a848861f456c\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.892409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs\") pod \"49ede297-9008-4ab6-87b2-a848861f456c\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.892492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbz8j\" (UniqueName: \"kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j\") pod \"49ede297-9008-4ab6-87b2-a848861f456c\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.892524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data\") pod \"49ede297-9008-4ab6-87b2-a848861f456c\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.892580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle\") pod \"49ede297-9008-4ab6-87b2-a848861f456c\" (UID: \"49ede297-9008-4ab6-87b2-a848861f456c\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893107 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893124 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893135 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893145 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893154 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpznm\" (UniqueName: \"kubernetes.io/projected/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-kube-api-access-rpznm\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.893162 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.897419 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs" (OuterVolumeSpecName: "logs") pod "49ede297-9008-4ab6-87b2-a848861f456c" (UID: "49ede297-9008-4ab6-87b2-a848861f456c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.899583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j" (OuterVolumeSpecName: "kube-api-access-wbz8j") pod "49ede297-9008-4ab6-87b2-a848861f456c" (UID: "49ede297-9008-4ab6-87b2-a848861f456c"). InnerVolumeSpecName "kube-api-access-wbz8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.917790 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.920696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49ede297-9008-4ab6-87b2-a848861f456c" (UID: "49ede297-9008-4ab6-87b2-a848861f456c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.922719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data" (OuterVolumeSpecName: "config-data") pod "49ede297-9008-4ab6-87b2-a848861f456c" (UID: "49ede297-9008-4ab6-87b2-a848861f456c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.931699 5030 scope.go:117] "RemoveContainer" containerID="87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839" Jan 20 23:33:12 crc kubenswrapper[5030]: E0120 23:33:12.933956 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839\": container with ID starting with 87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839 not found: ID does not exist" containerID="87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.934002 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839"} err="failed to get container status \"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839\": rpc error: code = NotFound desc = could not find container \"87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839\": container with ID starting with 87708e45f8565e70d16688c36ab9c0d62303f36b41bc8a9b49ea37525d50d839 not found: ID does not exist" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.934029 5030 scope.go:117] "RemoveContainer" containerID="47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062" Jan 20 23:33:12 crc kubenswrapper[5030]: E0120 23:33:12.936469 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062\": container with ID starting with 47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062 not found: ID does not exist" containerID="47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.936497 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062"} err="failed to get container status \"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062\": rpc error: code = NotFound desc = could not find container \"47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062\": container with ID starting with 47416c4e1855bb17d7d18634d2d54d2bf63df19cae02840bed6ead7ebabb8062 not found: ID does not exist" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.936516 5030 scope.go:117] "RemoveContainer" containerID="222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.940735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "49ede297-9008-4ab6-87b2-a848861f456c" (UID: "49ede297-9008-4ab6-87b2-a848861f456c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.967527 5030 scope.go:117] "RemoveContainer" containerID="cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5" Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.998151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km8pm\" (UniqueName: \"kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm\") pod \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.998568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle\") pod \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " Jan 20 23:33:12 crc kubenswrapper[5030]: I0120 23:33:12.998780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data\") pod \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\" (UID: \"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55\") " Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:12.999371 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49ede297-9008-4ab6-87b2-a848861f456c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.002224 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.002283 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbz8j\" (UniqueName: \"kubernetes.io/projected/49ede297-9008-4ab6-87b2-a848861f456c-kube-api-access-wbz8j\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.002332 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.002379 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ede297-9008-4ab6-87b2-a848861f456c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.025087 5030 scope.go:117] "RemoveContainer" containerID="222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.028885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm" (OuterVolumeSpecName: "kube-api-access-km8pm") pod "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" (UID: "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55"). InnerVolumeSpecName "kube-api-access-km8pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.029012 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351\": container with ID starting with 222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351 not found: ID does not exist" containerID="222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.029050 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351"} err="failed to get container status \"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351\": rpc error: code = NotFound desc = could not find container \"222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351\": container with ID starting with 222a2f47d191a1664c821dc9ee94346d6f1a1e6a3b2a2cd0b837cc0323155351 not found: ID does not exist" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.029076 5030 scope.go:117] "RemoveContainer" containerID="cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.029347 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5\": container with ID starting with cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5 not found: ID does not exist" containerID="cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.029379 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5"} err="failed to get container status \"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5\": rpc error: code = NotFound desc = could not find container \"cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5\": container with ID starting with cbab7da01b3bd5a283a0561e255163fd3199803f895805e5b5c4db5d92e3a9d5 not found: ID does not exist" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.082792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" (UID: "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.085770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data" (OuterVolumeSpecName: "config-data") pod "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" (UID: "eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.105236 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km8pm\" (UniqueName: \"kubernetes.io/projected/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-kube-api-access-km8pm\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.105276 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.105286 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.201141 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.209079 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.240994 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241344 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241362 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241374 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241380 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241395 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-metadata" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241401 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-metadata" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241431 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-log" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241437 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-log" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241451 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerName="nova-scheduler-scheduler" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241457 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerName="nova-scheduler-scheduler" Jan 20 23:33:13 crc kubenswrapper[5030]: E0120 23:33:13.241467 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" 
containerName="nova-manage" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241475 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" containerName="nova-manage" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241640 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" containerName="nova-manage" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241649 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-metadata" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241668 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ede297-9008-4ab6-87b2-a848861f456c" containerName="nova-metadata-log" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241677 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-api" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241690 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" containerName="nova-scheduler-scheduler" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.241702 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" containerName="nova-api-log" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.242557 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.245886 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.251682 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.273901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.303551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.311060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.311169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.311227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzcd\" (UniqueName: \"kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 
23:33:13.311272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.311296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.315867 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.330702 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.332055 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.336145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.336293 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.336385 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.369715 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzcd\" (UniqueName: \"kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mtk9\" (UniqueName: \"kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413862 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.413914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.414002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.414028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.414060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.414116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.415241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.418449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.418991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data\") pod \"nova-metadata-0\" (UID: 
\"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.419191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.432362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzcd\" (UniqueName: \"kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd\") pod \"nova-metadata-0\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.515460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mtk9\" (UniqueName: \"kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.515515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.515535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.515559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.516032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.516168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.516094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.518858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.518906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.519247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.520006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.540179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mtk9\" (UniqueName: \"kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9\") pod \"nova-api-0\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.577209 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.654695 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.882978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55","Type":"ContainerDied","Data":"75cefa19df80f7c69578b47dcc8c1d1893e327cf9990e5d6d398a3297ab5a566"} Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.883247 5030 scope.go:117] "RemoveContainer" containerID="d7be87fefb3c4d5a418de162a73e2d41c4f9db940696c277c29fa956663309f9" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.883052 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.946098 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.973033 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ede297-9008-4ab6-87b2-a848861f456c" path="/var/lib/kubelet/pods/49ede297-9008-4ab6-87b2-a848861f456c/volumes" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.973757 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b" path="/var/lib/kubelet/pods/7bdd67aa-3f61-44c7-a5bb-ac7794bb1e1b/volumes" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.974320 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.974351 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.975522 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.977496 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:13 crc kubenswrapper[5030]: I0120 23:33:13.978235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:33:14 crc kubenswrapper[5030]: W0120 23:33:14.012545 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84ff2188_a797_40b5_8749_ee4330e6bf9a.slice/crio-ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55 WatchSource:0}: Error finding container ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55: Status 404 returned error can't find the container with id ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55 Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.013095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.024512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.024669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.024701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcbf\" (UniqueName: \"kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.126304 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.126543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.126746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcbf\" (UniqueName: \"kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: W0120 23:33:14.127554 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f09696_840e_4a93_87c8_57bff4137d85.slice/crio-a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd WatchSource:0}: Error finding container a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd: Status 404 returned error can't find the container with id a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.131639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.132611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.134312 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.146177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcbf\" (UniqueName: \"kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf\") pod \"nova-scheduler-0\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.294694 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.738777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:14 crc kubenswrapper[5030]: W0120 23:33:14.740533 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8011bf0a_412f_4bbc_a0d8_12da1a319a3f.slice/crio-401dd115801d4b61cfe50f74d674e6d16df7fa7681d2854c84ba54d632acf884 WatchSource:0}: Error finding container 401dd115801d4b61cfe50f74d674e6d16df7fa7681d2854c84ba54d632acf884: Status 404 returned error can't find the container with id 401dd115801d4b61cfe50f74d674e6d16df7fa7681d2854c84ba54d632acf884 Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.905854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerStarted","Data":"5038cf6a212317ce8623b41306e775f298e2c7a56a2b338c29c6f7840be367fa"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.905894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerStarted","Data":"7ebffc22266db7afd3868873c3574233432e05434afc8ac8fe64ce746971e63e"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.905905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerStarted","Data":"a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.913441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerStarted","Data":"820679d70ac112437e5c84042ac99cfca7c5a633c0df3f571874cadf6eb168d5"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.913475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerStarted","Data":"89d17ab7dc16cfbf12bf89444cea716d748efedb57923810b56061ff70252a84"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.913485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerStarted","Data":"ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.915257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8011bf0a-412f-4bbc-a0d8-12da1a319a3f","Type":"ContainerStarted","Data":"a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.915300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8011bf0a-412f-4bbc-a0d8-12da1a319a3f","Type":"ContainerStarted","Data":"401dd115801d4b61cfe50f74d674e6d16df7fa7681d2854c84ba54d632acf884"} Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.922770 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.9227544810000001 podStartE2EDuration="1.922754481s" 
podCreationTimestamp="2026-01-20 23:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:14.921355637 +0000 UTC m=+3467.241615925" watchObservedRunningTime="2026-01-20 23:33:14.922754481 +0000 UTC m=+3467.243014769" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.946984 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=1.946965916 podStartE2EDuration="1.946965916s" podCreationTimestamp="2026-01-20 23:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:14.939823914 +0000 UTC m=+3467.260084202" watchObservedRunningTime="2026-01-20 23:33:14.946965916 +0000 UTC m=+3467.267226204" Jan 20 23:33:14 crc kubenswrapper[5030]: I0120 23:33:14.963977 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.9639596780000002 podStartE2EDuration="1.963959678s" podCreationTimestamp="2026-01-20 23:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:14.957110682 +0000 UTC m=+3467.277370970" watchObservedRunningTime="2026-01-20 23:33:14.963959678 +0000 UTC m=+3467.284219966" Jan 20 23:33:15 crc kubenswrapper[5030]: I0120 23:33:15.974042 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55" path="/var/lib/kubelet/pods/eb14fbe1-3c84-4f5c-9d11-b4cd8e8bed55/volumes" Jan 20 23:33:18 crc kubenswrapper[5030]: I0120 23:33:18.577751 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:18 crc kubenswrapper[5030]: I0120 23:33:18.579840 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:19 crc kubenswrapper[5030]: I0120 23:33:19.294964 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:23 crc kubenswrapper[5030]: I0120 23:33:23.577420 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:23 crc kubenswrapper[5030]: I0120 23:33:23.578398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:23 crc kubenswrapper[5030]: I0120 23:33:23.655526 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:23 crc kubenswrapper[5030]: I0120 23:33:23.655595 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.295594 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.350528 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.591860 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" 
containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.591907 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.669806 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.45:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:33:24 crc kubenswrapper[5030]: I0120 23:33:24.669852 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.45:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:33:25 crc kubenswrapper[5030]: I0120 23:33:25.110561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.585446 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.585737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.597377 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.598479 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.672601 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.673697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.677454 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:33 crc kubenswrapper[5030]: I0120 23:33:33.702257 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:34 crc kubenswrapper[5030]: I0120 23:33:34.355410 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:34 crc kubenswrapper[5030]: I0120 23:33:34.367259 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:37 crc kubenswrapper[5030]: I0120 23:33:37.174170 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.157378 5030 patch_prober.go:28] interesting 
pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.157792 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.157840 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.158686 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.158759 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea" gracePeriod=600 Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.430643 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea" exitCode=0 Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.430821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea"} Jan 20 23:33:40 crc kubenswrapper[5030]: I0120 23:33:40.430935 5030 scope.go:117] "RemoveContainer" containerID="a759e8347a1a114d312c623ecef1a23c0947594f13f6c637c7c63dc497e4949f" Jan 20 23:33:41 crc kubenswrapper[5030]: I0120 23:33:41.444060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4"} Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.100287 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.100960 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" containerID="cri-o://59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.101030 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" 
containerName="openstack-network-exporter" containerID="cri-o://df6cb8bdfcb5d92a6cd5e2e793e2b41a4b1b3fe7648144b68448df9085f79bf1" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.125161 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.125567 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerName="nova-scheduler-scheduler" containerID="cri-o://a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.156126 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.156956 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="openstack-network-exporter" containerID="cri-o://7500d10c05813e90a38f8a26285eef2c34b7e71479f0011b90857ee52f9a3d74" gracePeriod=300 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.172902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.173139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" containerName="openstackclient" containerID="cri-o://7cb774b7ad67df28cae541d2b5fd9915c0040832858e0d9cbf4f702324dd16b6" gracePeriod=2 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.195063 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.195116 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.219900 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.220150 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" containerID="cri-o://89d17ab7dc16cfbf12bf89444cea716d748efedb57923810b56061ff70252a84" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.220360 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" containerID="cri-o://820679d70ac112437e5c84042ac99cfca7c5a633c0df3f571874cadf6eb168d5" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.232302 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.232527 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerName="memcached" 
containerID="cri-o://019f85f2b7928eefb30edadc94479be6aefb358db65c713d1e1e7a186e7be788" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.247701 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.248111 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.266379 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:33:51 crc kubenswrapper[5030]: E0120 23:33:51.266939 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" containerName="openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.266953 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" containerName="openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.267172 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" containerName="openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.268371 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.295939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.297436 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.318099 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.319211 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: E0120 23:33:51.348384 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwzb\" (UniqueName: \"kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349647 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68tfg\" (UniqueName: \"kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349733 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.349842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4gz5\" (UniqueName: 
\"kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.355545 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.372276 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="ovsdbserver-nb" containerID="cri-o://9610c64c6d13e545efd2ca85d32f3b7947e2a7fa0179fb1ad72d312779b314e3" gracePeriod=300 Jan 20 23:33:51 crc kubenswrapper[5030]: E0120 23:33:51.372394 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.377764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:33:51 crc kubenswrapper[5030]: E0120 23:33:51.392325 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:51 crc kubenswrapper[5030]: E0120 23:33:51.392398 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.395682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.431930 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.432143 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-log" containerID="cri-o://dba39c972f09e29c1e7af7710cba14d22bb01635d4706b36515e50f46d419e23" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.432282 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-httpd" containerID="cri-o://e9b169497221304dbe6adee831f056e749c90f520798d74f0635cb2f60436f1d" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" 
(UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68tfg\" (UniqueName: \"kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.450971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d4gz5\" (UniqueName: \"kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwzb\" (UniqueName: \"kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.451242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.452277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.461555 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.461786 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api-log" 
containerID="cri-o://5acce875889d65a3f586a3832b99f2d8bced89596046689298d3df1e58194783" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.461901 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api" containerID="cri-o://5577e422daaa40c7d4a3a4b146deed7f44ec2457dac0430688594f2eba2c7186" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.466319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.470430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.471505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.474048 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.495491 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.497013 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.533358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4gz5\" (UniqueName: \"kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.539561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.540529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.541346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.542761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.543726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.553192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.553216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.553811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs\") pod \"barbican-api-548bcb5f76-7c959\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.594470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwzb\" (UniqueName: \"kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb\") pod \"openstackclient\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.596801 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68tfg\" (UniqueName: \"kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg\") pod \"barbican-keystone-listener-774477bf6d-2vg4t\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.605390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.605520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.605739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.605768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.605841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsmx\" (UniqueName: \"kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.638349 5030 generic.go:334] "Generic (PLEG): container finished" podID="27a52282-d198-4a51-912a-1c2267f20d4b" containerID="7500d10c05813e90a38f8a26285eef2c34b7e71479f0011b90857ee52f9a3d74" exitCode=2 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.638736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerDied","Data":"7500d10c05813e90a38f8a26285eef2c34b7e71479f0011b90857ee52f9a3d74"} Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.643581 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.643933 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-log" containerID="cri-o://7ebffc22266db7afd3868873c3574233432e05434afc8ac8fe64ce746971e63e" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.644551 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-api" containerID="cri-o://5038cf6a212317ce8623b41306e775f298e2c7a56a2b338c29c6f7840be367fa" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.673704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.674426 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.681967 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.682009 5030 generic.go:334] "Generic (PLEG): container finished" podID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerID="89d17ab7dc16cfbf12bf89444cea716d748efedb57923810b56061ff70252a84" exitCode=143 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.687590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerDied","Data":"89d17ab7dc16cfbf12bf89444cea716d748efedb57923810b56061ff70252a84"} Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.687783 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.688445 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b274deb-e830-47a2-916c-3c10670c08ac" containerID="df6cb8bdfcb5d92a6cd5e2e793e2b41a4b1b3fe7648144b68448df9085f79bf1" exitCode=2 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.688469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerDied","Data":"df6cb8bdfcb5d92a6cd5e2e793e2b41a4b1b3fe7648144b68448df9085f79bf1"} Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.773135 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.776105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.776140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.776204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsmx\" (UniqueName: \"kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.776292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.776358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.786334 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.796552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.800666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.802727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.802963 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-log" containerID="cri-o://ef6976dc418e8fc75f18940eec16d6ef87c71cd9327cf28588a9a353d0afb920" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.803397 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-httpd" containerID="cri-o://42893f23058587a6ea7bb91a49056a29e0ef50a9501774456a34ef1a79090ca0" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.815287 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsmx\" (UniqueName: \"kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.815331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data\") pod \"barbican-worker-dcf7bb847-h8v9h\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.827394 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.827681 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="cinder-scheduler" containerID="cri-o://61f13c629eb50d8806b59af161a0e2ebd559836747a102e58d63db7089892566" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.828114 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="probe" containerID="cri-o://24f4bfb73431d6bcf7cc305370aec071e7e270a5776bad4e062eabb3f4ff4ed1" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.854350 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.854600 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5ae7bb9ba42eb770778d66a021e4203f2b42a0265e7c087325892501138ebe93" gracePeriod=30 Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.878270 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.880070 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.881826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.881865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.881988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.882090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lshx\" (UniqueName: \"kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.882153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.882411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.882441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.914066 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987052 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lshx\" (UniqueName: \"kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:51 crc kubenswrapper[5030]: I0120 23:33:51.987731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.015185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " 
pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.017093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.017513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.017599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.028948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.063843 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.123673 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.124590 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="openstack-network-exporter" containerID="cri-o://e15443f82953fd9286b3c4ae0ad6d0ae04bde20b646f4ec3be8e39a140ec4ff8" gracePeriod=300 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.133436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.157862 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.159447 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.180047 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lshx\" (UniqueName: \"kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx\") pod \"placement-7f67b969c8-d246l\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.244062 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.322165 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8tj\" (UniqueName: \"kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.336405 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.348816 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.350091 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.373466 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.201:11211: connect: connection refused" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.386878 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.415892 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="ovsdbserver-sb" containerID="cri-o://fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" gracePeriod=300 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8tj\" (UniqueName: \"kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443817 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443853 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.443898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config\") pod 
\"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.463033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.463530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.464140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.464457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.475352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.475475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.477356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8tj\" (UniqueName: \"kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj\") pod \"neutron-676d666f78-55975\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.545413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.545902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data\") pod \"keystone-b56fccfd8-pxhmq\" (UID: 
\"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.545967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.545996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.546025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.546047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.546093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj4n\" (UniqueName: \"kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.546112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.581078 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="galera" containerID="cri-o://e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" gracePeriod=30 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.609902 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648546 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj4n\" (UniqueName: \"kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.648696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.658401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " 
pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.659069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.659347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.662154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.665407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.666381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.667411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.673550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj4n\" (UniqueName: \"kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n\") pod \"keystone-b56fccfd8-pxhmq\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.685152 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:52 crc kubenswrapper[5030]: E0120 23:33:52.725985 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251 is running failed: container process not found" containerID="fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:33:52 crc kubenswrapper[5030]: E0120 23:33:52.729249 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251 is running failed: container process not found" containerID="fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:33:52 crc kubenswrapper[5030]: E0120 23:33:52.734123 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251 is running failed: container process not found" containerID="fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:33:52 crc kubenswrapper[5030]: E0120 23:33:52.734193 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="ovsdbserver-sb" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.763992 5030 generic.go:334] "Generic (PLEG): container finished" podID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerID="ef6976dc418e8fc75f18940eec16d6ef87c71cd9327cf28588a9a353d0afb920" exitCode=143 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.764268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerDied","Data":"ef6976dc418e8fc75f18940eec16d6ef87c71cd9327cf28588a9a353d0afb920"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.818190 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_27a52282-d198-4a51-912a-1c2267f20d4b/ovsdbserver-nb/0.log" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.818410 5030 generic.go:334] "Generic (PLEG): container finished" podID="27a52282-d198-4a51-912a-1c2267f20d4b" containerID="9610c64c6d13e545efd2ca85d32f3b7947e2a7fa0179fb1ad72d312779b314e3" exitCode=143 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.818536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerDied","Data":"9610c64c6d13e545efd2ca85d32f3b7947e2a7fa0179fb1ad72d312779b314e3"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.830762 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.852327 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerID="5acce875889d65a3f586a3832b99f2d8bced89596046689298d3df1e58194783" exitCode=143 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.852567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerDied","Data":"5acce875889d65a3f586a3832b99f2d8bced89596046689298d3df1e58194783"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.882980 5030 generic.go:334] "Generic (PLEG): container finished" podID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerID="019f85f2b7928eefb30edadc94479be6aefb358db65c713d1e1e7a186e7be788" exitCode=0 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.883114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"4dc2869a-ba69-4466-aa02-b6fe06035daa","Type":"ContainerDied","Data":"019f85f2b7928eefb30edadc94479be6aefb358db65c713d1e1e7a186e7be788"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.933662 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.935467 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.936664 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_dcf91700-edc5-4ecb-a027-b6b904e948e1/ovsdbserver-sb/0.log" Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.936692 5030 generic.go:334] "Generic (PLEG): container finished" podID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerID="e15443f82953fd9286b3c4ae0ad6d0ae04bde20b646f4ec3be8e39a140ec4ff8" exitCode=2 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.936707 5030 generic.go:334] "Generic (PLEG): container finished" podID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerID="fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" exitCode=143 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.941757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerDied","Data":"e15443f82953fd9286b3c4ae0ad6d0ae04bde20b646f4ec3be8e39a140ec4ff8"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.941858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerDied","Data":"fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251"} Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.989013 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerID="dba39c972f09e29c1e7af7710cba14d22bb01635d4706b36515e50f46d419e23" exitCode=143 Jan 20 23:33:52 crc kubenswrapper[5030]: I0120 23:33:52.989103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerDied","Data":"dba39c972f09e29c1e7af7710cba14d22bb01635d4706b36515e50f46d419e23"} Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.004340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 
23:33:53.014053 5030 generic.go:334] "Generic (PLEG): container finished" podID="48f09696-840e-4a93-87c8-57bff4137d85" containerID="7ebffc22266db7afd3868873c3574233432e05434afc8ac8fe64ce746971e63e" exitCode=143 Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.014089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerDied","Data":"7ebffc22266db7afd3868873c3574233432e05434afc8ac8fe64ce746971e63e"} Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.043796 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="galera" containerID="cri-o://aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" gracePeriod=30 Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.056668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.069875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.069983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.070006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5hp\" (UniqueName: \"kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.070054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.070185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.173456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " 
pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.173722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5hp\" (UniqueName: \"kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.173770 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.173860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.173924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.174202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.194748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.219949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.221387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.288687 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.290844 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.292670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.320489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5hp\" (UniqueName: \"kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp\") pod \"barbican-worker-d56b7b96f-pdtfp\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.370093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.394796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.394914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.394949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.395010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqj2\" (UniqueName: \"kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.395041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.422277 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:33:53 crc 
kubenswrapper[5030]: I0120 23:33:53.498066 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.504093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.504412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqj2\" (UniqueName: \"kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.504568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.504670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.510223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.519451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.520571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.524578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.536141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqj2\" (UniqueName: \"kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2\") pod \"barbican-keystone-listener-f68656994-jgqr2\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.543029 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.559859 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.576170 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.585145 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.596781 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.605419 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.606987 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.608939 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.614550 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.618241 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.668448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.670187 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_27a52282-d198-4a51-912a-1c2267f20d4b/ovsdbserver-nb/0.log" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.670250 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.682311 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.683793 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:33:53 crc kubenswrapper[5030]: E0120 23:33:53.684116 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="openstack-network-exporter" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684133 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="openstack-network-exporter" Jan 20 23:33:53 crc kubenswrapper[5030]: E0120 23:33:53.684152 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerName="memcached" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerName="memcached" Jan 20 23:33:53 crc kubenswrapper[5030]: E0120 23:33:53.684184 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="ovsdbserver-nb" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684190 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="ovsdbserver-nb" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684359 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="ovsdbserver-nb" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684377 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" containerName="openstack-network-exporter" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.684390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" containerName="memcached" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.685107 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708204 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle\") pod \"4dc2869a-ba69-4466-aa02-b6fe06035daa\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs\") pod \"4dc2869a-ba69-4466-aa02-b6fe06035daa\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708417 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lx8n\" (UniqueName: \"kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n\") pod \"4dc2869a-ba69-4466-aa02-b6fe06035daa\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config\") pod \"4dc2869a-ba69-4466-aa02-b6fe06035daa\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708640 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708659 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts\") pod 
\"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data\") pod \"4dc2869a-ba69-4466-aa02-b6fe06035daa\" (UID: \"4dc2869a-ba69-4466-aa02-b6fe06035daa\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n6vx\" (UniqueName: \"kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.708827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs\") pod \"27a52282-d198-4a51-912a-1c2267f20d4b\" (UID: \"27a52282-d198-4a51-912a-1c2267f20d4b\") " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlx9l\" (UniqueName: \"kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data\") pod 
\"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2ml\" (UniqueName: \"kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.709701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: W0120 23:33:53.711961 5030 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a8a7d0_ed33_43a6_a11d_c544f55be6a5.slice/crio-7dc8386abdbcaed1bea62f94581045b43de423bc42d2ef1a13578add2fac1da0 WatchSource:0}: Error finding container 7dc8386abdbcaed1bea62f94581045b43de423bc42d2ef1a13578add2fac1da0: Status 404 returned error can't find the container with id 7dc8386abdbcaed1bea62f94581045b43de423bc42d2ef1a13578add2fac1da0 Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.713827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts" (OuterVolumeSpecName: "scripts") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.714959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.718586 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config" (OuterVolumeSpecName: "config") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.719839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.720339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.720535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx" (OuterVolumeSpecName: "kube-api-access-5n6vx") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "kube-api-access-5n6vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.723408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data" (OuterVolumeSpecName: "config-data") pod "4dc2869a-ba69-4466-aa02-b6fe06035daa" (UID: "4dc2869a-ba69-4466-aa02-b6fe06035daa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.723591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4dc2869a-ba69-4466-aa02-b6fe06035daa" (UID: "4dc2869a-ba69-4466-aa02-b6fe06035daa"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.738920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n" (OuterVolumeSpecName: "kube-api-access-5lx8n") pod "4dc2869a-ba69-4466-aa02-b6fe06035daa" (UID: "4dc2869a-ba69-4466-aa02-b6fe06035daa"). InnerVolumeSpecName "kube-api-access-5lx8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.749989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.763032 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.766650 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.781248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811515 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2ml\" (UniqueName: \"kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlx9l\" (UniqueName: \"kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811828 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skqw\" (UniqueName: \"kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.811988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " 
pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jjc\" (UniqueName: \"kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812285 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812297 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/27a52282-d198-4a51-912a-1c2267f20d4b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812306 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812317 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n6vx\" (UniqueName: \"kubernetes.io/projected/27a52282-d198-4a51-912a-1c2267f20d4b-kube-api-access-5n6vx\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812343 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812340 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812353 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lx8n\" (UniqueName: \"kubernetes.io/projected/4dc2869a-ba69-4466-aa02-b6fe06035daa-kube-api-access-5lx8n\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812434 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.812449 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4dc2869a-ba69-4466-aa02-b6fe06035daa-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.834194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.846261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlx9l\" (UniqueName: \"kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.854214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.856327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.860501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " 
pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.861006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.863140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.864446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2ml\" (UniqueName: \"kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.867582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.869067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.871711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle\") pod \"barbican-api-d5769bb96-hnxc7\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.872555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.875754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.878166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config\") pod \"neutron-6776c47bdb-m5w28\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " 
pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.908420 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.253:6080/vnc_lite.html\": dial tcp 10.217.1.253:6080: connect: connection refused" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.910069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914018 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914062 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skqw\" (UniqueName: \"kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jjc\" (UniqueName: \"kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914415 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.914468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.917262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.920082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.920365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.921224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.923730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.926892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.948590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.951248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.952153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.953612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.953863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skqw\" (UniqueName: \"kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw\") pod 
\"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.978995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:53 crc kubenswrapper[5030]: I0120 23:33:53.979616 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data\") pod \"placement-5f5fbc99cd-g6d5w\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:53.999453 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.041251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jjc\" (UniqueName: \"kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.042874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs\") pod \"keystone-ddf8577d5-p79tx\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.050254 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.050997 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"4dc2869a-ba69-4466-aa02-b6fe06035daa","Type":"ContainerDied","Data":"73c74e1285256165f30db19767d9beb46653b00b7cf33280e3fa0754345416e3"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.051035 5030 scope.go:117] "RemoveContainer" containerID="019f85f2b7928eefb30edadc94479be6aefb358db65c713d1e1e7a186e7be788" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.066800 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_27a52282-d198-4a51-912a-1c2267f20d4b/ovsdbserver-nb/0.log" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.067303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"27a52282-d198-4a51-912a-1c2267f20d4b","Type":"ContainerDied","Data":"2c8b30fe3bc11e57c9985ae029d78217dff93a87b0704952f4f312545e5171c2"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.067532 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.072249 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.073256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerStarted","Data":"7dc8386abdbcaed1bea62f94581045b43de423bc42d2ef1a13578add2fac1da0"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.090371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerStarted","Data":"8dafb465f988901e4fe4867094e00575da0eb7bc1d0020e5216b4a9150e27f42"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.106968 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.109159 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.112461 5030 scope.go:117] "RemoveContainer" containerID="7500d10c05813e90a38f8a26285eef2c34b7e71479f0011b90857ee52f9a3d74" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.124341 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.126910 5030 generic.go:334] "Generic (PLEG): container finished" podID="6153d63b-e003-4951-b9f1-a27d11979663" containerID="24f4bfb73431d6bcf7cc305370aec071e7e270a5776bad4e062eabb3f4ff4ed1" exitCode=0 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.127006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerDied","Data":"24f4bfb73431d6bcf7cc305370aec071e7e270a5776bad4e062eabb3f4ff4ed1"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.134861 5030 generic.go:334] "Generic (PLEG): container finished" podID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerID="5ae7bb9ba42eb770778d66a021e4203f2b42a0265e7c087325892501138ebe93" exitCode=0 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.134911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a8865fb5-d3c6-4178-9f1f-43e29f24e345","Type":"ContainerDied","Data":"5ae7bb9ba42eb770778d66a021e4203f2b42a0265e7c087325892501138ebe93"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.159841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerStarted","Data":"b5f348a0227bc5cb6c1679c20e34ff57444d3d5f79ad0466f1e16ac007e64197"} Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.176800 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.178063 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.196387 5030 scope.go:117] "RemoveContainer" containerID="9610c64c6d13e545efd2ca85d32f3b7947e2a7fa0179fb1ad72d312779b314e3" Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.222383 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.222437 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.234448 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerStarted","Data":"723b0978ae71c86018114fdc1223e1db2589a8a21412aca5690191cf3fbae746"} Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.243409 5030 generic.go:334] "Generic (PLEG): container finished" podID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" containerID="7cb774b7ad67df28cae541d2b5fd9915c0040832858e0d9cbf4f702324dd16b6" exitCode=137 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.257129 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.287287 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.300926 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f is running failed: container process not found" containerID="a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.311228 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f is running failed: container process not found" containerID="a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.320745 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f is running failed: container process not found" containerID="a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.320818 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerName="nova-scheduler-scheduler" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.328957 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.335333 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.337885 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 23:33:54.339303 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:54 crc kubenswrapper[5030]: E0120 
23:33:54.339346 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="galera" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.363699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "4dc2869a-ba69-4466-aa02-b6fe06035daa" (UID: "4dc2869a-ba69-4466-aa02-b6fe06035daa"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.443970 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.448886 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.455962 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.456225 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-central-agent" containerID="cri-o://5f9274ccadd108b7edd55790f7101b6f78736bb7058b62758a8e76d34b967ae3" gracePeriod=30 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.456691 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="proxy-httpd" containerID="cri-o://a817fccb1d8108fa564f78fe7c39689a51604fe6b3c5f2955526baac928c24e3" gracePeriod=30 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.456734 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="sg-core" containerID="cri-o://23f3dcb0553f3089e12b860e39bdceefecb58a03add17ca1ec21c1cfd51434a4" gracePeriod=30 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.456764 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-notification-agent" containerID="cri-o://1364f3685e72aa6c1fbe3584228fe8fd80440e7f23b6bca2e91fbdaa8ae36662" gracePeriod=30 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.466981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dc2869a-ba69-4466-aa02-b6fe06035daa" (UID: "4dc2869a-ba69-4466-aa02-b6fe06035daa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:54 crc kubenswrapper[5030]: W0120 23:33:54.500171 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ee31a9_b56d_4e9c_8b51_7b344bb678d0.slice/crio-acd5ced3fb7131fa70792ce751b2c18f2bcaf93815d7f7913895cd6abadd74a1 WatchSource:0}: Error finding container acd5ced3fb7131fa70792ce751b2c18f2bcaf93815d7f7913895cd6abadd74a1: Status 404 returned error can't find the container with id acd5ced3fb7131fa70792ce751b2c18f2bcaf93815d7f7913895cd6abadd74a1 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.531457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.546458 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc2869a-ba69-4466-aa02-b6fe06035daa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.546482 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:54 crc kubenswrapper[5030]: W0120 23:33:54.596750 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode566cb27_7416_41c6_b6e4_d31c2fdb3f1d.slice/crio-72a59b1fd23ebc0f8e8c313dbec6a6d1db2478bf9de1ccc57dd3cc6a9ae73bb6 WatchSource:0}: Error finding container 72a59b1fd23ebc0f8e8c313dbec6a6d1db2478bf9de1ccc57dd3cc6a9ae73bb6: Status 404 returned error can't find the container with id 72a59b1fd23ebc0f8e8c313dbec6a6d1db2478bf9de1ccc57dd3cc6a9ae73bb6 Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.613848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.665769 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.684008 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.707994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "27a52282-d198-4a51-912a-1c2267f20d4b" (UID: "27a52282-d198-4a51-912a-1c2267f20d4b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.726269 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:33:54 crc kubenswrapper[5030]: W0120 23:33:54.736870 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fdbf9_041a_46dc_87c9_f2531626e150.slice/crio-73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c WatchSource:0}: Error finding container 73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c: Status 404 returned error can't find the container with id 73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c Jan 20 23:33:54 crc kubenswrapper[5030]: I0120 23:33:54.769818 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27a52282-d198-4a51-912a-1c2267f20d4b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.256941 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_dcf91700-edc5-4ecb-a027-b6b904e948e1/ovsdbserver-sb/0.log" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.257325 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.303937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config\") pod 
\"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.304008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrphn\" (UniqueName: \"kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.304060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"dcf91700-edc5-4ecb-a027-b6b904e948e1\" (UID: \"dcf91700-edc5-4ecb-a027-b6b904e948e1\") " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.312080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.313317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config" (OuterVolumeSpecName: "config") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.318339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts" (OuterVolumeSpecName: "scripts") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.327630 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.343867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn" (OuterVolumeSpecName: "kube-api-access-zrphn") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "kube-api-access-zrphn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.344570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerStarted","Data":"96d3fa65bfb8aded4d4835d8a23a2f287745f32c728f721355c9d27da2608427"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.351166 5030 generic.go:334] "Generic (PLEG): container finished" podID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerID="a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" exitCode=0 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.351264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8011bf0a-412f-4bbc-a0d8-12da1a319a3f","Type":"ContainerDied","Data":"a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.414051 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.414084 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcf91700-edc5-4ecb-a027-b6b904e948e1-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.414113 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrphn\" (UniqueName: \"kubernetes.io/projected/dcf91700-edc5-4ecb-a027-b6b904e948e1-kube-api-access-zrphn\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.414140 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.414150 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.418893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.466422 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.478891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"a8865fb5-d3c6-4178-9f1f-43e29f24e345","Type":"ContainerDied","Data":"611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.478929 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611e07425cd008334d0ac08cfa6e833437f1f4992115ef65bc4ff6bb7b8c28d2" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.480811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerStarted","Data":"1b7ac12593213a554ea14bfe0f61419d4d62872f58e89bac2f8027ab71aa20eb"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.482248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerStarted","Data":"aa169a139bd24a0b6ddc7e560156c87e40f4eba5bcd5d1dd9974ceb2324a9013"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.496838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerStarted","Data":"8a5263047017b01347184055ef56482fe8d62fe194eed7c994156f67f1bacf32"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.503129 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.504059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0","Type":"ContainerStarted","Data":"acd5ced3fb7131fa70792ce751b2c18f2bcaf93815d7f7913895cd6abadd74a1"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.507451 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_dcf91700-edc5-4ecb-a027-b6b904e948e1/ovsdbserver-sb/0.log" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.507504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"dcf91700-edc5-4ecb-a027-b6b904e948e1","Type":"ContainerDied","Data":"3fa7964cdb8712e5bc2875ed3c827e06a24e37a9f1c57a93f9ae63ab718c78b7"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.507528 5030 scope.go:117] "RemoveContainer" containerID="e15443f82953fd9286b3c4ae0ad6d0ae04bde20b646f4ec3be8e39a140ec4ff8" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.507638 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.515298 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.515318 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522234 5030 generic.go:334] "Generic (PLEG): container finished" podID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerID="a817fccb1d8108fa564f78fe7c39689a51604fe6b3c5f2955526baac928c24e3" exitCode=0 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522257 5030 generic.go:334] "Generic (PLEG): container finished" podID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerID="23f3dcb0553f3089e12b860e39bdceefecb58a03add17ca1ec21c1cfd51434a4" exitCode=2 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522266 5030 generic.go:334] "Generic (PLEG): container finished" podID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerID="5f9274ccadd108b7edd55790f7101b6f78736bb7058b62758a8e76d34b967ae3" exitCode=0 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522298 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerDied","Data":"a817fccb1d8108fa564f78fe7c39689a51604fe6b3c5f2955526baac928c24e3"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerDied","Data":"23f3dcb0553f3089e12b860e39bdceefecb58a03add17ca1ec21c1cfd51434a4"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.522327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerDied","Data":"5f9274ccadd108b7edd55790f7101b6f78736bb7058b62758a8e76d34b967ae3"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.535188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.536551 5030 generic.go:334] "Generic (PLEG): container finished" podID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerID="e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" exitCode=0 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.536615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerDied","Data":"e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.552352 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.573848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" event={"ID":"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d","Type":"ContainerStarted","Data":"72a59b1fd23ebc0f8e8c313dbec6a6d1db2478bf9de1ccc57dd3cc6a9ae73bb6"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.576994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerStarted","Data":"73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.577035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dcf91700-edc5-4ecb-a027-b6b904e948e1" (UID: "dcf91700-edc5-4ecb-a027-b6b904e948e1"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.594823 5030 generic.go:334] "Generic (PLEG): container finished" podID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerID="42893f23058587a6ea7bb91a49056a29e0ef50a9501774456a34ef1a79090ca0" exitCode=0 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.594961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerDied","Data":"42893f23058587a6ea7bb91a49056a29e0ef50a9501774456a34ef1a79090ca0"} Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.599207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.617575 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.617596 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf91700-edc5-4ecb-a027-b6b904e948e1-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.621341 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9032d48a07c90b7a5d2e0d3805cc787da850de386bbfae8911a63712b51e50ed" Jan 20 23:33:55 crc kubenswrapper[5030]: W0120 23:33:55.668986 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b3ce7d_73be_4f56_80a5_d9147ad0580e.slice/crio-c23bf4e0593f5e8bcacf1cb2436a6278954db7f885655cbf8b8477146c1a2b24 WatchSource:0}: Error finding container c23bf4e0593f5e8bcacf1cb2436a6278954db7f885655cbf8b8477146c1a2b24: Status 404 returned error can't find the container with id c23bf4e0593f5e8bcacf1cb2436a6278954db7f885655cbf8b8477146c1a2b24 Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.727042 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": read tcp 10.217.0.2:54952->10.217.0.44:8775: read: connection reset by peer" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.727359 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": read tcp 10.217.0.2:54946->10.217.0.44:8775: read: connection reset by peer" Jan 20 23:33:55 crc kubenswrapper[5030]: E0120 23:33:55.802033 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef is running failed: container process not found" containerID="e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:55 crc kubenswrapper[5030]: E0120 23:33:55.802369 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef is running failed: container process not found" containerID="e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:55 crc kubenswrapper[5030]: E0120 23:33:55.807009 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef is running failed: container process not found" containerID="e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:33:55 crc kubenswrapper[5030]: E0120 23:33:55.807040 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="galera" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.860499 5030 scope.go:117] "RemoveContainer" containerID="fdf31a0124040cc80f23e3e5fff8eb8638fc26f71aedd567cd3ee47023452251" Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.958910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:33:55 crc kubenswrapper[5030]: I0120 23:33:55.986029 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.131757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle\") pod \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.131812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs\") pod \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.131854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mznx8\" (UniqueName: \"kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8\") pod \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.131912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data\") pod \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.135073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs\") pod 
\"a8865fb5-d3c6-4178-9f1f-43e29f24e345\" (UID: \"a8865fb5-d3c6-4178-9f1f-43e29f24e345\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.187172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.197636 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.213409 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.236741 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.243569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret\") pod \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.243654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config\") pod \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.243712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle\") pod \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.243736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588ff\" (UniqueName: \"kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff\") pod \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\" (UID: \"aea67ae4-7b1c-4bc1-afe4-eacfc66d0081\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.244526 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.261204 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.262117 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.262516 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerName="nova-scheduler-scheduler" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.262529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerName="nova-scheduler-scheduler" Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.262552 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="ovsdbserver-sb" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.262558 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="ovsdbserver-sb" Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.264508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="openstack-network-exporter" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264524 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="openstack-network-exporter" Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.264541 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264548 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" containerName="nova-scheduler-scheduler" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="openstack-network-exporter" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264881 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" containerName="ovsdbserver-sb" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.264894 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.266016 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.273234 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.273410 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.275187 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.275350 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.279859 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.344748 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.346380 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.347796 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8 is running failed: container process not found" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.348067 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8 is running failed: container process not found" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.349734 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8 is running failed: container process not found" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:33:56 crc kubenswrapper[5030]: E0120 23:33:56.349795 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.350006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.350079 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:33:56 crc 
kubenswrapper[5030]: I0120 23:33:56.350206 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n56b6" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.353530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8" (OuterVolumeSpecName: "kube-api-access-mznx8") pod "a8865fb5-d3c6-4178-9f1f-43e29f24e345" (UID: "a8865fb5-d3c6-4178-9f1f-43e29f24e345"). InnerVolumeSpecName "kube-api-access-mznx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.356898 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.373634 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle\") pod \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.373777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpcbf\" (UniqueName: \"kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf\") pod \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.373804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data\") pod \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\" (UID: \"8011bf0a-412f-4bbc-a0d8-12da1a319a3f\") " Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374192 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsddh\" (UniqueName: \"kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.374338 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mznx8\" (UniqueName: \"kubernetes.io/projected/a8865fb5-d3c6-4178-9f1f-43e29f24e345-kube-api-access-mznx8\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.378815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff" (OuterVolumeSpecName: "kube-api-access-588ff") pod "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" (UID: "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081"). InnerVolumeSpecName "kube-api-access-588ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.393833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf" (OuterVolumeSpecName: "kube-api-access-zpcbf") pod "8011bf0a-412f-4bbc-a0d8-12da1a319a3f" (UID: "8011bf0a-412f-4bbc-a0d8-12da1a319a3f"). InnerVolumeSpecName "kube-api-access-zpcbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.448147 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data" (OuterVolumeSpecName: "config-data") pod "a8865fb5-d3c6-4178-9f1f-43e29f24e345" (UID: "a8865fb5-d3c6-4178-9f1f-43e29f24e345"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.534319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.537338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsddh\" (UniqueName: \"kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxqn4\" (UniqueName: \"kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544749 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.544933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.545024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.545064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.545249 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpcbf\" (UniqueName: \"kubernetes.io/projected/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-kube-api-access-zpcbf\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.545277 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.545295 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588ff\" (UniqueName: \"kubernetes.io/projected/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-kube-api-access-588ff\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.547278 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.548154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.549416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.564511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.567919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsddh\" (UniqueName: \"kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.567966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.568701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.646674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxqn4\" (UniqueName: \"kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.646727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.646774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.646820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.646853 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " 
pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.654963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.663458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.669353 5030 generic.go:334] "Generic (PLEG): container finished" podID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerID="aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.669423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerDied","Data":"aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.673795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"2de7a5fe-85b0-4b7c-af74-ec2c104fac94","Type":"ContainerDied","Data":"cbdacca4fc0f1f65be053152d8f0d255499ad9178bd0d5cdd9bf85b7311a1d52"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.674047 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbdacca4fc0f1f65be053152d8f0d255499ad9178bd0d5cdd9bf85b7311a1d52" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.674493 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data" (OuterVolumeSpecName: "config-data") pod "8011bf0a-412f-4bbc-a0d8-12da1a319a3f" (UID: "8011bf0a-412f-4bbc-a0d8-12da1a319a3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.684106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.685058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.706740 5030 generic.go:334] "Generic (PLEG): container finished" podID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerID="5577e422daaa40c7d4a3a4b146deed7f44ec2457dac0430688594f2eba2c7186" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.706809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerDied","Data":"5577e422daaa40c7d4a3a4b146deed7f44ec2457dac0430688594f2eba2c7186"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.706835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"8663671b-ef57-49a5-b24b-4cb3385ffc76","Type":"ContainerDied","Data":"651d309731d23d217163f62c311b1286133380d551c3616274dabbbe3072b8ab"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.706846 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651d309731d23d217163f62c311b1286133380d551c3616274dabbbe3072b8ab" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.709455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxqn4\" (UniqueName: \"kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4\") pod \"memcached-0\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.718876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerStarted","Data":"c23bf4e0593f5e8bcacf1cb2436a6278954db7f885655cbf8b8477146c1a2b24"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.733586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"8011bf0a-412f-4bbc-a0d8-12da1a319a3f","Type":"ContainerDied","Data":"401dd115801d4b61cfe50f74d674e6d16df7fa7681d2854c84ba54d632acf884"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.733655 5030 scope.go:117] "RemoveContainer" containerID="a649b6dc8b2b3e30909cde7117460adb9c95953db7108da7fc84fff13889ed6f" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.734006 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.743977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8865fb5-d3c6-4178-9f1f-43e29f24e345" (UID: "a8865fb5-d3c6-4178-9f1f-43e29f24e345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.748966 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.748996 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.751862 5030 generic.go:334] "Generic (PLEG): container finished" podID="48f09696-840e-4a93-87c8-57bff4137d85" containerID="5038cf6a212317ce8623b41306e775f298e2c7a56a2b338c29c6f7840be367fa" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.752089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerDied","Data":"5038cf6a212317ce8623b41306e775f298e2c7a56a2b338c29c6f7840be367fa"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.752282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"48f09696-840e-4a93-87c8-57bff4137d85","Type":"ContainerDied","Data":"a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.752317 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5664ed439f2148de7c5bf170fed005019e2274cf2b4a55e94286225a10be1dd" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.756025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.763382 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"37c2ebda-5e72-41eb-9c8c-1326134768c7","Type":"ContainerDied","Data":"b381ea0333a3d05f431a79b7595daed3dba7f5af1c7a605d46c79bd6f10c58d3"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.763419 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b381ea0333a3d05f431a79b7595daed3dba7f5af1c7a605d46c79bd6f10c58d3" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.763455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" (UID: "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.764144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" (UID: "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.765790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" event={"ID":"d788ece9-38a7-4cd0-bc65-a193ef38a0f1","Type":"ContainerStarted","Data":"21ddf443b33320832914b0ae456ad31ef25c904b08d8a0dbbf76217b1a452c19"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.768746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "a8865fb5-d3c6-4178-9f1f-43e29f24e345" (UID: "a8865fb5-d3c6-4178-9f1f-43e29f24e345"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.769492 5030 generic.go:334] "Generic (PLEG): container finished" podID="e443f370-de90-4c56-ba15-f33683726237" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.769551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e443f370-de90-4c56-ba15-f33683726237","Type":"ContainerDied","Data":"ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.769577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"e443f370-de90-4c56-ba15-f33683726237","Type":"ContainerDied","Data":"2d20e74e91df826a223972321541ae454de238264babe509a656b84bf49f54e8"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.769588 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d20e74e91df826a223972321541ae454de238264babe509a656b84bf49f54e8" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.777453 5030 generic.go:334] "Generic (PLEG): container finished" podID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerID="820679d70ac112437e5c84042ac99cfca7c5a633c0df3f571874cadf6eb168d5" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.777512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerDied","Data":"820679d70ac112437e5c84042ac99cfca7c5a633c0df3f571874cadf6eb168d5"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.777535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"84ff2188-a797-40b5-8749-ee4330e6bf9a","Type":"ContainerDied","Data":"ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.777544 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca011647d8e6be8a4c913b33c0c024544cdcedd1bbca0021d426d30459834d55" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.778608 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" (UID: "aea67ae4-7b1c-4bc1-afe4-eacfc66d0081"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.780013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerStarted","Data":"e77ff7d96b5860e987b608f6b2379ead4e92ea8c144adfd8c3837b5cc2f1f7f4"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.793867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8011bf0a-412f-4bbc-a0d8-12da1a319a3f" (UID: "8011bf0a-412f-4bbc-a0d8-12da1a319a3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.798121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "a8865fb5-d3c6-4178-9f1f-43e29f24e345" (UID: "a8865fb5-d3c6-4178-9f1f-43e29f24e345"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.825443 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerID="e9b169497221304dbe6adee831f056e749c90f520798d74f0635cb2f60436f1d" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.825499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerDied","Data":"e9b169497221304dbe6adee831f056e749c90f520798d74f0635cb2f60436f1d"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.825523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8","Type":"ContainerDied","Data":"468f7ecd23883ebc707e2adea47de8e5894f76a5861f669d19466c6da8e55817"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.825535 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468f7ecd23883ebc707e2adea47de8e5894f76a5861f669d19466c6da8e55817" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.829749 5030 generic.go:334] "Generic (PLEG): container finished" podID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerID="1364f3685e72aa6c1fbe3584228fe8fd80440e7f23b6bca2e91fbdaa8ae36662" exitCode=0 Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.829809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerDied","Data":"1364f3685e72aa6c1fbe3584228fe8fd80440e7f23b6bca2e91fbdaa8ae36662"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.831048 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.836296 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.836678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerStarted","Data":"0e97ee6fd33e202aaa801e530cf385259e03f928d93c86cd176d92e52a6d75c0"} Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850417 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850472 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850483 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850494 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850519 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8011bf0a-412f-4bbc-a0d8-12da1a319a3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:56 crc kubenswrapper[5030]: I0120 23:33:56.850550 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8865fb5-d3c6-4178-9f1f-43e29f24e345-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.014515 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.154342 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6m2v\" (UniqueName: \"kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156844 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.156890 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\" (UID: \"2de7a5fe-85b0-4b7c-af74-ec2c104fac94\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.158842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.158920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.159477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.159735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.185443 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v" (OuterVolumeSpecName: "kube-api-access-d6m2v") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "kube-api-access-d6m2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.204134 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.210095 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.216164 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.229177 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.237216 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261191 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nvst\" (UniqueName: \"kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst\") pod \"37c2ebda-5e72-41eb-9c8c-1326134768c7\" (UID: \"37c2ebda-5e72-41eb-9c8c-1326134768c7\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261984 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.261998 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-config-data-default\") on node \"crc\" 
DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.262008 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.262016 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.262024 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6m2v\" (UniqueName: \"kubernetes.io/projected/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-kube-api-access-d6m2v\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.263871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265643 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265659 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265684 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265691 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265699 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-log" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265719 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-api" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265725 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-api" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265733 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="mysql-bootstrap" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265739 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="mysql-bootstrap" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265751 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" 
containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265757 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265773 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="galera" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265779 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="galera" Jan 20 23:33:57 crc kubenswrapper[5030]: E0120 23:33:57.265789 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265829 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.265994 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266007 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266022 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" containerName="glance-httpd" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266034 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" containerName="galera" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266046 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" containerName="glance-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266055 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-api" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266062 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f09696-840e-4a93-87c8-57bff4137d85" containerName="nova-api-log" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.266721 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.267894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs" (OuterVolumeSpecName: "logs") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.271481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.308364 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.317753 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.318964 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.328777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst" (OuterVolumeSpecName: "kube-api-access-7nvst") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "kube-api-access-7nvst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.331966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.332029 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.336435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts" (OuterVolumeSpecName: "scripts") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.341487 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.343370 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.347184 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.349314 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.353456 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.355729 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mtk9\" (UniqueName: \"kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.368997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4t5j\" (UniqueName: \"kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j\") pod 
\"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle\") pod \"48f09696-840e-4a93-87c8-57bff4137d85\" (UID: \"48f09696-840e-4a93-87c8-57bff4137d85\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts\") pod \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\" (UID: \"2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8\") " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs" (OuterVolumeSpecName: "logs") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.369826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhnzj\" (UniqueName: \"kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370034 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs" (OuterVolumeSpecName: "logs") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370651 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370811 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370880 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.370933 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.371055 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48f09696-840e-4a93-87c8-57bff4137d85-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.371112 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37c2ebda-5e72-41eb-9c8c-1326134768c7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.371172 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.371228 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.371287 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nvst\" (UniqueName: \"kubernetes.io/projected/37c2ebda-5e72-41eb-9c8c-1326134768c7-kube-api-access-7nvst\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.376599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9" (OuterVolumeSpecName: "kube-api-access-6mtk9") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "kube-api-access-6mtk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.382207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.405963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.420905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j" (OuterVolumeSpecName: "kube-api-access-f4t5j") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "kube-api-access-f4t5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.431308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts" (OuterVolumeSpecName: "scripts") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgl2\" (UniqueName: \"kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.472996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhnzj\" (UniqueName: \"kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.473054 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4t5j\" (UniqueName: \"kubernetes.io/projected/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-kube-api-access-f4t5j\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.473069 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.473079 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.473088 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mtk9\" (UniqueName: \"kubernetes.io/projected/48f09696-840e-4a93-87c8-57bff4137d85-kube-api-access-6mtk9\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.473106 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.480275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.480306 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 
23:33:57.504084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhnzj\" (UniqueName: \"kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj\") pod \"nova-scheduler-0\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.574988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgl2\" (UniqueName: \"kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.578752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.578809 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.578841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.578962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.593916 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.594019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgl2\" (UniqueName: \"kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.600258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 
23:33:57.602333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.602359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.664246 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.681884 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.848286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"74b19e4a-b5b1-4c2f-9cf9-01097291798d","Type":"ContainerDied","Data":"9d012a1b3e031d39ce2f719f0c80af63eaa499061275413e5c9cc4abcc05c8fc"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.848327 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d012a1b3e031d39ce2f719f0c80af63eaa499061275413e5c9cc4abcc05c8fc" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.850004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerStarted","Data":"5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.855001 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerStarted","Data":"a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.860123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerStarted","Data":"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.866445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" event={"ID":"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d","Type":"ContainerStarted","Data":"7610bfc12a250b49177a9212a8e1ab21b02766f17a8cb9d0985354440968ab12"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.869221 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.870559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerStarted","Data":"872566d668262ae5e782cbaec5e0b5a5b6cfd95d8d6fa500178bc4186010d399"} Jan 20 
23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.866686 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" podUID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" containerName="keystone-api" containerID="cri-o://7610bfc12a250b49177a9212a8e1ab21b02766f17a8cb9d0985354440968ab12" gracePeriod=30 Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.875682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerStarted","Data":"2a5756d22f4f3a9b513c8862e9d5fd28415db4db8f9387d9f8b00baa64a602f6"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.876886 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker-log" containerID="cri-o://1b7ac12593213a554ea14bfe0f61419d4d62872f58e89bac2f8027ab71aa20eb" gracePeriod=30 Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.877040 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker" containerID="cri-o://2a5756d22f4f3a9b513c8862e9d5fd28415db4db8f9387d9f8b00baa64a602f6" gracePeriod=30 Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.903975 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerStarted","Data":"e2f3539b0d09fd315ddf7e23acf530f47592955f23925972593c99550f0a21fb"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.920904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerStarted","Data":"ddab932bf79386702fccfc6c79a0fd5f50d026b98c269df11b368ec27242c3df"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.921116 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener-log" containerID="cri-o://aa169a139bd24a0b6ddc7e560156c87e40f4eba5bcd5d1dd9974ceb2324a9013" gracePeriod=30 Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.921271 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener" containerID="cri-o://ddab932bf79386702fccfc6c79a0fd5f50d026b98c269df11b368ec27242c3df" gracePeriod=30 Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.925276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerStarted","Data":"201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.941366 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" podStartSLOduration=6.9413494 podStartE2EDuration="6.9413494s" podCreationTimestamp="2026-01-20 
23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:57.904208762 +0000 UTC m=+3510.224469050" watchObservedRunningTime="2026-01-20 23:33:57.9413494 +0000 UTC m=+3510.261609688" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.941758 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.942709 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.942835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"66ad6a39-d03a-4db8-9f47-9c461325cf0b","Type":"ContainerDied","Data":"5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48"} Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.942879 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0e18a59044bdb3e7ffadca1ae055ff36b5b21da9606b63a26da194f66c1a48" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.942959 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.943024 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.943192 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.945050 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" podStartSLOduration=6.9450413 podStartE2EDuration="6.9450413s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:57.920960967 +0000 UTC m=+3510.241221265" watchObservedRunningTime="2026-01-20 23:33:57.9450413 +0000 UTC m=+3510.265301588" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.959262 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" podStartSLOduration=6.959243444 podStartE2EDuration="6.959243444s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:57.9471211 +0000 UTC m=+3510.267381398" watchObservedRunningTime="2026-01-20 23:33:57.959243444 +0000 UTC m=+3510.279503732" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.991216 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc2869a-ba69-4466-aa02-b6fe06035daa" path="/var/lib/kubelet/pods/4dc2869a-ba69-4466-aa02-b6fe06035daa/volumes" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.992852 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8011bf0a-412f-4bbc-a0d8-12da1a319a3f" path="/var/lib/kubelet/pods/8011bf0a-412f-4bbc-a0d8-12da1a319a3f/volumes" Jan 20 23:33:57 crc kubenswrapper[5030]: 
I0120 23:33:57.995807 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8865fb5-d3c6-4178-9f1f-43e29f24e345" path="/var/lib/kubelet/pods/a8865fb5-d3c6-4178-9f1f-43e29f24e345/volumes" Jan 20 23:33:57 crc kubenswrapper[5030]: I0120 23:33:57.998977 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea67ae4-7b1c-4bc1-afe4-eacfc66d0081" path="/var/lib/kubelet/pods/aea67ae4-7b1c-4bc1-afe4-eacfc66d0081/volumes" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.001669 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf91700-edc5-4ecb-a027-b6b904e948e1" path="/var/lib/kubelet/pods/dcf91700-edc5-4ecb-a027-b6b904e948e1/volumes" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.008801 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.042291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.111244 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.136651 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.227375 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.302171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.319230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.322072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data" (OuterVolumeSpecName: "config-data") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.345182 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.345218 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.345240 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.348614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.362661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data" (OuterVolumeSpecName: "config-data") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.376182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.388808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data" (OuterVolumeSpecName: "config-data") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.392960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "2de7a5fe-85b0-4b7c-af74-ec2c104fac94" (UID: "2de7a5fe-85b0-4b7c-af74-ec2c104fac94"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.415139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "37c2ebda-5e72-41eb-9c8c-1326134768c7" (UID: "37c2ebda-5e72-41eb-9c8c-1326134768c7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.450975 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7a5fe-85b0-4b7c-af74-ec2c104fac94-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.451013 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.451023 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.451032 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.451041 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.451049 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c2ebda-5e72-41eb-9c8c-1326134768c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.480912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" (UID: "2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.486294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48f09696-840e-4a93-87c8-57bff4137d85" (UID: "48f09696-840e-4a93-87c8-57bff4137d85"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.553553 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f09696-840e-4a93-87c8-57bff4137d85-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.553596 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.656874 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.681475 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.715916 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.735782 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.736253 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.797771 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.802169 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.816704 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.816736 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.816776 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.816783 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.816811 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.816817 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.816838 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.816845 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.816864 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api-log" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.816871 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api-log" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.817352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api-log" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.817377 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.817397 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.817414 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.817431 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e443f370-de90-4c56-ba15-f33683726237" containerName="nova-cell0-conductor-conductor" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.821988 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.837814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="galera" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.840895 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.867758 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.886278 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.906570 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle\") pod \"e443f370-de90-4c56-ba15-f33683726237\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lt4m\" 
(UniqueName: \"kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.910985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d49zh\" (UniqueName: \"kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh\") pod \"e443f370-de90-4c56-ba15-f33683726237\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle\") pod \"84ff2188-a797-40b5-8749-ee4330e6bf9a\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data\") pod \"e443f370-de90-4c56-ba15-f33683726237\" (UID: \"e443f370-de90-4c56-ba15-f33683726237\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data\") pod \"84ff2188-a797-40b5-8749-ee4330e6bf9a\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqzcd\" (UniqueName: \"kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd\") pod \"84ff2188-a797-40b5-8749-ee4330e6bf9a\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs\") pod \"8663671b-ef57-49a5-b24b-4cb3385ffc76\" (UID: \"8663671b-ef57-49a5-b24b-4cb3385ffc76\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs\") pod \"84ff2188-a797-40b5-8749-ee4330e6bf9a\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.911442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs\") pod \"84ff2188-a797-40b5-8749-ee4330e6bf9a\" (UID: \"84ff2188-a797-40b5-8749-ee4330e6bf9a\") " Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.953492 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.954097 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.958465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs" (OuterVolumeSpecName: "logs") pod "84ff2188-a797-40b5-8749-ee4330e6bf9a" (UID: "84ff2188-a797-40b5-8749-ee4330e6bf9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.958577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.959214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs" (OuterVolumeSpecName: "logs") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.960650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971429 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971832 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="mysql-bootstrap" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971848 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="mysql-bootstrap" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971869 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-central-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971877 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-central-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971893 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="proxy-httpd" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971899 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="proxy-httpd" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971916 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-notification-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971921 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-notification-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971937 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="galera" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" containerName="galera" Jan 20 23:33:58 crc kubenswrapper[5030]: E0120 23:33:58.971952 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="sg-core" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.971957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="sg-core" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.972131 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-central-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.972142 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="ceilometer-notification-agent" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.972155 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="sg-core" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.972170 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" containerName="proxy-httpd" Jan 20 23:33:58 crc kubenswrapper[5030]: I0120 23:33:58.996973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.007461 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.007649 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.007754 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.009149 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.031313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd" (OuterVolumeSpecName: "kube-api-access-mqzcd") pod "84ff2188-a797-40b5-8749-ee4330e6bf9a" (UID: "84ff2188-a797-40b5-8749-ee4330e6bf9a"). InnerVolumeSpecName "kube-api-access-mqzcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.038915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh" (OuterVolumeSpecName: "kube-api-access-d49zh") pod "e443f370-de90-4c56-ba15-f33683726237" (UID: "e443f370-de90-4c56-ba15-f33683726237"). InnerVolumeSpecName "kube-api-access-d49zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.040683 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts" (OuterVolumeSpecName: "scripts") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.042959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.051216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerStarted","Data":"7672969c7149a4ed716e18defd00bfdaebf66124fd33296b0113971e32bea71b"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.054558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m" (OuterVolumeSpecName: "kube-api-access-8lt4m") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "kube-api-access-8lt4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bs57\" (UniqueName: \"kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkpzd\" (UniqueName: \"kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: 
\"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.056996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.057019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.057049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts\") pod \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\" (UID: \"66ad6a39-d03a-4db8-9f47-9c461325cf0b\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.057068 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts\") pod \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\" (UID: \"74b19e4a-b5b1-4c2f-9cf9-01097291798d\") " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.058499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.059139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.062080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.062523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.063955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064018 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvq5\" (UniqueName: \"kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064321 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064401 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064413 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lt4m\" (UniqueName: \"kubernetes.io/projected/8663671b-ef57-49a5-b24b-4cb3385ffc76-kube-api-access-8lt4m\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064426 5030 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d49zh\" (UniqueName: \"kubernetes.io/projected/e443f370-de90-4c56-ba15-f33683726237-kube-api-access-d49zh\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064435 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064444 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064453 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8663671b-ef57-49a5-b24b-4cb3385ffc76-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064462 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8663671b-ef57-49a5-b24b-4cb3385ffc76-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064472 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqzcd\" (UniqueName: \"kubernetes.io/projected/84ff2188-a797-40b5-8749-ee4330e6bf9a-kube-api-access-mqzcd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064481 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84ff2188-a797-40b5-8749-ee4330e6bf9a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064493 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74b19e4a-b5b1-4c2f-9cf9-01097291798d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.064502 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.065842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.066290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.073108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" event={"ID":"d788ece9-38a7-4cd0-bc65-a193ef38a0f1","Type":"ContainerStarted","Data":"20dc72195b475941779d438aac438e68649738b63961b6c84b8477517eefa288"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.073159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.074375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.075426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.076871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0","Type":"ContainerStarted","Data":"4ad92fd7919def846bc376773b3ae36feeaf8411e7fdf8b0ef42055825101089"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.086017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57" (OuterVolumeSpecName: "kube-api-access-2bs57") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "kube-api-access-2bs57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.086541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts" (OuterVolumeSpecName: "scripts") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.094452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerStarted","Data":"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.104615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd" (OuterVolumeSpecName: "kube-api-access-hkpzd") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "kube-api-access-hkpzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.118586 5030 generic.go:334] "Generic (PLEG): container finished" podID="af288905-8d90-4f98-ba70-c7351555851b" containerID="1b7ac12593213a554ea14bfe0f61419d4d62872f58e89bac2f8027ab71aa20eb" exitCode=143 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.118705 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerDied","Data":"1b7ac12593213a554ea14bfe0f61419d4d62872f58e89bac2f8027ab71aa20eb"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.137383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerStarted","Data":"bd36c5c2f114bde7769c6cda08f898ffc29246290b55ddd8e9c6358ddea9bfac"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.137548 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-log" containerID="cri-o://e2f3539b0d09fd315ddf7e23acf530f47592955f23925972593c99550f0a21fb" gracePeriod=30 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.137673 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.137704 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-api" containerID="cri-o://bd36c5c2f114bde7769c6cda08f898ffc29246290b55ddd8e9c6358ddea9bfac" gracePeriod=30 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.137765 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.151170 5030 generic.go:334] "Generic (PLEG): container finished" podID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerID="aa169a139bd24a0b6ddc7e560156c87e40f4eba5bcd5d1dd9974ceb2324a9013" exitCode=143 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.151483 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.152210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerDied","Data":"aa169a139bd24a0b6ddc7e560156c87e40f4eba5bcd5d1dd9974ceb2324a9013"} Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.152288 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.152477 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.152654 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.152836 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:33:59 crc kubenswrapper[5030]: E0120 23:33:59.159391 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:59 crc kubenswrapper[5030]: E0120 23:33:59.163750 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:59 crc kubenswrapper[5030]: E0120 23:33:59.166680 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:33:59 crc kubenswrapper[5030]: E0120 23:33:59.166748 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvq5\" (UniqueName: \"kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169632 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6j6d\" (UniqueName: \"kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.169870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.170826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.173794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.173913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.173989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174291 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174818 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkpzd\" (UniqueName: \"kubernetes.io/projected/66ad6a39-d03a-4db8-9f47-9c461325cf0b-kube-api-access-hkpzd\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174850 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/66ad6a39-d03a-4db8-9f47-9c461325cf0b-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174862 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174872 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66ad6a39-d03a-4db8-9f47-9c461325cf0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174881 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.174894 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bs57\" (UniqueName: \"kubernetes.io/projected/74b19e4a-b5b1-4c2f-9cf9-01097291798d-kube-api-access-2bs57\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.178667 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" podStartSLOduration=6.17864771 podStartE2EDuration="6.17864771s" podCreationTimestamp="2026-01-20 23:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:59.094539805 +0000 UTC m=+3511.414800093" watchObservedRunningTime="2026-01-20 23:33:59.17864771 +0000 UTC m=+3511.498907998" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.189571 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.191587 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=8.191571273 podStartE2EDuration="8.191571273s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:59.116748012 +0000 UTC m=+3511.437008300" watchObservedRunningTime="2026-01-20 23:33:59.191571273 +0000 UTC m=+3511.511831561" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.193481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvq5\" (UniqueName: \"kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.201553 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" podStartSLOduration=8.201538714 podStartE2EDuration="8.201538714s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:33:59.165059081 +0000 UTC m=+3511.485319369" watchObservedRunningTime="2026-01-20 23:33:59.201538714 +0000 UTC m=+3511.521799002" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.214953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.215201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.216410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.217463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.218185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data\") pod \"placement-7cb86f569d-4mmlt\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 
23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.267879 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.276346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.276566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.276717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6j6d\" (UniqueName: \"kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.276838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.276912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.277014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.277125 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.281432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.297538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.298427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.299773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.308939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.312147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6j6d\" (UniqueName: \"kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d\") pod \"nova-api-0\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.329647 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.507579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.613507 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.668374 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.736410 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:33:59 crc kubenswrapper[5030]: W0120 23:33:59.835927 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cb6e0a_d348_4ae5_811d_c3ebb411c0d8.slice/crio-13139228853e1fd0b3b9ef2a3e2e246bbaa705a6eb68d3ef96e9d2141d86cbb3 WatchSource:0}: Error finding container 13139228853e1fd0b3b9ef2a3e2e246bbaa705a6eb68d3ef96e9d2141d86cbb3: Status 404 returned error can't find the container with id 13139228853e1fd0b3b9ef2a3e2e246bbaa705a6eb68d3ef96e9d2141d86cbb3 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.845584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:33:59 crc kubenswrapper[5030]: W0120 23:33:59.858927 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62e3599a_b500_49ac_a3bd_cdc62cd37fba.slice/crio-a7a6837dc5241c3984d4126c7c7fc2763143aef1696320aa72598ebe24d62cb7 WatchSource:0}: Error finding container a7a6837dc5241c3984d4126c7c7fc2763143aef1696320aa72598ebe24d62cb7: Status 404 returned error can't find the container with id a7a6837dc5241c3984d4126c7c7fc2763143aef1696320aa72598ebe24d62cb7 Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.902268 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:33:59 crc kubenswrapper[5030]: I0120 23:33:59.970464 5030 scope.go:117] "RemoveContainer" containerID="e36e0f7cd8355c63cb64f84f0eb05159bf294b5e0ff81a35379e17c111fa95ef" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.012219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.041332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data" (OuterVolumeSpecName: "config-data") pod "84ff2188-a797-40b5-8749-ee4330e6bf9a" (UID: "84ff2188-a797-40b5-8749-ee4330e6bf9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.043330 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f09696-840e-4a93-87c8-57bff4137d85" path="/var/lib/kubelet/pods/48f09696-840e-4a93-87c8-57bff4137d85/volumes" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.109513 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.109542 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.152827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.223449 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.243695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e443f370-de90-4c56-ba15-f33683726237" (UID: "e443f370-de90-4c56-ba15-f33683726237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.248525 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.249219 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.249257 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.251947 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" podStartSLOduration=7.251924761 podStartE2EDuration="7.251924761s" podCreationTimestamp="2026-01-20 23:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.194747906 +0000 UTC m=+3512.515008194" watchObservedRunningTime="2026-01-20 23:34:00.251924761 +0000 UTC m=+3512.572185059" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.256593 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api-log" containerID="cri-o://4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.256704 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api" containerID="cri-o://761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.285064 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerID="bd36c5c2f114bde7769c6cda08f898ffc29246290b55ddd8e9c6358ddea9bfac" exitCode=0 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.285098 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerID="e2f3539b0d09fd315ddf7e23acf530f47592955f23925972593c99550f0a21fb" exitCode=143 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.293392 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-log" containerID="cri-o://7672969c7149a4ed716e18defd00bfdaebf66124fd33296b0113971e32bea71b" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.293546 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-api" containerID="cri-o://f2c210a76ba1350e345c553a28c3e3eb934894ee48a553eb12bcb7de9a8f8525" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.315495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84ff2188-a797-40b5-8749-ee4330e6bf9a" (UID: "84ff2188-a797-40b5-8749-ee4330e6bf9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.326374 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-676d666f78-55975" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-api" containerID="cri-o://5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.326688 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-676d666f78-55975" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-httpd" containerID="cri-o://98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.329017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data" (OuterVolumeSpecName: "config-data") pod "e443f370-de90-4c56-ba15-f33683726237" (UID: "e443f370-de90-4c56-ba15-f33683726237"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.334032 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podStartSLOduration=7.334021777 podStartE2EDuration="7.334021777s" podCreationTimestamp="2026-01-20 23:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.242907942 +0000 UTC m=+3512.563168240" watchObservedRunningTime="2026-01-20 23:34:00.334021777 +0000 UTC m=+3512.654282065" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.346432 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" podStartSLOduration=7.346419007 podStartE2EDuration="7.346419007s" podCreationTimestamp="2026-01-20 23:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.269194778 +0000 UTC m=+3512.589455076" watchObservedRunningTime="2026-01-20 23:34:00.346419007 +0000 UTC m=+3512.666679295" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.352966 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.353001 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e443f370-de90-4c56-ba15-f33683726237-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.430920 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" podStartSLOduration=9.430900932 podStartE2EDuration="9.430900932s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.301042779 +0000 UTC m=+3512.621303077" watchObservedRunningTime="2026-01-20 23:34:00.430900932 +0000 UTC m=+3512.751161220" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.501904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "84ff2188-a797-40b5-8749-ee4330e6bf9a" (UID: "84ff2188-a797-40b5-8749-ee4330e6bf9a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.522427 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" podStartSLOduration=7.522404128 podStartE2EDuration="7.522404128s" podCreationTimestamp="2026-01-20 23:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.329259602 +0000 UTC m=+3512.649519900" watchObservedRunningTime="2026-01-20 23:34:00.522404128 +0000 UTC m=+3512.842664416" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.523341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.560379 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.560407 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ff2188-a797-40b5-8749-ee4330e6bf9a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.646742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "66ad6a39-d03a-4db8-9f47-9c461325cf0b" (UID: "66ad6a39-d03a-4db8-9f47-9c461325cf0b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.653075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data" (OuterVolumeSpecName: "config-data") pod "8663671b-ef57-49a5-b24b-4cb3385ffc76" (UID: "8663671b-ef57-49a5-b24b-4cb3385ffc76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.661225 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ad6a39-d03a-4db8-9f47-9c461325cf0b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.661250 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8663671b-ef57-49a5-b24b-4cb3385ffc76-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.673842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.734799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.741256 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-676d666f78-55975" podStartSLOduration=9.741239324 podStartE2EDuration="9.741239324s" podCreationTimestamp="2026-01-20 23:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.381161199 +0000 UTC m=+3512.701421487" watchObservedRunningTime="2026-01-20 23:34:00.741239324 +0000 UTC m=+3513.061499612" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.762896 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.762927 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.766823 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" podStartSLOduration=8.766806513 podStartE2EDuration="8.766806513s" podCreationTimestamp="2026-01-20 23:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:00.426859415 +0000 UTC m=+3512.747119703" watchObservedRunningTime="2026-01-20 23:34:00.766806513 +0000 UTC m=+3513.087066801" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.794431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data" (OuterVolumeSpecName: "config-data") pod "74b19e4a-b5b1-4c2f-9cf9-01097291798d" (UID: "74b19e4a-b5b1-4c2f-9cf9-01097291798d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.864781 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74b19e4a-b5b1-4c2f-9cf9-01097291798d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerStarted","Data":"acc6b39f58731022e9895707a3b85947e579adfd11aa69a1924f699081e64377"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerStarted","Data":"13139228853e1fd0b3b9ef2a3e2e246bbaa705a6eb68d3ef96e9d2141d86cbb3"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903769 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62e3599a-b500-49ac-a3bd-cdc62cd37fba","Type":"ContainerStarted","Data":"a7a6837dc5241c3984d4126c7c7fc2763143aef1696320aa72598ebe24d62cb7"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903780 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerStarted","Data":"3073a848a12d66bdc5d1ca7e95e2a440b8209ff9cae9549ac13f372bee8452dd"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903792 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903801 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf","Type":"ContainerStarted","Data":"761c37bf734cc250d9a7a79cc9d6abd457e8c00b3d6f3c264908f33d77e8f9eb"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerStarted","Data":"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerStarted","Data":"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903845 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903858 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903868 
5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903882 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fc08df3d-397a-42ff-ab96-622f2dc6e492","Type":"ContainerStarted","Data":"2cbcc3680fcca00921c7e0f493ef34108c21360ddf810ce3b1e05e25e124674a"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerDied","Data":"bd36c5c2f114bde7769c6cda08f898ffc29246290b55ddd8e9c6358ddea9bfac"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerDied","Data":"e2f3539b0d09fd315ddf7e23acf530f47592955f23925972593c99550f0a21fb"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903924 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903934 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerStarted","Data":"f2c210a76ba1350e345c553a28c3e3eb934894ee48a553eb12bcb7de9a8f8525"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903964 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903975 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.903986 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.905575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerStarted","Data":"98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.905597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerStarted","Data":"4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0"} Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.905610 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.905649 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 
23:34:00.905817 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker-log" containerID="cri-o://da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.908379 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.908665 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener-log" containerID="cri-o://3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.908852 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener" containerID="cri-o://fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea" gracePeriod=30 Jan 20 23:34:00 crc kubenswrapper[5030]: I0120 23:34:00.910111 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker" containerID="cri-o://4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303" gracePeriod=30 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.940738 5030 scope.go:117] "RemoveContainer" containerID="aafac8517276cb2773a480452566bd4bb81c0a6d528488652d12ca4598c33812" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtsm\" (UniqueName: \"kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 
23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968274 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:00.968358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtsm\" (UniqueName: \"kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " 
pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.070983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.071580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.095937 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.096505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.102768 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.109927 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.217:8776/healthcheck\": dial tcp 10.217.1.217:8776: i/o timeout" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.111152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtsm\" (UniqueName: \"kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.111551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.112326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle\") pod \"placement-66d4dcc7c4-pckll\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:01 crc kubenswrapper[5030]: 
I0120 23:34:01.410737 5030 generic.go:334] "Generic (PLEG): container finished" podID="6153d63b-e003-4951-b9f1-a27d11979663" containerID="61f13c629eb50d8806b59af161a0e2ebd559836747a102e58d63db7089892566" exitCode=0 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.410815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerDied","Data":"61f13c629eb50d8806b59af161a0e2ebd559836747a102e58d63db7089892566"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.422310 5030 generic.go:334] "Generic (PLEG): container finished" podID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerID="4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4" exitCode=143 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.422353 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerDied","Data":"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.424906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerStarted","Data":"ba1b7cf1041077927df282c8dace10d519450b0dfb4303c6f186843a3559be29"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.460464 5030 generic.go:334] "Generic (PLEG): container finished" podID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerID="3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75" exitCode=143 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.460797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerDied","Data":"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.463684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" event={"ID":"3e8ed654-59ee-4055-bb97-7d079e100af7","Type":"ContainerDied","Data":"b5f348a0227bc5cb6c1679c20e34ff57444d3d5f79ad0466f1e16ac007e64197"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.463737 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f348a0227bc5cb6c1679c20e34ff57444d3d5f79ad0466f1e16ac007e64197" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465757 5030 generic.go:334] "Generic (PLEG): container finished" podID="24bdc108-6bee-469c-88da-d064ef3b2381" containerID="f2c210a76ba1350e345c553a28c3e3eb934894ee48a553eb12bcb7de9a8f8525" exitCode=0 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465778 5030 generic.go:334] "Generic (PLEG): container finished" podID="24bdc108-6bee-469c-88da-d064ef3b2381" containerID="7672969c7149a4ed716e18defd00bfdaebf66124fd33296b0113971e32bea71b" exitCode=143 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerDied","Data":"f2c210a76ba1350e345c553a28c3e3eb934894ee48a553eb12bcb7de9a8f8525"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerDied","Data":"7672969c7149a4ed716e18defd00bfdaebf66124fd33296b0113971e32bea71b"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" event={"ID":"24bdc108-6bee-469c-88da-d064ef3b2381","Type":"ContainerDied","Data":"e77ff7d96b5860e987b608f6b2379ead4e92ea8c144adfd8c3837b5cc2f1f7f4"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.465991 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e77ff7d96b5860e987b608f6b2379ead4e92ea8c144adfd8c3837b5cc2f1f7f4" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.468332 5030 generic.go:334] "Generic (PLEG): container finished" podID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerID="da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77" exitCode=143 Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.468407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerDied","Data":"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.471707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerStarted","Data":"b55591f7f581673e6196c5fcab36a06878621f4bb7f255642f69739d3102cf41"} Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.514823 5030 scope.go:117] "RemoveContainer" containerID="a797b28e8252946d1e122993d906dd040128dd30683045cbacafc029e5f91773" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.576517 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.590787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.590868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.590897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.590928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.590957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.591002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.591029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lshx\" (UniqueName: \"kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx\") pod \"3e8ed654-59ee-4055-bb97-7d079e100af7\" (UID: \"3e8ed654-59ee-4055-bb97-7d079e100af7\") " Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.622726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs" (OuterVolumeSpecName: "logs") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.631933 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts" (OuterVolumeSpecName: "scripts") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.635588 5030 scope.go:117] "RemoveContainer" containerID="35642e491b70667a14dd98edd5926927ca38efc248b37af045f7466d342c728b" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.699113 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.699143 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e8ed654-59ee-4055-bb97-7d079e100af7-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.717802 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.742904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx" (OuterVolumeSpecName: "kube-api-access-7lshx") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "kube-api-access-7lshx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.760878 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.786680 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.802910 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lshx\" (UniqueName: \"kubernetes.io/projected/3e8ed654-59ee-4055-bb97-7d079e100af7-kube-api-access-7lshx\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.831714 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.832946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.887740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data" (OuterVolumeSpecName: "config-data") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.892592 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: E0120 23:34:01.892974 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-api" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.892989 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-api" Jan 20 23:34:01 crc kubenswrapper[5030]: E0120 23:34:01.893003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-log" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.893009 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-log" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.893179 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-api" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.893198 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" containerName="placement-log" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.894155 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.897035 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.902369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-5fq4w" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.902595 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.902770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.904507 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.904534 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.912262 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.928578 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.930005 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.933142 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.936565 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.982739 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ad6a39-d03a-4db8-9f47-9c461325cf0b" path="/var/lib/kubelet/pods/66ad6a39-d03a-4db8-9f47-9c461325cf0b/volumes" Jan 20 23:34:01 crc kubenswrapper[5030]: I0120 23:34:01.983314 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e443f370-de90-4c56-ba15-f33683726237" path="/var/lib/kubelet/pods/e443f370-de90-4c56-ba15-f33683726237/volumes" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.000395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005931 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.005985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lbl5q\" (UniqueName: \"kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.006013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.006072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.006118 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.012578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e8ed654-59ee-4055-bb97-7d079e100af7" (UID: "3e8ed654-59ee-4055-bb97-7d079e100af7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.069988 5030 scope.go:117] "RemoveContainer" containerID="14499d52e239bfb767a6c19e58c3b9a73416281ee10917e9bef5a62a3598d5b0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.107894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108141 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxl5\" (UniqueName: \"kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc 
kubenswrapper[5030]: I0120 23:34:02.108177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbl5q\" (UniqueName: \"kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.108464 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e8ed654-59ee-4055-bb97-7d079e100af7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.110137 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.113097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.113685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.118756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.121605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.138163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.143004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbl5q\" (UniqueName: \"kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.152247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.159930 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.160435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.195197 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.210772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.210828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sxl5\" (UniqueName: \"kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.210856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.214105 5030 scope.go:117] "RemoveContainer" containerID="1fbf13e88e2f10e605533393bd27956cf73249bb24c679cb4a83c66136518a75" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.229268 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.245033 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.251194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.251526 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.251683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxl5\" (UniqueName: \"kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5\") pod \"nova-cell0-conductor-0\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.255550 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.258674 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.267174 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.272277 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: E0120 23:34:02.272734 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-log" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.272751 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-log" Jan 20 23:34:02 crc kubenswrapper[5030]: E0120 23:34:02.272775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="cinder-scheduler" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.272783 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="cinder-scheduler" Jan 20 23:34:02 crc kubenswrapper[5030]: E0120 23:34:02.272796 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="probe" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.272803 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="probe" Jan 20 23:34:02 crc kubenswrapper[5030]: E0120 23:34:02.272821 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-api" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.272827 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-api" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.273002 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="probe" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.273026 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6153d63b-e003-4951-b9f1-a27d11979663" containerName="cinder-scheduler" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.273035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-api" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.273046 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" containerName="placement-log" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.273995 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.276458 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.276768 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.278898 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311569 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnvp7\" (UniqueName: \"kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.311970 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312038 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts\") pod \"6153d63b-e003-4951-b9f1-a27d11979663\" (UID: \"6153d63b-e003-4951-b9f1-a27d11979663\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2skqw\" (UniqueName: \"kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw\") pod \"24bdc108-6bee-469c-88da-d064ef3b2381\" (UID: \"24bdc108-6bee-469c-88da-d064ef3b2381\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.312914 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6153d63b-e003-4951-b9f1-a27d11979663-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.314076 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.315881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs" (OuterVolumeSpecName: "logs") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.343718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw" (OuterVolumeSpecName: "kube-api-access-2skqw") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "kube-api-access-2skqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.344403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.358722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts" (OuterVolumeSpecName: "scripts") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.361399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts" (OuterVolumeSpecName: "scripts") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.361652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7" (OuterVolumeSpecName: "kube-api-access-lnvp7") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "kube-api-access-lnvp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418678 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4gz5\" (UniqueName: \"kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5\") pod \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\" (UID: \"49a8a7d0-ed33-43a6-a11d-c544f55be6a5\") " Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418834 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs" (OuterVolumeSpecName: "logs") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.418977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhq6\" (UniqueName: \"kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419364 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2skqw\" (UniqueName: \"kubernetes.io/projected/24bdc108-6bee-469c-88da-d064ef3b2381-kube-api-access-2skqw\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419375 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnvp7\" (UniqueName: \"kubernetes.io/projected/6153d63b-e003-4951-b9f1-a27d11979663-kube-api-access-lnvp7\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419385 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24bdc108-6bee-469c-88da-d064ef3b2381-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419394 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419404 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419414 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.419423 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.463204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5" (OuterVolumeSpecName: "kube-api-access-d4gz5") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "kube-api-access-d4gz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.463857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.521807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.521886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhq6\" (UniqueName: \"kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.521912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.521934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.521952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522144 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4gz5\" (UniqueName: 
\"kubernetes.io/projected/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-kube-api-access-d4gz5\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.522158 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.528973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.531979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.536823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.548109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.549520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.563637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.569012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.576122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.586327 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7b274deb-e830-47a2-916c-3c10670c08ac/ovn-northd/0.log" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.586373 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="7b274deb-e830-47a2-916c-3c10670c08ac" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" exitCode=139 Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.586451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerDied","Data":"59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.595270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhq6\" (UniqueName: \"kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6\") pod \"cinder-api-0\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.602529 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.628689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62e3599a-b500-49ac-a3bd-cdc62cd37fba","Type":"ContainerStarted","Data":"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.649740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.680821 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.681361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerStarted","Data":"0a8376f95d2df43c0daf95c4b85f7a34edcb659b644e3d8e243b73c2e1a7e0bf"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.681438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"6153d63b-e003-4951-b9f1-a27d11979663","Type":"ContainerDied","Data":"47daa988a3c5ccca1761a0cec44bd26b3e43c7826edfa50d76aad6620005d357"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.681486 5030 scope.go:117] "RemoveContainer" containerID="24f4bfb73431d6bcf7cc305370aec071e7e270a5776bad4e062eabb3f4ff4ed1" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.705432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf","Type":"ContainerStarted","Data":"d5e32ad78f87773cab221c54225e04fd0ff72e850437ba7b5b0074e225f96e54"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.705642 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.715660 5030 generic.go:334] "Generic (PLEG): container finished" podID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerID="761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183" exitCode=0 Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.715757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerDied","Data":"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.715805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" event={"ID":"49a8a7d0-ed33-43a6-a11d-c544f55be6a5","Type":"ContainerDied","Data":"7dc8386abdbcaed1bea62f94581045b43de423bc42d2ef1a13578add2fac1da0"} Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.715923 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-548bcb5f76-7c959" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.716342 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=5.716320913 podStartE2EDuration="5.716320913s" podCreationTimestamp="2026-01-20 23:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:02.649600558 +0000 UTC m=+3514.969860846" watchObservedRunningTime="2026-01-20 23:34:02.716320913 +0000 UTC m=+3515.036581201" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.719642 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.720016 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7f67b969c8-d246l" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.766456 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.797577 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.806769 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.808047 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=5.808027733 podStartE2EDuration="5.808027733s" podCreationTimestamp="2026-01-20 23:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:02.708961904 +0000 UTC m=+3515.029222212" watchObservedRunningTime="2026-01-20 23:34:02.808027733 +0000 UTC m=+3515.128288021" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.837495 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=6.837475885 podStartE2EDuration="6.837475885s" podCreationTimestamp="2026-01-20 23:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:02.727170425 +0000 UTC m=+3515.047430713" watchObservedRunningTime="2026-01-20 23:34:02.837475885 +0000 UTC m=+3515.157736173" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.859354 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.866024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.869227 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:02 crc kubenswrapper[5030]: I0120 23:34:02.870549 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.074465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.077992 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.089340 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.099906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.102268 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7f67b969c8-d246l"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.114879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.116762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data" (OuterVolumeSpecName: "config-data") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.144163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data" (OuterVolumeSpecName: "config-data") pod "49a8a7d0-ed33-43a6-a11d-c544f55be6a5" (UID: "49a8a7d0-ed33-43a6-a11d-c544f55be6a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.164663 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "24bdc108-6bee-469c-88da-d064ef3b2381" (UID: "24bdc108-6bee-469c-88da-d064ef3b2381"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.174642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data" (OuterVolumeSpecName: "config-data") pod "6153d63b-e003-4951-b9f1-a27d11979663" (UID: "6153d63b-e003-4951-b9f1-a27d11979663"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182506 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182765 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6153d63b-e003-4951-b9f1-a27d11979663-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182779 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182791 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182803 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49a8a7d0-ed33-43a6-a11d-c544f55be6a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.182816 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24bdc108-6bee-469c-88da-d064ef3b2381-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.195509 5030 scope.go:117] "RemoveContainer" containerID="61f13c629eb50d8806b59af161a0e2ebd559836747a102e58d63db7089892566" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.239742 5030 scope.go:117] "RemoveContainer" containerID="761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.306355 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.375116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: W0120 23:34:03.379911 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c966f9_0e90_4a5d_9e43_81ec4980cefc.slice/crio-62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00 WatchSource:0}: Error finding container 
62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00: Status 404 returned error can't find the container with id 62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00 Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.413203 5030 scope.go:117] "RemoveContainer" containerID="4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.416230 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.427682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.441144 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: E0120 23:34:03.441544 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api-log" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.441556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api-log" Jan 20 23:34:03 crc kubenswrapper[5030]: E0120 23:34:03.441582 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.441588 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.441835 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.441850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" containerName="barbican-api-log" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.442850 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.449979 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.450402 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.463852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.465996 5030 scope.go:117] "RemoveContainer" containerID="761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183" Jan 20 23:34:03 crc kubenswrapper[5030]: E0120 23:34:03.473689 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183\": container with ID starting with 761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183 not found: ID does not exist" containerID="761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.473741 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183"} err="failed to get container status \"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183\": rpc error: code = NotFound desc = could not find container \"761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183\": container with ID starting with 761d40e911e20f49a30762ff6791f0b0ae3ae094eea88367552a54962873e183 not found: ID does not exist" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.473765 5030 scope.go:117] "RemoveContainer" containerID="4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4" Jan 20 23:34:03 crc kubenswrapper[5030]: E0120 23:34:03.476386 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4\": container with ID starting with 4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4 not found: ID does not exist" containerID="4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.476435 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4"} err="failed to get container status \"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4\": rpc error: code = NotFound desc = could not find container \"4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4\": container with ID starting with 4dfae6ef76bc32ff34131ca4dcbf419f695a22d4c5a49864bd669e4ed93b98e4 not found: ID does not exist" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.486127 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5f5fbc99cd-g6d5w"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.495174 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.505482 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/barbican-api-548bcb5f76-7c959"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.506934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.507034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7zd\" (UniqueName: \"kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.507067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.507094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.507146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.507189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.576900 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.577957 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.578006 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.44:8775/\": dial tcp 10.217.0.44:8775: i/o timeout" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.611914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.611978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.612078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.612137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7zd\" (UniqueName: \"kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.612160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.612184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.613368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.617633 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.620289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.620858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.628077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.634179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7zd\" (UniqueName: \"kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd\") pod \"cinder-scheduler-0\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.777799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerStarted","Data":"b25711156d4469258b400fabb5bdbe3ad378d45aab78c771c47ee90613bc46b2"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.777942 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.778718 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-log" containerID="cri-o://0a8376f95d2df43c0daf95c4b85f7a34edcb659b644e3d8e243b73c2e1a7e0bf" gracePeriod=30 Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.779351 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.779451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.779535 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-api" containerID="cri-o://b25711156d4469258b400fabb5bdbe3ad378d45aab78c771c47ee90613bc46b2" gracePeriod=30 Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.787322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerStarted","Data":"9eefb33856a9e5a03ba65a238e5121aef46912d0197436eddf25525a753d96d1"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.802687 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.810863 5030 generic.go:334] "Generic (PLEG): container finished" podID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerID="fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea" exitCode=0 Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.810924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerDied","Data":"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.810951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" event={"ID":"ada482ec-e4e4-4484-9dcf-f57ee6186883","Type":"ContainerDied","Data":"8b8128a3668934c19fbac0e0ca235b70c80be6230b83cb5d307540df05d8ebb6"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.810966 5030 scope.go:117] "RemoveContainer" containerID="fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.811085 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.825761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fc08df3d-397a-42ff-ab96-622f2dc6e492","Type":"ContainerStarted","Data":"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.827005 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7b274deb-e830-47a2-916c-3c10670c08ac/ovn-northd/0.log" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.827065 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.847382 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.847697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerStarted","Data":"e1bab47349c35f6aa385252a8ecdb111200aae28f3c39202af37cf667e9c4edb"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.847725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerStarted","Data":"1367bed5839e543dc91ccaa45f346a8078b98871533bbd6dbcd36fea44e6682e"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.875607 5030 scope.go:117] "RemoveContainer" containerID="3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.881716 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_7b274deb-e830-47a2-916c-3c10670c08ac/ovn-northd/0.log" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.881781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"7b274deb-e830-47a2-916c-3c10670c08ac","Type":"ContainerDied","Data":"962a6cfb18b7fa8e5ad9dcf337905f3245c7a5968d387512e6db9b76e1dd3a2b"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.881841 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.888241 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.904323 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" podStartSLOduration=5.904305079 podStartE2EDuration="5.904305079s" podCreationTimestamp="2026-01-20 23:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:03.864019004 +0000 UTC m=+3516.184279292" watchObservedRunningTime="2026-01-20 23:34:03.904305079 +0000 UTC m=+3516.224565367" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle\") pod \"ada482ec-e4e4-4484-9dcf-f57ee6186883\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919307 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data\") pod \"ada482ec-e4e4-4484-9dcf-f57ee6186883\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs\") pod \"ada482ec-e4e4-4484-9dcf-f57ee6186883\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6lt2w\" (UniqueName: \"kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w\") pod \"ada482ec-e4e4-4484-9dcf-f57ee6186883\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom\") pod \"ada482ec-e4e4-4484-9dcf-f57ee6186883\" (UID: \"ada482ec-e4e4-4484-9dcf-f57ee6186883\") " Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.919955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs" (OuterVolumeSpecName: "logs") pod "ada482ec-e4e4-4484-9dcf-f57ee6186883" (UID: "ada482ec-e4e4-4484-9dcf-f57ee6186883"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.924077 5030 generic.go:334] "Generic (PLEG): container finished" podID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerID="4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303" exitCode=0 Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.924136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerDied","Data":"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.924156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" event={"ID":"cab5d494-17b4-4c5d-babf-e74a3deb090f","Type":"ContainerDied","Data":"a2f1a7881e425dd7a9753c2d65901ac704440ad9d401de953b810127bc0896a5"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.924209 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn" Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.928945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerStarted","Data":"62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00"} Jan 20 23:34:03 crc kubenswrapper[5030]: I0120 23:34:03.953880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ada482ec-e4e4-4484-9dcf-f57ee6186883" (UID: "ada482ec-e4e4-4484-9dcf-f57ee6186883"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.003846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w" (OuterVolumeSpecName: "kube-api-access-6lt2w") pod "ada482ec-e4e4-4484-9dcf-f57ee6186883" (UID: "ada482ec-e4e4-4484-9dcf-f57ee6186883"). InnerVolumeSpecName "kube-api-access-6lt2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.003872 5030 scope.go:117] "RemoveContainer" containerID="fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.013535 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24bdc108-6bee-469c-88da-d064ef3b2381" path="/var/lib/kubelet/pods/24bdc108-6bee-469c-88da-d064ef3b2381/volumes" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.017497 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8ed654-59ee-4055-bb97-7d079e100af7" path="/var/lib/kubelet/pods/3e8ed654-59ee-4055-bb97-7d079e100af7/volumes" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.018886 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a8a7d0-ed33-43a6-a11d-c544f55be6a5" path="/var/lib/kubelet/pods/49a8a7d0-ed33-43a6-a11d-c544f55be6a5/volumes" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.019730 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6153d63b-e003-4951-b9f1-a27d11979663" path="/var/lib/kubelet/pods/6153d63b-e003-4951-b9f1-a27d11979663/volumes" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.028950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56pl2\" (UniqueName: \"kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2\") pod \"cab5d494-17b4-4c5d-babf-e74a3deb090f\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle\") pod \"cab5d494-17b4-4c5d-babf-e74a3deb090f\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029333 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data\") pod \"cab5d494-17b4-4c5d-babf-e74a3deb090f\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom\") pod \"cab5d494-17b4-4c5d-babf-e74a3deb090f\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbtbw\" (UniqueName: \"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw\") pod \"7b274deb-e830-47a2-916c-3c10670c08ac\" (UID: \"7b274deb-e830-47a2-916c-3c10670c08ac\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.029504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs\") pod \"cab5d494-17b4-4c5d-babf-e74a3deb090f\" (UID: \"cab5d494-17b4-4c5d-babf-e74a3deb090f\") " Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.061249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8663671b-ef57-49a5-b24b-4cb3385ffc76" path="/var/lib/kubelet/pods/8663671b-ef57-49a5-b24b-4cb3385ffc76/volumes" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.062873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts" (OuterVolumeSpecName: "scripts") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.077596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs" (OuterVolumeSpecName: "logs") pod "cab5d494-17b4-4c5d-babf-e74a3deb090f" (UID: "cab5d494-17b4-4c5d-babf-e74a3deb090f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.076799 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea\": container with ID starting with fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea not found: ID does not exist" containerID="fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.099366 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea"} err="failed to get container status \"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea\": rpc error: code = NotFound desc = could not find container \"fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea\": container with ID starting with fe5492f77d34acfeec74d4db9a01fa04e6cc8dcc5bcd07f35cdcc0d1d43df7ea not found: ID does not exist" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.099391 5030 scope.go:117] "RemoveContainer" containerID="3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.066761 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada482ec-e4e4-4484-9dcf-f57ee6186883-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.099524 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lt2w\" (UniqueName: \"kubernetes.io/projected/ada482ec-e4e4-4484-9dcf-f57ee6186883-kube-api-access-6lt2w\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.099537 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.100491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config" (OuterVolumeSpecName: "config") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.101681 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75\": container with ID starting with 3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75 not found: ID does not exist" containerID="3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.101726 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75"} err="failed to get container status \"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75\": rpc error: code = NotFound desc = could not find container \"3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75\": container with ID starting with 3b5c27ad1243d28a1b12fb9293783374d79cbf506d43fe581f1deb5fb186bb75 not found: ID does not exist" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.101755 5030 scope.go:117] "RemoveContainer" containerID="df6cb8bdfcb5d92a6cd5e2e793e2b41a4b1b3fe7648144b68448df9085f79bf1" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.102169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124227 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerStarted","Data":"305ab9810000c0d4e7831bbb87809b2f75bb48931479151aad674ac887230e4b"} Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerStarted","Data":"b494db42719753d2ddab15c12638a8470632b47ca6ab1e3ea4f6027688bba6d7"} Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124311 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerStarted","Data":"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c"} Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerStarted","Data":"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86"} Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.124686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw" (OuterVolumeSpecName: "kube-api-access-jbtbw") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "kube-api-access-jbtbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.139882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerStarted","Data":"ea5bef0367f5c6e6f5f70d506f2c6ada8260a08aca5e604b68d36f92efd5d55a"} Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.160903 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=6.16088524 podStartE2EDuration="6.16088524s" podCreationTimestamp="2026-01-20 23:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:03.940487025 +0000 UTC m=+3516.260747313" watchObservedRunningTime="2026-01-20 23:34:04.16088524 +0000 UTC m=+3516.481145518" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.171805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cab5d494-17b4-4c5d-babf-e74a3deb090f" (UID: "cab5d494-17b4-4c5d-babf-e74a3deb090f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.192011 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2" (OuterVolumeSpecName: "kube-api-access-56pl2") pod "cab5d494-17b4-4c5d-babf-e74a3deb090f" (UID: "cab5d494-17b4-4c5d-babf-e74a3deb090f"). InnerVolumeSpecName "kube-api-access-56pl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.195594 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.195573779 podStartE2EDuration="3.195573779s" podCreationTimestamp="2026-01-20 23:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:04.125740889 +0000 UTC m=+3516.446001177" watchObservedRunningTime="2026-01-20 23:34:04.195573779 +0000 UTC m=+3516.515834067" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202800 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202830 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b274deb-e830-47a2-916c-3c10670c08ac-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202844 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202858 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbtbw\" (UniqueName: \"kubernetes.io/projected/7b274deb-e830-47a2-916c-3c10670c08ac-kube-api-access-jbtbw\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202867 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cab5d494-17b4-4c5d-babf-e74a3deb090f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202875 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.202887 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56pl2\" (UniqueName: \"kubernetes.io/projected/cab5d494-17b4-4c5d-babf-e74a3deb090f-kube-api-access-56pl2\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.206054 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=8.206037363 podStartE2EDuration="8.206037363s" podCreationTimestamp="2026-01-20 23:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:04.155341355 +0000 UTC m=+3516.475601643" watchObservedRunningTime="2026-01-20 23:34:04.206037363 +0000 UTC m=+3516.526297651" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.220084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada482ec-e4e4-4484-9dcf-f57ee6186883" (UID: "ada482ec-e4e4-4484-9dcf-f57ee6186883"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.229996 5030 scope.go:117] "RemoveContainer" containerID="59d37188df8105f7a6563229d86d8c1078a8f47264c721f27153dbfc6bbaa4e2" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.307983 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.337774 5030 scope.go:117] "RemoveContainer" containerID="4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.373791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cab5d494-17b4-4c5d-babf-e74a3deb090f" (UID: "cab5d494-17b4-4c5d-babf-e74a3deb090f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.409423 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.433085 5030 scope.go:117] "RemoveContainer" containerID="da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.454641 5030 scope.go:117] "RemoveContainer" containerID="4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.456405 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303\": container with ID starting with 4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303 not found: ID does not exist" containerID="4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.456452 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303"} err="failed to get container status \"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303\": rpc error: code = NotFound desc = could not find container \"4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303\": container with ID starting with 4429058b871189e87dd7f8f69716b448dc7619a6b516fe76a0b7321f6197a303 not found: ID does not exist" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.456488 5030 scope.go:117] "RemoveContainer" containerID="da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.457385 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77\": container with ID starting with da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77 not found: ID does not exist" containerID="da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.457411 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77"} err="failed to get container status \"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77\": rpc error: code = NotFound desc = could not find container \"da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77\": container with ID starting with da1b2fae7bed4ea45499659585d573132a7fc0d2dbe9cb24247436e14e86bb77 not found: ID does not exist" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.473750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.511491 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.525705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data" (OuterVolumeSpecName: "config-data") pod "ada482ec-e4e4-4484-9dcf-f57ee6186883" (UID: "ada482ec-e4e4-4484-9dcf-f57ee6186883"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.548580 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data" (OuterVolumeSpecName: "config-data") pod "cab5d494-17b4-4c5d-babf-e74a3deb090f" (UID: "cab5d494-17b4-4c5d-babf-e74a3deb090f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.581727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.612895 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.612933 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cab5d494-17b4-4c5d-babf-e74a3deb090f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.612942 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada482ec-e4e4-4484-9dcf-f57ee6186883-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.613809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.617825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7b274deb-e830-47a2-916c-3c10670c08ac" (UID: "7b274deb-e830-47a2-916c-3c10670c08ac"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.714878 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b274deb-e830-47a2-916c-3c10670c08ac-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.743929 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.806402 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.837239 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-6f8d945766-px9fh"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.859700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.888664 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.918803 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919223 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919244 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919250 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" 
containerName="barbican-worker" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919260 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener-log" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919267 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener-log" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="openstack-network-exporter" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="openstack-network-exporter" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker-log" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919326 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker-log" Jan 20 23:34:04 crc kubenswrapper[5030]: E0120 23:34:04.919338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919344 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener-log" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919514 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="openstack-network-exporter" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919523 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" containerName="barbican-keystone-listener" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919541 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker-log" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919551 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" containerName="barbican-worker" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.919560 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" containerName="ovn-northd" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.921108 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.924357 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.924639 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.924754 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.924943 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.962260 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:34:04 crc kubenswrapper[5030]: I0120 23:34:04.990678 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7cdcf65d89-xwjmn"] Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.014652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.023650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.023704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.023797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.024208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.024440 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5gs\" (UniqueName: \"kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.024479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.024524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h5gs\" (UniqueName: \"kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126434 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126478 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.126541 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.127142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.130952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.131039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.133801 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.136613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.144411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.147106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h5gs\" (UniqueName: \"kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs\") pod \"ovn-northd-0\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.191059 5030 generic.go:334] "Generic (PLEG): container finished" podID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerID="0a8376f95d2df43c0daf95c4b85f7a34edcb659b644e3d8e243b73c2e1a7e0bf" exitCode=143 Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.191125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerDied","Data":"0a8376f95d2df43c0daf95c4b85f7a34edcb659b644e3d8e243b73c2e1a7e0bf"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.202221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerStarted","Data":"6b394c55a99879becceaf8003a6a0e868d119d03350ca95317201ad98351c493"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.206548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerStarted","Data":"e06dfb116e83c6a042119cbcc962d6e9c94a759890c06b3cecc1ccd89cd716ab"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.206579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" 
event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerStarted","Data":"4afb02528814af0d996d7fabe7520b1955adff3dccf2a87e4e71cb62bed38778"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.207342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.207366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.223882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerStarted","Data":"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.236438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerStarted","Data":"af80d5994aeeb83a95385a0f7fce28dff4b04a7a0c3cd44b876f399ee8c542dd"} Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.257692 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podStartSLOduration=5.257675519 podStartE2EDuration="5.257675519s" podCreationTimestamp="2026-01-20 23:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:05.245593256 +0000 UTC m=+3517.565853544" watchObservedRunningTime="2026-01-20 23:34:05.257675519 +0000 UTC m=+3517.577935807" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.287043 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:05 crc kubenswrapper[5030]: W0120 23:34:05.755253 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod023f2e4d_ae22_4ce0_b4bc_86f930b149f9.slice/crio-1f4e68d7f6ceb2e2eeec21cb82ae32f014a1bb6e6f9eae6482222cae5d23fe90 WatchSource:0}: Error finding container 1f4e68d7f6ceb2e2eeec21cb82ae32f014a1bb6e6f9eae6482222cae5d23fe90: Status 404 returned error can't find the container with id 1f4e68d7f6ceb2e2eeec21cb82ae32f014a1bb6e6f9eae6482222cae5d23fe90 Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.761159 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.971148 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b274deb-e830-47a2-916c-3c10670c08ac" path="/var/lib/kubelet/pods/7b274deb-e830-47a2-916c-3c10670c08ac/volumes" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.972007 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada482ec-e4e4-4484-9dcf-f57ee6186883" path="/var/lib/kubelet/pods/ada482ec-e4e4-4484-9dcf-f57ee6186883/volumes" Jan 20 23:34:05 crc kubenswrapper[5030]: I0120 23:34:05.972657 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cab5d494-17b4-4c5d-babf-e74a3deb090f" path="/var/lib/kubelet/pods/cab5d494-17b4-4c5d-babf-e74a3deb090f/volumes" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.273388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerStarted","Data":"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.273832 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api-log" containerID="cri-o://e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" gracePeriod=30 Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.273914 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.274076 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api" containerID="cri-o://103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" gracePeriod=30 Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.279517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerStarted","Data":"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.279551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerStarted","Data":"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.306035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerStarted","Data":"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.307781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerStarted","Data":"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.307838 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.307854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerStarted","Data":"1f4e68d7f6ceb2e2eeec21cb82ae32f014a1bb6e6f9eae6482222cae5d23fe90"} Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.327270 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.327246829 podStartE2EDuration="4.327246829s" podCreationTimestamp="2026-01-20 23:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:06.30002339 +0000 UTC m=+3518.620283678" watchObservedRunningTime="2026-01-20 23:34:06.327246829 +0000 UTC m=+3518.647507117" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.337609 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.337592349 podStartE2EDuration="3.337592349s" podCreationTimestamp="2026-01-20 23:34:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:06.318544828 +0000 UTC m=+3518.638805116" watchObservedRunningTime="2026-01-20 23:34:06.337592349 +0000 UTC m=+3518.657852637" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.359666 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.359651453 podStartE2EDuration="2.359651453s" podCreationTimestamp="2026-01-20 23:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:06.347886578 +0000 UTC m=+3518.668146906" watchObservedRunningTime="2026-01-20 23:34:06.359651453 +0000 UTC m=+3518.679911741" Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.878560 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.889397 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:06 crc kubenswrapper[5030]: I0120 23:34:06.896604 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxhq6\" (UniqueName: \"kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6\") pod \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\" (UID: \"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40\") " Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.083928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs" (OuterVolumeSpecName: "logs") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.084341 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.084373 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.091023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts" (OuterVolumeSpecName: "scripts") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.095776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.103786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6" (OuterVolumeSpecName: "kube-api-access-gxhq6") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "kube-api-access-gxhq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.133218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.166116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.174727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.180481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data" (OuterVolumeSpecName: "config-data") pod "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" (UID: "ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185644 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxhq6\" (UniqueName: \"kubernetes.io/projected/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-kube-api-access-gxhq6\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185675 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185685 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185695 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185704 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185712 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.185720 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.315112 5030 generic.go:334] "Generic (PLEG): container finished" podID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerID="305ab9810000c0d4e7831bbb87809b2f75bb48931479151aad674ac887230e4b" exitCode=1 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.315206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerDied","Data":"305ab9810000c0d4e7831bbb87809b2f75bb48931479151aad674ac887230e4b"} Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.316653 5030 scope.go:117] "RemoveContainer" 
containerID="305ab9810000c0d4e7831bbb87809b2f75bb48931479151aad674ac887230e4b" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317224 5030 generic.go:334] "Generic (PLEG): container finished" podID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerID="103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" exitCode=0 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317256 5030 generic.go:334] "Generic (PLEG): container finished" podID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerID="e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" exitCode=143 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317263 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerDied","Data":"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889"} Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317375 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerDied","Data":"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c"} Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40","Type":"ContainerDied","Data":"ea5bef0367f5c6e6f5f70d506f2c6ada8260a08aca5e604b68d36f92efd5d55a"} Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.317428 5030 scope.go:117] "RemoveContainer" containerID="103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.318610 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerID="6b394c55a99879becceaf8003a6a0e868d119d03350ca95317201ad98351c493" exitCode=0 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.319674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerDied","Data":"6b394c55a99879becceaf8003a6a0e868d119d03350ca95317201ad98351c493"} Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.321044 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="openstack-network-exporter" containerID="cri-o://f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" gracePeriod=300 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.374256 5030 scope.go:117] "RemoveContainer" containerID="e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.394054 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="ovsdbserver-sb" containerID="cri-o://03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" gracePeriod=300 Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.416755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:07 crc 
kubenswrapper[5030]: I0120 23:34:07.438220 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.449336 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:07 crc kubenswrapper[5030]: E0120 23:34:07.449769 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.449781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api" Jan 20 23:34:07 crc kubenswrapper[5030]: E0120 23:34:07.449810 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api-log" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.449816 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api-log" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.449999 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api-log" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.450010 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" containerName="cinder-api" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.450955 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.454876 5030 scope.go:117] "RemoveContainer" containerID="103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.463591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.463910 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.464032 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:34:07 crc kubenswrapper[5030]: E0120 23:34:07.464184 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889\": container with ID starting with 103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889 not found: ID does not exist" containerID="103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.464218 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889"} err="failed to get container status \"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889\": rpc error: code = NotFound desc = could not find container \"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889\": container with ID starting with 103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889 not found: ID does not exist" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.464242 5030 scope.go:117] "RemoveContainer" 
containerID="e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" Jan 20 23:34:07 crc kubenswrapper[5030]: E0120 23:34:07.469353 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c\": container with ID starting with e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c not found: ID does not exist" containerID="e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.469391 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c"} err="failed to get container status \"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c\": rpc error: code = NotFound desc = could not find container \"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c\": container with ID starting with e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c not found: ID does not exist" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.469415 5030 scope.go:117] "RemoveContainer" containerID="103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.470795 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889"} err="failed to get container status \"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889\": rpc error: code = NotFound desc = could not find container \"103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889\": container with ID starting with 103992f4ec6a2c94b3d25cd56f67e37f8e737e3e72a396a14bc139bb3af26889 not found: ID does not exist" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.470847 5030 scope.go:117] "RemoveContainer" containerID="e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.472131 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c"} err="failed to get container status \"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c\": rpc error: code = NotFound desc = could not find container \"e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c\": container with ID starting with e31c01f6f3a7e1901d4233ae75b37e500924a65ef51b41901f8d6b2b7c64f24c not found: ID does not exist" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.480045 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600517 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtkvm\" (UniqueName: \"kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600788 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.600915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtkvm\" (UniqueName: \"kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 
23:34:07.703853 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.703987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.704020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.705525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.706182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.709498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.714420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.714546 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.714920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.718537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.718820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.726044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtkvm\" (UniqueName: \"kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm\") pod \"cinder-api-0\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.868464 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_78cb6e0a-d348-4ae5-811d-c3ebb411c0d8/ovsdbserver-sb/0.log" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.868554 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.913702 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:07 crc kubenswrapper[5030]: I0120 23:34:07.920829 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.008485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsddh\" (UniqueName: \"kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.008803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.008839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.008968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.009026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.009056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.009085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.009139 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs\") pod \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\" (UID: \"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8\") " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.009989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts" (OuterVolumeSpecName: "scripts") pod 
"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.016281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.023384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config" (OuterVolumeSpecName: "config") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.024960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.033780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40" path="/var/lib/kubelet/pods/ecf56540-f3fc-4d6c-ac4c-b4e5b890fc40/volumes" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.038776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh" (OuterVolumeSpecName: "kube-api-access-vsddh") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "kube-api-access-vsddh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.070873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111048 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsddh\" (UniqueName: \"kubernetes.io/projected/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-kube-api-access-vsddh\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111078 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111088 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111097 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111115 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111123 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.111821 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.120478 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" (UID: "78cb6e0a-d348-4ae5-811d-c3ebb411c0d8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.140780 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.213367 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.213403 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.213414 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.268449 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.336134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerStarted","Data":"933d723028f4a35c1b50d6e05e66a8a5d7dfd126606c38522abe3bed4c985ccc"} Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.338371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerStarted","Data":"a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7"} Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.338998 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.342608 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_78cb6e0a-d348-4ae5-811d-c3ebb411c0d8/ovsdbserver-sb/0.log" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.342671 5030 generic.go:334] "Generic (PLEG): container finished" podID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerID="f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" exitCode=2 Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.342690 5030 generic.go:334] "Generic (PLEG): container finished" podID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerID="03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" exitCode=143 Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.342917 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="ovn-northd" containerID="cri-o://ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" gracePeriod=30 Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343243 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerDied","Data":"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c"} Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerDied","Data":"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86"} Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"78cb6e0a-d348-4ae5-811d-c3ebb411c0d8","Type":"ContainerDied","Data":"13139228853e1fd0b3b9ef2a3e2e246bbaa705a6eb68d3ef96e9d2141d86cbb3"} Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343962 5030 scope.go:117] "RemoveContainer" containerID="f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.343963 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="openstack-network-exporter" containerID="cri-o://d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" gracePeriod=30 Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.372642 5030 scope.go:117] "RemoveContainer" containerID="03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.378452 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=7.37843404 podStartE2EDuration="7.37843404s" podCreationTimestamp="2026-01-20 23:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:08.363471507 +0000 UTC m=+3520.683731805" watchObservedRunningTime="2026-01-20 23:34:08.37843404 +0000 UTC m=+3520.698694328" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.423662 5030 scope.go:117] "RemoveContainer" containerID="f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" Jan 20 23:34:08 crc kubenswrapper[5030]: E0120 23:34:08.423982 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c\": container with ID starting with f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c not found: ID does not exist" containerID="f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424034 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c"} err="failed to get container status \"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c\": rpc error: code = NotFound desc = could not find container \"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c\": container with ID starting with f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c not found: ID does not exist" Jan 20 
23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424061 5030 scope.go:117] "RemoveContainer" containerID="03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" Jan 20 23:34:08 crc kubenswrapper[5030]: E0120 23:34:08.424375 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86\": container with ID starting with 03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86 not found: ID does not exist" containerID="03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424417 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86"} err="failed to get container status \"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86\": rpc error: code = NotFound desc = could not find container \"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86\": container with ID starting with 03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86 not found: ID does not exist" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424443 5030 scope.go:117] "RemoveContainer" containerID="f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c"} err="failed to get container status \"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c\": rpc error: code = NotFound desc = could not find container \"f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c\": container with ID starting with f73c3c924cf72ee6d521c2cdaa5fb8ccc1dc11964bb065c79dbf46a191be1f2c not found: ID does not exist" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.424939 5030 scope.go:117] "RemoveContainer" containerID="03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.425758 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86"} err="failed to get container status \"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86\": rpc error: code = NotFound desc = could not find container \"03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86\": container with ID starting with 03742963be6819a7411c986a2a5c1ef44b55e886bdda860996e91145de546b86 not found: ID does not exist" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.434024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.479879 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.488072 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.497779 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:08 crc kubenswrapper[5030]: E0120 23:34:08.498231 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="ovsdbserver-sb" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.498245 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="ovsdbserver-sb" Jan 20 23:34:08 crc kubenswrapper[5030]: E0120 23:34:08.498283 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="openstack-network-exporter" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.498289 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="openstack-network-exporter" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.498478 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="ovsdbserver-sb" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.498501 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" containerName="openstack-network-exporter" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.499554 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.503873 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.504059 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.504237 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.504349 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.506523 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp7wz\" (UniqueName: \"kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.622750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724638 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp7wz\" (UniqueName: 
\"kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.724694 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.725888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.730868 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.735405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.735817 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.736981 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.761435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.763223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.776019 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.777797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.784414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.801532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.804735 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.810714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp7wz\" (UniqueName: \"kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.828772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.828816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.829005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9d7d\" (UniqueName: \"kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.857505 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.887148 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.904989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:08 crc kubenswrapper[5030]: 
I0120 23:34:08.930574 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9d7d\" (UniqueName: \"kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.948745 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.949492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.949525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.951999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.952220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:08 crc kubenswrapper[5030]: I0120 23:34:08.958005 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.047507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9d7d\" (UniqueName: \"kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d\") pod \"community-operators-kzhtm\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.050982 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.159557 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.166478 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.172252 5030 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.179752 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.194890 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.200706 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.200897 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" gracePeriod=30 Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.240572 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_023f2e4d-ae22-4ce0-b4bc-86f930b149f9/ovn-northd/0.log" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.240673 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.313672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.313977 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerName="keystone-api" containerID="cri-o://20dc72195b475941779d438aac438e68649738b63961b6c84b8477517eefa288" gracePeriod=30 Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.348934 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.349739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h5gs\" (UniqueName: \"kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.368855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.369147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config\") pod \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\" (UID: \"023f2e4d-ae22-4ce0-b4bc-86f930b149f9\") " Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.370996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.372054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config" (OuterVolumeSpecName: "config") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.372400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts" (OuterVolumeSpecName: "scripts") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.377713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs" (OuterVolumeSpecName: "kube-api-access-9h5gs") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "kube-api-access-9h5gs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.390975 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_023f2e4d-ae22-4ce0-b4bc-86f930b149f9/ovn-northd/0.log" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391026 5030 generic.go:334] "Generic (PLEG): container finished" podID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerID="d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" exitCode=2 Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391043 5030 generic.go:334] "Generic (PLEG): container finished" podID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerID="ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" exitCode=143 Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerDied","Data":"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a"} Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerDied","Data":"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b"} Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391145 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"023f2e4d-ae22-4ce0-b4bc-86f930b149f9","Type":"ContainerDied","Data":"1f4e68d7f6ceb2e2eeec21cb82ae32f014a1bb6e6f9eae6482222cae5d23fe90"} Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391160 5030 scope.go:117] "RemoveContainer" containerID="d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.391295 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.410027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.459928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerStarted","Data":"0a1ee7a63dca89b17f71807197fe62d7d5cfa69a0e0f70c4db0e644417b91720"} Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.474933 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.474960 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.474969 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h5gs\" (UniqueName: \"kubernetes.io/projected/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-kube-api-access-9h5gs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.474979 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.474988 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.496689 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.497146 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="ovn-northd" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.497158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="ovn-northd" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.497171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="openstack-network-exporter" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.497178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="openstack-network-exporter" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.497357 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="openstack-network-exporter" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.497376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" containerName="ovn-northd" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.498084 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.609707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.651290 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.662256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdf4b\" (UniqueName: \"kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682940 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.682973 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.683001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.683043 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.697811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "023f2e4d-ae22-4ce0-b4bc-86f930b149f9" (UID: "023f2e4d-ae22-4ce0-b4bc-86f930b149f9"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.703374 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.703774 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.742593 5030 scope.go:117] "RemoveContainer" containerID="ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 
23:34:09.784866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdf4b\" (UniqueName: \"kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784936 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.784985 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/023f2e4d-ae22-4ce0-b4bc-86f930b149f9-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.803235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.811839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.812617 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.816715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdf4b\" (UniqueName: \"kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.816984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.822214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.822342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.822468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs\") pod \"keystone-5b8b6c8845-fv7lq\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.859766 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.102:5000/v3\": read tcp 10.217.0.2:60716->10.217.0.102:5000: read: connection reset by peer" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.868033 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.913451 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.918136 5030 scope.go:117] "RemoveContainer" containerID="d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.922859 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a\": container with ID starting with d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a not found: ID does not exist" containerID="d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.923259 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a"} err="failed to get container status \"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a\": rpc error: code = NotFound desc = could not find container \"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a\": container with ID starting with d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a not found: ID does not exist" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.923363 5030 scope.go:117] "RemoveContainer" containerID="ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.931840 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b\": container with ID starting with ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b not found: ID does not exist" containerID="ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.931884 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b"} err="failed to get container status \"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b\": rpc error: code = NotFound desc = could not find container \"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b\": container with ID starting with ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b not found: ID does not exist" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.931913 5030 scope.go:117] "RemoveContainer" containerID="d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.936726 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a"} err="failed to get container status \"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a\": rpc error: code = NotFound desc = could not find container 
\"d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a\": container with ID starting with d40afc21369c82775bba3f363adaa83230c7c239358a118f0fb97763387cea5a not found: ID does not exist" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.936761 5030 scope.go:117] "RemoveContainer" containerID="ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.950092 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.950208 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b"} err="failed to get container status \"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b\": rpc error: code = NotFound desc = could not find container \"ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b\": container with ID starting with ff1406eabbd49a63779c83311d7fd0bd69d007d70f963321e1aae24018bab07b not found: ID does not exist" Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.953416 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:09 crc kubenswrapper[5030]: E0120 23:34:09.953471 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.997078 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cb6e0a-d348-4ae5-811d-c3ebb411c0d8" path="/var/lib/kubelet/pods/78cb6e0a-d348-4ae5-811d-c3ebb411c0d8/volumes" Jan 20 23:34:09 crc kubenswrapper[5030]: I0120 23:34:09.998600 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.081899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.113730 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.165446 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.169812 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.172022 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.172161 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.175440 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.175607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.183528 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.255763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rndp5\" (UniqueName: \"kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305102 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305251 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.305297 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.306455 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.307413 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-api" containerID="cri-o://0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811" gracePeriod=30 Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.308392 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" containerID="cri-o://580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e" gracePeriod=30 Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.324600 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.328450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.331195 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.337864 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.393986 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.109:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.394010 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.109:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.407878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcxdl\" (UniqueName: 
\"kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rndp5\" (UniqueName: \"kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.408409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.409486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.409583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config\") pod \"ovn-northd-0\" (UID: 
\"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.410508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.414249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.418933 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.431177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rndp5\" (UniqueName: \"kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.439190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.493765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerStarted","Data":"b6391fd1b04273b40e16deee537ea078556fe55d839ecf59646b59e7b0cb9d91"} Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.505281 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcxdl\" (UniqueName: \"kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.512594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.517205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.520666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " 
pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.522939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.525195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.526596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.530129 5030 generic.go:334] "Generic (PLEG): container finished" podID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerID="20dc72195b475941779d438aac438e68649738b63961b6c84b8477517eefa288" exitCode=0 Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.530253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" event={"ID":"d788ece9-38a7-4cd0-bc65-a193ef38a0f1","Type":"ContainerDied","Data":"20dc72195b475941779d438aac438e68649738b63961b6c84b8477517eefa288"} Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.548385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.552270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcxdl\" (UniqueName: \"kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl\") pod \"neutron-6d84555d94-5jgdn\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.573551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerStarted","Data":"92064a82e61c9bc79eeabe218007e5799a0995d02963cca03efa89e98c5198d5"} Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.589856 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerName="nova-scheduler-scheduler" containerID="cri-o://0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" gracePeriod=30 Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.592181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerStarted","Data":"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd"} 
Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.593137 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7" gracePeriod=30 Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.605793 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.669173 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.782272 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.850313 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.920784 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.922603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.922722 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.922760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6jjc\" (UniqueName: \"kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.922841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.923026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.923062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle\") pod 
\"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.923100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.923127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data\") pod \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\" (UID: \"d788ece9-38a7-4cd0-bc65-a193ef38a0f1\") " Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.929126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc" (OuterVolumeSpecName: "kube-api-access-v6jjc") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "kube-api-access-v6jjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:10 crc kubenswrapper[5030]: I0120 23:34:10.930781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts" (OuterVolumeSpecName: "scripts") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:10.999068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.005490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.037474 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.037528 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.037540 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.037553 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6jjc\" (UniqueName: \"kubernetes.io/projected/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-kube-api-access-v6jjc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.146115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.150692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: W0120 23:34:11.205011 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa15825_c66d_4625_86d9_2066c914a955.slice/crio-041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d WatchSource:0}: Error finding container 041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d: Status 404 returned error can't find the container with id 041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.247837 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: E0120 23:34:11.339803 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7e1038_a9e1_40d9_a09d_4c224fb114d4.slice/crio-700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7e1038_a9e1_40d9_a09d_4c224fb114d4.slice/crio-conmon-700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01dad8b0_3ba1_4c0f_b734_d6fbcc73b0e6.slice/crio-conmon-a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01dad8b0_3ba1_4c0f_b734_d6fbcc73b0e6.slice/crio-a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.365949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.447054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data" (OuterVolumeSpecName: "config-data") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.452453 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.461915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d788ece9-38a7-4cd0-bc65-a193ef38a0f1" (UID: "d788ece9-38a7-4cd0-bc65-a193ef38a0f1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.463071 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.463091 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.463103 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d788ece9-38a7-4cd0-bc65-a193ef38a0f1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.685350 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerID="700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce" exitCode=0 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.685660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerDied","Data":"700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.720578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" event={"ID":"c8072f56-2133-435c-87fd-e0a273ee9e91","Type":"ContainerStarted","Data":"b5752b3ecf6216d9097f2ae1074c2eb5cb7b6c853218dcdae4da8ae22eb06798"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.720635 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" event={"ID":"c8072f56-2133-435c-87fd-e0a273ee9e91","Type":"ContainerStarted","Data":"3ef07a5f16be50259078c28b946630efabeb51ba07519002de54bd4eb3dfe589"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.720926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.740133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" event={"ID":"d788ece9-38a7-4cd0-bc65-a193ef38a0f1","Type":"ContainerDied","Data":"21ddf443b33320832914b0ae456ad31ef25c904b08d8a0dbbf76217b1a452c19"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.740181 5030 scope.go:117] "RemoveContainer" containerID="20dc72195b475941779d438aac438e68649738b63961b6c84b8477517eefa288" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.740324 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-ddf8577d5-p79tx" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.772863 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" podStartSLOduration=2.772840204 podStartE2EDuration="2.772840204s" podCreationTimestamp="2026-01-20 23:34:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:11.738992235 +0000 UTC m=+3524.059252523" watchObservedRunningTime="2026-01-20 23:34:11.772840204 +0000 UTC m=+3524.093100482" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.783986 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerStarted","Data":"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.784032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerStarted","Data":"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.784490 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="openstack-network-exporter" containerID="cri-o://a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" gracePeriod=300 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.795362 5030 generic.go:334] "Generic (PLEG): container finished" podID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerID="580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e" exitCode=0 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.795425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerDied","Data":"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.812951 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=3.812934355 podStartE2EDuration="3.812934355s" podCreationTimestamp="2026-01-20 23:34:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:11.811787147 +0000 UTC m=+3524.132047445" watchObservedRunningTime="2026-01-20 23:34:11.812934355 +0000 UTC m=+3524.133194643" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.815885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerStarted","Data":"041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.826657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerStarted","Data":"77d98cce311295ce790bd78ecfe1e35d9d791f4e74f411d2a6b413ba8273cbdb"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.828231 5030 generic.go:334] "Generic (PLEG): container finished" podID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerID="a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7" exitCode=1 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.828386 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-log" containerID="cri-o://1367bed5839e543dc91ccaa45f346a8078b98871533bbd6dbcd36fea44e6682e" gracePeriod=30 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.828589 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerDied","Data":"a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7"} Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.828869 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-api" containerID="cri-o://e1bab47349c35f6aa385252a8ecdb111200aae28f3c39202af37cf667e9c4edb" gracePeriod=30 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.874788 5030 scope.go:117] "RemoveContainer" containerID="305ab9810000c0d4e7831bbb87809b2f75bb48931479151aad674ac887230e4b" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.893713 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.895059 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="ovsdbserver-sb" containerID="cri-o://d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" gracePeriod=300 Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.912716 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.952263 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-ddf8577d5-p79tx"] Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.986727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sxl5\" (UniqueName: \"kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5\") pod \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.987052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data\") pod \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " Jan 20 23:34:11 crc kubenswrapper[5030]: I0120 23:34:11.987273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle\") pod \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\" (UID: \"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.011992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5" (OuterVolumeSpecName: "kube-api-access-2sxl5") pod "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" (UID: "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6"). InnerVolumeSpecName "kube-api-access-2sxl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.019467 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023f2e4d-ae22-4ce0-b4bc-86f930b149f9" path="/var/lib/kubelet/pods/023f2e4d-ae22-4ce0-b4bc-86f930b149f9/volumes" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.020643 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" path="/var/lib/kubelet/pods/d788ece9-38a7-4cd0-bc65-a193ef38a0f1/volumes" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.034991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" (UID: "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.070492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data" (OuterVolumeSpecName: "config-data") pod "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" (UID: "01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.097824 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sxl5\" (UniqueName: \"kubernetes.io/projected/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-kube-api-access-2sxl5\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.097858 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.097870 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.257115 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.257157 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.379282 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_65b84346-911d-4a8d-a75d-1fa2ba7e167b/ovsdbserver-sb/0.log" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.379351 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp7wz\" (UniqueName: \"kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.505905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.506442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts\") pod \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\" (UID: \"65b84346-911d-4a8d-a75d-1fa2ba7e167b\") " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.506883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config" (OuterVolumeSpecName: "config") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.507187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.507237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts" (OuterVolumeSpecName: "scripts") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.510919 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.511328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz" (OuterVolumeSpecName: "kube-api-access-sp7wz") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "kube-api-access-sp7wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.555824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608016 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608060 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608072 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp7wz\" (UniqueName: \"kubernetes.io/projected/65b84346-911d-4a8d-a75d-1fa2ba7e167b-kube-api-access-sp7wz\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608081 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65b84346-911d-4a8d-a75d-1fa2ba7e167b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608102 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.608111 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.620364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.622089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "65b84346-911d-4a8d-a75d-1fa2ba7e167b" (UID: "65b84346-911d-4a8d-a75d-1fa2ba7e167b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.640271 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.710806 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.710857 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b84346-911d-4a8d-a75d-1fa2ba7e167b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.710869 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.843950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerStarted","Data":"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.844243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerStarted","Data":"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.844435 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.850555 5030 generic.go:334] "Generic (PLEG): container finished" podID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerID="1367bed5839e543dc91ccaa45f346a8078b98871533bbd6dbcd36fea44e6682e" exitCode=143 Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.850638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerDied","Data":"1367bed5839e543dc91ccaa45f346a8078b98871533bbd6dbcd36fea44e6682e"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.855950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerStarted","Data":"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.857039 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.864176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerStarted","Data":"0de19bed1ecff6e0a69896036a321fb497ea5ae981f8bcf78e60eb8bd7775f53"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.864219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" 
event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerStarted","Data":"3800eea402918756e94a42e7bf2f7203a22cdc0c78eae378314eed1505938d12"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.865110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.870766 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.870801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6","Type":"ContainerDied","Data":"b494db42719753d2ddab15c12638a8470632b47ca6ab1e3ea4f6027688bba6d7"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.870849 5030 scope.go:117] "RemoveContainer" containerID="a038db663aa5f8995ab3f2012bde10c376602dd9df5515774d38a1f9b699c0c7" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.899156 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_65b84346-911d-4a8d-a75d-1fa2ba7e167b/ovsdbserver-sb/0.log" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.899205 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerID="a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" exitCode=2 Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.899220 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerID="d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" exitCode=143 Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.899989 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.903693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerDied","Data":"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.903742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerDied","Data":"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.903752 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"65b84346-911d-4a8d-a75d-1fa2ba7e167b","Type":"ContainerDied","Data":"92064a82e61c9bc79eeabe218007e5799a0995d02963cca03efa89e98c5198d5"} Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.903766 5030 scope.go:117] "RemoveContainer" containerID="a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.926548 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.929113 5030 scope.go:117] "RemoveContainer" containerID="d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.930933 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.930919817 podStartE2EDuration="2.930919817s" podCreationTimestamp="2026-01-20 23:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:12.879580913 +0000 UTC m=+3525.199841201" watchObservedRunningTime="2026-01-20 23:34:12.930919817 +0000 UTC m=+3525.251180105" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.939755 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.93973505 podStartE2EDuration="5.93973505s" podCreationTimestamp="2026-01-20 23:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:12.927899584 +0000 UTC m=+3525.248159872" watchObservedRunningTime="2026-01-20 23:34:12.93973505 +0000 UTC m=+3525.259995338" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.957793 5030 scope.go:117] "RemoveContainer" containerID="a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" Jan 20 23:34:12 crc kubenswrapper[5030]: E0120 23:34:12.963683 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7\": container with ID starting with a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7 not found: ID does not exist" containerID="a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" Jan 20 
23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.963729 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7"} err="failed to get container status \"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7\": rpc error: code = NotFound desc = could not find container \"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7\": container with ID starting with a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7 not found: ID does not exist" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.963752 5030 scope.go:117] "RemoveContainer" containerID="d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" Jan 20 23:34:12 crc kubenswrapper[5030]: E0120 23:34:12.965091 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31\": container with ID starting with d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31 not found: ID does not exist" containerID="d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.965117 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31"} err="failed to get container status \"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31\": rpc error: code = NotFound desc = could not find container \"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31\": container with ID starting with d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31 not found: ID does not exist" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.965139 5030 scope.go:117] "RemoveContainer" containerID="a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.965328 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" podStartSLOduration=2.965312919 podStartE2EDuration="2.965312919s" podCreationTimestamp="2026-01-20 23:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:12.963864064 +0000 UTC m=+3525.284124352" watchObservedRunningTime="2026-01-20 23:34:12.965312919 +0000 UTC m=+3525.285573207" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.965986 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7"} err="failed to get container status \"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7\": rpc error: code = NotFound desc = could not find container \"a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7\": container with ID starting with a57c64957dee91bef6c0796a3f6cbd61475c4d9583397cba3817ac43ba3a51f7 not found: ID does not exist" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.966016 5030 scope.go:117] "RemoveContainer" containerID="d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.968785 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31"} err="failed to get container status \"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31\": rpc error: code = NotFound desc = could not find container \"d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31\": container with ID starting with d63f241b2b05ce00b996084c3ce47f233d42fadfd90dfa085bf4c154209cbe31 not found: ID does not exist" Jan 20 23:34:12 crc kubenswrapper[5030]: I0120 23:34:12.995663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.012951 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.033007 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.042707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.058387 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.073251 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.073939 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="ovsdbserver-sb" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.074201 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="ovsdbserver-sb" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.074291 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.074351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.074426 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.074501 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.074579 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="openstack-network-exporter" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.074653 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="openstack-network-exporter" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.074727 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerName="keystone-api" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.074785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerName="keystone-api" Jan 20 
23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.075012 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.075087 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.075174 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="openstack-network-exporter" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.075245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d788ece9-38a7-4cd0-bc65-a193ef38a0f1" containerName="keystone-api" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.075310 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" containerName="ovsdbserver-sb" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.076121 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.080288 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.100316 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.102164 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.108645 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.109012 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.109159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.109387 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.151894 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.193795 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.223077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.223342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.223500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttr6\" (UniqueName: \"kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.224987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8zhw\" (UniqueName: \"kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.225024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.225056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.273692 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.275878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.285251 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-777576f9c4-nj6sl"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.287074 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.293986 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-777576f9c4-nj6sl"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.326616 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.326945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8zhw\" (UniqueName: \"kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.327113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.327252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.327369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.327502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.327793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.328228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.331450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.331675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttr6\" (UniqueName: \"kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.331893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.331988 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.332079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.332154 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.334202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.339594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.344780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.347495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.349099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.355915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.356197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttr6\" (UniqueName: \"kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.360269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8zhw\" (UniqueName: \"kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw\") pod \"nova-cell0-conductor-0\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.403576 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.432252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.449880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.450238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.450333 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.450478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.450680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7f2\" (UniqueName: \"kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.450801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.452051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.456698 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.488356 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-777576f9c4-nj6sl"] Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.489252 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config httpd-config internal-tls-certs kube-api-access-8p7f2 ovndb-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" podUID="4929132a-94b7-487e-92b7-037bb4e45da6" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.520391 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.526774 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.541409 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p7f2\" (UniqueName: \"kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557947 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.557984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" 
Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.558028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.563900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.566532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.567404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.567465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.570245 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.576405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.590997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p7f2\" (UniqueName: \"kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2\") pod \"neutron-777576f9c4-nj6sl\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.659983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 
23:34:13.660052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.660076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtx7\" (UniqueName: \"kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.660129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.660212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.660264 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.660281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762190 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " 
pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtx7\" (UniqueName: \"kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.762334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.770844 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.770875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.774486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.779990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.783763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtx7\" (UniqueName: \"kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 
20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.809583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.812552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle\") pod \"neutron-55c6fff65d-h7t2s\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.891054 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.913372 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.930771 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.933297 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:34:13 crc kubenswrapper[5030]: E0120 23:34:13.933388 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerName="nova-scheduler-scheduler" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.944613 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.956789 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.966460 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.986485 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.988034 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6" path="/var/lib/kubelet/pods/01dad8b0-3ba1-4c0f-b734-d6fbcc73b0e6/volumes" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.989023 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b84346-911d-4a8d-a75d-1fa2ba7e167b" path="/var/lib/kubelet/pods/65b84346-911d-4a8d-a75d-1fa2ba7e167b/volumes" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.994040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:13 crc kubenswrapper[5030]: I0120 23:34:13.994424 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.043858 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p7f2\" (UniqueName: \"kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.065573 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs\") pod \"4929132a-94b7-487e-92b7-037bb4e45da6\" (UID: \"4929132a-94b7-487e-92b7-037bb4e45da6\") " Jan 20 23:34:14 crc kubenswrapper[5030]: W0120 23:34:14.067174 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5193f521_7049_4215_9ec2_5ee65b965e72.slice/crio-b9c6b8925392291ee2c6a100e763bbe8df320474a6705bd387db8379dc246485 WatchSource:0}: Error finding container b9c6b8925392291ee2c6a100e763bbe8df320474a6705bd387db8379dc246485: Status 404 returned error can't find the container with id b9c6b8925392291ee2c6a100e763bbe8df320474a6705bd387db8379dc246485 Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.070245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.071642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.082544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2" (OuterVolumeSpecName: "kube-api-access-8p7f2") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "kube-api-access-8p7f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.082758 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.082901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.084734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config" (OuterVolumeSpecName: "config") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.084869 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.085043 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api-log" containerID="cri-o://616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca" gracePeriod=30 Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.085401 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api" containerID="cri-o://016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e" gracePeriod=30 Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.093393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4929132a-94b7-487e-92b7-037bb4e45da6" (UID: "4929132a-94b7-487e-92b7-037bb4e45da6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.156528 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.171902 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172191 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172201 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172213 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172224 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172238 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p7f2\" (UniqueName: \"kubernetes.io/projected/4929132a-94b7-487e-92b7-037bb4e45da6-kube-api-access-8p7f2\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.172248 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929132a-94b7-487e-92b7-037bb4e45da6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.177308 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.390905 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.478504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle\") pod \"e905a9f8-849b-45bb-be21-2293c96cd0e1\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.478759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7kb7\" (UniqueName: \"kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7\") pod \"e905a9f8-849b-45bb-be21-2293c96cd0e1\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.478831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data\") pod \"e905a9f8-849b-45bb-be21-2293c96cd0e1\" (UID: \"e905a9f8-849b-45bb-be21-2293c96cd0e1\") " Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.485401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7" (OuterVolumeSpecName: "kube-api-access-g7kb7") pod "e905a9f8-849b-45bb-be21-2293c96cd0e1" (UID: "e905a9f8-849b-45bb-be21-2293c96cd0e1"). InnerVolumeSpecName "kube-api-access-g7kb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.541266 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.541903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e905a9f8-849b-45bb-be21-2293c96cd0e1" (UID: "e905a9f8-849b-45bb-be21-2293c96cd0e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.569444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data" (OuterVolumeSpecName: "config-data") pod "e905a9f8-849b-45bb-be21-2293c96cd0e1" (UID: "e905a9f8-849b-45bb-be21-2293c96cd0e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.582389 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7kb7\" (UniqueName: \"kubernetes.io/projected/e905a9f8-849b-45bb-be21-2293c96cd0e1-kube-api-access-g7kb7\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.582542 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.582630 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905a9f8-849b-45bb-be21-2293c96cd0e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.830787 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.831027 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02" gracePeriod=30 Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.982399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerStarted","Data":"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4"} Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.982666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerStarted","Data":"687541627ea37a329b0829eb92018510c1bf895b4c6761fac6f5d8a5ce4f236a"} Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.989433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerStarted","Data":"c24781b0440cb0ec4f5466dcf225f8aee2a756ed49cb3c20f947a7fd55315498"} Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.989506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerStarted","Data":"e936339e4e8ee4ffadf2179ae8d7a2c4c7acf995ef0f21e4513af509be552351"} Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.989518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerStarted","Data":"ba61e8605605b6d4b63b3d5d0c266d4591ed247e2d3f3e1bb8b4ead232bd7d91"} Jan 20 23:34:14 crc kubenswrapper[5030]: I0120 23:34:14.997289 5030 generic.go:334] "Generic (PLEG): container finished" podID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" exitCode=0 Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:14.997391 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:14.997425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e905a9f8-849b-45bb-be21-2293c96cd0e1","Type":"ContainerDied","Data":"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9"} Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.001387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"e905a9f8-849b-45bb-be21-2293c96cd0e1","Type":"ContainerDied","Data":"22eb344454fb2717588cb3420815db293ac7fc199f8a1cebf60d9bebf3439f8c"} Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.001413 5030 scope.go:117] "RemoveContainer" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.003926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerStarted","Data":"12edf405cbfdabbe7764f8aed590a89c8ebdedcdde1c808d0e999161b7b56af4"} Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.003962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerStarted","Data":"b9c6b8925392291ee2c6a100e763bbe8df320474a6705bd387db8379dc246485"} Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.004262 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.008174 5030 generic.go:334] "Generic (PLEG): container finished" podID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerID="616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca" exitCode=143 Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.008890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerDied","Data":"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca"} Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.009321 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-api" containerID="cri-o://3800eea402918756e94a42e7bf2f7203a22cdc0c78eae378314eed1505938d12" gracePeriod=30 Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.009518 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-777576f9c4-nj6sl" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.009633 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-httpd" containerID="cri-o://0de19bed1ecff6e0a69896036a321fb497ea5ae981f8bcf78e60eb8bd7775f53" gracePeriod=30 Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.053849 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.053826613 podStartE2EDuration="2.053826613s" podCreationTimestamp="2026-01-20 23:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:15.018832746 +0000 UTC m=+3527.339093034" watchObservedRunningTime="2026-01-20 23:34:15.053826613 +0000 UTC m=+3527.374086901" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.069005 5030 scope.go:117] "RemoveContainer" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" Jan 20 23:34:15 crc kubenswrapper[5030]: E0120 23:34:15.073077 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9\": container with ID starting with a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9 not found: ID does not exist" containerID="a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.073123 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9"} err="failed to get container status \"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9\": rpc error: code = NotFound desc = could not find container \"a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9\": container with ID starting with a3cb6fe2ed16bffdd4c8cba642bd660e006e2dd02f97183f690f1e9b3f9055f9 not found: ID does not exist" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.082478 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.088426 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.099074 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:15 crc kubenswrapper[5030]: E0120 23:34:15.099628 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.099646 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.099834 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.100575 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.103139 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.130696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.134680 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.13466008 podStartE2EDuration="2.13466008s" podCreationTimestamp="2026-01-20 23:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:15.073139951 +0000 UTC m=+3527.393400249" watchObservedRunningTime="2026-01-20 23:34:15.13466008 +0000 UTC m=+3527.454920368" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.168601 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-777576f9c4-nj6sl"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.181097 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-777576f9c4-nj6sl"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.306340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.306490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qhzl\" (UniqueName: \"kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.306548 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.410811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qhzl\" (UniqueName: \"kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.410895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.410927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.416875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.419388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.431103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qhzl\" (UniqueName: \"kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl\") pod \"nova-cell1-conductor-0\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.446796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.755476 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.921313 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data\") pod \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.921406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs\") pod \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.921480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle\") pod \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.921567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmgl2\" (UniqueName: \"kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2\") pod \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\" (UID: \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.921687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs\") pod \"62e3599a-b500-49ac-a3bd-cdc62cd37fba\" (UID: 
\"62e3599a-b500-49ac-a3bd-cdc62cd37fba\") " Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.925436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.929266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2" (OuterVolumeSpecName: "kube-api-access-fmgl2") pod "62e3599a-b500-49ac-a3bd-cdc62cd37fba" (UID: "62e3599a-b500-49ac-a3bd-cdc62cd37fba"). InnerVolumeSpecName "kube-api-access-fmgl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.953843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62e3599a-b500-49ac-a3bd-cdc62cd37fba" (UID: "62e3599a-b500-49ac-a3bd-cdc62cd37fba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.972827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data" (OuterVolumeSpecName: "config-data") pod "62e3599a-b500-49ac-a3bd-cdc62cd37fba" (UID: "62e3599a-b500-49ac-a3bd-cdc62cd37fba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.978041 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4929132a-94b7-487e-92b7-037bb4e45da6" path="/var/lib/kubelet/pods/4929132a-94b7-487e-92b7-037bb4e45da6/volumes" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.978609 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e905a9f8-849b-45bb-be21-2293c96cd0e1" path="/var/lib/kubelet/pods/e905a9f8-849b-45bb-be21-2293c96cd0e1/volumes" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.986095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "62e3599a-b500-49ac-a3bd-cdc62cd37fba" (UID: "62e3599a-b500-49ac-a3bd-cdc62cd37fba"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:15 crc kubenswrapper[5030]: I0120 23:34:15.996870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "62e3599a-b500-49ac-a3bd-cdc62cd37fba" (UID: "62e3599a-b500-49ac-a3bd-cdc62cd37fba"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.027147 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmgl2\" (UniqueName: \"kubernetes.io/projected/62e3599a-b500-49ac-a3bd-cdc62cd37fba-kube-api-access-fmgl2\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.027247 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.027304 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.027362 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.027463 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62e3599a-b500-49ac-a3bd-cdc62cd37fba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.061268 5030 generic.go:334] "Generic (PLEG): container finished" podID="7203857c-1149-4912-9921-7e547f6418f6" containerID="0de19bed1ecff6e0a69896036a321fb497ea5ae981f8bcf78e60eb8bd7775f53" exitCode=0 Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.061325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerDied","Data":"0de19bed1ecff6e0a69896036a321fb497ea5ae981f8bcf78e60eb8bd7775f53"} Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.065819 5030 generic.go:334] "Generic (PLEG): container finished" podID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" containerID="dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02" exitCode=0 Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.065870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62e3599a-b500-49ac-a3bd-cdc62cd37fba","Type":"ContainerDied","Data":"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02"} Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.065890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"62e3599a-b500-49ac-a3bd-cdc62cd37fba","Type":"ContainerDied","Data":"a7a6837dc5241c3984d4126c7c7fc2763143aef1696320aa72598ebe24d62cb7"} Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.065908 5030 scope.go:117] "RemoveContainer" containerID="dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.065994 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.070064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerStarted","Data":"11bb102f85b946924c2d3fb32c00ebef6d6f693f819cb13039457de7fa8497db"} Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.080224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerStarted","Data":"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8"} Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.101538 5030 scope.go:117] "RemoveContainer" containerID="dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02" Jan 20 23:34:16 crc kubenswrapper[5030]: E0120 23:34:16.103229 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02\": container with ID starting with dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02 not found: ID does not exist" containerID="dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.103272 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02"} err="failed to get container status \"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02\": rpc error: code = NotFound desc = could not find container \"dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02\": container with ID starting with dd2f2f168ba8ebfef34ccd4c89c839a800fe84963b8fed60ce66039f01507f02 not found: ID does not exist" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.105577 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podStartSLOduration=3.105564461 podStartE2EDuration="3.105564461s" podCreationTimestamp="2026-01-20 23:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:16.104681781 +0000 UTC m=+3528.424942069" watchObservedRunningTime="2026-01-20 23:34:16.105564461 +0000 UTC m=+3528.425824749" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.137671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.156959 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.178871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:16 crc kubenswrapper[5030]: E0120 23:34:16.179377 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.179395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.179572 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.180283 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.184605 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.184984 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.185308 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.190337 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.335009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.335119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.335159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.335180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.335484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6fdp\" (UniqueName: \"kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.437024 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.437088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.437111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.437196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6fdp\" (UniqueName: \"kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.437234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.441803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.442373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.452884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.455959 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.457890 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.458195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6fdp\" 
(UniqueName: \"kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.521428 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.530645 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.729349 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.848321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhnzj\" (UniqueName: \"kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj\") pod \"fc08df3d-397a-42ff-ab96-622f2dc6e492\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.848426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data\") pod \"fc08df3d-397a-42ff-ab96-622f2dc6e492\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.848487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle\") pod \"fc08df3d-397a-42ff-ab96-622f2dc6e492\" (UID: \"fc08df3d-397a-42ff-ab96-622f2dc6e492\") " Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.884086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj" (OuterVolumeSpecName: "kube-api-access-rhnzj") pod "fc08df3d-397a-42ff-ab96-622f2dc6e492" (UID: "fc08df3d-397a-42ff-ab96-622f2dc6e492"). InnerVolumeSpecName "kube-api-access-rhnzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.946804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data" (OuterVolumeSpecName: "config-data") pod "fc08df3d-397a-42ff-ab96-622f2dc6e492" (UID: "fc08df3d-397a-42ff-ab96-622f2dc6e492"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.951226 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhnzj\" (UniqueName: \"kubernetes.io/projected/fc08df3d-397a-42ff-ab96-622f2dc6e492-kube-api-access-rhnzj\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.951265 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:16 crc kubenswrapper[5030]: I0120 23:34:16.951474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc08df3d-397a-42ff-ab96-622f2dc6e492" (UID: "fc08df3d-397a-42ff-ab96-622f2dc6e492"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.057662 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc08df3d-397a-42ff-ab96-622f2dc6e492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.089898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.122321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerStarted","Data":"8d583b27015652f7e8e83138eda5370ac6f3f2cbeca97079c99f7be638682e39"} Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.123095 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.150410 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" exitCode=1 Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.150505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fc08df3d-397a-42ff-ab96-622f2dc6e492","Type":"ContainerDied","Data":"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15"} Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.150533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fc08df3d-397a-42ff-ab96-622f2dc6e492","Type":"ContainerDied","Data":"2cbcc3680fcca00921c7e0f493ef34108c21360ddf810ce3b1e05e25e124674a"} Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.150550 5030 scope.go:117] "RemoveContainer" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.150714 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.182021 5030 generic.go:334] "Generic (PLEG): container finished" podID="5193f521-7049-4215-9ec2-5ee65b965e72" containerID="12edf405cbfdabbe7764f8aed590a89c8ebdedcdde1c808d0e999161b7b56af4" exitCode=1 Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.182107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerDied","Data":"12edf405cbfdabbe7764f8aed590a89c8ebdedcdde1c808d0e999161b7b56af4"} Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.183970 5030 scope.go:117] "RemoveContainer" containerID="12edf405cbfdabbe7764f8aed590a89c8ebdedcdde1c808d0e999161b7b56af4" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.185366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.197122 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.197105003 podStartE2EDuration="2.197105003s" podCreationTimestamp="2026-01-20 23:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:17.152888253 +0000 UTC m=+3529.473148541" watchObservedRunningTime="2026-01-20 23:34:17.197105003 +0000 UTC m=+3529.517365291" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.202664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.208104 5030 scope.go:117] "RemoveContainer" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" Jan 20 23:34:17 crc kubenswrapper[5030]: E0120 23:34:17.210980 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15\": container with ID starting with 0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15 not found: ID does not exist" containerID="0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.211031 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15"} err="failed to get container status \"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15\": rpc error: code = NotFound desc = could not find container \"0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15\": container with ID starting with 0539f32a13e37b78df5c4f6ddf367ff7fe5dd935729ce5ed1e6918d133ca6e15 not found: ID does not exist" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.217675 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.235457 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:17 crc kubenswrapper[5030]: E0120 23:34:17.237279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerName="nova-scheduler-scheduler" Jan 20 23:34:17 crc 
kubenswrapper[5030]: I0120 23:34:17.237303 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerName="nova-scheduler-scheduler" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.237563 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" containerName="nova-scheduler-scheduler" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.238350 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.247415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.292756 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.334113 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.226:9311/healthcheck\": read tcp 10.217.0.2:60786->10.217.1.226:9311: read: connection reset by peer" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.334648 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.226:9311/healthcheck\": read tcp 10.217.0.2:60794->10.217.1.226:9311: read: connection reset by peer" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.365649 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.365691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.365853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m246\" (UniqueName: \"kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.467033 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.467074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.467161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m246\" (UniqueName: \"kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.479448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.483279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.504330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m246\" (UniqueName: \"kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246\") pod \"nova-scheduler-0\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.558144 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.920407 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.982414 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e3599a-b500-49ac-a3bd-cdc62cd37fba" path="/var/lib/kubelet/pods/62e3599a-b500-49ac-a3bd-cdc62cd37fba/volumes" Jan 20 23:34:17 crc kubenswrapper[5030]: I0120 23:34:17.982940 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc08df3d-397a-42ff-ab96-622f2dc6e492" path="/var/lib/kubelet/pods/fc08df3d-397a-42ff-ab96-622f2dc6e492/volumes" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd4v7\" (UniqueName: \"kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.088520 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data\") pod \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\" (UID: \"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f\") " Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.096474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.096518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs" (OuterVolumeSpecName: "logs") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.106986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7" (OuterVolumeSpecName: "kube-api-access-zd4v7") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "kube-api-access-zd4v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.131050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.178801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.190773 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd4v7\" (UniqueName: \"kubernetes.io/projected/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-kube-api-access-zd4v7\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.190816 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.190831 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.190847 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.190860 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.192118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data" (OuterVolumeSpecName: "config-data") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.195165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" (UID: "8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.200863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"dc1666bc-40ca-4f98-a280-6c13671a5196","Type":"ContainerStarted","Data":"33dc9db4ec964af41a6a0cfcd301bf153fe2ff8a2a7bd2f13f623f706d7557be"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.200908 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"dc1666bc-40ca-4f98-a280-6c13671a5196","Type":"ContainerStarted","Data":"32929c5ff6bdf7c14f5b3c333537b3e2ce877218b0d9751c8e828c2774dffd1b"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.200920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.201026 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="dc1666bc-40ca-4f98-a280-6c13671a5196" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://33dc9db4ec964af41a6a0cfcd301bf153fe2ff8a2a7bd2f13f623f706d7557be" gracePeriod=30 Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.203886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerStarted","Data":"582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.205119 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.221156 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.221142811 podStartE2EDuration="2.221142811s" podCreationTimestamp="2026-01-20 23:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:18.216319475 +0000 UTC m=+3530.536579763" watchObservedRunningTime="2026-01-20 23:34:18.221142811 +0000 UTC m=+3530.541403089" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.223868 5030 generic.go:334] "Generic (PLEG): container finished" podID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerID="016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e" exitCode=0 Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.223966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerDied","Data":"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.224014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" 
event={"ID":"8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f","Type":"ContainerDied","Data":"2a887328275e6def3c5749cf008ab14a4de46e1fc9a87d3cfdcb183be08841f1"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.224032 5030 scope.go:117] "RemoveContainer" containerID="016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.224137 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.233374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerStarted","Data":"bd4ad3300dd5f9d4928a27642c21e47130d7f60d93cc8338838edda5ae26e239"} Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.281106 5030 scope.go:117] "RemoveContainer" containerID="616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.293602 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.293643 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.343749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.355917 5030 scope.go:117] "RemoveContainer" containerID="016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e" Jan 20 23:34:18 crc kubenswrapper[5030]: E0120 23:34:18.358331 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e\": container with ID starting with 016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e not found: ID does not exist" containerID="016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.358366 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e"} err="failed to get container status \"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e\": rpc error: code = NotFound desc = could not find container \"016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e\": container with ID starting with 016ababbed1655b0d2abe8b696a59d0e9ec529a56e150638fbe3d4558285b38e not found: ID does not exist" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.358387 5030 scope.go:117] "RemoveContainer" containerID="616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca" Jan 20 23:34:18 crc kubenswrapper[5030]: E0120 23:34:18.360143 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca\": container with ID starting with 616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca not found: ID does not exist" 
containerID="616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.360177 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca"} err="failed to get container status \"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca\": rpc error: code = NotFound desc = could not find container \"616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca\": container with ID starting with 616813822df1b61971570aac8e1cfac100ae80c314656371569d202dcc6d59ca not found: ID does not exist" Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.365860 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-58d5d86f6d-bhw7w"] Jan 20 23:34:18 crc kubenswrapper[5030]: I0120 23:34:18.457421 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.255075 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerID="8d583b27015652f7e8e83138eda5370ac6f3f2cbeca97079c99f7be638682e39" exitCode=1 Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.255151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerDied","Data":"8d583b27015652f7e8e83138eda5370ac6f3f2cbeca97079c99f7be638682e39"} Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.255734 5030 scope.go:117] "RemoveContainer" containerID="8d583b27015652f7e8e83138eda5370ac6f3f2cbeca97079c99f7be638682e39" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.259803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerStarted","Data":"1ad70098428c7b0568be41217ee431839dc1d2a5d66015163a6657cfd3bb4619"} Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.309212 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.309195779 podStartE2EDuration="2.309195779s" podCreationTimestamp="2026-01-20 23:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:19.30593957 +0000 UTC m=+3531.626199858" watchObservedRunningTime="2026-01-20 23:34:19.309195779 +0000 UTC m=+3531.629456067" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.394600 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:19 crc kubenswrapper[5030]: E0120 23:34:19.395069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.395082 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api" Jan 20 23:34:19 crc kubenswrapper[5030]: E0120 23:34:19.395129 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api-log" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.395135 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api-log" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.395312 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api-log" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.395342 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" containerName="barbican-api" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.397021 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.403525 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.503532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.521613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9d4\" (UniqueName: \"kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.521986 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.522115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.522136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.522473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.522505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " 
pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.522665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.557265 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9d4\" (UniqueName: \"kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.624880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.632275 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.632357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.632884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.633747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.649047 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.649800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.654209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r9d4\" (UniqueName: \"kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4\") pod \"barbican-api-6b8948f648-jn267\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.738653 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.751273 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.775433 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.801092 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.810544 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.930980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjqv\" (UniqueName: \"kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.931441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:19 crc kubenswrapper[5030]: I0120 23:34:19.973697 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f" path="/var/lib/kubelet/pods/8d8bfc4f-992d-4d6d-a28d-3b8a4035c81f/volumes" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.037291 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.037361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.037397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxjqv\" (UniqueName: \"kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.038424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.038462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.038547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.038589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.040207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.051681 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc 
kubenswrapper[5030]: I0120 23:34:20.052101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.052275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.053074 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.053397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.059866 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxjqv\" (UniqueName: \"kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv\") pod \"barbican-api-54d7f78ddd-xnd26\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.137095 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.269512 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6d84555d94-5jgdn_7203857c-1149-4912-9921-7e547f6418f6/neutron-api/0.log" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.269548 5030 generic.go:334] "Generic (PLEG): container finished" podID="7203857c-1149-4912-9921-7e547f6418f6" containerID="3800eea402918756e94a42e7bf2f7203a22cdc0c78eae378314eed1505938d12" exitCode=1 Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.269587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerDied","Data":"3800eea402918756e94a42e7bf2f7203a22cdc0c78eae378314eed1505938d12"} Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.271381 5030 generic.go:334] "Generic (PLEG): container finished" podID="5193f521-7049-4215-9ec2-5ee65b965e72" containerID="582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82" exitCode=1 Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.272189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerDied","Data":"582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82"} Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.272218 5030 scope.go:117] "RemoveContainer" containerID="12edf405cbfdabbe7764f8aed590a89c8ebdedcdde1c808d0e999161b7b56af4" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.274811 5030 scope.go:117] "RemoveContainer" containerID="582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82" Jan 20 23:34:20 crc kubenswrapper[5030]: E0120 23:34:20.275092 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(5193f521-7049-4215-9ec2-5ee65b965e72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" Jan 20 23:34:20 crc kubenswrapper[5030]: I0120 23:34:20.350901 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.288449 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/0.log" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.288510 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerID="dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4" exitCode=1 Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.288581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerDied","Data":"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4"} Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.289392 5030 scope.go:117] "RemoveContainer" containerID="dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.290147 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="a75e4451-042a-424d-a919-049a3213d060" containerID="1ad70098428c7b0568be41217ee431839dc1d2a5d66015163a6657cfd3bb4619" exitCode=1 Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.290180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerDied","Data":"1ad70098428c7b0568be41217ee431839dc1d2a5d66015163a6657cfd3bb4619"} Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.290562 5030 scope.go:117] "RemoveContainer" containerID="1ad70098428c7b0568be41217ee431839dc1d2a5d66015163a6657cfd3bb4619" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.298075 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.451810 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:21 crc kubenswrapper[5030]: I0120 23:34:21.521926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.167989 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6d84555d94-5jgdn_7203857c-1149-4912-9921-7e547f6418f6/neutron-api/0.log" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.168289 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.285439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.285789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.286208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.286349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.286460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 
20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.286493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.286523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcxdl\" (UniqueName: \"kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl\") pod \"7203857c-1149-4912-9921-7e547f6418f6\" (UID: \"7203857c-1149-4912-9921-7e547f6418f6\") " Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.298837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.300204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl" (OuterVolumeSpecName: "kube-api-access-tcxdl") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "kube-api-access-tcxdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.325569 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6d84555d94-5jgdn_7203857c-1149-4912-9921-7e547f6418f6/neutron-api/0.log" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.325667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" event={"ID":"7203857c-1149-4912-9921-7e547f6418f6","Type":"ContainerDied","Data":"77d98cce311295ce790bd78ecfe1e35d9d791f4e74f411d2a6b413ba8273cbdb"} Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.325705 5030 scope.go:117] "RemoveContainer" containerID="0de19bed1ecff6e0a69896036a321fb497ea5ae981f8bcf78e60eb8bd7775f53" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.325808 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6d84555d94-5jgdn" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.348480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerStarted","Data":"0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278"} Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.349013 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.365397 5030 scope.go:117] "RemoveContainer" containerID="3800eea402918756e94a42e7bf2f7203a22cdc0c78eae378314eed1505938d12" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.394832 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.394869 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcxdl\" (UniqueName: \"kubernetes.io/projected/7203857c-1149-4912-9921-7e547f6418f6-kube-api-access-tcxdl\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.404283 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.405354 5030 scope.go:117] "RemoveContainer" containerID="582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82" Jan 20 23:34:22 crc kubenswrapper[5030]: E0120 23:34:22.405891 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(5193f521-7049-4215-9ec2-5ee65b965e72)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.496331 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.558601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.561724 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.575240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.576820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.578406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.582599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config" (OuterVolumeSpecName: "config") pod "7203857c-1149-4912-9921-7e547f6418f6" (UID: "7203857c-1149-4912-9921-7e547f6418f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.598651 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.598687 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.598702 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.598716 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.598727 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7203857c-1149-4912-9921-7e547f6418f6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.601039 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.611823 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:34:22 crc kubenswrapper[5030]: W0120 23:34:22.616592 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a2955e_219c_481d_bb5d_ca59f20d1255.slice/crio-0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892 WatchSource:0}: Error finding container 0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892: Status 404 returned error can't find the container with id 
0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892 Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.620291 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-676d666f78-55975" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.669602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:22 crc kubenswrapper[5030]: I0120 23:34:22.678464 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6d84555d94-5jgdn"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.097236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.098030 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-httpd" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.098048 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-httpd" Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.098069 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-api" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.098076 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-api" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.098300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-api" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.098320 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7203857c-1149-4912-9921-7e547f6418f6" containerName="neutron-httpd" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.134188 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.134300 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.209812 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.209858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7lt\" (UniqueName: \"kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.209938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.311336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.311402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7lt\" (UniqueName: \"kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.311485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.312101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.312166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.359500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" 
event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerStarted","Data":"0245694e9895e15ee5e885b1444d16d218789764028b37b96632396ae39f6ff7"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.359545 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerStarted","Data":"16f35b422f9f079e31efcdbbb024530bfa9466aa4f2e1d38af8351b8f4450329"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.359555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerStarted","Data":"0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.360727 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.360758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.362794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerStarted","Data":"f934c94653ca57fb3a0c59c188fa4c84207ab84b1a111cd3517e299176d2d609"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.366792 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerID="d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe" exitCode=0 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.366845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerDied","Data":"d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerStarted","Data":"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerStarted","Data":"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerStarted","Data":"7e8c34b367ede1364b745d0eb22af1a4f9825d558ec828565b9f836634fcdce3"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369359 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" containerID="cri-o://202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369423 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369452 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.369479 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" containerID="cri-o://42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.379089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7lt\" (UniqueName: \"kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt\") pod \"redhat-marketplace-5qbtc\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.396271 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.397854 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.401123 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/0.log" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.402010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerStarted","Data":"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1"} Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.433861 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.450469 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.460241 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.465910 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.474167 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" podStartSLOduration=4.474148295 podStartE2EDuration="4.474148295s" podCreationTimestamp="2026-01-20 23:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:23.438875821 +0000 UTC m=+3535.759136109" watchObservedRunningTime="2026-01-20 23:34:23.474148295 +0000 UTC m=+3535.794408583" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.496636 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.496888 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" containerID="cri-o://506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.497073 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="openstack-network-exporter" containerID="cri-o://decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvgxd\" (UniqueName: \"kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fwn8\" (UniqueName: \"kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517493 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.517563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.527059 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.530010 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.557658 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.558002 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="probe" containerID="cri-o://343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c" gracePeriod=30 Jan 20 23:34:23 crc 
kubenswrapper[5030]: I0120 23:34:23.557944 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="cinder-scheduler" containerID="cri-o://031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.607008 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621595 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621694 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") 
" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvgxd\" (UniqueName: \"kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.621845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fwn8\" (UniqueName: \"kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.623054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.623810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.635107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.637290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.637402 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:23 crc kubenswrapper[5030]: E0120 23:34:23.637438 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.637928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.638206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.649457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.653472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.663783 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.668304 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvgxd\" (UniqueName: \"kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd\") pod \"barbican-keystone-listener-64946f8974-8fmpj\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.687442 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.687702 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" containerName="memcached" containerID="cri-o://d5e32ad78f87773cab221c54225e04fd0ff72e850437ba7b5b0074e225f96e54" gracePeriod=30 Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.688220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fwn8\" (UniqueName: \"kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8\") pod \"barbican-worker-7b7d88d9b5-npm6f\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 
crc kubenswrapper[5030]: I0120 23:34:23.702060 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podStartSLOduration=4.702041562 podStartE2EDuration="4.702041562s" podCreationTimestamp="2026-01-20 23:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:23.623107241 +0000 UTC m=+3535.943367529" watchObservedRunningTime="2026-01-20 23:34:23.702041562 +0000 UTC m=+3536.022301850" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.803828 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.826231 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:34:23 crc kubenswrapper[5030]: I0120 23:34:23.900983 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.007757 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.008554 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api-log" containerID="cri-o://71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.009117 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" containerID="cri-o://8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.106308 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.116:8776/healthcheck\": EOF" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.143765 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.100:9696/\": dial tcp 10.217.0.100:9696: connect: connection refused" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.462929 5030 generic.go:334] "Generic (PLEG): container finished" podID="aaa15825-c66d-4625-86d9-2066c914a955" containerID="decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1" exitCode=2 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.464584 5030 generic.go:334] "Generic (PLEG): container finished" podID="c095502b-f026-40aa-a84f-057d9dc0e758" containerID="71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd" exitCode=143 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.465874 5030 generic.go:334] "Generic (PLEG): container finished" podID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerID="202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91" exitCode=143 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.708325 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7203857c-1149-4912-9921-7e547f6418f6" path="/var/lib/kubelet/pods/7203857c-1149-4912-9921-7e547f6418f6/volumes" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719609 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerDied","Data":"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1"} Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerDied","Data":"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd"} Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719713 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerDied","Data":"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91"} Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719724 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.719737 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.721220 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.721241 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.721251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.721265 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.731652 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" containerID="cri-o://4afb02528814af0d996d7fabe7520b1955adff3dccf2a87e4e71cb62bed38778" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.731841 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.737108 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="openstack-network-exporter" containerID="cri-o://c24781b0440cb0ec4f5466dcf225f8aee2a756ed49cb3c20f947a7fd55315498" gracePeriod=300 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.742500 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" containerID="cri-o://e06dfb116e83c6a042119cbcc962d6e9c94a759890c06b3cecc1ccd89cd716ab" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745814 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745829 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745841 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745850 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.745861 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.746750 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerName="keystone-api" containerID="cri-o://b5752b3ecf6216d9097f2ae1074c2eb5cb7b6c853218dcdae4da8ae22eb06798" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.746792 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.746808 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.746919 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.747041 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.748373 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.750456 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.750674 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.750894 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" containerID="cri-o://5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.751374 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" containerID="cri-o://5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1" gracePeriod=30 Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.770833 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.119:5000/v3\": EOF" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.840994 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846515 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846553 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846633 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: 
\"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846767 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j95mn\" (UniqueName: \"kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846867 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhm5\" (UniqueName: \"kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpgp\" (UniqueName: \"kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.846986 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4mt8\" (UniqueName: \"kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.847001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.847015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: 
\"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: 
\"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.951986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j95mn\" (UniqueName: \"kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs\") pod 
\"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952156 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhm5\" (UniqueName: \"kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzpgp\" (UniqueName: \"kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4mt8\" (UniqueName: \"kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.952366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.984365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:24 crc kubenswrapper[5030]: I0120 23:34:24.985192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.004237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.005737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.008270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.008293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.009674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.020212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle\") pod 
\"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.020552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.020955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.024881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.030025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.030987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.031767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.032199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.032929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.033376 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " 
pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.036330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4mt8\" (UniqueName: \"kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8\") pod \"keystone-78c6cc75f4-xhf7x\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.037174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.040474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.042419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.045182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.073433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.073495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.075153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzpgp\" (UniqueName: \"kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.078216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " 
pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.095409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j95mn\" (UniqueName: \"kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn\") pod \"neutron-557d5c9854-wbd5h\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.096070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhm5\" (UniqueName: \"kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5\") pod \"barbican-api-778f4b5b5b-7p9xw\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.103466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data\") pod \"placement-6db9c8bfb8-7rnnr\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.121828 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod27a52282-d198-4a51-912a-1c2267f20d4b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod27a52282-d198-4a51-912a-1c2267f20d4b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod27a52282_d198_4a51_912a_1c2267f20d4b.slice" Jan 20 23:34:25 crc kubenswrapper[5030]: E0120 23:34:25.121878 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod27a52282-d198-4a51-912a-1c2267f20d4b] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod27a52282-d198-4a51-912a-1c2267f20d4b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod27a52282_d198_4a51_912a_1c2267f20d4b.slice" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.160880 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="ovsdbserver-sb" containerID="cri-o://e936339e4e8ee4ffadf2179ae8d7a2c4c7acf995ef0f21e4513af509be552351" gracePeriod=300 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.161581 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.433403 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="galera" containerID="cri-o://933d723028f4a35c1b50d6e05e66a8a5d7dfd126606c38522abe3bed4c985ccc" gracePeriod=29 Jan 20 23:34:25 crc kubenswrapper[5030]: E0120 23:34:25.517138 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:25 crc kubenswrapper[5030]: E0120 23:34:25.531785 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.532202 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerID="b5752b3ecf6216d9097f2ae1074c2eb5cb7b6c853218dcdae4da8ae22eb06798" exitCode=0 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.532256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" event={"ID":"c8072f56-2133-435c-87fd-e0a273ee9e91","Type":"ContainerDied","Data":"b5752b3ecf6216d9097f2ae1074c2eb5cb7b6c853218dcdae4da8ae22eb06798"} Jan 20 23:34:25 crc kubenswrapper[5030]: E0120 23:34:25.557090 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:25 crc kubenswrapper[5030]: E0120 23:34:25.557151 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.562149 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/0.log" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.562194 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerID="5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8" exitCode=0 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.562243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerDied","Data":"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.566212 5030 generic.go:334] "Generic (PLEG): container finished" podID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" 
containerID="d5e32ad78f87773cab221c54225e04fd0ff72e850437ba7b5b0074e225f96e54" exitCode=0 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.566283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf","Type":"ContainerDied","Data":"d5e32ad78f87773cab221c54225e04fd0ff72e850437ba7b5b0074e225f96e54"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.566306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf","Type":"ContainerDied","Data":"761c37bf734cc250d9a7a79cc9d6abd457e8c00b3d6f3c264908f33d77e8f9eb"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.566316 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761c37bf734cc250d9a7a79cc9d6abd457e8c00b3d6f3c264908f33d77e8f9eb" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.567423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerStarted","Data":"5b53dd15d8c69e1e32d2fbdedd4cdd055d414a06bae73824d679b5ac7d1bd3cb"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.570067 5030 generic.go:334] "Generic (PLEG): container finished" podID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerID="e1bab47349c35f6aa385252a8ecdb111200aae28f3c39202af37cf667e9c4edb" exitCode=0 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.570109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerDied","Data":"e1bab47349c35f6aa385252a8ecdb111200aae28f3c39202af37cf667e9c4edb"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.581341 5030 generic.go:334] "Generic (PLEG): container finished" podID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerID="4afb02528814af0d996d7fabe7520b1955adff3dccf2a87e4e71cb62bed38778" exitCode=143 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.581402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerDied","Data":"4afb02528814af0d996d7fabe7520b1955adff3dccf2a87e4e71cb62bed38778"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.589075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerStarted","Data":"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.626004 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_d7bf3518-ec94-4593-9c4d-768add03ad93/ovsdbserver-sb/0.log" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.626050 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerID="c24781b0440cb0ec4f5466dcf225f8aee2a756ed49cb3c20f947a7fd55315498" exitCode=2 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.626066 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerID="e936339e4e8ee4ffadf2179ae8d7a2c4c7acf995ef0f21e4513af509be552351" exitCode=143 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.626130 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerDied","Data":"c24781b0440cb0ec4f5466dcf225f8aee2a756ed49cb3c20f947a7fd55315498"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.626155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerDied","Data":"e936339e4e8ee4ffadf2179ae8d7a2c4c7acf995ef0f21e4513af509be552351"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.628428 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzhtm" podStartSLOduration=5.117243436 podStartE2EDuration="17.628411211s" podCreationTimestamp="2026-01-20 23:34:08 +0000 UTC" firstStartedPulling="2026-01-20 23:34:11.690605453 +0000 UTC m=+3524.010865741" lastFinishedPulling="2026-01-20 23:34:24.201773228 +0000 UTC m=+3536.522033516" observedRunningTime="2026-01-20 23:34:25.615709014 +0000 UTC m=+3537.935969312" watchObservedRunningTime="2026-01-20 23:34:25.628411211 +0000 UTC m=+3537.948671509" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.658730 5030 generic.go:334] "Generic (PLEG): container finished" podID="5613b909-aedf-4a87-babc-ca3e427fe828" containerID="343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c" exitCode=0 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.658901 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" containerID="cri-o://f934c94653ca57fb3a0c59c188fa4c84207ab84b1a111cd3517e299176d2d609" gracePeriod=30 Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.659154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerDied","Data":"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c"} Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.659202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.734305 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.781324 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:34:25 crc kubenswrapper[5030]: I0120 23:34:25.815406 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.027653 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a52282-d198-4a51-912a-1c2267f20d4b" path="/var/lib/kubelet/pods/27a52282-d198-4a51-912a-1c2267f20d4b/volumes" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.056572 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.061848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.061964 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.078109 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.078473 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-54dw7" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.078591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.078857 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.141742 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.156887 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.190395 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.222919 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.223215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dtzs\" (UniqueName: \"kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.328513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config\") pod \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.328601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs\") pod \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.328670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxqn4\" (UniqueName: \"kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4\") pod \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.328727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle\") pod \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.328776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data\") pod \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\" (UID: \"53c21d1a-4b88-4cd9-bf6c-2394c988dbcf\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 
20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329187 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dtzs\" (UniqueName: \"kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329390 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.329444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.332163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" (UID: "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.333693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.333965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.334069 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data" (OuterVolumeSpecName: "config-data") pod "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" (UID: "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.335466 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.342092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.342384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.343355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.419585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.420232 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4" (OuterVolumeSpecName: "kube-api-access-vxqn4") pod "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" (UID: "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf"). InnerVolumeSpecName "kube-api-access-vxqn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.445290 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.456938 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.460869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8zhw\" (UniqueName: \"kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw\") pod \"5193f521-7049-4215-9ec2-5ee65b965e72\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.460929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data\") pod \"5193f521-7049-4215-9ec2-5ee65b965e72\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.461860 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.461876 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxqn4\" (UniqueName: \"kubernetes.io/projected/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-kube-api-access-vxqn4\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.461888 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.466363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw" (OuterVolumeSpecName: "kube-api-access-c8zhw") pod "5193f521-7049-4215-9ec2-5ee65b965e72" (UID: "5193f521-7049-4215-9ec2-5ee65b965e72"). InnerVolumeSpecName "kube-api-access-c8zhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.470512 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.485390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dtzs\" (UniqueName: \"kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.529866 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_d7bf3518-ec94-4593-9c4d-768add03ad93/ovsdbserver-sb/0.log" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.529944 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.542399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.549767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" (UID: "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.562518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle\") pod \"5193f521-7049-4215-9ec2-5ee65b965e72\" (UID: \"5193f521-7049-4215-9ec2-5ee65b965e72\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.563571 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8zhw\" (UniqueName: \"kubernetes.io/projected/5193f521-7049-4215-9ec2-5ee65b965e72-kube-api-access-c8zhw\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.563586 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.566072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" (UID: "53c21d1a-4b88-4cd9-bf6c-2394c988dbcf"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.645479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data" (OuterVolumeSpecName: "config-data") pod "5193f521-7049-4215-9ec2-5ee65b965e72" (UID: "5193f521-7049-4215-9ec2-5ee65b965e72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.666943 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qttr6\" (UniqueName: \"kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6\") pod \"d7bf3518-ec94-4593-9c4d-768add03ad93\" (UID: \"d7bf3518-ec94-4593-9c4d-768add03ad93\") " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.667542 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.667562 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.677716 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5193f521-7049-4215-9ec2-5ee65b965e72" (UID: "5193f521-7049-4215-9ec2-5ee65b965e72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.678392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts" (OuterVolumeSpecName: "scripts") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.679392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config" (OuterVolumeSpecName: "config") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.683789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.698721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5193f521-7049-4215-9ec2-5ee65b965e72","Type":"ContainerDied","Data":"b9c6b8925392291ee2c6a100e763bbe8df320474a6705bd387db8379dc246485"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.699073 5030 scope.go:117] "RemoveContainer" containerID="582661be087cfd633cc6137300755d1479d99380a51515479b7469ad2950cc82" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.699156 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.710861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6" (OuterVolumeSpecName: "kube-api-access-qttr6") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "kube-api-access-qttr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.711021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.729952 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerID="d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed" exitCode=0 Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.730046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerDied","Data":"d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.750296 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.789878 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerID="933d723028f4a35c1b50d6e05e66a8a5d7dfd126606c38522abe3bed4c985ccc" exitCode=0 Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.789968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerDied","Data":"933d723028f4a35c1b50d6e05e66a8a5d7dfd126606c38522abe3bed4c985ccc"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.811783 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.816946 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.816994 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.817006 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5193f521-7049-4215-9ec2-5ee65b965e72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.817038 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.817066 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf3518-ec94-4593-9c4d-768add03ad93-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.817081 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qttr6\" (UniqueName: \"kubernetes.io/projected/d7bf3518-ec94-4593-9c4d-768add03ad93-kube-api-access-qttr6\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.835813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.860005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" 
event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerStarted","Data":"6197a3bce9c647c502cc83d469a50e6590cde057c66a12bfa91c47c2073f2f65"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.871945 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.873845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.882082 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.901789 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_d7bf3518-ec94-4593-9c4d-768add03ad93/ovsdbserver-sb/0.log" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.904523 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.905325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"d7bf3518-ec94-4593-9c4d-768add03ad93","Type":"ContainerDied","Data":"ba61e8605605b6d4b63b3d5d0c266d4591ed247e2d3f3e1bb8b4ead232bd7d91"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.908055 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.912383 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.921189 5030 scope.go:117] "RemoveContainer" containerID="c24781b0440cb0ec4f5466dcf225f8aee2a756ed49cb3c20f947a7fd55315498" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.922795 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.927096 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.927686 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.931809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.931914 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.936876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerStarted","Data":"b227144782ed82b8ff9633022adb1be69c5e6055e89903a202f63d2c3a5def9f"} Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944000 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944471 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" containerName="memcached" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944496 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" containerName="memcached" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944513 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944522 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944535 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="ovsdbserver-sb" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944543 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="ovsdbserver-sb" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944559 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerName="keystone-api" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerName="keystone-api" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944584 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-log" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944593 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-log" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944615 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-api" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944640 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-api" Jan 20 23:34:26 crc kubenswrapper[5030]: E0120 23:34:26.944656 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="openstack-network-exporter" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="openstack-network-exporter" Jan 20 
23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944891 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" containerName="memcached" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944914 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-log" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944927 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944941 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944952 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="ovsdbserver-sb" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" containerName="nova-api-api" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.944989 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" containerName="keystone-api" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.945004 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" containerName="openstack-network-exporter" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.945840 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.947408 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.951230 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api-log" containerID="cri-o://16f35b422f9f079e31efcdbbb024530bfa9466aa4f2e1d38af8351b8f4450329" gracePeriod=30 Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.951395 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api" containerID="cri-o://0245694e9895e15ee5e885b1444d16d218789764028b37b96632396ae39f6ff7" gracePeriod=30 Jan 20 23:34:26 crc kubenswrapper[5030]: I0120 23:34:26.990979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d7bf3518-ec94-4593-9c4d-768add03ad93" (UID: "d7bf3518-ec94-4593-9c4d-768add03ad93"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.001453 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.032659 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.032997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033023 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6j6d\" (UniqueName: \"kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdf4b\" (UniqueName: \"kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033124 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts\") pod \"c8072f56-2133-435c-87fd-e0a273ee9e91\" (UID: \"c8072f56-2133-435c-87fd-e0a273ee9e91\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs\") pod \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\" (UID: \"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfn69\" (UniqueName: \"kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033786 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.033804 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7bf3518-ec94-4593-9c4d-768add03ad93-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.063643 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs" (OuterVolumeSpecName: "logs") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.090150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.095851 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts" (OuterVolumeSpecName: "scripts") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.118226 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.118399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.136767 5030 scope.go:117] "RemoveContainer" containerID="e936339e4e8ee4ffadf2179ae8d7a2c4c7acf995ef0f21e4513af509be552351" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.137028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d" (OuterVolumeSpecName: "kube-api-access-w6j6d") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). InnerVolumeSpecName "kube-api-access-w6j6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.137253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b" (OuterVolumeSpecName: "kube-api-access-rdf4b") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "kube-api-access-rdf4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.160046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.160141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfn69\" (UniqueName: \"kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.160230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.175373 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183138 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183178 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183191 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6j6d\" (UniqueName: \"kubernetes.io/projected/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-kube-api-access-w6j6d\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183204 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183214 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdf4b\" (UniqueName: \"kubernetes.io/projected/c8072f56-2133-435c-87fd-e0a273ee9e91-kube-api-access-rdf4b\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.183230 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.196463 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.202166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.209512 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data" (OuterVolumeSpecName: "config-data") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.215874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.264281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfn69\" (UniqueName: \"kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69\") pod \"nova-cell0-conductor-0\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.268091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.286019 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.286046 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.314578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.314960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.319136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.387763 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.387803 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.387816 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.397094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c8072f56-2133-435c-87fd-e0a273ee9e91" (UID: "c8072f56-2133-435c-87fd-e0a273ee9e91"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.411421 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data" (OuterVolumeSpecName: "config-data") pod "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" (UID: "7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.497268 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.497300 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8072f56-2133-435c-87fd-e0a273ee9e91-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.559337 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.623953 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.626707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.631972 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.663132 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.664042 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.677416 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: W0120 23:34:27.696919 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0979c96_fbd8_4e5b_a22b_35c1cae6e6da.slice/crio-fbeb9f441575597f0a4417989c98da7e307dc5d45a00cfa38b2f2c1696700e56 WatchSource:0}: Error finding container fbeb9f441575597f0a4417989c98da7e307dc5d45a00cfa38b2f2c1696700e56: Status 404 returned error can't find the container with id fbeb9f441575597f0a4417989c98da7e307dc5d45a00cfa38b2f2c1696700e56 Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj7zd\" (UniqueName: \"kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbl5q\" (UniqueName: \"kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717636 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.717659 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.719780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.721668 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.727102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.741142 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.753909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q" (OuterVolumeSpecName: "kube-api-access-lbl5q") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "kube-api-access-lbl5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.767023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.767102 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.779564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd" (OuterVolumeSpecName: "kube-api-access-gj7zd") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "kube-api-access-gj7zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.795502 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.813637 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: E0120 23:34:27.814091 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814105 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" containerName="nova-cell0-conductor-conductor" Jan 20 23:34:27 crc kubenswrapper[5030]: E0120 23:34:27.814112 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="probe" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814118 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="probe" Jan 20 23:34:27 crc kubenswrapper[5030]: E0120 23:34:27.814137 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="galera" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="galera" Jan 20 23:34:27 crc kubenswrapper[5030]: E0120 23:34:27.814155 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="cinder-scheduler" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814161 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="cinder-scheduler" Jan 20 23:34:27 crc kubenswrapper[5030]: E0120 23:34:27.814172 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="mysql-bootstrap" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="mysql-bootstrap" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814355 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" containerName="galera" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814372 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="probe" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.814382 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" containerName="cinder-scheduler" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.815036 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.819644 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.819925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts\") pod \"5613b909-aedf-4a87-babc-ca3e427fe828\" (UID: \"5613b909-aedf-4a87-babc-ca3e427fe828\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.819946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.819961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle\") pod \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\" (UID: \"c1c966f9-0e90-4a5d-9e43-81ec4980cefc\") " Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.819993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820412 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820429 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820439 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820450 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820461 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820469 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5613b909-aedf-4a87-babc-ca3e427fe828-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820477 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj7zd\" (UniqueName: \"kubernetes.io/projected/5613b909-aedf-4a87-babc-ca3e427fe828-kube-api-access-gj7zd\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.820486 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbl5q\" (UniqueName: \"kubernetes.io/projected/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-kube-api-access-lbl5q\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.823480 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n56b6" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.823704 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.830366 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.850066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts" (OuterVolumeSpecName: "scripts") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.858886 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.867849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.884849 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.887389 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.891393 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.892549 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.892598 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.892699 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.893171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.909397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8td79\" (UniqueName: \"kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925566 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925636 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhjpb\" (UniqueName: \"kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925883 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:27 crc kubenswrapper[5030]: I0120 23:34:27.925905 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:27.997296 5030 generic.go:334] "Generic (PLEG): container finished" podID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerID="0245694e9895e15ee5e885b1444d16d218789764028b37b96632396ae39f6ff7" exitCode=0 Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:27.997848 5030 generic.go:334] "Generic (PLEG): container finished" podID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerID="16f35b422f9f079e31efcdbbb024530bfa9466aa4f2e1d38af8351b8f4450329" exitCode=143 Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:27.997310 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5193f521-7049-4215-9ec2-5ee65b965e72" path="/var/lib/kubelet/pods/5193f521-7049-4215-9ec2-5ee65b965e72/volumes" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:27.998981 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c21d1a-4b88-4cd9-bf6c-2394c988dbcf" path="/var/lib/kubelet/pods/53c21d1a-4b88-4cd9-bf6c-2394c988dbcf/volumes" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:27.999938 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bf3518-ec94-4593-9c4d-768add03ad93" 
path="/var/lib/kubelet/pods/d7bf3518-ec94-4593-9c4d-768add03ad93/volumes" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.014758 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027578 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhjpb\" (UniqueName: \"kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.027778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8td79\" (UniqueName: \"kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.030200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.031299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.032369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.032462 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.035737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.038171 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.053729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.053745 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.060969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.074645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.077508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.078304 5030 generic.go:334] "Generic (PLEG): container finished" podID="5613b909-aedf-4a87-babc-ca3e427fe828" containerID="031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763" exitCode=0 Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.078565 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.089798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8td79\" (UniqueName: \"kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.090523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhjpb\" (UniqueName: \"kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb\") pod \"memcached-0\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.100801 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.122317 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.151308 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n56b6" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.159318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.267694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.297249 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.308486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.337319 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.337345 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.388318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.399344 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "c1c966f9-0e90-4a5d-9e43-81ec4980cefc" (UID: "c1c966f9-0e90-4a5d-9e43-81ec4980cefc"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.432506 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data" (OuterVolumeSpecName: "config-data") pod "5613b909-aedf-4a87-babc-ca3e427fe828" (UID: "5613b909-aedf-4a87-babc-ca3e427fe828"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.443744 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1c966f9-0e90-4a5d-9e43-81ec4980cefc-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.443791 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.443803 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5613b909-aedf-4a87-babc-ca3e427fe828-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.526463 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-hrzkh" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.538509 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.631747 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2de7a5fe-85b0-4b7c-af74-ec2c104fac94"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2de7a5fe-85b0-4b7c-af74-ec2c104fac94] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2de7a5fe_85b0_4b7c_af74_ec2c104fac94.slice" Jan 20 23:34:28 crc kubenswrapper[5030]: E0120 23:34:28.631836 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2de7a5fe-85b0-4b7c-af74-ec2c104fac94] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2de7a5fe-85b0-4b7c-af74-ec2c104fac94] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2de7a5fe_85b0_4b7c_af74_ec2c104fac94.slice" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.640522 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod37c2ebda-5e72-41eb-9c8c-1326134768c7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod37c2ebda-5e72-41eb-9c8c-1326134768c7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod37c2ebda_5e72_41eb_9c8c_1326134768c7.slice" Jan 20 23:34:28 crc kubenswrapper[5030]: E0120 23:34:28.640551 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod37c2ebda-5e72-41eb-9c8c-1326134768c7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod37c2ebda-5e72-41eb-9c8c-1326134768c7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod37c2ebda_5e72_41eb_9c8c_1326134768c7.slice" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.652801 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8] : Timed out while waiting for 
systemd to remove kubepods-besteffort-pod2ae89f8a_49bd_4a0b_8e3c_3ee6999395d8.slice" Jan 20 23:34:28 crc kubenswrapper[5030]: E0120 23:34:28.652972 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2ae89f8a_49bd_4a0b_8e3c_3ee6999395d8.slice" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerDied","Data":"0245694e9895e15ee5e885b1444d16d218789764028b37b96632396ae39f6ff7"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerDied","Data":"16f35b422f9f079e31efcdbbb024530bfa9466aa4f2e1d38af8351b8f4450329"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716397 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6","Type":"ContainerDied","Data":"b55591f7f581673e6196c5fcab36a06878621f4bb7f255642f69739d3102cf41"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerStarted","Data":"d067ff9e7fd3a18c290c327283c6479bcb00b964b9f5c64f4ad32859942f2e6b"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerStarted","Data":"cef03121a4040309a93985530bf6c90fc8542184ca20627744371245053e16db"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716451 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerDied","Data":"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"5613b909-aedf-4a87-babc-ca3e427fe828","Type":"ContainerDied","Data":"af80d5994aeeb83a95385a0f7fce28dff4b04a7a0c3cd44b876f399ee8c542dd"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerStarted","Data":"3ba60cf0d2af06af168f14d98c0ee6b461d95799278997d935c999524b467e98"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716504 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:34:28 crc 
kubenswrapper[5030]: I0120 23:34:28.716514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq" event={"ID":"c8072f56-2133-435c-87fd-e0a273ee9e91","Type":"ContainerDied","Data":"3ef07a5f16be50259078c28b946630efabeb51ba07519002de54bd4eb3dfe589"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" event={"ID":"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da","Type":"ContainerStarted","Data":"fbeb9f441575597f0a4417989c98da7e307dc5d45a00cfa38b2f2c1696700e56"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c1c966f9-0e90-4a5d-9e43-81ec4980cefc","Type":"ContainerDied","Data":"62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00"} Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.716555 5030 scope.go:117] "RemoveContainer" containerID="e1bab47349c35f6aa385252a8ecdb111200aae28f3c39202af37cf667e9c4edb" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.741958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.792895 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.848273 5030 scope.go:117] "RemoveContainer" containerID="1367bed5839e543dc91ccaa45f346a8078b98871533bbd6dbcd36fea44e6682e" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.879273 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.891828 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5b8b6c8845-fv7lq"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.908088 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.936546 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.951348 5030 scope.go:117] "RemoveContainer" containerID="343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.952668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.952701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.953441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: 
\"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.953476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.953526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxjqv\" (UniqueName: \"kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.953556 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.953659 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs\") pod \"b3a2955e-219c-481d-bb5d-ca59f20d1255\" (UID: \"b3a2955e-219c-481d-bb5d-ca59f20d1255\") " Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.966558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs" (OuterVolumeSpecName: "logs") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.968272 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.981718 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:28 crc kubenswrapper[5030]: E0120 23:34:28.982183 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.982203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api" Jan 20 23:34:28 crc kubenswrapper[5030]: E0120 23:34:28.982217 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api-log" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.982223 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api-log" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.982402 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api-log" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.982427 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" containerName="barbican-api" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.983632 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.991383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.991568 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.991696 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-5fq4w" Jan 20 23:34:28 crc kubenswrapper[5030]: I0120 23:34:28.992246 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.005389 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.016403 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.024141 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.025838 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.036331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.037170 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.057280 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3a2955e-219c-481d-bb5d-ca59f20d1255-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.066468 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.107086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv" (OuterVolumeSpecName: "kube-api-access-jxjqv") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "kube-api-access-jxjqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.115512 5030 scope.go:117] "RemoveContainer" containerID="031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159816 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl72g\" (UniqueName: \"kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159869 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.159957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcfq\" (UniqueName: \"kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.160047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.160962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxjqv\" (UniqueName: \"kubernetes.io/projected/b3a2955e-219c-481d-bb5d-ca59f20d1255-kube-api-access-jxjqv\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.184975 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.186293 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.194349 5030 generic.go:334] "Generic (PLEG): container finished" podID="af288905-8d90-4f98-ba70-c7351555851b" containerID="2a5756d22f4f3a9b513c8862e9d5fd28415db4db8f9387d9f8b00baa64a602f6" exitCode=137 Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.194455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" 
event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerDied","Data":"2a5756d22f4f3a9b513c8862e9d5fd28415db4db8f9387d9f8b00baa64a602f6"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.197095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" event={"ID":"b3a2955e-219c-481d-bb5d-ca59f20d1255","Type":"ContainerDied","Data":"0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.197220 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.197736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.291278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293325 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl72g\" (UniqueName: \"kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293632 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzcfq\" (UniqueName: \"kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293763 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.293775 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.295685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978","Type":"ContainerStarted","Data":"ec7e149092fad6af623b19824e5babe12171c69a3b87ef9ad55041bd42e9fa51"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.297531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.297870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.298060 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.298437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.298741 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.299881 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerStarted","Data":"22afbcef39ae947fe57abfd65f655ba84c23f3719274f3578deb9b1aff9aee9f"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.300740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.310215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.320827 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.326413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerStarted","Data":"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.328462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.356061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.364976 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.370808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerStarted","Data":"bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.371945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerStarted","Data":"a561ed0a1b8342fc0104655d970929de17c3ce7e53b1072404ab41d45299547f"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.371966 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.383220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.383686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl72g\" (UniqueName: \"kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g\") pod \"cinder-scheduler-0\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:29 crc 
kubenswrapper[5030]: I0120 23:34:29.383971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcfq\" (UniqueName: \"kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.398132 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.39811525 podStartE2EDuration="3.39811525s" podCreationTimestamp="2026-01-20 23:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:29.39398435 +0000 UTC m=+3541.714244638" watchObservedRunningTime="2026-01-20 23:34:29.39811525 +0000 UTC m=+3541.718375538" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.415845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerStarted","Data":"6c9c4f202a2cb68a56587c933cd90e0e468cf5c5ce99564d98ebbae3c4fae420"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.428257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerStarted","Data":"b09cb47fdff353d2277388987884452b6f97530142dd767855a2b065ff8722ae"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.431350 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerID="0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278" exitCode=1 Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.431413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerDied","Data":"0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.434860 5030 generic.go:334] "Generic (PLEG): container finished" podID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerID="ddab932bf79386702fccfc6c79a0fd5f50d026b98c269df11b368ec27242c3df" exitCode=137 Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.434903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerDied","Data":"ddab932bf79386702fccfc6c79a0fd5f50d026b98c269df11b368ec27242c3df"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.438002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerStarted","Data":"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.497974 5030 generic.go:334] "Generic (PLEG): container finished" podID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" containerID="7610bfc12a250b49177a9212a8e1ab21b02766f17a8cb9d0985354440968ab12" exitCode=137 Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.498102 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.499038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" event={"ID":"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d","Type":"ContainerDied","Data":"7610bfc12a250b49177a9212a8e1ab21b02766f17a8cb9d0985354440968ab12"} Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.499113 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.499657 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.574797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.690705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data" (OuterVolumeSpecName: "config-data") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.702878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.704171 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.739282 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.810392 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.815990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3a2955e-219c-481d-bb5d-ca59f20d1255" (UID: "b3a2955e-219c-481d-bb5d-ca59f20d1255"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.892766 5030 scope.go:117] "RemoveContainer" containerID="343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c" Jan 20 23:34:29 crc kubenswrapper[5030]: E0120 23:34:29.893819 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c\": container with ID starting with 343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c not found: ID does not exist" containerID="343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.893845 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c"} err="failed to get container status \"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c\": rpc error: code = NotFound desc = could not find container \"343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c\": container with ID starting with 343f155d1a459e91f33918892e6f512eb6a0c2d10d0aff7fca8e1becf913fd1c not found: ID does not exist" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.893866 5030 scope.go:117] "RemoveContainer" containerID="031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763" Jan 20 23:34:29 crc kubenswrapper[5030]: E0120 23:34:29.896997 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763\": container with ID starting with 031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763 not found: ID does not exist" containerID="031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.897025 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763"} err="failed to get container status \"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763\": rpc error: code = NotFound desc = could not find container \"031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763\": container with ID starting with 031ff5602823d87e2940a4709da9332283c628b5bf6321ae670f5bb9b3a72763 not found: ID does not exist" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.897045 5030 scope.go:117] "RemoveContainer" containerID="b5752b3ecf6216d9097f2ae1074c2eb5cb7b6c853218dcdae4da8ae22eb06798" Jan 20 23:34:29 crc kubenswrapper[5030]: I0120 23:34:29.913008 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3a2955e-219c-481d-bb5d-ca59f20d1255-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:29.999962 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5613b909-aedf-4a87-babc-ca3e427fe828" path="/var/lib/kubelet/pods/5613b909-aedf-4a87-babc-ca3e427fe828/volumes" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.000825 5030 scope.go:117] "RemoveContainer" containerID="933d723028f4a35c1b50d6e05e66a8a5d7dfd126606c38522abe3bed4c985ccc" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.005493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c1c966f9-0e90-4a5d-9e43-81ec4980cefc" path="/var/lib/kubelet/pods/c1c966f9-0e90-4a5d-9e43-81ec4980cefc/volumes" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.006287 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8072f56-2133-435c-87fd-e0a273ee9e91" path="/var/lib/kubelet/pods/c8072f56-2133-435c-87fd-e0a273ee9e91/volumes" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.130040 5030 scope.go:117] "RemoveContainer" containerID="6b394c55a99879becceaf8003a6a0e868d119d03350ca95317201ad98351c493" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.214351 5030 scope.go:117] "RemoveContainer" containerID="0245694e9895e15ee5e885b1444d16d218789764028b37b96632396ae39f6ff7" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.236588 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.290655 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.291041 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.311813 5030 scope.go:117] "RemoveContainer" containerID="16f35b422f9f079e31efcdbbb024530bfa9466aa4f2e1d38af8351b8f4450329" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.313015 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.328429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle\") pod \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.328601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs\") pod \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.328679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom\") pod \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.328740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data\") pod \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.328829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68tfg\" (UniqueName: \"kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg\") pod \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\" (UID: \"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.330025 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs" (OuterVolumeSpecName: "logs") pod "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" (UID: "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.330851 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.352605 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.358602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.410766 5030 scope.go:117] "RemoveContainer" containerID="8d583b27015652f7e8e83138eda5370ac6f3f2cbeca97079c99f7be638682e39" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.425798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" (UID: "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.432967 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pj4n\" (UniqueName: \"kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hsmx\" (UniqueName: \"kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx\") pod \"af288905-8d90-4f98-ba70-c7351555851b\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs\") pod \"af288905-8d90-4f98-ba70-c7351555851b\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle\") pod \"af288905-8d90-4f98-ba70-c7351555851b\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.433280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data\") pod \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.434116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs" (OuterVolumeSpecName: "logs") pod "af288905-8d90-4f98-ba70-c7351555851b" (UID: "af288905-8d90-4f98-ba70-c7351555851b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.436915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data\") pod \"af288905-8d90-4f98-ba70-c7351555851b\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.437010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qhzl\" (UniqueName: \"kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl\") pod \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.437082 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom\") pod \"af288905-8d90-4f98-ba70-c7351555851b\" (UID: \"af288905-8d90-4f98-ba70-c7351555851b\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.437298 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys\") pod \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\" (UID: \"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.437579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle\") pod \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\" (UID: \"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d\") " Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.445675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts" (OuterVolumeSpecName: "scripts") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.450304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg" (OuterVolumeSpecName: "kube-api-access-68tfg") pod "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" (UID: "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df"). InnerVolumeSpecName "kube-api-access-68tfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.450596 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.458064 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af288905-8d90-4f98-ba70-c7351555851b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.458223 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.458286 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.458356 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.458424 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68tfg\" (UniqueName: \"kubernetes.io/projected/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-kube-api-access-68tfg\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.466683 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467097 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467128 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467134 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" containerName="keystone-api" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" containerName="keystone-api" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker-log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker-log" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467178 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af288905-8d90-4f98-ba70-c7351555851b" 
containerName="barbican-worker" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467184 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467197 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener-log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467203 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener-log" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.467215 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467221 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467412 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" containerName="keystone-api" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467424 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker-log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467431 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467442 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467455 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467466 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="af288905-8d90-4f98-ba70-c7351555851b" containerName="barbican-worker" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.467477 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" containerName="barbican-keystone-listener-log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.468476 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.470825 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.470977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.471165 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-6vjfc" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.471478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.493805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx" (OuterVolumeSpecName: "kube-api-access-2hsmx") pod "af288905-8d90-4f98-ba70-c7351555851b" (UID: "af288905-8d90-4f98-ba70-c7351555851b"). InnerVolumeSpecName "kube-api-access-2hsmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.514364 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.518722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl" (OuterVolumeSpecName: "kube-api-access-2qhzl") pod "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" (UID: "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d"). InnerVolumeSpecName "kube-api-access-2qhzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.518832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n" (OuterVolumeSpecName: "kube-api-access-7pj4n") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "kube-api-access-7pj4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.518907 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.520390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.520461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "af288905-8d90-4f98-ba70-c7351555851b" (UID: "af288905-8d90-4f98-ba70-c7351555851b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.525234 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:30 crc kubenswrapper[5030]: E0120 23:34:30.525282 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.534144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" event={"ID":"af288905-8d90-4f98-ba70-c7351555851b","Type":"ContainerDied","Data":"8dafb465f988901e4fe4867094e00575da0eb7bc1d0020e5216b4a9150e27f42"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.534225 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.535922 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.545309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.546662 5030 scope.go:117] "RemoveContainer" containerID="2a5756d22f4f3a9b513c8862e9d5fd28415db4db8f9387d9f8b00baa64a602f6" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.562878 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.564817 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-httpd/0.log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.565501 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.566697 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-api/0.log" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.566772 5030 generic.go:334] "Generic (PLEG): container finished" podID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerID="98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9" exitCode=137 Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.566804 5030 generic.go:334] "Generic (PLEG): container finished" podID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerID="5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb" exitCode=137 Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.566899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerDied","Data":"98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.566945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerDied","Data":"5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.574880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs\") pod \"glance-default-external-api-0\" 
(UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspcr\" (UniqueName: \"kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575720 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575739 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qhzl\" (UniqueName: \"kubernetes.io/projected/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-kube-api-access-2qhzl\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575753 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575763 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575777 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pj4n\" (UniqueName: \"kubernetes.io/projected/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-kube-api-access-7pj4n\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.575787 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hsmx\" (UniqueName: \"kubernetes.io/projected/af288905-8d90-4f98-ba70-c7351555851b-kube-api-access-2hsmx\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.577075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" event={"ID":"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da","Type":"ContainerStarted","Data":"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.577168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.585275 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.599187 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.608345 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.608320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b56fccfd8-pxhmq" event={"ID":"e566cb27-7416-41c6-b6e4-d31c2fdb3f1d","Type":"ContainerDied","Data":"72a59b1fd23ebc0f8e8c313dbec6a6d1db2478bf9de1ccc57dd3cc6a9ae73bb6"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.612158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerStarted","Data":"c9c613bf81cb8eafa07712719e2192745559c6b4dc4e83bb5234258f4ae6d3d0"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.615007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerStarted","Data":"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.616073 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.616149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d","Type":"ContainerDied","Data":"11bb102f85b946924c2d3fb32c00ebef6d6f693f819cb13039457de7fa8497db"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.631681 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.631764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerStarted","Data":"a569f0520ff40c5656ae25d907d285351006a662406feefdd835a9730883f656"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.647002 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.647485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t" event={"ID":"ceac9104-14e9-4ef2-b2f3-30a3bb64d9df","Type":"ContainerDied","Data":"723b0978ae71c86018114fdc1223e1db2589a8a21412aca5690191cf3fbae746"} Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.662004 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.666466 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.668552 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.668655 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.673271 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.675028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.677141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.677563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.677936 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.677959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.678071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 
23:34:30.678125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.678161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.678198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dspcr\" (UniqueName: \"kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.678922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.680838 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.681026 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-st4ds" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.681137 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.681181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.682321 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.686103 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.699281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.700052 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.712820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.713684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.715402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspcr\" (UniqueName: \"kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.721943 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.731958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.753089 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.764074 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-54d7f78ddd-xnd26"] Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jscx\" (UniqueName: \"kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: 
I0120 23:34:30.780560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780751 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs6b7\" (UniqueName: \"kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.780870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.781411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.781518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.781563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.781720 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.782886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.815140 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" podStartSLOduration=6.81511721 podStartE2EDuration="6.81511721s" podCreationTimestamp="2026-01-20 23:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:30.60649012 +0000 UTC m=+3542.926750408" watchObservedRunningTime="2026-01-20 23:34:30.81511721 +0000 UTC m=+3543.135377498" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: 
I0120 23:34:30.884899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs6b7\" (UniqueName: \"kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.884994 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 
23:34:30.885152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jscx\" (UniqueName: \"kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.885236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.886217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.887039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.887494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.894143 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.894575 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc 
kubenswrapper[5030]: I0120 23:34:30.894824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.895801 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.895891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.907725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.907769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.909993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.916533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.918966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.919552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc 
kubenswrapper[5030]: I0120 23:34:30.930210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jscx\" (UniqueName: \"kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:30 crc kubenswrapper[5030]: I0120 23:34:30.938858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs6b7\" (UniqueName: \"kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.217633 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.220288 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.277153 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" (UID: "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.298639 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.301736 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.335244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.356338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.385507 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.385749 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v64zq" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="registry-server" containerID="cri-o://d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708" gracePeriod=2 Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.402572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.435011 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.436416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af288905-8d90-4f98-ba70-c7351555851b" (UID: "af288905-8d90-4f98-ba70-c7351555851b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.452579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" (UID: "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.460723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data" (OuterVolumeSpecName: "config-data") pod "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" (UID: "cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.506789 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.506827 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.506837 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.506849 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.519782 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod84ff2188-a797-40b5-8749-ee4330e6bf9a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod84ff2188-a797-40b5-8749-ee4330e6bf9a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod84ff2188_a797_40b5_8749_ee4330e6bf9a.slice" Jan 20 23:34:31 crc kubenswrapper[5030]: E0120 23:34:31.519841 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod84ff2188-a797-40b5-8749-ee4330e6bf9a] : unable to destroy cgroup paths for cgroup [kubepods besteffort 
pod84ff2188-a797-40b5-8749-ee4330e6bf9a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod84ff2188_a797_40b5_8749_ee4330e6bf9a.slice" pod="openstack-kuttl-tests/nova-metadata-0" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.539822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.560759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.569359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data" (OuterVolumeSpecName: "config-data") pod "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" (UID: "e566cb27-7416-41c6-b6e4-d31c2fdb3f1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.570912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.572490 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod74b19e4a-b5b1-4c2f-9cf9-01097291798d"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod74b19e4a-b5b1-4c2f-9cf9-01097291798d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod74b19e4a_b5b1_4c2f_9cf9_01097291798d.slice" Jan 20 23:34:31 crc kubenswrapper[5030]: E0120 23:34:31.572524 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod74b19e4a-b5b1-4c2f-9cf9-01097291798d] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod74b19e4a-b5b1-4c2f-9cf9-01097291798d] : Timed out while waiting for systemd to remove kubepods-besteffort-pod74b19e4a_b5b1_4c2f_9cf9_01097291798d.slice" pod="openstack-kuttl-tests/ceilometer-0" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.575276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data" (OuterVolumeSpecName: "config-data") pod "af288905-8d90-4f98-ba70-c7351555851b" (UID: "af288905-8d90-4f98-ba70-c7351555851b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.606900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data" (OuterVolumeSpecName: "config-data") pod "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" (UID: "ceac9104-14e9-4ef2-b2f3-30a3bb64d9df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.608156 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.608178 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.608188 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af288905-8d90-4f98-ba70-c7351555851b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.608197 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.608206 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.727934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerStarted","Data":"a53f849ac1c1858e7caed509d7798f356ef005496fcb80d6c8c828ae2ffbfaba"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.729160 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.729185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.751258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerStarted","Data":"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.752194 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.765886 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podStartSLOduration=8.765869724 podStartE2EDuration="8.765869724s" podCreationTimestamp="2026-01-20 23:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:31.758940546 +0000 UTC m=+3544.079200834" watchObservedRunningTime="2026-01-20 
23:34:31.765869724 +0000 UTC m=+3544.086130012" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.774903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerStarted","Data":"750810e06e2ba2a1b4a96b2cb486db619a924a3c4e105f8d13577a72affcd1c6"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.784609 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podStartSLOduration=7.784593107 podStartE2EDuration="7.784593107s" podCreationTimestamp="2026-01-20 23:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:31.781925263 +0000 UTC m=+3544.102185551" watchObservedRunningTime="2026-01-20 23:34:31.784593107 +0000 UTC m=+3544.104853395" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.802155 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978","Type":"ContainerStarted","Data":"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.810892 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.815272 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.834133 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerID="d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708" exitCode=0 Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.834243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerDied","Data":"d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.844573 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=6.844554458 podStartE2EDuration="6.844554458s" podCreationTimestamp="2026-01-20 23:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:31.809804788 +0000 UTC m=+3544.130065076" watchObservedRunningTime="2026-01-20 23:34:31.844554458 +0000 UTC m=+3544.164814746" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.866006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerStarted","Data":"edb925b8ff8db1df5c8c5dbbe6706091487eaef81933749acfbeed04cff1a05e"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.868642 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=4.868632712 podStartE2EDuration="4.868632712s" podCreationTimestamp="2026-01-20 23:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:31.838042651 +0000 UTC m=+3544.158302939" 
watchObservedRunningTime="2026-01-20 23:34:31.868632712 +0000 UTC m=+3544.188892990" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.877345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerStarted","Data":"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96"} Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.965956 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerID="b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4" exitCode=0 Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.966680 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" podStartSLOduration=8.966660005 podStartE2EDuration="8.966660005s" podCreationTimestamp="2026-01-20 23:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:31.90577458 +0000 UTC m=+3544.226034868" watchObservedRunningTime="2026-01-20 23:34:31.966660005 +0000 UTC m=+3544.286920293" Jan 20 23:34:31 crc kubenswrapper[5030]: I0120 23:34:31.968173 5030 scope.go:117] "RemoveContainer" containerID="1b7ac12593213a554ea14bfe0f61419d4d62872f58e89bac2f8027ab71aa20eb" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.016084 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-httpd/0.log" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.032273 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-api/0.log" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.033311 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.033332 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.089974 5030 scope.go:117] "RemoveContainer" containerID="7610bfc12a250b49177a9212a8e1ab21b02766f17a8cb9d0985354440968ab12" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.111249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8" path="/var/lib/kubelet/pods/2ae89f8a-49bd-4a0b-8e3c-3ee6999395d8/volumes" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.112196 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de7a5fe-85b0-4b7c-af74-ec2c104fac94" path="/var/lib/kubelet/pods/2de7a5fe-85b0-4b7c-af74-ec2c104fac94/volumes" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.113703 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c2ebda-5e72-41eb-9c8c-1326134768c7" path="/var/lib/kubelet/pods/37c2ebda-5e72-41eb-9c8c-1326134768c7/volumes" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.114345 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a2955e-219c-481d-bb5d-ca59f20d1255" path="/var/lib/kubelet/pods/b3a2955e-219c-481d-bb5d-ca59f20d1255/volumes" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerDied","Data":"b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4"} Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115509 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerStarted","Data":"a86d9299fe6a4cdf4a78ed49c689e395c66ed9e01288c02768d5391bfd8715d0"} Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerStarted","Data":"bb888a669f56ad697c7a2cb6314f21c6d4d03d654bc273d2bff1d915da2291d6"} Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-676d666f78-55975" event={"ID":"a84cf4eb-2a79-4637-aca8-d84c4e588fe7","Type":"ContainerDied","Data":"8a5263047017b01347184055ef56482fe8d62fe194eed7c994156f67f1bacf32"} Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115566 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5263047017b01347184055ef56482fe8d62fe194eed7c994156f67f1bacf32" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.115843 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener-log" containerID="cri-o://201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3" gracePeriod=30 Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.116356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" 
podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener" containerID="cri-o://acc6b39f58731022e9895707a3b85947e579adfd11aa69a1924f699081e64377" gracePeriod=30 Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.528090 5030 scope.go:117] "RemoveContainer" containerID="0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.549707 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.596735 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.653992 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-dcf7bb847-h8v9h"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.674709 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.677691 5030 scope.go:117] "RemoveContainer" containerID="ddab932bf79386702fccfc6c79a0fd5f50d026b98c269df11b368ec27242c3df" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.701843 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.772664 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.774614 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.778875 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.779075 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.786943 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.833310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.836821 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-httpd/0.log" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.841516 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-676d666f78-55975_a84cf4eb-2a79-4637-aca8-d84c4e588fe7/neutron-api/0.log" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.841610 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.844075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.844139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.844198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.844236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4pzn\" (UniqueName: \"kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.844304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.859669 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.880746 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.895235 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.915653 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.916208 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.116:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.916665 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v64zq" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.945961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.947214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.947288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.947336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4pzn\" (UniqueName: \"kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.947413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.947979 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.944632 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.956918 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.975353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.979913 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.979966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:32 crc kubenswrapper[5030]: I0120 23:34:32.998200 5030 scope.go:117] "RemoveContainer" containerID="aa169a139bd24a0b6ddc7e560156c87e40f4eba5bcd5d1dd9974ceb2324a9013" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.007133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4pzn\" (UniqueName: \"kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn\") pod \"nova-metadata-0\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.010723 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b56fccfd8-pxhmq"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.051979 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: E0120 23:34:33.052450 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-api" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-api" Jan 20 23:34:33 crc kubenswrapper[5030]: E0120 23:34:33.052487 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-httpd" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052494 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-httpd" Jan 20 23:34:33 crc kubenswrapper[5030]: E0120 23:34:33.052514 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="registry-server" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052521 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="registry-server" Jan 20 23:34:33 crc kubenswrapper[5030]: E0120 23:34:33.052531 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="extract-content" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052537 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="extract-content" Jan 20 23:34:33 crc kubenswrapper[5030]: E0120 23:34:33.052567 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="extract-utilities" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052576 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="extract-utilities" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052774 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-httpd" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052790 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" containerName="neutron-api" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.052809 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" containerName="registry-server" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053481 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content\") pod \"f8e8785a-fb7a-4209-a065-b57a88baec40\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities\") pod \"f8e8785a-fb7a-4209-a065-b57a88baec40\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.053960 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.057676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s8tj\" (UniqueName: \"kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.057878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.057889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lhwf\" (UniqueName: \"kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf\") pod \"f8e8785a-fb7a-4209-a065-b57a88baec40\" (UID: \"f8e8785a-fb7a-4209-a065-b57a88baec40\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.058081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.058185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.058362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs\") pod \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\" (UID: \"a84cf4eb-2a79-4637-aca8-d84c4e588fe7\") " Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.063579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.073989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerStarted","Data":"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.084492 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.093336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf" (OuterVolumeSpecName: "kube-api-access-5lhwf") pod "f8e8785a-fb7a-4209-a065-b57a88baec40" (UID: "f8e8785a-fb7a-4209-a065-b57a88baec40"). InnerVolumeSpecName "kube-api-access-5lhwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.094952 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerStarted","Data":"11331a0257cd7cd3a45d9b1e0d9aa4ca36e6a5d20ca49299d905d135b78ee8e3"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.105861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.107767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj" (OuterVolumeSpecName: "kube-api-access-7s8tj") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "kube-api-access-7s8tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.112870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities" (OuterVolumeSpecName: "utilities") pod "f8e8785a-fb7a-4209-a065-b57a88baec40" (UID: "f8e8785a-fb7a-4209-a065-b57a88baec40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.112951 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-774477bf6d-2vg4t"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.121283 5030 generic.go:334] "Generic (PLEG): container finished" podID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerID="201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3" exitCode=143 Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.121378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerDied","Data":"201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.122079 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.148569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v64zq" event={"ID":"f8e8785a-fb7a-4209-a065-b57a88baec40","Type":"ContainerDied","Data":"f52821f4988808b57e317dfa4f15b4b621ecd4229ff90f5ecc4b812c9089f14e"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.148642 5030 scope.go:117] "RemoveContainer" containerID="d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.148751 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v64zq" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8658x\" (UniqueName: \"kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177314 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177326 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177336 5030 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-7s8tj\" (UniqueName: \"kubernetes.io/projected/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-kube-api-access-7s8tj\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.177346 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lhwf\" (UniqueName: \"kubernetes.io/projected/f8e8785a-fb7a-4209-a065-b57a88baec40-kube-api-access-5lhwf\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.209689 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.219981 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.222285 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.227274 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.227464 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.227664 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.227887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.230277 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerID="bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8" exitCode=1 Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.230333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerDied","Data":"bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.230971 5030 scope.go:117] "RemoveContainer" containerID="bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.266591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerStarted","Data":"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.280910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.280986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: 
I0120 23:34:33.281088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8658x\" (UniqueName: \"kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.293600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8e8785a-fb7a-4209-a065-b57a88baec40" (UID: "f8e8785a-fb7a-4209-a065-b57a88baec40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.294338 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.297286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.309222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerStarted","Data":"c7d286487506cf34d40605c251876f3d0a39a62208083381222f15802dd6c018"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.314466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.316931 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-676d666f78-55975" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.318065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerStarted","Data":"e96f90310cc523986ea2cc24d1408bd36e515d624b67edb05f78fd96a59bfb9f"} Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.318087 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.319365 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.345100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.351004 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" podStartSLOduration=10.350981803 podStartE2EDuration="10.350981803s" podCreationTimestamp="2026-01-20 23:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:33.096498753 +0000 UTC m=+3545.416759031" watchObservedRunningTime="2026-01-20 23:34:33.350981803 +0000 UTC m=+3545.671242091" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.351701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8658x\" (UniqueName: \"kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x\") pod \"nova-cell1-conductor-0\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.355839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.378407 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.378712 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker-log" containerID="cri-o://a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17" gracePeriod=30 Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.378807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker" containerID="cri-o://4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0" gracePeriod=30 Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.382374 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.382425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.382485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.382644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvd2k\" (UniqueName: \"kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.382679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.385342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.385427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.387244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.387510 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.387524 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.387535 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e8785a-fb7a-4209-a065-b57a88baec40-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.407478 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.426642 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=6.426609673 podStartE2EDuration="6.426609673s" podCreationTimestamp="2026-01-20 23:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:33.34963945 +0000 UTC m=+3545.669899748" watchObservedRunningTime="2026-01-20 23:34:33.426609673 +0000 UTC m=+3545.746869961" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.498967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvd2k\" (UniqueName: \"kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.507812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.509726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.541340 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.543640 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.544182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.545540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.548232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.550463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.550926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.603435 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.638511 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.707391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvd2k\" (UniqueName: \"kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k\") pod \"ceilometer-0\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.715189 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" podStartSLOduration=10.715166198 podStartE2EDuration="10.715166198s" podCreationTimestamp="2026-01-20 23:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:33.510729649 +0000 UTC m=+3545.830989937" watchObservedRunningTime="2026-01-20 23:34:33.715166198 +0000 UTC m=+3546.035426486" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.789753 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.804489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config" (OuterVolumeSpecName: "config") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.808296 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.810249 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.815343 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v64zq"] Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.923354 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.957466 5030 scope.go:117] "RemoveContainer" containerID="94dba902fb8a6721913fefcf4a0d0b3582724872bb3352951c23449e34a024cf" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.966423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a84cf4eb-2a79-4637-aca8-d84c4e588fe7" (UID: "a84cf4eb-2a79-4637-aca8-d84c4e588fe7"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:33 crc kubenswrapper[5030]: I0120 23:34:33.995290 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b19e4a-b5b1-4c2f-9cf9-01097291798d" path="/var/lib/kubelet/pods/74b19e4a-b5b1-4c2f-9cf9-01097291798d/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.006147 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ff2188-a797-40b5-8749-ee4330e6bf9a" path="/var/lib/kubelet/pods/84ff2188-a797-40b5-8749-ee4330e6bf9a/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.006841 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af288905-8d90-4f98-ba70-c7351555851b" path="/var/lib/kubelet/pods/af288905-8d90-4f98-ba70-c7351555851b/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.010266 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d" path="/var/lib/kubelet/pods/cdbf6103-bdd3-4bb9-9d78-eb66cc7a357d/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.010856 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceac9104-14e9-4ef2-b2f3-30a3bb64d9df" path="/var/lib/kubelet/pods/ceac9104-14e9-4ef2-b2f3-30a3bb64d9df/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.017469 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e566cb27-7416-41c6-b6e4-d31c2fdb3f1d" path="/var/lib/kubelet/pods/e566cb27-7416-41c6-b6e4-d31c2fdb3f1d/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.017944 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a84cf4eb-2a79-4637-aca8-d84c4e588fe7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.018695 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e8785a-fb7a-4209-a065-b57a88baec40" path="/var/lib/kubelet/pods/f8e8785a-fb7a-4209-a065-b57a88baec40/volumes" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.022937 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.108:8778/\": EOF" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.023288 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.108:8778/\": EOF" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.062942 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.183238 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": EOF" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.201763 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": EOF" Jan 20 23:34:34 crc 
kubenswrapper[5030]: I0120 23:34:34.271718 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.355032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerStarted","Data":"1b9e2a4a951b011876a9aa764fe630e1a819504641bb37396eaee5f36064b402"} Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.356721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerStarted","Data":"b31f9b5e8b0c977f70319a48b0da177bd27583cad3793a7b30357fd9d9d25ffb"} Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.371852 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerID="a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17" exitCode=143 Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.373284 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerDied","Data":"a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17"} Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.411484 5030 scope.go:117] "RemoveContainer" containerID="c07b354577bcaf637fbf1161e625ed80ce96d282b5f21bf9505c00e63a9715f9" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.450913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.506879 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.540161 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.634310 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.648071 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-676d666f78-55975"] Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.689684 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.823909 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.824278 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:34 crc kubenswrapper[5030]: I0120 23:34:34.970032 5030 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.153168 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.407039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerStarted","Data":"854df0a137c6b15d84b8cb748e0de68099c42d359e5014d0141ca6a9aa4ea34e"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.410275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerStarted","Data":"b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.410730 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.422823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerStarted","Data":"643c85b41e6141794593cf3c9ee3bc6ffb89d24885af29de96656e9068afded6"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.434675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerStarted","Data":"a5b2037f83c532db82e0c2292641947cc293e9fb55bc3c6fe284769e3a918862"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.463560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerStarted","Data":"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.497410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerStarted","Data":"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.497442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerStarted","Data":"a444966ddbc3518f89c8a24b5cfd6f35ac5b00a60e68b9c70192390ed0394882"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.501595 5030 generic.go:334] "Generic (PLEG): container finished" podID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerID="b25711156d4469258b400fabb5bdbe3ad378d45aab78c771c47ee90613bc46b2" exitCode=137 Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.502753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerDied","Data":"b25711156d4469258b400fabb5bdbe3ad378d45aab78c771c47ee90613bc46b2"} Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.502873 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.507785 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.535111 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5qbtc" podStartSLOduration=6.203638722 podStartE2EDuration="12.535090861s" podCreationTimestamp="2026-01-20 23:34:23 +0000 UTC" firstStartedPulling="2026-01-20 23:34:26.788995344 +0000 UTC m=+3539.109255632" lastFinishedPulling="2026-01-20 23:34:33.120447483 +0000 UTC m=+3545.440707771" observedRunningTime="2026-01-20 23:34:35.492258335 +0000 UTC m=+3547.812518613" watchObservedRunningTime="2026-01-20 23:34:35.535090861 +0000 UTC m=+3547.855351149" Jan 20 23:34:35 crc kubenswrapper[5030]: E0120 23:34:35.565487 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:35 crc kubenswrapper[5030]: E0120 23:34:35.573428 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:35 crc kubenswrapper[5030]: E0120 23:34:35.584952 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:34:35 crc kubenswrapper[5030]: E0120 23:34:35.585008 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.643310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.678921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.678996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwvq5\" (UniqueName: \"kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.679019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc 
kubenswrapper[5030]: I0120 23:34:35.679046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.679571 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.679599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.679740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs\") pod \"427b84ba-dba3-4881-bd7c-461d4c3674c9\" (UID: \"427b84ba-dba3-4881-bd7c-461d4c3674c9\") " Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.682085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs" (OuterVolumeSpecName: "logs") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.751706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts" (OuterVolumeSpecName: "scripts") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.763883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5" (OuterVolumeSpecName: "kube-api-access-lwvq5") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "kube-api-access-lwvq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.784035 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/427b84ba-dba3-4881-bd7c-461d4c3674c9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.784068 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwvq5\" (UniqueName: \"kubernetes.io/projected/427b84ba-dba3-4881-bd7c-461d4c3674c9-kube-api-access-lwvq5\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:35 crc kubenswrapper[5030]: I0120 23:34:35.784079 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.029281 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84cf4eb-2a79-4637-aca8-d84c4e588fe7" path="/var/lib/kubelet/pods/a84cf4eb-2a79-4637-aca8-d84c4e588fe7/volumes" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.066058 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.107493 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.179804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data" (OuterVolumeSpecName: "config-data") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.207420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.212730 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.212763 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.309166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "427b84ba-dba3-4881-bd7c-461d4c3674c9" (UID: "427b84ba-dba3-4881-bd7c-461d4c3674c9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.314831 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/427b84ba-dba3-4881-bd7c-461d4c3674c9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.545507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerStarted","Data":"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.558068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerStarted","Data":"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.566339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerStarted","Data":"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.594022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" event={"ID":"427b84ba-dba3-4881-bd7c-461d4c3674c9","Type":"ContainerDied","Data":"ba1b7cf1041077927df282c8dace10d519450b0dfb4303c6f186843a3559be29"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.594489 5030 scope.go:117] "RemoveContainer" containerID="b25711156d4469258b400fabb5bdbe3ad378d45aab78c771c47ee90613bc46b2" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.595055 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7cb86f569d-4mmlt" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.616910 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=4.616891967 podStartE2EDuration="4.616891967s" podCreationTimestamp="2026-01-20 23:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:36.605237216 +0000 UTC m=+3548.925497514" watchObservedRunningTime="2026-01-20 23:34:36.616891967 +0000 UTC m=+3548.937152255" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.634180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerStarted","Data":"718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.635330 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.652565 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerID="4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0" exitCode=0 Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.652644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerDied","Data":"4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.683144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerStarted","Data":"a3cc6fa9860339a777b502e8d7cf94ab748c9d05c0ad7323a3d5d25264be03d7"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.694672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.697409 5030 generic.go:334] "Generic (PLEG): container finished" podID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerID="acc6b39f58731022e9895707a3b85947e579adfd11aa69a1924f699081e64377" exitCode=0 Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.697468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerDied","Data":"acc6b39f58731022e9895707a3b85947e579adfd11aa69a1924f699081e64377"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.718460 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7cb86f569d-4mmlt"] Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.720168 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=4.720149077 podStartE2EDuration="4.720149077s" podCreationTimestamp="2026-01-20 23:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:36.689759581 +0000 UTC m=+3549.010019869" watchObservedRunningTime="2026-01-20 23:34:36.720149077 
+0000 UTC m=+3549.040409365" Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.722977 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerStarted","Data":"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0"} Jan 20 23:34:36 crc kubenswrapper[5030]: I0120 23:34:36.730652 5030 scope.go:117] "RemoveContainer" containerID="0a8376f95d2df43c0daf95c4b85f7a34edcb659b644e3d8e243b73c2e1a7e0bf" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.244941 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.267257 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=9.267241869 podStartE2EDuration="9.267241869s" podCreationTimestamp="2026-01-20 23:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:36.748175865 +0000 UTC m=+3549.068436153" watchObservedRunningTime="2026-01-20 23:34:37.267241869 +0000 UTC m=+3549.587502157" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.350466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs\") pod \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.350586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom\") pod \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.350653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data\") pod \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.350735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle\") pod \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.350824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qqj2\" (UniqueName: \"kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2\") pod \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\" (UID: \"712df845-7e0f-48ed-b6a1-bb7c967fb5e3\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.351085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs" (OuterVolumeSpecName: "logs") pod "712df845-7e0f-48ed-b6a1-bb7c967fb5e3" (UID: "712df845-7e0f-48ed-b6a1-bb7c967fb5e3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.351452 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.357499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2" (OuterVolumeSpecName: "kube-api-access-4qqj2") pod "712df845-7e0f-48ed-b6a1-bb7c967fb5e3" (UID: "712df845-7e0f-48ed-b6a1-bb7c967fb5e3"). InnerVolumeSpecName "kube-api-access-4qqj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.393932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "712df845-7e0f-48ed-b6a1-bb7c967fb5e3" (UID: "712df845-7e0f-48ed-b6a1-bb7c967fb5e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.420663 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data" (OuterVolumeSpecName: "config-data") pod "712df845-7e0f-48ed-b6a1-bb7c967fb5e3" (UID: "712df845-7e0f-48ed-b6a1-bb7c967fb5e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.443786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "712df845-7e0f-48ed-b6a1-bb7c967fb5e3" (UID: "712df845-7e0f-48ed-b6a1-bb7c967fb5e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.456037 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.456068 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.456077 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.456089 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qqj2\" (UniqueName: \"kubernetes.io/projected/712df845-7e0f-48ed-b6a1-bb7c967fb5e3-kube-api-access-4qqj2\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.529345 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.558751 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_aaa15825-c66d-4625-86d9-2066c914a955/ovn-northd/0.log" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.558850 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.659896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv5hp\" (UniqueName: \"kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp\") pod \"4f8fdbf9-041a-46dc-87c9-f2531626e150\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data\") pod \"4f8fdbf9-041a-46dc-87c9-f2531626e150\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle\") pod \"4f8fdbf9-041a-46dc-87c9-f2531626e150\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom\") pod \"4f8fdbf9-041a-46dc-87c9-f2531626e150\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs\") pod \"4f8fdbf9-041a-46dc-87c9-f2531626e150\" (UID: \"4f8fdbf9-041a-46dc-87c9-f2531626e150\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rndp5\" (UniqueName: \"kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5\") pod \"aaa15825-c66d-4625-86d9-2066c914a955\" (UID: \"aaa15825-c66d-4625-86d9-2066c914a955\") " Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.660904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.661305 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/aaa15825-c66d-4625-86d9-2066c914a955-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.661446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts" (OuterVolumeSpecName: "scripts") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.661852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs" (OuterVolumeSpecName: "logs") pod "4f8fdbf9-041a-46dc-87c9-f2531626e150" (UID: "4f8fdbf9-041a-46dc-87c9-f2531626e150"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.663000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config" (OuterVolumeSpecName: "config") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.665091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp" (OuterVolumeSpecName: "kube-api-access-qv5hp") pod "4f8fdbf9-041a-46dc-87c9-f2531626e150" (UID: "4f8fdbf9-041a-46dc-87c9-f2531626e150"). 
InnerVolumeSpecName "kube-api-access-qv5hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.666053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4f8fdbf9-041a-46dc-87c9-f2531626e150" (UID: "4f8fdbf9-041a-46dc-87c9-f2531626e150"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.667559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5" (OuterVolumeSpecName: "kube-api-access-rndp5") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "kube-api-access-rndp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.702639 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.116:8776/healthcheck\": read tcp 10.217.0.2:43272->10.217.0.116:8776: read: connection reset by peer" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.734435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" event={"ID":"712df845-7e0f-48ed-b6a1-bb7c967fb5e3","Type":"ContainerDied","Data":"96d3fa65bfb8aded4d4835d8a23a2f287745f32c728f721355c9d27da2608427"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.734478 5030 scope.go:117] "RemoveContainer" containerID="acc6b39f58731022e9895707a3b85947e579adfd11aa69a1924f699081e64377" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.734577 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.743914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" event={"ID":"4f8fdbf9-041a-46dc-87c9-f2531626e150","Type":"ContainerDied","Data":"73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.744232 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.752057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerStarted","Data":"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765221 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5hp\" (UniqueName: \"kubernetes.io/projected/4f8fdbf9-041a-46dc-87c9-f2531626e150-kube-api-access-qv5hp\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765510 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765596 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765728 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa15825-c66d-4625-86d9-2066c914a955-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765824 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fdbf9-041a-46dc-87c9-f2531626e150-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765916 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rndp5\" (UniqueName: \"kubernetes.io/projected/aaa15825-c66d-4625-86d9-2066c914a955-kube-api-access-rndp5\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.765839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerStarted","Data":"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.768918 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_aaa15825-c66d-4625-86d9-2066c914a955/ovn-northd/0.log" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.768975 5030 generic.go:334] "Generic (PLEG): container finished" podID="aaa15825-c66d-4625-86d9-2066c914a955" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" exitCode=139 Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.769847 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.769987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerDied","Data":"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.771430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"aaa15825-c66d-4625-86d9-2066c914a955","Type":"ContainerDied","Data":"041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d"} Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.826070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.829722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f8fdbf9-041a-46dc-87c9-f2531626e150" (UID: "4f8fdbf9-041a-46dc-87c9-f2531626e150"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.847114 5030 scope.go:117] "RemoveContainer" containerID="201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.856565 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=7.856544564 podStartE2EDuration="7.856544564s" podCreationTimestamp="2026-01-20 23:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:37.81712061 +0000 UTC m=+3550.137380898" watchObservedRunningTime="2026-01-20 23:34:37.856544564 +0000 UTC m=+3550.176804852" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.868109 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.868139 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.869988 5030 scope.go:117] "RemoveContainer" containerID="4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.874922 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.879361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data" (OuterVolumeSpecName: "config-data") pod "4f8fdbf9-041a-46dc-87c9-f2531626e150" (UID: 
"4f8fdbf9-041a-46dc-87c9-f2531626e150"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.900130 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-f68656994-jgqr2"] Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.918822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.940546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "aaa15825-c66d-4625-86d9-2066c914a955" (UID: "aaa15825-c66d-4625-86d9-2066c914a955"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.954967 5030 scope.go:117] "RemoveContainer" containerID="a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.972867 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.972896 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa15825-c66d-4625-86d9-2066c914a955-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:37 crc kubenswrapper[5030]: I0120 23:34:37.972907 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fdbf9-041a-46dc-87c9-f2531626e150-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.016662 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" path="/var/lib/kubelet/pods/427b84ba-dba3-4881-bd7c-461d4c3674c9/volumes" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.017303 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" path="/var/lib/kubelet/pods/712df845-7e0f-48ed-b6a1-bb7c967fb5e3/volumes" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.033336 5030 scope.go:117] "RemoveContainer" containerID="decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.115816 5030 scope.go:117] "RemoveContainer" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.168859 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.176126 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.215885 5030 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/barbican-worker-d56b7b96f-pdtfp"] Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.242863 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.258703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.294983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.295040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.297118 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.315040 5030 scope.go:117] "RemoveContainer" containerID="decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.320809 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1\": container with ID starting with decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1 not found: ID does not exist" containerID="decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.320858 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1"} err="failed to get container status \"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1\": rpc error: code = NotFound desc = could not find container \"decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1\": container with ID starting with decf69e4e589732702c19f699d836d58b658a46251d7b72038207fef3f5a3ca1 not found: ID does not exist" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.320886 5030 scope.go:117] "RemoveContainer" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.323889 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20\": container with ID starting with 506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20 not found: ID does not exist" containerID="506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.323936 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20"} err="failed to get container status \"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20\": rpc error: code = NotFound desc = could not find container \"506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20\": container with ID starting with 506a9f43d0c14f693d684cceb5143d4d24f32280ba8ee4d61115650bb378aa20 not found: ID does not exist" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.331845 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332266 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332284 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-api" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332290 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-api" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332306 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="openstack-network-exporter" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332320 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="openstack-network-exporter" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332331 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332337 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-log" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332351 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332356 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api-log" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332375 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332387 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332393 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332417 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332423 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener-log" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.332433 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332439 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332591 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332609 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-api" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332627 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="712df845-7e0f-48ed-b6a1-bb7c967fb5e3" containerName="barbican-keystone-listener-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332648 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332660 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="ovn-northd" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332673 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332685 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" containerName="barbican-worker-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332695 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa15825-c66d-4625-86d9-2066c914a955" containerName="openstack-network-exporter" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332704 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="427b84ba-dba3-4881-bd7c-461d4c3674c9" containerName="placement-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.332731 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api-log" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.333668 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.339246 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.339530 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-qmbrx" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.339726 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.339939 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.345378 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.362864 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.397868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.397920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.397946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8v27\" (UniqueName: \"kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.397964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.397981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.398015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.398065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtkvm\" (UniqueName: \"kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.499904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs\") pod \"c095502b-f026-40aa-a84f-057d9dc0e758\" (UID: \"c095502b-f026-40aa-a84f-057d9dc0e758\") " Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8v27\" (UniqueName: \"kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.500354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.502336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.502612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs" (OuterVolumeSpecName: "logs") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.506101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.509187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.509497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.517478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.545261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts" (OuterVolumeSpecName: "scripts") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.545414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm" (OuterVolumeSpecName: "kube-api-access-jtkvm") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "kube-api-access-jtkvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.558133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.558738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.575316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.584344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8v27\" (UniqueName: \"kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27\") pod \"ovn-northd-0\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.596206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603894 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtkvm\" (UniqueName: \"kubernetes.io/projected/c095502b-f026-40aa-a84f-057d9dc0e758-kube-api-access-jtkvm\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603918 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603928 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c095502b-f026-40aa-a84f-057d9dc0e758-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603937 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603945 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c095502b-f026-40aa-a84f-057d9dc0e758-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.603953 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.671455 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": read tcp 
10.217.0.2:60394->10.217.0.110:8778: read: connection reset by peer" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.671781 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.110:8778/\": read tcp 10.217.0.2:60390->10.217.0.110:8778: read: connection reset by peer" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.672243 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.701127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.707885 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.730866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.734244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data" (OuterVolumeSpecName: "config-data") pod "c095502b-f026-40aa-a84f-057d9dc0e758" (UID: "c095502b-f026-40aa-a84f-057d9dc0e758"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.798365 5030 generic.go:334] "Generic (PLEG): container finished" podID="c095502b-f026-40aa-a84f-057d9dc0e758" containerID="8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1" exitCode=0 Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.798496 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.798520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerDied","Data":"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.801740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c095502b-f026-40aa-a84f-057d9dc0e758","Type":"ContainerDied","Data":"0a1ee7a63dca89b17f71807197fe62d7d5cfa69a0e0f70c4db0e644417b91720"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.801779 5030 scope.go:117] "RemoveContainer" containerID="8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.810080 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.810114 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c095502b-f026-40aa-a84f-057d9dc0e758-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.816800 5030 generic.go:334] "Generic (PLEG): container finished" podID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerID="e06dfb116e83c6a042119cbcc962d6e9c94a759890c06b3cecc1ccd89cd716ab" exitCode=0 Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.816856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerDied","Data":"e06dfb116e83c6a042119cbcc962d6e9c94a759890c06b3cecc1ccd89cd716ab"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.841895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerStarted","Data":"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.858614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerStarted","Data":"9237c481c7aa7834b8c0efe9a0c5fd9901bf2d2eb2b5d3f97cca7cc8a063c842"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.866461 5030 generic.go:334] "Generic (PLEG): container finished" podID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerID="11331a0257cd7cd3a45d9b1e0d9aa4ca36e6a5d20ca49299d905d135b78ee8e3" exitCode=0 Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.866521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerDied","Data":"11331a0257cd7cd3a45d9b1e0d9aa4ca36e6a5d20ca49299d905d135b78ee8e3"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.874362 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerID="b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102" exitCode=1 Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.874435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerDied","Data":"b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.879432 5030 scope.go:117] "RemoveContainer" containerID="b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102" Jan 20 23:34:38 crc kubenswrapper[5030]: E0120 23:34:38.879884 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(9c05bad7-0581-4182-9c7b-7a790c22cf64)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.921721 5030 scope.go:117] "RemoveContainer" containerID="71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.922454 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=8.922435275 podStartE2EDuration="8.922435275s" podCreationTimestamp="2026-01-20 23:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:38.877789225 +0000 UTC m=+3551.198049513" watchObservedRunningTime="2026-01-20 23:34:38.922435275 +0000 UTC m=+3551.242695553" Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.930905 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerID="d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456" exitCode=0 Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.931846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerDied","Data":"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456"} Jan 20 23:34:38 crc kubenswrapper[5030]: I0120 23:34:38.970381 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.022679 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.032582 5030 scope.go:117] "RemoveContainer" containerID="8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1" Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.039967 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1\": container with ID starting with 8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1 not found: ID does not exist" containerID="8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.040010 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1"} err="failed to get container status \"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1\": rpc error: code = NotFound desc = could not find container 
\"8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1\": container with ID starting with 8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1 not found: ID does not exist" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.040043 5030 scope.go:117] "RemoveContainer" containerID="71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.041959 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.042381 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd\": container with ID starting with 71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd not found: ID does not exist" containerID="71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.042408 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd"} err="failed to get container status \"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd\": rpc error: code = NotFound desc = could not find container \"71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd\": container with ID starting with 71cfc22f41e9fa0ef8e7067f5101e00ba2cfaf07926332c0f19d957ce019cfcd not found: ID does not exist" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.042424 5030 scope.go:117] "RemoveContainer" containerID="bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.046777 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.062446 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.062907 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.063115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.103825 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133438 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133621 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133666 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2rx\" (UniqueName: \"kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133739 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.133818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.169884 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.235814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.235956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.235978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2rx\" (UniqueName: 
\"kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.236285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.246309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.246806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.255529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.257202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.259214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.264926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.266772 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.281212 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.294082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2rx\" (UniqueName: \"kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx\") pod \"cinder-api-0\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.411771 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81 is running failed: container process not found" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.415796 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81 is running failed: container process not found" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.419778 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81 is running failed: container process not found" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:34:39 crc kubenswrapper[5030]: E0120 23:34:39.419829 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81 is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.429140 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.485906 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbtsm\" (UniqueName: \"kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.547961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.548022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle\") pod \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\" (UID: \"2d1b606a-c4d7-4ead-83db-20b4cd7ded47\") " Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.551232 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs" (OuterVolumeSpecName: "logs") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.584933 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts" (OuterVolumeSpecName: "scripts") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.590840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm" (OuterVolumeSpecName: "kube-api-access-qbtsm") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "kube-api-access-qbtsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.624559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.652359 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbtsm\" (UniqueName: \"kubernetes.io/projected/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-kube-api-access-qbtsm\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.652386 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.652395 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.805945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data" (OuterVolumeSpecName: "config-data") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.813468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.832799 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.832808 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.855507 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.855559 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.871170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.897138 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d1b606a-c4d7-4ead-83db-20b4cd7ded47" (UID: "2d1b606a-c4d7-4ead-83db-20b4cd7ded47"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.941232 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.957462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerStarted","Data":"2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72"} Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.957500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerStarted","Data":"d2298816256dc578d3898588fc407be0225f8dab87e1cc38c30b05d899c283b1"} Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.958115 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.958147 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d1b606a-c4d7-4ead-83db-20b4cd7ded47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.962582 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" Jan 20 23:34:39 crc kubenswrapper[5030]: I0120 23:34:39.964378 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3f32a70-a447-408f-99e0-90075753a304" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" exitCode=1 Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.021482 5030 scope.go:117] "RemoveContainer" containerID="b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102" Jan 20 23:34:40 crc kubenswrapper[5030]: E0120 23:34:40.021719 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(9c05bad7-0581-4182-9c7b-7a790c22cf64)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.028818 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8fdbf9-041a-46dc-87c9-f2531626e150" path="/var/lib/kubelet/pods/4f8fdbf9-041a-46dc-87c9-f2531626e150/volumes" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.029489 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa15825-c66d-4625-86d9-2066c914a955" path="/var/lib/kubelet/pods/aaa15825-c66d-4625-86d9-2066c914a955/volumes" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.034408 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" path="/var/lib/kubelet/pods/c095502b-f026-40aa-a84f-057d9dc0e758/volumes" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.035387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-66d4dcc7c4-pckll" 
event={"ID":"2d1b606a-c4d7-4ead-83db-20b4cd7ded47","Type":"ContainerDied","Data":"9eefb33856a9e5a03ba65a238e5121aef46912d0197436eddf25525a753d96d1"} Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.035420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.035435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerDied","Data":"718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81"} Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.035448 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerStarted","Data":"c5665ab2e4d22cab17e4485bf654d41538e5774df91bf1e276c60d2155d2496f"} Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.036685 5030 scope.go:117] "RemoveContainer" containerID="e06dfb116e83c6a042119cbcc962d6e9c94a759890c06b3cecc1ccd89cd716ab" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.037424 5030 scope.go:117] "RemoveContainer" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.082049 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-66d4dcc7c4-pckll"] Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.082095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerStarted","Data":"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d"} Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.156768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerStarted","Data":"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75"} Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.161250 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=12.161229291 podStartE2EDuration="12.161229291s" podCreationTimestamp="2026-01-20 23:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:40.103263748 +0000 UTC m=+3552.423524036" watchObservedRunningTime="2026-01-20 23:34:40.161229291 +0000 UTC m=+3552.481489579" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.186113 5030 scope.go:117] "RemoveContainer" containerID="4afb02528814af0d996d7fabe7520b1955adff3dccf2a87e4e71cb62bed38778" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.219336 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=10.219314707 podStartE2EDuration="10.219314707s" podCreationTimestamp="2026-01-20 23:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:40.179629736 +0000 UTC m=+3552.499890034" watchObservedRunningTime="2026-01-20 23:34:40.219314707 +0000 UTC m=+3552.539574995" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.292967 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.293006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:40 crc kubenswrapper[5030]: I0120 23:34:40.293020 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:40 crc kubenswrapper[5030]: W0120 23:34:40.661813 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-bd2173d5780144e78bbd66e000c1fb014dcc9fd1b472fd0d9859a5ed21c774b8.scope: no such file or directory Jan 20 23:34:40 crc kubenswrapper[5030]: W0120 23:34:40.668970 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4c5ea7_9b09_4d4f_b296_90bdf86f92ec.slice/crio-conmon-b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4c5ea7_9b09_4d4f_b296_90bdf86f92ec.slice/crio-conmon-b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4.scope: no such file or directory Jan 20 23:34:40 crc kubenswrapper[5030]: W0120 23:34:40.709573 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4c5ea7_9b09_4d4f_b296_90bdf86f92ec.slice/crio-b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4c5ea7_9b09_4d4f_b296_90bdf86f92ec.slice/crio-b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4.scope: no such file or directory Jan 20 23:34:40 crc kubenswrapper[5030]: W0120 23:34:40.869979 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-conmon-b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-conmon-b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102.scope: no such file or directory Jan 20 23:34:40 crc kubenswrapper[5030]: W0120 23:34:40.870662 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c05bad7_0581_4182_9c7b_7a790c22cf64.slice/crio-b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102.scope: no such file or directory Jan 20 23:34:40 crc kubenswrapper[5030]: E0120 23:34:40.995234 5030 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fdbf9_041a_46dc_87c9_f2531626e150.slice/crio-73cbf0a8806f54718d2882617fea2872c8f36fcd941dd194702aef26e3b0465c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5613b909_aedf_4a87_babc_ca3e427fe828.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa15825_c66d_4625_86d9_2066c914a955.slice/crio-041789556f9faf3b54d6eabf8376afb0a5f37ae440ce691b95c9bb1994ea130d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c966f9_0e90_4a5d_9e43_81ec4980cefc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c966f9_0e90_4a5d_9e43_81ec4980cefc.slice/crio-62aceb603d9b599795ce3db4aa808f7df55085790cf93f0bb01657f574a12d00\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a2955e_219c_481d_bb5d_ca59f20d1255.slice/crio-0948e4c0daa26693867712ecc8c65337053c44cee21476c93d44514857119892\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda84cf4eb_2a79_4637_aca8_d84c4e588fe7.slice/crio-5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5613b909_aedf_4a87_babc_ca3e427fe828.slice/crio-af80d5994aeeb83a95385a0f7fce28dff4b04a7a0c3cd44b876f399ee8c542dd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdbf6103_bdd3_4bb9_9d78_eb66cc7a357d.slice/crio-conmon-0d9fc4ee758d84087beaab327bba3d23f35c0de724fa32f1178cb99595f25278.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod427b84ba_dba3_4881_bd7c_461d4c3674c9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fdbf9_041a_46dc_87c9_f2531626e150.slice/crio-conmon-a57e2746787ddfcbf60fecfd45d88fdb4eec56a5f3449cfc24acf2bbf3818b17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf288905_8d90_4f98_ba70_c7351555851b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a2955e_219c_481d_bb5d_ca59f20d1255.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdbf6103_bdd3_4bb9_9d78_eb66cc7a357d.slice/crio-11bb102f85b946924c2d3fb32c00ebef6d6f693f819cb13039457de7fa8497db\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc095502b_f026_40aa_a84f_057d9dc0e758.slice/crio-conmon-8a30783da8bd6c987ea054d06770c630b1ed4db725b17e522d88e0a78d7d24f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e8785a_fb7a_4209_a065_b57a88baec40.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fdbf9_041a_46dc_87c9_f2531626e150.slice/crio-conmon-4d481d03cd21a082e415c95294fe569e98e8b2c182e04998e1cc9ef7f56beef0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa15825_c66d_4625_86d9_2066c914a955.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda84cf4eb_2a79_4637_aca8_d84c4e588fe7.slice/crio-conmon-98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712df845_7e0f_48ed_b6a1_bb7c967fb5e3.slice/crio-conmon-201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod712df845_7e0f_48ed_b6a1_bb7c967fb5e3.slice/crio-201d5bd12c4c0c61fa84da5a2967111dadbe0c96f4de78b25225f4ddfc98e0b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e8785a_fb7a_4209_a065_b57a88baec40.slice/crio-conmon-d94c9f8a703e75f2978eff10164982d029699b50df36d1c6e8549690c9727708.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.049769 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6776c47bdb-m5w28_b7b3ce7d-73be-4f56-80a5-d9147ad0580e/neutron-api/0.log" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.050041 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.180295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerStarted","Data":"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.182763 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188160 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-6776c47bdb-m5w28_b7b3ce7d-73be-4f56-80a5-d9147ad0580e/neutron-api/0.log" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188209 5030 generic.go:334] "Generic (PLEG): container finished" podID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerID="0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811" exitCode=137 Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerDied","Data":"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188270 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6776c47bdb-m5w28" event={"ID":"b7b3ce7d-73be-4f56-80a5-d9147ad0580e","Type":"ContainerDied","Data":"c23bf4e0593f5e8bcacf1cb2436a6278954db7f885655cbf8b8477146c1a2b24"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.188305 5030 scope.go:117] "RemoveContainer" containerID="580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.191005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerStarted","Data":"0c8ba91ac8628ce6044a35b4f7d9eb67441f4f1ee295ec42f3004b90599321e6"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.191542 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.192767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerStarted","Data":"dae190eac1e356b07146411eb43704c3a1d11f244f8ed93f24dbead6bae58b53"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.192790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerStarted","Data":"aa1523ca9b24f0b8ed109842dd97fa6252ceb006690a4a4349264fa22395bb21"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.193828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.193901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.194348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.194389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.194425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.194489 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.194525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlx9l\" (UniqueName: \"kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l\") pod \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\" (UID: \"b7b3ce7d-73be-4f56-80a5-d9147ad0580e\") " Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.206494 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.860522779 podStartE2EDuration="9.206476942s" podCreationTimestamp="2026-01-20 23:34:32 +0000 UTC" firstStartedPulling="2026-01-20 23:34:35.198492014 +0000 UTC m=+3547.518752292" lastFinishedPulling="2026-01-20 23:34:40.544446167 +0000 UTC m=+3552.864706455" observedRunningTime="2026-01-20 23:34:41.202467975 +0000 UTC m=+3553.522728303" watchObservedRunningTime="2026-01-20 23:34:41.206476942 +0000 UTC m=+3553.526737230" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.210915 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l" (OuterVolumeSpecName: "kube-api-access-nlx9l") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "kube-api-access-nlx9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.223078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.232899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerStarted","Data":"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7"} Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.233985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.248141 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.248126211 podStartE2EDuration="3.248126211s" podCreationTimestamp="2026-01-20 23:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:41.244766229 +0000 UTC m=+3553.565026517" watchObservedRunningTime="2026-01-20 23:34:41.248126211 +0000 UTC m=+3553.568386499" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.273573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.279401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.296900 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.296926 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.296936 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlx9l\" (UniqueName: \"kubernetes.io/projected/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-kube-api-access-nlx9l\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.296947 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.306666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config" (OuterVolumeSpecName: "config") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.310269 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.354265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b7b3ce7d-73be-4f56-80a5-d9147ad0580e" (UID: "b7b3ce7d-73be-4f56-80a5-d9147ad0580e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.399145 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.399176 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.399186 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b3ce7d-73be-4f56-80a5-d9147ad0580e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.425537 5030 scope.go:117] "RemoveContainer" containerID="0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.466419 5030 scope.go:117] "RemoveContainer" containerID="580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e" Jan 20 23:34:41 crc kubenswrapper[5030]: E0120 23:34:41.467178 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e\": container with ID starting with 580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e not found: ID does not exist" containerID="580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.467208 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e"} err="failed to get container status \"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e\": rpc error: code = NotFound desc = could not find container \"580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e\": container with ID starting with 580832931f6c0e54e8102660db221878e02f00569eb352c1853ac3c3279ce10e not found: ID does not exist" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.467245 5030 scope.go:117] "RemoveContainer" containerID="0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811" Jan 20 23:34:41 crc kubenswrapper[5030]: E0120 23:34:41.467502 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811\": container with ID starting with 0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811 not found: ID does not exist" containerID="0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.467549 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811"} err="failed to get container status \"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811\": rpc error: code = NotFound desc = could not find container \"0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811\": container with ID starting with 0bfe039149eb41418e7a2f78fbb14ac57973f48030e358e2f58e1c2623aba811 not found: ID does not exist" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.527877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.537834 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6776c47bdb-m5w28"] Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.972449 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" path="/var/lib/kubelet/pods/2d1b606a-c4d7-4ead-83db-20b4cd7ded47/volumes" Jan 20 23:34:41 crc kubenswrapper[5030]: I0120 23:34:41.973429 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" path="/var/lib/kubelet/pods/b7b3ce7d-73be-4f56-80a5-d9147ad0580e/volumes" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.165892 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.247148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerStarted","Data":"8b8510c34e0a7c2858626139d5521d1ef7d3a51f65ff94d2088c0275f2c2b66f"} Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.248375 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.275488 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.275466438 podStartE2EDuration="4.275466438s" podCreationTimestamp="2026-01-20 23:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:42.271546864 +0000 UTC m=+3554.591807162" watchObservedRunningTime="2026-01-20 23:34:42.275466438 +0000 UTC m=+3554.595726766" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.550106 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.550167 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.613177 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.613562 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.861805 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.864484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.895778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.896345 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.896366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.906284 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:42 crc kubenswrapper[5030]: I0120 23:34:42.915188 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="c095502b-f026-40aa-a84f-057d9dc0e758" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.116:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.257994 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3f32a70-a447-408f-99e0-90075753a304" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" exitCode=1 Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.258115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerDied","Data":"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7"} Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.258148 5030 scope.go:117] "RemoveContainer" containerID="718b62def710cf515ab2679a13515694e305b4ad1097a0ee51354099652b2c81" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.259413 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.259782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.259905 5030 scope.go:117] "RemoveContainer" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.260060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:43 crc 
kubenswrapper[5030]: I0120 23:34:43.260577 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:43 crc kubenswrapper[5030]: E0120 23:34:43.260725 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(d3f32a70-a447-408f-99e0-90075753a304)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d3f32a70-a447-408f-99e0-90075753a304" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.295386 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.295441 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.405847 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.461165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.461229 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.528041 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:43 crc kubenswrapper[5030]: I0120 23:34:43.989623 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.125:9696/\": dial tcp 10.217.0.125:9696: connect: connection refused" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.175865 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.308850 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.309155 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 
23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.322400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.381378 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.549424 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.636051 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.868777 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:44 crc kubenswrapper[5030]: I0120 23:34:44.869152 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.129:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.038356 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.141540 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.169781 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.293239 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.293264 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.293538 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.293568 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.334884 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.143:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.408942 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.410106 5030 scope.go:117] 
"RemoveContainer" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:34:45 crc kubenswrapper[5030]: E0120 23:34:45.410377 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(d3f32a70-a447-408f-99e0-90075753a304)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d3f32a70-a447-408f-99e0-90075753a304" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.415369 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.416782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.538233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.706166 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:45 crc kubenswrapper[5030]: I0120 23:34:45.974559 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.301426 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5qbtc" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="registry-server" containerID="cri-o://177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1" gracePeriod=2 Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.653026 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.710733 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.807340 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.900806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities\") pod \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.900900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7lt\" (UniqueName: \"kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt\") pod \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.901143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content\") pod \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\" (UID: \"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec\") " Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.901531 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities" (OuterVolumeSpecName: "utilities") pod "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" (UID: "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.922046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt" (OuterVolumeSpecName: "kube-api-access-wt7lt") pod "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" (UID: "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec"). InnerVolumeSpecName "kube-api-access-wt7lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:46 crc kubenswrapper[5030]: I0120 23:34:46.923347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" (UID: "bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.004249 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.004292 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.004305 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7lt\" (UniqueName: \"kubernetes.io/projected/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec-kube-api-access-wt7lt\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.006813 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.107464 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.171212 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.171433 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-log" containerID="cri-o://f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1" gracePeriod=30 Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.171730 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.171772 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-api" containerID="cri-o://ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301" gracePeriod=30 Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.318791 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerID="177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1" exitCode=0 Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.318940 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5qbtc" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.318961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerDied","Data":"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1"} Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.319001 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5qbtc" event={"ID":"bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec","Type":"ContainerDied","Data":"5b53dd15d8c69e1e32d2fbdedd4cdd055d414a06bae73824d679b5ac7d1bd3cb"} Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.319017 5030 scope.go:117] "RemoveContainer" containerID="177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.319202 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.324213 5030 generic.go:334] "Generic (PLEG): container finished" podID="a75e4451-042a-424d-a919-049a3213d060" containerID="f934c94653ca57fb3a0c59c188fa4c84207ab84b1a111cd3517e299176d2d609" exitCode=1 Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.324270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerDied","Data":"f934c94653ca57fb3a0c59c188fa4c84207ab84b1a111cd3517e299176d2d609"} Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.329019 5030 generic.go:334] "Generic (PLEG): container finished" podID="45231f32-366a-4f97-96b7-c59f7f059770" containerID="f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1" exitCode=143 Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.329773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerDied","Data":"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1"} Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.343782 5030 scope.go:117] "RemoveContainer" containerID="b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.377984 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.387161 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5qbtc"] Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.388534 5030 scope.go:117] "RemoveContainer" containerID="d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.431218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m246\" (UniqueName: \"kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246\") pod \"a75e4451-042a-424d-a919-049a3213d060\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.431486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data\") 
pod \"a75e4451-042a-424d-a919-049a3213d060\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.431509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle\") pod \"a75e4451-042a-424d-a919-049a3213d060\" (UID: \"a75e4451-042a-424d-a919-049a3213d060\") " Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.434815 5030 scope.go:117] "RemoveContainer" containerID="177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1" Jan 20 23:34:47 crc kubenswrapper[5030]: E0120 23:34:47.435812 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1\": container with ID starting with 177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1 not found: ID does not exist" containerID="177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.435841 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1"} err="failed to get container status \"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1\": rpc error: code = NotFound desc = could not find container \"177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1\": container with ID starting with 177b259473f766d88a04a9dc6b87621e06d9d3d47ef734a0f879f225316bbee1 not found: ID does not exist" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.435860 5030 scope.go:117] "RemoveContainer" containerID="b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4" Jan 20 23:34:47 crc kubenswrapper[5030]: E0120 23:34:47.436547 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4\": container with ID starting with b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4 not found: ID does not exist" containerID="b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.436593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4"} err="failed to get container status \"b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4\": rpc error: code = NotFound desc = could not find container \"b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4\": container with ID starting with b2ea30717c1e9f3882b3d0525710cbf7501797d18939cd780531f642da14c1e4 not found: ID does not exist" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.436646 5030 scope.go:117] "RemoveContainer" containerID="d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed" Jan 20 23:34:47 crc kubenswrapper[5030]: E0120 23:34:47.436952 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed\": container with ID starting with d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed not found: ID does not exist" 
containerID="d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.436972 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed"} err="failed to get container status \"d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed\": rpc error: code = NotFound desc = could not find container \"d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed\": container with ID starting with d18e355af79243407dfe997cf0460ad17659ff5b7691bfae2e85595dc86dbaed not found: ID does not exist" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.436986 5030 scope.go:117] "RemoveContainer" containerID="f934c94653ca57fb3a0c59c188fa4c84207ab84b1a111cd3517e299176d2d609" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.444970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246" (OuterVolumeSpecName: "kube-api-access-2m246") pod "a75e4451-042a-424d-a919-049a3213d060" (UID: "a75e4451-042a-424d-a919-049a3213d060"). InnerVolumeSpecName "kube-api-access-2m246". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.482738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a75e4451-042a-424d-a919-049a3213d060" (UID: "a75e4451-042a-424d-a919-049a3213d060"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.498380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data" (OuterVolumeSpecName: "config-data") pod "a75e4451-042a-424d-a919-049a3213d060" (UID: "a75e4451-042a-424d-a919-049a3213d060"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.533680 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.533711 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75e4451-042a-424d-a919-049a3213d060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.533724 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m246\" (UniqueName: \"kubernetes.io/projected/a75e4451-042a-424d-a919-049a3213d060-kube-api-access-2m246\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.577671 5030 scope.go:117] "RemoveContainer" containerID="1ad70098428c7b0568be41217ee431839dc1d2a5d66015163a6657cfd3bb4619" Jan 20 23:34:47 crc kubenswrapper[5030]: I0120 23:34:47.977990 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" path="/var/lib/kubelet/pods/bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec/volumes" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.341189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a75e4451-042a-424d-a919-049a3213d060","Type":"ContainerDied","Data":"bd4ad3300dd5f9d4928a27642c21e47130d7f60d93cc8338838edda5ae26e239"} Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.341228 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.343547 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc1666bc-40ca-4f98-a280-6c13671a5196" containerID="33dc9db4ec964af41a6a0cfcd301bf153fe2ff8a2a7bd2f13f623f706d7557be" exitCode=137 Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.343601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"dc1666bc-40ca-4f98-a280-6c13671a5196","Type":"ContainerDied","Data":"33dc9db4ec964af41a6a0cfcd301bf153fe2ff8a2a7bd2f13f623f706d7557be"} Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.397572 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.409873 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.134:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.410540 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.422721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423191 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-api" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423204 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-api" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423214 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423221 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="extract-utilities" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423244 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="extract-utilities" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423253 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423258 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423268 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="extract-content" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423273 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="extract-content" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423286 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423316 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423323 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423333 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423339 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" Jan 20 23:34:48 crc kubenswrapper[5030]: E0120 23:34:48.423354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="registry-server" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423361 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="registry-server" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423533 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-httpd" Jan 20 23:34:48 crc 
kubenswrapper[5030]: I0120 23:34:48.423547 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-api" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423557 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423569 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75e4451-042a-424d-a919-049a3213d060" containerName="nova-scheduler-scheduler" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423581 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1b606a-c4d7-4ead-83db-20b4cd7ded47" containerName="placement-log" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423591 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4c5ea7-9b09-4d4f-b296-90bdf86f92ec" containerName="registry-server" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.423602 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b3ce7d-73be-4f56-80a5-d9147ad0580e" containerName="neutron-api" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.424272 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.427109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.429203 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.431795 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.503025 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.503556 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" containerID="cri-o://872566d668262ae5e782cbaec5e0b5a5b6cfd95d8d6fa500178bc4186010d399" gracePeriod=30 Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.503703 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" containerID="cri-o://3073a848a12d66bdc5d1ca7e95e2a440b8209ff9cae9549ac13f372bee8452dd" gracePeriod=30 Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.553134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.553329 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stlp8\" (UniqueName: \"kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8\") pod 
\"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.553357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.666052 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stlp8\" (UniqueName: \"kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.666116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.666204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.683417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.692289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.704434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stlp8\" (UniqueName: \"kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8\") pod \"nova-scheduler-0\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.747143 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.785400 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.899691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data\") pod \"dc1666bc-40ca-4f98-a280-6c13671a5196\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.899769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6fdp\" (UniqueName: \"kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp\") pod \"dc1666bc-40ca-4f98-a280-6c13671a5196\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.899931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs\") pod \"dc1666bc-40ca-4f98-a280-6c13671a5196\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.899976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs\") pod \"dc1666bc-40ca-4f98-a280-6c13671a5196\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.900012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle\") pod \"dc1666bc-40ca-4f98-a280-6c13671a5196\" (UID: \"dc1666bc-40ca-4f98-a280-6c13671a5196\") " Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.912747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp" (OuterVolumeSpecName: "kube-api-access-s6fdp") pod "dc1666bc-40ca-4f98-a280-6c13671a5196" (UID: "dc1666bc-40ca-4f98-a280-6c13671a5196"). InnerVolumeSpecName "kube-api-access-s6fdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.935841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc1666bc-40ca-4f98-a280-6c13671a5196" (UID: "dc1666bc-40ca-4f98-a280-6c13671a5196"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.950705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data" (OuterVolumeSpecName: "config-data") pod "dc1666bc-40ca-4f98-a280-6c13671a5196" (UID: "dc1666bc-40ca-4f98-a280-6c13671a5196"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:48 crc kubenswrapper[5030]: I0120 23:34:48.975277 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "dc1666bc-40ca-4f98-a280-6c13671a5196" (UID: "dc1666bc-40ca-4f98-a280-6c13671a5196"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.003640 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.003666 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.003677 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.003687 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6fdp\" (UniqueName: \"kubernetes.io/projected/dc1666bc-40ca-4f98-a280-6c13671a5196-kube-api-access-s6fdp\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.024611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "dc1666bc-40ca-4f98-a280-6c13671a5196" (UID: "dc1666bc-40ca-4f98-a280-6c13671a5196"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.105645 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc1666bc-40ca-4f98-a280-6c13671a5196-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.270377 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:34:49 crc kubenswrapper[5030]: W0120 23:34:49.273350 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5375014e_2365_4042_910d_80304789f745.slice/crio-50b6f1be17c96f0d23d115377b8f9ff93b419cc1a3f1a0caf64eb2db81bb8e7e WatchSource:0}: Error finding container 50b6f1be17c96f0d23d115377b8f9ff93b419cc1a3f1a0caf64eb2db81bb8e7e: Status 404 returned error can't find the container with id 50b6f1be17c96f0d23d115377b8f9ff93b419cc1a3f1a0caf64eb2db81bb8e7e Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.358153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerStarted","Data":"50b6f1be17c96f0d23d115377b8f9ff93b419cc1a3f1a0caf64eb2db81bb8e7e"} Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.367187 5030 generic.go:334] "Generic (PLEG): container finished" podID="f522d3b2-124f-4af1-a548-408c3517ef13" containerID="872566d668262ae5e782cbaec5e0b5a5b6cfd95d8d6fa500178bc4186010d399" exitCode=143 Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.367271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerDied","Data":"872566d668262ae5e782cbaec5e0b5a5b6cfd95d8d6fa500178bc4186010d399"} Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.369208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"dc1666bc-40ca-4f98-a280-6c13671a5196","Type":"ContainerDied","Data":"32929c5ff6bdf7c14f5b3c333537b3e2ce877218b0d9751c8e828c2774dffd1b"} Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.369258 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.369277 5030 scope.go:117] "RemoveContainer" containerID="33dc9db4ec964af41a6a0cfcd301bf153fe2ff8a2a7bd2f13f623f706d7557be" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.432228 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.441907 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.456203 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:49 crc kubenswrapper[5030]: E0120 23:34:49.456728 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1666bc-40ca-4f98-a280-6c13671a5196" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.456742 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1666bc-40ca-4f98-a280-6c13671a5196" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.456976 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1666bc-40ca-4f98-a280-6c13671a5196" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.457921 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.460578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.460797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.464667 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.466704 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.618498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.618580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.618617 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpwld\" (UniqueName: \"kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") 
" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.618673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.618763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.720973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.721060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpwld\" (UniqueName: \"kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.721108 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.721155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.721225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.726065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.727773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.728430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.732133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.740573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpwld\" (UniqueName: \"kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld\") pod \"nova-cell1-novncproxy-0\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.777724 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.978159 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75e4451-042a-424d-a919-049a3213d060" path="/var/lib/kubelet/pods/a75e4451-042a-424d-a919-049a3213d060/volumes" Jan 20 23:34:49 crc kubenswrapper[5030]: I0120 23:34:49.979025 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1666bc-40ca-4f98-a280-6c13671a5196" path="/var/lib/kubelet/pods/dc1666bc-40ca-4f98-a280-6c13671a5196/volumes" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.262195 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:34:50 crc kubenswrapper[5030]: W0120 23:34:50.266158 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebab4fc9_eaee_431c_94e3_0b5ecf5ea5e2.slice/crio-839057bc1f6f527b70e4324c10c062bc9311f8876fe47a579518d60a21fe4e50 WatchSource:0}: Error finding container 839057bc1f6f527b70e4324c10c062bc9311f8876fe47a579518d60a21fe4e50: Status 404 returned error can't find the container with id 839057bc1f6f527b70e4324c10c062bc9311f8876fe47a579518d60a21fe4e50 Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.376847 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.143:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.416778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerStarted","Data":"afde6f101ec2199029e3a43a4fc80bd33d0468cad716f37b53e3df5877a5e468"} Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.418967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" 
event={"ID":"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2","Type":"ContainerStarted","Data":"839057bc1f6f527b70e4324c10c062bc9311f8876fe47a579518d60a21fe4e50"} Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.447791 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.447768136 podStartE2EDuration="2.447768136s" podCreationTimestamp="2026-01-20 23:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:50.435090069 +0000 UTC m=+3562.755350357" watchObservedRunningTime="2026-01-20 23:34:50.447768136 +0000 UTC m=+3562.768028434" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.768250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.817253 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.948328 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.948595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.948736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.948819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzxtf\" (UniqueName: \"kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.948942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.949070 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: \"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.949147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs\") pod \"45231f32-366a-4f97-96b7-c59f7f059770\" (UID: 
\"45231f32-366a-4f97-96b7-c59f7f059770\") " Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.949602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs" (OuterVolumeSpecName: "logs") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.963362 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45231f32-366a-4f97-96b7-c59f7f059770-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.968873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts" (OuterVolumeSpecName: "scripts") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:50 crc kubenswrapper[5030]: I0120 23:34:50.969076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf" (OuterVolumeSpecName: "kube-api-access-rzxtf") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "kube-api-access-rzxtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.062735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.065302 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.065327 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.065357 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzxtf\" (UniqueName: \"kubernetes.io/projected/45231f32-366a-4f97-96b7-c59f7f059770-kube-api-access-rzxtf\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.094848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.094885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data" (OuterVolumeSpecName: "config-data") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.114676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "45231f32-366a-4f97-96b7-c59f7f059770" (UID: "45231f32-366a-4f97-96b7-c59f7f059770"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.166572 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.166606 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.166634 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45231f32-366a-4f97-96b7-c59f7f059770-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.325976 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.430044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2","Type":"ContainerStarted","Data":"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e"} Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.433289 5030 generic.go:334] "Generic (PLEG): container finished" podID="45231f32-366a-4f97-96b7-c59f7f059770" containerID="ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301" exitCode=0 Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.434170 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.435721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerDied","Data":"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301"} Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.436025 5030 scope.go:117] "RemoveContainer" containerID="ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.436722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-d9f464b66-s2shz" event={"ID":"45231f32-366a-4f97-96b7-c59f7f059770","Type":"ContainerDied","Data":"6d1376b2e091462d5adf754a73c025d2355de04478a28e9cb17b61bbda8c4846"} Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.453208 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.453189763 podStartE2EDuration="2.453189763s" podCreationTimestamp="2026-01-20 23:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:34:51.448883309 +0000 UTC m=+3563.769143597" watchObservedRunningTime="2026-01-20 23:34:51.453189763 +0000 UTC m=+3563.773450051" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.475991 5030 scope.go:117] "RemoveContainer" containerID="f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.481264 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.493901 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-d9f464b66-s2shz"] Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.501542 5030 scope.go:117] "RemoveContainer" containerID="ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301" Jan 20 23:34:51 crc kubenswrapper[5030]: E0120 23:34:51.502142 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301\": container with ID starting with ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301 not found: ID does not exist" containerID="ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.502176 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301"} err="failed to get container status \"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301\": rpc error: code = NotFound desc = could not find container \"ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301\": container with ID starting with ee034c3b521d612a1a183817d8e0111d607e0d2c34437b652f35b1658e1e0301 not found: ID does not exist" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.502212 5030 scope.go:117] "RemoveContainer" containerID="f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1" Jan 20 23:34:51 crc kubenswrapper[5030]: E0120 23:34:51.502573 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1\": container with ID starting with f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1 not found: ID does not exist" containerID="f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.502611 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1"} err="failed to get container status \"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1\": rpc error: code = NotFound desc = could not find container \"f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1\": container with ID starting with f6e275cc62225396c909dc76f6977b00be923535bbcf8999afec6b5f3602f7a1 not found: ID does not exist" Jan 20 23:34:51 crc kubenswrapper[5030]: I0120 23:34:51.975124 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45231f32-366a-4f97-96b7-c59f7f059770" path="/var/lib/kubelet/pods/45231f32-366a-4f97-96b7-c59f7f059770/volumes" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.215996 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": read tcp 10.217.0.2:45016->10.217.0.101:9311: read: connection reset by peer" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.215996 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.101:9311/healthcheck\": read tcp 10.217.0.2:45026->10.217.0.101:9311: read: connection reset by peer" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.449497 5030 generic.go:334] "Generic (PLEG): container finished" podID="5375014e-2365-4042-910d-80304789f745" containerID="afde6f101ec2199029e3a43a4fc80bd33d0468cad716f37b53e3df5877a5e468" exitCode=1 Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.449712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerDied","Data":"afde6f101ec2199029e3a43a4fc80bd33d0468cad716f37b53e3df5877a5e468"} Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.454596 5030 scope.go:117] "RemoveContainer" containerID="afde6f101ec2199029e3a43a4fc80bd33d0468cad716f37b53e3df5877a5e468" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.459061 5030 generic.go:334] "Generic (PLEG): container finished" podID="f522d3b2-124f-4af1-a548-408c3517ef13" containerID="3073a848a12d66bdc5d1ca7e95e2a440b8209ff9cae9549ac13f372bee8452dd" exitCode=0 Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.459141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerDied","Data":"3073a848a12d66bdc5d1ca7e95e2a440b8209ff9cae9549ac13f372bee8452dd"} Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.641010 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.698846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2ml\" (UniqueName: \"kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.699334 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs\") pod \"f522d3b2-124f-4af1-a548-408c3517ef13\" (UID: \"f522d3b2-124f-4af1-a548-408c3517ef13\") " Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.700016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs" (OuterVolumeSpecName: "logs") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.706707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml" (OuterVolumeSpecName: "kube-api-access-mq2ml") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "kube-api-access-mq2ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.708815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.737005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.760455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.762052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data" (OuterVolumeSpecName: "config-data") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.762191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f522d3b2-124f-4af1-a548-408c3517ef13" (UID: "f522d3b2-124f-4af1-a548-408c3517ef13"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801366 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801404 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801420 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2ml\" (UniqueName: \"kubernetes.io/projected/f522d3b2-124f-4af1-a548-408c3517ef13-kube-api-access-mq2ml\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801436 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801450 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801461 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f522d3b2-124f-4af1-a548-408c3517ef13-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.801474 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f522d3b2-124f-4af1-a548-408c3517ef13-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:52 crc kubenswrapper[5030]: I0120 23:34:52.961818 5030 scope.go:117] "RemoveContainer" containerID="b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.389678 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.417548 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.429978 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.473349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerStarted","Data":"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8"} Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.473670 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.477870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerStarted","Data":"84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8"} Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.480982 5030 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.481490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7" event={"ID":"f522d3b2-124f-4af1-a548-408c3517ef13","Type":"ContainerDied","Data":"0e97ee6fd33e202aaa801e530cf385259e03f928d93c86cd176d92e52a6d75c0"} Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.481529 5030 scope.go:117] "RemoveContainer" containerID="3073a848a12d66bdc5d1ca7e95e2a440b8209ff9cae9549ac13f372bee8452dd" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.489333 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.508409 5030 scope.go:117] "RemoveContainer" containerID="872566d668262ae5e782cbaec5e0b5a5b6cfd95d8d6fa500178bc4186010d399" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.538605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.546763 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-d5769bb96-hnxc7"] Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.731695 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.748094 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.938085 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:53 crc kubenswrapper[5030]: I0120 23:34:53.976528 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" path="/var/lib/kubelet/pods/f522d3b2-124f-4af1-a548-408c3517ef13/volumes" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9d4\" (UniqueName: \"kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.132542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data\") pod \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\" (UID: \"515ad7d0-529c-4fe7-8a8b-e1af630713fc\") " Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.133122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs" (OuterVolumeSpecName: "logs") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.144046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4" (OuterVolumeSpecName: "kube-api-access-6r9d4") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "kube-api-access-6r9d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.149796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.168810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.187329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data" (OuterVolumeSpecName: "config-data") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.198504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.201500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "515ad7d0-529c-4fe7-8a8b-e1af630713fc" (UID: "515ad7d0-529c-4fe7-8a8b-e1af630713fc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235770 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235804 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r9d4\" (UniqueName: \"kubernetes.io/projected/515ad7d0-529c-4fe7-8a8b-e1af630713fc-kube-api-access-6r9d4\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235819 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235834 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235846 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235860 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/515ad7d0-529c-4fe7-8a8b-e1af630713fc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.235872 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/515ad7d0-529c-4fe7-8a8b-e1af630713fc-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.495563 5030 generic.go:334] "Generic (PLEG): container finished" podID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerID="42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5" exitCode=137 Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.495711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerDied","Data":"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5"} Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.495765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" event={"ID":"515ad7d0-529c-4fe7-8a8b-e1af630713fc","Type":"ContainerDied","Data":"7e8c34b367ede1364b745d0eb22af1a4f9825d558ec828565b9f836634fcdce3"} Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.495798 5030 scope.go:117] "RemoveContainer" containerID="42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.497021 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6b8948f648-jn267" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.544266 5030 scope.go:117] "RemoveContainer" containerID="202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.544438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.558419 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6b8948f648-jn267"] Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.576018 5030 scope.go:117] "RemoveContainer" containerID="42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5" Jan 20 23:34:54 crc kubenswrapper[5030]: E0120 23:34:54.576438 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5\": container with ID starting with 42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5 not found: ID does not exist" containerID="42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.576474 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5"} err="failed to get container status \"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5\": rpc error: code = NotFound desc = could not find container \"42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5\": container with ID starting with 42706cceac4151563e09d1f6cd064d1ecf5b2e84ce6f3b4b234afe04aff748d5 not found: ID does not exist" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.576500 5030 scope.go:117] "RemoveContainer" containerID="202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91" Jan 20 23:34:54 crc kubenswrapper[5030]: E0120 23:34:54.577755 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91\": container with ID starting with 202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91 not found: ID does not exist" containerID="202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.577813 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91"} err="failed to get container status \"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91\": rpc error: code = NotFound desc = could not find container \"202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91\": container with ID starting with 202a49fb9f94af701c51bd0880081eb9874aac6941b1924cef3186fd4505db91 not found: ID does not exist" Jan 20 23:34:54 crc kubenswrapper[5030]: I0120 23:34:54.778737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.188863 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/1.log" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 
23:34:55.190768 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/0.log" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.190831 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.259964 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.260004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgtx7\" (UniqueName: \"kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7\") pod \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\" (UID: \"ba78ec90-96e7-41c8-a6f9-ceff78ba166c\") " Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.265231 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7" (OuterVolumeSpecName: "kube-api-access-vgtx7") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "kube-api-access-vgtx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.267936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.316211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.322986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.329123 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config" (OuterVolumeSpecName: "config") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.334616 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.354662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ba78ec90-96e7-41c8-a6f9-ceff78ba166c" (UID: "ba78ec90-96e7-41c8-a6f9-ceff78ba166c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362463 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362499 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362514 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362528 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362544 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgtx7\" (UniqueName: \"kubernetes.io/projected/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-kube-api-access-vgtx7\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362559 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.362573 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba78ec90-96e7-41c8-a6f9-ceff78ba166c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.512814 5030 generic.go:334] "Generic (PLEG): container finished" podID="5375014e-2365-4042-910d-80304789f745" containerID="84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8" exitCode=1 Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.512866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerDied","Data":"84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8"} Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.512898 5030 scope.go:117] "RemoveContainer" containerID="afde6f101ec2199029e3a43a4fc80bd33d0468cad716f37b53e3df5877a5e468" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.513408 5030 scope.go:117] "RemoveContainer" containerID="84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8" Jan 20 23:34:55 crc kubenswrapper[5030]: E0120 23:34:55.514283 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(5375014e-2365-4042-910d-80304789f745)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="5375014e-2365-4042-910d-80304789f745" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.530776 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/1.log" Jan 20 23:34:55 crc 
kubenswrapper[5030]: I0120 23:34:55.532434 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-55c6fff65d-h7t2s_ba78ec90-96e7-41c8-a6f9-ceff78ba166c/neutron-api/0.log" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.532491 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerID="5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1" exitCode=137 Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.532553 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.532601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerDied","Data":"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1"} Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.532649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-55c6fff65d-h7t2s" event={"ID":"ba78ec90-96e7-41c8-a6f9-ceff78ba166c","Type":"ContainerDied","Data":"687541627ea37a329b0829eb92018510c1bf895b4c6761fac6f5d8a5ce4f236a"} Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.617137 5030 scope.go:117] "RemoveContainer" containerID="5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.627452 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.637805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-55c6fff65d-h7t2s"] Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.643921 5030 scope.go:117] "RemoveContainer" containerID="5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.673924 5030 scope.go:117] "RemoveContainer" containerID="dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.746198 5030 scope.go:117] "RemoveContainer" containerID="5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1" Jan 20 23:34:55 crc kubenswrapper[5030]: E0120 23:34:55.746547 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1\": container with ID starting with 5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1 not found: ID does not exist" containerID="5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.746590 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1"} err="failed to get container status \"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1\": rpc error: code = NotFound desc = could not find container \"5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1\": container with ID starting with 5cddc67edd2f5935f1b5d1f382f2fcc2e653907621cfa3680325fd62aadb8aa1 not found: ID does not exist" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.746617 5030 scope.go:117] "RemoveContainer" 
containerID="5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8" Jan 20 23:34:55 crc kubenswrapper[5030]: E0120 23:34:55.746919 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8\": container with ID starting with 5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8 not found: ID does not exist" containerID="5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.746949 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8"} err="failed to get container status \"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8\": rpc error: code = NotFound desc = could not find container \"5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8\": container with ID starting with 5e694b41d33d8c2fffa8c96991878aaf037227af1c4a3149eb6d66b6726458a8 not found: ID does not exist" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.746972 5030 scope.go:117] "RemoveContainer" containerID="dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4" Jan 20 23:34:55 crc kubenswrapper[5030]: E0120 23:34:55.747689 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4\": container with ID starting with dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4 not found: ID does not exist" containerID="dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4" Jan 20 23:34:55 crc kubenswrapper[5030]: I0120 23:34:55.747712 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4"} err="failed to get container status \"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4\": rpc error: code = NotFound desc = could not find container \"dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4\": container with ID starting with dcdc63e978b66aea26fcff51171500f5e3244ba6e5d9533407f5b85a501aa9c4 not found: ID does not exist" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.003654 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" path="/var/lib/kubelet/pods/515ad7d0-529c-4fe7-8a8b-e1af630713fc/volumes" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.005301 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" path="/var/lib/kubelet/pods/ba78ec90-96e7-41c8-a6f9-ceff78ba166c/volumes" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.166085 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.168332 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 
23:34:56.169045 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.545738 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" exitCode=1 Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.545823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerDied","Data":"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8"} Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.545860 5030 scope.go:117] "RemoveContainer" containerID="b8d1ff9f1a8865150e0be3cb7d84d43b40536cc4bc648a2c3d8af999d1a89102" Jan 20 23:34:56 crc kubenswrapper[5030]: I0120 23:34:56.546563 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:34:56 crc kubenswrapper[5030]: E0120 23:34:56.561212 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(9c05bad7-0581-4182-9c7b-7a790c22cf64)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.619807 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.624449 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.625122 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:34:57 crc kubenswrapper[5030]: E0120 23:34:57.625344 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(9c05bad7-0581-4182-9c7b-7a790c22cf64)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.692135 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.692352 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" podUID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" containerName="keystone-api" containerID="cri-o://a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c" gracePeriod=30 Jan 20 23:34:57 crc kubenswrapper[5030]: I0120 23:34:57.974258 5030 scope.go:117] "RemoveContainer" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.592782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerStarted","Data":"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e"} Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.593228 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.715129 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7742d5e0_8ca6_4c3a_8c6d_706c1bbc74e6.slice" Jan 20 23:34:58 crc kubenswrapper[5030]: E0120 23:34:58.715185 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7742d5e0_8ca6_4c3a_8c6d_706c1bbc74e6.slice" pod="openstack-kuttl-tests/nova-api-0" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.752455 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.752572 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.752590 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:34:58 crc kubenswrapper[5030]: I0120 23:34:58.753702 5030 scope.go:117] "RemoveContainer" containerID="84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8" Jan 20 23:34:58 crc kubenswrapper[5030]: E0120 23:34:58.754274 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(5375014e-2365-4042-910d-80304789f745)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="5375014e-2365-4042-910d-80304789f745" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.608076 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.648879 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.658684 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.702396 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703073 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703104 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-log" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703168 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-api" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703187 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703197 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703270 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703280 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703311 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703327 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 
23:34:59.703338 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: E0120 23:34:59.703357 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703367 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703663 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703687 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703705 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45231f32-366a-4f97-96b7-c59f7f059770" containerName="placement-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703723 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703735 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-httpd" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703753 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703783 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba78ec90-96e7-41c8-a6f9-ceff78ba166c" containerName="neutron-api" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703809 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="515ad7d0-529c-4fe7-8a8b-e1af630713fc" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.703833 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f522d3b2-124f-4af1-a548-408c3517ef13" containerName="barbican-api-log" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.708218 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.714097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.714216 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.714341 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.739325 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.778398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.796246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.859974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.860033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.860823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.861251 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.861574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99z2s\" (UniqueName: \"kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.861825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.962917 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.963895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99z2s\" (UniqueName: \"kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.964048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.964761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.964812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.964941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.965067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.970134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.970299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.972136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 
23:34:59.975651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.981379 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6" path="/var/lib/kubelet/pods/7742d5e0-8ca6-4c3a-8c6d-706c1bbc74e6/volumes" Jan 20 23:34:59 crc kubenswrapper[5030]: I0120 23:34:59.988293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99z2s\" (UniqueName: \"kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s\") pod \"nova-api-0\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:00 crc kubenswrapper[5030]: I0120 23:35:00.037545 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:00 crc kubenswrapper[5030]: I0120 23:35:00.095528 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb3a2955e-219c-481d-bb5d-ca59f20d1255"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb3a2955e-219c-481d-bb5d-ca59f20d1255] : Timed out while waiting for systemd to remove kubepods-besteffort-podb3a2955e_219c_481d_bb5d_ca59f20d1255.slice" Jan 20 23:35:00 crc kubenswrapper[5030]: I0120 23:35:00.495574 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:35:00 crc kubenswrapper[5030]: W0120 23:35:00.506771 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16aae85a_a502_4b2e_9663_e8c50d01f657.slice/crio-6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be WatchSource:0}: Error finding container 6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be: Status 404 returned error can't find the container with id 6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be Jan 20 23:35:00 crc kubenswrapper[5030]: I0120 23:35:00.623812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerStarted","Data":"6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be"} Jan 20 23:35:00 crc kubenswrapper[5030]: I0120 23:35:00.639652 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.331353 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.357133 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.357408 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" containerName="openstackclient" containerID="cri-o://4ad92fd7919def846bc376773b3ae36feeaf8411e7fdf8b0ef42055825101089" gracePeriod=2 Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.367610 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.380924 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.392557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.392783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.392838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.392948 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.392993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.393040 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.393069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbd9\" (UniqueName: \"kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc 
kubenswrapper[5030]: I0120 23:35:01.393100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs\") pod \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\" (UID: \"5fffcdc3-dbc6-49d2-9e77-96c41651ee20\") " Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.397662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.398423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts" (OuterVolumeSpecName: "scripts") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.403896 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.410317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.411822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9" (OuterVolumeSpecName: "kube-api-access-lvbd9") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "kube-api-access-lvbd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.419263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.420296 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:35:01 crc kubenswrapper[5030]: E0120 23:35:01.420815 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" containerName="keystone-api" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.420830 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" containerName="keystone-api" Jan 20 23:35:01 crc kubenswrapper[5030]: E0120 23:35:01.420868 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" containerName="openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.420875 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" containerName="openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.421099 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" containerName="keystone-api" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.421117 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" containerName="openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.421803 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.423049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data" (OuterVolumeSpecName: "config-data") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.441787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.450248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.469607 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5fffcdc3-dbc6-49d2-9e77-96c41651ee20" (UID: "5fffcdc3-dbc6-49d2-9e77-96c41651ee20"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.505814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.505861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltr8f\" (UniqueName: \"kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.505888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506180 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506197 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506230 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506326 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506341 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.506354 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.507113 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbd9\" (UniqueName: \"kubernetes.io/projected/5fffcdc3-dbc6-49d2-9e77-96c41651ee20-kube-api-access-lvbd9\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:01 crc 
kubenswrapper[5030]: I0120 23:35:01.608456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.608860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.608895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltr8f\" (UniqueName: \"kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.608946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.609770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.612234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.613962 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.625426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltr8f\" (UniqueName: \"kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f\") pod \"openstackclient\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.651599 5030 generic.go:334] "Generic (PLEG): container finished" podID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" containerID="a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c" exitCode=0 Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.651664 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.651663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" event={"ID":"5fffcdc3-dbc6-49d2-9e77-96c41651ee20","Type":"ContainerDied","Data":"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c"} Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.651718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-595565b585-58jwl" event={"ID":"5fffcdc3-dbc6-49d2-9e77-96c41651ee20","Type":"ContainerDied","Data":"134d7060918f615b30082a51ebe9b97f6a9e6095e61ab8e55495cb0466c2d9b2"} Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.651735 5030 scope.go:117] "RemoveContainer" containerID="a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.653597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerStarted","Data":"a12e77d1b8be494229f4bdef24c5a9771212355912d130fbc08c36e1abf41aa5"} Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.653644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerStarted","Data":"0ab99d99f78a27c31a39fe65464563e8a92791c72ac677499b6836093b92f3ef"} Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.674874 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.674859878 podStartE2EDuration="2.674859878s" podCreationTimestamp="2026-01-20 23:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:35:01.673747771 +0000 UTC m=+3573.994008049" watchObservedRunningTime="2026-01-20 23:35:01.674859878 +0000 UTC m=+3573.995120166" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.682086 5030 scope.go:117] "RemoveContainer" containerID="a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c" Jan 20 23:35:01 crc kubenswrapper[5030]: E0120 23:35:01.683320 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c\": container with ID starting with a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c not found: ID does not exist" containerID="a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.683359 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c"} err="failed to get container status \"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c\": rpc error: code = NotFound desc = could not find container \"a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c\": container with ID starting with a68d733c81a876f1b864477403c6ac9c0c4ab39c56b7e56209376ae051cc669c not found: ID does not exist" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.703189 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.710885 
5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-595565b585-58jwl"] Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.816125 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:01 crc kubenswrapper[5030]: I0120 23:35:01.973232 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fffcdc3-dbc6-49d2-9e77-96c41651ee20" path="/var/lib/kubelet/pods/5fffcdc3-dbc6-49d2-9e77-96c41651ee20/volumes" Jan 20 23:35:02 crc kubenswrapper[5030]: I0120 23:35:02.327439 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:35:02 crc kubenswrapper[5030]: I0120 23:35:02.663091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"eddfc91d-df9b-4683-b3e6-5bad910ef921","Type":"ContainerStarted","Data":"c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc"} Jan 20 23:35:02 crc kubenswrapper[5030]: I0120 23:35:02.663439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"eddfc91d-df9b-4683-b3e6-5bad910ef921","Type":"ContainerStarted","Data":"5e9bce3cd364c7926e9d7f3dc023b1cc6248fb38c226f7d45bc9a627bdf533de"} Jan 20 23:35:02 crc kubenswrapper[5030]: I0120 23:35:02.685120 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=1.685068721 podStartE2EDuration="1.685068721s" podCreationTimestamp="2026-01-20 23:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:35:02.678337557 +0000 UTC m=+3574.998597855" watchObservedRunningTime="2026-01-20 23:35:02.685068721 +0000 UTC m=+3575.005329049" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.337742 5030 scope.go:117] "RemoveContainer" containerID="39015b910f2f39e56b985ca85a7e5d5c7980fd4ff906f21ce55154bbcc3bfbc9" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.372598 5030 scope.go:117] "RemoveContainer" containerID="afe9fcbff8775eba20651455c79c527a300dd3b3f856f1adcfb7be13f0d24a0b" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.458145 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.507732 5030 scope.go:117] "RemoveContainer" containerID="51240c072577546536a7362dad9a075e93e1489c8474e8d196c9d05db5b7e28b" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.549281 5030 scope.go:117] "RemoveContainer" containerID="ac3990bd9655b2eaf6d0a47482dc86ec1760584475b052542bdff00f237ec4ac" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.599183 5030 scope.go:117] "RemoveContainer" containerID="511cd6ffef13ece96e755ef9d69349d5ab4a8423bbfa256860fb87cc018a49db" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.656729 5030 scope.go:117] "RemoveContainer" containerID="ec590bd73705be339baa9b7c1b90768301a4f289003bd0b2191fc3d357538be2" Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.684454 5030 generic.go:334] "Generic (PLEG): container finished" podID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" containerID="4ad92fd7919def846bc376773b3ae36feeaf8411e7fdf8b0ef42055825101089" exitCode=137 Jan 20 23:35:03 crc kubenswrapper[5030]: I0120 23:35:03.935543 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.134771 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.164066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret\") pod \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.164160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwzb\" (UniqueName: \"kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb\") pod \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.164214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle\") pod \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.164364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config\") pod \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\" (UID: \"e4ee31a9-b56d-4e9c-8b51-7b344bb678d0\") " Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.170422 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb" (OuterVolumeSpecName: "kube-api-access-4fwzb") pod "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" (UID: "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0"). InnerVolumeSpecName "kube-api-access-4fwzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.189944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" (UID: "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.198978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" (UID: "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.223256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" (UID: "e4ee31a9-b56d-4e9c-8b51-7b344bb678d0"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.266563 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.266586 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwzb\" (UniqueName: \"kubernetes.io/projected/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-kube-api-access-4fwzb\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.266596 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.266606 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.702208 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.702260 5030 scope.go:117] "RemoveContainer" containerID="4ad92fd7919def846bc376773b3ae36feeaf8411e7fdf8b0ef42055825101089" Jan 20 23:35:04 crc kubenswrapper[5030]: I0120 23:35:04.718579 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.788598 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.790285 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795501 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795602 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bn2k\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.795686 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.829750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.898041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.898459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bn2k\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.898653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.898917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: 
\"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.899803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.905533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.905653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.905863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.906119 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.908731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.918474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bn2k\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k\") pod \"swift-proxy-67694ddbb6-j66sw\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:05 crc kubenswrapper[5030]: I0120 23:35:05.972724 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ee31a9-b56d-4e9c-8b51-7b344bb678d0" path="/var/lib/kubelet/pods/e4ee31a9-b56d-4e9c-8b51-7b344bb678d0/volumes" Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.105591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.105916 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-central-agent" 
containerID="cri-o://0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755" gracePeriod=30 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.106060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="proxy-httpd" containerID="cri-o://65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9" gracePeriod=30 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.106119 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="sg-core" containerID="cri-o://c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75" gracePeriod=30 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.106171 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-notification-agent" containerID="cri-o://d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76" gracePeriod=30 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.117320 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.569089 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:35:06 crc kubenswrapper[5030]: W0120 23:35:06.578592 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda832d1c9_4838_4070_ad8d_e2af76d8ca47.slice/crio-bbf7a2150dd0d618cdf2d49f18bd26756dc2327913a93f25ffb140279d91b870 WatchSource:0}: Error finding container bbf7a2150dd0d618cdf2d49f18bd26756dc2327913a93f25ffb140279d91b870: Status 404 returned error can't find the container with id bbf7a2150dd0d618cdf2d49f18bd26756dc2327913a93f25ffb140279d91b870 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.742656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerStarted","Data":"bbf7a2150dd0d618cdf2d49f18bd26756dc2327913a93f25ffb140279d91b870"} Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.751051 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerID="65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9" exitCode=0 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.751078 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerID="c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75" exitCode=2 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.751086 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerID="0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755" exitCode=0 Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.751104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerDied","Data":"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9"} Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 
23:35:06.751128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerDied","Data":"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75"} Jan 20 23:35:06 crc kubenswrapper[5030]: I0120 23:35:06.751136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerDied","Data":"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755"} Jan 20 23:35:07 crc kubenswrapper[5030]: I0120 23:35:07.763023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerStarted","Data":"3b503db37878b35095c76a10013149fa599afad387b5f267dc2cef4bd347d150"} Jan 20 23:35:07 crc kubenswrapper[5030]: I0120 23:35:07.763277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerStarted","Data":"901c65079c86eab89799533dc96ef980a3db2d852f2b58ea3c34b15a6460f991"} Jan 20 23:35:07 crc kubenswrapper[5030]: I0120 23:35:07.763295 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:07 crc kubenswrapper[5030]: I0120 23:35:07.788215 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" podStartSLOduration=2.788190397 podStartE2EDuration="2.788190397s" podCreationTimestamp="2026-01-20 23:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:35:07.78173556 +0000 UTC m=+3580.101995888" watchObservedRunningTime="2026-01-20 23:35:07.788190397 +0000 UTC m=+3580.108450725" Jan 20 23:35:07 crc kubenswrapper[5030]: I0120 23:35:07.969613 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:35:07 crc kubenswrapper[5030]: E0120 23:35:07.969890 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(9c05bad7-0581-4182-9c7b-7a790c22cf64)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" Jan 20 23:35:08 crc kubenswrapper[5030]: I0120 23:35:08.777282 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:10 crc kubenswrapper[5030]: I0120 23:35:10.038647 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:10 crc kubenswrapper[5030]: I0120 23:35:10.039399 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:10 crc kubenswrapper[5030]: I0120 23:35:10.962602 5030 scope.go:117] "RemoveContainer" containerID="84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.053299 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" 
podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.154:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.053331 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.154:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.130333 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.131735 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.247231 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.247539 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-httpd" containerID="cri-o://3ca1567f269cf2f2bd9e348bfea75b6cd1bfe32af33811e55a8d700adc4b3be8" gracePeriod=30 Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.248179 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-server" containerID="cri-o://9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91" gracePeriod=30 Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.564714 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvd2k\" (UniqueName: \"kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.642929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.643012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.643035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd\") pod \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\" (UID: \"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa\") " Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.643646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.644346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.654845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts" (OuterVolumeSpecName: "scripts") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.655186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k" (OuterVolumeSpecName: "kube-api-access-pvd2k") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "kube-api-access-pvd2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.695823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.719770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.720341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744751 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744785 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvd2k\" (UniqueName: \"kubernetes.io/projected/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-kube-api-access-pvd2k\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744799 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744808 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744817 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744825 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.744834 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: E0120 23:35:11.784643 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7862bdc0_be79_4653_95d6_88b5ebcb8f04.slice/crio-conmon-9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7862bdc0_be79_4653_95d6_88b5ebcb8f04.slice/crio-9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.804439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data" (OuterVolumeSpecName: "config-data") pod "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" (UID: "bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.824110 5030 generic.go:334] "Generic (PLEG): container finished" podID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerID="9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91" exitCode=0 Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.824140 5030 generic.go:334] "Generic (PLEG): container finished" podID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerID="3ca1567f269cf2f2bd9e348bfea75b6cd1bfe32af33811e55a8d700adc4b3be8" exitCode=0 Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.824192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerDied","Data":"9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91"} Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.824246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerDied","Data":"3ca1567f269cf2f2bd9e348bfea75b6cd1bfe32af33811e55a8d700adc4b3be8"} Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.827820 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerID="d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76" exitCode=0 Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.827861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerDied","Data":"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76"} Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.827884 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.827910 5030 scope.go:117] "RemoveContainer" containerID="65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.827893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa","Type":"ContainerDied","Data":"a5b2037f83c532db82e0c2292641947cc293e9fb55bc3c6fe284769e3a918862"} Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.831868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerStarted","Data":"1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d"} Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.847068 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.854474 5030 scope.go:117] "RemoveContainer" containerID="c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.877675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.884611 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.900587 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:11 crc kubenswrapper[5030]: E0120 23:35:11.901278 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="sg-core" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="sg-core" Jan 20 23:35:11 crc kubenswrapper[5030]: E0120 23:35:11.901308 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="proxy-httpd" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901327 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="proxy-httpd" Jan 20 23:35:11 crc kubenswrapper[5030]: E0120 23:35:11.901357 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-notification-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901365 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-notification-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: E0120 23:35:11.901381 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-central-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901387 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-central-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901595 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-notification-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901631 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="proxy-httpd" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901645 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="ceilometer-central-agent" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.901660 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" containerName="sg-core" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.904415 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.908795 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.908970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.909158 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.910526 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.939850 5030 scope.go:117] "RemoveContainer" containerID="d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.948880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd\") pod 
\"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnrb\" (UniqueName: \"kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.949194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:11 crc kubenswrapper[5030]: I0120 23:35:11.981667 5030 scope.go:117] "RemoveContainer" containerID="0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.011848 5030 scope.go:117] "RemoveContainer" containerID="65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9" Jan 20 23:35:12 crc kubenswrapper[5030]: E0120 23:35:12.012540 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9\": container with ID starting with 65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9 not found: ID does not exist" containerID="65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.012584 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9"} err="failed to get container status \"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9\": rpc error: code = NotFound desc = could not find container \"65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9\": container with ID starting with 65120d5a48a7df9b0fba3d1fc2438131bd8b3abbd4f197cb6243fad8d5660da9 not found: ID does not exist" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.012607 5030 scope.go:117] "RemoveContainer" containerID="c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75" Jan 20 23:35:12 crc kubenswrapper[5030]: E0120 23:35:12.012858 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75\": container with ID starting with c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75 not found: ID does not exist" containerID="c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.012874 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75"} err="failed to get container status \"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75\": rpc error: code = NotFound desc = could not find container \"c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75\": container with ID starting with c027ebf22386d1991931da1aeae88533b95e4a4bd29b95e681b5273a9a061c75 not found: ID does not exist" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.012894 5030 scope.go:117] "RemoveContainer" containerID="d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76" Jan 20 23:35:12 crc kubenswrapper[5030]: E0120 23:35:12.013075 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76\": container with ID starting with d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76 not found: ID does not exist" containerID="d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.013095 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76"} err="failed to get container status \"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76\": rpc error: code = NotFound desc = could not find container \"d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76\": container with ID starting with d43ead94b9f33e2324f591cc408dbb66b68c1bb93ea0da99cc4d0aa6f566ff76 not found: ID does not exist" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.013111 5030 scope.go:117] "RemoveContainer" containerID="0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755" Jan 20 23:35:12 crc kubenswrapper[5030]: E0120 23:35:12.013304 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755\": container with ID starting with 0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755 not found: ID does not exist" containerID="0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.013324 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755"} err="failed to get container status \"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755\": rpc error: code = NotFound desc = could not find container \"0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755\": container with ID starting with 0192f87401264fe14604fce39af47d95952ac3607105d0e71210d811c88df755 not found: ID does not exist" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.020402 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa" path="/var/lib/kubelet/pods/bfe9ef3b-f2db-44f8-bf77-71fdc3147ffa/volumes" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdnrb\" (UniqueName: \"kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.050765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.052565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.059582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.064510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data\") pod \"ceilometer-0\" (UID: 
\"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.064920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.065673 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.070827 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.076157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.080198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdnrb\" (UniqueName: \"kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb\") pod \"ceilometer-0\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.153681 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.230012 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.355251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.355531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.355589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.356101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.356172 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.356198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cntt9\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.356218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.356287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle\") pod \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\" (UID: \"7862bdc0-be79-4653-95d6-88b5ebcb8f04\") " Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.361145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.361457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.362262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.363744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9" (OuterVolumeSpecName: "kube-api-access-cntt9") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "kube-api-access-cntt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.419418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.438206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.451727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459102 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459127 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7862bdc0-be79-4653-95d6-88b5ebcb8f04-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459136 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459148 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cntt9\" (UniqueName: \"kubernetes.io/projected/7862bdc0-be79-4653-95d6-88b5ebcb8f04-kube-api-access-cntt9\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459158 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459167 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.459176 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.460786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data" (OuterVolumeSpecName: "config-data") pod "7862bdc0-be79-4653-95d6-88b5ebcb8f04" (UID: "7862bdc0-be79-4653-95d6-88b5ebcb8f04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.561143 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7862bdc0-be79-4653-95d6-88b5ebcb8f04-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.715053 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.842715 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.842721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" event={"ID":"7862bdc0-be79-4653-95d6-88b5ebcb8f04","Type":"ContainerDied","Data":"d71a72fc88c049637f628543d92a183999d12f38fd28e487b1624442390391f8"} Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.842830 5030 scope.go:117] "RemoveContainer" containerID="9c1a4fdb8c86de289792869fbdaa6943771f4e66e1d91b446b548a98eb55fa91" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.844103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerStarted","Data":"d1b1a2ac219bf7f63a00978610d3fcc21b78cb45aa5dba3eefec9116cc452426"} Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.868315 5030 scope.go:117] "RemoveContainer" containerID="3ca1567f269cf2f2bd9e348bfea75b6cd1bfe32af33811e55a8d700adc4b3be8" Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.880756 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:35:12 crc kubenswrapper[5030]: I0120 23:35:12.890736 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64"] Jan 20 23:35:13 crc kubenswrapper[5030]: I0120 23:35:13.747929 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:35:13 crc kubenswrapper[5030]: I0120 23:35:13.871104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerStarted","Data":"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc"} Jan 20 23:35:13 crc kubenswrapper[5030]: I0120 23:35:13.979874 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" path="/var/lib/kubelet/pods/7862bdc0-be79-4653-95d6-88b5ebcb8f04/volumes" Jan 20 23:35:14 crc kubenswrapper[5030]: I0120 23:35:14.883501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerStarted","Data":"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32"} Jan 20 23:35:15 crc kubenswrapper[5030]: I0120 23:35:15.897242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerStarted","Data":"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2"} Jan 20 23:35:16 crc kubenswrapper[5030]: I0120 23:35:16.890116 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.236:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:35:16 crc kubenswrapper[5030]: I0120 23:35:16.890283 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-6864c8c87d-g8v64" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.1.236:8080/healthcheck\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:35:16 crc kubenswrapper[5030]: I0120 23:35:16.923702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerStarted","Data":"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54"} Jan 20 23:35:16 crc kubenswrapper[5030]: I0120 23:35:16.924697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:16 crc kubenswrapper[5030]: I0120 23:35:16.960035 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.58030319 podStartE2EDuration="5.959997569s" podCreationTimestamp="2026-01-20 23:35:11 +0000 UTC" firstStartedPulling="2026-01-20 23:35:12.733365839 +0000 UTC m=+3585.053626127" lastFinishedPulling="2026-01-20 23:35:16.113060208 +0000 UTC m=+3588.433320506" observedRunningTime="2026-01-20 23:35:16.952671461 +0000 UTC m=+3589.272931769" watchObservedRunningTime="2026-01-20 23:35:16.959997569 +0000 UTC m=+3589.280257897" Jan 20 23:35:18 crc kubenswrapper[5030]: I0120 23:35:18.748210 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:35:18 crc kubenswrapper[5030]: I0120 23:35:18.783528 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:35:18 crc kubenswrapper[5030]: I0120 23:35:18.993292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.046210 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.046563 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.047782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.061561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.963783 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.964881 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:20 crc kubenswrapper[5030]: I0120 23:35:20.982812 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:35:22 crc kubenswrapper[5030]: I0120 23:35:22.007389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerStarted","Data":"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682"} Jan 20 23:35:22 crc kubenswrapper[5030]: I0120 23:35:22.008314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:35:26 crc kubenswrapper[5030]: I0120 23:35:26.442559 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:35:26 crc kubenswrapper[5030]: I0120 23:35:26.542007 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:35:26 crc kubenswrapper[5030]: I0120 23:35:26.544959 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-api" containerID="cri-o://32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006" gracePeriod=30 Jan 20 23:35:26 crc kubenswrapper[5030]: I0120 23:35:26.545044 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-httpd" containerID="cri-o://e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946" gracePeriod=30 Jan 20 23:35:27 crc kubenswrapper[5030]: I0120 23:35:27.053259 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerID="e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946" exitCode=0 Jan 20 23:35:27 crc kubenswrapper[5030]: I0120 23:35:27.053306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerDied","Data":"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946"} Jan 20 23:35:27 crc kubenswrapper[5030]: I0120 23:35:27.657185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:35:40 crc kubenswrapper[5030]: I0120 23:35:40.157346 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:35:40 crc kubenswrapper[5030]: I0120 23:35:40.157974 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:35:42 crc kubenswrapper[5030]: I0120 23:35:42.245582 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:35:44 crc kubenswrapper[5030]: I0120 23:35:44.198511 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.223:9696/\": dial tcp 10.217.1.223:9696: connect: connection refused" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.063505 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5ffdf8f866-wq6lw_a5f7d386-d22d-49cd-8ffd-877ac3c3108f/neutron-api/0.log" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.064213 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scg64\" (UniqueName: \"kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.237804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config\") pod \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\" (UID: \"a5f7d386-d22d-49cd-8ffd-877ac3c3108f\") " Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.243804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64" (OuterVolumeSpecName: "kube-api-access-scg64") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "kube-api-access-scg64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.256998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.299053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.307636 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.318429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.325583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.331364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config" (OuterVolumeSpecName: "config") pod "a5f7d386-d22d-49cd-8ffd-877ac3c3108f" (UID: "a5f7d386-d22d-49cd-8ffd-877ac3c3108f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.339712 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.339801 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.339875 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.339934 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.339993 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.340061 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.340148 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scg64\" (UniqueName: \"kubernetes.io/projected/a5f7d386-d22d-49cd-8ffd-877ac3c3108f-kube-api-access-scg64\") on node \"crc\" DevicePath \"\"" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418168 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5ffdf8f866-wq6lw_a5f7d386-d22d-49cd-8ffd-877ac3c3108f/neutron-api/0.log" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418252 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerID="32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006" exitCode=137 Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerDied","Data":"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006"} Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" event={"ID":"a5f7d386-d22d-49cd-8ffd-877ac3c3108f","Type":"ContainerDied","Data":"916135fad9cef7b09807c2cb58922e481b601cc7c9c43b0f59be62f4506292d8"} Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418338 5030 scope.go:117] "RemoveContainer" containerID="e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.418484 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.454248 5030 scope.go:117] "RemoveContainer" containerID="32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.458093 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.467795 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5ffdf8f866-wq6lw"] Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.487782 5030 scope.go:117] "RemoveContainer" containerID="e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946" Jan 20 23:35:57 crc kubenswrapper[5030]: E0120 23:35:57.488284 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946\": container with ID starting with e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946 not found: ID does not exist" containerID="e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.488329 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946"} err="failed to get container status \"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946\": rpc error: code = NotFound desc = could not find container \"e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946\": container with ID starting with e671209acb03220158445d6ada0caf834e79aeef65a37553919aa2552ea75946 not found: ID does not exist" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.488388 5030 scope.go:117] "RemoveContainer" containerID="32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006" Jan 20 23:35:57 crc kubenswrapper[5030]: E0120 23:35:57.488893 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006\": container with ID starting with 32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006 not found: ID does not exist" containerID="32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006" Jan 20 23:35:57 crc kubenswrapper[5030]: I0120 23:35:57.488974 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006"} err="failed to get container status \"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006\": rpc error: code = NotFound desc = could not find container \"32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006\": container with ID starting with 32707ab577e549414bd76831b2cae74f34212e803ea399d716a06b2198ac6006 not found: ID does not exist" Jan 20 23:35:58 crc kubenswrapper[5030]: I0120 23:35:58.002913 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" path="/var/lib/kubelet/pods/a5f7d386-d22d-49cd-8ffd-877ac3c3108f/volumes" Jan 20 23:36:04 crc kubenswrapper[5030]: I0120 23:36:04.612937 5030 scope.go:117] "RemoveContainer" containerID="7cb774b7ad67df28cae541d2b5fd9915c0040832858e0d9cbf4f702324dd16b6" Jan 20 23:36:04 crc kubenswrapper[5030]: I0120 
23:36:04.652334 5030 scope.go:117] "RemoveContainer" containerID="9d17498b4f63fe63c2218b1ac9b50d472f4d2ff957fc47efa50328e708c8df4a" Jan 20 23:36:04 crc kubenswrapper[5030]: I0120 23:36:04.692157 5030 scope.go:117] "RemoveContainer" containerID="72a954b6da79217b10f1275cd3e8dd3a0eb5ed45f587ffd21e26c1f49bee64c7" Jan 20 23:36:10 crc kubenswrapper[5030]: I0120 23:36:10.158156 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:36:10 crc kubenswrapper[5030]: I0120 23:36:10.158834 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.207444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.215107 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.215564 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" containerName="openstackclient" containerID="cri-o://c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc" gracePeriod=2 Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.236105 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.260123 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x"] Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.260959 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-api" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.261042 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-api" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.261119 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.261184 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.261273 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" containerName="openstackclient" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.261337 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" containerName="openstackclient" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.261401 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-server" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.261481 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-server" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.261587 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.265962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.270395 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" containerName="openstackclient" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.270465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.270474 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-httpd" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.270501 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f7d386-d22d-49cd-8ffd-877ac3c3108f" containerName="neutron-api" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.270527 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7862bdc0-be79-4653-95d6-88b5ebcb8f04" containerName="proxy-server" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.279420 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.327808 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc"] Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.332180 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.332247 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data podName:e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:28.832229677 +0000 UTC m=+3661.152489965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data") pod "rabbitmq-server-0" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c") : configmap "rabbitmq-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.366832 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.412077 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.413560 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.424017 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433313 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt67c\" (UniqueName: \"kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd2px\" (UniqueName: \"kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.433969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.434069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.434161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.458178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.476024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.490707 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8q5qn"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.525686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd2px\" (UniqueName: \"kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzxs\" (UniqueName: \"kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536257 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt67c\" (UniqueName: \"kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.536421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.537491 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.548901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.549226 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.549353 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle podName:8705c389-fda4-4ef7-8a28-93efcdff880b nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.049336363 +0000 UTC m=+3661.369596651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle") pod "barbican-keystone-listener-78ff77fc4b-82l2x" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b") : secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.549848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.549984 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.550081 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle podName:afaaf0e0-1ddc-4335-843b-c2ea49492b80 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.05007129 +0000 UTC m=+3661.370331578 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle") pod "barbican-worker-69f86ff78c-d95nc" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80") : secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.550166 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-8q5qn"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.589427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.591242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.593330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.598479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd2px\" (UniqueName: \"kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.599253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt67c\" (UniqueName: \"kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.617105 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-0855-account-create-update-x55lz"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.626070 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.630323 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.641215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzxs\" (UniqueName: \"kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.641283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.642000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.649656 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-0855-account-create-update-x55lz"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.683705 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.723998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzxs\" (UniqueName: \"kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs\") pod \"root-account-create-update-d9vgn\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748261 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrvw\" (UniqueName: \"kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.748824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.764843 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.804290 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.804965 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="openstack-network-exporter" containerID="cri-o://750810e06e2ba2a1b4a96b2cb486db619a924a3c4e105f8d13577a72affcd1c6" gracePeriod=300 Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.819687 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.836497 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.836824 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="openstack-network-exporter" containerID="cri-o://c7d286487506cf34d40605c251876f3d0a39a62208083381222f15802dd6c018" gracePeriod=300 Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjrvw\" (UniqueName: \"kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.851584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.852605 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.852695 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data podName:e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.852675025 +0000 UTC m=+3662.172935313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data") pod "rabbitmq-server-0" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c") : configmap "rabbitmq-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.852958 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.853036 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.353016574 +0000 UTC m=+3661.673276862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "combined-ca-bundle" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.853081 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.853100 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.353094135 +0000 UTC m=+3661.673354413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-internal-svc" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.853409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.853478 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.853502 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.353494515 +0000 UTC m=+3661.673754803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-public-svc" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.860399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.860467 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-c1a3-account-create-update-kqpvq"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.869894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.902956 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjrvw\" (UniqueName: \"kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.920211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-2712-account-create-update-kvv69"] Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.943440 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-2712-account-create-update-kvv69"] Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.958249 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: E0120 23:36:28.958296 
5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data podName:707515d1-17d7-47ff-be95-73414a6a2a95 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:29.458282801 +0000 UTC m=+3661.778543089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data") pod "rabbitmq-cell1-server-0" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:28 crc kubenswrapper[5030]: I0120 23:36:28.978281 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.001446 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.001944 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" containerID="cri-o://2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.002410 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="openstack-network-exporter" containerID="cri-o://0c8ba91ac8628ce6044a35b4f7d9eb67441f4f1ee295ec42f3004b90599321e6" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.050651 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.052885 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" probeResult="failure" output=< Jan 20 23:36:29 crc kubenswrapper[5030]: 2026-01-20T23:36:29Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:36:29 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:36:29 crc kubenswrapper[5030]: 2026-01-20T23:36:29Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Jan 20 23:36:29 crc kubenswrapper[5030]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Jan 20 23:36:29 crc kubenswrapper[5030]: > Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.061404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.061449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 
23:36:29.061578 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.061694 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle podName:afaaf0e0-1ddc-4335-843b-c2ea49492b80 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.061603903 +0000 UTC m=+3662.381864191 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle") pod "barbican-worker-69f86ff78c-d95nc" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80") : secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.061735 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.061754 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle podName:8705c389-fda4-4ef7-8a28-93efcdff880b nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.061748216 +0000 UTC m=+3662.382008504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle") pod "barbican-keystone-listener-78ff77fc4b-82l2x" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b") : secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.072924 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-fbr4k"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.098932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-rwxx4"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.114608 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="ovsdbserver-sb" containerID="cri-o://bb888a669f56ad697c7a2cb6314f21c6d4d03d654bc273d2bff1d915da2291d6" gracePeriod=300 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.124676 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-7c7c-account-create-update-mwr5p"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.145738 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-fbr4k"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.173568 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="ovsdbserver-nb" containerID="cri-o://a569f0520ff40c5656ae25d907d285351006a662406feefdd835a9730883f656" gracePeriod=300 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.174952 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.176309 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.184357 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.204995 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.211961 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4x6fd"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.223644 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4x6fd"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.249169 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.250726 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.268944 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.290516 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-7fxnx"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.409965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.410053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.415813 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.415902 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.415879308 +0000 UTC m=+3662.736139596 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-public-svc" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.410112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hln4z\" (UniqueName: \"kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.430557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.430747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.430826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.431589 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.431681 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.43166391 +0000 UTC m=+3662.751924198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-internal-svc" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.432399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7zr\" (UniqueName: \"kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr\") pod \"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.433610 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.433678 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.433661858 +0000 UTC m=+3662.753922146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "combined-ca-bundle" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.441549 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-7fxnx"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.504637 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.547859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.547967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hln4z\" (UniqueName: \"kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.547992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.548037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7zr\" (UniqueName: \"kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr\") pod 
\"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.548773 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.548823 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data podName:707515d1-17d7-47ff-be95-73414a6a2a95 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:30.548807294 +0000 UTC m=+3662.869067582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data") pod "rabbitmq-cell1-server-0" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.549093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.549417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts\") pod \"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.599897 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.604419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hln4z\" (UniqueName: \"kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z\") pod \"barbican-6a16-account-create-update-ss6r7\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.605197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7zr\" (UniqueName: \"kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr\") pod \"nova-cell0-7b33-account-create-update-226z8\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.664882 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-eba6-account-create-update-lz769"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.687280 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.699688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.707757 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-t6wkc"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.724573 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-f9brm"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.724611 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-f9brm"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.734684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mktxf"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.744357 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mktxf"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.754673 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755165 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-server" containerID="cri-o://68a3e13cf2691b33d5c70e2cd4ee154f6c6ade16bd21a3ed1c5508abb12bf1aa" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755553 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="swift-recon-cron" containerID="cri-o://5c43afbc3bd855975b6195fe0ab81ffca022d863f4ec31ef3dcc4088f902abf7" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755592 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="rsync" containerID="cri-o://d8020553ea006945bae8427811e7617cdae1effe1553b99640376e2be8359797" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755636 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-expirer" containerID="cri-o://10f2ad934ade0755d3b8fbad653de62253c2374bf376c3091a96065c690f7603" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755668 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-updater" containerID="cri-o://5255250b357a4ac7681649a1e5c0a926b39fbee907e1ba4000a642da16130fdd" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755699 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-auditor" containerID="cri-o://69b7c21f321b09445606557c0e6984dc6d0949ad2a3b0ee72bd8c7a27d07ecc7" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755726 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-replicator" containerID="cri-o://02c333bfad293ec86e408a8920ec51d1c9268264d8402be95b6a1751dcb8d930" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-server" containerID="cri-o://7d9c91edb4a35a4bf32992a5fc61b2174f916fe1a309ebe2c3715d15e2bd931f" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755788 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-updater" containerID="cri-o://e1b5f2a13f3163554624db7b1aafe91f5aee9ab1131474fe0369c5f0f4a9dee1" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755820 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-auditor" containerID="cri-o://142f3acf1108276cc11ce9ff0213af6ed4a6eb46e60c66d8d0e27543458e4c19" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755850 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-replicator" containerID="cri-o://aef8219014260e57e40ee40900cc995c3ff708186b3f043e7917e6d02c4cb21e" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755879 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-server" containerID="cri-o://611008d575119572dd79f498dbab28ea12bc998a1001a31f1ecbe0cfe79cdd53" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755912 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" probeResult="failure" output="command timed out" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755963 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-reaper" containerID="cri-o://a1d0030d1db7eafecbee2fd3e7074cf5fa995ab67d4254165b2fedc019d55a78" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755974 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-auditor" containerID="cri-o://0478c6b5a2aa7d3114b8aba566b34095474deb6aee07d02a4122007b8a828961" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.755986 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-replicator" containerID="cri-o://6fa5a926460488785a7ffba60bb9b7fadfcf332e45c0b16ee382744faf654dce" gracePeriod=30 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.763461 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-4h926"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.772769 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-4h926"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.790735 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.806478 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.807039 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-zlfhp"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.825879 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.840061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.864038 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:36:29 crc kubenswrapper[5030]: E0120 23:36:29.864116 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data podName:e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:31.864101127 +0000 UTC m=+3664.184361405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data") pod "rabbitmq-server-0" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c") : configmap "rabbitmq-config-data" not found Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.896012 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.898204 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.931056 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.970060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="rabbitmq" containerID="cri-o://fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47" gracePeriod=604800 Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.975908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.976077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gfn\" (UniqueName: \"kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:29 crc kubenswrapper[5030]: I0120 23:36:29.976162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.008447 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerID="0c8ba91ac8628ce6044a35b4f7d9eb67441f4f1ee295ec42f3004b90599321e6" exitCode=2 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.033648 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2247a835-f44c-4a87-8d10-9d93b86d6fb9" path="/var/lib/kubelet/pods/2247a835-f44c-4a87-8d10-9d93b86d6fb9/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.034292 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38852246-6c24-4866-921a-ce61995931d7" path="/var/lib/kubelet/pods/38852246-6c24-4866-921a-ce61995931d7/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.034846 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56cbfa09-ec86-4e48-84e6-7452ae29b778" path="/var/lib/kubelet/pods/56cbfa09-ec86-4e48-84e6-7452ae29b778/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.037285 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2d7a8b-c587-4109-ac49-11dbec78bbba" path="/var/lib/kubelet/pods/6e2d7a8b-c587-4109-ac49-11dbec78bbba/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.039693 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ab0bd4-f8be-4a99-87c5-44b8efbaa33d" path="/var/lib/kubelet/pods/75ab0bd4-f8be-4a99-87c5-44b8efbaa33d/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.040213 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7b2c52ab-999e-420d-836c-dc6ea4055ff6" path="/var/lib/kubelet/pods/7b2c52ab-999e-420d-836c-dc6ea4055ff6/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.040870 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c75021-4a7f-4851-aada-24da4ddb3387" path="/var/lib/kubelet/pods/94c75021-4a7f-4851-aada-24da4ddb3387/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.041372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6" path="/var/lib/kubelet/pods/9db94d9e-2b96-4043-bbc7-cf41ae5ce9b6/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.049665 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b6dfdf-c86c-4d99-913b-0c28fb0c1811" path="/var/lib/kubelet/pods/a6b6dfdf-c86c-4d99-913b-0c28fb0c1811/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.050276 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99373ff-558b-4e95-a48e-1b457b727b99" path="/var/lib/kubelet/pods/a99373ff-558b-4e95-a48e-1b457b727b99/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.051512 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3228c2-a636-4f9a-8422-8bc110eec2a8" path="/var/lib/kubelet/pods/af3228c2-a636-4f9a-8422-8bc110eec2a8/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.052180 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6294ee-f735-499f-b340-952f3f7b2ded" path="/var/lib/kubelet/pods/dd6294ee-f735-499f-b340-952f3f7b2ded/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.053252 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e290163a-2fc9-492f-b113-1e1a42ff37f9" path="/var/lib/kubelet/pods/e290163a-2fc9-492f-b113-1e1a42ff37f9/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.053781 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34e0e3b-f240-482d-925b-0c746ed1346e" path="/var/lib/kubelet/pods/e34e0e3b-f240-482d-925b-0c746ed1346e/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.054309 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fbce55-4d90-46db-a2ee-7a17b043c246" path="/var/lib/kubelet/pods/f8fbce55-4d90-46db-a2ee-7a17b043c246/volumes" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.055600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerDied","Data":"0c8ba91ac8628ce6044a35b4f7d9eb67441f4f1ee295ec42f3004b90599321e6"} Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.061387 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_ce2df28d-2515-4995-b928-c2f6d198045d/ovsdbserver-sb/0.log" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.064776 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce2df28d-2515-4995-b928-c2f6d198045d" containerID="c7d286487506cf34d40605c251876f3d0a39a62208083381222f15802dd6c018" exitCode=2 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.064812 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce2df28d-2515-4995-b928-c2f6d198045d" containerID="bb888a669f56ad697c7a2cb6314f21c6d4d03d654bc273d2bff1d915da2291d6" exitCode=143 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.064822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerDied","Data":"c7d286487506cf34d40605c251876f3d0a39a62208083381222f15802dd6c018"} Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.064890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerDied","Data":"bb888a669f56ad697c7a2cb6314f21c6d4d03d654bc273d2bff1d915da2291d6"} Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.066343 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.066572 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-log" containerID="cri-o://a3cc6fa9860339a777b502e8d7cf94ab748c9d05c0ad7323a3d5d25264be03d7" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.066705 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-httpd" containerID="cri-o://9237c481c7aa7834b8c0efe9a0c5fd9901bf2d2eb2b5d3f97cca7cc8a063c842" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.080097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.080129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.080160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.080298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.080383 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gfn\" (UniqueName: \"kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 
23:36:30.080577 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.080662 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle podName:8705c389-fda4-4ef7-8a28-93efcdff880b nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.080616378 +0000 UTC m=+3664.400876666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle") pod "barbican-keystone-listener-78ff77fc4b-82l2x" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b") : secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.080953 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.080977 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle podName:afaaf0e0-1ddc-4335-843b-c2ea49492b80 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.080970717 +0000 UTC m=+3664.401230995 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle") pod "barbican-worker-69f86ff78c-d95nc" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80") : secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.081267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.087556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.091673 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_2e480bc9-3e92-4cb6-a685-4b2118dea5ff/ovsdbserver-nb/0.log" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.091840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerDied","Data":"750810e06e2ba2a1b4a96b2cb486db619a924a3c4e105f8d13577a72affcd1c6"} Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.091724 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerID="750810e06e2ba2a1b4a96b2cb486db619a924a3c4e105f8d13577a72affcd1c6" exitCode=2 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.093238 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerID="a569f0520ff40c5656ae25d907d285351006a662406feefdd835a9730883f656" exitCode=143 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.093261 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.093289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerDied","Data":"a569f0520ff40c5656ae25d907d285351006a662406feefdd835a9730883f656"} Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.093499 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" containerID="cri-o://36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.094927 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="probe" containerID="cri-o://d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.146904 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gfn\" (UniqueName: \"kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn\") pod \"dnsmasq-dnsmasq-84b9f45d47-kfd8q\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.154153 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.155638 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-log" containerID="cri-o://6c9c4f202a2cb68a56587c933cd90e0e468cf5c5ce99564d98ebbae3c4fae420" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.158218 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-api" containerID="cri-o://e96f90310cc523986ea2cc24d1408bd36e515d624b67edb05f78fd96a59bfb9f" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.277784 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.277991 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api-log" containerID="cri-o://dae190eac1e356b07146411eb43704c3a1d11f244f8ed93f24dbead6bae58b53" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.278388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api" containerID="cri-o://8b8510c34e0a7c2858626139d5521d1ef7d3a51f65ff94d2088c0275f2c2b66f" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.329990 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.346364 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qvzhj"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.369200 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_2e480bc9-3e92-4cb6-a685-4b2118dea5ff/ovsdbserver-nb/0.log" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.369268 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.373337 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_ce2df28d-2515-4995-b928-c2f6d198045d/ovsdbserver-sb/0.log" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.373401 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.431178 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-qvzhj"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.484402 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.484760 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-log" containerID="cri-o://d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.485222 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-httpd" containerID="cri-o://a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495247 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dtzs\" (UniqueName: \"kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: 
\"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495644 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 
23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs\") pod \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\" (UID: \"2e480bc9-3e92-4cb6-a685-4b2118dea5ff\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8td79\" (UniqueName: \"kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79\") pod \"ce2df28d-2515-4995-b928-c2f6d198045d\" (UID: \"ce2df28d-2515-4995-b928-c2f6d198045d\") " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.495992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.496018 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.496175 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.496296 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.496336 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.49632167 +0000 UTC m=+3664.816581958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-public-svc" not found Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.497279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts" (OuterVolumeSpecName: "scripts") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.497723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config" (OuterVolumeSpecName: "config") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.498394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.503355 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.504632 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.504595841 +0000 UTC m=+3664.824856129 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "combined-ca-bundle" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.504680 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.504700 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.504694103 +0000 UTC m=+3664.824954391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-internal-svc" not found Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.504269 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.505162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts" (OuterVolumeSpecName: "scripts") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.507469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79" (OuterVolumeSpecName: "kube-api-access-8td79") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "kube-api-access-8td79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.508241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config" (OuterVolumeSpecName: "config") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.510886 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.511745 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-api" containerID="cri-o://a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.511878 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" containerID="cri-o://be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.527653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.566854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs" (OuterVolumeSpecName: "kube-api-access-4dtzs") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "kube-api-access-4dtzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.567051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.599934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8td79\" (UniqueName: \"kubernetes.io/projected/ce2df28d-2515-4995-b928-c2f6d198045d-kube-api-access-8td79\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.599971 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.599987 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dtzs\" (UniqueName: \"kubernetes.io/projected/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-kube-api-access-4dtzs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.599999 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600010 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600028 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600055 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600083 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600098 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.600111 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce2df28d-2515-4995-b928-c2f6d198045d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.601644 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.601712 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data podName:707515d1-17d7-47ff-be95-73414a6a2a95 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.601695411 +0000 UTC m=+3664.921955699 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data") pod "rabbitmq-cell1-server-0" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.623486 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jx4gj"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.653374 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jx4gj"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.662309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.701879 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.704605 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:36:30 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:36:30 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:36:30 crc kubenswrapper[5030]: else Jan 20 23:36:30 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:36:30 crc kubenswrapper[5030]: fi Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:36:30 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:36:30 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:36:30 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:36:30 crc kubenswrapper[5030]: # support updates Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.705088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.708748 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" podUID="06a5c598-c95d-453a-aeec-bda1f9ffdb82" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.720706 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-gl9ft"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.735847 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-gl9ft"] Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.739252 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:36:30 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:36:30 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:36:30 crc kubenswrapper[5030]: else Jan 20 23:36:30 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:36:30 crc kubenswrapper[5030]: fi Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:36:30 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:36:30 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:36:30 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:36:30 crc kubenswrapper[5030]: # support updates Jan 20 23:36:30 crc kubenswrapper[5030]: Jan 20 23:36:30 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:36:30 crc kubenswrapper[5030]: E0120 23:36:30.741777 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" podUID="6b0c66c6-242d-4d35-8be9-28739bd537f5" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.757751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jqzcn"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.762763 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.768256 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.803235 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.803261 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.803272 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.826399 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-jqzcn"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.842799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.848418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.854036 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2e480bc9-3e92-4cb6-a685-4b2118dea5ff" (UID: "2e480bc9-3e92-4cb6-a685-4b2118dea5ff"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.859945 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.860293 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-central-agent" containerID="cri-o://f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.861513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="proxy-httpd" containerID="cri-o://ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.861754 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-notification-agent" containerID="cri-o://6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.862052 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="sg-core" containerID="cri-o://4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.867465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ce2df28d-2515-4995-b928-c2f6d198045d" (UID: "ce2df28d-2515-4995-b928-c2f6d198045d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.870562 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.885206 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.896958 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.897306 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" containerID="cri-o://0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.897454 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" containerID="cri-o://b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.907476 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gp87c"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.912701 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.912740 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.912750 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e480bc9-3e92-4cb6-a685-4b2118dea5ff-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.912759 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce2df28d-2515-4995-b928-c2f6d198045d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.912952 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-gp87c"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.920243 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qxfmd"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.925895 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qxfmd"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.931207 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.931460 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-log" 
containerID="cri-o://0ab99d99f78a27c31a39fe65464563e8a92791c72ac677499b6836093b92f3ef" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.932098 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-api" containerID="cri-o://a12e77d1b8be494229f4bdef24c5a9771212355912d130fbc08c36e1abf41aa5" gracePeriod=30 Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.939223 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.944876 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-d56mx"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.950776 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-d56mx"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.959502 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.967255 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-e6f9-account-create-update-cd46q"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.971396 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.977662 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.978013 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9kt22"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.983867 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-9kt22"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.990416 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:36:30 crc kubenswrapper[5030]: I0120 23:36:30.997403 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cxftt"] Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.011894 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-cxftt"] Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.014408 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config\") pod \"eddfc91d-df9b-4683-b3e6-5bad910ef921\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.014566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle\") pod \"eddfc91d-df9b-4683-b3e6-5bad910ef921\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.014611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltr8f\" (UniqueName: 
\"kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f\") pod \"eddfc91d-df9b-4683-b3e6-5bad910ef921\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.014685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret\") pod \"eddfc91d-df9b-4683-b3e6-5bad910ef921\" (UID: \"eddfc91d-df9b-4683-b3e6-5bad910ef921\") " Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.023824 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.033965 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.034166 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="d3774e65-1783-48d9-af7d-3a9964b609ad" containerName="kube-state-metrics" containerID="cri-o://a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4" gracePeriod=30 Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.040881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f" (OuterVolumeSpecName: "kube-api-access-ltr8f") pod "eddfc91d-df9b-4683-b3e6-5bad910ef921" (UID: "eddfc91d-df9b-4683-b3e6-5bad910ef921"). InnerVolumeSpecName "kube-api-access-ltr8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:31 crc kubenswrapper[5030]: I0120 23:36:31.068792 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="rabbitmq" containerID="cri-o://6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c" gracePeriod=604800 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.100053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "eddfc91d-df9b-4683-b3e6-5bad910ef921" (UID: "eddfc91d-df9b-4683-b3e6-5bad910ef921"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.128108 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltr8f\" (UniqueName: \"kubernetes.io/projected/eddfc91d-df9b-4683-b3e6-5bad910ef921-kube-api-access-ltr8f\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.128130 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.128139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eddfc91d-df9b-4683-b3e6-5bad910ef921" (UID: "eddfc91d-df9b-4683-b3e6-5bad910ef921"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.137126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "eddfc91d-df9b-4683-b3e6-5bad910ef921" (UID: "eddfc91d-df9b-4683-b3e6-5bad910ef921"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158892 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="d8020553ea006945bae8427811e7617cdae1effe1553b99640376e2be8359797" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158919 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="10f2ad934ade0755d3b8fbad653de62253c2374bf376c3091a96065c690f7603" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158926 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="5255250b357a4ac7681649a1e5c0a926b39fbee907e1ba4000a642da16130fdd" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158933 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="69b7c21f321b09445606557c0e6984dc6d0949ad2a3b0ee72bd8c7a27d07ecc7" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158940 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="02c333bfad293ec86e408a8920ec51d1c9268264d8402be95b6a1751dcb8d930" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158946 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="7d9c91edb4a35a4bf32992a5fc61b2174f916fe1a309ebe2c3715d15e2bd931f" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158952 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="e1b5f2a13f3163554624db7b1aafe91f5aee9ab1131474fe0369c5f0f4a9dee1" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158958 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="142f3acf1108276cc11ce9ff0213af6ed4a6eb46e60c66d8d0e27543458e4c19" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158965 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="aef8219014260e57e40ee40900cc995c3ff708186b3f043e7917e6d02c4cb21e" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158970 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="611008d575119572dd79f498dbab28ea12bc998a1001a31f1ecbe0cfe79cdd53" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158976 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="a1d0030d1db7eafecbee2fd3e7074cf5fa995ab67d4254165b2fedc019d55a78" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158982 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" 
containerID="0478c6b5a2aa7d3114b8aba566b34095474deb6aee07d02a4122007b8a828961" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158988 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="6fa5a926460488785a7ffba60bb9b7fadfcf332e45c0b16ee382744faf654dce" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.158994 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="68a3e13cf2691b33d5c70e2cd4ee154f6c6ade16bd21a3ed1c5508abb12bf1aa" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"d8020553ea006945bae8427811e7617cdae1effe1553b99640376e2be8359797"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"10f2ad934ade0755d3b8fbad653de62253c2374bf376c3091a96065c690f7603"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"5255250b357a4ac7681649a1e5c0a926b39fbee907e1ba4000a642da16130fdd"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"69b7c21f321b09445606557c0e6984dc6d0949ad2a3b0ee72bd8c7a27d07ecc7"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"02c333bfad293ec86e408a8920ec51d1c9268264d8402be95b6a1751dcb8d930"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"7d9c91edb4a35a4bf32992a5fc61b2174f916fe1a309ebe2c3715d15e2bd931f"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"e1b5f2a13f3163554624db7b1aafe91f5aee9ab1131474fe0369c5f0f4a9dee1"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"142f3acf1108276cc11ce9ff0213af6ed4a6eb46e60c66d8d0e27543458e4c19"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"aef8219014260e57e40ee40900cc995c3ff708186b3f043e7917e6d02c4cb21e"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"611008d575119572dd79f498dbab28ea12bc998a1001a31f1ecbe0cfe79cdd53"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"a1d0030d1db7eafecbee2fd3e7074cf5fa995ab67d4254165b2fedc019d55a78"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"0478c6b5a2aa7d3114b8aba566b34095474deb6aee07d02a4122007b8a828961"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"6fa5a926460488785a7ffba60bb9b7fadfcf332e45c0b16ee382744faf654dce"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.159186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"68a3e13cf2691b33d5c70e2cd4ee154f6c6ade16bd21a3ed1c5508abb12bf1aa"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.183801 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_2e480bc9-3e92-4cb6-a685-4b2118dea5ff/ovsdbserver-nb/0.log" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.184786 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.185761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2e480bc9-3e92-4cb6-a685-4b2118dea5ff","Type":"ContainerDied","Data":"22afbcef39ae947fe57abfd65f655ba84c23f3719274f3578deb9b1aff9aee9f"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.185810 5030 scope.go:117] "RemoveContainer" containerID="750810e06e2ba2a1b4a96b2cb486db619a924a3c4e105f8d13577a72affcd1c6" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.187787 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" event={"ID":"06a5c598-c95d-453a-aeec-bda1f9ffdb82","Type":"ContainerStarted","Data":"7abe160fccc5d4a51af595b200787a9f36de80e2de006e2c820ce5d58040e06c"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.208829 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.209056 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.211497 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:36:32 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:36:32 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:36:32 crc kubenswrapper[5030]: else Jan 20 23:36:32 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:36:32 crc kubenswrapper[5030]: fi Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:36:32 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:36:32 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:36:32 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:36:32 crc kubenswrapper[5030]: # support updates Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.214032 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" podUID="06a5c598-c95d-453a-aeec-bda1f9ffdb82" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.234723 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.234750 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/eddfc91d-df9b-4683-b3e6-5bad910ef921-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.237543 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.237844 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener-log" containerID="cri-o://fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.237960 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener" containerID="cri-o://c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.249047 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x"] Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.249783 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" podUID="8705c389-fda4-4ef7-8a28-93efcdff880b" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.256989 5030 generic.go:334] "Generic (PLEG): container finished" podID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerID="d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.257061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerDied","Data":"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.257109 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="galera" containerID="cri-o://e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.261401 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc"] Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.262177 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" podUID="afaaf0e0-1ddc-4335-843b-c2ea49492b80" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.270114 5030 scope.go:117] "RemoveContainer" containerID="a569f0520ff40c5656ae25d907d285351006a662406feefdd835a9730883f656" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.272828 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.273125 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker-log" containerID="cri-o://1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.273421 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker" containerID="cri-o://dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.279850 5030 generic.go:334] "Generic (PLEG): container finished" podID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerID="0ab99d99f78a27c31a39fe65464563e8a92791c72ac677499b6836093b92f3ef" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.279943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerDied","Data":"0ab99d99f78a27c31a39fe65464563e8a92791c72ac677499b6836093b92f3ef"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.282324 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67"] Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.283037 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle internal-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" podUID="94b0ade6-0e21-4df1-8dca-db81b58aff7c" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.296581 5030 generic.go:334] "Generic (PLEG): container finished" podID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerID="6c9c4f202a2cb68a56587c933cd90e0e468cf5c5ce99564d98ebbae3c4fae420" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.296666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerDied","Data":"6c9c4f202a2cb68a56587c933cd90e0e468cf5c5ce99564d98ebbae3c4fae420"} Jan 20 23:36:32 crc kubenswrapper[5030]: 
I0120 23:36:31.323592 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.324205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" containerID="cri-o://b09cb47fdff353d2277388987884452b6f97530142dd767855a2b065ff8722ae" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.324332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" containerID="cri-o://a53f849ac1c1858e7caed509d7798f356ef005496fcb80d6c8c828ae2ffbfaba" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.368885 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.388932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.393432 5030 generic.go:334] "Generic (PLEG): container finished" podID="43061a58-a31e-4ced-836f-b6a6badff81f" containerID="ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.393462 5030 generic.go:334] "Generic (PLEG): container finished" podID="43061a58-a31e-4ced-836f-b6a6badff81f" containerID="4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" exitCode=2 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.393541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerDied","Data":"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.393568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerDied","Data":"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.400125 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.413973 5030 generic.go:334] "Generic (PLEG): container finished" podID="c7faddbb-53f4-4f27-a853-0751049feafd" containerID="dae190eac1e356b07146411eb43704c3a1d11f244f8ed93f24dbead6bae58b53" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.414051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerDied","Data":"dae190eac1e356b07146411eb43704c3a1d11f244f8ed93f24dbead6bae58b53"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.415917 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.416162 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-httpd" 
containerID="cri-o://901c65079c86eab89799533dc96ef980a3db2d852f2b58ea3c34b15a6460f991" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.416517 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-server" containerID="cri-o://3b503db37878b35095c76a10013149fa599afad387b5f267dc2cef4bd347d150" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.429666 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_ce2df28d-2515-4995-b928-c2f6d198045d/ovsdbserver-sb/0.log" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.429760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"ce2df28d-2515-4995-b928-c2f6d198045d","Type":"ContainerDied","Data":"c9c613bf81cb8eafa07712719e2192745559c6b4dc4e83bb5234258f4ae6d3d0"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.429804 5030 scope.go:117] "RemoveContainer" containerID="c7d286487506cf34d40605c251876f3d0a39a62208083381222f15802dd6c018" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.429944 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.443587 5030 generic.go:334] "Generic (PLEG): container finished" podID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerID="a3cc6fa9860339a777b502e8d7cf94ab748c9d05c0ad7323a3d5d25264be03d7" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.443708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerDied","Data":"a3cc6fa9860339a777b502e8d7cf94ab748c9d05c0ad7323a3d5d25264be03d7"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.452817 5030 generic.go:334] "Generic (PLEG): container finished" podID="ec706fe5-85b7-4580-9013-e39497640ab0" containerID="be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.452874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerDied","Data":"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.463051 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.465694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" event={"ID":"6b0c66c6-242d-4d35-8be9-28739bd537f5","Type":"ContainerStarted","Data":"8d5dffb57112a27d355e0c955ea79c9754293888f45174f04a04224ab1e0eefd"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.467963 5030 generic.go:334] "Generic (PLEG): container finished" podID="eddfc91d-df9b-4683-b3e6-5bad910ef921" containerID="c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc" exitCode=137 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.468048 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.474697 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-s4zjk"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.482864 5030 generic.go:334] "Generic (PLEG): container finished" podID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerID="0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.482945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerDied","Data":"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.504597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" event={"ID":"bc222012-854c-45e8-8ef9-e87ac905ca13","Type":"ContainerStarted","Data":"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.504659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" event={"ID":"bc222012-854c-45e8-8ef9-e87ac905ca13","Type":"ContainerStarted","Data":"c6f837a7ba9755b9bba25bf10fe5796e9680a8bd9232c1a25f721339c9fb4e16"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.508394 5030 scope.go:117] "RemoveContainer" containerID="104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.508458 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:36:32 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:36:32 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:36:32 crc kubenswrapper[5030]: else Jan 20 23:36:32 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:36:32 crc kubenswrapper[5030]: fi Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:36:32 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:36:32 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:36:32 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:36:32 crc kubenswrapper[5030]: # support updates Jan 20 23:36:32 crc kubenswrapper[5030]: Jan 20 23:36:32 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.510016 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" podUID="6b0c66c6-242d-4d35-8be9-28739bd537f5" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.510067 5030 generic.go:334] "Generic (PLEG): container finished" podID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerID="d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.510136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerDied","Data":"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.525036 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv"] Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.525642 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="ovsdbserver-nb" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.525657 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="ovsdbserver-nb" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.525675 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.525682 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.525718 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.525726 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.525754 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="ovsdbserver-sb" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.525760 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="ovsdbserver-sb" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.526015 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.526459 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" containerName="ovsdbserver-sb" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.526489 5030 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="ovsdbserver-nb" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.526502 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" containerName="openstack-network-exporter" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.527569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.529697 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.538336 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.539768 5030 scope.go:117] "RemoveContainer" containerID="bb888a669f56ad697c7a2cb6314f21c6d4d03d654bc273d2bff1d915da2291d6" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.545448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw54b\" (UniqueName: \"kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.545535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.549759 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.570932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.651729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw54b\" (UniqueName: \"kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.651797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.651946 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.651993 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.151979125 +0000 UTC m=+3664.472239413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : configmap "openstack-scripts" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.657990 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pw54b for pod openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.658064 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:32.158043401 +0000 UTC m=+3664.478303689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pw54b" (UniqueName: "kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.660222 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pd9lv"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.668309 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pd9lv"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.687579 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-sf2nd"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.701264 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-sf2nd"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.718404 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.718711 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" podUID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" containerName="keystone-api" containerID="cri-o://8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.750157 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.767992 5030 scope.go:117] "RemoveContainer" containerID="c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.778396 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8cm8v"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.786904 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-8cm8v"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.793862 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.799242 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.922229 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="galera" containerID="cri-o://c5665ab2e4d22cab17e4485bf654d41538e5774df91bf1e276c60d2155d2496f" gracePeriod=30 Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.962448 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:31.962542 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data podName:e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:35.962503581 +0000 UTC m=+3668.282763939 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data") pod "rabbitmq-server-0" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c") : configmap "rabbitmq-config-data" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.976964 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0db48c6e-e94b-4a4c-bc75-70ee65bded48" path="/var/lib/kubelet/pods/0db48c6e-e94b-4a4c-bc75-70ee65bded48/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.977674 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106f4c1b-d73d-4dac-988b-0b08d0c66b69" path="/var/lib/kubelet/pods/106f4c1b-d73d-4dac-988b-0b08d0c66b69/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.978204 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17be7dc5-b32e-47c3-a083-823f4300b700" path="/var/lib/kubelet/pods/17be7dc5-b32e-47c3-a083-823f4300b700/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.978742 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1" path="/var/lib/kubelet/pods/23a72a1f-d1f6-4a23-ba6c-9baf90e14fc1/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.979754 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc7c61a-28d8-491f-822f-dd544043c6f7" path="/var/lib/kubelet/pods/2dc7c61a-28d8-491f-822f-dd544043c6f7/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.980344 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e480bc9-3e92-4cb6-a685-4b2118dea5ff" path="/var/lib/kubelet/pods/2e480bc9-3e92-4cb6-a685-4b2118dea5ff/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.981490 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a89983-ae51-4585-9d9a-4309df4ac15b" path="/var/lib/kubelet/pods/34a89983-ae51-4585-9d9a-4309df4ac15b/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.982010 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4572984e-7cd5-46d9-8d0b-627ab9438173" path="/var/lib/kubelet/pods/4572984e-7cd5-46d9-8d0b-627ab9438173/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.982482 
5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46aa31eb-445e-4c7b-ab98-6326b68c20ac" path="/var/lib/kubelet/pods/46aa31eb-445e-4c7b-ab98-6326b68c20ac/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.983384 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e" path="/var/lib/kubelet/pods/4d6d04f3-a997-4d4a-85cc-b37c1c89ab5e/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.983938 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6003f418-8606-4a88-b48b-2fb2bd06f465" path="/var/lib/kubelet/pods/6003f418-8606-4a88-b48b-2fb2bd06f465/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.984428 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0b529e-0a74-46b0-a364-0c6edc0e59f4" path="/var/lib/kubelet/pods/aa0b529e-0a74-46b0-a364-0c6edc0e59f4/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.984994 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab5304f-ca88-442f-b4ca-823d8d0be728" path="/var/lib/kubelet/pods/bab5304f-ca88-442f-b4ca-823d8d0be728/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.985950 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbe0edb2-3db0-4578-ba0f-22e9f7209b5a" path="/var/lib/kubelet/pods/cbe0edb2-3db0-4578-ba0f-22e9f7209b5a/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.986886 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2df28d-2515-4995-b928-c2f6d198045d" path="/var/lib/kubelet/pods/ce2df28d-2515-4995-b928-c2f6d198045d/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.987958 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7627c98-1b9d-43d6-9479-0b36228b1288" path="/var/lib/kubelet/pods/e7627c98-1b9d-43d6-9479-0b36228b1288/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:31.988502 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eddfc91d-df9b-4683-b3e6-5bad910ef921" path="/var/lib/kubelet/pods/eddfc91d-df9b-4683-b3e6-5bad910ef921/volumes" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.077564 5030 scope.go:117] "RemoveContainer" containerID="c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.078783 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-pw54b operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" podUID="6330116d-6f24-4964-a35b-419b23f92f44" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.078885 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc\": container with ID starting with c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc not found: ID does not exist" containerID="c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.078948 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc"} err="failed to get container status \"c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc\": rpc error: code 
= NotFound desc = could not find container \"c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc\": container with ID starting with c1113f67d75b9999d2aa07e3dea0a48f9ab97342489a0eba77438de7da5986dc not found: ID does not exist" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.167953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.167988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") pod \"barbican-keystone-listener-78ff77fc4b-82l2x\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.168017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") pod \"barbican-worker-69f86ff78c-d95nc\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.168196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw54b\" (UniqueName: \"kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168461 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168494 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:33.168482216 +0000 UTC m=+3665.488742504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : configmap "openstack-scripts" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168776 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168801 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle podName:8705c389-fda4-4ef7-8a28-93efcdff880b nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.168794485 +0000 UTC m=+3668.489054773 (durationBeforeRetry 4s). 
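The "Unhandled Error" entry earlier in this stream logs the full command of the mariadb-account-create-update container, but the message is cut off at "$MYSQL_CMD <", so the SQL it feeds to mysql is not visible here. The script's own comments spell out the intended pattern: CREATE the account first (MySQL 8 no longer creates accounts implicitly on GRANT, and CREATE OR REPLACE USER is MariaDB-only), then ALTER USER for password and TLS so a rerun can update an existing account, then the GRANT. The block below is a hypothetical reconstruction of that pattern only, not the truncated heredoc; the hostname, account name, and TLS clause are illustrative assumptions.

#!/bin/bash
# Hypothetical reconstruction of the create-or-update pattern described by the
# script comments logged above. NOT the actual truncated heredoc: the hostname,
# account name, and TLS clause below are illustrative placeholders.
set -e
: "${DatabasePassword:?Please specify a DatabasePassword variable.}"
MYSQL_CMD="mysql -h mariadb.openstack-kuttl-tests.svc -u root -P 3306"   # placeholder host
GRANT_DATABASE="barbican"
ACCOUNT="barbican"                                                       # placeholder account name

$MYSQL_CMD <<EOSQL
-- CREATE first: MySQL 8 no longer creates accounts implicitly on GRANT,
-- and MariaDB's CREATE OR REPLACE USER is not portable to MySQL.
CREATE USER IF NOT EXISTS '${ACCOUNT}'@'%';
-- ALTER afterwards so a rerun of the job can update password and TLS settings.
ALTER USER '${ACCOUNT}'@'%' IDENTIFIED BY '${DatabasePassword}' REQUIRE NONE;
GRANT ALL PRIVILEGES ON ${GRANT_DATABASE}.* TO '${ACCOUNT}'@'%';
FLUSH PRIVILEGES;
EOSQL

The job in this log never reached that point: it failed with CreateContainerConfigError because the barbican-db-secret Secret, presumably the source of DatabasePassword, does not exist in the namespace.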
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle") pod "barbican-keystone-listener-78ff77fc4b-82l2x" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b") : secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168877 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.168895 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle podName:afaaf0e0-1ddc-4335-843b-c2ea49492b80 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.168889837 +0000 UTC m=+3668.489150125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle") pod "barbican-worker-69f86ff78c-d95nc" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80") : secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.171712 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pw54b for pod openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.171751 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:33.171742716 +0000 UTC m=+3665.492003004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pw54b" (UniqueName: "kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.185614 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.270929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle\") pod \"d3774e65-1783-48d9-af7d-3a9964b609ad\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.271000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7257g\" (UniqueName: \"kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g\") pod \"d3774e65-1783-48d9-af7d-3a9964b609ad\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.271108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs\") pod \"d3774e65-1783-48d9-af7d-3a9964b609ad\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.271163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config\") pod \"d3774e65-1783-48d9-af7d-3a9964b609ad\" (UID: \"d3774e65-1783-48d9-af7d-3a9964b609ad\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.277692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g" (OuterVolumeSpecName: "kube-api-access-7257g") pod "d3774e65-1783-48d9-af7d-3a9964b609ad" (UID: "d3774e65-1783-48d9-af7d-3a9964b609ad"). InnerVolumeSpecName "kube-api-access-7257g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.302277 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d3774e65-1783-48d9-af7d-3a9964b609ad" (UID: "d3774e65-1783-48d9-af7d-3a9964b609ad"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.325717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3774e65-1783-48d9-af7d-3a9964b609ad" (UID: "d3774e65-1783-48d9-af7d-3a9964b609ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.343176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d3774e65-1783-48d9-af7d-3a9964b609ad" (UID: "d3774e65-1783-48d9-af7d-3a9964b609ad"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.383714 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.383743 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.383757 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3774e65-1783-48d9-af7d-3a9964b609ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.383768 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7257g\" (UniqueName: \"kubernetes.io/projected/d3774e65-1783-48d9-af7d-3a9964b609ad-kube-api-access-7257g\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.463676 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.473079 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.477391 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.527138 5030 generic.go:334] "Generic (PLEG): container finished" podID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" containerID="36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.527433 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.527420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2","Type":"ContainerDied","Data":"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.527822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2","Type":"ContainerDied","Data":"839057bc1f6f527b70e4324c10c062bc9311f8876fe47a579518d60a21fe4e50"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.527841 5030 scope.go:117] "RemoveContainer" containerID="36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.533360 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3774e65-1783-48d9-af7d-3a9964b609ad" containerID="a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4" exitCode=2 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.533485 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.533489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d3774e65-1783-48d9-af7d-3a9964b609ad","Type":"ContainerDied","Data":"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.533662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"d3774e65-1783-48d9-af7d-3a9964b609ad","Type":"ContainerDied","Data":"a948a61b5bd35929d797fe8333c7b5ea55cd36a76861fe7af8012a56b49c1e28"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.540450 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerID="fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.540532 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerDied","Data":"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.545925 5030 generic.go:334] "Generic (PLEG): container finished" podID="43061a58-a31e-4ced-836f-b6a6badff81f" containerID="6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.545986 5030 generic.go:334] "Generic (PLEG): container finished" podID="43061a58-a31e-4ced-836f-b6a6badff81f" containerID="f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.546042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerDied","Data":"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.546071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerDied","Data":"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.546081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"43061a58-a31e-4ced-836f-b6a6badff81f","Type":"ContainerDied","Data":"d1b1a2ac219bf7f63a00978610d3fcc21b78cb45aa5dba3eefec9116cc452426"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.546151 5030 util.go:48] "No ready sandbox for pod can be found. 
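The ContainerDied events in this burst report a mix of exit codes: 0 for containers that shut down cleanly, 143 for containers that exited on the SIGTERM sent by the grace-period kills logged above (shells report signal deaths as 128 plus the signal number, and SIGTERM is 15), and small nonzero codes such as 1 or 2 where the process itself failed. A minimal illustration of the 128 + N convention, not taken from the log:

#!/bin/bash
# Show why a SIGTERM'd process is reported with exit code 143 (128 + 15),
# the code seen for the containers killed with a grace period above.
sleep 60 &
pid=$!
kill -TERM "$pid"
wait "$pid"
echo "exit status: $?"    # prints 143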
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.558223 5030 generic.go:334] "Generic (PLEG): container finished" podID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerID="3b503db37878b35095c76a10013149fa599afad387b5f267dc2cef4bd347d150" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.558250 5030 generic.go:334] "Generic (PLEG): container finished" podID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerID="901c65079c86eab89799533dc96ef980a3db2d852f2b58ea3c34b15a6460f991" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.558302 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerDied","Data":"3b503db37878b35095c76a10013149fa599afad387b5f267dc2cef4bd347d150"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.558322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerDied","Data":"901c65079c86eab89799533dc96ef980a3db2d852f2b58ea3c34b15a6460f991"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.562421 5030 scope.go:117] "RemoveContainer" containerID="36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.563050 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e\": container with ID starting with 36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e not found: ID does not exist" containerID="36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.563101 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e"} err="failed to get container status \"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e\": rpc error: code = NotFound desc = could not find container \"36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e\": container with ID starting with 36efa10ae1b4f96519d04efabec3fcfed7aa6b7e9d940f3c5f6f44027d1f248e not found: ID does not exist" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.563125 5030 scope.go:117] "RemoveContainer" containerID="a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.573271 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.580118 5030 generic.go:334] "Generic (PLEG): container finished" podID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerID="c5665ab2e4d22cab17e4485bf654d41538e5774df91bf1e276c60d2155d2496f" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.580178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerDied","Data":"c5665ab2e4d22cab17e4485bf654d41538e5774df91bf1e276c60d2155d2496f"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.580886 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 
23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.582776 5030 generic.go:334] "Generic (PLEG): container finished" podID="99f18e53-0386-4e06-89f6-a7d30f880808" containerID="b09cb47fdff353d2277388987884452b6f97530142dd767855a2b065ff8722ae" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.582837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerDied","Data":"b09cb47fdff353d2277388987884452b6f97530142dd767855a2b065ff8722ae"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.586269 5030 generic.go:334] "Generic (PLEG): container finished" podID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerID="e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.586320 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.586352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerDied","Data":"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.586383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5b3850fc-0bf6-489d-8a8f-4a72ad423cce","Type":"ContainerDied","Data":"643c85b41e6141794593cf3c9ee3bc6ffb89d24885af29de96656e9068afded6"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.586511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs\") pod \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587144 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpwld\" (UniqueName: \"kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld\") pod \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587549 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data\") pod \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.587907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdnrb\" (UniqueName: \"kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.588019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs\") pod \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.589012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.589221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.589875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.590009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.590496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs6b7\" (UniqueName: \"kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.589784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.590423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml\") pod \"43061a58-a31e-4ced-836f-b6a6badff81f\" (UID: \"43061a58-a31e-4ced-836f-b6a6badff81f\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle\") pod \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\" (UID: \"ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated\") pod \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\" (UID: \"5b3850fc-0bf6-489d-8a8f-4a72ad423cce\") " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.591943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.592117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.592241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") pod \"barbican-api-688dd48fcd-2fh67\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.592668 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.592692 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.592706 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43061a58-a31e-4ced-836f-b6a6badff81f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.592796 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.592846 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.592828669 +0000 UTC m=+3668.913088967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "combined-ca-bundle" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.593636 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.594410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.594490 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.594524 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.59451294 +0000 UTC m=+3668.914773248 (durationBeforeRetry 4s). 
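The MountVolume.SetUp failures in this stretch all point at objects that do not exist in the namespace: the openstack-scripts ConfigMap, the combined-ca-bundle and cert-barbican-internal-svc Secrets, and the galera-openstack ServiceAccount behind the projected token. The kubelet simply retries each mount with a growing delay (500ms, then 1s, then 4s in the entries above), so the affected pods stay in ContainerCreating until the objects are recreated. One way to see which referenced objects are still missing is to poll for them directly; the loop below is only a sketch (object names come from the log, the doubling backoff mirrors the durationBeforeRetry values), and oc can stand in for kubectl on OpenShift.

#!/bin/bash
# Poll for the objects the failed mounts reference, doubling the delay between
# attempts much like the kubelet's durationBeforeRetry (0.5s, 1s, 2s, 4s, ...).
NS=openstack-kuttl-tests
delay=0.5
for attempt in 1 2 3 4 5 6; do
  missing=0
  kubectl -n "$NS" get configmap openstack-scripts         >/dev/null 2>&1 || { echo "missing configmap/openstack-scripts"; missing=1; }
  kubectl -n "$NS" get secret combined-ca-bundle           >/dev/null 2>&1 || { echo "missing secret/combined-ca-bundle"; missing=1; }
  kubectl -n "$NS" get secret cert-barbican-internal-svc   >/dev/null 2>&1 || { echo "missing secret/cert-barbican-internal-svc"; missing=1; }
  kubectl -n "$NS" get serviceaccount galera-openstack     >/dev/null 2>&1 || { echo "missing serviceaccount/galera-openstack"; missing=1; }
  [ "$missing" -eq 0 ] && { echo "all referenced objects present"; exit 0; }
  echo "attempt $attempt: retrying in ${delay}s"
  sleep "$delay"
  delay=$(awk -v d="$delay" 'BEGIN { print d * 2 }')
done
exit 1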
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-internal-svc" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.595097 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.596358 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs podName:94b0ade6-0e21-4df1-8dca-db81b58aff7c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.596337083 +0000 UTC m=+3668.916597371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs") pod "barbican-api-688dd48fcd-2fh67" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c") : secret "cert-barbican-public-svc" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.597872 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerID="82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d" exitCode=0 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.597968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" event={"ID":"ab64dff2-f94e-4a15-84d1-4661aa1b8c87","Type":"ContainerDied","Data":"82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.598020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" event={"ID":"ab64dff2-f94e-4a15-84d1-4661aa1b8c87","Type":"ContainerStarted","Data":"170e9e834f46a7b338e37db311cb370bdbd13dbef7bc5e130928776a1b68cdbf"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.601915 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerID="104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864" exitCode=1 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.601945 5030 generic.go:334] "Generic (PLEG): container finished" podID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerID="7aa4244c68daf36642df73838f4fa0a37eea094dfb4bdbe19e5709bc658be850" exitCode=1 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.601992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" event={"ID":"bc222012-854c-45e8-8ef9-e87ac905ca13","Type":"ContainerDied","Data":"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.602023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" event={"ID":"bc222012-854c-45e8-8ef9-e87ac905ca13","Type":"ContainerDied","Data":"7aa4244c68daf36642df73838f4fa0a37eea094dfb4bdbe19e5709bc658be850"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.606753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: 
"5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.627360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts" (OuterVolumeSpecName: "scripts") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.627424 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb" (OuterVolumeSpecName: "kube-api-access-vdnrb") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "kube-api-access-vdnrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.627508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld" (OuterVolumeSpecName: "kube-api-access-rpwld") pod "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" (UID: "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2"). InnerVolumeSpecName "kube-api-access-rpwld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.627530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7" (OuterVolumeSpecName: "kube-api-access-cs6b7") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "kube-api-access-cs6b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.649061 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerID="1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457" exitCode=143 Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.649777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.651964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerDied","Data":"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457"} Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.652062 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.652195 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.652263 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.698012 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.698234 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data podName:707515d1-17d7-47ff-be95-73414a6a2a95 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:36.69821689 +0000 UTC m=+3669.018477178 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data") pod "rabbitmq-cell1-server-0" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698517 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698539 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698550 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698567 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698784 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpwld\" (UniqueName: \"kubernetes.io/projected/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-kube-api-access-rpwld\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698802 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdnrb\" (UniqueName: \"kubernetes.io/projected/43061a58-a31e-4ced-836f-b6a6badff81f-kube-api-access-vdnrb\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.698814 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs6b7\" (UniqueName: \"kubernetes.io/projected/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-kube-api-access-cs6b7\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.709379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.742975 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.211:5671: connect: connection refused" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.800162 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.808243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" (UID: "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.808363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" (UID: "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.824221 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data" (OuterVolumeSpecName: "config-data") pod "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" (UID: "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.831986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.833931 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.836059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.849938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.854992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" (UID: "ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.867692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5b3850fc-0bf6-489d-8a8f-4a72ad423cce" (UID: "5b3850fc-0bf6-489d-8a8f-4a72ad423cce"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.873846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data" (OuterVolumeSpecName: "config-data") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.898333 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43061a58-a31e-4ced-836f-b6a6badff81f" (UID: "43061a58-a31e-4ced-836f-b6a6badff81f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901553 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901579 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901591 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901601 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901609 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901631 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901640 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901649 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43061a58-a31e-4ced-836f-b6a6badff81f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901659 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901668 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.901676 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b3850fc-0bf6-489d-8a8f-4a72ad423cce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.951849 5030 scope.go:117] "RemoveContainer" containerID="a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4" Jan 20 23:36:32 crc kubenswrapper[5030]: E0120 23:36:32.952811 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4\": container with ID starting with a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4 not 
found: ID does not exist" containerID="a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.952838 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4"} err="failed to get container status \"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4\": rpc error: code = NotFound desc = could not find container \"a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4\": container with ID starting with a161cc88506234cfca0ad20e4d3a89e586f86988256fad6a81063a57a16e11d4 not found: ID does not exist" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.952858 5030 scope.go:117] "RemoveContainer" containerID="ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.966119 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.985334 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.992009 5030 scope.go:117] "RemoveContainer" containerID="4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" Jan 20 23:36:32 crc kubenswrapper[5030]: I0120 23:36:32.995904 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.003963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.019571 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.024503 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.032544 5030 scope.go:117] "RemoveContainer" containerID="6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.042139 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.049505 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.050880 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.061695 5030 scope.go:117] "RemoveContainer" containerID="f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.086330 5030 scope.go:117] "RemoveContainer" containerID="ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.087045 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54\": container with ID starting with ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54 not found: ID does not exist" containerID="ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087084 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54"} err="failed to get container status \"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54\": rpc error: code = NotFound desc = could not find container \"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54\": container with ID starting with ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087107 5030 scope.go:117] "RemoveContainer" containerID="4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.087375 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2\": container with ID starting with 4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2 not found: ID does not exist" containerID="4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087395 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2"} err="failed to get container status \"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2\": rpc error: code = NotFound desc = could not find container \"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2\": container with ID starting with 4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087410 5030 scope.go:117] "RemoveContainer" containerID="6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.087725 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32\": container with ID starting with 6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32 not found: ID does not exist" containerID="6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087744 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32"} err="failed to get container status \"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32\": rpc error: code = NotFound desc = could not find container \"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32\": container with ID starting with 6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.087756 5030 scope.go:117] "RemoveContainer" containerID="f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.088007 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc\": container with ID starting with f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc not found: ID does not exist" containerID="f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.088028 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc"} err="failed to get container status \"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc\": rpc error: code = NotFound desc = could not find container \"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc\": container with ID starting with f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.088040 5030 scope.go:117] "RemoveContainer" containerID="ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092124 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54"} err="failed to get container status \"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54\": rpc error: code = NotFound desc = could not find container \"ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54\": container with ID starting with ff89c2398eff483de05b80d260b192fcfce259b05be5fa08d9263651b2155e54 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092186 5030 scope.go:117] "RemoveContainer" containerID="4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092546 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2"} err="failed to get container status \"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2\": rpc error: code = NotFound desc = could not find container \"4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2\": container with ID starting with 4b7e8e35c0ce57ad88c10e63c005d850bd8bbf6a82e803fc1739bd6894d3d5a2 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092596 5030 scope.go:117] "RemoveContainer" containerID="6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092915 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32"} err="failed to get container status \"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32\": rpc error: code = NotFound desc = could not find container \"6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32\": container with ID starting with 6099f17944a8025d25873cfb2857c5617e6a473f89d25f2cb2783e0dccbb8a32 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.092933 5030 scope.go:117] "RemoveContainer" containerID="f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.093204 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc"} err="failed to get container status \"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc\": rpc error: code = NotFound desc = could not find container \"f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc\": container with ID starting with f24d21c02a261c47b61393a832294cca224b46551f11955ce4f9c7d64d6abcfc not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.093236 5030 scope.go:117] "RemoveContainer" containerID="e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.104506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs\") pod \"8705c389-fda4-4ef7-8a28-93efcdff880b\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.104582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd2px\" (UniqueName: \"kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px\") pod \"8705c389-fda4-4ef7-8a28-93efcdff880b\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.104663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom\") pod \"8705c389-fda4-4ef7-8a28-93efcdff880b\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.104847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.104924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs" (OuterVolumeSpecName: "logs") pod "8705c389-fda4-4ef7-8a28-93efcdff880b" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.105215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.105404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.105704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs\") pod \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.105745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data\") pod \"8705c389-fda4-4ef7-8a28-93efcdff880b\" (UID: \"8705c389-fda4-4ef7-8a28-93efcdff880b\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs" (OuterVolumeSpecName: "logs") pod "94b0ade6-0e21-4df1-8dca-db81b58aff7c" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzcfq\" (UniqueName: \"kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data\") pod \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjrvw\" (UniqueName: \"kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw\") pod \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data\") pod \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom\") pod \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt67c\" (UniqueName: \"kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c\") pod \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106638 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs\") pod 
\"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts\") pod \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.106790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.107517 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8705c389-fda4-4ef7-8a28-93efcdff880b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.107540 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.107551 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94b0ade6-0e21-4df1-8dca-db81b58aff7c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.108122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.108541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.112189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06a5c598-c95d-453a-aeec-bda1f9ffdb82" (UID: "06a5c598-c95d-453a-aeec-bda1f9ffdb82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.112526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.112655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.112955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.120108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data" (OuterVolumeSpecName: "config-data") pod "94b0ade6-0e21-4df1-8dca-db81b58aff7c" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.120212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px" (OuterVolumeSpecName: "kube-api-access-gd2px") pod "8705c389-fda4-4ef7-8a28-93efcdff880b" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b"). InnerVolumeSpecName "kube-api-access-gd2px". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.121026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data" (OuterVolumeSpecName: "config-data") pod "afaaf0e0-1ddc-4335-843b-c2ea49492b80" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.122017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq" (OuterVolumeSpecName: "kube-api-access-nzcfq") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "kube-api-access-nzcfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.124015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c" (OuterVolumeSpecName: "kube-api-access-qt67c") pod "afaaf0e0-1ddc-4335-843b-c2ea49492b80" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80"). InnerVolumeSpecName "kube-api-access-qt67c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.124532 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw" (OuterVolumeSpecName: "kube-api-access-zjrvw") pod "94b0ade6-0e21-4df1-8dca-db81b58aff7c" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c"). InnerVolumeSpecName "kube-api-access-zjrvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.124704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data" (OuterVolumeSpecName: "config-data") pod "8705c389-fda4-4ef7-8a28-93efcdff880b" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.127832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "afaaf0e0-1ddc-4335-843b-c2ea49492b80" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.128695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.129884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8705c389-fda4-4ef7-8a28-93efcdff880b" (UID: "8705c389-fda4-4ef7-8a28-93efcdff880b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.133417 5030 scope.go:117] "RemoveContainer" containerID="d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.143626 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.146882 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.155731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.165971 5030 scope.go:117] "RemoveContainer" containerID="e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.169801 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d\": container with ID starting with e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d not found: ID does not exist" containerID="e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.169834 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d"} err="failed to get container status \"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d\": rpc error: code = NotFound desc = could not find container \"e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d\": container with ID starting with e268973cec96cb63fac254ecd93e81d85190ec9cc7adf77bceda4e64becbf60d not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.169855 5030 scope.go:117] "RemoveContainer" containerID="d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.170143 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456\": container with ID starting with d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456 not found: ID does not exist" containerID="d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.170166 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456"} err="failed to get container status \"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456\": rpc error: code = NotFound desc = could not find container \"d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456\": container with ID starting with d5bb496af776155bccbacd44a182341bd36f316b38ba8e602d66d7dffd391456 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.170180 5030 scope.go:117] "RemoveContainer" containerID="104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.178202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data" (OuterVolumeSpecName: "config-data") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7zr\" (UniqueName: \"kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr\") pod \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\" (UID: \"06a5c598-c95d-453a-aeec-bda1f9ffdb82\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom\") pod \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\" (UID: \"94b0ade6-0e21-4df1-8dca-db81b58aff7c\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bn2k\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs\") pod \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\" (UID: \"afaaf0e0-1ddc-4335-843b-c2ea49492b80\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\" (UID: \"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.208861 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle\") pod \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\" (UID: \"a832d1c9-4838-4070-ad8d-e2af76d8ca47\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw54b\" (UniqueName: 
\"kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b\") pod \"keystone-6f8f-account-create-update-x86qv\" (UID: \"6330116d-6f24-4964-a35b-419b23f92f44\") " pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209284 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209295 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209306 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjrvw\" (UniqueName: \"kubernetes.io/projected/94b0ade6-0e21-4df1-8dca-db81b58aff7c-kube-api-access-zjrvw\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209316 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209325 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt67c\" (UniqueName: \"kubernetes.io/projected/afaaf0e0-1ddc-4335-843b-c2ea49492b80-kube-api-access-qt67c\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209334 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209343 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a5c598-c95d-453a-aeec-bda1f9ffdb82-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209352 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209360 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209370 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209381 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209410 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd2px\" (UniqueName: \"kubernetes.io/projected/8705c389-fda4-4ef7-8a28-93efcdff880b-kube-api-access-gd2px\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 
23:36:33.209421 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209430 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a832d1c9-4838-4070-ad8d-e2af76d8ca47-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209440 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209449 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209460 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzcfq\" (UniqueName: \"kubernetes.io/projected/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-kube-api-access-nzcfq\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.209472 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.220038 5030 projected.go:194] Error preparing data for projected volume kube-api-access-pw54b for pod openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.220194 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:35.220174904 +0000 UTC m=+3667.540435192 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pw54b" (UniqueName: "kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.220054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs" (OuterVolumeSpecName: "logs") pod "afaaf0e0-1ddc-4335-843b-c2ea49492b80" (UID: "afaaf0e0-1ddc-4335-843b-c2ea49492b80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.221777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr" (OuterVolumeSpecName: "kube-api-access-2j7zr") pod "06a5c598-c95d-453a-aeec-bda1f9ffdb82" (UID: "06a5c598-c95d-453a-aeec-bda1f9ffdb82"). InnerVolumeSpecName "kube-api-access-2j7zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.225386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k" (OuterVolumeSpecName: "kube-api-access-6bn2k") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "kube-api-access-6bn2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.225525 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9292/healthcheck\": read tcp 10.217.0.2:35296->10.217.0.145:9292: read: connection reset by peer" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.225732 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.145:9292/healthcheck\": read tcp 10.217.0.2:35294->10.217.0.145:9292: read: connection reset by peer" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.225825 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.225886 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts podName:6330116d-6f24-4964-a35b-419b23f92f44 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:35.225870272 +0000 UTC m=+3667.546130560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts") pod "keystone-6f8f-account-create-update-x86qv" (UID: "6330116d-6f24-4964-a35b-419b23f92f44") : configmap "openstack-scripts" not found Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.227354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94b0ade6-0e21-4df1-8dca-db81b58aff7c" (UID: "94b0ade6-0e21-4df1-8dca-db81b58aff7c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.230132 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.235374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.239786 5030 scope.go:117] "RemoveContainer" containerID="104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.240415 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864\": container with ID starting with 104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864 not found: ID does not exist" containerID="104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.240488 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864"} err="failed to get container status \"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864\": rpc error: code = NotFound desc = could not find container \"104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864\": container with ID starting with 104b8abfa5522071d609868f0a618c52a03300297864beebcb810bb8789ac864 not found: ID does not exist" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.253760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" (UID: "d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.255091 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.262205 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.270382 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.271951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.272432 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.286767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a832d1c9-4838-4070-ad8d-e2af76d8ca47" (UID: "a832d1c9-4838-4070-ad8d-e2af76d8ca47"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.310037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts\") pod \"bc222012-854c-45e8-8ef9-e87ac905ca13\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.310169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hln4z\" (UniqueName: \"kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z\") pod \"6b0c66c6-242d-4d35-8be9-28739bd537f5\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.310524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc222012-854c-45e8-8ef9-e87ac905ca13" (UID: "bc222012-854c-45e8-8ef9-e87ac905ca13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts\") pod \"6b0c66c6-242d-4d35-8be9-28739bd537f5\" (UID: \"6b0c66c6-242d-4d35-8be9-28739bd537f5\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvzxs\" (UniqueName: \"kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs\") pod \"bc222012-854c-45e8-8ef9-e87ac905ca13\" (UID: \"bc222012-854c-45e8-8ef9-e87ac905ca13\") " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b0c66c6-242d-4d35-8be9-28739bd537f5" (UID: "6b0c66c6-242d-4d35-8be9-28739bd537f5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311852 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afaaf0e0-1ddc-4335-843b-c2ea49492b80-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311867 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311886 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311896 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311906 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7zr\" (UniqueName: \"kubernetes.io/projected/06a5c598-c95d-453a-aeec-bda1f9ffdb82-kube-api-access-2j7zr\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311916 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a832d1c9-4838-4070-ad8d-e2af76d8ca47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311926 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311935 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bn2k\" (UniqueName: \"kubernetes.io/projected/a832d1c9-4838-4070-ad8d-e2af76d8ca47-kube-api-access-6bn2k\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311944 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.311953 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc222012-854c-45e8-8ef9-e87ac905ca13-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.326320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs" (OuterVolumeSpecName: "kube-api-access-gvzxs") pod "bc222012-854c-45e8-8ef9-e87ac905ca13" (UID: "bc222012-854c-45e8-8ef9-e87ac905ca13"). InnerVolumeSpecName "kube-api-access-gvzxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.338630 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.341283 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z" (OuterVolumeSpecName: "kube-api-access-hln4z") pod "6b0c66c6-242d-4d35-8be9-28739bd537f5" (UID: "6b0c66c6-242d-4d35-8be9-28739bd537f5"). InnerVolumeSpecName "kube-api-access-hln4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.413055 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0c66c6-242d-4d35-8be9-28739bd537f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.413093 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvzxs\" (UniqueName: \"kubernetes.io/projected/bc222012-854c-45e8-8ef9-e87ac905ca13-kube-api-access-gvzxs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.413105 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.413115 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hln4z\" (UniqueName: \"kubernetes.io/projected/6b0c66c6-242d-4d35-8be9-28739bd537f5-kube-api-access-hln4z\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.571839 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.212:5671: connect: connection refused" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.666711 5030 generic.go:334] "Generic (PLEG): container finished" podID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerID="e96f90310cc523986ea2cc24d1408bd36e515d624b67edb05f78fd96a59bfb9f" exitCode=0 Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.666776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerDied","Data":"e96f90310cc523986ea2cc24d1408bd36e515d624b67edb05f78fd96a59bfb9f"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.667732 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.144:9292/healthcheck\": read tcp 10.217.0.2:34000->10.217.0.144:9292: read: connection reset by peer" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.667779 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.144:9292/healthcheck\": read tcp 10.217.0.2:33998->10.217.0.144:9292: read: connection reset by peer" Jan 20 
23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.669817 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.669864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8" event={"ID":"06a5c598-c95d-453a-aeec-bda1f9ffdb82","Type":"ContainerDied","Data":"7abe160fccc5d4a51af595b200787a9f36de80e2de006e2c820ce5d58040e06c"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.674024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" event={"ID":"ab64dff2-f94e-4a15-84d1-4661aa1b8c87","Type":"ContainerStarted","Data":"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.674116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.676149 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.677734 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.678805 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:36:33 crc kubenswrapper[5030]: E0120 23:36:33.678885 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.682839 5030 generic.go:334] "Generic (PLEG): container finished" podID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerID="9237c481c7aa7834b8c0efe9a0c5fd9901bf2d2eb2b5d3f97cca7cc8a063c842" exitCode=0 Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.682915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerDied","Data":"9237c481c7aa7834b8c0efe9a0c5fd9901bf2d2eb2b5d3f97cca7cc8a063c842"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.682941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd6ac7c2-5294-401a-a16d-0821b044e7f8","Type":"ContainerDied","Data":"b31f9b5e8b0c977f70319a48b0da177bd27583cad3793a7b30357fd9d9d25ffb"} Jan 20 
23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.682953 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b31f9b5e8b0c977f70319a48b0da177bd27583cad3793a7b30357fd9d9d25ffb" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.691189 5030 generic.go:334] "Generic (PLEG): container finished" podID="c7faddbb-53f4-4f27-a853-0751049feafd" containerID="8b8510c34e0a7c2858626139d5521d1ef7d3a51f65ff94d2088c0275f2c2b66f" exitCode=0 Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.691301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerDied","Data":"8b8510c34e0a7c2858626139d5521d1ef7d3a51f65ff94d2088c0275f2c2b66f"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.695175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" event={"ID":"6b0c66c6-242d-4d35-8be9-28739bd537f5","Type":"ContainerDied","Data":"8d5dffb57112a27d355e0c955ea79c9754293888f45174f04a04224ab1e0eefd"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.695252 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.700461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" event={"ID":"a832d1c9-4838-4070-ad8d-e2af76d8ca47","Type":"ContainerDied","Data":"bbf7a2150dd0d618cdf2d49f18bd26756dc2327913a93f25ffb140279d91b870"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.700557 5030 scope.go:117] "RemoveContainer" containerID="3b503db37878b35095c76a10013149fa599afad387b5f267dc2cef4bd347d150" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.700725 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.704572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" event={"ID":"bc222012-854c-45e8-8ef9-e87ac905ca13","Type":"ContainerDied","Data":"c6f837a7ba9755b9bba25bf10fe5796e9680a8bd9232c1a25f721339c9fb4e16"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.704682 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-d9vgn" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.710259 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" podStartSLOduration=4.710226037 podStartE2EDuration="4.710226037s" podCreationTimestamp="2026-01-20 23:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:36:33.696598266 +0000 UTC m=+3666.016858554" watchObservedRunningTime="2026-01-20 23:36:33.710226037 +0000 UTC m=+3666.030486325" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.720198 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.720308 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.722794 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.723932 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.727839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.728905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9","Type":"ContainerDied","Data":"edb925b8ff8db1df5c8c5dbbe6706091487eaef81933749acfbeed04cff1a05e"} Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.836267 5030 scope.go:117] "RemoveContainer" containerID="901c65079c86eab89799533dc96ef980a3db2d852f2b58ea3c34b15a6460f991" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.905665 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.914745 5030 scope.go:117] "RemoveContainer" containerID="7aa4244c68daf36642df73838f4fa0a37eea094dfb4bdbe19e5709bc658be850" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.940290 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.940743 5030 scope.go:117] "RemoveContainer" containerID="c5665ab2e4d22cab17e4485bf654d41538e5774df91bf1e276c60d2155d2496f" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.992847 5030 scope.go:117] "RemoveContainer" containerID="11331a0257cd7cd3a45d9b1e0d9aa4ca36e6a5d20ca49299d905d135b78ee8e3" Jan 20 23:36:33 crc kubenswrapper[5030]: I0120 23:36:33.996586 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: E0120 23:36:34.005115 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06a5c598_c95d_453a_aeec_bda1f9ffdb82.slice/crio-7abe160fccc5d4a51af595b200787a9f36de80e2de006e2c820ce5d58040e06c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafaaf0e0_1ddc_4335_843b_c2ea49492b80.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.012847 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" path="/var/lib/kubelet/pods/43061a58-a31e-4ced-836f-b6a6badff81f/volumes" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.014273 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" path="/var/lib/kubelet/pods/5b3850fc-0bf6-489d-8a8f-4a72ad423cce/volumes" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.015168 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3774e65-1783-48d9-af7d-3a9964b609ad" path="/var/lib/kubelet/pods/d3774e65-1783-48d9-af7d-3a9964b609ad/volumes" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.016027 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" path="/var/lib/kubelet/pods/ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2/volumes" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.032214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jscx\" (UniqueName: \"kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.034915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run\") pod \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\" (UID: \"fd6ac7c2-5294-401a-a16d-0821b044e7f8\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.035197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.039709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.041322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzpgp\" (UniqueName: \"kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.041444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs" (OuterVolumeSpecName: "logs") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.041708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg2rx\" (UniqueName: \"kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.041773 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.043215 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.043238 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd6ac7c2-5294-401a-a16d-0821b044e7f8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.048712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx" (OuterVolumeSpecName: "kube-api-access-4jscx") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "kube-api-access-4jscx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.051165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts" (OuterVolumeSpecName: "scripts") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.055761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts" (OuterVolumeSpecName: "scripts") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.073222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx" (OuterVolumeSpecName: "kube-api-access-rg2rx") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "kube-api-access-rg2rx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.073577 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.079466 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": read tcp 10.217.0.2:45506->10.217.0.147:8775: read: connection reset by peer" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.079711 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": read tcp 10.217.0.2:45502->10.217.0.147:8775: read: connection reset by peer" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.083383 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6f8f-account-create-update-x86qv"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.095746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.095822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp" (OuterVolumeSpecName: "kube-api-access-rzpgp") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "kube-api-access-rzpgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.097233 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.108857 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.109090 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-d9vgn"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.125436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.130228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.133195 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-688dd48fcd-2fh67"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.142315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data" (OuterVolumeSpecName: "config-data") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143636 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143701 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dspcr\" (UniqueName: \"kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom\") pod \"c7faddbb-53f4-4f27-a853-0751049feafd\" (UID: \"c7faddbb-53f4-4f27-a853-0751049feafd\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle\") pod \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\" (UID: \"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.143930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data\") pod \"4d23d8f1-6699-42f6-8152-362964a6a9dc\" (UID: \"4d23d8f1-6699-42f6-8152-362964a6a9dc\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144194 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144211 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw54b\" (UniqueName: \"kubernetes.io/projected/6330116d-6f24-4964-a35b-419b23f92f44-kube-api-access-pw54b\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144223 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzpgp\" (UniqueName: \"kubernetes.io/projected/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-kube-api-access-rzpgp\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144232 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144240 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg2rx\" (UniqueName: \"kubernetes.io/projected/c7faddbb-53f4-4f27-a853-0751049feafd-kube-api-access-rg2rx\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144249 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jscx\" (UniqueName: \"kubernetes.io/projected/fd6ac7c2-5294-401a-a16d-0821b044e7f8-kube-api-access-4jscx\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144258 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144275 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144284 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144294 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6330116d-6f24-4964-a35b-419b23f92f44-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144303 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94b0ade6-0e21-4df1-8dca-db81b58aff7c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: 
I0120 23:36:34.144313 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.144322 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.146841 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.146921 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.147442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs" (OuterVolumeSpecName: "logs") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.148991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs" (OuterVolumeSpecName: "logs") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.150082 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-67694ddbb6-j66sw"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.152725 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.156231 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs" (OuterVolumeSpecName: "logs") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.166609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.170131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.170981 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.171375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.173914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts" (OuterVolumeSpecName: "scripts") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.182202 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-6a16-account-create-update-ss6r7"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.191353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr" (OuterVolumeSpecName: "kube-api-access-dspcr") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "kube-api-access-dspcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.195250 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.202586 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-7b33-account-create-update-226z8"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.203918 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.204713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts" (OuterVolumeSpecName: "scripts") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.225468 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.232613 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-78ff77fc4b-82l2x"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.239844 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.245476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246132 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246164 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246192 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246200 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246210 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246220 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246230 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246241 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7faddbb-53f4-4f27-a853-0751049feafd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246269 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dspcr\" (UniqueName: \"kubernetes.io/projected/4d23d8f1-6699-42f6-8152-362964a6a9dc-kube-api-access-dspcr\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246305 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.246319 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d23d8f1-6699-42f6-8152-362964a6a9dc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.249972 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7faddbb-53f4-4f27-a853-0751049feafd-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.249985 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.255780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fd6ac7c2-5294-401a-a16d-0821b044e7f8" (UID: "fd6ac7c2-5294-401a-a16d-0821b044e7f8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.255828 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.296862 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.305063 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-69f86ff78c-d95nc"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.332585 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.334370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.341810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.349900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.350388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353552 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afaaf0e0-1ddc-4335-843b-c2ea49492b80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353572 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd6ac7c2-5294-401a-a16d-0821b044e7f8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353582 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353592 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353601 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8705c389-fda4-4ef7-8a28-93efcdff880b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353610 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353634 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.353643 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.359457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data" (OuterVolumeSpecName: "config-data") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.364171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data" (OuterVolumeSpecName: "config-data") pod "c7faddbb-53f4-4f27-a853-0751049feafd" (UID: "c7faddbb-53f4-4f27-a853-0751049feafd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.364211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data" (OuterVolumeSpecName: "config-data") pod "4d23d8f1-6699-42f6-8152-362964a6a9dc" (UID: "4d23d8f1-6699-42f6-8152-362964a6a9dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.386858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.408384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" (UID: "3d310ab9-3091-41d2-967b-a5c4c3b5e7bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.454553 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.454581 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7faddbb-53f4-4f27-a853-0751049feafd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.454591 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d23d8f1-6699-42f6-8152-362964a6a9dc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.454601 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.454610 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.485186 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.555836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs\") pod \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.556239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4pzn\" (UniqueName: \"kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn\") pod \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.556271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs\") pod \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.556300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle\") pod \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.556321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data\") pod \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\" (UID: \"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.557456 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs" (OuterVolumeSpecName: "logs") pod "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" (UID: "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.572370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn" (OuterVolumeSpecName: "kube-api-access-v4pzn") pod "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" (UID: "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15"). InnerVolumeSpecName "kube-api-access-v4pzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.585040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" (UID: "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.600485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data" (OuterVolumeSpecName: "config-data") pod "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" (UID: "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.612726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" (UID: "0966335e-4d7c-4eb4-ba9b-b80a40b4fb15"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.658299 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4pzn\" (UniqueName: \"kubernetes.io/projected/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-kube-api-access-v4pzn\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.658322 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.658357 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.658368 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.658377 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.733104 5030 generic.go:334] "Generic (PLEG): container finished" podID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerID="a12e77d1b8be494229f4bdef24c5a9771212355912d130fbc08c36e1abf41aa5" exitCode=0 Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.733163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerDied","Data":"a12e77d1b8be494229f4bdef24c5a9771212355912d130fbc08c36e1abf41aa5"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.733190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"16aae85a-a502-4b2e-9663-e8c50d01f657","Type":"ContainerDied","Data":"6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.733201 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dccc950e4cea12b524ca5bc600410458261c5db3eb5f53a67e19bab08f459be" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.735160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" event={"ID":"3d310ab9-3091-41d2-967b-a5c4c3b5e7bb","Type":"ContainerDied","Data":"d067ff9e7fd3a18c290c327283c6479bcb00b964b9f5c64f4ad32859942f2e6b"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.735192 5030 scope.go:117] "RemoveContainer" containerID="e96f90310cc523986ea2cc24d1408bd36e515d624b67edb05f78fd96a59bfb9f" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.735298 5030 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.739334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerDied","Data":"a53f849ac1c1858e7caed509d7798f356ef005496fcb80d6c8c828ae2ffbfaba"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.739985 5030 generic.go:334] "Generic (PLEG): container finished" podID="99f18e53-0386-4e06-89f6-a7d30f880808" containerID="a53f849ac1c1858e7caed509d7798f356ef005496fcb80d6c8c828ae2ffbfaba" exitCode=0 Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.744853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"c7faddbb-53f4-4f27-a853-0751049feafd","Type":"ContainerDied","Data":"aa1523ca9b24f0b8ed109842dd97fa6252ceb006690a4a4349264fa22395bb21"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.744916 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.750555 5030 generic.go:334] "Generic (PLEG): container finished" podID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerID="b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01" exitCode=0 Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.750607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerDied","Data":"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.750647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0966335e-4d7c-4eb4-ba9b-b80a40b4fb15","Type":"ContainerDied","Data":"a444966ddbc3518f89c8a24b5cfd6f35ac5b00a60e68b9c70192390ed0394882"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.750702 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.764263 5030 generic.go:334] "Generic (PLEG): container finished" podID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerID="a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68" exitCode=0 Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.764359 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.764711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerDied","Data":"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.764762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"4d23d8f1-6699-42f6-8152-362964a6a9dc","Type":"ContainerDied","Data":"1b9e2a4a951b011876a9aa764fe630e1a819504641bb37396eaee5f36064b402"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.772532 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_3b0e7253-4a57-42d1-bbc3-768eeb755113/ovn-northd/0.log" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.772567 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerID="2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" exitCode=139 Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.772664 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.777865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerDied","Data":"2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72"} Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.816239 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.823715 5030 scope.go:117] "RemoveContainer" containerID="6c9c4f202a2cb68a56587c933cd90e0e468cf5c5ce99564d98ebbae3c4fae420" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.846927 5030 scope.go:117] "RemoveContainer" containerID="8b8510c34e0a7c2858626139d5521d1ef7d3a51f65ff94d2088c0275f2c2b66f" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.860830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.861032 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99z2s\" (UniqueName: \"kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.861063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.861084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.861109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.861143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data\") pod \"16aae85a-a502-4b2e-9663-e8c50d01f657\" (UID: \"16aae85a-a502-4b2e-9663-e8c50d01f657\") " Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.874790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs" (OuterVolumeSpecName: "logs") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.886084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s" (OuterVolumeSpecName: "kube-api-access-99z2s") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "kube-api-access-99z2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.897134 5030 scope.go:117] "RemoveContainer" containerID="dae190eac1e356b07146411eb43704c3a1d11f244f8ed93f24dbead6bae58b53" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.898152 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.907672 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.918097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.918170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.923721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data" (OuterVolumeSpecName: "config-data") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.924273 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6db9c8bfb8-7rnnr"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.932690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.933424 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.938755 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.943603 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.949036 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.953189 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16aae85a-a502-4b2e-9663-e8c50d01f657" (UID: "16aae85a-a502-4b2e-9663-e8c50d01f657"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.954113 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.959230 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962468 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962551 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962607 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962680 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99z2s\" (UniqueName: \"kubernetes.io/projected/16aae85a-a502-4b2e-9663-e8c50d01f657-kube-api-access-99z2s\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962732 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16aae85a-a502-4b2e-9663-e8c50d01f657-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:34 crc kubenswrapper[5030]: I0120 23:36:34.962784 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16aae85a-a502-4b2e-9663-e8c50d01f657-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.066723 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.085040 5030 scope.go:117] "RemoveContainer" containerID="b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.115411 5030 scope.go:117] "RemoveContainer" containerID="0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.142580 5030 scope.go:117] "RemoveContainer" containerID="b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.143073 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01\": container with ID starting with b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01 not found: ID does not exist" containerID="b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.143126 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01"} err="failed to get container status \"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01\": rpc error: code = NotFound desc = could not find container \"b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01\": container with ID starting with b8ae93af6666b9fb76a44af3d521db65e5795460f7c982e680684326ee865d01 not found: ID does not exist" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.143149 5030 scope.go:117] "RemoveContainer" containerID="0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.143474 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8\": container with ID starting with 0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8 not found: ID does not exist" containerID="0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.143538 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8"} err="failed to get container status \"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8\": rpc error: code = NotFound desc = could not find container \"0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8\": container with ID starting with 0180de15c3bd2219cf4cf9e44ce6525ef636a8fc03501e146cc2c6923d43e6e8 not found: ID does not exist" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.143569 5030 scope.go:117] "RemoveContainer" containerID="a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.164971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkhm5\" (UniqueName: \"kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165082 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165220 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.165382 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle\") pod \"99f18e53-0386-4e06-89f6-a7d30f880808\" (UID: \"99f18e53-0386-4e06-89f6-a7d30f880808\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.177359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs" (OuterVolumeSpecName: "logs") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.177990 5030 scope.go:117] "RemoveContainer" containerID="d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.194791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5" (OuterVolumeSpecName: "kube-api-access-hkhm5") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "kube-api-access-hkhm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.195413 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.202822 5030 scope.go:117] "RemoveContainer" containerID="a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.203117 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68\": container with ID starting with a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68 not found: ID does not exist" containerID="a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.203147 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68"} err="failed to get container status \"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68\": rpc error: code = NotFound desc = could not find container \"a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68\": container with ID starting with a0e544e09bb702af8047d8373439f3e074741aceea685df79f2a9a8308c31f68 not found: ID does not exist" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.203167 5030 scope.go:117] "RemoveContainer" containerID="d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.203592 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae\": container with ID starting with d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae not found: ID does not exist" containerID="d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.203613 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae"} err="failed to get container status \"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae\": rpc error: code = NotFound desc = could not find container \"d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae\": container with ID starting with d0e7360bf46166aa5bef2ce7924b3df57b7840af48dade76bb5e42853f0bcdae not found: ID does not exist" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.205110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.213112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data" (OuterVolumeSpecName: "config-data") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.214867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.224995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "99f18e53-0386-4e06-89f6-a7d30f880808" (UID: "99f18e53-0386-4e06-89f6-a7d30f880808"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.233961 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268365 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkhm5\" (UniqueName: \"kubernetes.io/projected/99f18e53-0386-4e06-89f6-a7d30f880808-kube-api-access-hkhm5\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268396 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268405 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268414 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268423 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268431 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99f18e53-0386-4e06-89f6-a7d30f880808-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.268440 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f18e53-0386-4e06-89f6-a7d30f880808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4mt8\" (UniqueName: \"kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369728 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.369995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.370052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts\") pod \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\" (UID: \"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.377079 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8" (OuterVolumeSpecName: "kube-api-access-f4mt8") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "kube-api-access-f4mt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.377209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts" (OuterVolumeSpecName: "scripts") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.378125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.381150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.396196 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_3b0e7253-4a57-42d1-bbc3-768eeb755113/ovn-northd/0.log" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.396264 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.398934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data" (OuterVolumeSpecName: "config-data") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.399197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.433435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.449193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" (UID: "f0979c96-fbd8-4e5b-a22b-35c1cae6e6da"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472104 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472133 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472143 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472151 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472162 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4mt8\" (UniqueName: \"kubernetes.io/projected/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-kube-api-access-f4mt8\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472171 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472180 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.472188 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8v27\" (UniqueName: \"kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573552 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config\") pod 
\"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573635 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.573697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir\") pod \"3b0e7253-4a57-42d1-bbc3-768eeb755113\" (UID: \"3b0e7253-4a57-42d1-bbc3-768eeb755113\") " Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.574197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.574362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config" (OuterVolumeSpecName: "config") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.574559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts" (OuterVolumeSpecName: "scripts") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.578746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27" (OuterVolumeSpecName: "kube-api-access-h8v27") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "kube-api-access-h8v27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.594748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.652137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.658852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3b0e7253-4a57-42d1-bbc3-768eeb755113" (UID: "3b0e7253-4a57-42d1-bbc3-768eeb755113"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675283 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675315 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675329 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b0e7253-4a57-42d1-bbc3-768eeb755113-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675340 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675353 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675366 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8v27\" (UniqueName: \"kubernetes.io/projected/3b0e7253-4a57-42d1-bbc3-768eeb755113-kube-api-access-h8v27\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.675378 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b0e7253-4a57-42d1-bbc3-768eeb755113-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.790267 5030 generic.go:334] "Generic (PLEG): container finished" podID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" containerID="8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf" exitCode=0 Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.790407 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.790446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" event={"ID":"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da","Type":"ContainerDied","Data":"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf"} Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.790529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x" event={"ID":"f0979c96-fbd8-4e5b-a22b-35c1cae6e6da","Type":"ContainerDied","Data":"fbeb9f441575597f0a4417989c98da7e307dc5d45a00cfa38b2f2c1696700e56"} Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.790568 5030 scope.go:117] "RemoveContainer" containerID="8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.798875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" event={"ID":"99f18e53-0386-4e06-89f6-a7d30f880808","Type":"ContainerDied","Data":"cef03121a4040309a93985530bf6c90fc8542184ca20627744371245053e16db"} Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.798925 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.808220 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_3b0e7253-4a57-42d1-bbc3-768eeb755113/ovn-northd/0.log" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.808342 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.808390 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"3b0e7253-4a57-42d1-bbc3-768eeb755113","Type":"ContainerDied","Data":"d2298816256dc578d3898588fc407be0225f8dab87e1cc38c30b05d899c283b1"} Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.814638 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.831565 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.843400 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-78c6cc75f4-xhf7x"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.844510 5030 scope.go:117] "RemoveContainer" containerID="8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.845346 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf\": container with ID starting with 8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf not found: ID does not exist" containerID="8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.845419 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf"} err="failed to get container status \"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf\": rpc error: code = NotFound desc = could not find container \"8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf\": container with ID starting with 8cb98403ba1e46309540a4b0a01cdae716399a7ce40b028158271baa4c653cbf not found: ID does not exist" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.845465 5030 scope.go:117] "RemoveContainer" containerID="a53f849ac1c1858e7caed509d7798f356ef005496fcb80d6c8c828ae2ffbfaba" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.865679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.878671 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.883333 5030 scope.go:117] "RemoveContainer" containerID="b09cb47fdff353d2277388987884452b6f97530142dd767855a2b065ff8722ae" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.889504 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.900677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-778f4b5b5b-7p9xw"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.905938 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.908814 5030 scope.go:117] "RemoveContainer" containerID="0c8ba91ac8628ce6044a35b4f7d9eb67441f4f1ee295ec42f3004b90599321e6" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.910511 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.931333 5030 scope.go:117] "RemoveContainer" containerID="2b3f48e0d6f2a6aefac841390f0266fdaf772963df0559171a321f8b0b377b72" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.973272 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a5c598-c95d-453a-aeec-bda1f9ffdb82" 
path="/var/lib/kubelet/pods/06a5c598-c95d-453a-aeec-bda1f9ffdb82/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.973970 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" path="/var/lib/kubelet/pods/0966335e-4d7c-4eb4-ba9b-b80a40b4fb15/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.975023 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" path="/var/lib/kubelet/pods/16aae85a-a502-4b2e-9663-e8c50d01f657/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.976985 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" path="/var/lib/kubelet/pods/3b0e7253-4a57-42d1-bbc3-768eeb755113/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.977976 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" path="/var/lib/kubelet/pods/3d310ab9-3091-41d2-967b-a5c4c3b5e7bb/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.979012 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" path="/var/lib/kubelet/pods/4d23d8f1-6699-42f6-8152-362964a6a9dc/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.979735 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:36:35 crc kubenswrapper[5030]: E0120 23:36:35.979827 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data podName:e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c nodeName:}" failed. No retries permitted until 2026-01-20 23:36:43.979804064 +0000 UTC m=+3676.300064372 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data") pod "rabbitmq-server-0" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c") : configmap "rabbitmq-config-data" not found Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.980562 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6330116d-6f24-4964-a35b-419b23f92f44" path="/var/lib/kubelet/pods/6330116d-6f24-4964-a35b-419b23f92f44/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.981077 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0c66c6-242d-4d35-8be9-28739bd537f5" path="/var/lib/kubelet/pods/6b0c66c6-242d-4d35-8be9-28739bd537f5/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.981661 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8705c389-fda4-4ef7-8a28-93efcdff880b" path="/var/lib/kubelet/pods/8705c389-fda4-4ef7-8a28-93efcdff880b/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.982283 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b0ade6-0e21-4df1-8dca-db81b58aff7c" path="/var/lib/kubelet/pods/94b0ade6-0e21-4df1-8dca-db81b58aff7c/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.982930 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" path="/var/lib/kubelet/pods/99f18e53-0386-4e06-89f6-a7d30f880808/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.984496 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" path="/var/lib/kubelet/pods/a832d1c9-4838-4070-ad8d-e2af76d8ca47/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.985512 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afaaf0e0-1ddc-4335-843b-c2ea49492b80" path="/var/lib/kubelet/pods/afaaf0e0-1ddc-4335-843b-c2ea49492b80/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.986331 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" path="/var/lib/kubelet/pods/bc222012-854c-45e8-8ef9-e87ac905ca13/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.987636 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" path="/var/lib/kubelet/pods/c7faddbb-53f4-4f27-a853-0751049feafd/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.988901 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" path="/var/lib/kubelet/pods/d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.990068 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" path="/var/lib/kubelet/pods/f0979c96-fbd8-4e5b-a22b-35c1cae6e6da/volumes" Jan 20 23:36:35 crc kubenswrapper[5030]: I0120 23:36:35.993577 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" path="/var/lib/kubelet/pods/fd6ac7c2-5294-401a-a16d-0821b044e7f8/volumes" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.556244 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690640 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.690957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw8g5\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691113 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: 
\"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691210 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data\") pod \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\" (UID: \"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c\") " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691746 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691771 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.691791 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.694731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.696446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info" (OuterVolumeSpecName: "pod-info") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.696519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.696520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.697835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5" (OuterVolumeSpecName: "kube-api-access-hw8g5") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "kube-api-access-hw8g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.721925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data" (OuterVolumeSpecName: "config-data") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.737228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf" (OuterVolumeSpecName: "server-conf") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.777472 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" (UID: "e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793261 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793294 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793305 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793316 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793337 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793351 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw8g5\" (UniqueName: \"kubernetes.io/projected/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-kube-api-access-hw8g5\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: E0120 23:36:36.793352 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793361 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: E0120 23:36:36.793443 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data podName:707515d1-17d7-47ff-be95-73414a6a2a95 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:44.793392097 +0000 UTC m=+3677.113652385 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data") pod "rabbitmq-cell1-server-0" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.793472 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.808075 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.824588 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerID="fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47" exitCode=0 Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.824660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerDied","Data":"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47"} Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.824686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c","Type":"ContainerDied","Data":"13613ed8fdeb75e14a275c5b8e8136860ac61830c6715d8c58f9e95416b9de84"} Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.824703 5030 scope.go:117] "RemoveContainer" containerID="fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.824789 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.846519 5030 scope.go:117] "RemoveContainer" containerID="28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.863221 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.870574 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.873554 5030 scope.go:117] "RemoveContainer" containerID="fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47" Jan 20 23:36:36 crc kubenswrapper[5030]: E0120 23:36:36.874053 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47\": container with ID starting with fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47 not found: ID does not exist" containerID="fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.874096 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47"} err="failed to get container status \"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47\": rpc error: code = NotFound desc = could not find container \"fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47\": container with ID starting with fd3d688e67d991d64ec087d4b4ddfd8bf95aa94fca6895e70222b2a4a5f67e47 not found: ID does not exist" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.874124 5030 scope.go:117] "RemoveContainer" containerID="28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c" Jan 20 23:36:36 crc kubenswrapper[5030]: E0120 23:36:36.874418 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c\": container with ID starting with 28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c not found: ID does not exist" containerID="28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.874450 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c"} err="failed to get container status \"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c\": rpc error: code = NotFound desc = could not find container \"28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c\": container with ID starting with 28edcad0484db485386c7130df3370aaacf3e682e8b1296021f215f02115680c not found: ID does not exist" Jan 20 23:36:36 crc kubenswrapper[5030]: I0120 23:36:36.895118 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.586146 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708090 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nlmg\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: 
\"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd\") pod \"707515d1-17d7-47ff-be95-73414a6a2a95\" (UID: \"707515d1-17d7-47ff-be95-73414a6a2a95\") " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.708649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.709110 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.709131 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.709770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.714984 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.715841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.715870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg" (OuterVolumeSpecName: "kube-api-access-2nlmg") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). 
InnerVolumeSpecName "kube-api-access-2nlmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.717823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.720294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info" (OuterVolumeSpecName: "pod-info") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.735940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data" (OuterVolumeSpecName: "config-data") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.750854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf" (OuterVolumeSpecName: "server-conf") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.787454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "707515d1-17d7-47ff-be95-73414a6a2a95" (UID: "707515d1-17d7-47ff-be95-73414a6a2a95"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810512 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/707515d1-17d7-47ff-be95-73414a6a2a95-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810550 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810565 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nlmg\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-kube-api-access-2nlmg\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810575 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810584 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810595 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/707515d1-17d7-47ff-be95-73414a6a2a95-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810604 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/707515d1-17d7-47ff-be95-73414a6a2a95-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810658 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.810668 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/707515d1-17d7-47ff-be95-73414a6a2a95-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.855962 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.856895 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.863261 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerID="c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96" exitCode=0 Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.863343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerDied","Data":"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96"} Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.863350 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.863373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj" event={"ID":"f4bc1efe-faed-4290-a85c-c631a91f02ca","Type":"ContainerDied","Data":"b227144782ed82b8ff9633022adb1be69c5e6055e89903a202f63d2c3a5def9f"} Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.863394 5030 scope.go:117] "RemoveContainer" containerID="c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.866654 5030 generic.go:334] "Generic (PLEG): container finished" podID="707515d1-17d7-47ff-be95-73414a6a2a95" containerID="6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c" exitCode=0 Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.866703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerDied","Data":"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c"} Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.866726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"707515d1-17d7-47ff-be95-73414a6a2a95","Type":"ContainerDied","Data":"d81d866e2fdb7ee31c791a88789cae06990472531241a26ed1edacf53c12a8d5"} Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.866786 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.889535 5030 scope.go:117] "RemoveContainer" containerID="fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.914198 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.916055 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.924429 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.924962 5030 scope.go:117] "RemoveContainer" containerID="c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96" Jan 20 23:36:37 crc kubenswrapper[5030]: E0120 23:36:37.925451 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96\": container with ID starting with c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96 not found: ID does not exist" containerID="c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.925525 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96"} err="failed to get container status \"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96\": rpc error: code = NotFound desc = could not find container \"c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96\": container with ID starting with c404f86a75aeed11f9b633de5cc6d308e1fbdb48707f1cf37008195d2631fc96 not found: ID does not exist" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.925580 5030 scope.go:117] "RemoveContainer" containerID="fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f" Jan 20 23:36:37 crc kubenswrapper[5030]: E0120 23:36:37.926025 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f\": container with ID starting with fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f not found: ID does not exist" containerID="fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.926054 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f"} err="failed to get container status \"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f\": rpc error: code = NotFound desc = could not find container \"fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f\": container with ID starting with fc2622eac769de27d1854d47660b3865bb55eb741ab3de14a0ff652e61da605f not found: ID does not exist" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.926073 5030 scope.go:117] "RemoveContainer" containerID="6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 
23:36:37.945828 5030 scope.go:117] "RemoveContainer" containerID="18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.964013 5030 scope.go:117] "RemoveContainer" containerID="6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c" Jan 20 23:36:37 crc kubenswrapper[5030]: E0120 23:36:37.964362 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c\": container with ID starting with 6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c not found: ID does not exist" containerID="6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.964405 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c"} err="failed to get container status \"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c\": rpc error: code = NotFound desc = could not find container \"6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c\": container with ID starting with 6d9893896c0fe1b9cbcae45b2dde218a0a94a2e3b05ad47f803bf4db0286bc6c not found: ID does not exist" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.964432 5030 scope.go:117] "RemoveContainer" containerID="18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e" Jan 20 23:36:37 crc kubenswrapper[5030]: E0120 23:36:37.964759 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e\": container with ID starting with 18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e not found: ID does not exist" containerID="18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.964780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e"} err="failed to get container status \"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e\": rpc error: code = NotFound desc = could not find container \"18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e\": container with ID starting with 18661e9685ba3cc1b4a95ae118b9c8f65210502829dae351496739d132cd571e not found: ID does not exist" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.966041 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/nova-scheduler-0" secret="" err="secret \"nova-nova-dockercfg-jczwg\" not found" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.972498 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" path="/var/lib/kubelet/pods/707515d1-17d7-47ff-be95-73414a6a2a95/volumes" Jan 20 23:36:37 crc kubenswrapper[5030]: I0120 23:36:37.973268 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" path="/var/lib/kubelet/pods/e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c/volumes" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.014648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvgxd\" (UniqueName: \"kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd\") pod \"f4bc1efe-faed-4290-a85c-c631a91f02ca\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.014697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs\") pod \"f4bc1efe-faed-4290-a85c-c631a91f02ca\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.014755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom\") pod \"f4bc1efe-faed-4290-a85c-c631a91f02ca\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.014810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle\") pod \"f4bc1efe-faed-4290-a85c-c631a91f02ca\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.014862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data\") pod \"f4bc1efe-faed-4290-a85c-c631a91f02ca\" (UID: \"f4bc1efe-faed-4290-a85c-c631a91f02ca\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.015241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs" (OuterVolumeSpecName: "logs") pod "f4bc1efe-faed-4290-a85c-c631a91f02ca" (UID: "f4bc1efe-faed-4290-a85c-c631a91f02ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.015698 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4bc1efe-faed-4290-a85c-c631a91f02ca-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.017910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f4bc1efe-faed-4290-a85c-c631a91f02ca" (UID: "f4bc1efe-faed-4290-a85c-c631a91f02ca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.018229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd" (OuterVolumeSpecName: "kube-api-access-vvgxd") pod "f4bc1efe-faed-4290-a85c-c631a91f02ca" (UID: "f4bc1efe-faed-4290-a85c-c631a91f02ca"). InnerVolumeSpecName "kube-api-access-vvgxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.036707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4bc1efe-faed-4290-a85c-c631a91f02ca" (UID: "f4bc1efe-faed-4290-a85c-c631a91f02ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.058857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data" (OuterVolumeSpecName: "config-data") pod "f4bc1efe-faed-4290-a85c-c631a91f02ca" (UID: "f4bc1efe-faed-4290-a85c-c631a91f02ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.117136 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.117380 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.117391 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvgxd\" (UniqueName: \"kubernetes.io/projected/f4bc1efe-faed-4290-a85c-c631a91f02ca-kube-api-access-vvgxd\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.117403 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4bc1efe-faed-4290-a85c-c631a91f02ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.117303 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.117459 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle podName:5375014e-2365-4042-910d-80304789f745 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:38.617443047 +0000 UTC m=+3670.937703335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle") pod "nova-scheduler-0" (UID: "5375014e-2365-4042-910d-80304789f745") : secret "combined-ca-bundle" not found Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.194458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.199294 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-64946f8974-8fmpj"] Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.629471 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.629531 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle podName:5375014e-2365-4042-910d-80304789f745 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:39.629516382 +0000 UTC m=+3671.949776670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle") pod "nova-scheduler-0" (UID: "5375014e-2365-4042-910d-80304789f745") : secret "combined-ca-bundle" not found Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.808398 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.876405 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerID="dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912" exitCode=0 Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.876457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerDied","Data":"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912"} Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.876480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" event={"ID":"5c2a94b7-b322-4de7-b55d-80c87895ab2d","Type":"ContainerDied","Data":"6197a3bce9c647c502cc83d469a50e6590cde057c66a12bfa91c47c2073f2f65"} Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.876495 5030 scope.go:117] "RemoveContainer" containerID="dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.876610 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.896017 5030 scope.go:117] "RemoveContainer" containerID="1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.914453 5030 scope.go:117] "RemoveContainer" containerID="dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912" Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.914993 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912\": container with ID starting with dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912 not found: ID does not exist" containerID="dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.915142 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912"} err="failed to get container status \"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912\": rpc error: code = NotFound desc = could not find container \"dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912\": container with ID starting with dbbd94a1cc0b6868e228ff556ac722d58d4265232e5ea11ebaf7cc7bcb233912 not found: ID does not exist" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.915172 5030 scope.go:117] "RemoveContainer" containerID="1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457" Jan 20 23:36:38 crc kubenswrapper[5030]: E0120 23:36:38.915733 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457\": container with ID starting with 1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457 not found: ID does not exist" containerID="1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.915761 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457"} err="failed to get container status \"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457\": rpc error: code = NotFound desc = could not find container \"1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457\": container with ID starting with 1f023a1689579151152b0ca7bfe3a74e9a546de82e1d757e00c1973facff6457 not found: ID does not exist" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.933602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle\") pod \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.933708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom\") pod \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.933744 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7fwn8\" (UniqueName: \"kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8\") pod \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.933835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs\") pod \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.933952 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data\") pod \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\" (UID: \"5c2a94b7-b322-4de7-b55d-80c87895ab2d\") " Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.934918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs" (OuterVolumeSpecName: "logs") pod "5c2a94b7-b322-4de7-b55d-80c87895ab2d" (UID: "5c2a94b7-b322-4de7-b55d-80c87895ab2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.938370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8" (OuterVolumeSpecName: "kube-api-access-7fwn8") pod "5c2a94b7-b322-4de7-b55d-80c87895ab2d" (UID: "5c2a94b7-b322-4de7-b55d-80c87895ab2d"). InnerVolumeSpecName "kube-api-access-7fwn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.938784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5c2a94b7-b322-4de7-b55d-80c87895ab2d" (UID: "5c2a94b7-b322-4de7-b55d-80c87895ab2d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.978875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data" (OuterVolumeSpecName: "config-data") pod "5c2a94b7-b322-4de7-b55d-80c87895ab2d" (UID: "5c2a94b7-b322-4de7-b55d-80c87895ab2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:38 crc kubenswrapper[5030]: I0120 23:36:38.980524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c2a94b7-b322-4de7-b55d-80c87895ab2d" (UID: "5c2a94b7-b322-4de7-b55d-80c87895ab2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.036318 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.036347 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.036361 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fwn8\" (UniqueName: \"kubernetes.io/projected/5c2a94b7-b322-4de7-b55d-80c87895ab2d-kube-api-access-7fwn8\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.036373 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2a94b7-b322-4de7-b55d-80c87895ab2d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.036383 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2a94b7-b322-4de7-b55d-80c87895ab2d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.213359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.229683 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7b7d88d9b5-npm6f"] Jan 20 23:36:39 crc kubenswrapper[5030]: E0120 23:36:39.644963 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:39 crc kubenswrapper[5030]: E0120 23:36:39.645955 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle podName:5375014e-2365-4042-910d-80304789f745 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:41.645932225 +0000 UTC m=+3673.966192513 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle") pod "nova-scheduler-0" (UID: "5375014e-2365-4042-910d-80304789f745") : secret "combined-ca-bundle" not found Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.978423 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" path="/var/lib/kubelet/pods/5c2a94b7-b322-4de7-b55d-80c87895ab2d/volumes" Jan 20 23:36:39 crc kubenswrapper[5030]: I0120 23:36:39.979054 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" path="/var/lib/kubelet/pods/f4bc1efe-faed-4290-a85c-c631a91f02ca/volumes" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.069060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.069315 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" containerID="cri-o://1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" gracePeriod=30 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.157339 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.157421 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.157483 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.158242 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.158337 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" gracePeriod=600 Jan 20 23:36:40 crc kubenswrapper[5030]: E0120 23:36:40.293330 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.332816 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.406541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.407243 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="dnsmasq-dns" containerID="cri-o://bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35" gracePeriod=10 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.549894 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.228:5353: connect: connection refused" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.732944 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.733212 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" containerName="memcached" containerID="cri-o://34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d" gracePeriod=30 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.746580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.746816 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e" gracePeriod=30 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.756302 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.764263 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-2fk4z"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.769176 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.769388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" gracePeriod=30 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.774850 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.781827 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-2pwxz"] Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.892024 
5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.910674 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" exitCode=0 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.910761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4"} Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.910822 5030 scope.go:117] "RemoveContainer" containerID="1a1fe53fa35b53c77fc4449b81793acb927248e20c61c789e75acc9a687014ea" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.914583 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:36:40 crc kubenswrapper[5030]: E0120 23:36:40.914986 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.917361 5030 generic.go:334] "Generic (PLEG): container finished" podID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerID="bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35" exitCode=0 Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.917406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" event={"ID":"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0","Type":"ContainerDied","Data":"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35"} Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.917434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" event={"ID":"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0","Type":"ContainerDied","Data":"0883728b834863e289f76f52e7bae14541a8e0c72081e58bd13bfada03b04d47"} Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.917496 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.939229 5030 scope.go:117] "RemoveContainer" containerID="bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35" Jan 20 23:36:40 crc kubenswrapper[5030]: I0120 23:36:40.965987 5030 scope.go:117] "RemoveContainer" containerID="3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.004694 5030 scope.go:117] "RemoveContainer" containerID="bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35" Jan 20 23:36:41 crc kubenswrapper[5030]: E0120 23:36:41.006051 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35\": container with ID starting with bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35 not found: ID does not exist" containerID="bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.006144 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35"} err="failed to get container status \"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35\": rpc error: code = NotFound desc = could not find container \"bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35\": container with ID starting with bffdf532710533e55a9bfcb1f25017e67b8336ed19189f2017903124f6b31f35 not found: ID does not exist" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.006244 5030 scope.go:117] "RemoveContainer" containerID="3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba" Jan 20 23:36:41 crc kubenswrapper[5030]: E0120 23:36:41.007291 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba\": container with ID starting with 3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba not found: ID does not exist" containerID="3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.007335 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba"} err="failed to get container status \"3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba\": rpc error: code = NotFound desc = could not find container \"3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba\": container with ID starting with 3b71e337b7a424ae92b3ec58e6074fdcfb016322dcb1f6c81c08b0044d34ffba not found: ID does not exist" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.068362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0\") pod \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.068510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc\") pod 
\"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.068602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8jtr\" (UniqueName: \"kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr\") pod \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.068660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config\") pod \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\" (UID: \"a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.073961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr" (OuterVolumeSpecName: "kube-api-access-h8jtr") pod "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" (UID: "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0"). InnerVolumeSpecName "kube-api-access-h8jtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.106412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" (UID: "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.106460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config" (OuterVolumeSpecName: "config") pod "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" (UID: "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.122174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" (UID: "a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.170147 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8jtr\" (UniqueName: \"kubernetes.io/projected/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-kube-api-access-h8jtr\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.170385 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.170460 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.170560 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.252551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.258335 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-796955bb77-dbf69"] Jan 20 23:36:41 crc kubenswrapper[5030]: E0120 23:36:41.682601 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:36:41 crc kubenswrapper[5030]: E0120 23:36:41.682714 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle podName:5375014e-2365-4042-910d-80304789f745 nodeName:}" failed. No retries permitted until 2026-01-20 23:36:45.682691747 +0000 UTC m=+3678.002952055 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle") pod "nova-scheduler-0" (UID: "5375014e-2365-4042-910d-80304789f745") : secret "combined-ca-bundle" not found Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.873067 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.895144 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.933531 5030 generic.go:334] "Generic (PLEG): container finished" podID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" containerID="34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d" exitCode=0 Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.933603 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.933586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978","Type":"ContainerDied","Data":"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d"} Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.933711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978","Type":"ContainerDied","Data":"ec7e149092fad6af623b19824e5babe12171c69a3b87ef9ad55041bd42e9fa51"} Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.933737 5030 scope.go:117] "RemoveContainer" containerID="34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.939353 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3f32a70-a447-408f-99e0-90075753a304" containerID="0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e" exitCode=0 Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.939415 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.939407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerDied","Data":"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e"} Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.939548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"d3f32a70-a447-408f-99e0-90075753a304","Type":"ContainerDied","Data":"854df0a137c6b15d84b8cb748e0de68099c42d359e5014d0141ca6a9aa4ea34e"} Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.968370 5030 scope.go:117] "RemoveContainer" containerID="34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d" Jan 20 23:36:41 crc kubenswrapper[5030]: E0120 23:36:41.968910 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d\": container with ID starting with 34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d not found: ID does not exist" containerID="34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.968987 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d"} err="failed to get container status \"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d\": rpc error: code = NotFound desc = could not find container \"34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d\": container with ID starting with 34ac68c5bbe6cab228a875e4c26bf6e188284b89ca877ab5eab2a0f82d7e057d not found: ID does not exist" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.969021 5030 scope.go:117] "RemoveContainer" containerID="0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.973572 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73af7840-4469-4d06-a6a4-23396532a060" 
path="/var/lib/kubelet/pods/73af7840-4469-4d06-a6a4-23396532a060/volumes" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.974189 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" path="/var/lib/kubelet/pods/a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0/volumes" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.974922 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd48950b-220e-43d8-a96e-ad983ee50c2c" path="/var/lib/kubelet/pods/cd48950b-220e-43d8-a96e-ad983ee50c2c/volumes" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.985695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8658x\" (UniqueName: \"kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x\") pod \"d3f32a70-a447-408f-99e0-90075753a304\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.985775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data\") pod \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.985883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data\") pod \"d3f32a70-a447-408f-99e0-90075753a304\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.985920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle\") pod \"d3f32a70-a447-408f-99e0-90075753a304\" (UID: \"d3f32a70-a447-408f-99e0-90075753a304\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.985978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs\") pod \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.986014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhjpb\" (UniqueName: \"kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb\") pod \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.986044 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle\") pod \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.986091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config\") pod \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\" (UID: \"9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978\") " Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.987035 5030 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data" (OuterVolumeSpecName: "config-data") pod "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" (UID: "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.987047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" (UID: "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.990155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb" (OuterVolumeSpecName: "kube-api-access-zhjpb") pod "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" (UID: "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978"). InnerVolumeSpecName "kube-api-access-zhjpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.990640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x" (OuterVolumeSpecName: "kube-api-access-8658x") pod "d3f32a70-a447-408f-99e0-90075753a304" (UID: "d3f32a70-a447-408f-99e0-90075753a304"). InnerVolumeSpecName "kube-api-access-8658x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:41 crc kubenswrapper[5030]: I0120 23:36:41.996438 5030 scope.go:117] "RemoveContainer" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.005913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3f32a70-a447-408f-99e0-90075753a304" (UID: "d3f32a70-a447-408f-99e0-90075753a304"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.013176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data" (OuterVolumeSpecName: "config-data") pod "d3f32a70-a447-408f-99e0-90075753a304" (UID: "d3f32a70-a447-408f-99e0-90075753a304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.014757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" (UID: "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.038483 5030 scope.go:117] "RemoveContainer" containerID="0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e" Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.039004 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e\": container with ID starting with 0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e not found: ID does not exist" containerID="0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.039036 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e"} err="failed to get container status \"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e\": rpc error: code = NotFound desc = could not find container \"0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e\": container with ID starting with 0109d594a82a21bcf7309f9ae3d33696d2b156560568c20cd829172557037d0e not found: ID does not exist" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.039059 5030 scope.go:117] "RemoveContainer" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.039446 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7\": container with ID starting with 649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7 not found: ID does not exist" containerID="649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.039473 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7"} err="failed to get container status \"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7\": rpc error: code = NotFound desc = could not find container \"649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7\": container with ID starting with 649199ce363a72dcd334d9339ef543ddaebf6f34cc2a93248f9218ab534c29b7 not found: ID does not exist" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.042797 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" (UID: "9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.087813 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.087871 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhjpb\" (UniqueName: \"kubernetes.io/projected/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kube-api-access-zhjpb\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.087894 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.088965 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.089010 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8658x\" (UniqueName: \"kubernetes.io/projected/d3f32a70-a447-408f-99e0-90075753a304-kube-api-access-8658x\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.089031 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.089051 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.089073 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f32a70-a447-408f-99e0-90075753a304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.273670 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.286838 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.339557 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:36:42 crc kubenswrapper[5030]: I0120 23:36:42.346225 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.628037 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.630101 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.632102 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:36:42 crc kubenswrapper[5030]: E0120 23:36:42.632169 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:36:43 crc kubenswrapper[5030]: E0120 23:36:43.751360 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:36:43 crc kubenswrapper[5030]: E0120 23:36:43.753727 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:36:43 crc kubenswrapper[5030]: E0120 23:36:43.755736 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:36:43 crc kubenswrapper[5030]: E0120 23:36:43.755837 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:36:43 crc kubenswrapper[5030]: I0120 23:36:43.979677 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" path="/var/lib/kubelet/pods/9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978/volumes" Jan 20 23:36:43 crc kubenswrapper[5030]: I0120 23:36:43.980881 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f32a70-a447-408f-99e0-90075753a304" path="/var/lib/kubelet/pods/d3f32a70-a447-408f-99e0-90075753a304/volumes" Jan 20 23:36:44 crc kubenswrapper[5030]: E0120 23:36:44.271471 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec706fe5_85b7_4580_9013_e39497640ab0.slice/crio-conmon-a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.755867 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.855729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.855808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.855876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.855997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.856034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.856081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.856138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j95mn\" (UniqueName: \"kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn\") pod \"ec706fe5-85b7-4580-9013-e39497640ab0\" (UID: \"ec706fe5-85b7-4580-9013-e39497640ab0\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.864786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.872318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn" (OuterVolumeSpecName: "kube-api-access-j95mn") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "kube-api-access-j95mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.909276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.935427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.936223 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.941479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.957525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data\") pod \"9c05bad7-0581-4182-9c7b-7a790c22cf64\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.957631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfn69\" (UniqueName: \"kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69\") pod \"9c05bad7-0581-4182-9c7b-7a790c22cf64\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.957704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle\") pod \"9c05bad7-0581-4182-9c7b-7a790c22cf64\" (UID: \"9c05bad7-0581-4182-9c7b-7a790c22cf64\") " Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.958001 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.958018 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.958029 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.958039 5030 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.958048 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j95mn\" (UniqueName: \"kubernetes.io/projected/ec706fe5-85b7-4580-9013-e39497640ab0-kube-api-access-j95mn\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.962721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config" (OuterVolumeSpecName: "config") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.963961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69" (OuterVolumeSpecName: "kube-api-access-bfn69") pod "9c05bad7-0581-4182-9c7b-7a790c22cf64" (UID: "9c05bad7-0581-4182-9c7b-7a790c22cf64"). InnerVolumeSpecName "kube-api-access-bfn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.971681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ec706fe5-85b7-4580-9013-e39497640ab0" (UID: "ec706fe5-85b7-4580-9013-e39497640ab0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.985170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data" (OuterVolumeSpecName: "config-data") pod "9c05bad7-0581-4182-9c7b-7a790c22cf64" (UID: "9c05bad7-0581-4182-9c7b-7a790c22cf64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.988520 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" exitCode=0 Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.988656 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.988860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerDied","Data":"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682"} Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.988940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"9c05bad7-0581-4182-9c7b-7a790c22cf64","Type":"ContainerDied","Data":"a561ed0a1b8342fc0104655d970929de17c3ce7e53b1072404ab41d45299547f"} Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.988974 5030 scope.go:117] "RemoveContainer" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.991451 5030 generic.go:334] "Generic (PLEG): container finished" podID="5375014e-2365-4042-910d-80304789f745" containerID="1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" exitCode=0 Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.991506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerDied","Data":"1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d"} Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.993336 5030 generic.go:334] "Generic (PLEG): container finished" podID="ec706fe5-85b7-4580-9013-e39497640ab0" containerID="a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5" exitCode=0 Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.993374 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.993372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerDied","Data":"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5"} Jan 20 23:36:44 crc kubenswrapper[5030]: I0120 23:36:44.993490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-557d5c9854-wbd5h" event={"ID":"ec706fe5-85b7-4580-9013-e39497640ab0","Type":"ContainerDied","Data":"3ba60cf0d2af06af168f14d98c0ee6b461d95799278997d935c999524b467e98"} Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.000017 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c05bad7-0581-4182-9c7b-7a790c22cf64" (UID: "9c05bad7-0581-4182-9c7b-7a790c22cf64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.018475 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.031405 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.037774 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-557d5c9854-wbd5h"] Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.046005 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.056085 5030 scope.go:117] "RemoveContainer" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" Jan 20 23:36:45 crc kubenswrapper[5030]: E0120 23:36:45.056645 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682\": container with ID starting with 7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682 not found: ID does not exist" containerID="7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.056674 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682"} err="failed to get container status \"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682\": rpc error: code = NotFound desc = could not find container \"7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682\": container with ID starting with 7bc91cd1f5c9df49e74ef028cccb3877f80d42895501ea45b4572610565d9682 not found: ID does not exist" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.056696 5030 scope.go:117] "RemoveContainer" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:36:45 crc kubenswrapper[5030]: E0120 23:36:45.057813 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8\": container with ID starting with 3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8 not found: ID does not exist" containerID="3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.057880 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8"} err="failed to get container status \"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8\": rpc error: code = NotFound desc = could not find container \"3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8\": container with ID starting with 3d47242f54795403dc1f51ed3fec74e2d12e76a99b78fc45f14917f84937cbf8 not found: ID does not exist" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.057927 5030 scope.go:117] "RemoveContainer" containerID="84ade93355a3eac45979c28fcd90e782dabf113aeb7e0a0088390d73d73bdfa8" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.058277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-stlp8\" (UniqueName: \"kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8\") pod \"5375014e-2365-4042-910d-80304789f745\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.058304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data\") pod \"5375014e-2365-4042-910d-80304789f745\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.058363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle\") pod \"5375014e-2365-4042-910d-80304789f745\" (UID: \"5375014e-2365-4042-910d-80304789f745\") " Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.059506 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.059546 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.059559 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfn69\" (UniqueName: \"kubernetes.io/projected/9c05bad7-0581-4182-9c7b-7a790c22cf64-kube-api-access-bfn69\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.059569 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec706fe5-85b7-4580-9013-e39497640ab0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.059578 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c05bad7-0581-4182-9c7b-7a790c22cf64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.064644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8" (OuterVolumeSpecName: "kube-api-access-stlp8") pod "5375014e-2365-4042-910d-80304789f745" (UID: "5375014e-2365-4042-910d-80304789f745"). InnerVolumeSpecName "kube-api-access-stlp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.085986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data" (OuterVolumeSpecName: "config-data") pod "5375014e-2365-4042-910d-80304789f745" (UID: "5375014e-2365-4042-910d-80304789f745"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.093658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5375014e-2365-4042-910d-80304789f745" (UID: "5375014e-2365-4042-910d-80304789f745"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.106341 5030 scope.go:117] "RemoveContainer" containerID="be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.126984 5030 scope.go:117] "RemoveContainer" containerID="a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.149088 5030 scope.go:117] "RemoveContainer" containerID="be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f" Jan 20 23:36:45 crc kubenswrapper[5030]: E0120 23:36:45.149538 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f\": container with ID starting with be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f not found: ID does not exist" containerID="be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.149613 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f"} err="failed to get container status \"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f\": rpc error: code = NotFound desc = could not find container \"be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f\": container with ID starting with be1f7f5545bed54bae7ef435a6cffa9473ba0595e2d6a1b90f6bb99a4d364f7f not found: ID does not exist" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.149682 5030 scope.go:117] "RemoveContainer" containerID="a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5" Jan 20 23:36:45 crc kubenswrapper[5030]: E0120 23:36:45.150340 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5\": container with ID starting with a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5 not found: ID does not exist" containerID="a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.150384 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5"} err="failed to get container status \"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5\": rpc error: code = NotFound desc = could not find container \"a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5\": container with ID starting with a816050a5e76504f5866f59700c37515f868b675e71a4bcc8df4f55062164cb5 not found: ID does not exist" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.160850 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.160887 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stlp8\" (UniqueName: \"kubernetes.io/projected/5375014e-2365-4042-910d-80304789f745-kube-api-access-stlp8\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.160911 5030 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5375014e-2365-4042-910d-80304789f745-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.334941 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.346490 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.979223 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" path="/var/lib/kubelet/pods/9c05bad7-0581-4182-9c7b-7a790c22cf64/volumes" Jan 20 23:36:45 crc kubenswrapper[5030]: I0120 23:36:45.980346 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" path="/var/lib/kubelet/pods/ec706fe5-85b7-4580-9013-e39497640ab0/volumes" Jan 20 23:36:46 crc kubenswrapper[5030]: I0120 23:36:46.011474 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:36:46 crc kubenswrapper[5030]: I0120 23:36:46.011464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"5375014e-2365-4042-910d-80304789f745","Type":"ContainerDied","Data":"50b6f1be17c96f0d23d115377b8f9ff93b419cc1a3f1a0caf64eb2db81bb8e7e"} Jan 20 23:36:46 crc kubenswrapper[5030]: I0120 23:36:46.011716 5030 scope.go:117] "RemoveContainer" containerID="1b138305497a5bde823490fa7171b55993dfce48f09f99a115680485ae08ab9d" Jan 20 23:36:46 crc kubenswrapper[5030]: I0120 23:36:46.070224 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:36:46 crc kubenswrapper[5030]: I0120 23:36:46.084608 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:36:47 crc kubenswrapper[5030]: I0120 23:36:47.981444 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5375014e-2365-4042-910d-80304789f745" path="/var/lib/kubelet/pods/5375014e-2365-4042-910d-80304789f745/volumes" Jan 20 23:36:53 crc kubenswrapper[5030]: I0120 23:36:53.962073 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:36:53 crc kubenswrapper[5030]: E0120 23:36:53.963039 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.211179 5030 generic.go:334] "Generic (PLEG): container finished" podID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerID="5c43afbc3bd855975b6195fe0ab81ffca022d863f4ec31ef3dcc4088f902abf7" exitCode=137 Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.211805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"5c43afbc3bd855975b6195fe0ab81ffca022d863f4ec31ef3dcc4088f902abf7"} Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.212139 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f","Type":"ContainerDied","Data":"ee86ab1790b689950fea354926e10566d74ac66d70330658ee9e95ec735cc44f"} Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.212158 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee86ab1790b689950fea354926e10566d74ac66d70330658ee9e95ec735cc44f" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.390907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.417895 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.417987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift\") pod \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.418097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkswp\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp\") pod \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.418155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache\") pod \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.418217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock\") pod \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\" (UID: \"3239a82b-2e89-4f59-bfd7-a226ef7e9d2f\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.419673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache" (OuterVolumeSpecName: "cache") pod "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" (UID: "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.419731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock" (OuterVolumeSpecName: "lock") pod "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" (UID: "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.419876 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.419914 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.427956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" (UID: "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.428788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "swift") pod "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" (UID: "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.434006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp" (OuterVolumeSpecName: "kube-api-access-mkswp") pod "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" (UID: "3239a82b-2e89-4f59-bfd7-a226ef7e9d2f"). InnerVolumeSpecName "kube-api-access-mkswp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.522652 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.522682 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.522695 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkswp\" (UniqueName: \"kubernetes.io/projected/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f-kube-api-access-mkswp\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.553043 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.561465 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.623695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.623755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl72g\" (UniqueName: \"kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.623813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.623982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.624030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.624055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id\") pod \"12ef9991-6f38-4b76-a8e7-974ef5255254\" (UID: \"12ef9991-6f38-4b76-a8e7-974ef5255254\") " Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.624562 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.624712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.626664 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.627238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g" (OuterVolumeSpecName: "kube-api-access-nl72g") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "kube-api-access-nl72g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.627567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts" (OuterVolumeSpecName: "scripts") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.667066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.706024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data" (OuterVolumeSpecName: "config-data") pod "12ef9991-6f38-4b76-a8e7-974ef5255254" (UID: "12ef9991-6f38-4b76-a8e7-974ef5255254"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726453 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726487 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12ef9991-6f38-4b76-a8e7-974ef5255254-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726502 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726519 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl72g\" (UniqueName: \"kubernetes.io/projected/12ef9991-6f38-4b76-a8e7-974ef5255254-kube-api-access-nl72g\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726536 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:00 crc kubenswrapper[5030]: I0120 23:37:00.726547 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ef9991-6f38-4b76-a8e7-974ef5255254-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.231557 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerID="36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64" exitCode=137 Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.231928 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.231965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerDied","Data":"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64"} Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.232025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"12ef9991-6f38-4b76-a8e7-974ef5255254","Type":"ContainerDied","Data":"a86d9299fe6a4cdf4a78ed49c689e395c66ed9e01288c02768d5391bfd8715d0"} Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.232060 5030 scope.go:117] "RemoveContainer" containerID="d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.231941 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.282469 5030 scope.go:117] "RemoveContainer" containerID="36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.284269 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.291111 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.307866 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.313388 5030 scope.go:117] "RemoveContainer" containerID="d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0" Jan 20 23:37:01 crc kubenswrapper[5030]: E0120 23:37:01.313987 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0\": container with ID starting with d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0 not found: ID does not exist" containerID="d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.314195 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0"} err="failed to get container status \"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0\": rpc error: code = NotFound desc = could not find container \"d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0\": container with ID starting with d5c3b3be74c237646fe90af0b8e7960fcfb98b64d8f2fe17cbcb220246b846f0 not found: ID does not exist" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.314416 5030 scope.go:117] "RemoveContainer" containerID="36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.314294 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:37:01 crc kubenswrapper[5030]: E0120 23:37:01.315524 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64\": container with ID starting with 36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64 not found: ID does not exist" containerID="36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.315608 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64"} err="failed to get container status \"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64\": rpc error: code = NotFound desc = could not find container \"36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64\": container with ID starting with 36217ecd5110ba1cb9d103b09587b21f547638eb8c7ae3eca137279150a09f64 not found: ID does not exist" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.980107 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" path="/var/lib/kubelet/pods/12ef9991-6f38-4b76-a8e7-974ef5255254/volumes" Jan 20 23:37:01 crc kubenswrapper[5030]: I0120 23:37:01.982451 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" path="/var/lib/kubelet/pods/3239a82b-2e89-4f59-bfd7-a226ef7e9d2f/volumes" Jan 20 23:37:04 crc kubenswrapper[5030]: I0120 23:37:04.913348 5030 scope.go:117] "RemoveContainer" containerID="d972a5ba5335c79ee7f0e9a2ee9867a6a5817c5df455505e757a4059fd140323" Jan 20 23:37:04 crc kubenswrapper[5030]: I0120 23:37:04.970278 5030 scope.go:117] "RemoveContainer" containerID="76fbcef65dc0a7b0baa96f8d4bc2c60c0c4628b98902d125ac94e7d82e8fe890" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.041990 5030 scope.go:117] "RemoveContainer" containerID="1c6850eecdc5fa4c5a36b12834246a3043eb86f02c48821721d9c04dc239804a" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.099053 5030 scope.go:117] "RemoveContainer" containerID="3a8a67f49b51b49c3c0887455162bd0a450f3ed1781f0832075d5df233e94c00" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.152146 5030 scope.go:117] "RemoveContainer" containerID="34da10e01a3101e7713caaff1ac93588e6e463c974c61ea8cd351045e44cfd22" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.173046 5030 scope.go:117] "RemoveContainer" containerID="eb9b1c8eb31c7ba78de3e7a43882ff05111f1008063e4d7ca427acbb295af381" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.207602 5030 scope.go:117] "RemoveContainer" containerID="05d2c8ca19d2f726e4f7a14f77d3b2e23274d5e5e2654c81a841c51062f6366e" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.228225 5030 scope.go:117] "RemoveContainer" containerID="51aa64d4069629fa3a6cef0e85c67b56680900af8f99e9f962991a0d0dfaca45" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.259689 5030 scope.go:117] "RemoveContainer" containerID="c8a3ef2fef70b1b939dd1d84f145e5e6434e7cc730bbddeeb9840fb2dac557ec" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.291036 5030 scope.go:117] "RemoveContainer" containerID="7085afa79dd0d98549d5a88ded95ec6171a3544ee9fd3de70e24d0a912577f79" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.321212 5030 scope.go:117] "RemoveContainer" 
containerID="75bdd4500474fa14aff48f8007f5e9d67c778fd39b444260ca7c21a9d0b54539" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.353748 5030 scope.go:117] "RemoveContainer" containerID="9d227bee93ee3810f8bf8c42e3725c916f0abced2be65bffcd089b41e0c07bac" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.394686 5030 scope.go:117] "RemoveContainer" containerID="58c6f323880abc0cbf7d43a0533adcf8ff481ae18da8b64a1cdce69e82b6174f" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.423268 5030 scope.go:117] "RemoveContainer" containerID="769e9e12ef6c3bc11d8c7227a59831aaf4beb7eb11f3328fed8e63f694d38681" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.454174 5030 scope.go:117] "RemoveContainer" containerID="7185c5fb05b2a58b278c176c1f7c6feba52b9f23788ceb4dae8035f37f075ebe" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.485498 5030 scope.go:117] "RemoveContainer" containerID="d8018bf3bade68fa64b2765cfd6560c427fb070f2f81ee36252a4716114b59ee" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.512078 5030 scope.go:117] "RemoveContainer" containerID="4ebfc1d04aa89c829d06fdef5b71f0d7b4ef8876a9b3d81c860e269d91b40c9c" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.576496 5030 scope.go:117] "RemoveContainer" containerID="06deebca81c94f49e91bdb53421c7e64e98a23a1d53f80d9395798d2657ef569" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.605864 5030 scope.go:117] "RemoveContainer" containerID="3bfb3d28c0e12ae18efaa28d415459489a4a00796c95465b198516c16c2dc578" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.645098 5030 scope.go:117] "RemoveContainer" containerID="d465c7355fb972c93c904f55d8b5006846ff351d0ce18969da3c146b863c1b94" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.683285 5030 scope.go:117] "RemoveContainer" containerID="c54f67d7ddbd7c4c1aee630494153ae6c202ce4616219d0db07a4f4bb1e257f7" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.715331 5030 scope.go:117] "RemoveContainer" containerID="25854f2a2d04afcd6032d297351aac945108d95777f369400932fa4d630807df" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.740164 5030 scope.go:117] "RemoveContainer" containerID="9fd232b7659b789db5ad4d7e43b1a3850e70a4da7a82b8d138d56c50052f9690" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.775222 5030 scope.go:117] "RemoveContainer" containerID="0bf73260889f3156f80b4c12bb900ddb99d6d59bc8af041c63fd69a91e15ccea" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.807882 5030 scope.go:117] "RemoveContainer" containerID="1e5eb2657b6e29da14a81671a285de70380a94c6019bdd99e6633fb362695c1d" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.836298 5030 scope.go:117] "RemoveContainer" containerID="1e78519247299b9a68e9bc1aea8273f93f583e1bcef928b636dee432adb3ccd8" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.866723 5030 scope.go:117] "RemoveContainer" containerID="d9107a0eac389bd83b0b5be0fa014626ef904c92ed31eaab47cade0dbda1a139" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.909316 5030 scope.go:117] "RemoveContainer" containerID="e7c6be4bd549d34d96b35ea32ddf2d9e56b7140e18ff6672de159cff243a514f" Jan 20 23:37:05 crc kubenswrapper[5030]: I0120 23:37:05.938025 5030 scope.go:117] "RemoveContainer" containerID="04d68748733c2310e34f95b5220a0dfe6edb41cdacc8760a24872f073e4c79f0" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.033278 5030 scope.go:117] "RemoveContainer" containerID="b67b792123403768573a8ebd3b75c1e265d1b457365b9957bcba8f5d6b593839" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.082827 5030 
scope.go:117] "RemoveContainer" containerID="fd760ce4c516eadaa45a00279c67df1a2875779ca50c76f040476cc90657dcdd" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.113405 5030 scope.go:117] "RemoveContainer" containerID="cba3263a6508fedaf28776583fb352f8fcd4acf74a5a4acef1cadfc0e9b2be2d" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.152799 5030 scope.go:117] "RemoveContainer" containerID="9fd52bc2e65925f44682cc313398c8fe0670f551130374c8775dcfb43a8ae0b8" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.178594 5030 scope.go:117] "RemoveContainer" containerID="04816dfe20b62b8e232c335ff1b70ffffcc8b5ac48f2109858f796af3749054b" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.218680 5030 scope.go:117] "RemoveContainer" containerID="6d02ba4f740c40f15ed8b65911dd184538b19b9aa75c2e5567cbec252abecfc4" Jan 20 23:37:06 crc kubenswrapper[5030]: I0120 23:37:06.246155 5030 scope.go:117] "RemoveContainer" containerID="e68403552d73e628a0c63064a45243baf0e1b9e7e0d20930a30f25654c3a5e58" Jan 20 23:37:08 crc kubenswrapper[5030]: I0120 23:37:08.962190 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:37:08 crc kubenswrapper[5030]: E0120 23:37:08.962978 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:13 crc kubenswrapper[5030]: I0120 23:37:13.924937 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-vnjr9"] Jan 20 23:37:13 crc kubenswrapper[5030]: I0120 23:37:13.937380 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-vnjr9"] Jan 20 23:37:13 crc kubenswrapper[5030]: I0120 23:37:13.979282 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e110f4c-8a24-42d1-b480-87fa2452f2bd" path="/var/lib/kubelet/pods/3e110f4c-8a24-42d1-b480-87fa2452f2bd/volumes" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108299 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-t7v2j"] Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108737 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108768 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108792 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108806 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-server" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108827 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108841 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" 
containerName="container-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108865 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108877 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108899 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.108913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.108929 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="probe" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109279 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="probe" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109308 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109424 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-notification-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-notification-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109455 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-expirer" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109490 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-expirer" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109511 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109522 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109543 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109592 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109607 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109840 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109871 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109884 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109902 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="rsync" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109914 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="rsync" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109932 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109958 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-reaper" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109971 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-reaper" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.109987 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.109999 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110022 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="rabbitmq" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110035 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="rabbitmq" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110054 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110066 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110081 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110092 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110108 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="init" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110120 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="init" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110134 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110167 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110194 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110224 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110236 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-server" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110260 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3774e65-1783-48d9-af7d-3a9964b609ad" containerName="kube-state-metrics" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110272 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3774e65-1783-48d9-af7d-3a9964b609ad" containerName="kube-state-metrics" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110296 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110311 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110327 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110339 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110359 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" containerName="keystone-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110371 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" containerName="keystone-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110383 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 
23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110411 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110422 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110442 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-central-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110454 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-central-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110489 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110509 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110521 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110544 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110573 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110586 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110605 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110654 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110671 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="sg-core" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110685 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="sg-core" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110709 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110724 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110749 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110764 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110788 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" containerName="memcached" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110804 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" containerName="memcached" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110821 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="mysql-bootstrap" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110836 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="mysql-bootstrap" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110859 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110870 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110886 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110901 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-server" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110922 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110936 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110967 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.110980 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.110996 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="rabbitmq" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111007 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="rabbitmq" Jan 20 23:37:14 crc 
kubenswrapper[5030]: E0120 23:37:14.111021 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111032 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="galera" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111059 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="galera" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111074 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111086 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111098 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111109 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111134 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111146 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111167 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="dnsmasq-dns" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111182 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="dnsmasq-dns" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111211 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="swift-recon-cron" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111228 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="swift-recon-cron" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="setup-container" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111269 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="setup-container" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111285 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker-log" Jan 20 23:37:14 crc 
kubenswrapper[5030]: E0120 23:37:14.111311 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="mysql-bootstrap" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111323 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="mysql-bootstrap" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111350 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111371 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111400 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111428 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="setup-container" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111440 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="setup-container" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111458 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111498 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111510 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111522 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111542 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="galera" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111554 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="galera" Jan 20 
23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111576 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="openstack-network-exporter" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111593 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="openstack-network-exporter" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111612 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111666 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-server" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111688 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111704 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: E0120 23:37:14.111720 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.111737 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112054 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cd7474-d970-4c0b-9c5d-ed4dfd2fd95c" containerName="rabbitmq" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112099 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112118 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112143 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112185 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112200 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112214 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-expirer" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112247 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112260 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112274 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="openstack-network-exporter" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112294 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112310 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc222012-854c-45e8-8ef9-e87ac905ca13" containerName="mariadb-account-create-update" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112335 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e1dbe4-f672-4c53-9335-6b1bb4f03ee0" containerName="dnsmasq-dns" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112361 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3774e65-1783-48d9-af7d-3a9964b609ad" containerName="kube-state-metrics" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112388 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="sg-core" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112404 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112433 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112461 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112481 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112504 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112523 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a832d1c9-4838-4070-ad8d-e2af76d8ca47" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112549 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112696 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112723 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112736 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112756 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-server" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112769 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd0d4f5-ff02-42ff-9cb6-d3b9c3ef7978" containerName="memcached" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112788 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112807 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7faddbb-53f4-4f27-a853-0751049feafd" containerName="cinder-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112823 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-replicator" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112842 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="swift-recon-cron" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112863 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="707515d1-17d7-47ff-be95-73414a6a2a95" containerName="rabbitmq" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112877 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d99a5c-b837-4fa2-a490-a9c79bcd0cf9" containerName="galera" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b0e7253-4a57-42d1-bbc3-768eeb755113" containerName="ovn-northd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112908 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d23d8f1-6699-42f6-8152-362964a6a9dc" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112935 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2a94b7-b322-4de7-b55d-80c87895ab2d" containerName="barbican-worker" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.112990 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113007 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3850fc-0bf6-489d-8a8f-4a72ad423cce" containerName="galera" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113021 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bc1efe-faed-4290-a85c-c631a91f02ca" containerName="barbican-keystone-listener" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113040 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16aae85a-a502-4b2e-9663-e8c50d01f657" containerName="nova-api-log" Jan 20 23:37:14 crc 
kubenswrapper[5030]: I0120 23:37:14.113060 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="probe" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113072 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-central-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113088 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec706fe5-85b7-4580-9013-e39497640ab0" containerName="neutron-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113108 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113121 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113134 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ef9991-6f38-4b76-a8e7-974ef5255254" containerName="cinder-scheduler" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113151 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="ceilometer-notification-agent" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113169 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113186 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113201 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebab4fc9-eaee-431c-94e3-0b5ecf5ea5e2" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113219 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43061a58-a31e-4ced-836f-b6a6badff81f" containerName="proxy-httpd" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113233 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0966335e-4d7c-4eb4-ba9b-b80a40b4fb15" containerName="nova-metadata-metadata" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6ac7c2-5294-401a-a16d-0821b044e7f8" containerName="glance-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113266 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-auditor" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113280 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0979c96-fbd8-4e5b-a22b-35c1cae6e6da" containerName="keystone-api" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113295 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f18e53-0386-4e06-89f6-a7d30f880808" containerName="barbican-api-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113310 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="object-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 
23:37:14.113327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="rsync" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113339 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="account-reaper" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113354 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d310ab9-3091-41d2-967b-a5c4c3b5e7bb" containerName="placement-log" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.113366 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3239a82b-2e89-4f59-bfd7-a226ef7e9d2f" containerName="container-updater" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.114254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.116874 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.118996 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.119290 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.119558 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.137411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7v2j"] Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.264184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfxct\" (UniqueName: \"kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.264270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.264890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.367223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.367428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfxct\" (UniqueName: 
\"kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.367485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.367715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.368774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.402228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfxct\" (UniqueName: \"kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct\") pod \"crc-storage-crc-t7v2j\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.453176 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.967390 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-t7v2j"] Jan 20 23:37:14 crc kubenswrapper[5030]: I0120 23:37:14.982269 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:37:15 crc kubenswrapper[5030]: I0120 23:37:15.475215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7v2j" event={"ID":"6aeec0cb-10a5-47b6-8046-2b089c0b81a9","Type":"ContainerStarted","Data":"82ea9a1ed9d550f3f98e5093c63fe30f2bcab73aa690c4fc2c5b13c157221cb2"} Jan 20 23:37:16 crc kubenswrapper[5030]: I0120 23:37:16.503267 5030 generic.go:334] "Generic (PLEG): container finished" podID="6aeec0cb-10a5-47b6-8046-2b089c0b81a9" containerID="cbd8c53c0c338082933ba2b93a1ac6d72c5a94a590009ae090199f4a21ef75f5" exitCode=0 Jan 20 23:37:16 crc kubenswrapper[5030]: I0120 23:37:16.503369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7v2j" event={"ID":"6aeec0cb-10a5-47b6-8046-2b089c0b81a9","Type":"ContainerDied","Data":"cbd8c53c0c338082933ba2b93a1ac6d72c5a94a590009ae090199f4a21ef75f5"} Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.882656 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.926263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt\") pod \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.926349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage\") pod \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.926395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfxct\" (UniqueName: \"kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct\") pod \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\" (UID: \"6aeec0cb-10a5-47b6-8046-2b089c0b81a9\") " Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.926412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6aeec0cb-10a5-47b6-8046-2b089c0b81a9" (UID: "6aeec0cb-10a5-47b6-8046-2b089c0b81a9"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.926782 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.932199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct" (OuterVolumeSpecName: "kube-api-access-vfxct") pod "6aeec0cb-10a5-47b6-8046-2b089c0b81a9" (UID: "6aeec0cb-10a5-47b6-8046-2b089c0b81a9"). InnerVolumeSpecName "kube-api-access-vfxct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:37:17 crc kubenswrapper[5030]: I0120 23:37:17.957909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6aeec0cb-10a5-47b6-8046-2b089c0b81a9" (UID: "6aeec0cb-10a5-47b6-8046-2b089c0b81a9"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:37:18 crc kubenswrapper[5030]: I0120 23:37:18.028086 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:18 crc kubenswrapper[5030]: I0120 23:37:18.028138 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfxct\" (UniqueName: \"kubernetes.io/projected/6aeec0cb-10a5-47b6-8046-2b089c0b81a9-kube-api-access-vfxct\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:18 crc kubenswrapper[5030]: I0120 23:37:18.532612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-t7v2j" event={"ID":"6aeec0cb-10a5-47b6-8046-2b089c0b81a9","Type":"ContainerDied","Data":"82ea9a1ed9d550f3f98e5093c63fe30f2bcab73aa690c4fc2c5b13c157221cb2"} Jan 20 23:37:18 crc kubenswrapper[5030]: I0120 23:37:18.532731 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ea9a1ed9d550f3f98e5093c63fe30f2bcab73aa690c4fc2c5b13c157221cb2" Jan 20 23:37:18 crc kubenswrapper[5030]: I0120 23:37:18.532777 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-t7v2j" Jan 20 23:37:20 crc kubenswrapper[5030]: I0120 23:37:20.962165 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:37:20 crc kubenswrapper[5030]: E0120 23:37:20.962981 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.545493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-t7v2j"] Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.550540 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-t7v2j"] Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.719354 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jg5mc"] Jan 20 23:37:21 crc kubenswrapper[5030]: E0120 23:37:21.719957 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.720001 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5375014e-2365-4042-910d-80304789f745" containerName="nova-scheduler-scheduler" Jan 20 23:37:21 crc kubenswrapper[5030]: E0120 23:37:21.720044 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.720063 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f32a70-a447-408f-99e0-90075753a304" containerName="nova-cell1-conductor-conductor" Jan 20 23:37:21 crc kubenswrapper[5030]: E0120 23:37:21.720087 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aeec0cb-10a5-47b6-8046-2b089c0b81a9" containerName="storage" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 
23:37:21.720109 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aeec0cb-10a5-47b6-8046-2b089c0b81a9" containerName="storage" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.746606 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aeec0cb-10a5-47b6-8046-2b089c0b81a9" containerName="storage" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.746965 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c05bad7-0581-4182-9c7b-7a790c22cf64" containerName="nova-cell0-conductor-conductor" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.754779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jg5mc"] Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.754935 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.758778 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.759730 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.761292 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.761587 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.791084 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.791297 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2ff\" (UniqueName: \"kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.791376 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.894080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.894167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw2ff\" (UniqueName: \"kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: 
I0120 23:37:21.894218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.894814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.894996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.927005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2ff\" (UniqueName: \"kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff\") pod \"crc-storage-crc-jg5mc\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:21 crc kubenswrapper[5030]: I0120 23:37:21.978428 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aeec0cb-10a5-47b6-8046-2b089c0b81a9" path="/var/lib/kubelet/pods/6aeec0cb-10a5-47b6-8046-2b089c0b81a9/volumes" Jan 20 23:37:22 crc kubenswrapper[5030]: I0120 23:37:22.093251 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:22 crc kubenswrapper[5030]: I0120 23:37:22.365319 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jg5mc"] Jan 20 23:37:22 crc kubenswrapper[5030]: I0120 23:37:22.575384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jg5mc" event={"ID":"fcfcd441-8232-4cc7-b794-e52c18eb1b72","Type":"ContainerStarted","Data":"74183558fdfd9db3e34aab1d6f1199a55cc7434e8a3d5b7b775bd3f6f9314e88"} Jan 20 23:37:23 crc kubenswrapper[5030]: I0120 23:37:23.589372 5030 generic.go:334] "Generic (PLEG): container finished" podID="fcfcd441-8232-4cc7-b794-e52c18eb1b72" containerID="172ef371bccaa04eddfc89e039a269b815a0129e68773a1d14b0d92242d7b805" exitCode=0 Jan 20 23:37:23 crc kubenswrapper[5030]: I0120 23:37:23.589441 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jg5mc" event={"ID":"fcfcd441-8232-4cc7-b794-e52c18eb1b72","Type":"ContainerDied","Data":"172ef371bccaa04eddfc89e039a269b815a0129e68773a1d14b0d92242d7b805"} Jan 20 23:37:24 crc kubenswrapper[5030]: I0120 23:37:24.990411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.148429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt\") pod \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.148510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fcfcd441-8232-4cc7-b794-e52c18eb1b72" (UID: "fcfcd441-8232-4cc7-b794-e52c18eb1b72"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.148533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage\") pod \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.148857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw2ff\" (UniqueName: \"kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff\") pod \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\" (UID: \"fcfcd441-8232-4cc7-b794-e52c18eb1b72\") " Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.150450 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fcfcd441-8232-4cc7-b794-e52c18eb1b72-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.155948 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff" (OuterVolumeSpecName: "kube-api-access-zw2ff") pod "fcfcd441-8232-4cc7-b794-e52c18eb1b72" (UID: "fcfcd441-8232-4cc7-b794-e52c18eb1b72"). InnerVolumeSpecName "kube-api-access-zw2ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.171871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fcfcd441-8232-4cc7-b794-e52c18eb1b72" (UID: "fcfcd441-8232-4cc7-b794-e52c18eb1b72"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.251870 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw2ff\" (UniqueName: \"kubernetes.io/projected/fcfcd441-8232-4cc7-b794-e52c18eb1b72-kube-api-access-zw2ff\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.251902 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fcfcd441-8232-4cc7-b794-e52c18eb1b72-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.613965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jg5mc" event={"ID":"fcfcd441-8232-4cc7-b794-e52c18eb1b72","Type":"ContainerDied","Data":"74183558fdfd9db3e34aab1d6f1199a55cc7434e8a3d5b7b775bd3f6f9314e88"} Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.614023 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74183558fdfd9db3e34aab1d6f1199a55cc7434e8a3d5b7b775bd3f6f9314e88" Jan 20 23:37:25 crc kubenswrapper[5030]: I0120 23:37:25.614075 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jg5mc" Jan 20 23:37:32 crc kubenswrapper[5030]: I0120 23:37:32.962871 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:37:32 crc kubenswrapper[5030]: E0120 23:37:32.964099 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.869844 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:37:38 crc kubenswrapper[5030]: E0120 23:37:38.871072 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcfcd441-8232-4cc7-b794-e52c18eb1b72" containerName="storage" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.871102 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcfcd441-8232-4cc7-b794-e52c18eb1b72" containerName="storage" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.871391 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcfcd441-8232-4cc7-b794-e52c18eb1b72" containerName="storage" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.872733 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.875592 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.876106 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.877059 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.877169 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.877787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.878009 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-q8qdj" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.883987 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.893138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987406 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987462 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: 
I0120 23:37:38.987543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wllkk\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:38 crc kubenswrapper[5030]: I0120 23:37:38.987742 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.089399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.089460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.089497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.089538 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090271 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllkk\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090477 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.090974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.091660 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.092175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.092687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.098224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.098285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.098235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.100435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.121164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllkk\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 
23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.125579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.209943 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.347522 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.349612 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.351946 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.351998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.352269 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.352414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.352525 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.352915 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-r82f2" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.353155 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.375825 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.495984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496407 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lksg\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.496611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lksg\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.598959 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.599543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.599992 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.600536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.601780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.602046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.605156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.605788 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.607387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.607717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.623316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lksg\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.634796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.679138 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.717581 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:37:39 crc kubenswrapper[5030]: I0120 23:37:39.788299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerStarted","Data":"5476b5c63e00b4cc5e3431d604dd24030771d82ed1e2154221e46902839d3f72"} Jan 20 23:37:40 crc kubenswrapper[5030]: I0120 23:37:40.155768 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:37:40 crc kubenswrapper[5030]: I0120 23:37:40.800738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerStarted","Data":"51acff2a27d331282d3d81f29d33a4a74b9a86ceb3192ef5404c8ac742bb6061"} Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.048106 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.049929 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.053699 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.054214 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.055828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-j9wjx" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.057747 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.070600 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.079102 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxl5\" (UniqueName: \"kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231790 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.231843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxl5\" (UniqueName: \"kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.333904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc 
kubenswrapper[5030]: I0120 23:37:41.333948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.334524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.334642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.334831 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.335464 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.335852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.339220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.340405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.354342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxl5\" (UniqueName: \"kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.369713 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.669956 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.814296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerStarted","Data":"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5"} Jan 20 23:37:41 crc kubenswrapper[5030]: I0120 23:37:41.816710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerStarted","Data":"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a"} Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.145199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:37:42 crc kubenswrapper[5030]: W0120 23:37:42.146473 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f6aa7c_dc7c_4aa6_8e0f_ea81ca443a34.slice/crio-24377ead099d46b1d905df90aef759a74af9cd5bdb7a7fe83f91ab527bfa1cdd WatchSource:0}: Error finding container 24377ead099d46b1d905df90aef759a74af9cd5bdb7a7fe83f91ab527bfa1cdd: Status 404 returned error can't find the container with id 24377ead099d46b1d905df90aef759a74af9cd5bdb7a7fe83f91ab527bfa1cdd Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.344287 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.345593 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.348923 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.350603 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.351455 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rjg7g" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.352255 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.376580 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.450611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.450803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.450851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.450871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.450914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngv9k\" (UniqueName: \"kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.451046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.451089 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.451133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.552969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.553070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngv9k\" (UniqueName: \"kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc 
kubenswrapper[5030]: I0120 23:37:42.553187 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.553365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.553385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.553493 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.555998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.556705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.556955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.570601 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.577317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngv9k\" (UniqueName: \"kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc 
kubenswrapper[5030]: I0120 23:37:42.582833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.619349 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.620542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.621954 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.625227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-bmf9h" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.625415 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.636845 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.711459 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.755595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.755887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.756010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.756082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.756148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc 
kubenswrapper[5030]: I0120 23:37:42.831407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerStarted","Data":"354ef07e84ccce6c737aee025f3ecd29b588ce56c49537c778770bfa8330eb07"} Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.831450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerStarted","Data":"24377ead099d46b1d905df90aef759a74af9cd5bdb7a7fe83f91ab527bfa1cdd"} Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.858300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.858383 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.858412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.858440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.858515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.860293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.860321 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.863754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 
crc kubenswrapper[5030]: I0120 23:37:42.866909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.880963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q\") pod \"memcached-0\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:42 crc kubenswrapper[5030]: I0120 23:37:42.951544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.139436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:37:43 crc kubenswrapper[5030]: W0120 23:37:43.144292 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod410c6d7e_0366_4932_b95a_4e9624eadf71.slice/crio-53deed7c2df75d0d2f1773031ab76255788fc76bd82624f1bc4f084ab9818c8a WatchSource:0}: Error finding container 53deed7c2df75d0d2f1773031ab76255788fc76bd82624f1bc4f084ab9818c8a: Status 404 returned error can't find the container with id 53deed7c2df75d0d2f1773031ab76255788fc76bd82624f1bc4f084ab9818c8a Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.379579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:37:43 crc kubenswrapper[5030]: W0120 23:37:43.390790 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d833c2_fa91_4538_a2e6_d42190e6fb9e.slice/crio-c10240d067c7a3f65a3b35858f4c1a56e2627e7ca5bde6937d9b7053f3faccdb WatchSource:0}: Error finding container c10240d067c7a3f65a3b35858f4c1a56e2627e7ca5bde6937d9b7053f3faccdb: Status 404 returned error can't find the container with id c10240d067c7a3f65a3b35858f4c1a56e2627e7ca5bde6937d9b7053f3faccdb Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.839325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerStarted","Data":"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f"} Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.839370 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerStarted","Data":"53deed7c2df75d0d2f1773031ab76255788fc76bd82624f1bc4f084ab9818c8a"} Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.841643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"f6d833c2-fa91-4538-a2e6-d42190e6fb9e","Type":"ContainerStarted","Data":"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706"} Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.841692 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.841729 5030 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"f6d833c2-fa91-4538-a2e6-d42190e6fb9e","Type":"ContainerStarted","Data":"c10240d067c7a3f65a3b35858f4c1a56e2627e7ca5bde6937d9b7053f3faccdb"} Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.876801 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=1.876779924 podStartE2EDuration="1.876779924s" podCreationTimestamp="2026-01-20 23:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:43.875907302 +0000 UTC m=+3736.196167600" watchObservedRunningTime="2026-01-20 23:37:43.876779924 +0000 UTC m=+3736.197040222" Jan 20 23:37:43 crc kubenswrapper[5030]: I0120 23:37:43.962456 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:37:43 crc kubenswrapper[5030]: E0120 23:37:43.962690 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.146123 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.147366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.151075 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-4clvl" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.161017 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.186184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjn6n\" (UniqueName: \"kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n\") pod \"kube-state-metrics-0\" (UID: \"5c398695-7b05-43bd-b76f-68e6a9e85605\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.287866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjn6n\" (UniqueName: \"kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n\") pod \"kube-state-metrics-0\" (UID: \"5c398695-7b05-43bd-b76f-68e6a9e85605\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.308269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjn6n\" (UniqueName: \"kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n\") pod \"kube-state-metrics-0\" (UID: \"5c398695-7b05-43bd-b76f-68e6a9e85605\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.476058 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:44 crc kubenswrapper[5030]: I0120 23:37:44.923750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.860226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"5c398695-7b05-43bd-b76f-68e6a9e85605","Type":"ContainerStarted","Data":"0c04d25d07c9c7cb76b3dc5a59950a742d760a872cb02a14c5668b1b2c745c9a"} Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.860668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"5c398695-7b05-43bd-b76f-68e6a9e85605","Type":"ContainerStarted","Data":"b4fab6437f1ec783dbfc613bcf6953f2a0c8fca88aa66a2d77d00f7aeafb5525"} Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.860698 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.863211 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerID="354ef07e84ccce6c737aee025f3ecd29b588ce56c49537c778770bfa8330eb07" exitCode=0 Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.863293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerDied","Data":"354ef07e84ccce6c737aee025f3ecd29b588ce56c49537c778770bfa8330eb07"} Jan 20 23:37:45 crc kubenswrapper[5030]: I0120 23:37:45.883207 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.518340448 podStartE2EDuration="1.88318254s" podCreationTimestamp="2026-01-20 23:37:44 +0000 UTC" firstStartedPulling="2026-01-20 23:37:44.936368961 +0000 UTC m=+3737.256629249" lastFinishedPulling="2026-01-20 23:37:45.301211053 +0000 UTC m=+3737.621471341" observedRunningTime="2026-01-20 23:37:45.88318207 +0000 UTC m=+3738.203442388" watchObservedRunningTime="2026-01-20 23:37:45.88318254 +0000 UTC m=+3738.203442848" Jan 20 23:37:46 crc kubenswrapper[5030]: I0120 23:37:46.877788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerStarted","Data":"bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae"} Jan 20 23:37:46 crc kubenswrapper[5030]: I0120 23:37:46.913314 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.913283354 podStartE2EDuration="6.913283354s" podCreationTimestamp="2026-01-20 23:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:46.906936401 +0000 UTC m=+3739.227196719" watchObservedRunningTime="2026-01-20 23:37:46.913283354 +0000 UTC m=+3739.233543662" Jan 20 23:37:47 crc kubenswrapper[5030]: I0120 23:37:47.892341 5030 generic.go:334] "Generic (PLEG): container finished" podID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerID="e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f" exitCode=0 Jan 20 23:37:47 crc kubenswrapper[5030]: I0120 23:37:47.892406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerDied","Data":"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f"} Jan 20 23:37:48 crc kubenswrapper[5030]: I0120 23:37:48.902180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerStarted","Data":"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411"} Jan 20 23:37:48 crc kubenswrapper[5030]: I0120 23:37:48.931119 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.9311007369999995 podStartE2EDuration="7.931100737s" podCreationTimestamp="2026-01-20 23:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:48.922455548 +0000 UTC m=+3741.242715826" watchObservedRunningTime="2026-01-20 23:37:48.931100737 +0000 UTC m=+3741.251361025" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.886102 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.888952 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.893194 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.893236 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.895003 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.895383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.896695 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-m2qb9" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.897664 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.981382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.981778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.981826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks65s\" 
(UniqueName: \"kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.981858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.981999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.982048 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.982077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:49 crc kubenswrapper[5030]: I0120 23:37:49.982109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.083552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.083990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks65s\" (UniqueName: \"kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config\") pod 
\"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.084801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.085208 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.085993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.087016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.087471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.094666 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.095357 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.095697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.114507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks65s\" (UniqueName: \"kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.122765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.154987 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.157766 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.165351 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.165811 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.166168 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.166605 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-7s6fz" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.168201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.208155 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.286890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.286968 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtw9s\" (UniqueName: \"kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.287294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtw9s\" (UniqueName: \"kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.389874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.392134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.392811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.392970 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.393392 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.396484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.397313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.407108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.410198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtw9s\" (UniqueName: \"kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.414988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.519235 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.688274 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:37:50 crc kubenswrapper[5030]: W0120 23:37:50.695755 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaf55cf8_5e84_44cc_afbb_8e8af7aa0a3f.slice/crio-51ad7e4068ed79b67273eb094e8facdd2b100873403720679fd4cb08f933f72f WatchSource:0}: Error finding container 51ad7e4068ed79b67273eb094e8facdd2b100873403720679fd4cb08f933f72f: Status 404 returned error can't find the container with id 51ad7e4068ed79b67273eb094e8facdd2b100873403720679fd4cb08f933f72f Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.925744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerStarted","Data":"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160"} Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.926049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerStarted","Data":"51ad7e4068ed79b67273eb094e8facdd2b100873403720679fd4cb08f933f72f"} Jan 20 23:37:50 crc kubenswrapper[5030]: I0120 23:37:50.982487 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:37:50 crc kubenswrapper[5030]: W0120 23:37:50.993808 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod565f8fbe_5c4e_4cf4_8a18_0930a8595306.slice/crio-bbf5f25f66736fe1bf9ce2fa2d91a62e77af1012e7410952ed6eb2d1d9bcae94 WatchSource:0}: Error finding container bbf5f25f66736fe1bf9ce2fa2d91a62e77af1012e7410952ed6eb2d1d9bcae94: Status 404 returned error can't find the container with id bbf5f25f66736fe1bf9ce2fa2d91a62e77af1012e7410952ed6eb2d1d9bcae94 Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.671040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.671846 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.939228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerStarted","Data":"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e"} Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.939325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerStarted","Data":"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45"} Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.939356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerStarted","Data":"bbf5f25f66736fe1bf9ce2fa2d91a62e77af1012e7410952ed6eb2d1d9bcae94"} Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.942181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerStarted","Data":"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879"} Jan 20 23:37:51 crc kubenswrapper[5030]: I0120 23:37:51.976088 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.976069624 podStartE2EDuration="2.976069624s" podCreationTimestamp="2026-01-20 23:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:51.974790513 +0000 UTC m=+3744.295050881" watchObservedRunningTime="2026-01-20 23:37:51.976069624 +0000 UTC m=+3744.296329922" Jan 20 23:37:52 crc kubenswrapper[5030]: I0120 23:37:52.012423 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=4.012390563 podStartE2EDuration="4.012390563s" podCreationTimestamp="2026-01-20 23:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:52.004296007 +0000 UTC m=+3744.324556385" watchObservedRunningTime="2026-01-20 23:37:52.012390563 +0000 UTC m=+3744.332650901" Jan 20 23:37:52 crc kubenswrapper[5030]: I0120 23:37:52.712323 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:52 crc kubenswrapper[5030]: I0120 23:37:52.712987 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:52 crc kubenswrapper[5030]: I0120 23:37:52.953743 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:37:53 crc kubenswrapper[5030]: I0120 23:37:53.209299 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:53 crc kubenswrapper[5030]: I0120 23:37:53.272118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:53 crc kubenswrapper[5030]: I0120 23:37:53.519944 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:53 crc kubenswrapper[5030]: I0120 23:37:53.979110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:54 crc kubenswrapper[5030]: I0120 23:37:54.072139 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:54 crc kubenswrapper[5030]: I0120 23:37:54.174904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:37:54 crc kubenswrapper[5030]: I0120 23:37:54.481081 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:37:54 crc kubenswrapper[5030]: I0120 23:37:54.962850 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:37:54 crc kubenswrapper[5030]: E0120 23:37:54.963301 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.143674 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.237254 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.270642 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.520603 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.562937 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.572702 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.575958 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.575965 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.576096 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.576152 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-jqpp4" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.585510 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.678093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.678151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.678175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.678207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kgx9c\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.678297 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.779856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.780173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.780199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.780228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgx9c\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.780318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: E0120 23:37:55.781026 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:37:55 crc kubenswrapper[5030]: E0120 23:37:55.781109 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.781183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.781257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 
23:37:55 crc kubenswrapper[5030]: E0120 23:37:55.781320 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift podName:1fa80a86-a911-4a89-9186-520189e4cc88 nodeName:}" failed. No retries permitted until 2026-01-20 23:37:56.281199461 +0000 UTC m=+3748.601459749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift") pod "swift-storage-0" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88") : configmap "swift-ring-files" not found Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.781349 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") device mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.802723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.810933 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgx9c\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.926677 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mphbf"] Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.927813 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.931721 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.931729 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.931985 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.940325 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mphbf"] Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8g7\" (UniqueName: \"kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:55 crc kubenswrapper[5030]: I0120 23:37:55.983850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices\") pod 
\"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx8g7\" (UniqueName: \"kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.085984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.086335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift\") pod \"swift-ring-rebalance-mphbf\" (UID: 
\"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.086571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.090164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.090404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.092024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.102407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx8g7\" (UniqueName: \"kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7\") pod \"swift-ring-rebalance-mphbf\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.243960 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.288479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:56 crc kubenswrapper[5030]: E0120 23:37:56.288703 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:37:56 crc kubenswrapper[5030]: E0120 23:37:56.288921 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:37:56 crc kubenswrapper[5030]: E0120 23:37:56.288992 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift podName:1fa80a86-a911-4a89-9186-520189e4cc88 nodeName:}" failed. No retries permitted until 2026-01-20 23:37:57.288970913 +0000 UTC m=+3749.609231201 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift") pod "swift-storage-0" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88") : configmap "swift-ring-files" not found Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.566320 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.614019 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.666468 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mphbf"] Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.762979 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.764550 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.767143 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.767168 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-l9mtr" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.767320 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.770638 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.771987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" 
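The nestedpendingoperations.go:348 entries above for swift-storage-0 record kubelet backing off the failed "etc-swift" projected-volume mount while the "swift-ring-files" ConfigMap does not yet exist: each failed attempt is rescheduled with a doubled durationBeforeRetry (500ms, then 1s, then 2s, and so on in the later entries). The following is a minimal sketch for pulling those retry delays and error messages out of entries in exactly this format; the script, its regular expression, and the parse_backoff name are illustrative assumptions added here and are not part of the captured log.

import re

# Minimal sketch (assumption, not part of the captured log): extract the retry delay and
# error message from kubelet "No retries permitted until ... (durationBeforeRetry ...)"
# entries such as the nestedpendingoperations.go:348 lines above for swift-storage-0.
RETRY_RE = re.compile(
    r'No retries permitted until .*? '
    r'\(durationBeforeRetry (?P<delay>[0-9a-z.]+)\)\.\s*Error: (?P<error>.*)'
)

def parse_backoff(lines):
    """Yield (delay, error) pairs for each backed-off volume operation found in the log."""
    for line in lines:
        m = RETRY_RE.search(line)
        if m:
            yield m.group("delay"), m.group("error").strip()

if __name__ == "__main__":
    sample = ('E0120 23:37:55.781320 5030 nestedpendingoperations.go:348] Operation for '
              '"{volumeName:kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift '
              'podName:1fa80a86-a911-4a89-9186-520189e4cc88 nodeName:}" failed. No retries permitted '
              'until 2026-01-20 23:37:56.281199461 +0000 UTC m=+3748.601459749 (durationBeforeRetry '
              '500ms). Error: MountVolume.SetUp failed for volume "etc-swift" : '
              'configmap "swift-ring-files" not found')
    for delay, err in parse_backoff([sample]):
        # e.g. 500ms -> MountVolume.SetUp failed for volume "etc-swift" ...
        print(delay, "->", err)

Run against the surrounding entries, the extracted delays show the doubling retry interval directly; the mount only succeeds once the missing ConfigMap appears and MountVolume.SetUp stops failing.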
Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.796562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlqwt\" (UniqueName: \"kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898484 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898694 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.898732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlqwt\" 
(UniqueName: \"kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.899121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.899512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.899967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.903534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.903567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.911364 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.920064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlqwt\" (UniqueName: \"kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt\") pod \"ovn-northd-0\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.990414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" event={"ID":"874081cb-7984-4363-818a-78f8ed9321ed","Type":"ContainerStarted","Data":"d9605d7f9d273668c47f45b37af497fcbfd880427fd93734f81f8edf971ac8a9"} Jan 20 23:37:56 crc kubenswrapper[5030]: I0120 23:37:56.990465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" event={"ID":"874081cb-7984-4363-818a-78f8ed9321ed","Type":"ContainerStarted","Data":"fb80d15eb9df72965f5f02caa4c5c9bfaf62e9b923907e69d6e4d0c7aea88ddf"} Jan 20 23:37:57 crc kubenswrapper[5030]: I0120 23:37:57.081121 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:57 crc kubenswrapper[5030]: I0120 23:37:57.306325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:37:57 crc kubenswrapper[5030]: E0120 23:37:57.306538 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:37:57 crc kubenswrapper[5030]: E0120 23:37:57.306758 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:37:57 crc kubenswrapper[5030]: E0120 23:37:57.306823 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift podName:1fa80a86-a911-4a89-9186-520189e4cc88 nodeName:}" failed. No retries permitted until 2026-01-20 23:37:59.30680301 +0000 UTC m=+3751.627063308 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift") pod "swift-storage-0" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88") : configmap "swift-ring-files" not found Jan 20 23:37:57 crc kubenswrapper[5030]: I0120 23:37:57.572438 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" podStartSLOduration=2.572415509 podStartE2EDuration="2.572415509s" podCreationTimestamp="2026-01-20 23:37:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:57.009533464 +0000 UTC m=+3749.329793752" watchObservedRunningTime="2026-01-20 23:37:57.572415509 +0000 UTC m=+3749.892675807" Jan 20 23:37:57 crc kubenswrapper[5030]: I0120 23:37:57.574700 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:37:57 crc kubenswrapper[5030]: W0120 23:37:57.580285 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c36492_6bfc_4370_b163_6e8ea9b8c06a.slice/crio-93e1498c036be00fce2beed4475e558b06e6dccea18d7b64ec7b64e36e26666a WatchSource:0}: Error finding container 93e1498c036be00fce2beed4475e558b06e6dccea18d7b64ec7b64e36e26666a: Status 404 returned error can't find the container with id 93e1498c036be00fce2beed4475e558b06e6dccea18d7b64ec7b64e36e26666a Jan 20 23:37:58 crc kubenswrapper[5030]: I0120 23:37:58.009282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerStarted","Data":"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3"} Jan 20 23:37:58 crc kubenswrapper[5030]: I0120 23:37:58.009779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerStarted","Data":"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d"} Jan 20 23:37:58 crc kubenswrapper[5030]: I0120 23:37:58.009813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" 
event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerStarted","Data":"93e1498c036be00fce2beed4475e558b06e6dccea18d7b64ec7b64e36e26666a"} Jan 20 23:37:58 crc kubenswrapper[5030]: I0120 23:37:58.042311 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.042296484 podStartE2EDuration="2.042296484s" podCreationTimestamp="2026-01-20 23:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:37:58.036376361 +0000 UTC m=+3750.356636649" watchObservedRunningTime="2026-01-20 23:37:58.042296484 +0000 UTC m=+3750.362556772" Jan 20 23:37:59 crc kubenswrapper[5030]: I0120 23:37:59.024426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:37:59 crc kubenswrapper[5030]: E0120 23:37:59.363679 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:37:59 crc kubenswrapper[5030]: E0120 23:37:59.364286 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:37:59 crc kubenswrapper[5030]: E0120 23:37:59.364360 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift podName:1fa80a86-a911-4a89-9186-520189e4cc88 nodeName:}" failed. No retries permitted until 2026-01-20 23:38:03.364333395 +0000 UTC m=+3755.684593713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift") pod "swift-storage-0" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88") : configmap "swift-ring-files" not found Jan 20 23:37:59 crc kubenswrapper[5030]: I0120 23:37:59.363393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.028176 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-vb7xc"] Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.030423 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.035857 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.048971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-vb7xc"] Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.077495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8gqp\" (UniqueName: \"kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.077695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.180375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8gqp\" (UniqueName: \"kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.180721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.198739 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.218523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8gqp\" (UniqueName: \"kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp\") pod \"root-account-create-update-vb7xc\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.357672 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:00 crc kubenswrapper[5030]: I0120 23:38:00.924588 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-vb7xc"] Jan 20 23:38:01 crc kubenswrapper[5030]: I0120 23:38:01.051072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" event={"ID":"5441adc4-fd81-4115-b6aa-57d6752fcb1f","Type":"ContainerStarted","Data":"6879af0f29678a3e4d59880f81e89c900b4c0e99c1ef49cf0c70f546bec7624e"} Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.066273 5030 generic.go:334] "Generic (PLEG): container finished" podID="5441adc4-fd81-4115-b6aa-57d6752fcb1f" containerID="ab014bff4d1d0bad1eb5a67c73e0c3d1aa274f4b0161985c39a46475197f7bc4" exitCode=0 Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.066351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" event={"ID":"5441adc4-fd81-4115-b6aa-57d6752fcb1f","Type":"ContainerDied","Data":"ab014bff4d1d0bad1eb5a67c73e0c3d1aa274f4b0161985c39a46475197f7bc4"} Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.531707 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-b5f77"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.533592 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.546191 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-b5f77"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.628605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.628883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phh5\" (UniqueName: \"kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.641296 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.642506 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.645177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.658692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.731260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtlxt\" (UniqueName: \"kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.731325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.731531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.731644 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phh5\" (UniqueName: \"kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.732825 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.755220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phh5\" (UniqueName: \"kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5\") pod \"keystone-db-create-b5f77\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.827762 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-svl75"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.828732 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.832538 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcg6\" (UniqueName: \"kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.832940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtlxt\" (UniqueName: \"kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.832964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.832986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.833776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.846449 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-svl75"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.862511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtlxt\" (UniqueName: \"kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt\") pod \"keystone-b2c4-account-create-update-d5bdt\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.881004 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.934735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.934927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llcg6\" (UniqueName: \"kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.935707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.954550 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-lz9xk"] Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.956363 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.956925 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.959694 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.961841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llcg6\" (UniqueName: \"kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6\") pod \"placement-db-create-svl75\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:02 crc kubenswrapper[5030]: I0120 23:38:02.964787 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-lz9xk"] Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.035985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.036415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rfv\" (UniqueName: \"kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.081788 5030 generic.go:334] "Generic (PLEG): container finished" podID="874081cb-7984-4363-818a-78f8ed9321ed" containerID="d9605d7f9d273668c47f45b37af497fcbfd880427fd93734f81f8edf971ac8a9" exitCode=0 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.081910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" event={"ID":"874081cb-7984-4363-818a-78f8ed9321ed","Type":"ContainerDied","Data":"d9605d7f9d273668c47f45b37af497fcbfd880427fd93734f81f8edf971ac8a9"} Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.138048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.138237 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rfv\" (UniqueName: \"kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.138811 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-t4959"] Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.139684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.139799 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.146247 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.149427 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-t4959"] Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.157642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rfv\" (UniqueName: \"kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv\") pod \"placement-914b-account-create-update-lz9xk\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.266939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.267088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srsr\" (UniqueName: \"kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.268998 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8"] Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.270029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.271997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.278934 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8"] Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.321133 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.349714 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-b5f77"] Jan 20 23:38:03 crc kubenswrapper[5030]: W0120 23:38:03.350593 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8326185d_803c_4238_b9c8_9056b3758d01.slice/crio-bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335 WatchSource:0}: Error finding container bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335: Status 404 returned error can't find the container with id bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.368100 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5pxc\" (UniqueName: \"kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.368179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srsr\" (UniqueName: \"kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.368261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.368315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.368430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.369510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.379811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"swift-storage-0\" (UID: 
\"1fa80a86-a911-4a89-9186-520189e4cc88\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.387070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srsr\" (UniqueName: \"kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr\") pod \"glance-db-create-t4959\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.398320 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.407033 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.464180 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.469128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8gqp\" (UniqueName: \"kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp\") pod \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.469211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts\") pod \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\" (UID: \"5441adc4-fd81-4115-b6aa-57d6752fcb1f\") " Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.469333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.469424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5pxc\" (UniqueName: \"kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.469982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5441adc4-fd81-4115-b6aa-57d6752fcb1f" (UID: "5441adc4-fd81-4115-b6aa-57d6752fcb1f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.470350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.476837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp" (OuterVolumeSpecName: "kube-api-access-n8gqp") pod "5441adc4-fd81-4115-b6aa-57d6752fcb1f" (UID: "5441adc4-fd81-4115-b6aa-57d6752fcb1f"). InnerVolumeSpecName "kube-api-access-n8gqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.484781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5pxc\" (UniqueName: \"kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc\") pod \"glance-2dcf-account-create-update-vv9f8\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.541916 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt"] Jan 20 23:38:03 crc kubenswrapper[5030]: W0120 23:38:03.549098 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0c8839c_8268_4bdd_8139_06af1589e008.slice/crio-a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489 WatchSource:0}: Error finding container a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489: Status 404 returned error can't find the container with id a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.570803 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5441adc4-fd81-4115-b6aa-57d6752fcb1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.570835 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8gqp\" (UniqueName: \"kubernetes.io/projected/5441adc4-fd81-4115-b6aa-57d6752fcb1f-kube-api-access-n8gqp\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.593776 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.685923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-svl75"] Jan 20 23:38:03 crc kubenswrapper[5030]: W0120 23:38:03.695298 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4730508a_e5d2_41c3_9358_5f8cd815ae15.slice/crio-82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590 WatchSource:0}: Error finding container 82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590: Status 404 returned error can't find the container with id 82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.822731 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-lz9xk"] Jan 20 23:38:03 crc kubenswrapper[5030]: W0120 23:38:03.891578 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec5357f4_0242_43f7_b48a_3f2f7ccd4583.slice/crio-d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362 WatchSource:0}: Error finding container d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362: Status 404 returned error can't find the container with id d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.914729 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:38:03 crc kubenswrapper[5030]: W0120 23:38:03.914953 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fa80a86_a911_4a89_9186_520189e4cc88.slice/crio-566e5fbb6b26fe8efa2057d0b7dbfea6b25021f1dd5818f82ec1e02320728190 WatchSource:0}: Error finding container 566e5fbb6b26fe8efa2057d0b7dbfea6b25021f1dd5818f82ec1e02320728190: Status 404 returned error can't find the container with id 566e5fbb6b26fe8efa2057d0b7dbfea6b25021f1dd5818f82ec1e02320728190 Jan 20 23:38:03 crc kubenswrapper[5030]: I0120 23:38:03.994571 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-t4959"] Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.114296 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8"] Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.117972 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0c8839c-8268-4bdd-8139-06af1589e008" containerID="c45bbe420a3c0c3efb9b06d7ee93b499854a8e5500f9dfd99ff855cf5df00379" exitCode=0 Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.118136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" event={"ID":"d0c8839c-8268-4bdd-8139-06af1589e008","Type":"ContainerDied","Data":"c45bbe420a3c0c3efb9b06d7ee93b499854a8e5500f9dfd99ff855cf5df00379"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.118162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" event={"ID":"d0c8839c-8268-4bdd-8139-06af1589e008","Type":"ContainerStarted","Data":"a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489"} Jan 20 23:38:04 crc 
kubenswrapper[5030]: I0120 23:38:04.120026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-t4959" event={"ID":"a7de4926-8119-4b2c-b581-9a68520f43e0","Type":"ContainerStarted","Data":"fbf35cbc4187f10bb2e6d74703a0ce4061e518015c2e23cb120380eee1a50425"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.121328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" event={"ID":"ec5357f4-0242-43f7-b48a-3f2f7ccd4583","Type":"ContainerStarted","Data":"d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362"} Jan 20 23:38:04 crc kubenswrapper[5030]: W0120 23:38:04.123516 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d97caaa_e49c_4669_893c_b1fe0d02e047.slice/crio-b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3 WatchSource:0}: Error finding container b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3: Status 404 returned error can't find the container with id b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3 Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.124786 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.124804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-vb7xc" event={"ID":"5441adc4-fd81-4115-b6aa-57d6752fcb1f","Type":"ContainerDied","Data":"6879af0f29678a3e4d59880f81e89c900b4c0e99c1ef49cf0c70f546bec7624e"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.124837 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6879af0f29678a3e4d59880f81e89c900b4c0e99c1ef49cf0c70f546bec7624e" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.127795 5030 generic.go:334] "Generic (PLEG): container finished" podID="8326185d-803c-4238-b9c8-9056b3758d01" containerID="93e09b6369ee1ec46642c19a747cf87d854479c6601cd44323fa57ca688fb52e" exitCode=0 Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.127850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-b5f77" event={"ID":"8326185d-803c-4238-b9c8-9056b3758d01","Type":"ContainerDied","Data":"93e09b6369ee1ec46642c19a747cf87d854479c6601cd44323fa57ca688fb52e"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.127873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-b5f77" event={"ID":"8326185d-803c-4238-b9c8-9056b3758d01","Type":"ContainerStarted","Data":"bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.129528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"566e5fbb6b26fe8efa2057d0b7dbfea6b25021f1dd5818f82ec1e02320728190"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.136525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-svl75" event={"ID":"4730508a-e5d2-41c3-9358-5f8cd815ae15","Type":"ContainerStarted","Data":"82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590"} Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.615965 5030 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.690421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.690604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx8g7\" (UniqueName: \"kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.690878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.691068 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.691187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.691339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.691443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf\") pod \"874081cb-7984-4363-818a-78f8ed9321ed\" (UID: \"874081cb-7984-4363-818a-78f8ed9321ed\") " Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.691773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.692466 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.692805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.699481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7" (OuterVolumeSpecName: "kube-api-access-xx8g7") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "kube-api-access-xx8g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.704101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.716155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.721155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.721267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts" (OuterVolumeSpecName: "scripts") pod "874081cb-7984-4363-818a-78f8ed9321ed" (UID: "874081cb-7984-4363-818a-78f8ed9321ed"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794396 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794760 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794778 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx8g7\" (UniqueName: \"kubernetes.io/projected/874081cb-7984-4363-818a-78f8ed9321ed-kube-api-access-xx8g7\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794794 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/874081cb-7984-4363-818a-78f8ed9321ed-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794808 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/874081cb-7984-4363-818a-78f8ed9321ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:04 crc kubenswrapper[5030]: I0120 23:38:04.794823 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/874081cb-7984-4363-818a-78f8ed9321ed-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.159934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"1804d3605dcae8df1649629d1484b17dbb6c4cb8599bf27e3f1d93177fa864f6"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.160032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"fffadcca529e80041791a2dbc5892ae2ae5e249cae53e8e2aea4dd2a52b5ed5f"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.160050 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"0be440e2b30ae5389f7c2cbaafbac4d4866fd7d5b1985dbe1813ca1df94788ca"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.160061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"5a0e990417f8c5fb20ac6296acd86d2880ed3ee8806354f7c4c3d73054ac98ed"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.160075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"ca2b82dd6522d59c8f400191a6a84849e1a0dc5afd452ebd9ad3d6e6bb95252c"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.164563 5030 generic.go:334] "Generic (PLEG): container finished" podID="3d97caaa-e49c-4669-893c-b1fe0d02e047" containerID="78d23a1e9ddea4f858d613077986875f1ebdff9b27bbc15d1e79bfac6bd8148f" exitCode=0 Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.164993 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" event={"ID":"3d97caaa-e49c-4669-893c-b1fe0d02e047","Type":"ContainerDied","Data":"78d23a1e9ddea4f858d613077986875f1ebdff9b27bbc15d1e79bfac6bd8148f"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.165113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" event={"ID":"3d97caaa-e49c-4669-893c-b1fe0d02e047","Type":"ContainerStarted","Data":"b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.176781 5030 generic.go:334] "Generic (PLEG): container finished" podID="4730508a-e5d2-41c3-9358-5f8cd815ae15" containerID="975217a77382cd0704c9172f8dfafc0bea19409cc429217624bee737730226f1" exitCode=0 Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.176889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-svl75" event={"ID":"4730508a-e5d2-41c3-9358-5f8cd815ae15","Type":"ContainerDied","Data":"975217a77382cd0704c9172f8dfafc0bea19409cc429217624bee737730226f1"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.179411 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7de4926-8119-4b2c-b581-9a68520f43e0" containerID="28d0a312bd9b754e9a2a5340f10df1c45aa03b9850f1f67802fbc5a8627a8ab7" exitCode=0 Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.179455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-t4959" event={"ID":"a7de4926-8119-4b2c-b581-9a68520f43e0","Type":"ContainerDied","Data":"28d0a312bd9b754e9a2a5340f10df1c45aa03b9850f1f67802fbc5a8627a8ab7"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.191169 5030 generic.go:334] "Generic (PLEG): container finished" podID="ec5357f4-0242-43f7-b48a-3f2f7ccd4583" containerID="acb2cd8b976af56ab01d689076d286ed050ea6b6291bdc89ba6a80a2a05b46e5" exitCode=0 Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.191312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" event={"ID":"ec5357f4-0242-43f7-b48a-3f2f7ccd4583","Type":"ContainerDied","Data":"acb2cd8b976af56ab01d689076d286ed050ea6b6291bdc89ba6a80a2a05b46e5"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.194739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" event={"ID":"874081cb-7984-4363-818a-78f8ed9321ed","Type":"ContainerDied","Data":"fb80d15eb9df72965f5f02caa4c5c9bfaf62e9b923907e69d6e4d0c7aea88ddf"} Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.194782 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb80d15eb9df72965f5f02caa4c5c9bfaf62e9b923907e69d6e4d0c7aea88ddf" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.194921 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-mphbf" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.608104 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.657694 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.710404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtlxt\" (UniqueName: \"kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt\") pod \"d0c8839c-8268-4bdd-8139-06af1589e008\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.710452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts\") pod \"d0c8839c-8268-4bdd-8139-06af1589e008\" (UID: \"d0c8839c-8268-4bdd-8139-06af1589e008\") " Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.710515 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phh5\" (UniqueName: \"kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5\") pod \"8326185d-803c-4238-b9c8-9056b3758d01\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.710648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts\") pod \"8326185d-803c-4238-b9c8-9056b3758d01\" (UID: \"8326185d-803c-4238-b9c8-9056b3758d01\") " Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.711774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0c8839c-8268-4bdd-8139-06af1589e008" (UID: "d0c8839c-8268-4bdd-8139-06af1589e008"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.711925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8326185d-803c-4238-b9c8-9056b3758d01" (UID: "8326185d-803c-4238-b9c8-9056b3758d01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.716149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5" (OuterVolumeSpecName: "kube-api-access-8phh5") pod "8326185d-803c-4238-b9c8-9056b3758d01" (UID: "8326185d-803c-4238-b9c8-9056b3758d01"). InnerVolumeSpecName "kube-api-access-8phh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.716219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt" (OuterVolumeSpecName: "kube-api-access-vtlxt") pod "d0c8839c-8268-4bdd-8139-06af1589e008" (UID: "d0c8839c-8268-4bdd-8139-06af1589e008"). InnerVolumeSpecName "kube-api-access-vtlxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.811771 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtlxt\" (UniqueName: \"kubernetes.io/projected/d0c8839c-8268-4bdd-8139-06af1589e008-kube-api-access-vtlxt\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.811801 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0c8839c-8268-4bdd-8139-06af1589e008-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.811812 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phh5\" (UniqueName: \"kubernetes.io/projected/8326185d-803c-4238-b9c8-9056b3758d01-kube-api-access-8phh5\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:05 crc kubenswrapper[5030]: I0120 23:38:05.811821 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8326185d-803c-4238-b9c8-9056b3758d01-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.213361 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.213490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt" event={"ID":"d0c8839c-8268-4bdd-8139-06af1589e008","Type":"ContainerDied","Data":"a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.215093 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72e5055e738ccf777cc6a617adddf7adf45bc6bb0ab8aca4892d905aa4e7489" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.222803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-b5f77" event={"ID":"8326185d-803c-4238-b9c8-9056b3758d01","Type":"ContainerDied","Data":"bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.222866 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-b5f77" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.222871 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbeaef8a91c724962985620a4e9657ff1433ec2fe64470c624abe4a35f70335" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232732 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"cc48b4c6d8e07a0b1db15c7843485e19655d2fc4c950337f5fa62bc1a6c473e8"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"bac50d4cc962099037430a767b8939f7657afbb7b51f2a1c37620ae6aac67d42"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"60ab4b7b3cfb297f14a1ca0ce6b7a2cef1d8b852da1a174b8c0a481cccd94e44"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"ba230f74c1544afcbb8b1f9c898cc75207ae9e7b4b8083280e546953073e28ef"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"32ce18f6289e30499008982b3bc6f5d28395565c34d8a569f794f6c6a7775704"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.232934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"1cf6b355d0e499f70b76d7c2fa59bfce9c9ed28b910c999b8bcfdcc19ec0dbf2"} Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.340347 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-vb7xc"] Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.347322 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-vb7xc"] Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.630563 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.728546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llcg6\" (UniqueName: \"kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6\") pod \"4730508a-e5d2-41c3-9358-5f8cd815ae15\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.728697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts\") pod \"4730508a-e5d2-41c3-9358-5f8cd815ae15\" (UID: \"4730508a-e5d2-41c3-9358-5f8cd815ae15\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.729524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4730508a-e5d2-41c3-9358-5f8cd815ae15" (UID: "4730508a-e5d2-41c3-9358-5f8cd815ae15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.735844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6" (OuterVolumeSpecName: "kube-api-access-llcg6") pod "4730508a-e5d2-41c3-9358-5f8cd815ae15" (UID: "4730508a-e5d2-41c3-9358-5f8cd815ae15"). InnerVolumeSpecName "kube-api-access-llcg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.830118 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4730508a-e5d2-41c3-9358-5f8cd815ae15-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.830147 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llcg6\" (UniqueName: \"kubernetes.io/projected/4730508a-e5d2-41c3-9358-5f8cd815ae15-kube-api-access-llcg6\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.869089 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.875469 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.882592 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.931606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts\") pod \"3d97caaa-e49c-4669-893c-b1fe0d02e047\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.931750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srsr\" (UniqueName: \"kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr\") pod \"a7de4926-8119-4b2c-b581-9a68520f43e0\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d97caaa-e49c-4669-893c-b1fe0d02e047" (UID: "3d97caaa-e49c-4669-893c-b1fe0d02e047"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.931783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts\") pod \"a7de4926-8119-4b2c-b581-9a68520f43e0\" (UID: \"a7de4926-8119-4b2c-b581-9a68520f43e0\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932344 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7de4926-8119-4b2c-b581-9a68520f43e0" (UID: "a7de4926-8119-4b2c-b581-9a68520f43e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5pxc\" (UniqueName: \"kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc\") pod \"3d97caaa-e49c-4669-893c-b1fe0d02e047\" (UID: \"3d97caaa-e49c-4669-893c-b1fe0d02e047\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4rfv\" (UniqueName: \"kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv\") pod \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932534 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts\") pod \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\" (UID: \"ec5357f4-0242-43f7-b48a-3f2f7ccd4583\") " Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932962 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d97caaa-e49c-4669-893c-b1fe0d02e047-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.932987 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7de4926-8119-4b2c-b581-9a68520f43e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.933308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec5357f4-0242-43f7-b48a-3f2f7ccd4583" (UID: "ec5357f4-0242-43f7-b48a-3f2f7ccd4583"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.934885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr" (OuterVolumeSpecName: "kube-api-access-2srsr") pod "a7de4926-8119-4b2c-b581-9a68520f43e0" (UID: "a7de4926-8119-4b2c-b581-9a68520f43e0"). InnerVolumeSpecName "kube-api-access-2srsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.943640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc" (OuterVolumeSpecName: "kube-api-access-s5pxc") pod "3d97caaa-e49c-4669-893c-b1fe0d02e047" (UID: "3d97caaa-e49c-4669-893c-b1fe0d02e047"). InnerVolumeSpecName "kube-api-access-s5pxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:06 crc kubenswrapper[5030]: I0120 23:38:06.950789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv" (OuterVolumeSpecName: "kube-api-access-b4rfv") pod "ec5357f4-0242-43f7-b48a-3f2f7ccd4583" (UID: "ec5357f4-0242-43f7-b48a-3f2f7ccd4583"). InnerVolumeSpecName "kube-api-access-b4rfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.035250 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4rfv\" (UniqueName: \"kubernetes.io/projected/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-kube-api-access-b4rfv\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.035302 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec5357f4-0242-43f7-b48a-3f2f7ccd4583-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.035323 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srsr\" (UniqueName: \"kubernetes.io/projected/a7de4926-8119-4b2c-b581-9a68520f43e0-kube-api-access-2srsr\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.035341 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5pxc\" (UniqueName: \"kubernetes.io/projected/3d97caaa-e49c-4669-893c-b1fe0d02e047-kube-api-access-s5pxc\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.163906 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.243648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-t4959" event={"ID":"a7de4926-8119-4b2c-b581-9a68520f43e0","Type":"ContainerDied","Data":"fbf35cbc4187f10bb2e6d74703a0ce4061e518015c2e23cb120380eee1a50425"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.243706 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbf35cbc4187f10bb2e6d74703a0ce4061e518015c2e23cb120380eee1a50425" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.243788 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-t4959" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.249347 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.249774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-914b-account-create-update-lz9xk" event={"ID":"ec5357f4-0242-43f7-b48a-3f2f7ccd4583","Type":"ContainerDied","Data":"d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.249841 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e02aeec1402919744751b8e180b442344dabba5bddf462175cd63cb5eb7362" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.263033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"703b4130b09e7e9f7f77d19f3d5f9809062cb9cb593d15aaae512b984d70c591"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.263080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"eab5cc67834e8fd543f3f0bf8bd1fcda92107d8d7ed59e2a5b6142df62f58fce"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.263095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"0cc78d3b2cdd7c16a9c523aa805dd25a524946b1037801b958e0b5aca73eae5d"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.263107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerStarted","Data":"12eaa764aa377753f5ace52c861bbb3f3689b2fca77a117e7d7f96175209240d"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.266829 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.266844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8" event={"ID":"3d97caaa-e49c-4669-893c-b1fe0d02e047","Type":"ContainerDied","Data":"b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.266906 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b05f3cc1b096b02042cce64eebd4622393a848b805d6ebb0c793893a64f471d3" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.269063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-svl75" event={"ID":"4730508a-e5d2-41c3-9358-5f8cd815ae15","Type":"ContainerDied","Data":"82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590"} Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.269106 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b77d6da060dff611f13b412bdbd5d30ce3f7b6001e077a0e69dec284097590" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.269177 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-svl75" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.304697 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=13.304677768 podStartE2EDuration="13.304677768s" podCreationTimestamp="2026-01-20 23:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:07.299281367 +0000 UTC m=+3759.619541665" watchObservedRunningTime="2026-01-20 23:38:07.304677768 +0000 UTC m=+3759.624938046" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.444658 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445209 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4730508a-e5d2-41c3-9358-5f8cd815ae15" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445243 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4730508a-e5d2-41c3-9358-5f8cd815ae15" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7de4926-8119-4b2c-b581-9a68520f43e0" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445296 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7de4926-8119-4b2c-b581-9a68520f43e0" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5441adc4-fd81-4115-b6aa-57d6752fcb1f" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5441adc4-fd81-4115-b6aa-57d6752fcb1f" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8326185d-803c-4238-b9c8-9056b3758d01" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445385 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8326185d-803c-4238-b9c8-9056b3758d01" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445420 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c8839c-8268-4bdd-8139-06af1589e008" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445435 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c8839c-8268-4bdd-8139-06af1589e008" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445460 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5357f4-0242-43f7-b48a-3f2f7ccd4583" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445475 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5357f4-0242-43f7-b48a-3f2f7ccd4583" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445501 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874081cb-7984-4363-818a-78f8ed9321ed" containerName="swift-ring-rebalance" Jan 
20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445516 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="874081cb-7984-4363-818a-78f8ed9321ed" containerName="swift-ring-rebalance" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.445534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d97caaa-e49c-4669-893c-b1fe0d02e047" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445549 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d97caaa-e49c-4669-893c-b1fe0d02e047" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445862 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="874081cb-7984-4363-818a-78f8ed9321ed" containerName="swift-ring-rebalance" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445895 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5441adc4-fd81-4115-b6aa-57d6752fcb1f" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c8839c-8268-4bdd-8139-06af1589e008" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445929 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d97caaa-e49c-4669-893c-b1fe0d02e047" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445941 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8326185d-803c-4238-b9c8-9056b3758d01" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7de4926-8119-4b2c-b581-9a68520f43e0" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445967 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4730508a-e5d2-41c3-9358-5f8cd815ae15" containerName="mariadb-database-create" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.445983 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5357f4-0242-43f7-b48a-3f2f7ccd4583" containerName="mariadb-account-create-update" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.446996 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.450068 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.460395 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.499949 5030 scope.go:117] "RemoveContainer" containerID="0478c6b5a2aa7d3114b8aba566b34095474deb6aee07d02a4122007b8a828961" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.521776 5030 scope.go:117] "RemoveContainer" containerID="e9b169497221304dbe6adee831f056e749c90f520798d74f0635cb2f60436f1d" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.541338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.541429 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.541457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.541482 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bkw\" (UniqueName: \"kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.544319 5030 scope.go:117] "RemoveContainer" containerID="d8020553ea006945bae8427811e7617cdae1effe1553b99640376e2be8359797" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.563724 5030 scope.go:117] "RemoveContainer" containerID="5acce875889d65a3f586a3832b99f2d8bced89596046689298d3df1e58194783" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.582328 5030 scope.go:117] "RemoveContainer" containerID="e1b5f2a13f3163554624db7b1aafe91f5aee9ab1131474fe0369c5f0f4a9dee1" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.598870 5030 scope.go:117] "RemoveContainer" containerID="ef6976dc418e8fc75f18940eec16d6ef87c71cd9327cf28588a9a353d0afb920" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.624592 5030 scope.go:117] "RemoveContainer" containerID="10f2ad934ade0755d3b8fbad653de62253c2374bf376c3091a96065c690f7603" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.643458 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.643524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.643570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bkw\" (UniqueName: \"kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.643780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.644696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.644893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.645168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.648309 5030 scope.go:117] "RemoveContainer" containerID="5119d0d0cc4f6a962474254571f2c6fcb5f0d2fbf5b431abae2990241b7a16de" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.662984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bkw\" (UniqueName: \"kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw\") pod \"dnsmasq-dnsmasq-859d7775df-wtw26\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.666641 5030 scope.go:117] "RemoveContainer" containerID="611008d575119572dd79f498dbab28ea12bc998a1001a31f1ecbe0cfe79cdd53" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.694839 5030 scope.go:117] 
"RemoveContainer" containerID="68a3e13cf2691b33d5c70e2cd4ee154f6c6ade16bd21a3ed1c5508abb12bf1aa" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.724201 5030 scope.go:117] "RemoveContainer" containerID="a41bbbe631becdc2663227271247bffebe52245317b5d494313c36b84c0010bf" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.767248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.777242 5030 scope.go:117] "RemoveContainer" containerID="932dcfdf35065219a5e818e5ee79e925e511ed060334c88dc1016384aecb5e71" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.859615 5030 scope.go:117] "RemoveContainer" containerID="6fa5a926460488785a7ffba60bb9b7fadfcf332e45c0b16ee382744faf654dce" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.894788 5030 scope.go:117] "RemoveContainer" containerID="ace8455d89e787da2e667d87dede5ef4ab3dc9bb9804afaac85184a3e3d85013" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.932210 5030 scope.go:117] "RemoveContainer" containerID="5e56e0a90f48d67ecced2eb4b7bd71d6bb25bbb2926717de0d9772105c53f0b2" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.962974 5030 scope.go:117] "RemoveContainer" containerID="dba39c972f09e29c1e7af7710cba14d22bb01635d4706b36515e50f46d419e23" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.966484 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:38:07 crc kubenswrapper[5030]: E0120 23:38:07.966715 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.978219 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441adc4-fd81-4115-b6aa-57d6752fcb1f" path="/var/lib/kubelet/pods/5441adc4-fd81-4115-b6aa-57d6752fcb1f/volumes" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.981478 5030 scope.go:117] "RemoveContainer" containerID="7d9c91edb4a35a4bf32992a5fc61b2174f916fe1a309ebe2c3715d15e2bd931f" Jan 20 23:38:07 crc kubenswrapper[5030]: I0120 23:38:07.998633 5030 scope.go:117] "RemoveContainer" containerID="a1d0030d1db7eafecbee2fd3e7074cf5fa995ab67d4254165b2fedc019d55a78" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.015311 5030 scope.go:117] "RemoveContainer" containerID="c1b33ab002e4f712903d4ed520c8c52b8e5d06f4f632273086ae04558ec64596" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.031440 5030 scope.go:117] "RemoveContainer" containerID="825b5af702ffdf5de2be5f22ddc61eac516d5cbe358b33e13e145f3e929fbc57" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.053811 5030 scope.go:117] "RemoveContainer" containerID="5c43afbc3bd855975b6195fe0ab81ffca022d863f4ec31ef3dcc4088f902abf7" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.070901 5030 scope.go:117] "RemoveContainer" containerID="142f3acf1108276cc11ce9ff0213af6ed4a6eb46e60c66d8d0e27543458e4c19" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.091304 5030 scope.go:117] "RemoveContainer" containerID="5577e422daaa40c7d4a3a4b146deed7f44ec2457dac0430688594f2eba2c7186" Jan 20 
23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.111596 5030 scope.go:117] "RemoveContainer" containerID="69b7c21f321b09445606557c0e6984dc6d0949ad2a3b0ee72bd8c7a27d07ecc7" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.138307 5030 scope.go:117] "RemoveContainer" containerID="02c333bfad293ec86e408a8920ec51d1c9268264d8402be95b6a1751dcb8d930" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.171112 5030 scope.go:117] "RemoveContainer" containerID="42893f23058587a6ea7bb91a49056a29e0ef50a9501774456a34ef1a79090ca0" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.200828 5030 scope.go:117] "RemoveContainer" containerID="5255250b357a4ac7681649a1e5c0a926b39fbee907e1ba4000a642da16130fdd" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.224969 5030 scope.go:117] "RemoveContainer" containerID="aef8219014260e57e40ee40900cc995c3ff708186b3f043e7917e6d02c4cb21e" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.262758 5030 scope.go:117] "RemoveContainer" containerID="d818d37af6dd495c07c90156f3acd2271b27602fefb571a1c777bca760f33f38" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.306184 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.414345 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-f87km"] Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.415511 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.417973 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.418003 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-fm8zv" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.460958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-f87km"] Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.466673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.466724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.466757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btw88\" (UniqueName: \"kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.466796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.569337 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.569437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.569473 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btw88\" (UniqueName: \"kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.569537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.575145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.575149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.575383 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.588015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btw88\" (UniqueName: \"kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88\") pod \"glance-db-sync-f87km\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:08 crc kubenswrapper[5030]: I0120 23:38:08.770066 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:09 crc kubenswrapper[5030]: I0120 23:38:09.065373 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-f87km"] Jan 20 23:38:09 crc kubenswrapper[5030]: W0120 23:38:09.083998 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c4ba8d1_6bb4_4ff7_ab51_46533317e15d.slice/crio-f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242 WatchSource:0}: Error finding container f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242: Status 404 returned error can't find the container with id f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242 Jan 20 23:38:09 crc kubenswrapper[5030]: I0120 23:38:09.306397 5030 generic.go:334] "Generic (PLEG): container finished" podID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerID="2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd" exitCode=0 Jan 20 23:38:09 crc kubenswrapper[5030]: I0120 23:38:09.306458 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" event={"ID":"34541e14-d037-4e28-bcc1-f0781aa25bfb","Type":"ContainerDied","Data":"2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd"} Jan 20 23:38:09 crc kubenswrapper[5030]: I0120 23:38:09.306484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" event={"ID":"34541e14-d037-4e28-bcc1-f0781aa25bfb","Type":"ContainerStarted","Data":"f9c01f64bc1365ebf45aaaf4051fffe63c7f7e9ea69bd68a634adb65b4bae75c"} Jan 20 23:38:09 crc kubenswrapper[5030]: I0120 23:38:09.311009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-f87km" event={"ID":"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d","Type":"ContainerStarted","Data":"f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242"} Jan 20 23:38:10 crc kubenswrapper[5030]: I0120 23:38:10.332669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" event={"ID":"34541e14-d037-4e28-bcc1-f0781aa25bfb","Type":"ContainerStarted","Data":"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62"} Jan 20 23:38:10 crc kubenswrapper[5030]: I0120 23:38:10.334970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-f87km" event={"ID":"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d","Type":"ContainerStarted","Data":"9e725199433e346decd5a46d7977b737a73063b254698099a39fbde23c3b26fa"} Jan 20 23:38:10 crc kubenswrapper[5030]: I0120 23:38:10.365483 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" podStartSLOduration=3.365465477 podStartE2EDuration="3.365465477s" podCreationTimestamp="2026-01-20 23:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:10.356976231 +0000 UTC m=+3762.677236509" watchObservedRunningTime="2026-01-20 23:38:10.365465477 +0000 UTC m=+3762.685725765" Jan 20 23:38:10 crc kubenswrapper[5030]: I0120 23:38:10.387068 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-f87km" podStartSLOduration=2.38704597 podStartE2EDuration="2.38704597s" podCreationTimestamp="2026-01-20 23:38:08 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:10.374538097 +0000 UTC m=+3762.694798425" watchObservedRunningTime="2026-01-20 23:38:10.38704597 +0000 UTC m=+3762.707306298" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.344971 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.362578 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-c5jng"] Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.363793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.371690 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.376668 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-c5jng"] Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.517992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2wq\" (UniqueName: \"kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.518914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.621502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.621725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2wq\" (UniqueName: \"kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.622283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.646104 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2wq\" (UniqueName: 
\"kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq\") pod \"root-account-create-update-c5jng\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:11 crc kubenswrapper[5030]: I0120 23:38:11.687316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:12 crc kubenswrapper[5030]: I0120 23:38:12.188188 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-c5jng"] Jan 20 23:38:12 crc kubenswrapper[5030]: W0120 23:38:12.192424 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b244be_2928_4aa4_9d48_8d6411876b72.slice/crio-2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85 WatchSource:0}: Error finding container 2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85: Status 404 returned error can't find the container with id 2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85 Jan 20 23:38:12 crc kubenswrapper[5030]: I0120 23:38:12.364698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-c5jng" event={"ID":"e3b244be-2928-4aa4-9d48-8d6411876b72","Type":"ContainerStarted","Data":"2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85"} Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.379922 5030 generic.go:334] "Generic (PLEG): container finished" podID="e3b244be-2928-4aa4-9d48-8d6411876b72" containerID="83e74463459e08631754313660ebe7d9b0a5d2244d0828698d9ba7f291108e96" exitCode=0 Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.380008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-c5jng" event={"ID":"e3b244be-2928-4aa4-9d48-8d6411876b72","Type":"ContainerDied","Data":"83e74463459e08631754313660ebe7d9b0a5d2244d0828698d9ba7f291108e96"} Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.385345 5030 generic.go:334] "Generic (PLEG): container finished" podID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerID="62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a" exitCode=0 Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.385454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerDied","Data":"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a"} Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.388552 5030 generic.go:334] "Generic (PLEG): container finished" podID="16f26a59-3681-422d-a46f-406b454f73b8" containerID="9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5" exitCode=0 Jan 20 23:38:13 crc kubenswrapper[5030]: I0120 23:38:13.388607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerDied","Data":"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5"} Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.398326 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerStarted","Data":"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef"} Jan 20 
23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.398771 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.401187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerStarted","Data":"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec"} Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.423526 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=36.423505255 podStartE2EDuration="36.423505255s" podCreationTimestamp="2026-01-20 23:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:14.421519907 +0000 UTC m=+3766.741780215" watchObservedRunningTime="2026-01-20 23:38:14.423505255 +0000 UTC m=+3766.743765543" Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.446071 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.446046681 podStartE2EDuration="37.446046681s" podCreationTimestamp="2026-01-20 23:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:14.441882751 +0000 UTC m=+3766.762143099" watchObservedRunningTime="2026-01-20 23:38:14.446046681 +0000 UTC m=+3766.766307019" Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.734126 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.915592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts\") pod \"e3b244be-2928-4aa4-9d48-8d6411876b72\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.915732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn2wq\" (UniqueName: \"kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq\") pod \"e3b244be-2928-4aa4-9d48-8d6411876b72\" (UID: \"e3b244be-2928-4aa4-9d48-8d6411876b72\") " Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.916412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3b244be-2928-4aa4-9d48-8d6411876b72" (UID: "e3b244be-2928-4aa4-9d48-8d6411876b72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:14 crc kubenswrapper[5030]: I0120 23:38:14.933908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq" (OuterVolumeSpecName: "kube-api-access-zn2wq") pod "e3b244be-2928-4aa4-9d48-8d6411876b72" (UID: "e3b244be-2928-4aa4-9d48-8d6411876b72"). InnerVolumeSpecName "kube-api-access-zn2wq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.018632 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b244be-2928-4aa4-9d48-8d6411876b72-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.018989 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn2wq\" (UniqueName: \"kubernetes.io/projected/e3b244be-2928-4aa4-9d48-8d6411876b72-kube-api-access-zn2wq\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.412680 5030 generic.go:334] "Generic (PLEG): container finished" podID="6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" containerID="9e725199433e346decd5a46d7977b737a73063b254698099a39fbde23c3b26fa" exitCode=0 Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.412784 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-f87km" event={"ID":"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d","Type":"ContainerDied","Data":"9e725199433e346decd5a46d7977b737a73063b254698099a39fbde23c3b26fa"} Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.414449 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-c5jng" Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.414443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-c5jng" event={"ID":"e3b244be-2928-4aa4-9d48-8d6411876b72","Type":"ContainerDied","Data":"2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85"} Jan 20 23:38:15 crc kubenswrapper[5030]: I0120 23:38:15.414524 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd91ae6d1897526735a41f13d8971e917142f48d61ac5ad99e248056d6aae85" Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.844037 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.951454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle\") pod \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.951565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data\") pod \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.951704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btw88\" (UniqueName: \"kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88\") pod \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.951831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data\") pod \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\" (UID: \"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d\") " Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.958824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" (UID: "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.959931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88" (OuterVolumeSpecName: "kube-api-access-btw88") pod "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" (UID: "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d"). InnerVolumeSpecName "kube-api-access-btw88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:16 crc kubenswrapper[5030]: I0120 23:38:16.991944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" (UID: "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.006207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data" (OuterVolumeSpecName: "config-data") pod "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" (UID: "6c4ba8d1-6bb4-4ff7-ab51-46533317e15d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.054870 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btw88\" (UniqueName: \"kubernetes.io/projected/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-kube-api-access-btw88\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.054912 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.054926 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.054939 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.437472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-f87km" event={"ID":"6c4ba8d1-6bb4-4ff7-ab51-46533317e15d","Type":"ContainerDied","Data":"f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242"} Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.437517 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ca09046616f064353c258103aa823f3a65f780547f552879b4c59e05b46242" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.437553 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-f87km" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.768493 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.815289 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:38:17 crc kubenswrapper[5030]: I0120 23:38:17.815513 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerName="dnsmasq-dns" containerID="cri-o://c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828" gracePeriod=10 Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.240785 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.374084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5gfn\" (UniqueName: \"kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn\") pod \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.374173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config\") pod \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.374222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc\") pod \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\" (UID: \"ab64dff2-f94e-4a15-84d1-4661aa1b8c87\") " Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.379703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn" (OuterVolumeSpecName: "kube-api-access-g5gfn") pod "ab64dff2-f94e-4a15-84d1-4661aa1b8c87" (UID: "ab64dff2-f94e-4a15-84d1-4661aa1b8c87"). InnerVolumeSpecName "kube-api-access-g5gfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.415775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config" (OuterVolumeSpecName: "config") pod "ab64dff2-f94e-4a15-84d1-4661aa1b8c87" (UID: "ab64dff2-f94e-4a15-84d1-4661aa1b8c87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.419466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "ab64dff2-f94e-4a15-84d1-4661aa1b8c87" (UID: "ab64dff2-f94e-4a15-84d1-4661aa1b8c87"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.447271 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerID="c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828" exitCode=0 Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.447312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" event={"ID":"ab64dff2-f94e-4a15-84d1-4661aa1b8c87","Type":"ContainerDied","Data":"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828"} Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.447338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" event={"ID":"ab64dff2-f94e-4a15-84d1-4661aa1b8c87","Type":"ContainerDied","Data":"170e9e834f46a7b338e37db311cb370bdbd13dbef7bc5e130928776a1b68cdbf"} Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.447337 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.447355 5030 scope.go:117] "RemoveContainer" containerID="c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.464933 5030 scope.go:117] "RemoveContainer" containerID="82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.476187 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.476219 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5gfn\" (UniqueName: \"kubernetes.io/projected/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-kube-api-access-g5gfn\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.476232 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab64dff2-f94e-4a15-84d1-4661aa1b8c87-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.477822 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.483976 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kfd8q"] Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.496914 5030 scope.go:117] "RemoveContainer" containerID="c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828" Jan 20 23:38:18 crc kubenswrapper[5030]: E0120 23:38:18.497367 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828\": container with ID starting with c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828 not found: ID does not exist" containerID="c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.497458 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828"} err="failed to get container status \"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828\": rpc error: code = NotFound desc = could not find container \"c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828\": container with ID starting with c3482f4db5afd5109224c845a1b35a5f57ef6bb7cfa94854f83af3893cff4828 not found: ID does not exist" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.497533 5030 scope.go:117] "RemoveContainer" containerID="82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d" Jan 20 23:38:18 crc kubenswrapper[5030]: E0120 23:38:18.498003 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d\": container with ID starting with 82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d not found: ID does not exist" containerID="82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d" Jan 20 23:38:18 crc kubenswrapper[5030]: I0120 23:38:18.498068 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d"} err="failed to get container status \"82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d\": rpc error: code = NotFound desc = could not find container \"82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d\": container with ID starting with 82b35ea53706bccfaf45eb8ca8c18b312780c9905324c03b5f0a1a59d626200d not found: ID does not exist" Jan 20 23:38:19 crc kubenswrapper[5030]: I0120 23:38:19.210674 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:38:19 crc kubenswrapper[5030]: I0120 23:38:19.997094 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" path="/var/lib/kubelet/pods/ab64dff2-f94e-4a15-84d1-4661aa1b8c87/volumes" Jan 20 23:38:20 crc kubenswrapper[5030]: I0120 23:38:20.963098 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:38:20 crc kubenswrapper[5030]: E0120 23:38:20.963899 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.216042 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.595713 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jkq6c"] Jan 20 23:38:29 crc kubenswrapper[5030]: E0120 23:38:29.596035 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b244be-2928-4aa4-9d48-8d6411876b72" containerName="mariadb-account-create-update" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596051 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b244be-2928-4aa4-9d48-8d6411876b72" containerName="mariadb-account-create-update" Jan 20 23:38:29 crc kubenswrapper[5030]: E0120 23:38:29.596065 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" containerName="glance-db-sync" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596072 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" containerName="glance-db-sync" Jan 20 23:38:29 crc kubenswrapper[5030]: E0120 23:38:29.596080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerName="init" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596086 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerName="init" Jan 20 23:38:29 crc kubenswrapper[5030]: E0120 23:38:29.596096 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerName="dnsmasq-dns" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596101 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" 
containerName="dnsmasq-dns" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596249 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" containerName="glance-db-sync" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596268 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b244be-2928-4aa4-9d48-8d6411876b72" containerName="mariadb-account-create-update" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596278 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab64dff2-f94e-4a15-84d1-4661aa1b8c87" containerName="dnsmasq-dns" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.596820 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.618430 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.619731 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.621185 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.627421 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jkq6c"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.642406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.683802 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.684765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.684946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w2vw\" (UniqueName: \"kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.700839 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-ch2zp"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.702075 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.712526 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-ch2zp"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.786788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.787111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7462f\" (UniqueName: \"kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.787216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.787332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w2vw\" (UniqueName: \"kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.787428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdddt\" (UniqueName: \"kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.787531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.788491 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.797554 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-mntdc"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.798738 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.811122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w2vw\" (UniqueName: \"kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw\") pod \"cinder-db-create-jkq6c\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.822741 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.823950 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.826213 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.834761 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-mntdc"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.842952 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.871400 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-77ffx"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.872569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.875134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.875491 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.875348 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.875737 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2jq2r" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.886485 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-77ffx"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867jb\" (UniqueName: \"kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7462f\" (UniqueName: \"kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889387 
5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j82l\" (UniqueName: \"kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889571 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdddt\" (UniqueName: \"kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889788 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.889872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.890473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.890638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.913964 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.927608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdddt\" (UniqueName: \"kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt\") pod \"barbican-db-create-ch2zp\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.931926 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.933016 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.938883 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.964675 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm"] Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.983191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7462f\" (UniqueName: \"kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f\") pod \"cinder-6e8b-account-create-update-mbcgn\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmvr\" (UniqueName: \"kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-867jb\" (UniqueName: \"kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992564 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dtf\" (UniqueName: 
\"kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j82l\" (UniqueName: \"kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.992966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.993124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.993854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:29 crc kubenswrapper[5030]: I0120 23:38:29.993955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.026656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-867jb\" (UniqueName: \"kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb\") pod \"neutron-db-create-mntdc\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.027942 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.047321 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j82l\" (UniqueName: \"kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l\") pod \"neutron-52b7-account-create-update-5gx59\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.098523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmvr\" (UniqueName: \"kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.098588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.098609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.098664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dtf\" (UniqueName: \"kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.098683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.099744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.104068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.107196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.118886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dtf\" (UniqueName: \"kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf\") pod \"barbican-e7ae-account-create-update-hczvm\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.143323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmvr\" (UniqueName: \"kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr\") pod \"keystone-db-sync-77ffx\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.153951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.173975 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.206189 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.248951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.388810 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:30 crc kubenswrapper[5030]: W0120 23:38:30.638114 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03910f64_db26_4c5e_8122_f88cce939744.slice/crio-43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3 WatchSource:0}: Error finding container 43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3: Status 404 returned error can't find the container with id 43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3 Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.643812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jkq6c"] Jan 20 23:38:30 crc kubenswrapper[5030]: W0120 23:38:30.779365 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b44006_d170_4768_8096_67d4dd02546c.slice/crio-30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef WatchSource:0}: Error finding container 30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef: Status 404 returned error can't find the container with id 30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.782104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59"] Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.789367 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-ch2zp"] Jan 20 23:38:30 crc kubenswrapper[5030]: W0120 23:38:30.799614 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78f33bd8_c4ff_4eec_96d8_8244037c2edb.slice/crio-2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d WatchSource:0}: Error finding container 2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d: Status 404 returned error can't find the container with id 2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.912131 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn"] Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.934349 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-77ffx"] Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.947587 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-mntdc"] Jan 20 23:38:30 crc kubenswrapper[5030]: I0120 23:38:30.959943 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm"] Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.581548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" event={"ID":"ead812d7-990f-4f88-87fc-8620e73f9e67","Type":"ContainerStarted","Data":"63cdd757318961398a57e724643715beb0c52c0c72e8648c72d7b3a6a1963c1a"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.581597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" 
event={"ID":"ead812d7-990f-4f88-87fc-8620e73f9e67","Type":"ContainerStarted","Data":"9f29ddc0f2b48906e32eacb6316d95b4b739eb4c8a9b89c661e02e7aa6190316"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.584294 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1b44006-d170-4768-8096-67d4dd02546c" containerID="3297caab8ada18e3a891d11d79f546ab779eaf78f3077e6af83e7fdfc7b652aa" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.584374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" event={"ID":"c1b44006-d170-4768-8096-67d4dd02546c","Type":"ContainerDied","Data":"3297caab8ada18e3a891d11d79f546ab779eaf78f3077e6af83e7fdfc7b652aa"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.584422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" event={"ID":"c1b44006-d170-4768-8096-67d4dd02546c","Type":"ContainerStarted","Data":"30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.586165 5030 generic.go:334] "Generic (PLEG): container finished" podID="6cf760db-bb64-461a-ae21-2ce8224f1f02" containerID="935662df8cf9aece3ed3605af05bf9156234f657db887627bf7aceee2b470956" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.586229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" event={"ID":"6cf760db-bb64-461a-ae21-2ce8224f1f02","Type":"ContainerDied","Data":"935662df8cf9aece3ed3605af05bf9156234f657db887627bf7aceee2b470956"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.586259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" event={"ID":"6cf760db-bb64-461a-ae21-2ce8224f1f02","Type":"ContainerStarted","Data":"5b7059a24facc4542be8e98f2c72d2d99359b36b003fc71e7b7f1b33db579d9b"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.588077 5030 generic.go:334] "Generic (PLEG): container finished" podID="78f33bd8-c4ff-4eec-96d8-8244037c2edb" containerID="8a0eec32beb473b172927074604f141d23560612f04feaa5a8ac286e951f5c6f" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.588123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" event={"ID":"78f33bd8-c4ff-4eec-96d8-8244037c2edb","Type":"ContainerDied","Data":"8a0eec32beb473b172927074604f141d23560612f04feaa5a8ac286e951f5c6f"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.588142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" event={"ID":"78f33bd8-c4ff-4eec-96d8-8244037c2edb","Type":"ContainerStarted","Data":"2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.590459 5030 generic.go:334] "Generic (PLEG): container finished" podID="03910f64-db26-4c5e-8122-f88cce939744" containerID="5ed9fef91aa09e90121fb4abe44cf03f6735cf5076317f24839a85eb41859635" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.590475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" event={"ID":"03910f64-db26-4c5e-8122-f88cce939744","Type":"ContainerDied","Data":"5ed9fef91aa09e90121fb4abe44cf03f6735cf5076317f24839a85eb41859635"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.590498 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" event={"ID":"03910f64-db26-4c5e-8122-f88cce939744","Type":"ContainerStarted","Data":"43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.592307 5030 generic.go:334] "Generic (PLEG): container finished" podID="5cc0a822-f532-4667-983b-b4b9f5653853" containerID="446c9c22532ed5000c2914327203f69862de8749941d75e7ed5a0f66132d8d91" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.592356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" event={"ID":"5cc0a822-f532-4667-983b-b4b9f5653853","Type":"ContainerDied","Data":"446c9c22532ed5000c2914327203f69862de8749941d75e7ed5a0f66132d8d91"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.592373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" event={"ID":"5cc0a822-f532-4667-983b-b4b9f5653853","Type":"ContainerStarted","Data":"d54ab2f85f66f8457f34169e9847a2fe44979cf3d314ce04c05609db3a1aed06"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.594282 5030 generic.go:334] "Generic (PLEG): container finished" podID="2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" containerID="c723fee7122bd82a4654f3e57bc39980e37b10a0faadf88c51ea2d1808f5f10c" exitCode=0 Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.594318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-mntdc" event={"ID":"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd","Type":"ContainerDied","Data":"c723fee7122bd82a4654f3e57bc39980e37b10a0faadf88c51ea2d1808f5f10c"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.594336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-mntdc" event={"ID":"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd","Type":"ContainerStarted","Data":"1472d1a189782e4d395988355937c941da8a6385a891b6bdc9e00ff5e355d716"} Jan 20 23:38:31 crc kubenswrapper[5030]: I0120 23:38:31.598391 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" podStartSLOduration=2.5983817890000003 podStartE2EDuration="2.598381789s" podCreationTimestamp="2026-01-20 23:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:31.595766575 +0000 UTC m=+3783.916026863" watchObservedRunningTime="2026-01-20 23:38:31.598381789 +0000 UTC m=+3783.918642077" Jan 20 23:38:32 crc kubenswrapper[5030]: I0120 23:38:32.910865 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:32 crc kubenswrapper[5030]: I0120 23:38:32.961405 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:38:32 crc kubenswrapper[5030]: E0120 23:38:32.961720 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.062167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-867jb\" (UniqueName: \"kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb\") pod \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.062336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts\") pod \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\" (UID: \"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.063196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" (UID: "2d6bc82a-43cc-4f35-b324-c2f2a40cefbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.068473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb" (OuterVolumeSpecName: "kube-api-access-867jb") pod "2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" (UID: "2d6bc82a-43cc-4f35-b324-c2f2a40cefbd"). InnerVolumeSpecName "kube-api-access-867jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.164333 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.164364 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-867jb\" (UniqueName: \"kubernetes.io/projected/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd-kube-api-access-867jb\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.173157 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.175897 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.182118 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.189552 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.195904 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts\") pod \"5cc0a822-f532-4667-983b-b4b9f5653853\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdddt\" (UniqueName: \"kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt\") pod \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7462f\" (UniqueName: \"kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f\") pod \"5cc0a822-f532-4667-983b-b4b9f5653853\" (UID: \"5cc0a822-f532-4667-983b-b4b9f5653853\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts\") pod \"03910f64-db26-4c5e-8122-f88cce939744\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts\") pod \"6cf760db-bb64-461a-ae21-2ce8224f1f02\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts\") pod \"c1b44006-d170-4768-8096-67d4dd02546c\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dtf\" (UniqueName: \"kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf\") pod \"6cf760db-bb64-461a-ae21-2ce8224f1f02\" (UID: \"6cf760db-bb64-461a-ae21-2ce8224f1f02\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w2vw\" (UniqueName: \"kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw\") pod \"03910f64-db26-4c5e-8122-f88cce939744\" (UID: \"03910f64-db26-4c5e-8122-f88cce939744\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts\") pod \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\" (UID: \"78f33bd8-c4ff-4eec-96d8-8244037c2edb\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j82l\" (UniqueName: \"kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l\") pod \"c1b44006-d170-4768-8096-67d4dd02546c\" (UID: \"c1b44006-d170-4768-8096-67d4dd02546c\") " Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.265931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5cc0a822-f532-4667-983b-b4b9f5653853" (UID: "5cc0a822-f532-4667-983b-b4b9f5653853"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.266635 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78f33bd8-c4ff-4eec-96d8-8244037c2edb" (UID: "78f33bd8-c4ff-4eec-96d8-8244037c2edb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.266717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1b44006-d170-4768-8096-67d4dd02546c" (UID: "c1b44006-d170-4768-8096-67d4dd02546c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.266904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cf760db-bb64-461a-ae21-2ce8224f1f02" (UID: "6cf760db-bb64-461a-ae21-2ce8224f1f02"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.266950 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03910f64-db26-4c5e-8122-f88cce939744" (UID: "03910f64-db26-4c5e-8122-f88cce939744"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.271976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f" (OuterVolumeSpecName: "kube-api-access-7462f") pod "5cc0a822-f532-4667-983b-b4b9f5653853" (UID: "5cc0a822-f532-4667-983b-b4b9f5653853"). InnerVolumeSpecName "kube-api-access-7462f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.272089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l" (OuterVolumeSpecName: "kube-api-access-7j82l") pod "c1b44006-d170-4768-8096-67d4dd02546c" (UID: "c1b44006-d170-4768-8096-67d4dd02546c"). InnerVolumeSpecName "kube-api-access-7j82l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.272070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt" (OuterVolumeSpecName: "kube-api-access-fdddt") pod "78f33bd8-c4ff-4eec-96d8-8244037c2edb" (UID: "78f33bd8-c4ff-4eec-96d8-8244037c2edb"). InnerVolumeSpecName "kube-api-access-fdddt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.272569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf" (OuterVolumeSpecName: "kube-api-access-x2dtf") pod "6cf760db-bb64-461a-ae21-2ce8224f1f02" (UID: "6cf760db-bb64-461a-ae21-2ce8224f1f02"). InnerVolumeSpecName "kube-api-access-x2dtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.272702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw" (OuterVolumeSpecName: "kube-api-access-5w2vw") pod "03910f64-db26-4c5e-8122-f88cce939744" (UID: "03910f64-db26-4c5e-8122-f88cce939744"). InnerVolumeSpecName "kube-api-access-5w2vw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367584 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03910f64-db26-4c5e-8122-f88cce939744-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367655 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cf760db-bb64-461a-ae21-2ce8224f1f02-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367671 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1b44006-d170-4768-8096-67d4dd02546c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367688 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dtf\" (UniqueName: \"kubernetes.io/projected/6cf760db-bb64-461a-ae21-2ce8224f1f02-kube-api-access-x2dtf\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367705 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w2vw\" (UniqueName: \"kubernetes.io/projected/03910f64-db26-4c5e-8122-f88cce939744-kube-api-access-5w2vw\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367719 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78f33bd8-c4ff-4eec-96d8-8244037c2edb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367729 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j82l\" (UniqueName: \"kubernetes.io/projected/c1b44006-d170-4768-8096-67d4dd02546c-kube-api-access-7j82l\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367740 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5cc0a822-f532-4667-983b-b4b9f5653853-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367752 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdddt\" (UniqueName: \"kubernetes.io/projected/78f33bd8-c4ff-4eec-96d8-8244037c2edb-kube-api-access-fdddt\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.367763 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7462f\" (UniqueName: \"kubernetes.io/projected/5cc0a822-f532-4667-983b-b4b9f5653853-kube-api-access-7462f\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.612249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" event={"ID":"5cc0a822-f532-4667-983b-b4b9f5653853","Type":"ContainerDied","Data":"d54ab2f85f66f8457f34169e9847a2fe44979cf3d314ce04c05609db3a1aed06"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.612288 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54ab2f85f66f8457f34169e9847a2fe44979cf3d314ce04c05609db3a1aed06" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.612261 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.619315 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-mntdc" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.619313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-mntdc" event={"ID":"2d6bc82a-43cc-4f35-b324-c2f2a40cefbd","Type":"ContainerDied","Data":"1472d1a189782e4d395988355937c941da8a6385a891b6bdc9e00ff5e355d716"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.619809 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1472d1a189782e4d395988355937c941da8a6385a891b6bdc9e00ff5e355d716" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.621207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" event={"ID":"c1b44006-d170-4768-8096-67d4dd02546c","Type":"ContainerDied","Data":"30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.621236 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f740f0b5884f90a12c89fa937bcca6a6cefd44038e21be7d915cf1590af8ef" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.621283 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.622867 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.622918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm" event={"ID":"6cf760db-bb64-461a-ae21-2ce8224f1f02","Type":"ContainerDied","Data":"5b7059a24facc4542be8e98f2c72d2d99359b36b003fc71e7b7f1b33db579d9b"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.623072 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7059a24facc4542be8e98f2c72d2d99359b36b003fc71e7b7f1b33db579d9b" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.624293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" event={"ID":"78f33bd8-c4ff-4eec-96d8-8244037c2edb","Type":"ContainerDied","Data":"2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.624319 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2acd59d569d7da0376b43f4f386bb440469b9af6041c4a8afee8b70d16476a2d" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.624370 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-ch2zp" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.630948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" event={"ID":"03910f64-db26-4c5e-8122-f88cce939744","Type":"ContainerDied","Data":"43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3"} Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.630986 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43107f807a17e16ac0ede176fb29d085eb4a00c3808ee050bfe09f6d6d8992a3" Jan 20 23:38:33 crc kubenswrapper[5030]: I0120 23:38:33.631023 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-jkq6c" Jan 20 23:38:34 crc kubenswrapper[5030]: I0120 23:38:34.645339 5030 generic.go:334] "Generic (PLEG): container finished" podID="ead812d7-990f-4f88-87fc-8620e73f9e67" containerID="63cdd757318961398a57e724643715beb0c52c0c72e8648c72d7b3a6a1963c1a" exitCode=0 Jan 20 23:38:34 crc kubenswrapper[5030]: I0120 23:38:34.645473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" event={"ID":"ead812d7-990f-4f88-87fc-8620e73f9e67","Type":"ContainerDied","Data":"63cdd757318961398a57e724643715beb0c52c0c72e8648c72d7b3a6a1963c1a"} Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.000689 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.117883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data\") pod \"ead812d7-990f-4f88-87fc-8620e73f9e67\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.117941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rmvr\" (UniqueName: \"kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr\") pod \"ead812d7-990f-4f88-87fc-8620e73f9e67\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.117983 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle\") pod \"ead812d7-990f-4f88-87fc-8620e73f9e67\" (UID: \"ead812d7-990f-4f88-87fc-8620e73f9e67\") " Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.125362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr" (OuterVolumeSpecName: "kube-api-access-7rmvr") pod "ead812d7-990f-4f88-87fc-8620e73f9e67" (UID: "ead812d7-990f-4f88-87fc-8620e73f9e67"). InnerVolumeSpecName "kube-api-access-7rmvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.142193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ead812d7-990f-4f88-87fc-8620e73f9e67" (UID: "ead812d7-990f-4f88-87fc-8620e73f9e67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.165976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data" (OuterVolumeSpecName: "config-data") pod "ead812d7-990f-4f88-87fc-8620e73f9e67" (UID: "ead812d7-990f-4f88-87fc-8620e73f9e67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.220252 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.220290 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rmvr\" (UniqueName: \"kubernetes.io/projected/ead812d7-990f-4f88-87fc-8620e73f9e67-kube-api-access-7rmvr\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.220302 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ead812d7-990f-4f88-87fc-8620e73f9e67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.664265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" event={"ID":"ead812d7-990f-4f88-87fc-8620e73f9e67","Type":"ContainerDied","Data":"9f29ddc0f2b48906e32eacb6316d95b4b739eb4c8a9b89c661e02e7aa6190316"} Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.664320 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f29ddc0f2b48906e32eacb6316d95b4b739eb4c8a9b89c661e02e7aa6190316" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.664842 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-77ffx" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882165 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6twz6"] Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.882862 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882881 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.882894 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf760db-bb64-461a-ae21-2ce8224f1f02" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882903 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf760db-bb64-461a-ae21-2ce8224f1f02" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.882921 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b44006-d170-4768-8096-67d4dd02546c" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882930 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b44006-d170-4768-8096-67d4dd02546c" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.882956 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03910f64-db26-4c5e-8122-f88cce939744" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882964 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03910f64-db26-4c5e-8122-f88cce939744" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.882978 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f33bd8-c4ff-4eec-96d8-8244037c2edb" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.882986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f33bd8-c4ff-4eec-96d8-8244037c2edb" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.883007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc0a822-f532-4667-983b-b4b9f5653853" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc0a822-f532-4667-983b-b4b9f5653853" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: E0120 23:38:36.883041 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ead812d7-990f-4f88-87fc-8620e73f9e67" containerName="keystone-db-sync" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883050 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ead812d7-990f-4f88-87fc-8620e73f9e67" containerName="keystone-db-sync" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883281 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883305 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf760db-bb64-461a-ae21-2ce8224f1f02" 
containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ead812d7-990f-4f88-87fc-8620e73f9e67" containerName="keystone-db-sync" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883348 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03910f64-db26-4c5e-8122-f88cce939744" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f33bd8-c4ff-4eec-96d8-8244037c2edb" containerName="mariadb-database-create" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883391 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc0a822-f532-4667-983b-b4b9f5653853" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.883406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b44006-d170-4768-8096-67d4dd02546c" containerName="mariadb-account-create-update" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.884071 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.887705 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.887610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2jq2r" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.887866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.888638 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.889331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:38:36 crc kubenswrapper[5030]: I0120 23:38:36.896382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6twz6"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.030815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.030881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.030972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.031000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.031030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.031044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqgmn\" (UniqueName: \"kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.040603 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9k2jq"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.041744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.046124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.046312 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-vdncx" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.046459 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.047334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-72mr2"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.048313 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.067395 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.067976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.071034 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-fdpgm" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.079953 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9k2jq"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ml8b\" (UniqueName: \"kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqgmn\" (UniqueName: \"kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dwc\" (UniqueName: \"kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.135979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.136004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.136030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.147931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts\") pod \"keystone-bootstrap-6twz6\" (UID: 
\"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.149051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-72mr2"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.151860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.153280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.154811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.166371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.183700 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-jkvvn"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.185033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.190912 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-6fhjp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.191177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.198658 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqgmn\" (UniqueName: \"kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn\") pod \"keystone-bootstrap-6twz6\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.200648 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.203490 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-jkvvn"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.222685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dt6fp"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.223912 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.228874 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-fl6p4" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.229450 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.229943 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.237605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ml8b\" (UniqueName: \"kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.238972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dwc\" (UniqueName: \"kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239766 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.239983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.241167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.247436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.248732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.271953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.272512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.272920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.273369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.276556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dwc\" (UniqueName: \"kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc\") pod \"neutron-db-sync-9k2jq\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.289466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ml8b\" (UniqueName: \"kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b\") pod \"cinder-db-sync-72mr2\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.291938 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dt6fp"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.308685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.311202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.318684 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.318734 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.329383 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7rnt\" (UniqueName: \"kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data\") pod 
\"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667rn\" (UniqueName: \"kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.341387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.380246 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.395281 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.442992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443038 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlpd\" (UniqueName: \"kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443928 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.443964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " 
pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444030 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-667rn\" (UniqueName: \"kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.444329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7rnt\" (UniqueName: \"kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.446544 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.447869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.449146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.449416 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.450958 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.451747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.459350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7rnt\" (UniqueName: \"kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt\") pod \"barbican-db-sync-jkvvn\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.462307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-667rn\" (UniqueName: \"kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn\") pod \"placement-db-sync-dt6fp\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.546859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlpd\" (UniqueName: \"kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.547789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.547902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.551756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.552090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.554201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.565177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.568659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlpd\" (UniqueName: \"kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd\") pod \"ceilometer-0\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.670931 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.687725 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.693611 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.766198 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6twz6"] Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.886268 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9k2jq"] Jan 20 23:38:37 crc kubenswrapper[5030]: W0120 23:38:37.897372 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd3e31b_fb34_4bf2_8605_fbb4cac3b7d8.slice/crio-87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184 WatchSource:0}: Error finding container 87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184: Status 404 returned error can't find the container with id 87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184 Jan 20 23:38:37 crc kubenswrapper[5030]: I0120 23:38:37.952537 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-72mr2"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.008476 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.012155 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.012243 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.024255 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.024394 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-fm8zv" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.024591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.024652 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.080030 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.082341 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.086006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.086212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.086499 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.158534 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-jkvvn"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159439 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159564 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159635 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hctx\" (UniqueName: \"kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pttsw\" (UniqueName: \"kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.159715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.261515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.262055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.262616 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.262768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.262843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263189 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hctx\" (UniqueName: \"kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pttsw\" (UniqueName: \"kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.263927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.264021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.264441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.266150 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.266373 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.269087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.270278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.270392 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.271707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.273175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.274779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.275102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.276218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.282736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.283003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.284485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.293285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pttsw\" (UniqueName: \"kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.299094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.300695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hctx\" (UniqueName: \"kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.302993 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dt6fp"] Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.364586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.366449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc 
kubenswrapper[5030]: I0120 23:38:38.409829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.671000 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.689349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" event={"ID":"16ed07d6-f54e-428f-b3fa-a643b2853ea5","Type":"ContainerStarted","Data":"fc9a809137c348f37727899759ea3b9e309c3ff612805eeab5ca65ed89054529"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.689396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" event={"ID":"16ed07d6-f54e-428f-b3fa-a643b2853ea5","Type":"ContainerStarted","Data":"a421fe729f64d3ab86ec8de8613346bcfa0498e9b73024f5a692709053c3d1e3"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.692311 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" event={"ID":"dc45df02-ca0a-4c18-813e-0c644876cd5f","Type":"ContainerStarted","Data":"f82b5e2b8fdcf04aac0cc0473869acff2408489bfdcf343b92873f3fcc70ef99"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.699913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" event={"ID":"a9c4cd6f-c719-424c-8f38-accb36a496ff","Type":"ContainerStarted","Data":"d72333d1c396d27f7c3c36d2d86ebc7b0ddf14b4940fe119415b8d02910ef699"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.701737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerStarted","Data":"d06d3587c0f13cd53f7176d1d3d579702748b547500bf8583f886553abff8edd"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.705080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" event={"ID":"4f92e9a1-385a-43e9-9a11-996c7020a1b2","Type":"ContainerStarted","Data":"2997f4b86dcccf1c849311b9eb484938b4223576f649cdfae1c3ec9c2f959fd7"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.705128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" event={"ID":"4f92e9a1-385a-43e9-9a11-996c7020a1b2","Type":"ContainerStarted","Data":"fed90448d6701867aab4d0aca10897c862f68b24f72f99d1dc32c0b3cdfbed4a"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.706168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" event={"ID":"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8","Type":"ContainerStarted","Data":"64de3c79bce421a0d3c1059b0988a3e76dde367063a872a184ec15dd5d265737"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.706192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" event={"ID":"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8","Type":"ContainerStarted","Data":"87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184"} Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.722801 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" podStartSLOduration=1.7227858710000001 podStartE2EDuration="1.722785871s" podCreationTimestamp="2026-01-20 
23:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:38.721248774 +0000 UTC m=+3791.041509062" watchObservedRunningTime="2026-01-20 23:38:38.722785871 +0000 UTC m=+3791.043046159" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.741744 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" podStartSLOduration=1.74172529 podStartE2EDuration="1.74172529s" podCreationTimestamp="2026-01-20 23:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:38.737490347 +0000 UTC m=+3791.057750635" watchObservedRunningTime="2026-01-20 23:38:38.74172529 +0000 UTC m=+3791.061985568" Jan 20 23:38:38 crc kubenswrapper[5030]: I0120 23:38:38.766932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" podStartSLOduration=2.76691061 podStartE2EDuration="2.76691061s" podCreationTimestamp="2026-01-20 23:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:38.765658879 +0000 UTC m=+3791.085919167" watchObservedRunningTime="2026-01-20 23:38:38.76691061 +0000 UTC m=+3791.087170898" Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.118281 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.273342 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.732006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" event={"ID":"dc45df02-ca0a-4c18-813e-0c644876cd5f","Type":"ContainerStarted","Data":"0279e54b1935433ab1c4cfae7c60466d90a9ac5d6d65fae068ea6efb4d3826be"} Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.738257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" event={"ID":"a9c4cd6f-c719-424c-8f38-accb36a496ff","Type":"ContainerStarted","Data":"f2617bfd0f56a2a7511d6655cd7e74e93508ab10fa8d7a49fc0ec36b48df2eea"} Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.745527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerStarted","Data":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.748562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerStarted","Data":"d84a29f48eda6c9542cfec73109ad2d6e72102f298fe3380909ba98643684f4c"} Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.749660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerStarted","Data":"b15c31aefd43824c37104246d2e6f4d45e4057bf638b4303ee8f366481e26588"} Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.754666 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/cinder-db-sync-72mr2" podStartSLOduration=2.754650938 podStartE2EDuration="2.754650938s" podCreationTimestamp="2026-01-20 23:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:39.752826995 +0000 UTC m=+3792.073087283" watchObservedRunningTime="2026-01-20 23:38:39.754650938 +0000 UTC m=+3792.074911226" Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.767339 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" podStartSLOduration=2.767317995 podStartE2EDuration="2.767317995s" podCreationTimestamp="2026-01-20 23:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:39.764401484 +0000 UTC m=+3792.084661772" watchObservedRunningTime="2026-01-20 23:38:39.767317995 +0000 UTC m=+3792.087578283" Jan 20 23:38:39 crc kubenswrapper[5030]: I0120 23:38:39.995749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:40 crc kubenswrapper[5030]: I0120 23:38:40.016281 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:40 crc kubenswrapper[5030]: I0120 23:38:40.102330 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:40 crc kubenswrapper[5030]: I0120 23:38:40.770974 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerStarted","Data":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} Jan 20 23:38:40 crc kubenswrapper[5030]: I0120 23:38:40.772940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerStarted","Data":"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446"} Jan 20 23:38:40 crc kubenswrapper[5030]: I0120 23:38:40.782507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerStarted","Data":"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949"} Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.796139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerStarted","Data":"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1"} Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.796221 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-log" containerID="cri-o://7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" gracePeriod=30 Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.796702 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-httpd" containerID="cri-o://27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" 
gracePeriod=30 Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.801684 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9c4cd6f-c719-424c-8f38-accb36a496ff" containerID="f2617bfd0f56a2a7511d6655cd7e74e93508ab10fa8d7a49fc0ec36b48df2eea" exitCode=0 Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.801785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" event={"ID":"a9c4cd6f-c719-424c-8f38-accb36a496ff","Type":"ContainerDied","Data":"f2617bfd0f56a2a7511d6655cd7e74e93508ab10fa8d7a49fc0ec36b48df2eea"} Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.808585 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerStarted","Data":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.811120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerStarted","Data":"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8"} Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.811219 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-log" containerID="cri-o://2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" gracePeriod=30 Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.811280 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-httpd" containerID="cri-o://c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" gracePeriod=30 Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.824950 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=5.824935422 podStartE2EDuration="5.824935422s" podCreationTimestamp="2026-01-20 23:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:41.823381734 +0000 UTC m=+3794.143642032" watchObservedRunningTime="2026-01-20 23:38:41.824935422 +0000 UTC m=+3794.145195710" Jan 20 23:38:41 crc kubenswrapper[5030]: I0120 23:38:41.863787 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.863766342 podStartE2EDuration="4.863766342s" podCreationTimestamp="2026-01-20 23:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:41.860238226 +0000 UTC m=+3794.180498524" watchObservedRunningTime="2026-01-20 23:38:41.863766342 +0000 UTC m=+3794.184026630" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.746678 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.752145 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.826580 5030 generic.go:334] "Generic (PLEG): container finished" podID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerID="c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" exitCode=0 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.826636 5030 generic.go:334] "Generic (PLEG): container finished" podID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerID="2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" exitCode=143 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.826770 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.827572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerDied","Data":"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.827610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerDied","Data":"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.827659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2","Type":"ContainerDied","Data":"d84a29f48eda6c9542cfec73109ad2d6e72102f298fe3380909ba98643684f4c"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.827679 5030 scope.go:117] "RemoveContainer" containerID="c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.829668 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f92e9a1-385a-43e9-9a11-996c7020a1b2" containerID="2997f4b86dcccf1c849311b9eb484938b4223576f649cdfae1c3ec9c2f959fd7" exitCode=0 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.829734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" event={"ID":"4f92e9a1-385a-43e9-9a11-996c7020a1b2","Type":"ContainerDied","Data":"2997f4b86dcccf1c849311b9eb484938b4223576f649cdfae1c3ec9c2f959fd7"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.832642 5030 generic.go:334] "Generic (PLEG): container finished" podID="16ed07d6-f54e-428f-b3fa-a643b2853ea5" containerID="fc9a809137c348f37727899759ea3b9e309c3ff612805eeab5ca65ed89054529" exitCode=0 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.832712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" event={"ID":"16ed07d6-f54e-428f-b3fa-a643b2853ea5","Type":"ContainerDied","Data":"fc9a809137c348f37727899759ea3b9e309c3ff612805eeab5ca65ed89054529"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.834685 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.834708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerDied","Data":"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.834679 5030 generic.go:334] "Generic (PLEG): container finished" podID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerID="27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" exitCode=0 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.834826 5030 generic.go:334] "Generic (PLEG): container finished" podID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerID="7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" exitCode=143 Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.835004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerDied","Data":"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.835025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"be7233cb-0da9-481a-82f9-9c839d4f411f","Type":"ContainerDied","Data":"b15c31aefd43824c37104246d2e6f4d45e4057bf638b4303ee8f366481e26588"} Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.856456 5030 scope.go:117] "RemoveContainer" containerID="2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872161 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872202 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pttsw\" (UniqueName: \"kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872345 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run\") pod \"be7233cb-0da9-481a-82f9-9c839d4f411f\" (UID: \"be7233cb-0da9-481a-82f9-9c839d4f411f\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hctx\" (UniqueName: \"kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.872591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs\") pod \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\" (UID: \"4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2\") " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.873025 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.873186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs" (OuterVolumeSpecName: "logs") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.873196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs" (OuterVolumeSpecName: "logs") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.873755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.877759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts" (OuterVolumeSpecName: "scripts") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.877865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx" (OuterVolumeSpecName: "kube-api-access-4hctx") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "kube-api-access-4hctx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.879962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts" (OuterVolumeSpecName: "scripts") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.880801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw" (OuterVolumeSpecName: "kube-api-access-pttsw") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "kube-api-access-pttsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.888932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.890408 5030 scope.go:117] "RemoveContainer" containerID="c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" Jan 20 23:38:42 crc kubenswrapper[5030]: E0120 23:38:42.891221 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8\": container with ID starting with c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8 not found: ID does not exist" containerID="c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.891266 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8"} err="failed to get container status \"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8\": rpc error: code = NotFound desc = could not find container \"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8\": container with ID starting with c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.891297 5030 scope.go:117] "RemoveContainer" containerID="2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.891301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "local-storage19-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: E0120 23:38:42.891846 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446\": container with ID starting with 2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446 not found: ID does not exist" containerID="2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.891891 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446"} err="failed to get container status \"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446\": rpc error: code = NotFound desc = could not find container \"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446\": container with ID starting with 2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.891916 5030 scope.go:117] "RemoveContainer" containerID="c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.892391 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8"} err="failed to get container status \"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8\": rpc error: code = NotFound desc = could not find container \"c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8\": container with ID starting with c6ae9cc73fd1ab6fe4ce6b6cdb7d21e96e3f4826930769b2c673da71daa3eba8 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.892420 5030 scope.go:117] "RemoveContainer" containerID="2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.892757 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446"} err="failed to get container status \"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446\": rpc error: code = NotFound desc = could not find container \"2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446\": container with ID starting with 2f796828d5f559fc60df563c20a27e17602631f15345fc81bc025461357a1446 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.892924 5030 scope.go:117] "RemoveContainer" containerID="27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.919965 5030 scope.go:117] "RemoveContainer" containerID="7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.920359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.937161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.937770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.941004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data" (OuterVolumeSpecName: "config-data") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.951695 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data" (OuterVolumeSpecName: "config-data") pod "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" (UID: "4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.952028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be7233cb-0da9-481a-82f9-9c839d4f411f" (UID: "be7233cb-0da9-481a-82f9-9c839d4f411f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.953951 5030 scope.go:117] "RemoveContainer" containerID="27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" Jan 20 23:38:42 crc kubenswrapper[5030]: E0120 23:38:42.954909 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1\": container with ID starting with 27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1 not found: ID does not exist" containerID="27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.954941 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1"} err="failed to get container status \"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1\": rpc error: code = NotFound desc = could not find container \"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1\": container with ID starting with 27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.954964 5030 scope.go:117] "RemoveContainer" containerID="7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" Jan 20 23:38:42 crc kubenswrapper[5030]: E0120 23:38:42.955303 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949\": container with ID starting with 7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949 not found: ID does not exist" containerID="7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.955336 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949"} err="failed to get container status \"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949\": rpc error: code = NotFound desc = could not find container \"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949\": container with ID starting with 7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.955358 5030 scope.go:117] "RemoveContainer" containerID="27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.955864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1"} err="failed to get container status \"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1\": rpc error: code = NotFound desc = could not find container \"27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1\": container with ID starting with 27082937f503c46182850f7fcaa76d788cfc491ebd6af97baca9134aea8794a1 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.955917 5030 scope.go:117] "RemoveContainer" containerID="7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.956203 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949"} err="failed to get container status \"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949\": rpc error: code = NotFound desc = could not find container \"7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949\": container with ID starting with 7907f52b7b416bebeea27847c68bf6b35b2d087b34a88f383eb271a950847949 not found: ID does not exist" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975871 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975926 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975939 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975973 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975985 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pttsw\" (UniqueName: \"kubernetes.io/projected/be7233cb-0da9-481a-82f9-9c839d4f411f-kube-api-access-pttsw\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.975996 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976005 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976013 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976048 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976057 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7233cb-0da9-481a-82f9-9c839d4f411f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976066 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976074 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976082 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hctx\" (UniqueName: \"kubernetes.io/projected/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-kube-api-access-4hctx\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976090 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976099 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.976110 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7233cb-0da9-481a-82f9-9c839d4f411f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:42 crc kubenswrapper[5030]: I0120 23:38:42.996993 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.003069 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.077073 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.077111 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.108569 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.171044 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.177765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs\") pod \"a9c4cd6f-c719-424c-8f38-accb36a496ff\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.177921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-667rn\" (UniqueName: \"kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn\") pod \"a9c4cd6f-c719-424c-8f38-accb36a496ff\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.178241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs" (OuterVolumeSpecName: "logs") pod "a9c4cd6f-c719-424c-8f38-accb36a496ff" (UID: "a9c4cd6f-c719-424c-8f38-accb36a496ff"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.178701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data\") pod \"a9c4cd6f-c719-424c-8f38-accb36a496ff\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.178764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle\") pod \"a9c4cd6f-c719-424c-8f38-accb36a496ff\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.178942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts\") pod \"a9c4cd6f-c719-424c-8f38-accb36a496ff\" (UID: \"a9c4cd6f-c719-424c-8f38-accb36a496ff\") " Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.179329 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9c4cd6f-c719-424c-8f38-accb36a496ff-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.181842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn" (OuterVolumeSpecName: "kube-api-access-667rn") pod "a9c4cd6f-c719-424c-8f38-accb36a496ff" (UID: "a9c4cd6f-c719-424c-8f38-accb36a496ff"). InnerVolumeSpecName "kube-api-access-667rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.185109 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.203795 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.207286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts" (OuterVolumeSpecName: "scripts") pod "a9c4cd6f-c719-424c-8f38-accb36a496ff" (UID: "a9c4cd6f-c719-424c-8f38-accb36a496ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.214203 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.228515 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: E0120 23:38:43.228967 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.228986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: E0120 23:38:43.229000 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229007 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: E0120 23:38:43.229034 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229040 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: E0120 23:38:43.229052 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229059 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: E0120 23:38:43.229071 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c4cd6f-c719-424c-8f38-accb36a496ff" containerName="placement-db-sync" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229078 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c4cd6f-c719-424c-8f38-accb36a496ff" containerName="placement-db-sync" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229239 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229258 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c4cd6f-c719-424c-8f38-accb36a496ff" containerName="placement-db-sync" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229277 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-httpd" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229290 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.229306 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" containerName="glance-log" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.230286 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.234649 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.238409 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-fm8zv" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.238681 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.238804 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.238818 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.240681 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.242125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.242275 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.245396 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.253394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9c4cd6f-c719-424c-8f38-accb36a496ff" (UID: "a9c4cd6f-c719-424c-8f38-accb36a496ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.254734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.266749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data" (OuterVolumeSpecName: "config-data") pod "a9c4cd6f-c719-424c-8f38-accb36a496ff" (UID: "a9c4cd6f-c719-424c-8f38-accb36a496ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282206 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxnd\" (UniqueName: \"kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282559 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282738 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-667rn\" (UniqueName: 
\"kubernetes.io/projected/a9c4cd6f-c719-424c-8f38-accb36a496ff-kube-api-access-667rn\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282757 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282768 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.282778 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9c4cd6f-c719-424c-8f38-accb36a496ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384156 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384183 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxnd\" (UniqueName: \"kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384827 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf5jx\" (UniqueName: \"kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384927 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.384953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.385487 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.385643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.388472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.388493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.389408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.389874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.402190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxnd\" (UniqueName: \"kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.418111 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf5jx\" (UniqueName: \"kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486305 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.487136 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.487445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.486314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.489835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.491400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.492095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.493426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.505670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf5jx\" (UniqueName: \"kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.525880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.553149 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.597838 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.847570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" event={"ID":"a9c4cd6f-c719-424c-8f38-accb36a496ff","Type":"ContainerDied","Data":"d72333d1c396d27f7c3c36d2d86ebc7b0ddf14b4940fe119415b8d02910ef699"} Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.847842 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72333d1c396d27f7c3c36d2d86ebc7b0ddf14b4940fe119415b8d02910ef699" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.847640 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-dt6fp" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerStarted","Data":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852393 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-central-agent" containerID="cri-o://e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" gracePeriod=30 Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852525 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="proxy-httpd" containerID="cri-o://0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" gracePeriod=30 Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852578 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="sg-core" containerID="cri-o://a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" gracePeriod=30 Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852615 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-notification-agent" containerID="cri-o://bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" gracePeriod=30 Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.852634 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.889274 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.644483722 podStartE2EDuration="6.88925511s" podCreationTimestamp="2026-01-20 23:38:37 +0000 UTC" firstStartedPulling="2026-01-20 23:38:38.294564676 +0000 UTC m=+3790.614824964" lastFinishedPulling="2026-01-20 23:38:42.539336064 +0000 UTC m=+3794.859596352" observedRunningTime="2026-01-20 23:38:43.88138954 +0000 UTC m=+3796.201649828" watchObservedRunningTime="2026-01-20 23:38:43.88925511 +0000 UTC 
m=+3796.209515398" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.949960 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.953143 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.955998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.956442 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-fl6p4" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.957878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.957902 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.957907 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.992875 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2" path="/var/lib/kubelet/pods/4dc7dbb8-d2c9-4ddc-bf16-7042c05e02d2/volumes" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.993710 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7233cb-0da9-481a-82f9-9c839d4f411f" path="/var/lib/kubelet/pods/be7233cb-0da9-481a-82f9-9c839d4f411f/volumes" Jan 20 23:38:43 crc kubenswrapper[5030]: I0120 23:38:43.994337 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.047124 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:38:44 crc kubenswrapper[5030]: W0120 23:38:44.051878 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac3db439_8610_4e8f_9a5f_1c65bea0f129.slice/crio-04e95b1b5d769b8bf96b50ab601e4c2dc69291855dd7ca24cb806180d93d421a WatchSource:0}: Error finding container 04e95b1b5d769b8bf96b50ab601e4c2dc69291855dd7ca24cb806180d93d421a: Status 404 returned error can't find the container with id 04e95b1b5d769b8bf96b50ab601e4c2dc69291855dd7ca24cb806180d93d421a Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.096993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.097053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc 
kubenswrapper[5030]: I0120 23:38:44.097096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.097152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.097171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.097190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkbl\" (UniqueName: \"kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.097218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.132270 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.198555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.198883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.198949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.198987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.199034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.199056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.199080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjkbl\" (UniqueName: \"kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.206955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.217589 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.221510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkbl\" (UniqueName: \"kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.221593 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.243067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.243652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.244009 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs\") pod \"placement-756b878d5d-nf4hr\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.323458 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.393365 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.399917 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7rnt\" (UniqueName: \"kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt\") pod \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle\") pod \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqgmn\" (UniqueName: 
\"kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle\") pod \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\" (UID: \"4f92e9a1-385a-43e9-9a11-996c7020a1b2\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.504900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data\") pod \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\" (UID: \"16ed07d6-f54e-428f-b3fa-a643b2853ea5\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.509188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt" (OuterVolumeSpecName: "kube-api-access-p7rnt") pod "16ed07d6-f54e-428f-b3fa-a643b2853ea5" (UID: "16ed07d6-f54e-428f-b3fa-a643b2853ea5"). InnerVolumeSpecName "kube-api-access-p7rnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.511096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn" (OuterVolumeSpecName: "kube-api-access-hqgmn") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "kube-api-access-hqgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.511426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.514073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts" (OuterVolumeSpecName: "scripts") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.514137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "16ed07d6-f54e-428f-b3fa-a643b2853ea5" (UID: "16ed07d6-f54e-428f-b3fa-a643b2853ea5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.514341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.537391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data" (OuterVolumeSpecName: "config-data") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.537820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16ed07d6-f54e-428f-b3fa-a643b2853ea5" (UID: "16ed07d6-f54e-428f-b3fa-a643b2853ea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.573781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f92e9a1-385a-43e9-9a11-996c7020a1b2" (UID: "4f92e9a1-385a-43e9-9a11-996c7020a1b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.606966 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.606997 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqgmn\" (UniqueName: \"kubernetes.io/projected/4f92e9a1-385a-43e9-9a11-996c7020a1b2-kube-api-access-hqgmn\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607009 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607035 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607044 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607053 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7rnt\" (UniqueName: \"kubernetes.io/projected/16ed07d6-f54e-428f-b3fa-a643b2853ea5-kube-api-access-p7rnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607061 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.607069 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16ed07d6-f54e-428f-b3fa-a643b2853ea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc 
kubenswrapper[5030]: I0120 23:38:44.607077 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f92e9a1-385a-43e9-9a11-996c7020a1b2-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.703117 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlpd\" (UniqueName: \"kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd\") pod \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\" (UID: \"a911a29b-b582-49e1-9a21-80ca6a0c7ef3\") " Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.810942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.811175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.813681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd" (OuterVolumeSpecName: "kube-api-access-jtlpd") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "kube-api-access-jtlpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.818729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts" (OuterVolumeSpecName: "scripts") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.839698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.886498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerStarted","Data":"eb6392344ac4cd2d376c2123ee922c1fcc21577ba39fa816cb1a992087f36ac3"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.886794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerStarted","Data":"04e95b1b5d769b8bf96b50ab601e4c2dc69291855dd7ca24cb806180d93d421a"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.892148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" event={"ID":"4f92e9a1-385a-43e9-9a11-996c7020a1b2","Type":"ContainerDied","Data":"fed90448d6701867aab4d0aca10897c862f68b24f72f99d1dc32c0b3cdfbed4a"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.892184 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed90448d6701867aab4d0aca10897c862f68b24f72f99d1dc32c0b3cdfbed4a" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.892239 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-6twz6" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.901985 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.908548 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.912594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" event={"ID":"16ed07d6-f54e-428f-b3fa-a643b2853ea5","Type":"ContainerDied","Data":"a421fe729f64d3ab86ec8de8613346bcfa0498e9b73024f5a692709053c3d1e3"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.912682 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a421fe729f64d3ab86ec8de8613346bcfa0498e9b73024f5a692709053c3d1e3" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.912761 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-jkvvn" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916641 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916664 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916679 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916687 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916696 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.916706 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlpd\" (UniqueName: \"kubernetes.io/projected/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-kube-api-access-jtlpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.919183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerStarted","Data":"29708025d1281124abcbc26bfc7e1d70b99cc64704cff281617e7f596d5ec915"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.919613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data" (OuterVolumeSpecName: "config-data") pod "a911a29b-b582-49e1-9a21-80ca6a0c7ef3" (UID: "a911a29b-b582-49e1-9a21-80ca6a0c7ef3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:44 crc kubenswrapper[5030]: W0120 23:38:44.922674 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76dc6ed6_f471_4ee1_8257_45b1c366e23f.slice/crio-61604dd3ee019112fb4ab41474008fb18c64429a5a47181a9b74f1e5a3c5e66a WatchSource:0}: Error finding container 61604dd3ee019112fb4ab41474008fb18c64429a5a47181a9b74f1e5a3c5e66a: Status 404 returned error can't find the container with id 61604dd3ee019112fb4ab41474008fb18c64429a5a47181a9b74f1e5a3c5e66a Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938548 5030 generic.go:334] "Generic (PLEG): container finished" podID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" exitCode=0 Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938575 5030 generic.go:334] "Generic (PLEG): container finished" podID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" exitCode=2 Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938585 5030 generic.go:334] "Generic (PLEG): container finished" podID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" exitCode=0 Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938593 5030 generic.go:334] "Generic (PLEG): container finished" podID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" exitCode=0 Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerDied","Data":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938643 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerDied","Data":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerDied","Data":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938688 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerDied","Data":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938696 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a911a29b-b582-49e1-9a21-80ca6a0c7ef3","Type":"ContainerDied","Data":"d06d3587c0f13cd53f7176d1d3d579702748b547500bf8583f886553abff8edd"} Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.938711 5030 scope.go:117] "RemoveContainer" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.942280 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6twz6"] Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.949000 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-6twz6"] Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.962351 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:38:44 crc kubenswrapper[5030]: E0120 23:38:44.962604 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.971026 5030 scope.go:117] "RemoveContainer" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:44 crc kubenswrapper[5030]: I0120 23:38:44.992755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.025348 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a911a29b-b582-49e1-9a21-80ca6a0c7ef3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.035139 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.068201 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.068934 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="proxy-httpd" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.068951 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="proxy-httpd" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.068979 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-notification-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.068985 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-notification-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.069006 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="sg-core" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069013 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="sg-core" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.069031 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16ed07d6-f54e-428f-b3fa-a643b2853ea5" containerName="barbican-db-sync" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16ed07d6-f54e-428f-b3fa-a643b2853ea5" containerName="barbican-db-sync" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.069055 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-central-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069060 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-central-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.069094 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f92e9a1-385a-43e9-9a11-996c7020a1b2" containerName="keystone-bootstrap" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069099 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f92e9a1-385a-43e9-9a11-996c7020a1b2" containerName="keystone-bootstrap" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069396 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-notification-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069413 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="ceilometer-central-agent" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069428 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="sg-core" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069444 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16ed07d6-f54e-428f-b3fa-a643b2853ea5" containerName="barbican-db-sync" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069453 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" containerName="proxy-httpd" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.069475 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f92e9a1-385a-43e9-9a11-996c7020a1b2" containerName="keystone-bootstrap" Jan 20 23:38:45 crc 
kubenswrapper[5030]: I0120 23:38:45.072155 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.094426 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.100237 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.100465 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.149481 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bwkgg"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.151182 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.155129 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.155588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2jq2r" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.178890 5030 scope.go:117] "RemoveContainer" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.179325 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.179472 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.179593 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.192766 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bwkgg"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.226937 5030 scope.go:117] "RemoveContainer" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.239705 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.243044 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.244077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.244159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.244230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.249378 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-6fhjp" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.249687 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.250869 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.251449 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.257801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.257867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.257946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.257975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vm67\" (UniqueName: \"kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjjhd\" (UniqueName: \"kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258140 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.258374 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.266692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.267194 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.270479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.288703 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.315252 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.316680 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.322014 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.322300 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.340537 5030 scope.go:117] "RemoveContainer" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.344057 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": container with ID starting with 0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e not found: ID does not exist" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.344198 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} err="failed to get container status \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": rpc error: code = NotFound desc = could not find container \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": container with ID starting with 0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.344282 5030 scope.go:117] "RemoveContainer" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.346379 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": container with ID starting with a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363 not found: ID does not exist" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.346424 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} err="failed to get container status \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": rpc error: code = NotFound desc = could not find container \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": container with ID starting with a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.346451 5030 scope.go:117] "RemoveContainer" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.350022 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": container with ID starting with bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920 not found: ID does not exist" 
containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.350072 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} err="failed to get container status \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": rpc error: code = NotFound desc = could not find container \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": container with ID starting with bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.350087 5030 scope.go:117] "RemoveContainer" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: E0120 23:38:45.352116 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": container with ID starting with e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675 not found: ID does not exist" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.352141 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} err="failed to get container status \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": rpc error: code = NotFound desc = could not find container \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": container with ID starting with e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.352156 5030 scope.go:117] "RemoveContainer" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.364204 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} err="failed to get container status \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": rpc error: code = NotFound desc = could not find container \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": container with ID starting with 0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.364249 5030 scope.go:117] "RemoveContainer" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.365889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.365941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.365970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.365991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vm67\" (UniqueName: \"kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " 
pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.366225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjjhd\" (UniqueName: \"kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.368763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.369222 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} err="failed to get container status \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": rpc error: code = NotFound desc = could not find container \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": container with ID starting with a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.369252 5030 scope.go:117] "RemoveContainer" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.370172 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} err="failed to get container status \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": rpc error: code = NotFound desc = could not find container \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": container with ID starting with bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.370408 5030 scope.go:117] "RemoveContainer" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.371013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.371018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.371257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: 
I0120 23:38:45.372602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.372720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.372910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.373092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7kx\" (UniqueName: \"kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.371714 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} err="failed to get container status \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": rpc error: code = NotFound desc = could not find container \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": container with ID starting with e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.373374 5030 scope.go:117] "RemoveContainer" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.372537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.373646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.371934 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.375584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.375922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.376187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.376494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.377818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.378100 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} err="failed to get container status \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": rpc error: code = NotFound desc = could not find container \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": container with ID starting with 0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.378137 5030 scope.go:117] "RemoveContainer" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.380196 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} err="failed to get container status \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": rpc error: code = NotFound desc = could not find container \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": container with ID starting with a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.380225 5030 scope.go:117] "RemoveContainer" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.381488 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} err="failed to get container status 
\"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": rpc error: code = NotFound desc = could not find container \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": container with ID starting with bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.381508 5030 scope.go:117] "RemoveContainer" containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.382831 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} err="failed to get container status \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": rpc error: code = NotFound desc = could not find container \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": container with ID starting with e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.382870 5030 scope.go:117] "RemoveContainer" containerID="0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.384450 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e"} err="failed to get container status \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": rpc error: code = NotFound desc = could not find container \"0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e\": container with ID starting with 0e51c555e461fd8532f6b7600c0b30913984303cac5188532c00e14c83ad540e not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.384797 5030 scope.go:117] "RemoveContainer" containerID="a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.385707 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363"} err="failed to get container status \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": rpc error: code = NotFound desc = could not find container \"a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363\": container with ID starting with a1da31b771c865e99b1ca4c9703839796f277650a12924b1384ce598dec0e363 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.385864 5030 scope.go:117] "RemoveContainer" containerID="bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.386424 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920"} err="failed to get container status \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": rpc error: code = NotFound desc = could not find container \"bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920\": container with ID starting with bb70d3ee221088b8cddfd6852f1a3f09f6a2e5542909f4e544585ea3103b6920 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.386452 5030 scope.go:117] "RemoveContainer" 
containerID="e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.387164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjjhd\" (UniqueName: \"kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.387898 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675"} err="failed to get container status \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": rpc error: code = NotFound desc = could not find container \"e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675\": container with ID starting with e0e64c6b1141d464cb3756e1dfef21b335e426bcf037ecfc5a733d7073838675 not found: ID does not exist" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.389303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.391051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vm67\" (UniqueName: \"kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67\") pod \"keystone-bootstrap-bwkgg\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7kx\" (UniqueName: \"kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxdl\" (UniqueName: \"kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: 
I0120 23:38:45.475495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475551 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475583 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475602 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475664 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.475737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkqj\" (UniqueName: \"kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.478614 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.483067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.485130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.486970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.494847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7kx\" (UniqueName: \"kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx\") pod \"barbican-worker-7dbc7bf587-fpsqb\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.496429 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.546871 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580178 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxdl\" (UniqueName: \"kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.580842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkqj\" (UniqueName: \"kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.581133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.582068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.594823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.595883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.603921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkqj\" (UniqueName: \"kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.607130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.608077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.608267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data\") pod \"barbican-keystone-listener-58746674b-tcfk9\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.609107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.621013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxdl\" (UniqueName: \"kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl\") pod \"barbican-api-6c8f595fb-7ttwr\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.625275 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.642423 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.994915 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f92e9a1-385a-43e9-9a11-996c7020a1b2" path="/var/lib/kubelet/pods/4f92e9a1-385a-43e9-9a11-996c7020a1b2/volumes" Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.998773 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc45df02-ca0a-4c18-813e-0c644876cd5f" containerID="0279e54b1935433ab1c4cfae7c60466d90a9ac5d6d65fae068ea6efb4d3826be" exitCode=0 Jan 20 23:38:45 crc kubenswrapper[5030]: I0120 23:38:45.999081 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a911a29b-b582-49e1-9a21-80ca6a0c7ef3" path="/var/lib/kubelet/pods/a911a29b-b582-49e1-9a21-80ca6a0c7ef3/volumes" Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:45.999822 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" event={"ID":"dc45df02-ca0a-4c18-813e-0c644876cd5f","Type":"ContainerDied","Data":"0279e54b1935433ab1c4cfae7c60466d90a9ac5d6d65fae068ea6efb4d3826be"} Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.000678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerStarted","Data":"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99"} Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.000734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" 
event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerStarted","Data":"61604dd3ee019112fb4ab41474008fb18c64429a5a47181a9b74f1e5a3c5e66a"} Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.013201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerStarted","Data":"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75"} Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.019327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerStarted","Data":"1e5fc2dd3d62da6838f1090bb2515031a07a9ece9f1eea10ec89734922fc4bee"} Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.084242 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.084220842 podStartE2EDuration="3.084220842s" podCreationTimestamp="2026-01-20 23:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:46.068670435 +0000 UTC m=+3798.388930723" watchObservedRunningTime="2026-01-20 23:38:46.084220842 +0000 UTC m=+3798.404481130" Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.228323 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.336277 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bwkgg"] Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.462848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.551458 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:38:46 crc kubenswrapper[5030]: W0120 23:38:46.554412 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1161dc0a_c665_401c_8c5c_b83131eaa38f.slice/crio-c1dd53c78fed5cee69a5f91895dc201bbc2fe207fa7903633b2e6e71f5f1f35c WatchSource:0}: Error finding container c1dd53c78fed5cee69a5f91895dc201bbc2fe207fa7903633b2e6e71f5f1f35c: Status 404 returned error can't find the container with id c1dd53c78fed5cee69a5f91895dc201bbc2fe207fa7903633b2e6e71f5f1f35c Jan 20 23:38:46 crc kubenswrapper[5030]: I0120 23:38:46.575969 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.035765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerStarted","Data":"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.036134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerStarted","Data":"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.036157 
5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerStarted","Data":"21d55cc3a6c969e7c1c2c0564777fef0d5cf3f2d2bb4b9fd2dfc1576e0c9b551"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.048336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerStarted","Data":"28307d7bbfeb70415d5fdbdcc1a1fa76a855c4987e015218eeec904fb9ec5dd4"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.048385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerStarted","Data":"df557b3755d6472f06a2932a5f96af17b792113372117c45e3eed65984d039c5"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.051665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerStarted","Data":"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.051719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerStarted","Data":"c1dd53c78fed5cee69a5f91895dc201bbc2fe207fa7903633b2e6e71f5f1f35c"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.056740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerStarted","Data":"bcb765c17614757a78d0787bf07765891e263896d488270c1630049dbad86bec"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.056779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerStarted","Data":"0abcbb7a9d6e82021e300ad745a202ed6d5ab8ccf9cc046a9780640d94270cfa"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.060281 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" podStartSLOduration=2.060259637 podStartE2EDuration="2.060259637s" podCreationTimestamp="2026-01-20 23:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:47.051360222 +0000 UTC m=+3799.371620510" watchObservedRunningTime="2026-01-20 23:38:47.060259637 +0000 UTC m=+3799.380519925" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.062360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" event={"ID":"ec6d9b48-412f-49d8-ab21-20ab43366732","Type":"ContainerStarted","Data":"6cf9372d371534e9f69cfa3095035fd05d0953c31466438fdef72192fdad5f2f"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.062393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" event={"ID":"ec6d9b48-412f-49d8-ab21-20ab43366732","Type":"ContainerStarted","Data":"e0ca2a3c027208829725e32f70d4925c11b7b17fdd512d8d91fead38c8bad93f"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.064868 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerStarted","Data":"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.065574 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.065595 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.080383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerStarted","Data":"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b"} Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.090571 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" podStartSLOduration=2.090553541 podStartE2EDuration="2.090553541s" podCreationTimestamp="2026-01-20 23:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:47.081847721 +0000 UTC m=+3799.402108009" watchObservedRunningTime="2026-01-20 23:38:47.090553541 +0000 UTC m=+3799.410813829" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.141750 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.141731549 podStartE2EDuration="4.141731549s" podCreationTimestamp="2026-01-20 23:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:47.115092394 +0000 UTC m=+3799.435352682" watchObservedRunningTime="2026-01-20 23:38:47.141731549 +0000 UTC m=+3799.461991837" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.143453 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" podStartSLOduration=4.143449422 podStartE2EDuration="4.143449422s" podCreationTimestamp="2026-01-20 23:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:47.141279929 +0000 UTC m=+3799.461540227" watchObservedRunningTime="2026-01-20 23:38:47.143449422 +0000 UTC m=+3799.463709710" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.441456 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.542361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.542422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.542475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.542513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ml8b\" (UniqueName: \"kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.543114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.543138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts\") pod \"dc45df02-ca0a-4c18-813e-0c644876cd5f\" (UID: \"dc45df02-ca0a-4c18-813e-0c644876cd5f\") " Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.543166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.543474 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc45df02-ca0a-4c18-813e-0c644876cd5f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.550653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.551756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b" (OuterVolumeSpecName: "kube-api-access-7ml8b") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "kube-api-access-7ml8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.573750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.578095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts" (OuterVolumeSpecName: "scripts") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.588214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data" (OuterVolumeSpecName: "config-data") pod "dc45df02-ca0a-4c18-813e-0c644876cd5f" (UID: "dc45df02-ca0a-4c18-813e-0c644876cd5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.644958 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.644991 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.645002 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.645015 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc45df02-ca0a-4c18-813e-0c644876cd5f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:47 crc kubenswrapper[5030]: I0120 23:38:47.645026 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ml8b\" (UniqueName: \"kubernetes.io/projected/dc45df02-ca0a-4c18-813e-0c644876cd5f-kube-api-access-7ml8b\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.089025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerStarted","Data":"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3"} Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.093021 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" event={"ID":"dc45df02-ca0a-4c18-813e-0c644876cd5f","Type":"ContainerDied","Data":"f82b5e2b8fdcf04aac0cc0473869acff2408489bfdcf343b92873f3fcc70ef99"} Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.093057 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f82b5e2b8fdcf04aac0cc0473869acff2408489bfdcf343b92873f3fcc70ef99" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.093115 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-72mr2" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.098049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerStarted","Data":"9add33d5c8c64359fc934b28c89533d4295fa85906204392d93b43522ebe74ff"} Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.128530 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" podStartSLOduration=3.128511595 podStartE2EDuration="3.128511595s" podCreationTimestamp="2026-01-20 23:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:48.115587683 +0000 UTC m=+3800.435847961" watchObservedRunningTime="2026-01-20 23:38:48.128511595 +0000 UTC m=+3800.448771883" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.139202 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:38:48 crc kubenswrapper[5030]: E0120 23:38:48.139547 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc45df02-ca0a-4c18-813e-0c644876cd5f" containerName="cinder-db-sync" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.139561 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc45df02-ca0a-4c18-813e-0c644876cd5f" containerName="cinder-db-sync" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.139756 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc45df02-ca0a-4c18-813e-0c644876cd5f" containerName="cinder-db-sync" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.141032 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.142829 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.145903 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.146968 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" podStartSLOduration=3.146950902 podStartE2EDuration="3.146950902s" podCreationTimestamp="2026-01-20 23:38:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:48.145987369 +0000 UTC m=+3800.466247657" watchObservedRunningTime="2026-01-20 23:38:48.146950902 +0000 UTC m=+3800.467211180" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.168575 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.250862 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.254948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255347 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.255417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j47\" (UniqueName: \"kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.257256 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.262404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-fdpgm" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.262582 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.262912 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.269934 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.288345 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357800 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-47j47\" (UniqueName: \"kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357932 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxgx8\" (UniqueName: \"kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.357992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.358036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.358055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc 
kubenswrapper[5030]: I0120 23:38:48.358752 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.378396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.378869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.381107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.381630 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.392459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j47\" (UniqueName: \"kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.401797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle\") pod \"barbican-api-5646794dd4-m2p4g\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.427334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.429792 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.432254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.448317 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.459599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.459710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.459860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.459882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.460003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.460058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxgx8\" (UniqueName: \"kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.460088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.465698 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.466955 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.468641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.471249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.481474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxgx8\" (UniqueName: \"kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8\") pod \"cinder-scheduler-0\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.493660 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561679 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp2kr\" (UniqueName: \"kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc 
kubenswrapper[5030]: I0120 23:38:48.561712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.561738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.617044 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp2kr\" (UniqueName: \"kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.663275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.666754 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.667157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.943854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.943916 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.944239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.944288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp2kr\" (UniqueName: \"kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.947509 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts\") pod \"cinder-api-0\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:48 crc kubenswrapper[5030]: I0120 23:38:48.968193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.093861 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.119010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerStarted","Data":"3e92041344bf2c8c86a9132f7d9aaaf809ef3136b22a0f4e7d195f8186abd2fa"} Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.127395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerStarted","Data":"000435c0bff778492d15b82f751b1f419178a6ddf8486dc227ec0079586ede6f"} Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.128253 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.128706 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.421413 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:38:49 crc kubenswrapper[5030]: W0120 23:38:49.427443 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod014aff8f_a8f6_4d53_953b_7c86894fa72e.slice/crio-45f4412fea28ea42e7838d41452887b21b99eea86289ac7c55ef399cd6ff1b9f WatchSource:0}: Error finding container 45f4412fea28ea42e7838d41452887b21b99eea86289ac7c55ef399cd6ff1b9f: Status 404 returned error can't find the container with id 45f4412fea28ea42e7838d41452887b21b99eea86289ac7c55ef399cd6ff1b9f Jan 20 23:38:49 crc kubenswrapper[5030]: I0120 23:38:49.603282 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.134149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerStarted","Data":"e495597c2ee7184e2d7f9eb7748e2376695b348cdd6836a0d7a5a1efb9bae452"} Jan 20 23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.137159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerStarted","Data":"0c1ae3ce16b4465aaf74d3794c1e89b42c44f57190525b1e91b588410473e05b"} Jan 20 23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.139226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerStarted","Data":"45f4412fea28ea42e7838d41452887b21b99eea86289ac7c55ef399cd6ff1b9f"} Jan 20 23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.141312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerStarted","Data":"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816"} Jan 20 23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.141364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerStarted","Data":"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497"} Jan 20 
23:38:50 crc kubenswrapper[5030]: I0120 23:38:50.171265 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" podStartSLOduration=2.171239571 podStartE2EDuration="2.171239571s" podCreationTimestamp="2026-01-20 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:50.156225958 +0000 UTC m=+3802.476486246" watchObservedRunningTime="2026-01-20 23:38:50.171239571 +0000 UTC m=+3802.491499869" Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.154870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerStarted","Data":"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e"} Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.159945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerStarted","Data":"190149a065d2528b829a609717456c9410c7dc363069daad9b87990930390ba2"} Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.161991 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.164399 5030 generic.go:334] "Generic (PLEG): container finished" podID="ec6d9b48-412f-49d8-ab21-20ab43366732" containerID="6cf9372d371534e9f69cfa3095035fd05d0953c31466438fdef72192fdad5f2f" exitCode=0 Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.164462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" event={"ID":"ec6d9b48-412f-49d8-ab21-20ab43366732","Type":"ContainerDied","Data":"6cf9372d371534e9f69cfa3095035fd05d0953c31466438fdef72192fdad5f2f"} Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.171834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerStarted","Data":"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569"} Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.171904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.172667 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.197577 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.7641231189999997 podStartE2EDuration="7.197558014s" podCreationTimestamp="2026-01-20 23:38:44 +0000 UTC" firstStartedPulling="2026-01-20 23:38:46.233919755 +0000 UTC m=+3798.554180043" lastFinishedPulling="2026-01-20 23:38:50.66735465 +0000 UTC m=+3802.987614938" observedRunningTime="2026-01-20 23:38:51.182334806 +0000 UTC m=+3803.502595114" watchObservedRunningTime="2026-01-20 23:38:51.197558014 +0000 UTC m=+3803.517818302" Jan 20 23:38:51 crc kubenswrapper[5030]: I0120 23:38:51.704567 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.180775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerStarted","Data":"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b"} Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.183204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerStarted","Data":"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1"} Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.183542 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.213562 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=4.213539938 podStartE2EDuration="4.213539938s" podCreationTimestamp="2026-01-20 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:52.20248461 +0000 UTC m=+3804.522744898" watchObservedRunningTime="2026-01-20 23:38:52.213539938 +0000 UTC m=+3804.533800226" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.227814 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=4.227795162 podStartE2EDuration="4.227795162s" podCreationTimestamp="2026-01-20 23:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:52.22153305 +0000 UTC m=+3804.541793338" watchObservedRunningTime="2026-01-20 23:38:52.227795162 +0000 UTC m=+3804.548055450" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.605796 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.651714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.652696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vm67\" (UniqueName: \"kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.652736 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.652898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.652968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.653010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys\") pod \"ec6d9b48-412f-49d8-ab21-20ab43366732\" (UID: \"ec6d9b48-412f-49d8-ab21-20ab43366732\") " Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.656959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67" (OuterVolumeSpecName: "kube-api-access-8vm67") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "kube-api-access-8vm67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.662005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.664212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts" (OuterVolumeSpecName: "scripts") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.668192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.682835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data" (OuterVolumeSpecName: "config-data") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.696044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec6d9b48-412f-49d8-ab21-20ab43366732" (UID: "ec6d9b48-412f-49d8-ab21-20ab43366732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755190 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755234 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vm67\" (UniqueName: \"kubernetes.io/projected/ec6d9b48-412f-49d8-ab21-20ab43366732-kube-api-access-8vm67\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755248 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755259 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755269 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.755278 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec6d9b48-412f-49d8-ab21-20ab43366732-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:52 crc kubenswrapper[5030]: I0120 23:38:52.836152 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.198435 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.198530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-bwkgg" event={"ID":"ec6d9b48-412f-49d8-ab21-20ab43366732","Type":"ContainerDied","Data":"e0ca2a3c027208829725e32f70d4925c11b7b17fdd512d8d91fead38c8bad93f"} Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.198765 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0ca2a3c027208829725e32f70d4925c11b7b17fdd512d8d91fead38c8bad93f" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.198739 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api-log" containerID="cri-o://8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" gracePeriod=30 Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.198807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api" containerID="cri-o://5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" gracePeriod=30 Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.306549 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:38:53 crc kubenswrapper[5030]: E0120 23:38:53.306971 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec6d9b48-412f-49d8-ab21-20ab43366732" containerName="keystone-bootstrap" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.306990 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec6d9b48-412f-49d8-ab21-20ab43366732" containerName="keystone-bootstrap" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.307224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec6d9b48-412f-49d8-ab21-20ab43366732" containerName="keystone-bootstrap" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.307909 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.317658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-2jq2r" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.318362 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.318666 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.319035 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.319830 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.319935 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.320360 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.366229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.366296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.366322 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.366492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.367659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz8dd\" (UniqueName: \"kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.367756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.367915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.367956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470110 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz8dd\" (UniqueName: \"kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470296 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.470343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.476201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.476533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.477966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.478395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.479139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.480761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.480787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.491531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz8dd\" (UniqueName: 
\"kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd\") pod \"keystone-68445ff9cc-cq5qp\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.555510 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.555752 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.587263 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.598860 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.598908 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.600289 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.618604 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.644578 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.650012 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.671288 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.874287 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981006 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp2kr\" (UniqueName: \"kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr\") pod \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\" (UID: \"cf2362a0-6768-424c-9478-2f6aa4a3f48e\") " Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs" (OuterVolumeSpecName: "logs") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.981995 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf2362a0-6768-424c-9478-2f6aa4a3f48e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.986717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.986861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts" (OuterVolumeSpecName: "scripts") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:53 crc kubenswrapper[5030]: I0120 23:38:53.988824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr" (OuterVolumeSpecName: "kube-api-access-tp2kr") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "kube-api-access-tp2kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.008067 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.031731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data" (OuterVolumeSpecName: "config-data") pod "cf2362a0-6768-424c-9478-2f6aa4a3f48e" (UID: "cf2362a0-6768-424c-9478-2f6aa4a3f48e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084214 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084259 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp2kr\" (UniqueName: \"kubernetes.io/projected/cf2362a0-6768-424c-9478-2f6aa4a3f48e-kube-api-access-tp2kr\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084274 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2362a0-6768-424c-9478-2f6aa4a3f48e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084288 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084302 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.084315 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2362a0-6768-424c-9478-2f6aa4a3f48e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.130583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:38:54 crc kubenswrapper[5030]: W0120 23:38:54.134020 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29283d66_1ecd_4ad9_8bed_53cc737db34d.slice/crio-1723031c1195b8f6581b0b5d0426fa6da1e1b46eb2b6facbd47b84947f8a7a69 WatchSource:0}: Error finding container 1723031c1195b8f6581b0b5d0426fa6da1e1b46eb2b6facbd47b84947f8a7a69: Status 404 returned error can't find the container with id 1723031c1195b8f6581b0b5d0426fa6da1e1b46eb2b6facbd47b84947f8a7a69 Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.235998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" event={"ID":"29283d66-1ecd-4ad9-8bed-53cc737db34d","Type":"ContainerStarted","Data":"1723031c1195b8f6581b0b5d0426fa6da1e1b46eb2b6facbd47b84947f8a7a69"} Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.264583 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerID="5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" exitCode=0 Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.264616 5030 generic.go:334] "Generic (PLEG): container finished" podID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerID="8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" exitCode=143 Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.264678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerDied","Data":"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1"} Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 
23:38:54.264739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerDied","Data":"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e"} Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.264749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cf2362a0-6768-424c-9478-2f6aa4a3f48e","Type":"ContainerDied","Data":"e495597c2ee7184e2d7f9eb7748e2376695b348cdd6836a0d7a5a1efb9bae452"} Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.264765 5030 scope.go:117] "RemoveContainer" containerID="5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.265006 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.268395 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.268425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.268458 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.268632 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.333771 5030 scope.go:117] "RemoveContainer" containerID="8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.356561 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.362780 5030 scope.go:117] "RemoveContainer" containerID="5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" Jan 20 23:38:54 crc kubenswrapper[5030]: E0120 23:38:54.363547 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1\": container with ID starting with 5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1 not found: ID does not exist" containerID="5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.363681 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1"} err="failed to get container status \"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1\": rpc error: code = NotFound desc = could not find container \"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1\": container with ID starting with 5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1 not found: ID does not exist" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.363720 5030 scope.go:117] "RemoveContainer" containerID="8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" Jan 20 23:38:54 crc kubenswrapper[5030]: E0120 23:38:54.369795 5030 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e\": container with ID starting with 8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e not found: ID does not exist" containerID="8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.369846 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e"} err="failed to get container status \"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e\": rpc error: code = NotFound desc = could not find container \"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e\": container with ID starting with 8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e not found: ID does not exist" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.369873 5030 scope.go:117] "RemoveContainer" containerID="5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.370281 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1"} err="failed to get container status \"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1\": rpc error: code = NotFound desc = could not find container \"5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1\": container with ID starting with 5778fbace3bdb4aca0baba6f102d17b6071097875f08671a0d93e71ba08dc6d1 not found: ID does not exist" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.370302 5030 scope.go:117] "RemoveContainer" containerID="8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.370490 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e"} err="failed to get container status \"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e\": rpc error: code = NotFound desc = could not find container \"8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e\": container with ID starting with 8aea18258e5bf1db8ae994fbc5533534eb7caf813c43c97644b00fbdec04582e not found: ID does not exist" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.383513 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.412731 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:54 crc kubenswrapper[5030]: E0120 23:38:54.413105 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api-log" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.413115 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api-log" Jan 20 23:38:54 crc kubenswrapper[5030]: E0120 23:38:54.413133 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.413139 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.413314 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api-log" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.413327 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" containerName="cinder-api" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.414215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.419471 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.421432 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.423171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.450800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6xj\" (UniqueName: \"kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.498465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599771 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc 
kubenswrapper[5030]: I0120 23:38:54.599828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.599880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6xj\" (UniqueName: \"kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.600299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.600573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.605410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.606722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.607128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.608236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.616219 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data\") pod \"cinder-api-0\" (UID: 
\"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.616999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.620040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6xj\" (UniqueName: \"kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj\") pod \"cinder-api-0\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.632736 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:38:54 crc kubenswrapper[5030]: I0120 23:38:54.789104 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.163628 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.284879 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" containerID="64de3c79bce421a0d3c1059b0988a3e76dde367063a872a184ec15dd5d265737" exitCode=0 Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.285194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" event={"ID":"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8","Type":"ContainerDied","Data":"64de3c79bce421a0d3c1059b0988a3e76dde367063a872a184ec15dd5d265737"} Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.294770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerStarted","Data":"bbd7996fd07c4b83ef560c5f832bd31063b8def68133a0f6da2b5d4c17533e02"} Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.309875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" event={"ID":"29283d66-1ecd-4ad9-8bed-53cc737db34d","Type":"ContainerStarted","Data":"da549dd5cb777f8caab4bb5e65c66917dfd797083bb493b48e348211e19e4bd3"} Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.342192 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" podStartSLOduration=2.342166088 podStartE2EDuration="2.342166088s" podCreationTimestamp="2026-01-20 23:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:55.336729427 +0000 UTC m=+3807.656989715" watchObservedRunningTime="2026-01-20 23:38:55.342166088 +0000 UTC m=+3807.662426376" Jan 20 23:38:55 crc kubenswrapper[5030]: I0120 23:38:55.971853 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf2362a0-6768-424c-9478-2f6aa4a3f48e" path="/var/lib/kubelet/pods/cf2362a0-6768-424c-9478-2f6aa4a3f48e/volumes" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.335956 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:38:56 crc 
kubenswrapper[5030]: I0120 23:38:56.336308 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.336017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerStarted","Data":"24a1ad1389e7a1626270495336d2b4c5f0f755ad5419efb4129c1e33a7aacb57"} Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.336404 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.336122 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.336492 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.475939 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.677908 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.727764 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.728127 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.766799 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.957105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dwc\" (UniqueName: \"kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc\") pod \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.957239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle\") pod \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.957268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config\") pod \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\" (UID: \"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8\") " Jan 20 23:38:56 crc kubenswrapper[5030]: I0120 23:38:56.964054 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:38:56 crc kubenswrapper[5030]: E0120 23:38:56.964470 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.353295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc" (OuterVolumeSpecName: "kube-api-access-t9dwc") pod "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" (UID: "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8"). InnerVolumeSpecName "kube-api-access-t9dwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.356550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" event={"ID":"2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8","Type":"ContainerDied","Data":"87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184"} Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.356763 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a07af13777c06590e7df973684d5463df34d538f904fcaa92e90e478a73184" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.356580 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-9k2jq" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.360232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerStarted","Data":"2e863572ff39f65b432bd8c77129121cd78d3d35c55dae06d8f91ddfcc6bcec0"} Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.369548 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dwc\" (UniqueName: \"kubernetes.io/projected/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-kube-api-access-t9dwc\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.373582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" (UID: "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.388255 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.388239385 podStartE2EDuration="3.388239385s" podCreationTimestamp="2026-01-20 23:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:38:57.384107096 +0000 UTC m=+3809.704367384" watchObservedRunningTime="2026-01-20 23:38:57.388239385 +0000 UTC m=+3809.708499673" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.390205 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config" (OuterVolumeSpecName: "config") pod "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" (UID: "2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.475223 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.475275 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.550226 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:38:57 crc kubenswrapper[5030]: E0120 23:38:57.550908 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" containerName="neutron-db-sync" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.550925 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" containerName="neutron-db-sync" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.551090 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" containerName="neutron-db-sync" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.551973 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.555757 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.569142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.678370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.678428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xzs\" (UniqueName: \"kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.678486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.678513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.678543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.780092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.780145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.780176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config\") pod \"neutron-6d4488f98f-844w8\" (UID: 
\"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.780238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.780273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xzs\" (UniqueName: \"kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.793394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.793941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.794203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.794878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.811356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xzs\" (UniqueName: \"kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs\") pod \"neutron-6d4488f98f-844w8\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:57 crc kubenswrapper[5030]: I0120 23:38:57.872039 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:38:58 crc kubenswrapper[5030]: I0120 23:38:58.331692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:38:58 crc kubenswrapper[5030]: I0120 23:38:58.378461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerStarted","Data":"a206ff47809785bcd12cca96ed151837fdf7daf32527d1755e335357d3bf1e36"} Jan 20 23:38:58 crc kubenswrapper[5030]: I0120 23:38:58.378510 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:38:58 crc kubenswrapper[5030]: I0120 23:38:58.847373 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:38:58 crc kubenswrapper[5030]: I0120 23:38:58.903199 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:38:59 crc kubenswrapper[5030]: I0120 23:38:59.435968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerStarted","Data":"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772"} Jan 20 23:38:59 crc kubenswrapper[5030]: I0120 23:38:59.436685 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="cinder-scheduler" containerID="cri-o://31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569" gracePeriod=30 Jan 20 23:38:59 crc kubenswrapper[5030]: I0120 23:38:59.436953 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="probe" containerID="cri-o://5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b" gracePeriod=30 Jan 20 23:38:59 crc kubenswrapper[5030]: I0120 23:38:59.988763 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.162132 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.227586 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.227826 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api-log" containerID="cri-o://28307d7bbfeb70415d5fdbdcc1a1fa76a855c4987e015218eeec904fb9ec5dd4" gracePeriod=30 Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.228307 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api" containerID="cri-o://9add33d5c8c64359fc934b28c89533d4295fa85906204392d93b43522ebe74ff" gracePeriod=30 Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.428372 5030 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.436706 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.439046 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.439115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.447285 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.463920 5030 generic.go:334] "Generic (PLEG): container finished" podID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerID="5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b" exitCode=0 Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.464144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerDied","Data":"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b"} Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.466408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.466594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.466741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.466897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.467053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.467159 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.467273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2kn8\" (UniqueName: \"kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.468251 5030 generic.go:334] "Generic (PLEG): container finished" podID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerID="28307d7bbfeb70415d5fdbdcc1a1fa76a855c4987e015218eeec904fb9ec5dd4" exitCode=143 Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.468391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerDied","Data":"28307d7bbfeb70415d5fdbdcc1a1fa76a855c4987e015218eeec904fb9ec5dd4"} Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.472711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerStarted","Data":"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa"} Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.503319 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" podStartSLOduration=3.503299808 podStartE2EDuration="3.503299808s" podCreationTimestamp="2026-01-20 23:38:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:00.495570521 +0000 UTC m=+3812.815830809" watchObservedRunningTime="2026-01-20 23:39:00.503299808 +0000 UTC m=+3812.823560086" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2kn8\" (UniqueName: 
\"kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569615 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.569702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.577206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.577244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.577652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.577695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.577808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.579405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.587303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2kn8\" (UniqueName: \"kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8\") pod \"neutron-5df8ff887d-7k5dd\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:00 crc kubenswrapper[5030]: I0120 23:39:00.759589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:01 crc kubenswrapper[5030]: I0120 23:39:01.251133 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:39:01 crc kubenswrapper[5030]: I0120 23:39:01.484749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerStarted","Data":"ca8c183d14cec1dc67c813d2efe9151b6b9e35b2d43f55c0f5341b36e6776c67"} Jan 20 23:39:01 crc kubenswrapper[5030]: I0120 23:39:01.485091 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.073300 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxgx8\" (UniqueName: \"kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.099595 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom\") pod \"014aff8f-a8f6-4d53-953b-7c86894fa72e\" (UID: \"014aff8f-a8f6-4d53-953b-7c86894fa72e\") " Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.100453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.111203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts" (OuterVolumeSpecName: "scripts") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.111246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.111347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8" (OuterVolumeSpecName: "kube-api-access-nxgx8") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "kube-api-access-nxgx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.161893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.201585 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/014aff8f-a8f6-4d53-953b-7c86894fa72e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.201657 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxgx8\" (UniqueName: \"kubernetes.io/projected/014aff8f-a8f6-4d53-953b-7c86894fa72e-kube-api-access-nxgx8\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.201673 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.201682 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.201690 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.209769 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data" (OuterVolumeSpecName: "config-data") pod "014aff8f-a8f6-4d53-953b-7c86894fa72e" (UID: "014aff8f-a8f6-4d53-953b-7c86894fa72e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.303191 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014aff8f-a8f6-4d53-953b-7c86894fa72e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.494824 5030 generic.go:334] "Generic (PLEG): container finished" podID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerID="31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569" exitCode=0 Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.494892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerDied","Data":"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569"} Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.494908 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.494929 5030 scope.go:117] "RemoveContainer" containerID="5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.494919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"014aff8f-a8f6-4d53-953b-7c86894fa72e","Type":"ContainerDied","Data":"45f4412fea28ea42e7838d41452887b21b99eea86289ac7c55ef399cd6ff1b9f"} Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.500337 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerStarted","Data":"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b"} Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.500401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerStarted","Data":"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093"} Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.501268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.533503 5030 scope.go:117] "RemoveContainer" containerID="31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.540168 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" podStartSLOduration=2.540151612 podStartE2EDuration="2.540151612s" podCreationTimestamp="2026-01-20 23:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:02.529664769 +0000 UTC m=+3814.849925077" watchObservedRunningTime="2026-01-20 23:39:02.540151612 +0000 UTC m=+3814.860411900" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.563053 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.573911 5030 scope.go:117] "RemoveContainer" containerID="5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b" Jan 20 23:39:02 crc kubenswrapper[5030]: E0120 23:39:02.576006 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b\": container with ID starting with 5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b not found: ID does not exist" containerID="5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.576081 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b"} err="failed to get container status \"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b\": rpc error: code = NotFound desc = could not find container \"5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b\": container with ID starting with 5000bb60585bc36c85e8c0849950fa9ce4449e3e4a77a983090c8b7a43545b8b not found: ID does not 
exist" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.576115 5030 scope.go:117] "RemoveContainer" containerID="31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569" Jan 20 23:39:02 crc kubenswrapper[5030]: E0120 23:39:02.577493 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569\": container with ID starting with 31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569 not found: ID does not exist" containerID="31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.577540 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569"} err="failed to get container status \"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569\": rpc error: code = NotFound desc = could not find container \"31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569\": container with ID starting with 31dbae6be5fb055e7d0727117d149fef0e129127a07342e1d40968ebdffe1569 not found: ID does not exist" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.583113 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.591395 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:39:02 crc kubenswrapper[5030]: E0120 23:39:02.591884 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="cinder-scheduler" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.591906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="cinder-scheduler" Jan 20 23:39:02 crc kubenswrapper[5030]: E0120 23:39:02.591925 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="probe" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.591935 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="probe" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.592132 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="probe" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.592164 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" containerName="cinder-scheduler" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.593401 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.595164 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.600596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.609906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.609971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwsx\" (UniqueName: \"kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.610053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.610077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.610125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.610150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwsx\" (UniqueName: \"kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.723649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.728168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.728225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.728870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.729988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.743378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qbwsx\" (UniqueName: \"kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx\") pod \"cinder-scheduler-0\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:02 crc kubenswrapper[5030]: I0120 23:39:02.912776 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.373320 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:43534->10.217.0.211:9311: read: connection reset by peer" Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.373354 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:43524->10.217.0.211:9311: read: connection reset by peer" Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.396238 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.532102 5030 generic.go:334] "Generic (PLEG): container finished" podID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerID="9add33d5c8c64359fc934b28c89533d4295fa85906204392d93b43522ebe74ff" exitCode=0 Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.532332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerDied","Data":"9add33d5c8c64359fc934b28c89533d4295fa85906204392d93b43522ebe74ff"} Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.542590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerStarted","Data":"635b2138e1468785602e2430635c31641ced4fb6ffe0dab196d48d3481258c6f"} Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.944210 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:39:03 crc kubenswrapper[5030]: I0120 23:39:03.984242 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014aff8f-a8f6-4d53-953b-7c86894fa72e" path="/var/lib/kubelet/pods/014aff8f-a8f6-4d53-953b-7c86894fa72e/volumes" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.049849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs\") pod \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.049956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle\") pod \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.050029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxdl\" (UniqueName: \"kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl\") pod \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.050055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom\") pod \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.050143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data\") pod \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\" (UID: \"03bf9cbf-e666-42ca-9b29-f177c78ef68c\") " Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.051407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs" (OuterVolumeSpecName: "logs") pod "03bf9cbf-e666-42ca-9b29-f177c78ef68c" (UID: "03bf9cbf-e666-42ca-9b29-f177c78ef68c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.059847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl" (OuterVolumeSpecName: "kube-api-access-9rxdl") pod "03bf9cbf-e666-42ca-9b29-f177c78ef68c" (UID: "03bf9cbf-e666-42ca-9b29-f177c78ef68c"). InnerVolumeSpecName "kube-api-access-9rxdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.066081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03bf9cbf-e666-42ca-9b29-f177c78ef68c" (UID: "03bf9cbf-e666-42ca-9b29-f177c78ef68c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.078287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03bf9cbf-e666-42ca-9b29-f177c78ef68c" (UID: "03bf9cbf-e666-42ca-9b29-f177c78ef68c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.097440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data" (OuterVolumeSpecName: "config-data") pod "03bf9cbf-e666-42ca-9b29-f177c78ef68c" (UID: "03bf9cbf-e666-42ca-9b29-f177c78ef68c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.152166 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rxdl\" (UniqueName: \"kubernetes.io/projected/03bf9cbf-e666-42ca-9b29-f177c78ef68c-kube-api-access-9rxdl\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.152201 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.152212 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.152221 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03bf9cbf-e666-42ca-9b29-f177c78ef68c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.152233 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03bf9cbf-e666-42ca-9b29-f177c78ef68c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.557747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerStarted","Data":"c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519"} Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.561020 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.561169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr" event={"ID":"03bf9cbf-e666-42ca-9b29-f177c78ef68c","Type":"ContainerDied","Data":"df557b3755d6472f06a2932a5f96af17b792113372117c45e3eed65984d039c5"} Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.561220 5030 scope.go:117] "RemoveContainer" containerID="9add33d5c8c64359fc934b28c89533d4295fa85906204392d93b43522ebe74ff" Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.592904 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.598498 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6c8f595fb-7ttwr"] Jan 20 23:39:04 crc kubenswrapper[5030]: I0120 23:39:04.600411 5030 scope.go:117] "RemoveContainer" containerID="28307d7bbfeb70415d5fdbdcc1a1fa76a855c4987e015218eeec904fb9ec5dd4" Jan 20 23:39:05 crc kubenswrapper[5030]: I0120 23:39:05.578740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerStarted","Data":"e0d56ff4302d57b4640e3a21a93764cd9ca8e599cfa8633aa74cfa5710979040"} Jan 20 23:39:05 crc kubenswrapper[5030]: I0120 23:39:05.609070 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.6090481580000002 podStartE2EDuration="3.609048158s" podCreationTimestamp="2026-01-20 23:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:05.605364358 +0000 UTC m=+3817.925624666" watchObservedRunningTime="2026-01-20 23:39:05.609048158 +0000 UTC m=+3817.929308456" Jan 20 23:39:05 crc kubenswrapper[5030]: I0120 23:39:05.980043 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" path="/var/lib/kubelet/pods/03bf9cbf-e666-42ca-9b29-f177c78ef68c/volumes" Jan 20 23:39:06 crc kubenswrapper[5030]: I0120 23:39:06.539183 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:39:07 crc kubenswrapper[5030]: I0120 23:39:07.913299 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:08 crc kubenswrapper[5030]: I0120 23:39:08.525390 5030 scope.go:117] "RemoveContainer" containerID="5ae7bb9ba42eb770778d66a021e4203f2b42a0265e7c087325892501138ebe93" Jan 20 23:39:08 crc kubenswrapper[5030]: I0120 23:39:08.570134 5030 scope.go:117] "RemoveContainer" containerID="ce9d20226aae9e07d8570216d84d0cf4134e266aaf06bf4acc21b06332b96de8" Jan 20 23:39:08 crc kubenswrapper[5030]: I0120 23:39:08.614005 5030 scope.go:117] "RemoveContainer" containerID="90cc47d6fbfec80696ca78278e6616158c0acbacf6dd0460eedd37b9b8007991" Jan 20 23:39:08 crc kubenswrapper[5030]: I0120 23:39:08.789768 5030 scope.go:117] "RemoveContainer" containerID="114affce9b800cbe0857f605af142f1a72a5d890f7e1f80920fbac277a72b8f3" Jan 20 23:39:08 crc kubenswrapper[5030]: I0120 23:39:08.848483 5030 scope.go:117] "RemoveContainer" containerID="5f9274ccadd108b7edd55790f7101b6f78736bb7058b62758a8e76d34b967ae3" Jan 20 23:39:09 crc 
kubenswrapper[5030]: I0120 23:39:09.962352 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:39:09 crc kubenswrapper[5030]: E0120 23:39:09.964332 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:39:13 crc kubenswrapper[5030]: I0120 23:39:13.120174 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:39:15 crc kubenswrapper[5030]: I0120 23:39:15.350324 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:39:15 crc kubenswrapper[5030]: I0120 23:39:15.351537 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:39:15 crc kubenswrapper[5030]: I0120 23:39:15.601003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:24 crc kubenswrapper[5030]: I0120 23:39:24.966092 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:39:24 crc kubenswrapper[5030]: E0120 23:39:24.967166 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:39:25 crc kubenswrapper[5030]: I0120 23:39:25.181008 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:39:27 crc kubenswrapper[5030]: I0120 23:39:27.880050 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.909747 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:39:28 crc kubenswrapper[5030]: E0120 23:39:28.910356 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api-log" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.910369 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api-log" Jan 20 23:39:28 crc kubenswrapper[5030]: E0120 23:39:28.910383 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.910401 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.910553 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.910572 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03bf9cbf-e666-42ca-9b29-f177c78ef68c" containerName="barbican-api-log" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.911142 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.913931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-gbw25" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.913965 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.914323 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:39:28 crc kubenswrapper[5030]: I0120 23:39:28.930847 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.002695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pst\" (UniqueName: \"kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.002741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.002941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.003179 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.105055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.105179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54pst\" (UniqueName: \"kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.105206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.105271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.107354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.114351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.114829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.146017 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pst\" (UniqueName: \"kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst\") pod \"openstackclient\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.244040 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:39:29 crc kubenswrapper[5030]: I0120 23:39:29.685428 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:39:29 crc kubenswrapper[5030]: W0120 23:39:29.689651 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39ae3f6_b542_4a93_a45b_8b431f5e857e.slice/crio-fc5866f808a8a80519bdf59e575d36f536d08291d9158ae752e796384d962b40 WatchSource:0}: Error finding container fc5866f808a8a80519bdf59e575d36f536d08291d9158ae752e796384d962b40: Status 404 returned error can't find the container with id fc5866f808a8a80519bdf59e575d36f536d08291d9158ae752e796384d962b40 Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.095301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"a39ae3f6-b542-4a93-a45b-8b431f5e857e","Type":"ContainerStarted","Data":"9ea9c1592d15cd47a478230d43c094acf26e6539dd38741ae71e17be69d79521"} Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.095355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"a39ae3f6-b542-4a93-a45b-8b431f5e857e","Type":"ContainerStarted","Data":"fc5866f808a8a80519bdf59e575d36f536d08291d9158ae752e796384d962b40"} Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.110414 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.110398796 podStartE2EDuration="2.110398796s" podCreationTimestamp="2026-01-20 23:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:30.106673766 +0000 UTC m=+3842.426934054" watchObservedRunningTime="2026-01-20 23:39:30.110398796 +0000 UTC m=+3842.430659084" Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.775073 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.820703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.820975 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-api" containerID="cri-o://f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772" gracePeriod=30 Jan 20 23:39:30 crc kubenswrapper[5030]: I0120 23:39:30.821418 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-httpd" containerID="cri-o://d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa" gracePeriod=30 Jan 20 23:39:31 crc kubenswrapper[5030]: I0120 23:39:31.105729 5030 generic.go:334] "Generic (PLEG): container finished" podID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerID="d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa" exitCode=0 Jan 20 23:39:31 crc kubenswrapper[5030]: I0120 23:39:31.105810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" 
event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerDied","Data":"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa"} Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.077350 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.079724 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.082364 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.087496 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.091386 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.092990 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ztzp\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175322 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.175350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ztzp\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277383 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277523 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.277558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.278026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.279182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.284333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.284841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.285360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.286914 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.286928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.300285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7ztzp\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp\") pod \"swift-proxy-7c556c6584-sfpnb\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.408784 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.637371 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.788112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs\") pod \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.788185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xzs\" (UniqueName: \"kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs\") pod \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.788372 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle\") pod \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.788406 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config\") pod \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.788508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config\") pod \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\" (UID: \"f50beaa6-b99c-4026-b8b8-b0832e99c48c\") " Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.793848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f50beaa6-b99c-4026-b8b8-b0832e99c48c" (UID: "f50beaa6-b99c-4026-b8b8-b0832e99c48c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.793871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs" (OuterVolumeSpecName: "kube-api-access-h6xzs") pod "f50beaa6-b99c-4026-b8b8-b0832e99c48c" (UID: "f50beaa6-b99c-4026-b8b8-b0832e99c48c"). InnerVolumeSpecName "kube-api-access-h6xzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.841676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config" (OuterVolumeSpecName: "config") pod "f50beaa6-b99c-4026-b8b8-b0832e99c48c" (UID: "f50beaa6-b99c-4026-b8b8-b0832e99c48c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.855184 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.857392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f50beaa6-b99c-4026-b8b8-b0832e99c48c" (UID: "f50beaa6-b99c-4026-b8b8-b0832e99c48c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.869629 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f50beaa6-b99c-4026-b8b8-b0832e99c48c" (UID: "f50beaa6-b99c-4026-b8b8-b0832e99c48c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.890657 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.890683 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.890695 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.890806 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f50beaa6-b99c-4026-b8b8-b0832e99c48c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:33 crc kubenswrapper[5030]: I0120 23:39:33.890819 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xzs\" (UniqueName: \"kubernetes.io/projected/f50beaa6-b99c-4026-b8b8-b0832e99c48c-kube-api-access-h6xzs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.132955 5030 generic.go:334] "Generic (PLEG): container finished" podID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerID="f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772" exitCode=0 Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.133156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerDied","Data":"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772"} Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.133260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" event={"ID":"f50beaa6-b99c-4026-b8b8-b0832e99c48c","Type":"ContainerDied","Data":"a206ff47809785bcd12cca96ed151837fdf7daf32527d1755e335357d3bf1e36"} Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.133271 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6d4488f98f-844w8" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.133282 5030 scope.go:117] "RemoveContainer" containerID="d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.137078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerStarted","Data":"41286e72f29dcb5bdf7f4831ad8e2fe1b6c22003e27b09645abbfa6ae0917401"} Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.137101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerStarted","Data":"e2cb41bc141dab0e4fad9d0ccee1a7acf0b20b4fe1fc40ccbcac930aca5db7c7"} Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.158665 5030 scope.go:117] "RemoveContainer" containerID="f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.159866 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.170697 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6d4488f98f-844w8"] Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.181092 5030 scope.go:117] "RemoveContainer" containerID="d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa" Jan 20 23:39:34 crc kubenswrapper[5030]: E0120 23:39:34.181609 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa\": container with ID starting with d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa not found: ID does not exist" containerID="d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.181684 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa"} err="failed to get container status \"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa\": rpc error: code = NotFound desc = could not find container \"d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa\": container with ID starting with d72f5c4321b88f7d02041229a51ba006b67c1cd0e7d4f8da3df7d5d8d5c267aa not found: ID does not exist" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.181722 5030 scope.go:117] "RemoveContainer" containerID="f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772" Jan 20 23:39:34 crc kubenswrapper[5030]: E0120 23:39:34.182166 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772\": container with ID starting with f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772 not found: ID does 
not exist" containerID="f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.182201 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772"} err="failed to get container status \"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772\": rpc error: code = NotFound desc = could not find container \"f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772\": container with ID starting with f1127219cbe7c14cf2621b6f40e029352b74a4162ddcc345284bc15d2e259772 not found: ID does not exist" Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.666068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.666802 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-central-agent" containerID="cri-o://bcb765c17614757a78d0787bf07765891e263896d488270c1630049dbad86bec" gracePeriod=30 Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.666965 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="proxy-httpd" containerID="cri-o://190149a065d2528b829a609717456c9410c7dc363069daad9b87990930390ba2" gracePeriod=30 Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.667007 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="sg-core" containerID="cri-o://0c1ae3ce16b4465aaf74d3794c1e89b42c44f57190525b1e91b588410473e05b" gracePeriod=30 Jan 20 23:39:34 crc kubenswrapper[5030]: I0120 23:39:34.667041 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-notification-agent" containerID="cri-o://000435c0bff778492d15b82f751b1f419178a6ddf8486dc227ec0079586ede6f" gracePeriod=30 Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.147702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerStarted","Data":"8f7ace46d65651278377efee09416c1370252c641e56c616324fd9e3e6962d2d"} Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.148047 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.148072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.150881 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerID="190149a065d2528b829a609717456c9410c7dc363069daad9b87990930390ba2" exitCode=0 Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.150917 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerID="0c1ae3ce16b4465aaf74d3794c1e89b42c44f57190525b1e91b588410473e05b" exitCode=2 Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 
23:39:35.150918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerDied","Data":"190149a065d2528b829a609717456c9410c7dc363069daad9b87990930390ba2"} Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.150967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerDied","Data":"0c1ae3ce16b4465aaf74d3794c1e89b42c44f57190525b1e91b588410473e05b"} Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.179020 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" podStartSLOduration=2.179003616 podStartE2EDuration="2.179003616s" podCreationTimestamp="2026-01-20 23:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:35.173259567 +0000 UTC m=+3847.493519855" watchObservedRunningTime="2026-01-20 23:39:35.179003616 +0000 UTC m=+3847.499263904" Jan 20 23:39:35 crc kubenswrapper[5030]: I0120 23:39:35.976298 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" path="/var/lib/kubelet/pods/f50beaa6-b99c-4026-b8b8-b0832e99c48c/volumes" Jan 20 23:39:36 crc kubenswrapper[5030]: I0120 23:39:36.164693 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerID="bcb765c17614757a78d0787bf07765891e263896d488270c1630049dbad86bec" exitCode=0 Jan 20 23:39:36 crc kubenswrapper[5030]: I0120 23:39:36.164784 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerDied","Data":"bcb765c17614757a78d0787bf07765891e263896d488270c1630049dbad86bec"} Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.176432 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerID="000435c0bff778492d15b82f751b1f419178a6ddf8486dc227ec0079586ede6f" exitCode=0 Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.176477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerDied","Data":"000435c0bff778492d15b82f751b1f419178a6ddf8486dc227ec0079586ede6f"} Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.453103 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556857 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.556951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjjhd\" (UniqueName: \"kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd\") pod \"f2da8aa0-866f-4d14-affd-51bc1c069895\" (UID: \"f2da8aa0-866f-4d14-affd-51bc1c069895\") " Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.557396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.557612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "run-httpd". 
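The kubelet records above use the klog text format: a severity letter (I, W or E here), an MMDD hh:mm:ss.micros timestamp, the kubelet PID (5030), the emitting source file and line, and then a structured message. A minimal Go sketch, illustrative only and not taken from kubelet, that splits one such record into those fields:

package main

import (
	"fmt"
	"regexp"
)

// Matches the klog text prefix: severity, MMDD time, PID, source:line, then the message.
var klogRecord = regexp.MustCompile(`^([IWEF])(\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+(\S+:\d+)\]\s+(.*)$`)

func main() {
	// A record copied from the log above, used here as standalone input.
	rec := `I0120 23:39:34.666802 5030 kuberuntime_container.go:808] "Killing container with a grace period" gracePeriod=30`
	if m := klogRecord.FindStringSubmatch(rec); m != nil {
		fmt.Printf("severity=%s time=%s pid=%s source=%s\nmessage=%s\n",
			m[1], m[2], m[3], m[4], m[5])
	}
}
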
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.562113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd" (OuterVolumeSpecName: "kube-api-access-qjjhd") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "kube-api-access-qjjhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.562765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts" (OuterVolumeSpecName: "scripts") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.581529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.623517 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.640528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data" (OuterVolumeSpecName: "config-data") pod "f2da8aa0-866f-4d14-affd-51bc1c069895" (UID: "f2da8aa0-866f-4d14-affd-51bc1c069895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658554 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658582 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2da8aa0-866f-4d14-affd-51bc1c069895-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658592 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658605 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjjhd\" (UniqueName: \"kubernetes.io/projected/f2da8aa0-866f-4d14-affd-51bc1c069895-kube-api-access-qjjhd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658615 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658639 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.658648 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2da8aa0-866f-4d14-affd-51bc1c069895-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:37 crc kubenswrapper[5030]: I0120 23:39:37.974747 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:39:37 crc kubenswrapper[5030]: E0120 23:39:37.975210 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.187868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"f2da8aa0-866f-4d14-affd-51bc1c069895","Type":"ContainerDied","Data":"0abcbb7a9d6e82021e300ad745a202ed6d5ab8ccf9cc046a9780640d94270cfa"} Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.188104 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.188153 5030 scope.go:117] "RemoveContainer" containerID="190149a065d2528b829a609717456c9410c7dc363069daad9b87990930390ba2" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.207351 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.213059 5030 scope.go:117] "RemoveContainer" containerID="0c1ae3ce16b4465aaf74d3794c1e89b42c44f57190525b1e91b588410473e05b" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.217064 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.237464 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.237899 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-central-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.237923 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-central-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.237954 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.237962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.237976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-api" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.237984 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-api" Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.238003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-notification-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238011 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-notification-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.238027 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="sg-core" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238036 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="sg-core" Jan 20 23:39:38 crc kubenswrapper[5030]: E0120 23:39:38.238047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="proxy-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238054 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="proxy-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238235 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-central-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 
23:39:38.238247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238269 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="sg-core" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238280 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="proxy-httpd" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238290 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f50beaa6-b99c-4026-b8b8-b0832e99c48c" containerName="neutron-api" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.238308 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" containerName="ceilometer-notification-agent" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.243105 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.245720 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.245960 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.256566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.258052 5030 scope.go:117] "RemoveContainer" containerID="000435c0bff778492d15b82f751b1f419178a6ddf8486dc227ec0079586ede6f" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.302476 5030 scope.go:117] "RemoveContainer" containerID="bcb765c17614757a78d0787bf07765891e263896d488270c1630049dbad86bec" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.394768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 
23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.395557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.496835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497347 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497718 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.497909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.498486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.498813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.502580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.502903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.504414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.504789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.520265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6\") pod \"ceilometer-0\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:38 crc kubenswrapper[5030]: I0120 23:39:38.577635 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:39 crc kubenswrapper[5030]: I0120 23:39:39.007222 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:39 crc kubenswrapper[5030]: I0120 23:39:39.206075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerStarted","Data":"96fe24a84b73b7c8a8669214f745e0d1ba2f4e0cd88ee991cbea7398a71bb6b7"} Jan 20 23:39:39 crc kubenswrapper[5030]: I0120 23:39:39.950883 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:39 crc kubenswrapper[5030]: I0120 23:39:39.972256 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2da8aa0-866f-4d14-affd-51bc1c069895" path="/var/lib/kubelet/pods/f2da8aa0-866f-4d14-affd-51bc1c069895/volumes" Jan 20 23:39:40 crc kubenswrapper[5030]: I0120 23:39:40.216009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerStarted","Data":"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a"} Jan 20 23:39:40 crc kubenswrapper[5030]: I0120 23:39:40.755115 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:40 crc kubenswrapper[5030]: I0120 23:39:40.755796 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-log" containerID="cri-o://eb6392344ac4cd2d376c2123ee922c1fcc21577ba39fa816cb1a992087f36ac3" gracePeriod=30 Jan 20 23:39:40 crc kubenswrapper[5030]: I0120 23:39:40.756242 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-httpd" containerID="cri-o://1e5fc2dd3d62da6838f1090bb2515031a07a9ece9f1eea10ec89734922fc4bee" gracePeriod=30 Jan 20 23:39:41 crc kubenswrapper[5030]: I0120 23:39:41.231232 5030 generic.go:334] "Generic (PLEG): container finished" podID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerID="eb6392344ac4cd2d376c2123ee922c1fcc21577ba39fa816cb1a992087f36ac3" exitCode=143 Jan 20 23:39:41 crc kubenswrapper[5030]: I0120 23:39:41.231311 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerDied","Data":"eb6392344ac4cd2d376c2123ee922c1fcc21577ba39fa816cb1a992087f36ac3"} Jan 20 23:39:41 crc kubenswrapper[5030]: I0120 23:39:41.233778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerStarted","Data":"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2"} Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.188386 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-98jxz"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.189719 5030 util.go:30] "No sandbox for pod can be found. 
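The glance-log container above reports exitCode=143 after its graceful kill, while other containers in this section exit 0 or 2. By the usual 128-plus-signal convention, 143 corresponds to SIGTERM, and 137 would indicate SIGKILL after the gracePeriod=30 window ran out. A trivial sketch of that arithmetic:

package main

import (
	"fmt"
	"syscall"
)

// Shell/OCI convention: a process killed by a signal reports 128 + signal number.
func signalExitCode(sig syscall.Signal) int { return 128 + int(sig) }

func main() {
	fmt.Println("SIGTERM exit code:", signalExitCode(syscall.SIGTERM)) // 143, as logged for glance-log
	fmt.Println("SIGKILL exit code:", signalExitCode(syscall.SIGKILL)) // 137, if the grace period expires
}
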
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.207064 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-98jxz"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.243060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerStarted","Data":"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe"} Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.288368 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z6fdw"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.291453 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.338805 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z6fdw"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.346719 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.347850 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.350150 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.358544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4s7h\" (UniqueName: \"kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.358606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.375339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.460663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4s7h\" (UniqueName: \"kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.460742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 
23:39:42.460798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.460828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwrt\" (UniqueName: \"kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.460860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84g5r\" (UniqueName: \"kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.460878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.461522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.492490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4s7h\" (UniqueName: \"kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h\") pod \"nova-api-db-create-98jxz\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.504701 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.505951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.506908 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.509436 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.513010 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-kf4fb"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.514529 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.525178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.535400 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-kf4fb"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.565172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.565248 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwrt\" (UniqueName: \"kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.565286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.565323 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84g5r\" (UniqueName: \"kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.566327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.566897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.582551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwrt\" (UniqueName: \"kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt\") pod \"nova-api-cb33-account-create-update-bjtz7\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.585490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84g5r\" (UniqueName: 
\"kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r\") pod \"nova-cell0-db-create-z6fdw\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.633898 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.672061 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.672499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9bm\" (UniqueName: \"kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.672548 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.672943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntsz\" (UniqueName: \"kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.673004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.702472 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.703736 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.706279 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.707752 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.775896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntsz\" (UniqueName: \"kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.775943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.775993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.776012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lb5s\" (UniqueName: \"kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.776039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9bm\" (UniqueName: \"kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.776081 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.777013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 
23:39:42.778331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.877290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lb5s\" (UniqueName: \"kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.877440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.878156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.878831 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.879084 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-log" containerID="cri-o://69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75" gracePeriod=30 Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.879156 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-httpd" containerID="cri-o://9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b" gracePeriod=30 Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.944379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntsz\" (UniqueName: \"kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz\") pod \"nova-cell0-617d-account-create-update-jnfhw\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.944436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9bm\" (UniqueName: \"kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm\") pod \"nova-cell1-db-create-kf4fb\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.954511 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-api-db-create-98jxz"] Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.962996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lb5s\" (UniqueName: \"kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s\") pod \"nova-cell1-a2f6-account-create-update-npd8s\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:42 crc kubenswrapper[5030]: W0120 23:39:42.969663 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69848fe2_bbbb_44ba_982f_2cfacf5280d0.slice/crio-d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c WatchSource:0}: Error finding container d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c: Status 404 returned error can't find the container with id d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.990306 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:42 crc kubenswrapper[5030]: I0120 23:39:42.990671 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.028038 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.258179 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" event={"ID":"69848fe2-bbbb-44ba-982f-2cfacf5280d0","Type":"ContainerStarted","Data":"d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c"} Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.262410 5030 generic.go:334] "Generic (PLEG): container finished" podID="465f615b-c8cb-4464-b576-3de67e335b60" containerID="69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75" exitCode=143 Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.262465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerDied","Data":"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75"} Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.416607 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.416838 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.455344 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7"] Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.632643 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z6fdw"] Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.737701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-kf4fb"] Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.744732 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw"] Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.754473 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s"] Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.980490 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.204:9292/healthcheck\": read tcp 10.217.0.2:43760->10.217.0.204:9292: read: connection reset by peer" Jan 20 23:39:43 crc kubenswrapper[5030]: I0120 23:39:43.980545 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.204:9292/healthcheck\": read tcp 10.217.0.2:43750->10.217.0.204:9292: read: connection reset by peer" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.312489 5030 generic.go:334] "Generic (PLEG): container finished" podID="69848fe2-bbbb-44ba-982f-2cfacf5280d0" containerID="07bb1e77795afa2d8f3e0e1429c41dbf48e1729d1bcdb8874b9fe5cb50fb7d85" exitCode=0 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.314521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" event={"ID":"69848fe2-bbbb-44ba-982f-2cfacf5280d0","Type":"ContainerDied","Data":"07bb1e77795afa2d8f3e0e1429c41dbf48e1729d1bcdb8874b9fe5cb50fb7d85"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.319514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" event={"ID":"ce721ec0-af84-470d-a905-0c6163c1580f","Type":"ContainerStarted","Data":"61c00bd05f4c308767a18aa48966b356a7b2f685e21d882aac3bfee2f238374c"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.319561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" event={"ID":"ce721ec0-af84-470d-a905-0c6163c1580f","Type":"ContainerStarted","Data":"663212bfdbdcaf0127fff5fa48686827502e5d09dc4c7a0478f35b2f85dc9a5b"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.324604 5030 generic.go:334] "Generic (PLEG): container finished" podID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerID="1e5fc2dd3d62da6838f1090bb2515031a07a9ece9f1eea10ec89734922fc4bee" exitCode=0 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.324672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerDied","Data":"1e5fc2dd3d62da6838f1090bb2515031a07a9ece9f1eea10ec89734922fc4bee"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerStarted","Data":"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330223 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="proxy-httpd" 
containerID="cri-o://1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502" gracePeriod=30 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330368 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="sg-core" containerID="cri-o://030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe" gracePeriod=30 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330292 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-central-agent" containerID="cri-o://a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a" gracePeriod=30 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.330563 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-notification-agent" containerID="cri-o://d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2" gracePeriod=30 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.348928 5030 generic.go:334] "Generic (PLEG): container finished" podID="7ffdd818-4c7b-46df-8e2f-0bcba0281818" containerID="b550427f5955b0f992f9ff6bc36ba4407ad531409b6dbe72aa177a9c214aa9df" exitCode=0 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.349097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" event={"ID":"7ffdd818-4c7b-46df-8e2f-0bcba0281818","Type":"ContainerDied","Data":"b550427f5955b0f992f9ff6bc36ba4407ad531409b6dbe72aa177a9c214aa9df"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.349162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" event={"ID":"7ffdd818-4c7b-46df-8e2f-0bcba0281818","Type":"ContainerStarted","Data":"a0f663a26013d8476e73c40d432db5054913909cd41a5495680c1b0eb3f5c4ea"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.352295 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b4bf350-04d5-4e11-8462-2e53fa7b058a" containerID="3ce538f9112db0591af690b27ecb9a574dd0329fa9ce2db3d26567ffb5981503" exitCode=0 Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.352356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" event={"ID":"9b4bf350-04d5-4e11-8462-2e53fa7b058a","Type":"ContainerDied","Data":"3ce538f9112db0591af690b27ecb9a574dd0329fa9ce2db3d26567ffb5981503"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.352379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" event={"ID":"9b4bf350-04d5-4e11-8462-2e53fa7b058a","Type":"ContainerStarted","Data":"61b553e4175e72fd4ebdc829594de34285d575cbd84389065738108cc8a5c104"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.353087 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" podStartSLOduration=2.353067113 podStartE2EDuration="2.353067113s" podCreationTimestamp="2026-01-20 23:39:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:44.346240118 +0000 UTC m=+3856.666500406" watchObservedRunningTime="2026-01-20 23:39:44.353067113 +0000 UTC m=+3856.673327401" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.355203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" event={"ID":"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630","Type":"ContainerStarted","Data":"c90c755bc61ffabd973240ff7c809123f3f9f2dfda56b982d7ee2a2191f2daf4"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.355239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" event={"ID":"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630","Type":"ContainerStarted","Data":"f30d9044efc5303799afb9ccf54cf5bd12369aac2360ae5e9b8e57192ead40b9"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.358062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" event={"ID":"843da113-e41e-40b0-9d7a-c5486468ca5d","Type":"ContainerStarted","Data":"f789b6523159d3101ae670e5476e0e7c39359c10b74e15d291d4e99cb86bfeb6"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.358086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" event={"ID":"843da113-e41e-40b0-9d7a-c5486468ca5d","Type":"ContainerStarted","Data":"abf6227b7e7ebea5f5874fa51fc8e4666b38c9efc05755ebc0d006f7e0f4f413"} Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.367268 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.792920832 podStartE2EDuration="6.367253227s" podCreationTimestamp="2026-01-20 23:39:38 +0000 UTC" firstStartedPulling="2026-01-20 23:39:39.012669424 +0000 UTC m=+3851.332929712" lastFinishedPulling="2026-01-20 23:39:43.587001819 +0000 UTC m=+3855.907262107" observedRunningTime="2026-01-20 23:39:44.36204 +0000 UTC m=+3856.682300288" watchObservedRunningTime="2026-01-20 23:39:44.367253227 +0000 UTC m=+3856.687513505" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.440797 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.519995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520115 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxnd\" (UniqueName: \"kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.520295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\" (UID: \"ac3db439-8610-4e8f-9a5f-1c65bea0f129\") " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.522935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.523044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs" (OuterVolumeSpecName: "logs") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.544336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.544807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd" (OuterVolumeSpecName: "kube-api-access-bxxnd") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "kube-api-access-bxxnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.594996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts" (OuterVolumeSpecName: "scripts") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.635933 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.635968 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxnd\" (UniqueName: \"kubernetes.io/projected/ac3db439-8610-4e8f-9a5f-1c65bea0f129-kube-api-access-bxxnd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.635980 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac3db439-8610-4e8f-9a5f-1c65bea0f129-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.636001 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.636010 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.669484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.695046 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.739811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.748819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data" (OuterVolumeSpecName: "config-data") pod "ac3db439-8610-4e8f-9a5f-1c65bea0f129" (UID: "ac3db439-8610-4e8f-9a5f-1c65bea0f129"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.761876 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.761925 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.761939 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac3db439-8610-4e8f-9a5f-1c65bea0f129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:44 crc kubenswrapper[5030]: I0120 23:39:44.761951 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.366467 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce721ec0-af84-470d-a905-0c6163c1580f" containerID="61c00bd05f4c308767a18aa48966b356a7b2f685e21d882aac3bfee2f238374c" exitCode=0 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.366505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" event={"ID":"ce721ec0-af84-470d-a905-0c6163c1580f","Type":"ContainerDied","Data":"61c00bd05f4c308767a18aa48966b356a7b2f685e21d882aac3bfee2f238374c"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.368836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ac3db439-8610-4e8f-9a5f-1c65bea0f129","Type":"ContainerDied","Data":"04e95b1b5d769b8bf96b50ab601e4c2dc69291855dd7ca24cb806180d93d421a"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.368881 5030 scope.go:117] "RemoveContainer" containerID="1e5fc2dd3d62da6838f1090bb2515031a07a9ece9f1eea10ec89734922fc4bee" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.368977 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375564 5030 generic.go:334] "Generic (PLEG): container finished" podID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerID="1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502" exitCode=0 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375584 5030 generic.go:334] "Generic (PLEG): container finished" podID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerID="030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe" exitCode=2 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375592 5030 generic.go:334] "Generic (PLEG): container finished" podID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerID="d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2" exitCode=0 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerDied","Data":"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerDied","Data":"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.375657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerDied","Data":"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.376936 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" containerID="c90c755bc61ffabd973240ff7c809123f3f9f2dfda56b982d7ee2a2191f2daf4" exitCode=0 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.376973 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" event={"ID":"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630","Type":"ContainerDied","Data":"c90c755bc61ffabd973240ff7c809123f3f9f2dfda56b982d7ee2a2191f2daf4"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.387846 5030 generic.go:334] "Generic (PLEG): container finished" podID="843da113-e41e-40b0-9d7a-c5486468ca5d" containerID="f789b6523159d3101ae670e5476e0e7c39359c10b74e15d291d4e99cb86bfeb6" exitCode=0 Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.388169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" event={"ID":"843da113-e41e-40b0-9d7a-c5486468ca5d","Type":"ContainerDied","Data":"f789b6523159d3101ae670e5476e0e7c39359c10b74e15d291d4e99cb86bfeb6"} Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.416012 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.429303 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.445283 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:45 crc kubenswrapper[5030]: E0120 23:39:45.445703 
5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-log" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.445727 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-log" Jan 20 23:39:45 crc kubenswrapper[5030]: E0120 23:39:45.445754 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-httpd" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.445762 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-httpd" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.445945 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-httpd" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.445972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" containerName="glance-log" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.446654 5030 scope.go:117] "RemoveContainer" containerID="eb6392344ac4cd2d376c2123ee922c1fcc21577ba39fa816cb1a992087f36ac3" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.447018 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.451074 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.471651 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.475260 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rxw\" (UniqueName: 
\"kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578833 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.578918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680649 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rxw\" (UniqueName: \"kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680791 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.680813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.681009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.681012 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.681042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.681480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.681722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.687930 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.690737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.690914 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.691776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.701912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rxw\" (UniqueName: \"kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.719237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.768792 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.899125 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.924934 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.929897 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.945995 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.955707 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.981356 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3db439-8610-4e8f-9a5f-1c65bea0f129" path="/var/lib/kubelet/pods/ac3db439-8610-4e8f-9a5f-1c65bea0f129/volumes" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.989750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84g5r\" (UniqueName: \"kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r\") pod \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.989942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwrt\" (UniqueName: \"kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt\") pod \"843da113-e41e-40b0-9d7a-c5486468ca5d\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.990196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts\") pod \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.990275 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lb5s\" (UniqueName: \"kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s\") pod \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\" (UID: \"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.990677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn9bm\" (UniqueName: \"kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm\") pod \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.990707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4s7h\" (UniqueName: \"kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h\") pod \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\" (UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.990737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts\") pod \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\" (UID: \"7ffdd818-4c7b-46df-8e2f-0bcba0281818\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.991190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts\") pod \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\" (UID: \"9b4bf350-04d5-4e11-8462-2e53fa7b058a\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.991220 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts\") pod \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\" 
(UID: \"69848fe2-bbbb-44ba-982f-2cfacf5280d0\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.991254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts\") pod \"843da113-e41e-40b0-9d7a-c5486468ca5d\" (UID: \"843da113-e41e-40b0-9d7a-c5486468ca5d\") " Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.991924 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "843da113-e41e-40b0-9d7a-c5486468ca5d" (UID: "843da113-e41e-40b0-9d7a-c5486468ca5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.992667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" (UID: "c5d57228-bbc8-4b0d-b4bc-9fd11cca7630"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.994555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt" (OuterVolumeSpecName: "kube-api-access-dqwrt") pod "843da113-e41e-40b0-9d7a-c5486468ca5d" (UID: "843da113-e41e-40b0-9d7a-c5486468ca5d"). InnerVolumeSpecName "kube-api-access-dqwrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.994854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r" (OuterVolumeSpecName: "kube-api-access-84g5r") pod "9b4bf350-04d5-4e11-8462-2e53fa7b058a" (UID: "9b4bf350-04d5-4e11-8462-2e53fa7b058a"). InnerVolumeSpecName "kube-api-access-84g5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.995248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ffdd818-4c7b-46df-8e2f-0bcba0281818" (UID: "7ffdd818-4c7b-46df-8e2f-0bcba0281818"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.995530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b4bf350-04d5-4e11-8462-2e53fa7b058a" (UID: "9b4bf350-04d5-4e11-8462-2e53fa7b058a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.995890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69848fe2-bbbb-44ba-982f-2cfacf5280d0" (UID: "69848fe2-bbbb-44ba-982f-2cfacf5280d0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:45 crc kubenswrapper[5030]: I0120 23:39:45.998480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm" (OuterVolumeSpecName: "kube-api-access-xn9bm") pod "7ffdd818-4c7b-46df-8e2f-0bcba0281818" (UID: "7ffdd818-4c7b-46df-8e2f-0bcba0281818"). InnerVolumeSpecName "kube-api-access-xn9bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:45.998892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h" (OuterVolumeSpecName: "kube-api-access-l4s7h") pod "69848fe2-bbbb-44ba-982f-2cfacf5280d0" (UID: "69848fe2-bbbb-44ba-982f-2cfacf5280d0"). InnerVolumeSpecName "kube-api-access-l4s7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.001206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s" (OuterVolumeSpecName: "kube-api-access-2lb5s") pod "c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" (UID: "c5d57228-bbc8-4b0d-b4bc-9fd11cca7630"). InnerVolumeSpecName "kube-api-access-2lb5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.003764 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.205:9292/healthcheck\": read tcp 10.217.0.2:43856->10.217.0.205:9292: read: connection reset by peer" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.004079 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9292/healthcheck\": read tcp 10.217.0.2:43854->10.217.0.205:9292: read: connection reset by peer" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093324 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwrt\" (UniqueName: \"kubernetes.io/projected/843da113-e41e-40b0-9d7a-c5486468ca5d-kube-api-access-dqwrt\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093347 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093362 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lb5s\" (UniqueName: \"kubernetes.io/projected/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630-kube-api-access-2lb5s\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093372 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn9bm\" (UniqueName: \"kubernetes.io/projected/7ffdd818-4c7b-46df-8e2f-0bcba0281818-kube-api-access-xn9bm\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4s7h\" (UniqueName: 
\"kubernetes.io/projected/69848fe2-bbbb-44ba-982f-2cfacf5280d0-kube-api-access-l4s7h\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093390 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffdd818-4c7b-46df-8e2f-0bcba0281818-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093400 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4bf350-04d5-4e11-8462-2e53fa7b058a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093410 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69848fe2-bbbb-44ba-982f-2cfacf5280d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093419 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/843da113-e41e-40b0-9d7a-c5486468ca5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.093428 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84g5r\" (UniqueName: \"kubernetes.io/projected/9b4bf350-04d5-4e11-8462-2e53fa7b058a-kube-api-access-84g5r\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.321764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:39:46 crc kubenswrapper[5030]: W0120 23:39:46.325489 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19f669e8_7127_4a91_ae58_2e88b3bb7d6e.slice/crio-8e09247cc3cc4b667a1de46e9091bc6a5a00952b1850ee813130e1e2f255317e WatchSource:0}: Error finding container 8e09247cc3cc4b667a1de46e9091bc6a5a00952b1850ee813130e1e2f255317e: Status 404 returned error can't find the container with id 8e09247cc3cc4b667a1de46e9091bc6a5a00952b1850ee813130e1e2f255317e Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.363506 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.413792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf5jx\" (UniqueName: \"kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx\") pod \"465f615b-c8cb-4464-b576-3de67e335b60\" (UID: \"465f615b-c8cb-4464-b576-3de67e335b60\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.414939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs" (OuterVolumeSpecName: "logs") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.424130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx" (OuterVolumeSpecName: "kube-api-access-kf5jx") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "kube-api-access-kf5jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.428102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.441899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts" (OuterVolumeSpecName: "scripts") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.441900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.461757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerStarted","Data":"8e09247cc3cc4b667a1de46e9091bc6a5a00952b1850ee813130e1e2f255317e"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.464391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" event={"ID":"c5d57228-bbc8-4b0d-b4bc-9fd11cca7630","Type":"ContainerDied","Data":"f30d9044efc5303799afb9ccf54cf5bd12369aac2360ae5e9b8e57192ead40b9"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.464412 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f30d9044efc5303799afb9ccf54cf5bd12369aac2360ae5e9b8e57192ead40b9" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.464472 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.466160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" event={"ID":"843da113-e41e-40b0-9d7a-c5486468ca5d","Type":"ContainerDied","Data":"abf6227b7e7ebea5f5874fa51fc8e4666b38c9efc05755ebc0d006f7e0f4f413"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.466200 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf6227b7e7ebea5f5874fa51fc8e4666b38c9efc05755ebc0d006f7e0f4f413" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.466321 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.467405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" event={"ID":"69848fe2-bbbb-44ba-982f-2cfacf5280d0","Type":"ContainerDied","Data":"d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.467427 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c8b62540a72333d6e24f1042043284fa8d9a4b7daa72c08156f691c915ee8c" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.467484 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-98jxz" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.471280 5030 generic.go:334] "Generic (PLEG): container finished" podID="465f615b-c8cb-4464-b576-3de67e335b60" containerID="9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b" exitCode=0 Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.471322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerDied","Data":"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.471338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"465f615b-c8cb-4464-b576-3de67e335b60","Type":"ContainerDied","Data":"29708025d1281124abcbc26bfc7e1d70b99cc64704cff281617e7f596d5ec915"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.471353 5030 scope.go:117] "RemoveContainer" containerID="9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.471484 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.477423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.480715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" event={"ID":"7ffdd818-4c7b-46df-8e2f-0bcba0281818","Type":"ContainerDied","Data":"a0f663a26013d8476e73c40d432db5054913909cd41a5495680c1b0eb3f5c4ea"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.480746 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0f663a26013d8476e73c40d432db5054913909cd41a5495680c1b0eb3f5c4ea" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.480805 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-kf4fb" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.484247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" event={"ID":"9b4bf350-04d5-4e11-8462-2e53fa7b058a","Type":"ContainerDied","Data":"61b553e4175e72fd4ebdc829594de34285d575cbd84389065738108cc8a5c104"} Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.484260 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-z6fdw" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.484297 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61b553e4175e72fd4ebdc829594de34285d575cbd84389065738108cc8a5c104" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.503308 5030 scope.go:117] "RemoveContainer" containerID="69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.514387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.515607 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.515715 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.515799 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.515963 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/465f615b-c8cb-4464-b576-3de67e335b60-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.516026 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.516088 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.516145 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf5jx\" (UniqueName: \"kubernetes.io/projected/465f615b-c8cb-4464-b576-3de67e335b60-kube-api-access-kf5jx\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.516059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data" (OuterVolumeSpecName: "config-data") pod 
"465f615b-c8cb-4464-b576-3de67e335b60" (UID: "465f615b-c8cb-4464-b576-3de67e335b60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.538739 5030 scope.go:117] "RemoveContainer" containerID="9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.539526 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b\": container with ID starting with 9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b not found: ID does not exist" containerID="9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.539566 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b"} err="failed to get container status \"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b\": rpc error: code = NotFound desc = could not find container \"9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b\": container with ID starting with 9ec8c1394b7cd5609ca385739ce15032289c3a70706b9ad03ce5c2e188b8a10b not found: ID does not exist" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.539592 5030 scope.go:117] "RemoveContainer" containerID="69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.540163 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75\": container with ID starting with 69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75 not found: ID does not exist" containerID="69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.540209 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75"} err="failed to get container status \"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75\": rpc error: code = NotFound desc = could not find container \"69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75\": container with ID starting with 69101b4e8486bd4ac5b32d623cd91ef7c663869931c87c6eef9741331b365a75 not found: ID does not exist" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.542102 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.617501 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.617535 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/465f615b-c8cb-4464-b576-3de67e335b60-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.849234 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.861666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.872905 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894790 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-log" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894810 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-log" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894829 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894839 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894862 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-httpd" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894871 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-httpd" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894882 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce721ec0-af84-470d-a905-0c6163c1580f" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894898 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce721ec0-af84-470d-a905-0c6163c1580f" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894919 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="843da113-e41e-40b0-9d7a-c5486468ca5d" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894929 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="843da113-e41e-40b0-9d7a-c5486468ca5d" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894945 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4bf350-04d5-4e11-8462-2e53fa7b058a" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.894953 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4bf350-04d5-4e11-8462-2e53fa7b058a" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 23:39:46.894968 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffdd818-4c7b-46df-8e2f-0bcba0281818" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895010 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffdd818-4c7b-46df-8e2f-0bcba0281818" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: E0120 
23:39:46.895029 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69848fe2-bbbb-44ba-982f-2cfacf5280d0" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895039 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69848fe2-bbbb-44ba-982f-2cfacf5280d0" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895244 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4bf350-04d5-4e11-8462-2e53fa7b058a" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895259 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-httpd" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895268 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69848fe2-bbbb-44ba-982f-2cfacf5280d0" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="465f615b-c8cb-4464-b576-3de67e335b60" containerName="glance-log" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895304 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffdd818-4c7b-46df-8e2f-0bcba0281818" containerName="mariadb-database-create" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895344 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce721ec0-af84-470d-a905-0c6163c1580f" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="843da113-e41e-40b0-9d7a-c5486468ca5d" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.895376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" containerName="mariadb-account-create-update" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.896600 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.899297 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.904030 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.908411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.920576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntsz\" (UniqueName: \"kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz\") pod \"ce721ec0-af84-470d-a905-0c6163c1580f\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.920641 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts\") pod \"ce721ec0-af84-470d-a905-0c6163c1580f\" (UID: \"ce721ec0-af84-470d-a905-0c6163c1580f\") " Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.922052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce721ec0-af84-470d-a905-0c6163c1580f" (UID: "ce721ec0-af84-470d-a905-0c6163c1580f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:39:46 crc kubenswrapper[5030]: I0120 23:39:46.942111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz" (OuterVolumeSpecName: "kube-api-access-nntsz") pod "ce721ec0-af84-470d-a905-0c6163c1580f" (UID: "ce721ec0-af84-470d-a905-0c6163c1580f"). InnerVolumeSpecName "kube-api-access-nntsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.022962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs49g\" (UniqueName: \"kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023913 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nntsz\" (UniqueName: 
\"kubernetes.io/projected/ce721ec0-af84-470d-a905-0c6163c1580f-kube-api-access-nntsz\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.023927 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce721ec0-af84-470d-a905-0c6163c1580f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.127984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128066 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.128217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs49g\" (UniqueName: \"kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.129451 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.129695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.129926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.132754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.134113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.140359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.140809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.147921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs49g\" (UniqueName: \"kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.158249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.215257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.502476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" event={"ID":"ce721ec0-af84-470d-a905-0c6163c1580f","Type":"ContainerDied","Data":"663212bfdbdcaf0127fff5fa48686827502e5d09dc4c7a0478f35b2f85dc9a5b"} Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.502817 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="663212bfdbdcaf0127fff5fa48686827502e5d09dc4c7a0478f35b2f85dc9a5b" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.502782 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.504926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerStarted","Data":"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955"} Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.504960 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerStarted","Data":"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825"} Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.535080 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.535064926 podStartE2EDuration="2.535064926s" podCreationTimestamp="2026-01-20 23:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:47.529582583 +0000 UTC m=+3859.849842871" watchObservedRunningTime="2026-01-20 23:39:47.535064926 +0000 UTC m=+3859.855325214" Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.672553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:39:47 crc kubenswrapper[5030]: W0120 23:39:47.681860 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ed422e_24f7_4373_8bd9_e32a8e086a0c.slice/crio-326d6570fa1ab3a89274a3b0133d8fcd95b9d50d171e5e7bf75c912a7e440960 WatchSource:0}: Error finding container 326d6570fa1ab3a89274a3b0133d8fcd95b9d50d171e5e7bf75c912a7e440960: Status 404 returned error can't find the container with id 326d6570fa1ab3a89274a3b0133d8fcd95b9d50d171e5e7bf75c912a7e440960 Jan 20 23:39:47 crc kubenswrapper[5030]: I0120 23:39:47.988295 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="465f615b-c8cb-4464-b576-3de67e335b60" path="/var/lib/kubelet/pods/465f615b-c8cb-4464-b576-3de67e335b60/volumes" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.220992 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355124 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.355455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.356323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.356366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data\") pod \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\" (UID: \"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb\") " Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.356568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.356996 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.357023 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.360583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts" (OuterVolumeSpecName: "scripts") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.361020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6" (OuterVolumeSpecName: "kube-api-access-rv5b6") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "kube-api-access-rv5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.381891 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.428869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.444274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data" (OuterVolumeSpecName: "config-data") pod "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" (UID: "78ba0629-d4c5-4b7c-bb91-870cb62b1ccb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.458445 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.458488 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.458505 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv5b6\" (UniqueName: \"kubernetes.io/projected/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-kube-api-access-rv5b6\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.458518 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.458530 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.517462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerStarted","Data":"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f"} Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.517518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerStarted","Data":"326d6570fa1ab3a89274a3b0133d8fcd95b9d50d171e5e7bf75c912a7e440960"} Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.523205 5030 generic.go:334] "Generic (PLEG): container finished" podID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerID="a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a" exitCode=0 Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.524141 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.524324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerDied","Data":"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a"} Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.524358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"78ba0629-d4c5-4b7c-bb91-870cb62b1ccb","Type":"ContainerDied","Data":"96fe24a84b73b7c8a8669214f745e0d1ba2f4e0cd88ee991cbea7398a71bb6b7"} Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.524380 5030 scope.go:117] "RemoveContainer" containerID="1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.567924 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.579245 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.592146 5030 scope.go:117] "RemoveContainer" containerID="030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600146 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.600524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="proxy-httpd" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600545 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="proxy-httpd" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.600558 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-notification-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600582 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-notification-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.600598 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="sg-core" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600605 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="sg-core" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.600696 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-central-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600704 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-central-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600860 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-notification-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600883 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="sg-core" Jan 20 23:39:48 
crc kubenswrapper[5030]: I0120 23:39:48.600898 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="ceilometer-central-agent" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.600911 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" containerName="proxy-httpd" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.602375 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.617821 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.618082 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.659291 5030 scope.go:117] "RemoveContainer" containerID="d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.664752 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.673799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.673836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsxk4\" (UniqueName: \"kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.673881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.673954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.673983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.674086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.674172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.719720 5030 scope.go:117] "RemoveContainer" containerID="a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781521 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxk4\" (UniqueName: \"kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.781658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.793466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 
23:39:48.793721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.797136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.802022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.803005 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.803588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.811792 5030 scope.go:117] "RemoveContainer" containerID="1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.827173 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502\": container with ID starting with 1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502 not found: ID does not exist" containerID="1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.827215 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502"} err="failed to get container status \"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502\": rpc error: code = NotFound desc = could not find container \"1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502\": container with ID starting with 1b9b5e21d250ae262d5e17e495f75db1b2625987e8848941a85f609f387ef502 not found: ID does not exist" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.827244 5030 scope.go:117] "RemoveContainer" containerID="030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.839896 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe\": container with ID starting with 030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe not found: ID does not exist" 
containerID="030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.840452 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe"} err="failed to get container status \"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe\": rpc error: code = NotFound desc = could not find container \"030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe\": container with ID starting with 030877f2846373e4da5c440a4f78f2b84ad892a377fddde7d7b863e93538aebe not found: ID does not exist" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.840494 5030 scope.go:117] "RemoveContainer" containerID="d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.840506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsxk4\" (UniqueName: \"kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4\") pod \"ceilometer-0\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.848340 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2\": container with ID starting with d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2 not found: ID does not exist" containerID="d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.848379 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2"} err="failed to get container status \"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2\": rpc error: code = NotFound desc = could not find container \"d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2\": container with ID starting with d4318c87eddf4ed37394d9d3cc05a8f6b279806e80d5c1eec78b613ab6d12ce2 not found: ID does not exist" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.848406 5030 scope.go:117] "RemoveContainer" containerID="a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a" Jan 20 23:39:48 crc kubenswrapper[5030]: E0120 23:39:48.850643 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a\": container with ID starting with a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a not found: ID does not exist" containerID="a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a" Jan 20 23:39:48 crc kubenswrapper[5030]: I0120 23:39:48.850661 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a"} err="failed to get container status \"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a\": rpc error: code = NotFound desc = could not find container \"a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a\": container with ID starting with a7bc677d6e784b4af9822c8fb0855f5acc2ac7c962f110430d421220208a664a not found: ID does not exist" Jan 20 23:39:48 crc 
kubenswrapper[5030]: I0120 23:39:48.959441 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:49 crc kubenswrapper[5030]: I0120 23:39:49.372141 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:39:49 crc kubenswrapper[5030]: W0120 23:39:49.389072 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92797b7d_c5ce_45d9_b85a_422cbbda3700.slice/crio-4f6b396afd9ba57b0d0135d69ba0b2a2dd67ad03082887056e4ef6c17b8a6ddb WatchSource:0}: Error finding container 4f6b396afd9ba57b0d0135d69ba0b2a2dd67ad03082887056e4ef6c17b8a6ddb: Status 404 returned error can't find the container with id 4f6b396afd9ba57b0d0135d69ba0b2a2dd67ad03082887056e4ef6c17b8a6ddb Jan 20 23:39:49 crc kubenswrapper[5030]: I0120 23:39:49.535095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerStarted","Data":"4f6b396afd9ba57b0d0135d69ba0b2a2dd67ad03082887056e4ef6c17b8a6ddb"} Jan 20 23:39:49 crc kubenswrapper[5030]: I0120 23:39:49.537427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerStarted","Data":"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d"} Jan 20 23:39:49 crc kubenswrapper[5030]: I0120 23:39:49.557420 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.557404868 podStartE2EDuration="3.557404868s" podCreationTimestamp="2026-01-20 23:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:49.556113308 +0000 UTC m=+3861.876373596" watchObservedRunningTime="2026-01-20 23:39:49.557404868 +0000 UTC m=+3861.877665146" Jan 20 23:39:49 crc kubenswrapper[5030]: I0120 23:39:49.975001 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ba0629-d4c5-4b7c-bb91-870cb62b1ccb" path="/var/lib/kubelet/pods/78ba0629-d4c5-4b7c-bb91-870cb62b1ccb/volumes" Jan 20 23:39:50 crc kubenswrapper[5030]: I0120 23:39:50.962243 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:39:50 crc kubenswrapper[5030]: E0120 23:39:50.962863 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:39:51 crc kubenswrapper[5030]: I0120 23:39:51.584582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerStarted","Data":"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0"} Jan 20 23:39:51 crc kubenswrapper[5030]: I0120 23:39:51.584680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerStarted","Data":"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c"} Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.610816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerStarted","Data":"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54"} Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.783950 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh"] Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.785037 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.787201 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.787333 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.788543 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-hpr85" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.797909 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh"] Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.853479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.853529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.853566 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.853586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjnw\" (UniqueName: \"kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.955220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts\") pod 
\"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.955272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.955314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.955336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjnw\" (UniqueName: \"kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.959347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.959657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.960113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:52 crc kubenswrapper[5030]: I0120 23:39:52.982973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjnw\" (UniqueName: \"kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw\") pod \"nova-cell0-conductor-db-sync-z2rqh\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:53 crc kubenswrapper[5030]: I0120 23:39:53.099702 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:39:53 crc kubenswrapper[5030]: I0120 23:39:53.620716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerStarted","Data":"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4"} Jan 20 23:39:53 crc kubenswrapper[5030]: I0120 23:39:53.621405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:39:53 crc kubenswrapper[5030]: I0120 23:39:53.651665 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9877722960000002 podStartE2EDuration="5.651647674s" podCreationTimestamp="2026-01-20 23:39:48 +0000 UTC" firstStartedPulling="2026-01-20 23:39:49.391529683 +0000 UTC m=+3861.711789961" lastFinishedPulling="2026-01-20 23:39:53.055405051 +0000 UTC m=+3865.375665339" observedRunningTime="2026-01-20 23:39:53.643124378 +0000 UTC m=+3865.963384676" watchObservedRunningTime="2026-01-20 23:39:53.651647674 +0000 UTC m=+3865.971907962" Jan 20 23:39:53 crc kubenswrapper[5030]: I0120 23:39:53.667971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh"] Jan 20 23:39:53 crc kubenswrapper[5030]: W0120 23:39:53.672678 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9449e5f9_0e08_4f4e_ac8b_a942bb89a09e.slice/crio-a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735 WatchSource:0}: Error finding container a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735: Status 404 returned error can't find the container with id a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735 Jan 20 23:39:54 crc kubenswrapper[5030]: I0120 23:39:54.629949 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" event={"ID":"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e","Type":"ContainerStarted","Data":"ac46d323e8f0a5b973e8d23adb65853f4d94b260c2af44727eb8af627172e11d"} Jan 20 23:39:54 crc kubenswrapper[5030]: I0120 23:39:54.630264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" event={"ID":"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e","Type":"ContainerStarted","Data":"a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735"} Jan 20 23:39:54 crc kubenswrapper[5030]: I0120 23:39:54.656605 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" podStartSLOduration=2.656577748 podStartE2EDuration="2.656577748s" podCreationTimestamp="2026-01-20 23:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:39:54.651201398 +0000 UTC m=+3866.971461686" watchObservedRunningTime="2026-01-20 23:39:54.656577748 +0000 UTC m=+3866.976838046" Jan 20 23:39:55 crc kubenswrapper[5030]: I0120 23:39:55.769454 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:55 crc kubenswrapper[5030]: I0120 23:39:55.769837 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:55 crc kubenswrapper[5030]: I0120 23:39:55.798730 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:55 crc kubenswrapper[5030]: I0120 23:39:55.809210 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:56 crc kubenswrapper[5030]: I0120 23:39:56.649350 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:56 crc kubenswrapper[5030]: I0120 23:39:56.649414 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.217538 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.218948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.274961 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.280187 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.658794 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:57 crc kubenswrapper[5030]: I0120 23:39:57.659056 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:58 crc kubenswrapper[5030]: I0120 23:39:58.610323 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:58 crc kubenswrapper[5030]: I0120 23:39:58.612495 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:39:59 crc kubenswrapper[5030]: I0120 23:39:59.510000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:39:59 crc kubenswrapper[5030]: I0120 23:39:59.675076 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:39:59 crc kubenswrapper[5030]: I0120 23:39:59.888031 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:40:03 crc kubenswrapper[5030]: I0120 23:40:03.715469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" event={"ID":"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e","Type":"ContainerDied","Data":"ac46d323e8f0a5b973e8d23adb65853f4d94b260c2af44727eb8af627172e11d"} Jan 20 23:40:03 crc kubenswrapper[5030]: I0120 23:40:03.715408 5030 generic.go:334] "Generic (PLEG): container finished" podID="9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" containerID="ac46d323e8f0a5b973e8d23adb65853f4d94b260c2af44727eb8af627172e11d" exitCode=0 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 
23:40:05.062943 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.172733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjnw\" (UniqueName: \"kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw\") pod \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.172987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts\") pod \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.173046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data\") pod \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.173645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle\") pod \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\" (UID: \"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e\") " Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.178841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts" (OuterVolumeSpecName: "scripts") pod "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" (UID: "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.181909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw" (OuterVolumeSpecName: "kube-api-access-8rjnw") pod "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" (UID: "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e"). InnerVolumeSpecName "kube-api-access-8rjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.201905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data" (OuterVolumeSpecName: "config-data") pod "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" (UID: "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.208662 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" (UID: "9449e5f9-0e08-4f4e-ac8b-a942bb89a09e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.251421 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.251911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-central-agent" containerID="cri-o://066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c" gracePeriod=30 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.252153 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="proxy-httpd" containerID="cri-o://4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4" gracePeriod=30 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.252307 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="sg-core" containerID="cri-o://a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54" gracePeriod=30 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.252427 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-notification-agent" containerID="cri-o://b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0" gracePeriod=30 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.264333 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.231:3000/\": EOF" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.275545 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rjnw\" (UniqueName: \"kubernetes.io/projected/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-kube-api-access-8rjnw\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.275576 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.275588 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.275599 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739517 5030 generic.go:334] "Generic (PLEG): container finished" podID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerID="4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4" exitCode=0 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739558 5030 generic.go:334] "Generic (PLEG): container finished" podID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerID="a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54" exitCode=2 Jan 20 
23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739570 5030 generic.go:334] "Generic (PLEG): container finished" podID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerID="066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c" exitCode=0 Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerDied","Data":"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4"} Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerDied","Data":"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54"} Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.739678 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerDied","Data":"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c"} Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.741339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" event={"ID":"9449e5f9-0e08-4f4e-ac8b-a942bb89a09e","Type":"ContainerDied","Data":"a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735"} Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.741374 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b0488a45605659548f738db43b2ad5426cafaf25784938495235875a40c735" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.741430 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh" Jan 20 23:40:05 crc kubenswrapper[5030]: I0120 23:40:05.962658 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:40:05 crc kubenswrapper[5030]: E0120 23:40:05.962957 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.144911 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:40:06 crc kubenswrapper[5030]: E0120 23:40:06.145456 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" containerName="nova-cell0-conductor-db-sync" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.145479 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" containerName="nova-cell0-conductor-db-sync" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.145846 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" containerName="nova-cell0-conductor-db-sync" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.146902 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.148781 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.149386 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-hpr85" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.159493 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.297572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkmd\" (UniqueName: \"kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.297921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.297971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.399430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.399550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkmd\" (UniqueName: \"kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.399621 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.404256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.406239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.420126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkmd\" (UniqueName: \"kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd\") pod \"nova-cell0-conductor-0\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:06 crc kubenswrapper[5030]: I0120 23:40:06.478418 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:07 crc kubenswrapper[5030]: W0120 23:40:07.378998 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5db97c47_beb0_437c_bcae_2aaabb7cf3ff.slice/crio-9b95d5e2f9dcc3a881806e642605eb94b93b3d25fb31a605824287006f823e2f WatchSource:0}: Error finding container 9b95d5e2f9dcc3a881806e642605eb94b93b3d25fb31a605824287006f823e2f: Status 404 returned error can't find the container with id 9b95d5e2f9dcc3a881806e642605eb94b93b3d25fb31a605824287006f823e2f Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.385837 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.537924 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.629355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.629478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsxk4\" (UniqueName: \"kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.629559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.629592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.630106 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.630175 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.630206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd\") pod \"92797b7d-c5ce-45d9-b85a-422cbbda3700\" (UID: \"92797b7d-c5ce-45d9-b85a-422cbbda3700\") " Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.630577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.631546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.634208 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts" (OuterVolumeSpecName: "scripts") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.635916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4" (OuterVolumeSpecName: "kube-api-access-zsxk4") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "kube-api-access-zsxk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.659849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.699155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.722307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data" (OuterVolumeSpecName: "config-data") pod "92797b7d-c5ce-45d9-b85a-422cbbda3700" (UID: "92797b7d-c5ce-45d9-b85a-422cbbda3700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733001 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733040 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92797b7d-c5ce-45d9-b85a-422cbbda3700-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733057 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733077 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsxk4\" (UniqueName: \"kubernetes.io/projected/92797b7d-c5ce-45d9-b85a-422cbbda3700-kube-api-access-zsxk4\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733096 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733115 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.733132 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92797b7d-c5ce-45d9-b85a-422cbbda3700-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.777786 5030 generic.go:334] "Generic (PLEG): container finished" podID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerID="b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0" exitCode=0 Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.777862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerDied","Data":"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0"} Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.777894 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"92797b7d-c5ce-45d9-b85a-422cbbda3700","Type":"ContainerDied","Data":"4f6b396afd9ba57b0d0135d69ba0b2a2dd67ad03082887056e4ef6c17b8a6ddb"} Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.777918 5030 scope.go:117] "RemoveContainer" containerID="4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.778108 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.786593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5db97c47-beb0-437c-bcae-2aaabb7cf3ff","Type":"ContainerStarted","Data":"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937"} Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.786668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5db97c47-beb0-437c-bcae-2aaabb7cf3ff","Type":"ContainerStarted","Data":"9b95d5e2f9dcc3a881806e642605eb94b93b3d25fb31a605824287006f823e2f"} Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.787007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.805127 5030 scope.go:117] "RemoveContainer" containerID="a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.819871 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=1.819852088 podStartE2EDuration="1.819852088s" podCreationTimestamp="2026-01-20 23:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:07.808296817 +0000 UTC m=+3880.128557115" watchObservedRunningTime="2026-01-20 23:40:07.819852088 +0000 UTC m=+3880.140112386" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.831003 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.837040 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.859919 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860254 5030 scope.go:117] "RemoveContainer" containerID="b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.860446 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-central-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-central-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.860492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-notification-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860501 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-notification-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.860528 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="proxy-httpd" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860538 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="proxy-httpd" Jan 20 23:40:07 crc 
kubenswrapper[5030]: E0120 23:40:07.860559 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="sg-core" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860567 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="sg-core" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860791 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-central-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="proxy-httpd" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860830 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="sg-core" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.860855 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" containerName="ceilometer-notification-agent" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.867845 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.870196 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.870390 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.878040 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.891926 5030 scope.go:117] "RemoveContainer" containerID="066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.910856 5030 scope.go:117] "RemoveContainer" containerID="4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.911237 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4\": container with ID starting with 4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4 not found: ID does not exist" containerID="4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.911267 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4"} err="failed to get container status \"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4\": rpc error: code = NotFound desc = could not find container \"4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4\": container with ID starting with 4d69001c30c5924e9c41d154f7c082263d5373513bdf70a50cd66bb3684953a4 not found: ID does not exist" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.911289 5030 scope.go:117] "RemoveContainer" containerID="a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.911686 5030 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54\": container with ID starting with a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54 not found: ID does not exist" containerID="a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.911732 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54"} err="failed to get container status \"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54\": rpc error: code = NotFound desc = could not find container \"a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54\": container with ID starting with a1b86b43730b7a0fb803ffc9ce9dc60f67efd356920a732d429b21de59f6bb54 not found: ID does not exist" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.911767 5030 scope.go:117] "RemoveContainer" containerID="b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.912034 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0\": container with ID starting with b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0 not found: ID does not exist" containerID="b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.912070 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0"} err="failed to get container status \"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0\": rpc error: code = NotFound desc = could not find container \"b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0\": container with ID starting with b34f46179fa8bd339b04b207e1b1de3fb4da30bd9a0124a1d5929b2ea8da5fe0 not found: ID does not exist" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.912084 5030 scope.go:117] "RemoveContainer" containerID="066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c" Jan 20 23:40:07 crc kubenswrapper[5030]: E0120 23:40:07.912778 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c\": container with ID starting with 066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c not found: ID does not exist" containerID="066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.912813 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c"} err="failed to get container status \"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c\": rpc error: code = NotFound desc = could not find container \"066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c\": container with ID starting with 066c1361524f5c6bbf2dfd417a5d6dbf303afaac13c7660cd90d510fb0d0635c not found: ID does not exist" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936076 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4wp\" (UniqueName: \"kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936337 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.936502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:07 crc kubenswrapper[5030]: I0120 23:40:07.971481 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92797b7d-c5ce-45d9-b85a-422cbbda3700" path="/var/lib/kubelet/pods/92797b7d-c5ce-45d9-b85a-422cbbda3700/volumes" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048644 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4wp\" (UniqueName: \"kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.048840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.049301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.049336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.056169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.060230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.063424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts\") pod \"ceilometer-0\" 
(UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.072759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.090256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4wp\" (UniqueName: \"kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp\") pod \"ceilometer-0\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.189945 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:08 crc kubenswrapper[5030]: W0120 23:40:08.677453 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod233cb4a4_24c7_4596_adc1_d7e4a47af70a.slice/crio-e5976501a30164b4d6e9d409b64b0ac22ffdf9d2149cfc5d8660d59e96ee35f3 WatchSource:0}: Error finding container e5976501a30164b4d6e9d409b64b0ac22ffdf9d2149cfc5d8660d59e96ee35f3: Status 404 returned error can't find the container with id e5976501a30164b4d6e9d409b64b0ac22ffdf9d2149cfc5d8660d59e96ee35f3 Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.678644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:08 crc kubenswrapper[5030]: I0120 23:40:08.798004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerStarted","Data":"e5976501a30164b4d6e9d409b64b0ac22ffdf9d2149cfc5d8660d59e96ee35f3"} Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.079966 5030 scope.go:117] "RemoveContainer" containerID="7ebffc22266db7afd3868873c3574233432e05434afc8ac8fe64ce746971e63e" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.115747 5030 scope.go:117] "RemoveContainer" containerID="a817fccb1d8108fa564f78fe7c39689a51604fe6b3c5f2955526baac928c24e3" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.147719 5030 scope.go:117] "RemoveContainer" containerID="23f3dcb0553f3089e12b860e39bdceefecb58a03add17ca1ec21c1cfd51434a4" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.180447 5030 scope.go:117] "RemoveContainer" containerID="d5e32ad78f87773cab221c54225e04fd0ff72e850437ba7b5b0074e225f96e54" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.208960 5030 scope.go:117] "RemoveContainer" containerID="bd36c5c2f114bde7769c6cda08f898ffc29246290b55ddd8e9c6358ddea9bfac" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.323884 5030 scope.go:117] "RemoveContainer" containerID="e2f3539b0d09fd315ddf7e23acf530f47592955f23925972593c99550f0a21fb" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.345752 5030 scope.go:117] "RemoveContainer" containerID="5038cf6a212317ce8623b41306e775f298e2c7a56a2b338c29c6f7840be367fa" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.378220 5030 scope.go:117] "RemoveContainer" containerID="7672969c7149a4ed716e18defd00bfdaebf66124fd33296b0113971e32bea71b" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.397852 5030 scope.go:117] "RemoveContainer" 
containerID="89d17ab7dc16cfbf12bf89444cea716d748efedb57923810b56061ff70252a84" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.412398 5030 scope.go:117] "RemoveContainer" containerID="1364f3685e72aa6c1fbe3584228fe8fd80440e7f23b6bca2e91fbdaa8ae36662" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.427849 5030 scope.go:117] "RemoveContainer" containerID="5ecde8ec052026f4220036a91d5a3a71cf246312e5090cad9d644e11e17921bb" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.444913 5030 scope.go:117] "RemoveContainer" containerID="820679d70ac112437e5c84042ac99cfca7c5a633c0df3f571874cadf6eb168d5" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.463319 5030 scope.go:117] "RemoveContainer" containerID="98af69b0bfdc6fc1afc39aff882af1e07a660d71a898d7354bda19b513fc36c9" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.484225 5030 scope.go:117] "RemoveContainer" containerID="f2c210a76ba1350e345c553a28c3e3eb934894ee48a553eb12bcb7de9a8f8525" Jan 20 23:40:09 crc kubenswrapper[5030]: I0120 23:40:09.813182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerStarted","Data":"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a"} Jan 20 23:40:10 crc kubenswrapper[5030]: I0120 23:40:10.826338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerStarted","Data":"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701"} Jan 20 23:40:10 crc kubenswrapper[5030]: I0120 23:40:10.826689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerStarted","Data":"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc"} Jan 20 23:40:12 crc kubenswrapper[5030]: I0120 23:40:12.862983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerStarted","Data":"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691"} Jan 20 23:40:12 crc kubenswrapper[5030]: I0120 23:40:12.863663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:12 crc kubenswrapper[5030]: I0120 23:40:12.898129 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.665260248 podStartE2EDuration="5.898093031s" podCreationTimestamp="2026-01-20 23:40:07 +0000 UTC" firstStartedPulling="2026-01-20 23:40:08.680440549 +0000 UTC m=+3881.000700857" lastFinishedPulling="2026-01-20 23:40:11.913273342 +0000 UTC m=+3884.233533640" observedRunningTime="2026-01-20 23:40:12.889115693 +0000 UTC m=+3885.209375981" watchObservedRunningTime="2026-01-20 23:40:12.898093031 +0000 UTC m=+3885.218353319" Jan 20 23:40:16 crc kubenswrapper[5030]: I0120 23:40:16.521157 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.091295 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.092498 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.097973 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.098226 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.117448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.233976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.234031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.234079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbsv\" (UniqueName: \"kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.234110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.264703 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.266291 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.272552 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.281425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.310789 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.311914 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.315526 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbsv\" (UniqueName: \"kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kccwx\" (UniqueName: \"kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.335965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.336006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.336034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.340644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 
23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.344031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.344129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.346951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.363731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbsv\" (UniqueName: \"kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv\") pod \"nova-cell0-cell-mapping-b95d7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.414699 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.416132 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.420967 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.432014 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437594 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437658 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kccwx\" (UniqueName: \"kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nftl9\" (UniqueName: \"kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.437991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.451254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.468979 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.470249 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.484798 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.508424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.510572 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.526513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kccwx\" (UniqueName: \"kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx\") pod \"nova-api-0\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.540843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nftl9\" (UniqueName: \"kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.540969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.541008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.541336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.541367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.541386 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.541959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85vvz\" (UniqueName: \"kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.542009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6bk\" (UniqueName: \"kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.542034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.542092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.550881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.556278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.571822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nftl9\" (UniqueName: \"kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9\") pod \"nova-scheduler-0\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.618340 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.643882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.643937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.643964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.644027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85vvz\" (UniqueName: \"kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.644051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6bk\" (UniqueName: \"kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.644414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.645830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.646976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.648872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.649393 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.651139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.651422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.662655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6bk\" (UniqueName: \"kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk\") pod \"nova-cell1-novncproxy-0\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.665264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85vvz\" (UniqueName: \"kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz\") pod \"nova-metadata-0\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.734313 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.865693 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.886890 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:17 crc kubenswrapper[5030]: I0120 23:40:17.976759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.105170 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.155422 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.164809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.164935 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.167209 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.167431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.266267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.266493 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s546n\" (UniqueName: \"kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.266561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.266600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.305031 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.368830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.368881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s546n\" (UniqueName: \"kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.368927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.368960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.374671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.374990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.378495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.400360 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s546n\" (UniqueName: \"kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n\") pod \"nova-cell1-conductor-db-sync-vb7pb\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: W0120 23:40:18.413738 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f0d5d9_ebbc_4d51_9880_0b0fab8800b6.slice/crio-d9717419c036bbdd9db860687483812bcb858dfae017b42aa0c6827d2a28910e WatchSource:0}: Error finding container d9717419c036bbdd9db860687483812bcb858dfae017b42aa0c6827d2a28910e: Status 404 returned error can't find the container with id d9717419c036bbdd9db860687483812bcb858dfae017b42aa0c6827d2a28910e Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.420384 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.486549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.549394 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.920753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerStarted","Data":"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.921350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerStarted","Data":"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.921368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerStarted","Data":"0680c5d9c990c5e3cbe1c81ad411bdb7cf9199b61579cda9f6f37f9ecd506651"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.922497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6","Type":"ContainerStarted","Data":"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.922531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6","Type":"ContainerStarted","Data":"d9717419c036bbdd9db860687483812bcb858dfae017b42aa0c6827d2a28910e"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.929722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" event={"ID":"c9412ce4-a977-4306-a71c-f52217d2a3e7","Type":"ContainerStarted","Data":"1e8a9aa23fb4f9c7588b5b3cfa8a42d9d4011ad83f9f2ffcca6b66ca93a8c094"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.929759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" event={"ID":"c9412ce4-a977-4306-a71c-f52217d2a3e7","Type":"ContainerStarted","Data":"7547c0b093ce631b77e3de5ce04d05d4130df463469ef80ec0215062db54f8e6"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.931474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d814a861-2a2f-4f40-a7ec-79379f1d1df7","Type":"ContainerStarted","Data":"8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.931511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d814a861-2a2f-4f40-a7ec-79379f1d1df7","Type":"ContainerStarted","Data":"7484ec99fcb44e7186c55f0e2ac85fe3a0d3c0bd38cb9bf611bbc309492650ba"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.934275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerStarted","Data":"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.934321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerStarted","Data":"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.934335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerStarted","Data":"e9acfb82ee15a5f3d7a007c1030d6cfa07ff439f9f98bdaa4834218f650b2c8b"} Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.946970 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.946952249 podStartE2EDuration="1.946952249s" podCreationTimestamp="2026-01-20 23:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:18.941501257 +0000 UTC m=+3891.261761545" watchObservedRunningTime="2026-01-20 23:40:18.946952249 +0000 UTC m=+3891.267212537" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.962772 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.962758992 podStartE2EDuration="1.962758992s" podCreationTimestamp="2026-01-20 23:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:18.962392023 +0000 UTC m=+3891.282652311" watchObservedRunningTime="2026-01-20 23:40:18.962758992 +0000 UTC m=+3891.283019280" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.963170 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:40:18 crc kubenswrapper[5030]: E0120 23:40:18.963388 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.989513 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.989497319 podStartE2EDuration="1.989497319s" podCreationTimestamp="2026-01-20 23:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:18.976702749 +0000 UTC m=+3891.296963037" watchObservedRunningTime="2026-01-20 23:40:18.989497319 +0000 UTC m=+3891.309757607" Jan 20 23:40:18 crc kubenswrapper[5030]: I0120 23:40:18.989582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb"] Jan 20 23:40:19 crc kubenswrapper[5030]: I0120 23:40:19.006643 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" podStartSLOduration=2.006610463 podStartE2EDuration="2.006610463s" podCreationTimestamp="2026-01-20 23:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:18.993602848 +0000 UTC m=+3891.313863136" 
watchObservedRunningTime="2026-01-20 23:40:19.006610463 +0000 UTC m=+3891.326870751" Jan 20 23:40:19 crc kubenswrapper[5030]: I0120 23:40:19.017555 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.017537847 podStartE2EDuration="2.017537847s" podCreationTimestamp="2026-01-20 23:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:19.011220865 +0000 UTC m=+3891.331481163" watchObservedRunningTime="2026-01-20 23:40:19.017537847 +0000 UTC m=+3891.337798135" Jan 20 23:40:19 crc kubenswrapper[5030]: I0120 23:40:19.945368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" event={"ID":"65c08c3d-8985-44c2-81bf-a50efb87efe6","Type":"ContainerStarted","Data":"2924e6130325cda5b1a20e5c851ab6d28c294009f722d453a9f1ae2bd9a43edb"} Jan 20 23:40:19 crc kubenswrapper[5030]: I0120 23:40:19.945720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" event={"ID":"65c08c3d-8985-44c2-81bf-a50efb87efe6","Type":"ContainerStarted","Data":"088edd86af1f1ca4cc3c2f5eddbaced3a6ff230ad9dc3aa511b15dbcb35cefa2"} Jan 20 23:40:19 crc kubenswrapper[5030]: I0120 23:40:19.975384 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" podStartSLOduration=1.975358402 podStartE2EDuration="1.975358402s" podCreationTimestamp="2026-01-20 23:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:19.967557804 +0000 UTC m=+3892.287818102" watchObservedRunningTime="2026-01-20 23:40:19.975358402 +0000 UTC m=+3892.295618720" Jan 20 23:40:20 crc kubenswrapper[5030]: I0120 23:40:20.798401 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:20 crc kubenswrapper[5030]: I0120 23:40:20.875073 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:20 crc kubenswrapper[5030]: I0120 23:40:20.952398 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85" gracePeriod=30 Jan 20 23:40:20 crc kubenswrapper[5030]: I0120 23:40:20.953005 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-log" containerID="cri-o://94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" gracePeriod=30 Jan 20 23:40:20 crc kubenswrapper[5030]: I0120 23:40:20.953066 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-metadata" containerID="cri-o://40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" gracePeriod=30 Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.534170 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.542583 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data\") pod \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f6bk\" (UniqueName: \"kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk\") pod \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs\") pod \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle\") pod \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85vvz\" (UniqueName: \"kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz\") pod \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\" (UID: \"0fbd4d3b-6c4d-4786-b701-f9a8844ffade\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data\") pod \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.648799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle\") pod \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\" (UID: \"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6\") " Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.649359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs" (OuterVolumeSpecName: "logs") pod "0fbd4d3b-6c4d-4786-b701-f9a8844ffade" (UID: "0fbd4d3b-6c4d-4786-b701-f9a8844ffade"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.649696 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.653866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz" (OuterVolumeSpecName: "kube-api-access-85vvz") pod "0fbd4d3b-6c4d-4786-b701-f9a8844ffade" (UID: "0fbd4d3b-6c4d-4786-b701-f9a8844ffade"). InnerVolumeSpecName "kube-api-access-85vvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.655954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk" (OuterVolumeSpecName: "kube-api-access-4f6bk") pod "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" (UID: "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6"). InnerVolumeSpecName "kube-api-access-4f6bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.675772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" (UID: "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.677954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fbd4d3b-6c4d-4786-b701-f9a8844ffade" (UID: "0fbd4d3b-6c4d-4786-b701-f9a8844ffade"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.684386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data" (OuterVolumeSpecName: "config-data") pod "0fbd4d3b-6c4d-4786-b701-f9a8844ffade" (UID: "0fbd4d3b-6c4d-4786-b701-f9a8844ffade"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.686237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data" (OuterVolumeSpecName: "config-data") pod "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" (UID: "08f0d5d9-ebbc-4d51-9880-0b0fab8800b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752016 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752065 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85vvz\" (UniqueName: \"kubernetes.io/projected/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-kube-api-access-85vvz\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752086 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752102 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752119 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fbd4d3b-6c4d-4786-b701-f9a8844ffade-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.752136 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f6bk\" (UniqueName: \"kubernetes.io/projected/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6-kube-api-access-4f6bk\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.963225 5030 generic.go:334] "Generic (PLEG): container finished" podID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" containerID="3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85" exitCode=0 Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.963320 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.968540 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerID="40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" exitCode=0 Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.968566 5030 generic.go:334] "Generic (PLEG): container finished" podID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerID="94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" exitCode=143 Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.968633 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6","Type":"ContainerDied","Data":"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85"} Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"08f0d5d9-ebbc-4d51-9880-0b0fab8800b6","Type":"ContainerDied","Data":"d9717419c036bbdd9db860687483812bcb858dfae017b42aa0c6827d2a28910e"} Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerDied","Data":"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514"} Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerDied","Data":"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c"} Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"0fbd4d3b-6c4d-4786-b701-f9a8844ffade","Type":"ContainerDied","Data":"e9acfb82ee15a5f3d7a007c1030d6cfa07ff439f9f98bdaa4834218f650b2c8b"} Jan 20 23:40:21 crc kubenswrapper[5030]: I0120 23:40:21.973677 5030 scope.go:117] "RemoveContainer" containerID="3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.018754 5030 scope.go:117] "RemoveContainer" containerID="3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85" Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.020411 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85\": container with ID starting with 3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85 not found: ID does not exist" containerID="3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.020467 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85"} err="failed to get container status \"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85\": rpc error: code = NotFound desc = could not find container \"3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85\": container with ID starting with 3d3f7103d35ca8412dace1990cdb16df9face3e5f03f401a5ad8b24d70557d85 not found: ID does not exist" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.020497 5030 scope.go:117] "RemoveContainer" containerID="40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.031285 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.042161 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.055705 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.072474 5030 scope.go:117] "RemoveContainer" containerID="94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085040 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.085554 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-log" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-log" Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.085616 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.085701 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-metadata" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085709 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-metadata" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085946 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-log" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.085993 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" containerName="nova-metadata-metadata" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.086775 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.091125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.091279 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.091382 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.095258 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.103722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.112306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.114952 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.117496 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.118893 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.121407 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.156700 5030 scope.go:117] "RemoveContainer" containerID="40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.157196 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514\": container with ID starting with 40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514 not found: ID does not exist" containerID="40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.157236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514"} err="failed to get container status \"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514\": rpc error: code = NotFound desc = could not find container \"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514\": container with ID starting with 40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514 not found: ID does not exist" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.157272 5030 scope.go:117] "RemoveContainer" containerID="94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" Jan 20 23:40:22 crc kubenswrapper[5030]: E0120 23:40:22.157660 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c\": container with 
ID starting with 94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c not found: ID does not exist" containerID="94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.157704 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c"} err="failed to get container status \"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c\": rpc error: code = NotFound desc = could not find container \"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c\": container with ID starting with 94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c not found: ID does not exist" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.157730 5030 scope.go:117] "RemoveContainer" containerID="40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158000 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514"} err="failed to get container status \"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514\": rpc error: code = NotFound desc = could not find container \"40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514\": container with ID starting with 40f47041447dd1078605e41bac1b089f00b12e3b208e480b6da7e6c683ab9514 not found: ID does not exist" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158025 5030 scope.go:117] "RemoveContainer" containerID="94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158243 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c"} err="failed to get container status \"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c\": rpc error: code = NotFound desc = could not find container \"94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c\": container with ID starting with 94bcbd65435bd527a9609c5d68466f02a872a6adba7623cc78bdf432acf6830c not found: ID does not exist" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh48t\" (UniqueName: \"kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.158508 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.260435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrd8d\" (UniqueName: \"kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261866 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261936 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh48t\" (UniqueName: \"kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.261987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.265286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.270069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.270268 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.270432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.282964 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh48t\" (UniqueName: \"kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t\") pod \"nova-cell1-novncproxy-0\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.363698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.363753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.363914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrd8d\" (UniqueName: \"kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.363978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.364030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.364870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.370198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.370350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.370911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.383113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrd8d\" (UniqueName: \"kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d\") pod \"nova-metadata-0\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.458219 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.468132 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.735520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.896713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:40:22 crc kubenswrapper[5030]: I0120 23:40:22.978613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eedd0095-2878-461a-a374-8f50c0a0f040","Type":"ContainerStarted","Data":"2712b6d1babe11a94d357395e930698091b1d864ca55cf94cbb0879d16fb7dfc"} Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.155389 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.977754 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f0d5d9-ebbc-4d51-9880-0b0fab8800b6" path="/var/lib/kubelet/pods/08f0d5d9-ebbc-4d51-9880-0b0fab8800b6/volumes" Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.979124 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbd4d3b-6c4d-4786-b701-f9a8844ffade" path="/var/lib/kubelet/pods/0fbd4d3b-6c4d-4786-b701-f9a8844ffade/volumes" Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.993611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerStarted","Data":"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76"} Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.994378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerStarted","Data":"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f"} Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.994470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerStarted","Data":"e945c65be71a74f9402dbb7dcb8ac02a738611f80277a600d96e5e1c8e75b6f0"} Jan 20 23:40:23 crc kubenswrapper[5030]: I0120 23:40:23.999361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eedd0095-2878-461a-a374-8f50c0a0f040","Type":"ContainerStarted","Data":"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1"} Jan 20 23:40:24 crc kubenswrapper[5030]: I0120 23:40:24.111730 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.111693777 podStartE2EDuration="2.111693777s" podCreationTimestamp="2026-01-20 23:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:24.045294639 +0000 UTC m=+3896.365554927" watchObservedRunningTime="2026-01-20 23:40:24.111693777 +0000 UTC m=+3896.431954065" Jan 20 23:40:26 crc kubenswrapper[5030]: I0120 23:40:26.019048 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="c9412ce4-a977-4306-a71c-f52217d2a3e7" containerID="1e8a9aa23fb4f9c7588b5b3cfa8a42d9d4011ad83f9f2ffcca6b66ca93a8c094" exitCode=0 Jan 20 23:40:26 crc kubenswrapper[5030]: I0120 23:40:26.019167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" event={"ID":"c9412ce4-a977-4306-a71c-f52217d2a3e7","Type":"ContainerDied","Data":"1e8a9aa23fb4f9c7588b5b3cfa8a42d9d4011ad83f9f2ffcca6b66ca93a8c094"} Jan 20 23:40:26 crc kubenswrapper[5030]: I0120 23:40:26.039251 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=4.039235754 podStartE2EDuration="4.039235754s" podCreationTimestamp="2026-01-20 23:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:24.117267182 +0000 UTC m=+3896.437527470" watchObservedRunningTime="2026-01-20 23:40:26.039235754 +0000 UTC m=+3898.359496042" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.036100 5030 generic.go:334] "Generic (PLEG): container finished" podID="65c08c3d-8985-44c2-81bf-a50efb87efe6" containerID="2924e6130325cda5b1a20e5c851ab6d28c294009f722d453a9f1ae2bd9a43edb" exitCode=0 Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.036375 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" event={"ID":"65c08c3d-8985-44c2-81bf-a50efb87efe6","Type":"ContainerDied","Data":"2924e6130325cda5b1a20e5c851ab6d28c294009f722d453a9f1ae2bd9a43edb"} Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.435560 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.458931 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.468223 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.468755 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.602702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts\") pod \"c9412ce4-a977-4306-a71c-f52217d2a3e7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.602803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data\") pod \"c9412ce4-a977-4306-a71c-f52217d2a3e7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.602929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle\") pod \"c9412ce4-a977-4306-a71c-f52217d2a3e7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.602991 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dfbsv\" (UniqueName: \"kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv\") pod \"c9412ce4-a977-4306-a71c-f52217d2a3e7\" (UID: \"c9412ce4-a977-4306-a71c-f52217d2a3e7\") " Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.609806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv" (OuterVolumeSpecName: "kube-api-access-dfbsv") pod "c9412ce4-a977-4306-a71c-f52217d2a3e7" (UID: "c9412ce4-a977-4306-a71c-f52217d2a3e7"). InnerVolumeSpecName "kube-api-access-dfbsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.611018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts" (OuterVolumeSpecName: "scripts") pod "c9412ce4-a977-4306-a71c-f52217d2a3e7" (UID: "c9412ce4-a977-4306-a71c-f52217d2a3e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.619504 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.619548 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.636803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data" (OuterVolumeSpecName: "config-data") pod "c9412ce4-a977-4306-a71c-f52217d2a3e7" (UID: "c9412ce4-a977-4306-a71c-f52217d2a3e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.640088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9412ce4-a977-4306-a71c-f52217d2a3e7" (UID: "c9412ce4-a977-4306-a71c-f52217d2a3e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.705287 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.705318 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfbsv\" (UniqueName: \"kubernetes.io/projected/c9412ce4-a977-4306-a71c-f52217d2a3e7-kube-api-access-dfbsv\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.705333 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.705345 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9412ce4-a977-4306-a71c-f52217d2a3e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.735209 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:27 crc kubenswrapper[5030]: I0120 23:40:27.774469 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.054328 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.054383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7" event={"ID":"c9412ce4-a977-4306-a71c-f52217d2a3e7","Type":"ContainerDied","Data":"7547c0b093ce631b77e3de5ce04d05d4130df463469ef80ec0215062db54f8e6"} Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.054452 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7547c0b093ce631b77e3de5ce04d05d4130df463469ef80ec0215062db54f8e6" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.099803 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.237819 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.238071 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-log" containerID="cri-o://0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70" gracePeriod=30 Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.238161 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-api" containerID="cri-o://80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874" gracePeriod=30 Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.243212 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.236:8774/\": EOF" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.243446 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.236:8774/\": EOF" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.292472 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.447951 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.583311 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.621141 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s546n\" (UniqueName: \"kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n\") pod \"65c08c3d-8985-44c2-81bf-a50efb87efe6\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.621268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data\") pod \"65c08c3d-8985-44c2-81bf-a50efb87efe6\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.621497 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle\") pod \"65c08c3d-8985-44c2-81bf-a50efb87efe6\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.621530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts\") pod \"65c08c3d-8985-44c2-81bf-a50efb87efe6\" (UID: \"65c08c3d-8985-44c2-81bf-a50efb87efe6\") " Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.627648 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n" (OuterVolumeSpecName: "kube-api-access-s546n") pod "65c08c3d-8985-44c2-81bf-a50efb87efe6" (UID: "65c08c3d-8985-44c2-81bf-a50efb87efe6"). InnerVolumeSpecName "kube-api-access-s546n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.636355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts" (OuterVolumeSpecName: "scripts") pod "65c08c3d-8985-44c2-81bf-a50efb87efe6" (UID: "65c08c3d-8985-44c2-81bf-a50efb87efe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.648179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data" (OuterVolumeSpecName: "config-data") pod "65c08c3d-8985-44c2-81bf-a50efb87efe6" (UID: "65c08c3d-8985-44c2-81bf-a50efb87efe6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.648654 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65c08c3d-8985-44c2-81bf-a50efb87efe6" (UID: "65c08c3d-8985-44c2-81bf-a50efb87efe6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.723975 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.724163 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.724251 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s546n\" (UniqueName: \"kubernetes.io/projected/65c08c3d-8985-44c2-81bf-a50efb87efe6-kube-api-access-s546n\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:28 crc kubenswrapper[5030]: I0120 23:40:28.724319 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c08c3d-8985-44c2-81bf-a50efb87efe6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.064100 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerID="0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70" exitCode=143 Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.064184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerDied","Data":"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70"} Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.066858 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-log" containerID="cri-o://3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" gracePeriod=30 Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.067193 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.067446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb" event={"ID":"65c08c3d-8985-44c2-81bf-a50efb87efe6","Type":"ContainerDied","Data":"088edd86af1f1ca4cc3c2f5eddbaced3a6ff230ad9dc3aa511b15dbcb35cefa2"} Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.067502 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="088edd86af1f1ca4cc3c2f5eddbaced3a6ff230ad9dc3aa511b15dbcb35cefa2" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.068434 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-metadata" containerID="cri-o://77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" gracePeriod=30 Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.141118 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:40:29 crc kubenswrapper[5030]: E0120 23:40:29.141566 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c08c3d-8985-44c2-81bf-a50efb87efe6" containerName="nova-cell1-conductor-db-sync" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.141581 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c08c3d-8985-44c2-81bf-a50efb87efe6" containerName="nova-cell1-conductor-db-sync" Jan 20 23:40:29 crc kubenswrapper[5030]: E0120 23:40:29.141602 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9412ce4-a977-4306-a71c-f52217d2a3e7" containerName="nova-manage" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.141608 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9412ce4-a977-4306-a71c-f52217d2a3e7" containerName="nova-manage" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.141796 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c08c3d-8985-44c2-81bf-a50efb87efe6" containerName="nova-cell1-conductor-db-sync" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.141814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9412ce4-a977-4306-a71c-f52217d2a3e7" containerName="nova-manage" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.142402 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.147232 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.148536 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.234469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnzww\" (UniqueName: \"kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.234835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.234890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.337584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.337720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnzww\" (UniqueName: \"kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.337782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.347338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.369044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.376218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnzww\" (UniqueName: \"kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww\") pod \"nova-cell1-conductor-0\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.459579 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:29 crc kubenswrapper[5030]: I0120 23:40:29.904348 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.049900 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data\") pod \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs\") pod \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle\") pod \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrd8d\" (UniqueName: \"kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d\") pod \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs\") pod \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\" (UID: \"b669d4ff-6ded-4155-88f4-e185ee5a61b4\") " Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs" (OuterVolumeSpecName: "logs") pod "b669d4ff-6ded-4155-88f4-e185ee5a61b4" (UID: "b669d4ff-6ded-4155-88f4-e185ee5a61b4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.050936 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b669d4ff-6ded-4155-88f4-e185ee5a61b4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.053187 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.057054 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d" (OuterVolumeSpecName: "kube-api-access-nrd8d") pod "b669d4ff-6ded-4155-88f4-e185ee5a61b4" (UID: "b669d4ff-6ded-4155-88f4-e185ee5a61b4"). InnerVolumeSpecName "kube-api-access-nrd8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:30 crc kubenswrapper[5030]: W0120 23:40:30.062907 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf19ee173_f9bb_4671_afde_4107284c57fa.slice/crio-d4da8a4187a3585a50584099560f4c953725cb26ddf3a80ef84340fc6d250971 WatchSource:0}: Error finding container d4da8a4187a3585a50584099560f4c953725cb26ddf3a80ef84340fc6d250971: Status 404 returned error can't find the container with id d4da8a4187a3585a50584099560f4c953725cb26ddf3a80ef84340fc6d250971 Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.079256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b669d4ff-6ded-4155-88f4-e185ee5a61b4" (UID: "b669d4ff-6ded-4155-88f4-e185ee5a61b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.079730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data" (OuterVolumeSpecName: "config-data") pod "b669d4ff-6ded-4155-88f4-e185ee5a61b4" (UID: "b669d4ff-6ded-4155-88f4-e185ee5a61b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080213 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080170 5030 generic.go:334] "Generic (PLEG): container finished" podID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerID="77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" exitCode=0 Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080251 5030 generic.go:334] "Generic (PLEG): container finished" podID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerID="3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" exitCode=143 Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerDied","Data":"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76"} Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerDied","Data":"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f"} Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"b669d4ff-6ded-4155-88f4-e185ee5a61b4","Type":"ContainerDied","Data":"e945c65be71a74f9402dbb7dcb8ac02a738611f80277a600d96e5e1c8e75b6f0"} Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.080377 5030 scope.go:117] "RemoveContainer" containerID="77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.081910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f19ee173-f9bb-4671-afde-4107284c57fa","Type":"ContainerStarted","Data":"d4da8a4187a3585a50584099560f4c953725cb26ddf3a80ef84340fc6d250971"} Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.082035 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerName="nova-scheduler-scheduler" containerID="cri-o://8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" gracePeriod=30 Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.115089 5030 scope.go:117] "RemoveContainer" containerID="3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.123148 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b669d4ff-6ded-4155-88f4-e185ee5a61b4" (UID: "b669d4ff-6ded-4155-88f4-e185ee5a61b4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.140935 5030 scope.go:117] "RemoveContainer" containerID="77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" Jan 20 23:40:30 crc kubenswrapper[5030]: E0120 23:40:30.141511 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76\": container with ID starting with 77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76 not found: ID does not exist" containerID="77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.141542 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76"} err="failed to get container status \"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76\": rpc error: code = NotFound desc = could not find container \"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76\": container with ID starting with 77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76 not found: ID does not exist" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.141565 5030 scope.go:117] "RemoveContainer" containerID="3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" Jan 20 23:40:30 crc kubenswrapper[5030]: E0120 23:40:30.142174 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f\": container with ID starting with 3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f not found: ID does not exist" containerID="3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.142199 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f"} err="failed to get container status \"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f\": rpc error: code = NotFound desc = could not find container \"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f\": container with ID starting with 3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f not found: ID does not exist" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.142214 5030 scope.go:117] "RemoveContainer" containerID="77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.142552 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76"} err="failed to get container status \"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76\": rpc error: code = NotFound desc = could not find container \"77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76\": container with ID starting with 77f4a45c5ca56fbed04af657f6642ff13198dbfbb5a68bb82c7830842063fe76 not found: ID does not exist" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.142607 5030 scope.go:117] "RemoveContainer" containerID="3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.143000 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f"} err="failed to get container status \"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f\": rpc error: code = NotFound desc = could not find container \"3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f\": container with ID starting with 3151f2d87c6592a592ab63d5df8bb0c81be817dce50874f80fc207fff7cda82f not found: ID does not exist" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.152594 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.152631 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.152643 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrd8d\" (UniqueName: \"kubernetes.io/projected/b669d4ff-6ded-4155-88f4-e185ee5a61b4-kube-api-access-nrd8d\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.152654 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b669d4ff-6ded-4155-88f4-e185ee5a61b4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.419973 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.443813 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.456700 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:30 crc kubenswrapper[5030]: E0120 23:40:30.457729 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-metadata" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.457788 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-metadata" Jan 20 23:40:30 crc kubenswrapper[5030]: E0120 23:40:30.457813 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-log" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.457820 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-log" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.458199 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-log" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.458225 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" containerName="nova-metadata-metadata" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.464305 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.471004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.471736 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.483291 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.559587 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.559909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gp6v\" (UniqueName: \"kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.560071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.560152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.560270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gp6v\" (UniqueName: 
\"kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.662988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.667642 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.668025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.668083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.677901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gp6v\" (UniqueName: \"kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v\") pod \"nova-metadata-0\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:30 crc kubenswrapper[5030]: I0120 23:40:30.778614 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.105385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f19ee173-f9bb-4671-afde-4107284c57fa","Type":"ContainerStarted","Data":"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3"} Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.105691 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.133516 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.133494756 podStartE2EDuration="2.133494756s" podCreationTimestamp="2026-01-20 23:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:31.133353982 +0000 UTC m=+3903.453614300" watchObservedRunningTime="2026-01-20 23:40:31.133494756 +0000 UTC m=+3903.453755054" Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.235911 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:31 crc kubenswrapper[5030]: W0120 23:40:31.243380 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33eeca5d_0bab_4e41_9b66_6f3f971e2137.slice/crio-1cfc5348677079fa5737785b4c99791d9c5996c16ad539b7da54b141574f94df WatchSource:0}: Error finding container 1cfc5348677079fa5737785b4c99791d9c5996c16ad539b7da54b141574f94df: Status 404 returned error can't find the container with id 1cfc5348677079fa5737785b4c99791d9c5996c16ad539b7da54b141574f94df Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.961908 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:40:31 crc kubenswrapper[5030]: E0120 23:40:31.962214 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:40:31 crc kubenswrapper[5030]: I0120 23:40:31.973817 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b669d4ff-6ded-4155-88f4-e185ee5a61b4" path="/var/lib/kubelet/pods/b669d4ff-6ded-4155-88f4-e185ee5a61b4/volumes" Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.118244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerStarted","Data":"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396"} Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.118302 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerStarted","Data":"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae"} Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.118329 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerStarted","Data":"1cfc5348677079fa5737785b4c99791d9c5996c16ad539b7da54b141574f94df"} Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.150639 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.150608295 podStartE2EDuration="2.150608295s" podCreationTimestamp="2026-01-20 23:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:32.134815843 +0000 UTC m=+3904.455076131" watchObservedRunningTime="2026-01-20 23:40:32.150608295 +0000 UTC m=+3904.470868623" Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.458826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.475221 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:32 crc kubenswrapper[5030]: E0120 23:40:32.737125 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:32 crc kubenswrapper[5030]: E0120 23:40:32.739136 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:32 crc kubenswrapper[5030]: E0120 23:40:32.740247 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:32 crc kubenswrapper[5030]: E0120 23:40:32.740277 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerName="nova-scheduler-scheduler" Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.845786 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.848188 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:32 crc kubenswrapper[5030]: I0120 23:40:32.862357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.005795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.005939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.006027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh89z\" (UniqueName: \"kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.107623 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh89z\" (UniqueName: \"kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.107707 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.107848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.108887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.109693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.128133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fh89z\" (UniqueName: \"kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z\") pod \"redhat-operators-gnqzm\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.153511 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.168797 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:33 crc kubenswrapper[5030]: I0120 23:40:33.612516 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.002473 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.128304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs\") pod \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.128400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kccwx\" (UniqueName: \"kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx\") pod \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.128468 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle\") pod \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.128581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data\") pod \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\" (UID: \"5a29c8fd-5ce7-4561-991f-bbac253c1a4e\") " Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.129869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs" (OuterVolumeSpecName: "logs") pod "5a29c8fd-5ce7-4561-991f-bbac253c1a4e" (UID: "5a29c8fd-5ce7-4561-991f-bbac253c1a4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.142849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx" (OuterVolumeSpecName: "kube-api-access-kccwx") pod "5a29c8fd-5ce7-4561-991f-bbac253c1a4e" (UID: "5a29c8fd-5ce7-4561-991f-bbac253c1a4e"). InnerVolumeSpecName "kube-api-access-kccwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.152097 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerID="80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874" exitCode=0 Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.152172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerDied","Data":"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874"} Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.152196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"5a29c8fd-5ce7-4561-991f-bbac253c1a4e","Type":"ContainerDied","Data":"0680c5d9c990c5e3cbe1c81ad411bdb7cf9199b61579cda9f6f37f9ecd506651"} Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.152213 5030 scope.go:117] "RemoveContainer" containerID="80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.152309 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.158519 5030 generic.go:334] "Generic (PLEG): container finished" podID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerID="ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24" exitCode=0 Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.160242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerDied","Data":"ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24"} Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.160268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerStarted","Data":"43cd6744eb9072a804f5ebe18b7992defcc4df9648f462967cc02e491454fc6b"} Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.167901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data" (OuterVolumeSpecName: "config-data") pod "5a29c8fd-5ce7-4561-991f-bbac253c1a4e" (UID: "5a29c8fd-5ce7-4561-991f-bbac253c1a4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.198241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a29c8fd-5ce7-4561-991f-bbac253c1a4e" (UID: "5a29c8fd-5ce7-4561-991f-bbac253c1a4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.230290 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.230321 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.230334 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kccwx\" (UniqueName: \"kubernetes.io/projected/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-kube-api-access-kccwx\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.230344 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a29c8fd-5ce7-4561-991f-bbac253c1a4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.270616 5030 scope.go:117] "RemoveContainer" containerID="0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.291705 5030 scope.go:117] "RemoveContainer" containerID="80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874" Jan 20 23:40:34 crc kubenswrapper[5030]: E0120 23:40:34.292261 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874\": container with ID starting with 80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874 not found: ID does not exist" containerID="80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.292310 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874"} err="failed to get container status \"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874\": rpc error: code = NotFound desc = could not find container \"80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874\": container with ID starting with 80efd511e4ee3768484b403e45e2b6ca39c567b5911b1d9ade0590a90cbd5874 not found: ID does not exist" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.292337 5030 scope.go:117] "RemoveContainer" containerID="0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70" Jan 20 23:40:34 crc kubenswrapper[5030]: E0120 23:40:34.292794 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70\": container with ID starting with 0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70 not found: ID does not exist" containerID="0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.292839 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70"} err="failed to get container status \"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70\": rpc error: code = NotFound desc = could 
not find container \"0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70\": container with ID starting with 0e26fb0031c07818588d5bf4766e1ad866dda0810a4088164b622b8f9f6a5e70 not found: ID does not exist" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.488980 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.498341 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.519892 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:34 crc kubenswrapper[5030]: E0120 23:40:34.520351 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-api" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.520374 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-api" Jan 20 23:40:34 crc kubenswrapper[5030]: E0120 23:40:34.520417 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-log" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.520427 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-log" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.520695 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-api" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.520726 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" containerName="nova-api-log" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.521963 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.525505 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.545826 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.639962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.640370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlf72\" (UniqueName: \"kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.640441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.640487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.741843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.741922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.742095 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.742136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlf72\" (UniqueName: \"kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.742674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.746516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.747010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.757717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlf72\" (UniqueName: \"kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72\") pod \"nova-api-0\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:34 crc kubenswrapper[5030]: I0120 23:40:34.854679 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.169800 5030 generic.go:334] "Generic (PLEG): container finished" podID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerID="8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" exitCode=0 Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.169885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d814a861-2a2f-4f40-a7ec-79379f1d1df7","Type":"ContainerDied","Data":"8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec"} Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.293163 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.515369 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.659414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nftl9\" (UniqueName: \"kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9\") pod \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.659456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle\") pod \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.659517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data\") pod \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\" (UID: \"d814a861-2a2f-4f40-a7ec-79379f1d1df7\") " Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.667732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9" (OuterVolumeSpecName: "kube-api-access-nftl9") pod "d814a861-2a2f-4f40-a7ec-79379f1d1df7" (UID: "d814a861-2a2f-4f40-a7ec-79379f1d1df7"). InnerVolumeSpecName "kube-api-access-nftl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.706242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data" (OuterVolumeSpecName: "config-data") pod "d814a861-2a2f-4f40-a7ec-79379f1d1df7" (UID: "d814a861-2a2f-4f40-a7ec-79379f1d1df7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.720448 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d814a861-2a2f-4f40-a7ec-79379f1d1df7" (UID: "d814a861-2a2f-4f40-a7ec-79379f1d1df7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.761606 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nftl9\" (UniqueName: \"kubernetes.io/projected/d814a861-2a2f-4f40-a7ec-79379f1d1df7-kube-api-access-nftl9\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.761674 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.761686 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d814a861-2a2f-4f40-a7ec-79379f1d1df7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.778895 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.779001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:35 crc kubenswrapper[5030]: I0120 23:40:35.976713 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a29c8fd-5ce7-4561-991f-bbac253c1a4e" path="/var/lib/kubelet/pods/5a29c8fd-5ce7-4561-991f-bbac253c1a4e/volumes" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.181313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerStarted","Data":"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a"} Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.181609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerStarted","Data":"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8"} Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.181649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerStarted","Data":"db811f059ed4c029c9ae09b56ad58e0958c4ba2c8933cb84e9212860dc29bd07"} Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.183830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerStarted","Data":"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e"} Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.185461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"d814a861-2a2f-4f40-a7ec-79379f1d1df7","Type":"ContainerDied","Data":"7484ec99fcb44e7186c55f0e2ac85fe3a0d3c0bd38cb9bf611bbc309492650ba"} Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.185503 5030 scope.go:117] "RemoveContainer" containerID="8218d6e26ae246f3c2e30853ab3bd3acc26e289805956d6fe5a0620489440dec" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.185470 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.204801 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.2047809210000002 podStartE2EDuration="2.204780921s" podCreationTimestamp="2026-01-20 23:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:36.199664316 +0000 UTC m=+3908.519924614" watchObservedRunningTime="2026-01-20 23:40:36.204780921 +0000 UTC m=+3908.525041209" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.603204 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.612368 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.627110 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:36 crc kubenswrapper[5030]: E0120 23:40:36.627654 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerName="nova-scheduler-scheduler" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.627676 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerName="nova-scheduler-scheduler" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.627914 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" containerName="nova-scheduler-scheduler" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.629748 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.632528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.644258 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.780953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmnp\" (UniqueName: \"kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.781302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.781341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.883106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmnp\" (UniqueName: \"kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.883196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.883239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.887858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.888230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.898599 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmnp\" (UniqueName: \"kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp\") pod \"nova-scheduler-0\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:36 crc kubenswrapper[5030]: I0120 23:40:36.963988 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:37 crc kubenswrapper[5030]: I0120 23:40:37.389896 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:37 crc kubenswrapper[5030]: W0120 23:40:37.392303 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc96eb84e_9bf3_49a7_a7b6_a5ae9948f615.slice/crio-8dbb7fafcf2f3d8860b7f2395e39307d9c73dbda0bf4865976b27f6d40dc4b24 WatchSource:0}: Error finding container 8dbb7fafcf2f3d8860b7f2395e39307d9c73dbda0bf4865976b27f6d40dc4b24: Status 404 returned error can't find the container with id 8dbb7fafcf2f3d8860b7f2395e39307d9c73dbda0bf4865976b27f6d40dc4b24 Jan 20 23:40:37 crc kubenswrapper[5030]: I0120 23:40:37.999144 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d814a861-2a2f-4f40-a7ec-79379f1d1df7" path="/var/lib/kubelet/pods/d814a861-2a2f-4f40-a7ec-79379f1d1df7/volumes" Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.196795 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.231095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615","Type":"ContainerStarted","Data":"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e"} Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.231188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615","Type":"ContainerStarted","Data":"8dbb7fafcf2f3d8860b7f2395e39307d9c73dbda0bf4865976b27f6d40dc4b24"} Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.247941 5030 generic.go:334] "Generic (PLEG): container finished" podID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerID="e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e" exitCode=0 Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.247989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerDied","Data":"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e"} Jan 20 23:40:38 crc kubenswrapper[5030]: I0120 23:40:38.284005 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.28398528 podStartE2EDuration="2.28398528s" podCreationTimestamp="2026-01-20 23:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:38.268265898 +0000 UTC m=+3910.588526236" watchObservedRunningTime="2026-01-20 23:40:38.28398528 +0000 UTC m=+3910.604245578" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.258296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerStarted","Data":"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1"} Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.282322 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gnqzm" podStartSLOduration=2.626103967 podStartE2EDuration="7.282303825s" podCreationTimestamp="2026-01-20 23:40:32 +0000 UTC" firstStartedPulling="2026-01-20 23:40:34.160379303 +0000 UTC m=+3906.480639591" lastFinishedPulling="2026-01-20 23:40:38.816579161 +0000 UTC m=+3911.136839449" observedRunningTime="2026-01-20 23:40:39.278329518 +0000 UTC m=+3911.598589806" watchObservedRunningTime="2026-01-20 23:40:39.282303825 +0000 UTC m=+3911.602564113" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.487658 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.980965 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb"] Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.982262 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.985210 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.985330 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:40:39 crc kubenswrapper[5030]: I0120 23:40:39.996520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb"] Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.156342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz9pj\" (UniqueName: \"kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.156510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.156563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.156727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: 
\"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.258757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz9pj\" (UniqueName: \"kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.258869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.258923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.259010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.264565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.265693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.266496 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.277151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz9pj\" (UniqueName: \"kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj\") pod \"nova-cell1-cell-mapping-m68lb\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.297808 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.755432 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb"] Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.779583 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:40 crc kubenswrapper[5030]: I0120 23:40:40.779621 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.280788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" event={"ID":"1b22b9f9-3ec2-44dd-9284-574e90be8eae","Type":"ContainerStarted","Data":"2b3263321b820b32f721d15d4ce47b9ccdf9ebd90f26eb5bf80c361394a902e3"} Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.281071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" event={"ID":"1b22b9f9-3ec2-44dd-9284-574e90be8eae","Type":"ContainerStarted","Data":"8e8603acd9584307c967f71b952e52d318c419df0bcc391e472d209fdc84406c"} Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.299850 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" podStartSLOduration=2.299830411 podStartE2EDuration="2.299830411s" podCreationTimestamp="2026-01-20 23:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:41.294989283 +0000 UTC m=+3913.615249571" watchObservedRunningTime="2026-01-20 23:40:41.299830411 +0000 UTC m=+3913.620090699" Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.792770 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.792778 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.244:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:40:41 crc kubenswrapper[5030]: I0120 23:40:41.973814 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:42 crc kubenswrapper[5030]: I0120 23:40:42.161733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:42 crc kubenswrapper[5030]: I0120 23:40:42.161952 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="5c398695-7b05-43bd-b76f-68e6a9e85605" containerName="kube-state-metrics" containerID="cri-o://0c04d25d07c9c7cb76b3dc5a59950a742d760a872cb02a14c5668b1b2c745c9a" gracePeriod=30 Jan 20 23:40:42 crc kubenswrapper[5030]: I0120 23:40:42.963176 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:40:42 crc 
kubenswrapper[5030]: E0120 23:40:42.964045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.169842 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.169931 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.315256 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c398695-7b05-43bd-b76f-68e6a9e85605" containerID="0c04d25d07c9c7cb76b3dc5a59950a742d760a872cb02a14c5668b1b2c745c9a" exitCode=2 Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.315313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"5c398695-7b05-43bd-b76f-68e6a9e85605","Type":"ContainerDied","Data":"0c04d25d07c9c7cb76b3dc5a59950a742d760a872cb02a14c5668b1b2c745c9a"} Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.460471 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.621000 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjn6n\" (UniqueName: \"kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n\") pod \"5c398695-7b05-43bd-b76f-68e6a9e85605\" (UID: \"5c398695-7b05-43bd-b76f-68e6a9e85605\") " Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.626300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n" (OuterVolumeSpecName: "kube-api-access-mjn6n") pod "5c398695-7b05-43bd-b76f-68e6a9e85605" (UID: "5c398695-7b05-43bd-b76f-68e6a9e85605"). InnerVolumeSpecName "kube-api-access-mjn6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.723987 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjn6n\" (UniqueName: \"kubernetes.io/projected/5c398695-7b05-43bd-b76f-68e6a9e85605-kube-api-access-mjn6n\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.939075 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.939383 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-central-agent" containerID="cri-o://5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a" gracePeriod=30 Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.939499 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-notification-agent" containerID="cri-o://df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc" gracePeriod=30 Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.939523 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="sg-core" containerID="cri-o://c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701" gracePeriod=30 Jan 20 23:40:43 crc kubenswrapper[5030]: I0120 23:40:43.939562 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="proxy-httpd" containerID="cri-o://66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691" gracePeriod=30 Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.225153 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gnqzm" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="registry-server" probeResult="failure" output=< Jan 20 23:40:44 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:40:44 crc kubenswrapper[5030]: > Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.325052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"5c398695-7b05-43bd-b76f-68e6a9e85605","Type":"ContainerDied","Data":"b4fab6437f1ec783dbfc613bcf6953f2a0c8fca88aa66a2d77d00f7aeafb5525"} Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.325095 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.325111 5030 scope.go:117] "RemoveContainer" containerID="0c04d25d07c9c7cb76b3dc5a59950a742d760a872cb02a14c5668b1b2c745c9a" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.327980 5030 generic.go:334] "Generic (PLEG): container finished" podID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerID="66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691" exitCode=0 Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.328015 5030 generic.go:334] "Generic (PLEG): container finished" podID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerID="c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701" exitCode=2 Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.328041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerDied","Data":"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691"} Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.328071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerDied","Data":"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701"} Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.353471 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.361965 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.372123 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:44 crc kubenswrapper[5030]: E0120 23:40:44.372454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c398695-7b05-43bd-b76f-68e6a9e85605" containerName="kube-state-metrics" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.372471 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c398695-7b05-43bd-b76f-68e6a9e85605" containerName="kube-state-metrics" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.372682 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c398695-7b05-43bd-b76f-68e6a9e85605" containerName="kube-state-metrics" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.373498 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.375959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.377129 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.407038 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.538714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.538811 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.538859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpr9\" (UniqueName: \"kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.538883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.640243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.640331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.640371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpr9\" (UniqueName: \"kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc 
kubenswrapper[5030]: I0120 23:40:44.640394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.644823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.650689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.662692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.662841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpr9\" (UniqueName: \"kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9\") pod \"kube-state-metrics-0\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.702174 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.857096 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:44 crc kubenswrapper[5030]: I0120 23:40:44.857438 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.147242 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:40:45 crc kubenswrapper[5030]: W0120 23:40:45.149785 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86b0b2ab_44de_47c1_b0f4_6fc63d87393a.slice/crio-9b80c9ca3ace44c607a095305001f867f4fc68f08a57178da2066aab0556a1ca WatchSource:0}: Error finding container 9b80c9ca3ace44c607a095305001f867f4fc68f08a57178da2066aab0556a1ca: Status 404 returned error can't find the container with id 9b80c9ca3ace44c607a095305001f867f4fc68f08a57178da2066aab0556a1ca Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.338544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"86b0b2ab-44de-47c1-b0f4-6fc63d87393a","Type":"ContainerStarted","Data":"9b80c9ca3ace44c607a095305001f867f4fc68f08a57178da2066aab0556a1ca"} Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.341206 5030 generic.go:334] "Generic (PLEG): container finished" podID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerID="5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a" exitCode=0 Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.341249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerDied","Data":"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a"} Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.938806 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.938850 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.246:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:40:45 crc kubenswrapper[5030]: I0120 23:40:45.973924 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c398695-7b05-43bd-b76f-68e6a9e85605" path="/var/lib/kubelet/pods/5c398695-7b05-43bd-b76f-68e6a9e85605/volumes" Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.352407 5030 generic.go:334] "Generic (PLEG): container finished" podID="1b22b9f9-3ec2-44dd-9284-574e90be8eae" containerID="2b3263321b820b32f721d15d4ce47b9ccdf9ebd90f26eb5bf80c361394a902e3" exitCode=0 Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.352677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" 
event={"ID":"1b22b9f9-3ec2-44dd-9284-574e90be8eae","Type":"ContainerDied","Data":"2b3263321b820b32f721d15d4ce47b9ccdf9ebd90f26eb5bf80c361394a902e3"} Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.354882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"86b0b2ab-44de-47c1-b0f4-6fc63d87393a","Type":"ContainerStarted","Data":"deeb6b4d1668a9582b4c378c594679c4235c09d76d1e2e4ec46aa1ca229900d6"} Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.355037 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.391950 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.038703089 podStartE2EDuration="2.391930039s" podCreationTimestamp="2026-01-20 23:40:44 +0000 UTC" firstStartedPulling="2026-01-20 23:40:45.152362974 +0000 UTC m=+3917.472623262" lastFinishedPulling="2026-01-20 23:40:45.505589924 +0000 UTC m=+3917.825850212" observedRunningTime="2026-01-20 23:40:46.382569133 +0000 UTC m=+3918.702829421" watchObservedRunningTime="2026-01-20 23:40:46.391930039 +0000 UTC m=+3918.712190327" Jan 20 23:40:46 crc kubenswrapper[5030]: I0120 23:40:46.964690 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.002590 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.424788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.730698 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.802418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts\") pod \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.802485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle\") pod \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.802545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data\") pod \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.802647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz9pj\" (UniqueName: \"kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj\") pod \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\" (UID: \"1b22b9f9-3ec2-44dd-9284-574e90be8eae\") " Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.811074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj" (OuterVolumeSpecName: "kube-api-access-tz9pj") pod "1b22b9f9-3ec2-44dd-9284-574e90be8eae" (UID: "1b22b9f9-3ec2-44dd-9284-574e90be8eae"). InnerVolumeSpecName "kube-api-access-tz9pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.811172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts" (OuterVolumeSpecName: "scripts") pod "1b22b9f9-3ec2-44dd-9284-574e90be8eae" (UID: "1b22b9f9-3ec2-44dd-9284-574e90be8eae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.830747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data" (OuterVolumeSpecName: "config-data") pod "1b22b9f9-3ec2-44dd-9284-574e90be8eae" (UID: "1b22b9f9-3ec2-44dd-9284-574e90be8eae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.842909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b22b9f9-3ec2-44dd-9284-574e90be8eae" (UID: "1b22b9f9-3ec2-44dd-9284-574e90be8eae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.904368 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz9pj\" (UniqueName: \"kubernetes.io/projected/1b22b9f9-3ec2-44dd-9284-574e90be8eae-kube-api-access-tz9pj\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.904401 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.904415 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:47 crc kubenswrapper[5030]: I0120 23:40:47.904428 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b22b9f9-3ec2-44dd-9284-574e90be8eae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.377465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" event={"ID":"1b22b9f9-3ec2-44dd-9284-574e90be8eae","Type":"ContainerDied","Data":"8e8603acd9584307c967f71b952e52d318c419df0bcc391e472d209fdc84406c"} Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.377899 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e8603acd9584307c967f71b952e52d318c419df0bcc391e472d209fdc84406c" Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.377559 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb" Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.581959 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.599727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.600054 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-log" containerID="cri-o://cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8" gracePeriod=30 Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.600147 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-api" containerID="cri-o://42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a" gracePeriod=30 Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.677802 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.678246 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-log" containerID="cri-o://cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae" gracePeriod=30 Jan 20 23:40:48 crc kubenswrapper[5030]: I0120 23:40:48.678364 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-metadata" containerID="cri-o://06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396" gracePeriod=30 Jan 20 23:40:49 crc kubenswrapper[5030]: I0120 23:40:49.391443 5030 generic.go:334] "Generic (PLEG): container finished" podID="15e6530f-e723-47f8-ae17-641287a0b712" containerID="cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8" exitCode=143 Jan 20 23:40:49 crc kubenswrapper[5030]: I0120 23:40:49.391496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerDied","Data":"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8"} Jan 20 23:40:49 crc kubenswrapper[5030]: I0120 23:40:49.396579 5030 generic.go:334] "Generic (PLEG): container finished" podID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerID="cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae" exitCode=143 Jan 20 23:40:49 crc kubenswrapper[5030]: I0120 23:40:49.396666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerDied","Data":"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae"} Jan 20 23:40:49 crc kubenswrapper[5030]: I0120 23:40:49.396928 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerName="nova-scheduler-scheduler" containerID="cri-o://70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" gracePeriod=30 Jan 20 23:40:51 crc kubenswrapper[5030]: 
E0120 23:40:51.967947 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:51 crc kubenswrapper[5030]: E0120 23:40:51.970985 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:51 crc kubenswrapper[5030]: E0120 23:40:51.972972 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:40:51 crc kubenswrapper[5030]: E0120 23:40:51.973005 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerName="nova-scheduler-scheduler" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.270821 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.295637 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.296272 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.395974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data\") pod \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gp6v\" (UniqueName: \"kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v\") pod \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396059 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data\") pod \"15e6530f-e723-47f8-ae17-641287a0b712\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs\") pod \"15e6530f-e723-47f8-ae17-641287a0b712\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle\") pod \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396152 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t4wp\" (UniqueName: \"kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: 
\"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlf72\" (UniqueName: \"kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72\") pod \"15e6530f-e723-47f8-ae17-641287a0b712\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd\") pod \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\" (UID: \"233cb4a4-24c7-4596-adc1-d7e4a47af70a\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs\") pod \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle\") pod \"15e6530f-e723-47f8-ae17-641287a0b712\" (UID: \"15e6530f-e723-47f8-ae17-641287a0b712\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396412 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs\") pod \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\" (UID: \"33eeca5d-0bab-4e41-9b66-6f3f971e2137\") " Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.396650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs" (OuterVolumeSpecName: "logs") pod "15e6530f-e723-47f8-ae17-641287a0b712" (UID: "15e6530f-e723-47f8-ae17-641287a0b712"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.397059 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15e6530f-e723-47f8-ae17-641287a0b712-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.397258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.397527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs" (OuterVolumeSpecName: "logs") pod "33eeca5d-0bab-4e41-9b66-6f3f971e2137" (UID: "33eeca5d-0bab-4e41-9b66-6f3f971e2137"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.397736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.402491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72" (OuterVolumeSpecName: "kube-api-access-wlf72") pod "15e6530f-e723-47f8-ae17-641287a0b712" (UID: "15e6530f-e723-47f8-ae17-641287a0b712"). InnerVolumeSpecName "kube-api-access-wlf72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.403134 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp" (OuterVolumeSpecName: "kube-api-access-5t4wp") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "kube-api-access-5t4wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.403164 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts" (OuterVolumeSpecName: "scripts") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.403190 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v" (OuterVolumeSpecName: "kube-api-access-2gp6v") pod "33eeca5d-0bab-4e41-9b66-6f3f971e2137" (UID: "33eeca5d-0bab-4e41-9b66-6f3f971e2137"). InnerVolumeSpecName "kube-api-access-2gp6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.422046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data" (OuterVolumeSpecName: "config-data") pod "15e6530f-e723-47f8-ae17-641287a0b712" (UID: "15e6530f-e723-47f8-ae17-641287a0b712"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.427672 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33eeca5d-0bab-4e41-9b66-6f3f971e2137" (UID: "33eeca5d-0bab-4e41-9b66-6f3f971e2137"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.429203 5030 generic.go:334] "Generic (PLEG): container finished" podID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerID="df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc" exitCode=0 Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.429273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerDied","Data":"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.429300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"233cb4a4-24c7-4596-adc1-d7e4a47af70a","Type":"ContainerDied","Data":"e5976501a30164b4d6e9d409b64b0ac22ffdf9d2149cfc5d8660d59e96ee35f3"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.429317 5030 scope.go:117] "RemoveContainer" containerID="66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.429459 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.431575 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data" (OuterVolumeSpecName: "config-data") pod "33eeca5d-0bab-4e41-9b66-6f3f971e2137" (UID: "33eeca5d-0bab-4e41-9b66-6f3f971e2137"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.432591 5030 generic.go:334] "Generic (PLEG): container finished" podID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerID="06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396" exitCode=0 Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.432700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerDied","Data":"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.432736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"33eeca5d-0bab-4e41-9b66-6f3f971e2137","Type":"ContainerDied","Data":"1cfc5348677079fa5737785b4c99791d9c5996c16ad539b7da54b141574f94df"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.432739 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.435064 5030 generic.go:334] "Generic (PLEG): container finished" podID="15e6530f-e723-47f8-ae17-641287a0b712" containerID="42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a" exitCode=0 Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.435106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerDied","Data":"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.435133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"15e6530f-e723-47f8-ae17-641287a0b712","Type":"ContainerDied","Data":"db811f059ed4c029c9ae09b56ad58e0958c4ba2c8933cb84e9212860dc29bd07"} Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.435198 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.441870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15e6530f-e723-47f8-ae17-641287a0b712" (UID: "15e6530f-e723-47f8-ae17-641287a0b712"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.451035 5030 scope.go:117] "RemoveContainer" containerID="c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.452471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.467658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "33eeca5d-0bab-4e41-9b66-6f3f971e2137" (UID: "33eeca5d-0bab-4e41-9b66-6f3f971e2137"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.468296 5030 scope.go:117] "RemoveContainer" containerID="df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498449 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498480 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498493 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlf72\" (UniqueName: \"kubernetes.io/projected/15e6530f-e723-47f8-ae17-641287a0b712-kube-api-access-wlf72\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498503 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/233cb4a4-24c7-4596-adc1-d7e4a47af70a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498512 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33eeca5d-0bab-4e41-9b66-6f3f971e2137-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498522 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498531 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498539 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498548 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gp6v\" (UniqueName: \"kubernetes.io/projected/33eeca5d-0bab-4e41-9b66-6f3f971e2137-kube-api-access-2gp6v\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498556 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15e6530f-e723-47f8-ae17-641287a0b712-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498564 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498572 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33eeca5d-0bab-4e41-9b66-6f3f971e2137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.498581 5030 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-5t4wp\" (UniqueName: \"kubernetes.io/projected/233cb4a4-24c7-4596-adc1-d7e4a47af70a-kube-api-access-5t4wp\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.503281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.503393 5030 scope.go:117] "RemoveContainer" containerID="5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.520114 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data" (OuterVolumeSpecName: "config-data") pod "233cb4a4-24c7-4596-adc1-d7e4a47af70a" (UID: "233cb4a4-24c7-4596-adc1-d7e4a47af70a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.525553 5030 scope.go:117] "RemoveContainer" containerID="66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.526062 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691\": container with ID starting with 66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691 not found: ID does not exist" containerID="66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.526096 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691"} err="failed to get container status \"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691\": rpc error: code = NotFound desc = could not find container \"66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691\": container with ID starting with 66324d1d1778ace59097b423d525af9f147a69b5b819a89d2eea7d96a40cd691 not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.526116 5030 scope.go:117] "RemoveContainer" containerID="c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.526587 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701\": container with ID starting with c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701 not found: ID does not exist" containerID="c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.526643 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701"} err="failed to get container status \"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701\": rpc error: code = NotFound desc = could not find container 
\"c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701\": container with ID starting with c366ff4fd041feeec337d801d372b0ca541ed02aabcfd79af2451aeba162f701 not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.526687 5030 scope.go:117] "RemoveContainer" containerID="df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.527382 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc\": container with ID starting with df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc not found: ID does not exist" containerID="df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.527414 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc"} err="failed to get container status \"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc\": rpc error: code = NotFound desc = could not find container \"df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc\": container with ID starting with df202092794809bbbe9ab204cac9e83d63c6db3159fa98a884f9820a3b1660cc not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.527428 5030 scope.go:117] "RemoveContainer" containerID="5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.527703 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a\": container with ID starting with 5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a not found: ID does not exist" containerID="5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.527730 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a"} err="failed to get container status \"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a\": rpc error: code = NotFound desc = could not find container \"5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a\": container with ID starting with 5e608a958181817401ce6598938660abbec2a131569c53fc8ca60929d54cae4a not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.527752 5030 scope.go:117] "RemoveContainer" containerID="06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.548276 5030 scope.go:117] "RemoveContainer" containerID="cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.571602 5030 scope.go:117] "RemoveContainer" containerID="06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.588117 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396\": container with ID starting with 
06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396 not found: ID does not exist" containerID="06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.588167 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396"} err="failed to get container status \"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396\": rpc error: code = NotFound desc = could not find container \"06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396\": container with ID starting with 06db0b8bc2516db9d0f2346b1302e874bfbfda14e72e2c94c2199df5f7ee7396 not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.588198 5030 scope.go:117] "RemoveContainer" containerID="cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.588638 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae\": container with ID starting with cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae not found: ID does not exist" containerID="cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.588690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae"} err="failed to get container status \"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae\": rpc error: code = NotFound desc = could not find container \"cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae\": container with ID starting with cdd1cc95c836d4a467573154276e8d1779cea5282fce06e7389075fbf9827bae not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.588721 5030 scope.go:117] "RemoveContainer" containerID="42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.605767 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.605801 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233cb4a4-24c7-4596-adc1-d7e4a47af70a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.634786 5030 scope.go:117] "RemoveContainer" containerID="cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.675221 5030 scope.go:117] "RemoveContainer" containerID="42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.675954 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a\": container with ID starting with 42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a not found: ID does not exist" containerID="42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a" Jan 20 23:40:52 
crc kubenswrapper[5030]: I0120 23:40:52.675983 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a"} err="failed to get container status \"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a\": rpc error: code = NotFound desc = could not find container \"42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a\": container with ID starting with 42e96a1872189fd9e726e722d011a3c37b6249a412360c2ddb77fc01afe93a3a not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.676003 5030 scope.go:117] "RemoveContainer" containerID="cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.684746 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8\": container with ID starting with cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8 not found: ID does not exist" containerID="cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.684789 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8"} err="failed to get container status \"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8\": rpc error: code = NotFound desc = could not find container \"cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8\": container with ID starting with cac952bbce80d00c67d7f64ce36ba24a76a45381100b955c9d89dc9da594ecc8 not found: ID does not exist" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.790763 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.797471 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.805970 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.814259 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.821821 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822232 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-central-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822247 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-central-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822263 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b22b9f9-3ec2-44dd-9284-574e90be8eae" containerName="nova-manage" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822270 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b22b9f9-3ec2-44dd-9284-574e90be8eae" containerName="nova-manage" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822285 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-metadata" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822290 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-metadata" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822300 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-api" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822307 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-api" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="proxy-httpd" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822324 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="proxy-httpd" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-notification-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822343 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-notification-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822358 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-log" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822364 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-log" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-log" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822386 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-log" Jan 20 23:40:52 crc kubenswrapper[5030]: E0120 23:40:52.822399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="sg-core" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822404 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="sg-core" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822566 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-notification-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822580 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="proxy-httpd" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822590 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-metadata" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822598 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" containerName="nova-metadata-log" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 
23:40:52.822607 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b22b9f9-3ec2-44dd-9284-574e90be8eae" containerName="nova-manage" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822657 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="sg-core" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822667 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-api" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822678 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e6530f-e723-47f8-ae17-641287a0b712" containerName="nova-api-log" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.822687 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" containerName="ceilometer-central-agent" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.824257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.829076 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.829239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.829422 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.830054 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.837963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.871325 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.885564 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.887015 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.889278 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.889336 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.898027 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.899529 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.901988 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcmxc\" (UniqueName: \"kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 23:40:52.911382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:52 crc kubenswrapper[5030]: I0120 
23:40:52.923611 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013008 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcmxc\" (UniqueName: \"kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013063 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7mm\" (UniqueName: \"kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013289 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkh74\" (UniqueName: \"kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" 
Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013390 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.013823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.014099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.017895 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.018024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.019314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.019593 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.019608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.029543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcmxc\" (UniqueName: \"kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc\") pod \"ceilometer-0\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.115804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7mm\" (UniqueName: \"kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116447 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkh74\" (UniqueName: \"kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.116864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.117454 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.120539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.121447 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.123845 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.123890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.125852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.130196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.132939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7mm\" (UniqueName: \"kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm\") pod \"nova-metadata-0\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.156816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkh74\" (UniqueName: \"kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74\") pod \"nova-api-0\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.227951 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.235876 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.242405 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.257571 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.307903 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.497332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.744871 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:40:53 crc kubenswrapper[5030]: W0120 23:40:53.750899 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9859f1cb_945d_4148_bf8d_b844c578dc40.slice/crio-a6e9cbb80f1774d834d11028c4aea8d6d569dd7e3f620ba8dd540003aa93562e WatchSource:0}: Error finding container a6e9cbb80f1774d834d11028c4aea8d6d569dd7e3f620ba8dd540003aa93562e: Status 404 returned error can't find the container with id a6e9cbb80f1774d834d11028c4aea8d6d569dd7e3f620ba8dd540003aa93562e Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.755152 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:40:53 crc kubenswrapper[5030]: W0120 23:40:53.755781 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85f32108_760c_4027_8c6d_d1b761a3305d.slice/crio-9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7 WatchSource:0}: Error finding container 9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7: Status 404 returned error can't find the container with id 9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7 Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.824041 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:40:53 crc kubenswrapper[5030]: W0120 23:40:53.864384 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca2269e_2446_4442_bdf5_028127190491.slice/crio-c4beb39ee364d1b83043130b0a06b38772dea459e12dd5598a53cdf7793bfafc WatchSource:0}: Error finding container c4beb39ee364d1b83043130b0a06b38772dea459e12dd5598a53cdf7793bfafc: Status 404 returned error can't find the container with id c4beb39ee364d1b83043130b0a06b38772dea459e12dd5598a53cdf7793bfafc Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.975431 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e6530f-e723-47f8-ae17-641287a0b712" path="/var/lib/kubelet/pods/15e6530f-e723-47f8-ae17-641287a0b712/volumes" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.976210 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233cb4a4-24c7-4596-adc1-d7e4a47af70a" path="/var/lib/kubelet/pods/233cb4a4-24c7-4596-adc1-d7e4a47af70a/volumes" Jan 20 23:40:53 crc kubenswrapper[5030]: I0120 23:40:53.977350 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33eeca5d-0bab-4e41-9b66-6f3f971e2137" path="/var/lib/kubelet/pods/33eeca5d-0bab-4e41-9b66-6f3f971e2137/volumes" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.217475 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.352122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data\") pod \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.352235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle\") pod \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.352482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldmnp\" (UniqueName: \"kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp\") pod \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\" (UID: \"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.356404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp" (OuterVolumeSpecName: "kube-api-access-ldmnp") pod "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" (UID: "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615"). InnerVolumeSpecName "kube-api-access-ldmnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.383202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" (UID: "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.383599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data" (OuterVolumeSpecName: "config-data") pod "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" (UID: "c96eb84e-9bf3-49a7-a7b6-a5ae9948f615"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.461649 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldmnp\" (UniqueName: \"kubernetes.io/projected/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-kube-api-access-ldmnp\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.461964 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.462061 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.475922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerStarted","Data":"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.476019 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerStarted","Data":"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.476045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerStarted","Data":"c4beb39ee364d1b83043130b0a06b38772dea459e12dd5598a53cdf7793bfafc"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.479388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerStarted","Data":"a6e9cbb80f1774d834d11028c4aea8d6d569dd7e3f620ba8dd540003aa93562e"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.483747 5030 generic.go:334] "Generic (PLEG): container finished" podID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" exitCode=0 Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.483835 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.483849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615","Type":"ContainerDied","Data":"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.487256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c96eb84e-9bf3-49a7-a7b6-a5ae9948f615","Type":"ContainerDied","Data":"8dbb7fafcf2f3d8860b7f2395e39307d9c73dbda0bf4865976b27f6d40dc4b24"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.487480 5030 scope.go:117] "RemoveContainer" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.496326 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.496310423 podStartE2EDuration="2.496310423s" podCreationTimestamp="2026-01-20 23:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:54.489044057 +0000 UTC m=+3926.809304375" watchObservedRunningTime="2026-01-20 23:40:54.496310423 +0000 UTC m=+3926.816570711" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.497377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerStarted","Data":"a73c97b11afcbc04aed83e51bf767bd0510faa92f7acfad1d7f35ffe02c51dc2"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.497418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerStarted","Data":"c482b68ed01591be35af0242d8124f86faa81f2474d8a0b10508f5b0e77c62dc"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.497439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerStarted","Data":"9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7"} Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.497486 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gnqzm" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="registry-server" containerID="cri-o://342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1" gracePeriod=2 Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.530813 5030 scope.go:117] "RemoveContainer" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" Jan 20 23:40:54 crc kubenswrapper[5030]: E0120 23:40:54.550893 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e\": container with ID starting with 70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e not found: ID does not exist" containerID="70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.550933 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e"} err="failed to get container status \"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e\": rpc error: code = NotFound desc = could not find container \"70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e\": container with ID starting with 70e4c5079eb2e7b7ec15460170c995858aa6fee96c3a5ccfeedbe72c8ee0fb3e not found: ID does not exist" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.560496 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.5604797870000002 podStartE2EDuration="2.560479787s" podCreationTimestamp="2026-01-20 23:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:54.526236077 +0000 UTC m=+3926.846496375" watchObservedRunningTime="2026-01-20 23:40:54.560479787 +0000 UTC m=+3926.880740065" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.567670 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.579802 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.595685 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:54 crc kubenswrapper[5030]: E0120 23:40:54.596146 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerName="nova-scheduler-scheduler" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.596163 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerName="nova-scheduler-scheduler" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.596355 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" containerName="nova-scheduler-scheduler" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.597016 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.600936 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.619696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.664903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.665150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxlk\" (UniqueName: \"kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.665227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.712935 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.768167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.768612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.768655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxlk\" (UniqueName: \"kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.776108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.776214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.791553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxlk\" (UniqueName: \"kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk\") pod \"nova-scheduler-0\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.848864 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.938157 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.975376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content\") pod \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.975457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh89z\" (UniqueName: \"kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z\") pod \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.975610 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities\") pod \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\" (UID: \"ccc0c21b-efe9-4477-97f2-9cea8761b1c4\") " Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.976639 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities" (OuterVolumeSpecName: "utilities") pod "ccc0c21b-efe9-4477-97f2-9cea8761b1c4" (UID: "ccc0c21b-efe9-4477-97f2-9cea8761b1c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:54 crc kubenswrapper[5030]: I0120 23:40:54.986910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z" (OuterVolumeSpecName: "kube-api-access-fh89z") pod "ccc0c21b-efe9-4477-97f2-9cea8761b1c4" (UID: "ccc0c21b-efe9-4477-97f2-9cea8761b1c4"). InnerVolumeSpecName "kube-api-access-fh89z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.079177 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh89z\" (UniqueName: \"kubernetes.io/projected/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-kube-api-access-fh89z\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.079214 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.097858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccc0c21b-efe9-4477-97f2-9cea8761b1c4" (UID: "ccc0c21b-efe9-4477-97f2-9cea8761b1c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.181232 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccc0c21b-efe9-4477-97f2-9cea8761b1c4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.367964 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:40:55 crc kubenswrapper[5030]: W0120 23:40:55.368766 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737f957c_6c28_4e40_ac90_83e5cc719f07.slice/crio-cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646 WatchSource:0}: Error finding container cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646: Status 404 returned error can't find the container with id cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646 Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.507711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"737f957c-6c28-4e40-ac90-83e5cc719f07","Type":"ContainerStarted","Data":"cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646"} Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.510507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerStarted","Data":"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67"} Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.510641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerStarted","Data":"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9"} Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.513472 5030 generic.go:334] "Generic (PLEG): container finished" podID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerID="342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1" exitCode=0 Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.513554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerDied","Data":"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1"} Jan 20 23:40:55 crc 
kubenswrapper[5030]: I0120 23:40:55.513606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnqzm" event={"ID":"ccc0c21b-efe9-4477-97f2-9cea8761b1c4","Type":"ContainerDied","Data":"43cd6744eb9072a804f5ebe18b7992defcc4df9648f462967cc02e491454fc6b"} Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.513664 5030 scope.go:117] "RemoveContainer" containerID="342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.513741 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnqzm" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.552549 5030 scope.go:117] "RemoveContainer" containerID="e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.554928 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.566261 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gnqzm"] Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.576120 5030 scope.go:117] "RemoveContainer" containerID="ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.594066 5030 scope.go:117] "RemoveContainer" containerID="342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1" Jan 20 23:40:55 crc kubenswrapper[5030]: E0120 23:40:55.594559 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1\": container with ID starting with 342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1 not found: ID does not exist" containerID="342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.594593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1"} err="failed to get container status \"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1\": rpc error: code = NotFound desc = could not find container \"342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1\": container with ID starting with 342d7f351f54f295cbacd23a1ceaa052078c4c9218337b09d7c16553990360a1 not found: ID does not exist" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.594633 5030 scope.go:117] "RemoveContainer" containerID="e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e" Jan 20 23:40:55 crc kubenswrapper[5030]: E0120 23:40:55.594941 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e\": container with ID starting with e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e not found: ID does not exist" containerID="e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.594970 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e"} err="failed to get container status 
\"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e\": rpc error: code = NotFound desc = could not find container \"e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e\": container with ID starting with e6f8f0c0745e9447e00a4388063eb4d520dd81e092d40dfef84375bcc7d1f89e not found: ID does not exist" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.594989 5030 scope.go:117] "RemoveContainer" containerID="ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24" Jan 20 23:40:55 crc kubenswrapper[5030]: E0120 23:40:55.595267 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24\": container with ID starting with ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24 not found: ID does not exist" containerID="ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.595294 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24"} err="failed to get container status \"ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24\": rpc error: code = NotFound desc = could not find container \"ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24\": container with ID starting with ae6c9fa62be8b01af628d91641918743661189ab64511131ded1964e41d4fa24 not found: ID does not exist" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.976098 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96eb84e-9bf3-49a7-a7b6-a5ae9948f615" path="/var/lib/kubelet/pods/c96eb84e-9bf3-49a7-a7b6-a5ae9948f615/volumes" Jan 20 23:40:55 crc kubenswrapper[5030]: I0120 23:40:55.977603 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" path="/var/lib/kubelet/pods/ccc0c21b-efe9-4477-97f2-9cea8761b1c4/volumes" Jan 20 23:40:56 crc kubenswrapper[5030]: I0120 23:40:56.530990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"737f957c-6c28-4e40-ac90-83e5cc719f07","Type":"ContainerStarted","Data":"1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d"} Jan 20 23:40:56 crc kubenswrapper[5030]: I0120 23:40:56.533020 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerStarted","Data":"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf"} Jan 20 23:40:56 crc kubenswrapper[5030]: I0120 23:40:56.550508 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.550484766 podStartE2EDuration="2.550484766s" podCreationTimestamp="2026-01-20 23:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:40:56.543681691 +0000 UTC m=+3928.863941989" watchObservedRunningTime="2026-01-20 23:40:56.550484766 +0000 UTC m=+3928.870745054" Jan 20 23:40:56 crc kubenswrapper[5030]: I0120 23:40:56.962154 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:40:56 crc kubenswrapper[5030]: E0120 23:40:56.962610 5030 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:40:58 crc kubenswrapper[5030]: I0120 23:40:58.236843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:58 crc kubenswrapper[5030]: I0120 23:40:58.237700 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:40:58 crc kubenswrapper[5030]: I0120 23:40:58.556772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerStarted","Data":"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab"} Jan 20 23:40:58 crc kubenswrapper[5030]: I0120 23:40:58.556836 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:40:58 crc kubenswrapper[5030]: I0120 23:40:58.593562 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.308349494 podStartE2EDuration="6.59354622s" podCreationTimestamp="2026-01-20 23:40:52 +0000 UTC" firstStartedPulling="2026-01-20 23:40:53.754085417 +0000 UTC m=+3926.074345705" lastFinishedPulling="2026-01-20 23:40:58.039282123 +0000 UTC m=+3930.359542431" observedRunningTime="2026-01-20 23:40:58.584242144 +0000 UTC m=+3930.904502432" watchObservedRunningTime="2026-01-20 23:40:58.59354622 +0000 UTC m=+3930.913806508" Jan 20 23:40:59 crc kubenswrapper[5030]: I0120 23:40:59.938944 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:41:03 crc kubenswrapper[5030]: I0120 23:41:03.236368 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:03 crc kubenswrapper[5030]: I0120 23:41:03.236868 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:03 crc kubenswrapper[5030]: I0120 23:41:03.242799 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:03 crc kubenswrapper[5030]: I0120 23:41:03.243436 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.334845 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.334913 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.334829 5030 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.334879 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.938677 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:41:04 crc kubenswrapper[5030]: I0120 23:41:04.990543 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:41:05 crc kubenswrapper[5030]: I0120 23:41:05.686191 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:41:09 crc kubenswrapper[5030]: I0120 23:41:09.781237 5030 scope.go:117] "RemoveContainer" containerID="a3cc6fa9860339a777b502e8d7cf94ab748c9d05c0ad7323a3d5d25264be03d7" Jan 20 23:41:09 crc kubenswrapper[5030]: I0120 23:41:09.809514 5030 scope.go:117] "RemoveContainer" containerID="0ab99d99f78a27c31a39fe65464563e8a92791c72ac677499b6836093b92f3ef" Jan 20 23:41:09 crc kubenswrapper[5030]: I0120 23:41:09.848093 5030 scope.go:117] "RemoveContainer" containerID="a12e77d1b8be494229f4bdef24c5a9771212355912d130fbc08c36e1abf41aa5" Jan 20 23:41:09 crc kubenswrapper[5030]: I0120 23:41:09.873755 5030 scope.go:117] "RemoveContainer" containerID="9237c481c7aa7834b8c0efe9a0c5fd9901bf2d2eb2b5d3f97cca7cc8a063c842" Jan 20 23:41:09 crc kubenswrapper[5030]: I0120 23:41:09.962881 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:41:09 crc kubenswrapper[5030]: E0120 23:41:09.963204 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.246682 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.247843 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.247961 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.248669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.249419 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:13 crc 
kubenswrapper[5030]: I0120 23:41:13.254250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.257926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.258139 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.728881 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:13 crc kubenswrapper[5030]: I0120 23:41:13.731490 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.475590 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.476094 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-central-agent" containerID="cri-o://3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9" gracePeriod=30 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.476205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-notification-agent" containerID="cri-o://b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67" gracePeriod=30 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.476190 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="sg-core" containerID="cri-o://6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf" gracePeriod=30 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.476251 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="proxy-httpd" containerID="cri-o://0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab" gracePeriod=30 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.488291 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.250:3000/\": EOF" Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.608551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.756467 5030 generic.go:334] "Generic (PLEG): container finished" podID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerID="0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab" exitCode=0 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.756499 5030 generic.go:334] "Generic (PLEG): container finished" podID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerID="6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf" exitCode=2 Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.757337 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerDied","Data":"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab"} Jan 20 23:41:15 crc kubenswrapper[5030]: I0120 23:41:15.757371 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerDied","Data":"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf"} Jan 20 23:41:16 crc kubenswrapper[5030]: I0120 23:41:16.769133 5030 generic.go:334] "Generic (PLEG): container finished" podID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerID="3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9" exitCode=0 Jan 20 23:41:16 crc kubenswrapper[5030]: I0120 23:41:16.769213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerDied","Data":"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9"} Jan 20 23:41:16 crc kubenswrapper[5030]: I0120 23:41:16.769600 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-log" containerID="cri-o://1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35" gracePeriod=30 Jan 20 23:41:16 crc kubenswrapper[5030]: I0120 23:41:16.769694 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-api" containerID="cri-o://7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d" gracePeriod=30 Jan 20 23:41:17 crc kubenswrapper[5030]: I0120 23:41:17.778329 5030 generic.go:334] "Generic (PLEG): container finished" podID="9ca2269e-2446-4442-bdf5-028127190491" containerID="1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35" exitCode=143 Jan 20 23:41:17 crc kubenswrapper[5030]: I0120 23:41:17.778374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerDied","Data":"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35"} Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.474146 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.607755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data\") pod \"9ca2269e-2446-4442-bdf5-028127190491\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.608062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkh74\" (UniqueName: \"kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74\") pod \"9ca2269e-2446-4442-bdf5-028127190491\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.608292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle\") pod \"9ca2269e-2446-4442-bdf5-028127190491\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.608332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs\") pod \"9ca2269e-2446-4442-bdf5-028127190491\" (UID: \"9ca2269e-2446-4442-bdf5-028127190491\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.609841 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs" (OuterVolumeSpecName: "logs") pod "9ca2269e-2446-4442-bdf5-028127190491" (UID: "9ca2269e-2446-4442-bdf5-028127190491"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.619320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74" (OuterVolumeSpecName: "kube-api-access-nkh74") pod "9ca2269e-2446-4442-bdf5-028127190491" (UID: "9ca2269e-2446-4442-bdf5-028127190491"). InnerVolumeSpecName "kube-api-access-nkh74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.644037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ca2269e-2446-4442-bdf5-028127190491" (UID: "9ca2269e-2446-4442-bdf5-028127190491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.647579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data" (OuterVolumeSpecName: "config-data") pod "9ca2269e-2446-4442-bdf5-028127190491" (UID: "9ca2269e-2446-4442-bdf5-028127190491"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.710405 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.710439 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ca2269e-2446-4442-bdf5-028127190491-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.710450 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ca2269e-2446-4442-bdf5-028127190491-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.710461 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkh74\" (UniqueName: \"kubernetes.io/projected/9ca2269e-2446-4442-bdf5-028127190491-kube-api-access-nkh74\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.729999 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.810822 5030 generic.go:334] "Generic (PLEG): container finished" podID="9ca2269e-2446-4442-bdf5-028127190491" containerID="7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d" exitCode=0 Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.810883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerDied","Data":"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d"} Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.810910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"9ca2269e-2446-4442-bdf5-028127190491","Type":"ContainerDied","Data":"c4beb39ee364d1b83043130b0a06b38772dea459e12dd5598a53cdf7793bfafc"} Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.810928 5030 scope.go:117] "RemoveContainer" containerID="7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.811042 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.816326 5030 generic.go:334] "Generic (PLEG): container finished" podID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerID="b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67" exitCode=0 Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.816373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerDied","Data":"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67"} Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.816399 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.816503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9859f1cb-945d-4148-bf8d-b844c578dc40","Type":"ContainerDied","Data":"a6e9cbb80f1774d834d11028c4aea8d6d569dd7e3f620ba8dd540003aa93562e"} Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.840250 5030 scope.go:117] "RemoveContainer" containerID="1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.853399 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.863856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.875955 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876410 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="registry-server" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876429 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="registry-server" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876446 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-log" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876453 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-log" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876464 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="extract-content" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="extract-content" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876487 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-notification-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-notification-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876510 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="sg-core" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876517 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="sg-core" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876535 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-central-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876544 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-central-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876553 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="proxy-httpd" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876560 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="proxy-httpd" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876572 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="extract-utilities" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876580 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="extract-utilities" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.876597 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-api" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876607 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-api" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876924 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-notification-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876945 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc0c21b-efe9-4477-97f2-9cea8761b1c4" containerName="registry-server" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="ceilometer-central-agent" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876964 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-log" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876975 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ca2269e-2446-4442-bdf5-028127190491" containerName="nova-api-api" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876981 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="sg-core" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.876990 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" containerName="proxy-httpd" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.878042 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.878646 5030 scope.go:117] "RemoveContainer" containerID="7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.879087 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d\": container with ID starting with 7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d not found: ID does not exist" containerID="7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.879119 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d"} err="failed to get container status \"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d\": rpc error: code = NotFound desc = could not find container \"7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d\": container with ID starting with 7c0a9b98904b5433f5b3ed457012a29e457f31972bad59a0ea558c2b4a3d791d not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.879140 5030 scope.go:117] "RemoveContainer" containerID="1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.879925 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35\": container with ID starting with 1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35 not found: ID does not exist" containerID="1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.879950 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35"} err="failed to get container status \"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35\": rpc error: code = NotFound desc = could not find container \"1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35\": container with ID starting with 1aa47ae18432cbe78acc6d97102ad0cee058c20159ede4b349fe58b581317b35 not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.879964 5030 scope.go:117] "RemoveContainer" containerID="0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.881588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.881845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.882571 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.884595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcmxc\" (UniqueName: \"kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913871 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.913982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.914005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data\") pod \"9859f1cb-945d-4148-bf8d-b844c578dc40\" (UID: \"9859f1cb-945d-4148-bf8d-b844c578dc40\") " Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.914581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.914929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.915300 5030 scope.go:117] "RemoveContainer" containerID="6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.915972 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.916019 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9859f1cb-945d-4148-bf8d-b844c578dc40-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.918787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts" (OuterVolumeSpecName: "scripts") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.918890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc" (OuterVolumeSpecName: "kube-api-access-lcmxc") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "kube-api-access-lcmxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.939300 5030 scope.go:117] "RemoveContainer" containerID="b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.943778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.965971 5030 scope.go:117] "RemoveContainer" containerID="3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.968009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.981722 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ca2269e_2446_4442_bdf5_028127190491.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.983554 5030 scope.go:117] "RemoveContainer" containerID="0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.984264 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab\": container with ID starting with 0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab not found: ID does not exist" containerID="0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.984315 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab"} err="failed to get container status \"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab\": rpc error: code = NotFound desc = could not find container \"0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab\": container with ID starting with 0932390218c2d5df8ff5263ecc58b889f027afdc7524c7d74a525668641ef3ab not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.984347 5030 scope.go:117] "RemoveContainer" containerID="6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.984946 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf\": container with ID starting with 6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf not found: ID does not exist" containerID="6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.985083 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf"} err="failed to get container status \"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf\": rpc error: code = NotFound desc = could not find container \"6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf\": container with ID starting with 6caaf2b46e5bad83dc495839017aba78b64334a97c11bf45d832c8bb2d1dfbcf not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.985177 5030 scope.go:117] "RemoveContainer" containerID="b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.985720 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67\": container with ID starting with b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67 not found: ID does not exist" containerID="b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67" Jan 20 23:41:20 crc kubenswrapper[5030]: 
I0120 23:41:20.985780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67"} err="failed to get container status \"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67\": rpc error: code = NotFound desc = could not find container \"b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67\": container with ID starting with b46e4c93e3b42c0f07e8afaa1e791c5bff3b2e8c83db02811fd2b24a59c6ab67 not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.985814 5030 scope.go:117] "RemoveContainer" containerID="3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9" Jan 20 23:41:20 crc kubenswrapper[5030]: E0120 23:41:20.986171 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9\": container with ID starting with 3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9 not found: ID does not exist" containerID="3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.986200 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9"} err="failed to get container status \"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9\": rpc error: code = NotFound desc = could not find container \"3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9\": container with ID starting with 3d0c45acd92ee1e1ad704453ae9207a5f95798eb2f33bd1d833be926e792ced9 not found: ID does not exist" Jan 20 23:41:20 crc kubenswrapper[5030]: I0120 23:41:20.994723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.014903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data" (OuterVolumeSpecName: "config-data") pod "9859f1cb-945d-4148-bf8d-b844c578dc40" (UID: "9859f1cb-945d-4148-bf8d-b844c578dc40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhk67\" (UniqueName: \"kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.017720 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018019 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018064 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcmxc\" (UniqueName: \"kubernetes.io/projected/9859f1cb-945d-4148-bf8d-b844c578dc40-kube-api-access-lcmxc\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018080 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018095 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018108 5030 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.018122 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9859f1cb-945d-4148-bf8d-b844c578dc40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhk67\" (UniqueName: \"kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.119702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.120167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.122915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.123249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data\") 
pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.127508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.127984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.138800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhk67\" (UniqueName: \"kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67\") pod \"nova-api-0\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.197384 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.225249 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.245719 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.262119 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.264452 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.267227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.267770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.269683 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.285593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.423989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.424789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scsnq\" (UniqueName: \"kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.424842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.424888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.425121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.425278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.425307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.425480 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scsnq\" (UniqueName: \"kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527517 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527540 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.527591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.531767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.531989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.532295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.532378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.532890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.542748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.543710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.556356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scsnq\" (UniqueName: \"kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq\") pod \"ceilometer-0\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.587840 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.677434 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:41:21 crc kubenswrapper[5030]: W0120 23:41:21.678476 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb47fe73_ea40_40ce_9531_57b6a5e8e7ce.slice/crio-573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e WatchSource:0}: Error finding container 573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e: Status 404 returned error can't find the container with id 573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.828399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerStarted","Data":"573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e"} Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.996071 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9859f1cb-945d-4148-bf8d-b844c578dc40" path="/var/lib/kubelet/pods/9859f1cb-945d-4148-bf8d-b844c578dc40/volumes" Jan 20 23:41:21 crc kubenswrapper[5030]: I0120 23:41:21.997730 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca2269e-2446-4442-bdf5-028127190491" path="/var/lib/kubelet/pods/9ca2269e-2446-4442-bdf5-028127190491/volumes" Jan 20 23:41:22 crc kubenswrapper[5030]: W0120 23:41:22.086666 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod152c5e6b_d331_4a98_af8d_7e4667624551.slice/crio-12bb229b9e7d349841bd6e5af687f72514c2632d9fdf950e882189d8b4c16ed2 WatchSource:0}: Error finding container 12bb229b9e7d349841bd6e5af687f72514c2632d9fdf950e882189d8b4c16ed2: Status 404 returned error can't find the container with id 12bb229b9e7d349841bd6e5af687f72514c2632d9fdf950e882189d8b4c16ed2 Jan 20 23:41:22 crc kubenswrapper[5030]: I0120 23:41:22.091202 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:41:22 crc kubenswrapper[5030]: I0120 23:41:22.839863 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerStarted","Data":"6046a846fcbf16047a808dd6d90fc3bd12599a6aad8d1c25fcb9bea953decf51"} Jan 20 23:41:22 crc kubenswrapper[5030]: I0120 23:41:22.840271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerStarted","Data":"cb2f0c3b8501093aaa0e433ff2ce6a327de9731da2f84e935851d94250d96e84"} Jan 20 23:41:22 crc kubenswrapper[5030]: I0120 23:41:22.842239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerStarted","Data":"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c"} Jan 20 23:41:22 crc kubenswrapper[5030]: I0120 23:41:22.842282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerStarted","Data":"12bb229b9e7d349841bd6e5af687f72514c2632d9fdf950e882189d8b4c16ed2"} Jan 20 23:41:22 crc 
kubenswrapper[5030]: I0120 23:41:22.868131 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.868107727 podStartE2EDuration="2.868107727s" podCreationTimestamp="2026-01-20 23:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:41:22.85623583 +0000 UTC m=+3955.176496118" watchObservedRunningTime="2026-01-20 23:41:22.868107727 +0000 UTC m=+3955.188368045" Jan 20 23:41:23 crc kubenswrapper[5030]: I0120 23:41:23.863165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerStarted","Data":"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac"} Jan 20 23:41:24 crc kubenswrapper[5030]: I0120 23:41:24.874571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerStarted","Data":"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431"} Jan 20 23:41:24 crc kubenswrapper[5030]: I0120 23:41:24.963184 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:41:24 crc kubenswrapper[5030]: E0120 23:41:24.963597 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:41:26 crc kubenswrapper[5030]: I0120 23:41:26.906866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerStarted","Data":"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4"} Jan 20 23:41:26 crc kubenswrapper[5030]: I0120 23:41:26.907233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:41:26 crc kubenswrapper[5030]: I0120 23:41:26.933238 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.135416988 podStartE2EDuration="5.933211007s" podCreationTimestamp="2026-01-20 23:41:21 +0000 UTC" firstStartedPulling="2026-01-20 23:41:22.089722906 +0000 UTC m=+3954.409983194" lastFinishedPulling="2026-01-20 23:41:25.887516915 +0000 UTC m=+3958.207777213" observedRunningTime="2026-01-20 23:41:26.926869583 +0000 UTC m=+3959.247129951" watchObservedRunningTime="2026-01-20 23:41:26.933211007 +0000 UTC m=+3959.253471335" Jan 20 23:41:31 crc kubenswrapper[5030]: I0120 23:41:31.198906 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:31 crc kubenswrapper[5030]: I0120 23:41:31.199516 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:32 crc kubenswrapper[5030]: I0120 23:41:32.213861 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-api" probeResult="failure" 
output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:32 crc kubenswrapper[5030]: I0120 23:41:32.213860 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.254:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:41:36 crc kubenswrapper[5030]: I0120 23:41:36.962417 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:41:36 crc kubenswrapper[5030]: E0120 23:41:36.962967 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:41:41 crc kubenswrapper[5030]: I0120 23:41:41.209981 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:41 crc kubenswrapper[5030]: I0120 23:41:41.211124 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:41 crc kubenswrapper[5030]: I0120 23:41:41.215056 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:41 crc kubenswrapper[5030]: I0120 23:41:41.222598 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:42 crc kubenswrapper[5030]: I0120 23:41:42.076260 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:42 crc kubenswrapper[5030]: I0120 23:41:42.088368 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:41:47 crc kubenswrapper[5030]: I0120 23:41:47.970085 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:41:49 crc kubenswrapper[5030]: I0120 23:41:49.166707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719"} Jan 20 23:41:51 crc kubenswrapper[5030]: I0120 23:41:51.596911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:19 crc kubenswrapper[5030]: I0120 23:42:19.752740 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:19 crc kubenswrapper[5030]: I0120 23:42:19.894905 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="galera" containerID="cri-o://bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" gracePeriod=30 Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.070074 5030 secret.go:188] Couldn't get secret 
openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.070168 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:20.570144547 +0000 UTC m=+4012.890404955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.070963 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.071203 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:20.571173401 +0000 UTC m=+4012.891433719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.584256 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.584343 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:21.58432262 +0000 UTC m=+4013.904582918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.584263 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: E0120 23:42:20.584520 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:21.584466753 +0000 UTC m=+4013.904727081 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:20 crc kubenswrapper[5030]: I0120 23:42:20.962958 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.597870 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerID="bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" exitCode=0 Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.597947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerDied","Data":"bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae"} Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.605509 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.605576 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:23.605557423 +0000 UTC m=+4015.925817721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.605639 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.605701 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:23.605686536 +0000 UTC m=+4015.925946844 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.670976 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae is running failed: container process not found" containerID="bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.671421 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae is running failed: container process not found" containerID="bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.671758 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae is running failed: container process not found" containerID="bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:42:21 crc kubenswrapper[5030]: E0120 23:42:21.671792 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="galera" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.804977 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.910863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911162 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911448 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxl5\" (UniqueName: \"kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5\") pod \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\" (UID: \"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34\") " Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.911962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.912361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.913077 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.913107 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.913120 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.913134 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.919501 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5" (OuterVolumeSpecName: "kube-api-access-tqxl5") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "kube-api-access-tqxl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.927497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.951079 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:21 crc kubenswrapper[5030]: I0120 23:42:21.976418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" (UID: "d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.014543 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.014588 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.014602 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqxl5\" (UniqueName: \"kubernetes.io/projected/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-kube-api-access-tqxl5\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.014612 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.035793 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.117999 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.518196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.518539 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api" containerID="cri-o://2e863572ff39f65b432bd8c77129121cd78d3d35c55dae06d8f91ddfcc6bcec0" gracePeriod=30 Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.518937 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api-log" containerID="cri-o://24a1ad1389e7a1626270495336d2b4c5f0f755ad5419efb4129c1e33a7aacb57" gracePeriod=30 Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.609533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34","Type":"ContainerDied","Data":"24377ead099d46b1d905df90aef759a74af9cd5bdb7a7fe83f91ab527bfa1cdd"} Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.609645 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.609980 5030 scope.go:117] "RemoveContainer" containerID="bdbd823f6b499ef708f35a6d50e847df98d69dbff62b7e9b46c4b99da43314ae" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.644287 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.659045 5030 scope.go:117] "RemoveContainer" containerID="354ef07e84ccce6c737aee025f3ecd29b588ce56c49537c778770bfa8330eb07" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.665290 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.670482 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:22 crc kubenswrapper[5030]: E0120 23:42:22.670915 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="mysql-bootstrap" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.670931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="mysql-bootstrap" Jan 20 23:42:22 crc kubenswrapper[5030]: E0120 23:42:22.670962 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="galera" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.670970 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="galera" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.671134 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" containerName="galera" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.672059 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.677279 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.677594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.677785 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-j9wjx" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.677979 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.682868 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.829355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.829408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.829432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.829455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.829831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4tv\" (UniqueName: \"kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.830026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.830092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.830204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4tv\" (UniqueName: \"kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931588 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931630 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.931970 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.932082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.932970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.932998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.933297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.936467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.936526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.952519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4tv\" (UniqueName: \"kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv\") pod \"openstack-galera-0\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:22 crc kubenswrapper[5030]: I0120 23:42:22.961818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: 
\"4c184acd-d9c1-4e70-8587-ce940b980983\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.064532 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.136004 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.136066 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:23.636051801 +0000 UTC m=+4015.956312089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.532999 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.622313 5030 generic.go:334] "Generic (PLEG): container finished" podID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerID="24a1ad1389e7a1626270495336d2b4c5f0f755ad5419efb4129c1e33a7aacb57" exitCode=143 Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.622394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerDied","Data":"24a1ad1389e7a1626270495336d2b4c5f0f755ad5419efb4129c1e33a7aacb57"} Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.623901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerStarted","Data":"ac288eabd9795330ac846cea87322f15e976e34b7c83a416725c5a0cc3bf6cf7"} Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.646683 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.646745 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:24.646730659 +0000 UTC m=+4016.966990947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.647045 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.647079 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:42:27.647071727 +0000 UTC m=+4019.967332015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.647117 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: E0120 23:42:23.647137 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:27.647130469 +0000 UTC m=+4019.967390757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.937408 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:42:23 crc kubenswrapper[5030]: I0120 23:42:23.975238 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34" path="/var/lib/kubelet/pods/d0f6aa7c-dc7c-4aa6-8e0f-ea81ca443a34/volumes" Jan 20 23:42:24 crc kubenswrapper[5030]: I0120 23:42:24.639721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerStarted","Data":"2f1b253ff24c4c1ea125df54301e5f8a3831fd6d691239c96bedcafa484781ff"} Jan 20 23:42:24 crc kubenswrapper[5030]: E0120 23:42:24.668711 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:24 crc kubenswrapper[5030]: E0120 23:42:24.668860 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:26.668821874 +0000 UTC m=+4018.989082202 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:25 crc kubenswrapper[5030]: I0120 23:42:25.683133 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.216:8776/healthcheck\": read tcp 10.217.0.2:39276->10.217.0.216:8776: read: connection reset by peer" Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.663192 5030 generic.go:334] "Generic (PLEG): container finished" podID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerID="2e863572ff39f65b432bd8c77129121cd78d3d35c55dae06d8f91ddfcc6bcec0" exitCode=0 Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.663490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerDied","Data":"2e863572ff39f65b432bd8c77129121cd78d3d35c55dae06d8f91ddfcc6bcec0"} Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.665184 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c184acd-d9c1-4e70-8587-ce940b980983" containerID="2f1b253ff24c4c1ea125df54301e5f8a3831fd6d691239c96bedcafa484781ff" exitCode=0 Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.665228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerDied","Data":"2f1b253ff24c4c1ea125df54301e5f8a3831fd6d691239c96bedcafa484781ff"} Jan 20 23:42:26 crc kubenswrapper[5030]: E0120 23:42:26.720146 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:26 crc kubenswrapper[5030]: E0120 23:42:26.720214 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:30.720199287 +0000 UTC m=+4023.040459575 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.934777 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.935136 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.935845 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519"} pod="openstack-kuttl-tests/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.935896 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" containerID="cri-o://c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519" gracePeriod=30 Jan 20 23:42:26 crc kubenswrapper[5030]: I0120 23:42:26.944319 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn6xj\" (UniqueName: \"kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.126758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts\") pod \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\" (UID: \"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad\") " Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.127517 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.129026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs" (OuterVolumeSpecName: "logs") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.131830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj" (OuterVolumeSpecName: "kube-api-access-bn6xj") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "kube-api-access-bn6xj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.133378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts" (OuterVolumeSpecName: "scripts") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.136676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.166956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.178881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data" (OuterVolumeSpecName: "config-data") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.182567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.188698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" (UID: "cbd9c315-eb7d-4ef2-ac12-431a837cf9ad"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229027 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn6xj\" (UniqueName: \"kubernetes.io/projected/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-kube-api-access-bn6xj\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229329 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229413 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229488 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229562 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229668 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229745 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.229844 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.677907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"cbd9c315-eb7d-4ef2-ac12-431a837cf9ad","Type":"ContainerDied","Data":"bbd7996fd07c4b83ef560c5f832bd31063b8def68133a0f6da2b5d4c17533e02"} Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.678271 5030 scope.go:117] "RemoveContainer" containerID="2e863572ff39f65b432bd8c77129121cd78d3d35c55dae06d8f91ddfcc6bcec0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.677994 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.681490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerStarted","Data":"626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1"} Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.711979 5030 scope.go:117] "RemoveContainer" containerID="24a1ad1389e7a1626270495336d2b4c5f0f755ad5419efb4129c1e33a7aacb57" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.724511 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.72448573 podStartE2EDuration="5.72448573s" podCreationTimestamp="2026-01-20 23:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:27.71129181 +0000 UTC m=+4020.031552108" watchObservedRunningTime="2026-01-20 23:42:27.72448573 +0000 UTC m=+4020.044746028" Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.739766 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.739802 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.739877 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:35.739855473 +0000 UTC m=+4028.060115761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.740053 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:35.740035608 +0000 UTC m=+4028.060295936 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.744111 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.764778 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.772880 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.773374 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.773395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api" Jan 20 23:42:27 crc kubenswrapper[5030]: E0120 23:42:27.773428 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api-log" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.773438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api-log" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.773734 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.773770 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" containerName="cinder-api-log" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.774975 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.778277 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.778463 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.780995 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.782580 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.943761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6j67\" (UniqueName: \"kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.943833 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.943880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944158 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944362 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944458 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.944610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:27 crc kubenswrapper[5030]: I0120 23:42:27.978907 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd9c315-eb7d-4ef2-ac12-431a837cf9ad" path="/var/lib/kubelet/pods/cbd9c315-eb7d-4ef2-ac12-431a837cf9ad/volumes" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.046949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6j67\" (UniqueName: \"kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.047850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: 
I0120 23:42:28.048314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.048526 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.048652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.049268 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.052685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.053141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.053873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.054239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.054828 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.055138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" 
Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.075912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6j67\" (UniqueName: \"kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67\") pod \"cinder-api-0\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.096796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:28 crc kubenswrapper[5030]: E0120 23:42:28.206684 5030 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.9:33728->38.102.83.9:37955: read tcp 38.102.83.9:33728->38.102.83.9:37955: read: connection reset by peer Jan 20 23:42:28 crc kubenswrapper[5030]: W0120 23:42:28.598408 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod425b4c25_fdad_470a_b462_188b573d359a.slice/crio-7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144 WatchSource:0}: Error finding container 7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144: Status 404 returned error can't find the container with id 7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144 Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.598741 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.717922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerStarted","Data":"7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144"} Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.964803 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:28 crc kubenswrapper[5030]: I0120 23:42:28.965581 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="openstack-network-exporter" containerID="cri-o://6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" gracePeriod=300 Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.029044 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="ovsdbserver-nb" containerID="cri-o://f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" gracePeriod=300 Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.442526 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f/ovsdbserver-nb/0.log" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.442937 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.581286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.581584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.581615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.581691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.581752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks65s\" (UniqueName: \"kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s\") pod \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\" (UID: \"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f\") " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts" 
(OuterVolumeSpecName: "scripts") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582945 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.582962 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.583003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config" (OuterVolumeSpecName: "config") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.586242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s" (OuterVolumeSpecName: "kube-api-access-ks65s") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "kube-api-access-ks65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.586378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.615578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.649225 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.661371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" (UID: "daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685371 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks65s\" (UniqueName: \"kubernetes.io/projected/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-kube-api-access-ks65s\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685429 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685444 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685457 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685470 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.685481 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.708252 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732363 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f/ovsdbserver-nb/0.log" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732403 5030 generic.go:334] "Generic (PLEG): container finished" podID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerID="6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" exitCode=2 Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732419 5030 generic.go:334] "Generic (PLEG): container finished" podID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerID="f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" exitCode=143 Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerDied","Data":"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879"} Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerDied","Data":"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160"} Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732497 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f","Type":"ContainerDied","Data":"51ad7e4068ed79b67273eb094e8facdd2b100873403720679fd4cb08f933f72f"} Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732513 5030 scope.go:117] "RemoveContainer" containerID="6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.732649 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.736149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerStarted","Data":"8b167ca61f8e9969a9c5b4773f26b86a04c25264b7d286b0bb38c376ba36ecef"} Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.752661 5030 scope.go:117] "RemoveContainer" containerID="f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.783488 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.789282 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.791611 5030 scope.go:117] "RemoveContainer" containerID="6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" Jan 20 23:42:29 crc kubenswrapper[5030]: E0120 23:42:29.793526 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879\": container with ID starting with 6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879 not found: ID does not exist" containerID="6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.793590 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879"} err="failed to get container status \"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879\": rpc error: code = NotFound desc = could not find container \"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879\": container with ID starting with 6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879 not found: ID does not exist" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.793615 5030 scope.go:117] "RemoveContainer" containerID="f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" Jan 20 23:42:29 crc kubenswrapper[5030]: E0120 23:42:29.793960 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160\": container with ID starting with f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160 not found: ID does not exist" containerID="f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.793983 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160"} err="failed to get 
container status \"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160\": rpc error: code = NotFound desc = could not find container \"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160\": container with ID starting with f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160 not found: ID does not exist" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.793996 5030 scope.go:117] "RemoveContainer" containerID="6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.794263 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879"} err="failed to get container status \"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879\": rpc error: code = NotFound desc = could not find container \"6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879\": container with ID starting with 6993a6eaf50304840fa800c5425b559de98e30563c4055deed1fe8bdc8a7b879 not found: ID does not exist" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.794283 5030 scope.go:117] "RemoveContainer" containerID="f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.794510 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160"} err="failed to get container status \"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160\": rpc error: code = NotFound desc = could not find container \"f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160\": container with ID starting with f00d0bda564f59097aa364417eb72ad7498278441774eca7fb29537c8ee1e160 not found: ID does not exist" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.794549 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.818273 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:29 crc kubenswrapper[5030]: E0120 23:42:29.818685 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="openstack-network-exporter" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.818702 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="openstack-network-exporter" Jan 20 23:42:29 crc kubenswrapper[5030]: E0120 23:42:29.818721 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="ovsdbserver-nb" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.818729 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="ovsdbserver-nb" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.818910 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="openstack-network-exporter" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.818939 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" containerName="ovsdbserver-nb" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.819885 5030 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.822117 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.822274 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.822379 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-m2qb9" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.823945 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.833007 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.976718 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f" path="/var/lib/kubelet/pods/daf55cf8-5e84-44cc-afbb-8e8af7aa0a3f/volumes" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993122 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.993958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:29 crc kubenswrapper[5030]: I0120 23:42:29.994039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7l9s\" (UniqueName: \"kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.095988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.096487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l9s\" (UniqueName: \"kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 
23:42:30.096542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.097086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.097138 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.097212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.097259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.101007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.101229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.101764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.115572 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7l9s\" (UniqueName: \"kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s\") pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.132132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"ovsdbserver-nb-0\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.143961 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.609874 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:30 crc kubenswrapper[5030]: W0120 23:42:30.614486 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30def4ae_b574_406f_ac55_1fbae852b7f8.slice/crio-06bea6c482c0d846d880a184c1a4ae2b0be1468e842799de009fe4b76de8dac5 WatchSource:0}: Error finding container 06bea6c482c0d846d880a184c1a4ae2b0be1468e842799de009fe4b76de8dac5: Status 404 returned error can't find the container with id 06bea6c482c0d846d880a184c1a4ae2b0be1468e842799de009fe4b76de8dac5 Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.683414 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.683827 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="openstack-network-exporter" containerID="cri-o://4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" gracePeriod=300 Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.750617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerStarted","Data":"590d9e7afd81121f20e8b3c4a10aa9c029280098e2371ecd81d6f475c6e6a932"} Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.750789 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.751720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerStarted","Data":"06bea6c482c0d846d880a184c1a4ae2b0be1468e842799de009fe4b76de8dac5"} Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.754236 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="ovsdbserver-sb" containerID="cri-o://4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" gracePeriod=300 Jan 20 23:42:30 crc kubenswrapper[5030]: I0120 23:42:30.782597 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.782574995 podStartE2EDuration="3.782574995s" podCreationTimestamp="2026-01-20 23:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:30.766993697 +0000 UTC m=+4023.087253985" watchObservedRunningTime="2026-01-20 23:42:30.782574995 +0000 UTC m=+4023.102835293" Jan 20 23:42:30 crc kubenswrapper[5030]: E0120 23:42:30.811029 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:30 crc kubenswrapper[5030]: E0120 23:42:30.811128 5030 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:38.811106247 +0000 UTC m=+4031.131366545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.141480 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_565f8fbe-5c4e-4cf4-8a18-0930a8595306/ovsdbserver-sb/0.log" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.141758 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.172210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.172444 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="ovn-northd" containerID="cri-o://d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" gracePeriod=30 Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.172538 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="openstack-network-exporter" containerID="cri-o://e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3" gracePeriod=30 Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.321775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.321873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.321911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322131 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtw9s\" (UniqueName: \"kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts\") pod \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\" (UID: \"565f8fbe-5c4e-4cf4-8a18-0930a8595306\") " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322305 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322603 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.322723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config" (OuterVolumeSpecName: "config") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.323089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts" (OuterVolumeSpecName: "scripts") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.326903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.329228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s" (OuterVolumeSpecName: "kube-api-access-gtw9s") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "kube-api-access-gtw9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.346216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.392965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.414916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "565f8fbe-5c4e-4cf4-8a18-0930a8595306" (UID: "565f8fbe-5c4e-4cf4-8a18-0930a8595306"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424874 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424914 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424931 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424949 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtw9s\" (UniqueName: \"kubernetes.io/projected/565f8fbe-5c4e-4cf4-8a18-0930a8595306-kube-api-access-gtw9s\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424973 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/565f8fbe-5c4e-4cf4-8a18-0930a8595306-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.424987 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/565f8fbe-5c4e-4cf4-8a18-0930a8595306-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.465931 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.526387 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.763985 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_565f8fbe-5c4e-4cf4-8a18-0930a8595306/ovsdbserver-sb/0.log" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764236 5030 generic.go:334] "Generic (PLEG): container finished" podID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerID="4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" exitCode=2 Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764252 5030 generic.go:334] "Generic (PLEG): container finished" podID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerID="4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" exitCode=143 Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerDied","Data":"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764320 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerDied","Data":"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"565f8fbe-5c4e-4cf4-8a18-0930a8595306","Type":"ContainerDied","Data":"bbf5f25f66736fe1bf9ce2fa2d91a62e77af1012e7410952ed6eb2d1d9bcae94"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764372 5030 scope.go:117] "RemoveContainer" containerID="4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.764404 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.767634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerStarted","Data":"8a6b141363970fec4afba83162c49db3275e773739115c07b31346e1c8e613cb"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.767658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerStarted","Data":"4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.770163 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerID="e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3" exitCode=2 Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.770234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerDied","Data":"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3"} Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.805517 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.805492329 podStartE2EDuration="2.805492329s" podCreationTimestamp="2026-01-20 23:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:31.804717831 +0000 UTC m=+4024.124978119" watchObservedRunningTime="2026-01-20 23:42:31.805492329 +0000 UTC m=+4024.125752657" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.822767 5030 scope.go:117] "RemoveContainer" containerID="4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.857123 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.863437 5030 scope.go:117] "RemoveContainer" containerID="4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" Jan 20 23:42:31 crc kubenswrapper[5030]: E0120 23:42:31.866425 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e\": container with ID starting with 4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e not found: ID 
does not exist" containerID="4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.866475 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e"} err="failed to get container status \"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e\": rpc error: code = NotFound desc = could not find container \"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e\": container with ID starting with 4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e not found: ID does not exist" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.866507 5030 scope.go:117] "RemoveContainer" containerID="4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" Jan 20 23:42:31 crc kubenswrapper[5030]: E0120 23:42:31.866841 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45\": container with ID starting with 4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45 not found: ID does not exist" containerID="4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.866868 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45"} err="failed to get container status \"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45\": rpc error: code = NotFound desc = could not find container \"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45\": container with ID starting with 4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45 not found: ID does not exist" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.866892 5030 scope.go:117] "RemoveContainer" containerID="4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.867106 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e"} err="failed to get container status \"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e\": rpc error: code = NotFound desc = could not find container \"4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e\": container with ID starting with 4de278d9a5ff270e2a1daa88c0325c4b647c78feb276c7c29f7ba54b44299a8e not found: ID does not exist" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.867124 5030 scope.go:117] "RemoveContainer" containerID="4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.867168 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.867337 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45"} err="failed to get container status \"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45\": rpc error: code = NotFound desc = could not find container \"4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45\": container with ID starting with 
4316551a5daa3a88272dc2c84618160c58c4d5a1457dbbaa50bbbbe7f4598c45 not found: ID does not exist" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.881995 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:31 crc kubenswrapper[5030]: E0120 23:42:31.882985 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="openstack-network-exporter" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.883006 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="openstack-network-exporter" Jan 20 23:42:31 crc kubenswrapper[5030]: E0120 23:42:31.883049 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="ovsdbserver-sb" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.883055 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="ovsdbserver-sb" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.883214 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="ovsdbserver-sb" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.883239 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" containerName="openstack-network-exporter" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.884179 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.886258 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.886454 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.886607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.886866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-7s6fz" Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.893671 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:31 crc kubenswrapper[5030]: I0120 23:42:31.975824 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565f8fbe-5c4e-4cf4-8a18-0930a8595306" path="/var/lib/kubelet/pods/565f8fbe-5c4e-4cf4-8a18-0930a8595306/volumes" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.035776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.035847 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pz2f\" (UniqueName: \"kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f\") pod \"ovsdbserver-sb-0\" (UID: 
\"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.035937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.036034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.036082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.036107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.036173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.036202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: E0120 23:42:32.085703 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:42:32 crc kubenswrapper[5030]: E0120 23:42:32.087468 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:42:32 crc kubenswrapper[5030]: E0120 23:42:32.088936 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:42:32 crc kubenswrapper[5030]: E0120 23:42:32.089004 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="ovn-northd" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.137904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.137965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pz2f\" (UniqueName: \"kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138206 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.138273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc 
kubenswrapper[5030]: I0120 23:42:32.139450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.140480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.140927 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.141520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.147672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.149651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.152704 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.161483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pz2f\" (UniqueName: \"kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.179649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.207084 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:32 crc kubenswrapper[5030]: W0120 23:42:32.650795 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe7d3731_2d0f_4fcc_8edf_424ffe462724.slice/crio-f612e18174db4664ec2fc5a5fd3ee2f6c4a63c70838148e83e03dd3a54bb01a6 WatchSource:0}: Error finding container f612e18174db4664ec2fc5a5fd3ee2f6c4a63c70838148e83e03dd3a54bb01a6: Status 404 returned error can't find the container with id f612e18174db4664ec2fc5a5fd3ee2f6c4a63c70838148e83e03dd3a54bb01a6 Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.652226 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.783385 5030 generic.go:334] "Generic (PLEG): container finished" podID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerID="c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519" exitCode=0 Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.783793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerDied","Data":"c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519"} Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.783856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerStarted","Data":"20c4a921161139dd7e23f659cc47617ba1e894ddbac491c152cd76fd4f4e85b4"} Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.784804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerStarted","Data":"f612e18174db4664ec2fc5a5fd3ee2f6c4a63c70838148e83e03dd3a54bb01a6"} Jan 20 23:42:32 crc kubenswrapper[5030]: I0120 23:42:32.913745 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.064929 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.065209 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.141835 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.144816 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.799305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerStarted","Data":"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73"} Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.799351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerStarted","Data":"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708"} Jan 20 23:42:33 crc 
kubenswrapper[5030]: I0120 23:42:33.824143 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.824121039 podStartE2EDuration="2.824121039s" podCreationTimestamp="2026-01-20 23:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:33.819039645 +0000 UTC m=+4026.139299933" watchObservedRunningTime="2026-01-20 23:42:33.824121039 +0000 UTC m=+4026.144381337" Jan 20 23:42:33 crc kubenswrapper[5030]: I0120 23:42:33.919878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.496179 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.499274 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.509652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.587465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.587644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.587683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b6zg\" (UniqueName: \"kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.587716 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.587804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.588024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.588139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.689471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.690880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b6zg\" (UniqueName: \"kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.691420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.704506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.707166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.710608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.713004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.714555 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.724797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b6zg\" (UniqueName: \"kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg\") pod \"placement-85c9874f86-xpfrg\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:34 crc kubenswrapper[5030]: I0120 23:42:34.816005 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.145196 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.208185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.263003 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_b4c36492-6bfc-4370-b163-6e8ea9b8c06a/ovn-northd/0.log" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.263082 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.263603 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.304671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.304801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.304873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.304994 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.305047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlqwt\" (UniqueName: \"kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.305066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.305094 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir\") pod \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\" (UID: \"b4c36492-6bfc-4370-b163-6e8ea9b8c06a\") " Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.306057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.307328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config" (OuterVolumeSpecName: "config") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.307781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts" (OuterVolumeSpecName: "scripts") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.319019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt" (OuterVolumeSpecName: "kube-api-access-vlqwt") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "kube-api-access-vlqwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.344065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.384311 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.402016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b4c36492-6bfc-4370-b163-6e8ea9b8c06a" (UID: "b4c36492-6bfc-4370-b163-6e8ea9b8c06a"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407585 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407617 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlqwt\" (UniqueName: \"kubernetes.io/projected/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-kube-api-access-vlqwt\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407640 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407653 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407664 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407672 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.407682 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c36492-6bfc-4370-b163-6e8ea9b8c06a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.412092 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.815854 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-public-svc: secret "cert-swift-public-svc" not found Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.815890 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-swift-internal-svc: secret "cert-swift-internal-svc" not found Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.815979 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:51.815947868 +0000 UTC m=+4044.136208156 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-public-svc" not found Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.816010 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs podName:d66a4eea-17a5-4ab5-b3be-2a479226daf7 nodeName:}" failed. No retries permitted until 2026-01-20 23:42:51.816000099 +0000 UTC m=+4044.136260387 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs") pod "swift-proxy-7c556c6584-sfpnb" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7") : secret "cert-swift-internal-svc" not found Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.822576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerStarted","Data":"1b5d6e7723fc432f59bb152ee5c44c0e160790b16eb10a41a085b3d039712a96"} Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.822666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerStarted","Data":"2bcdfe6dfff86b5e656db87b6f5cb63bbc57725e294dba0a83718b6cf6e1154c"} Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826169 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_b4c36492-6bfc-4370-b163-6e8ea9b8c06a/ovn-northd/0.log" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826224 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" exitCode=139 Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerDied","Data":"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d"} Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b4c36492-6bfc-4370-b163-6e8ea9b8c06a","Type":"ContainerDied","Data":"93e1498c036be00fce2beed4475e558b06e6dccea18d7b64ec7b64e36e26666a"} Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826381 5030 scope.go:117] "RemoveContainer" containerID="e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826304 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.826743 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.864420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.864962 5030 scope.go:117] "RemoveContainer" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.874462 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.898915 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.899409 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="ovn-northd" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.899435 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="ovn-northd" Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.899493 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="openstack-network-exporter" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.899502 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="openstack-network-exporter" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.899769 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="openstack-network-exporter" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.899799 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" containerName="ovn-northd" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.902004 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.910588 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.910681 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-l9mtr" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.910749 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.912987 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.915510 5030 scope.go:117] "RemoveContainer" containerID="e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3" Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.916202 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3\": container with ID starting with e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3 not found: ID does not exist" containerID="e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.916231 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3"} err="failed to get container status \"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3\": rpc error: code = NotFound desc = could not find container \"e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3\": container with ID starting with e7b8a2e75577c589c157713b286429901def374e01d91bd72e7e69b63cc402a3 not found: ID does not exist" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.916252 5030 scope.go:117] "RemoveContainer" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917179 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzhw\" (UniqueName: \"kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 
crc kubenswrapper[5030]: I0120 23:42:35.917338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917367 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917463 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:35 crc kubenswrapper[5030]: E0120 23:42:35.917749 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d\": container with ID starting with d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d not found: ID does not exist" containerID="d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.917772 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d"} err="failed to get container status \"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d\": rpc error: code = NotFound desc = could not find container \"d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d\": container with ID starting with d963c6340fa6c02da3dbcf63fe62e7f81f6054eca4f78117aab7b1b13f17c44d not found: ID does not exist" Jan 20 23:42:35 crc kubenswrapper[5030]: I0120 23:42:35.971839 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c36492-6bfc-4370-b163-6e8ea9b8c06a" path="/var/lib/kubelet/pods/b4c36492-6bfc-4370-b163-6e8ea9b8c06a/volumes" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzhw\" (UniqueName: \"kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018770 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.018956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.019541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.019773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.020129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.023641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.023680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.025260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.037793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzhw\" (UniqueName: \"kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw\") pod \"ovn-northd-0\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.187143 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.223919 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.224190 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.685755 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:36 crc kubenswrapper[5030]: W0120 23:42:36.692522 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87ed5aae_25b3_475c_b086_4ef865d35368.slice/crio-110d30eeebae4837cb28aa3e933028a46551d3a7db01bd57214198f11aefffaf WatchSource:0}: Error finding container 110d30eeebae4837cb28aa3e933028a46551d3a7db01bd57214198f11aefffaf: Status 404 returned error can't find the container with id 110d30eeebae4837cb28aa3e933028a46551d3a7db01bd57214198f11aefffaf Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.844596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerStarted","Data":"110d30eeebae4837cb28aa3e933028a46551d3a7db01bd57214198f11aefffaf"} Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.847262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerStarted","Data":"4c1481e7f538bbd813fc066cbdf4b164b9a4610f8972b882bebb9408284fb510"} Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.847418 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:36 crc kubenswrapper[5030]: I0120 23:42:36.871378 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podStartSLOduration=2.8713599 podStartE2EDuration="2.8713599s" podCreationTimestamp="2026-01-20 23:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:36.86598168 +0000 UTC m=+4029.186241968" watchObservedRunningTime="2026-01-20 
23:42:36.8713599 +0000 UTC m=+4029.191620188" Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.276608 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.864775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerStarted","Data":"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414"} Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.865073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerStarted","Data":"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb"} Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.866001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.888583 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.888560026 podStartE2EDuration="2.888560026s" podCreationTimestamp="2026-01-20 23:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:37.881383342 +0000 UTC m=+4030.201643620" watchObservedRunningTime="2026-01-20 23:42:37.888560026 +0000 UTC m=+4030.208820324" Jan 20 23:42:37 crc kubenswrapper[5030]: I0120 23:42:37.945200 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:42:38 crc kubenswrapper[5030]: E0120 23:42:38.872286 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:42:38 crc kubenswrapper[5030]: E0120 23:42:38.872343 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs podName:19f669e8-7127-4a91-ae58-2e88b3bb7d6e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:54.872329662 +0000 UTC m=+4047.192589950 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e") : secret "cert-glance-default-internal-svc" not found Jan 20 23:42:38 crc kubenswrapper[5030]: I0120 23:42:38.875475 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:39 crc kubenswrapper[5030]: I0120 23:42:39.890021 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:42:40 crc kubenswrapper[5030]: E0120 23:42:40.097771 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:42:40 crc kubenswrapper[5030]: E0120 23:42:40.097852 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs podName:f6d833c2-fa91-4538-a2e6-d42190e6fb9e nodeName:}" failed. 
No retries permitted until 2026-01-20 23:42:40.59782892 +0000 UTC m=+4032.918089218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs") pod "memcached-0" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e") : secret "cert-memcached-svc" not found Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.427582 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.429444 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.448052 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.504552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.504689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.504708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.504835 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84bh7\" (UniqueName: \"kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.504993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.505133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.505207 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.606611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.606878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84bh7\" (UniqueName: \"kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607202 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.607924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: E0120 
23:42:40.607950 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:42:40 crc kubenswrapper[5030]: E0120 23:42:40.608137 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs podName:f6d833c2-fa91-4538-a2e6-d42190e6fb9e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:41.608120779 +0000 UTC m=+4033.928381067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs") pod "memcached-0" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e") : secret "cert-memcached-svc" not found Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.613040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.613696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.614244 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.620387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.621692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.640189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84bh7\" (UniqueName: \"kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7\") pod \"barbican-api-c9cc68596-zfxv7\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:40 crc kubenswrapper[5030]: I0120 23:42:40.747615 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.235031 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:42:41 crc kubenswrapper[5030]: E0120 23:42:41.636961 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:42:41 crc kubenswrapper[5030]: E0120 23:42:41.637278 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs podName:f6d833c2-fa91-4538-a2e6-d42190e6fb9e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:43.637256334 +0000 UTC m=+4035.957516612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs") pod "memcached-0" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e") : secret "cert-memcached-svc" not found Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.917460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerStarted","Data":"13859d5f41c3b61a0c89700bdb6935b58c5bf928f2bbdf63a92f7c1183771d73"} Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.917506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerStarted","Data":"6dcae7ae7c484584017f7fa0aca6aa0cb08528fa8a61b4d048214746eb80c8b1"} Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.917517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerStarted","Data":"ce7139799de264d52e74dfa0f7516e35af51cfbd6733c10e1f41e617940e9330"} Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.917697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.917957 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:41 crc kubenswrapper[5030]: I0120 23:42:41.948336 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" podStartSLOduration=1.94830843 podStartE2EDuration="1.94830843s" podCreationTimestamp="2026-01-20 23:42:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:41.947381688 +0000 UTC m=+4034.267641986" watchObservedRunningTime="2026-01-20 23:42:41.94830843 +0000 UTC m=+4034.268568758" Jan 20 23:42:42 crc kubenswrapper[5030]: I0120 23:42:42.680009 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:42 crc kubenswrapper[5030]: I0120 23:42:42.836610 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="galera" containerID="cri-o://4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411" 
gracePeriod=30 Jan 20 23:42:43 crc kubenswrapper[5030]: E0120 23:42:43.678680 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:42:43 crc kubenswrapper[5030]: E0120 23:42:43.679023 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs podName:f6d833c2-fa91-4538-a2e6-d42190e6fb9e nodeName:}" failed. No retries permitted until 2026-01-20 23:42:47.679008555 +0000 UTC m=+4039.999268833 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs") pod "memcached-0" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e") : secret "cert-memcached-svc" not found Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.738606 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.881924 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.882010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.882065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.882199 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.882282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883134 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngv9k\" (UniqueName: \"kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.883782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config\") pod \"410c6d7e-0366-4932-b95a-4e9624eadf71\" (UID: \"410c6d7e-0366-4932-b95a-4e9624eadf71\") " Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.884472 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.885426 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.885493 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.885526 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/410c6d7e-0366-4932-b95a-4e9624eadf71-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.885562 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/410c6d7e-0366-4932-b95a-4e9624eadf71-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.888553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k" (OuterVolumeSpecName: "kube-api-access-ngv9k") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "kube-api-access-ngv9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.897922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.915978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.950201 5030 generic.go:334] "Generic (PLEG): container finished" podID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerID="4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411" exitCode=0 Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.950244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerDied","Data":"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411"} Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.950270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"410c6d7e-0366-4932-b95a-4e9624eadf71","Type":"ContainerDied","Data":"53deed7c2df75d0d2f1773031ab76255788fc76bd82624f1bc4f084ab9818c8a"} Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.950287 5030 scope.go:117] "RemoveContainer" containerID="4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.950422 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.963565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "410c6d7e-0366-4932-b95a-4e9624eadf71" (UID: "410c6d7e-0366-4932-b95a-4e9624eadf71"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.986962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngv9k\" (UniqueName: \"kubernetes.io/projected/410c6d7e-0366-4932-b95a-4e9624eadf71-kube-api-access-ngv9k\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.987006 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.987024 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410c6d7e-0366-4932-b95a-4e9624eadf71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:43 crc kubenswrapper[5030]: I0120 23:42:43.987065 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.015055 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.016332 5030 scope.go:117] "RemoveContainer" containerID="e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.040256 5030 scope.go:117] "RemoveContainer" containerID="4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411" Jan 20 23:42:44 crc kubenswrapper[5030]: E0120 23:42:44.040706 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411\": container with ID starting with 4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411 not found: ID does not exist" containerID="4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.040735 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411"} err="failed to get container status \"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411\": rpc error: code = NotFound desc = could not find container \"4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411\": container with ID starting with 4f34747e9a5c643353b7a19006a2b052c6d1ed183a2329724de92de4a875c411 not found: ID does not exist" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.040758 5030 scope.go:117] "RemoveContainer" containerID="e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f" Jan 20 23:42:44 crc kubenswrapper[5030]: E0120 23:42:44.041396 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f\": container with ID starting with e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f not found: ID does not exist" containerID="e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.041445 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f"} err="failed to get container status \"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f\": rpc error: code = NotFound desc = could not find container \"e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f\": container with ID starting with e16b7deceab9b6d152d69c2ad3efd2d72f8ea41ed94ce05e0ca3b3309400a66f not found: ID does not exist" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.090990 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.282818 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.303528 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.320977 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:44 crc kubenswrapper[5030]: E0120 23:42:44.321676 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="mysql-bootstrap" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.321775 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="mysql-bootstrap" Jan 20 23:42:44 crc kubenswrapper[5030]: E0120 23:42:44.321877 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="galera" Jan 20 23:42:44 crc 
kubenswrapper[5030]: I0120 23:42:44.321952 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="galera" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.322284 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" containerName="galera" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.323612 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.326788 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.326874 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.326905 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.326820 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rjg7g" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.332995 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.503295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.503341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.503370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.503413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.503647 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc 
kubenswrapper[5030]: I0120 23:42:44.503728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.504727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.504853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6klm\" (UniqueName: \"kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.607335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.607644 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.607875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6klm\" (UniqueName: \"kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608636 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.608932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.609540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.609644 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.612123 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.614436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.615943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.634216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6klm\" (UniqueName: \"kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.660527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.951694 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.951964 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" containerName="memcached" containerID="cri-o://8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706" gracePeriod=30 Jan 20 23:42:44 crc kubenswrapper[5030]: I0120 23:42:44.954279 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.009680 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.010182 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="eedd0095-2878-461a-a374-8f50c0a0f040" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1" gracePeriod=30 Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.027850 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.028135 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-log" containerID="cri-o://d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f" gracePeriod=30 Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.028220 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-httpd" containerID="cri-o://89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d" gracePeriod=30 Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.112219 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.112527 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-log" 
containerID="cri-o://577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825" gracePeriod=30 Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.112710 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-httpd" containerID="cri-o://d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955" gracePeriod=30 Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.216806 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.218569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.235155 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htklw\" (UniqueName: \"kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.322955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 
23:42:45.323046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.323160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htklw\" (UniqueName: \"kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.425375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc 
kubenswrapper[5030]: I0120 23:42:45.425488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.431242 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.431509 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.435118 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.437080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.437159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.438227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.438693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.446850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htklw\" (UniqueName: \"kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw\") pod \"keystone-685c4d7f7b-d6j4k\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.531128 5030 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.543048 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.851724 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.938269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs\") pod \"eedd0095-2878-461a-a374-8f50c0a0f040\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.938318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs\") pod \"eedd0095-2878-461a-a374-8f50c0a0f040\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.938351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle\") pod \"eedd0095-2878-461a-a374-8f50c0a0f040\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.938437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh48t\" (UniqueName: \"kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t\") pod \"eedd0095-2878-461a-a374-8f50c0a0f040\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.938515 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data\") pod \"eedd0095-2878-461a-a374-8f50c0a0f040\" (UID: \"eedd0095-2878-461a-a374-8f50c0a0f040\") " Jan 20 23:42:45 crc kubenswrapper[5030]: I0120 23:42:45.955934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t" (OuterVolumeSpecName: "kube-api-access-lh48t") pod "eedd0095-2878-461a-a374-8f50c0a0f040" (UID: "eedd0095-2878-461a-a374-8f50c0a0f040"). InnerVolumeSpecName "kube-api-access-lh48t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.025693 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410c6d7e-0366-4932-b95a-4e9624eadf71" path="/var/lib/kubelet/pods/410c6d7e-0366-4932-b95a-4e9624eadf71/volumes" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.035973 5030 generic.go:334] "Generic (PLEG): container finished" podID="eedd0095-2878-461a-a374-8f50c0a0f040" containerID="ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1" exitCode=0 Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.036068 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.045077 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh48t\" (UniqueName: \"kubernetes.io/projected/eedd0095-2878-461a-a374-8f50c0a0f040-kube-api-access-lh48t\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.048966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eedd0095-2878-461a-a374-8f50c0a0f040","Type":"ContainerDied","Data":"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.049011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"eedd0095-2878-461a-a374-8f50c0a0f040","Type":"ContainerDied","Data":"2712b6d1babe11a94d357395e930698091b1d864ca55cf94cbb0879d16fb7dfc"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.049025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerStarted","Data":"0209a5dfb0c62f1e66ec249e44fd294c019db9b9470a7115b6112a0bd6656979"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.049035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerStarted","Data":"71b5bce4ed79b269d7a64a65dd7f41e193081fd88b88a067419eba7bd8a373de"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.049058 5030 scope.go:117] "RemoveContainer" containerID="ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.067929 5030 generic.go:334] "Generic (PLEG): container finished" podID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerID="577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825" exitCode=143 Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.067995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerDied","Data":"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.080806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data" (OuterVolumeSpecName: "config-data") pod "eedd0095-2878-461a-a374-8f50c0a0f040" (UID: "eedd0095-2878-461a-a374-8f50c0a0f040"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.090115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eedd0095-2878-461a-a374-8f50c0a0f040" (UID: "eedd0095-2878-461a-a374-8f50c0a0f040"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.091476 5030 generic.go:334] "Generic (PLEG): container finished" podID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerID="d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f" exitCode=143 Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.091546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerDied","Data":"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f"} Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.094748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "eedd0095-2878-461a-a374-8f50c0a0f040" (UID: "eedd0095-2878-461a-a374-8f50c0a0f040"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.100745 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.105254 5030 scope.go:117] "RemoveContainer" containerID="ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1" Jan 20 23:42:46 crc kubenswrapper[5030]: E0120 23:42:46.107762 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1\": container with ID starting with ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1 not found: ID does not exist" containerID="ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.107809 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1"} err="failed to get container status \"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1\": rpc error: code = NotFound desc = could not find container \"ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1\": container with ID starting with ade9450e93cbd809b41874b3fd69c3ff407cb273c825f9de6a700f1a0133a9a1 not found: ID does not exist" Jan 20 23:42:46 crc kubenswrapper[5030]: W0120 23:42:46.111162 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b42296e_9bb8_4b92_ac1f_6f62a58bb042.slice/crio-b2263a247334346a23757c575d761339fb92af17d180b287c6b07462fd3ac1c2 WatchSource:0}: Error finding container b2263a247334346a23757c575d761339fb92af17d180b287c6b07462fd3ac1c2: Status 404 returned error can't find the container with id b2263a247334346a23757c575d761339fb92af17d180b287c6b07462fd3ac1c2 Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.111211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "eedd0095-2878-461a-a374-8f50c0a0f040" (UID: "eedd0095-2878-461a-a374-8f50c0a0f040"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.147738 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.147775 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.147787 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.147797 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd0095-2878-461a-a374-8f50c0a0f040-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.296168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.377192 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.393275 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.402902 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:46 crc kubenswrapper[5030]: E0120 23:42:46.403380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedd0095-2878-461a-a374-8f50c0a0f040" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.403398 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedd0095-2878-461a-a374-8f50c0a0f040" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.403635 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedd0095-2878-461a-a374-8f50c0a0f040" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.404298 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.408334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.408581 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.408878 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.448311 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.455590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.455800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsrc\" (UniqueName: \"kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.458172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.460447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.460660 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.563169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.563221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.563267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsrc\" (UniqueName: \"kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.563333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.563393 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.567590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.568038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.572687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.573454 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.586373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsrc\" (UniqueName: \"kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.790929 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.836811 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.869443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config\") pod \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.869508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q\") pod \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.869599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs\") pod \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.869633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle\") pod \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.869764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data\") pod \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\" (UID: \"f6d833c2-fa91-4538-a2e6-d42190e6fb9e\") " Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.870884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data" (OuterVolumeSpecName: "config-data") pod "f6d833c2-fa91-4538-a2e6-d42190e6fb9e" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.871074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f6d833c2-fa91-4538-a2e6-d42190e6fb9e" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.884533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q" (OuterVolumeSpecName: "kube-api-access-pms6q") pod "f6d833c2-fa91-4538-a2e6-d42190e6fb9e" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e"). InnerVolumeSpecName "kube-api-access-pms6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.915791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6d833c2-fa91-4538-a2e6-d42190e6fb9e" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.943781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "f6d833c2-fa91-4538-a2e6-d42190e6fb9e" (UID: "f6d833c2-fa91-4538-a2e6-d42190e6fb9e"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.972391 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.972422 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.972434 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.972443 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:46 crc kubenswrapper[5030]: I0120 23:42:46.972456 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pms6q\" (UniqueName: \"kubernetes.io/projected/f6d833c2-fa91-4538-a2e6-d42190e6fb9e-kube-api-access-pms6q\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.107334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" event={"ID":"8b42296e-9bb8-4b92-ac1f-6f62a58bb042","Type":"ContainerStarted","Data":"98fb3459638b3c9138e33d4416a9e172a726d393548444329a3a1a4c7fee114e"} Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.107379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" event={"ID":"8b42296e-9bb8-4b92-ac1f-6f62a58bb042","Type":"ContainerStarted","Data":"b2263a247334346a23757c575d761339fb92af17d180b287c6b07462fd3ac1c2"} Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.107441 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.109442 5030 generic.go:334] "Generic (PLEG): container finished" podID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" containerID="8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706" exitCode=0 Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.109973 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.111174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"f6d833c2-fa91-4538-a2e6-d42190e6fb9e","Type":"ContainerDied","Data":"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706"} Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.111222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"f6d833c2-fa91-4538-a2e6-d42190e6fb9e","Type":"ContainerDied","Data":"c10240d067c7a3f65a3b35858f4c1a56e2627e7ca5bde6937d9b7053f3faccdb"} Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.111242 5030 scope.go:117] "RemoveContainer" containerID="8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.144080 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" podStartSLOduration=2.144064191 podStartE2EDuration="2.144064191s" podCreationTimestamp="2026-01-20 23:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:47.12790332 +0000 UTC m=+4039.448163608" watchObservedRunningTime="2026-01-20 23:42:47.144064191 +0000 UTC m=+4039.464324469" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.144876 5030 scope.go:117] "RemoveContainer" containerID="8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706" Jan 20 23:42:47 crc kubenswrapper[5030]: E0120 23:42:47.145818 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706\": container with ID starting with 8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706 not found: ID does not exist" containerID="8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.145864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706"} err="failed to get container status \"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706\": rpc error: code = NotFound desc = could not find container \"8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706\": container with ID starting with 8bed77b1dd9d957a97c36677d666f529c0937e16ba1e19ff1190dce0c3a08706 not found: ID does not exist" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.157613 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.165602 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.173782 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:47 crc kubenswrapper[5030]: E0120 23:42:47.174225 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" containerName="memcached" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.174243 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" containerName="memcached" Jan 20 23:42:47 crc 
kubenswrapper[5030]: I0120 23:42:47.174441 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" containerName="memcached" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.175112 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.177380 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.180984 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-bmf9h" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.181183 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.185080 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.278518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.278563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.278601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pzr6\" (UniqueName: \"kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.278661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.278694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: W0120 23:42:47.304864 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d68c95c_4505_4a7d_bd0f_44f7cf7c0671.slice/crio-f921b8881f8cd5d93240f668646c11a8b6c0fd4be82a73102aac1109cb4acbfa WatchSource:0}: Error finding container f921b8881f8cd5d93240f668646c11a8b6c0fd4be82a73102aac1109cb4acbfa: Status 404 returned error can't find the container with id f921b8881f8cd5d93240f668646c11a8b6c0fd4be82a73102aac1109cb4acbfa Jan 
20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.304902 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.380344 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.380505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.380528 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.380565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pzr6\" (UniqueName: \"kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.380603 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.381330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.382105 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.396564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.398976 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pzr6\" (UniqueName: \"kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.399247 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.498082 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.976928 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedd0095-2878-461a-a374-8f50c0a0f040" path="/var/lib/kubelet/pods/eedd0095-2878-461a-a374-8f50c0a0f040/volumes" Jan 20 23:42:47 crc kubenswrapper[5030]: I0120 23:42:47.980538 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d833c2-fa91-4538-a2e6-d42190e6fb9e" path="/var/lib/kubelet/pods/f6d833c2-fa91-4538-a2e6-d42190e6fb9e/volumes" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.000084 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:48 crc kubenswrapper[5030]: W0120 23:42:48.012832 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0acef259_668b_4545_be1f_8f19440dd308.slice/crio-dc259b0ff49e6ca76829f12c4377bebc2b9b50a20fd52b385f4448d9d599934b WatchSource:0}: Error finding container dc259b0ff49e6ca76829f12c4377bebc2b9b50a20fd52b385f4448d9d599934b: Status 404 returned error can't find the container with id dc259b0ff49e6ca76829f12c4377bebc2b9b50a20fd52b385f4448d9d599934b Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.124442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"0acef259-668b-4545-be1f-8f19440dd308","Type":"ContainerStarted","Data":"dc259b0ff49e6ca76829f12c4377bebc2b9b50a20fd52b385f4448d9d599934b"} Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.126726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671","Type":"ContainerStarted","Data":"e29d468a1a3038d9e8d3fea4953bb329a9ca4383ab0ab779841c9af0753ef204"} Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.126762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671","Type":"ContainerStarted","Data":"f921b8881f8cd5d93240f668646c11a8b6c0fd4be82a73102aac1109cb4acbfa"} Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.149771 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.149751369 podStartE2EDuration="2.149751369s" podCreationTimestamp="2026-01-20 23:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:48.147902493 +0000 UTC m=+4040.468162801" watchObservedRunningTime="2026-01-20 23:42:48.149751369 +0000 UTC m=+4040.470011657" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.275494 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.229:9292/healthcheck\": read tcp 
10.217.0.2:33968->10.217.0.229:9292: read: connection reset by peer" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.275874 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.229:9292/healthcheck\": read tcp 10.217.0.2:33970->10.217.0.229:9292: read: connection reset by peer" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.456872 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.230:9292/healthcheck\": read tcp 10.217.0.2:50102->10.217.0.230:9292: read: connection reset by peer" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.456933 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.230:9292/healthcheck\": read tcp 10.217.0.2:50096->10.217.0.230:9292: read: connection reset by peer" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.673074 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.717709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.717803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.717902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.717953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.717991 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.718021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rxw\" (UniqueName: \"kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: 
\"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.718099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.718130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\" (UID: \"19f669e8-7127-4a91-ae58-2e88b3bb7d6e\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.718800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.719073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs" (OuterVolumeSpecName: "logs") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.740238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.740377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts" (OuterVolumeSpecName: "scripts") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.754699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.755164 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw" (OuterVolumeSpecName: "kube-api-access-k7rxw") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "kube-api-access-k7rxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.786978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.808669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data" (OuterVolumeSpecName: "config-data") pod "19f669e8-7127-4a91-ae58-2e88b3bb7d6e" (UID: "19f669e8-7127-4a91-ae58-2e88b3bb7d6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821304 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821332 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821343 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821352 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821364 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rxw\" (UniqueName: \"kubernetes.io/projected/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-kube-api-access-k7rxw\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821374 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821401 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.821413 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19f669e8-7127-4a91-ae58-2e88b3bb7d6e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.840868 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.862192 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.878314 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.878588 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-central-agent" containerID="cri-o://66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c" gracePeriod=30 Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.878736 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="proxy-httpd" containerID="cri-o://1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4" gracePeriod=30 Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.878783 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="sg-core" containerID="cri-o://90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431" gracePeriod=30 Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.878819 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-notification-agent" containerID="cri-o://62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac" gracePeriod=30 Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922152 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs" (OuterVolumeSpecName: "logs") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs49g\" (UniqueName: \"kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.922914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts\") pod \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\" (UID: \"20ed422e-24f7-4373-8bd9-e32a8e086a0c\") " Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.923327 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.923344 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/20ed422e-24f7-4373-8bd9-e32a8e086a0c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.923354 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.925105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.926284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts" (OuterVolumeSpecName: "scripts") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.926509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g" (OuterVolumeSpecName: "kube-api-access-gs49g") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "kube-api-access-gs49g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.948377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.968766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data" (OuterVolumeSpecName: "config-data") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:48 crc kubenswrapper[5030]: I0120 23:42:48.977488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "20ed422e-24f7-4373-8bd9-e32a8e086a0c" (UID: "20ed422e-24f7-4373-8bd9-e32a8e086a0c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.024973 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.025560 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.025637 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.025708 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.025768 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs49g\" (UniqueName: \"kubernetes.io/projected/20ed422e-24f7-4373-8bd9-e32a8e086a0c-kube-api-access-gs49g\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.025821 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ed422e-24f7-4373-8bd9-e32a8e086a0c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.042846 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.127854 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.144554 5030 generic.go:334] "Generic (PLEG): container finished" podID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerID="d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955" exitCode=0 Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.144657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerDied","Data":"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.144720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"19f669e8-7127-4a91-ae58-2e88b3bb7d6e","Type":"ContainerDied","Data":"8e09247cc3cc4b667a1de46e9091bc6a5a00952b1850ee813130e1e2f255317e"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.144738 5030 scope.go:117] "RemoveContainer" containerID="d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.144675 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.148750 5030 generic.go:334] "Generic (PLEG): container finished" podID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerID="89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d" exitCode=0 Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.148818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerDied","Data":"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.148840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"20ed422e-24f7-4373-8bd9-e32a8e086a0c","Type":"ContainerDied","Data":"326d6570fa1ab3a89274a3b0133d8fcd95b9d50d171e5e7bf75c912a7e440960"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.148848 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.151712 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerID="0209a5dfb0c62f1e66ec249e44fd294c019db9b9470a7115b6112a0bd6656979" exitCode=0 Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.151776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerDied","Data":"0209a5dfb0c62f1e66ec249e44fd294c019db9b9470a7115b6112a0bd6656979"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.155500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"0acef259-668b-4545-be1f-8f19440dd308","Type":"ContainerStarted","Data":"0ff3a171627e7737463b58bcc435f1f2f3c9457442d5db62279215da4a11a740"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.159706 5030 generic.go:334] "Generic (PLEG): container finished" podID="152c5e6b-d331-4a98-af8d-7e4667624551" containerID="1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4" exitCode=0 Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.159752 5030 generic.go:334] "Generic (PLEG): container finished" podID="152c5e6b-d331-4a98-af8d-7e4667624551" containerID="90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431" exitCode=2 Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.162222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerDied","Data":"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.162255 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerDied","Data":"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431"} Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.192030 5030 scope.go:117] "RemoveContainer" containerID="577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.228505 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.247492 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.258344 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.258848 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.258865 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.258915 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.258924 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.258935 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.258943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.258961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.258968 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.259171 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.259191 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.259201 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" containerName="glance-httpd" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.259220 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" containerName="glance-log" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.260278 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.262111 5030 scope.go:117] "RemoveContainer" containerID="d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.262968 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-fm8zv" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263200 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.263287 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955\": container with ID starting with d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955 not found: ID does not exist" containerID="d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263328 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955"} err="failed to get container status \"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955\": rpc error: code = NotFound desc = could not find container \"d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955\": container with ID starting with d3091f37569897e9b88ec5d78c920009f5c04f92e89d358d4a768fe3e3188955 not found: ID does not exist" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263354 5030 scope.go:117] "RemoveContainer" containerID="577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263376 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263584 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.263739 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825\": container with ID starting with 577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825 not found: ID does not exist" containerID="577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263817 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825"} err="failed to get container status \"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825\": rpc error: code = NotFound desc = could not find container \"577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825\": container with ID starting with 577c0b1d3c02aa73ef6c0f40cb4e260f2080808352a81286f14d239c1a159825 not found: ID does not exist" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.263881 5030 scope.go:117] "RemoveContainer" containerID="89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.275534 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.275511338 podStartE2EDuration="2.275511338s" podCreationTimestamp="2026-01-20 23:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:49.223161788 +0000 UTC m=+4041.543422086" watchObservedRunningTime="2026-01-20 23:42:49.275511338 +0000 UTC m=+4041.595771626" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.295576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.303758 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.311677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.319098 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.321817 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.323444 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.323741 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.326303 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331160 5030 scope.go:117] "RemoveContainer" containerID="d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331329 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331527 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgh7c\" (UniqueName: \"kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.331672 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.358798 5030 scope.go:117] "RemoveContainer" containerID="89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.362064 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d\": container with ID starting with 89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d not found: ID does not exist" containerID="89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.362114 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d"} err="failed to get container status \"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d\": rpc error: code = NotFound desc = could not find container \"89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d\": container with ID starting with 89f942718707120126f9bb556042a383baa6956e7cf2185a56b93a5da399dd9d not found: ID does not exist" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.362140 5030 scope.go:117] "RemoveContainer" containerID="d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f" Jan 20 23:42:49 crc kubenswrapper[5030]: E0120 23:42:49.362759 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f\": container with ID starting with d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f not found: ID does not exist" containerID="d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.362780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f"} err="failed to get container status \"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f\": rpc error: code = NotFound desc = could not find container \"d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f\": container with ID starting with d3eaeefe69d0cb25858dfc5c754c96ff130ce1e3dfac3894928e03469d40a64f not found: ID does not exist" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b7h\" (UniqueName: \"kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgh7c\" (UniqueName: \"kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433938 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.433974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434468 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.434940 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.438764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.439323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.440328 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.440789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.452007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgh7c\" (UniqueName: \"kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.466138 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536905 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b7h\" (UniqueName: \"kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536935 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536939 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.536997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.537059 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.537481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.537663 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.540851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.540908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.540915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.542501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.551540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7b7h\" (UniqueName: \"kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.573581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.611052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.649544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.972016 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f669e8-7127-4a91-ae58-2e88b3bb7d6e" path="/var/lib/kubelet/pods/19f669e8-7127-4a91-ae58-2e88b3bb7d6e/volumes" Jan 20 23:42:49 crc kubenswrapper[5030]: I0120 23:42:49.973191 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ed422e-24f7-4373-8bd9-e32a8e086a0c" path="/var/lib/kubelet/pods/20ed422e-24f7-4373-8bd9-e32a8e086a0c/volumes" Jan 20 23:42:50 crc kubenswrapper[5030]: W0120 23:42:50.071090 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd0447c_b834_4a95_b1d8_35eb86c7de95.slice/crio-c00f6f81562756ec701624ac718043ffdafc3186e407b94febbb51cee2d64580 WatchSource:0}: Error finding container c00f6f81562756ec701624ac718043ffdafc3186e407b94febbb51cee2d64580: Status 404 returned error can't find the container with id c00f6f81562756ec701624ac718043ffdafc3186e407b94febbb51cee2d64580 Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.073601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.150829 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:50 crc kubenswrapper[5030]: W0120 23:42:50.159281 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe4aef7_888e_40d3_b1d9_f54b766ae91e.slice/crio-8d94757d5b406f505cd18c696f7bc39137c0a97ef31aea931015c57110d0b133 WatchSource:0}: Error finding container 8d94757d5b406f505cd18c696f7bc39137c0a97ef31aea931015c57110d0b133: Status 404 returned error can't find the container with id 8d94757d5b406f505cd18c696f7bc39137c0a97ef31aea931015c57110d0b133 Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.179924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerStarted","Data":"bee0947fe7fe657b7ed53ea384f8fcf3e0f5fc524636231fa1f7cd941b3366f5"} Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.183010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerStarted","Data":"c00f6f81562756ec701624ac718043ffdafc3186e407b94febbb51cee2d64580"} Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.187783 5030 generic.go:334] "Generic (PLEG): container finished" podID="152c5e6b-d331-4a98-af8d-7e4667624551" containerID="66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c" exitCode=0 Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.187849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerDied","Data":"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c"} Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.189199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerStarted","Data":"8d94757d5b406f505cd18c696f7bc39137c0a97ef31aea931015c57110d0b133"} Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.191486 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.207737 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.207719841 podStartE2EDuration="6.207719841s" podCreationTimestamp="2026-01-20 23:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:50.203747006 +0000 UTC m=+4042.524007304" watchObservedRunningTime="2026-01-20 23:42:50.207719841 +0000 UTC m=+4042.527980119" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.676157 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.822121 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.822237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832188 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scsnq\" (UniqueName: \"kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle\") pod 
\"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.832430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml\") pod \"152c5e6b-d331-4a98-af8d-7e4667624551\" (UID: \"152c5e6b-d331-4a98-af8d-7e4667624551\") " Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.833888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.834967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.835133 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.854131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq" (OuterVolumeSpecName: "kube-api-access-scsnq") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "kube-api-access-scsnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.859306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts" (OuterVolumeSpecName: "scripts") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.881732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.912892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.935161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936491 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936521 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scsnq\" (UniqueName: \"kubernetes.io/projected/152c5e6b-d331-4a98-af8d-7e4667624551-kube-api-access-scsnq\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936534 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/152c5e6b-d331-4a98-af8d-7e4667624551-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936545 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936554 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.936562 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:50 crc kubenswrapper[5030]: I0120 23:42:50.940372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data" (OuterVolumeSpecName: "config-data") pod "152c5e6b-d331-4a98-af8d-7e4667624551" (UID: "152c5e6b-d331-4a98-af8d-7e4667624551"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.038230 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152c5e6b-d331-4a98-af8d-7e4667624551-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.203783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerStarted","Data":"fe2e23d68370c4c00564a676e4d3e1d4904e1c03bc91bd3f25aa636e081afe21"} Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.206929 5030 generic.go:334] "Generic (PLEG): container finished" podID="152c5e6b-d331-4a98-af8d-7e4667624551" containerID="62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac" exitCode=0 Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.206994 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.207010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerDied","Data":"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac"} Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.207047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"152c5e6b-d331-4a98-af8d-7e4667624551","Type":"ContainerDied","Data":"12bb229b9e7d349841bd6e5af687f72514c2632d9fdf950e882189d8b4c16ed2"} Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.207064 5030 scope.go:117] "RemoveContainer" containerID="1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.216673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerStarted","Data":"013b095f32b6d6c23f8213aecd5cbd8dca48c4f0173ec8268ad0eeda1644224f"} Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.259686 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.281169 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.304255 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.304738 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="proxy-httpd" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.304791 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="proxy-httpd" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.304812 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="sg-core" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.304819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="sg-core" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.304832 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-central-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.304838 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-central-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.304868 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-notification-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.304876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-notification-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.305065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="sg-core" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.305083 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="proxy-httpd" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.305098 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-notification-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.305110 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" containerName="ceilometer-central-agent" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.307277 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.312637 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.312825 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.312883 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.314366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.322290 5030 scope.go:117] "RemoveContainer" containerID="90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.360260 5030 scope.go:117] "RemoveContainer" containerID="62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.448848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.448979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xdc4\" (UniqueName: \"kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.449151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.465394 5030 scope.go:117] "RemoveContainer" containerID="66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.483342 5030 scope.go:117] "RemoveContainer" containerID="1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.486183 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4\": container with ID starting with 1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4 not found: ID does not exist" containerID="1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.486232 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4"} err="failed to get container status \"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4\": rpc error: code = NotFound desc = could not find container \"1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4\": container with ID starting with 1433c94928d680a85d1fee2bdd2908ad73a37317b65f71e0ebac6101c8dd81b4 not found: ID does not exist" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.486262 5030 scope.go:117] "RemoveContainer" containerID="90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.486692 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431\": container with ID starting with 90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431 not found: ID does not exist" containerID="90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.486746 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431"} err="failed to get container status \"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431\": rpc error: code = NotFound desc = could not find container \"90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431\": container with ID starting with 90dc073d46f7c4b78f10acfaaa4fcbd5333cb0e5ac3d78d615530c8639e62431 not found: ID does not exist" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.486781 5030 scope.go:117] "RemoveContainer" containerID="62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.487115 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac\": container with ID starting with 62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac not found: ID does not exist" containerID="62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.487147 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac"} err="failed to get container status \"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac\": rpc error: code = NotFound desc = could not find container \"62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac\": container with ID starting with 62ac49d41b5b1a662239abb22332a64b85a61c71f355decbf19689965a3507ac not found: ID does not exist" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.487167 5030 scope.go:117] "RemoveContainer" containerID="66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c" Jan 20 23:42:51 crc kubenswrapper[5030]: E0120 23:42:51.487446 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c\": container with ID starting with 66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c not found: ID does not exist" containerID="66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.487472 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c"} err="failed to get container status \"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c\": rpc error: code = NotFound desc = could not find container \"66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c\": container with ID starting with 66b51a5ee5c65fde67dd7d3d08fce102a5471a432ed1a233ce9cd1c2b86d060c not found: ID does not exist" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xdc4\" (UniqueName: 
\"kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551553 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.551672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.552033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.552186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.552270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.552273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.555888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.556053 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.556521 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.556709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.558067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.568372 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xdc4\" (UniqueName: \"kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4\") pod \"ceilometer-0\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.648671 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.837909 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:51 crc kubenswrapper[5030]: I0120 23:42:51.972663 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="152c5e6b-d331-4a98-af8d-7e4667624551" path="/var/lib/kubelet/pods/152c5e6b-d331-4a98-af8d-7e4667624551/volumes" Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.090518 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:42:52 crc kubenswrapper[5030]: W0120 23:42:52.095544 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb002bd94_4e47_49ea_9ce8_5af565ddb6ed.slice/crio-ac3292ce89ca97c48cb84b6f0a40de5fbc7b1ecfac103534780f495831dd49ce WatchSource:0}: Error finding container ac3292ce89ca97c48cb84b6f0a40de5fbc7b1ecfac103534780f495831dd49ce: Status 404 returned error can't find the container with id ac3292ce89ca97c48cb84b6f0a40de5fbc7b1ecfac103534780f495831dd49ce Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.103108 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.226534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerStarted","Data":"8622c0b6fced3f26178c491c88d209f96cfe8f2ccbe0ba24a807c4a3dd340c2a"} Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.234359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerStarted","Data":"54b74467f9994bbd83a66a95be59795207df4b7b649af57287e6019ecd04da3f"} Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.235547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerStarted","Data":"ac3292ce89ca97c48cb84b6f0a40de5fbc7b1ecfac103534780f495831dd49ce"} Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.249278 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.2492424460000002 podStartE2EDuration="3.249242446s" podCreationTimestamp="2026-01-20 23:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:52.248513178 +0000 UTC m=+4044.568773477" watchObservedRunningTime="2026-01-20 23:42:52.249242446 +0000 UTC m=+4044.569502734" Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.256300 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.277527 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.277507912 podStartE2EDuration="3.277507912s" podCreationTimestamp="2026-01-20 23:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:42:52.271014874 +0000 UTC m=+4044.591275162" watchObservedRunningTime="2026-01-20 23:42:52.277507912 +0000 UTC m=+4044.597768220" Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.388165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.451645 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.451880 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api-log" containerID="cri-o://5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497" gracePeriod=30 Jan 20 23:42:52 crc kubenswrapper[5030]: I0120 23:42:52.452304 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api" containerID="cri-o://3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816" gracePeriod=30 Jan 20 23:42:53 crc kubenswrapper[5030]: I0120 23:42:53.264736 5030 generic.go:334] "Generic (PLEG): container finished" podID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerID="5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497" exitCode=143 Jan 20 23:42:53 crc kubenswrapper[5030]: I0120 23:42:53.265237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerDied","Data":"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497"} Jan 20 23:42:53 crc kubenswrapper[5030]: I0120 23:42:53.271222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerStarted","Data":"2aba22d12b4a8082bd872a965b953151609964a03b4ad9582d03b72dab943a2e"} Jan 20 23:42:54 crc kubenswrapper[5030]: I0120 23:42:54.282331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerStarted","Data":"c1491fd116b61462ffe9d4c787b8ac9b46c8b5380cd45cc8fd9896b43d420535"} Jan 20 23:42:54 crc kubenswrapper[5030]: I0120 23:42:54.282770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerStarted","Data":"12091bb4362b85d843ea5295d417371e1ca04650ec43156c0521deaaa0286f2d"} Jan 20 23:42:54 crc kubenswrapper[5030]: I0120 23:42:54.955907 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:54 crc kubenswrapper[5030]: I0120 23:42:54.955954 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:55 crc kubenswrapper[5030]: I0120 23:42:55.048481 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:55 crc kubenswrapper[5030]: I0120 23:42:55.398424 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:42:55 crc kubenswrapper[5030]: I0120 23:42:55.609642 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.212:9311/healthcheck\": read tcp 10.217.0.2:36198->10.217.0.212:9311: read: connection reset by peer" Jan 20 23:42:55 crc kubenswrapper[5030]: I0120 23:42:55.609753 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.212:9311/healthcheck\": read tcp 10.217.0.2:36196->10.217.0.212:9311: read: connection reset by peer" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.027430 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147530 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47j47\" (UniqueName: \"kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.147701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs\") pod \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\" (UID: \"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d\") " 
Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.148530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs" (OuterVolumeSpecName: "logs") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.155755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.157795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47" (OuterVolumeSpecName: "kube-api-access-47j47") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "kube-api-access-47j47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.181519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.211560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.218926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.218996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data" (OuterVolumeSpecName: "config-data") pod "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" (UID: "54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.249962 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.249992 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.250002 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.250013 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.250023 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.250032 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.250042 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47j47\" (UniqueName: \"kubernetes.io/projected/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d-kube-api-access-47j47\") on node \"crc\" DevicePath \"\"" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.306204 5030 generic.go:334] "Generic (PLEG): container finished" podID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerID="3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816" exitCode=0 Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.306263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerDied","Data":"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816"} Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.306289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" event={"ID":"54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d","Type":"ContainerDied","Data":"3e92041344bf2c8c86a9132f7d9aaaf809ef3136b22a0f4e7d195f8186abd2fa"} Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.306307 5030 scope.go:117] "RemoveContainer" containerID="3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.306429 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.316990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerStarted","Data":"70ffc14e50637690db7989fdf98987e7bba8872dcb444eceda7cc13290c16cde"} Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.337233 5030 scope.go:117] "RemoveContainer" containerID="5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.348243 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.664275285 podStartE2EDuration="5.348220422s" podCreationTimestamp="2026-01-20 23:42:51 +0000 UTC" firstStartedPulling="2026-01-20 23:42:52.102882876 +0000 UTC m=+4044.423143164" lastFinishedPulling="2026-01-20 23:42:55.786827993 +0000 UTC m=+4048.107088301" observedRunningTime="2026-01-20 23:42:56.338434934 +0000 UTC m=+4048.658695222" watchObservedRunningTime="2026-01-20 23:42:56.348220422 +0000 UTC m=+4048.668480710" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.364136 5030 scope.go:117] "RemoveContainer" containerID="3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816" Jan 20 23:42:56 crc kubenswrapper[5030]: E0120 23:42:56.365482 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816\": container with ID starting with 3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816 not found: ID does not exist" containerID="3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.365516 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816"} err="failed to get container status \"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816\": rpc error: code = NotFound desc = could not find container \"3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816\": container with ID starting with 3347c314b0e756e3b221978cc6e13748d59b6b6fd17aa2b01e4f18a3697a0816 not found: ID does not exist" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.365536 5030 scope.go:117] "RemoveContainer" containerID="5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497" Jan 20 23:42:56 crc kubenswrapper[5030]: E0120 23:42:56.366808 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497\": container with ID starting with 5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497 not found: ID does not exist" containerID="5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.366859 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497"} err="failed to get container status \"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497\": rpc error: code = NotFound desc = could not find container \"5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497\": container 
with ID starting with 5622e08e8950e98a8739de3b9233a0d993a1898abb24917da33b851510e57497 not found: ID does not exist" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.368234 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.377380 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5646794dd4-m2p4g"] Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.838255 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:56 crc kubenswrapper[5030]: I0120 23:42:56.869061 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.326494 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.349468 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.500788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.589551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.590253 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-api" containerID="cri-o://6046a846fcbf16047a808dd6d90fc3bd12599a6aad8d1c25fcb9bea953decf51" gracePeriod=30 Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.590537 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-log" containerID="cri-o://cb2f0c3b8501093aaa0e433ff2ce6a327de9731da2f84e935851d94250d96e84" gracePeriod=30 Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.855665 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:42:57 crc kubenswrapper[5030]: E0120 23:42:57.856020 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api-log" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.856033 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api-log" Jan 20 23:42:57 crc kubenswrapper[5030]: E0120 23:42:57.856065 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.856072 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.856242 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.856261 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" containerName="barbican-api-log" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.857130 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.877900 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xfcs\" (UniqueName: \"kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982672 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982871 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982931 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.982992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:57 crc kubenswrapper[5030]: I0120 23:42:57.987407 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d" 
path="/var/lib/kubelet/pods/54187d8e-e7c4-4b8e-bfd7-55e29a3b8b1d/volumes" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.084690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.084765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.084799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.084888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.084933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xfcs\" (UniqueName: \"kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.085000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.085073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.144590 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.144703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " 
pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.144749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.144779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.144887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.145458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.147779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xfcs\" (UniqueName: \"kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs\") pod \"neutron-76766d6986-2jghb\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.174781 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.337131 5030 generic.go:334] "Generic (PLEG): container finished" podID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerID="cb2f0c3b8501093aaa0e433ff2ce6a327de9731da2f84e935851d94250d96e84" exitCode=143 Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.337354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerDied","Data":"cb2f0c3b8501093aaa0e433ff2ce6a327de9731da2f84e935851d94250d96e84"} Jan 20 23:42:58 crc kubenswrapper[5030]: I0120 23:42:58.621227 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:42:58 crc kubenswrapper[5030]: W0120 23:42:58.627411 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ea6b77a_5e8c_4afd_ba36_5ae9ce25f208.slice/crio-8132c0095ef59d931f51b86dfbb0a1b1491d41fdef6d5764ed61d6d71caa1694 WatchSource:0}: Error finding container 8132c0095ef59d931f51b86dfbb0a1b1491d41fdef6d5764ed61d6d71caa1694: Status 404 returned error can't find the container with id 8132c0095ef59d931f51b86dfbb0a1b1491d41fdef6d5764ed61d6d71caa1694 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.272836 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.276922 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.306115 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.306402 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="probe" containerID="cri-o://e0d56ff4302d57b4640e3a21a93764cd9ca8e599cfa8633aa74cfa5710979040" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.306484 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" containerID="cri-o://20c4a921161139dd7e23f659cc47617ba1e894ddbac491c152cd76fd4f4e85b4" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.326911 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.366888 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.367492 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="openstack-network-exporter" containerID="cri-o://8a6b141363970fec4afba83162c49db3275e773739115c07b31346e1c8e613cb" gracePeriod=300 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.385158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" 
event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerStarted","Data":"6209a3e89aa3986aa1aed1097e852e2687fad2453d971f8772348cd0527da462"} Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.385198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerStarted","Data":"67d37a9bdbb07180a5c6d24417922339a86ca098de85f8d76f671c4e93136767"} Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.385209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerStarted","Data":"8132c0095ef59d931f51b86dfbb0a1b1491d41fdef6d5764ed61d6d71caa1694"} Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.386399 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.391949 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.392310 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="0acef259-668b-4545-be1f-8f19440dd308" containerName="memcached" containerID="cri-o://0ff3a171627e7737463b58bcc435f1f2f3c9457442d5db62279215da4a11a740" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.411265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.411889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkngv\" (UniqueName: \"kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.411975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.412274 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.412380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: 
\"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.463930 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.464347 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" containerID="cri-o://c482b68ed01591be35af0242d8124f86faa81f2474d8a0b10508f5b0e77c62dc" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.464757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" containerID="cri-o://a73c97b11afcbc04aed83e51bf767bd0510faa92f7acfad1d7f35ffe02c51dc2" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.491190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.491565 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e29d468a1a3038d9e8d3fea4953bb329a9ca4383ab0ab779841c9af0753ef204" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.512985 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.514772 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.516194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.519966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.525727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkngv\" (UniqueName: \"kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.525828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.525915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.526076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.548008 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.596761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.597206 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-log" containerID="cri-o://013b095f32b6d6c23f8213aecd5cbd8dca48c4f0173ec8268ad0eeda1644224f" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.597611 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-httpd" 
containerID="cri-o://54b74467f9994bbd83a66a95be59795207df4b7b649af57287e6019ecd04da3f" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.615485 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.626939 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.645509 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.646041 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" containerID="cri-o://d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.646483 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="openstack-network-exporter" containerID="cri-o://333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.661844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.661906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt82h\" (UniqueName: \"kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.661965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.661994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.662069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.691227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkngv\" (UniqueName: \"kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.702283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.705296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.717330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom\") pod \"barbican-worker-86c9cf7bbf-qpwgc\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.757467 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" podStartSLOduration=2.757447445 podStartE2EDuration="2.757447445s" podCreationTimestamp="2026-01-20 23:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:42:59.429932 +0000 UTC m=+4051.750192288" watchObservedRunningTime="2026-01-20 23:42:59.757447445 +0000 UTC m=+4052.077707733" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.799121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.799196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt82h\" (UniqueName: \"kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.799296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.799327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.799403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.801028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.817989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.818901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.825270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.843282 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.852234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt82h\" (UniqueName: \"kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h\") pod \"barbican-keystone-listener-5bc7cb6786-4jbxx\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.866697 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.866950 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" containerID="cri-o://1b5d6e7723fc432f59bb152ee5c44c0e160790b16eb10a41a085b3d039712a96" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.867735 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" containerID="cri-o://4c1481e7f538bbd813fc066cbdf4b164b9a4610f8972b882bebb9408284fb510" gracePeriod=30 Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.881389 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.893969 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.907227 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.944854 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.946767 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:42:59 crc kubenswrapper[5030]: I0120 23:42:59.954854 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.004850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.004935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw4l9\" (UniqueName: \"kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.004960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.005007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.005022 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.005040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.005058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.014706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.015352 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.021684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.021962 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerName="keystone-api" containerID="cri-o://98fb3459638b3c9138e33d4416a9e172a726d393548444329a3a1a4c7fee114e" gracePeriod=30 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.032289 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.058791 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.068049 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.8:5000/v3\": EOF" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.068288 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.069780 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.075976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.076152 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="ovsdbserver-nb" containerID="cri-o://4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" gracePeriod=300 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.096688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.096993 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api-log" containerID="cri-o://8b167ca61f8e9969a9c5b4773f26b86a04c25264b7d286b0bb38c376ba36ecef" gracePeriod=30 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.097108 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api" containerID="cri-o://590d9e7afd81121f20e8b3c4a10aa9c029280098e2371ecd81d6f475c6e6a932" gracePeriod=30 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jszzs\" (UniqueName: \"kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw4l9\" (UniqueName: \"kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " 
pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.111990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112108 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112130 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.112435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data\") pod 
\"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.114677 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.114713 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.115262 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="openstack-network-exporter" containerID="cri-o://d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73" gracePeriod=300 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.138433 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.139551 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.151126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: E0120 23:43:00.151393 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5 is running failed: container process not found" containerID="4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.152159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.160134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: E0120 23:43:00.160203 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5 is running failed: container process not found" containerID="4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.160303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: E0120 23:43:00.164301 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5 is running failed: container process not found" containerID="4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:43:00 crc kubenswrapper[5030]: E0120 23:43:00.164348 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="ovsdbserver-nb" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.166330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.170689 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.172574 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.175924 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.190610 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.198331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw4l9\" (UniqueName: \"kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9\") pod \"barbican-api-5bd4ddf7c6-tfql8\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216496 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jszzs\" (UniqueName: \"kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216525 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv4wn\" (UniqueName: \"kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.216763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " 
pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.224006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.224117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.224281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.234868 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.244994 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.263218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jszzs\" (UniqueName: \"kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.270648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle\") pod \"placement-5ffbcd4fdb-84hxv\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.313177 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="ovsdbserver-sb" containerID="cri-o://70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708" gracePeriod=300 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319648 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 
crc kubenswrapper[5030]: I0120 23:43:00.319724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319809 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv4wn\" (UniqueName: \"kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.319945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " 
pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.320007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.320047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.320068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6ft\" (UniqueName: \"kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.320092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.320107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.329299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.332332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.335230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.343028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " 
pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.343472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.348329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.350099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.365308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv4wn\" (UniqueName: \"kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn\") pod \"keystone-6f9877d997-jdh5f\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6ft\" (UniqueName: \"kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " 
pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.429392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.444940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.449692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.452649 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6ft\" (UniqueName: \"kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.453162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.454342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.463101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.464386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config\") pod \"neutron-5b454f47f8-kbjj4\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc 
kubenswrapper[5030]: I0120 23:43:00.464919 5030 generic.go:334] "Generic (PLEG): container finished" podID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerID="1b5d6e7723fc432f59bb152ee5c44c0e160790b16eb10a41a085b3d039712a96" exitCode=143 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.464960 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerDied","Data":"1b5d6e7723fc432f59bb152ee5c44c0e160790b16eb10a41a085b3d039712a96"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.516081 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_be7d3731-2d0f-4fcc-8edf-424ffe462724/ovsdbserver-sb/0.log" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.516116 5030 generic.go:334] "Generic (PLEG): container finished" podID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerID="d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73" exitCode=2 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.516165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerDied","Data":"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.553663 5030 generic.go:334] "Generic (PLEG): container finished" podID="425b4c25-fdad-470a-b462-188b573d359a" containerID="8b167ca61f8e9969a9c5b4773f26b86a04c25264b7d286b0bb38c376ba36ecef" exitCode=143 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.553731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerDied","Data":"8b167ca61f8e9969a9c5b4773f26b86a04c25264b7d286b0bb38c376ba36ecef"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.563004 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="galera" containerID="cri-o://626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" gracePeriod=30 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.567421 5030 generic.go:334] "Generic (PLEG): container finished" podID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerID="54b74467f9994bbd83a66a95be59795207df4b7b649af57287e6019ecd04da3f" exitCode=0 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.567443 5030 generic.go:334] "Generic (PLEG): container finished" podID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerID="013b095f32b6d6c23f8213aecd5cbd8dca48c4f0173ec8268ad0eeda1644224f" exitCode=143 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.567491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerDied","Data":"54b74467f9994bbd83a66a95be59795207df4b7b649af57287e6019ecd04da3f"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.567517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerDied","Data":"013b095f32b6d6c23f8213aecd5cbd8dca48c4f0173ec8268ad0eeda1644224f"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.587868 5030 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="galera" containerID="cri-o://bee0947fe7fe657b7ed53ea384f8fcf3e0f5fc524636231fa1f7cd941b3366f5" gracePeriod=30 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.599945 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_30def4ae-b574-406f-ac55-1fbae852b7f8/ovsdbserver-nb/0.log" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.599988 5030 generic.go:334] "Generic (PLEG): container finished" podID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerID="8a6b141363970fec4afba83162c49db3275e773739115c07b31346e1c8e613cb" exitCode=2 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.600003 5030 generic.go:334] "Generic (PLEG): container finished" podID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerID="4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" exitCode=143 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.600061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerDied","Data":"8a6b141363970fec4afba83162c49db3275e773739115c07b31346e1c8e613cb"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.600086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerDied","Data":"4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.633342 5030 generic.go:334] "Generic (PLEG): container finished" podID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerID="e0d56ff4302d57b4640e3a21a93764cd9ca8e599cfa8633aa74cfa5710979040" exitCode=0 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.633417 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerDied","Data":"e0d56ff4302d57b4640e3a21a93764cd9ca8e599cfa8633aa74cfa5710979040"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.659357 5030 generic.go:334] "Generic (PLEG): container finished" podID="85f32108-760c-4027-8c6d-d1b761a3305d" containerID="c482b68ed01591be35af0242d8124f86faa81f2474d8a0b10508f5b0e77c62dc" exitCode=143 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.659437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerDied","Data":"c482b68ed01591be35af0242d8124f86faa81f2474d8a0b10508f5b0e77c62dc"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.690759 5030 generic.go:334] "Generic (PLEG): container finished" podID="87ed5aae-25b3-475c-b086-4ef865d35368" containerID="333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414" exitCode=2 Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.691690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerDied","Data":"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414"} Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.692398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:00 crc kubenswrapper[5030]: 
I0120 23:43:00.692415 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.692846 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.717721 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.727944 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.746433 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.920351 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_30def4ae-b574-406f-ac55-1fbae852b7f8/ovsdbserver-nb/0.log" Jan 20 23:43:00 crc kubenswrapper[5030]: I0120 23:43:00.920431 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.047567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048603 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.048705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7l9s\" (UniqueName: \"kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s\") pod \"30def4ae-b574-406f-ac55-1fbae852b7f8\" (UID: \"30def4ae-b574-406f-ac55-1fbae852b7f8\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.051143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts" (OuterVolumeSpecName: "scripts") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.053800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.054214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config" (OuterVolumeSpecName: "config") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.095066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.096894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s" (OuterVolumeSpecName: "kube-api-access-n7l9s") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "kube-api-access-n7l9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.103110 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_be7d3731-2d0f-4fcc-8edf-424ffe462724/ovsdbserver-sb/0.log" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.113802 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.170410 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.170440 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.170487 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.170499 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30def4ae-b574-406f-ac55-1fbae852b7f8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.170509 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7l9s\" (UniqueName: \"kubernetes.io/projected/30def4ae-b574-406f-ac55-1fbae852b7f8-kube-api-access-n7l9s\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.219531 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.239477 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.239928 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" containerName="nova-cell1-conductor-conductor" containerID="cri-o://dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: E0120 23:43:01.241173 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.281865 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283217 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pz2f\" (UniqueName: \"kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283403 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.283447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs\") pod \"be7d3731-2d0f-4fcc-8edf-424ffe462724\" (UID: \"be7d3731-2d0f-4fcc-8edf-424ffe462724\") " Jan 20 23:43:01 crc kubenswrapper[5030]: E0120 23:43:01.323105 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.324036 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.334060 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config" (OuterVolumeSpecName: "config") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.350151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts" (OuterVolumeSpecName: "scripts") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.358194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.392667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: E0120 23:43:01.392797 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:01 crc kubenswrapper[5030]: E0120 23:43:01.392840 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.416710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.416921 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerName="nova-scheduler-scheduler" containerID="cri-o://1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.418641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.418759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f" (OuterVolumeSpecName: "kube-api-access-6pz2f") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "kube-api-access-6pz2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7b7h\" (UniqueName: \"kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.433825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs\") pod \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\" (UID: \"afe4aef7-888e-40d3-b1d9-f54b766ae91e\") " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434235 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434246 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be7d3731-2d0f-4fcc-8edf-424ffe462724-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434256 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pz2f\" (UniqueName: 
\"kubernetes.io/projected/be7d3731-2d0f-4fcc-8edf-424ffe462724-kube-api-access-6pz2f\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434265 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434282 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.434292 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.456204 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.482058 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs" (OuterVolumeSpecName: "logs") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.502794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.518002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h" (OuterVolumeSpecName: "kube-api-access-m7b7h") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "kube-api-access-m7b7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.553840 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.553874 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4aef7-888e-40d3-b1d9-f54b766ae91e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.553895 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.553904 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7b7h\" (UniqueName: \"kubernetes.io/projected/afe4aef7-888e-40d3-b1d9-f54b766ae91e-kube-api-access-m7b7h\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.597059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts" (OuterVolumeSpecName: "scripts") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.602068 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.637321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.656458 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.656692 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.656756 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.667757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.686464 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.749109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.760030 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.760140 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.760210 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.791081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"afe4aef7-888e-40d3-b1d9-f54b766ae91e","Type":"ContainerDied","Data":"8d94757d5b406f505cd18c696f7bc39137c0a97ef31aea931015c57110d0b133"} Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.791128 5030 scope.go:117] "RemoveContainer" containerID="54b74467f9994bbd83a66a95be59795207df4b7b649af57287e6019ecd04da3f" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.791244 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.865715 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.865977 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-central-agent" containerID="cri-o://2aba22d12b4a8082bd872a965b953151609964a03b4ad9582d03b72dab943a2e" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.866361 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="proxy-httpd" containerID="cri-o://70ffc14e50637690db7989fdf98987e7bba8872dcb444eceda7cc13290c16cde" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.866409 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="sg-core" containerID="cri-o://c1491fd116b61462ffe9d4c787b8ac9b46c8b5380cd45cc8fd9896b43d420535" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.866438 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-notification-agent" containerID="cri-o://12091bb4362b85d843ea5295d417371e1ca04650ec43156c0521deaaa0286f2d" gracePeriod=30 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.873401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.878675 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_30def4ae-b574-406f-ac55-1fbae852b7f8/ovsdbserver-nb/0.log" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.878813 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.879523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"30def4ae-b574-406f-ac55-1fbae852b7f8","Type":"ContainerDied","Data":"06bea6c482c0d846d880a184c1a4ae2b0be1468e842799de009fe4b76de8dac5"} Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.886880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "30def4ae-b574-406f-ac55-1fbae852b7f8" (UID: "30def4ae-b574-406f-ac55-1fbae852b7f8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.915752 5030 scope.go:117] "RemoveContainer" containerID="013b095f32b6d6c23f8213aecd5cbd8dca48c4f0173ec8268ad0eeda1644224f" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.922749 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.939078 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerID="e29d468a1a3038d9e8d3fea4953bb329a9ca4383ab0ab779841c9af0753ef204" exitCode=0 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.939270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671","Type":"ContainerDied","Data":"e29d468a1a3038d9e8d3fea4953bb329a9ca4383ab0ab779841c9af0753ef204"} Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.953209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data" (OuterVolumeSpecName: "config-data") pod "afe4aef7-888e-40d3-b1d9-f54b766ae91e" (UID: "afe4aef7-888e-40d3-b1d9-f54b766ae91e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.958239 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.967301 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.969421 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.969449 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe4aef7-888e-40d3-b1d9-f54b766ae91e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.969460 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30def4ae-b574-406f-ac55-1fbae852b7f8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.976921 5030 generic.go:334] "Generic (PLEG): container finished" podID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerID="6046a846fcbf16047a808dd6d90fc3bd12599a6aad8d1c25fcb9bea953decf51" exitCode=0 Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.978703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.986179 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerDied","Data":"6046a846fcbf16047a808dd6d90fc3bd12599a6aad8d1c25fcb9bea953decf51"} Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.989635 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:01 crc kubenswrapper[5030]: I0120 23:43:01.990157 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerName="nova-cell0-conductor-conductor" containerID="cri-o://50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" gracePeriod=30 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.001881 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_be7d3731-2d0f-4fcc-8edf-424ffe462724/ovsdbserver-sb/0.log" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.001926 5030 generic.go:334] "Generic (PLEG): container finished" podID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerID="70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708" exitCode=143 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.002131 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-api" containerID="cri-o://67d37a9bdbb07180a5c6d24417922339a86ca098de85f8d76f671c4e93136767" gracePeriod=30 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.002422 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.002927 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-httpd" containerID="cri-o://6209a3e89aa3986aa1aed1097e852e2687fad2453d971f8772348cd0527da462" gracePeriod=30 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.003010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerDied","Data":"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708"} Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.003057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"be7d3731-2d0f-4fcc-8edf-424ffe462724","Type":"ContainerDied","Data":"f612e18174db4664ec2fc5a5fd3ee2f6c4a63c70838148e83e03dd3a54bb01a6"} Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.003238 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" containerID="cri-o://fe2e23d68370c4c00564a676e4d3e1d4904e1c03bc91bd3f25aa636e081afe21" gracePeriod=30 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.003287 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" containerID="cri-o://8622c0b6fced3f26178c491c88d209f96cfe8f2ccbe0ba24a807c4a3dd340c2a" gracePeriod=30 Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.024704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "be7d3731-2d0f-4fcc-8edf-424ffe462724" (UID: "be7d3731-2d0f-4fcc-8edf-424ffe462724"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.026926 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.11:9292/healthcheck\": EOF" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.027377 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.11:9292/healthcheck\": EOF" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.075075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data\") pod \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.075193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle\") pod \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.075254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gsrc\" (UniqueName: \"kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc\") pod \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.075314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs\") pod \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.075396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs\") pod \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\" (UID: \"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.099794 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.099907 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7d3731-2d0f-4fcc-8edf-424ffe462724-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.140884 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.141286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc" (OuterVolumeSpecName: "kube-api-access-7gsrc") pod 
"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" (UID: "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671"). InnerVolumeSpecName "kube-api-access-7gsrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.149662 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.161749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" (UID: "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169112 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169536 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169563 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169580 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-log" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169586 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-log" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169594 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="ovsdbserver-sb" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169600 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="ovsdbserver-sb" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169609 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169615 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169698 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="ovsdbserver-nb" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="ovsdbserver-nb" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169731 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-httpd" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169736 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-httpd" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.169749 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.169775 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170528 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-httpd" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170550 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170563 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="ovsdbserver-nb" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170575 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="ovsdbserver-sb" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170585 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170591 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" containerName="glance-log" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.170605 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" containerName="openstack-network-exporter" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.171675 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.176958 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.177171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.201869 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.201896 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gsrc\" (UniqueName: \"kubernetes.io/projected/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-kube-api-access-7gsrc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.203639 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.246792 5030 scope.go:117] "RemoveContainer" containerID="8a6b141363970fec4afba83162c49db3275e773739115c07b31346e1c8e613cb" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.303644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.303933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304069 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304195 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzk9\" (UniqueName: \"kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.304306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.321166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data" (OuterVolumeSpecName: "config-data") pod "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" (UID: "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.361524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" (UID: "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.406585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.406706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.406802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.406868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.406992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.407068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzk9\" (UniqueName: \"kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.407208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.407283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.407390 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 
23:43:02.407408 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.408585 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.408846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.409870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.414427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.419939 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.431195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.436270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.441410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzk9\" (UniqueName: \"kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.446732 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" (UID: "6d68c95c-4505-4a7d-bd0f-44f7cf7c0671"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.453332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.509570 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.578412 5030 scope.go:117] "RemoveContainer" containerID="4f6874d9d0b21b19757ba0154d86f3da9e658121a8f165c2e5820052bed84ad5" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.612298 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.632847 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.640761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.662825 5030 scope.go:117] "RemoveContainer" containerID="d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.668959 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.679187 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.717767 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.737697 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.738102 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-log" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.738113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-log" Jan 20 23:43:02 crc kubenswrapper[5030]: E0120 23:43:02.738129 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-api" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.738135 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-api" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.738333 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-log" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.738343 
5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" containerName="nova-api-api" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.739303 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.757068 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.757240 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-7s6fz" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.757523 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.757658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.782041 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.792723 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.813777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.816153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.821769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.822112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.822232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhk67\" (UniqueName: \"kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.822323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.822484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs\") pod \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\" (UID: \"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce\") " Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.823082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.823350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.830857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnbz7\" (UniqueName: \"kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.831119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.831313 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.831405 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.831500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.816755 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.816808 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.816846 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 
23:43:02.816862 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-m2qb9" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.826425 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs" (OuterVolumeSpecName: "logs") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.833488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.834765 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.872475 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.888790 5030 scope.go:117] "RemoveContainer" containerID="70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.890636 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67" (OuterVolumeSpecName: "kube-api-access-jhk67") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "kube-api-access-jhk67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnbz7\" (UniqueName: \"kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936866 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dgs5\" (UniqueName: \"kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.936947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937306 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.937372 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhk67\" (UniqueName: \"kubernetes.io/projected/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-kube-api-access-jhk67\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.938670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.939285 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.939763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.941633 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.961998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnbz7\" (UniqueName: \"kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.964640 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.965896 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:02 crc kubenswrapper[5030]: I0120 23:43:02.968407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.040893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.041411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.046737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dgs5\" (UniqueName: \"kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.047919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.048369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.048637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.045437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerStarted","Data":"c0d00eb7cc6e4dbc19a3aef1eac0c012a3298c1fc24fa2c33c43c98cef8bd29a"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.052495 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.067233 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1 is running failed: container process not found" containerID="626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.070785 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1 is running failed: container process not found" containerID="626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.073970 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1 is running failed: container process not found" containerID="626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.074011 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/openstack-galera-0" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.085646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.086129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.086136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.087018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dgs5\" (UniqueName: \"kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.093591 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerID="6209a3e89aa3986aa1aed1097e852e2687fad2453d971f8772348cd0527da462" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.093705 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerDied","Data":"6209a3e89aa3986aa1aed1097e852e2687fad2453d971f8772348cd0527da462"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.127074 5030 scope.go:117] "RemoveContainer" containerID="d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.133725 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73\": container with ID starting with d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73 not found: ID does not exist" containerID="d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.133769 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73"} err="failed to get container status \"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73\": rpc error: code = NotFound desc = could not find container \"d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73\": container with ID starting with d7c25cc2d81571cbaf4e3fdaed35bbbbb4f6c053387fbd0f51749a48c434bf73 not found: ID does not exist" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.133794 5030 scope.go:117] "RemoveContainer" containerID="70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.153062 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708\": container with ID starting with 70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708 not found: ID does not exist" containerID="70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.153105 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708"} err="failed to get container status \"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708\": rpc error: code = NotFound desc = could not find container \"70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708\": container with ID starting with 70478a059fca701b17d2bef4ede54b3267ff5a5ce2ad1aa06cb4b60331aeb708 not found: ID does not exist" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.154780 5030 generic.go:334] "Generic (PLEG): container finished" podID="85f32108-760c-4027-8c6d-d1b761a3305d" containerID="a73c97b11afcbc04aed83e51bf767bd0510faa92f7acfad1d7f35ffe02c51dc2" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.154914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" 
event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerDied","Data":"a73c97b11afcbc04aed83e51bf767bd0510faa92f7acfad1d7f35ffe02c51dc2"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.208176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data" (OuterVolumeSpecName: "config-data") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.227224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.238153 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.238510 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": dial tcp 10.217.0.251:8775: connect: connection refused" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.238957 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.251:8775/\": dial tcp 10.217.0.251:8775: connect: connection refused" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.255405 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.255548 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.255636 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.264072 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c184acd-d9c1-4e70-8587-ce940b980983" containerID="626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.264235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" 
event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerDied","Data":"626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.267407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.303215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"6d68c95c-4505-4a7d-bd0f-44f7cf7c0671","Type":"ContainerDied","Data":"f921b8881f8cd5d93240f668646c11a8b6c0fd4be82a73102aac1109cb4acbfa"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.303237 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.303268 5030 scope.go:117] "RemoveContainer" containerID="e29d468a1a3038d9e8d3fea4953bb329a9ca4383ab0ab779841c9af0753ef204" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.313102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.316645 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" (UID: "cb47fe73-ea40-40ce-9531-57b6a5e8e7ce"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.317020 5030 generic.go:334] "Generic (PLEG): container finished" podID="0acef259-668b-4545-be1f-8f19440dd308" containerID="0ff3a171627e7737463b58bcc435f1f2f3c9457442d5db62279215da4a11a740" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.317084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"0acef259-668b-4545-be1f-8f19440dd308","Type":"ContainerDied","Data":"0ff3a171627e7737463b58bcc435f1f2f3c9457442d5db62279215da4a11a740"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.319106 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.1:8776/healthcheck\": read tcp 10.217.0.2:49642->10.217.1.1:8776: read: connection reset by peer" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.329803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"cb47fe73-ea40-40ce-9531-57b6a5e8e7ce","Type":"ContainerDied","Data":"573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.329889 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.338287 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerID="bee0947fe7fe657b7ed53ea384f8fcf3e0f5fc524636231fa1f7cd941b3366f5" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.338480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerDied","Data":"bee0947fe7fe657b7ed53ea384f8fcf3e0f5fc524636231fa1f7cd941b3366f5"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360138 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360571 5030 generic.go:334] "Generic (PLEG): container finished" podID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerID="70ffc14e50637690db7989fdf98987e7bba8872dcb444eceda7cc13290c16cde" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360755 5030 generic.go:334] "Generic (PLEG): container finished" podID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerID="c1491fd116b61462ffe9d4c787b8ac9b46c8b5380cd45cc8fd9896b43d420535" exitCode=2 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360817 5030 generic.go:334] "Generic (PLEG): container finished" podID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerID="12091bb4362b85d843ea5295d417371e1ca04650ec43156c0521deaaa0286f2d" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerDied","Data":"70ffc14e50637690db7989fdf98987e7bba8872dcb444eceda7cc13290c16cde"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.361046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerDied","Data":"c1491fd116b61462ffe9d4c787b8ac9b46c8b5380cd45cc8fd9896b43d420535"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.361115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerDied","Data":"12091bb4362b85d843ea5295d417371e1ca04650ec43156c0521deaaa0286f2d"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.361143 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerDied","Data":"2aba22d12b4a8082bd872a965b953151609964a03b4ad9582d03b72dab943a2e"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.360868 5030 generic.go:334] "Generic (PLEG): container finished" podID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerID="2aba22d12b4a8082bd872a965b953151609964a03b4ad9582d03b72dab943a2e" exitCode=0 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.366780 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerID="fe2e23d68370c4c00564a676e4d3e1d4904e1c03bc91bd3f25aa636e081afe21" exitCode=143 Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.366833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerDied","Data":"fe2e23d68370c4c00564a676e4d3e1d4904e1c03bc91bd3f25aa636e081afe21"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.395111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerStarted","Data":"f1f39f1af8be8586d4e08109ccc15fd44076fee82dbc43273800e8dc177416bf"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.395156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerStarted","Data":"97f61c495047eacdf4c2f0f34a36d823dd546b180f254cdd46ecd1356039a624"} Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.396176 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.504508 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.674446 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.678945 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737f957c_6c28_4e40_ac90_83e5cc719f07.slice/crio-conmon-1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb47fe73_ea40_40ce_9531_57b6a5e8e7ce.slice/crio-573d728e5081c2a0cd1221f250645ab9c66bcd867a567c6da57071e922a20b6e\": RecentStats: unable to find data in memory cache]" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.682193 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.684125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.696760 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.719721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.726595 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.735819 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.757654 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.763600 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.764045 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.764078 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="mysql-bootstrap" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764086 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="mysql-bootstrap" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.764109 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0acef259-668b-4545-be1f-8f19440dd308" containerName="memcached" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764116 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0acef259-668b-4545-be1f-8f19440dd308" containerName="memcached" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.764136 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="mysql-bootstrap" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="mysql-bootstrap" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.764156 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764165 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0acef259-668b-4545-be1f-8f19440dd308" containerName="memcached" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764365 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.764376 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" containerName="galera" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.765066 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.767682 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.768010 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.768028 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.768069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.773807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.773843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.773876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.773934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.774036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle\") pod \"0acef259-668b-4545-be1f-8f19440dd308\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.774058 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs\") pod \"0acef259-668b-4545-be1f-8f19440dd308\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.774143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.774166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data\") pod \"0acef259-668b-4545-be1f-8f19440dd308\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.774190 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6klm\" (UniqueName: \"kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.776631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pzr6\" (UniqueName: \"kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6\") pod \"0acef259-668b-4545-be1f-8f19440dd308\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.776660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config\") pod \"0acef259-668b-4545-be1f-8f19440dd308\" (UID: \"0acef259-668b-4545-be1f-8f19440dd308\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.776679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.776706 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default\") pod \"ba66fb72-701e-46d6-b539-bdc036b9477f\" (UID: \"ba66fb72-701e-46d6-b539-bdc036b9477f\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.778510 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.784068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.785036 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.786018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0acef259-668b-4545-be1f-8f19440dd308" (UID: "0acef259-668b-4545-be1f-8f19440dd308"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.786591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.786927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data" (OuterVolumeSpecName: "config-data") pod "0acef259-668b-4545-be1f-8f19440dd308" (UID: "0acef259-668b-4545-be1f-8f19440dd308"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.787605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.787662 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.848799 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.860967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6" (OuterVolumeSpecName: "kube-api-access-5pzr6") pod "0acef259-668b-4545-be1f-8f19440dd308" (UID: "0acef259-668b-4545-be1f-8f19440dd308"). InnerVolumeSpecName "kube-api-access-5pzr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.871690 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879815 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879924 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xdc4\" (UniqueName: \"kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.879990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880164 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4tv\" (UniqueName: \"kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv\") pod \"4c184acd-d9c1-4e70-8587-ce940b980983\" (UID: \"4c184acd-d9c1-4e70-8587-ce940b980983\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs\") pod \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\" (UID: \"b002bd94-4e47-49ea-9ce8-5af565ddb6ed\") " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.880933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw5md\" (UniqueName: \"kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881106 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881118 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881128 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pzr6\" (UniqueName: \"kubernetes.io/projected/0acef259-668b-4545-be1f-8f19440dd308-kube-api-access-5pzr6\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881136 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0acef259-668b-4545-be1f-8f19440dd308-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881145 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881155 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.881163 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba66fb72-701e-46d6-b539-bdc036b9477f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.882169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.882777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.883179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.883227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.883842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.885614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893324 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.893715 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-notification-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893730 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-notification-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.893750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-central-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893756 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-central-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.893766 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="sg-core" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893772 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="sg-core" Jan 20 23:43:03 crc kubenswrapper[5030]: E0120 23:43:03.893790 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="proxy-httpd" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893795 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="proxy-httpd" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-notification-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893974 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="proxy-httpd" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.893994 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="ceilometer-central-agent" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.894009 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" containerName="sg-core" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.894930 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.898417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm" (OuterVolumeSpecName: "kube-api-access-w6klm") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "kube-api-access-w6klm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.899090 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.899135 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.899935 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.905183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv" (OuterVolumeSpecName: "kube-api-access-nl4tv") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "kube-api-access-nl4tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.906694 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.913978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts" (OuterVolumeSpecName: "scripts") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.923472 5030 scope.go:117] "RemoveContainer" containerID="6046a846fcbf16047a808dd6d90fc3bd12599a6aad8d1c25fcb9bea953decf51" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.923469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4" (OuterVolumeSpecName: "kube-api-access-4xdc4") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "kube-api-access-4xdc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.936043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.979501 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgkk\" (UniqueName: \"kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.987987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988053 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw5md\" (UniqueName: \"kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988290 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988311 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988322 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988332 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.988389 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30def4ae-b574-406f-ac55-1fbae852b7f8" path="/var/lib/kubelet/pods/30def4ae-b574-406f-ac55-1fbae852b7f8/volumes" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.989659 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" path="/var/lib/kubelet/pods/6d68c95c-4505-4a7d-bd0f-44f7cf7c0671/volumes" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.990405 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe4aef7-888e-40d3-b1d9-f54b766ae91e" path="/var/lib/kubelet/pods/afe4aef7-888e-40d3-b1d9-f54b766ae91e/volumes" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991341 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991377 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4tv\" (UniqueName: \"kubernetes.io/projected/4c184acd-d9c1-4e70-8587-ce940b980983-kube-api-access-nl4tv\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991389 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991403 5030 reconciler_common.go:293] "Volume detached for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991415 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c184acd-d9c1-4e70-8587-ce940b980983-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991425 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6klm\" (UniqueName: \"kubernetes.io/projected/ba66fb72-701e-46d6-b539-bdc036b9477f-kube-api-access-w6klm\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991449 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991460 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xdc4\" (UniqueName: \"kubernetes.io/projected/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-kube-api-access-4xdc4\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.991496 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7d3731-2d0f-4fcc-8edf-424ffe462724" path="/var/lib/kubelet/pods/be7d3731-2d0f-4fcc-8edf-424ffe462724/volumes" Jan 20 23:43:03 crc kubenswrapper[5030]: I0120 23:43:03.993584 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb47fe73-ea40-40ce-9531-57b6a5e8e7ce" path="/var/lib/kubelet/pods/cb47fe73-ea40-40ce-9531-57b6a5e8e7ce/volumes" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.007239 5030 scope.go:117] "RemoveContainer" containerID="cb2f0c3b8501093aaa0e433ff2ce6a327de9731da2f84e935851d94250d96e84" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.010925 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.011413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.011895 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.012550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.035214 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw5md\" (UniqueName: \"kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md\") pod \"nova-cell1-novncproxy-0\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgkk\" (UniqueName: \"kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.095647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.101095 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.102274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.106851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.115355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.141448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgkk\" (UniqueName: \"kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.145600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0acef259-668b-4545-be1f-8f19440dd308" (UID: "0acef259-668b-4545-be1f-8f19440dd308"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.147337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.151961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.171067 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.176921 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.198849 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.198871 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.198880 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.198890 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.218767 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.238014 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.250473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "ba66fb72-701e-46d6-b539-bdc036b9477f" (UID: "ba66fb72-701e-46d6-b539-bdc036b9477f"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.299674 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.306550 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.306585 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba66fb72-701e-46d6-b539-bdc036b9477f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.306597 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.346823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "0acef259-668b-4545-be1f-8f19440dd308" (UID: "0acef259-668b-4545-be1f-8f19440dd308"). 
InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.348483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.408576 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0acef259-668b-4545-be1f-8f19440dd308-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.408614 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.420391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4c184acd-d9c1-4e70-8587-ce940b980983" (UID: "4c184acd-d9c1-4e70-8587-ce940b980983"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.424066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.425540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" event={"ID":"68768b92-d310-4d2a-ace4-e95aa765f972","Type":"ContainerStarted","Data":"ebcd6fe11550f275a26d82161d24a065df68e7c0c9eebe2309b5cd41f35f9646"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.432097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerStarted","Data":"0f5c253e681107992aa80a89e5d843a4a26d33bb1956288acaae5260478766f4"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.435796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerStarted","Data":"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.441128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b002bd94-4e47-49ea-9ce8-5af565ddb6ed","Type":"ContainerDied","Data":"ac3292ce89ca97c48cb84b6f0a40de5fbc7b1ecfac103534780f495831dd49ce"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.441164 5030 scope.go:117] "RemoveContainer" containerID="70ffc14e50637690db7989fdf98987e7bba8872dcb444eceda7cc13290c16cde" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.441171 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.460109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"85f32108-760c-4027-8c6d-d1b761a3305d","Type":"ContainerDied","Data":"9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.460152 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9675c470bb1aef46b301e3ab235b66ed7ae2317e09d6803733a746007fea66d7" Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.462158 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.464110 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.464146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"4c184acd-d9c1-4e70-8587-ce940b980983","Type":"ContainerDied","Data":"ac288eabd9795330ac846cea87322f15e976e34b7c83a416725c5a0cc3bf6cf7"} Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.466549 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.471646 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.471686 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" containerName="nova-cell1-conductor-conductor" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.474383 5030 generic.go:334] "Generic (PLEG): container finished" podID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerID="1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" exitCode=0 Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.474424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"737f957c-6c28-4e40-ac90-83e5cc719f07","Type":"ContainerDied","Data":"1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.474472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"737f957c-6c28-4e40-ac90-83e5cc719f07","Type":"ContainerDied","Data":"cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.474483 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbfc4a87c1f84a754f079c86af1b4cf06631c990f910be93ba687e79d6819646" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.508081 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.508612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"ba66fb72-701e-46d6-b539-bdc036b9477f","Type":"ContainerDied","Data":"71b5bce4ed79b269d7a64a65dd7f41e193081fd88b88a067419eba7bd8a373de"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.516424 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c184acd-d9c1-4e70-8587-ce940b980983-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.516457 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.520479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerStarted","Data":"124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.523657 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.525423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data" (OuterVolumeSpecName: "config-data") pod "b002bd94-4e47-49ea-9ce8-5af565ddb6ed" (UID: "b002bd94-4e47-49ea-9ce8-5af565ddb6ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.533050 5030 generic.go:334] "Generic (PLEG): container finished" podID="425b4c25-fdad-470a-b462-188b573d359a" containerID="590d9e7afd81121f20e8b3c4a10aa9c029280098e2371ecd81d6f475c6e6a932" exitCode=0 Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.533113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerDied","Data":"590d9e7afd81121f20e8b3c4a10aa9c029280098e2371ecd81d6f475c6e6a932"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.533138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"425b4c25-fdad-470a-b462-188b573d359a","Type":"ContainerDied","Data":"7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.533150 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ed669b95f7dcec3deca556677f4cdd0b0cf8a59085f00a4d6a864d397f32144" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.554068 5030 generic.go:334] "Generic (PLEG): container finished" podID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerID="20c4a921161139dd7e23f659cc47617ba1e894ddbac491c152cd76fd4f4e85b4" exitCode=0 Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.554253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerDied","Data":"20c4a921161139dd7e23f659cc47617ba1e894ddbac491c152cd76fd4f4e85b4"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.554471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"bb388aeb-cf40-49d1-bb60-a22d6117c6a8","Type":"ContainerDied","Data":"635b2138e1468785602e2430635c31641ced4fb6ffe0dab196d48d3481258c6f"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.554512 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635b2138e1468785602e2430635c31641ced4fb6ffe0dab196d48d3481258c6f" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.572536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerStarted","Data":"a15ad4702023cddd300461791a532a8767efd27711ed6f5f9fd133cce2f39b90"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.581557 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" podStartSLOduration=5.5815192 podStartE2EDuration="5.5815192s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:04.538326663 +0000 UTC m=+4056.858586951" watchObservedRunningTime="2026-01-20 23:43:04.5815192 +0000 UTC m=+4056.901779488" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.589056 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.589318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" 
event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerStarted","Data":"8e1c928a41fbec16d37549b4e9409eb6f5e84359420e31e38ab9e590916cc551"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.591100 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker-log" containerID="cri-o://2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b" gracePeriod=30 Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.591388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker" containerID="cri-o://38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3" gracePeriod=30 Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.607576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"0acef259-668b-4545-be1f-8f19440dd308","Type":"ContainerDied","Data":"dc259b0ff49e6ca76829f12c4377bebc2b9b50a20fd52b385f4448d9d599934b"} Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.607706 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.621554 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b002bd94-4e47-49ea-9ce8-5af565ddb6ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.621899 5030 scope.go:117] "RemoveContainer" containerID="c1491fd116b61462ffe9d4c787b8ac9b46c8b5380cd45cc8fd9896b43d420535" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.653407 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.685797 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.750055 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs\") pod \"85f32108-760c-4027-8c6d-d1b761a3305d\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle\") pod \"85f32108-760c-4027-8c6d-d1b761a3305d\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825213 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data\") pod \"85f32108-760c-4027-8c6d-d1b761a3305d\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg7mm\" (UniqueName: \"kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm\") pod \"85f32108-760c-4027-8c6d-d1b761a3305d\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs\") pod \"85f32108-760c-4027-8c6d-d1b761a3305d\" (UID: \"85f32108-760c-4027-8c6d-d1b761a3305d\") " Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.825927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs" (OuterVolumeSpecName: "logs") pod "85f32108-760c-4027-8c6d-d1b761a3305d" (UID: "85f32108-760c-4027-8c6d-d1b761a3305d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.843491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm" (OuterVolumeSpecName: "kube-api-access-kg7mm") pod "85f32108-760c-4027-8c6d-d1b761a3305d" (UID: "85f32108-760c-4027-8c6d-d1b761a3305d"). InnerVolumeSpecName "kube-api-access-kg7mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.857372 5030 scope.go:117] "RemoveContainer" containerID="12091bb4362b85d843ea5295d417371e1ca04650ec43156c0521deaaa0286f2d" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.858933 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.880348 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.937664 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f32108-760c-4027-8c6d-d1b761a3305d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.937887 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg7mm\" (UniqueName: \"kubernetes.io/projected/85f32108-760c-4027-8c6d-d1b761a3305d-kube-api-access-kg7mm\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.955960 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.973414 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d is running failed: container process not found" containerID="1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:43:04 crc kubenswrapper[5030]: E0120 23:43:04.982393 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d is running failed: container process not found" containerID="1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:43:04 crc kubenswrapper[5030]: I0120 23:43:04.982697 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.005551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.008155 5030 scope.go:117] "RemoveContainer" containerID="2aba22d12b4a8082bd872a965b953151609964a03b4ad9582d03b72dab943a2e" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.024394 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d is running failed: container process not found" containerID="1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.024458 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerName="nova-scheduler-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.025369 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbwsx\" (UniqueName: \"kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.039836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts\") pod \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\" (UID: \"bb388aeb-cf40-49d1-bb60-a22d6117c6a8\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.047671 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.066208 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx" (OuterVolumeSpecName: "kube-api-access-qbwsx") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "kube-api-access-qbwsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.092348 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.105835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts" (OuterVolumeSpecName: "scripts") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108185 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108594 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api-log" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108605 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api-log" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108906 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108919 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108928 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108935 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerName="nova-scheduler-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108953 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerName="nova-scheduler-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108968 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.108974 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.108995 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109001 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.109014 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109020 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: E0120 23:43:05.109033 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="probe" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109050 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="probe" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109220 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109236 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-metadata" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109244 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109253 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="425b4c25-fdad-470a-b462-188b573d359a" containerName="cinder-api-log" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109263 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="probe" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109272 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" containerName="nova-metadata-log" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109287 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" containerName="nova-scheduler-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.109589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" containerName="cinder-scheduler" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.110864 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.114074 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.114870 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.115897 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.118575 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.119495 5030 scope.go:117] "RemoveContainer" containerID="626d362af97385a0d67e9e8fee5bd07bb31c5c1353c92b989bc60909ed7f85e1" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.141942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data\") pod \"737f957c-6c28-4e40-ac90-83e5cc719f07\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle\") pod \"737f957c-6c28-4e40-ac90-83e5cc719f07\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrxlk\" (UniqueName: \"kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk\") pod \"737f957c-6c28-4e40-ac90-83e5cc719f07\" (UID: \"737f957c-6c28-4e40-ac90-83e5cc719f07\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142638 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.142953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6j67\" (UniqueName: \"kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.143439 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.143596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.145015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.145205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.145481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle\") pod \"425b4c25-fdad-470a-b462-188b573d359a\" (UID: \"425b4c25-fdad-470a-b462-188b573d359a\") " Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.147295 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.148430 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.148655 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbwsx\" (UniqueName: \"kubernetes.io/projected/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-kube-api-access-qbwsx\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.144453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.144526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.146466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs" (OuterVolumeSpecName: "logs") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.150388 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.152311 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.158970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.159174 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.159496 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-j9wjx" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.160076 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.169216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk" (OuterVolumeSpecName: "kube-api-access-rrxlk") pod "737f957c-6c28-4e40-ac90-83e5cc719f07" (UID: "737f957c-6c28-4e40-ac90-83e5cc719f07"). InnerVolumeSpecName "kube-api-access-rrxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.170043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data" (OuterVolumeSpecName: "config-data") pod "85f32108-760c-4027-8c6d-d1b761a3305d" (UID: "85f32108-760c-4027-8c6d-d1b761a3305d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.170454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.174045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67" (OuterVolumeSpecName: "kube-api-access-w6j67") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "kube-api-access-w6j67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.175910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.182994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts" (OuterVolumeSpecName: "scripts") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.195404 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.208266 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.208467 5030 scope.go:117] "RemoveContainer" containerID="2f1b253ff24c4c1ea125df54301e5f8a3831fd6d691239c96bedcafa484781ff" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.249332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251635 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkff\" (UniqueName: \"kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k2m7\" (UniqueName: \"kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7\") pod \"openstack-galera-0\" (UID: 
\"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.251961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252195 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252700 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252718 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/425b4c25-fdad-470a-b462-188b573d359a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252731 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252753 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrxlk\" (UniqueName: \"kubernetes.io/projected/737f957c-6c28-4e40-ac90-83e5cc719f07-kube-api-access-rrxlk\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252763 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/425b4c25-fdad-470a-b462-188b573d359a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252774 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252785 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6j67\" (UniqueName: \"kubernetes.io/projected/425b4c25-fdad-470a-b462-188b573d359a-kube-api-access-w6j67\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.252806 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.267734 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.269706 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.271974 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.273371 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-bmf9h" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.275356 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.315792 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.327961 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.340320 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.355770 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.357346 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k2m7\" (UniqueName: \"kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nfk\" (UniqueName: \"kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.358993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkff\" (UniqueName: \"kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359216 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.359712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.360913 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361167 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361261 5030 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rjg7g" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.361577 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.362100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.362641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.378011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.389741 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.413682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.453292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.457243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " 
pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nfk\" (UniqueName: \"kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8n8m\" (UniqueName: \"kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461458 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461566 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461651 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.462128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.461275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.463111 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.463771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.464189 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.468375 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkff\" (UniqueName: \"kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff\") pod \"ceilometer-0\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.518858 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k2m7\" (UniqueName: \"kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.522797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.524403 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.525149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nfk\" (UniqueName: \"kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk\") pod \"memcached-0\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8n8m\" (UniqueName: \"kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.563952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.565036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.565209 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.566162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.566212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.566492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.579901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.584059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8n8m\" (UniqueName: \"kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.584178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.630004 5030 generic.go:334] "Generic (PLEG): container finished" podID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerID="2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b" exitCode=143 Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.630069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerDied","Data":"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.633423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerStarted","Data":"aceb0c132ca610f79613ed759be8a9c277b262f6bda56734910d3f61ee6a199a"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.639253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"8eab155c-8ebf-4332-9d47-16df303833b7","Type":"ContainerStarted","Data":"2db0932cbd813797c68e7d4248a83d4bd5a83131038b1f588e4e7803da10cb01"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.640357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerStarted","Data":"d01314e734acb62212779c45b5299b0c23a49c00045008b2d1ee187819ba326f"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.643026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerStarted","Data":"1dbff2e4713380c96cd10e70c51df97e55624a112a5058f20809ac1e24049788"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.644498 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.645419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerStarted","Data":"e62679548f35be4b9b789d9aaf3f4e7ba177a3e149c1161ad44252ad3dcbd2b1"} Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.645993 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.646154 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.646387 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.740006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.801882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.878715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.988299 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0acef259-668b-4545-be1f-8f19440dd308" path="/var/lib/kubelet/pods/0acef259-668b-4545-be1f-8f19440dd308/volumes" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.989086 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c184acd-d9c1-4e70-8587-ce940b980983" path="/var/lib/kubelet/pods/4c184acd-d9c1-4e70-8587-ce940b980983/volumes" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.990084 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b002bd94-4e47-49ea-9ce8-5af565ddb6ed" path="/var/lib/kubelet/pods/b002bd94-4e47-49ea-9ce8-5af565ddb6ed/volumes" Jan 20 23:43:05 crc kubenswrapper[5030]: I0120 23:43:05.991610 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba66fb72-701e-46d6-b539-bdc036b9477f" path="/var/lib/kubelet/pods/ba66fb72-701e-46d6-b539-bdc036b9477f/volumes" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.022518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85f32108-760c-4027-8c6d-d1b761a3305d" (UID: "85f32108-760c-4027-8c6d-d1b761a3305d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.031339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "85f32108-760c-4027-8c6d-d1b761a3305d" (UID: "85f32108-760c-4027-8c6d-d1b761a3305d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.080105 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.081132 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85f32108-760c-4027-8c6d-d1b761a3305d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.150850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.168051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737f957c-6c28-4e40-ac90-83e5cc719f07" (UID: "737f957c-6c28-4e40-ac90-83e5cc719f07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.168830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.169110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.169922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.179842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data" (OuterVolumeSpecName: "config-data") pod "737f957c-6c28-4e40-ac90-83e5cc719f07" (UID: "737f957c-6c28-4e40-ac90-83e5cc719f07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.183955 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.183994 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.184007 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.184019 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737f957c-6c28-4e40-ac90-83e5cc719f07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.184032 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.184042 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.235105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data" (OuterVolumeSpecName: "config-data") pod "425b4c25-fdad-470a-b462-188b573d359a" (UID: "425b4c25-fdad-470a-b462-188b573d359a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.235345 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.249749 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.256757 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.256838 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.293825 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425b4c25-fdad-470a-b462-188b573d359a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.305357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data" (OuterVolumeSpecName: "config-data") pod "bb388aeb-cf40-49d1-bb60-a22d6117c6a8" (UID: "bb388aeb-cf40-49d1-bb60-a22d6117c6a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.396154 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb388aeb-cf40-49d1-bb60-a22d6117c6a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.469395 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.478112 5030 scope.go:117] "RemoveContainer" containerID="bee0947fe7fe657b7ed53ea384f8fcf3e0f5fc524636231fa1f7cd941b3366f5" Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.481897 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 is running failed: container process not found" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.483114 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 is running failed: container process not found" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.483489 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 is running failed: container process not found" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.483535 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerName="nova-cell0-conductor-conductor" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.539789 5030 scope.go:117] "RemoveContainer" containerID="0209a5dfb0c62f1e66ec249e44fd294c019db9b9470a7115b6112a0bd6656979" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.605033 5030 scope.go:117] "RemoveContainer" containerID="c77ef71b1ba7fc80efcdf5a24a3e01e5a68dadd6b1d0537759ca10e5a0da5519" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.643987 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.646002 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.648750 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.651234 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.656819 5030 scope.go:117] "RemoveContainer" containerID="0ff3a171627e7737463b58bcc435f1f2f3c9457442d5db62279215da4a11a740" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.664594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerStarted","Data":"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.694431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerStarted","Data":"ad6f8e58c9dff12d340a8632ebe13ee33bdf79383a461b73e54bcbe069e202bd"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.713703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.731300 5030 generic.go:334] "Generic (PLEG): container finished" podID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" exitCode=0 Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.731425 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.732278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5db97c47-beb0-437c-bcae-2aaabb7cf3ff","Type":"ContainerDied","Data":"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.732345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"5db97c47-beb0-437c-bcae-2aaabb7cf3ff","Type":"ContainerDied","Data":"9b95d5e2f9dcc3a881806e642605eb94b93b3d25fb31a605824287006f823e2f"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.751584 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.762875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerStarted","Data":"68ca516876cf13491d4b0ff89ccfacff2571b2db918b083cc72996f74a4600ac"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.781204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerStarted","Data":"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.783435 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.784404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerStarted","Data":"c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.790976 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerStarted","Data":"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.796823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerStarted","Data":"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.798794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" event={"ID":"68768b92-d310-4d2a-ace4-e95aa765f972","Type":"ContainerStarted","Data":"a36634b4807646149e5f3a7b6744b5ba605ae319c65e476cda97660d7f503e98"} Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.799062 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.813772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle\") pod \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.814044 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data\") pod \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.814137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvkmd\" (UniqueName: \"kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd\") pod \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\" (UID: \"5db97c47-beb0-437c-bcae-2aaabb7cf3ff\") " Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.827647 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.839404 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="6d68c95c-4505-4a7d-bd0f-44f7cf7c0671" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.9:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.842853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd" (OuterVolumeSpecName: "kube-api-access-vvkmd") pod "5db97c47-beb0-437c-bcae-2aaabb7cf3ff" (UID: "5db97c47-beb0-437c-bcae-2aaabb7cf3ff"). InnerVolumeSpecName "kube-api-access-vvkmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.850723 5030 scope.go:117] "RemoveContainer" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.862347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db97c47-beb0-437c-bcae-2aaabb7cf3ff" (UID: "5db97c47-beb0-437c-bcae-2aaabb7cf3ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.866711 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: E0120 23:43:06.867146 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerName="nova-cell0-conductor-conductor" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.867211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerName="nova-cell0-conductor-conductor" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.867460 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" containerName="nova-cell0-conductor-conductor" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.868579 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.870852 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.872669 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.883911 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.885399 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.886776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data" (OuterVolumeSpecName: "config-data") pod "5db97c47-beb0-437c-bcae-2aaabb7cf3ff" (UID: "5db97c47-beb0-437c-bcae-2aaabb7cf3ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.893610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.894070 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.894324 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-fdpgm" Jan 20 23:43:06 crc kubenswrapper[5030]: I0120 23:43:06.895746 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.918724 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.918763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.921883 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.921906 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.921940 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvkmd\" (UniqueName: \"kubernetes.io/projected/5db97c47-beb0-437c-bcae-2aaabb7cf3ff-kube-api-access-vvkmd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.925932 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" podStartSLOduration=7.925913562 podStartE2EDuration="7.925913562s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:06.800967921 +0000 UTC m=+4059.121228209" watchObservedRunningTime="2026-01-20 23:43:06.925913562 +0000 UTC m=+4059.246173850" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.955609 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.956006 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener-log" containerID="cri-o://f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a" gracePeriod=30 Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:06.956394 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener" containerID="cri-o://b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c" gracePeriod=30 Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 
23:43:06.957908 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" podStartSLOduration=7.957890638 podStartE2EDuration="7.957890638s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:06.822082044 +0000 UTC m=+4059.142342342" watchObservedRunningTime="2026-01-20 23:43:06.957890638 +0000 UTC m=+4059.278150926" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023209 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023309 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnjd\" (UniqueName: \"kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn474\" (UniqueName: 
\"kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.023454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.038814 5030 scope.go:117] "RemoveContainer" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" Jan 20 23:43:07 crc kubenswrapper[5030]: E0120 23:43:07.042712 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937\": container with ID starting with 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 not found: ID does not exist" containerID="50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.042752 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937"} err="failed to get container status \"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937\": rpc error: code = NotFound desc = could not find container \"50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937\": container with ID starting with 50af3bffb90a2e84dbe75f422adce68b8462ff36d915e2d84f42d408872fa937 not found: ID does not exist" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.121192 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.124872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.124924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.124987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njnjd\" (UniqueName: \"kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn474\" (UniqueName: \"kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.125286 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.126260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.138911 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.143288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.144135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.144460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.145152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.146900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnjd\" (UniqueName: \"kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.147080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.148191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data\") pod \"nova-metadata-0\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.152286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.153080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn474\" (UniqueName: \"kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474\") pod \"cinder-scheduler-0\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.154747 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.157146 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.166187 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.168440 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.227580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.227657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.227752 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwf6\" (UniqueName: \"kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.335457 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.336427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.336489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.336567 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwf6\" (UniqueName: \"kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.350447 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.352157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.353087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwf6\" (UniqueName: \"kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6\") pod \"nova-cell0-conductor-0\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.379266 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.411061 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.474218 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.493063 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.512930 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="0acef259-668b-4545-be1f-8f19440dd308" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.10:11211: i/o timeout" Jan 20 23:43:07 crc kubenswrapper[5030]: W0120 23:43:07.546797 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c11e361_d0f4_49be_9b4b_96c109c78e43.slice/crio-4fe7cbbb90e11abae4a3861d145f392b89da7ccfc8bfe18fdc8475b08c238ae8 WatchSource:0}: Error finding container 4fe7cbbb90e11abae4a3861d145f392b89da7ccfc8bfe18fdc8475b08c238ae8: Status 404 returned error can't find the container with id 4fe7cbbb90e11abae4a3861d145f392b89da7ccfc8bfe18fdc8475b08c238ae8 Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.775644 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.859724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerStarted","Data":"39acce0069abdd1fa0016b0b151911ccd2b22d0da92fab8d6a1b6727f97e67d0"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.889854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"8eab155c-8ebf-4332-9d47-16df303833b7","Type":"ContainerStarted","Data":"eaf5b25e80fbc00732ee0cd1b7677c0ff4326220df49407113e8e594514df699"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.893368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerStarted","Data":"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.895140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerStarted","Data":"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.895835 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.895955 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.902287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerStarted","Data":"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.912929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerStarted","Data":"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.915069 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=4.915050037 podStartE2EDuration="4.915050037s" podCreationTimestamp="2026-01-20 23:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:07.904544222 +0000 UTC m=+4060.224804510" watchObservedRunningTime="2026-01-20 23:43:07.915050037 +0000 UTC m=+4060.235310335" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.917361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"862166c4-fd5b-456d-9a91-c2235cdf2aa2","Type":"ContainerStarted","Data":"fb94f334ddb9cbe51f468a0c4ed6ee52c88491d6ee338870b8b94b48e17fdbf4"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.920040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerStarted","Data":"9b00346eae6215c81c9b549691ce9a825ffdd4a07f8e28ceb5cc31cd3a07ad35"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.931722 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podStartSLOduration=8.931705481 podStartE2EDuration="8.931705481s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:07.926960856 +0000 UTC m=+4060.247221144" watchObservedRunningTime="2026-01-20 23:43:07.931705481 +0000 UTC m=+4060.251965759" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.936165 5030 generic.go:334] "Generic (PLEG): container finished" podID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerID="98fb3459638b3c9138e33d4416a9e172a726d393548444329a3a1a4c7fee114e" exitCode=0 Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.936207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" event={"ID":"8b42296e-9bb8-4b92-ac1f-6f62a58bb042","Type":"ContainerDied","Data":"98fb3459638b3c9138e33d4416a9e172a726d393548444329a3a1a4c7fee114e"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.948719 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=5.948702443 podStartE2EDuration="5.948702443s" podCreationTimestamp="2026-01-20 23:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:07.944767137 +0000 UTC m=+4060.265027425" watchObservedRunningTime="2026-01-20 23:43:07.948702443 +0000 UTC m=+4060.268962731" Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.958039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerStarted","Data":"6d4fdd95933dd76f66126e2eafeb8812ed0de780243beac0c33000b1c9a049c3"} Jan 20 23:43:07 crc kubenswrapper[5030]: I0120 23:43:07.959514 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.022139 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=6.022119215 podStartE2EDuration="6.022119215s" podCreationTimestamp="2026-01-20 23:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:07.994837492 +0000 UTC m=+4060.315097780" watchObservedRunningTime="2026-01-20 23:43:08.022119215 +0000 UTC m=+4060.342379503" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.051197 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db97c47-beb0-437c-bcae-2aaabb7cf3ff" path="/var/lib/kubelet/pods/5db97c47-beb0-437c-bcae-2aaabb7cf3ff/volumes" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.052012 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f32108-760c-4027-8c6d-d1b761a3305d" path="/var/lib/kubelet/pods/85f32108-760c-4027-8c6d-d1b761a3305d/volumes" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.052742 5030 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bb388aeb-cf40-49d1-bb60-a22d6117c6a8" path="/var/lib/kubelet/pods/bb388aeb-cf40-49d1-bb60-a22d6117c6a8/volumes" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.055166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerStarted","Data":"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402"} Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.055278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerStarted","Data":"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb"} Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.055384 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.055480 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.055550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerStarted","Data":"4fe7cbbb90e11abae4a3861d145f392b89da7ccfc8bfe18fdc8475b08c238ae8"} Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.075113 5030 generic.go:334] "Generic (PLEG): container finished" podID="09386ef4-57a8-4beb-950a-49b79579bd25" containerID="f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a" exitCode=143 Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.076683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerDied","Data":"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a"} Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.136134 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.187396 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" podStartSLOduration=9.187372773 podStartE2EDuration="9.187372773s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:08.130171076 +0000 UTC m=+4060.450431364" watchObservedRunningTime="2026-01-20 23:43:08.187372773 +0000 UTC m=+4060.507633061" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.385055 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.401014 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.410071 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" podStartSLOduration=9.410050455 podStartE2EDuration="9.410050455s" podCreationTimestamp="2026-01-20 23:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:08.327932353 +0000 UTC m=+4060.648192641" watchObservedRunningTime="2026-01-20 23:43:08.410050455 +0000 UTC m=+4060.730310743" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.508716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.513268 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.11:9292/healthcheck\": read tcp 10.217.0.2:36526->10.217.1.11:9292: read: connection reset by peer" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.513562 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.11:9292/healthcheck\": read tcp 10.217.0.2:36518->10.217.1.11:9292: read: connection reset by peer" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.620537 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.873216 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:43:08 crc kubenswrapper[5030]: I0120 23:43:08.987473 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.009687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.009740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.010452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.010507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htklw\" (UniqueName: \"kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.010545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 
crc kubenswrapper[5030]: I0120 23:43:09.010569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.010655 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.010680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs\") pod \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\" (UID: \"8b42296e-9bb8-4b92-ac1f-6f62a58bb042\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.088373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.097021 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw" (OuterVolumeSpecName: "kube-api-access-htklw") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "kube-api-access-htklw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.107654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.111166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerStarted","Data":"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.117183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.120046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerStarted","Data":"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.126279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnzww\" (UniqueName: \"kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww\") pod \"f19ee173-f9bb-4671-afde-4107284c57fa\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.126402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle\") pod \"f19ee173-f9bb-4671-afde-4107284c57fa\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.126423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data\") pod \"f19ee173-f9bb-4671-afde-4107284c57fa\" (UID: \"f19ee173-f9bb-4671-afde-4107284c57fa\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.128889 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htklw\" (UniqueName: \"kubernetes.io/projected/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-kube-api-access-htklw\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.128940 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.128951 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.143776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts" (OuterVolumeSpecName: "scripts") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.150562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww" (OuterVolumeSpecName: "kube-api-access-mnzww") pod "f19ee173-f9bb-4671-afde-4107284c57fa" (UID: "f19ee173-f9bb-4671-afde-4107284c57fa"). InnerVolumeSpecName "kube-api-access-mnzww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.162867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" event={"ID":"8b42296e-9bb8-4b92-ac1f-6f62a58bb042","Type":"ContainerDied","Data":"b2263a247334346a23757c575d761339fb92af17d180b287c6b07462fd3ac1c2"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.162924 5030 scope.go:117] "RemoveContainer" containerID="98fb3459638b3c9138e33d4416a9e172a726d393548444329a3a1a4c7fee114e" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.162977 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.172706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerStarted","Data":"cb8237cc0b5b2e4e6dae90a4bf8cfbbab519f02669ecc73ed98d4b9bea378826"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.178120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"862166c4-fd5b-456d-9a91-c2235cdf2aa2","Type":"ContainerStarted","Data":"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.178847 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.181426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerStarted","Data":"4aaf8e6fcb24b362e556db122f75e92eadf4805bcf94845dd8ac1ad4a5b02c76"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.183337 5030 generic.go:334] "Generic (PLEG): container finished" podID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerID="8622c0b6fced3f26178c491c88d209f96cfe8f2ccbe0ba24a807c4a3dd340c2a" exitCode=0 Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.183377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerDied","Data":"8622c0b6fced3f26178c491c88d209f96cfe8f2ccbe0ba24a807c4a3dd340c2a"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.185567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerStarted","Data":"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.195009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerStarted","Data":"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.200763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerStarted","Data":"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.208584 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="f19ee173-f9bb-4671-afde-4107284c57fa" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" exitCode=0 Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.208695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f19ee173-f9bb-4671-afde-4107284c57fa","Type":"ContainerDied","Data":"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.208723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"f19ee173-f9bb-4671-afde-4107284c57fa","Type":"ContainerDied","Data":"d4da8a4187a3585a50584099560f4c953725cb26ddf3a80ef84340fc6d250971"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.208768 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.212192 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=5.212170174 podStartE2EDuration="5.212170174s" podCreationTimestamp="2026-01-20 23:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:09.20499411 +0000 UTC m=+4061.525254398" watchObservedRunningTime="2026-01-20 23:43:09.212170174 +0000 UTC m=+4061.532430462" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.220089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerStarted","Data":"bb1f4a48dba1630238aa1f6de1040687a3a18337c0846b8b26c2f0cddc9fd4ff"} Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.224995 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=6.224979244 podStartE2EDuration="6.224979244s" podCreationTimestamp="2026-01-20 23:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:09.222452973 +0000 UTC m=+4061.542713261" watchObservedRunningTime="2026-01-20 23:43:09.224979244 +0000 UTC m=+4061.545239522" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.231955 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnzww\" (UniqueName: \"kubernetes.io/projected/f19ee173-f9bb-4671-afde-4107284c57fa-kube-api-access-mnzww\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.231994 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.264290 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.4:8778/\": EOF" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.264396 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" probeResult="failure" output="Get 
\"https://10.217.1.4:8778/\": EOF" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.271720 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=7.271701817 podStartE2EDuration="7.271701817s" podCreationTimestamp="2026-01-20 23:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:09.265270082 +0000 UTC m=+4061.585530370" watchObservedRunningTime="2026-01-20 23:43:09.271701817 +0000 UTC m=+4061.591962115" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.399514 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.511174 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.523195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.552244 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.567275 5030 scope.go:117] "RemoveContainer" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.683934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data" (OuterVolumeSpecName: "config-data") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.686309 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.169:5671: connect: connection reset by peer" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.698668 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.732497 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.733764 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.769960 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.774789 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.801834 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data" (OuterVolumeSpecName: "config-data") pod "f19ee173-f9bb-4671-afde-4107284c57fa" (UID: "f19ee173-f9bb-4671-afde-4107284c57fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.824194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f19ee173-f9bb-4671-afde-4107284c57fa" (UID: "f19ee173-f9bb-4671-afde-4107284c57fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.836837 5030 scope.go:117] "RemoveContainer" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" Jan 20 23:43:09 crc kubenswrapper[5030]: E0120 23:43:09.847405 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3\": container with ID starting with dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3 not found: ID does not exist" containerID="dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.847452 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3"} err="failed to get container status \"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3\": rpc error: code = NotFound desc = could not find container \"dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3\": container with ID starting with dd51753f22e2ce4bc0086fe9fb58678b6395fbcb0ef6cc6aa2048a0e7ef85ef3 not found: ID does not exist" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.876791 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.877844 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.877920 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f19ee173-f9bb-4671-afde-4107284c57fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.921474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b42296e-9bb8-4b92-ac1f-6f62a58bb042" (UID: "8b42296e-9bb8-4b92-ac1f-6f62a58bb042"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.926816 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data\") pod \"09386ef4-57a8-4beb-950a-49b79579bd25\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxkqj\" (UniqueName: \"kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj\") pod \"09386ef4-57a8-4beb-950a-49b79579bd25\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom\") pod \"09386ef4-57a8-4beb-950a-49b79579bd25\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle\") pod \"09386ef4-57a8-4beb-950a-49b79579bd25\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.978960 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs\") pod \"09386ef4-57a8-4beb-950a-49b79579bd25\" (UID: \"09386ef4-57a8-4beb-950a-49b79579bd25\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgh7c\" (UniqueName: \"kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs\") pod \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\" (UID: \"ddd0447c-b834-4a95-b1d8-35eb86c7de95\") " Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.979882 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b42296e-9bb8-4b92-ac1f-6f62a58bb042-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.990597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs" (OuterVolumeSpecName: "logs") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.993207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs" (OuterVolumeSpecName: "logs") pod "09386ef4-57a8-4beb-950a-49b79579bd25" (UID: "09386ef4-57a8-4beb-950a-49b79579bd25"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:09 crc kubenswrapper[5030]: I0120 23:43:09.999530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.000814 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.004132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "09386ef4-57a8-4beb-950a-49b79579bd25" (UID: "09386ef4-57a8-4beb-950a-49b79579bd25"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.004249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.005157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts" (OuterVolumeSpecName: "scripts") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.007730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj" (OuterVolumeSpecName: "kube-api-access-vxkqj") pod "09386ef4-57a8-4beb-950a-49b79579bd25" (UID: "09386ef4-57a8-4beb-950a-49b79579bd25"). InnerVolumeSpecName "kube-api-access-vxkqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.016563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c" (OuterVolumeSpecName: "kube-api-access-hgh7c") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "kube-api-access-hgh7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.084246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh7kx\" (UniqueName: \"kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx\") pod \"1161dc0a-c665-401c-8c5c-b83131eaa38f\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.084303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs\") pod \"1161dc0a-c665-401c-8c5c-b83131eaa38f\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.084436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom\") pod \"1161dc0a-c665-401c-8c5c-b83131eaa38f\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.084588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle\") pod \"1161dc0a-c665-401c-8c5c-b83131eaa38f\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.084647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data\") pod \"1161dc0a-c665-401c-8c5c-b83131eaa38f\" (UID: \"1161dc0a-c665-401c-8c5c-b83131eaa38f\") " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085037 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085049 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085057 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09386ef4-57a8-4beb-950a-49b79579bd25-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085077 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085086 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgh7c\" (UniqueName: \"kubernetes.io/projected/ddd0447c-b834-4a95-b1d8-35eb86c7de95-kube-api-access-hgh7c\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085096 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ddd0447c-b834-4a95-b1d8-35eb86c7de95-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085107 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxkqj\" (UniqueName: 
\"kubernetes.io/projected/09386ef4-57a8-4beb-950a-49b79579bd25-kube-api-access-vxkqj\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.085116 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.087782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.087891 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs" (OuterVolumeSpecName: "logs") pod "1161dc0a-c665-401c-8c5c-b83131eaa38f" (UID: "1161dc0a-c665-401c-8c5c-b83131eaa38f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.108612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx" (OuterVolumeSpecName: "kube-api-access-xh7kx") pod "1161dc0a-c665-401c-8c5c-b83131eaa38f" (UID: "1161dc0a-c665-401c-8c5c-b83131eaa38f"). InnerVolumeSpecName "kube-api-access-xh7kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.111756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1161dc0a-c665-401c-8c5c-b83131eaa38f" (UID: "1161dc0a-c665-401c-8c5c-b83131eaa38f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.154106 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.161240 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.167956 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.185523 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186110 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186215 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186289 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerName="keystone-api" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186341 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerName="keystone-api" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186390 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186510 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186565 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186636 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" containerName="nova-cell1-conductor-conductor" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186689 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" containerName="nova-cell1-conductor-conductor" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186744 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186794 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener-log" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186859 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.186928 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.186988 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187038 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187311 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187387 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187446 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187505 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" containerName="keystone-api" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187568 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" containerName="nova-cell1-conductor-conductor" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187640 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" containerName="glance-httpd" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.187696 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerName="barbican-worker-log" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.193015 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh7kx\" (UniqueName: \"kubernetes.io/projected/1161dc0a-c665-401c-8c5c-b83131eaa38f-kube-api-access-xh7kx\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.197684 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" containerName="barbican-keystone-listener" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.199608 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1161dc0a-c665-401c-8c5c-b83131eaa38f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.199676 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.199689 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.199701 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.200259 5030 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.202220 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.205501 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data" (OuterVolumeSpecName: "config-data") pod "09386ef4-57a8-4beb-950a-49b79579bd25" (UID: "09386ef4-57a8-4beb-950a-49b79579bd25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.205773 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.220751 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-685c4d7f7b-d6j4k"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.232883 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.237514 5030 generic.go:334] "Generic (PLEG): container finished" podID="09386ef4-57a8-4beb-950a-49b79579bd25" containerID="b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c" exitCode=0 Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.237571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerDied","Data":"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.237599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" event={"ID":"09386ef4-57a8-4beb-950a-49b79579bd25","Type":"ContainerDied","Data":"21d55cc3a6c969e7c1c2c0564777fef0d5cf3f2d2bb4b9fd2dfc1576e0c9b551"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.237615 5030 scope.go:117] "RemoveContainer" containerID="b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.237729 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.248661 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.250528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.259066 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.262781 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerStarted","Data":"1feab068d22c049b8496d1357dade67b2f0fdf20291edc9d615dba4d367218dc"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.262837 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.297686 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=3.297668626 podStartE2EDuration="3.297668626s" podCreationTimestamp="2026-01-20 23:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:10.290165194 +0000 UTC m=+4062.610425482" watchObservedRunningTime="2026-01-20 23:43:10.297668626 +0000 UTC m=+4062.617928914" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.305123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.305198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmqh\" (UniqueName: \"kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.305239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.305289 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.305511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerStarted","Data":"95901ce3221d90b9173bf554b861b002c40953ce0859a9f6d52f0473e2884b29"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.308306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09386ef4-57a8-4beb-950a-49b79579bd25" (UID: "09386ef4-57a8-4beb-950a-49b79579bd25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.321268 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.322055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ddd0447c-b834-4a95-b1d8-35eb86c7de95","Type":"ContainerDied","Data":"c00f6f81562756ec701624ac718043ffdafc3186e407b94febbb51cee2d64580"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.327440 5030 generic.go:334] "Generic (PLEG): container finished" podID="1161dc0a-c665-401c-8c5c-b83131eaa38f" containerID="38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3" exitCode=0 Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.327591 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.328578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerDied","Data":"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.328638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb" event={"ID":"1161dc0a-c665-401c-8c5c-b83131eaa38f","Type":"ContainerDied","Data":"c1dd53c78fed5cee69a5f91895dc201bbc2fe207fa7903633b2e6e71f5f1f35c"} Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmqh\" (UniqueName: \"kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpwf\" (UniqueName: 
\"kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.408910 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09386ef4-57a8-4beb-950a-49b79579bd25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.418897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.420068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.431864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmqh\" (UniqueName: \"kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh\") pod \"nova-cell1-conductor-0\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.453309 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.460784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data" (OuterVolumeSpecName: "config-data") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.469845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1161dc0a-c665-401c-8c5c-b83131eaa38f" (UID: "1161dc0a-c665-401c-8c5c-b83131eaa38f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.485253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ddd0447c-b834-4a95-b1d8-35eb86c7de95" (UID: "ddd0447c-b834-4a95-b1d8-35eb86c7de95"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpwf\" (UniqueName: \"kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534424 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534438 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddd0447c-b834-4a95-b1d8-35eb86c7de95-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534455 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.534990 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.535222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.559572 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.574360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data" (OuterVolumeSpecName: "config-data") pod "1161dc0a-c665-401c-8c5c-b83131eaa38f" (UID: "1161dc0a-c665-401c-8c5c-b83131eaa38f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.578944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpwf\" (UniqueName: \"kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf\") pod \"certified-operators-6z7st\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.646495 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1161dc0a-c665-401c-8c5c-b83131eaa38f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.792277 5030 scope.go:117] "RemoveContainer" containerID="f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.793480 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.795651 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.803122 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.814988 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.834133 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.845767 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-7dbc7bf587-fpsqb"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.853444 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.855113 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.858242 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.858471 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.869759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.899669 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.899729 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58746674b-tcfk9"] Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.911231 5030 scope.go:117] "RemoveContainer" containerID="b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.912195 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c\": container with ID starting with b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c not found: ID does not exist" containerID="b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.912235 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c"} err="failed to get container status \"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c\": rpc error: code = NotFound desc = could not find container \"b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c\": container with ID starting with b25ff8a83d116c4cbc47780cd4b734085f77cdcc9f01bfe134d74870aa75527c not found: ID does not exist" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.912266 5030 scope.go:117] "RemoveContainer" containerID="f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a" Jan 20 23:43:10 crc kubenswrapper[5030]: E0120 23:43:10.915018 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a\": container with ID starting with f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a not found: ID does not exist" containerID="f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.915067 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a"} err="failed to get container status \"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a\": rpc error: code = NotFound desc = could not find container \"f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a\": container with ID starting with f5a74acba0a8a99226f14f4549f03f8f58417131875199ea1e4cdc9379448f3a not found: ID does not exist" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 
23:43:10.915110 5030 scope.go:117] "RemoveContainer" containerID="8622c0b6fced3f26178c491c88d209f96cfe8f2ccbe0ba24a807c4a3dd340c2a" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961827 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.961997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.962019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg22d\" (UniqueName: \"kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:10 crc kubenswrapper[5030]: I0120 23:43:10.966964 5030 scope.go:117] "RemoveContainer" 
containerID="fe2e23d68370c4c00564a676e4d3e1d4904e1c03bc91bd3f25aa636e081afe21" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg22d\" (UniqueName: \"kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.065726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.066684 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.066889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.073241 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.075659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.080776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.082001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.084203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.114317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg22d\" (UniqueName: \"kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.129024 5030 scope.go:117] "RemoveContainer" containerID="38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.139274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.186091 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.204579 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.204937 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-log" containerID="cri-o://98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" gracePeriod=30 Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.205363 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-api" containerID="cri-o://59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" gracePeriod=30 Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.208876 5030 scope.go:117] "RemoveContainer" containerID="2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.261780 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.266013 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.267090 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.18:8778/\": read tcp 10.217.0.2:36702->10.217.1.18:8778: read: connection reset by peer" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.289405 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.301384 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.344906 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.344981 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.353998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgdfx\" (UniqueName: \"kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373444 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.373480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.384232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerStarted","Data":"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7"} Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.452826 5030 scope.go:117] "RemoveContainer" containerID="38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3" Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.453567 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3\": container with ID starting with 38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3 not found: ID does not exist" containerID="38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.453610 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3"} err="failed to get container status \"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3\": rpc error: code = NotFound desc = could not find container \"38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3\": container with ID starting with 38e25d60d83df7b7698e2f71ec167569bffbe17710fcb7c93716578cf6b088b3 not found: ID does not exist" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.453654 5030 scope.go:117] "RemoveContainer" containerID="2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b" Jan 20 23:43:11 crc kubenswrapper[5030]: E0120 23:43:11.455439 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b\": container with ID starting with 2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b not found: ID does not exist" containerID="2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.455464 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b"} err="failed to get container status \"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b\": rpc error: code = NotFound desc = could not find container \"2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b\": container with ID starting with 2ddec88ed5b072c38033cdee2f329b4a5d2567e6d0da5ecc25c62b5c39d2c21b not found: ID does not 
exist" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.468364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerStarted","Data":"f68a1b444eca0b16db61c51ec7f264af573275a6ffcdd3dfad8bf3826fec5f89"} Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgdfx\" (UniqueName: \"kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.475333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.476652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.491325 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.502630 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.506457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.512987 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgdfx\" (UniqueName: \"kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.513238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.513543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts\") pod \"placement-8564b945c8-66n97\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.523401 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=5.52338361 podStartE2EDuration="5.52338361s" podCreationTimestamp="2026-01-20 23:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:11.493216899 +0000 UTC m=+4063.813477187" watchObservedRunningTime="2026-01-20 23:43:11.52338361 +0000 UTC m=+4063.843643898" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.538042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerStarted","Data":"5b31b1b990ceabd4797fda7252ebbcdbfa5957e9115be52ceb86f6f08f169842"} Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.652593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:11 crc kubenswrapper[5030]: W0120 23:43:11.657420 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a95b578_1de2_4b88_88dd_63ff04e9020f.slice/crio-2a2ff61ab22d58961a6d4813e63680f01d23f60c14d9f4220f14d5ad8703406c WatchSource:0}: Error finding container 2a2ff61ab22d58961a6d4813e63680f01d23f60c14d9f4220f14d5ad8703406c: Status 404 returned error can't find the container with id 2a2ff61ab22d58961a6d4813e63680f01d23f60c14d9f4220f14d5ad8703406c Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.706438 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:11 crc kubenswrapper[5030]: I0120 23:43:11.809980 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.021047 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09386ef4-57a8-4beb-950a-49b79579bd25" path="/var/lib/kubelet/pods/09386ef4-57a8-4beb-950a-49b79579bd25/volumes" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.023497 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1161dc0a-c665-401c-8c5c-b83131eaa38f" path="/var/lib/kubelet/pods/1161dc0a-c665-401c-8c5c-b83131eaa38f/volumes" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.024241 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b42296e-9bb8-4b92-ac1f-6f62a58bb042" path="/var/lib/kubelet/pods/8b42296e-9bb8-4b92-ac1f-6f62a58bb042/volumes" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.030829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd0447c-b834-4a95-b1d8-35eb86c7de95" path="/var/lib/kubelet/pods/ddd0447c-b834-4a95-b1d8-35eb86c7de95/volumes" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.035885 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19ee173-f9bb-4671-afde-4107284c57fa" path="/var/lib/kubelet/pods/f19ee173-f9bb-4671-afde-4107284c57fa/volumes" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.036553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.164114 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_87ed5aae-25b3-475c-b086-4ef865d35368/ovn-northd/0.log" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.164407 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.309195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.309614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.309666 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.310415 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config" (OuterVolumeSpecName: "config") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.310446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts" (OuterVolumeSpecName: "scripts") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.310534 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.310925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.311009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrzhw\" (UniqueName: \"kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.311039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.311678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs\") pod \"87ed5aae-25b3-475c-b086-4ef865d35368\" (UID: \"87ed5aae-25b3-475c-b086-4ef865d35368\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.312220 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.312231 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87ed5aae-25b3-475c-b086-4ef865d35368-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.312240 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.329269 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw" (OuterVolumeSpecName: "kube-api-access-nrzhw") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "kube-api-access-nrzhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.382291 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.382359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.383463 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.417704 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrzhw\" (UniqueName: \"kubernetes.io/projected/87ed5aae-25b3-475c-b086-4ef865d35368-kube-api-access-nrzhw\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.417756 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.457585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.467343 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.475696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "87ed5aae-25b3-475c-b086-4ef865d35368" (UID: "87ed5aae-25b3-475c-b086-4ef865d35368"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.484813 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.520895 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.520923 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87ed5aae-25b3-475c-b086-4ef865d35368-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.552533 5030 generic.go:334] "Generic (PLEG): container finished" podID="0aacc04a-8369-4d86-a209-51905de33654" containerID="68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5" exitCode=0 Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.552599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerDied","Data":"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.560945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerStarted","Data":"e52123514770d37017c3b944ad15d09d15a840f3ccb2d4c8d7d7cbe349bbae0c"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.564366 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerStarted","Data":"51240c7f068beb376adfcd6b67226bc6f0f4c7a4350b5e086ba82544702cca04"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.564408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerStarted","Data":"53b0e0dc24df6dea41ed71c1ff156a0b1842d60e49849ef27900be1d8d386c67"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.566921 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.582172 5030 generic.go:334] "Generic (PLEG): container finished" podID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerID="34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9" exitCode=0 Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.582226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerDied","Data":"34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.582246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerStarted","Data":"2a2ff61ab22d58961a6d4813e63680f01d23f60c14d9f4220f14d5ad8703406c"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.593946 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerID="fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52" exitCode=0 Jan 20 23:43:12 crc 
kubenswrapper[5030]: I0120 23:43:12.594029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerDied","Data":"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601011 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_87ed5aae-25b3-475c-b086-4ef865d35368/ovn-northd/0.log" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601058 5030 generic.go:334] "Generic (PLEG): container finished" podID="87ed5aae-25b3-475c-b086-4ef865d35368" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" exitCode=139 Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerDied","Data":"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"87ed5aae-25b3-475c-b086-4ef865d35368","Type":"ContainerDied","Data":"110d30eeebae4837cb28aa3e933028a46551d3a7db01bd57214198f11aefffaf"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601172 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.601201 5030 scope.go:117] "RemoveContainer" containerID="333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.605774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerStarted","Data":"86726663164ca08470d8023499565b6de220fd18491f622b7a239d4b600235cf"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.607929 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.6079203 podStartE2EDuration="2.6079203s" podCreationTimestamp="2026-01-20 23:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:12.590457276 +0000 UTC m=+4064.910717564" watchObservedRunningTime="2026-01-20 23:43:12.6079203 +0000 UTC m=+4064.928180588" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.608168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerStarted","Data":"efaf95a472299db4ec824227bec10442d2170ceffedfcfd8f255cf1fe31eea7b"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.613489 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.614438 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.615298 5030 generic.go:334] "Generic (PLEG): container finished" podID="9948629d-78da-4168-98e5-f887c418909f" 
containerID="59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" exitCode=0 Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.615349 5030 generic.go:334] "Generic (PLEG): container finished" podID="9948629d-78da-4168-98e5-f887c418909f" containerID="98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" exitCode=143 Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.616983 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.617138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerDied","Data":"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.617172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerDied","Data":"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.617186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv" event={"ID":"9948629d-78da-4168-98e5-f887c418909f","Type":"ContainerDied","Data":"0f5c253e681107992aa80a89e5d843a4a26d33bb1956288acaae5260478766f4"} Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.621876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.621934 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.621976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.622096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.622119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.622157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jszzs\" (UniqueName: \"kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs\") pod 
\"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.622200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data\") pod \"9948629d-78da-4168-98e5-f887c418909f\" (UID: \"9948629d-78da-4168-98e5-f887c418909f\") " Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.623216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs" (OuterVolumeSpecName: "logs") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.639908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs" (OuterVolumeSpecName: "kube-api-access-jszzs") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "kube-api-access-jszzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.650749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts" (OuterVolumeSpecName: "scripts") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.692840 5030 scope.go:117] "RemoveContainer" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.704041 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=6.704024661 podStartE2EDuration="6.704024661s" podCreationTimestamp="2026-01-20 23:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:12.670067247 +0000 UTC m=+4064.990327545" watchObservedRunningTime="2026-01-20 23:43:12.704024661 +0000 UTC m=+4065.024284949" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.726493 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9948629d-78da-4168-98e5-f887c418909f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.726518 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.726529 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jszzs\" (UniqueName: \"kubernetes.io/projected/9948629d-78da-4168-98e5-f887c418909f-kube-api-access-jszzs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.747288 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.770444 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.773653 5030 scope.go:117] "RemoveContainer" containerID="333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.775216 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414\": container with ID starting with 333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414 not found: ID does not exist" containerID="333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.775480 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414"} err="failed to get container status \"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414\": rpc error: code = NotFound desc = could not find container \"333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414\": container with ID starting with 333adca68e29b9bae2f3440fb85d275fec118f97e652dac5d291b5fdd406d414 not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.775595 5030 scope.go:117] "RemoveContainer" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.776139 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb\": container with ID starting with d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb not found: ID does not exist" containerID="d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.776279 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb"} err="failed to get container status \"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb\": rpc error: code = NotFound desc = could not find container \"d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb\": container with ID starting with d3dd33b0c812061ccaf1e35f1a3726a85dbf1bb9018d310e22e1f52a2452e1fb not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.776373 5030 scope.go:117] "RemoveContainer" containerID="59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.776595 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.786877 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801051 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.801440 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-api" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801458 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-api" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.801472 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="openstack-network-exporter" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801480 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="openstack-network-exporter" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.801502 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-log" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801510 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-log" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.801535 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801542 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801777 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-api" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801794 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9948629d-78da-4168-98e5-f887c418909f" containerName="placement-log" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801809 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="openstack-network-exporter" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.801818 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" containerName="ovn-northd" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.802827 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.802929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data" (OuterVolumeSpecName: "config-data") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.805489 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-l9mtr" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.805855 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.806091 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.806224 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.824930 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.828304 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.840565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.854072 5030 scope.go:117] "RemoveContainer" containerID="98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.917495 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929728 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rgm\" (UniqueName: \"kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 
23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.929993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.930317 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.950614 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.951857 5030 scope.go:117] "RemoveContainer" containerID="59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.952584 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.953484 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb\": container with ID starting with 59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb not found: ID does not exist" containerID="59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.953516 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb"} err="failed to get container status \"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb\": rpc error: code = NotFound desc = could not find container \"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb\": container with ID starting with 59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.953539 5030 scope.go:117] "RemoveContainer" containerID="98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" Jan 20 23:43:12 crc kubenswrapper[5030]: E0120 23:43:12.954145 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b\": container with ID starting with 98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b not found: ID does not exist" containerID="98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.954163 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b"} err="failed to get container status \"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b\": rpc error: code = NotFound desc = could not find container \"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b\": container with ID starting with 98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.954176 5030 scope.go:117] "RemoveContainer" containerID="59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.954521 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb"} err="failed to get container status \"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb\": rpc error: code = NotFound desc = could not find container \"59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb\": container with ID starting with 59f2f16f0f49567c03060fd60081623b39cdf1145df5d3dedc7ab79965572ddb not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.954536 5030 scope.go:117] "RemoveContainer" containerID="98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.954713 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b"} err="failed to get container status \"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b\": rpc error: code = NotFound desc = could not find container \"98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b\": container with ID starting with 98ac66e9b14ecd792908658c7b0994d978ed8d4a02ce97491f4054a96d27279b not found: ID does not exist" Jan 20 23:43:12 crc kubenswrapper[5030]: I0120 23:43:12.965660 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.000739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.033640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rgm\" (UniqueName: \"kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.033728 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.033777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.033825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4hs8\" (UniqueName: \"kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.033949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.034590 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.036736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts\") pod \"ovn-northd-0\" (UID: 
\"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.037730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.039027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.068098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.068190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.072882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rgm\" (UniqueName: \"kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.073373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.099984 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9948629d-78da-4168-98e5-f887c418909f" (UID: "9948629d-78da-4168-98e5-f887c418909f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4hs8\" (UniqueName: \"kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.137374 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9948629d-78da-4168-98e5-f887c418909f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.140558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.141275 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.141527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.151147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.151526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.153319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.156096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4hs8\" (UniqueName: \"kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8\") pod \"placement-6c4cd74858-frhdt\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.192997 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.269096 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.306686 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.4:8778/\": read tcp 10.217.0.2:37192->10.217.1.4:8778: read: connection reset by peer" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.306707 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.4:8778/\": read tcp 10.217.0.2:37184->10.217.1.4:8778: read: connection reset by peer" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.568019 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.601281 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5ffbcd4fdb-84hxv"] Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.650210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerStarted","Data":"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.655941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerStarted","Data":"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.702959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerStarted","Data":"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.703057 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.705362 5030 generic.go:334] "Generic (PLEG): container finished" podID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerID="1feab068d22c049b8496d1357dade67b2f0fdf20291edc9d615dba4d367218dc" exitCode=1 Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.705409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerDied","Data":"1feab068d22c049b8496d1357dade67b2f0fdf20291edc9d615dba4d367218dc"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.705752 5030 scope.go:117] "RemoveContainer" containerID="1feab068d22c049b8496d1357dade67b2f0fdf20291edc9d615dba4d367218dc" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.716815 5030 generic.go:334] "Generic (PLEG): container finished" podID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerID="4c1481e7f538bbd813fc066cbdf4b164b9a4610f8972b882bebb9408284fb510" exitCode=0 Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.716914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" 
event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerDied","Data":"4c1481e7f538bbd813fc066cbdf4b164b9a4610f8972b882bebb9408284fb510"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.720537 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.9461834810000003 podStartE2EDuration="9.72052096s" podCreationTimestamp="2026-01-20 23:43:04 +0000 UTC" firstStartedPulling="2026-01-20 23:43:06.338806889 +0000 UTC m=+4058.659067177" lastFinishedPulling="2026-01-20 23:43:12.113144368 +0000 UTC m=+4064.433404656" observedRunningTime="2026-01-20 23:43:13.71968402 +0000 UTC m=+4066.039944308" watchObservedRunningTime="2026-01-20 23:43:13.72052096 +0000 UTC m=+4066.040781248" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.728715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerStarted","Data":"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.731945 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=9.731929287 podStartE2EDuration="9.731929287s" podCreationTimestamp="2026-01-20 23:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:13.684094527 +0000 UTC m=+4066.004354825" watchObservedRunningTime="2026-01-20 23:43:13.731929287 +0000 UTC m=+4066.052189565" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.733343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerStarted","Data":"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa"} Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.734853 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.734924 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.778564 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.789075 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=9.789036872 podStartE2EDuration="9.789036872s" podCreationTimestamp="2026-01-20 23:43:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:13.760759306 +0000 UTC m=+4066.081019594" watchObservedRunningTime="2026-01-20 23:43:13.789036872 +0000 UTC m=+4066.109297160" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.813944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.960801 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.980599 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ed5aae-25b3-475c-b086-4ef865d35368" path="/var/lib/kubelet/pods/87ed5aae-25b3-475c-b086-4ef865d35368/volumes" Jan 20 23:43:13 crc kubenswrapper[5030]: I0120 23:43:13.981372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9948629d-78da-4168-98e5-f887c418909f" path="/var/lib/kubelet/pods/9948629d-78da-4168-98e5-f887c418909f/volumes" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.067941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068115 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068139 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b6zg\" (UniqueName: \"kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.068243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs\") pod \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\" (UID: \"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd\") " Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.079869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs" (OuterVolumeSpecName: "logs") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.085806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg" (OuterVolumeSpecName: "kube-api-access-5b6zg") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "kube-api-access-5b6zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.086027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts" (OuterVolumeSpecName: "scripts") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.102247 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.117234 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.174867 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.174903 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.174917 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b6zg\" (UniqueName: \"kubernetes.io/projected/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-kube-api-access-5b6zg\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.219851 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data" (OuterVolumeSpecName: "config-data") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.240061 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.240102 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.279932 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.289789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.328007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.382871 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.418357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.458486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" (UID: "b3175aa2-ede3-4c4f-b5c7-c5a1096890cd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.486855 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.486894 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.762054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerStarted","Data":"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.786916 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.786895898 podStartE2EDuration="4.786895898s" podCreationTimestamp="2026-01-20 23:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:14.781466757 +0000 UTC m=+4067.101727045" watchObservedRunningTime="2026-01-20 23:43:14.786895898 +0000 UTC m=+4067.107156186" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.788998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerStarted","Data":"7de8ca058af69798715808c2c705adb228699f92727076f0a21040a1e153dd1b"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.789040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerStarted","Data":"d4812a9e549e5d2fda3fe2a7411347a434f92007728dced134a5fb0a7ce5de0e"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 
23:43:14.804457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerStarted","Data":"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.804516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerStarted","Data":"6bfc653e010a933624d3199768da1c2010cbd0500e1b50c32ab72a246903d334"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.828885 5030 generic.go:334] "Generic (PLEG): container finished" podID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerID="424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023" exitCode=0 Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.829111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerDied","Data":"424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.863530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerStarted","Data":"efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.864833 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.888496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerStarted","Data":"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.888679 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-log" containerID="cri-o://320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" gracePeriod=30 Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.888758 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-api" containerID="cri-o://693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" gracePeriod=30 Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.888797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.889053 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.894721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" event={"ID":"b3175aa2-ede3-4c4f-b5c7-c5a1096890cd","Type":"ContainerDied","Data":"2bcdfe6dfff86b5e656db87b6f5cb63bbc57725e294dba0a83718b6cf6e1154c"} Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.894769 5030 scope.go:117] "RemoveContainer" 
containerID="4c1481e7f538bbd813fc066cbdf4b164b9a4610f8972b882bebb9408284fb510" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.916456 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" podStartSLOduration=3.916438481 podStartE2EDuration="3.916438481s" podCreationTimestamp="2026-01-20 23:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:14.912756632 +0000 UTC m=+4067.233016920" watchObservedRunningTime="2026-01-20 23:43:14.916438481 +0000 UTC m=+4067.236698769" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.916885 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-85c9874f86-xpfrg" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.949128 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:14 crc kubenswrapper[5030]: I0120 23:43:14.976245 5030 scope.go:117] "RemoveContainer" containerID="1b5d6e7723fc432f59bb152ee5c44c0e160790b16eb10a41a085b3d039712a96" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.059825 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.066714 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-85c9874f86-xpfrg"] Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.293934 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.294208 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.664083 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgdfx\" (UniqueName: \"kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.688684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.689436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts\") pod \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\" (UID: \"0d3607ca-2431-4ace-a841-a6cdceb9d0ff\") " Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.689650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs" (OuterVolumeSpecName: "logs") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.690260 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.694129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts" (OuterVolumeSpecName: "scripts") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.701838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx" (OuterVolumeSpecName: "kube-api-access-xgdfx") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "kube-api-access-xgdfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.729094 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.792213 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgdfx\" (UniqueName: \"kubernetes.io/projected/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-kube-api-access-xgdfx\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.792241 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.800525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data" (OuterVolumeSpecName: "config-data") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.804728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.824670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.872705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0d3607ca-2431-4ace-a841-a6cdceb9d0ff" (UID: "0d3607ca-2431-4ace-a841-a6cdceb9d0ff"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.893509 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.893544 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.893558 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.893569 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3607ca-2431-4ace-a841-a6cdceb9d0ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.904937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerStarted","Data":"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907579 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerID="693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" exitCode=0 Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907634 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerID="320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" exitCode=143 Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907653 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerDied","Data":"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerDied","Data":"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-8564b945c8-66n97" event={"ID":"0d3607ca-2431-4ace-a841-a6cdceb9d0ff","Type":"ContainerDied","Data":"efaf95a472299db4ec824227bec10442d2170ceffedfcfd8f255cf1fe31eea7b"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.907734 5030 scope.go:117] "RemoveContainer" containerID="693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.910813 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerID="51240c7f068beb376adfcd6b67226bc6f0f4c7a4350b5e086ba82544702cca04" exitCode=1 Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.910854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerDied","Data":"51240c7f068beb376adfcd6b67226bc6f0f4c7a4350b5e086ba82544702cca04"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.911334 5030 scope.go:117] "RemoveContainer" containerID="51240c7f068beb376adfcd6b67226bc6f0f4c7a4350b5e086ba82544702cca04" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.920101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerStarted","Data":"a6d258088fb951bc34644f2ffa05470352c7ec6b7690b7eaa4f6eb8de5451c0f"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.921221 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.921341 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.923309 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.923343 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.923650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerStarted","Data":"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec"} Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.931210 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6z7st" podStartSLOduration=3.283655231 podStartE2EDuration="5.931191727s" 
podCreationTimestamp="2026-01-20 23:43:10 +0000 UTC" firstStartedPulling="2026-01-20 23:43:12.591282996 +0000 UTC m=+4064.911543284" lastFinishedPulling="2026-01-20 23:43:15.238819492 +0000 UTC m=+4067.559079780" observedRunningTime="2026-01-20 23:43:15.925337615 +0000 UTC m=+4068.245597913" watchObservedRunningTime="2026-01-20 23:43:15.931191727 +0000 UTC m=+4068.251452015" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.952434 5030 scope.go:117] "RemoveContainer" containerID="320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.981572 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" path="/var/lib/kubelet/pods/b3175aa2-ede3-4c4f-b5c7-c5a1096890cd/volumes" Jan 20 23:43:15 crc kubenswrapper[5030]: I0120 23:43:15.983165 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=3.983145718 podStartE2EDuration="3.983145718s" podCreationTimestamp="2026-01-20 23:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:15.956260885 +0000 UTC m=+4068.276521173" watchObservedRunningTime="2026-01-20 23:43:15.983145718 +0000 UTC m=+4068.303406006" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.001385 5030 scope.go:117] "RemoveContainer" containerID="693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" Jan 20 23:43:16 crc kubenswrapper[5030]: E0120 23:43:16.007098 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139\": container with ID starting with 693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139 not found: ID does not exist" containerID="693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.007158 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139"} err="failed to get container status \"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139\": rpc error: code = NotFound desc = could not find container \"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139\": container with ID starting with 693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139 not found: ID does not exist" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.007808 5030 scope.go:117] "RemoveContainer" containerID="320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" Jan 20 23:43:16 crc kubenswrapper[5030]: E0120 23:43:16.010342 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4\": container with ID starting with 320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4 not found: ID does not exist" containerID="320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.010370 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4"} err="failed to get container status 
\"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4\": rpc error: code = NotFound desc = could not find container \"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4\": container with ID starting with 320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4 not found: ID does not exist" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.010388 5030 scope.go:117] "RemoveContainer" containerID="693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.019996 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139"} err="failed to get container status \"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139\": rpc error: code = NotFound desc = could not find container \"693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139\": container with ID starting with 693c8d9593b96d70567d2e841d5536116c5edebed3af8fe381f154c661301139 not found: ID does not exist" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.020028 5030 scope.go:117] "RemoveContainer" containerID="320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.021435 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4"} err="failed to get container status \"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4\": rpc error: code = NotFound desc = could not find container \"320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4\": container with ID starting with 320d5e4ead4bcd20738c1ebe3d4a6cafdac037e3e0df801e9c4dc0c0e87f8db4 not found: ID does not exist" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.021770 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" podStartSLOduration=4.021751504 podStartE2EDuration="4.021751504s" podCreationTimestamp="2026-01-20 23:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:15.999518645 +0000 UTC m=+4068.319778933" watchObservedRunningTime="2026-01-20 23:43:16.021751504 +0000 UTC m=+4068.342011792" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.034933 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.044118 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-8564b945c8-66n97"] Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.605421 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.647370 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.647423 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.650784 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:16 crc 
kubenswrapper[5030]: I0120 23:43:16.652532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.652580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.697435 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.785217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.794705 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.854070 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.934811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerStarted","Data":"22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6"} Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.934857 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.935984 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-log" containerID="cri-o://9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" gracePeriod=30 Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.936080 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-httpd" containerID="cri-o://f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" gracePeriod=30 Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.936387 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" containerName="memcached" containerID="cri-o://b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c" gracePeriod=30 Jan 20 23:43:16 crc kubenswrapper[5030]: I0120 23:43:16.936532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.148673 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.380896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.380946 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.411769 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 
23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.701803 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.701879 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.941521 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945457 5030 generic.go:334] "Generic (PLEG): container finished" podID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerID="f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" exitCode=0 Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945485 5030 generic.go:334] "Generic (PLEG): container finished" podID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerID="9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" exitCode=143 Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945513 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerDied","Data":"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff"} Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerDied","Data":"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035"} Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945570 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"64d99a0e-503c-4dab-a5ef-68c2ff6446a4","Type":"ContainerDied","Data":"e52123514770d37017c3b944ad15d09d15a840f3ccb2d4c8d7d7cbe349bbae0c"} Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.945585 5030 scope.go:117] "RemoveContainer" containerID="f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.947891 5030 generic.go:334] "Generic (PLEG): container finished" podID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerID="efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3" exitCode=1 Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.948008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerDied","Data":"efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3"} Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.948199 5030 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-log" containerID="cri-o://81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402" gracePeriod=30 Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.948233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.948259 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-httpd" containerID="cri-o://77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342" gracePeriod=30 Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.948787 5030 scope.go:117] "RemoveContainer" containerID="efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3" Jan 20 23:43:17 crc kubenswrapper[5030]: E0120 23:43:17.949080 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(04d86b77-716c-4eb7-8976-9e47e1b6df7d)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.956577 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.21:9292/healthcheck\": EOF" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.956591 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.21:9292/healthcheck\": EOF" Jan 20 23:43:17 crc kubenswrapper[5030]: I0120 23:43:17.985949 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" path="/var/lib/kubelet/pods/0d3607ca-2431-4ace-a841-a6cdceb9d0ff/volumes" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.035874 5030 scope.go:117] "RemoveContainer" containerID="9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs\") pod 
\"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg22d\" (UniqueName: \"kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.040853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run\") pod \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\" (UID: \"64d99a0e-503c-4dab-a5ef-68c2ff6446a4\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.046813 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs" (OuterVolumeSpecName: "logs") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.047858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.048538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts" (OuterVolumeSpecName: "scripts") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.062456 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d" (OuterVolumeSpecName: "kube-api-access-bg22d") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "kube-api-access-bg22d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.066793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.085066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.110533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data" (OuterVolumeSpecName: "config-data") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.128363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64d99a0e-503c-4dab-a5ef-68c2ff6446a4" (UID: "64d99a0e-503c-4dab-a5ef-68c2ff6446a4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142863 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142897 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142909 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142919 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142930 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg22d\" (UniqueName: \"kubernetes.io/projected/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-kube-api-access-bg22d\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142960 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142969 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.142980 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d99a0e-503c-4dab-a5ef-68c2ff6446a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.163515 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.163552 5030 scope.go:117] "RemoveContainer" containerID="f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.166610 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff\": container with ID starting with f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff not found: ID does not exist" containerID="f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.166752 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff"} err="failed to get container status \"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff\": rpc error: code = NotFound desc = could not find container \"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff\": container with ID starting with f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff not found: ID does not exist" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.166783 5030 scope.go:117] "RemoveContainer" containerID="9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.167170 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035\": container with ID starting with 9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035 not found: ID does not exist" containerID="9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.167201 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035"} err="failed to get container status \"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035\": rpc error: code = NotFound desc = could not find container \"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035\": container with ID starting with 9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035 not found: ID does not exist" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.167223 5030 scope.go:117] "RemoveContainer" containerID="f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.167553 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff"} err="failed to get container status 
\"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff\": rpc error: code = NotFound desc = could not find container \"f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff\": container with ID starting with f4af87243bf69d16a0b5a87905d15a7b4707bdb308f081417bb4f066a34da8ff not found: ID does not exist" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.167575 5030 scope.go:117] "RemoveContainer" containerID="9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.168032 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035"} err="failed to get container status \"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035\": rpc error: code = NotFound desc = could not find container \"9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035\": container with ID starting with 9deec08386987e578203ea8545634e83c5eb0848099c4a17da113cda493df035 not found: ID does not exist" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.168058 5030 scope.go:117] "RemoveContainer" containerID="1feab068d22c049b8496d1357dade67b2f0fdf20291edc9d615dba4d367218dc" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.244198 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.280733 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.294418 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311173 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311593 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311609 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311624 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311657 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311666 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311672 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311689 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311694 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311705 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-httpd" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311710 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-httpd" Jan 20 23:43:18 crc kubenswrapper[5030]: E0120 23:43:18.311743 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311750 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311918 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311932 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311942 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311953 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" containerName="glance-httpd" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3175aa2-ede3-4c4f-b5c7-c5a1096890cd" containerName="placement-log" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.311976 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3607ca-2431-4ace-a841-a6cdceb9d0ff" containerName="placement-api" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.312952 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.333016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.333264 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.358738 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.395431 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.30:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.395423 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.30:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449641 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449746 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbd6x\" (UniqueName: \"kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 
23:43:18.449837 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.449917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" 
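The reconciler entries above and the MountVolume.SetUp entries that follow trace the kubelet volume manager remounting all eight volumes (combined-ca-bundle, scripts, local-storage19-crc, kube-api-access-vbd6x, internal-tls-certs, config-data, logs, httpd-run) for the recreated glance-default-internal-api-0 pod (UID d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b). As a reading aid only, and not part of the captured journal, the short Go sketch below tallies those SetUp successes per pod from a saved copy of this log, assuming one journal entry per line as journalctl emits them; the program, its regexes, and its output format are illustrative assumptions, not tooling referenced anywhere in the log.

package main

// Minimal sketch (assumed helper, not part of the captured journal): reads
// kubelet journal text on stdin, matches "MountVolume.SetUp succeeded"
// entries in the escaped-quote form shown in this log, and tallies the
// mounted volumes per pod.

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// klog escapes quotes nested inside a quoted message, so volume names
	// appear as \"name\" in the raw journal line.
	volRe := regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"`)
	podRe := regexp.MustCompile(`pod="([^"]+)"`)

	counts := map[string]int{}       // pod -> number of successful SetUp events
	volumes := map[string][]string{} // pod -> volume names seen

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		m := volRe.FindStringSubmatch(line)
		if m == nil {
			continue
		}
		pod := "unknown"
		if pm := podRe.FindStringSubmatch(line); pm != nil {
			pod = pm[1]
		}
		counts[pod]++
		volumes[pod] = append(volumes[pod], m[1])
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}
	for pod, n := range counts {
		fmt.Printf("%s: %d volumes mounted %v\n", pod, n, volumes[pod])
	}
}

Fed this portion of the journal on stdin, it would print something like "openstack-kuttl-tests/glance-default-internal-api-0: 8 volumes mounted [...]", which can be checked against the eight VerifyControllerAttachedVolume entries above.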
Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.551914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbd6x\" (UniqueName: \"kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.553737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.553973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.557835 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.558710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.562355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.579317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.587204 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 
crc kubenswrapper[5030]: I0120 23:43:18.601756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbd6x\" (UniqueName: \"kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.708655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.765372 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle\") pod \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data\") pod \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4nfk\" (UniqueName: \"kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk\") pod \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs\") pod \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config\") pod \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\" (UID: \"862166c4-fd5b-456d-9a91-c2235cdf2aa2\") " Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.862792 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data" (OuterVolumeSpecName: "config-data") pod "862166c4-fd5b-456d-9a91-c2235cdf2aa2" (UID: "862166c4-fd5b-456d-9a91-c2235cdf2aa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.863044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "862166c4-fd5b-456d-9a91-c2235cdf2aa2" (UID: "862166c4-fd5b-456d-9a91-c2235cdf2aa2"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.880817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk" (OuterVolumeSpecName: "kube-api-access-d4nfk") pod "862166c4-fd5b-456d-9a91-c2235cdf2aa2" (UID: "862166c4-fd5b-456d-9a91-c2235cdf2aa2"). InnerVolumeSpecName "kube-api-access-d4nfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.920003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "862166c4-fd5b-456d-9a91-c2235cdf2aa2" (UID: "862166c4-fd5b-456d-9a91-c2235cdf2aa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.940094 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.969531 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.969568 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.969578 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/862166c4-fd5b-456d-9a91-c2235cdf2aa2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.969589 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4nfk\" (UniqueName: \"kubernetes.io/projected/862166c4-fd5b-456d-9a91-c2235cdf2aa2-kube-api-access-d4nfk\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.981649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "862166c4-fd5b-456d-9a91-c2235cdf2aa2" (UID: "862166c4-fd5b-456d-9a91-c2235cdf2aa2"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.997208 5030 generic.go:334] "Generic (PLEG): container finished" podID="527ab23e-825f-4355-9abf-e3858c484683" containerID="81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402" exitCode=143 Jan 20 23:43:18 crc kubenswrapper[5030]: I0120 23:43:18.997289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerDied","Data":"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402"} Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.025584 5030 generic.go:334] "Generic (PLEG): container finished" podID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" containerID="b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c" exitCode=0 Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.026436 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.027721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"862166c4-fd5b-456d-9a91-c2235cdf2aa2","Type":"ContainerDied","Data":"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c"} Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.027786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"862166c4-fd5b-456d-9a91-c2235cdf2aa2","Type":"ContainerDied","Data":"fb94f334ddb9cbe51f468a0c4ed6ee52c88491d6ee338870b8b94b48e17fdbf4"} Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.027805 5030 scope.go:117] "RemoveContainer" containerID="b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.069925 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.078993 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/862166c4-fd5b-456d-9a91-c2235cdf2aa2-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.102300 5030 scope.go:117] "RemoveContainer" containerID="b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c" Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.103011 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c\": container with ID starting with b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c not found: ID does not exist" containerID="b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.103034 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c"} err="failed to get container status \"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c\": rpc error: code = NotFound desc = could not find container \"b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c\": container with ID starting with b975eea71e79ae10099bf416b39c08cf81c77876459e96d57d47c85f13049d1c not found: ID does not exist" Jan 20 
23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.106702 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.137705 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.138803 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" containerName="memcached" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.138818 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" containerName="memcached" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.139174 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" containerName="memcached" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.140142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.140231 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.147167 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.147410 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.147541 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-bmf9h" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.197915 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.237989 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.285889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqrg\" (UniqueName: \"kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.286042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.286109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.286215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.286245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.390725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.390786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.390831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqrg\" (UniqueName: \"kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.390890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.390929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.391767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.392222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.399357 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.400155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.406021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.412858 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.417048 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqrg\" (UniqueName: \"kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg\") pod \"memcached-0\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.540301 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.553219 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:43:19 crc kubenswrapper[5030]: W0120 23:43:19.586837 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ca0f14_59f2_44f0_91f4_e0c2c8b43a2b.slice/crio-05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3 WatchSource:0}: Error finding container 05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3: Status 404 returned error can't find the container with id 05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3 Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.776704 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.777768 5030 scope.go:117] "RemoveContainer" containerID="efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3" Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.777990 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(04d86b77-716c-4eb7-8976-9e47e1b6df7d)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.794461 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6 is running failed: container process not found" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.794917 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6 is running failed: container process not found" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.795360 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6 is running failed: container process not found" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:43:19 crc kubenswrapper[5030]: E0120 23:43:19.795392 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6 is running failed: container process not found" probeType="Liveness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.985060 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d99a0e-503c-4dab-a5ef-68c2ff6446a4" path="/var/lib/kubelet/pods/64d99a0e-503c-4dab-a5ef-68c2ff6446a4/volumes" Jan 20 23:43:19 crc kubenswrapper[5030]: I0120 23:43:19.986031 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862166c4-fd5b-456d-9a91-c2235cdf2aa2" path="/var/lib/kubelet/pods/862166c4-fd5b-456d-9a91-c2235cdf2aa2/volumes" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.008003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:43:20 crc kubenswrapper[5030]: W0120 23:43:20.012591 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd10e94b1_91a0_4631_a7b8_763d41c403ce.slice/crio-aa1b8741581bc40fbae399648856b94373f42f25418cdd24c54a607bec53b8ab WatchSource:0}: Error finding container aa1b8741581bc40fbae399648856b94373f42f25418cdd24c54a607bec53b8ab: Status 404 returned error can't find the container with id aa1b8741581bc40fbae399648856b94373f42f25418cdd24c54a607bec53b8ab Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.036416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerStarted","Data":"05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3"} Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.038087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d10e94b1-91a0-4631-a7b8-763d41c403ce","Type":"ContainerStarted","Data":"aa1b8741581bc40fbae399648856b94373f42f25418cdd24c54a607bec53b8ab"} Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.042430 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" exitCode=1 Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.042551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" 
event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerDied","Data":"22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6"} Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.042606 5030 scope.go:117] "RemoveContainer" containerID="51240c7f068beb376adfcd6b67226bc6f0f4c7a4350b5e086ba82544702cca04" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.043026 5030 scope.go:117] "RemoveContainer" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" Jan 20 23:43:20 crc kubenswrapper[5030]: E0120 23:43:20.043255 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(1d6e18e5-718a-4578-8b74-add47cb38d2d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.735806 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.740836 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.752069 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.753504 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.806250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.806313 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.841184 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.841416 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api-log" containerID="cri-o://6dcae7ae7c484584017f7fa0aca6aa0cb08528fa8a61b4d048214746eb80c8b1" gracePeriod=30 Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.845780 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api" containerID="cri-o://13859d5f41c3b61a0c89700bdb6935b58c5bf928f2bbdf63a92f7c1183771d73" gracePeriod=30 Jan 20 23:43:20 crc kubenswrapper[5030]: I0120 23:43:20.885939 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.052350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d10e94b1-91a0-4631-a7b8-763d41c403ce","Type":"ContainerStarted","Data":"042c3ddc337801a1de68a99e66b3565634736bf58285d9ac8ac3ca69a8d4aca1"} Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.052749 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.054902 5030 generic.go:334] "Generic (PLEG): container finished" podID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerID="6dcae7ae7c484584017f7fa0aca6aa0cb08528fa8a61b4d048214746eb80c8b1" exitCode=143 Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.054954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerDied","Data":"6dcae7ae7c484584017f7fa0aca6aa0cb08528fa8a61b4d048214746eb80c8b1"} Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.059372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerStarted","Data":"b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb"} Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.059407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerStarted","Data":"0c5c3896b95122ba2528e0d7943d1eba9eaee2155abb09b973a5236b470342ce"} Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.074356 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.074339204 podStartE2EDuration="2.074339204s" podCreationTimestamp="2026-01-20 23:43:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:21.06842236 +0000 UTC m=+4073.388682648" watchObservedRunningTime="2026-01-20 23:43:21.074339204 +0000 UTC m=+4073.394599492" Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.091234 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.091213573 podStartE2EDuration="3.091213573s" podCreationTimestamp="2026-01-20 23:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:21.087162505 +0000 UTC m=+4073.407422793" watchObservedRunningTime="2026-01-20 23:43:21.091213573 +0000 UTC m=+4073.411473861" Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.110243 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:21 crc kubenswrapper[5030]: I0120 23:43:21.151412 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.187497 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:22 crc 
kubenswrapper[5030]: I0120 23:43:22.187863 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-central-agent" containerID="cri-o://22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d" gracePeriod=30 Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.187990 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-notification-agent" containerID="cri-o://437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5" gracePeriod=30 Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.187948 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="sg-core" containerID="cri-o://aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7" gracePeriod=30 Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.188147 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="proxy-httpd" containerID="cri-o://3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9" gracePeriod=30 Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.452854 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.31:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.543373 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.706808 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.794090 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:22 crc kubenswrapper[5030]: I0120 23:43:22.794931 5030 scope.go:117] "RemoveContainer" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" Jan 20 23:43:22 crc kubenswrapper[5030]: E0120 23:43:22.795219 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(1d6e18e5-718a-4578-8b74-add47cb38d2d)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.083212 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" 
containerID="aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7" exitCode=2 Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.083249 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerID="22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d" exitCode=0 Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.083246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerDied","Data":"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7"} Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.083288 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerDied","Data":"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d"} Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.083460 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6z7st" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="registry-server" containerID="cri-o://830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09" gracePeriod=2 Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.778128 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.788691 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.876845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities\") pod \"2a95b578-1de2-4b88-88dd-63ff04e9020f\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.877142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qpwf\" (UniqueName: \"kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf\") pod \"2a95b578-1de2-4b88-88dd-63ff04e9020f\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.877263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content\") pod \"2a95b578-1de2-4b88-88dd-63ff04e9020f\" (UID: \"2a95b578-1de2-4b88-88dd-63ff04e9020f\") " Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.878494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities" (OuterVolumeSpecName: "utilities") pod "2a95b578-1de2-4b88-88dd-63ff04e9020f" (UID: "2a95b578-1de2-4b88-88dd-63ff04e9020f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.886864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf" (OuterVolumeSpecName: "kube-api-access-9qpwf") pod "2a95b578-1de2-4b88-88dd-63ff04e9020f" (UID: "2a95b578-1de2-4b88-88dd-63ff04e9020f"). 
InnerVolumeSpecName "kube-api-access-9qpwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.898807 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="galera" containerID="cri-o://70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e" gracePeriod=30 Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.922991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a95b578-1de2-4b88-88dd-63ff04e9020f" (UID: "2a95b578-1de2-4b88-88dd-63ff04e9020f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.936901 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.980979 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.981008 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a95b578-1de2-4b88-88dd-63ff04e9020f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:23 crc kubenswrapper[5030]: I0120 23:43:23.981021 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qpwf\" (UniqueName: \"kubernetes.io/projected/2a95b578-1de2-4b88-88dd-63ff04e9020f-kube-api-access-9qpwf\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.081828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.081874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.081958 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082023 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs\") pod 
\"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znzk9\" (UniqueName: \"kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"527ab23e-825f-4355-9abf-e3858c484683\" (UID: \"527ab23e-825f-4355-9abf-e3858c484683\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.082977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.083388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs" (OuterVolumeSpecName: "logs") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.089412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9" (OuterVolumeSpecName: "kube-api-access-znzk9") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "kube-api-access-znzk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.093195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.098700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts" (OuterVolumeSpecName: "scripts") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.112905 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerID="3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9" exitCode=0 Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.112966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerDied","Data":"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.121271 5030 generic.go:334] "Generic (PLEG): container finished" podID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerID="830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09" exitCode=0 Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.121352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerDied","Data":"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.121397 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z7st" event={"ID":"2a95b578-1de2-4b88-88dd-63ff04e9020f","Type":"ContainerDied","Data":"2a2ff61ab22d58961a6d4813e63680f01d23f60c14d9f4220f14d5ad8703406c"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.121400 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z7st" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.121418 5030 scope.go:117] "RemoveContainer" containerID="830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.128552 5030 generic.go:334] "Generic (PLEG): container finished" podID="527ab23e-825f-4355-9abf-e3858c484683" containerID="77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342" exitCode=0 Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.128617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerDied","Data":"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.128662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"527ab23e-825f-4355-9abf-e3858c484683","Type":"ContainerDied","Data":"1dbff2e4713380c96cd10e70c51df97e55624a112a5058f20809ac1e24049788"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.128924 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.129694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.139515 5030 generic.go:334] "Generic (PLEG): container finished" podID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerID="13859d5f41c3b61a0c89700bdb6935b58c5bf928f2bbdf63a92f7c1183771d73" exitCode=0 Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.139788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerDied","Data":"13859d5f41c3b61a0c89700bdb6935b58c5bf928f2bbdf63a92f7c1183771d73"} Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.150104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data" (OuterVolumeSpecName: "config-data") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.161920 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.177227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "527ab23e-825f-4355-9abf-e3858c484683" (UID: "527ab23e-825f-4355-9abf-e3858c484683"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.181151 5030 scope.go:117] "RemoveContainer" containerID="424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.185289 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6z7st"] Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186129 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186147 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/527ab23e-825f-4355-9abf-e3858c484683-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186157 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186169 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186178 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186188 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znzk9\" (UniqueName: 
\"kubernetes.io/projected/527ab23e-825f-4355-9abf-e3858c484683-kube-api-access-znzk9\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186198 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527ab23e-825f-4355-9abf-e3858c484683-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.186227 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.207697 5030 scope.go:117] "RemoveContainer" containerID="34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.210757 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.240974 5030 scope.go:117] "RemoveContainer" containerID="830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.248237 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09\": container with ID starting with 830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09 not found: ID does not exist" containerID="830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.248299 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09"} err="failed to get container status \"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09\": rpc error: code = NotFound desc = could not find container \"830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09\": container with ID starting with 830e341a8e288fa9122e208b4222270702aa246660044831d9c54d42f91e8f09 not found: ID does not exist" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.248334 5030 scope.go:117] "RemoveContainer" containerID="424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.249766 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023\": container with ID starting with 424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023 not found: ID does not exist" containerID="424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.249802 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023"} err="failed to get container status \"424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023\": rpc error: code = NotFound desc = could not find container \"424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023\": container with ID starting with 424c93f4038942de5f50f7c4a340318ffd69ecb302f7a35747632d938e54e023 not found: ID does not exist" Jan 20 23:43:24 crc 
kubenswrapper[5030]: I0120 23:43:24.249819 5030 scope.go:117] "RemoveContainer" containerID="34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.250938 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9\": container with ID starting with 34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9 not found: ID does not exist" containerID="34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.250994 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9"} err="failed to get container status \"34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9\": rpc error: code = NotFound desc = could not find container \"34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9\": container with ID starting with 34dad5f312e352d038ff1235be5085442c825ba1357b91409e220096727216b9 not found: ID does not exist" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.251020 5030 scope.go:117] "RemoveContainer" containerID="77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.289262 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.290477 5030 scope.go:117] "RemoveContainer" containerID="81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.409336 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.419712 5030 scope.go:117] "RemoveContainer" containerID="77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.420099 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342\": container with ID starting with 77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342 not found: ID does not exist" containerID="77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.420132 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342"} err="failed to get container status \"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342\": rpc error: code = NotFound desc = could not find container \"77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342\": container with ID starting with 77811e0d82b6720b021978cb34a90c19c14e60268e314bc8de7e4c5e39e50342 not found: ID does not exist" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.420148 5030 scope.go:117] "RemoveContainer" containerID="81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.420493 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402\": container with ID starting with 81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402 not found: ID does not exist" containerID="81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.420516 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402"} err="failed to get container status \"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402\": rpc error: code = NotFound desc = could not find container \"81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402\": container with ID starting with 81e32e53c3f6a49ddd25bb2402c3925b0d11ecd9d86657224a2a426078ebe402 not found: ID does not exist" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.422835 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aacc04a_8369_4d86_a209_51905de33654.slice/crio-conmon-70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bb7ee5a_b9bc_40e6_b42c_3a9f9da7687d.slice/crio-437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.463393 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.474199 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.484441 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.484901 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.484919 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.484944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="extract-utilities" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.484951 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="extract-utilities" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.484971 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="extract-content" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.484976 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="extract-content" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.484992 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-log" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.484998 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-log" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.485011 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api-log" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485017 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api-log" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.485024 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="registry-server" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485030 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="registry-server" Jan 20 23:43:24 crc kubenswrapper[5030]: E0120 23:43:24.485044 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-httpd" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485051 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-httpd" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485217 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485238 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-httpd" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485248 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" containerName="registry-server" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485260 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="527ab23e-825f-4355-9abf-e3858c484683" containerName="glance-log" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.485275 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" containerName="barbican-api-log" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.486284 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.488097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.492327 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84bh7\" (UniqueName: \"kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.493959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs\") pod \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\" (UID: \"6e590045-baa7-4ef7-87a4-4ab4a64161b3\") " Jan 20 23:43:24 crc 
kubenswrapper[5030]: I0120 23:43:24.495078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs" (OuterVolumeSpecName: "logs") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.497189 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.502160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7" (OuterVolumeSpecName: "kube-api-access-84bh7") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "kube-api-access-84bh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.502318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.557696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.573672 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data" (OuterVolumeSpecName: "config-data") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.575368 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.583480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6e590045-baa7-4ef7-87a4-4ab4a64161b3" (UID: "6e590045-baa7-4ef7-87a4-4ab4a64161b3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.596090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4jk\" (UniqueName: \"kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.597899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598154 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598407 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598949 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598966 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598976 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598985 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e590045-baa7-4ef7-87a4-4ab4a64161b3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.598995 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.599007 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84bh7\" (UniqueName: \"kubernetes.io/projected/6e590045-baa7-4ef7-87a4-4ab4a64161b3-kube-api-access-84bh7\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.599017 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e590045-baa7-4ef7-87a4-4ab4a64161b3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700417 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4jk\" (UniqueName: 
\"kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.700603 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.701053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.702708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.703198 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.704744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.708587 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.710155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.712173 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.722929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4jk\" (UniqueName: \"kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.737443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.810642 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.823873 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.831411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnkff\" (UniqueName: \"kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905719 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k2m7\" (UniqueName: \"kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.905966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml\") pod 
\"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default\") pod \"0aacc04a-8369-4d86-a209-51905de33654\" (UID: \"0aacc04a-8369-4d86-a209-51905de33654\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906250 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.906291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts\") pod \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\" (UID: \"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d\") " Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.907890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.911094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.914345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff" (OuterVolumeSpecName: "kube-api-access-vnkff") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "kube-api-access-vnkff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.916991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.917048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.917340 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts" (OuterVolumeSpecName: "scripts") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.917576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.917591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.930359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7" (OuterVolumeSpecName: "kube-api-access-8k2m7") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "kube-api-access-8k2m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.938237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.953309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.965192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:24 crc kubenswrapper[5030]: I0120 23:43:24.984383 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0aacc04a-8369-4d86-a209-51905de33654" (UID: "0aacc04a-8369-4d86-a209-51905de33654"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.005613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009292 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009344 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009354 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009377 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009421 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnkff\" (UniqueName: \"kubernetes.io/projected/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-kube-api-access-vnkff\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009431 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aacc04a-8369-4d86-a209-51905de33654-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009440 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k2m7\" (UniqueName: \"kubernetes.io/projected/0aacc04a-8369-4d86-a209-51905de33654-kube-api-access-8k2m7\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.009450 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc 
kubenswrapper[5030]: I0120 23:43:25.012354 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.012374 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.012385 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.012394 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.012403 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aacc04a-8369-4d86-a209-51905de33654-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.012411 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aacc04a-8369-4d86-a209-51905de33654-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.028723 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.032798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.043961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data" (OuterVolumeSpecName: "config-data") pod "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" (UID: "8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.113952 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.114323 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.114337 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.150327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" event={"ID":"6e590045-baa7-4ef7-87a4-4ab4a64161b3","Type":"ContainerDied","Data":"ce7139799de264d52e74dfa0f7516e35af51cfbd6733c10e1f41e617940e9330"} Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.150373 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.150385 5030 scope.go:117] "RemoveContainer" containerID="13859d5f41c3b61a0c89700bdb6935b58c5bf928f2bbdf63a92f7c1183771d73" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.151883 5030 generic.go:334] "Generic (PLEG): container finished" podID="0aacc04a-8369-4d86-a209-51905de33654" containerID="70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e" exitCode=0 Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.151940 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.151941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerDied","Data":"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e"} Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.152052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"0aacc04a-8369-4d86-a209-51905de33654","Type":"ContainerDied","Data":"39acce0069abdd1fa0016b0b151911ccd2b22d0da92fab8d6a1b6727f97e67d0"} Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.155253 5030 generic.go:334] "Generic (PLEG): container finished" podID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerID="437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5" exitCode=0 Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.155291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerDied","Data":"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5"} Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.155331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d","Type":"ContainerDied","Data":"68ca516876cf13491d4b0ff89ccfacff2571b2db918b083cc72996f74a4600ac"} Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.155333 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.171670 5030 scope.go:117] "RemoveContainer" containerID="6dcae7ae7c484584017f7fa0aca6aa0cb08528fa8a61b4d048214746eb80c8b1" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.186541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.201931 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-c9cc68596-zfxv7"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.203810 5030 scope.go:117] "RemoveContainer" containerID="70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.219734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.227411 5030 scope.go:117] "RemoveContainer" containerID="68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.231028 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.247398 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.250794 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:25 crc kubenswrapper[5030]: 
I0120 23:43:25.250892 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.271805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.271868 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="mysql-bootstrap" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272269 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="mysql-bootstrap" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272286 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="proxy-httpd" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272293 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="proxy-httpd" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272307 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="galera" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="galera" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-notification-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272344 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-notification-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272352 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="sg-core" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272359 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="sg-core" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.272369 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-central-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272375 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-central-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="sg-core" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272570 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-central-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272581 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="proxy-httpd" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272596 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" containerName="ceilometer-notification-agent" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.272610 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aacc04a-8369-4d86-a209-51905de33654" containerName="galera" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.273534 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.275764 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.275777 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.278734 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.279004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-j9wjx" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.279110 5030 scope.go:117] "RemoveContainer" containerID="70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.279306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.282573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.282711 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e\": container with ID starting with 70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e not found: ID does not exist" containerID="70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.282742 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e"} err="failed to get container status \"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e\": rpc error: code = NotFound desc = could not find container \"70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e\": container with ID starting with 70b371668598847da8285d3ace2e484cc3e9b62a7f78bcb3c559f59789a4a22e not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.282763 5030 scope.go:117] "RemoveContainer" containerID="68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.283322 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5\": container with ID starting with 68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5 not found: ID does not exist" containerID="68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.283349 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5"} err="failed to get container status \"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5\": rpc error: code = NotFound desc = could not find container \"68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5\": container with ID starting with 68f05a4b2107b10954ba17eed85dc844f830039874e21f94175e99351cd0d1c5 not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.283374 5030 scope.go:117] "RemoveContainer" containerID="3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.283650 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.284390 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.284751 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.287790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.295300 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: W0120 23:43:25.295965 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59dc3b96_a86c_49bf_b1cd_dd0c0f72ac54.slice/crio-e90b4616815292d0dbc1a382729855d89e7175a6f3f1f6bc5729b620033b9d96 WatchSource:0}: Error finding container e90b4616815292d0dbc1a382729855d89e7175a6f3f1f6bc5729b620033b9d96: Status 404 returned error can't find the container with id e90b4616815292d0dbc1a382729855d89e7175a6f3f1f6bc5729b620033b9d96 Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.317342 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xtx\" (UniqueName: \"kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321183 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf4r\" (UniqueName: \"kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.321903 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.324524 5030 scope.go:117] "RemoveContainer" containerID="aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7" Jan 20 23:43:25 crc 
kubenswrapper[5030]: I0120 23:43:25.354138 5030 scope.go:117] "RemoveContainer" containerID="437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.373227 5030 scope.go:117] "RemoveContainer" containerID="22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.406842 5030 scope.go:117] "RemoveContainer" containerID="3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.407333 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9\": container with ID starting with 3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9 not found: ID does not exist" containerID="3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.407395 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9"} err="failed to get container status \"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9\": rpc error: code = NotFound desc = could not find container \"3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9\": container with ID starting with 3080ff8c5f746f4b16193cc0fc72270cf4d129efbbfdc598db6620ca200e79a9 not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.407437 5030 scope.go:117] "RemoveContainer" containerID="aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.407984 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7\": container with ID starting with aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7 not found: ID does not exist" containerID="aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.408014 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7"} err="failed to get container status \"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7\": rpc error: code = NotFound desc = could not find container \"aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7\": container with ID starting with aff3c22c8090855f1bc142b62bd3dbd94c3ee26aaf152b76668467680a42dbb7 not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.408037 5030 scope.go:117] "RemoveContainer" containerID="437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.408395 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5\": container with ID starting with 437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5 not found: ID does not exist" containerID="437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.408438 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5"} err="failed to get container status \"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5\": rpc error: code = NotFound desc = could not find container \"437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5\": container with ID starting with 437c030cb31177fde2528b8fd61c1ab9b06efd6075f53cf7a556e982fc98c5c5 not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.408466 5030 scope.go:117] "RemoveContainer" containerID="22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d" Jan 20 23:43:25 crc kubenswrapper[5030]: E0120 23:43:25.408772 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d\": container with ID starting with 22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d not found: ID does not exist" containerID="22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.408799 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d"} err="failed to get container status \"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d\": rpc error: code = NotFound desc = could not find container \"22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d\": container with ID starting with 22080769a37ddfd631226ed565d7ef82f827787272f10813765651405967da1d not found: ID does not exist" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xtx\" (UniqueName: \"kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf4r\" (UniqueName: \"kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.425995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.426024 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.426065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.426255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.426749 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.426988 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.427767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.428527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.428835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.429943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.431097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.431224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.431515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.432240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.433942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.434374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.434888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.447062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf4r\" (UniqueName: \"kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.447696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xtx\" (UniqueName: \"kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx\") pod \"ceilometer-0\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.455402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.593651 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.613860 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.974351 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aacc04a-8369-4d86-a209-51905de33654" path="/var/lib/kubelet/pods/0aacc04a-8369-4d86-a209-51905de33654/volumes" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.975618 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a95b578-1de2-4b88-88dd-63ff04e9020f" path="/var/lib/kubelet/pods/2a95b578-1de2-4b88-88dd-63ff04e9020f/volumes" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.977009 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527ab23e-825f-4355-9abf-e3858c484683" path="/var/lib/kubelet/pods/527ab23e-825f-4355-9abf-e3858c484683/volumes" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.977610 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e590045-baa7-4ef7-87a4-4ab4a64161b3" path="/var/lib/kubelet/pods/6e590045-baa7-4ef7-87a4-4ab4a64161b3/volumes" Jan 20 23:43:25 crc kubenswrapper[5030]: I0120 23:43:25.978228 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d" path="/var/lib/kubelet/pods/8bb7ee5a-b9bc-40e6-b42c-3a9f9da7687d/volumes" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.145586 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.169430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerStarted","Data":"34bf0052a34fd46a4d24f69b53361321b70feba66f09d756c858bd959b1eac5b"} Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.173669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerStarted","Data":"b54bda34645aa30be316ecd336547283562d059fb3040eb96063b7c53a7e4521"} Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.173707 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerStarted","Data":"e90b4616815292d0dbc1a382729855d89e7175a6f3f1f6bc5729b620033b9d96"} Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.206273 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:26 crc kubenswrapper[5030]: W0120 23:43:26.207634 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb755d31_fd61_4e84_9e18_5acc68934a58.slice/crio-6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a WatchSource:0}: Error finding container 6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a: Status 404 returned error can't find the container with id 6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.762176 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:26 crc 
kubenswrapper[5030]: I0120 23:43:26.764802 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.773062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.847067 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95fv8\" (UniqueName: \"kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.851519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.952730 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953062 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95fv8\" (UniqueName: \"kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.953547 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.962132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc 
kubenswrapper[5030]: I0120 23:43:26.972091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.977075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.977545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.985526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:26 crc kubenswrapper[5030]: I0120 23:43:26.995384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95fv8\" (UniqueName: \"kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8\") pod \"barbican-api-6bcbc6fc44-wcxwv\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.099440 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.187066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerStarted","Data":"1a73e1e4df6206bcd4110d3e04305f833063815efa94750aa87d060b68f615e8"} Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.190694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerStarted","Data":"cfa5d12ec8dd35e4a892ca6be634b4e7bf04ed77c8dbf172c3f5aca0da154dd3"} Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.190721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerStarted","Data":"6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a"} Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.192109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerStarted","Data":"4d4ad5a6d7d11df4886fa6a22b4f7a1bd73a0b2aeaa3ab4e6b5789b085035912"} Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.248109 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.24808853 podStartE2EDuration="3.24808853s" podCreationTimestamp="2026-01-20 23:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:27.24398688 +0000 UTC m=+4079.564247168" watchObservedRunningTime="2026-01-20 23:43:27.24808853 +0000 UTC m=+4079.568348818" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.272293 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="galera" containerID="cri-o://7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa" gracePeriod=30 Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.401985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.416960 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.442739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.453318 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.482109 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.483733 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.494791 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.31:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.495357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.604114 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.669998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks94d\" (UniqueName: \"kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.670805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks94d\" (UniqueName: \"kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.774380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.776687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.782423 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.784031 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.786853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.788368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.789391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.793855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks94d\" (UniqueName: \"kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d\") pod \"barbican-api-748859d488-6s2vx\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.803728 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:27 crc kubenswrapper[5030]: I0120 23:43:27.998366 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.178524 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.14:9696/\": dial tcp 10.217.1.14:9696: connect: connection refused" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.180607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8n8m\" (UniqueName: \"kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle\") pod \"9c11e361-d0f4-49be-9b4b-96c109c78e43\" (UID: \"9c11e361-d0f4-49be-9b4b-96c109c78e43\") " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181638 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.181874 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.182129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.182144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.182611 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.189076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m" (OuterVolumeSpecName: "kube-api-access-h8n8m") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "kube-api-access-h8n8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.193202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211588 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerStarted","Data":"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerStarted","Data":"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerStarted","Data":"290f7afe2f80b3b10dd11c663f0e74fcacec92b8cca35955cec5a57fb7592fef"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211747 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api-log" containerID="cri-o://0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1" gracePeriod=30 Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.211845 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api" containerID="cri-o://86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22" gracePeriod=30 Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.214094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.215531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerStarted","Data":"df80f7331fd2acd35e1f5223ad9813293a66e401c708fb7402e8df96840c4ac3"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.217675 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerID="7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa" exitCode=0 Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.217830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerDied","Data":"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.217874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9c11e361-d0f4-49be-9b4b-96c109c78e43","Type":"ContainerDied","Data":"4fe7cbbb90e11abae4a3861d145f392b89da7ccfc8bfe18fdc8475b08c238ae8"} Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.217890 5030 scope.go:117] "RemoveContainer" containerID="7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.217938 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.230125 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.233232 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" podStartSLOduration=2.233211308 podStartE2EDuration="2.233211308s" podCreationTimestamp="2026-01-20 23:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:28.227164071 +0000 UTC m=+4080.547424359" watchObservedRunningTime="2026-01-20 23:43:28.233211308 +0000 UTC m=+4080.553471596" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.251265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9c11e361-d0f4-49be-9b4b-96c109c78e43" (UID: "9c11e361-d0f4-49be-9b4b-96c109c78e43"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.268986 5030 scope.go:117] "RemoveContainer" containerID="fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283258 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283286 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8n8m\" (UniqueName: \"kubernetes.io/projected/9c11e361-d0f4-49be-9b4b-96c109c78e43-kube-api-access-h8n8m\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283297 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283308 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9c11e361-d0f4-49be-9b4b-96c109c78e43-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283318 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c11e361-d0f4-49be-9b4b-96c109c78e43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283327 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9c11e361-d0f4-49be-9b4b-96c109c78e43-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.283354 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.287705 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.307730 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.317398 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.324210 5030 scope.go:117] "RemoveContainer" containerID="7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa" Jan 20 23:43:28 crc kubenswrapper[5030]: E0120 23:43:28.327821 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa\": container with ID starting with 7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa not found: ID does not exist" containerID="7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.328001 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa"} err="failed to get container status \"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa\": rpc error: code = NotFound desc = could not find container \"7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa\": container with ID starting with 7d15586f500f915a660186440c2bca1909e77741fb7263388082aea6d2bfeafa not found: ID does not exist" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.328183 5030 scope.go:117] "RemoveContainer" containerID="fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52" Jan 20 23:43:28 crc kubenswrapper[5030]: E0120 23:43:28.334751 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52\": container with ID starting with fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52 not found: ID does not exist" containerID="fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.334794 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52"} err="failed to get container status \"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52\": rpc error: code = NotFound desc = could not find container \"fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52\": container with ID starting with fb97ce3a33b1f71f846cb119e808bd2b5c362dd51d11650e41e745ce64e0db52 not found: ID does not exist" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.384966 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.551045 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.566070 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.575803 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:28 crc kubenswrapper[5030]: E0120 23:43:28.576203 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="galera" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.576222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="galera" Jan 20 23:43:28 crc kubenswrapper[5030]: E0120 23:43:28.576236 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="mysql-bootstrap" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.576242 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="mysql-bootstrap" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.576421 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" containerName="galera" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.577547 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.585096 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.585831 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.585917 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.586020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-rjg7g" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.593715 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tbfq\" (UniqueName: \"kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693072 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693132 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693164 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: 
I0120 23:43:28.693294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.693351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.795879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.795926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.795991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.796040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.796067 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.796115 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.796184 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tbfq\" (UniqueName: \"kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 
crc kubenswrapper[5030]: I0120 23:43:28.796210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.797121 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.797514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.798110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.798523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.800551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.800587 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.803603 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.816148 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tbfq\" (UniqueName: \"kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc 
kubenswrapper[5030]: I0120 23:43:28.864019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.941947 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:28 crc kubenswrapper[5030]: I0120 23:43:28.942307 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.063106 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.265583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerStarted","Data":"44672b9e5ac71bd9b91365302a913d20bb6e00d96ec91f69b8166b36d5e7f2e4"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.265677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerStarted","Data":"f878fb22f76e5c3d5c5aad54d40d91cba28b59e426192e35705dec229d359f10"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.265699 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerStarted","Data":"629c2486ecf452c9efd0ca732ce13bdefc27a6afb438a3cfef8c9deefa36d3a9"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.268454 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.268495 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.269825 5030 generic.go:334] "Generic (PLEG): container finished" podID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerID="1a73e1e4df6206bcd4110d3e04305f833063815efa94750aa87d060b68f615e8" exitCode=0 Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.269917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerDied","Data":"1a73e1e4df6206bcd4110d3e04305f833063815efa94750aa87d060b68f615e8"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.271437 5030 generic.go:334] "Generic (PLEG): container finished" podID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerID="0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1" exitCode=143 Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.271484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerDied","Data":"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.280808 5030 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerStarted","Data":"78a21b2039a28bb9676a5f052208d7607b918ac03ca2faa1ba709a6784895363"} Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.314467 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podStartSLOduration=2.314448427 podStartE2EDuration="2.314448427s" podCreationTimestamp="2026-01-20 23:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:29.300775025 +0000 UTC m=+4081.621035323" watchObservedRunningTime="2026-01-20 23:43:29.314448427 +0000 UTC m=+4081.634708715" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.541670 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.609122 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.610734 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.674881 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.698679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.698933 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="8eab155c-8ebf-4332-9d47-16df303833b7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://eaf5b25e80fbc00732ee0cd1b7677c0ff4326220df49407113e8e594514df699" gracePeriod=30 Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.716177 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.912715 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.913185 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-api" containerID="cri-o://ad6f8e58c9dff12d340a8632ebe13ee33bdf79383a461b73e54bcbe069e202bd" gracePeriod=30 Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.913973 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" containerID="cri-o://6d4fdd95933dd76f66126e2eafeb8812ed0de780243beac0c33000b1c9a049c3" gracePeriod=30 Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.948149 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.950439 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:29 crc kubenswrapper[5030]: I0120 23:43:29.965046 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.20:9696/\": EOF" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.001730 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c11e361-d0f4-49be-9b4b-96c109c78e43" path="/var/lib/kubelet/pods/9c11e361-d0f4-49be-9b4b-96c109c78e43/volumes" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.002329 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.026588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.026895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.026918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.026941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645pl\" (UniqueName: \"kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.026960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.027056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.027115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645pl\" (UniqueName: \"kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.128969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.137340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.141250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.141481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.141823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.144651 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.151259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.156235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645pl\" (UniqueName: \"kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl\") pod \"neutron-696d6757b4-nkrlp\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.267348 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.288442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerStarted","Data":"3a08091a8839c78e509146461ec023c2f7f8a34d9700b1f59f27ad351db63a85"} Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.291036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerStarted","Data":"3a021e6e1de3258fbc43e1fe576effd6eb9ae3f6b6721159e550dce62dbe4db8"} Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.291150 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.292170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerStarted","Data":"674887c93b46f3429c18e0eea6c551b3fe7bcfb8b7ca78b701ede6a261b1a99b"} Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.293838 5030 generic.go:334] "Generic (PLEG): container 
finished" podID="a467f566-3c36-4534-8219-fa10f325e20b" containerID="6d4fdd95933dd76f66126e2eafeb8812ed0de780243beac0c33000b1c9a049c3" exitCode=0 Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.293910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerDied","Data":"6d4fdd95933dd76f66126e2eafeb8812ed0de780243beac0c33000b1c9a049c3"} Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.294455 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.311865 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=5.311848242 podStartE2EDuration="5.311848242s" podCreationTimestamp="2026-01-20 23:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:30.303972612 +0000 UTC m=+4082.624232900" watchObservedRunningTime="2026-01-20 23:43:30.311848242 +0000 UTC m=+4082.632108520" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.318599 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.334694 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.940191642 podStartE2EDuration="5.334679097s" podCreationTimestamp="2026-01-20 23:43:25 +0000 UTC" firstStartedPulling="2026-01-20 23:43:26.209942886 +0000 UTC m=+4078.530203174" lastFinishedPulling="2026-01-20 23:43:29.604430341 +0000 UTC m=+4081.924690629" observedRunningTime="2026-01-20 23:43:30.328770303 +0000 UTC m=+4082.649030601" watchObservedRunningTime="2026-01-20 23:43:30.334679097 +0000 UTC m=+4082.654939385" Jan 20 23:43:30 crc kubenswrapper[5030]: I0120 23:43:30.747423 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.20:9696/\": dial tcp 10.217.1.20:9696: connect: connection refused" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.114686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.308816 5030 generic.go:334] "Generic (PLEG): container finished" podID="8eab155c-8ebf-4332-9d47-16df303833b7" containerID="eaf5b25e80fbc00732ee0cd1b7677c0ff4326220df49407113e8e594514df699" exitCode=0 Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.308905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"8eab155c-8ebf-4332-9d47-16df303833b7","Type":"ContainerDied","Data":"eaf5b25e80fbc00732ee0cd1b7677c0ff4326220df49407113e8e594514df699"} Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.312241 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" containerID="cri-o://95901ce3221d90b9173bf554b861b002c40953ce0859a9f6d52f0473e2884b29" gracePeriod=30 Jan 20 23:43:31 crc 
kubenswrapper[5030]: I0120 23:43:31.313040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerStarted","Data":"7d4d3b891364959f2992b7d0b16936c8ce989a4e6bfc2892f0a8fbde5313a0dc"} Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.313067 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.313203 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" containerID="cri-o://f68a1b444eca0b16db61c51ec7f264af573275a6ffcdd3dfad8bf3826fec5f89" gracePeriod=30 Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.584202 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.679452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs\") pod \"8eab155c-8ebf-4332-9d47-16df303833b7\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.679505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw5md\" (UniqueName: \"kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md\") pod \"8eab155c-8ebf-4332-9d47-16df303833b7\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.679643 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs\") pod \"8eab155c-8ebf-4332-9d47-16df303833b7\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.679680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle\") pod \"8eab155c-8ebf-4332-9d47-16df303833b7\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.679777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data\") pod \"8eab155c-8ebf-4332-9d47-16df303833b7\" (UID: \"8eab155c-8ebf-4332-9d47-16df303833b7\") " Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.693130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md" (OuterVolumeSpecName: "kube-api-access-mw5md") pod "8eab155c-8ebf-4332-9d47-16df303833b7" (UID: "8eab155c-8ebf-4332-9d47-16df303833b7"). InnerVolumeSpecName "kube-api-access-mw5md". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.784410 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw5md\" (UniqueName: \"kubernetes.io/projected/8eab155c-8ebf-4332-9d47-16df303833b7-kube-api-access-mw5md\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.807032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "8eab155c-8ebf-4332-9d47-16df303833b7" (UID: "8eab155c-8ebf-4332-9d47-16df303833b7"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.857418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data" (OuterVolumeSpecName: "config-data") pod "8eab155c-8ebf-4332-9d47-16df303833b7" (UID: "8eab155c-8ebf-4332-9d47-16df303833b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.878211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eab155c-8ebf-4332-9d47-16df303833b7" (UID: "8eab155c-8ebf-4332-9d47-16df303833b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.886676 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.886992 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.887004 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.899597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "8eab155c-8ebf-4332-9d47-16df303833b7" (UID: "8eab155c-8ebf-4332-9d47-16df303833b7"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:31 crc kubenswrapper[5030]: I0120 23:43:31.989448 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eab155c-8ebf-4332-9d47-16df303833b7-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.092826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.099532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.099796 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.324470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"8eab155c-8ebf-4332-9d47-16df303833b7","Type":"ContainerDied","Data":"2db0932cbd813797c68e7d4248a83d4bd5a83131038b1f588e4e7803da10cb01"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.324777 5030 scope.go:117] "RemoveContainer" containerID="eaf5b25e80fbc00732ee0cd1b7677c0ff4326220df49407113e8e594514df699" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.324955 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.349967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerStarted","Data":"435f0b805ae72f39e1e85f6d4ab34ed0af3b4b65094220981c7f3d947c20afd9"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.350011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerStarted","Data":"e8dcb212d5fdedc91a641f7a1f28c081da6ff7db705082af138b24c20baf2089"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.350021 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerStarted","Data":"ae1ccb8bd95c322d95de1c27cd1c16af9351cdbff6992f41c3fc6f45876b163b"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.350763 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.375789 5030 generic.go:334] "Generic (PLEG): container finished" podID="31981674-b244-476b-bf36-e3f5903d2446" containerID="95901ce3221d90b9173bf554b861b002c40953ce0859a9f6d52f0473e2884b29" exitCode=143 Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.375862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerDied","Data":"95901ce3221d90b9173bf554b861b002c40953ce0859a9f6d52f0473e2884b29"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.395163 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_neutron-76766d6986-2jghb_8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208/neutron-api/0.log" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.395204 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerID="67d37a9bdbb07180a5c6d24417922339a86ca098de85f8d76f671c4e93136767" exitCode=137 Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.396025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerDied","Data":"67d37a9bdbb07180a5c6d24417922339a86ca098de85f8d76f671c4e93136767"} Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.408277 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.443458 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.466733 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:32 crc kubenswrapper[5030]: E0120 23:43:32.467240 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eab155c-8ebf-4332-9d47-16df303833b7" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.467256 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eab155c-8ebf-4332-9d47-16df303833b7" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.467489 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eab155c-8ebf-4332-9d47-16df303833b7" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.468301 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.474172 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.474380 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.474480 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.489529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.496952 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" podStartSLOduration=3.49693046 podStartE2EDuration="3.49693046s" podCreationTimestamp="2026-01-20 23:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:32.410121574 +0000 UTC m=+4084.730381852" watchObservedRunningTime="2026-01-20 23:43:32.49693046 +0000 UTC m=+4084.817190748" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.529615 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-76766d6986-2jghb_8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208/neutron-api/0.log" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.529753 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.547128 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.31:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xfcs\" (UniqueName: \"kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle\") pod 
\"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs\") pod \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\" (UID: \"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208\") " Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604836 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.604988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9sm\" (UniqueName: \"kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.605027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.611488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: 
"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.613763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs" (OuterVolumeSpecName: "kube-api-access-8xfcs") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "kube-api-access-8xfcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.678818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.700838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9sm\" (UniqueName: \"kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706971 5030 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706984 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.706997 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.707006 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xfcs\" (UniqueName: \"kubernetes.io/projected/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-kube-api-access-8xfcs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.721311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.723837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.725115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.737985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.738382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config" (OuterVolumeSpecName: "config") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.739315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.743369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9sm\" (UniqueName: \"kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.747315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.804740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" (UID: "8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.816169 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.816216 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.816228 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.836638 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.887509 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.887764 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" podUID="29283d66-1ecd-4ad9-8bed-53cc737db34d" containerName="keystone-api" containerID="cri-o://da549dd5cb777f8caab4bb5e65c66917dfd797083bb493b48e348211e19e4bd3" gracePeriod=30 Jan 20 23:43:32 crc kubenswrapper[5030]: I0120 23:43:32.962892 5030 scope.go:117] "RemoveContainer" containerID="efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.025365 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:33 crc kubenswrapper[5030]: E0120 23:43:33.025791 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-api" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.025808 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-api" Jan 20 23:43:33 crc kubenswrapper[5030]: E0120 23:43:33.025839 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-httpd" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.025846 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-httpd" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.026035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-api" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.026046 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" containerName="neutron-httpd" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.026688 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.051123 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126132 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncl5\" (UniqueName: \"kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.126607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" 
Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232918 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncl5\" (UniqueName: \"kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.232940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.239812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc 
kubenswrapper[5030]: I0120 23:43:33.243072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.243127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.243652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.247051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.249581 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.250003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.258777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncl5\" (UniqueName: \"kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5\") pod \"keystone-65ff77b5b8-8pvfp\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.371069 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.388771 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.408975 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerStarted","Data":"fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b"} Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.411054 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.415830 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-76766d6986-2jghb_8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208/neutron-api/0.log" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.415905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" event={"ID":"8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208","Type":"ContainerDied","Data":"8132c0095ef59d931f51b86dfbb0a1b1491d41fdef6d5764ed61d6d71caa1694"} Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.415942 5030 scope.go:117] "RemoveContainer" containerID="6209a3e89aa3986aa1aed1097e852e2687fad2453d971f8772348cd0527da462" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.415949 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-76766d6986-2jghb" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.471793 5030 scope.go:117] "RemoveContainer" containerID="67d37a9bdbb07180a5c6d24417922339a86ca098de85f8d76f671c4e93136767" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.485927 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.495072 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-76766d6986-2jghb"] Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.884167 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.980671 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208" path="/var/lib/kubelet/pods/8ea6b77a-5e8c-4afd-ba36-5ae9ce25f208/volumes" Jan 20 23:43:33 crc kubenswrapper[5030]: I0120 23:43:33.981371 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eab155c-8ebf-4332-9d47-16df303833b7" path="/var/lib/kubelet/pods/8eab155c-8ebf-4332-9d47-16df303833b7/volumes" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.239408 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.239466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.429487 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerID="7d4d3b891364959f2992b7d0b16936c8ce989a4e6bfc2892f0a8fbde5313a0dc" exitCode=0 Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.429549 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerDied","Data":"7d4d3b891364959f2992b7d0b16936c8ce989a4e6bfc2892f0a8fbde5313a0dc"} Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.431514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" event={"ID":"cd1c466c-2109-4e95-8bfb-a598a6af4442","Type":"ContainerStarted","Data":"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a"} Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.431813 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.431826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" event={"ID":"cd1c466c-2109-4e95-8bfb-a598a6af4442","Type":"ContainerStarted","Data":"3ae56dc3ede8d0a07e952d5dc986bd7883ee87215d4a28f848ca97500b33b175"} Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.440795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"24b38406-08cf-452c-b0f8-c455ed6b7ece","Type":"ContainerStarted","Data":"1abdfa8e1738c463a2fd304df58c250bf4c6932fce81a2f5d1066f83952f0f5c"} Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.440846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"24b38406-08cf-452c-b0f8-c455ed6b7ece","Type":"ContainerStarted","Data":"60afe191c49d49fe93faffd8ed6cf5091df110a9f590c0d83c96c4219a1e3f86"} Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.507545 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.507527674 podStartE2EDuration="2.507527674s" podCreationTimestamp="2026-01-20 23:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:34.502131843 +0000 UTC m=+4086.822392131" watchObservedRunningTime="2026-01-20 23:43:34.507527674 +0000 UTC m=+4086.827787962" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.508379 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" podStartSLOduration=2.508369505 podStartE2EDuration="2.508369505s" podCreationTimestamp="2026-01-20 23:43:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:34.479891433 +0000 UTC m=+4086.800151721" watchObservedRunningTime="2026-01-20 23:43:34.508369505 +0000 UTC m=+4086.828629793" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.811857 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.811911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.888967 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:34 crc kubenswrapper[5030]: I0120 23:43:34.954297 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.015666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.040528 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.041792 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.049470 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6cgc\" (UniqueName: \"kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186313 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.186740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.247760 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.247784 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.25:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6cgc\" (UniqueName: \"kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289541 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.289634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.297565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.299170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.299231 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.301574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.302195 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.302850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.304540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys\") pod 
\"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.307889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6cgc\" (UniqueName: \"kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc\") pod \"keystone-7c845c9d97-gq5lm\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.358982 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.490361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerStarted","Data":"25486648d351881e64da50149b9a74c4d78a6f3be0d43be4097ddd110093e64c"} Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.491949 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.491983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.521218 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.521197014 podStartE2EDuration="7.521197014s" podCreationTimestamp="2026-01-20 23:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:35.506937488 +0000 UTC m=+4087.827197776" watchObservedRunningTime="2026-01-20 23:43:35.521197014 +0000 UTC m=+4087.841457312" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.598783 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:35 crc kubenswrapper[5030]: I0120 23:43:35.598827 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.659489 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.659723 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" containerID="cri-o://6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8" gracePeriod=30 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.659981 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" containerID="cri-o://39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11" gracePeriod=30 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.731901 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.732138 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-api" containerID="cri-o://e8dcb212d5fdedc91a641f7a1f28c081da6ff7db705082af138b24c20baf2089" gracePeriod=30 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.732181 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-httpd" containerID="cri-o://435f0b805ae72f39e1e85f6d4ab34ed0af3b4b65094220981c7f3d947c20afd9" gracePeriod=30 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.775574 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.792251 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.792336 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.805874 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.849175 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.863758 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.30:8775/\": read tcp 10.217.0.2:60554->10.217.1.30:8775: read: connection reset by peer" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.863808 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.30:8775/\": read tcp 10.217.0.2:60544->10.217.1.30:8775: read: connection reset by peer" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 
23:43:35.899107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhx5w\" (UniqueName: \"kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.899223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:35.968835 5030 scope.go:117] "RemoveContainer" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhx5w\" (UniqueName: \"kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001333 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.001416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.005706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.005871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.007129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.007197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.008706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.008988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.018035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhx5w\" (UniqueName: 
\"kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w\") pod \"neutron-6f69755d54-ldcwk\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.137848 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.469983 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod737f957c-6c28-4e40-ac90-83e5cc719f07"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod737f957c-6c28-4e40-ac90-83e5cc719f07] : Timed out while waiting for systemd to remove kubepods-besteffort-pod737f957c_6c28_4e40_ac90_83e5cc719f07.slice" Jan 20 23:43:36 crc kubenswrapper[5030]: E0120 23:43:36.470434 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod737f957c-6c28-4e40-ac90-83e5cc719f07] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod737f957c-6c28-4e40-ac90-83e5cc719f07] : Timed out while waiting for systemd to remove kubepods-besteffort-pod737f957c_6c28_4e40_ac90_83e5cc719f07.slice" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.506548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerStarted","Data":"93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.507694 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.511583 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerID="6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8" exitCode=143 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.511799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerDied","Data":"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.514921 5030 generic.go:334] "Generic (PLEG): container finished" podID="29283d66-1ecd-4ad9-8bed-53cc737db34d" containerID="da549dd5cb777f8caab4bb5e65c66917dfd797083bb493b48e348211e19e4bd3" exitCode=0 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.515080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" event={"ID":"29283d66-1ecd-4ad9-8bed-53cc737db34d","Type":"ContainerDied","Data":"da549dd5cb777f8caab4bb5e65c66917dfd797083bb493b48e348211e19e4bd3"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.526432 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod425b4c25-fdad-470a-b462-188b573d359a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod425b4c25-fdad-470a-b462-188b573d359a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod425b4c25_fdad_470a_b462_188b573d359a.slice" Jan 20 23:43:36 crc kubenswrapper[5030]: E0120 23:43:36.526666 5030 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod425b4c25-fdad-470a-b462-188b573d359a] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod425b4c25-fdad-470a-b462-188b573d359a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod425b4c25_fdad_470a_b462_188b573d359a.slice" pod="openstack-kuttl-tests/cinder-api-0" podUID="425b4c25-fdad-470a-b462-188b573d359a" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.528146 5030 generic.go:334] "Generic (PLEG): container finished" podID="69e686ca-6477-47fc-8154-0683c6ed953a" containerID="435f0b805ae72f39e1e85f6d4ab34ed0af3b4b65094220981c7f3d947c20afd9" exitCode=0 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.528222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerDied","Data":"435f0b805ae72f39e1e85f6d4ab34ed0af3b4b65094220981c7f3d947c20afd9"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.544903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" event={"ID":"ba47504f-1af1-4c46-acc4-6f6ebc32855f","Type":"ContainerStarted","Data":"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.545119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" event={"ID":"ba47504f-1af1-4c46-acc4-6f6ebc32855f","Type":"ContainerStarted","Data":"db3a329ad2fb8bc422bc37633e08c7a296dbf322ff6fad9a1c5c34f5799acd9a"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.546639 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.571933 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" podStartSLOduration=1.5719173720000001 podStartE2EDuration="1.571917372s" podCreationTimestamp="2026-01-20 23:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:36.571472962 +0000 UTC m=+4088.891733250" watchObservedRunningTime="2026-01-20 23:43:36.571917372 +0000 UTC m=+4088.892177660" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.579352 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podbb388aeb-cf40-49d1-bb60-a22d6117c6a8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podbb388aeb-cf40-49d1-bb60-a22d6117c6a8] : Timed out while waiting for systemd to remove kubepods-besteffort-podbb388aeb_cf40_49d1_bb60_a22d6117c6a8.slice" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.601940 5030 generic.go:334] "Generic (PLEG): container finished" podID="31981674-b244-476b-bf36-e3f5903d2446" containerID="f68a1b444eca0b16db61c51ec7f264af573275a6ffcdd3dfad8bf3826fec5f89" exitCode=0 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.602879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerDied","Data":"f68a1b444eca0b16db61c51ec7f264af573275a6ffcdd3dfad8bf3826fec5f89"} Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.602926 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.603784 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" podUID="cd1c466c-2109-4e95-8bfb-a598a6af4442" containerName="keystone-api" containerID="cri-o://34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a" gracePeriod=30 Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.661750 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.671472 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.682445 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.683815 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.685701 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.692378 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.751894 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.766015 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.828157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs\") pod \"31981674-b244-476b-bf36-e3f5903d2446\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.828529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs\") pod \"31981674-b244-476b-bf36-e3f5903d2446\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.828586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data\") pod \"31981674-b244-476b-bf36-e3f5903d2446\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.829479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njnjd\" (UniqueName: \"kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd\") pod \"31981674-b244-476b-bf36-e3f5903d2446\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.829519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle\") pod \"31981674-b244-476b-bf36-e3f5903d2446\" (UID: \"31981674-b244-476b-bf36-e3f5903d2446\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.830127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2wzn\" (UniqueName: \"kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.830184 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.830241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.834220 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.839303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd" (OuterVolumeSpecName: "kube-api-access-njnjd") pod "31981674-b244-476b-bf36-e3f5903d2446" (UID: "31981674-b244-476b-bf36-e3f5903d2446"). InnerVolumeSpecName "kube-api-access-njnjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.850868 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs" (OuterVolumeSpecName: "logs") pod "31981674-b244-476b-bf36-e3f5903d2446" (UID: "31981674-b244-476b-bf36-e3f5903d2446"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.915809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data" (OuterVolumeSpecName: "config-data") pod "31981674-b244-476b-bf36-e3f5903d2446" (UID: "31981674-b244-476b-bf36-e3f5903d2446"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.932304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.932361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz8dd\" (UniqueName: \"kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.932381 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.932422 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.932694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.933187 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 
23:43:36.935455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.935475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data\") pod \"29283d66-1ecd-4ad9-8bed-53cc737db34d\" (UID: \"29283d66-1ecd-4ad9-8bed-53cc737db34d\") " Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.936594 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts" (OuterVolumeSpecName: "scripts") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.941349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2wzn\" (UniqueName: \"kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.944232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.945040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.945818 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.947051 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.947390 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njnjd\" (UniqueName: \"kubernetes.io/projected/31981674-b244-476b-bf36-e3f5903d2446-kube-api-access-njnjd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.947461 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31981674-b244-476b-bf36-e3f5903d2446-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.950132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: 
"29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.963846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd" (OuterVolumeSpecName: "kube-api-access-pz8dd") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "kube-api-access-pz8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.964852 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.964879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.969414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2wzn\" (UniqueName: \"kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.972299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:36 crc kubenswrapper[5030]: I0120 23:43:36.973808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31981674-b244-476b-bf36-e3f5903d2446" (UID: "31981674-b244-476b-bf36-e3f5903d2446"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.001876 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.006887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data" (OuterVolumeSpecName: "config-data") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.018634 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.023175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "31981674-b244-476b-bf36-e3f5903d2446" (UID: "31981674-b244-476b-bf36-e3f5903d2446"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.033994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.034130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.040934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29283d66-1ecd-4ad9-8bed-53cc737db34d" (UID: "29283d66-1ecd-4ad9-8bed-53cc737db34d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051230 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051372 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051390 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051400 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051410 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051419 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz8dd\" (UniqueName: \"kubernetes.io/projected/29283d66-1ecd-4ad9-8bed-53cc737db34d-kube-api-access-pz8dd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051428 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051438 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31981674-b244-476b-bf36-e3f5903d2446-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.051446 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/29283d66-1ecd-4ad9-8bed-53cc737db34d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.113062 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.44:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.517395 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.590340 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.31:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.614904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" event={"ID":"29283d66-1ecd-4ad9-8bed-53cc737db34d","Type":"ContainerDied","Data":"1723031c1195b8f6581b0b5d0426fa6da1e1b46eb2b6facbd47b84947f8a7a69"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.614957 5030 scope.go:117] "RemoveContainer" containerID="da549dd5cb777f8caab4bb5e65c66917dfd797083bb493b48e348211e19e4bd3" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.615117 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-68445ff9cc-cq5qp" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.636864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerStarted","Data":"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.636909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerStarted","Data":"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.636925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerStarted","Data":"dd82bcf76d467c27ed3193bcf23b5fda2a3a657d167d57483002341c3b4880c9"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.638147 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.661739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"31981674-b244-476b-bf36-e3f5903d2446","Type":"ContainerDied","Data":"bb1f4a48dba1630238aa1f6de1040687a3a18337c0846b8b26c2f0cddc9fd4ff"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.661889 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.664613 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd1c466c-2109-4e95-8bfb-a598a6af4442" containerID="34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a" exitCode=0 Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.664730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.665721 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.665728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" event={"ID":"cd1c466c-2109-4e95-8bfb-a598a6af4442","Type":"ContainerDied","Data":"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.665954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp" event={"ID":"cd1c466c-2109-4e95-8bfb-a598a6af4442","Type":"ContainerDied","Data":"3ae56dc3ede8d0a07e952d5dc986bd7883ee87215d4a28f848ca97500b33b175"} Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.666201 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.666215 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lncl5\" (UniqueName: \"kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.670899 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs\") pod \"cd1c466c-2109-4e95-8bfb-a598a6af4442\" (UID: \"cd1c466c-2109-4e95-8bfb-a598a6af4442\") " Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.682206 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" podStartSLOduration=2.682172066 podStartE2EDuration="2.682172066s" podCreationTimestamp="2026-01-20 23:43:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:37.659410304 +0000 UTC m=+4089.979670592" watchObservedRunningTime="2026-01-20 23:43:37.682172066 +0000 UTC m=+4090.002432354" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.686358 5030 scope.go:117] "RemoveContainer" containerID="f68a1b444eca0b16db61c51ec7f264af573275a6ffcdd3dfad8bf3826fec5f89" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.781333 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.807672 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-68445ff9cc-cq5qp"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.821457 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.828672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.845149 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.859408 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.865679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.874666 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.884673 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: E0120 23:43:37.885142 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1c466c-2109-4e95-8bfb-a598a6af4442" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1c466c-2109-4e95-8bfb-a598a6af4442" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: E0120 23:43:37.885172 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885178 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" Jan 20 23:43:37 crc kubenswrapper[5030]: E0120 23:43:37.885202 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" Jan 
20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885208 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" Jan 20 23:43:37 crc kubenswrapper[5030]: E0120 23:43:37.885216 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29283d66-1ecd-4ad9-8bed-53cc737db34d" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="29283d66-1ecd-4ad9-8bed-53cc737db34d" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885388 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1c466c-2109-4e95-8bfb-a598a6af4442" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885415 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-metadata" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="29283d66-1ecd-4ad9-8bed-53cc737db34d" containerName="keystone-api" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.885436 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="31981674-b244-476b-bf36-e3f5903d2446" containerName="nova-metadata-log" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.886487 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.889394 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.892737 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.893006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.896860 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.898641 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.903866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.904168 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.908209 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.921678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.977042 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29283d66-1ecd-4ad9-8bed-53cc737db34d" path="/var/lib/kubelet/pods/29283d66-1ecd-4ad9-8bed-53cc737db34d/volumes" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.977618 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31981674-b244-476b-bf36-e3f5903d2446" path="/var/lib/kubelet/pods/31981674-b244-476b-bf36-e3f5903d2446/volumes" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.978224 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425b4c25-fdad-470a-b462-188b573d359a" path="/var/lib/kubelet/pods/425b4c25-fdad-470a-b462-188b573d359a/volumes" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.981964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc5mj\" (UniqueName: \"kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982176 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4bgn\" (UniqueName: \"kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.982381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:37 crc kubenswrapper[5030]: I0120 23:43:37.983608 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="737f957c-6c28-4e40-ac90-83e5cc719f07" path="/var/lib/kubelet/pods/737f957c-6c28-4e40-ac90-83e5cc719f07/volumes" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc5mj\" (UniqueName: \"kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4bgn\" (UniqueName: \"kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.084675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.085773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.085842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.085861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.149633 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.150953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.152735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts" (OuterVolumeSpecName: "scripts") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.152941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.153490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.153793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5" (OuterVolumeSpecName: "kube-api-access-lncl5") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "kube-api-access-lncl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.155422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.155907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.156002 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.156931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.157186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.158173 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc5mj\" (UniqueName: \"kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.158329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom\") pod \"cinder-api-0\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.158567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4bgn\" (UniqueName: \"kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.158902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data\") pod \"nova-metadata-0\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.185321 5030 scope.go:117] "RemoveContainer" containerID="95901ce3221d90b9173bf554b861b002c40953ce0859a9f6d52f0473e2884b29" Jan 20 23:43:38 crc 
kubenswrapper[5030]: I0120 23:43:38.186849 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lncl5\" (UniqueName: \"kubernetes.io/projected/cd1c466c-2109-4e95-8bfb-a598a6af4442-kube-api-access-lncl5\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.186868 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.186878 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.186887 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.269828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data" (OuterVolumeSpecName: "config-data") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.289953 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.386772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.392487 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.412731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.422143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cd1c466c-2109-4e95-8bfb-a598a6af4442" (UID: "cd1c466c-2109-4e95-8bfb-a598a6af4442"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.494280 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.494328 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1c466c-2109-4e95-8bfb-a598a6af4442-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.509814 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.509879 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.509905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.531751 5030 scope.go:117] "RemoveContainer" containerID="34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.585144 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.607969 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.649809 5030 scope.go:117] "RemoveContainer" containerID="34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a" Jan 20 23:43:38 crc kubenswrapper[5030]: E0120 23:43:38.659803 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a\": container with ID starting with 34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a not found: ID does not exist" containerID="34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.659856 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a"} err="failed to get container status \"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a\": rpc error: code = NotFound desc = could not find container \"34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a\": container with ID starting with 34ddb20d0ede83e4cd7c0c0b694af5203eb3fc56c960ffac026eaad28935327a not found: ID does not exist" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.788016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"db5e46d2-d1b4-4639-8e45-c329417ae8e1","Type":"ContainerStarted","Data":"1486f2c999df12315217989d8070d874ddb2991aae5c87d9d4eae7f6ea2b9785"} Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.821904 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" probeResult="failure" output="Get 
\"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:38 crc kubenswrapper[5030]: I0120 23:43:38.830806 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.063354 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.065500 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.217947 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.255565 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-65ff77b5b8-8pvfp"] Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.605857 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.623524 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.698058 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.798996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"db5e46d2-d1b4-4639-8e45-c329417ae8e1","Type":"ContainerStarted","Data":"541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f"} Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.815673 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.815655701 podStartE2EDuration="3.815655701s" podCreationTimestamp="2026-01-20 23:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:39.813800747 +0000 UTC m=+4092.134061035" watchObservedRunningTime="2026-01-20 23:43:39.815655701 +0000 UTC m=+4092.135915979" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.819642 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.956110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:39 crc kubenswrapper[5030]: I0120 23:43:39.975033 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1c466c-2109-4e95-8bfb-a598a6af4442" path="/var/lib/kubelet/pods/cd1c466c-2109-4e95-8bfb-a598a6af4442/volumes" Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.658136 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.732757 
5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.732994 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" containerID="cri-o://93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6" gracePeriod=30 Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.733114 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" containerID="cri-o://dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951" gracePeriod=30 Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.842187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerStarted","Data":"baa4899e3f859a5db2323e87eae3885e65c5e4f74409614fbf39a1fc1be3957a"} Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.860809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerStarted","Data":"3ee7c875e7608cf4c0f7d6601a304573ca4dfceefd3058f7a8848e454d0f9c9d"} Jan 20 23:43:40 crc kubenswrapper[5030]: I0120 23:43:40.860848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerStarted","Data":"312b6637dd14cf9fb498a66c35b5684b996cb8cb1e7953888991d4eb4c6aab8b"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.160854 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.161325 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-central-agent" containerID="cri-o://cfa5d12ec8dd35e4a892ca6be634b4e7bf04ed77c8dbf172c3f5aca0da154dd3" gracePeriod=30 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.161821 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="proxy-httpd" containerID="cri-o://3a021e6e1de3258fbc43e1fe576effd6eb9ae3f6b6721159e550dce62dbe4db8" gracePeriod=30 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.162028 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-notification-agent" containerID="cri-o://df80f7331fd2acd35e1f5223ad9813293a66e401c708fb7402e8df96840c4ac3" gracePeriod=30 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.162084 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="sg-core" containerID="cri-o://78a21b2039a28bb9676a5f052208d7607b918ac03ca2faa1ba709a6784895363" gracePeriod=30 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.169278 5030 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.43:3000/\": read tcp 10.217.0.2:48364->10.217.1.43:3000: read: connection reset by peer" Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.261935 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.411465 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.886103 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cee270b-420e-4af4-965a-3f07b75adb64" containerID="93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6" exitCode=143 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.886188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerDied","Data":"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890767 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerID="3a021e6e1de3258fbc43e1fe576effd6eb9ae3f6b6721159e550dce62dbe4db8" exitCode=0 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890811 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerID="78a21b2039a28bb9676a5f052208d7607b918ac03ca2faa1ba709a6784895363" exitCode=2 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerDied","Data":"3a021e6e1de3258fbc43e1fe576effd6eb9ae3f6b6721159e550dce62dbe4db8"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerDied","Data":"78a21b2039a28bb9676a5f052208d7607b918ac03ca2faa1ba709a6784895363"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerDied","Data":"cfa5d12ec8dd35e4a892ca6be634b4e7bf04ed77c8dbf172c3f5aca0da154dd3"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.890825 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerID="cfa5d12ec8dd35e4a892ca6be634b4e7bf04ed77c8dbf172c3f5aca0da154dd3" exitCode=0 Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.894503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerStarted","Data":"4c0e97ab7ac916ffeec23817d4c8f83530ee3240f89e8b751dc5b7193307f748"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.900055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerStarted","Data":"db87e0d657028f0da82f16b08db2f08eb396b07e9c9a8800ab5a9e1a8e82760c"} Jan 20 23:43:41 
crc kubenswrapper[5030]: I0120 23:43:41.900107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerStarted","Data":"181729877b0aab834202ecd11fb5d266acca0c2132cefa652fcb436c28a852fb"} Jan 20 23:43:41 crc kubenswrapper[5030]: I0120 23:43:41.930200 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=4.930176836 podStartE2EDuration="4.930176836s" podCreationTimestamp="2026-01-20 23:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:41.911261048 +0000 UTC m=+4094.231521336" watchObservedRunningTime="2026-01-20 23:43:41.930176836 +0000 UTC m=+4094.250437124" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.019213 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.812442 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.828994 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=5.8289760699999995 podStartE2EDuration="5.82897607s" podCreationTimestamp="2026-01-20 23:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:41.950392387 +0000 UTC m=+4094.270652665" watchObservedRunningTime="2026-01-20 23:43:42.82897607 +0000 UTC m=+4095.149236358" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.878007 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.901696 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.910334 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:42 crc kubenswrapper[5030]: I0120 23:43:42.934120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:43:43 crc kubenswrapper[5030]: I0120 23:43:43.608919 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:43 crc kubenswrapper[5030]: I0120 23:43:43.608971 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.247322 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.255706 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.351316 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.351919 5030 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-log" containerID="cri-o://d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99" gracePeriod=30 Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.351971 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-api" containerID="cri-o://2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2" gracePeriod=30 Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.964973 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerID="df80f7331fd2acd35e1f5223ad9813293a66e401c708fb7402e8df96840c4ac3" exitCode=0 Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.965077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerDied","Data":"df80f7331fd2acd35e1f5223ad9813293a66e401c708fb7402e8df96840c4ac3"} Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.965316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"eb755d31-fd61-4e84-9e18-5acc68934a58","Type":"ContainerDied","Data":"6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a"} Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.965333 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6520776bb31990d1b45ef0ed59e0298a7e30aa377d074ede7401abc8408d3b3a" Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.967221 5030 generic.go:334] "Generic (PLEG): container finished" podID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerID="d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99" exitCode=143 Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.967249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerDied","Data":"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99"} Jan 20 23:43:44 crc kubenswrapper[5030]: I0120 23:43:44.986290 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.091943 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.092025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4xtx\" (UniqueName: \"kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.092063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml\") pod \"eb755d31-fd61-4e84-9e18-5acc68934a58\" (UID: \"eb755d31-fd61-4e84-9e18-5acc68934a58\") " Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.092326 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.093178 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.095847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.103874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts" (OuterVolumeSpecName: "scripts") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.104051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx" (OuterVolumeSpecName: "kube-api-access-p4xtx") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "kube-api-access-p4xtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.148584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.148800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.182279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197237 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb755d31-fd61-4e84-9e18-5acc68934a58-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197268 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197280 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197292 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4xtx\" (UniqueName: \"kubernetes.io/projected/eb755d31-fd61-4e84-9e18-5acc68934a58-kube-api-access-p4xtx\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197301 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.197310 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.213197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data" (OuterVolumeSpecName: "config-data") pod "eb755d31-fd61-4e84-9e18-5acc68934a58" (UID: "eb755d31-fd61-4e84-9e18-5acc68934a58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.298982 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb755d31-fd61-4e84-9e18-5acc68934a58-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.826597 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.985043 5030 generic.go:334] "Generic (PLEG): container finished" podID="a467f566-3c36-4534-8219-fa10f325e20b" containerID="ad6f8e58c9dff12d340a8632ebe13ee33bdf79383a461b73e54bcbe069e202bd" exitCode=0 Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.985167 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.989691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerDied","Data":"ad6f8e58c9dff12d340a8632ebe13ee33bdf79383a461b73e54bcbe069e202bd"} Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.989760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" event={"ID":"a467f566-3c36-4534-8219-fa10f325e20b","Type":"ContainerDied","Data":"a15ad4702023cddd300461791a532a8767efd27711ed6f5f9fd133cce2f39b90"} Jan 20 23:43:45 crc kubenswrapper[5030]: I0120 23:43:45.989782 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a15ad4702023cddd300461791a532a8767efd27711ed6f5f9fd133cce2f39b90" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.032291 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.064291 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.077150 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107209 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107703 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107720 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107734 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-api" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107740 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-api" Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-notification-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107756 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-notification-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107782 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="proxy-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107788 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="proxy-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107807 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="sg-core" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107812 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" 
containerName="sg-core" Jan 20 23:43:46 crc kubenswrapper[5030]: E0120 23:43:46.107830 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-central-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-central-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.107998 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="proxy-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.108020 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-api" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.108031 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-notification-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.108041 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a467f566-3c36-4534-8219-fa10f325e20b" containerName="neutron-httpd" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.108052 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="ceilometer-central-agent" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.108060 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" containerName="sg-core" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.109845 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.112787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.112926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.112955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.112970 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.112995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.113040 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-np6ft\" (UniqueName: \"kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.113093 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.113153 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.113193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config\") pod \"a467f566-3c36-4534-8219-fa10f325e20b\" (UID: \"a467f566-3c36-4534-8219-fa10f325e20b\") " Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.113272 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.114466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.128901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.131836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft" (OuterVolumeSpecName: "kube-api-access-np6ft") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "kube-api-access-np6ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.174754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.190243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config" (OuterVolumeSpecName: "config") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.205992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215297 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215404 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215437 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4cst\" (UniqueName: \"kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: 
"a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.215950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216273 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np6ft\" (UniqueName: \"kubernetes.io/projected/a467f566-3c36-4534-8219-fa10f325e20b-kube-api-access-np6ft\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216297 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216309 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216320 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216330 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.216339 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.222707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a467f566-3c36-4534-8219-fa10f325e20b" (UID: "a467f566-3c36-4534-8219-fa10f325e20b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cst\" (UniqueName: \"kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318348 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.318504 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a467f566-3c36-4534-8219-fa10f325e20b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.319441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd\") pod \"ceilometer-0\" (UID: 
\"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.319784 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.322794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.322948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.323622 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.323668 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.327295 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.337941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cst\" (UniqueName: \"kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst\") pod \"ceilometer-0\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.474002 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:46 crc kubenswrapper[5030]: I0120 23:43:46.792782 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.015277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerStarted","Data":"c53a57bba70c5a402c0ef0606c7eaa13ca8412761706f70e19dcfced05a3306c"} Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.015334 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5b454f47f8-kbjj4" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.019469 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.064906 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.065914 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.086947 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5b454f47f8-kbjj4"] Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.496883 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.206:8778/\": read tcp 10.217.0.2:44712->10.217.0.206:8778: read: connection reset by peer" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.497446 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.206:8778/\": read tcp 10.217.0.2:44720->10.217.0.206:8778: read: connection reset by peer" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.922354 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.973750 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a467f566-3c36-4534-8219-fa10f325e20b" path="/var/lib/kubelet/pods/a467f566-3c36-4534-8219-fa10f325e20b/volumes" Jan 20 23:43:47 crc kubenswrapper[5030]: I0120 23:43:47.974521 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb755d31-fd61-4e84-9e18-5acc68934a58" path="/var/lib/kubelet/pods/eb755d31-fd61-4e84-9e18-5acc68934a58/volumes" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.046095 5030 generic.go:334] "Generic (PLEG): container finished" podID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerID="2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2" exitCode=0 Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.046171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerDied","Data":"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2"} Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.046201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" event={"ID":"76dc6ed6-f471-4ee1-8257-45b1c366e23f","Type":"ContainerDied","Data":"61604dd3ee019112fb4ab41474008fb18c64429a5a47181a9b74f1e5a3c5e66a"} Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.046229 5030 scope.go:117] "RemoveContainer" containerID="2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.046387 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-756b878d5d-nf4hr" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjkbl\" (UniqueName: \"kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054537 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.054685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle\") pod \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\" (UID: \"76dc6ed6-f471-4ee1-8257-45b1c366e23f\") " Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.057729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerStarted","Data":"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a"} Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.058157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs" (OuterVolumeSpecName: "logs") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.071018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts" (OuterVolumeSpecName: "scripts") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.077809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl" (OuterVolumeSpecName: "kube-api-access-bjkbl") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "kube-api-access-bjkbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.115704 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.158921 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjkbl\" (UniqueName: \"kubernetes.io/projected/76dc6ed6-f471-4ee1-8257-45b1c366e23f-kube-api-access-bjkbl\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.158952 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.158962 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ed6-f471-4ee1-8257-45b1c366e23f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.174495 5030 scope.go:117] "RemoveContainer" containerID="d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.211356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.231589 5030 scope.go:117] "RemoveContainer" containerID="2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2" Jan 20 23:43:48 crc kubenswrapper[5030]: E0120 23:43:48.232922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2\": container with ID starting with 2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2 not found: ID does not exist" containerID="2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.232964 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2"} err="failed to get container status \"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2\": rpc error: code = NotFound desc = could not find container \"2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2\": container with ID starting with 2fd6129fe28aa73bc45a8a833204a917e8c99df2272ecdbe088383a43ce45fe2 not found: ID does not exist" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.232990 5030 scope.go:117] "RemoveContainer" containerID="d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99" Jan 20 23:43:48 crc kubenswrapper[5030]: E0120 23:43:48.234596 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99\": container with ID starting with d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99 not found: ID does not exist" containerID="d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.234739 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99"} err="failed to get container status \"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99\": rpc error: code = NotFound desc = could not find container \"d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99\": container with ID starting with d237b27eaa104ce903a2a8f6a052606cc3d5791c78e134f2cb0c14cfd7492e99 not found: ID does not exist" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.262998 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.609219 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.609567 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.750869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data" (OuterVolumeSpecName: "config-data") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.772485 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.812772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.826093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "76dc6ed6-f471-4ee1-8257-45b1c366e23f" (UID: "76dc6ed6-f471-4ee1-8257-45b1c366e23f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.874822 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.874857 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76dc6ed6-f471-4ee1-8257-45b1c366e23f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.984932 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:43:48 crc kubenswrapper[5030]: I0120 23:43:48.998708 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-756b878d5d-nf4hr"] Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.069855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerStarted","Data":"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780"} Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.217114 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": read tcp 10.217.0.2:54512->10.217.1.17:9311: read: connection reset by peer" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.217394 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.17:9311/healthcheck\": read tcp 10.217.0.2:54522->10.217.1.17:9311: read: connection reset by peer" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.563392 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594394 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kgkk\" (UniqueName: \"kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.594759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data\") pod \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\" (UID: \"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.595438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs" (OuterVolumeSpecName: "logs") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.607824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk" (OuterVolumeSpecName: "kube-api-access-6kgkk") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "kube-api-access-6kgkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.622809 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.54:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.623059 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.54:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.635078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.653470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data" (OuterVolumeSpecName: "config-data") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.670845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.696592 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.696645 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.696654 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.696666 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.696691 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kgkk\" (UniqueName: \"kubernetes.io/projected/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-kube-api-access-6kgkk\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.707956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" (UID: "b5a99a50-cb48-4ae4-85d5-b582b0fc4beb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.748892 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798223 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw4l9\" (UniqueName: \"kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.798588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs\") pod \"0cee270b-420e-4af4-965a-3f07b75adb64\" (UID: \"0cee270b-420e-4af4-965a-3f07b75adb64\") " Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.799139 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.800691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs" (OuterVolumeSpecName: "logs") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.809166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.810802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9" (OuterVolumeSpecName: "kube-api-access-kw4l9") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "kube-api-access-kw4l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.874224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.879414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.885090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.886807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data" (OuterVolumeSpecName: "config-data") pod "0cee270b-420e-4af4-965a-3f07b75adb64" (UID: "0cee270b-420e-4af4-965a-3f07b75adb64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900729 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900768 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cee270b-420e-4af4-965a-3f07b75adb64-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900781 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw4l9\" (UniqueName: \"kubernetes.io/projected/0cee270b-420e-4af4-965a-3f07b75adb64-kube-api-access-kw4l9\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900792 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900801 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900811 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.900820 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cee270b-420e-4af4-965a-3f07b75adb64-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:49 crc kubenswrapper[5030]: I0120 23:43:49.973777 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" path="/var/lib/kubelet/pods/76dc6ed6-f471-4ee1-8257-45b1c366e23f/volumes" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.080778 5030 generic.go:334] "Generic (PLEG): container finished" podID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerID="39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11" exitCode=0 Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.080852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerDied","Data":"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11"} Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.080885 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"b5a99a50-cb48-4ae4-85d5-b582b0fc4beb","Type":"ContainerDied","Data":"d01314e734acb62212779c45b5299b0c23a49c00045008b2d1ee187819ba326f"} Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.080906 5030 scope.go:117] "RemoveContainer" containerID="39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.080927 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.088803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerStarted","Data":"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4"} Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.090746 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cee270b-420e-4af4-965a-3f07b75adb64" containerID="dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951" exitCode=0 Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.090773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerDied","Data":"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951"} Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.090789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" event={"ID":"0cee270b-420e-4af4-965a-3f07b75adb64","Type":"ContainerDied","Data":"8e1c928a41fbec16d37549b4e9409eb6f5e84359420e31e38ab9e590916cc551"} Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.090839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.107734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.108007 5030 scope.go:117] "RemoveContainer" containerID="6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.137701 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.137776 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.150794 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd4ddf7c6-tfql8"] Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.166479 5030 scope.go:117] "RemoveContainer" containerID="39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.168922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11\": container with ID starting with 39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11 not found: ID does not exist" containerID="39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.168962 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11"} err="failed to get container status \"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11\": rpc error: code = NotFound desc = could not find container \"39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11\": container with ID starting with 39d75acd0e24b7098ea6e1abb5aa8ea635cc430b52893fa5ef9edbc6e210ce11 not found: ID 
does not exist" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.168991 5030 scope.go:117] "RemoveContainer" containerID="6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.169907 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8\": container with ID starting with 6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8 not found: ID does not exist" containerID="6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.169941 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8"} err="failed to get container status \"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8\": rpc error: code = NotFound desc = could not find container \"6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8\": container with ID starting with 6b9efb08e33facd7a36f07305e5026219490fefe5b522a4546faa6abd38475c8 not found: ID does not exist" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.169959 5030 scope.go:117] "RemoveContainer" containerID="dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172032 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172525 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172550 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172574 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172585 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-api" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172608 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172634 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172655 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172664 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172677 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172686 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-log" 
Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.172702 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172710 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172944 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172965 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" containerName="nova-api-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.172990 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" containerName="barbican-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.173001 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-api" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.173025 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dc6ed6-f471-4ee1-8257-45b1c366e23f" containerName="placement-log" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.174419 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.177528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.177867 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.179947 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.186271 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205649 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgpz\" (UniqueName: \"kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.205776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.209127 5030 scope.go:117] "RemoveContainer" containerID="93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.233003 5030 scope.go:117] "RemoveContainer" containerID="dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.233527 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951\": container with ID starting with dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951 not found: ID does not exist" containerID="dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.233565 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951"} err="failed to get container status \"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951\": rpc error: code = NotFound desc = could not find container \"dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951\": container with ID starting with dae5f367d91c592f55708cf1cf8cef62b4a47e8ae4fe340741cda3c5b092b951 not found: ID does not exist" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.233585 5030 scope.go:117] "RemoveContainer" containerID="93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6" Jan 20 23:43:50 crc kubenswrapper[5030]: E0120 23:43:50.233988 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6\": container with ID starting with 93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6 not found: ID does not exist" containerID="93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.234012 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6"} err="failed to get container status 
\"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6\": rpc error: code = NotFound desc = could not find container \"93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6\": container with ID starting with 93f5098d23b0d3f01250c711f57d6ce77c7bdec802fcac44aba9785bbe81f1b6 not found: ID does not exist" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgpz\" (UniqueName: \"kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307756 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.307814 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.308154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.312086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.312212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.313650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.314516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.326807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgpz\" (UniqueName: \"kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz\") pod \"nova-api-0\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.500898 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:43:50 crc kubenswrapper[5030]: I0120 23:43:50.541180 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:43:51 crc kubenswrapper[5030]: I0120 23:43:51.041176 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:43:51 crc kubenswrapper[5030]: I0120 23:43:51.974185 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cee270b-420e-4af4-965a-3f07b75adb64" path="/var/lib/kubelet/pods/0cee270b-420e-4af4-965a-3f07b75adb64/volumes" Jan 20 23:43:51 crc kubenswrapper[5030]: I0120 23:43:51.976040 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a99a50-cb48-4ae4-85d5-b582b0fc4beb" path="/var/lib/kubelet/pods/b5a99a50-cb48-4ae4-85d5-b582b0fc4beb/volumes" Jan 20 23:43:52 crc kubenswrapper[5030]: I0120 23:43:52.126224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerStarted","Data":"b59c18df1e318f6b2e3f84ac12a6374255e6026543722d5cbad05da5899f1a0b"} Jan 20 23:43:52 crc kubenswrapper[5030]: I0120 23:43:52.126286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerStarted","Data":"e8c469752570de07e4e1aec0689c0fb676780caef99c0dec50a0412886c30541"} Jan 20 23:43:52 crc kubenswrapper[5030]: I0120 23:43:52.128818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerStarted","Data":"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7"} Jan 20 23:43:52 crc kubenswrapper[5030]: I0120 23:43:52.129385 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:43:52 crc kubenswrapper[5030]: I0120 23:43:52.152529 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.9383044059999999 podStartE2EDuration="6.152512956s" podCreationTimestamp="2026-01-20 23:43:46 +0000 UTC" 
firstStartedPulling="2026-01-20 23:43:46.796083097 +0000 UTC m=+4099.116343395" lastFinishedPulling="2026-01-20 23:43:51.010291657 +0000 UTC m=+4103.330551945" observedRunningTime="2026-01-20 23:43:52.149896713 +0000 UTC m=+4104.470157001" watchObservedRunningTime="2026-01-20 23:43:52.152512956 +0000 UTC m=+4104.472773244" Jan 20 23:43:53 crc kubenswrapper[5030]: I0120 23:43:53.144574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerStarted","Data":"dbf3f368b62a0be16be3046e1cabd348e08742f826a1911f3c06fe9f0acb438c"} Jan 20 23:43:53 crc kubenswrapper[5030]: I0120 23:43:53.171272 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.17124557 podStartE2EDuration="3.17124557s" podCreationTimestamp="2026-01-20 23:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:43:53.167853277 +0000 UTC m=+4105.488113565" watchObservedRunningTime="2026-01-20 23:43:53.17124557 +0000 UTC m=+4105.491505898" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.618802 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.621542 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.625484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.679592 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795433 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95fv8\" (UniqueName: \"kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.795521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs\") pod \"382e73bc-d9db-48e6-a776-c1130cfa12ea\" (UID: \"382e73bc-d9db-48e6-a776-c1130cfa12ea\") " Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.796228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs" (OuterVolumeSpecName: "logs") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.801359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8" (OuterVolumeSpecName: "kube-api-access-95fv8") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "kube-api-access-95fv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.801641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.824787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.848881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.859865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data" (OuterVolumeSpecName: "config-data") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.860258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "382e73bc-d9db-48e6-a776-c1130cfa12ea" (UID: "382e73bc-d9db-48e6-a776-c1130cfa12ea"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898294 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898336 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898350 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382e73bc-d9db-48e6-a776-c1130cfa12ea-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898365 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95fv8\" (UniqueName: \"kubernetes.io/projected/382e73bc-d9db-48e6-a776-c1130cfa12ea-kube-api-access-95fv8\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898378 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898389 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:58 crc kubenswrapper[5030]: I0120 23:43:58.898400 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382e73bc-d9db-48e6-a776-c1130cfa12ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.209763 5030 generic.go:334] "Generic (PLEG): container finished" podID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerID="86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22" exitCode=137 Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.209844 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.209870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerDied","Data":"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22"} Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.209951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv" event={"ID":"382e73bc-d9db-48e6-a776-c1130cfa12ea","Type":"ContainerDied","Data":"290f7afe2f80b3b10dd11c663f0e74fcacec92b8cca35955cec5a57fb7592fef"} Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.209975 5030 scope.go:117] "RemoveContainer" containerID="86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.216405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.273344 5030 scope.go:117] "RemoveContainer" containerID="0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.278708 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.287805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6bcbc6fc44-wcxwv"] Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.302360 5030 scope.go:117] "RemoveContainer" containerID="86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22" Jan 20 23:43:59 crc kubenswrapper[5030]: E0120 23:43:59.303475 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22\": container with ID starting with 86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22 not found: ID does not exist" containerID="86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.303515 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22"} err="failed to get container status \"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22\": rpc error: code = NotFound desc = could not find container \"86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22\": container with ID starting with 86ad6e87427996c964edf1a9f998f1448d7e0eec78542066cb892e73c873eb22 not found: ID does not exist" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.303540 5030 scope.go:117] "RemoveContainer" containerID="0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1" Jan 20 23:43:59 crc kubenswrapper[5030]: E0120 23:43:59.304148 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1\": container with ID starting with 0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1 not found: ID does not exist" containerID="0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 
23:43:59.304192 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1"} err="failed to get container status \"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1\": rpc error: code = NotFound desc = could not find container \"0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1\": container with ID starting with 0a74bf65c737c0bbc23a581e6113af01f1a94ae141e295accb4c32d779dfc7a1 not found: ID does not exist" Jan 20 23:43:59 crc kubenswrapper[5030]: I0120 23:43:59.983173 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" path="/var/lib/kubelet/pods/382e73bc-d9db-48e6-a776-c1130cfa12ea/volumes" Jan 20 23:44:00 crc kubenswrapper[5030]: I0120 23:44:00.320372 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.47:9696/\": dial tcp 10.217.1.47:9696: connect: connection refused" Jan 20 23:44:00 crc kubenswrapper[5030]: I0120 23:44:00.501820 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:00 crc kubenswrapper[5030]: I0120 23:44:00.501899 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:01 crc kubenswrapper[5030]: I0120 23:44:01.514927 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.56:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:44:01 crc kubenswrapper[5030]: I0120 23:44:01.514983 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.56:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.169446 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.252007 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.252230 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-api" containerID="cri-o://21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b" gracePeriod=30 Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.252605 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-httpd" containerID="cri-o://d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093" gracePeriod=30 Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.333896 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_neutron-696d6757b4-nkrlp_69e686ca-6477-47fc-8154-0683c6ed953a/neutron-api/0.log" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.333940 5030 generic.go:334] "Generic (PLEG): container finished" podID="69e686ca-6477-47fc-8154-0683c6ed953a" containerID="e8dcb212d5fdedc91a641f7a1f28c081da6ff7db705082af138b24c20baf2089" exitCode=137 Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.333966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerDied","Data":"e8dcb212d5fdedc91a641f7a1f28c081da6ff7db705082af138b24c20baf2089"} Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.333989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" event={"ID":"69e686ca-6477-47fc-8154-0683c6ed953a","Type":"ContainerDied","Data":"ae1ccb8bd95c322d95de1c27cd1c16af9351cdbff6992f41c3fc6f45876b163b"} Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.333998 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1ccb8bd95c322d95de1c27cd1c16af9351cdbff6992f41c3fc6f45876b163b" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.372556 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-696d6757b4-nkrlp_69e686ca-6477-47fc-8154-0683c6ed953a/neutron-api/0.log" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.372615 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-645pl\" (UniqueName: \"kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.480872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config\") pod \"69e686ca-6477-47fc-8154-0683c6ed953a\" (UID: \"69e686ca-6477-47fc-8154-0683c6ed953a\") " Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.486228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.486835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl" (OuterVolumeSpecName: "kube-api-access-645pl") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "kube-api-access-645pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.561912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.564533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.567694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.572132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config" (OuterVolumeSpecName: "config") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.583694 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.583829 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.583901 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.583960 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.584017 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-645pl\" (UniqueName: \"kubernetes.io/projected/69e686ca-6477-47fc-8154-0683c6ed953a-kube-api-access-645pl\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.584070 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.585252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "69e686ca-6477-47fc-8154-0683c6ed953a" (UID: "69e686ca-6477-47fc-8154-0683c6ed953a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.686309 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/69e686ca-6477-47fc-8154-0683c6ed953a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.911886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.993415 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:44:06 crc kubenswrapper[5030]: I0120 23:44:06.994519 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" podUID="68768b92-d310-4d2a-ace4-e95aa765f972" containerName="keystone-api" containerID="cri-o://a36634b4807646149e5f3a7b6744b5ba605ae319c65e476cda97660d7f503e98" gracePeriod=30 Jan 20 23:44:07 crc kubenswrapper[5030]: I0120 23:44:07.347101 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-696d6757b4-nkrlp" Jan 20 23:44:07 crc kubenswrapper[5030]: I0120 23:44:07.395532 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:44:07 crc kubenswrapper[5030]: I0120 23:44:07.403635 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-696d6757b4-nkrlp"] Jan 20 23:44:07 crc kubenswrapper[5030]: I0120 23:44:07.974775 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" path="/var/lib/kubelet/pods/69e686ca-6477-47fc-8154-0683c6ed953a/volumes" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.287727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.288388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" containerName="openstackclient" containerID="cri-o://9ea9c1592d15cd47a478230d43c094acf26e6539dd38741ae71e17be69d79521" gracePeriod=2 Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.295989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.354751 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:09 crc kubenswrapper[5030]: E0120 23:44:09.355267 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-api" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355288 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-api" Jan 20 23:44:09 crc kubenswrapper[5030]: E0120 23:44:09.355314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355323 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api" Jan 20 23:44:09 crc kubenswrapper[5030]: E0120 23:44:09.355342 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api-log" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api-log" Jan 20 23:44:09 crc kubenswrapper[5030]: E0120 23:44:09.355372 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" containerName="openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355381 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" containerName="openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: E0120 23:44:09.355398 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-httpd" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355408 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-httpd" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355657 
5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355684 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="382e73bc-d9db-48e6-a776-c1130cfa12ea" containerName="barbican-api-log" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355700 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" containerName="openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355724 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-httpd" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.355744 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e686ca-6477-47fc-8154-0683c6ed953a" containerName="neutron-api" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.356525 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.367101 5030 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack-kuttl-tests/openstackclient" oldPodUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.368354 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.436722 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.436941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvb4\" (UniqueName: \"kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.436996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.437046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.539611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvb4\" (UniqueName: \"kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.539681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.539715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.539748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.540646 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.549697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.550416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.560131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvb4\" (UniqueName: \"kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4\") pod \"openstackclient\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:09 crc kubenswrapper[5030]: I0120 23:44:09.687749 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.157573 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.158088 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.228995 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:10 crc kubenswrapper[5030]: W0120 23:44:10.236871 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfe5c819_263d_4ea5_934c_51567db0e55e.slice/crio-520e1dc4ee6f65a3d0a044885512b6e472bad3c9a2f1404ef605616c0bb14774 WatchSource:0}: Error finding container 520e1dc4ee6f65a3d0a044885512b6e472bad3c9a2f1404ef605616c0bb14774: Status 404 returned error can't find the container with id 520e1dc4ee6f65a3d0a044885512b6e472bad3c9a2f1404ef605616c0bb14774 Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.411408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"bfe5c819-263d-4ea5-934c-51567db0e55e","Type":"ContainerStarted","Data":"520e1dc4ee6f65a3d0a044885512b6e472bad3c9a2f1404ef605616c0bb14774"} Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.419070 5030 generic.go:334] "Generic (PLEG): container finished" podID="68768b92-d310-4d2a-ace4-e95aa765f972" containerID="a36634b4807646149e5f3a7b6744b5ba605ae319c65e476cda97660d7f503e98" exitCode=0 Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.419122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" event={"ID":"68768b92-d310-4d2a-ace4-e95aa765f972","Type":"ContainerDied","Data":"a36634b4807646149e5f3a7b6744b5ba605ae319c65e476cda97660d7f503e98"} Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.500452 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.509500 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.509829 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.513927 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.522313 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.669571 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.669897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.670091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.670766 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.670861 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.670892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.671011 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv4wn\" (UniqueName: \"kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: \"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.671034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs\") pod \"68768b92-d310-4d2a-ace4-e95aa765f972\" (UID: 
\"68768b92-d310-4d2a-ace4-e95aa765f972\") " Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.679578 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn" (OuterVolumeSpecName: "kube-api-access-mv4wn") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "kube-api-access-mv4wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.680213 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.680469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts" (OuterVolumeSpecName: "scripts") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.686112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.701114 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data" (OuterVolumeSpecName: "config-data") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.703936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.724756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.737852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68768b92-d310-4d2a-ace4-e95aa765f972" (UID: "68768b92-d310-4d2a-ace4-e95aa765f972"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.774772 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.774945 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.774969 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.775010 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv4wn\" (UniqueName: \"kubernetes.io/projected/68768b92-d310-4d2a-ace4-e95aa765f972-kube-api-access-mv4wn\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.775025 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.775037 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.775048 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:10 crc kubenswrapper[5030]: I0120 23:44:10.775062 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68768b92-d310-4d2a-ace4-e95aa765f972-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.439662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" event={"ID":"68768b92-d310-4d2a-ace4-e95aa765f972","Type":"ContainerDied","Data":"ebcd6fe11550f275a26d82161d24a065df68e7c0c9eebe2309b5cd41f35f9646"} Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.439941 5030 scope.go:117] "RemoveContainer" containerID="a36634b4807646149e5f3a7b6744b5ba605ae319c65e476cda97660d7f503e98" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.439818 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-6f9877d997-jdh5f" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.446683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"bfe5c819-263d-4ea5-934c-51567db0e55e","Type":"ContainerStarted","Data":"4162ae90212f0f20c63503791f5909dd8447d80bbbc67a89f73e7ea573c92180"} Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.449396 5030 generic.go:334] "Generic (PLEG): container finished" podID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" containerID="9ea9c1592d15cd47a478230d43c094acf26e6539dd38741ae71e17be69d79521" exitCode=137 Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.450117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.456410 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.481854 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.481824309 podStartE2EDuration="2.481824309s" podCreationTimestamp="2026-01-20 23:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:44:11.471226411 +0000 UTC m=+4123.791486729" watchObservedRunningTime="2026-01-20 23:44:11.481824309 +0000 UTC m=+4123.802084607" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.495322 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.502663 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-6f9877d997-jdh5f"] Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.627318 5030 scope.go:117] "RemoveContainer" containerID="cbd8c53c0c338082933ba2b93a1ac6d72c5a94a590009ae090199f4a21ef75f5" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.660120 5030 scope.go:117] "RemoveContainer" containerID="ab014bff4d1d0bad1eb5a67c73e0c3d1aa274f4b0161985c39a46475197f7bc4" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.696440 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.798540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle\") pod \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.799145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret\") pod \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.799282 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54pst\" (UniqueName: \"kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst\") pod \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.799393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config\") pod \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\" (UID: \"a39ae3f6-b542-4a93-a45b-8b431f5e857e\") " Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.805159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst" (OuterVolumeSpecName: "kube-api-access-54pst") pod "a39ae3f6-b542-4a93-a45b-8b431f5e857e" (UID: "a39ae3f6-b542-4a93-a45b-8b431f5e857e"). InnerVolumeSpecName "kube-api-access-54pst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.838460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a39ae3f6-b542-4a93-a45b-8b431f5e857e" (UID: "a39ae3f6-b542-4a93-a45b-8b431f5e857e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.847356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a39ae3f6-b542-4a93-a45b-8b431f5e857e" (UID: "a39ae3f6-b542-4a93-a45b-8b431f5e857e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.868195 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a39ae3f6-b542-4a93-a45b-8b431f5e857e" (UID: "a39ae3f6-b542-4a93-a45b-8b431f5e857e"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.902188 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.902232 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.902242 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a39ae3f6-b542-4a93-a45b-8b431f5e857e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.902255 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54pst\" (UniqueName: \"kubernetes.io/projected/a39ae3f6-b542-4a93-a45b-8b431f5e857e-kube-api-access-54pst\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.996494 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68768b92-d310-4d2a-ace4-e95aa765f972" path="/var/lib/kubelet/pods/68768b92-d310-4d2a-ace4-e95aa765f972/volumes" Jan 20 23:44:11 crc kubenswrapper[5030]: I0120 23:44:11.999348 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39ae3f6-b542-4a93-a45b-8b431f5e857e" path="/var/lib/kubelet/pods/a39ae3f6-b542-4a93-a45b-8b431f5e857e/volumes" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.081523 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:44:12 crc kubenswrapper[5030]: E0120 23:44:12.082505 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68768b92-d310-4d2a-ace4-e95aa765f972" containerName="keystone-api" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.082535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68768b92-d310-4d2a-ace4-e95aa765f972" containerName="keystone-api" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.082858 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68768b92-d310-4d2a-ace4-e95aa765f972" containerName="keystone-api" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.084734 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.096748 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtkhq\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213904 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.213964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " 
pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkhq\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316742 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316775 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.316997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.317449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: 
\"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.317674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.322479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.323164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.323832 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.327606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.329808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.334883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkhq\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq\") pod \"swift-proxy-844c45cdb4-lhgzf\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.425210 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.461668 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.461743 5030 scope.go:117] "RemoveContainer" containerID="9ea9c1592d15cd47a478230d43c094acf26e6539dd38741ae71e17be69d79521" Jan 20 23:44:12 crc kubenswrapper[5030]: I0120 23:44:12.863010 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:44:12 crc kubenswrapper[5030]: W0120 23:44:12.867774 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987b795c_8dbe_4f59_aa00_e3f908a9c974.slice/crio-02317d546363d34817da85af5c16aeadb570d1dac6167472d21545739adcfd02 WatchSource:0}: Error finding container 02317d546363d34817da85af5c16aeadb570d1dac6167472d21545739adcfd02: Status 404 returned error can't find the container with id 02317d546363d34817da85af5c16aeadb570d1dac6167472d21545739adcfd02 Jan 20 23:44:13 crc kubenswrapper[5030]: I0120 23:44:13.476468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerStarted","Data":"3ff22418a328424b32b7b4f415a4a4ad754ab1eb3343427244682702b8634303"} Jan 20 23:44:13 crc kubenswrapper[5030]: I0120 23:44:13.476856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerStarted","Data":"3e26081edbb10019449a003a17538e1de68bd9f0dc38018db630e0edd2157f3c"} Jan 20 23:44:13 crc kubenswrapper[5030]: I0120 23:44:13.476880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerStarted","Data":"02317d546363d34817da85af5c16aeadb570d1dac6167472d21545739adcfd02"} Jan 20 23:44:13 crc kubenswrapper[5030]: I0120 23:44:13.511452 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" podStartSLOduration=1.511429694 podStartE2EDuration="1.511429694s" podCreationTimestamp="2026-01-20 23:44:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:44:13.493757845 +0000 UTC m=+4125.814018133" watchObservedRunningTime="2026-01-20 23:44:13.511429694 +0000 UTC m=+4125.831690012" Jan 20 23:44:14 crc kubenswrapper[5030]: I0120 23:44:14.483672 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:14 crc kubenswrapper[5030]: I0120 23:44:14.483910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:16 crc kubenswrapper[5030]: I0120 23:44:16.485137 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:17 crc kubenswrapper[5030]: I0120 23:44:17.524867 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerID="d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093" exitCode=0 Jan 20 23:44:17 crc kubenswrapper[5030]: I0120 23:44:17.526351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" 
event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerDied","Data":"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093"} Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.127714 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.128295 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-central-agent" containerID="cri-o://8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a" gracePeriod=30 Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.128412 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="sg-core" containerID="cri-o://5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4" gracePeriod=30 Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.128460 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="proxy-httpd" containerID="cri-o://fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7" gracePeriod=30 Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.128456 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-notification-agent" containerID="cri-o://fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780" gracePeriod=30 Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.436520 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.440024 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.531721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.531969 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-httpd" containerID="cri-o://41286e72f29dcb5bdf7f4831ad8e2fe1b6c22003e27b09645abbfa6ae0917401" gracePeriod=30 Jan 20 23:44:22 crc kubenswrapper[5030]: I0120 23:44:22.532075 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-server" containerID="cri-o://8f7ace46d65651278377efee09416c1370252c641e56c616324fd9e3e6962d2d" gracePeriod=30 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.620774 5030 generic.go:334] "Generic (PLEG): container finished" podID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerID="8f7ace46d65651278377efee09416c1370252c641e56c616324fd9e3e6962d2d" exitCode=0 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.620837 5030 generic.go:334] "Generic (PLEG): container finished" podID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" 
containerID="41286e72f29dcb5bdf7f4831ad8e2fe1b6c22003e27b09645abbfa6ae0917401" exitCode=0 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.620950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerDied","Data":"8f7ace46d65651278377efee09416c1370252c641e56c616324fd9e3e6962d2d"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.621006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerDied","Data":"41286e72f29dcb5bdf7f4831ad8e2fe1b6c22003e27b09645abbfa6ae0917401"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.621016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" event={"ID":"d66a4eea-17a5-4ab5-b3be-2a479226daf7","Type":"ContainerDied","Data":"e2cb41bc141dab0e4fad9d0ccee1a7acf0b20b4fe1fc40ccbcac930aca5db7c7"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.621028 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2cb41bc141dab0e4fad9d0ccee1a7acf0b20b4fe1fc40ccbcac930aca5db7c7" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.621340 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623333 5030 generic.go:334] "Generic (PLEG): container finished" podID="c854d032-629e-4a6e-ba33-4250efb68e93" containerID="fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7" exitCode=0 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623353 5030 generic.go:334] "Generic (PLEG): container finished" podID="c854d032-629e-4a6e-ba33-4250efb68e93" containerID="5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4" exitCode=2 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623361 5030 generic.go:334] "Generic (PLEG): container finished" podID="c854d032-629e-4a6e-ba33-4250efb68e93" containerID="8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a" exitCode=0 Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerDied","Data":"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerDied","Data":"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.623417 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerDied","Data":"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a"} Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777503 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc 
kubenswrapper[5030]: I0120 23:44:23.777558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777590 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777634 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.777904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ztzp\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp\") pod \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\" (UID: \"d66a4eea-17a5-4ab5-b3be-2a479226daf7\") " Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.778454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.778590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.783516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.783651 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp" (OuterVolumeSpecName: "kube-api-access-7ztzp") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "kube-api-access-7ztzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.832828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data" (OuterVolumeSpecName: "config-data") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.834330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.846310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.865807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66a4eea-17a5-4ab5-b3be-2a479226daf7" (UID: "d66a4eea-17a5-4ab5-b3be-2a479226daf7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880036 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ztzp\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-kube-api-access-7ztzp\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880076 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880090 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880103 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880111 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d66a4eea-17a5-4ab5-b3be-2a479226daf7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880119 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d66a4eea-17a5-4ab5-b3be-2a479226daf7-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880127 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:23 crc kubenswrapper[5030]: I0120 23:44:23.880209 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66a4eea-17a5-4ab5-b3be-2a479226daf7-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:24 crc kubenswrapper[5030]: I0120 23:44:24.630710 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" Jan 20 23:44:24 crc kubenswrapper[5030]: I0120 23:44:24.653417 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:44:24 crc kubenswrapper[5030]: I0120 23:44:24.665250 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb"] Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.600928 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.650844 5030 generic.go:334] "Generic (PLEG): container finished" podID="c854d032-629e-4a6e-ba33-4250efb68e93" containerID="fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780" exitCode=0 Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.650919 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.650961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerDied","Data":"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780"} Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.651993 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"c854d032-629e-4a6e-ba33-4250efb68e93","Type":"ContainerDied","Data":"c53a57bba70c5a402c0ef0606c7eaa13ca8412761706f70e19dcfced05a3306c"} Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.652033 5030 scope.go:117] "RemoveContainer" containerID="fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.673740 5030 scope.go:117] "RemoveContainer" containerID="5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.695121 5030 scope.go:117] "RemoveContainer" containerID="fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.712488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4cst\" (UniqueName: \"kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst\") pod \"c854d032-629e-4a6e-ba33-4250efb68e93\" (UID: \"c854d032-629e-4a6e-ba33-4250efb68e93\") " Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.714014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.714088 5030 scope.go:117] "RemoveContainer" containerID="8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.714565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.719124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts" (OuterVolumeSpecName: "scripts") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.719320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst" (OuterVolumeSpecName: "kube-api-access-z4cst") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "kube-api-access-z4cst". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.740781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.767654 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.796789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815209 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815275 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815288 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4cst\" (UniqueName: \"kubernetes.io/projected/c854d032-629e-4a6e-ba33-4250efb68e93-kube-api-access-z4cst\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815299 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815308 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c854d032-629e-4a6e-ba33-4250efb68e93-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815318 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.815327 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.817789 5030 scope.go:117] "RemoveContainer" containerID="fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7" Jan 20 23:44:25 crc kubenswrapper[5030]: E0120 23:44:25.818231 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7\": container with ID starting with fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7 not found: ID does not exist" containerID="fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818267 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7"} err="failed to get container status \"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7\": rpc error: code = NotFound desc = could not find container \"fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7\": container with ID starting with 
fb990b5b667171dd307f376e0733a9f29d039041ca7fe051069b24b8156a6eb7 not found: ID does not exist" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818293 5030 scope.go:117] "RemoveContainer" containerID="5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4" Jan 20 23:44:25 crc kubenswrapper[5030]: E0120 23:44:25.818564 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4\": container with ID starting with 5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4 not found: ID does not exist" containerID="5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818609 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4"} err="failed to get container status \"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4\": rpc error: code = NotFound desc = could not find container \"5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4\": container with ID starting with 5f81a82dbd579b31cd82b7dfcbcf23af4b004b459b1582b1fa9c8715ac7c09a4 not found: ID does not exist" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818672 5030 scope.go:117] "RemoveContainer" containerID="fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780" Jan 20 23:44:25 crc kubenswrapper[5030]: E0120 23:44:25.818921 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780\": container with ID starting with fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780 not found: ID does not exist" containerID="fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818956 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780"} err="failed to get container status \"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780\": rpc error: code = NotFound desc = could not find container \"fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780\": container with ID starting with fa5d613f8d0d813c33ea394506a8aeaabedae1997902ced4ffc7f9afa555c780 not found: ID does not exist" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.818973 5030 scope.go:117] "RemoveContainer" containerID="8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a" Jan 20 23:44:25 crc kubenswrapper[5030]: E0120 23:44:25.819156 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a\": container with ID starting with 8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a not found: ID does not exist" containerID="8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.819185 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a"} err="failed to get container status \"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a\": rpc 
error: code = NotFound desc = could not find container \"8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a\": container with ID starting with 8bbb1503f80b80cf7f8e991e86ddfc92759c57c14cc65be7034179949da55f2a not found: ID does not exist" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.839115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data" (OuterVolumeSpecName: "config-data") pod "c854d032-629e-4a6e-ba33-4250efb68e93" (UID: "c854d032-629e-4a6e-ba33-4250efb68e93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.917612 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c854d032-629e-4a6e-ba33-4250efb68e93-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:25 crc kubenswrapper[5030]: I0120 23:44:25.976269 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" path="/var/lib/kubelet/pods/d66a4eea-17a5-4ab5-b3be-2a479226daf7/volumes" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.615665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.624707 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.648540 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.648952 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-central-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.648971 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-central-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.648980 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-server" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.648986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-server" Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.649019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649027 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.649040 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-notification-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-notification-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.649065 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649071 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: E0120 23:44:26.649080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="sg-core" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649088 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="sg-core" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649273 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="proxy-httpd" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649306 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-central-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="sg-core" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649328 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" containerName="ceilometer-notification-agent" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.649337 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-server" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.651429 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.654443 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.654976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.655037 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.676910 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734032 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nst7v\" (UniqueName: \"kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734103 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.734392 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nst7v\" (UniqueName: \"kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836617 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.836878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.837500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") 
" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.837737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.844690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.845782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.855548 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.855785 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.870203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nst7v\" (UniqueName: \"kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.870610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data\") pod \"ceilometer-0\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.878720 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.882316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.883017 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 20 23:44:26 crc kubenswrapper[5030]: I0120 23:44:26.970041 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.040699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.040741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.040773 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp4fx\" (UniqueName: \"kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.142887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.142937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.142968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp4fx\" (UniqueName: \"kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.143529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.143679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities\") pod \"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.162896 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4fx\" (UniqueName: \"kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx\") pod 
\"community-operators-xqz49\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.266450 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.482974 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.602531 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 20 23:44:27 crc kubenswrapper[5030]: W0120 23:44:27.604676 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046ad923_ab58_4298_83a4_7d2627b5eee9.slice/crio-08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f WatchSource:0}: Error finding container 08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f: Status 404 returned error can't find the container with id 08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.678653 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerStarted","Data":"08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f"} Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.679551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerStarted","Data":"19ef9d49e4482be6c5078408f36807932158ee24a4f9325758201be0283083bb"} Jan 20 23:44:27 crc kubenswrapper[5030]: I0120 23:44:27.985934 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c854d032-629e-4a6e-ba33-4250efb68e93" path="/var/lib/kubelet/pods/c854d032-629e-4a6e-ba33-4250efb68e93/volumes" Jan 20 23:44:28 crc kubenswrapper[5030]: I0120 23:44:28.409778 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.221:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:44:28 crc kubenswrapper[5030]: I0120 23:44:28.409787 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/swift-proxy-7c556c6584-sfpnb" podUID="d66a4eea-17a5-4ab5-b3be-2a479226daf7" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.221:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:44:28 crc kubenswrapper[5030]: I0120 23:44:28.691780 5030 generic.go:334] "Generic (PLEG): container finished" podID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerID="8ae5072d7810fdcd0eec0e344c272820bf4e46792893a41ff51d2afed7ffba3b" exitCode=0 Jan 20 23:44:28 crc kubenswrapper[5030]: I0120 23:44:28.691839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerDied","Data":"8ae5072d7810fdcd0eec0e344c272820bf4e46792893a41ff51d2afed7ffba3b"} Jan 20 23:44:28 crc 
kubenswrapper[5030]: I0120 23:44:28.699479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerStarted","Data":"31de7642e3f6a2920d520c9f8ba02678706279ee75f8771e85a21f79d49d8f2c"} Jan 20 23:44:29 crc kubenswrapper[5030]: I0120 23:44:29.725460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerStarted","Data":"1f6749a25c72c9c787c127ceefad9a03df49b421462cca6b8ac0a09613026753"} Jan 20 23:44:30 crc kubenswrapper[5030]: I0120 23:44:30.739672 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerStarted","Data":"216e958871075e8c220585fbd5ca404192d6413d788bac1112f975b37e4a20f3"} Jan 20 23:44:30 crc kubenswrapper[5030]: I0120 23:44:30.761715 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.218:9696/\": dial tcp 10.217.0.218:9696: connect: connection refused" Jan 20 23:44:33 crc kubenswrapper[5030]: I0120 23:44:33.768503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerStarted","Data":"28b7de7d56face10f996033601d7679206b992fa119d73cbe2e9f054af31eaea"} Jan 20 23:44:33 crc kubenswrapper[5030]: I0120 23:44:33.769816 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:33 crc kubenswrapper[5030]: I0120 23:44:33.805032 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.50802897 podStartE2EDuration="7.805003997s" podCreationTimestamp="2026-01-20 23:44:26 +0000 UTC" firstStartedPulling="2026-01-20 23:44:27.515974213 +0000 UTC m=+4139.836234491" lastFinishedPulling="2026-01-20 23:44:32.81294924 +0000 UTC m=+4145.133209518" observedRunningTime="2026-01-20 23:44:33.794781569 +0000 UTC m=+4146.115041897" watchObservedRunningTime="2026-01-20 23:44:33.805003997 +0000 UTC m=+4146.125264275" Jan 20 23:44:34 crc kubenswrapper[5030]: I0120 23:44:34.786803 5030 generic.go:334] "Generic (PLEG): container finished" podID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerID="443cbed4d3a144e31b0f62f3bd0cb624b9732b6f101e000f07f81af64c3085e4" exitCode=0 Jan 20 23:44:34 crc kubenswrapper[5030]: I0120 23:44:34.789325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerDied","Data":"443cbed4d3a144e31b0f62f3bd0cb624b9732b6f101e000f07f81af64c3085e4"} Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.698838 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5df8ff887d-7k5dd_3b08e5aa-df16-4deb-90e9-27af6c1d690b/neutron-api/0.log" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.699352 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.747749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2kn8\" (UniqueName: \"kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.747842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.747919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.747942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.748816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.748887 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.748960 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config\") pod \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\" (UID: \"3b08e5aa-df16-4deb-90e9-27af6c1d690b\") " Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.754139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.754214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8" (OuterVolumeSpecName: "kube-api-access-j2kn8") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "kube-api-access-j2kn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.808424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerStarted","Data":"914eace812bf63e7559810a7997a45ad8eb574795938f005742a6218faa5d52c"} Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813224 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5df8ff887d-7k5dd_3b08e5aa-df16-4deb-90e9-27af6c1d690b/neutron-api/0.log" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813301 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerID="21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b" exitCode=137 Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813364 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerDied","Data":"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b"} Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5df8ff887d-7k5dd" event={"ID":"3b08e5aa-df16-4deb-90e9-27af6c1d690b","Type":"ContainerDied","Data":"ca8c183d14cec1dc67c813d2efe9151b6b9e35b2d43f55c0f5341b36e6776c67"} Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.813507 5030 scope.go:117] "RemoveContainer" containerID="d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.830918 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xqz49" podStartSLOduration=4.271813055 podStartE2EDuration="10.83089923s" podCreationTimestamp="2026-01-20 23:44:26 +0000 UTC" firstStartedPulling="2026-01-20 23:44:28.698674204 +0000 UTC m=+4141.018934492" lastFinishedPulling="2026-01-20 23:44:35.257760379 +0000 UTC m=+4147.578020667" observedRunningTime="2026-01-20 23:44:36.830315887 +0000 UTC m=+4149.150576175" watchObservedRunningTime="2026-01-20 23:44:36.83089923 +0000 UTC m=+4149.151159518" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.834501 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.835008 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.849744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.851400 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.851452 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.851473 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.851518 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2kn8\" (UniqueName: \"kubernetes.io/projected/3b08e5aa-df16-4deb-90e9-27af6c1d690b-kube-api-access-j2kn8\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.851534 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.853004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config" (OuterVolumeSpecName: "config") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.868849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3b08e5aa-df16-4deb-90e9-27af6c1d690b" (UID: "3b08e5aa-df16-4deb-90e9-27af6c1d690b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.945753 5030 scope.go:117] "RemoveContainer" containerID="21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.953866 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.953909 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b08e5aa-df16-4deb-90e9-27af6c1d690b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.973288 5030 scope.go:117] "RemoveContainer" containerID="d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093" Jan 20 23:44:36 crc kubenswrapper[5030]: E0120 23:44:36.973785 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093\": container with ID starting with d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093 not found: ID does not exist" containerID="d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.973814 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093"} err="failed to get container status \"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093\": rpc error: code = NotFound desc = could not find container \"d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093\": container with ID starting with d6cea96ea84415eb01eca8f809e2ee1e7f206a16c5d3b668df781111cc6b7093 not found: ID does not exist" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.973834 5030 scope.go:117] "RemoveContainer" containerID="21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b" Jan 20 23:44:36 crc kubenswrapper[5030]: E0120 23:44:36.974100 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b\": container with ID starting with 21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b not found: ID does not exist" containerID="21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b" Jan 20 23:44:36 crc kubenswrapper[5030]: I0120 23:44:36.974128 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b"} err="failed to get container status \"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b\": rpc error: code = NotFound desc = could not find container \"21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b\": container with ID starting with 21bb5a1314ff029a0924afc5c5d204212e943dbceb3519fbb0dd59b7c25a4f1b not found: ID does not exist" Jan 20 23:44:37 crc kubenswrapper[5030]: I0120 23:44:37.148421 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:44:37 crc kubenswrapper[5030]: I0120 23:44:37.164809 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/neutron-5df8ff887d-7k5dd"] Jan 20 23:44:37 crc kubenswrapper[5030]: I0120 23:44:37.267441 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:37 crc kubenswrapper[5030]: I0120 23:44:37.267709 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:37 crc kubenswrapper[5030]: I0120 23:44:37.978909 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" path="/var/lib/kubelet/pods/3b08e5aa-df16-4deb-90e9-27af6c1d690b/volumes" Jan 20 23:44:38 crc kubenswrapper[5030]: I0120 23:44:38.305514 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-xqz49" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="registry-server" probeResult="failure" output=< Jan 20 23:44:38 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:44:38 crc kubenswrapper[5030]: > Jan 20 23:44:40 crc kubenswrapper[5030]: I0120 23:44:40.157119 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:44:40 crc kubenswrapper[5030]: I0120 23:44:40.157381 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.095025 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.114165 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.124278 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-vv9f8"] Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.145448 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.145502 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data podName:16f26a59-3681-422d-a46f-406b454f73b8 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:41.645488717 +0000 UTC m=+4153.965749005 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "16f26a59-3681-422d-a46f-406b454f73b8") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.170604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.174207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" containerName="openstackclient" containerID="cri-o://4162ae90212f0f20c63503791f5909dd8447d80bbbc67a89f73e7ea573c92180" gracePeriod=2 Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.192746 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.218252 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-c5jng"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.246447 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.247007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-api" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247021 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-api" Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.247032 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" containerName="openstackclient" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247406 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" containerName="openstackclient" Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.247426 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-httpd" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247434 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-httpd" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247688 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" containerName="openstackclient" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247699 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-httpd" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.247711 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b08e5aa-df16-4deb-90e9-27af6c1d690b" containerName="neutron-api" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.248690 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.265912 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.282528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.310268 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.310508 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" containerID="cri-o://5b31b1b990ceabd4797fda7252ebbcdbfa5957e9115be52ceb86f6f08f169842" gracePeriod=30 Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.310684 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="probe" containerID="cri-o://86726663164ca08470d8023499565b6de220fd18491f622b7a239d4b600235cf" gracePeriod=30 Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.310791 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.316993 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.327689 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-c5jng"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.350185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.350276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts\") pod \"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.350356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfwx\" (UniqueName: \"kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.350373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thds6\" (UniqueName: \"kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6\") pod 
\"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.376363 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.377585 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.388921 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.422985 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.451797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljfcm\" (UniqueName: \"kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm\") pod \"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.451884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mfwx\" (UniqueName: \"kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.451913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thds6\" (UniqueName: \"kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6\") pod \"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.451958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts\") pod \"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.452000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.452072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts\") pod \"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc 
kubenswrapper[5030]: I0120 23:44:41.452965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts\") pod \"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.453096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.462563 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.485274 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.487314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thds6\" (UniqueName: \"kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6\") pod \"barbican-e7ae-account-create-update-lxxbb\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.521898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.556892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfwx\" (UniqueName: \"kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx\") pod \"glance-2dcf-account-create-update-qrmz5\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.557032 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-lz9xk"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.558150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljfcm\" (UniqueName: \"kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm\") pod \"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.558252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts\") pod \"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.559084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts\") pod 
\"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.576028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.599584 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-hczvm"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.632913 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-lz9xk"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.645070 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.647221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljfcm\" (UniqueName: \"kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm\") pod \"root-account-create-update-bthhs\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.661450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.661854 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.661920 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data podName:16f26a59-3681-422d-a46f-406b454f73b8 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:42.661900794 +0000 UTC m=+4154.982161082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "16f26a59-3681-422d-a46f-406b454f73b8") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.662677 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.671053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.711598 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.762711 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.792718 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.882670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.882706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmg8n\" (UniqueName: \"kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.883767 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: E0120 23:44:41.883852 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data podName:2118c3ce-d55e-4a65-8d25-d7f53bcac91e nodeName:}" failed. No retries permitted until 2026-01-20 23:44:42.383823127 +0000 UTC m=+4154.704083405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data") pod "rabbitmq-server-0" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e") : configmap "rabbitmq-config-data" not found Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.916704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.917588 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="openstack-network-exporter" containerID="cri-o://9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892" gracePeriod=300 Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.936742 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59"] Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.985948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.985993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmg8n\" (UniqueName: \"kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.986339 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d97caaa-e49c-4669-893c-b1fe0d02e047" path="/var/lib/kubelet/pods/3d97caaa-e49c-4669-893c-b1fe0d02e047/volumes" Jan 20 23:44:41 crc kubenswrapper[5030]: I0120 23:44:41.987040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.015403 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf760db-bb64-461a-ae21-2ce8224f1f02" path="/var/lib/kubelet/pods/6cf760db-bb64-461a-ae21-2ce8224f1f02/volumes" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.016040 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b244be-2928-4aa4-9d48-8d6411876b72" path="/var/lib/kubelet/pods/e3b244be-2928-4aa4-9d48-8d6411876b72/volumes" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.016572 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5357f4-0242-43f7-b48a-3f2f7ccd4583" path="/var/lib/kubelet/pods/ec5357f4-0242-43f7-b48a-3f2f7ccd4583/volumes" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.018753 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-52b7-account-create-update-5gx59"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.037974 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mphbf"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.083269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmg8n\" (UniqueName: \"kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n\") pod \"placement-914b-account-create-update-25csc\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.098032 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.098137 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:42.598106406 +0000 UTC m=+4154.918366694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "combined-ca-bundle" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.098695 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.098733 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:42.598716751 +0000 UTC m=+4154.918977039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "cert-galera-openstack-svc" not found Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.145113 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-mphbf"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.227288 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-f87km"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.327457 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.368520 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-f87km"] Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.427829 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.427904 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data podName:2118c3ce-d55e-4a65-8d25-d7f53bcac91e nodeName:}" failed. No retries permitted until 2026-01-20 23:44:43.427885936 +0000 UTC m=+4155.748146224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data") pod "rabbitmq-server-0" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e") : configmap "rabbitmq-config-data" not found Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.450695 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.451070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" containerID="cri-o://f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" gracePeriod=30 Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.451670 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="openstack-network-exporter" containerID="cri-o://0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec" gracePeriod=30 Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.483249 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.484938 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.490260 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.501680 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.523685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-mbcgn"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.528294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.528445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2vh\" (UniqueName: \"kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.540528 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dt6fp"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.567195 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-dt6fp"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.604700 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:42 crc kubenswrapper[5030]: 
I0120 23:44:42.618839 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-jkvvn"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.632241 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-jkvvn"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.642077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2vh\" (UniqueName: \"kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.642287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.643560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.643684 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.643747 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:43.643733312 +0000 UTC m=+4155.963993600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "cert-galera-openstack-svc" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.643776 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.643844 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:43.643826954 +0000 UTC m=+4155.964087242 (durationBeforeRetry 1s). 
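
The mount failures above all trace back to API objects that do not exist yet in the openstack-kuttl-tests namespace (secret "combined-ca-bundle", secret "cert-galera-openstack-svc", configmap "rabbitmq-config-data"), so the kubelet cannot materialize those volumes and keeps rescheduling SetUp. Below is a minimal client-go sketch for confirming which of the referenced objects are still missing; only the object and namespace names come from the log, while the kubeconfig path and program layout are assumptions.

// Hypothetical helper, not part of the log or the kuttl suite: checks whether
// the objects the kubelet reports as missing exist yet.
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: a kubeconfig at the default ~/.kube/config location.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ctx := context.Background()
	ns := "openstack-kuttl-tests"

	// Secret names taken from the "Couldn't get secret" lines above.
	for _, name := range []string{"combined-ca-bundle", "cert-galera-openstack-svc"} {
		_, getErr := cs.CoreV1().Secrets(ns).Get(ctx, name, metav1.GetOptions{})
		fmt.Printf("secret %q still missing: %v\n", name, apierrors.IsNotFound(getErr))
	}
	// ConfigMap name taken from the "Couldn't get configMap" lines above.
	_, getErr := cs.CoreV1().ConfigMaps(ns).Get(ctx, "rabbitmq-config-data", metav1.GetOptions{})
	fmt.Printf("configmap %q still missing: %v\n", "rabbitmq-config-data", apierrors.IsNotFound(getErr))
}

Until those objects are (re)created, the affected pods stay stuck in volume setup and the retries in the following entries continue.
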
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "combined-ca-bundle" not found Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.660099 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.660593 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="openstack-network-exporter" containerID="cri-o://9b00346eae6215c81c9b549691ce9a825ffdd4a07f8e28ceb5cc31cd3a07ad35" gracePeriod=300 Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.681396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2vh\" (UniqueName: \"kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh\") pod \"cinder-6e8b-account-create-update-xb886\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.725899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.739303 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.740493 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.745342 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:42 crc kubenswrapper[5030]: E0120 23:44:42.745397 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data podName:16f26a59-3681-422d-a46f-406b454f73b8 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.745382677 +0000 UTC m=+4157.065642965 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "16f26a59-3681-422d-a46f-406b454f73b8") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.745494 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.751281 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.752554 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.757644 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.775010 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-bjtz7"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.776682 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.786819 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.807980 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="ovsdbserver-sb" containerID="cri-o://ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" gracePeriod=300 Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.848738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.848796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmcg\" (UniqueName: \"kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg\") pod \"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.848824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7wg\" (UniqueName: \"kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.848868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts\") pod \"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.883369 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.896137 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.897653 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.906221 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.922476 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.938955 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a24a888-0314-4501-b989-20463ef27332" containerID="0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec" exitCode=2 Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.939032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerDied","Data":"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec"} Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmcg\" (UniqueName: \"kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg\") pod \"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7wg\" (UniqueName: \"kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959209 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.959233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts\") pod 
\"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.960081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts\") pod \"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.960693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.969671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9k2jq"] Jan 20 23:44:42 crc kubenswrapper[5030]: I0120 23:44:42.989024 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-9k2jq"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.006873 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="ovsdbserver-nb" containerID="cri-o://c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" gracePeriod=300 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.009866 5030 generic.go:334] "Generic (PLEG): container finished" podID="3fc32705-7b67-4425-88be-7fe57171135e" containerID="9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892" exitCode=2 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.009922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerDied","Data":"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892"} Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.015103 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.015689 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-server" containerID="cri-o://ca2b82dd6522d59c8f400191a6a84849e1a0dc5afd452ebd9ad3d6e6bb95252c" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016186 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="swift-recon-cron" containerID="cri-o://703b4130b09e7e9f7f77d19f3d5f9809062cb9cb593d15aaae512b984d70c591" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016263 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="rsync" containerID="cri-o://eab5cc67834e8fd543f3f0bf8bd1fcda92107d8d7ed59e2a5b6142df62f58fce" gracePeriod=30 Jan 20 23:44:43 crc 
kubenswrapper[5030]: I0120 23:44:43.016310 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-replicator" containerID="cri-o://1cf6b355d0e499f70b76d7c2fa59bfce9c9ed28b910c999b8bcfdcc19ec0dbf2" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016316 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-auditor" containerID="cri-o://32ce18f6289e30499008982b3bc6f5d28395565c34d8a569f794f6c6a7775704" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016361 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-server" containerID="cri-o://1804d3605dcae8df1649629d1484b17dbb6c4cb8599bf27e3f1d93177fa864f6" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016383 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-expirer" containerID="cri-o://0cc78d3b2cdd7c16a9c523aa805dd25a524946b1037801b958e0b5aca73eae5d" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016392 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-reaper" containerID="cri-o://fffadcca529e80041791a2dbc5892ae2ae5e249cae53e8e2aea4dd2a52b5ed5f" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016413 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-updater" containerID="cri-o://12eaa764aa377753f5ace52c861bbb3f3689b2fca77a117e7d7f96175209240d" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016442 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-auditor" containerID="cri-o://0be440e2b30ae5389f7c2cbaafbac4d4866fd7d5b1985dbe1813ca1df94788ca" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016444 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-auditor" containerID="cri-o://cc48b4c6d8e07a0b1db15c7843485e19655d2fc4c950337f5fa62bc1a6c473e8" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016456 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-replicator" containerID="cri-o://bac50d4cc962099037430a767b8939f7657afbb7b51f2a1c37620ae6aac67d42" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016498 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-server" 
containerID="cri-o://60ab4b7b3cfb297f14a1ca0ce6b7a2cef1d8b852da1a174b8c0a481cccd94e44" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016525 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-replicator" containerID="cri-o://5a0e990417f8c5fb20ac6296acd86d2880ed3ee8806354f7c4c3d73054ac98ed" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.016530 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-updater" containerID="cri-o://ba230f74c1544afcbb8b1f9c898cc75207ae9e7b4b8083280e546953073e28ef" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.023993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmcg\" (UniqueName: \"kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg\") pod \"nova-api-cb33-account-create-update-b86k6\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.035243 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:43 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:44:43 crc kubenswrapper[5030]: else Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:43 crc kubenswrapper[5030]: fi Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:43 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:43 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:43 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:43 crc kubenswrapper[5030]: # support updates Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.036298 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" podUID="c8a88bb5-d95c-4db3-a79f-51958a894a2b" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.038438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" event={"ID":"c8a88bb5-d95c-4db3-a79f-51958a894a2b","Type":"ContainerStarted","Data":"ed5989cca24847c95b1dca37bd0ae38eeab993dadc8ce6b9a2adaaa338217b12"} Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.047243 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:43 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:44:43 crc kubenswrapper[5030]: else Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:43 crc kubenswrapper[5030]: fi Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:43 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:43 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:43 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:43 crc kubenswrapper[5030]: # support updates Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.047405 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7wg\" (UniqueName: \"kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg\") pod \"nova-cell0-617d-account-create-update-mmt4z\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.050819 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" podUID="4d88dc1f-6d17-4a07-b701-d1e578d686ad" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.060971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.061060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.062351 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.062431 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:43.562410819 +0000 UTC m=+4155.882671107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.066829 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kwph5 for pod openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.067594 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5 podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:43.567567863 +0000 UTC m=+4155.887828151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kwph5" (UniqueName: "kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.084219 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.098657 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-jnfhw"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.111687 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.124123 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.124398 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-log" containerID="cri-o://0c5c3896b95122ba2528e0d7943d1eba9eaee2155abb09b973a5236b470342ce" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.124520 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-httpd" containerID="cri-o://b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.164145 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.196437 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.198406 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.200549 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.210419 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-npd8s"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.218663 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.219002 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.219818 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-72mr2"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.227841 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-72mr2"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.236653 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.237041 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-log" containerID="cri-o://7de8ca058af69798715808c2c705adb228699f92727076f0a21040a1e153dd1b" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.237171 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-api" containerID="cri-o://a6d258088fb951bc34644f2ffa05470352c7ec6b7690b7eaa4f6eb8de5451c0f" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.253085 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" 
containerID="cri-o://7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef" gracePeriod=604800 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.255807 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.256046 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-log" containerID="cri-o://b54bda34645aa30be316ecd336547283562d059fb3040eb96063b7c53a7e4521" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.256424 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-httpd" containerID="cri-o://4d4ad5a6d7d11df4886fa6a22b4f7a1bd73a0b2aeaa3ab4e6b5789b085035912" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.292289 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.294593 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.304230 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.355389 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.389302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.389342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5w6\" (UniqueName: \"kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.389503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.408348 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 is running failed: container process not found" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc 
kubenswrapper[5030]: I0120 23:44:43.408518 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-t4959"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.412474 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 is running failed: container process not found" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.417676 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-t4959"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.418797 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 is running failed: container process not found" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.418866 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="ovsdbserver-sb" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.473669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.489107 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.490773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.490861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.490884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5w6\" (UniqueName: \"kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.491508 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.491555 5030 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data podName:2118c3ce-d55e-4a65-8d25-d7f53bcac91e nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.491539558 +0000 UTC m=+4157.811799846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data") pod "rabbitmq-server-0" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e") : configmap "rabbitmq-config-data" not found Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.492484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.492744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.508365 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075 is running failed: container process not found" containerID="c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.511820 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075 is running failed: container process not found" containerID="c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.518748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.522727 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075 is running failed: container process not found" containerID="c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.522790 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="ovsdbserver-nb" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.539298 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-m68lb"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 
23:44:43.549473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5w6\" (UniqueName: \"kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6\") pod \"dnsmasq-dnsmasq-84b9f45d47-8sp2w\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.592725 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.592922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.593256 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.594040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.594182 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.594165468 +0000 UTC m=+4156.914425756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.601394 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kwph5 for pod openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.601448 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5 podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.601432834 +0000 UTC m=+4156.921693112 (durationBeforeRetry 1s). 
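
The "ExecSync cmd from runtime service failed" and "Probe errored" entries above are readiness probes executing against containers that are already being stopped ("container is stopping", "container process not found"), so they are expected teardown noise rather than an independent failure. The probe is an exec probe whose command appears verbatim in the log; the Go sketch below shows its shape, with timing fields as assumptions (not values read from the real ovsdbserver pod spec) and the ProbeHandler field name assuming a recent k8s.io/api release.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Command copied from the ExecSync lines above; PeriodSeconds and
	// FailureThreshold are illustrative assumptions.
	readiness := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{Command: []string{"/usr/bin/pidof", "ovsdb-server"}},
		},
		PeriodSeconds:    10,
		FailureThreshold: 3,
	}
	fmt.Printf("%+v\n", readiness)
}
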
Error: MountVolume.SetUp failed for volume "kube-api-access-kwph5" (UniqueName: "kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.609241 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-b95d7"] Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.642290 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:43 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:44:43 crc kubenswrapper[5030]: else Jan 20 23:44:43 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:43 crc kubenswrapper[5030]: fi Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:43 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:43 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:43 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:43 crc kubenswrapper[5030]: # support updates Jan 20 23:44:43 crc kubenswrapper[5030]: Jan 20 23:44:43 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.645286 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" podUID="12f927a3-29ee-4af3-97e7-dc9998169372" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.656119 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.658012 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.678803 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.679042 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-api" containerID="cri-o://c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.679415 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-httpd" containerID="cri-o://ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.703386 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.703451 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.703437429 +0000 UTC m=+4158.023697727 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "combined-ca-bundle" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.703489 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:44:43 crc kubenswrapper[5030]: E0120 23:44:43.703509 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.703503531 +0000 UTC m=+4158.023763819 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "cert-galera-openstack-svc" not found Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.724575 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.749167 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.759062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-mntdc"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.766983 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-mntdc"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.772721 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-svl75"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.789267 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-svl75"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.800286 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.800508 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api-log" containerID="cri-o://181729877b0aab834202ecd11fb5d266acca0c2132cefa652fcb436c28a852fb" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.801433 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api" containerID="cri-o://db87e0d657028f0da82f16b08db2f08eb396b07e9c9a8800ab5a9e1a8e82760c" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.835870 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-ch2zp"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.840499 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_3fc32705-7b67-4425-88be-7fe57171135e/ovsdbserver-sb/0.log" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.840565 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.854974 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-ch2zp"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.867020 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.885986 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jkq6c"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.901439 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-jkq6c"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.907719 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.907810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.907838 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnbz7\" (UniqueName: \"kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.909146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.909279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.909308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.909357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" (UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.909396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"3fc32705-7b67-4425-88be-7fe57171135e\" 
(UID: \"3fc32705-7b67-4425-88be-7fe57171135e\") " Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.921857 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.922060 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker-log" containerID="cri-o://f1f39f1af8be8586d4e08109ccc15fd44076fee82dbc43273800e8dc177416bf" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.922404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.922434 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker" containerID="cri-o://124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.923752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts" (OuterVolumeSpecName: "scripts") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.924131 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config" (OuterVolumeSpecName: "config") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.926043 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.926227 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener-log" containerID="cri-o://d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.926305 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener" containerID="cri-o://5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.931893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.939044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7" (OuterVolumeSpecName: "kube-api-access-gnbz7") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "kube-api-access-gnbz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.960678 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.960939 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" containerID="cri-o://f878fb22f76e5c3d5c5aad54d40d91cba28b59e426192e35705dec229d359f10" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.961068 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" containerID="cri-o://44672b9e5ac71bd9b91365302a913d20bb6e00d96ec91f69b8166b36d5e7f2e4" gracePeriod=30 Jan 20 23:44:43 crc kubenswrapper[5030]: I0120 23:44:43.966766 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" secret="" err="secret \"galera-openstack-cell1-dockercfg-rjg7g\" not found" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.006941 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03910f64-db26-4c5e-8122-f88cce939744" path="/var/lib/kubelet/pods/03910f64-db26-4c5e-8122-f88cce939744/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.007703 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16ed07d6-f54e-428f-b3fa-a643b2853ea5" path="/var/lib/kubelet/pods/16ed07d6-f54e-428f-b3fa-a643b2853ea5/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.008235 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b22b9f9-3ec2-44dd-9284-574e90be8eae" path="/var/lib/kubelet/pods/1b22b9f9-3ec2-44dd-9284-574e90be8eae/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.014567 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.014608 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnbz7\" (UniqueName: \"kubernetes.io/projected/3fc32705-7b67-4425-88be-7fe57171135e-kube-api-access-gnbz7\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.014635 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fc32705-7b67-4425-88be-7fe57171135e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.014644 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fc32705-7b67-4425-88be-7fe57171135e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.014663 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015344 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015456 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.515439417 +0000 UTC m=+4156.835699705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015472 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015862 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. 
No retries permitted until 2026-01-20 23:44:44.515847238 +0000 UTC m=+4156.836107526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "combined-ca-bundle" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015509 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015887 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.515881619 +0000 UTC m=+4156.836141907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015550 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015906 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.515901349 +0000 UTC m=+4156.836161637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015369 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.015938 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:44.51592813 +0000 UTC m=+4156.836188418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.016538 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8" path="/var/lib/kubelet/pods/2cd3e31b-fb34-4bf2-8605-fbb4cac3b7d8/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.017117 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6bc82a-43cc-4f35-b324-c2f2a40cefbd" path="/var/lib/kubelet/pods/2d6bc82a-43cc-4f35-b324-c2f2a40cefbd/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.017800 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4730508a-e5d2-41c3-9358-5f8cd815ae15" path="/var/lib/kubelet/pods/4730508a-e5d2-41c3-9358-5f8cd815ae15/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.018912 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc0a822-f532-4667-983b-b4b9f5653853" path="/var/lib/kubelet/pods/5cc0a822-f532-4667-983b-b4b9f5653853/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.019446 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4ba8d1-6bb4-4ff7-ab51-46533317e15d" path="/var/lib/kubelet/pods/6c4ba8d1-6bb4-4ff7-ab51-46533317e15d/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.020806 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f33bd8-c4ff-4eec-96d8-8244037c2edb" path="/var/lib/kubelet/pods/78f33bd8-c4ff-4eec-96d8-8244037c2edb/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.021367 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="843da113-e41e-40b0-9d7a-c5486468ca5d" path="/var/lib/kubelet/pods/843da113-e41e-40b0-9d7a-c5486468ca5d/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.022565 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874081cb-7984-4363-818a-78f8ed9321ed" path="/var/lib/kubelet/pods/874081cb-7984-4363-818a-78f8ed9321ed/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.023134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7de4926-8119-4b2c-b581-9a68520f43e0" path="/var/lib/kubelet/pods/a7de4926-8119-4b2c-b581-9a68520f43e0/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.023698 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c4cd6f-c719-424c-8f38-accb36a496ff" path="/var/lib/kubelet/pods/a9c4cd6f-c719-424c-8f38-accb36a496ff/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.024716 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b44006-d170-4768-8096-67d4dd02546c" path="/var/lib/kubelet/pods/c1b44006-d170-4768-8096-67d4dd02546c/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.025230 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d57228-bbc8-4b0d-b4bc-9fd11cca7630" path="/var/lib/kubelet/pods/c5d57228-bbc8-4b0d-b4bc-9fd11cca7630/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.025827 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9412ce4-a977-4306-a71c-f52217d2a3e7" path="/var/lib/kubelet/pods/c9412ce4-a977-4306-a71c-f52217d2a3e7/volumes" Jan 20 23:44:44 crc 
kubenswrapper[5030]: I0120 23:44:44.026768 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce721ec0-af84-470d-a905-0c6163c1580f" path="/var/lib/kubelet/pods/ce721ec0-af84-470d-a905-0c6163c1580f/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.027286 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc45df02-ca0a-4c18-813e-0c644876cd5f" path="/var/lib/kubelet/pods/dc45df02-ca0a-4c18-813e-0c644876cd5f/volumes" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.043152 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.043200 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.053455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" event={"ID":"4d88dc1f-6d17-4a07-b701-d1e578d686ad","Type":"ContainerStarted","Data":"9a50bf10fe0a10ae31abca060a64d550c1740d2a92517342198500c80451a643"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.066653 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_3fc32705-7b67-4425-88be-7fe57171135e/ovsdbserver-sb/0.log" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.066894 5030 generic.go:334] "Generic (PLEG): container finished" podID="3fc32705-7b67-4425-88be-7fe57171135e" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.067170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerDied","Data":"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.068778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fc32705-7b67-4425-88be-7fe57171135e","Type":"ContainerDied","Data":"e62679548f35be4b9b789d9aaf3f4e7ba177a3e149c1161ad44252ad3dcbd2b1"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.068895 5030 scope.go:117] "RemoveContainer" containerID="9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.069948 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.073877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.074124 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-log" containerID="cri-o://b59c18df1e318f6b2e3f84ac12a6374255e6026543722d5cbad05da5899f1a0b" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.074268 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-api" containerID="cri-o://dbf3f368b62a0be16be3046e1cabd348e08742f826a1911f3c06fe9f0acb438c" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.080687 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-98jxz"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.089277 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-98jxz"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.094939 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.102165 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.106508 5030 generic.go:334] "Generic (PLEG): container finished" podID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerID="0c5c3896b95122ba2528e0d7943d1eba9eaee2155abb09b973a5236b470342ce" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.106567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerDied","Data":"0c5c3896b95122ba2528e0d7943d1eba9eaee2155abb09b973a5236b470342ce"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.112308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" event={"ID":"12f927a3-29ee-4af3-97e7-dc9998169372","Type":"ContainerStarted","Data":"fcad1326e3665d00f689aafc66bab7877882a99e7a0a9bc1128fbe61d2dd5743"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.120558 5030 generic.go:334] "Generic (PLEG): container finished" podID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerID="7de8ca058af69798715808c2c705adb228699f92727076f0a21040a1e153dd1b" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.120609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerDied","Data":"7de8ca058af69798715808c2c705adb228699f92727076f0a21040a1e153dd1b"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.135890 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.136233 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" 
containerName="nova-metadata-log" containerID="cri-o://3ee7c875e7608cf4c0f7d6601a304573ca4dfceefd3058f7a8848e454d0f9c9d" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.136882 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-metadata" containerID="cri-o://4c0e97ab7ac916ffeec23817d4c8f83530ee3240f89e8b751dc5b7193307f748" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150037 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="eab5cc67834e8fd543f3f0bf8bd1fcda92107d8d7ed59e2a5b6142df62f58fce" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150069 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="0cc78d3b2cdd7c16a9c523aa805dd25a524946b1037801b958e0b5aca73eae5d" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150076 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="12eaa764aa377753f5ace52c861bbb3f3689b2fca77a117e7d7f96175209240d" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150083 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="cc48b4c6d8e07a0b1db15c7843485e19655d2fc4c950337f5fa62bc1a6c473e8" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150090 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="bac50d4cc962099037430a767b8939f7657afbb7b51f2a1c37620ae6aac67d42" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150097 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="60ab4b7b3cfb297f14a1ca0ce6b7a2cef1d8b852da1a174b8c0a481cccd94e44" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150104 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="ba230f74c1544afcbb8b1f9c898cc75207ae9e7b4b8083280e546953073e28ef" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150110 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="32ce18f6289e30499008982b3bc6f5d28395565c34d8a569f794f6c6a7775704" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150116 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="1cf6b355d0e499f70b76d7c2fa59bfce9c9ed28b910c999b8bcfdcc19ec0dbf2" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150123 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="1804d3605dcae8df1649629d1484b17dbb6c4cb8599bf27e3f1d93177fa864f6" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150129 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="fffadcca529e80041791a2dbc5892ae2ae5e249cae53e8e2aea4dd2a52b5ed5f" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150135 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" 
containerID="0be440e2b30ae5389f7c2cbaafbac4d4866fd7d5b1985dbe1813ca1df94788ca" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150144 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="5a0e990417f8c5fb20ac6296acd86d2880ed3ee8806354f7c4c3d73054ac98ed" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150150 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="ca2b82dd6522d59c8f400191a6a84849e1a0dc5afd452ebd9ad3d6e6bb95252c" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"eab5cc67834e8fd543f3f0bf8bd1fcda92107d8d7ed59e2a5b6142df62f58fce"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"0cc78d3b2cdd7c16a9c523aa805dd25a524946b1037801b958e0b5aca73eae5d"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"12eaa764aa377753f5ace52c861bbb3f3689b2fca77a117e7d7f96175209240d"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"cc48b4c6d8e07a0b1db15c7843485e19655d2fc4c950337f5fa62bc1a6c473e8"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"bac50d4cc962099037430a767b8939f7657afbb7b51f2a1c37620ae6aac67d42"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"60ab4b7b3cfb297f14a1ca0ce6b7a2cef1d8b852da1a174b8c0a481cccd94e44"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"ba230f74c1544afcbb8b1f9c898cc75207ae9e7b4b8083280e546953073e28ef"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"32ce18f6289e30499008982b3bc6f5d28395565c34d8a569f794f6c6a7775704"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"1cf6b355d0e499f70b76d7c2fa59bfce9c9ed28b910c999b8bcfdcc19ec0dbf2"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"1804d3605dcae8df1649629d1484b17dbb6c4cb8599bf27e3f1d93177fa864f6"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"fffadcca529e80041791a2dbc5892ae2ae5e249cae53e8e2aea4dd2a52b5ed5f"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150381 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"0be440e2b30ae5389f7c2cbaafbac4d4866fd7d5b1985dbe1813ca1df94788ca"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"5a0e990417f8c5fb20ac6296acd86d2880ed3ee8806354f7c4c3d73054ac98ed"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.150407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"ca2b82dd6522d59c8f400191a6a84849e1a0dc5afd452ebd9ad3d6e6bb95252c"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.158874 5030 generic.go:334] "Generic (PLEG): container finished" podID="50bb7262-9571-4851-9399-5115ee497125" containerID="181729877b0aab834202ecd11fb5d266acca0c2132cefa652fcb436c28a852fb" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.158989 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.159025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerDied","Data":"181729877b0aab834202ecd11fb5d266acca0c2132cefa652fcb436c28a852fb"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.169319 5030 generic.go:334] "Generic (PLEG): container finished" podID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerID="b54bda34645aa30be316ecd336547283562d059fb3040eb96063b7c53a7e4521" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.169406 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-kf4fb"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.169432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerDied","Data":"b54bda34645aa30be316ecd336547283562d059fb3040eb96063b7c53a7e4521"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.177547 5030 generic.go:334] "Generic (PLEG): container finished" podID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerID="ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.177652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerDied","Data":"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 
23:44:44.179809 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-kf4fb"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.180005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bthhs" event={"ID":"cfd208a2-6c29-4066-a573-c6b35a6a59d5","Type":"ContainerStarted","Data":"4b45c425c67d45996af8598a9def2ca1dc154197f4548a8d2570f3c3420df911"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.184278 5030 generic.go:334] "Generic (PLEG): container finished" podID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerID="86726663164ca08470d8023499565b6de220fd18491f622b7a239d4b600235cf" exitCode=0 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.184346 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerDied","Data":"86726663164ca08470d8023499565b6de220fd18491f622b7a239d4b600235cf"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.189987 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_f7698f27-dba0-4e8f-95b4-329a483d6691/ovsdbserver-nb/0.log" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.190030 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerID="9b00346eae6215c81c9b549691ce9a825ffdd4a07f8e28ceb5cc31cd3a07ad35" exitCode=2 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.190048 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerID="c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" exitCode=143 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.190117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerDied","Data":"9b00346eae6215c81c9b549691ce9a825ffdd4a07f8e28ceb5cc31cd3a07ad35"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.190142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerDied","Data":"c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075"} Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.193708 5030 generic.go:334] "Generic (PLEG): container finished" podID="bfe5c819-263d-4ea5-934c-51567db0e55e" containerID="4162ae90212f0f20c63503791f5909dd8447d80bbbc67a89f73e7ea573c92180" exitCode=137 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.194225 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8"] Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.195265 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kwph5 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" podUID="37397e88-13fe-4396-810e-d87514548283" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.215259 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.216736 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/nova-scheduler-0" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerName="nova-scheduler-scheduler" containerID="cri-o://541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.226151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z6fdw"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.234583 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-z6fdw"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.244490 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.250768 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-z2rqh"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.269748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.280309 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.280487 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.284723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.284987 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1abdfa8e1738c463a2fd304df58c250bf4c6932fce81a2f5d1066f83952f0f5c" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.294115 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.302662 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-vb7pb"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.312574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.313543 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.328003 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.328301 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" 
podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerName="kube-state-metrics" containerID="cri-o://deeb6b4d1668a9582b4c378c594679c4235c09d76d1e2e4ec46aa1ca229900d6" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.348246 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.348529 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-central-agent" containerID="cri-o://31de7642e3f6a2920d520c9f8ba02678706279ee75f8771e85a21f79d49d8f2c" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.349168 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="proxy-httpd" containerID="cri-o://28b7de7d56face10f996033601d7679206b992fa119d73cbe2e9f054af31eaea" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.349226 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="sg-core" containerID="cri-o://216e958871075e8c220585fbd5ca404192d6413d788bac1112f975b37e4a20f3" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.349261 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-notification-agent" containerID="cri-o://1f6749a25c72c9c787c127ceefad9a03df49b421462cca6b8ac0a09613026753" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.357417 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.394067 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "glance" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="glance" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.395171 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" podUID="4d88dc1f-6d17-4a07-b701-d1e578d686ad" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.410329 5030 scope.go:117] "RemoveContainer" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.410891 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.411355 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.411990 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" podUID="12f927a3-29ee-4af3-97e7-dc9998169372" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.412384 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" podUID="c8a88bb5-d95c-4db3-a79f-51958a894a2b" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.414882 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.424793 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.426462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret\") pod \"bfe5c819-263d-4ea5-934c-51567db0e55e\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.426483 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "cinder" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="cinder" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.429458 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" podUID="af19e9cd-1a61-43ac-a803-fdb7e92831e9" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.430817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config\") pod \"bfe5c819-263d-4ea5-934c-51567db0e55e\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.431070 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmvb4\" (UniqueName: \"kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4\") pod \"bfe5c819-263d-4ea5-934c-51567db0e55e\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.431101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle\") pod \"bfe5c819-263d-4ea5-934c-51567db0e55e\" (UID: \"bfe5c819-263d-4ea5-934c-51567db0e55e\") " Jan 20 23:44:44 crc kubenswrapper[5030]: W0120 23:44:44.434306 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660c45c7_982d_49a6_be38_7a74cf69a00e.slice/crio-5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390 WatchSource:0}: Error finding container 5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390: Status 404 returned error can't find the container with id 5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.439706 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.439913 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-httpd" containerID="cri-o://3e26081edbb10019449a003a17538e1de68bd9f0dc38018db630e0edd2157f3c" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.440227 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-server" containerID="cri-o://3ff22418a328424b32b7b4f415a4a4ad754ab1eb3343427244682702b8634303" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.445183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4" (OuterVolumeSpecName: "kube-api-access-bmvb4") pod "bfe5c819-263d-4ea5-934c-51567db0e55e" (UID: "bfe5c819-263d-4ea5-934c-51567db0e55e"). 
InnerVolumeSpecName "kube-api-access-bmvb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.446521 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.453823 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.468885 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.469079 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:44:44 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:44:44 crc kubenswrapper[5030]: else Jan 20 23:44:44 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:44:44 crc kubenswrapper[5030]: fi Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:44:44 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:44:44 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:44:44 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:44:44 crc kubenswrapper[5030]: # support updates Jan 20 23:44:44 crc kubenswrapper[5030]: Jan 20 23:44:44 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.470344 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" podUID="660c45c7-982d-49a6-be38-7a74cf69a00e" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.470681 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" podUID="8ac2d572-0fed-4309-bb22-03bbc9a30270" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.478099 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_f7698f27-dba0-4e8f-95b4-329a483d6691/ovsdbserver-nb/0.log" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.478160 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.531746 5030 scope.go:117] "RemoveContainer" containerID="9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.532164 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892\": container with ID starting with 9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892 not found: ID does not exist" containerID="9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.532191 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892"} err="failed to get container status \"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892\": rpc error: code = NotFound desc = could not find container \"9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892\": container with ID starting with 9b46431a294cf5d85df14faf26ae76b884709b85de6b4b62da3a05b399483892 not found: ID does not exist" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.532215 5030 scope.go:117] "RemoveContainer" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.532558 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605\": container with ID starting with ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 not found: ID does not exist" containerID="ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.532819 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605"} err="failed to get container status \"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605\": rpc error: code = NotFound desc = could not find container \"ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605\": container with ID starting with ec0c2500d9a3af05e24d8f0ff05c11d34b634201321697fba7e7c81adfa13605 not found: ID does not exist" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.535687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.535810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536223 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs\") pod 
\"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dgs5\" (UniqueName: \"kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.536732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f7698f27-dba0-4e8f-95b4-329a483d6691\" (UID: \"f7698f27-dba0-4e8f-95b4-329a483d6691\") " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.537220 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmvb4\" (UniqueName: \"kubernetes.io/projected/bfe5c819-263d-4ea5-934c-51567db0e55e-kube-api-access-bmvb4\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537302 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537350 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.537336898 +0000 UTC m=+4157.857597186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537724 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537797 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. 
No retries permitted until 2026-01-20 23:44:45.537789 +0000 UTC m=+4157.858049288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537847 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.537868 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.537861621 +0000 UTC m=+4157.858121909 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.537881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts" (OuterVolumeSpecName: "scripts") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.538196 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.538338 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.538418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config" (OuterVolumeSpecName: "config") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.538590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.538650 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.53822177 +0000 UTC m=+4157.858482058 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.538668 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:45.53865962 +0000 UTC m=+4157.858919908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "combined-ca-bundle" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.546794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5" (OuterVolumeSpecName: "kube-api-access-7dgs5") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "kube-api-access-7dgs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.565670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.593504 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:44:44 crc kubenswrapper[5030]: W0120 23:44:44.613736 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa8f718e_3edd_4de0_9125_7d981584e6e0.slice/crio-02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5 WatchSource:0}: Error finding container 02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5: Status 404 returned error can't find the container with id 02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.638900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.638951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.639019 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.639034 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.639044 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7698f27-dba0-4e8f-95b4-329a483d6691-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.639054 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.639065 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dgs5\" (UniqueName: \"kubernetes.io/projected/f7698f27-dba0-4e8f-95b4-329a483d6691-kube-api-access-7dgs5\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.639199 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.639279 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:46.639255881 +0000 UTC m=+4158.959516209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.643061 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kwph5 for pod openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.643115 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5 podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:46.643100534 +0000 UTC m=+4158.963360822 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kwph5" (UniqueName: "kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.662337 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.708993 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.249:8081/readyz\": dial tcp 10.217.0.249:8081: connect: connection refused" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.741417 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.766666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfe5c819-263d-4ea5-934c-51567db0e55e" (UID: "bfe5c819-263d-4ea5-934c-51567db0e55e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.771375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.779810 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="rabbitmq" containerID="cri-o://8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec" gracePeriod=604800 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.789888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.790042 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.811170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "bfe5c819-263d-4ea5-934c-51567db0e55e" (UID: "bfe5c819-263d-4ea5-934c-51567db0e55e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.818795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.822457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "3fc32705-7b67-4425-88be-7fe57171135e" (UID: "3fc32705-7b67-4425-88be-7fe57171135e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.825194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "bfe5c819-263d-4ea5-934c-51567db0e55e" (UID: "bfe5c819-263d-4ea5-934c-51567db0e55e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843445 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843472 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843484 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bfe5c819-263d-4ea5-934c-51567db0e55e-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843494 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843504 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843514 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843523 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fc32705-7b67-4425-88be-7fe57171135e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.843534 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfe5c819-263d-4ea5-934c-51567db0e55e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.843587 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: E0120 23:44:44.843657 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data podName:16f26a59-3681-422d-a46f-406b454f73b8 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:48.843616888 +0000 UTC m=+4161.163877176 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "16f26a59-3681-422d-a46f-406b454f73b8") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.927094 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="galera" containerID="cri-o://25486648d351881e64da50149b9a74c4d78a6f3be0d43be4097ddd110093e64c" gracePeriod=30 Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.945942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:44 crc kubenswrapper[5030]: I0120 23:44:44.969691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f7698f27-dba0-4e8f-95b4-329a483d6691" (UID: "f7698f27-dba0-4e8f-95b4-329a483d6691"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.048350 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.048410 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7698f27-dba0-4e8f-95b4-329a483d6691-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.198909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.204018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" event={"ID":"8ac2d572-0fed-4309-bb22-03bbc9a30270","Type":"ContainerStarted","Data":"5c4c0e384351decfa0e06db0ec510afd7cd507fd97199e0f95a9982922d2aed9"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.206070 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.211601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" event={"ID":"af19e9cd-1a61-43ac-a803-fdb7e92831e9","Type":"ContainerStarted","Data":"6aa6dca3f90f6440c9827bb4333a532df782d70124418bfb8cd81904cb9d563d"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.215079 5030 generic.go:334] "Generic (PLEG): container finished" podID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerID="1abdfa8e1738c463a2fd304df58c250bf4c6932fce81a2f5d1066f83952f0f5c" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.215130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"24b38406-08cf-452c-b0f8-c455ed6b7ece","Type":"ContainerDied","Data":"1abdfa8e1738c463a2fd304df58c250bf4c6932fce81a2f5d1066f83952f0f5c"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.255599 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_f7698f27-dba0-4e8f-95b4-329a483d6691/ovsdbserver-nb/0.log" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.255677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"f7698f27-dba0-4e8f-95b4-329a483d6691","Type":"ContainerDied","Data":"aceb0c132ca610f79613ed759be8a9c277b262f6bda56734910d3f61ee6a199a"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.255711 5030 scope.go:117] "RemoveContainer" containerID="9b00346eae6215c81c9b549691ce9a825ffdd4a07f8e28ceb5cc31cd3a07ad35" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.255811 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.303209 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.314980 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.318914 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.333455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" event={"ID":"660c45c7-982d-49a6-be38-7a74cf69a00e","Type":"ContainerStarted","Data":"5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.351197 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerID="3ee7c875e7608cf4c0f7d6601a304573ca4dfceefd3058f7a8848e454d0f9c9d" exitCode=143 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.351279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerDied","Data":"3ee7c875e7608cf4c0f7d6601a304573ca4dfceefd3058f7a8848e454d0f9c9d"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.386141 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerID="f1f39f1af8be8586d4e08109ccc15fd44076fee82dbc43273800e8dc177416bf" exitCode=143 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.386208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerDied","Data":"f1f39f1af8be8586d4e08109ccc15fd44076fee82dbc43273800e8dc177416bf"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.412420 5030 scope.go:117] "RemoveContainer" containerID="c60cc42b20fa5da460341c91ae834bc23e0583d2671863d235c214e33a0cc075" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.423551 5030 generic.go:334] "Generic (PLEG): container finished" podID="f205895a-be1d-4da7-adbc-773b2750d39b" containerID="d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f" exitCode=143 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.423652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerDied","Data":"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.433701 5030 generic.go:334] "Generic (PLEG): container finished" podID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerID="5b31b1b990ceabd4797fda7252ebbcdbfa5957e9115be52ceb86f6f08f169842" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.433788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerDied","Data":"5b31b1b990ceabd4797fda7252ebbcdbfa5957e9115be52ceb86f6f08f169842"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.437334 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerID="deeb6b4d1668a9582b4c378c594679c4235c09d76d1e2e4ec46aa1ca229900d6" exitCode=2 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 
23:44:45.437384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"86b0b2ab-44de-47c1-b0f4-6fc63d87393a","Type":"ContainerDied","Data":"deeb6b4d1668a9582b4c378c594679c4235c09d76d1e2e4ec46aa1ca229900d6"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442218 5030 generic.go:334] "Generic (PLEG): container finished" podID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerID="28b7de7d56face10f996033601d7679206b992fa119d73cbe2e9f054af31eaea" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442247 5030 generic.go:334] "Generic (PLEG): container finished" podID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerID="216e958871075e8c220585fbd5ca404192d6413d788bac1112f975b37e4a20f3" exitCode=2 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442255 5030 generic.go:334] "Generic (PLEG): container finished" podID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerID="31de7642e3f6a2920d520c9f8ba02678706279ee75f8771e85a21f79d49d8f2c" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerDied","Data":"28b7de7d56face10f996033601d7679206b992fa119d73cbe2e9f054af31eaea"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerDied","Data":"216e958871075e8c220585fbd5ca404192d6413d788bac1112f975b37e4a20f3"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.442331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerDied","Data":"31de7642e3f6a2920d520c9f8ba02678706279ee75f8771e85a21f79d49d8f2c"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.453290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" event={"ID":"fa8f718e-3edd-4de0-9125-7d981584e6e0","Type":"ContainerStarted","Data":"02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.468911 5030 generic.go:334] "Generic (PLEG): container finished" podID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerID="41d281360074f0de7fd83a7b9474cfafb813de4e7bf96ba1b8301e227e49eae4" exitCode=1 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.469198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bthhs" event={"ID":"cfd208a2-6c29-4066-a573-c6b35a6a59d5","Type":"ContainerDied","Data":"41d281360074f0de7fd83a7b9474cfafb813de4e7bf96ba1b8301e227e49eae4"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.469801 5030 scope.go:117] "RemoveContainer" containerID="41d281360074f0de7fd83a7b9474cfafb813de4e7bf96ba1b8301e227e49eae4" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.524227 5030 generic.go:334] "Generic (PLEG): container finished" podID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerID="b59c18df1e318f6b2e3f84ac12a6374255e6026543722d5cbad05da5899f1a0b" exitCode=143 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.524297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerDied","Data":"b59c18df1e318f6b2e3f84ac12a6374255e6026543722d5cbad05da5899f1a0b"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.538269 5030 generic.go:334] "Generic (PLEG): container finished" podID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerID="3ff22418a328424b32b7b4f415a4a4ad754ab1eb3343427244682702b8634303" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.538598 5030 generic.go:334] "Generic (PLEG): container finished" podID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerID="3e26081edbb10019449a003a17538e1de68bd9f0dc38018db630e0edd2157f3c" exitCode=0 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.538675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerDied","Data":"3ff22418a328424b32b7b4f415a4a4ad754ab1eb3343427244682702b8634303"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.538702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerDied","Data":"3e26081edbb10019449a003a17538e1de68bd9f0dc38018db630e0edd2157f3c"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.548014 5030 generic.go:334] "Generic (PLEG): container finished" podID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerID="f878fb22f76e5c3d5c5aad54d40d91cba28b59e426192e35705dec229d359f10" exitCode=143 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.548464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerDied","Data":"f878fb22f76e5c3d5c5aad54d40d91cba28b59e426192e35705dec229d359f10"} Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.548790 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.560413 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.560479 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.560460398 +0000 UTC m=+4159.880720686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "combined-ca-bundle" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.560516 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.560534 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.56052869 +0000 UTC m=+4159.880788978 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.562668 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.562783 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.562764414 +0000 UTC m=+4159.883024702 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.562873 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.562942 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.562933518 +0000 UTC m=+4159.883193796 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.563374 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.563465 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.56345593 +0000 UTC m=+4159.883716218 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.563554 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.563638 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data podName:2118c3ce-d55e-4a65-8d25-d7f53bcac91e nodeName:}" failed. No retries permitted until 2026-01-20 23:44:49.563630915 +0000 UTC m=+4161.883891203 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data") pod "rabbitmq-server-0" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e") : configmap "rabbitmq-config-data" not found Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.592527 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.601208 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.608219 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.616974 5030 scope.go:117] "RemoveContainer" containerID="4162ae90212f0f20c63503791f5909dd8447d80bbbc67a89f73e7ea573c92180" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lpr9\" (UniqueName: \"kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9\") pod \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn474\" (UniqueName: \"kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle\") pod \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs\") pod \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\" (UID: 
\"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.661963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.662047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data\") pod \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\" (UID: \"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.662116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config\") pod \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\" (UID: \"86b0b2ab-44de-47c1-b0f4-6fc63d87393a\") " Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.664371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.743115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.743494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9" (OuterVolumeSpecName: "kube-api-access-6lpr9") pod "86b0b2ab-44de-47c1-b0f4-6fc63d87393a" (UID: "86b0b2ab-44de-47c1-b0f4-6fc63d87393a"). InnerVolumeSpecName "kube-api-access-6lpr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.743670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts" (OuterVolumeSpecName: "scripts") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.743782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474" (OuterVolumeSpecName: "kube-api-access-tn474") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "kube-api-access-tn474". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.763282 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.763308 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.763319 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lpr9\" (UniqueName: \"kubernetes.io/projected/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-api-access-6lpr9\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.763328 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn474\" (UniqueName: \"kubernetes.io/projected/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-kube-api-access-tn474\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.763337 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.763402 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.763444 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:49.763430812 +0000 UTC m=+4162.083691100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "combined-ca-bundle" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.763749 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.763778 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs podName:df866ce3-a10a-41e3-a17b-abea64eaa5ca nodeName:}" failed. No retries permitted until 2026-01-20 23:44:49.76377083 +0000 UTC m=+4162.084031108 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs") pod "openstack-galera-0" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca") : secret "cert-galera-openstack-svc" not found Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.796885 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.798949 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.800606 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:45 crc kubenswrapper[5030]: E0120 23:44:45.800680 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.842474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "86b0b2ab-44de-47c1-b0f4-6fc63d87393a" (UID: "86b0b2ab-44de-47c1-b0f4-6fc63d87393a"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.847588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86b0b2ab-44de-47c1-b0f4-6fc63d87393a" (UID: "86b0b2ab-44de-47c1-b0f4-6fc63d87393a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.849693 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "86b0b2ab-44de-47c1-b0f4-6fc63d87393a" (UID: "86b0b2ab-44de-47c1-b0f4-6fc63d87393a"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.850913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.871927 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.871962 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.871972 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/86b0b2ab-44de-47c1-b0f4-6fc63d87393a-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.871981 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.915100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.915562 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="d10e94b1-91a0-4631-a7b8-763d41c403ce" containerName="memcached" containerID="cri-o://042c3ddc337801a1de68a99e66b3565634736bf58285d9ac8ac3ca69a8d4aca1" gracePeriod=30 Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.939521 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt"] Jan 20 23:44:45 crc kubenswrapper[5030]: I0120 23:44:45.993134 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data" (OuterVolumeSpecName: "config-data") pod "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" (UID: "d5cd7900-68ef-41ef-8635-44dd8b9bfdbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.035885 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc32705-7b67-4425-88be-7fe57171135e" path="/var/lib/kubelet/pods/3fc32705-7b67-4425-88be-7fe57171135e/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.037146 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c08c3d-8985-44c2-81bf-a50efb87efe6" path="/var/lib/kubelet/pods/65c08c3d-8985-44c2-81bf-a50efb87efe6/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.037766 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69848fe2-bbbb-44ba-982f-2cfacf5280d0" path="/var/lib/kubelet/pods/69848fe2-bbbb-44ba-982f-2cfacf5280d0/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.041710 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffdd818-4c7b-46df-8e2f-0bcba0281818" path="/var/lib/kubelet/pods/7ffdd818-4c7b-46df-8e2f-0bcba0281818/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.042470 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9449e5f9-0e08-4f4e-ac8b-a942bb89a09e" path="/var/lib/kubelet/pods/9449e5f9-0e08-4f4e-ac8b-a942bb89a09e/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.043249 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4bf350-04d5-4e11-8462-2e53fa7b058a" path="/var/lib/kubelet/pods/9b4bf350-04d5-4e11-8462-2e53fa7b058a/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.045581 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe5c819-263d-4ea5-934c-51567db0e55e" path="/var/lib/kubelet/pods/bfe5c819-263d-4ea5-934c-51567db0e55e/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.046474 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" path="/var/lib/kubelet/pods/f7698f27-dba0-4e8f-95b4-329a483d6691/volumes" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.049989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-d5bdt"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050045 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt"] Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050550 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerName="kube-state-metrics" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050562 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerName="kube-state-metrics" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050589 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="ovsdbserver-sb" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050595 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="ovsdbserver-sb" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050603 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="probe" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050609 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="probe" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050642 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050648 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050660 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="ovsdbserver-nb" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050666 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="ovsdbserver-nb" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050674 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.050689 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.050695 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052416 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" containerName="kube-state-metrics" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="ovsdbserver-sb" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052463 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="ovsdbserver-nb" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052470 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7698f27-dba0-4e8f-95b4-329a483d6691" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052478 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc32705-7b67-4425-88be-7fe57171135e" containerName="openstack-network-exporter" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052518 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="probe" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.052531 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" containerName="cinder-scheduler" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.054415 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.054518 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.061141 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.071927 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-77ffx"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.078317 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.080101 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bwkgg"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.087863 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-bwkgg"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.102123 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-77ffx"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.114910 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.115118 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" podUID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" containerName="keystone-api" containerID="cri-o://243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d" gracePeriod=30 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.121268 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.127884 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.143997 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-b5f77"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.144054 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-b5f77"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.150663 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt"] Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.153958 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-6js45 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" podUID="3a215636-7c93-4a97-97f5-842f25380d45" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.170519 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.180419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.180485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.196505 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.197780 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.281906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.281963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtkhq\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282213 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs\") pod \"987b795c-8dbe-4f59-aa00-e3f908a9c974\" (UID: \"987b795c-8dbe-4f59-aa00-e3f908a9c974\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.282590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.282728 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.282773 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:46.78275951 +0000 UTC m=+4159.103019798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.284270 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.285523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.294783 5030 projected.go:194] Error preparing data for projected volume kube-api-access-6js45 for pod openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.294945 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45 podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:46.794855933 +0000 UTC m=+4159.115116221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6js45" (UniqueName: "kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.296026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq" (OuterVolumeSpecName: "kube-api-access-vtkhq") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "kube-api-access-vtkhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.309369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.362810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data" (OuterVolumeSpecName: "config-data") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.385466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgmcg\" (UniqueName: \"kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg\") pod \"8ac2d572-0fed-4309-bb22-03bbc9a30270\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.385618 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts\") pod \"8ac2d572-0fed-4309-bb22-03bbc9a30270\" (UID: \"8ac2d572-0fed-4309-bb22-03bbc9a30270\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.385734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp2vh\" (UniqueName: \"kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh\") pod \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.385756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts\") pod \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\" (UID: \"af19e9cd-1a61-43ac-a803-fdb7e92831e9\") " Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386261 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386279 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386288 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtkhq\" (UniqueName: \"kubernetes.io/projected/987b795c-8dbe-4f59-aa00-e3f908a9c974-kube-api-access-vtkhq\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386299 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386306 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/987b795c-8dbe-4f59-aa00-e3f908a9c974-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.386771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af19e9cd-1a61-43ac-a803-fdb7e92831e9" (UID: "af19e9cd-1a61-43ac-a803-fdb7e92831e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.391171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg" (OuterVolumeSpecName: "kube-api-access-tgmcg") pod "8ac2d572-0fed-4309-bb22-03bbc9a30270" (UID: "8ac2d572-0fed-4309-bb22-03bbc9a30270"). InnerVolumeSpecName "kube-api-access-tgmcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.391848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ac2d572-0fed-4309-bb22-03bbc9a30270" (UID: "8ac2d572-0fed-4309-bb22-03bbc9a30270"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.396819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh" (OuterVolumeSpecName: "kube-api-access-zp2vh") pod "af19e9cd-1a61-43ac-a803-fdb7e92831e9" (UID: "af19e9cd-1a61-43ac-a803-fdb7e92831e9"). InnerVolumeSpecName "kube-api-access-zp2vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.405261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.430678 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="galera" containerID="cri-o://3a08091a8839c78e509146461ec023c2f7f8a34d9700b1f59f27ad351db63a85" gracePeriod=30 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.439945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.458117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "987b795c-8dbe-4f59-aa00-e3f908a9c974" (UID: "987b795c-8dbe-4f59-aa00-e3f908a9c974"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.472788 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.1.38:8778/\": read tcp 10.217.0.2:37924->10.217.1.38:8778: read: connection reset by peer" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.472821 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.1.38:8778/\": read tcp 10.217.0.2:37936->10.217.1.38:8778: read: connection reset by peer" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487688 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487720 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp2vh\" (UniqueName: \"kubernetes.io/projected/af19e9cd-1a61-43ac-a803-fdb7e92831e9-kube-api-access-zp2vh\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487731 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af19e9cd-1a61-43ac-a803-fdb7e92831e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487762 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487773 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgmcg\" (UniqueName: \"kubernetes.io/projected/8ac2d572-0fed-4309-bb22-03bbc9a30270-kube-api-access-tgmcg\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487782 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ac2d572-0fed-4309-bb22-03bbc9a30270-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.487790 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/987b795c-8dbe-4f59-aa00-e3f908a9c974-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.560182 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerID="124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b" exitCode=0 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.560257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerDied","Data":"124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.561526 5030 generic.go:334] "Generic (PLEG): container finished" podID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" 
containerID="332c1b6479e30e4a4e22fdcc2d4661217a7b8a37320b8504f9bc9df20679771f" exitCode=1 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.561583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bthhs" event={"ID":"cfd208a2-6c29-4066-a573-c6b35a6a59d5","Type":"ContainerDied","Data":"332c1b6479e30e4a4e22fdcc2d4661217a7b8a37320b8504f9bc9df20679771f"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.561611 5030 scope.go:117] "RemoveContainer" containerID="41d281360074f0de7fd83a7b9474cfafb813de4e7bf96ba1b8301e227e49eae4" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.561968 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-kuttl-tests/root-account-create-update-bthhs" secret="" err="secret \"galera-openstack-dockercfg-j9wjx\" not found" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.562005 5030 scope.go:117] "RemoveContainer" containerID="332c1b6479e30e4a4e22fdcc2d4661217a7b8a37320b8504f9bc9df20679771f" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.562289 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-bthhs_openstack-kuttl-tests(cfd208a2-6c29-4066-a573-c6b35a6a59d5)\"" pod="openstack-kuttl-tests/root-account-create-update-bthhs" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.572967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"d5cd7900-68ef-41ef-8635-44dd8b9bfdbb","Type":"ContainerDied","Data":"cb8237cc0b5b2e4e6dae90a4bf8cfbbab519f02669ecc73ed98d4b9bea378826"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.573048 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.578888 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.578994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6" event={"ID":"8ac2d572-0fed-4309-bb22-03bbc9a30270","Type":"ContainerDied","Data":"5c4c0e384351decfa0e06db0ec510afd7cd507fd97199e0f95a9982922d2aed9"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.580278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"86b0b2ab-44de-47c1-b0f4-6fc63d87393a","Type":"ContainerDied","Data":"9b80c9ca3ace44c607a095305001f867f4fc68f08a57178da2066aab0556a1ca"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.580345 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.581838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" event={"ID":"af19e9cd-1a61-43ac-a803-fdb7e92831e9","Type":"ContainerDied","Data":"6aa6dca3f90f6440c9827bb4333a532df782d70124418bfb8cd81904cb9d563d"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.582056 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.587397 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerID="25486648d351881e64da50149b9a74c4d78a6f3be0d43be4097ddd110093e64c" exitCode=0 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.587469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerDied","Data":"25486648d351881e64da50149b9a74c4d78a6f3be0d43be4097ddd110093e64c"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.590875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" event={"ID":"987b795c-8dbe-4f59-aa00-e3f908a9c974","Type":"ContainerDied","Data":"02317d546363d34817da85af5c16aeadb570d1dac6167472d21545739adcfd02"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.590953 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.592354 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerID="a1aa7fac800cf0fad77a25577024fa04cb0bf4062cc984176e43fa01e676bda9" exitCode=0 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.592405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" event={"ID":"fa8f718e-3edd-4de0-9125-7d981584e6e0","Type":"ContainerDied","Data":"a1aa7fac800cf0fad77a25577024fa04cb0bf4062cc984176e43fa01e676bda9"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.595854 5030 generic.go:334] "Generic (PLEG): container finished" podID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerID="b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb" exitCode=0 Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.595898 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.596249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerDied","Data":"b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb"} Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.596287 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.690178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.690397 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.690478 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") pod \"nova-cell1-a2f6-account-create-update-v29f8\" (UID: \"37397e88-13fe-4396-810e-d87514548283\") " pod="openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.691151 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:50.691015513 +0000 UTC m=+4163.011275801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.691825 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.691863 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts podName:cfd208a2-6c29-4066-a573-c6b35a6a59d5 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.191853144 +0000 UTC m=+4159.512113432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts") pod "root-account-create-update-bthhs" (UID: "cfd208a2-6c29-4066-a573-c6b35a6a59d5") : configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.693586 5030 projected.go:194] Error preparing data for projected volume kube-api-access-kwph5 for pod openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.693696 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5 podName:37397e88-13fe-4396-810e-d87514548283 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:50.693676218 +0000 UTC m=+4163.013936506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kwph5" (UniqueName: "kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5") pod "nova-cell1-a2f6-account-create-update-v29f8" (UID: "37397e88-13fe-4396-810e-d87514548283") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.714478 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca6a40bb_ca5c_46eb_bad6_8f7f7cee94f1.slice/crio-conmon-124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca6a40bb_ca5c_46eb_bad6_8f7f7cee94f1.slice/crio-124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ca0f14_59f2_44f0_91f4_e0c2c8b43a2b.slice/crio-b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ca0f14_59f2_44f0_91f4_e0c2c8b43a2b.slice/crio-conmon-b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.793305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.793500 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.793585 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.793564732 +0000 UTC m=+4160.113825020 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : configmap "openstack-scripts" not found Jan 20 23:44:46 crc kubenswrapper[5030]: I0120 23:44:46.895802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.899284 5030 projected.go:194] Error preparing data for projected volume kube-api-access-6js45 for pod openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:46 crc kubenswrapper[5030]: E0120 23:44:46.899347 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45 podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:47.899330037 +0000 UTC m=+4160.219590345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6js45" (UniqueName: "kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.029738 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.036020 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.044361 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.044402 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerName="nova-scheduler-scheduler" Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.214835 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 
23:44:47.214924 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts podName:cfd208a2-6c29-4066-a573-c6b35a6a59d5 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:48.214906143 +0000 UTC m=+4160.535166431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts") pod "root-account-create-update-bthhs" (UID: "cfd208a2-6c29-4066-a573-c6b35a6a59d5") : configmap "openstack-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.327943 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.385586 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xqz49" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.475920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.579060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.585575 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzhtm" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="registry-server" containerID="cri-o://6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758" gracePeriod=2 Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.626761 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627135 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:51.627115252 +0000 UTC m=+4163.947375540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.626971 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627524 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:51.627512451 +0000 UTC m=+4163.947772739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627006 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627567 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:51.627557132 +0000 UTC m=+4163.947817420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : secret "combined-ca-bundle" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627047 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627598 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:51.627591403 +0000 UTC m=+4163.947851701 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627071 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-config-data: configmap "openstack-cell1-config-data" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.627645 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default podName:f7c46ea7-7016-445e-8f53-a65df4ea35fc nodeName:}" failed. No retries permitted until 2026-01-20 23:44:51.627637324 +0000 UTC m=+4163.947897612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default") pod "openstack-cell1-galera-0" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc") : configmap "openstack-cell1-config-data" not found Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.631784 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" event={"ID":"c8a88bb5-d95c-4db3-a79f-51958a894a2b","Type":"ContainerDied","Data":"ed5989cca24847c95b1dca37bd0ae38eeab993dadc8ce6b9a2adaaa338217b12"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.631829 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed5989cca24847c95b1dca37bd0ae38eeab993dadc8ce6b9a2adaaa338217b12" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.640985 5030 generic.go:334] "Generic (PLEG): container finished" podID="d10e94b1-91a0-4631-a7b8-763d41c403ce" containerID="042c3ddc337801a1de68a99e66b3565634736bf58285d9ac8ac3ca69a8d4aca1" exitCode=0 Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.641170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d10e94b1-91a0-4631-a7b8-763d41c403ce","Type":"ContainerDied","Data":"042c3ddc337801a1de68a99e66b3565634736bf58285d9ac8ac3ca69a8d4aca1"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.672352 5030 generic.go:334] "Generic (PLEG): container finished" podID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerID="a6d258088fb951bc34644f2ffa05470352c7ec6b7690b7eaa4f6eb8de5451c0f" exitCode=0 Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.672442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerDied","Data":"a6d258088fb951bc34644f2ffa05470352c7ec6b7690b7eaa4f6eb8de5451c0f"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.728886 5030 generic.go:334] "Generic (PLEG): container finished" podID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerID="dbf3f368b62a0be16be3046e1cabd348e08742f826a1911f3c06fe9f0acb438c" exitCode=0 Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.728947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerDied","Data":"dbf3f368b62a0be16be3046e1cabd348e08742f826a1911f3c06fe9f0acb438c"} Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.790664 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b is running failed: container process not found" containerID="fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.790859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" event={"ID":"4d88dc1f-6d17-4a07-b701-d1e578d686ad","Type":"ContainerDied","Data":"9a50bf10fe0a10ae31abca060a64d550c1740d2a92517342198500c80451a643"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.790887 5030 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9a50bf10fe0a10ae31abca060a64d550c1740d2a92517342198500c80451a643" Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.795009 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b is running failed: container process not found" containerID="fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.800758 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b is running failed: container process not found" containerID="fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.800796 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.805003 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": dial tcp 10.217.1.45:9311: connect: connection refused" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.805106 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.45:9311/healthcheck\": dial tcp 10.217.1.45:9311: connect: connection refused" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.817854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"24b38406-08cf-452c-b0f8-c455ed6b7ece","Type":"ContainerDied","Data":"60afe191c49d49fe93faffd8ed6cf5091df110a9f590c0d83c96c4219a1e3f86"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.817898 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60afe191c49d49fe93faffd8ed6cf5091df110a9f590c0d83c96c4219a1e3f86" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.830475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.830749 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.830801 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:49.830787292 +0000 UTC m=+4162.151047580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : configmap "openstack-scripts" not found Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.858996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b","Type":"ContainerDied","Data":"05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.859037 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f2ebe1de645d4dd09eed2022ef8ea1ef6889ff43d5b204cf62abcccc8860d3" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.896759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" event={"ID":"12f927a3-29ee-4af3-97e7-dc9998169372","Type":"ContainerDied","Data":"fcad1326e3665d00f689aafc66bab7877882a99e7a0a9bc1128fbe61d2dd5743"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.896795 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcad1326e3665d00f689aafc66bab7877882a99e7a0a9bc1128fbe61d2dd5743" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.926040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" event={"ID":"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1","Type":"ContainerDied","Data":"97f61c495047eacdf4c2f0f34a36d823dd546b180f254cdd46ecd1356039a624"} Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.926088 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97f61c495047eacdf4c2f0f34a36d823dd546b180f254cdd46ecd1356039a624" Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.934043 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.942878 5030 projected.go:194] Error preparing data for projected volume kube-api-access-6js45 for pod openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:47 crc kubenswrapper[5030]: E0120 23:44:47.942947 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45 podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:49.942929004 +0000 UTC m=+4162.263189292 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6js45" (UniqueName: "kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:47 crc kubenswrapper[5030]: I0120 23:44:47.998416 5030 generic.go:334] "Generic (PLEG): container finished" podID="50bb7262-9571-4851-9399-5115ee497125" containerID="db87e0d657028f0da82f16b08db2f08eb396b07e9c9a8800ab5a9e1a8e82760c" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.003644 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8326185d-803c-4238-b9c8-9056b3758d01" path="/var/lib/kubelet/pods/8326185d-803c-4238-b9c8-9056b3758d01/volumes" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.004475 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c8839c-8268-4bdd-8139-06af1589e008" path="/var/lib/kubelet/pods/d0c8839c-8268-4bdd-8139-06af1589e008/volumes" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.005634 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ead812d7-990f-4f88-87fc-8620e73f9e67" path="/var/lib/kubelet/pods/ead812d7-990f-4f88-87fc-8620e73f9e67/volumes" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.006198 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec6d9b48-412f-49d8-ab21-20ab43366732" path="/var/lib/kubelet/pods/ec6d9b48-412f-49d8-ab21-20ab43366732/volumes" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.007113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerDied","Data":"db87e0d657028f0da82f16b08db2f08eb396b07e9c9a8800ab5a9e1a8e82760c"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.007718 5030 scope.go:117] "RemoveContainer" containerID="86726663164ca08470d8023499565b6de220fd18491f622b7a239d4b600235cf" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.011501 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.028305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"f7c46ea7-7016-445e-8f53-a65df4ea35fc","Type":"ContainerDied","Data":"674887c93b46f3429c18e0eea6c551b3fe7bcfb8b7ca78b701ede6a261b1a99b"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.028344 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674887c93b46f3429c18e0eea6c551b3fe7bcfb8b7ca78b701ede6a261b1a99b" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.067872 5030 generic.go:334] "Generic (PLEG): container finished" podID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerID="fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.067987 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerDied","Data":"fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.071212 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" event={"ID":"660c45c7-982d-49a6-be38-7a74cf69a00e","Type":"ContainerDied","Data":"5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.071235 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fc1ec9dce0d1a33f8515cdded84036ca857359dde464f100a864ab2d8b0a390" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.074160 5030 generic.go:334] "Generic (PLEG): container finished" podID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerID="3a08091a8839c78e509146461ec023c2f7f8a34d9700b1f59f27ad351db63a85" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.074229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerDied","Data":"3a08091a8839c78e509146461ec023c2f7f8a34d9700b1f59f27ad351db63a85"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.077084 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerID="93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.077140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerDied","Data":"93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.078787 5030 generic.go:334] "Generic (PLEG): container finished" podID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerID="4d4ad5a6d7d11df4886fa6a22b4f7a1bd73a0b2aeaa3ab4e6b5789b085035912" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.078899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerDied","Data":"4d4ad5a6d7d11df4886fa6a22b4f7a1bd73a0b2aeaa3ab4e6b5789b085035912"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.080289 
5030 generic.go:334] "Generic (PLEG): container finished" podID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerID="44672b9e5ac71bd9b91365302a913d20bb6e00d96ec91f69b8166b36d5e7f2e4" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.080322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerDied","Data":"44672b9e5ac71bd9b91365302a913d20bb6e00d96ec91f69b8166b36d5e7f2e4"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.081486 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerID="4c0e97ab7ac916ffeec23817d4c8f83530ee3240f89e8b751dc5b7193307f748" exitCode=0 Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.082060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerDied","Data":"4c0e97ab7ac916ffeec23817d4c8f83530ee3240f89e8b751dc5b7193307f748"} Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.180684 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.192070 5030 scope.go:117] "RemoveContainer" containerID="5b31b1b990ceabd4797fda7252ebbcdbfa5957e9115be52ceb86f6f08f169842" Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.196473 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.229682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.232703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.242688 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.244795 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.244845 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts podName:cfd208a2-6c29-4066-a573-c6b35a6a59d5 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:50.244831827 +0000 UTC m=+4162.565092115 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts") pod "root-account-create-update-bthhs" (UID: "cfd208a2-6c29-4066-a573-c6b35a6a59d5") : configmap "openstack-scripts" not found Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.252014 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.252076 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.252298 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.252942 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.276555 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.299586 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.313044 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-6e8b-account-create-update-xb886"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.313455 5030 scope.go:117] "RemoveContainer" containerID="deeb6b4d1668a9582b4c378c594679c4235c09d76d1e2e4ec46aa1ca229900d6" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.320958 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.330915 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.345752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thds6\" (UniqueName: \"kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6\") pod \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.345837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts\") pod \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\" (UID: \"c8a88bb5-d95c-4db3-a79f-51958a894a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.346441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8a88bb5-d95c-4db3-a79f-51958a894a2b" (UID: "c8a88bb5-d95c-4db3-a79f-51958a894a2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.348356 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8a88bb5-d95c-4db3-a79f-51958a894a2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.350769 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.356510 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.357549 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.358658 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-cb33-account-create-update-b86k6"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.364281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6" (OuterVolumeSpecName: "kube-api-access-thds6") pod "c8a88bb5-d95c-4db3-a79f-51958a894a2b" (UID: "c8a88bb5-d95c-4db3-a79f-51958a894a2b"). InnerVolumeSpecName "kube-api-access-thds6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.377474 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.381219 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.387597 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.388243 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.395270 5030 scope.go:117] "RemoveContainer" containerID="3ff22418a328424b32b7b4f415a4a4ad754ab1eb3343427244682702b8634303" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.400330 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-a2f6-account-create-update-v29f8"] Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.403709 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data\") pod \"d10e94b1-91a0-4631-a7b8-763d41c403ce\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461319 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mfwx\" (UniqueName: \"kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx\") pod \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\" (UID: \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data\") pod \"24b38406-08cf-452c-b0f8-c455ed6b7ece\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs\") pod \"d10e94b1-91a0-4631-a7b8-763d41c403ce\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config\") pod 
\"d10e94b1-91a0-4631-a7b8-763d41c403ce\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle\") pod \"d10e94b1-91a0-4631-a7b8-763d41c403ce\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsqrg\" (UniqueName: \"kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg\") pod \"d10e94b1-91a0-4631-a7b8-763d41c403ce\" (UID: \"d10e94b1-91a0-4631-a7b8-763d41c403ce\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461595 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbd6x\" (UniqueName: \"kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7wg\" (UniqueName: \"kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg\") pod \"660c45c7-982d-49a6-be38-7a74cf69a00e\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs\") pod 
\"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle\") pod \"24b38406-08cf-452c-b0f8-c455ed6b7ece\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmg8n\" (UniqueName: \"kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n\") pod \"12f927a3-29ee-4af3-97e7-dc9998169372\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461821 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts\") pod \"660c45c7-982d-49a6-be38-7a74cf69a00e\" (UID: \"660c45c7-982d-49a6-be38-7a74cf69a00e\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs\") pod \"24b38406-08cf-452c-b0f8-c455ed6b7ece\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkngv\" (UniqueName: \"kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tbfq\" (UniqueName: \"kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461945 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts\") pod \"12f927a3-29ee-4af3-97e7-dc9998169372\" (UID: \"12f927a3-29ee-4af3-97e7-dc9998169372\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4jk\" (UniqueName: \"kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.461997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.462021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.462053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.462078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.462103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data" (OuterVolumeSpecName: "config-data") pod "d10e94b1-91a0-4631-a7b8-763d41c403ce" (UID: "d10e94b1-91a0-4631-a7b8-763d41c403ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.462127 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s9sm\" (UniqueName: \"kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm\") pod \"24b38406-08cf-452c-b0f8-c455ed6b7ece\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts\") pod \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\" (UID: \"d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default\") pod \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\" (UID: \"f7c46ea7-7016-445e-8f53-a65df4ea35fc\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472655 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472683 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts\") pod \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\" (UID: \"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs\") pod \"24b38406-08cf-452c-b0f8-c455ed6b7ece\" (UID: \"24b38406-08cf-452c-b0f8-c455ed6b7ece\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.472737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts\") pod \"4d88dc1f-6d17-4a07-b701-d1e578d686ad\" (UID: 
\"4d88dc1f-6d17-4a07-b701-d1e578d686ad\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.473965 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.473991 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thds6\" (UniqueName: \"kubernetes.io/projected/c8a88bb5-d95c-4db3-a79f-51958a894a2b-kube-api-access-thds6\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.474003 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37397e88-13fe-4396-810e-d87514548283-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.474013 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwph5\" (UniqueName: \"kubernetes.io/projected/37397e88-13fe-4396-810e-d87514548283-kube-api-access-kwph5\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.474354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d88dc1f-6d17-4a07-b701-d1e578d686ad" (UID: "4d88dc1f-6d17-4a07-b701-d1e578d686ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.481050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs" (OuterVolumeSpecName: "logs") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.481384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs" (OuterVolumeSpecName: "logs") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.482075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg" (OuterVolumeSpecName: "kube-api-access-rsqrg") pod "d10e94b1-91a0-4631-a7b8-763d41c403ce" (UID: "d10e94b1-91a0-4631-a7b8-763d41c403ce"). InnerVolumeSpecName "kube-api-access-rsqrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.482495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.489347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm" (OuterVolumeSpecName: "kube-api-access-8s9sm") pod "24b38406-08cf-452c-b0f8-c455ed6b7ece" (UID: "24b38406-08cf-452c-b0f8-c455ed6b7ece"). InnerVolumeSpecName "kube-api-access-8s9sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.491698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "660c45c7-982d-49a6-be38-7a74cf69a00e" (UID: "660c45c7-982d-49a6-be38-7a74cf69a00e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.494544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12f927a3-29ee-4af3-97e7-dc9998169372" (UID: "12f927a3-29ee-4af3-97e7-dc9998169372"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.494860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.497109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.499105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.499743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.499900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). 
InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.500124 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs" (OuterVolumeSpecName: "logs") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.500215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx" (OuterVolumeSpecName: "kube-api-access-9mfwx") pod "4d88dc1f-6d17-4a07-b701-d1e578d686ad" (UID: "4d88dc1f-6d17-4a07-b701-d1e578d686ad"). InnerVolumeSpecName "kube-api-access-9mfwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.500318 5030 scope.go:117] "RemoveContainer" containerID="3e26081edbb10019449a003a17538e1de68bd9f0dc38018db630e0edd2157f3c" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.501072 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.502527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d10e94b1-91a0-4631-a7b8-763d41c403ce" (UID: "d10e94b1-91a0-4631-a7b8-763d41c403ce"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.506921 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.508146 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.508284 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.543577 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.545288 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.547381 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4bgn\" (UniqueName: \"kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn\") pod \"5c070076-3f81-4e61-9f27-e5a39a84c380\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580271 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzgpz\" (UniqueName: \"kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs\") pod \"5c070076-3f81-4e61-9f27-e5a39a84c380\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmwf6\" (UniqueName: \"kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6\") pod \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data\") pod \"1d6e18e5-718a-4578-8b74-add47cb38d2d\" (UID: 
\"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle\") pod \"1d6e18e5-718a-4578-8b74-add47cb38d2d\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data\") pod \"5c070076-3f81-4e61-9f27-e5a39a84c380\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs\") pod \"5c070076-3f81-4e61-9f27-e5a39a84c380\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc5mj\" (UniqueName: \"kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580733 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: 
\"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580871 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580935 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.580999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: 
\"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581032 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4hs8\" (UniqueName: \"kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8\") pod \"d19c3298-c9b5-4371-b1f5-8edf2510706b\" (UID: \"d19c3298-c9b5-4371-b1f5-8edf2510706b\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle\") pod \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle\") pod \"5c070076-3f81-4e61-9f27-e5a39a84c380\" (UID: \"5c070076-3f81-4e61-9f27-e5a39a84c380\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data\") pod \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\" (UID: \"04d86b77-716c-4eb7-8976-9e47e1b6df7d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks94d\" (UniqueName: \"kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d\") pod \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs\") pod \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\" (UID: \"fbdc4876-2fa1-495d-88fd-3c18541b14d4\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs\") pod \"50bb7262-9571-4851-9399-5115ee497125\" (UID: \"50bb7262-9571-4851-9399-5115ee497125\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmqh\" (UniqueName: \"kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh\") pod \"1d6e18e5-718a-4578-8b74-add47cb38d2d\" (UID: \"1d6e18e5-718a-4578-8b74-add47cb38d2d\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.581238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle\") pod 
\"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\" (UID: \"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd\") " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.587660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588303 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588379 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f927a3-29ee-4af3-97e7-dc9998169372-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588414 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588481 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588498 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s9sm\" (UniqueName: \"kubernetes.io/projected/24b38406-08cf-452c-b0f8-c455ed6b7ece-kube-api-access-8s9sm\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588525 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588540 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d88dc1f-6d17-4a07-b701-d1e578d686ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588556 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mfwx\" (UniqueName: \"kubernetes.io/projected/4d88dc1f-6d17-4a07-b701-d1e578d686ad-kube-api-access-9mfwx\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588572 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588584 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50bb7262-9571-4851-9399-5115ee497125-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588597 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e94b1-91a0-4631-a7b8-763d41c403ce-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc 
kubenswrapper[5030]: I0120 23:44:48.588610 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsqrg\" (UniqueName: \"kubernetes.io/projected/d10e94b1-91a0-4631-a7b8-763d41c403ce-kube-api-access-rsqrg\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588640 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588653 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588665 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588678 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.588692 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/660c45c7-982d-49a6-be38-7a74cf69a00e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.599502 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs" (OuterVolumeSpecName: "logs") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.599828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs" (OuterVolumeSpecName: "logs") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.602097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs" (OuterVolumeSpecName: "logs") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.605709 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs" (OuterVolumeSpecName: "logs") pod "5c070076-3f81-4e61-9f27-e5a39a84c380" (UID: "5c070076-3f81-4e61-9f27-e5a39a84c380"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.608132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs" (OuterVolumeSpecName: "logs") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.651829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh" (OuterVolumeSpecName: "kube-api-access-lqmqh") pod "1d6e18e5-718a-4578-8b74-add47cb38d2d" (UID: "1d6e18e5-718a-4578-8b74-add47cb38d2d"). InnerVolumeSpecName "kube-api-access-lqmqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.652520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts" (OuterVolumeSpecName: "scripts") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.652570 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x" (OuterVolumeSpecName: "kube-api-access-vbd6x") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "kube-api-access-vbd6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.653264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk" (OuterVolumeSpecName: "kube-api-access-nc4jk") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "kube-api-access-nc4jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.653339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn" (OuterVolumeSpecName: "kube-api-access-k4bgn") pod "5c070076-3f81-4e61-9f27-e5a39a84c380" (UID: "5c070076-3f81-4e61-9f27-e5a39a84c380"). InnerVolumeSpecName "kube-api-access-k4bgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.654266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq" (OuterVolumeSpecName: "kube-api-access-2tbfq") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "kube-api-access-2tbfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.654312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts" (OuterVolumeSpecName: "scripts") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.654341 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.654801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.655960 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n" (OuterVolumeSpecName: "kube-api-access-xmg8n") pod "12f927a3-29ee-4af3-97e7-dc9998169372" (UID: "12f927a3-29ee-4af3-97e7-dc9998169372"). InnerVolumeSpecName "kube-api-access-xmg8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.656041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8" (OuterVolumeSpecName: "kube-api-access-l4hs8") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "kube-api-access-l4hs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.656104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj" (OuterVolumeSpecName: "kube-api-access-sc5mj") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "kube-api-access-sc5mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.656573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz" (OuterVolumeSpecName: "kube-api-access-vzgpz") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "kube-api-access-vzgpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659147 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts" (OuterVolumeSpecName: "scripts") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6" (OuterVolumeSpecName: "kube-api-access-xmwf6") pod "04d86b77-716c-4eb7-8976-9e47e1b6df7d" (UID: "04d86b77-716c-4eb7-8976-9e47e1b6df7d"). InnerVolumeSpecName "kube-api-access-xmwf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts" (OuterVolumeSpecName: "scripts") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv" (OuterVolumeSpecName: "kube-api-access-rkngv") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "kube-api-access-rkngv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.659884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.664746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.674208 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg" (OuterVolumeSpecName: "kube-api-access-hz7wg") pod "660c45c7-982d-49a6-be38-7a74cf69a00e" (UID: "660c45c7-982d-49a6-be38-7a74cf69a00e"). InnerVolumeSpecName "kube-api-access-hz7wg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.691451 5030 scope.go:117] "RemoveContainer" containerID="efc08bb977e1dec2224e0550e4a51eee0e71377a6fe1d5972f92ea98201f81e3" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.703869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d" (OuterVolumeSpecName: "kube-api-access-ks94d") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "kube-api-access-ks94d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.709728 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710491 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710517 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbd6x\" (UniqueName: \"kubernetes.io/projected/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-kube-api-access-vbd6x\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710529 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc5mj\" (UniqueName: \"kubernetes.io/projected/50bb7262-9571-4851-9399-5115ee497125-kube-api-access-sc5mj\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710542 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d19c3298-c9b5-4371-b1f5-8edf2510706b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710553 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710564 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7wg\" (UniqueName: \"kubernetes.io/projected/660c45c7-982d-49a6-be38-7a74cf69a00e-kube-api-access-hz7wg\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710577 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmg8n\" (UniqueName: \"kubernetes.io/projected/12f927a3-29ee-4af3-97e7-dc9998169372-kube-api-access-xmg8n\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710587 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710596 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdc4876-2fa1-495d-88fd-3c18541b14d4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710604 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkngv\" (UniqueName: 
\"kubernetes.io/projected/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-kube-api-access-rkngv\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710615 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tbfq\" (UniqueName: \"kubernetes.io/projected/f7c46ea7-7016-445e-8f53-a65df4ea35fc-kube-api-access-2tbfq\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710639 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4hs8\" (UniqueName: \"kubernetes.io/projected/d19c3298-c9b5-4371-b1f5-8edf2510706b-kube-api-access-l4hs8\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710648 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4jk\" (UniqueName: \"kubernetes.io/projected/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-kube-api-access-nc4jk\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710675 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710686 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks94d\" (UniqueName: \"kubernetes.io/projected/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-kube-api-access-ks94d\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710697 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmqh\" (UniqueName: \"kubernetes.io/projected/1d6e18e5-718a-4578-8b74-add47cb38d2d-kube-api-access-lqmqh\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710707 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710721 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710732 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4bgn\" (UniqueName: \"kubernetes.io/projected/5c070076-3f81-4e61-9f27-e5a39a84c380-kube-api-access-k4bgn\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710742 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710752 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50bb7262-9571-4851-9399-5115ee497125-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710763 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710779 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzgpz\" (UniqueName: 
\"kubernetes.io/projected/fbdc4876-2fa1-495d-88fd-3c18541b14d4-kube-api-access-vzgpz\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710789 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c070076-3f81-4e61-9f27-e5a39a84c380-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710801 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710812 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.710826 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmwf6\" (UniqueName: \"kubernetes.io/projected/04d86b77-716c-4eb7-8976-9e47e1b6df7d-kube-api-access-xmwf6\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.711108 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data" (OuterVolumeSpecName: "config-data") pod "24b38406-08cf-452c-b0f8-c455ed6b7ece" (UID: "24b38406-08cf-452c-b0f8-c455ed6b7ece"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.718518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.769224 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.793235 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d10e94b1-91a0-4631-a7b8-763d41c403ce" (UID: "d10e94b1-91a0-4631-a7b8-763d41c403ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.815174 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.815213 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.815224 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.815237 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.816027 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.831889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24b38406-08cf-452c-b0f8-c455ed6b7ece" (UID: "24b38406-08cf-452c-b0f8-c455ed6b7ece"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.876959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d6e18e5-718a-4578-8b74-add47cb38d2d" (UID: "1d6e18e5-718a-4578-8b74-add47cb38d2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.916996 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.917029 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.917039 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.917094 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:48 crc kubenswrapper[5030]: E0120 23:44:48.917167 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data podName:16f26a59-3681-422d-a46f-406b454f73b8 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:56.917148986 +0000 UTC m=+4169.237409274 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data") pod "rabbitmq-cell1-server-0" (UID: "16f26a59-3681-422d-a46f-406b454f73b8") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:44:48 crc kubenswrapper[5030]: I0120 23:44:48.945504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.013312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.021112 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.021139 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.020374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data" (OuterVolumeSpecName: "config-data") pod "04d86b77-716c-4eb7-8976-9e47e1b6df7d" (UID: "04d86b77-716c-4eb7-8976-9e47e1b6df7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.037172 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.051963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.063874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data" (OuterVolumeSpecName: "config-data") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.111263 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.111618 5030 scope.go:117] "RemoveContainer" containerID="22d9c7e4e07c725f754ed478aed56b0e88d97a5b82512d49c685a25276a272d6" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.114908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c070076-3f81-4e61-9f27-e5a39a84c380" (UID: "5c070076-3f81-4e61-9f27-e5a39a84c380"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.121713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data" (OuterVolumeSpecName: "config-data") pod "1d6e18e5-718a-4578-8b74-add47cb38d2d" (UID: "1d6e18e5-718a-4578-8b74-add47cb38d2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.121734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122916 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122937 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122947 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122958 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122967 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6e18e5-718a-4578-8b74-add47cb38d2d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122975 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122985 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.122999 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" event={"ID":"d19c3298-c9b5-4371-b1f5-8edf2510706b","Type":"ContainerDied","Data":"d4812a9e549e5d2fda3fe2a7411347a434f92007728dced134a5fb0a7ce5de0e"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.123148 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-6c4cd74858-frhdt" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.124156 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.136896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fbdc4876-2fa1-495d-88fd-3c18541b14d4","Type":"ContainerDied","Data":"e8c469752570de07e4e1aec0689c0fb676780caef99c0dec50a0412886c30541"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.137047 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.137112 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.152853 5030 generic.go:334] "Generic (PLEG): container finished" podID="f205895a-be1d-4da7-adbc-773b2750d39b" containerID="5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84" exitCode=0 Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.152917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerDied","Data":"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.160203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.160776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" event={"ID":"f205895a-be1d-4da7-adbc-773b2750d39b","Type":"ContainerDied","Data":"c0d00eb7cc6e4dbc19a3aef1eac0c012a3298c1fc24fa2c33c43c98cef8bd29a"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.189769 5030 scope.go:117] "RemoveContainer" containerID="a6d258088fb951bc34644f2ffa05470352c7ec6b7690b7eaa4f6eb8de5451c0f" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.204402 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.215738 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.168:5671: connect: connection refused" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.216410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"04d86b77-716c-4eb7-8976-9e47e1b6df7d","Type":"ContainerDied","Data":"4aaf8e6fcb24b362e556db122f75e92eadf4805bcf94845dd8ac1ad4a5b02c76"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.216831 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.220697 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts\") pod \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs\") pod \"f205895a-be1d-4da7-adbc-773b2750d39b\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxf4r\" (UniqueName: \"kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt82h\" (UniqueName: \"kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h\") pod \"f205895a-be1d-4da7-adbc-773b2750d39b\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities\") pod \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content\") pod \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227911 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9d7d\" (UniqueName: \"kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d\") pod \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\" (UID: \"4f7e1038-a9e1-40d9-a09d-4c224fb114d4\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.227999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.228097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.228179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data\") pod \"f205895a-be1d-4da7-adbc-773b2750d39b\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.228295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.230287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom\") pod \"f205895a-be1d-4da7-adbc-773b2750d39b\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.230390 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle\") pod \"f205895a-be1d-4da7-adbc-773b2750d39b\" (UID: \"f205895a-be1d-4da7-adbc-773b2750d39b\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.230460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljfcm\" (UniqueName: \"kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm\") pod \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\" (UID: \"cfd208a2-6c29-4066-a573-c6b35a6a59d5\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.230551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config\") pod \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\" (UID: \"df866ce3-a10a-41e3-a17b-abea64eaa5ca\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.231075 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.231686 5030 reconciler_common.go:293] "Volume detached for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"d10e94b1-91a0-4631-a7b8-763d41c403ce","Type":"ContainerDied","Data":"aa1b8741581bc40fbae399648856b94373f42f25418cdd24c54a607bec53b8ab"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.232273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"50bb7262-9571-4851-9399-5115ee497125","Type":"ContainerDied","Data":"baa4899e3f859a5db2323e87eae3885e65c5e4f74409614fbf39a1fc1be3957a"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs" (OuterVolumeSpecName: "logs") pod "f205895a-be1d-4da7-adbc-773b2750d39b" (UID: "f205895a-be1d-4da7-adbc-773b2750d39b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfd208a2-6c29-4066-a573-c6b35a6a59d5" (UID: "cfd208a2-6c29-4066-a573-c6b35a6a59d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.228284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.228946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities" (OuterVolumeSpecName: "utilities") pod "4f7e1038-a9e1-40d9-a09d-4c224fb114d4" (UID: "4f7e1038-a9e1-40d9-a09d-4c224fb114d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.229075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.232179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.229086 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.224921 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.238938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.250860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"df866ce3-a10a-41e3-a17b-abea64eaa5ca","Type":"ContainerDied","Data":"34bf0052a34fd46a4d24f69b53361321b70feba66f09d756c858bd959b1eac5b"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.250972 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.275562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r" (OuterVolumeSpecName: "kube-api-access-gxf4r") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "kube-api-access-gxf4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.275717 5030 scope.go:117] "RemoveContainer" containerID="7de8ca058af69798715808c2c705adb228699f92727076f0a21040a1e153dd1b" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.276490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f205895a-be1d-4da7-adbc-773b2750d39b" (UID: "f205895a-be1d-4da7-adbc-773b2750d39b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.287906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" event={"ID":"979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd","Type":"ContainerDied","Data":"629c2486ecf452c9efd0ca732ce13bdefc27a6afb438a3cfef8c9deefa36d3a9"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.287995 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-748859d488-6s2vx" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.308447 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" event={"ID":"fa8f718e-3edd-4de0-9125-7d981584e6e0","Type":"ContainerStarted","Data":"32b66f89314cc7517837e61d98b70ca2ab834e88a6527f2a3fe2792e84855aef"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.309466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.312872 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm" (OuterVolumeSpecName: "kube-api-access-ljfcm") pod "cfd208a2-6c29-4066-a573-c6b35a6a59d5" (UID: "cfd208a2-6c29-4066-a573-c6b35a6a59d5"). InnerVolumeSpecName "kube-api-access-ljfcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.315058 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5c070076-3f81-4e61-9f27-e5a39a84c380","Type":"ContainerDied","Data":"312b6637dd14cf9fb498a66c35b5684b996cb8cb1e7953888991d4eb4c6aab8b"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.315127 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.320044 5030 scope.go:117] "RemoveContainer" containerID="dbf3f368b62a0be16be3046e1cabd348e08742f826a1911f3c06fe9f0acb438c" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.321326 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerID="6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758" exitCode=0 Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.321541 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzhtm" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.321786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerDied","Data":"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.321810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzhtm" event={"ID":"4f7e1038-a9e1-40d9-a09d-4c224fb114d4","Type":"ContainerDied","Data":"b6391fd1b04273b40e16deee537ea078556fe55d839ecf59646b59e7b0cb9d91"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.326835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d" (OuterVolumeSpecName: "kube-api-access-g9d7d") pod "4f7e1038-a9e1-40d9-a09d-4c224fb114d4" (UID: "4f7e1038-a9e1-40d9-a09d-4c224fb114d4"). InnerVolumeSpecName "kube-api-access-g9d7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.328041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bthhs" event={"ID":"cfd208a2-6c29-4066-a573-c6b35a6a59d5","Type":"ContainerDied","Data":"4b45c425c67d45996af8598a9def2ca1dc154197f4548a8d2570f3c3420df911"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.328119 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bthhs" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334006 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334035 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334046 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9d7d\" (UniqueName: \"kubernetes.io/projected/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-kube-api-access-g9d7d\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334055 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df866ce3-a10a-41e3-a17b-abea64eaa5ca-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334065 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334074 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljfcm\" (UniqueName: \"kubernetes.io/projected/cfd208a2-6c29-4066-a573-c6b35a6a59d5-kube-api-access-ljfcm\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334084 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334094 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfd208a2-6c29-4066-a573-c6b35a6a59d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334102 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f205895a-be1d-4da7-adbc-773b2750d39b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334111 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxf4r\" (UniqueName: \"kubernetes.io/projected/df866ce3-a10a-41e3-a17b-abea64eaa5ca-kube-api-access-gxf4r\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.334120 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-utilities\") on 
node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.337103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h" (OuterVolumeSpecName: "kube-api-access-dt82h") pod "f205895a-be1d-4da7-adbc-773b2750d39b" (UID: "f205895a-be1d-4da7-adbc-773b2750d39b"). InnerVolumeSpecName "kube-api-access-dt82h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.345108 5030 scope.go:117] "RemoveContainer" containerID="b59c18df1e318f6b2e3f84ac12a6374255e6026543722d5cbad05da5899f1a0b" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.351288 5030 generic.go:334] "Generic (PLEG): container finished" podID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerID="541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" exitCode=0 Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.351357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"db5e46d2-d1b4-4639-8e45-c329417ae8e1","Type":"ContainerDied","Data":"541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.354318 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" podStartSLOduration=7.354295862 podStartE2EDuration="7.354295862s" podCreationTimestamp="2026-01-20 23:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:44:49.329878279 +0000 UTC m=+4161.650138567" watchObservedRunningTime="2026-01-20 23:44:49.354295862 +0000 UTC m=+4161.674556150" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.356853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.375702 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.377235 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data" (OuterVolumeSpecName: "config-data") pod "5c070076-3f81-4e61-9f27-e5a39a84c380" (UID: "5c070076-3f81-4e61-9f27-e5a39a84c380"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.381092 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.381812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54","Type":"ContainerDied","Data":"e90b4616815292d0dbc1a382729855d89e7175a6f3f1f6bc5729b620033b9d96"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.382010 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.383776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.385577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"1d6e18e5-718a-4578-8b74-add47cb38d2d","Type":"ContainerDied","Data":"53b0e0dc24df6dea41ed71c1ff156a0b1842d60e49849ef27900be1d8d386c67"} Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.385689 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.385893 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.385934 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-914b-account-create-update-25csc" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.385981 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.386019 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.386050 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.386395 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.386412 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.386461 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.387306 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bthhs"] Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.392121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f7e1038-a9e1-40d9-a09d-4c224fb114d4" (UID: "4f7e1038-a9e1-40d9-a09d-4c224fb114d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.394996 5030 scope.go:117] "RemoveContainer" containerID="5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.436148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2wzn\" (UniqueName: \"kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn\") pod \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.436463 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle\") pod \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.436541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data\") pod \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\" (UID: \"db5e46d2-d1b4-4639-8e45-c329417ae8e1\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.437287 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt82h\" (UniqueName: \"kubernetes.io/projected/f205895a-be1d-4da7-adbc-773b2750d39b-kube-api-access-dt82h\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.437321 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.437335 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.437349 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f7e1038-a9e1-40d9-a09d-4c224fb114d4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.467030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn" (OuterVolumeSpecName: "kube-api-access-b2wzn") pod "db5e46d2-d1b4-4639-8e45-c329417ae8e1" (UID: "db5e46d2-d1b4-4639-8e45-c329417ae8e1"). InnerVolumeSpecName "kube-api-access-b2wzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.470237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.497052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.497583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data" (OuterVolumeSpecName: "config-data") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.507016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data" (OuterVolumeSpecName: "config-data") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.507485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data" (OuterVolumeSpecName: "config-data") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.516446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data" (OuterVolumeSpecName: "config-data") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.524751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04d86b77-716c-4eb7-8976-9e47e1b6df7d" (UID: "04d86b77-716c-4eb7-8976-9e47e1b6df7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.529519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" (UID: "59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538726 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538757 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2wzn\" (UniqueName: \"kubernetes.io/projected/db5e46d2-d1b4-4639-8e45-c329417ae8e1-kube-api-access-b2wzn\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538768 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538778 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538787 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538797 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538805 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538814 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04d86b77-716c-4eb7-8976-9e47e1b6df7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.538823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.543442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5c070076-3f81-4e61-9f27-e5a39a84c380" (UID: "5c070076-3f81-4e61-9f27-e5a39a84c380"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.545778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data" (OuterVolumeSpecName: "config-data") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.564000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.566807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" (UID: "d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.571825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50bb7262-9571-4851-9399-5115ee497125" (UID: "50bb7262-9571-4851-9399-5115ee497125"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.574012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "24b38406-08cf-452c-b0f8-c455ed6b7ece" (UID: "24b38406-08cf-452c-b0f8-c455ed6b7ece"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.586511 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db5e46d2-d1b4-4639-8e45-c329417ae8e1" (UID: "db5e46d2-d1b4-4639-8e45-c329417ae8e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.614651 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.617361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f7c46ea7-7016-445e-8f53-a65df4ea35fc" (UID: "f7c46ea7-7016-445e-8f53-a65df4ea35fc"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.624422 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.636678 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "24b38406-08cf-452c-b0f8-c455ed6b7ece" (UID: "24b38406-08cf-452c-b0f8-c455ed6b7ece"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641769 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50bb7262-9571-4851-9399-5115ee497125-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641798 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641809 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641819 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641829 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c46ea7-7016-445e-8f53-a65df4ea35fc-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641842 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641852 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c070076-3f81-4e61-9f27-e5a39a84c380-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641863 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641873 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641886 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.641899 5030 reconciler_common.go:293] "Volume detached 
for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b38406-08cf-452c-b0f8-c455ed6b7ece-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.641977 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.642071 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data podName:2118c3ce-d55e-4a65-8d25-d7f53bcac91e nodeName:}" failed. No retries permitted until 2026-01-20 23:44:57.642042621 +0000 UTC m=+4169.962302909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data") pod "rabbitmq-server-0" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e") : configmap "rabbitmq-config-data" not found Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.742876 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data" (OuterVolumeSpecName: "config-data") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.742989 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") pod \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\" (UID: \"ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1\") " Jan 20 23:44:49 crc kubenswrapper[5030]: W0120 23:44:49.743277 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1/volumes/kubernetes.io~secret/config-data Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.743309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data" (OuterVolumeSpecName: "config-data") pod "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" (UID: "ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.743407 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.747364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f205895a-be1d-4da7-adbc-773b2750d39b" (UID: "f205895a-be1d-4da7-adbc-773b2750d39b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.747441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" (UID: "979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.747704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.747970 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "d10e94b1-91a0-4631-a7b8-763d41c403ce" (UID: "d10e94b1-91a0-4631-a7b8-763d41c403ce"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.752724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data" (OuterVolumeSpecName: "config-data") pod "db5e46d2-d1b4-4639-8e45-c329417ae8e1" (UID: "db5e46d2-d1b4-4639-8e45-c329417ae8e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.771244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbdc4876-2fa1-495d-88fd-3c18541b14d4" (UID: "fbdc4876-2fa1-495d-88fd-3c18541b14d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.771304 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "df866ce3-a10a-41e3-a17b-abea64eaa5ca" (UID: "df866ce3-a10a-41e3-a17b-abea64eaa5ca"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.772202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data" (OuterVolumeSpecName: "config-data") pod "f205895a-be1d-4da7-adbc-773b2750d39b" (UID: "f205895a-be1d-4da7-adbc-773b2750d39b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.778097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.797646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d19c3298-c9b5-4371-b1f5-8edf2510706b" (UID: "d19c3298-c9b5-4371-b1f5-8edf2510706b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.843834 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.852894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853698 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853719 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853729 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df866ce3-a10a-41e3-a17b-abea64eaa5ca-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853739 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853749 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853757 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdc4876-2fa1-495d-88fd-3c18541b14d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853766 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f205895a-be1d-4da7-adbc-773b2750d39b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853774 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db5e46d2-d1b4-4639-8e45-c329417ae8e1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853782 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d19c3298-c9b5-4371-b1f5-8edf2510706b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.853792 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e94b1-91a0-4631-a7b8-763d41c403ce-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.853830 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: object 
"openstack-kuttl-tests"/"openstack-scripts" not registered Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.853869 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:53.853854969 +0000 UTC m=+4166.174115257 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : object "openstack-kuttl-tests"/"openstack-scripts" not registered Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.855160 5030 scope.go:117] "RemoveContainer" containerID="d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.907369 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.916305 5030 scope.go:117] "RemoveContainer" containerID="5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84" Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.924818 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84\": container with ID starting with 5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84 not found: ID does not exist" containerID="5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.924887 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84"} err="failed to get container status \"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84\": rpc error: code = NotFound desc = could not find container \"5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84\": container with ID starting with 5dba5d24db0179451d485278775eeffd46ef67bfef2e3668bd94f66923b10c84 not found: ID does not exist" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.924927 5030 scope.go:117] "RemoveContainer" containerID="d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f" Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.928241 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f\": container with ID starting with d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f not found: ID does not exist" containerID="d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.928366 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f"} err="failed to get container status \"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f\": rpc error: code = NotFound desc = could not find container \"d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f\": container with ID starting with 
d0252ba3559ee134b9000998c8060c53dc2b0d984f743e9c1fb9fb16abf9e46f not found: ID does not exist" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.929817 5030 scope.go:117] "RemoveContainer" containerID="fc4cf92d45efba3ab3f584658fe1ec9a79d16433135d3203f60088326bf66a6b" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954354 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6cgc\" (UniqueName: \"kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954894 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.954972 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955125 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955218 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys\") pod \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\" (UID: \"ba47504f-1af1-4c46-acc4-6f6ebc32855f\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955360 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lksg\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg\") pod \"16f26a59-3681-422d-a46f-406b454f73b8\" (UID: \"16f26a59-3681-422d-a46f-406b454f73b8\") " Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.955564 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") pod \"keystone-b2c4-account-create-update-95cmt\" (UID: \"3a215636-7c93-4a97-97f5-842f25380d45\") " pod="openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.956367 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.957652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.965770 5030 projected.go:194] Error preparing data for projected volume kube-api-access-6js45 for pod openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:49 crc kubenswrapper[5030]: E0120 23:44:49.966072 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45 podName:3a215636-7c93-4a97-97f5-842f25380d45 nodeName:}" failed. No retries permitted until 2026-01-20 23:44:53.965854406 +0000 UTC m=+4166.286114684 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6js45" (UniqueName: "kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45") pod "keystone-b2c4-account-create-update-95cmt" (UID: "3a215636-7c93-4a97-97f5-842f25380d45") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.967566 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.968525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.965778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.969382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.972871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.973049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.982690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts" (OuterVolumeSpecName: "scripts") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.984456 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg" (OuterVolumeSpecName: "kube-api-access-5lksg") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "kube-api-access-5lksg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.985171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.991222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.992255 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37397e88-13fe-4396-810e-d87514548283" path="/var/lib/kubelet/pods/37397e88-13fe-4396-810e-d87514548283/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.994468 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b0b2ab-44de-47c1-b0f4-6fc63d87393a" path="/var/lib/kubelet/pods/86b0b2ab-44de-47c1-b0f4-6fc63d87393a/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.995032 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac2d572-0fed-4309-bb22-03bbc9a30270" path="/var/lib/kubelet/pods/8ac2d572-0fed-4309-bb22-03bbc9a30270/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.995379 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af19e9cd-1a61-43ac-a803-fdb7e92831e9" path="/var/lib/kubelet/pods/af19e9cd-1a61-43ac-a803-fdb7e92831e9/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.995733 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" path="/var/lib/kubelet/pods/cfd208a2-6c29-4066-a573-c6b35a6a59d5/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.996786 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5cd7900-68ef-41ef-8635-44dd8b9bfdbb" path="/var/lib/kubelet/pods/d5cd7900-68ef-41ef-8635-44dd8b9bfdbb/volumes" Jan 20 23:44:49 crc kubenswrapper[5030]: I0120 23:44:49.998245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc" (OuterVolumeSpecName: "kube-api-access-l6cgc") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "kube-api-access-l6cgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.019778 5030 scope.go:117] "RemoveContainer" containerID="042c3ddc337801a1de68a99e66b3565634736bf58285d9ac8ac3ca69a8d4aca1" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.035595 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.035646 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.035659 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzhtm"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.035668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.037736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.039161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data" (OuterVolumeSpecName: "config-data") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.039351 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.056990 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057016 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/16f26a59-3681-422d-a46f-406b454f73b8-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057025 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057067 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057079 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057088 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057097 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/16f26a59-3681-422d-a46f-406b454f73b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057105 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lksg\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-kube-api-access-5lksg\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057135 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057145 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057154 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6cgc\" (UniqueName: 
\"kubernetes.io/projected/ba47504f-1af1-4c46-acc4-6f6ebc32855f-kube-api-access-l6cgc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057163 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057172 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.057180 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.060528 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data" (OuterVolumeSpecName: "config-data") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.067032 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.070153 5030 scope.go:117] "RemoveContainer" containerID="db87e0d657028f0da82f16b08db2f08eb396b07e9c9a8800ab5a9e1a8e82760c" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.082890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.084177 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.093174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ba47504f-1af1-4c46-acc4-6f6ebc32855f" (UID: "ba47504f-1af1-4c46-acc4-6f6ebc32855f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.097117 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-914b-account-create-update-25csc"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.114143 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.120083 5030 scope.go:117] "RemoveContainer" containerID="181729877b0aab834202ecd11fb5d266acca0c2132cefa652fcb436c28a852fb" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.120852 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.127936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.136590 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.142416 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-617d-account-create-update-mmt4z"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.145445 5030 scope.go:117] "RemoveContainer" containerID="3a08091a8839c78e509146461ec023c2f7f8a34d9700b1f59f27ad351db63a85" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.159580 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.159608 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.159633 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba47504f-1af1-4c46-acc4-6f6ebc32855f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.159643 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.159652 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16f26a59-3681-422d-a46f-406b454f73b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.167485 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.174076 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-e7ae-account-create-update-lxxbb"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.183070 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "16f26a59-3681-422d-a46f-406b454f73b8" (UID: "16f26a59-3681-422d-a46f-406b454f73b8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.187754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.195855 5030 scope.go:117] "RemoveContainer" containerID="1a73e1e4df6206bcd4110d3e04305f833063815efa94750aa87d060b68f615e8" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.197387 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-2dcf-account-create-update-qrmz5"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.206046 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.215578 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.226751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.238948 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.240717 5030 scope.go:117] "RemoveContainer" containerID="44672b9e5ac71bd9b91365302a913d20bb6e00d96ec91f69b8166b36d5e7f2e4" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.254230 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.259146 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-b2c4-account-create-update-95cmt"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.261904 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/16f26a59-3681-422d-a46f-406b454f73b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.264409 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.282274 5030 scope.go:117] "RemoveContainer" containerID="f878fb22f76e5c3d5c5aad54d40d91cba28b59e426192e35705dec229d359f10" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.287688 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.296122 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.303950 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.307482 5030 scope.go:117] "RemoveContainer" containerID="4c0e97ab7ac916ffeec23817d4c8f83530ee3240f89e8b751dc5b7193307f748" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.309861 5030 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.317272 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.324030 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.326582 5030 scope.go:117] "RemoveContainer" containerID="3ee7c875e7608cf4c0f7d6601a304573ca4dfceefd3058f7a8848e454d0f9c9d" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.330717 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-748859d488-6s2vx"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.336407 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.341290 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-6c4cd74858-frhdt"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.347186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.347735 5030 scope.go:117] "RemoveContainer" containerID="6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.355379 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-86c9cf7bbf-qpwgc"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.363164 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a215636-7c93-4a97-97f5-842f25380d45-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.363193 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6js45\" (UniqueName: \"kubernetes.io/projected/3a215636-7c93-4a97-97f5-842f25380d45-kube-api-access-6js45\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.363210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.373667 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.373760 5030 scope.go:117] "RemoveContainer" containerID="d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.385584 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.388950 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.396121 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.401830 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.411869 5030 scope.go:117] "RemoveContainer" containerID="700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce" Jan 20 23:44:50 crc 
kubenswrapper[5030]: I0120 23:44:50.416617 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.421029 5030 generic.go:334] "Generic (PLEG): container finished" podID="16f26a59-3681-422d-a46f-406b454f73b8" containerID="7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef" exitCode=0 Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.421116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerDied","Data":"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef"} Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.421144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"16f26a59-3681-422d-a46f-406b454f73b8","Type":"ContainerDied","Data":"51acff2a27d331282d3d81f29d33a4a74b9a86ceb3192ef5404c8ac742bb6061"} Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.421216 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.434786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"db5e46d2-d1b4-4639-8e45-c329417ae8e1","Type":"ContainerDied","Data":"1486f2c999df12315217989d8070d874ddb2991aae5c87d9d4eae7f6ea2b9785"} Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.434826 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.437648 5030 generic.go:334] "Generic (PLEG): container finished" podID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" containerID="243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d" exitCode=0 Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.438295 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.439517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" event={"ID":"ba47504f-1af1-4c46-acc4-6f6ebc32855f","Type":"ContainerDied","Data":"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d"} Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.439546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7c845c9d97-gq5lm" event={"ID":"ba47504f-1af1-4c46-acc4-6f6ebc32855f","Type":"ContainerDied","Data":"db3a329ad2fb8bc422bc37633e08c7a296dbf322ff6fad9a1c5c34f5799acd9a"} Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.497497 5030 scope.go:117] "RemoveContainer" containerID="6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758" Jan 20 23:44:50 crc kubenswrapper[5030]: E0120 23:44:50.504604 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758\": container with ID starting with 6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758 not found: ID does not exist" containerID="6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.504656 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758"} err="failed to get container status \"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758\": rpc error: code = NotFound desc = could not find container \"6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758\": container with ID starting with 6ee1b6560a64ef259ec6eaa5bbb33e54f0ef57489ad2da168aaaaf38c3656758 not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.504685 5030 scope.go:117] "RemoveContainer" containerID="d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.504941 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:44:50 crc kubenswrapper[5030]: E0120 23:44:50.506869 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe\": container with ID starting with d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe not found: ID does not exist" containerID="d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.508730 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe"} err="failed to get container status \"d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe\": rpc error: code = NotFound desc = could not find container \"d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe\": container with ID starting with d265c55e4aeef9268ed267ea571a9d19e742c27097df56c4176eb709a398dffe not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.508840 5030 scope.go:117] "RemoveContainer" containerID="700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce" Jan 20 
23:44:50 crc kubenswrapper[5030]: E0120 23:44:50.518399 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce\": container with ID starting with 700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce not found: ID does not exist" containerID="700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.518441 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce"} err="failed to get container status \"700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce\": rpc error: code = NotFound desc = could not find container \"700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce\": container with ID starting with 700ae07561fddcc52977bb5e74502ea44461979ec254243a6c0a66a86fc28dce not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.518468 5030 scope.go:117] "RemoveContainer" containerID="332c1b6479e30e4a4e22fdcc2d4661217a7b8a37320b8504f9bc9df20679771f" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.521036 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7c845c9d97-gq5lm"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.531122 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.542455 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.548995 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.554567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5bc7cb6786-4jbxx"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.559499 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.563982 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.583850 5030 scope.go:117] "RemoveContainer" containerID="4d4ad5a6d7d11df4886fa6a22b4f7a1bd73a0b2aeaa3ab4e6b5789b085035912" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.656934 5030 scope.go:117] "RemoveContainer" containerID="b54bda34645aa30be316ecd336547283562d059fb3040eb96063b7c53a7e4521" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.672590 5030 scope.go:117] "RemoveContainer" containerID="93d97d664a75ce718e24a3db67922eeb139038d7198bf3078d32828f497081d0" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.701788 5030 scope.go:117] "RemoveContainer" containerID="7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.746970 5030 scope.go:117] "RemoveContainer" containerID="9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.786460 5030 scope.go:117] "RemoveContainer" containerID="7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef" Jan 20 23:44:50 crc 
kubenswrapper[5030]: E0120 23:44:50.788477 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef\": container with ID starting with 7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef not found: ID does not exist" containerID="7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.788526 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef"} err="failed to get container status \"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef\": rpc error: code = NotFound desc = could not find container \"7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef\": container with ID starting with 7f377657ebc8cc8d5f270487a8b8d28e4ee20ea78c35ac924a0cfd025e4c2bef not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.788546 5030 scope.go:117] "RemoveContainer" containerID="9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5" Jan 20 23:44:50 crc kubenswrapper[5030]: E0120 23:44:50.788864 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5\": container with ID starting with 9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5 not found: ID does not exist" containerID="9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.788912 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5"} err="failed to get container status \"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5\": rpc error: code = NotFound desc = could not find container \"9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5\": container with ID starting with 9c48f12957fef053cb13ac5971d97c7a68101793fe83cc1adfe30ac9a8f81cc5 not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.788941 5030 scope.go:117] "RemoveContainer" containerID="541f9809b1851320adea3aba77104ed75a318a6d7f3ff14f9ae06b101cf63b8f" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.806642 5030 scope.go:117] "RemoveContainer" containerID="243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.871462 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_1a24a888-0314-4501-b989-20463ef27332/ovn-northd/0.log" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.871533 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.885241 5030 scope.go:117] "RemoveContainer" containerID="243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d" Jan 20 23:44:50 crc kubenswrapper[5030]: E0120 23:44:50.886487 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d\": container with ID starting with 243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d not found: ID does not exist" containerID="243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.886539 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d"} err="failed to get container status \"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d\": rpc error: code = NotFound desc = could not find container \"243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d\": container with ID starting with 243306ef2b6397bc8c6c2bcd542fb20212d3039979bc4b05c6287f0f67be396d not found: ID does not exist" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970420 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-l4rgm\" (UniqueName: \"kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm\") pod \"1a24a888-0314-4501-b989-20463ef27332\" (UID: \"1a24a888-0314-4501-b989-20463ef27332\") " Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.970817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts" (OuterVolumeSpecName: "scripts") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.971125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config" (OuterVolumeSpecName: "config") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.971172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.990862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm" (OuterVolumeSpecName: "kube-api-access-l4rgm") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "kube-api-access-l4rgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:50 crc kubenswrapper[5030]: I0120 23:44:50.995755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.061091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072479 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a24a888-0314-4501-b989-20463ef27332-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072512 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4rgm\" (UniqueName: \"kubernetes.io/projected/1a24a888-0314-4501-b989-20463ef27332-kube-api-access-l4rgm\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072524 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072535 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072544 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.072552 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a24a888-0314-4501-b989-20463ef27332-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.079429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1a24a888-0314-4501-b989-20463ef27332" (UID: "1a24a888-0314-4501-b989-20463ef27332"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.174192 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a24a888-0314-4501-b989-20463ef27332-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.367613 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.449410 5030 generic.go:334] "Generic (PLEG): container finished" podID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerID="8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec" exitCode=0 Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.449517 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.449520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerDied","Data":"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec"} Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.449573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"2118c3ce-d55e-4a65-8d25-d7f53bcac91e","Type":"ContainerDied","Data":"5476b5c63e00b4cc5e3431d604dd24030771d82ed1e2154221e46902839d3f72"} Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.449596 5030 scope.go:117] "RemoveContainer" containerID="8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.464427 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_1a24a888-0314-4501-b989-20463ef27332/ovn-northd/0.log" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.464505 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a24a888-0314-4501-b989-20463ef27332" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" exitCode=139 Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.464568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerDied","Data":"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126"} Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.464860 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"1a24a888-0314-4501-b989-20463ef27332","Type":"ContainerDied","Data":"6bfc653e010a933624d3199768da1c2010cbd0500e1b50c32ab72a246903d334"} Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.464901 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.477715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.477811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.477901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.477996 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478213 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wllkk\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: 
\"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478408 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info\") pod \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\" (UID: \"2118c3ce-d55e-4a65-8d25-d7f53bcac91e\") " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.478939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.479661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.482679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.484812 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.485943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk" (OuterVolumeSpecName: "kube-api-access-wllkk") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "kube-api-access-wllkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.486063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.486860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info" (OuterVolumeSpecName: "pod-info") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.499088 5030 scope.go:117] "RemoveContainer" containerID="62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.502361 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.506012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data" (OuterVolumeSpecName: "config-data") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.517647 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.522350 5030 scope.go:117] "RemoveContainer" containerID="8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec" Jan 20 23:44:51 crc kubenswrapper[5030]: E0120 23:44:51.523091 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec\": container with ID starting with 8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec not found: ID does not exist" containerID="8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.523161 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec"} err="failed to get container status \"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec\": rpc error: code = NotFound desc = could not find container \"8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec\": container with ID starting with 8d105475679fa60963f81be8cfa6ee62761c330e4c9f837c616f6bf01d9043ec not found: ID does not exist" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.523204 5030 scope.go:117] "RemoveContainer" containerID="62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a" Jan 20 23:44:51 crc kubenswrapper[5030]: E0120 23:44:51.523556 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a\": container with ID starting with 62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a not found: ID does not exist" containerID="62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.523593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a"} err="failed to get 
container status \"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a\": rpc error: code = NotFound desc = could not find container \"62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a\": container with ID starting with 62b3fef48a93cc798c887229c9410be50b9b8c1feb8b7d1bf40cdaf819af989a not found: ID does not exist" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.523656 5030 scope.go:117] "RemoveContainer" containerID="0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.542718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf" (OuterVolumeSpecName: "server-conf") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.547430 5030 scope.go:117] "RemoveContainer" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.578608 5030 scope.go:117] "RemoveContainer" containerID="0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec" Jan 20 23:44:51 crc kubenswrapper[5030]: E0120 23:44:51.579082 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec\": container with ID starting with 0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec not found: ID does not exist" containerID="0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.579129 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec"} err="failed to get container status \"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec\": rpc error: code = NotFound desc = could not find container \"0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec\": container with ID starting with 0c308db5ef08b7ef693c77c927ec44ee44d39ecf14e8c420af0c16132d9886ec not found: ID does not exist" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.579155 5030 scope.go:117] "RemoveContainer" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" Jan 20 23:44:51 crc kubenswrapper[5030]: E0120 23:44:51.579562 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126\": container with ID starting with f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126 not found: ID does not exist" containerID="f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.579593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126"} err="failed to get container status \"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126\": rpc error: code = NotFound desc = could not find container \"f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126\": container with ID starting with 
f3b74c58f1f0924c6b08841f6c16d6d4d13149c31696c78cc0405d471bd20126 not found: ID does not exist" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581291 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581315 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581324 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581336 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wllkk\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-kube-api-access-wllkk\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581347 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581379 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581579 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581593 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581604 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.581627 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.595310 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.603337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2118c3ce-d55e-4a65-8d25-d7f53bcac91e" (UID: "2118c3ce-d55e-4a65-8d25-d7f53bcac91e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.683663 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2118c3ce-d55e-4a65-8d25-d7f53bcac91e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.683722 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.783966 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.793679 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.978150 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" path="/var/lib/kubelet/pods/04d86b77-716c-4eb7-8976-9e47e1b6df7d/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.978902 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f927a3-29ee-4af3-97e7-dc9998169372" path="/var/lib/kubelet/pods/12f927a3-29ee-4af3-97e7-dc9998169372/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.979705 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f26a59-3681-422d-a46f-406b454f73b8" path="/var/lib/kubelet/pods/16f26a59-3681-422d-a46f-406b454f73b8/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.981155 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a24a888-0314-4501-b989-20463ef27332" path="/var/lib/kubelet/pods/1a24a888-0314-4501-b989-20463ef27332/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.981917 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" path="/var/lib/kubelet/pods/1d6e18e5-718a-4578-8b74-add47cb38d2d/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.982823 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" path="/var/lib/kubelet/pods/2118c3ce-d55e-4a65-8d25-d7f53bcac91e/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.984157 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" path="/var/lib/kubelet/pods/24b38406-08cf-452c-b0f8-c455ed6b7ece/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.984696 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a215636-7c93-4a97-97f5-842f25380d45" path="/var/lib/kubelet/pods/3a215636-7c93-4a97-97f5-842f25380d45/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.985073 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d88dc1f-6d17-4a07-b701-d1e578d686ad" path="/var/lib/kubelet/pods/4d88dc1f-6d17-4a07-b701-d1e578d686ad/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.985503 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" path="/var/lib/kubelet/pods/4f7e1038-a9e1-40d9-a09d-4c224fb114d4/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.986941 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="50bb7262-9571-4851-9399-5115ee497125" path="/var/lib/kubelet/pods/50bb7262-9571-4851-9399-5115ee497125/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.988157 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" path="/var/lib/kubelet/pods/59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54/volumes" Jan 20 23:44:51 crc kubenswrapper[5030]: I0120 23:44:51.989164 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" path="/var/lib/kubelet/pods/5c070076-3f81-4e61-9f27-e5a39a84c380/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.006668 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="660c45c7-982d-49a6-be38-7a74cf69a00e" path="/var/lib/kubelet/pods/660c45c7-982d-49a6-be38-7a74cf69a00e/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.007394 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" path="/var/lib/kubelet/pods/979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.008283 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" path="/var/lib/kubelet/pods/ba47504f-1af1-4c46-acc4-6f6ebc32855f/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.009144 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a88bb5-d95c-4db3-a79f-51958a894a2b" path="/var/lib/kubelet/pods/c8a88bb5-d95c-4db3-a79f-51958a894a2b/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.011314 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" path="/var/lib/kubelet/pods/ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.012342 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10e94b1-91a0-4631-a7b8-763d41c403ce" path="/var/lib/kubelet/pods/d10e94b1-91a0-4631-a7b8-763d41c403ce/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.013273 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" path="/var/lib/kubelet/pods/d19c3298-c9b5-4371-b1f5-8edf2510706b/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.014830 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" path="/var/lib/kubelet/pods/d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.015817 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" path="/var/lib/kubelet/pods/db5e46d2-d1b4-4639-8e45-c329417ae8e1/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.016766 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" path="/var/lib/kubelet/pods/df866ce3-a10a-41e3-a17b-abea64eaa5ca/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.020300 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" path="/var/lib/kubelet/pods/f205895a-be1d-4da7-adbc-773b2750d39b/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.022929 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" 
path="/var/lib/kubelet/pods/f7c46ea7-7016-445e-8f53-a65df4ea35fc/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.023696 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" path="/var/lib/kubelet/pods/fbdc4876-2fa1-495d-88fd-3c18541b14d4/volumes" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.499102 5030 generic.go:334] "Generic (PLEG): container finished" podID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerID="1f6749a25c72c9c787c127ceefad9a03df49b421462cca6b8ac0a09613026753" exitCode=0 Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.499709 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerDied","Data":"1f6749a25c72c9c787c127ceefad9a03df49b421462cca6b8ac0a09613026753"} Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.600819 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nst7v\" (UniqueName: \"kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698713 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.698955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 
23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.699031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd\") pod \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\" (UID: \"aeaf04c5-9099-4b8a-b270-d4ee406f8896\") " Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.699806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.699986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.718801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v" (OuterVolumeSpecName: "kube-api-access-nst7v") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "kube-api-access-nst7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.728175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts" (OuterVolumeSpecName: "scripts") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.802464 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.802503 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.802512 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aeaf04c5-9099-4b8a-b270-d4ee406f8896-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.802524 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nst7v\" (UniqueName: \"kubernetes.io/projected/aeaf04c5-9099-4b8a-b270-d4ee406f8896-kube-api-access-nst7v\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.812837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.849901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.850736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data" (OuterVolumeSpecName: "config-data") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.850831 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeaf04c5-9099-4b8a-b270-d4ee406f8896" (UID: "aeaf04c5-9099-4b8a-b270-d4ee406f8896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.862790 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.48:6080/vnc_lite.html\": dial tcp 10.217.1.48:6080: i/o timeout" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.904361 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.904406 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.904423 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:52 crc kubenswrapper[5030]: I0120 23:44:52.904435 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aeaf04c5-9099-4b8a-b270-d4ee406f8896-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.518982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aeaf04c5-9099-4b8a-b270-d4ee406f8896","Type":"ContainerDied","Data":"19ef9d49e4482be6c5078408f36807932158ee24a4f9325758201be0283083bb"} Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.519063 5030 scope.go:117] "RemoveContainer" containerID="28b7de7d56face10f996033601d7679206b992fa119d73cbe2e9f054af31eaea" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.521069 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.555288 5030 scope.go:117] "RemoveContainer" containerID="216e958871075e8c220585fbd5ca404192d6413d788bac1112f975b37e4a20f3" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.579967 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.587470 5030 scope.go:117] "RemoveContainer" containerID="1f6749a25c72c9c787c127ceefad9a03df49b421462cca6b8ac0a09613026753" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.591189 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.614412 5030 scope.go:117] "RemoveContainer" containerID="31de7642e3f6a2920d520c9f8ba02678706279ee75f8771e85a21f79d49d8f2c" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.659801 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.737046 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.737316 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="dnsmasq-dns" containerID="cri-o://659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62" gracePeriod=10 Jan 20 23:44:53 crc kubenswrapper[5030]: I0120 23:44:53.980313 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" path="/var/lib/kubelet/pods/aeaf04c5-9099-4b8a-b270-d4ee406f8896/volumes" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.364072 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.430378 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config\") pod \"34541e14-d037-4e28-bcc1-f0781aa25bfb\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.430491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc\") pod \"34541e14-d037-4e28-bcc1-f0781aa25bfb\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.430522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0\") pod \"34541e14-d037-4e28-bcc1-f0781aa25bfb\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.430631 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86bkw\" (UniqueName: \"kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw\") pod \"34541e14-d037-4e28-bcc1-f0781aa25bfb\" (UID: \"34541e14-d037-4e28-bcc1-f0781aa25bfb\") " Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.437755 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw" (OuterVolumeSpecName: "kube-api-access-86bkw") pod "34541e14-d037-4e28-bcc1-f0781aa25bfb" (UID: "34541e14-d037-4e28-bcc1-f0781aa25bfb"). InnerVolumeSpecName "kube-api-access-86bkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.472209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34541e14-d037-4e28-bcc1-f0781aa25bfb" (UID: "34541e14-d037-4e28-bcc1-f0781aa25bfb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.482885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config" (OuterVolumeSpecName: "config") pod "34541e14-d037-4e28-bcc1-f0781aa25bfb" (UID: "34541e14-d037-4e28-bcc1-f0781aa25bfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.496962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "34541e14-d037-4e28-bcc1-f0781aa25bfb" (UID: "34541e14-d037-4e28-bcc1-f0781aa25bfb"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.532669 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86bkw\" (UniqueName: \"kubernetes.io/projected/34541e14-d037-4e28-bcc1-f0781aa25bfb-kube-api-access-86bkw\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.532719 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.532741 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.532761 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34541e14-d037-4e28-bcc1-f0781aa25bfb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.535678 5030 generic.go:334] "Generic (PLEG): container finished" podID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerID="659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62" exitCode=0 Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.535727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" event={"ID":"34541e14-d037-4e28-bcc1-f0781aa25bfb","Type":"ContainerDied","Data":"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62"} Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.535757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" event={"ID":"34541e14-d037-4e28-bcc1-f0781aa25bfb","Type":"ContainerDied","Data":"f9c01f64bc1365ebf45aaaf4051fffe63c7f7e9ea69bd68a634adb65b4bae75c"} Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.535779 5030 scope.go:117] "RemoveContainer" containerID="659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.535889 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.571615 5030 scope.go:117] "RemoveContainer" containerID="2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.575718 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.584014 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-859d7775df-wtw26"] Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.600002 5030 scope.go:117] "RemoveContainer" containerID="659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62" Jan 20 23:44:54 crc kubenswrapper[5030]: E0120 23:44:54.600538 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62\": container with ID starting with 659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62 not found: ID does not exist" containerID="659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.600764 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62"} err="failed to get container status \"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62\": rpc error: code = NotFound desc = could not find container \"659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62\": container with ID starting with 659c5948dbbf542ccf9d9063860877a6004816c800cf48cc1f29568bb715db62 not found: ID does not exist" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.600807 5030 scope.go:117] "RemoveContainer" containerID="2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd" Jan 20 23:44:54 crc kubenswrapper[5030]: E0120 23:44:54.601330 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd\": container with ID starting with 2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd not found: ID does not exist" containerID="2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.601371 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd"} err="failed to get container status \"2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd\": rpc error: code = NotFound desc = could not find container \"2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd\": container with ID starting with 2ff9fb3899b3991032e62388d4f1c3c9e18aaa1dda3d5717634443e4b1e40fcd not found: ID does not exist" Jan 20 23:44:54 crc kubenswrapper[5030]: I0120 23:44:54.690797 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.169:5671: i/o timeout" Jan 20 23:44:55 crc kubenswrapper[5030]: I0120 23:44:55.981550 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" path="/var/lib/kubelet/pods/34541e14-d037-4e28-bcc1-f0781aa25bfb/volumes" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.474318 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514192 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhx5w\" (UniqueName: \"kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.514935 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs\") pod \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\" (UID: \"7f9eab26-3469-41cd-abbf-9f314b2e3db1\") " Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.520987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w" (OuterVolumeSpecName: "kube-api-access-vhx5w") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "kube-api-access-vhx5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.524876 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.562814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config" (OuterVolumeSpecName: "config") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.573345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.575344 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.579044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.594581 5030 generic.go:334] "Generic (PLEG): container finished" podID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerID="c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c" exitCode=0 Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.594683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerDied","Data":"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c"} Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.594700 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.594777 5030 scope.go:117] "RemoveContainer" containerID="ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.594758 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6f69755d54-ldcwk" event={"ID":"7f9eab26-3469-41cd-abbf-9f314b2e3db1","Type":"ContainerDied","Data":"dd82bcf76d467c27ed3193bcf23b5fda2a3a657d167d57483002341c3b4880c9"} Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.605689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7f9eab26-3469-41cd-abbf-9f314b2e3db1" (UID: "7f9eab26-3469-41cd-abbf-9f314b2e3db1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616888 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhx5w\" (UniqueName: \"kubernetes.io/projected/7f9eab26-3469-41cd-abbf-9f314b2e3db1-kube-api-access-vhx5w\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616930 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616947 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616962 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616975 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.616988 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.617001 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9eab26-3469-41cd-abbf-9f314b2e3db1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.621383 5030 scope.go:117] "RemoveContainer" containerID="c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.639888 5030 scope.go:117] "RemoveContainer" containerID="ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229" Jan 20 23:44:59 crc kubenswrapper[5030]: E0120 23:44:59.640280 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229\": container with ID starting with 
ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229 not found: ID does not exist" containerID="ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.640310 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229"} err="failed to get container status \"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229\": rpc error: code = NotFound desc = could not find container \"ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229\": container with ID starting with ba25de2ae7478e2268a1149edfb6a5499eb0e53e3ec1456722337f6a45228229 not found: ID does not exist" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.640329 5030 scope.go:117] "RemoveContainer" containerID="c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c" Jan 20 23:44:59 crc kubenswrapper[5030]: E0120 23:44:59.640834 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c\": container with ID starting with c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c not found: ID does not exist" containerID="c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.640896 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c"} err="failed to get container status \"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c\": rpc error: code = NotFound desc = could not find container \"c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c\": container with ID starting with c6b688c31c136694cd617d7076fc8d77cc330b64c2f68291106be52b8810e00c not found: ID does not exist" Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.945744 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.950917 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6f69755d54-ldcwk"] Jan 20 23:44:59 crc kubenswrapper[5030]: I0120 23:44:59.977272 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" path="/var/lib/kubelet/pods/7f9eab26-3469-41cd-abbf-9f314b2e3db1/volumes" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.161905 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n"] Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162343 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-notification-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162377 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-notification-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162420 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162431 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162452 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="setup-container" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162464 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="setup-container" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162478 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162489 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162507 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162517 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162530 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162541 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162551 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162562 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162582 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162592 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162604 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162651 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="init" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162661 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="init" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162682 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162692 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162707 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162716 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162730 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162739 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162756 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" containerName="keystone-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162765 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" containerName="keystone-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162777 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162789 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162809 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162833 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="setup-container" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162842 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="setup-container" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162853 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162863 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162877 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10e94b1-91a0-4631-a7b8-763d41c403ce" containerName="memcached" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162886 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10e94b1-91a0-4631-a7b8-763d41c403ce" containerName="memcached" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162906 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker-log" Jan 20 23:45:00 crc 
kubenswrapper[5030]: I0120 23:45:00.162916 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162925 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162934 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162948 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162958 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162973 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="dnsmasq-dns" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162981 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="dnsmasq-dns" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.162990 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.162999 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="mysql-bootstrap" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163034 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="mysql-bootstrap" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163045 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163053 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163066 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163073 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163088 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="registry-server" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163097 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="registry-server" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163106 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163114 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163128 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163144 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163152 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163161 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163169 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163181 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-metadata" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-metadata" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163206 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163216 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163223 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163244 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163251 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163259 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163287 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="extract-utilities" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163294 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="extract-utilities" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-server" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-server" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163323 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="extract-content" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163330 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="extract-content" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163344 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163352 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-api" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163363 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-central-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163371 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-central-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163385 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163394 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163404 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-log" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163426 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163433 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163446 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="openstack-network-exporter" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163454 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="openstack-network-exporter" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163464 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163471 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163480 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="sg-core" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163490 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="sg-core" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163498 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="mysql-bootstrap" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163505 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="mysql-bootstrap" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163519 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerName="nova-scheduler-scheduler" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163527 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerName="nova-scheduler-scheduler" Jan 20 23:45:00 crc kubenswrapper[5030]: E0120 23:45:00.163541 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163548 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163880 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163898 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7e1038-a9e1-40d9-a09d-4c224fb114d4" containerName="registry-server" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163910 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df866ce3-a10a-41e3-a17b-abea64eaa5ca" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163924 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163937 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163950 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba47504f-1af1-4c46-acc4-6f6ebc32855f" 
containerName="keystone-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163974 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163987 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.163996 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164006 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c46ea7-7016-445e-8f53-a65df4ea35fc" containerName="galera" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164015 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd208a2-6c29-4066-a573-c6b35a6a59d5" containerName="mariadb-account-create-update" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164024 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164036 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164049 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ca0f14-59f2-44f0-91f4-e0c2c8b43a2b" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164064 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164081 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="sg-core" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164093 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="ovn-northd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164107 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f205895a-be1d-4da7-adbc-773b2750d39b" containerName="barbican-keystone-listener-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164116 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164128 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164138 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164147 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-server" Jan 20 
23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164160 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10e94b1-91a0-4631-a7b8-763d41c403ce" containerName="memcached" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164172 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bb7262-9571-4851-9399-5115ee497125" containerName="cinder-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164183 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2118c3ce-d55e-4a65-8d25-d7f53bcac91e" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164195 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6e18e5-718a-4578-8b74-add47cb38d2d" containerName="nova-cell1-conductor-conductor" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164204 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19c3298-c9b5-4371-b1f5-8edf2510706b" containerName="placement-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164218 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-central-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164232 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="34541e14-d037-4e28-bcc1-f0781aa25bfb" containerName="dnsmasq-dns" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164264 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9eab26-3469-41cd-abbf-9f314b2e3db1" containerName="neutron-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164275 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164287 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164296 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164304 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5e46d2-d1b4-4639-8e45-c329417ae8e1" containerName="nova-scheduler-scheduler" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164316 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6a40bb-ca5c-46eb-bad6-8f7f7cee94f1" containerName="barbican-worker" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164325 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="979e9ab4-f6ba-4e96-91aa-6e2da22bc5cd" containerName="barbican-api" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164336 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164348 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a24a888-0314-4501-b989-20463ef27332" containerName="openstack-network-exporter" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164358 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59dc3b96-a86c-49bf-b1cd-dd0c0f72ac54" containerName="glance-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164366 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="aeaf04c5-9099-4b8a-b270-d4ee406f8896" containerName="ceilometer-notification-agent" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164377 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c070076-3f81-4e61-9f27-e5a39a84c380" containerName="nova-metadata-metadata" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164389 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f26a59-3681-422d-a46f-406b454f73b8" containerName="rabbitmq" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164397 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbdc4876-2fa1-495d-88fd-3c18541b14d4" containerName="nova-api-log" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164410 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" containerName="proxy-httpd" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.164419 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b38406-08cf-452c-b0f8-c455ed6b7ece" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.165060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.167832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.168768 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.193482 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n"] Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.227096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.227162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.227220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjtmg\" (UniqueName: \"kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.328009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume\") pod 
\"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.328039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.328068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjtmg\" (UniqueName: \"kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.329350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.333303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.346258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjtmg\" (UniqueName: \"kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg\") pod \"collect-profiles-29482545-fj69n\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.513121 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:00 crc kubenswrapper[5030]: I0120 23:45:00.984723 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n"] Jan 20 23:45:00 crc kubenswrapper[5030]: W0120 23:45:00.985077 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d12c4e_cd2b_4d9e_ab44_6ea82cd18829.slice/crio-cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a WatchSource:0}: Error finding container cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a: Status 404 returned error can't find the container with id cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a Jan 20 23:45:01 crc kubenswrapper[5030]: I0120 23:45:01.619696 5030 generic.go:334] "Generic (PLEG): container finished" podID="10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" containerID="d97e7dad220d6c0976d63306667b72803bea126b66f2551e2435f8710999b82b" exitCode=0 Jan 20 23:45:01 crc kubenswrapper[5030]: I0120 23:45:01.619880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" event={"ID":"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829","Type":"ContainerDied","Data":"d97e7dad220d6c0976d63306667b72803bea126b66f2551e2435f8710999b82b"} Jan 20 23:45:01 crc kubenswrapper[5030]: I0120 23:45:01.620843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" event={"ID":"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829","Type":"ContainerStarted","Data":"cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a"} Jan 20 23:45:02 crc kubenswrapper[5030]: I0120 23:45:02.958642 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.073827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjtmg\" (UniqueName: \"kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg\") pod \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.074020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume\") pod \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.074092 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume\") pod \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\" (UID: \"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829\") " Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.075140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume" (OuterVolumeSpecName: "config-volume") pod "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" (UID: "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.080937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg" (OuterVolumeSpecName: "kube-api-access-tjtmg") pod "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" (UID: "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829"). InnerVolumeSpecName "kube-api-access-tjtmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.081193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" (UID: "10d12c4e-cd2b-4d9e-ab44-6ea82cd18829"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.176051 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.176102 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.176127 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjtmg\" (UniqueName: \"kubernetes.io/projected/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829-kube-api-access-tjtmg\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.643106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" event={"ID":"10d12c4e-cd2b-4d9e-ab44-6ea82cd18829","Type":"ContainerDied","Data":"cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a"} Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.643176 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n" Jan 20 23:45:03 crc kubenswrapper[5030]: I0120 23:45:03.643166 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb841fc3979636e2f46b32ba4b86b9a4c2da4cba4b7aa4b169691a95b52a535a" Jan 20 23:45:04 crc kubenswrapper[5030]: I0120 23:45:04.056691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng"] Jan 20 23:45:04 crc kubenswrapper[5030]: I0120 23:45:04.067393 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482500-fwwng"] Jan 20 23:45:05 crc kubenswrapper[5030]: I0120 23:45:05.980306 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69981745-6b95-4420-8607-6189df827955" path="/var/lib/kubelet/pods/69981745-6b95-4420-8607-6189df827955/volumes" Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.157219 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.157660 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.157726 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.158693 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.158791 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719" gracePeriod=600 Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.739826 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719" exitCode=0 Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.739913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719"} Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.740358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121"} Jan 20 23:45:10 crc kubenswrapper[5030]: I0120 23:45:10.740410 5030 scope.go:117] "RemoveContainer" containerID="86c3e80b108c4b529ead605d41701ae1ffbabe2e1e0a97867452f06e92f807e4" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.251613 5030 scope.go:117] "RemoveContainer" containerID="d9605d7f9d273668c47f45b37af497fcbfd880427fd93734f81f8edf971ac8a9" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.291956 5030 scope.go:117] "RemoveContainer" containerID="28d0a312bd9b754e9a2a5340f10df1c45aa03b9850f1f67802fbc5a8627a8ab7" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.327376 5030 scope.go:117] "RemoveContainer" containerID="c45bbe420a3c0c3efb9b06d7ee93b499854a8e5500f9dfd99ff855cf5df00379" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.355674 5030 scope.go:117] "RemoveContainer" containerID="0279e54b1935433ab1c4cfae7c60466d90a9ac5d6d65fae068ea6efb4d3826be" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.423976 5030 scope.go:117] "RemoveContainer" containerID="8a0eec32beb473b172927074604f141d23560612f04feaa5a8ac286e951f5c6f" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.456265 5030 scope.go:117] "RemoveContainer" containerID="83e74463459e08631754313660ebe7d9b0a5d2244d0828698d9ba7f291108e96" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.493084 5030 scope.go:117] "RemoveContainer" containerID="9e725199433e346decd5a46d7977b737a73063b254698099a39fbde23c3b26fa" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.528047 5030 scope.go:117] "RemoveContainer" containerID="bda6d746b7d14d48dfab7091c45f743116fdf4934edb9a99356585797691c9bd" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.553170 5030 scope.go:117] "RemoveContainer" containerID="c723fee7122bd82a4654f3e57bc39980e37b10a0faadf88c51ea2d1808f5f10c" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.578512 5030 scope.go:117] "RemoveContainer" containerID="f2617bfd0f56a2a7511d6655cd7e74e93508ab10fa8d7a49fc0ec36b48df2eea" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.619588 5030 scope.go:117] "RemoveContainer" containerID="63cdd757318961398a57e724643715beb0c52c0c72e8648c72d7b3a6a1963c1a" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.666966 5030 scope.go:117] "RemoveContainer" containerID="2997f4b86dcccf1c849311b9eb484938b4223576f649cdfae1c3ec9c2f959fd7" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.710092 5030 scope.go:117] "RemoveContainer" containerID="78d23a1e9ddea4f858d613077986875f1ebdff9b27bbc15d1e79bfac6bd8148f" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.737817 5030 scope.go:117] "RemoveContainer" containerID="acb2cd8b976af56ab01d689076d286ed050ea6b6291bdc89ba6a80a2a05b46e5" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.768922 5030 scope.go:117] "RemoveContainer" containerID="446c9c22532ed5000c2914327203f69862de8749941d75e7ed5a0f66132d8d91" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.798933 5030 scope.go:117] "RemoveContainer" containerID="fc9a809137c348f37727899759ea3b9e309c3ff612805eeab5ca65ed89054529" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.831288 5030 scope.go:117] "RemoveContainer" containerID="64de3c79bce421a0d3c1059b0988a3e76dde367063a872a184ec15dd5d265737" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.891535 5030 scope.go:117] "RemoveContainer" containerID="3297caab8ada18e3a891d11d79f546ab779eaf78f3077e6af83e7fdfc7b652aa" 
Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.921364 5030 scope.go:117] "RemoveContainer" containerID="935662df8cf9aece3ed3605af05bf9156234f657db887627bf7aceee2b470956" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.953445 5030 scope.go:117] "RemoveContainer" containerID="5ed9fef91aa09e90121fb4abe44cf03f6735cf5076317f24839a85eb41859635" Jan 20 23:45:12 crc kubenswrapper[5030]: I0120 23:45:12.991485 5030 scope.go:117] "RemoveContainer" containerID="e0d56ff4302d57b4640e3a21a93764cd9ca8e599cfa8633aa74cfa5710979040" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.027170 5030 scope.go:117] "RemoveContainer" containerID="975217a77382cd0704c9172f8dfafc0bea19409cc429217624bee737730226f1" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.053726 5030 scope.go:117] "RemoveContainer" containerID="93e09b6369ee1ec46642c19a747cf87d854479c6601cd44323fa57ca688fb52e" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.459227 5030 scope.go:117] "RemoveContainer" containerID="6cf9372d371534e9f69cfa3095035fd05d0953c31466438fdef72192fdad5f2f" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.824973 5030 generic.go:334] "Generic (PLEG): container finished" podID="1fa80a86-a911-4a89-9186-520189e4cc88" containerID="703b4130b09e7e9f7f77d19f3d5f9809062cb9cb593d15aaae512b984d70c591" exitCode=137 Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.825047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"703b4130b09e7e9f7f77d19f3d5f9809062cb9cb593d15aaae512b984d70c591"} Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.880336 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.952714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") pod \"1fa80a86-a911-4a89-9186-520189e4cc88\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.952851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache\") pod \"1fa80a86-a911-4a89-9186-520189e4cc88\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.952949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgx9c\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c\") pod \"1fa80a86-a911-4a89-9186-520189e4cc88\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.952984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1fa80a86-a911-4a89-9186-520189e4cc88\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.953055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock\") pod \"1fa80a86-a911-4a89-9186-520189e4cc88\" (UID: \"1fa80a86-a911-4a89-9186-520189e4cc88\") " Jan 20 23:45:13 crc 
kubenswrapper[5030]: I0120 23:45:13.953951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock" (OuterVolumeSpecName: "lock") pod "1fa80a86-a911-4a89-9186-520189e4cc88" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.954003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache" (OuterVolumeSpecName: "cache") pod "1fa80a86-a911-4a89-9186-520189e4cc88" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.954342 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:13 crc kubenswrapper[5030]: I0120 23:45:13.954376 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fa80a86-a911-4a89-9186-520189e4cc88-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.042420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c" (OuterVolumeSpecName: "kube-api-access-kgx9c") pod "1fa80a86-a911-4a89-9186-520189e4cc88" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88"). InnerVolumeSpecName "kube-api-access-kgx9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.042459 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1fa80a86-a911-4a89-9186-520189e4cc88" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.043057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "1fa80a86-a911-4a89-9186-520189e4cc88" (UID: "1fa80a86-a911-4a89-9186-520189e4cc88"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.056361 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.056414 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgx9c\" (UniqueName: \"kubernetes.io/projected/1fa80a86-a911-4a89-9186-520189e4cc88-kube-api-access-kgx9c\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.056452 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.082174 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.158018 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.856348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"1fa80a86-a911-4a89-9186-520189e4cc88","Type":"ContainerDied","Data":"566e5fbb6b26fe8efa2057d0b7dbfea6b25021f1dd5818f82ec1e02320728190"} Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.856471 5030 scope.go:117] "RemoveContainer" containerID="703b4130b09e7e9f7f77d19f3d5f9809062cb9cb593d15aaae512b984d70c591" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.856482 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.891489 5030 scope.go:117] "RemoveContainer" containerID="eab5cc67834e8fd543f3f0bf8bd1fcda92107d8d7ed59e2a5b6142df62f58fce" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.916206 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.931438 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.938611 5030 scope.go:117] "RemoveContainer" containerID="0cc78d3b2cdd7c16a9c523aa805dd25a524946b1037801b958e0b5aca73eae5d" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.962303 5030 scope.go:117] "RemoveContainer" containerID="12eaa764aa377753f5ace52c861bbb3f3689b2fca77a117e7d7f96175209240d" Jan 20 23:45:14 crc kubenswrapper[5030]: I0120 23:45:14.988402 5030 scope.go:117] "RemoveContainer" containerID="cc48b4c6d8e07a0b1db15c7843485e19655d2fc4c950337f5fa62bc1a6c473e8" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.011311 5030 scope.go:117] "RemoveContainer" containerID="bac50d4cc962099037430a767b8939f7657afbb7b51f2a1c37620ae6aac67d42" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.032551 5030 scope.go:117] "RemoveContainer" containerID="60ab4b7b3cfb297f14a1ca0ce6b7a2cef1d8b852da1a174b8c0a481cccd94e44" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.054980 5030 scope.go:117] "RemoveContainer" containerID="ba230f74c1544afcbb8b1f9c898cc75207ae9e7b4b8083280e546953073e28ef" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.072429 5030 scope.go:117] "RemoveContainer" containerID="32ce18f6289e30499008982b3bc6f5d28395565c34d8a569f794f6c6a7775704" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.093664 5030 scope.go:117] "RemoveContainer" containerID="1cf6b355d0e499f70b76d7c2fa59bfce9c9ed28b910c999b8bcfdcc19ec0dbf2" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.115867 5030 scope.go:117] "RemoveContainer" containerID="1804d3605dcae8df1649629d1484b17dbb6c4cb8599bf27e3f1d93177fa864f6" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.140303 5030 scope.go:117] "RemoveContainer" containerID="fffadcca529e80041791a2dbc5892ae2ae5e249cae53e8e2aea4dd2a52b5ed5f" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.168368 5030 scope.go:117] "RemoveContainer" containerID="0be440e2b30ae5389f7c2cbaafbac4d4866fd7d5b1985dbe1813ca1df94788ca" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.196773 5030 scope.go:117] "RemoveContainer" containerID="5a0e990417f8c5fb20ac6296acd86d2880ed3ee8806354f7c4c3d73054ac98ed" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.216297 5030 scope.go:117] "RemoveContainer" containerID="ca2b82dd6522d59c8f400191a6a84849e1a0dc5afd452ebd9ad3d6e6bb95252c" Jan 20 23:45:15 crc kubenswrapper[5030]: I0120 23:45:15.973561 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" path="/var/lib/kubelet/pods/1fa80a86-a911-4a89-9186-520189e4cc88/volumes" Jan 20 23:45:18 crc kubenswrapper[5030]: I0120 23:45:18.172760 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod987b795c-8dbe-4f59-aa00-e3f908a9c974"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod987b795c-8dbe-4f59-aa00-e3f908a9c974] : Timed out while waiting for systemd to remove 
kubepods-besteffort-pod987b795c_8dbe_4f59_aa00_e3f908a9c974.slice" Jan 20 23:45:18 crc kubenswrapper[5030]: E0120 23:45:18.173183 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod987b795c-8dbe-4f59-aa00-e3f908a9c974] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod987b795c-8dbe-4f59-aa00-e3f908a9c974] : Timed out while waiting for systemd to remove kubepods-besteffort-pod987b795c_8dbe_4f59_aa00_e3f908a9c974.slice" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" Jan 20 23:45:18 crc kubenswrapper[5030]: I0120 23:45:18.895927 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf" Jan 20 23:45:18 crc kubenswrapper[5030]: I0120 23:45:18.949020 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:45:18 crc kubenswrapper[5030]: I0120 23:45:18.957581 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-844c45cdb4-lhgzf"] Jan 20 23:45:19 crc kubenswrapper[5030]: I0120 23:45:19.977014 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987b795c-8dbe-4f59-aa00-e3f908a9c974" path="/var/lib/kubelet/pods/987b795c-8dbe-4f59-aa00-e3f908a9c974/volumes" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886074 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886422 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="rsync" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886439 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="rsync" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886458 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886485 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886496 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886503 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886514 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886521 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-auditor" Jan 20 
23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="swift-recon-cron" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886541 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="swift-recon-cron" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886557 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886564 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-server" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886575 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886582 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d86b77-716c-4eb7-8976-9e47e1b6df7d" containerName="nova-cell0-conductor-conductor" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886592 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886599 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886685 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886696 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886707 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886716 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-server" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886727 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" containerName="collect-profiles" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886734 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" containerName="collect-profiles" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886758 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-server" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886770 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886777 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886793 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886800 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886811 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-expirer" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886816 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-expirer" Jan 20 23:45:23 crc kubenswrapper[5030]: E0120 23:45:23.886825 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-reaper" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886831 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-reaper" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886953 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886973 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="swift-recon-cron" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886981 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-reaper" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.886991 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-expirer" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887001 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="rsync" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887008 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887016 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887045 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-updater" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 
23:45:23.887054 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="account-auditor" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887063 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887072 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" containerName="collect-profiles" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887084 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="object-server" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887095 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa80a86-a911-4a89-9186-520189e4cc88" containerName="container-replicator" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.887894 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.890025 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.890347 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-l2rh5" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.890605 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.890672 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.892123 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.892128 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.893496 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:45:23 crc kubenswrapper[5030]: I0120 23:45:23.915835 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.033887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.033937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.033973 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.033992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6jx4\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.034417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6jx4\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136238 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.136700 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.137101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.137399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.137427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.137476 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.138185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.145565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 
23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.147065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.147284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.156511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.161156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6jx4\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.173148 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.208189 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.572099 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.869133 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.871076 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.877376 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.877376 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.879039 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-29mtd" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.879311 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.879462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.882489 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.886498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.893523 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.947038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerStarted","Data":"7d54a026a485d18cd1d8c96ef81c4c85d27c633cadb588acb70f22c02d843a63"} Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951674 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfr8n\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.951980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.952008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.952044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.952075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:24 crc kubenswrapper[5030]: I0120 23:45:24.952105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.053879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.053923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.053955 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfr8n\" (UniqueName: 
\"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.053985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.055387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.055486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.055579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.054305 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.055854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.055995 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.059311 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.060096 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.061792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.075837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfr8n\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.345567 5030 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.347079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.347256 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.350204 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.350248 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.350204 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-f8qjh" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.361828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.365217 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.365939 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.462499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.462786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.462899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qm4\" (UniqueName: \"kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.462992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.463085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.463182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.463266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.463340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.469001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.503651 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.564908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.565215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.565350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.565499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qm4\" (UniqueName: \"kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.565677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.565816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.566523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.566894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.567026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 
23:45:25.567096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.567830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.567873 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.567996 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.581583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qm4\" (UniqueName: \"kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.642906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.643366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.745931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:25 crc kubenswrapper[5030]: I0120 23:45:25.813155 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.000015 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.283828 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:45:26 crc kubenswrapper[5030]: W0120 23:45:26.292704 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62872782_8768_483f_b48b_298cf9973612.slice/crio-188c87c82ed83707455428ef810356b00cbf20190ff648f50d03a8e007170982 WatchSource:0}: Error finding container 188c87c82ed83707455428ef810356b00cbf20190ff648f50d03a8e007170982: Status 404 returned error can't find the container with id 188c87c82ed83707455428ef810356b00cbf20190ff648f50d03a8e007170982 Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.669720 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.671366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.673859 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-4j24t" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.674395 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.678644 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.678912 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.682908 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.785386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.785739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.785780 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.785832 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.785872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.786014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtts\" (UniqueName: \"kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.786062 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.786134 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887574 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtts\" (UniqueName: \"kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887732 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.887834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.888273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.888362 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.888682 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.889070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.889529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.894750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.896510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.906643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtts\" (UniqueName: \"kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.907130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.967723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerStarted","Data":"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707"} Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.967761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerStarted","Data":"188c87c82ed83707455428ef810356b00cbf20190ff648f50d03a8e007170982"} Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.968892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerStarted","Data":"e9fdef20722e3e765799d93fdd39a7190da044ff2d99ede01532b274f61fad2e"} Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.969990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerStarted","Data":"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c"} Jan 20 23:45:26 crc kubenswrapper[5030]: I0120 23:45:26.991539 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:27 crc kubenswrapper[5030]: I0120 23:45:27.445639 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:45:27 crc kubenswrapper[5030]: W0120 23:45:27.468993 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9729aa6a_9110_421f_bd3a_d22cad6a75e9.slice/crio-f0fc87eef0d2e32bf33a81d2f671b20a960c3309c5bb24c43e584f8b008458c2 WatchSource:0}: Error finding container f0fc87eef0d2e32bf33a81d2f671b20a960c3309c5bb24c43e584f8b008458c2: Status 404 returned error can't find the container with id f0fc87eef0d2e32bf33a81d2f671b20a960c3309c5bb24c43e584f8b008458c2 Jan 20 23:45:27 crc kubenswrapper[5030]: I0120 23:45:27.984365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerStarted","Data":"11fee335c42cdc9d15ce4fe859a0bea109b4a8f81b0b4f42812b3dd2aaaede89"} Jan 20 23:45:27 crc kubenswrapper[5030]: I0120 23:45:27.986574 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerStarted","Data":"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260"} Jan 20 23:45:27 crc kubenswrapper[5030]: I0120 23:45:27.986611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerStarted","Data":"f0fc87eef0d2e32bf33a81d2f671b20a960c3309c5bb24c43e584f8b008458c2"} Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.011309 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.012513 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.014553 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-rp726" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.014822 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.016952 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.039989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.211187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.211268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.211383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.211494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mvh\" (UniqueName: \"kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.211564 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.313082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.313382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mvh\" (UniqueName: \"kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.313418 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.313456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.313479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.316936 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.317100 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.318189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.325300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.325362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.331052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.333438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mvh\" (UniqueName: \"kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh\") pod \"memcached-0\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.336791 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-rp726" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.346000 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.693416 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.994304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5727ecb0-0164-4b1f-9c45-80b381d6e6dd","Type":"ContainerStarted","Data":"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782"} Jan 20 23:45:28 crc kubenswrapper[5030]: I0120 23:45:28.994578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5727ecb0-0164-4b1f-9c45-80b381d6e6dd","Type":"ContainerStarted","Data":"cf22e84fef8fc521d41c77e4d89847c129485609c42476eaeae65df07568fe7b"} Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.012252 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.012235029 podStartE2EDuration="2.012235029s" podCreationTimestamp="2026-01-20 23:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:29.010790844 +0000 UTC m=+4201.331051132" watchObservedRunningTime="2026-01-20 23:45:29.012235029 +0000 UTC m=+4201.332495317" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.625695 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.626574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.628597 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-mbt2h" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.636118 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.733575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjln\" (UniqueName: \"kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln\") pod \"kube-state-metrics-0\" (UID: \"f8ca9328-fae7-493b-936e-4019e3931081\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.835114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjln\" (UniqueName: \"kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln\") pod \"kube-state-metrics-0\" (UID: \"f8ca9328-fae7-493b-936e-4019e3931081\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.851259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjln\" (UniqueName: \"kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln\") pod \"kube-state-metrics-0\" (UID: \"f8ca9328-fae7-493b-936e-4019e3931081\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:29 crc kubenswrapper[5030]: I0120 23:45:29.941685 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.009278 5030 generic.go:334] "Generic (PLEG): container finished" podID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerID="ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260" exitCode=0 Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.009368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerDied","Data":"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260"} Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.012765 5030 generic.go:334] "Generic (PLEG): container finished" podID="62872782-8768-483f-b48b-298cf9973612" containerID="3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707" exitCode=0 Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.012838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerDied","Data":"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707"} Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.013146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:30 crc kubenswrapper[5030]: I0120 23:45:30.438415 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:45:30 crc kubenswrapper[5030]: W0120 23:45:30.441134 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8ca9328_fae7_493b_936e_4019e3931081.slice/crio-3b6cd25662a3f8e406e208b7eb56e2bd85c8ae01f853fdd1ab9bed524b1d94bd WatchSource:0}: Error finding container 3b6cd25662a3f8e406e208b7eb56e2bd85c8ae01f853fdd1ab9bed524b1d94bd: Status 404 returned error can't find the container with id 3b6cd25662a3f8e406e208b7eb56e2bd85c8ae01f853fdd1ab9bed524b1d94bd Jan 20 23:45:31 crc kubenswrapper[5030]: I0120 23:45:31.022658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f8ca9328-fae7-493b-936e-4019e3931081","Type":"ContainerStarted","Data":"3b6cd25662a3f8e406e208b7eb56e2bd85c8ae01f853fdd1ab9bed524b1d94bd"} Jan 20 23:45:31 crc kubenswrapper[5030]: I0120 23:45:31.024938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerStarted","Data":"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be"} Jan 20 23:45:31 crc kubenswrapper[5030]: I0120 23:45:31.027205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerStarted","Data":"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e"} Jan 20 23:45:31 crc kubenswrapper[5030]: I0120 23:45:31.048422 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=5.048396223 podStartE2EDuration="5.048396223s" podCreationTimestamp="2026-01-20 23:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:31.04658536 +0000 UTC m=+4203.366845688" 
watchObservedRunningTime="2026-01-20 23:45:31.048396223 +0000 UTC m=+4203.368656551" Jan 20 23:45:31 crc kubenswrapper[5030]: I0120 23:45:31.068508 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.06848642 podStartE2EDuration="6.06848642s" podCreationTimestamp="2026-01-20 23:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:31.066666587 +0000 UTC m=+4203.386926885" watchObservedRunningTime="2026-01-20 23:45:31.06848642 +0000 UTC m=+4203.388746708" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.039793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f8ca9328-fae7-493b-936e-4019e3931081","Type":"ContainerStarted","Data":"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a"} Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.040009 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.060035 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.645259523 podStartE2EDuration="3.060015504s" podCreationTimestamp="2026-01-20 23:45:29 +0000 UTC" firstStartedPulling="2026-01-20 23:45:30.443011078 +0000 UTC m=+4202.763271366" lastFinishedPulling="2026-01-20 23:45:30.857767059 +0000 UTC m=+4203.178027347" observedRunningTime="2026-01-20 23:45:32.058524328 +0000 UTC m=+4204.378784626" watchObservedRunningTime="2026-01-20 23:45:32.060015504 +0000 UTC m=+4204.380275802" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.901270 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.902749 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.913918 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-l99kr" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.915112 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.915401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.916926 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.920040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.939909 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.988458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.988557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.988606 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdxdr\" (UniqueName: \"kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.988670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.989031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.989086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 
23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.989203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:32 crc kubenswrapper[5030]: I0120 23:45:32.989245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.090933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdxdr\" (UniqueName: \"kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.090982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091126 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.091394 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.092468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.093109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.094138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.098015 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.123552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.127331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.127853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdxdr\" (UniqueName: \"kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.129355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.221964 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.347769 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:45:33 crc kubenswrapper[5030]: I0120 23:45:33.710404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:45:34 crc kubenswrapper[5030]: I0120 23:45:34.056872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerStarted","Data":"d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145"} Jan 20 23:45:34 crc kubenswrapper[5030]: I0120 23:45:34.057274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerStarted","Data":"45bd5cb9bd91333e8b4b00eaebbb15d56343efd3440ed4badc13c410925f7ed0"} Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.078710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerStarted","Data":"3bdfa2c0f1f4d03ce20d64482e831290dd6ab19d7e2711ec93d9ac4cce9e8843"} Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.097025 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.098443 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.100504 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.100733 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.100778 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.100978 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-s5kbg" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.123991 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.123973121 podStartE2EDuration="3.123973121s" podCreationTimestamp="2026-01-20 23:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:35.105526374 +0000 UTC m=+4207.425786662" watchObservedRunningTime="2026-01-20 23:45:35.123973121 +0000 UTC m=+4207.444233409" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.131704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.131805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.131909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.131943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.131984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.132039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjn5g\" (UniqueName: 
\"kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.132065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.132101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.148138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233344 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjn5g\" (UniqueName: \"kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233506 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233577 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.233665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.234010 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.234469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.235147 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.235279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.444897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.445453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.446211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjn5g\" (UniqueName: \"kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.446907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.477515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.727453 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.813502 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.813574 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:35 crc kubenswrapper[5030]: I0120 23:45:35.929202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.171428 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.211005 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:45:36 crc kubenswrapper[5030]: W0120 23:45:36.222310 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod806e42db_b6fa_4931_ab88_b2903c5e2830.slice/crio-f68906bc74881c82cf7b9c1196f5ca0e9055ef5e41efb62cba87552a5fac3858 WatchSource:0}: Error finding container f68906bc74881c82cf7b9c1196f5ca0e9055ef5e41efb62cba87552a5fac3858: Status 404 returned error can't find the container with id f68906bc74881c82cf7b9c1196f5ca0e9055ef5e41efb62cba87552a5fac3858 Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.222669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.274747 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.991878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:36 crc kubenswrapper[5030]: I0120 23:45:36.992307 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.100607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerStarted","Data":"6b834b707d40e4da7564fea909b7b96377cf70ecece5fdecf7803fb75747167e"} Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.100700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerStarted","Data":"d42829f2ade98d7dd79e571500a85a3e1b4edf65a779d6c65138c90834bf5c89"} Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.100721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerStarted","Data":"f68906bc74881c82cf7b9c1196f5ca0e9055ef5e41efb62cba87552a5fac3858"} Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.101519 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.129675 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.129605045 podStartE2EDuration="2.129605045s" podCreationTimestamp="2026-01-20 23:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:37.126332536 +0000 UTC m=+4209.446592884" watchObservedRunningTime="2026-01-20 23:45:37.129605045 +0000 UTC m=+4209.449865373" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.597118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.663872 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.921309 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-95nrc"] Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.922175 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.936614 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-95nrc"] Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.982427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:37 crc kubenswrapper[5030]: I0120 23:45:37.982510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7hx6\" (UniqueName: \"kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.084131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.084239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hx6\" (UniqueName: \"kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 
23:45:38.086207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.111774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hx6\" (UniqueName: \"kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6\") pod \"keystone-db-create-95nrc\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.122384 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.123496 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.126804 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.140185 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.180727 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.185168 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.185291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sct7j\" (UniqueName: \"kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.226739 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-jhh6k"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.228001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.233090 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jhh6k"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.244916 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.293105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.293172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sct7j\" (UniqueName: \"kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.293206 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j28zr\" (UniqueName: \"kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.293231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.294057 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.312097 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sct7j\" (UniqueName: \"kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j\") pod \"keystone-a1e6-account-create-update-vlzvs\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.327693 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.329103 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.331698 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.340204 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.395125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2zt\" (UniqueName: \"kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.395204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j28zr\" (UniqueName: \"kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.395281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.395499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.396021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.410173 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j28zr\" (UniqueName: \"kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr\") pod \"placement-db-create-jhh6k\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.473546 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.499423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.499513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2zt\" (UniqueName: \"kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.500616 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.521317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2zt\" (UniqueName: \"kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt\") pod \"placement-44f7-account-create-update-fjv9c\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.562874 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.672640 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.723341 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-95nrc"] Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.727870 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:38 crc kubenswrapper[5030]: I0120 23:45:38.944603 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs"] Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.047500 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jhh6k"] Jan 20 23:45:39 crc kubenswrapper[5030]: W0120 23:45:39.115174 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec81857f_9f0b_427d_8e80_985a39d18721.slice/crio-f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622 WatchSource:0}: Error finding container f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622: Status 404 returned error can't find the container with id f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622 Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.127582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c"] Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.135259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jhh6k" event={"ID":"ec81857f-9f0b-427d-8e80-985a39d18721","Type":"ContainerStarted","Data":"f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622"} Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.138271 5030 generic.go:334] "Generic (PLEG): container finished" podID="0afa8856-f941-4f8d-a68c-94e605b4825a" containerID="ff457d14d19e95e1a68beccc54d8b8ec84f5dfda5d6553ed1fa2422bfd6a6ecc" exitCode=0 Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.138310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-95nrc" event={"ID":"0afa8856-f941-4f8d-a68c-94e605b4825a","Type":"ContainerDied","Data":"ff457d14d19e95e1a68beccc54d8b8ec84f5dfda5d6553ed1fa2422bfd6a6ecc"} Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.138343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-95nrc" event={"ID":"0afa8856-f941-4f8d-a68c-94e605b4825a","Type":"ContainerStarted","Data":"01141a1a8579e32b2caf4d0f1fb60b62b762f1fafe67853641e46a283f0285f8"} Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.141586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" event={"ID":"ed4653dc-2f44-44ec-a22f-073fd4782f85","Type":"ContainerStarted","Data":"23c1fd7b8e80589aecf71e2030f425d380947a7f65d0b95aa13b03faa5176adc"} Jan 20 23:45:39 crc kubenswrapper[5030]: I0120 23:45:39.945909 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.072263 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.078813 5030 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.080835 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.080959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-p79hw" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.081208 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.082043 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.091009 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.126610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cl4k\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.126671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.126713 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.126748 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.126821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.149759 5030 generic.go:334] "Generic (PLEG): container finished" podID="cae4fe8a-e775-4599-9ca9-ff3be2baa456" containerID="1c026ff013ec59d543a67923331e5855251f76f3e958cb1eefdb053d5331676d" exitCode=0 Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.149844 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" event={"ID":"cae4fe8a-e775-4599-9ca9-ff3be2baa456","Type":"ContainerDied","Data":"1c026ff013ec59d543a67923331e5855251f76f3e958cb1eefdb053d5331676d"} Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.149881 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" event={"ID":"cae4fe8a-e775-4599-9ca9-ff3be2baa456","Type":"ContainerStarted","Data":"7bacc8cb424ec7baa052fffe143ca7d45cb4495cd2c400df21d5f78cbd936709"} Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.151319 5030 generic.go:334] "Generic (PLEG): container finished" podID="ed4653dc-2f44-44ec-a22f-073fd4782f85" containerID="22e68e62988cf1e516c7b5af2b06fb4f49c2ec461e9218371d944061ae0e011c" exitCode=0 Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.151411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" event={"ID":"ed4653dc-2f44-44ec-a22f-073fd4782f85","Type":"ContainerDied","Data":"22e68e62988cf1e516c7b5af2b06fb4f49c2ec461e9218371d944061ae0e011c"} Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.152672 5030 generic.go:334] "Generic (PLEG): container finished" podID="ec81857f-9f0b-427d-8e80-985a39d18721" containerID="d98c71152944a299bb8484276e289c385976f06af8f4f14a45496d662e778bee" exitCode=0 Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.152756 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jhh6k" event={"ID":"ec81857f-9f0b-427d-8e80-985a39d18721","Type":"ContainerDied","Data":"d98c71152944a299bb8484276e289c385976f06af8f4f14a45496d662e778bee"} Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cl4k\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.229633 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") device 
mount path \"/mnt/openstack/pv10\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.231018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 23:45:40.231123 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 23:45:40.231135 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 23:45:40.231171 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift podName:0cb7ff45-e6d7-41cc-9001-de1c7f458a83 nodeName:}" failed. No retries permitted until 2026-01-20 23:45:40.731156675 +0000 UTC m=+4213.051416963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift") pod "swift-storage-0" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83") : configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.231482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.250093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cl4k\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.253609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.431370 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-f4t54"] Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.433337 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.437086 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.437125 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.437374 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.441313 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-f4t54"] Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.494838 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7hx6\" (UniqueName: \"kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6\") pod \"0afa8856-f941-4f8d-a68c-94e605b4825a\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts\") pod \"0afa8856-f941-4f8d-a68c-94e605b4825a\" (UID: \"0afa8856-f941-4f8d-a68c-94e605b4825a\") " Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zg5\" 
(UniqueName: \"kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.542944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.543253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0afa8856-f941-4f8d-a68c-94e605b4825a" (UID: "0afa8856-f941-4f8d-a68c-94e605b4825a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.547391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6" (OuterVolumeSpecName: "kube-api-access-m7hx6") pod "0afa8856-f941-4f8d-a68c-94e605b4825a" (UID: "0afa8856-f941-4f8d-a68c-94e605b4825a"). InnerVolumeSpecName "kube-api-access-m7hx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zg5\" (UniqueName: \"kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644368 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0afa8856-f941-4f8d-a68c-94e605b4825a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.644381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7hx6\" (UniqueName: \"kubernetes.io/projected/0afa8856-f941-4f8d-a68c-94e605b4825a-kube-api-access-m7hx6\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.645329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.645363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.645650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.648353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.649478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.651677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.658969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zg5\" (UniqueName: \"kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5\") pod \"swift-ring-rebalance-f4t54\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.728293 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.745149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 23:45:40.745460 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 23:45:40.745549 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: E0120 
23:45:40.745704 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift podName:0cb7ff45-e6d7-41cc-9001-de1c7f458a83 nodeName:}" failed. No retries permitted until 2026-01-20 23:45:41.745675595 +0000 UTC m=+4214.065935923 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift") pod "swift-storage-0" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83") : configmap "swift-ring-files" not found Jan 20 23:45:40 crc kubenswrapper[5030]: I0120 23:45:40.793952 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.163664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-95nrc" event={"ID":"0afa8856-f941-4f8d-a68c-94e605b4825a","Type":"ContainerDied","Data":"01141a1a8579e32b2caf4d0f1fb60b62b762f1fafe67853641e46a283f0285f8"} Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.164022 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01141a1a8579e32b2caf4d0f1fb60b62b762f1fafe67853641e46a283f0285f8" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.163707 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-95nrc" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.271059 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-f4t54"] Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.477248 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.567683 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts\") pod \"ed4653dc-2f44-44ec-a22f-073fd4782f85\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.567978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sct7j\" (UniqueName: \"kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j\") pod \"ed4653dc-2f44-44ec-a22f-073fd4782f85\" (UID: \"ed4653dc-2f44-44ec-a22f-073fd4782f85\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.568454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed4653dc-2f44-44ec-a22f-073fd4782f85" (UID: "ed4653dc-2f44-44ec-a22f-073fd4782f85"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.569367 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed4653dc-2f44-44ec-a22f-073fd4782f85-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.574529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j" (OuterVolumeSpecName: "kube-api-access-sct7j") pod "ed4653dc-2f44-44ec-a22f-073fd4782f85" (UID: "ed4653dc-2f44-44ec-a22f-073fd4782f85"). InnerVolumeSpecName "kube-api-access-sct7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.634716 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.663897 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.670752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts\") pod \"ec81857f-9f0b-427d-8e80-985a39d18721\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.670801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j28zr\" (UniqueName: \"kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr\") pod \"ec81857f-9f0b-427d-8e80-985a39d18721\" (UID: \"ec81857f-9f0b-427d-8e80-985a39d18721\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.671218 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sct7j\" (UniqueName: \"kubernetes.io/projected/ed4653dc-2f44-44ec-a22f-073fd4782f85-kube-api-access-sct7j\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.671322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec81857f-9f0b-427d-8e80-985a39d18721" (UID: "ec81857f-9f0b-427d-8e80-985a39d18721"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.673726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr" (OuterVolumeSpecName: "kube-api-access-j28zr") pod "ec81857f-9f0b-427d-8e80-985a39d18721" (UID: "ec81857f-9f0b-427d-8e80-985a39d18721"). InnerVolumeSpecName "kube-api-access-j28zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.771208 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.772674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b2zt\" (UniqueName: \"kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt\") pod \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.773759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts\") pod \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\" (UID: \"cae4fe8a-e775-4599-9ca9-ff3be2baa456\") " Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.774088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.774529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cae4fe8a-e775-4599-9ca9-ff3be2baa456" (UID: "cae4fe8a-e775-4599-9ca9-ff3be2baa456"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.775938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt" (OuterVolumeSpecName: "kube-api-access-7b2zt") pod "cae4fe8a-e775-4599-9ca9-ff3be2baa456" (UID: "cae4fe8a-e775-4599-9ca9-ff3be2baa456"). InnerVolumeSpecName "kube-api-access-7b2zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:41 crc kubenswrapper[5030]: E0120 23:45:41.780206 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:45:41 crc kubenswrapper[5030]: E0120 23:45:41.780228 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.780235 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec81857f-9f0b-427d-8e80-985a39d18721-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.780260 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j28zr\" (UniqueName: \"kubernetes.io/projected/ec81857f-9f0b-427d-8e80-985a39d18721-kube-api-access-j28zr\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:41 crc kubenswrapper[5030]: E0120 23:45:41.780270 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift podName:0cb7ff45-e6d7-41cc-9001-de1c7f458a83 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:45:43.780256163 +0000 UTC m=+4216.100516451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift") pod "swift-storage-0" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83") : configmap "swift-ring-files" not found Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.817591 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.882440 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cae4fe8a-e775-4599-9ca9-ff3be2baa456-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:41 crc kubenswrapper[5030]: I0120 23:45:41.882852 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b2zt\" (UniqueName: \"kubernetes.io/projected/cae4fe8a-e775-4599-9ca9-ff3be2baa456-kube-api-access-7b2zt\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.089584 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:45:42 crc kubenswrapper[5030]: E0120 23:45:42.089875 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae4fe8a-e775-4599-9ca9-ff3be2baa456" containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.089891 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae4fe8a-e775-4599-9ca9-ff3be2baa456" containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: E0120 23:45:42.089907 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4653dc-2f44-44ec-a22f-073fd4782f85" containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.089913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4653dc-2f44-44ec-a22f-073fd4782f85" containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: E0120 23:45:42.089926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afa8856-f941-4f8d-a68c-94e605b4825a" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.089931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afa8856-f941-4f8d-a68c-94e605b4825a" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: E0120 23:45:42.089947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec81857f-9f0b-427d-8e80-985a39d18721" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.089954 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec81857f-9f0b-427d-8e80-985a39d18721" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.090089 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4653dc-2f44-44ec-a22f-073fd4782f85" containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.090107 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec81857f-9f0b-427d-8e80-985a39d18721" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.090115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae4fe8a-e775-4599-9ca9-ff3be2baa456" 
containerName="mariadb-account-create-update" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.090129 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afa8856-f941-4f8d-a68c-94e605b4825a" containerName="mariadb-database-create" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.090847 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.093713 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.093846 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.093999 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.094109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-5hlhd" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.113862 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.171571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-jhh6k" event={"ID":"ec81857f-9f0b-427d-8e80-985a39d18721","Type":"ContainerDied","Data":"f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622"} Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.171609 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90d6b7c6d6b900cf2187883b596b5bd47e10a07fba5f69d4ed847b586244622" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.171658 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-jhh6k" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.173263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" event={"ID":"65f20a41-6543-40ee-9462-2c73c1b904ef","Type":"ContainerStarted","Data":"421267c6bfd98f4fe3243005cbd7630d3179d26c45dbcc020099fd31c6f39830"} Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.173299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" event={"ID":"65f20a41-6543-40ee-9462-2c73c1b904ef","Type":"ContainerStarted","Data":"87219c47dbc4e1cbaa2dd7725ea9c2b5fc76c422da52c12fdf2209c8f0df41c1"} Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.175641 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.175657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c" event={"ID":"cae4fe8a-e775-4599-9ca9-ff3be2baa456","Type":"ContainerDied","Data":"7bacc8cb424ec7baa052fffe143ca7d45cb4495cd2c400df21d5f78cbd936709"} Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.175695 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bacc8cb424ec7baa052fffe143ca7d45cb4495cd2c400df21d5f78cbd936709" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.177068 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.177077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs" event={"ID":"ed4653dc-2f44-44ec-a22f-073fd4782f85","Type":"ContainerDied","Data":"23c1fd7b8e80589aecf71e2030f425d380947a7f65d0b95aa13b03faa5176adc"} Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.177112 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23c1fd7b8e80589aecf71e2030f425d380947a7f65d0b95aa13b03faa5176adc" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186476 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmj9c\" (UniqueName: \"kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.186681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.206430 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" podStartSLOduration=2.206409351 podStartE2EDuration="2.206409351s" podCreationTimestamp="2026-01-20 23:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:42.199471233 +0000 UTC m=+4214.519731521" watchObservedRunningTime="2026-01-20 23:45:42.206409351 +0000 UTC m=+4214.526669649" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.287791 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.287934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmj9c\" (UniqueName: \"kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.287980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.288007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.288025 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.288129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.288155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.288806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.289365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.289732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.295302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.297212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.303330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.318386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmj9c\" (UniqueName: \"kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c\") pod \"ovn-northd-0\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.412682 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:42 crc kubenswrapper[5030]: I0120 23:45:42.897863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.190079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerStarted","Data":"ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510"} Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.190453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerStarted","Data":"6699c0d88f30ad882905fac2311b9f3cfe64fc8e840c6009b6200b25aebe07e4"} Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.578302 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql"] Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.579373 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.581969 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.593819 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-knct4"] Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.595118 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.604477 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql"] Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.614244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvmj\" (UniqueName: \"kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.614365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.617576 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-knct4"] Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.715921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtf59\" (UniqueName: \"kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.715985 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kxvmj\" (UniqueName: \"kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.716085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.716119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.716816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.735128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvmj\" (UniqueName: \"kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj\") pod \"glance-bd31-account-create-update-rg5ql\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.817761 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.817835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.817948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtf59\" (UniqueName: \"kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: E0120 23:45:43.818050 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:45:43 crc kubenswrapper[5030]: E0120 23:45:43.818085 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:45:43 crc kubenswrapper[5030]: E0120 23:45:43.818147 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift podName:0cb7ff45-e6d7-41cc-9001-de1c7f458a83 nodeName:}" failed. No retries permitted until 2026-01-20 23:45:47.818125798 +0000 UTC m=+4220.138386096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift") pod "swift-storage-0" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83") : configmap "swift-ring-files" not found Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.818425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.847235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtf59\" (UniqueName: \"kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59\") pod \"glance-db-create-knct4\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.896550 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:43 crc kubenswrapper[5030]: I0120 23:45:43.913425 5030 util.go:30] "No sandbox for pod can be found. 
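
Taken together, the four etc-swift failures show the retry delay doubling: durationBeforeRetry 500ms at 23:45:40.231, 1s at 23:45:40.745, 2s at 23:45:41.780 and 4s at 23:45:43.818. The sketch below only illustrates that doubling-with-a-cap progression; the cap value is an assumption and this is not kubelet's own backoff implementation.

    // Sketch of the doubling retry delay visible in the etc-swift errors above
    // (500ms -> 1s -> 2s -> 4s). Illustration of the pattern only; the cap is an
    // arbitrary assumption and this is not kubelet's own backoff implementation.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond
        maxDelay := 2 * time.Minute // assumed cap, for illustration only
        for attempt := 1; attempt <= 4; attempt++ {
            fmt.Printf("attempt %d failed, durationBeforeRetry %s\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

If the swift-ring-files ConfigMap is still missing when the attempt scheduled for 23:45:47.818 runs, the next entry of this kind would presumably show an 8s delay.
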
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:44 crc kubenswrapper[5030]: I0120 23:45:44.205065 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerStarted","Data":"5ffdf2f065bf633d15e59103da0d96e1da51576f3b2d1cd447a73672d69bdc29"} Jan 20 23:45:44 crc kubenswrapper[5030]: I0120 23:45:44.206165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:44 crc kubenswrapper[5030]: I0120 23:45:44.227130 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.22711416 podStartE2EDuration="2.22711416s" podCreationTimestamp="2026-01-20 23:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:44.221987246 +0000 UTC m=+4216.542247534" watchObservedRunningTime="2026-01-20 23:45:44.22711416 +0000 UTC m=+4216.547374448" Jan 20 23:45:44 crc kubenswrapper[5030]: I0120 23:45:44.349402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql"] Jan 20 23:45:44 crc kubenswrapper[5030]: I0120 23:45:44.434283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-knct4"] Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.222422 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ea9be16-3983-4590-943a-0870a9dfd9c3" containerID="83ba40878af4490661874b5ada4f4f7d502d1cbc85a36fc133c30b8c8c4ae107" exitCode=0 Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.222533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-knct4" event={"ID":"1ea9be16-3983-4590-943a-0870a9dfd9c3","Type":"ContainerDied","Data":"83ba40878af4490661874b5ada4f4f7d502d1cbc85a36fc133c30b8c8c4ae107"} Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.223102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-knct4" event={"ID":"1ea9be16-3983-4590-943a-0870a9dfd9c3","Type":"ContainerStarted","Data":"aa8e6736c7d0248476c4d052a93e05905f53152a6e0018e1ae3053cbffc93bb2"} Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.226825 5030 generic.go:334] "Generic (PLEG): container finished" podID="10ee2580-c430-43ee-bd0a-676053beeb00" containerID="281753cf2e936ebb772453915aa4f8dbf5ea0e0af7646cff35eb71a2c1c0f0f4" exitCode=0 Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.226941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" event={"ID":"10ee2580-c430-43ee-bd0a-676053beeb00","Type":"ContainerDied","Data":"281753cf2e936ebb772453915aa4f8dbf5ea0e0af7646cff35eb71a2c1c0f0f4"} Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.227010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" event={"ID":"10ee2580-c430-43ee-bd0a-676053beeb00","Type":"ContainerStarted","Data":"e3d8cf4d459996711956765ceffd8582e25a6162d487ba994627a0ec7b37161e"} Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.382479 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bqtr9"] Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.384178 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.388401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.395614 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bqtr9"] Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.455187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.455253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67hs\" (UniqueName: \"kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.556426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.556499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v67hs\" (UniqueName: \"kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.557970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.589046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67hs\" (UniqueName: \"kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs\") pod \"root-account-create-update-bqtr9\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.710085 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.951223 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.954197 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:45 crc kubenswrapper[5030]: I0120 23:45:45.980590 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.073643 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.073715 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49dhb\" (UniqueName: \"kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.073753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.175160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.175226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49dhb\" (UniqueName: \"kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.175265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.175754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.175964 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.194668 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-49dhb\" (UniqueName: \"kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb\") pod \"redhat-marketplace-gbdnv\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.267529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bqtr9"] Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.294395 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:46 crc kubenswrapper[5030]: I0120 23:45:46.832210 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:46 crc kubenswrapper[5030]: W0120 23:45:46.851475 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfb2910_03b5_479f_a21f_70f27512580d.slice/crio-247b1c6378a7f53bc434c57f8e7e3084a4c4af60d80fe7d9861b1091c47bd1af WatchSource:0}: Error finding container 247b1c6378a7f53bc434c57f8e7e3084a4c4af60d80fe7d9861b1091c47bd1af: Status 404 returned error can't find the container with id 247b1c6378a7f53bc434c57f8e7e3084a4c4af60d80fe7d9861b1091c47bd1af Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.163378 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.195784 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.249341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" event={"ID":"10ee2580-c430-43ee-bd0a-676053beeb00","Type":"ContainerDied","Data":"e3d8cf4d459996711956765ceffd8582e25a6162d487ba994627a0ec7b37161e"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.249377 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.249385 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3d8cf4d459996711956765ceffd8582e25a6162d487ba994627a0ec7b37161e" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.251401 5030 generic.go:334] "Generic (PLEG): container finished" podID="a611922a-c883-4ae3-a9a9-823b11d6fab7" containerID="2d2794b65b2db7f4a9c84934ca6eef3c2896bd5ee5b8868af3e0f7db3cbf6b41" exitCode=0 Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.251467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" event={"ID":"a611922a-c883-4ae3-a9a9-823b11d6fab7","Type":"ContainerDied","Data":"2d2794b65b2db7f4a9c84934ca6eef3c2896bd5ee5b8868af3e0f7db3cbf6b41"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.251496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" event={"ID":"a611922a-c883-4ae3-a9a9-823b11d6fab7","Type":"ContainerStarted","Data":"b40640d09144b819dca342772e8e598edb1681230b3acae2057c135b75d3e797"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.258170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerStarted","Data":"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.258217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerStarted","Data":"247b1c6378a7f53bc434c57f8e7e3084a4c4af60d80fe7d9861b1091c47bd1af"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.260387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-knct4" event={"ID":"1ea9be16-3983-4590-943a-0870a9dfd9c3","Type":"ContainerDied","Data":"aa8e6736c7d0248476c4d052a93e05905f53152a6e0018e1ae3053cbffc93bb2"} Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.260415 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa8e6736c7d0248476c4d052a93e05905f53152a6e0018e1ae3053cbffc93bb2" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.260418 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-knct4" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.301256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxvmj\" (UniqueName: \"kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj\") pod \"10ee2580-c430-43ee-bd0a-676053beeb00\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.301345 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts\") pod \"10ee2580-c430-43ee-bd0a-676053beeb00\" (UID: \"10ee2580-c430-43ee-bd0a-676053beeb00\") " Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.301402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts\") pod \"1ea9be16-3983-4590-943a-0870a9dfd9c3\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.301544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtf59\" (UniqueName: \"kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59\") pod \"1ea9be16-3983-4590-943a-0870a9dfd9c3\" (UID: \"1ea9be16-3983-4590-943a-0870a9dfd9c3\") " Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.302357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10ee2580-c430-43ee-bd0a-676053beeb00" (UID: "10ee2580-c430-43ee-bd0a-676053beeb00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.302358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1ea9be16-3983-4590-943a-0870a9dfd9c3" (UID: "1ea9be16-3983-4590-943a-0870a9dfd9c3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.307075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59" (OuterVolumeSpecName: "kube-api-access-mtf59") pod "1ea9be16-3983-4590-943a-0870a9dfd9c3" (UID: "1ea9be16-3983-4590-943a-0870a9dfd9c3"). InnerVolumeSpecName "kube-api-access-mtf59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.307191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj" (OuterVolumeSpecName: "kube-api-access-kxvmj") pod "10ee2580-c430-43ee-bd0a-676053beeb00" (UID: "10ee2580-c430-43ee-bd0a-676053beeb00"). InnerVolumeSpecName "kube-api-access-kxvmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.403827 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ea9be16-3983-4590-943a-0870a9dfd9c3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.404178 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtf59\" (UniqueName: \"kubernetes.io/projected/1ea9be16-3983-4590-943a-0870a9dfd9c3-kube-api-access-mtf59\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.404198 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxvmj\" (UniqueName: \"kubernetes.io/projected/10ee2580-c430-43ee-bd0a-676053beeb00-kube-api-access-kxvmj\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.404213 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10ee2580-c430-43ee-bd0a-676053beeb00-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:47 crc kubenswrapper[5030]: I0120 23:45:47.911929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:47 crc kubenswrapper[5030]: E0120 23:45:47.912120 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:45:47 crc kubenswrapper[5030]: E0120 23:45:47.912160 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:45:47 crc kubenswrapper[5030]: E0120 23:45:47.912211 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift podName:0cb7ff45-e6d7-41cc-9001-de1c7f458a83 nodeName:}" failed. No retries permitted until 2026-01-20 23:45:55.912196724 +0000 UTC m=+4228.232457012 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift") pod "swift-storage-0" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83") : configmap "swift-ring-files" not found Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.273567 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dfb2910-03b5-479f-a21f-70f27512580d" containerID="006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178" exitCode=0 Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.273645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerDied","Data":"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178"} Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.713126 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.829369 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts\") pod \"a611922a-c883-4ae3-a9a9-823b11d6fab7\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.829478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v67hs\" (UniqueName: \"kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs\") pod \"a611922a-c883-4ae3-a9a9-823b11d6fab7\" (UID: \"a611922a-c883-4ae3-a9a9-823b11d6fab7\") " Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.830007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a611922a-c883-4ae3-a9a9-823b11d6fab7" (UID: "a611922a-c883-4ae3-a9a9-823b11d6fab7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.835553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs" (OuterVolumeSpecName: "kube-api-access-v67hs") pod "a611922a-c883-4ae3-a9a9-823b11d6fab7" (UID: "a611922a-c883-4ae3-a9a9-823b11d6fab7"). InnerVolumeSpecName "kube-api-access-v67hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.896338 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-wcfd8"] Jan 20 23:45:48 crc kubenswrapper[5030]: E0120 23:45:48.896750 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea9be16-3983-4590-943a-0870a9dfd9c3" containerName="mariadb-database-create" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.896766 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea9be16-3983-4590-943a-0870a9dfd9c3" containerName="mariadb-database-create" Jan 20 23:45:48 crc kubenswrapper[5030]: E0120 23:45:48.896780 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a611922a-c883-4ae3-a9a9-823b11d6fab7" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.896791 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a611922a-c883-4ae3-a9a9-823b11d6fab7" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: E0120 23:45:48.896807 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ee2580-c430-43ee-bd0a-676053beeb00" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.896815 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ee2580-c430-43ee-bd0a-676053beeb00" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.897023 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a611922a-c883-4ae3-a9a9-823b11d6fab7" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.897045 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea9be16-3983-4590-943a-0870a9dfd9c3" 
containerName="mariadb-database-create" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.897063 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ee2580-c430-43ee-bd0a-676053beeb00" containerName="mariadb-account-create-update" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.897694 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.909316 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.909428 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-hsjcl" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.931217 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a611922a-c883-4ae3-a9a9-823b11d6fab7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.931252 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v67hs\" (UniqueName: \"kubernetes.io/projected/a611922a-c883-4ae3-a9a9-823b11d6fab7-kube-api-access-v67hs\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:48 crc kubenswrapper[5030]: I0120 23:45:48.933366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-wcfd8"] Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.033035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.033484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.033731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.033885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrb4\" (UniqueName: \"kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.135346 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" 
Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.135399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.135449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.135499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrb4\" (UniqueName: \"kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.140440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.141570 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.143186 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.155668 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrb4\" (UniqueName: \"kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4\") pod \"glance-db-sync-wcfd8\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.213765 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.329220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" event={"ID":"a611922a-c883-4ae3-a9a9-823b11d6fab7","Type":"ContainerDied","Data":"b40640d09144b819dca342772e8e598edb1681230b3acae2057c135b75d3e797"} Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.329236 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-bqtr9" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.329278 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b40640d09144b819dca342772e8e598edb1681230b3acae2057c135b75d3e797" Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.332402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerStarted","Data":"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55"} Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.336353 5030 generic.go:334] "Generic (PLEG): container finished" podID="65f20a41-6543-40ee-9462-2c73c1b904ef" containerID="421267c6bfd98f4fe3243005cbd7630d3179d26c45dbcc020099fd31c6f39830" exitCode=0 Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.336834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" event={"ID":"65f20a41-6543-40ee-9462-2c73c1b904ef","Type":"ContainerDied","Data":"421267c6bfd98f4fe3243005cbd7630d3179d26c45dbcc020099fd31c6f39830"} Jan 20 23:45:49 crc kubenswrapper[5030]: I0120 23:45:49.651509 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-wcfd8"] Jan 20 23:45:49 crc kubenswrapper[5030]: W0120 23:45:49.660113 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56413b06_4751_4af2_a91e_d70b46e32c20.slice/crio-bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde WatchSource:0}: Error finding container bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde: Status 404 returned error can't find the container with id bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.351605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" event={"ID":"56413b06-4751-4af2-a91e-d70b46e32c20","Type":"ContainerStarted","Data":"86d552f2b139f590f704d703a14acee5de646739e003c7b6988c201a7f83dbc7"} Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.352236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" event={"ID":"56413b06-4751-4af2-a91e-d70b46e32c20","Type":"ContainerStarted","Data":"bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde"} Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.354505 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dfb2910-03b5-479f-a21f-70f27512580d" containerID="b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55" exitCode=0 Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.354612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerDied","Data":"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55"} Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.379135 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" podStartSLOduration=2.379117729 podStartE2EDuration="2.379117729s" podCreationTimestamp="2026-01-20 23:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:45:50.373752569 +0000 UTC m=+4222.694012877" watchObservedRunningTime="2026-01-20 23:45:50.379117729 +0000 UTC m=+4222.699378017" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.755710 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6zg5\" (UniqueName: \"kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.864937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.865492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.865932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf\") pod \"65f20a41-6543-40ee-9462-2c73c1b904ef\" (UID: \"65f20a41-6543-40ee-9462-2c73c1b904ef\") " Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.866732 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.872023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.882826 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5" (OuterVolumeSpecName: "kube-api-access-w6zg5") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "kube-api-access-w6zg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.885822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.887853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts" (OuterVolumeSpecName: "scripts") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.892064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.914498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "65f20a41-6543-40ee-9462-2c73c1b904ef" (UID: "65f20a41-6543-40ee-9462-2c73c1b904ef"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.967827 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f20a41-6543-40ee-9462-2c73c1b904ef-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.968100 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65f20a41-6543-40ee-9462-2c73c1b904ef-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.968178 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.968297 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.968375 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65f20a41-6543-40ee-9462-2c73c1b904ef-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:50 crc kubenswrapper[5030]: I0120 23:45:50.968442 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6zg5\" (UniqueName: \"kubernetes.io/projected/65f20a41-6543-40ee-9462-2c73c1b904ef-kube-api-access-w6zg5\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.364523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" event={"ID":"65f20a41-6543-40ee-9462-2c73c1b904ef","Type":"ContainerDied","Data":"87219c47dbc4e1cbaa2dd7725ea9c2b5fc76c422da52c12fdf2209c8f0df41c1"} Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.366395 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87219c47dbc4e1cbaa2dd7725ea9c2b5fc76c422da52c12fdf2209c8f0df41c1" Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.367530 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-f4t54" Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.372995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerStarted","Data":"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6"} Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.409401 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbdnv" podStartSLOduration=3.859190547 podStartE2EDuration="6.409373431s" podCreationTimestamp="2026-01-20 23:45:45 +0000 UTC" firstStartedPulling="2026-01-20 23:45:48.276085442 +0000 UTC m=+4220.596345750" lastFinishedPulling="2026-01-20 23:45:50.826268346 +0000 UTC m=+4223.146528634" observedRunningTime="2026-01-20 23:45:51.404390571 +0000 UTC m=+4223.724650859" watchObservedRunningTime="2026-01-20 23:45:51.409373431 +0000 UTC m=+4223.729633719" Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.723610 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bqtr9"] Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.732492 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-bqtr9"] Jan 20 23:45:51 crc kubenswrapper[5030]: I0120 23:45:51.972551 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a611922a-c883-4ae3-a9a9-823b11d6fab7" path="/var/lib/kubelet/pods/a611922a-c883-4ae3-a9a9-823b11d6fab7/volumes" Jan 20 23:45:53 crc kubenswrapper[5030]: I0120 23:45:53.398593 5030 generic.go:334] "Generic (PLEG): container finished" podID="56413b06-4751-4af2-a91e-d70b46e32c20" containerID="86d552f2b139f590f704d703a14acee5de646739e003c7b6988c201a7f83dbc7" exitCode=0 Jan 20 23:45:53 crc kubenswrapper[5030]: I0120 23:45:53.398721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" event={"ID":"56413b06-4751-4af2-a91e-d70b46e32c20","Type":"ContainerDied","Data":"86d552f2b139f590f704d703a14acee5de646739e003c7b6988c201a7f83dbc7"} Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.856574 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.940539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle\") pod \"56413b06-4751-4af2-a91e-d70b46e32c20\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.940765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data\") pod \"56413b06-4751-4af2-a91e-d70b46e32c20\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.940868 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data\") pod \"56413b06-4751-4af2-a91e-d70b46e32c20\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.940981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrb4\" (UniqueName: \"kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4\") pod \"56413b06-4751-4af2-a91e-d70b46e32c20\" (UID: \"56413b06-4751-4af2-a91e-d70b46e32c20\") " Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.947715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "56413b06-4751-4af2-a91e-d70b46e32c20" (UID: "56413b06-4751-4af2-a91e-d70b46e32c20"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.959042 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4" (OuterVolumeSpecName: "kube-api-access-2lrb4") pod "56413b06-4751-4af2-a91e-d70b46e32c20" (UID: "56413b06-4751-4af2-a91e-d70b46e32c20"). InnerVolumeSpecName "kube-api-access-2lrb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:54 crc kubenswrapper[5030]: I0120 23:45:54.981887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56413b06-4751-4af2-a91e-d70b46e32c20" (UID: "56413b06-4751-4af2-a91e-d70b46e32c20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.006946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data" (OuterVolumeSpecName: "config-data") pod "56413b06-4751-4af2-a91e-d70b46e32c20" (UID: "56413b06-4751-4af2-a91e-d70b46e32c20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.043537 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.043582 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.043615 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56413b06-4751-4af2-a91e-d70b46e32c20-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.043669 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrb4\" (UniqueName: \"kubernetes.io/projected/56413b06-4751-4af2-a91e-d70b46e32c20-kube-api-access-2lrb4\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.425971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" event={"ID":"56413b06-4751-4af2-a91e-d70b46e32c20","Type":"ContainerDied","Data":"bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde"} Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.426214 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd97eb0428b5cfc79da1efa90b520db7bc242ed756c498696f2944a74c34ffde" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.426273 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-wcfd8" Jan 20 23:45:55 crc kubenswrapper[5030]: I0120 23:45:55.960192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.244365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"swift-storage-0\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.295913 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.295974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.296056 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.370187 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.502878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.605401 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.730450 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8ftc"] Jan 20 23:45:56 crc kubenswrapper[5030]: E0120 23:45:56.730855 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f20a41-6543-40ee-9462-2c73c1b904ef" containerName="swift-ring-rebalance" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.730876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f20a41-6543-40ee-9462-2c73c1b904ef" containerName="swift-ring-rebalance" Jan 20 23:45:56 crc kubenswrapper[5030]: E0120 23:45:56.730897 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56413b06-4751-4af2-a91e-d70b46e32c20" containerName="glance-db-sync" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.730906 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56413b06-4751-4af2-a91e-d70b46e32c20" containerName="glance-db-sync" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.731103 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f20a41-6543-40ee-9462-2c73c1b904ef" containerName="swift-ring-rebalance" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.731136 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="56413b06-4751-4af2-a91e-d70b46e32c20" containerName="glance-db-sync" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.731747 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.734610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.751023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8ftc"] Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.774227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqfk\" (UniqueName: \"kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.774292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.807892 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.875738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqfk\" (UniqueName: \"kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.875795 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.876600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:56 crc kubenswrapper[5030]: I0120 23:45:56.897756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqfk\" (UniqueName: \"kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk\") pod \"root-account-create-update-g8ftc\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.057066 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.458967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40"} Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.459234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b"} Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.459245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"24da4819f9fb671a134974d99647e69b22fc2baef5e09e52f91aa2ac3e827c13"} Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.516553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8ftc"] Jan 20 23:45:57 crc kubenswrapper[5030]: W0120 23:45:57.546249 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod629f5916_4311_4b35_a772_d7dcced41284.slice/crio-f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f WatchSource:0}: Error finding container f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f: Status 404 returned error can't find the container with id f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f Jan 20 23:45:57 crc kubenswrapper[5030]: I0120 23:45:57.656201 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.467880 5030 generic.go:334] "Generic (PLEG): container finished" podID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerID="f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c" exitCode=0 Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.467970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerDied","Data":"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c"} Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.471589 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" event={"ID":"629f5916-4311-4b35-a772-d7dcced41284","Type":"ContainerStarted","Data":"13735290db958c0573e03fd426dea1aa03063d2ff76782514b3c6f879976a7e7"} Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.471662 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" event={"ID":"629f5916-4311-4b35-a772-d7dcced41284","Type":"ContainerStarted","Data":"f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f"} Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.475525 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbdnv" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="registry-server" containerID="cri-o://fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6" gracePeriod=2 Jan 20 23:45:58 crc 
kubenswrapper[5030]: I0120 23:45:58.475787 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557"} Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.521382 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" podStartSLOduration=2.521365759 podStartE2EDuration="2.521365759s" podCreationTimestamp="2026-01-20 23:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:58.514990554 +0000 UTC m=+4230.835250842" watchObservedRunningTime="2026-01-20 23:45:58.521365759 +0000 UTC m=+4230.841626047" Jan 20 23:45:58 crc kubenswrapper[5030]: I0120 23:45:58.962811 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.014200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities\") pod \"5dfb2910-03b5-479f-a21f-70f27512580d\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.014448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content\") pod \"5dfb2910-03b5-479f-a21f-70f27512580d\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.014500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49dhb\" (UniqueName: \"kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb\") pod \"5dfb2910-03b5-479f-a21f-70f27512580d\" (UID: \"5dfb2910-03b5-479f-a21f-70f27512580d\") " Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.015173 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities" (OuterVolumeSpecName: "utilities") pod "5dfb2910-03b5-479f-a21f-70f27512580d" (UID: "5dfb2910-03b5-479f-a21f-70f27512580d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.015401 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.029328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb" (OuterVolumeSpecName: "kube-api-access-49dhb") pod "5dfb2910-03b5-479f-a21f-70f27512580d" (UID: "5dfb2910-03b5-479f-a21f-70f27512580d"). InnerVolumeSpecName "kube-api-access-49dhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.045361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dfb2910-03b5-479f-a21f-70f27512580d" (UID: "5dfb2910-03b5-479f-a21f-70f27512580d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.117135 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dfb2910-03b5-479f-a21f-70f27512580d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.117169 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49dhb\" (UniqueName: \"kubernetes.io/projected/5dfb2910-03b5-479f-a21f-70f27512580d-kube-api-access-49dhb\") on node \"crc\" DevicePath \"\"" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.487312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerStarted","Data":"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.487561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.488620 5030 generic.go:334] "Generic (PLEG): container finished" podID="629f5916-4311-4b35-a772-d7dcced41284" containerID="13735290db958c0573e03fd426dea1aa03063d2ff76782514b3c6f879976a7e7" exitCode=0 Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.488691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" event={"ID":"629f5916-4311-4b35-a772-d7dcced41284","Type":"ContainerDied","Data":"13735290db958c0573e03fd426dea1aa03063d2ff76782514b3c6f879976a7e7"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.491856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.495553 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd483381-3919-4ade-ac69-c8abde2869a6" containerID="11fee335c42cdc9d15ce4fe859a0bea109b4a8f81b0b4f42812b3dd2aaaede89" exitCode=0 Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.495617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerDied","Data":"11fee335c42cdc9d15ce4fe859a0bea109b4a8f81b0b4f42812b3dd2aaaede89"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.497733 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dfb2910-03b5-479f-a21f-70f27512580d" containerID="fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6" exitCode=0 Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.497760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerDied","Data":"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.497778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbdnv" event={"ID":"5dfb2910-03b5-479f-a21f-70f27512580d","Type":"ContainerDied","Data":"247b1c6378a7f53bc434c57f8e7e3084a4c4af60d80fe7d9861b1091c47bd1af"} Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.497793 5030 scope.go:117] "RemoveContainer" containerID="fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.497791 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbdnv" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.520048 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=37.520024675 podStartE2EDuration="37.520024675s" podCreationTimestamp="2026-01-20 23:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:45:59.512620135 +0000 UTC m=+4231.832880433" watchObservedRunningTime="2026-01-20 23:45:59.520024675 +0000 UTC m=+4231.840284973" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.624563 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.632922 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbdnv"] Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.633517 5030 scope.go:117] "RemoveContainer" containerID="b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.663660 5030 scope.go:117] "RemoveContainer" containerID="006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.682424 5030 scope.go:117] "RemoveContainer" containerID="fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6" Jan 20 23:45:59 crc kubenswrapper[5030]: E0120 23:45:59.682868 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6\": container with ID starting with fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6 not found: ID does not exist" containerID="fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.682900 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6"} err="failed to get container status \"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6\": rpc error: code = NotFound desc = could not find container \"fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6\": container with ID starting with fc28d670a0e09fae24c2fd7af0d375a4313585b2ce973e787a027cba51f463f6 not found: ID does not exist" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.682931 5030 scope.go:117] "RemoveContainer" containerID="b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55" Jan 20 23:45:59 crc kubenswrapper[5030]: E0120 23:45:59.684507 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55\": container with ID starting with b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55 not found: ID does not exist" containerID="b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.684548 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55"} err="failed to get container status 
\"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55\": rpc error: code = NotFound desc = could not find container \"b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55\": container with ID starting with b3fb84b847e1711d51151f64b59a2c7daef045817d67518cb0d4035793df6a55 not found: ID does not exist" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.684571 5030 scope.go:117] "RemoveContainer" containerID="006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178" Jan 20 23:45:59 crc kubenswrapper[5030]: E0120 23:45:59.684852 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178\": container with ID starting with 006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178 not found: ID does not exist" containerID="006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.684874 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178"} err="failed to get container status \"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178\": rpc error: code = NotFound desc = could not find container \"006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178\": container with ID starting with 006de6542578d5eeddf74de08cc47876e56fc1a9b1dbeb942cd81e8f2252e178 not found: ID does not exist" Jan 20 23:45:59 crc kubenswrapper[5030]: I0120 23:45:59.971698 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" path="/var/lib/kubelet/pods/5dfb2910-03b5-479f-a21f-70f27512580d/volumes" Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.512313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.512661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.512679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.512691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.512703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.514316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" 
event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerStarted","Data":"59a819026ca38754d2a26af58c705565136f6b307a0c53e7ac0b9eba9ecae242"} Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.543232 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.543216526 podStartE2EDuration="37.543216526s" podCreationTimestamp="2026-01-20 23:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:00.536607105 +0000 UTC m=+4232.856867403" watchObservedRunningTime="2026-01-20 23:46:00.543216526 +0000 UTC m=+4232.863476814" Jan 20 23:46:00 crc kubenswrapper[5030]: I0120 23:46:00.871879 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.048984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqfk\" (UniqueName: \"kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk\") pod \"629f5916-4311-4b35-a772-d7dcced41284\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.049112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts\") pod \"629f5916-4311-4b35-a772-d7dcced41284\" (UID: \"629f5916-4311-4b35-a772-d7dcced41284\") " Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.050516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "629f5916-4311-4b35-a772-d7dcced41284" (UID: "629f5916-4311-4b35-a772-d7dcced41284"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.054544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk" (OuterVolumeSpecName: "kube-api-access-vzqfk") pod "629f5916-4311-4b35-a772-d7dcced41284" (UID: "629f5916-4311-4b35-a772-d7dcced41284"). InnerVolumeSpecName "kube-api-access-vzqfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.151162 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqfk\" (UniqueName: \"kubernetes.io/projected/629f5916-4311-4b35-a772-d7dcced41284-kube-api-access-vzqfk\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.151197 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/629f5916-4311-4b35-a772-d7dcced41284-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.529883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" event={"ID":"629f5916-4311-4b35-a772-d7dcced41284","Type":"ContainerDied","Data":"f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f"} Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.530531 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7731045c514e82c08d3826e9ba86ac1e42eee3ba397df2b2a6704afaf43248f" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.529938 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-g8ftc" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.542224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerStarted","Data":"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5"} Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.602602 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=21.602581104 podStartE2EDuration="21.602581104s" podCreationTimestamp="2026-01-20 23:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:01.593879453 +0000 UTC m=+4233.914139751" watchObservedRunningTime="2026-01-20 23:46:01.602581104 +0000 UTC m=+4233.922841402" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797213 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:46:01 crc kubenswrapper[5030]: E0120 23:46:01.797499 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="extract-content" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797510 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="extract-content" Jan 20 23:46:01 crc kubenswrapper[5030]: E0120 23:46:01.797521 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="extract-utilities" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797528 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="extract-utilities" Jan 20 23:46:01 crc kubenswrapper[5030]: E0120 23:46:01.797536 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="registry-server" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797542 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="registry-server" Jan 20 23:46:01 crc kubenswrapper[5030]: E0120 23:46:01.797560 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="629f5916-4311-4b35-a772-d7dcced41284" containerName="mariadb-account-create-update" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797567 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="629f5916-4311-4b35-a772-d7dcced41284" containerName="mariadb-account-create-update" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797735 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfb2910-03b5-479f-a21f-70f27512580d" containerName="registry-server" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.797750 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="629f5916-4311-4b35-a772-d7dcced41284" containerName="mariadb-account-create-update" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.798502 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.801361 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.846301 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.962607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.963831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.963891 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkfrg\" (UniqueName: \"kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:01 crc kubenswrapper[5030]: I0120 23:46:01.964097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.066158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.066369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.066446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.066469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkfrg\" (UniqueName: \"kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.068261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.068404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.069129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.086417 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkfrg\" (UniqueName: \"kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg\") pod \"dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.113088 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.531758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:46:02 crc kubenswrapper[5030]: W0120 23:46:02.536889 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb30c4233_78ba_46a4_a091_ad9613da839f.slice/crio-a63bf7ae68c3da2a0456deb327cf635f241aa975e7e24c71c406bd8493c9fb7c WatchSource:0}: Error finding container a63bf7ae68c3da2a0456deb327cf635f241aa975e7e24c71c406bd8493c9fb7c: Status 404 returned error can't find the container with id a63bf7ae68c3da2a0456deb327cf635f241aa975e7e24c71c406bd8493c9fb7c Jan 20 23:46:02 crc kubenswrapper[5030]: I0120 23:46:02.551593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" event={"ID":"b30c4233-78ba-46a4-a091-ad9613da839f","Type":"ContainerStarted","Data":"a63bf7ae68c3da2a0456deb327cf635f241aa975e7e24c71c406bd8493c9fb7c"} Jan 20 23:46:03 crc kubenswrapper[5030]: I0120 23:46:03.560132 5030 generic.go:334] "Generic (PLEG): container finished" podID="b30c4233-78ba-46a4-a091-ad9613da839f" containerID="8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905" exitCode=0 Jan 20 23:46:03 crc kubenswrapper[5030]: I0120 23:46:03.560209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" event={"ID":"b30c4233-78ba-46a4-a091-ad9613da839f","Type":"ContainerDied","Data":"8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905"} Jan 20 23:46:04 crc kubenswrapper[5030]: I0120 23:46:04.572972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" event={"ID":"b30c4233-78ba-46a4-a091-ad9613da839f","Type":"ContainerStarted","Data":"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7"} Jan 20 23:46:04 crc kubenswrapper[5030]: I0120 23:46:04.573565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:04 crc kubenswrapper[5030]: I0120 23:46:04.603119 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" podStartSLOduration=3.603098163 podStartE2EDuration="3.603098163s" podCreationTimestamp="2026-01-20 23:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:04.600557261 +0000 UTC m=+4236.920817569" watchObservedRunningTime="2026-01-20 23:46:04.603098163 +0000 UTC m=+4236.923358461" Jan 20 23:46:05 crc kubenswrapper[5030]: I0120 23:46:05.504217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.115010 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.211906 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.212215 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="dnsmasq-dns" containerID="cri-o://32b66f89314cc7517837e61d98b70ca2ab834e88a6527f2a3fe2792e84855aef" gracePeriod=10 Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.661303 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerID="32b66f89314cc7517837e61d98b70ca2ab834e88a6527f2a3fe2792e84855aef" exitCode=0 Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.661341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" event={"ID":"fa8f718e-3edd-4de0-9125-7d981584e6e0","Type":"ContainerDied","Data":"32b66f89314cc7517837e61d98b70ca2ab834e88a6527f2a3fe2792e84855aef"} Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.661813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" event={"ID":"fa8f718e-3edd-4de0-9125-7d981584e6e0","Type":"ContainerDied","Data":"02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5"} Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.661864 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02453a9d42983992c321bc6fc95fdeba4b8e233c6cc170d345f57bb867c223c5" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.733075 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.871402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc\") pod \"fa8f718e-3edd-4de0-9125-7d981584e6e0\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.871505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5w6\" (UniqueName: \"kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6\") pod \"fa8f718e-3edd-4de0-9125-7d981584e6e0\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.871590 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config\") pod \"fa8f718e-3edd-4de0-9125-7d981584e6e0\" (UID: \"fa8f718e-3edd-4de0-9125-7d981584e6e0\") " Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.877824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6" (OuterVolumeSpecName: "kube-api-access-mc5w6") pod "fa8f718e-3edd-4de0-9125-7d981584e6e0" (UID: "fa8f718e-3edd-4de0-9125-7d981584e6e0"). InnerVolumeSpecName "kube-api-access-mc5w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.924757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config" (OuterVolumeSpecName: "config") pod "fa8f718e-3edd-4de0-9125-7d981584e6e0" (UID: "fa8f718e-3edd-4de0-9125-7d981584e6e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.943807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "fa8f718e-3edd-4de0-9125-7d981584e6e0" (UID: "fa8f718e-3edd-4de0-9125-7d981584e6e0"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.973574 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5w6\" (UniqueName: \"kubernetes.io/projected/fa8f718e-3edd-4de0-9125-7d981584e6e0-kube-api-access-mc5w6\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.973600 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:12 crc kubenswrapper[5030]: I0120 23:46:12.973609 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/fa8f718e-3edd-4de0-9125-7d981584e6e0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:13 crc kubenswrapper[5030]: I0120 23:46:13.676055 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w" Jan 20 23:46:13 crc kubenswrapper[5030]: I0120 23:46:13.736393 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:46:13 crc kubenswrapper[5030]: I0120 23:46:13.749981 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8sp2w"] Jan 20 23:46:13 crc kubenswrapper[5030]: I0120 23:46:13.981073 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" path="/var/lib/kubelet/pods/fa8f718e-3edd-4de0-9125-7d981584e6e0/volumes" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.212298 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.709154 5030 scope.go:117] "RemoveContainer" containerID="3ce538f9112db0591af690b27ecb9a574dd0329fa9ce2db3d26567ffb5981503" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.748227 5030 scope.go:117] "RemoveContainer" containerID="61c00bd05f4c308767a18aa48966b356a7b2f685e21d882aac3bfee2f238374c" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.805514 5030 scope.go:117] "RemoveContainer" containerID="b550427f5955b0f992f9ff6bc36ba4407ad531409b6dbe72aa177a9c214aa9df" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.858904 5030 scope.go:117] "RemoveContainer" containerID="07bb1e77795afa2d8f3e0e1429c41dbf48e1729d1bcdb8874b9fe5cb50fb7d85" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.894158 5030 scope.go:117] "RemoveContainer" containerID="f789b6523159d3101ae670e5476e0e7c39359c10b74e15d291d4e99cb86bfeb6" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.924328 5030 scope.go:117] "RemoveContainer" containerID="ac46d323e8f0a5b973e8d23adb65853f4d94b260c2af44727eb8af627172e11d" Jan 20 23:46:14 crc kubenswrapper[5030]: I0120 23:46:14.978385 5030 scope.go:117] "RemoveContainer" containerID="41286e72f29dcb5bdf7f4831ad8e2fe1b6c22003e27b09645abbfa6ae0917401" Jan 20 
23:46:15 crc kubenswrapper[5030]: I0120 23:46:15.009342 5030 scope.go:117] "RemoveContainer" containerID="c90c755bc61ffabd973240ff7c809123f3f9f2dfda56b982d7ee2a2191f2daf4" Jan 20 23:46:15 crc kubenswrapper[5030]: I0120 23:46:15.039463 5030 scope.go:117] "RemoveContainer" containerID="8f7ace46d65651278377efee09416c1370252c641e56c616324fd9e3e6962d2d" Jan 20 23:46:15 crc kubenswrapper[5030]: I0120 23:46:15.507904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.235789 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-7ffdk"] Jan 20 23:46:16 crc kubenswrapper[5030]: E0120 23:46:16.236121 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="init" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.236134 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="init" Jan 20 23:46:16 crc kubenswrapper[5030]: E0120 23:46:16.236166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="dnsmasq-dns" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.236173 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="dnsmasq-dns" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.236352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8f718e-3edd-4de0-9125-7d981584e6e0" containerName="dnsmasq-dns" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.236875 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.258029 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-7ffdk"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.325160 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kr8bd"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.326528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.331001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.331314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdcl\" (UniqueName: \"kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.339007 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kr8bd"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.432581 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.432654 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8jf\" (UniqueName: \"kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.432700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdcl\" (UniqueName: \"kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.432752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.433472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.434542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-z6ngm"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.435660 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.444908 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-z6ngm"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.457115 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-4m224"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.458083 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.460814 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.485266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdcl\" (UniqueName: \"kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl\") pod \"cinder-db-create-7ffdk\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.488925 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-4m224"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.535698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt2kt\" (UniqueName: \"kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.535807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.535846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.538041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8jf\" (UniqueName: \"kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.538103 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc 
kubenswrapper[5030]: I0120 23:46:16.538140 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wdh\" (UniqueName: \"kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.536988 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.551340 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.565243 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.566230 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.570308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8jf\" (UniqueName: \"kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf\") pod \"barbican-db-create-kr8bd\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.573889 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-rzn4z"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.574916 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.575367 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.577323 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rz8w" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.577521 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.577665 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.580886 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.598451 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.606015 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-rzn4z"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpdk2\" (UniqueName: \"kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wdh\" (UniqueName: \"kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639399 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sd5\" (UniqueName: \"kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639417 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt2kt\" (UniqueName: \"kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.639478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.640168 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.640665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.641293 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.661192 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.672108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wdh\" (UniqueName: \"kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh\") pod \"neutron-db-create-z6ngm\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.679114 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.683896 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.701213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt2kt\" (UniqueName: \"kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt\") pod \"barbican-8027-account-create-update-4m224\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.704013 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l"] Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.740464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.740985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572xd\" (UniqueName: \"kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.741078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.741190 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpdk2\" (UniqueName: \"kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.741314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sd5\" (UniqueName: \"kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.741395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.741472 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.742182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.747302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.749388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.750593 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.766714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sd5\" (UniqueName: \"kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5\") pod \"cinder-ded5-account-create-update-kt6nm\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.766747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpdk2\" (UniqueName: \"kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2\") pod \"keystone-db-sync-rzn4z\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.805970 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.843123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572xd\" (UniqueName: \"kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.843168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.844719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.861247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572xd\" (UniqueName: \"kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd\") pod \"neutron-aec2-account-create-update-t7k5l\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.898184 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:16 crc kubenswrapper[5030]: I0120 23:46:16.960949 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.018044 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.103243 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-7ffdk"] Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.170128 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kr8bd"] Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.232363 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-z6ngm"] Jan 20 23:46:17 crc kubenswrapper[5030]: W0120 23:46:17.241288 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043666ed_7c1f_4489_8cfa_bb9c089ba951.slice/crio-9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c WatchSource:0}: Error finding container 9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c: Status 404 returned error can't find the container with id 9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.308631 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-4m224"] Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.401749 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l"] Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.484104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm"] Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.535656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-rzn4z"] Jan 20 23:46:17 crc kubenswrapper[5030]: W0120 23:46:17.547672 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc16de63a_192f_4731_8c18_aaaede74cb00.slice/crio-a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59 WatchSource:0}: Error finding container a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59: Status 404 returned error can't find the container with id a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59 Jan 20 23:46:17 crc kubenswrapper[5030]: W0120 23:46:17.555809 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a473770_8a49_4263_b949_d4f89d63542d.slice/crio-6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8 WatchSource:0}: Error finding container 6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8: Status 404 returned error can't find the container with id 6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8 Jan 20 23:46:17 crc kubenswrapper[5030]: W0120 23:46:17.560618 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod242b3bb2_e246_48b9_a491_d5a2e1fceea1.slice/crio-30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a WatchSource:0}: Error finding container 30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a: Status 404 returned error can't find the container with id 30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a Jan 20 23:46:17 crc kubenswrapper[5030]: W0120 23:46:17.569812 5030 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20bd2b86_ed12_4296_97db_55879b9d87ac.slice/crio-f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc WatchSource:0}: Error finding container f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc: Status 404 returned error can't find the container with id f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.754489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" event={"ID":"a9fe9436-325e-46ee-9b56-41b8297f2f7c","Type":"ContainerStarted","Data":"f44bf16e874fa2061db03d5eb0a6a02e01717151f81002a463462df94a764705"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.755736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" event={"ID":"c16de63a-192f-4731-8c18-aaaede74cb00","Type":"ContainerStarted","Data":"a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.758558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" event={"ID":"20bd2b86-ed12-4296-97db-55879b9d87ac","Type":"ContainerStarted","Data":"f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.760261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" event={"ID":"3042e5b1-9e08-4870-99a1-2c89d2b536dd","Type":"ContainerStarted","Data":"5af8e541c7b45adcd5f97f874d6b9a2214aad19a732554539987c9fffd8a4613"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.760291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" event={"ID":"3042e5b1-9e08-4870-99a1-2c89d2b536dd","Type":"ContainerStarted","Data":"d88a033562427e54dd5a07a7c8e0fb325b3f00a4b774a7792c29ef9c13e92145"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.761233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" event={"ID":"9a473770-8a49-4263-b949-d4f89d63542d","Type":"ContainerStarted","Data":"6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.763127 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" event={"ID":"043666ed-7c1f-4489-8cfa-bb9c089ba951","Type":"ContainerStarted","Data":"9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.766169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" event={"ID":"242b3bb2-e246-48b9-a491-d5a2e1fceea1","Type":"ContainerStarted","Data":"30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a"} Jan 20 23:46:17 crc kubenswrapper[5030]: I0120 23:46:17.778009 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" podStartSLOduration=1.777991487 podStartE2EDuration="1.777991487s" podCreationTimestamp="2026-01-20 23:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 
23:46:17.772405522 +0000 UTC m=+4250.092665810" watchObservedRunningTime="2026-01-20 23:46:17.777991487 +0000 UTC m=+4250.098251775" Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.789225 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a473770-8a49-4263-b949-d4f89d63542d" containerID="7202fe73450ff434a4d69e7b2aaa3222736c50ff914003ac1349d55d5e00980e" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.789395 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" event={"ID":"9a473770-8a49-4263-b949-d4f89d63542d","Type":"ContainerDied","Data":"7202fe73450ff434a4d69e7b2aaa3222736c50ff914003ac1349d55d5e00980e"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.791483 5030 generic.go:334] "Generic (PLEG): container finished" podID="043666ed-7c1f-4489-8cfa-bb9c089ba951" containerID="c8ad123c0154ee907610e2669c2cf247b63d6af191295a68bcf8042417ed6be7" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.791529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" event={"ID":"043666ed-7c1f-4489-8cfa-bb9c089ba951","Type":"ContainerDied","Data":"c8ad123c0154ee907610e2669c2cf247b63d6af191295a68bcf8042417ed6be7"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.793635 5030 generic.go:334] "Generic (PLEG): container finished" podID="242b3bb2-e246-48b9-a491-d5a2e1fceea1" containerID="666ff3f23387594aed671b6782cb36baf282d66c8bc54fcb03b31d2e44ed6692" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.793692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" event={"ID":"242b3bb2-e246-48b9-a491-d5a2e1fceea1","Type":"ContainerDied","Data":"666ff3f23387594aed671b6782cb36baf282d66c8bc54fcb03b31d2e44ed6692"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.796872 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9fe9436-325e-46ee-9b56-41b8297f2f7c" containerID="ff7be2c694d3dca0c559222993519094634a7a1dfb835a7ff0372d249337df59" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.796904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" event={"ID":"a9fe9436-325e-46ee-9b56-41b8297f2f7c","Type":"ContainerDied","Data":"ff7be2c694d3dca0c559222993519094634a7a1dfb835a7ff0372d249337df59"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.799120 5030 generic.go:334] "Generic (PLEG): container finished" podID="c16de63a-192f-4731-8c18-aaaede74cb00" containerID="088e594ead1c9624dd906a0c4ba82c22d2712fb2e45e227fbb018786e1228e86" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.799168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" event={"ID":"c16de63a-192f-4731-8c18-aaaede74cb00","Type":"ContainerDied","Data":"088e594ead1c9624dd906a0c4ba82c22d2712fb2e45e227fbb018786e1228e86"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.801514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" event={"ID":"20bd2b86-ed12-4296-97db-55879b9d87ac","Type":"ContainerStarted","Data":"0ebf9740d24b5d513af1db1bc11b43bf812a8327e814b235234029b85f1004bf"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.804994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" 
event={"ID":"3042e5b1-9e08-4870-99a1-2c89d2b536dd","Type":"ContainerDied","Data":"5af8e541c7b45adcd5f97f874d6b9a2214aad19a732554539987c9fffd8a4613"} Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.804807 5030 generic.go:334] "Generic (PLEG): container finished" podID="3042e5b1-9e08-4870-99a1-2c89d2b536dd" containerID="5af8e541c7b45adcd5f97f874d6b9a2214aad19a732554539987c9fffd8a4613" exitCode=0 Jan 20 23:46:18 crc kubenswrapper[5030]: I0120 23:46:18.918985 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" podStartSLOduration=2.9189672460000002 podStartE2EDuration="2.918967246s" podCreationTimestamp="2026-01-20 23:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:18.905756395 +0000 UTC m=+4251.226016703" watchObservedRunningTime="2026-01-20 23:46:18.918967246 +0000 UTC m=+4251.239227534" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.266318 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.387379 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.405180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt2kt\" (UniqueName: \"kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt\") pod \"c16de63a-192f-4731-8c18-aaaede74cb00\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.405258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts\") pod \"c16de63a-192f-4731-8c18-aaaede74cb00\" (UID: \"c16de63a-192f-4731-8c18-aaaede74cb00\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.405762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c16de63a-192f-4731-8c18-aaaede74cb00" (UID: "c16de63a-192f-4731-8c18-aaaede74cb00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.409996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt" (OuterVolumeSpecName: "kube-api-access-zt2kt") pod "c16de63a-192f-4731-8c18-aaaede74cb00" (UID: "c16de63a-192f-4731-8c18-aaaede74cb00"). InnerVolumeSpecName "kube-api-access-zt2kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.461708 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.468383 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.479323 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.486487 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.507937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdcl\" (UniqueName: \"kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl\") pod \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.508039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts\") pod \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\" (UID: \"3042e5b1-9e08-4870-99a1-2c89d2b536dd\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.508435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3042e5b1-9e08-4870-99a1-2c89d2b536dd" (UID: "3042e5b1-9e08-4870-99a1-2c89d2b536dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.508532 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3042e5b1-9e08-4870-99a1-2c89d2b536dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.508546 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt2kt\" (UniqueName: \"kubernetes.io/projected/c16de63a-192f-4731-8c18-aaaede74cb00-kube-api-access-zt2kt\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.508557 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c16de63a-192f-4731-8c18-aaaede74cb00-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.511890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl" (OuterVolumeSpecName: "kube-api-access-zbdcl") pod "3042e5b1-9e08-4870-99a1-2c89d2b536dd" (UID: "3042e5b1-9e08-4870-99a1-2c89d2b536dd"). InnerVolumeSpecName "kube-api-access-zbdcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9sd5\" (UniqueName: \"kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5\") pod \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts\") pod \"043666ed-7c1f-4489-8cfa-bb9c089ba951\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts\") pod \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\" (UID: \"242b3bb2-e246-48b9-a491-d5a2e1fceea1\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wdh\" (UniqueName: \"kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh\") pod \"043666ed-7c1f-4489-8cfa-bb9c089ba951\" (UID: \"043666ed-7c1f-4489-8cfa-bb9c089ba951\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572xd\" (UniqueName: \"kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd\") pod \"9a473770-8a49-4263-b949-d4f89d63542d\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609306 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg8jf\" (UniqueName: \"kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf\") pod \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts\") pod \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\" (UID: \"a9fe9436-325e-46ee-9b56-41b8297f2f7c\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts\") pod \"9a473770-8a49-4263-b949-d4f89d63542d\" (UID: \"9a473770-8a49-4263-b949-d4f89d63542d\") " Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "043666ed-7c1f-4489-8cfa-bb9c089ba951" (UID: "043666ed-7c1f-4489-8cfa-bb9c089ba951"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.609824 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdcl\" (UniqueName: \"kubernetes.io/projected/3042e5b1-9e08-4870-99a1-2c89d2b536dd-kube-api-access-zbdcl\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.610205 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "242b3bb2-e246-48b9-a491-d5a2e1fceea1" (UID: "242b3bb2-e246-48b9-a491-d5a2e1fceea1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.610252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9fe9436-325e-46ee-9b56-41b8297f2f7c" (UID: "a9fe9436-325e-46ee-9b56-41b8297f2f7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.610404 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/043666ed-7c1f-4489-8cfa-bb9c089ba951-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.610403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a473770-8a49-4263-b949-d4f89d63542d" (UID: "9a473770-8a49-4263-b949-d4f89d63542d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.612673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5" (OuterVolumeSpecName: "kube-api-access-d9sd5") pod "242b3bb2-e246-48b9-a491-d5a2e1fceea1" (UID: "242b3bb2-e246-48b9-a491-d5a2e1fceea1"). InnerVolumeSpecName "kube-api-access-d9sd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.613066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh" (OuterVolumeSpecName: "kube-api-access-t2wdh") pod "043666ed-7c1f-4489-8cfa-bb9c089ba951" (UID: "043666ed-7c1f-4489-8cfa-bb9c089ba951"). InnerVolumeSpecName "kube-api-access-t2wdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.614488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd" (OuterVolumeSpecName: "kube-api-access-572xd") pod "9a473770-8a49-4263-b949-d4f89d63542d" (UID: "9a473770-8a49-4263-b949-d4f89d63542d"). InnerVolumeSpecName "kube-api-access-572xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.615270 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf" (OuterVolumeSpecName: "kube-api-access-pg8jf") pod "a9fe9436-325e-46ee-9b56-41b8297f2f7c" (UID: "a9fe9436-325e-46ee-9b56-41b8297f2f7c"). InnerVolumeSpecName "kube-api-access-pg8jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712163 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572xd\" (UniqueName: \"kubernetes.io/projected/9a473770-8a49-4263-b949-d4f89d63542d-kube-api-access-572xd\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712195 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg8jf\" (UniqueName: \"kubernetes.io/projected/a9fe9436-325e-46ee-9b56-41b8297f2f7c-kube-api-access-pg8jf\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712208 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9fe9436-325e-46ee-9b56-41b8297f2f7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712219 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a473770-8a49-4263-b949-d4f89d63542d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712228 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9sd5\" (UniqueName: \"kubernetes.io/projected/242b3bb2-e246-48b9-a491-d5a2e1fceea1-kube-api-access-d9sd5\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712236 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/242b3bb2-e246-48b9-a491-d5a2e1fceea1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.712244 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wdh\" (UniqueName: \"kubernetes.io/projected/043666ed-7c1f-4489-8cfa-bb9c089ba951-kube-api-access-t2wdh\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.828693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" event={"ID":"c16de63a-192f-4731-8c18-aaaede74cb00","Type":"ContainerDied","Data":"a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.829023 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a430675512d03f1312c66637563026202ff95bdd5841e1a122ce54bda4940e59" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.828746 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-4m224" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.830871 5030 generic.go:334] "Generic (PLEG): container finished" podID="20bd2b86-ed12-4296-97db-55879b9d87ac" containerID="0ebf9740d24b5d513af1db1bc11b43bf812a8327e814b235234029b85f1004bf" exitCode=0 Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.830955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" event={"ID":"20bd2b86-ed12-4296-97db-55879b9d87ac","Type":"ContainerDied","Data":"0ebf9740d24b5d513af1db1bc11b43bf812a8327e814b235234029b85f1004bf"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.833069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" event={"ID":"3042e5b1-9e08-4870-99a1-2c89d2b536dd","Type":"ContainerDied","Data":"d88a033562427e54dd5a07a7c8e0fb325b3f00a4b774a7792c29ef9c13e92145"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.833124 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88a033562427e54dd5a07a7c8e0fb325b3f00a4b774a7792c29ef9c13e92145" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.833087 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-7ffdk" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.835009 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.835047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l" event={"ID":"9a473770-8a49-4263-b949-d4f89d63542d","Type":"ContainerDied","Data":"6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.835099 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a4a2ff2308fa1600912c3e619964378a0c5864957b31ea0bd25dbcda2475ff8" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.837012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" event={"ID":"043666ed-7c1f-4489-8cfa-bb9c089ba951","Type":"ContainerDied","Data":"9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.837059 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b16d89aeb03eb0a62fecca39fc7f59547caa2020702e6462122313814799c5c" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.837036 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-z6ngm" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.839059 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.839055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm" event={"ID":"242b3bb2-e246-48b9-a491-d5a2e1fceea1","Type":"ContainerDied","Data":"30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.839224 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30bdd2fedd465f06072d15e823659ae2d75e865fec76e89164ad029a5b0d412a" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.843797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" event={"ID":"a9fe9436-325e-46ee-9b56-41b8297f2f7c","Type":"ContainerDied","Data":"f44bf16e874fa2061db03d5eb0a6a02e01717151f81002a463462df94a764705"} Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.843840 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44bf16e874fa2061db03d5eb0a6a02e01717151f81002a463462df94a764705" Jan 20 23:46:20 crc kubenswrapper[5030]: I0120 23:46:20.843923 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-kr8bd" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.248825 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.338119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data\") pod \"20bd2b86-ed12-4296-97db-55879b9d87ac\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.338203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle\") pod \"20bd2b86-ed12-4296-97db-55879b9d87ac\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.338246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpdk2\" (UniqueName: \"kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2\") pod \"20bd2b86-ed12-4296-97db-55879b9d87ac\" (UID: \"20bd2b86-ed12-4296-97db-55879b9d87ac\") " Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.344539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2" (OuterVolumeSpecName: "kube-api-access-qpdk2") pod "20bd2b86-ed12-4296-97db-55879b9d87ac" (UID: "20bd2b86-ed12-4296-97db-55879b9d87ac"). InnerVolumeSpecName "kube-api-access-qpdk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.370026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20bd2b86-ed12-4296-97db-55879b9d87ac" (UID: "20bd2b86-ed12-4296-97db-55879b9d87ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.405276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data" (OuterVolumeSpecName: "config-data") pod "20bd2b86-ed12-4296-97db-55879b9d87ac" (UID: "20bd2b86-ed12-4296-97db-55879b9d87ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.440432 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.440482 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bd2b86-ed12-4296-97db-55879b9d87ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.440499 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpdk2\" (UniqueName: \"kubernetes.io/projected/20bd2b86-ed12-4296-97db-55879b9d87ac-kube-api-access-qpdk2\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.871175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" event={"ID":"20bd2b86-ed12-4296-97db-55879b9d87ac","Type":"ContainerDied","Data":"f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc"} Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.871239 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f23ef1d90baf34599991bb8b54f595a4b8f24ff0f1703021d5f1b60673907fcc" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.871254 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-rzn4z" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.995606 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cwv4x"] Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.995990 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043666ed-7c1f-4489-8cfa-bb9c089ba951" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996009 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="043666ed-7c1f-4489-8cfa-bb9c089ba951" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996028 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a473770-8a49-4263-b949-d4f89d63542d" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996036 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a473770-8a49-4263-b949-d4f89d63542d" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16de63a-192f-4731-8c18-aaaede74cb00" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16de63a-192f-4731-8c18-aaaede74cb00" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996074 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242b3bb2-e246-48b9-a491-d5a2e1fceea1" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996081 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="242b3bb2-e246-48b9-a491-d5a2e1fceea1" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996095 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3042e5b1-9e08-4870-99a1-2c89d2b536dd" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996103 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3042e5b1-9e08-4870-99a1-2c89d2b536dd" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fe9436-325e-46ee-9b56-41b8297f2f7c" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996125 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fe9436-325e-46ee-9b56-41b8297f2f7c" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: E0120 23:46:22.996146 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20bd2b86-ed12-4296-97db-55879b9d87ac" containerName="keystone-db-sync" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996156 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="20bd2b86-ed12-4296-97db-55879b9d87ac" containerName="keystone-db-sync" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996423 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fe9436-325e-46ee-9b56-41b8297f2f7c" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996441 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="043666ed-7c1f-4489-8cfa-bb9c089ba951" 
containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="20bd2b86-ed12-4296-97db-55879b9d87ac" containerName="keystone-db-sync" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="242b3bb2-e246-48b9-a491-d5a2e1fceea1" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16de63a-192f-4731-8c18-aaaede74cb00" containerName="mariadb-account-create-update" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996502 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3042e5b1-9e08-4870-99a1-2c89d2b536dd" containerName="mariadb-database-create" Jan 20 23:46:22 crc kubenswrapper[5030]: I0120 23:46:22.996513 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a473770-8a49-4263-b949-d4f89d63542d" containerName="mariadb-account-create-update" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.001202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.004667 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.005492 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.005850 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rz8w" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.006061 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.006274 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.022366 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cwv4x"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.055683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.055763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.055939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.055970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7mb\" (UniqueName: \"kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.056034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.056130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.058125 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.059417 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.065405 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-hsjcl" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.065447 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.065596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.065410 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.080678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.140351 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.143571 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.147171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.147350 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.158678 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.158764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.158823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9br\" (UniqueName: \"kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.158935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.158987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7mb\" (UniqueName: \"kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.159487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.169284 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.169893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.172870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.176145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.177322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.177375 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.196274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7mb\" (UniqueName: \"kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb\") pod \"keystone-bootstrap-cwv4x\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.203472 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-z276q"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.204668 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.209392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.209609 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-4xxv7" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.209738 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.235684 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-qttm2"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.236935 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.239577 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.239758 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.239855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-nsx78" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.253553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-z276q"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t65gl\" (UniqueName: \"kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262891 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262933 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.262984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263024 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dr9v\" (UniqueName: \"kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263073 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263102 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9br\" (UniqueName: \"kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263592 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263910 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.263936 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.269431 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-qttm2"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.271970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.274415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.278381 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.289223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.289522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9br\" (UniqueName: \"kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.317870 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.319687 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.321997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.322564 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.325574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.344177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.353231 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29ljg\" (UniqueName: \"kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.364982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365156 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 
23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dr9v\" (UniqueName: \"kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t65gl\" (UniqueName: \"kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc 
kubenswrapper[5030]: I0120 23:46:23.365447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365645 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvcz9\" (UniqueName: \"kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365757 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365813 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.365872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.366189 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.367907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.373687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.373886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.374455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.374606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.378510 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.378966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.380536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.383116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.383351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dr9v\" (UniqueName: \"kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.385574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t65gl\" (UniqueName: \"kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.388138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle\") pod \"cinder-db-sync-z276q\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.410460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.411805 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.469679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29ljg\" (UniqueName: \"kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.470958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471030 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zvcz9\" (UniqueName: \"kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.471999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.473152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.475557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.478516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.479261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.479278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 
23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.483561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.484286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.488218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.491004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29ljg\" (UniqueName: \"kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg\") pod \"placement-db-sync-qttm2\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.491526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcz9\" (UniqueName: \"kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9\") pod \"ceilometer-0\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.562976 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.732007 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.746350 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.802745 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cwv4x"] Jan 20 23:46:23 crc kubenswrapper[5030]: W0120 23:46:23.818282 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e00e205_ea8a_4a04_8161_dd51e3532143.slice/crio-edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb WatchSource:0}: Error finding container edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb: Status 404 returned error can't find the container with id edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.907489 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.920052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" event={"ID":"4e00e205-ea8a-4a04-8161-dd51e3532143","Type":"ContainerStarted","Data":"edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb"} Jan 20 23:46:23 crc kubenswrapper[5030]: I0120 23:46:23.935114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-z276q"] Jan 20 23:46:23 crc kubenswrapper[5030]: W0120 23:46:23.959422 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9275ea5f_43f0_447d_9e4d_adf3f977f058.slice/crio-35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272 WatchSource:0}: Error finding container 35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272: Status 404 returned error can't find the container with id 35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272 Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.027591 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:24 crc kubenswrapper[5030]: W0120 23:46:24.036591 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice/crio-0912b7452323777bc3b4e1bcbaaacb7edc84b456c336de56aa61c8654f5f1f7f WatchSource:0}: Error finding container 0912b7452323777bc3b4e1bcbaaacb7edc84b456c336de56aa61c8654f5f1f7f: Status 404 returned error can't find the container with id 0912b7452323777bc3b4e1bcbaaacb7edc84b456c336de56aa61c8654f5f1f7f Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.178550 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-qttm2"] Jan 20 23:46:24 crc kubenswrapper[5030]: W0120 23:46:24.185413 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0aca09c_1d5a_4d9b_a6f0_e843f97c6cc8.slice/crio-e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1 WatchSource:0}: Error finding container e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1: Status 404 returned error can't find the container with id e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1 Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.320709 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:24 crc kubenswrapper[5030]: W0120 23:46:24.341070 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4e8bfe6_1143_4b62_a672_1255e6a861e4.slice/crio-2effed606f819ecbc25ab9664cb620a95900b259fa37c32966f96b01eabe57b0 WatchSource:0}: Error finding container 2effed606f819ecbc25ab9664cb620a95900b259fa37c32966f96b01eabe57b0: Status 404 returned error can't find the container with id 2effed606f819ecbc25ab9664cb620a95900b259fa37c32966f96b01eabe57b0 Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.951554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-z276q" event={"ID":"9275ea5f-43f0-447d-9e4d-adf3f977f058","Type":"ContainerStarted","Data":"27b0016d3de811e0565969532e55790cc76bbc720f2b4e02b99a2dd62f2af4ac"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.951978 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-z276q" event={"ID":"9275ea5f-43f0-447d-9e4d-adf3f977f058","Type":"ContainerStarted","Data":"35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.957599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerStarted","Data":"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.957658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerStarted","Data":"3dad6f8718f77d08b85d384cd1be37900938cb68972aa519a101b2bbaae7d821"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.963943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-qttm2" event={"ID":"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8","Type":"ContainerStarted","Data":"1dd19098759807465068b62dcf1137af6bfd6b4d362b08535bcc40499c15886e"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.963994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-qttm2" event={"ID":"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8","Type":"ContainerStarted","Data":"e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.966918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" event={"ID":"4e00e205-ea8a-4a04-8161-dd51e3532143","Type":"ContainerStarted","Data":"a17d532b4f091901e53dc6ace5b971e6a334eb6f16e23f0fbf4b5e3399aab24c"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.972617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerStarted","Data":"2effed606f819ecbc25ab9664cb620a95900b259fa37c32966f96b01eabe57b0"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.972914 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-z276q" podStartSLOduration=1.972900455 podStartE2EDuration="1.972900455s" podCreationTimestamp="2026-01-20 23:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-20 23:46:24.96896664 +0000 UTC m=+4257.289226928" watchObservedRunningTime="2026-01-20 23:46:24.972900455 +0000 UTC m=+4257.293160743" Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.975053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerStarted","Data":"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.975086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerStarted","Data":"0912b7452323777bc3b4e1bcbaaacb7edc84b456c336de56aa61c8654f5f1f7f"} Jan 20 23:46:24 crc kubenswrapper[5030]: I0120 23:46:24.999272 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" podStartSLOduration=2.999238294 podStartE2EDuration="2.999238294s" podCreationTimestamp="2026-01-20 23:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:24.988213047 +0000 UTC m=+4257.308473335" watchObservedRunningTime="2026-01-20 23:46:24.999238294 +0000 UTC m=+4257.319498582" Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.008541 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-qttm2" podStartSLOduration=2.008519869 podStartE2EDuration="2.008519869s" podCreationTimestamp="2026-01-20 23:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:25.007241938 +0000 UTC m=+4257.327502226" watchObservedRunningTime="2026-01-20 23:46:25.008519869 +0000 UTC m=+4257.328780157" Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.315087 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.349936 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.373613 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.989059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerStarted","Data":"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996"} Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.992586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerStarted","Data":"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8"} Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.997373 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" containerID="1dd19098759807465068b62dcf1137af6bfd6b4d362b08535bcc40499c15886e" exitCode=0 Jan 20 23:46:25 crc kubenswrapper[5030]: I0120 23:46:25.997445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/placement-db-sync-qttm2" event={"ID":"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8","Type":"ContainerDied","Data":"1dd19098759807465068b62dcf1137af6bfd6b4d362b08535bcc40499c15886e"} Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.001956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerStarted","Data":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.014967 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.014950294 podStartE2EDuration="3.014950294s" podCreationTimestamp="2026-01-20 23:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:26.012797571 +0000 UTC m=+4258.333057859" watchObservedRunningTime="2026-01-20 23:46:26.014950294 +0000 UTC m=+4258.335210582" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.066090 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.066063863 podStartE2EDuration="3.066063863s" podCreationTimestamp="2026-01-20 23:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:26.057196568 +0000 UTC m=+4258.377456856" watchObservedRunningTime="2026-01-20 23:46:26.066063863 +0000 UTC m=+4258.386324151" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.730444 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-hrsm6"] Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.733029 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.736327 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-4htrl" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.736481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.751463 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-hrsm6"] Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.820233 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7v8jz"] Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.826357 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.832605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.832668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbl2\" (UniqueName: \"kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.832724 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.838050 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7v8jz"] Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.854504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.854797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-xfzxd" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.854504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.933661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.933807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbl2\" (UniqueName: \"kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.933907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.934007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config\") pod \"neutron-db-sync-7v8jz\" (UID: 
\"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.934095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m984w\" (UniqueName: \"kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.934203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.939261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.939267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:26 crc kubenswrapper[5030]: I0120 23:46:26.950772 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbl2\" (UniqueName: \"kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2\") pod \"barbican-db-sync-hrsm6\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.011667 5030 generic.go:334] "Generic (PLEG): container finished" podID="4e00e205-ea8a-4a04-8161-dd51e3532143" containerID="a17d532b4f091901e53dc6ace5b971e6a334eb6f16e23f0fbf4b5e3399aab24c" exitCode=0 Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.011884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" event={"ID":"4e00e205-ea8a-4a04-8161-dd51e3532143","Type":"ContainerDied","Data":"a17d532b4f091901e53dc6ace5b971e6a334eb6f16e23f0fbf4b5e3399aab24c"} Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.018578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerStarted","Data":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.018634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerStarted","Data":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.022522 5030 generic.go:334] "Generic (PLEG): container finished" podID="9275ea5f-43f0-447d-9e4d-adf3f977f058" containerID="27b0016d3de811e0565969532e55790cc76bbc720f2b4e02b99a2dd62f2af4ac" exitCode=0 Jan 20 
23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.022739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-z276q" event={"ID":"9275ea5f-43f0-447d-9e4d-adf3f977f058","Type":"ContainerDied","Data":"27b0016d3de811e0565969532e55790cc76bbc720f2b4e02b99a2dd62f2af4ac"} Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.023085 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-log" containerID="cri-o://f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" gracePeriod=30 Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.023126 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-httpd" containerID="cri-o://f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" gracePeriod=30 Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.023161 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-log" containerID="cri-o://db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" gracePeriod=30 Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.023241 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-httpd" containerID="cri-o://b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" gracePeriod=30 Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.036875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.037148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m984w\" (UniqueName: \"kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.037203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.041332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.042316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.061946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m984w\" (UniqueName: \"kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w\") pod \"neutron-db-sync-7v8jz\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.179054 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.205054 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.581730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.590719 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657187 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-hrsm6"] Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29ljg\" (UniqueName: \"kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg\") pod \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dr9v\" (UniqueName: \"kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle\") pod \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657732 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data\") pod \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657767 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs\") pod \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts\") pod \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\" (UID: \"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.657895 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs\") pod \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\" (UID: \"497ba0ef-c7db-4c22-8b2f-7da2826ece57\") " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.658574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs" (OuterVolumeSpecName: "logs") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.660805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.660931 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs" (OuterVolumeSpecName: "logs") pod "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" (UID: "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.669809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg" (OuterVolumeSpecName: "kube-api-access-29ljg") pod "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" (UID: "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8"). InnerVolumeSpecName "kube-api-access-29ljg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.670509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.692079 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v" (OuterVolumeSpecName: "kube-api-access-8dr9v") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "kube-api-access-8dr9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.693053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts" (OuterVolumeSpecName: "scripts") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.704207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts" (OuterVolumeSpecName: "scripts") pod "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" (UID: "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.729519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.732793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data" (OuterVolumeSpecName: "config-data") pod "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" (UID: "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.742813 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data" (OuterVolumeSpecName: "config-data") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.744401 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" (UID: "e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760590 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760667 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29ljg\" (UniqueName: \"kubernetes.io/projected/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-kube-api-access-29ljg\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760680 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760690 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dr9v\" (UniqueName: \"kubernetes.io/projected/497ba0ef-c7db-4c22-8b2f-7da2826ece57-kube-api-access-8dr9v\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760700 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760709 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760717 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760726 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760747 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760757 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8-scripts\") on node \"crc\" 
DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760766 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.760774 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/497ba0ef-c7db-4c22-8b2f-7da2826ece57-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.762815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "497ba0ef-c7db-4c22-8b2f-7da2826ece57" (UID: "497ba0ef-c7db-4c22-8b2f-7da2826ece57"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.775887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7v8jz"] Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.781043 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.862169 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/497ba0ef-c7db-4c22-8b2f-7da2826ece57-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.862198 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:27 crc kubenswrapper[5030]: I0120 23:46:27.945744 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.051196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" event={"ID":"43ed9ea9-6c15-4082-b728-5d5d66dfa24e","Type":"ContainerStarted","Data":"f05982f608e268c197c7f0c0f512888069aa7995523bedc0554768de0b8db5ce"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.051299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" event={"ID":"43ed9ea9-6c15-4082-b728-5d5d66dfa24e","Type":"ContainerStarted","Data":"295d22a9489ae6aec77ef45603377c1153142c30d28536f61b4ec7bdb6bd5717"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054576 5030 generic.go:334] "Generic (PLEG): container finished" podID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerID="f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" exitCode=0 Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054601 5030 generic.go:334] "Generic (PLEG): container finished" podID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerID="f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" exitCode=143 Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054768 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerDied","Data":"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerDied","Data":"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"497ba0ef-c7db-4c22-8b2f-7da2826ece57","Type":"ContainerDied","Data":"0912b7452323777bc3b4e1bcbaaacb7edc84b456c336de56aa61c8654f5f1f7f"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.054845 5030 scope.go:117] "RemoveContainer" containerID="f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.060341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" event={"ID":"6933f1ec-acd0-4c85-a7b3-17363f451790","Type":"ContainerStarted","Data":"82c5de5d1e9ee60604eec4b8187687c0e8fd8640cff7d12c1f5a28c78d168427"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.060786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" event={"ID":"6933f1ec-acd0-4c85-a7b3-17363f451790","Type":"ContainerStarted","Data":"abf471d2b2054c1f64ae7477dc37126eaca8485709203e80e3bc2b4345ac9084"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063773 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb9br\" (UniqueName: \"kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts\") pod 
\"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.063975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.064052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data\") pod \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\" (UID: \"e6a62a0a-8124-4401-a8fb-5926c1d3ce65\") " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.065363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs" (OuterVolumeSpecName: "logs") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.065385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.068929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.069938 5030 generic.go:334] "Generic (PLEG): container finished" podID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerID="b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" exitCode=0 Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.069991 5030 generic.go:334] "Generic (PLEG): container finished" podID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerID="db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" exitCode=143 Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.070022 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.070053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerDied","Data":"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.070084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerDied","Data":"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.070097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"e6a62a0a-8124-4401-a8fb-5926c1d3ce65","Type":"ContainerDied","Data":"3dad6f8718f77d08b85d384cd1be37900938cb68972aa519a101b2bbaae7d821"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.072105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-qttm2" event={"ID":"e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8","Type":"ContainerDied","Data":"e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1"} Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.072159 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4bf44234b19cfb8edc2568564e9a54ee498002c098c276c717eec94e33884b1" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.073176 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-qttm2" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.074075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br" (OuterVolumeSpecName: "kube-api-access-kb9br") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "kube-api-access-kb9br". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.084066 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" podStartSLOduration=2.084049837 podStartE2EDuration="2.084049837s" podCreationTimestamp="2026-01-20 23:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:28.080658184 +0000 UTC m=+4260.400918472" watchObservedRunningTime="2026-01-20 23:46:28.084049837 +0000 UTC m=+4260.404310125" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.087198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts" (OuterVolumeSpecName: "scripts") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.090339 5030 scope.go:117] "RemoveContainer" containerID="f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.110096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.125741 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" podStartSLOduration=2.125719047 podStartE2EDuration="2.125719047s" podCreationTimestamp="2026-01-20 23:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:28.105942198 +0000 UTC m=+4260.426202496" watchObservedRunningTime="2026-01-20 23:46:28.125719047 +0000 UTC m=+4260.445979335" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.136864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data" (OuterVolumeSpecName: "config-data") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.142826 5030 scope.go:117] "RemoveContainer" containerID="f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.147151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.148744 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996\": container with ID starting with f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996 not found: ID does not exist" containerID="f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.148784 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996"} err="failed to get container status \"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996\": rpc error: code = NotFound desc = could not find container \"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996\": container with ID starting with f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.148809 5030 scope.go:117] "RemoveContainer" containerID="f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.157968 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.163407 5030 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017\": container with ID starting with f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017 not found: ID does not exist" containerID="f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.163450 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017"} err="failed to get container status \"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017\": rpc error: code = NotFound desc = could not find container \"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017\": container with ID starting with f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.163477 5030 scope.go:117] "RemoveContainer" containerID="f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165726 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165751 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb9br\" (UniqueName: \"kubernetes.io/projected/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-kube-api-access-kb9br\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165760 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165770 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165796 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165811 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.165820 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.168744 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996"} err="failed to get container status \"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996\": rpc error: code = NotFound desc = could not find container \"f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996\": container with ID starting with f2b0c9088d6c4cebc837c3c7fd397e23e6d4d22e94806bd839dc6d4264389996 not found: ID does not exist" 
Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.168776 5030 scope.go:117] "RemoveContainer" containerID="f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.169662 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017"} err="failed to get container status \"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017\": rpc error: code = NotFound desc = could not find container \"f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017\": container with ID starting with f46f0b03ba9b380247565090e70e5a4c64051a6cc33512b51d4471e078611017 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.169680 5030 scope.go:117] "RemoveContainer" containerID="b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.192662 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.193050 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" containerName="placement-db-sync" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" containerName="placement-db-sync" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.193080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193085 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.193118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193124 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.193139 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.193158 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193164 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193344 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193358 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" containerName="glance-log" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193369 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" containerName="placement-db-sync" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.193377 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" containerName="glance-httpd" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.201738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e6a62a0a-8124-4401-a8fb-5926c1d3ce65" (UID: "e6a62a0a-8124-4401-a8fb-5926c1d3ce65"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.202307 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.208981 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.214961 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hqw\" (UniqueName: \"kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268890 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.268980 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6a62a0a-8124-4401-a8fb-5926c1d3ce65-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.269199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.297467 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.301148 5030 scope.go:117] "RemoveContainer" containerID="db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.318754 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.320044 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.324038 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.356102 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.356272 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.370524 5030 scope.go:117] "RemoveContainer" containerID="b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.370855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-nsx78" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373636 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373774 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hqw\" (UniqueName: \"kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpvp6\" (UniqueName: \"kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.373978 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.374292 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.381805 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8\": container with ID starting with b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8 not found: ID does not exist" containerID="b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.381848 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8"} err="failed to get container status \"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8\": rpc error: code = NotFound desc = could not find container \"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8\": container with ID starting with b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.381873 5030 scope.go:117] "RemoveContainer" containerID="db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.392221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.392710 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95\": container with ID starting with db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95 not found: ID does not exist" containerID="db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.392736 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95"} err="failed to get container status \"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95\": rpc error: code = NotFound desc = could not find container \"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95\": container with ID starting with db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.392759 5030 scope.go:117] "RemoveContainer" containerID="b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.396184 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8"} err="failed to get container status \"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8\": rpc error: code = NotFound desc = could not find container \"b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8\": container with ID starting with b1adf0f631aac550d31c15b207bd2289c6b3c516aa8f8bc6c2b5132df61205b8 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.396230 5030 scope.go:117] "RemoveContainer" containerID="db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.400793 5030 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95"} err="failed to get container status \"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95\": rpc error: code = NotFound desc = could not find container \"db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95\": container with ID starting with db37fb52b7affb00b6773c191d0b1469eee7a1ba3ae4d5e97ba244e25232ea95 not found: ID does not exist" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.401184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.411806 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.412293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.412822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.431950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.440418 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hqw\" (UniqueName: \"kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.488690 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpvp6\" (UniqueName: \"kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.488737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") 
" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.488791 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.488819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.488892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.489276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.569179 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.578730 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.588200 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.589599 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.593040 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.593096 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.619971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.692954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.692999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lt6\" (UniqueName: \"kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693156 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693181 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.693226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: E0120 23:46:28.778548 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.794923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lt6\" (UniqueName: \"kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795573 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.795931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.842718 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.843726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpvp6\" (UniqueName: \"kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.844209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.844421 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.844580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle\") pod \"placement-fd7fccdd6-2qq68\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " 
pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.845377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.845711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.848653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.855034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lt6\" (UniqueName: \"kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.868585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.873853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.917714 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:28 crc kubenswrapper[5030]: I0120 23:46:28.992591 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.023483 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.106761 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.106799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.106883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp7mb\" (UniqueName: \"kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.106918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.107030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.107066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data\") pod \"4e00e205-ea8a-4a04-8161-dd51e3532143\" (UID: \"4e00e205-ea8a-4a04-8161-dd51e3532143\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.111468 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.120018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-z276q" event={"ID":"9275ea5f-43f0-447d-9e4d-adf3f977f058","Type":"ContainerDied","Data":"35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272"} Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.120064 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bb08397b973e5c4c27e381ae61e421dc222083dda38dfc285f6d8ccdbc5272" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.122849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts" (OuterVolumeSpecName: "scripts") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.124399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.126656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.129531 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb" (OuterVolumeSpecName: "kube-api-access-bp7mb") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "kube-api-access-bp7mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.136725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" event={"ID":"4e00e205-ea8a-4a04-8161-dd51e3532143","Type":"ContainerDied","Data":"edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb"} Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.136765 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edaeae24dbb5e6d879f9c38a3f498c1ca5144f6c642916ffccc32a71c58baeeb" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.136845 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-cwv4x" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.150003 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.187360 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cwv4x"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.195902 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-cwv4x"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.208439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.208510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t65gl\" (UniqueName: \"kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.208598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.209231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.209466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.209565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle\") pod \"9275ea5f-43f0-447d-9e4d-adf3f977f058\" (UID: \"9275ea5f-43f0-447d-9e4d-adf3f977f058\") " Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.208859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl" (OuterVolumeSpecName: "kube-api-access-t65gl") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "kube-api-access-t65gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts" (OuterVolumeSpecName: "scripts") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212777 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212810 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t65gl\" (UniqueName: \"kubernetes.io/projected/9275ea5f-43f0-447d-9e4d-adf3f977f058-kube-api-access-t65gl\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212823 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212833 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9275ea5f-43f0-447d-9e4d-adf3f977f058-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212845 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212855 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.212867 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp7mb\" (UniqueName: \"kubernetes.io/projected/4e00e205-ea8a-4a04-8161-dd51e3532143-kube-api-access-bp7mb\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.303857 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-9g6g8"] Jan 20 23:46:29 crc kubenswrapper[5030]: E0120 23:46:29.304255 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9275ea5f-43f0-447d-9e4d-adf3f977f058" containerName="cinder-db-sync" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.304267 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9275ea5f-43f0-447d-9e4d-adf3f977f058" containerName="cinder-db-sync" Jan 20 23:46:29 crc kubenswrapper[5030]: E0120 23:46:29.304279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e00e205-ea8a-4a04-8161-dd51e3532143" containerName="keystone-bootstrap" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.304286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e00e205-ea8a-4a04-8161-dd51e3532143" containerName="keystone-bootstrap" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.304462 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9275ea5f-43f0-447d-9e4d-adf3f977f058" containerName="cinder-db-sync" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 
23:46:29.304478 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e00e205-ea8a-4a04-8161-dd51e3532143" containerName="keystone-bootstrap" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.305048 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.307558 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-9g6g8"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.343965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.349544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data" (OuterVolumeSpecName: "config-data") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.351694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data" (OuterVolumeSpecName: "config-data") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.353473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e00e205-ea8a-4a04-8161-dd51e3532143" (UID: "4e00e205-ea8a-4a04-8161-dd51e3532143"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.359380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9275ea5f-43f0-447d-9e4d-adf3f977f058" (UID: "9275ea5f-43f0-447d-9e4d-adf3f977f058"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417064 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417025 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417263 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pck5d\" (UniqueName: \"kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417663 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417680 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417691 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 
23:46:29.417701 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9275ea5f-43f0-447d-9e4d-adf3f977f058-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.417711 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e00e205-ea8a-4a04-8161-dd51e3532143-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:29 crc kubenswrapper[5030]: W0120 23:46:29.424172 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ebfcd5_ace3_4a42_b07e_48858270151a.slice/crio-3f80c3dc58af05d11075e422056ff2fb4d1fe6db9c8cf1f1d9e4e15c60bb1078 WatchSource:0}: Error finding container 3f80c3dc58af05d11075e422056ff2fb4d1fe6db9c8cf1f1d9e4e15c60bb1078: Status 404 returned error can't find the container with id 3f80c3dc58af05d11075e422056ff2fb4d1fe6db9c8cf1f1d9e4e15c60bb1078 Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.510306 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.518872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pck5d\" (UniqueName: \"kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.518951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.518983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.519047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.519107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.519135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc 
kubenswrapper[5030]: I0120 23:46:29.520348 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.522816 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.523022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.526408 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.526819 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.527994 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.528348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.529443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.530994 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.538643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pck5d\" (UniqueName: \"kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d\") pod \"keystone-bootstrap-9g6g8\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.584824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:46:29 crc kubenswrapper[5030]: W0120 23:46:29.612688 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c24aee_5bda_43fe_8d1d_f300da572b7b.slice/crio-0bd4702d7738830219bc8aaff93f4499f2ba5f475349e635a643c10f16107bd1 WatchSource:0}: Error finding container 
0bd4702d7738830219bc8aaff93f4499f2ba5f475349e635a643c10f16107bd1: Status 404 returned error can't find the container with id 0bd4702d7738830219bc8aaff93f4499f2ba5f475349e635a643c10f16107bd1 Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620397 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620446 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg47g\" (UniqueName: \"kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.620683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.631003 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.709919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:46:29 crc kubenswrapper[5030]: W0120 23:46:29.713147 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ead5a03_9b2d_4b00_8ae0_0d5192f642cf.slice/crio-8b021494e1aac5122b52829e6c9c87800a7d308af27e58e1c36b78ee640e91e7 WatchSource:0}: Error finding container 8b021494e1aac5122b52829e6c9c87800a7d308af27e58e1c36b78ee640e91e7: Status 404 returned error can't find the container with id 8b021494e1aac5122b52829e6c9c87800a7d308af27e58e1c36b78ee640e91e7 Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721854 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg47g\" (UniqueName: \"kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.721986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" 
Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.722998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.726444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.726645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.728012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.728647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.730549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.740493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg47g\" (UniqueName: \"kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g\") pod \"placement-79b6d76946-54s42\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.840273 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.972921 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497ba0ef-c7db-4c22-8b2f-7da2826ece57" path="/var/lib/kubelet/pods/497ba0ef-c7db-4c22-8b2f-7da2826ece57/volumes" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.974096 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e00e205-ea8a-4a04-8161-dd51e3532143" path="/var/lib/kubelet/pods/4e00e205-ea8a-4a04-8161-dd51e3532143/volumes" Jan 20 23:46:29 crc kubenswrapper[5030]: I0120 23:46:29.974657 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a62a0a-8124-4401-a8fb-5926c1d3ce65" path="/var/lib/kubelet/pods/e6a62a0a-8124-4401-a8fb-5926c1d3ce65/volumes" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.062429 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-9g6g8"] Jan 20 23:46:30 crc kubenswrapper[5030]: W0120 23:46:30.074054 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3d6688_641e_4058_a346_61c3ee44938c.slice/crio-740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2 WatchSource:0}: Error finding container 740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2: Status 404 returned error can't find the container with id 740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.169933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerStarted","Data":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.170063 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-central-agent" containerID="cri-o://059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" gracePeriod=30 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.170095 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.170156 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="proxy-httpd" containerID="cri-o://29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" gracePeriod=30 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.170224 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="sg-core" containerID="cri-o://5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" gracePeriod=30 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.170260 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-notification-agent" containerID="cri-o://e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" gracePeriod=30 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.176516 5030 
generic.go:334] "Generic (PLEG): container finished" podID="43ed9ea9-6c15-4082-b728-5d5d66dfa24e" containerID="f05982f608e268c197c7f0c0f512888069aa7995523bedc0554768de0b8db5ce" exitCode=0 Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.176610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" event={"ID":"43ed9ea9-6c15-4082-b728-5d5d66dfa24e","Type":"ContainerDied","Data":"f05982f608e268c197c7f0c0f512888069aa7995523bedc0554768de0b8db5ce"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.181770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerStarted","Data":"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.181798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerStarted","Data":"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.181809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerStarted","Data":"0bd4702d7738830219bc8aaff93f4499f2ba5f475349e635a643c10f16107bd1"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.182555 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.182586 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.186886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerStarted","Data":"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.186913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerStarted","Data":"3f80c3dc58af05d11075e422056ff2fb4d1fe6db9c8cf1f1d9e4e15c60bb1078"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.188439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" event={"ID":"0d3d6688-641e-4058-a346-61c3ee44938c","Type":"ContainerStarted","Data":"740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.189996 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-z276q" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.190449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerStarted","Data":"8b021494e1aac5122b52829e6c9c87800a7d308af27e58e1c36b78ee640e91e7"} Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.210172 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.252552257 podStartE2EDuration="7.210150623s" podCreationTimestamp="2026-01-20 23:46:23 +0000 UTC" firstStartedPulling="2026-01-20 23:46:24.344794848 +0000 UTC m=+4256.665055136" lastFinishedPulling="2026-01-20 23:46:28.302393214 +0000 UTC m=+4260.622653502" observedRunningTime="2026-01-20 23:46:30.190905516 +0000 UTC m=+4262.511165804" watchObservedRunningTime="2026-01-20 23:46:30.210150623 +0000 UTC m=+4262.530410911" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.246174 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" podStartSLOduration=2.246152897 podStartE2EDuration="2.246152897s" podCreationTimestamp="2026-01-20 23:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:30.222839041 +0000 UTC m=+4262.543099349" watchObservedRunningTime="2026-01-20 23:46:30.246152897 +0000 UTC m=+4262.566413315" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.329579 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.331206 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.339838 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-4xxv7" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.339930 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.340127 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.340241 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.354044 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.377061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.434549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.434781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.434849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrk7\" (UniqueName: \"kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.434925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.434991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.435058 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc 
kubenswrapper[5030]: I0120 23:46:30.458953 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.460252 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.468108 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.494672 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542785 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542817 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klrk7\" (UniqueName: \"kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542882 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2fk\" (UniqueName: \"kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.542992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.543070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.543413 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.553500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.553857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.554433 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.560255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.564729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrk7\" (UniqueName: \"kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7\") pod \"cinder-scheduler-0\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2fk\" (UniqueName: \"kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.644632 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.648022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.651739 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.652633 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.661716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.674450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2fk\" (UniqueName: \"kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk\") pod \"cinder-api-0\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.707889 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:30 crc kubenswrapper[5030]: I0120 23:46:30.784370 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.082126 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.151966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152136 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvcz9\" (UniqueName: \"kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.152273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle\") pod \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\" (UID: \"b4e8bfe6-1143-4b62-a672-1255e6a861e4\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.153606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.153753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.157427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9" (OuterVolumeSpecName: "kube-api-access-zvcz9") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "kube-api-access-zvcz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.157730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts" (OuterVolumeSpecName: "scripts") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.182875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200358 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" exitCode=0 Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200388 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" exitCode=2 Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200398 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" exitCode=0 Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200405 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" exitCode=0 Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerDied","Data":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerDied","Data":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200486 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerDied","Data":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerDied","Data":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"b4e8bfe6-1143-4b62-a672-1255e6a861e4","Type":"ContainerDied","Data":"2effed606f819ecbc25ab9664cb620a95900b259fa37c32966f96b01eabe57b0"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200520 5030 scope.go:117] "RemoveContainer" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.200659 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.209835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerStarted","Data":"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.209879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerStarted","Data":"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.209892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerStarted","Data":"421d4397cda92e5572573f711e6f557fa679c4211ffd70cb0a30eb2d318ff390"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.211001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.211030 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.227907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerStarted","Data":"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.243789 5030 scope.go:117] "RemoveContainer" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.261593 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.266049 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.266072 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvcz9\" (UniqueName: \"kubernetes.io/projected/b4e8bfe6-1143-4b62-a672-1255e6a861e4-kube-api-access-zvcz9\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.266083 5030 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.266213 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4e8bfe6-1143-4b62-a672-1255e6a861e4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.263819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" event={"ID":"0d3d6688-641e-4058-a346-61c3ee44938c","Type":"ContainerStarted","Data":"4c5a39b28271097caeee65f9a045b46a3c2ee6068921dc3575a234b5903108e2"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.274545 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" podStartSLOduration=2.274526533 podStartE2EDuration="2.274526533s" podCreationTimestamp="2026-01-20 23:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:31.238222292 +0000 UTC m=+4263.558482590" watchObservedRunningTime="2026-01-20 23:46:31.274526533 +0000 UTC m=+4263.594786831" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.281907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerStarted","Data":"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.281941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerStarted","Data":"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935"} Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.291589 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.292614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.293786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data" (OuterVolumeSpecName: "config-data") pod "b4e8bfe6-1143-4b62-a672-1255e6a861e4" (UID: "b4e8bfe6-1143-4b62-a672-1255e6a861e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: W0120 23:46:31.293879 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743f3c13_8b5a_454a_9111_2bfe623b17fa.slice/crio-89f373b0afe29ee91464bc60916f26b4795c5450e6639d2913668e8d59951225 WatchSource:0}: Error finding container 89f373b0afe29ee91464bc60916f26b4795c5450e6639d2913668e8d59951225: Status 404 returned error can't find the container with id 89f373b0afe29ee91464bc60916f26b4795c5450e6639d2913668e8d59951225 Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.297207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.321493 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.3214744720000002 podStartE2EDuration="3.321474472s" podCreationTimestamp="2026-01-20 23:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:31.266338955 +0000 UTC m=+4263.586599243" watchObservedRunningTime="2026-01-20 23:46:31.321474472 +0000 UTC m=+4263.641734760" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.337449 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" podStartSLOduration=2.337430069 podStartE2EDuration="2.337430069s" podCreationTimestamp="2026-01-20 23:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:31.286668088 +0000 UTC m=+4263.606928376" watchObservedRunningTime="2026-01-20 23:46:31.337430069 +0000 UTC m=+4263.657690357" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.343738 5030 scope.go:117] "RemoveContainer" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.356207 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.356184945 podStartE2EDuration="3.356184945s" podCreationTimestamp="2026-01-20 23:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:31.312681089 +0000 UTC m=+4263.632941397" watchObservedRunningTime="2026-01-20 23:46:31.356184945 +0000 UTC m=+4263.676445233" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.368227 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.368263 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4e8bfe6-1143-4b62-a672-1255e6a861e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.393788 5030 scope.go:117] "RemoveContainer" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.428744 5030 scope.go:117] "RemoveContainer" 
containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.429405 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": container with ID starting with 29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae not found: ID does not exist" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.429451 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} err="failed to get container status \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": rpc error: code = NotFound desc = could not find container \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": container with ID starting with 29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.429477 5030 scope.go:117] "RemoveContainer" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.430236 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": container with ID starting with 5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14 not found: ID does not exist" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.430274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} err="failed to get container status \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": rpc error: code = NotFound desc = could not find container \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": container with ID starting with 5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14 not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.430288 5030 scope.go:117] "RemoveContainer" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.430582 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": container with ID starting with e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a not found: ID does not exist" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.430604 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} err="failed to get container status \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": rpc error: code = NotFound desc = could not find container \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": container with ID starting with 
e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.430616 5030 scope.go:117] "RemoveContainer" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.431048 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": container with ID starting with 059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df not found: ID does not exist" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431075 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} err="failed to get container status \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": rpc error: code = NotFound desc = could not find container \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": container with ID starting with 059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431089 5030 scope.go:117] "RemoveContainer" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431420 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} err="failed to get container status \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": rpc error: code = NotFound desc = could not find container \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": container with ID starting with 29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431450 5030 scope.go:117] "RemoveContainer" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431759 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} err="failed to get container status \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": rpc error: code = NotFound desc = could not find container \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": container with ID starting with 5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14 not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.431777 5030 scope.go:117] "RemoveContainer" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432007 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} err="failed to get container status \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": rpc error: code = NotFound desc = could not find container \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": container with ID starting with 
e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432028 5030 scope.go:117] "RemoveContainer" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432367 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} err="failed to get container status \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": rpc error: code = NotFound desc = could not find container \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": container with ID starting with 059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432396 5030 scope.go:117] "RemoveContainer" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432653 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} err="failed to get container status \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": rpc error: code = NotFound desc = could not find container \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": container with ID starting with 29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432673 5030 scope.go:117] "RemoveContainer" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432970 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} err="failed to get container status \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": rpc error: code = NotFound desc = could not find container \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": container with ID starting with 5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14 not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.432989 5030 scope.go:117] "RemoveContainer" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433195 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} err="failed to get container status \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": rpc error: code = NotFound desc = could not find container \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": container with ID starting with e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433224 5030 scope.go:117] "RemoveContainer" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433560 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} err="failed to get container status \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": rpc error: code = NotFound desc = could not find container \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": container with ID starting with 059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433580 5030 scope.go:117] "RemoveContainer" containerID="29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433748 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae"} err="failed to get container status \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": rpc error: code = NotFound desc = could not find container \"29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae\": container with ID starting with 29273aece294025f37f60a50ccbc9c52e6fd5d7a14519afe422a0050d94b81ae not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433767 5030 scope.go:117] "RemoveContainer" containerID="5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433919 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14"} err="failed to get container status \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": rpc error: code = NotFound desc = could not find container \"5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14\": container with ID starting with 5aef9deb7cb1d622ddc8279972423c7c7ef0a0943abbbf64eb47c2a76d4f9f14 not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.433935 5030 scope.go:117] "RemoveContainer" containerID="e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.434173 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a"} err="failed to get container status \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": rpc error: code = NotFound desc = could not find container \"e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a\": container with ID starting with e9b5aa5ae40726768678190ad3e843a05a93bc91e8b080f54f11d11021660b5a not found: ID does not exist" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.434189 5030 scope.go:117] "RemoveContainer" containerID="059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.434335 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df"} err="failed to get container status \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": rpc error: code = NotFound desc = could not find container \"059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df\": container with ID starting with 059e88e4c3138d67e9eefcdc2bcfae58f345060a2becebc0c62b0e5943af12df not found: ID does not exist" Jan 
20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.543381 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.548838 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.573389 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.574244 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="proxy-httpd" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576226 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="proxy-httpd" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.576343 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-notification-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576406 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-notification-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.576478 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="sg-core" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576536 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="sg-core" Jan 20 23:46:31 crc kubenswrapper[5030]: E0120 23:46:31.576596 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-central-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576671 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-central-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576876 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-central-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576939 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="proxy-httpd" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.576996 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="sg-core" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.577058 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" containerName="ceilometer-notification-agent" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.578560 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.580885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.581458 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.584196 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.655730 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.675988 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data\") pod \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676113 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztbl2\" (UniqueName: \"kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2\") pod \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676213 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle\") pod \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\" (UID: \"43ed9ea9-6c15-4082-b728-5d5d66dfa24e\") " Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676394 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4zvk\" (UniqueName: \"kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.676586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.680391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "43ed9ea9-6c15-4082-b728-5d5d66dfa24e" (UID: "43ed9ea9-6c15-4082-b728-5d5d66dfa24e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.683824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2" (OuterVolumeSpecName: "kube-api-access-ztbl2") pod "43ed9ea9-6c15-4082-b728-5d5d66dfa24e" (UID: "43ed9ea9-6c15-4082-b728-5d5d66dfa24e"). InnerVolumeSpecName "kube-api-access-ztbl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.717362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43ed9ea9-6c15-4082-b728-5d5d66dfa24e" (UID: "43ed9ea9-6c15-4082-b728-5d5d66dfa24e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778478 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4zvk\" (UniqueName: \"kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778711 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778724 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.778736 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztbl2\" (UniqueName: \"kubernetes.io/projected/43ed9ea9-6c15-4082-b728-5d5d66dfa24e-kube-api-access-ztbl2\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.779202 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.779258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.782009 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.782099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.784262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.795276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4zvk\" (UniqueName: \"kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.795322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts\") pod \"ceilometer-0\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.907669 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:31 crc kubenswrapper[5030]: I0120 23:46:31.976239 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e8bfe6-1143-4b62-a672-1255e6a861e4" path="/var/lib/kubelet/pods/b4e8bfe6-1143-4b62-a672-1255e6a861e4/volumes" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.371655 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:46:32 crc kubenswrapper[5030]: E0120 23:46:32.372345 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ed9ea9-6c15-4082-b728-5d5d66dfa24e" containerName="barbican-db-sync" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.372357 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed9ea9-6c15-4082-b728-5d5d66dfa24e" containerName="barbican-db-sync" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.372529 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ed9ea9-6c15-4082-b728-5d5d66dfa24e" containerName="barbican-db-sync" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.373402 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.375846 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" event={"ID":"43ed9ea9-6c15-4082-b728-5d5d66dfa24e","Type":"ContainerDied","Data":"295d22a9489ae6aec77ef45603377c1153142c30d28536f61b4ec7bdb6bd5717"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.375878 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295d22a9489ae6aec77ef45603377c1153142c30d28536f61b4ec7bdb6bd5717" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.375956 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-hrsm6" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.376480 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.388159 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.390879 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.394813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.394895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.394911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.394941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhdq\" (UniqueName: \"kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.395006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.396650 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.398385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerStarted","Data":"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.398423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerStarted","Data":"10e2d252b37ecd98b5168420dbe975fe27edab71309ee2773750dfed65865307"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.404978 5030 generic.go:334] "Generic (PLEG): container finished" podID="6933f1ec-acd0-4c85-a7b3-17363f451790" containerID="82c5de5d1e9ee60604eec4b8187687c0e8fd8640cff7d12c1f5a28c78d168427" exitCode=0 Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.405038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" 
event={"ID":"6933f1ec-acd0-4c85-a7b3-17363f451790","Type":"ContainerDied","Data":"82c5de5d1e9ee60604eec4b8187687c0e8fd8640cff7d12c1f5a28c78d168427"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.406062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.420153 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.425356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerStarted","Data":"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.425404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerStarted","Data":"89f373b0afe29ee91464bc60916f26b4795c5450e6639d2913668e8d59951225"} Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.441685 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.481233 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.483316 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.489199 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.490779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.497125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.497384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.497522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.497912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle\") 
pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhdq\" (UniqueName: \"kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnspk\" (UniqueName: \"kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.498892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.502270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.509213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 
23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.523431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.526137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhdq\" (UniqueName: \"kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.528655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom\") pod \"barbican-worker-fb4df5d95-xcn4l\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnspk\" (UniqueName: \"kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnz75\" (UniqueName: \"kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.600777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.605814 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.606511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.607378 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.616153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnspk\" (UniqueName: \"kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk\") pod \"barbican-keystone-listener-58d55775d6-drf65\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.705861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.706221 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.706305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.706328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.706357 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnz75\" (UniqueName: \"kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.710470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.712051 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.713114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.715258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.717879 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.719985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnz75\" (UniqueName: \"kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75\") pod \"barbican-api-54bf956f58-k98kq\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.749576 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:46:32 crc kubenswrapper[5030]: I0120 23:46:32.818573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.187898 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.262852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:46:33 crc kubenswrapper[5030]: W0120 23:46:33.274964 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4909635_4c45_4cc9_af55_4df3dfa5a6ae.slice/crio-937f0ec470fa44414da8e254940ef4be0d70ba32f2dc62a9d5caee5c6531fcff WatchSource:0}: Error finding container 937f0ec470fa44414da8e254940ef4be0d70ba32f2dc62a9d5caee5c6531fcff: Status 404 returned error can't find the container with id 937f0ec470fa44414da8e254940ef4be0d70ba32f2dc62a9d5caee5c6531fcff Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.387069 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.416936 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.437461 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerStarted","Data":"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.437518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerStarted","Data":"b84507167787fa2b59d5f4711b61116736f4ffa02b95797c8bba2661aef5dd2b"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.440308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerStarted","Data":"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.456642 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerStarted","Data":"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.457168 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.465126 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerStarted","Data":"937f0ec470fa44414da8e254940ef4be0d70ba32f2dc62a9d5caee5c6531fcff"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.485655 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerStarted","Data":"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.485699 5030 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerStarted","Data":"6614420bd5452b1908c2fe058a5525e7d57bfcdaec657e0ed0dd72061f394bcb"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.491525 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.491508714 podStartE2EDuration="3.491508714s" podCreationTimestamp="2026-01-20 23:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:33.472143765 +0000 UTC m=+4265.792404063" watchObservedRunningTime="2026-01-20 23:46:33.491508714 +0000 UTC m=+4265.811769002" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.496685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerStarted","Data":"29e279a8319639ccedc7df5b4d7f6c623191deb3afc900b214471999fbac52ef"} Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.498198 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.498182847 podStartE2EDuration="3.498182847s" podCreationTimestamp="2026-01-20 23:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:33.487236141 +0000 UTC m=+4265.807496429" watchObservedRunningTime="2026-01-20 23:46:33.498182847 +0000 UTC m=+4265.818443135" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.904635 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.930692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle\") pod \"6933f1ec-acd0-4c85-a7b3-17363f451790\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.930844 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config\") pod \"6933f1ec-acd0-4c85-a7b3-17363f451790\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.930902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m984w\" (UniqueName: \"kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w\") pod \"6933f1ec-acd0-4c85-a7b3-17363f451790\" (UID: \"6933f1ec-acd0-4c85-a7b3-17363f451790\") " Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.941261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w" (OuterVolumeSpecName: "kube-api-access-m984w") pod "6933f1ec-acd0-4c85-a7b3-17363f451790" (UID: "6933f1ec-acd0-4c85-a7b3-17363f451790"). InnerVolumeSpecName "kube-api-access-m984w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.970084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config" (OuterVolumeSpecName: "config") pod "6933f1ec-acd0-4c85-a7b3-17363f451790" (UID: "6933f1ec-acd0-4c85-a7b3-17363f451790"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:33 crc kubenswrapper[5030]: I0120 23:46:33.979697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6933f1ec-acd0-4c85-a7b3-17363f451790" (UID: "6933f1ec-acd0-4c85-a7b3-17363f451790"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.032888 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.032920 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6933f1ec-acd0-4c85-a7b3-17363f451790-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.032931 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m984w\" (UniqueName: \"kubernetes.io/projected/6933f1ec-acd0-4c85-a7b3-17363f451790-kube-api-access-m984w\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.507473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" event={"ID":"6933f1ec-acd0-4c85-a7b3-17363f451790","Type":"ContainerDied","Data":"abf471d2b2054c1f64ae7477dc37126eaca8485709203e80e3bc2b4345ac9084"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.507809 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf471d2b2054c1f64ae7477dc37126eaca8485709203e80e3bc2b4345ac9084" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.507489 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-7v8jz" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.511356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerStarted","Data":"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.511392 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerStarted","Data":"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.514914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerStarted","Data":"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.516984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerStarted","Data":"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.517012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerStarted","Data":"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.517172 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.517202 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.518614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerStarted","Data":"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.520807 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d3d6688-641e-4058-a346-61c3ee44938c" containerID="4c5a39b28271097caeee65f9a045b46a3c2ee6068921dc3575a234b5903108e2" exitCode=0 Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.520889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" event={"ID":"0d3d6688-641e-4058-a346-61c3ee44938c","Type":"ContainerDied","Data":"4c5a39b28271097caeee65f9a045b46a3c2ee6068921dc3575a234b5903108e2"} Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.521150 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api-log" containerID="cri-o://07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" gracePeriod=30 Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.521283 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" 
podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api" containerID="cri-o://22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" gracePeriod=30 Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.542990 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" podStartSLOduration=2.542970152 podStartE2EDuration="2.542970152s" podCreationTimestamp="2026-01-20 23:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:34.530451168 +0000 UTC m=+4266.850711476" watchObservedRunningTime="2026-01-20 23:46:34.542970152 +0000 UTC m=+4266.863230460" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.592088 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" podStartSLOduration=2.592070183 podStartE2EDuration="2.592070183s" podCreationTimestamp="2026-01-20 23:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:34.551667042 +0000 UTC m=+4266.871927340" watchObservedRunningTime="2026-01-20 23:46:34.592070183 +0000 UTC m=+4266.912330461" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.613179 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" podStartSLOduration=2.613160674 podStartE2EDuration="2.613160674s" podCreationTimestamp="2026-01-20 23:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:34.606284107 +0000 UTC m=+4266.926544405" watchObservedRunningTime="2026-01-20 23:46:34.613160674 +0000 UTC m=+4266.933420962" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.681534 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:46:34 crc kubenswrapper[5030]: E0120 23:46:34.681947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6933f1ec-acd0-4c85-a7b3-17363f451790" containerName="neutron-db-sync" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.681962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6933f1ec-acd0-4c85-a7b3-17363f451790" containerName="neutron-db-sync" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.682154 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6933f1ec-acd0-4c85-a7b3-17363f451790" containerName="neutron-db-sync" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.683404 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.688110 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.688266 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.688384 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-xfzxd" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.688517 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.698905 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.760699 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.760745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.760798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.760829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhvtw\" (UniqueName: \"kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.761001 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.863304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.863363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.863407 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.863444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhvtw\" (UniqueName: \"kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.863488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.868026 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.868333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.875839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.884142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhvtw\" (UniqueName: \"kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:34 crc kubenswrapper[5030]: I0120 23:46:34.889087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle\") pod \"neutron-59755f95cc-55knr\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.027831 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.170964 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.369897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.369978 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.369996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370144 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370172 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370215 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp2fk\" (UniqueName: \"kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk\") pod \"9782d2bc-aec5-414f-abdf-6e10b79f3539\" (UID: \"9782d2bc-aec5-414f-abdf-6e10b79f3539\") " Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.370697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs" (OuterVolumeSpecName: "logs") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.371012 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9782d2bc-aec5-414f-abdf-6e10b79f3539-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.371037 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9782d2bc-aec5-414f-abdf-6e10b79f3539-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.521701 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.537007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerStarted","Data":"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424"} Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539406 5030 generic.go:334] "Generic (PLEG): container finished" podID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerID="22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" exitCode=0 Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerDied","Data":"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998"} Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539504 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerDied","Data":"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0"} Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539564 5030 scope.go:117] "RemoveContainer" containerID="22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539478 5030 generic.go:334] "Generic (PLEG): container finished" podID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerID="07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" exitCode=143 Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.539701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"9782d2bc-aec5-414f-abdf-6e10b79f3539","Type":"ContainerDied","Data":"10e2d252b37ecd98b5168420dbe975fe27edab71309ee2773750dfed65865307"} Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.708678 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.844410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts" (OuterVolumeSpecName: "scripts") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: W0120 23:46:35.856512 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0800e960_d721_4790_8586_8f5c77ab05dc.slice/crio-33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0 WatchSource:0}: Error finding container 33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0: Status 404 returned error can't find the container with id 33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0 Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.858052 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk" (OuterVolumeSpecName: "kube-api-access-hp2fk") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "kube-api-access-hp2fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.858221 5030 scope.go:117] "RemoveContainer" containerID="07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.860706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.880801 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.880830 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp2fk\" (UniqueName: \"kubernetes.io/projected/9782d2bc-aec5-414f-abdf-6e10b79f3539-kube-api-access-hp2fk\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.880842 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.886811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.946197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data" (OuterVolumeSpecName: "config-data") pod "9782d2bc-aec5-414f-abdf-6e10b79f3539" (UID: "9782d2bc-aec5-414f-abdf-6e10b79f3539"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.982367 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:35 crc kubenswrapper[5030]: I0120 23:46:35.982707 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9782d2bc-aec5-414f-abdf-6e10b79f3539-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.057925 5030 scope.go:117] "RemoveContainer" containerID="22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" Jan 20 23:46:36 crc kubenswrapper[5030]: E0120 23:46:36.058411 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998\": container with ID starting with 22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998 not found: ID does not exist" containerID="22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.058468 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998"} err="failed to get container status \"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998\": rpc error: code = NotFound desc = could not find container \"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998\": container with ID starting with 22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998 not found: ID does not exist" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.058508 5030 scope.go:117] "RemoveContainer" containerID="07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" Jan 20 23:46:36 crc kubenswrapper[5030]: E0120 23:46:36.059060 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0\": container with ID starting with 07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0 not found: ID does not exist" containerID="07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.059097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0"} err="failed to get container status \"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0\": rpc error: code = NotFound desc = could not find container \"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0\": container with ID starting with 07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0 not found: ID does not exist" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.059122 5030 scope.go:117] "RemoveContainer" containerID="22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.059502 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998"} err="failed to get container status 
\"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998\": rpc error: code = NotFound desc = could not find container \"22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998\": container with ID starting with 22bcb981c94bf699573321d421229004923f033c7a24267097a26c25d376a998 not found: ID does not exist" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.059540 5030 scope.go:117] "RemoveContainer" containerID="07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.059889 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0"} err="failed to get container status \"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0\": rpc error: code = NotFound desc = could not find container \"07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0\": container with ID starting with 07def519f6312165634862b25be6bb1b42a1ef22144d84c4d41be29e95da2bd0 not found: ID does not exist" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.070115 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.169381 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.197543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.197605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.198245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.198728 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pck5d\" (UniqueName: \"kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.199207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: \"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.199317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle\") pod \"0d3d6688-641e-4058-a346-61c3ee44938c\" (UID: 
\"0d3d6688-641e-4058-a346-61c3ee44938c\") " Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.202759 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.206133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts" (OuterVolumeSpecName: "scripts") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.214036 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d" (OuterVolumeSpecName: "kube-api-access-pck5d") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "kube-api-access-pck5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.217245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.227032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.228839 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:36 crc kubenswrapper[5030]: E0120 23:46:36.231090 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api" Jan 20 23:46:36 crc kubenswrapper[5030]: E0120 23:46:36.231148 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api-log" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231157 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api-log" Jan 20 23:46:36 crc kubenswrapper[5030]: E0120 23:46:36.231172 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3d6688-641e-4058-a346-61c3ee44938c" containerName="keystone-bootstrap" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3d6688-641e-4058-a346-61c3ee44938c" containerName="keystone-bootstrap" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231367 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231393 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3d6688-641e-4058-a346-61c3ee44938c" containerName="keystone-bootstrap" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.231406 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" containerName="cinder-api-log" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.232785 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.235721 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.235932 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.236085 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.244954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.263055 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.286734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data" (OuterVolumeSpecName: "config-data") pod "0d3d6688-641e-4058-a346-61c3ee44938c" (UID: "0d3d6688-641e-4058-a346-61c3ee44938c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.301902 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whhd2\" (UniqueName: 
\"kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302346 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302362 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302417 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302428 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pck5d\" (UniqueName: \"kubernetes.io/projected/0d3d6688-641e-4058-a346-61c3ee44938c-kube-api-access-pck5d\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302438 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.302461 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3d6688-641e-4058-a346-61c3ee44938c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.404408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.404887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405293 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405468 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.405530 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whhd2\" (UniqueName: \"kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.406109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.408884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.409146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.414176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.414340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.414423 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.414920 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.420506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whhd2\" (UniqueName: \"kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2\") pod \"cinder-api-0\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.556451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerStarted","Data":"d73ad5f754eb2901b60128bff6a264983d7d74521f4176732b15724e0d3df8c8"} Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.556505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerStarted","Data":"6f2c342a8688dabd93f58244581defe726775c8bbc85ea0469b1283dc62695de"} Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.556518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerStarted","Data":"33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0"} Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.557751 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.560408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerStarted","Data":"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f"} Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.561212 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.563157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" 
event={"ID":"0d3d6688-641e-4058-a346-61c3ee44938c","Type":"ContainerDied","Data":"740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2"} Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.563191 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="740571b006a278182f492b699c5a21ba69fd05c7dfb42235839bff755793bdd2" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.563414 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-9g6g8" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.578119 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" podStartSLOduration=2.57810477 podStartE2EDuration="2.57810477s" podCreationTimestamp="2026-01-20 23:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:36.575960469 +0000 UTC m=+4268.896220777" watchObservedRunningTime="2026-01-20 23:46:36.57810477 +0000 UTC m=+4268.898365058" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.583588 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.608053 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.864777342 podStartE2EDuration="5.607893704s" podCreationTimestamp="2026-01-20 23:46:31 +0000 UTC" firstStartedPulling="2026-01-20 23:46:32.475095518 +0000 UTC m=+4264.795355806" lastFinishedPulling="2026-01-20 23:46:36.21821189 +0000 UTC m=+4268.538472168" observedRunningTime="2026-01-20 23:46:36.602260757 +0000 UTC m=+4268.922521045" watchObservedRunningTime="2026-01-20 23:46:36.607893704 +0000 UTC m=+4268.928153992" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.719603 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.723770 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.727945 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.730927 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.731068 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.731228 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.731338 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-6rz8w" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.731479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.744990 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjn5\" (UniqueName: \"kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814557 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814586 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.814718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjn5\" (UniqueName: \"kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915591 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 
23:46:36.915609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.915664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.921368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.921755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.922452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.922867 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.923552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.924255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.931686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:36 crc kubenswrapper[5030]: I0120 23:46:36.932155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tnjn5\" (UniqueName: \"kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5\") pod \"keystone-7fc69fb975-2tn7g\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:37 crc kubenswrapper[5030]: I0120 23:46:37.048589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:37 crc kubenswrapper[5030]: I0120 23:46:37.113581 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:46:37 crc kubenswrapper[5030]: W0120 23:46:37.118083 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f606ed_8a00_4c89_b7cd_2bc1e166e44d.slice/crio-e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0 WatchSource:0}: Error finding container e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0: Status 404 returned error can't find the container with id e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0 Jan 20 23:46:37 crc kubenswrapper[5030]: I0120 23:46:37.577693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerStarted","Data":"e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0"} Jan 20 23:46:37 crc kubenswrapper[5030]: I0120 23:46:37.981412 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9782d2bc-aec5-414f-abdf-6e10b79f3539" path="/var/lib/kubelet/pods/9782d2bc-aec5-414f-abdf-6e10b79f3539/volumes" Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.270173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.621792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" event={"ID":"9124c35a-a0a3-4adf-92cd-5d7011098204","Type":"ContainerStarted","Data":"67f70121c9d064d309a201aa5a31d13ba6fde83ff52dac3b6851e7006a1e044f"} Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.646768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerStarted","Data":"4692f8f2e0876360cbe854005c8dbab4b736e7fa64a44b994b00fec7f8624e10"} Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.918006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.918310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.950336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:38 crc kubenswrapper[5030]: I0120 23:46:38.959140 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:38 crc kubenswrapper[5030]: E0120 23:46:38.999462 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.151466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.151839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.183466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.191785 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.656109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerStarted","Data":"d9cbf6d5b6e9e1557a747ebddf26e0d91cf97ac99872d10a07f8a705f8fd5859"} Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.657419 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.662379 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" event={"ID":"9124c35a-a0a3-4adf-92cd-5d7011098204","Type":"ContainerStarted","Data":"3f6c7cb33ab3b87a8e27bc70591e84aa73c4cf65c60f08eb014c06c96a92cc58"} Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.662425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.662440 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.663474 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.663503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.663516 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.704012 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.70398591 podStartE2EDuration="3.70398591s" podCreationTimestamp="2026-01-20 23:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:39.684012266 +0000 UTC m=+4272.004272554" watchObservedRunningTime="2026-01-20 23:46:39.70398591 +0000 UTC m=+4272.024246218" Jan 20 23:46:39 crc kubenswrapper[5030]: I0120 23:46:39.720661 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" podStartSLOduration=3.720614574 podStartE2EDuration="3.720614574s" 
podCreationTimestamp="2026-01-20 23:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:39.700441914 +0000 UTC m=+4272.020702212" watchObservedRunningTime="2026-01-20 23:46:39.720614574 +0000 UTC m=+4272.040874882" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.803025 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.805336 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.807728 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.808187 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.818284 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.891773 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w65n\" (UniqueName: \"kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.891838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.891862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.891897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.891957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.892028 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config\") pod 
\"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.892056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.993630 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.993672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.993728 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w65n\" (UniqueName: \"kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.993759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.994650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.994687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.994733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:40 crc kubenswrapper[5030]: I0120 23:46:40.999918 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs\") pod 
\"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.000838 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.001656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.001671 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.002100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.002543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.012321 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w65n\" (UniqueName: \"kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n\") pod \"neutron-8d6495cc4-tcgn7\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.034231 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.075050 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.121387 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.486120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.489938 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.490389 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:46:41 crc kubenswrapper[5030]: W0120 23:46:41.545916 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a76a052_42f5_44fb_b981_dffa9b886afe.slice/crio-92aedd641ddc844f9250dfff551102007a7b55500847f45450a7d051fb61bdff WatchSource:0}: Error finding container 92aedd641ddc844f9250dfff551102007a7b55500847f45450a7d051fb61bdff: Status 404 returned error can't find the container with id 92aedd641ddc844f9250dfff551102007a7b55500847f45450a7d051fb61bdff Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.549367 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.679932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.681261 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="cinder-scheduler" containerID="cri-o://ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56" gracePeriod=30 Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.681369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerStarted","Data":"92aedd641ddc844f9250dfff551102007a7b55500847f45450a7d051fb61bdff"} Jan 20 23:46:41 crc kubenswrapper[5030]: I0120 23:46:41.681822 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="probe" containerID="cri-o://921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339" gracePeriod=30 Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.110066 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.111711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.115171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.125001 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.125262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221183 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221283 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221411 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.221481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4h97\" (UniqueName: \"kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " 
pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.322739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.322829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.322861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.322929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.322991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.323010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.323068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4h97\" (UniqueName: \"kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.323789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.326927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: 
\"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.326949 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.327709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.328164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.330160 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.347404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4h97\" (UniqueName: \"kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97\") pod \"barbican-api-5bd5fbdc44-xvm66\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.436761 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.700163 5030 generic.go:334] "Generic (PLEG): container finished" podID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerID="921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339" exitCode=0 Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.700539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerDied","Data":"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339"} Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.703877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerStarted","Data":"ddc74a644d7ab5111265bc9f9536df25bbc79c7022f36f8e1bbb5069f4cae5f8"} Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.703914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerStarted","Data":"296fedc1bd542169ff9175e281777f11d5558ffe31dda5bcb1a4e4db1759e2e0"} Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.737861 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" podStartSLOduration=2.737844108 podStartE2EDuration="2.737844108s" podCreationTimestamp="2026-01-20 23:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:42.72517231 +0000 UTC m=+4275.045432588" watchObservedRunningTime="2026-01-20 23:46:42.737844108 +0000 UTC m=+4275.058104396" Jan 20 23:46:42 crc kubenswrapper[5030]: I0120 23:46:42.904608 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.420966 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klrk7\" (UniqueName: \"kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data\") pod \"743f3c13-8b5a-454a-9111-2bfe623b17fa\" (UID: \"743f3c13-8b5a-454a-9111-2bfe623b17fa\") " Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.547991 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.552447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7" (OuterVolumeSpecName: "kube-api-access-klrk7") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "kube-api-access-klrk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.554784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.554879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts" (OuterVolumeSpecName: "scripts") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.611387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.650164 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.650242 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743f3c13-8b5a-454a-9111-2bfe623b17fa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.650267 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klrk7\" (UniqueName: \"kubernetes.io/projected/743f3c13-8b5a-454a-9111-2bfe623b17fa-kube-api-access-klrk7\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.650287 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.650306 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.661759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data" (OuterVolumeSpecName: "config-data") pod "743f3c13-8b5a-454a-9111-2bfe623b17fa" (UID: "743f3c13-8b5a-454a-9111-2bfe623b17fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.714378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerStarted","Data":"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7"} Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.714445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerStarted","Data":"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3"} Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.714457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerStarted","Data":"a1ecf587dd8da4b8e0ca0a8bc79489b9100cc4646ac0bbc976d8c923e4c821d0"} Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.714518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716394 5030 generic.go:334] "Generic (PLEG): container finished" podID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerID="ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56" exitCode=0 Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716440 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerDied","Data":"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56"} Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"743f3c13-8b5a-454a-9111-2bfe623b17fa","Type":"ContainerDied","Data":"89f373b0afe29ee91464bc60916f26b4795c5450e6639d2913668e8d59951225"} Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716547 5030 scope.go:117] "RemoveContainer" containerID="921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.716893 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.759020 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" podStartSLOduration=1.758991799 podStartE2EDuration="1.758991799s" podCreationTimestamp="2026-01-20 23:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:43.737670552 +0000 UTC m=+4276.057930850" watchObservedRunningTime="2026-01-20 23:46:43.758991799 +0000 UTC m=+4276.079252087" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.760741 5030 scope.go:117] "RemoveContainer" containerID="ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.761772 5030 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/743f3c13-8b5a-454a-9111-2bfe623b17fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.787710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.807699 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.812873 5030 scope.go:117] "RemoveContainer" containerID="921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339" Jan 20 23:46:43 crc kubenswrapper[5030]: E0120 23:46:43.814419 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339\": container with ID starting with 921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339 not found: ID does not exist" containerID="921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.814485 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339"} err="failed to get container status \"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339\": rpc error: code = NotFound desc = could not find container \"921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339\": container with ID starting with 921c2eb1941ddf7a3fafe2c2fe470b42cbc620c2cb908a714dfdb9c223c5d339 not found: ID does not exist" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.814529 5030 scope.go:117] "RemoveContainer" containerID="ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56" Jan 20 23:46:43 crc kubenswrapper[5030]: E0120 23:46:43.815926 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56\": container with ID starting with ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56 not found: ID does not exist" containerID="ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.815952 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56"} err="failed to get container status \"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56\": rpc error: code = NotFound desc = could not find container \"ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56\": container with ID starting with ab5f32402c9ac2f9b4bc17412c00ee974f32e3d0d1f4485cbcdbd4ff4f4a0b56 not found: ID does not exist" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.817175 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:43 crc kubenswrapper[5030]: E0120 23:46:43.817642 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="probe" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.817660 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="probe" Jan 20 23:46:43 crc kubenswrapper[5030]: E0120 23:46:43.817684 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="cinder-scheduler" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.817691 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="cinder-scheduler" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.817874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="probe" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.817907 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" containerName="cinder-scheduler" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.819023 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.821750 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.825691 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863657 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2zp4\" (UniqueName: \"kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.863898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.964821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.964867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.964891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2zp4\" (UniqueName: \"kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.964950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.964993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.965028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.968479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.968528 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.971748 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743f3c13-8b5a-454a-9111-2bfe623b17fa" path="/var/lib/kubelet/pods/743f3c13-8b5a-454a-9111-2bfe623b17fa/volumes" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.972788 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.973465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.974007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:43 crc kubenswrapper[5030]: I0120 23:46:43.984208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2zp4\" (UniqueName: \"kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4\") pod \"cinder-scheduler-0\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.135248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.183929 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.322994 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.574969 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.725839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerStarted","Data":"57f57eab70ae5eac2734902fae4f4946bdc32ad4952a5e1e99f5664556ef3e3a"} Jan 20 23:46:44 crc kubenswrapper[5030]: I0120 23:46:44.727391 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:46 crc kubenswrapper[5030]: I0120 23:46:46.761173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerStarted","Data":"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2"} Jan 20 23:46:46 crc kubenswrapper[5030]: I0120 23:46:46.761842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerStarted","Data":"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb"} Jan 20 23:46:46 crc kubenswrapper[5030]: I0120 23:46:46.789089 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.789073294 podStartE2EDuration="3.789073294s" podCreationTimestamp="2026-01-20 23:46:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:46:46.788417699 +0000 UTC m=+4279.108678017" watchObservedRunningTime="2026-01-20 23:46:46.789073294 +0000 UTC m=+4279.109333572" Jan 20 23:46:48 crc kubenswrapper[5030]: I0120 23:46:48.288540 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:46:49 crc kubenswrapper[5030]: I0120 23:46:49.135708 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:49 crc kubenswrapper[5030]: E0120 23:46:49.276358 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:46:53 crc kubenswrapper[5030]: I0120 23:46:53.743942 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:53 crc kubenswrapper[5030]: I0120 23:46:53.776440 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:46:53 crc kubenswrapper[5030]: I0120 23:46:53.863287 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:53 crc kubenswrapper[5030]: I0120 23:46:53.864574 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api" containerID="cri-o://395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1" gracePeriod=30 Jan 20 23:46:53 crc kubenswrapper[5030]: I0120 23:46:53.864821 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api-log" containerID="cri-o://006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091" gracePeriod=30 Jan 20 23:46:54 crc kubenswrapper[5030]: I0120 23:46:54.353699 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:46:54 crc kubenswrapper[5030]: I0120 23:46:54.851431 5030 generic.go:334] "Generic (PLEG): container finished" podID="27b05991-4609-438d-8101-49094dcaab73" containerID="006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091" exitCode=143 Jan 20 23:46:54 crc kubenswrapper[5030]: I0120 23:46:54.851506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerDied","Data":"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091"} Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.465433 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.565466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data\") pod \"27b05991-4609-438d-8101-49094dcaab73\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.565538 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle\") pod \"27b05991-4609-438d-8101-49094dcaab73\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.565701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs\") pod \"27b05991-4609-438d-8101-49094dcaab73\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.565734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnz75\" (UniqueName: \"kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75\") pod \"27b05991-4609-438d-8101-49094dcaab73\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.565754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom\") pod \"27b05991-4609-438d-8101-49094dcaab73\" (UID: \"27b05991-4609-438d-8101-49094dcaab73\") " Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.566356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs" (OuterVolumeSpecName: "logs") pod "27b05991-4609-438d-8101-49094dcaab73" (UID: "27b05991-4609-438d-8101-49094dcaab73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.571233 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "27b05991-4609-438d-8101-49094dcaab73" (UID: "27b05991-4609-438d-8101-49094dcaab73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.583842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75" (OuterVolumeSpecName: "kube-api-access-cnz75") pod "27b05991-4609-438d-8101-49094dcaab73" (UID: "27b05991-4609-438d-8101-49094dcaab73"). InnerVolumeSpecName "kube-api-access-cnz75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.589601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b05991-4609-438d-8101-49094dcaab73" (UID: "27b05991-4609-438d-8101-49094dcaab73"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.611191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data" (OuterVolumeSpecName: "config-data") pod "27b05991-4609-438d-8101-49094dcaab73" (UID: "27b05991-4609-438d-8101-49094dcaab73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.668473 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27b05991-4609-438d-8101-49094dcaab73-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.668511 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnz75\" (UniqueName: \"kubernetes.io/projected/27b05991-4609-438d-8101-49094dcaab73-kube-api-access-cnz75\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.668525 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.668538 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.668548 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b05991-4609-438d-8101-49094dcaab73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.892159 5030 generic.go:334] "Generic (PLEG): container finished" podID="27b05991-4609-438d-8101-49094dcaab73" containerID="395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1" exitCode=0 Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.892205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerDied","Data":"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1"} Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.892257 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" event={"ID":"27b05991-4609-438d-8101-49094dcaab73","Type":"ContainerDied","Data":"29e279a8319639ccedc7df5b4d7f6c623191deb3afc900b214471999fbac52ef"} Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.892273 5030 scope.go:117] "RemoveContainer" containerID="395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.892281 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54bf956f58-k98kq" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.940704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.948812 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-54bf956f58-k98kq"] Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.950947 5030 scope.go:117] "RemoveContainer" containerID="006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.975979 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b05991-4609-438d-8101-49094dcaab73" path="/var/lib/kubelet/pods/27b05991-4609-438d-8101-49094dcaab73/volumes" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.983229 5030 scope.go:117] "RemoveContainer" containerID="395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1" Jan 20 23:46:57 crc kubenswrapper[5030]: E0120 23:46:57.983607 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1\": container with ID starting with 395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1 not found: ID does not exist" containerID="395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.983662 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1"} err="failed to get container status \"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1\": rpc error: code = NotFound desc = could not find container \"395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1\": container with ID starting with 395e9a0cd2baaa9a1747350534c5a945f8f1526a6b21dfdc2f59fba8b9b49db1 not found: ID does not exist" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.983687 5030 scope.go:117] "RemoveContainer" containerID="006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091" Jan 20 23:46:57 crc kubenswrapper[5030]: E0120 23:46:57.984556 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091\": container with ID starting with 006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091 not found: ID does not exist" containerID="006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091" Jan 20 23:46:57 crc kubenswrapper[5030]: I0120 23:46:57.984744 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091"} err="failed to get container status \"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091\": rpc error: code = NotFound desc = could not find container \"006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091\": container with ID starting with 006c6af6ec7f5c7a7adb7d60fb62998d52343c19ab5a4d217ce16b6062fbd091 not found: ID does not exist" Jan 20 23:46:59 crc kubenswrapper[5030]: E0120 23:46:59.515920 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:46:59 crc kubenswrapper[5030]: I0120 23:46:59.952183 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:46:59 crc kubenswrapper[5030]: I0120 23:46:59.952453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:47:00 crc kubenswrapper[5030]: I0120 23:47:00.849849 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:47:00 crc kubenswrapper[5030]: I0120 23:47:00.894065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:47:00 crc kubenswrapper[5030]: I0120 23:47:00.955073 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:47:00 crc kubenswrapper[5030]: I0120 23:47:00.955308 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-log" containerID="cri-o://1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e" gracePeriod=30 Jan 20 23:47:00 crc kubenswrapper[5030]: I0120 23:47:00.955425 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-api" containerID="cri-o://36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84" gracePeriod=30 Jan 20 23:47:01 crc kubenswrapper[5030]: I0120 23:47:01.916908 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:01 crc kubenswrapper[5030]: I0120 23:47:01.943398 5030 generic.go:334] "Generic (PLEG): container finished" podID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerID="1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e" exitCode=143 Jan 20 23:47:01 crc kubenswrapper[5030]: I0120 23:47:01.943541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerDied","Data":"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e"} Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.819065 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts\") pod \"84c24aee-5bda-43fe-8d1d-f300da572b7b\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs\") pod \"84c24aee-5bda-43fe-8d1d-f300da572b7b\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912328 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data\") pod \"84c24aee-5bda-43fe-8d1d-f300da572b7b\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpvp6\" (UniqueName: \"kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6\") pod \"84c24aee-5bda-43fe-8d1d-f300da572b7b\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle\") pod \"84c24aee-5bda-43fe-8d1d-f300da572b7b\" (UID: \"84c24aee-5bda-43fe-8d1d-f300da572b7b\") " Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.912857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs" (OuterVolumeSpecName: "logs") pod "84c24aee-5bda-43fe-8d1d-f300da572b7b" (UID: "84c24aee-5bda-43fe-8d1d-f300da572b7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.913789 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84c24aee-5bda-43fe-8d1d-f300da572b7b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.918379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts" (OuterVolumeSpecName: "scripts") pod "84c24aee-5bda-43fe-8d1d-f300da572b7b" (UID: "84c24aee-5bda-43fe-8d1d-f300da572b7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.919163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6" (OuterVolumeSpecName: "kube-api-access-fpvp6") pod "84c24aee-5bda-43fe-8d1d-f300da572b7b" (UID: "84c24aee-5bda-43fe-8d1d-f300da572b7b"). InnerVolumeSpecName "kube-api-access-fpvp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.978707 5030 generic.go:334] "Generic (PLEG): container finished" podID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerID="36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84" exitCode=0 Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.978768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerDied","Data":"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84"} Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.979033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" event={"ID":"84c24aee-5bda-43fe-8d1d-f300da572b7b","Type":"ContainerDied","Data":"0bd4702d7738830219bc8aaff93f4499f2ba5f475349e635a643c10f16107bd1"} Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.979065 5030 scope.go:117] "RemoveContainer" containerID="36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.978819 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-fd7fccdd6-2qq68" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.988697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84c24aee-5bda-43fe-8d1d-f300da572b7b" (UID: "84c24aee-5bda-43fe-8d1d-f300da572b7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:04 crc kubenswrapper[5030]: I0120 23:47:04.999300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data" (OuterVolumeSpecName: "config-data") pod "84c24aee-5bda-43fe-8d1d-f300da572b7b" (UID: "84c24aee-5bda-43fe-8d1d-f300da572b7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.015403 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.015456 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpvp6\" (UniqueName: \"kubernetes.io/projected/84c24aee-5bda-43fe-8d1d-f300da572b7b-kube-api-access-fpvp6\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.015489 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.015516 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84c24aee-5bda-43fe-8d1d-f300da572b7b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.018699 5030 scope.go:117] "RemoveContainer" containerID="1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.045371 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.046732 5030 scope.go:117] "RemoveContainer" containerID="36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84" Jan 20 23:47:05 crc kubenswrapper[5030]: E0120 23:47:05.047873 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84\": container with ID starting with 36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84 not found: ID does not exist" containerID="36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.047931 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84"} err="failed to get container status \"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84\": rpc error: code = NotFound desc = could not find container \"36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84\": container with ID starting with 36265ff97ff1d54df0caea2a363dab0086d9d2663bb0d319e7493b32674eee84 not found: ID does not exist" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.047954 5030 scope.go:117] "RemoveContainer" containerID="1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e" Jan 20 23:47:05 crc kubenswrapper[5030]: E0120 23:47:05.048469 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e\": container with ID starting with 1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e not found: ID does not exist" containerID="1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.048508 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e"} err="failed to get container status \"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e\": rpc error: code = NotFound desc = could not find container \"1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e\": container with ID starting with 1eb541585fbd67506eb0e791ddff809f983a430209159e2301cac2f668f7034e not found: ID does not exist" Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.332793 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.342516 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-fd7fccdd6-2qq68"] Jan 20 23:47:05 crc kubenswrapper[5030]: I0120 23:47:05.979528 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" path="/var/lib/kubelet/pods/84c24aee-5bda-43fe-8d1d-f300da572b7b/volumes" Jan 20 23:47:08 crc kubenswrapper[5030]: I0120 23:47:08.475148 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.342176 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:47:09 crc kubenswrapper[5030]: E0120 23:47:09.342778 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.342811 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api" Jan 20 23:47:09 crc kubenswrapper[5030]: E0120 23:47:09.342834 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-log" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.342847 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-log" Jan 20 23:47:09 crc kubenswrapper[5030]: E0120 23:47:09.342883 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api-log" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.342893 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api-log" Jan 20 23:47:09 crc kubenswrapper[5030]: E0120 23:47:09.342916 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-api" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.342926 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-api" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.343209 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api-log" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.343236 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-log" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.343267 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="27b05991-4609-438d-8101-49094dcaab73" containerName="barbican-api" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.343286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c24aee-5bda-43fe-8d1d-f300da572b7b" containerName="placement-api" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.344255 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.350668 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.351032 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.351654 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.359381 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-lpzq6" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.519914 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.520097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2nm\" (UniqueName: \"kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.520133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.520304 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.621285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.622211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2nm\" (UniqueName: \"kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " 
pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.622268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.622302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.624616 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.635541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.640206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.641159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2nm\" (UniqueName: \"kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm\") pod \"openstackclient\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: I0120 23:47:09.681212 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:47:09 crc kubenswrapper[5030]: E0120 23:47:09.747685 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:47:10 crc kubenswrapper[5030]: I0120 23:47:10.140261 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:47:10 crc kubenswrapper[5030]: W0120 23:47:10.142241 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99675934_2bb7_46f9_8e3c_49e818022424.slice/crio-373eb10a0b41d6b4fb293ec4d42b16aa2f22c4c797ccd0ced1186569caccbd59 WatchSource:0}: Error finding container 373eb10a0b41d6b4fb293ec4d42b16aa2f22c4c797ccd0ced1186569caccbd59: Status 404 returned error can't find the container with id 373eb10a0b41d6b4fb293ec4d42b16aa2f22c4c797ccd0ced1186569caccbd59 Jan 20 23:47:10 crc kubenswrapper[5030]: I0120 23:47:10.157003 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:47:10 crc kubenswrapper[5030]: I0120 23:47:10.157047 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.090638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"99675934-2bb7-46f9-8e3c-49e818022424","Type":"ContainerStarted","Data":"e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0"} Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.090994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"99675934-2bb7-46f9-8e3c-49e818022424","Type":"ContainerStarted","Data":"373eb10a0b41d6b4fb293ec4d42b16aa2f22c4c797ccd0ced1186569caccbd59"} Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.117110 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.11708879 podStartE2EDuration="2.11708879s" podCreationTimestamp="2026-01-20 23:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:11.108328907 +0000 UTC m=+4303.428589195" watchObservedRunningTime="2026-01-20 23:47:11.11708879 +0000 UTC m=+4303.437349088" Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.138139 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.203251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.203547 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-api" containerID="cri-o://6f2c342a8688dabd93f58244581defe726775c8bbc85ea0469b1283dc62695de" gracePeriod=30 Jan 20 23:47:11 crc kubenswrapper[5030]: I0120 23:47:11.203726 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-httpd" containerID="cri-o://d73ad5f754eb2901b60128bff6a264983d7d74521f4176732b15724e0d3df8c8" gracePeriod=30 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.101126 5030 generic.go:334] "Generic (PLEG): container finished" podID="0800e960-d721-4790-8586-8f5c77ab05dc" containerID="d73ad5f754eb2901b60128bff6a264983d7d74521f4176732b15724e0d3df8c8" exitCode=0 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.101196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerDied","Data":"d73ad5f754eb2901b60128bff6a264983d7d74521f4176732b15724e0d3df8c8"} Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.616943 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.618920 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.624444 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.624575 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.624799 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.642393 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.696875 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.697126 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-central-agent" containerID="cri-o://76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c" gracePeriod=30 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.697234 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="proxy-httpd" containerID="cri-o://58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f" gracePeriod=30 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.697275 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="sg-core" containerID="cri-o://0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424" gracePeriod=30 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.697312 
5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-notification-agent" containerID="cri-o://02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44" gracePeriod=30 Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777439 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777773 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csbcb\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.777810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: 
\"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879079 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879138 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csbcb\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.879249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.881199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd\") pod 
\"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:12 crc kubenswrapper[5030]: I0120 23:47:12.881445 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.046065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.046182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.046914 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.046926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.047205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.047967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csbcb\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb\") pod \"swift-proxy-85b58b6496-6nkdc\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.114281 5030 generic.go:334] "Generic (PLEG): container finished" podID="0800e960-d721-4790-8586-8f5c77ab05dc" containerID="6f2c342a8688dabd93f58244581defe726775c8bbc85ea0469b1283dc62695de" exitCode=0 Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.114385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerDied","Data":"6f2c342a8688dabd93f58244581defe726775c8bbc85ea0469b1283dc62695de"} Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.114614 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" event={"ID":"0800e960-d721-4790-8586-8f5c77ab05dc","Type":"ContainerDied","Data":"33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0"} Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.114645 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b43d709c6e0de7c397a549eecc4ee8a17acdc4d4ab5618a585277d6f9cd4a0" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.119921 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerID="0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424" exitCode=2 Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.119963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerDied","Data":"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424"} Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.164524 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.246312 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.288160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config\") pod \"0800e960-d721-4790-8586-8f5c77ab05dc\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.288347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config\") pod \"0800e960-d721-4790-8586-8f5c77ab05dc\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.288575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle\") pod \"0800e960-d721-4790-8586-8f5c77ab05dc\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.288692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhvtw\" (UniqueName: \"kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw\") pod \"0800e960-d721-4790-8586-8f5c77ab05dc\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.289577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs\") pod \"0800e960-d721-4790-8586-8f5c77ab05dc\" (UID: \"0800e960-d721-4790-8586-8f5c77ab05dc\") " Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.293619 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw" (OuterVolumeSpecName: "kube-api-access-zhvtw") pod "0800e960-d721-4790-8586-8f5c77ab05dc" (UID: "0800e960-d721-4790-8586-8f5c77ab05dc"). 
InnerVolumeSpecName "kube-api-access-zhvtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.294585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0800e960-d721-4790-8586-8f5c77ab05dc" (UID: "0800e960-d721-4790-8586-8f5c77ab05dc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.333994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config" (OuterVolumeSpecName: "config") pod "0800e960-d721-4790-8586-8f5c77ab05dc" (UID: "0800e960-d721-4790-8586-8f5c77ab05dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.342815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0800e960-d721-4790-8586-8f5c77ab05dc" (UID: "0800e960-d721-4790-8586-8f5c77ab05dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.371849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0800e960-d721-4790-8586-8f5c77ab05dc" (UID: "0800e960-d721-4790-8586-8f5c77ab05dc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.392327 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.392358 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.392371 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.392384 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhvtw\" (UniqueName: \"kubernetes.io/projected/0800e960-d721-4790-8586-8f5c77ab05dc-kube-api-access-zhvtw\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.392393 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0800e960-d721-4790-8586-8f5c77ab05dc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:13 crc kubenswrapper[5030]: W0120 23:47:13.705679 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3568f883_8a26_4dae_b664_474faca62644.slice/crio-34d6ec9ef98aaec25c7575000309cb71c1632121138c7fc69ba6fba4fed06c17 WatchSource:0}: Error finding container 
34d6ec9ef98aaec25c7575000309cb71c1632121138c7fc69ba6fba4fed06c17: Status 404 returned error can't find the container with id 34d6ec9ef98aaec25c7575000309cb71c1632121138c7fc69ba6fba4fed06c17 Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.711203 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.726680 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-c2wf4"] Jan 20 23:47:13 crc kubenswrapper[5030]: E0120 23:47:13.727158 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-httpd" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.727176 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-httpd" Jan 20 23:47:13 crc kubenswrapper[5030]: E0120 23:47:13.727189 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-api" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.727196 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-api" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.727372 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-httpd" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.727404 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" containerName="neutron-api" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.728034 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.732472 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-c2wf4"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.803560 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-lklcb"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.804840 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.816665 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.822130 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.826739 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.851877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-lklcb"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.865850 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.904221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.904330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.904354 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrfxr\" (UniqueName: \"kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.904426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjch\" (UniqueName: \"kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.909376 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-bp42s"] Jan 20 23:47:13 crc kubenswrapper[5030]: I0120 23:47:13.910590 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.006882 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglxq\" (UniqueName: \"kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.006958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjch\" (UniqueName: \"kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxbb\" (UniqueName: \"kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb\") pod \"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007114 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts\") pod \"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrfxr\" (UniqueName: \"kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.007246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.008028 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.008159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.022475 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-bp42s"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.043489 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.044773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.049870 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.055609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrfxr\" (UniqueName: \"kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr\") pod \"nova-cell0-db-create-lklcb\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.057382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjch\" (UniqueName: \"kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch\") pod \"nova-api-db-create-c2wf4\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.058385 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.109830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglxq\" (UniqueName: \"kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.110069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxbb\" (UniqueName: \"kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb\") pod \"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.110193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts\") pod 
\"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.110367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.111843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.111942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts\") pod \"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.127767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglxq\" (UniqueName: \"kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq\") pod \"nova-api-3da3-account-create-update-tgj5m\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.130989 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxbb\" (UniqueName: \"kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb\") pod \"nova-cell1-db-create-bp42s\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.137566 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerID="58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f" exitCode=0 Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.137673 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerID="76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c" exitCode=0 Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.137786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerDied","Data":"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f"} Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.137883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerDied","Data":"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c"} Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.139167 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-59755f95cc-55knr" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.140167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerStarted","Data":"34d6ec9ef98aaec25c7575000309cb71c1632121138c7fc69ba6fba4fed06c17"} Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.185982 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.199606 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.199858 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-59755f95cc-55knr"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.209495 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.211848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.211921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntnj\" (UniqueName: \"kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.255986 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.259247 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.270506 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.281205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.290420 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.294681 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5"] Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.313726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.314284 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntnj\" (UniqueName: \"kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.314466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6xz\" (UniqueName: \"kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.314542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.317561 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.341178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntnj\" (UniqueName: \"kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj\") pod \"nova-cell0-02a2-account-create-update-dgbt5\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.367124 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.418673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6xz\" (UniqueName: \"kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.418766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.420708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.449222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6xz\" (UniqueName: \"kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz\") pod \"nova-cell1-f32b-account-create-update-tdqp5\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.634174 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.759996 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-c2wf4"] Jan 20 23:47:14 crc kubenswrapper[5030]: W0120 23:47:14.773446 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc4566d9_34be_4ff1_8c9b_41c517c89979.slice/crio-419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea WatchSource:0}: Error finding container 419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea: Status 404 returned error can't find the container with id 419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.858115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-lklcb"] Jan 20 23:47:14 crc kubenswrapper[5030]: W0120 23:47:14.870099 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c103a8f_3d99_4db6_861d_2fffca40b7ab.slice/crio-1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a WatchSource:0}: Error finding container 1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a: Status 404 returned error can't find the container with id 1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a Jan 20 23:47:14 crc kubenswrapper[5030]: I0120 23:47:14.994219 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-bp42s"] Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.017017 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5"] Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.031430 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m"] Jan 20 23:47:15 crc kubenswrapper[5030]: W0120 23:47:15.049776 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87816c4_975a_45d5_b926_dd5ec79d3b1e.slice/crio-65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1 WatchSource:0}: Error finding container 65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1: Status 404 returned error can't find the container with id 65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1 Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.141647 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5"] Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.157107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerStarted","Data":"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.157160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerStarted","Data":"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.157917 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.157946 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.160328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" event={"ID":"b87816c4-975a-45d5-b926-dd5ec79d3b1e","Type":"ContainerStarted","Data":"65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.164266 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" event={"ID":"9c103a8f-3d99-4db6-861d-2fffca40b7ab","Type":"ContainerStarted","Data":"f02cc91e2853d55d3ece04bb8ab62f60b3907e5d9e90aa4bedab46dedc0cfc98"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.164318 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" event={"ID":"9c103a8f-3d99-4db6-861d-2fffca40b7ab","Type":"ContainerStarted","Data":"1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.167200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" event={"ID":"efde4f69-1772-44dd-b9d1-fdc471133d2f","Type":"ContainerStarted","Data":"46c2013d4aadc107208e0fe8e057616aac10c33f28f68eac7f5fd141d5f1aa4a"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.169092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" event={"ID":"35aff0fd-825b-4e57-84eb-b284266de899","Type":"ContainerStarted","Data":"9f8a0bad68faec075f1cf315a21e7ae076cb9016ee8e42ccd5cea4f96683a7e1"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.170640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" event={"ID":"dc4566d9-34be-4ff1-8c9b-41c517c89979","Type":"ContainerStarted","Data":"3e83d2eec52fcc87d54a061d272d727de88d4704afbd112e7973bc9837044a45"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.170659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" event={"ID":"dc4566d9-34be-4ff1-8c9b-41c517c89979","Type":"ContainerStarted","Data":"419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea"} Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.185578 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" podStartSLOduration=3.185558335 podStartE2EDuration="3.185558335s" podCreationTimestamp="2026-01-20 23:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:15.180654177 +0000 UTC m=+4307.500914465" watchObservedRunningTime="2026-01-20 23:47:15.185558335 +0000 UTC m=+4307.505818623" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.224725 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" podStartSLOduration=2.224706235 podStartE2EDuration="2.224706235s" podCreationTimestamp="2026-01-20 23:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:15.22036513 +0000 UTC m=+4307.540625418" watchObservedRunningTime="2026-01-20 23:47:15.224706235 +0000 UTC m=+4307.544966523" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.239875 5030 scope.go:117] "RemoveContainer" containerID="1e8a9aa23fb4f9c7588b5b3cfa8a42d9d4011ad83f9f2ffcca6b66ca93a8c094" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.326710 5030 scope.go:117] "RemoveContainer" containerID="a73c97b11afcbc04aed83e51bf767bd0510faa92f7acfad1d7f35ffe02c51dc2" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.376856 5030 scope.go:117] "RemoveContainer" containerID="2924e6130325cda5b1a20e5c851ab6d28c294009f722d453a9f1ae2bd9a43edb" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.414460 5030 scope.go:117] "RemoveContainer" containerID="c482b68ed01591be35af0242d8124f86faa81f2474d8a0b10508f5b0e77c62dc" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.623944 5030 scope.go:117] "RemoveContainer" containerID="1d0ac515361116a705fa1bbc68a57e410f5f863be4014e57d7caa5134773a66d" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.649824 5030 scope.go:117] "RemoveContainer" containerID="2b3263321b820b32f721d15d4ce47b9ccdf9ebd90f26eb5bf80c361394a902e3" Jan 20 23:47:15 crc kubenswrapper[5030]: I0120 23:47:15.981560 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0800e960-d721-4790-8586-8f5c77ab05dc" path="/var/lib/kubelet/pods/0800e960-d721-4790-8586-8f5c77ab05dc/volumes" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.134116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.134470 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-log" containerID="cri-o://7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243" gracePeriod=30 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.134582 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-httpd" containerID="cri-o://d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1" gracePeriod=30 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.179664 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c8a41ef-f17b-4a77-976e-0e8531cdbb52" containerID="2f721d727f0ab17d1d01471a6de8a00a13c0596cb70faaf4e7a965049517134c" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.179747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" event={"ID":"8c8a41ef-f17b-4a77-976e-0e8531cdbb52","Type":"ContainerDied","Data":"2f721d727f0ab17d1d01471a6de8a00a13c0596cb70faaf4e7a965049517134c"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.179771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" event={"ID":"8c8a41ef-f17b-4a77-976e-0e8531cdbb52","Type":"ContainerStarted","Data":"c81bb6c7b3d58d76aafd4b5eb7761fd281d54c4c32964170dd3e0786eeb51d8e"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.180958 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="efde4f69-1772-44dd-b9d1-fdc471133d2f" containerID="ba5937ed9f4588f1a552253a88dea74aa441d6cfd6aaa27566780ea0ec75d0ee" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.181007 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" event={"ID":"efde4f69-1772-44dd-b9d1-fdc471133d2f","Type":"ContainerDied","Data":"ba5937ed9f4588f1a552253a88dea74aa441d6cfd6aaa27566780ea0ec75d0ee"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.182794 5030 generic.go:334] "Generic (PLEG): container finished" podID="35aff0fd-825b-4e57-84eb-b284266de899" containerID="e80638829867975c142f9a8772d2bd6c5c492f03b382bfe54bf55e324fbf8d6e" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.182901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" event={"ID":"35aff0fd-825b-4e57-84eb-b284266de899","Type":"ContainerDied","Data":"e80638829867975c142f9a8772d2bd6c5c492f03b382bfe54bf55e324fbf8d6e"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.184018 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc4566d9-34be-4ff1-8c9b-41c517c89979" containerID="3e83d2eec52fcc87d54a061d272d727de88d4704afbd112e7973bc9837044a45" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.184098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" event={"ID":"dc4566d9-34be-4ff1-8c9b-41c517c89979","Type":"ContainerDied","Data":"3e83d2eec52fcc87d54a061d272d727de88d4704afbd112e7973bc9837044a45"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.191042 5030 generic.go:334] "Generic (PLEG): container finished" podID="b87816c4-975a-45d5-b926-dd5ec79d3b1e" containerID="08ec6657f50d5550e85af87826ba5cf46aac8240ff9e47ea5d6763a2de1fa2fa" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.191135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" event={"ID":"b87816c4-975a-45d5-b926-dd5ec79d3b1e","Type":"ContainerDied","Data":"08ec6657f50d5550e85af87826ba5cf46aac8240ff9e47ea5d6763a2de1fa2fa"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.192704 5030 generic.go:334] "Generic (PLEG): container finished" podID="9c103a8f-3d99-4db6-861d-2fffca40b7ab" containerID="f02cc91e2853d55d3ece04bb8ab62f60b3907e5d9e90aa4bedab46dedc0cfc98" exitCode=0 Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.192779 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" event={"ID":"9c103a8f-3d99-4db6-861d-2fffca40b7ab","Type":"ContainerDied","Data":"f02cc91e2853d55d3ece04bb8ab62f60b3907e5d9e90aa4bedab46dedc0cfc98"} Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.528668 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.681648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts\") pod \"dc4566d9-34be-4ff1-8c9b-41c517c89979\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.682367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdjch\" (UniqueName: \"kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch\") pod \"dc4566d9-34be-4ff1-8c9b-41c517c89979\" (UID: \"dc4566d9-34be-4ff1-8c9b-41c517c89979\") " Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.682937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc4566d9-34be-4ff1-8c9b-41c517c89979" (UID: "dc4566d9-34be-4ff1-8c9b-41c517c89979"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.687929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch" (OuterVolumeSpecName: "kube-api-access-tdjch") pod "dc4566d9-34be-4ff1-8c9b-41c517c89979" (UID: "dc4566d9-34be-4ff1-8c9b-41c517c89979"). InnerVolumeSpecName "kube-api-access-tdjch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.784782 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4566d9-34be-4ff1-8c9b-41c517c89979-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.784833 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdjch\" (UniqueName: \"kubernetes.io/projected/dc4566d9-34be-4ff1-8c9b-41c517c89979-kube-api-access-tdjch\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:16 crc kubenswrapper[5030]: I0120 23:47:16.932456 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4zvk\" (UniqueName: \"kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data\") pod \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\" (UID: \"aa508531-48c3-49ba-9a0b-45f2fce5a3f8\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089456 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089771 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.089899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.092679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts" (OuterVolumeSpecName: "scripts") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.093585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk" (OuterVolumeSpecName: "kube-api-access-m4zvk") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "kube-api-access-m4zvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.118720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.181912 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data" (OuterVolumeSpecName: "config-data") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.184378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa508531-48c3-49ba-9a0b-45f2fce5a3f8" (UID: "aa508531-48c3-49ba-9a0b-45f2fce5a3f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191705 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191746 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191761 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191776 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191790 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.191806 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4zvk\" (UniqueName: \"kubernetes.io/projected/aa508531-48c3-49ba-9a0b-45f2fce5a3f8-kube-api-access-m4zvk\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.204126 5030 generic.go:334] "Generic (PLEG): container finished" podID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerID="7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243" exitCode=143 Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.204199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerDied","Data":"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243"} Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.205962 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.205964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-c2wf4" event={"ID":"dc4566d9-34be-4ff1-8c9b-41c517c89979","Type":"ContainerDied","Data":"419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea"} Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.206084 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="419165fa3e0cacb4460e0441f812363dd823825e5dbcb11893e684b5c0d81fea" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.208493 5030 generic.go:334] "Generic (PLEG): container finished" podID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerID="02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44" exitCode=0 Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.208540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerDied","Data":"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44"} Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.208593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"aa508531-48c3-49ba-9a0b-45f2fce5a3f8","Type":"ContainerDied","Data":"6614420bd5452b1908c2fe058a5525e7d57bfcdaec657e0ed0dd72061f394bcb"} Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.208559 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.208613 5030 scope.go:117] "RemoveContainer" containerID="58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.262685 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.271186 5030 scope.go:117] "RemoveContainer" containerID="0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.273233 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281207 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.281557 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4566d9-34be-4ff1-8c9b-41c517c89979" containerName="mariadb-database-create" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4566d9-34be-4ff1-8c9b-41c517c89979" containerName="mariadb-database-create" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.281588 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="sg-core" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281594 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="sg-core" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.281663 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-central-agent" Jan 20 23:47:17 crc 
kubenswrapper[5030]: I0120 23:47:17.281670 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-central-agent" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.281684 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-notification-agent" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281691 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-notification-agent" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.281701 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="proxy-httpd" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281708 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="proxy-httpd" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281899 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-notification-agent" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281913 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="proxy-httpd" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281927 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4566d9-34be-4ff1-8c9b-41c517c89979" containerName="mariadb-database-create" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="ceilometer-central-agent" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.281948 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" containerName="sg-core" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.283410 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.289587 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.289796 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.299727 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.364401 5030 scope.go:117] "RemoveContainer" containerID="02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.385842 5030 scope.go:117] "RemoveContainer" containerID="76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7c4\" (UniqueName: \"kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.396431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data\") pod \"ceilometer-0\" (UID: 
\"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.405318 5030 scope.go:117] "RemoveContainer" containerID="58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.405866 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f\": container with ID starting with 58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f not found: ID does not exist" containerID="58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.405907 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f"} err="failed to get container status \"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f\": rpc error: code = NotFound desc = could not find container \"58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f\": container with ID starting with 58d9e6cc3cb107f7376adb59f063d6a98f9a4806a4c933c4606eee3c3d38344f not found: ID does not exist" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.405935 5030 scope.go:117] "RemoveContainer" containerID="0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.406315 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424\": container with ID starting with 0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424 not found: ID does not exist" containerID="0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.406333 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424"} err="failed to get container status \"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424\": rpc error: code = NotFound desc = could not find container \"0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424\": container with ID starting with 0c891e84022b49df1c5d5035d0ce800f65dec26e591a308961236a0a7f2a9424 not found: ID does not exist" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.406345 5030 scope.go:117] "RemoveContainer" containerID="02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.406796 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44\": container with ID starting with 02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44 not found: ID does not exist" containerID="02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.406826 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44"} err="failed to get container status \"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44\": rpc error: 
code = NotFound desc = could not find container \"02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44\": container with ID starting with 02e88784775eafdc1b9189fcec5afbb4ed9dcee7c3a77c6a3ab8948783a3ce44 not found: ID does not exist" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.406951 5030 scope.go:117] "RemoveContainer" containerID="76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c" Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.408273 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c\": container with ID starting with 76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c not found: ID does not exist" containerID="76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.408328 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c"} err="failed to get container status \"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c\": rpc error: code = NotFound desc = could not find container \"76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c\": container with ID starting with 76c6a6de66aff93f3051fcaf10d3d7f5616d25cbee186689903f84de1167f95c not found: ID does not exist" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.440413 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:17 crc kubenswrapper[5030]: E0120 23:47:17.441156 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-9w7c4 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ceilometer-0" podUID="2936a29a-a0a2-49e1-a4c7-e5de87266b55" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503527 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 
23:47:17.503701 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.503768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7c4\" (UniqueName: \"kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.504400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.504975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.520813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.521701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.521719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.522969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.530687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7c4\" (UniqueName: \"kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4\") pod \"ceilometer-0\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.641906 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.764835 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.773733 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.782604 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.788978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.825013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts\") pod \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.825988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c8a41ef-f17b-4a77-976e-0e8531cdbb52" (UID: "8c8a41ef-f17b-4a77-976e-0e8531cdbb52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.827052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f6xz\" (UniqueName: \"kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz\") pod \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\" (UID: \"8c8a41ef-f17b-4a77-976e-0e8531cdbb52\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.827592 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.831398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz" (OuterVolumeSpecName: "kube-api-access-2f6xz") pod "8c8a41ef-f17b-4a77-976e-0e8531cdbb52" (UID: "8c8a41ef-f17b-4a77-976e-0e8531cdbb52"). InnerVolumeSpecName "kube-api-access-2f6xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929044 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrfxr\" (UniqueName: \"kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr\") pod \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts\") pod \"35aff0fd-825b-4e57-84eb-b284266de899\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsxbb\" (UniqueName: \"kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb\") pod \"efde4f69-1772-44dd-b9d1-fdc471133d2f\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bglxq\" (UniqueName: \"kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq\") pod \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts\") pod \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\" (UID: \"b87816c4-975a-45d5-b926-dd5ec79d3b1e\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ntnj\" (UniqueName: \"kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj\") pod \"35aff0fd-825b-4e57-84eb-b284266de899\" (UID: \"35aff0fd-825b-4e57-84eb-b284266de899\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts\") pod \"efde4f69-1772-44dd-b9d1-fdc471133d2f\" (UID: \"efde4f69-1772-44dd-b9d1-fdc471133d2f\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts\") pod \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\" (UID: \"9c103a8f-3d99-4db6-861d-2fffca40b7ab\") " Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.929697 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f6xz\" (UniqueName: \"kubernetes.io/projected/8c8a41ef-f17b-4a77-976e-0e8531cdbb52-kube-api-access-2f6xz\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.930046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c103a8f-3d99-4db6-861d-2fffca40b7ab" (UID: 
"9c103a8f-3d99-4db6-861d-2fffca40b7ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.930276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b87816c4-975a-45d5-b926-dd5ec79d3b1e" (UID: "b87816c4-975a-45d5-b926-dd5ec79d3b1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.930811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35aff0fd-825b-4e57-84eb-b284266de899" (UID: "35aff0fd-825b-4e57-84eb-b284266de899"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.931020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efde4f69-1772-44dd-b9d1-fdc471133d2f" (UID: "efde4f69-1772-44dd-b9d1-fdc471133d2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.933073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj" (OuterVolumeSpecName: "kube-api-access-6ntnj") pod "35aff0fd-825b-4e57-84eb-b284266de899" (UID: "35aff0fd-825b-4e57-84eb-b284266de899"). InnerVolumeSpecName "kube-api-access-6ntnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.933470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq" (OuterVolumeSpecName: "kube-api-access-bglxq") pod "b87816c4-975a-45d5-b926-dd5ec79d3b1e" (UID: "b87816c4-975a-45d5-b926-dd5ec79d3b1e"). InnerVolumeSpecName "kube-api-access-bglxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.933994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb" (OuterVolumeSpecName: "kube-api-access-rsxbb") pod "efde4f69-1772-44dd-b9d1-fdc471133d2f" (UID: "efde4f69-1772-44dd-b9d1-fdc471133d2f"). InnerVolumeSpecName "kube-api-access-rsxbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.937001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr" (OuterVolumeSpecName: "kube-api-access-vrfxr") pod "9c103a8f-3d99-4db6-861d-2fffca40b7ab" (UID: "9c103a8f-3d99-4db6-861d-2fffca40b7ab"). InnerVolumeSpecName "kube-api-access-vrfxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:17 crc kubenswrapper[5030]: I0120 23:47:17.974223 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa508531-48c3-49ba-9a0b-45f2fce5a3f8" path="/var/lib/kubelet/pods/aa508531-48c3-49ba-9a0b-45f2fce5a3f8/volumes" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031128 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efde4f69-1772-44dd-b9d1-fdc471133d2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031157 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c103a8f-3d99-4db6-861d-2fffca40b7ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031168 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrfxr\" (UniqueName: \"kubernetes.io/projected/9c103a8f-3d99-4db6-861d-2fffca40b7ab-kube-api-access-vrfxr\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031178 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35aff0fd-825b-4e57-84eb-b284266de899-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031189 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsxbb\" (UniqueName: \"kubernetes.io/projected/efde4f69-1772-44dd-b9d1-fdc471133d2f-kube-api-access-rsxbb\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031199 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bglxq\" (UniqueName: \"kubernetes.io/projected/b87816c4-975a-45d5-b926-dd5ec79d3b1e-kube-api-access-bglxq\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031209 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b87816c4-975a-45d5-b926-dd5ec79d3b1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.031218 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ntnj\" (UniqueName: \"kubernetes.io/projected/35aff0fd-825b-4e57-84eb-b284266de899-kube-api-access-6ntnj\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.221901 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" event={"ID":"9c103a8f-3d99-4db6-861d-2fffca40b7ab","Type":"ContainerDied","Data":"1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a"} Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.221959 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1346311934b2916cd857026d242bc114b31e00e73e3e3674b101ea59e08fa24a" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.221927 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-lklcb" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.224652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" event={"ID":"efde4f69-1772-44dd-b9d1-fdc471133d2f","Type":"ContainerDied","Data":"46c2013d4aadc107208e0fe8e057616aac10c33f28f68eac7f5fd141d5f1aa4a"} Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.224689 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c2013d4aadc107208e0fe8e057616aac10c33f28f68eac7f5fd141d5f1aa4a" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.224722 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-bp42s" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.227151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" event={"ID":"35aff0fd-825b-4e57-84eb-b284266de899","Type":"ContainerDied","Data":"9f8a0bad68faec075f1cf315a21e7ae076cb9016ee8e42ccd5cea4f96683a7e1"} Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.227206 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8a0bad68faec075f1cf315a21e7ae076cb9016ee8e42ccd5cea4f96683a7e1" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.227314 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.231136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" event={"ID":"b87816c4-975a-45d5-b926-dd5ec79d3b1e","Type":"ContainerDied","Data":"65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1"} Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.231174 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65deb56f0bf245e35f8cf821f0576dae435a9f10ad71d8ded4d732578229edf1" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.231248 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.243222 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.243411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" event={"ID":"8c8a41ef-f17b-4a77-976e-0e8531cdbb52","Type":"ContainerDied","Data":"c81bb6c7b3d58d76aafd4b5eb7761fd281d54c4c32964170dd3e0786eeb51d8e"} Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.243573 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.244846 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c81bb6c7b3d58d76aafd4b5eb7761fd281d54c4c32964170dd3e0786eeb51d8e" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.256569 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437669 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w7c4\" (UniqueName: \"kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437830 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437854 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.437981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.438039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml\") pod \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\" (UID: \"2936a29a-a0a2-49e1-a4c7-e5de87266b55\") " Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.438957 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.439222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.442350 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts" (OuterVolumeSpecName: "scripts") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.444008 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data" (OuterVolumeSpecName: "config-data") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.444937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4" (OuterVolumeSpecName: "kube-api-access-9w7c4") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "kube-api-access-9w7c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.445178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.448746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2936a29a-a0a2-49e1-a4c7-e5de87266b55" (UID: "2936a29a-a0a2-49e1-a4c7-e5de87266b55"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.539841 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540062 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w7c4\" (UniqueName: \"kubernetes.io/projected/2936a29a-a0a2-49e1-a4c7-e5de87266b55-kube-api-access-9w7c4\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540141 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540249 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540320 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2936a29a-a0a2-49e1-a4c7-e5de87266b55-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:18 crc kubenswrapper[5030]: I0120 23:47:18.540451 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2936a29a-a0a2-49e1-a4c7-e5de87266b55-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.178244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.178491 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-log" containerID="cri-o://a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935" gracePeriod=30 Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.178590 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-httpd" containerID="cri-o://024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c" gracePeriod=30 Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.251823 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.298854 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.334366 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.347839 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.348206 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87816c4-975a-45d5-b926-dd5ec79d3b1e" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87816c4-975a-45d5-b926-dd5ec79d3b1e" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.348235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8a41ef-f17b-4a77-976e-0e8531cdbb52" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348241 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8a41ef-f17b-4a77-976e-0e8531cdbb52" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.348265 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c103a8f-3d99-4db6-861d-2fffca40b7ab" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348271 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c103a8f-3d99-4db6-861d-2fffca40b7ab" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.348280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efde4f69-1772-44dd-b9d1-fdc471133d2f" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348286 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="efde4f69-1772-44dd-b9d1-fdc471133d2f" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.348301 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35aff0fd-825b-4e57-84eb-b284266de899" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348307 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35aff0fd-825b-4e57-84eb-b284266de899" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348466 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c103a8f-3d99-4db6-861d-2fffca40b7ab" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348480 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="efde4f69-1772-44dd-b9d1-fdc471133d2f" containerName="mariadb-database-create" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348494 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8a41ef-f17b-4a77-976e-0e8531cdbb52" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.348509 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87816c4-975a-45d5-b926-dd5ec79d3b1e" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 
23:47:19.348524 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35aff0fd-825b-4e57-84eb-b284266de899" containerName="mariadb-account-create-update" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.350053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.352976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.361542 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.372180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.437257 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.438280 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.444124 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.444150 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.444472 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-gn4g2" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.453209 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs"] Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454570 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqql\" (UniqueName: \"kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454707 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: 
I0120 23:47:19.454781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.454845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556096 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556156 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqql\" (UniqueName: \"kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556316 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfnp6\" (UniqueName: \"kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.556398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.557146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.558674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.560823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.561394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.564645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data\") pod \"ceilometer-0\" (UID: 
\"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.576579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqql\" (UniqueName: \"kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.577259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.658453 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.658529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.658566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfnp6\" (UniqueName: \"kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.658637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.665220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.665354 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.665640 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data\") pod \"nova-cell0-conductor-db-sync-smrcs\" 
(UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.675903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfnp6\" (UniqueName: \"kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6\") pod \"nova-cell0-conductor-db-sync-smrcs\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.745544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.748931 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.756526 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.861689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: 
I0120 23:47:19.861739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2lt6\" (UniqueName: \"kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6\") pod \"94ebfcd5-ace3-4a42-b07e-48858270151a\" (UID: \"94ebfcd5-ace3-4a42-b07e-48858270151a\") " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.866860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6" (OuterVolumeSpecName: "kube-api-access-c2lt6") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "kube-api-access-c2lt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.867986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.869266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs" (OuterVolumeSpecName: "logs") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.873285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts" (OuterVolumeSpecName: "scripts") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.874613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.902898 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.935218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.954980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data" (OuterVolumeSpecName: "config-data") pod "94ebfcd5-ace3-4a42-b07e-48858270151a" (UID: "94ebfcd5-ace3-4a42-b07e-48858270151a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.968993 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969270 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969280 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969290 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94ebfcd5-ace3-4a42-b07e-48858270151a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969351 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969368 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969379 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94ebfcd5-ace3-4a42-b07e-48858270151a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.969441 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2lt6\" (UniqueName: \"kubernetes.io/projected/94ebfcd5-ace3-4a42-b07e-48858270151a-kube-api-access-c2lt6\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:19 crc kubenswrapper[5030]: E0120 23:47:19.979022 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod497ba0ef_c7db_4c22_8b2f_7da2826ece57.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:47:19 crc kubenswrapper[5030]: I0120 23:47:19.982664 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2936a29a-a0a2-49e1-a4c7-e5de87266b55" path="/var/lib/kubelet/pods/2936a29a-a0a2-49e1-a4c7-e5de87266b55/volumes" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.006140 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.070715 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.259058 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs"] Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.262530 5030 generic.go:334] "Generic (PLEG): container finished" podID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerID="d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1" exitCode=0 Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.262584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerDied","Data":"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1"} Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.262608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"94ebfcd5-ace3-4a42-b07e-48858270151a","Type":"ContainerDied","Data":"3f80c3dc58af05d11075e422056ff2fb4d1fe6db9c8cf1f1d9e4e15c60bb1078"} Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.262639 5030 scope.go:117] "RemoveContainer" containerID="d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.262781 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.268586 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerID="a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935" exitCode=143 Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.268647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerDied","Data":"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935"} Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.288044 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.295834 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.302937 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.309431 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:20 crc kubenswrapper[5030]: E0120 23:47:20.310166 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-httpd" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.310448 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-httpd" Jan 20 23:47:20 crc kubenswrapper[5030]: E0120 23:47:20.310582 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-log" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.310664 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-log" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.310899 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-log" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.310989 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" containerName="glance-httpd" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.312068 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.315783 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.316095 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.316939 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.379487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.379878 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5479\" (UniqueName: \"kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.380761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482960 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.482998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5479\" (UniqueName: \"kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.483215 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.483686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.483909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.544679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.544729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.545035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.545779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.546940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5479\" (UniqueName: 
\"kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.567929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.569729 5030 scope.go:117] "RemoveContainer" containerID="7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.635352 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.747495 5030 scope.go:117] "RemoveContainer" containerID="d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1" Jan 20 23:47:20 crc kubenswrapper[5030]: E0120 23:47:20.748219 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1\": container with ID starting with d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1 not found: ID does not exist" containerID="d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.748255 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1"} err="failed to get container status \"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1\": rpc error: code = NotFound desc = could not find container \"d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1\": container with ID starting with d4b576c3601df1ede01c40d13b89ee5a68c09bff6eccc0bad2d568b5d81597a1 not found: ID does not exist" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.748305 5030 scope.go:117] "RemoveContainer" containerID="7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243" Jan 20 23:47:20 crc kubenswrapper[5030]: E0120 23:47:20.748921 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243\": container with ID starting with 7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243 not found: ID does not exist" containerID="7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243" Jan 20 23:47:20 crc kubenswrapper[5030]: I0120 23:47:20.748960 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243"} err="failed to get container status \"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243\": rpc error: code = NotFound desc = could not find container \"7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243\": container with ID starting with 7909e93e6793df80e1395a1fa9b158599f2597163cd273b1941ebc60df184243 not found: ID does not exist" Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.090134 5030 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.284322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" event={"ID":"907d9698-9733-4db5-93c2-8c35280a5610","Type":"ContainerStarted","Data":"d2fed7ec38d03a6a43221941b4cffa52c915ff185c27683474988eb9263cdca6"} Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.284605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" event={"ID":"907d9698-9733-4db5-93c2-8c35280a5610","Type":"ContainerStarted","Data":"aa9541cc5e2bebf711debd3e0858fac06059d0c76d6ced95915c2e928b6fb2c8"} Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.285806 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerStarted","Data":"38ef4db4cd904e75810b9cb033726c2a23117637590a89dfc532a4ffbb5dce68"} Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.297734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerStarted","Data":"cee688626790c7cb73b6123629ec28fdd87d46fe46aefc4ebd146404c590e3fa"} Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.306268 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" podStartSLOduration=2.306257335 podStartE2EDuration="2.306257335s" podCreationTimestamp="2026-01-20 23:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:21.299813659 +0000 UTC m=+4313.620073947" watchObservedRunningTime="2026-01-20 23:47:21.306257335 +0000 UTC m=+4313.626517623" Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.717391 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:21 crc kubenswrapper[5030]: I0120 23:47:21.977237 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ebfcd5-ace3-4a42-b07e-48858270151a" path="/var/lib/kubelet/pods/94ebfcd5-ace3-4a42-b07e-48858270151a/volumes" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.307508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerStarted","Data":"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1"} Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.309313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerStarted","Data":"b5668c5767465d1ac0a059f35961c95c3750890a2ab067c7e40c0fd02ef0a90b"} Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.848743 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7hqw\" (UniqueName: \"kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.933783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs\") pod \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\" (UID: \"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf\") " Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.934926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.942260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts" (OuterVolumeSpecName: "scripts") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.945790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs" (OuterVolumeSpecName: "logs") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.951848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.955782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw" (OuterVolumeSpecName: "kube-api-access-v7hqw") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "kube-api-access-v7hqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:22 crc kubenswrapper[5030]: I0120 23:47:22.997757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.021752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data" (OuterVolumeSpecName: "config-data") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038778 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7hqw\" (UniqueName: \"kubernetes.io/projected/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-kube-api-access-v7hqw\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038831 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038847 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038858 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038868 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038877 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.038887 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.061496 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.067773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" (UID: "2ead5a03-9b2d-4b00-8ae0-0d5192f642cf"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.140701 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.140959 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.318845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerStarted","Data":"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a"} Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.320782 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerStarted","Data":"ae8a26acea412ba89d4499dc41189e13598d4aa8e2f33437732b1f4b93e89caa"} Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.323088 5030 generic.go:334] "Generic (PLEG): container finished" podID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerID="024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c" exitCode=0 Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.323133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerDied","Data":"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c"} Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.323162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"2ead5a03-9b2d-4b00-8ae0-0d5192f642cf","Type":"ContainerDied","Data":"8b021494e1aac5122b52829e6c9c87800a7d308af27e58e1c36b78ee640e91e7"} Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.323180 5030 scope.go:117] "RemoveContainer" containerID="024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.323180 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.345254 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.345237608 podStartE2EDuration="3.345237608s" podCreationTimestamp="2026-01-20 23:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:23.343418224 +0000 UTC m=+4315.663678512" watchObservedRunningTime="2026-01-20 23:47:23.345237608 +0000 UTC m=+4315.665497896" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.346883 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.352550 5030 scope.go:117] "RemoveContainer" containerID="a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.362769 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.369533 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.379712 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.380367 5030 scope.go:117] "RemoveContainer" containerID="024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c" Jan 20 23:47:23 crc kubenswrapper[5030]: E0120 23:47:23.381482 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c\": container with ID starting with 024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c not found: ID does not exist" containerID="024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.381526 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c"} err="failed to get container status \"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c\": rpc error: code = NotFound desc = could not find container \"024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c\": container with ID starting with 024945ba7854a3418939480adf5fc933bae89651e0dcf79a2b5d6a00cd68768c not found: ID does not exist" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.381548 5030 scope.go:117] "RemoveContainer" containerID="a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935" Jan 20 23:47:23 crc kubenswrapper[5030]: E0120 23:47:23.384408 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935\": container with ID starting with a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935 not found: ID does not exist" containerID="a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.384449 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935"} err="failed to get container status \"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935\": rpc error: code = NotFound desc = could not find container \"a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935\": container with ID starting with a69d719a82f846cca84f68dd21c5335bedd3bfc9c3738955f14f594f7e507935 not found: ID does not exist" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.414098 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:23 crc kubenswrapper[5030]: E0120 23:47:23.414421 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-httpd" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.414438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-httpd" Jan 20 23:47:23 crc kubenswrapper[5030]: E0120 23:47:23.414481 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-log" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.414488 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-log" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.414683 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-log" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.414711 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" containerName="glance-httpd" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.416533 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.419553 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.419758 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.430402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.548786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.548883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.548907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.548954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.549003 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.549039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.549096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljj8\" (UniqueName: \"kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" 
Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.549116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.650278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.650675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.650770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.651339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.651384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljj8\" (UniqueName: \"kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.651435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.651535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.651842 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 
crc kubenswrapper[5030]: I0120 23:47:23.651992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.652025 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.652429 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.656301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.656466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.657499 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.659340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.672034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljj8\" (UniqueName: \"kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.685390 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc 
kubenswrapper[5030]: I0120 23:47:23.747085 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:23 crc kubenswrapper[5030]: I0120 23:47:23.978389 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ead5a03-9b2d-4b00-8ae0-0d5192f642cf" path="/var/lib/kubelet/pods/2ead5a03-9b2d-4b00-8ae0-0d5192f642cf/volumes" Jan 20 23:47:24 crc kubenswrapper[5030]: I0120 23:47:24.193564 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:47:24 crc kubenswrapper[5030]: I0120 23:47:24.334689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerStarted","Data":"701f43b2568e312e9772d92318bb41bb84f4409a0d9a2c0eb6f8605d404a8b06"} Jan 20 23:47:24 crc kubenswrapper[5030]: I0120 23:47:24.336006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerStarted","Data":"1c46e108039b15a0fe7bf9973c2b0c7bd1a4858120af7d074e71d7f50ff3b411"} Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.348029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerStarted","Data":"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa"} Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.348373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerStarted","Data":"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76"} Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.351958 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerStarted","Data":"c62780d3e5840b74bc2429c44c1168ee6a179caac4010c37ab7ee9c316f92d48"} Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.352109 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-central-agent" containerID="cri-o://b5668c5767465d1ac0a059f35961c95c3750890a2ab067c7e40c0fd02ef0a90b" gracePeriod=30 Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.352255 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="sg-core" containerID="cri-o://701f43b2568e312e9772d92318bb41bb84f4409a0d9a2c0eb6f8605d404a8b06" gracePeriod=30 Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.352290 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.352308 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="proxy-httpd" containerID="cri-o://c62780d3e5840b74bc2429c44c1168ee6a179caac4010c37ab7ee9c316f92d48" gracePeriod=30 Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.352271 5030 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-notification-agent" containerID="cri-o://ae8a26acea412ba89d4499dc41189e13598d4aa8e2f33437732b1f4b93e89caa" gracePeriod=30 Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.387305 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.387288005 podStartE2EDuration="2.387288005s" podCreationTimestamp="2026-01-20 23:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:25.382651533 +0000 UTC m=+4317.702911831" watchObservedRunningTime="2026-01-20 23:47:25.387288005 +0000 UTC m=+4317.707548293" Jan 20 23:47:25 crc kubenswrapper[5030]: I0120 23:47:25.416106 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.263124739 podStartE2EDuration="6.416082964s" podCreationTimestamp="2026-01-20 23:47:19 +0000 UTC" firstStartedPulling="2026-01-20 23:47:20.569441852 +0000 UTC m=+4312.889702140" lastFinishedPulling="2026-01-20 23:47:24.722400077 +0000 UTC m=+4317.042660365" observedRunningTime="2026-01-20 23:47:25.408990462 +0000 UTC m=+4317.729250750" watchObservedRunningTime="2026-01-20 23:47:25.416082964 +0000 UTC m=+4317.736343252" Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.310800 5030 generic.go:334] "Generic (PLEG): container finished" podID="9036d901-6de7-4301-85c0-62723b10ae6c" containerID="c62780d3e5840b74bc2429c44c1168ee6a179caac4010c37ab7ee9c316f92d48" exitCode=0 Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.311093 5030 generic.go:334] "Generic (PLEG): container finished" podID="9036d901-6de7-4301-85c0-62723b10ae6c" containerID="701f43b2568e312e9772d92318bb41bb84f4409a0d9a2c0eb6f8605d404a8b06" exitCode=2 Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.311105 5030 generic.go:334] "Generic (PLEG): container finished" podID="9036d901-6de7-4301-85c0-62723b10ae6c" containerID="ae8a26acea412ba89d4499dc41189e13598d4aa8e2f33437732b1f4b93e89caa" exitCode=0 Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.310941 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerDied","Data":"c62780d3e5840b74bc2429c44c1168ee6a179caac4010c37ab7ee9c316f92d48"} Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.311899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerDied","Data":"701f43b2568e312e9772d92318bb41bb84f4409a0d9a2c0eb6f8605d404a8b06"} Jan 20 23:47:27 crc kubenswrapper[5030]: I0120 23:47:27.311935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerDied","Data":"ae8a26acea412ba89d4499dc41189e13598d4aa8e2f33437732b1f4b93e89caa"} Jan 20 23:47:28 crc kubenswrapper[5030]: I0120 23:47:28.322520 5030 generic.go:334] "Generic (PLEG): container finished" podID="907d9698-9733-4db5-93c2-8c35280a5610" containerID="d2fed7ec38d03a6a43221941b4cffa52c915ff185c27683474988eb9263cdca6" exitCode=0 Jan 20 23:47:28 crc kubenswrapper[5030]: I0120 23:47:28.322801 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" event={"ID":"907d9698-9733-4db5-93c2-8c35280a5610","Type":"ContainerDied","Data":"d2fed7ec38d03a6a43221941b4cffa52c915ff185c27683474988eb9263cdca6"} Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.700264 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.796363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfnp6\" (UniqueName: \"kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6\") pod \"907d9698-9733-4db5-93c2-8c35280a5610\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.796487 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts\") pod \"907d9698-9733-4db5-93c2-8c35280a5610\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.796513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data\") pod \"907d9698-9733-4db5-93c2-8c35280a5610\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.796664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle\") pod \"907d9698-9733-4db5-93c2-8c35280a5610\" (UID: \"907d9698-9733-4db5-93c2-8c35280a5610\") " Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.803553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts" (OuterVolumeSpecName: "scripts") pod "907d9698-9733-4db5-93c2-8c35280a5610" (UID: "907d9698-9733-4db5-93c2-8c35280a5610"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.803784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6" (OuterVolumeSpecName: "kube-api-access-bfnp6") pod "907d9698-9733-4db5-93c2-8c35280a5610" (UID: "907d9698-9733-4db5-93c2-8c35280a5610"). InnerVolumeSpecName "kube-api-access-bfnp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.827384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "907d9698-9733-4db5-93c2-8c35280a5610" (UID: "907d9698-9733-4db5-93c2-8c35280a5610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.829273 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data" (OuterVolumeSpecName: "config-data") pod "907d9698-9733-4db5-93c2-8c35280a5610" (UID: "907d9698-9733-4db5-93c2-8c35280a5610"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.899192 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.899236 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfnp6\" (UniqueName: \"kubernetes.io/projected/907d9698-9733-4db5-93c2-8c35280a5610-kube-api-access-bfnp6\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.899253 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:29 crc kubenswrapper[5030]: I0120 23:47:29.899266 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d9698-9733-4db5-93c2-8c35280a5610-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.343319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" event={"ID":"907d9698-9733-4db5-93c2-8c35280a5610","Type":"ContainerDied","Data":"aa9541cc5e2bebf711debd3e0858fac06059d0c76d6ced95915c2e928b6fb2c8"} Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.343792 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9541cc5e2bebf711debd3e0858fac06059d0c76d6ced95915c2e928b6fb2c8" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.343412 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.414014 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:30 crc kubenswrapper[5030]: E0120 23:47:30.414412 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907d9698-9733-4db5-93c2-8c35280a5610" containerName="nova-cell0-conductor-db-sync" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.414429 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="907d9698-9733-4db5-93c2-8c35280a5610" containerName="nova-cell0-conductor-db-sync" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.414582 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="907d9698-9733-4db5-93c2-8c35280a5610" containerName="nova-cell0-conductor-db-sync" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.415189 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.417273 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.418954 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-gn4g2" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.426794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.613413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6dr\" (UniqueName: \"kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.613499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.613865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.635886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.636439 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.672999 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.678161 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.715284 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6dr\" (UniqueName: \"kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.715378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.715477 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.723131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.727101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.744235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6dr\" (UniqueName: \"kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr\") pod \"nova-cell0-conductor-0\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:30 crc kubenswrapper[5030]: I0120 23:47:30.750606 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:31 crc kubenswrapper[5030]: I0120 23:47:31.235877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:31 crc kubenswrapper[5030]: I0120 23:47:31.355682 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:31 crc kubenswrapper[5030]: I0120 23:47:31.355748 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:31 crc kubenswrapper[5030]: W0120 23:47:31.449902 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0619400b_dc63_4076_86e8_b2c80b6c4065.slice/crio-a9639d2c8e34f8817f16c03fa6618cd401722c25b13dba8676c5cd520776c7fb WatchSource:0}: Error finding container a9639d2c8e34f8817f16c03fa6618cd401722c25b13dba8676c5cd520776c7fb: Status 404 returned error can't find the container with id a9639d2c8e34f8817f16c03fa6618cd401722c25b13dba8676c5cd520776c7fb Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.369460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"0619400b-dc63-4076-86e8-b2c80b6c4065","Type":"ContainerStarted","Data":"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481"} Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.370012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"0619400b-dc63-4076-86e8-b2c80b6c4065","Type":"ContainerStarted","Data":"a9639d2c8e34f8817f16c03fa6618cd401722c25b13dba8676c5cd520776c7fb"} Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.370640 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:32 crc 
kubenswrapper[5030]: I0120 23:47:32.374343 5030 generic.go:334] "Generic (PLEG): container finished" podID="9036d901-6de7-4301-85c0-62723b10ae6c" containerID="b5668c5767465d1ac0a059f35961c95c3750890a2ab067c7e40c0fd02ef0a90b" exitCode=0 Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.375102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerDied","Data":"b5668c5767465d1ac0a059f35961c95c3750890a2ab067c7e40c0fd02ef0a90b"} Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.375136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"9036d901-6de7-4301-85c0-62723b10ae6c","Type":"ContainerDied","Data":"cee688626790c7cb73b6123629ec28fdd87d46fe46aefc4ebd146404c590e3fa"} Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.375151 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee688626790c7cb73b6123629ec28fdd87d46fe46aefc4ebd146404c590e3fa" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.387661 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.387642334 podStartE2EDuration="2.387642334s" podCreationTimestamp="2026-01-20 23:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:32.386069666 +0000 UTC m=+4324.706329964" watchObservedRunningTime="2026-01-20 23:47:32.387642334 +0000 UTC m=+4324.707902622" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.417896 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: 
\"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqql\" (UniqueName: \"kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql\") pod \"9036d901-6de7-4301-85c0-62723b10ae6c\" (UID: \"9036d901-6de7-4301-85c0-62723b10ae6c\") " Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.550984 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.551006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.559258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts" (OuterVolumeSpecName: "scripts") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.559354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql" (OuterVolumeSpecName: "kube-api-access-qcqql") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "kube-api-access-qcqql". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.578768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.627183 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.645907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data" (OuterVolumeSpecName: "config-data") pod "9036d901-6de7-4301-85c0-62723b10ae6c" (UID: "9036d901-6de7-4301-85c0-62723b10ae6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653306 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9036d901-6de7-4301-85c0-62723b10ae6c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653341 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653352 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653361 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653371 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9036d901-6de7-4301-85c0-62723b10ae6c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:32 crc kubenswrapper[5030]: I0120 23:47:32.653381 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqql\" (UniqueName: \"kubernetes.io/projected/9036d901-6de7-4301-85c0-62723b10ae6c-kube-api-access-qcqql\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.104484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.197984 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.382732 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.430127 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.441719 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.457735 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:33 crc kubenswrapper[5030]: E0120 23:47:33.458073 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="sg-core" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458090 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="sg-core" Jan 20 23:47:33 crc kubenswrapper[5030]: E0120 23:47:33.458105 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-notification-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458112 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-notification-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: E0120 23:47:33.458122 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-central-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-central-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: E0120 23:47:33.458148 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="proxy-httpd" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458154 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="proxy-httpd" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458322 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-central-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458342 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="ceilometer-notification-agent" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="proxy-httpd" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.458364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" containerName="sg-core" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.459824 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.462512 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.470310 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.473259 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.588723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.588975 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.589052 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.589159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.589303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpcqb\" (UniqueName: \"kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.589826 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.590562 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.691832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.691897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.691928 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.691976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.692068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpcqb\" (UniqueName: \"kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.692106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.692145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.692486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.692724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.696412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.696706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.696805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.698454 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.710845 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpcqb\" (UniqueName: \"kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb\") pod \"ceilometer-0\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.748302 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.748359 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.776611 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.800351 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.813197 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:33 crc kubenswrapper[5030]: I0120 23:47:33.974926 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9036d901-6de7-4301-85c0-62723b10ae6c" path="/var/lib/kubelet/pods/9036d901-6de7-4301-85c0-62723b10ae6c/volumes" Jan 20 23:47:34 crc kubenswrapper[5030]: I0120 23:47:34.243301 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:34 crc kubenswrapper[5030]: W0120 23:47:34.243375 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86b08649_8366_4b98_a7d6_2b44667edd54.slice/crio-20507c8a42fc05c30c960ff9abb94dc9849232639c276747b725d5a44a51d718 WatchSource:0}: Error finding container 20507c8a42fc05c30c960ff9abb94dc9849232639c276747b725d5a44a51d718: Status 404 returned error can't find the container with id 20507c8a42fc05c30c960ff9abb94dc9849232639c276747b725d5a44a51d718 Jan 20 23:47:34 crc kubenswrapper[5030]: I0120 23:47:34.393211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerStarted","Data":"20507c8a42fc05c30c960ff9abb94dc9849232639c276747b725d5a44a51d718"} Jan 20 23:47:34 crc kubenswrapper[5030]: I0120 23:47:34.393249 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:34 crc kubenswrapper[5030]: I0120 23:47:34.393859 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:35 crc kubenswrapper[5030]: I0120 23:47:35.401102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerStarted","Data":"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab"} Jan 20 23:47:36 crc kubenswrapper[5030]: I0120 23:47:36.302972 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:36 crc kubenswrapper[5030]: I0120 23:47:36.308275 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:47:36 crc kubenswrapper[5030]: I0120 23:47:36.438236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerStarted","Data":"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111"} Jan 20 23:47:36 crc kubenswrapper[5030]: I0120 23:47:36.438294 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerStarted","Data":"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41"} Jan 20 23:47:37 crc kubenswrapper[5030]: I0120 23:47:37.165251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:37 crc kubenswrapper[5030]: I0120 23:47:37.165449 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerName="nova-cell0-conductor-conductor" containerID="cri-o://06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" gracePeriod=30 Jan 20 23:47:37 crc kubenswrapper[5030]: E0120 23:47:37.170683 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:47:37 crc kubenswrapper[5030]: E0120 23:47:37.172278 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:47:37 crc kubenswrapper[5030]: E0120 23:47:37.173416 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:47:37 crc kubenswrapper[5030]: E0120 23:47:37.173482 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerName="nova-cell0-conductor-conductor" Jan 20 23:47:38 crc kubenswrapper[5030]: I0120 23:47:38.458456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerStarted","Data":"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a"} Jan 20 23:47:38 crc kubenswrapper[5030]: I0120 23:47:38.458739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:38 crc kubenswrapper[5030]: I0120 23:47:38.517008 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.464839893 podStartE2EDuration="5.516989664s" podCreationTimestamp="2026-01-20 23:47:33 +0000 UTC" firstStartedPulling="2026-01-20 23:47:34.245129164 +0000 UTC m=+4326.565389452" lastFinishedPulling="2026-01-20 23:47:37.297278945 +0000 UTC m=+4329.617539223" observedRunningTime="2026-01-20 23:47:38.510869066 +0000 UTC m=+4330.831129354" watchObservedRunningTime="2026-01-20 23:47:38.516989664 +0000 UTC m=+4330.837249952" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.003384 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.437960 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.466803 5030 generic.go:334] "Generic (PLEG): container finished" podID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" exitCode=0 Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.467583 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.467671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"0619400b-dc63-4076-86e8-b2c80b6c4065","Type":"ContainerDied","Data":"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481"} Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.467766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"0619400b-dc63-4076-86e8-b2c80b6c4065","Type":"ContainerDied","Data":"a9639d2c8e34f8817f16c03fa6618cd401722c25b13dba8676c5cd520776c7fb"} Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.467819 5030 scope.go:117] "RemoveContainer" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.489303 5030 scope.go:117] "RemoveContainer" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" Jan 20 23:47:39 crc kubenswrapper[5030]: E0120 23:47:39.490121 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481\": container with ID starting with 06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481 not found: ID does not exist" containerID="06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.490161 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481"} err="failed to get container status \"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481\": rpc error: code = NotFound desc = could not find container \"06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481\": container with ID starting with 06734393c2302678c1600aa75efd1fda4bb2b28ba346d0ca433d45df1af47481 not found: ID does not exist" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.598786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data\") pod \"0619400b-dc63-4076-86e8-b2c80b6c4065\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.598904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6dr\" (UniqueName: \"kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr\") pod \"0619400b-dc63-4076-86e8-b2c80b6c4065\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.599075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle\") pod \"0619400b-dc63-4076-86e8-b2c80b6c4065\" (UID: \"0619400b-dc63-4076-86e8-b2c80b6c4065\") " Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.603928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr" (OuterVolumeSpecName: "kube-api-access-fg6dr") pod "0619400b-dc63-4076-86e8-b2c80b6c4065" (UID: "0619400b-dc63-4076-86e8-b2c80b6c4065"). InnerVolumeSpecName "kube-api-access-fg6dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.629272 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data" (OuterVolumeSpecName: "config-data") pod "0619400b-dc63-4076-86e8-b2c80b6c4065" (UID: "0619400b-dc63-4076-86e8-b2c80b6c4065"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.636268 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0619400b-dc63-4076-86e8-b2c80b6c4065" (UID: "0619400b-dc63-4076-86e8-b2c80b6c4065"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.701121 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.701154 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0619400b-dc63-4076-86e8-b2c80b6c4065-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.701168 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6dr\" (UniqueName: \"kubernetes.io/projected/0619400b-dc63-4076-86e8-b2c80b6c4065-kube-api-access-fg6dr\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.804324 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.813184 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.831228 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:39 crc kubenswrapper[5030]: E0120 23:47:39.831584 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerName="nova-cell0-conductor-conductor" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.831600 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerName="nova-cell0-conductor-conductor" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.831793 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" containerName="nova-cell0-conductor-conductor" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.832325 5030 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.834371 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-gn4g2" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.834725 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.843707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.904821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.904924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.905179 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979dt\" (UniqueName: \"kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:39 crc kubenswrapper[5030]: I0120 23:47:39.973422 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0619400b-dc63-4076-86e8-b2c80b6c4065" path="/var/lib/kubelet/pods/0619400b-dc63-4076-86e8-b2c80b6c4065/volumes" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.006982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-979dt\" (UniqueName: \"kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.007760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.008271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.013069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.013640 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.022241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-979dt\" (UniqueName: \"kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt\") pod \"nova-cell0-conductor-0\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.150306 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.157683 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.157716 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.476101 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-central-agent" containerID="cri-o://25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab" gracePeriod=30 Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.476157 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-notification-agent" containerID="cri-o://51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41" gracePeriod=30 Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.476144 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="sg-core" containerID="cri-o://9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111" gracePeriod=30 Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.476208 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="proxy-httpd" containerID="cri-o://72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a" gracePeriod=30 Jan 20 23:47:40 crc kubenswrapper[5030]: I0120 23:47:40.985293 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 
23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.484433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"32dd6edd-9576-4aa1-9371-f9498e9c1a60","Type":"ContainerStarted","Data":"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd"} Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.484491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"32dd6edd-9576-4aa1-9371-f9498e9c1a60","Type":"ContainerStarted","Data":"6703f6e5c6c634151401763c5f33c32d706ead65ab5d5ab20313596e251a51e8"} Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.484606 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487080 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b08649-8366-4b98-a7d6-2b44667edd54" containerID="72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a" exitCode=0 Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487110 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b08649-8366-4b98-a7d6-2b44667edd54" containerID="9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111" exitCode=2 Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487121 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b08649-8366-4b98-a7d6-2b44667edd54" containerID="51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41" exitCode=0 Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerDied","Data":"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a"} Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerDied","Data":"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111"} Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.487172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerDied","Data":"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41"} Jan 20 23:47:41 crc kubenswrapper[5030]: I0120 23:47:41.501708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.501689949 podStartE2EDuration="2.501689949s" podCreationTimestamp="2026-01-20 23:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:41.498830249 +0000 UTC m=+4333.819090537" watchObservedRunningTime="2026-01-20 23:47:41.501689949 +0000 UTC m=+4333.821950237" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.035922 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164112 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpcqb\" (UniqueName: \"kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.164580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data\") pod \"86b08649-8366-4b98-a7d6-2b44667edd54\" (UID: \"86b08649-8366-4b98-a7d6-2b44667edd54\") " Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.165220 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.165459 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.170902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb" (OuterVolumeSpecName: "kube-api-access-gpcqb") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "kube-api-access-gpcqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.172056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts" (OuterVolumeSpecName: "scripts") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.210512 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.267246 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.267276 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.267290 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.267302 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpcqb\" (UniqueName: \"kubernetes.io/projected/86b08649-8366-4b98-a7d6-2b44667edd54-kube-api-access-gpcqb\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.267311 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86b08649-8366-4b98-a7d6-2b44667edd54-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.284610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.302442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data" (OuterVolumeSpecName: "config-data") pod "86b08649-8366-4b98-a7d6-2b44667edd54" (UID: "86b08649-8366-4b98-a7d6-2b44667edd54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.369113 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.369147 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86b08649-8366-4b98-a7d6-2b44667edd54-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.512076 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b08649-8366-4b98-a7d6-2b44667edd54" containerID="25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab" exitCode=0 Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.512124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerDied","Data":"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab"} Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.512189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"86b08649-8366-4b98-a7d6-2b44667edd54","Type":"ContainerDied","Data":"20507c8a42fc05c30c960ff9abb94dc9849232639c276747b725d5a44a51d718"} Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.512213 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.512220 5030 scope.go:117] "RemoveContainer" containerID="72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.546469 5030 scope.go:117] "RemoveContainer" containerID="9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.552915 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.560309 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.583674 5030 scope.go:117] "RemoveContainer" containerID="51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598003 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.598591 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="proxy-httpd" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598616 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="proxy-httpd" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.598647 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-central-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598658 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-central-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.598699 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="sg-core" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598708 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="sg-core" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.598729 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-notification-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598738 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-notification-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598963 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-central-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.598977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="sg-core" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.599000 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="ceilometer-notification-agent" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.599011 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" containerName="proxy-httpd" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.602009 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.609600 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.611935 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.613399 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.662849 5030 scope.go:117] "RemoveContainer" containerID="25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673532 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673722 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2m9x\" (UniqueName: \"kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.673864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.682783 5030 scope.go:117] "RemoveContainer" containerID="72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.683258 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a\": container with ID starting with 72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a not found: ID does not exist" containerID="72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.683294 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a"} err="failed to get container status \"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a\": rpc error: code = NotFound desc = could not find container \"72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a\": container with ID starting with 72cc3bebcaad664e62566aed4fb5e7ffa69d75452f120bd2743ed4587d6fad4a not found: ID does not exist" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.683320 5030 scope.go:117] "RemoveContainer" containerID="9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.683670 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111\": container with ID starting with 9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111 not found: ID does not exist" containerID="9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.683706 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111"} err="failed to get container status \"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111\": rpc error: code = NotFound desc = could not find container \"9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111\": container with ID starting with 9d47f625fcbeffadcae795dbae8be3ad8e90d407a8177813051572bfc15e5111 not found: ID does not exist" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.683727 5030 scope.go:117] "RemoveContainer" containerID="51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.683975 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41\": container with ID starting with 51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41 not found: ID does not exist" containerID="51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.683998 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41"} err="failed to get container status \"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41\": rpc error: code = NotFound desc = could not find container \"51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41\": container with ID starting with 51c43d1cbee7b9eb61f984f3b60a3790b68f9d599556aa9450e5a0e7717fff41 not found: ID does not exist" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.684011 5030 scope.go:117] "RemoveContainer" containerID="25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab" Jan 20 23:47:43 crc kubenswrapper[5030]: E0120 23:47:43.684453 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab\": container with ID starting with 25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab not found: ID does not exist" containerID="25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.684487 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab"} err="failed to get container status \"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab\": rpc error: code = NotFound desc = could not find container \"25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab\": container with ID starting with 25dc2541f4bf29853064edadf1f7f458f533e0adb4e3f8b4c37ee27007d458ab not found: ID does not exist" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2m9x\" (UniqueName: \"kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.775868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.776094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.776451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.779661 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.781061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.781289 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.782404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.797129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2m9x\" (UniqueName: \"kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x\") pod \"ceilometer-0\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.970742 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:43 crc kubenswrapper[5030]: I0120 23:47:43.975927 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b08649-8366-4b98-a7d6-2b44667edd54" path="/var/lib/kubelet/pods/86b08649-8366-4b98-a7d6-2b44667edd54/volumes" Jan 20 23:47:44 crc kubenswrapper[5030]: I0120 23:47:44.470630 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:47:44 crc kubenswrapper[5030]: I0120 23:47:44.522048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerStarted","Data":"7c34fff42877cec5632a564cbe641912862deccdaaac2b6d4d60f6a664bc77b7"} Jan 20 23:47:45 crc kubenswrapper[5030]: I0120 23:47:45.536048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerStarted","Data":"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754"} Jan 20 23:47:46 crc kubenswrapper[5030]: I0120 23:47:46.548852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerStarted","Data":"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5"} Jan 20 23:47:47 crc kubenswrapper[5030]: I0120 23:47:47.563191 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerStarted","Data":"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e"} Jan 20 23:47:48 crc kubenswrapper[5030]: I0120 23:47:48.617358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerStarted","Data":"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47"} Jan 20 23:47:48 crc kubenswrapper[5030]: I0120 23:47:48.617561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:47:48 crc kubenswrapper[5030]: I0120 23:47:48.662550 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.832843767 podStartE2EDuration="5.66253019s" 
podCreationTimestamp="2026-01-20 23:47:43 +0000 UTC" firstStartedPulling="2026-01-20 23:47:44.474406642 +0000 UTC m=+4336.794666930" lastFinishedPulling="2026-01-20 23:47:48.304093055 +0000 UTC m=+4340.624353353" observedRunningTime="2026-01-20 23:47:48.648993821 +0000 UTC m=+4340.969254109" watchObservedRunningTime="2026-01-20 23:47:48.66253019 +0000 UTC m=+4340.982790488" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.176062 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.750447 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-pclms"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.751913 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.758266 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.758456 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.763405 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-pclms"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.895781 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.897687 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.903736 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.915489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwscs\" (UniqueName: \"kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.915539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.915586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.915678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts\") pod 
\"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.916372 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.917850 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.919636 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.939129 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.950742 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.991643 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.993019 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:50 crc kubenswrapper[5030]: I0120 23:47:50.996139 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8pj\" (UniqueName: \"kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018263 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwscs\" (UniqueName: \"kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: 
I0120 23:47:51.018383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018406 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8t9n\" (UniqueName: \"kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.018533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.019903 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.034511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.042773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " 
pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.049822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwscs\" (UniqueName: \"kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.061873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data\") pod \"nova-cell0-cell-mapping-pclms\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.098717 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.101408 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.105561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.113496 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.127936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128206 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b8t9n\" (UniqueName: \"kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkff8\" (UniqueName: \"kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8pj\" (UniqueName: \"kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.128429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.131440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.132298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.148073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.229981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgb9\" (UniqueName: \"kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.230258 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.230290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.230351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.230369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.230443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkff8\" (UniqueName: \"kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.332329 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgb9\" (UniqueName: \"kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.332438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.332462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.444740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.444986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.445373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.445409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.445606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8t9n\" (UniqueName: \"kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n\") pod \"nova-api-0\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8pj\" (UniqueName: \"kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj\") pod \"nova-metadata-0\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450521 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkff8\" (UniqueName: \"kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.450688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.452319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgb9\" (UniqueName: \"kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9\") pod \"nova-cell1-novncproxy-0\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.519257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.549613 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.614576 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.635065 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.635566 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6"] Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.641003 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.643997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.644122 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.657870 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6"] Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.742290 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.742397 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.742427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.742481 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d7nkf\" (UniqueName: \"kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.847057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.847117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.847139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.847180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nkf\" (UniqueName: \"kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.853051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.854093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.856468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.864522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nkf\" (UniqueName: \"kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf\") pod \"nova-cell1-conductor-db-sync-s2df6\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 
23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.948188 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-pclms"] Jan 20 23:47:51 crc kubenswrapper[5030]: W0120 23:47:51.954861 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod650fe4d3_bc01_4ea5_b677_039054b091ec.slice/crio-49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917 WatchSource:0}: Error finding container 49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917: Status 404 returned error can't find the container with id 49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917 Jan 20 23:47:51 crc kubenswrapper[5030]: I0120 23:47:51.968505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.129133 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.184864 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.317218 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:47:52 crc kubenswrapper[5030]: W0120 23:47:52.317688 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1926423_f37e_4978_8780_54a70ca3f158.slice/crio-1df0c054f8b25587306a705702c890038b5dd80c30b216f092e355ec8d86b6c3 WatchSource:0}: Error finding container 1df0c054f8b25587306a705702c890038b5dd80c30b216f092e355ec8d86b6c3: Status 404 returned error can't find the container with id 1df0c054f8b25587306a705702c890038b5dd80c30b216f092e355ec8d86b6c3 Jan 20 23:47:52 crc kubenswrapper[5030]: W0120 23:47:52.333968 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84298425_3947_4f9a_a417_a6d225aac8df.slice/crio-f75a922c56b236f751da826506f9cafad5cb020d5e7ff9677c3954f828598e0c WatchSource:0}: Error finding container f75a922c56b236f751da826506f9cafad5cb020d5e7ff9677c3954f828598e0c: Status 404 returned error can't find the container with id f75a922c56b236f751da826506f9cafad5cb020d5e7ff9677c3954f828598e0c Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.336700 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.464445 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6"] Jan 20 23:47:52 crc kubenswrapper[5030]: W0120 23:47:52.481689 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53052896_e072_486c_ba64_dbf034fc2647.slice/crio-0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be WatchSource:0}: Error finding container 0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be: Status 404 returned error can't find the container with id 0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.665456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" 
event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerStarted","Data":"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.665956 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerStarted","Data":"9bc0ea44d49e41628cf882266dd789114b6243ed63a45deaa46f63d8e8cc6043"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.667763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" event={"ID":"650fe4d3-bc01-4ea5-b677-039054b091ec","Type":"ContainerStarted","Data":"de7d98c8e93e6a2cb05b7a6bf897c11d83d20b98e812427b2724dfe9cd548087"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.667793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" event={"ID":"650fe4d3-bc01-4ea5-b677-039054b091ec","Type":"ContainerStarted","Data":"49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.671913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerStarted","Data":"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.672024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerStarted","Data":"7ff882eb3cf270e695eebb8c8073c8dab6e4c2f391cac3a06b0b21f614ca8a44"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.677943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a1926423-f37e-4978-8780-54a70ca3f158","Type":"ContainerStarted","Data":"1df0c054f8b25587306a705702c890038b5dd80c30b216f092e355ec8d86b6c3"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.685193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" event={"ID":"53052896-e072-486c-ba64-dbf034fc2647","Type":"ContainerStarted","Data":"0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be"} Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.685283 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" podStartSLOduration=2.6852723960000002 podStartE2EDuration="2.685272396s" podCreationTimestamp="2026-01-20 23:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:52.680992572 +0000 UTC m=+4345.001252860" watchObservedRunningTime="2026-01-20 23:47:52.685272396 +0000 UTC m=+4345.005532684" Jan 20 23:47:52 crc kubenswrapper[5030]: I0120 23:47:52.688157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"84298425-3947-4f9a-a417-a6d225aac8df","Type":"ContainerStarted","Data":"f75a922c56b236f751da826506f9cafad5cb020d5e7ff9677c3954f828598e0c"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.710387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" 
event={"ID":"a1926423-f37e-4978-8780-54a70ca3f158","Type":"ContainerStarted","Data":"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.712480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" event={"ID":"53052896-e072-486c-ba64-dbf034fc2647","Type":"ContainerStarted","Data":"caabf9e91a5b98c65a9956848d602849f629932b869a201628353e0848aa2e94"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.715151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"84298425-3947-4f9a-a417-a6d225aac8df","Type":"ContainerStarted","Data":"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.717374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerStarted","Data":"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.719460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerStarted","Data":"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47"} Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.735816 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.7357719 podStartE2EDuration="3.7357719s" podCreationTimestamp="2026-01-20 23:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:53.727761106 +0000 UTC m=+4346.048021404" watchObservedRunningTime="2026-01-20 23:47:53.7357719 +0000 UTC m=+4346.056032188" Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.745471 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.745450075 podStartE2EDuration="2.745450075s" podCreationTimestamp="2026-01-20 23:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:53.742937403 +0000 UTC m=+4346.063197701" watchObservedRunningTime="2026-01-20 23:47:53.745450075 +0000 UTC m=+4346.065710363" Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.772254 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.772235994 podStartE2EDuration="3.772235994s" podCreationTimestamp="2026-01-20 23:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:53.767406818 +0000 UTC m=+4346.087667106" watchObservedRunningTime="2026-01-20 23:47:53.772235994 +0000 UTC m=+4346.092496282" Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.792663 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" podStartSLOduration=2.79263975 podStartE2EDuration="2.79263975s" podCreationTimestamp="2026-01-20 23:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-20 23:47:53.787616567 +0000 UTC m=+4346.107876855" watchObservedRunningTime="2026-01-20 23:47:53.79263975 +0000 UTC m=+4346.112900038" Jan 20 23:47:53 crc kubenswrapper[5030]: I0120 23:47:53.819231 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.819197714 podStartE2EDuration="3.819197714s" podCreationTimestamp="2026-01-20 23:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:53.807818298 +0000 UTC m=+4346.128078586" watchObservedRunningTime="2026-01-20 23:47:53.819197714 +0000 UTC m=+4346.139457992" Jan 20 23:47:54 crc kubenswrapper[5030]: I0120 23:47:54.361305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:54 crc kubenswrapper[5030]: I0120 23:47:54.378211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:55 crc kubenswrapper[5030]: I0120 23:47:55.736466 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="84298425-3947-4f9a-a417-a6d225aac8df" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a" gracePeriod=30 Jan 20 23:47:55 crc kubenswrapper[5030]: I0120 23:47:55.736654 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-log" containerID="cri-o://b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" gracePeriod=30 Jan 20 23:47:55 crc kubenswrapper[5030]: I0120 23:47:55.736863 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-metadata" containerID="cri-o://37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" gracePeriod=30 Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.298590 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.426466 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.441950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz8pj\" (UniqueName: \"kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj\") pod \"61f04bba-e278-40c8-b5ef-42778eb24aa4\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.442151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data\") pod \"61f04bba-e278-40c8-b5ef-42778eb24aa4\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.442405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle\") pod \"61f04bba-e278-40c8-b5ef-42778eb24aa4\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.442476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs\") pod \"61f04bba-e278-40c8-b5ef-42778eb24aa4\" (UID: \"61f04bba-e278-40c8-b5ef-42778eb24aa4\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.443559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs" (OuterVolumeSpecName: "logs") pod "61f04bba-e278-40c8-b5ef-42778eb24aa4" (UID: "61f04bba-e278-40c8-b5ef-42778eb24aa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.450329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj" (OuterVolumeSpecName: "kube-api-access-lz8pj") pod "61f04bba-e278-40c8-b5ef-42778eb24aa4" (UID: "61f04bba-e278-40c8-b5ef-42778eb24aa4"). InnerVolumeSpecName "kube-api-access-lz8pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.470328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f04bba-e278-40c8-b5ef-42778eb24aa4" (UID: "61f04bba-e278-40c8-b5ef-42778eb24aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.483771 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data" (OuterVolumeSpecName: "config-data") pod "61f04bba-e278-40c8-b5ef-42778eb24aa4" (UID: "61f04bba-e278-40c8-b5ef-42778eb24aa4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.544691 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle\") pod \"84298425-3947-4f9a-a417-a6d225aac8df\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.544786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data\") pod \"84298425-3947-4f9a-a417-a6d225aac8df\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.544832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkgb9\" (UniqueName: \"kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9\") pod \"84298425-3947-4f9a-a417-a6d225aac8df\" (UID: \"84298425-3947-4f9a-a417-a6d225aac8df\") " Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.545241 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.545260 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f04bba-e278-40c8-b5ef-42778eb24aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.545269 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f04bba-e278-40c8-b5ef-42778eb24aa4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.545278 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz8pj\" (UniqueName: \"kubernetes.io/projected/61f04bba-e278-40c8-b5ef-42778eb24aa4-kube-api-access-lz8pj\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.548598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9" (OuterVolumeSpecName: "kube-api-access-hkgb9") pod "84298425-3947-4f9a-a417-a6d225aac8df" (UID: "84298425-3947-4f9a-a417-a6d225aac8df"). InnerVolumeSpecName "kube-api-access-hkgb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.569748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84298425-3947-4f9a-a417-a6d225aac8df" (UID: "84298425-3947-4f9a-a417-a6d225aac8df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.586432 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data" (OuterVolumeSpecName: "config-data") pod "84298425-3947-4f9a-a417-a6d225aac8df" (UID: "84298425-3947-4f9a-a417-a6d225aac8df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.615737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.648063 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.648120 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84298425-3947-4f9a-a417-a6d225aac8df-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.648137 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkgb9\" (UniqueName: \"kubernetes.io/projected/84298425-3947-4f9a-a417-a6d225aac8df-kube-api-access-hkgb9\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751048 5030 generic.go:334] "Generic (PLEG): container finished" podID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerID="37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" exitCode=0 Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751118 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerDied","Data":"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751136 5030 generic.go:334] "Generic (PLEG): container finished" podID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerID="b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" exitCode=143 Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerDied","Data":"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751179 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"61f04bba-e278-40c8-b5ef-42778eb24aa4","Type":"ContainerDied","Data":"7ff882eb3cf270e695eebb8c8073c8dab6e4c2f391cac3a06b0b21f614ca8a44"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751196 5030 scope.go:117] "RemoveContainer" containerID="37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.751096 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.772305 5030 generic.go:334] "Generic (PLEG): container finished" podID="53052896-e072-486c-ba64-dbf034fc2647" containerID="caabf9e91a5b98c65a9956848d602849f629932b869a201628353e0848aa2e94" exitCode=0 Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.772378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" event={"ID":"53052896-e072-486c-ba64-dbf034fc2647","Type":"ContainerDied","Data":"caabf9e91a5b98c65a9956848d602849f629932b869a201628353e0848aa2e94"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.776356 5030 generic.go:334] "Generic (PLEG): container finished" podID="84298425-3947-4f9a-a417-a6d225aac8df" containerID="170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a" exitCode=0 Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.776394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"84298425-3947-4f9a-a417-a6d225aac8df","Type":"ContainerDied","Data":"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.776411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"84298425-3947-4f9a-a417-a6d225aac8df","Type":"ContainerDied","Data":"f75a922c56b236f751da826506f9cafad5cb020d5e7ff9677c3954f828598e0c"} Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.776449 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.820752 5030 scope.go:117] "RemoveContainer" containerID="b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.834468 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.852699 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.854750 5030 scope.go:117] "RemoveContainer" containerID="37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.855376 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47\": container with ID starting with 37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47 not found: ID does not exist" containerID="37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.855460 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47"} err="failed to get container status \"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47\": rpc error: code = NotFound desc = could not find container \"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47\": container with ID starting with 37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47 not found: ID does not exist" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 
23:47:56.855494 5030 scope.go:117] "RemoveContainer" containerID="b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.855895 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b\": container with ID starting with b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b not found: ID does not exist" containerID="b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.855936 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b"} err="failed to get container status \"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b\": rpc error: code = NotFound desc = could not find container \"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b\": container with ID starting with b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b not found: ID does not exist" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.855961 5030 scope.go:117] "RemoveContainer" containerID="37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.856320 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47"} err="failed to get container status \"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47\": rpc error: code = NotFound desc = could not find container \"37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47\": container with ID starting with 37e73967fead06d6e8afc44abd80fa392e8f6ea0b9225b87ce1579610558cf47 not found: ID does not exist" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.856354 5030 scope.go:117] "RemoveContainer" containerID="b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.856722 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b"} err="failed to get container status \"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b\": rpc error: code = NotFound desc = could not find container \"b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b\": container with ID starting with b51243998587fdc8c1d38386bacda0b9050e4016a1da83733bb400fe1e7f361b not found: ID does not exist" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.856750 5030 scope.go:117] "RemoveContainer" containerID="170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.863816 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.878210 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.878805 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-metadata" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.878830 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-metadata" Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.878866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-log" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.878876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-log" Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.878896 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84298425-3947-4f9a-a417-a6d225aac8df" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.878905 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84298425-3947-4f9a-a417-a6d225aac8df" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.879167 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84298425-3947-4f9a-a417-a6d225aac8df" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.879207 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-log" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.879227 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" containerName="nova-metadata-metadata" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.880001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.883163 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.883500 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.883826 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.891673 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.901315 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.909814 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.911858 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.919792 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.940111 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.940397 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.954293 5030 scope.go:117] "RemoveContainer" containerID="170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a" Jan 20 23:47:56 crc kubenswrapper[5030]: E0120 23:47:56.955134 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a\": container with ID starting with 170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a not found: ID does not exist" containerID="170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.955200 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a"} err="failed to get container status \"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a\": rpc error: code = NotFound desc = could not find container \"170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a\": container with ID starting with 170b6359195b41c969b45837097d412ed5f4dd373ab322d6aec30ac3e906b10a not found: ID does not exist" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.955889 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.955997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.956032 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc kubenswrapper[5030]: I0120 23:47:56.956103 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzd4c\" (UniqueName: \"kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:56 crc 
kubenswrapper[5030]: I0120 23:47:56.956172 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.057886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzd4c\" (UniqueName: \"kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058361 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058593 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 
23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.058685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gt4r\" (UniqueName: \"kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.065601 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.065613 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.065704 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.065755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.075754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzd4c\" (UniqueName: \"kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c\") pod \"nova-cell1-novncproxy-0\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.160811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gt4r\" (UniqueName: \"kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.160941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 
23:47:57.161002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.161043 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.161120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.161774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.165935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.166176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.167431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.186503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gt4r\" (UniqueName: \"kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r\") pod \"nova-metadata-0\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.295248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.304466 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.788697 5030 generic.go:334] "Generic (PLEG): container finished" podID="650fe4d3-bc01-4ea5-b677-039054b091ec" containerID="de7d98c8e93e6a2cb05b7a6bf897c11d83d20b98e812427b2724dfe9cd548087" exitCode=0 Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.788803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" event={"ID":"650fe4d3-bc01-4ea5-b677-039054b091ec","Type":"ContainerDied","Data":"de7d98c8e93e6a2cb05b7a6bf897c11d83d20b98e812427b2724dfe9cd548087"} Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.812153 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.869043 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:47:57 crc kubenswrapper[5030]: W0120 23:47:57.879148 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd512e5a_a455_44e5_a6b7_6d7c65838c97.slice/crio-23b993b3debf0bb0888c8e258e53b7f91ab1a73e71f67be120f04a69be043287 WatchSource:0}: Error finding container 23b993b3debf0bb0888c8e258e53b7f91ab1a73e71f67be120f04a69be043287: Status 404 returned error can't find the container with id 23b993b3debf0bb0888c8e258e53b7f91ab1a73e71f67be120f04a69be043287 Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.975043 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f04bba-e278-40c8-b5ef-42778eb24aa4" path="/var/lib/kubelet/pods/61f04bba-e278-40c8-b5ef-42778eb24aa4/volumes" Jan 20 23:47:57 crc kubenswrapper[5030]: I0120 23:47:57.979660 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84298425-3947-4f9a-a417-a6d225aac8df" path="/var/lib/kubelet/pods/84298425-3947-4f9a-a417-a6d225aac8df/volumes" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.162086 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.280434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts\") pod \"53052896-e072-486c-ba64-dbf034fc2647\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.280543 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle\") pod \"53052896-e072-486c-ba64-dbf034fc2647\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.280663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7nkf\" (UniqueName: \"kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf\") pod \"53052896-e072-486c-ba64-dbf034fc2647\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.280756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data\") pod \"53052896-e072-486c-ba64-dbf034fc2647\" (UID: \"53052896-e072-486c-ba64-dbf034fc2647\") " Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.284400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts" (OuterVolumeSpecName: "scripts") pod "53052896-e072-486c-ba64-dbf034fc2647" (UID: "53052896-e072-486c-ba64-dbf034fc2647"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.284484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf" (OuterVolumeSpecName: "kube-api-access-d7nkf") pod "53052896-e072-486c-ba64-dbf034fc2647" (UID: "53052896-e072-486c-ba64-dbf034fc2647"). InnerVolumeSpecName "kube-api-access-d7nkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.307489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data" (OuterVolumeSpecName: "config-data") pod "53052896-e072-486c-ba64-dbf034fc2647" (UID: "53052896-e072-486c-ba64-dbf034fc2647"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.312674 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53052896-e072-486c-ba64-dbf034fc2647" (UID: "53052896-e072-486c-ba64-dbf034fc2647"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.383326 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.383364 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.383379 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7nkf\" (UniqueName: \"kubernetes.io/projected/53052896-e072-486c-ba64-dbf034fc2647-kube-api-access-d7nkf\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.383389 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53052896-e072-486c-ba64-dbf034fc2647-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.806875 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerStarted","Data":"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.806971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerStarted","Data":"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.806999 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerStarted","Data":"23b993b3debf0bb0888c8e258e53b7f91ab1a73e71f67be120f04a69be043287"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.810006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"82558fc9-417f-49d3-b522-207e0c215f92","Type":"ContainerStarted","Data":"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.810068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"82558fc9-417f-49d3-b522-207e0c215f92","Type":"ContainerStarted","Data":"4ab3a9efafd0a567fbba8b8497336dc9e1bb8011c28ccaf0e83acbdd2110b1b7"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.816277 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.817603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6" event={"ID":"53052896-e072-486c-ba64-dbf034fc2647","Type":"ContainerDied","Data":"0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be"} Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.817797 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6187c468ba0d469c9bb2ab2e7c42f17ceac851ab2129a794444a34fc2463be" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.860166 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.86014045 podStartE2EDuration="2.86014045s" podCreationTimestamp="2026-01-20 23:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:58.854298498 +0000 UTC m=+4351.174558796" watchObservedRunningTime="2026-01-20 23:47:58.86014045 +0000 UTC m=+4351.180400748" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.906753 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.906730141 podStartE2EDuration="2.906730141s" podCreationTimestamp="2026-01-20 23:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:47:58.896225525 +0000 UTC m=+4351.216485863" watchObservedRunningTime="2026-01-20 23:47:58.906730141 +0000 UTC m=+4351.226990429" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.928558 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:47:58 crc kubenswrapper[5030]: E0120 23:47:58.929238 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53052896-e072-486c-ba64-dbf034fc2647" containerName="nova-cell1-conductor-db-sync" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.929266 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53052896-e072-486c-ba64-dbf034fc2647" containerName="nova-cell1-conductor-db-sync" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.929591 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53052896-e072-486c-ba64-dbf034fc2647" containerName="nova-cell1-conductor-db-sync" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.930737 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.936528 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.954035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.999372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.999737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:58 crc kubenswrapper[5030]: I0120 23:47:58.999794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sp7s\" (UniqueName: \"kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.101042 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.102373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.102416 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sp7s\" (UniqueName: \"kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.107830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.111585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " 
pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.127108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sp7s\" (UniqueName: \"kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s\") pod \"nova-cell1-conductor-0\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.231499 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.267780 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.304787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts\") pod \"650fe4d3-bc01-4ea5-b677-039054b091ec\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.304885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle\") pod \"650fe4d3-bc01-4ea5-b677-039054b091ec\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.304961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data\") pod \"650fe4d3-bc01-4ea5-b677-039054b091ec\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.305017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwscs\" (UniqueName: \"kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs\") pod \"650fe4d3-bc01-4ea5-b677-039054b091ec\" (UID: \"650fe4d3-bc01-4ea5-b677-039054b091ec\") " Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.309602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs" (OuterVolumeSpecName: "kube-api-access-lwscs") pod "650fe4d3-bc01-4ea5-b677-039054b091ec" (UID: "650fe4d3-bc01-4ea5-b677-039054b091ec"). InnerVolumeSpecName "kube-api-access-lwscs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.311802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts" (OuterVolumeSpecName: "scripts") pod "650fe4d3-bc01-4ea5-b677-039054b091ec" (UID: "650fe4d3-bc01-4ea5-b677-039054b091ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.331847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data" (OuterVolumeSpecName: "config-data") pod "650fe4d3-bc01-4ea5-b677-039054b091ec" (UID: "650fe4d3-bc01-4ea5-b677-039054b091ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.341485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "650fe4d3-bc01-4ea5-b677-039054b091ec" (UID: "650fe4d3-bc01-4ea5-b677-039054b091ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.407359 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwscs\" (UniqueName: \"kubernetes.io/projected/650fe4d3-bc01-4ea5-b677-039054b091ec-kube-api-access-lwscs\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.407608 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.407631 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.407641 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/650fe4d3-bc01-4ea5-b677-039054b091ec-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.722193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.850937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" event={"ID":"650fe4d3-bc01-4ea5-b677-039054b091ec","Type":"ContainerDied","Data":"49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917"} Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.850980 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49a27249a3f2e141907070e023d9ee0f994826f60f030620c20f2a24339c5917" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.851066 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-pclms" Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.853611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"6bd1362d-6fee-4d39-ad27-ba95334caeef","Type":"ContainerStarted","Data":"a86c9a242c160cbc18143b5142d877b4a0d2ca8a2759349d15e5bfe1bd7d3b60"} Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.984172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.984843 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-log" containerID="cri-o://fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" gracePeriod=30 Jan 20 23:47:59 crc kubenswrapper[5030]: I0120 23:47:59.985072 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-api" containerID="cri-o://58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" gracePeriod=30 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.018645 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.073063 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.073285 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="a1926423-f37e-4978-8780-54a70ca3f158" containerName="nova-scheduler-scheduler" containerID="cri-o://6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e" gracePeriod=30 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.555447 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs\") pod \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627219 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle\") pod \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8t9n\" (UniqueName: \"kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n\") pod \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data\") pod \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\" (UID: \"52c8e4e5-f4c0-4ede-9877-29c66827fd83\") " Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs" (OuterVolumeSpecName: "logs") pod "52c8e4e5-f4c0-4ede-9877-29c66827fd83" (UID: "52c8e4e5-f4c0-4ede-9877-29c66827fd83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.627668 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8e4e5-f4c0-4ede-9877-29c66827fd83-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.862474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"6bd1362d-6fee-4d39-ad27-ba95334caeef","Type":"ContainerStarted","Data":"1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac"} Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.863523 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866325 5030 generic.go:334] "Generic (PLEG): container finished" podID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerID="58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" exitCode=0 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866346 5030 generic.go:334] "Generic (PLEG): container finished" podID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerID="fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" exitCode=143 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerDied","Data":"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6"} Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866483 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-log" containerID="cri-o://48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" gracePeriod=30 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866500 5030 scope.go:117] "RemoveContainer" containerID="58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866696 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-metadata" containerID="cri-o://3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" gracePeriod=30 Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerDied","Data":"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7"} Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.866753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"52c8e4e5-f4c0-4ede-9877-29c66827fd83","Type":"ContainerDied","Data":"9bc0ea44d49e41628cf882266dd789114b6243ed63a45deaa46f63d8e8cc6043"} Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.899701 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.899683777 podStartE2EDuration="2.899683777s" podCreationTimestamp="2026-01-20 23:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:00.88704882 +0000 UTC m=+4353.207309108" watchObservedRunningTime="2026-01-20 23:48:00.899683777 +0000 UTC m=+4353.219944065" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.946222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n" (OuterVolumeSpecName: "kube-api-access-b8t9n") pod "52c8e4e5-f4c0-4ede-9877-29c66827fd83" (UID: "52c8e4e5-f4c0-4ede-9877-29c66827fd83"). InnerVolumeSpecName "kube-api-access-b8t9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:00 crc kubenswrapper[5030]: I0120 23:48:00.977930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data" (OuterVolumeSpecName: "config-data") pod "52c8e4e5-f4c0-4ede-9877-29c66827fd83" (UID: "52c8e4e5-f4c0-4ede-9877-29c66827fd83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.000263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52c8e4e5-f4c0-4ede-9877-29c66827fd83" (UID: "52c8e4e5-f4c0-4ede-9877-29c66827fd83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.034258 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.034293 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8t9n\" (UniqueName: \"kubernetes.io/projected/52c8e4e5-f4c0-4ede-9877-29c66827fd83-kube-api-access-b8t9n\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.034304 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8e4e5-f4c0-4ede-9877-29c66827fd83-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.140665 5030 scope.go:117] "RemoveContainer" containerID="fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.162211 5030 scope.go:117] "RemoveContainer" containerID="58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" Jan 20 23:48:01 crc kubenswrapper[5030]: E0120 23:48:01.163238 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6\": container with ID starting with 58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6 not found: ID does not exist" containerID="58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163283 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6"} err="failed to get container status \"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6\": rpc error: code = NotFound desc = could not find container \"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6\": container with ID starting with 58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6 not found: ID does not exist" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163311 5030 scope.go:117] "RemoveContainer" containerID="fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" Jan 20 23:48:01 crc kubenswrapper[5030]: E0120 23:48:01.163709 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7\": container with ID starting with fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7 not found: ID does not exist" containerID="fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163731 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7"} err="failed to get container status \"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7\": rpc error: code = NotFound desc = could not find container \"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7\": container with ID starting with fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7 not found: ID does not exist" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163746 5030 scope.go:117] "RemoveContainer" containerID="58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163923 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6"} err="failed to get container status \"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6\": rpc error: code = NotFound desc = could not find container \"58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6\": container with ID starting with 58464727e2edd3b6cc4d292ca21af1036f7fea211405912c08a6a43cf32ca5c6 not found: ID does not exist" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.163942 5030 scope.go:117] "RemoveContainer" containerID="fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.164239 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7"} err="failed to get container status \"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7\": rpc error: code = NotFound desc = could not find container \"fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7\": container with ID starting with fa113d2c06aff43e56efb4f6f373e8599735b7e2a446a45c5cbf2b8ba882a4d7 not found: ID does not exist" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.216431 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.224486 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.233501 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:01 crc kubenswrapper[5030]: E0120 23:48:01.233875 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-api" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.233895 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-api" Jan 20 23:48:01 crc kubenswrapper[5030]: E0120 23:48:01.233931 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650fe4d3-bc01-4ea5-b677-039054b091ec" containerName="nova-manage" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.233940 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="650fe4d3-bc01-4ea5-b677-039054b091ec" containerName="nova-manage" Jan 20 23:48:01 crc kubenswrapper[5030]: E0120 23:48:01.233962 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-log" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.233968 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-log" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.234199 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-log" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.234234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="650fe4d3-bc01-4ea5-b677-039054b091ec" containerName="nova-manage" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.234253 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" containerName="nova-api-api" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.235670 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.242328 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.245601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.338749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.338828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.338936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.338967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vklj\" (UniqueName: \"kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.441086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.441143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vklj\" (UniqueName: \"kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.441664 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.441729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.447042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.448213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.448922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.465023 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.466453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vklj\" (UniqueName: \"kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj\") pod \"nova-api-0\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.543355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gt4r\" (UniqueName: \"kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r\") pod \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.543495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data\") pod \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.543548 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle\") pod \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.543665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs\") pod \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.543810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs\") pod \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\" (UID: \"cd512e5a-a455-44e5-a6b7-6d7c65838c97\") " Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.544994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs" (OuterVolumeSpecName: "logs") pod "cd512e5a-a455-44e5-a6b7-6d7c65838c97" (UID: "cd512e5a-a455-44e5-a6b7-6d7c65838c97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.547328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r" (OuterVolumeSpecName: "kube-api-access-8gt4r") pod "cd512e5a-a455-44e5-a6b7-6d7c65838c97" (UID: "cd512e5a-a455-44e5-a6b7-6d7c65838c97"). InnerVolumeSpecName "kube-api-access-8gt4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.560863 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.570652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd512e5a-a455-44e5-a6b7-6d7c65838c97" (UID: "cd512e5a-a455-44e5-a6b7-6d7c65838c97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.575107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data" (OuterVolumeSpecName: "config-data") pod "cd512e5a-a455-44e5-a6b7-6d7c65838c97" (UID: "cd512e5a-a455-44e5-a6b7-6d7c65838c97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.604757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cd512e5a-a455-44e5-a6b7-6d7c65838c97" (UID: "cd512e5a-a455-44e5-a6b7-6d7c65838c97"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.649921 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gt4r\" (UniqueName: \"kubernetes.io/projected/cd512e5a-a455-44e5-a6b7-6d7c65838c97-kube-api-access-8gt4r\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.649979 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.650001 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.650021 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd512e5a-a455-44e5-a6b7-6d7c65838c97-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.650041 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd512e5a-a455-44e5-a6b7-6d7c65838c97-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.825159 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:01 crc kubenswrapper[5030]: W0120 23:48:01.826856 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bcf67b9_c2e6_4e94_bc61_c3905d269c38.slice/crio-24a4723278f67f48f84bb355e47545444174dd2d64eda1b7bf99b58ca478b3be WatchSource:0}: Error finding container 24a4723278f67f48f84bb355e47545444174dd2d64eda1b7bf99b58ca478b3be: Status 404 returned error can't find the container with id 24a4723278f67f48f84bb355e47545444174dd2d64eda1b7bf99b58ca478b3be Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881122 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerID="3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" exitCode=0 Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881174 5030 generic.go:334] "Generic (PLEG): container finished" podID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerID="48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" exitCode=143 Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerDied","Data":"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5"} Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerDied","Data":"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6"} Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"cd512e5a-a455-44e5-a6b7-6d7c65838c97","Type":"ContainerDied","Data":"23b993b3debf0bb0888c8e258e53b7f91ab1a73e71f67be120f04a69be043287"} Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881308 5030 scope.go:117] "RemoveContainer" containerID="3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.881448 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.887437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerStarted","Data":"24a4723278f67f48f84bb355e47545444174dd2d64eda1b7bf99b58ca478b3be"} Jan 20 23:48:01 crc kubenswrapper[5030]: I0120 23:48:01.988673 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c8e4e5-f4c0-4ede-9877-29c66827fd83" path="/var/lib/kubelet/pods/52c8e4e5-f4c0-4ede-9877-29c66827fd83/volumes" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.002429 5030 scope.go:117] "RemoveContainer" containerID="48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.018418 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.036805 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.041780 5030 scope.go:117] "RemoveContainer" containerID="3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.048419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:02 crc kubenswrapper[5030]: E0120 23:48:02.049124 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-metadata" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.049159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-metadata" Jan 20 23:48:02 crc 
kubenswrapper[5030]: E0120 23:48:02.049204 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-log" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.049218 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-log" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.049547 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-log" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.049603 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" containerName="nova-metadata-metadata" Jan 20 23:48:02 crc kubenswrapper[5030]: E0120 23:48:02.052207 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5\": container with ID starting with 3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5 not found: ID does not exist" containerID="3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.052257 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5"} err="failed to get container status \"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5\": rpc error: code = NotFound desc = could not find container \"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5\": container with ID starting with 3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5 not found: ID does not exist" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.052288 5030 scope.go:117] "RemoveContainer" containerID="48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" Jan 20 23:48:02 crc kubenswrapper[5030]: E0120 23:48:02.057051 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6\": container with ID starting with 48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6 not found: ID does not exist" containerID="48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.057112 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6"} err="failed to get container status \"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6\": rpc error: code = NotFound desc = could not find container \"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6\": container with ID starting with 48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6 not found: ID does not exist" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.057142 5030 scope.go:117] "RemoveContainer" containerID="3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.057775 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.060998 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5"} err="failed to get container status \"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5\": rpc error: code = NotFound desc = could not find container \"3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5\": container with ID starting with 3b9f4dcf079311cdc2fbf671ae1473891155a868f353ebfd9c0bfe1f175472a5 not found: ID does not exist" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.061041 5030 scope.go:117] "RemoveContainer" containerID="48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.061313 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.061710 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.063486 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.071960 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6"} err="failed to get container status \"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6\": rpc error: code = NotFound desc = could not find container \"48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6\": container with ID starting with 48d53358488cd250e0e0f86049c1e72e079906c83d6f9271fd0e8158425edec6 not found: ID does not exist" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.158560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.158797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.158884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5l8p\" (UniqueName: \"kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.159050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.159396 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.260641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.260727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.260806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.260828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.260855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5l8p\" (UniqueName: \"kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.263999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.268443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.273687 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.274411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.281843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5l8p\" (UniqueName: \"kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p\") pod \"nova-metadata-0\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.300340 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.386418 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.906505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerStarted","Data":"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e"} Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.906929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerStarted","Data":"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f"} Jan 20 23:48:02 crc kubenswrapper[5030]: I0120 23:48:02.938921 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=1.938896815 podStartE2EDuration="1.938896815s" podCreationTimestamp="2026-01-20 23:48:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:02.936038445 +0000 UTC m=+4355.256298773" watchObservedRunningTime="2026-01-20 23:48:02.938896815 +0000 UTC m=+4355.259157123" Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.295796 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.922552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerStarted","Data":"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440"} Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.923165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerStarted","Data":"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba"} Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.923204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerStarted","Data":"f22f9114e3389416762882129ddd3911a9ce54927cd96f8cb5fd9ecdb8f61662"} Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.955387 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=1.955370423 podStartE2EDuration="1.955370423s" podCreationTimestamp="2026-01-20 23:48:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:03.954873101 +0000 UTC m=+4356.275133399" watchObservedRunningTime="2026-01-20 23:48:03.955370423 +0000 UTC m=+4356.275630721" Jan 20 23:48:03 crc kubenswrapper[5030]: I0120 23:48:03.982027 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd512e5a-a455-44e5-a6b7-6d7c65838c97" path="/var/lib/kubelet/pods/cd512e5a-a455-44e5-a6b7-6d7c65838c97/volumes" Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.925857 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.933705 5030 generic.go:334] "Generic (PLEG): container finished" podID="a1926423-f37e-4978-8780-54a70ca3f158" containerID="6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e" exitCode=0 Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.933782 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.933831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a1926423-f37e-4978-8780-54a70ca3f158","Type":"ContainerDied","Data":"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e"} Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.933876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"a1926423-f37e-4978-8780-54a70ca3f158","Type":"ContainerDied","Data":"1df0c054f8b25587306a705702c890038b5dd80c30b216f092e355ec8d86b6c3"} Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.933899 5030 scope.go:117] "RemoveContainer" containerID="6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e" Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.962273 5030 scope.go:117] "RemoveContainer" containerID="6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e" Jan 20 23:48:04 crc kubenswrapper[5030]: E0120 23:48:04.962727 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e\": container with ID starting with 6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e not found: ID does not exist" containerID="6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e" Jan 20 23:48:04 crc kubenswrapper[5030]: I0120 23:48:04.962752 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e"} err="failed to get container status \"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e\": rpc error: code = NotFound desc = could not find container \"6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e\": container with ID starting with 6a0c76435ce21c72d6fe5a88287e9eca0162f69ee09939bcac8ca83634d2a99e not found: ID does not exist" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.020730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data\") pod \"a1926423-f37e-4978-8780-54a70ca3f158\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " Jan 20 
23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.020841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkff8\" (UniqueName: \"kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8\") pod \"a1926423-f37e-4978-8780-54a70ca3f158\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.020888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle\") pod \"a1926423-f37e-4978-8780-54a70ca3f158\" (UID: \"a1926423-f37e-4978-8780-54a70ca3f158\") " Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.026229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8" (OuterVolumeSpecName: "kube-api-access-mkff8") pod "a1926423-f37e-4978-8780-54a70ca3f158" (UID: "a1926423-f37e-4978-8780-54a70ca3f158"). InnerVolumeSpecName "kube-api-access-mkff8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.047379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data" (OuterVolumeSpecName: "config-data") pod "a1926423-f37e-4978-8780-54a70ca3f158" (UID: "a1926423-f37e-4978-8780-54a70ca3f158"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.068536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1926423-f37e-4978-8780-54a70ca3f158" (UID: "a1926423-f37e-4978-8780-54a70ca3f158"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.123320 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.123371 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkff8\" (UniqueName: \"kubernetes.io/projected/a1926423-f37e-4978-8780-54a70ca3f158-kube-api-access-mkff8\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.123393 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1926423-f37e-4978-8780-54a70ca3f158-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.284588 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.306892 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.321848 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:05 crc kubenswrapper[5030]: E0120 23:48:05.335019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1926423-f37e-4978-8780-54a70ca3f158" containerName="nova-scheduler-scheduler" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.335078 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1926423-f37e-4978-8780-54a70ca3f158" containerName="nova-scheduler-scheduler" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.335962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1926423-f37e-4978-8780-54a70ca3f158" containerName="nova-scheduler-scheduler" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.337563 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.341477 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.363220 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.436989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.437138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.437207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-929tf\" (UniqueName: \"kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.539613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.539994 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-929tf\" (UniqueName: \"kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.540172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.546564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.546612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.559915 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-929tf\" (UniqueName: \"kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf\") pod \"nova-scheduler-0\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.664158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:05 crc kubenswrapper[5030]: I0120 23:48:05.975956 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1926423-f37e-4978-8780-54a70ca3f158" path="/var/lib/kubelet/pods/a1926423-f37e-4978-8780-54a70ca3f158/volumes" Jan 20 23:48:06 crc kubenswrapper[5030]: I0120 23:48:06.228772 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:06 crc kubenswrapper[5030]: I0120 23:48:06.975124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fe4f4a14-2d5b-4a4c-b76f-1675f3429044","Type":"ContainerStarted","Data":"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363"} Jan 20 23:48:06 crc kubenswrapper[5030]: I0120 23:48:06.975581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fe4f4a14-2d5b-4a4c-b76f-1675f3429044","Type":"ContainerStarted","Data":"32b04fcedd7c8d372b7009c1f3757a5bf75ffc3856ccc0cbaf9f96feb134c1db"} Jan 20 23:48:07 crc kubenswrapper[5030]: I0120 23:48:07.016395 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.016370629 podStartE2EDuration="2.016370629s" podCreationTimestamp="2026-01-20 23:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:07.000915094 +0000 UTC m=+4359.321175382" watchObservedRunningTime="2026-01-20 23:48:07.016370629 +0000 UTC m=+4359.336630957" Jan 20 23:48:07 crc kubenswrapper[5030]: I0120 23:48:07.296284 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:48:07 crc kubenswrapper[5030]: I0120 23:48:07.317418 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:48:07 crc kubenswrapper[5030]: I0120 23:48:07.386870 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:07 crc kubenswrapper[5030]: I0120 23:48:07.386925 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:08 crc kubenswrapper[5030]: I0120 23:48:08.018015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.298835 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.886609 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t"] Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.887976 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.891281 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.892249 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:48:09 crc kubenswrapper[5030]: I0120 23:48:09.897859 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t"] Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.034979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.036055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.036315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmz5\" (UniqueName: \"kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.037110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.139104 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.139192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.139249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 
23:48:10.139278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knmz5\" (UniqueName: \"kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.146773 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.147843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.167764 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.167868 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.167961 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.169261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.169342 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.169867 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" gracePeriod=600 Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.175793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knmz5\" (UniqueName: 
\"kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5\") pod \"nova-cell1-cell-mapping-6z58t\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.218912 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:10 crc kubenswrapper[5030]: E0120 23:48:10.303350 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.664313 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:10 crc kubenswrapper[5030]: W0120 23:48:10.733418 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c23895_db55_439b_97b3_b53d50ba92fc.slice/crio-159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6 WatchSource:0}: Error finding container 159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6: Status 404 returned error can't find the container with id 159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6 Jan 20 23:48:10 crc kubenswrapper[5030]: I0120 23:48:10.735622 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t"] Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.026441 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" exitCode=0 Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.026513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121"} Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.026830 5030 scope.go:117] "RemoveContainer" containerID="3f648d6436623b73b4f4d7a187af028d54b54d93907fd7af94f7f92831044719" Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.027553 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:48:11 crc kubenswrapper[5030]: E0120 23:48:11.027863 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.029729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" 
event={"ID":"54c23895-db55-439b-97b3-b53d50ba92fc","Type":"ContainerStarted","Data":"fcc1c5fccc13caf607c55b1c6f3a6b1f7b218f35a0ce717dc443ff2c2713f71d"} Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.029765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" event={"ID":"54c23895-db55-439b-97b3-b53d50ba92fc","Type":"ContainerStarted","Data":"159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6"} Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.077908 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" podStartSLOduration=2.077876955 podStartE2EDuration="2.077876955s" podCreationTimestamp="2026-01-20 23:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:11.062241265 +0000 UTC m=+4363.382501563" watchObservedRunningTime="2026-01-20 23:48:11.077876955 +0000 UTC m=+4363.398137253" Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.562072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:11 crc kubenswrapper[5030]: I0120 23:48:11.562112 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:12 crc kubenswrapper[5030]: I0120 23:48:12.389225 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:12 crc kubenswrapper[5030]: I0120 23:48:12.389601 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:12 crc kubenswrapper[5030]: I0120 23:48:12.644943 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.152:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:12 crc kubenswrapper[5030]: I0120 23:48:12.645339 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.152:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:13 crc kubenswrapper[5030]: I0120 23:48:13.402889 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.153:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:13 crc kubenswrapper[5030]: I0120 23:48:13.402926 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.153:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:13 crc kubenswrapper[5030]: I0120 23:48:13.984744 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:15 crc kubenswrapper[5030]: I0120 23:48:15.664894 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:15 crc kubenswrapper[5030]: I0120 23:48:15.695471 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:16 crc kubenswrapper[5030]: I0120 23:48:16.096129 5030 generic.go:334] "Generic (PLEG): container finished" podID="54c23895-db55-439b-97b3-b53d50ba92fc" containerID="fcc1c5fccc13caf607c55b1c6f3a6b1f7b218f35a0ce717dc443ff2c2713f71d" exitCode=0 Jan 20 23:48:16 crc kubenswrapper[5030]: I0120 23:48:16.096228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" event={"ID":"54c23895-db55-439b-97b3-b53d50ba92fc","Type":"ContainerDied","Data":"fcc1c5fccc13caf607c55b1c6f3a6b1f7b218f35a0ce717dc443ff2c2713f71d"} Jan 20 23:48:16 crc kubenswrapper[5030]: I0120 23:48:16.132166 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.438015 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.592237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data\") pod \"54c23895-db55-439b-97b3-b53d50ba92fc\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.592315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle\") pod \"54c23895-db55-439b-97b3-b53d50ba92fc\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.592410 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knmz5\" (UniqueName: \"kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5\") pod \"54c23895-db55-439b-97b3-b53d50ba92fc\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.592472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts\") pod \"54c23895-db55-439b-97b3-b53d50ba92fc\" (UID: \"54c23895-db55-439b-97b3-b53d50ba92fc\") " Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.598476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5" (OuterVolumeSpecName: "kube-api-access-knmz5") pod "54c23895-db55-439b-97b3-b53d50ba92fc" (UID: "54c23895-db55-439b-97b3-b53d50ba92fc"). InnerVolumeSpecName "kube-api-access-knmz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.600408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts" (OuterVolumeSpecName: "scripts") pod "54c23895-db55-439b-97b3-b53d50ba92fc" (UID: "54c23895-db55-439b-97b3-b53d50ba92fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.626712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data" (OuterVolumeSpecName: "config-data") pod "54c23895-db55-439b-97b3-b53d50ba92fc" (UID: "54c23895-db55-439b-97b3-b53d50ba92fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.627785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54c23895-db55-439b-97b3-b53d50ba92fc" (UID: "54c23895-db55-439b-97b3-b53d50ba92fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.646021 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.646205 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="f8ca9328-fae7-493b-936e-4019e3931081" containerName="kube-state-metrics" containerID="cri-o://3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a" gracePeriod=30 Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.694141 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.694176 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.694190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knmz5\" (UniqueName: \"kubernetes.io/projected/54c23895-db55-439b-97b3-b53d50ba92fc-kube-api-access-knmz5\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:17 crc kubenswrapper[5030]: I0120 23:48:17.694202 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c23895-db55-439b-97b3-b53d50ba92fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.058258 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.118243 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.118292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t" event={"ID":"54c23895-db55-439b-97b3-b53d50ba92fc","Type":"ContainerDied","Data":"159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6"} Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.118348 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159febec5116fdfa0e5e377a42279b81bf7ccd39d6c1dba2e486b1dd5fc767f6" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.125233 5030 generic.go:334] "Generic (PLEG): container finished" podID="f8ca9328-fae7-493b-936e-4019e3931081" containerID="3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a" exitCode=2 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.125280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f8ca9328-fae7-493b-936e-4019e3931081","Type":"ContainerDied","Data":"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a"} Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.125336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f8ca9328-fae7-493b-936e-4019e3931081","Type":"ContainerDied","Data":"3b6cd25662a3f8e406e208b7eb56e2bd85c8ae01f853fdd1ab9bed524b1d94bd"} Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.125369 5030 scope.go:117] "RemoveContainer" containerID="3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.125368 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.154055 5030 scope.go:117] "RemoveContainer" containerID="3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a" Jan 20 23:48:18 crc kubenswrapper[5030]: E0120 23:48:18.154504 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a\": container with ID starting with 3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a not found: ID does not exist" containerID="3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.154533 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a"} err="failed to get container status \"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a\": rpc error: code = NotFound desc = could not find container \"3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a\": container with ID starting with 3e7da6e1ecec7326a424ae9d92c5cbef29e1ab2ac4a02206dd58aca870673b8a not found: ID does not exist" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.206334 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjln\" (UniqueName: \"kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln\") pod \"f8ca9328-fae7-493b-936e-4019e3931081\" (UID: \"f8ca9328-fae7-493b-936e-4019e3931081\") " Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.210558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln" (OuterVolumeSpecName: "kube-api-access-cbjln") pod "f8ca9328-fae7-493b-936e-4019e3931081" (UID: "f8ca9328-fae7-493b-936e-4019e3931081"). InnerVolumeSpecName "kube-api-access-cbjln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.299363 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.299690 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-log" containerID="cri-o://ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f" gracePeriod=30 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.299746 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-api" containerID="cri-o://1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e" gracePeriod=30 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.308253 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjln\" (UniqueName: \"kubernetes.io/projected/f8ca9328-fae7-493b-936e-4019e3931081-kube-api-access-cbjln\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.313811 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.314031 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerName="nova-scheduler-scheduler" containerID="cri-o://63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" gracePeriod=30 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.323318 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.323548 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-log" containerID="cri-o://b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba" gracePeriod=30 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.323602 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-metadata" containerID="cri-o://b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440" gracePeriod=30 Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.494136 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.501424 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.511085 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: E0120 23:48:18.511450 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ca9328-fae7-493b-936e-4019e3931081" containerName="kube-state-metrics" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.511468 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ca9328-fae7-493b-936e-4019e3931081" containerName="kube-state-metrics" Jan 20 
23:48:18 crc kubenswrapper[5030]: E0120 23:48:18.511486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c23895-db55-439b-97b3-b53d50ba92fc" containerName="nova-manage" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.511493 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c23895-db55-439b-97b3-b53d50ba92fc" containerName="nova-manage" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.511658 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c23895-db55-439b-97b3-b53d50ba92fc" containerName="nova-manage" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.511686 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ca9328-fae7-493b-936e-4019e3931081" containerName="kube-state-metrics" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.513039 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.515071 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.515992 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.521302 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.613351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.613416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.613535 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.613670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r55g\" (UniqueName: \"kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.715469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.715575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r55g\" (UniqueName: \"kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.715709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.715743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.720402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.721177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.727229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.732211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r55g\" (UniqueName: \"kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g\") pod \"kube-state-metrics-0\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:18 crc kubenswrapper[5030]: I0120 23:48:18.827355 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.139584 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerID="b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba" exitCode=143 Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.139665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerDied","Data":"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba"} Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.143290 5030 generic.go:334] "Generic (PLEG): container finished" podID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerID="ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f" exitCode=143 Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.143339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerDied","Data":"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f"} Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.288178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.291664 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.373036 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.373446 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-central-agent" containerID="cri-o://aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754" gracePeriod=30 Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.375313 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="proxy-httpd" containerID="cri-o://c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47" gracePeriod=30 Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.375457 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="sg-core" containerID="cri-o://a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e" gracePeriod=30 Jan 20 23:48:19 crc kubenswrapper[5030]: I0120 23:48:19.375531 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-notification-agent" containerID="cri-o://58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5" gracePeriod=30 Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.000147 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ca9328-fae7-493b-936e-4019e3931081" path="/var/lib/kubelet/pods/f8ca9328-fae7-493b-936e-4019e3931081/volumes" Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158762 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="4f404786-795b-4fe1-98be-adff150f9666" containerID="c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47" exitCode=0 Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158812 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f404786-795b-4fe1-98be-adff150f9666" containerID="a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e" exitCode=2 Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158825 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f404786-795b-4fe1-98be-adff150f9666" containerID="aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754" exitCode=0 Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158881 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerDied","Data":"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47"} Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerDied","Data":"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e"} Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.158938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerDied","Data":"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754"} Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.160886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f832da32-f1c8-47a5-8bcd-7ea67db8b416","Type":"ContainerStarted","Data":"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb"} Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.160968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f832da32-f1c8-47a5-8bcd-7ea67db8b416","Type":"ContainerStarted","Data":"db8482dc5d1a3de53f05c0df5f7b25915ec1c08554d77cf1e169d14db4b3bde0"} Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.161035 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:20 crc kubenswrapper[5030]: I0120 23:48:20.185080 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=1.81684782 podStartE2EDuration="2.185059633s" podCreationTimestamp="2026-01-20 23:48:18 +0000 UTC" firstStartedPulling="2026-01-20 23:48:19.291361003 +0000 UTC m=+4371.611621291" lastFinishedPulling="2026-01-20 23:48:19.659572806 +0000 UTC m=+4371.979833104" observedRunningTime="2026-01-20 23:48:20.18002387 +0000 UTC m=+4372.500284168" watchObservedRunningTime="2026-01-20 23:48:20.185059633 +0000 UTC m=+4372.505319921" Jan 20 23:48:20 crc kubenswrapper[5030]: E0120 23:48:20.667222 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:48:20 crc kubenswrapper[5030]: E0120 23:48:20.670560 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:48:20 crc kubenswrapper[5030]: E0120 23:48:20.672284 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:48:20 crc kubenswrapper[5030]: E0120 23:48:20.672339 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerName="nova-scheduler-scheduler" Jan 20 23:48:21 crc kubenswrapper[5030]: I0120 23:48:21.983239 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:21 crc kubenswrapper[5030]: I0120 23:48:21.990676 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.079261 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vklj\" (UniqueName: \"kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj\") pod \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs\") pod \"4a9453fb-caa2-4466-8aef-3721fa6d234d\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082610 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle\") pod \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle\") pod \"4a9453fb-caa2-4466-8aef-3721fa6d234d\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data\") pod \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082767 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs\") pod \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\" (UID: \"4bcf67b9-c2e6-4e94-bc61-c3905d269c38\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5l8p\" (UniqueName: \"kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p\") pod \"4a9453fb-caa2-4466-8aef-3721fa6d234d\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs\") pod \"4a9453fb-caa2-4466-8aef-3721fa6d234d\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.082904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data\") pod \"4a9453fb-caa2-4466-8aef-3721fa6d234d\" (UID: \"4a9453fb-caa2-4466-8aef-3721fa6d234d\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.083250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs" (OuterVolumeSpecName: "logs") pod "4bcf67b9-c2e6-4e94-bc61-c3905d269c38" (UID: "4bcf67b9-c2e6-4e94-bc61-c3905d269c38"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.083490 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs" (OuterVolumeSpecName: "logs") pod "4a9453fb-caa2-4466-8aef-3721fa6d234d" (UID: "4a9453fb-caa2-4466-8aef-3721fa6d234d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.087813 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p" (OuterVolumeSpecName: "kube-api-access-j5l8p") pod "4a9453fb-caa2-4466-8aef-3721fa6d234d" (UID: "4a9453fb-caa2-4466-8aef-3721fa6d234d"). InnerVolumeSpecName "kube-api-access-j5l8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.089347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj" (OuterVolumeSpecName: "kube-api-access-5vklj") pod "4bcf67b9-c2e6-4e94-bc61-c3905d269c38" (UID: "4bcf67b9-c2e6-4e94-bc61-c3905d269c38"). InnerVolumeSpecName "kube-api-access-5vklj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.113517 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data" (OuterVolumeSpecName: "config-data") pod "4a9453fb-caa2-4466-8aef-3721fa6d234d" (UID: "4a9453fb-caa2-4466-8aef-3721fa6d234d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.120186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9453fb-caa2-4466-8aef-3721fa6d234d" (UID: "4a9453fb-caa2-4466-8aef-3721fa6d234d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.131234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bcf67b9-c2e6-4e94-bc61-c3905d269c38" (UID: "4bcf67b9-c2e6-4e94-bc61-c3905d269c38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.136351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data" (OuterVolumeSpecName: "config-data") pod "4bcf67b9-c2e6-4e94-bc61-c3905d269c38" (UID: "4bcf67b9-c2e6-4e94-bc61-c3905d269c38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.141464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4a9453fb-caa2-4466-8aef-3721fa6d234d" (UID: "4a9453fb-caa2-4466-8aef-3721fa6d234d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.181557 5030 generic.go:334] "Generic (PLEG): container finished" podID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerID="b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440" exitCode=0 Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.181615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerDied","Data":"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440"} Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.181660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"4a9453fb-caa2-4466-8aef-3721fa6d234d","Type":"ContainerDied","Data":"f22f9114e3389416762882129ddd3911a9ce54927cd96f8cb5fd9ecdb8f61662"} Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.181675 5030 scope.go:117] "RemoveContainer" containerID="b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.181788 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.191980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.192014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.192154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.192431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.192480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.192941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.193025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.193140 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2m9x\" (UniqueName: \"kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x\") pod \"4f404786-795b-4fe1-98be-adff150f9666\" (UID: \"4f404786-795b-4fe1-98be-adff150f9666\") " Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.193452 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194111 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194141 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a9453fb-caa2-4466-8aef-3721fa6d234d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194157 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194175 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194189 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194201 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194217 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5l8p\" (UniqueName: \"kubernetes.io/projected/4a9453fb-caa2-4466-8aef-3721fa6d234d-kube-api-access-j5l8p\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194231 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194242 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f404786-795b-4fe1-98be-adff150f9666-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194254 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9453fb-caa2-4466-8aef-3721fa6d234d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194267 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vklj\" (UniqueName: \"kubernetes.io/projected/4bcf67b9-c2e6-4e94-bc61-c3905d269c38-kube-api-access-5vklj\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194751 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f404786-795b-4fe1-98be-adff150f9666" containerID="58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5" exitCode=0 Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194802 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerDied","Data":"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5"} Jan 20 
23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4f404786-795b-4fe1-98be-adff150f9666","Type":"ContainerDied","Data":"7c34fff42877cec5632a564cbe641912862deccdaaac2b6d4d60f6a664bc77b7"} Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.194930 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.195697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x" (OuterVolumeSpecName: "kube-api-access-p2m9x") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "kube-api-access-p2m9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.196676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts" (OuterVolumeSpecName: "scripts") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.197233 5030 generic.go:334] "Generic (PLEG): container finished" podID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerID="1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e" exitCode=0 Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.197258 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.197289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerDied","Data":"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e"} Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.197772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"4bcf67b9-c2e6-4e94-bc61-c3905d269c38","Type":"ContainerDied","Data":"24a4723278f67f48f84bb355e47545444174dd2d64eda1b7bf99b58ca478b3be"} Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.213633 5030 scope.go:117] "RemoveContainer" containerID="b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.221875 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.248521 5030 scope.go:117] "RemoveContainer" containerID="b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.252601 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.253285 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440\": container with ID starting with b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440 not found: ID does not exist" containerID="b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.253325 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440"} err="failed to get container status \"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440\": rpc error: code = NotFound desc = could not find container \"b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440\": container with ID starting with b60c9f17d536683c0b3086521d39f2a015f2ecb7a7f36acbd4cb72e368d8f440 not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.253347 5030 scope.go:117] "RemoveContainer" containerID="b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.253865 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba\": container with ID starting with b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba not found: ID does not exist" containerID="b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.253919 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba"} err="failed to get container status \"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba\": rpc error: code = NotFound desc = could not find container \"b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba\": container with ID starting with b326cbfa0fd6c978b34d0ac8cc1eeb18a136282fa7f03718466c6ed8092618ba not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.253951 5030 scope.go:117] "RemoveContainer" containerID="c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.255991 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.281854 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282254 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-central-agent" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282270 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-central-agent" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282450 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-log" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282457 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-log" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282466 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="sg-core" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282474 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="sg-core" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282489 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="proxy-httpd" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282494 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="proxy-httpd" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282502 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-notification-agent" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282508 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-notification-agent" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282520 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-log" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282526 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-log" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282549 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-api" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282555 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-api" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.282566 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-metadata" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282572 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-metadata" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282760 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="proxy-httpd" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282777 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-metadata" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282792 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="sg-core" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282798 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-api" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282809 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-central-agent" Jan 20 23:48:22 crc 
kubenswrapper[5030]: I0120 23:48:22.282818 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f404786-795b-4fe1-98be-adff150f9666" containerName="ceilometer-notification-agent" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282829 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" containerName="nova-metadata-log" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.282838 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" containerName="nova-api-log" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.283767 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.285991 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.286527 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.295894 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2m9x\" (UniqueName: \"kubernetes.io/projected/4f404786-795b-4fe1-98be-adff150f9666-kube-api-access-p2m9x\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.295922 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.295931 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.298351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.298585 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.299187 5030 scope.go:117] "RemoveContainer" containerID="a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.306096 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.311451 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data" (OuterVolumeSpecName: "config-data") pod "4f404786-795b-4fe1-98be-adff150f9666" (UID: "4f404786-795b-4fe1-98be-adff150f9666"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.315023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.315754 5030 scope.go:117] "RemoveContainer" containerID="58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.325145 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.326703 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.328835 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.332373 5030 scope.go:117] "RemoveContainer" containerID="aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.343167 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.364014 5030 scope.go:117] "RemoveContainer" containerID="c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.364450 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47\": container with ID starting with c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47 not found: ID does not exist" containerID="c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.364493 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47"} err="failed to get container status \"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47\": rpc error: code = NotFound desc = could not find container \"c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47\": container with ID starting with c9240ef96f450acbb0bc496c062fecd0236d99d36d030f9bc75cff7c1389ba47 not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.364556 5030 scope.go:117] "RemoveContainer" containerID="a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.364957 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e\": container with ID starting with a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e not found: ID does not exist" containerID="a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.364987 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e"} err="failed to get container status \"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e\": rpc error: code = NotFound desc = could not find container 
\"a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e\": container with ID starting with a720673ebe7b5561565a50618027fb9df32cc456dfc0a85bb51860d130e7df2e not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.365009 5030 scope.go:117] "RemoveContainer" containerID="58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.365455 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5\": container with ID starting with 58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5 not found: ID does not exist" containerID="58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.365476 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5"} err="failed to get container status \"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5\": rpc error: code = NotFound desc = could not find container \"58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5\": container with ID starting with 58923d6d2e326c64fbd33abf0cf6dd8ee0f378c38e35dcf2b1e5911de17bdbd5 not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.365487 5030 scope.go:117] "RemoveContainer" containerID="aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.365763 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754\": container with ID starting with aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754 not found: ID does not exist" containerID="aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.365789 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754"} err="failed to get container status \"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754\": rpc error: code = NotFound desc = could not find container \"aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754\": container with ID starting with aa16a8c7591c2f0163856752bc46cf19ebb656270060a0e84d1c774b80b3d754 not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.365806 5030 scope.go:117] "RemoveContainer" containerID="1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.386183 5030 scope.go:117] "RemoveContainer" containerID="ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.397683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.397742 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.397820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwst7\" (UniqueName: \"kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.397851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.397876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.398226 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.398318 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f404786-795b-4fe1-98be-adff150f9666-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.403206 5030 scope.go:117] "RemoveContainer" containerID="1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.403541 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e\": container with ID starting with 1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e not found: ID does not exist" containerID="1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.403600 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e"} err="failed to get container status \"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e\": rpc error: code = NotFound desc = could not find container \"1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e\": container with ID starting with 1a99a23c24c8a4429254d14a7233576f7e5ccde8b2ccd04f13941848e966b20e not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.403654 5030 scope.go:117] "RemoveContainer" containerID="ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f" Jan 20 23:48:22 crc kubenswrapper[5030]: E0120 23:48:22.404005 5030 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f\": container with ID starting with ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f not found: ID does not exist" containerID="ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.404032 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f"} err="failed to get container status \"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f\": rpc error: code = NotFound desc = could not find container \"ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f\": container with ID starting with ed534d78822c8f2a82d5ad51a7048ea000aa721982cc285fff1e7495f7f1379f not found: ID does not exist" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.500194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5l4w\" (UniqueName: \"kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.500555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.500697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.500791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.500883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.501005 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwst7\" (UniqueName: \"kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.501098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " 
pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.501180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.501266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.501692 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.504410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.504420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.505044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.520121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwst7\" (UniqueName: \"kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7\") pod \"nova-metadata-0\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.533462 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.543517 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.564418 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.566899 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.575429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.575644 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.575680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.586482 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.602825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.602881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5l4w\" (UniqueName: \"kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.602954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.602988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.603534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.607048 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.607343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.610664 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.624746 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5l4w\" (UniqueName: \"kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w\") pod \"nova-api-0\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.648016 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705572 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd\") pod \"ceilometer-0\" (UID: 
\"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.705842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swxsh\" (UniqueName: \"kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swxsh\" (UniqueName: \"kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.807543 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.808754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.809122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.812014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.812333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.812565 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.812930 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.816201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.823823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swxsh\" (UniqueName: \"kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh\") pod \"ceilometer-0\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:22 crc kubenswrapper[5030]: I0120 23:48:22.833997 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.080989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: W0120 23:48:23.096222 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc375ff6f_68b7_42be_8e2b_800ad51039f5.slice/crio-fe04f526fd281e9e79d89f074e64d5a25b12ca027d1160ef754aa5dc6bbc16ab WatchSource:0}: Error finding container fe04f526fd281e9e79d89f074e64d5a25b12ca027d1160ef754aa5dc6bbc16ab: Status 404 returned error can't find the container with id fe04f526fd281e9e79d89f074e64d5a25b12ca027d1160ef754aa5dc6bbc16ab Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.128417 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.198819 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: W0120 23:48:23.201274 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a2ca22_4f91_4168_80ce_ef047529dca1.slice/crio-ce597354d5d179dd78768275dc4cbf5b2d1745e9dbc0ad466d9bb5a42716a2be WatchSource:0}: Error finding container ce597354d5d179dd78768275dc4cbf5b2d1745e9dbc0ad466d9bb5a42716a2be: Status 404 returned error can't find the container with id ce597354d5d179dd78768275dc4cbf5b2d1745e9dbc0ad466d9bb5a42716a2be Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.211223 5030 generic.go:334] "Generic (PLEG): container finished" podID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" exitCode=0 Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.211305 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.211316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fe4f4a14-2d5b-4a4c-b76f-1675f3429044","Type":"ContainerDied","Data":"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363"} Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.211347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"fe4f4a14-2d5b-4a4c-b76f-1675f3429044","Type":"ContainerDied","Data":"32b04fcedd7c8d372b7009c1f3757a5bf75ffc3856ccc0cbaf9f96feb134c1db"} Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.211365 5030 scope.go:117] "RemoveContainer" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.214955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerStarted","Data":"fe04f526fd281e9e79d89f074e64d5a25b12ca027d1160ef754aa5dc6bbc16ab"} Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.215212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle\") pod \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.215346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-929tf\" (UniqueName: \"kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf\") pod \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.215441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data\") pod \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\" (UID: \"fe4f4a14-2d5b-4a4c-b76f-1675f3429044\") " Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.222109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf" (OuterVolumeSpecName: "kube-api-access-929tf") pod "fe4f4a14-2d5b-4a4c-b76f-1675f3429044" (UID: "fe4f4a14-2d5b-4a4c-b76f-1675f3429044"). InnerVolumeSpecName "kube-api-access-929tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.247035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data" (OuterVolumeSpecName: "config-data") pod "fe4f4a14-2d5b-4a4c-b76f-1675f3429044" (UID: "fe4f4a14-2d5b-4a4c-b76f-1675f3429044"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.253357 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe4f4a14-2d5b-4a4c-b76f-1675f3429044" (UID: "fe4f4a14-2d5b-4a4c-b76f-1675f3429044"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.262803 5030 scope.go:117] "RemoveContainer" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" Jan 20 23:48:23 crc kubenswrapper[5030]: E0120 23:48:23.263303 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363\": container with ID starting with 63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363 not found: ID does not exist" containerID="63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.263342 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363"} err="failed to get container status \"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363\": rpc error: code = NotFound desc = could not find container \"63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363\": container with ID starting with 63591549c178c1410dfe17eaccfa4f11087a039bd73eaf014cce1e4deac11363 not found: ID does not exist" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.319390 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-929tf\" (UniqueName: \"kubernetes.io/projected/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-kube-api-access-929tf\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.319584 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.319663 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe4f4a14-2d5b-4a4c-b76f-1675f3429044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.319553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: W0120 23:48:23.322821 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04090560_6a34_4c06_8ad1_2302c0780d35.slice/crio-be0932e2771b4b5f56d4dc07e81638ee17c21f4d57170fa639cbb797502033e5 WatchSource:0}: Error finding container be0932e2771b4b5f56d4dc07e81638ee17c21f4d57170fa639cbb797502033e5: Status 404 returned error can't find the container with id be0932e2771b4b5f56d4dc07e81638ee17c21f4d57170fa639cbb797502033e5 Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.551245 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.568156 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.584786 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: E0120 23:48:23.585265 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerName="nova-scheduler-scheduler" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 
23:48:23.585301 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerName="nova-scheduler-scheduler" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.585532 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" containerName="nova-scheduler-scheduler" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.586326 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.589172 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.598016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.727845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.727906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.728264 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98fq\" (UniqueName: \"kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.829923 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.829969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.830071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d98fq\" (UniqueName: \"kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.963458 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:48:23 crc kubenswrapper[5030]: E0120 23:48:23.963704 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.980768 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9453fb-caa2-4466-8aef-3721fa6d234d" path="/var/lib/kubelet/pods/4a9453fb-caa2-4466-8aef-3721fa6d234d/volumes" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.981379 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bcf67b9-c2e6-4e94-bc61-c3905d269c38" path="/var/lib/kubelet/pods/4bcf67b9-c2e6-4e94-bc61-c3905d269c38/volumes" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.981979 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f404786-795b-4fe1-98be-adff150f9666" path="/var/lib/kubelet/pods/4f404786-795b-4fe1-98be-adff150f9666/volumes" Jan 20 23:48:23 crc kubenswrapper[5030]: I0120 23:48:23.983310 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4f4a14-2d5b-4a4c-b76f-1675f3429044" path="/var/lib/kubelet/pods/fe4f4a14-2d5b-4a4c-b76f-1675f3429044/volumes" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.144444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.144668 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.145401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98fq\" (UniqueName: \"kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq\") pod \"nova-scheduler-0\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.230336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerStarted","Data":"be0932e2771b4b5f56d4dc07e81638ee17c21f4d57170fa639cbb797502033e5"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.236385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerStarted","Data":"f2d6b9369ce716621b16493427f2b972be56c6d00d119998710fbf5b3df8d1c7"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.236469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerStarted","Data":"b0f23f798796c9796da67506324bf9f776c7d41136bd05b183e42db700b9f13e"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.241221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerStarted","Data":"5871f4bdf0294abcf2fc24e99f9516fa57df9c67b341735ce123e6dcc3125612"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.241268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerStarted","Data":"1ec78bfd8ed7da1cb896e14fb16fb3b9249c5c701440a0260b4dc66343d7d693"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.241278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerStarted","Data":"ce597354d5d179dd78768275dc4cbf5b2d1745e9dbc0ad466d9bb5a42716a2be"} Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.270237 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.270218774 podStartE2EDuration="2.270218774s" podCreationTimestamp="2026-01-20 23:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:24.262127247 +0000 UTC m=+4376.582387565" watchObservedRunningTime="2026-01-20 23:48:24.270218774 +0000 UTC m=+4376.590479052" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.293782 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.293758814 podStartE2EDuration="2.293758814s" podCreationTimestamp="2026-01-20 23:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:24.290678349 +0000 UTC m=+4376.610938667" watchObservedRunningTime="2026-01-20 23:48:24.293758814 +0000 UTC m=+4376.614019112" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.508832 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:24 crc kubenswrapper[5030]: I0120 23:48:24.981143 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:48:24 crc kubenswrapper[5030]: W0120 23:48:24.985370 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6944830c_0fd3_4285_b2d7_acbc0265920f.slice/crio-ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492 WatchSource:0}: Error finding container ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492: Status 404 returned error can't find the container with id ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492 Jan 20 23:48:25 crc kubenswrapper[5030]: I0120 23:48:25.253464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6944830c-0fd3-4285-b2d7-acbc0265920f","Type":"ContainerStarted","Data":"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8"} Jan 20 23:48:25 crc kubenswrapper[5030]: I0120 23:48:25.253775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6944830c-0fd3-4285-b2d7-acbc0265920f","Type":"ContainerStarted","Data":"ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492"} Jan 20 23:48:25 crc kubenswrapper[5030]: I0120 23:48:25.257075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerStarted","Data":"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9"} Jan 20 23:48:25 crc kubenswrapper[5030]: I0120 23:48:25.286533 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.286477086 podStartE2EDuration="2.286477086s" podCreationTimestamp="2026-01-20 23:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:25.279283791 +0000 UTC m=+4377.599544079" watchObservedRunningTime="2026-01-20 23:48:25.286477086 +0000 UTC m=+4377.606737384" Jan 20 23:48:26 crc kubenswrapper[5030]: I0120 23:48:26.280476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerStarted","Data":"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154"} Jan 20 23:48:27 crc kubenswrapper[5030]: I0120 23:48:27.295793 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerStarted","Data":"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44"} Jan 20 23:48:27 crc kubenswrapper[5030]: I0120 23:48:27.608229 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:27 crc kubenswrapper[5030]: I0120 23:48:27.608292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:28 crc kubenswrapper[5030]: I0120 23:48:28.311656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerStarted","Data":"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b"} Jan 20 23:48:28 crc kubenswrapper[5030]: I0120 23:48:28.313787 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:28 crc kubenswrapper[5030]: I0120 23:48:28.341440 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.034750431 podStartE2EDuration="6.341423855s" podCreationTimestamp="2026-01-20 23:48:22 +0000 UTC" firstStartedPulling="2026-01-20 23:48:23.326239113 +0000 UTC m=+4375.646499401" lastFinishedPulling="2026-01-20 23:48:27.632912527 +0000 UTC m=+4379.953172825" observedRunningTime="2026-01-20 23:48:28.336289501 +0000 UTC m=+4380.656549809" watchObservedRunningTime="2026-01-20 23:48:28.341423855 +0000 UTC m=+4380.661684143" Jan 20 23:48:28 crc kubenswrapper[5030]: I0120 23:48:28.841159 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:48:29 crc kubenswrapper[5030]: I0120 23:48:29.509439 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:32 crc kubenswrapper[5030]: I0120 23:48:32.607922 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:32 crc kubenswrapper[5030]: I0120 23:48:32.608308 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:32 crc kubenswrapper[5030]: I0120 23:48:32.649046 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:32 crc kubenswrapper[5030]: I0120 23:48:32.649121 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:33 crc kubenswrapper[5030]: I0120 23:48:33.621839 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.162:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:33 crc kubenswrapper[5030]: I0120 23:48:33.621857 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.162:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:33 crc kubenswrapper[5030]: I0120 23:48:33.731899 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.163:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:48:33 crc kubenswrapper[5030]: I0120 23:48:33.731939 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.163:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Jan 20 23:48:34 crc kubenswrapper[5030]: I0120 23:48:34.509390 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:34 crc kubenswrapper[5030]: I0120 23:48:34.555401 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:35 crc kubenswrapper[5030]: I0120 23:48:35.456776 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:48:35 crc kubenswrapper[5030]: I0120 23:48:35.963758 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:48:35 crc kubenswrapper[5030]: E0120 23:48:35.964540 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.618092 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.619388 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.626790 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.653158 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.653605 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.654738 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:42 crc kubenswrapper[5030]: I0120 23:48:42.656453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:43 crc kubenswrapper[5030]: I0120 23:48:43.508259 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:43 crc kubenswrapper[5030]: I0120 23:48:43.514146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:43 crc kubenswrapper[5030]: I0120 23:48:43.518280 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.341246 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.341886 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="sg-core" containerID="cri-o://48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44" gracePeriod=30 Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 
23:48:45.341968 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-notification-agent" containerID="cri-o://418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154" gracePeriod=30 Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.341974 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="proxy-httpd" containerID="cri-o://731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b" gracePeriod=30 Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.342194 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-central-agent" containerID="cri-o://7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9" gracePeriod=30 Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.361924 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.164:3000/\": read tcp 10.217.0.2:53180->10.217.1.164:3000: read: connection reset by peer" Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.562257 5030 generic.go:334] "Generic (PLEG): container finished" podID="04090560-6a34-4c06-8ad1-2302c0780d35" containerID="48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44" exitCode=2 Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.564299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerDied","Data":"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44"} Jan 20 23:48:45 crc kubenswrapper[5030]: I0120 23:48:45.986047 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579422 5030 generic.go:334] "Generic (PLEG): container finished" podID="04090560-6a34-4c06-8ad1-2302c0780d35" containerID="731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b" exitCode=0 Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579448 5030 generic.go:334] "Generic (PLEG): container finished" podID="04090560-6a34-4c06-8ad1-2302c0780d35" containerID="7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9" exitCode=0 Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerDied","Data":"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b"} Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerDied","Data":"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9"} Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579701 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-api" 
containerID="cri-o://5871f4bdf0294abcf2fc24e99f9516fa57df9c67b341735ce123e6dcc3125612" gracePeriod=30 Jan 20 23:48:46 crc kubenswrapper[5030]: I0120 23:48:46.579962 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-log" containerID="cri-o://1ec78bfd8ed7da1cb896e14fb16fb3b9249c5c701440a0260b4dc66343d7d693" gracePeriod=30 Jan 20 23:48:47 crc kubenswrapper[5030]: I0120 23:48:47.591870 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerID="1ec78bfd8ed7da1cb896e14fb16fb3b9249c5c701440a0260b4dc66343d7d693" exitCode=143 Jan 20 23:48:47 crc kubenswrapper[5030]: I0120 23:48:47.591932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerDied","Data":"1ec78bfd8ed7da1cb896e14fb16fb3b9249c5c701440a0260b4dc66343d7d693"} Jan 20 23:48:49 crc kubenswrapper[5030]: I0120 23:48:49.964264 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:48:49 crc kubenswrapper[5030]: E0120 23:48:49.965583 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:48:50 crc kubenswrapper[5030]: I0120 23:48:50.627674 5030 generic.go:334] "Generic (PLEG): container finished" podID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerID="5871f4bdf0294abcf2fc24e99f9516fa57df9c67b341735ce123e6dcc3125612" exitCode=0 Jan 20 23:48:50 crc kubenswrapper[5030]: I0120 23:48:50.627859 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerDied","Data":"5871f4bdf0294abcf2fc24e99f9516fa57df9c67b341735ce123e6dcc3125612"} Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.051076 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.109915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data\") pod \"d1a2ca22-4f91-4168-80ce-ef047529dca1\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.110097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5l4w\" (UniqueName: \"kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w\") pod \"d1a2ca22-4f91-4168-80ce-ef047529dca1\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.110173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs\") pod \"d1a2ca22-4f91-4168-80ce-ef047529dca1\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.110258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle\") pod \"d1a2ca22-4f91-4168-80ce-ef047529dca1\" (UID: \"d1a2ca22-4f91-4168-80ce-ef047529dca1\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.111999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs" (OuterVolumeSpecName: "logs") pod "d1a2ca22-4f91-4168-80ce-ef047529dca1" (UID: "d1a2ca22-4f91-4168-80ce-ef047529dca1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.116993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w" (OuterVolumeSpecName: "kube-api-access-d5l4w") pod "d1a2ca22-4f91-4168-80ce-ef047529dca1" (UID: "d1a2ca22-4f91-4168-80ce-ef047529dca1"). InnerVolumeSpecName "kube-api-access-d5l4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.136968 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a2ca22-4f91-4168-80ce-ef047529dca1" (UID: "d1a2ca22-4f91-4168-80ce-ef047529dca1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.147614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data" (OuterVolumeSpecName: "config-data") pod "d1a2ca22-4f91-4168-80ce-ef047529dca1" (UID: "d1a2ca22-4f91-4168-80ce-ef047529dca1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.213098 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5l4w\" (UniqueName: \"kubernetes.io/projected/d1a2ca22-4f91-4168-80ce-ef047529dca1-kube-api-access-d5l4w\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.213350 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1a2ca22-4f91-4168-80ce-ef047529dca1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.213364 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.213374 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a2ca22-4f91-4168-80ce-ef047529dca1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.230833 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.314954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315058 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315310 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swxsh\" (UniqueName: \"kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh\") pod \"04090560-6a34-4c06-8ad1-2302c0780d35\" (UID: \"04090560-6a34-4c06-8ad1-2302c0780d35\") " Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.315891 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.316322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.319849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts" (OuterVolumeSpecName: "scripts") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.335196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh" (OuterVolumeSpecName: "kube-api-access-swxsh") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "kube-api-access-swxsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.356699 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.366234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.387807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419029 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419084 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419117 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419146 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swxsh\" (UniqueName: \"kubernetes.io/projected/04090560-6a34-4c06-8ad1-2302c0780d35-kube-api-access-swxsh\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419174 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/04090560-6a34-4c06-8ad1-2302c0780d35-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.419200 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.424509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data" (OuterVolumeSpecName: "config-data") pod "04090560-6a34-4c06-8ad1-2302c0780d35" (UID: "04090560-6a34-4c06-8ad1-2302c0780d35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.520387 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04090560-6a34-4c06-8ad1-2302c0780d35-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.642773 5030 generic.go:334] "Generic (PLEG): container finished" podID="04090560-6a34-4c06-8ad1-2302c0780d35" containerID="418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154" exitCode=0 Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.642884 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.642880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerDied","Data":"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154"} Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.643083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"04090560-6a34-4c06-8ad1-2302c0780d35","Type":"ContainerDied","Data":"be0932e2771b4b5f56d4dc07e81638ee17c21f4d57170fa639cbb797502033e5"} Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.643120 5030 scope.go:117] "RemoveContainer" containerID="731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.645580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"d1a2ca22-4f91-4168-80ce-ef047529dca1","Type":"ContainerDied","Data":"ce597354d5d179dd78768275dc4cbf5b2d1745e9dbc0ad466d9bb5a42716a2be"} Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.645747 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.666775 5030 scope.go:117] "RemoveContainer" containerID="48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.693888 5030 scope.go:117] "RemoveContainer" containerID="418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.710918 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.721782 5030 scope.go:117] "RemoveContainer" containerID="7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.725352 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.736600 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.747494 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.754122 5030 scope.go:117] "RemoveContainer" containerID="731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.756804 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b\": container with ID starting with 731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b not found: ID does not exist" containerID="731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.756864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b"} err="failed to get container status \"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b\": rpc error: code = NotFound desc = could not find 
container \"731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b\": container with ID starting with 731e2c423140e3522cd47c27f95e1f17662d9cf65c9ece61d960b38656be604b not found: ID does not exist" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.756898 5030 scope.go:117] "RemoveContainer" containerID="48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.758150 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44\": container with ID starting with 48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44 not found: ID does not exist" containerID="48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.758190 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44"} err="failed to get container status \"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44\": rpc error: code = NotFound desc = could not find container \"48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44\": container with ID starting with 48c9c2fe5573f4ec56dd4e5d5297def56e3b7dfdfe230941bf1737afa5a59d44 not found: ID does not exist" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.758216 5030 scope.go:117] "RemoveContainer" containerID="418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.758497 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154\": container with ID starting with 418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154 not found: ID does not exist" containerID="418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.758525 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154"} err="failed to get container status \"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154\": rpc error: code = NotFound desc = could not find container \"418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154\": container with ID starting with 418bfe577c4d149cee9748647c98209413c708a4e2bedb58c0cc48de072e8154 not found: ID does not exist" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.758544 5030 scope.go:117] "RemoveContainer" containerID="7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.759240 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9\": container with ID starting with 7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9 not found: ID does not exist" containerID="7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.759274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9"} err="failed to get container status \"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9\": rpc error: code = NotFound desc = could not find container \"7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9\": container with ID starting with 7fa80d92761accac304b360ba78225a67844fd21b07ae40ab8add0b66e36e8b9 not found: ID does not exist" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.759293 5030 scope.go:117] "RemoveContainer" containerID="5871f4bdf0294abcf2fc24e99f9516fa57df9c67b341735ce123e6dcc3125612" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.768738 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769239 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="proxy-httpd" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769264 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="proxy-httpd" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769290 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-log" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769300 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-log" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769315 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-api" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769324 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-api" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769367 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="sg-core" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769379 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="sg-core" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769400 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-notification-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-notification-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: E0120 23:48:51.769429 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-central-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-central-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769687 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-api" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769715 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" containerName="nova-api-log" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769732 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-notification-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769748 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="proxy-httpd" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769769 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="sg-core" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.769790 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" containerName="ceilometer-central-agent" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.771326 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.774822 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.774960 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.775047 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.776289 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.778788 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.781012 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.781229 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.781479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.782918 5030 scope.go:117] "RemoveContainer" containerID="1ec78bfd8ed7da1cb896e14fb16fb3b9249c5c701440a0260b4dc66343d7d693" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.784686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.793783 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.926998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927297 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xwb\" (UniqueName: \"kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.927498 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6cwm\" (UniqueName: \"kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.982487 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04090560-6a34-4c06-8ad1-2302c0780d35" path="/var/lib/kubelet/pods/04090560-6a34-4c06-8ad1-2302c0780d35/volumes" Jan 20 23:48:51 crc kubenswrapper[5030]: I0120 23:48:51.984356 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a2ca22-4f91-4168-80ce-ef047529dca1" path="/var/lib/kubelet/pods/d1a2ca22-4f91-4168-80ce-ef047529dca1/volumes" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.028997 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6cwm\" (UniqueName: \"kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " 
pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xwb\" (UniqueName: \"kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029905 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.029954 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.031536 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.035604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.035728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.036727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.039809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.042754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.043008 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.043021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.043305 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.043889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.055200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6cwm\" (UniqueName: \"kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm\") pod \"nova-api-0\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.064490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xwb\" (UniqueName: \"kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb\") pod \"ceilometer-0\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.098944 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.107173 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.632259 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:48:52 crc kubenswrapper[5030]: W0120 23:48:52.684304 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfff68923_cebb_4d2c_bf69_4355b0da8b6d.slice/crio-00cef13c135e11960a7dbb2e3be693db993cb165337f0f9bdfbdb166b386af88 WatchSource:0}: Error finding container 00cef13c135e11960a7dbb2e3be693db993cb165337f0f9bdfbdb166b386af88: Status 404 returned error can't find the container with id 00cef13c135e11960a7dbb2e3be693db993cb165337f0f9bdfbdb166b386af88 Jan 20 23:48:52 crc kubenswrapper[5030]: I0120 23:48:52.688386 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:48:53 crc kubenswrapper[5030]: I0120 23:48:53.684909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerStarted","Data":"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3"} Jan 20 23:48:53 crc kubenswrapper[5030]: I0120 23:48:53.685235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerStarted","Data":"649095794f9ad7ae4dcb6223b9267716fa2fc98ee77d546ec29536c8d5cb89ae"} Jan 20 23:48:53 crc kubenswrapper[5030]: I0120 23:48:53.686642 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerStarted","Data":"dce285993f85675d1e7573be37d820244bbab99b017ceb4a9b64b82a42bbdad3"} Jan 20 23:48:53 crc kubenswrapper[5030]: I0120 23:48:53.686666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerStarted","Data":"3ead8f8773eddb79d0b1807a8d7ef304eaad1f308dc8ad661b4d120d965ddd11"} Jan 20 23:48:53 crc kubenswrapper[5030]: I0120 23:48:53.686676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerStarted","Data":"00cef13c135e11960a7dbb2e3be693db993cb165337f0f9bdfbdb166b386af88"} Jan 20 23:48:54 crc kubenswrapper[5030]: I0120 23:48:54.709166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerStarted","Data":"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda"} Jan 20 23:48:55 crc kubenswrapper[5030]: I0120 23:48:55.722990 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerStarted","Data":"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c"} Jan 20 23:48:56 crc kubenswrapper[5030]: I0120 23:48:56.744791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerStarted","Data":"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b"} Jan 20 23:48:56 crc kubenswrapper[5030]: I0120 23:48:56.747117 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:48:56 crc kubenswrapper[5030]: I0120 23:48:56.776228 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.270071989 podStartE2EDuration="5.776201903s" podCreationTimestamp="2026-01-20 23:48:51 +0000 UTC" firstStartedPulling="2026-01-20 23:48:52.651546974 +0000 UTC m=+4404.971807262" lastFinishedPulling="2026-01-20 23:48:56.157676868 +0000 UTC m=+4408.477937176" observedRunningTime="2026-01-20 23:48:56.766959198 +0000 UTC m=+4409.087219526" watchObservedRunningTime="2026-01-20 23:48:56.776201903 +0000 UTC m=+4409.096462201" Jan 20 23:48:56 crc kubenswrapper[5030]: I0120 23:48:56.785150 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=5.785131699 podStartE2EDuration="5.785131699s" podCreationTimestamp="2026-01-20 23:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:48:53.719393769 +0000 UTC m=+4406.039654107" watchObservedRunningTime="2026-01-20 23:48:56.785131699 +0000 UTC m=+4409.105391997" Jan 20 23:49:02 crc kubenswrapper[5030]: I0120 23:49:02.099710 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:02 crc kubenswrapper[5030]: I0120 23:49:02.100580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:03 crc kubenswrapper[5030]: I0120 23:49:03.112906 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.166:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:49:03 crc kubenswrapper[5030]: I0120 23:49:03.112870 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.166:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:49:04 crc kubenswrapper[5030]: I0120 23:49:04.962041 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:49:04 crc kubenswrapper[5030]: E0120 23:49:04.962338 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:49:12 crc kubenswrapper[5030]: I0120 23:49:12.111103 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:12 crc kubenswrapper[5030]: I0120 23:49:12.111937 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:12 crc kubenswrapper[5030]: I0120 23:49:12.115654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:12 crc 
kubenswrapper[5030]: I0120 23:49:12.124355 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:12 crc kubenswrapper[5030]: I0120 23:49:12.952193 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:12 crc kubenswrapper[5030]: I0120 23:49:12.958875 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.298276 5030 scope.go:117] "RemoveContainer" containerID="6d4fdd95933dd76f66126e2eafeb8812ed0de780243beac0c33000b1c9a049c3" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.337332 5030 scope.go:117] "RemoveContainer" containerID="590d9e7afd81121f20e8b3c4a10aa9c029280098e2371ecd81d6f475c6e6a932" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.387263 5030 scope.go:117] "RemoveContainer" containerID="124b27d23196c76c208e464129a013f7699a2efee00d1a0cc64c1e1f4062736b" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.424860 5030 scope.go:117] "RemoveContainer" containerID="20c4a921161139dd7e23f659cc47617ba1e894ddbac491c152cd76fd4f4e85b4" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.465087 5030 scope.go:117] "RemoveContainer" containerID="8b167ca61f8e9969a9c5b4773f26b86a04c25264b7d286b0bb38c376ba36ecef" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.500288 5030 scope.go:117] "RemoveContainer" containerID="ad6f8e58c9dff12d340a8632ebe13ee33bdf79383a461b73e54bcbe069e202bd" Jan 20 23:49:16 crc kubenswrapper[5030]: I0120 23:49:16.536332 5030 scope.go:117] "RemoveContainer" containerID="f1f39f1af8be8586d4e08109ccc15fd44076fee82dbc43273800e8dc177416bf" Jan 20 23:49:19 crc kubenswrapper[5030]: I0120 23:49:19.963669 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:49:19 crc kubenswrapper[5030]: E0120 23:49:19.964766 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:49:22 crc kubenswrapper[5030]: I0120 23:49:22.115650 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:32 crc kubenswrapper[5030]: I0120 23:49:32.962331 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:49:32 crc kubenswrapper[5030]: E0120 23:49:32.963604 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:49:45 crc kubenswrapper[5030]: I0120 23:49:45.962430 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:49:45 crc kubenswrapper[5030]: E0120 23:49:45.963670 5030 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.776914 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.777253 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="openstack-network-exporter" containerID="cri-o://5ffdf2f065bf633d15e59103da0d96e1da51576f3b2d1cd447a73672d69bdc29" gracePeriod=30 Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.777417 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="ovn-northd" containerID="cri-o://ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" gracePeriod=30 Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.798077 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.799852 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.825314 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.827002 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.835662 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.835888 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="99675934-2bb7-46f9-8e3c-49e818022424" containerName="openstackclient" containerID="cri-o://e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0" gracePeriod=2 Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.858929 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.860292 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="openstack-network-exporter" containerID="cri-o://3bdfa2c0f1f4d03ce20d64482e831290dd6ab19d7e2711ec93d9ac4cce9e8843" gracePeriod=300 Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.903886 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.919488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.931032 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shf5\" (UniqueName: \"kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932192 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nxdw\" (UniqueName: \"kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932277 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.932293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.946034 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:47 crc kubenswrapper[5030]: I0120 23:49:47.946223 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerName="nova-scheduler-scheduler" containerID="cri-o://ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.007131 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.007979 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-log" 
containerID="cri-o://c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.008540 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-httpd" containerID="cri-o://7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shf5\" (UniqueName: \"kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.068979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.069219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nxdw\" (UniqueName: \"kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.069258 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.069291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.069449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.082452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.082774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.089410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.090202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.096802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.097401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.097846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.105778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.117492 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.118129 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nxdw\" (UniqueName: \"kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw\") pod \"barbican-worker-656dcc67f7-xmnf4\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.132224 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" containerID="cri-o://b0f23f798796c9796da67506324bf9f776c7d41136bd05b183e42db700b9f13e" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.132904 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" containerID="cri-o://f2d6b9369ce716621b16493427f2b972be56c6d00d119998710fbf5b3df8d1c7" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.146846 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.154049 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shf5\" (UniqueName: \"kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5\") pod \"barbican-keystone-listener-5db5b7584d-vzrsp\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.165861 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.166331 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.241195 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="ovsdbserver-nb" containerID="cri-o://d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" gracePeriod=300 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.332698 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.333001 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="cinder-scheduler" containerID="cri-o://539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.333398 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="probe" containerID="cri-o://039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.387270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.387521 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" containerName="memcached" containerID="cri-o://320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.425540 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.439536 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.446081 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.485485 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.485725 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api-log" containerID="cri-o://4692f8f2e0876360cbe854005c8dbab4b736e7fa64a44b994b00fec7f8624e10" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.486092 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api" containerID="cri-o://d9cbf6d5b6e9e1557a747ebddf26e0d91cf97ac99872d10a07f8a705f8fd5859" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.498355 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerID="c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1" exitCode=143 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.498406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerDied","Data":"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1"} Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.499987 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerID="5ffdf2f065bf633d15e59103da0d96e1da51576f3b2d1cd447a73672d69bdc29" exitCode=2 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.500034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerDied","Data":"5ffdf2f065bf633d15e59103da0d96e1da51576f3b2d1cd447a73672d69bdc29"} Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.517104 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.526672 5030 generic.go:334] "Generic (PLEG): container finished" podID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerID="3bdfa2c0f1f4d03ce20d64482e831290dd6ab19d7e2711ec93d9ac4cce9e8843" exitCode=2 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.526878 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-log" containerID="cri-o://79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.526982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerDied","Data":"3bdfa2c0f1f4d03ce20d64482e831290dd6ab19d7e2711ec93d9ac4cce9e8843"} Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.527289 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-httpd" 
containerID="cri-o://08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.568782 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.569330 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="82558fc9-417f-49d3-b522-207e0c215f92" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.607471 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7fdd8fb598-6247c"] Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.610333 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99675934-2bb7-46f9-8e3c-49e818022424" containerName="openstackclient" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.610359 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="99675934-2bb7-46f9-8e3c-49e818022424" containerName="openstackclient" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.610662 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="99675934-2bb7-46f9-8e3c-49e818022424" containerName="openstackclient" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.618879 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.641007 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7fdd8fb598-6247c"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.684282 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.684634 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="openstack-network-exporter" containerID="cri-o://6b834b707d40e4da7564fea909b7b96377cf70ecece5fdecf7803fb75747167e" gracePeriod=300 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.724937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 
20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725084 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxtw\" (UniqueName: \"kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.725237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.728864 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.730387 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.777414 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.779047 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.796772 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.798003 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.808988 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.809272 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" containerID="cri-o://3ead8f8773eddb79d0b1807a8d7ef304eaad1f308dc8ad661b4d120d965ddd11" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.809414 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-api" containerID="cri-o://dce285993f85675d1e7573be37d820244bbab99b017ceb4a9b64b82a42bbdad3" gracePeriod=30 Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.819249 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " 
pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830872 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6q8s\" (UniqueName: \"kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.830980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkdq\" (UniqueName: \"kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831018 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.831181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxtw\" (UniqueName: \"kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.832298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.837386 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.837446 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. 
No retries permitted until 2026-01-20 23:49:49.337430896 +0000 UTC m=+4461.657691184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.838790 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.838874 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.33885178 +0000 UTC m=+4461.659112068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.838926 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.850534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.851113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.851354 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.857719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.915449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxtw\" (UniqueName: \"kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " 
pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkdq\" (UniqueName: \"kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: 
\"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.937997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfks4\" (UniqueName: \"kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938150 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6q8s\" (UniqueName: 
\"kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.938219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.940688 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.940741 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.440725791 +0000 UTC m=+4461.760986079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.941324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.946987 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: EOF, stdout: , stderr: , exit code -1" containerID="d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.948842 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145 is running failed: container process not found" containerID="d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949323 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145 is running failed: container process not found" containerID="d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949345 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="ovsdbserver-nb" Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949532 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949569 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.449554526 +0000 UTC m=+4461.769814814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949874 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.949937 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.449917634 +0000 UTC m=+4461.770177922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.951906 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.951997 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.451974444 +0000 UTC m=+4461.772234812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-public-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.952727 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: E0120 23:49:48.952770 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.452761053 +0000 UTC m=+4461.773021331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.964137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.966764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.968721 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:48 crc kubenswrapper[5030]: I0120 23:49:48.999251 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.004670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6q8s\" (UniqueName: \"kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.005197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkdq\" (UniqueName: \"kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.005961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.007313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfks4\" (UniqueName: \"kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.039988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.041131 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.041200 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. 
No retries permitted until 2026-01-20 23:49:49.541182238 +0000 UTC m=+4461.861442526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.041245 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.041267 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:49.54126068 +0000 UTC m=+4461.861520958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.046229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.046406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.046767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.055562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.055699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.083758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfks4\" (UniqueName: \"kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 
23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.157271 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="ovsdbserver-sb" containerID="cri-o://d42829f2ade98d7dd79e571500a85a3e1b4edf65a779d6c65138c90834bf5c89" gracePeriod=300 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.194191 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="galera" containerID="cri-o://427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e" gracePeriod=30 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.204767 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="galera" containerID="cri-o://ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be" gracePeriod=30 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.385672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.385971 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.386650 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.386093385 +0000 UTC m=+4462.706353673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.387015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.387096 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.387314 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.387305694 +0000 UTC m=+4462.707565982 (durationBeforeRetry 1s). 
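`Killing container with a grace period` marks the start of graceful shutdown: the runtime delivers SIGTERM and escalates to SIGKILL only after the grace period expires (300 s for `ovsdbserver-sb`, 30 s for the Galera containers here). The value normally comes from the pod's `terminationGracePeriodSeconds` or a delete-time override. A minimal sketch, assuming the 300 s figure is spec-driven; only that value and the container name come from the log:

```go
// Sketch: where the gracePeriod in the "Killing container" messages typically originates.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	grace := int64(300) // the 300s grace period seen for ovsdbserver-sb-0
	pod := corev1.Pod{
		Spec: corev1.PodSpec{
			TerminationGracePeriodSeconds: &grace,
			Containers: []corev1.Container{
				// Image is a placeholder; only the container name appears in the log.
				{Name: "ovsdbserver-sb", Image: "example.invalid/ovn-sb:latest"},
			},
		},
	}
	fmt.Println("terminationGracePeriodSeconds:", *pod.Spec.TerminationGracePeriodSeconds)
}
```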
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.475520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.493843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.494007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.494063 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.494122 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.494181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494331 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494367 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494386 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494386 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494377 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494429 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.494409753 +0000 UTC m=+4462.814670041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494460 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.494440064 +0000 UTC m=+4462.814700352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494475 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.494469954 +0000 UTC m=+4462.814730232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494487 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.494481525 +0000 UTC m=+4462.814741813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.494498 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.494493715 +0000 UTC m=+4462.814754003 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.511922 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.514952 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.520708 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.520761 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerName="nova-scheduler-scheduler" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.541155 5030 generic.go:334] "Generic (PLEG): container finished" podID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerID="3ead8f8773eddb79d0b1807a8d7ef304eaad1f308dc8ad661b4d120d965ddd11" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.541410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerDied","Data":"3ead8f8773eddb79d0b1807a8d7ef304eaad1f308dc8ad661b4d120d965ddd11"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.544008 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_806e42db-b6fa-4931-ab88-b2903c5e2830/ovsdbserver-sb/0.log" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.544041 5030 generic.go:334] "Generic (PLEG): container finished" podID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerID="6b834b707d40e4da7564fea909b7b96377cf70ecece5fdecf7803fb75747167e" exitCode=2 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.544053 5030 generic.go:334] "Generic (PLEG): container finished" podID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerID="d42829f2ade98d7dd79e571500a85a3e1b4edf65a779d6c65138c90834bf5c89" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.544089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerDied","Data":"6b834b707d40e4da7564fea909b7b96377cf70ecece5fdecf7803fb75747167e"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.544102 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerDied","Data":"d42829f2ade98d7dd79e571500a85a3e1b4edf65a779d6c65138c90834bf5c89"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.556979 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerID="4692f8f2e0876360cbe854005c8dbab4b736e7fa64a44b994b00fec7f8624e10" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.557053 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerDied","Data":"4692f8f2e0876360cbe854005c8dbab4b736e7fa64a44b994b00fec7f8624e10"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.567071 5030 generic.go:334] "Generic (PLEG): container finished" podID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerID="b0f23f798796c9796da67506324bf9f776c7d41136bd05b183e42db700b9f13e" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.567151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerDied","Data":"b0f23f798796c9796da67506324bf9f776c7d41136bd05b183e42db700b9f13e"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.591807 5030 generic.go:334] "Generic (PLEG): container finished" podID="8cf2fe09-370b-4904-9f96-676f484639a2" containerID="79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.591886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerDied","Data":"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.595186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.595226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.595439 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.595477 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.595464364 +0000 UTC m=+4462.915724652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-public-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.595508 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: E0120 23:49:49.595525 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:50.595519505 +0000 UTC m=+4462.915779793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-internal-svc" not found Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.597224 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_201b8e1b-7c3d-4d9d-a47b-4dda67f6829b/ovsdbserver-nb/0.log" Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.597266 5030 generic.go:334] "Generic (PLEG): container finished" podID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerID="d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" exitCode=143 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.597304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerDied","Data":"d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145"} Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.724088 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.772060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.772388 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-central-agent" containerID="cri-o://c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3" gracePeriod=30 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.772875 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="proxy-httpd" containerID="cri-o://a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b" gracePeriod=30 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.773074 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-notification-agent" containerID="cri-o://1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda" gracePeriod=30 Jan 20 23:49:49 crc kubenswrapper[5030]: I0120 23:49:49.773121 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/ceilometer-0" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="sg-core" containerID="cri-o://eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c" gracePeriod=30 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.052983 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_201b8e1b-7c3d-4d9d-a47b-4dda67f6829b/ovsdbserver-nb/0.log" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.053242 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.152761 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd is running failed: container process not found" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.153151 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd is running failed: container process not found" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.153328 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd is running failed: container process not found" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.153350 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerName="nova-cell0-conductor-conductor" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdxdr\" (UniqueName: \"kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226536 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc 
kubenswrapper[5030]: I0120 23:49:50.226577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226615 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.226824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir\") pod \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\" (UID: \"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.228128 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config" (OuterVolumeSpecName: "config") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.228216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.228846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts" (OuterVolumeSpecName: "scripts") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.235182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr" (OuterVolumeSpecName: "kube-api-access-cdxdr") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "kube-api-access-cdxdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.240044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.272097 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.310262 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_806e42db-b6fa-4931-ab88-b2903c5e2830/ovsdbserver-sb/0.log" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.310325 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344173 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjn5g\" (UniqueName: \"kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344262 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 
23:49:50.344527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.344608 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"806e42db-b6fa-4931-ab88-b2903c5e2830\" (UID: \"806e42db-b6fa-4931-ab88-b2903c5e2830\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345161 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345179 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdxdr\" (UniqueName: \"kubernetes.io/projected/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-kube-api-access-cdxdr\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345208 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345218 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.345228 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.346575 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config" (OuterVolumeSpecName: "config") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.346861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.347137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts" (OuterVolumeSpecName: "scripts") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.348076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.348722 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.352371 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.354861 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.357154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" (UID: "201b8e1b-7c3d-4d9d-a47b-4dda67f6829b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.358593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g" (OuterVolumeSpecName: "kube-api-access-qjn5g") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "kube-api-access-qjn5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.361978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.381048 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.386410 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.394017 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.448607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.448654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.448678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config\") pod \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle\") pod \"99675934-2bb7-46f9-8e3c-49e818022424\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5727ecb0-0164-4b1f-9c45-80b381d6e6dd" (UID: "5727ecb0-0164-4b1f-9c45-80b381d6e6dd"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mvh\" (UniqueName: \"kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh\") pod \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.449999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-979dt\" (UniqueName: \"kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt\") pod \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmtts\" (UniqueName: \"kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qm4\" (UniqueName: \"kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle\") pod \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle\") pod \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs\") pod \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2nm\" (UniqueName: \"kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm\") pod \"99675934-2bb7-46f9-8e3c-49e818022424\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450537 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config\") pod \"99675934-2bb7-46f9-8e3c-49e818022424\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" (UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.450603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451161 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts\") pod \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\" 
(UID: \"9729aa6a-9110-421f-bd3a-d22cad6a75e9\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data\") pod \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\" (UID: \"5727ecb0-0164-4b1f-9c45-80b381d6e6dd\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzd4c\" (UniqueName: \"kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c\") pod \"82558fc9-417f-49d3-b522-207e0c215f92\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs\") pod \"82558fc9-417f-49d3-b522-207e0c215f92\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret\") pod \"99675934-2bb7-46f9-8e3c-49e818022424\" (UID: \"99675934-2bb7-46f9-8e3c-49e818022424\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs\") pod \"82558fc9-417f-49d3-b522-207e0c215f92\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data\") pod \"82558fc9-417f-49d3-b522-207e0c215f92\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle\") pod \"82558fc9-417f-49d3-b522-207e0c215f92\" (UID: \"82558fc9-417f-49d3-b522-207e0c215f92\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451489 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs\") pod \"62872782-8768-483f-b48b-298cf9973612\" (UID: \"62872782-8768-483f-b48b-298cf9973612\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.451541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data\") pod \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\" (UID: \"32dd6edd-9576-4aa1-9371-f9498e9c1a60\") " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452395 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod 
\"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452765 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjn5g\" (UniqueName: \"kubernetes.io/projected/806e42db-b6fa-4931-ab88-b2903c5e2830-kube-api-access-qjn5g\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452777 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452788 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452796 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452805 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/62872782-8768-483f-b48b-298cf9973612-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452814 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452822 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452833 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452842 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452852 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452864 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/806e42db-b6fa-4931-ab88-b2903c5e2830-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 
23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452874 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.452895 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.453324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.454262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.454423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.455495 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.455563 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.455541418 +0000 UTC m=+4464.775801786 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.455914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data" (OuterVolumeSpecName: "config-data") pod "5727ecb0-0164-4b1f-9c45-80b381d6e6dd" (UID: "5727ecb0-0164-4b1f-9c45-80b381d6e6dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.456008 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.456054 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.456036331 +0000 UTC m=+4464.776296619 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.486546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt" (OuterVolumeSpecName: "kube-api-access-979dt") pod "32dd6edd-9576-4aa1-9371-f9498e9c1a60" (UID: "32dd6edd-9576-4aa1-9371-f9498e9c1a60"). InnerVolumeSpecName "kube-api-access-979dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.488978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts" (OuterVolumeSpecName: "kube-api-access-vmtts") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "kube-api-access-vmtts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.489163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh" (OuterVolumeSpecName: "kube-api-access-j6mvh") pod "5727ecb0-0164-4b1f-9c45-80b381d6e6dd" (UID: "5727ecb0-0164-4b1f-9c45-80b381d6e6dd"). InnerVolumeSpecName "kube-api-access-j6mvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.489925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4" (OuterVolumeSpecName: "kube-api-access-d8qm4") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "kube-api-access-d8qm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.490872 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm" (OuterVolumeSpecName: "kube-api-access-rf2nm") pod "99675934-2bb7-46f9-8e3c-49e818022424" (UID: "99675934-2bb7-46f9-8e3c-49e818022424"). InnerVolumeSpecName "kube-api-access-rf2nm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.491173 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c" (OuterVolumeSpecName: "kube-api-access-vzd4c") pod "82558fc9-417f-49d3-b522-207e0c215f92" (UID: "82558fc9-417f-49d3-b522-207e0c215f92"). InnerVolumeSpecName "kube-api-access-vzd4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.499086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.510759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.523207 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " 
pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554703 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554725 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554737 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mvh\" (UniqueName: \"kubernetes.io/projected/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-kube-api-access-j6mvh\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554748 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-979dt\" (UniqueName: \"kubernetes.io/projected/32dd6edd-9576-4aa1-9371-f9498e9c1a60-kube-api-access-979dt\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554758 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmtts\" (UniqueName: \"kubernetes.io/projected/9729aa6a-9110-421f-bd3a-d22cad6a75e9-kube-api-access-vmtts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554768 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qm4\" (UniqueName: \"kubernetes.io/projected/62872782-8768-483f-b48b-298cf9973612-kube-api-access-d8qm4\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554784 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554795 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2nm\" (UniqueName: \"kubernetes.io/projected/99675934-2bb7-46f9-8e3c-49e818022424-kube-api-access-rf2nm\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554806 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554817 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62872782-8768-483f-b48b-298cf9973612-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554827 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9729aa6a-9110-421f-bd3a-d22cad6a75e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554837 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.554848 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzd4c\" (UniqueName: \"kubernetes.io/projected/82558fc9-417f-49d3-b522-207e0c215f92-kube-api-access-vzd4c\") on node \"crc\" 
DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555027 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555076 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.555059622 +0000 UTC m=+4464.875319910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555104 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555433 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555824 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.555803081 +0000 UTC m=+4464.876063369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555453 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555872 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.555863002 +0000 UTC m=+4464.876123420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555480 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555912 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.555906793 +0000 UTC m=+4464.876167081 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.555943 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.555934784 +0000 UTC m=+4464.876195072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.578194 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.586477 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.598723 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82558fc9-417f-49d3-b522-207e0c215f92" (UID: "82558fc9-417f-49d3-b522-207e0c215f92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616326 5030 generic.go:334] "Generic (PLEG): container finished" podID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerID="ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerDied","Data":"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"9729aa6a-9110-421f-bd3a-d22cad6a75e9","Type":"ContainerDied","Data":"f0fc87eef0d2e32bf33a81d2f671b20a960c3309c5bb24c43e584f8b008458c2"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616425 5030 scope.go:117] "RemoveContainer" containerID="ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.616550 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.620015 5030 generic.go:334] "Generic (PLEG): container finished" podID="62872782-8768-483f-b48b-298cf9973612" containerID="427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.620072 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerDied","Data":"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.620086 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.620089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"62872782-8768-483f-b48b-298cf9973612","Type":"ContainerDied","Data":"188c87c82ed83707455428ef810356b00cbf20190ff648f50d03a8e007170982"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.621668 5030 generic.go:334] "Generic (PLEG): container finished" podID="99675934-2bb7-46f9-8e3c-49e818022424" containerID="e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0" exitCode=137 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.621776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.623157 5030 generic.go:334] "Generic (PLEG): container finished" podID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.623199 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.623220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"32dd6edd-9576-4aa1-9371-f9498e9c1a60","Type":"ContainerDied","Data":"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.623245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"32dd6edd-9576-4aa1-9371-f9498e9c1a60","Type":"ContainerDied","Data":"6703f6e5c6c634151401763c5f33c32d706ead65ab5d5ab20313596e251a51e8"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.625857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.625883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5727ecb0-0164-4b1f-9c45-80b381d6e6dd","Type":"ContainerDied","Data":"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.625866 5030 generic.go:334] "Generic (PLEG): container finished" podID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" containerID="320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.625940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5727ecb0-0164-4b1f-9c45-80b381d6e6dd","Type":"ContainerDied","Data":"cf22e84fef8fc521d41c77e4d89847c129485609c42476eaeae65df07568fe7b"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.625948 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.627619 5030 generic.go:334] "Generic (PLEG): container finished" podID="82558fc9-417f-49d3-b522-207e0c215f92" containerID="2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.627726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"82558fc9-417f-49d3-b522-207e0c215f92","Type":"ContainerDied","Data":"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.627766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"82558fc9-417f-49d3-b522-207e0c215f92","Type":"ContainerDied","Data":"4ab3a9efafd0a567fbba8b8497336dc9e1bb8011c28ccaf0e83acbdd2110b1b7"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.627775 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.634924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerStarted","Data":"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.634961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerStarted","Data":"33a963852f5f74a4ebd888b9e512423d08c12d4c9b02d76e4d46f9e67720a6f1"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.638515 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_201b8e1b-7c3d-4d9d-a47b-4dda67f6829b/ovsdbserver-nb/0.log" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.638558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"201b8e1b-7c3d-4d9d-a47b-4dda67f6829b","Type":"ContainerDied","Data":"45bd5cb9bd91333e8b4b00eaebbb15d56343efd3440ed4badc13c410925f7ed0"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.638646 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.638694 5030 scope.go:117] "RemoveContainer" containerID="ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.640864 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerStarted","Data":"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.640902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerStarted","Data":"c1b7fd9d7ea1d3ccd3b70102bb4c482c5343dd2f77a86f37c2b7339baf55ecf9"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.643392 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_806e42db-b6fa-4931-ab88-b2903c5e2830/ovsdbserver-sb/0.log" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.643453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"806e42db-b6fa-4931-ab88-b2903c5e2830","Type":"ContainerDied","Data":"f68906bc74881c82cf7b9c1196f5ca0e9055ef5e41efb62cba87552a5fac3858"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.643494 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.646101 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerID="039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.646164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerDied","Data":"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.647150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99675934-2bb7-46f9-8e3c-49e818022424" (UID: "99675934-2bb7-46f9-8e3c-49e818022424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649253 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerID="a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649285 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerID="eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c" exitCode=2 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649293 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerID="c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3" exitCode=0 Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerDied","Data":"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649338 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerDied","Data":"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.649349 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerDied","Data":"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3"} Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.656188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.656410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.656934 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.657004 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.657032 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.657064 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.657104 5030 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.657112 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.657097648 +0000 UTC m=+4464.977357936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-public-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.657189 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.657215 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.6571992 +0000 UTC m=+4464.977459488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-internal-svc" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.657226 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.657239 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.675480 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.681692 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.682751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data" (OuterVolumeSpecName: "config-data") pod "32dd6edd-9576-4aa1-9371-f9498e9c1a60" (UID: "32dd6edd-9576-4aa1-9371-f9498e9c1a60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.688965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data" (OuterVolumeSpecName: "config-data") pod "82558fc9-417f-49d3-b522-207e0c215f92" (UID: "82558fc9-417f-49d3-b522-207e0c215f92"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698276 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698607 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="mysql-bootstrap" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698631 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="mysql-bootstrap" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698639 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="ovsdbserver-sb" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698645 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="ovsdbserver-sb" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698672 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82558fc9-417f-49d3-b522-207e0c215f92" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698679 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="82558fc9-417f-49d3-b522-207e0c215f92" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698691 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="mysql-bootstrap" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698697 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="mysql-bootstrap" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698710 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerName="nova-cell0-conductor-conductor" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698716 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerName="nova-cell0-conductor-conductor" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698727 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698733 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698742 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="ovsdbserver-nb" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698748 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="ovsdbserver-nb" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698764 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698776 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698794 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" containerName="memcached" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698800 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" containerName="memcached" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.698811 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698816 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698962 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698974 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="62872782-8768-483f-b48b-298cf9973612" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698986 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" containerName="galera" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.698993 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="ovsdbserver-sb" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.699005 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="82558fc9-417f-49d3-b522-207e0c215f92" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.699013 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" containerName="nova-cell0-conductor-conductor" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.699026 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" containerName="ovsdbserver-nb" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.699032 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" containerName="memcached" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.699042 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" containerName="openstack-network-exporter" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.701021 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.704836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.705173 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-l99kr" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.705438 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.705570 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.705685 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.706568 5030 scope.go:117] "RemoveContainer" containerID="ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.707434 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be\": container with ID starting with ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be not found: ID does not exist" containerID="ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.707463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be"} err="failed to get container status \"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be\": rpc error: code = NotFound desc = could not find container \"ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be\": container with ID starting with ee14274333d8d3015cc3322cf08423a90a30cfb67d14fd4de84a2516c22bd7be not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.707479 5030 scope.go:117] "RemoveContainer" containerID="ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.707729 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260\": container with ID starting with ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260 not found: ID does not exist" containerID="ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.707752 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260"} err="failed to get container status \"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260\": rpc error: code = NotFound desc = could not find container \"ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260\": container with ID starting with ad5c930bf89250ddc8f00ab17c3db6d55f29428feee79fe0649ce409b08c2260 not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.707763 5030 scope.go:117] "RemoveContainer" containerID="427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.714819 5030 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.736828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "62872782-8768-483f-b48b-298cf9973612" (UID: "62872782-8768-483f-b48b-298cf9973612"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.737499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5727ecb0-0164-4b1f-9c45-80b381d6e6dd" (UID: "5727ecb0-0164-4b1f-9c45-80b381d6e6dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.744900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dd6edd-9576-4aa1-9371-f9498e9c1a60" (UID: "32dd6edd-9576-4aa1-9371-f9498e9c1a60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758515 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758542 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758553 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758564 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758575 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/62872782-8768-483f-b48b-298cf9973612-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.758584 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32dd6edd-9576-4aa1-9371-f9498e9c1a60-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.799087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.805803 5030 scope.go:117] "RemoveContainer" containerID="3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.819833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "9729aa6a-9110-421f-bd3a-d22cad6a75e9" (UID: "9729aa6a-9110-421f-bd3a-d22cad6a75e9"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.820139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "99675934-2bb7-46f9-8e3c-49e818022424" (UID: "99675934-2bb7-46f9-8e3c-49e818022424"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.828543 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "99675934-2bb7-46f9-8e3c-49e818022424" (UID: "99675934-2bb7-46f9-8e3c-49e818022424"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.833756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "82558fc9-417f-49d3-b522-207e0c215f92" (UID: "82558fc9-417f-49d3-b522-207e0c215f92"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.838140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5727ecb0-0164-4b1f-9c45-80b381d6e6dd" (UID: "5727ecb0-0164-4b1f-9c45-80b381d6e6dd"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.838493 5030 scope.go:117] "RemoveContainer" containerID="427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.838873 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e\": container with ID starting with 427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e not found: ID does not exist" containerID="427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.838917 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e"} err="failed to get container status \"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e\": rpc error: code = NotFound desc = could not find container \"427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e\": container with ID starting with 427abcf023821e4c96cad1a1fc1371694fc425ad1dba2ec4e422b6da5144072e not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.838950 5030 scope.go:117] "RemoveContainer" containerID="3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.839225 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707\": container with ID starting with 3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707 not found: ID does not exist" containerID="3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.839247 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707"} err="failed to get container status \"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707\": rpc error: code = NotFound desc = could not find container \"3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707\": container with ID starting with 3dcaa04e26ec261129476f380d699ce7ad309301aeac511b8cb22006ac37f707 not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.839261 5030 scope.go:117] "RemoveContainer" containerID="e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.844584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.855907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "82558fc9-417f-49d3-b522-207e0c215f92" (UID: "82558fc9-417f-49d3-b522-207e0c215f92"). 
InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860650 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860734 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860797 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26lh\" (UniqueName: \"kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.860867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861287 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc 
kubenswrapper[5030]: I0120 23:49:50.861336 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861352 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861365 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5727ecb0-0164-4b1f-9c45-80b381d6e6dd-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861377 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9729aa6a-9110-421f-bd3a-d22cad6a75e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861388 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861400 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/82558fc9-417f-49d3-b522-207e0c215f92-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.861414 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99675934-2bb7-46f9-8e3c-49e818022424-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.862565 5030 scope.go:117] "RemoveContainer" containerID="e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.863486 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0\": container with ID starting with e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0 not found: ID does not exist" containerID="e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.863639 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0"} err="failed to get container status \"e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0\": rpc error: code = NotFound desc = could not find container \"e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0\": container with ID starting with e39fcd1cdfdc87e110da28235413a92eda0b8fd8acd8844e248a0cfc858598d0 not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.863838 5030 scope.go:117] "RemoveContainer" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.879167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "806e42db-b6fa-4931-ab88-b2903c5e2830" (UID: "806e42db-b6fa-4931-ab88-b2903c5e2830"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.895690 5030 scope.go:117] "RemoveContainer" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.896427 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd\": container with ID starting with 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd not found: ID does not exist" containerID="8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.896457 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd"} err="failed to get container status \"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd\": rpc error: code = NotFound desc = could not find container \"8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd\": container with ID starting with 8e4aa41e862bad15dfb9bad59d2ea665a3c316dbbe2ba72fc4b2666623f97fcd not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.896477 5030 scope.go:117] "RemoveContainer" containerID="320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.956460 5030 scope.go:117] "RemoveContainer" containerID="320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782" Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.958799 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782\": container with ID starting with 320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782 not found: ID does not exist" containerID="320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.958836 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782"} err="failed to get container status \"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782\": rpc error: code = NotFound desc = could not find container \"320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782\": container with ID starting with 320e734e3ff9af79dbcc22005042dcf98c9fab6ecad6918286b22dbf79009782 not found: ID does not exist" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.958859 5030 scope.go:117] "RemoveContainer" containerID="2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x26lh\" (UniqueName: \"kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 
20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963828 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.963918 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.964617 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.966543 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/806e42db-b6fa-4931-ab88-b2903c5e2830-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.969370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.972771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.972812 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.972865 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.972906 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:51.472892189 +0000 UTC m=+4463.793152477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.973073 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:50 crc kubenswrapper[5030]: E0120 23:49:50.973112 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:51.473105314 +0000 UTC m=+4463.793365602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.973287 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:50 crc kubenswrapper[5030]: I0120 23:49:50.988541 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.022361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.039737 5030 scope.go:117] "RemoveContainer" containerID="2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.050555 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d\": container with ID starting with 2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d not found: ID does not exist" containerID="2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.050612 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d"} err="failed to get container status \"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d\": rpc error: code = NotFound desc = could not find container \"2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d\": container with ID starting with 2a5e943d5d03a2a8aec5eb4d1bc3f3ab36ac421741ca2815d290edb3b2d56f0d not found: ID does not exist" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.050654 5030 scope.go:117] "RemoveContainer" containerID="3bdfa2c0f1f4d03ce20d64482e831290dd6ab19d7e2711ec93d9ac4cce9e8843" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.105029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26lh\" (UniqueName: \"kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.105306 5030 scope.go:117] "RemoveContainer" containerID="d14e9008105657463a596aefde91ce6bff6bcde232fe9380f40c4a188e349145" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.108872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:51 crc 
kubenswrapper[5030]: I0120 23:49:51.123146 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.132303 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.145706 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.147335 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.152961 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.156349 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.156533 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.156666 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-4j24t" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.156775 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.162475 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.165510 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.189984 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.217308 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.217354 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.255381 5030 scope.go:117] "RemoveContainer" containerID="6b834b707d40e4da7564fea909b7b96377cf70ecece5fdecf7803fb75747167e" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.256085 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.263752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.270685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.277564 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.281024 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.284758 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.285947 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.286008 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.286148 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.287457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.287542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.288159 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-rp726" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.288742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll68r\" (UniqueName: \"kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289182 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf24j\" (UniqueName: \"kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.289331 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.290566 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.290845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.292553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.293100 5030 scope.go:117] "RemoveContainer" containerID="d42829f2ade98d7dd79e571500a85a3e1b4edf65a779d6c65138c90834bf5c89" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.293374 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.300779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.309545 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.318738 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.325409 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.325889 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="sg-core" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.325902 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="sg-core" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.325929 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="proxy-httpd" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.325936 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="proxy-httpd" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.325955 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-central-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.325961 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-central-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.325981 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-notification-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.325989 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-notification-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.326158 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="proxy-httpd" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.326174 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="sg-core" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.326181 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-notification-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.326190 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerName="ceilometer-central-agent" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.331586 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.334077 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.335085 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-f8qjh" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.338243 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.338316 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.339904 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.341188 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.349680 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.356643 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.361225 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.364398 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.364686 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-s5kbg" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.368249 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.369304 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88xwb\" (UniqueName: \"kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392430 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.392602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.393028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.393363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.396773 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle\") pod \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\" (UID: \"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03\") " Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398032 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljn5g\" (UniqueName: \"kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mklb2\" (UniqueName: \"kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdn6w\" (UniqueName: \"kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w\") pod \"memcached-0\" (UID: 
\"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398609 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398698 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398964 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.399007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll68r\" (UniqueName: \"kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.399032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.399075 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.399185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts" (OuterVolumeSpecName: "scripts") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.400247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.401397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.399311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.398687 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf24j\" (UniqueName: \"kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" 
Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.402995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403083 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " 
pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.403635 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403657 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.403702 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:51.903687579 +0000 UTC m=+4464.223947867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403877 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.403894 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.409946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.414201 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb" (OuterVolumeSpecName: "kube-api-access-88xwb") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "kube-api-access-88xwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.414250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.416309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.427520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll68r\" (UniqueName: \"kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r\") pod \"nova-cell0-conductor-0\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.430541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.438252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf24j\" (UniqueName: \"kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.441379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.481392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.494818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.499071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data" (OuterVolumeSpecName: "config-data") pod "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" (UID: "4c4e36db-cf6b-4863-a6aa-dfe6388e5c03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87st\" (UniqueName: \"kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdn6w\" (UniqueName: \"kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505326 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.505885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.506322 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.506366 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.00635213 +0000 UTC m=+4464.326612418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.506415 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.506635 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.506667 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.006658337 +0000 UTC m=+4464.326918625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506694 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.506425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.507150 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.007084337 +0000 UTC m=+4464.327344615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507464 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljn5g\" (UniqueName: \"kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507682 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mklb2\" (UniqueName: \"kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507784 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507796 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507810 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507820 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.507829 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88xwb\" (UniqueName: \"kubernetes.io/projected/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03-kube-api-access-88xwb\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.507849 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.507878 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.007868807 +0000 UTC m=+4464.328129095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.508018 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.508060 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.508088 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.508079232 +0000 UTC m=+4464.828339520 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.510229 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.510273 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.510262365 +0000 UTC m=+4464.830522653 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.511131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.513258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.518249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.520575 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.520578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.525215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mklb2\" (UniqueName: \"kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.525836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljn5g\" (UniqueName: \"kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.525847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdn6w\" (UniqueName: \"kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.538293 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.578352 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609461 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609633 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87st\" (UniqueName: \"kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.609932 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.610107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.610535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.610646 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.610691 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.11067784 +0000 UTC m=+4464.430938128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.610727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.611091 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.611127 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.111118101 +0000 UTC m=+4464.431378389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.614204 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.626881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87st\" (UniqueName: \"kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.639748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.667667 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" containerID="1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda" exitCode=0 Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.668598 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.668642 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerDied","Data":"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda"} Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.669018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"4c4e36db-cf6b-4863-a6aa-dfe6388e5c03","Type":"ContainerDied","Data":"649095794f9ad7ae4dcb6223b9267716fa2fc98ee77d546ec29536c8d5cb89ae"} Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.669043 5030 scope.go:117] "RemoveContainer" containerID="a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.676955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerStarted","Data":"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e"} Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.704908 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" podStartSLOduration=4.704892406 podStartE2EDuration="4.704892406s" podCreationTimestamp="2026-01-20 23:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:49:51.695908568 +0000 UTC m=+4464.016168856" watchObservedRunningTime="2026-01-20 23:49:51.704892406 +0000 UTC m=+4464.025152694" Jan 20 23:49:51 crc 
kubenswrapper[5030]: I0120 23:49:51.726837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerStarted","Data":"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400"} Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.749710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.802318 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.802780 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker-log" containerID="cri-o://16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881" gracePeriod=30 Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.803421 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker" containerID="cri-o://ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751" gracePeriod=30 Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.808837 5030 scope.go:117] "RemoveContainer" containerID="eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.830734 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.843663 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.845406 5030 scope.go:117] "RemoveContainer" containerID="1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.845924 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.847574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.848048 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.848276 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.857252 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.861658 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" podStartSLOduration=4.861598217 podStartE2EDuration="4.861598217s" podCreationTimestamp="2026-01-20 23:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:49:51.774990647 +0000 UTC m=+4464.095250945" watchObservedRunningTime="2026-01-20 23:49:51.861598217 +0000 UTC m=+4464.181858535" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.880002 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.880238 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener-log" containerID="cri-o://e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd" gracePeriod=30 Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.880725 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener" containerID="cri-o://ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d" gracePeriod=30 Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.891095 5030 scope.go:117] "RemoveContainer" containerID="c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920021 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5mpk\" (UniqueName: \"kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.920468 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.920520 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.920675 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.920576268 +0000 UTC m=+4465.240836556 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.923798 5030 scope.go:117] "RemoveContainer" containerID="a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.928286 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b\": container with ID starting with a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b not found: ID does not exist" containerID="a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.928330 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b"} err="failed to get container status \"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b\": rpc error: code = NotFound desc = could not find container \"a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b\": container with ID starting with a4af203238bd85addfe604690e3c292c8efda30dec6ea3736cd2d68577d11e6b not found: ID does not exist" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.928353 5030 scope.go:117] "RemoveContainer" containerID="eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.928996 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c\": container with ID starting with eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c not found: ID does not exist" containerID="eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.929019 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c"} err="failed to get container status \"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c\": rpc error: code = NotFound desc = could not find container \"eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c\": container with ID starting with eb9221952dc192f9c1f1f8e2fb015a4b92badc39abb4c32ab4746ab7b9ff7f2c not found: ID does not exist" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.929031 5030 scope.go:117] "RemoveContainer" containerID="1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.929382 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda\": container with ID starting with 1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda not found: ID does not exist" containerID="1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.929402 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda"} err="failed to get container status \"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda\": rpc error: code = NotFound desc = could not find container \"1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda\": container with ID starting with 1089f049d91fe37cc3fd52ef3dead85a955989169754a9e8689c66683255eeda not found: ID does not exist" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.929436 5030 scope.go:117] "RemoveContainer" containerID="c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3" Jan 20 23:49:51 crc kubenswrapper[5030]: E0120 23:49:51.929815 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3\": container with ID starting with c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3 not found: ID does not exist" containerID="c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.929834 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3"} err="failed to get container status \"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3\": rpc error: code = NotFound desc = could not find container \"c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3\": container with ID starting with c130e1e3f087863c14f1d0006c481f506719b0045564a503944f6c864d4040d3 not found: ID does not exist" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.972903 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201b8e1b-7c3d-4d9d-a47b-4dda67f6829b" path="/var/lib/kubelet/pods/201b8e1b-7c3d-4d9d-a47b-4dda67f6829b/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.973478 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dd6edd-9576-4aa1-9371-f9498e9c1a60" path="/var/lib/kubelet/pods/32dd6edd-9576-4aa1-9371-f9498e9c1a60/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.974015 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4e36db-cf6b-4863-a6aa-dfe6388e5c03" path="/var/lib/kubelet/pods/4c4e36db-cf6b-4863-a6aa-dfe6388e5c03/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.975372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5727ecb0-0164-4b1f-9c45-80b381d6e6dd" path="/var/lib/kubelet/pods/5727ecb0-0164-4b1f-9c45-80b381d6e6dd/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.975955 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62872782-8768-483f-b48b-298cf9973612" path="/var/lib/kubelet/pods/62872782-8768-483f-b48b-298cf9973612/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.977060 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806e42db-b6fa-4931-ab88-b2903c5e2830" path="/var/lib/kubelet/pods/806e42db-b6fa-4931-ab88-b2903c5e2830/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.977636 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82558fc9-417f-49d3-b522-207e0c215f92" path="/var/lib/kubelet/pods/82558fc9-417f-49d3-b522-207e0c215f92/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.978195 5030 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="9729aa6a-9110-421f-bd3a-d22cad6a75e9" path="/var/lib/kubelet/pods/9729aa6a-9110-421f-bd3a-d22cad6a75e9/volumes" Jan 20 23:49:51 crc kubenswrapper[5030]: I0120 23:49:51.979129 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99675934-2bb7-46f9-8e3c-49e818022424" path="/var/lib/kubelet/pods/99675934-2bb7-46f9-8e3c-49e818022424/volumes" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.022471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.022774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.022672 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.022909 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.02289001 +0000 UTC m=+4465.343150298 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.022993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023162 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5mpk\" (UniqueName: \"kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023252 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.023227228 +0000 UTC m=+4465.343487526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023295 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023579 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.023565986 +0000 UTC m=+4465.343826274 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023299 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ceilometer-internal-svc: secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.023616 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs podName:247c3cd7-bfa5-4883-bdac-ced00b376ca0 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:52.523609797 +0000 UTC m=+4464.843870085 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ceilometer-tls-certs" (UniqueName: "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs") pod "ceilometer-0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0") : secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.023993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.024085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.024146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.024394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.024543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.024640 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.025190 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.025173065 +0000 UTC m=+4465.345433353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.025072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.028818 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.029508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.030870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.039263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.052874 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5mpk\" (UniqueName: \"kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.081021 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:49:52 crc kubenswrapper[5030]: W0120 23:49:52.084381 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87584043_7d0e_475d_89ec_2f743b49b145.slice/crio-70d5d0c6f651e999014f7196349d603e480a9cd2506a2c434142c1d05c6f83a6 WatchSource:0}: Error finding container 70d5d0c6f651e999014f7196349d603e480a9cd2506a2c434142c1d05c6f83a6: Status 404 returned error can't find the container with id 70d5d0c6f651e999014f7196349d603e480a9cd2506a2c434142c1d05c6f83a6 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.126807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.126935 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.127083 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.127067387 +0000 UTC m=+4465.447327675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.127511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.127585 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.127629 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.1276024 +0000 UTC m=+4465.447862688 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.297137 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.121:8776/healthcheck\": read tcp 10.217.0.2:48040->10.217.1.121:8776: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.322296 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.166:8774/\": read tcp 10.217.0.2:32936->10.217.1.166:8774: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.322311 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.166:8774/\": read tcp 10.217.0.2:32924->10.217.1.166:8774: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.415334 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.420310 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.421574 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.421615 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="ovn-northd" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.538123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.538515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.538640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.538688 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.538718 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.538853 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.538943 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.538914858 +0000 UTC m=+4466.859175146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539081 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ceilometer-internal-svc: secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539120 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs podName:247c3cd7-bfa5-4883-bdac-ced00b376ca0 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.539113533 +0000 UTC m=+4465.859373821 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "ceilometer-tls-certs" (UniqueName: "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs") pod "ceilometer-0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0") : secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539158 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539178 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.539172024 +0000 UTC m=+4468.859432312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539228 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539258 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.539252166 +0000 UTC m=+4466.859512454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539298 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.539321 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.539312048 +0000 UTC m=+4468.859572336 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.576904 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.634144 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.162:8775/\": read tcp 10.217.0.2:34338->10.217.1.162:8775: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.634453 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.162:8775/\": read tcp 10.217.0.2:34352->10.217.1.162:8775: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.639816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.639919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.639955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.639974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ljj8\" (UniqueName: \"kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs\") pod \"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle\") pod 
\"8cf2fe09-370b-4904-9f96-676f484639a2\" (UID: \"8cf2fe09-370b-4904-9f96-676f484639a2\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.640810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.640990 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.641034 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.641018325 +0000 UTC m=+4468.961278603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.642746 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.642833 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.642815068 +0000 UTC m=+4468.963075356 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650284 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650361 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.650342001 +0000 UTC m=+4468.970602289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650405 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650428 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.650420553 +0000 UTC m=+4468.970680841 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650460 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.650478 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.650472244 +0000 UTC m=+4468.970732532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.651035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs" (OuterVolumeSpecName: "logs") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.651365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.651940 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.137:9292/healthcheck\": read tcp 10.217.0.2:60448->10.217.1.137:9292: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.651948 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.137:9292/healthcheck\": read tcp 10.217.0.2:60450->10.217.1.137:9292: read: connection reset by peer" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.652174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.663066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts" (OuterVolumeSpecName: "scripts") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.663238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8" (OuterVolumeSpecName: "kube-api-access-9ljj8") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "kube-api-access-9ljj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.710883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.726293 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.735016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data" (OuterVolumeSpecName: "config-data") pod "8cf2fe09-370b-4904-9f96-676f484639a2" (UID: "8cf2fe09-370b-4904-9f96-676f484639a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.740910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerDied","Data":"f2d6b9369ce716621b16493427f2b972be56c6d00d119998710fbf5b3df8d1c7"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.741956 5030 generic.go:334] "Generic (PLEG): container finished" podID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerID="f2d6b9369ce716621b16493427f2b972be56c6d00d119998710fbf5b3df8d1c7" exitCode=0 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.744292 5030 generic.go:334] "Generic (PLEG): container finished" podID="26c79cd4-48f4-4082-a364-e767b967212d" containerID="16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881" exitCode=143 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.744351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerDied","Data":"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.747250 5030 generic.go:334] "Generic (PLEG): container finished" podID="8cf2fe09-370b-4904-9f96-676f484639a2" containerID="08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa" exitCode=0 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.747282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerDied","Data":"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.747297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"8cf2fe09-370b-4904-9f96-676f484639a2","Type":"ContainerDied","Data":"1c46e108039b15a0fe7bf9973c2b0c7bd1a4858120af7d074e71d7f50ff3b411"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.747312 5030 scope.go:117] "RemoveContainer" containerID="08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.747402 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.751013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"95ca0841856856a63ff25d713e88ef8694f506c3cfc1946a717a1a49c91cfea1"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.751030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"70d5d0c6f651e999014f7196349d603e480a9cd2506a2c434142c1d05c6f83a6"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.751205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.754959 5030 generic.go:334] "Generic (PLEG): container finished" podID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerID="dce285993f85675d1e7573be37d820244bbab99b017ceb4a9b64b82a42bbdad3" exitCode=0 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.754991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerDied","Data":"dce285993f85675d1e7573be37d820244bbab99b017ceb4a9b64b82a42bbdad3"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.761423 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerID="e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd" exitCode=143 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.761454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerDied","Data":"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.762124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.762179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.762482 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.762517 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.762505532 +0000 UTC m=+4469.082765820 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-public-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.763278 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-internal-svc: secret "cert-keystone-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.763330 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.763315132 +0000 UTC m=+4469.083575420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-internal-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763360 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763371 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763380 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763393 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ljj8\" (UniqueName: \"kubernetes.io/projected/8cf2fe09-370b-4904-9f96-676f484639a2-kube-api-access-9ljj8\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763414 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763426 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763437 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf2fe09-370b-4904-9f96-676f484639a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.763447 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf2fe09-370b-4904-9f96-676f484639a2-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.781856 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.784393 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerID="d9cbf6d5b6e9e1557a747ebddf26e0d91cf97ac99872d10a07f8a705f8fd5859" exitCode=0 Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.784565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerDied","Data":"d9cbf6d5b6e9e1557a747ebddf26e0d91cf97ac99872d10a07f8a705f8fd5859"} Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.794805 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.7947816850000002 podStartE2EDuration="2.794781685s" podCreationTimestamp="2026-01-20 23:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:49:52.772152736 +0000 UTC m=+4465.092413024" watchObservedRunningTime="2026-01-20 23:49:52.794781685 +0000 UTC m=+4465.115041973" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.816324 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.858323 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.866293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.866556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.866816 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.867026 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.867119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whhd2\" (UniqueName: \"kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.867278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.867693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.868665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.868834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.868936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data\") pod \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\" (UID: \"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.869926 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.870056 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.873324 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs" (OuterVolumeSpecName: "logs") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.873517 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.877224 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2" (OuterVolumeSpecName: "kube-api-access-whhd2") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "kube-api-access-whhd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.878553 5030 scope.go:117] "RemoveContainer" containerID="79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.882052 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.882132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.889863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts" (OuterVolumeSpecName: "scripts") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.892348 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.893085 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.893681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-log" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.893810 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.893896 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.893977 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-httpd" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-httpd" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.894107 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894276 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.894360 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-api" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894425 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-api" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.894496 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894559 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894815 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-httpd" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" containerName="glance-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.894992 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" containerName="nova-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.895077 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api-log" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.895159 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" 
containerName="nova-api-api" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.895231 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" containerName="cinder-api" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.896243 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.901059 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.901563 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.908323 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.935272 5030 scope.go:117] "RemoveContainer" containerID="08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.942910 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa\": container with ID starting with 08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa not found: ID does not exist" containerID="08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.942950 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa"} err="failed to get container status \"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa\": rpc error: code = NotFound desc = could not find container \"08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa\": container with ID starting with 08efb75988c344c6ad8db318e7e0250f525d5625216651e8b4ae86babeea05fa not found: ID does not exist" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.942992 5030 scope.go:117] "RemoveContainer" containerID="79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.944733 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76\": container with ID starting with 79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76 not found: ID does not exist" containerID="79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.944785 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76"} err="failed to get container status \"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76\": rpc error: code = NotFound desc = could not find container \"79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76\": container with ID starting with 79c09103c60431ebdd981145e3c2b16138f9661ec8c5a1fbfa1b8844ea828e76 not found: ID does not exist" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.944810 5030 scope.go:117] "RemoveContainer" 
containerID="d9cbf6d5b6e9e1557a747ebddf26e0d91cf97ac99872d10a07f8a705f8fd5859" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.945140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.965681 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.968015 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data" (OuterVolumeSpecName: "config-data") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6cwm\" (UniqueName: \"kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs\") pod \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\" (UID: \"fff68923-cebb-4d2c-bf69-4355b0da8b6d\") " Jan 20 23:49:52 crc 
kubenswrapper[5030]: I0120 23:49:52.971844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971874 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqm6t\" (UniqueName: \"kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.971979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972102 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972303 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972320 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972329 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972338 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972347 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972356 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.972365 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whhd2\" (UniqueName: \"kubernetes.io/projected/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-kube-api-access-whhd2\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.973115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs" (OuterVolumeSpecName: "logs") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.973207 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: E0120 23:49:52.973255 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.973239384 +0000 UTC m=+4467.293499672 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.976380 5030 scope.go:117] "RemoveContainer" containerID="4692f8f2e0876360cbe854005c8dbab4b736e7fa64a44b994b00fec7f8624e10" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.981754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" (UID: "f1f606ed-8a00-4c89-b7cd-2bc1e166e44d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:52 crc kubenswrapper[5030]: I0120 23:49:52.987795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm" (OuterVolumeSpecName: "kube-api-access-d6cwm") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "kube-api-access-d6cwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.003603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data" (OuterVolumeSpecName: "config-data") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.017976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.039348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.042013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fff68923-cebb-4d2c-bf69-4355b0da8b6d" (UID: "fff68923-cebb-4d2c-bf69-4355b0da8b6d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.074442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.074514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.074655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.074691 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.074706 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqm6t\" (UniqueName: \"kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.074788 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.074769557 +0000 UTC m=+4467.395029845 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.074971 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.075029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075142 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075173 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075200 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.075192727 +0000 UTC m=+4467.395453015 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.075761 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075870 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs podName:b33621c6-2162-464f-853a-377357ad9f2f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:53.575854854 +0000 UTC m=+4465.896115142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs") pod "glance-default-external-api-0" (UID: "b33621c6-2162-464f-853a-377357ad9f2f") : secret "cert-glance-default-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075874 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.075952 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.075933596 +0000 UTC m=+4467.396193884 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.075075 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.077409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.077509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.077538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.077634 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.077653 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.077688 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:49:55.077678458 +0000 UTC m=+4467.397938746 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.077838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078130 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078175 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6cwm\" (UniqueName: \"kubernetes.io/projected/fff68923-cebb-4d2c-bf69-4355b0da8b6d-kube-api-access-d6cwm\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078189 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078202 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078213 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff68923-cebb-4d2c-bf69-4355b0da8b6d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078224 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.078232 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff68923-cebb-4d2c-bf69-4355b0da8b6d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.082135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.082161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.082437 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.082641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.097430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.107694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.110223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqm6t\" (UniqueName: \"kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.187970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.188169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.188350 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.188398 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.188383803 +0000 UTC m=+4467.508644091 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.188737 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.188761 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.188753702 +0000 UTC m=+4467.509013990 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.354806 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.449074 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502343 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle\") pod \"c375ff6f-68b7-42be-8e2b-800ad51039f5\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs\") pod \"c375ff6f-68b7-42be-8e2b-800ad51039f5\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") 
" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5479\" (UniqueName: \"kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data\") pod \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\" (UID: \"ff6971b9-88b5-4b5e-9b15-a105b0c38e18\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwst7\" (UniqueName: \"kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7\") pod \"c375ff6f-68b7-42be-8e2b-800ad51039f5\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502788 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data\") pod \"c375ff6f-68b7-42be-8e2b-800ad51039f5\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.502818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs\") pod \"c375ff6f-68b7-42be-8e2b-800ad51039f5\" (UID: \"c375ff6f-68b7-42be-8e2b-800ad51039f5\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.503267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs" (OuterVolumeSpecName: "logs") pod "c375ff6f-68b7-42be-8e2b-800ad51039f5" (UID: "c375ff6f-68b7-42be-8e2b-800ad51039f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.504086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs" (OuterVolumeSpecName: "logs") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.504417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.504520 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.504597 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c375ff6f-68b7-42be-8e2b-800ad51039f5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.509990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts" (OuterVolumeSpecName: "scripts") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.510100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479" (OuterVolumeSpecName: "kube-api-access-v5479") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "kube-api-access-v5479". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.510350 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.520500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.522258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7" (OuterVolumeSpecName: "kube-api-access-xwst7") pod "c375ff6f-68b7-42be-8e2b-800ad51039f5" (UID: "c375ff6f-68b7-42be-8e2b-800ad51039f5"). InnerVolumeSpecName "kube-api-access-xwst7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.563909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c375ff6f-68b7-42be-8e2b-800ad51039f5" (UID: "c375ff6f-68b7-42be-8e2b-800ad51039f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.568954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c375ff6f-68b7-42be-8e2b-800ad51039f5" (UID: "c375ff6f-68b7-42be-8e2b-800ad51039f5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.571299 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.572484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data" (OuterVolumeSpecName: "config-data") pod "c375ff6f-68b7-42be-8e2b-800ad51039f5" (UID: "c375ff6f-68b7-42be-8e2b-800ad51039f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.581007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.605633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d98fq\" (UniqueName: \"kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq\") pod \"6944830c-0fd3-4285-b2d7-acbc0265920f\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.605808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data\") pod \"6944830c-0fd3-4285-b2d7-acbc0265920f\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.605950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle\") pod \"6944830c-0fd3-4285-b2d7-acbc0265920f\" (UID: \"6944830c-0fd3-4285-b2d7-acbc0265920f\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606838 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwst7\" (UniqueName: \"kubernetes.io/projected/c375ff6f-68b7-42be-8e2b-800ad51039f5-kube-api-access-xwst7\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606856 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606892 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606918 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606932 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606947 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c375ff6f-68b7-42be-8e2b-800ad51039f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc 
kubenswrapper[5030]: I0120 23:49:53.606960 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606973 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606985 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5479\" (UniqueName: \"kubernetes.io/projected/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-kube-api-access-v5479\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.606997 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.608018 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ceilometer-internal-svc: secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.608049 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.608085 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs podName:247c3cd7-bfa5-4883-bdac-ced00b376ca0 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.608067724 +0000 UTC m=+4467.928328112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ceilometer-tls-certs" (UniqueName: "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs") pod "ceilometer-0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0") : secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.608106 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs podName:b33621c6-2162-464f-853a-377357ad9f2f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.608094225 +0000 UTC m=+4466.928354513 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs") pod "glance-default-external-api-0" (UID: "b33621c6-2162-464f-853a-377357ad9f2f") : secret "cert-glance-default-public-svc" not found Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.621807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq" (OuterVolumeSpecName: "kube-api-access-d98fq") pod "6944830c-0fd3-4285-b2d7-acbc0265920f" (UID: "6944830c-0fd3-4285-b2d7-acbc0265920f"). InnerVolumeSpecName "kube-api-access-d98fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.627357 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.628376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data" (OuterVolumeSpecName: "config-data") pod "ff6971b9-88b5-4b5e-9b15-a105b0c38e18" (UID: "ff6971b9-88b5-4b5e-9b15-a105b0c38e18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.636810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data" (OuterVolumeSpecName: "config-data") pod "6944830c-0fd3-4285-b2d7-acbc0265920f" (UID: "6944830c-0fd3-4285-b2d7-acbc0265920f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.642083 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6944830c-0fd3-4285-b2d7-acbc0265920f" (UID: "6944830c-0fd3-4285-b2d7-acbc0265920f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.674657 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.708139 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.708163 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6971b9-88b5-4b5e-9b15-a105b0c38e18-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.708174 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6944830c-0fd3-4285-b2d7-acbc0265920f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.708184 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d98fq\" (UniqueName: \"kubernetes.io/projected/6944830c-0fd3-4285-b2d7-acbc0265920f-kube-api-access-d98fq\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.708194 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.798080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c375ff6f-68b7-42be-8e2b-800ad51039f5","Type":"ContainerDied","Data":"fe04f526fd281e9e79d89f074e64d5a25b12ca027d1160ef754aa5dc6bbc16ab"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.798134 5030 scope.go:117] "RemoveContainer" 
containerID="f2d6b9369ce716621b16493427f2b972be56c6d00d119998710fbf5b3df8d1c7" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.798283 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.807892 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_b2e0e246-826d-4bef-a72a-64b6e0266965/ovn-northd/0.log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.807935 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerID="ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" exitCode=139 Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.807991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerDied","Data":"ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810302 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2zp4\" (UniqueName: \"kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.810551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom\") pod \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\" (UID: \"e0048c16-5c70-4aae-baf0-d9f25538e3e2\") " Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.811892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.814526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts" (OuterVolumeSpecName: "scripts") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.814712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.814965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4" (OuterVolumeSpecName: "kube-api-access-v2zp4") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). InnerVolumeSpecName "kube-api-access-v2zp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.818812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"fff68923-cebb-4d2c-bf69-4355b0da8b6d","Type":"ContainerDied","Data":"00cef13c135e11960a7dbb2e3be693db993cb165337f0f9bdfbdb166b386af88"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.818808 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.821563 5030 generic.go:334] "Generic (PLEG): container finished" podID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" exitCode=0 Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.821722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6944830c-0fd3-4285-b2d7-acbc0265920f","Type":"ContainerDied","Data":"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.821767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"6944830c-0fd3-4285-b2d7-acbc0265920f","Type":"ContainerDied","Data":"ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.821809 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.824850 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerID="7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a" exitCode=0 Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.824924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerDied","Data":"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.824957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"ff6971b9-88b5-4b5e-9b15-a105b0c38e18","Type":"ContainerDied","Data":"38ef4db4cd904e75810b9cb033726c2a23117637590a89dfc532a4ffbb5dce68"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.825035 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.831153 5030 scope.go:117] "RemoveContainer" containerID="b0f23f798796c9796da67506324bf9f776c7d41136bd05b183e42db700b9f13e" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.845558 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerID="539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb" exitCode=0 Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.845650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerDied","Data":"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.845680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"e0048c16-5c70-4aae-baf0-d9f25538e3e2","Type":"ContainerDied","Data":"57f57eab70ae5eac2734902fae4f4946bdc32ad4952a5e1e99f5664556ef3e3a"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.845900 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.853404 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.855535 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"f1f606ed-8a00-4c89-b7cd-2bc1e166e44d","Type":"ContainerDied","Data":"e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0"} Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.863777 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.878721 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.897593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.911557 5030 scope.go:117] "RemoveContainer" containerID="dce285993f85675d1e7573be37d820244bbab99b017ceb4a9b64b82a42bbdad3" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.912754 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.912775 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2zp4\" (UniqueName: \"kubernetes.io/projected/e0048c16-5c70-4aae-baf0-d9f25538e3e2-kube-api-access-v2zp4\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.912789 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e0048c16-5c70-4aae-baf0-d9f25538e3e2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.912798 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.912808 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.917479 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918008 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerName="nova-scheduler-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918026 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerName="nova-scheduler-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918040 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="cinder-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="cinder-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918068 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="probe" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918074 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="probe" 
Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918090 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918105 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918111 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-log" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918121 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-httpd" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918126 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-httpd" Jan 20 23:49:53 crc kubenswrapper[5030]: E0120 23:49:53.918139 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918145 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918311 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" containerName="nova-scheduler-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="probe" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-httpd" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918351 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" containerName="glance-log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918370 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" containerName="cinder-scheduler" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-log" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.918402 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" containerName="nova-metadata-metadata" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.919546 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.921956 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.922321 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.929929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data" (OuterVolumeSpecName: "config-data") pod "e0048c16-5c70-4aae-baf0-d9f25538e3e2" (UID: "e0048c16-5c70-4aae-baf0-d9f25538e3e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.951964 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.956397 5030 scope.go:117] "RemoveContainer" containerID="3ead8f8773eddb79d0b1807a8d7ef304eaad1f308dc8ad661b4d120d965ddd11" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.979132 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf2fe09-370b-4904-9f96-676f484639a2" path="/var/lib/kubelet/pods/8cf2fe09-370b-4904-9f96-676f484639a2/volumes" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.983937 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c375ff6f-68b7-42be-8e2b-800ad51039f5" path="/var/lib/kubelet/pods/c375ff6f-68b7-42be-8e2b-800ad51039f5/volumes" Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.990507 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:53 crc kubenswrapper[5030]: I0120 23:49:53.998366 5030 scope.go:117] "RemoveContainer" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.014108 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.015285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.015347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmzs\" (UniqueName: \"kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.015495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.015522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.015544 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.016001 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0048c16-5c70-4aae-baf0-d9f25538e3e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.029773 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.031105 5030 scope.go:117] "RemoveContainer" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.033125 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.036483 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8\": container with ID starting with ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8 not found: ID does not exist" containerID="ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.036534 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8"} err="failed to get container status \"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8\": rpc error: code = NotFound desc = could not find container \"ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8\": container with ID starting with ebfe07e049482e1df399bade68cc2fcafa2b703965c4a995dcd8f6b36a9d60a8 not found: ID does not exist" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.036560 5030 scope.go:117] "RemoveContainer" containerID="7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.037154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.037205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.037407 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.051388 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.061996 5030 scope.go:117] "RemoveContainer" containerID="c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 
23:49:54.071561 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.080443 5030 scope.go:117] "RemoveContainer" containerID="7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.081061 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a\": container with ID starting with 7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a not found: ID does not exist" containerID="7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.082141 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a"} err="failed to get container status \"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a\": rpc error: code = NotFound desc = could not find container \"7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a\": container with ID starting with 7cd9dbd240d3ef2a14c6d6bb20711d33050ff28e12fcb1a4972c439a6b1b2b7a not found: ID does not exist" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.082179 5030 scope.go:117] "RemoveContainer" containerID="c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.081332 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.082557 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1\": container with ID starting with c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1 not found: ID does not exist" containerID="c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.082589 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1"} err="failed to get container status \"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1\": rpc error: code = NotFound desc = could not find container \"c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1\": container with ID starting with c04f3732b98e193b6b53fc1957ae0a6d9a7d436e4e6b8a697a9012d0ac5cf9b1 not found: ID does not exist" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.082604 5030 scope.go:117] "RemoveContainer" containerID="039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.090005 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.099268 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.107279 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.110679 5030 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.113336 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.114384 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.118805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc 
kubenswrapper[5030]: I0120 23:49:54.119015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.119075 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmzs\" (UniqueName: \"kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.119174 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.125427 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.126065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.136002 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.149406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.159954 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.161590 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.163913 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.164064 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.163962 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.164147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-4xxv7" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.164020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.164244 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.169962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.178749 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6944830c_0fd3_4285_b2d7_acbc0265920f.slice/crio-ba1b4ad527572baee17b8dc6ba0d0001b2abd12eafd8911735e7c4661bc1b492\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0048c16_5c70_4aae_baf0_d9f25538e3e2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f606ed_8a00_4c89_b7cd_2bc1e166e44d.slice/crio-e5388e89ea2bdbf4703f3f2abec8968aaa13a50444466cc4a5b50b58c9e894a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6944830c_0fd3_4285_b2d7_acbc0265920f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6971b9_88b5_4b5e_9b15_a105b0c38e18.slice/crio-38ef4db4cd904e75810b9cb033726c2a23117637590a89dfc532a4ffbb5dce68\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6971b9_88b5_4b5e_9b15_a105b0c38e18.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0048c16_5c70_4aae_baf0_d9f25538e3e2.slice/crio-57f57eab70ae5eac2734902fae4f4946bdc32ad4952a5e1e99f5664556ef3e3a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f606ed_8a00_4c89_b7cd_2bc1e166e44d.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.181092 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.182729 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.184332 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.191567 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.220931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221020 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 
23:49:54.221297 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgnk5\" (UniqueName: \"kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221696 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp67c\" (UniqueName: \"kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.221866 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.221899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.221923 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.721904895 +0000 UTC m=+4467.042165183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.221978 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.222030 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.722011457 +0000 UTC m=+4467.042271835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222305 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfgn\" (UniqueName: \"kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.222588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.223553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325576 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325688 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgnk5\" (UniqueName: \"kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325919 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp67c\" (UniqueName: \"kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.325997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bfgn\" (UniqueName: \"kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326383 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326698 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " 
pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.326846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.326943 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.326992 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.826975264 +0000 UTC m=+4467.147235562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.327019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.333395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.335334 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.335514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.337503 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.337575 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.837553821 +0000 UTC m=+4467.157814109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.338524 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.338599 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs podName:c80f8119-49a7-4395-afb7-cdb94b8d6c35 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:54.838578946 +0000 UTC m=+4467.158839324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35") : secret "cert-glance-default-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.544680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.545816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.545862 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.546902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.547256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmzs\" (UniqueName: \"kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs\") pod \"nova-metadata-0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.547524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.547520 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvjr\" (UniqueName: 
\"kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.547942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.548059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.548366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.548769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.548840 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.550669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.551424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.551763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.551843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc 
kubenswrapper[5030]: I0120 23:49:54.551859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bfgn\" (UniqueName: \"kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn\") pod \"nova-scheduler-0\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.553247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgnk5\" (UniqueName: \"kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.553280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp67c\" (UniqueName: \"kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.563513 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.608259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.635676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.635894 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.635977 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.636125 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.636219 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.636303 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:49:58.636280077 +0000 UTC m=+4470.956540355 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.636323 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:58.636316648 +0000 UTC m=+4470.956576936 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.636534 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.637377 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs podName:b33621c6-2162-464f-853a-377357ad9f2f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.637364934 +0000 UTC m=+4468.957625222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs") pod "glance-default-external-api-0" (UID: "b33621c6-2162-464f-853a-377357ad9f2f") : secret "cert-glance-default-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.739262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.739369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.739824 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.739872 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.73985765 +0000 UTC m=+4468.060117938 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.740678 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.740708 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.74069902 +0000 UTC m=+4468.060959308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.760646 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.764846 5030 scope.go:117] "RemoveContainer" containerID="539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.770699 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.787109 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.789142 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.790983 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.799155 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.802281 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.806744 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_b2e0e246-826d-4bef-a72a-64b6e0266965/ovn-northd/0.log" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.806903 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.825836 5030 scope.go:117] "RemoveContainer" containerID="039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.826749 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2\": container with ID starting with 039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2 not found: ID does not exist" containerID="039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.826789 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2"} err="failed to get container status \"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2\": rpc error: code = NotFound desc = could not find container \"039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2\": container with ID starting with 039e7ef3eac22a9018b69c95007576efe560fae9bfde236edbf26c56343c30a2 not found: ID does not exist" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.826814 5030 scope.go:117] "RemoveContainer" containerID="539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.827778 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb\": container with ID starting with 539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb not found: ID does not exist" containerID="539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.827799 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb"} err="failed to get container status \"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb\": rpc error: code = NotFound desc = could not find container \"539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb\": container with ID starting with 539dbc34fb373341eff958e84967781ee544073c3550557667fd10768f210aeb not found: ID does not exist" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841356 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmj9c\" (UniqueName: \"kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir\") pod 
\"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841639 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.841992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hfz\" (UniqueName: \"kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842304 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.842919 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.843076 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.843166 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.843214 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs podName:c80f8119-49a7-4395-afb7-cdb94b8d6c35 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.843198196 +0000 UTC m=+4468.163458484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35") : secret "cert-glance-default-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.843575 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts" (OuterVolumeSpecName: "scripts") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.843995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config" (OuterVolumeSpecName: "config") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.851969 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.852059 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.85203495 +0000 UTC m=+4468.172295238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-public-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.853020 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: E0120 23:49:54.853229 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.853210799 +0000 UTC m=+4468.173471087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-internal-svc" not found Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.854865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c" (OuterVolumeSpecName: "kube-api-access-vmj9c") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "kube-api-access-vmj9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.871003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.902354 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_b2e0e246-826d-4bef-a72a-64b6e0266965/ovn-northd/0.log" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.902631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"b2e0e246-826d-4bef-a72a-64b6e0266965","Type":"ContainerDied","Data":"6699c0d88f30ad882905fac2311b9f3cfe64fc8e840c6009b6200b25aebe07e4"} Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.902667 5030 scope.go:117] "RemoveContainer" containerID="5ffdf2f065bf633d15e59103da0d96e1da51576f3b2d1cd447a73672d69bdc29" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.902762 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.914969 5030 generic.go:334] "Generic (PLEG): container finished" podID="87584043-7d0e-475d-89ec-2f743b49b145" containerID="95ca0841856856a63ff25d713e88ef8694f506c3cfc1946a717a1a49c91cfea1" exitCode=1 Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.915074 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"95ca0841856856a63ff25d713e88ef8694f506c3cfc1946a717a1a49c91cfea1"} Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.915611 5030 scope.go:117] "RemoveContainer" containerID="95ca0841856856a63ff25d713e88ef8694f506c3cfc1946a717a1a49c91cfea1" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22hfz\" (UniqueName: \"kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945398 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945635 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945731 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmj9c\" (UniqueName: \"kubernetes.io/projected/b2e0e246-826d-4bef-a72a-64b6e0266965-kube-api-access-vmj9c\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945745 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945754 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2e0e246-826d-4bef-a72a-64b6e0266965-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945764 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.945893 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.950154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.951811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:54 crc kubenswrapper[5030]: I0120 23:49:54.966720 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hfz\" (UniqueName: \"kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.045342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.047018 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.047272 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.047310 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.047299218 +0000 UTC m=+4471.367559506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.052999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts\") pod \"cinder-scheduler-0\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:55 crc kubenswrapper[5030]: W0120 23:49:55.067781 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9916ac3c_e85e_498a_a0cc_769ca26f84b0.slice/crio-e818ee58a9c8866803b2d800afba678cba00b813a0b3b8e5e9c4a8b76c8aea10 WatchSource:0}: Error finding container e818ee58a9c8866803b2d800afba678cba00b813a0b3b8e5e9c4a8b76c8aea10: Status 404 returned error can't find the container with id e818ee58a9c8866803b2d800afba678cba00b813a0b3b8e5e9c4a8b76c8aea10 Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.068794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.073848 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs podName:b2e0e246-826d-4bef-a72a-64b6e0266965 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:55.573815051 +0000 UTC m=+4467.894075339 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965") : error deleting /var/lib/kubelet/pods/b2e0e246-826d-4bef-a72a-64b6e0266965/volume-subpaths: remove /var/lib/kubelet/pods/b2e0e246-826d-4bef-a72a-64b6e0266965/volume-subpaths: no such file or directory Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.077863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.119652 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.148974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149137 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149208 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.14919104 +0000 UTC m=+4471.469451328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149234 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149284 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.149268152 +0000 UTC m=+4471.469528440 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.149148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.149349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.149380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149531 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.149557 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149564 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.149556559 +0000 UTC m=+4471.469816847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149604 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.149641 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.14963371 +0000 UTC m=+4471.469893998 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.150692 5030 scope.go:117] "RemoveContainer" containerID="ec45665b6a75193fb57359dd4199fd116084c74be755d905e8513881a7470510" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.251598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.251750 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.252323 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.252306001 +0000 UTC m=+4471.572566289 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.253573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.253793 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.253886 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.253862359 +0000 UTC m=+4471.574122727 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.308601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: W0120 23:49:55.330769 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7daf347c_1524_4050_b50e_deb2024da0cc.slice/crio-1c9a6d35ee19d31ab24156b86f222c57b195b06277a2e3a4b18222c4abea3ef5 WatchSource:0}: Error finding container 1c9a6d35ee19d31ab24156b86f222c57b195b06277a2e3a4b18222c4abea3ef5: Status 404 returned error can't find the container with id 1c9a6d35ee19d31ab24156b86f222c57b195b06277a2e3a4b18222c4abea3ef5 Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.367229 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.661971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") pod \"b2e0e246-826d-4bef-a72a-64b6e0266965\" (UID: \"b2e0e246-826d-4bef-a72a-64b6e0266965\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.662719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.667053 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ceilometer-internal-svc: secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.667130 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs podName:247c3cd7-bfa5-4883-bdac-ced00b376ca0 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.667112013 +0000 UTC m=+4471.987372291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "ceilometer-tls-certs" (UniqueName: "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs") pod "ceilometer-0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0") : secret "cert-ceilometer-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.682727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b2e0e246-826d-4bef-a72a-64b6e0266965" (UID: "b2e0e246-826d-4bef-a72a-64b6e0266965"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.711959 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.764559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs\") pod \"26c79cd4-48f4-4082-a364-e767b967212d\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.764701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom\") pod \"26c79cd4-48f4-4082-a364-e767b967212d\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.764746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hhdq\" (UniqueName: \"kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq\") pod \"26c79cd4-48f4-4082-a364-e767b967212d\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.764774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle\") pod \"26c79cd4-48f4-4082-a364-e767b967212d\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.764878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data\") pod \"26c79cd4-48f4-4082-a364-e767b967212d\" (UID: \"26c79cd4-48f4-4082-a364-e767b967212d\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.765235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.765311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.765332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs" (OuterVolumeSpecName: "logs") pod "26c79cd4-48f4-4082-a364-e767b967212d" (UID: "26c79cd4-48f4-4082-a364-e767b967212d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.765457 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.765511 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.765562 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.765546712 +0000 UTC m=+4470.085807000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.765662 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.765646774 +0000 UTC m=+4470.085907062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.765830 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26c79cd4-48f4-4082-a364-e767b967212d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.765874 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2e0e246-826d-4bef-a72a-64b6e0266965-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.768455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26c79cd4-48f4-4082-a364-e767b967212d" (UID: "26c79cd4-48f4-4082-a364-e767b967212d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.772744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq" (OuterVolumeSpecName: "kube-api-access-5hhdq") pod "26c79cd4-48f4-4082-a364-e767b967212d" (UID: "26c79cd4-48f4-4082-a364-e767b967212d"). InnerVolumeSpecName "kube-api-access-5hhdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.792375 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.812837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26c79cd4-48f4-4082-a364-e767b967212d" (UID: "26c79cd4-48f4-4082-a364-e767b967212d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.825089 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data" (OuterVolumeSpecName: "config-data") pod "26c79cd4-48f4-4082-a364-e767b967212d" (UID: "26c79cd4-48f4-4082-a364-e767b967212d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.848831 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.864298 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data\") pod \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnspk\" (UniqueName: \"kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk\") pod \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs\") pod \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle\") pod \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom\") pod \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\" (UID: \"b4909635-4c45-4cc9-af55-4df3dfa5a6ae\") " Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.867911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868078 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868130 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868238 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868257 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hhdq\" (UniqueName: \"kubernetes.io/projected/26c79cd4-48f4-4082-a364-e767b967212d-kube-api-access-5hhdq\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868269 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.868279 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26c79cd4-48f4-4082-a364-e767b967212d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.868365 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.868418 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.868402067 +0000 UTC m=+4470.188662355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-public-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.868862 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.868938 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.86891527 +0000 UTC m=+4470.189175558 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.869082 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.869372 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs podName:c80f8119-49a7-4395-afb7-cdb94b8d6c35 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.86935947 +0000 UTC m=+4470.189619848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35") : secret "cert-glance-default-internal-svc" not found Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.869803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs" (OuterVolumeSpecName: "logs") pod "b4909635-4c45-4cc9-af55-4df3dfa5a6ae" (UID: "b4909635-4c45-4cc9-af55-4df3dfa5a6ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.877465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b4909635-4c45-4cc9-af55-4df3dfa5a6ae" (UID: "b4909635-4c45-4cc9-af55-4df3dfa5a6ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.881112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk" (OuterVolumeSpecName: "kube-api-access-fnspk") pod "b4909635-4c45-4cc9-af55-4df3dfa5a6ae" (UID: "b4909635-4c45-4cc9-af55-4df3dfa5a6ae"). InnerVolumeSpecName "kube-api-access-fnspk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.892962 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893695 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893714 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893739 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="ovn-northd" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893747 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="ovn-northd" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893770 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="openstack-network-exporter" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893780 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="openstack-network-exporter" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893791 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener-log" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893799 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener-log" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893815 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker-log" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893824 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker-log" Jan 20 23:49:55 crc kubenswrapper[5030]: E0120 23:49:55.893840 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.893848 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.894083 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker-log" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.894112 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="ovn-northd" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.894124 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" containerName="openstack-network-exporter" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.894133 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 
23:49:55.894158 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerName="barbican-keystone-listener-log" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.894170 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c79cd4-48f4-4082-a364-e767b967212d" containerName="barbican-worker" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.895464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.900931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.900967 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.901123 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.901219 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-5hlhd" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.901762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4909635-4c45-4cc9-af55-4df3dfa5a6ae" (UID: "b4909635-4c45-4cc9-af55-4df3dfa5a6ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.907062 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.934889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data" (OuterVolumeSpecName: "config-data") pod "b4909635-4c45-4cc9-af55-4df3dfa5a6ae" (UID: "b4909635-4c45-4cc9-af55-4df3dfa5a6ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.946614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerStarted","Data":"cdc9fc78102b637f5c1b59cc5ee594e39a1ec8a8836d438b968bfb541b555a4e"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.949544 5030 generic.go:334] "Generic (PLEG): container finished" podID="26c79cd4-48f4-4082-a364-e767b967212d" containerID="ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751" exitCode=0 Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.949586 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerDied","Data":"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.949641 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.949665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l" event={"ID":"26c79cd4-48f4-4082-a364-e767b967212d","Type":"ContainerDied","Data":"b84507167787fa2b59d5f4711b61116736f4ffa02b95797c8bba2661aef5dd2b"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.949688 5030 scope.go:117] "RemoveContainer" containerID="ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.954613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.954783 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.956821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerStarted","Data":"b20f1977eee125aebc807931296fe0bddf05f42f458f1beec8791a1bbb2c7e0d"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.956889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerStarted","Data":"1c9a6d35ee19d31ab24156b86f222c57b195b06277a2e3a4b18222c4abea3ef5"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.959488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerStarted","Data":"8596413765ace3663ab1dbabdc220f74e3eac2172966214f82638524e7826651"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.959517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerStarted","Data":"c25d532fd4fcba60465f02b139d44d010a92cbabfea1746a9dfa54e3df14eee8"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.959527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerStarted","Data":"e818ee58a9c8866803b2d800afba678cba00b813a0b3b8e5e9c4a8b76c8aea10"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.961677 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" containerID="ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d" exitCode=0 Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.961756 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.969950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.970119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.970292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.970565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.970790 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.975955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvxz\" (UniqueName: \"kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976679 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976694 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnspk\" (UniqueName: \"kubernetes.io/projected/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-kube-api-access-fnspk\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976705 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976714 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.976724 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4909635-4c45-4cc9-af55-4df3dfa5a6ae-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.978594 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6944830c-0fd3-4285-b2d7-acbc0265920f" path="/var/lib/kubelet/pods/6944830c-0fd3-4285-b2d7-acbc0265920f/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.979312 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2e0e246-826d-4bef-a72a-64b6e0266965" path="/var/lib/kubelet/pods/b2e0e246-826d-4bef-a72a-64b6e0266965/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.980342 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0048c16-5c70-4aae-baf0-d9f25538e3e2" path="/var/lib/kubelet/pods/e0048c16-5c70-4aae-baf0-d9f25538e3e2/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.982245 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f606ed-8a00-4c89-b7cd-2bc1e166e44d" path="/var/lib/kubelet/pods/f1f606ed-8a00-4c89-b7cd-2bc1e166e44d/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.983102 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6971b9-88b5-4b5e-9b15-a105b0c38e18" path="/var/lib/kubelet/pods/ff6971b9-88b5-4b5e-9b15-a105b0c38e18/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.983787 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff68923-cebb-4d2c-bf69-4355b0da8b6d" path="/var/lib/kubelet/pods/fff68923-cebb-4d2c-bf69-4355b0da8b6d/volumes" Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.986059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerDied","Data":"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d"} Jan 20 23:49:55 crc kubenswrapper[5030]: I0120 23:49:55.986101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65" event={"ID":"b4909635-4c45-4cc9-af55-4df3dfa5a6ae","Type":"ContainerDied","Data":"937f0ec470fa44414da8e254940ef4be0d70ba32f2dc62a9d5caee5c6531fcff"} Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.004526 5030 scope.go:117] "RemoveContainer" containerID="16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.021600 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=3.021568593 podStartE2EDuration="3.021568593s" podCreationTimestamp="2026-01-20 23:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:49:56.018677903 +0000 UTC m=+4468.338938201" watchObservedRunningTime="2026-01-20 23:49:56.021568593 +0000 
UTC m=+4468.341828891" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.033169 5030 scope.go:117] "RemoveContainer" containerID="ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.034072 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751\": container with ID starting with ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751 not found: ID does not exist" containerID="ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.034113 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751"} err="failed to get container status \"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751\": rpc error: code = NotFound desc = could not find container \"ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751\": container with ID starting with ed609912d75fa708ca9829f491a4fdfce5eca751361677e9cfa3203e0a357751 not found: ID does not exist" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.034139 5030 scope.go:117] "RemoveContainer" containerID="16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.034590 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881\": container with ID starting with 16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881 not found: ID does not exist" containerID="16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.034647 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881"} err="failed to get container status \"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881\": rpc error: code = NotFound desc = could not find container \"16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881\": container with ID starting with 16c5c4277d4db587ebd20571f0eda8b1071259dcd5b2cc10f6127bebbbe1f881 not found: ID does not exist" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.034676 5030 scope.go:117] "RemoveContainer" containerID="ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.037330 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.037316815 podStartE2EDuration="3.037316815s" podCreationTimestamp="2026-01-20 23:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:49:55.996395852 +0000 UTC m=+4468.316656150" watchObservedRunningTime="2026-01-20 23:49:56.037316815 +0000 UTC m=+4468.357577103" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.053710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.062354 5030 scope.go:117] "RemoveContainer" 
containerID="e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.062484 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-fb4df5d95-xcn4l"] Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.073391 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.078807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.078903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.079022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.079144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvxz\" (UniqueName: \"kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.079192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.079211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.079277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.079987 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.080039 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:49:56.58002504 +0000 UTC m=+4468.900285328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.080529 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.080633 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.080678 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:56.580665256 +0000 UTC m=+4468.900925544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.081202 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.081243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.083450 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-58d55775d6-drf65"] Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.086069 5030 scope.go:117] "RemoveContainer" containerID="ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.087879 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d\": container with ID starting with ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d not found: ID does not exist" containerID="ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.087928 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d"} err="failed to get container status \"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d\": rpc error: code = NotFound desc = could not find container 
\"ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d\": container with ID starting with ed470bb8adfad334977662cf16520c4e04b1607943ea23451dcf79905020279d not found: ID does not exist" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.087959 5030 scope.go:117] "RemoveContainer" containerID="e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.088937 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd\": container with ID starting with e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd not found: ID does not exist" containerID="e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.088969 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd"} err="failed to get container status \"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd\": rpc error: code = NotFound desc = could not find container \"e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd\": container with ID starting with e9844f2bcfe577a131f628de6cb182cfcf200e6f70195cd5fe757c784f5c29dd not found: ID does not exist" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.601581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.601764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.601842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.601854 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-public-svc: secret "cert-placement-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.601967 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.601935631 +0000 UTC m=+4476.922195949 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602056 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602134 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.602109045 +0000 UTC m=+4476.922369373 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.602274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602663 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602735 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.602711451 +0000 UTC m=+4469.922971779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602756 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.602815 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:57.602797813 +0000 UTC m=+4469.923058131 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.645355 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.645798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvxz\" (UniqueName: \"kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704750 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704793 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.704892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705059 5030 secret.go:188] Couldn't get secret 
openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705111 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.705095204 +0000 UTC m=+4477.025355492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705606 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705667 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs podName:b33621c6-2162-464f-853a-377357ad9f2f nodeName:}" failed. No retries permitted until 2026-01-20 23:50:00.705654487 +0000 UTC m=+4473.025914775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs") pod "glance-default-external-api-0" (UID: "b33621c6-2162-464f-853a-377357ad9f2f") : secret "cert-glance-default-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705675 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705715 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.705704518 +0000 UTC m=+4477.025964806 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705610 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705749 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.705740399 +0000 UTC m=+4477.026000687 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705714 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705763 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705781 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs podName:58885f5a-2467-45da-b076-7cdb2952d2a9 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.70577344 +0000 UTC m=+4477.026033728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs") pod "barbican-api-6ffbb9cff4-gljqq" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9") : secret "cert-barbican-internal-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.705937 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.705911793 +0000 UTC m=+4477.026172091 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.807099 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.807159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.807397 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-keystone-public-svc: secret "cert-keystone-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.807447 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs podName:16b352a0-984a-4736-9ede-a3797059b56a nodeName:}" failed. No retries permitted until 2026-01-20 23:50:04.807433886 +0000 UTC m=+4477.127694174 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs") pod "keystone-d4c58fbf5-82sk5" (UID: "16b352a0-984a-4736-9ede-a3797059b56a") : secret "cert-keystone-public-svc" not found Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.813200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.962268 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:49:56 crc kubenswrapper[5030]: E0120 23:49:56.962855 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:49:56 crc kubenswrapper[5030]: I0120 23:49:56.982961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerStarted","Data":"b39481b9df711fc23477860a389138de4b98dc604215528415d3e6762dbf8608"} Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.626697 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.626841 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.627167 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.627148211 +0000 UTC m=+4471.947408499 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.627275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.627557 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.627740 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:49:59.627686885 +0000 UTC m=+4471.947947233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.831949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.832140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.832173 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.832322 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:01.832283028 +0000 UTC m=+4474.152543386 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.832388 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.832547 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:50:01.832507203 +0000 UTC m=+4474.152767541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.934654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.934794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935066 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.935112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935137 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-public-svc: secret "cert-cinder-public-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935197 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs podName:c80f8119-49a7-4395-afb7-cdb94b8d6c35 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:01.935161003 +0000 UTC m=+4474.255421341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35") : secret "cert-glance-default-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935273 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-cinder-internal-svc: secret "cert-cinder-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935278 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:50:01.935240825 +0000 UTC m=+4474.255501153 (durationBeforeRetry 4s). 
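The durationBeforeRetry values in these entries (1s, 2s, 4s, 8s) reflect the kubelet's per-operation backoff on failed volume mounts: each failed MountVolume.SetUp for the same volume roughly doubles the wait before the next attempt. A minimal Go sketch of that doubling progression, for illustration only (it is not kubelet source code; 8s is simply the longest delay visible in this excerpt, not necessarily the cap):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Illustrative reproduction of the doubling retry delay recorded as
	// durationBeforeRetry in the surrounding log entries.
	delay := 1 * time.Second
	maxSeen := 8 * time.Second // longest delay visible in this excerpt
	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
		if delay < maxSeen {
			delay *= 2
		}
	}
}
```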
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-public-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: E0120 23:49:57.935498 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs podName:13242275-7ac1-46df-a760-68acdd2cd9af nodeName:}" failed. No retries permitted until 2026-01-20 23:50:01.935477772 +0000 UTC m=+4474.255738100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs") pod "cinder-api-0" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af") : secret "cert-cinder-internal-svc" not found Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.990806 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c79cd4-48f4-4082-a364-e767b967212d" path="/var/lib/kubelet/pods/26c79cd4-48f4-4082-a364-e767b967212d/volumes" Jan 20 23:49:57 crc kubenswrapper[5030]: I0120 23:49:57.992861 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4909635-4c45-4cc9-af55-4df3dfa5a6ae" path="/var/lib/kubelet/pods/b4909635-4c45-4cc9-af55-4df3dfa5a6ae/volumes" Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.013816 5030 generic.go:334] "Generic (PLEG): container finished" podID="87584043-7d0e-475d-89ec-2f743b49b145" containerID="56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86" exitCode=1 Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.013919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86"} Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.014006 5030 scope.go:117] "RemoveContainer" containerID="95ca0841856856a63ff25d713e88ef8694f506c3cfc1946a717a1a49c91cfea1" Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.014989 5030 scope.go:117] "RemoveContainer" containerID="56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86" Jan 20 23:49:58 crc kubenswrapper[5030]: E0120 23:49:58.015539 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.018504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerStarted","Data":"9df222c59711d3fd2a93db293137b68d56cc3f6dc8f6d95807cf89f2a8e746bc"} Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.087499 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=4.087430507 podStartE2EDuration="4.087430507s" podCreationTimestamp="2026-01-20 23:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:49:58.069254546 +0000 UTC m=+4470.389514874" watchObservedRunningTime="2026-01-20 23:49:58.087430507 +0000 UTC m=+4470.407690835" Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.654313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:58 crc kubenswrapper[5030]: I0120 23:49:58.654723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:49:58 crc kubenswrapper[5030]: E0120 23:49:58.654528 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:58 crc kubenswrapper[5030]: E0120 23:49:58.654946 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:06.654932474 +0000 UTC m=+4478.975192762 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:49:58 crc kubenswrapper[5030]: E0120 23:49:58.654895 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:58 crc kubenswrapper[5030]: E0120 23:49:58.655319 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:06.655275423 +0000 UTC m=+4478.975535741 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.064114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.064399 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.064514 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:50:07.064486899 +0000 UTC m=+4479.384747227 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.167092 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.167411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167514 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.167556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167690 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167694 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.167657082 +0000 UTC m=+4479.487917410 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.167772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167831 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167876 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167898 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.167872337 +0000 UTC m=+4479.488132655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.167974 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.167949399 +0000 UTC m=+4479.488209817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.168016 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.16799697 +0000 UTC m=+4479.488257288 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.270820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.271121 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.271290 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.271257695 +0000 UTC m=+4479.591518023 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.271449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.271680 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.271773 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:07.271748437 +0000 UTC m=+4479.592008755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.563728 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.563830 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.680993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.681294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.681641 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.681924 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.681974 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:03.681957388 +0000 UTC m=+4476.002217686 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.682101 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:59 crc kubenswrapper[5030]: E0120 23:49:59.682191 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:03.682165343 +0000 UTC m=+4476.002425671 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.692699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.803395 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:49:59 crc kubenswrapper[5030]: I0120 23:49:59.966248 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:00 crc kubenswrapper[5030]: I0120 23:50:00.120453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:50:00 crc kubenswrapper[5030]: I0120 23:50:00.497003 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:00 crc kubenswrapper[5030]: I0120 23:50:00.579271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:50:00 crc kubenswrapper[5030]: I0120 23:50:00.579854 5030 scope.go:117] "RemoveContainer" containerID="56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86" Jan 20 23:50:00 crc kubenswrapper[5030]: E0120 23:50:00.580228 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:00 crc kubenswrapper[5030]: I0120 23:50:00.711789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:00 crc kubenswrapper[5030]: E0120 23:50:00.712058 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-public-svc: secret "cert-glance-default-public-svc" not found Jan 20 23:50:00 crc kubenswrapper[5030]: E0120 23:50:00.712158 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs podName:b33621c6-2162-464f-853a-377357ad9f2f nodeName:}" failed. No retries permitted until 2026-01-20 23:50:08.712106368 +0000 UTC m=+4481.032366686 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs") pod "glance-default-external-api-0" (UID: "b33621c6-2162-464f-853a-377357ad9f2f") : secret "cert-glance-default-public-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.066398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerStarted","Data":"ada35364535f89b6fcff77016022db7ec0598dcedc895845fc1db566dff4c7f2"} Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.784016 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.785026 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/cinder-api-0" podUID="13242275-7ac1-46df-a760-68acdd2cd9af" Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.835575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.835822 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.836206 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.836313 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:09.83628993 +0000 UTC m=+4482.156550248 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.836659 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-internal-svc: secret "cert-nova-internal-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.836951 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:09.836912934 +0000 UTC m=+4482.157173262 (durationBeforeRetry 8s). 
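The MountVolume.SetUp failures in these entries all trace to the same condition: the referenced TLS Secret does not yet exist in the openstack-kuttl-tests namespace, so the secret volume plugin cannot materialize the mount and the kubelet keeps retrying. A hypothetical diagnostic sketch using client-go (assuming kubeconfig access to this cluster; the secret names are copied from the entries above) that checks whether the missing certificates have been created:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: the default kubeconfig (~/.kube/config) points at this cluster.
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(config)
	if err != nil {
		panic(err)
	}

	// Secrets the kubelet reports as "not found" in the entries above.
	names := []string{
		"cert-glance-default-public-svc",
		"cert-nova-public-svc",
		"cert-ovn-metrics",
	}
	for _, name := range names {
		_, err := client.CoreV1().Secrets("openstack-kuttl-tests").
			Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			// Typically a NotFound error until cert-manager (or the operator)
			// has issued the certificate and created the Secret.
			fmt.Printf("%s: %v\n", name, err)
		} else {
			fmt.Printf("%s: present\n", name)
		}
	}
}
```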
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-internal-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.939270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.939604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.939749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.939747 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-glance-default-internal-svc: secret "cert-glance-default-internal-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: E0120 23:50:01.940001 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs podName:c80f8119-49a7-4395-afb7-cdb94b8d6c35 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:09.939977345 +0000 UTC m=+4482.260237643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs") pod "glance-default-internal-api-0" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35") : secret "cert-glance-default-internal-svc" not found Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.943519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:01 crc kubenswrapper[5030]: I0120 23:50:01.952841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.089472 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.089474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerStarted","Data":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.122245 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.255957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgnk5\" (UniqueName: \"kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256637 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.256749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle\") pod \"13242275-7ac1-46df-a760-68acdd2cd9af\" (UID: \"13242275-7ac1-46df-a760-68acdd2cd9af\") " Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.257592 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13242275-7ac1-46df-a760-68acdd2cd9af-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.258859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs" (OuterVolumeSpecName: "logs") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.261140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.261161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts" (OuterVolumeSpecName: "scripts") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.261581 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.262284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data" (OuterVolumeSpecName: "config-data") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.262275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.262494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.284940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5" (OuterVolumeSpecName: "kube-api-access-lgnk5") pod "13242275-7ac1-46df-a760-68acdd2cd9af" (UID: "13242275-7ac1-46df-a760-68acdd2cd9af"). InnerVolumeSpecName "kube-api-access-lgnk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359653 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359686 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359696 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359706 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359715 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359724 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13242275-7ac1-46df-a760-68acdd2cd9af-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359733 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13242275-7ac1-46df-a760-68acdd2cd9af-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:02 crc kubenswrapper[5030]: I0120 23:50:02.359743 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgnk5\" (UniqueName: \"kubernetes.io/projected/13242275-7ac1-46df-a760-68acdd2cd9af-kube-api-access-lgnk5\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.121157 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.125247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerStarted","Data":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.125326 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerStarted","Data":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.148795 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq"] Jan 20 23:50:03 crc kubenswrapper[5030]: E0120 23:50:03.150374 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" podUID="58885f5a-2467-45da-b076-7cdb2952d2a9" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.180002 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.181552 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.242881 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.277088 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.284693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.285663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv5bw\" (UniqueName: \"kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.285705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.285840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle\") pod \"barbican-api-777db55c66-sxd4j\" (UID: 
\"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.285880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.286040 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.286110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.286673 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.304109 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.305759 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.308239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.308347 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.308464 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.315197 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv5bw\" (UniqueName: \"kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387228 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 
20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387503 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g478r\" (UniqueName: \"kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387643 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387666 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.387711 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: 
I0120 23:50:03.388153 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.391152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.391276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.393661 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.394674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.400925 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.406358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv5bw\" (UniqueName: \"kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw\") pod \"barbican-api-777db55c66-sxd4j\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g478r\" (UniqueName: \"kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489718 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489740 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489787 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.489951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.490000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.490532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.490909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.493826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 
23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.494162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.494315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.494770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.495599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.497004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.506102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g478r\" (UniqueName: \"kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r\") pod \"cinder-api-0\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.506553 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.622110 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.694258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.694408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:03 crc kubenswrapper[5030]: E0120 23:50:03.694422 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:50:03 crc kubenswrapper[5030]: E0120 23:50:03.694506 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:11.694483467 +0000 UTC m=+4484.014743775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:50:03 crc kubenswrapper[5030]: E0120 23:50:03.694597 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:50:03 crc kubenswrapper[5030]: E0120 23:50:03.694684 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:11.694665041 +0000 UTC m=+4484.014925449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.866946 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.974258 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13242275-7ac1-46df-a760-68acdd2cd9af" path="/var/lib/kubelet/pods/13242275-7ac1-46df-a760-68acdd2cd9af/volumes" Jan 20 23:50:03 crc kubenswrapper[5030]: I0120 23:50:03.974749 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.131246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerStarted","Data":"493cd2b5907170a9970aacad3e9cd960c3ffbfdb54abe88696305ec096d41ff9"} Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.132256 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.132847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerStarted","Data":"6689e1c56b54a2951f9299ed2fb4a60a7bdb882bc64aefd868fdeb1ac8978756"} Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.156028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.305494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.305805 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.305901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kkdq\" (UniqueName: \"kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.305992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.306027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.307045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs" (OuterVolumeSpecName: "logs") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.308882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.308907 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data" (OuterVolumeSpecName: "config-data") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.310981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.312739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq" (OuterVolumeSpecName: "kube-api-access-5kkdq") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "kube-api-access-5kkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.408101 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.408131 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.408142 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kkdq\" (UniqueName: \"kubernetes.io/projected/58885f5a-2467-45da-b076-7cdb2952d2a9-kube-api-access-5kkdq\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.408153 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.408164 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58885f5a-2467-45da-b076-7cdb2952d2a9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.563672 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.564095 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.613331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.613812 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.613929 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.613994 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:50:20.613977762 +0000 UTC m=+4492.934238050 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.616213 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.715974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.716065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.716163 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.716162 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-ovndbs: secret "cert-neutron-ovndbs" not found Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.716246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.716312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.716244 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-public-svc: secret "cert-neutron-public-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.716221 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-neutron-internal-svc: secret "cert-neutron-internal-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.717031 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:20.71653288 +0000 UTC m=+4493.036793168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "ovndb-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-ovndbs" not found Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.717052 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:20.717046382 +0000 UTC m=+4493.037306670 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-public-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: E0120 23:50:04.717078 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs podName:d63800d7-7064-4d3f-9294-bcdbbd3957ec nodeName:}" failed. No retries permitted until 2026-01-20 23:50:20.717061933 +0000 UTC m=+4493.037322221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs") pod "neutron-5568cd6b4b-ms4dw" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec") : secret "cert-neutron-internal-svc" not found Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.719797 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.720676 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"barbican-api-6ffbb9cff4-gljqq\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.802549 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.816922 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.817110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") pod \"58885f5a-2467-45da-b076-7cdb2952d2a9\" (UID: \"58885f5a-2467-45da-b076-7cdb2952d2a9\") " Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.817350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.819466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.820555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "58885f5a-2467-45da-b076-7cdb2952d2a9" (UID: "58885f5a-2467-45da-b076-7cdb2952d2a9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.821552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"keystone-d4c58fbf5-82sk5\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.827148 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.919053 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:04 crc kubenswrapper[5030]: I0120 23:50:04.919087 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58885f5a-2467-45da-b076-7cdb2952d2a9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.095294 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.154427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerStarted","Data":"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169"} Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.158524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerStarted","Data":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.158865 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.165437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerStarted","Data":"aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa"} Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.165493 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.165513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerStarted","Data":"03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca"} Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.165583 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.168808 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.192583 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=10.432386961 podStartE2EDuration="14.192566608s" podCreationTimestamp="2026-01-20 23:49:51 +0000 UTC" firstStartedPulling="2026-01-20 23:50:00.495904093 +0000 UTC m=+4472.816164381" lastFinishedPulling="2026-01-20 23:50:04.25608374 +0000 UTC m=+4476.576344028" observedRunningTime="2026-01-20 23:50:05.177347189 +0000 UTC m=+4477.497607497" watchObservedRunningTime="2026-01-20 23:50:05.192566608 +0000 UTC m=+4477.512826896" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.212553 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podStartSLOduration=2.212534712 podStartE2EDuration="2.212534712s" podCreationTimestamp="2026-01-20 23:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:05.206879705 +0000 UTC m=+4477.527139993" watchObservedRunningTime="2026-01-20 23:50:05.212534712 +0000 UTC m=+4477.532795000" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.217271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.245654 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq"] Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.259099 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6ffbb9cff4-gljqq"] Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.577739 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.577792 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.626293 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:50:05 crc kubenswrapper[5030]: W0120 23:50:05.640132 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16b352a0_984a_4736_9ede_a3797059b56a.slice/crio-32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed WatchSource:0}: Error finding container 32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed: Status 404 returned error can't find the container with id 32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 
23:50:05.642037 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:05 crc kubenswrapper[5030]: I0120 23:50:05.977648 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58885f5a-2467-45da-b076-7cdb2952d2a9" path="/var/lib/kubelet/pods/58885f5a-2467-45da-b076-7cdb2952d2a9/volumes" Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.176352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" event={"ID":"16b352a0-984a-4736-9ede-a3797059b56a","Type":"ContainerStarted","Data":"1a322a5b0f4609dfb53e29d267cab28f1bd935ea7c08384b0f94ec463163dffc"} Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.176405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" event={"ID":"16b352a0-984a-4736-9ede-a3797059b56a","Type":"ContainerStarted","Data":"32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed"} Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.177203 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.179189 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerStarted","Data":"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd"} Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.215241 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" podStartSLOduration=18.215214306 podStartE2EDuration="18.215214306s" podCreationTimestamp="2026-01-20 23:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:06.207763036 +0000 UTC m=+4478.528023334" watchObservedRunningTime="2026-01-20 23:50:06.215214306 +0000 UTC m=+4478.535474634" Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.251441 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.251423095 podStartE2EDuration="3.251423095s" podCreationTimestamp="2026-01-20 23:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:06.242783385 +0000 UTC m=+4478.563043683" watchObservedRunningTime="2026-01-20 23:50:06.251423095 +0000 UTC m=+4478.571683403" Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.668295 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:06 crc kubenswrapper[5030]: I0120 23:50:06.668367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:06 crc kubenswrapper[5030]: E0120 23:50:06.668475 5030 secret.go:188] Couldn't get secret 
openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:50:06 crc kubenswrapper[5030]: E0120 23:50:06.668550 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:22.668532333 +0000 UTC m=+4494.988792621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovn-metrics" not found Jan 20 23:50:06 crc kubenswrapper[5030]: E0120 23:50:06.668613 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-nb-ovndbs: secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:50:06 crc kubenswrapper[5030]: E0120 23:50:06.668685 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs podName:bd8e50b7-5dd6-475d-80a9-69eff598bb18 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:22.668670336 +0000 UTC m=+4494.988930624 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "ovsdbserver-nb-tls-certs" (UniqueName: "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs") pod "ovsdbserver-nb-0" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18") : secret "cert-ovndbcluster-nb-ovndbs" not found Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.079664 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.079889 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-cell1-svc: secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.079980 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs podName:d465598f-140e-4d07-a886-9d5c96b3bbe3 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.079955224 +0000 UTC m=+4495.400215542 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs") pod "openstack-cell1-galera-0" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3") : secret "cert-galera-openstack-cell1-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.185795 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.186039 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-vencrypt: secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.190721 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.190603758 +0000 UTC m=+4495.510864086 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "vencrypt-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-vencrypt" not found Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.191561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.191784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.191906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.191793 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-novncproxy-cell1-public-svc: secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.192072 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs podName:b088bf19-0c85-4732-bc31-18e8bd1cd751 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.192044703 +0000 UTC m=+4495.512305081 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nova-novncproxy-tls-certs" (UniqueName: "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs") pod "nova-cell1-novncproxy-0" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751") : secret "cert-nova-novncproxy-cell1-public-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.191872 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-memcached-svc: secret "cert-memcached-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.192129 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-galera-openstack-svc: secret "cert-galera-openstack-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.192195 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs podName:5945d7b7-7f69-4c5e-a90f-5963d1ff010f nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.192160986 +0000 UTC m=+4495.512421364 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "memcached-tls-certs" (UniqueName: "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs") pod "memcached-0" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f") : secret "cert-memcached-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.192395 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs podName:0e6ec1e2-3869-4ced-86cf-55f3d2f26b83 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.19235556 +0000 UTC m=+4495.512615918 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "galera-tls-certs" (UniqueName: "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs") pod "openstack-galera-0" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83") : secret "cert-galera-openstack-svc" not found Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.197693 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.198044 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-central-agent" containerID="cri-o://c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" gracePeriod=30 Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.198204 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="sg-core" containerID="cri-o://677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" gracePeriod=30 Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.198222 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-notification-agent" containerID="cri-o://f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" gracePeriod=30 Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.198285 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="proxy-httpd" 
containerID="cri-o://a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" gracePeriod=30 Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.315498 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.316614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.317420 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.317508 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.317480995 +0000 UTC m=+4495.637741323 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovn-metrics" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.320369 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovndbcluster-sb-ovndbs: secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:50:07 crc kubenswrapper[5030]: E0120 23:50:07.327056 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs podName:58e259c6-360b-4eb5-9940-ceddedf1e275 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:23.327024407 +0000 UTC m=+4495.647284705 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "ovsdbserver-sb-tls-certs" (UniqueName: "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs") pod "ovsdbserver-sb-0" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275") : secret "cert-ovndbcluster-sb-ovndbs" not found Jan 20 23:50:07 crc kubenswrapper[5030]: I0120 23:50:07.959156 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5mpk\" (UniqueName: \"kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.139988 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.140387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.140811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd\") pod \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\" (UID: \"247c3cd7-bfa5-4883-bdac-ced00b376ca0\") " Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.141598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.141925 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.142007 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247c3cd7-bfa5-4883-bdac-ced00b376ca0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.146242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts" (OuterVolumeSpecName: "scripts") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.147883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk" (OuterVolumeSpecName: "kube-api-access-p5mpk") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "kube-api-access-p5mpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.183431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.193979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.213868 5030 generic.go:334] "Generic (PLEG): container finished" podID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" exitCode=0 Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.213898 5030 generic.go:334] "Generic (PLEG): container finished" podID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" exitCode=2 Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.213906 5030 generic.go:334] "Generic (PLEG): container finished" podID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" exitCode=0 Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.213914 5030 generic.go:334] "Generic (PLEG): container finished" podID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" exitCode=0 Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.213982 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerDied","Data":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerDied","Data":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerDied","Data":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214132 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerDied","Data":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"247c3cd7-bfa5-4883-bdac-ced00b376ca0","Type":"ContainerDied","Data":"ada35364535f89b6fcff77016022db7ec0598dcedc895845fc1db566dff4c7f2"} Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.214174 5030 scope.go:117] "RemoveContainer" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.216289 5030 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.243844 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.243880 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5mpk\" (UniqueName: \"kubernetes.io/projected/247c3cd7-bfa5-4883-bdac-ced00b376ca0-kube-api-access-p5mpk\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.243895 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.243907 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.243919 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.254128 5030 scope.go:117] "RemoveContainer" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.267019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data" (OuterVolumeSpecName: "config-data") pod "247c3cd7-bfa5-4883-bdac-ced00b376ca0" (UID: "247c3cd7-bfa5-4883-bdac-ced00b376ca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.277832 5030 scope.go:117] "RemoveContainer" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.321202 5030 scope.go:117] "RemoveContainer" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.345973 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247c3cd7-bfa5-4883-bdac-ced00b376ca0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.385202 5030 scope.go:117] "RemoveContainer" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.385749 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": container with ID starting with a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db not found: ID does not exist" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.385795 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} err="failed to get container status \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": rpc error: code = NotFound desc = could not find container \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": container with ID starting with a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.385822 5030 scope.go:117] "RemoveContainer" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.386295 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": container with ID starting with 677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c not found: ID does not exist" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.386333 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} err="failed to get container status \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": rpc error: code = NotFound desc = could not find container \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": container with ID starting with 677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.386364 5030 scope.go:117] "RemoveContainer" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.386950 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": container with ID starting with f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96 not found: ID does not exist" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.386974 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} err="failed to get container status \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": rpc error: code = NotFound desc = could not find container \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": container with ID starting with f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.386988 5030 scope.go:117] "RemoveContainer" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.387413 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": container with ID starting with c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8 not found: ID does not exist" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.387445 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} err="failed to get container status \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": rpc error: code = NotFound desc = could not find container \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": container with ID starting with c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.387464 5030 scope.go:117] "RemoveContainer" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388094 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} err="failed to get container status \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": rpc error: code = NotFound desc = could not find container \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": container with ID starting with a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388118 5030 scope.go:117] "RemoveContainer" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388459 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} err="failed to get container status \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": rpc error: code = NotFound desc = could not find container \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": container with ID starting with 
677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388484 5030 scope.go:117] "RemoveContainer" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388953 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} err="failed to get container status \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": rpc error: code = NotFound desc = could not find container \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": container with ID starting with f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.388971 5030 scope.go:117] "RemoveContainer" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.389431 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} err="failed to get container status \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": rpc error: code = NotFound desc = could not find container \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": container with ID starting with c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.389454 5030 scope.go:117] "RemoveContainer" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390056 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} err="failed to get container status \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": rpc error: code = NotFound desc = could not find container \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": container with ID starting with a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390079 5030 scope.go:117] "RemoveContainer" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390418 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} err="failed to get container status \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": rpc error: code = NotFound desc = could not find container \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": container with ID starting with 677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390438 5030 scope.go:117] "RemoveContainer" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390853 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} err="failed to get container status \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": rpc error: code = NotFound desc = could not find container \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": container with ID starting with f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.390884 5030 scope.go:117] "RemoveContainer" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.391429 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} err="failed to get container status \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": rpc error: code = NotFound desc = could not find container \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": container with ID starting with c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.391498 5030 scope.go:117] "RemoveContainer" containerID="a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.392065 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db"} err="failed to get container status \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": rpc error: code = NotFound desc = could not find container \"a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db\": container with ID starting with a419e8b56fd7166fd2c9dd1b5bbbe099a96b9c8873e208f8591536c2d7d6e7db not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.392093 5030 scope.go:117] "RemoveContainer" containerID="677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.392597 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c"} err="failed to get container status \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": rpc error: code = NotFound desc = could not find container \"677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c\": container with ID starting with 677612afc41b86acf3cb16a97d71bb7f17a387837c8f340e7e08ce17e6bd547c not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.392637 5030 scope.go:117] "RemoveContainer" containerID="f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.393045 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96"} err="failed to get container status \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": rpc error: code = NotFound desc = could not find container \"f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96\": container with ID starting with f529c88a395769023b64f4a7051d6e3964b0a918549fd2c767e3cd2ff8e89e96 not found: ID does not exist" Jan 
20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.393109 5030 scope.go:117] "RemoveContainer" containerID="c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.393443 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8"} err="failed to get container status \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": rpc error: code = NotFound desc = could not find container \"c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8\": container with ID starting with c6f98145b0709aabece6f938f63a39e8cb402681caacbb51bebfde0245206cf8 not found: ID does not exist" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.569980 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.589096 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.599570 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.600017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-central-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-central-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.600065 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-notification-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600073 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-notification-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.600089 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="proxy-httpd" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600098 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="proxy-httpd" Jan 20 23:50:08 crc kubenswrapper[5030]: E0120 23:50:08.600111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="sg-core" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="sg-core" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600354 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-central-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600372 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="ceilometer-notification-agent" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600391 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="proxy-httpd" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.600404 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" containerName="sg-core" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.602452 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.604524 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.605150 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.616702 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.619980 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.756383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.756763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.756918 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ctfq\" (UniqueName: \"kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.757575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ctfq\" (UniqueName: \"kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859437 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859454 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859494 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.859871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:08 crc kubenswrapper[5030]: I0120 23:50:08.860312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.044612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.044791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.045420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.045732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.045822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.045903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ctfq\" (UniqueName: \"kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq\") pod \"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.045912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.125588 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.234405 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.613407 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:09 crc kubenswrapper[5030]: W0120 23:50:09.613791 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb33621c6_2162_464f_853a_377357ad9f2f.slice/crio-5c8fdd64889a9a4aabaecd6a65c18c5c357b12c991957ef8d063993914a59613 WatchSource:0}: Error finding container 5c8fdd64889a9a4aabaecd6a65c18c5c357b12c991957ef8d063993914a59613: Status 404 returned error can't find the container with id 5c8fdd64889a9a4aabaecd6a65c18c5c357b12c991957ef8d063993914a59613 Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.713647 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:09 crc kubenswrapper[5030]: W0120 23:50:09.717184 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90dd7615_420e_4c08_a072_bb831bffaa87.slice/crio-c130dd97c2e9d2d904967293b0e779e5788e6e15fe6a7443e51305baad8edd56 WatchSource:0}: Error finding container c130dd97c2e9d2d904967293b0e779e5788e6e15fe6a7443e51305baad8edd56: Status 404 returned error can't find the container with id c130dd97c2e9d2d904967293b0e779e5788e6e15fe6a7443e51305baad8edd56 Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.879770 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.880014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: E0120 23:50:09.881102 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-nova-public-svc: secret "cert-nova-public-svc" not found Jan 20 23:50:09 crc kubenswrapper[5030]: E0120 23:50:09.881157 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs podName:a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:25.881142736 +0000 UTC m=+4498.201403024 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs") pod "nova-api-0" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01") : secret "cert-nova-public-svc" not found Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.890586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.964019 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:50:09 crc kubenswrapper[5030]: E0120 23:50:09.964279 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.974493 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247c3cd7-bfa5-4883-bdac-ced00b376ca0" path="/var/lib/kubelet/pods/247c3cd7-bfa5-4883-bdac-ced00b376ca0/volumes" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.984734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:09 crc kubenswrapper[5030]: I0120 23:50:09.989094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.053175 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.163692 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.263614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerStarted","Data":"c130dd97c2e9d2d904967293b0e779e5788e6e15fe6a7443e51305baad8edd56"} Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.268049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerStarted","Data":"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5"} Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.268092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerStarted","Data":"5c8fdd64889a9a4aabaecd6a65c18c5c357b12c991957ef8d063993914a59613"} Jan 20 23:50:10 crc kubenswrapper[5030]: I0120 23:50:10.518952 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.278966 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerStarted","Data":"22df7d634b6fcc4d2d4394d1e3fd73e43228c42a17debaad1858400944d85c3f"} Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.279554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerStarted","Data":"f36ebd96f2d963aad096c98d1642e51349d2a11763ae21b3692dbc6e8922dd1c"} Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.286241 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerStarted","Data":"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72"} Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.286273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerStarted","Data":"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093"} Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.288252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerStarted","Data":"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56"} Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.725070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:11 crc kubenswrapper[5030]: E0120 23:50:11.725322 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovn-metrics: secret "cert-ovn-metrics" not found Jan 20 23:50:11 crc kubenswrapper[5030]: I0120 23:50:11.725524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:11 crc kubenswrapper[5030]: E0120 23:50:11.725608 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:27.725583821 +0000 UTC m=+4500.045844119 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovn-metrics" not found Jan 20 23:50:11 crc kubenswrapper[5030]: E0120 23:50:11.725642 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-ovnnorthd-ovndbs: secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:50:11 crc kubenswrapper[5030]: E0120 23:50:11.725848 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs podName:01baff70-8147-462e-b9ca-a1ca5301b4b1 nodeName:}" failed. No retries permitted until 2026-01-20 23:50:27.725814626 +0000 UTC m=+4500.046075004 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "ovn-northd-tls-certs" (UniqueName: "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs") pod "ovn-northd-0" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1") : secret "cert-ovnnorthd-ovndbs" not found Jan 20 23:50:12 crc kubenswrapper[5030]: I0120 23:50:12.306771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerStarted","Data":"45defd247220b7c9d73f86491711fda5c0c833c5e9fda3087524c4f3b85ddafa"} Jan 20 23:50:12 crc kubenswrapper[5030]: I0120 23:50:12.311225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerStarted","Data":"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7"} Jan 20 23:50:12 crc kubenswrapper[5030]: I0120 23:50:12.342455 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=20.342434994 podStartE2EDuration="20.342434994s" podCreationTimestamp="2026-01-20 23:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:11.309988549 +0000 UTC m=+4483.630248837" watchObservedRunningTime="2026-01-20 23:50:12.342434994 +0000 UTC m=+4484.662695292" Jan 20 23:50:12 crc kubenswrapper[5030]: I0120 23:50:12.343326 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=19.343318846 podStartE2EDuration="19.343318846s" podCreationTimestamp="2026-01-20 23:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:12.343197313 +0000 UTC m=+4484.663457681" watchObservedRunningTime="2026-01-20 23:50:12.343318846 +0000 UTC m=+4484.663579154" Jan 20 23:50:13 crc kubenswrapper[5030]: I0120 23:50:13.326571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerStarted","Data":"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4"} Jan 20 23:50:13 crc kubenswrapper[5030]: I0120 23:50:13.350359 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.009439289 podStartE2EDuration="5.350341415s" podCreationTimestamp="2026-01-20 23:50:08 +0000 UTC" firstStartedPulling="2026-01-20 23:50:09.719231659 +0000 UTC m=+4482.039491947" lastFinishedPulling="2026-01-20 23:50:13.060133765 +0000 UTC m=+4485.380394073" observedRunningTime="2026-01-20 23:50:13.345344823 +0000 UTC m=+4485.665605121" watchObservedRunningTime="2026-01-20 23:50:13.350341415 +0000 UTC m=+4485.670601703" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.340039 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.517815 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.517822 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.573551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:14 crc kubenswrapper[5030]: E0120 23:50:14.580319 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs-tls-certs ovsdbserver-sb-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="58e259c6-360b-4eb5-9940-ceddedf1e275" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.765767 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:14 crc kubenswrapper[5030]: E0120 23:50:14.766443 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs-tls-certs ovsdbserver-nb-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="bd8e50b7-5dd6-475d-80a9-69eff598bb18" Jan 20 23:50:14 crc kubenswrapper[5030]: I0120 23:50:14.962048 5030 scope.go:117] "RemoveContainer" containerID="56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.204783 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.352709 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.352819 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.352849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d"} Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.354489 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.373571 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.391789 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534469 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534576 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x26lh\" (UniqueName: \"kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534603 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534694 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534813 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc 
kubenswrapper[5030]: I0120 23:50:15.534920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle\") pod \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\" (UID: \"bd8e50b7-5dd6-475d-80a9-69eff598bb18\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.535488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87st\" (UniqueName: \"kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st\") pod \"58e259c6-360b-4eb5-9940-ceddedf1e275\" (UID: \"58e259c6-360b-4eb5-9940-ceddedf1e275\") " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.534913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.535249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config" (OuterVolumeSpecName: "config") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.535642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts" (OuterVolumeSpecName: "scripts") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.535824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts" (OuterVolumeSpecName: "scripts") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.536130 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.536151 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.536161 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd8e50b7-5dd6-475d-80a9-69eff598bb18-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.536172 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.536820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.538761 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config" (OuterVolumeSpecName: "config") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.543706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.544133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.545827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "local-storage16-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.545824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.547299 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st" (OuterVolumeSpecName: "kube-api-access-l87st") pod "58e259c6-360b-4eb5-9940-ceddedf1e275" (UID: "58e259c6-360b-4eb5-9940-ceddedf1e275"). InnerVolumeSpecName "kube-api-access-l87st". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.552910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh" (OuterVolumeSpecName: "kube-api-access-x26lh") pod "bd8e50b7-5dd6-475d-80a9-69eff598bb18" (UID: "bd8e50b7-5dd6-475d-80a9-69eff598bb18"). InnerVolumeSpecName "kube-api-access-x26lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.604866 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.610892 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638877 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58e259c6-360b-4eb5-9940-ceddedf1e275-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638904 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638915 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l87st\" (UniqueName: \"kubernetes.io/projected/58e259c6-360b-4eb5-9940-ceddedf1e275-kube-api-access-l87st\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638927 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x26lh\" (UniqueName: \"kubernetes.io/projected/bd8e50b7-5dd6-475d-80a9-69eff598bb18-kube-api-access-x26lh\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638937 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638956 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638966 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.638979 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.681067 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.681249 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.742540 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:15 crc kubenswrapper[5030]: I0120 23:50:15.743122 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.091216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: E0120 23:50:16.092308 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs-tls-certs ovn-northd-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/ovn-northd-0" podUID="01baff70-8147-462e-b9ca-a1ca5301b4b1" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.372283 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.373093 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.373793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.395231 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.462356 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.492107 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.511133 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.512652 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.524170 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.524427 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.525033 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.525219 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-l99kr" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.537801 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.546852 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.555084 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.561867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle\") pod \"01baff70-8147-462e-b9ca-a1ca5301b4b1\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.561919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts\") pod \"01baff70-8147-462e-b9ca-a1ca5301b4b1\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.561979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config\") pod \"01baff70-8147-462e-b9ca-a1ca5301b4b1\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir\") pod \"01baff70-8147-462e-b9ca-a1ca5301b4b1\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562095 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvxz\" (UniqueName: 
\"kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz\") pod \"01baff70-8147-462e-b9ca-a1ca5301b4b1\" (UID: \"01baff70-8147-462e-b9ca-a1ca5301b4b1\") " Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts" (OuterVolumeSpecName: "scripts") pod "01baff70-8147-462e-b9ca-a1ca5301b4b1" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config" (OuterVolumeSpecName: "config") pod "01baff70-8147-462e-b9ca-a1ca5301b4b1" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562830 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562851 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01baff70-8147-462e-b9ca-a1ca5301b4b1-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.562945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "01baff70-8147-462e-b9ca-a1ca5301b4b1" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.564779 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.566643 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.567673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01baff70-8147-462e-b9ca-a1ca5301b4b1" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.567779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz" (OuterVolumeSpecName: "kube-api-access-4zvxz") pod "01baff70-8147-462e-b9ca-a1ca5301b4b1" (UID: "01baff70-8147-462e-b9ca-a1ca5301b4b1"). InnerVolumeSpecName "kube-api-access-4zvxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.569596 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.569900 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.570061 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.570126 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-s5kbg" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.589913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664683 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpws\" (UniqueName: 
\"kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664794 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664838 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdkq\" (UniqueName: \"kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.664946 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665209 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665495 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665747 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665772 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e259c6-360b-4eb5-9940-ceddedf1e275-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665789 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665805 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8e50b7-5dd6-475d-80a9-69eff598bb18-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665821 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665839 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.665853 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvxz\" (UniqueName: \"kubernetes.io/projected/01baff70-8147-462e-b9ca-a1ca5301b4b1-kube-api-access-4zvxz\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767156 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767390 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpws\" (UniqueName: \"kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767472 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767516 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdkq\" (UniqueName: \"kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " 
pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767794 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") device mount path \"/mnt/openstack/pv04\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.767916 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.768054 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") device mount path \"/mnt/openstack/pv16\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.768121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.769215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.769256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.769492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.769583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.770541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.782262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.786006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.786190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.788138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpws\" (UniqueName: \"kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.789264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.798240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.800140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdkq\" (UniqueName: \"kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq\") pod \"ovsdbserver-nb-0\" (UID: 
\"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.814452 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.815246 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"ovsdbserver-sb-0\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.831668 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.876709 5030 scope.go:117] "RemoveContainer" containerID="3a021e6e1de3258fbc43e1fe576effd6eb9ae3f6b6721159e550dce62dbe4db8" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.942370 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.955077 5030 scope.go:117] "RemoveContainer" containerID="25486648d351881e64da50149b9a74c4d78a6f3be0d43be4097ddd110093e64c" Jan 20 23:50:16 crc kubenswrapper[5030]: I0120 23:50:16.998121 5030 scope.go:117] "RemoveContainer" containerID="b494a9a55c3d113fd512a6d0e36e0c85cab0dd52b358ca282eeda6ddabb313cb" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.023761 5030 scope.go:117] "RemoveContainer" containerID="cfa5d12ec8dd35e4a892ca6be634b4e7bf04ed77c8dbf172c3f5aca0da154dd3" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.057417 5030 scope.go:117] "RemoveContainer" containerID="e8dcb212d5fdedc91a641f7a1f28c081da6ff7db705082af138b24c20baf2089" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.095144 5030 scope.go:117] "RemoveContainer" containerID="df80f7331fd2acd35e1f5223ad9813293a66e401c708fb7402e8df96840c4ac3" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.127193 5030 scope.go:117] "RemoveContainer" containerID="7d4d3b891364959f2992b7d0b16936c8ce989a4e6bfc2892f0a8fbde5313a0dc" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.167037 5030 scope.go:117] "RemoveContainer" containerID="0c5c3896b95122ba2528e0d7943d1eba9eaee2155abb09b973a5236b470342ce" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.187189 5030 scope.go:117] "RemoveContainer" containerID="78a21b2039a28bb9676a5f052208d7607b918ac03ca2faa1ba709a6784895363" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.212433 5030 scope.go:117] "RemoveContainer" containerID="435f0b805ae72f39e1e85f6d4ab34ed0af3b4b65094220981c7f3d947c20afd9" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.234271 5030 scope.go:117] "RemoveContainer" containerID="1abdfa8e1738c463a2fd304df58c250bf4c6932fce81a2f5d1066f83952f0f5c" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.327292 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.390945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerStarted","Data":"76d6439f2add6a1ad3a45258a3ebaa3b4d0647b9065b86998539762ef5b46e80"} Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.404346 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.406801 5030 generic.go:334] "Generic (PLEG): container finished" podID="87584043-7d0e-475d-89ec-2f743b49b145" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" exitCode=1 Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.406969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d"} Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.407007 5030 scope.go:117] "RemoveContainer" containerID="56275492582da8b8f8b152940db54aa4552246f3e96d5f8d4e3b4e2ebd331e86" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.407180 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.407364 5030 scope.go:117] "RemoveContainer" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" Jan 20 23:50:17 crc kubenswrapper[5030]: E0120 23:50:17.407595 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:17 crc kubenswrapper[5030]: W0120 23:50:17.413764 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26cd9f7a_0ecf_4b5e_aaee_fcc7d5b32848.slice/crio-5ea9498e4b598290f77c435bab88a811f81b8192261ec2cbc1eab93bba023fcc WatchSource:0}: Error finding container 5ea9498e4b598290f77c435bab88a811f81b8192261ec2cbc1eab93bba023fcc: Status 404 returned error can't find the container with id 5ea9498e4b598290f77c435bab88a811f81b8192261ec2cbc1eab93bba023fcc Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.507691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.524849 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.534853 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.537153 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.544404 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.544497 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-5hlhd" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.544733 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.545048 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.551743 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.628778 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686462 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rq2f\" (UniqueName: \"kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686649 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle\") pod 
\"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686793 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.686811 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/01baff70-8147-462e-b9ca-a1ca5301b4b1-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.788373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.788486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.788645 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.789393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.789487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.789512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.790041 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " 
pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.790205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.790242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rq2f\" (UniqueName: \"kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.790574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.792473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.794145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.797348 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.807533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rq2f\" (UniqueName: \"kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f\") pod \"ovn-northd-0\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.976908 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01baff70-8147-462e-b9ca-a1ca5301b4b1" path="/var/lib/kubelet/pods/01baff70-8147-462e-b9ca-a1ca5301b4b1/volumes" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.977531 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e259c6-360b-4eb5-9940-ceddedf1e275" path="/var/lib/kubelet/pods/58e259c6-360b-4eb5-9940-ceddedf1e275/volumes" Jan 20 23:50:17 crc kubenswrapper[5030]: I0120 23:50:17.978151 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8e50b7-5dd6-475d-80a9-69eff598bb18" path="/var/lib/kubelet/pods/bd8e50b7-5dd6-475d-80a9-69eff598bb18/volumes" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.025052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.117341 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:18 crc kubenswrapper[5030]: E0120 23:50:18.118088 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[galera-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="d465598f-140e-4d07-a886-9d5c96b3bbe3" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.421183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerStarted","Data":"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb"} Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.421790 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerStarted","Data":"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b"} Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.423980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.424957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerStarted","Data":"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f"} Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.424995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerStarted","Data":"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65"} Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.425009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerStarted","Data":"5ea9498e4b598290f77c435bab88a811f81b8192261ec2cbc1eab93bba023fcc"} Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.438829 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.457530 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=2.457507368 podStartE2EDuration="2.457507368s" podCreationTimestamp="2026-01-20 23:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:18.447113135 +0000 UTC m=+4490.767373423" watchObservedRunningTime="2026-01-20 23:50:18.457507368 +0000 UTC m=+4490.777767676" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.488004 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.487978416 podStartE2EDuration="2.487978416s" podCreationTimestamp="2026-01-20 23:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:18.473992377 +0000 UTC m=+4490.794252665" watchObservedRunningTime="2026-01-20 23:50:18.487978416 +0000 UTC m=+4490.808238704" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.497353 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:50:18 crc kubenswrapper[5030]: W0120 23:50:18.502205 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55537e5f_bb41_4899_a095_e87c7495ebdf.slice/crio-c3047d14db9cfa745977ce04166814820750b9affc0308b4acc75edd96c2c062 WatchSource:0}: Error finding container c3047d14db9cfa745977ce04166814820750b9affc0308b4acc75edd96c2c062: Status 404 returned error can't find the container with id c3047d14db9cfa745977ce04166814820750b9affc0308b4acc75edd96c2c062 Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf24j\" (UniqueName: \"kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.512982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config\") pod \"d465598f-140e-4d07-a886-9d5c96b3bbe3\" (UID: \"d465598f-140e-4d07-a886-9d5c96b3bbe3\") " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.513177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.513398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.513682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.513811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.514230 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.514265 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.514280 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d465598f-140e-4d07-a886-9d5c96b3bbe3-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.514296 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d465598f-140e-4d07-a886-9d5c96b3bbe3-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.515301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.516657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.517756 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.517788 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.518652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j" (OuterVolumeSpecName: "kube-api-access-tf24j") pod "d465598f-140e-4d07-a886-9d5c96b3bbe3" (UID: "d465598f-140e-4d07-a886-9d5c96b3bbe3"). InnerVolumeSpecName "kube-api-access-tf24j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.578751 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.579789 5030 scope.go:117] "RemoveContainer" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" Jan 20 23:50:18 crc kubenswrapper[5030]: E0120 23:50:18.580231 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.615702 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.615738 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf24j\" (UniqueName: \"kubernetes.io/projected/d465598f-140e-4d07-a886-9d5c96b3bbe3-kube-api-access-tf24j\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.615767 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.626854 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.642236 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:50:18 crc kubenswrapper[5030]: I0120 23:50:18.717502 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.125950 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.126208 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.177197 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.177306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.437816 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.437816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerStarted","Data":"1afd7b619c388f1b085904f3aa4e3b2164180d355c9e37efea98319e13c503d6"} Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.437886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerStarted","Data":"cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8"} Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.437899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerStarted","Data":"c3047d14db9cfa745977ce04166814820750b9affc0308b4acc75edd96c2c062"} Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.439220 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.439266 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.439277 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.466390 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=2.466369231 podStartE2EDuration="2.466369231s" podCreationTimestamp="2026-01-20 23:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:19.460309794 +0000 UTC m=+4491.780570082" watchObservedRunningTime="2026-01-20 23:50:19.466369231 +0000 UTC m=+4491.786629519" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.516693 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.532862 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.568862 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.569250 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.600320 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.605016 5030 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.627919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.636598 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.636793 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.639007 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-4j24t" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.639147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5bz\" (UniqueName: \"kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760795 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.760925 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 
crc kubenswrapper[5030]: I0120 23:50:19.760969 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.761113 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.761179 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d465598f-140e-4d07-a886-9d5c96b3bbe3-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.833009 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863028 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5bz\" (UniqueName: \"kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863404 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.863800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.864117 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.864585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.865389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.867472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.872199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.896113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5bz\" (UniqueName: \"kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.943917 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.966234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:19 crc kubenswrapper[5030]: I0120 23:50:19.988474 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d465598f-140e-4d07-a886-9d5c96b3bbe3" path="/var/lib/kubelet/pods/d465598f-140e-4d07-a886-9d5c96b3bbe3/volumes" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.053782 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.054122 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.089684 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.114551 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.246859 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.262114 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.451972 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.452217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.680286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") pod \"placement-7fdd8fb598-6247c\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:20 crc kubenswrapper[5030]: E0120 23:50:20.680412 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-placement-internal-svc: secret "cert-placement-internal-svc" not found Jan 20 23:50:20 crc kubenswrapper[5030]: E0120 23:50:20.682044 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs podName:83e561a5-31df-4e46-ac6b-9602ea09bcef nodeName:}" failed. No retries permitted until 2026-01-20 23:50:52.682028281 +0000 UTC m=+4525.002288569 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs") pod "placement-7fdd8fb598-6247c" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef") : secret "cert-placement-internal-svc" not found Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.714382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:50:20 crc kubenswrapper[5030]: W0120 23:50:20.715036 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf571fc_fd44_4b32_bc32_5ccd319a5157.slice/crio-392a99c47b75155bfabf1543efe845fd915bc665e86d59fee18e2aec6e7784ee WatchSource:0}: Error finding container 392a99c47b75155bfabf1543efe845fd915bc665e86d59fee18e2aec6e7784ee: Status 404 returned error can't find the container with id 392a99c47b75155bfabf1543efe845fd915bc665e86d59fee18e2aec6e7784ee Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.783371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.783470 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.783587 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: 
\"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.787350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.787414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.791886 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"neutron-5568cd6b4b-ms4dw\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:20 crc kubenswrapper[5030]: I0120 23:50:20.943507 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.070840 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7fdd8fb598-6247c"] Jan 20 23:50:21 crc kubenswrapper[5030]: E0120 23:50:21.071953 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[internal-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" podUID="83e561a5-31df-4e46-ac6b-9602ea09bcef" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.108994 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.111326 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.122945 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.198943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.199361 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.301177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc 
kubenswrapper[5030]: I0120 23:50:21.302016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302095 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302313 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.302709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.312048 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.312957 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 
23:50:21.314231 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.315951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.323090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.343211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p\") pod \"placement-5484cf4b86-wsff9\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.444100 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.452377 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.470174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerStarted","Data":"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84"} Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.470216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerStarted","Data":"392a99c47b75155bfabf1543efe845fd915bc665e86d59fee18e2aec6e7784ee"} Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.492817 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.493436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerStarted","Data":"fafd40d87fe0776a710b68e7cf7c6c26032068bba803c3ada0ccd6419170a76d"} Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.555065 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.719040 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.719085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kxtw\" (UniqueName: \"kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.719154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.722270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.722324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.722461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts\") pod \"83e561a5-31df-4e46-ac6b-9602ea09bcef\" (UID: \"83e561a5-31df-4e46-ac6b-9602ea09bcef\") " Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.722868 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs" (OuterVolumeSpecName: "logs") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.722988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw" (OuterVolumeSpecName: "kube-api-access-8kxtw") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "kube-api-access-8kxtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.723189 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e561a5-31df-4e46-ac6b-9602ea09bcef-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.723209 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kxtw\" (UniqueName: \"kubernetes.io/projected/83e561a5-31df-4e46-ac6b-9602ea09bcef-kube-api-access-8kxtw\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.724742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data" (OuterVolumeSpecName: "config-data") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.726894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.727770 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.729740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts" (OuterVolumeSpecName: "scripts") pod "83e561a5-31df-4e46-ac6b-9602ea09bcef" (UID: "83e561a5-31df-4e46-ac6b-9602ea09bcef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.824932 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.825231 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.825242 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.825252 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.832732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:21 crc kubenswrapper[5030]: I0120 23:50:21.943472 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.023764 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.265146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.265482 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.373803 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.503121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerStarted","Data":"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e"} Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.503405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerStarted","Data":"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30"} Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.503425 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.503435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerStarted","Data":"045b68bed21ea5278f19fe263879de21960af7eeff5e88c071234831b358a2af"} Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.506225 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.506246 5030 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.507112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerStarted","Data":"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde"} Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.507145 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.507156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerStarted","Data":"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15"} Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.507193 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7fdd8fb598-6247c" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.529776 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" podStartSLOduration=1.5297622849999999 podStartE2EDuration="1.529762285s" podCreationTimestamp="2026-01-20 23:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:22.527992232 +0000 UTC m=+4494.848252520" watchObservedRunningTime="2026-01-20 23:50:22.529762285 +0000 UTC m=+4494.850022573" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.554084 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" podStartSLOduration=34.554069644 podStartE2EDuration="34.554069644s" podCreationTimestamp="2026-01-20 23:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:22.543790135 +0000 UTC m=+4494.864050423" watchObservedRunningTime="2026-01-20 23:50:22.554069644 +0000 UTC m=+4494.874329932" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.616688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:22 crc kubenswrapper[5030]: E0120 23:50:22.617426 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[galera-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/openstack-galera-0" podUID="0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.622673 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7fdd8fb598-6247c"] Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.629343 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7fdd8fb598-6247c"] Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.632773 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 
23:50:22.645209 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.647010 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e561a5-31df-4e46-ac6b-9602ea09bcef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.698198 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.885545 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.932001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.962988 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:50:22 crc kubenswrapper[5030]: E0120 23:50:22.963217 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:50:22 crc kubenswrapper[5030]: I0120 23:50:22.994236 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.034959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.257577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.257870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.257972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.258136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " 
pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.266223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.266278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.266335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.266344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.407803 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.415795 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.516379 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.516756 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.526771 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.526794 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.533772 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.631828 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljn5g\" (UniqueName: \"kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672894 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.672948 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.673085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated\") pod \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\" (UID: \"0e6ec1e2-3869-4ced-86cf-55f3d2f26b83\") " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.673430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config" (OuterVolumeSpecName: "kolla-config") 
pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.674365 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.675410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.678144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.681127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g" (OuterVolumeSpecName: "kube-api-access-ljn5g") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "kube-api-access-ljn5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.684014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.684736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.684836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.695461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" (UID: "0e6ec1e2-3869-4ced-86cf-55f3d2f26b83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778123 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778160 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778192 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778203 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljn5g\" (UniqueName: \"kubernetes.io/projected/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-kube-api-access-ljn5g\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778214 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778224 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.778233 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.797331 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.879894 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.957149 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:23 crc kubenswrapper[5030]: W0120 23:50:23.966813 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5945d7b7_7f69_4c5e_a90f_5963d1ff010f.slice/crio-3090799522da9146273743f90f5d86391a3803602e9abf8157632a98d0d57e62 WatchSource:0}: Error finding container 3090799522da9146273743f90f5d86391a3803602e9abf8157632a98d0d57e62: Status 404 returned error can't find the container with id 3090799522da9146273743f90f5d86391a3803602e9abf8157632a98d0d57e62 Jan 20 23:50:23 crc kubenswrapper[5030]: I0120 23:50:23.977106 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e561a5-31df-4e46-ac6b-9602ea09bcef" path="/var/lib/kubelet/pods/83e561a5-31df-4e46-ac6b-9602ea09bcef/volumes" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.058410 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:24 crc kubenswrapper[5030]: W0120 23:50:24.060892 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb088bf19_0c85_4732_bc31_18e8bd1cd751.slice/crio-d75895c86ce313a62e53a2b81fe0a959946b40356bc637b58fa5727225fe8140 WatchSource:0}: Error finding container d75895c86ce313a62e53a2b81fe0a959946b40356bc637b58fa5727225fe8140: Status 404 returned error can't find the container with id d75895c86ce313a62e53a2b81fe0a959946b40356bc637b58fa5727225fe8140 Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.528584 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerID="9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84" exitCode=0 Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.528724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerDied","Data":"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84"} Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.531537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b088bf19-0c85-4732-bc31-18e8bd1cd751","Type":"ContainerStarted","Data":"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f"} Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.531587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b088bf19-0c85-4732-bc31-18e8bd1cd751","Type":"ContainerStarted","Data":"d75895c86ce313a62e53a2b81fe0a959946b40356bc637b58fa5727225fe8140"} Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.539425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5945d7b7-7f69-4c5e-a90f-5963d1ff010f","Type":"ContainerStarted","Data":"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e"} Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.539484 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.539507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5945d7b7-7f69-4c5e-a90f-5963d1ff010f","Type":"ContainerStarted","Data":"3090799522da9146273743f90f5d86391a3803602e9abf8157632a98d0d57e62"} Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.541094 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.579151 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.579230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.579811 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.579928 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.579995 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="barbican-api-log" containerStatusID={"Type":"cri-o","ID":"03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca"} pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" containerMessage="Container barbican-api-log failed liveness probe, will be restarted" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.580037 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="barbican-api" containerStatusID={"Type":"cri-o","ID":"aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa"} pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" containerMessage="Container barbican-api failed liveness probe, will be restarted" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.580059 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" containerID="cri-o://03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca" gracePeriod=30 Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.595256 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=33.59523474 podStartE2EDuration="33.59523474s" podCreationTimestamp="2026-01-20 23:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:24.583900575 +0000 UTC m=+4496.904160883" watchObservedRunningTime="2026-01-20 23:50:24.59523474 +0000 UTC m=+4496.915495038" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.687000 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=33.686982116 podStartE2EDuration="33.686982116s" podCreationTimestamp="2026-01-20 23:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:24.624142131 +0000 UTC m=+4496.944402419" 
watchObservedRunningTime="2026-01-20 23:50:24.686982116 +0000 UTC m=+4497.007242404" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.700844 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.715529 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.725836 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" containerID="cri-o://aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa" gracePeriod=30 Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.729910 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": EOF" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.730097 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": EOF" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.731022 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": EOF" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.742955 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.744497 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.746586 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.746925 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-f8qjh" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.747105 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.747109 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.750561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899664 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.899970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w56km\" (UniqueName: \"kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:24 crc kubenswrapper[5030]: I0120 23:50:24.900029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.001984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56km\" (UniqueName: \"kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.002409 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.003022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.003101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.003353 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.003723 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.012243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.012310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.021113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56km\" (UniqueName: \"kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km\") pod \"openstack-galera-0\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.029526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: 
\"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.062525 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.293067 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.338957 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.545480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerStarted","Data":"b3bc34edace7fc3eca37d67215afa4a9bd7b0e4340834c68cf70f7d37b951847"} Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.547286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerStarted","Data":"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe"} Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.550168 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerID="03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca" exitCode=143 Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.550259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerDied","Data":"03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca"} Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.568221 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=6.568202843 podStartE2EDuration="6.568202843s" podCreationTimestamp="2026-01-20 23:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:25.564704478 +0000 UTC m=+4497.884964766" watchObservedRunningTime="2026-01-20 23:50:25.568202843 +0000 UTC m=+4497.888463131" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.611802 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.611805 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.920027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.923998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"nova-api-0\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:25 crc kubenswrapper[5030]: I0120 23:50:25.929394 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:26 crc kubenswrapper[5030]: I0120 23:50:26.006293 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e6ec1e2-3869-4ced-86cf-55f3d2f26b83" path="/var/lib/kubelet/pods/0e6ec1e2-3869-4ced-86cf-55f3d2f26b83/volumes" Jan 20 23:50:26 crc kubenswrapper[5030]: I0120 23:50:26.457316 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:26 crc kubenswrapper[5030]: W0120 23:50:26.460366 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d0e7e8_cb45_4f3a_82f8_377b69cd8b01.slice/crio-49473b7bf39e44a4987c765b462b055a4031cf90a7237af4894caf5dd109ddcb WatchSource:0}: Error finding container 49473b7bf39e44a4987c765b462b055a4031cf90a7237af4894caf5dd109ddcb: Status 404 returned error can't find the container with id 49473b7bf39e44a4987c765b462b055a4031cf90a7237af4894caf5dd109ddcb Jan 20 23:50:26 crc kubenswrapper[5030]: I0120 23:50:26.561635 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerStarted","Data":"860fbedaf6f62ef518f67b4d9910fae4069d9b322765e36a8ffd936820b3bc72"} Jan 20 23:50:26 crc kubenswrapper[5030]: I0120 23:50:26.562879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerStarted","Data":"49473b7bf39e44a4987c765b462b055a4031cf90a7237af4894caf5dd109ddcb"} Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.571889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerStarted","Data":"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead"} Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.572287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerStarted","Data":"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee"} Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.593845 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=34.593822362 podStartE2EDuration="34.593822362s" podCreationTimestamp="2026-01-20 23:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:27.587654663 +0000 UTC m=+4499.907914961" watchObservedRunningTime="2026-01-20 23:50:27.593822362 +0000 UTC m=+4499.914082650" Jan 20 23:50:27 crc 
kubenswrapper[5030]: I0120 23:50:27.637785 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.637859 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.638424 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-api" containerStatusID={"Type":"cri-o","ID":"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd"} pod="openstack-kuttl-tests/cinder-api-0" containerMessage="Container cinder-api failed liveness probe, will be restarted" Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.638462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" containerID="cri-o://70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" gracePeriod=30 Jan 20 23:50:27 crc kubenswrapper[5030]: I0120 23:50:27.644843 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": EOF" Jan 20 23:50:28 crc kubenswrapper[5030]: I0120 23:50:28.416469 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:28 crc kubenswrapper[5030]: I0120 23:50:28.796165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:50:29 crc kubenswrapper[5030]: I0120 23:50:29.594333 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerID="860fbedaf6f62ef518f67b4d9910fae4069d9b322765e36a8ffd936820b3bc72" exitCode=0 Jan 20 23:50:29 crc kubenswrapper[5030]: I0120 23:50:29.594488 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerDied","Data":"860fbedaf6f62ef518f67b4d9910fae4069d9b322765e36a8ffd936820b3bc72"} Jan 20 23:50:29 crc kubenswrapper[5030]: I0120 23:50:29.739805 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.262222 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.262545 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.334835 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" 
containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.610942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerStarted","Data":"79303e278316bc7cf3193899577e73e1c0136d14a75ad0b5c2bf54b109f7b1d9"} Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.637754 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=6.637727573 podStartE2EDuration="6.637727573s" podCreationTimestamp="2026-01-20 23:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:30.630909687 +0000 UTC m=+4502.951170015" watchObservedRunningTime="2026-01-20 23:50:30.637727573 +0000 UTC m=+4502.957987891" Jan 20 23:50:30 crc kubenswrapper[5030]: I0120 23:50:30.936614 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:31 crc kubenswrapper[5030]: I0120 23:50:31.055191 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:50:31 crc kubenswrapper[5030]: I0120 23:50:31.962569 5030 scope.go:117] "RemoveContainer" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" Jan 20 23:50:31 crc kubenswrapper[5030]: E0120 23:50:31.963025 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:32 crc kubenswrapper[5030]: I0120 23:50:32.645587 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.410144 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.416975 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.444281 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.454522 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.508285 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get 
\"https://10.217.1.190:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.558200 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.567209 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.567455 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-log" containerID="cri-o://3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.567535 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-httpd" containerID="cri-o://83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.625119 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.625456 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-log" containerID="cri-o://aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.625532 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-api" containerID="cri-o://38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.636836 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" containerName="memcached" containerID="cri-o://243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.669650 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.669892 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.687716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.687977 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-log" containerID="cri-o://22df7d634b6fcc4d2d4394d1e3fd73e43228c42a17debaad1858400944d85c3f" 
gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.688127 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-httpd" containerID="cri-o://45defd247220b7c9d73f86491711fda5c0c833c5e9fda3087524c4f3b85ddafa" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.806668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.806874 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" containerID="cri-o://1a322a5b0f4609dfb53e29d267cab28f1bd935ea7c08384b0f94ec463163dffc" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.821022 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.173:5000/v3\": EOF" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.853832 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.854967 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.881379 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.986381 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.986797 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-api" containerID="cri-o://2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15" gracePeriod=30 Jan 20 23:50:33 crc kubenswrapper[5030]: I0120 23:50:33.987349 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" containerID="cri-o://242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde" gracePeriod=30 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011618 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011678 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc 
kubenswrapper[5030]: I0120 23:50:34.011727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrbg\" (UniqueName: \"kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.011927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.047893 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.048130 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.049597 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.069144 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrbg\" (UniqueName: \"kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.116985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.117016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.126115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.128726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.129822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.130418 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.136138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.136606 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.144272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.157582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrbg\" (UniqueName: \"kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg\") pod \"keystone-5b98cb8978-zrv7q\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.170354 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.180118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.220876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rt76\" (UniqueName: \"kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.221031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.221213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.221663 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.234721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.234779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.234807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: E0120 23:50:34.274868 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 
23:50:34 crc kubenswrapper[5030]: E0120 23:50:34.277049 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:50:34 crc kubenswrapper[5030]: E0120 23:50:34.278747 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:50:34 crc kubenswrapper[5030]: E0120 23:50:34.278854 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerName="nova-cell1-conductor-conductor" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.339871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.339914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.339937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.339956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.340061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt76\" (UniqueName: \"kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.340080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc 
kubenswrapper[5030]: I0120 23:50:34.340112 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.344875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.344941 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.346313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.347458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.347594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.348249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.363471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt76\" (UniqueName: \"kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76\") pod \"neutron-78f766ffcd-dv8xk\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.367786 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.662576 5030 generic.go:334] "Generic (PLEG): container finished" podID="b33621c6-2162-464f-853a-377357ad9f2f" containerID="3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5" exitCode=143 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.662666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerDied","Data":"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5"} Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.664901 5030 generic.go:334] "Generic (PLEG): container finished" podID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerID="22df7d634b6fcc4d2d4394d1e3fd73e43228c42a17debaad1858400944d85c3f" exitCode=143 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.665017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerDied","Data":"22df7d634b6fcc4d2d4394d1e3fd73e43228c42a17debaad1858400944d85c3f"} Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.667452 5030 generic.go:334] "Generic (PLEG): container finished" podID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerID="aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee" exitCode=143 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.667518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerDied","Data":"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee"} Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.669021 5030 generic.go:334] "Generic (PLEG): container finished" podID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerID="242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde" exitCode=0 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.669176 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="b088bf19-0c85-4732-bc31-18e8bd1cd751" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f" gracePeriod=30 Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.669357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerDied","Data":"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde"} Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.683901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.740293 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.831441 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] 
Jan 20 23:50:34 crc kubenswrapper[5030]: I0120 23:50:34.964604 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:50:34 crc kubenswrapper[5030]: E0120 23:50:34.964937 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.063254 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.063565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.205889 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:35 crc kubenswrapper[5030]: E0120 23:50:35.237707 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d0e7e8_cb45_4f3a_82f8_377b69cd8b01.slice/crio-conmon-38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d0e7e8_cb45_4f3a_82f8_377b69cd8b01.slice/crio-38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.377858 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.428866 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.559367 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.576777 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.576843 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.587899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mklb2\" (UniqueName: \"kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2\") pod \"b088bf19-0c85-4732-bc31-18e8bd1cd751\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.587961 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.587995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle\") pod \"b088bf19-0c85-4732-bc31-18e8bd1cd751\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") pod \"b088bf19-0c85-4732-bc31-18e8bd1cd751\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") pod \"b088bf19-0c85-4732-bc31-18e8bd1cd751\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588118 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588172 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588198 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") pod \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\" (UID: \"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.588217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data\") pod \"b088bf19-0c85-4732-bc31-18e8bd1cd751\" (UID: \"b088bf19-0c85-4732-bc31-18e8bd1cd751\") " Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.591366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs" (OuterVolumeSpecName: "logs") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.618160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2" (OuterVolumeSpecName: "kube-api-access-mklb2") pod "b088bf19-0c85-4732-bc31-18e8bd1cd751" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751"). InnerVolumeSpecName "kube-api-access-mklb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.626428 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr" (OuterVolumeSpecName: "kube-api-access-sjvjr") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "kube-api-access-sjvjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.656736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data" (OuterVolumeSpecName: "config-data") pod "b088bf19-0c85-4732-bc31-18e8bd1cd751" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.658192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data" (OuterVolumeSpecName: "config-data") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.659387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b088bf19-0c85-4732-bc31-18e8bd1cd751" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.667829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "b088bf19-0c85-4732-bc31-18e8bd1cd751" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.669254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.682148 5030 generic.go:334] "Generic (PLEG): container finished" podID="b088bf19-0c85-4732-bc31-18e8bd1cd751" containerID="a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f" exitCode=0 Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.682203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.682237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b088bf19-0c85-4732-bc31-18e8bd1cd751","Type":"ContainerDied","Data":"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.682281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"b088bf19-0c85-4732-bc31-18e8bd1cd751","Type":"ContainerDied","Data":"d75895c86ce313a62e53a2b81fe0a959946b40356bc637b58fa5727225fe8140"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.682302 5030 scope.go:117] "RemoveContainer" containerID="a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.688946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerStarted","Data":"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.688983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerStarted","Data":"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.688993 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" 
event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerStarted","Data":"4291686583adb894843dbf9b11b0e8912d7c2e46aff101885ffb53f8a66c552f"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.689093 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.692647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" event={"ID":"d098cb47-5d3b-48d8-b10e-59e0c1d79840","Type":"ContainerStarted","Data":"22410699b2d975f66dd4fbd18bf8cdeb1ca6e285e64621da5b2a1c37a284b6b0"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.692686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" event={"ID":"d098cb47-5d3b-48d8-b10e-59e0c1d79840","Type":"ContainerStarted","Data":"e5879721858805294eb27cab1443dd8b8b4564521732c1c1365e72891c8606f7"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693600 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693639 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693659 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjvjr\" (UniqueName: \"kubernetes.io/projected/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-kube-api-access-sjvjr\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693671 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693681 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mklb2\" (UniqueName: \"kubernetes.io/projected/b088bf19-0c85-4732-bc31-18e8bd1cd751-kube-api-access-mklb2\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693690 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693700 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693709 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693692 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.693905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "b088bf19-0c85-4732-bc31-18e8bd1cd751" (UID: "b088bf19-0c85-4732-bc31-18e8bd1cd751"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.695339 5030 generic.go:334] "Generic (PLEG): container finished" podID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerID="38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead" exitCode=0 Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.695527 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" (UID: "a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.695635 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.696114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerDied","Data":"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.696148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01","Type":"ContainerDied","Data":"49473b7bf39e44a4987c765b462b055a4031cf90a7237af4894caf5dd109ddcb"} Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.728307 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" podStartSLOduration=1.728282853 podStartE2EDuration="1.728282853s" podCreationTimestamp="2026-01-20 23:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:35.716446905 +0000 UTC m=+4508.036707213" watchObservedRunningTime="2026-01-20 23:50:35.728282853 +0000 UTC m=+4508.048543161" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.737494 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" podStartSLOduration=2.737471195 podStartE2EDuration="2.737471195s" podCreationTimestamp="2026-01-20 23:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:35.732256449 +0000 UTC m=+4508.052516747" watchObservedRunningTime="2026-01-20 23:50:35.737471195 +0000 UTC m=+4508.057731503" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.783306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.795166 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b088bf19-0c85-4732-bc31-18e8bd1cd751-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.795201 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.795232 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.920840 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.930195 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.934020 5030 scope.go:117] "RemoveContainer" containerID="a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f" Jan 20 23:50:35 crc kubenswrapper[5030]: E0120 23:50:35.934444 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f\": container with ID starting with a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f not found: ID does not exist" containerID="a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.934481 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f"} err="failed to get container status \"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f\": rpc error: code = NotFound desc = could not find container \"a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f\": container with ID starting with a825c74b70f641b404c47a1fa1990f243387f7cca87ac7ab8abfdbdffb8fac2f not found: ID does not exist" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.934499 5030 scope.go:117] "RemoveContainer" containerID="38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949348 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:35 crc kubenswrapper[5030]: E0120 23:50:35.949725 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-log" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949738 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-log" Jan 20 23:50:35 crc kubenswrapper[5030]: E0120 23:50:35.949772 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-api" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949778 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-api" Jan 20 23:50:35 crc kubenswrapper[5030]: E0120 23:50:35.949799 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b088bf19-0c85-4732-bc31-18e8bd1cd751" containerName="nova-cell1-novncproxy-novncproxy" 
Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b088bf19-0c85-4732-bc31-18e8bd1cd751" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949957 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-log" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" containerName="nova-api-api" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.949983 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b088bf19-0c85-4732-bc31-18e8bd1cd751" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.950937 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.953218 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.953449 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.953574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.976873 5030 scope.go:117] "RemoveContainer" containerID="aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.978615 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01" path="/var/lib/kubelet/pods/a0d0e7e8-cb45-4f3a-82f8-377b69cd8b01/volumes" Jan 20 23:50:35 crc kubenswrapper[5030]: I0120 23:50:35.981121 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.001409 5030 scope.go:117] "RemoveContainer" containerID="38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead" Jan 20 23:50:36 crc kubenswrapper[5030]: E0120 23:50:36.004348 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead\": container with ID starting with 38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead not found: ID does not exist" containerID="38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.004375 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead"} err="failed to get container status \"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead\": rpc error: code = NotFound desc = could not find container \"38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead\": container with ID starting with 38c4fa43d7796d129ca66d48623077c0f0bb71fb215b123f61861d72d51e7ead not found: ID does not exist" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.004393 5030 scope.go:117] "RemoveContainer" containerID="aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee" Jan 20 
23:50:36 crc kubenswrapper[5030]: E0120 23:50:36.009380 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee\": container with ID starting with aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee not found: ID does not exist" containerID="aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.009711 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee"} err="failed to get container status \"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee\": rpc error: code = NotFound desc = could not find container \"aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee\": container with ID starting with aa693e4a8efcf4b1aa87a16438ed95218a621a854730dbf8e03df111010aa2ee not found: ID does not exist" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.020023 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.026194 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.039381 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.041187 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.043588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.043884 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.047407 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.058929 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.099832 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrcq\" (UniqueName: \"kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.099900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.099930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.099950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.099999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.100294 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnmj\" (UniqueName: \"kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc 
kubenswrapper[5030]: I0120 23:50:36.203881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.203978 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.204000 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.204022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrcq\" (UniqueName: \"kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.204676 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.209915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.210580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.211236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.213883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.222336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrcq\" (UniqueName: \"kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq\") pod \"nova-api-0\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.258394 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.270037 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.314199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.314267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.314359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.314429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.314474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnmj\" (UniqueName: \"kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.319539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.323396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.332469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.336650 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.336985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnmj\" (UniqueName: \"kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.359773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.417867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdn6w\" (UniqueName: \"kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w\") pod \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.418293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") pod \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.418420 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data\") pod \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.418550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config\") pod \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.418753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle\") pod \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\" (UID: \"5945d7b7-7f69-4c5e-a90f-5963d1ff010f\") " Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.432566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data" (OuterVolumeSpecName: "config-data") pod 
"5945d7b7-7f69-4c5e-a90f-5963d1ff010f" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.433697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5945d7b7-7f69-4c5e-a90f-5963d1ff010f" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.463910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w" (OuterVolumeSpecName: "kube-api-access-wdn6w") pod "5945d7b7-7f69-4c5e-a90f-5963d1ff010f" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f"). InnerVolumeSpecName "kube-api-access-wdn6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.502807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5945d7b7-7f69-4c5e-a90f-5963d1ff010f" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.523881 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.523912 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.523924 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.523959 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdn6w\" (UniqueName: \"kubernetes.io/projected/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-kube-api-access-wdn6w\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.532483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "5945d7b7-7f69-4c5e-a90f-5963d1ff010f" (UID: "5945d7b7-7f69-4c5e-a90f-5963d1ff010f"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.625855 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5945d7b7-7f69-4c5e-a90f-5963d1ff010f-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.706647 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.706640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5945d7b7-7f69-4c5e-a90f-5963d1ff010f","Type":"ContainerDied","Data":"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e"} Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.706724 5030 scope.go:117] "RemoveContainer" containerID="243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.706573 5030 generic.go:334] "Generic (PLEG): container finished" podID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" containerID="243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e" exitCode=0 Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.706891 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"5945d7b7-7f69-4c5e-a90f-5963d1ff010f","Type":"ContainerDied","Data":"3090799522da9146273743f90f5d86391a3803602e9abf8157632a98d0d57e62"} Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.708246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.769746 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.777960 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.787008 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: E0120 23:50:36.797171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" containerName="memcached" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.797459 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" containerName="memcached" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.797738 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" containerName="memcached" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.798842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.798994 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.802356 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.802680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-rp726" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.802869 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.807861 5030 scope.go:117] "RemoveContainer" containerID="243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e" Jan 20 23:50:36 crc kubenswrapper[5030]: E0120 23:50:36.808295 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e\": container with ID starting with 243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e not found: ID does not exist" containerID="243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.808398 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e"} err="failed to get container status \"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e\": rpc error: code = NotFound desc = could not find container \"243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e\": container with ID starting with 243c26ce8aa4bba4c7d0e3d062875497112101399848cdfe817ec2e79ce6b99e not found: ID does not exist" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.843828 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.914923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.932228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.932279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.932653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.932999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4wq\" (UniqueName: 
\"kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:36 crc kubenswrapper[5030]: I0120 23:50:36.933098 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.034763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.034829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.034873 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.035010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.035083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4wq\" (UniqueName: \"kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.035446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.035689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.039064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 
23:50:37.039313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.056159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4wq\" (UniqueName: \"kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq\") pod \"memcached-0\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:37 crc kubenswrapper[5030]: I0120 23:50:37.118495 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.586690 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.607043 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.646646 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.753896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqm6t\" (UniqueName: \"kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755227 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755338 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs\") pod \"b33621c6-2162-464f-853a-377357ad9f2f\" (UID: \"b33621c6-2162-464f-853a-377357ad9f2f\") " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.759009 5030 generic.go:334] "Generic (PLEG): container finished" podID="b33621c6-2162-464f-853a-377357ad9f2f" containerID="83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56" exitCode=0 Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.759069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerDied","Data":"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.759094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"b33621c6-2162-464f-853a-377357ad9f2f","Type":"ContainerDied","Data":"5c8fdd64889a9a4aabaecd6a65c18c5c357b12c991957ef8d063993914a59613"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.759110 5030 scope.go:117] "RemoveContainer" containerID="83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.759196 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.755101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.756256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs" (OuterVolumeSpecName: "logs") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.760038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t" (OuterVolumeSpecName: "kube-api-access-zqm6t") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "kube-api-access-zqm6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.763503 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.767772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts" (OuterVolumeSpecName: "scripts") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.783420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c07495be-8fe0-4e49-a868-75234b6547d6","Type":"ContainerStarted","Data":"b38c7a180e10f0876b8ec363fc81fe7721ddcc28b1a9185fc6405b6f48bf18a3"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.783457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c07495be-8fe0-4e49-a868-75234b6547d6","Type":"ContainerStarted","Data":"8c7a580c04e91fe4ade9368da014125790c9fdacbb046aba154736ac6001b15b"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.799559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"84e84d7f-54c8-43d1-a0ee-29d3e173f662","Type":"ContainerStarted","Data":"03e71d1266c739615d8e5fd86af9aa963f92040fafee649a5536a3dcdcde51b6"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.811140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerStarted","Data":"e9064aae263583ea97f12eaa487d85dde47b08e15128a9c49deab6445fad1c01"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.811452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerStarted","Data":"e7a018502be0466055f7e2d9c3c5111388a41fe69ae1db8fc06e7cce2f2adaf5"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.811469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerStarted","Data":"a9dcc89c3bcf83d0fcd1455fcf54baac067dc0c582f6daffbe02d9672f1444af"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.827579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.827655 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=1.82763624 podStartE2EDuration="1.82763624s" podCreationTimestamp="2026-01-20 23:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:37.81773015 +0000 UTC m=+4510.137990458" watchObservedRunningTime="2026-01-20 23:50:37.82763624 +0000 UTC m=+4510.147896528" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.855683 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.855592698 podStartE2EDuration="2.855592698s" podCreationTimestamp="2026-01-20 23:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:37.855097987 +0000 UTC m=+4510.175358275" watchObservedRunningTime="2026-01-20 23:50:37.855592698 +0000 UTC m=+4510.175852986" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857278 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857300 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857309 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857317 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b33621c6-2162-464f-853a-377357ad9f2f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857334 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.857343 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqm6t\" (UniqueName: \"kubernetes.io/projected/b33621c6-2162-464f-853a-377357ad9f2f-kube-api-access-zqm6t\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.871171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.959668 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.980299 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5945d7b7-7f69-4c5e-a90f-5963d1ff010f" path="/var/lib/kubelet/pods/5945d7b7-7f69-4c5e-a90f-5963d1ff010f/volumes" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:37.980815 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b088bf19-0c85-4732-bc31-18e8bd1cd751" path="/var/lib/kubelet/pods/b088bf19-0c85-4732-bc31-18e8bd1cd751/volumes" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.164236 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": read tcp 10.217.0.2:48778->10.217.1.190:9311: read: connection reset by peer" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.164247 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": read tcp 10.217.0.2:48784->10.217.1.190:9311: read: connection reset by peer" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.433898 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.445666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data" (OuterVolumeSpecName: "config-data") pod "b33621c6-2162-464f-853a-377357ad9f2f" (UID: "b33621c6-2162-464f-853a-377357ad9f2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.469789 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.469826 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b33621c6-2162-464f-853a-377357ad9f2f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.510435 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": dial tcp 10.217.1.190:9311: connect: connection refused" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.510529 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": dial tcp 10.217.1.190:9311: connect: connection refused" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.660912 5030 scope.go:117] "RemoveContainer" containerID="3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.756747 5030 scope.go:117] "RemoveContainer" containerID="83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56" Jan 20 23:50:38 crc kubenswrapper[5030]: E0120 23:50:38.764961 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56\": container with ID starting with 83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56 not found: ID does not exist" containerID="83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.765039 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56"} err="failed to get container status \"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56\": rpc error: code = NotFound desc = could not find container \"83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56\": container with ID starting with 83aa5d018cd69e4a57bf33801771af6a6199c3a715f3d3470b60454c99f0ab56 not found: ID does not exist" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.765088 5030 scope.go:117] "RemoveContainer" containerID="3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5" Jan 20 23:50:38 crc kubenswrapper[5030]: E0120 23:50:38.765962 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5\": container with ID starting with 3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5 not found: ID does not exist" containerID="3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.766000 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5"} err="failed to get container status \"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5\": rpc error: code = NotFound desc = could not find container \"3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5\": container with ID starting with 3e9a5cc906acbaff7456c0e13b2e3f9c2e7b8cf065ffe3e9db20c604658530d5 not found: ID does not exist" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.775701 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.782579 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.797077 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:38 crc kubenswrapper[5030]: E0120 23:50:38.797460 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-log" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.797477 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-log" Jan 20 23:50:38 crc kubenswrapper[5030]: E0120 23:50:38.797520 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-httpd" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.797527 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-httpd" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.797710 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-httpd" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.797731 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33621c6-2162-464f-853a-377357ad9f2f" containerName="glance-log" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.798669 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.802760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.803050 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.833679 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.931669 5030 generic.go:334] "Generic (PLEG): container finished" podID="7daf347c-1524-4050-b50e-deb2024da0cc" containerID="b20f1977eee125aebc807931296fe0bddf05f42f458f1beec8791a1bbb2c7e0d" exitCode=1 Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.931740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerDied","Data":"b20f1977eee125aebc807931296fe0bddf05f42f458f1beec8791a1bbb2c7e0d"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.932351 5030 scope.go:117] "RemoveContainer" containerID="b20f1977eee125aebc807931296fe0bddf05f42f458f1beec8791a1bbb2c7e0d" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.983767 5030 generic.go:334] "Generic (PLEG): container finished" podID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerID="45defd247220b7c9d73f86491711fda5c0c833c5e9fda3087524c4f3b85ddafa" exitCode=0 Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.983833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerDied","Data":"45defd247220b7c9d73f86491711fda5c0c833c5e9fda3087524c4f3b85ddafa"} Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989451 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998dd\" (UniqueName: \"kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:38 crc kubenswrapper[5030]: I0120 23:50:38.989513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.001838 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.032579 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerID="aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa" exitCode=0 Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.032679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerDied","Data":"aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa"} Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.060263 5030 generic.go:334] "Generic (PLEG): container finished" podID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerID="1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" exitCode=0 Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.061129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"6bd1362d-6fee-4d39-ad27-ba95334caeef","Type":"ContainerDied","Data":"1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac"} Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096243 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp67c\" (UniqueName: \"kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\" (UID: \"c80f8119-49a7-4395-afb7-cdb94b8d6c35\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.096978 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097108 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-998dd\" (UniqueName: \"kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097203 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.097345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs\") pod \"glance-default-external-api-0\" 
(UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.098662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.099508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs" (OuterVolumeSpecName: "logs") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.099947 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.100739 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.100815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.112063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts" (OuterVolumeSpecName: "scripts") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.112729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.114978 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.117919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.122481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.125797 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.132230 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-998dd\" (UniqueName: \"kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.137685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.139819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c" (OuterVolumeSpecName: "kube-api-access-sp67c") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "kube-api-access-sp67c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.199456 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.199526 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.199538 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c80f8119-49a7-4395-afb7-cdb94b8d6c35-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.199550 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp67c\" (UniqueName: \"kubernetes.io/projected/c80f8119-49a7-4395-afb7-cdb94b8d6c35-kube-api-access-sp67c\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.199561 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.201422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.207539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.246505 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.283868 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.173:5000/v3\": read tcp 10.217.0.2:52090->10.217.1.173:5000: read: connection reset by peer" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.300285 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle\") pod \"6bd1362d-6fee-4d39-ad27-ba95334caeef\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.300478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sp7s\" (UniqueName: \"kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s\") pod \"6bd1362d-6fee-4d39-ad27-ba95334caeef\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.300562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data\") pod \"6bd1362d-6fee-4d39-ad27-ba95334caeef\" (UID: \"6bd1362d-6fee-4d39-ad27-ba95334caeef\") " Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.300962 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.345387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s" (OuterVolumeSpecName: "kube-api-access-8sp7s") pod "6bd1362d-6fee-4d39-ad27-ba95334caeef" (UID: "6bd1362d-6fee-4d39-ad27-ba95334caeef"). InnerVolumeSpecName "kube-api-access-8sp7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.357228 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.389808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bd1362d-6fee-4d39-ad27-ba95334caeef" (UID: "6bd1362d-6fee-4d39-ad27-ba95334caeef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.390553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.404528 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.404562 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.404573 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.404584 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sp7s\" (UniqueName: \"kubernetes.io/projected/6bd1362d-6fee-4d39-ad27-ba95334caeef-kube-api-access-8sp7s\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.405241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data" (OuterVolumeSpecName: "config-data") pod "6bd1362d-6fee-4d39-ad27-ba95334caeef" (UID: "6bd1362d-6fee-4d39-ad27-ba95334caeef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.419289 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.430406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data" (OuterVolumeSpecName: "config-data") pod "c80f8119-49a7-4395-afb7-cdb94b8d6c35" (UID: "c80f8119-49a7-4395-afb7-cdb94b8d6c35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.506306 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bd1362d-6fee-4d39-ad27-ba95334caeef-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.506683 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c80f8119-49a7-4395-afb7-cdb94b8d6c35-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.579092 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.802647 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.974987 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33621c6-2162-464f-853a-377357ad9f2f" path="/var/lib/kubelet/pods/b33621c6-2162-464f-853a-377357ad9f2f/volumes" Jan 20 23:50:39 crc kubenswrapper[5030]: I0120 23:50:39.986436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.072914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerStarted","Data":"cf4277b9970dab45af9d50c12645ba97abd7f0b0cad2e3a7288e663c8ecb365f"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.072980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerStarted","Data":"802ab5c1c2e050a8a391ce1421faacfc7008363dd9ae9278c32e3b4a5668b5f9"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.073033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.073066 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.082090 5030 generic.go:334] "Generic (PLEG): container finished" podID="16b352a0-984a-4736-9ede-a3797059b56a" containerID="1a322a5b0f4609dfb53e29d267cab28f1bd935ea7c08384b0f94ec463163dffc" exitCode=0 Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.082162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" event={"ID":"16b352a0-984a-4736-9ede-a3797059b56a","Type":"ContainerDied","Data":"1a322a5b0f4609dfb53e29d267cab28f1bd935ea7c08384b0f94ec463163dffc"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.082185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" event={"ID":"16b352a0-984a-4736-9ede-a3797059b56a","Type":"ContainerDied","Data":"32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.082196 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e46dab9e267047d0ea16968ff1e5ff7b12dd9bd552a2012b51f642efe794ed" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.084141 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"6bd1362d-6fee-4d39-ad27-ba95334caeef","Type":"ContainerDied","Data":"a86c9a242c160cbc18143b5142d877b4a0d2ca8a2759349d15e5bfe1bd7d3b60"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.084188 5030 scope.go:117] "RemoveContainer" containerID="1ccefdcb102753b7ad4e4b3ff83929bdf44a95fb3b2cf14345b4c80f9c23baac" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.084299 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.094669 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"84e84d7f-54c8-43d1-a0ee-29d3e173f662","Type":"ContainerStarted","Data":"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.094754 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.100554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerStarted","Data":"e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.105034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"c80f8119-49a7-4395-afb7-cdb94b8d6c35","Type":"ContainerDied","Data":"f36ebd96f2d963aad096c98d1642e51349d2a11763ae21b3692dbc6e8922dd1c"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.105119 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.117544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerStarted","Data":"ad9fbba8899de986ea359d584c1bf50448fd29d3fd8b7be6e07cb8c3bf546e0a"} Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.133356 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=4.133330073 podStartE2EDuration="4.133330073s" podCreationTimestamp="2026-01-20 23:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:40.116382832 +0000 UTC m=+4512.436643120" watchObservedRunningTime="2026-01-20 23:50:40.133330073 +0000 UTC m=+4512.453590361" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.313587 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.332826 5030 scope.go:117] "RemoveContainer" containerID="45defd247220b7c9d73f86491711fda5c0c833c5e9fda3087524c4f3b85ddafa" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.337101 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.363496 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.398362 5030 scope.go:117] "RemoveContainer" containerID="22df7d634b6fcc4d2d4394d1e3fd73e43228c42a17debaad1858400944d85c3f" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.404570 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.420079 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.426927 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.439760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.439825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.439906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.440035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfks4\" (UniqueName: \"kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.440077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.440125 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.440142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.440166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts\") pod \"16b352a0-984a-4736-9ede-a3797059b56a\" (UID: \"16b352a0-984a-4736-9ede-a3797059b56a\") " Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.444576 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: E0120 23:50:40.444996 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-httpd" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445012 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-httpd" Jan 20 23:50:40 crc kubenswrapper[5030]: E0120 23:50:40.445028 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445035 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" Jan 20 23:50:40 crc kubenswrapper[5030]: E0120 23:50:40.445048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-log" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445055 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-log" Jan 20 23:50:40 crc kubenswrapper[5030]: E0120 23:50:40.445080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerName="nova-cell1-conductor-conductor" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445086 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerName="nova-cell1-conductor-conductor" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445281 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-httpd" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" containerName="nova-cell1-conductor-conductor" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" containerName="glance-log" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.445325 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b352a0-984a-4736-9ede-a3797059b56a" containerName="keystone-api" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.447964 5030 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.449352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.450178 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.450418 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.460157 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.461520 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.463594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.469981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts" (OuterVolumeSpecName: "scripts") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.470644 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4" (OuterVolumeSpecName: "kube-api-access-qfks4") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "kube-api-access-qfks4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.473177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.486122 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.499559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.502288 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.519115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.522127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data" (OuterVolumeSpecName: "config-data") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.524422 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "16b352a0-984a-4736-9ede-a3797059b56a" (UID: "16b352a0-984a-4736-9ede-a3797059b56a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544015 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544052 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544065 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544074 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544083 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544091 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544102 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/16b352a0-984a-4736-9ede-a3797059b56a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.544113 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfks4\" (UniqueName: 
\"kubernetes.io/projected/16b352a0-984a-4736-9ede-a3797059b56a-kube-api-access-qfks4\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645337 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645463 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645522 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpgx\" (UniqueName: \"kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645538 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvwtx\" (UniqueName: \"kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645556 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.645648 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746897 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 
23:50:40.746943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.746970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.747010 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpgx\" (UniqueName: \"kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.747027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvwtx\" (UniqueName: \"kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.747048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.747063 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.747918 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.748391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.760300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: 
I0120 23:50:40.761783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.762567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.766281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.766969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.767211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvwtx\" (UniqueName: \"kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx\") pod \"nova-cell1-conductor-0\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.771379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.773504 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpgx\" (UniqueName: \"kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.795925 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.908777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:40 crc kubenswrapper[5030]: I0120 23:50:40.918019 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.157897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerStarted","Data":"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74"} Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.158067 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d4c58fbf5-82sk5" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.167235 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": read tcp 10.217.0.2:46448->10.217.1.191:8776: read: connection reset by peer" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.167846 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.191:8776/healthcheck\": dial tcp 10.217.1.191:8776: connect: connection refused" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.209678 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.222114 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-d4c58fbf5-82sk5"] Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.360409 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.415180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:50:41 crc kubenswrapper[5030]: W0120 23:50:41.417677 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b5db4a_e775_4981_a7fd_e2d8bd56f334.slice/crio-5a49ac1f90df3e226390cd658b6c516ac2e54d9efe1409be54ec69bf12364dd8 WatchSource:0}: Error finding container 5a49ac1f90df3e226390cd658b6c516ac2e54d9efe1409be54ec69bf12364dd8: Status 404 returned error can't find the container with id 5a49ac1f90df3e226390cd658b6c516ac2e54d9efe1409be54ec69bf12364dd8 Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.499224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:41 crc kubenswrapper[5030]: W0120 23:50:41.512130 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd2d147_2c8e_4988_924e_68faf5018f06.slice/crio-2b25d7384f621ff81c333ef84c71a45302ef227e695a8d713cb2074fe5664a83 WatchSource:0}: Error finding container 2b25d7384f621ff81c333ef84c71a45302ef227e695a8d713cb2074fe5664a83: Status 404 returned error can't find the container with id 2b25d7384f621ff81c333ef84c71a45302ef227e695a8d713cb2074fe5664a83 Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.971767 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b352a0-984a-4736-9ede-a3797059b56a" 
path="/var/lib/kubelet/pods/16b352a0-984a-4736-9ede-a3797059b56a/volumes" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.972516 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd1362d-6fee-4d39-ad27-ba95334caeef" path="/var/lib/kubelet/pods/6bd1362d-6fee-4d39-ad27-ba95334caeef/volumes" Jan 20 23:50:41 crc kubenswrapper[5030]: I0120 23:50:41.973112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80f8119-49a7-4395-afb7-cdb94b8d6c35" path="/var/lib/kubelet/pods/c80f8119-49a7-4395-afb7-cdb94b8d6c35/volumes" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.177770 5030 generic.go:334] "Generic (PLEG): container finished" podID="7daf347c-1524-4050-b50e-deb2024da0cc" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" exitCode=1 Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.177820 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerDied","Data":"e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.178035 5030 scope.go:117] "RemoveContainer" containerID="b20f1977eee125aebc807931296fe0bddf05f42f458f1beec8791a1bbb2c7e0d" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.178397 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:50:42 crc kubenswrapper[5030]: E0120 23:50:42.178598 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180721 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerID="70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" exitCode=0 Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180759 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerDied","Data":"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180776 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerStarted","Data":"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180861 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api-log" containerID="cri-o://07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" gracePeriod=30 Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180918 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.180951 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" 
podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" containerID="cri-o://85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" gracePeriod=30 Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.184154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerStarted","Data":"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.188979 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerStarted","Data":"03774a4c2f2b4d19bcaa58e9f4d6af0d8620d1a7f7640d24507de2ccd90dd36a"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.189024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerStarted","Data":"5a49ac1f90df3e226390cd658b6c516ac2e54d9efe1409be54ec69bf12364dd8"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.189101 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.197520 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerStarted","Data":"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.197566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerStarted","Data":"2b25d7384f621ff81c333ef84c71a45302ef227e695a8d713cb2074fe5664a83"} Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.242057 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.242035967 podStartE2EDuration="2.242035967s" podCreationTimestamp="2026-01-20 23:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:42.234821583 +0000 UTC m=+4514.555081871" watchObservedRunningTime="2026-01-20 23:50:42.242035967 +0000 UTC m=+4514.562296255" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.274597 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.274578217 podStartE2EDuration="4.274578217s" podCreationTimestamp="2026-01-20 23:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:42.24871627 +0000 UTC m=+4514.568976558" watchObservedRunningTime="2026-01-20 23:50:42.274578217 +0000 UTC m=+4514.594838505" Jan 20 23:50:42 crc kubenswrapper[5030]: I0120 23:50:42.962556 5030 scope.go:117] "RemoveContainer" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.118571 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193669 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193740 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.193930 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.194035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.194062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g478r\" (UniqueName: \"kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.194104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id\") pod \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\" (UID: \"e8420ab0-7e33-4fbc-a045-c190273ccc1f\") " Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.194489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: 
"e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.194971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs" (OuterVolumeSpecName: "logs") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.200169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.201101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts" (OuterVolumeSpecName: "scripts") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.202779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r" (OuterVolumeSpecName: "kube-api-access-g478r") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "kube-api-access-g478r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.243975 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerID="85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" exitCode=0 Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244009 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerID="07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" exitCode=143 Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerDied","Data":"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192"} Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerDied","Data":"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169"} Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"e8420ab0-7e33-4fbc-a045-c190273ccc1f","Type":"ContainerDied","Data":"493cd2b5907170a9970aacad3e9cd960c3ffbfdb54abe88696305ec096d41ff9"} Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244159 5030 scope.go:117] "RemoveContainer" containerID="85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.244184 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.262509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.262702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerStarted","Data":"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7"} Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.275801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data" (OuterVolumeSpecName: "config-data") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.290438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.297645 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8420ab0-7e33-4fbc-a045-c190273ccc1f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298243 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298262 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g478r\" (UniqueName: \"kubernetes.io/projected/e8420ab0-7e33-4fbc-a045-c190273ccc1f-kube-api-access-g478r\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298275 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8420ab0-7e33-4fbc-a045-c190273ccc1f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298285 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298294 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298306 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.298317 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.300783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e8420ab0-7e33-4fbc-a045-c190273ccc1f" (UID: "e8420ab0-7e33-4fbc-a045-c190273ccc1f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.301615 5030 scope.go:117] "RemoveContainer" containerID="70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.316750 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.3167348580000002 podStartE2EDuration="3.316734858s" podCreationTimestamp="2026-01-20 23:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:43.284809124 +0000 UTC m=+4515.605069422" watchObservedRunningTime="2026-01-20 23:50:43.316734858 +0000 UTC m=+4515.636995146" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.322136 5030 scope.go:117] "RemoveContainer" containerID="07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.346273 5030 scope.go:117] "RemoveContainer" containerID="85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.346765 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192\": container with ID starting with 85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192 not found: ID does not exist" containerID="85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.346806 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192"} err="failed to get container status \"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192\": rpc error: code = NotFound desc = could not find container \"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192\": container with ID starting with 85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192 not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.346857 5030 scope.go:117] "RemoveContainer" containerID="70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.347224 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd\": container with ID starting with 70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd not found: ID does not exist" containerID="70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.347710 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd"} err="failed to get container status \"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd\": rpc error: code = NotFound desc = could not find container \"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd\": container with ID starting with 70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.347757 5030 
scope.go:117] "RemoveContainer" containerID="07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.348259 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169\": container with ID starting with 07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169 not found: ID does not exist" containerID="07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348290 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169"} err="failed to get container status \"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169\": rpc error: code = NotFound desc = could not find container \"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169\": container with ID starting with 07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169 not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348307 5030 scope.go:117] "RemoveContainer" containerID="85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348575 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192"} err="failed to get container status \"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192\": rpc error: code = NotFound desc = could not find container \"85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192\": container with ID starting with 85af237151f9d173b4271e6ce3504604bbd584a04456619e687ff81257ac5192 not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348594 5030 scope.go:117] "RemoveContainer" containerID="70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348818 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd"} err="failed to get container status \"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd\": rpc error: code = NotFound desc = could not find container \"70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd\": container with ID starting with 70a54447598f971fc7e92bef9a355c806d606cc8b0ace1731b301d1aabb41ffd not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.348837 5030 scope.go:117] "RemoveContainer" containerID="07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.349047 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169"} err="failed to get container status \"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169\": rpc error: code = NotFound desc = could not find container \"07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169\": container with ID starting with 07c26c84cf98fab508379ee95538a48825849bdbcb6ecba7107100d9fb5e7169 not found: ID does not exist" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.400758 5030 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8420ab0-7e33-4fbc-a045-c190273ccc1f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.588283 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.604147 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.614419 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.614991 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api-log" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.615088 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api-log" Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.615281 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.615441 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: E0120 23:50:43.615557 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.615688 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.616168 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.616309 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.616493 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" containerName="cinder-api-log" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.618318 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.620745 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.621033 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.621069 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.638849 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.705496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.705554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.705645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.705862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.705911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpld\" (UniqueName: \"kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.706031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.706067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.706179 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.706213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808273 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808324 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808467 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808642 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpld\" (UniqueName: \"kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.808740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.809188 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.812228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.812482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.813272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.813315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.814965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.824300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.839929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpld\" (UniqueName: 
\"kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld\") pod \"cinder-api-0\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.936829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:43 crc kubenswrapper[5030]: I0120 23:50:43.994993 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8420ab0-7e33-4fbc-a045-c190273ccc1f" path="/var/lib/kubelet/pods/e8420ab0-7e33-4fbc-a045-c190273ccc1f/volumes" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.274202 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerID="03774a4c2f2b4d19bcaa58e9f4d6af0d8620d1a7f7640d24507de2ccd90dd36a" exitCode=1 Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.274290 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerDied","Data":"03774a4c2f2b4d19bcaa58e9f4d6af0d8620d1a7f7640d24507de2ccd90dd36a"} Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.275014 5030 scope.go:117] "RemoveContainer" containerID="03774a4c2f2b4d19bcaa58e9f4d6af0d8620d1a7f7640d24507de2ccd90dd36a" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.279070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa"} Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.279268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.440030 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.568832 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.570164 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.576291 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.802466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.803101 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:50:44 crc kubenswrapper[5030]: E0120 23:50:44.803777 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.803826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.803841 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.803896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:44 crc kubenswrapper[5030]: I0120 23:50:44.946336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.314562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerStarted","Data":"0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc"} Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.315503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.322403 5030 generic.go:334] "Generic (PLEG): container finished" podID="87584043-7d0e-475d-89ec-2f743b49b145" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" exitCode=1 Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.322480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa"} Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.322512 5030 scope.go:117] "RemoveContainer" containerID="0f4d8ae10989b886f307feec10a85282b7e675f678e654903030afe74cc1794d" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.323491 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:50:45 crc kubenswrapper[5030]: E0120 23:50:45.323875 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.347110 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:50:45 crc kubenswrapper[5030]: E0120 23:50:45.347404 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.347470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerStarted","Data":"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54"} Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.347500 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerStarted","Data":"2addcdc8e5aca18200a55a675870e879d87f04f7750e2bb174d7710bfd8848d6"} Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.365688 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.420561 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.188:8080/\": dial tcp 10.217.1.188:8080: i/o timeout (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:45 crc kubenswrapper[5030]: I0120 23:50:45.579063 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.271310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.271749 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.359483 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:50:46 crc kubenswrapper[5030]: E0120 23:50:46.359719 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.360887 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.361612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerStarted","Data":"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1"} Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.362451 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:50:46 crc kubenswrapper[5030]: E0120 23:50:46.362902 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.383612 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.401895 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.401876189 
podStartE2EDuration="3.401876189s" podCreationTimestamp="2026-01-20 23:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:46.398259861 +0000 UTC m=+4518.718520149" watchObservedRunningTime="2026-01-20 23:50:46.401876189 +0000 UTC m=+4518.722136477" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.410369 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.477174 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.513861 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.514332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api-log" containerID="cri-o://2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3" gracePeriod=30 Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.514785 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api" containerID="cri-o://d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7" gracePeriod=30 Jan 20 23:50:46 crc kubenswrapper[5030]: I0120 23:50:46.962391 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:50:46 crc kubenswrapper[5030]: E0120 23:50:46.962909 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.120847 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.240872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.241123 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-log" containerID="cri-o://614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.241522 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-httpd" containerID="cri-o://3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.257479 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.291771 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.292059 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.312672 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.315288 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-log" containerID="cri-o://0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.315542 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-httpd" containerID="cri-o://489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.357862 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.358272 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-api" containerID="cri-o://32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.358495 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-httpd" containerID="cri-o://7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8" gracePeriod=30 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.382120 5030 generic.go:334] "Generic (PLEG): container finished" podID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerID="2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3" exitCode=143 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.382211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerDied","Data":"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3"} Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.389351 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerID="0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc" exitCode=1 Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.389531 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerDied","Data":"0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc"} Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.389718 5030 scope.go:117] "RemoveContainer" containerID="03774a4c2f2b4d19bcaa58e9f4d6af0d8620d1a7f7640d24507de2ccd90dd36a" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.390594 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:50:47 crc kubenswrapper[5030]: E0120 23:50:47.390802 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.390808 5030 scope.go:117] "RemoveContainer" containerID="0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc" Jan 20 23:50:47 crc kubenswrapper[5030]: E0120 23:50:47.391001 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.391282 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.404141 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.405109 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.200:9696/\": EOF" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.406341 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.426707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.465111 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.484737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485190 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpknm\" (UniqueName: \"kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485331 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485364 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.485385 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpknm\" (UniqueName: \"kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.589890 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.602981 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.604532 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.605790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.607249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.609930 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.611112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.620107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpknm\" (UniqueName: \"kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm\") pod \"neutron-579d49bbb6-jhfjg\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.784776 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.785551 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.817900 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.819374 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.853560 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.899973 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkf4g\" (UniqueName: \"kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:47 crc kubenswrapper[5030]: I0120 23:50:47.900510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkf4g\" (UniqueName: \"kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 
crc kubenswrapper[5030]: I0120 23:50:48.006139 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.006299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.013831 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.019537 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.021274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.028288 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.039433 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.040977 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.041437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.041932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkf4g\" (UniqueName: \"kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g\") pod \"neutron-6fc9448c48-j8m88\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.099308 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107492 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107671 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-998dd\" (UniqueName: \"kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.107744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data\") pod \"df0c6001-ce31-4b3e-995f-d2a144d3142e\" (UID: \"df0c6001-ce31-4b3e-995f-d2a144d3142e\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.113059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.125194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs" (OuterVolumeSpecName: "logs") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.137191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts" (OuterVolumeSpecName: "scripts") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.139947 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.145823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd" (OuterVolumeSpecName: "kube-api-access-998dd") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "kube-api-access-998dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.166877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.179786 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.195788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data" (OuterVolumeSpecName: "config-data") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.209561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.210091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpgx\" (UniqueName: \"kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.210133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run\") pod \"6dd2d147-2c8e-4988-924e-68faf5018f06\" (UID: \"6dd2d147-2c8e-4988-924e-68faf5018f06\") " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.210412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "df0c6001-ce31-4b3e-995f-d2a144d3142e" (UID: "df0c6001-ce31-4b3e-995f-d2a144d3142e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216890 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216925 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-998dd\" (UniqueName: \"kubernetes.io/projected/df0c6001-ce31-4b3e-995f-d2a144d3142e-kube-api-access-998dd\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216938 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216949 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216960 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216970 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df0c6001-ce31-4b3e-995f-d2a144d3142e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.216994 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.217015 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df0c6001-ce31-4b3e-995f-d2a144d3142e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.219587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs" (OuterVolumeSpecName: "logs") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.222238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.225370 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.225762 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-log" containerID="cri-o://e7a018502be0466055f7e2d9c3c5111388a41fe69ae1db8fc06e7cce2f2adaf5" gracePeriod=30 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.225966 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-api" containerID="cri-o://e9064aae263583ea97f12eaa487d85dde47b08e15128a9c49deab6445fad1c01" gracePeriod=30 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.236923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts" (OuterVolumeSpecName: "scripts") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.240278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx" (OuterVolumeSpecName: "kube-api-access-5tpgx") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "kube-api-access-5tpgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.251822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.265266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.272691 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.292499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data" (OuterVolumeSpecName: "config-data") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.292582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6dd2d147-2c8e-4988-924e-68faf5018f06" (UID: "6dd2d147-2c8e-4988-924e-68faf5018f06"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.298482 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.319783 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.319966 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpgx\" (UniqueName: \"kubernetes.io/projected/6dd2d147-2c8e-4988-924e-68faf5018f06-kube-api-access-5tpgx\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320022 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6dd2d147-2c8e-4988-924e-68faf5018f06-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320074 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320146 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320221 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320299 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320358 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.320412 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd2d147-2c8e-4988-924e-68faf5018f06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.343375 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.427796 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 
23:50:48.440150 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerID="489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" exitCode=0 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440180 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerID="0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" exitCode=143 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerDied","Data":"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440270 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerDied","Data":"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"6dd2d147-2c8e-4988-924e-68faf5018f06","Type":"ContainerDied","Data":"2b25d7384f621ff81c333ef84c71a45302ef227e695a8d713cb2074fe5664a83"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440296 5030 scope.go:117] "RemoveContainer" containerID="489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.440442 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.465095 5030 generic.go:334] "Generic (PLEG): container finished" podID="7338788b-d61c-4a48-8ec0-305003dc5397" containerID="e7a018502be0466055f7e2d9c3c5111388a41fe69ae1db8fc06e7cce2f2adaf5" exitCode=143 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.465183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerDied","Data":"e7a018502be0466055f7e2d9c3c5111388a41fe69ae1db8fc06e7cce2f2adaf5"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.467783 5030 generic.go:334] "Generic (PLEG): container finished" podID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerID="3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" exitCode=0 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.467804 5030 generic.go:334] "Generic (PLEG): container finished" podID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerID="614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" exitCode=143 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.467833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerDied","Data":"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.467847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerDied","Data":"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74"} 
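The PLEG records above show the glance-default-internal-api-0 and glance-default-external-api-0 containers reported as ContainerDied (exit codes 0 and 143) while their volumes are unmounted and replacement pods are created. A minimal sketch for pulling those events out of a saved copy of this journal, assuming the log text has been written to a local file (hypothetical name kubelet.log) and relying only on the key=value fields visible in the entries above:

import re

# Matches the structured PLEG records seen above, e.g.
#   "SyncLoop (PLEG): event for pod" pod="ns/name" event={"ID":"...","Type":"ContainerDied","Data":"..."}
PLEG_RE = re.compile(
    r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" '
    r'event=\{"ID":"(?P<uid>[^"]+)","Type":"(?P<type>[^"]+)","Data":"(?P<data>[^"]+)"\}'
)

def pleg_events(path="kubelet.log", event_type="ContainerDied"):
    # The saved journal wraps several records onto one physical line, so scan
    # the whole text with finditer instead of matching line by line.
    with open(path, encoding="utf-8") as fh:
        text = fh.read()
    for m in PLEG_RE.finditer(text):
        if m.group("type") == event_type:
            yield m.group("pod"), m.group("uid"), m.group("data")

if __name__ == "__main__":
    for pod, uid, container_id in pleg_events():
        print(f"{pod} (podID {uid}): container {container_id} died")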
Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.467857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"df0c6001-ce31-4b3e-995f-d2a144d3142e","Type":"ContainerDied","Data":"ad9fbba8899de986ea359d584c1bf50448fd29d3fd8b7be6e07cb8c3bf546e0a"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.468037 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.498962 5030 generic.go:334] "Generic (PLEG): container finished" podID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerID="7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8" exitCode=0 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.499042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerDied","Data":"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.502759 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.511965 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.529585 5030 scope.go:117] "RemoveContainer" containerID="0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.544471 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.545132 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.545197 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.545288 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.545337 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.545392 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.545442 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.545491 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.545535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.545763 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" 
containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.546494 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.546549 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" containerName="glance-httpd" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.546600 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" containerName="glance-log" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.547851 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.553794 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-hsjcl" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.554493 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.554596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.554765 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.555789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerStarted","Data":"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.555873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerStarted","Data":"34a89d4484f22d9a1b6e0651f0100f517957ba0ae48ce384870c6df350dde414"} Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.556122 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" containerID="cri-o://c25d532fd4fcba60465f02b139d44d010a92cbabfea1746a9dfa54e3df14eee8" gracePeriod=30 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.556306 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" containerID="cri-o://8596413765ace3663ab1dbabdc220f74e3eac2172966214f82638524e7826651" gracePeriod=30 Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.578599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.649877 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.679710 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.679772 5030 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.681320 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.690111 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.690309 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.699873 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.732971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733023 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czh6v\" (UniqueName: \"kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.733278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.744982 5030 scope.go:117] "RemoveContainer" containerID="489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.747146 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7\": container with ID starting with 489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7 not found: ID does not exist" containerID="489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.747180 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7"} err="failed to get container status \"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7\": rpc error: code = NotFound desc = could not find container \"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7\": container with ID starting with 489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.747208 5030 scope.go:117] "RemoveContainer" containerID="0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.749455 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706\": container with ID starting with 0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706 not found: ID does not exist" containerID="0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.749477 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706"} err="failed to get container status \"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706\": rpc error: code = NotFound desc = could not find container \"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706\": container with ID starting with 0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.749493 5030 scope.go:117] "RemoveContainer" containerID="489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.751376 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7"} err="failed to get 
container status \"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7\": rpc error: code = NotFound desc = could not find container \"489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7\": container with ID starting with 489b1f144c582ae0c643e395dd236540b571ff9d827c4915460a80a5bf9759e7 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.751407 5030 scope.go:117] "RemoveContainer" containerID="0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.763690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706"} err="failed to get container status \"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706\": rpc error: code = NotFound desc = could not find container \"0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706\": container with ID starting with 0dd981d1964f689b2b52ecad693b33bc623e9b482974e9eb7622344396bba706 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.763739 5030 scope.go:117] "RemoveContainer" containerID="3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.839764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840162 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") device mount path \"/mnt/openstack/pv19\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7p4h\" (UniqueName: \"kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840593 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840618 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czh6v\" (UniqueName: \"kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840733 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840743 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.840956 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.847286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.851339 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.852299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.852846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 
23:50:48.863306 5030 scope.go:117] "RemoveContainer" containerID="614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.875021 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czh6v\" (UniqueName: \"kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.895688 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.916143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942137 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7p4h\" (UniqueName: \"kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942372 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942403 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.942750 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.949161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.950274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.952951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.953690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.956030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.966908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.967548 5030 scope.go:117] "RemoveContainer" 
containerID="3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.967910 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722\": container with ID starting with 3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722 not found: ID does not exist" containerID="3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.967951 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722"} err="failed to get container status \"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722\": rpc error: code = NotFound desc = could not find container \"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722\": container with ID starting with 3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.967973 5030 scope.go:117] "RemoveContainer" containerID="614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.970897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7p4h\" (UniqueName: \"kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:48 crc kubenswrapper[5030]: E0120 23:50:48.971023 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74\": container with ID starting with 614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74 not found: ID does not exist" containerID="614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.971049 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74"} err="failed to get container status \"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74\": rpc error: code = NotFound desc = could not find container \"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74\": container with ID starting with 614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74 not found: ID does not exist" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.971164 5030 scope.go:117] "RemoveContainer" containerID="3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.972015 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722"} err="failed to get container status \"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722\": rpc error: code = NotFound desc = could not find container \"3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722\": container with ID starting with 3d29185e3fe0f267c4e3235b43ff6643a122ee2e58674f466304aa952eaae722 not found: ID does not exist" 
Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.972034 5030 scope.go:117] "RemoveContainer" containerID="614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74" Jan 20 23:50:48 crc kubenswrapper[5030]: I0120 23:50:48.972258 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74"} err="failed to get container status \"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74\": rpc error: code = NotFound desc = could not find container \"614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74\": container with ID starting with 614555c8d44ad063009f98a3c239dbf8955752db903293abde534f14bebc5f74 not found: ID does not exist" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.022972 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.181165 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.275174 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.592468 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:50:49 crc kubenswrapper[5030]: W0120 23:50:49.609780 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54968d4_ae84_4e25_8899_6113e78d9a2b.slice/crio-9b0f40d49806f25db8316c7623cd1178cdc747495ed183b07f257fae24d459f1 WatchSource:0}: Error finding container 9b0f40d49806f25db8316c7623cd1178cdc747495ed183b07f257fae24d459f1: Status 404 returned error can't find the container with id 9b0f40d49806f25db8316c7623cd1178cdc747495ed183b07f257fae24d459f1 Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.623051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerStarted","Data":"cce18799a99ac1715b565d5854b0dcbd08f2d61430c9731225dea07b9b533874"} Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.623094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerStarted","Data":"3d2c8b5ef48f92de27684ee3e9f03dc2c07b798a4d1a2c21ae54203b0fa195c0"} Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.638712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerStarted","Data":"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74"} Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.639017 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-api" 
containerID="cri-o://5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861" gracePeriod=30 Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.639300 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.639684 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-httpd" containerID="cri-o://deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74" gracePeriod=30 Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.706555 5030 generic.go:334] "Generic (PLEG): container finished" podID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerID="c25d532fd4fcba60465f02b139d44d010a92cbabfea1746a9dfa54e3df14eee8" exitCode=143 Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.706626 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerDied","Data":"c25d532fd4fcba60465f02b139d44d010a92cbabfea1746a9dfa54e3df14eee8"} Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.720597 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" podStartSLOduration=2.720575046 podStartE2EDuration="2.720575046s" podCreationTimestamp="2026-01-20 23:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:49.658284506 +0000 UTC m=+4521.978544794" watchObservedRunningTime="2026-01-20 23:50:49.720575046 +0000 UTC m=+4522.040835344" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.802023 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.124:9311/healthcheck\": read tcp 10.217.0.2:51688->10.217.1.124:9311: read: connection reset by peer" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.802310 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.124:9311/healthcheck\": read tcp 10.217.0.2:51684->10.217.1.124:9311: read: connection reset by peer" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.919851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:50:49 crc kubenswrapper[5030]: I0120 23:50:49.920468 5030 scope.go:117] "RemoveContainer" containerID="0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc" Jan 20 23:50:49 crc kubenswrapper[5030]: E0120 23:50:49.920773 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 10s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.000843 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd2d147-2c8e-4988-924e-68faf5018f06" path="/var/lib/kubelet/pods/6dd2d147-2c8e-4988-924e-68faf5018f06/volumes" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.002950 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0c6001-ce31-4b3e-995f-d2a144d3142e" path="/var/lib/kubelet/pods/df0c6001-ce31-4b3e-995f-d2a144d3142e/volumes" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.032992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.347084 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.485735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4h97\" (UniqueName: \"kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.485875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.485901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.485951 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.486002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.486021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.486089 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs\") pod \"94aa752f-480e-4bae-9e31-ae1ce700e577\" (UID: \"94aa752f-480e-4bae-9e31-ae1ce700e577\") " Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.487730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs" (OuterVolumeSpecName: "logs") pod 
"94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.510844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.520565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97" (OuterVolumeSpecName: "kube-api-access-b4h97") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "kube-api-access-b4h97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.587386 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.590967 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94aa752f-480e-4bae-9e31-ae1ce700e577-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.590999 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4h97\" (UniqueName: \"kubernetes.io/projected/94aa752f-480e-4bae-9e31-ae1ce700e577-kube-api-access-b4h97\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.591012 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.591124 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.593385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data" (OuterVolumeSpecName: "config-data") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.597778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.622753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94aa752f-480e-4bae-9e31-ae1ce700e577" (UID: "94aa752f-480e-4bae-9e31-ae1ce700e577"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.693967 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.693998 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.694009 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94aa752f-480e-4bae-9e31-ae1ce700e577-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.750182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerStarted","Data":"8f1339c559af2e1446af82c4175e39a6aa13b7b5be46fea7e205cb6a8a0be31a"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.754142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerStarted","Data":"9b0f40d49806f25db8316c7623cd1178cdc747495ed183b07f257fae24d459f1"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.757015 5030 generic.go:334] "Generic (PLEG): container finished" podID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerID="d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7" exitCode=0 Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.757059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerDied","Data":"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.757075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" event={"ID":"94aa752f-480e-4bae-9e31-ae1ce700e577","Type":"ContainerDied","Data":"a1ecf587dd8da4b8e0ca0a8bc79489b9100cc4646ac0bbc976d8c923e4c821d0"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.757090 5030 scope.go:117] "RemoveContainer" containerID="d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.757198 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.768612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerStarted","Data":"a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.768889 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.810598 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" podStartSLOduration=3.810577539 podStartE2EDuration="3.810577539s" podCreationTimestamp="2026-01-20 23:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:50.802277997 +0000 UTC m=+4523.122538285" watchObservedRunningTime="2026-01-20 23:50:50.810577539 +0000 UTC m=+4523.130837827" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.811966 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerID="deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74" exitCode=0 Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.812025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerDied","Data":"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74"} Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.828484 5030 scope.go:117] "RemoveContainer" containerID="2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.847886 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.857137 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5bd5fbdc44-xvm66"] Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.941256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.943039 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b38c7a180e10f0876b8ec363fc81fe7721ddcc28b1a9185fc6405b6f48bf18a3" gracePeriod=30 Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.949106 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.172:9696/\": dial tcp 10.217.1.172:9696: connect: connection refused" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.992560 5030 scope.go:117] "RemoveContainer" containerID="d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7" Jan 20 23:50:50 crc kubenswrapper[5030]: E0120 23:50:50.996613 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7\": container with ID starting with d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7 not found: ID does not exist" containerID="d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.997883 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7"} err="failed to get container status \"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7\": rpc error: code = NotFound desc = could not find container \"d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7\": container with ID starting with d4b65a581e80f8642a2a817a22e1a404af731c0e858c88b0592707343aa1a8b7 not found: ID does not exist" Jan 20 23:50:50 crc kubenswrapper[5030]: I0120 23:50:50.997920 5030 scope.go:117] "RemoveContainer" containerID="2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3" Jan 20 23:50:51 crc kubenswrapper[5030]: E0120 23:50:51.008369 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3\": container with ID starting with 2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3 not found: ID does not exist" containerID="2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3" Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.008427 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3"} err="failed to get container status \"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3\": rpc error: code = NotFound desc = could not find container \"2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3\": container with ID starting with 2a0cc0b59d1a6c1381916bc83161799447404481798fd098a6aef30f606848f3 not found: ID does not exist" Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.362757 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.202:6080/vnc_lite.html\": dial tcp 10.217.1.202:6080: connect: connection refused" Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.824229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerStarted","Data":"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca"} Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.825271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerStarted","Data":"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0"} Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.837173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerStarted","Data":"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537"} Jan 20 23:50:51 crc 
kubenswrapper[5030]: I0120 23:50:51.837216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerStarted","Data":"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09"} Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.840394 5030 generic.go:334] "Generic (PLEG): container finished" podID="c07495be-8fe0-4e49-a868-75234b6547d6" containerID="b38c7a180e10f0876b8ec363fc81fe7721ddcc28b1a9185fc6405b6f48bf18a3" exitCode=0 Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.840976 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c07495be-8fe0-4e49-a868-75234b6547d6","Type":"ContainerDied","Data":"b38c7a180e10f0876b8ec363fc81fe7721ddcc28b1a9185fc6405b6f48bf18a3"} Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.863557 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.863541422 podStartE2EDuration="3.863541422s" podCreationTimestamp="2026-01-20 23:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:51.854649776 +0000 UTC m=+4524.174910064" watchObservedRunningTime="2026-01-20 23:50:51.863541422 +0000 UTC m=+4524.183801710" Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.896288 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.896269646 podStartE2EDuration="3.896269646s" podCreationTimestamp="2026-01-20 23:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:51.882477931 +0000 UTC m=+4524.202738219" watchObservedRunningTime="2026-01-20 23:50:51.896269646 +0000 UTC m=+4524.216529934" Jan 20 23:50:51 crc kubenswrapper[5030]: I0120 23:50:51.974889 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" path="/var/lib/kubelet/pods/94aa752f-480e-4bae-9e31-ae1ce700e577/volumes" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.007833 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": read tcp 10.217.0.2:45054->10.217.1.183:8775: read: connection reset by peer" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.007917 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.183:8775/\": read tcp 10.217.0.2:45062->10.217.1.183:8775: read: connection reset by peer" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.056695 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:52 crc kubenswrapper[5030]: E0120 23:50:52.057276 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.057345 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api" Jan 20 23:50:52 crc kubenswrapper[5030]: E0120 23:50:52.057407 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api-log" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.057461 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api-log" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.057763 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api-log" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.057827 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="94aa752f-480e-4bae-9e31-ae1ce700e577" containerName="barbican-api" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.058796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.069706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131187 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131257 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtf6q\" (UniqueName: \"kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131404 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.131443 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtf6q\" (UniqueName: \"kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235265 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.235408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.238342 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.580231 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.645961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.649898 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.658056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.658843 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.659257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.659732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtf6q\" (UniqueName: \"kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q\") pod \"barbican-api-5796c565b-rf2fr\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.697154 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.720250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.794879 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.795177 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-log" containerID="cri-o://e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934" gracePeriod=30 Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.795253 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-api" containerID="cri-o://078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145" gracePeriod=30 Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.847430 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.863874 5030 generic.go:334] "Generic (PLEG): container finished" podID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerID="8596413765ace3663ab1dbabdc220f74e3eac2172966214f82638524e7826651" exitCode=0 Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.863924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerDied","Data":"8596413765ace3663ab1dbabdc220f74e3eac2172966214f82638524e7826651"} Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.866070 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.866282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"c07495be-8fe0-4e49-a868-75234b6547d6","Type":"ContainerDied","Data":"8c7a580c04e91fe4ade9368da014125790c9fdacbb046aba154736ac6001b15b"} Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.866310 5030 scope.go:117] "RemoveContainer" containerID="b38c7a180e10f0876b8ec363fc81fe7721ddcc28b1a9185fc6405b6f48bf18a3" Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.962138 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs\") pod \"c07495be-8fe0-4e49-a868-75234b6547d6\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.962517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs\") pod \"c07495be-8fe0-4e49-a868-75234b6547d6\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.962555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data\") pod \"c07495be-8fe0-4e49-a868-75234b6547d6\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.963097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcnmj\" (UniqueName: \"kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj\") pod \"c07495be-8fe0-4e49-a868-75234b6547d6\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.963193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle\") pod \"c07495be-8fe0-4e49-a868-75234b6547d6\" (UID: \"c07495be-8fe0-4e49-a868-75234b6547d6\") " Jan 20 23:50:52 crc kubenswrapper[5030]: I0120 23:50:52.971471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj" (OuterVolumeSpecName: "kube-api-access-zcnmj") pod "c07495be-8fe0-4e49-a868-75234b6547d6" (UID: "c07495be-8fe0-4e49-a868-75234b6547d6"). InnerVolumeSpecName "kube-api-access-zcnmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.044806 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c07495be-8fe0-4e49-a868-75234b6547d6" (UID: "c07495be-8fe0-4e49-a868-75234b6547d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.066589 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcnmj\" (UniqueName: \"kubernetes.io/projected/c07495be-8fe0-4e49-a868-75234b6547d6-kube-api-access-zcnmj\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.066620 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.151066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data" (OuterVolumeSpecName: "config-data") pod "c07495be-8fe0-4e49-a868-75234b6547d6" (UID: "c07495be-8fe0-4e49-a868-75234b6547d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.152406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "c07495be-8fe0-4e49-a868-75234b6547d6" (UID: "c07495be-8fe0-4e49-a868-75234b6547d6"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.162202 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.168788 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs\") pod \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.168865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vmzs\" (UniqueName: \"kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs\") pod \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.168886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data\") pod \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.169037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle\") pod \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.169060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs\") pod \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\" (UID: \"9916ac3c-e85e-498a-a0cc-769ca26f84b0\") " Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.169338 5030 reconciler_common.go:293] "Volume 
detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.169355 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.169779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs" (OuterVolumeSpecName: "logs") pod "9916ac3c-e85e-498a-a0cc-769ca26f84b0" (UID: "9916ac3c-e85e-498a-a0cc-769ca26f84b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.186866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "c07495be-8fe0-4e49-a868-75234b6547d6" (UID: "c07495be-8fe0-4e49-a868-75234b6547d6"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.186981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs" (OuterVolumeSpecName: "kube-api-access-4vmzs") pod "9916ac3c-e85e-498a-a0cc-769ca26f84b0" (UID: "9916ac3c-e85e-498a-a0cc-769ca26f84b0"). InnerVolumeSpecName "kube-api-access-4vmzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.223697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data" (OuterVolumeSpecName: "config-data") pod "9916ac3c-e85e-498a-a0cc-769ca26f84b0" (UID: "9916ac3c-e85e-498a-a0cc-769ca26f84b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.226070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9916ac3c-e85e-498a-a0cc-769ca26f84b0" (UID: "9916ac3c-e85e-498a-a0cc-769ca26f84b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.260590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9916ac3c-e85e-498a-a0cc-769ca26f84b0" (UID: "9916ac3c-e85e-498a-a0cc-769ca26f84b0"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271890 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07495be-8fe0-4e49-a868-75234b6547d6-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271930 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271940 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9916ac3c-e85e-498a-a0cc-769ca26f84b0-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271951 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271963 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vmzs\" (UniqueName: \"kubernetes.io/projected/9916ac3c-e85e-498a-a0cc-769ca26f84b0-kube-api-access-4vmzs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.271973 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9916ac3c-e85e-498a-a0cc-769ca26f84b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.325790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.503602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.514003 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534114 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:53 crc kubenswrapper[5030]: E0120 23:50:53.534459 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534475 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:50:53 crc kubenswrapper[5030]: E0120 23:50:53.534504 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534511 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" Jan 20 23:50:53 crc kubenswrapper[5030]: E0120 23:50:53.534534 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534540 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534735 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-log" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534758 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" containerName="nova-metadata-metadata" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.534770 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.535373 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.537780 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.539595 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.558078 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.580907 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.585047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.585101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.585135 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmtl9\" (UniqueName: \"kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.585177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.585295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.688667 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.688801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.688833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.688868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.688911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmtl9\" (UniqueName: \"kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.698277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.704386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.706429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.719177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmtl9\" (UniqueName: 
\"kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.733268 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.816272 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.867386 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.869644 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.876589 5030 generic.go:334] "Generic (PLEG): container finished" podID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerID="e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934" exitCode=143 Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.876666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerDied","Data":"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934"} Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.878329 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"9916ac3c-e85e-498a-a0cc-769ca26f84b0","Type":"ContainerDied","Data":"e818ee58a9c8866803b2d800afba678cba00b813a0b3b8e5e9c4a8b76c8aea10"} Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.878350 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.878388 5030 scope.go:117] "RemoveContainer" containerID="8596413765ace3663ab1dbabdc220f74e3eac2172966214f82638524e7826651" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.890477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerStarted","Data":"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d"} Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.890516 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerStarted","Data":"9df0b3b1b36ac34425ed231179d84d804e34c26545d145532e91a247319dd140"} Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.912905 5030 scope.go:117] "RemoveContainer" containerID="c25d532fd4fcba60465f02b139d44d010a92cbabfea1746a9dfa54e3df14eee8" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.916817 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.934154 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.993955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkhw\" (UniqueName: \"kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:53 crc kubenswrapper[5030]: I0120 23:50:53.994714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.061995 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07495be-8fe0-4e49-a868-75234b6547d6" path="/var/lib/kubelet/pods/c07495be-8fe0-4e49-a868-75234b6547d6/volumes" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098196 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkhw\" (UniqueName: \"kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.098449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.099550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.110103 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.132362 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.161915 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.163607 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.166584 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.166793 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.182901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.312086 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtlkk\" (UniqueName: \"kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.312152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.312176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.312205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.312222 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.346498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.346907 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc 
kubenswrapper[5030]: I0120 23:50:54.347358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.348515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.349045 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.352170 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkhw\" (UniqueName: \"kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw\") pod \"barbican-api-6cc86455b4-nd2m7\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.413665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtlkk\" (UniqueName: \"kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.413930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.413958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.413982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.414002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.414824 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.417163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.418134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.418187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.455189 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtlkk\" (UniqueName: \"kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk\") pod \"nova-metadata-0\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.495207 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.624922 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.845932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.933880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"044ada6d-cbbc-45dc-b229-3884d815427b","Type":"ContainerStarted","Data":"1995a209fa47980dcdbde82a0f9dae04eafb043f0c657a0e98f0369c8daa5d19"} Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.964711 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api-log" containerID="cri-o://01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" gracePeriod=30 Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.964806 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api" containerID="cri-o://07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" gracePeriod=30 Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.965096 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.965116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerStarted","Data":"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769"} Jan 20 23:50:54 crc kubenswrapper[5030]: I0120 23:50:54.965131 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:54.997669 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" podStartSLOduration=3.99764419 podStartE2EDuration="3.99764419s" podCreationTimestamp="2026-01-20 23:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:54.990962168 +0000 UTC m=+4527.311222456" watchObservedRunningTime="2026-01-20 23:50:54.99764419 +0000 UTC m=+4527.317904478" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.002601 5030 generic.go:334] "Generic (PLEG): container finished" podID="7338788b-d61c-4a48-8ec0-305003dc5397" containerID="e9064aae263583ea97f12eaa487d85dde47b08e15128a9c49deab6445fad1c01" exitCode=0 Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.002702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerDied","Data":"e9064aae263583ea97f12eaa487d85dde47b08e15128a9c49deab6445fad1c01"} Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.128281 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:50:55 crc kubenswrapper[5030]: W0120 23:50:55.150409 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb8010d_f18b_417a_919f_13e959ccf452.slice/crio-19b65cf0eeb9ae9ab9473dda49ea2757adf4eca1aaed79eb8349aaff7223bcad WatchSource:0}: Error finding container 19b65cf0eeb9ae9ab9473dda49ea2757adf4eca1aaed79eb8349aaff7223bcad: Status 404 returned error can't find the container with id 19b65cf0eeb9ae9ab9473dda49ea2757adf4eca1aaed79eb8349aaff7223bcad Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.168741 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.170815 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.174776 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.239848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kld27\" (UniqueName: \"kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.239907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.239972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.282505 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.288549 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:50:55 crc kubenswrapper[5030]: W0120 23:50:55.290869 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9df19e9_10ce_496f_9ae7_89ac55ba91cf.slice/crio-3aea2eedd919c9e3098512ef5eca7e39545bc7305ffd166df3ccc4dbdd0cbf8a WatchSource:0}: Error finding container 3aea2eedd919c9e3098512ef5eca7e39545bc7305ffd166df3ccc4dbdd0cbf8a: Status 404 returned error can't find the container with id 3aea2eedd919c9e3098512ef5eca7e39545bc7305ffd166df3ccc4dbdd0cbf8a Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqrcq\" (UniqueName: \"kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.341963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle\") pod \"7338788b-d61c-4a48-8ec0-305003dc5397\" (UID: \"7338788b-d61c-4a48-8ec0-305003dc5397\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.343308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kld27\" (UniqueName: \"kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.343547 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.343731 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.344115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs" (OuterVolumeSpecName: "logs") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.344436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.344588 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.356391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq" (OuterVolumeSpecName: "kube-api-access-sqrcq") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "kube-api-access-sqrcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.361645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kld27\" (UniqueName: \"kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27\") pod \"redhat-operators-df6c9\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.398774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.446444 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.446780 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqrcq\" (UniqueName: \"kubernetes.io/projected/7338788b-d61c-4a48-8ec0-305003dc5397-kube-api-access-sqrcq\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.446794 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7338788b-d61c-4a48-8ec0-305003dc5397-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.507980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.533092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.535079 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.539122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data" (OuterVolumeSpecName: "config-data") pod "7338788b-d61c-4a48-8ec0-305003dc5397" (UID: "7338788b-d61c-4a48-8ec0-305003dc5397"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.550073 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.550112 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.550123 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7338788b-d61c-4a48-8ec0-305003dc5397-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.844798 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.959614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.960939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtf6q\" (UniqueName: \"kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.960975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.961016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.961046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.961100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.961205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.975226 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs" (OuterVolumeSpecName: "logs") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.985365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q" (OuterVolumeSpecName: "kube-api-access-vtf6q") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "kube-api-access-vtf6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:55 crc kubenswrapper[5030]: I0120 23:50:55.993780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.030031 5030 generic.go:334] "Generic (PLEG): container finished" podID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerID="07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" exitCode=0 Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.030055 5030 generic.go:334] "Generic (PLEG): container finished" podID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerID="01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" exitCode=143 Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.030122 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.033498 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.050585 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9916ac3c-e85e-498a-a0cc-769ca26f84b0" path="/var/lib/kubelet/pods/9916ac3c-e85e-498a-a0cc-769ca26f84b0/volumes" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.052278 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=3.052259064 podStartE2EDuration="3.052259064s" podCreationTimestamp="2026-01-20 23:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:56.044615219 +0000 UTC m=+4528.364875497" watchObservedRunningTime="2026-01-20 23:50:56.052259064 +0000 UTC m=+4528.372519352" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.064775 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtf6q\" (UniqueName: \"kubernetes.io/projected/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-kube-api-access-vtf6q\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.064809 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.064823 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: W0120 23:50:56.146023 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod348859b8_66ca_4e3e_a7b1_ad96fe0858a5.slice/crio-7e382be6fd371ec2fcd7680a7a578ff2105770fd2b2a6ca1009d05eb48c08aaa WatchSource:0}: Error finding container 7e382be6fd371ec2fcd7680a7a578ff2105770fd2b2a6ca1009d05eb48c08aaa: Status 404 returned error can't find the container with id 
7e382be6fd371ec2fcd7680a7a578ff2105770fd2b2a6ca1009d05eb48c08aaa Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.318996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.329739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.380028 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.380070 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerStarted","Data":"409fe497b78edb037279005ae11210c5ba6b57531c5991487330124e7e48ec82"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerStarted","Data":"3aea2eedd919c9e3098512ef5eca7e39545bc7305ffd166df3ccc4dbdd0cbf8a"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425454 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"044ada6d-cbbc-45dc-b229-3884d815427b","Type":"ContainerStarted","Data":"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerStarted","Data":"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerStarted","Data":"19b65cf0eeb9ae9ab9473dda49ea2757adf4eca1aaed79eb8349aaff7223bcad"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" 
event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerDied","Data":"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerDied","Data":"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-5796c565b-rf2fr" event={"ID":"132cffa4-ee14-4f4d-a4f7-132d9ab4a552","Type":"ContainerDied","Data":"9df0b3b1b36ac34425ed231179d84d804e34c26545d145532e91a247319dd140"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"7338788b-d61c-4a48-8ec0-305003dc5397","Type":"ContainerDied","Data":"a9dcc89c3bcf83d0fcd1455fcf54baac067dc0c582f6daffbe02d9672f1444af"} Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.425569 5030 scope.go:117] "RemoveContainer" containerID="07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.476429 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.476757 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.486656 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.487463 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data" (OuterVolumeSpecName: "config-data") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.493899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") pod \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\" (UID: \"132cffa4-ee14-4f4d-a4f7-132d9ab4a552\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.494598 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: W0120 23:50:56.494692 5030 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/132cffa4-ee14-4f4d-a4f7-132d9ab4a552/volumes/kubernetes.io~secret/config-data Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.494708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data" (OuterVolumeSpecName: "config-data") pod "132cffa4-ee14-4f4d-a4f7-132d9ab4a552" (UID: "132cffa4-ee14-4f4d-a4f7-132d9ab4a552"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.503676 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.514349 5030 scope.go:117] "RemoveContainer" containerID="01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517418 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.517882 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517895 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-log" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.517918 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517926 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.517940 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517945 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.517965 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517971 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-api" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.517989 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.517995 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.518009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518017 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518187 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518202 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518210 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" containerName="nova-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518221 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerName="placement-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518238 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api-log" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.518247 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" containerName="barbican-api" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.519249 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.525409 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.525544 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.525713 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.551126 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.562137 5030 scope.go:117] "RemoveContainer" containerID="07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.563227 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769\": container with ID starting with 07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769 not found: ID does not exist" containerID="07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.563257 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769"} err="failed to get container status \"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769\": rpc error: code = NotFound desc = could not find container \"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769\": container with ID starting with 07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769 not found: ID does not exist" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.563278 5030 scope.go:117] "RemoveContainer" containerID="01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" Jan 20 23:50:56 crc kubenswrapper[5030]: E0120 23:50:56.567599 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d\": container with ID starting with 01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d not found: ID does not exist" containerID="01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.567706 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d"} err="failed to get container status \"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d\": rpc error: code = NotFound desc = could not find container \"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d\": container with ID starting with 01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d not found: ID does not exist" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.567734 5030 scope.go:117] "RemoveContainer" containerID="07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.568014 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769"} err="failed to get container status \"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769\": rpc error: code = NotFound desc = could not find container \"07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769\": container with ID starting with 07fe7f91b9f4061f46b5a30fd95e6624c9879dfc5ebd43931258c772278fc769 not found: ID does not exist" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.568039 5030 scope.go:117] "RemoveContainer" containerID="01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.568296 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d"} err="failed to get container status \"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d\": rpc error: code = NotFound desc = could not find container \"01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d\": container with ID starting with 01864fe190d063f491723e5f6b719f6f89610dc738511898510c148b7505dd0d not found: ID does not exist" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.568312 5030 scope.go:117] "RemoveContainer" containerID="e9064aae263583ea97f12eaa487d85dde47b08e15128a9c49deab6445fad1c01" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596490 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596589 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg47g\" (UniqueName: \"kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.596877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc 
kubenswrapper[5030]: I0120 23:50:56.596921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts\") pod \"9235118a-bc18-42b7-8968-8d1716aa58e6\" (UID: \"9235118a-bc18-42b7-8968-8d1716aa58e6\") " Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.597898 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/132cffa4-ee14-4f4d-a4f7-132d9ab4a552-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.605854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs" (OuterVolumeSpecName: "logs") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.606400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts" (OuterVolumeSpecName: "scripts") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.607081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g" (OuterVolumeSpecName: "kube-api-access-kg47g") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "kube-api-access-kg47g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.622560 5030 scope.go:117] "RemoveContainer" containerID="e7a018502be0466055f7e2d9c3c5111388a41fe69ae1db8fc06e7cce2f2adaf5" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggnm\" (UniqueName: \"kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699648 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699745 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg47g\" (UniqueName: \"kubernetes.io/projected/9235118a-bc18-42b7-8968-8d1716aa58e6-kube-api-access-kg47g\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699757 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.699766 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9235118a-bc18-42b7-8968-8d1716aa58e6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.704209 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.704649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data" (OuterVolumeSpecName: "config-data") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.721814 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.744313 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-5796c565b-rf2fr"] Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802767 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggnm\" (UniqueName: \"kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.802920 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.803087 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.803109 5030 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.803803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.806724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.818066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.847184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.861086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.861477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggnm\" (UniqueName: \"kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm\") pod \"nova-api-0\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.873768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.900418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9235118a-bc18-42b7-8968-8d1716aa58e6" (UID: "9235118a-bc18-42b7-8968-8d1716aa58e6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.907071 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.907110 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9235118a-bc18-42b7-8968-8d1716aa58e6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:56 crc kubenswrapper[5030]: I0120 23:50:56.962109 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.046787 5030 generic.go:334] "Generic (PLEG): container finished" podID="9235118a-bc18-42b7-8968-8d1716aa58e6" containerID="078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145" exitCode=0 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.046858 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerDied","Data":"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.046892 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" event={"ID":"9235118a-bc18-42b7-8968-8d1716aa58e6","Type":"ContainerDied","Data":"421d4397cda92e5572573f711e6f557fa679c4211ffd70cb0a30eb2d318ff390"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.046911 5030 scope.go:117] "RemoveContainer" containerID="078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.047062 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-79b6d76946-54s42" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.056418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerStarted","Data":"fc0ce133b045b63568d1b17b3af0a763ae05a71804fb6407998acf7b7d0be9dd"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.059729 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.059777 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.062640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerStarted","Data":"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.064821 5030 generic.go:334] "Generic (PLEG): container finished" podID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerID="6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066" exitCode=0 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.064915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerDied","Data":"6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.064972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerStarted","Data":"7e382be6fd371ec2fcd7680a7a578ff2105770fd2b2a6ca1009d05eb48c08aaa"} Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.070662 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.078302 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" podStartSLOduration=4.078284984 podStartE2EDuration="4.078284984s" podCreationTimestamp="2026-01-20 23:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:57.077097465 +0000 UTC m=+4529.397357753" watchObservedRunningTime="2026-01-20 23:50:57.078284984 +0000 UTC m=+4529.398545272" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.090315 5030 scope.go:117] "RemoveContainer" containerID="e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.111377 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=3.111361587 podStartE2EDuration="3.111361587s" podCreationTimestamp="2026-01-20 23:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:57.101233241 +0000 UTC m=+4529.421493539" watchObservedRunningTime="2026-01-20 23:50:57.111361587 +0000 UTC m=+4529.431621875" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 
23:50:57.120929 5030 scope.go:117] "RemoveContainer" containerID="078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145" Jan 20 23:50:57 crc kubenswrapper[5030]: E0120 23:50:57.121324 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145\": container with ID starting with 078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145 not found: ID does not exist" containerID="078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.121354 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145"} err="failed to get container status \"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145\": rpc error: code = NotFound desc = could not find container \"078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145\": container with ID starting with 078b4c220ea49991c2fd2e5c63d577629afa1a9bc49871f2340203ad8b972145 not found: ID does not exist" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.121376 5030 scope.go:117] "RemoveContainer" containerID="e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934" Jan 20 23:50:57 crc kubenswrapper[5030]: E0120 23:50:57.121782 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934\": container with ID starting with e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934 not found: ID does not exist" containerID="e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.121806 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934"} err="failed to get container status \"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934\": rpc error: code = NotFound desc = could not find container \"e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934\": container with ID starting with e4604a1a5e1055d193592b0018c10b47b6be4ee3b04193a2a20752fb258dd934 not found: ID does not exist" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.140011 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.209684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.243681 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-79b6d76946-54s42"] Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.258297 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.258545 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-central-agent" containerID="cri-o://8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093" gracePeriod=30 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.258693 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="proxy-httpd" containerID="cri-o://a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4" gracePeriod=30 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.258729 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="sg-core" containerID="cri-o://e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7" gracePeriod=30 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.258759 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-notification-agent" containerID="cri-o://92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72" gracePeriod=30 Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.776916 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.998442 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132cffa4-ee14-4f4d-a4f7-132d9ab4a552" path="/var/lib/kubelet/pods/132cffa4-ee14-4f4d-a4f7-132d9ab4a552/volumes" Jan 20 23:50:57 crc kubenswrapper[5030]: I0120 23:50:57.999480 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7338788b-d61c-4a48-8ec0-305003dc5397" path="/var/lib/kubelet/pods/7338788b-d61c-4a48-8ec0-305003dc5397/volumes" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.000234 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9235118a-bc18-42b7-8968-8d1716aa58e6" path="/var/lib/kubelet/pods/9235118a-bc18-42b7-8968-8d1716aa58e6/volumes" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.094325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerStarted","Data":"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671"} Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.113451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerStarted","Data":"6001742698e098c515b24cedb01ca7a6e21059956ff4d7773611300b09ff142c"} Jan 
20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.113587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerStarted","Data":"07a5a77c6849092bbaf34bb89cbb1c1c3abfdb6c450500709d2d2266d34e01b3"} Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.123173 5030 generic.go:334] "Generic (PLEG): container finished" podID="90dd7615-420e-4c08-a072-bb831bffaa87" containerID="a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4" exitCode=0 Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.123202 5030 generic.go:334] "Generic (PLEG): container finished" podID="90dd7615-420e-4c08-a072-bb831bffaa87" containerID="e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7" exitCode=2 Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.123211 5030 generic.go:334] "Generic (PLEG): container finished" podID="90dd7615-420e-4c08-a072-bb831bffaa87" containerID="8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093" exitCode=0 Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.124217 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerDied","Data":"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4"} Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.124247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerDied","Data":"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7"} Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.124258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerDied","Data":"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093"} Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.383142 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.570953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571378 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ctfq\" (UniqueName: \"kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571585 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle\") pod \"90dd7615-420e-4c08-a072-bb831bffaa87\" (UID: \"90dd7615-420e-4c08-a072-bb831bffaa87\") " Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.571909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.572181 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.572260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.576127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq" (OuterVolumeSpecName: "kube-api-access-2ctfq") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "kube-api-access-2ctfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.576728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts" (OuterVolumeSpecName: "scripts") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.614796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.641144 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.668092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674439 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/90dd7615-420e-4c08-a072-bb831bffaa87-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674474 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674489 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674501 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674512 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.674530 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ctfq\" (UniqueName: \"kubernetes.io/projected/90dd7615-420e-4c08-a072-bb831bffaa87-kube-api-access-2ctfq\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.715598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data" (OuterVolumeSpecName: "config-data") pod "90dd7615-420e-4c08-a072-bb831bffaa87" (UID: "90dd7615-420e-4c08-a072-bb831bffaa87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.776304 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90dd7615-420e-4c08-a072-bb831bffaa87-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:50:58 crc kubenswrapper[5030]: I0120 23:50:58.917336 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.139460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerStarted","Data":"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13"} Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.146964 5030 generic.go:334] "Generic (PLEG): container finished" podID="90dd7615-420e-4c08-a072-bb831bffaa87" containerID="92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72" exitCode=0 Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.147105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerDied","Data":"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72"} Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.147136 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.147156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"90dd7615-420e-4c08-a072-bb831bffaa87","Type":"ContainerDied","Data":"c130dd97c2e9d2d904967293b0e779e5788e6e15fe6a7443e51305baad8edd56"} Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.147222 5030 scope.go:117] "RemoveContainer" containerID="a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.150934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerStarted","Data":"d99eff37903cee5608b24fb64a68cb6530669b4caba4e1642f9039b72ff5ad58"} Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.169807 5030 scope.go:117] "RemoveContainer" containerID="e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.182591 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.182677 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.206352 5030 scope.go:117] "RemoveContainer" containerID="92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.210317 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.210303694 podStartE2EDuration="3.210303694s" podCreationTimestamp="2026-01-20 23:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:50:59.195454944 +0000 UTC m=+4531.515715242" watchObservedRunningTime="2026-01-20 23:50:59.210303694 +0000 UTC m=+4531.530563992" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.242290 5030 scope.go:117] "RemoveContainer" containerID="8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.244575 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.248487 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.253342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.260619 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.274898 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.275266 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-central-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275280 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-central-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.275303 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="sg-core" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275326 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="sg-core" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.275357 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="proxy-httpd" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275365 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="proxy-httpd" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.275376 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-notification-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-notification-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275573 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-central-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="ceilometer-notification-agent" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275603 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="proxy-httpd" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.275612 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" containerName="sg-core" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.277250 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.277272 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.278042 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.285181 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.285254 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.285375 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.290742 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.294454 5030 scope.go:117] "RemoveContainer" containerID="a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.296394 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4\": container with ID starting with a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4 not found: ID does not exist" containerID="a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.296435 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4"} err="failed to get container status \"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4\": rpc error: code = NotFound desc = could not find container \"a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4\": container with ID starting with a081c1c7ccdc4cb13791e724c739ec79d3301101fcdb06b56421f2dc0c9a48a4 not found: ID does not exist" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.296456 5030 scope.go:117] "RemoveContainer" containerID="e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.297562 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7\": container with ID starting with e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7 not found: ID does not exist" containerID="e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.297601 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7"} err="failed to get container status \"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7\": rpc error: code = NotFound desc = could not find container \"e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7\": container with ID starting with e041dd73fc1444a0956c52fe58129a69c5fa7a3c31ada76916a7e7c4bb1b49a7 not found: ID does not exist" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.297652 5030 scope.go:117] "RemoveContainer" containerID="92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.297904 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72\": container with ID starting with 92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72 not found: ID does not exist" containerID="92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.297921 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72"} err="failed to get container status \"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72\": rpc error: code = NotFound desc = could not find container \"92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72\": container with ID starting with 92cf2fb122f30e5c6a21d4e0e4baf83970b5790cf5d4a295edcace6a86415e72 not found: ID does not exist" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.297934 5030 scope.go:117] "RemoveContainer" containerID="8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.298095 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093\": container with ID starting with 8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093 not found: ID does not exist" containerID="8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.298111 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093"} err="failed to get container status \"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093\": rpc error: code = NotFound desc = could not find container \"8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093\": container with ID starting with 8b1845e8728d8152e80b8ed32bb158372b3cc417655fa2c42f9e0e6d2bad0093 not found: ID does not exist" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.344251 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.362165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.388706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.389983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz9n\" (UniqueName: \"kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.390271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.491850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.491892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.491915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.491941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.491997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.492032 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khz9n\" (UniqueName: \"kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.492060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.492090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.492798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.492898 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.495985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.496073 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.496097 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.496290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.498612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts\") pod \"ceilometer-0\" 
(UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.500257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.500488 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.517334 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz9n\" (UniqueName: \"kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n\") pod \"ceilometer-0\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.609789 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.803271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.962360 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:50:59 crc kubenswrapper[5030]: E0120 23:50:59.962759 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:50:59 crc kubenswrapper[5030]: I0120 23:50:59.980195 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90dd7615-420e-4c08-a072-bb831bffaa87" path="/var/lib/kubelet/pods/90dd7615-420e-4c08-a072-bb831bffaa87/volumes" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.055953 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:00 crc kubenswrapper[5030]: W0120 23:51:00.061464 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dbbe6f4_cd02_42d1_bc79_751984eb97c3.slice/crio-b93dd1484d8a81f88ce9a35353105992cdcf422fd2a00ee538bf004227c404fc WatchSource:0}: Error finding container b93dd1484d8a81f88ce9a35353105992cdcf422fd2a00ee538bf004227c404fc: Status 404 returned error can't find the container with id b93dd1484d8a81f88ce9a35353105992cdcf422fd2a00ee538bf004227c404fc Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.169511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerStarted","Data":"b93dd1484d8a81f88ce9a35353105992cdcf422fd2a00ee538bf004227c404fc"} Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.171561 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="7daf347c-1524-4050-b50e-deb2024da0cc" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" exitCode=1 Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.172965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerDied","Data":"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671"} Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.173005 5030 scope.go:117] "RemoveContainer" containerID="e6bfa4ce18ccb4bef2f10305bfdd261c586ba8e9d2496092c0df8e58dadb8812" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.174176 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.174398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.174851 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:51:00 crc kubenswrapper[5030]: E0120 23:51:00.175113 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.175257 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.175400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:51:00 crc kubenswrapper[5030]: I0120 23:51:00.961839 5030 scope.go:117] "RemoveContainer" containerID="0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc" Jan 20 23:51:01 crc kubenswrapper[5030]: I0120 23:51:01.183262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerStarted","Data":"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d"} Jan 20 23:51:01 crc kubenswrapper[5030]: I0120 23:51:01.188110 5030 generic.go:334] "Generic (PLEG): container finished" podID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerID="f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13" exitCode=0 Jan 20 23:51:01 crc kubenswrapper[5030]: I0120 23:51:01.188989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerDied","Data":"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13"} Jan 20 23:51:01 crc kubenswrapper[5030]: I0120 23:51:01.962577 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:01 crc kubenswrapper[5030]: E0120 23:51:01.962862 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:51:02 crc kubenswrapper[5030]: I0120 23:51:02.063644 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:51:02 crc kubenswrapper[5030]: I0120 23:51:02.067883 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:51:02 crc kubenswrapper[5030]: I0120 23:51:02.133542 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:51:02 crc kubenswrapper[5030]: I0120 23:51:02.137151 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:51:02 crc kubenswrapper[5030]: I0120 23:51:02.197698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerStarted","Data":"7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3"} Jan 20 23:51:03 crc kubenswrapper[5030]: I0120 23:51:03.917886 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:51:03 crc kubenswrapper[5030]: I0120 23:51:03.939228 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.226932 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" exitCode=1 Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.227051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerDied","Data":"7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3"} Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.227120 5030 scope.go:117] "RemoveContainer" containerID="0dc97d8d7ede5b59ab4837ed536481e436e23bbbe6faed6c35aaefc9c6dcb7dc" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.228396 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:51:04 crc kubenswrapper[5030]: E0120 23:51:04.229038 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.255978 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.371953 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" 
containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.200:9696/\": dial tcp 10.217.1.200:9696: connect: connection refused" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.497801 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.498112 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.799009 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.803468 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.803505 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.803840 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.804176 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:51:04 crc kubenswrapper[5030]: E0120 23:51:04.804448 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.904765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.904837 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.904860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.904909 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.905552 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.905598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.905642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rt76\" (UniqueName: \"kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76\") pod \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\" (UID: \"ada934e3-1bb4-44a0-ad44-771a35fcabf7\") " Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.918717 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.944284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76" (OuterVolumeSpecName: "kube-api-access-8rt76") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "kube-api-access-8rt76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.947059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:04 crc kubenswrapper[5030]: I0120 23:51:04.991075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.007849 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rt76\" (UniqueName: \"kubernetes.io/projected/ada934e3-1bb4-44a0-ad44-771a35fcabf7-kube-api-access-8rt76\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.007875 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.007886 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.022981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.035384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.035516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config" (OuterVolumeSpecName: "config") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.038969 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada934e3-1bb4-44a0-ad44-771a35fcabf7" (UID: "ada934e3-1bb4-44a0-ad44-771a35fcabf7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.111879 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.111914 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.111926 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.111937 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada934e3-1bb4-44a0-ad44-771a35fcabf7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.126387 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5568cd6b4b-ms4dw_d63800d7-7064-4d3f-9294-bcdbbd3957ec/neutron-api/0.log" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.126457 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213698 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6q8s\" (UniqueName: \"kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213913 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.213938 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle\") pod \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\" (UID: \"d63800d7-7064-4d3f-9294-bcdbbd3957ec\") " Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.218026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.218110 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s" (OuterVolumeSpecName: "kube-api-access-r6q8s") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "kube-api-access-r6q8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.240194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerStarted","Data":"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.255565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerStarted","Data":"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.257734 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-5568cd6b4b-ms4dw_d63800d7-7064-4d3f-9294-bcdbbd3957ec/neutron-api/0.log" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.257916 5030 generic.go:334] "Generic (PLEG): container finished" podID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerID="2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15" exitCode=137 Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.258064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerDied","Data":"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.258143 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.258510 5030 scope.go:117] "RemoveContainer" containerID="242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.258162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw" event={"ID":"d63800d7-7064-4d3f-9294-bcdbbd3957ec","Type":"ContainerDied","Data":"fafd40d87fe0776a710b68e7cf7c6c26032068bba803c3ada0ccd6419170a76d"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.259965 5030 generic.go:334] "Generic (PLEG): container finished" podID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerID="32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777" exitCode=0 Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.260008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerDied","Data":"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.260028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" event={"ID":"ada934e3-1bb4-44a0-ad44-771a35fcabf7","Type":"ContainerDied","Data":"4291686583adb894843dbf9b11b0e8912d7c2e46aff101885ffb53f8a66c552f"} Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.260014 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-78f766ffcd-dv8xk" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.266234 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.267141 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.268345 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.268553 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.282545 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-df6c9" podStartSLOduration=3.410079052 podStartE2EDuration="10.282530248s" podCreationTimestamp="2026-01-20 23:50:55 +0000 UTC" firstStartedPulling="2026-01-20 23:50:57.090398768 +0000 UTC m=+4529.410659056" lastFinishedPulling="2026-01-20 23:51:03.962849924 +0000 UTC m=+4536.283110252" observedRunningTime="2026-01-20 23:51:05.271686065 +0000 UTC m=+4537.591946373" watchObservedRunningTime="2026-01-20 
23:51:05.282530248 +0000 UTC m=+4537.602790536" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.298316 5030 scope.go:117] "RemoveContainer" containerID="2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.323683 5030 scope.go:117] "RemoveContainer" containerID="242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.323816 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.323845 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6q8s\" (UniqueName: \"kubernetes.io/projected/d63800d7-7064-4d3f-9294-bcdbbd3957ec-kube-api-access-r6q8s\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.324032 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde\": container with ID starting with 242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde not found: ID does not exist" containerID="242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.324058 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde"} err="failed to get container status \"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde\": rpc error: code = NotFound desc = could not find container \"242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde\": container with ID starting with 242993683d7222f4ec4755e00b402788f7faba656a52586cfc1ddacde6d5bbde not found: ID does not exist" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.324101 5030 scope.go:117] "RemoveContainer" containerID="2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.324263 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15\": container with ID starting with 2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15 not found: ID does not exist" containerID="2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.324287 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15"} err="failed to get container status \"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15\": rpc error: code = NotFound desc = could not find container \"2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15\": container with ID starting with 2f8257c9154705931fbf08d31a2b04c330f4551f96f3ed0732be9ece70049a15 not found: ID does not exist" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.324301 5030 scope.go:117] "RemoveContainer" containerID="7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.329706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config" (OuterVolumeSpecName: "config") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.330775 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.331526 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.347591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.358863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.361859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d63800d7-7064-4d3f-9294-bcdbbd3957ec" (UID: "d63800d7-7064-4d3f-9294-bcdbbd3957ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.368492 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-78f766ffcd-dv8xk"] Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.380295 5030 scope.go:117] "RemoveContainer" containerID="32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.398885 5030 scope.go:117] "RemoveContainer" containerID="7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.399772 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8\": container with ID starting with 7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8 not found: ID does not exist" containerID="7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.399806 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8"} err="failed to get container status \"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8\": rpc error: code = NotFound desc = could not find container \"7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8\": container with ID starting with 7f36d07fb47f8a6aea3d6480a30f89be95381550b0c9be70f2f2037486ec60a8 not found: ID does not exist" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.399875 5030 scope.go:117] "RemoveContainer" containerID="32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777" Jan 20 23:51:05 crc kubenswrapper[5030]: E0120 23:51:05.400252 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777\": container with ID starting with 32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777 not found: ID does not exist" containerID="32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.400308 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777"} err="failed to get container status \"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777\": rpc error: code = NotFound desc = could not find container \"32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777\": container with ID starting with 32226fd0c8a82e1ea67c45bbe6ce2bde02a5fd033c62eb3de88db4248fca5777 not found: ID does not exist" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.425484 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.425515 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.425526 5030 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.425540 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.425549 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63800d7-7064-4d3f-9294-bcdbbd3957ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.514859 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.514900 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.517988 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.518056 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.687787 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.695970 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-5568cd6b4b-ms4dw"] Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.906792 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.918892 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.977256 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" path="/var/lib/kubelet/pods/ada934e3-1bb4-44a0-ad44-771a35fcabf7/volumes" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.977943 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" path="/var/lib/kubelet/pods/d63800d7-7064-4d3f-9294-bcdbbd3957ec/volumes" Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.978588 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:51:05 crc kubenswrapper[5030]: I0120 23:51:05.979325 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" containerName="keystone-api" containerID="cri-o://3f6c7cb33ab3b87a8e27bc70591e84aa73c4cf65c60f08eb014c06c96a92cc58" 
gracePeriod=30 Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.275223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerStarted","Data":"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c"} Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.276071 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:51:06 crc kubenswrapper[5030]: E0120 23:51:06.276301 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.382949 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.392661 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.580075 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.580522 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" containerID="cri-o://802ab5c1c2e050a8a391ce1421faacfc7008363dd9ae9278c32e3b4a5668b5f9" gracePeriod=30 Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.580600 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" containerID="cri-o://cf4277b9970dab45af9d50c12645ba97abd7f0b0cad2e3a7288e663c8ecb365f" gracePeriod=30 Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.606981 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df6c9" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" probeResult="failure" output=< Jan 20 23:51:06 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:51:06 crc kubenswrapper[5030]: > Jan 20 23:51:06 crc kubenswrapper[5030]: I0120 23:51:06.618893 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": EOF" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.141507 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.141797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.290553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerStarted","Data":"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325"} Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.291851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.305970 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerID="802ab5c1c2e050a8a391ce1421faacfc7008363dd9ae9278c32e3b4a5668b5f9" exitCode=143 Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.306798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerDied","Data":"802ab5c1c2e050a8a391ce1421faacfc7008363dd9ae9278c32e3b4a5668b5f9"} Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.306833 5030 scope.go:117] "RemoveContainer" containerID="03b4cd2420f5137945cf116f9f72eb27ef3c09ae5682c6031cff9066b34979ca" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.326840 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.359258856 podStartE2EDuration="8.326819599s" podCreationTimestamp="2026-01-20 23:50:59 +0000 UTC" firstStartedPulling="2026-01-20 23:51:00.064346262 +0000 UTC m=+4532.384606550" lastFinishedPulling="2026-01-20 23:51:07.031907005 +0000 UTC m=+4539.352167293" observedRunningTime="2026-01-20 23:51:07.30955231 +0000 UTC m=+4539.629812598" watchObservedRunningTime="2026-01-20 23:51:07.326819599 +0000 UTC m=+4539.647079887" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.996252 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:51:07 crc kubenswrapper[5030]: E0120 23:51:07.999017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999045 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: E0120 23:51:07.999057 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-api" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999063 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-api" Jan 20 23:51:07 crc kubenswrapper[5030]: E0120 23:51:07.999086 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-api" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999092 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-api" Jan 20 23:51:07 crc kubenswrapper[5030]: E0120 23:51:07.999113 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999119 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999308 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999322 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-httpd" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999330 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63800d7-7064-4d3f-9294-bcdbbd3957ec" containerName="neutron-api" Jan 20 23:51:07 crc kubenswrapper[5030]: I0120 23:51:07.999347 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada934e3-1bb4-44a0-ad44-771a35fcabf7" containerName="neutron-api" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:07.999977 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.003352 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.005465 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.005603 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-lpzq6" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.009488 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.092192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.092282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.092350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xm5h\" (UniqueName: \"kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.092382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.154787 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.217:8774/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.154804 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.194558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xm5h\" (UniqueName: \"kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.194646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.194730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.194779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.195654 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.206206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.207948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.212791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xm5h\" (UniqueName: \"kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h\") pod \"openstackclient\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 
23:51:08.314796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:51:08 crc kubenswrapper[5030]: I0120 23:51:08.832425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.182691 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.122:5000/v3\": read tcp 10.217.0.2:44420->10.217.1.122:5000: read: connection reset by peer" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.333017 5030 generic.go:334] "Generic (PLEG): container finished" podID="9124c35a-a0a3-4adf-92cd-5d7011098204" containerID="3f6c7cb33ab3b87a8e27bc70591e84aa73c4cf65c60f08eb014c06c96a92cc58" exitCode=0 Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.333087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" event={"ID":"9124c35a-a0a3-4adf-92cd-5d7011098204","Type":"ContainerDied","Data":"3f6c7cb33ab3b87a8e27bc70591e84aa73c4cf65c60f08eb014c06c96a92cc58"} Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.340746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e","Type":"ContainerStarted","Data":"7a0678712472b5130cb0dc086ec6fd0a4428b165197d3f040a6d2bad8e988888"} Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.340810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e","Type":"ContainerStarted","Data":"8052caf8b1c3c83bbbd95831621726d185a00f2ff0b6aae6d1d4c2a7180215cc"} Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.360851 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=2.360835902 podStartE2EDuration="2.360835902s" podCreationTimestamp="2026-01-20 23:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:51:09.356959908 +0000 UTC m=+4541.677220196" watchObservedRunningTime="2026-01-20 23:51:09.360835902 +0000 UTC m=+4541.681096180" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.724400 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.827002 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.827776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnjn5\" (UniqueName: \"kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.827833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.827925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.827957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.828244 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.828272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.828617 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys\") pod \"9124c35a-a0a3-4adf-92cd-5d7011098204\" (UID: \"9124c35a-a0a3-4adf-92cd-5d7011098204\") " Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.832325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts" (OuterVolumeSpecName: "scripts") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.832605 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.836334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5" (OuterVolumeSpecName: "kube-api-access-tnjn5") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "kube-api-access-tnjn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.838819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.859859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.860710 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data" (OuterVolumeSpecName: "config-data") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.887688 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.898832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9124c35a-a0a3-4adf-92cd-5d7011098204" (UID: "9124c35a-a0a3-4adf-92cd-5d7011098204"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930841 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930873 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930883 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930894 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930904 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930913 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930922 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9124c35a-a0a3-4adf-92cd-5d7011098204-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:09 crc kubenswrapper[5030]: I0120 23:51:09.930933 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnjn5\" (UniqueName: \"kubernetes.io/projected/9124c35a-a0a3-4adf-92cd-5d7011098204-kube-api-access-tnjn5\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.015507 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": read tcp 10.217.0.2:53556->10.217.1.190:9311: read: connection reset by peer" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.015649 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.190:9311/healthcheck\": read tcp 10.217.0.2:53558->10.217.1.190:9311: read: connection reset by peer" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.299225 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6bd1362d-6fee-4d39-ad27-ba95334caeef"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6bd1362d-6fee-4d39-ad27-ba95334caeef] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6bd1362d_6fee_4d39_ad27_ba95334caeef.slice" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.351267 5030 generic.go:334] "Generic (PLEG): 
container finished" podID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerID="cf4277b9970dab45af9d50c12645ba97abd7f0b0cad2e3a7288e663c8ecb365f" exitCode=0 Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.351327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerDied","Data":"cf4277b9970dab45af9d50c12645ba97abd7f0b0cad2e3a7288e663c8ecb365f"} Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.351363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" event={"ID":"7c59fe1c-661b-4268-a33d-e9f87ec167eb","Type":"ContainerDied","Data":"6689e1c56b54a2951f9299ed2fb4a60a7bdb882bc64aefd868fdeb1ac8978756"} Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.351374 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6689e1c56b54a2951f9299ed2fb4a60a7bdb882bc64aefd868fdeb1ac8978756" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.351388 5030 scope.go:117] "RemoveContainer" containerID="aad49c5ee0f0980c146469d4264f259f604d654ee7afbf6b04e4731884cf6baa" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.353436 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.354056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-7fc69fb975-2tn7g" event={"ID":"9124c35a-a0a3-4adf-92cd-5d7011098204","Type":"ContainerDied","Data":"67f70121c9d064d309a201aa5a31d13ba6fde83ff52dac3b6851e7006a1e044f"} Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.402004 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.411814 5030 scope.go:117] "RemoveContainer" containerID="3f6c7cb33ab3b87a8e27bc70591e84aa73c4cf65c60f08eb014c06c96a92cc58" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.414968 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.422005 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-7fc69fb975-2tn7g"] Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.439950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv5bw\" (UniqueName: \"kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440252 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.440365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data\") pod \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\" (UID: \"7c59fe1c-661b-4268-a33d-e9f87ec167eb\") " Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.443166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs" (OuterVolumeSpecName: "logs") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.467262 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.497323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw" (OuterVolumeSpecName: "kube-api-access-cv5bw") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "kube-api-access-cv5bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.546391 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv5bw\" (UniqueName: \"kubernetes.io/projected/7c59fe1c-661b-4268-a33d-e9f87ec167eb-kube-api-access-cv5bw\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.546420 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.546450 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c59fe1c-661b-4268-a33d-e9f87ec167eb-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.559156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.576839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.589839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data" (OuterVolumeSpecName: "config-data") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.594979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7c59fe1c-661b-4268-a33d-e9f87ec167eb" (UID: "7c59fe1c-661b-4268-a33d-e9f87ec167eb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.647754 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.647791 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.647803 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.647813 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c59fe1c-661b-4268-a33d-e9f87ec167eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:10 crc kubenswrapper[5030]: I0120 23:51:10.962719 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:51:10 crc kubenswrapper[5030]: E0120 23:51:10.963084 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:51:11 crc kubenswrapper[5030]: I0120 23:51:11.365545 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-777db55c66-sxd4j" Jan 20 23:51:11 crc kubenswrapper[5030]: I0120 23:51:11.417924 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:51:11 crc kubenswrapper[5030]: I0120 23:51:11.426826 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-777db55c66-sxd4j"] Jan 20 23:51:11 crc kubenswrapper[5030]: I0120 23:51:11.972705 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" path="/var/lib/kubelet/pods/7c59fe1c-661b-4268-a33d-e9f87ec167eb/volumes" Jan 20 23:51:11 crc kubenswrapper[5030]: I0120 23:51:11.973495 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" path="/var/lib/kubelet/pods/9124c35a-a0a3-4adf-92cd-5d7011098204/volumes" Jan 20 23:51:12 crc kubenswrapper[5030]: I0120 23:51:12.963267 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:12 crc kubenswrapper[5030]: E0120 23:51:12.963781 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.283481 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:51:14 crc kubenswrapper[5030]: E0120 23:51:14.284083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: E0120 23:51:14.284106 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284114 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: E0120 23:51:14.284130 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: E0120 23:51:14.284145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: E0120 23:51:14.284178 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" 
containerName="keystone-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284184 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" containerName="keystone-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284347 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284360 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9124c35a-a0a3-4adf-92cd-5d7011098204" containerName="keystone-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284375 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284386 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.284397 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c59fe1c-661b-4268-a33d-e9f87ec167eb" containerName="barbican-api-log" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.285281 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.310305 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317397 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvkt\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.317551 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419277 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvkt\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419461 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.419526 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.420054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.420332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.425744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.425846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.425946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.426290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.428051 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.435279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvkt\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt\") pod \"swift-proxy-567c477ddd-tdrws\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.507946 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.509737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.520982 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:14 crc kubenswrapper[5030]: I0120 23:51:14.614744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:15 crc kubenswrapper[5030]: W0120 23:51:15.096885 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24160216_4080_4fa9_88c0_f89910f0a933.slice/crio-f8852c89a7930bfa0404d0bb8f24020b2bfb241a1717436a1813a1c6008b8e3a WatchSource:0}: Error finding container f8852c89a7930bfa0404d0bb8f24020b2bfb241a1717436a1813a1c6008b8e3a: Status 404 returned error can't find the container with id f8852c89a7930bfa0404d0bb8f24020b2bfb241a1717436a1813a1c6008b8e3a Jan 20 23:51:15 crc kubenswrapper[5030]: I0120 23:51:15.099519 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:51:15 crc kubenswrapper[5030]: I0120 23:51:15.411088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerStarted","Data":"f8852c89a7930bfa0404d0bb8f24020b2bfb241a1717436a1813a1c6008b8e3a"} Jan 20 23:51:15 crc kubenswrapper[5030]: I0120 23:51:15.548115 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.422605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerStarted","Data":"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d"} Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.422657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerStarted","Data":"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65"} Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.422682 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 
23:51:16.422698 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.441867 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" podStartSLOduration=2.441845747 podStartE2EDuration="2.441845747s" podCreationTimestamp="2026-01-20 23:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:51:16.441696254 +0000 UTC m=+4548.761956542" watchObservedRunningTime="2026-01-20 23:51:16.441845747 +0000 UTC m=+4548.762106055" Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.587331 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df6c9" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" probeResult="failure" output=< Jan 20 23:51:16 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 20 23:51:16 crc kubenswrapper[5030]: > Jan 20 23:51:16 crc kubenswrapper[5030]: I0120 23:51:16.962380 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:51:16 crc kubenswrapper[5030]: E0120 23:51:16.962740 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-scheduler-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-scheduler-scheduler pod=nova-scheduler-0_openstack-kuttl-tests(7daf347c-1524-4050-b50e-deb2024da0cc)\"" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.147778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.148267 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.150709 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.153736 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.428912 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.433902 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.789103 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.208:9696/\": dial tcp 10.217.1.208:9696: connect: connection refused" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.821523 5030 scope.go:117] "RemoveContainer" containerID="32b66f89314cc7517837e61d98b70ca2ab834e88a6527f2a3fe2792e84855aef" Jan 20 23:51:17 crc kubenswrapper[5030]: I0120 23:51:17.847013 5030 scope.go:117] "RemoveContainer" containerID="a1aa7fac800cf0fad77a25577024fa04cb0bf4062cc984176e43fa01e676bda9" Jan 20 
23:51:18 crc kubenswrapper[5030]: I0120 23:51:18.310047 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:51:18 crc kubenswrapper[5030]: I0120 23:51:18.374186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:51:18 crc kubenswrapper[5030]: I0120 23:51:18.378427 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-api" containerID="cri-o://296fedc1bd542169ff9175e281777f11d5558ffe31dda5bcb1a4e4db1759e2e0" gracePeriod=30 Jan 20 23:51:18 crc kubenswrapper[5030]: I0120 23:51:18.378767 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-httpd" containerID="cri-o://ddc74a644d7ab5111265bc9f9536df25bbc79c7022f36f8e1bbb5069f4cae5f8" gracePeriod=30 Jan 20 23:51:19 crc kubenswrapper[5030]: I0120 23:51:19.445988 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerID="ddc74a644d7ab5111265bc9f9536df25bbc79c7022f36f8e1bbb5069f4cae5f8" exitCode=0 Jan 20 23:51:19 crc kubenswrapper[5030]: I0120 23:51:19.446076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerDied","Data":"ddc74a644d7ab5111265bc9f9536df25bbc79c7022f36f8e1bbb5069f4cae5f8"} Jan 20 23:51:19 crc kubenswrapper[5030]: I0120 23:51:19.962547 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:51:19 crc kubenswrapper[5030]: E0120 23:51:19.963067 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-conductor-conductor\" with CrashLoopBackOff: \"back-off 20s restarting failed container=nova-cell1-conductor-conductor pod=nova-cell1-conductor-0_openstack-kuttl-tests(65b5db4a-e775-4981-a7fd-e2d8bd56f334)\"" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.363098 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-579d49bbb6-jhfjg_cdb5a54a-8811-4be4-a4ba-63f879326dab/neutron-api/0.log" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.363386 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.456861 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.456919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.456955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.457071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.457845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpknm\" (UniqueName: \"kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.457870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.457917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs\") pod \"cdb5a54a-8811-4be4-a4ba-63f879326dab\" (UID: \"cdb5a54a-8811-4be4-a4ba-63f879326dab\") " Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.473838 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-579d49bbb6-jhfjg_cdb5a54a-8811-4be4-a4ba-63f879326dab/neutron-api/0.log" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.473886 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerID="5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861" exitCode=137 Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.473915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerDied","Data":"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861"} Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.473943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" 
event={"ID":"cdb5a54a-8811-4be4-a4ba-63f879326dab","Type":"ContainerDied","Data":"34a89d4484f22d9a1b6e0651f0100f517957ba0ae48ce384870c6df350dde414"} Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.473961 5030 scope.go:117] "RemoveContainer" containerID="deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.474080 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-579d49bbb6-jhfjg" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.483795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm" (OuterVolumeSpecName: "kube-api-access-gpknm") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "kube-api-access-gpknm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.495830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.548862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.557279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config" (OuterVolumeSpecName: "config") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.562119 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.562147 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.562168 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpknm\" (UniqueName: \"kubernetes.io/projected/cdb5a54a-8811-4be4-a4ba-63f879326dab-kube-api-access-gpknm\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.562180 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.563754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.578570 5030 scope.go:117] "RemoveContainer" containerID="5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.585521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.601855 5030 scope.go:117] "RemoveContainer" containerID="deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74" Jan 20 23:51:20 crc kubenswrapper[5030]: E0120 23:51:20.602459 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74\": container with ID starting with deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74 not found: ID does not exist" containerID="deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.602487 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74"} err="failed to get container status \"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74\": rpc error: code = NotFound desc = could not find container \"deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74\": container with ID starting with deb36e3c895d0c2d04c59cdd1fd41e349f7dbe36f3fb36c16343f022afbf5d74 not found: ID does not exist" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.602506 5030 scope.go:117] "RemoveContainer" containerID="5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861" Jan 20 23:51:20 crc kubenswrapper[5030]: E0120 23:51:20.603521 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861\": container with ID starting with 5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861 not found: ID does not exist" containerID="5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.603549 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861"} err="failed to get container status \"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861\": rpc error: code = NotFound desc = could not find container \"5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861\": container with ID starting with 5a5c90413d6548389b806d15aa0f8c2356f66c21b27ad6d573ec322779a52861 not found: ID does not exist" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.607388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cdb5a54a-8811-4be4-a4ba-63f879326dab" (UID: "cdb5a54a-8811-4be4-a4ba-63f879326dab"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.664139 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.664183 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.664198 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdb5a54a-8811-4be4-a4ba-63f879326dab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.822266 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:51:20 crc kubenswrapper[5030]: I0120 23:51:20.832552 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-579d49bbb6-jhfjg"] Jan 20 23:51:21 crc kubenswrapper[5030]: I0120 23:51:21.962550 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:51:21 crc kubenswrapper[5030]: E0120 23:51:21.963984 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-conductor\" with CrashLoopBackOff: \"back-off 40s restarting failed container=nova-cell0-conductor-conductor pod=nova-cell0-conductor-0_openstack-kuttl-tests(87584043-7d0e-475d-89ec-2f743b49b145)\"" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" Jan 20 23:51:21 crc kubenswrapper[5030]: I0120 23:51:21.974424 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" path="/var/lib/kubelet/pods/cdb5a54a-8811-4be4-a4ba-63f879326dab/volumes" Jan 20 23:51:23 crc kubenswrapper[5030]: I0120 23:51:23.962537 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:23 crc kubenswrapper[5030]: E0120 23:51:23.963268 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:51:24 crc kubenswrapper[5030]: I0120 23:51:24.623988 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:24 crc kubenswrapper[5030]: I0120 23:51:24.635715 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:51:24 crc kubenswrapper[5030]: I0120 23:51:24.749674 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:51:24 crc kubenswrapper[5030]: I0120 23:51:24.751531 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" 
podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-httpd" containerID="cri-o://1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" gracePeriod=30 Jan 20 23:51:24 crc kubenswrapper[5030]: I0120 23:51:24.752070 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-server" containerID="cri-o://5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" gracePeriod=30 Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.528489 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533617 5030 generic.go:334] "Generic (PLEG): container finished" podID="3568f883-8a26-4dae-b664-474faca62644" containerID="5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" exitCode=0 Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533668 5030 generic.go:334] "Generic (PLEG): container finished" podID="3568f883-8a26-4dae-b664-474faca62644" containerID="1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" exitCode=0 Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533688 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerDied","Data":"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa"} Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerDied","Data":"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75"} Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc" event={"ID":"3568f883-8a26-4dae-b664-474faca62644","Type":"ContainerDied","Data":"34d6ec9ef98aaec25c7575000309cb71c1632121138c7fc69ba6fba4fed06c17"} Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.533776 5030 scope.go:117] "RemoveContainer" containerID="5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.557028 5030 scope.go:117] "RemoveContainer" containerID="1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.577904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.591707 5030 scope.go:117] "RemoveContainer" containerID="5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" Jan 20 23:51:25 crc kubenswrapper[5030]: E0120 23:51:25.595314 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa\": container with ID starting with 5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa not found: ID does not exist" 
containerID="5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.595360 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa"} err="failed to get container status \"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa\": rpc error: code = NotFound desc = could not find container \"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa\": container with ID starting with 5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa not found: ID does not exist" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.595391 5030 scope.go:117] "RemoveContainer" containerID="1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" Jan 20 23:51:25 crc kubenswrapper[5030]: E0120 23:51:25.595788 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75\": container with ID starting with 1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75 not found: ID does not exist" containerID="1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.595825 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75"} err="failed to get container status \"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75\": rpc error: code = NotFound desc = could not find container \"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75\": container with ID starting with 1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75 not found: ID does not exist" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.595844 5030 scope.go:117] "RemoveContainer" containerID="5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.596209 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa"} err="failed to get container status \"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa\": rpc error: code = NotFound desc = could not find container \"5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa\": container with ID starting with 5d7981ed7f523908dc149a1e04ebc4ad1c5758e1c3075fa1593657c6b3b268aa not found: ID does not exist" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.596247 5030 scope.go:117] "RemoveContainer" containerID="1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.596601 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75"} err="failed to get container status \"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75\": rpc error: code = NotFound desc = could not find container \"1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75\": container with ID starting with 1fbdcca0323871db3ee5d5a6e04d263a30e83d05098d4fb6f889130835adff75 not found: ID does not exist" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.639853 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csbcb\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.667931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle\") pod \"3568f883-8a26-4dae-b664-474faca62644\" (UID: \"3568f883-8a26-4dae-b664-474faca62644\") " Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.669001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.671520 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.677524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.689375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb" (OuterVolumeSpecName: "kube-api-access-csbcb") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "kube-api-access-csbcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.723670 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data" (OuterVolumeSpecName: "config-data") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.730296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.735396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.742920 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3568f883-8a26-4dae-b664-474faca62644" (UID: "3568f883-8a26-4dae-b664-474faca62644"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770284 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770320 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770332 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770342 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770351 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3568f883-8a26-4dae-b664-474faca62644-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770362 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csbcb\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-kube-api-access-csbcb\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770372 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3568f883-8a26-4dae-b664-474faca62644-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.770381 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3568f883-8a26-4dae-b664-474faca62644-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.885518 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.897010 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-85b58b6496-6nkdc"] Jan 20 23:51:25 crc kubenswrapper[5030]: I0120 23:51:25.972790 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3568f883-8a26-4dae-b664-474faca62644" path="/var/lib/kubelet/pods/3568f883-8a26-4dae-b664-474faca62644/volumes" Jan 20 23:51:26 crc kubenswrapper[5030]: I0120 23:51:26.355701 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:51:27 crc kubenswrapper[5030]: I0120 23:51:27.552772 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-df6c9" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" containerID="cri-o://789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67" gracePeriod=2 Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.064359 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.127063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities\") pod \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.127865 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kld27\" (UniqueName: \"kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27\") pod \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.128016 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content\") pod \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\" (UID: \"348859b8-66ca-4e3e-a7b1-ad96fe0858a5\") " Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.128096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities" (OuterVolumeSpecName: "utilities") pod "348859b8-66ca-4e3e-a7b1-ad96fe0858a5" (UID: "348859b8-66ca-4e3e-a7b1-ad96fe0858a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.128593 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.134787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27" (OuterVolumeSpecName: "kube-api-access-kld27") pod "348859b8-66ca-4e3e-a7b1-ad96fe0858a5" (UID: "348859b8-66ca-4e3e-a7b1-ad96fe0858a5"). InnerVolumeSpecName "kube-api-access-kld27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.231603 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kld27\" (UniqueName: \"kubernetes.io/projected/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-kube-api-access-kld27\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.248753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "348859b8-66ca-4e3e-a7b1-ad96fe0858a5" (UID: "348859b8-66ca-4e3e-a7b1-ad96fe0858a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.333533 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/348859b8-66ca-4e3e-a7b1-ad96fe0858a5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.564568 5030 generic.go:334] "Generic (PLEG): container finished" podID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerID="789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67" exitCode=0 Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.564659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerDied","Data":"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67"} Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.564698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df6c9" event={"ID":"348859b8-66ca-4e3e-a7b1-ad96fe0858a5","Type":"ContainerDied","Data":"7e382be6fd371ec2fcd7680a7a578ff2105770fd2b2a6ca1009d05eb48c08aaa"} Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.564720 5030 scope.go:117] "RemoveContainer" containerID="789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.564663 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df6c9" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.607880 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.615473 5030 scope.go:117] "RemoveContainer" containerID="f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.621047 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-df6c9"] Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.652257 5030 scope.go:117] "RemoveContainer" containerID="6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.703535 5030 scope.go:117] "RemoveContainer" containerID="789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67" Jan 20 23:51:28 crc kubenswrapper[5030]: E0120 23:51:28.704024 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67\": container with ID starting with 789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67 not found: ID does not exist" containerID="789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.704057 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67"} err="failed to get container status \"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67\": rpc error: code = NotFound desc = could not find container \"789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67\": container with ID starting with 789d3935cc8be00f3dcb69fe47911653973427c3760d031b80db55ab911cec67 not found: ID does not exist" Jan 20 23:51:28 crc 
kubenswrapper[5030]: I0120 23:51:28.704078 5030 scope.go:117] "RemoveContainer" containerID="f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13" Jan 20 23:51:28 crc kubenswrapper[5030]: E0120 23:51:28.704310 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13\": container with ID starting with f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13 not found: ID does not exist" containerID="f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.704340 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13"} err="failed to get container status \"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13\": rpc error: code = NotFound desc = could not find container \"f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13\": container with ID starting with f63d199113626361c3dcb63f16abdb9f62d684f2a0226aa027e8a5308befce13 not found: ID does not exist" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.704359 5030 scope.go:117] "RemoveContainer" containerID="6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066" Jan 20 23:51:28 crc kubenswrapper[5030]: E0120 23:51:28.704765 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066\": container with ID starting with 6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066 not found: ID does not exist" containerID="6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066" Jan 20 23:51:28 crc kubenswrapper[5030]: I0120 23:51:28.704795 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066"} err="failed to get container status \"6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066\": rpc error: code = NotFound desc = could not find container \"6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066\": container with ID starting with 6c42e805eba7995a0da4e140a5b1d86e191460ec7d618217ad81da7a94bf2066 not found: ID does not exist" Jan 20 23:51:29 crc kubenswrapper[5030]: I0120 23:51:29.629433 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:29 crc kubenswrapper[5030]: I0120 23:51:29.991563 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" path="/var/lib/kubelet/pods/348859b8-66ca-4e3e-a7b1-ad96fe0858a5/volumes" Jan 20 23:51:30 crc kubenswrapper[5030]: I0120 23:51:30.962121 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:51:31 crc kubenswrapper[5030]: I0120 23:51:31.592710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerStarted","Data":"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19"} Jan 20 23:51:31 crc kubenswrapper[5030]: I0120 23:51:31.752843 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:31 
crc kubenswrapper[5030]: I0120 23:51:31.753252 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-central-agent" containerID="cri-o://8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d" gracePeriod=30 Jan 20 23:51:31 crc kubenswrapper[5030]: I0120 23:51:31.753324 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="proxy-httpd" containerID="cri-o://26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325" gracePeriod=30 Jan 20 23:51:31 crc kubenswrapper[5030]: I0120 23:51:31.753319 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-notification-agent" containerID="cri-o://7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7" gracePeriod=30 Jan 20 23:51:31 crc kubenswrapper[5030]: I0120 23:51:31.753473 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="sg-core" containerID="cri-o://32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c" gracePeriod=30 Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.604740 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerID="26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325" exitCode=0 Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.604995 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerID="32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c" exitCode=2 Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.605009 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerID="8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d" exitCode=0 Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.604807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerDied","Data":"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325"} Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.605048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerDied","Data":"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c"} Jan 20 23:51:32 crc kubenswrapper[5030]: I0120 23:51:32.605079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerDied","Data":"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d"} Jan 20 23:51:33 crc kubenswrapper[5030]: I0120 23:51:33.961994 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:51:34 crc kubenswrapper[5030]: I0120 23:51:34.625209 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" 
event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerStarted","Data":"3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616"} Jan 20 23:51:34 crc kubenswrapper[5030]: I0120 23:51:34.625923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:51:34 crc kubenswrapper[5030]: I0120 23:51:34.802880 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:34 crc kubenswrapper[5030]: I0120 23:51:34.802969 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:34 crc kubenswrapper[5030]: I0120 23:51:34.868232 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:35 crc kubenswrapper[5030]: I0120 23:51:35.669926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.582297 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.651179 5030 generic.go:334] "Generic (PLEG): container finished" podID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerID="7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7" exitCode=0 Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.651341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerDied","Data":"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7"} Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.651383 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"5dbbe6f4-cd02-42d1-bc79-751984eb97c3","Type":"ContainerDied","Data":"b93dd1484d8a81f88ce9a35353105992cdcf422fd2a00ee538bf004227c404fc"} Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.651402 5030 scope.go:117] "RemoveContainer" containerID="26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.651557 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.674583 5030 scope.go:117] "RemoveContainer" containerID="32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.697027 5030 scope.go:117] "RemoveContainer" containerID="7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khz9n\" (UniqueName: \"kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708262 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708459 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.708688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd\") pod \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\" (UID: \"5dbbe6f4-cd02-42d1-bc79-751984eb97c3\") " Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.709962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: 
"5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.721944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.725640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n" (OuterVolumeSpecName: "kube-api-access-khz9n") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "kube-api-access-khz9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.728777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts" (OuterVolumeSpecName: "scripts") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.737753 5030 scope.go:117] "RemoveContainer" containerID="8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.754376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.772333 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815516 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815543 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815552 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815562 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khz9n\" (UniqueName: \"kubernetes.io/projected/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-kube-api-access-khz9n\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815570 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.815579 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.823697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.827802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data" (OuterVolumeSpecName: "config-data") pod "5dbbe6f4-cd02-42d1-bc79-751984eb97c3" (UID: "5dbbe6f4-cd02-42d1-bc79-751984eb97c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.897480 5030 scope.go:117] "RemoveContainer" containerID="26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325" Jan 20 23:51:36 crc kubenswrapper[5030]: E0120 23:51:36.897924 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325\": container with ID starting with 26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325 not found: ID does not exist" containerID="26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.897965 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325"} err="failed to get container status \"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325\": rpc error: code = NotFound desc = could not find container \"26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325\": container with ID starting with 26908818d2277cf1e3a58eea5d3eccff77e751e9bc4e5e3766cb0ad235b94325 not found: ID does not exist" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.897991 5030 scope.go:117] "RemoveContainer" containerID="32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c" Jan 20 23:51:36 crc kubenswrapper[5030]: E0120 23:51:36.898357 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c\": container with ID starting with 32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c not found: ID does not exist" containerID="32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.898382 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c"} err="failed to get container status \"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c\": rpc error: code = NotFound desc = could not find container \"32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c\": container with ID starting with 32b34e0c53e081418b51f9f54af370209ce26fab6baf0835c1eec3d7915df72c not found: ID does not exist" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.898395 5030 scope.go:117] "RemoveContainer" containerID="7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7" Jan 20 23:51:36 crc kubenswrapper[5030]: E0120 23:51:36.898792 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7\": container with ID starting with 7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7 not found: ID does not exist" containerID="7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.898825 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7"} err="failed to get container status \"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7\": rpc error: code = NotFound desc = could not 
find container \"7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7\": container with ID starting with 7a42a417626dcbf1aa66434c1b654811344700544ec890d50c16d3ab2b26a4d7 not found: ID does not exist" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.898846 5030 scope.go:117] "RemoveContainer" containerID="8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d" Jan 20 23:51:36 crc kubenswrapper[5030]: E0120 23:51:36.899793 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d\": container with ID starting with 8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d not found: ID does not exist" containerID="8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.899815 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d"} err="failed to get container status \"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d\": rpc error: code = NotFound desc = could not find container \"8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d\": container with ID starting with 8048071a865ac6d4595f4e10540d7c5f758927588e82af002d9b212761a6d12d not found: ID does not exist" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.916728 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.916764 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbbe6f4-cd02-42d1-bc79-751984eb97c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.961563 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:51:36 crc kubenswrapper[5030]: I0120 23:51:36.963028 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:36 crc kubenswrapper[5030]: E0120 23:51:36.964016 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.016210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.044685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.050828 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051197 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-api" Jan 20 23:51:37 crc kubenswrapper[5030]: 
I0120 23:51:37.051214 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-api" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051228 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-central-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.051239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-central-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051247 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.051253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051264 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.051270 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051284 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.051289 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-notification-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.051304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-notification-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.051315 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.108554 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.108611 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="sg-core" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.108634 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="sg-core" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.108656 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-server" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.108662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-server" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.108674 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="extract-content" Jan 20 23:51:37 crc 
kubenswrapper[5030]: I0120 23:51:37.108681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="extract-content" Jan 20 23:51:37 crc kubenswrapper[5030]: E0120 23:51:37.108699 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="extract-utilities" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.108705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="extract-utilities" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.108998 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-api" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109018 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109028 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109038 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="sg-core" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109053 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb5a54a-8811-4be4-a4ba-63f879326dab" containerName="neutron-httpd" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109061 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-notification-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109073 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="348859b8-66ca-4e3e-a7b1-ad96fe0858a5" containerName="registry-server" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109087 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3568f883-8a26-4dae-b664-474faca62644" containerName="proxy-server" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.109098 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" containerName="ceilometer-central-agent" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.110690 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.110772 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.114358 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.114498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.114602 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.224586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.224952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6px\" (UniqueName: \"kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225221 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225319 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.225460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326778 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.326985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.327026 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.327064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6px\" (UniqueName: \"kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.327690 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.328234 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.333325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.334084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.334429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.335950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.337234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.350292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6px\" (UniqueName: \"kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px\") pod \"ceilometer-0\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.485830 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.676351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerStarted","Data":"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d"} Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.677440 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.989153 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbbe6f4-cd02-42d1-bc79-751984eb97c3" path="/var/lib/kubelet/pods/5dbbe6f4-cd02-42d1-bc79-751984eb97c3/volumes" Jan 20 23:51:37 crc kubenswrapper[5030]: I0120 23:51:37.989895 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:51:38 crc kubenswrapper[5030]: I0120 23:51:38.694056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerStarted","Data":"1950625fb18462814c13c211ec225c203ec0069591e8f5804442ccb5385c35fd"} Jan 20 23:51:39 crc kubenswrapper[5030]: I0120 23:51:39.706591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerStarted","Data":"02fae0c28694c60d27f62274bb757b8a4c5d3975c8580e05c8edede84309f354"} Jan 20 23:51:39 crc kubenswrapper[5030]: I0120 23:51:39.706963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerStarted","Data":"ec915196559782a71aca3bf84560183d6b635d2588eb3d679d36d2bb6b7a0c3b"} Jan 20 23:51:40 crc kubenswrapper[5030]: I0120 23:51:40.718394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerStarted","Data":"8031dce4c85b09f1d38bba5769584ff64416d46b3003775991a52eb69e88f932"} Jan 20 23:51:40 crc kubenswrapper[5030]: I0120 23:51:40.953778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:51:41 crc kubenswrapper[5030]: I0120 23:51:41.123511 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.123:9696/\": dial tcp 10.217.1.123:9696: connect: connection refused" Jan 20 23:51:42 crc kubenswrapper[5030]: I0120 23:51:42.767916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerStarted","Data":"048f7b423760b667b60e4906fd1a0700771bb5250eff55c5e498304ebcbb54a2"} Jan 20 23:51:42 crc kubenswrapper[5030]: I0120 23:51:42.769506 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:51:42 crc kubenswrapper[5030]: I0120 23:51:42.812135 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.216844609 podStartE2EDuration="5.812117754s" podCreationTimestamp="2026-01-20 23:51:37 
+0000 UTC" firstStartedPulling="2026-01-20 23:51:37.997928309 +0000 UTC m=+4570.318188597" lastFinishedPulling="2026-01-20 23:51:41.593201444 +0000 UTC m=+4573.913461742" observedRunningTime="2026-01-20 23:51:42.803244789 +0000 UTC m=+4575.123505077" watchObservedRunningTime="2026-01-20 23:51:42.812117754 +0000 UTC m=+4575.132378032" Jan 20 23:51:46 crc kubenswrapper[5030]: I0120 23:51:46.605729 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:51:47 crc kubenswrapper[5030]: I0120 23:51:47.969742 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:47 crc kubenswrapper[5030]: E0120 23:51:47.970449 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:51:48 crc kubenswrapper[5030]: I0120 23:51:48.824636 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-8d6495cc4-tcgn7_5a76a052-42f5-44fb-b981-dffa9b886afe/neutron-api/0.log" Jan 20 23:51:48 crc kubenswrapper[5030]: I0120 23:51:48.824888 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerID="296fedc1bd542169ff9175e281777f11d5558ffe31dda5bcb1a4e4db1759e2e0" exitCode=137 Jan 20 23:51:48 crc kubenswrapper[5030]: I0120 23:51:48.824914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerDied","Data":"296fedc1bd542169ff9175e281777f11d5558ffe31dda5bcb1a4e4db1759e2e0"} Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.644558 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-8d6495cc4-tcgn7_5a76a052-42f5-44fb-b981-dffa9b886afe/neutron-api/0.log" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.644889 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w65n\" (UniqueName: \"kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.656707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs\") pod \"5a76a052-42f5-44fb-b981-dffa9b886afe\" (UID: \"5a76a052-42f5-44fb-b981-dffa9b886afe\") " Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.678287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.726918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n" (OuterVolumeSpecName: "kube-api-access-5w65n") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "kube-api-access-5w65n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.758575 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.758726 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w65n\" (UniqueName: \"kubernetes.io/projected/5a76a052-42f5-44fb-b981-dffa9b886afe-kube-api-access-5w65n\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.783312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.788822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.795651 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config" (OuterVolumeSpecName: "config") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.798217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.834834 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_neutron-8d6495cc4-tcgn7_5a76a052-42f5-44fb-b981-dffa9b886afe/neutron-api/0.log" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.834886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" event={"ID":"5a76a052-42f5-44fb-b981-dffa9b886afe","Type":"ContainerDied","Data":"92aedd641ddc844f9250dfff551102007a7b55500847f45450a7d051fb61bdff"} Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.834927 5030 scope.go:117] "RemoveContainer" containerID="ddc74a644d7ab5111265bc9f9536df25bbc79c7022f36f8e1bbb5069f4cae5f8" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.835055 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8d6495cc4-tcgn7" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.850642 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5a76a052-42f5-44fb-b981-dffa9b886afe" (UID: "5a76a052-42f5-44fb-b981-dffa9b886afe"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.856438 5030 scope.go:117] "RemoveContainer" containerID="296fedc1bd542169ff9175e281777f11d5558ffe31dda5bcb1a4e4db1759e2e0" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.860842 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.860866 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.860876 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.860887 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:49 crc kubenswrapper[5030]: I0120 23:51:49.860896 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a76a052-42f5-44fb-b981-dffa9b886afe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:51:50 crc kubenswrapper[5030]: I0120 23:51:50.163415 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:51:50 crc kubenswrapper[5030]: I0120 23:51:50.175123 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-8d6495cc4-tcgn7"] Jan 20 23:51:51 crc kubenswrapper[5030]: I0120 23:51:51.985823 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" path="/var/lib/kubelet/pods/5a76a052-42f5-44fb-b981-dffa9b886afe/volumes" Jan 20 23:51:59 crc kubenswrapper[5030]: I0120 23:51:59.962859 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:51:59 crc kubenswrapper[5030]: E0120 23:51:59.964030 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:52:07 crc kubenswrapper[5030]: I0120 23:52:07.498199 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:52:12 crc kubenswrapper[5030]: I0120 23:52:12.962595 5030 scope.go:117] 
"RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:52:12 crc kubenswrapper[5030]: E0120 23:52:12.963342 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:52:18 crc kubenswrapper[5030]: I0120 23:52:18.166689 5030 scope.go:117] "RemoveContainer" containerID="2d2794b65b2db7f4a9c84934ca6eef3c2896bd5ee5b8868af3e0f7db3cbf6b41" Jan 20 23:52:26 crc kubenswrapper[5030]: I0120 23:52:26.963220 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:52:26 crc kubenswrapper[5030]: E0120 23:52:26.966027 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.280211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.280687 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" containerName="openstackclient" containerID="cri-o://7a0678712472b5130cb0dc086ec6fd0a4428b165197d3f040a6d2bad8e988888" gracePeriod=2 Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.336675 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.380000 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.486907 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.496221 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="probe" containerID="cri-o://9df222c59711d3fd2a93db293137b68d56cc3f6dc8f6d95807cf89f2a8e746bc" gracePeriod=30 Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.528999 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" containerID="cri-o://b39481b9df711fc23477860a389138de4b98dc604215528415d3e6762dbf8608" gracePeriod=30 Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.540522 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.540587 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data podName:dd483381-3919-4ade-ac69-c8abde2869a6 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.040568435 +0000 UTC m=+4621.360828723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data") pod "rabbitmq-server-0" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6") : configmap "rabbitmq-config-data" not found Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.566710 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-846584955c-2ghpj"] Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.567164 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-api" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567177 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-api" Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.567208 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-httpd" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-httpd" Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.567234 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" containerName="openstackclient" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567240 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" containerName="openstackclient" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567413 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-api" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567426 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" containerName="openstackclient" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.567447 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a76a052-42f5-44fb-b981-dffa9b886afe" containerName="neutron-httpd" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.568463 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.581160 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.582883 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.627818 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.629095 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.631637 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.644105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.644272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlktx\" (UniqueName: \"kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.644296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.644386 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.644419 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.663756 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.683183 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.710083 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-846584955c-2ghpj"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.730524 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-tgj5m"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.742763 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pm9\" (UniqueName: 
\"kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747195 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znn5d\" (UniqueName: \"kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") pod 
\"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747557 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlktx\" (UniqueName: \"kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.747656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.748250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.749768 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.749833 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle podName:3a438c50-84b9-49fb-9f49-0cfdbe2bab0a nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.24981383 +0000 UTC m=+4621.570074118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle") pod "barbican-worker-846584955c-2ghpj" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a") : secret "combined-ca-bundle" not found Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.757177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.766684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8ftc"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.784685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-g8ftc"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.787380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.820861 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.821159 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api-log" containerID="cri-o://413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54" gracePeriod=30 Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.821359 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" containerID="cri-o://882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1" gracePeriod=30 Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.826063 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlktx\" (UniqueName: \"kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pm9\" (UniqueName: \"kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854831 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854880 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854917 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.854941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znn5d\" (UniqueName: \"kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.856009 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.856231 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:28 crc kubenswrapper[5030]: E0120 23:52:28.856308 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle podName:21ceeb6e-ad82-423c-8a0c-a946171a2b5d nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.356288903 +0000 UTC m=+4621.676549191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle") pod "barbican-keystone-listener-79545f7658-lhclg" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d") : secret "combined-ca-bundle" not found Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.856732 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.861734 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.862414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.895629 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.896923 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.898234 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znn5d\" (UniqueName: \"kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d\") pod \"nova-api-3da3-account-create-update-v87kw\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.900738 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.904297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pm9\" (UniqueName: \"kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.969482 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:28 crc kubenswrapper[5030]: I0120 23:52:28.971397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.033017 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.034963 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.055581 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.062237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltk8g\" (UniqueName: \"kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.062412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.062576 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.062696 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data podName:dd483381-3919-4ade-ac69-c8abde2869a6 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.062604949 +0000 UTC m=+4622.382865237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data") pod "rabbitmq-server-0" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6") : configmap "rabbitmq-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.088703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.089038 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="openstack-network-exporter" containerID="cri-o://00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f" gracePeriod=300 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.141429 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166248 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166408 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc 
kubenswrapper[5030]: I0120 23:52:29.166592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166741 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.166798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltk8g\" (UniqueName: \"kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.167884 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.184013 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-44f7-account-create-update-fjv9c"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.200722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltk8g\" (UniqueName: \"kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g\") pod \"root-account-create-update-4lg5x\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.209197 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.210446 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.222834 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.225892 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.226244 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="openstack-network-exporter" containerID="cri-o://1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" gracePeriod=300 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.243220 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmm6\" (UniqueName: \"kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 
23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.276571 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.279008 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.279079 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle podName:3a438c50-84b9-49fb-9f49-0cfdbe2bab0a nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.279062849 +0000 UTC m=+4622.599323137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle") pod "barbican-worker-846584955c-2ghpj" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a") : secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.279137 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.279196 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.779178212 +0000 UTC m=+4622.099438500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "barbican-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.287464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287654 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287732 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.787711599 +0000 UTC m=+4622.107971927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-public-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287749 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287785 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287838 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.787816021 +0000 UTC m=+4622.108076309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.287854 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.787848733 +0000 UTC m=+4622.108109021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-internal-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.288027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.296189 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hkjrf for pod openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.296249 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:29.796232996 +0000 UTC m=+4622.116493284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hkjrf" (UniqueName: "kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.297110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.343494 5030 generic.go:334] "Generic (PLEG): container finished" podID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerID="413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54" exitCode=143 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.343562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerDied","Data":"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54"} Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.365644 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.366884 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.372392 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.379801 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="ovsdbserver-nb" containerID="cri-o://2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" gracePeriod=300 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.388982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.389094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmm6\" (UniqueName: \"kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.389133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.389833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.389914 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.389952 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle podName:21ceeb6e-ad82-423c-8a0c-a946171a2b5d nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.389941139 +0000 UTC m=+4622.710201427 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle") pod "barbican-keystone-listener-79545f7658-lhclg" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d") : secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.405424 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="ovsdbserver-sb" containerID="cri-o://5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65" gracePeriod=300 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.406060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.418723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.422273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxmm6\" (UniqueName: \"kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6\") pod \"nova-cell0-02a2-account-create-update-8wg5z\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.440362 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.454856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-dgbt5"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.461675 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-ded5-account-create-update-kt6nm"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.477736 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.488093 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.488410 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="ovn-northd" containerID="cri-o://cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" gracePeriod=30 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.489171 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="openstack-network-exporter" containerID="cri-o://1afd7b619c388f1b085904f3aa4e3b2164180d355c9e37efea98319e13c503d6" gracePeriod=30 Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.497875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " 
pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.511123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26j2\" (UniqueName: \"kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.573313 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.592005 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.595107 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.607772 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-aec2-account-create-update-t7k5l"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.615628 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26j2\" (UniqueName: \"kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.615834 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.616849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.622335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-qttm2"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.632038 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-qttm2"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.647427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26j2\" (UniqueName: \"kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2\") pod \"barbican-8027-account-create-update-xbzfc\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.651457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-4m224"] Jan 20 
23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.661469 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-4m224"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.698222 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7v8jz"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.709308 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-7v8jz"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.713790 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.717404 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.717588 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data podName:81ab5d03-ddb2-4d25-bec9-485f19fa3f13 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.217565036 +0000 UTC m=+4622.537825324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data") pod "rabbitmq-cell1-server-0" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.718663 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.739760 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-bd31-account-create-update-rg5ql"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.757223 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-z276q"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.766681 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-z276q"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.813091 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-wcfd8"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.823531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.823611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.823657 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") pod 
\"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.823677 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.823710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823841 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823885 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.823872555 +0000 UTC m=+4623.144132843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-internal-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823921 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823943 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.823936447 +0000 UTC m=+4623.144196735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "barbican-config-data" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823972 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.823989 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.823984188 +0000 UTC m=+4623.144244476 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "combined-ca-bundle" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.824015 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.824032 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.824027659 +0000 UTC m=+4623.144287947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-public-svc" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.836376 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-wcfd8"] Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.837441 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hkjrf for pod openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.837507 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:30.837485876 +0000 UTC m=+4623.157746164 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hkjrf" (UniqueName: "kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.863627 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t"] Jan 20 23:52:29 crc kubenswrapper[5030]: I0120 23:52:29.894272 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-6z58t"] Jan 20 23:52:29 crc kubenswrapper[5030]: W0120 23:52:29.916922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fd42af9_412d_4b7d_8be1_2a7c9008795f.slice/crio-1f57f8cc45e473878b7fbfb5968110cfe6982171bf2f81ec0232410672e7c9fe WatchSource:0}: Error finding container 1f57f8cc45e473878b7fbfb5968110cfe6982171bf2f81ec0232410672e7c9fe: Status 404 returned error can't find the container with id 1f57f8cc45e473878b7fbfb5968110cfe6982171bf2f81ec0232410672e7c9fe Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.943359 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:29 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:52:29 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:52:29 crc kubenswrapper[5030]: else Jan 20 23:52:29 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:29 crc kubenswrapper[5030]: fi Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:29 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:29 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:29 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:29 crc kubenswrapper[5030]: # support updates Jan 20 23:52:29 crc kubenswrapper[5030]: Jan 20 23:52:29 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:29 crc kubenswrapper[5030]: E0120 23:52:29.944766 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" podUID="7fd42af9-412d-4b7d-8be1-2a7c9008795f" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.001123 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ee2580-c430-43ee-bd0a-676053beeb00" path="/var/lib/kubelet/pods/10ee2580-c430-43ee-bd0a-676053beeb00/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.001780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242b3bb2-e246-48b9-a491-d5a2e1fceea1" path="/var/lib/kubelet/pods/242b3bb2-e246-48b9-a491-d5a2e1fceea1/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.002261 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35aff0fd-825b-4e57-84eb-b284266de899" path="/var/lib/kubelet/pods/35aff0fd-825b-4e57-84eb-b284266de899/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.006588 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c23895-db55-439b-97b3-b53d50ba92fc" path="/var/lib/kubelet/pods/54c23895-db55-439b-97b3-b53d50ba92fc/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.007478 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56413b06-4751-4af2-a91e-d70b46e32c20" path="/var/lib/kubelet/pods/56413b06-4751-4af2-a91e-d70b46e32c20/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.008156 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="629f5916-4311-4b35-a772-d7dcced41284" path="/var/lib/kubelet/pods/629f5916-4311-4b35-a772-d7dcced41284/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.009215 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6933f1ec-acd0-4c85-a7b3-17363f451790" path="/var/lib/kubelet/pods/6933f1ec-acd0-4c85-a7b3-17363f451790/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.009699 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9275ea5f-43f0-447d-9e4d-adf3f977f058" path="/var/lib/kubelet/pods/9275ea5f-43f0-447d-9e4d-adf3f977f058/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.010197 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a473770-8a49-4263-b949-d4f89d63542d" path="/var/lib/kubelet/pods/9a473770-8a49-4263-b949-d4f89d63542d/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.011985 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87816c4-975a-45d5-b926-dd5ec79d3b1e" path="/var/lib/kubelet/pods/b87816c4-975a-45d5-b926-dd5ec79d3b1e/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.012471 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c16de63a-192f-4731-8c18-aaaede74cb00" path="/var/lib/kubelet/pods/c16de63a-192f-4731-8c18-aaaede74cb00/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.013869 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cae4fe8a-e775-4599-9ca9-ff3be2baa456" path="/var/lib/kubelet/pods/cae4fe8a-e775-4599-9ca9-ff3be2baa456/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.014578 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8" path="/var/lib/kubelet/pods/e0aca09c-1d5a-4d9b-a6f0-e843f97c6cc8/volumes" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.018607 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-pclms"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.039991 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-pclms"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.057118 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-f4t54"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.072940 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-f4t54"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.080499 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-hrsm6"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.099536 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-hrsm6"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.120684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.121186 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-log" containerID="cri-o://255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.121209 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-api" containerID="cri-o://d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.122363 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_2bc2f7c1-a435-4337-bd8b-47c0443f4004/ovsdbserver-nb/0.log" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.122406 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.129088 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.129149 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data podName:dd483381-3919-4ade-ac69-c8abde2869a6 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.12913394 +0000 UTC m=+4624.449394228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data") pod "rabbitmq-server-0" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6") : configmap "rabbitmq-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.145803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.161999 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.162462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-server" containerID="cri-o://6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.162889 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="swift-recon-cron" containerID="cri-o://be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.162945 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="rsync" containerID="cri-o://b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.162975 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-expirer" containerID="cri-o://dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163030 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-updater" containerID="cri-o://add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163061 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-auditor" containerID="cri-o://1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163092 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-replicator" containerID="cri-o://239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163130 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-server" containerID="cri-o://47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 
23:52:30.163163 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-updater" containerID="cri-o://be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163192 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-auditor" containerID="cri-o://f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163222 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-replicator" containerID="cri-o://81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163252 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-server" containerID="cri-o://92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163283 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-reaper" containerID="cri-o://3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163314 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-auditor" containerID="cri-o://85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.163344 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-replicator" containerID="cri-o://6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.181741 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.182300 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="openstack-network-exporter" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.182317 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="openstack-network-exporter" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.182331 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="ovsdbserver-nb" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.182337 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="ovsdbserver-nb" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.182515 
5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="openstack-network-exporter" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.182538 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerName="ovsdbserver-nb" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.184022 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.199074 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwdkq\" (UniqueName: \"kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.234879 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs\") pod \"2bc2f7c1-a435-4337-bd8b-47c0443f4004\" (UID: 
\"2bc2f7c1-a435-4337-bd8b-47c0443f4004\") " Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.235354 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.235395 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data podName:81ab5d03-ddb2-4d25-bec9-485f19fa3f13 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:31.235381288 +0000 UTC m=+4623.555641576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data") pod "rabbitmq-cell1-server-0" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.240494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.240589 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config" (OuterVolumeSpecName: "config") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.242117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts" (OuterVolumeSpecName: "scripts") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.249522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq" (OuterVolumeSpecName: "kube-api-access-lwdkq") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "kube-api-access-lwdkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.267675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.288772 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.289858 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-api" containerID="cri-o://cce18799a99ac1715b565d5854b0dcbd08f2d61430c9731225dea07b9b533874" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.290055 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-httpd" containerID="cri-o://a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.331785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57w9\" (UniqueName: \"kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339480 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339492 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339536 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339546 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339555 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2bc2f7c1-a435-4337-bd8b-47c0443f4004-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.339565 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwdkq\" (UniqueName: \"kubernetes.io/projected/2bc2f7c1-a435-4337-bd8b-47c0443f4004-kube-api-access-lwdkq\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.340238 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.340312 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle podName:3a438c50-84b9-49fb-9f49-0cfdbe2bab0a nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.340291523 +0000 UTC m=+4624.660551811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle") pod "barbican-worker-846584955c-2ghpj" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a") : secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.375024 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jhh6k"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.388796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" event={"ID":"815dc9f1-424f-499a-b27b-16f3f0483fff","Type":"ContainerStarted","Data":"f4e49a48d2cd3b9d80a4169f53f58def50e2f14cd67eb9b2310ec93ef601a400"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.393477 5030 generic.go:334] "Generic (PLEG): container finished" podID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerID="9df222c59711d3fd2a93db293137b68d56cc3f6dc8f6d95807cf89f2a8e746bc" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.393507 5030 generic.go:334] "Generic (PLEG): container finished" podID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerID="b39481b9df711fc23477860a389138de4b98dc604215528415d3e6762dbf8608" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.393568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerDied","Data":"9df222c59711d3fd2a93db293137b68d56cc3f6dc8f6d95807cf89f2a8e746bc"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.393590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" 
event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerDied","Data":"b39481b9df711fc23477860a389138de4b98dc604215528415d3e6762dbf8608"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.395580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" event={"ID":"7fd42af9-412d-4b7d-8be1-2a7c9008795f","Type":"ContainerStarted","Data":"1f57f8cc45e473878b7fbfb5968110cfe6982171bf2f81ec0232410672e7c9fe"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.408089 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-jhh6k"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.432106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-z6ngm"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.442103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.442155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57w9\" (UniqueName: \"kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.442242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.442290 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.442470 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.442516 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle podName:21ceeb6e-ad82-423c-8a0c-a946171a2b5d nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.442497292 +0000 UTC m=+4624.762757580 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle") pod "barbican-keystone-listener-79545f7658-lhclg" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d") : secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.443688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.460809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.461074 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:30 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:52:30 crc kubenswrapper[5030]: else Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:30 crc kubenswrapper[5030]: fi Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:30 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:30 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:30 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:30 crc kubenswrapper[5030]: # support updates Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.464380 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" podUID="7fd42af9-412d-4b7d-8be1-2a7c9008795f" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.474119 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.486967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57w9\" (UniqueName: \"kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9\") pod \"dnsmasq-dnsmasq-84b9f45d47-zm7tk\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.489193 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-z6ngm"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.528914 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848/ovsdbserver-sb/0.log" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.528986 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.542270 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.542980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543172 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnpws\" (UniqueName: \"kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs\") pod \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\" (UID: \"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848\") " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543946 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.543967 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.544246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config" (OuterVolumeSpecName: "config") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.547680 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts" (OuterVolumeSpecName: "scripts") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.549130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.553337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2bc2f7c1-a435-4337-bd8b-47c0443f4004" (UID: "2bc2f7c1-a435-4337-bd8b-47c0443f4004"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.558698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws" (OuterVolumeSpecName: "kube-api-access-gnpws") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "kube-api-access-gnpws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.580130 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.580461 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-log" containerID="cri-o://83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.580933 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-httpd" containerID="cri-o://659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586472 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586578 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586661 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586747 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586808 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40" exitCode=0 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.586972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.587052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.587115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.587172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 
23:52:30.587229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.594555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.602158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.610298 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.624740 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.645480 5030 generic.go:334] "Generic (PLEG): container finished" podID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerID="1afd7b619c388f1b085904f3aa4e3b2164180d355c9e37efea98319e13c503d6" exitCode=2 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.645587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerDied","Data":"1afd7b619c388f1b085904f3aa4e3b2164180d355c9e37efea98319e13c503d6"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648285 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648318 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648334 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648348 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bc2f7c1-a435-4337-bd8b-47c0443f4004-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648360 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnpws\" (UniqueName: \"kubernetes.io/projected/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-kube-api-access-gnpws\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648371 5030 reconciler_common.go:293] "Volume detached for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648394 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648463 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648707 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-log" containerID="cri-o://6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.648798 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-httpd" containerID="cri-o://5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.650134 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerID="255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30" exitCode=143 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.650171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerDied","Data":"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675781 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_2bc2f7c1-a435-4337-bd8b-47c0443f4004/ovsdbserver-nb/0.log" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675822 5030 generic.go:334] "Generic (PLEG): container finished" podID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerID="1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" exitCode=2 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675841 5030 generic.go:334] "Generic (PLEG): container finished" podID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" containerID="2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" exitCode=143 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerDied","Data":"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerDied","Data":"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" 
event={"ID":"2bc2f7c1-a435-4337-bd8b-47c0443f4004","Type":"ContainerDied","Data":"76d6439f2add6a1ad3a45258a3ebaa3b4d0647b9065b86998539762ef5b46e80"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.675952 5030 scope.go:117] "RemoveContainer" containerID="1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.676197 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.724426 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.724499 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-knct4"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734007 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848/ovsdbserver-sb/0.log" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734050 5030 generic.go:334] "Generic (PLEG): container finished" podID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerID="00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f" exitCode=2 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734065 5030 generic.go:334] "Generic (PLEG): container finished" podID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerID="5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65" exitCode=143 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734164 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerDied","Data":"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f"} Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.734577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerDied","Data":"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65"} Jan 20 23:52:30 crc kubenswrapper[5030]: W0120 23:52:30.745748 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee604c21_d89c_4f46_ae93_3912ff1da96d.slice/crio-f804646470ae9638965f80dbee604d9fda77a7ebbda062a817352238b0eb789d WatchSource:0}: Error finding container f804646470ae9638965f80dbee604d9fda77a7ebbda062a817352238b0eb789d: Status 404 returned error can't find the container with id f804646470ae9638965f80dbee604d9fda77a7ebbda062a817352238b0eb789d Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.752766 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.759160 5030 generic.go:334] "Generic (PLEG): container finished" podID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" containerID="7a0678712472b5130cb0dc086ec6fd0a4428b165197d3f040a6d2bad8e988888" exitCode=137 Jan 20 23:52:30 crc kubenswrapper[5030]: 
I0120 23:52:30.765429 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-knct4"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.779544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.805720 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.805975 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-log" containerID="cri-o://6001742698e098c515b24cedb01ca7a6e21059956ff4d7773611300b09ff142c" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.806399 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-api" containerID="cri-o://d99eff37903cee5608b24fb64a68cb6530669b4caba4e1642f9039b72ff5ad58" gracePeriod=30 Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853583 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.853994 5030 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854089 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854152 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.854134058 +0000 UTC m=+4625.174394356 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-public-svc" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854465 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854499 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.854490657 +0000 UTC m=+4625.174750945 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-internal-svc" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854532 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854550 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.854545238 +0000 UTC m=+4625.174805526 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "barbican-config-data" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854798 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.854818 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.854813055 +0000 UTC m=+4625.175073343 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "combined-ca-bundle" not found Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.872097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.928402 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:30 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:52:30 crc kubenswrapper[5030]: else Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:30 crc kubenswrapper[5030]: fi Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:30 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:30 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:30 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:30 crc kubenswrapper[5030]: # support updates Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.928652 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:30 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:52:30 crc kubenswrapper[5030]: else Jan 20 23:52:30 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:30 crc kubenswrapper[5030]: fi Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:30 crc kubenswrapper[5030]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:30 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:30 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:30 crc kubenswrapper[5030]: # support updates Jan 20 23:52:30 crc kubenswrapper[5030]: Jan 20 23:52:30 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.929454 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hkjrf for pod openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.929500 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:32.929485186 +0000 UTC m=+4625.249745474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hkjrf" (UniqueName: "kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.929518 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" podUID="4a16ed2d-dd15-41c1-b6ef-61d242bd61a4" Jan 20 23:52:30 crc kubenswrapper[5030]: E0120 23:52:30.929764 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" podUID="ee604c21-d89c-4f46-ae93-3912ff1da96d" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.953350 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" (UID: "26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.956008 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:30 crc kubenswrapper[5030]: I0120 23:52:30.963731 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:30.999679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:30.999953 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" containerID="cri-o://2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.000369 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" containerID="cri-o://d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.009187 5030 scope.go:117] "RemoveContainer" containerID="2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.114690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-7ffdk"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.129688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-lklcb"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.144691 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-lklcb"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.144823 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="rabbitmq" containerID="cri-o://59a819026ca38754d2a26af58c705565136f6b307a0c53e7ac0b9eba9ecae242" gracePeriod=604800 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.187160 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-7ffdk"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.191308 5030 scope.go:117] "RemoveContainer" containerID="1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.196040 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb\": container with ID starting with 1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb not found: ID does not exist" containerID="1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.196081 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb"} err="failed to get container 
status \"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb\": rpc error: code = NotFound desc = could not find container \"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb\": container with ID starting with 1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.196104 5030 scope.go:117] "RemoveContainer" containerID="2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.203455 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b\": container with ID starting with 2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b not found: ID does not exist" containerID="2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.203484 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b"} err="failed to get container status \"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b\": rpc error: code = NotFound desc = could not find container \"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b\": container with ID starting with 2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.203503 5030 scope.go:117] "RemoveContainer" containerID="1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.205419 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.205567 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb"} err="failed to get container status \"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb\": rpc error: code = NotFound desc = could not find container \"1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb\": container with ID starting with 1ad0c10ef61b0b9c4098584623f4a63a95beb8e3eee55122fb79166d6995b0bb not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.205586 5030 scope.go:117] "RemoveContainer" containerID="2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.208612 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b"} err="failed to get container status \"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b\": rpc error: code = NotFound desc = could not find container \"2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b\": container with ID starting with 2895d97e1c4d057d0d3d561265b27672f29895a1f551078dc02c0975a74aaa6b not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.208663 5030 scope.go:117] "RemoveContainer" containerID="00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.213066 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.222946 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.223204 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-httpd" containerID="cri-o://0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.223376 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-server" containerID="cri-o://066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.230391 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.248109 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.261317 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.261832 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data podName:81ab5d03-ddb2-4d25-bec9-485f19fa3f13 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:33.261817508 +0000 UTC m=+4625.582077796 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data") pod "rabbitmq-cell1-server-0" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.263747 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.289171 5030 scope.go:117] "RemoveContainer" containerID="5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.323826 5030 scope.go:117] "RemoveContainer" containerID="00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.324192 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f\": container with ID starting with 00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f not found: ID does not exist" containerID="00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.324217 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f"} err="failed to get container status \"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f\": rpc error: code = NotFound desc = could not find container \"00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f\": container with ID starting with 00240cdae6147b11afb04a76cf0732eff9a4442b4dfe7a113f244770f4710e4f not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.324238 5030 scope.go:117] "RemoveContainer" containerID="5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.324674 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65\": container with ID starting with 5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65 not found: ID does not exist" containerID="5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.324691 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65"} err="failed to get container status \"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65\": rpc error: code = NotFound desc = could not find container \"5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65\": container with ID starting with 5dacabdec0bab9ffc3c73ab9cfe59add1f99a5ee81798258e3acf28ada151c65 not found: ID does not exist" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.344036 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362080 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22hfz\" (UniqueName: \"kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config\") pod \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts\") pod \"b476b9d8-8c44-47b1-9736-3b7441a703bd\" (UID: \"b476b9d8-8c44-47b1-9736-3b7441a703bd\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret\") pod \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 
23:52:31.362467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xm5h\" (UniqueName: \"kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h\") pod \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.362525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle\") pod \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\" (UID: \"0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e\") " Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.364689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.373724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts" (OuterVolumeSpecName: "scripts") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.376835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h" (OuterVolumeSpecName: "kube-api-access-2xm5h") pod "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" (UID: "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e"). InnerVolumeSpecName "kube-api-access-2xm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.376951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.383541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-bp42s"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.384856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz" (OuterVolumeSpecName: "kube-api-access-22hfz") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "kube-api-access-22hfz". 
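The reconciler records above walk each cinder-scheduler-0 and openstackclient volume through the same teardown sequence: "operationExecutor.UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached ... DevicePath \"\"". If a pod ever stalls in this phase, the kubelet's on-disk state can be checked directly; a small sketch, assuming shell access to the node (for example via `oc debug node/crc`) and reusing the cinder-scheduler-0 pod UID that appears in the records above:

    # Inspect what the kubelet still holds for a terminating pod. The UID is copied
    # from the UnmountVolume records above; paths follow the standard
    # /var/lib/kubelet/pods/<uid>/volumes/<plugin>/<volume> layout.
    POD_UID=b476b9d8-8c44-47b1-9736-3b7441a703bd
    ls -l /var/lib/kubelet/pods/${POD_UID}/volumes/                   # one dir per plugin
    ls -l /var/lib/kubelet/pods/${POD_UID}/volumes/kubernetes.io~secret/ 2>/dev/null

Once nothing is left under volumes/, the kubelet can remove the pod directory and log the "Cleaned up orphaned pod volumes dir" messages that appear further down for the pods already removed from the API.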
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.401016 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-bp42s"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.412395 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-c2wf4"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.424529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" (UID: "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.431902 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-c2wf4"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.443221 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" (UID: "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467728 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467761 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b476b9d8-8c44-47b1-9736-3b7441a703bd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467771 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467781 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22hfz\" (UniqueName: \"kubernetes.io/projected/b476b9d8-8c44-47b1-9736-3b7441a703bd-kube-api-access-22hfz\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467792 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467800 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.467809 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xm5h\" (UniqueName: \"kubernetes.io/projected/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-kube-api-access-2xm5h\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.491604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.492873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.494832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" (UID: "0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.522720 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-f32b-account-create-update-tdqp5"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.566736 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kr8bd"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.570749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data" (OuterVolumeSpecName: "config-data") pod "b476b9d8-8c44-47b1-9736-3b7441a703bd" (UID: "b476b9d8-8c44-47b1-9736-3b7441a703bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.570855 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.570880 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b476b9d8-8c44-47b1-9736-3b7441a703bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.570891 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.571100 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="galera" containerID="cri-o://26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.582593 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-kr8bd"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.589156 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.599015 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.599794 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="044ada6d-cbbc-45dc-b229-3884d815427b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.634537 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.634842 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker-log" containerID="cri-o://29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.635144 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker" containerID="cri-o://39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.665355 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-846584955c-2ghpj"] Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.668446 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" podUID="3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" Jan 20 23:52:31 crc kubenswrapper[5030]: W0120 23:52:31.669856 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e3a8a28_350e_4704_a9dc_09fa5b1b7bd8.slice/crio-1cda944e55a41a98fa02b935f7fc543369804f1c4b521d2fa93d75906c580752 WatchSource:0}: Error finding container 1cda944e55a41a98fa02b935f7fc543369804f1c4b521d2fa93d75906c580752: Status 404 returned error can't find the container with id 1cda944e55a41a98fa02b935f7fc543369804f1c4b521d2fa93d75906c580752 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.683499 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.683826 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api-log" containerID="cri-o://409fe497b78edb037279005ae11210c5ba6b57531c5991487330124e7e48ec82" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.684089 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api" containerID="cri-o://fc0ce133b045b63568d1b17b3af0a763ae05a71804fb6407998acf7b7d0be9dd" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.710483 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.714016 5030 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener-log" containerID="cri-o://640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.714354 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener" containerID="cri-o://56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.726901 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg"] Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.727491 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" podUID="21ceeb6e-ad82-423c-8a0c-a946171a2b5d" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.746015 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd"] Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.746744 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data internal-tls-certs kube-api-access-hkjrf public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" podUID="6d9855de-8b4b-4e39-940b-ff863190d072" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.752515 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.752723 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" containerID="cri-o://3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.771684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" event={"ID":"ee604c21-d89c-4f46-ae93-3912ff1da96d","Type":"ContainerStarted","Data":"f804646470ae9638965f80dbee604d9fda77a7ebbda062a817352238b0eb789d"} Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.773238 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:31 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 
20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: if [ -n "nova_cell0" ]; then Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="nova_cell0" Jan 20 23:52:31 crc kubenswrapper[5030]: else Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:31 crc kubenswrapper[5030]: fi Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:31 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:31 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:31 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:31 crc kubenswrapper[5030]: # support updates Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.774417 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" podUID="ee604c21-d89c-4f46-ae93-3912ff1da96d" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.775184 5030 generic.go:334] "Generic (PLEG): container finished" podID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerID="6001742698e098c515b24cedb01ca7a6e21059956ff4d7773611300b09ff142c" exitCode=143 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.775228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerDied","Data":"6001742698e098c515b24cedb01ca7a6e21059956ff4d7773611300b09ff142c"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.777070 5030 generic.go:334] "Generic (PLEG): container finished" podID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerID="f633bba57d3d9c96735d364031f527fc137d20f358307a2a20ceaf07366876a0" exitCode=1 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.777110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" event={"ID":"815dc9f1-424f-499a-b27b-16f3f0483fff","Type":"ContainerDied","Data":"f633bba57d3d9c96735d364031f527fc137d20f358307a2a20ceaf07366876a0"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.777506 5030 scope.go:117] "RemoveContainer" containerID="f633bba57d3d9c96735d364031f527fc137d20f358307a2a20ceaf07366876a0" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.781471 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.782375 5030 scope.go:117] "RemoveContainer" containerID="7a0678712472b5130cb0dc086ec6fd0a4428b165197d3f040a6d2bad8e988888" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.792737 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c7e15f9-5447-4550-9b78-91a664abf455" containerID="83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0" exitCode=143 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.792798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerDied","Data":"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.795125 5030 generic.go:334] "Generic (PLEG): container finished" podID="24160216-4080-4fa9-88c0-f89910f0a933" containerID="0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.795181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerDied","Data":"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.796777 5030 generic.go:334] "Generic (PLEG): container finished" podID="acb8010d-f18b-417a-919f-13e959ccf452" containerID="2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153" exitCode=143 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.797043 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerDied","Data":"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.798062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" event={"ID":"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8","Type":"ContainerStarted","Data":"1cda944e55a41a98fa02b935f7fc543369804f1c4b521d2fa93d75906c580752"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.798831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848","Type":"ContainerDied","Data":"5ea9498e4b598290f77c435bab88a811f81b8192261ec2cbc1eab93bba023fcc"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.806072 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.812502 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-s2df6"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.817771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" event={"ID":"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4","Type":"ContainerStarted","Data":"cea474c26653665197ef3cbcc33160450bc456d1e2e5d6873baf8abc1fbda87c"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.818492 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 
23:52:31.818712 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" containerID="cri-o://630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.833290 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.834696 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.834738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"b476b9d8-8c44-47b1-9736-3b7441a703bd","Type":"ContainerDied","Data":"cdc9fc78102b637f5c1b59cc5ee594e39a1ec8a8836d438b968bfb541b555a4e"} Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.839730 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:31 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:52:31 crc kubenswrapper[5030]: else Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:31 crc kubenswrapper[5030]: fi Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:31 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:31 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:31 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:31 crc kubenswrapper[5030]: # support updates Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.839891 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-smrcs"] Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.842368 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" podUID="4a16ed2d-dd15-41c1-b6ef-61d242bd61a4" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.850177 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.857561 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.857815 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" containerID="cri-o://5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.882685 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.901310 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.907958 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.914127 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926220 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926246 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926254 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926261 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926268 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926274 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926280 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926286 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926292 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926415 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.926427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.927526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.928000 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-central-agent" containerID="cri-o://ec915196559782a71aca3bf84560183d6b635d2588eb3d679d36d2bb6b7a0c3b" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.928125 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="proxy-httpd" containerID="cri-o://048f7b423760b667b60e4906fd1a0700771bb5250eff55c5e498304ebcbb54a2" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.928176 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="sg-core" containerID="cri-o://8031dce4c85b09f1d38bba5769584ff64416d46b3003775991a52eb69e88f932" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.928207 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-notification-agent" containerID="cri-o://02fae0c28694c60d27f62274bb757b8a4c5d3975c8580e05c8edede84309f354" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.934832 5030 generic.go:334] "Generic (PLEG): container finished" podID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerID="6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09" exitCode=143 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.934909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.934935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerDied","Data":"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.935104 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" containerName="kube-state-metrics" containerID="cri-o://b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb" gracePeriod=30 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.943548 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="rabbitmq" containerID="cri-o://bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5" gracePeriod=604800 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.944113 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.951001 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerID="29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294" exitCode=143 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.951056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerDied","Data":"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.959119 5030 generic.go:334] "Generic (PLEG): container finished" podID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerID="a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b" exitCode=0 Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.959383 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.959220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerDied","Data":"a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b"} Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.959880 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.959909 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.975962 5030 scope.go:117] "RemoveContainer" containerID="9df222c59711d3fd2a93db293137b68d56cc3f6dc8f6d95807cf89f2a8e746bc" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.976558 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:52:31 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:52:31 crc kubenswrapper[5030]: else Jan 20 23:52:31 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:52:31 crc kubenswrapper[5030]: fi Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:52:31 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:52:31 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:52:31 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:52:31 crc kubenswrapper[5030]: # support updates Jan 20 23:52:31 crc kubenswrapper[5030]: Jan 20 23:52:31 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:52:31 crc kubenswrapper[5030]: E0120 23:52:31.978482 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" podUID="7fd42af9-412d-4b7d-8be1-2a7c9008795f" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.994951 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043666ed-7c1f-4489-8cfa-bb9c089ba951" path="/var/lib/kubelet/pods/043666ed-7c1f-4489-8cfa-bb9c089ba951/volumes" Jan 20 23:52:31 crc kubenswrapper[5030]: I0120 23:52:31.999811 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e" path="/var/lib/kubelet/pods/0bac9a4f-8f38-49d1-a7ca-4a528d5e7c6e/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.000663 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea9be16-3983-4590-943a-0870a9dfd9c3" path="/var/lib/kubelet/pods/1ea9be16-3983-4590-943a-0870a9dfd9c3/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.001277 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" path="/var/lib/kubelet/pods/26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.002636 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bc2f7c1-a435-4337-bd8b-47c0443f4004" path="/var/lib/kubelet/pods/2bc2f7c1-a435-4337-bd8b-47c0443f4004/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.003208 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3042e5b1-9e08-4870-99a1-2c89d2b536dd" path="/var/lib/kubelet/pods/3042e5b1-9e08-4870-99a1-2c89d2b536dd/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.004453 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ed9ea9-6c15-4082-b728-5d5d66dfa24e" path="/var/lib/kubelet/pods/43ed9ea9-6c15-4082-b728-5d5d66dfa24e/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.005847 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53052896-e072-486c-ba64-dbf034fc2647" path="/var/lib/kubelet/pods/53052896-e072-486c-ba64-dbf034fc2647/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.006343 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="650fe4d3-bc01-4ea5-b677-039054b091ec" path="/var/lib/kubelet/pods/650fe4d3-bc01-4ea5-b677-039054b091ec/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.006905 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f20a41-6543-40ee-9462-2c73c1b904ef" path="/var/lib/kubelet/pods/65f20a41-6543-40ee-9462-2c73c1b904ef/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.007833 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8a41ef-f17b-4a77-976e-0e8531cdbb52" path="/var/lib/kubelet/pods/8c8a41ef-f17b-4a77-976e-0e8531cdbb52/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.008341 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="907d9698-9733-4db5-93c2-8c35280a5610" path="/var/lib/kubelet/pods/907d9698-9733-4db5-93c2-8c35280a5610/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.008839 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c103a8f-3d99-4db6-861d-2fffca40b7ab" path="/var/lib/kubelet/pods/9c103a8f-3d99-4db6-861d-2fffca40b7ab/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.009290 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fe9436-325e-46ee-9b56-41b8297f2f7c" path="/var/lib/kubelet/pods/a9fe9436-325e-46ee-9b56-41b8297f2f7c/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.010328 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4566d9-34be-4ff1-8c9b-41c517c89979" path="/var/lib/kubelet/pods/dc4566d9-34be-4ff1-8c9b-41c517c89979/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.010803 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec81857f-9f0b-427d-8e80-985a39d18721" path="/var/lib/kubelet/pods/ec81857f-9f0b-427d-8e80-985a39d18721/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.011276 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efde4f69-1772-44dd-b9d1-fdc471133d2f" path="/var/lib/kubelet/pods/efde4f69-1772-44dd-b9d1-fdc471133d2f/volumes" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.012196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.012224 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.227513 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.227596 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data podName:dd483381-3919-4ade-ac69-c8abde2869a6 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.227574756 +0000 UTC m=+4628.547835044 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data") pod "rabbitmq-server-0" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6") : configmap "rabbitmq-config-data" not found Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.272472 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.291701 5030 scope.go:117] "RemoveContainer" containerID="b39481b9df711fc23477860a389138de4b98dc604215528415d3e6762dbf8608" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.294458 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.300488 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.315522 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.341555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom\") pod \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.341627 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs\") pod \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.341760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data\") pod \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.341904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45pm9\" (UniqueName: \"kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9\") pod \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.342191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") pod \"barbican-worker-846584955c-2ghpj\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.342400 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.342441 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle podName:3a438c50-84b9-49fb-9f49-0cfdbe2bab0a nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.342427442 +0000 UTC m=+4628.662687730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle") pod "barbican-worker-846584955c-2ghpj" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a") : secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.345061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs" (OuterVolumeSpecName: "logs") pod "21ceeb6e-ad82-423c-8a0c-a946171a2b5d" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.352137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9" (OuterVolumeSpecName: "kube-api-access-45pm9") pod "21ceeb6e-ad82-423c-8a0c-a946171a2b5d" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d"). 
InnerVolumeSpecName "kube-api-access-45pm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.354737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "21ceeb6e-ad82-423c-8a0c-a946171a2b5d" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.355273 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data" (OuterVolumeSpecName: "config-data") pod "21ceeb6e-ad82-423c-8a0c-a946171a2b5d" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.443759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.443845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs\") pod \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.443907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom\") pod \"6d9855de-8b4b-4e39-940b-ff863190d072\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.443964 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data\") pod \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs\") pod \"6d9855de-8b4b-4e39-940b-ff863190d072\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wlktx\" (UniqueName: \"kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx\") pod \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnvkt\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444125 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs\") pod \"24160216-4080-4fa9-88c0-f89910f0a933\" (UID: \"24160216-4080-4fa9-88c0-f89910f0a933\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom\") pod \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\" (UID: \"3a438c50-84b9-49fb-9f49-0cfdbe2bab0a\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444566 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") pod \"barbican-keystone-listener-79545f7658-lhclg\" (UID: \"21ceeb6e-ad82-423c-8a0c-a946171a2b5d\") " pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444665 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444678 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444687 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-config-data\") on node \"crc\" 
DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.444698 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45pm9\" (UniqueName: \"kubernetes.io/projected/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-kube-api-access-45pm9\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.444774 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.444818 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle podName:21ceeb6e-ad82-423c-8a0c-a946171a2b5d nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.444804566 +0000 UTC m=+4628.765064854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle") pod "barbican-keystone-listener-79545f7658-lhclg" (UID: "21ceeb6e-ad82-423c-8a0c-a946171a2b5d") : secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.445648 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.447713 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.448598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs" (OuterVolumeSpecName: "logs") pod "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.449782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs" (OuterVolumeSpecName: "logs") pod "6d9855de-8b4b-4e39-940b-ff863190d072" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.453883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.454000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d9855de-8b4b-4e39-940b-ff863190d072" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.454043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.454125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt" (OuterVolumeSpecName: "kube-api-access-fnvkt") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "kube-api-access-fnvkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.455756 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx" (OuterVolumeSpecName: "kube-api-access-wlktx") pod "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a"). InnerVolumeSpecName "kube-api-access-wlktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.465765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data" (OuterVolumeSpecName: "config-data") pod "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" (UID: "3a438c50-84b9-49fb-9f49-0cfdbe2bab0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.471035 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.512166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.533873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.545618 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle\") pod \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.545701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs\") pod \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.545741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r55g\" (UniqueName: \"kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g\") pod \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.545780 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config\") pod \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\" (UID: \"f832da32-f1c8-47a5-8bcd-7ea67db8b416\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546340 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546351 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9855de-8b4b-4e39-940b-ff863190d072-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546361 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546371 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlktx\" (UniqueName: \"kubernetes.io/projected/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-kube-api-access-wlktx\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546380 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnvkt\" (UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-kube-api-access-fnvkt\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546388 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546397 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546409 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/24160216-4080-4fa9-88c0-f89910f0a933-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546417 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546427 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24160216-4080-4fa9-88c0-f89910f0a933-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546435 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.546444 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.563783 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.582199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g" (OuterVolumeSpecName: "kube-api-access-8r55g") pod "f832da32-f1c8-47a5-8bcd-7ea67db8b416" (UID: "f832da32-f1c8-47a5-8bcd-7ea67db8b416"). InnerVolumeSpecName "kube-api-access-8r55g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.612038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f832da32-f1c8-47a5-8bcd-7ea67db8b416" (UID: "f832da32-f1c8-47a5-8bcd-7ea67db8b416"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.613470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "f832da32-f1c8-47a5-8bcd-7ea67db8b416" (UID: "f832da32-f1c8-47a5-8bcd-7ea67db8b416"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.629274 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data" (OuterVolumeSpecName: "config-data") pod "24160216-4080-4fa9-88c0-f89910f0a933" (UID: "24160216-4080-4fa9-88c0-f89910f0a933"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648059 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648084 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648095 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r55g\" (UniqueName: \"kubernetes.io/projected/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-api-access-8r55g\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648104 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648114 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24160216-4080-4fa9-88c0-f89910f0a933-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.648201 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "f832da32-f1c8-47a5-8bcd-7ea67db8b416" (UID: "f832da32-f1c8-47a5-8bcd-7ea67db8b416"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.689407 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.697314 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.748648 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data\") pod \"044ada6d-cbbc-45dc-b229-3884d815427b\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.748864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs\") pod \"044ada6d-cbbc-45dc-b229-3884d815427b\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.748968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.749014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc5bz\" (UniqueName: \"kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.749051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.749096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.749117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmtl9\" (UniqueName: \"kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9\") pod \"044ada6d-cbbc-45dc-b229-3884d815427b\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.751841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.751925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.751955 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs\") pod 
\"044ada6d-cbbc-45dc-b229-3884d815427b\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.751992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle\") pod \"044ada6d-cbbc-45dc-b229-3884d815427b\" (UID: \"044ada6d-cbbc-45dc-b229-3884d815427b\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.752056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.752178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default\") pod \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\" (UID: \"cdf571fc-fd44-4b32-bc32-5ccd319a5157\") " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.752599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.753264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.753857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.753980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.755554 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.755581 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.755592 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.755604 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f832da32-f1c8-47a5-8bcd-7ea67db8b416-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.755615 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cdf571fc-fd44-4b32-bc32-5ccd319a5157-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.775553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz" (OuterVolumeSpecName: "kube-api-access-mc5bz") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "kube-api-access-mc5bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.775636 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9" (OuterVolumeSpecName: "kube-api-access-tmtl9") pod "044ada6d-cbbc-45dc-b229-3884d815427b" (UID: "044ada6d-cbbc-45dc-b229-3884d815427b"). InnerVolumeSpecName "kube-api-access-tmtl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.826742 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data" (OuterVolumeSpecName: "config-data") pod "044ada6d-cbbc-45dc-b229-3884d815427b" (UID: "044ada6d-cbbc-45dc-b229-3884d815427b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.847285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.858270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859524 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859613 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859731 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc5bz\" (UniqueName: \"kubernetes.io/projected/cdf571fc-fd44-4b32-bc32-5ccd319a5157-kube-api-access-mc5bz\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmtl9\" (UniqueName: \"kubernetes.io/projected/044ada6d-cbbc-45dc-b229-3884d815427b-kube-api-access-tmtl9\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.859752 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860103 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860171 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860195 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860180 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle 
podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.860160442 +0000 UTC m=+4629.180420720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "combined-ca-bundle" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860265 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.860246064 +0000 UTC m=+4629.180506352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "barbican-config-data" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860282 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.860275745 +0000 UTC m=+4629.180536023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-internal-svc" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860348 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.860392 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.860371137 +0000 UTC m=+4629.180631505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : secret "cert-barbican-public-svc" not found Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.860837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "044ada6d-cbbc-45dc-b229-3884d815427b" (UID: "044ada6d-cbbc-45dc-b229-3884d815427b"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.903996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "044ada6d-cbbc-45dc-b229-3884d815427b" (UID: "044ada6d-cbbc-45dc-b229-3884d815427b"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
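Every secret-backed volume of barbican-api-ff97c958d-4m5xd is failing for the same reason: barbican-config-data, cert-barbican-internal-svc, cert-barbican-public-svc and combined-ca-bundle have all been deleted (or not yet recreated) while the pod object still references them. A small sketch for cross-checking a pod's referenced Secrets against what actually exists in the namespace, with the names taken from the log:

```bash
NS=openstack-kuttl-tests
POD=barbican-api-ff97c958d-4m5xd

# Secrets referenced by the pod's volumes.
kubectl -n "$NS" get pod "$POD" \
  -o jsonpath='{range .spec.volumes[*]}{.secret.secretName}{"\n"}{end}' | grep . | sort -u

# Which of the ones named in this log excerpt are still missing?
for s in barbican-config-data cert-barbican-internal-svc cert-barbican-public-svc combined-ca-bundle; do
  kubectl -n "$NS" get secret "$s" >/dev/null 2>&1 || echo "missing: $s"
done
```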
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.906820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.936229 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.967942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs"] Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") pod \"barbican-api-ff97c958d-4m5xd\" (UID: \"6d9855de-8b4b-4e39-940b-ff863190d072\") " pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969393 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969413 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969430 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969443 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:32 crc kubenswrapper[5030]: I0120 23:52:32.969504 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-vlzvs"] Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.984851 5030 projected.go:194] Error preparing data for projected volume kube-api-access-hkjrf for pod openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:32 crc kubenswrapper[5030]: E0120 23:52:32.984923 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf podName:6d9855de-8b4b-4e39-940b-ff863190d072 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.984903358 +0000 UTC m=+4629.305163636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hkjrf" (UniqueName: "kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf") pod "barbican-api-ff97c958d-4m5xd" (UID: "6d9855de-8b4b-4e39-940b-ff863190d072") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.011278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "044ada6d-cbbc-45dc-b229-3884d815427b" (UID: "044ada6d-cbbc-45dc-b229-3884d815427b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.011752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.012006 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" containerName="memcached" containerID="cri-o://905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5" gracePeriod=30 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.027303 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cdf571fc-fd44-4b32-bc32-5ccd319a5157" (UID: "cdf571fc-fd44-4b32-bc32-5ccd319a5157"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.028033 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.054724 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj"] Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.055208 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="mysql-bootstrap" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.055223 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="mysql-bootstrap" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.055252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="probe" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.055258 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="probe" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.055271 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="ovsdbserver-sb" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.055277 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="ovsdbserver-sb" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.055260 5030 log.go:32] 
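The kube-api-access-hkjrf volume is a projected service-account token, so it cannot be set up at all while the barbican-barbican ServiceAccount is absent; unlike the missing Secrets above, this failure comes from the token request itself rather than from an object lookup in a volume plugin. A quick check, with the namespace, pod and account names from the log:

```bash
NS=openstack-kuttl-tests

# Does the service account exist yet, and is it the one the pod expects?
kubectl -n "$NS" get serviceaccount barbican-barbican
kubectl -n "$NS" get pod barbican-api-ff97c958d-4m5xd \
  -o jsonpath='{.spec.serviceAccountName}{"\n"}'
```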
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.055286 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" containerName="kube-state-metrics" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056002 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" containerName="kube-state-metrics" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056043 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="044ada6d-cbbc-45dc-b229-3884d815427b" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056050 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="044ada6d-cbbc-45dc-b229-3884d815427b" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-httpd" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056089 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-httpd" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056104 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-server" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056110 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-server" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056150 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="galera" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056156 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="galera" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056182 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056188 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.056200 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="openstack-network-exporter" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056218 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="openstack-network-exporter" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056889 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="ovsdbserver-sb" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056904 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerName="galera" Jan 20 23:52:33 crc 
kubenswrapper[5030]: I0120 23:52:33.056914 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" containerName="kube-state-metrics" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-server" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056932 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="044ada6d-cbbc-45dc-b229-3884d815427b" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056946 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="cinder-scheduler" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056958 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24160216-4080-4fa9-88c0-f89910f0a933" containerName="proxy-httpd" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" containerName="probe" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.056982 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="26cd9f7a-0ecf-4b5e-aaee-fcc7d5b32848" containerName="openstack-network-exporter" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.057583 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.067645 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj"] Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.069860 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.069914 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="ovn-northd" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070256 5030 generic.go:334] "Generic (PLEG): container finished" podID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerID="048f7b423760b667b60e4906fd1a0700771bb5250eff55c5e498304ebcbb54a2" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070269 5030 generic.go:334] "Generic (PLEG): container finished" podID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerID="8031dce4c85b09f1d38bba5769584ff64416d46b3003775991a52eb69e88f932" exitCode=2 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070276 5030 generic.go:334] "Generic (PLEG): container finished" podID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerID="02fae0c28694c60d27f62274bb757b8a4c5d3975c8580e05c8edede84309f354" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070282 5030 generic.go:334] "Generic (PLEG): container finished" podID="0a9ac390-5efe-4f40-addc-795d259c81cb" 
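The repeated "ExecSync cmd from runtime service failed ... container is stopping" errors explain the readiness probe failure for ovn-northd-0: the kubelet tries to exec /usr/local/bin/container-scripts/status_check.sh inside a container that CRI-O is already stopping, so the exec cannot be registered and the probe errors out instead of returning a clean failure. Against a healthy replica the same check could be run by hand, roughly like this (a sketch, assuming the pod is running and the script path shown in the probe is present):

```bash
NS=openstack-kuttl-tests

# Run the probe command manually and report its exit code.
kubectl -n "$NS" exec ovn-northd-0 -c ovn-northd -- \
  /usr/local/bin/container-scripts/status_check.sh
echo "probe exit code: $?"
```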
containerID="ec915196559782a71aca3bf84560183d6b635d2588eb3d679d36d2bb6b7a0c3b" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerDied","Data":"048f7b423760b667b60e4906fd1a0700771bb5250eff55c5e498304ebcbb54a2"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerDied","Data":"8031dce4c85b09f1d38bba5769584ff64416d46b3003775991a52eb69e88f932"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerDied","Data":"02fae0c28694c60d27f62274bb757b8a4c5d3975c8580e05c8edede84309f354"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerDied","Data":"ec915196559782a71aca3bf84560183d6b635d2588eb3d679d36d2bb6b7a0c3b"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070393 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"0a9ac390-5efe-4f40-addc-795d259c81cb","Type":"ContainerDied","Data":"1950625fb18462814c13c211ec225c203ec0069591e8f5804442ccb5385c35fd"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070402 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1950625fb18462814c13c211ec225c203ec0069591e8f5804442ccb5385c35fd" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.070748 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.072244 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/044ada6d-cbbc-45dc-b229-3884d815427b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.072260 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdf571fc-fd44-4b32-bc32-5ccd319a5157-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.081036 5030 generic.go:334] "Generic (PLEG): container finished" podID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerID="640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a" exitCode=143 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.081389 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-rzn4z"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.081411 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerDied","Data":"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.083341 5030 generic.go:334] "Generic (PLEG): container finished" podID="815dc9f1-424f-499a-b27b-16f3f0483fff" 
containerID="8043d081ab68ab52f7f2599bf0db461c98d405d06b4245ce8431c2527b4cfcf7" exitCode=1 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.083372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" event={"ID":"815dc9f1-424f-499a-b27b-16f3f0483fff","Type":"ContainerDied","Data":"8043d081ab68ab52f7f2599bf0db461c98d405d06b4245ce8431c2527b4cfcf7"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.083393 5030 scope.go:117] "RemoveContainer" containerID="f633bba57d3d9c96735d364031f527fc137d20f358307a2a20ceaf07366876a0" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.083863 5030 scope.go:117] "RemoveContainer" containerID="8043d081ab68ab52f7f2599bf0db461c98d405d06b4245ce8431c2527b4cfcf7" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.084052 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-4lg5x_openstack-kuttl-tests(815dc9f1-424f-499a-b27b-16f3f0483fff)\"" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.107988 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-rzn4z"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.132993 5030 generic.go:334] "Generic (PLEG): container finished" podID="044ada6d-cbbc-45dc-b229-3884d815427b" containerID="f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.133086 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"044ada6d-cbbc-45dc-b229-3884d815427b","Type":"ContainerDied","Data":"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.133113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"044ada6d-cbbc-45dc-b229-3884d815427b","Type":"ContainerDied","Data":"1995a209fa47980dcdbde82a0f9dae04eafb043f0c657a0e98f0369c8daa5d19"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.133214 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.150449 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.150717 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerName="keystone-api" containerID="cri-o://22410699b2d975f66dd4fbd18bf8cdeb1ca6e285e64621da5b2a1c37a284b6b0" gracePeriod=30 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.157050 5030 generic.go:334] "Generic (PLEG): container finished" podID="24160216-4080-4fa9-88c0-f89910f0a933" containerID="066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.157110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerDied","Data":"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.157137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" event={"ID":"24160216-4080-4fa9-88c0-f89910f0a933","Type":"ContainerDied","Data":"f8852c89a7930bfa0404d0bb8f24020b2bfb241a1717436a1813a1c6008b8e3a"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.157228 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.160177 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-9g6g8"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.164022 5030 scope.go:117] "RemoveContainer" containerID="f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.177804 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdd2\" (UniqueName: \"kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.178439 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.178501 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts podName:815dc9f1-424f-499a-b27b-16f3f0483fff nodeName:}" failed. No retries permitted until 2026-01-20 23:52:33.678486014 +0000 UTC m=+4625.998746302 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts") pod "root-account-create-update-4lg5x" (UID: "815dc9f1-424f-499a-b27b-16f3f0483fff") : configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.180332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.182474 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-9g6g8"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.182551 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.183827 5030 generic.go:334] "Generic (PLEG): container finished" podID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" containerID="26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.183867 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerDied","Data":"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.183888 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"cdf571fc-fd44-4b32-bc32-5ccd319a5157","Type":"ContainerDied","Data":"392a99c47b75155bfabf1543efe845fd915bc665e86d59fee18e2aec6e7784ee"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.184212 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.191862 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.194947 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerID="409fe497b78edb037279005ae11210c5ba6b57531c5991487330124e7e48ec82" exitCode=143 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.195018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerDied","Data":"409fe497b78edb037279005ae11210c5ba6b57531c5991487330124e7e48ec82"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.206593 5030 generic.go:334] "Generic (PLEG): container finished" podID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerID="189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a" exitCode=0 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.206733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" event={"ID":"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8","Type":"ContainerDied","Data":"189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.216426 5030 generic.go:334] "Generic (PLEG): container finished" podID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" containerID="b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb" exitCode=2 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.216518 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.219709 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.221656 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.222291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f832da32-f1c8-47a5-8bcd-7ea67db8b416","Type":"ContainerDied","Data":"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.222330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"f832da32-f1c8-47a5-8bcd-7ea67db8b416","Type":"ContainerDied","Data":"db8482dc5d1a3de53f05c0df5f7b25915ec1c08554d77cf1e169d14db4b3bde0"} Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.222420 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-846584955c-2ghpj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.222422 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-95nrc"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.222567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-95nrc"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.228782 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.232872 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.243196 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.254259 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc6px\" (UniqueName: \"kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287381 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: 
\"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs\") pod \"0a9ac390-5efe-4f40-addc-795d259c81cb\" (UID: \"0a9ac390-5efe-4f40-addc-795d259c81cb\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287701 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdd2\" (UniqueName: \"kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.287980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.288101 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.288477 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.288526 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data podName:81ab5d03-ddb2-4d25-bec9-485f19fa3f13 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:37.288510923 +0000 UTC m=+4629.608771211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data") pod "rabbitmq-cell1-server-0" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.289188 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.289219 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:33.78921154 +0000 UTC m=+4626.109471828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.294342 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dvdd2 for pod openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.294412 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2 podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:33.794393936 +0000 UTC m=+4626.114654224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dvdd2" (UniqueName: "kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.294654 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.295066 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts" (OuterVolumeSpecName: "scripts") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.302959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px" (OuterVolumeSpecName: "kube-api-access-fc6px") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "kube-api-access-fc6px". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.332588 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-dvdd2 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" podUID="9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.381764 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.383781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.393120 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc6px\" (UniqueName: \"kubernetes.io/projected/0a9ac390-5efe-4f40-addc-795d259c81cb-kube-api-access-fc6px\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.393148 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.393158 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.393167 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a9ac390-5efe-4f40-addc-795d259c81cb-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.401831 5030 scope.go:117] "RemoveContainer" containerID="f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.403215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.408287 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40\": container with ID starting with f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40 not found: ID does not exist" containerID="f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.408327 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40"} err="failed to get container status \"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40\": rpc error: code = NotFound desc = could not find container \"f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40\": container with ID starting with f458a7429598f19652d1d9f95c5960c8181174945df0f944cd35ca49eede8e40 not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.408361 5030 scope.go:117] "RemoveContainer" containerID="066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.422574 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.207:8776/healthcheck\": read tcp 10.217.0.2:47158->10.217.1.207:8776: read: connection reset by peer" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.433541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.439148 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-ff97c958d-4m5xd"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.454186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.464816 5030 scope.go:117] "RemoveContainer" containerID="0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495315 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495343 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495355 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495370 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495383 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495392 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9855de-8b4b-4e39-940b-ff863190d072-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495405 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjrf\" (UniqueName: \"kubernetes.io/projected/6d9855de-8b4b-4e39-940b-ff863190d072-kube-api-access-hkjrf\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.495658 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.507071 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.514399 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-567c477ddd-tdrws"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.518367 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="galera" 
containerID="cri-o://79303e278316bc7cf3193899577e73e1c0136d14a75ad0b5c2bf54b109f7b1d9" gracePeriod=30 Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.526800 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data" (OuterVolumeSpecName: "config-data") pod "0a9ac390-5efe-4f40-addc-795d259c81cb" (UID: "0a9ac390-5efe-4f40-addc-795d259c81cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.528914 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.542148 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.560144 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.585247 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-846584955c-2ghpj"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.592639 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-846584955c-2ghpj"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.599426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26j2\" (UniqueName: \"kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2\") pod \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\" (UID: \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.599709 5030 scope.go:117] "RemoveContainer" containerID="066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.600284 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d\": container with ID starting with 066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d not found: ID does not exist" containerID="066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.600337 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d"} err="failed to get container status \"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d\": rpc error: code = NotFound desc = could not find container \"066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d\": container with ID starting with 066e433777c6f4fb06a9686d3ad45153cfc06dd3189e1674846e4aa6746e5f4d not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.600443 5030 scope.go:117] "RemoveContainer" containerID="0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.601279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts\") pod \"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\" (UID: 
\"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4\") " Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.601344 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65\": container with ID starting with 0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65 not found: ID does not exist" containerID="0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.601998 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65"} err="failed to get container status \"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65\": rpc error: code = NotFound desc = could not find container \"0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65\": container with ID starting with 0dbbd5636d7c51b217c065b6413313ea12f5c9838b12455c2d92415149d72b65 not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.602027 5030 scope.go:117] "RemoveContainer" containerID="26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.601820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a16ed2d-dd15-41c1-b6ef-61d242bd61a4" (UID: "4a16ed2d-dd15-41c1-b6ef-61d242bd61a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.602405 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.602429 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9ac390-5efe-4f40-addc-795d259c81cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.604370 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.604861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2" (OuterVolumeSpecName: "kube-api-access-r26j2") pod "4a16ed2d-dd15-41c1-b6ef-61d242bd61a4" (UID: "4a16ed2d-dd15-41c1-b6ef-61d242bd61a4"). InnerVolumeSpecName "kube-api-access-r26j2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.612046 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-79545f7658-lhclg"] Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.661808 5030 scope.go:117] "RemoveContainer" containerID="9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.693218 5030 scope.go:117] "RemoveContainer" containerID="26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.693540 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe\": container with ID starting with 26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe not found: ID does not exist" containerID="26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.693588 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe"} err="failed to get container status \"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe\": rpc error: code = NotFound desc = could not find container \"26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe\": container with ID starting with 26977dc366c36dd6821aaf3e4eaa5d87aeefe7baa15409a5c0bce4fd3fa23efe not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.693631 5030 scope.go:117] "RemoveContainer" containerID="9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.693877 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84\": container with ID starting with 9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84 not found: ID does not exist" containerID="9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.693897 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84"} err="failed to get container status \"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84\": rpc error: code = NotFound desc = could not find container \"9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84\": container with ID starting with 9262e8bee8e2eb2f56051f3125b33978e0a66c8e21628ff8884448813720bf84 not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.693909 5030 scope.go:117] "RemoveContainer" containerID="b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.703675 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.703698 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26j2\" (UniqueName: 
\"kubernetes.io/projected/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4-kube-api-access-r26j2\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.703709 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21ceeb6e-ad82-423c-8a0c-a946171a2b5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.703768 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.703850 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts podName:815dc9f1-424f-499a-b27b-16f3f0483fff nodeName:}" failed. No retries permitted until 2026-01-20 23:52:34.703830768 +0000 UTC m=+4627.024091056 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts") pod "root-account-create-update-4lg5x" (UID: "815dc9f1-424f-499a-b27b-16f3f0483fff") : configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.760881 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.777206 5030 scope.go:117] "RemoveContainer" containerID="b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.777341 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.777747 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb\": container with ID starting with b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb not found: ID does not exist" containerID="b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.777792 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb"} err="failed to get container status \"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb\": rpc error: code = NotFound desc = could not find container \"b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb\": container with ID starting with b1e5db55c72f251dc3ec01cc2adc3e99ecab004e604b0ef4205e9f8eccfb3fdb not found: ID does not exist" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.805235 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.805348 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdd2\" (UniqueName: 
\"kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.805792 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.805838 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:34.805818812 +0000 UTC m=+4627.126079100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : configmap "openstack-scripts" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.812525 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dvdd2 for pod openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:33 crc kubenswrapper[5030]: E0120 23:52:33.813929 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2 podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:34.813909728 +0000 UTC m=+4627.134170016 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dvdd2" (UniqueName: "kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.906591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmm6\" (UniqueName: \"kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6\") pod \"ee604c21-d89c-4f46-ae93-3912ff1da96d\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.907436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts\") pod \"ee604c21-d89c-4f46-ae93-3912ff1da96d\" (UID: \"ee604c21-d89c-4f46-ae93-3912ff1da96d\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.907574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znn5d\" (UniqueName: \"kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d\") pod \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.907679 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts\") pod \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\" (UID: \"7fd42af9-412d-4b7d-8be1-2a7c9008795f\") " Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.907815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee604c21-d89c-4f46-ae93-3912ff1da96d" (UID: "ee604c21-d89c-4f46-ae93-3912ff1da96d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.908248 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee604c21-d89c-4f46-ae93-3912ff1da96d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.908603 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fd42af9-412d-4b7d-8be1-2a7c9008795f" (UID: "7fd42af9-412d-4b7d-8be1-2a7c9008795f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.911878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d" (OuterVolumeSpecName: "kube-api-access-znn5d") pod "7fd42af9-412d-4b7d-8be1-2a7c9008795f" (UID: "7fd42af9-412d-4b7d-8be1-2a7c9008795f"). InnerVolumeSpecName "kube-api-access-znn5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.912478 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6" (OuterVolumeSpecName: "kube-api-access-lxmm6") pod "ee604c21-d89c-4f46-ae93-3912ff1da96d" (UID: "ee604c21-d89c-4f46-ae93-3912ff1da96d"). InnerVolumeSpecName "kube-api-access-lxmm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.972786 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="044ada6d-cbbc-45dc-b229-3884d815427b" path="/var/lib/kubelet/pods/044ada6d-cbbc-45dc-b229-3884d815427b/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.973395 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afa8856-f941-4f8d-a68c-94e605b4825a" path="/var/lib/kubelet/pods/0afa8856-f941-4f8d-a68c-94e605b4825a/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.973886 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3d6688-641e-4058-a346-61c3ee44938c" path="/var/lib/kubelet/pods/0d3d6688-641e-4058-a346-61c3ee44938c/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.974355 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20bd2b86-ed12-4296-97db-55879b9d87ac" path="/var/lib/kubelet/pods/20bd2b86-ed12-4296-97db-55879b9d87ac/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.975437 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ceeb6e-ad82-423c-8a0c-a946171a2b5d" path="/var/lib/kubelet/pods/21ceeb6e-ad82-423c-8a0c-a946171a2b5d/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.975826 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24160216-4080-4fa9-88c0-f89910f0a933" path="/var/lib/kubelet/pods/24160216-4080-4fa9-88c0-f89910f0a933/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.976511 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a438c50-84b9-49fb-9f49-0cfdbe2bab0a" path="/var/lib/kubelet/pods/3a438c50-84b9-49fb-9f49-0cfdbe2bab0a/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.977407 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9855de-8b4b-4e39-940b-ff863190d072" path="/var/lib/kubelet/pods/6d9855de-8b4b-4e39-940b-ff863190d072/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.977825 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b476b9d8-8c44-47b1-9736-3b7441a703bd" path="/var/lib/kubelet/pods/b476b9d8-8c44-47b1-9736-3b7441a703bd/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.979685 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf571fc-fd44-4b32-bc32-5ccd319a5157" path="/var/lib/kubelet/pods/cdf571fc-fd44-4b32-bc32-5ccd319a5157/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.980752 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4653dc-2f44-44ec-a22f-073fd4782f85" path="/var/lib/kubelet/pods/ed4653dc-2f44-44ec-a22f-073fd4782f85/volumes" Jan 20 23:52:33 crc kubenswrapper[5030]: I0120 23:52:33.981372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f832da32-f1c8-47a5-8bcd-7ea67db8b416" path="/var/lib/kubelet/pods/f832da32-f1c8-47a5-8bcd-7ea67db8b416/volumes" Jan 20 23:52:34 crc 
kubenswrapper[5030]: I0120 23:52:34.010513 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znn5d\" (UniqueName: \"kubernetes.io/projected/7fd42af9-412d-4b7d-8be1-2a7c9008795f-kube-api-access-znn5d\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.010544 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fd42af9-412d-4b7d-8be1-2a7c9008795f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.010555 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmm6\" (UniqueName: \"kubernetes.io/projected/ee604c21-d89c-4f46-ae93-3912ff1da96d-kube-api-access-lxmm6\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.148553 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.159395 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.178054 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.209246 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.72:5671: connect: connection refused" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212747 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212903 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.212989 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213037 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213056 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213096 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpld\" (UniqueName: \"kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld\") pod \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\" (UID: \"184162cc-2ce3-4a2e-9167-ff4ffa8cd188\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.213184 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle\") pod \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\" (UID: \"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.214310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs" (OuterVolumeSpecName: "logs") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.216877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.217461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs" (OuterVolumeSpecName: "logs") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.217568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts" (OuterVolumeSpecName: "scripts") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.222750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts" (OuterVolumeSpecName: "scripts") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.222759 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.222827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld" (OuterVolumeSpecName: "kube-api-access-hhpld") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "kube-api-access-hhpld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.222880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p" (OuterVolumeSpecName: "kube-api-access-zvm4p") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "kube-api-access-zvm4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.263471 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.272175 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.286575 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerID="79303e278316bc7cf3193899577e73e1c0136d14a75ad0b5c2bf54b109f7b1d9" exitCode=0 Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.286677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerDied","Data":"79303e278316bc7cf3193899577e73e1c0136d14a75ad0b5c2bf54b109f7b1d9"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.313818 5030 generic.go:334] "Generic (PLEG): container finished" podID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerID="882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1" exitCode=0 Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.313900 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerDied","Data":"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.313925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"184162cc-2ce3-4a2e-9167-ff4ffa8cd188","Type":"ContainerDied","Data":"2addcdc8e5aca18200a55a675870e879d87f04f7750e2bb174d7710bfd8848d6"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.313943 5030 scope.go:117] "RemoveContainer" containerID="882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.314044 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.315878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7p4h\" (UniqueName: \"kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.315939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.315979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4wq\" (UniqueName: \"kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq\") pod \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data\") pod \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316092 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316121 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs\") pod \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\" (UID: 
\"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316249 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config\") pod \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs\") pod \"7c7e15f9-5447-4550-9b78-91a664abf455\" (UID: \"7c7e15f9-5447-4550-9b78-91a664abf455\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle\") pod \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\" (UID: \"84e84d7f-54c8-43d1-a0ee-29d3e173f662\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316690 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpld\" (UniqueName: \"kubernetes.io/projected/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-kube-api-access-hhpld\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316702 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316711 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316720 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316729 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316738 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316747 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316756 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvm4p\" (UniqueName: \"kubernetes.io/projected/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-kube-api-access-zvm4p\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.316767 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.338996 5030 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs" (OuterVolumeSpecName: "logs") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.342156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "84e84d7f-54c8-43d1-a0ee-29d3e173f662" (UID: "84e84d7f-54c8-43d1-a0ee-29d3e173f662"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.344834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" event={"ID":"ee604c21-d89c-4f46-ae93-3912ff1da96d","Type":"ContainerDied","Data":"f804646470ae9638965f80dbee604d9fda77a7ebbda062a817352238b0eb789d"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.345003 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.345549 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data" (OuterVolumeSpecName: "config-data") pod "84e84d7f-54c8-43d1-a0ee-29d3e173f662" (UID: "84e84d7f-54c8-43d1-a0ee-29d3e173f662"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.346071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.353277 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerID="d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e" exitCode=0 Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.353332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerDied","Data":"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.353357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" event={"ID":"f1b4ffff-8836-4c4e-be88-1d0cc9213a5a","Type":"ContainerDied","Data":"045b68bed21ea5278f19fe263879de21960af7eeff5e88c071234831b358a2af"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.353416 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5484cf4b86-wsff9" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.354590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" event={"ID":"7fd42af9-412d-4b7d-8be1-2a7c9008795f","Type":"ContainerDied","Data":"1f57f8cc45e473878b7fbfb5968110cfe6982171bf2f81ec0232410672e7c9fe"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.354646 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.360379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.361946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts" (OuterVolumeSpecName: "scripts") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.362037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" event={"ID":"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8","Type":"ContainerStarted","Data":"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.363722 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.381963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.382998 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-4lg5x" secret="" err="secret \"galera-openstack-dockercfg-f8qjh\" not found" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.383037 5030 scope.go:117] "RemoveContainer" containerID="8043d081ab68ab52f7f2599bf0db461c98d405d06b4245ce8431c2527b4cfcf7" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.383399 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-4lg5x_openstack-kuttl-tests(815dc9f1-424f-499a-b27b-16f3f0483fff)\"" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.383582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq" (OuterVolumeSpecName: "kube-api-access-ft4wq") pod "84e84d7f-54c8-43d1-a0ee-29d3e173f662" (UID: "84e84d7f-54c8-43d1-a0ee-29d3e173f662"). InnerVolumeSpecName "kube-api-access-ft4wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.384200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" event={"ID":"4a16ed2d-dd15-41c1-b6ef-61d242bd61a4","Type":"ContainerDied","Data":"cea474c26653665197ef3cbcc33160450bc456d1e2e5d6873baf8abc1fbda87c"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.384272 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.390017 5030 generic.go:334] "Generic (PLEG): container finished" podID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" containerID="905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5" exitCode=0 Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.390077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"84e84d7f-54c8-43d1-a0ee-29d3e173f662","Type":"ContainerDied","Data":"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.390101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"84e84d7f-54c8-43d1-a0ee-29d3e173f662","Type":"ContainerDied","Data":"03e71d1266c739615d8e5fd86af9aa963f92040fafee649a5536a3dcdcde51b6"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.390150 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.396750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data" (OuterVolumeSpecName: "config-data") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.397346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h" (OuterVolumeSpecName: "kube-api-access-b7p4h") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "kube-api-access-b7p4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.397448 5030 scope.go:117] "RemoveContainer" containerID="413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.408075 5030 generic.go:334] "Generic (PLEG): container finished" podID="7c7e15f9-5447-4550-9b78-91a664abf455" containerID="659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca" exitCode=0 Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.408147 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerDied","Data":"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.408174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"7c7e15f9-5447-4550-9b78-91a664abf455","Type":"ContainerDied","Data":"8f1339c559af2e1446af82c4175e39a6aa13b7b5be46fea7e205cb6a8a0be31a"} Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.408256 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418069 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4wq\" (UniqueName: \"kubernetes.io/projected/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kube-api-access-ft4wq\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418090 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418109 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418119 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418127 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418136 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e84d7f-54c8-43d1-a0ee-29d3e173f662-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418144 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418154 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7p4h\" (UniqueName: \"kubernetes.io/projected/7c7e15f9-5447-4550-9b78-91a664abf455-kube-api-access-b7p4h\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418162 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c7e15f9-5447-4550-9b78-91a664abf455-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.418170 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.422831 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.423456 5030 scope.go:117] "RemoveContainer" containerID="882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.425927 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1\": container with ID starting with 882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1 not found: ID does not exist" containerID="882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.425960 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1"} err="failed to get container status \"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1\": rpc error: code = NotFound desc = could not find container \"882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1\": container with ID starting with 882be4226caa10dc7afd14321a5a024ab43fdef7d8d9c2cc6348bc47f5513cb1 not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.425981 5030 scope.go:117] "RemoveContainer" containerID="413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.426658 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54\": container with ID starting with 413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54 not found: ID does not exist" containerID="413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.426678 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54"} err="failed to get container status \"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54\": rpc error: code = NotFound desc = could not find container \"413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54\": container with ID starting with 413322f6b3a5c0fff6d3a7d166e45c2e48dcd0a3a60cc53ad151616f2a518a54 not 
found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.426690 5030 scope.go:117] "RemoveContainer" containerID="d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.426843 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.426858 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.428010 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.438032 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-02a2-account-create-update-8wg5z"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.448897 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" podStartSLOduration=5.448881462 podStartE2EDuration="5.448881462s" podCreationTimestamp="2026-01-20 23:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:52:34.397154638 +0000 UTC m=+4626.717414926" watchObservedRunningTime="2026-01-20 23:52:34.448881462 +0000 UTC m=+4626.769141750" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.451222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data" (OuterVolumeSpecName: "config-data") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.452000 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.471904 5030 scope.go:117] "RemoveContainer" containerID="255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.474768 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.481767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "184162cc-2ce3-4a2e-9167-ff4ffa8cd188" (UID: "184162cc-2ce3-4a2e-9167-ff4ffa8cd188"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.485846 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-3da3-account-create-update-v87kw"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.496795 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.215:8775/\": dial tcp 10.217.1.215:8775: connect: connection refused" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.497701 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.215:8775/\": dial tcp 10.217.1.215:8775: connect: connection refused" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.501653 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.502782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.514468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.514597 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84e84d7f-54c8-43d1-a0ee-29d3e173f662" (UID: "84e84d7f-54c8-43d1-a0ee-29d3e173f662"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.519087 5030 scope.go:117] "RemoveContainer" containerID="d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.519680 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e\": container with ID starting with d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e not found: ID does not exist" containerID="d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.519704 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e"} err="failed to get container status \"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e\": rpc error: code = NotFound desc = could not find container \"d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e\": container with ID starting with d2251a881137fe8b2237286d8f579027b803fabf914d7b024ad2945b1ccc202e not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.519725 5030 scope.go:117] "RemoveContainer" containerID="255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.520483 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30\": container with ID starting with 255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30 not found: ID does not exist" containerID="255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520510 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w56km\" (UniqueName: \"kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520593 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520632 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520548 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30"} err="failed to get container status \"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30\": rpc error: code = NotFound desc = could not find container \"255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30\": container with ID starting with 255c4a9058db6132dbc1d591cdfa04bd107d181e1d3692fc5d55bf5752456d30 not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520690 5030 scope.go:117] "RemoveContainer" containerID="905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.520993 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.521020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default\") pod \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\" (UID: \"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.522327 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.522384 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/184162cc-2ce3-4a2e-9167-ff4ffa8cd188-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.522397 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.522411 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.522449 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.523254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.523910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.523980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.524323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.526357 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.537448 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-8027-account-create-update-xbzfc"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.557900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data" (OuterVolumeSpecName: "config-data") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.564932 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km" (OuterVolumeSpecName: "kube-api-access-w56km") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "kube-api-access-w56km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.568121 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.571107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.574856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.586006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.593738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7c7e15f9-5447-4550-9b78-91a664abf455" (UID: "7c7e15f9-5447-4550-9b78-91a664abf455"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.599035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.606891 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "84e84d7f-54c8-43d1-a0ee-29d3e173f662" (UID: "84e84d7f-54c8-43d1-a0ee-29d3e173f662"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623534 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623798 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56km\" (UniqueName: \"kubernetes.io/projected/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kube-api-access-w56km\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623808 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623819 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623827 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623836 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e84d7f-54c8-43d1-a0ee-29d3e173f662-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623845 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623853 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623861 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623885 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623894 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c7e15f9-5447-4550-9b78-91a664abf455-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.623903 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.628856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod 
"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" (UID: "c5bf870a-6b91-4f5b-8e86-f87d34d97aa4"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.639881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" (UID: "f1b4ffff-8836-4c4e-be88-1d0cc9213a5a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.643869 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.680561 5030 scope.go:117] "RemoveContainer" containerID="905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.681352 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5\": container with ID starting with 905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5 not found: ID does not exist" containerID="905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.681378 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5"} err="failed to get container status \"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5\": rpc error: code = NotFound desc = could not find container \"905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5\": container with ID starting with 905aa18ac6c1a58594d7762f6524c72d05436f56d34dfa2fbeeeba54d58677b5 not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.681398 5030 scope.go:117] "RemoveContainer" containerID="659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.683425 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.693727 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.706975 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.712183 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5484cf4b86-wsff9"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.719687 5030 scope.go:117] "RemoveContainer" containerID="83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.725064 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.725090 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.725101 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.725144 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.725195 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts podName:815dc9f1-424f-499a-b27b-16f3f0483fff nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.725180475 +0000 UTC m=+4629.045440763 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts") pod "root-account-create-update-4lg5x" (UID: "815dc9f1-424f-499a-b27b-16f3f0483fff") : configmap "openstack-scripts" not found Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.771365 5030 scope.go:117] "RemoveContainer" containerID="659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.772184 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca\": container with ID starting with 659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca not found: ID does not exist" containerID="659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.772216 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca"} err="failed to get container status \"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca\": rpc error: code = NotFound desc = could not find container \"659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca\": container with ID starting with 659c6080b290beac39c64d7bbf8b45b9b4302d2101a13ac504110cc794f1c3ca not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.772470 5030 scope.go:117] "RemoveContainer" containerID="83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.773235 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0\": container with ID starting with 83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0 not found: ID does not exist" containerID="83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.773264 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0"} err="failed to get container status \"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0\": rpc error: code = NotFound desc = could not find container 
\"83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0\": container with ID starting with 83b2f026ab0c42167ca1bb5968a95879a1cec2e859ab889dcd0ef3007905b2f0 not found: ID does not exist" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.778441 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.800354 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.803319 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 is running failed: container process not found" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.804148 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 is running failed: container process not found" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.804554 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 is running failed: container process not found" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.804612 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.811529 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.825291 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826007 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826107 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czh6v\" (UniqueName: \"kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs\") pod \"b54968d4-ae84-4e25-8899-6113e78d9a2b\" (UID: \"b54968d4-ae84-4e25-8899-6113e78d9a2b\") " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826517 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdd2\" (UniqueName: \"kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.826611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts\") pod \"keystone-a1e6-account-create-update-ggxcj\" (UID: \"9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca\") " pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.826727 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.826773 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.826758428 +0000 UTC m=+4629.147018716 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : configmap "openstack-scripts" not found Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.830109 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.831838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs" (OuterVolumeSpecName: "logs") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.831895 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.834805 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dvdd2 for pod openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:34 crc kubenswrapper[5030]: E0120 23:52:34.834856 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2 podName:9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca nodeName:}" failed. No retries permitted until 2026-01-20 23:52:36.834839945 +0000 UTC m=+4629.155100233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dvdd2" (UniqueName: "kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2") pod "keystone-a1e6-account-create-update-ggxcj" (UID: "9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.834883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.839504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v" (OuterVolumeSpecName: "kube-api-access-czh6v") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "kube-api-access-czh6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.841584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts" (OuterVolumeSpecName: "scripts") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.879665 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.891943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.892588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data" (OuterVolumeSpecName: "config-data") pod "b54968d4-ae84-4e25-8899-6113e78d9a2b" (UID: "b54968d4-ae84-4e25-8899-6113e78d9a2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928019 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928046 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928058 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928080 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928090 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928099 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54968d4-ae84-4e25-8899-6113e78d9a2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928108 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czh6v\" (UniqueName: \"kubernetes.io/projected/b54968d4-ae84-4e25-8899-6113e78d9a2b-kube-api-access-czh6v\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.928116 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b54968d4-ae84-4e25-8899-6113e78d9a2b-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.953982 5030 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.966425 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.214:9311/healthcheck\": read tcp 10.217.0.2:37640->10.217.1.214:9311: read: connection reset by peer" Jan 20 23:52:34 crc kubenswrapper[5030]: I0120 23:52:34.966734 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.214:9311/healthcheck\": read tcp 10.217.0.2:37628->10.217.1.214:9311: read: connection reset by peer" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.030355 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.269047 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.279450 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.291126 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.299741 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs\") pod \"acb8010d-f18b-417a-919f-13e959ccf452\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336165 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs\") pod \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shf5\" (UniqueName: \"kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5\") pod \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle\") pod \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle\") pod \"acb8010d-f18b-417a-919f-13e959ccf452\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs\") pod \"acb8010d-f18b-417a-919f-13e959ccf452\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs\") pod \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336417 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom\") pod \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data\") pod \"acb8010d-f18b-417a-919f-13e959ccf452\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nxdw\" (UniqueName: \"kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw\") pod \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\" (UID: 
\"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom\") pod \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle\") pod \"7daf347c-1524-4050-b50e-deb2024da0cc\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bfgn\" (UniqueName: \"kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn\") pod \"7daf347c-1524-4050-b50e-deb2024da0cc\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336643 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data\") pod \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\" (UID: \"d3ed2d89-dd21-446c-b85b-b44bb72512bc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data\") pod \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336699 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtlkk\" (UniqueName: \"kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk\") pod \"acb8010d-f18b-417a-919f-13e959ccf452\" (UID: \"acb8010d-f18b-417a-919f-13e959ccf452\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle\") pod \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\" (UID: \"611e73cb-eb48-4fed-9073-7b3ee43b88e8\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.336787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data\") pod \"7daf347c-1524-4050-b50e-deb2024da0cc\" (UID: \"7daf347c-1524-4050-b50e-deb2024da0cc\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.338269 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs" (OuterVolumeSpecName: "logs") pod "acb8010d-f18b-417a-919f-13e959ccf452" (UID: "acb8010d-f18b-417a-919f-13e959ccf452"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.339325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs" (OuterVolumeSpecName: "logs") pod "611e73cb-eb48-4fed-9073-7b3ee43b88e8" (UID: "611e73cb-eb48-4fed-9073-7b3ee43b88e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.339888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs" (OuterVolumeSpecName: "logs") pod "d3ed2d89-dd21-446c-b85b-b44bb72512bc" (UID: "d3ed2d89-dd21-446c-b85b-b44bb72512bc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.348939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "611e73cb-eb48-4fed-9073-7b3ee43b88e8" (UID: "611e73cb-eb48-4fed-9073-7b3ee43b88e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.360845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn" (OuterVolumeSpecName: "kube-api-access-4bfgn") pod "7daf347c-1524-4050-b50e-deb2024da0cc" (UID: "7daf347c-1524-4050-b50e-deb2024da0cc"). InnerVolumeSpecName "kube-api-access-4bfgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.369902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5" (OuterVolumeSpecName: "kube-api-access-8shf5") pod "611e73cb-eb48-4fed-9073-7b3ee43b88e8" (UID: "611e73cb-eb48-4fed-9073-7b3ee43b88e8"). InnerVolumeSpecName "kube-api-access-8shf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.376363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d3ed2d89-dd21-446c-b85b-b44bb72512bc" (UID: "d3ed2d89-dd21-446c-b85b-b44bb72512bc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.377238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk" (OuterVolumeSpecName: "kube-api-access-dtlkk") pod "acb8010d-f18b-417a-919f-13e959ccf452" (UID: "acb8010d-f18b-417a-919f-13e959ccf452"). InnerVolumeSpecName "kube-api-access-dtlkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.385178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw" (OuterVolumeSpecName: "kube-api-access-6nxdw") pod "d3ed2d89-dd21-446c-b85b-b44bb72512bc" (UID: "d3ed2d89-dd21-446c-b85b-b44bb72512bc"). 
InnerVolumeSpecName "kube-api-access-6nxdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.386829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data" (OuterVolumeSpecName: "config-data") pod "7daf347c-1524-4050-b50e-deb2024da0cc" (UID: "7daf347c-1524-4050-b50e-deb2024da0cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.389391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "611e73cb-eb48-4fed-9073-7b3ee43b88e8" (UID: "611e73cb-eb48-4fed-9073-7b3ee43b88e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.408515 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3ed2d89-dd21-446c-b85b-b44bb72512bc" (UID: "d3ed2d89-dd21-446c-b85b-b44bb72512bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.416688 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data" (OuterVolumeSpecName: "config-data") pod "acb8010d-f18b-417a-919f-13e959ccf452" (UID: "acb8010d-f18b-417a-919f-13e959ccf452"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.433163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acb8010d-f18b-417a-919f-13e959ccf452" (UID: "acb8010d-f18b-417a-919f-13e959ccf452"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.436376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7daf347c-1524-4050-b50e-deb2024da0cc" (UID: "7daf347c-1524-4050-b50e-deb2024da0cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.441899 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/611e73cb-eb48-4fed-9073-7b3ee43b88e8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442020 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shf5\" (UniqueName: \"kubernetes.io/projected/611e73cb-eb48-4fed-9073-7b3ee43b88e8-kube-api-access-8shf5\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442104 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442190 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442264 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acb8010d-f18b-417a-919f-13e959ccf452-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442317 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ed2d89-dd21-446c-b85b-b44bb72512bc-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442375 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442430 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442482 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nxdw\" (UniqueName: \"kubernetes.io/projected/d3ed2d89-dd21-446c-b85b-b44bb72512bc-kube-api-access-6nxdw\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442535 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442586 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442664 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bfgn\" (UniqueName: \"kubernetes.io/projected/7daf347c-1524-4050-b50e-deb2024da0cc-kube-api-access-4bfgn\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442782 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtlkk\" (UniqueName: \"kubernetes.io/projected/acb8010d-f18b-417a-919f-13e959ccf452-kube-api-access-dtlkk\") on node \"crc\" DevicePath \"\"" Jan 
20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442837 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.442889 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daf347c-1524-4050-b50e-deb2024da0cc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.452949 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data" (OuterVolumeSpecName: "config-data") pod "d3ed2d89-dd21-446c-b85b-b44bb72512bc" (UID: "d3ed2d89-dd21-446c-b85b-b44bb72512bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.459647 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data" (OuterVolumeSpecName: "config-data") pod "611e73cb-eb48-4fed-9073-7b3ee43b88e8" (UID: "611e73cb-eb48-4fed-9073-7b3ee43b88e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.460125 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerID="fc0ce133b045b63568d1b17b3af0a763ae05a71804fb6407998acf7b7d0be9dd" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.460204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerDied","Data":"fc0ce133b045b63568d1b17b3af0a763ae05a71804fb6407998acf7b7d0be9dd"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.465974 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerID="39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.466015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerDied","Data":"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.466055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" event={"ID":"d3ed2d89-dd21-446c-b85b-b44bb72512bc","Type":"ContainerDied","Data":"33a963852f5f74a4ebd888b9e512423d08c12d4c9b02d76e4d46f9e67720a6f1"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.466071 5030 scope.go:117] "RemoveContainer" containerID="39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.466240 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.475343 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.476234 5030 generic.go:334] "Generic (PLEG): container finished" podID="acb8010d-f18b-417a-919f-13e959ccf452" containerID="d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.476263 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.476327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerDied","Data":"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.476360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"acb8010d-f18b-417a-919f-13e959ccf452","Type":"ContainerDied","Data":"19b65cf0eeb9ae9ab9473dda49ea2757adf4eca1aaed79eb8349aaff7223bcad"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.480391 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_55537e5f-bb41-4899-a095-e87c7495ebdf/ovn-northd/0.log" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.480429 5030 generic.go:334] "Generic (PLEG): container finished" podID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerID="cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" exitCode=139 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.480474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerDied","Data":"cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.484018 5030 generic.go:334] "Generic (PLEG): container finished" podID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerID="56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.484102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerDied","Data":"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.484139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" event={"ID":"611e73cb-eb48-4fed-9073-7b3ee43b88e8","Type":"ContainerDied","Data":"c1b7fd9d7ea1d3ccd3b70102bb4c482c5343dd2f77a86f37c2b7339baf55ecf9"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.484203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.487176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "acb8010d-f18b-417a-919f-13e959ccf452" (UID: "acb8010d-f18b-417a-919f-13e959ccf452"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.487671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"c5bf870a-6b91-4f5b-8e86-f87d34d97aa4","Type":"ContainerDied","Data":"b3bc34edace7fc3eca37d67215afa4a9bd7b0e4340834c68cf70f7d37b951847"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.487773 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.496246 5030 generic.go:334] "Generic (PLEG): container finished" podID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerID="d99eff37903cee5608b24fb64a68cb6530669b4caba4e1642f9039b72ff5ad58" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.496339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerDied","Data":"d99eff37903cee5608b24fb64a68cb6530669b4caba4e1642f9039b72ff5ad58"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.505751 5030 scope.go:117] "RemoveContainer" containerID="29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.507064 5030 generic.go:334] "Generic (PLEG): container finished" podID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerID="5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.507131 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.507177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerDied","Data":"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.507203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"b54968d4-ae84-4e25-8899-6113e78d9a2b","Type":"ContainerDied","Data":"9b0f40d49806f25db8316c7623cd1178cdc747495ed183b07f257fae24d459f1"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.507237 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.73:5671: connect: connection refused" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.514399 5030 generic.go:334] "Generic (PLEG): container finished" podID="7daf347c-1524-4050-b50e-deb2024da0cc" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" exitCode=0 Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.514584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerDied","Data":"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.514641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" 
event={"ID":"7daf347c-1524-4050-b50e-deb2024da0cc","Type":"ContainerDied","Data":"1c9a6d35ee19d31ab24156b86f222c57b195b06277a2e3a4b18222c4abea3ef5"} Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.514690 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.515134 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.529779 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.531855 5030 scope.go:117] "RemoveContainer" containerID="39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.532830 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e\": container with ID starting with 39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e not found: ID does not exist" containerID="39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.532871 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e"} err="failed to get container status \"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e\": rpc error: code = NotFound desc = could not find container \"39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e\": container with ID starting with 39cbbaace20c2208aaabbaa59903f192867600c05fcdfb4a1ad5000eb6ffd70e not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.532895 5030 scope.go:117] "RemoveContainer" containerID="29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.535037 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294\": container with ID starting with 29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294 not found: ID does not exist" containerID="29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.535079 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294"} err="failed to get container status \"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294\": rpc error: code = NotFound desc = could not find container \"29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294\": container with ID starting with 29681178c1d15195b91a0a520a9ab9575c9dc60635da73365d7bf863a845e294 not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.535112 5030 scope.go:117] "RemoveContainer" containerID="d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.536351 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:52:35 crc 
kubenswrapper[5030]: I0120 23:52:35.544425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.544501 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.544597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545365 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpkhw\" (UniqueName: \"kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle\") pod \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\" (UID: \"a9df19e9-10ce-496f-9ae7-89ac55ba91cf\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545930 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ed2d89-dd21-446c-b85b-b44bb72512bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545949 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/611e73cb-eb48-4fed-9073-7b3ee43b88e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545959 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acb8010d-f18b-417a-919f-13e959ccf452-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.545290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs" (OuterVolumeSpecName: "logs") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: 
"a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.553003 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.569069 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-656dcc67f7-xmnf4"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.570156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.570352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw" (OuterVolumeSpecName: "kube-api-access-hpkhw") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "kube-api-access-hpkhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.574811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.622088 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.639315 5030 scope.go:117] "RemoveContainer" containerID="2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.643518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data" (OuterVolumeSpecName: "config-data") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.644953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.645073 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a9df19e9-10ce-496f-9ae7-89ac55ba91cf" (UID: "a9df19e9-10ce-496f-9ae7-89ac55ba91cf"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.646898 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.646991 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.647056 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.647116 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.647171 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.647234 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpkhw\" (UniqueName: \"kubernetes.io/projected/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-kube-api-access-hpkhw\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.647295 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9df19e9-10ce-496f-9ae7-89ac55ba91cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.669128 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-5db5b7584d-vzrsp"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.674981 5030 scope.go:117] "RemoveContainer" containerID="d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.677419 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16\": container with ID starting with d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16 not found: ID does not exist" containerID="d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.677466 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16"} err="failed to get container status \"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16\": rpc error: code = NotFound desc = could not find container \"d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16\": container with ID starting with d0823fe72a1822171721721bd07c6c91f0239b31d0a862490eb2023a64940f16 not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.677495 5030 scope.go:117] "RemoveContainer" containerID="2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153" Jan 20 23:52:35 
crc kubenswrapper[5030]: E0120 23:52:35.677789 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153\": container with ID starting with 2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153 not found: ID does not exist" containerID="2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.677820 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153"} err="failed to get container status \"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153\": rpc error: code = NotFound desc = could not find container \"2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153\": container with ID starting with 2832a1e00722193c8dc84462e3b5eaa3de72e46dcd9c97eb2ec6a97c0ac4c153 not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.677840 5030 scope.go:117] "RemoveContainer" containerID="56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.681666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.692573 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.704526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.706310 5030 scope.go:117] "RemoveContainer" containerID="640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.707405 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.713036 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.727501 5030 scope.go:117] "RemoveContainer" containerID="56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.728724 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400\": container with ID starting with 56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400 not found: ID does not exist" containerID="56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.728780 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400"} err="failed to get container status \"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400\": rpc error: code = NotFound desc = could not find container \"56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400\": container with ID starting with 56c4a1d60ce76e519f42d8bf41166f1c90ffea4940e28ac6032adf8c11f04400 not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.728813 5030 scope.go:117] "RemoveContainer" containerID="640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.729252 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a\": container with ID starting with 640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a not found: ID does not exist" containerID="640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.729283 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a"} err="failed to get container status \"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a\": rpc error: code = NotFound desc = could not find container \"640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a\": container with ID starting with 640b59c1c1f24c64cc46c43b80b843802e0c48890d90206a9daf79885221c80a not found: ID does not exist" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.729304 5030 scope.go:117] "RemoveContainer" containerID="79303e278316bc7cf3193899577e73e1c0136d14a75ad0b5c2bf54b109f7b1d9" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.736809 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.750300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.750636 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.750998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.751055 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.751154 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-a1e6-account-create-update-ggxcj"] Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.751267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bggnm\" (UniqueName: \"kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.751388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs\") pod \"dbabd573-5784-4cb5-8294-cf65951a73ba\" (UID: \"dbabd573-5784-4cb5-8294-cf65951a73ba\") " Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.753267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs" (OuterVolumeSpecName: "logs") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.754493 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm" (OuterVolumeSpecName: "kube-api-access-bggnm") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "kube-api-access-bggnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.781965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.782015 5030 scope.go:117] "RemoveContainer" containerID="860fbedaf6f62ef518f67b4d9910fae4069d9b322765e36a8ffd936820b3bc72" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.791475 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data" (OuterVolumeSpecName: "config-data") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.805773 5030 scope.go:117] "RemoveContainer" containerID="5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.827346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.838749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dbabd573-5784-4cb5-8294-cf65951a73ba" (UID: "dbabd573-5784-4cb5-8294-cf65951a73ba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853726 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853752 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853762 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853774 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bggnm\" (UniqueName: \"kubernetes.io/projected/dbabd573-5784-4cb5-8294-cf65951a73ba-kube-api-access-bggnm\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853783 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbabd573-5784-4cb5-8294-cf65951a73ba-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853792 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853801 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvdd2\" (UniqueName: 
\"kubernetes.io/projected/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca-kube-api-access-dvdd2\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.853810 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbabd573-5784-4cb5-8294-cf65951a73ba-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.922741 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.924341 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.925583 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:35 crc kubenswrapper[5030]: E0120 23:52:35.925660 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.971505 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" path="/var/lib/kubelet/pods/0a9ac390-5efe-4f40-addc-795d259c81cb/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.972612 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" path="/var/lib/kubelet/pods/184162cc-2ce3-4a2e-9167-ff4ffa8cd188/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.974129 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a16ed2d-dd15-41c1-b6ef-61d242bd61a4" path="/var/lib/kubelet/pods/4a16ed2d-dd15-41c1-b6ef-61d242bd61a4/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.975485 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" path="/var/lib/kubelet/pods/611e73cb-eb48-4fed-9073-7b3ee43b88e8/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.976565 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" path="/var/lib/kubelet/pods/7c7e15f9-5447-4550-9b78-91a664abf455/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.981040 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" path="/var/lib/kubelet/pods/7daf347c-1524-4050-b50e-deb2024da0cc/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.987942 5030 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="7fd42af9-412d-4b7d-8be1-2a7c9008795f" path="/var/lib/kubelet/pods/7fd42af9-412d-4b7d-8be1-2a7c9008795f/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.988574 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" path="/var/lib/kubelet/pods/84e84d7f-54c8-43d1-a0ee-29d3e173f662/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.989001 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca" path="/var/lib/kubelet/pods/9ceed73f-dbb7-4a1a-87e4-4e35c89b38ca/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.989370 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" path="/var/lib/kubelet/pods/b54968d4-ae84-4e25-8899-6113e78d9a2b/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.991190 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.991197 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" path="/var/lib/kubelet/pods/c5bf870a-6b91-4f5b-8e86-f87d34d97aa4/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.992204 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" path="/var/lib/kubelet/pods/d3ed2d89-dd21-446c-b85b-b44bb72512bc/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.992776 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee604c21-d89c-4f46-ae93-3912ff1da96d" path="/var/lib/kubelet/pods/ee604c21-d89c-4f46-ae93-3912ff1da96d/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.993808 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" path="/var/lib/kubelet/pods/f1b4ffff-8836-4c4e-be88-1d0cc9213a5a/volumes" Jan 20 23:52:35 crc kubenswrapper[5030]: I0120 23:52:35.995297 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.000708 5030 scope.go:117] "RemoveContainer" containerID="6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.005635 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_55537e5f-bb41-4899-a095-e87c7495ebdf/ovn-northd/0.log" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.005721 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.010201 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.028301 5030 scope.go:117] "RemoveContainer" containerID="5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.028939 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537\": container with ID starting with 5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537 not found: ID does not exist" containerID="5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.028982 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537"} err="failed to get container status \"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537\": rpc error: code = NotFound desc = could not find container \"5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537\": container with ID starting with 5bc5381a28ae85e8ac4d78c24cbcbda308750d29cddbbbd612d8dadbe5fff537 not found: ID does not exist" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.029006 5030 scope.go:117] "RemoveContainer" containerID="6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.029254 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09\": container with ID starting with 6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09 not found: ID does not exist" containerID="6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.029281 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09"} err="failed to get container status \"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09\": rpc error: code = NotFound desc = could not find container \"6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09\": container with ID starting with 6aba11ff0a4eac220e8cc2af3d966d1f2a4d5a16d8754c243966dc541c179f09 not found: ID does not exist" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.029334 5030 scope.go:117] "RemoveContainer" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.051290 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.056609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.056715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts\") pod \"815dc9f1-424f-499a-b27b-16f3f0483fff\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.056744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.056764 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rq2f\" (UniqueName: \"kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltk8g\" (UniqueName: \"kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g\") pod \"815dc9f1-424f-499a-b27b-16f3f0483fff\" (UID: \"815dc9f1-424f-499a-b27b-16f3f0483fff\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config\") pod \"55537e5f-bb41-4899-a095-e87c7495ebdf\" (UID: \"55537e5f-bb41-4899-a095-e87c7495ebdf\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "815dc9f1-424f-499a-b27b-16f3f0483fff" (UID: "815dc9f1-424f-499a-b27b-16f3f0483fff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.057821 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/815dc9f1-424f-499a-b27b-16f3f0483fff-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.058165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts" (OuterVolumeSpecName: "scripts") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.058251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config" (OuterVolumeSpecName: "config") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.058683 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.063602 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f" (OuterVolumeSpecName: "kube-api-access-7rq2f") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "kube-api-access-7rq2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.064736 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g" (OuterVolumeSpecName: "kube-api-access-ltk8g") pod "815dc9f1-424f-499a-b27b-16f3f0483fff" (UID: "815dc9f1-424f-499a-b27b-16f3f0483fff"). InnerVolumeSpecName "kube-api-access-ltk8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.091198 5030 scope.go:117] "RemoveContainer" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.091658 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19\": container with ID starting with 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 not found: ID does not exist" containerID="5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.091687 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19"} err="failed to get container status \"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19\": rpc error: code = NotFound desc = could not find container \"5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19\": container with ID starting with 5692f83f95afb3c4f9d727870f278d7b8cd62cef655570808a04bdedb4850d19 not found: ID does not exist" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.091705 5030 scope.go:117] "RemoveContainer" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.091995 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671\": container with ID starting with 44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671 not found: ID does not exist" containerID="44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.092016 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671"} err="failed to get container status \"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671\": rpc error: code = NotFound desc = could not find container \"44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671\": container with ID starting with 44cc9a9a6b54b0c565f1ac4c85e4cd91e76be048981402745021c633ec595671 not found: ID does not exist" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.104599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.120113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.128835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "55537e5f-bb41-4899-a095-e87c7495ebdf" (UID: "55537e5f-bb41-4899-a095-e87c7495ebdf"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159174 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159202 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159213 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159227 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rq2f\" (UniqueName: \"kubernetes.io/projected/55537e5f-bb41-4899-a095-e87c7495ebdf-kube-api-access-7rq2f\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159238 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159248 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltk8g\" (UniqueName: \"kubernetes.io/projected/815dc9f1-424f-499a-b27b-16f3f0483fff-kube-api-access-ltk8g\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159258 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55537e5f-bb41-4899-a095-e87c7495ebdf-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.159267 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55537e5f-bb41-4899-a095-e87c7495ebdf-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.261047 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.261421 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data podName:dd483381-3919-4ade-ac69-c8abde2869a6 nodeName:}" failed. No retries permitted until 2026-01-20 23:52:44.261395691 +0000 UTC m=+4636.581656019 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data") pod "rabbitmq-server-0" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6") : configmap "rabbitmq-config-data" not found Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.333909 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.1.199:5000/v3\": read tcp 10.217.0.2:60408->10.217.1.199:5000: read: connection reset by peer" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.534751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" event={"ID":"a9df19e9-10ce-496f-9ae7-89ac55ba91cf","Type":"ContainerDied","Data":"3aea2eedd919c9e3098512ef5eca7e39545bc7305ffd166df3ccc4dbdd0cbf8a"} Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.534835 5030 scope.go:117] "RemoveContainer" containerID="fc0ce133b045b63568d1b17b3af0a763ae05a71804fb6407998acf7b7d0be9dd" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.534782 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.549369 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_55537e5f-bb41-4899-a095-e87c7495ebdf/ovn-northd/0.log" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.549510 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.550129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"55537e5f-bb41-4899-a095-e87c7495ebdf","Type":"ContainerDied","Data":"c3047d14db9cfa745977ce04166814820750b9affc0308b4acc75edd96c2c062"} Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.582332 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.583799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"dbabd573-5784-4cb5-8294-cf65951a73ba","Type":"ContainerDied","Data":"07a5a77c6849092bbaf34bb89cbb1c1c3abfdb6c450500709d2d2266d34e01b3"} Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.583933 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.587012 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.589095 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.590279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" event={"ID":"815dc9f1-424f-499a-b27b-16f3f0483fff","Type":"ContainerDied","Data":"f4e49a48d2cd3b9d80a4169f53f58def50e2f14cd67eb9b2310ec93ef601a400"} Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.591150 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-4lg5x" Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.591382 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:52:36 crc kubenswrapper[5030]: E0120 23:52:36.591443 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.599374 5030 scope.go:117] "RemoveContainer" containerID="409fe497b78edb037279005ae11210c5ba6b57531c5991487330124e7e48ec82" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.600064 5030 generic.go:334] "Generic (PLEG): container finished" podID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerID="22410699b2d975f66dd4fbd18bf8cdeb1ca6e285e64621da5b2a1c37a284b6b0" exitCode=0 Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.600137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" event={"ID":"d098cb47-5d3b-48d8-b10e-59e0c1d79840","Type":"ContainerDied","Data":"22410699b2d975f66dd4fbd18bf8cdeb1ca6e285e64621da5b2a1c37a284b6b0"} Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.609011 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-6cc86455b4-nd2m7"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.637761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.648229 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.654647 5030 scope.go:117] "RemoveContainer" containerID="1afd7b619c388f1b085904f3aa4e3b2164180d355c9e37efea98319e13c503d6" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.657060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] 
Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.665134 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.676804 5030 scope.go:117] "RemoveContainer" containerID="cef4a5f00f4648bf0fd3e6e59c501cb49f0b6ca80d5691ef8ef3da4d372a28d8" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.678132 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.684634 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-4lg5x"] Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.706017 5030 scope.go:117] "RemoveContainer" containerID="d99eff37903cee5608b24fb64a68cb6530669b4caba4e1642f9039b72ff5ad58" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.715350 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.738688 5030 scope.go:117] "RemoveContainer" containerID="6001742698e098c515b24cedb01ca7a6e21059956ff4d7773611300b09ff142c" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.771888 5030 scope.go:117] "RemoveContainer" containerID="8043d081ab68ab52f7f2599bf0db461c98d405d06b4245ce8431c2527b4cfcf7" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.772561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcrbg\" (UniqueName: \"kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 
23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.773749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs\") pod \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\" (UID: \"d098cb47-5d3b-48d8-b10e-59e0c1d79840\") " Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.777122 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg" (OuterVolumeSpecName: "kube-api-access-kcrbg") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "kube-api-access-kcrbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.777197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.777285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts" (OuterVolumeSpecName: "scripts") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.782125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.799694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.804322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data" (OuterVolumeSpecName: "config-data") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.813543 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.814774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d098cb47-5d3b-48d8-b10e-59e0c1d79840" (UID: "d098cb47-5d3b-48d8-b10e-59e0c1d79840"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875354 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcrbg\" (UniqueName: \"kubernetes.io/projected/d098cb47-5d3b-48d8-b10e-59e0c1d79840-kube-api-access-kcrbg\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875408 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875432 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875454 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875473 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875494 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875513 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:36 crc kubenswrapper[5030]: I0120 23:52:36.875533 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d098cb47-5d3b-48d8-b10e-59e0c1d79840-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:37 crc kubenswrapper[5030]: E0120 23:52:37.387944 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:37 crc kubenswrapper[5030]: E0120 23:52:37.388401 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data podName:81ab5d03-ddb2-4d25-bec9-485f19fa3f13 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:52:45.388335389 +0000 UTC m=+4637.708595717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data") pod "rabbitmq-cell1-server-0" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.619360 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd483381-3919-4ade-ac69-c8abde2869a6" containerID="59a819026ca38754d2a26af58c705565136f6b307a0c53e7ac0b9eba9ecae242" exitCode=0 Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.619430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerDied","Data":"59a819026ca38754d2a26af58c705565136f6b307a0c53e7ac0b9eba9ecae242"} Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.625673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" event={"ID":"d098cb47-5d3b-48d8-b10e-59e0c1d79840","Type":"ContainerDied","Data":"e5879721858805294eb27cab1443dd8b8b4564521732c1c1365e72891c8606f7"} Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.625742 5030 scope.go:117] "RemoveContainer" containerID="22410699b2d975f66dd4fbd18bf8cdeb1ca6e285e64621da5b2a1c37a284b6b0" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.625894 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5b98cb8978-zrv7q" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.685374 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.690824 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5b98cb8978-zrv7q"] Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.980697 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" path="/var/lib/kubelet/pods/55537e5f-bb41-4899-a095-e87c7495ebdf/volumes" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.981320 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" path="/var/lib/kubelet/pods/815dc9f1-424f-499a-b27b-16f3f0483fff/volumes" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.981871 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" path="/var/lib/kubelet/pods/a9df19e9-10ce-496f-9ae7-89ac55ba91cf/volumes" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.983501 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb8010d-f18b-417a-919f-13e959ccf452" path="/var/lib/kubelet/pods/acb8010d-f18b-417a-919f-13e959ccf452/volumes" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.984088 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" path="/var/lib/kubelet/pods/d098cb47-5d3b-48d8-b10e-59e0c1d79840/volumes" Jan 20 23:52:37 crc kubenswrapper[5030]: I0120 23:52:37.984670 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" path="/var/lib/kubelet/pods/dbabd573-5784-4cb5-8294-cf65951a73ba/volumes" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 
23:52:38.276973 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.306725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.306787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.306835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.306882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.306915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfr8n\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.307749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins\") pod \"dd483381-3919-4ade-ac69-c8abde2869a6\" (UID: \"dd483381-3919-4ade-ac69-c8abde2869a6\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.308030 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.308310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.308763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.314105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info" (OuterVolumeSpecName: "pod-info") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.321325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n" (OuterVolumeSpecName: "kube-api-access-qfr8n") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "kube-api-access-qfr8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.322466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.322375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.330889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.375734 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data" (OuterVolumeSpecName: "config-data") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.409571 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd483381-3919-4ade-ac69-c8abde2869a6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.409634 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411257 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411284 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411299 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411312 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd483381-3919-4ade-ac69-c8abde2869a6-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411327 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfr8n\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-kube-api-access-qfr8n\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.411340 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.419918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf" (OuterVolumeSpecName: "server-conf") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.434448 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.456338 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dd483381-3919-4ade-ac69-c8abde2869a6" (UID: "dd483381-3919-4ade-ac69-c8abde2869a6"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.513702 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.513748 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd483381-3919-4ade-ac69-c8abde2869a6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.513772 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd483381-3919-4ade-ac69-c8abde2869a6-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.573348 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614333 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.614426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: 
I0120 23:52:38.614458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6jx4\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.615286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.615979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.616051 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\" (UID: \"81ab5d03-ddb2-4d25-bec9-485f19fa3f13\") " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.628373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.628743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.628848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.631655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "persistence") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.631772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.640545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.658886 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4" (OuterVolumeSpecName: "kube-api-access-t6jx4") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "kube-api-access-t6jx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.658929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info" (OuterVolumeSpecName: "pod-info") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.665385 5030 generic.go:334] "Generic (PLEG): container finished" podID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerID="bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5" exitCode=0 Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.665458 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerDied","Data":"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5"} Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.665490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"81ab5d03-ddb2-4d25-bec9-485f19fa3f13","Type":"ContainerDied","Data":"7d54a026a485d18cd1d8c96ef81c4c85d27c633cadb588acb70f22c02d843a63"} Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.665510 5030 scope.go:117] "RemoveContainer" containerID="bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.665694 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.671016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"dd483381-3919-4ade-ac69-c8abde2869a6","Type":"ContainerDied","Data":"e9fdef20722e3e765799d93fdd39a7190da044ff2d99ede01532b274f61fad2e"} Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.671088 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.705458 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.710258 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731682 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731711 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731746 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731757 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731768 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731778 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731787 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.731799 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6jx4\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-kube-api-access-t6jx4\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.742878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data" (OuterVolumeSpecName: "config-data") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.755341 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.767323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf" (OuterVolumeSpecName: "server-conf") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.768692 5030 scope.go:117] "RemoveContainer" containerID="f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.789097 5030 scope.go:117] "RemoveContainer" containerID="bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5" Jan 20 23:52:38 crc kubenswrapper[5030]: E0120 23:52:38.790281 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87584043_7d0e_475d_89ec_2f743b49b145.slice/crio-conmon-630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd483381_3919_4ade_ac69_c8abde2869a6.slice/crio-e9fdef20722e3e765799d93fdd39a7190da044ff2d99ede01532b274f61fad2e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87584043_7d0e_475d_89ec_2f743b49b145.slice/crio-630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd483381_3919_4ade_ac69_c8abde2869a6.slice\": RecentStats: unable to find data in memory cache]" Jan 20 23:52:38 crc kubenswrapper[5030]: E0120 23:52:38.790809 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5\": container with ID starting with bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5 not found: ID does not exist" containerID="bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.790879 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5"} err="failed to get container status \"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5\": rpc error: code = NotFound desc = could not find container \"bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5\": container with ID starting with bbb816d2b9ba7c53b6ec15eb428bf7dd729a8cebd652ee1b9e0381c8dad5f9d5 not found: ID does not exist" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.790968 5030 scope.go:117] "RemoveContainer" containerID="f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c" Jan 20 23:52:38 crc kubenswrapper[5030]: E0120 23:52:38.791584 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c\": container with ID starting with f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c not found: ID does not exist" containerID="f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.791646 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c"} err="failed to get container status \"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c\": rpc error: code = NotFound desc = could not find container \"f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c\": container with ID starting with f30c6ebbf974766386c1b296ba4efbecc9546efb107e4b5e3844480b05eb892c not found: ID does not exist" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.791675 5030 scope.go:117] "RemoveContainer" containerID="59a819026ca38754d2a26af58c705565136f6b307a0c53e7ac0b9eba9ecae242" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.815900 5030 scope.go:117] "RemoveContainer" containerID="11fee335c42cdc9d15ce4fe859a0bea109b4a8f81b0b4f42812b3dd2aaaede89" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.824093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "81ab5d03-ddb2-4d25-bec9-485f19fa3f13" (UID: "81ab5d03-ddb2-4d25-bec9-485f19fa3f13"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.833412 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.833791 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.834014 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.834151 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/81ab5d03-ddb2-4d25-bec9-485f19fa3f13-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.938261 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.207:8776/healthcheck\": dial tcp 10.217.1.207:8776: i/o timeout" Jan 20 23:52:38 crc kubenswrapper[5030]: I0120 23:52:38.964079 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:52:38 crc kubenswrapper[5030]: E0120 23:52:38.964247 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.021872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.030231 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.044188 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.138259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll68r\" (UniqueName: \"kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r\") pod \"87584043-7d0e-475d-89ec-2f743b49b145\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.138344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle\") pod \"87584043-7d0e-475d-89ec-2f743b49b145\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.138424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data\") pod \"87584043-7d0e-475d-89ec-2f743b49b145\" (UID: \"87584043-7d0e-475d-89ec-2f743b49b145\") " Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.141479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r" (OuterVolumeSpecName: "kube-api-access-ll68r") pod "87584043-7d0e-475d-89ec-2f743b49b145" (UID: "87584043-7d0e-475d-89ec-2f743b49b145"). InnerVolumeSpecName "kube-api-access-ll68r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.160685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87584043-7d0e-475d-89ec-2f743b49b145" (UID: "87584043-7d0e-475d-89ec-2f743b49b145"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.180080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data" (OuterVolumeSpecName: "config-data") pod "87584043-7d0e-475d-89ec-2f743b49b145" (UID: "87584043-7d0e-475d-89ec-2f743b49b145"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.240441 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.240493 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll68r\" (UniqueName: \"kubernetes.io/projected/87584043-7d0e-475d-89ec-2f743b49b145-kube-api-access-ll68r\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.240509 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87584043-7d0e-475d-89ec-2f743b49b145-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.688495 5030 generic.go:334] "Generic (PLEG): container finished" podID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerID="3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" exitCode=0 Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.688591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerDied","Data":"3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616"} Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.688693 5030 scope.go:117] "RemoveContainer" containerID="7f7607fa9cd7139f4b8d413ec29021dabc7d691ea574bea76d16f9b1e2924dd3" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.693824 5030 generic.go:334] "Generic (PLEG): container finished" podID="87584043-7d0e-475d-89ec-2f743b49b145" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" exitCode=0 Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.693902 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.694066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d"} Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.694283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"87584043-7d0e-475d-89ec-2f743b49b145","Type":"ContainerDied","Data":"70d5d0c6f651e999014f7196349d603e480a9cd2506a2c434142c1d05c6f83a6"} Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.750828 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.762709 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.980305 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" path="/var/lib/kubelet/pods/81ab5d03-ddb2-4d25-bec9-485f19fa3f13/volumes" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.981503 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87584043-7d0e-475d-89ec-2f743b49b145" path="/var/lib/kubelet/pods/87584043-7d0e-475d-89ec-2f743b49b145/volumes" Jan 20 23:52:39 crc kubenswrapper[5030]: I0120 23:52:39.983980 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" path="/var/lib/kubelet/pods/dd483381-3919-4ade-ac69-c8abde2869a6/volumes" Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.060269 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.060380 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0 podName:b30c4233-78ba-46a4-a091-ad9613da839f nodeName:}" failed. No retries permitted until 2026-01-20 23:52:40.560346808 +0000 UTC m=+4632.880607136 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0") pod "dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f") : configmap "dns-swift-storage-0" not found Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.243345 5030 scope.go:117] "RemoveContainer" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.280554 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.368899 5030 scope.go:117] "RemoveContainer" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.377801 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d\": container with ID starting with 630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d not found: ID does not exist" containerID="630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.377847 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d"} err="failed to get container status \"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d\": rpc error: code = NotFound desc = could not find container \"630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d\": container with ID starting with 630a92a6bc24d0e11d89c12db4f4c7bb4375033d0bad90b0b377c2884cc72b3d not found: ID does not exist" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.377876 5030 scope.go:117] "RemoveContainer" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.378220 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa\": container with ID starting with 8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa not found: ID does not exist" containerID="8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.378274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa"} err="failed to get container status \"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa\": rpc error: code = NotFound desc = could not find container \"8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa\": container with ID starting with 8133b1ff73d5da162166d8d6e4786de2619de36df2fda3a4ab65aac6eb54e2fa not found: ID does not exist" Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.568067 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: configmap "dns-swift-storage-0" not found Jan 20 23:52:40 crc kubenswrapper[5030]: E0120 23:52:40.568204 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0 
podName:b30c4233-78ba-46a4-a091-ad9613da839f nodeName:}" failed. No retries permitted until 2026-01-20 23:52:41.568174698 +0000 UTC m=+4633.888435026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0") pod "dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f") : configmap "dns-swift-storage-0" not found Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.603815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.651930 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.697076 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.714310 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="dnsmasq-dns" containerID="cri-o://a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7" gracePeriod=10 Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.714617 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.714970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"65b5db4a-e775-4981-a7fd-e2d8bd56f334","Type":"ContainerDied","Data":"5a49ac1f90df3e226390cd658b6c516ac2e54d9efe1409be54ec69bf12364dd8"} Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.715004 5030 scope.go:117] "RemoveContainer" containerID="3707109cd3d63748c8f067fa10c5418b3203a5b175f14a8716f463c27d863616" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.770445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvwtx\" (UniqueName: \"kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx\") pod \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.770514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle\") pod \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.770569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data\") pod \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\" (UID: \"65b5db4a-e775-4981-a7fd-e2d8bd56f334\") " Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.776019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx" (OuterVolumeSpecName: "kube-api-access-nvwtx") pod "65b5db4a-e775-4981-a7fd-e2d8bd56f334" (UID: "65b5db4a-e775-4981-a7fd-e2d8bd56f334"). 
InnerVolumeSpecName "kube-api-access-nvwtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.791655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65b5db4a-e775-4981-a7fd-e2d8bd56f334" (UID: "65b5db4a-e775-4981-a7fd-e2d8bd56f334"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.797163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data" (OuterVolumeSpecName: "config-data") pod "65b5db4a-e775-4981-a7fd-e2d8bd56f334" (UID: "65b5db4a-e775-4981-a7fd-e2d8bd56f334"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.872647 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.872676 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b5db4a-e775-4981-a7fd-e2d8bd56f334-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:40 crc kubenswrapper[5030]: I0120 23:52:40.872688 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvwtx\" (UniqueName: \"kubernetes.io/projected/65b5db4a-e775-4981-a7fd-e2d8bd56f334-kube-api-access-nvwtx\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.056161 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.063589 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.194302 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.279460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc\") pod \"b30c4233-78ba-46a4-a091-ad9613da839f\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.279513 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config\") pod \"b30c4233-78ba-46a4-a091-ad9613da839f\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.279552 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0\") pod \"b30c4233-78ba-46a4-a091-ad9613da839f\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.279708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkfrg\" (UniqueName: \"kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg\") pod \"b30c4233-78ba-46a4-a091-ad9613da839f\" (UID: \"b30c4233-78ba-46a4-a091-ad9613da839f\") " Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.284263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg" (OuterVolumeSpecName: "kube-api-access-rkfrg") pod "b30c4233-78ba-46a4-a091-ad9613da839f" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f"). InnerVolumeSpecName "kube-api-access-rkfrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.322180 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b30c4233-78ba-46a4-a091-ad9613da839f" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.327533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b30c4233-78ba-46a4-a091-ad9613da839f" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.341419 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config" (OuterVolumeSpecName: "config") pod "b30c4233-78ba-46a4-a091-ad9613da839f" (UID: "b30c4233-78ba-46a4-a091-ad9613da839f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.381235 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.381275 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.381292 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b30c4233-78ba-46a4-a091-ad9613da839f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.381310 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkfrg\" (UniqueName: \"kubernetes.io/projected/b30c4233-78ba-46a4-a091-ad9613da839f-kube-api-access-rkfrg\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.734578 5030 generic.go:334] "Generic (PLEG): container finished" podID="b30c4233-78ba-46a4-a091-ad9613da839f" containerID="a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7" exitCode=0 Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.734664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" event={"ID":"b30c4233-78ba-46a4-a091-ad9613da839f","Type":"ContainerDied","Data":"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7"} Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.734699 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" event={"ID":"b30c4233-78ba-46a4-a091-ad9613da839f","Type":"ContainerDied","Data":"a63bf7ae68c3da2a0456deb327cf635f241aa975e7e24c71c406bd8493c9fb7c"} Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.734720 5030 scope.go:117] "RemoveContainer" containerID="a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.734727 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.806591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.812755 5030 scope.go:117] "RemoveContainer" containerID="8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.815062 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6cd9ddf9b9-jcpxp"] Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.839377 5030 scope.go:117] "RemoveContainer" containerID="a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7" Jan 20 23:52:41 crc kubenswrapper[5030]: E0120 23:52:41.840048 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7\": container with ID starting with a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7 not found: ID does not exist" containerID="a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.840110 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7"} err="failed to get container status \"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7\": rpc error: code = NotFound desc = could not find container \"a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7\": container with ID starting with a1c6b2689b2462436e37911d9f2dc4eb381d2bc0fb353c3eccd86f82e90ec5f7 not found: ID does not exist" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.840144 5030 scope.go:117] "RemoveContainer" containerID="8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905" Jan 20 23:52:41 crc kubenswrapper[5030]: E0120 23:52:41.840647 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905\": container with ID starting with 8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905 not found: ID does not exist" containerID="8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.840795 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905"} err="failed to get container status \"8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905\": rpc error: code = NotFound desc = could not find container \"8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905\": container with ID starting with 8c7e8e5624129560558ffe8ea84fb6213671c363ba9a62cdbc97526ad98c6905 not found: ID does not exist" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.980223 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" path="/var/lib/kubelet/pods/65b5db4a-e775-4981-a7fd-e2d8bd56f334/volumes" Jan 20 23:52:41 crc kubenswrapper[5030]: I0120 23:52:41.981262 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" 
path="/var/lib/kubelet/pods/b30c4233-78ba-46a4-a091-ad9613da839f/volumes" Jan 20 23:52:48 crc kubenswrapper[5030]: I0120 23:52:48.300778 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.209:9696/\": dial tcp 10.217.1.209:9696: connect: connection refused" Jan 20 23:52:49 crc kubenswrapper[5030]: I0120 23:52:49.963097 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:52:49 crc kubenswrapper[5030]: E0120 23:52:49.963756 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.875694 5030 generic.go:334] "Generic (PLEG): container finished" podID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerID="cce18799a99ac1715b565d5854b0dcbd08f2d61430c9731225dea07b9b533874" exitCode=0 Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.875962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerDied","Data":"cce18799a99ac1715b565d5854b0dcbd08f2d61430c9731225dea07b9b533874"} Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.876056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" event={"ID":"599881de-74a7-46ff-a7a1-5599d4345e0c","Type":"ContainerDied","Data":"3d2c8b5ef48f92de27684ee3e9f03dc2c07b798a4d1a2c21ae54203b0fa195c0"} Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.876089 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2c8b5ef48f92de27684ee3e9f03dc2c07b798a4d1a2c21ae54203b0fa195c0" Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.930027 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.999474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.999524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.999564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.999616 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:52 crc kubenswrapper[5030]: I0120 23:52:52.999664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:52.999766 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkf4g\" (UniqueName: \"kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:52.999845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs\") pod \"599881de-74a7-46ff-a7a1-5599d4345e0c\" (UID: \"599881de-74a7-46ff-a7a1-5599d4345e0c\") " Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.005522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.007900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g" (OuterVolumeSpecName: "kube-api-access-qkf4g") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "kube-api-access-qkf4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.042464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.052985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config" (OuterVolumeSpecName: "config") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.060000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.066566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.079377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "599881de-74a7-46ff-a7a1-5599d4345e0c" (UID: "599881de-74a7-46ff-a7a1-5599d4345e0c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102659 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkf4g\" (UniqueName: \"kubernetes.io/projected/599881de-74a7-46ff-a7a1-5599d4345e0c-kube-api-access-qkf4g\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102709 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102724 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102738 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102750 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102758 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.102767 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/599881de-74a7-46ff-a7a1-5599d4345e0c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.889513 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-6fc9448c48-j8m88" Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.950755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:52:53 crc kubenswrapper[5030]: I0120 23:52:53.976276 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-6fc9448c48-j8m88"] Jan 20 23:52:55 crc kubenswrapper[5030]: I0120 23:52:55.974523 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" path="/var/lib/kubelet/pods/599881de-74a7-46ff-a7a1-5599d4345e0c/volumes" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.742318 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.861237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") pod \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.861286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache\") pod \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.861399 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock\") pod \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.861431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.861472 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cl4k\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k\") pod \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\" (UID: \"0cb7ff45-e6d7-41cc-9001-de1c7f458a83\") " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.862585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock" (OuterVolumeSpecName: "lock") pod "0cb7ff45-e6d7-41cc-9001-de1c7f458a83" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.862852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache" (OuterVolumeSpecName: "cache") pod "0cb7ff45-e6d7-41cc-9001-de1c7f458a83" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.871406 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0cb7ff45-e6d7-41cc-9001-de1c7f458a83" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.871512 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k" (OuterVolumeSpecName: "kube-api-access-7cl4k") pod "0cb7ff45-e6d7-41cc-9001-de1c7f458a83" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83"). InnerVolumeSpecName "kube-api-access-7cl4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.873997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "0cb7ff45-e6d7-41cc-9001-de1c7f458a83" (UID: "0cb7ff45-e6d7-41cc-9001-de1c7f458a83"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.962543 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.962589 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.962602 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cl4k\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-kube-api-access-7cl4k\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.962612 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.962624 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0cb7ff45-e6d7-41cc-9001-de1c7f458a83-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.986459 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerID="be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5" exitCode=137 Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.986518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5"} Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.986565 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"0cb7ff45-e6d7-41cc-9001-de1c7f458a83","Type":"ContainerDied","Data":"24da4819f9fb671a134974d99647e69b22fc2baef5e09e52f91aa2ac3e827c13"} Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.986583 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.986596 5030 scope.go:117] "RemoveContainer" containerID="be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5" Jan 20 23:53:00 crc kubenswrapper[5030]: I0120 23:53:00.990771 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.033381 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.036747 5030 scope.go:117] "RemoveContainer" containerID="b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.038976 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.064345 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.069533 5030 scope.go:117] "RemoveContainer" containerID="dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.096178 5030 scope.go:117] "RemoveContainer" containerID="add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.129825 5030 scope.go:117] "RemoveContainer" containerID="1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.161684 5030 scope.go:117] "RemoveContainer" containerID="239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.186430 5030 scope.go:117] "RemoveContainer" containerID="47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.215650 5030 scope.go:117] "RemoveContainer" containerID="be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.246652 5030 scope.go:117] "RemoveContainer" containerID="f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.273293 5030 scope.go:117] "RemoveContainer" containerID="81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.307425 5030 scope.go:117] "RemoveContainer" containerID="92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.336149 5030 scope.go:117] "RemoveContainer" containerID="3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.361827 5030 scope.go:117] "RemoveContainer" containerID="85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.403408 5030 scope.go:117] "RemoveContainer" containerID="6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.424248 5030 scope.go:117] "RemoveContainer" containerID="6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b" 
Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.442828 5030 scope.go:117] "RemoveContainer" containerID="be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.443237 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5\": container with ID starting with be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5 not found: ID does not exist" containerID="be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.443285 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5"} err="failed to get container status \"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5\": rpc error: code = NotFound desc = could not find container \"be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5\": container with ID starting with be639c5606151f012255de9bcb91690338a83b8cba439bddcc11b7175fb3edb5 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.443320 5030 scope.go:117] "RemoveContainer" containerID="b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.443863 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb\": container with ID starting with b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb not found: ID does not exist" containerID="b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.443935 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb"} err="failed to get container status \"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb\": rpc error: code = NotFound desc = could not find container \"b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb\": container with ID starting with b7cffbcf3ad1b63e00ae392389302e885c55f2585cddad9fe0291894a6e8cecb not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.443973 5030 scope.go:117] "RemoveContainer" containerID="dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.444378 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50\": container with ID starting with dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50 not found: ID does not exist" containerID="dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.444422 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50"} err="failed to get container status \"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50\": rpc error: code = NotFound desc = could not find container 
\"dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50\": container with ID starting with dde3d6c7a1986ddafd263ad5e85db2efc0b08d473326e955b4001ad6c7a98f50 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.444453 5030 scope.go:117] "RemoveContainer" containerID="add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.444803 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862\": container with ID starting with add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862 not found: ID does not exist" containerID="add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.444869 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862"} err="failed to get container status \"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862\": rpc error: code = NotFound desc = could not find container \"add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862\": container with ID starting with add2324fcfa33b1725dfe035aca44112500e267e7cf07a88742e7c07f8c84862 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.444904 5030 scope.go:117] "RemoveContainer" containerID="1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.445178 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38\": container with ID starting with 1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38 not found: ID does not exist" containerID="1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445203 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38"} err="failed to get container status \"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38\": rpc error: code = NotFound desc = could not find container \"1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38\": container with ID starting with 1350a717bc1295245194beb51e514c867204d22c527b6a02bbcc86472510cc38 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445219 5030 scope.go:117] "RemoveContainer" containerID="239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.445518 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a\": container with ID starting with 239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a not found: ID does not exist" containerID="239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445576 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a"} 
err="failed to get container status \"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a\": rpc error: code = NotFound desc = could not find container \"239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a\": container with ID starting with 239d60cbdc9fb72217dc18f7e01aa568cec96d28806e913c0638419d688b401a not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445611 5030 scope.go:117] "RemoveContainer" containerID="47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.445916 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba\": container with ID starting with 47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba not found: ID does not exist" containerID="47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445937 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba"} err="failed to get container status \"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba\": rpc error: code = NotFound desc = could not find container \"47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba\": container with ID starting with 47752edc56435edd967b808a352f1ce3b054e10598ff669178f3ef8398eeefba not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.445955 5030 scope.go:117] "RemoveContainer" containerID="be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.446254 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07\": container with ID starting with be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07 not found: ID does not exist" containerID="be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.446309 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07"} err="failed to get container status \"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07\": rpc error: code = NotFound desc = could not find container \"be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07\": container with ID starting with be9299d3748b999437f0fa19a92cf43c1a5f5fe089543d8ab23f5a5c5f4f0c07 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.446340 5030 scope.go:117] "RemoveContainer" containerID="f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.446655 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22\": container with ID starting with f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22 not found: ID does not exist" containerID="f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.446677 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22"} err="failed to get container status \"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22\": rpc error: code = NotFound desc = could not find container \"f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22\": container with ID starting with f6b6d32d54243a624699e2b703b31cb8449fd7e9d6677452c3fad7ecf705ad22 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.446690 5030 scope.go:117] "RemoveContainer" containerID="81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.447003 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53\": container with ID starting with 81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53 not found: ID does not exist" containerID="81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447062 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53"} err="failed to get container status \"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53\": rpc error: code = NotFound desc = could not find container \"81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53\": container with ID starting with 81bc4f92c31773ee13486eb01f1c4bfe2377e41f09e12794224e70040ffe1a53 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447092 5030 scope.go:117] "RemoveContainer" containerID="92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.447360 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2\": container with ID starting with 92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2 not found: ID does not exist" containerID="92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447383 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2"} err="failed to get container status \"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2\": rpc error: code = NotFound desc = could not find container \"92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2\": container with ID starting with 92c9ca6d0747ab19439d8b49dfd9a1c96242ac77f3c689c788a65c1c0e1258b2 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447398 5030 scope.go:117] "RemoveContainer" containerID="3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.447830 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5\": container with ID starting with 3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5 not found: ID does 
not exist" containerID="3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447895 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5"} err="failed to get container status \"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5\": rpc error: code = NotFound desc = could not find container \"3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5\": container with ID starting with 3c24cf1353bfd1771fb13a4e92f529dc36f8d0e1781a72b6afd315e775fba8c5 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.447931 5030 scope.go:117] "RemoveContainer" containerID="85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.448823 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557\": container with ID starting with 85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557 not found: ID does not exist" containerID="85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.448878 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557"} err="failed to get container status \"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557\": rpc error: code = NotFound desc = could not find container \"85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557\": container with ID starting with 85df78331aa99aafae2f67fc3bc58a8e03b9b7cb97d02206474e31d6bc372557 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.448908 5030 scope.go:117] "RemoveContainer" containerID="6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.451225 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40\": container with ID starting with 6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40 not found: ID does not exist" containerID="6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.451287 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40"} err="failed to get container status \"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40\": rpc error: code = NotFound desc = could not find container \"6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40\": container with ID starting with 6cd3e98e0e337b6e8110bc5ed6d220c6312475e87a43ee037f356b6914ffca40 not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.451312 5030 scope.go:117] "RemoveContainer" containerID="6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.451730 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b\": container with ID starting with 6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b not found: ID does not exist" containerID="6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.451793 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b"} err="failed to get container status \"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b\": rpc error: code = NotFound desc = could not find container \"6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b\": container with ID starting with 6af8f9825683a9e7bcfa9790dd622b1408428b449ce0e291c74c97d671c14f9b not found: ID does not exist" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.962356 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:53:01 crc kubenswrapper[5030]: E0120 23:53:01.962779 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:53:01 crc kubenswrapper[5030]: I0120 23:53:01.985458 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" path="/var/lib/kubelet/pods/0cb7ff45-e6d7-41cc-9001-de1c7f458a83/volumes" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.466593 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jg5mc"] Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.480206 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jg5mc"] Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622241 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bhrks"] Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622768 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622809 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622832 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622847 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-server" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622880 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622901 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="ovn-northd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622913 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="ovn-northd" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-api" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.622975 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.622988 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623008 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623022 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-api" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623049 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623062 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623087 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623105 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623132 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="sg-core" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623149 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="sg-core" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623185 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623210 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623224 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623244 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerName="keystone-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623257 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerName="keystone-api" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623272 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623298 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623336 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623349 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623383 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="openstack-network-exporter" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="openstack-network-exporter" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623467 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623494 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623508 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623531 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-expirer" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623544 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-expirer" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623558 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623590 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623603 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623653 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="setup-container" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623667 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="setup-container" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623687 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623701 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623721 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623734 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623773 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623793 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623808 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623823 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623836 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623855 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-auditor" Jan 20 
23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623869 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623886 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-reaper" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623899 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-reaper" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623916 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623929 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-server" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623946 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="dnsmasq-dns" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623959 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="dnsmasq-dns" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.623976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="setup-container" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.623989 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="setup-container" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624020 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624041 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624054 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624076 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624088 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624106 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624121 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624138 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" 
containerName="barbican-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624151 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624183 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624201 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624214 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624241 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624254 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624275 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624289 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624308 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="galera" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624321 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="galera" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624346 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-notification-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624360 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-notification-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624393 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" containerName="memcached" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624406 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" containerName="memcached" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624428 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="init" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624442 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="init" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624458 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624471 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624491 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624504 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624536 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624551 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624563 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624584 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624596 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624613 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="mysql-bootstrap" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624651 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="mysql-bootstrap" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624670 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624683 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624699 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-central-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624711 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-central-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624733 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="proxy-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624745 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="proxy-httpd" Jan 20 23:53:12 crc 
kubenswrapper[5030]: E0120 23:53:12.624765 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624777 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624792 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="rsync" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="rsync" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624824 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="swift-recon-cron" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="swift-recon-cron" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624854 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624867 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624881 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624893 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624912 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624927 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-server" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624944 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: E0120 23:53:12.624975 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.624989 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625353 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625387 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" 
containerName="container-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625414 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625431 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625454 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30c4233-78ba-46a4-a091-ad9613da839f" containerName="dnsmasq-dns" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625472 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd483381-3919-4ade-ac69-c8abde2869a6" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625505 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625526 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625554 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625575 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625600 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-reaper" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625672 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-expirer" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625710 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625729 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="proxy-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625758 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625788 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625820 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="swift-recon-cron" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625840 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 
23:53:12.625861 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbabd573-5784-4cb5-8294-cf65951a73ba" containerName="nova-api-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625877 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b4ffff-8836-4c4e-be88-1d0cc9213a5a" containerName="placement-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625902 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625939 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625959 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.625981 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626000 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626024 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626045 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626064 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626082 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e84d7f-54c8-43d1-a0ee-29d3e173f662" containerName="memcached" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626105 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="openstack-network-exporter" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626121 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626141 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="object-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626161 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-central-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626183 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5bf870a-6b91-4f5b-8e86-f87d34d97aa4" containerName="galera" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626205 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626220 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="ceilometer-notification-agent" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626234 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54968d4-ae84-4e25-8899-6113e78d9a2b" containerName="glance-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626253 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="184162cc-2ce3-4a2e-9167-ff4ffa8cd188" containerName="cinder-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626279 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9ac390-5efe-4f40-addc-795d259c81cb" containerName="sg-core" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c7e15f9-5447-4550-9b78-91a664abf455" containerName="glance-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626321 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ab5d03-ddb2-4d25-bec9-485f19fa3f13" containerName="rabbitmq" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="rsync" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626356 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9df19e9-10ce-496f-9ae7-89ac55ba91cf" containerName="barbican-api-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626378 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb8010d-f18b-417a-919f-13e959ccf452" containerName="nova-metadata-metadata" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626401 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="container-server" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626425 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d098cb47-5d3b-48d8-b10e-59e0c1d79840" containerName="keystone-api" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626457 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626482 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ed2d89-dd21-446c-b85b-b44bb72512bc" containerName="barbican-worker-log" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="55537e5f-bb41-4899-a095-e87c7495ebdf" containerName="ovn-northd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626532 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="599881de-74a7-46ff-a7a1-5599d4345e0c" containerName="neutron-httpd" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-auditor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626588 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" 
containerName="object-updater" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626605 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="611e73cb-eb48-4fed-9073-7b3ee43b88e8" containerName="barbican-keystone-listener" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626667 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb7ff45-e6d7-41cc-9001-de1c7f458a83" containerName="account-replicator" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.626758 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.627818 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.632747 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.632985 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.633612 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.633769 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bhrks"] Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.634701 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.803531 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.803813 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.803921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwdn\" (UniqueName: \"kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.905215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.905355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage\") pod \"crc-storage-crc-bhrks\" (UID: 
\"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.905428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwdn\" (UniqueName: \"kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.905823 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.906926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.940780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwdn\" (UniqueName: \"kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn\") pod \"crc-storage-crc-bhrks\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:12 crc kubenswrapper[5030]: I0120 23:53:12.954167 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:13 crc kubenswrapper[5030]: I0120 23:53:13.275007 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bhrks"] Jan 20 23:53:13 crc kubenswrapper[5030]: I0120 23:53:13.977510 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcfcd441-8232-4cc7-b794-e52c18eb1b72" path="/var/lib/kubelet/pods/fcfcd441-8232-4cc7-b794-e52c18eb1b72/volumes" Jan 20 23:53:14 crc kubenswrapper[5030]: I0120 23:53:14.161024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bhrks" event={"ID":"d7c040e9-0ad3-4a70-a414-a6239e37ff92","Type":"ContainerStarted","Data":"a4192141036d6a8e1e1e44516f35cee30872e0345d2f52742e52b7d37c175644"} Jan 20 23:53:14 crc kubenswrapper[5030]: I0120 23:53:14.962350 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:53:15 crc kubenswrapper[5030]: I0120 23:53:15.174197 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7c040e9-0ad3-4a70-a414-a6239e37ff92" containerID="0310625296be6170351f770fc5a8a857728effb7422def465702469c504419c3" exitCode=0 Jan 20 23:53:15 crc kubenswrapper[5030]: I0120 23:53:15.174254 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bhrks" event={"ID":"d7c040e9-0ad3-4a70-a414-a6239e37ff92","Type":"ContainerDied","Data":"0310625296be6170351f770fc5a8a857728effb7422def465702469c504419c3"} Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.189090 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc"} Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.578717 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.673572 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt\") pod \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.673769 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage\") pod \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.673879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d7c040e9-0ad3-4a70-a414-a6239e37ff92" (UID: "d7c040e9-0ad3-4a70-a414-a6239e37ff92"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.673925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwdn\" (UniqueName: \"kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn\") pod \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\" (UID: \"d7c040e9-0ad3-4a70-a414-a6239e37ff92\") " Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.674617 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d7c040e9-0ad3-4a70-a414-a6239e37ff92-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.682943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn" (OuterVolumeSpecName: "kube-api-access-kwwdn") pod "d7c040e9-0ad3-4a70-a414-a6239e37ff92" (UID: "d7c040e9-0ad3-4a70-a414-a6239e37ff92"). InnerVolumeSpecName "kube-api-access-kwwdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.709708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d7c040e9-0ad3-4a70-a414-a6239e37ff92" (UID: "d7c040e9-0ad3-4a70-a414-a6239e37ff92"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.776303 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwdn\" (UniqueName: \"kubernetes.io/projected/d7c040e9-0ad3-4a70-a414-a6239e37ff92-kube-api-access-kwwdn\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:16 crc kubenswrapper[5030]: I0120 23:53:16.776364 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d7c040e9-0ad3-4a70-a414-a6239e37ff92-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:17 crc kubenswrapper[5030]: I0120 23:53:17.208104 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bhrks" event={"ID":"d7c040e9-0ad3-4a70-a414-a6239e37ff92","Type":"ContainerDied","Data":"a4192141036d6a8e1e1e44516f35cee30872e0345d2f52742e52b7d37c175644"} Jan 20 23:53:17 crc kubenswrapper[5030]: I0120 23:53:17.208544 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4192141036d6a8e1e1e44516f35cee30872e0345d2f52742e52b7d37c175644" Jan 20 23:53:17 crc kubenswrapper[5030]: I0120 23:53:17.208661 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bhrks" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.353575 5030 scope.go:117] "RemoveContainer" containerID="2f721d727f0ab17d1d01471a6de8a00a13c0596cb70faaf4e7a965049517134c" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.425935 5030 scope.go:117] "RemoveContainer" containerID="5af8e541c7b45adcd5f97f874d6b9a2214aad19a732554539987c9fffd8a4613" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.458025 5030 scope.go:117] "RemoveContainer" containerID="08ec6657f50d5550e85af87826ba5cf46aac8240ff9e47ea5d6763a2de1fa2fa" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.490480 5030 scope.go:117] "RemoveContainer" containerID="7202fe73450ff434a4d69e7b2aaa3222736c50ff914003ac1349d55d5e00980e" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.522564 5030 scope.go:117] "RemoveContainer" containerID="d73ad5f754eb2901b60128bff6a264983d7d74521f4176732b15724e0d3df8c8" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.547864 5030 scope.go:117] "RemoveContainer" containerID="4c5a39b28271097caeee65f9a045b46a3c2ee6068921dc3575a234b5903108e2" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.583575 5030 scope.go:117] "RemoveContainer" containerID="088e594ead1c9624dd906a0c4ba82c22d2712fb2e45e227fbb018786e1228e86" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.606273 5030 scope.go:117] "RemoveContainer" containerID="13735290db958c0573e03fd426dea1aa03063d2ff76782514b3c6f879976a7e7" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.630912 5030 scope.go:117] "RemoveContainer" containerID="e80638829867975c142f9a8772d2bd6c5c492f03b382bfe54bf55e324fbf8d6e" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.659086 5030 scope.go:117] "RemoveContainer" containerID="1dd19098759807465068b62dcf1137af6bfd6b4d362b08535bcc40499c15886e" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.706977 5030 scope.go:117] "RemoveContainer" containerID="666ff3f23387594aed671b6782cb36baf282d66c8bc54fcb03b31d2e44ed6692" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.731647 5030 scope.go:117] "RemoveContainer" containerID="172ef371bccaa04eddfc89e039a269b815a0129e68773a1d14b0d92242d7b805" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.754242 5030 scope.go:117] 
"RemoveContainer" containerID="0ebf9740d24b5d513af1db1bc11b43bf812a8327e814b235234029b85f1004bf" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.782243 5030 scope.go:117] "RemoveContainer" containerID="c8ad123c0154ee907610e2669c2cf247b63d6af191295a68bcf8042417ed6be7" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.802980 5030 scope.go:117] "RemoveContainer" containerID="281753cf2e936ebb772453915aa4f8dbf5ea0e0af7646cff35eb71a2c1c0f0f4" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.819091 5030 scope.go:117] "RemoveContainer" containerID="421267c6bfd98f4fe3243005cbd7630d3179d26c45dbcc020099fd31c6f39830" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.840229 5030 scope.go:117] "RemoveContainer" containerID="ba5937ed9f4588f1a552253a88dea74aa441d6cfd6aaa27566780ea0ec75d0ee" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.857578 5030 scope.go:117] "RemoveContainer" containerID="f02cc91e2853d55d3ece04bb8ab62f60b3907e5d9e90aa4bedab46dedc0cfc98" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.880196 5030 scope.go:117] "RemoveContainer" containerID="22e68e62988cf1e516c7b5af2b06fb4f49c2ec461e9218371d944061ae0e011c" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.897680 5030 scope.go:117] "RemoveContainer" containerID="1c026ff013ec59d543a67923331e5855251f76f3e958cb1eefdb053d5331676d" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.926103 5030 scope.go:117] "RemoveContainer" containerID="82c5de5d1e9ee60604eec4b8187687c0e8fd8640cff7d12c1f5a28c78d168427" Jan 20 23:53:18 crc kubenswrapper[5030]: I0120 23:53:18.971523 5030 scope.go:117] "RemoveContainer" containerID="a17d532b4f091901e53dc6ace5b971e6a334eb6f16e23f0fbf4b5e3399aab24c" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.031810 5030 scope.go:117] "RemoveContainer" containerID="86d552f2b139f590f704d703a14acee5de646739e003c7b6988c201a7f83dbc7" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.098699 5030 scope.go:117] "RemoveContainer" containerID="d98c71152944a299bb8484276e289c385976f06af8f4f14a45496d662e778bee" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.129319 5030 scope.go:117] "RemoveContainer" containerID="3e83d2eec52fcc87d54a061d272d727de88d4704afbd112e7973bc9837044a45" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.160536 5030 scope.go:117] "RemoveContainer" containerID="f05982f608e268c197c7f0c0f512888069aa7995523bedc0554768de0b8db5ce" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.200815 5030 scope.go:117] "RemoveContainer" containerID="27b0016d3de811e0565969532e55790cc76bbc720f2b4e02b99a2dd62f2af4ac" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.273175 5030 scope.go:117] "RemoveContainer" containerID="6f2c342a8688dabd93f58244581defe726775c8bbc85ea0469b1283dc62695de" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.320323 5030 scope.go:117] "RemoveContainer" containerID="ff7be2c694d3dca0c559222993519094634a7a1dfb835a7ff0372d249337df59" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.354781 5030 scope.go:117] "RemoveContainer" containerID="83ba40878af4490661874b5ada4f4f7d502d1cbc85a36fc133c30b8c8c4ae107" Jan 20 23:53:19 crc kubenswrapper[5030]: I0120 23:53:19.377474 5030 scope.go:117] "RemoveContainer" containerID="ff457d14d19e95e1a68beccc54d8b8ec84f5dfda5d6553ed1fa2422bfd6a6ecc" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.103509 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bhrks"] Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.113779 5030 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bhrks"] Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.226602 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9pwcs"] Jan 20 23:53:20 crc kubenswrapper[5030]: E0120 23:53:20.227295 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.227347 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:20 crc kubenswrapper[5030]: E0120 23:53:20.227387 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c040e9-0ad3-4a70-a414-a6239e37ff92" containerName="storage" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.227406 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c040e9-0ad3-4a70-a414-a6239e37ff92" containerName="storage" Jan 20 23:53:20 crc kubenswrapper[5030]: E0120 23:53:20.227476 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.227495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228484 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="815dc9f1-424f-499a-b27b-16f3f0483fff" containerName="mariadb-account-create-update" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228542 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228576 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c040e9-0ad3-4a70-a414-a6239e37ff92" containerName="storage" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228603 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b5db4a-e775-4981-a7fd-e2d8bd56f334" containerName="nova-cell1-conductor-conductor" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228676 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87584043-7d0e-475d-89ec-2f743b49b145" containerName="nova-cell0-conductor-conductor" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.228708 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daf347c-1524-4050-b50e-deb2024da0cc" containerName="nova-scheduler-scheduler" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.229833 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.232205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.232222 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.233800 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.234174 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.246751 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9pwcs"] Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.264145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.264232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.264457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.365413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.365602 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.365672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.365940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " 
pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.367865 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.391901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng\") pod \"crc-storage-crc-9pwcs\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.568601 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.855711 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9pwcs"] Jan 20 23:53:20 crc kubenswrapper[5030]: I0120 23:53:20.861488 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:53:21 crc kubenswrapper[5030]: I0120 23:53:21.349701 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9pwcs" event={"ID":"b60f009f-6f9c-4102-b78a-af3e0a7b6279","Type":"ContainerStarted","Data":"ed6b2deca24737574e810deb22e4b1bbfe39a5e595e1abc3ad588a93d20671bf"} Jan 20 23:53:21 crc kubenswrapper[5030]: I0120 23:53:21.974905 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c040e9-0ad3-4a70-a414-a6239e37ff92" path="/var/lib/kubelet/pods/d7c040e9-0ad3-4a70-a414-a6239e37ff92/volumes" Jan 20 23:53:22 crc kubenswrapper[5030]: I0120 23:53:22.361656 5030 generic.go:334] "Generic (PLEG): container finished" podID="b60f009f-6f9c-4102-b78a-af3e0a7b6279" containerID="29dabf429be0c0a8f2d8516036afb5c0cb465a9ff45b71f25546ff511319cdb2" exitCode=0 Jan 20 23:53:22 crc kubenswrapper[5030]: I0120 23:53:22.361731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9pwcs" event={"ID":"b60f009f-6f9c-4102-b78a-af3e0a7b6279","Type":"ContainerDied","Data":"29dabf429be0c0a8f2d8516036afb5c0cb465a9ff45b71f25546ff511319cdb2"} Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.135957 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.227049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng\") pod \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.227146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt\") pod \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.227231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage\") pod \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\" (UID: \"b60f009f-6f9c-4102-b78a-af3e0a7b6279\") " Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.227254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b60f009f-6f9c-4102-b78a-af3e0a7b6279" (UID: "b60f009f-6f9c-4102-b78a-af3e0a7b6279"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.227567 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b60f009f-6f9c-4102-b78a-af3e0a7b6279-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.235424 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng" (OuterVolumeSpecName: "kube-api-access-xhcng") pod "b60f009f-6f9c-4102-b78a-af3e0a7b6279" (UID: "b60f009f-6f9c-4102-b78a-af3e0a7b6279"). InnerVolumeSpecName "kube-api-access-xhcng". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.262140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b60f009f-6f9c-4102-b78a-af3e0a7b6279" (UID: "b60f009f-6f9c-4102-b78a-af3e0a7b6279"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.331316 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b60f009f-6f9c-4102-b78a-af3e0a7b6279-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.331370 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/b60f009f-6f9c-4102-b78a-af3e0a7b6279-kube-api-access-xhcng\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.381608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9pwcs" event={"ID":"b60f009f-6f9c-4102-b78a-af3e0a7b6279","Type":"ContainerDied","Data":"ed6b2deca24737574e810deb22e4b1bbfe39a5e595e1abc3ad588a93d20671bf"} Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.381708 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed6b2deca24737574e810deb22e4b1bbfe39a5e595e1abc3ad588a93d20671bf" Jan 20 23:53:24 crc kubenswrapper[5030]: I0120 23:53:24.381662 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9pwcs" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.665652 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:53:36 crc kubenswrapper[5030]: E0120 23:53:36.666568 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60f009f-6f9c-4102-b78a-af3e0a7b6279" containerName="storage" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.666589 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60f009f-6f9c-4102-b78a-af3e0a7b6279" containerName="storage" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.666828 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60f009f-6f9c-4102-b78a-af3e0a7b6279" containerName="storage" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.667980 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.672177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-default-user" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.673838 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-erlang-cookie" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.673969 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-plugins-conf" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.674830 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-cell1-svc" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.675216 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-conf" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.675264 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-cell1-server-dockercfg-drr5h" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.680604 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-cell1-config-data" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.710984 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771119 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x8ns\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771285 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.771334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x8ns\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872919 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.872976 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.873004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.873027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.874223 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.874406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.874974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.875312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.875373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.875432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.881810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.882103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.882576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.892062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " 
pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.903869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x8ns\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:36 crc kubenswrapper[5030]: I0120 23:53:36.936094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:37 crc kubenswrapper[5030]: I0120 23:53:37.004254 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:53:37 crc kubenswrapper[5030]: I0120 23:53:37.261579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:53:37 crc kubenswrapper[5030]: I0120 23:53:37.994212 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:53:37 crc kubenswrapper[5030]: I0120 23:53:37.996251 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001117 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-default-user" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001343 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-server-conf" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001454 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-rabbitmq-svc" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001668 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-server-dockercfg-f2d29" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001786 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.001917 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-plugins-conf" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.002084 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"rabbitmq-config-data" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.010776 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6nb\" (UniqueName: 
\"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.093995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.094300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.094449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.094470 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.094519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6nb\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196261 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196669 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") device mount path \"/mnt/openstack/pv13\"" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.196980 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.197221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.197249 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.198038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.198132 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.203011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.203128 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.204686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.213101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6nb\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.222069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.224226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"rabbitmq-server-0\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.402331 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.547468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerStarted","Data":"5a7ae9d1d15217f57b3f31ac70fb64faeda0e9bb019643a188cdd9125c575ac0"} Jan 20 23:53:38 crc kubenswrapper[5030]: I0120 23:53:38.695468 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:53:38 crc kubenswrapper[5030]: W0120 23:53:38.704745 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdf6b51_c0c7_43fd_9a42_c99b2d8cd0bf.slice/crio-5565a4cb313abfdb07d83531115ca587d3e85dfdf2eabf7388b1f5bb1215b92a WatchSource:0}: Error finding container 5565a4cb313abfdb07d83531115ca587d3e85dfdf2eabf7388b1f5bb1215b92a: Status 404 returned error can't find the container with id 5565a4cb313abfdb07d83531115ca587d3e85dfdf2eabf7388b1f5bb1215b92a Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.210510 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.212076 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.214941 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config-data" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.216831 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-dockercfg-rjgrq" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.218495 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-svc" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.220411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.222059 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-scripts" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.230948 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319216 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319313 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319604 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319935 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.319961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phlwp\" (UniqueName: \"kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" 
Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.320011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.320057 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.421646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.421980 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") device mount path \"/mnt/openstack/pv20\"" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422043 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phlwp\" (UniqueName: \"kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 
23:53:39.422227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.422274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.423070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.423596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.424157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.424359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.556436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerStarted","Data":"5565a4cb313abfdb07d83531115ca587d3e85dfdf2eabf7388b1f5bb1215b92a"} Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.564385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerStarted","Data":"060f9ab9280053821ca2e9dbdfd76e7b2e79ed01983ce958e604a573ce49b483"} Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.748128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.748140 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.748765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phlwp\" (UniqueName: \"kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:39 crc kubenswrapper[5030]: I0120 23:53:39.878437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"openstack-galera-0\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.145930 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.590192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerStarted","Data":"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da"} Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.596193 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.598605 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.636374 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-config-data" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.637212 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"galera-openstack-cell1-dockercfg-l8frs" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.637377 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-cell1-scripts" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.637422 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-galera-openstack-cell1-svc" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.655789 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.685189 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrmn\" (UniqueName: \"kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.748278 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850227 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrmn\" (UniqueName: \"kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850300 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.850715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.851373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.852369 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") device mount path \"/mnt/openstack/pv14\"" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.852891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.853577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.855297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.855363 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.871504 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrmn\" (UniqueName: \"kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.885824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.956501 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.960437 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.961317 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.973578 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"memcached-config-data" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.973664 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-memcached-svc" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.973579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"memcached-memcached-dockercfg-n9z6t" Jan 20 23:53:40 crc kubenswrapper[5030]: I0120 23:53:40.983070 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.053380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.053420 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.053512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.053530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55kl\" (UniqueName: \"kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.053554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.154773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.154832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55kl\" (UniqueName: \"kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.154878 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.154946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.154968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.155695 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.155771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.160159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.179356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.180872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55kl\" (UniqueName: \"kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl\") pod \"memcached-0\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.269525 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:53:41 crc kubenswrapper[5030]: W0120 23:53:41.273937 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bdb99fc_435f_4ea4_bd16_e04616518099.slice/crio-6da066bd7d25105a4da15c0529a9ce20d788f3400b5f68b97e8d04006a8eb3bf WatchSource:0}: Error finding container 6da066bd7d25105a4da15c0529a9ce20d788f3400b5f68b97e8d04006a8eb3bf: Status 404 returned error can't find the container with id 6da066bd7d25105a4da15c0529a9ce20d788f3400b5f68b97e8d04006a8eb3bf Jan 20 23:53:41 
crc kubenswrapper[5030]: I0120 23:53:41.357414 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.596984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerStarted","Data":"b67de712f93a9f90d450a8ee5ca472bfc03b00b231f684c27913cba97b33dc21"} Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.597040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerStarted","Data":"897bbc0f6d4b7187b341c23854f60881affd5828a7816b6d108b9cee8f44325d"} Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.601153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerStarted","Data":"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce"} Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.601193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerStarted","Data":"6da066bd7d25105a4da15c0529a9ce20d788f3400b5f68b97e8d04006a8eb3bf"} Jan 20 23:53:41 crc kubenswrapper[5030]: I0120 23:53:41.833506 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:53:41 crc kubenswrapper[5030]: W0120 23:53:41.839765 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod443bb4b6_0b3c_4824_b5b6_e2771eb8e438.slice/crio-46d12a70b8b856cba286119e3a876938998f065df7bb984b6d3fc83283ebda7c WatchSource:0}: Error finding container 46d12a70b8b856cba286119e3a876938998f065df7bb984b6d3fc83283ebda7c: Status 404 returned error can't find the container with id 46d12a70b8b856cba286119e3a876938998f065df7bb984b6d3fc83283ebda7c Jan 20 23:53:42 crc kubenswrapper[5030]: I0120 23:53:42.608939 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"443bb4b6-0b3c-4824-b5b6-e2771eb8e438","Type":"ContainerStarted","Data":"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb"} Jan 20 23:53:42 crc kubenswrapper[5030]: I0120 23:53:42.609362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"443bb4b6-0b3c-4824-b5b6-e2771eb8e438","Type":"ContainerStarted","Data":"46d12a70b8b856cba286119e3a876938998f065df7bb984b6d3fc83283ebda7c"} Jan 20 23:53:42 crc kubenswrapper[5030]: I0120 23:53:42.629706 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/memcached-0" podStartSLOduration=2.629691074 podStartE2EDuration="2.629691074s" podCreationTimestamp="2026-01-20 23:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:42.626381074 +0000 UTC m=+4694.946641362" watchObservedRunningTime="2026-01-20 23:53:42.629691074 +0000 UTC m=+4694.949951352" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.114397 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 
23:53:43.115561 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.118486 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"telemetry-ceilometer-dockercfg-jc87g" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.125267 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.187737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qlnq\" (UniqueName: \"kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq\") pod \"kube-state-metrics-0\" (UID: \"74d2242a-b72f-40ed-ba26-3c1e7940a218\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.289131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qlnq\" (UniqueName: \"kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq\") pod \"kube-state-metrics-0\" (UID: \"74d2242a-b72f-40ed-ba26-3c1e7940a218\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.317827 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qlnq\" (UniqueName: \"kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq\") pod \"kube-state-metrics-0\" (UID: \"74d2242a-b72f-40ed-ba26-3c1e7940a218\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.435112 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.619824 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:43 crc kubenswrapper[5030]: I0120 23:53:43.930777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:53:44 crc kubenswrapper[5030]: I0120 23:53:44.631610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"74d2242a-b72f-40ed-ba26-3c1e7940a218","Type":"ContainerStarted","Data":"074ad098cb3f61f9ad45d4cfa44a4724fde8cfdbe893c62831c21e800e9b8a85"} Jan 20 23:53:44 crc kubenswrapper[5030]: I0120 23:53:44.637798 5030 generic.go:334] "Generic (PLEG): container finished" podID="732faf79-c224-4a2a-bc39-07343926d772" containerID="b67de712f93a9f90d450a8ee5ca472bfc03b00b231f684c27913cba97b33dc21" exitCode=0 Jan 20 23:53:44 crc kubenswrapper[5030]: I0120 23:53:44.637868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerDied","Data":"b67de712f93a9f90d450a8ee5ca472bfc03b00b231f684c27913cba97b33dc21"} Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.650611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"74d2242a-b72f-40ed-ba26-3c1e7940a218","Type":"ContainerStarted","Data":"2f4be2996c4bb42f362ed54e4832f7079bdf293a0e0142c325547ed3531d22a0"} Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.651077 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.654093 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerStarted","Data":"c349fa99a47e05f4423344350c4f8ba2fbdc8b3c193473666cc6af2498e4f85c"} Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.659302 5030 generic.go:334] "Generic (PLEG): container finished" podID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerID="204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce" exitCode=0 Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.659353 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerDied","Data":"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce"} Jan 20 23:53:45 crc kubenswrapper[5030]: I0120 23:53:45.685381 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.11311253 podStartE2EDuration="2.68532223s" podCreationTimestamp="2026-01-20 23:53:43 +0000 UTC" firstStartedPulling="2026-01-20 23:53:43.940921833 +0000 UTC m=+4696.261182161" lastFinishedPulling="2026-01-20 23:53:44.513131563 +0000 UTC m=+4696.833391861" observedRunningTime="2026-01-20 23:53:45.674533778 +0000 UTC m=+4697.994794136" watchObservedRunningTime="2026-01-20 23:53:45.68532223 +0000 UTC m=+4698.005582528" Jan 20 23:53:46 crc kubenswrapper[5030]: I0120 23:53:46.673459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" 
event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerStarted","Data":"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae"} Jan 20 23:53:46 crc kubenswrapper[5030]: I0120 23:53:46.707518 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-galera-0" podStartSLOduration=8.707497846 podStartE2EDuration="8.707497846s" podCreationTimestamp="2026-01-20 23:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:45.72698624 +0000 UTC m=+4698.047246538" watchObservedRunningTime="2026-01-20 23:53:46.707497846 +0000 UTC m=+4699.027758144" Jan 20 23:53:46 crc kubenswrapper[5030]: I0120 23:53:46.713449 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podStartSLOduration=7.71343058 podStartE2EDuration="7.71343058s" podCreationTimestamp="2026-01-20 23:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:46.705370714 +0000 UTC m=+4699.025631022" watchObservedRunningTime="2026-01-20 23:53:46.71343058 +0000 UTC m=+4699.033690878" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.487696 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.489959 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.495312 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-sb-ovndbs" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.495328 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-config" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.495760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovn-metrics" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.496203 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-sb-dockercfg-8f77w" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.501871 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-sb-scripts" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.557896 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.661967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 
23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662049 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7r2\" (UniqueName: \"kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662159 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.662293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.763811 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.763856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.763885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 
23:53:47.763915 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.763945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7r2\" (UniqueName: \"kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.763989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.764049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.764098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.764402 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") device mount path \"/mnt/openstack/pv18\"" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.766750 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.767039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.767290 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.770163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.771366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.783742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.784248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7r2\" (UniqueName: \"kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.784336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:47 crc kubenswrapper[5030]: I0120 23:53:47.807810 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:48 crc kubenswrapper[5030]: I0120 23:53:48.273792 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:53:48 crc kubenswrapper[5030]: W0120 23:53:48.284371 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fd59eaf_4b45_4866_a373_fbd87aa2054d.slice/crio-04e99e903bcf133b615c5b5c7a4c34a8124ed39a9c5e1b974336ac6b30ad21d2 WatchSource:0}: Error finding container 04e99e903bcf133b615c5b5c7a4c34a8124ed39a9c5e1b974336ac6b30ad21d2: Status 404 returned error can't find the container with id 04e99e903bcf133b615c5b5c7a4c34a8124ed39a9c5e1b974336ac6b30ad21d2 Jan 20 23:53:48 crc kubenswrapper[5030]: I0120 23:53:48.694068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerStarted","Data":"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c"} Jan 20 23:53:48 crc kubenswrapper[5030]: I0120 23:53:48.694438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerStarted","Data":"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5"} Jan 20 23:53:48 crc kubenswrapper[5030]: I0120 23:53:48.694453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerStarted","Data":"04e99e903bcf133b615c5b5c7a4c34a8124ed39a9c5e1b974336ac6b30ad21d2"} Jan 20 23:53:48 crc kubenswrapper[5030]: I0120 23:53:48.712736 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/ovsdbserver-sb-0" podStartSLOduration=2.71271443 podStartE2EDuration="2.71271443s" podCreationTimestamp="2026-01-20 23:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:48.711377507 +0000 UTC m=+4701.031637805" watchObservedRunningTime="2026-01-20 23:53:48.71271443 +0000 UTC m=+4701.032974718" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.216871 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.218074 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.225118 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-scripts" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.226561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovncluster-ovndbcluster-nb-dockercfg-dppvg" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.226637 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovndbcluster-nb-config" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.226946 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovndbcluster-nb-ovndbs" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.244824 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wsc\" (UniqueName: \"kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389924 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.389966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491473 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wsc\" (UniqueName: \"kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 
23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.491708 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.493291 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") device mount path \"/mnt/openstack/pv11\"" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.493310 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.493751 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.493877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.499553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.500128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.506745 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.510816 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wsc\" (UniqueName: 
\"kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.515043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.544879 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:49 crc kubenswrapper[5030]: I0120 23:53:49.998517 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.146146 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.146422 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.243603 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.712419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerStarted","Data":"b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2"} Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.712481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerStarted","Data":"c9d6e972c95d56185baa0e231c500cf95678aa232be38c9e6761786e20dc488f"} Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.807983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.931685 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.957334 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:50 crc kubenswrapper[5030]: I0120 23:53:50.957725 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.154589 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-create-dmm8j"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.155738 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.164050 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dmm8j"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.220845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.221009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgsmw\" (UniqueName: \"kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.221824 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-84pkt"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.223695 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.225493 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.234979 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-84pkt"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.321979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8kvv\" (UniqueName: \"kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.322034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.322080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgsmw\" (UniqueName: \"kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.322149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " 
pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.323332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.339282 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgsmw\" (UniqueName: \"kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw\") pod \"placement-db-create-dmm8j\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.358878 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.423364 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8kvv\" (UniqueName: \"kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.423432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.424274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.453157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8kvv\" (UniqueName: \"kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv\") pod \"placement-5889-account-create-update-84pkt\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.491338 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.531184 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-create-m5sbm"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.532711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.541652 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-m5sbm"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.541863 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.626972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts\") pod \"glance-db-create-m5sbm\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.627088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92dj\" (UniqueName: \"kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj\") pod \"glance-db-create-m5sbm\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.630721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-638c-account-create-update-n28wn"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.631663 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.633472 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-db-secret" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.642790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-638c-account-create-update-n28wn"] Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.721809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerStarted","Data":"05145492f4392fcd8c755b7cc30f2f103a6adfaf13ba303dcd29f71b950cc8bb"} Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.728080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.728147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkchr\" (UniqueName: \"kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.728205 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts\") pod \"glance-db-create-m5sbm\" 
(UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.728315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92dj\" (UniqueName: \"kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj\") pod \"glance-db-create-m5sbm\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.730558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts\") pod \"glance-db-create-m5sbm\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.746257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92dj\" (UniqueName: \"kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj\") pod \"glance-db-create-m5sbm\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.746794 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podStartSLOduration=3.746770182 podStartE2EDuration="3.746770182s" podCreationTimestamp="2026-01-20 23:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:51.74050758 +0000 UTC m=+4704.060767868" watchObservedRunningTime="2026-01-20 23:53:51.746770182 +0000 UTC m=+4704.067030470" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.829816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.829867 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkchr\" (UniqueName: \"kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.832453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.846068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkchr\" (UniqueName: \"kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr\") pod \"glance-638c-account-create-update-n28wn\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " 
pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.898978 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:51 crc kubenswrapper[5030]: I0120 23:53:51.949741 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.031416 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dmm8j"] Jan 20 23:53:52 crc kubenswrapper[5030]: W0120 23:53:52.041703 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f25e04_6d69_4422_8114_d112be37db5a.slice/crio-675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3 WatchSource:0}: Error finding container 675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3: Status 404 returned error can't find the container with id 675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3 Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.080736 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-84pkt"] Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.398310 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-create-m5sbm"] Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.502840 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-638c-account-create-update-n28wn"] Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.545479 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:52 crc kubenswrapper[5030]: W0120 23:53:52.546998 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f572652_5099_4653_9d03_efe8a789d462.slice/crio-7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70 WatchSource:0}: Error finding container 7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70: Status 404 returned error can't find the container with id 7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70 Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.732670 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" event={"ID":"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6","Type":"ContainerStarted","Data":"61b2fe97e12ea72a594b4c383d60012eaefea9fd8d2a5b9bebe2e3f1e187f5aa"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.733017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" event={"ID":"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6","Type":"ContainerStarted","Data":"e70474055267817c9d69f5e4a786ad7854b080e99e3989c7353b0c33b456ed65"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.734897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-m5sbm" event={"ID":"9f572652-5099-4653-9d03-efe8a789d462","Type":"ContainerStarted","Data":"987b60ef6a05a596a6a79190d778511056918354b3a30c0d6162c75906330368"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.734942 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-kuttl-tests/glance-db-create-m5sbm" event={"ID":"9f572652-5099-4653-9d03-efe8a789d462","Type":"ContainerStarted","Data":"7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.737363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" event={"ID":"4c71777b-833a-46c4-afac-95aff05fceb1","Type":"ContainerStarted","Data":"82fd2d1b87ea50bd1f70fa929f9a1b463798cdb4ceab4e71e9fb439030a3b2d8"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.737422 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" event={"ID":"4c71777b-833a-46c4-afac-95aff05fceb1","Type":"ContainerStarted","Data":"62c12e5c04f9edd3cc6dd16cfd673e5c3c005f8ea5d171a727f3eb11eda42ced"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.738641 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1f25e04-6d69-4422-8114-d112be37db5a" containerID="cae7101fe36c5b5ad5d91e88266b7c244153c07641129413c400f25ad1076188" exitCode=0 Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.738823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dmm8j" event={"ID":"f1f25e04-6d69-4422-8114-d112be37db5a","Type":"ContainerDied","Data":"cae7101fe36c5b5ad5d91e88266b7c244153c07641129413c400f25ad1076188"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.738898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dmm8j" event={"ID":"f1f25e04-6d69-4422-8114-d112be37db5a","Type":"ContainerStarted","Data":"675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3"} Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.749000 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" podStartSLOduration=1.748974514 podStartE2EDuration="1.748974514s" podCreationTimestamp="2026-01-20 23:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:52.746026883 +0000 UTC m=+4705.066287191" watchObservedRunningTime="2026-01-20 23:53:52.748974514 +0000 UTC m=+4705.069234812" Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.766061 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" podStartSLOduration=1.766040178 podStartE2EDuration="1.766040178s" podCreationTimestamp="2026-01-20 23:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:52.76242089 +0000 UTC m=+4705.082681198" watchObservedRunningTime="2026-01-20 23:53:52.766040178 +0000 UTC m=+4705.086300476" Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.795581 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-create-m5sbm" podStartSLOduration=1.7955643239999999 podStartE2EDuration="1.795564324s" podCreationTimestamp="2026-01-20 23:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:52.792258154 +0000 UTC m=+4705.112518442" watchObservedRunningTime="2026-01-20 23:53:52.795564324 +0000 UTC 
m=+4705.115824612" Jan 20 23:53:52 crc kubenswrapper[5030]: I0120 23:53:52.808332 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.370918 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.441765 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.524670 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.748850 5030 generic.go:334] "Generic (PLEG): container finished" podID="4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" containerID="61b2fe97e12ea72a594b4c383d60012eaefea9fd8d2a5b9bebe2e3f1e187f5aa" exitCode=0 Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.748926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" event={"ID":"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6","Type":"ContainerDied","Data":"61b2fe97e12ea72a594b4c383d60012eaefea9fd8d2a5b9bebe2e3f1e187f5aa"} Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.750441 5030 generic.go:334] "Generic (PLEG): container finished" podID="9f572652-5099-4653-9d03-efe8a789d462" containerID="987b60ef6a05a596a6a79190d778511056918354b3a30c0d6162c75906330368" exitCode=0 Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.750512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-m5sbm" event={"ID":"9f572652-5099-4653-9d03-efe8a789d462","Type":"ContainerDied","Data":"987b60ef6a05a596a6a79190d778511056918354b3a30c0d6162c75906330368"} Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.752079 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c71777b-833a-46c4-afac-95aff05fceb1" containerID="82fd2d1b87ea50bd1f70fa929f9a1b463798cdb4ceab4e71e9fb439030a3b2d8" exitCode=0 Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.752150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" event={"ID":"4c71777b-833a-46c4-afac-95aff05fceb1","Type":"ContainerDied","Data":"82fd2d1b87ea50bd1f70fa929f9a1b463798cdb4ceab4e71e9fb439030a3b2d8"} Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.854751 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:53 crc kubenswrapper[5030]: I0120 23:53:53.916292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.126668 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.276272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgsmw\" (UniqueName: \"kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw\") pod \"f1f25e04-6d69-4422-8114-d112be37db5a\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.276386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts\") pod \"f1f25e04-6d69-4422-8114-d112be37db5a\" (UID: \"f1f25e04-6d69-4422-8114-d112be37db5a\") " Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.277216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1f25e04-6d69-4422-8114-d112be37db5a" (UID: "f1f25e04-6d69-4422-8114-d112be37db5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.286691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw" (OuterVolumeSpecName: "kube-api-access-pgsmw") pod "f1f25e04-6d69-4422-8114-d112be37db5a" (UID: "f1f25e04-6d69-4422-8114-d112be37db5a"). InnerVolumeSpecName "kube-api-access-pgsmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.379304 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgsmw\" (UniqueName: \"kubernetes.io/projected/f1f25e04-6d69-4422-8114-d112be37db5a-kube-api-access-pgsmw\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.379372 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1f25e04-6d69-4422-8114-d112be37db5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.545467 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.546236 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:53:54 crc kubenswrapper[5030]: E0120 23:53:54.546551 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f25e04-6d69-4422-8114-d112be37db5a" containerName="mariadb-database-create" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.546568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f25e04-6d69-4422-8114-d112be37db5a" containerName="mariadb-database-create" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.546766 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f25e04-6d69-4422-8114-d112be37db5a" containerName="mariadb-database-create" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.551204 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.554763 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-storage-config-data" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.555055 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-conf" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.555367 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-files" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.556727 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-swift-dockercfg-fms7p" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.590344 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.689801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.689900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwzn\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.689929 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.689954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.690157 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.762489 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-create-dmm8j" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.762577 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-create-dmm8j" event={"ID":"f1f25e04-6d69-4422-8114-d112be37db5a","Type":"ContainerDied","Data":"675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3"} Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.762640 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="675b5f3ed91031f6dcc53ec087e9bde4000b8503c73929dfa4a5df50583d62e3" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.791296 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.791643 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.791710 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwzn\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.791736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.791988 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.792065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: E0120 23:53:54.792223 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:53:54 crc kubenswrapper[5030]: E0120 23:53:54.792246 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:53:54 crc kubenswrapper[5030]: E0120 23:53:54.792340 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift podName:647882ed-cb41-417a-98bc-5a6bfbf0dd2a nodeName:}" failed. 
No retries permitted until 2026-01-20 23:53:55.292322193 +0000 UTC m=+4707.612582491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift") pod "swift-storage-0" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a") : configmap "swift-ring-files" not found Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.792367 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.793704 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.848250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwzn\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.859781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.929869 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-7vfh9"] Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.931175 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.933885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.934440 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-config-data" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.934744 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"swift-ring-scripts" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.941283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-7vfh9"] Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzppj\" (UniqueName: \"kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998569 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:54 crc kubenswrapper[5030]: I0120 23:53:54.998750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf\") pod 
\"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100316 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzppj\" (UniqueName: \"kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100419 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.100585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.103467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices\") pod 
\"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.103494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.104379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.105826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.114514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.122313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzppj\" (UniqueName: \"kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj\") pod \"swift-ring-rebalance-7vfh9\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.207359 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.248506 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.304845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92dj\" (UniqueName: \"kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj\") pod \"9f572652-5099-4653-9d03-efe8a789d462\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.304927 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts\") pod \"9f572652-5099-4653-9d03-efe8a789d462\" (UID: \"9f572652-5099-4653-9d03-efe8a789d462\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.305193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:55 crc kubenswrapper[5030]: E0120 23:53:55.305400 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:53:55 crc kubenswrapper[5030]: E0120 23:53:55.305418 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:53:55 crc kubenswrapper[5030]: E0120 23:53:55.305474 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift podName:647882ed-cb41-417a-98bc-5a6bfbf0dd2a nodeName:}" failed. No retries permitted until 2026-01-20 23:53:56.30545744 +0000 UTC m=+4708.625717728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift") pod "swift-storage-0" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a") : configmap "swift-ring-files" not found Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.305591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f572652-5099-4653-9d03-efe8a789d462" (UID: "9f572652-5099-4653-9d03-efe8a789d462"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.308889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj" (OuterVolumeSpecName: "kube-api-access-h92dj") pod "9f572652-5099-4653-9d03-efe8a789d462" (UID: "9f572652-5099-4653-9d03-efe8a789d462"). InnerVolumeSpecName "kube-api-access-h92dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.341785 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.346267 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.406584 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts\") pod \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.406675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8kvv\" (UniqueName: \"kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv\") pod \"4c71777b-833a-46c4-afac-95aff05fceb1\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.406755 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts\") pod \"4c71777b-833a-46c4-afac-95aff05fceb1\" (UID: \"4c71777b-833a-46c4-afac-95aff05fceb1\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.406784 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkchr\" (UniqueName: \"kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr\") pod \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\" (UID: \"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6\") " Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.407231 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92dj\" (UniqueName: \"kubernetes.io/projected/9f572652-5099-4653-9d03-efe8a789d462-kube-api-access-h92dj\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.407247 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f572652-5099-4653-9d03-efe8a789d462-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.408504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c71777b-833a-46c4-afac-95aff05fceb1" (UID: "4c71777b-833a-46c4-afac-95aff05fceb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.408479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" (UID: "4044eeda-1a3f-45ef-8f85-efdbfbffd4f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.412875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr" (OuterVolumeSpecName: "kube-api-access-lkchr") pod "4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" (UID: "4044eeda-1a3f-45ef-8f85-efdbfbffd4f6"). InnerVolumeSpecName "kube-api-access-lkchr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.412954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv" (OuterVolumeSpecName: "kube-api-access-r8kvv") pod "4c71777b-833a-46c4-afac-95aff05fceb1" (UID: "4c71777b-833a-46c4-afac-95aff05fceb1"). InnerVolumeSpecName "kube-api-access-r8kvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.508534 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8kvv\" (UniqueName: \"kubernetes.io/projected/4c71777b-833a-46c4-afac-95aff05fceb1-kube-api-access-r8kvv\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.508581 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c71777b-833a-46c4-afac-95aff05fceb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.508598 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkchr\" (UniqueName: \"kubernetes.io/projected/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-kube-api-access-lkchr\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.508612 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.598078 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:55 crc kubenswrapper[5030]: W0120 23:53:55.715440 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32dae846_6ace_453b_a3eb_b237739433c6.slice/crio-84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56 WatchSource:0}: Error finding container 84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56: Status 404 returned error can't find the container with id 84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56 Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.716642 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-7vfh9"] Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.769878 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" event={"ID":"32dae846-6ace-453b-a3eb-b237739433c6","Type":"ContainerStarted","Data":"84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56"} Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.771638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" event={"ID":"4044eeda-1a3f-45ef-8f85-efdbfbffd4f6","Type":"ContainerDied","Data":"e70474055267817c9d69f5e4a786ad7854b080e99e3989c7353b0c33b456ed65"} Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.771711 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e70474055267817c9d69f5e4a786ad7854b080e99e3989c7353b0c33b456ed65" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.771794 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-638c-account-create-update-n28wn" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.773695 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-create-m5sbm" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.773696 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-create-m5sbm" event={"ID":"9f572652-5099-4653-9d03-efe8a789d462","Type":"ContainerDied","Data":"7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70"} Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.773891 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b86f40f7e9f1fa503a7b77f174ee893882fbae2f163a7742fda828ce621be70" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.779237 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" event={"ID":"4c71777b-833a-46c4-afac-95aff05fceb1","Type":"ContainerDied","Data":"62c12e5c04f9edd3cc6dd16cfd673e5c3c005f8ea5d171a727f3eb11eda42ced"} Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.779285 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c12e5c04f9edd3cc6dd16cfd673e5c3c005f8ea5d171a727f3eb11eda42ced" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.779294 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-84pkt" Jan 20 23:53:55 crc kubenswrapper[5030]: I0120 23:53:55.836821 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.019187 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.019926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020002 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.020078 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f572652-5099-4653-9d03-efe8a789d462" containerName="mariadb-database-create" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020131 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f572652-5099-4653-9d03-efe8a789d462" containerName="mariadb-database-create" Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.020199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c71777b-833a-46c4-afac-95aff05fceb1" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020252 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c71777b-833a-46c4-afac-95aff05fceb1" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020438 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020504 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4c71777b-833a-46c4-afac-95aff05fceb1" containerName="mariadb-account-create-update" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.020571 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f572652-5099-4653-9d03-efe8a789d462" containerName="mariadb-database-create" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.021436 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.023254 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-scripts" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.023306 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ovnnorthd-ovndbs" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.025985 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ovnnorthd-ovnnorthd-dockercfg-77xqz" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.026190 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovnnorthd-config" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.033596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.117967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmftk\" (UniqueName: \"kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118462 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.118901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.220396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.220451 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221100 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmftk\" (UniqueName: \"kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221359 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.221780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: 
I0120 23:53:56.222287 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.222324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.224036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.224820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.225903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.270513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmftk\" (UniqueName: \"kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk\") pod \"ovn-northd-0\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.323039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.323247 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.323280 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:53:56 crc kubenswrapper[5030]: E0120 23:53:56.323352 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift podName:647882ed-cb41-417a-98bc-5a6bfbf0dd2a nodeName:}" failed. No retries permitted until 2026-01-20 23:53:58.323331482 +0000 UTC m=+4710.643591790 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift") pod "swift-storage-0" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a") : configmap "swift-ring-files" not found Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.351264 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.792186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" event={"ID":"32dae846-6ace-453b-a3eb-b237739433c6","Type":"ContainerStarted","Data":"e2808049c7a50fc3f17f6bc197d8b642e6960f22f7752674054c0cb0f66685ef"} Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.816330 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zt8z9"] Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.817521 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.820586 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-config-data" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.822714 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" podStartSLOduration=2.8226955670000002 podStartE2EDuration="2.822695567s" podCreationTimestamp="2026-01-20 23:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:56.820114654 +0000 UTC m=+4709.140374952" watchObservedRunningTime="2026-01-20 23:53:56.822695567 +0000 UTC m=+4709.142955855" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.833401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-pt9jr" Jan 20 23:53:56 crc kubenswrapper[5030]: W0120 23:53:56.847775 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44f0a514_9f01_4768_aeb4_93de0354b738.slice/crio-b1c0010d21784fca0d0cfe4d45772d4e1220c271927f22732e828be4e0e42ff3 WatchSource:0}: Error finding container b1c0010d21784fca0d0cfe4d45772d4e1220c271927f22732e828be4e0e42ff3: Status 404 returned error can't find the container with id b1c0010d21784fca0d0cfe4d45772d4e1220c271927f22732e828be4e0e42ff3 Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.852051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zt8z9"] Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.861674 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.932821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.933355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntx66\" (UniqueName: 
\"kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.933726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:56 crc kubenswrapper[5030]: I0120 23:53:56.933961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.036113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.036448 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntx66\" (UniqueName: \"kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.036683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.036850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.041912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.042809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.043706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.056589 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntx66\" (UniqueName: \"kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66\") pod \"glance-db-sync-zt8z9\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.144956 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.592669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zt8z9"] Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.806998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerStarted","Data":"39b13d4627425616c7d43237285304e39a1350f2af6bf2588bc9f6f2e89e8a28"} Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.807142 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.807160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerStarted","Data":"014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184"} Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.807174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerStarted","Data":"b1c0010d21784fca0d0cfe4d45772d4e1220c271927f22732e828be4e0e42ff3"} Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.809196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" event={"ID":"c43aca93-d5a2-4ec9-bac6-6ac012e45271","Type":"ContainerStarted","Data":"9dd4a3fd4d444478d1c873055c01bc8cc2fc9cd94b582f5a1ca327f132ffd3f4"} Jan 20 23:53:57 crc kubenswrapper[5030]: I0120 23:53:57.831571 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-northd-0" podStartSLOduration=1.83154823 podStartE2EDuration="1.83154823s" podCreationTimestamp="2026-01-20 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:57.825148525 +0000 UTC m=+4710.145408833" watchObservedRunningTime="2026-01-20 23:53:57.83154823 +0000 UTC m=+4710.151808518" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.175602 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v4kgx"] Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.177711 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.180509 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.203448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v4kgx"] Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.262469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.262530 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsd9\" (UniqueName: \"kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.363910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.363980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: E0120 23:53:58.364099 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:53:58 crc kubenswrapper[5030]: E0120 23:53:58.365099 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:53:58 crc kubenswrapper[5030]: E0120 23:53:58.365150 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift podName:647882ed-cb41-417a-98bc-5a6bfbf0dd2a nodeName:}" failed. No retries permitted until 2026-01-20 23:54:02.365137484 +0000 UTC m=+4714.685397772 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift") pod "swift-storage-0" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a") : configmap "swift-ring-files" not found Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.364999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.365061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsd9\" (UniqueName: \"kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.382748 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsd9\" (UniqueName: \"kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9\") pod \"root-account-create-update-v4kgx\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.563416 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.826298 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" event={"ID":"c43aca93-d5a2-4ec9-bac6-6ac012e45271","Type":"ContainerStarted","Data":"5b1afaba37cd88267c0434830d481b7db9a7f6660005d0a8a40dae36c79fe289"} Jan 20 23:53:58 crc kubenswrapper[5030]: I0120 23:53:58.853061 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" podStartSLOduration=2.85304281 podStartE2EDuration="2.85304281s" podCreationTimestamp="2026-01-20 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:53:58.84728841 +0000 UTC m=+4711.167548688" watchObservedRunningTime="2026-01-20 23:53:58.85304281 +0000 UTC m=+4711.173303088" Jan 20 23:53:59 crc kubenswrapper[5030]: I0120 23:53:59.106762 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v4kgx"] Jan 20 23:53:59 crc kubenswrapper[5030]: W0120 23:53:59.111530 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97a5e929_b4bb_4c07_814c_980b709b5368.slice/crio-d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a WatchSource:0}: Error finding container d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a: Status 404 returned error can't find the container with id d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a Jan 20 23:53:59 crc kubenswrapper[5030]: I0120 23:53:59.844202 5030 generic.go:334] "Generic (PLEG): container finished" podID="97a5e929-b4bb-4c07-814c-980b709b5368" containerID="4b48160c13eeb727a67fb21b66457065131b176d3905049f3a13e3528f25f470" 
exitCode=0 Jan 20 23:53:59 crc kubenswrapper[5030]: I0120 23:53:59.844330 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" event={"ID":"97a5e929-b4bb-4c07-814c-980b709b5368","Type":"ContainerDied","Data":"4b48160c13eeb727a67fb21b66457065131b176d3905049f3a13e3528f25f470"} Jan 20 23:53:59 crc kubenswrapper[5030]: I0120 23:53:59.844613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" event={"ID":"97a5e929-b4bb-4c07-814c-980b709b5368","Type":"ContainerStarted","Data":"d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a"} Jan 20 23:54:00 crc kubenswrapper[5030]: I0120 23:54:00.871802 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-create-56vq6"] Jan 20 23:54:00 crc kubenswrapper[5030]: I0120 23:54:00.873563 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:00 crc kubenswrapper[5030]: I0120 23:54:00.880759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-56vq6"] Jan 20 23:54:00 crc kubenswrapper[5030]: I0120 23:54:00.912068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:00 crc kubenswrapper[5030]: I0120 23:54:00.912238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xtzw\" (UniqueName: \"kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.008351 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs"] Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.009670 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.012747 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.012796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xtzw\" (UniqueName: \"kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.012840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj5f9\" (UniqueName: \"kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.012891 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.013556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.015583 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.029140 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs"] Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.114107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj5f9\" (UniqueName: \"kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.114211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.115371 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.157228 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj5f9\" (UniqueName: \"kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9\") pod \"keystone-d843-account-create-update-t6cfs\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.157900 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xtzw\" (UniqueName: \"kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw\") pod \"keystone-db-create-56vq6\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.206583 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.317070 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.333001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.417160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsd9\" (UniqueName: \"kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9\") pod \"97a5e929-b4bb-4c07-814c-980b709b5368\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.417221 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts\") pod \"97a5e929-b4bb-4c07-814c-980b709b5368\" (UID: \"97a5e929-b4bb-4c07-814c-980b709b5368\") " Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.418306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97a5e929-b4bb-4c07-814c-980b709b5368" (UID: "97a5e929-b4bb-4c07-814c-980b709b5368"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.422923 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9" (OuterVolumeSpecName: "kube-api-access-xwsd9") pod "97a5e929-b4bb-4c07-814c-980b709b5368" (UID: "97a5e929-b4bb-4c07-814c-980b709b5368"). InnerVolumeSpecName "kube-api-access-xwsd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.520645 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsd9\" (UniqueName: \"kubernetes.io/projected/97a5e929-b4bb-4c07-814c-980b709b5368-kube-api-access-xwsd9\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.520681 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97a5e929-b4bb-4c07-814c-980b709b5368-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.686919 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-56vq6"] Jan 20 23:54:01 crc kubenswrapper[5030]: W0120 23:54:01.825044 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e334cad_3adc_4d91_aa9c_3e68815b750f.slice/crio-f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be WatchSource:0}: Error finding container f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be: Status 404 returned error can't find the container with id f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.825441 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs"] Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.874252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" event={"ID":"97a5e929-b4bb-4c07-814c-980b709b5368","Type":"ContainerDied","Data":"d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a"} Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.874306 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d081033fbf300dd596c64be5cc92e8b05270a79b870b99ae604a95dc786be35a" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.874418 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-v4kgx" Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.876296 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-56vq6" event={"ID":"923cb7da-1207-4cb6-b35f-2a8284253596","Type":"ContainerStarted","Data":"c75baeda2d353e37cf4caec1f8650337e4be517e3683b23b84fd61d692fa7627"} Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.877399 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" event={"ID":"9e334cad-3adc-4d91-aa9c-3e68815b750f","Type":"ContainerStarted","Data":"f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be"} Jan 20 23:54:01 crc kubenswrapper[5030]: I0120 23:54:01.897903 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-create-56vq6" podStartSLOduration=1.897882354 podStartE2EDuration="1.897882354s" podCreationTimestamp="2026-01-20 23:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:01.890939975 +0000 UTC m=+4714.211200273" watchObservedRunningTime="2026-01-20 23:54:01.897882354 +0000 UTC m=+4714.218142652" Jan 20 23:54:02 crc kubenswrapper[5030]: I0120 23:54:02.435351 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:54:02 crc kubenswrapper[5030]: E0120 23:54:02.436060 5030 projected.go:288] Couldn't get configMap openstack-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 20 23:54:02 crc kubenswrapper[5030]: E0120 23:54:02.436086 5030 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 20 23:54:02 crc kubenswrapper[5030]: E0120 23:54:02.436155 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift podName:647882ed-cb41-417a-98bc-5a6bfbf0dd2a nodeName:}" failed. No retries permitted until 2026-01-20 23:54:10.436132111 +0000 UTC m=+4722.756392439 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift") pod "swift-storage-0" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a") : configmap "swift-ring-files" not found Jan 20 23:54:02 crc kubenswrapper[5030]: I0120 23:54:02.891326 5030 generic.go:334] "Generic (PLEG): container finished" podID="923cb7da-1207-4cb6-b35f-2a8284253596" containerID="d506b46d4826050d78b8587c50046a77fb0b06b3be11d10cf58e4bb8089509c7" exitCode=0 Jan 20 23:54:02 crc kubenswrapper[5030]: I0120 23:54:02.891404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-56vq6" event={"ID":"923cb7da-1207-4cb6-b35f-2a8284253596","Type":"ContainerDied","Data":"d506b46d4826050d78b8587c50046a77fb0b06b3be11d10cf58e4bb8089509c7"} Jan 20 23:54:02 crc kubenswrapper[5030]: I0120 23:54:02.893535 5030 generic.go:334] "Generic (PLEG): container finished" podID="9e334cad-3adc-4d91-aa9c-3e68815b750f" containerID="9ee24ed4cf32f69dd0f0b0150b80f0e1548fae2992047a77d4538ce351224327" exitCode=0 Jan 20 23:54:02 crc kubenswrapper[5030]: I0120 23:54:02.893654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" event={"ID":"9e334cad-3adc-4d91-aa9c-3e68815b750f","Type":"ContainerDied","Data":"9ee24ed4cf32f69dd0f0b0150b80f0e1548fae2992047a77d4538ce351224327"} Jan 20 23:54:03 crc kubenswrapper[5030]: I0120 23:54:03.913360 5030 generic.go:334] "Generic (PLEG): container finished" podID="32dae846-6ace-453b-a3eb-b237739433c6" containerID="e2808049c7a50fc3f17f6bc197d8b642e6960f22f7752674054c0cb0f66685ef" exitCode=0 Jan 20 23:54:03 crc kubenswrapper[5030]: I0120 23:54:03.913469 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" event={"ID":"32dae846-6ace-453b-a3eb-b237739433c6","Type":"ContainerDied","Data":"e2808049c7a50fc3f17f6bc197d8b642e6960f22f7752674054c0cb0f66685ef"} Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.400598 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.407085 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.473545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xtzw\" (UniqueName: \"kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw\") pod \"923cb7da-1207-4cb6-b35f-2a8284253596\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.473633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts\") pod \"9e334cad-3adc-4d91-aa9c-3e68815b750f\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.473762 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj5f9\" (UniqueName: \"kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9\") pod \"9e334cad-3adc-4d91-aa9c-3e68815b750f\" (UID: \"9e334cad-3adc-4d91-aa9c-3e68815b750f\") " Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.473918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts\") pod \"923cb7da-1207-4cb6-b35f-2a8284253596\" (UID: \"923cb7da-1207-4cb6-b35f-2a8284253596\") " Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.474279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "923cb7da-1207-4cb6-b35f-2a8284253596" (UID: "923cb7da-1207-4cb6-b35f-2a8284253596"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.474340 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e334cad-3adc-4d91-aa9c-3e68815b750f" (UID: "9e334cad-3adc-4d91-aa9c-3e68815b750f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.479935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw" (OuterVolumeSpecName: "kube-api-access-5xtzw") pod "923cb7da-1207-4cb6-b35f-2a8284253596" (UID: "923cb7da-1207-4cb6-b35f-2a8284253596"). InnerVolumeSpecName "kube-api-access-5xtzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.480113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9" (OuterVolumeSpecName: "kube-api-access-fj5f9") pod "9e334cad-3adc-4d91-aa9c-3e68815b750f" (UID: "9e334cad-3adc-4d91-aa9c-3e68815b750f"). InnerVolumeSpecName "kube-api-access-fj5f9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.576174 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/923cb7da-1207-4cb6-b35f-2a8284253596-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.576213 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xtzw\" (UniqueName: \"kubernetes.io/projected/923cb7da-1207-4cb6-b35f-2a8284253596-kube-api-access-5xtzw\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.576226 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e334cad-3adc-4d91-aa9c-3e68815b750f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.576238 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj5f9\" (UniqueName: \"kubernetes.io/projected/9e334cad-3adc-4d91-aa9c-3e68815b750f-kube-api-access-fj5f9\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.581178 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v4kgx"] Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.618198 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-v4kgx"] Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.960484 5030 generic.go:334] "Generic (PLEG): container finished" podID="c43aca93-d5a2-4ec9-bac6-6ac012e45271" containerID="5b1afaba37cd88267c0434830d481b7db9a7f6660005d0a8a40dae36c79fe289" exitCode=0 Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.960594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" event={"ID":"c43aca93-d5a2-4ec9-bac6-6ac012e45271","Type":"ContainerDied","Data":"5b1afaba37cd88267c0434830d481b7db9a7f6660005d0a8a40dae36c79fe289"} Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.967289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" event={"ID":"9e334cad-3adc-4d91-aa9c-3e68815b750f","Type":"ContainerDied","Data":"f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be"} Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.967340 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b944f01b6430564c768f8ca4c0d6bfe2b06f0f2a38eb7a09d22d9e3fd642be" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.967340 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.971514 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-create-56vq6" Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.971857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-create-56vq6" event={"ID":"923cb7da-1207-4cb6-b35f-2a8284253596","Type":"ContainerDied","Data":"c75baeda2d353e37cf4caec1f8650337e4be517e3683b23b84fd61d692fa7627"} Jan 20 23:54:04 crc kubenswrapper[5030]: I0120 23:54:04.971941 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c75baeda2d353e37cf4caec1f8650337e4be517e3683b23b84fd61d692fa7627" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.409404 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.590723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.590818 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.590895 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.590966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzppj\" (UniqueName: \"kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.591008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.591044 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.591109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts\") pod \"32dae846-6ace-453b-a3eb-b237739433c6\" (UID: \"32dae846-6ace-453b-a3eb-b237739433c6\") " Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.591902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod 
"32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.592724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.596470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj" (OuterVolumeSpecName: "kube-api-access-gzppj") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "kube-api-access-gzppj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.600887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.611247 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts" (OuterVolumeSpecName: "scripts") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.629082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.643819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "32dae846-6ace-453b-a3eb-b237739433c6" (UID: "32dae846-6ace-453b-a3eb-b237739433c6"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695403 5030 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695475 5030 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695502 5030 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695533 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzppj\" (UniqueName: \"kubernetes.io/projected/32dae846-6ace-453b-a3eb-b237739433c6-kube-api-access-gzppj\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695565 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/32dae846-6ace-453b-a3eb-b237739433c6-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695594 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32dae846-6ace-453b-a3eb-b237739433c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.695656 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32dae846-6ace-453b-a3eb-b237739433c6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:05 crc kubenswrapper[5030]: I0120 23:54:05.997028 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a5e929-b4bb-4c07-814c-980b709b5368" path="/var/lib/kubelet/pods/97a5e929-b4bb-4c07-814c-980b709b5368/volumes" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.008398 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.012122 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-ring-rebalance-7vfh9" event={"ID":"32dae846-6ace-453b-a3eb-b237739433c6","Type":"ContainerDied","Data":"84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56"} Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.012225 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84dde2368b6a13b622a7bed51b3c2f7b230f4976a763984a63cb3c64ae3cbf56" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.483905 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.614547 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data\") pod \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.615392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntx66\" (UniqueName: \"kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66\") pod \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.615491 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle\") pod \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.615525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data\") pod \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\" (UID: \"c43aca93-d5a2-4ec9-bac6-6ac012e45271\") " Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.619132 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c43aca93-d5a2-4ec9-bac6-6ac012e45271" (UID: "c43aca93-d5a2-4ec9-bac6-6ac012e45271"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.619302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66" (OuterVolumeSpecName: "kube-api-access-ntx66") pod "c43aca93-d5a2-4ec9-bac6-6ac012e45271" (UID: "c43aca93-d5a2-4ec9-bac6-6ac012e45271"). InnerVolumeSpecName "kube-api-access-ntx66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.656660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c43aca93-d5a2-4ec9-bac6-6ac012e45271" (UID: "c43aca93-d5a2-4ec9-bac6-6ac012e45271"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.664434 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data" (OuterVolumeSpecName: "config-data") pod "c43aca93-d5a2-4ec9-bac6-6ac012e45271" (UID: "c43aca93-d5a2-4ec9-bac6-6ac012e45271"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.717336 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.717398 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntx66\" (UniqueName: \"kubernetes.io/projected/c43aca93-d5a2-4ec9-bac6-6ac012e45271-kube-api-access-ntx66\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.717412 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:06 crc kubenswrapper[5030]: I0120 23:54:06.717422 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c43aca93-d5a2-4ec9-bac6-6ac012e45271-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:07 crc kubenswrapper[5030]: I0120 23:54:07.022301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" event={"ID":"c43aca93-d5a2-4ec9-bac6-6ac012e45271","Type":"ContainerDied","Data":"9dd4a3fd4d444478d1c873055c01bc8cc2fc9cd94b582f5a1ca327f132ffd3f4"} Jan 20 23:54:07 crc kubenswrapper[5030]: I0120 23:54:07.022734 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd4a3fd4d444478d1c873055c01bc8cc2fc9cd94b582f5a1ca327f132ffd3f4" Jan 20 23:54:07 crc kubenswrapper[5030]: I0120 23:54:07.022383 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-db-sync-zt8z9" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.606524 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tjp4j"] Jan 20 23:54:09 crc kubenswrapper[5030]: E0120 23:54:09.607116 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e334cad-3adc-4d91-aa9c-3e68815b750f" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607140 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e334cad-3adc-4d91-aa9c-3e68815b750f" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: E0120 23:54:09.607157 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a5e929-b4bb-4c07-814c-980b709b5368" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607163 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a5e929-b4bb-4c07-814c-980b709b5368" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: E0120 23:54:09.607184 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32dae846-6ace-453b-a3eb-b237739433c6" containerName="swift-ring-rebalance" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607191 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dae846-6ace-453b-a3eb-b237739433c6" containerName="swift-ring-rebalance" Jan 20 23:54:09 crc kubenswrapper[5030]: E0120 23:54:09.607205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43aca93-d5a2-4ec9-bac6-6ac012e45271" containerName="glance-db-sync" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607210 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43aca93-d5a2-4ec9-bac6-6ac012e45271" containerName="glance-db-sync" Jan 20 23:54:09 crc kubenswrapper[5030]: E0120 23:54:09.607221 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923cb7da-1207-4cb6-b35f-2a8284253596" containerName="mariadb-database-create" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607227 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="923cb7da-1207-4cb6-b35f-2a8284253596" containerName="mariadb-database-create" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607364 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a5e929-b4bb-4c07-814c-980b709b5368" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607380 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e334cad-3adc-4d91-aa9c-3e68815b750f" containerName="mariadb-account-create-update" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607390 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="32dae846-6ace-453b-a3eb-b237739433c6" containerName="swift-ring-rebalance" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607402 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="923cb7da-1207-4cb6-b35f-2a8284253596" containerName="mariadb-database-create" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607411 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c43aca93-d5a2-4ec9-bac6-6ac012e45271" containerName="glance-db-sync" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.607971 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.616151 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.619855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tjp4j"] Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.674573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.674845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgc78\" (UniqueName: \"kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.776808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgc78\" (UniqueName: \"kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.776963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.777877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.807116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgc78\" (UniqueName: \"kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78\") pod \"root-account-create-update-tjp4j\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:09 crc kubenswrapper[5030]: I0120 23:54:09.930723 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:10 crc kubenswrapper[5030]: I0120 23:54:10.440300 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tjp4j"] Jan 20 23:54:10 crc kubenswrapper[5030]: I0120 23:54:10.493734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:54:10 crc kubenswrapper[5030]: I0120 23:54:10.500679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"swift-storage-0\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:54:10 crc kubenswrapper[5030]: I0120 23:54:10.767333 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:54:11 crc kubenswrapper[5030]: I0120 23:54:11.087033 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" containerID="8a9629589aba37ec1cf676a565b634a2dbb74ba8259a834b9d096792ba716cc6" exitCode=0 Jan 20 23:54:11 crc kubenswrapper[5030]: I0120 23:54:11.087549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" event={"ID":"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0","Type":"ContainerDied","Data":"8a9629589aba37ec1cf676a565b634a2dbb74ba8259a834b9d096792ba716cc6"} Jan 20 23:54:11 crc kubenswrapper[5030]: I0120 23:54:11.087631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" event={"ID":"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0","Type":"ContainerStarted","Data":"623f3833aba2905bc013095cac1b17d5f1127938ab76d7466697be87d51d83da"} Jan 20 23:54:11 crc kubenswrapper[5030]: I0120 23:54:11.291397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:54:11 crc kubenswrapper[5030]: I0120 23:54:11.442923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.101989 5030 generic.go:334] "Generic (PLEG): container finished" podID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerID="060f9ab9280053821ca2e9dbdfd76e7b2e79ed01983ce958e604a573ce49b483" exitCode=0 Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.102051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerDied","Data":"060f9ab9280053821ca2e9dbdfd76e7b2e79ed01983ce958e604a573ce49b483"} Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.110530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c"} Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.110584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c"} Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.110605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817"} Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.110643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"846ef75ed7641319a1122e200cc114c276facc1edc4d1b914c9ee658dab575ff"} Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.479869 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.630185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts\") pod \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.630322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgc78\" (UniqueName: \"kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78\") pod \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\" (UID: \"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0\") " Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.630860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" (UID: "ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.636530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78" (OuterVolumeSpecName: "kube-api-access-rgc78") pod "ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" (UID: "ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0"). InnerVolumeSpecName "kube-api-access-rgc78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.732279 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgc78\" (UniqueName: \"kubernetes.io/projected/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-kube-api-access-rgc78\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:12 crc kubenswrapper[5030]: I0120 23:54:12.732334 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.122808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerStarted","Data":"016af194f1eb4817af713ac5bfe3322cf5ee11ad44e3b6dcfc50cfc5c79e5f55"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.123180 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.124477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" event={"ID":"ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0","Type":"ContainerDied","Data":"623f3833aba2905bc013095cac1b17d5f1127938ab76d7466697be87d51d83da"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.124515 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-tjp4j" Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.124523 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623f3833aba2905bc013095cac1b17d5f1127938ab76d7466697be87d51d83da" Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.126240 5030 generic.go:334] "Generic (PLEG): container finished" podID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerID="af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da" exitCode=0 Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.126327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerDied","Data":"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.131601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.131682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.131702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.131720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.131738 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d"} Jan 20 23:54:13 crc kubenswrapper[5030]: I0120 23:54:13.177186 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podStartSLOduration=38.177163573 podStartE2EDuration="38.177163573s" podCreationTimestamp="2026-01-20 23:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:13.16797858 +0000 UTC m=+4725.488238908" watchObservedRunningTime="2026-01-20 23:54:13.177163573 +0000 UTC m=+4725.497423891" Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.141924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerStarted","Data":"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138"} Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.142753 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.149144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad"} Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.149195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc"} Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.149211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754"} Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.149224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7"} Jan 20 23:54:14 crc kubenswrapper[5030]: I0120 23:54:14.180362 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/rabbitmq-server-0" podStartSLOduration=38.180345559 podStartE2EDuration="38.180345559s" podCreationTimestamp="2026-01-20 23:53:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:14.180311248 +0000 UTC m=+4726.500571536" watchObservedRunningTime="2026-01-20 23:54:14.180345559 +0000 UTC m=+4726.500605847" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.162404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f"} Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.162447 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e"} Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.162460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerStarted","Data":"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017"} Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.206008 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-storage-0" podStartSLOduration=22.205980639 podStartE2EDuration="22.205980639s" podCreationTimestamp="2026-01-20 23:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:15.195953106 +0000 UTC m=+4727.516213394" watchObservedRunningTime="2026-01-20 23:54:15.205980639 +0000 UTC m=+4727.526240967" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.370048 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:54:15 crc kubenswrapper[5030]: E0120 23:54:15.370382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" containerName="mariadb-account-create-update" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.370396 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" containerName="mariadb-account-create-update" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.370563 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" containerName="mariadb-account-create-update" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.371351 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: W0120 23:54:15.373813 5030 reflector.go:561] object-"openstack-kuttl-tests"/"dns-swift-storage-0": failed to list *v1.ConfigMap: configmaps "dns-swift-storage-0" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openstack-kuttl-tests": no relationship found between node 'crc' and this object Jan 20 23:54:15 crc kubenswrapper[5030]: E0120 23:54:15.373885 5030 reflector.go:158] "Unhandled Error" err="object-\"openstack-kuttl-tests\"/\"dns-swift-storage-0\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"dns-swift-storage-0\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openstack-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.395406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.485156 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.485275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.485301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.485358 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ktj\" (UniqueName: \"kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.586848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.586945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: 
\"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.586986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.587058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ktj\" (UniqueName: \"kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.588514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.588563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:15 crc kubenswrapper[5030]: I0120 23:54:15.619906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ktj\" (UniqueName: \"kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:16 crc kubenswrapper[5030]: E0120 23:54:16.588054 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/dns-swift-storage-0: failed to sync configmap cache: timed out waiting for the condition Jan 20 23:54:16 crc kubenswrapper[5030]: E0120 23:54:16.588189 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0 podName:b4804160-77d4-4618-91cc-85015bbed1a5 nodeName:}" failed. No retries permitted until 2026-01-20 23:54:17.088158739 +0000 UTC m=+4729.408419057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "dns-swift-storage-0" (UniqueName: "kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0") pod "dnsmasq-dnsmasq-78c878c4cc-h9vb6" (UID: "b4804160-77d4-4618-91cc-85015bbed1a5") : failed to sync configmap cache: timed out waiting for the condition Jan 20 23:54:16 crc kubenswrapper[5030]: I0120 23:54:16.724535 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"dns-swift-storage-0" Jan 20 23:54:17 crc kubenswrapper[5030]: I0120 23:54:17.118061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:17 crc kubenswrapper[5030]: I0120 23:54:17.119162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") pod \"dnsmasq-dnsmasq-78c878c4cc-h9vb6\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:17 crc kubenswrapper[5030]: I0120 23:54:17.191279 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:17 crc kubenswrapper[5030]: I0120 23:54:17.757121 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:54:17 crc kubenswrapper[5030]: W0120 23:54:17.764159 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4804160_77d4_4618_91cc_85015bbed1a5.slice/crio-392c79245c6663f862f9fc08d57cadf74130f19b62ef4e9027c766970d337840 WatchSource:0}: Error finding container 392c79245c6663f862f9fc08d57cadf74130f19b62ef4e9027c766970d337840: Status 404 returned error can't find the container with id 392c79245c6663f862f9fc08d57cadf74130f19b62ef4e9027c766970d337840 Jan 20 23:54:18 crc kubenswrapper[5030]: I0120 23:54:18.209074 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4804160-77d4-4618-91cc-85015bbed1a5" containerID="615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e" exitCode=0 Jan 20 23:54:18 crc kubenswrapper[5030]: I0120 23:54:18.209164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" event={"ID":"b4804160-77d4-4618-91cc-85015bbed1a5","Type":"ContainerDied","Data":"615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e"} Jan 20 23:54:18 crc kubenswrapper[5030]: I0120 23:54:18.209453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" event={"ID":"b4804160-77d4-4618-91cc-85015bbed1a5","Type":"ContainerStarted","Data":"392c79245c6663f862f9fc08d57cadf74130f19b62ef4e9027c766970d337840"} Jan 20 23:54:19 crc kubenswrapper[5030]: I0120 23:54:19.222436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" event={"ID":"b4804160-77d4-4618-91cc-85015bbed1a5","Type":"ContainerStarted","Data":"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c"} Jan 20 23:54:19 crc kubenswrapper[5030]: I0120 
23:54:19.222750 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:19 crc kubenswrapper[5030]: I0120 23:54:19.258095 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" podStartSLOduration=4.258063657 podStartE2EDuration="4.258063657s" podCreationTimestamp="2026-01-20 23:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:19.24828575 +0000 UTC m=+4731.568546068" watchObservedRunningTime="2026-01-20 23:54:19.258063657 +0000 UTC m=+4731.578323975" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.490387 5030 scope.go:117] "RemoveContainer" containerID="701f43b2568e312e9772d92318bb41bb84f4409a0d9a2c0eb6f8605d404a8b06" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.513232 5030 scope.go:117] "RemoveContainer" containerID="fcc1c5fccc13caf607c55b1c6f3a6b1f7b218f35a0ce717dc443ff2c2713f71d" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.576560 5030 scope.go:117] "RemoveContainer" containerID="caabf9e91a5b98c65a9956848d602849f629932b869a201628353e0848aa2e94" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.626450 5030 scope.go:117] "RemoveContainer" containerID="b5668c5767465d1ac0a059f35961c95c3750890a2ab067c7e40c0fd02ef0a90b" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.668532 5030 scope.go:117] "RemoveContainer" containerID="d2fed7ec38d03a6a43221941b4cffa52c915ff185c27683474988eb9263cdca6" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.731340 5030 scope.go:117] "RemoveContainer" containerID="de7d98c8e93e6a2cb05b7a6bf897c11d83d20b98e812427b2724dfe9cd548087" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.785207 5030 scope.go:117] "RemoveContainer" containerID="ae8a26acea412ba89d4499dc41189e13598d4aa8e2f33437732b1f4b93e89caa" Jan 20 23:54:20 crc kubenswrapper[5030]: I0120 23:54:20.824848 5030 scope.go:117] "RemoveContainer" containerID="c62780d3e5840b74bc2429c44c1168ee6a179caac4010c37ab7ee9c316f92d48" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.008859 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.196767 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.282212 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.282779 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="dnsmasq-dns" containerID="cri-o://18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3" gracePeriod=10 Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.732219 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.856893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc\") pod \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.857027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57w9\" (UniqueName: \"kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9\") pod \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.857148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config\") pod \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\" (UID: \"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8\") " Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.869776 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9" (OuterVolumeSpecName: "kube-api-access-s57w9") pod "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" (UID: "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8"). InnerVolumeSpecName "kube-api-access-s57w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.915697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" (UID: "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.926585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config" (OuterVolumeSpecName: "config") pod "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" (UID: "1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.959784 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57w9\" (UniqueName: \"kubernetes.io/projected/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-kube-api-access-s57w9\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.959844 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:27 crc kubenswrapper[5030]: I0120 23:54:27.959866 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.325644 5030 generic.go:334] "Generic (PLEG): container finished" podID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerID="18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3" exitCode=0 Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.325708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" event={"ID":"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8","Type":"ContainerDied","Data":"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3"} Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.325780 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" event={"ID":"1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8","Type":"ContainerDied","Data":"1cda944e55a41a98fa02b935f7fc543369804f1c4b521d2fa93d75906c580752"} Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.325786 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.325811 5030 scope.go:117] "RemoveContainer" containerID="18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.359819 5030 scope.go:117] "RemoveContainer" containerID="189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.364181 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.376848 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-zm7tk"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.392206 5030 scope.go:117] "RemoveContainer" containerID="18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3" Jan 20 23:54:28 crc kubenswrapper[5030]: E0120 23:54:28.393021 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3\": container with ID starting with 18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3 not found: ID does not exist" containerID="18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.393100 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3"} err="failed to get container status \"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3\": rpc error: code = NotFound desc = could not find container \"18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3\": container with ID starting with 18a0356c446c290186901af3ff696b1d91139274654c9abd64753ddb60bccff3 not found: ID does not exist" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.393160 5030 scope.go:117] "RemoveContainer" containerID="189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a" Jan 20 23:54:28 crc kubenswrapper[5030]: E0120 23:54:28.393704 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a\": container with ID starting with 189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a not found: ID does not exist" containerID="189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.393761 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a"} err="failed to get container status \"189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a\": rpc error: code = NotFound desc = could not find container \"189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a\": container with ID starting with 189d313043fe3d3466237f794d2223799a58ae3d17ec95a0a6e3330a34d98b2a not found: ID does not exist" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.406026 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.727002 5030 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4vbjd"] Jan 20 23:54:28 crc kubenswrapper[5030]: E0120 23:54:28.727518 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="dnsmasq-dns" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.727535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="dnsmasq-dns" Jan 20 23:54:28 crc kubenswrapper[5030]: E0120 23:54:28.727560 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="init" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.727566 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="init" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.727729 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" containerName="dnsmasq-dns" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.728207 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.743974 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4vbjd"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.832595 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-create-22v4w"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.833542 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.845225 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-22v4w"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.875473 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjr59\" (UniqueName: \"kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.875558 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.929492 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.930430 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.932223 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.944406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.977529 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjr59\" (UniqueName: \"kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.977614 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtx6s\" (UniqueName: \"kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.977825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.977899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.978739 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.983503 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-create-54lmm"] Jan 20 23:54:28 crc kubenswrapper[5030]: I0120 23:54:28.985726 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.008169 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.009384 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.014211 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-db-secret" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.021424 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-54lmm"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.027659 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjr59\" (UniqueName: \"kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59\") pod \"cinder-db-create-4vbjd\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.032328 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.053491 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5bww\" (UniqueName: \"kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtx6s\" (UniqueName: \"kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080924 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.080976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swc6f\" (UniqueName: 
\"kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.081933 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.096022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtx6s\" (UniqueName: \"kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s\") pod \"barbican-db-create-22v4w\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.132446 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.133500 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.135411 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.144665 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pcjpl"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.146024 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.147103 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.149676 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.149851 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.150134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4w7gr" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.150298 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.159783 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.170389 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pcjpl"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.183819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.183907 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbjc9\" (UniqueName: \"kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.183945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.183966 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swc6f\" (UniqueName: \"kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.184060 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.184089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5bww\" (UniqueName: 
\"kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.184577 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.185183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.210657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swc6f\" (UniqueName: \"kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f\") pod \"barbican-4d16-account-create-update-mwwbg\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.215155 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5bww\" (UniqueName: \"kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww\") pod \"neutron-db-create-54lmm\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.243117 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285351 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285404 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjc9\" (UniqueName: \"kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjrs\" (UniqueName: \"kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.285549 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ktx\" (UniqueName: \"kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.286434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc 
kubenswrapper[5030]: I0120 23:54:29.301817 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.307192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjc9\" (UniqueName: \"kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9\") pod \"cinder-7dea-account-create-update-mhshk\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.360589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.388921 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ktx\" (UniqueName: \"kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.388975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.389003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.389023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.389104 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjrs\" (UniqueName: \"kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.392502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.396911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle\") pod 
\"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.415379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.415919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjrs\" (UniqueName: \"kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs\") pod \"neutron-9d9a-account-create-update-ngxq8\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.416765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ktx\" (UniqueName: \"kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx\") pod \"keystone-db-sync-pcjpl\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.466348 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.482088 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.642283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-22v4w"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.679115 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4vbjd"] Jan 20 23:54:29 crc kubenswrapper[5030]: W0120 23:54:29.682775 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6f83e7_540c_419e_b09f_1a49dc622f40.slice/crio-8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c WatchSource:0}: Error finding container 8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c: Status 404 returned error can't find the container with id 8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.830385 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-54lmm"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.841117 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg"] Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.975959 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8" path="/var/lib/kubelet/pods/1e3a8a28-350e-4704-a9dc-09fa5b1b7bd8/volumes" Jan 20 23:54:29 crc kubenswrapper[5030]: I0120 23:54:29.979791 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk"] Jan 20 23:54:29 crc kubenswrapper[5030]: W0120 23:54:29.989927 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod489812a0_ca0d_4259_a487_7c4840c0a7d5.slice/crio-7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419 WatchSource:0}: Error finding container 7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419: Status 404 returned error can't find the container with id 7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.048446 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8"] Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.064363 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pcjpl"] Jan 20 23:54:30 crc kubenswrapper[5030]: W0120 23:54:30.111861 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e30653e_ea2a_4d57_b605_c35cf41208c4.slice/crio-8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2 WatchSource:0}: Error finding container 8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2: Status 404 returned error can't find the container with id 8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2 Jan 20 23:54:30 crc kubenswrapper[5030]: W0120 23:54:30.112340 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod691512e9_1781_4ce2_9d4d_c1787adb07c5.slice/crio-3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705 WatchSource:0}: Error finding container 3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705: Status 404 returned error can't find the container with id 3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.368572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" event={"ID":"2e30653e-ea2a-4d57-b605-c35cf41208c4","Type":"ContainerStarted","Data":"87a82cba312227a3eedb47a0c268697ec415a32cbe79e06eaeca52d69f0bc499"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.368938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" event={"ID":"2e30653e-ea2a-4d57-b605-c35cf41208c4","Type":"ContainerStarted","Data":"8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.370088 5030 generic.go:334] "Generic (PLEG): container finished" podID="f2da3223-e170-45c0-a464-2b91d405f3c0" containerID="a5c4433b03820c6d18621f9c8f81c83389f156d4d17147855bf0f160cd803274" exitCode=0 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.370141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-22v4w" event={"ID":"f2da3223-e170-45c0-a464-2b91d405f3c0","Type":"ContainerDied","Data":"a5c4433b03820c6d18621f9c8f81c83389f156d4d17147855bf0f160cd803274"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.370163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-22v4w" event={"ID":"f2da3223-e170-45c0-a464-2b91d405f3c0","Type":"ContainerStarted","Data":"6ee8656763822093fcac7f56e902dedb0e30db954eef7f6c5730f81cf9e084c3"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.371694 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="331a63fb-2148-49d9-ab0a-7dc819c26bea" containerID="446598bf5fc66973129bf823658cc975cc96b4b3543a75bed93814311423a045" exitCode=0 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.371750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" event={"ID":"331a63fb-2148-49d9-ab0a-7dc819c26bea","Type":"ContainerDied","Data":"446598bf5fc66973129bf823658cc975cc96b4b3543a75bed93814311423a045"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.371769 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" event={"ID":"331a63fb-2148-49d9-ab0a-7dc819c26bea","Type":"ContainerStarted","Data":"4946ef767e5f1367fddff67024fc27c17ae271503b23e333efadd20bf38aabe0"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.376160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" event={"ID":"691512e9-1781-4ce2-9d4d-c1787adb07c5","Type":"ContainerStarted","Data":"6fd0a762eb361ee47aaa848aac648360188300b079c2102d8a6884e136b43630"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.376206 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" event={"ID":"691512e9-1781-4ce2-9d4d-c1787adb07c5","Type":"ContainerStarted","Data":"3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.377496 5030 generic.go:334] "Generic (PLEG): container finished" podID="f386591d-bc9f-4389-b61b-e4a830c025f9" containerID="ffa3ae677b3ee3825734e167538f781603c6acc1215aa641ced6dbbfe5a472a5" exitCode=0 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.377526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-54lmm" event={"ID":"f386591d-bc9f-4389-b61b-e4a830c025f9","Type":"ContainerDied","Data":"ffa3ae677b3ee3825734e167538f781603c6acc1215aa641ced6dbbfe5a472a5"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.377561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-54lmm" event={"ID":"f386591d-bc9f-4389-b61b-e4a830c025f9","Type":"ContainerStarted","Data":"a9d128f21e2bd839f473d002ad7086c7ba08a4384df35b67ee6e5222b78112cf"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.379052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" event={"ID":"489812a0-ca0d-4259-a487-7c4840c0a7d5","Type":"ContainerStarted","Data":"54c63ead713d1354abf474373098647909a2d843a7ad797ffe6704b15a66b178"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.379079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" event={"ID":"489812a0-ca0d-4259-a487-7c4840c0a7d5","Type":"ContainerStarted","Data":"7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.381544 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff6f83e7-540c-419e-b09f-1a49dc622f40" containerID="59d69580f270ae36f95b6b71e7235da82769f6ef8d082e2ada36c0ab220149e3" exitCode=0 Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.381582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" 
event={"ID":"ff6f83e7-540c-419e-b09f-1a49dc622f40","Type":"ContainerDied","Data":"59d69580f270ae36f95b6b71e7235da82769f6ef8d082e2ada36c0ab220149e3"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.381603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" event={"ID":"ff6f83e7-540c-419e-b09f-1a49dc622f40","Type":"ContainerStarted","Data":"8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c"} Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.391205 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" podStartSLOduration=1.391189771 podStartE2EDuration="1.391189771s" podCreationTimestamp="2026-01-20 23:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:30.384614522 +0000 UTC m=+4742.704874810" watchObservedRunningTime="2026-01-20 23:54:30.391189771 +0000 UTC m=+4742.711450059" Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.487696 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" podStartSLOduration=1.487683652 podStartE2EDuration="1.487683652s" podCreationTimestamp="2026-01-20 23:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:30.483927451 +0000 UTC m=+4742.804187739" watchObservedRunningTime="2026-01-20 23:54:30.487683652 +0000 UTC m=+4742.807943930" Jan 20 23:54:30 crc kubenswrapper[5030]: I0120 23:54:30.509928 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" podStartSLOduration=2.509907341 podStartE2EDuration="2.509907341s" podCreationTimestamp="2026-01-20 23:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:30.501345074 +0000 UTC m=+4742.821605352" watchObservedRunningTime="2026-01-20 23:54:30.509907341 +0000 UTC m=+4742.830167629" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.402364 5030 generic.go:334] "Generic (PLEG): container finished" podID="691512e9-1781-4ce2-9d4d-c1787adb07c5" containerID="6fd0a762eb361ee47aaa848aac648360188300b079c2102d8a6884e136b43630" exitCode=0 Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.402456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" event={"ID":"691512e9-1781-4ce2-9d4d-c1787adb07c5","Type":"ContainerDied","Data":"6fd0a762eb361ee47aaa848aac648360188300b079c2102d8a6884e136b43630"} Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.407289 5030 generic.go:334] "Generic (PLEG): container finished" podID="489812a0-ca0d-4259-a487-7c4840c0a7d5" containerID="54c63ead713d1354abf474373098647909a2d843a7ad797ffe6704b15a66b178" exitCode=0 Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.407473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" event={"ID":"489812a0-ca0d-4259-a487-7c4840c0a7d5","Type":"ContainerDied","Data":"54c63ead713d1354abf474373098647909a2d843a7ad797ffe6704b15a66b178"} Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.850332 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.937898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts\") pod \"ff6f83e7-540c-419e-b09f-1a49dc622f40\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.938023 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjr59\" (UniqueName: \"kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59\") pod \"ff6f83e7-540c-419e-b09f-1a49dc622f40\" (UID: \"ff6f83e7-540c-419e-b09f-1a49dc622f40\") " Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.938564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff6f83e7-540c-419e-b09f-1a49dc622f40" (UID: "ff6f83e7-540c-419e-b09f-1a49dc622f40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.940111 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.943697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59" (OuterVolumeSpecName: "kube-api-access-kjr59") pod "ff6f83e7-540c-419e-b09f-1a49dc622f40" (UID: "ff6f83e7-540c-419e-b09f-1a49dc622f40"). InnerVolumeSpecName "kube-api-access-kjr59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.984357 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:31 crc kubenswrapper[5030]: I0120 23:54:31.989740 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.040394 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts\") pod \"f2da3223-e170-45c0-a464-2b91d405f3c0\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.040456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtx6s\" (UniqueName: \"kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s\") pod \"f2da3223-e170-45c0-a464-2b91d405f3c0\" (UID: \"f2da3223-e170-45c0-a464-2b91d405f3c0\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.040918 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjr59\" (UniqueName: \"kubernetes.io/projected/ff6f83e7-540c-419e-b09f-1a49dc622f40-kube-api-access-kjr59\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.040936 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6f83e7-540c-419e-b09f-1a49dc622f40-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.042726 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2da3223-e170-45c0-a464-2b91d405f3c0" (UID: "f2da3223-e170-45c0-a464-2b91d405f3c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.045904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s" (OuterVolumeSpecName: "kube-api-access-xtx6s") pod "f2da3223-e170-45c0-a464-2b91d405f3c0" (UID: "f2da3223-e170-45c0-a464-2b91d405f3c0"). InnerVolumeSpecName "kube-api-access-xtx6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.141570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swc6f\" (UniqueName: \"kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f\") pod \"331a63fb-2148-49d9-ab0a-7dc819c26bea\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.141653 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5bww\" (UniqueName: \"kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww\") pod \"f386591d-bc9f-4389-b61b-e4a830c025f9\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.141829 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts\") pod \"f386591d-bc9f-4389-b61b-e4a830c025f9\" (UID: \"f386591d-bc9f-4389-b61b-e4a830c025f9\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.141847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts\") pod \"331a63fb-2148-49d9-ab0a-7dc819c26bea\" (UID: \"331a63fb-2148-49d9-ab0a-7dc819c26bea\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.142160 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2da3223-e170-45c0-a464-2b91d405f3c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.142179 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtx6s\" (UniqueName: \"kubernetes.io/projected/f2da3223-e170-45c0-a464-2b91d405f3c0-kube-api-access-xtx6s\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.142243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f386591d-bc9f-4389-b61b-e4a830c025f9" (UID: "f386591d-bc9f-4389-b61b-e4a830c025f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.142334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "331a63fb-2148-49d9-ab0a-7dc819c26bea" (UID: "331a63fb-2148-49d9-ab0a-7dc819c26bea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.145801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww" (OuterVolumeSpecName: "kube-api-access-c5bww") pod "f386591d-bc9f-4389-b61b-e4a830c025f9" (UID: "f386591d-bc9f-4389-b61b-e4a830c025f9"). InnerVolumeSpecName "kube-api-access-c5bww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.146701 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f" (OuterVolumeSpecName: "kube-api-access-swc6f") pod "331a63fb-2148-49d9-ab0a-7dc819c26bea" (UID: "331a63fb-2148-49d9-ab0a-7dc819c26bea"). InnerVolumeSpecName "kube-api-access-swc6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.243441 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5bww\" (UniqueName: \"kubernetes.io/projected/f386591d-bc9f-4389-b61b-e4a830c025f9-kube-api-access-c5bww\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.243474 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f386591d-bc9f-4389-b61b-e4a830c025f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.243484 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/331a63fb-2148-49d9-ab0a-7dc819c26bea-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.243495 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swc6f\" (UniqueName: \"kubernetes.io/projected/331a63fb-2148-49d9-ab0a-7dc819c26bea-kube-api-access-swc6f\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.417718 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-create-22v4w" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.417740 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-create-22v4w" event={"ID":"f2da3223-e170-45c0-a464-2b91d405f3c0","Type":"ContainerDied","Data":"6ee8656763822093fcac7f56e902dedb0e30db954eef7f6c5730f81cf9e084c3"} Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.417786 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ee8656763822093fcac7f56e902dedb0e30db954eef7f6c5730f81cf9e084c3" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.419354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" event={"ID":"331a63fb-2148-49d9-ab0a-7dc819c26bea","Type":"ContainerDied","Data":"4946ef767e5f1367fddff67024fc27c17ae271503b23e333efadd20bf38aabe0"} Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.419375 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4946ef767e5f1367fddff67024fc27c17ae271503b23e333efadd20bf38aabe0" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.419405 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.421109 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-create-54lmm" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.421172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-create-54lmm" event={"ID":"f386591d-bc9f-4389-b61b-e4a830c025f9","Type":"ContainerDied","Data":"a9d128f21e2bd839f473d002ad7086c7ba08a4384df35b67ee6e5222b78112cf"} Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.421241 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d128f21e2bd839f473d002ad7086c7ba08a4384df35b67ee6e5222b78112cf" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.422926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" event={"ID":"ff6f83e7-540c-419e-b09f-1a49dc622f40","Type":"ContainerDied","Data":"8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c"} Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.422986 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a20632456cebd1dff17926456624776f08fb97f6ac5b882b50deef217a4a10c" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.423116 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-create-4vbjd" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.659005 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.755213 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts\") pod \"489812a0-ca0d-4259-a487-7c4840c0a7d5\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.755286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjc9\" (UniqueName: \"kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9\") pod \"489812a0-ca0d-4259-a487-7c4840c0a7d5\" (UID: \"489812a0-ca0d-4259-a487-7c4840c0a7d5\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.756431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "489812a0-ca0d-4259-a487-7c4840c0a7d5" (UID: "489812a0-ca0d-4259-a487-7c4840c0a7d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.765720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9" (OuterVolumeSpecName: "kube-api-access-rbjc9") pod "489812a0-ca0d-4259-a487-7c4840c0a7d5" (UID: "489812a0-ca0d-4259-a487-7c4840c0a7d5"). InnerVolumeSpecName "kube-api-access-rbjc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.857586 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/489812a0-ca0d-4259-a487-7c4840c0a7d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.857866 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjc9\" (UniqueName: \"kubernetes.io/projected/489812a0-ca0d-4259-a487-7c4840c0a7d5-kube-api-access-rbjc9\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.871422 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.958458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts\") pod \"691512e9-1781-4ce2-9d4d-c1787adb07c5\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.958569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rjrs\" (UniqueName: \"kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs\") pod \"691512e9-1781-4ce2-9d4d-c1787adb07c5\" (UID: \"691512e9-1781-4ce2-9d4d-c1787adb07c5\") " Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.959149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "691512e9-1781-4ce2-9d4d-c1787adb07c5" (UID: "691512e9-1781-4ce2-9d4d-c1787adb07c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:54:32 crc kubenswrapper[5030]: I0120 23:54:32.962554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs" (OuterVolumeSpecName: "kube-api-access-8rjrs") pod "691512e9-1781-4ce2-9d4d-c1787adb07c5" (UID: "691512e9-1781-4ce2-9d4d-c1787adb07c5"). InnerVolumeSpecName "kube-api-access-8rjrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.060551 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/691512e9-1781-4ce2-9d4d-c1787adb07c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.060589 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rjrs\" (UniqueName: \"kubernetes.io/projected/691512e9-1781-4ce2-9d4d-c1787adb07c5-kube-api-access-8rjrs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.434420 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e30653e-ea2a-4d57-b605-c35cf41208c4" containerID="87a82cba312227a3eedb47a0c268697ec415a32cbe79e06eaeca52d69f0bc499" exitCode=0 Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.434511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" event={"ID":"2e30653e-ea2a-4d57-b605-c35cf41208c4","Type":"ContainerDied","Data":"87a82cba312227a3eedb47a0c268697ec415a32cbe79e06eaeca52d69f0bc499"} Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.436834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" event={"ID":"691512e9-1781-4ce2-9d4d-c1787adb07c5","Type":"ContainerDied","Data":"3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705"} Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.436890 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3716ee4f3e7ce8d113594ed8347b84b425e0ee36fb204cdfb3142c2dbe408705" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.436963 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.440025 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" event={"ID":"489812a0-ca0d-4259-a487-7c4840c0a7d5","Type":"ContainerDied","Data":"7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419"} Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.440077 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7629026d1319f53c4b721f5b8b68e61d8b433fb07434e3de309a95da7f53a419" Jan 20 23:54:33 crc kubenswrapper[5030]: I0120 23:54:33.440189 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.819454 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.894802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle\") pod \"2e30653e-ea2a-4d57-b605-c35cf41208c4\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.894902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ktx\" (UniqueName: \"kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx\") pod \"2e30653e-ea2a-4d57-b605-c35cf41208c4\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.895020 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data\") pod \"2e30653e-ea2a-4d57-b605-c35cf41208c4\" (UID: \"2e30653e-ea2a-4d57-b605-c35cf41208c4\") " Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.905420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx" (OuterVolumeSpecName: "kube-api-access-r7ktx") pod "2e30653e-ea2a-4d57-b605-c35cf41208c4" (UID: "2e30653e-ea2a-4d57-b605-c35cf41208c4"). InnerVolumeSpecName "kube-api-access-r7ktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.929129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e30653e-ea2a-4d57-b605-c35cf41208c4" (UID: "2e30653e-ea2a-4d57-b605-c35cf41208c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.960561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data" (OuterVolumeSpecName: "config-data") pod "2e30653e-ea2a-4d57-b605-c35cf41208c4" (UID: "2e30653e-ea2a-4d57-b605-c35cf41208c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.997252 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.997312 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e30653e-ea2a-4d57-b605-c35cf41208c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:34 crc kubenswrapper[5030]: I0120 23:54:34.998512 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7ktx\" (UniqueName: \"kubernetes.io/projected/2e30653e-ea2a-4d57-b605-c35cf41208c4-kube-api-access-r7ktx\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.465274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" event={"ID":"2e30653e-ea2a-4d57-b605-c35cf41208c4","Type":"ContainerDied","Data":"8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2"} Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.465595 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8306437ca3890c35a211dd7f1168ee28004efd5b50417021045975978e4c2ab2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.465365 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-db-sync-pcjpl" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.608794 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-r5kx2"] Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609243 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f386591d-bc9f-4389-b61b-e4a830c025f9" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609267 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f386591d-bc9f-4389-b61b-e4a830c025f9" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609278 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6f83e7-540c-419e-b09f-1a49dc622f40" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6f83e7-540c-419e-b09f-1a49dc622f40" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="691512e9-1781-4ce2-9d4d-c1787adb07c5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="691512e9-1781-4ce2-9d4d-c1787adb07c5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609333 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331a63fb-2148-49d9-ab0a-7dc819c26bea" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609339 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="331a63fb-2148-49d9-ab0a-7dc819c26bea" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609346 5030 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f2da3223-e170-45c0-a464-2b91d405f3c0" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609352 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2da3223-e170-45c0-a464-2b91d405f3c0" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609365 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="489812a0-ca0d-4259-a487-7c4840c0a7d5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609371 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="489812a0-ca0d-4259-a487-7c4840c0a7d5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: E0120 23:54:35.609382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e30653e-ea2a-4d57-b605-c35cf41208c4" containerName="keystone-db-sync" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609388 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e30653e-ea2a-4d57-b605-c35cf41208c4" containerName="keystone-db-sync" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609530 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2da3223-e170-45c0-a464-2b91d405f3c0" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609553 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="489812a0-ca0d-4259-a487-7c4840c0a7d5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609567 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="331a63fb-2148-49d9-ab0a-7dc819c26bea" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609577 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6f83e7-540c-419e-b09f-1a49dc622f40" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="691512e9-1781-4ce2-9d4d-c1787adb07c5" containerName="mariadb-account-create-update" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609599 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e30653e-ea2a-4d57-b605-c35cf41208c4" containerName="keystone-db-sync" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.609612 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f386591d-bc9f-4389-b61b-e4a830c025f9" containerName="mariadb-database-create" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.610216 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.616975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-r5kx2"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.617017 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.617207 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4w7gr" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.617674 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.617828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.618194 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.709723 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.709786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ccn5\" (UniqueName: \"kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.709807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.709996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.710088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.710308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.786463 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4jp4k"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.787391 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.790791 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.790995 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.791126 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-78dp8" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812082 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ccn5\" (UniqueName: \"kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.812225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.817835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle\") pod 
\"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.824932 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-z8bpm"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.826530 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.827589 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.828462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.828699 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.828900 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-7r9rk" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.829053 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.840697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.841272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.842808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ccn5\" (UniqueName: \"kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5\") pod \"keystone-bootstrap-r5kx2\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.850892 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4jp4k"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.864812 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8kqng"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.866140 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.888663 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-z8bpm"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.889177 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-px6l8" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.889406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.889639 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d88p\" (UniqueName: \"kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917940 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnjx\" (UniqueName: \"kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx\") pod \"neutron-db-sync-z8bpm\" (UID: 
\"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.917995 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.918015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.933017 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.937075 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8kqng"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.949393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9fhpl"] Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.950715 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.963157 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.963157 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-ld7f7" Jan 20 23:54:35 crc kubenswrapper[5030]: I0120 23:54:35.978573 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9fhpl"] Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.009016 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.013828 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.017184 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.017994 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019185 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019286 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d88p\" (UniqueName: \"kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnjx\" (UniqueName: \"kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc 
kubenswrapper[5030]: I0120 23:54:36.019834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw7kl\" (UniqueName: \"kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019886 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.019937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.029303 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.029307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.033039 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 
23:54:36.033181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.035406 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.047460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.048791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.060200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnjx\" (UniqueName: \"kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx\") pod \"neutron-db-sync-z8bpm\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.066776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d88p\" (UniqueName: \"kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p\") pod \"cinder-db-sync-4jp4k\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.074520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.104347 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.120922 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.120967 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.120989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvbm\" (UniqueName: \"kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw7kl\" (UniqueName: \"kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr988\" (UniqueName: \"kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988\") 
pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121169 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121213 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.121260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.123201 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.125897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.126915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 
23:54:36.131860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.137274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw7kl\" (UniqueName: \"kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl\") pod \"placement-db-sync-8kqng\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.209202 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225658 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.225791 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.226274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvbm\" (UniqueName: \"kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " 
pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.226362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.226422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr988\" (UniqueName: \"kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.226501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.227812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.228501 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.230176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.232881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.233257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.234192 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.236040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle\") pod 
\"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.236377 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.240115 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.244621 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvbm\" (UniqueName: \"kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm\") pod \"barbican-db-sync-9fhpl\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:36 crc kubenswrapper[5030]: I0120 23:54:36.253466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr988\" (UniqueName: \"kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988\") pod \"ceilometer-0\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.395742 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.404713 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.432671 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-r5kx2"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.476059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" event={"ID":"e69fbd0a-0caf-4068-a40b-7706ffb909e8","Type":"ContainerStarted","Data":"30840d3cad70cc84396470e21a804e3f3d6b4dd7c2c0c4976a908f9fda990f54"} Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.738830 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.741328 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.746680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.746961 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-scripts" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.747219 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-glance-dockercfg-pt9jr" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.747360 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.750813 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.794412 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.796449 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.800397 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.800781 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.808420 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxg6x\" (UniqueName: \"kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841936 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.841992 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.842031 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxg6x\" (UniqueName: \"kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943895 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.943976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzfj6\" (UniqueName: \"kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944153 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944179 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944398 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.944729 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.947248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.950429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.955062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.961567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.965456 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:36.981072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxg6x\" (UniqueName: \"kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.019829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.045412 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.045481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzfj6\" (UniqueName: \"kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.045538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.046156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.048209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.048482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.049125 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.049714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.049844 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.049868 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.050219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.053286 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.055626 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.056950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.057563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.066541 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.070866 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzfj6\" (UniqueName: \"kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.086548 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.116181 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.421180 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4jp4k"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.488919 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" event={"ID":"d0747945-eb36-4b4d-884c-37803fa92174","Type":"ContainerStarted","Data":"a4092b56e3dc4010a64a7ead7478226f9154c12d62ddb74e6f7aec13b2518285"} Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.507069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" event={"ID":"e69fbd0a-0caf-4068-a40b-7706ffb909e8","Type":"ContainerStarted","Data":"c1391761fb7c8fc5ae03c94f84c2cb4e7dcbef4e6e695265c52e6766480d995c"} Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.526758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9fhpl"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.547201 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-z8bpm"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.563057 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" podStartSLOduration=2.56303613 podStartE2EDuration="2.56303613s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:37.535583815 +0000 UTC m=+4749.855844103" watchObservedRunningTime="2026-01-20 23:54:37.56303613 +0000 UTC m=+4749.883296418" Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.577120 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.584925 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8kqng"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.710802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: W0120 23:54:37.720918 5030 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ab7ab3_1606_480e_9736_1982f77df967.slice/crio-aa7cb9e25b48ca190f8db5cb693077e0c1de558525b942afa001418a50d5d71e WatchSource:0}: Error finding container aa7cb9e25b48ca190f8db5cb693077e0c1de558525b942afa001418a50d5d71e: Status 404 returned error can't find the container with id aa7cb9e25b48ca190f8db5cb693077e0c1de558525b942afa001418a50d5d71e Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.852616 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:37 crc kubenswrapper[5030]: I0120 23:54:37.913972 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.013679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.063765 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.520276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerStarted","Data":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.520554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerStarted","Data":"c3cf5e468850f79e029deca4b6fc564ea74669e584e9cf529b8212bde86b60de"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.523487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerStarted","Data":"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.523533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerStarted","Data":"aa7cb9e25b48ca190f8db5cb693077e0c1de558525b942afa001418a50d5d71e"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.537453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" event={"ID":"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba","Type":"ContainerStarted","Data":"70a7d28cf52d5ef1d00316b0dcbf8fba7e082729deb543a60515ea02129ddf50"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.537501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" event={"ID":"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba","Type":"ContainerStarted","Data":"2e02f41599e5cb59701eb73c30a7adb60f7a75a8b99f1625e420fea9ba328af3"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.544838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" event={"ID":"a7238d6f-7a18-49b8-a128-4935e9d5fadc","Type":"ContainerStarted","Data":"5327928e7446b8be0843fc4890d73bda7bfa3169ddb8d3054850ab76b5f5bbd7"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.544880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" 
event={"ID":"a7238d6f-7a18-49b8-a128-4935e9d5fadc","Type":"ContainerStarted","Data":"b6fb71ca17926cb9d6d6dc943db984f65473d9993eb37b85dc3cf57fc8a004e2"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.548194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8kqng" event={"ID":"076d046e-5116-4b10-ace6-602cb79307fa","Type":"ContainerStarted","Data":"dca510e79d494a956a98c426b8f4611d3b6b6dde0cb62746d4ae5531d4f22da2"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.548222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8kqng" event={"ID":"076d046e-5116-4b10-ace6-602cb79307fa","Type":"ContainerStarted","Data":"33b2e1ecf11b99d8860e0a2213a19371a0271517adbdbb5bfba2bfc87d10050b"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.555156 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" podStartSLOduration=3.555141218 podStartE2EDuration="3.555141218s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:38.550539026 +0000 UTC m=+4750.870799304" watchObservedRunningTime="2026-01-20 23:54:38.555141218 +0000 UTC m=+4750.875401506" Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.564746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerStarted","Data":"dd0d49987aff5d1153e4f324c8b4f42d11b2fd52b0f02d18cb0ffc08955617de"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.571152 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" podStartSLOduration=3.571139425 podStartE2EDuration="3.571139425s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:38.567207751 +0000 UTC m=+4750.887468039" watchObservedRunningTime="2026-01-20 23:54:38.571139425 +0000 UTC m=+4750.891399703" Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.591771 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-db-sync-8kqng" podStartSLOduration=3.591755156 podStartE2EDuration="3.591755156s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:38.583409253 +0000 UTC m=+4750.903669541" watchObservedRunningTime="2026-01-20 23:54:38.591755156 +0000 UTC m=+4750.912015444" Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.604847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" event={"ID":"d0747945-eb36-4b4d-884c-37803fa92174","Type":"ContainerStarted","Data":"8908dc6a3ed08b89887c59ba107ec391977995d11046784ad1c79931bbf35386"} Jan 20 23:54:38 crc kubenswrapper[5030]: I0120 23:54:38.635339 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" podStartSLOduration=3.6353216120000003 podStartE2EDuration="3.635321612s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:38.617278284 +0000 UTC m=+4750.937538572" watchObservedRunningTime="2026-01-20 23:54:38.635321612 +0000 UTC m=+4750.955581900" Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.614468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerStarted","Data":"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460"} Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.614950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerStarted","Data":"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6"} Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.614669 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-httpd" containerID="cri-o://03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" gracePeriod=30 Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.614647 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-log" containerID="cri-o://ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" gracePeriod=30 Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.619384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerStarted","Data":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.637012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerStarted","Data":"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789"} Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.637357 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-log" containerID="cri-o://5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" gracePeriod=30 Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.637498 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-httpd" containerID="cri-o://663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" gracePeriod=30 Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.660574 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=4.660560544 podStartE2EDuration="4.660560544s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:39.65834068 +0000 UTC m=+4751.978600958" watchObservedRunningTime="2026-01-20 23:54:39.660560544 
+0000 UTC m=+4751.980820832" Jan 20 23:54:39 crc kubenswrapper[5030]: I0120 23:54:39.709342 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=4.709324866 podStartE2EDuration="4.709324866s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:39.702314896 +0000 UTC m=+4752.022575184" watchObservedRunningTime="2026-01-20 23:54:39.709324866 +0000 UTC m=+4752.029585154" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.290602 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxg6x\" (UniqueName: \"kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428295 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428370 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428415 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: \"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.428506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data\") pod \"a4ab7ab3-1606-480e-9736-1982f77df967\" (UID: 
\"a4ab7ab3-1606-480e-9736-1982f77df967\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.429329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.429539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs" (OuterVolumeSpecName: "logs") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.434220 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x" (OuterVolumeSpecName: "kube-api-access-zxg6x") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "kube-api-access-zxg6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.434307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.437817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts" (OuterVolumeSpecName: "scripts") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.483782 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.532989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533328 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533365 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533378 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxg6x\" (UniqueName: \"kubernetes.io/projected/a4ab7ab3-1606-480e-9736-1982f77df967-kube-api-access-zxg6x\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533388 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4ab7ab3-1606-480e-9736-1982f77df967-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533410 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533420 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.533428 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.542106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data" (OuterVolumeSpecName: "config-data") pod "a4ab7ab3-1606-480e-9736-1982f77df967" (UID: "a4ab7ab3-1606-480e-9736-1982f77df967"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.552378 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.553575 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634318 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzfj6\" (UniqueName: \"kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run\") pod \"a4b524b8-0743-4086-9935-3a0b0823a236\" (UID: \"a4b524b8-0743-4086-9935-3a0b0823a236\") " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634908 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.634919 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ab7ab3-1606-480e-9736-1982f77df967-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.635057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs" (OuterVolumeSpecName: "logs") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.635141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.637942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6" (OuterVolumeSpecName: "kube-api-access-kzfj6") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "kube-api-access-kzfj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.638347 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts" (OuterVolumeSpecName: "scripts") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.638753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652107 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4ab7ab3-1606-480e-9736-1982f77df967" containerID="663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" exitCode=143 Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652135 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4ab7ab3-1606-480e-9736-1982f77df967" containerID="5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" exitCode=143 Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerDied","Data":"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerDied","Data":"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"a4ab7ab3-1606-480e-9736-1982f77df967","Type":"ContainerDied","Data":"aa7cb9e25b48ca190f8db5cb693077e0c1de558525b942afa001418a50d5d71e"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652219 5030 scope.go:117] "RemoveContainer" containerID="663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.652325 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664009 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4b524b8-0743-4086-9935-3a0b0823a236" containerID="03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" exitCode=143 Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664033 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4b524b8-0743-4086-9935-3a0b0823a236" containerID="ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" exitCode=143 Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerDied","Data":"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerDied","Data":"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"a4b524b8-0743-4086-9935-3a0b0823a236","Type":"ContainerDied","Data":"dd0d49987aff5d1153e4f324c8b4f42d11b2fd52b0f02d18cb0ffc08955617de"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.664145 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.667180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerStarted","Data":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.671272 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.678286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data" (OuterVolumeSpecName: "config-data") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.684507 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.685382 5030 scope.go:117] "RemoveContainer" containerID="5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.685994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4b524b8-0743-4086-9935-3a0b0823a236" (UID: "a4b524b8-0743-4086-9935-3a0b0823a236"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.692640 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.707568 5030 scope.go:117] "RemoveContainer" containerID="663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.707994 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789\": container with ID starting with 663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789 not found: ID does not exist" containerID="663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708037 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789"} err="failed to get container status \"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789\": rpc error: code = NotFound desc = could not find container \"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789\": container with ID starting with 663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708062 5030 scope.go:117] "RemoveContainer" containerID="5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.708271 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223\": container with ID starting with 5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223 not found: ID does not exist" containerID="5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708302 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223"} err="failed to get container status \"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223\": rpc error: code = NotFound desc = could not find container \"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223\": container with ID starting with 5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708321 5030 scope.go:117] "RemoveContainer" containerID="663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708506 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789"} err="failed to get container status \"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789\": rpc error: code = NotFound desc = could not find container \"663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789\": container with ID starting with 663dfa0d78037d8ef8d96fd02ae4876e88d33acf30cdbff0e04c27360b9ef789 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.708530 5030 
scope.go:117] "RemoveContainer" containerID="5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.709531 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223"} err="failed to get container status \"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223\": rpc error: code = NotFound desc = could not find container \"5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223\": container with ID starting with 5ae355aca0ec9cc9ba0c41cf210ff7a875099a3c5c46e25bf1eb2cb5ca110223 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.710015 5030 scope.go:117] "RemoveContainer" containerID="03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.720881 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.722266 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722306 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.722323 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722329 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.722346 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722352 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.722388 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722612 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-log" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722639 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.722651 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" containerName="glance-httpd" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.723478 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.727151 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.727786 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746489 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746516 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746529 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzfj6\" (UniqueName: \"kubernetes.io/projected/a4b524b8-0743-4086-9935-3a0b0823a236-kube-api-access-kzfj6\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746540 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746550 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746558 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4b524b8-0743-4086-9935-3a0b0823a236-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746585 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.746594 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4b524b8-0743-4086-9935-3a0b0823a236-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.747432 5030 scope.go:117] "RemoveContainer" containerID="ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.759415 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.767932 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.773878 5030 scope.go:117] "RemoveContainer" containerID="03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.774252 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460\": container with ID starting with 03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460 not found: ID does not exist" containerID="03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774289 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460"} err="failed to get container status \"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460\": rpc error: code = NotFound desc = could not find container \"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460\": container with ID starting with 03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774316 5030 scope.go:117] "RemoveContainer" containerID="ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" Jan 20 23:54:40 crc kubenswrapper[5030]: E0120 23:54:40.774551 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6\": container with ID starting with ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6 not found: ID does not exist" containerID="ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774566 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6"} err="failed to get container status \"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6\": rpc error: code = NotFound desc = could not find container \"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6\": container with ID starting with ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774578 5030 scope.go:117] "RemoveContainer" containerID="03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774807 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460"} err="failed to get container status \"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460\": rpc error: code = NotFound desc = could not find container \"03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460\": container with ID starting with 03ab973e587e33627ccb8e21567e57b4ecdd2c1706a51e2408882f1152811460 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.774835 5030 scope.go:117] "RemoveContainer" containerID="ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.775430 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6"} err="failed to get container status \"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6\": rpc error: code = NotFound desc = could not find container \"ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6\": container with ID starting with 
ca0fcca7e21996f54c6f25ce8dbb2ebea089f76ad253158036df001fd8eb5ee6 not found: ID does not exist" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848011 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848074 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsl57\" (UniqueName: \"kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848217 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.848418 5030 reconciler_common.go:293] "Volume detached for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.949734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.949920 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.949962 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.949987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsl57\" (UniqueName: \"kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950084 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950173 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950344 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" 
(UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.950576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.954461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.955549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.958573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.958769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.968134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsl57\" (UniqueName: \"kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:40 crc kubenswrapper[5030]: I0120 23:54:40.991263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.055213 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.192662 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.201515 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.216788 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.218443 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.223406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.223593 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.225255 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364033 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364138 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364224 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q4b\" (UniqueName: \"kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364309 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.364389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465507 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465611 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465711 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62q4b\" (UniqueName: \"kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: 
I0120 23:54:41.465805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.465846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.467052 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.467184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.468393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.475513 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.483240 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.483761 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.484604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 
23:54:41.490024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62q4b\" (UniqueName: \"kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.499616 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: W0120 23:54:41.538066 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42cd0b4c_f4c4_40a8_b181_1aec66c793c8.slice/crio-2f6b6feb6fa5a01bc974310597e12aeadf1d477b330f823af506702401a0d3ae WatchSource:0}: Error finding container 2f6b6feb6fa5a01bc974310597e12aeadf1d477b330f823af506702401a0d3ae: Status 404 returned error can't find the container with id 2f6b6feb6fa5a01bc974310597e12aeadf1d477b330f823af506702401a0d3ae Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.539977 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.564666 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.693657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerStarted","Data":"2f6b6feb6fa5a01bc974310597e12aeadf1d477b330f823af506702401a0d3ae"} Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.697056 5030 generic.go:334] "Generic (PLEG): container finished" podID="e69fbd0a-0caf-4068-a40b-7706ffb909e8" containerID="c1391761fb7c8fc5ae03c94f84c2cb4e7dcbef4e6e695265c52e6766480d995c" exitCode=0 Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.697093 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" event={"ID":"e69fbd0a-0caf-4068-a40b-7706ffb909e8","Type":"ContainerDied","Data":"c1391761fb7c8fc5ae03c94f84c2cb4e7dcbef4e6e695265c52e6766480d995c"} Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.700108 5030 generic.go:334] "Generic (PLEG): container finished" podID="076d046e-5116-4b10-ace6-602cb79307fa" containerID="dca510e79d494a956a98c426b8f4611d3b6b6dde0cb62746d4ae5531d4f22da2" exitCode=0 Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.700146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8kqng" event={"ID":"076d046e-5116-4b10-ace6-602cb79307fa","Type":"ContainerDied","Data":"dca510e79d494a956a98c426b8f4611d3b6b6dde0cb62746d4ae5531d4f22da2"} Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.972984 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ab7ab3-1606-480e-9736-1982f77df967" path="/var/lib/kubelet/pods/a4ab7ab3-1606-480e-9736-1982f77df967/volumes" Jan 20 23:54:41 crc kubenswrapper[5030]: I0120 23:54:41.974720 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b524b8-0743-4086-9935-3a0b0823a236" 
path="/var/lib/kubelet/pods/a4b524b8-0743-4086-9935-3a0b0823a236/volumes" Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.056887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:54:42 crc kubenswrapper[5030]: W0120 23:54:42.068758 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82663c55_6190_4f89_a81a_4950835abad4.slice/crio-f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e WatchSource:0}: Error finding container f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e: Status 404 returned error can't find the container with id f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.713948 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7238d6f-7a18-49b8-a128-4935e9d5fadc" containerID="5327928e7446b8be0843fc4890d73bda7bfa3169ddb8d3054850ab76b5f5bbd7" exitCode=0 Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.714175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" event={"ID":"a7238d6f-7a18-49b8-a128-4935e9d5fadc","Type":"ContainerDied","Data":"5327928e7446b8be0843fc4890d73bda7bfa3169ddb8d3054850ab76b5f5bbd7"} Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.715994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerStarted","Data":"f0f6dcbbc1beacc6118f560448c544e57f674e2e974643154ee6c4c223e11d73"} Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.716014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerStarted","Data":"f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e"} Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.720216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerStarted","Data":"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae"} Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerStarted","Data":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723599 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723596 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-central-agent" containerID="cri-o://c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" gracePeriod=30 Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723677 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="sg-core" containerID="cri-o://70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" gracePeriod=30 
Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="proxy-httpd" containerID="cri-o://c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" gracePeriod=30 Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.723812 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-notification-agent" containerID="cri-o://abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" gracePeriod=30 Jan 20 23:54:42 crc kubenswrapper[5030]: I0120 23:54:42.762254 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.728549435 podStartE2EDuration="7.762235816s" podCreationTimestamp="2026-01-20 23:54:35 +0000 UTC" firstStartedPulling="2026-01-20 23:54:37.623468597 +0000 UTC m=+4749.943728885" lastFinishedPulling="2026-01-20 23:54:41.657154968 +0000 UTC m=+4753.977415266" observedRunningTime="2026-01-20 23:54:42.758109676 +0000 UTC m=+4755.078369964" watchObservedRunningTime="2026-01-20 23:54:42.762235816 +0000 UTC m=+4755.082496104" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.275605 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.282423 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw7kl\" (UniqueName: \"kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl\") pod \"076d046e-5116-4b10-ace6-602cb79307fa\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407109 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407174 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data\") pod \"076d046e-5116-4b10-ace6-602cb79307fa\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 
23:54:43.407326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccn5\" (UniqueName: \"kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs\") pod \"076d046e-5116-4b10-ace6-602cb79307fa\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle\") pod \"076d046e-5116-4b10-ace6-602cb79307fa\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys\") pod \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\" (UID: \"e69fbd0a-0caf-4068-a40b-7706ffb909e8\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.407462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts\") pod \"076d046e-5116-4b10-ace6-602cb79307fa\" (UID: \"076d046e-5116-4b10-ace6-602cb79307fa\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.408538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs" (OuterVolumeSpecName: "logs") pod "076d046e-5116-4b10-ace6-602cb79307fa" (UID: "076d046e-5116-4b10-ace6-602cb79307fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.413065 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts" (OuterVolumeSpecName: "scripts") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.413495 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts" (OuterVolumeSpecName: "scripts") pod "076d046e-5116-4b10-ace6-602cb79307fa" (UID: "076d046e-5116-4b10-ace6-602cb79307fa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.413571 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5" (OuterVolumeSpecName: "kube-api-access-6ccn5") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "kube-api-access-6ccn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.413555 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.413610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl" (OuterVolumeSpecName: "kube-api-access-qw7kl") pod "076d046e-5116-4b10-ace6-602cb79307fa" (UID: "076d046e-5116-4b10-ace6-602cb79307fa"). InnerVolumeSpecName "kube-api-access-qw7kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.414290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.431238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data" (OuterVolumeSpecName: "config-data") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.432315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e69fbd0a-0caf-4068-a40b-7706ffb909e8" (UID: "e69fbd0a-0caf-4068-a40b-7706ffb909e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.432868 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "076d046e-5116-4b10-ace6-602cb79307fa" (UID: "076d046e-5116-4b10-ace6-602cb79307fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.440396 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data" (OuterVolumeSpecName: "config-data") pod "076d046e-5116-4b10-ace6-602cb79307fa" (UID: "076d046e-5116-4b10-ace6-602cb79307fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.457396 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.508845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr988\" (UniqueName: \"kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.508904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.508928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts\") pod \"985aa547-9920-4524-82f7-d16698da2c8d\" (UID: \"985aa547-9920-4524-82f7-d16698da2c8d\") " Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509483 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509832 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509899 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.509953 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510008 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510068 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510123 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw7kl\" (UniqueName: \"kubernetes.io/projected/076d046e-5116-4b10-ace6-602cb79307fa-kube-api-access-qw7kl\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510183 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510234 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/076d046e-5116-4b10-ace6-602cb79307fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510283 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510332 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69fbd0a-0caf-4068-a40b-7706ffb909e8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510386 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccn5\" (UniqueName: \"kubernetes.io/projected/e69fbd0a-0caf-4068-a40b-7706ffb909e8-kube-api-access-6ccn5\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.510441 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/076d046e-5116-4b10-ace6-602cb79307fa-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.512712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: 
"985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.512971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988" (OuterVolumeSpecName: "kube-api-access-dr988") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "kube-api-access-dr988". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.513007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts" (OuterVolumeSpecName: "scripts") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.538723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.580573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.601927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data" (OuterVolumeSpecName: "config-data") pod "985aa547-9920-4524-82f7-d16698da2c8d" (UID: "985aa547-9920-4524-82f7-d16698da2c8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612391 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612443 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612462 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612480 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/985aa547-9920-4524-82f7-d16698da2c8d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612501 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr988\" (UniqueName: \"kubernetes.io/projected/985aa547-9920-4524-82f7-d16698da2c8d-kube-api-access-dr988\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.612522 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/985aa547-9920-4524-82f7-d16698da2c8d-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.734749 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" event={"ID":"e69fbd0a-0caf-4068-a40b-7706ffb909e8","Type":"ContainerDied","Data":"30840d3cad70cc84396470e21a804e3f3d6b4dd7c2c0c4976a908f9fda990f54"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.734788 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30840d3cad70cc84396470e21a804e3f3d6b4dd7c2c0c4976a908f9fda990f54" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.734847 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-r5kx2" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.746423 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-db-sync-8kqng" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.746509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-db-sync-8kqng" event={"ID":"076d046e-5116-4b10-ace6-602cb79307fa","Type":"ContainerDied","Data":"33b2e1ecf11b99d8860e0a2213a19371a0271517adbdbb5bfba2bfc87d10050b"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.746563 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b2e1ecf11b99d8860e0a2213a19371a0271517adbdbb5bfba2bfc87d10050b" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.749968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerStarted","Data":"2fc6210e9ab2da522085f52343b3e8c26de15701510867d1f0192cef9767fbc7"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.754204 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerStarted","Data":"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.766353 5030 generic.go:334] "Generic (PLEG): container finished" podID="985aa547-9920-4524-82f7-d16698da2c8d" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" exitCode=0 Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.766385 5030 generic.go:334] "Generic (PLEG): container finished" podID="985aa547-9920-4524-82f7-d16698da2c8d" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" exitCode=2 Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.766395 5030 generic.go:334] "Generic (PLEG): container finished" podID="985aa547-9920-4524-82f7-d16698da2c8d" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" exitCode=0 Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.766403 5030 generic.go:334] "Generic (PLEG): container finished" podID="985aa547-9920-4524-82f7-d16698da2c8d" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" exitCode=0 Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.766558 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerDied","Data":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerDied","Data":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768396 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerDied","Data":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerDied","Data":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"985aa547-9920-4524-82f7-d16698da2c8d","Type":"ContainerDied","Data":"c3cf5e468850f79e029deca4b6fc564ea74669e584e9cf529b8212bde86b60de"} Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.768429 5030 scope.go:117] "RemoveContainer" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.779996 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.7799788850000002 podStartE2EDuration="2.779978885s" podCreationTimestamp="2026-01-20 23:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:43.775566328 +0000 UTC m=+4756.095826616" watchObservedRunningTime="2026-01-20 23:54:43.779978885 +0000 UTC m=+4756.100239173" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.811546 5030 scope.go:117] "RemoveContainer" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.814429 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.81440511 podStartE2EDuration="3.81440511s" podCreationTimestamp="2026-01-20 23:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:43.811947441 +0000 UTC m=+4756.132207739" watchObservedRunningTime="2026-01-20 23:54:43.81440511 +0000 UTC m=+4756.134665418" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.843461 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.850265 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.867847 5030 scope.go:117] "RemoveContainer" 
containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893482 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893850 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-notification-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893865 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-notification-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893875 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="sg-core" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893881 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="sg-core" Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893891 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-central-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893897 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-central-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893911 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076d046e-5116-4b10-ace6-602cb79307fa" containerName="placement-db-sync" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893917 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="076d046e-5116-4b10-ace6-602cb79307fa" containerName="placement-db-sync" Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893940 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="proxy-httpd" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893945 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="proxy-httpd" Jan 20 23:54:43 crc kubenswrapper[5030]: E0120 23:54:43.893953 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69fbd0a-0caf-4068-a40b-7706ffb909e8" containerName="keystone-bootstrap" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.893959 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69fbd0a-0caf-4068-a40b-7706ffb909e8" containerName="keystone-bootstrap" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894113 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-central-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894129 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="076d046e-5116-4b10-ace6-602cb79307fa" containerName="placement-db-sync" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894138 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="proxy-httpd" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894153 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="ceilometer-notification-agent" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894166 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="985aa547-9920-4524-82f7-d16698da2c8d" containerName="sg-core" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.894172 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69fbd0a-0caf-4068-a40b-7706ffb909e8" containerName="keystone-bootstrap" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.895569 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.904416 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.904672 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.907164 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.938808 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-r5kx2"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.941606 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.947820 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.951001 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-placement-dockercfg-px6l8" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.951341 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-scripts" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.951542 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-public-svc" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.951658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-placement-internal-svc" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.951736 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-config-data" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.960758 5030 scope.go:117] "RemoveContainer" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.976795 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985aa547-9920-4524-82f7-d16698da2c8d" path="/var/lib/kubelet/pods/985aa547-9920-4524-82f7-d16698da2c8d/volumes" Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.977525 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-r5kx2"] Jan 20 23:54:43 crc kubenswrapper[5030]: I0120 23:54:43.977556 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.002171 5030 scope.go:117] "RemoveContainer" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:44 crc kubenswrapper[5030]: E0120 23:54:44.003058 5030 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": container with ID starting with c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292 not found: ID does not exist" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.003109 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} err="failed to get container status \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": rpc error: code = NotFound desc = could not find container \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": container with ID starting with c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.003129 5030 scope.go:117] "RemoveContainer" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:44 crc kubenswrapper[5030]: E0120 23:54:44.004581 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": container with ID starting with 70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df not found: ID does not exist" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.004606 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} err="failed to get container status \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": rpc error: code = NotFound desc = could not find container \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": container with ID starting with 70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.004635 5030 scope.go:117] "RemoveContainer" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:44 crc kubenswrapper[5030]: E0120 23:54:44.005552 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": container with ID starting with abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0 not found: ID does not exist" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.006655 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} err="failed to get container status \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": rpc error: code = NotFound desc = could not find container \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": container with ID starting with abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.006690 5030 scope.go:117] "RemoveContainer" 
containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:44 crc kubenswrapper[5030]: E0120 23:54:44.007273 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": container with ID starting with c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510 not found: ID does not exist" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.007295 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} err="failed to get container status \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": rpc error: code = NotFound desc = could not find container \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": container with ID starting with c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.007309 5030 scope.go:117] "RemoveContainer" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.007736 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7mn5q"] Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.008034 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} err="failed to get container status \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": rpc error: code = NotFound desc = could not find container \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": container with ID starting with c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.008049 5030 scope.go:117] "RemoveContainer" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.008707 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.009439 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} err="failed to get container status \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": rpc error: code = NotFound desc = could not find container \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": container with ID starting with 70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.009463 5030 scope.go:117] "RemoveContainer" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.010009 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} err="failed to get container status \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": rpc error: code = NotFound desc = could not find container \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": container with ID starting with abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.010040 5030 scope.go:117] "RemoveContainer" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.010774 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} err="failed to get container status \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": rpc error: code = NotFound desc = could not find container \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": container with ID starting with c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.010814 5030 scope.go:117] "RemoveContainer" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011388 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} err="failed to get container status \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": rpc error: code = NotFound desc = could not find container \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": container with ID starting with c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011410 5030 scope.go:117] "RemoveContainer" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011564 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"osp-secret" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011588 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 
23:54:44.011709 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} err="failed to get container status \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": rpc error: code = NotFound desc = could not find container \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": container with ID starting with 70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011739 5030 scope.go:117] "RemoveContainer" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.011807 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012001 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} err="failed to get container status \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": rpc error: code = NotFound desc = could not find container \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": container with ID starting with abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012024 5030 scope.go:117] "RemoveContainer" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} err="failed to get container status \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": rpc error: code = NotFound desc = could not find container \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": container with ID starting with c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012253 5030 scope.go:117] "RemoveContainer" containerID="c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012463 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292"} err="failed to get container status \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": rpc error: code = NotFound desc = could not find container \"c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292\": container with ID starting with c6c574f23d125575b979b0e20dfb841269a1d604b741fe26d8e5f4cbce469292 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012482 5030 scope.go:117] "RemoveContainer" containerID="70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012722 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df"} err="failed to get container status \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": 
rpc error: code = NotFound desc = could not find container \"70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df\": container with ID starting with 70f5c2687d0b84e20ecdaf976277656d5ffbc0bda52a8da5eb9df846e402e8df not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.012740 5030 scope.go:117] "RemoveContainer" containerID="abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.013061 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0"} err="failed to get container status \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": rpc error: code = NotFound desc = could not find container \"abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0\": container with ID starting with abb8299fc2b14a71a5c68cb4cca1d8b32028af7306f399a9050761d6144079d0 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.013080 5030 scope.go:117] "RemoveContainer" containerID="c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.013235 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510"} err="failed to get container status \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": rpc error: code = NotFound desc = could not find container \"c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510\": container with ID starting with c88846cdce5510a7aa10e7e9c84bbea8147d1e5231325c5752d68c776306d510 not found: ID does not exist" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.013566 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4w7gr" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.015460 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.016079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7mn5q"] Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: 
I0120 23:54:44.032640 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032837 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.032980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.033055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.033118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.033145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " 
pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.033180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tgj9\" (UniqueName: \"kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.033245 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135324 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135399 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc 
kubenswrapper[5030]: I0120 23:54:44.135422 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tgj9\" (UniqueName: \"kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135445 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766vs\" (UniqueName: \"kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135846 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135863 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135885 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135937 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.136382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.135777 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.136763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.140149 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.142283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.153717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.154365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.155788 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.156253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.160099 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.160771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.161068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tgj9\" (UniqueName: \"kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.161273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts\") pod \"ceilometer-0\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.161478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs\") pod \"placement-7cd488ddbd-4gzn9\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " 
pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237189 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766vs\" (UniqueName: \"kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237352 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.237531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.240023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.240334 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.240475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.240480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 
crc kubenswrapper[5030]: I0120 23:54:44.244056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.265969 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766vs\" (UniqueName: \"kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs\") pod \"keystone-bootstrap-7mn5q\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.296713 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.311154 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.325803 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.398839 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.440400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data\") pod \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.440990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle\") pod \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.441048 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dvbm\" (UniqueName: \"kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm\") pod \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\" (UID: \"a7238d6f-7a18-49b8-a128-4935e9d5fadc\") " Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.445379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a7238d6f-7a18-49b8-a128-4935e9d5fadc" (UID: "a7238d6f-7a18-49b8-a128-4935e9d5fadc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.445445 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm" (OuterVolumeSpecName: "kube-api-access-4dvbm") pod "a7238d6f-7a18-49b8-a128-4935e9d5fadc" (UID: "a7238d6f-7a18-49b8-a128-4935e9d5fadc"). InnerVolumeSpecName "kube-api-access-4dvbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.480165 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7238d6f-7a18-49b8-a128-4935e9d5fadc" (UID: "a7238d6f-7a18-49b8-a128-4935e9d5fadc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.543600 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.543660 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dvbm\" (UniqueName: \"kubernetes.io/projected/a7238d6f-7a18-49b8-a128-4935e9d5fadc-kube-api-access-4dvbm\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.543674 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a7238d6f-7a18-49b8-a128-4935e9d5fadc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.777521 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0747945-eb36-4b4d-884c-37803fa92174" containerID="8908dc6a3ed08b89887c59ba107ec391977995d11046784ad1c79931bbf35386" exitCode=0 Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.777681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" event={"ID":"d0747945-eb36-4b4d-884c-37803fa92174","Type":"ContainerDied","Data":"8908dc6a3ed08b89887c59ba107ec391977995d11046784ad1c79931bbf35386"} Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.782760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" event={"ID":"a7238d6f-7a18-49b8-a128-4935e9d5fadc","Type":"ContainerDied","Data":"b6fb71ca17926cb9d6d6dc943db984f65473d9993eb37b85dc3cf57fc8a004e2"} Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.782836 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6fb71ca17926cb9d6d6dc943db984f65473d9993eb37b85dc3cf57fc8a004e2" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.782792 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-db-sync-9fhpl" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.788195 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:54:44 crc kubenswrapper[5030]: W0120 23:54:44.796092 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbe6d2d_126d_4a16_88d5_bf143b2e498b.slice/crio-7d6f313e5aa5fac92719a695fbadf0f8003c9b241ff0bacec51a7ca74d8891c6 WatchSource:0}: Error finding container 7d6f313e5aa5fac92719a695fbadf0f8003c9b241ff0bacec51a7ca74d8891c6: Status 404 returned error can't find the container with id 7d6f313e5aa5fac92719a695fbadf0f8003c9b241ff0bacec51a7ca74d8891c6 Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.909158 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.955038 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:54:44 crc kubenswrapper[5030]: E0120 23:54:44.955670 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7238d6f-7a18-49b8-a128-4935e9d5fadc" containerName="barbican-db-sync" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.955683 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7238d6f-7a18-49b8-a128-4935e9d5fadc" containerName="barbican-db-sync" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.955826 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7238d6f-7a18-49b8-a128-4935e9d5fadc" containerName="barbican-db-sync" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.959762 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.964748 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-worker-config-data" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.965056 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-barbican-dockercfg-ld7f7" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.965193 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-config-data" Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.973747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7mn5q"] Jan 20 23:54:44 crc kubenswrapper[5030]: I0120 23:54:44.995235 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.018408 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.020174 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.023460 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.048235 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057897 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf7vw\" (UniqueName: \"kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.057947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 
23:54:45.057987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.058008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.058064 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhsgg\" (UniqueName: \"kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.110143 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.113378 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.117104 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-api-config-data" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.123501 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.167973 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168064 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf7vw\" (UniqueName: \"kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmss\" (UniqueName: \"kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhsgg\" (UniqueName: \"kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168482 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.168644 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.170278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.170674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.175399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.176341 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.176926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.177986 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.179103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.179796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.188774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf7vw\" (UniqueName: \"kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw\") pod \"barbican-worker-687887ddc7-x7m88\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.189220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhsgg\" (UniqueName: \"kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg\") pod \"barbican-keystone-listener-7f84774d9d-5g78l\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.273593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.273892 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.273941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.273963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.273989 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmss\" (UniqueName: \"kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.274993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.296512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.297471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.304207 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.304274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmss\" (UniqueName: \"kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss\") pod \"barbican-api-f487f85c8-pl27h\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.415137 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.469215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.488442 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.794868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerStarted","Data":"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.795253 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerStarted","Data":"7d6f313e5aa5fac92719a695fbadf0f8003c9b241ff0bacec51a7ca74d8891c6"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.796092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" event={"ID":"a58c134d-66fb-4e25-ac00-7fe6686da8e0","Type":"ContainerStarted","Data":"0f2d9ff94a70f371ea0c2f655df1343171f75f715efffd2ebcbd88a09a91dcdf"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.796109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" event={"ID":"a58c134d-66fb-4e25-ac00-7fe6686da8e0","Type":"ContainerStarted","Data":"a62d9cbdd182265654d87e1dcb68001e5929de1761935e5bbfb0f7388a1fda0b"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.799101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerStarted","Data":"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.799128 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.799137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerStarted","Data":"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.799146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerStarted","Data":"e2eee43ae98678198dc94a3bab964617abea148f3a77ad06f4b67562e642007a"} Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.799165 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.845728 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" podStartSLOduration=2.8457107170000002 podStartE2EDuration="2.845710717s" podCreationTimestamp="2026-01-20 23:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:45.837951349 +0000 UTC m=+4758.158211637" watchObservedRunningTime="2026-01-20 23:54:45.845710717 +0000 UTC m=+4758.165971005" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.884425 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" podStartSLOduration=2.884408036 podStartE2EDuration="2.884408036s" 
podCreationTimestamp="2026-01-20 23:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:45.860173077 +0000 UTC m=+4758.180433355" watchObservedRunningTime="2026-01-20 23:54:45.884408036 +0000 UTC m=+4758.204668314" Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.939024 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:54:45 crc kubenswrapper[5030]: I0120 23:54:45.982284 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69fbd0a-0caf-4068-a40b-7706ffb909e8" path="/var/lib/kubelet/pods/e69fbd0a-0caf-4068-a40b-7706ffb909e8/volumes" Jan 20 23:54:46 crc kubenswrapper[5030]: W0120 23:54:46.032751 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99fdfeb_7bb8_4ac4_b21a_e8b0ce724fd4.slice/crio-dbe215f3894cb82c012f25e8864c093a6efc53729448845d411e0cc5e788bfcb WatchSource:0}: Error finding container dbe215f3894cb82c012f25e8864c093a6efc53729448845d411e0cc5e788bfcb: Status 404 returned error can't find the container with id dbe215f3894cb82c012f25e8864c093a6efc53729448845d411e0cc5e788bfcb Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.055770 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.130941 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.255106 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392579 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.392874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d88p\" (UniqueName: \"kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p\") pod \"d0747945-eb36-4b4d-884c-37803fa92174\" (UID: \"d0747945-eb36-4b4d-884c-37803fa92174\") " Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.393779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.397849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.399101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts" (OuterVolumeSpecName: "scripts") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.399786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p" (OuterVolumeSpecName: "kube-api-access-9d88p") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "kube-api-access-9d88p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.447790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.471966 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data" (OuterVolumeSpecName: "config-data") pod "d0747945-eb36-4b4d-884c-37803fa92174" (UID: "d0747945-eb36-4b4d-884c-37803fa92174"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495407 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495441 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0747945-eb36-4b4d-884c-37803fa92174-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495454 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495468 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495483 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d88p\" (UniqueName: \"kubernetes.io/projected/d0747945-eb36-4b4d-884c-37803fa92174-kube-api-access-9d88p\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.495494 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0747945-eb36-4b4d-884c-37803fa92174-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.835446 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerStarted","Data":"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.835492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" 
event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerStarted","Data":"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.835507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerStarted","Data":"201453dbf37bc1a1dfbd34bb420f06869a1b1725a0e4a35cadb830ea2c57e931"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.844500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerStarted","Data":"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.851231 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" podStartSLOduration=2.851215529 podStartE2EDuration="2.851215529s" podCreationTimestamp="2026-01-20 23:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:46.850925412 +0000 UTC m=+4759.171185700" watchObservedRunningTime="2026-01-20 23:54:46.851215529 +0000 UTC m=+4759.171475817" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.852459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" event={"ID":"d0747945-eb36-4b4d-884c-37803fa92174","Type":"ContainerDied","Data":"a4092b56e3dc4010a64a7ead7478226f9154c12d62ddb74e6f7aec13b2518285"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.852502 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4092b56e3dc4010a64a7ead7478226f9154c12d62ddb74e6f7aec13b2518285" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.852561 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-db-sync-4jp4k" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.862995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerStarted","Data":"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.863044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerStarted","Data":"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.863056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerStarted","Data":"888527f66f4a0ef9165e4fd9dc2ad05e6aef6de2a802ffb8c80aba0cb8962da2"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.869959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerStarted","Data":"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.869999 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerStarted","Data":"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.870011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerStarted","Data":"dbe215f3894cb82c012f25e8864c093a6efc53729448845d411e0cc5e788bfcb"} Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.895982 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" podStartSLOduration=2.895958664 podStartE2EDuration="2.895958664s" podCreationTimestamp="2026-01-20 23:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:46.881444312 +0000 UTC m=+4759.201704600" watchObservedRunningTime="2026-01-20 23:54:46.895958664 +0000 UTC m=+4759.216218952" Jan 20 23:54:46 crc kubenswrapper[5030]: I0120 23:54:46.907565 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" podStartSLOduration=1.907545455 podStartE2EDuration="1.907545455s" podCreationTimestamp="2026-01-20 23:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:46.902425251 +0000 UTC m=+4759.222685539" watchObservedRunningTime="2026-01-20 23:54:46.907545455 +0000 UTC m=+4759.227805753" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.103669 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:54:47 crc kubenswrapper[5030]: E0120 23:54:47.103986 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0747945-eb36-4b4d-884c-37803fa92174" containerName="cinder-db-sync" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.104002 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0747945-eb36-4b4d-884c-37803fa92174" containerName="cinder-db-sync" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.104181 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0747945-eb36-4b4d-884c-37803fa92174" containerName="cinder-db-sync" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.105360 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.108637 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scripts" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.108742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.108866 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-config-data" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.108929 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-cinder-dockercfg-78dp8" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.120736 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.212828 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.214509 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.216118 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.220605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srcd\" (UniqueName: \"kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.224670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326199 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrg8\" (UniqueName: \"kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 
23:54:47.326456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srcd\" (UniqueName: \"kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.326474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.327516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.335560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.339404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.346199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.348256 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.357215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srcd\" (UniqueName: \"kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd\") pod \"cinder-scheduler-0\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.427835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.427888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.427939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrg8\" (UniqueName: \"kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.427995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.428017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.428041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.428059 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.428076 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.428819 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.433955 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.434926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.439115 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.440545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.449434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrg8\" (UniqueName: \"kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8\") pod \"cinder-api-0\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.557205 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.557574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.888964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerStarted","Data":"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb"} Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.889913 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:47 crc kubenswrapper[5030]: I0120 23:54:47.890349 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.068391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:54:48 crc kubenswrapper[5030]: W0120 23:54:48.072072 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a7344df_b142_4ed8_b56e_70045a4c9a80.slice/crio-84d912739cede0b73c7cb53b290ea92d13e77b71c9107b916f43dce7651a179d WatchSource:0}: Error finding container 84d912739cede0b73c7cb53b290ea92d13e77b71c9107b916f43dce7651a179d: Status 404 returned error can't find the container with id 84d912739cede0b73c7cb53b290ea92d13e77b71c9107b916f43dce7651a179d Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.131743 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.903196 5030 generic.go:334] "Generic (PLEG): container finished" podID="a58c134d-66fb-4e25-ac00-7fe6686da8e0" containerID="0f2d9ff94a70f371ea0c2f655df1343171f75f715efffd2ebcbd88a09a91dcdf" exitCode=0 Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.903251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" event={"ID":"a58c134d-66fb-4e25-ac00-7fe6686da8e0","Type":"ContainerDied","Data":"0f2d9ff94a70f371ea0c2f655df1343171f75f715efffd2ebcbd88a09a91dcdf"} Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 
23:54:48.905028 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerStarted","Data":"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1"} Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.905063 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerStarted","Data":"9452ae8f88d286e2631d4916f853db481b923b659f06558f8a7613c7005344c1"} Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.914546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerStarted","Data":"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e"} Jan 20 23:54:48 crc kubenswrapper[5030]: I0120 23:54:48.915125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerStarted","Data":"84d912739cede0b73c7cb53b290ea92d13e77b71c9107b916f43dce7651a179d"} Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.945995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerStarted","Data":"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460"} Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.956450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerStarted","Data":"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752"} Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.956585 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.960058 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerStarted","Data":"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f"} Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.960550 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.979829 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=2.979808225 podStartE2EDuration="2.979808225s" podCreationTimestamp="2026-01-20 23:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:49.965474466 +0000 UTC m=+4762.285734754" watchObservedRunningTime="2026-01-20 23:54:49.979808225 +0000 UTC m=+4762.300068513" Jan 20 23:54:49 crc kubenswrapper[5030]: I0120 23:54:49.996477 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.795768676 podStartE2EDuration="6.996433008s" podCreationTimestamp="2026-01-20 23:54:43 +0000 UTC" firstStartedPulling="2026-01-20 23:54:44.798599465 +0000 UTC m=+4757.118859753" lastFinishedPulling="2026-01-20 23:54:48.999263797 +0000 UTC m=+4761.319524085" observedRunningTime="2026-01-20 
23:54:49.989654923 +0000 UTC m=+4762.309915211" watchObservedRunningTime="2026-01-20 23:54:49.996433008 +0000 UTC m=+4762.316693296" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.008716 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.008700015 podStartE2EDuration="3.008700015s" podCreationTimestamp="2026-01-20 23:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:50.006462511 +0000 UTC m=+4762.326722799" watchObservedRunningTime="2026-01-20 23:54:50.008700015 +0000 UTC m=+4762.328960303" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.366596 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.501992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.502057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.502113 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.502154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.502206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766vs\" (UniqueName: \"kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.502239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts\") pod \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\" (UID: \"a58c134d-66fb-4e25-ac00-7fe6686da8e0\") " Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.507854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.507908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs" (OuterVolumeSpecName: "kube-api-access-766vs") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "kube-api-access-766vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.518988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts" (OuterVolumeSpecName: "scripts") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.523469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.532138 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.546348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data" (OuterVolumeSpecName: "config-data") pod "a58c134d-66fb-4e25-ac00-7fe6686da8e0" (UID: "a58c134d-66fb-4e25-ac00-7fe6686da8e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604596 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604673 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604696 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604717 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604741 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766vs\" (UniqueName: \"kubernetes.io/projected/a58c134d-66fb-4e25-ac00-7fe6686da8e0-kube-api-access-766vs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.604759 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a58c134d-66fb-4e25-ac00-7fe6686da8e0-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.976660 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.988412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-bootstrap-7mn5q" event={"ID":"a58c134d-66fb-4e25-ac00-7fe6686da8e0","Type":"ContainerDied","Data":"a62d9cbdd182265654d87e1dcb68001e5929de1761935e5bbfb0f7388a1fda0b"} Jan 20 23:54:50 crc kubenswrapper[5030]: I0120 23:54:50.988453 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62d9cbdd182265654d87e1dcb68001e5929de1761935e5bbfb0f7388a1fda0b" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.061784 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:54:51 crc kubenswrapper[5030]: E0120 23:54:51.062302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58c134d-66fb-4e25-ac00-7fe6686da8e0" containerName="keystone-bootstrap" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.062315 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58c134d-66fb-4e25-ac00-7fe6686da8e0" containerName="keystone-bootstrap" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.062652 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58c134d-66fb-4e25-ac00-7fe6686da8e0" containerName="keystone-bootstrap" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.063397 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.063412 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" 
Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.063539 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.068117 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-scripts" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.068362 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.068998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-public-svc" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.069221 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-config-data" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.070041 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.070683 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-keystone-internal-svc" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.092555 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-keystone-dockercfg-4w7gr" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.105252 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.111974 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217340 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217840 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217904 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.217932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.218009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.218104 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktd5\" (UniqueName: \"kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktd5\" (UniqueName: \"kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " 
pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.320430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.325222 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.325456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.326126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.327950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.328294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.328511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: 
I0120 23:54:51.328921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.337665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kktd5\" (UniqueName: \"kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5\") pod \"keystone-5c4f5f45b8-vhvkx\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.394358 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.400608 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.565410 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.569015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.628079 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.633422 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.860098 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.940788 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.943127 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.945153 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-internal-svc" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.945402 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-barbican-public-svc" Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.949852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:54:51 crc kubenswrapper[5030]: I0120 23:54:51.994768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" event={"ID":"6a35511b-3fcd-40b1-8657-c26c6bf69e50","Type":"ContainerStarted","Data":"67fab0b44a9cc5d4947b05475b62d1bdf5403f745040c339e3b475c527c80ffb"} Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.994829 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api-log" containerID="cri-o://39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" gracePeriod=30 Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.994940 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api" containerID="cri-o://59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" gracePeriod=30 Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.995206 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.995236 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.995246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:51.995509 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slr56\" (UniqueName: \"kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031665 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031749 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.031908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.133780 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134119 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134206 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slr56\" (UniqueName: \"kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.134964 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.139141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.139322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.140027 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.140778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.145767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc 
kubenswrapper[5030]: I0120 23:54:52.155799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slr56\" (UniqueName: \"kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56\") pod \"barbican-api-77b4fdf4bb-v8fpr\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.276081 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.558581 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.617964 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750635 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750738 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwrg8\" (UniqueName: \"kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750782 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.750925 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts\") pod \"52118814-66ad-4be7-a65d-2d65313b76b1\" (UID: \"52118814-66ad-4be7-a65d-2d65313b76b1\") " Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.751600 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs" (OuterVolumeSpecName: "logs") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.751666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.756013 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts" (OuterVolumeSpecName: "scripts") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.756040 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8" (OuterVolumeSpecName: "kube-api-access-xwrg8") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "kube-api-access-xwrg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.768768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.787796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.799593 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data" (OuterVolumeSpecName: "config-data") pod "52118814-66ad-4be7-a65d-2d65313b76b1" (UID: "52118814-66ad-4be7-a65d-2d65313b76b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.848913 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855877 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855956 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwrg8\" (UniqueName: \"kubernetes.io/projected/52118814-66ad-4be7-a65d-2d65313b76b1-kube-api-access-xwrg8\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855967 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855978 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/52118814-66ad-4be7-a65d-2d65313b76b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855988 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52118814-66ad-4be7-a65d-2d65313b76b1-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.855997 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:52 crc kubenswrapper[5030]: I0120 23:54:52.856006 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52118814-66ad-4be7-a65d-2d65313b76b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.020252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerStarted","Data":"ac52bbaae9279d327d926eae11a2177bd95c95212a5758e0429fa5341d1b67fc"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.020297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerStarted","Data":"a37c9f9c679f33c6c1ef9875abf21a9359cc5347c8c0bdc8c43f4516070489e2"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.023017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" event={"ID":"6a35511b-3fcd-40b1-8657-c26c6bf69e50","Type":"ContainerStarted","Data":"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.025583 5030 generic.go:334] "Generic (PLEG): container finished" podID="52118814-66ad-4be7-a65d-2d65313b76b1" containerID="59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" exitCode=0 Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.025632 5030 generic.go:334] "Generic (PLEG): container finished" podID="52118814-66ad-4be7-a65d-2d65313b76b1" 
containerID="39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" exitCode=143 Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.026227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerDied","Data":"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.026300 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerDied","Data":"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.026312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"52118814-66ad-4be7-a65d-2d65313b76b1","Type":"ContainerDied","Data":"9452ae8f88d286e2631d4916f853db481b923b659f06558f8a7613c7005344c1"} Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.026351 5030 scope.go:117] "RemoveContainer" containerID="59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.027874 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.056097 5030 scope.go:117] "RemoveContainer" containerID="39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.074396 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" podStartSLOduration=2.074354593 podStartE2EDuration="2.074354593s" podCreationTimestamp="2026-01-20 23:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:53.065845617 +0000 UTC m=+4765.386105935" watchObservedRunningTime="2026-01-20 23:54:53.074354593 +0000 UTC m=+4765.394614881" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.096070 5030 scope.go:117] "RemoveContainer" containerID="59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" Jan 20 23:54:53 crc kubenswrapper[5030]: E0120 23:54:53.100154 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f\": container with ID starting with 59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f not found: ID does not exist" containerID="59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.100523 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f"} err="failed to get container status \"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f\": rpc error: code = NotFound desc = could not find container \"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f\": container with ID starting with 59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f not found: ID does not exist" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.100591 5030 scope.go:117] "RemoveContainer" containerID="39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" 
Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.102082 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:53 crc kubenswrapper[5030]: E0120 23:54:53.106200 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1\": container with ID starting with 39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1 not found: ID does not exist" containerID="39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.106374 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1"} err="failed to get container status \"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1\": rpc error: code = NotFound desc = could not find container \"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1\": container with ID starting with 39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1 not found: ID does not exist" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.106456 5030 scope.go:117] "RemoveContainer" containerID="59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.107456 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f"} err="failed to get container status \"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f\": rpc error: code = NotFound desc = could not find container \"59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f\": container with ID starting with 59cb98dabe830459820a9ea3d375055317f50e0adf7b5d945dcea51c0fe6d55f not found: ID does not exist" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.107800 5030 scope.go:117] "RemoveContainer" containerID="39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.108222 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1"} err="failed to get container status \"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1\": rpc error: code = NotFound desc = could not find container \"39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1\": container with ID starting with 39636d5fd9d0d1a155c31557bb4986563dabab6102e792ded9eaece78db7dbc1 not found: ID does not exist" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.113863 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.121676 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:53 crc kubenswrapper[5030]: E0120 23:54:53.122126 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api-log" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.122140 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api-log" Jan 20 23:54:53 crc kubenswrapper[5030]: E0120 23:54:53.122160 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.122166 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.122318 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.122329 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" containerName="cinder-api-log" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.123912 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.127377 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.127939 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-api-config-data" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.128578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-internal-svc" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.128754 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-cinder-public-svc" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.173609 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.173757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174315 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc 
kubenswrapper[5030]: I0120 23:54:53.174371 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174529 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.174601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5xp\" (UniqueName: \"kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.275842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.275900 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.275933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.275954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.275985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.276019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.276047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq5xp\" (UniqueName: \"kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.276068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.276094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.276696 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.277511 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.282053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.283056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.283174 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.283480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.283696 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.284480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.294176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq5xp\" (UniqueName: \"kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp\") pod \"cinder-api-0\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.523013 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:53 crc kubenswrapper[5030]: I0120 23:54:53.978487 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52118814-66ad-4be7-a65d-2d65313b76b1" path="/var/lib/kubelet/pods/52118814-66ad-4be7-a65d-2d65313b76b1/volumes" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.035758 5030 generic.go:334] "Generic (PLEG): container finished" podID="5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" containerID="70a7d28cf52d5ef1d00316b0dcbf8fba7e082729deb543a60515ea02129ddf50" exitCode=0 Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.035823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" event={"ID":"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba","Type":"ContainerDied","Data":"70a7d28cf52d5ef1d00316b0dcbf8fba7e082729deb543a60515ea02129ddf50"} Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.037895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerStarted","Data":"8df38ebfb58a2a5a0af6b5b510d94cbadd6881af9e85606c92acec410f4bba95"} Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.038745 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.038781 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.040909 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.040993 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.041004 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.048566 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.064302 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:54:54 crc kubenswrapper[5030]: 
I0120 23:54:54.086832 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" podStartSLOduration=3.086814773 podStartE2EDuration="3.086814773s" podCreationTimestamp="2026-01-20 23:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:54.079993238 +0000 UTC m=+4766.400253526" watchObservedRunningTime="2026-01-20 23:54:54.086814773 +0000 UTC m=+4766.407075061" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.167949 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.168187 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.190279 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:54:54 crc kubenswrapper[5030]: I0120 23:54:54.530280 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.114123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerStarted","Data":"2bbc2f8fdbf5ffee6d3a44f5227cab09b42416f5ef6e76d016d8424b2478caa8"} Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.114360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerStarted","Data":"51f0805c6390247f2ae841b8e1ff556a76fa7b50d0cbf1368262d075ed69acf5"} Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.526040 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.625148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config\") pod \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.625301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle\") pod \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.625405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjnjx\" (UniqueName: \"kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx\") pod \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\" (UID: \"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba\") " Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.633848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx" (OuterVolumeSpecName: "kube-api-access-qjnjx") pod "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" (UID: "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba"). InnerVolumeSpecName "kube-api-access-qjnjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.657691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config" (OuterVolumeSpecName: "config") pod "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" (UID: "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.660092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" (UID: "5d5df3a2-8ec2-4eed-93ba-f7ee82646dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.727048 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjnjx\" (UniqueName: \"kubernetes.io/projected/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-kube-api-access-qjnjx\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.727085 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:55 crc kubenswrapper[5030]: I0120 23:54:55.727097 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.125803 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerStarted","Data":"cf937266b65a833ea2519d5e345ec986e9d92e36c9c09cd00ad3dbe11dbc94df"} Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.128077 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.137936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" event={"ID":"5d5df3a2-8ec2-4eed-93ba-f7ee82646dba","Type":"ContainerDied","Data":"2e02f41599e5cb59701eb73c30a7adb60f7a75a8b99f1625e420fea9ba328af3"} Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.137975 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e02f41599e5cb59701eb73c30a7adb60f7a75a8b99f1625e420fea9ba328af3" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.138052 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-db-sync-z8bpm" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.190839 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-api-0" podStartSLOduration=3.190819064 podStartE2EDuration="3.190819064s" podCreationTimestamp="2026-01-20 23:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:56.161591375 +0000 UTC m=+4768.481851663" watchObservedRunningTime="2026-01-20 23:54:56.190819064 +0000 UTC m=+4768.511079352" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.255928 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:54:56 crc kubenswrapper[5030]: E0120 23:54:56.256338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" containerName="neutron-db-sync" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.256355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" containerName="neutron-db-sync" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.256567 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" containerName="neutron-db-sync" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.257604 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.261166 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-config" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.261260 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-ovndbs" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.263240 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-httpd-config" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.263383 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-neutron-dockercfg-7r9rk" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.269580 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.351693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q269f\" (UniqueName: \"kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.351758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.351808 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.351852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.351870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.453603 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.453688 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.453735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.453753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.453860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q269f\" (UniqueName: \"kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.458756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.458776 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.461161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.468369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.481596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q269f\" (UniqueName: \"kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f\") pod \"neutron-bb87bd44f-r9nlp\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.584853 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:56 crc kubenswrapper[5030]: I0120 23:54:56.990566 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:57 crc kubenswrapper[5030]: I0120 23:54:57.025032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:54:57 crc kubenswrapper[5030]: I0120 23:54:57.091942 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:54:57 crc kubenswrapper[5030]: I0120 23:54:57.161022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerStarted","Data":"90479ca1996df68de44c021e15d59988df0704d15e43620758bd60270ab2597d"} Jan 20 23:54:57 crc kubenswrapper[5030]: I0120 23:54:57.799856 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:54:57 crc kubenswrapper[5030]: I0120 23:54:57.856859 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.168593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerStarted","Data":"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572"} Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.168657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerStarted","Data":"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd"} Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.168881 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.168993 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="cinder-scheduler" containerID="cri-o://2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e" gracePeriod=30 Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.169012 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="probe" containerID="cri-o://5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460" gracePeriod=30 Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.193482 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" podStartSLOduration=2.193460925 podStartE2EDuration="2.193460925s" podCreationTimestamp="2026-01-20 23:54:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:54:58.188131605 +0000 UTC m=+4770.508391893" watchObservedRunningTime="2026-01-20 23:54:58.193460925 +0000 UTC m=+4770.513721213" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.581108 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.587693 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.592190 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-public-svc" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.592851 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-neutron-internal-svc" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.598324 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wtf6\" (UniqueName: \"kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691401 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 
23:54:58.691474 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.691856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793378 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793465 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793532 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793567 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: 
I0120 23:54:58.793586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.793607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wtf6\" (UniqueName: \"kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.799541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.800400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.800798 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.816302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.816653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.817224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.818220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wtf6\" (UniqueName: \"kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6\") pod \"neutron-8f95b5bc4-cgjm4\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.892685 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:54:58 crc kubenswrapper[5030]: I0120 23:54:58.910575 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:54:59 crc kubenswrapper[5030]: I0120 23:54:59.185702 5030 generic.go:334] "Generic (PLEG): container finished" podID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerID="5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460" exitCode=0 Jan 20 23:54:59 crc kubenswrapper[5030]: I0120 23:54:59.186737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerDied","Data":"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460"} Jan 20 23:54:59 crc kubenswrapper[5030]: I0120 23:54:59.354065 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:54:59 crc kubenswrapper[5030]: I0120 23:54:59.999102 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117064 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117179 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5srcd\" (UniqueName: \"kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle\") pod \"7a7344df-b142-4ed8-b56e-70045a4c9a80\" (UID: \"7a7344df-b142-4ed8-b56e-70045a4c9a80\") " Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.117842 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a7344df-b142-4ed8-b56e-70045a4c9a80-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.121900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.122878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts" (OuterVolumeSpecName: "scripts") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.125762 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd" (OuterVolumeSpecName: "kube-api-access-5srcd") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "kube-api-access-5srcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.172018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.201768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerStarted","Data":"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305"} Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.201815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerStarted","Data":"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500"} Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.201824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerStarted","Data":"695f1206946234ddbf81a6b4ceb02882ff0365de646682eedf8bdefae4f51335"} Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.202600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.207950 5030 generic.go:334] "Generic (PLEG): container finished" podID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerID="2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e" exitCode=0 Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.207992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerDied","Data":"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e"} Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.208014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"7a7344df-b142-4ed8-b56e-70045a4c9a80","Type":"ContainerDied","Data":"84d912739cede0b73c7cb53b290ea92d13e77b71c9107b916f43dce7651a179d"} Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.208032 5030 scope.go:117] "RemoveContainer" containerID="5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.208166 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.221269 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.221312 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.221329 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5srcd\" (UniqueName: \"kubernetes.io/projected/7a7344df-b142-4ed8-b56e-70045a4c9a80-kube-api-access-5srcd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.221341 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.235040 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" podStartSLOduration=2.23502209 podStartE2EDuration="2.23502209s" podCreationTimestamp="2026-01-20 23:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:00.222787872 +0000 UTC m=+4772.543048160" watchObservedRunningTime="2026-01-20 23:55:00.23502209 +0000 UTC m=+4772.555282388" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.240824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data" (OuterVolumeSpecName: "config-data") pod "7a7344df-b142-4ed8-b56e-70045a4c9a80" (UID: "7a7344df-b142-4ed8-b56e-70045a4c9a80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.279233 5030 scope.go:117] "RemoveContainer" containerID="2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.301104 5030 scope.go:117] "RemoveContainer" containerID="5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460" Jan 20 23:55:00 crc kubenswrapper[5030]: E0120 23:55:00.301561 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460\": container with ID starting with 5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460 not found: ID does not exist" containerID="5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.301593 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460"} err="failed to get container status \"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460\": rpc error: code = NotFound desc = could not find container \"5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460\": container with ID starting with 5883cf8e5ed479df0e086b82e2d084c87c1034103ce10b7652703f48e4d4a460 not found: ID does not exist" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.301615 5030 scope.go:117] "RemoveContainer" containerID="2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e" Jan 20 23:55:00 crc kubenswrapper[5030]: E0120 23:55:00.301893 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e\": container with ID starting with 2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e not found: ID does not exist" containerID="2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.301912 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e"} err="failed to get container status \"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e\": rpc error: code = NotFound desc = could not find container \"2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e\": container with ID starting with 2efa82fc92e7223d08fa7b7db2c4b8535d63caa3e86275361a439fefb37a072e not found: ID does not exist" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.322802 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a7344df-b142-4ed8-b56e-70045a4c9a80-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.440182 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.521984 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.522272 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" 
podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api-log" containerID="cri-o://29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388" gracePeriod=30 Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.522405 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api" containerID="cri-o://814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0" gracePeriod=30 Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.578441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.586340 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.610394 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:55:00 crc kubenswrapper[5030]: E0120 23:55:00.611040 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="cinder-scheduler" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.611056 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="cinder-scheduler" Jan 20 23:55:00 crc kubenswrapper[5030]: E0120 23:55:00.611075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="probe" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.611082 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="probe" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.611270 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="probe" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.611295 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" containerName="cinder-scheduler" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.612239 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.617051 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cinder-scheduler-config-data" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.629578 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.729496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.730059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.730198 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.730302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5s8x\" (UniqueName: \"kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.730422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.730528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832317 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " 
pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832593 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832709 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5s8x\" (UniqueName: \"kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832796 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.832866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.839835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.839963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.841087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.841419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.854839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-n5s8x\" (UniqueName: \"kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x\") pod \"cinder-scheduler-0\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:00 crc kubenswrapper[5030]: I0120 23:55:00.939154 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:01 crc kubenswrapper[5030]: I0120 23:55:01.227118 5030 generic.go:334] "Generic (PLEG): container finished" podID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerID="29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388" exitCode=143 Jan 20 23:55:01 crc kubenswrapper[5030]: I0120 23:55:01.227355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerDied","Data":"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388"} Jan 20 23:55:01 crc kubenswrapper[5030]: I0120 23:55:01.418448 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:55:01 crc kubenswrapper[5030]: W0120 23:55:01.428393 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7305e60_aa68_4d37_a472_80060d0fb5cb.slice/crio-6c91c04d39134fe91bc0213b9945f716857fdb4d5d2d88bbccb158eacac87b1f WatchSource:0}: Error finding container 6c91c04d39134fe91bc0213b9945f716857fdb4d5d2d88bbccb158eacac87b1f: Status 404 returned error can't find the container with id 6c91c04d39134fe91bc0213b9945f716857fdb4d5d2d88bbccb158eacac87b1f Jan 20 23:55:01 crc kubenswrapper[5030]: I0120 23:55:01.986563 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a7344df-b142-4ed8-b56e-70045a4c9a80" path="/var/lib/kubelet/pods/7a7344df-b142-4ed8-b56e-70045a4c9a80/volumes" Jan 20 23:55:02 crc kubenswrapper[5030]: I0120 23:55:02.251226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerStarted","Data":"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad"} Jan 20 23:55:02 crc kubenswrapper[5030]: I0120 23:55:02.251324 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerStarted","Data":"6c91c04d39134fe91bc0213b9945f716857fdb4d5d2d88bbccb158eacac87b1f"} Jan 20 23:55:03 crc kubenswrapper[5030]: I0120 23:55:03.262992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerStarted","Data":"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b"} Jan 20 23:55:03 crc kubenswrapper[5030]: I0120 23:55:03.286204 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/cinder-scheduler-0" podStartSLOduration=3.286186646 podStartE2EDuration="3.286186646s" podCreationTimestamp="2026-01-20 23:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:03.286099674 +0000 UTC m=+4775.606359962" watchObservedRunningTime="2026-01-20 23:55:03.286186646 +0000 UTC m=+4775.606446934" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 
23:55:04.161453 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.212288 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle\") pod \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.212368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbmss\" (UniqueName: \"kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss\") pod \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.212407 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom\") pod \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.212464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs\") pod \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.212502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data\") pod \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\" (UID: \"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4\") " Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.213038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs" (OuterVolumeSpecName: "logs") pod "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" (UID: "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.229875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss" (OuterVolumeSpecName: "kube-api-access-qbmss") pod "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" (UID: "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4"). InnerVolumeSpecName "kube-api-access-qbmss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.230862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" (UID: "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.251893 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" (UID: "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.278167 5030 generic.go:334] "Generic (PLEG): container finished" podID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerID="814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0" exitCode=0 Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.278261 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.278261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerDied","Data":"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0"} Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.278306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-f487f85c8-pl27h" event={"ID":"f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4","Type":"ContainerDied","Data":"dbe215f3894cb82c012f25e8864c093a6efc53729448845d411e0cc5e788bfcb"} Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.278324 5030 scope.go:117] "RemoveContainer" containerID="814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.306353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data" (OuterVolumeSpecName: "config-data") pod "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" (UID: "f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.315253 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.315511 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbmss\" (UniqueName: \"kubernetes.io/projected/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-kube-api-access-qbmss\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.315705 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.315915 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.316031 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.327135 5030 scope.go:117] "RemoveContainer" containerID="29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.362558 5030 scope.go:117] "RemoveContainer" containerID="814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0" Jan 20 23:55:04 crc kubenswrapper[5030]: E0120 23:55:04.363311 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0\": container with ID starting with 814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0 not found: ID does not exist" containerID="814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.363360 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0"} err="failed to get container status \"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0\": rpc error: code = NotFound desc = could not find container \"814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0\": container with ID starting with 814b477ae880ccaca92b5c785cd2d2553e82b02298aabbeef4db6c3a1baebad0 not found: ID does not exist" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.363387 5030 scope.go:117] "RemoveContainer" containerID="29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388" Jan 20 23:55:04 crc kubenswrapper[5030]: E0120 23:55:04.363848 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388\": container with ID starting with 29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388 not found: ID does not exist" containerID="29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.363919 5030 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388"} err="failed to get container status \"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388\": rpc error: code = NotFound desc = could not find container \"29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388\": container with ID starting with 29f7e642a020b85679305a6db841e40a748dd5191ddeefc75f16c3da7cbc9388 not found: ID does not exist" Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.651018 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:55:04 crc kubenswrapper[5030]: I0120 23:55:04.665587 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-f487f85c8-pl27h"] Jan 20 23:55:05 crc kubenswrapper[5030]: I0120 23:55:05.199107 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:55:05 crc kubenswrapper[5030]: I0120 23:55:05.939839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:05 crc kubenswrapper[5030]: I0120 23:55:05.984691 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" path="/var/lib/kubelet/pods/f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4/volumes" Jan 20 23:55:11 crc kubenswrapper[5030]: I0120 23:55:11.178580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:55:14 crc kubenswrapper[5030]: I0120 23:55:14.304932 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:15 crc kubenswrapper[5030]: I0120 23:55:15.331739 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:55:15 crc kubenswrapper[5030]: I0120 23:55:15.336290 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:55:22 crc kubenswrapper[5030]: I0120 23:55:22.880199 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.124638 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:55:26 crc kubenswrapper[5030]: E0120 23:55:26.125492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api-log" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.125507 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api-log" Jan 20 23:55:26 crc kubenswrapper[5030]: E0120 23:55:26.125556 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.125565 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.125779 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" 
containerName="barbican-api" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.125806 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99fdfeb-7bb8-4ac4-b21a-e8b0ce724fd4" containerName="barbican-api-log" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.126886 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.133215 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"swift-proxy-config-data" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.133365 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-public-svc" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.133488 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-swift-internal-svc" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.149253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.154440 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.154499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.154528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.155310 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.155359 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64pf\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.155376 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: 
\"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.155428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.155468 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256343 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64pf\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256452 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.256487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.257895 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.258151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.343960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.344423 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.344703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.346505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.346965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.346994 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64pf\" (UniqueName: 
\"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf\") pod \"swift-proxy-58bb755474-jnzx5\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.448498 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:26 crc kubenswrapper[5030]: I0120 23:55:26.907935 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.339671 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.503155 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.504445 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.518994 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-config" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.519432 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstackclient-openstackclient-dockercfg-nc72q" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.519683 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-config-secret" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.542937 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.584537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerStarted","Data":"63856504b66b5735168be1f4f4e022e263dccc76e1d62bc9633b316c047c486e"} Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.597393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bftd\" (UniqueName: \"kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.597460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.597538 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.597570 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.699124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bftd\" (UniqueName: \"kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.699177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.699269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.699303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.700318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.703399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.703564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.750473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bftd\" (UniqueName: \"kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd\") pod \"openstackclient\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.867755 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.938223 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.938478 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-central-agent" containerID="cri-o://1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54" gracePeriod=30 Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.938584 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="proxy-httpd" containerID="cri-o://0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752" gracePeriod=30 Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.938860 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="sg-core" containerID="cri-o://1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb" gracePeriod=30 Jan 20 23:55:27 crc kubenswrapper[5030]: I0120 23:55:27.938911 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-notification-agent" containerID="cri-o://2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9" gracePeriod=30 Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.588348 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598162 5030 generic.go:334] "Generic (PLEG): container finished" podID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerID="0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752" exitCode=0 Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598222 5030 generic.go:334] "Generic (PLEG): container finished" podID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerID="1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb" exitCode=2 Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598235 5030 generic.go:334] "Generic (PLEG): container finished" podID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerID="1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54" exitCode=0 Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerDied","Data":"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752"} Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerDied","Data":"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb"} Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.598321 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerDied","Data":"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54"} Jan 20 23:55:28 crc 
kubenswrapper[5030]: I0120 23:55:28.600687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerStarted","Data":"9ca4957d78a8a6a00d4df062b5abb4a6f62840672dc365ddc417d1f2516cfcb4"} Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.600735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerStarted","Data":"e48bf78e90710aa76a16ac4f0324a165f77700561ddfe100c023788b47cc3e69"} Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.601287 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.637660 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" podStartSLOduration=2.637612527 podStartE2EDuration="2.637612527s" podCreationTimestamp="2026-01-20 23:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:28.628840123 +0000 UTC m=+4800.949100431" watchObservedRunningTime="2026-01-20 23:55:28.637612527 +0000 UTC m=+4800.957872855" Jan 20 23:55:28 crc kubenswrapper[5030]: I0120 23:55:28.931897 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.002168 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.002480 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-api" containerID="cri-o://a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd" gracePeriod=30 Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.002841 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-httpd" containerID="cri-o://e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572" gracePeriod=30 Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.610281 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerID="e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572" exitCode=0 Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.610336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerDied","Data":"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572"} Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.612854 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" event={"ID":"525b8de8-34a8-4bd1-b426-ed17becbab12","Type":"ContainerStarted","Data":"8b93f1e880b42dec39213a1363781794f9e5e185249a6d3eb089db7e9620bfcd"} Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.612882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstackclient" 
event={"ID":"525b8de8-34a8-4bd1-b426-ed17becbab12","Type":"ContainerStarted","Data":"b7ab60b5e63d6c19351b76248e7db4e407a50ba86f37f72a6524aa66915c6655"} Jan 20 23:55:29 crc kubenswrapper[5030]: I0120 23:55:29.612904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.097979 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.120898 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/openstackclient" podStartSLOduration=3.120879999 podStartE2EDuration="3.120879999s" podCreationTimestamp="2026-01-20 23:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:29.631744503 +0000 UTC m=+4801.952004791" watchObservedRunningTime="2026-01-20 23:55:30.120879999 +0000 UTC m=+4802.441140287" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.247920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tgj9\" (UniqueName: \"kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd\") pod \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\" (UID: \"8fbe6d2d-126d-4a16-88d5-bf143b2e498b\") " Jan 
20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.248971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.249883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.254984 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9" (OuterVolumeSpecName: "kube-api-access-4tgj9") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "kube-api-access-4tgj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.256118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts" (OuterVolumeSpecName: "scripts") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.294700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.350435 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.350470 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.350490 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tgj9\" (UniqueName: \"kubernetes.io/projected/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-kube-api-access-4tgj9\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.350503 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.350518 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.354481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.360773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data" (OuterVolumeSpecName: "config-data") pod "8fbe6d2d-126d-4a16-88d5-bf143b2e498b" (UID: "8fbe6d2d-126d-4a16-88d5-bf143b2e498b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.451780 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.451816 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbe6d2d-126d-4a16-88d5-bf143b2e498b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.622277 5030 generic.go:334] "Generic (PLEG): container finished" podID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerID="2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9" exitCode=0 Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.622352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerDied","Data":"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9"} Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.622363 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.623713 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8fbe6d2d-126d-4a16-88d5-bf143b2e498b","Type":"ContainerDied","Data":"7d6f313e5aa5fac92719a695fbadf0f8003c9b241ff0bacec51a7ca74d8891c6"} Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.623787 5030 scope.go:117] "RemoveContainer" containerID="0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.657984 5030 scope.go:117] "RemoveContainer" containerID="1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.661001 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.675114 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.681137 5030 scope.go:117] "RemoveContainer" containerID="2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.701096 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.702033 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="proxy-httpd" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702063 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="proxy-httpd" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.702102 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-central-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702117 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-central-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.702145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-notification-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-notification-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.702195 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="sg-core" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="sg-core" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702559 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-central-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702594 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="ceilometer-notification-agent" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702649 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="proxy-httpd" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.702683 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" containerName="sg-core" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.709811 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.713529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.744990 5030 scope.go:117] "RemoveContainer" containerID="1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.745227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.745758 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.823876 5030 scope.go:117] "RemoveContainer" containerID="0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.824409 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752\": container with ID starting with 0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752 not found: ID does not exist" containerID="0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.824438 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752"} err="failed to get container status \"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752\": rpc error: code = NotFound desc = could not find container \"0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752\": container with ID starting with 0d79518f69b3443188b9f81d15a173595b8c99f49e13bf48bf0f078a5d181752 not found: ID does not exist" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.824458 5030 scope.go:117] "RemoveContainer" containerID="1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.824750 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb\": container with ID starting with 1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb not found: ID does not exist" containerID="1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.824771 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb"} err="failed to get container status \"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb\": rpc error: code = NotFound desc = could not find container \"1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb\": container with ID starting with 
1c4217b930c25c0b6d190299fdaf7767c7a4f55ceafbf01214b79b6d073f42bb not found: ID does not exist" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.824785 5030 scope.go:117] "RemoveContainer" containerID="2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.825003 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9\": container with ID starting with 2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9 not found: ID does not exist" containerID="2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.825022 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9"} err="failed to get container status \"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9\": rpc error: code = NotFound desc = could not find container \"2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9\": container with ID starting with 2abe2d924f0000c1aaadcca65d6d7bf7b58064c5f5cae2a648f5d11c73e94ca9 not found: ID does not exist" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.825034 5030 scope.go:117] "RemoveContainer" containerID="1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54" Jan 20 23:55:30 crc kubenswrapper[5030]: E0120 23:55:30.825417 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54\": container with ID starting with 1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54 not found: ID does not exist" containerID="1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.825460 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54"} err="failed to get container status \"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54\": rpc error: code = NotFound desc = could not find container \"1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54\": container with ID starting with 1d5cb939e54086daf239a43fbb24b45799f0f4e24b46023811eb433994349f54 not found: ID does not exist" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863298 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.863353 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgrv\" (UniqueName: \"kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgrv\" (UniqueName: \"kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " 
pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964928 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.964974 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.965347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.965787 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.971068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.972285 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.979781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.983105 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:30 crc kubenswrapper[5030]: I0120 23:55:30.988915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgrv\" (UniqueName: \"kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv\") pod \"ceilometer-0\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:31 crc kubenswrapper[5030]: I0120 23:55:31.117040 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:31 crc kubenswrapper[5030]: I0120 23:55:31.575736 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:31 crc kubenswrapper[5030]: I0120 23:55:31.640798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerStarted","Data":"a9627c9884e881dda612262bd7e89e3e043c2ce657f3c6c1f41d1f686cd3802e"} Jan 20 23:55:31 crc kubenswrapper[5030]: I0120 23:55:31.977548 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbe6d2d-126d-4a16-88d5-bf143b2e498b" path="/var/lib/kubelet/pods/8fbe6d2d-126d-4a16-88d5-bf143b2e498b/volumes" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.474071 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.594499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config\") pod \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.594907 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config\") pod \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.595078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle\") pod \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.595165 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs\") pod \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.595340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q269f\" (UniqueName: \"kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f\") pod \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\" (UID: \"d6b58aa5-de62-4813-a617-8c78a3fff4d5\") " Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.600679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d6b58aa5-de62-4813-a617-8c78a3fff4d5" (UID: "d6b58aa5-de62-4813-a617-8c78a3fff4d5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.600745 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f" (OuterVolumeSpecName: "kube-api-access-q269f") pod "d6b58aa5-de62-4813-a617-8c78a3fff4d5" (UID: "d6b58aa5-de62-4813-a617-8c78a3fff4d5"). 
InnerVolumeSpecName "kube-api-access-q269f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.641612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6b58aa5-de62-4813-a617-8c78a3fff4d5" (UID: "d6b58aa5-de62-4813-a617-8c78a3fff4d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.649525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerStarted","Data":"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349"} Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.651880 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerID="a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd" exitCode=0 Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.651922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerDied","Data":"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd"} Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.651954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" event={"ID":"d6b58aa5-de62-4813-a617-8c78a3fff4d5","Type":"ContainerDied","Data":"90479ca1996df68de44c021e15d59988df0704d15e43620758bd60270ab2597d"} Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.651962 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-bb87bd44f-r9nlp" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.651973 5030 scope.go:117] "RemoveContainer" containerID="e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.656014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config" (OuterVolumeSpecName: "config") pod "d6b58aa5-de62-4813-a617-8c78a3fff4d5" (UID: "d6b58aa5-de62-4813-a617-8c78a3fff4d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.670054 5030 scope.go:117] "RemoveContainer" containerID="a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.691431 5030 scope.go:117] "RemoveContainer" containerID="e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572" Jan 20 23:55:32 crc kubenswrapper[5030]: E0120 23:55:32.691840 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572\": container with ID starting with e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572 not found: ID does not exist" containerID="e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.691881 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572"} err="failed to get container status \"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572\": rpc error: code = NotFound desc = could not find container \"e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572\": container with ID starting with e6e7888360d872f24a36db36807e01b2591cc55656ff1e0d723621a752039572 not found: ID does not exist" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.691906 5030 scope.go:117] "RemoveContainer" containerID="a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd" Jan 20 23:55:32 crc kubenswrapper[5030]: E0120 23:55:32.692123 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd\": container with ID starting with a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd not found: ID does not exist" containerID="a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.692147 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd"} err="failed to get container status \"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd\": rpc error: code = NotFound desc = could not find container \"a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd\": container with ID starting with a63bfb247ca4df6b5be3ebcaa0ea553a69e80ba8b8d075b0efcf142c71c5b4fd not found: ID does not exist" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.693018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d6b58aa5-de62-4813-a617-8c78a3fff4d5" (UID: "d6b58aa5-de62-4813-a617-8c78a3fff4d5"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.697047 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.697072 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.697083 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.697096 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q269f\" (UniqueName: \"kubernetes.io/projected/d6b58aa5-de62-4813-a617-8c78a3fff4d5-kube-api-access-q269f\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.697106 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b58aa5-de62-4813-a617-8c78a3fff4d5-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:32 crc kubenswrapper[5030]: I0120 23:55:32.997190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:55:33 crc kubenswrapper[5030]: I0120 23:55:33.008306 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-bb87bd44f-r9nlp"] Jan 20 23:55:33 crc kubenswrapper[5030]: I0120 23:55:33.664770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerStarted","Data":"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099"} Jan 20 23:55:33 crc kubenswrapper[5030]: I0120 23:55:33.975554 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" path="/var/lib/kubelet/pods/d6b58aa5-de62-4813-a617-8c78a3fff4d5/volumes" Jan 20 23:55:34 crc kubenswrapper[5030]: I0120 23:55:34.677546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerStarted","Data":"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59"} Jan 20 23:55:36 crc kubenswrapper[5030]: I0120 23:55:36.135897 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:36 crc kubenswrapper[5030]: I0120 23:55:36.454934 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:36 crc kubenswrapper[5030]: I0120 23:55:36.459400 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:55:40 crc kubenswrapper[5030]: I0120 23:55:40.157225 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:55:40 crc kubenswrapper[5030]: I0120 
23:55:40.157560 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:55:41 crc kubenswrapper[5030]: I0120 23:55:41.877986 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:41 crc kubenswrapper[5030]: I0120 23:55:41.878836 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-log" containerID="cri-o://45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae" gracePeriod=30 Jan 20 23:55:41 crc kubenswrapper[5030]: I0120 23:55:41.878912 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-httpd" containerID="cri-o://534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93" gracePeriod=30 Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.626139 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.626698 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-log" containerID="cri-o://f0f6dcbbc1beacc6118f560448c544e57f674e2e974643154ee6c4c223e11d73" gracePeriod=30 Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.626815 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-httpd" containerID="cri-o://2fc6210e9ab2da522085f52343b3e8c26de15701510867d1f0192cef9767fbc7" gracePeriod=30 Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.782100 5030 generic.go:334] "Generic (PLEG): container finished" podID="82663c55-6190-4f89-a81a-4950835abad4" containerID="f0f6dcbbc1beacc6118f560448c544e57f674e2e974643154ee6c4c223e11d73" exitCode=143 Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.782173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerDied","Data":"f0f6dcbbc1beacc6118f560448c544e57f674e2e974643154ee6c4c223e11d73"} Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.784061 5030 generic.go:334] "Generic (PLEG): container finished" podID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerID="45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae" exitCode=143 Jan 20 23:55:42 crc kubenswrapper[5030]: I0120 23:55:42.784113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerDied","Data":"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae"} Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.796839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerStarted","Data":"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0"} Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.798142 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.797546 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-central-agent" containerID="cri-o://8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349" gracePeriod=30 Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.798332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="sg-core" containerID="cri-o://4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59" gracePeriod=30 Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.798302 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="proxy-httpd" containerID="cri-o://3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0" gracePeriod=30 Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.798446 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-notification-agent" containerID="cri-o://3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099" gracePeriod=30 Jan 20 23:55:43 crc kubenswrapper[5030]: I0120 23:55:43.839415 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.256320781 podStartE2EDuration="13.83939577s" podCreationTimestamp="2026-01-20 23:55:30 +0000 UTC" firstStartedPulling="2026-01-20 23:55:31.598380581 +0000 UTC m=+4803.918640909" lastFinishedPulling="2026-01-20 23:55:43.1814556 +0000 UTC m=+4815.501715898" observedRunningTime="2026-01-20 23:55:43.824207252 +0000 UTC m=+4816.144467540" watchObservedRunningTime="2026-01-20 23:55:43.83939577 +0000 UTC m=+4816.159656068" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.578449 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tr47"] Jan 20 23:55:44 crc kubenswrapper[5030]: E0120 23:55:44.579116 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-api" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.579136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-api" Jan 20 23:55:44 crc kubenswrapper[5030]: E0120 23:55:44.579176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-httpd" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.579186 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-httpd" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.579411 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-httpd" Jan 20 
23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.579433 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b58aa5-de62-4813-a617-8c78a3fff4d5" containerName="neutron-api" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.580161 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.596395 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tr47"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.743829 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xns\" (UniqueName: \"kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.744091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.771200 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-57x7b"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.772425 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.792727 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.794274 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.798922 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.810156 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-57x7b"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.822800 5030 generic.go:334] "Generic (PLEG): container finished" podID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerID="3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0" exitCode=0 Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.822969 5030 generic.go:334] "Generic (PLEG): container finished" podID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerID="4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59" exitCode=2 Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.823028 5030 generic.go:334] "Generic (PLEG): container finished" podID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerID="8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349" exitCode=0 Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.823095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerDied","Data":"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0"} Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.823165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerDied","Data":"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59"} Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.823229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerDied","Data":"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349"} Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.823391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.845996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.846087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xns\" (UniqueName: \"kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.851429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:44 crc 
kubenswrapper[5030]: I0120 23:55:44.877337 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qgdsf"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.879070 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.886921 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qgdsf"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.948718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlf9\" (UniqueName: \"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.948779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgk2p\" (UniqueName: \"kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.948977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.949113 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.971519 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq"] Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.972583 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.976523 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-db-secret" Jan 20 23:55:44 crc kubenswrapper[5030]: I0120 23:55:44.994601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.044611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xns\" (UniqueName: \"kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns\") pod \"nova-api-db-create-7tr47\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.051964 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052035 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052085 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlf9\" (UniqueName: \"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgk2p\" (UniqueName: \"kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052237 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbm5b\" (UniqueName: \"kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.052834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.053834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.069312 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlf9\" (UniqueName: \"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9\") pod \"nova-cell0-db-create-57x7b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.110218 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.153600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.153668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.153785 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbm5b\" (UniqueName: \"kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.153805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xfq\" (UniqueName: \"kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.154494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.155476 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgk2p\" (UniqueName: 
\"kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p\") pod \"nova-api-5ec4-account-create-update-5twgc\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.179019 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbm5b\" (UniqueName: \"kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b\") pod \"nova-cell1-db-create-qgdsf\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.184147 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.186469 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.189892 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-db-secret" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.200641 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.203735 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.236991 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.255128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xfq\" (UniqueName: \"kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.255232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.255948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.280193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xfq\" (UniqueName: \"kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq\") pod \"nova-cell0-0c78-account-create-update-sngpq\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " 
pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.298295 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.357830 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.358296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbs9n\" (UniqueName: \"kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.426300 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.460486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbs9n\" (UniqueName: \"kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.460832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.461573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.486194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbs9n\" (UniqueName: \"kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n\") pod \"nova-cell1-ae66-account-create-update-4tgbf\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.552099 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.563945 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663756 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlgrv\" (UniqueName: \"kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.664047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.663788 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.664231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts\") pod \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\" (UID: \"35c3804a-1fb1-4d5b-9a22-8d3075e365b5\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.664748 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.664765 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.667595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv" (OuterVolumeSpecName: "kube-api-access-nlgrv") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "kube-api-access-nlgrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.668330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts" (OuterVolumeSpecName: "scripts") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.685772 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.698848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.762820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data" (OuterVolumeSpecName: "config-data") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.766416 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.766436 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlgrv\" (UniqueName: \"kubernetes.io/projected/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-kube-api-access-nlgrv\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.766446 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.766456 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.777898 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35c3804a-1fb1-4d5b-9a22-8d3075e365b5" (UID: "35c3804a-1fb1-4d5b-9a22-8d3075e365b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.837379 5030 generic.go:334] "Generic (PLEG): container finished" podID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerID="3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099" exitCode=0 Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.837463 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerDied","Data":"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099"} Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.837512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"35c3804a-1fb1-4d5b-9a22-8d3075e365b5","Type":"ContainerDied","Data":"a9627c9884e881dda612262bd7e89e3e043c2ce657f3c6c1f41d1f686cd3802e"} Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.837531 5030 scope.go:117] "RemoveContainer" containerID="3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.837721 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.847193 5030 generic.go:334] "Generic (PLEG): container finished" podID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerID="534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93" exitCode=0 Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.847236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerDied","Data":"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93"} Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.847259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"42cd0b4c-f4c4-40a8-b181-1aec66c793c8","Type":"ContainerDied","Data":"2f6b6feb6fa5a01bc974310597e12aeadf1d477b330f823af506702401a0d3ae"} Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.847309 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871346 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsl57\" (UniqueName: \"kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" 
(UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.871587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs\") pod \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\" (UID: \"42cd0b4c-f4c4-40a8-b181-1aec66c793c8\") " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.872033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.872522 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.872566 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35c3804a-1fb1-4d5b-9a22-8d3075e365b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.872951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs" (OuterVolumeSpecName: "logs") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.885843 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.891281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts" (OuterVolumeSpecName: "scripts") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.900084 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57" (OuterVolumeSpecName: "kube-api-access-xsl57") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "kube-api-access-xsl57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.914572 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.922077 5030 scope.go:117] "RemoveContainer" containerID="4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.923759 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.944025 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948328 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-central-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948350 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-central-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948370 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-notification-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948376 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-notification-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948386 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="proxy-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948392 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="proxy-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948417 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948423 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948432 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-log" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948438 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-log" Jan 20 23:55:45 crc kubenswrapper[5030]: E0120 23:55:45.948449 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="sg-core" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948455 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="sg-core" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948615 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" containerName="glance-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948642 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-notification-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948653 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" 
containerName="glance-log" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948664 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="proxy-httpd" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948672 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="ceilometer-central-agent" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.948689 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" containerName="sg-core" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.949103 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data" (OuterVolumeSpecName: "config-data") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.950236 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.952048 5030 scope.go:117] "RemoveContainer" containerID="3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.952925 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.953211 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.958304 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-57x7b"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.967533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976410 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976437 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsl57\" (UniqueName: \"kubernetes.io/projected/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-kube-api-access-xsl57\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976460 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976470 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976478 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.976487 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.984141 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c3804a-1fb1-4d5b-9a22-8d3075e365b5" path="/var/lib/kubelet/pods/35c3804a-1fb1-4d5b-9a22-8d3075e365b5/volumes" Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.986923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qgdsf"] Jan 20 23:55:45 crc kubenswrapper[5030]: I0120 23:55:45.987069 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.001587 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.013878 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tr47"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.016251 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.026558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "42cd0b4c-f4c4-40a8-b181-1aec66c793c8" (UID: "42cd0b4c-f4c4-40a8-b181-1aec66c793c8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.032387 5030 scope.go:117] "RemoveContainer" containerID="8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.082815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4j9v\" (UniqueName: \"kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083384 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42cd0b4c-f4c4-40a8-b181-1aec66c793c8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.083394 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.084018 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.088661 5030 scope.go:117] "RemoveContainer" containerID="3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.090379 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0\": container with ID starting with 3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0 not found: ID does not exist" containerID="3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.090414 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0"} err="failed to get container status \"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0\": rpc error: code = NotFound desc = could not find container \"3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0\": container with ID starting with 3aa122a78832916cf1101dc87c6f3c54ff42d43151a118aaa2e0d67fe1d452d0 not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.090450 5030 scope.go:117] "RemoveContainer" containerID="4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.091008 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59\": container with ID starting with 4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59 not found: ID does not exist" containerID="4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091049 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59"} err="failed to get container status \"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59\": rpc error: code = NotFound desc = could not find container \"4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59\": container with ID starting with 4f73214b357536c82973f09ac4438f960a61bcab548feede0987dd7a6d60ca59 not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091077 5030 scope.go:117] "RemoveContainer" containerID="3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.091404 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099\": container with ID starting with 3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099 not found: ID does not exist" containerID="3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091446 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099"} err="failed to get container status \"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099\": rpc error: code = 
NotFound desc = could not find container \"3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099\": container with ID starting with 3700a0ee6f339f5e8855f0ce58069a1c0c7dec2bdc51fc6a0be6a88ca79cc099 not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091472 5030 scope.go:117] "RemoveContainer" containerID="8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.091762 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349\": container with ID starting with 8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349 not found: ID does not exist" containerID="8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091806 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349"} err="failed to get container status \"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349\": rpc error: code = NotFound desc = could not find container \"8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349\": container with ID starting with 8caa13dd781c9759683154ebd9f30f4a3b90c1c917e46377818fca712130e349 not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.091834 5030 scope.go:117] "RemoveContainer" containerID="534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.095636 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.183044 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.184662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.184727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185008 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185209 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4j9v\" (UniqueName: \"kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185382 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.185952 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.204490 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.230353 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.231887 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.234963 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-external-config-data" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.235129 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-public-svc" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.239397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389215 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389486 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389618 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389901 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4m4\" (UniqueName: \"kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.389949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 
23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.390000 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.491940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4m4\" (UniqueName: \"kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492015 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.492400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " 
pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.493036 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.493118 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.493075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.558404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.558882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4j9v\" (UniqueName: \"kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.559223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.559434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.560067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data\") pod \"ceilometer-0\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.561594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.566573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.567850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.568705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4m4\" (UniqueName: \"kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.571435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: W0120 23:55:46.578980 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e43cb69_a286_454d_9d54_5ce3aad208d0.slice/crio-064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb WatchSource:0}: Error finding container 064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb: Status 404 returned error can't find the container with id 064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.581141 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.597357 5030 scope.go:117] "RemoveContainer" containerID="45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.606509 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.813794 5030 scope.go:117] "RemoveContainer" containerID="534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.814488 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93\": container with ID starting with 534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93 not found: ID does not exist" containerID="534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.814532 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93"} err="failed to get container status \"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93\": rpc error: code = NotFound desc = could not find container \"534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93\": container with ID starting with 534aaa3182d6a00f51c82456305aef8f80e5241233e2ce2d3b05ac7c8859bd93 not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.814561 5030 scope.go:117] "RemoveContainer" containerID="45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae" Jan 20 23:55:46 crc kubenswrapper[5030]: E0120 23:55:46.814933 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae\": container with ID starting with 45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae not found: ID does not exist" containerID="45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.814977 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae"} err="failed to get container status \"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae\": rpc error: code = NotFound desc = could not find container \"45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae\": container with ID starting with 45cdf598354a24fc6278eef04788f9515afb39b3703d29b3e9ef8afd1ff4a1ae not found: ID does not exist" Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.857609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" event={"ID":"7d7f91b3-ec99-44df-a70b-86ca862be85d","Type":"ContainerStarted","Data":"21f333b150bd0ccbc8b20f6ad3212a5ab82dc5a46f701100c6a171585c206909"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.859537 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" event={"ID":"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae","Type":"ContainerStarted","Data":"c3561a3898cd0c5269e449f8733db7df821a0b75d88c69af81fecbd24af2b26c"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.861343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" event={"ID":"5e43cb69-a286-454d-9d54-5ce3aad208d0","Type":"ContainerStarted","Data":"064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.865986 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" event={"ID":"65855d62-04e3-45a7-97a2-3a14806fe649","Type":"ContainerStarted","Data":"cb05c75137aa101871f1644e9c74a4467a9245bff264d0684af172eceb2c7e9a"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.867154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" event={"ID":"5236b48f-7d0e-40ce-94c0-adb838bf5de6","Type":"ContainerStarted","Data":"0ff872d0ed9fbc8d1f17fe40389c00163a14f9c461e848858171ea5df7350ad1"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.868151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" event={"ID":"ca761538-4cfb-4a36-92a3-f5b0601eb08b","Type":"ContainerStarted","Data":"6fb92222774843e949a4334cc92937d94ee65ab777ac0276a2a53823f3a55d2c"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.870419 5030 generic.go:334] "Generic (PLEG): container finished" podID="82663c55-6190-4f89-a81a-4950835abad4" containerID="2fc6210e9ab2da522085f52343b3e8c26de15701510867d1f0192cef9767fbc7" exitCode=0 Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.870450 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerDied","Data":"2fc6210e9ab2da522085f52343b3e8c26de15701510867d1f0192cef9767fbc7"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.870471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"82663c55-6190-4f89-a81a-4950835abad4","Type":"ContainerDied","Data":"f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e"} Jan 20 23:55:46 crc kubenswrapper[5030]: I0120 23:55:46.870485 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61b610ab862e112cc3cf58890d086a0645098c41f93f0507e1fa896ffd34c6e" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.041075 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.042792 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.164340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209120 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209203 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209331 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209371 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.209402 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62q4b\" (UniqueName: \"kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.210771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"82663c55-6190-4f89-a81a-4950835abad4\" (UID: \"82663c55-6190-4f89-a81a-4950835abad4\") " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.210328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.210564 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs" (OuterVolumeSpecName: "logs") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.215296 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.219955 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b" (OuterVolumeSpecName: "kube-api-access-62q4b") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "kube-api-access-62q4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.224328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts" (OuterVolumeSpecName: "scripts") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.282787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data" (OuterVolumeSpecName: "config-data") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.283072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.295772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "82663c55-6190-4f89-a81a-4950835abad4" (UID: "82663c55-6190-4f89-a81a-4950835abad4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314077 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314122 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314160 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62q4b\" (UniqueName: \"kubernetes.io/projected/82663c55-6190-4f89-a81a-4950835abad4-kube-api-access-62q4b\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314195 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314234 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314252 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314262 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82663c55-6190-4f89-a81a-4950835abad4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.314271 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/82663c55-6190-4f89-a81a-4950835abad4-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.354770 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.416771 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.579723 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.885313 5030 generic.go:334] "Generic (PLEG): container finished" podID="5e43cb69-a286-454d-9d54-5ce3aad208d0" containerID="e1e2983165e29a800fa64cc016e762e324a67a241d2c5258ac2493278f551e3d" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.885544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" event={"ID":"5e43cb69-a286-454d-9d54-5ce3aad208d0","Type":"ContainerDied","Data":"e1e2983165e29a800fa64cc016e762e324a67a241d2c5258ac2493278f551e3d"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.893369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerStarted","Data":"3f848fd9588975ccaa77a05a2dd41e9eb763c5cfe5333a1b583dccc9b5a92831"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.895470 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d7f91b3-ec99-44df-a70b-86ca862be85d" containerID="564d0b1c0b03d0adc2809f29ac6547f795cb079d19c20e29f26d4c272452c3f2" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.895515 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" event={"ID":"7d7f91b3-ec99-44df-a70b-86ca862be85d","Type":"ContainerDied","Data":"564d0b1c0b03d0adc2809f29ac6547f795cb079d19c20e29f26d4c272452c3f2"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.897361 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" containerID="52182c029b767781079783540a6f3a0d338c9ba22b74f17b3312241d0cd52362" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.897483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" event={"ID":"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae","Type":"ContainerDied","Data":"52182c029b767781079783540a6f3a0d338c9ba22b74f17b3312241d0cd52362"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.901373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerStarted","Data":"302eb9dae515d805df345359bb644f217048d87f18a8428bf72865f4546c50bd"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.904458 5030 generic.go:334] "Generic (PLEG): container finished" podID="65855d62-04e3-45a7-97a2-3a14806fe649" containerID="fb9c6686c48a4b7a4260b23a6489d78b4e75fa78defcbab5958de2322f702462" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.904530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" event={"ID":"65855d62-04e3-45a7-97a2-3a14806fe649","Type":"ContainerDied","Data":"fb9c6686c48a4b7a4260b23a6489d78b4e75fa78defcbab5958de2322f702462"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.911028 5030 generic.go:334] "Generic (PLEG): container finished" podID="5236b48f-7d0e-40ce-94c0-adb838bf5de6" containerID="9c6534fafecb72297beee3c7cfa3b48e9f876c41be16464ad6e0e57b69ba270e" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.911096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" event={"ID":"5236b48f-7d0e-40ce-94c0-adb838bf5de6","Type":"ContainerDied","Data":"9c6534fafecb72297beee3c7cfa3b48e9f876c41be16464ad6e0e57b69ba270e"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.918170 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca761538-4cfb-4a36-92a3-f5b0601eb08b" containerID="5d45ca2eedc3f467103cc821521fdc430f2f9384c7f98f931adfdccedb66b0a2" exitCode=0 Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.918224 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" event={"ID":"ca761538-4cfb-4a36-92a3-f5b0601eb08b","Type":"ContainerDied","Data":"5d45ca2eedc3f467103cc821521fdc430f2f9384c7f98f931adfdccedb66b0a2"} Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.918292 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:47 crc kubenswrapper[5030]: I0120 23:55:47.983529 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cd0b4c-f4c4-40a8-b181-1aec66c793c8" path="/var/lib/kubelet/pods/42cd0b4c-f4c4-40a8-b181-1aec66c793c8/volumes" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.001436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.010691 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.027907 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:48 crc kubenswrapper[5030]: E0120 23:55:48.028284 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-log" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.028295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-log" Jan 20 23:55:48 crc kubenswrapper[5030]: E0120 23:55:48.028306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-httpd" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.028312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-httpd" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.028485 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-log" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.028502 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="82663c55-6190-4f89-a81a-4950835abad4" containerName="glance-httpd" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.030765 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.034483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"glance-default-internal-config-data" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.034642 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-glance-default-internal-svc" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.061599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134425 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdzkw\" (UniqueName: \"kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 
23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.134912 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236661 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236786 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdzkw\" (UniqueName: \"kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.236839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.239336 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") device mount path \"/mnt/openstack/pv09\"" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.239489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.239608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.240958 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.241040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.243260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.243724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.260600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdzkw\" (UniqueName: \"kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.299226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " 
pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.354824 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.802107 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:55:48 crc kubenswrapper[5030]: W0120 23:55:48.804109 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fa8f5_9f43_489c_b61d_668187fe87ea.slice/crio-67cb62bd95631dc7f87521e63278f9924538b335f0d365ecc6dc88087e9fd5f6 WatchSource:0}: Error finding container 67cb62bd95631dc7f87521e63278f9924538b335f0d365ecc6dc88087e9fd5f6: Status 404 returned error can't find the container with id 67cb62bd95631dc7f87521e63278f9924538b335f0d365ecc6dc88087e9fd5f6 Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.936305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerStarted","Data":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.936348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerStarted","Data":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.938067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerStarted","Data":"67cb62bd95631dc7f87521e63278f9924538b335f0d365ecc6dc88087e9fd5f6"} Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.940170 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerStarted","Data":"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce"} Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.940245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerStarted","Data":"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1"} Jan 20 23:55:48 crc kubenswrapper[5030]: I0120 23:55:48.972946 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.972927452 podStartE2EDuration="2.972927452s" podCreationTimestamp="2026-01-20 23:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:48.965248156 +0000 UTC m=+4821.285508454" watchObservedRunningTime="2026-01-20 23:55:48.972927452 +0000 UTC m=+4821.293187750" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.264283 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.360368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbm5b\" (UniqueName: \"kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b\") pod \"7d7f91b3-ec99-44df-a70b-86ca862be85d\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.360489 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts\") pod \"7d7f91b3-ec99-44df-a70b-86ca862be85d\" (UID: \"7d7f91b3-ec99-44df-a70b-86ca862be85d\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.361538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d7f91b3-ec99-44df-a70b-86ca862be85d" (UID: "7d7f91b3-ec99-44df-a70b-86ca862be85d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.369004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b" (OuterVolumeSpecName: "kube-api-access-lbm5b") pod "7d7f91b3-ec99-44df-a70b-86ca862be85d" (UID: "7d7f91b3-ec99-44df-a70b-86ca862be85d"). InnerVolumeSpecName "kube-api-access-lbm5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.463827 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbm5b\" (UniqueName: \"kubernetes.io/projected/7d7f91b3-ec99-44df-a70b-86ca862be85d-kube-api-access-lbm5b\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.463869 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7f91b3-ec99-44df-a70b-86ca862be85d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.562924 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.586068 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.612918 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.614922 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.638251 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.671453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9xns\" (UniqueName: \"kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns\") pod \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.671511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts\") pod \"65855d62-04e3-45a7-97a2-3a14806fe649\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.671532 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7xfq\" (UniqueName: \"kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq\") pod \"65855d62-04e3-45a7-97a2-3a14806fe649\" (UID: \"65855d62-04e3-45a7-97a2-3a14806fe649\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.671605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts\") pod \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\" (UID: \"5236b48f-7d0e-40ce-94c0-adb838bf5de6\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.673071 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65855d62-04e3-45a7-97a2-3a14806fe649" (UID: "65855d62-04e3-45a7-97a2-3a14806fe649"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.673208 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5236b48f-7d0e-40ce-94c0-adb838bf5de6" (UID: "5236b48f-7d0e-40ce-94c0-adb838bf5de6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.676902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq" (OuterVolumeSpecName: "kube-api-access-l7xfq") pod "65855d62-04e3-45a7-97a2-3a14806fe649" (UID: "65855d62-04e3-45a7-97a2-3a14806fe649"). InnerVolumeSpecName "kube-api-access-l7xfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.679197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns" (OuterVolumeSpecName: "kube-api-access-r9xns") pod "5236b48f-7d0e-40ce-94c0-adb838bf5de6" (UID: "5236b48f-7d0e-40ce-94c0-adb838bf5de6"). InnerVolumeSpecName "kube-api-access-r9xns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.772898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbs9n\" (UniqueName: \"kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n\") pod \"5e43cb69-a286-454d-9d54-5ce3aad208d0\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.772977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts\") pod \"5e43cb69-a286-454d-9d54-5ce3aad208d0\" (UID: \"5e43cb69-a286-454d-9d54-5ce3aad208d0\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773075 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgk2p\" (UniqueName: \"kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p\") pod \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts\") pod \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts\") pod \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\" (UID: \"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhlf9\" (UniqueName: \"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9\") pod \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\" (UID: \"ca761538-4cfb-4a36-92a3-f5b0601eb08b\") " Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773563 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9xns\" (UniqueName: \"kubernetes.io/projected/5236b48f-7d0e-40ce-94c0-adb838bf5de6-kube-api-access-r9xns\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773581 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65855d62-04e3-45a7-97a2-3a14806fe649-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773591 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7xfq\" (UniqueName: \"kubernetes.io/projected/65855d62-04e3-45a7-97a2-3a14806fe649-kube-api-access-l7xfq\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.773600 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5236b48f-7d0e-40ce-94c0-adb838bf5de6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.776823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9" (OuterVolumeSpecName: "kube-api-access-lhlf9") pod "ca761538-4cfb-4a36-92a3-f5b0601eb08b" (UID: "ca761538-4cfb-4a36-92a3-f5b0601eb08b"). InnerVolumeSpecName "kube-api-access-lhlf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.777137 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" (UID: "eb6c79c1-6daa-40b3-92dc-c0b7a77182ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.777227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca761538-4cfb-4a36-92a3-f5b0601eb08b" (UID: "ca761538-4cfb-4a36-92a3-f5b0601eb08b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.777837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e43cb69-a286-454d-9d54-5ce3aad208d0" (UID: "5e43cb69-a286-454d-9d54-5ce3aad208d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.778558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n" (OuterVolumeSpecName: "kube-api-access-bbs9n") pod "5e43cb69-a286-454d-9d54-5ce3aad208d0" (UID: "5e43cb69-a286-454d-9d54-5ce3aad208d0"). InnerVolumeSpecName "kube-api-access-bbs9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.779190 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p" (OuterVolumeSpecName: "kube-api-access-dgk2p") pod "eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" (UID: "eb6c79c1-6daa-40b3-92dc-c0b7a77182ae"). InnerVolumeSpecName "kube-api-access-dgk2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874899 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhlf9\" (UniqueName: \"kubernetes.io/projected/ca761538-4cfb-4a36-92a3-f5b0601eb08b-kube-api-access-lhlf9\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbs9n\" (UniqueName: \"kubernetes.io/projected/5e43cb69-a286-454d-9d54-5ce3aad208d0-kube-api-access-bbs9n\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874946 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e43cb69-a286-454d-9d54-5ce3aad208d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874956 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgk2p\" (UniqueName: \"kubernetes.io/projected/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-kube-api-access-dgk2p\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874966 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca761538-4cfb-4a36-92a3-f5b0601eb08b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.874976 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.952135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerStarted","Data":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.953554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerStarted","Data":"3cc6b0b1ef67cf3ed623ef331af4f0711ba50f68fe18f908775fb4147e800fbd"} Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.955012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" event={"ID":"7d7f91b3-ec99-44df-a70b-86ca862be85d","Type":"ContainerDied","Data":"21f333b150bd0ccbc8b20f6ad3212a5ab82dc5a46f701100c6a171585c206909"} Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.955053 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f333b150bd0ccbc8b20f6ad3212a5ab82dc5a46f701100c6a171585c206909" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.955093 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-db-create-qgdsf" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.956461 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.956530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc" event={"ID":"eb6c79c1-6daa-40b3-92dc-c0b7a77182ae","Type":"ContainerDied","Data":"c3561a3898cd0c5269e449f8733db7df821a0b75d88c69af81fecbd24af2b26c"} Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.956557 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3561a3898cd0c5269e449f8733db7df821a0b75d88c69af81fecbd24af2b26c" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.959841 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" event={"ID":"65855d62-04e3-45a7-97a2-3a14806fe649","Type":"ContainerDied","Data":"cb05c75137aa101871f1644e9c74a4467a9245bff264d0684af172eceb2c7e9a"} Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.959865 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb05c75137aa101871f1644e9c74a4467a9245bff264d0684af172eceb2c7e9a" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.959914 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.973550 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.980193 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.987216 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" Jan 20 23:55:49 crc kubenswrapper[5030]: I0120 23:55:49.993775 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82663c55-6190-4f89-a81a-4950835abad4" path="/var/lib/kubelet/pods/82663c55-6190-4f89-a81a-4950835abad4/volumes" Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-db-create-7tr47" event={"ID":"5236b48f-7d0e-40ce-94c0-adb838bf5de6","Type":"ContainerDied","Data":"0ff872d0ed9fbc8d1f17fe40389c00163a14f9c461e848858171ea5df7350ad1"} Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000210 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff872d0ed9fbc8d1f17fe40389c00163a14f9c461e848858171ea5df7350ad1" Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-db-create-57x7b" event={"ID":"ca761538-4cfb-4a36-92a3-f5b0601eb08b","Type":"ContainerDied","Data":"6fb92222774843e949a4334cc92937d94ee65ab777ac0276a2a53823f3a55d2c"} Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000229 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb92222774843e949a4334cc92937d94ee65ab777ac0276a2a53823f3a55d2c" Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf" event={"ID":"5e43cb69-a286-454d-9d54-5ce3aad208d0","Type":"ContainerDied","Data":"064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb"} Jan 20 23:55:50 crc kubenswrapper[5030]: I0120 23:55:50.000248 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064f9f9e2fae4306e725ac6b1f9539e6d45d4e6b2340247422b21dffa08748eb" Jan 20 23:55:51 crc kubenswrapper[5030]: I0120 23:55:51.001810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerStarted","Data":"a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf"} Jan 20 23:55:51 crc kubenswrapper[5030]: I0120 23:55:51.029789 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.029772798 podStartE2EDuration="3.029772798s" podCreationTimestamp="2026-01-20 23:55:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:51.027216357 +0000 UTC m=+4823.347476645" watchObservedRunningTime="2026-01-20 23:55:51.029772798 +0000 UTC m=+4823.350033086" Jan 20 23:55:52 crc kubenswrapper[5030]: I0120 23:55:52.020139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerStarted","Data":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} Jan 20 23:55:52 crc kubenswrapper[5030]: I0120 23:55:52.074185 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=3.332264291 podStartE2EDuration="7.074162574s" podCreationTimestamp="2026-01-20 23:55:45 +0000 UTC" firstStartedPulling="2026-01-20 23:55:47.182804447 
+0000 UTC m=+4819.503064735" lastFinishedPulling="2026-01-20 23:55:50.92470269 +0000 UTC m=+4823.244963018" observedRunningTime="2026-01-20 23:55:52.054237841 +0000 UTC m=+4824.374498179" watchObservedRunningTime="2026-01-20 23:55:52.074162574 +0000 UTC m=+4824.394422872" Jan 20 23:55:53 crc kubenswrapper[5030]: I0120 23:55:53.034398 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:53 crc kubenswrapper[5030]: I0120 23:55:53.381844 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.052468 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-central-agent" containerID="cri-o://2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" gracePeriod=30 Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.052510 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="proxy-httpd" containerID="cri-o://41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" gracePeriod=30 Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.052554 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-notification-agent" containerID="cri-o://73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" gracePeriod=30 Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.052608 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="sg-core" containerID="cri-o://fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" gracePeriod=30 Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.227850 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p"] Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5236b48f-7d0e-40ce-94c0-adb838bf5de6" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228273 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5236b48f-7d0e-40ce-94c0-adb838bf5de6" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228293 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca761538-4cfb-4a36-92a3-f5b0601eb08b" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228301 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca761538-4cfb-4a36-92a3-f5b0601eb08b" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228324 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e43cb69-a286-454d-9d54-5ce3aad208d0" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228332 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e43cb69-a286-454d-9d54-5ce3aad208d0" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228347 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228372 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7f91b3-ec99-44df-a70b-86ca862be85d" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7f91b3-ec99-44df-a70b-86ca862be85d" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: E0120 23:55:55.228405 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65855d62-04e3-45a7-97a2-3a14806fe649" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="65855d62-04e3-45a7-97a2-3a14806fe649" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228605 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e43cb69-a286-454d-9d54-5ce3aad208d0" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228636 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="65855d62-04e3-45a7-97a2-3a14806fe649" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228648 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7f91b3-ec99-44df-a70b-86ca862be85d" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228674 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5236b48f-7d0e-40ce-94c0-adb838bf5de6" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228684 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" containerName="mariadb-account-create-update" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.228702 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca761538-4cfb-4a36-92a3-f5b0601eb08b" containerName="mariadb-database-create" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.229350 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.235501 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-psgrf" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.235519 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-scripts" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.236089 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.240350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p"] Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.289603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7w6\" (UniqueName: \"kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.289793 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.289898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.289943 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.391504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.391563 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.391589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.391652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7w6\" (UniqueName: \"kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.396583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.398036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.398373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.409746 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7w6\" (UniqueName: \"kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6\") pod \"nova-cell0-conductor-db-sync-xkn7p\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.558472 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:55:55 crc kubenswrapper[5030]: I0120 23:55:55.928252 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001717 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001803 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001850 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4j9v\" (UniqueName: \"kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.001926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts\") pod \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\" (UID: \"ce204d76-a9d3-454d-8b82-0156fcd68d5b\") " Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.005441 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.006928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.008639 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts" (OuterVolumeSpecName: "scripts") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.010694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v" (OuterVolumeSpecName: "kube-api-access-n4j9v") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "kube-api-access-n4j9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.052999 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.074929 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" exitCode=0 Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.074958 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" exitCode=2 Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.074966 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" exitCode=0 Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.074973 5030 generic.go:334] "Generic (PLEG): container finished" podID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" exitCode=0 Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.074991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerDied","Data":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerDied","Data":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075027 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerDied","Data":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerDied","Data":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"ce204d76-a9d3-454d-8b82-0156fcd68d5b","Type":"ContainerDied","Data":"3f848fd9588975ccaa77a05a2dd41e9eb763c5cfe5333a1b583dccc9b5a92831"} Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075058 5030 scope.go:117] "RemoveContainer" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075179 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.075578 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p"] Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.097102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: "ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.104354 5030 scope.go:117] "RemoveContainer" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106188 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106214 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106224 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce204d76-a9d3-454d-8b82-0156fcd68d5b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106233 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106245 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4j9v\" (UniqueName: \"kubernetes.io/projected/ce204d76-a9d3-454d-8b82-0156fcd68d5b-kube-api-access-n4j9v\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.106255 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.121168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data" (OuterVolumeSpecName: "config-data") pod "ce204d76-a9d3-454d-8b82-0156fcd68d5b" (UID: 
"ce204d76-a9d3-454d-8b82-0156fcd68d5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.133663 5030 scope.go:117] "RemoveContainer" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.171256 5030 scope.go:117] "RemoveContainer" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.208219 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce204d76-a9d3-454d-8b82-0156fcd68d5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.275743 5030 scope.go:117] "RemoveContainer" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.276389 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": container with ID starting with 41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae not found: ID does not exist" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.276424 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} err="failed to get container status \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": rpc error: code = NotFound desc = could not find container \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": container with ID starting with 41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.276477 5030 scope.go:117] "RemoveContainer" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.276859 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": container with ID starting with fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6 not found: ID does not exist" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.276907 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} err="failed to get container status \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": rpc error: code = NotFound desc = could not find container \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": container with ID starting with fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.276926 5030 scope.go:117] "RemoveContainer" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.277397 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": container with ID starting with 73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1 not found: ID does not exist" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.277472 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} err="failed to get container status \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": rpc error: code = NotFound desc = could not find container \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": container with ID starting with 73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.277515 5030 scope.go:117] "RemoveContainer" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.278118 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": container with ID starting with 2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8 not found: ID does not exist" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.278194 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} err="failed to get container status \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": rpc error: code = NotFound desc = could not find container \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": container with ID starting with 2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.278239 5030 scope.go:117] "RemoveContainer" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.278690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} err="failed to get container status \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": rpc error: code = NotFound desc = could not find container \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": container with ID starting with 41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.278733 5030 scope.go:117] "RemoveContainer" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.279146 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} err="failed to get container status \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": rpc error: code = NotFound desc = could not find container 
\"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": container with ID starting with fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.279191 5030 scope.go:117] "RemoveContainer" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.279756 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} err="failed to get container status \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": rpc error: code = NotFound desc = could not find container \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": container with ID starting with 73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.279784 5030 scope.go:117] "RemoveContainer" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280104 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} err="failed to get container status \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": rpc error: code = NotFound desc = could not find container \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": container with ID starting with 2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280135 5030 scope.go:117] "RemoveContainer" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280456 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} err="failed to get container status \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": rpc error: code = NotFound desc = could not find container \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": container with ID starting with 41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280511 5030 scope.go:117] "RemoveContainer" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280848 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} err="failed to get container status \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": rpc error: code = NotFound desc = could not find container \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": container with ID starting with fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.280875 5030 scope.go:117] "RemoveContainer" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281143 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} err="failed to get container status \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": rpc error: code = NotFound desc = could not find container \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": container with ID starting with 73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281166 5030 scope.go:117] "RemoveContainer" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281474 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} err="failed to get container status \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": rpc error: code = NotFound desc = could not find container \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": container with ID starting with 2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281497 5030 scope.go:117] "RemoveContainer" containerID="41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281914 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae"} err="failed to get container status \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": rpc error: code = NotFound desc = could not find container \"41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae\": container with ID starting with 41b2d6983c851124b18bca4573abd69887b496eb136728736f9f3b2dc9a664ae not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.281933 5030 scope.go:117] "RemoveContainer" containerID="fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.283201 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6"} err="failed to get container status \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": rpc error: code = NotFound desc = could not find container \"fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6\": container with ID starting with fe53ef8dadbdba8984e7ee7a7bb6eda475a149dd852dda863daf5e57a6bd64f6 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.283227 5030 scope.go:117] "RemoveContainer" containerID="73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.284215 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1"} err="failed to get container status \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": rpc error: code = NotFound desc = could not find container \"73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1\": container with ID starting with 
73d2d91bd53513a70b91b21598808889d7afa696a3637c3ca84fbbaffd17e3a1 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.284237 5030 scope.go:117] "RemoveContainer" containerID="2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.284519 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8"} err="failed to get container status \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": rpc error: code = NotFound desc = could not find container \"2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8\": container with ID starting with 2c0bb80dc5d8e52402658b53c7138334cde0c271492b750740057b89493711a8 not found: ID does not exist" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.431545 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.454307 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.465497 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.466216 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="sg-core" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466247 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="sg-core" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.466285 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-notification-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466299 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-notification-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.466321 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-central-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466336 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-central-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: E0120 23:55:56.466352 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="proxy-httpd" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466364 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="proxy-httpd" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466757 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="sg-core" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466795 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="proxy-httpd" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" 
containerName="ceilometer-central-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.466850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" containerName="ceilometer-notification-agent" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.469844 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.472155 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.472780 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.473510 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.614456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.614973 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.615242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.615415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.615579 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bzc\" (UniqueName: \"kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.615949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.616250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts\") pod \"ceilometer-0\" 
(UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.718856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.719325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.719500 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.719705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.719871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.720016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.720188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bzc\" (UniqueName: \"kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.719719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.721136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.726064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.726612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.727136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.728427 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.741497 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bzc\" (UniqueName: \"kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc\") pod \"ceilometer-0\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:56 crc kubenswrapper[5030]: I0120 23:55:56.802128 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.041840 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.042737 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.079895 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.087367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" event={"ID":"53344bed-1dcb-467d-a87e-93b38be8acf1","Type":"ContainerStarted","Data":"c24b019586990fa217dfc9ad1cd63c34f40b14cb2468b7c316539d0b370e2051"} Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.087407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" event={"ID":"53344bed-1dcb-467d-a87e-93b38be8acf1","Type":"ContainerStarted","Data":"515c25c7686a8d407e61a20b50c5bae2321c5c3cd9bb1688fe6241bde2f00a07"} Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.087898 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.095984 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.134173 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" podStartSLOduration=2.134154432 podStartE2EDuration="2.134154432s" podCreationTimestamp="2026-01-20 23:55:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:55:57.129443408 +0000 UTC m=+4829.449703736" watchObservedRunningTime="2026-01-20 23:55:57.134154432 +0000 UTC m=+4829.454414710" Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.315737 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:55:57 crc kubenswrapper[5030]: W0120 23:55:57.329669 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda823f180_b204_43e1_a009_442a930ef6ce.slice/crio-06d9f4f169d74164fe584438dda0bad3423c64ff31221eca53f34234f81a3d67 WatchSource:0}: Error finding container 06d9f4f169d74164fe584438dda0bad3423c64ff31221eca53f34234f81a3d67: Status 404 returned error can't find the container with id 06d9f4f169d74164fe584438dda0bad3423c64ff31221eca53f34234f81a3d67 Jan 20 23:55:57 crc kubenswrapper[5030]: I0120 23:55:57.983294 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce204d76-a9d3-454d-8b82-0156fcd68d5b" path="/var/lib/kubelet/pods/ce204d76-a9d3-454d-8b82-0156fcd68d5b/volumes" Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.099485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerStarted","Data":"82ee678d8f0c37c036b2ee21b6fa99bf21b0ae83829d6946c69273cc65331ff4"} Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.099949 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerStarted","Data":"06d9f4f169d74164fe584438dda0bad3423c64ff31221eca53f34234f81a3d67"} Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.099989 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.355812 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.355860 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.409563 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:58 crc kubenswrapper[5030]: I0120 23:55:58.422268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:59 crc kubenswrapper[5030]: I0120 23:55:59.034734 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:55:59 crc kubenswrapper[5030]: I0120 23:55:59.119442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" 
event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerStarted","Data":"912c48f793e3654ce8dbddbd917036f67d80141af963cffe6cd33fabc32f5a4f"} Jan 20 23:55:59 crc kubenswrapper[5030]: I0120 23:55:59.119514 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:59 crc kubenswrapper[5030]: I0120 23:55:59.119541 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:55:59 crc kubenswrapper[5030]: I0120 23:55:59.810459 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:56:00 crc kubenswrapper[5030]: I0120 23:56:00.127313 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerStarted","Data":"4754b89700ad44cb51e2a954b5f05731f1a3294c5c5c42bba97252b7672980c7"} Jan 20 23:56:00 crc kubenswrapper[5030]: I0120 23:56:00.887884 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:56:01 crc kubenswrapper[5030]: I0120 23:56:01.135812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerStarted","Data":"ba950ce10f8a0dc5a72fb21cd413ad12ab0963b3033564d8d27ba0b98d1c7b6a"} Jan 20 23:56:01 crc kubenswrapper[5030]: I0120 23:56:01.135944 5030 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 23:56:01 crc kubenswrapper[5030]: I0120 23:56:01.136263 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:01 crc kubenswrapper[5030]: I0120 23:56:01.168807 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.2019814269999998 podStartE2EDuration="5.168789797s" podCreationTimestamp="2026-01-20 23:55:56 +0000 UTC" firstStartedPulling="2026-01-20 23:55:57.337825034 +0000 UTC m=+4829.658085322" lastFinishedPulling="2026-01-20 23:56:00.304633404 +0000 UTC m=+4832.624893692" observedRunningTime="2026-01-20 23:56:01.153356683 +0000 UTC m=+4833.473616961" watchObservedRunningTime="2026-01-20 23:56:01.168789797 +0000 UTC m=+4833.489050085" Jan 20 23:56:01 crc kubenswrapper[5030]: I0120 23:56:01.584600 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:56:06 crc kubenswrapper[5030]: I0120 23:56:06.188651 5030 generic.go:334] "Generic (PLEG): container finished" podID="53344bed-1dcb-467d-a87e-93b38be8acf1" containerID="c24b019586990fa217dfc9ad1cd63c34f40b14cb2468b7c316539d0b370e2051" exitCode=0 Jan 20 23:56:06 crc kubenswrapper[5030]: I0120 23:56:06.188775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" event={"ID":"53344bed-1dcb-467d-a87e-93b38be8acf1","Type":"ContainerDied","Data":"c24b019586990fa217dfc9ad1cd63c34f40b14cb2468b7c316539d0b370e2051"} Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.681942 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.767312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data\") pod \"53344bed-1dcb-467d-a87e-93b38be8acf1\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.767387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b7w6\" (UniqueName: \"kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6\") pod \"53344bed-1dcb-467d-a87e-93b38be8acf1\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.767494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts\") pod \"53344bed-1dcb-467d-a87e-93b38be8acf1\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.767845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle\") pod \"53344bed-1dcb-467d-a87e-93b38be8acf1\" (UID: \"53344bed-1dcb-467d-a87e-93b38be8acf1\") " Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.772453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6" (OuterVolumeSpecName: "kube-api-access-4b7w6") pod "53344bed-1dcb-467d-a87e-93b38be8acf1" (UID: "53344bed-1dcb-467d-a87e-93b38be8acf1"). InnerVolumeSpecName "kube-api-access-4b7w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.774397 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts" (OuterVolumeSpecName: "scripts") pod "53344bed-1dcb-467d-a87e-93b38be8acf1" (UID: "53344bed-1dcb-467d-a87e-93b38be8acf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.799928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53344bed-1dcb-467d-a87e-93b38be8acf1" (UID: "53344bed-1dcb-467d-a87e-93b38be8acf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.800986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data" (OuterVolumeSpecName: "config-data") pod "53344bed-1dcb-467d-a87e-93b38be8acf1" (UID: "53344bed-1dcb-467d-a87e-93b38be8acf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.870781 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.870834 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.870859 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b7w6\" (UniqueName: \"kubernetes.io/projected/53344bed-1dcb-467d-a87e-93b38be8acf1-kube-api-access-4b7w6\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:07 crc kubenswrapper[5030]: I0120 23:56:07.870879 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53344bed-1dcb-467d-a87e-93b38be8acf1-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.219455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" event={"ID":"53344bed-1dcb-467d-a87e-93b38be8acf1","Type":"ContainerDied","Data":"515c25c7686a8d407e61a20b50c5bae2321c5c3cd9bb1688fe6241bde2f00a07"} Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.219520 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="515c25c7686a8d407e61a20b50c5bae2321c5c3cd9bb1688fe6241bde2f00a07" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.219547 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.310028 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:56:08 crc kubenswrapper[5030]: E0120 23:56:08.310453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53344bed-1dcb-467d-a87e-93b38be8acf1" containerName="nova-cell0-conductor-db-sync" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.310480 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="53344bed-1dcb-467d-a87e-93b38be8acf1" containerName="nova-cell0-conductor-db-sync" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.310785 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="53344bed-1dcb-467d-a87e-93b38be8acf1" containerName="nova-cell0-conductor-db-sync" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.311608 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.325322 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-nova-dockercfg-psgrf" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.325505 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-conductor-config-data" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.329250 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.379384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.379477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h58bv\" (UniqueName: \"kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.379511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.481269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.481360 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h58bv\" (UniqueName: \"kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.481438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.486876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.488208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.508235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h58bv\" (UniqueName: \"kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv\") pod \"nova-cell0-conductor-0\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:08 crc kubenswrapper[5030]: I0120 23:56:08.639218 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:09 crc kubenswrapper[5030]: W0120 23:56:09.163001 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaaf0cec_efea_4050_9195_ee0262c8deb8.slice/crio-f6c6a5af6afb555f15628b039266fb078c6a385db1d91b4388d81bd0271b141e WatchSource:0}: Error finding container f6c6a5af6afb555f15628b039266fb078c6a385db1d91b4388d81bd0271b141e: Status 404 returned error can't find the container with id f6c6a5af6afb555f15628b039266fb078c6a385db1d91b4388d81bd0271b141e Jan 20 23:56:09 crc kubenswrapper[5030]: I0120 23:56:09.166833 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:56:09 crc kubenswrapper[5030]: I0120 23:56:09.241874 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"faaf0cec-efea-4050-9195-ee0262c8deb8","Type":"ContainerStarted","Data":"f6c6a5af6afb555f15628b039266fb078c6a385db1d91b4388d81bd0271b141e"} Jan 20 23:56:10 crc kubenswrapper[5030]: I0120 23:56:10.158749 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:56:10 crc kubenswrapper[5030]: I0120 23:56:10.158806 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:56:10 crc kubenswrapper[5030]: I0120 23:56:10.253911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"faaf0cec-efea-4050-9195-ee0262c8deb8","Type":"ContainerStarted","Data":"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37"} Jan 20 23:56:10 crc kubenswrapper[5030]: I0120 23:56:10.254261 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:10 crc kubenswrapper[5030]: I0120 23:56:10.273199 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podStartSLOduration=2.273182127 podStartE2EDuration="2.273182127s" podCreationTimestamp="2026-01-20 23:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 23:56:10.267394897 +0000 UTC m=+4842.587655185" watchObservedRunningTime="2026-01-20 23:56:10.273182127 +0000 UTC m=+4842.593442415" Jan 20 23:56:18 crc kubenswrapper[5030]: I0120 23:56:18.666722 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.228751 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.230365 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.233274 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-scripts" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.233711 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell0-manage-config-data" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.251777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.352444 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.354679 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.361934 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.386713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.420973 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.421017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.421061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62dtf\" (UniqueName: \"kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.421121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: 
\"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.453737 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.455206 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.463182 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.483656 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.520038 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.521872 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svdh\" (UniqueName: \"kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62dtf\" (UniqueName: \"kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 
23:56:19.522613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.522659 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.525908 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.528831 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.530672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.531264 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.538176 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.549109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62dtf\" (UniqueName: \"kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf\") pod \"nova-cell0-cell-mapping-g2f6x\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.559586 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.622670 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.623902 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.628166 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.632097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.632169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.632190 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.632267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.632308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.633022 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.633088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdgl\" (UniqueName: \"kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.633125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrwxb\" (UniqueName: \"kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.633255 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.633608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.638285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svdh\" (UniqueName: \"kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.638343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.640356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.646507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.646663 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.674266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svdh\" (UniqueName: \"kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh\") pod \"nova-api-0\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.689374 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66qzz\" (UniqueName: \"kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743156 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdgl\" (UniqueName: \"kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrwxb\" (UniqueName: \"kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.743345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc 
kubenswrapper[5030]: I0120 23:56:19.743363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.756508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.756766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.769253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.771278 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksdgl\" (UniqueName: \"kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.772703 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.778185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.780199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrwxb\" (UniqueName: \"kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb\") pod \"nova-metadata-0\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.854171 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.854258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qzz\" (UniqueName: 
\"kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.854279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.857315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.863490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:19 crc kubenswrapper[5030]: I0120 23:56:19.877073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qzz\" (UniqueName: \"kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz\") pod \"nova-cell1-novncproxy-0\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.058982 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.078577 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.081277 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.116888 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.270932 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.330489 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.331591 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.336677 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-scripts" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.336823 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.349598 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.362124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.362189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.362226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq9n\" (UniqueName: \"kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.362291 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.395967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerStarted","Data":"bc5bb1cadb4843a1a8542de935ebb4346bb9262014982843d641aac652f47e60"} Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.401054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" event={"ID":"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142","Type":"ContainerStarted","Data":"06a8d01b98047a34695d8eed827d07c0f9b42ef60e1d6f76d82e59648996f156"} Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.463134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.463186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.463220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snq9n\" (UniqueName: \"kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.463258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.469003 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.469182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.469495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.480566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snq9n\" (UniqueName: \"kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n\") pod \"nova-cell1-conductor-db-sync-x5l8v\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.578321 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:20 crc kubenswrapper[5030]: W0120 23:56:20.617200 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aed4a9c_2e97_4d36_b72a_b52cbe77c23d.slice/crio-f49a739c957df1b4d154e44b15960f216f23c9dee35fbb8cbcc6f2e04be41cb2 WatchSource:0}: Error finding container f49a739c957df1b4d154e44b15960f216f23c9dee35fbb8cbcc6f2e04be41cb2: Status 404 returned error can't find the container with id f49a739c957df1b4d154e44b15960f216f23c9dee35fbb8cbcc6f2e04be41cb2 Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.655219 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.663473 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.813402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:20 crc kubenswrapper[5030]: I0120 23:56:20.994357 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v"] Jan 20 23:56:21 crc kubenswrapper[5030]: W0120 23:56:21.012700 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fd59946_c071_4913_973f_68c0ba64772d.slice/crio-86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8 WatchSource:0}: Error finding container 86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8: Status 404 returned error can't find the container with id 86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8 Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.167594 5030 scope.go:117] "RemoveContainer" containerID="1a322a5b0f4609dfb53e29d267cab28f1bd935ea7c08384b0f94ec463163dffc" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.434767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerStarted","Data":"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.434840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerStarted","Data":"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.444786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c288c177-9fd3-4b77-b251-3f53913fb132","Type":"ContainerStarted","Data":"19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.444831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c288c177-9fd3-4b77-b251-3f53913fb132","Type":"ContainerStarted","Data":"384c3e3ae2a029a72767df12518f8fca4851c0469a3099e6aec19e5c18edc000"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.447556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerStarted","Data":"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.447581 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerStarted","Data":"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.447593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerStarted","Data":"f49a739c957df1b4d154e44b15960f216f23c9dee35fbb8cbcc6f2e04be41cb2"} Jan 20 23:56:21 crc kubenswrapper[5030]: 
I0120 23:56:21.458343 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.458326783 podStartE2EDuration="2.458326783s" podCreationTimestamp="2026-01-20 23:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.456933229 +0000 UTC m=+4853.777193517" watchObservedRunningTime="2026-01-20 23:56:21.458326783 +0000 UTC m=+4853.778587071" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.459679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" event={"ID":"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142","Type":"ContainerStarted","Data":"f534f8867f01b669a98be117b16cec81c30a05651a6531a8ebf984a5ed21c492"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.475431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" event={"ID":"1fd59946-c071-4913-973f-68c0ba64772d","Type":"ContainerStarted","Data":"c16032f8775ab4a7bd458a948244e294cfac5d089df162a717211e073be9778d"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.475479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" event={"ID":"1fd59946-c071-4913-973f-68c0ba64772d","Type":"ContainerStarted","Data":"86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.481505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"594a7bb2-910d-4167-a4cb-33e23ee9e381","Type":"ContainerStarted","Data":"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.481550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"594a7bb2-910d-4167-a4cb-33e23ee9e381","Type":"ContainerStarted","Data":"bbf840ae6071a5f2d3fe041a8ae2181360989e6ab1ebbd9ff39ca0aa53f09d84"} Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.481553 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.481538296 podStartE2EDuration="2.481538296s" podCreationTimestamp="2026-01-20 23:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.480150672 +0000 UTC m=+4853.800410970" watchObservedRunningTime="2026-01-20 23:56:21.481538296 +0000 UTC m=+4853.801798584" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.506040 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.5060242600000002 podStartE2EDuration="2.50602426s" podCreationTimestamp="2026-01-20 23:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.496505449 +0000 UTC m=+4853.816765737" watchObservedRunningTime="2026-01-20 23:56:21.50602426 +0000 UTC m=+4853.826284548" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.554393 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" podStartSLOduration=1.554367423 
podStartE2EDuration="1.554367423s" podCreationTimestamp="2026-01-20 23:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.527426689 +0000 UTC m=+4853.847686977" watchObservedRunningTime="2026-01-20 23:56:21.554367423 +0000 UTC m=+4853.874627711" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.565422 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" podStartSLOduration=2.56540757 podStartE2EDuration="2.56540757s" podCreationTimestamp="2026-01-20 23:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.552764424 +0000 UTC m=+4853.873024702" watchObservedRunningTime="2026-01-20 23:56:21.56540757 +0000 UTC m=+4853.885667858" Jan 20 23:56:21 crc kubenswrapper[5030]: I0120 23:56:21.581533 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.581514002 podStartE2EDuration="2.581514002s" podCreationTimestamp="2026-01-20 23:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:21.571233373 +0000 UTC m=+4853.891493661" watchObservedRunningTime="2026-01-20 23:56:21.581514002 +0000 UTC m=+4853.901774290" Jan 20 23:56:23 crc kubenswrapper[5030]: I0120 23:56:23.201047 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:23 crc kubenswrapper[5030]: I0120 23:56:23.212077 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:23 crc kubenswrapper[5030]: I0120 23:56:23.495534 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="594a7bb2-910d-4167-a4cb-33e23ee9e381" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169" gracePeriod=30 Jan 20 23:56:23 crc kubenswrapper[5030]: I0120 23:56:23.495731 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-log" containerID="cri-o://51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" gracePeriod=30 Jan 20 23:56:23 crc kubenswrapper[5030]: I0120 23:56:23.495837 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-metadata" containerID="cri-o://71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" gracePeriod=30 Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.333978 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.446610 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrwxb\" (UniqueName: \"kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb\") pod \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.446802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data\") pod \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.446831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs\") pod \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.446926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle\") pod \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\" (UID: \"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.447300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs" (OuterVolumeSpecName: "logs") pod "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" (UID: "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.447429 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.452702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb" (OuterVolumeSpecName: "kube-api-access-lrwxb") pod "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" (UID: "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d"). InnerVolumeSpecName "kube-api-access-lrwxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.471403 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.481824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" (UID: "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.498244 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data" (OuterVolumeSpecName: "config-data") pod "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" (UID: "5aed4a9c-2e97-4d36-b72a-b52cbe77c23d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.506584 5030 generic.go:334] "Generic (PLEG): container finished" podID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerID="71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" exitCode=0 Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.506876 5030 generic.go:334] "Generic (PLEG): container finished" podID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerID="51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" exitCode=143 Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.506983 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerDied","Data":"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319"} Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.507069 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerDied","Data":"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b"} Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.507144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"5aed4a9c-2e97-4d36-b72a-b52cbe77c23d","Type":"ContainerDied","Data":"f49a739c957df1b4d154e44b15960f216f23c9dee35fbb8cbcc6f2e04be41cb2"} Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.507209 5030 scope.go:117] "RemoveContainer" containerID="71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.506982 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.510513 5030 generic.go:334] "Generic (PLEG): container finished" podID="594a7bb2-910d-4167-a4cb-33e23ee9e381" containerID="3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169" exitCode=0 Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.510578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"594a7bb2-910d-4167-a4cb-33e23ee9e381","Type":"ContainerDied","Data":"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169"} Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.510639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"594a7bb2-910d-4167-a4cb-33e23ee9e381","Type":"ContainerDied","Data":"bbf840ae6071a5f2d3fe041a8ae2181360989e6ab1ebbd9ff39ca0aa53f09d84"} Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.510741 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.533590 5030 scope.go:117] "RemoveContainer" containerID="51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.545485 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.558427 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66qzz\" (UniqueName: \"kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz\") pod \"594a7bb2-910d-4167-a4cb-33e23ee9e381\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.558495 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle\") pod \"594a7bb2-910d-4167-a4cb-33e23ee9e381\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.558700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data\") pod \"594a7bb2-910d-4167-a4cb-33e23ee9e381\" (UID: \"594a7bb2-910d-4167-a4cb-33e23ee9e381\") " Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.558968 5030 scope.go:117] "RemoveContainer" containerID="71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.559380 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrwxb\" (UniqueName: \"kubernetes.io/projected/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-kube-api-access-lrwxb\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.559409 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.559422 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.566813 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319\": container with ID starting with 71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319 not found: ID does not exist" containerID="71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.566874 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319"} err="failed to get container status \"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319\": rpc error: code = NotFound desc = could not find container \"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319\": container with ID starting with 71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319 not found: ID does not exist" Jan 20 
23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.566902 5030 scope.go:117] "RemoveContainer" containerID="51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.567004 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.573834 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b\": container with ID starting with 51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b not found: ID does not exist" containerID="51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.573885 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b"} err="failed to get container status \"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b\": rpc error: code = NotFound desc = could not find container \"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b\": container with ID starting with 51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b not found: ID does not exist" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.573908 5030 scope.go:117] "RemoveContainer" containerID="71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.576564 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319"} err="failed to get container status \"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319\": rpc error: code = NotFound desc = could not find container \"71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319\": container with ID starting with 71c4c39482f956be06a3c73cc57b383e37cb77e1080c2c0bbff03f1fde163319 not found: ID does not exist" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.576641 5030 scope.go:117] "RemoveContainer" containerID="51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.578858 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b"} err="failed to get container status \"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b\": rpc error: code = NotFound desc = could not find container \"51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b\": container with ID starting with 51534747f7e1a405cf462716056ca83499927c3299e08df1211867967fee2a0b not found: ID does not exist" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.578900 5030 scope.go:117] "RemoveContainer" containerID="3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.591476 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.592053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594a7bb2-910d-4167-a4cb-33e23ee9e381" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 
23:56:24.592076 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="594a7bb2-910d-4167-a4cb-33e23ee9e381" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.592092 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-log" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.592100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-log" Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.592115 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-metadata" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.592124 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-metadata" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.592368 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="594a7bb2-910d-4167-a4cb-33e23ee9e381" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.592403 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-log" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.592418 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" containerName="nova-metadata-metadata" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.593836 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.595723 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.595912 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.599957 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.604657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz" (OuterVolumeSpecName: "kube-api-access-66qzz") pod "594a7bb2-910d-4167-a4cb-33e23ee9e381" (UID: "594a7bb2-910d-4167-a4cb-33e23ee9e381"). InnerVolumeSpecName "kube-api-access-66qzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.625536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594a7bb2-910d-4167-a4cb-33e23ee9e381" (UID: "594a7bb2-910d-4167-a4cb-33e23ee9e381"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.633341 5030 scope.go:117] "RemoveContainer" containerID="3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169" Jan 20 23:56:24 crc kubenswrapper[5030]: E0120 23:56:24.635035 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169\": container with ID starting with 3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169 not found: ID does not exist" containerID="3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.635070 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169"} err="failed to get container status \"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169\": rpc error: code = NotFound desc = could not find container \"3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169\": container with ID starting with 3189fd55ab3a064c762782e07ab68083b2632c77c2dd353f6686fbfaee8ea169 not found: ID does not exist" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.636266 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data" (OuterVolumeSpecName: "config-data") pod "594a7bb2-910d-4167-a4cb-33e23ee9e381" (UID: "594a7bb2-910d-4167-a4cb-33e23ee9e381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.660950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.660999 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661044 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khqt9\" (UniqueName: \"kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661240 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661271 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66qzz\" (UniqueName: \"kubernetes.io/projected/594a7bb2-910d-4167-a4cb-33e23ee9e381-kube-api-access-66qzz\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.661286 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594a7bb2-910d-4167-a4cb-33e23ee9e381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.762387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.762781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.763068 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.763669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khqt9\" (UniqueName: \"kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.763820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.763993 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.767257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.768276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.768547 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.777948 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khqt9\" (UniqueName: \"kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9\") pod \"nova-metadata-0\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.886348 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.924708 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.966730 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.968038 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.970955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.971313 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-novncproxy-config-data" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.972533 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 23:56:24 crc kubenswrapper[5030]: I0120 23:56:24.978492 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.022819 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.059663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.069646 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.069821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.069892 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.070042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.070238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dxfm\" (UniqueName: \"kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.172558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.172925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.172943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc 
kubenswrapper[5030]: I0120 23:56:25.173012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.173040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dxfm\" (UniqueName: \"kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.178503 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.179812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.180830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.184309 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.190296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dxfm\" (UniqueName: \"kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm\") pod \"nova-cell1-novncproxy-0\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.294066 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.490319 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.779522 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:56:25 crc kubenswrapper[5030]: W0120 23:56:25.949919 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod430fcc6b_0049_4829_8386_bd2ea73bd5f9.slice/crio-e1255ade31fcc32a0dfa00fd0c67753c40b07873dc5759995ba8d63c1523bc41 WatchSource:0}: Error finding container e1255ade31fcc32a0dfa00fd0c67753c40b07873dc5759995ba8d63c1523bc41: Status 404 returned error can't find the container with id e1255ade31fcc32a0dfa00fd0c67753c40b07873dc5759995ba8d63c1523bc41 Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.975866 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594a7bb2-910d-4167-a4cb-33e23ee9e381" path="/var/lib/kubelet/pods/594a7bb2-910d-4167-a4cb-33e23ee9e381/volumes" Jan 20 23:56:25 crc kubenswrapper[5030]: I0120 23:56:25.976646 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aed4a9c-2e97-4d36-b72a-b52cbe77c23d" path="/var/lib/kubelet/pods/5aed4a9c-2e97-4d36-b72a-b52cbe77c23d/volumes" Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.554547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerStarted","Data":"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db"} Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.554962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerStarted","Data":"e1255ade31fcc32a0dfa00fd0c67753c40b07873dc5759995ba8d63c1523bc41"} Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.556771 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"231da45e-cd6f-435b-9209-0d8794fa33ad","Type":"ContainerStarted","Data":"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da"} Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.556828 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"231da45e-cd6f-435b-9209-0d8794fa33ad","Type":"ContainerStarted","Data":"0af217e7af4fddc0a9f0a907f30ae0f7f766a3608f64dddd1f9bb3ea15f3ac9b"} Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.576561 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podStartSLOduration=2.576543513 podStartE2EDuration="2.576543513s" podCreationTimestamp="2026-01-20 23:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:26.572403273 +0000 UTC m=+4858.892663561" watchObservedRunningTime="2026-01-20 23:56:26.576543513 +0000 UTC m=+4858.896803801" Jan 20 23:56:26 crc kubenswrapper[5030]: I0120 23:56:26.808318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.625503 5030 
generic.go:334] "Generic (PLEG): container finished" podID="1fd59946-c071-4913-973f-68c0ba64772d" containerID="c16032f8775ab4a7bd458a948244e294cfac5d089df162a717211e073be9778d" exitCode=0 Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.626385 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" event={"ID":"1fd59946-c071-4913-973f-68c0ba64772d","Type":"ContainerDied","Data":"c16032f8775ab4a7bd458a948244e294cfac5d089df162a717211e073be9778d"} Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.629606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerStarted","Data":"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344"} Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.632408 5030 generic.go:334] "Generic (PLEG): container finished" podID="dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" containerID="f534f8867f01b669a98be117b16cec81c30a05651a6531a8ebf984a5ed21c492" exitCode=0 Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.632452 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" event={"ID":"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142","Type":"ContainerDied","Data":"f534f8867f01b669a98be117b16cec81c30a05651a6531a8ebf984a5ed21c492"} Jan 20 23:56:28 crc kubenswrapper[5030]: I0120 23:56:28.704808 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=4.704784681 podStartE2EDuration="4.704784681s" podCreationTimestamp="2026-01-20 23:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:28.688557478 +0000 UTC m=+4861.008817766" watchObservedRunningTime="2026-01-20 23:56:28.704784681 +0000 UTC m=+4861.025044979" Jan 20 23:56:29 crc kubenswrapper[5030]: I0120 23:56:29.690825 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:29 crc kubenswrapper[5030]: I0120 23:56:29.691305 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.044783 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.045096 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.060835 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.137208 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.203647 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.212560 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snq9n\" (UniqueName: \"kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n\") pod \"1fd59946-c071-4913-973f-68c0ba64772d\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253489 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data\") pod \"1fd59946-c071-4913-973f-68c0ba64772d\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253529 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data\") pod \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62dtf\" (UniqueName: \"kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf\") pod \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle\") pod \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts\") pod \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\" (UID: \"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253654 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts\") pod \"1fd59946-c071-4913-973f-68c0ba64772d\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.253783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle\") pod \"1fd59946-c071-4913-973f-68c0ba64772d\" (UID: \"1fd59946-c071-4913-973f-68c0ba64772d\") " Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.261032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts" (OuterVolumeSpecName: "scripts") pod "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" (UID: "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.264074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts" (OuterVolumeSpecName: "scripts") pod "1fd59946-c071-4913-973f-68c0ba64772d" (UID: "1fd59946-c071-4913-973f-68c0ba64772d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.264193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf" (OuterVolumeSpecName: "kube-api-access-62dtf") pod "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" (UID: "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142"). InnerVolumeSpecName "kube-api-access-62dtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.265905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n" (OuterVolumeSpecName: "kube-api-access-snq9n") pod "1fd59946-c071-4913-973f-68c0ba64772d" (UID: "1fd59946-c071-4913-973f-68c0ba64772d"). InnerVolumeSpecName "kube-api-access-snq9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.294193 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.298926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fd59946-c071-4913-973f-68c0ba64772d" (UID: "1fd59946-c071-4913-973f-68c0ba64772d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.299945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data" (OuterVolumeSpecName: "config-data") pod "1fd59946-c071-4913-973f-68c0ba64772d" (UID: "1fd59946-c071-4913-973f-68c0ba64772d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.300136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" (UID: "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.307784 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data" (OuterVolumeSpecName: "config-data") pod "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" (UID: "dddfd43a-d45b-4e9e-9b5f-b4fc0401a142"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358129 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62dtf\" (UniqueName: \"kubernetes.io/projected/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-kube-api-access-62dtf\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358157 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358167 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358177 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358186 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358196 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snq9n\" (UniqueName: \"kubernetes.io/projected/1fd59946-c071-4913-973f-68c0ba64772d-kube-api-access-snq9n\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358205 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd59946-c071-4913-973f-68c0ba64772d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.358215 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.656984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" event={"ID":"dddfd43a-d45b-4e9e-9b5f-b4fc0401a142","Type":"ContainerDied","Data":"06a8d01b98047a34695d8eed827d07c0f9b42ef60e1d6f76d82e59648996f156"} Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.657023 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a8d01b98047a34695d8eed827d07c0f9b42ef60e1d6f76d82e59648996f156" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.657079 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.663704 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.663963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v" event={"ID":"1fd59946-c071-4913-973f-68c0ba64772d","Type":"ContainerDied","Data":"86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8"} Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.664036 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86bcd6b9cc1f67910120fcec530e72fad75b9a15f7c92d860f8d462e730393e8" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.722122 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.754764 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:56:30 crc kubenswrapper[5030]: E0120 23:56:30.755282 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" containerName="nova-manage" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.755299 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" containerName="nova-manage" Jan 20 23:56:30 crc kubenswrapper[5030]: E0120 23:56:30.755342 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd59946-c071-4913-973f-68c0ba64772d" containerName="nova-cell1-conductor-db-sync" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.755352 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd59946-c071-4913-973f-68c0ba64772d" containerName="nova-cell1-conductor-db-sync" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.755658 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd59946-c071-4913-973f-68c0ba64772d" containerName="nova-cell1-conductor-db-sync" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.755690 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" containerName="nova-manage" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.757366 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.766856 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-conductor-config-data" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.774014 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.131:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.774385 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.131:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.780132 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.870378 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrr4\" (UniqueName: \"kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.870436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.870760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.873263 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.896584 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.972307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrr4\" (UniqueName: \"kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.972377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc 
kubenswrapper[5030]: I0120 23:56:30.972458 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.977238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.977374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:30 crc kubenswrapper[5030]: I0120 23:56:30.990203 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrr4\" (UniqueName: \"kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4\") pod \"nova-cell1-conductor-0\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.091175 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.253805 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.493649 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.494152 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="74d2242a-b72f-40ed-ba26-3c1e7940a218" containerName="kube-state-metrics" containerID="cri-o://2f4be2996c4bb42f362ed54e4832f7079bdf293a0e0142c325547ed3531d22a0" gracePeriod=30 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.603288 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.673710 5030 generic.go:334] "Generic (PLEG): container finished" podID="74d2242a-b72f-40ed-ba26-3c1e7940a218" containerID="2f4be2996c4bb42f362ed54e4832f7079bdf293a0e0142c325547ed3531d22a0" exitCode=2 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.673794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"74d2242a-b72f-40ed-ba26-3c1e7940a218","Type":"ContainerDied","Data":"2f4be2996c4bb42f362ed54e4832f7079bdf293a0e0142c325547ed3531d22a0"} Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.675114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"09900067-cfe8-4543-8e6c-88c692d43ed8","Type":"ContainerStarted","Data":"659cc44757f7d43dc2a9c4209e12b7746f41c307a3d0984ea1e78aa6fdd9a896"} Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.675230 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-log" containerID="cri-o://a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546" gracePeriod=30 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.675318 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-api" containerID="cri-o://d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac" gracePeriod=30 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.675661 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-log" containerID="cri-o://3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" gracePeriod=30 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.675734 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-metadata" containerID="cri-o://9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" gracePeriod=30 Jan 20 23:56:31 crc kubenswrapper[5030]: I0120 23:56:31.952802 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.006580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qlnq\" (UniqueName: \"kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq\") pod \"74d2242a-b72f-40ed-ba26-3c1e7940a218\" (UID: \"74d2242a-b72f-40ed-ba26-3c1e7940a218\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.060121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq" (OuterVolumeSpecName: "kube-api-access-5qlnq") pod "74d2242a-b72f-40ed-ba26-3c1e7940a218" (UID: "74d2242a-b72f-40ed-ba26-3c1e7940a218"). InnerVolumeSpecName "kube-api-access-5qlnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.114367 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qlnq\" (UniqueName: \"kubernetes.io/projected/74d2242a-b72f-40ed-ba26-3c1e7940a218-kube-api-access-5qlnq\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.446629 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.527206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data\") pod \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.527256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khqt9\" (UniqueName: \"kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9\") pod \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.527423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle\") pod \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.527504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs\") pod \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.527551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs\") pod \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\" (UID: \"430fcc6b-0049-4829-8386-bd2ea73bd5f9\") " Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.528447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs" (OuterVolumeSpecName: "logs") pod "430fcc6b-0049-4829-8386-bd2ea73bd5f9" (UID: "430fcc6b-0049-4829-8386-bd2ea73bd5f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.533608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9" (OuterVolumeSpecName: "kube-api-access-khqt9") pod "430fcc6b-0049-4829-8386-bd2ea73bd5f9" (UID: "430fcc6b-0049-4829-8386-bd2ea73bd5f9"). InnerVolumeSpecName "kube-api-access-khqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.561895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data" (OuterVolumeSpecName: "config-data") pod "430fcc6b-0049-4829-8386-bd2ea73bd5f9" (UID: "430fcc6b-0049-4829-8386-bd2ea73bd5f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.564689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "430fcc6b-0049-4829-8386-bd2ea73bd5f9" (UID: "430fcc6b-0049-4829-8386-bd2ea73bd5f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.591393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "430fcc6b-0049-4829-8386-bd2ea73bd5f9" (UID: "430fcc6b-0049-4829-8386-bd2ea73bd5f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.630161 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.630196 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.630208 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/430fcc6b-0049-4829-8386-bd2ea73bd5f9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.630219 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/430fcc6b-0049-4829-8386-bd2ea73bd5f9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.630233 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khqt9\" (UniqueName: \"kubernetes.io/projected/430fcc6b-0049-4829-8386-bd2ea73bd5f9-kube-api-access-khqt9\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684450 5030 generic.go:334] "Generic (PLEG): container finished" podID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerID="9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" exitCode=0 Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684483 5030 generic.go:334] "Generic (PLEG): container finished" podID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerID="3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" exitCode=143 Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerDied","Data":"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerDied","Data":"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"430fcc6b-0049-4829-8386-bd2ea73bd5f9","Type":"ContainerDied","Data":"e1255ade31fcc32a0dfa00fd0c67753c40b07873dc5759995ba8d63c1523bc41"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684569 5030 scope.go:117] "RemoveContainer" containerID="9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.684732 5030 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.691231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"74d2242a-b72f-40ed-ba26-3c1e7940a218","Type":"ContainerDied","Data":"074ad098cb3f61f9ad45d4cfa44a4724fde8cfdbe893c62831c21e800e9b8a85"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.691320 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.696865 5030 generic.go:334] "Generic (PLEG): container finished" podID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerID="a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546" exitCode=143 Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.696957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerDied","Data":"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.699000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"09900067-cfe8-4543-8e6c-88c692d43ed8","Type":"ContainerStarted","Data":"44eaef6b2a5abf1f99511a45ea107f9625dee6ea706b24c87e2b305edec32984"} Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.699196 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" containerName="nova-scheduler-scheduler" containerID="cri-o://19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" gracePeriod=30 Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.712932 5030 scope.go:117] "RemoveContainer" containerID="3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.728573 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.736248 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.762815 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: E0120 23:56:32.763319 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-log" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763341 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-log" Jan 20 23:56:32 crc kubenswrapper[5030]: E0120 23:56:32.763375 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-metadata" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-metadata" Jan 20 23:56:32 crc kubenswrapper[5030]: E0120 23:56:32.763394 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d2242a-b72f-40ed-ba26-3c1e7940a218" 
containerName="kube-state-metrics" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763401 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d2242a-b72f-40ed-ba26-3c1e7940a218" containerName="kube-state-metrics" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763610 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-log" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763669 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d2242a-b72f-40ed-ba26-3c1e7940a218" containerName="kube-state-metrics" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.763688 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" containerName="nova-metadata-metadata" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.764947 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.766901 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.767911 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.770837 5030 scope.go:117] "RemoveContainer" containerID="9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" Jan 20 23:56:32 crc kubenswrapper[5030]: E0120 23:56:32.771313 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344\": container with ID starting with 9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344 not found: ID does not exist" containerID="9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.771341 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344"} err="failed to get container status \"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344\": rpc error: code = NotFound desc = could not find container \"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344\": container with ID starting with 9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344 not found: ID does not exist" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.771370 5030 scope.go:117] "RemoveContainer" containerID="3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" Jan 20 23:56:32 crc kubenswrapper[5030]: E0120 23:56:32.771960 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db\": container with ID starting with 3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db not found: ID does not exist" containerID="3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.771998 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db"} err="failed to get container status 
\"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db\": rpc error: code = NotFound desc = could not find container \"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db\": container with ID starting with 3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db not found: ID does not exist" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.772013 5030 scope.go:117] "RemoveContainer" containerID="9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.772225 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344"} err="failed to get container status \"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344\": rpc error: code = NotFound desc = could not find container \"9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344\": container with ID starting with 9796784efa96d7098fbf6283b1827320e1d9bc2928e37c4ffc31977075d35344 not found: ID does not exist" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.772250 5030 scope.go:117] "RemoveContainer" containerID="3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.772459 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db"} err="failed to get container status \"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db\": rpc error: code = NotFound desc = could not find container \"3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db\": container with ID starting with 3a9ed0f25a82488338649eb6a4ca42e0598b82422b3d8c23dcb7fd567cc7f2db not found: ID does not exist" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.772802 5030 scope.go:117] "RemoveContainer" containerID="2f4be2996c4bb42f362ed54e4832f7079bdf293a0e0142c325547ed3531d22a0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.775940 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.785452 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podStartSLOduration=2.785433823 podStartE2EDuration="2.785433823s" podCreationTimestamp="2026-01-20 23:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:32.755188258 +0000 UTC m=+4865.075448546" watchObservedRunningTime="2026-01-20 23:56:32.785433823 +0000 UTC m=+4865.105694111" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.807664 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.815689 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.826144 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.827442 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.829092 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"kube-state-metrics-tls-config" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.836195 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-kube-state-metrics-svc" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.837022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.837123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.837153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.837256 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mcm7\" (UniqueName: \"kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.837293 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.859544 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939141 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9np\" (UniqueName: \"kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939192 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5mcm7\" (UniqueName: \"kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939700 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.939796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.947593 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.947807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.950085 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:32 crc kubenswrapper[5030]: I0120 23:56:32.955371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mcm7\" (UniqueName: \"kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7\") pod \"nova-metadata-0\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.041603 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.041675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.041729 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9np\" (UniqueName: \"kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.041808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.048880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.049337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.050422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.059177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9np\" (UniqueName: \"kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np\") pod \"kube-state-metrics-0\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.081314 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.147377 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.552513 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.638248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.699136 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.699474 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="proxy-httpd" containerID="cri-o://ba950ce10f8a0dc5a72fb21cd413ad12ab0963b3033564d8d27ba0b98d1c7b6a" gracePeriod=30 Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.699565 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="sg-core" containerID="cri-o://4754b89700ad44cb51e2a954b5f05731f1a3294c5c5c42bba97252b7672980c7" gracePeriod=30 Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.699559 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-notification-agent" containerID="cri-o://912c48f793e3654ce8dbddbd917036f67d80141af963cffe6cd33fabc32f5a4f" gracePeriod=30 Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.699909 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-central-agent" containerID="cri-o://82ee678d8f0c37c036b2ee21b6fa99bf21b0ae83829d6946c69273cc65331ff4" gracePeriod=30 Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.710654 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:33 crc kubenswrapper[5030]: W0120 23:56:33.850305 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02190451_840d_4e4e_8b67_a14ac1f9712c.slice/crio-70b7b87cadfbc887c22fbf5316e31d57f694d46fb6e0c71a375307b3c9eeafb5 WatchSource:0}: Error finding container 70b7b87cadfbc887c22fbf5316e31d57f694d46fb6e0c71a375307b3c9eeafb5: Status 404 returned error can't find the 
container with id 70b7b87cadfbc887c22fbf5316e31d57f694d46fb6e0c71a375307b3c9eeafb5 Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.986080 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430fcc6b-0049-4829-8386-bd2ea73bd5f9" path="/var/lib/kubelet/pods/430fcc6b-0049-4829-8386-bd2ea73bd5f9/volumes" Jan 20 23:56:33 crc kubenswrapper[5030]: I0120 23:56:33.990865 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d2242a-b72f-40ed-ba26-3c1e7940a218" path="/var/lib/kubelet/pods/74d2242a-b72f-40ed-ba26-3c1e7940a218/volumes" Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723086 5030 generic.go:334] "Generic (PLEG): container finished" podID="a823f180-b204-43e1-a009-442a930ef6ce" containerID="ba950ce10f8a0dc5a72fb21cd413ad12ab0963b3033564d8d27ba0b98d1c7b6a" exitCode=0 Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723336 5030 generic.go:334] "Generic (PLEG): container finished" podID="a823f180-b204-43e1-a009-442a930ef6ce" containerID="4754b89700ad44cb51e2a954b5f05731f1a3294c5c5c42bba97252b7672980c7" exitCode=2 Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723346 5030 generic.go:334] "Generic (PLEG): container finished" podID="a823f180-b204-43e1-a009-442a930ef6ce" containerID="912c48f793e3654ce8dbddbd917036f67d80141af963cffe6cd33fabc32f5a4f" exitCode=0 Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723353 5030 generic.go:334] "Generic (PLEG): container finished" podID="a823f180-b204-43e1-a009-442a930ef6ce" containerID="82ee678d8f0c37c036b2ee21b6fa99bf21b0ae83829d6946c69273cc65331ff4" exitCode=0 Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723584 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerDied","Data":"ba950ce10f8a0dc5a72fb21cd413ad12ab0963b3033564d8d27ba0b98d1c7b6a"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerDied","Data":"4754b89700ad44cb51e2a954b5f05731f1a3294c5c5c42bba97252b7672980c7"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerDied","Data":"912c48f793e3654ce8dbddbd917036f67d80141af963cffe6cd33fabc32f5a4f"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.723985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerDied","Data":"82ee678d8f0c37c036b2ee21b6fa99bf21b0ae83829d6946c69273cc65331ff4"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.725367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerStarted","Data":"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.725400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerStarted","Data":"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.725411 5030 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerStarted","Data":"227ee7e0208a069c51a2b830e7ae0d95caf9df543cf0498697a32a63b53fb71e"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.729269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02190451-840d-4e4e-8b67-a14ac1f9712c","Type":"ContainerStarted","Data":"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.729305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02190451-840d-4e4e-8b67-a14ac1f9712c","Type":"ContainerStarted","Data":"70b7b87cadfbc887c22fbf5316e31d57f694d46fb6e0c71a375307b3c9eeafb5"} Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.729318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.747302 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=2.747279634 podStartE2EDuration="2.747279634s" podCreationTimestamp="2026-01-20 23:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:34.740808647 +0000 UTC m=+4867.061068935" watchObservedRunningTime="2026-01-20 23:56:34.747279634 +0000 UTC m=+4867.067539922" Jan 20 23:56:34 crc kubenswrapper[5030]: I0120 23:56:34.770689 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kube-state-metrics-0" podStartSLOduration=2.329090719 podStartE2EDuration="2.770673392s" podCreationTimestamp="2026-01-20 23:56:32 +0000 UTC" firstStartedPulling="2026-01-20 23:56:33.866908427 +0000 UTC m=+4866.187168735" lastFinishedPulling="2026-01-20 23:56:34.30849112 +0000 UTC m=+4866.628751408" observedRunningTime="2026-01-20 23:56:34.767002352 +0000 UTC m=+4867.087262640" watchObservedRunningTime="2026-01-20 23:56:34.770673392 +0000 UTC m=+4867.090933680" Jan 20 23:56:35 crc kubenswrapper[5030]: E0120 23:56:35.061081 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:56:35 crc kubenswrapper[5030]: E0120 23:56:35.063435 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:56:35 crc kubenswrapper[5030]: E0120 23:56:35.064604 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:56:35 crc kubenswrapper[5030]: E0120 23:56:35.064694 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" containerName="nova-scheduler-scheduler" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.067371 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.194902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bzc\" (UniqueName: \"kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195099 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195129 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.195273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts\") pod \"a823f180-b204-43e1-a009-442a930ef6ce\" (UID: \"a823f180-b204-43e1-a009-442a930ef6ce\") " Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.196104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.196169 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.200962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts" (OuterVolumeSpecName: "scripts") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.200993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc" (OuterVolumeSpecName: "kube-api-access-l5bzc") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "kube-api-access-l5bzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.227094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.285430 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data" (OuterVolumeSpecName: "config-data") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.294678 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297768 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5bzc\" (UniqueName: \"kubernetes.io/projected/a823f180-b204-43e1-a009-442a930ef6ce-kube-api-access-l5bzc\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297800 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297812 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297820 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297828 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a823f180-b204-43e1-a009-442a930ef6ce-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.297837 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.314050 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.746962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"a823f180-b204-43e1-a009-442a930ef6ce","Type":"ContainerDied","Data":"06d9f4f169d74164fe584438dda0bad3423c64ff31221eca53f34234f81a3d67"} Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.747346 5030 scope.go:117] "RemoveContainer" containerID="ba950ce10f8a0dc5a72fb21cd413ad12ab0963b3033564d8d27ba0b98d1c7b6a" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.747053 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:35 crc kubenswrapper[5030]: I0120 23:56:35.767309 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.297329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a823f180-b204-43e1-a009-442a930ef6ce" (UID: "a823f180-b204-43e1-a009-442a930ef6ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.318026 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a823f180-b204-43e1-a009-442a930ef6ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.488113 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.497716 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.506889 5030 scope.go:117] "RemoveContainer" containerID="4754b89700ad44cb51e2a954b5f05731f1a3294c5c5c42bba97252b7672980c7" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.534954 5030 scope.go:117] "RemoveContainer" containerID="912c48f793e3654ce8dbddbd917036f67d80141af963cffe6cd33fabc32f5a4f" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.595982 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.600032 5030 scope.go:117] "RemoveContainer" containerID="82ee678d8f0c37c036b2ee21b6fa99bf21b0ae83829d6946c69273cc65331ff4" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.607327 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.615772 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-central-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616320 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-central-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616338 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="sg-core" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616347 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="sg-core" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616366 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="proxy-httpd" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616375 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="proxy-httpd" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616389 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-log" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616400 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-log" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616416 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-api" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616426 
5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-api" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.616449 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-notification-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616457 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-notification-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616754 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-api" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616780 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="sg-core" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616802 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerName="nova-api-log" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616820 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="proxy-httpd" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616842 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-central-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.616875 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a823f180-b204-43e1-a009-442a930ef6ce" containerName="ceilometer-notification-agent" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.619117 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624317 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624412 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svdh\" (UniqueName: \"kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh\") pod \"c253f065-5420-4b1e-b1dd-f6a93727b59c\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data\") pod \"c253f065-5420-4b1e-b1dd-f6a93727b59c\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624656 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624825 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.624656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs\") pod \"c253f065-5420-4b1e-b1dd-f6a93727b59c\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.625237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle\") pod \"c253f065-5420-4b1e-b1dd-f6a93727b59c\" (UID: \"c253f065-5420-4b1e-b1dd-f6a93727b59c\") " Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.625377 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs" (OuterVolumeSpecName: "logs") pod "c253f065-5420-4b1e-b1dd-f6a93727b59c" (UID: "c253f065-5420-4b1e-b1dd-f6a93727b59c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.626099 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c253f065-5420-4b1e-b1dd-f6a93727b59c-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.627125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.629193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh" (OuterVolumeSpecName: "kube-api-access-7svdh") pod "c253f065-5420-4b1e-b1dd-f6a93727b59c" (UID: "c253f065-5420-4b1e-b1dd-f6a93727b59c"). InnerVolumeSpecName "kube-api-access-7svdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.657867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data" (OuterVolumeSpecName: "config-data") pod "c253f065-5420-4b1e-b1dd-f6a93727b59c" (UID: "c253f065-5420-4b1e-b1dd-f6a93727b59c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.663739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c253f065-5420-4b1e-b1dd-f6a93727b59c" (UID: "c253f065-5420-4b1e-b1dd-f6a93727b59c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.727879 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.727945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728308 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4hjr\" (UniqueName: \"kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.728845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.729024 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.729048 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c253f065-5420-4b1e-b1dd-f6a93727b59c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.729060 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7svdh\" (UniqueName: \"kubernetes.io/projected/c253f065-5420-4b1e-b1dd-f6a93727b59c-kube-api-access-7svdh\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.768309 5030 generic.go:334] "Generic (PLEG): container finished" podID="c253f065-5420-4b1e-b1dd-f6a93727b59c" containerID="d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac" exitCode=0 Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.768410 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.768352 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerDied","Data":"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac"} Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.768795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"c253f065-5420-4b1e-b1dd-f6a93727b59c","Type":"ContainerDied","Data":"bc5bb1cadb4843a1a8542de935ebb4346bb9262014982843d641aac652f47e60"} Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.768838 5030 scope.go:117] "RemoveContainer" containerID="d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.795636 5030 scope.go:117] "RemoveContainer" containerID="a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.831425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.832210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.832277 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.832322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.832457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.832725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.833021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.833056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4hjr\" (UniqueName: \"kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.841634 5030 scope.go:117] "RemoveContainer" containerID="d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.843859 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.844410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.844808 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac\": container with ID starting with d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac not found: ID does not exist" containerID="d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.844864 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac"} err="failed to get container status \"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac\": rpc error: code = NotFound desc = could not find container \"d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac\": container with ID starting with d0453ec9a12f63421e9622610bd2f1ea38c565546d4a4e8ad3e5bb65ccd4a8ac not found: ID does not exist" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.844890 5030 scope.go:117] "RemoveContainer" containerID="a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.844986 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: E0120 23:56:36.845264 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546\": container with ID starting with a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546 not found: ID does not exist" containerID="a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.845295 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546"} err="failed to get container status \"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546\": rpc error: code = NotFound desc = could not find container \"a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546\": container with ID starting with a62e1c7d7b87eac8e29fdfa82bffb2b8f6663c1ee5bde55f5904578e4a56a546 not found: ID does not exist" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.845419 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.846073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.847068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.850530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.850756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.865958 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.867603 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4hjr\" (UniqueName: \"kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr\") pod \"ceilometer-0\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.874390 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.878066 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.883375 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.883981 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.936458 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.936601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r64\" (UniqueName: \"kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.936651 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.936675 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.986244 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-trpng"] Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.987713 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.990132 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-config-data" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.990265 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-manage-scripts" Jan 20 23:56:36 crc kubenswrapper[5030]: I0120 23:56:36.997272 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-trpng"] Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.006965 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r64\" (UniqueName: \"kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 
23:56:37.044445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.044476 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.049799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.056216 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.061253 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r64\" (UniqueName: \"kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64\") pod \"nova-api-0\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.146381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.146769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.146797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.146859 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.150826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.151324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.152226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.180101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm\") pod \"nova-cell1-cell-mapping-trpng\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.215598 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.348937 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.540976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.781212 5030 generic.go:334] "Generic (PLEG): container finished" podID="c288c177-9fd3-4b77-b251-3f53913fb132" containerID="19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" exitCode=0 Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.781261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c288c177-9fd3-4b77-b251-3f53913fb132","Type":"ContainerDied","Data":"19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0"} Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.782431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerStarted","Data":"b2f2622b0737c27e23f33156e17e809d60de83ae5a1421217e02730beb31900b"} Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.867975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:37 crc kubenswrapper[5030]: W0120 23:56:37.872487 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3928fcd_06b6_4d38_9a64_8213d238e615.slice/crio-3c28f318d32af35728a197f4f46ed6991b7e2ab8d7aabf1c4a04123f025bbc9b WatchSource:0}: Error finding container 3c28f318d32af35728a197f4f46ed6991b7e2ab8d7aabf1c4a04123f025bbc9b: Status 404 returned error can't find the container with id 3c28f318d32af35728a197f4f46ed6991b7e2ab8d7aabf1c4a04123f025bbc9b Jan 
20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.982235 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a823f180-b204-43e1-a009-442a930ef6ce" path="/var/lib/kubelet/pods/a823f180-b204-43e1-a009-442a930ef6ce/volumes" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.983112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c253f065-5420-4b1e-b1dd-f6a93727b59c" path="/var/lib/kubelet/pods/c253f065-5420-4b1e-b1dd-f6a93727b59c/volumes" Jan 20 23:56:37 crc kubenswrapper[5030]: I0120 23:56:37.983722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-trpng"] Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.060862 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.082638 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.083436 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.172293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle\") pod \"c288c177-9fd3-4b77-b251-3f53913fb132\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.172667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data\") pod \"c288c177-9fd3-4b77-b251-3f53913fb132\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.172798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksdgl\" (UniqueName: \"kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl\") pod \"c288c177-9fd3-4b77-b251-3f53913fb132\" (UID: \"c288c177-9fd3-4b77-b251-3f53913fb132\") " Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.178155 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl" (OuterVolumeSpecName: "kube-api-access-ksdgl") pod "c288c177-9fd3-4b77-b251-3f53913fb132" (UID: "c288c177-9fd3-4b77-b251-3f53913fb132"). InnerVolumeSpecName "kube-api-access-ksdgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.221882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data" (OuterVolumeSpecName: "config-data") pod "c288c177-9fd3-4b77-b251-3f53913fb132" (UID: "c288c177-9fd3-4b77-b251-3f53913fb132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.236672 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c288c177-9fd3-4b77-b251-3f53913fb132" (UID: "c288c177-9fd3-4b77-b251-3f53913fb132"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.274861 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.274976 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c288c177-9fd3-4b77-b251-3f53913fb132-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.275003 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksdgl\" (UniqueName: \"kubernetes.io/projected/c288c177-9fd3-4b77-b251-3f53913fb132-kube-api-access-ksdgl\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.795557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerStarted","Data":"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.795618 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerStarted","Data":"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.797309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerStarted","Data":"3c28f318d32af35728a197f4f46ed6991b7e2ab8d7aabf1c4a04123f025bbc9b"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.798609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerStarted","Data":"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.813023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" event={"ID":"1797744b-0052-4c51-bc74-77bfcf4329c2","Type":"ContainerStarted","Data":"db2bb0cf2f1a80c4d2224f7a8ac219f2f023585a6875d7b9b1e1d5f85b801548"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.813109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" event={"ID":"1797744b-0052-4c51-bc74-77bfcf4329c2","Type":"ContainerStarted","Data":"71eb0984ce577d5cbc947e7193e20599c570b0bec4079c1c1a6984d3cfed993a"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.820713 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.820697979 podStartE2EDuration="2.820697979s" podCreationTimestamp="2026-01-20 23:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:38.815403672 +0000 UTC m=+4871.135663960" watchObservedRunningTime="2026-01-20 23:56:38.820697979 +0000 UTC m=+4871.140958267" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.826550 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.826659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"c288c177-9fd3-4b77-b251-3f53913fb132","Type":"ContainerDied","Data":"384c3e3ae2a029a72767df12518f8fca4851c0469a3099e6aec19e5c18edc000"} Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.826711 5030 scope.go:117] "RemoveContainer" containerID="19ecc5790c45d4d3f8c3584eb1b9b7a0cc1009e8712c72a7ba0b5c78d51abbd0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.832530 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" podStartSLOduration=2.832511946 podStartE2EDuration="2.832511946s" podCreationTimestamp="2026-01-20 23:56:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:38.830581119 +0000 UTC m=+4871.150841397" watchObservedRunningTime="2026-01-20 23:56:38.832511946 +0000 UTC m=+4871.152772244" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.884686 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.899474 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.911261 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:38 crc kubenswrapper[5030]: E0120 23:56:38.911714 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" containerName="nova-scheduler-scheduler" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.911730 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" containerName="nova-scheduler-scheduler" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.911899 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" containerName="nova-scheduler-scheduler" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.912526 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.915955 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:56:38 crc kubenswrapper[5030]: I0120 23:56:38.919964 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.001917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.001967 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.002002 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmfwg\" (UniqueName: \"kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.104904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.105186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.105298 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmfwg\" (UniqueName: \"kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.121736 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.125483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.145153 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmfwg\" (UniqueName: \"kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg\") pod \"nova-scheduler-0\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.231393 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.710241 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:39 crc kubenswrapper[5030]: W0120 23:56:39.716898 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d8a1422_6517_4308_b09b_5a880db7dd72.slice/crio-9eaa05cc426da6eb6fb99fa35ffeea73995b0082dfbe811af93c900b1e841868 WatchSource:0}: Error finding container 9eaa05cc426da6eb6fb99fa35ffeea73995b0082dfbe811af93c900b1e841868: Status 404 returned error can't find the container with id 9eaa05cc426da6eb6fb99fa35ffeea73995b0082dfbe811af93c900b1e841868 Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.856357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerStarted","Data":"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0"} Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.858723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9d8a1422-6517-4308-b09b-5a880db7dd72","Type":"ContainerStarted","Data":"9eaa05cc426da6eb6fb99fa35ffeea73995b0082dfbe811af93c900b1e841868"} Jan 20 23:56:39 crc kubenswrapper[5030]: I0120 23:56:39.979046 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c288c177-9fd3-4b77-b251-3f53913fb132" path="/var/lib/kubelet/pods/c288c177-9fd3-4b77-b251-3f53913fb132/volumes" Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.157896 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.157964 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.158008 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.158679 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.158733 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc" gracePeriod=600 Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.869095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerStarted","Data":"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb"} Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.871276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9d8a1422-6517-4308-b09b-5a880db7dd72","Type":"ContainerStarted","Data":"bed8b587581e0579d490db5878f3a422c9ef7b76560426ab394ebc02b200e802"} Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.874488 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc" exitCode=0 Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.874578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc"} Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.874645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc"} Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.874669 5030 scope.go:117] "RemoveContainer" containerID="42bdbd6195933aa9b44f598787faebe4f7452158d085ccaadab91e3f12fad121" Jan 20 23:56:40 crc kubenswrapper[5030]: I0120 23:56:40.892647 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.892613451 podStartE2EDuration="2.892613451s" podCreationTimestamp="2026-01-20 23:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:40.888919552 +0000 UTC m=+4873.209179840" watchObservedRunningTime="2026-01-20 23:56:40.892613451 +0000 UTC m=+4873.212873749" Jan 20 23:56:41 crc kubenswrapper[5030]: I0120 23:56:41.885499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerStarted","Data":"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071"} Jan 20 23:56:41 crc kubenswrapper[5030]: I0120 23:56:41.886793 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:56:41 crc kubenswrapper[5030]: I0120 23:56:41.919333 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=2.386458045 podStartE2EDuration="5.919312577s" podCreationTimestamp="2026-01-20 23:56:36 +0000 UTC" firstStartedPulling="2026-01-20 23:56:37.574300434 +0000 UTC m=+4869.894560722" lastFinishedPulling="2026-01-20 23:56:41.107154966 +0000 UTC 
m=+4873.427415254" observedRunningTime="2026-01-20 23:56:41.916973011 +0000 UTC m=+4874.237233309" watchObservedRunningTime="2026-01-20 23:56:41.919312577 +0000 UTC m=+4874.239572875" Jan 20 23:56:43 crc kubenswrapper[5030]: I0120 23:56:43.082710 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:43 crc kubenswrapper[5030]: I0120 23:56:43.083072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:43 crc kubenswrapper[5030]: I0120 23:56:43.161049 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:56:43 crc kubenswrapper[5030]: I0120 23:56:43.912425 5030 generic.go:334] "Generic (PLEG): container finished" podID="1797744b-0052-4c51-bc74-77bfcf4329c2" containerID="db2bb0cf2f1a80c4d2224f7a8ac219f2f023585a6875d7b9b1e1d5f85b801548" exitCode=0 Jan 20 23:56:43 crc kubenswrapper[5030]: I0120 23:56:43.912530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" event={"ID":"1797744b-0052-4c51-bc74-77bfcf4329c2","Type":"ContainerDied","Data":"db2bb0cf2f1a80c4d2224f7a8ac219f2f023585a6875d7b9b1e1d5f85b801548"} Jan 20 23:56:44 crc kubenswrapper[5030]: I0120 23:56:44.095948 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.139:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:44 crc kubenswrapper[5030]: I0120 23:56:44.095959 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.139:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:44 crc kubenswrapper[5030]: I0120 23:56:44.231852 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.338292 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.435911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts\") pod \"1797744b-0052-4c51-bc74-77bfcf4329c2\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.436066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data\") pod \"1797744b-0052-4c51-bc74-77bfcf4329c2\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.436111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle\") pod \"1797744b-0052-4c51-bc74-77bfcf4329c2\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.436283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm\") pod \"1797744b-0052-4c51-bc74-77bfcf4329c2\" (UID: \"1797744b-0052-4c51-bc74-77bfcf4329c2\") " Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.441825 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm" (OuterVolumeSpecName: "kube-api-access-6d7pm") pod "1797744b-0052-4c51-bc74-77bfcf4329c2" (UID: "1797744b-0052-4c51-bc74-77bfcf4329c2"). InnerVolumeSpecName "kube-api-access-6d7pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.442772 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts" (OuterVolumeSpecName: "scripts") pod "1797744b-0052-4c51-bc74-77bfcf4329c2" (UID: "1797744b-0052-4c51-bc74-77bfcf4329c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.474679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1797744b-0052-4c51-bc74-77bfcf4329c2" (UID: "1797744b-0052-4c51-bc74-77bfcf4329c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.484337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data" (OuterVolumeSpecName: "config-data") pod "1797744b-0052-4c51-bc74-77bfcf4329c2" (UID: "1797744b-0052-4c51-bc74-77bfcf4329c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.538816 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d7pm\" (UniqueName: \"kubernetes.io/projected/1797744b-0052-4c51-bc74-77bfcf4329c2-kube-api-access-6d7pm\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.538855 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.538872 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.538886 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1797744b-0052-4c51-bc74-77bfcf4329c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.938827 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" event={"ID":"1797744b-0052-4c51-bc74-77bfcf4329c2","Type":"ContainerDied","Data":"71eb0984ce577d5cbc947e7193e20599c570b0bec4079c1c1a6984d3cfed993a"} Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.938918 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71eb0984ce577d5cbc947e7193e20599c570b0bec4079c1c1a6984d3cfed993a" Jan 20 23:56:45 crc kubenswrapper[5030]: I0120 23:56:45.938927 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-cell-mapping-trpng" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.125263 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.125478 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-log" containerID="cri-o://9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" gracePeriod=30 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.125653 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-api" containerID="cri-o://c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" gracePeriod=30 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.173294 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.173497 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="9d8a1422-6517-4308-b09b-5a880db7dd72" containerName="nova-scheduler-scheduler" containerID="cri-o://bed8b587581e0579d490db5878f3a422c9ef7b76560426ab394ebc02b200e802" gracePeriod=30 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.208423 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.208641 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-log" containerID="cri-o://d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2" gracePeriod=30 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.208739 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-metadata" containerID="cri-o://1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734" gracePeriod=30 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.746697 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.867376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle\") pod \"a3928fcd-06b6-4d38-9a64-8213d238e615\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.867514 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data\") pod \"a3928fcd-06b6-4d38-9a64-8213d238e615\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.867550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs\") pod \"a3928fcd-06b6-4d38-9a64-8213d238e615\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.867775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r64\" (UniqueName: \"kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64\") pod \"a3928fcd-06b6-4d38-9a64-8213d238e615\" (UID: \"a3928fcd-06b6-4d38-9a64-8213d238e615\") " Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.868207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs" (OuterVolumeSpecName: "logs") pod "a3928fcd-06b6-4d38-9a64-8213d238e615" (UID: "a3928fcd-06b6-4d38-9a64-8213d238e615"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.872598 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64" (OuterVolumeSpecName: "kube-api-access-m9r64") pod "a3928fcd-06b6-4d38-9a64-8213d238e615" (UID: "a3928fcd-06b6-4d38-9a64-8213d238e615"). InnerVolumeSpecName "kube-api-access-m9r64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.906981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data" (OuterVolumeSpecName: "config-data") pod "a3928fcd-06b6-4d38-9a64-8213d238e615" (UID: "a3928fcd-06b6-4d38-9a64-8213d238e615"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.918047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3928fcd-06b6-4d38-9a64-8213d238e615" (UID: "a3928fcd-06b6-4d38-9a64-8213d238e615"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.951972 5030 generic.go:334] "Generic (PLEG): container finished" podID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerID="d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2" exitCode=143 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.952051 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerDied","Data":"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2"} Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954080 5030 generic.go:334] "Generic (PLEG): container finished" podID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerID="c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" exitCode=0 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954121 5030 generic.go:334] "Generic (PLEG): container finished" podID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerID="9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" exitCode=143 Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954136 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerDied","Data":"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102"} Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerDied","Data":"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7"} Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"a3928fcd-06b6-4d38-9a64-8213d238e615","Type":"ContainerDied","Data":"3c28f318d32af35728a197f4f46ed6991b7e2ab8d7aabf1c4a04123f025bbc9b"} Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.954216 5030 scope.go:117] "RemoveContainer" containerID="c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.970343 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.970395 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3928fcd-06b6-4d38-9a64-8213d238e615-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.970419 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9r64\" (UniqueName: \"kubernetes.io/projected/a3928fcd-06b6-4d38-9a64-8213d238e615-kube-api-access-m9r64\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.970438 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3928fcd-06b6-4d38-9a64-8213d238e615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:46 crc kubenswrapper[5030]: I0120 23:56:46.994346 5030 
scope.go:117] "RemoveContainer" containerID="9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.002169 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.016002 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.029535 5030 scope.go:117] "RemoveContainer" containerID="c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.029540 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:47 crc kubenswrapper[5030]: E0120 23:56:47.030076 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1797744b-0052-4c51-bc74-77bfcf4329c2" containerName="nova-manage" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030094 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1797744b-0052-4c51-bc74-77bfcf4329c2" containerName="nova-manage" Jan 20 23:56:47 crc kubenswrapper[5030]: E0120 23:56:47.030122 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-api" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-api" Jan 20 23:56:47 crc kubenswrapper[5030]: E0120 23:56:47.030139 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-log" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030147 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-log" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030525 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-log" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030549 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1797744b-0052-4c51-bc74-77bfcf4329c2" containerName="nova-manage" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030561 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" containerName="nova-api-api" Jan 20 23:56:47 crc kubenswrapper[5030]: E0120 23:56:47.030728 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102\": container with ID starting with c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102 not found: ID does not exist" containerID="c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030765 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102"} err="failed to get container status \"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102\": rpc error: code = NotFound desc = could not find container \"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102\": container with ID starting with 
c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102 not found: ID does not exist" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.030787 5030 scope.go:117] "RemoveContainer" containerID="9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" Jan 20 23:56:47 crc kubenswrapper[5030]: E0120 23:56:47.031262 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7\": container with ID starting with 9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7 not found: ID does not exist" containerID="9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.031339 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7"} err="failed to get container status \"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7\": rpc error: code = NotFound desc = could not find container \"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7\": container with ID starting with 9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7 not found: ID does not exist" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.031380 5030 scope.go:117] "RemoveContainer" containerID="c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.031814 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.031851 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102"} err="failed to get container status \"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102\": rpc error: code = NotFound desc = could not find container \"c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102\": container with ID starting with c95298c4cce5f850ea646fc192798c923ee9cdb98464498673b26fe9af4b7102 not found: ID does not exist" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.031868 5030 scope.go:117] "RemoveContainer" containerID="9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.032935 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7"} err="failed to get container status \"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7\": rpc error: code = NotFound desc = could not find container \"9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7\": container with ID starting with 9c1f2f9a1985394f176e3b99bf66389962cf669885eae07e81358a02db6b2cb7 not found: ID does not exist" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.040095 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.041931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.173546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.173762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4nxw\" (UniqueName: \"kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.173945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.174117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.278057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.278157 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4nxw\" (UniqueName: \"kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.278240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.278291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.279179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.288715 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: 
I0120 23:56:47.289388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.308909 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4nxw\" (UniqueName: \"kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw\") pod \"nova-api-0\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.368750 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.825061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:56:47 crc kubenswrapper[5030]: W0120 23:56:47.838848 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6519c3f5_4768_4b48_81e9_c7e7713c9b04.slice/crio-758e41e2286f01353958b25602b5816b0ac1c3717a0b5ee4056e2e568d208fbf WatchSource:0}: Error finding container 758e41e2286f01353958b25602b5816b0ac1c3717a0b5ee4056e2e568d208fbf: Status 404 returned error can't find the container with id 758e41e2286f01353958b25602b5816b0ac1c3717a0b5ee4056e2e568d208fbf Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.969000 5030 generic.go:334] "Generic (PLEG): container finished" podID="9d8a1422-6517-4308-b09b-5a880db7dd72" containerID="bed8b587581e0579d490db5878f3a422c9ef7b76560426ab394ebc02b200e802" exitCode=0 Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.972675 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3928fcd-06b6-4d38-9a64-8213d238e615" path="/var/lib/kubelet/pods/a3928fcd-06b6-4d38-9a64-8213d238e615/volumes" Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.973335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerStarted","Data":"758e41e2286f01353958b25602b5816b0ac1c3717a0b5ee4056e2e568d208fbf"} Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.973367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9d8a1422-6517-4308-b09b-5a880db7dd72","Type":"ContainerDied","Data":"bed8b587581e0579d490db5878f3a422c9ef7b76560426ab394ebc02b200e802"} Jan 20 23:56:47 crc kubenswrapper[5030]: I0120 23:56:47.979839 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.092498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmfwg\" (UniqueName: \"kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg\") pod \"9d8a1422-6517-4308-b09b-5a880db7dd72\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.092771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle\") pod \"9d8a1422-6517-4308-b09b-5a880db7dd72\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.092915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data\") pod \"9d8a1422-6517-4308-b09b-5a880db7dd72\" (UID: \"9d8a1422-6517-4308-b09b-5a880db7dd72\") " Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.095938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg" (OuterVolumeSpecName: "kube-api-access-bmfwg") pod "9d8a1422-6517-4308-b09b-5a880db7dd72" (UID: "9d8a1422-6517-4308-b09b-5a880db7dd72"). InnerVolumeSpecName "kube-api-access-bmfwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.118568 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d8a1422-6517-4308-b09b-5a880db7dd72" (UID: "9d8a1422-6517-4308-b09b-5a880db7dd72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.142788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data" (OuterVolumeSpecName: "config-data") pod "9d8a1422-6517-4308-b09b-5a880db7dd72" (UID: "9d8a1422-6517-4308-b09b-5a880db7dd72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.195326 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.195362 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmfwg\" (UniqueName: \"kubernetes.io/projected/9d8a1422-6517-4308-b09b-5a880db7dd72-kube-api-access-bmfwg\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.195377 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8a1422-6517-4308-b09b-5a880db7dd72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.981149 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerStarted","Data":"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8"} Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.981205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerStarted","Data":"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533"} Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.986004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"9d8a1422-6517-4308-b09b-5a880db7dd72","Type":"ContainerDied","Data":"9eaa05cc426da6eb6fb99fa35ffeea73995b0082dfbe811af93c900b1e841868"} Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.986060 5030 scope.go:117] "RemoveContainer" containerID="bed8b587581e0579d490db5878f3a422c9ef7b76560426ab394ebc02b200e802" Jan 20 23:56:48 crc kubenswrapper[5030]: I0120 23:56:48.986073 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.020303 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=3.020277117 podStartE2EDuration="3.020277117s" podCreationTimestamp="2026-01-20 23:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:49.003782697 +0000 UTC m=+4881.324043025" watchObservedRunningTime="2026-01-20 23:56:49.020277117 +0000 UTC m=+4881.340537405" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.060060 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.072025 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.083692 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:49 crc kubenswrapper[5030]: E0120 23:56:49.084308 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8a1422-6517-4308-b09b-5a880db7dd72" containerName="nova-scheduler-scheduler" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.084353 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8a1422-6517-4308-b09b-5a880db7dd72" containerName="nova-scheduler-scheduler" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.084648 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8a1422-6517-4308-b09b-5a880db7dd72" containerName="nova-scheduler-scheduler" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.085530 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.087567 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-scheduler-config-data" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.088309 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.212778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.212923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vts\" (UniqueName: \"kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.213027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.319584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.320141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.320575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vts\" (UniqueName: \"kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.334639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.334841 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.367872 
5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vts\" (UniqueName: \"kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts\") pod \"nova-scheduler-0\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.403665 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.844487 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.933401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle\") pod \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.933783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs\") pod \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.934105 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data\") pod \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.934267 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs\") pod \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.934386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mcm7\" (UniqueName: \"kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7\") pod \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\" (UID: \"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96\") " Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.934319 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs" (OuterVolumeSpecName: "logs") pod "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" (UID: "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.935192 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.940072 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7" (OuterVolumeSpecName: "kube-api-access-5mcm7") pod "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" (UID: "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96"). InnerVolumeSpecName "kube-api-access-5mcm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.964369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" (UID: "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.973515 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data" (OuterVolumeSpecName: "config-data") pod "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" (UID: "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.974978 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8a1422-6517-4308-b09b-5a880db7dd72" path="/var/lib/kubelet/pods/9d8a1422-6517-4308-b09b-5a880db7dd72/volumes" Jan 20 23:56:49 crc kubenswrapper[5030]: I0120 23:56:49.992364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" (UID: "f730b28f-1b7e-418a-8c4d-4fb49a9b0c96"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.000159 5030 generic.go:334] "Generic (PLEG): container finished" podID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerID="1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734" exitCode=0 Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.000265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerDied","Data":"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734"} Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.000319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"f730b28f-1b7e-418a-8c4d-4fb49a9b0c96","Type":"ContainerDied","Data":"227ee7e0208a069c51a2b830e7ae0d95caf9df543cf0498697a32a63b53fb71e"} Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.000349 5030 scope.go:117] "RemoveContainer" containerID="1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.000787 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.013944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.025752 5030 scope.go:117] "RemoveContainer" containerID="d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.037842 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.038082 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.038177 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mcm7\" (UniqueName: \"kubernetes.io/projected/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-kube-api-access-5mcm7\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.038240 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.065602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.086168 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.087434 5030 scope.go:117] "RemoveContainer" containerID="1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734" Jan 20 23:56:50 crc kubenswrapper[5030]: E0120 23:56:50.087886 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734\": container with ID starting with 1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734 not found: ID does not exist" containerID="1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.087923 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734"} err="failed to get container status \"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734\": rpc error: code = NotFound desc = could not find container \"1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734\": container with ID starting with 1b7602111efadf863a6c9d2c4986983df1b114e90cf5617d5a80180113b63734 not found: ID does not exist" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.087948 5030 scope.go:117] "RemoveContainer" containerID="d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2" Jan 20 23:56:50 crc kubenswrapper[5030]: E0120 23:56:50.088250 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2\": container with ID starting with 
d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2 not found: ID does not exist" containerID="d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.088284 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2"} err="failed to get container status \"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2\": rpc error: code = NotFound desc = could not find container \"d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2\": container with ID starting with d24def3bcfe92f2ee0e820719ee6a03082d4459b1a06c611e4cc7a9b912113c2 not found: ID does not exist" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.100746 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:50 crc kubenswrapper[5030]: E0120 23:56:50.101196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-metadata" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.101207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-metadata" Jan 20 23:56:50 crc kubenswrapper[5030]: E0120 23:56:50.101230 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-log" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.101238 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-log" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.101465 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-metadata" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.101483 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" containerName="nova-metadata-log" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.102930 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.106929 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-config-data" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.106968 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-metadata-internal-svc" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.108757 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.241536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.241587 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.241608 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79n8c\" (UniqueName: \"kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.241655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.241681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.343929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.343965 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79n8c\" (UniqueName: \"kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.343999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.344040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.344168 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.344557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.348774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.349138 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.349790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.362578 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79n8c\" (UniqueName: \"kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c\") pod \"nova-metadata-0\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.419394 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:50 crc kubenswrapper[5030]: W0120 23:56:50.933862 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e3e45b_cb93_4ba4_9597_0e329f3d34df.slice/crio-75237cdcdbee17e1679e3839f90a9c44762d8f40102fa69be2dc860bd81e97a7 WatchSource:0}: Error finding container 75237cdcdbee17e1679e3839f90a9c44762d8f40102fa69be2dc860bd81e97a7: Status 404 returned error can't find the container with id 75237cdcdbee17e1679e3839f90a9c44762d8f40102fa69be2dc860bd81e97a7 Jan 20 23:56:50 crc kubenswrapper[5030]: I0120 23:56:50.948802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:56:51 crc kubenswrapper[5030]: I0120 23:56:51.018906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae","Type":"ContainerStarted","Data":"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7"} Jan 20 23:56:51 crc kubenswrapper[5030]: I0120 23:56:51.018971 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae","Type":"ContainerStarted","Data":"7197c24f7f5f10cae29f0354ba85dac7e7241c44be681ac19130045f42fd8c8c"} Jan 20 23:56:51 crc kubenswrapper[5030]: I0120 23:56:51.020205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerStarted","Data":"75237cdcdbee17e1679e3839f90a9c44762d8f40102fa69be2dc860bd81e97a7"} Jan 20 23:56:51 crc kubenswrapper[5030]: I0120 23:56:51.051318 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-scheduler-0" podStartSLOduration=2.051299108 podStartE2EDuration="2.051299108s" podCreationTimestamp="2026-01-20 23:56:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:51.046292666 +0000 UTC m=+4883.366552964" watchObservedRunningTime="2026-01-20 23:56:51.051299108 +0000 UTC m=+4883.371559406" Jan 20 23:56:51 crc kubenswrapper[5030]: I0120 23:56:51.986009 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f730b28f-1b7e-418a-8c4d-4fb49a9b0c96" path="/var/lib/kubelet/pods/f730b28f-1b7e-418a-8c4d-4fb49a9b0c96/volumes" Jan 20 23:56:52 crc kubenswrapper[5030]: I0120 23:56:52.038546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerStarted","Data":"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4"} Jan 20 23:56:52 crc kubenswrapper[5030]: I0120 23:56:52.038617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerStarted","Data":"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e"} Jan 20 23:56:54 crc kubenswrapper[5030]: I0120 23:56:54.404025 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:55 crc kubenswrapper[5030]: I0120 23:56:55.419697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:55 crc 
kubenswrapper[5030]: I0120 23:56:55.420003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:56:57 crc kubenswrapper[5030]: I0120 23:56:57.369732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:57 crc kubenswrapper[5030]: I0120 23:56:57.370101 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:56:58 crc kubenswrapper[5030]: I0120 23:56:58.451844 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:58 crc kubenswrapper[5030]: I0120 23:56:58.452133 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.145:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 23:56:59 crc kubenswrapper[5030]: I0120 23:56:59.404669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:59 crc kubenswrapper[5030]: I0120 23:56:59.436882 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:56:59 crc kubenswrapper[5030]: I0120 23:56:59.457340 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-metadata-0" podStartSLOduration=9.457320545 podStartE2EDuration="9.457320545s" podCreationTimestamp="2026-01-20 23:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:56:52.066941795 +0000 UTC m=+4884.387202123" watchObservedRunningTime="2026-01-20 23:56:59.457320545 +0000 UTC m=+4891.777580843" Jan 20 23:57:00 crc kubenswrapper[5030]: I0120 23:57:00.158295 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:57:00 crc kubenswrapper[5030]: I0120 23:57:00.419936 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:00 crc kubenswrapper[5030]: I0120 23:57:00.420033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:01 crc kubenswrapper[5030]: I0120 23:57:01.434810 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:57:01 crc kubenswrapper[5030]: I0120 23:57:01.434810 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:57:07 crc 
kubenswrapper[5030]: I0120 23:57:07.020281 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:07 crc kubenswrapper[5030]: I0120 23:57:07.373714 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:07 crc kubenswrapper[5030]: I0120 23:57:07.374217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:07 crc kubenswrapper[5030]: I0120 23:57:07.374271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:07 crc kubenswrapper[5030]: I0120 23:57:07.390217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:08 crc kubenswrapper[5030]: I0120 23:57:08.236279 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:08 crc kubenswrapper[5030]: I0120 23:57:08.241574 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.425609 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.453638 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.454415 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.732166 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.732503 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-central-agent" containerID="cri-o://d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b" gracePeriod=30 Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.732670 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="proxy-httpd" containerID="cri-o://86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071" gracePeriod=30 Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.732725 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="sg-core" containerID="cri-o://b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb" gracePeriod=30 Jan 20 23:57:10 crc kubenswrapper[5030]: I0120 23:57:10.732834 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-notification-agent" containerID="cri-o://b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0" gracePeriod=30 Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260636 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3eab81e-57e8-4e42-b071-fac6dba52982" 
containerID="86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071" exitCode=0 Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260886 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerID="b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb" exitCode=2 Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerDied","Data":"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071"} Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260946 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerDied","Data":"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb"} Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260957 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerDied","Data":"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b"} Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.260895 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerID="d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b" exitCode=0 Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.268087 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.594445 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.594680 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-log" containerID="cri-o://f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533" gracePeriod=30 Jan 20 23:57:11 crc kubenswrapper[5030]: I0120 23:57:11.594816 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-api" containerID="cri-o://4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8" gracePeriod=30 Jan 20 23:57:12 crc kubenswrapper[5030]: I0120 23:57:12.273769 5030 generic.go:334] "Generic (PLEG): container finished" podID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerID="f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533" exitCode=143 Jan 20 23:57:12 crc kubenswrapper[5030]: I0120 23:57:12.273809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerDied","Data":"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533"} Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.030677 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.171307 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177563 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4hjr\" (UniqueName: \"kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.177650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd\") pod \"d3eab81e-57e8-4e42-b071-fac6dba52982\" (UID: \"d3eab81e-57e8-4e42-b071-fac6dba52982\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.178125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.178234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.183154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr" (OuterVolumeSpecName: "kube-api-access-t4hjr") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "kube-api-access-t4hjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.183214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts" (OuterVolumeSpecName: "scripts") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.224468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.253785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.278814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4nxw\" (UniqueName: \"kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw\") pod \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.278905 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs\") pod \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.278983 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle\") pod \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data\") pod \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\" (UID: \"6519c3f5-4768-4b48-81e9-c7e7713c9b04\") " Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279443 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279454 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3eab81e-57e8-4e42-b071-fac6dba52982-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279464 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279472 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279481 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.279490 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4hjr\" (UniqueName: \"kubernetes.io/projected/d3eab81e-57e8-4e42-b071-fac6dba52982-kube-api-access-t4hjr\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.280020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs" (OuterVolumeSpecName: "logs") pod "6519c3f5-4768-4b48-81e9-c7e7713c9b04" (UID: "6519c3f5-4768-4b48-81e9-c7e7713c9b04"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.308590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data" (OuterVolumeSpecName: "config-data") pod "6519c3f5-4768-4b48-81e9-c7e7713c9b04" (UID: "6519c3f5-4768-4b48-81e9-c7e7713c9b04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.332126 5030 generic.go:334] "Generic (PLEG): container finished" podID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerID="b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0" exitCode=0 Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.332188 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerDied","Data":"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0"} Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.332220 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"d3eab81e-57e8-4e42-b071-fac6dba52982","Type":"ContainerDied","Data":"b2f2622b0737c27e23f33156e17e809d60de83ae5a1421217e02730beb31900b"} Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.332237 5030 scope.go:117] "RemoveContainer" containerID="86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.332366 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.342351 5030 generic.go:334] "Generic (PLEG): container finished" podID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerID="4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8" exitCode=0 Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.342496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerDied","Data":"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8"} Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.342598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"6519c3f5-4768-4b48-81e9-c7e7713c9b04","Type":"ContainerDied","Data":"758e41e2286f01353958b25602b5816b0ac1c3717a0b5ee4056e2e568d208fbf"} Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.342741 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.364348 5030 scope.go:117] "RemoveContainer" containerID="b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.382880 5030 scope.go:117] "RemoveContainer" containerID="b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.383799 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6519c3f5-4768-4b48-81e9-c7e7713c9b04-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.383833 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.414206 5030 scope.go:117] "RemoveContainer" containerID="d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.440331 5030 scope.go:117] "RemoveContainer" containerID="86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.441004 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071\": container with ID starting with 86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071 not found: ID does not exist" containerID="86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.441041 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071"} err="failed to get container status \"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071\": rpc error: code = NotFound desc = could not find container \"86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071\": container with ID starting with 86839dbb41f521f26da4c4c111ba8689cc5681dbf846bca127e616377838e071 not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.441067 5030 scope.go:117] "RemoveContainer" containerID="b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.441526 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb\": container with ID starting with b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb not found: ID does not exist" containerID="b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.441585 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb"} err="failed to get container status \"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb\": rpc error: code = NotFound desc = could not find container \"b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb\": container with ID starting with b2c6427b448e49488f7bf4402cfd45b087a0029e374c4de0cb9963626b3229bb 
not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.441694 5030 scope.go:117] "RemoveContainer" containerID="b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.442255 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0\": container with ID starting with b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0 not found: ID does not exist" containerID="b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.442287 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0"} err="failed to get container status \"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0\": rpc error: code = NotFound desc = could not find container \"b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0\": container with ID starting with b78af2b3322b7e58e728d2b965eb2d06c1ae306bc81334ef6253c1eb662028a0 not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.442309 5030 scope.go:117] "RemoveContainer" containerID="d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.442885 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b\": container with ID starting with d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b not found: ID does not exist" containerID="d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.442929 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b"} err="failed to get container status \"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b\": rpc error: code = NotFound desc = could not find container \"d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b\": container with ID starting with d29b58f5f0487b420ff7ff80d9b9065282199b027b8298744d83e05b4d435c0b not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.442955 5030 scope.go:117] "RemoveContainer" containerID="4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.464398 5030 scope.go:117] "RemoveContainer" containerID="f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.487190 5030 scope.go:117] "RemoveContainer" containerID="4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.487805 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8\": container with ID starting with 4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8 not found: ID does not exist" containerID="4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 
23:57:15.487844 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8"} err="failed to get container status \"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8\": rpc error: code = NotFound desc = could not find container \"4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8\": container with ID starting with 4340651e0f704e768231d9ca042858c1891e7c28dfcadbe7e80ffc8747176fd8 not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.487869 5030 scope.go:117] "RemoveContainer" containerID="f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533" Jan 20 23:57:15 crc kubenswrapper[5030]: E0120 23:57:15.488216 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533\": container with ID starting with f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533 not found: ID does not exist" containerID="f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.488244 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533"} err="failed to get container status \"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533\": rpc error: code = NotFound desc = could not find container \"f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533\": container with ID starting with f5730b6d895805c69cbc7f577c2c99904365f9d232826e7fa9d2034a2836c533 not found: ID does not exist" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.748959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw" (OuterVolumeSpecName: "kube-api-access-d4nxw") pod "6519c3f5-4768-4b48-81e9-c7e7713c9b04" (UID: "6519c3f5-4768-4b48-81e9-c7e7713c9b04"). InnerVolumeSpecName "kube-api-access-d4nxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.788689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.791023 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.791069 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4nxw\" (UniqueName: \"kubernetes.io/projected/6519c3f5-4768-4b48-81e9-c7e7713c9b04-kube-api-access-d4nxw\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.895584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6519c3f5-4768-4b48-81e9-c7e7713c9b04" (UID: "6519c3f5-4768-4b48-81e9-c7e7713c9b04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:15 crc kubenswrapper[5030]: I0120 23:57:15.994690 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6519c3f5-4768-4b48-81e9-c7e7713c9b04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.026314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data" (OuterVolumeSpecName: "config-data") pod "d3eab81e-57e8-4e42-b071-fac6dba52982" (UID: "d3eab81e-57e8-4e42-b071-fac6dba52982"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.098245 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eab81e-57e8-4e42-b071-fac6dba52982-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.268161 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.271792 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288373 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="proxy-httpd" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288749 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="proxy-httpd" Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288764 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-notification-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288771 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-notification-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288788 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="sg-core" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 
23:57:16.288794 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="sg-core" Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288814 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-log" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288820 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-log" Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288831 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-central-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-central-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: E0120 23:57:16.288847 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-api" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.288853 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-api" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289029 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-api" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289043 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="proxy-httpd" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289060 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-notification-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289071 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" containerName="nova-api-log" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289080 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="ceilometer-central-agent" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.289090 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" containerName="sg-core" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.290564 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.294674 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-config-data" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.294876 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-ceilometer-internal-svc" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.297233 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"ceilometer-scripts" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.303561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ccqd\" (UniqueName: \"kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403296 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403370 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.403393 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504527 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ccqd\" (UniqueName: \"kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.504743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.505502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") 
" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.505591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.508876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.509346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.509691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.510067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.512759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.521482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ccqd\" (UniqueName: \"kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd\") pod \"ceilometer-0\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:16 crc kubenswrapper[5030]: I0120 23:57:16.620204 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:17 crc kubenswrapper[5030]: I0120 23:57:17.076809 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:57:17 crc kubenswrapper[5030]: I0120 23:57:17.363561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerStarted","Data":"1827ef65f6b7145ccfc452dc1cdf99466a4d8ad8ffa3665d791ad890a26d59b1"} Jan 20 23:57:17 crc kubenswrapper[5030]: I0120 23:57:17.977806 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eab81e-57e8-4e42-b071-fac6dba52982" path="/var/lib/kubelet/pods/d3eab81e-57e8-4e42-b071-fac6dba52982/volumes" Jan 20 23:57:18 crc kubenswrapper[5030]: I0120 23:57:18.371368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerStarted","Data":"85c6e0e15e78a3153d0dd557c849ff4f4ff72da720eb6c506baa0d7377395508"} Jan 20 23:57:19 crc kubenswrapper[5030]: I0120 23:57:19.400922 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerStarted","Data":"dcc49b3097e7c3881a9e6d16ca2206768b7f3632eb827aa614fc3cf5b9335766"} Jan 20 23:57:20 crc kubenswrapper[5030]: I0120 23:57:20.413838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerStarted","Data":"3aab7494f35c8d53bed3be64a7d47121e15d43ba605ba4d9c3f83abc05cd8653"} Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.345913 5030 scope.go:117] "RemoveContainer" containerID="cce18799a99ac1715b565d5854b0dcbd08f2d61430c9731225dea07b9b533874" Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.386563 5030 scope.go:117] "RemoveContainer" containerID="cf4277b9970dab45af9d50c12645ba97abd7f0b0cad2e3a7288e663c8ecb365f" Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.406435 5030 scope.go:117] "RemoveContainer" containerID="a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b" Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.425293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerStarted","Data":"fd95c8d95d19d605865a5d57c895aa4658929f6fbbc1cd7b2f7dcb7695715783"} Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.426555 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.429447 5030 scope.go:117] "RemoveContainer" containerID="802ab5c1c2e050a8a391ce1421faacfc7008363dd9ae9278c32e3b4a5668b5f9" Jan 20 23:57:21 crc kubenswrapper[5030]: E0120 23:57:21.429682 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b\": container with ID starting with a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b not found: ID does not exist" containerID="a441fbf79feb6f0388d034551078971a6c769c048b93306c0f069c878d28557b" Jan 20 23:57:21 crc kubenswrapper[5030]: I0120 23:57:21.453994 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-kuttl-tests/ceilometer-0" podStartSLOduration=1.525366462 podStartE2EDuration="5.453969504s" podCreationTimestamp="2026-01-20 23:57:16 +0000 UTC" firstStartedPulling="2026-01-20 23:57:17.081902194 +0000 UTC m=+4909.402162502" lastFinishedPulling="2026-01-20 23:57:21.010505256 +0000 UTC m=+4913.330765544" observedRunningTime="2026-01-20 23:57:21.445491569 +0000 UTC m=+4913.765751857" watchObservedRunningTime="2026-01-20 23:57:21.453969504 +0000 UTC m=+4913.774229792" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.071835 5030 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6519c3f5-4768-4b48-81e9-c7e7713c9b04"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6519c3f5-4768-4b48-81e9-c7e7713c9b04] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6519c3f5_4768_4b48_81e9_c7e7713c9b04.slice" Jan 20 23:57:46 crc kubenswrapper[5030]: E0120 23:57:46.072704 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6519c3f5-4768-4b48-81e9-c7e7713c9b04] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6519c3f5-4768-4b48-81e9-c7e7713c9b04] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6519c3f5_4768_4b48_81e9_c7e7713c9b04.slice" pod="openstack-kuttl-tests/nova-api-0" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.628185 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.732467 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.798441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.813898 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.823264 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.824702 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.827203 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-internal-svc" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.827416 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"cert-nova-public-svc" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.827473 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-config-data" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.834802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.950565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.950721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksb6\" (UniqueName: \"kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.950777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.950809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.951227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:46 crc kubenswrapper[5030]: I0120 23:57:46.951270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055460 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055630 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cksb6\" (UniqueName: \"kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.055702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.056183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.069004 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.069016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.069139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.069450 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.075280 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksb6\" (UniqueName: \"kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6\") pod \"nova-api-0\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.151023 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.629248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.742880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerStarted","Data":"b49fcb5e7e6960cbdf8e2b132518ee79619728e60c50fcd34aaa5a627d9c556d"} Jan 20 23:57:47 crc kubenswrapper[5030]: I0120 23:57:47.974357 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6519c3f5-4768-4b48-81e9-c7e7713c9b04" path="/var/lib/kubelet/pods/6519c3f5-4768-4b48-81e9-c7e7713c9b04/volumes" Jan 20 23:57:48 crc kubenswrapper[5030]: I0120 23:57:48.756138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerStarted","Data":"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc"} Jan 20 23:57:48 crc kubenswrapper[5030]: I0120 23:57:48.756427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerStarted","Data":"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23"} Jan 20 23:57:48 crc kubenswrapper[5030]: I0120 23:57:48.783543 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-api-0" podStartSLOduration=2.783525971 podStartE2EDuration="2.783525971s" podCreationTimestamp="2026-01-20 23:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:57:48.777082184 +0000 UTC m=+4941.097342472" watchObservedRunningTime="2026-01-20 23:57:48.783525971 +0000 UTC m=+4941.103786259" Jan 20 23:57:57 crc kubenswrapper[5030]: I0120 23:57:57.152254 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:57 crc kubenswrapper[5030]: I0120 23:57:57.153127 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:57:58 crc kubenswrapper[5030]: I0120 23:57:58.167878 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.149:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:57:58 crc kubenswrapper[5030]: I0120 23:57:58.167897 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.149:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.162148 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.163404 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.164404 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.175337 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.988247 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:07 crc kubenswrapper[5030]: I0120 23:58:07.999344 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.609146 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.651892 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.652982 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.700236 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.705861 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-cell1-mariadb-root-db-secret" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.825672 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqknk\" (UniqueName: \"kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.825776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:11 crc kubenswrapper[5030]: E0120 23:58:11.825918 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:58:11 crc kubenswrapper[5030]: E0120 23:58:11.825965 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data podName:4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf nodeName:}" failed. No retries permitted until 2026-01-20 23:58:12.325948958 +0000 UTC m=+4964.646209246 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data") pod "rabbitmq-server-0" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf") : configmap "rabbitmq-config-data" not found Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.858691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.858921 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstackclient" podUID="525b8de8-34a8-4bd1-b426-ed17becbab12" containerName="openstackclient" containerID="cri-o://8b93f1e880b42dec39213a1363781794f9e5e185249a6d3eb089db7e9620bfcd" gracePeriod=2 Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.866425 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.868042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.915897 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstackclient"] Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.938799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.939182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqknk\" (UniqueName: \"kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:11 crc kubenswrapper[5030]: I0120 23:58:11.940302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.042675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tjp4j"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.042779 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqknk\" (UniqueName: \"kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk\") pod \"root-account-create-update-zlxnk\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.064820 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6"] Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.065634 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525b8de8-34a8-4bd1-b426-ed17becbab12" containerName="openstackclient" 
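[Editor's aside] The entries in this stretch of the log follow two recurring patterns: "SyncLoop (probe)" transitions (startup/readiness going from unhealthy to started/ready for nova-api-0) and MountVolume.SetUp failures that are retried with a durationBeforeRetry backoff because a secret, configmap, or serviceaccount does not exist yet. As an illustration only, and not part of the captured system, the short Python sketch below re-chunks a wrapped journal dump like this one into individual kubelet entries and tallies those two patterns; the regexes follow the message formats visible in this section, and every name in the script is invented for the example.

#!/usr/bin/env python3
# Hypothetical log-reading sketch (not from the captured system): re-chunks a
# wrapped kubelet journal dump into individual entries, then summarizes probe
# transitions and MountVolume.SetUp failures seen in sections like this one.
import re
import sys
from collections import Counter, defaultdict

# Each journal entry starts with a "<Mon> <day> <HH:MM:SS> crc " prefix; the
# dump wraps several entries per physical line, so split on that prefix.
ENTRY_SPLIT_RE = re.compile(r'(?=\w{3} \d{2} \d{2}:\d{2}:\d{2} crc )')
PROBE_RE = re.compile(
    r'"SyncLoop \(probe\)" probe="(?P<probe>\w+)" status="(?P<status>[^"]*)"'
    r' pod="(?P<pod>[^"]+)"')
MOUNT_FAIL_RE = re.compile(
    r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)"'
    r'.*?pod "(?P<pod>[^"]+)" \(UID: "[^"]+"\) : (?P<reason>.+)$')

def entries(text):
    for chunk in ENTRY_SPLIT_RE.split(text):
        chunk = " ".join(chunk.split())   # undo mid-entry line wrapping
        if chunk:
            yield chunk

def summarize(text):
    probes = defaultdict(list)   # pod -> [(probe, status), ...] in log order
    failures = Counter()         # (pod, volume, reason) -> occurrence count
    for entry in entries(text):
        m = PROBE_RE.search(entry)
        if m:
            probes[m.group("pod")].append((m.group("probe"), m.group("status")))
        m = MOUNT_FAIL_RE.search(entry)
        if m:
            failures[(m.group("pod"), m.group("volume"), m.group("reason"))] += 1
    return probes, failures

if __name__ == "__main__":
    probes, failures = summarize(sys.stdin.read())
    for pod, events in sorted(probes.items()):
        print(pod + ": " + ", ".join(f"{p}={s or '<registered>'}" for p, s in events))
    for (pod, volume, reason), n in failures.most_common():
        print(f"{n:2d}x {pod} volume={volume}: {reason}")

Run against a saved copy of this journal section (for example: python3 summarize_kubelet.py < kubelet-section.log, both file names being placeholders) it would print one line per pod with its probe history, plus a count of each distinct mount failure and its reason, e.g. the repeated 'configmap "rabbitmq-config-data" not found' retries visible above.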
Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.065650 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="525b8de8-34a8-4bd1-b426-ed17becbab12" containerName="openstackclient" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.065900 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="525b8de8-34a8-4bd1-b426-ed17becbab12" containerName="openstackclient" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.067187 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.070599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.070691 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.070761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.070792 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9ss\" (UniqueName: \"kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.070841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.098685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-tjp4j"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.132714 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.133976 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.172869 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.172926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9ss\" (UniqueName: \"kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.172972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.173018 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.173073 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.175579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.179708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.188722 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.194308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 
23:58:12.196407 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-db-secret" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.198292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.210389 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.227276 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.246221 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.246449 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="cinder-scheduler" containerID="cri-o://2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.246585 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-scheduler-0" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="probe" containerID="cri-o://7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.277950 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278109 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " 
pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278149 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5ls\" (UniqueName: \"kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.278192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4pd\" (UniqueName: \"kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.282821 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.314682 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.333297 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-ngxq8"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.357517 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.358002 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api-log" containerID="cri-o://2bbc2f8fdbf5ffee6d3a44f5227cab09b42416f5ef6e76d016d8424b2478caa8" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.358422 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/cinder-api-0" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api" containerID="cri-o://cf937266b65a833ea2519d5e345ec986e9d92e36c9c09cd00ad3dbe11dbc94df" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385807 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc5ls\" (UniqueName: \"kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4pd\" (UniqueName: \"kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.385993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.386014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.386133 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.386173 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle podName:901be408-dfd2-4cfa-9120-041671bd94ed nodeName:}" failed. No retries permitted until 2026-01-20 23:58:12.886159468 +0000 UTC m=+4965.206419756 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle") pod "barbican-keystone-listener-7f995c9cbb-dqqv6" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed") : secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.386518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.386811 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.386859 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data podName:4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.386845055 +0000 UTC m=+4965.707105343 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data") pod "rabbitmq-server-0" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf") : configmap "rabbitmq-config-data" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.387059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.426823 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.427093 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="ovn-northd" containerID="cri-o://014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.427477 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovn-northd-0" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="openstack-network-exporter" containerID="cri-o://39b13d4627425616c7d43237285304e39a1350f2af6bf2588bc9f6f2e89e8a28" gracePeriod=30 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.512001 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.512762 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="openstack-network-exporter" containerID="cri-o://3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c" gracePeriod=300 Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.552170 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-api-54769bd5cb-djssm"] Jan 20 
23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.556073 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.567605 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-84pkt"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.583003 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-84pkt"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.599484 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.601572 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.607774 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"placement-db-secret" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.614923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-api-54769bd5cb-djssm"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.628567 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.638062 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.639362 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.643954 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-api-db-secret" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.658008 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.671703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.686440 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-5twgc"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691191 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdkq\" (UniqueName: \"kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5v4\" (UniqueName: \"kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.691941 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.692097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.692175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.692389 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.692466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.721266 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.736678 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.746036 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9ss\" (UniqueName: \"kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss\") pod \"barbican-worker-5d4668d597-b98m7\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.754827 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-mwwbg"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.756495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.761120 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.770773 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-7dea-account-create-update-mhshk"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.783917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4pd\" (UniqueName: \"kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.783984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5ls\" (UniqueName: \"kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls\") pod \"neutron-9d9a-account-create-update-69r22\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.799045 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.807058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.807303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.807428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.807836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.810046 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.799232 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.809820 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.810311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.810552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.810645 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.810806 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.310791299 +0000 UTC m=+4965.631051587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "barbican-config-data" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.809802 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.811162 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.311143118 +0000 UTC m=+4965.631403406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-internal-svc" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.811763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.811825 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.811858 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.311851075 +0000 UTC m=+4965.632111353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.811895 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.811914 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.311906996 +0000 UTC m=+4965.632167284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-public-svc" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.819228 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.810984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djdkq\" (UniqueName: \"kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.819408 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.819459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5v4\" (UniqueName: \"kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.845712 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zt8z9"] Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.846289 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dc8dg for pod openstack-kuttl-tests/barbican-api-54769bd5cb-djssm: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.846357 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.346341252 +0000 UTC m=+4965.666601540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dc8dg" (UniqueName: "kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.849079 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.868329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdkq\" (UniqueName: \"kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq\") pod \"placement-5889-account-create-update-7spqm\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.870856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-sync-zt8z9"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.907248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns5v4\" (UniqueName: \"kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4\") pod \"nova-api-5ec4-account-create-update-nqqrm\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.925513 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.925784 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: E0120 23:58:12.925833 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle podName:901be408-dfd2-4cfa-9120-041671bd94ed nodeName:}" failed. No retries permitted until 2026-01-20 23:58:13.92581993 +0000 UTC m=+4966.246080208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle") pod "barbican-keystone-listener-7f995c9cbb-dqqv6" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed") : secret "combined-ca-bundle" not found Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.926094 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.927280 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.942179 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.949534 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"barbican-db-secret" Jan 20 23:58:12 crc kubenswrapper[5030]: I0120 23:58:12.960079 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.077346 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.082784 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.117603 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-sb-0" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="ovsdbserver-sb" containerID="cri-o://2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5" gracePeriod=300 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.148035 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-0c78-account-create-update-sngpq"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.164538 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.164638 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcz9s\" (UniqueName: \"kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.253171 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.256903 5030 generic.go:334] "Generic (PLEG): container finished" podID="44f0a514-9f01-4768-aeb4-93de0354b738" containerID="39b13d4627425616c7d43237285304e39a1350f2af6bf2588bc9f6f2e89e8a28" exitCode=2 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.256980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerDied","Data":"39b13d4627425616c7d43237285304e39a1350f2af6bf2588bc9f6f2e89e8a28"} Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.266666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcz9s\" (UniqueName: \"kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 
crc kubenswrapper[5030]: I0120 23:58:13.266877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.267461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.317135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcz9s\" (UniqueName: \"kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s\") pod \"barbican-4d16-account-create-update-g7nxj\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.323953 5030 generic.go:334] "Generic (PLEG): container finished" podID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerID="2bbc2f8fdbf5ffee6d3a44f5227cab09b42416f5ef6e76d016d8424b2478caa8" exitCode=143 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.324029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerDied","Data":"2bbc2f8fdbf5ffee6d3a44f5227cab09b42416f5ef6e76d016d8424b2478caa8"} Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.327322 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.327926 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="openstack-network-exporter" containerID="cri-o://05145492f4392fcd8c755b7cc30f2f103a6adfaf13ba303dcd29f71b950cc8bb" gracePeriod=300 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.342091 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-z8bpm"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.342697 5030 generic.go:334] "Generic (PLEG): container finished" podID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerID="3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c" exitCode=2 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.342735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerDied","Data":"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c"} Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.365974 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.371354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.371551 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.371635 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.371605684 +0000 UTC m=+4966.691865972 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "combined-ca-bundle" not found Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.373214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.373319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.373380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.373450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.373669 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.373728 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.373709884 +0000 UTC m=+4966.693970232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "barbican-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.373787 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.373817 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.373808207 +0000 UTC m=+4966.694068495 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-public-svc" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.374159 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.374192 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.374182466 +0000 UTC m=+4966.694442754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-internal-svc" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.379714 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dc8dg for pod openstack-kuttl-tests/barbican-api-54769bd5cb-djssm: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.379775 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.379758062 +0000 UTC m=+4966.700018350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dc8dg" (UniqueName: "kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.380142 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.389073 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-sync-z8bpm"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.400031 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8kqng"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.436373 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.459213 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-ae66-account-create-update-4tgbf"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.465034 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="ovsdbserver-nb" containerID="cri-o://b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" gracePeriod=300 Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.486420 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.486478 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data podName:4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf nodeName:}" failed. No retries permitted until 2026-01-20 23:58:15.48646435 +0000 UTC m=+4967.806724638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data") pod "rabbitmq-server-0" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf") : configmap "rabbitmq-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.492755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4jp4k"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.518137 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-sync-8kqng"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.529358 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-cell-mapping-g2f6x"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.560728 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-sync-4jp4k"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.573711 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.593062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-trpng"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.605717 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-cell-mapping-trpng"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.634787 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.635665 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" 
podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-server" containerID="cri-o://4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636052 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="swift-recon-cron" containerID="cri-o://525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636106 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="rsync" containerID="cri-o://92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636164 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-replicator" containerID="cri-o://8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636179 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-auditor" containerID="cri-o://f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636259 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-server" containerID="cri-o://422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636301 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-updater" containerID="cri-o://57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-auditor" containerID="cri-o://691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636360 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-replicator" containerID="cri-o://e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636427 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-expirer" containerID="cri-o://cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636490 5030 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-updater" containerID="cri-o://a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636558 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-auditor" containerID="cri-o://50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.637943 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-replicator" containerID="cri-o://3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.638176 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-server" containerID="cri-o://4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.636855 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-storage-0" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-reaper" containerID="cri-o://5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.682704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9fhpl"] Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.694158 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.694406 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data podName:19f6003c-3d52-44e9-a5a8-d525e4b6a79e nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.194394574 +0000 UTC m=+4966.514654862 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data") pod "rabbitmq-cell1-server-0" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.701169 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-sync-9fhpl"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.730037 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-7vfh9"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.744450 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-ring-rebalance-7vfh9"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.761922 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.762176 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-log" containerID="cri-o://3cc6b0b1ef67cf3ed623ef331af4f0711ba50f68fe18f908775fb4147e800fbd" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.762248 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-internal-api-0" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-httpd" containerID="cri-o://a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.779960 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-db-create-m5sbm"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.790019 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.790306 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-api" containerID="cri-o://00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.791949 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-httpd" containerID="cri-o://614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.802772 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:13 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 
crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:58:13 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:58:13 crc kubenswrapper[5030]: else Jan 20 23:58:13 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:13 crc kubenswrapper[5030]: fi Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:13 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:13 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:13 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:13 crc kubenswrapper[5030]: # support updates Jan 20 23:58:13 crc kubenswrapper[5030]: Jan 20 23:58:13 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:13 crc kubenswrapper[5030]: E0120 23:58:13.803890 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" podUID="b714d540-db02-4fc2-ae0b-140a4cdab084" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.810690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.810988 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-log" containerID="cri-o://8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.811433 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-httpd" containerID="cri-o://cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.819305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.819589 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-log" containerID="cri-o://0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.819763 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-api" containerID="cri-o://298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.828796 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-db-create-m5sbm"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.853716 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/glance-638c-account-create-update-n28wn"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.873543 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-638c-account-create-update-n28wn"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.900769 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.902481 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.915138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.949326 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.949600 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" containerID="cri-o://46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.950050 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" containerID="cri-o://f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4" gracePeriod=30 Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.983332 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="076d046e-5116-4b10-ace6-602cb79307fa" path="/var/lib/kubelet/pods/076d046e-5116-4b10-ace6-602cb79307fa/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.984074 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1797744b-0052-4c51-bc74-77bfcf4329c2" path="/var/lib/kubelet/pods/1797744b-0052-4c51-bc74-77bfcf4329c2/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.984778 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dae846-6ace-453b-a3eb-b237739433c6" path="/var/lib/kubelet/pods/32dae846-6ace-453b-a3eb-b237739433c6/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.986004 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331a63fb-2148-49d9-ab0a-7dc819c26bea" path="/var/lib/kubelet/pods/331a63fb-2148-49d9-ab0a-7dc819c26bea/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.986835 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4044eeda-1a3f-45ef-8f85-efdbfbffd4f6" path="/var/lib/kubelet/pods/4044eeda-1a3f-45ef-8f85-efdbfbffd4f6/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.987546 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="489812a0-ca0d-4259-a487-7c4840c0a7d5" path="/var/lib/kubelet/pods/489812a0-ca0d-4259-a487-7c4840c0a7d5/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.988189 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c71777b-833a-46c4-afac-95aff05fceb1" path="/var/lib/kubelet/pods/4c71777b-833a-46c4-afac-95aff05fceb1/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.989269 5030 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5d5df3a2-8ec2-4eed-93ba-f7ee82646dba" path="/var/lib/kubelet/pods/5d5df3a2-8ec2-4eed-93ba-f7ee82646dba/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.990077 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e43cb69-a286-454d-9d54-5ce3aad208d0" path="/var/lib/kubelet/pods/5e43cb69-a286-454d-9d54-5ce3aad208d0/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.990739 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65855d62-04e3-45a7-97a2-3a14806fe649" path="/var/lib/kubelet/pods/65855d62-04e3-45a7-97a2-3a14806fe649/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.991895 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691512e9-1781-4ce2-9d4d-c1787adb07c5" path="/var/lib/kubelet/pods/691512e9-1781-4ce2-9d4d-c1787adb07c5/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.992766 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f572652-5099-4653-9d03-efe8a789d462" path="/var/lib/kubelet/pods/9f572652-5099-4653-9d03-efe8a789d462/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.993909 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7238d6f-7a18-49b8-a128-4935e9d5fadc" path="/var/lib/kubelet/pods/a7238d6f-7a18-49b8-a128-4935e9d5fadc/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.994457 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c43aca93-d5a2-4ec9-bac6-6ac012e45271" path="/var/lib/kubelet/pods/c43aca93-d5a2-4ec9-bac6-6ac012e45271/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.995582 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0747945-eb36-4b4d-884c-37803fa92174" path="/var/lib/kubelet/pods/d0747945-eb36-4b4d-884c-37803fa92174/volumes" Jan 20 23:58:13 crc kubenswrapper[5030]: I0120 23:58:13.996599 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dddfd43a-d45b-4e9e-9b5f-b4fc0401a142" path="/var/lib/kubelet/pods/dddfd43a-d45b-4e9e-9b5f-b4fc0401a142/volumes" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.997108 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6c79c1-6daa-40b3-92dc-c0b7a77182ae" path="/var/lib/kubelet/pods/eb6c79c1-6daa-40b3-92dc-c0b7a77182ae/volumes" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.997715 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0" path="/var/lib/kubelet/pods/ff9ac2e8-d836-41b2-92e8-0fdb2e0215e0/volumes" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.998236 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.998271 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-54lmm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.998463 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-httpd" containerID="cri-o://e48bf78e90710aa76a16ac4f0324a165f77700561ddfe100c023788b47cc3e69" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:13.998808 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-server" containerID="cri-o://9ca4957d78a8a6a00d4df062b5abb4a6f62840672dc365ddc417d1f2516cfcb4" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.003161 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.003220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.004862 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.004939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq4pc\" (UniqueName: \"kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.006660 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.006716 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle podName:901be408-dfd2-4cfa-9120-041671bd94ed nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.006701621 +0000 UTC m=+4968.326961909 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle") pod "barbican-keystone-listener-7f995c9cbb-dqqv6" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed") : secret "combined-ca-bundle" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.039506 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-db-create-54lmm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.051574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.073162 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.073455 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" containerID="cri-o://80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.073563 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" containerID="cri-o://cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.086909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.124998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq4pc\" (UniqueName: \"kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.125180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.125222 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.134035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.134113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" 
(UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.158086 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq4pc\" (UniqueName: \"kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8v65m\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.217935 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.222705 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="rabbitmq" containerID="cri-o://816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138" gracePeriod=604800 Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.228946 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.229014 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data podName:19f6003c-3d52-44e9-a5a8-d525e4b6a79e nodeName:}" failed. No retries permitted until 2026-01-20 23:58:15.228988392 +0000 UTC m=+4967.549248680 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data") pod "rabbitmq-cell1-server-0" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.261917 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dmm8j"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.267683 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-db-create-dmm8j"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.277696 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.278137 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.282499 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.288359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-57x7b"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.297509 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_3fd59eaf-4b45-4866-a373-fbd87aa2054d/ovsdbserver-sb/0.log" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.297577 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.318198 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-db-create-57x7b"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.332671 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.332920 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener-log" containerID="cri-o://1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.333332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener" containerID="cri-o://c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.342481 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.342701 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker-log" containerID="cri-o://c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.343041 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker" containerID="cri-o://c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.353397 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.354767 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" event={"ID":"b714d540-db02-4fc2-ae0b-140a4cdab084","Type":"ContainerStarted","Data":"a3b6c3ca786f2a117b425576c27f17c9e7d8b0ee9231b09f7729ff71498a3f4f"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.355280 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-zlxnk" secret="" err="secret \"galera-openstack-cell1-dockercfg-l8frs\" not found" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.368321 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.391968 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54769bd5cb-djssm"] Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.392160 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:14 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:58:14 crc kubenswrapper[5030]: else Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:14 crc kubenswrapper[5030]: fi Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:14 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:14 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:14 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:14 crc kubenswrapper[5030]: # support updates Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.392778 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data internal-tls-certs kube-api-access-dc8dg public-tls-certs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" podUID="66791d21-1d99-480a-b3fe-e93c4e20c047" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.392968 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-sb-0_3fd59eaf-4b45-4866-a373-fbd87aa2054d/ovsdbserver-sb/0.log" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.393011 5030 generic.go:334] "Generic (PLEG): container finished" podID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerID="2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.393109 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-sb-0" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.393604 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerDied","Data":"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.393724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-sb-0" event={"ID":"3fd59eaf-4b45-4866-a373-fbd87aa2054d","Type":"ContainerDied","Data":"04e99e903bcf133b615c5b5c7a4c34a8124ed39a9c5e1b974336ac6b30ad21d2"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.393751 5030 scope.go:117] "RemoveContainer" containerID="3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.394604 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" podUID="b714d540-db02-4fc2-ae0b-140a4cdab084" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.399091 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4vbjd"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.406171 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6"] Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.407136 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" podUID="901be408-dfd2-4cfa-9120-041671bd94ed" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.414270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.415006 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api-log" containerID="cri-o://ac52bbaae9279d327d926eae11a2177bd95c95212a5758e0429fa5341d1b67fc" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.415439 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api" containerID="cri-o://8df38ebfb58a2a5a0af6b5b510d94cbadd6881af9e85606c92acec410f4bba95" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.416154 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_14c17006-1397-4119-bd32-0e07e78d5068/ovsdbserver-nb/0.log" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.416201 5030 generic.go:334] "Generic (PLEG): container finished" podID="14c17006-1397-4119-bd32-0e07e78d5068" containerID="05145492f4392fcd8c755b7cc30f2f103a6adfaf13ba303dcd29f71b950cc8bb" exitCode=2 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.416215 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="14c17006-1397-4119-bd32-0e07e78d5068" containerID="b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.416261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerDied","Data":"05145492f4392fcd8c755b7cc30f2f103a6adfaf13ba303dcd29f71b950cc8bb"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.416280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerDied","Data":"b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.420344 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-22v4w"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.431913 5030 generic.go:334] "Generic (PLEG): container finished" podID="525b8de8-34a8-4bd1-b426-ed17becbab12" containerID="8b93f1e880b42dec39213a1363781794f9e5e185249a6d3eb089db7e9620bfcd" exitCode=137 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437531 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7r2\" (UniqueName: \"kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437781 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437871 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437935 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.437987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs\") pod \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\" (UID: \"3fd59eaf-4b45-4866-a373-fbd87aa2054d\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.438308 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.438365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.438409 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.438440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.438520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.438660 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.438706 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.43869261 +0000 UTC m=+4968.758952898 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "combined-ca-bundle" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.441520 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-db-create-22v4w"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.443678 5030 generic.go:334] "Generic (PLEG): container finished" podID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerID="46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.443736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerDied","Data":"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e"} Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.444336 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.444383 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.444367987 +0000 UTC m=+4968.764628275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-internal-svc" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.444417 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.444436 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.444430939 +0000 UTC m=+4968.764691217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-public-svc" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.445807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts" (OuterVolumeSpecName: "scripts") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.446216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.446379 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.446477 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.446513 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.446522 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts podName:b714d540-db02-4fc2-ae0b-140a4cdab084 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:14.946505439 +0000 UTC m=+4967.266765717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts") pod "root-account-create-update-zlxnk" (UID: "b714d540-db02-4fc2-ae0b-140a4cdab084") : configmap "openstack-cell1-scripts" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.446714 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.446699484 +0000 UTC m=+4968.766959772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "barbican-config-data" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.446790 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config" (OuterVolumeSpecName: "config") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.447238 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dc8dg for pod openstack-kuttl-tests/barbican-api-54769bd5cb-djssm: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.447323 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.447284878 +0000 UTC m=+4968.767545236 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dc8dg" (UniqueName: "kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.453021 5030 generic.go:334] "Generic (PLEG): container finished" podID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerID="8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.453095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerDied","Data":"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.457346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2" (OuterVolumeSpecName: "kube-api-access-xm7r2") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "kube-api-access-xm7r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.458631 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-db-create-4vbjd"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.459866 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.475948 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qgdsf"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.492037 5030 scope.go:117] "RemoveContainer" containerID="2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.492322 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerID="0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.492380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerDied","Data":"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.499609 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-cell1-galera-0" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="galera" containerID="cri-o://af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.502699 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.512980 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-db-create-qgdsf"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516430 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516458 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516466 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516474 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516482 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516489 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516495 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 
23:58:14.516500 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516507 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516514 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516519 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516526 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516532 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516538 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516604 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516637 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" 
event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516684 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.516735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.522006 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerID="7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.522058 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerDied","Data":"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.530827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: 
"3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.532860 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tr47"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540023 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540052 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540064 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540074 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7r2\" (UniqueName: \"kubernetes.io/projected/3fd59eaf-4b45-4866-a373-fbd87aa2054d-kube-api-access-xm7r2\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540084 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.540095 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fd59eaf-4b45-4866-a373-fbd87aa2054d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.553607 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-db-create-7tr47"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.560526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.569049 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.578116 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.582677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-db-sync-x5l8v"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.589679 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.589896 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell1-conductor-0" podUID="09900067-cfe8-4543-8e6c-88c692d43ed8" containerName="nova-cell1-conductor-conductor" containerID="cri-o://44eaef6b2a5abf1f99511a45ea107f9625dee6ea706b24c87e2b305edec32984" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.599422 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:14 crc kubenswrapper[5030]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: if [ -n "nova_api" ]; then Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="nova_api" Jan 20 23:58:14 crc kubenswrapper[5030]: else Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:14 crc kubenswrapper[5030]: fi Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:14 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:14 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:14 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:14 crc kubenswrapper[5030]: # support updates Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.601053 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" podUID="5b1faee2-019f-4357-ab94-2680305278c7" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.601108 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.629821 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerID="3cc6b0b1ef67cf3ed623ef331af4f0711ba50f68fe18f908775fb4147e800fbd" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.630113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerDied","Data":"3cc6b0b1ef67cf3ed623ef331af4f0711ba50f68fe18f908775fb4147e800fbd"} Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.630168 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2 is running failed: container process not found" containerID="b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.634283 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p"] Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.638542 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2 is running failed: container process not found" containerID="b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.651939 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2 is running failed: container process not found" containerID="b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" cmd=["/usr/bin/pidof","ovsdb-server"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.652002 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-db-sync-xkn7p"] Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.652010 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2 is running failed: container process not found" probeType="Readiness" pod="openstack-kuttl-tests/ovsdbserver-nb-0" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="ovsdbserver-nb" Jan 20 23:58:14 crc kubenswrapper[5030]: W0120 23:58:14.652115 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28827f2e_866b_47b6_a6bb_d38ef943bd09.slice/crio-43d489b5deede9eb716e0a497359ce24ce0437af098585ddda118a92b1a96879 WatchSource:0}: Error finding container 43d489b5deede9eb716e0a497359ce24ce0437af098585ddda118a92b1a96879: Status 404 returned error can't find the container with id 43d489b5deede9eb716e0a497359ce24ce0437af098585ddda118a92b1a96879 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.656824 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.657042 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.672353 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.672409 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.672658 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerName="nova-scheduler-scheduler" containerID="cri-o://dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.679773 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerID="80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23" exitCode=143 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.679827 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerDied","Data":"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.698028 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_14c17006-1397-4119-bd32-0e07e78d5068/ovsdbserver-nb/0.log" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.698110 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.700993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.708818 5030 generic.go:334] "Generic (PLEG): container finished" podID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerID="9ca4957d78a8a6a00d4df062b5abb4a6f62840672dc365ddc417d1f2516cfcb4" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.708853 5030 generic.go:334] "Generic (PLEG): container finished" podID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerID="e48bf78e90710aa76a16ac4f0324a165f77700561ddfe100c023788b47cc3e69" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.709985 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.710178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerDied","Data":"9ca4957d78a8a6a00d4df062b5abb4a6f62840672dc365ddc417d1f2516cfcb4"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.710222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerDied","Data":"e48bf78e90710aa76a16ac4f0324a165f77700561ddfe100c023788b47cc3e69"} Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.722715 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:14 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: if [ -n "placement" ]; then Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="placement" Jan 20 23:58:14 crc kubenswrapper[5030]: else Jan 20 23:58:14 crc kubenswrapper[5030]: 
GRANT_DATABASE="*" Jan 20 23:58:14 crc kubenswrapper[5030]: fi Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:14 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:14 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:14 crc kubenswrapper[5030]: # 3. create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:14 crc kubenswrapper[5030]: # support updates Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.722779 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:14 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: if [ -n "neutron" ]; then Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="neutron" Jan 20 23:58:14 crc kubenswrapper[5030]: else Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:14 crc kubenswrapper[5030]: fi Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:14 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:14 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:14 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:14 crc kubenswrapper[5030]: # support updates Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.724106 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" podUID="467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.724159 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" podUID="28827f2e-866b-47b6-a6bb-d38ef943bd09" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.726147 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:14 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: if [ -n "barbican" ]; then Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="barbican" Jan 20 23:58:14 crc kubenswrapper[5030]: else Jan 20 23:58:14 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:14 crc kubenswrapper[5030]: fi Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:14 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:14 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:14 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:14 crc kubenswrapper[5030]: # support updates Jan 20 23:58:14 crc kubenswrapper[5030]: Jan 20 23:58:14 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.728247 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" podUID="9d61475e-3539-4e26-aac7-dfbde0ece687" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.750882 5030 scope.go:117] "RemoveContainer" containerID="3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.755422 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c\": container with ID starting with 3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c not found: ID does not exist" containerID="3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.755530 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c"} err="failed to get container status \"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c\": rpc error: code = NotFound desc = could not find container \"3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c\": container with ID starting with 3ab6bfc1ef90ca2f374fc1f4c559d4366f18debbdef1aa55c34024687bf9298c not found: ID does not exist" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.755607 5030 scope.go:117] "RemoveContainer" containerID="2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5" Jan 20 23:58:14 crc kubenswrapper[5030]: E0120 23:58:14.757547 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5\": container with ID starting with 2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5 not found: ID does not exist" containerID="2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.757586 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5"} err="failed to get container status \"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5\": rpc error: code = NotFound desc = could not find container \"2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5\": container with ID starting with 2dfa7821a1c881765f73c688770fe17e2ad11291288ce498f5cddfcfd52e03f5 not found: ID does not exist" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.757685 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerID="614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305" exitCode=0 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.757926 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" 
event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerDied","Data":"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305"} Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.776397 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.776436 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.785334 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.788832 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.797678 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.824953 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.838665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.839248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3fd59eaf-4b45-4866-a373-fbd87aa2054d" (UID: "3fd59eaf-4b45-4866-a373-fbd87aa2054d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.845780 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.846067 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-central-agent" containerID="cri-o://85c6e0e15e78a3153d0dd557c849ff4f4ff72da720eb6c506baa0d7377395508" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.846462 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="proxy-httpd" containerID="cri-o://fd95c8d95d19d605865a5d57c895aa4658929f6fbbc1cd7b2f7dcb7695715783" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.846514 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="sg-core" containerID="cri-o://3aab7494f35c8d53bed3be64a7d47121e15d43ba605ba4d9c3f83abc05cd8653" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.849521 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/ceilometer-0" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-notification-agent" containerID="cri-o://dcc49b3097e7c3881a9e6d16ca2206768b7f3632eb827aa614fc3cf5b9335766" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.894658 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="rabbitmq" containerID="cri-o://016af194f1eb4817af713ac5bfe3322cf5ee11ad44e3b6dcfc50cfc5c79e5f55" gracePeriod=604800 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.898184 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.898437 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kube-state-metrics-0" podUID="02190451-840d-4e4e-8b67-a14ac1f9712c" containerName="kube-state-metrics" containerID="cri-o://13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.912943 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.913167 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/memcached-0" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerName="memcached" containerID="cri-o://a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb" gracePeriod=30 Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914595 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-k9wsc\" (UniqueName: \"kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914702 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle\") pod \"525b8de8-34a8-4bd1-b426-ed17becbab12\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret\") pod \"525b8de8-34a8-4bd1-b426-ed17becbab12\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914775 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914835 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config\") pod \"525b8de8-34a8-4bd1-b426-ed17becbab12\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.914998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.915022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config\") pod \"14c17006-1397-4119-bd32-0e07e78d5068\" (UID: \"14c17006-1397-4119-bd32-0e07e78d5068\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.915046 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9bftd\" (UniqueName: \"kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd\") pod \"525b8de8-34a8-4bd1-b426-ed17becbab12\" (UID: \"525b8de8-34a8-4bd1-b426-ed17becbab12\") " Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.915747 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd59eaf-4b45-4866-a373-fbd87aa2054d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.917478 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.922780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.923030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts" (OuterVolumeSpecName: "scripts") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.923447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config" (OuterVolumeSpecName: "config") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.927737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc" (OuterVolumeSpecName: "kube-api-access-k9wsc") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "kube-api-access-k9wsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.948563 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd" (OuterVolumeSpecName: "kube-api-access-9bftd") pod "525b8de8-34a8-4bd1-b426-ed17becbab12" (UID: "525b8de8-34a8-4bd1-b426-ed17becbab12"). InnerVolumeSpecName "kube-api-access-9bftd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.977436 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.988560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:14 crc kubenswrapper[5030]: I0120 23:58:14.996875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "525b8de8-34a8-4bd1-b426-ed17becbab12" (UID: "525b8de8-34a8-4bd1-b426-ed17becbab12"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017131 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v64pf\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs\") pod 
\"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.017688 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs\") pod \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\" (UID: \"03c5eea5-bd6f-4b19-829d-32f4ef09997c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018542 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018574 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018585 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018595 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/14c17006-1397-4119-bd32-0e07e78d5068-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018604 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018616 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14c17006-1397-4119-bd32-0e07e78d5068-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018640 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bftd\" (UniqueName: \"kubernetes.io/projected/525b8de8-34a8-4bd1-b426-ed17becbab12-kube-api-access-9bftd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.018654 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9wsc\" (UniqueName: \"kubernetes.io/projected/14c17006-1397-4119-bd32-0e07e78d5068-kube-api-access-k9wsc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.019658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.019780 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.019854 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts podName:b714d540-db02-4fc2-ae0b-140a4cdab084 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:58:16.019835487 +0000 UTC m=+4968.340095775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts") pod "root-account-create-update-zlxnk" (UID: "b714d540-db02-4fc2-ae0b-140a4cdab084") : configmap "openstack-cell1-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.019906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.025925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf" (OuterVolumeSpecName: "kube-api-access-v64pf") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "kube-api-access-v64pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.025998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.052431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "525b8de8-34a8-4bd1-b426-ed17becbab12" (UID: "525b8de8-34a8-4bd1-b426-ed17becbab12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.055355 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.058056 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-t6cfs"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109277 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4"] Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109810 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-server" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109824 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-server" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109839 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-httpd" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109845 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-httpd" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109854 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="ovsdbserver-nb" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="ovsdbserver-nb" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109898 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="ovsdbserver-sb" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109904 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="ovsdbserver-sb" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109909 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109915 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.109927 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.109933 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110218 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110230 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="ovsdbserver-sb" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110243 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-server" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110260 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" containerName="proxy-httpd" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110288 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" containerName="openstack-network-exporter" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.110299 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c17006-1397-4119-bd32-0e07e78d5068" containerName="ovsdbserver-nb" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.113912 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.118074 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"keystone-db-secret" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120211 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120277 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v64pf\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-kube-api-access-v64pf\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120287 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120325 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120336 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/03c5eea5-bd6f-4b19-829d-32f4ef09997c-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.120345 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03c5eea5-bd6f-4b19-829d-32f4ef09997c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.122406 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.145591 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pcjpl"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.172759 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-sync-pcjpl"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.195763 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7mn5q"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.200941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "525b8de8-34a8-4bd1-b426-ed17becbab12" (UID: "525b8de8-34a8-4bd1-b426-ed17becbab12"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.224030 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.224247 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" podUID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" containerName="keystone-api" containerID="cri-o://b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65" gracePeriod=30 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.234113 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-bootstrap-7mn5q"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.234706 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks59n\" (UniqueName: \"kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.234873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.234937 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/525b8de8-34a8-4bd1-b426-ed17becbab12-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.235000 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.235030 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data podName:19f6003c-3d52-44e9-a5a8-d525e4b6a79e nodeName:}" failed. No retries permitted until 2026-01-20 23:58:17.235018248 +0000 UTC m=+4969.555278536 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data") pod "rabbitmq-cell1-server-0" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.245897 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.260235 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-56vq6"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.266520 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-db-create-56vq6"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.278156 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.292217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.301455 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.137:6080/vnc_lite.html\": dial tcp 10.217.0.137:6080: connect: connection refused" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.304724 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-sb-0"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.321939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.327963 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.333986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.336650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks59n\" (UniqueName: \"kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.336816 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.336920 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.336936 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.336986 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.337021 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:15.837008262 +0000 UTC m=+4968.157268540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : configmap "openstack-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.353374 5030 projected.go:194] Error preparing data for projected volume kube-api-access-ks59n for pod openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.353443 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:15.853424569 +0000 UTC m=+4968.173684857 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ks59n" (UniqueName: "kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.360523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.371830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.372184 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data" (OuterVolumeSpecName: "config-data") pod "03c5eea5-bd6f-4b19-829d-32f4ef09997c" (UID: "03c5eea5-bd6f-4b19-829d-32f4ef09997c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.386802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "14c17006-1397-4119-bd32-0e07e78d5068" (UID: "14c17006-1397-4119-bd32-0e07e78d5068"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.439717 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.439753 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.439764 5030 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/14c17006-1397-4119-bd32-0e07e78d5068-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.439823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c5eea5-bd6f-4b19-829d-32f4ef09997c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.495864 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/openstack-galera-0" podUID="732faf79-c224-4a2a-bc39-07343926d772" containerName="galera" containerID="cri-o://c349fa99a47e05f4423344350c4f8ba2fbdc8b3c193473666cc6af2498e4f85c" gracePeriod=30 Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.542012 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.542086 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data podName:4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf nodeName:}" failed. No retries permitted until 2026-01-20 23:58:19.542067426 +0000 UTC m=+4971.862327714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data") pod "rabbitmq-server-0" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf") : configmap "rabbitmq-config-data" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.740218 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ks59n operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" podUID="b7a462d5-bc01-4f4d-a54e-10acd974e5a0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.745832 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.772668 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.786863 5030 scope.go:117] "RemoveContainer" containerID="8b93f1e880b42dec39213a1363781794f9e5e185249a6d3eb089db7e9620bfcd" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.787105 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/openstackclient" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.800604 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.811357 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.818164 5030 generic.go:334] "Generic (PLEG): container finished" podID="68728b46-0262-46f2-8eb0-a0301870b569" containerID="1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99" exitCode=143 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.818221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerDied","Data":"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827286 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerID="fd95c8d95d19d605865a5d57c895aa4658929f6fbbc1cd7b2f7dcb7695715783" exitCode=0 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827333 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerID="3aab7494f35c8d53bed3be64a7d47121e15d43ba605ba4d9c3f83abc05cd8653" exitCode=2 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827343 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerID="85c6e0e15e78a3153d0dd557c849ff4f4ff72da720eb6c506baa0d7377395508" exitCode=0 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerDied","Data":"fd95c8d95d19d605865a5d57c895aa4658929f6fbbc1cd7b2f7dcb7695715783"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerDied","Data":"3aab7494f35c8d53bed3be64a7d47121e15d43ba605ba4d9c3f83abc05cd8653"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.827579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerDied","Data":"85c6e0e15e78a3153d0dd557c849ff4f4ff72da720eb6c506baa0d7377395508"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.829300 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.841183 5030 generic.go:334] "Generic (PLEG): container finished" podID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerID="af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae" exitCode=0 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.841251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerDied","Data":"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.841280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-cell1-galera-0" event={"ID":"5bdb99fc-435f-4ea4-bd16-e04616518099","Type":"ContainerDied","Data":"6da066bd7d25105a4da15c0529a9ce20d788f3400b5f68b97e8d04006a8eb3bf"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.841345 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-cell1-galera-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.843657 5030 scope.go:117] "RemoveContainer" containerID="af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5s8x\" (UniqueName: \"kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863453 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m9np\" (UniqueName: \"kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np\") pod \"02190451-840d-4e4e-8b67-a14ac1f9712c\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle\") pod \"02190451-840d-4e4e-8b67-a14ac1f9712c\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863522 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs\") pod \"231da45e-cd6f-435b-9209-0d8794fa33ad\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc 
kubenswrapper[5030]: I0120 23:58:15.863587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs\") pod \"231da45e-cd6f-435b-9209-0d8794fa33ad\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data\") pod \"231da45e-cd6f-435b-9209-0d8794fa33ad\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863785 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkrmn\" (UniqueName: \"kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863870 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 
23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863891 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs\") pod \"02190451-840d-4e4e-8b67-a14ac1f9712c\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config\") pod \"02190451-840d-4e4e-8b67-a14ac1f9712c\" (UID: \"02190451-840d-4e4e-8b67-a14ac1f9712c\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle\") pod \"231da45e-cd6f-435b-9209-0d8794fa33ad\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.863974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.864015 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"5bdb99fc-435f-4ea4-bd16-e04616518099\" (UID: \"5bdb99fc-435f-4ea4-bd16-e04616518099\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.864034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dxfm\" (UniqueName: \"kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm\") pod \"231da45e-cd6f-435b-9209-0d8794fa33ad\" (UID: \"231da45e-cd6f-435b-9209-0d8794fa33ad\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.864073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle\") pod \"a7305e60-aa68-4d37-a472-80060d0fb5cb\" (UID: \"a7305e60-aa68-4d37-a472-80060d0fb5cb\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.864328 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.864424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks59n\" (UniqueName: \"kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.880418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.880941 5030 generic.go:334] "Generic (PLEG): container finished" podID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerID="2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad" exitCode=0 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.881026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerDied","Data":"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.881052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-scheduler-0" event={"ID":"a7305e60-aa68-4d37-a472-80060d0fb5cb","Type":"ContainerDied","Data":"6c91c04d39134fe91bc0213b9945f716857fdb4d5d2d88bbccb158eacac87b1f"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.881120 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-scheduler-0" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.883909 5030 projected.go:194] Error preparing data for projected volume kube-api-access-ks59n for pod openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.883973 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.883955569 +0000 UTC m=+4969.204215857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ks59n" (UniqueName: "kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.884743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.888334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.889669 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.890852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.891461 5030 generic.go:334] "Generic (PLEG): container finished" podID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerID="1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da" exitCode=0 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.891514 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"231da45e-cd6f-435b-9209-0d8794fa33ad","Type":"ContainerDied","Data":"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.891539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" event={"ID":"231da45e-cd6f-435b-9209-0d8794fa33ad","Type":"ContainerDied","Data":"0af217e7af4fddc0a9f0a907f30ae0f7f766a3608f64dddd1f9bb3ea15f3ac9b"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.891859 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-novncproxy-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.893194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts" (OuterVolumeSpecName: "scripts") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.893905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.899447 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.900546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x" (OuterVolumeSpecName: "kube-api-access-n5s8x") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "kube-api-access-n5s8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.900974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm" (OuterVolumeSpecName: "kube-api-access-2dxfm") pod "231da45e-cd6f-435b-9209-0d8794fa33ad" (UID: "231da45e-cd6f-435b-9209-0d8794fa33ad"). InnerVolumeSpecName "kube-api-access-2dxfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: E0120 23:58:15.902851 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:16.899599979 +0000 UTC m=+4969.219860267 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : configmap "openstack-scripts" not found Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.903829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn" (OuterVolumeSpecName: "kube-api-access-gkrmn") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "kube-api-access-gkrmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.907712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np" (OuterVolumeSpecName: "kube-api-access-4m9np") pod "02190451-840d-4e4e-8b67-a14ac1f9712c" (UID: "02190451-840d-4e4e-8b67-a14ac1f9712c"). InnerVolumeSpecName "kube-api-access-4m9np". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.911026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" event={"ID":"9d61475e-3539-4e26-aac7-dfbde0ece687","Type":"ContainerStarted","Data":"d3b6d97e1da10de00566bb84f7479d64264d5e7100980a7be0af4e9a85473d2e"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.918071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" event={"ID":"28827f2e-866b-47b6-a6bb-d38ef943bd09","Type":"ContainerStarted","Data":"43d489b5deede9eb716e0a497359ce24ce0437af098585ddda118a92b1a96879"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.932791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" event={"ID":"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3","Type":"ContainerStarted","Data":"ae384507090a42030e78757bdadb897b825a52aa454163bbcc90fe249c7e85ea"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.944596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.945765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" event={"ID":"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0","Type":"ContainerStarted","Data":"71ee938eac7bb8fed7cd685f7ceddd8fc6942ddb5416f7e0149497813ef1f517"} Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.955203 5030 scope.go:117] "RemoveContainer" containerID="204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.965185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle\") pod \"09900067-cfe8-4543-8e6c-88c692d43ed8\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.965493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data\") pod \"09900067-cfe8-4543-8e6c-88c692d43ed8\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.966602 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvrr4\" (UniqueName: \"kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4\") pod \"09900067-cfe8-4543-8e6c-88c692d43ed8\" (UID: \"09900067-cfe8-4543-8e6c-88c692d43ed8\") " Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972651 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5s8x\" (UniqueName: \"kubernetes.io/projected/a7305e60-aa68-4d37-a472-80060d0fb5cb-kube-api-access-n5s8x\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972693 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m9np\" (UniqueName: \"kubernetes.io/projected/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-api-access-4m9np\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972705 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972717 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972728 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972739 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7305e60-aa68-4d37-a472-80060d0fb5cb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972750 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 
23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972762 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972775 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972788 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkrmn\" (UniqueName: \"kubernetes.io/projected/5bdb99fc-435f-4ea4-bd16-e04616518099-kube-api-access-gkrmn\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972799 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5bdb99fc-435f-4ea4-bd16-e04616518099-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.972814 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dxfm\" (UniqueName: \"kubernetes.io/projected/231da45e-cd6f-435b-9209-0d8794fa33ad-kube-api-access-2dxfm\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.973301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.973835 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovsdbserver-nb-0_14c17006-1397-4119-bd32-0e07e78d5068/ovsdbserver-nb/0.log" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.973971 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovsdbserver-nb-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.975439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02190451-840d-4e4e-8b67-a14ac1f9712c" (UID: "02190451-840d-4e4e-8b67-a14ac1f9712c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.976878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4" (OuterVolumeSpecName: "kube-api-access-nvrr4") pod "09900067-cfe8-4543-8e6c-88c692d43ed8" (UID: "09900067-cfe8-4543-8e6c-88c692d43ed8"). InnerVolumeSpecName "kube-api-access-nvrr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.979235 5030 generic.go:334] "Generic (PLEG): container finished" podID="02190451-840d-4e4e-8b67-a14ac1f9712c" containerID="13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70" exitCode=2 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.979380 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kube-state-metrics-0" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.982449 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.988608 5030 generic.go:334] "Generic (PLEG): container finished" podID="5915461a-f712-499e-ba04-bab78e49188f" containerID="ac52bbaae9279d327d926eae11a2177bd95c95212a5758e0429fa5341d1b67fc" exitCode=143 Jan 20 23:58:15 crc kubenswrapper[5030]: I0120 23:58:15.999572 5030 generic.go:334] "Generic (PLEG): container finished" podID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerID="c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028" exitCode=143 Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.005484 5030 generic.go:334] "Generic (PLEG): container finished" podID="09900067-cfe8-4543-8e6c-88c692d43ed8" containerID="44eaef6b2a5abf1f99511a45ea107f9625dee6ea706b24c87e2b305edec32984" exitCode=0 Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.005588 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell1-conductor-0" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.006030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231da45e-cd6f-435b-9209-0d8794fa33ad" (UID: "231da45e-cd6f-435b-9209-0d8794fa33ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.007986 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker-log" containerID="cri-o://3a64e91d5d48c05cb735d0cb229c7481a1d20317e71c2a80aafc083cefa1b5aa" gracePeriod=30 Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.008122 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker" containerID="cri-o://9edc9b5e1d0d3cbd3ae7e5e30adbd2b6dd6a00cb85691efc291253101a274524" gracePeriod=30 Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.015746 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.015814 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.015974 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.016371 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack-kuttl-tests/root-account-create-update-zlxnk" secret="" err="secret \"galera-openstack-cell1-dockercfg-l8frs\" not found" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.029432 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd59946-c071-4913-973f-68c0ba64772d" path="/var/lib/kubelet/pods/1fd59946-c071-4913-973f-68c0ba64772d/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.030710 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e30653e-ea2a-4d57-b605-c35cf41208c4" path="/var/lib/kubelet/pods/2e30653e-ea2a-4d57-b605-c35cf41208c4/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.039177 5030 scope.go:117] "RemoveContainer" containerID="af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.031392 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd59eaf-4b45-4866-a373-fbd87aa2054d" path="/var/lib/kubelet/pods/3fd59eaf-4b45-4866-a373-fbd87aa2054d/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.041023 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae\": container with ID starting with af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae not found: ID does not exist" containerID="af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041058 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae"} err="failed to get container status \"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae\": rpc error: code = NotFound desc = could not find container \"af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae\": container with ID starting with af17e77e1eb767a98bb78c74babe1fea956bf2c7c84c692b9ed22bca3333a0ae not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041081 5030 scope.go:117] "RemoveContainer" containerID="204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041180 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5236b48f-7d0e-40ce-94c0-adb838bf5de6" path="/var/lib/kubelet/pods/5236b48f-7d0e-40ce-94c0-adb838bf5de6/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041710 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525b8de8-34a8-4bd1-b426-ed17becbab12" path="/var/lib/kubelet/pods/525b8de8-34a8-4bd1-b426-ed17becbab12/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.041711 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce\": container with ID starting with 204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce not found: ID does not exist" containerID="204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041805 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce"} err="failed to get container status 
\"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce\": rpc error: code = NotFound desc = could not find container \"204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce\": container with ID starting with 204f0f62944ca153053c055e5e6313febbbdaac72c94d5a59db6b700461c73ce not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.041818 5030 scope.go:117] "RemoveContainer" containerID="7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.042267 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53344bed-1dcb-467d-a87e-93b38be8acf1" path="/var/lib/kubelet/pods/53344bed-1dcb-467d-a87e-93b38be8acf1/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.043212 5030 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 23:58:16 crc kubenswrapper[5030]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: MYSQL_CMD="mysql -h -u root -P 3306" Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: if [ -n "" ]; then Jan 20 23:58:16 crc kubenswrapper[5030]: GRANT_DATABASE="" Jan 20 23:58:16 crc kubenswrapper[5030]: else Jan 20 23:58:16 crc kubenswrapper[5030]: GRANT_DATABASE="*" Jan 20 23:58:16 crc kubenswrapper[5030]: fi Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: # going for maximum compatibility here: Jan 20 23:58:16 crc kubenswrapper[5030]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Jan 20 23:58:16 crc kubenswrapper[5030]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Jan 20 23:58:16 crc kubenswrapper[5030]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Jan 20 23:58:16 crc kubenswrapper[5030]: # support updates Jan 20 23:58:16 crc kubenswrapper[5030]: Jan 20 23:58:16 crc kubenswrapper[5030]: $MYSQL_CMD < logger="UnhandledError" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.043600 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7f91b3-ec99-44df-a70b-86ca862be85d" path="/var/lib/kubelet/pods/7d7f91b3-ec99-44df-a70b-86ca862be85d/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.044167 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923cb7da-1207-4cb6-b35f-2a8284253596" path="/var/lib/kubelet/pods/923cb7da-1207-4cb6-b35f-2a8284253596/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.044830 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e334cad-3adc-4d91-aa9c-3e68815b750f" path="/var/lib/kubelet/pods/9e334cad-3adc-4d91-aa9c-3e68815b750f/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.045515 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" podUID="b714d540-db02-4fc2-ae0b-140a4cdab084" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.045902 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58c134d-66fb-4e25-ac00-7fe6686da8e0" path="/var/lib/kubelet/pods/a58c134d-66fb-4e25-ac00-7fe6686da8e0/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.046413 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca761538-4cfb-4a36-92a3-f5b0601eb08b" path="/var/lib/kubelet/pods/ca761538-4cfb-4a36-92a3-f5b0601eb08b/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.046935 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f25e04-6d69-4422-8114-d112be37db5a" path="/var/lib/kubelet/pods/f1f25e04-6d69-4422-8114-d112be37db5a/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.047841 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2da3223-e170-45c0-a464-2b91d405f3c0" path="/var/lib/kubelet/pods/f2da3223-e170-45c0-a464-2b91d405f3c0/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.048442 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f386591d-bc9f-4389-b61b-e4a830c025f9" path="/var/lib/kubelet/pods/f386591d-bc9f-4389-b61b-e4a830c025f9/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.054712 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" podStartSLOduration=5.054695841 podStartE2EDuration="5.054695841s" podCreationTimestamp="2026-01-20 23:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:58:16.027530643 +0000 UTC m=+4968.347790931" watchObservedRunningTime="2026-01-20 23:58:16.054695841 +0000 UTC m=+4968.374956129" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.089412 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6f83e7-540c-419e-b09f-1a49dc622f40" path="/var/lib/kubelet/pods/ff6f83e7-540c-419e-b09f-1a49dc622f40/volumes" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.093744 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") pod \"barbican-keystone-listener-7f995c9cbb-dqqv6\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.093895 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.093924 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.093936 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvrr4\" (UniqueName: \"kubernetes.io/projected/09900067-cfe8-4543-8e6c-88c692d43ed8-kube-api-access-nvrr4\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.093952 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.097218 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.097311 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts podName:b714d540-db02-4fc2-ae0b-140a4cdab084 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:18.097286215 +0000 UTC m=+4970.417546503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts") pod "root-account-create-update-zlxnk" (UID: "b714d540-db02-4fc2-ae0b-140a4cdab084") : configmap "openstack-cell1-scripts" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.097231 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.097363 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle podName:901be408-dfd2-4cfa-9120-041671bd94ed nodeName:}" failed. No retries permitted until 2026-01-20 23:58:20.097356327 +0000 UTC m=+4972.417616615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle") pod "barbican-keystone-listener-7f995c9cbb-dqqv6" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed") : secret "combined-ca-bundle" not found Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.106019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "02190451-840d-4e4e-8b67-a14ac1f9712c" (UID: "02190451-840d-4e4e-8b67-a14ac1f9712c"). 
InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.121911 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "231da45e-cd6f-435b-9209-0d8794fa33ad" (UID: "231da45e-cd6f-435b-9209-0d8794fa33ad"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.154472 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.161599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data" (OuterVolumeSpecName: "config-data") pod "09900067-cfe8-4543-8e6c-88c692d43ed8" (UID: "09900067-cfe8-4543-8e6c-88c692d43ed8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.177092 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data" (OuterVolumeSpecName: "config-data") pod "231da45e-cd6f-435b-9209-0d8794fa33ad" (UID: "231da45e-cd6f-435b-9209-0d8794fa33ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.181369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5bdb99fc-435f-4ea4-bd16-e04616518099" (UID: "5bdb99fc-435f-4ea4-bd16-e04616518099"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.184703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09900067-cfe8-4543-8e6c-88c692d43ed8" (UID: "09900067-cfe8-4543-8e6c-88c692d43ed8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.186929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "02190451-840d-4e4e-8b67-a14ac1f9712c" (UID: "02190451-840d-4e4e-8b67-a14ac1f9712c"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.196682 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.196712 5030 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.196722 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bdb99fc-435f-4ea4-bd16-e04616518099-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.196731 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.196741 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.197293 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09900067-cfe8-4543-8e6c-88c692d43ed8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.197313 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.197323 5030 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/02190451-840d-4e4e-8b67-a14ac1f9712c-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.217050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "231da45e-cd6f-435b-9209-0d8794fa33ad" (UID: "231da45e-cd6f-435b-9209-0d8794fa33ad"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.229366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.298557 5030 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/231da45e-cd6f-435b-9209-0d8794fa33ad-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.298583 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.308075 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data" (OuterVolumeSpecName: "config-data") pod "a7305e60-aa68-4d37-a472-80060d0fb5cb" (UID: "a7305e60-aa68-4d37-a472-80060d0fb5cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.354969 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.357397 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.358153 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/memcached-0" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.237:11211: connect: connection refused" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.360898 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.360932 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/ovn-northd-0" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="ovn-northd" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovsdbserver-nb-0" event={"ID":"14c17006-1397-4119-bd32-0e07e78d5068","Type":"ContainerDied","Data":"c9d6e972c95d56185baa0e231c500cf95678aa232be38c9e6761786e20dc488f"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02190451-840d-4e4e-8b67-a14ac1f9712c","Type":"ContainerDied","Data":"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70"} Jan 20 23:58:16 
crc kubenswrapper[5030]: I0120 23:58:16.403540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kube-state-metrics-0" event={"ID":"02190451-840d-4e4e-8b67-a14ac1f9712c","Type":"ContainerDied","Data":"70b7b87cadfbc887c22fbf5316e31d57f694d46fb6e0c71a375307b3c9eeafb5"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5" event={"ID":"03c5eea5-bd6f-4b19-829d-32f4ef09997c","Type":"ContainerDied","Data":"63856504b66b5735168be1f4f4e022e263dccc76e1d62bc9633b316c047c486e"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerDied","Data":"ac52bbaae9279d327d926eae11a2177bd95c95212a5758e0429fa5341d1b67fc"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403578 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerDied","Data":"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell1-conductor-0" event={"ID":"09900067-cfe8-4543-8e6c-88c692d43ed8","Type":"ContainerDied","Data":"44eaef6b2a5abf1f99511a45ea107f9625dee6ea706b24c87e2b305edec32984"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerStarted","Data":"3a64e91d5d48c05cb735d0cb229c7481a1d20317e71c2a80aafc083cefa1b5aa"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403651 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerStarted","Data":"cedd4f0ce87996317edcf5b3b3ae5399cfe3e74e63bb4e27e816846fe94418c8"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.403659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" event={"ID":"5b1faee2-019f-4357-ab94-2680305278c7","Type":"ContainerStarted","Data":"3b14fcf4f9ebbb090048b23a19a4990a1d35587510d989a42ea37aa1375e75d7"} Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.415652 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7305e60-aa68-4d37-a472-80060d0fb5cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.418455 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.424880 5030 scope.go:117] "RemoveContainer" containerID="2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.425860 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovsdbserver-nb-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.431934 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.450983 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.467535 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.472746 5030 scope.go:117] "RemoveContainer" containerID="7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.472999 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b\": container with ID starting with 7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b not found: ID does not exist" containerID="7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.473015 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.473023 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b"} err="failed to get container status \"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b\": rpc error: code = NotFound desc = could not find container \"7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b\": container with ID starting with 7aa1a8b59ede0ec7d6ca8badb42b6c8311eed524c20c0dfff1c4b3113611868b not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.473041 5030 scope.go:117] "RemoveContainer" containerID="2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.473255 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad\": container with ID starting with 2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad not found: ID does not exist" containerID="2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.473297 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad"} err="failed to get container status \"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad\": rpc error: code = NotFound desc = could not find container \"2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad\": container with ID starting with 2352db31d2fa21ab9eebafd17ba76cf00ee85495907870c2b6db941db94c99ad not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.473324 5030 scope.go:117] "RemoveContainer" containerID="1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.483802 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.490740 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-conductor-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.500344 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.514878 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.517400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom\") pod \"901be408-dfd2-4cfa-9120-041671bd94ed\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.517481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f4pd\" (UniqueName: \"kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd\") pod \"901be408-dfd2-4cfa-9120-041671bd94ed\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.517667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs\") pod \"901be408-dfd2-4cfa-9120-041671bd94ed\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.517862 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data\") pod \"901be408-dfd2-4cfa-9120-041671bd94ed\" (UID: \"901be408-dfd2-4cfa-9120-041671bd94ed\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.518256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.518302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.518330 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.518405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") pod 
\"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.518483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") pod \"barbican-api-54769bd5cb-djssm\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.518667 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-public-svc: secret "cert-barbican-public-svc" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.518708 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:20.518695638 +0000 UTC m=+4972.838955926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-public-svc" not found Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.519308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs" (OuterVolumeSpecName: "logs") pod "901be408-dfd2-4cfa-9120-041671bd94ed" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.522582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "901be408-dfd2-4cfa-9120-041671bd94ed" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.528796 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/combined-ca-bundle: secret "combined-ca-bundle" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.528835 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/barbican-config-data: secret "barbican-config-data" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.528853 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:20.528839004 +0000 UTC m=+4972.849099292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "combined-ca-bundle" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.528923 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. 
No retries permitted until 2026-01-20 23:58:20.528903445 +0000 UTC m=+4972.849163733 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "barbican-config-data" not found Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.528939 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data" (OuterVolumeSpecName: "config-data") pod "901be408-dfd2-4cfa-9120-041671bd94ed" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.529007 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/cert-barbican-internal-svc: secret "cert-barbican-internal-svc" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.529036 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:20.529028258 +0000 UTC m=+4972.849288546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : secret "cert-barbican-internal-svc" not found Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.531180 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.531600 5030 projected.go:194] Error preparing data for projected volume kube-api-access-dc8dg for pod openstack-kuttl-tests/barbican-api-54769bd5cb-djssm: failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.531672 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg podName:66791d21-1d99-480a-b3fe-e93c4e20c047 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:20.531662712 +0000 UTC m=+4972.851923000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dc8dg" (UniqueName: "kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg") pod "barbican-api-54769bd5cb-djssm" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047") : failed to fetch token: serviceaccounts "barbican-barbican" not found Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.533040 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.533818 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-proxy-58bb755474-jnzx5"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.534098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd" (OuterVolumeSpecName: "kube-api-access-6f4pd") pod "901be408-dfd2-4cfa-9120-041671bd94ed" (UID: "901be408-dfd2-4cfa-9120-041671bd94ed"). InnerVolumeSpecName "kube-api-access-6f4pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.544768 5030 scope.go:117] "RemoveContainer" containerID="1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.545784 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da\": container with ID starting with 1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da not found: ID does not exist" containerID="1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.545824 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da"} err="failed to get container status \"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da\": rpc error: code = NotFound desc = could not find container \"1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da\": container with ID starting with 1b1952168676d99572794ea82a32c819d0bb2a2bc32e30b900afcbcea78e76da not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.545850 5030 scope.go:117] "RemoveContainer" containerID="05145492f4392fcd8c755b7cc30f2f103a6adfaf13ba303dcd29f71b950cc8bb" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.546887 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.552181 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kube-state-metrics-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.582988 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.590986 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-cell1-galera-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.606710 5030 scope.go:117] "RemoveContainer" containerID="b3a1344ff3e013651e7f6d5c6a2155a0ddb578b7e656e6c2b4a5a5e435e9bfc2" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.607610 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.612703 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell1-novncproxy-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts\") pod \"28827f2e-866b-47b6-a6bb-d38ef943bd09\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts\") pod \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djdkq\" (UniqueName: \"kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq\") pod \"28827f2e-866b-47b6-a6bb-d38ef943bd09\" (UID: \"28827f2e-866b-47b6-a6bb-d38ef943bd09\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts\") pod \"9d61475e-3539-4e26-aac7-dfbde0ece687\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom\") pod \"66791d21-1d99-480a-b3fe-e93c4e20c047\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcz9s\" (UniqueName: \"kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s\") pod \"9d61475e-3539-4e26-aac7-dfbde0ece687\" (UID: \"9d61475e-3539-4e26-aac7-dfbde0ece687\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619850 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc5ls\" (UniqueName: \"kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls\") pod \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\" (UID: \"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.620237 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.620615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d61475e-3539-4e26-aac7-dfbde0ece687" (UID: "9d61475e-3539-4e26-aac7-dfbde0ece687"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.619869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns5v4\" (UniqueName: \"kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4\") pod \"5b1faee2-019f-4357-ab94-2680305278c7\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.621176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs\") pod \"66791d21-1d99-480a-b3fe-e93c4e20c047\" (UID: \"66791d21-1d99-480a-b3fe-e93c4e20c047\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.621235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts\") pod \"5b1faee2-019f-4357-ab94-2680305278c7\" (UID: \"5b1faee2-019f-4357-ab94-2680305278c7\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.621820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b1faee2-019f-4357-ab94-2680305278c7" (UID: "5b1faee2-019f-4357-ab94-2680305278c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.621980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs" (OuterVolumeSpecName: "logs") pod "66791d21-1d99-480a-b3fe-e93c4e20c047" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3" (UID: "467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28827f2e-866b-47b6-a6bb-d38ef943bd09" (UID: "28827f2e-866b-47b6-a6bb-d38ef943bd09"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622518 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/ceilometer-0" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.148:3000/\": dial tcp 10.217.0.148:3000: connect: connection refused" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622823 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66791d21-1d99-480a-b3fe-e93c4e20c047-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622838 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622850 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1faee2-019f-4357-ab94-2680305278c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622859 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f4pd\" (UniqueName: \"kubernetes.io/projected/901be408-dfd2-4cfa-9120-041671bd94ed-kube-api-access-6f4pd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622869 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28827f2e-866b-47b6-a6bb-d38ef943bd09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622878 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622887 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d61475e-3539-4e26-aac7-dfbde0ece687-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622896 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901be408-dfd2-4cfa-9120-041671bd94ed-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.622911 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.624522 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls" (OuterVolumeSpecName: "kube-api-access-vc5ls") pod "467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3" (UID: "467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3"). InnerVolumeSpecName "kube-api-access-vc5ls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.625842 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s" (OuterVolumeSpecName: "kube-api-access-pcz9s") pod "9d61475e-3539-4e26-aac7-dfbde0ece687" (UID: "9d61475e-3539-4e26-aac7-dfbde0ece687"). InnerVolumeSpecName "kube-api-access-pcz9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.627729 5030 scope.go:117] "RemoveContainer" containerID="13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.630223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq" (OuterVolumeSpecName: "kube-api-access-djdkq") pod "28827f2e-866b-47b6-a6bb-d38ef943bd09" (UID: "28827f2e-866b-47b6-a6bb-d38ef943bd09"). InnerVolumeSpecName "kube-api-access-djdkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.638219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "66791d21-1d99-480a-b3fe-e93c4e20c047" (UID: "66791d21-1d99-480a-b3fe-e93c4e20c047"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.643258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4" (OuterVolumeSpecName: "kube-api-access-ns5v4") pod "5b1faee2-019f-4357-ab94-2680305278c7" (UID: "5b1faee2-019f-4357-ab94-2680305278c7"). InnerVolumeSpecName "kube-api-access-ns5v4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.647747 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.659615 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-scheduler-0"] Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.659700 5030 scope.go:117] "RemoveContainer" containerID="13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.660118 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70\": container with ID starting with 13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70 not found: ID does not exist" containerID="13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.660182 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70"} err="failed to get container status \"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70\": rpc error: code = NotFound desc = could not find container \"13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70\": container with ID starting with 13e58d45af774f4c8a8b261f1aa8f35442fe8fe8b25360af8c074429af8cfa70 not found: ID does not exist" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.660210 5030 scope.go:117] "RemoveContainer" containerID="9ca4957d78a8a6a00d4df062b5abb4a6f62840672dc365ddc417d1f2516cfcb4" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.698562 5030 scope.go:117] "RemoveContainer" containerID="e48bf78e90710aa76a16ac4f0324a165f77700561ddfe100c023788b47cc3e69" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.724396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v55kl\" (UniqueName: \"kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl\") pod \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.724488 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs\") pod \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.724519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config\") pod \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.724598 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data\") pod \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.724716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle\") pod \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\" (UID: \"443bb4b6-0b3c-4824-b5b6-e2771eb8e438\") " Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725392 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725425 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcz9s\" (UniqueName: \"kubernetes.io/projected/9d61475e-3539-4e26-aac7-dfbde0ece687-kube-api-access-pcz9s\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725442 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc5ls\" (UniqueName: \"kubernetes.io/projected/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3-kube-api-access-vc5ls\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725454 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns5v4\" (UniqueName: \"kubernetes.io/projected/5b1faee2-019f-4357-ab94-2680305278c7-kube-api-access-ns5v4\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725467 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djdkq\" (UniqueName: \"kubernetes.io/projected/28827f2e-866b-47b6-a6bb-d38ef943bd09-kube-api-access-djdkq\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "443bb4b6-0b3c-4824-b5b6-e2771eb8e438" (UID: "443bb4b6-0b3c-4824-b5b6-e2771eb8e438"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.725877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data" (OuterVolumeSpecName: "config-data") pod "443bb4b6-0b3c-4824-b5b6-e2771eb8e438" (UID: "443bb4b6-0b3c-4824-b5b6-e2771eb8e438"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.727428 5030 scope.go:117] "RemoveContainer" containerID="44eaef6b2a5abf1f99511a45ea107f9625dee6ea706b24c87e2b305edec32984" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.727980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl" (OuterVolumeSpecName: "kube-api-access-v55kl") pod "443bb4b6-0b3c-4824-b5b6-e2771eb8e438" (UID: "443bb4b6-0b3c-4824-b5b6-e2771eb8e438"). InnerVolumeSpecName "kube-api-access-v55kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.749673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "443bb4b6-0b3c-4824-b5b6-e2771eb8e438" (UID: "443bb4b6-0b3c-4824-b5b6-e2771eb8e438"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.765214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "443bb4b6-0b3c-4824-b5b6-e2771eb8e438" (UID: "443bb4b6-0b3c-4824-b5b6-e2771eb8e438"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.828289 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v55kl\" (UniqueName: \"kubernetes.io/projected/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kube-api-access-v55kl\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.828340 5030 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.828361 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.828381 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.828403 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443bb4b6-0b3c-4824-b5b6-e2771eb8e438-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.847352 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/cinder-api-0" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.111:8776/healthcheck\": read tcp 10.217.0.2:60360->10.217.0.111:8776: read: connection reset by peer" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.935497 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks59n\" (UniqueName: \"kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:16 crc kubenswrapper[5030]: I0120 23:58:16.935802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts\") pod \"keystone-d843-account-create-update-4x9f4\" (UID: \"b7a462d5-bc01-4f4d-a54e-10acd974e5a0\") " pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.935941 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.935998 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts 
podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:18.935981251 +0000 UTC m=+4971.256241549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : configmap "openstack-scripts" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.939160 5030 projected.go:194] Error preparing data for projected volume kube-api-access-ks59n for pod openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:16 crc kubenswrapper[5030]: E0120 23:58:16.939302 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n podName:b7a462d5-bc01-4f4d-a54e-10acd974e5a0 nodeName:}" failed. No retries permitted until 2026-01-20 23:58:18.93928124 +0000 UTC m=+4971.259541528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ks59n" (UniqueName: "kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n") pod "keystone-d843-account-create-update-4x9f4" (UID: "b7a462d5-bc01-4f4d-a54e-10acd974e5a0") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.005402 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.233:5671: connect: connection refused" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.017206 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.102:8778/\": read tcp 10.217.0.2:36936->10.217.0.102:8778: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.017489 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.102:8778/\": read tcp 10.217.0.2:36920->10.217.0.102:8778: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.042974 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.125:9292/healthcheck\": dial tcp 10.217.0.125:9292: connect: connection refused" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.043406 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/glance-default-external-api-0" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.125:9292/healthcheck\": dial tcp 10.217.0.125:9292: connect: connection refused" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.057354 5030 generic.go:334] "Generic (PLEG): container finished" podID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" 
containerID="55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411" exitCode=0 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.057404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" event={"ID":"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0","Type":"ContainerDied","Data":"55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.064306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" event={"ID":"9d61475e-3539-4e26-aac7-dfbde0ece687","Type":"ContainerDied","Data":"d3b6d97e1da10de00566bb84f7479d64264d5e7100980a7be0af4e9a85473d2e"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.064392 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj" Jan 20 23:58:17 crc kubenswrapper[5030]: E0120 23:58:17.100213 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f8fa8f5_9f43_489c_b61d_668187fe87ea.slice/crio-conmon-a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf.scope\": RecentStats: unable to find data in memory cache]" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.105896 5030 generic.go:334] "Generic (PLEG): container finished" podID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerID="cf937266b65a833ea2519d5e345ec986e9d92e36c9c09cd00ad3dbe11dbc94df" exitCode=0 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.105955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerDied","Data":"cf937266b65a833ea2519d5e345ec986e9d92e36c9c09cd00ad3dbe11dbc94df"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.114134 5030 generic.go:334] "Generic (PLEG): container finished" podID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerID="a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb" exitCode=0 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.114397 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"443bb4b6-0b3c-4824-b5b6-e2771eb8e438","Type":"ContainerDied","Data":"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.114414 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/memcached-0" event={"ID":"443bb4b6-0b3c-4824-b5b6-e2771eb8e438","Type":"ContainerDied","Data":"46d12a70b8b856cba286119e3a876938998f065df7bb984b6d3fc83283ebda7c"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.114447 5030 scope.go:117] "RemoveContainer" containerID="a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.114558 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/memcached-0" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.176173 5030 generic.go:334] "Generic (PLEG): container finished" podID="732faf79-c224-4a2a-bc39-07343926d772" containerID="c349fa99a47e05f4423344350c4f8ba2fbdc8b3c193473666cc6af2498e4f85c" exitCode=0 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.176241 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerDied","Data":"c349fa99a47e05f4423344350c4f8ba2fbdc8b3c193473666cc6af2498e4f85c"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.188258 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerID="3a64e91d5d48c05cb735d0cb229c7481a1d20317e71c2a80aafc083cefa1b5aa" exitCode=143 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.188317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerDied","Data":"3a64e91d5d48c05cb735d0cb229c7481a1d20317e71c2a80aafc083cefa1b5aa"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.188339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerStarted","Data":"9edc9b5e1d0d3cbd3ae7e5e30adbd2b6dd6a00cb85691efc291253101a274524"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.189598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" event={"ID":"5b1faee2-019f-4357-ab94-2680305278c7","Type":"ContainerDied","Data":"3b14fcf4f9ebbb090048b23a19a4990a1d35587510d989a42ea37aa1375e75d7"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.189693 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.204368 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerID="a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf" exitCode=0 Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.204428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerDied","Data":"a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.205401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" event={"ID":"28827f2e-866b-47b6-a6bb-d38ef943bd09","Type":"ContainerDied","Data":"43d489b5deede9eb716e0a497359ce24ce0437af098585ddda118a92b1a96879"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.205469 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-5889-account-create-update-7spqm" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.227504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" event={"ID":"467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3","Type":"ContainerDied","Data":"ae384507090a42030e78757bdadb897b825a52aa454163bbcc90fe249c7e85ea"} Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.227590 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.234330 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.149:8774/\": read tcp 10.217.0.2:56820->10.217.0.149:8774: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.234407 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-api-0" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.149:8774/\": read tcp 10.217.0.2:56816->10.217.0.149:8774: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.237180 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.237650 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-54769bd5cb-djssm" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.238069 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4" Jan 20 23:58:17 crc kubenswrapper[5030]: E0120 23:58:17.244984 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:17 crc kubenswrapper[5030]: E0120 23:58:17.245041 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data podName:19f6003c-3d52-44e9-a5a8-d525e4b6a79e nodeName:}" failed. No retries permitted until 2026-01-20 23:58:21.245027737 +0000 UTC m=+4973.565288015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data") pod "rabbitmq-cell1-server-0" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.276884 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": read tcp 10.217.0.2:54312->10.217.0.147:8775: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.277334 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/nova-metadata-0" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.147:8775/\": read tcp 10.217.0.2:54296->10.217.0.147:8775: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.559782 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.600062 5030 scope.go:117] "RemoveContainer" containerID="a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.601407 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:58:17 crc kubenswrapper[5030]: E0120 23:58:17.601706 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb\": container with ID starting with a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb not found: ID does not exist" containerID="a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.601872 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb"} err="failed to get container status \"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb\": rpc error: code = NotFound desc = could not find container \"a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb\": container with ID starting with a8bc8a463a575efe51b8f28d2ac8ab506a95e14b2c50dc6fee036bbc5a49a1bb not found: ID does not exist" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.658446 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.110:9311/healthcheck\": read tcp 10.217.0.2:41394->10.217.0.110:9311: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.659262 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.110:9311/healthcheck\": read tcp 10.217.0.2:41384->10.217.0.110:9311: read: connection reset by peer" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.662966 
5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq5xp\" (UniqueName: \"kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663011 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663162 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.663367 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle\") pod \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\" (UID: \"15f641b8-896e-47f1-b6b9-a2ced1fb3808\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.664534 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.664911 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs" (OuterVolumeSpecName: "logs") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.676899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp" (OuterVolumeSpecName: "kube-api-access-xq5xp") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "kube-api-access-xq5xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.677031 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts" (OuterVolumeSpecName: "scripts") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.682185 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.682503 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.684247 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.708816 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-d843-account-create-update-4x9f4"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.730651 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.743004 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-4d16-account-create-update-g7nxj"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.765036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phlwp\" (UniqueName: \"kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767493 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767705 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767884 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.767954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768083 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768428 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data\") pod \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\" (UID: \"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768492 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.768594 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts\") pod \"732faf79-c224-4a2a-bc39-07343926d772\" (UID: \"732faf79-c224-4a2a-bc39-07343926d772\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769175 5030 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15f641b8-896e-47f1-b6b9-a2ced1fb3808-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769260 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq5xp\" (UniqueName: \"kubernetes.io/projected/15f641b8-896e-47f1-b6b9-a2ced1fb3808-kube-api-access-xq5xp\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769354 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769432 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769505 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15f641b8-896e-47f1-b6b9-a2ced1fb3808-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.769386 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.771121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.773856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-9d9a-account-create-update-69r22"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.774798 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.775376 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs" (OuterVolumeSpecName: "logs") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.776293 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.788844 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts" (OuterVolumeSpecName: "scripts") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.790921 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8" (OuterVolumeSpecName: "kube-api-access-jv2m8") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "kube-api-access-jv2m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.820988 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.834988 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.845338 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f995c9cbb-dqqv6"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.852665 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.856467 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/memcached-0"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.858696 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp" (OuterVolumeSpecName: "kube-api-access-phlwp") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "kube-api-access-phlwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878307 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878770 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878781 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878792 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks59n\" (UniqueName: \"kubernetes.io/projected/b7a462d5-bc01-4f4d-a54e-10acd974e5a0-kube-api-access-ks59n\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878804 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878814 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878835 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732faf79-c224-4a2a-bc39-07343926d772-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878845 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phlwp\" (UniqueName: \"kubernetes.io/projected/732faf79-c224-4a2a-bc39-07343926d772-kube-api-access-phlwp\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878854 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/732faf79-c224-4a2a-bc39-07343926d772-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.878863 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv2m8\" (UniqueName: \"kubernetes.io/projected/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-kube-api-access-jv2m8\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.879214 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.883418 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "mysql-db") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.885378 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-5ec4-account-create-update-nqqrm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.901601 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-54769bd5cb-djssm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.907063 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-54769bd5cb-djssm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.918082 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.924460 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-5889-account-create-update-7spqm"] Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.955857 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.967612 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.979423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.981697 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02190451-840d-4e4e-8b67-a14ac1f9712c" path="/var/lib/kubelet/pods/02190451-840d-4e4e-8b67-a14ac1f9712c/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.982657 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c5eea5-bd6f-4b19-829d-32f4ef09997c" path="/var/lib/kubelet/pods/03c5eea5-bd6f-4b19-829d-32f4ef09997c/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.983780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09900067-cfe8-4543-8e6c-88c692d43ed8" path="/var/lib/kubelet/pods/09900067-cfe8-4543-8e6c-88c692d43ed8/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.985096 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c17006-1397-4119-bd32-0e07e78d5068" path="/var/lib/kubelet/pods/14c17006-1397-4119-bd32-0e07e78d5068/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.985799 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" path="/var/lib/kubelet/pods/231da45e-cd6f-435b-9209-0d8794fa33ad/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.986831 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28827f2e-866b-47b6-a6bb-d38ef943bd09" path="/var/lib/kubelet/pods/28827f2e-866b-47b6-a6bb-d38ef943bd09/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.987525 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" 
path="/var/lib/kubelet/pods/443bb4b6-0b3c-4824-b5b6-e2771eb8e438/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.988094 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3" path="/var/lib/kubelet/pods/467aa9cf-4fd9-44fa-bfc8-da0ecc62aab3/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.988588 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b1faee2-019f-4357-ab94-2680305278c7" path="/var/lib/kubelet/pods/5b1faee2-019f-4357-ab94-2680305278c7/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.989167 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" path="/var/lib/kubelet/pods/5bdb99fc-435f-4ea4-bd16-e04616518099/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.989369 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.988681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.989810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.989913 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.990088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.990299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.990385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdzkw\" (UniqueName: \"kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.990473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.990564 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts\") pod \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\" (UID: \"4f8fa8f5-9f43-489c-b61d-668187fe87ea\") " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.991521 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.991608 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901be408-dfd2-4cfa-9120-041671bd94ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.991703 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.992842 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66791d21-1d99-480a-b3fe-e93c4e20c047" path="/var/lib/kubelet/pods/66791d21-1d99-480a-b3fe-e93c4e20c047/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.993392 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901be408-dfd2-4cfa-9120-041671bd94ed" path="/var/lib/kubelet/pods/901be408-dfd2-4cfa-9120-041671bd94ed/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.994061 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d61475e-3539-4e26-aac7-dfbde0ece687" path="/var/lib/kubelet/pods/9d61475e-3539-4e26-aac7-dfbde0ece687/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.994459 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" path="/var/lib/kubelet/pods/a7305e60-aa68-4d37-a472-80060d0fb5cb/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.995514 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a462d5-bc01-4f4d-a54e-10acd974e5a0" path="/var/lib/kubelet/pods/b7a462d5-bc01-4f4d-a54e-10acd974e5a0/volumes" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.996087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:17 crc kubenswrapper[5030]: I0120 23:58:17.997078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs" (OuterVolumeSpecName: "logs") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:17.999678 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts" (OuterVolumeSpecName: "scripts") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.003055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw" (OuterVolumeSpecName: "kube-api-access-qdzkw") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "kube-api-access-qdzkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.003074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.036476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data" (OuterVolumeSpecName: "config-data") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.053992 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.076974 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093333 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093467 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksb6\" (UniqueName: \"kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093658 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts\") pod \"b714d540-db02-4fc2-ae0b-140a4cdab084\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") 
" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4m4\" (UniqueName: \"kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqknk\" (UniqueName: \"kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk\") pod \"b714d540-db02-4fc2-ae0b-140a4cdab084\" (UID: \"b714d540-db02-4fc2-ae0b-140a4cdab084\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.093981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs\") pod \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\" (UID: \"3f6a312c-eea2-4e52-8e9d-e61ab90cd620\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\" (UID: \"c3b71d0a-805d-4894-b6ff-c049baf96ad5\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094413 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094431 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8dg\" (UniqueName: \"kubernetes.io/projected/66791d21-1d99-480a-b3fe-e93c4e20c047-kube-api-access-dc8dg\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094441 5030 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094451 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094460 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094469 5030 reconciler_common.go:293] "Volume detached for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094488 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094499 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094509 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdzkw\" (UniqueName: \"kubernetes.io/projected/4f8fa8f5-9f43-489c-b61d-668187fe87ea-kube-api-access-qdzkw\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094519 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f8fa8f5-9f43-489c-b61d-668187fe87ea-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094528 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.094537 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66791d21-1d99-480a-b3fe-e93c4e20c047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.100393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.100885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b714d540-db02-4fc2-ae0b-140a4cdab084" (UID: "b714d540-db02-4fc2-ae0b-140a4cdab084"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.102631 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.105689 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs" (OuterVolumeSpecName: "logs") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.105787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs" (OuterVolumeSpecName: "logs") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.108035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts" (OuterVolumeSpecName: "scripts") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.108301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk" (OuterVolumeSpecName: "kube-api-access-wqknk") pod "b714d540-db02-4fc2-ae0b-140a4cdab084" (UID: "b714d540-db02-4fc2-ae0b-140a4cdab084"). InnerVolumeSpecName "kube-api-access-wqknk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.108404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6" (OuterVolumeSpecName: "kube-api-access-cksb6") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "kube-api-access-cksb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.108469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4" (OuterVolumeSpecName: "kube-api-access-mq4m4") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "kube-api-access-mq4m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.112812 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.142911 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.151351 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204240 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs\") pod \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle\") pod \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data\") pod \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204389 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs\") pod \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204449 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79n8c\" (UniqueName: \"kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c\") pod \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\" (UID: \"c9e3e45b-cb93-4ba4-9597-0e329f3d34df\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204822 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204836 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204856 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204865 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cksb6\" (UniqueName: \"kubernetes.io/projected/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-kube-api-access-cksb6\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204875 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204883 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204892 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204901 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b71d0a-805d-4894-b6ff-c049baf96ad5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204912 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b714d540-db02-4fc2-ae0b-140a4cdab084-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204921 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4m4\" (UniqueName: \"kubernetes.io/projected/c3b71d0a-805d-4894-b6ff-c049baf96ad5-kube-api-access-mq4m4\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqknk\" (UniqueName: \"kubernetes.io/projected/b714d540-db02-4fc2-ae0b-140a4cdab084-kube-api-access-wqknk\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.204943 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.206174 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs" (OuterVolumeSpecName: "logs") pod "c9e3e45b-cb93-4ba4-9597-0e329f3d34df" (UID: "c9e3e45b-cb93-4ba4-9597-0e329f3d34df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.246736 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerID="cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.246788 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-api-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.246831 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerDied","Data":"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.246857 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-api-0" event={"ID":"3f6a312c-eea2-4e52-8e9d-e61ab90cd620","Type":"ContainerDied","Data":"b49fcb5e7e6960cbdf8e2b132518ee79619728e60c50fcd34aaa5a627d9c556d"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.246874 5030 scope.go:117] "RemoveContainer" containerID="cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.253748 5030 generic.go:334] "Generic (PLEG): container finished" podID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerID="f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.253789 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-metadata-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.253818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerDied","Data":"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.253847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-metadata-0" event={"ID":"c9e3e45b-cb93-4ba4-9597-0e329f3d34df","Type":"ContainerDied","Data":"75237cdcdbee17e1679e3839f90a9c44762d8f40102fa69be2dc860bd81e97a7"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.257942 5030 generic.go:334] "Generic (PLEG): container finished" podID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerID="cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.257977 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-external-api-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.257996 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerDied","Data":"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.258014 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-external-api-0" event={"ID":"c3b71d0a-805d-4894-b6ff-c049baf96ad5","Type":"ContainerDied","Data":"302eb9dae515d805df345359bb644f217048d87f18a8428bf72865f4546c50bd"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.259905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" event={"ID":"b714d540-db02-4fc2-ae0b-140a4cdab084","Type":"ContainerDied","Data":"a3b6c3ca786f2a117b425576c27f17c9e7d8b0ee9231b09f7729ff71498a3f4f"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.259963 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/root-account-create-update-zlxnk" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.264664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" event={"ID":"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0","Type":"ContainerStarted","Data":"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.264821 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.267438 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/cinder-api-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.267548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/cinder-api-0" event={"ID":"15f641b8-896e-47f1-b6b9-a2ced1fb3808","Type":"ContainerDied","Data":"51f0805c6390247f2ae841b8e1ff556a76fa7b50d0cbf1368262d075ed69acf5"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.269904 5030 generic.go:334] "Generic (PLEG): container finished" podID="5915461a-f712-499e-ba04-bab78e49188f" containerID="8df38ebfb58a2a5a0af6b5b510d94cbadd6881af9e85606c92acec410f4bba95" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.269958 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerDied","Data":"8df38ebfb58a2a5a0af6b5b510d94cbadd6881af9e85606c92acec410f4bba95"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.271694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/openstack-galera-0" event={"ID":"732faf79-c224-4a2a-bc39-07343926d772","Type":"ContainerDied","Data":"897bbc0f6d4b7187b341c23854f60881affd5828a7816b6d108b9cee8f44325d"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.271724 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/openstack-galera-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.273431 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerID="298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.273467 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.273533 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerDied","Data":"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.273556 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/placement-7cd488ddbd-4gzn9" event={"ID":"6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f","Type":"ContainerDied","Data":"e2eee43ae98678198dc94a3bab964617abea148f3a77ad06f4b67562e642007a"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.279884 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerID="dcc49b3097e7c3881a9e6d16ca2206768b7f3632eb827aa614fc3cf5b9335766" exitCode=0 Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.280916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerDied","Data":"dcc49b3097e7c3881a9e6d16ca2206768b7f3632eb827aa614fc3cf5b9335766"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.286404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/glance-default-internal-api-0" event={"ID":"4f8fa8f5-9f43-489c-b61d-668187fe87ea","Type":"ContainerDied","Data":"67cb62bd95631dc7f87521e63278f9924538b335f0d365ecc6dc88087e9fd5f6"} Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.286514 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/glance-default-internal-api-0" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.304700 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" podStartSLOduration=5.304679243 podStartE2EDuration="5.304679243s" podCreationTimestamp="2026-01-20 23:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:58:18.279092843 +0000 UTC m=+4970.599353131" watchObservedRunningTime="2026-01-20 23:58:18.304679243 +0000 UTC m=+4970.624939531" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.306827 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.343888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c" (OuterVolumeSpecName: "kube-api-access-79n8c") pod "c9e3e45b-cb93-4ba4-9597-0e329f3d34df" (UID: "c9e3e45b-cb93-4ba4-9597-0e329f3d34df"). InnerVolumeSpecName "kube-api-access-79n8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.357141 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.371373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732faf79-c224-4a2a-bc39-07343926d772" (UID: "732faf79-c224-4a2a-bc39-07343926d772"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.381590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.403338 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data" (OuterVolumeSpecName: "config-data") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.403746 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/rabbitmq-server-0" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.234:5671: connect: connection refused" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.409886 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.410117 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79n8c\" (UniqueName: \"kubernetes.io/projected/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-kube-api-access-79n8c\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.410145 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732faf79-c224-4a2a-bc39-07343926d772-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.410160 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.410257 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.417595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod 
"15f641b8-896e-47f1-b6b9-a2ced1fb3808" (UID: "15f641b8-896e-47f1-b6b9-a2ced1fb3808"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.423565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.435159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.437290 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.445373 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data" (OuterVolumeSpecName: "config-data") pod "c9e3e45b-cb93-4ba4-9597-0e329f3d34df" (UID: "c9e3e45b-cb93-4ba4-9597-0e329f3d34df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.447314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.453505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data" (OuterVolumeSpecName: "config-data") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.456287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e3e45b-cb93-4ba4-9597-0e329f3d34df" (UID: "c9e3e45b-cb93-4ba4-9597-0e329f3d34df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.456938 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data" (OuterVolumeSpecName: "config-data") pod "4f8fa8f5-9f43-489c-b61d-668187fe87ea" (UID: "4f8fa8f5-9f43-489c-b61d-668187fe87ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.457048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data" (OuterVolumeSpecName: "config-data") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.462981 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.464956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c9e3e45b-cb93-4ba4-9597-0e329f3d34df" (UID: "c9e3e45b-cb93-4ba4-9597-0e329f3d34df"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.472255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.472290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3f6a312c-eea2-4e52-8e9d-e61ab90cd620" (UID: "3f6a312c-eea2-4e52-8e9d-e61ab90cd620"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.480930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" (UID: "6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.489590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3b71d0a-805d-4894-b6ff-c049baf96ad5" (UID: "c3b71d0a-805d-4894-b6ff-c049baf96ad5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512778 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15f641b8-896e-47f1-b6b9-a2ced1fb3808-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512812 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512822 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512832 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512842 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512854 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3b71d0a-805d-4894-b6ff-c049baf96ad5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512866 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512878 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512890 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512902 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512916 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512929 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512937 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 
23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512946 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e3e45b-cb93-4ba4-9597-0e329f3d34df-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512955 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f6a312c-eea2-4e52-8e9d-e61ab90cd620-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.512964 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f8fa8f5-9f43-489c-b61d-668187fe87ea-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.616432 5030 scope.go:117] "RemoveContainer" containerID="80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.624705 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.665797 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.669492 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.669699 5030 scope.go:117] "RemoveContainer" containerID="cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.672035 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc\": container with ID starting with cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc not found: ID does not exist" containerID="cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.672080 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc"} err="failed to get container status \"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc\": rpc error: code = NotFound desc = could not find container \"cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc\": container with ID starting with cb9ad803e0eda1434ca9fd4b5fbefe7f4ffd0908214c35cf856e52de1ddd73bc not found: ID does not exist" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.672201 5030 scope.go:117] "RemoveContainer" containerID="80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.673882 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23\": container with ID starting with 80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23 not found: ID does not exist" containerID="80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.673915 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23"} err="failed to get container status \"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23\": rpc error: code = NotFound desc = could not find container \"80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23\": container with ID starting with 80d29224c06a128d8eed095ff5e5a28ed9ae45635587e8e737f90ba851398c23 not found: ID does not exist" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.673934 5030 scope.go:117] "RemoveContainer" containerID="f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.674024 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.674048 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-cell0-conductor-0" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerName="nova-cell0-conductor-conductor" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.674413 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.720013 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.722326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.722425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.723982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slr56\" (UniqueName: \"kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724376 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724612 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724711 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.724906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.725117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ccqd\" (UniqueName: \"kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.725342 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd\") pod \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\" (UID: \"8ffece47-af44-49bc-b3b4-6e6c24e6589e\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.725630 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.727661 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-metadata-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.728237 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs" (OuterVolumeSpecName: "logs") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.725841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle\") pod \"5915461a-f712-499e-ba04-bab78e49188f\" (UID: \"5915461a-f712-499e-ba04-bab78e49188f\") " Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.730335 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5915461a-f712-499e-ba04-bab78e49188f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.736206 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.736446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.737002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.737438 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.738831 5030 scope.go:117] "RemoveContainer" containerID="46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.742712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd" (OuterVolumeSpecName: "kube-api-access-8ccqd") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "kube-api-access-8ccqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.743782 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/openstack-galera-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.743855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56" (OuterVolumeSpecName: "kube-api-access-slr56") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "kube-api-access-slr56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.748068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.754997 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/cinder-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.762873 5030 scope.go:117] "RemoveContainer" containerID="f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.764253 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4\": container with ID starting with f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4 not found: ID does not exist" containerID="f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.764340 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4"} err="failed to get container status \"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4\": rpc error: code = NotFound desc = could not find container \"f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4\": container with ID starting with f5c35f53ada49fe2d99f4e9e01f82911c9f6197814cbb265768499364c69f9b4 not found: ID does not exist" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.764386 5030 scope.go:117] "RemoveContainer" containerID="46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e" Jan 20 23:58:18 crc kubenswrapper[5030]: E0120 23:58:18.764851 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e\": container with ID starting with 46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e not found: ID does not exist" containerID="46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.764869 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e"} err="failed to get container status \"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e\": rpc error: code = NotFound desc = could not find container \"46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e\": container with ID starting with 46849f5297b3e94a8c0334dc522aedd518e42402f9b1b4cc2bcf817e42c3141e not found: ID does not exist" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.764904 5030 scope.go:117] "RemoveContainer" containerID="cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.765014 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.773876 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-internal-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.793633 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.804426 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/root-account-create-update-zlxnk"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.814878 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.818723 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.825033 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.829446 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/glance-default-external-api-0"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.832051 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slr56\" (UniqueName: \"kubernetes.io/projected/5915461a-f712-499e-ba04-bab78e49188f-kube-api-access-slr56\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.832073 5030 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.832084 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ccqd\" (UniqueName: \"kubernetes.io/projected/8ffece47-af44-49bc-b3b4-6e6c24e6589e-kube-api-access-8ccqd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.832093 5030 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ffece47-af44-49bc-b3b4-6e6c24e6589e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.832102 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.839344 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.845493 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/placement-7cd488ddbd-4gzn9"] Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.951530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts" (OuterVolumeSpecName: "scripts") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.975830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.979927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.982171 5030 scope.go:117] "RemoveContainer" containerID="8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1" Jan 20 23:58:18 crc kubenswrapper[5030]: I0120 23:58:18.984558 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.010875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data" (OuterVolumeSpecName: "config-data") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035236 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kktd5\" (UniqueName: \"kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035802 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.035902 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.036347 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.036526 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts\") pod \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\" (UID: \"6a35511b-3fcd-40b1-8657-c26c6bf69e50\") " Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.042283 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.042306 5030 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.042316 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.042326 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.045431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.046002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.047337 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts" (OuterVolumeSpecName: "scripts") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.054453 5030 scope.go:117] "RemoveContainer" containerID="cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.054907 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce\": container with ID starting with cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce not found: ID does not exist" containerID="cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.054933 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce"} err="failed to get container status \"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce\": rpc error: code = NotFound desc = could not find container \"cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce\": container with ID starting with cfb03bd09901565aba481629836ae75c34742d2b796a614f9812657af674d0ce not found: ID does not exist" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.054950 5030 scope.go:117] "RemoveContainer" containerID="8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.057221 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1\": container with ID starting with 8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1 not found: ID does not exist" containerID="8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.057276 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1"} err="failed to get container status \"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1\": rpc error: code = NotFound desc = could not find container \"8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1\": container with ID starting with 8df5d7b4dd58e28f0b12d08b3d7eee031074f3eaa5bb04801e96f40173a4f7f1 not found: ID does not exist" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.057290 5030 scope.go:117] "RemoveContainer" containerID="cf937266b65a833ea2519d5e345ec986e9d92e36c9c09cd00ad3dbe11dbc94df" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.062686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.070606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.076691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.081951 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5" (OuterVolumeSpecName: "kube-api-access-kktd5") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "kube-api-access-kktd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.090735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5915461a-f712-499e-ba04-bab78e49188f" (UID: "5915461a-f712-499e-ba04-bab78e49188f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.100096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.114994 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data" (OuterVolumeSpecName: "config-data") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.121359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.133248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a35511b-3fcd-40b1-8657-c26c6bf69e50" (UID: "6a35511b-3fcd-40b1-8657-c26c6bf69e50"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.135590 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data" (OuterVolumeSpecName: "config-data") pod "8ffece47-af44-49bc-b3b4-6e6c24e6589e" (UID: "8ffece47-af44-49bc-b3b4-6e6c24e6589e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144838 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kktd5\" (UniqueName: \"kubernetes.io/projected/6a35511b-3fcd-40b1-8657-c26c6bf69e50-kube-api-access-kktd5\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144872 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144891 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144904 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144917 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144929 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144941 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144953 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144965 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144978 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5915461a-f712-499e-ba04-bab78e49188f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.144990 5030 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.145106 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a35511b-3fcd-40b1-8657-c26c6bf69e50-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.145119 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ffece47-af44-49bc-b3b4-6e6c24e6589e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:19 
crc kubenswrapper[5030]: I0120 23:58:19.225323 5030 scope.go:117] "RemoveContainer" containerID="2bbc2f8fdbf5ffee6d3a44f5227cab09b42416f5ef6e76d016d8424b2478caa8" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.251375 5030 scope.go:117] "RemoveContainer" containerID="c349fa99a47e05f4423344350c4f8ba2fbdc8b3c193473666cc6af2498e4f85c" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.273573 5030 scope.go:117] "RemoveContainer" containerID="b67de712f93a9f90d450a8ee5ca472bfc03b00b231f684c27913cba97b33dc21" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.300320 5030 scope.go:117] "RemoveContainer" containerID="298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.314559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" event={"ID":"5915461a-f712-499e-ba04-bab78e49188f","Type":"ContainerDied","Data":"a37c9f9c679f33c6c1ef9875abf21a9359cc5347c8c0bdc8c43f4516070489e2"} Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.314649 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.321162 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ceilometer-0" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.321161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ceilometer-0" event={"ID":"8ffece47-af44-49bc-b3b4-6e6c24e6589e","Type":"ContainerDied","Data":"1827ef65f6b7145ccfc452dc1cdf99466a4d8ad8ffa3665d791ad890a26d59b1"} Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.324909 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" containerID="b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65" exitCode=0 Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.324982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" event={"ID":"6a35511b-3fcd-40b1-8657-c26c6bf69e50","Type":"ContainerDied","Data":"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65"} Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.325068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" event={"ID":"6a35511b-3fcd-40b1-8657-c26c6bf69e50","Type":"ContainerDied","Data":"67fab0b44a9cc5d4947b05475b62d1bdf5403f745040c339e3b475c527c80ffb"} Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.325159 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.373482 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.381417 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ceilometer-0"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.381571 5030 scope.go:117] "RemoveContainer" containerID="0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.406084 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.407688 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.408348 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.408944 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.409019 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-kuttl-tests/nova-scheduler-0" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerName="nova-scheduler-scheduler" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.410428 5030 scope.go:117] "RemoveContainer" containerID="298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.410775 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131\": container with ID starting with 298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131 not found: ID does not exist" containerID="298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.410816 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131"} err="failed to get container status \"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131\": rpc error: code = NotFound desc = could not find container \"298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131\": container with ID starting with 298b4e1352b60ce437b507a18333d12357a166ee55c90098fbb3bb0047c27131 not found: ID does not 
exist" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.410851 5030 scope.go:117] "RemoveContainer" containerID="0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.411385 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b\": container with ID starting with 0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b not found: ID does not exist" containerID="0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.411430 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b"} err="failed to get container status \"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b\": rpc error: code = NotFound desc = could not find container \"0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b\": container with ID starting with 0f4f42344c283426fc23c3e24b986e08ac56986fe44233a22ed80905aaf6f05b not found: ID does not exist" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.411461 5030 scope.go:117] "RemoveContainer" containerID="a214f98f4f8cf3fbcb2676eb84babdfaed14af092e194e107984c7c352cadedf" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.416281 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/keystone-5c4f5f45b8-vhvkx"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.428210 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.436932 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-api-77b4fdf4bb-v8fpr"] Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.438596 5030 scope.go:117] "RemoveContainer" containerID="3cc6b0b1ef67cf3ed623ef331af4f0711ba50f68fe18f908775fb4147e800fbd" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.471035 5030 scope.go:117] "RemoveContainer" containerID="8df38ebfb58a2a5a0af6b5b510d94cbadd6881af9e85606c92acec410f4bba95" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.496255 5030 scope.go:117] "RemoveContainer" containerID="ac52bbaae9279d327d926eae11a2177bd95c95212a5758e0429fa5341d1b67fc" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.528584 5030 scope.go:117] "RemoveContainer" containerID="fd95c8d95d19d605865a5d57c895aa4658929f6fbbc1cd7b2f7dcb7695715783" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.554194 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.554256 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data podName:4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf nodeName:}" failed. No retries permitted until 2026-01-20 23:58:27.554237466 +0000 UTC m=+4979.874497774 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data") pod "rabbitmq-server-0" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf") : configmap "rabbitmq-config-data" not found Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.671449 5030 scope.go:117] "RemoveContainer" containerID="3aab7494f35c8d53bed3be64a7d47121e15d43ba605ba4d9c3f83abc05cd8653" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.688153 5030 scope.go:117] "RemoveContainer" containerID="dcc49b3097e7c3881a9e6d16ca2206768b7f3632eb827aa614fc3cf5b9335766" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.710275 5030 scope.go:117] "RemoveContainer" containerID="85c6e0e15e78a3153d0dd557c849ff4f4ff72da720eb6c506baa0d7377395508" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.726523 5030 scope.go:117] "RemoveContainer" containerID="b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.752794 5030 scope.go:117] "RemoveContainer" containerID="b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65" Jan 20 23:58:19 crc kubenswrapper[5030]: E0120 23:58:19.753203 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65\": container with ID starting with b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65 not found: ID does not exist" containerID="b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.753236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65"} err="failed to get container status \"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65\": rpc error: code = NotFound desc = could not find container \"b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65\": container with ID starting with b59655e0dbc36eb3cf361d0cf1ec929cafefa9196ef964eb00f393988ef15b65 not found: ID does not exist" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.989974 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" path="/var/lib/kubelet/pods/15f641b8-896e-47f1-b6b9-a2ced1fb3808/volumes" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.992825 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" path="/var/lib/kubelet/pods/3f6a312c-eea2-4e52-8e9d-e61ab90cd620/volumes" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.994803 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" path="/var/lib/kubelet/pods/4f8fa8f5-9f43-489c-b61d-668187fe87ea/volumes" Jan 20 23:58:19 crc kubenswrapper[5030]: I0120 23:58:19.998549 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5915461a-f712-499e-ba04-bab78e49188f" path="/var/lib/kubelet/pods/5915461a-f712-499e-ba04-bab78e49188f/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.001056 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" path="/var/lib/kubelet/pods/6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.003972 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" path="/var/lib/kubelet/pods/6a35511b-3fcd-40b1-8657-c26c6bf69e50/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.006205 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732faf79-c224-4a2a-bc39-07343926d772" path="/var/lib/kubelet/pods/732faf79-c224-4a2a-bc39-07343926d772/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.008068 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" path="/var/lib/kubelet/pods/8ffece47-af44-49bc-b3b4-6e6c24e6589e/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.011990 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b714d540-db02-4fc2-ae0b-140a4cdab084" path="/var/lib/kubelet/pods/b714d540-db02-4fc2-ae0b-140a4cdab084/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.012990 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" path="/var/lib/kubelet/pods/c3b71d0a-805d-4894-b6ff-c049baf96ad5/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.014769 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" path="/var/lib/kubelet/pods/c9e3e45b-cb93-4ba4-9597-0e329f3d34df/volumes" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.361164 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_44f0a514-9f01-4768-aeb4-93de0354b738/ovn-northd/0.log" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.361581 5030 generic.go:334] "Generic (PLEG): container finished" podID="44f0a514-9f01-4768-aeb4-93de0354b738" containerID="014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" exitCode=139 Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.361682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerDied","Data":"014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184"} Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.861119 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_44f0a514-9f01-4768-aeb4-93de0354b738/ovn-northd/0.log" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.861265 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmftk\" (UniqueName: \"kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.885557 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir\") pod \"44f0a514-9f01-4768-aeb4-93de0354b738\" (UID: \"44f0a514-9f01-4768-aeb4-93de0354b738\") " Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.886466 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.888235 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config" (OuterVolumeSpecName: "config") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.888203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts" (OuterVolumeSpecName: "scripts") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.913000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk" (OuterVolumeSpecName: "kube-api-access-jmftk") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "kube-api-access-jmftk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.926779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.988369 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.988397 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.988408 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.988417 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44f0a514-9f01-4768-aeb4-93de0354b738-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.988426 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmftk\" (UniqueName: \"kubernetes.io/projected/44f0a514-9f01-4768-aeb4-93de0354b738-kube-api-access-jmftk\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:20 crc kubenswrapper[5030]: I0120 23:58:20.996654 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.089607 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.096536 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "44f0a514-9f01-4768-aeb4-93de0354b738" (UID: "44f0a514-9f01-4768-aeb4-93de0354b738"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.138702 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.190828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.190888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6nb\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191893 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191950 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.191989 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.192017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins\") pod \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\" (UID: \"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.192318 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/44f0a514-9f01-4768-aeb4-93de0354b738-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.192679 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.193248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.193380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.205685 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info" (OuterVolumeSpecName: "pod-info") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.205752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb" (OuterVolumeSpecName: "kube-api-access-rs6nb") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "kube-api-access-rs6nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.206329 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "persistence") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.206574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.207070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.245871 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data" (OuterVolumeSpecName: "config-data") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.284892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf" (OuterVolumeSpecName: "server-conf") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.294369 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.294655 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295289 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295364 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295455 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295521 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295587 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs6nb\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-kube-api-access-rs6nb\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295666 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295744 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.295809 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: E0120 23:58:21.294757 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:21 crc kubenswrapper[5030]: E0120 23:58:21.296066 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data podName:19f6003c-3d52-44e9-a5a8-d525e4b6a79e nodeName:}" failed. No retries permitted until 2026-01-20 23:58:29.296022739 +0000 UTC m=+4981.616283027 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data") pod "rabbitmq-cell1-server-0" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e") : configmap "rabbitmq-cell1-config-data" not found Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.312895 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.348280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" (UID: "4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.374370 5030 generic.go:334] "Generic (PLEG): container finished" podID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerID="816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138" exitCode=0 Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.374444 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-server-0" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.374493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerDied","Data":"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138"} Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.374554 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-server-0" event={"ID":"4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf","Type":"ContainerDied","Data":"5565a4cb313abfdb07d83531115ca587d3e85dfdf2eabf7388b1f5bb1215b92a"} Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.374599 5030 scope.go:117] "RemoveContainer" containerID="816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.379100 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_ovn-northd-0_44f0a514-9f01-4768-aeb4-93de0354b738/ovn-northd/0.log" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.379267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-northd-0" event={"ID":"44f0a514-9f01-4768-aeb4-93de0354b738","Type":"ContainerDied","Data":"b1c0010d21784fca0d0cfe4d45772d4e1220c271927f22732e828be4e0e42ff3"} Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.379285 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-northd-0" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.382518 5030 generic.go:334] "Generic (PLEG): container finished" podID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerID="016af194f1eb4817af713ac5bfe3322cf5ee11ad44e3b6dcfc50cfc5c79e5f55" exitCode=0 Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.382558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerDied","Data":"016af194f1eb4817af713ac5bfe3322cf5ee11ad44e3b6dcfc50cfc5c79e5f55"} Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.397445 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.397485 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.406082 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.412311 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-server-0"] Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.412393 5030 scope.go:117] "RemoveContainer" containerID="af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.428683 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.432001 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-northd-0"] Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.437035 5030 scope.go:117] "RemoveContainer" containerID="816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138" Jan 20 23:58:21 crc kubenswrapper[5030]: E0120 23:58:21.437658 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138\": container with ID starting with 816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138 not found: ID does not exist" containerID="816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.437715 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138"} err="failed to get container status \"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138\": rpc error: code = NotFound desc = could not find container \"816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138\": container with ID starting with 816197e9ac2adae78cad4ecd0194c5c8ad4d46b8179549ee00873742d6bf4138 not found: ID does not exist" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.437745 5030 scope.go:117] "RemoveContainer" containerID="af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da" Jan 20 23:58:21 crc kubenswrapper[5030]: E0120 23:58:21.438117 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da\": container with ID starting with af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da not found: ID does not exist" containerID="af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.438150 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da"} err="failed to get container status \"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da\": rpc error: code = NotFound desc = could not find container \"af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da\": container with ID starting with af0b95855676feb71cc4038a460e384b867378a3685449ad4e3a268dd14282da not found: ID does not exist" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.438168 5030 scope.go:117] "RemoveContainer" containerID="39b13d4627425616c7d43237285304e39a1350f2af6bf2588bc9f6f2e89e8a28" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.447468 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.464320 5030 scope.go:117] "RemoveContainer" containerID="014d4f347fd703b2cd055c19f820360447ce9d879a2dd7808843d16160084184" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498633 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498683 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x8ns\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498752 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498800 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: 
\"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.498967 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.499003 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins\") pod \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\" (UID: \"19f6003c-3d52-44e9-a5a8-d525e4b6a79e\") " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.499733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.501550 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.501700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.504045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.507047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns" (OuterVolumeSpecName: "kube-api-access-9x8ns") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "kube-api-access-9x8ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.507116 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.507913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info" (OuterVolumeSpecName: "pod-info") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.516943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data" (OuterVolumeSpecName: "config-data") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.517143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.543530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf" (OuterVolumeSpecName: "server-conf") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.571714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "19f6003c-3d52-44e9-a5a8-d525e4b6a79e" (UID: "19f6003c-3d52-44e9-a5a8-d525e4b6a79e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601230 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601264 5030 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601274 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601284 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601314 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601324 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601335 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601344 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601356 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x8ns\" (UniqueName: \"kubernetes.io/projected/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-kube-api-access-9x8ns\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601364 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.601374 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/19f6003c-3d52-44e9-a5a8-d525e4b6a79e-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.626903 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.690911 5030 scope.go:117] "RemoveContainer" containerID="ec915196559782a71aca3bf84560183d6b635d2588eb3d679d36d2bb6b7a0c3b" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.702554 5030 reconciler_common.go:293] "Volume detached for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.971336 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.973353 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" path="/var/lib/kubelet/pods/44f0a514-9f01-4768-aeb4-93de0354b738/volumes" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.974992 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" path="/var/lib/kubelet/pods/4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf/volumes" Jan 20 23:58:21 crc kubenswrapper[5030]: I0120 23:58:21.979092 5030 scope.go:117] "RemoveContainer" containerID="02fae0c28694c60d27f62274bb757b8a4c5d3975c8580e05c8edede84309f354" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.001534 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.008966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data\") pod \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.009156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom\") pod \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.009278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs\") pod \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.009404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle\") pod \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.009975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf7vw\" (UniqueName: \"kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw\") pod \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\" (UID: \"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.009929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs" (OuterVolumeSpecName: "logs") pod "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" (UID: "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.010706 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.010959 5030 scope.go:117] "RemoveContainer" containerID="8031dce4c85b09f1d38bba5769584ff64416d46b3003775991a52eb69e88f932" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.039861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" (UID: "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.043468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw" (OuterVolumeSpecName: "kube-api-access-mf7vw") pod "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" (UID: "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e"). InnerVolumeSpecName "kube-api-access-mf7vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.051577 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" (UID: "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.057878 5030 scope.go:117] "RemoveContainer" containerID="048f7b423760b667b60e4906fd1a0700771bb5250eff55c5e498304ebcbb54a2" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.062010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data" (OuterVolumeSpecName: "config-data") pod "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" (UID: "04fe6346-a29a-4d3d-9cd1-5210fbe15b9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.111649 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom\") pod \"68728b46-0262-46f2-8eb0-a0301870b569\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.111723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data\") pod \"68728b46-0262-46f2-8eb0-a0301870b569\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.111751 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs\") pod \"68728b46-0262-46f2-8eb0-a0301870b569\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.111772 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle\") pod \"68728b46-0262-46f2-8eb0-a0301870b569\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.112144 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhsgg\" (UniqueName: \"kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg\") pod \"68728b46-0262-46f2-8eb0-a0301870b569\" (UID: \"68728b46-0262-46f2-8eb0-a0301870b569\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.112440 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.112453 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf7vw\" (UniqueName: \"kubernetes.io/projected/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-kube-api-access-mf7vw\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.112471 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.112481 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.113186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs" (OuterVolumeSpecName: "logs") pod "68728b46-0262-46f2-8eb0-a0301870b569" (UID: "68728b46-0262-46f2-8eb0-a0301870b569"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.117029 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg" (OuterVolumeSpecName: "kube-api-access-qhsgg") pod "68728b46-0262-46f2-8eb0-a0301870b569" (UID: "68728b46-0262-46f2-8eb0-a0301870b569"). InnerVolumeSpecName "kube-api-access-qhsgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.122423 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68728b46-0262-46f2-8eb0-a0301870b569" (UID: "68728b46-0262-46f2-8eb0-a0301870b569"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.123076 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.140271 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68728b46-0262-46f2-8eb0-a0301870b569" (UID: "68728b46-0262-46f2-8eb0-a0301870b569"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.169900 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data" (OuterVolumeSpecName: "config-data") pod "68728b46-0262-46f2-8eb0-a0301870b569" (UID: "68728b46-0262-46f2-8eb0-a0301870b569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.174295 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vts\" (UniqueName: \"kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts\") pod \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data\") pod \"faaf0cec-efea-4050-9195-ee0262c8deb8\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle\") pod \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h58bv\" (UniqueName: \"kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv\") pod \"faaf0cec-efea-4050-9195-ee0262c8deb8\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data\") pod \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\" (UID: \"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle\") pod \"faaf0cec-efea-4050-9195-ee0262c8deb8\" (UID: \"faaf0cec-efea-4050-9195-ee0262c8deb8\") " Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215974 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215988 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.215999 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68728b46-0262-46f2-8eb0-a0301870b569-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.216008 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68728b46-0262-46f2-8eb0-a0301870b569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.216018 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhsgg\" (UniqueName: \"kubernetes.io/projected/68728b46-0262-46f2-8eb0-a0301870b569-kube-api-access-qhsgg\") on node \"crc\" DevicePath \"\"" 
Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.218088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts" (OuterVolumeSpecName: "kube-api-access-77vts") pod "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" (UID: "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae"). InnerVolumeSpecName "kube-api-access-77vts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.220895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv" (OuterVolumeSpecName: "kube-api-access-h58bv") pod "faaf0cec-efea-4050-9195-ee0262c8deb8" (UID: "faaf0cec-efea-4050-9195-ee0262c8deb8"). InnerVolumeSpecName "kube-api-access-h58bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.232852 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data" (OuterVolumeSpecName: "config-data") pod "faaf0cec-efea-4050-9195-ee0262c8deb8" (UID: "faaf0cec-efea-4050-9195-ee0262c8deb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.234567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data" (OuterVolumeSpecName: "config-data") pod "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" (UID: "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.235579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faaf0cec-efea-4050-9195-ee0262c8deb8" (UID: "faaf0cec-efea-4050-9195-ee0262c8deb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.236119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" (UID: "3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317148 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317184 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317199 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h58bv\" (UniqueName: \"kubernetes.io/projected/faaf0cec-efea-4050-9195-ee0262c8deb8-kube-api-access-h58bv\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317210 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317220 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faaf0cec-efea-4050-9195-ee0262c8deb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.317233 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vts\" (UniqueName: \"kubernetes.io/projected/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae-kube-api-access-77vts\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.398852 5030 generic.go:334] "Generic (PLEG): container finished" podID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" exitCode=0 Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.398907 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"faaf0cec-efea-4050-9195-ee0262c8deb8","Type":"ContainerDied","Data":"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.398928 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-cell0-conductor-0" event={"ID":"faaf0cec-efea-4050-9195-ee0262c8deb8","Type":"ContainerDied","Data":"f6c6a5af6afb555f15628b039266fb078c6a385db1d91b4388d81bd0271b141e"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.398945 5030 scope.go:117] "RemoveContainer" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.399033 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-cell0-conductor-0" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.408359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" event={"ID":"19f6003c-3d52-44e9-a5a8-d525e4b6a79e","Type":"ContainerDied","Data":"5a7ae9d1d15217f57b3f31ac70fb64faeda0e9bb019643a188cdd9125c575ac0"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.408531 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/rabbitmq-cell1-server-0" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.413446 5030 generic.go:334] "Generic (PLEG): container finished" podID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerID="c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e" exitCode=0 Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.413555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerDied","Data":"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.413631 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" event={"ID":"04fe6346-a29a-4d3d-9cd1-5210fbe15b9e","Type":"ContainerDied","Data":"201453dbf37bc1a1dfbd34bb420f06869a1b1725a0e4a35cadb830ea2c57e931"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.413713 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.418445 5030 generic.go:334] "Generic (PLEG): container finished" podID="68728b46-0262-46f2-8eb0-a0301870b569" containerID="c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e" exitCode=0 Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.418530 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerDied","Data":"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.418569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" event={"ID":"68728b46-0262-46f2-8eb0-a0301870b569","Type":"ContainerDied","Data":"888527f66f4a0ef9165e4fd9dc2ad05e6aef6de2a802ffb8c80aba0cb8962da2"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.418538 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.421579 5030 generic.go:334] "Generic (PLEG): container finished" podID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" exitCode=0 Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.421639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae","Type":"ContainerDied","Data":"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.421663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-scheduler-0" event={"ID":"3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae","Type":"ContainerDied","Data":"7197c24f7f5f10cae29f0354ba85dac7e7241c44be681ac19130045f42fd8c8c"} Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.421675 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-scheduler-0" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.440608 5030 scope.go:117] "RemoveContainer" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.443817 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.444251 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37\": container with ID starting with 5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37 not found: ID does not exist" containerID="5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.444311 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37"} err="failed to get container status \"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37\": rpc error: code = NotFound desc = could not find container \"5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37\": container with ID starting with 5995a210d0824802621adb993b4765a5f9421f8f3c7d878cc4b1614e75330a37 not found: ID does not exist" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.444348 5030 scope.go:117] "RemoveContainer" containerID="016af194f1eb4817af713ac5bfe3322cf5ee11ad44e3b6dcfc50cfc5c79e5f55" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.448325 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/rabbitmq-cell1-server-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.500732 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.502150 5030 scope.go:117] "RemoveContainer" containerID="060f9ab9280053821ca2e9dbdfd76e7b2e79ed01983ce958e604a573ce49b483" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.515103 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-cell0-conductor-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.521544 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.528232 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-687887ddc7-x7m88"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.537278 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.542051 5030 scope.go:117] "RemoveContainer" containerID="c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.547724 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-keystone-listener-7f84774d9d-5g78l"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.553100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.557762 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/nova-scheduler-0"] Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.562795 5030 scope.go:117] "RemoveContainer" containerID="c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.587528 5030 scope.go:117] "RemoveContainer" containerID="c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e" Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.588191 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e\": container with ID starting with c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e not found: ID does not exist" containerID="c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.588224 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e"} err="failed to get container status \"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e\": rpc error: code = NotFound desc = could not find container \"c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e\": container with ID starting with c9f22341c20b80fbc92351565eeb1684e14881bf7d19d2e8e7a7402532a5420e not found: ID does not exist" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.588248 5030 scope.go:117] "RemoveContainer" containerID="c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028" Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.591675 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028\": container with ID starting with c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028 not found: ID does not exist" containerID="c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.591698 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028"} err="failed to get container status \"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028\": rpc error: code = NotFound desc = could not find container \"c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028\": container with ID starting with c73d1bfddd3255de91348a4a282edc41b3a6067265685d19999b944603a37028 not found: ID does not exist" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.591712 5030 scope.go:117] "RemoveContainer" containerID="c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.685078 5030 scope.go:117] "RemoveContainer" containerID="1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.720138 5030 scope.go:117] "RemoveContainer" containerID="c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e" Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.720692 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e\": container with ID starting with 
c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e not found: ID does not exist" containerID="c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.720770 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e"} err="failed to get container status \"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e\": rpc error: code = NotFound desc = could not find container \"c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e\": container with ID starting with c841a5010562e147fe1443063c1f51700f2ba446789e1d9cbd9424b883b66e6e not found: ID does not exist" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.720812 5030 scope.go:117] "RemoveContainer" containerID="1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99" Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.721545 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99\": container with ID starting with 1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99 not found: ID does not exist" containerID="1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.721577 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99"} err="failed to get container status \"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99\": rpc error: code = NotFound desc = could not find container \"1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99\": container with ID starting with 1cd498511cf2e3d79c9f51bfa7be8a47de25abc9f9cf591603a16ea099174d99 not found: ID does not exist" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.721599 5030 scope.go:117] "RemoveContainer" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.750886 5030 scope.go:117] "RemoveContainer" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" Jan 20 23:58:22 crc kubenswrapper[5030]: E0120 23:58:22.751386 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7\": container with ID starting with dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7 not found: ID does not exist" containerID="dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7" Jan 20 23:58:22 crc kubenswrapper[5030]: I0120 23:58:22.751655 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7"} err="failed to get container status \"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7\": rpc error: code = NotFound desc = could not find container \"dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7\": container with ID starting with dcb8d5c435e9bf4e1fa5f08a6d3f41ca65dbc6590cdec056aefdf3209c8aeda7 not found: ID does not exist" Jan 20 23:58:23 crc kubenswrapper[5030]: I0120 23:58:23.976927 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" path="/var/lib/kubelet/pods/04fe6346-a29a-4d3d-9cd1-5210fbe15b9e/volumes" Jan 20 23:58:23 crc kubenswrapper[5030]: I0120 23:58:23.978586 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" path="/var/lib/kubelet/pods/19f6003c-3d52-44e9-a5a8-d525e4b6a79e/volumes" Jan 20 23:58:23 crc kubenswrapper[5030]: I0120 23:58:23.979843 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" path="/var/lib/kubelet/pods/3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae/volumes" Jan 20 23:58:23 crc kubenswrapper[5030]: I0120 23:58:23.982082 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68728b46-0262-46f2-8eb0-a0301870b569" path="/var/lib/kubelet/pods/68728b46-0262-46f2-8eb0-a0301870b569/volumes" Jan 20 23:58:23 crc kubenswrapper[5030]: I0120 23:58:23.983550 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" path="/var/lib/kubelet/pods/faaf0cec-efea-4050-9195-ee0262c8deb8/volumes" Jan 20 23:58:24 crc kubenswrapper[5030]: I0120 23:58:24.285749 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:58:24 crc kubenswrapper[5030]: I0120 23:58:24.352087 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:58:24 crc kubenswrapper[5030]: I0120 23:58:24.352421 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="dnsmasq-dns" containerID="cri-o://5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c" gracePeriod=10 Jan 20 23:58:24 crc kubenswrapper[5030]: I0120 23:58:24.959339 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:58:24 crc kubenswrapper[5030]: I0120 23:58:24.966413 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.060805 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config\") pod \"b4804160-77d4-4618-91cc-85015bbed1a5\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.061723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") pod \"b4804160-77d4-4618-91cc-85015bbed1a5\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.061863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062031 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062303 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wtf6\" (UniqueName: \"kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062542 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config\") pod \"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062846 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc\") pod \"b4804160-77d4-4618-91cc-85015bbed1a5\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.062957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs\") pod 
\"e0a3039f-15d0-4764-bcdf-4833853ed168\" (UID: \"e0a3039f-15d0-4764-bcdf-4833853ed168\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.063088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ktj\" (UniqueName: \"kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj\") pod \"b4804160-77d4-4618-91cc-85015bbed1a5\" (UID: \"b4804160-77d4-4618-91cc-85015bbed1a5\") " Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.069641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj" (OuterVolumeSpecName: "kube-api-access-g9ktj") pod "b4804160-77d4-4618-91cc-85015bbed1a5" (UID: "b4804160-77d4-4618-91cc-85015bbed1a5"). InnerVolumeSpecName "kube-api-access-g9ktj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.076704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.078643 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6" (OuterVolumeSpecName: "kube-api-access-8wtf6") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "kube-api-access-8wtf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.101964 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.106521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.109561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config" (OuterVolumeSpecName: "config") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.121317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b4804160-77d4-4618-91cc-85015bbed1a5" (UID: "b4804160-77d4-4618-91cc-85015bbed1a5"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.126392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4804160-77d4-4618-91cc-85015bbed1a5" (UID: "b4804160-77d4-4618-91cc-85015bbed1a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.129899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.133203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config" (OuterVolumeSpecName: "config") pod "b4804160-77d4-4618-91cc-85015bbed1a5" (UID: "b4804160-77d4-4618-91cc-85015bbed1a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.140327 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e0a3039f-15d0-4764-bcdf-4833853ed168" (UID: "e0a3039f-15d0-4764-bcdf-4833853ed168"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165303 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165341 5030 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165361 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165482 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165501 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165517 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wtf6\" (UniqueName: \"kubernetes.io/projected/e0a3039f-15d0-4764-bcdf-4833853ed168-kube-api-access-8wtf6\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165532 5030 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165544 5030 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165557 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b4804160-77d4-4618-91cc-85015bbed1a5-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165569 5030 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a3039f-15d0-4764-bcdf-4833853ed168-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.165582 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ktj\" (UniqueName: \"kubernetes.io/projected/b4804160-77d4-4618-91cc-85015bbed1a5-kube-api-access-g9ktj\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.468365 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4804160-77d4-4618-91cc-85015bbed1a5" containerID="5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c" exitCode=0 Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.468461 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.468475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" event={"ID":"b4804160-77d4-4618-91cc-85015bbed1a5","Type":"ContainerDied","Data":"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c"} Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.468607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6" event={"ID":"b4804160-77d4-4618-91cc-85015bbed1a5","Type":"ContainerDied","Data":"392c79245c6663f862f9fc08d57cadf74130f19b62ef4e9027c766970d337840"} Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.468720 5030 scope.go:117] "RemoveContainer" containerID="5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.471651 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerID="00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500" exitCode=0 Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.471711 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerDied","Data":"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500"} Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.471752 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" event={"ID":"e0a3039f-15d0-4764-bcdf-4833853ed168","Type":"ContainerDied","Data":"695f1206946234ddbf81a6b4ceb02882ff0365de646682eedf8bdefae4f51335"} Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.471777 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.510657 5030 scope.go:117] "RemoveContainer" containerID="615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.626159 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.630872 5030 scope.go:117] "RemoveContainer" containerID="5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c" Jan 20 23:58:25 crc kubenswrapper[5030]: E0120 23:58:25.632075 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c\": container with ID starting with 5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c not found: ID does not exist" containerID="5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.632135 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c"} err="failed to get container status \"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c\": rpc error: code = NotFound desc = could not find container \"5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c\": container with ID starting with 5bf7d3aa7dddcfd86f7a56357da5095f341a594eddc4a99e9088446df842548c not found: ID does not exist" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.632179 5030 scope.go:117] "RemoveContainer" containerID="615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e" Jan 20 23:58:25 crc kubenswrapper[5030]: E0120 23:58:25.632824 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e\": container with ID starting with 615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e not found: ID does not exist" containerID="615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.632886 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e"} err="failed to get container status \"615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e\": rpc error: code = NotFound desc = could not find container \"615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e\": container with ID starting with 615df9dcdf35d3ae262a6af538fc4697b6295de8ea33815c235b9ee46c1faa9e not found: ID does not exist" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.632935 5030 scope.go:117] "RemoveContainer" containerID="614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.633060 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c878c4cc-h9vb6"] Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.643815 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.652099 5030 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-kuttl-tests/neutron-8f95b5bc4-cgjm4"] Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.662598 5030 scope.go:117] "RemoveContainer" containerID="00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.696270 5030 scope.go:117] "RemoveContainer" containerID="614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305" Jan 20 23:58:25 crc kubenswrapper[5030]: E0120 23:58:25.696882 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305\": container with ID starting with 614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305 not found: ID does not exist" containerID="614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.696937 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305"} err="failed to get container status \"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305\": rpc error: code = NotFound desc = could not find container \"614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305\": container with ID starting with 614efb0d878fa014c022337fb7a7d478520d2e68328e205d409a9cb6aab50305 not found: ID does not exist" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.696970 5030 scope.go:117] "RemoveContainer" containerID="00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500" Jan 20 23:58:25 crc kubenswrapper[5030]: E0120 23:58:25.697339 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500\": container with ID starting with 00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500 not found: ID does not exist" containerID="00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.697396 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500"} err="failed to get container status \"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500\": rpc error: code = NotFound desc = could not find container \"00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500\": container with ID starting with 00b43d04171daa71b4b128f12c54427b55f83c6364025b5f0dfa8199fd3ad500 not found: ID does not exist" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.982742 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" path="/var/lib/kubelet/pods/b4804160-77d4-4618-91cc-85015bbed1a5/volumes" Jan 20 23:58:25 crc kubenswrapper[5030]: I0120 23:58:25.984285 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" path="/var/lib/kubelet/pods/e0a3039f-15d0-4764-bcdf-4833853ed168/volumes" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606025 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606793 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606816 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="setup-container" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606822 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="setup-container" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606840 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606846 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606856 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606862 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606869 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606875 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606885 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="ovn-northd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606891 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="ovn-northd" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606898 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606903 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606912 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="sg-core" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606917 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="sg-core" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606925 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606938 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="cinder-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606945 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="cinder-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606953 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606959 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606970 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732faf79-c224-4a2a-bc39-07343926d772" containerName="mysql-bootstrap" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606976 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="732faf79-c224-4a2a-bc39-07343926d772" containerName="mysql-bootstrap" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.606987 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.606993 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607003 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607008 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607023 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607033 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="mysql-bootstrap" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607040 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="mysql-bootstrap" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607047 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607053 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607062 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerName="nova-scheduler-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607068 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerName="nova-scheduler-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607077 5030 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607083 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607095 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerName="memcached" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607101 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerName="memcached" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607111 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607116 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607137 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerName="nova-cell0-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607143 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerName="nova-cell0-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607154 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607159 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607170 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-central-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607176 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-central-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607183 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="setup-container" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607204 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="setup-container" Jan 20 23:58:37 
crc kubenswrapper[5030]: E0120 23:58:37.607212 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607218 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09900067-cfe8-4543-8e6c-88c692d43ed8" containerName="nova-cell1-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607235 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09900067-cfe8-4543-8e6c-88c692d43ed8" containerName="nova-cell1-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607245 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" containerName="keystone-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607250 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" containerName="keystone-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607258 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02190451-840d-4e4e-8b67-a14ac1f9712c" containerName="kube-state-metrics" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607264 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="02190451-840d-4e4e-8b67-a14ac1f9712c" containerName="kube-state-metrics" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607274 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-notification-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607280 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-notification-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607289 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="proxy-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607294 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="proxy-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607311 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607320 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="init" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607326 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="init" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607335 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607340 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" 
containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="dnsmasq-dns" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="dnsmasq-dns" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607362 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607368 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607376 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="openstack-network-exporter" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607382 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="openstack-network-exporter" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607389 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607395 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607403 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607421 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="probe" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607427 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="probe" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607437 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607442 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607449 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607454 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api" Jan 20 23:58:37 crc kubenswrapper[5030]: E0120 23:58:37.607461 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732faf79-c224-4a2a-bc39-07343926d772" containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="732faf79-c224-4a2a-bc39-07343926d772" 
containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607602 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="probe" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607609 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a35511b-3fcd-40b1-8657-c26c6bf69e50" containerName="keystone-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607618 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607647 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607657 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607669 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bdb99fc-435f-4ea4-bd16-e04616518099" containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607676 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607687 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="02190451-840d-4e4e-8b67-a14ac1f9712c" containerName="kube-state-metrics" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607700 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="231da45e-cd6f-435b-9209-0d8794fa33ad" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607710 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cdf6b51-c0c7-43fd-9a42-c99b2d8cd0bf" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607717 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f641b8-896e-47f1-b6b9-a2ced1fb3808" containerName="cinder-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607724 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="openstack-network-exporter" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607738 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607748 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68728b46-0262-46f2-8eb0-a0301870b569" containerName="barbican-keystone-listener-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607759 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607769 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="sg-core" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607779 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b4804160-77d4-4618-91cc-85015bbed1a5" containerName="dnsmasq-dns" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607790 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc1dc4a-cb0d-4a3c-ba23-39cbb9d183ae" containerName="nova-scheduler-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607799 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="proxy-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607811 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607820 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607831 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8fa8f5-9f43-489c-b61d-668187fe87ea" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607844 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e3e45b-cb93-4ba4-9597-0e329f3d34df" containerName="nova-metadata-metadata" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607854 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="732faf79-c224-4a2a-bc39-07343926d772" containerName="galera" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607864 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7305e60-aa68-4d37-a472-80060d0fb5cb" containerName="cinder-scheduler" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607873 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="443bb4b6-0b3c-4824-b5b6-e2771eb8e438" containerName="memcached" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607881 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-central-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607892 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="faaf0cec-efea-4050-9195-ee0262c8deb8" containerName="nova-cell0-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b71d0a-805d-4894-b6ff-c049baf96ad5" containerName="glance-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607910 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5915461a-f712-499e-ba04-bab78e49188f" containerName="barbican-api-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607921 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a3039f-15d0-4764-bcdf-4833853ed168" containerName="neutron-httpd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607934 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fe6346-a29a-4d3d-9cd1-5210fbe15b9e" containerName="barbican-worker" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607943 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f0a514-9f01-4768-aeb4-93de0354b738" containerName="ovn-northd" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607950 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8ffece47-af44-49bc-b3b4-6e6c24e6589e" containerName="ceilometer-notification-agent" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607958 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09900067-cfe8-4543-8e6c-88c692d43ed8" containerName="nova-cell1-conductor-conductor" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607966 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f6003c-3d52-44e9-a5a8-d525e4b6a79e" containerName="rabbitmq" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607972 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f6a312c-eea2-4e52-8e9d-e61ab90cd620" containerName="nova-api-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607981 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-api" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.607990 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1d53fa-21db-4d8b-a2bf-2a3be24cdc3f" containerName="placement-log" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.609002 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.636294 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.648478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqk84\" (UniqueName: \"kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.648554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.649363 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.751991 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqk84\" (UniqueName: \"kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.752999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc 
kubenswrapper[5030]: I0120 23:58:37.753198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.753854 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.753877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.780688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqk84\" (UniqueName: \"kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84\") pod \"certified-operators-hpg46\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:37 crc kubenswrapper[5030]: I0120 23:58:37.943184 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.196875 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.204400 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.210074 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.259015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.259487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.259528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2knhs\" (UniqueName: \"kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.360016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.360115 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.360146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2knhs\" (UniqueName: \"kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.360524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.360656 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.433530 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2knhs\" (UniqueName: \"kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs\") pod \"community-operators-flq7n\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.460483 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.528217 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.620348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerStarted","Data":"43c8c9725b93f9fddb94ce8ba0ab4553f995ecbf82c4fcdc1ef7d031c7b0b4bf"} Jan 20 23:58:38 crc kubenswrapper[5030]: I0120 23:58:38.818036 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:38 crc kubenswrapper[5030]: W0120 23:58:38.822473 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2127faa_68e3_4b3f_aea7_2d49c0c3a1c4.slice/crio-a663306082cb524b4fdfe80627c9a23188487eda92828c8b8b2682216f5a4c2f WatchSource:0}: Error finding container a663306082cb524b4fdfe80627c9a23188487eda92828c8b8b2682216f5a4c2f: Status 404 returned error can't find the container with id a663306082cb524b4fdfe80627c9a23188487eda92828c8b8b2682216f5a4c2f Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.633402 5030 generic.go:334] "Generic (PLEG): container finished" podID="56e828da-acef-4c43-ae22-165ce29b60f8" containerID="840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d" exitCode=0 Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.633519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerDied","Data":"840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d"} Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.637427 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.637936 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerID="f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c" exitCode=0 Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.638005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerDied","Data":"f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c"} Jan 20 23:58:39 crc kubenswrapper[5030]: I0120 23:58:39.638033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerStarted","Data":"a663306082cb524b4fdfe80627c9a23188487eda92828c8b8b2682216f5a4c2f"} Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.006698 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:40 crc 
kubenswrapper[5030]: I0120 23:58:40.012713 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.020515 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.157012 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.157075 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.197988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvnkv\" (UniqueName: \"kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.198139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.198388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.299810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.299910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvnkv\" (UniqueName: \"kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.299957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc 
kubenswrapper[5030]: I0120 23:58:40.300403 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.300449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.321768 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvnkv\" (UniqueName: \"kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv\") pod \"redhat-marketplace-5c582\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.341721 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.648848 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerStarted","Data":"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b"} Jan 20 23:58:40 crc kubenswrapper[5030]: I0120 23:58:40.822214 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:40 crc kubenswrapper[5030]: W0120 23:58:40.827923 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c360f3_a7ca_4a2f_9bf8_b2a3e8f27e8b.slice/crio-68665c517a722a5a5eded42ac20d0567a0f186b7cad016988d67c998866a20b1 WatchSource:0}: Error finding container 68665c517a722a5a5eded42ac20d0567a0f186b7cad016988d67c998866a20b1: Status 404 returned error can't find the container with id 68665c517a722a5a5eded42ac20d0567a0f186b7cad016988d67c998866a20b1 Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.667318 5030 generic.go:334] "Generic (PLEG): container finished" podID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerID="1bfe195e90e4fea9c6fc638d87fa40905e7c70279350099f1d4d3564147f9044" exitCode=0 Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.667444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerDied","Data":"1bfe195e90e4fea9c6fc638d87fa40905e7c70279350099f1d4d3564147f9044"} Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.668868 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerStarted","Data":"68665c517a722a5a5eded42ac20d0567a0f186b7cad016988d67c998866a20b1"} Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.673342 5030 generic.go:334] "Generic (PLEG): container finished" podID="56e828da-acef-4c43-ae22-165ce29b60f8" containerID="1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7" exitCode=0 Jan 20 23:58:41 crc 
kubenswrapper[5030]: I0120 23:58:41.673440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerDied","Data":"1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7"} Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.678880 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerID="a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b" exitCode=0 Jan 20 23:58:41 crc kubenswrapper[5030]: I0120 23:58:41.678934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerDied","Data":"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b"} Jan 20 23:58:42 crc kubenswrapper[5030]: I0120 23:58:42.693673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerStarted","Data":"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84"} Jan 20 23:58:42 crc kubenswrapper[5030]: I0120 23:58:42.696263 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerStarted","Data":"3ceb2ef022ec851041c800f4859b26aabd1b1adfc83af76b57d495cb5ce33a82"} Jan 20 23:58:42 crc kubenswrapper[5030]: I0120 23:58:42.698569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerStarted","Data":"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74"} Jan 20 23:58:42 crc kubenswrapper[5030]: I0120 23:58:42.721415 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-flq7n" podStartSLOduration=2.240712129 podStartE2EDuration="4.721394217s" podCreationTimestamp="2026-01-20 23:58:38 +0000 UTC" firstStartedPulling="2026-01-20 23:58:39.640112489 +0000 UTC m=+4991.960372777" lastFinishedPulling="2026-01-20 23:58:42.120794547 +0000 UTC m=+4994.441054865" observedRunningTime="2026-01-20 23:58:42.714884639 +0000 UTC m=+4995.035144967" watchObservedRunningTime="2026-01-20 23:58:42.721394217 +0000 UTC m=+4995.041654515" Jan 20 23:58:42 crc kubenswrapper[5030]: I0120 23:58:42.758204 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpg46" podStartSLOduration=3.229075067 podStartE2EDuration="5.758177589s" podCreationTimestamp="2026-01-20 23:58:37 +0000 UTC" firstStartedPulling="2026-01-20 23:58:39.637160528 +0000 UTC m=+4991.957420816" lastFinishedPulling="2026-01-20 23:58:42.16626301 +0000 UTC m=+4994.486523338" observedRunningTime="2026-01-20 23:58:42.748794712 +0000 UTC m=+4995.069055010" watchObservedRunningTime="2026-01-20 23:58:42.758177589 +0000 UTC m=+4995.078437917" Jan 20 23:58:43 crc kubenswrapper[5030]: I0120 23:58:43.725285 5030 generic.go:334] "Generic (PLEG): container finished" podID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerID="3ceb2ef022ec851041c800f4859b26aabd1b1adfc83af76b57d495cb5ce33a82" exitCode=0 Jan 20 23:58:43 crc kubenswrapper[5030]: I0120 23:58:43.725394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" 
event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerDied","Data":"3ceb2ef022ec851041c800f4859b26aabd1b1adfc83af76b57d495cb5ce33a82"} Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.197737 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.274260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.274352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfwzn\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn\") pod \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.274394 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") pod \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.274439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache\") pod \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.274535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock\") pod \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\" (UID: \"647882ed-cb41-417a-98bc-5a6bfbf0dd2a\") " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.276905 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache" (OuterVolumeSpecName: "cache") pod "647882ed-cb41-417a-98bc-5a6bfbf0dd2a" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.277238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock" (OuterVolumeSpecName: "lock") pod "647882ed-cb41-417a-98bc-5a6bfbf0dd2a" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.281403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn" (OuterVolumeSpecName: "kube-api-access-jfwzn") pod "647882ed-cb41-417a-98bc-5a6bfbf0dd2a" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a"). InnerVolumeSpecName "kube-api-access-jfwzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.281459 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "647882ed-cb41-417a-98bc-5a6bfbf0dd2a" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.281876 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "647882ed-cb41-417a-98bc-5a6bfbf0dd2a" (UID: "647882ed-cb41-417a-98bc-5a6bfbf0dd2a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.375483 5030 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-lock\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.375533 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.375549 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfwzn\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-kube-api-access-jfwzn\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.375559 5030 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.375568 5030 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/647882ed-cb41-417a-98bc-5a6bfbf0dd2a-cache\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.389266 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.476514 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.744969 5030 generic.go:334] "Generic (PLEG): container finished" podID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerID="525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f" exitCode=137 Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.745067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f"} Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.745117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/swift-storage-0" event={"ID":"647882ed-cb41-417a-98bc-5a6bfbf0dd2a","Type":"ContainerDied","Data":"846ef75ed7641319a1122e200cc114c276facc1edc4d1b914c9ee658dab575ff"} Jan 20 
23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.745146 5030 scope.go:117] "RemoveContainer" containerID="525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.745142 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/swift-storage-0" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.753250 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerStarted","Data":"a9e15fb2ab44ed239155114338511bb4ab6d10fa47b7f5bce757e46e6f4162e7"} Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.771685 5030 scope.go:117] "RemoveContainer" containerID="92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.815501 5030 scope.go:117] "RemoveContainer" containerID="cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.822616 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5c582" podStartSLOduration=3.24212502 podStartE2EDuration="5.822345663s" podCreationTimestamp="2026-01-20 23:58:39 +0000 UTC" firstStartedPulling="2026-01-20 23:58:41.671768444 +0000 UTC m=+4993.992028742" lastFinishedPulling="2026-01-20 23:58:44.251989087 +0000 UTC m=+4996.572249385" observedRunningTime="2026-01-20 23:58:44.78553506 +0000 UTC m=+4997.105795378" watchObservedRunningTime="2026-01-20 23:58:44.822345663 +0000 UTC m=+4997.142606001" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.825821 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.832498 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/swift-storage-0"] Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.837917 5030 scope.go:117] "RemoveContainer" containerID="a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.859652 5030 scope.go:117] "RemoveContainer" containerID="f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.909005 5030 scope.go:117] "RemoveContainer" containerID="8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.944932 5030 scope.go:117] "RemoveContainer" containerID="4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.971019 5030 scope.go:117] "RemoveContainer" containerID="57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642" Jan 20 23:58:44 crc kubenswrapper[5030]: I0120 23:58:44.991495 5030 scope.go:117] "RemoveContainer" containerID="691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.020508 5030 scope.go:117] "RemoveContainer" containerID="e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.044786 5030 scope.go:117] "RemoveContainer" containerID="422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.064013 5030 scope.go:117] "RemoveContainer" 
containerID="5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.102473 5030 scope.go:117] "RemoveContainer" containerID="50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.127263 5030 scope.go:117] "RemoveContainer" containerID="3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.144866 5030 scope.go:117] "RemoveContainer" containerID="4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.164970 5030 scope.go:117] "RemoveContainer" containerID="525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.165473 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f\": container with ID starting with 525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f not found: ID does not exist" containerID="525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.165505 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f"} err="failed to get container status \"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f\": rpc error: code = NotFound desc = could not find container \"525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f\": container with ID starting with 525d2f77e4bb4d57f35f228c13d361085452ce1a0496934e60bded3b63edca0f not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.165524 5030 scope.go:117] "RemoveContainer" containerID="92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.166090 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e\": container with ID starting with 92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e not found: ID does not exist" containerID="92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.166138 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e"} err="failed to get container status \"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e\": rpc error: code = NotFound desc = could not find container \"92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e\": container with ID starting with 92c32d2bc86e98bab1819f43771f92a83d45f6ff289fdd9a2b5a70b53ad1558e not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.166169 5030 scope.go:117] "RemoveContainer" containerID="cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.166703 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017\": container with ID starting 
with cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017 not found: ID does not exist" containerID="cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.166732 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017"} err="failed to get container status \"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017\": rpc error: code = NotFound desc = could not find container \"cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017\": container with ID starting with cea3952cdaef6601fde2eb637e39d3001c3a7c366d044314ff9a9d1c02c48017 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.166747 5030 scope.go:117] "RemoveContainer" containerID="a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.167138 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad\": container with ID starting with a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad not found: ID does not exist" containerID="a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167191 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad"} err="failed to get container status \"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad\": rpc error: code = NotFound desc = could not find container \"a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad\": container with ID starting with a374407f689a9d2bb03eff9d062e1cd3da80b68110ed673257b7015e9e92c1ad not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167205 5030 scope.go:117] "RemoveContainer" containerID="f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.167493 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc\": container with ID starting with f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc not found: ID does not exist" containerID="f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167527 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc"} err="failed to get container status \"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc\": rpc error: code = NotFound desc = could not find container \"f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc\": container with ID starting with f0d126b13c715f2ad13a917efe6ded91f57e7059086b604ef44b914a751d5bbc not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167545 5030 scope.go:117] "RemoveContainer" containerID="8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.167867 5030 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754\": container with ID starting with 8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754 not found: ID does not exist" containerID="8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167927 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754"} err="failed to get container status \"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754\": rpc error: code = NotFound desc = could not find container \"8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754\": container with ID starting with 8671b10a80366e99ef14723df98d3a82f9697f085022261cce60d19731b26754 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.167941 5030 scope.go:117] "RemoveContainer" containerID="4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.168204 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7\": container with ID starting with 4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7 not found: ID does not exist" containerID="4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7"} err="failed to get container status \"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7\": rpc error: code = NotFound desc = could not find container \"4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7\": container with ID starting with 4f455f04c208a9ca5dfa5ca0e43960e84838f224928f9a968af8ca6c20691bc7 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168269 5030 scope.go:117] "RemoveContainer" containerID="57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.168540 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642\": container with ID starting with 57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642 not found: ID does not exist" containerID="57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168588 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642"} err="failed to get container status \"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642\": rpc error: code = NotFound desc = could not find container \"57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642\": container with ID starting with 57552db52780afc148b63dea6d16414f0cabd8ebca10a042707f9fe199f39642 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168604 5030 scope.go:117] "RemoveContainer" 
containerID="691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.168881 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105\": container with ID starting with 691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105 not found: ID does not exist" containerID="691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105"} err="failed to get container status \"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105\": rpc error: code = NotFound desc = could not find container \"691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105\": container with ID starting with 691b234d76c80217c95ab8beecf74884acbbff47c7b789fb42fa87e618b49105 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.168915 5030 scope.go:117] "RemoveContainer" containerID="e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.169186 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b\": container with ID starting with e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b not found: ID does not exist" containerID="e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169215 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b"} err="failed to get container status \"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b\": rpc error: code = NotFound desc = could not find container \"e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b\": container with ID starting with e69298ac72328d9e05cbb9394d7318f97f51059ec2e51eed28645ae7a530e56b not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169233 5030 scope.go:117] "RemoveContainer" containerID="422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.169501 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9\": container with ID starting with 422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9 not found: ID does not exist" containerID="422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169527 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9"} err="failed to get container status \"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9\": rpc error: code = NotFound desc = could not find container \"422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9\": container with ID starting with 
422e52434457361c67a15e36827a2685f1991c2155732af2cb83cec3214ed1f9 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169542 5030 scope.go:117] "RemoveContainer" containerID="5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.169853 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d\": container with ID starting with 5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d not found: ID does not exist" containerID="5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169874 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d"} err="failed to get container status \"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d\": rpc error: code = NotFound desc = could not find container \"5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d\": container with ID starting with 5e75ab45cf815c6f40eaad36359127f421639ab2cf69cccab92eb297f6bff69d not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.169887 5030 scope.go:117] "RemoveContainer" containerID="50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.170178 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c\": container with ID starting with 50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c not found: ID does not exist" containerID="50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.170208 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c"} err="failed to get container status \"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c\": rpc error: code = NotFound desc = could not find container \"50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c\": container with ID starting with 50c1d323bc10d2e9d482f6bae645cc74b4dc3db3e2fb097ee700a89b4499d86c not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.170225 5030 scope.go:117] "RemoveContainer" containerID="3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.170591 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c\": container with ID starting with 3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c not found: ID does not exist" containerID="3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.170653 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c"} err="failed to get container status \"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c\": rpc 
error: code = NotFound desc = could not find container \"3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c\": container with ID starting with 3654ab925574f24e6de289b448a4be2823edfaa3cfa76260680adad4710a392c not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.170667 5030 scope.go:117] "RemoveContainer" containerID="4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817" Jan 20 23:58:45 crc kubenswrapper[5030]: E0120 23:58:45.171291 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817\": container with ID starting with 4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817 not found: ID does not exist" containerID="4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.171334 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817"} err="failed to get container status \"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817\": rpc error: code = NotFound desc = could not find container \"4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817\": container with ID starting with 4ed9aef9fb4eafe75c52a1017481474e53ebb1e8b61c72ef3fbe95bc2c025817 not found: ID does not exist" Jan 20 23:58:45 crc kubenswrapper[5030]: I0120 23:58:45.974143 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" path="/var/lib/kubelet/pods/647882ed-cb41-417a-98bc-5a6bfbf0dd2a/volumes" Jan 20 23:58:46 crc kubenswrapper[5030]: I0120 23:58:46.780668 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerID="9edc9b5e1d0d3cbd3ae7e5e30adbd2b6dd6a00cb85691efc291253101a274524" exitCode=137 Jan 20 23:58:46 crc kubenswrapper[5030]: I0120 23:58:46.780716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerDied","Data":"9edc9b5e1d0d3cbd3ae7e5e30adbd2b6dd6a00cb85691efc291253101a274524"} Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.209618 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.229601 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9ss\" (UniqueName: \"kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss\") pod \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.229876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle\") pod \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.230137 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs\") pod \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.230232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom\") pod \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.230341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data\") pod \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\" (UID: \"f4979b9f-9d14-4e60-b62f-82d983b0fda9\") " Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.242585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs" (OuterVolumeSpecName: "logs") pod "f4979b9f-9d14-4e60-b62f-82d983b0fda9" (UID: "f4979b9f-9d14-4e60-b62f-82d983b0fda9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.242802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f4979b9f-9d14-4e60-b62f-82d983b0fda9" (UID: "f4979b9f-9d14-4e60-b62f-82d983b0fda9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.247041 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss" (OuterVolumeSpecName: "kube-api-access-5q9ss") pod "f4979b9f-9d14-4e60-b62f-82d983b0fda9" (UID: "f4979b9f-9d14-4e60-b62f-82d983b0fda9"). InnerVolumeSpecName "kube-api-access-5q9ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.273694 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4979b9f-9d14-4e60-b62f-82d983b0fda9" (UID: "f4979b9f-9d14-4e60-b62f-82d983b0fda9"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.333128 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.333163 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4979b9f-9d14-4e60-b62f-82d983b0fda9-logs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.333178 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.333197 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9ss\" (UniqueName: \"kubernetes.io/projected/f4979b9f-9d14-4e60-b62f-82d983b0fda9-kube-api-access-5q9ss\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.334242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data" (OuterVolumeSpecName: "config-data") pod "f4979b9f-9d14-4e60-b62f-82d983b0fda9" (UID: "f4979b9f-9d14-4e60-b62f-82d983b0fda9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.434599 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4979b9f-9d14-4e60-b62f-82d983b0fda9-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.800275 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" event={"ID":"f4979b9f-9d14-4e60-b62f-82d983b0fda9","Type":"ContainerDied","Data":"cedd4f0ce87996317edcf5b3b3ae5399cfe3e74e63bb4e27e816846fe94418c8"} Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.800347 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.800357 5030 scope.go:117] "RemoveContainer" containerID="9edc9b5e1d0d3cbd3ae7e5e30adbd2b6dd6a00cb85691efc291253101a274524" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.846020 5030 scope.go:117] "RemoveContainer" containerID="3a64e91d5d48c05cb735d0cb229c7481a1d20317e71c2a80aafc083cefa1b5aa" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.848275 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.855285 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/barbican-worker-5d4668d597-b98m7"] Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.943414 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.943599 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:47 crc kubenswrapper[5030]: I0120 23:58:47.971084 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" path="/var/lib/kubelet/pods/f4979b9f-9d14-4e60-b62f-82d983b0fda9/volumes" Jan 20 23:58:47 crc kubenswrapper[5030]: E0120 23:58:47.973603 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4979b9f_9d14_4e60_b62f_82d983b0fda9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4979b9f_9d14_4e60_b62f_82d983b0fda9.slice/crio-cedd4f0ce87996317edcf5b3b3ae5399cfe3e74e63bb4e27e816846fe94418c8\": RecentStats: unable to find data in memory cache]" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.017215 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.529566 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.529711 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.699580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.867920 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:48 crc kubenswrapper[5030]: I0120 23:58:48.885777 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:50 crc kubenswrapper[5030]: I0120 23:58:50.342031 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:50 crc kubenswrapper[5030]: I0120 23:58:50.342267 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:50 crc 
kubenswrapper[5030]: I0120 23:58:50.421090 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:50 crc kubenswrapper[5030]: I0120 23:58:50.588098 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:50 crc kubenswrapper[5030]: I0120 23:58:50.834841 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-flq7n" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="registry-server" containerID="cri-o://4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84" gracePeriod=2 Jan 20 23:58:50 crc kubenswrapper[5030]: I0120 23:58:50.916390 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.188627 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.188860 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpg46" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="registry-server" containerID="cri-o://c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74" gracePeriod=2 Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.655885 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.784507 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848566 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerID="4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84" exitCode=0 Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848623 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities\") pod \"56e828da-acef-4c43-ae22-165ce29b60f8\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848651 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flq7n" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerDied","Data":"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84"} Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848709 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqk84\" (UniqueName: \"kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84\") pod \"56e828da-acef-4c43-ae22-165ce29b60f8\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flq7n" event={"ID":"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4","Type":"ContainerDied","Data":"a663306082cb524b4fdfe80627c9a23188487eda92828c8b8b2682216f5a4c2f"} Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848931 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content\") pod \"56e828da-acef-4c43-ae22-165ce29b60f8\" (UID: \"56e828da-acef-4c43-ae22-165ce29b60f8\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.848879 5030 scope.go:117] "RemoveContainer" containerID="4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.849902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities" (OuterVolumeSpecName: "utilities") pod "56e828da-acef-4c43-ae22-165ce29b60f8" (UID: "56e828da-acef-4c43-ae22-165ce29b60f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.850200 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.855293 5030 generic.go:334] "Generic (PLEG): container finished" podID="56e828da-acef-4c43-ae22-165ce29b60f8" containerID="c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74" exitCode=0 Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.855859 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpg46" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.855859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84" (OuterVolumeSpecName: "kube-api-access-cqk84") pod "56e828da-acef-4c43-ae22-165ce29b60f8" (UID: "56e828da-acef-4c43-ae22-165ce29b60f8"). InnerVolumeSpecName "kube-api-access-cqk84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.855909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerDied","Data":"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74"} Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.855974 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpg46" event={"ID":"56e828da-acef-4c43-ae22-165ce29b60f8","Type":"ContainerDied","Data":"43c8c9725b93f9fddb94ce8ba0ab4553f995ecbf82c4fcdc1ef7d031c7b0b4bf"} Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.880470 5030 scope.go:117] "RemoveContainer" containerID="a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.906002 5030 scope.go:117] "RemoveContainer" containerID="f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.920625 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56e828da-acef-4c43-ae22-165ce29b60f8" (UID: "56e828da-acef-4c43-ae22-165ce29b60f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.938097 5030 scope.go:117] "RemoveContainer" containerID="4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84" Jan 20 23:58:51 crc kubenswrapper[5030]: E0120 23:58:51.938635 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84\": container with ID starting with 4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84 not found: ID does not exist" containerID="4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.938670 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84"} err="failed to get container status \"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84\": rpc error: code = NotFound desc = could not find container \"4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84\": container with ID starting with 4cca686e8ae995a5b2f7e1d4b695d0f72bacd941fb7e02faa997797f22d4cb84 not found: ID does not exist" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.938688 5030 scope.go:117] "RemoveContainer" containerID="a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b" Jan 20 23:58:51 crc kubenswrapper[5030]: E0120 23:58:51.939446 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b\": container with ID starting with a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b not found: ID does not exist" containerID="a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.939466 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b"} err="failed to get container status \"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b\": rpc error: code = NotFound desc = could not find container \"a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b\": container with ID starting with a77e1aa1daf52e5564ff152615e35c6e7834f11f5a5d23d9ffb0e3926baac66b not found: ID does not exist" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.939479 5030 scope.go:117] "RemoveContainer" containerID="f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c" Jan 20 23:58:51 crc kubenswrapper[5030]: E0120 23:58:51.939767 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c\": container with ID starting with f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c not found: ID does not exist" containerID="f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.939784 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c"} err="failed to get container status \"f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c\": rpc error: code = NotFound desc = could not find container \"f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c\": container with ID starting with f8069ec259a68848db68ae1bd34bba7acb5740c20d804872d8f96772c922056c not found: ID does not exist" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.939795 5030 scope.go:117] "RemoveContainer" containerID="c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.951716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content\") pod \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.951757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities\") pod \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.951810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2knhs\" (UniqueName: \"kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs\") pod \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\" (UID: \"b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4\") " Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.952003 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqk84\" (UniqueName: \"kubernetes.io/projected/56e828da-acef-4c43-ae22-165ce29b60f8-kube-api-access-cqk84\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.952019 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e828da-acef-4c43-ae22-165ce29b60f8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 
23:58:51.952768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities" (OuterVolumeSpecName: "utilities") pod "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" (UID: "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.955587 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs" (OuterVolumeSpecName: "kube-api-access-2knhs") pod "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" (UID: "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4"). InnerVolumeSpecName "kube-api-access-2knhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:51 crc kubenswrapper[5030]: I0120 23:58:51.963509 5030 scope.go:117] "RemoveContainer" containerID="1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.023392 5030 scope.go:117] "RemoveContainer" containerID="840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.025382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" (UID: "b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.053423 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.053731 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.053837 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2knhs\" (UniqueName: \"kubernetes.io/projected/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4-kube-api-access-2knhs\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.062081 5030 scope.go:117] "RemoveContainer" containerID="c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74" Jan 20 23:58:52 crc kubenswrapper[5030]: E0120 23:58:52.063008 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74\": container with ID starting with c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74 not found: ID does not exist" containerID="c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.063061 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74"} err="failed to get container status \"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74\": rpc error: code = NotFound desc = could not find container 
\"c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74\": container with ID starting with c3aaf779143426a1b7526640527ea8879e3899190d12b5f1838559f537a79b74 not found: ID does not exist" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.063095 5030 scope.go:117] "RemoveContainer" containerID="1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7" Jan 20 23:58:52 crc kubenswrapper[5030]: E0120 23:58:52.063415 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7\": container with ID starting with 1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7 not found: ID does not exist" containerID="1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.063451 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7"} err="failed to get container status \"1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7\": rpc error: code = NotFound desc = could not find container \"1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7\": container with ID starting with 1aa6d8d043765b94a19fbd15cb1a295eacf379d17698d8c08185e724a14452e7 not found: ID does not exist" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.063476 5030 scope.go:117] "RemoveContainer" containerID="840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d" Jan 20 23:58:52 crc kubenswrapper[5030]: E0120 23:58:52.063777 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d\": container with ID starting with 840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d not found: ID does not exist" containerID="840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.063823 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d"} err="failed to get container status \"840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d\": rpc error: code = NotFound desc = could not find container \"840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d\": container with ID starting with 840e1eaed434b295f88e7b52b454e12ae962f285a22261917f6beba2bcffcb5d not found: ID does not exist" Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.177520 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.183361 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpg46"] Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.198849 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.209210 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-flq7n"] Jan 20 23:58:52 crc kubenswrapper[5030]: I0120 23:58:52.980548 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:52 crc 
kubenswrapper[5030]: I0120 23:58:52.980772 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5c582" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="registry-server" containerID="cri-o://a9e15fb2ab44ed239155114338511bb4ab6d10fa47b7f5bce757e46e6f4162e7" gracePeriod=2 Jan 20 23:58:53 crc kubenswrapper[5030]: I0120 23:58:53.911112 5030 generic.go:334] "Generic (PLEG): container finished" podID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerID="a9e15fb2ab44ed239155114338511bb4ab6d10fa47b7f5bce757e46e6f4162e7" exitCode=0 Jan 20 23:58:53 crc kubenswrapper[5030]: I0120 23:58:53.911164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerDied","Data":"a9e15fb2ab44ed239155114338511bb4ab6d10fa47b7f5bce757e46e6f4162e7"} Jan 20 23:58:53 crc kubenswrapper[5030]: I0120 23:58:53.975127 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" path="/var/lib/kubelet/pods/56e828da-acef-4c43-ae22-165ce29b60f8/volumes" Jan 20 23:58:53 crc kubenswrapper[5030]: I0120 23:58:53.976301 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" path="/var/lib/kubelet/pods/b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4/volumes" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.002416 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.103954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities\") pod \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.104045 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content\") pod \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.104110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvnkv\" (UniqueName: \"kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv\") pod \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\" (UID: \"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b\") " Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.105632 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities" (OuterVolumeSpecName: "utilities") pod "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" (UID: "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.111929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv" (OuterVolumeSpecName: "kube-api-access-bvnkv") pod "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" (UID: "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b"). InnerVolumeSpecName "kube-api-access-bvnkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.128102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" (UID: "16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.206233 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.206274 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.206290 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvnkv\" (UniqueName: \"kubernetes.io/projected/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b-kube-api-access-bvnkv\") on node \"crc\" DevicePath \"\"" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.927443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5c582" event={"ID":"16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b","Type":"ContainerDied","Data":"68665c517a722a5a5eded42ac20d0567a0f186b7cad016988d67c998866a20b1"} Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.927777 5030 scope.go:117] "RemoveContainer" containerID="a9e15fb2ab44ed239155114338511bb4ab6d10fa47b7f5bce757e46e6f4162e7" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.927541 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5c582" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.973019 5030 scope.go:117] "RemoveContainer" containerID="3ceb2ef022ec851041c800f4859b26aabd1b1adfc83af76b57d495cb5ce33a82" Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.974465 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:54 crc kubenswrapper[5030]: I0120 23:58:54.984516 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5c582"] Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.005668 5030 scope.go:117] "RemoveContainer" containerID="1bfe195e90e4fea9c6fc638d87fa40905e7c70279350099f1d4d3564147f9044" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.247602 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9pwcs"] Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.253756 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9pwcs"] Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.425409 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-w492d"] Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426514 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426561 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426579 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426595 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426685 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-expirer" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426704 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-expirer" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426748 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426790 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426816 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426831 5030 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426851 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426866 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="extract-content" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426888 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426904 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426935 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426953 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.426976 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.426992 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427008 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427023 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427051 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427102 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427121 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427147 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-reaper" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427163 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-reaper" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427185 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427200 
5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427228 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427242 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427259 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427274 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427294 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427335 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427351 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427397 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="extract-utilities" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427416 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427433 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.427452 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="swift-recon-cron" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.427470 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="swift-recon-cron" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.428408 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker-log" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.428431 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker-log" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.428463 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="registry-server" Jan 20 23:58:55 crc 
kubenswrapper[5030]: I0120 23:58:55.428479 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.428517 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.428536 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: E0120 23:58:55.428562 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="rsync" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.428578 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="rsync" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.428954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.428996 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker-log" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429048 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4979b9f-9d14-4e60-b62f-82d983b0fda9" containerName="barbican-worker" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429066 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429091 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429121 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429141 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429167 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-replicator" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429200 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429230 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="swift-recon-cron" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429267 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-expirer" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429295 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429324 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="container-updater" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429371 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e828da-acef-4c43-ae22-165ce29b60f8" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429393 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="object-auditor" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429427 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2127faa-68e3-4b3f-aea7-2d49c0c3a1c4" containerName="registry-server" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429570 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="rsync" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.429594 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="647882ed-cb41-417a-98bc-5a6bfbf0dd2a" containerName="account-reaper" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.431793 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.433808 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.434040 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.435253 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.435267 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.444341 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-w492d"] Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.631117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsnrz\" (UniqueName: \"kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.631254 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.631305 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.732432 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsnrz\" (UniqueName: \"kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.732501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.732536 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.733010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.733968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.765440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsnrz\" (UniqueName: \"kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz\") pod \"crc-storage-crc-w492d\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.973965 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b" path="/var/lib/kubelet/pods/16c360f3-a7ca-4a2f-9bf8-b2a3e8f27e8b/volumes" Jan 20 23:58:55 crc kubenswrapper[5030]: I0120 23:58:55.975394 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60f009f-6f9c-4102-b78a-af3e0a7b6279" path="/var/lib/kubelet/pods/b60f009f-6f9c-4102-b78a-af3e0a7b6279/volumes" Jan 20 23:58:56 crc kubenswrapper[5030]: I0120 23:58:56.061640 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:56 crc kubenswrapper[5030]: I0120 23:58:56.548457 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-w492d"] Jan 20 23:58:56 crc kubenswrapper[5030]: I0120 23:58:56.954548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-w492d" event={"ID":"e3fdf327-352e-4e19-88a5-78509925ead1","Type":"ContainerStarted","Data":"3aa73ae352f5e2d368330e7a61ad8eeab8c31d42a1243d7cd4f7c9332402d06c"} Jan 20 23:58:57 crc kubenswrapper[5030]: I0120 23:58:57.968133 5030 generic.go:334] "Generic (PLEG): container finished" podID="e3fdf327-352e-4e19-88a5-78509925ead1" containerID="6a94b4f12576bc4770168fd5a8a6729bf06a9b306ed382492bd0701c4327a171" exitCode=0 Jan 20 23:58:57 crc kubenswrapper[5030]: I0120 23:58:57.979536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-w492d" event={"ID":"e3fdf327-352e-4e19-88a5-78509925ead1","Type":"ContainerDied","Data":"6a94b4f12576bc4770168fd5a8a6729bf06a9b306ed382492bd0701c4327a171"} Jan 20 23:58:59 crc kubenswrapper[5030]: I0120 23:58:59.838270 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:58:59 crc kubenswrapper[5030]: I0120 23:58:59.995876 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-w492d" event={"ID":"e3fdf327-352e-4e19-88a5-78509925ead1","Type":"ContainerDied","Data":"3aa73ae352f5e2d368330e7a61ad8eeab8c31d42a1243d7cd4f7c9332402d06c"} Jan 20 23:58:59 crc kubenswrapper[5030]: I0120 23:58:59.996153 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aa73ae352f5e2d368330e7a61ad8eeab8c31d42a1243d7cd4f7c9332402d06c" Jan 20 23:58:59 crc kubenswrapper[5030]: I0120 23:58:59.995944 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-w492d" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.030149 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage\") pod \"e3fdf327-352e-4e19-88a5-78509925ead1\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.030265 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsnrz\" (UniqueName: \"kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz\") pod \"e3fdf327-352e-4e19-88a5-78509925ead1\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.030319 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt\") pod \"e3fdf327-352e-4e19-88a5-78509925ead1\" (UID: \"e3fdf327-352e-4e19-88a5-78509925ead1\") " Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.030838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "e3fdf327-352e-4e19-88a5-78509925ead1" (UID: "e3fdf327-352e-4e19-88a5-78509925ead1"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.035853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz" (OuterVolumeSpecName: "kube-api-access-lsnrz") pod "e3fdf327-352e-4e19-88a5-78509925ead1" (UID: "e3fdf327-352e-4e19-88a5-78509925ead1"). InnerVolumeSpecName "kube-api-access-lsnrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.131766 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsnrz\" (UniqueName: \"kubernetes.io/projected/e3fdf327-352e-4e19-88a5-78509925ead1-kube-api-access-lsnrz\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.131819 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/e3fdf327-352e-4e19-88a5-78509925ead1-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.156104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "e3fdf327-352e-4e19-88a5-78509925ead1" (UID: "e3fdf327-352e-4e19-88a5-78509925ead1"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:00 crc kubenswrapper[5030]: I0120 23:59:00.232887 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/e3fdf327-352e-4e19-88a5-78509925ead1-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.587930 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-w492d"] Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.599299 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-w492d"] Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.720307 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wnt5h"] Jan 20 23:59:03 crc kubenswrapper[5030]: E0120 23:59:03.720900 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3fdf327-352e-4e19-88a5-78509925ead1" containerName="storage" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.720931 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3fdf327-352e-4e19-88a5-78509925ead1" containerName="storage" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.721216 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3fdf327-352e-4e19-88a5-78509925ead1" containerName="storage" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.722143 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.723868 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.725753 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.726108 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.726256 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.731447 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wnt5h"] Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.791704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g84xk\" (UniqueName: \"kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.791777 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.791858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.894129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g84xk\" (UniqueName: \"kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.894198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.894274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.895032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " 
pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.896862 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.937582 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g84xk\" (UniqueName: \"kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk\") pod \"crc-storage-crc-wnt5h\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:03 crc kubenswrapper[5030]: I0120 23:59:03.984565 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3fdf327-352e-4e19-88a5-78509925ead1" path="/var/lib/kubelet/pods/e3fdf327-352e-4e19-88a5-78509925ead1/volumes" Jan 20 23:59:04 crc kubenswrapper[5030]: I0120 23:59:04.052421 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:04 crc kubenswrapper[5030]: I0120 23:59:04.634243 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wnt5h"] Jan 20 23:59:05 crc kubenswrapper[5030]: I0120 23:59:05.055810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnt5h" event={"ID":"3a1f623f-afaf-431c-a665-7edcd01cb754","Type":"ContainerStarted","Data":"0b0053fc88832e76a412122b37f29bd01982f85f339c5f94c543049f215715d5"} Jan 20 23:59:06 crc kubenswrapper[5030]: I0120 23:59:06.069743 5030 generic.go:334] "Generic (PLEG): container finished" podID="3a1f623f-afaf-431c-a665-7edcd01cb754" containerID="fe7e0a28e309f4b3b3b54f2671350c8935af49ddb2d6e6455da9ad8b2c979856" exitCode=0 Jan 20 23:59:06 crc kubenswrapper[5030]: I0120 23:59:06.069826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnt5h" event={"ID":"3a1f623f-afaf-431c-a665-7edcd01cb754","Type":"ContainerDied","Data":"fe7e0a28e309f4b3b3b54f2671350c8935af49ddb2d6e6455da9ad8b2c979856"} Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.467968 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.661307 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage\") pod \"3a1f623f-afaf-431c-a665-7edcd01cb754\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.661560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt\") pod \"3a1f623f-afaf-431c-a665-7edcd01cb754\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.661702 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3a1f623f-afaf-431c-a665-7edcd01cb754" (UID: "3a1f623f-afaf-431c-a665-7edcd01cb754"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.661842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g84xk\" (UniqueName: \"kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk\") pod \"3a1f623f-afaf-431c-a665-7edcd01cb754\" (UID: \"3a1f623f-afaf-431c-a665-7edcd01cb754\") " Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.662815 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3a1f623f-afaf-431c-a665-7edcd01cb754-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.668191 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk" (OuterVolumeSpecName: "kube-api-access-g84xk") pod "3a1f623f-afaf-431c-a665-7edcd01cb754" (UID: "3a1f623f-afaf-431c-a665-7edcd01cb754"). InnerVolumeSpecName "kube-api-access-g84xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.684754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3a1f623f-afaf-431c-a665-7edcd01cb754" (UID: "3a1f623f-afaf-431c-a665-7edcd01cb754"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.764179 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3a1f623f-afaf-431c-a665-7edcd01cb754-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:07 crc kubenswrapper[5030]: I0120 23:59:07.764238 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g84xk\" (UniqueName: \"kubernetes.io/projected/3a1f623f-afaf-431c-a665-7edcd01cb754-kube-api-access-g84xk\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:08 crc kubenswrapper[5030]: I0120 23:59:08.092862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnt5h" event={"ID":"3a1f623f-afaf-431c-a665-7edcd01cb754","Type":"ContainerDied","Data":"0b0053fc88832e76a412122b37f29bd01982f85f339c5f94c543049f215715d5"} Jan 20 23:59:08 crc kubenswrapper[5030]: I0120 23:59:08.092926 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wnt5h" Jan 20 23:59:08 crc kubenswrapper[5030]: I0120 23:59:08.092930 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0053fc88832e76a412122b37f29bd01982f85f339c5f94c543049f215715d5" Jan 20 23:59:10 crc kubenswrapper[5030]: I0120 23:59:10.158154 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:59:10 crc kubenswrapper[5030]: I0120 23:59:10.158660 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.974014 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:11 crc kubenswrapper[5030]: E0120 23:59:11.974290 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a1f623f-afaf-431c-a665-7edcd01cb754" containerName="storage" Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.974302 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a1f623f-afaf-431c-a665-7edcd01cb754" containerName="storage" Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.974434 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a1f623f-afaf-431c-a665-7edcd01cb754" containerName="storage" Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.975250 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.975816 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:11 crc kubenswrapper[5030]: I0120 23:59:11.980038 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-ipam" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.137253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.137344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.137428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncb9l\" (UniqueName: \"kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.137488 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.239081 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.239166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.239256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncb9l\" (UniqueName: \"kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.239321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.240551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.240786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.240856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.270863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncb9l\" (UniqueName: \"kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l\") pod \"dnsmasq-dnsmasq-5b68c79f89-wnsl5\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.296164 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:12 crc kubenswrapper[5030]: I0120 23:59:12.815245 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:13 crc kubenswrapper[5030]: I0120 23:59:13.155404 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerID="638be82e423111cbeed082d91812e2250fac99e3e5c24b26e2a1e4bf213ede51" exitCode=0 Jan 20 23:59:13 crc kubenswrapper[5030]: I0120 23:59:13.155721 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" event={"ID":"b6b1cdab-4020-4c85-be49-6bfdf4122247","Type":"ContainerDied","Data":"638be82e423111cbeed082d91812e2250fac99e3e5c24b26e2a1e4bf213ede51"} Jan 20 23:59:13 crc kubenswrapper[5030]: I0120 23:59:13.155893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" event={"ID":"b6b1cdab-4020-4c85-be49-6bfdf4122247","Type":"ContainerStarted","Data":"1faacc239bb4e3035309aa2d414eb86a5af53216290d5e6b7065dee481a9d945"} Jan 20 23:59:14 crc kubenswrapper[5030]: I0120 23:59:14.172247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" event={"ID":"b6b1cdab-4020-4c85-be49-6bfdf4122247","Type":"ContainerStarted","Data":"70afeef35b72b57a9097cdee32da00e21873e7fb2bed588154656478ef5f56d5"} Jan 20 23:59:14 crc kubenswrapper[5030]: I0120 23:59:14.173641 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:14 crc kubenswrapper[5030]: I0120 23:59:14.201093 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" podStartSLOduration=3.2010673 podStartE2EDuration="3.2010673s" podCreationTimestamp="2026-01-20 23:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:59:14.197525795 +0000 UTC m=+5026.517786103" watchObservedRunningTime="2026-01-20 23:59:14.2010673 +0000 UTC m=+5026.521327628" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.297926 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.389469 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.389864 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="dnsmasq-dns" containerID="cri-o://f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4" gracePeriod=10 Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.585272 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.586339 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.618578 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.632593 5030 scope.go:117] "RemoveContainer" containerID="29dabf429be0c0a8f2d8516036afb5c0cb465a9ff45b71f25546ff511319cdb2" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.666013 5030 scope.go:117] "RemoveContainer" containerID="0310625296be6170351f770fc5a8a857728effb7422def465702469c504419c3" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.767686 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.768063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.768099 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.768148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrbl\" (UniqueName: \"kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.839099 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.869386 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrbl\" (UniqueName: \"kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.869455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.869515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.869560 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.870584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.870645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.870675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.895357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrbl\" (UniqueName: \"kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl\") pod \"dnsmasq-dnsmasq-79667f9c49-q7555\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.901641 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.970505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq4pc\" (UniqueName: \"kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc\") pod \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.971190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config\") pod \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " Jan 20 23:59:22 crc kubenswrapper[5030]: I0120 23:59:22.971302 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc\") pod \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\" (UID: \"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0\") " Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.143329 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.277524 5030 generic.go:334] "Generic (PLEG): container finished" podID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerID="f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4" exitCode=0 Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.277671 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.277719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" event={"ID":"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0","Type":"ContainerDied","Data":"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4"} Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.277817 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m" event={"ID":"01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0","Type":"ContainerDied","Data":"71ee938eac7bb8fed7cd685f7ceddd8fc6942ddb5416f7e0149497813ef1f517"} Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.277874 5030 scope.go:117] "RemoveContainer" containerID="f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.343008 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc" (OuterVolumeSpecName: "kube-api-access-wq4pc") pod "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" (UID: "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0"). InnerVolumeSpecName "kube-api-access-wq4pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.366941 5030 scope.go:117] "RemoveContainer" containerID="55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.382323 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq4pc\" (UniqueName: \"kubernetes.io/projected/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-kube-api-access-wq4pc\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.385409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config" (OuterVolumeSpecName: "config") pod "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" (UID: "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.396692 5030 scope.go:117] "RemoveContainer" containerID="f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4" Jan 20 23:59:23 crc kubenswrapper[5030]: E0120 23:59:23.397453 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4\": container with ID starting with f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4 not found: ID does not exist" containerID="f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.397500 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4"} err="failed to get container status \"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4\": rpc error: code = NotFound desc = could not find container \"f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4\": container with ID starting with f57172ba21bcb5ac38aab3ebc6acde6183de18a44c65cf1f131a2aaf083e01c4 not found: ID does not exist" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.397525 5030 scope.go:117] "RemoveContainer" containerID="55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411" Jan 20 23:59:23 crc kubenswrapper[5030]: E0120 23:59:23.397843 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411\": container with ID starting with 55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411 not found: ID does not exist" containerID="55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.397868 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411"} err="failed to get container status \"55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411\": rpc error: code = NotFound desc = could not find container \"55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411\": container with ID starting with 55dbe1052955b7454b5b9d7d0a561274172522f13d73d9d9c0a2009f53aeb411 not found: ID does not exist" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.400265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" (UID: "01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.484504 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.484863 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.698788 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.711246 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8v65m"] Jan 20 23:59:23 crc kubenswrapper[5030]: I0120 23:59:23.972962 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" path="/var/lib/kubelet/pods/01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0/volumes" Jan 20 23:59:24 crc kubenswrapper[5030]: I0120 23:59:24.291057 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerID="c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5" exitCode=0 Jan 20 23:59:24 crc kubenswrapper[5030]: I0120 23:59:24.291171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" event={"ID":"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3","Type":"ContainerDied","Data":"c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5"} Jan 20 23:59:24 crc kubenswrapper[5030]: I0120 23:59:24.291567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" event={"ID":"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3","Type":"ContainerStarted","Data":"9bcf7a30715f81a51f0a596d7248dadf407de309c0df5653c58e45eb032c04b8"} Jan 20 23:59:25 crc kubenswrapper[5030]: I0120 23:59:25.308486 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" event={"ID":"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3","Type":"ContainerStarted","Data":"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d"} Jan 20 23:59:25 crc kubenswrapper[5030]: I0120 23:59:25.308983 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:25 crc kubenswrapper[5030]: I0120 23:59:25.339729 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" podStartSLOduration=3.339710571 podStartE2EDuration="3.339710571s" podCreationTimestamp="2026-01-20 23:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:59:25.33431275 +0000 UTC m=+5037.654573078" watchObservedRunningTime="2026-01-20 23:59:25.339710571 +0000 UTC m=+5037.659970869" Jan 20 23:59:32 crc kubenswrapper[5030]: I0120 23:59:32.902886 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.013157 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.013422 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerName="dnsmasq-dns" containerID="cri-o://70afeef35b72b57a9097cdee32da00e21873e7fb2bed588154656478ef5f56d5" gracePeriod=10 Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.355995 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 20 23:59:33 crc kubenswrapper[5030]: E0120 23:59:33.356504 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="dnsmasq-dns" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.356516 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="dnsmasq-dns" Jan 20 23:59:33 crc kubenswrapper[5030]: E0120 23:59:33.356544 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="init" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.356550 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="init" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.356703 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b86cbd-e0d0-4179-a9ad-28cbb5dbb4f0" containerName="dnsmasq-dns" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.358545 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.366858 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.391578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsstd\" (UniqueName: \"kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.391825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.391894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.403059 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerID="70afeef35b72b57a9097cdee32da00e21873e7fb2bed588154656478ef5f56d5" exitCode=0 Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.403099 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" event={"ID":"b6b1cdab-4020-4c85-be49-6bfdf4122247","Type":"ContainerDied","Data":"70afeef35b72b57a9097cdee32da00e21873e7fb2bed588154656478ef5f56d5"} Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.459948 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493128 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config\") pod \"b6b1cdab-4020-4c85-be49-6bfdf4122247\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc\") pod \"b6b1cdab-4020-4c85-be49-6bfdf4122247\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493274 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncb9l\" (UniqueName: \"kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l\") pod \"b6b1cdab-4020-4c85-be49-6bfdf4122247\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam\") pod \"b6b1cdab-4020-4c85-be49-6bfdf4122247\" (UID: \"b6b1cdab-4020-4c85-be49-6bfdf4122247\") " Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493586 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsstd\" (UniqueName: \"kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.493725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.494525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.495470 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 
23:59:33.511285 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l" (OuterVolumeSpecName: "kube-api-access-ncb9l") pod "b6b1cdab-4020-4c85-be49-6bfdf4122247" (UID: "b6b1cdab-4020-4c85-be49-6bfdf4122247"). InnerVolumeSpecName "kube-api-access-ncb9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.516877 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsstd\" (UniqueName: \"kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd\") pod \"dnsmasq-dnsmasq-84b9f45d47-br6nc\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.533661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b6b1cdab-4020-4c85-be49-6bfdf4122247" (UID: "b6b1cdab-4020-4c85-be49-6bfdf4122247"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.552890 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config" (OuterVolumeSpecName: "config") pod "b6b1cdab-4020-4c85-be49-6bfdf4122247" (UID: "b6b1cdab-4020-4c85-be49-6bfdf4122247"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.554314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b6b1cdab-4020-4c85-be49-6bfdf4122247" (UID: "b6b1cdab-4020-4c85-be49-6bfdf4122247"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.594377 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.594850 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.594924 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncb9l\" (UniqueName: \"kubernetes.io/projected/b6b1cdab-4020-4c85-be49-6bfdf4122247-kube-api-access-ncb9l\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.594989 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b6b1cdab-4020-4c85-be49-6bfdf4122247-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:33 crc kubenswrapper[5030]: I0120 23:59:33.685965 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:34 crc kubenswrapper[5030]: W0120 23:59:34.033241 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00017f62_948a_44df_9f36_e898a374c13a.slice/crio-22dbe0a91b3870a093d2d32c871933ae31ecd3d6a98a05dc52619e984870e49b WatchSource:0}: Error finding container 22dbe0a91b3870a093d2d32c871933ae31ecd3d6a98a05dc52619e984870e49b: Status 404 returned error can't find the container with id 22dbe0a91b3870a093d2d32c871933ae31ecd3d6a98a05dc52619e984870e49b Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.039088 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.417937 5030 generic.go:334] "Generic (PLEG): container finished" podID="00017f62-948a-44df-9f36-e898a374c13a" containerID="044df7453007b283ca5cb478cae487944eec298f9df1cf558ef9d182a2253898" exitCode=0 Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.418089 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" event={"ID":"00017f62-948a-44df-9f36-e898a374c13a","Type":"ContainerDied","Data":"044df7453007b283ca5cb478cae487944eec298f9df1cf558ef9d182a2253898"} Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.418350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" event={"ID":"00017f62-948a-44df-9f36-e898a374c13a","Type":"ContainerStarted","Data":"22dbe0a91b3870a093d2d32c871933ae31ecd3d6a98a05dc52619e984870e49b"} Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.422984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" event={"ID":"b6b1cdab-4020-4c85-be49-6bfdf4122247","Type":"ContainerDied","Data":"1faacc239bb4e3035309aa2d414eb86a5af53216290d5e6b7065dee481a9d945"} Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.423031 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5" Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.423138 5030 scope.go:117] "RemoveContainer" containerID="70afeef35b72b57a9097cdee32da00e21873e7fb2bed588154656478ef5f56d5" Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.494985 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.508565 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5b68c79f89-wnsl5"] Jan 20 23:59:34 crc kubenswrapper[5030]: I0120 23:59:34.581718 5030 scope.go:117] "RemoveContainer" containerID="638be82e423111cbeed082d91812e2250fac99e3e5c24b26e2a1e4bf213ede51" Jan 20 23:59:35 crc kubenswrapper[5030]: I0120 23:59:35.443218 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" event={"ID":"00017f62-948a-44df-9f36-e898a374c13a","Type":"ContainerStarted","Data":"3ac947683c9a10693e536e71d8fcae46de6b4f974744197d92b1efc73d9bd194"} Jan 20 23:59:35 crc kubenswrapper[5030]: I0120 23:59:35.443554 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:35 crc kubenswrapper[5030]: I0120 23:59:35.478270 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" podStartSLOduration=2.478243981 podStartE2EDuration="2.478243981s" podCreationTimestamp="2026-01-20 23:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:59:35.468666808 +0000 UTC m=+5047.788927136" watchObservedRunningTime="2026-01-20 23:59:35.478243981 +0000 UTC m=+5047.798504309" Jan 20 23:59:35 crc kubenswrapper[5030]: I0120 23:59:35.979893 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" path="/var/lib/kubelet/pods/b6b1cdab-4020-4c85-be49-6bfdf4122247/volumes" Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.157170 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.157930 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.158037 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.158951 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.159052 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" gracePeriod=600 Jan 20 23:59:40 crc kubenswrapper[5030]: E0120 23:59:40.369785 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.493377 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" exitCode=0 Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.493477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc"} Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.494903 5030 scope.go:117] "RemoveContainer" containerID="99e97073dbe5bb28b1952568f31004b35204791a913cf21cbb334984821b7ecc" Jan 20 23:59:40 crc kubenswrapper[5030]: I0120 23:59:40.495051 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 20 23:59:40 crc kubenswrapper[5030]: E0120 23:59:40.495441 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:59:41 crc kubenswrapper[5030]: I0120 23:59:41.982828 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wnt5h"] Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:41.986949 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wnt5h"] Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.124060 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wb96v"] Jan 20 23:59:42 crc kubenswrapper[5030]: E0120 23:59:42.126456 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerName="dnsmasq-dns" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.126504 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerName="dnsmasq-dns" Jan 20 23:59:42 crc kubenswrapper[5030]: E0120 23:59:42.126539 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerName="init" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.126556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" 
containerName="init" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.126947 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b1cdab-4020-4c85-be49-6bfdf4122247" containerName="dnsmasq-dns" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.127974 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.131252 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.131555 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.131587 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.131587 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.144231 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wb96v"] Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.195022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.195424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gx7\" (UniqueName: \"kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.195459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.296765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.296961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gx7\" (UniqueName: \"kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.297004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 
23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.297396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.297753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.328434 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gx7\" (UniqueName: \"kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7\") pod \"crc-storage-crc-wb96v\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.461263 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:42 crc kubenswrapper[5030]: I0120 23:59:42.937408 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wb96v"] Jan 20 23:59:43 crc kubenswrapper[5030]: I0120 23:59:43.538668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wb96v" event={"ID":"a356563e-7411-45c9-ae5e-b3fd42a92249","Type":"ContainerStarted","Data":"f2fda0f695132d11a2d010bf11fafbd529ac69d6f5cdb9a4ba720ea76f37f8a2"} Jan 20 23:59:43 crc kubenswrapper[5030]: I0120 23:59:43.687924 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 20 23:59:43 crc kubenswrapper[5030]: I0120 23:59:43.762661 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:43 crc kubenswrapper[5030]: I0120 23:59:43.762937 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="dnsmasq-dns" containerID="cri-o://deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d" gracePeriod=10 Jan 20 23:59:43 crc kubenswrapper[5030]: I0120 23:59:43.978268 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a1f623f-afaf-431c-a665-7edcd01cb754" path="/var/lib/kubelet/pods/3a1f623f-afaf-431c-a665-7edcd01cb754/volumes" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.141172 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.326018 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc\") pod \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.326480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhrbl\" (UniqueName: \"kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl\") pod \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.326565 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config\") pod \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.326652 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam\") pod \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\" (UID: \"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3\") " Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.333118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl" (OuterVolumeSpecName: "kube-api-access-rhrbl") pod "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" (UID: "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3"). InnerVolumeSpecName "kube-api-access-rhrbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.375198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config" (OuterVolumeSpecName: "config") pod "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" (UID: "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.381566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" (UID: "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.389746 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" (UID: "ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.429731 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-config\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.430108 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.430224 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.430333 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhrbl\" (UniqueName: \"kubernetes.io/projected/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3-kube-api-access-rhrbl\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.556596 5030 generic.go:334] "Generic (PLEG): container finished" podID="a356563e-7411-45c9-ae5e-b3fd42a92249" containerID="69bef87377bb66e3b027d614cfd60256b7e6670cebbe8fba722f2a003e9cddcb" exitCode=0 Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.556715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wb96v" event={"ID":"a356563e-7411-45c9-ae5e-b3fd42a92249","Type":"ContainerDied","Data":"69bef87377bb66e3b027d614cfd60256b7e6670cebbe8fba722f2a003e9cddcb"} Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.568149 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerID="deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d" exitCode=0 Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.568248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" event={"ID":"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3","Type":"ContainerDied","Data":"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d"} Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.568308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" event={"ID":"ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3","Type":"ContainerDied","Data":"9bcf7a30715f81a51f0a596d7248dadf407de309c0df5653c58e45eb032c04b8"} Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.568357 5030 scope.go:117] "RemoveContainer" containerID="deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.568470 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.630873 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.633138 5030 scope.go:117] "RemoveContainer" containerID="c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.638729 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79667f9c49-q7555"] Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.656851 5030 scope.go:117] "RemoveContainer" containerID="deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d" Jan 20 23:59:44 crc kubenswrapper[5030]: E0120 23:59:44.657641 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d\": container with ID starting with deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d not found: ID does not exist" containerID="deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.657690 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d"} err="failed to get container status \"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d\": rpc error: code = NotFound desc = could not find container \"deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d\": container with ID starting with deb69a9b16ec09310fc25c6c6b7f57f864856e19dc79974ea23645e4044ed56d not found: ID does not exist" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.657716 5030 scope.go:117] "RemoveContainer" containerID="c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5" Jan 20 23:59:44 crc kubenswrapper[5030]: E0120 23:59:44.658193 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5\": container with ID starting with c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5 not found: ID does not exist" containerID="c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5" Jan 20 23:59:44 crc kubenswrapper[5030]: I0120 23:59:44.658409 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5"} err="failed to get container status \"c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5\": rpc error: code = NotFound desc = could not find container \"c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5\": container with ID starting with c9a1163885b4753c744e0ea3c9d4bdd67063c0d7b7b64edad1cd3801b6804bf5 not found: ID does not exist" Jan 20 23:59:45 crc kubenswrapper[5030]: I0120 23:59:45.974823 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" path="/var/lib/kubelet/pods/ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3/volumes" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.043089 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.055721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt\") pod \"a356563e-7411-45c9-ae5e-b3fd42a92249\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.055808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage\") pod \"a356563e-7411-45c9-ae5e-b3fd42a92249\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.055853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7gx7\" (UniqueName: \"kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7\") pod \"a356563e-7411-45c9-ae5e-b3fd42a92249\" (UID: \"a356563e-7411-45c9-ae5e-b3fd42a92249\") " Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.055857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a356563e-7411-45c9-ae5e-b3fd42a92249" (UID: "a356563e-7411-45c9-ae5e-b3fd42a92249"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.057032 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a356563e-7411-45c9-ae5e-b3fd42a92249-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.063468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7" (OuterVolumeSpecName: "kube-api-access-n7gx7") pod "a356563e-7411-45c9-ae5e-b3fd42a92249" (UID: "a356563e-7411-45c9-ae5e-b3fd42a92249"). InnerVolumeSpecName "kube-api-access-n7gx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.077766 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a356563e-7411-45c9-ae5e-b3fd42a92249" (UID: "a356563e-7411-45c9-ae5e-b3fd42a92249"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.158406 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a356563e-7411-45c9-ae5e-b3fd42a92249-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.158466 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7gx7\" (UniqueName: \"kubernetes.io/projected/a356563e-7411-45c9-ae5e-b3fd42a92249-kube-api-access-n7gx7\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.598798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wb96v" event={"ID":"a356563e-7411-45c9-ae5e-b3fd42a92249","Type":"ContainerDied","Data":"f2fda0f695132d11a2d010bf11fafbd529ac69d6f5cdb9a4ba720ea76f37f8a2"} Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.598847 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wb96v" Jan 20 23:59:46 crc kubenswrapper[5030]: I0120 23:59:46.598860 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2fda0f695132d11a2d010bf11fafbd529ac69d6f5cdb9a4ba720ea76f37f8a2" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.571133 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wb96v"] Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.586898 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wb96v"] Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.718421 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-pfk2b"] Jan 20 23:59:49 crc kubenswrapper[5030]: E0120 23:59:49.719132 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="dnsmasq-dns" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.719169 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="dnsmasq-dns" Jan 20 23:59:49 crc kubenswrapper[5030]: E0120 23:59:49.719220 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a356563e-7411-45c9-ae5e-b3fd42a92249" containerName="storage" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.719234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a356563e-7411-45c9-ae5e-b3fd42a92249" containerName="storage" Jan 20 23:59:49 crc kubenswrapper[5030]: E0120 23:59:49.719277 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="init" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.719290 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="init" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.719538 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef57e7ae-dd57-4b74-b3f6-d53e1fd6fce3" containerName="dnsmasq-dns" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.719574 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a356563e-7411-45c9-ae5e-b3fd42a92249" containerName="storage" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.720483 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.724176 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.724191 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.724336 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.724558 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.732790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pfk2b"] Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.820381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.820680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.820755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltl7f\" (UniqueName: \"kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.922258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.922505 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.922573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltl7f\" (UniqueName: \"kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.922655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " 
pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.924069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.955677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltl7f\" (UniqueName: \"kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f\") pod \"crc-storage-crc-pfk2b\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:49 crc kubenswrapper[5030]: I0120 23:59:49.979093 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a356563e-7411-45c9-ae5e-b3fd42a92249" path="/var/lib/kubelet/pods/a356563e-7411-45c9-ae5e-b3fd42a92249/volumes" Jan 20 23:59:50 crc kubenswrapper[5030]: I0120 23:59:50.057481 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:50 crc kubenswrapper[5030]: I0120 23:59:50.362965 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pfk2b"] Jan 20 23:59:50 crc kubenswrapper[5030]: W0120 23:59:50.371136 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d821564_7a44_40c8_a135_a91c0c60dc7d.slice/crio-98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc WatchSource:0}: Error finding container 98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc: Status 404 returned error can't find the container with id 98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc Jan 20 23:59:50 crc kubenswrapper[5030]: I0120 23:59:50.646214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pfk2b" event={"ID":"6d821564-7a44-40c8-a135-a91c0c60dc7d","Type":"ContainerStarted","Data":"98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc"} Jan 20 23:59:51 crc kubenswrapper[5030]: I0120 23:59:51.661194 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d821564-7a44-40c8-a135-a91c0c60dc7d" containerID="d666523f48cf4aaf05db4dd3f6ba6ca62dc9695f19f4f6ca3c7bc2af9b00b323" exitCode=0 Jan 20 23:59:51 crc kubenswrapper[5030]: I0120 23:59:51.661268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pfk2b" event={"ID":"6d821564-7a44-40c8-a135-a91c0c60dc7d","Type":"ContainerDied","Data":"d666523f48cf4aaf05db4dd3f6ba6ca62dc9695f19f4f6ca3c7bc2af9b00b323"} Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.074521 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.204809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt\") pod \"6d821564-7a44-40c8-a135-a91c0c60dc7d\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.205244 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage\") pod \"6d821564-7a44-40c8-a135-a91c0c60dc7d\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.205070 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6d821564-7a44-40c8-a135-a91c0c60dc7d" (UID: "6d821564-7a44-40c8-a135-a91c0c60dc7d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.205292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltl7f\" (UniqueName: \"kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f\") pod \"6d821564-7a44-40c8-a135-a91c0c60dc7d\" (UID: \"6d821564-7a44-40c8-a135-a91c0c60dc7d\") " Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.205752 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d821564-7a44-40c8-a135-a91c0c60dc7d-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.211952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f" (OuterVolumeSpecName: "kube-api-access-ltl7f") pod "6d821564-7a44-40c8-a135-a91c0c60dc7d" (UID: "6d821564-7a44-40c8-a135-a91c0c60dc7d"). InnerVolumeSpecName "kube-api-access-ltl7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.228928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6d821564-7a44-40c8-a135-a91c0c60dc7d" (UID: "6d821564-7a44-40c8-a135-a91c0c60dc7d"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.307047 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d821564-7a44-40c8-a135-a91c0c60dc7d-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.307083 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltl7f\" (UniqueName: \"kubernetes.io/projected/6d821564-7a44-40c8-a135-a91c0c60dc7d-kube-api-access-ltl7f\") on node \"crc\" DevicePath \"\"" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.692139 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pfk2b" event={"ID":"6d821564-7a44-40c8-a135-a91c0c60dc7d","Type":"ContainerDied","Data":"98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc"} Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.692209 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98669459f045358e70eec3c30ea80e28d10d37a68614707800bea003c94001dc" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.692215 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pfk2b" Jan 20 23:59:53 crc kubenswrapper[5030]: I0120 23:59:53.962450 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 20 23:59:53 crc kubenswrapper[5030]: E0120 23:59:53.962914 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.782927 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 20 23:59:56 crc kubenswrapper[5030]: E0120 23:59:56.783842 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d821564-7a44-40c8-a135-a91c0c60dc7d" containerName="storage" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.783860 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d821564-7a44-40c8-a135-a91c0c60dc7d" containerName="storage" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.784065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d821564-7a44-40c8-a135-a91c0c60dc7d" containerName="storage" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.784938 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.787588 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-global" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.800855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.880988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.881356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wc8m\" (UniqueName: \"kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.881457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.881513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.991502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.991570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wc8m\" (UniqueName: \"kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.991640 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.991671 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" 
(UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.992541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.993245 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:56 crc kubenswrapper[5030]: I0120 23:59:56.993760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.010863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wc8m\" (UniqueName: \"kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m\") pod \"dnsmasq-dnsmasq-7d78464677-7vsfs\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.169525 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.454057 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.646223 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4"] Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.647296 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.655396 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.655544 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.656103 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.656432 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.678184 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4"] Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.702384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tnk\" (UniqueName: \"kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.702584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.702680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.733759 5030 generic.go:334] "Generic (PLEG): container finished" podID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerID="f4bf1cef80dfcdca87ebe48eb0f81d29a5cbf7794edfc098eb7f252a3a065abe" exitCode=0 Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.733794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" event={"ID":"036b9faa-2fd2-4849-b968-d5f5e9918f9a","Type":"ContainerDied","Data":"f4bf1cef80dfcdca87ebe48eb0f81d29a5cbf7794edfc098eb7f252a3a065abe"} Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.733815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" event={"ID":"036b9faa-2fd2-4849-b968-d5f5e9918f9a","Type":"ContainerStarted","Data":"ed35637345f3a2062e1168de02b04afaa10fd19c6485f78a0999cc9086aa9108"} Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.804112 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.804174 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.804239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4tnk\" (UniqueName: \"kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.809700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.809800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:57 crc kubenswrapper[5030]: I0120 23:59:57.821751 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4tnk\" (UniqueName: \"kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk\") pod \"download-cache-edpm-compute-global-edpm-compute-global-zvdh4\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.035528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.551851 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4"] Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.744503 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" event={"ID":"90919d70-facc-4760-8900-d80c9b4ccaff","Type":"ContainerStarted","Data":"32be4685a1331f544b350ca019fa96582826e10954ec0363d6875826b478d2a7"} Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.747495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" event={"ID":"036b9faa-2fd2-4849-b968-d5f5e9918f9a","Type":"ContainerStarted","Data":"e3fdcaef684db49c0c1ca8b951c4379f6cd5b3f350924378cff9de3b6686cac9"} Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.747951 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 20 23:59:58 crc kubenswrapper[5030]: I0120 23:59:58.783034 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" podStartSLOduration=2.782993922 podStartE2EDuration="2.782993922s" podCreationTimestamp="2026-01-20 23:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 23:59:58.770433807 +0000 UTC m=+5071.090694135" watchObservedRunningTime="2026-01-20 23:59:58.782993922 +0000 UTC m=+5071.103254250" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.137677 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn"] Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.139523 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.141963 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29482560-94n7h"] Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.143517 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.144231 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.144408 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.145752 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.146017 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.147993 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn"] Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.153465 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29482560-94n7h"] Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.246050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.246111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7p2s\" (UniqueName: \"kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.246148 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnj2p\" (UniqueName: \"kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.246210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.246250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.347484 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.347837 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.347877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7p2s\" (UniqueName: \"kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.347909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnj2p\" (UniqueName: \"kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.347969 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.348637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.348957 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.366394 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.366653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnj2p\" (UniqueName: \"kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p\") pod \"collect-profiles-29482560-xcgvn\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" 
Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.372130 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7p2s\" (UniqueName: \"kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s\") pod \"image-pruner-29482560-94n7h\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.487024 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.498504 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:00 crc kubenswrapper[5030]: I0121 00:00:00.987125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29482560-94n7h"] Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.030224 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn"] Jan 21 00:00:01 crc kubenswrapper[5030]: W0121 00:00:01.038886 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc290e3b2_94f3_4ba3_bbb0_5c4ec1933edc.slice/crio-7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962 WatchSource:0}: Error finding container 7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962: Status 404 returned error can't find the container with id 7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962 Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.779881 5030 generic.go:334] "Generic (PLEG): container finished" podID="c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" containerID="e11e07bcf23ae86edc15b54aa67d20fe38ea1d2124730c4723e7388e0bd2d17e" exitCode=0 Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.780087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" event={"ID":"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc","Type":"ContainerDied","Data":"e11e07bcf23ae86edc15b54aa67d20fe38ea1d2124730c4723e7388e0bd2d17e"} Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.780271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" event={"ID":"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc","Type":"ContainerStarted","Data":"7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962"} Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.783444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29482560-94n7h" event={"ID":"95838249-c91a-4a20-a829-8ef155894e90","Type":"ContainerStarted","Data":"6fdd7398d585010baca7d2f73ab35a448672e96a018dc797633a3c0191c104c4"} Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.783496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29482560-94n7h" event={"ID":"95838249-c91a-4a20-a829-8ef155894e90","Type":"ContainerStarted","Data":"102c7a7b73d77acb60744a3d0631fe27e84cc6a3e3e460d47db7ed3a9a1c7aa5"} Jan 21 00:00:01 crc kubenswrapper[5030]: I0121 00:00:01.820560 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29482560-94n7h" 
podStartSLOduration=1.820458292 podStartE2EDuration="1.820458292s" podCreationTimestamp="2026-01-21 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:00:01.81090963 +0000 UTC m=+5074.131169918" watchObservedRunningTime="2026-01-21 00:00:01.820458292 +0000 UTC m=+5074.140718580" Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.172876 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.250202 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.250760 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" containerID="cri-o://3ac947683c9a10693e536e71d8fcae46de6b4f974744197d92b1efc73d9bd194" gracePeriod=10 Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.794028 5030 generic.go:334] "Generic (PLEG): container finished" podID="95838249-c91a-4a20-a829-8ef155894e90" containerID="6fdd7398d585010baca7d2f73ab35a448672e96a018dc797633a3c0191c104c4" exitCode=0 Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.794182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29482560-94n7h" event={"ID":"95838249-c91a-4a20-a829-8ef155894e90","Type":"ContainerDied","Data":"6fdd7398d585010baca7d2f73ab35a448672e96a018dc797633a3c0191c104c4"} Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.798848 5030 generic.go:334] "Generic (PLEG): container finished" podID="00017f62-948a-44df-9f36-e898a374c13a" containerID="3ac947683c9a10693e536e71d8fcae46de6b4f974744197d92b1efc73d9bd194" exitCode=0 Jan 21 00:00:02 crc kubenswrapper[5030]: I0121 00:00:02.798917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" event={"ID":"00017f62-948a-44df-9f36-e898a374c13a","Type":"ContainerDied","Data":"3ac947683c9a10693e536e71d8fcae46de6b4f974744197d92b1efc73d9bd194"} Jan 21 00:00:03 crc kubenswrapper[5030]: I0121 00:00:03.687122 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.687464 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.167:5353: connect: connection refused" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.795720 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.802582 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.873132 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" event={"ID":"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc","Type":"ContainerDied","Data":"7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962"} Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.873181 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bb63e2695d0ad76b3ce61df0745c7825bc2767d8edc223af17d50528a7f6962" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.873295 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.878566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29482560-94n7h" event={"ID":"95838249-c91a-4a20-a829-8ef155894e90","Type":"ContainerDied","Data":"102c7a7b73d77acb60744a3d0631fe27e84cc6a3e3e460d47db7ed3a9a1c7aa5"} Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.878647 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102c7a7b73d77acb60744a3d0631fe27e84cc6a3e3e460d47db7ed3a9a1c7aa5" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.878739 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29482560-94n7h" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.901661 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca\") pod \"95838249-c91a-4a20-a829-8ef155894e90\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.901700 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume\") pod \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.901872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnj2p\" (UniqueName: \"kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p\") pod \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.901904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7p2s\" (UniqueName: \"kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s\") pod \"95838249-c91a-4a20-a829-8ef155894e90\" (UID: \"95838249-c91a-4a20-a829-8ef155894e90\") " Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.901932 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume\") pod \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\" (UID: \"c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc\") " Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.902686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca" (OuterVolumeSpecName: "serviceca") pod "95838249-c91a-4a20-a829-8ef155894e90" (UID: "95838249-c91a-4a20-a829-8ef155894e90"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.903229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume" (OuterVolumeSpecName: "config-volume") pod "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" (UID: "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.905585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s" (OuterVolumeSpecName: "kube-api-access-w7p2s") pod "95838249-c91a-4a20-a829-8ef155894e90" (UID: "95838249-c91a-4a20-a829-8ef155894e90"). InnerVolumeSpecName "kube-api-access-w7p2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.906843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" (UID: "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.909801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p" (OuterVolumeSpecName: "kube-api-access-jnj2p") pod "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" (UID: "c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc"). InnerVolumeSpecName "kube-api-access-jnj2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:08 crc kubenswrapper[5030]: I0121 00:00:08.961743 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:00:08 crc kubenswrapper[5030]: E0121 00:00:08.962351 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.003789 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnj2p\" (UniqueName: \"kubernetes.io/projected/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-kube-api-access-jnj2p\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.003834 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7p2s\" (UniqueName: \"kubernetes.io/projected/95838249-c91a-4a20-a829-8ef155894e90-kube-api-access-w7p2s\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.003857 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.003875 5030 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/95838249-c91a-4a20-a829-8ef155894e90-serviceca\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.003893 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.006542 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.104954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsstd\" (UniqueName: \"kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd\") pod \"00017f62-948a-44df-9f36-e898a374c13a\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.105025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc\") pod \"00017f62-948a-44df-9f36-e898a374c13a\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.105098 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config\") pod \"00017f62-948a-44df-9f36-e898a374c13a\" (UID: \"00017f62-948a-44df-9f36-e898a374c13a\") " Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.110879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd" (OuterVolumeSpecName: "kube-api-access-nsstd") pod "00017f62-948a-44df-9f36-e898a374c13a" (UID: "00017f62-948a-44df-9f36-e898a374c13a"). InnerVolumeSpecName "kube-api-access-nsstd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.154839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "00017f62-948a-44df-9f36-e898a374c13a" (UID: "00017f62-948a-44df-9f36-e898a374c13a"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.160715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config" (OuterVolumeSpecName: "config") pod "00017f62-948a-44df-9f36-e898a374c13a" (UID: "00017f62-948a-44df-9f36-e898a374c13a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.206524 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.206573 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsstd\" (UniqueName: \"kubernetes.io/projected/00017f62-948a-44df-9f36-e898a374c13a-kube-api-access-nsstd\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.206587 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/00017f62-948a-44df-9f36-e898a374c13a-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.894902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" event={"ID":"90919d70-facc-4760-8900-d80c9b4ccaff","Type":"ContainerStarted","Data":"e81d8bde96ffa60ced095539bf8194443e1a9eaae0fb5f78fda9c45b6aaf0f69"} Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.905045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" event={"ID":"00017f62-948a-44df-9f36-e898a374c13a","Type":"ContainerDied","Data":"22dbe0a91b3870a093d2d32c871933ae31ecd3d6a98a05dc52619e984870e49b"} Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.905129 5030 scope.go:117] "RemoveContainer" containerID="3ac947683c9a10693e536e71d8fcae46de6b4f974744197d92b1efc73d9bd194" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.905187 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.931696 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt"] Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.950387 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482515-6kzdt"] Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.955705 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" podStartSLOduration=2.7189189860000003 podStartE2EDuration="12.955683577s" podCreationTimestamp="2026-01-20 23:59:57 +0000 UTC" firstStartedPulling="2026-01-20 23:59:58.565195051 +0000 UTC m=+5070.885455339" lastFinishedPulling="2026-01-21 00:00:08.801959632 +0000 UTC m=+5081.122219930" observedRunningTime="2026-01-21 00:00:09.9314546 +0000 UTC m=+5082.251714888" watchObservedRunningTime="2026-01-21 00:00:09.955683577 +0000 UTC m=+5082.275943865" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.959609 5030 scope.go:117] "RemoveContainer" containerID="044df7453007b283ca5cb478cae487944eec298f9df1cf558ef9d182a2253898" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.979780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35630068-6911-4cd5-93bb-7d81bd6a1e72" path="/var/lib/kubelet/pods/35630068-6911-4cd5-93bb-7d81bd6a1e72/volumes" Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.980788 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 21 00:00:09 crc kubenswrapper[5030]: I0121 00:00:09.980813 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-br6nc"] Jan 21 00:00:10 crc kubenswrapper[5030]: I0121 00:00:10.920035 5030 generic.go:334] "Generic (PLEG): container finished" podID="90919d70-facc-4760-8900-d80c9b4ccaff" containerID="e81d8bde96ffa60ced095539bf8194443e1a9eaae0fb5f78fda9c45b6aaf0f69" exitCode=0 Jan 21 00:00:10 crc kubenswrapper[5030]: I0121 00:00:10.920144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" event={"ID":"90919d70-facc-4760-8900-d80c9b4ccaff","Type":"ContainerDied","Data":"e81d8bde96ffa60ced095539bf8194443e1a9eaae0fb5f78fda9c45b6aaf0f69"} Jan 21 00:00:11 crc kubenswrapper[5030]: I0121 00:00:11.975423 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00017f62-948a-44df-9f36-e898a374c13a" path="/var/lib/kubelet/pods/00017f62-948a-44df-9f36-e898a374c13a/volumes" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.336433 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.386554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4tnk\" (UniqueName: \"kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk\") pod \"90919d70-facc-4760-8900-d80c9b4ccaff\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.386645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory\") pod \"90919d70-facc-4760-8900-d80c9b4ccaff\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.391487 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk" (OuterVolumeSpecName: "kube-api-access-v4tnk") pod "90919d70-facc-4760-8900-d80c9b4ccaff" (UID: "90919d70-facc-4760-8900-d80c9b4ccaff"). InnerVolumeSpecName "kube-api-access-v4tnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.411113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory" (OuterVolumeSpecName: "inventory") pod "90919d70-facc-4760-8900-d80c9b4ccaff" (UID: "90919d70-facc-4760-8900-d80c9b4ccaff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.488206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global\") pod \"90919d70-facc-4760-8900-d80c9b4ccaff\" (UID: \"90919d70-facc-4760-8900-d80c9b4ccaff\") " Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.489125 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.489237 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4tnk\" (UniqueName: \"kubernetes.io/projected/90919d70-facc-4760-8900-d80c9b4ccaff-kube-api-access-v4tnk\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.511118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "90919d70-facc-4760-8900-d80c9b4ccaff" (UID: "90919d70-facc-4760-8900-d80c9b4ccaff"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.590591 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/90919d70-facc-4760-8900-d80c9b4ccaff-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.950818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" event={"ID":"90919d70-facc-4760-8900-d80c9b4ccaff","Type":"ContainerDied","Data":"32be4685a1331f544b350ca019fa96582826e10954ec0363d6875826b478d2a7"} Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.950883 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32be4685a1331f544b350ca019fa96582826e10954ec0363d6875826b478d2a7" Jan 21 00:00:12 crc kubenswrapper[5030]: I0121 00:00:12.950893 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.047870 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8"] Jan 21 00:00:13 crc kubenswrapper[5030]: E0121 00:00:13.048479 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="init" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048492 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="init" Jan 21 00:00:13 crc kubenswrapper[5030]: E0121 00:00:13.048509 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048515 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" Jan 21 00:00:13 crc kubenswrapper[5030]: E0121 00:00:13.048528 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" containerName="collect-profiles" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048534 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" containerName="collect-profiles" Jan 21 00:00:13 crc kubenswrapper[5030]: E0121 00:00:13.048545 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90919d70-facc-4760-8900-d80c9b4ccaff" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048552 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="90919d70-facc-4760-8900-d80c9b4ccaff" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 00:00:13 crc kubenswrapper[5030]: E0121 00:00:13.048564 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95838249-c91a-4a20-a829-8ef155894e90" containerName="image-pruner" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048570 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95838249-c91a-4a20-a829-8ef155894e90" containerName="image-pruner" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048705 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95838249-c91a-4a20-a829-8ef155894e90" 
containerName="image-pruner" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048724 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="00017f62-948a-44df-9f36-e898a374c13a" containerName="dnsmasq-dns" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048734 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" containerName="collect-profiles" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.048746 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="90919d70-facc-4760-8900-d80c9b4ccaff" containerName="download-cache-edpm-compute-global-edpm-compute-global" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.049193 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.053942 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.054147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.054337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.054418 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.054684 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.064326 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8"] Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.201961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.202273 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jsc\" (UniqueName: \"kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.202354 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc 
kubenswrapper[5030]: I0121 00:00:13.202427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.303213 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.303267 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jsc\" (UniqueName: \"kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.303289 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.303312 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.315342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.315568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.315730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" 
(UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.319675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jsc\" (UniqueName: \"kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc\") pod \"bootstrap-edpm-compute-global-edpm-compute-global-qxcs8\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.366296 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.810207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8"] Jan 21 00:00:13 crc kubenswrapper[5030]: W0121 00:00:13.814744 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1be57b93_2657_4128_869a_db7cc3610e32.slice/crio-a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99 WatchSource:0}: Error finding container a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99: Status 404 returned error can't find the container with id a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99 Jan 21 00:00:13 crc kubenswrapper[5030]: I0121 00:00:13.977198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" event={"ID":"1be57b93-2657-4128-869a-db7cc3610e32","Type":"ContainerStarted","Data":"a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99"} Jan 21 00:00:14 crc kubenswrapper[5030]: I0121 00:00:14.977434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" event={"ID":"1be57b93-2657-4128-869a-db7cc3610e32","Type":"ContainerStarted","Data":"5f6a7f78da319e388f0437ef2beeaa421aa70e851f691b2b4f03d29707c2bb80"} Jan 21 00:00:15 crc kubenswrapper[5030]: I0121 00:00:15.001194 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" podStartSLOduration=1.145945247 podStartE2EDuration="2.001165734s" podCreationTimestamp="2026-01-21 00:00:13 +0000 UTC" firstStartedPulling="2026-01-21 00:00:13.820012305 +0000 UTC m=+5086.140272633" lastFinishedPulling="2026-01-21 00:00:14.675232792 +0000 UTC m=+5086.995493120" observedRunningTime="2026-01-21 00:00:14.993886468 +0000 UTC m=+5087.314146766" watchObservedRunningTime="2026-01-21 00:00:15.001165734 +0000 UTC m=+5087.321426062" Jan 21 00:00:17 crc kubenswrapper[5030]: I0121 00:00:17.046974 5030 generic.go:334] "Generic (PLEG): container finished" podID="1be57b93-2657-4128-869a-db7cc3610e32" containerID="5f6a7f78da319e388f0437ef2beeaa421aa70e851f691b2b4f03d29707c2bb80" exitCode=0 Jan 21 00:00:17 crc kubenswrapper[5030]: I0121 00:00:17.047281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" event={"ID":"1be57b93-2657-4128-869a-db7cc3610e32","Type":"ContainerDied","Data":"5f6a7f78da319e388f0437ef2beeaa421aa70e851f691b2b4f03d29707c2bb80"} Jan 21 
00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.505883 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.588668 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory\") pod \"1be57b93-2657-4128-869a-db7cc3610e32\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.588723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46jsc\" (UniqueName: \"kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc\") pod \"1be57b93-2657-4128-869a-db7cc3610e32\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.588749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global\") pod \"1be57b93-2657-4128-869a-db7cc3610e32\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.588786 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle\") pod \"1be57b93-2657-4128-869a-db7cc3610e32\" (UID: \"1be57b93-2657-4128-869a-db7cc3610e32\") " Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.600889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1be57b93-2657-4128-869a-db7cc3610e32" (UID: "1be57b93-2657-4128-869a-db7cc3610e32"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.600928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc" (OuterVolumeSpecName: "kube-api-access-46jsc") pod "1be57b93-2657-4128-869a-db7cc3610e32" (UID: "1be57b93-2657-4128-869a-db7cc3610e32"). InnerVolumeSpecName "kube-api-access-46jsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.610170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "1be57b93-2657-4128-869a-db7cc3610e32" (UID: "1be57b93-2657-4128-869a-db7cc3610e32"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.622388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory" (OuterVolumeSpecName: "inventory") pod "1be57b93-2657-4128-869a-db7cc3610e32" (UID: "1be57b93-2657-4128-869a-db7cc3610e32"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.690196 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.690231 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46jsc\" (UniqueName: \"kubernetes.io/projected/1be57b93-2657-4128-869a-db7cc3610e32-kube-api-access-46jsc\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.690241 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:18 crc kubenswrapper[5030]: I0121 00:00:18.690251 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1be57b93-2657-4128-869a-db7cc3610e32-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.070918 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" event={"ID":"1be57b93-2657-4128-869a-db7cc3610e32","Type":"ContainerDied","Data":"a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99"} Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.070994 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d91a19a9a8361838973666ec77c52e3a218041ad0903c7962af598131e5d99" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.071095 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.169951 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq"] Jan 21 00:00:19 crc kubenswrapper[5030]: E0121 00:00:19.170463 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1be57b93-2657-4128-869a-db7cc3610e32" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.170495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1be57b93-2657-4128-869a-db7cc3610e32" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.170795 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1be57b93-2657-4128-869a-db7cc3610e32" containerName="bootstrap-edpm-compute-global-edpm-compute-global" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.171613 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.175303 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.176879 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq"] Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.184590 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.184800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.185437 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.317334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.317586 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbsw\" (UniqueName: \"kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.317816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.419036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbsw\" (UniqueName: \"kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.419147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " 
pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.419245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.431722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.431774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.440931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbsw\" (UniqueName: \"kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw\") pod \"configure-network-edpm-compute-global-edpm-compute-global-plvdq\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.533903 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:19 crc kubenswrapper[5030]: I0121 00:00:19.755106 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq"] Jan 21 00:00:19 crc kubenswrapper[5030]: W0121 00:00:19.760406 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cbad952_81f9_4579_9b1f_23b2883cf1e4.slice/crio-4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b WatchSource:0}: Error finding container 4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b: Status 404 returned error can't find the container with id 4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b Jan 21 00:00:20 crc kubenswrapper[5030]: I0121 00:00:20.084317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" event={"ID":"0cbad952-81f9-4579-9b1f-23b2883cf1e4","Type":"ContainerStarted","Data":"4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b"} Jan 21 00:00:21 crc kubenswrapper[5030]: I0121 00:00:21.096031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" event={"ID":"0cbad952-81f9-4579-9b1f-23b2883cf1e4","Type":"ContainerStarted","Data":"335c0b8dbb4d6d5eeeb22eb1e0e0d95f51b996152234b3f644a2665b4d2fa0ff"} Jan 21 00:00:21 crc kubenswrapper[5030]: I0121 00:00:21.127662 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" podStartSLOduration=1.4308583590000001 podStartE2EDuration="2.127610203s" podCreationTimestamp="2026-01-21 00:00:19 +0000 UTC" firstStartedPulling="2026-01-21 00:00:19.763474427 +0000 UTC m=+5092.083734735" lastFinishedPulling="2026-01-21 00:00:20.460226251 +0000 UTC m=+5092.780486579" observedRunningTime="2026-01-21 00:00:21.127399968 +0000 UTC m=+5093.447660316" watchObservedRunningTime="2026-01-21 00:00:21.127610203 +0000 UTC m=+5093.447870531" Jan 21 00:00:21 crc kubenswrapper[5030]: I0121 00:00:21.962369 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:00:21 crc kubenswrapper[5030]: E0121 00:00:21.962963 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:00:22 crc kubenswrapper[5030]: I0121 00:00:22.110659 5030 generic.go:334] "Generic (PLEG): container finished" podID="0cbad952-81f9-4579-9b1f-23b2883cf1e4" containerID="335c0b8dbb4d6d5eeeb22eb1e0e0d95f51b996152234b3f644a2665b4d2fa0ff" exitCode=0 Jan 21 00:00:22 crc kubenswrapper[5030]: I0121 00:00:22.110716 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" 
event={"ID":"0cbad952-81f9-4579-9b1f-23b2883cf1e4","Type":"ContainerDied","Data":"335c0b8dbb4d6d5eeeb22eb1e0e0d95f51b996152234b3f644a2665b4d2fa0ff"} Jan 21 00:00:22 crc kubenswrapper[5030]: I0121 00:00:22.926963 5030 scope.go:117] "RemoveContainer" containerID="d506b46d4826050d78b8587c50046a77fb0b06b3be11d10cf58e4bb8089509c7" Jan 21 00:00:22 crc kubenswrapper[5030]: I0121 00:00:22.966679 5030 scope.go:117] "RemoveContainer" containerID="5b1afaba37cd88267c0434830d481b7db9a7f6660005d0a8a40dae36c79fe289" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.047091 5030 scope.go:117] "RemoveContainer" containerID="82fd2d1b87ea50bd1f70fa929f9a1b463798cdb4ceab4e71e9fb439030a3b2d8" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.079542 5030 scope.go:117] "RemoveContainer" containerID="8a9629589aba37ec1cf676a565b634a2dbb74ba8259a834b9d096792ba716cc6" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.124303 5030 scope.go:117] "RemoveContainer" containerID="61b2fe97e12ea72a594b4c383d60012eaefea9fd8d2a5b9bebe2e3f1e187f5aa" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.165056 5030 scope.go:117] "RemoveContainer" containerID="987b60ef6a05a596a6a79190d778511056918354b3a30c0d6162c75906330368" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.215498 5030 scope.go:117] "RemoveContainer" containerID="9ee24ed4cf32f69dd0f0b0150b80f0e1548fae2992047a77d4538ce351224327" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.257130 5030 scope.go:117] "RemoveContainer" containerID="4b48160c13eeb727a67fb21b66457065131b176d3905049f3a13e3528f25f470" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.287912 5030 scope.go:117] "RemoveContainer" containerID="408858c7b76bc42f0d3a71d267c335c5c087f74c9039d5854354217999c10a68" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.327176 5030 scope.go:117] "RemoveContainer" containerID="cae7101fe36c5b5ad5d91e88266b7c244153c07641129413c400f25ad1076188" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.360685 5030 scope.go:117] "RemoveContainer" containerID="e2808049c7a50fc3f17f6bc197d8b642e6960f22f7752674054c0cb0f66685ef" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.412725 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.528500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global\") pod \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.528725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory\") pod \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.528819 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlbsw\" (UniqueName: \"kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw\") pod \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\" (UID: \"0cbad952-81f9-4579-9b1f-23b2883cf1e4\") " Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.533263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw" (OuterVolumeSpecName: "kube-api-access-qlbsw") pod "0cbad952-81f9-4579-9b1f-23b2883cf1e4" (UID: "0cbad952-81f9-4579-9b1f-23b2883cf1e4"). InnerVolumeSpecName "kube-api-access-qlbsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.557840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "0cbad952-81f9-4579-9b1f-23b2883cf1e4" (UID: "0cbad952-81f9-4579-9b1f-23b2883cf1e4"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.560048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory" (OuterVolumeSpecName: "inventory") pod "0cbad952-81f9-4579-9b1f-23b2883cf1e4" (UID: "0cbad952-81f9-4579-9b1f-23b2883cf1e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.630448 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.630484 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cbad952-81f9-4579-9b1f-23b2883cf1e4-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:23 crc kubenswrapper[5030]: I0121 00:00:23.630503 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlbsw\" (UniqueName: \"kubernetes.io/projected/0cbad952-81f9-4579-9b1f-23b2883cf1e4-kube-api-access-qlbsw\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.163784 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" event={"ID":"0cbad952-81f9-4579-9b1f-23b2883cf1e4","Type":"ContainerDied","Data":"4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b"} Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.163860 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e105156217ad5f4b2986c4e78c434786ae1356322a35d3603030923c2cca18b" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.164043 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.218038 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m"] Jan 21 00:00:24 crc kubenswrapper[5030]: E0121 00:00:24.219399 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbad952-81f9-4579-9b1f-23b2883cf1e4" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.219431 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbad952-81f9-4579-9b1f-23b2883cf1e4" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.219639 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbad952-81f9-4579-9b1f-23b2883cf1e4" containerName="configure-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.220362 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.222490 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.223447 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.223738 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.226315 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.240119 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m"] Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.340509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.340590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg4qz\" (UniqueName: \"kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.340673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.441717 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.442004 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg4qz\" (UniqueName: \"kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " 
pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.442071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.450944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.451227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.469373 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg4qz\" (UniqueName: \"kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz\") pod \"validate-network-edpm-compute-global-edpm-compute-global-hwd7m\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:24 crc kubenswrapper[5030]: I0121 00:00:24.540669 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:25 crc kubenswrapper[5030]: I0121 00:00:25.084099 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m"] Jan 21 00:00:25 crc kubenswrapper[5030]: W0121 00:00:25.091774 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a9f23b_1a46_4a43_8803_f671c298a379.slice/crio-025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f WatchSource:0}: Error finding container 025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f: Status 404 returned error can't find the container with id 025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f Jan 21 00:00:25 crc kubenswrapper[5030]: I0121 00:00:25.174513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" event={"ID":"49a9f23b-1a46-4a43-8803-f671c298a379","Type":"ContainerStarted","Data":"025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f"} Jan 21 00:00:26 crc kubenswrapper[5030]: I0121 00:00:26.188457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" event={"ID":"49a9f23b-1a46-4a43-8803-f671c298a379","Type":"ContainerStarted","Data":"bfdb85a1a852ba6ed22c70fb37d83478902cda9decc9e21cbe1f0ab2acd7e84b"} Jan 21 00:00:26 crc kubenswrapper[5030]: I0121 00:00:26.210312 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" podStartSLOduration=1.6657762699999998 podStartE2EDuration="2.210293913s" podCreationTimestamp="2026-01-21 00:00:24 +0000 UTC" firstStartedPulling="2026-01-21 00:00:25.09581936 +0000 UTC m=+5097.416079638" lastFinishedPulling="2026-01-21 00:00:25.640336953 +0000 UTC m=+5097.960597281" observedRunningTime="2026-01-21 00:00:26.207340181 +0000 UTC m=+5098.527600469" watchObservedRunningTime="2026-01-21 00:00:26.210293913 +0000 UTC m=+5098.530554211" Jan 21 00:00:27 crc kubenswrapper[5030]: I0121 00:00:27.201580 5030 generic.go:334] "Generic (PLEG): container finished" podID="49a9f23b-1a46-4a43-8803-f671c298a379" containerID="bfdb85a1a852ba6ed22c70fb37d83478902cda9decc9e21cbe1f0ab2acd7e84b" exitCode=0 Jan 21 00:00:27 crc kubenswrapper[5030]: I0121 00:00:27.201708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" event={"ID":"49a9f23b-1a46-4a43-8803-f671c298a379","Type":"ContainerDied","Data":"bfdb85a1a852ba6ed22c70fb37d83478902cda9decc9e21cbe1f0ab2acd7e84b"} Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.618663 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.723499 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global\") pod \"49a9f23b-1a46-4a43-8803-f671c298a379\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.723645 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg4qz\" (UniqueName: \"kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz\") pod \"49a9f23b-1a46-4a43-8803-f671c298a379\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.723739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory\") pod \"49a9f23b-1a46-4a43-8803-f671c298a379\" (UID: \"49a9f23b-1a46-4a43-8803-f671c298a379\") " Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.729808 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz" (OuterVolumeSpecName: "kube-api-access-bg4qz") pod "49a9f23b-1a46-4a43-8803-f671c298a379" (UID: "49a9f23b-1a46-4a43-8803-f671c298a379"). InnerVolumeSpecName "kube-api-access-bg4qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.750366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory" (OuterVolumeSpecName: "inventory") pod "49a9f23b-1a46-4a43-8803-f671c298a379" (UID: "49a9f23b-1a46-4a43-8803-f671c298a379"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.763788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "49a9f23b-1a46-4a43-8803-f671c298a379" (UID: "49a9f23b-1a46-4a43-8803-f671c298a379"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.826127 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.826181 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/49a9f23b-1a46-4a43-8803-f671c298a379-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:28 crc kubenswrapper[5030]: I0121 00:00:28.826206 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg4qz\" (UniqueName: \"kubernetes.io/projected/49a9f23b-1a46-4a43-8803-f671c298a379-kube-api-access-bg4qz\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.225464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" event={"ID":"49a9f23b-1a46-4a43-8803-f671c298a379","Type":"ContainerDied","Data":"025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f"} Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.225502 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="025bc394ea94b8f9b9f19d9b7b76ca234a18c071b5effa6cffe37ee55038a68f" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.225567 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.307037 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7"] Jan 21 00:00:29 crc kubenswrapper[5030]: E0121 00:00:29.307354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a9f23b-1a46-4a43-8803-f671c298a379" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.307376 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a9f23b-1a46-4a43-8803-f671c298a379" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.307571 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a9f23b-1a46-4a43-8803-f671c298a379" containerName="validate-network-edpm-compute-global-edpm-compute-global" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.308131 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.310787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.311015 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.311742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.311906 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.332375 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7"] Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.433820 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.433934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4s9\" (UniqueName: \"kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.433996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.535424 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4s9\" (UniqueName: \"kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.535605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.535755 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.541475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.549425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.558700 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4s9\" (UniqueName: \"kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9\") pod \"install-os-edpm-compute-global-edpm-compute-global-chzt7\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.627609 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:29 crc kubenswrapper[5030]: I0121 00:00:29.991393 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7"] Jan 21 00:00:30 crc kubenswrapper[5030]: W0121 00:00:30.145910 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6d515a9_aa02_4b8c_993a_11ae9267ff80.slice/crio-1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5 WatchSource:0}: Error finding container 1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5: Status 404 returned error can't find the container with id 1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5 Jan 21 00:00:30 crc kubenswrapper[5030]: I0121 00:00:30.236190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" event={"ID":"d6d515a9-aa02-4b8c-993a-11ae9267ff80","Type":"ContainerStarted","Data":"1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5"} Jan 21 00:00:31 crc kubenswrapper[5030]: I0121 00:00:31.248400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" event={"ID":"d6d515a9-aa02-4b8c-993a-11ae9267ff80","Type":"ContainerStarted","Data":"3230d8b8fb830fbba2e2a906bcdb7f0fbd36e2d48ed70c1a2b75c947e7349a06"} Jan 21 00:00:31 crc kubenswrapper[5030]: I0121 00:00:31.271880 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" podStartSLOduration=1.817126794 podStartE2EDuration="2.271867121s" podCreationTimestamp="2026-01-21 00:00:29 +0000 UTC" firstStartedPulling="2026-01-21 00:00:30.149020495 +0000 UTC m=+5102.469280783" lastFinishedPulling="2026-01-21 00:00:30.603760782 +0000 UTC m=+5102.924021110" observedRunningTime="2026-01-21 00:00:31.26893099 +0000 UTC m=+5103.589191278" watchObservedRunningTime="2026-01-21 00:00:31.271867121 +0000 UTC m=+5103.592127399" Jan 21 00:00:32 crc kubenswrapper[5030]: I0121 00:00:32.263344 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6d515a9-aa02-4b8c-993a-11ae9267ff80" containerID="3230d8b8fb830fbba2e2a906bcdb7f0fbd36e2d48ed70c1a2b75c947e7349a06" exitCode=0 Jan 21 00:00:32 crc kubenswrapper[5030]: I0121 00:00:32.263426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" event={"ID":"d6d515a9-aa02-4b8c-993a-11ae9267ff80","Type":"ContainerDied","Data":"3230d8b8fb830fbba2e2a906bcdb7f0fbd36e2d48ed70c1a2b75c947e7349a06"} Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.583897 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.698651 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4s9\" (UniqueName: \"kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9\") pod \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.698899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory\") pod \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.698953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global\") pod \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\" (UID: \"d6d515a9-aa02-4b8c-993a-11ae9267ff80\") " Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.706023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9" (OuterVolumeSpecName: "kube-api-access-dx4s9") pod "d6d515a9-aa02-4b8c-993a-11ae9267ff80" (UID: "d6d515a9-aa02-4b8c-993a-11ae9267ff80"). InnerVolumeSpecName "kube-api-access-dx4s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.728239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "d6d515a9-aa02-4b8c-993a-11ae9267ff80" (UID: "d6d515a9-aa02-4b8c-993a-11ae9267ff80"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.729365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory" (OuterVolumeSpecName: "inventory") pod "d6d515a9-aa02-4b8c-993a-11ae9267ff80" (UID: "d6d515a9-aa02-4b8c-993a-11ae9267ff80"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.800301 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.800332 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d515a9-aa02-4b8c-993a-11ae9267ff80-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:33 crc kubenswrapper[5030]: I0121 00:00:33.800348 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4s9\" (UniqueName: \"kubernetes.io/projected/d6d515a9-aa02-4b8c-993a-11ae9267ff80-kube-api-access-dx4s9\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.286357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" event={"ID":"d6d515a9-aa02-4b8c-993a-11ae9267ff80","Type":"ContainerDied","Data":"1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5"} Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.286400 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1afe87105f42778efc13a6db5bfb60ddf3448e8c33164522f828a0ec1b62e4d5" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.286937 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.373590 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w"] Jan 21 00:00:34 crc kubenswrapper[5030]: E0121 00:00:34.374492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d515a9-aa02-4b8c-993a-11ae9267ff80" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.374747 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d515a9-aa02-4b8c-993a-11ae9267ff80" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.375192 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d515a9-aa02-4b8c-993a-11ae9267ff80" containerName="install-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.376065 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.388916 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w"] Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.394729 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.395051 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.395292 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.395888 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.408448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgbh\" (UniqueName: \"kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.408667 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.408814 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.510431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgbh\" (UniqueName: \"kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.510499 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.510522 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.520645 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.527100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.542448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgbh\" (UniqueName: \"kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh\") pod \"configure-os-edpm-compute-global-edpm-compute-global-hmj6w\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:34 crc kubenswrapper[5030]: I0121 00:00:34.733573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:35 crc kubenswrapper[5030]: I0121 00:00:35.165279 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w"] Jan 21 00:00:35 crc kubenswrapper[5030]: I0121 00:00:35.297737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" event={"ID":"3eb3e7e1-261f-4a77-b84b-c4a444bd032e","Type":"ContainerStarted","Data":"8a01ca27af52b2f4d46342851d29c373a6b63049e98376ebe2b9dbabbc0ebfb4"} Jan 21 00:00:36 crc kubenswrapper[5030]: I0121 00:00:36.311770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" event={"ID":"3eb3e7e1-261f-4a77-b84b-c4a444bd032e","Type":"ContainerStarted","Data":"09a5e577660ef1eb54d2f816874bbc7c00dfc4021b3c3fbcaac1ed3eba815132"} Jan 21 00:00:36 crc kubenswrapper[5030]: I0121 00:00:36.338159 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" podStartSLOduration=1.684630778 podStartE2EDuration="2.338131254s" podCreationTimestamp="2026-01-21 00:00:34 +0000 UTC" firstStartedPulling="2026-01-21 00:00:35.173664358 +0000 UTC m=+5107.493924696" lastFinishedPulling="2026-01-21 00:00:35.827164874 +0000 UTC m=+5108.147425172" observedRunningTime="2026-01-21 00:00:36.326314397 +0000 UTC m=+5108.646574765" watchObservedRunningTime="2026-01-21 00:00:36.338131254 +0000 UTC m=+5108.658391582" Jan 21 00:00:36 crc kubenswrapper[5030]: I0121 00:00:36.962324 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:00:36 crc kubenswrapper[5030]: E0121 00:00:36.962651 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:00:38 crc kubenswrapper[5030]: I0121 00:00:38.335448 5030 generic.go:334] "Generic (PLEG): container finished" podID="3eb3e7e1-261f-4a77-b84b-c4a444bd032e" containerID="09a5e577660ef1eb54d2f816874bbc7c00dfc4021b3c3fbcaac1ed3eba815132" exitCode=0 Jan 21 00:00:38 crc kubenswrapper[5030]: I0121 00:00:38.335511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" event={"ID":"3eb3e7e1-261f-4a77-b84b-c4a444bd032e","Type":"ContainerDied","Data":"09a5e577660ef1eb54d2f816874bbc7c00dfc4021b3c3fbcaac1ed3eba815132"} Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.664821 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.699435 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cgbh\" (UniqueName: \"kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh\") pod \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.699574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global\") pod \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.699611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory\") pod \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\" (UID: \"3eb3e7e1-261f-4a77-b84b-c4a444bd032e\") " Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.712703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh" (OuterVolumeSpecName: "kube-api-access-6cgbh") pod "3eb3e7e1-261f-4a77-b84b-c4a444bd032e" (UID: "3eb3e7e1-261f-4a77-b84b-c4a444bd032e"). InnerVolumeSpecName "kube-api-access-6cgbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.732853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "3eb3e7e1-261f-4a77-b84b-c4a444bd032e" (UID: "3eb3e7e1-261f-4a77-b84b-c4a444bd032e"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.732987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory" (OuterVolumeSpecName: "inventory") pod "3eb3e7e1-261f-4a77-b84b-c4a444bd032e" (UID: "3eb3e7e1-261f-4a77-b84b-c4a444bd032e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.801665 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cgbh\" (UniqueName: \"kubernetes.io/projected/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-kube-api-access-6cgbh\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.801731 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:39 crc kubenswrapper[5030]: I0121 00:00:39.801747 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb3e7e1-261f-4a77-b84b-c4a444bd032e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.377157 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" event={"ID":"3eb3e7e1-261f-4a77-b84b-c4a444bd032e","Type":"ContainerDied","Data":"8a01ca27af52b2f4d46342851d29c373a6b63049e98376ebe2b9dbabbc0ebfb4"} Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.377230 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a01ca27af52b2f4d46342851d29c373a6b63049e98376ebe2b9dbabbc0ebfb4" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.377316 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.474197 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2"] Jan 21 00:00:40 crc kubenswrapper[5030]: E0121 00:00:40.474835 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb3e7e1-261f-4a77-b84b-c4a444bd032e" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.474860 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb3e7e1-261f-4a77-b84b-c4a444bd032e" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.475057 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb3e7e1-261f-4a77-b84b-c4a444bd032e" containerName="configure-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.475749 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.480155 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.480376 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.480591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.481060 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.483646 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2"] Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.512882 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.512931 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.512990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7sz\" (UniqueName: \"kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.614057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.614322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.614365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kz7sz\" (UniqueName: \"kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.619402 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.629458 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.632030 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7sz\" (UniqueName: \"kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz\") pod \"run-os-edpm-compute-global-edpm-compute-global-sw7z2\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:40 crc kubenswrapper[5030]: I0121 00:00:40.800219 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:41 crc kubenswrapper[5030]: I0121 00:00:41.394798 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2"] Jan 21 00:00:42 crc kubenswrapper[5030]: I0121 00:00:42.396656 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" event={"ID":"0dd60ec9-3010-4372-8401-9492c7a8d6b1","Type":"ContainerStarted","Data":"1c97d4862ea3795766f6f88dd8b96826b66a8011101e8ba525c4165d4e17391d"} Jan 21 00:00:43 crc kubenswrapper[5030]: I0121 00:00:43.406164 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" event={"ID":"0dd60ec9-3010-4372-8401-9492c7a8d6b1","Type":"ContainerStarted","Data":"bf5667e55f136f2b8a9faf5d82492ca64d6650d1a230ab45823cdc77c51f34cd"} Jan 21 00:00:43 crc kubenswrapper[5030]: I0121 00:00:43.420676 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" podStartSLOduration=2.196261895 podStartE2EDuration="3.420656673s" podCreationTimestamp="2026-01-21 00:00:40 +0000 UTC" firstStartedPulling="2026-01-21 00:00:41.415691029 +0000 UTC m=+5113.735951357" lastFinishedPulling="2026-01-21 00:00:42.640085807 +0000 UTC m=+5114.960346135" observedRunningTime="2026-01-21 00:00:43.419598157 +0000 UTC m=+5115.739858445" watchObservedRunningTime="2026-01-21 00:00:43.420656673 +0000 UTC m=+5115.740916961" Jan 21 00:00:44 crc kubenswrapper[5030]: I0121 00:00:44.419033 5030 generic.go:334] "Generic (PLEG): container finished" podID="0dd60ec9-3010-4372-8401-9492c7a8d6b1" containerID="bf5667e55f136f2b8a9faf5d82492ca64d6650d1a230ab45823cdc77c51f34cd" exitCode=0 Jan 21 00:00:44 crc kubenswrapper[5030]: I0121 00:00:44.419314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" event={"ID":"0dd60ec9-3010-4372-8401-9492c7a8d6b1","Type":"ContainerDied","Data":"bf5667e55f136f2b8a9faf5d82492ca64d6650d1a230ab45823cdc77c51f34cd"} Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.797003 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.899017 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory\") pod \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.899179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global\") pod \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.899214 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz7sz\" (UniqueName: \"kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz\") pod \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\" (UID: \"0dd60ec9-3010-4372-8401-9492c7a8d6b1\") " Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.905446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz" (OuterVolumeSpecName: "kube-api-access-kz7sz") pod "0dd60ec9-3010-4372-8401-9492c7a8d6b1" (UID: "0dd60ec9-3010-4372-8401-9492c7a8d6b1"). InnerVolumeSpecName "kube-api-access-kz7sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.923389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory" (OuterVolumeSpecName: "inventory") pod "0dd60ec9-3010-4372-8401-9492c7a8d6b1" (UID: "0dd60ec9-3010-4372-8401-9492c7a8d6b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:45 crc kubenswrapper[5030]: I0121 00:00:45.932777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "0dd60ec9-3010-4372-8401-9492c7a8d6b1" (UID: "0dd60ec9-3010-4372-8401-9492c7a8d6b1"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.004672 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.004713 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz7sz\" (UniqueName: \"kubernetes.io/projected/0dd60ec9-3010-4372-8401-9492c7a8d6b1-kube-api-access-kz7sz\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.004727 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd60ec9-3010-4372-8401-9492c7a8d6b1-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.446525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" event={"ID":"0dd60ec9-3010-4372-8401-9492c7a8d6b1","Type":"ContainerDied","Data":"1c97d4862ea3795766f6f88dd8b96826b66a8011101e8ba525c4165d4e17391d"} Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.446600 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c97d4862ea3795766f6f88dd8b96826b66a8011101e8ba525c4165d4e17391d" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.446810 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.520137 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc"] Jan 21 00:00:46 crc kubenswrapper[5030]: E0121 00:00:46.520513 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd60ec9-3010-4372-8401-9492c7a8d6b1" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.520535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd60ec9-3010-4372-8401-9492c7a8d6b1" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.520762 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd60ec9-3010-4372-8401-9492c7a8d6b1" containerName="run-os-edpm-compute-global-edpm-compute-global" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.521321 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.523213 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.524021 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.524246 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.524479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.524930 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.542667 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc"] Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717700 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717729 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717796 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717891 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.717912 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.718082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.718152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bx8\" (UniqueName: \"kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.718223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: 
\"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819323 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819373 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819410 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819466 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819509 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819541 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bx8\" (UniqueName: \"kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819639 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.819662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.825327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.828169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " 
pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.828270 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.828398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.830088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.830271 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.832143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.832200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.834855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.836142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.836261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:46 crc kubenswrapper[5030]: I0121 00:00:46.844500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bx8\" (UniqueName: \"kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8\") pod \"install-certs-edpm-compute-global-edpm-compute-global-4m8mc\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:47 crc kubenswrapper[5030]: I0121 00:00:47.141017 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:47 crc kubenswrapper[5030]: I0121 00:00:47.514337 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc"] Jan 21 00:00:47 crc kubenswrapper[5030]: W0121 00:00:47.522037 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c5c05d4_79cf_45b6_ae63_f90ebfbcfe33.slice/crio-98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89 WatchSource:0}: Error finding container 98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89: Status 404 returned error can't find the container with id 98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89 Jan 21 00:00:48 crc kubenswrapper[5030]: I0121 00:00:48.467599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" event={"ID":"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33","Type":"ContainerStarted","Data":"98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89"} Jan 21 00:00:49 crc kubenswrapper[5030]: I0121 00:00:49.481750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" event={"ID":"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33","Type":"ContainerStarted","Data":"d9287011d2be6abb8f8faee1816bd14ba090483f5a6463eca0da3e2d1ec24891"} Jan 21 00:00:49 crc kubenswrapper[5030]: I0121 00:00:49.516361 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" podStartSLOduration=2.680437268 podStartE2EDuration="3.516342436s" podCreationTimestamp="2026-01-21 00:00:46 +0000 UTC" firstStartedPulling="2026-01-21 00:00:47.54268016 +0000 UTC m=+5119.862940448" lastFinishedPulling="2026-01-21 00:00:48.378585288 +0000 UTC m=+5120.698845616" observedRunningTime="2026-01-21 00:00:49.509207383 +0000 UTC 
m=+5121.829467681" watchObservedRunningTime="2026-01-21 00:00:49.516342436 +0000 UTC m=+5121.836602734" Jan 21 00:00:50 crc kubenswrapper[5030]: I0121 00:00:50.493452 5030 generic.go:334] "Generic (PLEG): container finished" podID="5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" containerID="d9287011d2be6abb8f8faee1816bd14ba090483f5a6463eca0da3e2d1ec24891" exitCode=0 Jan 21 00:00:50 crc kubenswrapper[5030]: I0121 00:00:50.494036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" event={"ID":"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33","Type":"ContainerDied","Data":"d9287011d2be6abb8f8faee1816bd14ba090483f5a6463eca0da3e2d1ec24891"} Jan 21 00:00:50 crc kubenswrapper[5030]: I0121 00:00:50.962447 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:00:50 crc kubenswrapper[5030]: E0121 00:00:50.962796 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.805056 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.901825 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.901926 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.901974 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bx8\" (UniqueName: \"kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc 
kubenswrapper[5030]: I0121 00:00:51.902122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902192 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902233 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902343 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.902418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global\") pod \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\" (UID: \"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33\") " Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.908827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8" (OuterVolumeSpecName: "kube-api-access-z4bx8") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "kube-api-access-z4bx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.910589 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.911497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.912946 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.913354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.914322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.914875 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.917375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.924255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.924482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.948777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory" (OuterVolumeSpecName: "inventory") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:51 crc kubenswrapper[5030]: I0121 00:00:51.954833 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" (UID: "5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.003989 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4bx8\" (UniqueName: \"kubernetes.io/projected/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-kube-api-access-z4bx8\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004020 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004032 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004042 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004052 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004062 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004071 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004081 5030 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004092 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004100 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004111 5030 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.004123 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.517208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" event={"ID":"5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33","Type":"ContainerDied","Data":"98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89"} Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.517273 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b42f402b283f6c113b2173a5458f66d3cc073fe2458303e9aa437b49720e89" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.517289 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.625730 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx"] Jan 21 00:00:52 crc kubenswrapper[5030]: E0121 00:00:52.626287 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.626354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.626548 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" containerName="install-certs-edpm-compute-global-edpm-compute-global" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.627055 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.629555 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.629817 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.629855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.629957 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.630129 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.632216 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.632501 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx"] Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.815746 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.815852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.815888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.815941 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fgsq\" (UniqueName: \"kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.815959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.917987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fgsq\" (UniqueName: \"kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.918077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.918136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.918274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.918334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.920028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.923178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.923178 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.923672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.937504 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fgsq\" (UniqueName: \"kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq\") pod \"ovn-edpm-compute-global-edpm-compute-global-g8lhx\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:52 crc kubenswrapper[5030]: I0121 00:00:52.945680 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:53 crc kubenswrapper[5030]: I0121 00:00:53.513573 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx"] Jan 21 00:00:54 crc kubenswrapper[5030]: I0121 00:00:54.538354 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" event={"ID":"7595a00c-2956-4cd8-a2f2-054d1e6ddcad","Type":"ContainerStarted","Data":"2908aa2e4e9bc4187917a423f58947a131a03f715e10a776a832009b80bc0dc0"} Jan 21 00:00:55 crc kubenswrapper[5030]: I0121 00:00:55.550451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" event={"ID":"7595a00c-2956-4cd8-a2f2-054d1e6ddcad","Type":"ContainerStarted","Data":"ba48e6fed9d77051010ef4b50d4a71367424b43c03e32266a234e2a2bc45ffdf"} Jan 21 00:00:55 crc kubenswrapper[5030]: I0121 00:00:55.571834 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" podStartSLOduration=2.866014089 podStartE2EDuration="3.571813423s" podCreationTimestamp="2026-01-21 00:00:52 +0000 UTC" firstStartedPulling="2026-01-21 00:00:53.957515211 +0000 UTC m=+5126.277775499" lastFinishedPulling="2026-01-21 00:00:54.663314505 +0000 UTC m=+5126.983574833" observedRunningTime="2026-01-21 00:00:55.568742818 +0000 UTC m=+5127.889003146" watchObservedRunningTime="2026-01-21 00:00:55.571813423 +0000 UTC m=+5127.892073711" Jan 21 00:00:56 crc kubenswrapper[5030]: I0121 00:00:56.562476 5030 generic.go:334] "Generic (PLEG): container finished" podID="7595a00c-2956-4cd8-a2f2-054d1e6ddcad" containerID="ba48e6fed9d77051010ef4b50d4a71367424b43c03e32266a234e2a2bc45ffdf" exitCode=0 Jan 21 00:00:56 crc kubenswrapper[5030]: I0121 00:00:56.562512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" 
event={"ID":"7595a00c-2956-4cd8-a2f2-054d1e6ddcad","Type":"ContainerDied","Data":"ba48e6fed9d77051010ef4b50d4a71367424b43c03e32266a234e2a2bc45ffdf"} Jan 21 00:00:57 crc kubenswrapper[5030]: I0121 00:00:57.857881 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.001455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global\") pod \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.001545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fgsq\" (UniqueName: \"kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq\") pod \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.001662 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0\") pod \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.001687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle\") pod \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.001770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory\") pod \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\" (UID: \"7595a00c-2956-4cd8-a2f2-054d1e6ddcad\") " Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.007391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq" (OuterVolumeSpecName: "kube-api-access-2fgsq") pod "7595a00c-2956-4cd8-a2f2-054d1e6ddcad" (UID: "7595a00c-2956-4cd8-a2f2-054d1e6ddcad"). InnerVolumeSpecName "kube-api-access-2fgsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.008567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7595a00c-2956-4cd8-a2f2-054d1e6ddcad" (UID: "7595a00c-2956-4cd8-a2f2-054d1e6ddcad"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.030926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "7595a00c-2956-4cd8-a2f2-054d1e6ddcad" (UID: "7595a00c-2956-4cd8-a2f2-054d1e6ddcad"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.035407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "7595a00c-2956-4cd8-a2f2-054d1e6ddcad" (UID: "7595a00c-2956-4cd8-a2f2-054d1e6ddcad"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.044496 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory" (OuterVolumeSpecName: "inventory") pod "7595a00c-2956-4cd8-a2f2-054d1e6ddcad" (UID: "7595a00c-2956-4cd8-a2f2-054d1e6ddcad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.103647 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.103689 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.103703 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fgsq\" (UniqueName: \"kubernetes.io/projected/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-kube-api-access-2fgsq\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.103715 5030 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.103727 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7595a00c-2956-4cd8-a2f2-054d1e6ddcad-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.585374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" event={"ID":"7595a00c-2956-4cd8-a2f2-054d1e6ddcad","Type":"ContainerDied","Data":"2908aa2e4e9bc4187917a423f58947a131a03f715e10a776a832009b80bc0dc0"} Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.585435 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2908aa2e4e9bc4187917a423f58947a131a03f715e10a776a832009b80bc0dc0" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.585468 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.672083 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2"] Jan 21 00:00:58 crc kubenswrapper[5030]: E0121 00:00:58.672538 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7595a00c-2956-4cd8-a2f2-054d1e6ddcad" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.672570 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7595a00c-2956-4cd8-a2f2-054d1e6ddcad" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.684546 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7595a00c-2956-4cd8-a2f2-054d1e6ddcad" containerName="ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.686045 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693274 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693304 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693603 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693820 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693945 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.693958 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.694406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.703193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2"] Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834763 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834817 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdtr\" (UniqueName: \"kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.834978 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.835006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.935972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936090 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936259 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.936301 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdtr\" (UniqueName: 
\"kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.942173 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.942840 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.943223 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.948254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.952083 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.953830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.955997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2\") pod 
\"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:58 crc kubenswrapper[5030]: I0121 00:00:58.976449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdtr\" (UniqueName: \"kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr\") pod \"neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:59 crc kubenswrapper[5030]: I0121 00:00:59.021773 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:00:59 crc kubenswrapper[5030]: I0121 00:00:59.518098 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2"] Jan 21 00:00:59 crc kubenswrapper[5030]: W0121 00:00:59.525184 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d6b8fc4_a5c4_40a1_8b3e_44beb0e2aa5d.slice/crio-c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32 WatchSource:0}: Error finding container c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32: Status 404 returned error can't find the container with id c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32 Jan 21 00:00:59 crc kubenswrapper[5030]: I0121 00:00:59.595123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" event={"ID":"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d","Type":"ContainerStarted","Data":"c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32"} Jan 21 00:01:00 crc kubenswrapper[5030]: I0121 00:01:00.607183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" event={"ID":"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d","Type":"ContainerStarted","Data":"9e7dd4a2fb1ccb6483359d29e120ffa0bdb87303b53fd708a9d0f9b2b739b5db"} Jan 21 00:01:00 crc kubenswrapper[5030]: I0121 00:01:00.635395 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" podStartSLOduration=1.825934391 podStartE2EDuration="2.635368909s" podCreationTimestamp="2026-01-21 00:00:58 +0000 UTC" firstStartedPulling="2026-01-21 00:00:59.528377487 +0000 UTC m=+5131.848637775" lastFinishedPulling="2026-01-21 00:01:00.337811975 +0000 UTC m=+5132.658072293" observedRunningTime="2026-01-21 00:01:00.629749683 +0000 UTC m=+5132.950009971" watchObservedRunningTime="2026-01-21 00:01:00.635368909 +0000 UTC m=+5132.955629217" Jan 21 00:01:02 crc kubenswrapper[5030]: I0121 00:01:02.625810 5030 generic.go:334] "Generic (PLEG): container finished" podID="6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" containerID="9e7dd4a2fb1ccb6483359d29e120ffa0bdb87303b53fd708a9d0f9b2b739b5db" exitCode=0 Jan 21 00:01:02 crc kubenswrapper[5030]: I0121 00:01:02.625882 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" 
event={"ID":"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d","Type":"ContainerDied","Data":"9e7dd4a2fb1ccb6483359d29e120ffa0bdb87303b53fd708a9d0f9b2b739b5db"} Jan 21 00:01:03 crc kubenswrapper[5030]: I0121 00:01:03.991868 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.114876 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115387 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115439 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.115505 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdtr\" (UniqueName: \"kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr\") pod \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\" (UID: \"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d\") " Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.135814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.136529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr" (OuterVolumeSpecName: "kube-api-access-9zdtr") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "kube-api-access-9zdtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.142521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.143843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.153579 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.160882 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.161773 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory" (OuterVolumeSpecName: "inventory") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.164867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" (UID: "6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217647 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdtr\" (UniqueName: \"kubernetes.io/projected/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-kube-api-access-9zdtr\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217705 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217729 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217750 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217772 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217794 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217815 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.217836 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.647724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" event={"ID":"6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d","Type":"ContainerDied","Data":"c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32"} Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.647787 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c329d4c9bd5ff8d864c18bf2b2b062f6a3783eaf70944d650567e388a1c37e32" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 
00:01:04.647789 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.754310 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8"] Jan 21 00:01:04 crc kubenswrapper[5030]: E0121 00:01:04.754581 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.754597 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.754750 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" containerName="neutron-metadata-edpm-compute-global-edpm-compute-global" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.755179 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.757307 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.757424 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.757412 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.757710 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.758816 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.759142 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.772659 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8"] Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.928789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.928834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: 
\"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.928865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.928912 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:04 crc kubenswrapper[5030]: I0121 00:01:04.928951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfvb\" (UniqueName: \"kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.030091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.030794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.030856 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kfvb\" (UniqueName: \"kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.030943 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc 
kubenswrapper[5030]: I0121 00:01:05.031013 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.034998 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.035053 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.035533 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.035813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.345836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kfvb\" (UniqueName: \"kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb\") pod \"neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.368566 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.916171 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8"] Jan 21 00:01:05 crc kubenswrapper[5030]: I0121 00:01:05.963226 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:01:05 crc kubenswrapper[5030]: E0121 00:01:05.963892 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:01:06 crc kubenswrapper[5030]: I0121 00:01:06.666362 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" event={"ID":"f07a5642-e5ba-42ab-ae0f-88e26b167ada","Type":"ContainerStarted","Data":"6566c2bfbf4b4c608ef886d9cb22730c82f84d067ca9338ce48f7efd5ccd84e8"} Jan 21 00:01:06 crc kubenswrapper[5030]: I0121 00:01:06.666732 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" event={"ID":"f07a5642-e5ba-42ab-ae0f-88e26b167ada","Type":"ContainerStarted","Data":"0dfb5523a9b924ab6161ba355e0b41eb89b681d7faf51ebb1a9b3c21cf64d72f"} Jan 21 00:01:06 crc kubenswrapper[5030]: I0121 00:01:06.685264 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" podStartSLOduration=2.190683149 podStartE2EDuration="2.685246571s" podCreationTimestamp="2026-01-21 00:01:04 +0000 UTC" firstStartedPulling="2026-01-21 00:01:05.929936967 +0000 UTC m=+5138.250197255" lastFinishedPulling="2026-01-21 00:01:06.424500359 +0000 UTC m=+5138.744760677" observedRunningTime="2026-01-21 00:01:06.681799637 +0000 UTC m=+5139.002059935" watchObservedRunningTime="2026-01-21 00:01:06.685246571 +0000 UTC m=+5139.005506859" Jan 21 00:01:08 crc kubenswrapper[5030]: I0121 00:01:08.686138 5030 generic.go:334] "Generic (PLEG): container finished" podID="f07a5642-e5ba-42ab-ae0f-88e26b167ada" containerID="6566c2bfbf4b4c608ef886d9cb22730c82f84d067ca9338ce48f7efd5ccd84e8" exitCode=0 Jan 21 00:01:08 crc kubenswrapper[5030]: I0121 00:01:08.686249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" event={"ID":"f07a5642-e5ba-42ab-ae0f-88e26b167ada","Type":"ContainerDied","Data":"6566c2bfbf4b4c608ef886d9cb22730c82f84d067ca9338ce48f7efd5ccd84e8"} Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.025726 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.150231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory\") pod \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.151362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kfvb\" (UniqueName: \"kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb\") pod \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.151427 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle\") pod \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.151494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global\") pod \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.151546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0\") pod \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\" (UID: \"f07a5642-e5ba-42ab-ae0f-88e26b167ada\") " Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.157215 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "f07a5642-e5ba-42ab-ae0f-88e26b167ada" (UID: "f07a5642-e5ba-42ab-ae0f-88e26b167ada"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.158246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb" (OuterVolumeSpecName: "kube-api-access-8kfvb") pod "f07a5642-e5ba-42ab-ae0f-88e26b167ada" (UID: "f07a5642-e5ba-42ab-ae0f-88e26b167ada"). InnerVolumeSpecName "kube-api-access-8kfvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.174698 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory" (OuterVolumeSpecName: "inventory") pod "f07a5642-e5ba-42ab-ae0f-88e26b167ada" (UID: "f07a5642-e5ba-42ab-ae0f-88e26b167ada"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.177222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "f07a5642-e5ba-42ab-ae0f-88e26b167ada" (UID: "f07a5642-e5ba-42ab-ae0f-88e26b167ada"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.182820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "f07a5642-e5ba-42ab-ae0f-88e26b167ada" (UID: "f07a5642-e5ba-42ab-ae0f-88e26b167ada"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.253489 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.253528 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.253547 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.253565 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kfvb\" (UniqueName: \"kubernetes.io/projected/f07a5642-e5ba-42ab-ae0f-88e26b167ada-kube-api-access-8kfvb\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.253579 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07a5642-e5ba-42ab-ae0f-88e26b167ada-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.705660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" event={"ID":"f07a5642-e5ba-42ab-ae0f-88e26b167ada","Type":"ContainerDied","Data":"0dfb5523a9b924ab6161ba355e0b41eb89b681d7faf51ebb1a9b3c21cf64d72f"} Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.705706 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.705738 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dfb5523a9b924ab6161ba355e0b41eb89b681d7faf51ebb1a9b3c21cf64d72f" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.823836 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk"] Jan 21 00:01:10 crc kubenswrapper[5030]: E0121 00:01:10.824120 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07a5642-e5ba-42ab-ae0f-88e26b167ada" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.824135 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07a5642-e5ba-42ab-ae0f-88e26b167ada" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.824273 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07a5642-e5ba-42ab-ae0f-88e26b167ada" containerName="neutron-ovn-edpm-compute-global-edpm-compute-global" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.824734 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.829536 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.829581 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.829762 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.829770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.830261 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.830412 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.842708 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk"] Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.860966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.861077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7xgw\" (UniqueName: 
\"kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.861127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.861259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.861284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.962845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7xgw\" (UniqueName: \"kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.962901 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.962927 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.962948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.963014 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.967001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.967231 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.967462 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.967849 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:10 crc kubenswrapper[5030]: I0121 00:01:10.981689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7xgw\" (UniqueName: \"kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw\") pod \"neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:11 crc kubenswrapper[5030]: I0121 00:01:11.141498 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:11 crc kubenswrapper[5030]: I0121 00:01:11.599750 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk"] Jan 21 00:01:11 crc kubenswrapper[5030]: I0121 00:01:11.715804 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" event={"ID":"8b09e4d5-4bde-4471-848a-dbc6275081e1","Type":"ContainerStarted","Data":"40d6c353f9307529463a5d3b88d77e3eabe259d2c6eaf3ccf3844e71cb2a5028"} Jan 21 00:01:12 crc kubenswrapper[5030]: I0121 00:01:12.726729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" event={"ID":"8b09e4d5-4bde-4471-848a-dbc6275081e1","Type":"ContainerStarted","Data":"5c9f1182548ba43265d81f8447c1f62b1f165ac558b255b1174dccc95471ee9a"} Jan 21 00:01:12 crc kubenswrapper[5030]: I0121 00:01:12.752487 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" podStartSLOduration=2.301484438 podStartE2EDuration="2.752458103s" podCreationTimestamp="2026-01-21 00:01:10 +0000 UTC" firstStartedPulling="2026-01-21 00:01:11.60372325 +0000 UTC m=+5143.923983538" lastFinishedPulling="2026-01-21 00:01:12.054696875 +0000 UTC m=+5144.374957203" observedRunningTime="2026-01-21 00:01:12.740309539 +0000 UTC m=+5145.060569827" watchObservedRunningTime="2026-01-21 00:01:12.752458103 +0000 UTC m=+5145.072718421" Jan 21 00:01:13 crc kubenswrapper[5030]: I0121 00:01:13.741869 5030 generic.go:334] "Generic (PLEG): container finished" podID="8b09e4d5-4bde-4471-848a-dbc6275081e1" containerID="5c9f1182548ba43265d81f8447c1f62b1f165ac558b255b1174dccc95471ee9a" exitCode=0 Jan 21 00:01:13 crc kubenswrapper[5030]: I0121 00:01:13.741935 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" event={"ID":"8b09e4d5-4bde-4471-848a-dbc6275081e1","Type":"ContainerDied","Data":"5c9f1182548ba43265d81f8447c1f62b1f165ac558b255b1174dccc95471ee9a"} Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.139548 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.237914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global\") pod \"8b09e4d5-4bde-4471-848a-dbc6275081e1\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.238193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7xgw\" (UniqueName: \"kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw\") pod \"8b09e4d5-4bde-4471-848a-dbc6275081e1\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.238280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory\") pod \"8b09e4d5-4bde-4471-848a-dbc6275081e1\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.238350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle\") pod \"8b09e4d5-4bde-4471-848a-dbc6275081e1\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.238426 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0\") pod \"8b09e4d5-4bde-4471-848a-dbc6275081e1\" (UID: \"8b09e4d5-4bde-4471-848a-dbc6275081e1\") " Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.244267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "8b09e4d5-4bde-4471-848a-dbc6275081e1" (UID: "8b09e4d5-4bde-4471-848a-dbc6275081e1"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.246301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw" (OuterVolumeSpecName: "kube-api-access-j7xgw") pod "8b09e4d5-4bde-4471-848a-dbc6275081e1" (UID: "8b09e4d5-4bde-4471-848a-dbc6275081e1"). InnerVolumeSpecName "kube-api-access-j7xgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.276485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory" (OuterVolumeSpecName: "inventory") pod "8b09e4d5-4bde-4471-848a-dbc6275081e1" (UID: "8b09e4d5-4bde-4471-848a-dbc6275081e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.281102 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "8b09e4d5-4bde-4471-848a-dbc6275081e1" (UID: "8b09e4d5-4bde-4471-848a-dbc6275081e1"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.289350 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "8b09e4d5-4bde-4471-848a-dbc6275081e1" (UID: "8b09e4d5-4bde-4471-848a-dbc6275081e1"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.340826 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.340875 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.340908 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.340923 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/8b09e4d5-4bde-4471-848a-dbc6275081e1-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.340942 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7xgw\" (UniqueName: \"kubernetes.io/projected/8b09e4d5-4bde-4471-848a-dbc6275081e1-kube-api-access-j7xgw\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.764251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" event={"ID":"8b09e4d5-4bde-4471-848a-dbc6275081e1","Type":"ContainerDied","Data":"40d6c353f9307529463a5d3b88d77e3eabe259d2c6eaf3ccf3844e71cb2a5028"} Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.764713 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d6c353f9307529463a5d3b88d77e3eabe259d2c6eaf3ccf3844e71cb2a5028" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.764334 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.830730 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78"] Jan 21 00:01:15 crc kubenswrapper[5030]: E0121 00:01:15.831471 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b09e4d5-4bde-4471-848a-dbc6275081e1" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.831497 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b09e4d5-4bde-4471-848a-dbc6275081e1" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.831712 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b09e4d5-4bde-4471-848a-dbc6275081e1" containerName="neutron-sriov-edpm-compute-global-edpm-compute-global" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.832255 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.840658 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.841170 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.841245 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.841580 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.843844 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.849089 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78"] Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.849595 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.988448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.988552 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkfpn\" (UniqueName: \"kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " 
pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.988601 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.988653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:15 crc kubenswrapper[5030]: I0121 00:01:15.988684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.090110 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.090580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.090866 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.091148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 
00:01:16.091481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkfpn\" (UniqueName: \"kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.095001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.095161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.095502 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.097379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.111453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkfpn\" (UniqueName: \"kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn\") pod \"neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.196662 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.692788 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78"] Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.775813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" event={"ID":"a4c283c4-8213-44c4-823a-9a645886d652","Type":"ContainerStarted","Data":"4c686472c8b39947cf65235be2d1d55d03936ee48751fbe2876d6e3952a8f191"} Jan 21 00:01:16 crc kubenswrapper[5030]: I0121 00:01:16.962614 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:01:16 crc kubenswrapper[5030]: E0121 00:01:16.963065 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:01:18 crc kubenswrapper[5030]: I0121 00:01:18.816310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" event={"ID":"a4c283c4-8213-44c4-823a-9a645886d652","Type":"ContainerStarted","Data":"a6f5cb9b8e23c29e698185e82117ee05bd9a54776cb215075639461cb84c9eb5"} Jan 21 00:01:18 crc kubenswrapper[5030]: I0121 00:01:18.846243 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" podStartSLOduration=2.339787963 podStartE2EDuration="3.846155488s" podCreationTimestamp="2026-01-21 00:01:15 +0000 UTC" firstStartedPulling="2026-01-21 00:01:16.70784037 +0000 UTC m=+5149.028100668" lastFinishedPulling="2026-01-21 00:01:18.214207855 +0000 UTC m=+5150.534468193" observedRunningTime="2026-01-21 00:01:18.843969215 +0000 UTC m=+5151.164229513" watchObservedRunningTime="2026-01-21 00:01:18.846155488 +0000 UTC m=+5151.166415826" Jan 21 00:01:20 crc kubenswrapper[5030]: I0121 00:01:20.840497 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4c283c4-8213-44c4-823a-9a645886d652" containerID="a6f5cb9b8e23c29e698185e82117ee05bd9a54776cb215075639461cb84c9eb5" exitCode=0 Jan 21 00:01:20 crc kubenswrapper[5030]: I0121 00:01:20.840646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" event={"ID":"a4c283c4-8213-44c4-823a-9a645886d652","Type":"ContainerDied","Data":"a6f5cb9b8e23c29e698185e82117ee05bd9a54776cb215075639461cb84c9eb5"} Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.319412 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.413157 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory\") pod \"a4c283c4-8213-44c4-823a-9a645886d652\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.413273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0\") pod \"a4c283c4-8213-44c4-823a-9a645886d652\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.413298 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle\") pod \"a4c283c4-8213-44c4-823a-9a645886d652\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.414111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkfpn\" (UniqueName: \"kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn\") pod \"a4c283c4-8213-44c4-823a-9a645886d652\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.414206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global\") pod \"a4c283c4-8213-44c4-823a-9a645886d652\" (UID: \"a4c283c4-8213-44c4-823a-9a645886d652\") " Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.421541 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn" (OuterVolumeSpecName: "kube-api-access-gkfpn") pod "a4c283c4-8213-44c4-823a-9a645886d652" (UID: "a4c283c4-8213-44c4-823a-9a645886d652"). InnerVolumeSpecName "kube-api-access-gkfpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.426384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "a4c283c4-8213-44c4-823a-9a645886d652" (UID: "a4c283c4-8213-44c4-823a-9a645886d652"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.454730 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "a4c283c4-8213-44c4-823a-9a645886d652" (UID: "a4c283c4-8213-44c4-823a-9a645886d652"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.459537 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory" (OuterVolumeSpecName: "inventory") pod "a4c283c4-8213-44c4-823a-9a645886d652" (UID: "a4c283c4-8213-44c4-823a-9a645886d652"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.471061 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "a4c283c4-8213-44c4-823a-9a645886d652" (UID: "a4c283c4-8213-44c4-823a-9a645886d652"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.516189 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.516241 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.516255 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.516275 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c283c4-8213-44c4-823a-9a645886d652-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.516296 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkfpn\" (UniqueName: \"kubernetes.io/projected/a4c283c4-8213-44c4-823a-9a645886d652-kube-api-access-gkfpn\") on node \"crc\" DevicePath \"\"" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.868914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" event={"ID":"a4c283c4-8213-44c4-823a-9a645886d652","Type":"ContainerDied","Data":"4c686472c8b39947cf65235be2d1d55d03936ee48751fbe2876d6e3952a8f191"} Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.869000 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c686472c8b39947cf65235be2d1d55d03936ee48751fbe2876d6e3952a8f191" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.869001 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.989806 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5"] Jan 21 00:01:22 crc kubenswrapper[5030]: E0121 00:01:22.990234 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c283c4-8213-44c4-823a-9a645886d652" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.990256 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c283c4-8213-44c4-823a-9a645886d652" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.990458 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c283c4-8213-44c4-823a-9a645886d652" containerName="neutron-dhcp-edpm-compute-global-edpm-compute-global" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.991178 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:22 crc kubenswrapper[5030]: I0121 00:01:22.993478 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.001996 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.002725 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.003096 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.003291 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.003302 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.005087 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5"] Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.126287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.127727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 
00:01:23.127870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.127923 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.128015 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7622\" (UniqueName: \"kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.229350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.229650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.229820 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7622\" (UniqueName: \"kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.230002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.230129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global\") pod 
\"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.234469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.234469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.234705 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.249161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7622\" (UniqueName: \"kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.249291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory\") pod \"libvirt-edpm-compute-global-edpm-compute-global-fjgq5\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.308424 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.619779 5030 scope.go:117] "RemoveContainer" containerID="70a7d28cf52d5ef1d00316b0dcbf8fba7e082729deb543a60515ea02129ddf50" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.670903 5030 scope.go:117] "RemoveContainer" containerID="2fc6210e9ab2da522085f52343b3e8c26de15701510867d1f0192cef9767fbc7" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.699964 5030 scope.go:117] "RemoveContainer" containerID="a5c4433b03820c6d18621f9c8f81c83389f156d4d17147855bf0f160cd803274" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.723484 5030 scope.go:117] "RemoveContainer" containerID="f0f6dcbbc1beacc6118f560448c544e57f674e2e974643154ee6c4c223e11d73" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.741602 5030 scope.go:117] "RemoveContainer" containerID="0f2d9ff94a70f371ea0c2f655df1343171f75f715efffd2ebcbd88a09a91dcdf" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.778599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5"] Jan 21 00:01:23 crc kubenswrapper[5030]: W0121 00:01:23.788323 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d52393c_0a90_435d_8215_4822292a501e.slice/crio-adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af WatchSource:0}: Error finding container adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af: Status 404 returned error can't find the container with id adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.804314 5030 scope.go:117] "RemoveContainer" containerID="6fd0a762eb361ee47aaa848aac648360188300b079c2102d8a6884e136b43630" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.837661 5030 scope.go:117] "RemoveContainer" containerID="ffa3ae677b3ee3825734e167538f781603c6acc1215aa641ced6dbbfe5a472a5" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.879849 5030 scope.go:117] "RemoveContainer" containerID="8908dc6a3ed08b89887c59ba107ec391977995d11046784ad1c79931bbf35386" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.882294 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" event={"ID":"5d52393c-0a90-435d-8215-4822292a501e","Type":"ContainerStarted","Data":"adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af"} Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.943611 5030 scope.go:117] "RemoveContainer" containerID="5327928e7446b8be0843fc4890d73bda7bfa3169ddb8d3054850ab76b5f5bbd7" Jan 21 00:01:23 crc kubenswrapper[5030]: I0121 00:01:23.969071 5030 scope.go:117] "RemoveContainer" containerID="87a82cba312227a3eedb47a0c268697ec415a32cbe79e06eaeca52d69f0bc499" Jan 21 00:01:24 crc kubenswrapper[5030]: I0121 00:01:24.016934 5030 scope.go:117] "RemoveContainer" containerID="59d69580f270ae36f95b6b71e7235da82769f6ef8d082e2ada36c0ab220149e3" Jan 21 00:01:24 crc kubenswrapper[5030]: I0121 00:01:24.041735 5030 scope.go:117] "RemoveContainer" containerID="54c63ead713d1354abf474373098647909a2d843a7ad797ffe6704b15a66b178" Jan 21 00:01:24 crc kubenswrapper[5030]: I0121 00:01:24.071135 5030 scope.go:117] "RemoveContainer" containerID="dca510e79d494a956a98c426b8f4611d3b6b6dde0cb62746d4ae5531d4f22da2" Jan 21 00:01:24 crc 
kubenswrapper[5030]: I0121 00:01:24.102100 5030 scope.go:117] "RemoveContainer" containerID="c1391761fb7c8fc5ae03c94f84c2cb4e7dcbef4e6e695265c52e6766480d995c" Jan 21 00:01:24 crc kubenswrapper[5030]: I0121 00:01:24.139363 5030 scope.go:117] "RemoveContainer" containerID="446598bf5fc66973129bf823658cc975cc96b4b3543a75bed93814311423a045" Jan 21 00:01:27 crc kubenswrapper[5030]: I0121 00:01:27.974143 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:01:27 crc kubenswrapper[5030]: E0121 00:01:27.976675 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:01:32 crc kubenswrapper[5030]: I0121 00:01:32.819153 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:01:32 crc kubenswrapper[5030]: I0121 00:01:32.823148 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:32 crc kubenswrapper[5030]: I0121 00:01:32.843579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.012697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.012802 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddt7f\" (UniqueName: \"kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.013629 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.114948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.115009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddt7f\" (UniqueName: \"kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " 
pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.115065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.115755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.115825 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.143184 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddt7f\" (UniqueName: \"kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f\") pod \"redhat-operators-hd4dq\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:33 crc kubenswrapper[5030]: I0121 00:01:33.163006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:01:39 crc kubenswrapper[5030]: I0121 00:01:39.962293 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:01:39 crc kubenswrapper[5030]: E0121 00:01:39.963293 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:01:45 crc kubenswrapper[5030]: I0121 00:01:45.683889 5030 patch_prober.go:28] interesting pod/route-controller-manager-6878449bd4-dw9cx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 00:01:45 crc kubenswrapper[5030]: I0121 00:01:45.684611 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6878449bd4-dw9cx" podUID="37d68c50-0572-4d84-88b9-a0240cb28055" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 00:01:52 crc kubenswrapper[5030]: I0121 00:01:52.963196 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:01:52 crc kubenswrapper[5030]: E0121 00:01:52.964731 5030 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:02:01 crc kubenswrapper[5030]: W0121 00:02:01.037791 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2913e6de_f32f_4b7a_b67f_cc9625a1831c.slice/crio-92e29d5832acd637bb9acf4ac182193b1166f0510c6c1cec4fa90b7382f61114 WatchSource:0}: Error finding container 92e29d5832acd637bb9acf4ac182193b1166f0510c6c1cec4fa90b7382f61114: Status 404 returned error can't find the container with id 92e29d5832acd637bb9acf4ac182193b1166f0510c6c1cec4fa90b7382f61114 Jan 21 00:02:01 crc kubenswrapper[5030]: I0121 00:02:01.037884 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:02:01 crc kubenswrapper[5030]: I0121 00:02:01.286904 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerStarted","Data":"92e29d5832acd637bb9acf4ac182193b1166f0510c6c1cec4fa90b7382f61114"} Jan 21 00:02:06 crc kubenswrapper[5030]: I0121 00:02:06.962360 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:02:06 crc kubenswrapper[5030]: E0121 00:02:06.963122 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:02:18 crc kubenswrapper[5030]: I0121 00:02:18.963253 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:02:18 crc kubenswrapper[5030]: E0121 00:02:18.964503 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:02:23 crc kubenswrapper[5030]: I0121 00:02:23.505948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerStarted","Data":"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4"} Jan 21 00:02:23 crc kubenswrapper[5030]: I0121 00:02:23.517469 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.411497 5030 scope.go:117] "RemoveContainer" containerID="c16032f8775ab4a7bd458a948244e294cfac5d089df162a717211e073be9778d" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.490805 5030 scope.go:117] 
"RemoveContainer" containerID="e1e2983165e29a800fa64cc016e762e324a67a241d2c5258ac2493278f551e3d" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.517897 5030 generic.go:334] "Generic (PLEG): container finished" podID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerID="8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4" exitCode=0 Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.518004 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerDied","Data":"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4"} Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.522193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" event={"ID":"5d52393c-0a90-435d-8215-4822292a501e","Type":"ContainerStarted","Data":"c27e63d6bb0d90af35362d77a9b2ea7614368d866e70db392fa9344972e9f4ee"} Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.548472 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" podStartSLOduration=2.837186964 podStartE2EDuration="1m2.548455661s" podCreationTimestamp="2026-01-21 00:01:22 +0000 UTC" firstStartedPulling="2026-01-21 00:01:23.804424552 +0000 UTC m=+5156.124684840" lastFinishedPulling="2026-01-21 00:02:23.515693249 +0000 UTC m=+5215.835953537" observedRunningTime="2026-01-21 00:02:24.54511646 +0000 UTC m=+5216.865376748" watchObservedRunningTime="2026-01-21 00:02:24.548455661 +0000 UTC m=+5216.868715949" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.553838 5030 scope.go:117] "RemoveContainer" containerID="f534f8867f01b669a98be117b16cec81c30a05651a6531a8ebf984a5ed21c492" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.603871 5030 scope.go:117] "RemoveContainer" containerID="564d0b1c0b03d0adc2809f29ac6547f795cb079d19c20e29f26d4c272452c3f2" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.641922 5030 scope.go:117] "RemoveContainer" containerID="c24b019586990fa217dfc9ad1cd63c34f40b14cb2468b7c316539d0b370e2051" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.702829 5030 scope.go:117] "RemoveContainer" containerID="52182c029b767781079783540a6f3a0d338c9ba22b74f17b3312241d0cd52362" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.736442 5030 scope.go:117] "RemoveContainer" containerID="9c6534fafecb72297beee3c7cfa3b48e9f876c41be16464ad6e0e57b69ba270e" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.788543 5030 scope.go:117] "RemoveContainer" containerID="5d45ca2eedc3f467103cc821521fdc430f2f9384c7f98f931adfdccedb66b0a2" Jan 21 00:02:24 crc kubenswrapper[5030]: I0121 00:02:24.811270 5030 scope.go:117] "RemoveContainer" containerID="fb9c6686c48a4b7a4260b23a6489d78b4e75fa78defcbab5958de2322f702462" Jan 21 00:02:25 crc kubenswrapper[5030]: I0121 00:02:25.532695 5030 generic.go:334] "Generic (PLEG): container finished" podID="5d52393c-0a90-435d-8215-4822292a501e" containerID="c27e63d6bb0d90af35362d77a9b2ea7614368d866e70db392fa9344972e9f4ee" exitCode=0 Jan 21 00:02:25 crc kubenswrapper[5030]: I0121 00:02:25.532786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" event={"ID":"5d52393c-0a90-435d-8215-4822292a501e","Type":"ContainerDied","Data":"c27e63d6bb0d90af35362d77a9b2ea7614368d866e70db392fa9344972e9f4ee"} Jan 21 
00:02:25 crc kubenswrapper[5030]: I0121 00:02:25.535431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerStarted","Data":"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd"} Jan 21 00:02:26 crc kubenswrapper[5030]: I0121 00:02:26.842716 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.004730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0\") pod \"5d52393c-0a90-435d-8215-4822292a501e\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.004789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7622\" (UniqueName: \"kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622\") pod \"5d52393c-0a90-435d-8215-4822292a501e\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.004828 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle\") pod \"5d52393c-0a90-435d-8215-4822292a501e\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.004872 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory\") pod \"5d52393c-0a90-435d-8215-4822292a501e\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.004938 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global\") pod \"5d52393c-0a90-435d-8215-4822292a501e\" (UID: \"5d52393c-0a90-435d-8215-4822292a501e\") " Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.010162 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622" (OuterVolumeSpecName: "kube-api-access-c7622") pod "5d52393c-0a90-435d-8215-4822292a501e" (UID: "5d52393c-0a90-435d-8215-4822292a501e"). InnerVolumeSpecName "kube-api-access-c7622". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.010936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5d52393c-0a90-435d-8215-4822292a501e" (UID: "5d52393c-0a90-435d-8215-4822292a501e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.028883 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5d52393c-0a90-435d-8215-4822292a501e" (UID: "5d52393c-0a90-435d-8215-4822292a501e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.029655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "5d52393c-0a90-435d-8215-4822292a501e" (UID: "5d52393c-0a90-435d-8215-4822292a501e"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.029928 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory" (OuterVolumeSpecName: "inventory") pod "5d52393c-0a90-435d-8215-4822292a501e" (UID: "5d52393c-0a90-435d-8215-4822292a501e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.106938 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.106987 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7622\" (UniqueName: \"kubernetes.io/projected/5d52393c-0a90-435d-8215-4822292a501e-kube-api-access-c7622\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.106998 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.107013 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.107024 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/5d52393c-0a90-435d-8215-4822292a501e-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.554364 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.554376 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5" event={"ID":"5d52393c-0a90-435d-8215-4822292a501e","Type":"ContainerDied","Data":"adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af"} Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.554418 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adcc250b23b7f64ff90775dcfb760d24513e219559f3f13a43391ce2d3eae9af" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.556877 5030 generic.go:334] "Generic (PLEG): container finished" podID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerID="e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd" exitCode=0 Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.556930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerDied","Data":"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd"} Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.649912 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw"] Jan 21 00:02:27 crc kubenswrapper[5030]: E0121 00:02:27.650312 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d52393c-0a90-435d-8215-4822292a501e" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.650334 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d52393c-0a90-435d-8215-4822292a501e" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.650506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d52393c-0a90-435d-8215-4822292a501e" containerName="libvirt-edpm-compute-global-edpm-compute-global" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.651347 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659379 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659379 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659565 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659614 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659571 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.659882 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.667249 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw"] Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817180 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6h7\" (UniqueName: \"kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 
crc kubenswrapper[5030]: I0121 00:02:27.817295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.817452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.918999 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6h7\" (UniqueName: \"kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919091 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global\") pod 
\"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919211 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:27 crc kubenswrapper[5030]: I0121 00:02:27.919283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.044693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.044711 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.044917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.044970 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.045217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.045283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.047409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.049066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6h7\" (UniqueName: \"kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7\") pod \"nova-edpm-compute-global-edpm-compute-global-28kqw\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.278272 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.286938 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.566544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerStarted","Data":"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675"} Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.585175 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hd4dq" podStartSLOduration=52.012910436 podStartE2EDuration="56.585156619s" podCreationTimestamp="2026-01-21 00:01:32 +0000 UTC" firstStartedPulling="2026-01-21 00:02:23.514784997 +0000 UTC m=+5215.835045285" lastFinishedPulling="2026-01-21 00:02:28.08703116 +0000 UTC m=+5220.407291468" observedRunningTime="2026-01-21 00:02:28.580725581 +0000 UTC m=+5220.900985899" watchObservedRunningTime="2026-01-21 00:02:28.585156619 +0000 UTC m=+5220.905416897" Jan 21 00:02:28 crc kubenswrapper[5030]: I0121 00:02:28.707720 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw"] Jan 21 00:02:29 crc kubenswrapper[5030]: I0121 00:02:29.375948 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:02:29 crc kubenswrapper[5030]: I0121 00:02:29.580482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" event={"ID":"c0408f65-3264-42f0-8364-9839f8b46222","Type":"ContainerStarted","Data":"b4607f8eff8a546a9e7a0b80e7a2fa6e3d4ad6cc8a29127ea6fc93fe5abcc3e8"} Jan 21 00:02:29 crc kubenswrapper[5030]: I0121 00:02:29.962813 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:02:29 crc kubenswrapper[5030]: E0121 00:02:29.963095 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:02:30 crc kubenswrapper[5030]: I0121 00:02:30.592045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" event={"ID":"c0408f65-3264-42f0-8364-9839f8b46222","Type":"ContainerStarted","Data":"a52f73fa0e272c089051843538b533563c54e09f5ff5207ac2d758d5e49e7cac"} Jan 21 00:02:30 crc kubenswrapper[5030]: I0121 00:02:30.617233 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" podStartSLOduration=2.96352336 podStartE2EDuration="3.61720781s" podCreationTimestamp="2026-01-21 00:02:27 +0000 UTC" firstStartedPulling="2026-01-21 00:02:28.718488121 +0000 UTC m=+5221.038748399" lastFinishedPulling="2026-01-21 00:02:29.372172551 +0000 UTC m=+5221.692432849" observedRunningTime="2026-01-21 00:02:30.61022171 +0000 UTC m=+5222.930482028" watchObservedRunningTime="2026-01-21 00:02:30.61720781 +0000 UTC m=+5222.937468098" Jan 21 00:02:31 crc kubenswrapper[5030]: I0121 00:02:31.629982 5030 generic.go:334] 
"Generic (PLEG): container finished" podID="c0408f65-3264-42f0-8364-9839f8b46222" containerID="a52f73fa0e272c089051843538b533563c54e09f5ff5207ac2d758d5e49e7cac" exitCode=0 Jan 21 00:02:31 crc kubenswrapper[5030]: I0121 00:02:31.630082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" event={"ID":"c0408f65-3264-42f0-8364-9839f8b46222","Type":"ContainerDied","Data":"a52f73fa0e272c089051843538b533563c54e09f5ff5207ac2d758d5e49e7cac"} Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.114910 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.163651 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.163702 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250046 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250202 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250809 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: 
\"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.250840 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj6h7\" (UniqueName: \"kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7\") pod \"c0408f65-3264-42f0-8364-9839f8b46222\" (UID: \"c0408f65-3264-42f0-8364-9839f8b46222\") " Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.258753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7" (OuterVolumeSpecName: "kube-api-access-rj6h7") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "kube-api-access-rj6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.270250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.276896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.285719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.287339 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory" (OuterVolumeSpecName: "inventory") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.291504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.293815 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.294119 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c0408f65-3264-42f0-8364-9839f8b46222" (UID: "c0408f65-3264-42f0-8364-9839f8b46222"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352650 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352689 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352704 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352717 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352731 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352745 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352757 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/c0408f65-3264-42f0-8364-9839f8b46222-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.352772 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj6h7\" (UniqueName: \"kubernetes.io/projected/c0408f65-3264-42f0-8364-9839f8b46222-kube-api-access-rj6h7\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.669292 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.669199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw" event={"ID":"c0408f65-3264-42f0-8364-9839f8b46222","Type":"ContainerDied","Data":"b4607f8eff8a546a9e7a0b80e7a2fa6e3d4ad6cc8a29127ea6fc93fe5abcc3e8"} Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.675843 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4607f8eff8a546a9e7a0b80e7a2fa6e3d4ad6cc8a29127ea6fc93fe5abcc3e8" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.703747 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw"] Jan 21 00:02:33 crc kubenswrapper[5030]: E0121 00:02:33.704168 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0408f65-3264-42f0-8364-9839f8b46222" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.704194 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0408f65-3264-42f0-8364-9839f8b46222" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.704379 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0408f65-3264-42f0-8364-9839f8b46222" containerName="nova-edpm-compute-global-edpm-compute-global" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.705033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.708893 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.708961 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.708977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.709026 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.709235 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.717292 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw"] Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.859599 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppvqh\" (UniqueName: \"kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.859714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" 
(UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.859765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.859823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.961252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.961522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppvqh\" (UniqueName: \"kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.961596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.961647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.968331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " 
pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.968389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.968839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:33 crc kubenswrapper[5030]: I0121 00:02:33.982194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppvqh\" (UniqueName: \"kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh\") pod \"custom-global-service-edpm-compute-global-9lfzw\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:34 crc kubenswrapper[5030]: I0121 00:02:34.029250 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:34 crc kubenswrapper[5030]: I0121 00:02:34.228949 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hd4dq" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="registry-server" probeResult="failure" output=< Jan 21 00:02:34 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 21 00:02:34 crc kubenswrapper[5030]: > Jan 21 00:02:34 crc kubenswrapper[5030]: I0121 00:02:34.505331 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw"] Jan 21 00:02:34 crc kubenswrapper[5030]: W0121 00:02:34.511854 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod514781ff_085c_4cb6_95f9_7d2ef45d5c99.slice/crio-5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e WatchSource:0}: Error finding container 5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e: Status 404 returned error can't find the container with id 5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e Jan 21 00:02:34 crc kubenswrapper[5030]: I0121 00:02:34.680403 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" event={"ID":"514781ff-085c-4cb6-95f9-7d2ef45d5c99","Type":"ContainerStarted","Data":"5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e"} Jan 21 00:02:37 crc kubenswrapper[5030]: I0121 00:02:37.708096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" event={"ID":"514781ff-085c-4cb6-95f9-7d2ef45d5c99","Type":"ContainerStarted","Data":"cc4c2de831173441106ea6dd1fd1c7911b0707c11132dd7aa40e8b8cc79474bc"} Jan 21 00:02:37 crc kubenswrapper[5030]: I0121 00:02:37.746740 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" podStartSLOduration=2.359739693 podStartE2EDuration="4.746712629s" podCreationTimestamp="2026-01-21 00:02:33 +0000 UTC" firstStartedPulling="2026-01-21 00:02:34.517002739 +0000 UTC m=+5226.837263027" lastFinishedPulling="2026-01-21 00:02:36.903975635 +0000 UTC m=+5229.224235963" observedRunningTime="2026-01-21 00:02:37.732650348 +0000 UTC m=+5230.052910646" watchObservedRunningTime="2026-01-21 00:02:37.746712629 +0000 UTC m=+5230.066972947" Jan 21 00:02:41 crc kubenswrapper[5030]: I0121 00:02:41.749637 5030 generic.go:334] "Generic (PLEG): container finished" podID="514781ff-085c-4cb6-95f9-7d2ef45d5c99" containerID="cc4c2de831173441106ea6dd1fd1c7911b0707c11132dd7aa40e8b8cc79474bc" exitCode=0 Jan 21 00:02:41 crc kubenswrapper[5030]: I0121 00:02:41.749789 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" event={"ID":"514781ff-085c-4cb6-95f9-7d2ef45d5c99","Type":"ContainerDied","Data":"cc4c2de831173441106ea6dd1fd1c7911b0707c11132dd7aa40e8b8cc79474bc"} Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.081749 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.151097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0\") pod \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.151179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle\") pod \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.151283 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppvqh\" (UniqueName: \"kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh\") pod \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.151398 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global\") pod \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\" (UID: \"514781ff-085c-4cb6-95f9-7d2ef45d5c99\") " Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.160862 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "514781ff-085c-4cb6-95f9-7d2ef45d5c99" (UID: "514781ff-085c-4cb6-95f9-7d2ef45d5c99"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.162238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh" (OuterVolumeSpecName: "kube-api-access-ppvqh") pod "514781ff-085c-4cb6-95f9-7d2ef45d5c99" (UID: "514781ff-085c-4cb6-95f9-7d2ef45d5c99"). InnerVolumeSpecName "kube-api-access-ppvqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.196747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "514781ff-085c-4cb6-95f9-7d2ef45d5c99" (UID: "514781ff-085c-4cb6-95f9-7d2ef45d5c99"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.197846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "514781ff-085c-4cb6-95f9-7d2ef45d5c99" (UID: "514781ff-085c-4cb6-95f9-7d2ef45d5c99"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.237259 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.252831 5030 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.252864 5030 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.252909 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppvqh\" (UniqueName: \"kubernetes.io/projected/514781ff-085c-4cb6-95f9-7d2ef45d5c99-kube-api-access-ppvqh\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.252929 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/514781ff-085c-4cb6-95f9-7d2ef45d5c99-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.296131 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.482980 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.773493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" event={"ID":"514781ff-085c-4cb6-95f9-7d2ef45d5c99","Type":"ContainerDied","Data":"5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e"} Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.773556 5030 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="5765dbfa6d7720b434ef905ee76ca48b590ab192c3f2e5e600983ca9ea41ca2e" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.773523 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw" Jan 21 00:02:43 crc kubenswrapper[5030]: I0121 00:02:43.962957 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:02:43 crc kubenswrapper[5030]: E0121 00:02:43.963236 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:02:44 crc kubenswrapper[5030]: I0121 00:02:44.781383 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hd4dq" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="registry-server" containerID="cri-o://ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675" gracePeriod=2 Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.247979 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:02:45 crc kubenswrapper[5030]: E0121 00:02:45.248334 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514781ff-085c-4cb6-95f9-7d2ef45d5c99" containerName="custom-global-service-edpm-compute-global" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.248348 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="514781ff-085c-4cb6-95f9-7d2ef45d5c99" containerName="custom-global-service-edpm-compute-global" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.248514 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="514781ff-085c-4cb6-95f9-7d2ef45d5c99" containerName="custom-global-service-edpm-compute-global" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.249275 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.251655 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.270356 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.281855 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.281909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.281954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.281989 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.282147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjzq\" (UniqueName: \"kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.383449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjzq\" (UniqueName: \"kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.383570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.383608 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.383692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.383720 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.384556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.384559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.384672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.385236 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.417908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjzq\" (UniqueName: \"kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq\") pod \"dnsmasq-dnsmasq-6668544499-84mhm\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.567102 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.708399 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.789039 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddt7f\" (UniqueName: \"kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f\") pod \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.789092 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities\") pod \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.789114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content\") pod \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\" (UID: \"2913e6de-f32f-4b7a-b67f-cc9625a1831c\") " Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.790521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities" (OuterVolumeSpecName: "utilities") pod "2913e6de-f32f-4b7a-b67f-cc9625a1831c" (UID: "2913e6de-f32f-4b7a-b67f-cc9625a1831c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.793225 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f" (OuterVolumeSpecName: "kube-api-access-ddt7f") pod "2913e6de-f32f-4b7a-b67f-cc9625a1831c" (UID: "2913e6de-f32f-4b7a-b67f-cc9625a1831c"). InnerVolumeSpecName "kube-api-access-ddt7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.795443 5030 generic.go:334] "Generic (PLEG): container finished" podID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerID="ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675" exitCode=0 Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.795499 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hd4dq" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.795495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerDied","Data":"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675"} Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.795634 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd4dq" event={"ID":"2913e6de-f32f-4b7a-b67f-cc9625a1831c","Type":"ContainerDied","Data":"92e29d5832acd637bb9acf4ac182193b1166f0510c6c1cec4fa90b7382f61114"} Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.795663 5030 scope.go:117] "RemoveContainer" containerID="ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.813691 5030 scope.go:117] "RemoveContainer" containerID="e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.838920 5030 scope.go:117] "RemoveContainer" containerID="8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.858295 5030 scope.go:117] "RemoveContainer" containerID="ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675" Jan 21 00:02:45 crc kubenswrapper[5030]: E0121 00:02:45.858916 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675\": container with ID starting with ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675 not found: ID does not exist" containerID="ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.858975 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675"} err="failed to get container status \"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675\": rpc error: code = NotFound desc = could not find container \"ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675\": container with ID starting with ab16ee0add78066cf5c55f95ba50c92b4625b09b4a0441fd418acdfbb2bee675 not found: ID does not exist" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.859009 5030 scope.go:117] "RemoveContainer" containerID="e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd" Jan 21 00:02:45 crc kubenswrapper[5030]: E0121 00:02:45.859394 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd\": container with ID starting with e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd not found: ID does not exist" containerID="e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.859425 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd"} err="failed to get container status \"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd\": rpc error: code = NotFound desc = could not find container 
\"e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd\": container with ID starting with e0608db7bcb6ef2ae65077a88ab24172c789bdad1ac37ff87d8eed028e632ccd not found: ID does not exist" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.859445 5030 scope.go:117] "RemoveContainer" containerID="8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4" Jan 21 00:02:45 crc kubenswrapper[5030]: E0121 00:02:45.859722 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4\": container with ID starting with 8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4 not found: ID does not exist" containerID="8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.859750 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4"} err="failed to get container status \"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4\": rpc error: code = NotFound desc = could not find container \"8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4\": container with ID starting with 8f5d3e992bf5b2b5be564b30360b97fd4a3095c5db92164ffa267a33cf6e06c4 not found: ID does not exist" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.890585 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddt7f\" (UniqueName: \"kubernetes.io/projected/2913e6de-f32f-4b7a-b67f-cc9625a1831c-kube-api-access-ddt7f\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.890648 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.929470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2913e6de-f32f-4b7a-b67f-cc9625a1831c" (UID: "2913e6de-f32f-4b7a-b67f-cc9625a1831c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:02:45 crc kubenswrapper[5030]: I0121 00:02:45.994686 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2913e6de-f32f-4b7a-b67f-cc9625a1831c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.016511 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:02:46 crc kubenswrapper[5030]: W0121 00:02:46.020036 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3065ec_86f1_4484_a0f3_f78c7b62c978.slice/crio-326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735 WatchSource:0}: Error finding container 326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735: Status 404 returned error can't find the container with id 326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735 Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.146050 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.161990 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hd4dq"] Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.809060 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerID="137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8" exitCode=0 Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.809424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" event={"ID":"0d3065ec-86f1-4484-a0f3-f78c7b62c978","Type":"ContainerDied","Data":"137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8"} Jan 21 00:02:46 crc kubenswrapper[5030]: I0121 00:02:46.809708 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" event={"ID":"0d3065ec-86f1-4484-a0f3-f78c7b62c978","Type":"ContainerStarted","Data":"326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735"} Jan 21 00:02:47 crc kubenswrapper[5030]: I0121 00:02:47.821664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" event={"ID":"0d3065ec-86f1-4484-a0f3-f78c7b62c978","Type":"ContainerStarted","Data":"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31"} Jan 21 00:02:47 crc kubenswrapper[5030]: I0121 00:02:47.822092 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:47 crc kubenswrapper[5030]: I0121 00:02:47.851251 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" podStartSLOduration=2.851225765 podStartE2EDuration="2.851225765s" podCreationTimestamp="2026-01-21 00:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:02:47.843249321 +0000 UTC m=+5240.163509619" watchObservedRunningTime="2026-01-21 00:02:47.851225765 +0000 UTC m=+5240.171486063" Jan 21 00:02:47 crc kubenswrapper[5030]: I0121 00:02:47.982595 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" path="/var/lib/kubelet/pods/2913e6de-f32f-4b7a-b67f-cc9625a1831c/volumes" Jan 21 00:02:55 crc kubenswrapper[5030]: I0121 00:02:55.569837 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:02:55 crc kubenswrapper[5030]: I0121 00:02:55.625130 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 21 00:02:55 crc kubenswrapper[5030]: I0121 00:02:55.626138 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="dnsmasq-dns" containerID="cri-o://e3fdcaef684db49c0c1ca8b951c4379f6cd5b3f350924378cff9de3b6686cac9" gracePeriod=10 Jan 21 00:02:55 crc kubenswrapper[5030]: I0121 00:02:55.903600 5030 generic.go:334] "Generic (PLEG): container finished" podID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerID="e3fdcaef684db49c0c1ca8b951c4379f6cd5b3f350924378cff9de3b6686cac9" exitCode=0 Jan 21 00:02:55 crc kubenswrapper[5030]: I0121 00:02:55.903675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" event={"ID":"036b9faa-2fd2-4849-b968-d5f5e9918f9a","Type":"ContainerDied","Data":"e3fdcaef684db49c0c1ca8b951c4379f6cd5b3f350924378cff9de3b6686cac9"} Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.094358 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.257590 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global\") pod \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.257721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config\") pod \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.257754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc\") pod \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.257824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wc8m\" (UniqueName: \"kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m\") pod \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\" (UID: \"036b9faa-2fd2-4849-b968-d5f5e9918f9a\") " Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.262973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m" (OuterVolumeSpecName: "kube-api-access-9wc8m") pod "036b9faa-2fd2-4849-b968-d5f5e9918f9a" (UID: "036b9faa-2fd2-4849-b968-d5f5e9918f9a"). InnerVolumeSpecName "kube-api-access-9wc8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.295717 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "036b9faa-2fd2-4849-b968-d5f5e9918f9a" (UID: "036b9faa-2fd2-4849-b968-d5f5e9918f9a"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.299427 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config" (OuterVolumeSpecName: "config") pod "036b9faa-2fd2-4849-b968-d5f5e9918f9a" (UID: "036b9faa-2fd2-4849-b968-d5f5e9918f9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.311977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global" (OuterVolumeSpecName: "edpm-compute-global") pod "036b9faa-2fd2-4849-b968-d5f5e9918f9a" (UID: "036b9faa-2fd2-4849-b968-d5f5e9918f9a"). InnerVolumeSpecName "edpm-compute-global". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.359604 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.359691 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.359709 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/036b9faa-2fd2-4849-b968-d5f5e9918f9a-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.359727 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wc8m\" (UniqueName: \"kubernetes.io/projected/036b9faa-2fd2-4849-b968-d5f5e9918f9a-kube-api-access-9wc8m\") on node \"crc\" DevicePath \"\"" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.914152 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" event={"ID":"036b9faa-2fd2-4849-b968-d5f5e9918f9a","Type":"ContainerDied","Data":"ed35637345f3a2062e1168de02b04afaa10fd19c6485f78a0999cc9086aa9108"} Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.914213 5030 scope.go:117] "RemoveContainer" containerID="e3fdcaef684db49c0c1ca8b951c4379f6cd5b3f350924378cff9de3b6686cac9" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.914290 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.938044 5030 scope.go:117] "RemoveContainer" containerID="f4bf1cef80dfcdca87ebe48eb0f81d29a5cbf7794edfc098eb7f252a3a065abe" Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.970859 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 21 00:02:56 crc kubenswrapper[5030]: I0121 00:02:56.979907 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-7vsfs"] Jan 21 00:02:57 crc kubenswrapper[5030]: I0121 00:02:57.972827 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" path="/var/lib/kubelet/pods/036b9faa-2fd2-4849-b968-d5f5e9918f9a/volumes" Jan 21 00:02:58 crc kubenswrapper[5030]: I0121 00:02:58.962444 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:02:58 crc kubenswrapper[5030]: E0121 00:02:58.962851 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.214299 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd"] Jan 21 00:03:00 crc kubenswrapper[5030]: E0121 00:03:00.215198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="init" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215222 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="init" Jan 21 00:03:00 crc kubenswrapper[5030]: E0121 00:03:00.215252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="extract-content" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="extract-content" Jan 21 00:03:00 crc kubenswrapper[5030]: E0121 00:03:00.215293 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="extract-utilities" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215305 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="extract-utilities" Jan 21 00:03:00 crc kubenswrapper[5030]: E0121 00:03:00.215331 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="registry-server" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215342 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="registry-server" Jan 21 00:03:00 crc kubenswrapper[5030]: E0121 00:03:00.215362 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="dnsmasq-dns" Jan 21 00:03:00 crc 
kubenswrapper[5030]: I0121 00:03:00.215373 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="dnsmasq-dns" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215586 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2913e6de-f32f-4b7a-b67f-cc9625a1831c" containerName="registry-server" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.215618 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="036b9faa-2fd2-4849-b968-d5f5e9918f9a" containerName="dnsmasq-dns" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.216263 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.224017 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.224791 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv"] Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.226025 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.227123 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.239189 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd"] Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.239561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.239800 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.240042 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.240246 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-tpppd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.246826 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv"] Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.318960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.319026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh5p2\" (UniqueName: 
\"kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.319059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzdb2\" (UniqueName: \"kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.319089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.319133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.319243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.420798 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh5p2\" (UniqueName: \"kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.420863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzdb2\" (UniqueName: \"kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.420896 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: 
\"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.420951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.422400 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.422521 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.427276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.428159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.428602 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.429515 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.443292 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qzdb2\" (UniqueName: \"kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2\") pod \"download-cache-edpm-multinodeset-edpm-compute-global-8hgqd\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.453024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh5p2\" (UniqueName: \"kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.561370 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.578822 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:00 crc kubenswrapper[5030]: I0121 00:03:00.870093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd"] Jan 21 00:03:01 crc kubenswrapper[5030]: I0121 00:03:01.134049 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv"] Jan 21 00:03:01 crc kubenswrapper[5030]: W0121 00:03:01.251705 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a1e96bd_a7f6_4c06_9fc6_d09fac76e710.slice/crio-6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304 WatchSource:0}: Error finding container 6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304: Status 404 returned error can't find the container with id 6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304 Jan 21 00:03:01 crc kubenswrapper[5030]: W0121 00:03:01.255082 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71f979a7_8c0c_4b78_8e10_c5bcc7cf68d2.slice/crio-41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739 WatchSource:0}: Error finding container 41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739: Status 404 returned error can't find the container with id 41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739 Jan 21 00:03:01 crc kubenswrapper[5030]: I0121 00:03:01.974418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" event={"ID":"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2","Type":"ContainerStarted","Data":"41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739"} Jan 21 00:03:01 crc kubenswrapper[5030]: I0121 00:03:01.974757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" event={"ID":"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710","Type":"ContainerStarted","Data":"6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304"} Jan 21 00:03:02 crc 
kubenswrapper[5030]: I0121 00:03:02.976234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" event={"ID":"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2","Type":"ContainerStarted","Data":"1b3ba403d5e893e33075b3c3e2c708f162680bd74867f03fa13f7e625e25754d"} Jan 21 00:03:02 crc kubenswrapper[5030]: I0121 00:03:02.978322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" event={"ID":"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710","Type":"ContainerStarted","Data":"d534c8ea94b95c2c94ff4109e16cc8f97d0f06c4118d05fe469a56cad543a368"} Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.019047 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" podStartSLOduration=2.501660974 podStartE2EDuration="3.019028499s" podCreationTimestamp="2026-01-21 00:03:00 +0000 UTC" firstStartedPulling="2026-01-21 00:03:01.259019624 +0000 UTC m=+5253.579279942" lastFinishedPulling="2026-01-21 00:03:01.776387179 +0000 UTC m=+5254.096647467" observedRunningTime="2026-01-21 00:03:02.995877098 +0000 UTC m=+5255.316137386" watchObservedRunningTime="2026-01-21 00:03:03.019028499 +0000 UTC m=+5255.339288787" Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.021493 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" podStartSLOduration=2.307857236 podStartE2EDuration="3.021485629s" podCreationTimestamp="2026-01-21 00:03:00 +0000 UTC" firstStartedPulling="2026-01-21 00:03:01.2559423 +0000 UTC m=+5253.576202628" lastFinishedPulling="2026-01-21 00:03:01.969570733 +0000 UTC m=+5254.289831021" observedRunningTime="2026-01-21 00:03:03.0145318 +0000 UTC m=+5255.334792088" watchObservedRunningTime="2026-01-21 00:03:03.021485629 +0000 UTC m=+5255.341745917" Jan 21 00:03:03 crc kubenswrapper[5030]: E0121 00:03:03.625033 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a1e96bd_a7f6_4c06_9fc6_d09fac76e710.slice/crio-conmon-d534c8ea94b95c2c94ff4109e16cc8f97d0f06c4118d05fe469a56cad543a368.scope\": RecentStats: unable to find data in memory cache]" Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.987399 5030 generic.go:334] "Generic (PLEG): container finished" podID="71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" containerID="1b3ba403d5e893e33075b3c3e2c708f162680bd74867f03fa13f7e625e25754d" exitCode=0 Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.987498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" event={"ID":"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2","Type":"ContainerDied","Data":"1b3ba403d5e893e33075b3c3e2c708f162680bd74867f03fa13f7e625e25754d"} Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.989403 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" containerID="d534c8ea94b95c2c94ff4109e16cc8f97d0f06c4118d05fe469a56cad543a368" exitCode=0 Jan 21 00:03:03 crc kubenswrapper[5030]: I0121 00:03:03.989466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" 
event={"ID":"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710","Type":"ContainerDied","Data":"d534c8ea94b95c2c94ff4109e16cc8f97d0f06c4118d05fe469a56cad543a368"} Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.476251 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.477209 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.620147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global\") pod \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.620232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzdb2\" (UniqueName: \"kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2\") pod \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.620300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset\") pod \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.620339 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory\") pod \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.621340 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory\") pod \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\" (UID: \"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.621544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh5p2\" (UniqueName: \"kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2\") pod \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\" (UID: \"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2\") " Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.626676 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2" (OuterVolumeSpecName: "kube-api-access-qzdb2") pod "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" (UID: "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710"). InnerVolumeSpecName "kube-api-access-qzdb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.633309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2" (OuterVolumeSpecName: "kube-api-access-nh5p2") pod "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" (UID: "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2"). InnerVolumeSpecName "kube-api-access-nh5p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.644566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" (UID: "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.663834 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory" (OuterVolumeSpecName: "inventory") pod "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" (UID: "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.669168 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" (UID: "9a1e96bd-a7f6-4c06-9fc6-d09fac76e710"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.673789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory" (OuterVolumeSpecName: "inventory") pod "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" (UID: "71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.723989 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh5p2\" (UniqueName: \"kubernetes.io/projected/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-kube-api-access-nh5p2\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.724044 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.724066 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzdb2\" (UniqueName: \"kubernetes.io/projected/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-kube-api-access-qzdb2\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.724088 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.724108 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:05 crc kubenswrapper[5030]: I0121 00:03:05.724127 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.020470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" event={"ID":"71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2","Type":"ContainerDied","Data":"41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739"} Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.020560 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41284db0515a0f49e4c9563ffbb2fde26120f46f92ad64c643ae434a1af56739" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.020530 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.023156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" event={"ID":"9a1e96bd-a7f6-4c06-9fc6-d09fac76e710","Type":"ContainerDied","Data":"6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304"} Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.023431 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b30052b56450afc78d08638156ebe5f205e6be7de368778b2ff4154b9d94304" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.023219 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.107661 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422"] Jan 21 00:03:06 crc kubenswrapper[5030]: E0121 00:03:06.108325 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.108495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:06 crc kubenswrapper[5030]: E0121 00:03:06.108734 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.108849 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.109191 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" containerName="download-cache-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.109350 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.110146 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.112765 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.113360 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.114005 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-tpppd" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.114178 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.115099 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.119605 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422"] Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.145422 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg"] Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.146571 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.152690 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.152931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.164989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg"] Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232241 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232292 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232361 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwms5\" (UniqueName: \"kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232444 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " 
pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6wf\" (UniqueName: \"kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.232563 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.333871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334006 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334070 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwms5\" (UniqueName: \"kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334383 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.334492 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg6wf\" (UniqueName: \"kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.338506 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.338709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.338862 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.340368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.341181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " 
pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.342145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.363665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwms5\" (UniqueName: \"kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.366430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6wf\" (UniqueName: \"kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf\") pod \"bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.433670 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.480483 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.949051 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg"] Jan 21 00:03:06 crc kubenswrapper[5030]: I0121 00:03:06.989044 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422"] Jan 21 00:03:06 crc kubenswrapper[5030]: W0121 00:03:06.998721 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8de79324_db47_4fa2_b08c_ac25869a4c9a.slice/crio-c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9 WatchSource:0}: Error finding container c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9: Status 404 returned error can't find the container with id c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9 Jan 21 00:03:07 crc kubenswrapper[5030]: I0121 00:03:07.031700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" event={"ID":"d9d98fe0-c996-4703-abdf-1149bb577de7","Type":"ContainerStarted","Data":"478398293a0a0118275e8229713c7784d257065aa0d85dc2ccf40ce5444e64dd"} Jan 21 00:03:07 crc kubenswrapper[5030]: I0121 00:03:07.032688 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" event={"ID":"8de79324-db47-4fa2-b08c-ac25869a4c9a","Type":"ContainerStarted","Data":"c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9"} Jan 21 00:03:08 crc kubenswrapper[5030]: I0121 00:03:08.041835 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" event={"ID":"d9d98fe0-c996-4703-abdf-1149bb577de7","Type":"ContainerStarted","Data":"e4ac8dd6948d0ea55265f9098316b293c94e72d19e89441a23b08995c34cde72"} Jan 21 00:03:08 crc kubenswrapper[5030]: I0121 00:03:08.044305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" event={"ID":"8de79324-db47-4fa2-b08c-ac25869a4c9a","Type":"ContainerStarted","Data":"bf1a5dc9da4f6f22aa1c0bce914a19fa47df1650802843f948da6a0a5a55d884"} Jan 21 00:03:08 crc kubenswrapper[5030]: I0121 00:03:08.070453 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" podStartSLOduration=1.523275494 podStartE2EDuration="2.070427931s" podCreationTimestamp="2026-01-21 00:03:06 +0000 UTC" firstStartedPulling="2026-01-21 00:03:06.954930753 +0000 UTC m=+5259.275191051" lastFinishedPulling="2026-01-21 00:03:07.50208319 +0000 UTC m=+5259.822343488" observedRunningTime="2026-01-21 00:03:08.063731228 +0000 UTC m=+5260.383991526" watchObservedRunningTime="2026-01-21 00:03:08.070427931 +0000 UTC m=+5260.390688219" Jan 21 00:03:09 crc kubenswrapper[5030]: I0121 00:03:09.963117 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:03:09 crc kubenswrapper[5030]: E0121 00:03:09.963920 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:03:10 crc kubenswrapper[5030]: I0121 00:03:10.063429 5030 generic.go:334] "Generic (PLEG): container finished" podID="d9d98fe0-c996-4703-abdf-1149bb577de7" containerID="e4ac8dd6948d0ea55265f9098316b293c94e72d19e89441a23b08995c34cde72" exitCode=0 Jan 21 00:03:10 crc kubenswrapper[5030]: I0121 00:03:10.063552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" event={"ID":"d9d98fe0-c996-4703-abdf-1149bb577de7","Type":"ContainerDied","Data":"e4ac8dd6948d0ea55265f9098316b293c94e72d19e89441a23b08995c34cde72"} Jan 21 00:03:10 crc kubenswrapper[5030]: I0121 00:03:10.065791 5030 generic.go:334] "Generic (PLEG): container finished" podID="8de79324-db47-4fa2-b08c-ac25869a4c9a" containerID="bf1a5dc9da4f6f22aa1c0bce914a19fa47df1650802843f948da6a0a5a55d884" exitCode=0 Jan 21 00:03:10 crc kubenswrapper[5030]: I0121 00:03:10.065853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" event={"ID":"8de79324-db47-4fa2-b08c-ac25869a4c9a","Type":"ContainerDied","Data":"bf1a5dc9da4f6f22aa1c0bce914a19fa47df1650802843f948da6a0a5a55d884"} Jan 21 00:03:10 crc kubenswrapper[5030]: I0121 00:03:10.097881 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" podStartSLOduration=3.613529567 podStartE2EDuration="4.09785061s" podCreationTimestamp="2026-01-21 00:03:06 +0000 UTC" firstStartedPulling="2026-01-21 00:03:07.002062156 +0000 UTC m=+5259.322322444" lastFinishedPulling="2026-01-21 00:03:07.486383199 +0000 UTC m=+5259.806643487" observedRunningTime="2026-01-21 00:03:08.082997545 +0000 UTC m=+5260.403257853" watchObservedRunningTime="2026-01-21 00:03:10.09785061 +0000 UTC m=+5262.418110948" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.469305 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.507550 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626256 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset\") pod \"8de79324-db47-4fa2-b08c-ac25869a4c9a\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626537 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global\") pod \"d9d98fe0-c996-4703-abdf-1149bb577de7\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory\") pod \"d9d98fe0-c996-4703-abdf-1149bb577de7\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle\") pod \"d9d98fe0-c996-4703-abdf-1149bb577de7\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory\") pod \"8de79324-db47-4fa2-b08c-ac25869a4c9a\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwms5\" (UniqueName: \"kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5\") pod \"8de79324-db47-4fa2-b08c-ac25869a4c9a\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle\") pod \"8de79324-db47-4fa2-b08c-ac25869a4c9a\" (UID: \"8de79324-db47-4fa2-b08c-ac25869a4c9a\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.626796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg6wf\" (UniqueName: \"kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf\") pod \"d9d98fe0-c996-4703-abdf-1149bb577de7\" (UID: \"d9d98fe0-c996-4703-abdf-1149bb577de7\") " Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.649494 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf" (OuterVolumeSpecName: "kube-api-access-lg6wf") pod "d9d98fe0-c996-4703-abdf-1149bb577de7" (UID: "d9d98fe0-c996-4703-abdf-1149bb577de7"). InnerVolumeSpecName "kube-api-access-lg6wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.649688 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5" (OuterVolumeSpecName: "kube-api-access-mwms5") pod "8de79324-db47-4fa2-b08c-ac25869a4c9a" (UID: "8de79324-db47-4fa2-b08c-ac25869a4c9a"). InnerVolumeSpecName "kube-api-access-mwms5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.650032 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8de79324-db47-4fa2-b08c-ac25869a4c9a" (UID: "8de79324-db47-4fa2-b08c-ac25869a4c9a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.652559 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d9d98fe0-c996-4703-abdf-1149bb577de7" (UID: "d9d98fe0-c996-4703-abdf-1149bb577de7"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.669849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory" (OuterVolumeSpecName: "inventory") pod "8de79324-db47-4fa2-b08c-ac25869a4c9a" (UID: "8de79324-db47-4fa2-b08c-ac25869a4c9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.683777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "8de79324-db47-4fa2-b08c-ac25869a4c9a" (UID: "8de79324-db47-4fa2-b08c-ac25869a4c9a"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.686878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "d9d98fe0-c996-4703-abdf-1149bb577de7" (UID: "d9d98fe0-c996-4703-abdf-1149bb577de7"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.698457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory" (OuterVolumeSpecName: "inventory") pod "d9d98fe0-c996-4703-abdf-1149bb577de7" (UID: "d9d98fe0-c996-4703-abdf-1149bb577de7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728553 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728596 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728610 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwms5\" (UniqueName: \"kubernetes.io/projected/8de79324-db47-4fa2-b08c-ac25869a4c9a-kube-api-access-mwms5\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728634 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728645 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg6wf\" (UniqueName: \"kubernetes.io/projected/d9d98fe0-c996-4703-abdf-1149bb577de7-kube-api-access-lg6wf\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728654 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/8de79324-db47-4fa2-b08c-ac25869a4c9a-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728666 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:11 crc kubenswrapper[5030]: I0121 00:03:11.728701 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9d98fe0-c996-4703-abdf-1149bb577de7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.097587 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.097692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg" event={"ID":"d9d98fe0-c996-4703-abdf-1149bb577de7","Type":"ContainerDied","Data":"478398293a0a0118275e8229713c7784d257065aa0d85dc2ccf40ce5444e64dd"} Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.097741 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="478398293a0a0118275e8229713c7784d257065aa0d85dc2ccf40ce5444e64dd" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.099510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" event={"ID":"8de79324-db47-4fa2-b08c-ac25869a4c9a","Type":"ContainerDied","Data":"c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9"} Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.099565 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71151c0c05ddaf17c21cac42ec23a0af867ae5e3d98afde39dc0802cb24b0e9" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.099580 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.179924 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp"] Jan 21 00:03:12 crc kubenswrapper[5030]: E0121 00:03:12.180452 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d98fe0-c996-4703-abdf-1149bb577de7" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.180530 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d98fe0-c996-4703-abdf-1149bb577de7" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:12 crc kubenswrapper[5030]: E0121 00:03:12.180593 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de79324-db47-4fa2-b08c-ac25869a4c9a" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.180673 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de79324-db47-4fa2-b08c-ac25869a4c9a" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.180870 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de79324-db47-4fa2-b08c-ac25869a4c9a" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.180937 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d98fe0-c996-4703-abdf-1149bb577de7" containerName="bootstrap-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.181400 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.187891 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.188011 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.188207 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.188459 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.190599 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp"] Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.338510 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwr2\" (UniqueName: \"kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.338603 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.338745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.440118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.440229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwr2\" (UniqueName: \"kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 
00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.440280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.444336 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.445350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.460092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwr2\" (UniqueName: \"kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2\") pod \"configure-network-edpm-multinodeset-edpm-compute-global-kjtsp\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.503491 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:12 crc kubenswrapper[5030]: I0121 00:03:12.974727 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp"] Jan 21 00:03:12 crc kubenswrapper[5030]: W0121 00:03:12.980332 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ef4425_ed8f_4995_96e0_e76e5e202ad0.slice/crio-62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906 WatchSource:0}: Error finding container 62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906: Status 404 returned error can't find the container with id 62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906 Jan 21 00:03:13 crc kubenswrapper[5030]: I0121 00:03:13.109465 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" event={"ID":"33ef4425-ed8f-4995-96e0-e76e5e202ad0","Type":"ContainerStarted","Data":"62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906"} Jan 21 00:03:14 crc kubenswrapper[5030]: I0121 00:03:14.120936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" event={"ID":"33ef4425-ed8f-4995-96e0-e76e5e202ad0","Type":"ContainerStarted","Data":"21057c50bf3441617fadd94c896c6a5cac3e9ac85b073879e5c2245b37717ab8"} Jan 21 00:03:14 crc kubenswrapper[5030]: I0121 00:03:14.142799 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" podStartSLOduration=1.6241984729999999 podStartE2EDuration="2.142777747s" podCreationTimestamp="2026-01-21 00:03:12 +0000 UTC" firstStartedPulling="2026-01-21 00:03:12.983946109 +0000 UTC m=+5265.304206397" lastFinishedPulling="2026-01-21 00:03:13.502525343 +0000 UTC m=+5265.822785671" observedRunningTime="2026-01-21 00:03:14.139452977 +0000 UTC m=+5266.459713275" watchObservedRunningTime="2026-01-21 00:03:14.142777747 +0000 UTC m=+5266.463038045" Jan 21 00:03:15 crc kubenswrapper[5030]: I0121 00:03:15.131014 5030 generic.go:334] "Generic (PLEG): container finished" podID="33ef4425-ed8f-4995-96e0-e76e5e202ad0" containerID="21057c50bf3441617fadd94c896c6a5cac3e9ac85b073879e5c2245b37717ab8" exitCode=0 Jan 21 00:03:15 crc kubenswrapper[5030]: I0121 00:03:15.131088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" event={"ID":"33ef4425-ed8f-4995-96e0-e76e5e202ad0","Type":"ContainerDied","Data":"21057c50bf3441617fadd94c896c6a5cac3e9ac85b073879e5c2245b37717ab8"} Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.453255 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.512326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwr2\" (UniqueName: \"kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2\") pod \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.512427 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global\") pod \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.512483 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory\") pod \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\" (UID: \"33ef4425-ed8f-4995-96e0-e76e5e202ad0\") " Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.517540 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2" (OuterVolumeSpecName: "kube-api-access-4fwr2") pod "33ef4425-ed8f-4995-96e0-e76e5e202ad0" (UID: "33ef4425-ed8f-4995-96e0-e76e5e202ad0"). InnerVolumeSpecName "kube-api-access-4fwr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.536809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory" (OuterVolumeSpecName: "inventory") pod "33ef4425-ed8f-4995-96e0-e76e5e202ad0" (UID: "33ef4425-ed8f-4995-96e0-e76e5e202ad0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.537846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "33ef4425-ed8f-4995-96e0-e76e5e202ad0" (UID: "33ef4425-ed8f-4995-96e0-e76e5e202ad0"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.614672 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.614711 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/33ef4425-ed8f-4995-96e0-e76e5e202ad0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:16 crc kubenswrapper[5030]: I0121 00:03:16.614725 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwr2\" (UniqueName: \"kubernetes.io/projected/33ef4425-ed8f-4995-96e0-e76e5e202ad0-kube-api-access-4fwr2\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.155505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" event={"ID":"33ef4425-ed8f-4995-96e0-e76e5e202ad0","Type":"ContainerDied","Data":"62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906"} Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.155565 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b0f0fb1311a4d011e24fe0a26f2c5994ac4588501d5fcf67d2617359bca906" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.155692 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.233828 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7"] Jan 21 00:03:17 crc kubenswrapper[5030]: E0121 00:03:17.234354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ef4425-ed8f-4995-96e0-e76e5e202ad0" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.234380 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ef4425-ed8f-4995-96e0-e76e5e202ad0" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.234715 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ef4425-ed8f-4995-96e0-e76e5e202ad0" containerName="configure-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.235591 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.238290 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.239750 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.240048 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.240397 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.258995 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7"] Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.327416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.327520 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcpnc\" (UniqueName: \"kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.327566 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.429132 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcpnc\" (UniqueName: \"kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.429207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 
00:03:17.429240 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.434044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.435212 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.449167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcpnc\" (UniqueName: \"kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc\") pod \"validate-network-edpm-multinodeset-edpm-compute-global-zkdp7\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:17 crc kubenswrapper[5030]: I0121 00:03:17.566240 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:18 crc kubenswrapper[5030]: I0121 00:03:18.236848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7"] Jan 21 00:03:19 crc kubenswrapper[5030]: I0121 00:03:19.172861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" event={"ID":"7644b304-ddc0-4057-8786-8e6b3f899386","Type":"ContainerStarted","Data":"e5b77e31575aea46b2642fe181515b8e3096d8e55836850fec4cb8f05c24832b"} Jan 21 00:03:19 crc kubenswrapper[5030]: I0121 00:03:19.173199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" event={"ID":"7644b304-ddc0-4057-8786-8e6b3f899386","Type":"ContainerStarted","Data":"1ca6230e367a818c4fabc18b8b55fb81cdce8f96031b34c5fc1f428501540ccb"} Jan 21 00:03:19 crc kubenswrapper[5030]: I0121 00:03:19.188579 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" podStartSLOduration=1.6594284830000001 podStartE2EDuration="2.188562283s" podCreationTimestamp="2026-01-21 00:03:17 +0000 UTC" firstStartedPulling="2026-01-21 00:03:18.241569481 +0000 UTC m=+5270.561829779" lastFinishedPulling="2026-01-21 00:03:18.770703251 +0000 UTC m=+5271.090963579" observedRunningTime="2026-01-21 00:03:19.186359569 +0000 UTC m=+5271.506619857" watchObservedRunningTime="2026-01-21 00:03:19.188562283 +0000 UTC m=+5271.508822571" Jan 21 00:03:20 crc kubenswrapper[5030]: I0121 00:03:20.182564 5030 generic.go:334] "Generic (PLEG): container finished" podID="7644b304-ddc0-4057-8786-8e6b3f899386" containerID="e5b77e31575aea46b2642fe181515b8e3096d8e55836850fec4cb8f05c24832b" exitCode=0 Jan 21 00:03:20 crc kubenswrapper[5030]: I0121 00:03:20.182649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" event={"ID":"7644b304-ddc0-4057-8786-8e6b3f899386","Type":"ContainerDied","Data":"e5b77e31575aea46b2642fe181515b8e3096d8e55836850fec4cb8f05c24832b"} Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.550089 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.700985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global\") pod \"7644b304-ddc0-4057-8786-8e6b3f899386\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.701281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory\") pod \"7644b304-ddc0-4057-8786-8e6b3f899386\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.701462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcpnc\" (UniqueName: \"kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc\") pod \"7644b304-ddc0-4057-8786-8e6b3f899386\" (UID: \"7644b304-ddc0-4057-8786-8e6b3f899386\") " Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.705983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc" (OuterVolumeSpecName: "kube-api-access-vcpnc") pod "7644b304-ddc0-4057-8786-8e6b3f899386" (UID: "7644b304-ddc0-4057-8786-8e6b3f899386"). InnerVolumeSpecName "kube-api-access-vcpnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.722320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "7644b304-ddc0-4057-8786-8e6b3f899386" (UID: "7644b304-ddc0-4057-8786-8e6b3f899386"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.724524 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory" (OuterVolumeSpecName: "inventory") pod "7644b304-ddc0-4057-8786-8e6b3f899386" (UID: "7644b304-ddc0-4057-8786-8e6b3f899386"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.803821 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.803865 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcpnc\" (UniqueName: \"kubernetes.io/projected/7644b304-ddc0-4057-8786-8e6b3f899386-kube-api-access-vcpnc\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:21 crc kubenswrapper[5030]: I0121 00:03:21.803877 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7644b304-ddc0-4057-8786-8e6b3f899386-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.209889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" event={"ID":"7644b304-ddc0-4057-8786-8e6b3f899386","Type":"ContainerDied","Data":"1ca6230e367a818c4fabc18b8b55fb81cdce8f96031b34c5fc1f428501540ccb"} Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.209950 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ca6230e367a818c4fabc18b8b55fb81cdce8f96031b34c5fc1f428501540ccb" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.209983 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.635587 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx"] Jan 21 00:03:22 crc kubenswrapper[5030]: E0121 00:03:22.636049 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7644b304-ddc0-4057-8786-8e6b3f899386" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.636070 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644b304-ddc0-4057-8786-8e6b3f899386" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.636352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7644b304-ddc0-4057-8786-8e6b3f899386" containerName="validate-network-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.636978 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.638859 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.638956 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.642412 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.642561 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.644350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx"] Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.718822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.718956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wf8\" (UniqueName: \"kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.719143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.821170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.821613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wf8\" (UniqueName: \"kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.821947 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.825685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.826597 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:22 crc kubenswrapper[5030]: I0121 00:03:22.843465 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wf8\" (UniqueName: \"kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8\") pod \"install-os-edpm-multinodeset-edpm-compute-global-rcqpx\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:23 crc kubenswrapper[5030]: I0121 00:03:23.005475 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:23 crc kubenswrapper[5030]: I0121 00:03:23.456922 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx"] Jan 21 00:03:24 crc kubenswrapper[5030]: I0121 00:03:24.238498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" event={"ID":"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2","Type":"ContainerStarted","Data":"aed6a55814c2617a5e8a7dfc71738c7f903a8a7da81207ae2a68d8cd63ca347f"} Jan 21 00:03:24 crc kubenswrapper[5030]: I0121 00:03:24.962102 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:03:24 crc kubenswrapper[5030]: I0121 00:03:24.974939 5030 scope.go:117] "RemoveContainer" containerID="db2bb0cf2f1a80c4d2224f7a8ac219f2f023585a6875d7b9b1e1d5f85b801548" Jan 21 00:03:24 crc kubenswrapper[5030]: E0121 00:03:24.977662 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:03:25 crc kubenswrapper[5030]: I0121 00:03:25.249543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" event={"ID":"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2","Type":"ContainerStarted","Data":"4deddf2d60cbd5a1d761a5715a9fb78ae7235375fa933bcf8f4675c23125d31e"} Jan 21 00:03:25 crc kubenswrapper[5030]: I0121 00:03:25.279088 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" podStartSLOduration=2.417257434 podStartE2EDuration="3.279065149s" podCreationTimestamp="2026-01-21 00:03:22 +0000 UTC" firstStartedPulling="2026-01-21 00:03:23.457856361 +0000 UTC m=+5275.778116639" lastFinishedPulling="2026-01-21 00:03:24.319664056 +0000 UTC m=+5276.639924354" observedRunningTime="2026-01-21 00:03:25.262273152 +0000 UTC m=+5277.582533480" watchObservedRunningTime="2026-01-21 00:03:25.279065149 +0000 UTC m=+5277.599325437" Jan 21 00:03:26 crc kubenswrapper[5030]: I0121 00:03:26.262424 5030 generic.go:334] "Generic (PLEG): container finished" podID="f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" containerID="4deddf2d60cbd5a1d761a5715a9fb78ae7235375fa933bcf8f4675c23125d31e" exitCode=0 Jan 21 00:03:26 crc kubenswrapper[5030]: I0121 00:03:26.262512 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" event={"ID":"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2","Type":"ContainerDied","Data":"4deddf2d60cbd5a1d761a5715a9fb78ae7235375fa933bcf8f4675c23125d31e"} Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.553975 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.604297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2wf8\" (UniqueName: \"kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8\") pod \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.604348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory\") pod \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.604385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global\") pod \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\" (UID: \"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2\") " Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.612879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8" (OuterVolumeSpecName: "kube-api-access-v2wf8") pod "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" (UID: "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2"). InnerVolumeSpecName "kube-api-access-v2wf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.624537 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory" (OuterVolumeSpecName: "inventory") pod "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" (UID: "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.625514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" (UID: "f776b79c-1a2d-4a18-ab4b-ace393bf1ab2"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.705991 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2wf8\" (UniqueName: \"kubernetes.io/projected/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-kube-api-access-v2wf8\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.706025 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:27 crc kubenswrapper[5030]: I0121 00:03:27.706037 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.291984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" event={"ID":"f776b79c-1a2d-4a18-ab4b-ace393bf1ab2","Type":"ContainerDied","Data":"aed6a55814c2617a5e8a7dfc71738c7f903a8a7da81207ae2a68d8cd63ca347f"} Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.292053 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed6a55814c2617a5e8a7dfc71738c7f903a8a7da81207ae2a68d8cd63ca347f" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.292090 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.349773 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7"] Jan 21 00:03:28 crc kubenswrapper[5030]: E0121 00:03:28.350068 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.350084 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.350206 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" containerName="install-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.350740 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.352604 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.352724 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.353041 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.354658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.372463 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7"] Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.519981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.520065 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.520197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m6g2\" (UniqueName: \"kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.622376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.622447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.622492 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5m6g2\" (UniqueName: \"kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.629902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.639190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.639489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m6g2\" (UniqueName: \"kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2\") pod \"configure-os-edpm-multinodeset-edpm-compute-global-xsfl7\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.677255 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:28 crc kubenswrapper[5030]: I0121 00:03:28.939284 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7"] Jan 21 00:03:29 crc kubenswrapper[5030]: I0121 00:03:29.304044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" event={"ID":"7456a5a7-d2e0-4e08-b19c-00e4d75429fb","Type":"ContainerStarted","Data":"f5d05b4118e51394fa46a63df9dfbb1ca52bf1e598770e798c1b68edec292d48"} Jan 21 00:03:30 crc kubenswrapper[5030]: I0121 00:03:30.312123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" event={"ID":"7456a5a7-d2e0-4e08-b19c-00e4d75429fb","Type":"ContainerStarted","Data":"fc669352154f42c2e7f1a3a6479262acbc8674ce5b236f4e5ece1cc7865c794d"} Jan 21 00:03:31 crc kubenswrapper[5030]: I0121 00:03:31.325424 5030 generic.go:334] "Generic (PLEG): container finished" podID="7456a5a7-d2e0-4e08-b19c-00e4d75429fb" containerID="fc669352154f42c2e7f1a3a6479262acbc8674ce5b236f4e5ece1cc7865c794d" exitCode=0 Jan 21 00:03:31 crc kubenswrapper[5030]: I0121 00:03:31.325483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" event={"ID":"7456a5a7-d2e0-4e08-b19c-00e4d75429fb","Type":"ContainerDied","Data":"fc669352154f42c2e7f1a3a6479262acbc8674ce5b236f4e5ece1cc7865c794d"} Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.642673 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.790392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m6g2\" (UniqueName: \"kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2\") pod \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.790657 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global\") pod \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.790695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory\") pod \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\" (UID: \"7456a5a7-d2e0-4e08-b19c-00e4d75429fb\") " Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.798827 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2" (OuterVolumeSpecName: "kube-api-access-5m6g2") pod "7456a5a7-d2e0-4e08-b19c-00e4d75429fb" (UID: "7456a5a7-d2e0-4e08-b19c-00e4d75429fb"). InnerVolumeSpecName "kube-api-access-5m6g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.820019 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "7456a5a7-d2e0-4e08-b19c-00e4d75429fb" (UID: "7456a5a7-d2e0-4e08-b19c-00e4d75429fb"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.829302 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory" (OuterVolumeSpecName: "inventory") pod "7456a5a7-d2e0-4e08-b19c-00e4d75429fb" (UID: "7456a5a7-d2e0-4e08-b19c-00e4d75429fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.892472 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.892520 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:32 crc kubenswrapper[5030]: I0121 00:03:32.892537 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m6g2\" (UniqueName: \"kubernetes.io/projected/7456a5a7-d2e0-4e08-b19c-00e4d75429fb-kube-api-access-5m6g2\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.344319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" event={"ID":"7456a5a7-d2e0-4e08-b19c-00e4d75429fb","Type":"ContainerDied","Data":"f5d05b4118e51394fa46a63df9dfbb1ca52bf1e598770e798c1b68edec292d48"} Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.344369 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d05b4118e51394fa46a63df9dfbb1ca52bf1e598770e798c1b68edec292d48" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.344397 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.417981 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj"] Jan 21 00:03:33 crc kubenswrapper[5030]: E0121 00:03:33.418358 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7456a5a7-d2e0-4e08-b19c-00e4d75429fb" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.418376 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7456a5a7-d2e0-4e08-b19c-00e4d75429fb" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.418519 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7456a5a7-d2e0-4e08-b19c-00e4d75429fb" containerName="configure-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.419042 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.422600 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.422984 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.423107 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.423227 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.429833 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj"] Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.602465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.602533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz28k\" (UniqueName: \"kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.602576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " 
pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.703523 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.704471 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz28k\" (UniqueName: \"kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.704704 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.710535 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.714085 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.727344 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz28k\" (UniqueName: \"kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k\") pod \"run-os-edpm-multinodeset-edpm-compute-global-bwknj\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:33 crc kubenswrapper[5030]: I0121 00:03:33.748128 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:34 crc kubenswrapper[5030]: I0121 00:03:34.180306 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj"] Jan 21 00:03:34 crc kubenswrapper[5030]: I0121 00:03:34.354195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" event={"ID":"15f92ce3-6dd6-4f75-816f-e1ce833f83c7","Type":"ContainerStarted","Data":"f85b5a8b299a41a683e5a9e210086d07a76d8d9c8228b0c41db4a4a5ccdf02f9"} Jan 21 00:03:35 crc kubenswrapper[5030]: I0121 00:03:35.366677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" event={"ID":"15f92ce3-6dd6-4f75-816f-e1ce833f83c7","Type":"ContainerStarted","Data":"3ce78b355797f87ef2cb1d7308b05170fa5147487151cf2accedf6170e20ac72"} Jan 21 00:03:35 crc kubenswrapper[5030]: I0121 00:03:35.393699 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" podStartSLOduration=1.931476371 podStartE2EDuration="2.393675669s" podCreationTimestamp="2026-01-21 00:03:33 +0000 UTC" firstStartedPulling="2026-01-21 00:03:34.186500158 +0000 UTC m=+5286.506760446" lastFinishedPulling="2026-01-21 00:03:34.648699456 +0000 UTC m=+5286.968959744" observedRunningTime="2026-01-21 00:03:35.388664087 +0000 UTC m=+5287.708924455" watchObservedRunningTime="2026-01-21 00:03:35.393675669 +0000 UTC m=+5287.713935977" Jan 21 00:03:36 crc kubenswrapper[5030]: I0121 00:03:36.381037 5030 generic.go:334] "Generic (PLEG): container finished" podID="15f92ce3-6dd6-4f75-816f-e1ce833f83c7" containerID="3ce78b355797f87ef2cb1d7308b05170fa5147487151cf2accedf6170e20ac72" exitCode=0 Jan 21 00:03:36 crc kubenswrapper[5030]: I0121 00:03:36.381100 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" event={"ID":"15f92ce3-6dd6-4f75-816f-e1ce833f83c7","Type":"ContainerDied","Data":"3ce78b355797f87ef2cb1d7308b05170fa5147487151cf2accedf6170e20ac72"} Jan 21 00:03:36 crc kubenswrapper[5030]: I0121 00:03:36.961814 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:03:36 crc kubenswrapper[5030]: E0121 00:03:36.962193 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.678606 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.874160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory\") pod \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.874429 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz28k\" (UniqueName: \"kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k\") pod \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.874476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global\") pod \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\" (UID: \"15f92ce3-6dd6-4f75-816f-e1ce833f83c7\") " Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.884026 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k" (OuterVolumeSpecName: "kube-api-access-gz28k") pod "15f92ce3-6dd6-4f75-816f-e1ce833f83c7" (UID: "15f92ce3-6dd6-4f75-816f-e1ce833f83c7"). InnerVolumeSpecName "kube-api-access-gz28k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.914115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "15f92ce3-6dd6-4f75-816f-e1ce833f83c7" (UID: "15f92ce3-6dd6-4f75-816f-e1ce833f83c7"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.914812 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory" (OuterVolumeSpecName: "inventory") pod "15f92ce3-6dd6-4f75-816f-e1ce833f83c7" (UID: "15f92ce3-6dd6-4f75-816f-e1ce833f83c7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.976251 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz28k\" (UniqueName: \"kubernetes.io/projected/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-kube-api-access-gz28k\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.976285 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:37 crc kubenswrapper[5030]: I0121 00:03:37.976296 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15f92ce3-6dd6-4f75-816f-e1ce833f83c7-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.402041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" event={"ID":"15f92ce3-6dd6-4f75-816f-e1ce833f83c7","Type":"ContainerDied","Data":"f85b5a8b299a41a683e5a9e210086d07a76d8d9c8228b0c41db4a4a5ccdf02f9"} Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.402081 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f85b5a8b299a41a683e5a9e210086d07a76d8d9c8228b0c41db4a4a5ccdf02f9" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.402143 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.469370 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs"] Jan 21 00:03:38 crc kubenswrapper[5030]: E0121 00:03:38.469736 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f92ce3-6dd6-4f75-816f-e1ce833f83c7" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.469764 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f92ce3-6dd6-4f75-816f-e1ce833f83c7" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.469982 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f92ce3-6dd6-4f75-816f-e1ce833f83c7" containerName="run-os-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.470528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.473579 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.473661 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.473937 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.478130 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.483662 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.490603 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs"] Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.583952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584096 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wsz\" (UniqueName: 
\"kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584118 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584250 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.584272 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686949 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.686981 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wsz\" (UniqueName: \"kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: 
\"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.687016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.687047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.687078 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.687103 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.687135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.691146 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.691997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc 
kubenswrapper[5030]: I0121 00:03:38.692379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.692808 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.693665 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.694064 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.694082 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.693867 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.695716 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.703055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.703669 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.706829 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wsz\" (UniqueName: \"kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz\") pod \"install-certs-edpm-multinodeset-edpm-compute-global-wgvvs\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:38 crc kubenswrapper[5030]: I0121 00:03:38.788341 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:39 crc kubenswrapper[5030]: I0121 00:03:39.036164 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs"] Jan 21 00:03:39 crc kubenswrapper[5030]: W0121 00:03:39.041585 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf21b5167_5f2e_41d0_b184_e0290e7030af.slice/crio-157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430 WatchSource:0}: Error finding container 157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430: Status 404 returned error can't find the container with id 157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430 Jan 21 00:03:39 crc kubenswrapper[5030]: I0121 00:03:39.414606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" event={"ID":"f21b5167-5f2e-41d0-b184-e0290e7030af","Type":"ContainerStarted","Data":"157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430"} Jan 21 00:03:40 crc kubenswrapper[5030]: I0121 00:03:40.425026 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" event={"ID":"f21b5167-5f2e-41d0-b184-e0290e7030af","Type":"ContainerStarted","Data":"a3ec44899c258b554f9d9ff568c58803c6610ea8e8bbdf8f877a2deebf7aa4b7"} Jan 21 00:03:40 crc kubenswrapper[5030]: I0121 00:03:40.461649 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" podStartSLOduration=1.8627235199999999 podStartE2EDuration="2.461613141s" podCreationTimestamp="2026-01-21 00:03:38 +0000 UTC" firstStartedPulling="2026-01-21 00:03:39.043571638 +0000 UTC m=+5291.363831946" lastFinishedPulling="2026-01-21 00:03:39.642461269 +0000 UTC m=+5291.962721567" observedRunningTime="2026-01-21 00:03:40.454671573 +0000 UTC m=+5292.774931871" watchObservedRunningTime="2026-01-21 00:03:40.461613141 
+0000 UTC m=+5292.781873469" Jan 21 00:03:41 crc kubenswrapper[5030]: I0121 00:03:41.439166 5030 generic.go:334] "Generic (PLEG): container finished" podID="f21b5167-5f2e-41d0-b184-e0290e7030af" containerID="a3ec44899c258b554f9d9ff568c58803c6610ea8e8bbdf8f877a2deebf7aa4b7" exitCode=0 Jan 21 00:03:41 crc kubenswrapper[5030]: I0121 00:03:41.439214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" event={"ID":"f21b5167-5f2e-41d0-b184-e0290e7030af","Type":"ContainerDied","Data":"a3ec44899c258b554f9d9ff568c58803c6610ea8e8bbdf8f877a2deebf7aa4b7"} Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.770318 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959204 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959371 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wsz\" (UniqueName: \"kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959520 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959555 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959582 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959611 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.959845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle\") pod \"f21b5167-5f2e-41d0-b184-e0290e7030af\" (UID: \"f21b5167-5f2e-41d0-b184-e0290e7030af\") " Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.965963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.966356 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.967117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.967145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.967497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.967616 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.968136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz" (OuterVolumeSpecName: "kube-api-access-j7wsz") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "kube-api-access-j7wsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.968361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.968576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.969445 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.991408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory" (OuterVolumeSpecName: "inventory") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:42 crc kubenswrapper[5030]: I0121 00:03:42.997582 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "f21b5167-5f2e-41d0-b184-e0290e7030af" (UID: "f21b5167-5f2e-41d0-b184-e0290e7030af"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.063466 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064212 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064249 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064277 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064301 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064322 5030 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064344 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064367 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064388 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-sriov-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064409 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064431 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wsz\" (UniqueName: \"kubernetes.io/projected/f21b5167-5f2e-41d0-b184-e0290e7030af-kube-api-access-j7wsz\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.064452 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/f21b5167-5f2e-41d0-b184-e0290e7030af-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.466791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" event={"ID":"f21b5167-5f2e-41d0-b184-e0290e7030af","Type":"ContainerDied","Data":"157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430"} Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.466858 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="157fb8c315e84c77f338074bd0293ba40f1f9251eed47038b18cd5bbe46d8430" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.466933 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.709331 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5"] Jan 21 00:03:43 crc kubenswrapper[5030]: E0121 00:03:43.709814 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21b5167-5f2e-41d0-b184-e0290e7030af" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.709840 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21b5167-5f2e-41d0-b184-e0290e7030af" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.710035 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21b5167-5f2e-41d0-b184-e0290e7030af" containerName="install-certs-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.710600 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.713903 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.714497 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.714828 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.720121 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.720993 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.724611 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.726930 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5"] Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.876312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.876423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.876469 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9str6\" (UniqueName: \"kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.876524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.877422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.984269 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.984415 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.984490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9str6\" (UniqueName: \"kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.984584 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.984774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.986912 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.990766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:43 crc kubenswrapper[5030]: I0121 00:03:43.997685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:44 crc kubenswrapper[5030]: I0121 00:03:44.005225 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:44 crc kubenswrapper[5030]: I0121 00:03:44.008525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9str6\" (UniqueName: \"kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6\") pod \"ovn-edpm-multinodeset-edpm-compute-global-msbl5\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:44 crc kubenswrapper[5030]: I0121 00:03:44.040267 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:44 crc kubenswrapper[5030]: I0121 00:03:44.532541 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5"] Jan 21 00:03:44 crc kubenswrapper[5030]: W0121 00:03:44.536934 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1a9bec2_dc0b_40b7_a84f_3321849e3240.slice/crio-0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1 WatchSource:0}: Error finding container 0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1: Status 404 returned error can't find the container with id 0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1 Jan 21 00:03:44 crc kubenswrapper[5030]: I0121 00:03:44.541128 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:03:45 crc kubenswrapper[5030]: I0121 00:03:45.493304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" event={"ID":"a1a9bec2-dc0b-40b7-a84f-3321849e3240","Type":"ContainerStarted","Data":"0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1"} Jan 21 00:03:46 crc kubenswrapper[5030]: I0121 00:03:46.510609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" event={"ID":"a1a9bec2-dc0b-40b7-a84f-3321849e3240","Type":"ContainerStarted","Data":"7e78918f3cd2ec77333e5bf501e588f142d2ac1fac7efe1267fda1c35affa090"} Jan 21 00:03:46 crc kubenswrapper[5030]: I0121 00:03:46.538361 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" podStartSLOduration=2.760110924 podStartE2EDuration="3.538345154s" podCreationTimestamp="2026-01-21 00:03:43 +0000 UTC" firstStartedPulling="2026-01-21 00:03:44.540915823 +0000 UTC m=+5296.861176111" lastFinishedPulling="2026-01-21 00:03:45.319150043 +0000 UTC m=+5297.639410341" observedRunningTime="2026-01-21 00:03:46.536352586 +0000 UTC 
m=+5298.856612904" watchObservedRunningTime="2026-01-21 00:03:46.538345154 +0000 UTC m=+5298.858605442" Jan 21 00:03:47 crc kubenswrapper[5030]: I0121 00:03:47.523374 5030 generic.go:334] "Generic (PLEG): container finished" podID="a1a9bec2-dc0b-40b7-a84f-3321849e3240" containerID="7e78918f3cd2ec77333e5bf501e588f142d2ac1fac7efe1267fda1c35affa090" exitCode=0 Jan 21 00:03:47 crc kubenswrapper[5030]: I0121 00:03:47.523421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" event={"ID":"a1a9bec2-dc0b-40b7-a84f-3321849e3240","Type":"ContainerDied","Data":"7e78918f3cd2ec77333e5bf501e588f142d2ac1fac7efe1267fda1c35affa090"} Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.854401 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.980291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global\") pod \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.980408 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9str6\" (UniqueName: \"kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6\") pod \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.980471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory\") pod \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.980518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0\") pod \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.980558 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle\") pod \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\" (UID: \"a1a9bec2-dc0b-40b7-a84f-3321849e3240\") " Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.987848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6" (OuterVolumeSpecName: "kube-api-access-9str6") pod "a1a9bec2-dc0b-40b7-a84f-3321849e3240" (UID: "a1a9bec2-dc0b-40b7-a84f-3321849e3240"). InnerVolumeSpecName "kube-api-access-9str6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:48 crc kubenswrapper[5030]: I0121 00:03:48.988022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a1a9bec2-dc0b-40b7-a84f-3321849e3240" (UID: "a1a9bec2-dc0b-40b7-a84f-3321849e3240"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.003839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "a1a9bec2-dc0b-40b7-a84f-3321849e3240" (UID: "a1a9bec2-dc0b-40b7-a84f-3321849e3240"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.007312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a1a9bec2-dc0b-40b7-a84f-3321849e3240" (UID: "a1a9bec2-dc0b-40b7-a84f-3321849e3240"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.018838 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory" (OuterVolumeSpecName: "inventory") pod "a1a9bec2-dc0b-40b7-a84f-3321849e3240" (UID: "a1a9bec2-dc0b-40b7-a84f-3321849e3240"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.082897 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9str6\" (UniqueName: \"kubernetes.io/projected/a1a9bec2-dc0b-40b7-a84f-3321849e3240-kube-api-access-9str6\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.082960 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.082983 5030 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.083002 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.083022 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/a1a9bec2-dc0b-40b7-a84f-3321849e3240-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.544702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" event={"ID":"a1a9bec2-dc0b-40b7-a84f-3321849e3240","Type":"ContainerDied","Data":"0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1"} Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.545260 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cdbd19ab789bf46aac6f7ee6ee0e791f26eaf23e5ed8ce1e10d76377f9165e1" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.544770 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.633103 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b"] Jan 21 00:03:49 crc kubenswrapper[5030]: E0121 00:03:49.633570 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a9bec2-dc0b-40b7-a84f-3321849e3240" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.633597 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a9bec2-dc0b-40b7-a84f-3321849e3240" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.633874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a9bec2-dc0b-40b7-a84f-3321849e3240" containerName="ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.634693 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.640905 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.641694 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.642004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.642536 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.643012 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.643514 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.648152 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.655016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b"] Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.795803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.795857 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.795927 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnbw\" (UniqueName: \"kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.795958 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " 
pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.796007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.796051 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.796208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.796436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.898594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.898737 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.898781 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: 
\"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.898886 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.898984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.899022 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.899065 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnbw\" (UniqueName: \"kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.899109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.905815 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.905908 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " 
pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.905947 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.905975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.907241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.907269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.907860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.923175 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnbw\" (UniqueName: \"kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:49 crc kubenswrapper[5030]: I0121 00:03:49.952912 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:50 crc kubenswrapper[5030]: I0121 00:03:50.502340 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b"] Jan 21 00:03:50 crc kubenswrapper[5030]: I0121 00:03:50.557940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" event={"ID":"61619052-c2f6-435e-935f-d1e60275a021","Type":"ContainerStarted","Data":"b61e35743156cee624fa0bec1479183e743cd2b2f742da01a0cb2daf0edd958f"} Jan 21 00:03:51 crc kubenswrapper[5030]: I0121 00:03:51.570829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" event={"ID":"61619052-c2f6-435e-935f-d1e60275a021","Type":"ContainerStarted","Data":"9a3b99b979ff6f98a24aa52e56f001865bfa347ccfcc6000baddf9d101b99fa1"} Jan 21 00:03:51 crc kubenswrapper[5030]: I0121 00:03:51.592319 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" podStartSLOduration=2.068066797 podStartE2EDuration="2.592297328s" podCreationTimestamp="2026-01-21 00:03:49 +0000 UTC" firstStartedPulling="2026-01-21 00:03:50.507041104 +0000 UTC m=+5302.827301432" lastFinishedPulling="2026-01-21 00:03:51.031271635 +0000 UTC m=+5303.351531963" observedRunningTime="2026-01-21 00:03:51.588121467 +0000 UTC m=+5303.908381755" watchObservedRunningTime="2026-01-21 00:03:51.592297328 +0000 UTC m=+5303.912557616" Jan 21 00:03:51 crc kubenswrapper[5030]: I0121 00:03:51.962416 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:03:51 crc kubenswrapper[5030]: E0121 00:03:51.962688 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:03:53 crc kubenswrapper[5030]: I0121 00:03:53.591787 5030 generic.go:334] "Generic (PLEG): container finished" podID="61619052-c2f6-435e-935f-d1e60275a021" containerID="9a3b99b979ff6f98a24aa52e56f001865bfa347ccfcc6000baddf9d101b99fa1" exitCode=0 Jan 21 00:03:53 crc kubenswrapper[5030]: I0121 00:03:53.591889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" event={"ID":"61619052-c2f6-435e-935f-d1e60275a021","Type":"ContainerDied","Data":"9a3b99b979ff6f98a24aa52e56f001865bfa347ccfcc6000baddf9d101b99fa1"} Jan 21 00:03:54 crc kubenswrapper[5030]: I0121 00:03:54.946891 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.092701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.092851 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.092922 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.092999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.093066 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.093118 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.093171 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnbw\" (UniqueName: \"kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.093237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0\") pod \"61619052-c2f6-435e-935f-d1e60275a021\" (UID: \"61619052-c2f6-435e-935f-d1e60275a021\") " Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.100030 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.100952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw" (OuterVolumeSpecName: "kube-api-access-ttnbw") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "kube-api-access-ttnbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.119616 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.121843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory" (OuterVolumeSpecName: "inventory") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.125993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.134290 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.136288 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.136829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "61619052-c2f6-435e-935f-d1e60275a021" (UID: "61619052-c2f6-435e-935f-d1e60275a021"). InnerVolumeSpecName "nova-metadata-neutron-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194535 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194842 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194862 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnbw\" (UniqueName: \"kubernetes.io/projected/61619052-c2f6-435e-935f-d1e60275a021-kube-api-access-ttnbw\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194879 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194911 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194923 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194935 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.194949 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/61619052-c2f6-435e-935f-d1e60275a021-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.621140 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" event={"ID":"61619052-c2f6-435e-935f-d1e60275a021","Type":"ContainerDied","Data":"b61e35743156cee624fa0bec1479183e743cd2b2f742da01a0cb2daf0edd958f"} Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.621183 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61e35743156cee624fa0bec1479183e743cd2b2f742da01a0cb2daf0edd958f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.621229 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.699024 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f"] Jan 21 00:03:55 crc kubenswrapper[5030]: E0121 00:03:55.699380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61619052-c2f6-435e-935f-d1e60275a021" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.699399 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="61619052-c2f6-435e-935f-d1e60275a021" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.699839 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="61619052-c2f6-435e-935f-d1e60275a021" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-global" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.700431 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.702934 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.703468 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.704013 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.705270 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.708198 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.708207 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.714958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f"] Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.812612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.812714 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.812758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdljl\" (UniqueName: \"kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.812781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.812819 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.914401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.914481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.914544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdljl\" (UniqueName: \"kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.914585 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.914714 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.919960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.920020 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.921401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.923235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:55 crc kubenswrapper[5030]: I0121 00:03:55.938387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdljl\" (UniqueName: \"kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:56 crc kubenswrapper[5030]: I0121 00:03:56.015848 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:03:56 crc kubenswrapper[5030]: I0121 00:03:56.500452 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f"] Jan 21 00:03:56 crc kubenswrapper[5030]: W0121 00:03:56.505596 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11f4d776_8b41_4b6e_a237_a7e0a1e4aa87.slice/crio-cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44 WatchSource:0}: Error finding container cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44: Status 404 returned error can't find the container with id cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44 Jan 21 00:03:56 crc kubenswrapper[5030]: I0121 00:03:56.633509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" event={"ID":"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87","Type":"ContainerStarted","Data":"cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44"} Jan 21 00:03:58 crc kubenswrapper[5030]: I0121 00:03:58.656477 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" event={"ID":"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87","Type":"ContainerStarted","Data":"7dd27b2996adc0ef0d867132f450301a477661cef0106f18ed271dbf2dae7bf5"} Jan 21 00:03:58 crc kubenswrapper[5030]: I0121 00:03:58.687364 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" podStartSLOduration=2.843742546 podStartE2EDuration="3.687338821s" podCreationTimestamp="2026-01-21 00:03:55 +0000 UTC" firstStartedPulling="2026-01-21 00:03:56.510425438 +0000 UTC m=+5308.830685736" lastFinishedPulling="2026-01-21 00:03:57.354021713 +0000 UTC m=+5309.674282011" observedRunningTime="2026-01-21 00:03:58.676929009 +0000 UTC m=+5310.997189347" watchObservedRunningTime="2026-01-21 00:03:58.687338821 +0000 UTC m=+5311.007599139" Jan 21 00:03:59 crc kubenswrapper[5030]: I0121 00:03:59.666581 5030 generic.go:334] "Generic (PLEG): container finished" podID="11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" containerID="7dd27b2996adc0ef0d867132f450301a477661cef0106f18ed271dbf2dae7bf5" exitCode=0 Jan 21 00:03:59 crc kubenswrapper[5030]: I0121 00:03:59.666694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" event={"ID":"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87","Type":"ContainerDied","Data":"7dd27b2996adc0ef0d867132f450301a477661cef0106f18ed271dbf2dae7bf5"} Jan 21 00:04:00 crc kubenswrapper[5030]: I0121 00:04:00.959764 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.012956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global\") pod \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.013091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory\") pod \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.013116 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle\") pod \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.013176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdljl\" (UniqueName: \"kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl\") pod \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.013230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0\") pod \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\" (UID: \"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87\") " Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.019221 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" (UID: "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.019257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl" (OuterVolumeSpecName: "kube-api-access-wdljl") pod "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" (UID: "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87"). InnerVolumeSpecName "kube-api-access-wdljl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.045661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" (UID: "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.064604 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory" (OuterVolumeSpecName: "inventory") pod "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" (UID: "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.067429 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" (UID: "11f4d776-8b41-4b6e-a237-a7e0a1e4aa87"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.110767 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v"] Jan 21 00:04:01 crc kubenswrapper[5030]: E0121 00:04:01.111148 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.111171 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.111370 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.112002 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114514 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114549 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114566 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdljl\" (UniqueName: \"kubernetes.io/projected/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-kube-api-access-wdljl\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114581 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114595 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.114783 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.122743 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v"] Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.215888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.215945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.215965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.215999 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.216097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srr9\" (UniqueName: \"kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.317483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.317594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srr9\" (UniqueName: \"kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.317719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.317779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.317815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.322001 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory\") pod 
\"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.322142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.322415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.322559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.341527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srr9\" (UniqueName: \"kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.443829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.687404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" event={"ID":"11f4d776-8b41-4b6e-a237-a7e0a1e4aa87","Type":"ContainerDied","Data":"cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44"} Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.687784 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cefeda45ed4959d44006c640bc6a64cc45d0f6b24ff16c72816352c5ff1c2c44" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.687498 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f" Jan 21 00:04:01 crc kubenswrapper[5030]: I0121 00:04:01.702945 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v"] Jan 21 00:04:02 crc kubenswrapper[5030]: I0121 00:04:02.702303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" event={"ID":"78643fa1-c6b8-41c1-a3be-17517755b59c","Type":"ContainerStarted","Data":"a5163e335a2657d84e00e57cac5dbe4bf62d749cbb0df60fb460b25d9a4845dc"} Jan 21 00:04:02 crc kubenswrapper[5030]: I0121 00:04:02.702660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" event={"ID":"78643fa1-c6b8-41c1-a3be-17517755b59c","Type":"ContainerStarted","Data":"06dbad481228dcab1a0c1d216b08f53c5ef2beab9d29395a626051f870b41f62"} Jan 21 00:04:02 crc kubenswrapper[5030]: I0121 00:04:02.725472 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" podStartSLOduration=1.135269684 podStartE2EDuration="1.725447064s" podCreationTimestamp="2026-01-21 00:04:01 +0000 UTC" firstStartedPulling="2026-01-21 00:04:01.710387372 +0000 UTC m=+5314.030647660" lastFinishedPulling="2026-01-21 00:04:02.300564752 +0000 UTC m=+5314.620825040" observedRunningTime="2026-01-21 00:04:02.720045602 +0000 UTC m=+5315.040305930" watchObservedRunningTime="2026-01-21 00:04:02.725447064 +0000 UTC m=+5315.045707372" Jan 21 00:04:03 crc kubenswrapper[5030]: I0121 00:04:03.720054 5030 generic.go:334] "Generic (PLEG): container finished" podID="78643fa1-c6b8-41c1-a3be-17517755b59c" containerID="a5163e335a2657d84e00e57cac5dbe4bf62d749cbb0df60fb460b25d9a4845dc" exitCode=0 Jan 21 00:04:03 crc kubenswrapper[5030]: I0121 00:04:03.720156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" event={"ID":"78643fa1-c6b8-41c1-a3be-17517755b59c","Type":"ContainerDied","Data":"a5163e335a2657d84e00e57cac5dbe4bf62d749cbb0df60fb460b25d9a4845dc"} Jan 21 00:04:04 crc kubenswrapper[5030]: I0121 00:04:04.962427 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:04:04 crc kubenswrapper[5030]: E0121 00:04:04.962917 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.100371 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.185237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0\") pod \"78643fa1-c6b8-41c1-a3be-17517755b59c\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.185827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory\") pod \"78643fa1-c6b8-41c1-a3be-17517755b59c\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.185997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5srr9\" (UniqueName: \"kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9\") pod \"78643fa1-c6b8-41c1-a3be-17517755b59c\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.186086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle\") pod \"78643fa1-c6b8-41c1-a3be-17517755b59c\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.186142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global\") pod \"78643fa1-c6b8-41c1-a3be-17517755b59c\" (UID: \"78643fa1-c6b8-41c1-a3be-17517755b59c\") " Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.191722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "78643fa1-c6b8-41c1-a3be-17517755b59c" (UID: "78643fa1-c6b8-41c1-a3be-17517755b59c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.191954 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9" (OuterVolumeSpecName: "kube-api-access-5srr9") pod "78643fa1-c6b8-41c1-a3be-17517755b59c" (UID: "78643fa1-c6b8-41c1-a3be-17517755b59c"). InnerVolumeSpecName "kube-api-access-5srr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.213767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "78643fa1-c6b8-41c1-a3be-17517755b59c" (UID: "78643fa1-c6b8-41c1-a3be-17517755b59c"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.216906 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory" (OuterVolumeSpecName: "inventory") pod "78643fa1-c6b8-41c1-a3be-17517755b59c" (UID: "78643fa1-c6b8-41c1-a3be-17517755b59c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.226787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "78643fa1-c6b8-41c1-a3be-17517755b59c" (UID: "78643fa1-c6b8-41c1-a3be-17517755b59c"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.287428 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5srr9\" (UniqueName: \"kubernetes.io/projected/78643fa1-c6b8-41c1-a3be-17517755b59c-kube-api-access-5srr9\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.287461 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.287473 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.287482 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.287495 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78643fa1-c6b8-41c1-a3be-17517755b59c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.747915 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" event={"ID":"78643fa1-c6b8-41c1-a3be-17517755b59c","Type":"ContainerDied","Data":"06dbad481228dcab1a0c1d216b08f53c5ef2beab9d29395a626051f870b41f62"} Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.747989 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06dbad481228dcab1a0c1d216b08f53c5ef2beab9d29395a626051f870b41f62" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.748039 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.831141 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt"] Jan 21 00:04:05 crc kubenswrapper[5030]: E0121 00:04:05.831495 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78643fa1-c6b8-41c1-a3be-17517755b59c" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.831512 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78643fa1-c6b8-41c1-a3be-17517755b59c" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.831663 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78643fa1-c6b8-41c1-a3be-17517755b59c" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.832136 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.837168 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.837409 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.837479 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.837596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.837780 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.838305 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.844094 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt"] Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.896303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5tkq\" (UniqueName: \"kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.896392 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " 
pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.896465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.896536 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.896655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.997939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.998113 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5tkq\" (UniqueName: \"kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.998179 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.998278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:05 crc kubenswrapper[5030]: I0121 00:04:05.998581 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.004580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.005727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.005800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.008208 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.022089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5tkq\" (UniqueName: \"kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.150007 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.472972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt"] Jan 21 00:04:06 crc kubenswrapper[5030]: W0121 00:04:06.479540 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod090eb691_45f3_4215_b4d3_b6bfb0915137.slice/crio-dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4 WatchSource:0}: Error finding container dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4: Status 404 returned error can't find the container with id dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4 Jan 21 00:04:06 crc kubenswrapper[5030]: I0121 00:04:06.758785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" event={"ID":"090eb691-45f3-4215-b4d3-b6bfb0915137","Type":"ContainerStarted","Data":"dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4"} Jan 21 00:04:07 crc kubenswrapper[5030]: I0121 00:04:07.774106 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" event={"ID":"090eb691-45f3-4215-b4d3-b6bfb0915137","Type":"ContainerStarted","Data":"e23f615058c2e18301e1bba79c8abfd2f23643f312553b46b4ffbe78b01a18ca"} Jan 21 00:04:07 crc kubenswrapper[5030]: I0121 00:04:07.804861 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" podStartSLOduration=2.251354186 podStartE2EDuration="2.804835797s" podCreationTimestamp="2026-01-21 00:04:05 +0000 UTC" firstStartedPulling="2026-01-21 00:04:06.482084834 +0000 UTC m=+5318.802345132" lastFinishedPulling="2026-01-21 00:04:07.035566415 +0000 UTC m=+5319.355826743" observedRunningTime="2026-01-21 00:04:07.795563272 +0000 UTC m=+5320.115823600" watchObservedRunningTime="2026-01-21 00:04:07.804835797 +0000 UTC m=+5320.125096115" Jan 21 00:04:08 crc kubenswrapper[5030]: I0121 00:04:08.784887 5030 generic.go:334] "Generic (PLEG): container finished" podID="090eb691-45f3-4215-b4d3-b6bfb0915137" containerID="e23f615058c2e18301e1bba79c8abfd2f23643f312553b46b4ffbe78b01a18ca" exitCode=0 Jan 21 00:04:08 crc kubenswrapper[5030]: I0121 00:04:08.784950 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" event={"ID":"090eb691-45f3-4215-b4d3-b6bfb0915137","Type":"ContainerDied","Data":"e23f615058c2e18301e1bba79c8abfd2f23643f312553b46b4ffbe78b01a18ca"} Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.138535 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.173201 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global\") pod \"090eb691-45f3-4215-b4d3-b6bfb0915137\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.173257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle\") pod \"090eb691-45f3-4215-b4d3-b6bfb0915137\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.173293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0\") pod \"090eb691-45f3-4215-b4d3-b6bfb0915137\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.173384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5tkq\" (UniqueName: \"kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq\") pod \"090eb691-45f3-4215-b4d3-b6bfb0915137\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.173504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory\") pod \"090eb691-45f3-4215-b4d3-b6bfb0915137\" (UID: \"090eb691-45f3-4215-b4d3-b6bfb0915137\") " Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.180177 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "090eb691-45f3-4215-b4d3-b6bfb0915137" (UID: "090eb691-45f3-4215-b4d3-b6bfb0915137"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.184309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq" (OuterVolumeSpecName: "kube-api-access-r5tkq") pod "090eb691-45f3-4215-b4d3-b6bfb0915137" (UID: "090eb691-45f3-4215-b4d3-b6bfb0915137"). InnerVolumeSpecName "kube-api-access-r5tkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.194934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory" (OuterVolumeSpecName: "inventory") pod "090eb691-45f3-4215-b4d3-b6bfb0915137" (UID: "090eb691-45f3-4215-b4d3-b6bfb0915137"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.205719 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "090eb691-45f3-4215-b4d3-b6bfb0915137" (UID: "090eb691-45f3-4215-b4d3-b6bfb0915137"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.210653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "090eb691-45f3-4215-b4d3-b6bfb0915137" (UID: "090eb691-45f3-4215-b4d3-b6bfb0915137"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.275034 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5tkq\" (UniqueName: \"kubernetes.io/projected/090eb691-45f3-4215-b4d3-b6bfb0915137-kube-api-access-r5tkq\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.275083 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.275105 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.275126 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.275147 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/090eb691-45f3-4215-b4d3-b6bfb0915137-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.809242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" event={"ID":"090eb691-45f3-4215-b4d3-b6bfb0915137","Type":"ContainerDied","Data":"dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4"} Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.809321 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbdd0f0af06ed1df28da587635935f80b5f0d99d1e0eb25cd5858427da7e64b4" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.809354 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.911974 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg"] Jan 21 00:04:10 crc kubenswrapper[5030]: E0121 00:04:10.912418 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090eb691-45f3-4215-b4d3-b6bfb0915137" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.912451 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="090eb691-45f3-4215-b4d3-b6bfb0915137" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.913839 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="090eb691-45f3-4215-b4d3-b6bfb0915137" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.914554 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.919717 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.919976 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.920109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.920233 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.920441 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.921529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.935611 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg"] Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.987228 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cx5\" (UniqueName: \"kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.987284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:10 crc kubenswrapper[5030]: 
I0121 00:04:10.987330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.987453 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:10 crc kubenswrapper[5030]: I0121 00:04:10.987577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.088648 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.088713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.088760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.088865 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cx5\" (UniqueName: \"kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.088903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: 
\"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.093136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.093233 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.094107 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.095799 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.107487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cx5\" (UniqueName: \"kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5\") pod \"libvirt-edpm-multinodeset-edpm-compute-global-9vqfg\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.243809 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.707401 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg"] Jan 21 00:04:11 crc kubenswrapper[5030]: I0121 00:04:11.820816 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" event={"ID":"d6d126a1-30ac-4726-b744-9d4ad6ca9d12","Type":"ContainerStarted","Data":"27d4b2a132d9cd756e36888003fdffe378019797969e27a532305cbfc6725425"} Jan 21 00:04:16 crc kubenswrapper[5030]: I0121 00:04:16.877747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" event={"ID":"d6d126a1-30ac-4726-b744-9d4ad6ca9d12","Type":"ContainerStarted","Data":"eae02cf9d85bac20bfc235b84d8e1016ef91f21f75218679b7ac6189da8adfc9"} Jan 21 00:04:16 crc kubenswrapper[5030]: I0121 00:04:16.905869 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" podStartSLOduration=2.332305695 podStartE2EDuration="6.905838011s" podCreationTimestamp="2026-01-21 00:04:10 +0000 UTC" firstStartedPulling="2026-01-21 00:04:11.704818711 +0000 UTC m=+5324.025078999" lastFinishedPulling="2026-01-21 00:04:16.278350997 +0000 UTC m=+5328.598611315" observedRunningTime="2026-01-21 00:04:16.905737659 +0000 UTC m=+5329.225998007" watchObservedRunningTime="2026-01-21 00:04:16.905838011 +0000 UTC m=+5329.226098329" Jan 21 00:04:17 crc kubenswrapper[5030]: I0121 00:04:17.892957 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6d126a1-30ac-4726-b744-9d4ad6ca9d12" containerID="eae02cf9d85bac20bfc235b84d8e1016ef91f21f75218679b7ac6189da8adfc9" exitCode=0 Jan 21 00:04:17 crc kubenswrapper[5030]: I0121 00:04:17.893013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" event={"ID":"d6d126a1-30ac-4726-b744-9d4ad6ca9d12","Type":"ContainerDied","Data":"eae02cf9d85bac20bfc235b84d8e1016ef91f21f75218679b7ac6189da8adfc9"} Jan 21 00:04:17 crc kubenswrapper[5030]: I0121 00:04:17.976843 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:04:17 crc kubenswrapper[5030]: E0121 00:04:17.977160 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.319762 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.335804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0\") pod \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.335875 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle\") pod \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.335956 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory\") pod \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.336012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6cx5\" (UniqueName: \"kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5\") pod \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.336132 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global\") pod \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\" (UID: \"d6d126a1-30ac-4726-b744-9d4ad6ca9d12\") " Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.345748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5" (OuterVolumeSpecName: "kube-api-access-w6cx5") pod "d6d126a1-30ac-4726-b744-9d4ad6ca9d12" (UID: "d6d126a1-30ac-4726-b744-9d4ad6ca9d12"). InnerVolumeSpecName "kube-api-access-w6cx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.348822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d6d126a1-30ac-4726-b744-9d4ad6ca9d12" (UID: "d6d126a1-30ac-4726-b744-9d4ad6ca9d12"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.375793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "d6d126a1-30ac-4726-b744-9d4ad6ca9d12" (UID: "d6d126a1-30ac-4726-b744-9d4ad6ca9d12"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.384186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory" (OuterVolumeSpecName: "inventory") pod "d6d126a1-30ac-4726-b744-9d4ad6ca9d12" (UID: "d6d126a1-30ac-4726-b744-9d4ad6ca9d12"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.396351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d6d126a1-30ac-4726-b744-9d4ad6ca9d12" (UID: "d6d126a1-30ac-4726-b744-9d4ad6ca9d12"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.437572 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.437607 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.437635 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.437647 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.437663 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6cx5\" (UniqueName: \"kubernetes.io/projected/d6d126a1-30ac-4726-b744-9d4ad6ca9d12-kube-api-access-w6cx5\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.921824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" event={"ID":"d6d126a1-30ac-4726-b744-9d4ad6ca9d12","Type":"ContainerDied","Data":"27d4b2a132d9cd756e36888003fdffe378019797969e27a532305cbfc6725425"} Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.921873 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27d4b2a132d9cd756e36888003fdffe378019797969e27a532305cbfc6725425" Jan 21 00:04:19 crc kubenswrapper[5030]: I0121 00:04:19.921922 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.026301 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l"] Jan 21 00:04:20 crc kubenswrapper[5030]: E0121 00:04:20.027205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6d126a1-30ac-4726-b744-9d4ad6ca9d12" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.027235 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d126a1-30ac-4726-b744-9d4ad6ca9d12" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.027435 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6d126a1-30ac-4726-b744-9d4ad6ca9d12" containerName="libvirt-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.028143 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.032256 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.033006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.034228 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.034507 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.034535 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.034740 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.037050 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.048704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.048778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.048854 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.048893 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.048942 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.049067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rtt\" (UniqueName: \"kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.049192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.049235 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.051650 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l"] Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.150933 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151009 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l8rtt\" (UniqueName: \"kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151140 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151201 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.151272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.156325 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 
crc kubenswrapper[5030]: I0121 00:04:20.156479 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.157075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.157265 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.158369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.158956 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.160035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.172457 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rtt\" (UniqueName: \"kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt\") pod \"nova-edpm-multinodeset-edpm-compute-global-brv4l\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.352987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.894522 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l"] Jan 21 00:04:20 crc kubenswrapper[5030]: W0121 00:04:20.896096 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1083f387_c3df_4244_872e_0ead2b92df2e.slice/crio-54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627 WatchSource:0}: Error finding container 54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627: Status 404 returned error can't find the container with id 54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627 Jan 21 00:04:20 crc kubenswrapper[5030]: I0121 00:04:20.937059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" event={"ID":"1083f387-c3df-4244-872e-0ead2b92df2e","Type":"ContainerStarted","Data":"54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627"} Jan 21 00:04:21 crc kubenswrapper[5030]: I0121 00:04:21.946968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" event={"ID":"1083f387-c3df-4244-872e-0ead2b92df2e","Type":"ContainerStarted","Data":"45fa506a7c13811f70e578240df840f22276a03f7ce42680ae95a7384bcf8aca"} Jan 21 00:04:21 crc kubenswrapper[5030]: I0121 00:04:21.986709 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" podStartSLOduration=2.426909374 podStartE2EDuration="2.986687277s" podCreationTimestamp="2026-01-21 00:04:19 +0000 UTC" firstStartedPulling="2026-01-21 00:04:20.898703547 +0000 UTC m=+5333.218963885" lastFinishedPulling="2026-01-21 00:04:21.4584815 +0000 UTC m=+5333.778741788" observedRunningTime="2026-01-21 00:04:21.977011763 +0000 UTC m=+5334.297272071" watchObservedRunningTime="2026-01-21 00:04:21.986687277 +0000 UTC m=+5334.306947565" Jan 21 00:04:23 crc kubenswrapper[5030]: I0121 00:04:23.966792 5030 generic.go:334] "Generic (PLEG): container finished" podID="1083f387-c3df-4244-872e-0ead2b92df2e" containerID="45fa506a7c13811f70e578240df840f22276a03f7ce42680ae95a7384bcf8aca" exitCode=0 Jan 21 00:04:23 crc kubenswrapper[5030]: I0121 00:04:23.973018 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" event={"ID":"1083f387-c3df-4244-872e-0ead2b92df2e","Type":"ContainerDied","Data":"45fa506a7c13811f70e578240df840f22276a03f7ce42680ae95a7384bcf8aca"} Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.328542 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.438074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.438169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.438299 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.438374 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8rtt\" (UniqueName: \"kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.439209 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.439260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.439341 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.439778 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1\") pod \"1083f387-c3df-4244-872e-0ead2b92df2e\" (UID: \"1083f387-c3df-4244-872e-0ead2b92df2e\") " Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.446059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.448090 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt" (OuterVolumeSpecName: "kube-api-access-l8rtt") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "kube-api-access-l8rtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.465450 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory" (OuterVolumeSpecName: "inventory") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.466889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.484658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "ssh-key-edpm-compute-global". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.488349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.489781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.507242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1083f387-c3df-4244-872e-0ead2b92df2e" (UID: "1083f387-c3df-4244-872e-0ead2b92df2e"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540734 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540767 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8rtt\" (UniqueName: \"kubernetes.io/projected/1083f387-c3df-4244-872e-0ead2b92df2e-kube-api-access-l8rtt\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540778 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540788 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540797 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540807 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540817 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.540826 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1083f387-c3df-4244-872e-0ead2b92df2e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.990137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" event={"ID":"1083f387-c3df-4244-872e-0ead2b92df2e","Type":"ContainerDied","Data":"54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627"} Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.990202 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54cab134e082c07ff317a34dd9a805dbf95d686c48ae76855ce0b4511ec74627" Jan 21 00:04:25 crc kubenswrapper[5030]: I0121 00:04:25.990299 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.102582 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw"] Jan 21 00:04:26 crc kubenswrapper[5030]: E0121 00:04:26.103053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1083f387-c3df-4244-872e-0ead2b92df2e" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.103083 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1083f387-c3df-4244-872e-0ead2b92df2e" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.103292 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1083f387-c3df-4244-872e-0ead2b92df2e" containerName="nova-edpm-multinodeset-edpm-compute-global" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.104013 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.107337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.108151 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-global-dockercfg-b9rz5" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.108502 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.108993 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-global" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.109319 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.109614 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.150716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw"] Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152524 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152695 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gssr\" (UniqueName: \"kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152844 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.152876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253701 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253758 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gssr\" (UniqueName: \"kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: 
\"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.253989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.259778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.260043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.261730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.262101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.262812 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.275677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gssr\" (UniqueName: \"kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr\") pod \"custom-global-service-edpm-multinodeset-dw9hw\" (UID: 
\"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.432012 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:26 crc kubenswrapper[5030]: I0121 00:04:26.745157 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw"] Jan 21 00:04:27 crc kubenswrapper[5030]: I0121 00:04:27.002704 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" event={"ID":"be06028a-0040-4f62-bbdf-8b9d262db8bb","Type":"ContainerStarted","Data":"536055736cc673e8af0a829cab503298452a708342fe75fb040fca89579d6711"} Jan 21 00:04:28 crc kubenswrapper[5030]: I0121 00:04:28.017880 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" event={"ID":"be06028a-0040-4f62-bbdf-8b9d262db8bb","Type":"ContainerStarted","Data":"fb29cdea2b9b93472258dc8bb21a524cd8ebc0614e4c2b5f68f20f99aa6b6424"} Jan 21 00:04:28 crc kubenswrapper[5030]: I0121 00:04:28.041744 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" podStartSLOduration=1.567355192 podStartE2EDuration="2.041721214s" podCreationTimestamp="2026-01-21 00:04:26 +0000 UTC" firstStartedPulling="2026-01-21 00:04:26.752543155 +0000 UTC m=+5339.072803453" lastFinishedPulling="2026-01-21 00:04:27.226909157 +0000 UTC m=+5339.547169475" observedRunningTime="2026-01-21 00:04:28.03782725 +0000 UTC m=+5340.358087538" watchObservedRunningTime="2026-01-21 00:04:28.041721214 +0000 UTC m=+5340.361981512" Jan 21 00:04:31 crc kubenswrapper[5030]: I0121 00:04:31.051843 5030 generic.go:334] "Generic (PLEG): container finished" podID="be06028a-0040-4f62-bbdf-8b9d262db8bb" containerID="fb29cdea2b9b93472258dc8bb21a524cd8ebc0614e4c2b5f68f20f99aa6b6424" exitCode=0 Jan 21 00:04:31 crc kubenswrapper[5030]: I0121 00:04:31.051908 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" event={"ID":"be06028a-0040-4f62-bbdf-8b9d262db8bb","Type":"ContainerDied","Data":"fb29cdea2b9b93472258dc8bb21a524cd8ebc0614e4c2b5f68f20f99aa6b6424"} Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.450304 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gssr\" (UniqueName: \"kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.572864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset\") pod \"be06028a-0040-4f62-bbdf-8b9d262db8bb\" (UID: \"be06028a-0040-4f62-bbdf-8b9d262db8bb\") " Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.578176 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle" (OuterVolumeSpecName: "custom-global-service-combined-ca-bundle") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "custom-global-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.578313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr" (OuterVolumeSpecName: "kube-api-access-6gssr") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "kube-api-access-6gssr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.591643 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.593319 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.594328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.597024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global" (OuterVolumeSpecName: "ssh-key-edpm-compute-global") pod "be06028a-0040-4f62-bbdf-8b9d262db8bb" (UID: "be06028a-0040-4f62-bbdf-8b9d262db8bb"). InnerVolumeSpecName "ssh-key-edpm-compute-global". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675296 5030 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675361 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gssr\" (UniqueName: \"kubernetes.io/projected/be06028a-0040-4f62-bbdf-8b9d262db8bb-kube-api-access-6gssr\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675382 5030 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-inventory-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675402 5030 reconciler_common.go:293] "Volume detached for volume \"custom-global-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-custom-global-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675422 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-global\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.675442 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/be06028a-0040-4f62-bbdf-8b9d262db8bb-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:32 crc kubenswrapper[5030]: I0121 00:04:32.962612 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:04:32 crc kubenswrapper[5030]: E0121 00:04:32.963044 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.079078 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" event={"ID":"be06028a-0040-4f62-bbdf-8b9d262db8bb","Type":"ContainerDied","Data":"536055736cc673e8af0a829cab503298452a708342fe75fb040fca89579d6711"} Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.079145 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="536055736cc673e8af0a829cab503298452a708342fe75fb040fca89579d6711" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.079144 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.592171 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.603414 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.618546 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.629188 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetz55pv"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.645574 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.655780 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.665313 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-multinodeset-dw9hw"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.674030 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.682666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.689919 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-global-92n2b"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.699149 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-global-xsfl7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.708398 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.716412 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-global-bgk6f"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.725797 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-global-edpm-compute-global-chzt7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.734366 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.755837 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.796847 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.805511 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.814193 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.821112 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.841638 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.847791 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.854983 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-global-brv4l"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.860845 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-global-wgvvs"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.866521 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-global-kjtsp"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.872242 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-global-8hgqd"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.877449 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.882940 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.888303 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-global-rcqpx"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.893824 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.899085 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-global-zkdp7"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.904573 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-global-bwknj"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.909675 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-cz422"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.914643 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-global-f4p5v"] Jan 21 
00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.919661 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.924674 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-global-9vqfg"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.929572 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-global-8z8pg"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.934409 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-global-88dbt"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.939214 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-global-msbl5"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.944288 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-global-edpm-compute-global-28kqw"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.949216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.954089 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.958879 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx"] Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.978952 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090eb691-45f3-4215-b4d3-b6bfb0915137" path="/var/lib/kubelet/pods/090eb691-45f3-4215-b4d3-b6bfb0915137/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.979968 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1083f387-c3df-4244-872e-0ead2b92df2e" path="/var/lib/kubelet/pods/1083f387-c3df-4244-872e-0ead2b92df2e/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.980955 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f4d776-8b41-4b6e-a237-a7e0a1e4aa87" path="/var/lib/kubelet/pods/11f4d776-8b41-4b6e-a237-a7e0a1e4aa87/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.981927 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f92ce3-6dd6-4f75-816f-e1ce833f83c7" path="/var/lib/kubelet/pods/15f92ce3-6dd6-4f75-816f-e1ce833f83c7/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.983780 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ef4425-ed8f-4995-96e0-e76e5e202ad0" path="/var/lib/kubelet/pods/33ef4425-ed8f-4995-96e0-e76e5e202ad0/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.984742 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61619052-c2f6-435e-935f-d1e60275a021" path="/var/lib/kubelet/pods/61619052-c2f6-435e-935f-d1e60275a021/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.985704 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2" path="/var/lib/kubelet/pods/71f979a7-8c0c-4b78-8e10-c5bcc7cf68d2/volumes" Jan 21 
00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.987700 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7456a5a7-d2e0-4e08-b19c-00e4d75429fb" path="/var/lib/kubelet/pods/7456a5a7-d2e0-4e08-b19c-00e4d75429fb/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.988805 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7644b304-ddc0-4057-8786-8e6b3f899386" path="/var/lib/kubelet/pods/7644b304-ddc0-4057-8786-8e6b3f899386/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.989770 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78643fa1-c6b8-41c1-a3be-17517755b59c" path="/var/lib/kubelet/pods/78643fa1-c6b8-41c1-a3be-17517755b59c/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.990726 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de79324-db47-4fa2-b08c-ac25869a4c9a" path="/var/lib/kubelet/pods/8de79324-db47-4fa2-b08c-ac25869a4c9a/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.992460 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a1e96bd-a7f6-4c06-9fc6-d09fac76e710" path="/var/lib/kubelet/pods/9a1e96bd-a7f6-4c06-9fc6-d09fac76e710/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.993551 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a9bec2-dc0b-40b7-a84f-3321849e3240" path="/var/lib/kubelet/pods/a1a9bec2-dc0b-40b7-a84f-3321849e3240/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.995826 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be06028a-0040-4f62-bbdf-8b9d262db8bb" path="/var/lib/kubelet/pods/be06028a-0040-4f62-bbdf-8b9d262db8bb/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.997272 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0408f65-3264-42f0-8364-9839f8b46222" path="/var/lib/kubelet/pods/c0408f65-3264-42f0-8364-9839f8b46222/volumes" Jan 21 00:04:33 crc kubenswrapper[5030]: I0121 00:04:33.998913 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d126a1-30ac-4726-b744-9d4ad6ca9d12" path="/var/lib/kubelet/pods/d6d126a1-30ac-4726-b744-9d4ad6ca9d12/volumes" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.000049 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d515a9-aa02-4b8c-993a-11ae9267ff80" path="/var/lib/kubelet/pods/d6d515a9-aa02-4b8c-993a-11ae9267ff80/volumes" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.002281 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d98fe0-c996-4703-abdf-1149bb577de7" path="/var/lib/kubelet/pods/d9d98fe0-c996-4703-abdf-1149bb577de7/volumes" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.003380 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21b5167-5f2e-41d0-b184-e0290e7030af" path="/var/lib/kubelet/pods/f21b5167-5f2e-41d0-b184-e0290e7030af/volumes" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.004973 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f776b79c-1a2d-4a18-ab4b-ace393bf1ab2" path="/var/lib/kubelet/pods/f776b79c-1a2d-4a18-ab4b-ace393bf1ab2/volumes" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007173 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-global-edpm-compute-global-5t9s2"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007234 5030 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007259 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-global-service-edpm-compute-global-9lfzw"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007320 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007340 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-global-edpm-compute-global-g8lhx"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.007365 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.010244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.017271 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.032053 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-global-edpm-compute-global-hwd7m"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.045870 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-global-edpm-compute-global-hsgc8"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.058331 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-global-edpm-compute-global-mbkfk"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.065974 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-global-edpm-compute-global-hqb78"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.079727 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.091918 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.103235 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-global-edpm-compute-global-sw7z2"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.111926 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.118810 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.125076 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-global-edpm-compute-global-qxcs8"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.152822 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/configure-network-edpm-compute-global-edpm-compute-global-plvdq"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.166281 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.181347 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-global-edpm-compute-global-4m8mc"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.194163 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global-zvdh4"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.207734 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-global-edpm-compute-global-fjgq5"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.220090 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.228005 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-global-edpm-compute-global-hmj6w"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.234457 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d"] Jan 21 00:04:34 crc kubenswrapper[5030]: E0121 00:04:34.235258 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be06028a-0040-4f62-bbdf-8b9d262db8bb" containerName="custom-global-service-edpm-multinodeset" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.235295 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="be06028a-0040-4f62-bbdf-8b9d262db8bb" containerName="custom-global-service-edpm-multinodeset" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.235607 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="be06028a-0040-4f62-bbdf-8b9d262db8bb" containerName="custom-global-service-edpm-multinodeset" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.237004 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.245217 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.263613 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d"] Jan 21 00:04:34 crc kubenswrapper[5030]: E0121 00:04:34.265250 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc edpm-compute-global kube-api-access-fpcpg], unattached volumes=[], failed to process volumes=[config dnsmasq-svc edpm-compute-global kube-api-access-fpcpg]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" podUID="18478ef3-f5dc-4dc7-9575-74db4dc77bb2" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.278978 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.282304 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.293996 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.402961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403243 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcpg\" (UniqueName: \"kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5m8\" (UniqueName: \"kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.403801 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505589 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global\") pod 
\"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505713 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505763 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcpg\" (UniqueName: \"kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5m8\" (UniqueName: \"kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.505956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: E0121 00:04:34.505895 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 00:04:34 crc kubenswrapper[5030]: E0121 00:04:34.507045 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global podName:18478ef3-f5dc-4dc7-9575-74db4dc77bb2 nodeName:}" failed. No retries permitted until 2026-01-21 00:04:35.007004028 +0000 UTC m=+5347.327264336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-88n5d" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2") : configmap "edpm-compute-global" not found Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.507453 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.507769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.508557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.509055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.525327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5m8\" (UniqueName: \"kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8\") pod \"dnsmasq-dnsmasq-84b9f45d47-8t4c9\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.535455 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcpg\" (UniqueName: \"kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.603206 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:34 crc kubenswrapper[5030]: I0121 00:04:34.919948 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.013941 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:35 crc kubenswrapper[5030]: E0121 00:04:35.014143 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 00:04:35 crc kubenswrapper[5030]: E0121 00:04:35.014260 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global podName:18478ef3-f5dc-4dc7-9575-74db4dc77bb2 nodeName:}" failed. No retries permitted until 2026-01-21 00:04:36.014234156 +0000 UTC m=+5348.334494464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-88n5d" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2") : configmap "edpm-compute-global" not found Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.104846 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.104834 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" event={"ID":"705cf2be-b0f3-4599-815f-1301a4535cfb","Type":"ContainerStarted","Data":"79fab27d696387974163bd73b25698b66ce4caf3aa6f2018420c9eccb24a2d1a"} Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.121101 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.216971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config\") pod \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.217122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpcpg\" (UniqueName: \"kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg\") pod \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.217179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc\") pod \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.217880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "18478ef3-f5dc-4dc7-9575-74db4dc77bb2" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.217930 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config" (OuterVolumeSpecName: "config") pod "18478ef3-f5dc-4dc7-9575-74db4dc77bb2" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.222556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg" (OuterVolumeSpecName: "kube-api-access-fpcpg") pod "18478ef3-f5dc-4dc7-9575-74db4dc77bb2" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2"). InnerVolumeSpecName "kube-api-access-fpcpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.319496 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.319570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpcpg\" (UniqueName: \"kubernetes.io/projected/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-kube-api-access-fpcpg\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.319586 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.981450 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbad952-81f9-4579-9b1f-23b2883cf1e4" path="/var/lib/kubelet/pods/0cbad952-81f9-4579-9b1f-23b2883cf1e4/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.983298 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd60ec9-3010-4372-8401-9492c7a8d6b1" path="/var/lib/kubelet/pods/0dd60ec9-3010-4372-8401-9492c7a8d6b1/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.984250 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be57b93-2657-4128-869a-db7cc3610e32" path="/var/lib/kubelet/pods/1be57b93-2657-4128-869a-db7cc3610e32/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.985226 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb3e7e1-261f-4a77-b84b-c4a444bd032e" path="/var/lib/kubelet/pods/3eb3e7e1-261f-4a77-b84b-c4a444bd032e/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.988034 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a9f23b-1a46-4a43-8803-f671c298a379" path="/var/lib/kubelet/pods/49a9f23b-1a46-4a43-8803-f671c298a379/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.989080 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514781ff-085c-4cb6-95f9-7d2ef45d5c99" path="/var/lib/kubelet/pods/514781ff-085c-4cb6-95f9-7d2ef45d5c99/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.990106 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33" path="/var/lib/kubelet/pods/5c5c05d4-79cf-45b6-ae63-f90ebfbcfe33/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.992829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d52393c-0a90-435d-8215-4822292a501e" path="/var/lib/kubelet/pods/5d52393c-0a90-435d-8215-4822292a501e/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.994178 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d" path="/var/lib/kubelet/pods/6d6b8fc4-a5c4-40a1-8b3e-44beb0e2aa5d/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.996188 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7595a00c-2956-4cd8-a2f2-054d1e6ddcad" path="/var/lib/kubelet/pods/7595a00c-2956-4cd8-a2f2-054d1e6ddcad/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.998278 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b09e4d5-4bde-4471-848a-dbc6275081e1" 
path="/var/lib/kubelet/pods/8b09e4d5-4bde-4471-848a-dbc6275081e1/volumes" Jan 21 00:04:35 crc kubenswrapper[5030]: I0121 00:04:35.999837 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90919d70-facc-4760-8900-d80c9b4ccaff" path="/var/lib/kubelet/pods/90919d70-facc-4760-8900-d80c9b4ccaff/volumes" Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.000952 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c283c4-8213-44c4-823a-9a645886d652" path="/var/lib/kubelet/pods/a4c283c4-8213-44c4-823a-9a645886d652/volumes" Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.001957 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07a5642-e5ba-42ab-ae0f-88e26b167ada" path="/var/lib/kubelet/pods/f07a5642-e5ba-42ab-ae0f-88e26b167ada/volumes" Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.072514 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global\") pod \"dnsmasq-dnsmasq-7d78464677-88n5d\" (UID: \"18478ef3-f5dc-4dc7-9575-74db4dc77bb2\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:36 crc kubenswrapper[5030]: E0121 00:04:36.072815 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-global: configmap "edpm-compute-global" not found Jan 21 00:04:36 crc kubenswrapper[5030]: E0121 00:04:36.072936 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global podName:18478ef3-f5dc-4dc7-9575-74db4dc77bb2 nodeName:}" failed. No retries permitted until 2026-01-21 00:04:38.072905427 +0000 UTC m=+5350.393165745 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "edpm-compute-global" (UniqueName: "kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global") pod "dnsmasq-dnsmasq-7d78464677-88n5d" (UID: "18478ef3-f5dc-4dc7-9575-74db4dc77bb2") : configmap "edpm-compute-global" not found Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.121976 5030 generic.go:334] "Generic (PLEG): container finished" podID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerID="6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8" exitCode=0 Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.122066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" event={"ID":"705cf2be-b0f3-4599-815f-1301a4535cfb","Type":"ContainerDied","Data":"6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8"} Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.122337 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d" Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.210984 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d"] Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.220839 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-7d78464677-88n5d"] Jan 21 00:04:36 crc kubenswrapper[5030]: I0121 00:04:36.386457 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/18478ef3-f5dc-4dc7-9575-74db4dc77bb2-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:37 crc kubenswrapper[5030]: I0121 00:04:37.134492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" event={"ID":"705cf2be-b0f3-4599-815f-1301a4535cfb","Type":"ContainerStarted","Data":"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6"} Jan 21 00:04:37 crc kubenswrapper[5030]: I0121 00:04:37.134837 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:37 crc kubenswrapper[5030]: I0121 00:04:37.163429 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" podStartSLOduration=4.163401587 podStartE2EDuration="4.163401587s" podCreationTimestamp="2026-01-21 00:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:04:37.162492786 +0000 UTC m=+5349.482753134" watchObservedRunningTime="2026-01-21 00:04:37.163401587 +0000 UTC m=+5349.483661915" Jan 21 00:04:37 crc kubenswrapper[5030]: I0121 00:04:37.986879 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18478ef3-f5dc-4dc7-9575-74db4dc77bb2" path="/var/lib/kubelet/pods/18478ef3-f5dc-4dc7-9575-74db4dc77bb2/volumes" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.048498 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-pfk2b"] Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.057315 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-pfk2b"] Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.202334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qstqw"] Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.203519 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.205447 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.205918 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.205925 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.206342 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.213048 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qstqw"] Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.314504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4p2q\" (UniqueName: \"kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.314755 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.314849 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.416204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.416573 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.416730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4p2q\" (UniqueName: \"kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.417300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " 
pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.417688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.452567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4p2q\" (UniqueName: \"kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q\") pod \"crc-storage-crc-qstqw\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:42 crc kubenswrapper[5030]: I0121 00:04:42.534713 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:43 crc kubenswrapper[5030]: I0121 00:04:43.035114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qstqw"] Jan 21 00:04:43 crc kubenswrapper[5030]: I0121 00:04:43.207813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qstqw" event={"ID":"95b075db-260f-4fe5-acdd-78ee634e8306","Type":"ContainerStarted","Data":"7d49185c42438da43226920b7b3a663baddb7d0827d1ed62c8de23694c9e8f18"} Jan 21 00:04:43 crc kubenswrapper[5030]: I0121 00:04:43.980490 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d821564-7a44-40c8-a135-a91c0c60dc7d" path="/var/lib/kubelet/pods/6d821564-7a44-40c8-a135-a91c0c60dc7d/volumes" Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.223893 5030 generic.go:334] "Generic (PLEG): container finished" podID="95b075db-260f-4fe5-acdd-78ee634e8306" containerID="e12808262dd0591c349493480d7d466ca3fcc6c780b33b812aa498bc6c79775d" exitCode=0 Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.223949 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qstqw" event={"ID":"95b075db-260f-4fe5-acdd-78ee634e8306","Type":"ContainerDied","Data":"e12808262dd0591c349493480d7d466ca3fcc6c780b33b812aa498bc6c79775d"} Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.607571 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.688108 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.688838 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="dnsmasq-dns" containerID="cri-o://2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31" gracePeriod=10 Jan 21 00:04:44 crc kubenswrapper[5030]: I0121 00:04:44.963025 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.177995 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.239919 5030 generic.go:334] "Generic (PLEG): container finished" podID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerID="2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31" exitCode=0 Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.240151 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.240768 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" event={"ID":"0d3065ec-86f1-4484-a0f3-f78c7b62c978","Type":"ContainerDied","Data":"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31"} Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.240799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm" event={"ID":"0d3065ec-86f1-4484-a0f3-f78c7b62c978","Type":"ContainerDied","Data":"326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735"} Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.240820 5030 scope.go:117] "RemoveContainer" containerID="2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.261753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc\") pod \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.261901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset\") pod \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.261943 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global\") pod \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.261995 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config\") pod \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.262067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpjzq\" (UniqueName: \"kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq\") pod \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\" (UID: \"0d3065ec-86f1-4484-a0f3-f78c7b62c978\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.269899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq" (OuterVolumeSpecName: "kube-api-access-zpjzq") pod "0d3065ec-86f1-4484-a0f3-f78c7b62c978" (UID: "0d3065ec-86f1-4484-a0f3-f78c7b62c978"). 
InnerVolumeSpecName "kube-api-access-zpjzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.295566 5030 scope.go:117] "RemoveContainer" containerID="137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.314908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0d3065ec-86f1-4484-a0f3-f78c7b62c978" (UID: "0d3065ec-86f1-4484-a0f3-f78c7b62c978"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.323240 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global" (OuterVolumeSpecName: "edpm-compute-global") pod "0d3065ec-86f1-4484-a0f3-f78c7b62c978" (UID: "0d3065ec-86f1-4484-a0f3-f78c7b62c978"). InnerVolumeSpecName "edpm-compute-global". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.323409 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config" (OuterVolumeSpecName: "config") pod "0d3065ec-86f1-4484-a0f3-f78c7b62c978" (UID: "0d3065ec-86f1-4484-a0f3-f78c7b62c978"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.336333 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "edpm-compute-beta-nodeset") pod "0d3065ec-86f1-4484-a0f3-f78c7b62c978" (UID: "0d3065ec-86f1-4484-a0f3-f78c7b62c978"). InnerVolumeSpecName "edpm-compute-beta-nodeset". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.338718 5030 scope.go:117] "RemoveContainer" containerID="2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31" Jan 21 00:04:45 crc kubenswrapper[5030]: E0121 00:04:45.339113 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31\": container with ID starting with 2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31 not found: ID does not exist" containerID="2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.339151 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31"} err="failed to get container status \"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31\": rpc error: code = NotFound desc = could not find container \"2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31\": container with ID starting with 2a30a45a21e461f508e1ca18b522de57dbc96c62100ccf7c8d5e97d4e41a8e31 not found: ID does not exist" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.339173 5030 scope.go:117] "RemoveContainer" containerID="137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8" Jan 21 00:04:45 crc kubenswrapper[5030]: E0121 00:04:45.340457 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8\": container with ID starting with 137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8 not found: ID does not exist" containerID="137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.340560 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8"} err="failed to get container status \"137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8\": rpc error: code = NotFound desc = could not find container \"137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8\": container with ID starting with 137ac3508758792744fa7ebba8c978a81c9ed7a6e1eb9c746b574d8436a42ec8 not found: ID does not exist" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.367426 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.367463 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-global\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-edpm-compute-global\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.367482 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.367501 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpjzq\" (UniqueName: 
\"kubernetes.io/projected/0d3065ec-86f1-4484-a0f3-f78c7b62c978-kube-api-access-zpjzq\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.367515 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0d3065ec-86f1-4484-a0f3-f78c7b62c978-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.546461 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.569692 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4p2q\" (UniqueName: \"kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q\") pod \"95b075db-260f-4fe5-acdd-78ee634e8306\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.569789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage\") pod \"95b075db-260f-4fe5-acdd-78ee634e8306\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.569913 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt\") pod \"95b075db-260f-4fe5-acdd-78ee634e8306\" (UID: \"95b075db-260f-4fe5-acdd-78ee634e8306\") " Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.570033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "95b075db-260f-4fe5-acdd-78ee634e8306" (UID: "95b075db-260f-4fe5-acdd-78ee634e8306"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.570403 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/95b075db-260f-4fe5-acdd-78ee634e8306-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.594408 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.601809 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6668544499-84mhm"] Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.621300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q" (OuterVolumeSpecName: "kube-api-access-b4p2q") pod "95b075db-260f-4fe5-acdd-78ee634e8306" (UID: "95b075db-260f-4fe5-acdd-78ee634e8306"). InnerVolumeSpecName "kube-api-access-b4p2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.621521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "95b075db-260f-4fe5-acdd-78ee634e8306" (UID: "95b075db-260f-4fe5-acdd-78ee634e8306"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.671232 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4p2q\" (UniqueName: \"kubernetes.io/projected/95b075db-260f-4fe5-acdd-78ee634e8306-kube-api-access-b4p2q\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.671268 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/95b075db-260f-4fe5-acdd-78ee634e8306-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:45 crc kubenswrapper[5030]: E0121 00:04:45.684212 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3065ec_86f1_4484_a0f3_f78c7b62c978.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3065ec_86f1_4484_a0f3_f78c7b62c978.slice/crio-326b9496eec4b61e22c6b4ef27821ba8e4d398b3fe0b57533b8d094086b60735\": RecentStats: unable to find data in memory cache]" Jan 21 00:04:45 crc kubenswrapper[5030]: I0121 00:04:45.976059 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" path="/var/lib/kubelet/pods/0d3065ec-86f1-4484-a0f3-f78c7b62c978/volumes" Jan 21 00:04:46 crc kubenswrapper[5030]: I0121 00:04:46.259012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b"} Jan 21 00:04:46 crc kubenswrapper[5030]: I0121 00:04:46.264896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qstqw" event={"ID":"95b075db-260f-4fe5-acdd-78ee634e8306","Type":"ContainerDied","Data":"7d49185c42438da43226920b7b3a663baddb7d0827d1ed62c8de23694c9e8f18"} Jan 21 00:04:46 crc kubenswrapper[5030]: I0121 00:04:46.264953 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d49185c42438da43226920b7b3a663baddb7d0827d1ed62c8de23694c9e8f18" Jan 21 00:04:46 crc kubenswrapper[5030]: I0121 00:04:46.264972 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qstqw" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.116005 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qstqw"] Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.127204 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qstqw"] Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.237943 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p4ck6"] Jan 21 00:04:49 crc kubenswrapper[5030]: E0121 00:04:49.238244 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="init" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.238261 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="init" Jan 21 00:04:49 crc kubenswrapper[5030]: E0121 00:04:49.238273 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b075db-260f-4fe5-acdd-78ee634e8306" containerName="storage" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.238282 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b075db-260f-4fe5-acdd-78ee634e8306" containerName="storage" Jan 21 00:04:49 crc kubenswrapper[5030]: E0121 00:04:49.238297 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="dnsmasq-dns" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.238306 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="dnsmasq-dns" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.238494 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3065ec-86f1-4484-a0f3-f78c7b62c978" containerName="dnsmasq-dns" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.238526 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b075db-260f-4fe5-acdd-78ee634e8306" containerName="storage" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.239116 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.241611 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.242910 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.242987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.243115 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.259367 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p4ck6"] Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.325588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.325730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqft\" (UniqueName: \"kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.325831 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.427849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.427929 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqft\" (UniqueName: \"kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.427956 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.428217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " 
pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.428682 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.445009 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqft\" (UniqueName: \"kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft\") pod \"crc-storage-crc-p4ck6\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.554088 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:49 crc kubenswrapper[5030]: I0121 00:04:49.976893 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b075db-260f-4fe5-acdd-78ee634e8306" path="/var/lib/kubelet/pods/95b075db-260f-4fe5-acdd-78ee634e8306/volumes" Jan 21 00:04:50 crc kubenswrapper[5030]: I0121 00:04:50.016509 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p4ck6"] Jan 21 00:04:50 crc kubenswrapper[5030]: W0121 00:04:50.024870 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49572136_a63e_443b_840e_0f4559f297f1.slice/crio-ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9 WatchSource:0}: Error finding container ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9: Status 404 returned error can't find the container with id ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9 Jan 21 00:04:50 crc kubenswrapper[5030]: I0121 00:04:50.332190 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4ck6" event={"ID":"49572136-a63e-443b-840e-0f4559f297f1","Type":"ContainerStarted","Data":"ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9"} Jan 21 00:04:51 crc kubenswrapper[5030]: I0121 00:04:51.355566 5030 generic.go:334] "Generic (PLEG): container finished" podID="49572136-a63e-443b-840e-0f4559f297f1" containerID="c84defbcbb2770b6b3cb2a981832bd3f083c624307887a3d429e51e7f8a4fbde" exitCode=0 Jan 21 00:04:51 crc kubenswrapper[5030]: I0121 00:04:51.355702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4ck6" event={"ID":"49572136-a63e-443b-840e-0f4559f297f1","Type":"ContainerDied","Data":"c84defbcbb2770b6b3cb2a981832bd3f083c624307887a3d429e51e7f8a4fbde"} Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.735675 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.782404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage\") pod \"49572136-a63e-443b-840e-0f4559f297f1\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.782517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt\") pod \"49572136-a63e-443b-840e-0f4559f297f1\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.782588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkqft\" (UniqueName: \"kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft\") pod \"49572136-a63e-443b-840e-0f4559f297f1\" (UID: \"49572136-a63e-443b-840e-0f4559f297f1\") " Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.782723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "49572136-a63e-443b-840e-0f4559f297f1" (UID: "49572136-a63e-443b-840e-0f4559f297f1"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.783002 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/49572136-a63e-443b-840e-0f4559f297f1-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.787284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft" (OuterVolumeSpecName: "kube-api-access-vkqft") pod "49572136-a63e-443b-840e-0f4559f297f1" (UID: "49572136-a63e-443b-840e-0f4559f297f1"). InnerVolumeSpecName "kube-api-access-vkqft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.803353 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "49572136-a63e-443b-840e-0f4559f297f1" (UID: "49572136-a63e-443b-840e-0f4559f297f1"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.884282 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkqft\" (UniqueName: \"kubernetes.io/projected/49572136-a63e-443b-840e-0f4559f297f1-kube-api-access-vkqft\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:52 crc kubenswrapper[5030]: I0121 00:04:52.884318 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/49572136-a63e-443b-840e-0f4559f297f1-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:53 crc kubenswrapper[5030]: I0121 00:04:53.381056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p4ck6" event={"ID":"49572136-a63e-443b-840e-0f4559f297f1","Type":"ContainerDied","Data":"ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9"} Jan 21 00:04:53 crc kubenswrapper[5030]: I0121 00:04:53.381112 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0667d1c7ccb96476ab56fab94f85d268b67944f5a5d5d85c6460ac5b083ec9" Jan 21 00:04:53 crc kubenswrapper[5030]: I0121 00:04:53.381181 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p4ck6" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.343947 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd"] Jan 21 00:04:56 crc kubenswrapper[5030]: E0121 00:04:56.344883 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49572136-a63e-443b-840e-0f4559f297f1" containerName="storage" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.344901 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="49572136-a63e-443b-840e-0f4559f297f1" containerName="storage" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.345110 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="49572136-a63e-443b-840e-0f4559f297f1" containerName="storage" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.346083 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.348422 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-tls" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.362301 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd"] Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.430998 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd"] Jan 21 00:04:56 crc kubenswrapper[5030]: E0121 00:04:56.431649 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-ndcmd openstack-edpm-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" podUID="f6be7542-0564-4731-aca4-1f1a6bccede0" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.447122 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z"] Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.448369 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.463828 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z"] Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.490994 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jxh\" (UniqueName: \"kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491225 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.491307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndcmd\" (UniqueName: \"kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd\") pod 
\"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.566151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z"] Jan 21 00:04:56 crc kubenswrapper[5030]: E0121 00:04:56.566710 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-c9jxh openstack-edpm-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" podUID="b825094f-f0a0-4c1c-b0a6-3db5d7371b28" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.586407 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.587676 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593857 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndcmd\" (UniqueName: \"kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc 
kubenswrapper[5030]: I0121 00:04:56.593903 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jxh\" (UniqueName: \"kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.593926 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.594568 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.594803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.595043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.595102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.595518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.595997 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.601777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.619299 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndcmd\" (UniqueName: 
\"kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd\") pod \"dnsmasq-dnsmasq-78c7b787f5-6htkd\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.622724 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jxh\" (UniqueName: \"kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh\") pod \"dnsmasq-dnsmasq-684b6db755-sgp7z\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.694725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sbkq\" (UniqueName: \"kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.694775 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.695075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.695111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.796272 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sbkq\" (UniqueName: \"kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.796337 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.796430 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.796477 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.797473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.797492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.797600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.816350 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sbkq\" (UniqueName: \"kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq\") pod \"dnsmasq-dnsmasq-76b7c4d945-86v9q\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:56 crc kubenswrapper[5030]: I0121 00:04:56.905735 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.383116 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.424406 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" event={"ID":"dce6d1bc-2fdc-4501-867d-25a233954a2f","Type":"ContainerStarted","Data":"ae47e42e40bfb4eab5338d7bec1ef23d65924598f33e697a343ef623d11fe51e"} Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.424430 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.424478 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.441104 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.449847 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config\") pod \"f6be7542-0564-4731-aca4-1f1a6bccede0\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls\") pod \"f6be7542-0564-4731-aca4-1f1a6bccede0\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9jxh\" (UniqueName: \"kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh\") pod \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607638 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc\") pod \"f6be7542-0564-4731-aca4-1f1a6bccede0\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config" (OuterVolumeSpecName: "config") pod "f6be7542-0564-4731-aca4-1f1a6bccede0" (UID: "f6be7542-0564-4731-aca4-1f1a6bccede0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607687 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc\") pod \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndcmd\" (UniqueName: \"kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd\") pod \"f6be7542-0564-4731-aca4-1f1a6bccede0\" (UID: \"f6be7542-0564-4731-aca4-1f1a6bccede0\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607749 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config\") pod \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.607864 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls\") pod \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\" (UID: \"b825094f-f0a0-4c1c-b0a6-3db5d7371b28\") " Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608186 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "f6be7542-0564-4731-aca4-1f1a6bccede0" (UID: "f6be7542-0564-4731-aca4-1f1a6bccede0"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "f6be7542-0564-4731-aca4-1f1a6bccede0" (UID: "f6be7542-0564-4731-aca4-1f1a6bccede0"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608561 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config" (OuterVolumeSpecName: "config") pod "b825094f-f0a0-4c1c-b0a6-3db5d7371b28" (UID: "b825094f-f0a0-4c1c-b0a6-3db5d7371b28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "b825094f-f0a0-4c1c-b0a6-3db5d7371b28" (UID: "b825094f-f0a0-4c1c-b0a6-3db5d7371b28"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.608738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "b825094f-f0a0-4c1c-b0a6-3db5d7371b28" (UID: "b825094f-f0a0-4c1c-b0a6-3db5d7371b28"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.610754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd" (OuterVolumeSpecName: "kube-api-access-ndcmd") pod "f6be7542-0564-4731-aca4-1f1a6bccede0" (UID: "f6be7542-0564-4731-aca4-1f1a6bccede0"). InnerVolumeSpecName "kube-api-access-ndcmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.610816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh" (OuterVolumeSpecName: "kube-api-access-c9jxh") pod "b825094f-f0a0-4c1c-b0a6-3db5d7371b28" (UID: "b825094f-f0a0-4c1c-b0a6-3db5d7371b28"). InnerVolumeSpecName "kube-api-access-c9jxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.708952 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.708987 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9jxh\" (UniqueName: \"kubernetes.io/projected/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-kube-api-access-c9jxh\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.709000 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f6be7542-0564-4731-aca4-1f1a6bccede0-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.709009 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.709018 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndcmd\" (UniqueName: \"kubernetes.io/projected/f6be7542-0564-4731-aca4-1f1a6bccede0-kube-api-access-ndcmd\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.709028 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:57 crc kubenswrapper[5030]: I0121 00:04:57.709036 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/b825094f-f0a0-4c1c-b0a6-3db5d7371b28-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.434816 5030 generic.go:334] "Generic (PLEG): container finished" podID="dce6d1bc-2fdc-4501-867d-25a233954a2f" 
containerID="4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4" exitCode=0 Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.434861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" event={"ID":"dce6d1bc-2fdc-4501-867d-25a233954a2f","Type":"ContainerDied","Data":"4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4"} Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.434894 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd" Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.434930 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z" Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.510106 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z"] Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.523492 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-684b6db755-sgp7z"] Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.556688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd"] Jan 21 00:04:58 crc kubenswrapper[5030]: I0121 00:04:58.569897 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-6htkd"] Jan 21 00:04:59 crc kubenswrapper[5030]: I0121 00:04:59.449938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" event={"ID":"dce6d1bc-2fdc-4501-867d-25a233954a2f","Type":"ContainerStarted","Data":"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e"} Jan 21 00:04:59 crc kubenswrapper[5030]: I0121 00:04:59.450154 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:04:59 crc kubenswrapper[5030]: I0121 00:04:59.476069 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" podStartSLOduration=3.476040084 podStartE2EDuration="3.476040084s" podCreationTimestamp="2026-01-21 00:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:04:59.469225259 +0000 UTC m=+5371.789485567" watchObservedRunningTime="2026-01-21 00:04:59.476040084 +0000 UTC m=+5371.796300382" Jan 21 00:04:59 crc kubenswrapper[5030]: I0121 00:04:59.971058 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b825094f-f0a0-4c1c-b0a6-3db5d7371b28" path="/var/lib/kubelet/pods/b825094f-f0a0-4c1c-b0a6-3db5d7371b28/volumes" Jan 21 00:04:59 crc kubenswrapper[5030]: I0121 00:04:59.971859 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6be7542-0564-4731-aca4-1f1a6bccede0" path="/var/lib/kubelet/pods/f6be7542-0564-4731-aca4-1f1a6bccede0/volumes" Jan 21 00:05:06 crc kubenswrapper[5030]: I0121 00:05:06.907691 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:05:06 crc kubenswrapper[5030]: I0121 00:05:06.978036 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:05:06 crc 
kubenswrapper[5030]: I0121 00:05:06.978363 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="dnsmasq-dns" containerID="cri-o://4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6" gracePeriod=10 Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.505867 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.540057 5030 generic.go:334] "Generic (PLEG): container finished" podID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerID="4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6" exitCode=0 Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.540100 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" event={"ID":"705cf2be-b0f3-4599-815f-1301a4535cfb","Type":"ContainerDied","Data":"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6"} Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.540117 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.540135 5030 scope.go:117] "RemoveContainer" containerID="4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.540123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9" event={"ID":"705cf2be-b0f3-4599-815f-1301a4535cfb","Type":"ContainerDied","Data":"79fab27d696387974163bd73b25698b66ce4caf3aa6f2018420c9eccb24a2d1a"} Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.556928 5030 scope.go:117] "RemoveContainer" containerID="6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.572055 5030 scope.go:117] "RemoveContainer" containerID="4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6" Jan 21 00:05:07 crc kubenswrapper[5030]: E0121 00:05:07.572548 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6\": container with ID starting with 4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6 not found: ID does not exist" containerID="4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.572599 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6"} err="failed to get container status \"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6\": rpc error: code = NotFound desc = could not find container \"4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6\": container with ID starting with 4d0692fc099264876788cec8f439e267256c8724789757cd95c79d7e9da92ef6 not found: ID does not exist" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.572666 5030 scope.go:117] "RemoveContainer" containerID="6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8" Jan 21 00:05:07 crc kubenswrapper[5030]: E0121 00:05:07.572990 5030 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8\": container with ID starting with 6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8 not found: ID does not exist" containerID="6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.573031 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8"} err="failed to get container status \"6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8\": rpc error: code = NotFound desc = could not find container \"6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8\": container with ID starting with 6648c2fbaea2c11782ca9fd1e200de40695ca1244f50a56434489991065b1fa8 not found: ID does not exist" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.600480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config\") pod \"705cf2be-b0f3-4599-815f-1301a4535cfb\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.600533 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc\") pod \"705cf2be-b0f3-4599-815f-1301a4535cfb\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.600573 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5m8\" (UniqueName: \"kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8\") pod \"705cf2be-b0f3-4599-815f-1301a4535cfb\" (UID: \"705cf2be-b0f3-4599-815f-1301a4535cfb\") " Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.605910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8" (OuterVolumeSpecName: "kube-api-access-rk5m8") pod "705cf2be-b0f3-4599-815f-1301a4535cfb" (UID: "705cf2be-b0f3-4599-815f-1301a4535cfb"). InnerVolumeSpecName "kube-api-access-rk5m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.643354 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config" (OuterVolumeSpecName: "config") pod "705cf2be-b0f3-4599-815f-1301a4535cfb" (UID: "705cf2be-b0f3-4599-815f-1301a4535cfb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.655076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "705cf2be-b0f3-4599-815f-1301a4535cfb" (UID: "705cf2be-b0f3-4599-815f-1301a4535cfb"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.701811 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.701842 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/705cf2be-b0f3-4599-815f-1301a4535cfb-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.701856 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5m8\" (UniqueName: \"kubernetes.io/projected/705cf2be-b0f3-4599-815f-1301a4535cfb-kube-api-access-rk5m8\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.893183 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.903405 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8t4c9"] Jan 21 00:05:07 crc kubenswrapper[5030]: I0121 00:05:07.973558 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" path="/var/lib/kubelet/pods/705cf2be-b0f3-4599-815f-1301a4535cfb/volumes" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.628760 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j"] Jan 21 00:05:11 crc kubenswrapper[5030]: E0121 00:05:11.629382 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="dnsmasq-dns" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.629397 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="dnsmasq-dns" Jan 21 00:05:11 crc kubenswrapper[5030]: E0121 00:05:11.629440 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="init" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.629449 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="init" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.629807 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="705cf2be-b0f3-4599-815f-1301a4535cfb" containerName="dnsmasq-dns" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.630389 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.633066 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.633432 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.633442 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-58br7" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.633745 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-1" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.633960 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-0" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.638461 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.639010 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.639134 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-generic-service1-default-certs-2" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.649822 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j"] Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762262 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762378 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762403 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762447 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.762484 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2ft\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.863805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.863887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.863946 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.864012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2ft\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.864103 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.864146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.869294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.869683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.871967 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.872416 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.875430 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.886975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2ft\" (UniqueName: 
\"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft\") pod \"install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:11 crc kubenswrapper[5030]: I0121 00:05:11.956182 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:12 crc kubenswrapper[5030]: I0121 00:05:12.484284 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j"] Jan 21 00:05:12 crc kubenswrapper[5030]: I0121 00:05:12.593753 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" event={"ID":"95ae89c7-5a9f-4d75-a5ad-97c844954a03","Type":"ContainerStarted","Data":"ba47ef39aab977da1fb996f2de66db5a01babd97256c788ae48f8721d51056c9"} Jan 21 00:05:13 crc kubenswrapper[5030]: I0121 00:05:13.609279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" event={"ID":"95ae89c7-5a9f-4d75-a5ad-97c844954a03","Type":"ContainerStarted","Data":"c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf"} Jan 21 00:05:13 crc kubenswrapper[5030]: I0121 00:05:13.637245 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" podStartSLOduration=1.8980041189999999 podStartE2EDuration="2.637226382s" podCreationTimestamp="2026-01-21 00:05:11 +0000 UTC" firstStartedPulling="2026-01-21 00:05:12.493308925 +0000 UTC m=+5384.813569253" lastFinishedPulling="2026-01-21 00:05:13.232531218 +0000 UTC m=+5385.552791516" observedRunningTime="2026-01-21 00:05:13.637095448 +0000 UTC m=+5385.957355766" watchObservedRunningTime="2026-01-21 00:05:13.637226382 +0000 UTC m=+5385.957486670" Jan 21 00:05:16 crc kubenswrapper[5030]: E0121 00:05:16.252361 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ae89c7_5a9f_4d75_a5ad_97c844954a03.slice/crio-conmon-c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95ae89c7_5a9f_4d75_a5ad_97c844954a03.slice/crio-c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf.scope\": RecentStats: unable to find data in memory cache]" Jan 21 00:05:16 crc kubenswrapper[5030]: I0121 00:05:16.647499 5030 generic.go:334] "Generic (PLEG): container finished" podID="95ae89c7-5a9f-4d75-a5ad-97c844954a03" containerID="c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf" exitCode=0 Jan 21 00:05:16 crc kubenswrapper[5030]: I0121 00:05:16.647567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" event={"ID":"95ae89c7-5a9f-4d75-a5ad-97c844954a03","Type":"ContainerDied","Data":"c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf"} Jan 21 00:05:17 crc kubenswrapper[5030]: I0121 00:05:17.911462 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.077325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.077986 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw2ft\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.078145 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.078216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.078284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.078348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle\") pod \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\" (UID: \"95ae89c7-5a9f-4d75-a5ad-97c844954a03\") " Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.084785 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-generic-service1-default-certs-0") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "openstack-edpm-tls-generic-service1-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.085214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle" (OuterVolumeSpecName: "generic-service1-combined-ca-bundle") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "generic-service1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.085388 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft" (OuterVolumeSpecName: "kube-api-access-bw2ft") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "kube-api-access-bw2ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.094477 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovr-combined-ca-bundle") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "install-certs-ovr-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.106823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory" (OuterVolumeSpecName: "inventory") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.113990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "95ae89c7-5a9f-4d75-a5ad-97c844954a03" (UID: "95ae89c7-5a9f-4d75-a5ad-97c844954a03"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181599 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw2ft\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-kube-api-access-bw2ft\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181674 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181694 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-generic-service1-default-certs-0\" (UniqueName: \"kubernetes.io/projected/95ae89c7-5a9f-4d75-a5ad-97c844954a03-openstack-edpm-tls-generic-service1-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181719 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovr-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-install-certs-ovr-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181742 5030 reconciler_common.go:293] "Volume detached for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-generic-service1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.181761 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/95ae89c7-5a9f-4d75-a5ad-97c844954a03-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.667136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" event={"ID":"95ae89c7-5a9f-4d75-a5ad-97c844954a03","Type":"ContainerDied","Data":"ba47ef39aab977da1fb996f2de66db5a01babd97256c788ae48f8721d51056c9"} Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.667192 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba47ef39aab977da1fb996f2de66db5a01babd97256c788ae48f8721d51056c9" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.667319 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.781543 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc"] Jan 21 00:05:18 crc kubenswrapper[5030]: E0121 00:05:18.781928 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ae89c7-5a9f-4d75-a5ad-97c844954a03" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.781952 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ae89c7-5a9f-4d75-a5ad-97c844954a03" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.782161 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ae89c7-5a9f-4d75-a5ad-97c844954a03" containerName="install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.782865 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.785540 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.785761 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.790025 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.790052 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-58br7" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.790259 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.793579 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc"] Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.893219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.893312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.893366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.893396 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjsz\" (UniqueName: \"kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.994884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.995196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.995291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:18 crc kubenswrapper[5030]: I0121 00:05:18.995354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjsz\" (UniqueName: \"kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.001104 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.001193 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls\") pod 
\"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.003460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.018179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjsz\" (UniqueName: \"kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz\") pod \"generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.100729 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.380236 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc"] Jan 21 00:05:19 crc kubenswrapper[5030]: I0121 00:05:19.679968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" event={"ID":"0b7a865f-7798-4358-9ae6-cd1f694f41d5","Type":"ContainerStarted","Data":"5803af034f349e5b2fa14ac79aedab8a0c52f1a90e3486a542562dbff63dd7c7"} Jan 21 00:05:20 crc kubenswrapper[5030]: I0121 00:05:20.692763 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" event={"ID":"0b7a865f-7798-4358-9ae6-cd1f694f41d5","Type":"ContainerStarted","Data":"783ac0d3a9708daa66821522ab1b5e4696a21225ac14d5964b5e946f983faf3c"} Jan 21 00:05:20 crc kubenswrapper[5030]: I0121 00:05:20.718915 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" podStartSLOduration=2.001096565 podStartE2EDuration="2.71888742s" podCreationTimestamp="2026-01-21 00:05:18 +0000 UTC" firstStartedPulling="2026-01-21 00:05:19.391549536 +0000 UTC m=+5391.711809864" lastFinishedPulling="2026-01-21 00:05:20.109340401 +0000 UTC m=+5392.429600719" observedRunningTime="2026-01-21 00:05:20.716435862 +0000 UTC m=+5393.036696190" watchObservedRunningTime="2026-01-21 00:05:20.71888742 +0000 UTC m=+5393.039147748" Jan 21 00:05:23 crc kubenswrapper[5030]: I0121 00:05:23.720302 5030 generic.go:334] "Generic (PLEG): container finished" podID="0b7a865f-7798-4358-9ae6-cd1f694f41d5" containerID="783ac0d3a9708daa66821522ab1b5e4696a21225ac14d5964b5e946f983faf3c" exitCode=0 Jan 21 00:05:23 crc kubenswrapper[5030]: I0121 00:05:23.720398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" event={"ID":"0b7a865f-7798-4358-9ae6-cd1f694f41d5","Type":"ContainerDied","Data":"783ac0d3a9708daa66821522ab1b5e4696a21225ac14d5964b5e946f983faf3c"} 
Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.067165 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.123271 5030 scope.go:117] "RemoveContainer" containerID="fe7e0a28e309f4b3b3b54f2671350c8935af49ddb2d6e6455da9ad8b2c979856" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.147659 5030 scope.go:117] "RemoveContainer" containerID="6a94b4f12576bc4770168fd5a8a6729bf06a9b306ed382492bd0701c4327a171" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.195822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnjsz\" (UniqueName: \"kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz\") pod \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.195965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls\") pod \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.196027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle\") pod \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.196102 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory\") pod \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\" (UID: \"0b7a865f-7798-4358-9ae6-cd1f694f41d5\") " Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.202006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle" (OuterVolumeSpecName: "generic-service1-combined-ca-bundle") pod "0b7a865f-7798-4358-9ae6-cd1f694f41d5" (UID: "0b7a865f-7798-4358-9ae6-cd1f694f41d5"). InnerVolumeSpecName "generic-service1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.202218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz" (OuterVolumeSpecName: "kube-api-access-wnjsz") pod "0b7a865f-7798-4358-9ae6-cd1f694f41d5" (UID: "0b7a865f-7798-4358-9ae6-cd1f694f41d5"). InnerVolumeSpecName "kube-api-access-wnjsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.217491 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "0b7a865f-7798-4358-9ae6-cd1f694f41d5" (UID: "0b7a865f-7798-4358-9ae6-cd1f694f41d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.224824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory" (OuterVolumeSpecName: "inventory") pod "0b7a865f-7798-4358-9ae6-cd1f694f41d5" (UID: "0b7a865f-7798-4358-9ae6-cd1f694f41d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.297951 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.297983 5030 reconciler_common.go:293] "Volume detached for volume \"generic-service1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-generic-service1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.297997 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b7a865f-7798-4358-9ae6-cd1f694f41d5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.298012 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnjsz\" (UniqueName: \"kubernetes.io/projected/0b7a865f-7798-4358-9ae6-cd1f694f41d5-kube-api-access-wnjsz\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.742130 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.742216 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc" event={"ID":"0b7a865f-7798-4358-9ae6-cd1f694f41d5","Type":"ContainerDied","Data":"5803af034f349e5b2fa14ac79aedab8a0c52f1a90e3486a542562dbff63dd7c7"} Jan 21 00:05:25 crc kubenswrapper[5030]: I0121 00:05:25.742550 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5803af034f349e5b2fa14ac79aedab8a0c52f1a90e3486a542562dbff63dd7c7" Jan 21 00:05:26 crc kubenswrapper[5030]: I0121 00:05:26.918343 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j"] Jan 21 00:05:26 crc kubenswrapper[5030]: I0121 00:05:26.927170 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovr-openstack-edpm-tls-openstack-edpm-tls-p2c9j"] Jan 21 00:05:26 crc kubenswrapper[5030]: I0121 00:05:26.939183 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc"] Jan 21 00:05:26 crc kubenswrapper[5030]: I0121 00:05:26.951770 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/generic-service1-openstack-edpm-tls-openstack-edpm-tls-mp5vc"] Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.011978 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:05:27 crc kubenswrapper[5030]: E0121 00:05:27.012308 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b7a865f-7798-4358-9ae6-cd1f694f41d5" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.012328 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a865f-7798-4358-9ae6-cd1f694f41d5" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.012506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7a865f-7798-4358-9ae6-cd1f694f41d5" containerName="generic-service1-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.013406 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.025134 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.127705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzbm\" (UniqueName: \"kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.127938 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.127988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.228741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzbm\" (UniqueName: \"kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.228845 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.228879 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.229834 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.229944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.252906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzbm\" (UniqueName: \"kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm\") pod \"dnsmasq-dnsmasq-84b9f45d47-x9bl7\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.329612 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.991920 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7a865f-7798-4358-9ae6-cd1f694f41d5" path="/var/lib/kubelet/pods/0b7a865f-7798-4358-9ae6-cd1f694f41d5/volumes" Jan 21 00:05:27 crc kubenswrapper[5030]: I0121 00:05:27.994134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ae89c7-5a9f-4d75-a5ad-97c844954a03" path="/var/lib/kubelet/pods/95ae89c7-5a9f-4d75-a5ad-97c844954a03/volumes" Jan 21 00:05:28 crc kubenswrapper[5030]: I0121 00:05:28.011998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:05:28 crc kubenswrapper[5030]: I0121 00:05:28.820527 5030 generic.go:334] "Generic (PLEG): container finished" podID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerID="d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905" exitCode=0 Jan 21 00:05:28 crc kubenswrapper[5030]: I0121 00:05:28.821129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" event={"ID":"2c86b3c9-8e4e-412f-889b-de7b99a185e5","Type":"ContainerDied","Data":"d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905"} Jan 21 00:05:28 crc kubenswrapper[5030]: I0121 00:05:28.821177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" event={"ID":"2c86b3c9-8e4e-412f-889b-de7b99a185e5","Type":"ContainerStarted","Data":"e1094499043ebd5dbbd20442f23c8ec49aaa400d4c3f68ed1852a4d57834479a"} Jan 21 00:05:29 crc kubenswrapper[5030]: I0121 00:05:29.831437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" event={"ID":"2c86b3c9-8e4e-412f-889b-de7b99a185e5","Type":"ContainerStarted","Data":"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803"} Jan 21 00:05:29 crc kubenswrapper[5030]: I0121 00:05:29.831758 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:29 crc kubenswrapper[5030]: I0121 00:05:29.862272 5030 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" podStartSLOduration=3.86224495 podStartE2EDuration="3.86224495s" podCreationTimestamp="2026-01-21 00:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:05:29.853410315 +0000 UTC m=+5402.173670643" watchObservedRunningTime="2026-01-21 00:05:29.86224495 +0000 UTC m=+5402.182505268" Jan 21 00:05:35 crc kubenswrapper[5030]: I0121 00:05:35.942439 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p4ck6"] Jan 21 00:05:35 crc kubenswrapper[5030]: I0121 00:05:35.958441 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p4ck6"] Jan 21 00:05:35 crc kubenswrapper[5030]: I0121 00:05:35.972058 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49572136-a63e-443b-840e-0f4559f297f1" path="/var/lib/kubelet/pods/49572136-a63e-443b-840e-0f4559f297f1/volumes" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.073436 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zw8tn"] Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.074850 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.077806 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.078119 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.078676 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.088191 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zw8tn"] Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.090239 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.228347 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x92c\" (UniqueName: \"kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.228810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.229034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.331819 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.331911 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.331986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x92c\" (UniqueName: \"kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.333108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.333766 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.357878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x92c\" (UniqueName: \"kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c\") pod \"crc-storage-crc-zw8tn\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:36 crc kubenswrapper[5030]: I0121 00:05:36.425020 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.331828 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.402580 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.403046 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="dnsmasq-dns" containerID="cri-o://280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e" gracePeriod=10 Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.700251 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zw8tn"] Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.823059 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.927141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw8tn" event={"ID":"6c859cf5-0293-45df-93e1-59bd630f2598","Type":"ContainerStarted","Data":"601ae75361a2dbf2463db243b33050d6caf769a5dbcb99aa86aea3bb0145483b"} Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.930181 5030 generic.go:334] "Generic (PLEG): container finished" podID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerID="280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e" exitCode=0 Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.930227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" event={"ID":"dce6d1bc-2fdc-4501-867d-25a233954a2f","Type":"ContainerDied","Data":"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e"} Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.930259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" event={"ID":"dce6d1bc-2fdc-4501-867d-25a233954a2f","Type":"ContainerDied","Data":"ae47e42e40bfb4eab5338d7bec1ef23d65924598f33e697a343ef623d11fe51e"} Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.930292 5030 scope.go:117] "RemoveContainer" containerID="280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.930310 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.953658 5030 scope.go:117] "RemoveContainer" containerID="4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4" Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.955182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config\") pod \"dce6d1bc-2fdc-4501-867d-25a233954a2f\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.955294 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls\") pod \"dce6d1bc-2fdc-4501-867d-25a233954a2f\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.955325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sbkq\" (UniqueName: \"kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq\") pod \"dce6d1bc-2fdc-4501-867d-25a233954a2f\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.955441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc\") pod \"dce6d1bc-2fdc-4501-867d-25a233954a2f\" (UID: \"dce6d1bc-2fdc-4501-867d-25a233954a2f\") " Jan 21 00:05:37 crc kubenswrapper[5030]: I0121 00:05:37.960411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq" (OuterVolumeSpecName: 
"kube-api-access-6sbkq") pod "dce6d1bc-2fdc-4501-867d-25a233954a2f" (UID: "dce6d1bc-2fdc-4501-867d-25a233954a2f"). InnerVolumeSpecName "kube-api-access-6sbkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.014255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config" (OuterVolumeSpecName: "config") pod "dce6d1bc-2fdc-4501-867d-25a233954a2f" (UID: "dce6d1bc-2fdc-4501-867d-25a233954a2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.015316 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "dce6d1bc-2fdc-4501-867d-25a233954a2f" (UID: "dce6d1bc-2fdc-4501-867d-25a233954a2f"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.033014 5030 scope.go:117] "RemoveContainer" containerID="280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e" Jan 21 00:05:38 crc kubenswrapper[5030]: E0121 00:05:38.033618 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e\": container with ID starting with 280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e not found: ID does not exist" containerID="280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.033736 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e"} err="failed to get container status \"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e\": rpc error: code = NotFound desc = could not find container \"280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e\": container with ID starting with 280a1556cb277df080ab259851aca18d8df86469789ee303a674ca05e760cf5e not found: ID does not exist" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.033780 5030 scope.go:117] "RemoveContainer" containerID="4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4" Jan 21 00:05:38 crc kubenswrapper[5030]: E0121 00:05:38.034181 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4\": container with ID starting with 4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4 not found: ID does not exist" containerID="4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.034232 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4"} err="failed to get container status \"4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4\": rpc error: code = NotFound desc = could not find container \"4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4\": container with ID starting with 4bc722aec0a1d9062aea25041b160148060536f2728cc386978ce956f981f7f4 not found: ID does not 
exist" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.037566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "dce6d1bc-2fdc-4501-867d-25a233954a2f" (UID: "dce6d1bc-2fdc-4501-867d-25a233954a2f"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.056758 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.056789 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.056799 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/dce6d1bc-2fdc-4501-867d-25a233954a2f-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.056816 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sbkq\" (UniqueName: \"kubernetes.io/projected/dce6d1bc-2fdc-4501-867d-25a233954a2f-kube-api-access-6sbkq\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.284648 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.298374 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76b7c4d945-86v9q"] Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.944001 5030 generic.go:334] "Generic (PLEG): container finished" podID="6c859cf5-0293-45df-93e1-59bd630f2598" containerID="7f354130cfe70a506e0b8da7fc2de2a746b7f0bec1340ad75d7e0807dd9c86f0" exitCode=0 Jan 21 00:05:38 crc kubenswrapper[5030]: I0121 00:05:38.944091 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw8tn" event={"ID":"6c859cf5-0293-45df-93e1-59bd630f2598","Type":"ContainerDied","Data":"7f354130cfe70a506e0b8da7fc2de2a746b7f0bec1340ad75d7e0807dd9c86f0"} Jan 21 00:05:39 crc kubenswrapper[5030]: I0121 00:05:39.982775 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" path="/var/lib/kubelet/pods/dce6d1bc-2fdc-4501-867d-25a233954a2f/volumes" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.253049 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.397355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x92c\" (UniqueName: \"kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c\") pod \"6c859cf5-0293-45df-93e1-59bd630f2598\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.397607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage\") pod \"6c859cf5-0293-45df-93e1-59bd630f2598\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.397852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt\") pod \"6c859cf5-0293-45df-93e1-59bd630f2598\" (UID: \"6c859cf5-0293-45df-93e1-59bd630f2598\") " Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.398135 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6c859cf5-0293-45df-93e1-59bd630f2598" (UID: "6c859cf5-0293-45df-93e1-59bd630f2598"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.407836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c" (OuterVolumeSpecName: "kube-api-access-9x92c") pod "6c859cf5-0293-45df-93e1-59bd630f2598" (UID: "6c859cf5-0293-45df-93e1-59bd630f2598"). InnerVolumeSpecName "kube-api-access-9x92c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.433364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6c859cf5-0293-45df-93e1-59bd630f2598" (UID: "6c859cf5-0293-45df-93e1-59bd630f2598"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.499657 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x92c\" (UniqueName: \"kubernetes.io/projected/6c859cf5-0293-45df-93e1-59bd630f2598-kube-api-access-9x92c\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.499713 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6c859cf5-0293-45df-93e1-59bd630f2598-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.499724 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6c859cf5-0293-45df-93e1-59bd630f2598-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.966887 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zw8tn" event={"ID":"6c859cf5-0293-45df-93e1-59bd630f2598","Type":"ContainerDied","Data":"601ae75361a2dbf2463db243b33050d6caf769a5dbcb99aa86aea3bb0145483b"} Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.966940 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="601ae75361a2dbf2463db243b33050d6caf769a5dbcb99aa86aea3bb0145483b" Jan 21 00:05:40 crc kubenswrapper[5030]: I0121 00:05:40.967345 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zw8tn" Jan 21 00:05:43 crc kubenswrapper[5030]: I0121 00:05:43.787442 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zw8tn"] Jan 21 00:05:43 crc kubenswrapper[5030]: I0121 00:05:43.803411 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zw8tn"] Jan 21 00:05:43 crc kubenswrapper[5030]: I0121 00:05:43.975198 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c859cf5-0293-45df-93e1-59bd630f2598" path="/var/lib/kubelet/pods/6c859cf5-0293-45df-93e1-59bd630f2598/volumes" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.009522 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-sq55g"] Jan 21 00:05:44 crc kubenswrapper[5030]: E0121 00:05:44.010149 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c859cf5-0293-45df-93e1-59bd630f2598" containerName="storage" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.010199 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c859cf5-0293-45df-93e1-59bd630f2598" containerName="storage" Jan 21 00:05:44 crc kubenswrapper[5030]: E0121 00:05:44.010295 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="init" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.010308 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="init" Jan 21 00:05:44 crc kubenswrapper[5030]: E0121 00:05:44.010332 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="dnsmasq-dns" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.010368 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="dnsmasq-dns" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.010662 5030 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6c859cf5-0293-45df-93e1-59bd630f2598" containerName="storage" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.010711 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dce6d1bc-2fdc-4501-867d-25a233954a2f" containerName="dnsmasq-dns" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.011419 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.014749 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.014890 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.015111 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.016196 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.029972 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sq55g"] Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.162977 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.163029 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.163076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg2bf\" (UniqueName: \"kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.264469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.264687 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg2bf\" (UniqueName: \"kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.264907 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt\") pod \"crc-storage-crc-sq55g\" (UID: 
\"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.265167 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.265960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.286579 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg2bf\" (UniqueName: \"kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf\") pod \"crc-storage-crc-sq55g\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.336883 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:44 crc kubenswrapper[5030]: I0121 00:05:44.805553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-sq55g"] Jan 21 00:05:45 crc kubenswrapper[5030]: I0121 00:05:45.015246 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sq55g" event={"ID":"367334e1-8d9b-4474-9fb9-89f154b07667","Type":"ContainerStarted","Data":"c0c0173fedd3dde8edfde808cdb9ca90d84c4c39ef35179ef29c1aa5ca2ce822"} Jan 21 00:05:46 crc kubenswrapper[5030]: I0121 00:05:46.031189 5030 generic.go:334] "Generic (PLEG): container finished" podID="367334e1-8d9b-4474-9fb9-89f154b07667" containerID="4398953fb94de880cdde55183ed2f8d0aae947333b5028db7dfbe6fbfed0e008" exitCode=0 Jan 21 00:05:46 crc kubenswrapper[5030]: I0121 00:05:46.031289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sq55g" event={"ID":"367334e1-8d9b-4474-9fb9-89f154b07667","Type":"ContainerDied","Data":"4398953fb94de880cdde55183ed2f8d0aae947333b5028db7dfbe6fbfed0e008"} Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.430903 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.518844 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt\") pod \"367334e1-8d9b-4474-9fb9-89f154b07667\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.518928 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage\") pod \"367334e1-8d9b-4474-9fb9-89f154b07667\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.519043 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg2bf\" (UniqueName: \"kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf\") pod \"367334e1-8d9b-4474-9fb9-89f154b07667\" (UID: \"367334e1-8d9b-4474-9fb9-89f154b07667\") " Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.519482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "367334e1-8d9b-4474-9fb9-89f154b07667" (UID: "367334e1-8d9b-4474-9fb9-89f154b07667"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.531001 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf" (OuterVolumeSpecName: "kube-api-access-wg2bf") pod "367334e1-8d9b-4474-9fb9-89f154b07667" (UID: "367334e1-8d9b-4474-9fb9-89f154b07667"). InnerVolumeSpecName "kube-api-access-wg2bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.544595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "367334e1-8d9b-4474-9fb9-89f154b07667" (UID: "367334e1-8d9b-4474-9fb9-89f154b07667"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.620956 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/367334e1-8d9b-4474-9fb9-89f154b07667-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.621011 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/367334e1-8d9b-4474-9fb9-89f154b07667-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:47 crc kubenswrapper[5030]: I0121 00:05:47.621034 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wg2bf\" (UniqueName: \"kubernetes.io/projected/367334e1-8d9b-4474-9fb9-89f154b07667-kube-api-access-wg2bf\") on node \"crc\" DevicePath \"\"" Jan 21 00:05:48 crc kubenswrapper[5030]: I0121 00:05:48.053819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-sq55g" event={"ID":"367334e1-8d9b-4474-9fb9-89f154b07667","Type":"ContainerDied","Data":"c0c0173fedd3dde8edfde808cdb9ca90d84c4c39ef35179ef29c1aa5ca2ce822"} Jan 21 00:05:48 crc kubenswrapper[5030]: I0121 00:05:48.053876 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c0173fedd3dde8edfde808cdb9ca90d84c4c39ef35179ef29c1aa5ca2ce822" Jan 21 00:05:48 crc kubenswrapper[5030]: I0121 00:05:48.053948 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-sq55g" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.877550 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"] Jan 21 00:05:50 crc kubenswrapper[5030]: E0121 00:05:50.877945 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367334e1-8d9b-4474-9fb9-89f154b07667" containerName="storage" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.877962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="367334e1-8d9b-4474-9fb9-89f154b07667" containerName="storage" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.878163 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="367334e1-8d9b-4474-9fb9-89f154b07667" containerName="storage" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.879082 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.880962 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.892586 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"] Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.981303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.981423 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sltqr\" (UniqueName: \"kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.981477 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:50 crc kubenswrapper[5030]: I0121 00:05:50.981506 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.082552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sltqr\" (UniqueName: \"kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.082931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.082961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.083007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" 
(UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.084043 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.084094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.084401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.106407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sltqr\" (UniqueName: \"kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr\") pod \"dnsmasq-dnsmasq-64864b6d57-t4prj\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.290658 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:51 crc kubenswrapper[5030]: I0121 00:05:51.763704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"] Jan 21 00:05:51 crc kubenswrapper[5030]: W0121 00:05:51.773826 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c98d80a_3889_499f_b822_dabbeb25f4cf.slice/crio-a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0 WatchSource:0}: Error finding container a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0: Status 404 returned error can't find the container with id a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0 Jan 21 00:05:52 crc kubenswrapper[5030]: I0121 00:05:52.095820 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerID="a7d9774025737c4497227ec8725c0294a993dbee1f310820daaa88fd05c97532" exitCode=0 Jan 21 00:05:52 crc kubenswrapper[5030]: I0121 00:05:52.095893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" event={"ID":"4c98d80a-3889-499f-b822-dabbeb25f4cf","Type":"ContainerDied","Data":"a7d9774025737c4497227ec8725c0294a993dbee1f310820daaa88fd05c97532"} Jan 21 00:05:52 crc kubenswrapper[5030]: I0121 00:05:52.095964 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" event={"ID":"4c98d80a-3889-499f-b822-dabbeb25f4cf","Type":"ContainerStarted","Data":"a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0"} Jan 21 00:05:53 crc kubenswrapper[5030]: I0121 00:05:53.112199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" event={"ID":"4c98d80a-3889-499f-b822-dabbeb25f4cf","Type":"ContainerStarted","Data":"8c661467137337265d146274a69301d98c05212632a21587c294aedf83e72985"} Jan 21 00:05:53 crc kubenswrapper[5030]: I0121 00:05:53.112773 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:05:53 crc kubenswrapper[5030]: I0121 00:05:53.149484 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" podStartSLOduration=3.149453945 podStartE2EDuration="3.149453945s" podCreationTimestamp="2026-01-21 00:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:05:53.135968598 +0000 UTC m=+5425.456228916" watchObservedRunningTime="2026-01-21 00:05:53.149453945 +0000 UTC m=+5425.469714273" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.291910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.369541 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.369941 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="dnsmasq-dns" 
containerID="cri-o://966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803" gracePeriod=10 Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.742504 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.862081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config\") pod \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.862146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjzbm\" (UniqueName: \"kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm\") pod \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.862185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc\") pod \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\" (UID: \"2c86b3c9-8e4e-412f-889b-de7b99a185e5\") " Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.895134 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm" (OuterVolumeSpecName: "kube-api-access-qjzbm") pod "2c86b3c9-8e4e-412f-889b-de7b99a185e5" (UID: "2c86b3c9-8e4e-412f-889b-de7b99a185e5"). InnerVolumeSpecName "kube-api-access-qjzbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.913881 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5"] Jan 21 00:06:01 crc kubenswrapper[5030]: E0121 00:06:01.914581 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="dnsmasq-dns" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.914613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="dnsmasq-dns" Jan 21 00:06:01 crc kubenswrapper[5030]: E0121 00:06:01.914700 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="init" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.914740 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="init" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.915064 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerName="dnsmasq-dns" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.916670 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.922022 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.922420 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.922575 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.923573 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.925894 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5"] Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.934671 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config" (OuterVolumeSpecName: "config") pod "2c86b3c9-8e4e-412f-889b-de7b99a185e5" (UID: "2c86b3c9-8e4e-412f-889b-de7b99a185e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.944967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "2c86b3c9-8e4e-412f-889b-de7b99a185e5" (UID: "2c86b3c9-8e4e-412f-889b-de7b99a185e5"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.963704 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.963747 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c86b3c9-8e4e-412f-889b-de7b99a185e5-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:01 crc kubenswrapper[5030]: I0121 00:06:01.963765 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjzbm\" (UniqueName: \"kubernetes.io/projected/2c86b3c9-8e4e-412f-889b-de7b99a185e5-kube-api-access-qjzbm\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.065380 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74msd\" (UniqueName: \"kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.065692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.065776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.167815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74msd\" (UniqueName: \"kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.168054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.168101 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.173181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.173694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.186459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74msd\" (UniqueName: \"kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd\") pod \"download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.220960 5030 generic.go:334] "Generic (PLEG): container finished" podID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" containerID="966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803" exitCode=0 Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.221023 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.221033 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" event={"ID":"2c86b3c9-8e4e-412f-889b-de7b99a185e5","Type":"ContainerDied","Data":"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803"} Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.221115 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7" event={"ID":"2c86b3c9-8e4e-412f-889b-de7b99a185e5","Type":"ContainerDied","Data":"e1094499043ebd5dbbd20442f23c8ec49aaa400d4c3f68ed1852a4d57834479a"} Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.221145 5030 scope.go:117] "RemoveContainer" containerID="966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.247103 5030 scope.go:117] "RemoveContainer" containerID="d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.250881 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.258692 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-x9bl7"] Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.265300 5030 scope.go:117] "RemoveContainer" containerID="966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803" Jan 21 00:06:02 crc kubenswrapper[5030]: E0121 00:06:02.266217 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803\": container with ID starting with 966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803 not found: ID does not exist" containerID="966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.266276 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803"} err="failed to get container status \"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803\": rpc error: code = NotFound desc = could not find container \"966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803\": container with ID starting with 966a06c7ac4b01247fc89746bde7a98ae0d57728269e2e00ed4640abb0f9f803 not found: ID does not exist" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.266311 5030 scope.go:117] "RemoveContainer" containerID="d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905" Jan 21 00:06:02 crc kubenswrapper[5030]: E0121 00:06:02.266792 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905\": container with ID starting with d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905 not found: ID does not exist" containerID="d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.266831 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905"} err="failed to get container status \"d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905\": rpc error: code = NotFound desc = could not find container \"d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905\": container with ID starting with d30bf7369b010c519411d07557a982f893dbd795b28efd7f558d769fa0f87905 not found: ID does not exist" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.269288 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:02 crc kubenswrapper[5030]: I0121 00:06:02.699503 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5"] Jan 21 00:06:03 crc kubenswrapper[5030]: I0121 00:06:03.235479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" event={"ID":"a5d8524f-463c-4a40-9b09-fbc9e80ded50","Type":"ContainerStarted","Data":"66fadf550b0af4bbbe57b3262aca71d8cd2ce689b64c8a9c5f04ca8f42bbcab9"} Jan 21 00:06:03 crc kubenswrapper[5030]: I0121 00:06:03.988335 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c86b3c9-8e4e-412f-889b-de7b99a185e5" path="/var/lib/kubelet/pods/2c86b3c9-8e4e-412f-889b-de7b99a185e5/volumes" Jan 21 00:06:04 crc kubenswrapper[5030]: I0121 00:06:04.249305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" event={"ID":"a5d8524f-463c-4a40-9b09-fbc9e80ded50","Type":"ContainerStarted","Data":"c57fa0852a94326fd2eb1efb569129a3b88c52bbd0c71a3d2d8eb0d1f3e05ce0"} Jan 21 00:06:04 crc kubenswrapper[5030]: I0121 00:06:04.270583 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" podStartSLOduration=2.805380991 podStartE2EDuration="3.27055888s" podCreationTimestamp="2026-01-21 00:06:01 +0000 UTC" firstStartedPulling="2026-01-21 00:06:02.707477589 +0000 UTC m=+5435.027737877" lastFinishedPulling="2026-01-21 00:06:03.172655468 +0000 UTC m=+5435.492915766" observedRunningTime="2026-01-21 00:06:04.266323047 +0000 UTC m=+5436.586583355" watchObservedRunningTime="2026-01-21 00:06:04.27055888 +0000 UTC m=+5436.590819168" Jan 21 00:06:05 crc kubenswrapper[5030]: I0121 00:06:05.261977 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5d8524f-463c-4a40-9b09-fbc9e80ded50" containerID="c57fa0852a94326fd2eb1efb569129a3b88c52bbd0c71a3d2d8eb0d1f3e05ce0" exitCode=0 Jan 21 00:06:05 crc kubenswrapper[5030]: I0121 00:06:05.262088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" event={"ID":"a5d8524f-463c-4a40-9b09-fbc9e80ded50","Type":"ContainerDied","Data":"c57fa0852a94326fd2eb1efb569129a3b88c52bbd0c71a3d2d8eb0d1f3e05ce0"} Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.546371 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.650208 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74msd\" (UniqueName: \"kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd\") pod \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.650673 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes\") pod \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.650731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory\") pod \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\" (UID: \"a5d8524f-463c-4a40-9b09-fbc9e80ded50\") " Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.655012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd" (OuterVolumeSpecName: "kube-api-access-74msd") pod "a5d8524f-463c-4a40-9b09-fbc9e80ded50" (UID: "a5d8524f-463c-4a40-9b09-fbc9e80ded50"). InnerVolumeSpecName "kube-api-access-74msd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.671781 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory" (OuterVolumeSpecName: "inventory") pod "a5d8524f-463c-4a40-9b09-fbc9e80ded50" (UID: "a5d8524f-463c-4a40-9b09-fbc9e80ded50"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.680047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "a5d8524f-463c-4a40-9b09-fbc9e80ded50" (UID: "a5d8524f-463c-4a40-9b09-fbc9e80ded50"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.752577 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.752618 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5d8524f-463c-4a40-9b09-fbc9e80ded50-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:06 crc kubenswrapper[5030]: I0121 00:06:06.752665 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74msd\" (UniqueName: \"kubernetes.io/projected/a5d8524f-463c-4a40-9b09-fbc9e80ded50-kube-api-access-74msd\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.279480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" event={"ID":"a5d8524f-463c-4a40-9b09-fbc9e80ded50","Type":"ContainerDied","Data":"66fadf550b0af4bbbe57b3262aca71d8cd2ce689b64c8a9c5f04ca8f42bbcab9"} Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.279518 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fadf550b0af4bbbe57b3262aca71d8cd2ce689b64c8a9c5f04ca8f42bbcab9" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.279538 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.343799 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d"] Jan 21 00:06:07 crc kubenswrapper[5030]: E0121 00:06:07.344155 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d8524f-463c-4a40-9b09-fbc9e80ded50" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.344177 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d8524f-463c-4a40-9b09-fbc9e80ded50" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.344366 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d8524f-463c-4a40-9b09-fbc9e80ded50" containerName="download-cache-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.346556 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.348739 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.348820 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.349035 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.349337 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.349760 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.359061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d"] Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.463301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66pp\" (UniqueName: \"kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.463366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.463391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.463455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.564370 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66pp\" (UniqueName: \"kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp\") pod 
\"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.564592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.564613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.564668 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.570013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.571329 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.571855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 00:06:07.585215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66pp\" (UniqueName: \"kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp\") pod \"bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:07 crc kubenswrapper[5030]: I0121 
00:06:07.665839 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:08 crc kubenswrapper[5030]: I0121 00:06:08.170399 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d"] Jan 21 00:06:08 crc kubenswrapper[5030]: W0121 00:06:08.172922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod103c5df9_e30f_4d80_96fd_75cfcc07ce18.slice/crio-d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515 WatchSource:0}: Error finding container d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515: Status 404 returned error can't find the container with id d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515 Jan 21 00:06:08 crc kubenswrapper[5030]: I0121 00:06:08.293884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" event={"ID":"103c5df9-e30f-4d80-96fd-75cfcc07ce18","Type":"ContainerStarted","Data":"d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515"} Jan 21 00:06:09 crc kubenswrapper[5030]: I0121 00:06:09.306968 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" event={"ID":"103c5df9-e30f-4d80-96fd-75cfcc07ce18","Type":"ContainerStarted","Data":"af34ceed8e9bf13ec3c26ccbf34aa4a2647431beac065ed3581c2efc00908b28"} Jan 21 00:06:09 crc kubenswrapper[5030]: I0121 00:06:09.336271 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" podStartSLOduration=1.798027307 podStartE2EDuration="2.336249898s" podCreationTimestamp="2026-01-21 00:06:07 +0000 UTC" firstStartedPulling="2026-01-21 00:06:08.17664237 +0000 UTC m=+5440.496902658" lastFinishedPulling="2026-01-21 00:06:08.714864941 +0000 UTC m=+5441.035125249" observedRunningTime="2026-01-21 00:06:09.328446389 +0000 UTC m=+5441.648706767" watchObservedRunningTime="2026-01-21 00:06:09.336249898 +0000 UTC m=+5441.656510186" Jan 21 00:06:10 crc kubenswrapper[5030]: I0121 00:06:10.323088 5030 generic.go:334] "Generic (PLEG): container finished" podID="103c5df9-e30f-4d80-96fd-75cfcc07ce18" containerID="af34ceed8e9bf13ec3c26ccbf34aa4a2647431beac065ed3581c2efc00908b28" exitCode=0 Jan 21 00:06:10 crc kubenswrapper[5030]: I0121 00:06:10.323162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" event={"ID":"103c5df9-e30f-4d80-96fd-75cfcc07ce18","Type":"ContainerDied","Data":"af34ceed8e9bf13ec3c26ccbf34aa4a2647431beac065ed3581c2efc00908b28"} Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.667270 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.828790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes\") pod \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.828843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory\") pod \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.828903 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66pp\" (UniqueName: \"kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp\") pod \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.828939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle\") pod \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\" (UID: \"103c5df9-e30f-4d80-96fd-75cfcc07ce18\") " Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.835980 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "103c5df9-e30f-4d80-96fd-75cfcc07ce18" (UID: "103c5df9-e30f-4d80-96fd-75cfcc07ce18"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.836508 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp" (OuterVolumeSpecName: "kube-api-access-x66pp") pod "103c5df9-e30f-4d80-96fd-75cfcc07ce18" (UID: "103c5df9-e30f-4d80-96fd-75cfcc07ce18"). InnerVolumeSpecName "kube-api-access-x66pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.851657 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory" (OuterVolumeSpecName: "inventory") pod "103c5df9-e30f-4d80-96fd-75cfcc07ce18" (UID: "103c5df9-e30f-4d80-96fd-75cfcc07ce18"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.860889 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "103c5df9-e30f-4d80-96fd-75cfcc07ce18" (UID: "103c5df9-e30f-4d80-96fd-75cfcc07ce18"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.930262 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.930316 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.930335 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66pp\" (UniqueName: \"kubernetes.io/projected/103c5df9-e30f-4d80-96fd-75cfcc07ce18-kube-api-access-x66pp\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:11 crc kubenswrapper[5030]: I0121 00:06:11.930349 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c5df9-e30f-4d80-96fd-75cfcc07ce18-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.361896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" event={"ID":"103c5df9-e30f-4d80-96fd-75cfcc07ce18","Type":"ContainerDied","Data":"d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515"} Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.361992 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1cc05801d34fde2da5f88cd13f0be611bfb759252e3b9a25a3e1917cd34f515" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.362174 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.426560 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q"] Jan 21 00:06:12 crc kubenswrapper[5030]: E0121 00:06:12.427100 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c5df9-e30f-4d80-96fd-75cfcc07ce18" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.427180 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c5df9-e30f-4d80-96fd-75cfcc07ce18" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.427438 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="103c5df9-e30f-4d80-96fd-75cfcc07ce18" containerName="bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.428077 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.430708 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.430977 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.431242 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.433174 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.449295 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q"] Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.541006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.541063 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4xp\" (UniqueName: \"kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.541169 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.643034 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.643142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" 
Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.643166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4xp\" (UniqueName: \"kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.649054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.675330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.684199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4xp\" (UniqueName: \"kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp\") pod \"configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:12 crc kubenswrapper[5030]: I0121 00:06:12.758364 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:13 crc kubenswrapper[5030]: I0121 00:06:13.247524 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q"] Jan 21 00:06:13 crc kubenswrapper[5030]: W0121 00:06:13.252203 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3439eae5_4f69_4f67_9cd2_6b25b38792b6.slice/crio-2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af WatchSource:0}: Error finding container 2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af: Status 404 returned error can't find the container with id 2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af Jan 21 00:06:13 crc kubenswrapper[5030]: I0121 00:06:13.373482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" event={"ID":"3439eae5-4f69-4f67-9cd2-6b25b38792b6","Type":"ContainerStarted","Data":"2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af"} Jan 21 00:06:14 crc kubenswrapper[5030]: I0121 00:06:14.386435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" event={"ID":"3439eae5-4f69-4f67-9cd2-6b25b38792b6","Type":"ContainerStarted","Data":"c038ec01c492b5f8a12c4451e87132666135683daa656c8c5bfb22f57684b6c4"} Jan 21 00:06:14 crc kubenswrapper[5030]: I0121 00:06:14.418606 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" podStartSLOduration=1.865328174 podStartE2EDuration="2.418571899s" podCreationTimestamp="2026-01-21 00:06:12 +0000 UTC" firstStartedPulling="2026-01-21 00:06:13.254931684 +0000 UTC m=+5445.575191982" lastFinishedPulling="2026-01-21 00:06:13.808175379 +0000 UTC m=+5446.128435707" observedRunningTime="2026-01-21 00:06:14.407072721 +0000 UTC m=+5446.727333039" watchObservedRunningTime="2026-01-21 00:06:14.418571899 +0000 UTC m=+5446.738832227" Jan 21 00:06:15 crc kubenswrapper[5030]: I0121 00:06:15.401252 5030 generic.go:334] "Generic (PLEG): container finished" podID="3439eae5-4f69-4f67-9cd2-6b25b38792b6" containerID="c038ec01c492b5f8a12c4451e87132666135683daa656c8c5bfb22f57684b6c4" exitCode=0 Jan 21 00:06:15 crc kubenswrapper[5030]: I0121 00:06:15.401332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" event={"ID":"3439eae5-4f69-4f67-9cd2-6b25b38792b6","Type":"ContainerDied","Data":"c038ec01c492b5f8a12c4451e87132666135683daa656c8c5bfb22f57684b6c4"} Jan 21 00:06:16 crc kubenswrapper[5030]: I0121 00:06:16.849173 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.015860 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn4xp\" (UniqueName: \"kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp\") pod \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.015986 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes\") pod \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.016040 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory\") pod \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\" (UID: \"3439eae5-4f69-4f67-9cd2-6b25b38792b6\") " Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.027440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp" (OuterVolumeSpecName: "kube-api-access-gn4xp") pod "3439eae5-4f69-4f67-9cd2-6b25b38792b6" (UID: "3439eae5-4f69-4f67-9cd2-6b25b38792b6"). InnerVolumeSpecName "kube-api-access-gn4xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.045387 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "3439eae5-4f69-4f67-9cd2-6b25b38792b6" (UID: "3439eae5-4f69-4f67-9cd2-6b25b38792b6"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.063393 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory" (OuterVolumeSpecName: "inventory") pod "3439eae5-4f69-4f67-9cd2-6b25b38792b6" (UID: "3439eae5-4f69-4f67-9cd2-6b25b38792b6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.118503 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn4xp\" (UniqueName: \"kubernetes.io/projected/3439eae5-4f69-4f67-9cd2-6b25b38792b6-kube-api-access-gn4xp\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.118556 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.118578 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3439eae5-4f69-4f67-9cd2-6b25b38792b6-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.422409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" event={"ID":"3439eae5-4f69-4f67-9cd2-6b25b38792b6","Type":"ContainerDied","Data":"2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af"} Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.422795 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2161f7f06697c76b4d058d6126ec5d3b27efeebcec8127bc5f9a9c542fb9e7af" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.422855 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.510793 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9"] Jan 21 00:06:17 crc kubenswrapper[5030]: E0121 00:06:17.511219 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3439eae5-4f69-4f67-9cd2-6b25b38792b6" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.511247 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3439eae5-4f69-4f67-9cd2-6b25b38792b6" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.511424 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3439eae5-4f69-4f67-9cd2-6b25b38792b6" containerName="configure-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.512067 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.515563 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.515773 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.515885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.515580 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.517507 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9"] Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.624409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.624565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.624591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdq5l\" (UniqueName: \"kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.725423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.725569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" 
Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.725605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdq5l\" (UniqueName: \"kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.730385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.730887 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.754330 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdq5l\" (UniqueName: \"kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l\") pod \"validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:17 crc kubenswrapper[5030]: I0121 00:06:17.837155 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:18 crc kubenswrapper[5030]: I0121 00:06:18.084274 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9"] Jan 21 00:06:18 crc kubenswrapper[5030]: W0121 00:06:18.087447 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93625bb1_fc41_4613_9bc1_ef512f010bbc.slice/crio-33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85 WatchSource:0}: Error finding container 33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85: Status 404 returned error can't find the container with id 33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85 Jan 21 00:06:18 crc kubenswrapper[5030]: I0121 00:06:18.435777 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" event={"ID":"93625bb1-fc41-4613-9bc1-ef512f010bbc","Type":"ContainerStarted","Data":"33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85"} Jan 21 00:06:19 crc kubenswrapper[5030]: I0121 00:06:19.447495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" event={"ID":"93625bb1-fc41-4613-9bc1-ef512f010bbc","Type":"ContainerStarted","Data":"95861705696a2581bb7d9cb33534f21eec2421d6086413b3d6c124867f1992e8"} Jan 21 00:06:19 crc kubenswrapper[5030]: I0121 00:06:19.470258 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" podStartSLOduration=1.978188867 podStartE2EDuration="2.470231538s" podCreationTimestamp="2026-01-21 00:06:17 +0000 UTC" firstStartedPulling="2026-01-21 00:06:18.089815297 +0000 UTC m=+5450.410075585" lastFinishedPulling="2026-01-21 00:06:18.581857938 +0000 UTC m=+5450.902118256" observedRunningTime="2026-01-21 00:06:19.462368948 +0000 UTC m=+5451.782629226" watchObservedRunningTime="2026-01-21 00:06:19.470231538 +0000 UTC m=+5451.790491846" Jan 21 00:06:20 crc kubenswrapper[5030]: I0121 00:06:20.463387 5030 generic.go:334] "Generic (PLEG): container finished" podID="93625bb1-fc41-4613-9bc1-ef512f010bbc" containerID="95861705696a2581bb7d9cb33534f21eec2421d6086413b3d6c124867f1992e8" exitCode=0 Jan 21 00:06:20 crc kubenswrapper[5030]: I0121 00:06:20.463491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" event={"ID":"93625bb1-fc41-4613-9bc1-ef512f010bbc","Type":"ContainerDied","Data":"95861705696a2581bb7d9cb33534f21eec2421d6086413b3d6c124867f1992e8"} Jan 21 00:06:21 crc kubenswrapper[5030]: I0121 00:06:21.811691 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.000281 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes\") pod \"93625bb1-fc41-4613-9bc1-ef512f010bbc\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.000896 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdq5l\" (UniqueName: \"kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l\") pod \"93625bb1-fc41-4613-9bc1-ef512f010bbc\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.000977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory\") pod \"93625bb1-fc41-4613-9bc1-ef512f010bbc\" (UID: \"93625bb1-fc41-4613-9bc1-ef512f010bbc\") " Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.006252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l" (OuterVolumeSpecName: "kube-api-access-xdq5l") pod "93625bb1-fc41-4613-9bc1-ef512f010bbc" (UID: "93625bb1-fc41-4613-9bc1-ef512f010bbc"). InnerVolumeSpecName "kube-api-access-xdq5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.023399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory" (OuterVolumeSpecName: "inventory") pod "93625bb1-fc41-4613-9bc1-ef512f010bbc" (UID: "93625bb1-fc41-4613-9bc1-ef512f010bbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.036530 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "93625bb1-fc41-4613-9bc1-ef512f010bbc" (UID: "93625bb1-fc41-4613-9bc1-ef512f010bbc"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.104067 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.104112 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/93625bb1-fc41-4613-9bc1-ef512f010bbc-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.104131 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdq5l\" (UniqueName: \"kubernetes.io/projected/93625bb1-fc41-4613-9bc1-ef512f010bbc-kube-api-access-xdq5l\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.487478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" event={"ID":"93625bb1-fc41-4613-9bc1-ef512f010bbc","Type":"ContainerDied","Data":"33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85"} Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.487532 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33b32727ee1ed5a6069baa8692c363570281265f7b2cc9a35b86718239ae6a85" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.487966 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.580505 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg"] Jan 21 00:06:22 crc kubenswrapper[5030]: E0121 00:06:22.581019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93625bb1-fc41-4613-9bc1-ef512f010bbc" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.581052 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="93625bb1-fc41-4613-9bc1-ef512f010bbc" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.581274 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="93625bb1-fc41-4613-9bc1-ef512f010bbc" containerName="validate-network-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.581932 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.586010 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.586122 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.586224 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.586329 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.592478 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg"] Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.613439 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.613519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq494\" (UniqueName: \"kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.613574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.715007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq494\" (UniqueName: \"kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.715080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 
00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.715161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.718519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.724756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.736042 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq494\" (UniqueName: \"kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494\") pod \"install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:22 crc kubenswrapper[5030]: I0121 00:06:22.906606 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:23 crc kubenswrapper[5030]: I0121 00:06:23.361646 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg"] Jan 21 00:06:23 crc kubenswrapper[5030]: I0121 00:06:23.504911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" event={"ID":"68fd9305-02e2-4e20-9b4e-2835ab4fb12e","Type":"ContainerStarted","Data":"4c830f7beb105ab9dbdd64f7f7d203bdae1bb4f39c4bdda926ca52d66dabb799"} Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.266444 5030 scope.go:117] "RemoveContainer" containerID="d666523f48cf4aaf05db4dd3f6ba6ca62dc9695f19f4f6ca3c7bc2af9b00b323" Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.292196 5030 scope.go:117] "RemoveContainer" containerID="335c0b8dbb4d6d5eeeb22eb1e0e0d95f51b996152234b3f644a2665b4d2fa0ff" Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.328971 5030 scope.go:117] "RemoveContainer" containerID="5f6a7f78da319e388f0437ef2beeaa421aa70e851f691b2b4f03d29707c2bb80" Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.352851 5030 scope.go:117] "RemoveContainer" containerID="69bef87377bb66e3b027d614cfd60256b7e6670cebbe8fba722f2a003e9cddcb" Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.379159 5030 scope.go:117] "RemoveContainer" containerID="e81d8bde96ffa60ced095539bf8194443e1a9eaae0fb5f78fda9c45b6aaf0f69" Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.530847 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" event={"ID":"68fd9305-02e2-4e20-9b4e-2835ab4fb12e","Type":"ContainerStarted","Data":"b0d506265f978beb8126f6798cd50bf16df24d80854f3b6172f602219dded71b"} Jan 21 00:06:25 crc kubenswrapper[5030]: I0121 00:06:25.558596 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" podStartSLOduration=2.629754014 podStartE2EDuration="3.558576424s" podCreationTimestamp="2026-01-21 00:06:22 +0000 UTC" firstStartedPulling="2026-01-21 00:06:23.373061562 +0000 UTC m=+5455.693321890" lastFinishedPulling="2026-01-21 00:06:24.301884012 +0000 UTC m=+5456.622144300" observedRunningTime="2026-01-21 00:06:25.551495642 +0000 UTC m=+5457.871755930" watchObservedRunningTime="2026-01-21 00:06:25.558576424 +0000 UTC m=+5457.878836732" Jan 21 00:06:26 crc kubenswrapper[5030]: I0121 00:06:26.553328 5030 generic.go:334] "Generic (PLEG): container finished" podID="68fd9305-02e2-4e20-9b4e-2835ab4fb12e" containerID="b0d506265f978beb8126f6798cd50bf16df24d80854f3b6172f602219dded71b" exitCode=0 Jan 21 00:06:26 crc kubenswrapper[5030]: I0121 00:06:26.553398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" event={"ID":"68fd9305-02e2-4e20-9b4e-2835ab4fb12e","Type":"ContainerDied","Data":"b0d506265f978beb8126f6798cd50bf16df24d80854f3b6172f602219dded71b"} Jan 21 00:06:27 crc kubenswrapper[5030]: I0121 00:06:27.918532 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.009153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq494\" (UniqueName: \"kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494\") pod \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.009228 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory\") pod \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.009286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes\") pod \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\" (UID: \"68fd9305-02e2-4e20-9b4e-2835ab4fb12e\") " Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.016973 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494" (OuterVolumeSpecName: "kube-api-access-pq494") pod "68fd9305-02e2-4e20-9b4e-2835ab4fb12e" (UID: "68fd9305-02e2-4e20-9b4e-2835ab4fb12e"). InnerVolumeSpecName "kube-api-access-pq494". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.031445 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory" (OuterVolumeSpecName: "inventory") pod "68fd9305-02e2-4e20-9b4e-2835ab4fb12e" (UID: "68fd9305-02e2-4e20-9b4e-2835ab4fb12e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.044963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "68fd9305-02e2-4e20-9b4e-2835ab4fb12e" (UID: "68fd9305-02e2-4e20-9b4e-2835ab4fb12e"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.110807 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq494\" (UniqueName: \"kubernetes.io/projected/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-kube-api-access-pq494\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.110861 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.110886 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/68fd9305-02e2-4e20-9b4e-2835ab4fb12e-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.586328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" event={"ID":"68fd9305-02e2-4e20-9b4e-2835ab4fb12e","Type":"ContainerDied","Data":"4c830f7beb105ab9dbdd64f7f7d203bdae1bb4f39c4bdda926ca52d66dabb799"} Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.586461 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c830f7beb105ab9dbdd64f7f7d203bdae1bb4f39c4bdda926ca52d66dabb799" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.586482 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.654677 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs"] Jan 21 00:06:28 crc kubenswrapper[5030]: E0121 00:06:28.655200 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fd9305-02e2-4e20-9b4e-2835ab4fb12e" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.655230 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fd9305-02e2-4e20-9b4e-2835ab4fb12e" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.655477 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fd9305-02e2-4e20-9b4e-2835ab4fb12e" containerName="install-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.656356 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.660224 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.660655 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.661949 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.662260 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.676897 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs"] Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.720577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.720884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jvmt\" (UniqueName: \"kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.720982 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.822653 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.822850 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jvmt\" (UniqueName: \"kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " 
pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.822906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.827534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.827604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.849640 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jvmt\" (UniqueName: \"kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt\") pod \"configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:28 crc kubenswrapper[5030]: I0121 00:06:28.976073 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:29 crc kubenswrapper[5030]: I0121 00:06:29.447331 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs"] Jan 21 00:06:29 crc kubenswrapper[5030]: I0121 00:06:29.602982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" event={"ID":"c4e39394-d293-4bc0-a41b-71ab76c7a740","Type":"ContainerStarted","Data":"52c2a3ab7c41b0508234e2310b6ed2569bca9e27dda51ceae9501f077cc89239"} Jan 21 00:06:30 crc kubenswrapper[5030]: I0121 00:06:30.616112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" event={"ID":"c4e39394-d293-4bc0-a41b-71ab76c7a740","Type":"ContainerStarted","Data":"15ef8289575339fe902af26da37aaa611d70617294878ec110682d3c22ffdd21"} Jan 21 00:06:30 crc kubenswrapper[5030]: I0121 00:06:30.679212 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" podStartSLOduration=2.210896149 podStartE2EDuration="2.679188884s" podCreationTimestamp="2026-01-21 00:06:28 +0000 UTC" firstStartedPulling="2026-01-21 00:06:29.455300808 +0000 UTC m=+5461.775561136" lastFinishedPulling="2026-01-21 00:06:29.923593543 +0000 UTC m=+5462.243853871" observedRunningTime="2026-01-21 00:06:30.658294607 +0000 UTC m=+5462.978554935" watchObservedRunningTime="2026-01-21 00:06:30.679188884 +0000 UTC m=+5462.999449182" Jan 21 00:06:31 crc kubenswrapper[5030]: I0121 00:06:31.629438 5030 generic.go:334] "Generic (PLEG): container finished" podID="c4e39394-d293-4bc0-a41b-71ab76c7a740" containerID="15ef8289575339fe902af26da37aaa611d70617294878ec110682d3c22ffdd21" exitCode=0 Jan 21 00:06:31 crc kubenswrapper[5030]: I0121 00:06:31.629493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" event={"ID":"c4e39394-d293-4bc0-a41b-71ab76c7a740","Type":"ContainerDied","Data":"15ef8289575339fe902af26da37aaa611d70617294878ec110682d3c22ffdd21"} Jan 21 00:06:32 crc kubenswrapper[5030]: I0121 00:06:32.954034 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:32 crc kubenswrapper[5030]: I0121 00:06:32.988248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes\") pod \"c4e39394-d293-4bc0-a41b-71ab76c7a740\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " Jan 21 00:06:32 crc kubenswrapper[5030]: I0121 00:06:32.988337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory\") pod \"c4e39394-d293-4bc0-a41b-71ab76c7a740\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " Jan 21 00:06:32 crc kubenswrapper[5030]: I0121 00:06:32.988423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jvmt\" (UniqueName: \"kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt\") pod \"c4e39394-d293-4bc0-a41b-71ab76c7a740\" (UID: \"c4e39394-d293-4bc0-a41b-71ab76c7a740\") " Jan 21 00:06:32 crc kubenswrapper[5030]: I0121 00:06:32.997513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt" (OuterVolumeSpecName: "kube-api-access-4jvmt") pod "c4e39394-d293-4bc0-a41b-71ab76c7a740" (UID: "c4e39394-d293-4bc0-a41b-71ab76c7a740"). InnerVolumeSpecName "kube-api-access-4jvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.017497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory" (OuterVolumeSpecName: "inventory") pod "c4e39394-d293-4bc0-a41b-71ab76c7a740" (UID: "c4e39394-d293-4bc0-a41b-71ab76c7a740"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.018039 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c4e39394-d293-4bc0-a41b-71ab76c7a740" (UID: "c4e39394-d293-4bc0-a41b-71ab76c7a740"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.091261 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jvmt\" (UniqueName: \"kubernetes.io/projected/c4e39394-d293-4bc0-a41b-71ab76c7a740-kube-api-access-4jvmt\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.091314 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.091339 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4e39394-d293-4bc0-a41b-71ab76c7a740-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.658495 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" event={"ID":"c4e39394-d293-4bc0-a41b-71ab76c7a740","Type":"ContainerDied","Data":"52c2a3ab7c41b0508234e2310b6ed2569bca9e27dda51ceae9501f077cc89239"} Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.658545 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52c2a3ab7c41b0508234e2310b6ed2569bca9e27dda51ceae9501f077cc89239" Jan 21 00:06:33 crc kubenswrapper[5030]: I0121 00:06:33.658572 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.048090 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj"] Jan 21 00:06:34 crc kubenswrapper[5030]: E0121 00:06:34.048998 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e39394-d293-4bc0-a41b-71ab76c7a740" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.049026 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e39394-d293-4bc0-a41b-71ab76c7a740" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.049270 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e39394-d293-4bc0-a41b-71ab76c7a740" containerName="configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.050027 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.052402 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.052607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.054366 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.056443 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.066758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj"] Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.109223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.109466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngd2d\" (UniqueName: \"kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.109598 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.210483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.210613 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.210666 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngd2d\" (UniqueName: \"kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.217547 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.217940 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.228338 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngd2d\" (UniqueName: \"kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d\") pod \"run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.367888 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:34 crc kubenswrapper[5030]: I0121 00:06:34.849852 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj"] Jan 21 00:06:34 crc kubenswrapper[5030]: W0121 00:06:34.856402 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9337bf6b_bdfa_4575_993e_cd766ae96e00.slice/crio-e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f WatchSource:0}: Error finding container e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f: Status 404 returned error can't find the container with id e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f Jan 21 00:06:35 crc kubenswrapper[5030]: I0121 00:06:35.685889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" event={"ID":"9337bf6b-bdfa-4575-993e-cd766ae96e00","Type":"ContainerStarted","Data":"f15501a966c5d5350e24d57b2d179b04ae0d6d080867da95227104c4672cc0fc"} Jan 21 00:06:35 crc kubenswrapper[5030]: I0121 00:06:35.686463 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" event={"ID":"9337bf6b-bdfa-4575-993e-cd766ae96e00","Type":"ContainerStarted","Data":"e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f"} Jan 21 00:06:35 crc kubenswrapper[5030]: I0121 00:06:35.713760 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" podStartSLOduration=1.188808098 podStartE2EDuration="1.713738057s" podCreationTimestamp="2026-01-21 00:06:34 +0000 UTC" firstStartedPulling="2026-01-21 00:06:34.862491186 +0000 UTC m=+5467.182751514" lastFinishedPulling="2026-01-21 00:06:35.387421185 +0000 UTC m=+5467.707681473" observedRunningTime="2026-01-21 00:06:35.709396002 +0000 UTC m=+5468.029656390" watchObservedRunningTime="2026-01-21 00:06:35.713738057 +0000 UTC m=+5468.033998355" Jan 21 00:06:37 crc kubenswrapper[5030]: I0121 00:06:37.710617 5030 generic.go:334] "Generic (PLEG): container finished" podID="9337bf6b-bdfa-4575-993e-cd766ae96e00" containerID="f15501a966c5d5350e24d57b2d179b04ae0d6d080867da95227104c4672cc0fc" exitCode=0 Jan 21 00:06:37 crc kubenswrapper[5030]: I0121 00:06:37.710843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" event={"ID":"9337bf6b-bdfa-4575-993e-cd766ae96e00","Type":"ContainerDied","Data":"f15501a966c5d5350e24d57b2d179b04ae0d6d080867da95227104c4672cc0fc"} Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.069336 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.107597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes\") pod \"9337bf6b-bdfa-4575-993e-cd766ae96e00\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.107911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngd2d\" (UniqueName: \"kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d\") pod \"9337bf6b-bdfa-4575-993e-cd766ae96e00\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.108311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory\") pod \"9337bf6b-bdfa-4575-993e-cd766ae96e00\" (UID: \"9337bf6b-bdfa-4575-993e-cd766ae96e00\") " Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.114869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d" (OuterVolumeSpecName: "kube-api-access-ngd2d") pod "9337bf6b-bdfa-4575-993e-cd766ae96e00" (UID: "9337bf6b-bdfa-4575-993e-cd766ae96e00"). InnerVolumeSpecName "kube-api-access-ngd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.128256 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "9337bf6b-bdfa-4575-993e-cd766ae96e00" (UID: "9337bf6b-bdfa-4575-993e-cd766ae96e00"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.134971 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory" (OuterVolumeSpecName: "inventory") pod "9337bf6b-bdfa-4575-993e-cd766ae96e00" (UID: "9337bf6b-bdfa-4575-993e-cd766ae96e00"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.210543 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.210589 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngd2d\" (UniqueName: \"kubernetes.io/projected/9337bf6b-bdfa-4575-993e-cd766ae96e00-kube-api-access-ngd2d\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.210600 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9337bf6b-bdfa-4575-993e-cd766ae96e00-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.734599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" event={"ID":"9337bf6b-bdfa-4575-993e-cd766ae96e00","Type":"ContainerDied","Data":"e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f"} Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.734664 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e69ab04c50022a897472052dfed6b6ed9e83c7e92f187cb09a179ba24280cb3f" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.734746 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.827356 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq"] Jan 21 00:06:39 crc kubenswrapper[5030]: E0121 00:06:39.827809 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9337bf6b-bdfa-4575-993e-cd766ae96e00" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.827841 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9337bf6b-bdfa-4575-993e-cd766ae96e00" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.828081 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9337bf6b-bdfa-4575-993e-cd766ae96e00" containerName="run-os-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.828598 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.832926 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.833419 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.833789 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.834241 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.834739 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.854802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq"] Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.921236 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.921615 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.921888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.922100 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.922305 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.922495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.922696 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.922920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.923240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.923461 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:39 crc kubenswrapper[5030]: I0121 00:06:39.923922 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x86g\" (UniqueName: \"kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.025421 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.025485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.025575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.026789 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.026843 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.026889 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.026998 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.027093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x86g\" (UniqueName: 
\"kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.027236 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.027279 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.027331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.031205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.032362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.033689 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.036931 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: 
\"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.037396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.039035 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.037288 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.040387 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.043847 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.044217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.061165 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x86g\" (UniqueName: \"kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g\") pod \"install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 
00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.164667 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.452614 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq"] Jan 21 00:06:40 crc kubenswrapper[5030]: I0121 00:06:40.747283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" event={"ID":"c68f004c-ce56-43ba-bee3-581bf3c7dce8","Type":"ContainerStarted","Data":"2cbe20c0449ac33339c1a5718f67396c320e0369dbccafe870b1127e48c87354"} Jan 21 00:06:41 crc kubenswrapper[5030]: I0121 00:06:41.759572 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" event={"ID":"c68f004c-ce56-43ba-bee3-581bf3c7dce8","Type":"ContainerStarted","Data":"68eca8223b3ceb007c67bdf76460ad1845ff7e01fb93b8ecbb8cd84f015dc8b3"} Jan 21 00:06:41 crc kubenswrapper[5030]: I0121 00:06:41.798173 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" podStartSLOduration=2.293264722 podStartE2EDuration="2.798151025s" podCreationTimestamp="2026-01-21 00:06:39 +0000 UTC" firstStartedPulling="2026-01-21 00:06:40.463512194 +0000 UTC m=+5472.783772482" lastFinishedPulling="2026-01-21 00:06:40.968398487 +0000 UTC m=+5473.288658785" observedRunningTime="2026-01-21 00:06:41.782559167 +0000 UTC m=+5474.102819465" watchObservedRunningTime="2026-01-21 00:06:41.798151025 +0000 UTC m=+5474.118411323" Jan 21 00:06:42 crc kubenswrapper[5030]: I0121 00:06:42.769657 5030 generic.go:334] "Generic (PLEG): container finished" podID="c68f004c-ce56-43ba-bee3-581bf3c7dce8" containerID="68eca8223b3ceb007c67bdf76460ad1845ff7e01fb93b8ecbb8cd84f015dc8b3" exitCode=0 Jan 21 00:06:42 crc kubenswrapper[5030]: I0121 00:06:42.769697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" event={"ID":"c68f004c-ce56-43ba-bee3-581bf3c7dce8","Type":"ContainerDied","Data":"68eca8223b3ceb007c67bdf76460ad1845ff7e01fb93b8ecbb8cd84f015dc8b3"} Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.044521 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.096969 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097111 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x86g\" (UniqueName: \"kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097276 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097298 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097327 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.097351 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle\") pod \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\" (UID: \"c68f004c-ce56-43ba-bee3-581bf3c7dce8\") " Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.102276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.102306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.102444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.102817 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g" (OuterVolumeSpecName: "kube-api-access-7x86g") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "kube-api-access-7x86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.103148 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.103219 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.103447 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.103509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.104391 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.118888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.125521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory" (OuterVolumeSpecName: "inventory") pod "c68f004c-ce56-43ba-bee3-581bf3c7dce8" (UID: "c68f004c-ce56-43ba-bee3-581bf3c7dce8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198260 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198297 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198310 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198319 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198331 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198343 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x86g\" (UniqueName: \"kubernetes.io/projected/c68f004c-ce56-43ba-bee3-581bf3c7dce8-kube-api-access-7x86g\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198353 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198361 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198370 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198378 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.198387 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68f004c-ce56-43ba-bee3-581bf3c7dce8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.796101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" 
event={"ID":"c68f004c-ce56-43ba-bee3-581bf3c7dce8","Type":"ContainerDied","Data":"2cbe20c0449ac33339c1a5718f67396c320e0369dbccafe870b1127e48c87354"} Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.796146 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbe20c0449ac33339c1a5718f67396c320e0369dbccafe870b1127e48c87354" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.796158 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.875793 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc"] Jan 21 00:06:44 crc kubenswrapper[5030]: E0121 00:06:44.876183 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68f004c-ce56-43ba-bee3-581bf3c7dce8" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.876211 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68f004c-ce56-43ba-bee3-581bf3c7dce8" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.876436 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68f004c-ce56-43ba-bee3-581bf3c7dce8" containerName="install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.877025 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.888258 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.888259 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.888413 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.888447 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.888675 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.889109 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.899920 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc"] Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.910966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlz5t\" (UniqueName: \"kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " 
pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.911284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.911395 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.911542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:44 crc kubenswrapper[5030]: I0121 00:06:44.911645 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.012674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.012733 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.012861 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.013009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.013159 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlz5t\" (UniqueName: \"kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.014899 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.016525 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.026298 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.026398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.037985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlz5t\" (UniqueName: \"kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t\") pod \"ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.196217 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.631342 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc"] Jan 21 00:06:45 crc kubenswrapper[5030]: I0121 00:06:45.809826 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" event={"ID":"af5c1628-f417-46b1-8d56-49ecf0a67139","Type":"ContainerStarted","Data":"8bb8925b1d771c7ce45e4c0d3e11e1e42cb4666de3cbcce0ec597335fe15bb03"} Jan 21 00:06:46 crc kubenswrapper[5030]: I0121 00:06:46.823812 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" event={"ID":"af5c1628-f417-46b1-8d56-49ecf0a67139","Type":"ContainerStarted","Data":"bb3f88c095f0b16a51144b9a0e63fbcaf1b4088bb32a7d50f84e6404b9c7f4eb"} Jan 21 00:06:46 crc kubenswrapper[5030]: I0121 00:06:46.851152 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" podStartSLOduration=2.164842376 podStartE2EDuration="2.851135326s" podCreationTimestamp="2026-01-21 00:06:44 +0000 UTC" firstStartedPulling="2026-01-21 00:06:45.643600967 +0000 UTC m=+5477.963861255" lastFinishedPulling="2026-01-21 00:06:46.329893877 +0000 UTC m=+5478.650154205" observedRunningTime="2026-01-21 00:06:46.845817236 +0000 UTC m=+5479.166077524" watchObservedRunningTime="2026-01-21 00:06:46.851135326 +0000 UTC m=+5479.171395614" Jan 21 00:06:47 crc kubenswrapper[5030]: I0121 00:06:47.837350 5030 generic.go:334] "Generic (PLEG): container finished" podID="af5c1628-f417-46b1-8d56-49ecf0a67139" containerID="bb3f88c095f0b16a51144b9a0e63fbcaf1b4088bb32a7d50f84e6404b9c7f4eb" exitCode=0 Jan 21 00:06:47 crc kubenswrapper[5030]: I0121 00:06:47.837397 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" event={"ID":"af5c1628-f417-46b1-8d56-49ecf0a67139","Type":"ContainerDied","Data":"bb3f88c095f0b16a51144b9a0e63fbcaf1b4088bb32a7d50f84e6404b9c7f4eb"} Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.161982 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.181592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory\") pod \"af5c1628-f417-46b1-8d56-49ecf0a67139\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.181677 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes\") pod \"af5c1628-f417-46b1-8d56-49ecf0a67139\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.181761 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlz5t\" (UniqueName: \"kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t\") pod \"af5c1628-f417-46b1-8d56-49ecf0a67139\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.181804 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle\") pod \"af5c1628-f417-46b1-8d56-49ecf0a67139\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.181842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0\") pod \"af5c1628-f417-46b1-8d56-49ecf0a67139\" (UID: \"af5c1628-f417-46b1-8d56-49ecf0a67139\") " Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.187214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "af5c1628-f417-46b1-8d56-49ecf0a67139" (UID: "af5c1628-f417-46b1-8d56-49ecf0a67139"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.191918 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t" (OuterVolumeSpecName: "kube-api-access-mlz5t") pod "af5c1628-f417-46b1-8d56-49ecf0a67139" (UID: "af5c1628-f417-46b1-8d56-49ecf0a67139"). InnerVolumeSpecName "kube-api-access-mlz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.205037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "af5c1628-f417-46b1-8d56-49ecf0a67139" (UID: "af5c1628-f417-46b1-8d56-49ecf0a67139"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.205846 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "af5c1628-f417-46b1-8d56-49ecf0a67139" (UID: "af5c1628-f417-46b1-8d56-49ecf0a67139"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.207207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory" (OuterVolumeSpecName: "inventory") pod "af5c1628-f417-46b1-8d56-49ecf0a67139" (UID: "af5c1628-f417-46b1-8d56-49ecf0a67139"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.283717 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.283758 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.283778 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlz5t\" (UniqueName: \"kubernetes.io/projected/af5c1628-f417-46b1-8d56-49ecf0a67139-kube-api-access-mlz5t\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.283795 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5c1628-f417-46b1-8d56-49ecf0a67139-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.283811 5030 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/af5c1628-f417-46b1-8d56-49ecf0a67139-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.870166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" event={"ID":"af5c1628-f417-46b1-8d56-49ecf0a67139","Type":"ContainerDied","Data":"8bb8925b1d771c7ce45e4c0d3e11e1e42cb4666de3cbcce0ec597335fe15bb03"} Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.870505 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb8925b1d771c7ce45e4c0d3e11e1e42cb4666de3cbcce0ec597335fe15bb03" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.870264 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.946252 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9"] Jan 21 00:06:49 crc kubenswrapper[5030]: E0121 00:06:49.947042 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5c1628-f417-46b1-8d56-49ecf0a67139" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.947194 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5c1628-f417-46b1-8d56-49ecf0a67139" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.947954 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5c1628-f417-46b1-8d56-49ecf0a67139" containerName="ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.950125 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.957748 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.958106 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.958255 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.958481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.958675 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.958808 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.959710 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.983600 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9"] Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.997988 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998080 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998117 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgm7p\" (UniqueName: \"kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998185 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:49 crc kubenswrapper[5030]: I0121 00:06:49.998204 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099355 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099491 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgm7p\" (UniqueName: \"kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099609 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099679 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099824 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.099931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.104339 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.104483 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.104615 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.105655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.106142 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.106727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.107962 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.117560 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgm7p\" (UniqueName: \"kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p\") pod \"neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.279684 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.819962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9"] Jan 21 00:06:50 crc kubenswrapper[5030]: I0121 00:06:50.878386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" event={"ID":"bb1ac8e8-7505-4239-92d4-748025edb7d3","Type":"ContainerStarted","Data":"b7bddd28b2bfb668359fd041b16b713e63760bfce8b0acbf2324dfa7ea96333e"} Jan 21 00:06:51 crc kubenswrapper[5030]: I0121 00:06:51.890191 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" event={"ID":"bb1ac8e8-7505-4239-92d4-748025edb7d3","Type":"ContainerStarted","Data":"ba2ebdaadaf5424fb657a643d775069242a84d88f1220fc7d9fb80d24f7e7fc3"} Jan 21 00:06:51 crc kubenswrapper[5030]: I0121 00:06:51.915075 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" podStartSLOduration=2.191846664 podStartE2EDuration="2.915056921s" podCreationTimestamp="2026-01-21 00:06:49 +0000 UTC" firstStartedPulling="2026-01-21 00:06:50.82915333 +0000 UTC m=+5483.149413618" lastFinishedPulling="2026-01-21 00:06:51.552363547 +0000 UTC m=+5483.872623875" observedRunningTime="2026-01-21 00:06:51.9084212 +0000 UTC m=+5484.228681508" watchObservedRunningTime="2026-01-21 00:06:51.915056921 +0000 UTC m=+5484.235317209" Jan 21 00:06:53 crc kubenswrapper[5030]: I0121 00:06:53.918125 5030 generic.go:334] "Generic (PLEG): container finished" podID="bb1ac8e8-7505-4239-92d4-748025edb7d3" containerID="ba2ebdaadaf5424fb657a643d775069242a84d88f1220fc7d9fb80d24f7e7fc3" exitCode=0 Jan 21 00:06:53 crc kubenswrapper[5030]: I0121 00:06:53.918258 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" event={"ID":"bb1ac8e8-7505-4239-92d4-748025edb7d3","Type":"ContainerDied","Data":"ba2ebdaadaf5424fb657a643d775069242a84d88f1220fc7d9fb80d24f7e7fc3"} Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.274827 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.283874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgm7p\" (UniqueName: \"kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.283936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.283968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.284004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.284069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.284143 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.284194 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.284246 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes\") pod \"bb1ac8e8-7505-4239-92d4-748025edb7d3\" (UID: \"bb1ac8e8-7505-4239-92d4-748025edb7d3\") " Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.295292 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p" (OuterVolumeSpecName: "kube-api-access-bgm7p") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). 
InnerVolumeSpecName "kube-api-access-bgm7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.296887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.314837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.317369 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.320145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.321753 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.336795 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.338106 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory" (OuterVolumeSpecName: "inventory") pod "bb1ac8e8-7505-4239-92d4-748025edb7d3" (UID: "bb1ac8e8-7505-4239-92d4-748025edb7d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386804 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386865 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386879 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386906 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgm7p\" (UniqueName: \"kubernetes.io/projected/bb1ac8e8-7505-4239-92d4-748025edb7d3-kube-api-access-bgm7p\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386918 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386933 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386947 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.386962 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bb1ac8e8-7505-4239-92d4-748025edb7d3-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.946466 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" event={"ID":"bb1ac8e8-7505-4239-92d4-748025edb7d3","Type":"ContainerDied","Data":"b7bddd28b2bfb668359fd041b16b713e63760bfce8b0acbf2324dfa7ea96333e"} Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.946508 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7bddd28b2bfb668359fd041b16b713e63760bfce8b0acbf2324dfa7ea96333e" Jan 21 00:06:55 crc kubenswrapper[5030]: I0121 00:06:55.946533 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.023393 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67"] Jan 21 00:06:56 crc kubenswrapper[5030]: E0121 00:06:56.023777 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1ac8e8-7505-4239-92d4-748025edb7d3" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.023799 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1ac8e8-7505-4239-92d4-748025edb7d3" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.023966 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1ac8e8-7505-4239-92d4-748025edb7d3" containerName="neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.024506 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.045301 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.045496 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.046637 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.046921 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.047726 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.047931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.051846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67"] Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.100981 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.101060 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.101207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.101263 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pprq4\" (UniqueName: \"kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.101428 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.203255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.203376 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.203444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.203478 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pprq4\" (UniqueName: \"kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " 
pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.203533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.207881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.208190 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.209115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.209316 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.224258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pprq4\" (UniqueName: \"kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4\") pod \"neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.389443 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.627812 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67"] Jan 21 00:06:56 crc kubenswrapper[5030]: W0121 00:06:56.634295 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97700265_3bbc_43be_b0de_a5e05ecf1fd4.slice/crio-008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4 WatchSource:0}: Error finding container 008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4: Status 404 returned error can't find the container with id 008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4 Jan 21 00:06:56 crc kubenswrapper[5030]: I0121 00:06:56.955435 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" event={"ID":"97700265-3bbc-43be-b0de-a5e05ecf1fd4","Type":"ContainerStarted","Data":"008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4"} Jan 21 00:06:57 crc kubenswrapper[5030]: I0121 00:06:57.982076 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" event={"ID":"97700265-3bbc-43be-b0de-a5e05ecf1fd4","Type":"ContainerStarted","Data":"b908271e6c33798d690460914aeb3907afebd8bb8992aed9d312e8b5d796dd7c"} Jan 21 00:06:58 crc kubenswrapper[5030]: I0121 00:06:58.023482 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" podStartSLOduration=1.46077324 podStartE2EDuration="2.023457894s" podCreationTimestamp="2026-01-21 00:06:56 +0000 UTC" firstStartedPulling="2026-01-21 00:06:56.637030076 +0000 UTC m=+5488.957290374" lastFinishedPulling="2026-01-21 00:06:57.19971474 +0000 UTC m=+5489.519975028" observedRunningTime="2026-01-21 00:06:58.020884671 +0000 UTC m=+5490.341144969" watchObservedRunningTime="2026-01-21 00:06:58.023457894 +0000 UTC m=+5490.343718192" Jan 21 00:06:58 crc kubenswrapper[5030]: I0121 00:06:58.992843 5030 generic.go:334] "Generic (PLEG): container finished" podID="97700265-3bbc-43be-b0de-a5e05ecf1fd4" containerID="b908271e6c33798d690460914aeb3907afebd8bb8992aed9d312e8b5d796dd7c" exitCode=0 Jan 21 00:06:58 crc kubenswrapper[5030]: I0121 00:06:58.992972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" event={"ID":"97700265-3bbc-43be-b0de-a5e05ecf1fd4","Type":"ContainerDied","Data":"b908271e6c33798d690460914aeb3907afebd8bb8992aed9d312e8b5d796dd7c"} Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.314454 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.360822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle\") pod \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.360957 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes\") pod \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.361027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0\") pod \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.361142 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pprq4\" (UniqueName: \"kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4\") pod \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.361178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory\") pod \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\" (UID: \"97700265-3bbc-43be-b0de-a5e05ecf1fd4\") " Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.367104 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "97700265-3bbc-43be-b0de-a5e05ecf1fd4" (UID: "97700265-3bbc-43be-b0de-a5e05ecf1fd4"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.368210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4" (OuterVolumeSpecName: "kube-api-access-pprq4") pod "97700265-3bbc-43be-b0de-a5e05ecf1fd4" (UID: "97700265-3bbc-43be-b0de-a5e05ecf1fd4"). InnerVolumeSpecName "kube-api-access-pprq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.399351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory" (OuterVolumeSpecName: "inventory") pod "97700265-3bbc-43be-b0de-a5e05ecf1fd4" (UID: "97700265-3bbc-43be-b0de-a5e05ecf1fd4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.401819 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "97700265-3bbc-43be-b0de-a5e05ecf1fd4" (UID: "97700265-3bbc-43be-b0de-a5e05ecf1fd4"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.402255 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "97700265-3bbc-43be-b0de-a5e05ecf1fd4" (UID: "97700265-3bbc-43be-b0de-a5e05ecf1fd4"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.462211 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pprq4\" (UniqueName: \"kubernetes.io/projected/97700265-3bbc-43be-b0de-a5e05ecf1fd4-kube-api-access-pprq4\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.462253 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.462267 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.462280 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:00 crc kubenswrapper[5030]: I0121 00:07:00.462293 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/97700265-3bbc-43be-b0de-a5e05ecf1fd4-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.023889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" event={"ID":"97700265-3bbc-43be-b0de-a5e05ecf1fd4","Type":"ContainerDied","Data":"008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4"} Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.023982 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008db6ee46f7b17227d1047691a8cfdb46231cb6d3de14e8eccaa490a6152ac4" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.024411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.103810 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk"] Jan 21 00:07:01 crc kubenswrapper[5030]: E0121 00:07:01.104252 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97700265-3bbc-43be-b0de-a5e05ecf1fd4" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.104275 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97700265-3bbc-43be-b0de-a5e05ecf1fd4" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.104409 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97700265-3bbc-43be-b0de-a5e05ecf1fd4" containerName="neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.104913 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.108186 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.108186 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.113711 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.114002 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.115459 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.115855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.138049 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk"] Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.273979 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.274081 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5jdf\" (UniqueName: \"kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " 
pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.274111 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.274155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.274207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.375389 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5jdf\" (UniqueName: \"kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.375456 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.375510 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.375538 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " 
pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.375637 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.381824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.382296 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.383362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.384062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.403910 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5jdf\" (UniqueName: \"kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf\") pod \"neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.436866 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:01 crc kubenswrapper[5030]: I0121 00:07:01.874782 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk"] Jan 21 00:07:02 crc kubenswrapper[5030]: I0121 00:07:02.033473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" event={"ID":"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c","Type":"ContainerStarted","Data":"174cc019e53f2232e2bf69a5411fdd9d61a2362903f1873fafe76b6990afad7c"} Jan 21 00:07:03 crc kubenswrapper[5030]: I0121 00:07:03.042856 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" event={"ID":"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c","Type":"ContainerStarted","Data":"e43f71e8ef63bd9bd54911f9972fed66e390abceaf1aa9bd5d895db3b9c311e2"} Jan 21 00:07:03 crc kubenswrapper[5030]: I0121 00:07:03.071297 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" podStartSLOduration=1.556491636 podStartE2EDuration="2.071280219s" podCreationTimestamp="2026-01-21 00:07:01 +0000 UTC" firstStartedPulling="2026-01-21 00:07:01.883259332 +0000 UTC m=+5494.203519660" lastFinishedPulling="2026-01-21 00:07:02.398047935 +0000 UTC m=+5494.718308243" observedRunningTime="2026-01-21 00:07:03.066473751 +0000 UTC m=+5495.386734059" watchObservedRunningTime="2026-01-21 00:07:03.071280219 +0000 UTC m=+5495.391540507" Jan 21 00:07:04 crc kubenswrapper[5030]: I0121 00:07:04.053189 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" containerID="e43f71e8ef63bd9bd54911f9972fed66e390abceaf1aa9bd5d895db3b9c311e2" exitCode=0 Jan 21 00:07:04 crc kubenswrapper[5030]: I0121 00:07:04.053476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" event={"ID":"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c","Type":"ContainerDied","Data":"e43f71e8ef63bd9bd54911f9972fed66e390abceaf1aa9bd5d895db3b9c311e2"} Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.378605 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.439395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5jdf\" (UniqueName: \"kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf\") pod \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.439478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes\") pod \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.439523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0\") pod \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.439549 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle\") pod \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.439592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory\") pod \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\" (UID: \"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c\") " Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.447231 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf" (OuterVolumeSpecName: "kube-api-access-f5jdf") pod "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" (UID: "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c"). InnerVolumeSpecName "kube-api-access-f5jdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.447944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" (UID: "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.466956 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" (UID: "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.476086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" (UID: "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.483686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory" (OuterVolumeSpecName: "inventory") pod "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" (UID: "9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.542460 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5jdf\" (UniqueName: \"kubernetes.io/projected/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-kube-api-access-f5jdf\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.542799 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.542978 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.543179 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:05 crc kubenswrapper[5030]: I0121 00:07:05.543310 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.072762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" event={"ID":"9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c","Type":"ContainerDied","Data":"174cc019e53f2232e2bf69a5411fdd9d61a2362903f1873fafe76b6990afad7c"} Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.072827 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="174cc019e53f2232e2bf69a5411fdd9d61a2362903f1873fafe76b6990afad7c" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.073279 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.142869 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26"] Jan 21 00:07:06 crc kubenswrapper[5030]: E0121 00:07:06.143241 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.143256 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.143427 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" containerName="neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.144028 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.150328 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.150570 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.150585 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.151082 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.151334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.151549 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.153120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.153196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.153330 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9424\" (UniqueName: \"kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.153382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.153662 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.159540 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26"] Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.255396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.255515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.255551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.255597 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " 
pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.255663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9424\" (UniqueName: \"kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.259953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.259975 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.260432 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.260807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.274638 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9424\" (UniqueName: \"kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424\") pod \"neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.466345 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:06 crc kubenswrapper[5030]: I0121 00:07:06.879093 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26"] Jan 21 00:07:06 crc kubenswrapper[5030]: W0121 00:07:06.884003 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0255ce78_12a6_4072_b45e_5eafad612f0b.slice/crio-3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82 WatchSource:0}: Error finding container 3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82: Status 404 returned error can't find the container with id 3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82 Jan 21 00:07:07 crc kubenswrapper[5030]: I0121 00:07:07.081042 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" event={"ID":"0255ce78-12a6-4072-b45e-5eafad612f0b","Type":"ContainerStarted","Data":"3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82"} Jan 21 00:07:08 crc kubenswrapper[5030]: I0121 00:07:08.090974 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" event={"ID":"0255ce78-12a6-4072-b45e-5eafad612f0b","Type":"ContainerStarted","Data":"9ad440a17a700ea0b70f5d65e67661366e51e90b7fb29ee2a9e1a2f80faf07b5"} Jan 21 00:07:08 crc kubenswrapper[5030]: I0121 00:07:08.112546 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" podStartSLOduration=1.595852146 podStartE2EDuration="2.112530064s" podCreationTimestamp="2026-01-21 00:07:06 +0000 UTC" firstStartedPulling="2026-01-21 00:07:06.88621188 +0000 UTC m=+5499.206472168" lastFinishedPulling="2026-01-21 00:07:07.402889798 +0000 UTC m=+5499.723150086" observedRunningTime="2026-01-21 00:07:08.110191508 +0000 UTC m=+5500.430451796" watchObservedRunningTime="2026-01-21 00:07:08.112530064 +0000 UTC m=+5500.432790352" Jan 21 00:07:09 crc kubenswrapper[5030]: I0121 00:07:09.107355 5030 generic.go:334] "Generic (PLEG): container finished" podID="0255ce78-12a6-4072-b45e-5eafad612f0b" containerID="9ad440a17a700ea0b70f5d65e67661366e51e90b7fb29ee2a9e1a2f80faf07b5" exitCode=0 Jan 21 00:07:09 crc kubenswrapper[5030]: I0121 00:07:09.107431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" event={"ID":"0255ce78-12a6-4072-b45e-5eafad612f0b","Type":"ContainerDied","Data":"9ad440a17a700ea0b70f5d65e67661366e51e90b7fb29ee2a9e1a2f80faf07b5"} Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.157145 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.157206 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.437852 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.615959 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9424\" (UniqueName: \"kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424\") pod \"0255ce78-12a6-4072-b45e-5eafad612f0b\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.616058 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0\") pod \"0255ce78-12a6-4072-b45e-5eafad612f0b\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.616133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle\") pod \"0255ce78-12a6-4072-b45e-5eafad612f0b\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.616181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory\") pod \"0255ce78-12a6-4072-b45e-5eafad612f0b\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.616263 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes\") pod \"0255ce78-12a6-4072-b45e-5eafad612f0b\" (UID: \"0255ce78-12a6-4072-b45e-5eafad612f0b\") " Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.622566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "0255ce78-12a6-4072-b45e-5eafad612f0b" (UID: "0255ce78-12a6-4072-b45e-5eafad612f0b"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.627873 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424" (OuterVolumeSpecName: "kube-api-access-l9424") pod "0255ce78-12a6-4072-b45e-5eafad612f0b" (UID: "0255ce78-12a6-4072-b45e-5eafad612f0b"). InnerVolumeSpecName "kube-api-access-l9424". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.636172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "0255ce78-12a6-4072-b45e-5eafad612f0b" (UID: "0255ce78-12a6-4072-b45e-5eafad612f0b"). 
InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.645996 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "0255ce78-12a6-4072-b45e-5eafad612f0b" (UID: "0255ce78-12a6-4072-b45e-5eafad612f0b"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.649747 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory" (OuterVolumeSpecName: "inventory") pod "0255ce78-12a6-4072-b45e-5eafad612f0b" (UID: "0255ce78-12a6-4072-b45e-5eafad612f0b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.718100 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9424\" (UniqueName: \"kubernetes.io/projected/0255ce78-12a6-4072-b45e-5eafad612f0b-kube-api-access-l9424\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.718143 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.718161 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.718175 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:10 crc kubenswrapper[5030]: I0121 00:07:10.718189 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0255ce78-12a6-4072-b45e-5eafad612f0b-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.064732 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq"] Jan 21 00:07:11 crc kubenswrapper[5030]: E0121 00:07:11.065037 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0255ce78-12a6-4072-b45e-5eafad612f0b" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.065050 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0255ce78-12a6-4072-b45e-5eafad612f0b" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.065229 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0255ce78-12a6-4072-b45e-5eafad612f0b" containerName="neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.065743 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.069305 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.081396 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq"] Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.125081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" event={"ID":"0255ce78-12a6-4072-b45e-5eafad612f0b","Type":"ContainerDied","Data":"3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82"} Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.125130 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b1dcd70542af398e881ecd3dec86f54b8b8fb29f5c9bca4a42dfc7b3827ba82" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.125158 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.224479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.224540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.224591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7ql\" (UniqueName: \"kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.224719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.224744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0\") pod 
\"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.325744 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.325800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7ql\" (UniqueName: \"kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.325874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.325906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.325963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.331769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.332706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.333580 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.333810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.354482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7ql\" (UniqueName: \"kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql\") pod \"libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.393710 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:11 crc kubenswrapper[5030]: W0121 00:07:11.850221 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2add332c_08b7_4a36_836a_bd74ec1404ce.slice/crio-ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5 WatchSource:0}: Error finding container ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5: Status 404 returned error can't find the container with id ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5 Jan 21 00:07:11 crc kubenswrapper[5030]: I0121 00:07:11.850999 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq"] Jan 21 00:07:12 crc kubenswrapper[5030]: I0121 00:07:12.134909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" event={"ID":"2add332c-08b7-4a36-836a-bd74ec1404ce","Type":"ContainerStarted","Data":"ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5"} Jan 21 00:07:13 crc kubenswrapper[5030]: I0121 00:07:13.144727 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" event={"ID":"2add332c-08b7-4a36-836a-bd74ec1404ce","Type":"ContainerStarted","Data":"54725a2010d44952cce43671c2f4cbbfa72fc077f14151c5f54d9dfd04047849"} Jan 21 00:07:14 crc kubenswrapper[5030]: I0121 00:07:14.189343 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" podStartSLOduration=2.537223827 podStartE2EDuration="3.189314348s" podCreationTimestamp="2026-01-21 00:07:11 +0000 UTC" firstStartedPulling="2026-01-21 00:07:11.853332018 +0000 UTC m=+5504.173592306" lastFinishedPulling="2026-01-21 00:07:12.505422539 +0000 UTC m=+5504.825682827" observedRunningTime="2026-01-21 00:07:14.184678226 +0000 UTC 
m=+5506.504938584" watchObservedRunningTime="2026-01-21 00:07:14.189314348 +0000 UTC m=+5506.509574666" Jan 21 00:07:15 crc kubenswrapper[5030]: I0121 00:07:15.170139 5030 generic.go:334] "Generic (PLEG): container finished" podID="2add332c-08b7-4a36-836a-bd74ec1404ce" containerID="54725a2010d44952cce43671c2f4cbbfa72fc077f14151c5f54d9dfd04047849" exitCode=0 Jan 21 00:07:15 crc kubenswrapper[5030]: I0121 00:07:15.170210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" event={"ID":"2add332c-08b7-4a36-836a-bd74ec1404ce","Type":"ContainerDied","Data":"54725a2010d44952cce43671c2f4cbbfa72fc077f14151c5f54d9dfd04047849"} Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.518241 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.708456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes\") pod \"2add332c-08b7-4a36-836a-bd74ec1404ce\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.708516 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle\") pod \"2add332c-08b7-4a36-836a-bd74ec1404ce\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.708564 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0\") pod \"2add332c-08b7-4a36-836a-bd74ec1404ce\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.708636 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory\") pod \"2add332c-08b7-4a36-836a-bd74ec1404ce\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.708737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr7ql\" (UniqueName: \"kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql\") pod \"2add332c-08b7-4a36-836a-bd74ec1404ce\" (UID: \"2add332c-08b7-4a36-836a-bd74ec1404ce\") " Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.713739 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql" (OuterVolumeSpecName: "kube-api-access-hr7ql") pod "2add332c-08b7-4a36-836a-bd74ec1404ce" (UID: "2add332c-08b7-4a36-836a-bd74ec1404ce"). InnerVolumeSpecName "kube-api-access-hr7ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.713801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2add332c-08b7-4a36-836a-bd74ec1404ce" (UID: "2add332c-08b7-4a36-836a-bd74ec1404ce"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.730091 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "2add332c-08b7-4a36-836a-bd74ec1404ce" (UID: "2add332c-08b7-4a36-836a-bd74ec1404ce"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.730778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2add332c-08b7-4a36-836a-bd74ec1404ce" (UID: "2add332c-08b7-4a36-836a-bd74ec1404ce"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.734113 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory" (OuterVolumeSpecName: "inventory") pod "2add332c-08b7-4a36-836a-bd74ec1404ce" (UID: "2add332c-08b7-4a36-836a-bd74ec1404ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.809787 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.809820 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.809834 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr7ql\" (UniqueName: \"kubernetes.io/projected/2add332c-08b7-4a36-836a-bd74ec1404ce-kube-api-access-hr7ql\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.809846 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:16 crc kubenswrapper[5030]: I0121 00:07:16.809855 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2add332c-08b7-4a36-836a-bd74ec1404ce-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.199791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" event={"ID":"2add332c-08b7-4a36-836a-bd74ec1404ce","Type":"ContainerDied","Data":"ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5"} Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.199901 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3506f789c15e558a80707f5ff53583ef6e1472ed96f32e0322a53e50669fd5" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.199919 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.289925 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl"] Jan 21 00:07:17 crc kubenswrapper[5030]: E0121 00:07:17.290477 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2add332c-08b7-4a36-836a-bd74ec1404ce" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.290585 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2add332c-08b7-4a36-836a-bd74ec1404ce" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.290909 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2add332c-08b7-4a36-836a-bd74ec1404ce" containerName="libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.291570 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.296188 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.297846 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.298145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.298379 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.298462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.298524 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.298563 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.300596 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl"] Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422175 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422261 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtmkk\" (UniqueName: \"kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422464 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " 
pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422656 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422696 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.422759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524057 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524166 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524260 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.524278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtmkk\" (UniqueName: \"kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.529100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.529765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.531870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " 
pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.534006 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.536065 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.537611 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.541731 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.549262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtmkk\" (UniqueName: \"kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk\") pod \"nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:17 crc kubenswrapper[5030]: I0121 00:07:17.627201 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:18 crc kubenswrapper[5030]: I0121 00:07:18.079253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl"] Jan 21 00:07:18 crc kubenswrapper[5030]: I0121 00:07:18.211436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" event={"ID":"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2","Type":"ContainerStarted","Data":"9748ca90a1f2a1d62769897791ffed80e01daa84bdd76387cbd933f38ebf3f91"} Jan 21 00:07:20 crc kubenswrapper[5030]: I0121 00:07:20.232299 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" event={"ID":"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2","Type":"ContainerStarted","Data":"62bbe7d608f220881fa325385b732cea042f38f208f2d3bc3e495bf44b1671d8"} Jan 21 00:07:20 crc kubenswrapper[5030]: I0121 00:07:20.257390 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" podStartSLOduration=1.3986850610000001 podStartE2EDuration="3.25737456s" podCreationTimestamp="2026-01-21 00:07:17 +0000 UTC" firstStartedPulling="2026-01-21 00:07:18.094078126 +0000 UTC m=+5510.414338414" lastFinishedPulling="2026-01-21 00:07:19.952767615 +0000 UTC m=+5512.273027913" observedRunningTime="2026-01-21 00:07:20.249811237 +0000 UTC m=+5512.570071535" watchObservedRunningTime="2026-01-21 00:07:20.25737456 +0000 UTC m=+5512.577634848" Jan 21 00:07:22 crc kubenswrapper[5030]: I0121 00:07:22.257021 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" containerID="62bbe7d608f220881fa325385b732cea042f38f208f2d3bc3e495bf44b1671d8" exitCode=0 Jan 21 00:07:22 crc kubenswrapper[5030]: I0121 00:07:22.257067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" event={"ID":"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2","Type":"ContainerDied","Data":"62bbe7d608f220881fa325385b732cea042f38f208f2d3bc3e495bf44b1671d8"} Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.618365 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732320 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732391 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732450 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732650 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtmkk\" (UniqueName: \"kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.732831 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1\") pod \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\" (UID: \"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2\") " Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.747063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.747295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk" (OuterVolumeSpecName: "kube-api-access-mtmkk") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "kube-api-access-mtmkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.760562 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.762733 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory" (OuterVolumeSpecName: "inventory") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.762892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.768615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.769223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.771143 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" (UID: "b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtmkk\" (UniqueName: \"kubernetes.io/projected/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-kube-api-access-mtmkk\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834656 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834676 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834695 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834715 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834734 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834752 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:23 crc kubenswrapper[5030]: I0121 00:07:23.834805 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.314416 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.314419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl" event={"ID":"b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2","Type":"ContainerDied","Data":"9748ca90a1f2a1d62769897791ffed80e01daa84bdd76387cbd933f38ebf3f91"} Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.314866 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9748ca90a1f2a1d62769897791ffed80e01daa84bdd76387cbd933f38ebf3f91" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.832991 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4"] Jan 21 00:07:24 crc kubenswrapper[5030]: E0121 00:07:24.833502 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.833525 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.833894 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" containerName="nova-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.834688 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.839332 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.839610 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.840401 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.841276 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.841784 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.852301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.852396 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.852459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.852976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcxmf\" (UniqueName: \"kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.863528 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4"] Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.955045 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcxmf\" (UniqueName: \"kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.955565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.955727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.955961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.962374 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.962778 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.971679 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:24 crc kubenswrapper[5030]: I0121 00:07:24.981680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcxmf\" (UniqueName: \"kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf\") pod \"custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.169531 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.443897 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4"] Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.530366 5030 scope.go:117] "RemoveContainer" containerID="d9287011d2be6abb8f8faee1816bd14ba090483f5a6463eca0da3e2d1ec24891" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.576241 5030 scope.go:117] "RemoveContainer" containerID="5c9f1182548ba43265d81f8447c1f62b1f165ac558b255b1174dccc95471ee9a" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.618638 5030 scope.go:117] "RemoveContainer" containerID="6566c2bfbf4b4c608ef886d9cb22730c82f84d067ca9338ce48f7efd5ccd84e8" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.645023 5030 scope.go:117] "RemoveContainer" containerID="9e7dd4a2fb1ccb6483359d29e120ffa0bdb87303b53fd708a9d0f9b2b739b5db" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.713046 5030 scope.go:117] "RemoveContainer" containerID="bfdb85a1a852ba6ed22c70fb37d83478902cda9decc9e21cbe1f0ab2acd7e84b" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.757805 5030 scope.go:117] "RemoveContainer" containerID="3230d8b8fb830fbba2e2a906bcdb7f0fbd36e2d48ed70c1a2b75c947e7349a06" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.800344 5030 scope.go:117] "RemoveContainer" containerID="ba48e6fed9d77051010ef4b50d4a71367424b43c03e32266a234e2a2bc45ffdf" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.840155 5030 scope.go:117] "RemoveContainer" containerID="a6f5cb9b8e23c29e698185e82117ee05bd9a54776cb215075639461cb84c9eb5" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.905835 5030 scope.go:117] "RemoveContainer" containerID="09a5e577660ef1eb54d2f816874bbc7c00dfc4021b3c3fbcaac1ed3eba815132" Jan 21 00:07:25 crc kubenswrapper[5030]: I0121 00:07:25.954402 5030 scope.go:117] "RemoveContainer" containerID="bf5667e55f136f2b8a9faf5d82492ca64d6650d1a230ab45823cdc77c51f34cd" Jan 21 00:07:26 crc kubenswrapper[5030]: I0121 00:07:26.350805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" event={"ID":"51d56d7c-1323-4c76-90ad-41273710d00a","Type":"ContainerStarted","Data":"69b237462deac7558d1a7e91bc64872aee22857b9902c1fe1585c151ee0eb8ca"} Jan 21 00:07:27 crc kubenswrapper[5030]: I0121 00:07:27.368373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" event={"ID":"51d56d7c-1323-4c76-90ad-41273710d00a","Type":"ContainerStarted","Data":"1dee82a08442e7115d8fc7132222f50cc9c262689696e05bbe12723f05e13928"} Jan 21 00:07:27 crc kubenswrapper[5030]: I0121 00:07:27.400485 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" podStartSLOduration=2.331373327 podStartE2EDuration="3.400451449s" podCreationTimestamp="2026-01-21 00:07:24 +0000 UTC" firstStartedPulling="2026-01-21 00:07:25.450510019 +0000 UTC m=+5517.770770357" lastFinishedPulling="2026-01-21 00:07:26.519588151 +0000 UTC m=+5518.839848479" observedRunningTime="2026-01-21 00:07:27.390315674 +0000 UTC m=+5519.710576022" watchObservedRunningTime="2026-01-21 00:07:27.400451449 +0000 UTC m=+5519.720711777" Jan 21 00:07:30 crc 
kubenswrapper[5030]: I0121 00:07:30.399565 5030 generic.go:334] "Generic (PLEG): container finished" podID="51d56d7c-1323-4c76-90ad-41273710d00a" containerID="1dee82a08442e7115d8fc7132222f50cc9c262689696e05bbe12723f05e13928" exitCode=0 Jan 21 00:07:30 crc kubenswrapper[5030]: I0121 00:07:30.399719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" event={"ID":"51d56d7c-1323-4c76-90ad-41273710d00a","Type":"ContainerDied","Data":"1dee82a08442e7115d8fc7132222f50cc9c262689696e05bbe12723f05e13928"} Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.826183 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.884606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory\") pod \"51d56d7c-1323-4c76-90ad-41273710d00a\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.884701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcxmf\" (UniqueName: \"kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf\") pod \"51d56d7c-1323-4c76-90ad-41273710d00a\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.884940 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle\") pod \"51d56d7c-1323-4c76-90ad-41273710d00a\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.885122 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes\") pod \"51d56d7c-1323-4c76-90ad-41273710d00a\" (UID: \"51d56d7c-1323-4c76-90ad-41273710d00a\") " Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.893028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle" (OuterVolumeSpecName: "custom-svc-combined-ca-bundle") pod "51d56d7c-1323-4c76-90ad-41273710d00a" (UID: "51d56d7c-1323-4c76-90ad-41273710d00a"). InnerVolumeSpecName "custom-svc-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.895929 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf" (OuterVolumeSpecName: "kube-api-access-fcxmf") pod "51d56d7c-1323-4c76-90ad-41273710d00a" (UID: "51d56d7c-1323-4c76-90ad-41273710d00a"). InnerVolumeSpecName "kube-api-access-fcxmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.929127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "51d56d7c-1323-4c76-90ad-41273710d00a" (UID: "51d56d7c-1323-4c76-90ad-41273710d00a"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.930351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory" (OuterVolumeSpecName: "inventory") pod "51d56d7c-1323-4c76-90ad-41273710d00a" (UID: "51d56d7c-1323-4c76-90ad-41273710d00a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.987979 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.988027 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.988046 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcxmf\" (UniqueName: \"kubernetes.io/projected/51d56d7c-1323-4c76-90ad-41273710d00a-kube-api-access-fcxmf\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:31 crc kubenswrapper[5030]: I0121 00:07:31.988064 5030 reconciler_common.go:293] "Volume detached for volume \"custom-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d56d7c-1323-4c76-90ad-41273710d00a-custom-svc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:07:32 crc kubenswrapper[5030]: I0121 00:07:32.427024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" event={"ID":"51d56d7c-1323-4c76-90ad-41273710d00a","Type":"ContainerDied","Data":"69b237462deac7558d1a7e91bc64872aee22857b9902c1fe1585c151ee0eb8ca"} Jan 21 00:07:32 crc kubenswrapper[5030]: I0121 00:07:32.427495 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b237462deac7558d1a7e91bc64872aee22857b9902c1fe1585c151ee0eb8ca" Jan 21 00:07:32 crc kubenswrapper[5030]: I0121 00:07:32.427141 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4" Jan 21 00:07:40 crc kubenswrapper[5030]: I0121 00:07:40.158478 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:07:40 crc kubenswrapper[5030]: I0121 00:07:40.159191 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.887421 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm"] Jan 21 00:07:54 crc kubenswrapper[5030]: E0121 00:07:54.888364 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d56d7c-1323-4c76-90ad-41273710d00a" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.888383 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d56d7c-1323-4c76-90ad-41273710d00a" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.888586 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d56d7c-1323-4c76-90ad-41273710d00a" containerName="custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodes" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.889173 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.891791 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.891845 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.891975 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.894308 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.894499 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.894558 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:07:54 crc kubenswrapper[5030]: I0121 00:07:54.904493 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm"] Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.058450 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.058518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgt2d\" (UniqueName: \"kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.058561 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.058607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.059848 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.161212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.161294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.161363 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.161429 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.161469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgt2d\" (UniqueName: \"kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.162252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.168524 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.168914 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.169717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.195400 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgt2d\" (UniqueName: \"kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d\") pod \"ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.215420 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.665577 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm"] Jan 21 00:07:55 crc kubenswrapper[5030]: I0121 00:07:55.710521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" event={"ID":"c378999a-6e05-40e9-9386-b0e79d90d988","Type":"ContainerStarted","Data":"f500ea0a6d65c09ccc39febd6546e745cc17df1e5bd82a0bb8dbfb743696dddb"} Jan 21 00:07:57 crc kubenswrapper[5030]: I0121 00:07:57.729899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" event={"ID":"c378999a-6e05-40e9-9386-b0e79d90d988","Type":"ContainerStarted","Data":"2f2e657df57e980501620d8a2ad742358ddd970a47135f5766de4a461ddbe991"} Jan 21 00:07:57 crc kubenswrapper[5030]: I0121 00:07:57.764060 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" podStartSLOduration=2.7871976370000002 podStartE2EDuration="3.764035961s" podCreationTimestamp="2026-01-21 00:07:54 +0000 UTC" firstStartedPulling="2026-01-21 00:07:55.676885945 +0000 UTC m=+5547.997146253" lastFinishedPulling="2026-01-21 00:07:56.653724289 +0000 UTC m=+5548.973984577" observedRunningTime="2026-01-21 00:07:57.753932106 +0000 UTC m=+5550.074192464" watchObservedRunningTime="2026-01-21 00:07:57.764035961 +0000 UTC m=+5550.084296259" Jan 21 00:07:58 crc kubenswrapper[5030]: I0121 00:07:58.741648 5030 generic.go:334] "Generic (PLEG): container finished" podID="c378999a-6e05-40e9-9386-b0e79d90d988" containerID="2f2e657df57e980501620d8a2ad742358ddd970a47135f5766de4a461ddbe991" exitCode=0 Jan 21 00:07:58 crc kubenswrapper[5030]: I0121 00:07:58.741734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" event={"ID":"c378999a-6e05-40e9-9386-b0e79d90d988","Type":"ContainerDied","Data":"2f2e657df57e980501620d8a2ad742358ddd970a47135f5766de4a461ddbe991"} Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.062735 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.243544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory\") pod \"c378999a-6e05-40e9-9386-b0e79d90d988\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.243712 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes\") pod \"c378999a-6e05-40e9-9386-b0e79d90d988\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.243770 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle\") pod \"c378999a-6e05-40e9-9386-b0e79d90d988\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.243898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0\") pod \"c378999a-6e05-40e9-9386-b0e79d90d988\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.243950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgt2d\" (UniqueName: \"kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d\") pod \"c378999a-6e05-40e9-9386-b0e79d90d988\" (UID: \"c378999a-6e05-40e9-9386-b0e79d90d988\") " Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.252188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c378999a-6e05-40e9-9386-b0e79d90d988" (UID: "c378999a-6e05-40e9-9386-b0e79d90d988"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.252241 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d" (OuterVolumeSpecName: "kube-api-access-dgt2d") pod "c378999a-6e05-40e9-9386-b0e79d90d988" (UID: "c378999a-6e05-40e9-9386-b0e79d90d988"). InnerVolumeSpecName "kube-api-access-dgt2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.275990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c378999a-6e05-40e9-9386-b0e79d90d988" (UID: "c378999a-6e05-40e9-9386-b0e79d90d988"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.283242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c378999a-6e05-40e9-9386-b0e79d90d988" (UID: "c378999a-6e05-40e9-9386-b0e79d90d988"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.286786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory" (OuterVolumeSpecName: "inventory") pod "c378999a-6e05-40e9-9386-b0e79d90d988" (UID: "c378999a-6e05-40e9-9386-b0e79d90d988"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.345134 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.345476 5030 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c378999a-6e05-40e9-9386-b0e79d90d988-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.345494 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgt2d\" (UniqueName: \"kubernetes.io/projected/c378999a-6e05-40e9-9386-b0e79d90d988-kube-api-access-dgt2d\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.345508 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.345522 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c378999a-6e05-40e9-9386-b0e79d90d988-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.761921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" event={"ID":"c378999a-6e05-40e9-9386-b0e79d90d988","Type":"ContainerDied","Data":"f500ea0a6d65c09ccc39febd6546e745cc17df1e5bd82a0bb8dbfb743696dddb"} Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.761976 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f500ea0a6d65c09ccc39febd6546e745cc17df1e5bd82a0bb8dbfb743696dddb" Jan 21 00:08:00 crc kubenswrapper[5030]: I0121 00:08:00.761975 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.097277 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"] Jan 21 00:08:03 crc kubenswrapper[5030]: E0121 00:08:03.097655 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c378999a-6e05-40e9-9386-b0e79d90d988" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.097672 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c378999a-6e05-40e9-9386-b0e79d90d988" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.097848 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c378999a-6e05-40e9-9386-b0e79d90d988" containerName="ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nodes" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.098715 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.109863 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.117314 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"] Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.181898 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfcb\" (UniqueName: \"kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.181964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.182066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.182227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.182259 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: 
\"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.283377 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.283436 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.283483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfcb\" (UniqueName: \"kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.283545 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.283579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.284451 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.284478 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.284495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:08:03 
crc kubenswrapper[5030]: I0121 00:08:03.284811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"
Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.300114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfcb\" (UniqueName: \"kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb\") pod \"dnsmasq-dnsmasq-67886899f9-5psd5\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"
Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.415344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"
Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.707566 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"]
Jan 21 00:08:03 crc kubenswrapper[5030]: I0121 00:08:03.788214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" event={"ID":"5a1b02bd-9738-4790-baff-925f03c0a71f","Type":"ContainerStarted","Data":"01bef595be97b30751deff224ad18388a639e170a1fb16a66688b36711455a6f"}
Jan 21 00:08:04 crc kubenswrapper[5030]: I0121 00:08:04.802703 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerID="7f9326088d5c814000efb93bf7558908e6312eb5f0a00c5107378adf3023b911" exitCode=0
Jan 21 00:08:04 crc kubenswrapper[5030]: I0121 00:08:04.802770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" event={"ID":"5a1b02bd-9738-4790-baff-925f03c0a71f","Type":"ContainerDied","Data":"7f9326088d5c814000efb93bf7558908e6312eb5f0a00c5107378adf3023b911"}
Jan 21 00:08:05 crc kubenswrapper[5030]: I0121 00:08:05.816322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" event={"ID":"5a1b02bd-9738-4790-baff-925f03c0a71f","Type":"ContainerStarted","Data":"3b3cc079576b6c41c4dc4d8e2cb418e086300f92064e461c1d7aec46247d0edc"}
Jan 21 00:08:05 crc kubenswrapper[5030]: I0121 00:08:05.817204 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"
Jan 21 00:08:05 crc kubenswrapper[5030]: I0121 00:08:05.839863 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" podStartSLOduration=2.839844126 podStartE2EDuration="2.839844126s" podCreationTimestamp="2026-01-21 00:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:08:05.838124134 +0000 UTC m=+5558.158384422" watchObservedRunningTime="2026-01-21 00:08:05.839844126 +0000 UTC m=+5558.160104414"
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.157348 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.157810 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.157872 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t"
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.158768 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.158859 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b" gracePeriod=600
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.863382 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b" exitCode=0
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.863462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b"}
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.863761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba"}
Jan 21 00:08:10 crc kubenswrapper[5030]: I0121 00:08:10.863789 5030 scope.go:117] "RemoveContainer" containerID="9cd316f9e713923cefccc02f466cd75207b4e9f1f1196b9940ce8740c1815cfc"
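[Annotation] The liveness failure above is a plain connection refusal: the kubelet's prober issues an HTTP GET to http://127.0.0.1:8798/health, nothing is listening, so the container is marked unhealthy, killed with the 600 s grace period from the log line, and the next PLEG relist reports the ContainerDied/ContainerStarted pair. As a rough sketch of what such a probe expects on the other end (hypothetical; only the loopback address, port and /health path come from the log, nothing here is machine-config-daemon's actual handler):

// Hypothetical sketch of a /health listener of the kind the probe above polls.
// Address, port and path are taken from the log line; the handler is assumed.
package main

import (
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		// Kubelet HTTP probes treat any status from 200 to 399 as success;
		// "connection refused", as seen in the log, means no listener at all.
		w.WriteHeader(http.StatusOK)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", mux))
}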
Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.416918 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"
Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.507558 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"]
Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.509181 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="dnsmasq-dns" containerID="cri-o://8c661467137337265d146274a69301d98c05212632a21587c294aedf83e72985" gracePeriod=10
Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.898698 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerID="8c661467137337265d146274a69301d98c05212632a21587c294aedf83e72985" exitCode=0 Jan 21 00:08:13
crc kubenswrapper[5030]: I0121 00:08:13.898794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" event={"ID":"4c98d80a-3889-499f-b822-dabbeb25f4cf","Type":"ContainerDied","Data":"8c661467137337265d146274a69301d98c05212632a21587c294aedf83e72985"} Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.899343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" event={"ID":"4c98d80a-3889-499f-b822-dabbeb25f4cf","Type":"ContainerDied","Data":"a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0"} Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.899368 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a66068da43c9b30a16bac93262e47c35d28228f871c77de276f4efe524e222d0" Jan 21 00:08:13 crc kubenswrapper[5030]: I0121 00:08:13.936160 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.045421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes\") pod \"4c98d80a-3889-499f-b822-dabbeb25f4cf\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.046445 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config\") pod \"4c98d80a-3889-499f-b822-dabbeb25f4cf\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.046583 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc\") pod \"4c98d80a-3889-499f-b822-dabbeb25f4cf\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.046680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sltqr\" (UniqueName: \"kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr\") pod \"4c98d80a-3889-499f-b822-dabbeb25f4cf\" (UID: \"4c98d80a-3889-499f-b822-dabbeb25f4cf\") " Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.053107 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr" (OuterVolumeSpecName: "kube-api-access-sltqr") pod "4c98d80a-3889-499f-b822-dabbeb25f4cf" (UID: "4c98d80a-3889-499f-b822-dabbeb25f4cf"). InnerVolumeSpecName "kube-api-access-sltqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.083610 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "4c98d80a-3889-499f-b822-dabbeb25f4cf" (UID: "4c98d80a-3889-499f-b822-dabbeb25f4cf"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.087163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config" (OuterVolumeSpecName: "config") pod "4c98d80a-3889-499f-b822-dabbeb25f4cf" (UID: "4c98d80a-3889-499f-b822-dabbeb25f4cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.093051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "4c98d80a-3889-499f-b822-dabbeb25f4cf" (UID: "4c98d80a-3889-499f-b822-dabbeb25f4cf"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.149018 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.149073 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.149104 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sltqr\" (UniqueName: \"kubernetes.io/projected/4c98d80a-3889-499f-b822-dabbeb25f4cf-kube-api-access-sltqr\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.149124 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/4c98d80a-3889-499f-b822-dabbeb25f4cf-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.908890 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj" Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.948550 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"] Jan 21 00:08:14 crc kubenswrapper[5030]: I0121 00:08:14.956981 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-t4prj"] Jan 21 00:08:15 crc kubenswrapper[5030]: I0121 00:08:15.979738 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" path="/var/lib/kubelet/pods/4c98d80a-3889-499f-b822-dabbeb25f4cf/volumes" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.076086 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr"] Jan 21 00:08:18 crc kubenswrapper[5030]: E0121 00:08:18.076576 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="dnsmasq-dns" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.076588 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="dnsmasq-dns" Jan 21 00:08:18 crc kubenswrapper[5030]: E0121 00:08:18.076607 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="init" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.076613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="init" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.076764 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c98d80a-3889-499f-b822-dabbeb25f4cf" containerName="dnsmasq-dns" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.077204 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.082145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.082456 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.082801 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.082933 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.092983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"] Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.094241 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.097788 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-vt4p7" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.098479 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.104565 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"] Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.138801 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr"] Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.220387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwf4c\" (UniqueName: \"kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.220472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.220500 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88wq\" (UniqueName: \"kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.220789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.220861 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.221115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.322990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.323098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwf4c\" (UniqueName: \"kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.323144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.323183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p88wq\" (UniqueName: \"kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.323252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.323299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.331205 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes\") pod 
\"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.332066 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.338237 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.338391 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.341279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88wq\" (UniqueName: \"kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq\") pod \"download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.351340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwf4c\" (UniqueName: \"kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c\") pod \"download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.414369 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.428157 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"
Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.924345 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr"]
Jan 21 00:08:18 crc kubenswrapper[5030]: W0121 00:08:18.929414 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b30f34_e1f3_44c1_8d05_9fc1ec51e677.slice/crio-83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de WatchSource:0}: Error finding container 83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de: Status 404 returned error can't find the container with id 83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de
Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.931207 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"]
Jan 21 00:08:18 crc kubenswrapper[5030]: W0121 00:08:18.931318 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5c3d22d_f0a9_4638_a23c_0b4c56390a6e.slice/crio-8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529 WatchSource:0}: Error finding container 8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529: Status 404 returned error can't find the container with id 8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529
Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.954775 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" event={"ID":"72b30f34-e1f3-44c1-8d05-9fc1ec51e677","Type":"ContainerStarted","Data":"83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de"}
Jan 21 00:08:18 crc kubenswrapper[5030]: I0121 00:08:18.959323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" event={"ID":"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e","Type":"ContainerStarted","Data":"8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529"}
Jan 21 00:08:20 crc kubenswrapper[5030]: I0121 00:08:20.009372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" event={"ID":"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e","Type":"ContainerStarted","Data":"b30c2cf2fc3a3e832fdddeb1326de45892385ac7e44ead48ab4e5ccfec463134"}
Jan 21 00:08:20 crc kubenswrapper[5030]: I0121 00:08:20.011948 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" event={"ID":"72b30f34-e1f3-44c1-8d05-9fc1ec51e677","Type":"ContainerStarted","Data":"194c3efc28459b64e8648a553459099e72c31634fc55d2467db7d065d5c4cc67"}
Jan 21 00:08:20 crc kubenswrapper[5030]: I0121 00:08:20.035832 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" podStartSLOduration=1.29481886 podStartE2EDuration="2.035806346s" podCreationTimestamp="2026-01-21 00:08:18 +0000 UTC" firstStartedPulling="2026-01-21 00:08:18.934076052 +0000 UTC m=+5571.254336340" lastFinishedPulling="2026-01-21 00:08:19.675063538 +0000 UTC m=+5571.995323826" observedRunningTime="2026-01-21 00:08:20.009831176 +0000 UTC m=+5572.330091484" watchObservedRunningTime="2026-01-21 00:08:20.035806346 +0000 UTC m=+5572.356066644"
Jan 21 00:08:20 crc kubenswrapper[5030]: I0121 00:08:20.040531 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" podStartSLOduration=1.448869104 podStartE2EDuration="2.04050999s" podCreationTimestamp="2026-01-21 00:08:18 +0000 UTC" firstStartedPulling="2026-01-21 00:08:18.931786876 +0000 UTC m=+5571.252047164" lastFinishedPulling="2026-01-21 00:08:19.523427762 +0000 UTC m=+5571.843688050" observedRunningTime="2026-01-21 00:08:20.028950909 +0000 UTC m=+5572.349211227" watchObservedRunningTime="2026-01-21 00:08:20.04050999 +0000 UTC m=+5572.360770278"
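[Annotation] The two "Observed pod startup duration" records above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span with the image-pull window (firstStartedPulling through lastFinishedPulling) subtracted. A quick cross-check with the jrpdr pod's values, as a sketch only (timestamps copied from the log; reading the SLO figure as "end-to-end minus pull time" is an inference that happens to reproduce the logged numbers exactly):

// Cross-check of the pod_startup_latency_tracker values for
// download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr.
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse(time.RFC3339Nano, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2026-01-21T00:08:18Z")                 // podCreationTimestamp
	firstPull := parse("2026-01-21T00:08:18.934076052Z")     // firstStartedPulling
	lastPull := parse("2026-01-21T00:08:19.675063538Z")      // lastFinishedPulling
	watchObserved := parse("2026-01-21T00:08:20.035806346Z") // watchObservedRunningTime

	e2e := watchObserved.Sub(created)    // 2.035806346s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 1.29481886s, matches podStartSLOduration
	fmt.Println(e2e, slo)
}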
Jan 21 00:08:22 crc kubenswrapper[5030]: I0121 00:08:22.010565 5030 generic.go:334] "Generic (PLEG): container finished" podID="72b30f34-e1f3-44c1-8d05-9fc1ec51e677" containerID="194c3efc28459b64e8648a553459099e72c31634fc55d2467db7d065d5c4cc67" exitCode=0
Jan 21 00:08:22 crc kubenswrapper[5030]: I0121 00:08:22.010688 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" event={"ID":"72b30f34-e1f3-44c1-8d05-9fc1ec51e677","Type":"ContainerDied","Data":"194c3efc28459b64e8648a553459099e72c31634fc55d2467db7d065d5c4cc67"}
Jan 21 00:08:22 crc kubenswrapper[5030]: I0121 00:08:22.014423 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" containerID="b30c2cf2fc3a3e832fdddeb1326de45892385ac7e44ead48ab4e5ccfec463134" exitCode=0
Jan 21 00:08:22 crc kubenswrapper[5030]: I0121 00:08:22.014481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" event={"ID":"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e","Type":"ContainerDied","Data":"b30c2cf2fc3a3e832fdddeb1326de45892385ac7e44ead48ab4e5ccfec463134"}
Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.418125 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"
Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.429801 5030 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory\") pod \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwf4c\" (UniqueName: \"kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c\") pod \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset\") pod \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\" (UID: \"72b30f34-e1f3-44c1-8d05-9fc1ec51e677\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes\") pod \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p88wq\" (UniqueName: \"kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq\") pod \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.610965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory\") pod \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\" (UID: \"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e\") " Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.620879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq" (OuterVolumeSpecName: "kube-api-access-p88wq") pod "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" (UID: "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e"). InnerVolumeSpecName "kube-api-access-p88wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.622453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c" (OuterVolumeSpecName: "kube-api-access-qwf4c") pod "72b30f34-e1f3-44c1-8d05-9fc1ec51e677" (UID: "72b30f34-e1f3-44c1-8d05-9fc1ec51e677"). InnerVolumeSpecName "kube-api-access-qwf4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.632503 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" (UID: "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.635136 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory" (OuterVolumeSpecName: "inventory") pod "72b30f34-e1f3-44c1-8d05-9fc1ec51e677" (UID: "72b30f34-e1f3-44c1-8d05-9fc1ec51e677"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.640228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory" (OuterVolumeSpecName: "inventory") pod "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" (UID: "a5c3d22d-f0a9-4638-a23c-0b4c56390a6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.642166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "72b30f34-e1f3-44c1-8d05-9fc1ec51e677" (UID: "72b30f34-e1f3-44c1-8d05-9fc1ec51e677"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713084 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713125 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713146 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p88wq\" (UniqueName: \"kubernetes.io/projected/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-kube-api-access-p88wq\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713159 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713169 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:23 crc kubenswrapper[5030]: I0121 00:08:23.713181 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwf4c\" (UniqueName: \"kubernetes.io/projected/72b30f34-e1f3-44c1-8d05-9fc1ec51e677-kube-api-access-qwf4c\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.037424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" event={"ID":"a5c3d22d-f0a9-4638-a23c-0b4c56390a6e","Type":"ContainerDied","Data":"8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529"} Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.037451 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.037470 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adb14b2c41ea5fedbbca4b74fceffd5c357b885557c8c3141ff57a7e969c529" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.039201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" event={"ID":"72b30f34-e1f3-44c1-8d05-9fc1ec51e677","Type":"ContainerDied","Data":"83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de"} Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.039277 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83fd754ca88b0014f0872db9f9632712246ebf2a24ada3061f300d37816da0de" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.039312 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.126544 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl"] Jan 21 00:08:24 crc kubenswrapper[5030]: E0121 00:08:24.127018 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b30f34-e1f3-44c1-8d05-9fc1ec51e677" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.127046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b30f34-e1f3-44c1-8d05-9fc1ec51e677" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:24 crc kubenswrapper[5030]: E0121 00:08:24.127077 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.127087 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.127281 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b30f34-e1f3-44c1-8d05-9fc1ec51e677" containerName="download-cache-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.127304 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" containerName="download-cache-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.127888 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.133424 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.133525 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.133770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.134242 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.134708 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.141085 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl"] Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.323480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.323852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.324003 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.324106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwmb\" (UniqueName: \"kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.425883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" 
(UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.426002 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.426058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.426116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwmb\" (UniqueName: \"kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.432652 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.433553 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.433567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.469127 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwmb\" (UniqueName: \"kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb\") pod \"bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.577492 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8"] Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.578522 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.581936 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-beta-nodeset" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.584031 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-beta-nodeset-dockercfg-vt4p7" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.605178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8"] Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.730314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.730369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.730490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2zh6\" (UniqueName: \"kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.730513 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.748272 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.832948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2zh6\" (UniqueName: \"kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.833031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.833110 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.833155 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.842448 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.842891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.846493 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.853129 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c2zh6\" (UniqueName: \"kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6\") pod \"bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:24 crc kubenswrapper[5030]: I0121 00:08:24.896110 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:25 crc kubenswrapper[5030]: I0121 00:08:25.105587 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8"] Jan 21 00:08:25 crc kubenswrapper[5030]: W0121 00:08:25.108684 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8224f59_84ad_49de_af58_6853e11fe09e.slice/crio-74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7 WatchSource:0}: Error finding container 74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7: Status 404 returned error can't find the container with id 74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7 Jan 21 00:08:25 crc kubenswrapper[5030]: I0121 00:08:25.184872 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl"] Jan 21 00:08:25 crc kubenswrapper[5030]: W0121 00:08:25.190081 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d6caab_b199_4c48_ba30_6b3a23a2f921.slice/crio-667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b WatchSource:0}: Error finding container 667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b: Status 404 returned error can't find the container with id 667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.058185 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" event={"ID":"c8224f59-84ad-49de-af58-6853e11fe09e","Type":"ContainerStarted","Data":"06019265b6eabe0b81be6109c0bffdb530cf5cf5af10923985e45d1e1edf92e2"} Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.058536 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" event={"ID":"c8224f59-84ad-49de-af58-6853e11fe09e","Type":"ContainerStarted","Data":"74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7"} Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.060084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" event={"ID":"10d6caab-b199-4c48-ba30-6b3a23a2f921","Type":"ContainerStarted","Data":"11a82d346b30da8208a3328089fa26f7c1cff67802177ca59adcd25b382bc1b8"} Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.060133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" event={"ID":"10d6caab-b199-4c48-ba30-6b3a23a2f921","Type":"ContainerStarted","Data":"667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b"} Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 
00:08:26.077382 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" podStartSLOduration=1.409019241 podStartE2EDuration="2.077361387s" podCreationTimestamp="2026-01-21 00:08:24 +0000 UTC" firstStartedPulling="2026-01-21 00:08:25.111815405 +0000 UTC m=+5577.432075693" lastFinishedPulling="2026-01-21 00:08:25.780157541 +0000 UTC m=+5578.100417839" observedRunningTime="2026-01-21 00:08:26.076150078 +0000 UTC m=+5578.396410396" watchObservedRunningTime="2026-01-21 00:08:26.077361387 +0000 UTC m=+5578.397621695" Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.099370 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" podStartSLOduration=1.678655349 podStartE2EDuration="2.099351111s" podCreationTimestamp="2026-01-21 00:08:24 +0000 UTC" firstStartedPulling="2026-01-21 00:08:25.194021538 +0000 UTC m=+5577.514281826" lastFinishedPulling="2026-01-21 00:08:25.61471728 +0000 UTC m=+5577.934977588" observedRunningTime="2026-01-21 00:08:26.095998269 +0000 UTC m=+5578.416258597" watchObservedRunningTime="2026-01-21 00:08:26.099351111 +0000 UTC m=+5578.419611399" Jan 21 00:08:26 crc kubenswrapper[5030]: I0121 00:08:26.295758 5030 scope.go:117] "RemoveContainer" containerID="c27e63d6bb0d90af35362d77a9b2ea7614368d866e70db392fa9344972e9f4ee" Jan 21 00:08:28 crc kubenswrapper[5030]: I0121 00:08:28.181223 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8224f59-84ad-49de-af58-6853e11fe09e" containerID="06019265b6eabe0b81be6109c0bffdb530cf5cf5af10923985e45d1e1edf92e2" exitCode=0 Jan 21 00:08:28 crc kubenswrapper[5030]: I0121 00:08:28.181301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" event={"ID":"c8224f59-84ad-49de-af58-6853e11fe09e","Type":"ContainerDied","Data":"06019265b6eabe0b81be6109c0bffdb530cf5cf5af10923985e45d1e1edf92e2"} Jan 21 00:08:28 crc kubenswrapper[5030]: I0121 00:08:28.186384 5030 generic.go:334] "Generic (PLEG): container finished" podID="10d6caab-b199-4c48-ba30-6b3a23a2f921" containerID="11a82d346b30da8208a3328089fa26f7c1cff67802177ca59adcd25b382bc1b8" exitCode=0 Jan 21 00:08:28 crc kubenswrapper[5030]: I0121 00:08:28.186454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" event={"ID":"10d6caab-b199-4c48-ba30-6b3a23a2f921","Type":"ContainerDied","Data":"11a82d346b30da8208a3328089fa26f7c1cff67802177ca59adcd25b382bc1b8"} Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.597503 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.606935 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle\") pod \"c8224f59-84ad-49de-af58-6853e11fe09e\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713647 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle\") pod \"10d6caab-b199-4c48-ba30-6b3a23a2f921\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory\") pod \"10d6caab-b199-4c48-ba30-6b3a23a2f921\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory\") pod \"c8224f59-84ad-49de-af58-6853e11fe09e\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713765 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset\") pod \"c8224f59-84ad-49de-af58-6853e11fe09e\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713799 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2zh6\" (UniqueName: \"kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6\") pod \"c8224f59-84ad-49de-af58-6853e11fe09e\" (UID: \"c8224f59-84ad-49de-af58-6853e11fe09e\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713890 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes\") pod \"10d6caab-b199-4c48-ba30-6b3a23a2f921\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.713919 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwmb\" (UniqueName: \"kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb\") pod \"10d6caab-b199-4c48-ba30-6b3a23a2f921\" (UID: \"10d6caab-b199-4c48-ba30-6b3a23a2f921\") " Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.720663 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb" (OuterVolumeSpecName: "kube-api-access-dlwmb") pod "10d6caab-b199-4c48-ba30-6b3a23a2f921" (UID: "10d6caab-b199-4c48-ba30-6b3a23a2f921"). InnerVolumeSpecName "kube-api-access-dlwmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.720904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6" (OuterVolumeSpecName: "kube-api-access-c2zh6") pod "c8224f59-84ad-49de-af58-6853e11fe09e" (UID: "c8224f59-84ad-49de-af58-6853e11fe09e"). InnerVolumeSpecName "kube-api-access-c2zh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.721836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c8224f59-84ad-49de-af58-6853e11fe09e" (UID: "c8224f59-84ad-49de-af58-6853e11fe09e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.728961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "10d6caab-b199-4c48-ba30-6b3a23a2f921" (UID: "10d6caab-b199-4c48-ba30-6b3a23a2f921"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.736908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "10d6caab-b199-4c48-ba30-6b3a23a2f921" (UID: "10d6caab-b199-4c48-ba30-6b3a23a2f921"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.737720 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory" (OuterVolumeSpecName: "inventory") pod "10d6caab-b199-4c48-ba30-6b3a23a2f921" (UID: "10d6caab-b199-4c48-ba30-6b3a23a2f921"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.759934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory" (OuterVolumeSpecName: "inventory") pod "c8224f59-84ad-49de-af58-6853e11fe09e" (UID: "c8224f59-84ad-49de-af58-6853e11fe09e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.762878 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "ssh-key-edpm-compute-beta-nodeset") pod "c8224f59-84ad-49de-af58-6853e11fe09e" (UID: "c8224f59-84ad-49de-af58-6853e11fe09e"). InnerVolumeSpecName "ssh-key-edpm-compute-beta-nodeset". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816403 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816506 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwmb\" (UniqueName: \"kubernetes.io/projected/10d6caab-b199-4c48-ba30-6b3a23a2f921-kube-api-access-dlwmb\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816527 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816545 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816598 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10d6caab-b199-4c48-ba30-6b3a23a2f921-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816648 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816678 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/secret/c8224f59-84ad-49de-af58-6853e11fe09e-ssh-key-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:29 crc kubenswrapper[5030]: I0121 00:08:29.816708 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2zh6\" (UniqueName: \"kubernetes.io/projected/c8224f59-84ad-49de-af58-6853e11fe09e-kube-api-access-c2zh6\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.209546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" event={"ID":"c8224f59-84ad-49de-af58-6853e11fe09e","Type":"ContainerDied","Data":"74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7"} Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.209596 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.209663 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74681eeb66987e3f6158206fc50b38ca8bed5958e159eb8253c3beeadc0924d7" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.213693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" event={"ID":"10d6caab-b199-4c48-ba30-6b3a23a2f921","Type":"ContainerDied","Data":"667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b"} Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.213763 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667d85d48d9b16450ce2f55a45c34f343cdb39ccaec0aa054cf603bc9a8d792b" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.213825 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.310360 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn"] Jan 21 00:08:30 crc kubenswrapper[5030]: E0121 00:08:30.310813 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8224f59-84ad-49de-af58-6853e11fe09e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.310837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8224f59-84ad-49de-af58-6853e11fe09e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:30 crc kubenswrapper[5030]: E0121 00:08:30.310866 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d6caab-b199-4c48-ba30-6b3a23a2f921" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.310876 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d6caab-b199-4c48-ba30-6b3a23a2f921" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.311072 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8224f59-84ad-49de-af58-6853e11fe09e" containerName="bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.311095 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d6caab-b199-4c48-ba30-6b3a23a2f921" containerName="bootstrap-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.311671 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.314115 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.314309 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.314549 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.314826 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.324063 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn"] Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.430106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.430193 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjvtq\" (UniqueName: \"kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.430412 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.532760 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.532952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" 
Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.533046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjvtq\" (UniqueName: \"kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.541725 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.544327 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.566540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjvtq\" (UniqueName: \"kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq\") pod \"configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:30 crc kubenswrapper[5030]: I0121 00:08:30.628114 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:31 crc kubenswrapper[5030]: W0121 00:08:31.111859 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3d46dc3_5c30_46c4_881e_b02affc0e9a3.slice/crio-d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed WatchSource:0}: Error finding container d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed: Status 404 returned error can't find the container with id d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed Jan 21 00:08:31 crc kubenswrapper[5030]: I0121 00:08:31.115244 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn"] Jan 21 00:08:31 crc kubenswrapper[5030]: I0121 00:08:31.241044 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" event={"ID":"f3d46dc3-5c30-46c4-881e-b02affc0e9a3","Type":"ContainerStarted","Data":"d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed"} Jan 21 00:08:32 crc kubenswrapper[5030]: I0121 00:08:32.254489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" event={"ID":"f3d46dc3-5c30-46c4-881e-b02affc0e9a3","Type":"ContainerStarted","Data":"de2eb28606c6759e62cd01298595b14a1ffa5b4dfa6d7941f92c84212e3b6908"} Jan 21 00:08:32 crc kubenswrapper[5030]: I0121 00:08:32.284371 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" podStartSLOduration=1.777208712 podStartE2EDuration="2.284341339s" podCreationTimestamp="2026-01-21 00:08:30 +0000 UTC" firstStartedPulling="2026-01-21 00:08:31.114053352 +0000 UTC m=+5583.434313650" lastFinishedPulling="2026-01-21 00:08:31.621185979 +0000 UTC m=+5583.941446277" observedRunningTime="2026-01-21 00:08:32.277124454 +0000 UTC m=+5584.597384782" watchObservedRunningTime="2026-01-21 00:08:32.284341339 +0000 UTC m=+5584.604601667" Jan 21 00:08:33 crc kubenswrapper[5030]: I0121 00:08:33.268329 5030 generic.go:334] "Generic (PLEG): container finished" podID="f3d46dc3-5c30-46c4-881e-b02affc0e9a3" containerID="de2eb28606c6759e62cd01298595b14a1ffa5b4dfa6d7941f92c84212e3b6908" exitCode=0 Jan 21 00:08:33 crc kubenswrapper[5030]: I0121 00:08:33.268598 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" event={"ID":"f3d46dc3-5c30-46c4-881e-b02affc0e9a3","Type":"ContainerDied","Data":"de2eb28606c6759e62cd01298595b14a1ffa5b4dfa6d7941f92c84212e3b6908"} Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.782772 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.906581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes\") pod \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.906718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjvtq\" (UniqueName: \"kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq\") pod \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.906827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory\") pod \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\" (UID: \"f3d46dc3-5c30-46c4-881e-b02affc0e9a3\") " Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.918045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq" (OuterVolumeSpecName: "kube-api-access-zjvtq") pod "f3d46dc3-5c30-46c4-881e-b02affc0e9a3" (UID: "f3d46dc3-5c30-46c4-881e-b02affc0e9a3"). InnerVolumeSpecName "kube-api-access-zjvtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.938390 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "f3d46dc3-5c30-46c4-881e-b02affc0e9a3" (UID: "f3d46dc3-5c30-46c4-881e-b02affc0e9a3"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:34 crc kubenswrapper[5030]: I0121 00:08:34.938402 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory" (OuterVolumeSpecName: "inventory") pod "f3d46dc3-5c30-46c4-881e-b02affc0e9a3" (UID: "f3d46dc3-5c30-46c4-881e-b02affc0e9a3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.010030 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.010085 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjvtq\" (UniqueName: \"kubernetes.io/projected/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-kube-api-access-zjvtq\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.010109 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3d46dc3-5c30-46c4-881e-b02affc0e9a3-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.301035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" event={"ID":"f3d46dc3-5c30-46c4-881e-b02affc0e9a3","Type":"ContainerDied","Data":"d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed"} Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.301085 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2c7369254436ece3c0fab3702eeb40bc040a91b2983a436ac62a5e7aa0d11ed" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.301145 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.364141 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t"] Jan 21 00:08:35 crc kubenswrapper[5030]: E0121 00:08:35.364518 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d46dc3-5c30-46c4-881e-b02affc0e9a3" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.364541 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d46dc3-5c30-46c4-881e-b02affc0e9a3" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.380537 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d46dc3-5c30-46c4-881e-b02affc0e9a3" containerName="configure-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.381580 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.388804 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t"] Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.390448 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.390461 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.390594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.390558 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.517280 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.517335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c2dm\" (UniqueName: \"kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.517387 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.619121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.619431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c2dm\" (UniqueName: \"kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " 
pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.619562 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.629056 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.629332 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.646133 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c2dm\" (UniqueName: \"kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm\") pod \"validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:35 crc kubenswrapper[5030]: I0121 00:08:35.710530 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:36 crc kubenswrapper[5030]: W0121 00:08:36.178562 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32c4ac6a_8fbe_44b6_a6e7_beebc0a141d8.slice/crio-f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45 WatchSource:0}: Error finding container f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45: Status 404 returned error can't find the container with id f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45 Jan 21 00:08:36 crc kubenswrapper[5030]: I0121 00:08:36.183692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t"] Jan 21 00:08:36 crc kubenswrapper[5030]: I0121 00:08:36.312418 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" event={"ID":"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8","Type":"ContainerStarted","Data":"f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45"} Jan 21 00:08:37 crc kubenswrapper[5030]: I0121 00:08:37.320550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" event={"ID":"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8","Type":"ContainerStarted","Data":"a21170eaaaa8bd732a221e30c9e24e967c320d9a23616f558d13013131437541"} Jan 21 00:08:37 crc kubenswrapper[5030]: I0121 00:08:37.342680 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" podStartSLOduration=1.5380528390000001 podStartE2EDuration="2.342665448s" podCreationTimestamp="2026-01-21 00:08:35 +0000 UTC" firstStartedPulling="2026-01-21 00:08:36.180303454 +0000 UTC m=+5588.500563732" lastFinishedPulling="2026-01-21 00:08:36.984916053 +0000 UTC m=+5589.305176341" observedRunningTime="2026-01-21 00:08:37.341457139 +0000 UTC m=+5589.661717447" watchObservedRunningTime="2026-01-21 00:08:37.342665448 +0000 UTC m=+5589.662925726" Jan 21 00:08:38 crc kubenswrapper[5030]: I0121 00:08:38.337328 5030 generic.go:334] "Generic (PLEG): container finished" podID="32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" containerID="a21170eaaaa8bd732a221e30c9e24e967c320d9a23616f558d13013131437541" exitCode=0 Jan 21 00:08:38 crc kubenswrapper[5030]: I0121 00:08:38.337836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" event={"ID":"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8","Type":"ContainerDied","Data":"a21170eaaaa8bd732a221e30c9e24e967c320d9a23616f558d13013131437541"} Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.647129 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.780465 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory\") pod \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.780524 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c2dm\" (UniqueName: \"kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm\") pod \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.780685 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes\") pod \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\" (UID: \"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8\") " Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.786313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm" (OuterVolumeSpecName: "kube-api-access-2c2dm") pod "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" (UID: "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8"). InnerVolumeSpecName "kube-api-access-2c2dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.803346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory" (OuterVolumeSpecName: "inventory") pod "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" (UID: "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.806858 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" (UID: "32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.882864 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c2dm\" (UniqueName: \"kubernetes.io/projected/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-kube-api-access-2c2dm\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.882899 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:39 crc kubenswrapper[5030]: I0121 00:08:39.882913 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.369938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" event={"ID":"32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8","Type":"ContainerDied","Data":"f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45"} Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.370306 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f58fec3c7fc24827c830f59ab5018846fad51255d57655d1b346097c9bd0ae45" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.370414 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.464510 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq"] Jan 21 00:08:40 crc kubenswrapper[5030]: E0121 00:08:40.464837 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.464854 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.464993 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" containerName="validate-network-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.465437 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.468245 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.468875 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.471982 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.473742 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.479550 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq"] Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.521352 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.521637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz8xf\" (UniqueName: \"kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.521783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.622914 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.623031 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.623070 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz8xf\" (UniqueName: \"kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.628697 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.628973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.641272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz8xf\" (UniqueName: \"kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf\") pod \"install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:40 crc kubenswrapper[5030]: I0121 00:08:40.791804 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:41 crc kubenswrapper[5030]: I0121 00:08:41.273112 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq"] Jan 21 00:08:41 crc kubenswrapper[5030]: I0121 00:08:41.381034 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" event={"ID":"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8","Type":"ContainerStarted","Data":"6da04083c47ad1558440ee3473204fffd7e32e131657adb7a3b778758cf337c7"} Jan 21 00:08:46 crc kubenswrapper[5030]: I0121 00:08:46.440144 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" event={"ID":"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8","Type":"ContainerStarted","Data":"cc8347fdd1e6eec1f6d192d22b0b150e056b13e81e240ab57f39a88cc76a545f"} Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.433165 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" podStartSLOduration=2.887821023 podStartE2EDuration="7.433147103s" podCreationTimestamp="2026-01-21 00:08:40 +0000 UTC" firstStartedPulling="2026-01-21 00:08:41.264113292 +0000 UTC m=+5593.584373580" lastFinishedPulling="2026-01-21 00:08:45.809439362 +0000 UTC m=+5598.129699660" observedRunningTime="2026-01-21 00:08:46.4729311 +0000 UTC m=+5598.793191418" watchObservedRunningTime="2026-01-21 00:08:47.433147103 +0000 UTC m=+5599.753407401" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.435761 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.438318 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.446177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.455040 5030 generic.go:334] "Generic (PLEG): container finished" podID="f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" containerID="cc8347fdd1e6eec1f6d192d22b0b150e056b13e81e240ab57f39a88cc76a545f" exitCode=0 Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.455088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" event={"ID":"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8","Type":"ContainerDied","Data":"cc8347fdd1e6eec1f6d192d22b0b150e056b13e81e240ab57f39a88cc76a545f"} Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.536196 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.536480 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84lsw\" (UniqueName: \"kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.536517 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.637990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.638077 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84lsw\" (UniqueName: \"kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.638121 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.638583 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content\") pod 
\"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.638604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.680899 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84lsw\" (UniqueName: \"kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw\") pod \"certified-operators-rpccd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:47 crc kubenswrapper[5030]: I0121 00:08:47.763837 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:48 crc kubenswrapper[5030]: W0121 00:08:48.280216 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157b727a_8cd0_44ae_8406_1dcaf981ffdd.slice/crio-111768af66beda35aed5ae66398604eb69cd074faeecfe7b2a03a9f0c868e564 WatchSource:0}: Error finding container 111768af66beda35aed5ae66398604eb69cd074faeecfe7b2a03a9f0c868e564: Status 404 returned error can't find the container with id 111768af66beda35aed5ae66398604eb69cd074faeecfe7b2a03a9f0c868e564 Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.286455 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.473869 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerStarted","Data":"111768af66beda35aed5ae66398604eb69cd074faeecfe7b2a03a9f0c868e564"} Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.777911 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.962667 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory\") pod \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.962832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz8xf\" (UniqueName: \"kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf\") pod \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.962908 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes\") pod \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\" (UID: \"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8\") " Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.969218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf" (OuterVolumeSpecName: "kube-api-access-qz8xf") pod "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" (UID: "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8"). InnerVolumeSpecName "kube-api-access-qz8xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.989389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" (UID: "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:48 crc kubenswrapper[5030]: I0121 00:08:48.990322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory" (OuterVolumeSpecName: "inventory") pod "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" (UID: "f3580aa5-c95d-4d3c-8b9a-da3fe92633b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.064976 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.065022 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz8xf\" (UniqueName: \"kubernetes.io/projected/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-kube-api-access-qz8xf\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.065034 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.486587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" event={"ID":"f3580aa5-c95d-4d3c-8b9a-da3fe92633b8","Type":"ContainerDied","Data":"6da04083c47ad1558440ee3473204fffd7e32e131657adb7a3b778758cf337c7"} Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.486680 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da04083c47ad1558440ee3473204fffd7e32e131657adb7a3b778758cf337c7" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.486637 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.489091 5030 generic.go:334] "Generic (PLEG): container finished" podID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerID="8207e1592085806e1f3d97f81fc4dadde2881307788bbf90f7cfa7410bc37f67" exitCode=0 Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.489159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerDied","Data":"8207e1592085806e1f3d97f81fc4dadde2881307788bbf90f7cfa7410bc37f67"} Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.494298 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.550551 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw"] Jan 21 00:08:49 crc kubenswrapper[5030]: E0121 00:08:49.553314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.553340 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.553485 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" containerName="install-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.553994 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.556477 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.557091 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.557123 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.557424 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.565799 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw"] Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.672600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.672690 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.672730 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phfwh\" (UniqueName: \"kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.774362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.774405 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 
00:08:49.774447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phfwh\" (UniqueName: \"kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.780151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.780792 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.801835 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phfwh\" (UniqueName: \"kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh\") pod \"configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:49 crc kubenswrapper[5030]: I0121 00:08:49.889700 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:50 crc kubenswrapper[5030]: I0121 00:08:50.331970 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw"] Jan 21 00:08:50 crc kubenswrapper[5030]: W0121 00:08:50.342417 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfba4cf5f_c4fd_49ec_9d00_f5724174b6a3.slice/crio-f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6 WatchSource:0}: Error finding container f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6: Status 404 returned error can't find the container with id f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6 Jan 21 00:08:50 crc kubenswrapper[5030]: I0121 00:08:50.497552 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" event={"ID":"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3","Type":"ContainerStarted","Data":"f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6"} Jan 21 00:08:51 crc kubenswrapper[5030]: I0121 00:08:51.505394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" event={"ID":"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3","Type":"ContainerStarted","Data":"7fa7223501a2a9f0fb1d9ac88a72f6ada4e6ae3525f87be4e1affde884fd2858"} Jan 21 00:08:51 crc kubenswrapper[5030]: I0121 00:08:51.533355 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" podStartSLOduration=1.740431095 podStartE2EDuration="2.53333344s" podCreationTimestamp="2026-01-21 00:08:49 +0000 UTC" firstStartedPulling="2026-01-21 00:08:50.345809546 +0000 UTC m=+5602.666069834" lastFinishedPulling="2026-01-21 00:08:51.138711891 +0000 UTC m=+5603.458972179" observedRunningTime="2026-01-21 00:08:51.523720367 +0000 UTC m=+5603.843980645" watchObservedRunningTime="2026-01-21 00:08:51.53333344 +0000 UTC m=+5603.853593728" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.624530 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.628031 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.635758 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.717597 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.717703 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.717769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8t2\" (UniqueName: \"kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.819262 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8t2\" (UniqueName: \"kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.819365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.819444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.819966 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.819972 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:52 crc kubenswrapper[5030]: I0121 00:08:52.839772 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4q8t2\" (UniqueName: \"kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2\") pod \"community-operators-85f7g\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.032332 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.222370 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.224787 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.231734 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.327433 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.327485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.327867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjmc9\" (UniqueName: \"kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.429356 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.429411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.429475 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjmc9\" (UniqueName: \"kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.429911 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.430100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.458492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjmc9\" (UniqueName: \"kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9\") pod \"redhat-marketplace-w4tnv\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.524483 5030 generic.go:334] "Generic (PLEG): container finished" podID="fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" containerID="7fa7223501a2a9f0fb1d9ac88a72f6ada4e6ae3525f87be4e1affde884fd2858" exitCode=0 Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.524527 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" event={"ID":"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3","Type":"ContainerDied","Data":"7fa7223501a2a9f0fb1d9ac88a72f6ada4e6ae3525f87be4e1affde884fd2858"} Jan 21 00:08:53 crc kubenswrapper[5030]: I0121 00:08:53.543486 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.411417 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:08:54 crc kubenswrapper[5030]: W0121 00:08:54.432679 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca0b585_e59f_437b_895a_9fd2ce709f2e.slice/crio-60dbffcb0977b5d13ffc26db81224b605dfdc3f091c43d011c649ea5faa1d922 WatchSource:0}: Error finding container 60dbffcb0977b5d13ffc26db81224b605dfdc3f091c43d011c649ea5faa1d922: Status 404 returned error can't find the container with id 60dbffcb0977b5d13ffc26db81224b605dfdc3f091c43d011c649ea5faa1d922 Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.459048 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:08:54 crc kubenswrapper[5030]: W0121 00:08:54.468566 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc077515c_ffe7_4674_acc5_6ede3db5e452.slice/crio-13d803828118213a3cdcf3b93f5616f5e63ab052553968aa1aaa45e89ce0a578 WatchSource:0}: Error finding container 13d803828118213a3cdcf3b93f5616f5e63ab052553968aa1aaa45e89ce0a578: Status 404 returned error can't find the container with id 13d803828118213a3cdcf3b93f5616f5e63ab052553968aa1aaa45e89ce0a578 Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.541018 5030 generic.go:334] "Generic (PLEG): container finished" podID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerID="e2cc1922b10bab485194876b2ee89c889b82b0d77dd03a6e014604e6f3420189" exitCode=0 Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.541080 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerDied","Data":"e2cc1922b10bab485194876b2ee89c889b82b0d77dd03a6e014604e6f3420189"} Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.544421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerStarted","Data":"60dbffcb0977b5d13ffc26db81224b605dfdc3f091c43d011c649ea5faa1d922"} Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.547549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerStarted","Data":"13d803828118213a3cdcf3b93f5616f5e63ab052553968aa1aaa45e89ce0a578"} Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.794165 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.950147 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes\") pod \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.950216 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory\") pod \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.950322 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phfwh\" (UniqueName: \"kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh\") pod \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\" (UID: \"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3\") " Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.956056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh" (OuterVolumeSpecName: "kube-api-access-phfwh") pod "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" (UID: "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3"). InnerVolumeSpecName "kube-api-access-phfwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.979839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory" (OuterVolumeSpecName: "inventory") pod "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" (UID: "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:54 crc kubenswrapper[5030]: I0121 00:08:54.979958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" (UID: "fba4cf5f-c4fd-49ec-9d00-f5724174b6a3"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.052381 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.052421 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.052438 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phfwh\" (UniqueName: \"kubernetes.io/projected/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3-kube-api-access-phfwh\") on node \"crc\" DevicePath \"\"" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.556510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" event={"ID":"fba4cf5f-c4fd-49ec-9d00-f5724174b6a3","Type":"ContainerDied","Data":"f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6"} Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.556780 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81582f357576f12dab2d8fe192a868fe38527baf1bf195902c7398e5722ccf6" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.556521 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.559048 5030 generic.go:334] "Generic (PLEG): container finished" podID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerID="cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9" exitCode=0 Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.559090 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerDied","Data":"cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9"} Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.562471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerStarted","Data":"e0eae01eca9efd3b51d95b741a01b46c7fe7f2fbdb5f37fbf2681b21e9f7e81f"} Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.566850 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerID="1c8c49a30bd63c73c01ade5fa4ffe498035336c12b09dfd2f8e47e8d97752883" exitCode=0 Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.567114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerDied","Data":"1c8c49a30bd63c73c01ade5fa4ffe498035336c12b09dfd2f8e47e8d97752883"} Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.628175 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5"] Jan 21 00:08:55 crc kubenswrapper[5030]: E0121 00:08:55.628526 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" 
containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.628548 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.628741 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" containerName="configure-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.629280 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.637283 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.637365 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.637410 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.637575 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.657614 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5"] Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.671029 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rpccd" podStartSLOduration=3.204941021 podStartE2EDuration="8.670996386s" podCreationTimestamp="2026-01-21 00:08:47 +0000 UTC" firstStartedPulling="2026-01-21 00:08:49.494045933 +0000 UTC m=+5601.814306221" lastFinishedPulling="2026-01-21 00:08:54.960101288 +0000 UTC m=+5607.280361586" observedRunningTime="2026-01-21 00:08:55.661552446 +0000 UTC m=+5607.981812744" watchObservedRunningTime="2026-01-21 00:08:55.670996386 +0000 UTC m=+5607.991256664" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.763097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.763415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbjg\" (UniqueName: \"kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.763584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.864601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.864703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.864751 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbjg\" (UniqueName: \"kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.870248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.872287 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.886468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbjg\" (UniqueName: \"kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg\") pod \"run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:55 crc kubenswrapper[5030]: I0121 00:08:55.962216 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:08:56 crc kubenswrapper[5030]: I0121 00:08:56.254273 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5"] Jan 21 00:08:56 crc kubenswrapper[5030]: W0121 00:08:56.270865 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eaa38f0_4707_4be1_a3ca_b343dff5e36a.slice/crio-fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a WatchSource:0}: Error finding container fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a: Status 404 returned error can't find the container with id fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a Jan 21 00:08:56 crc kubenswrapper[5030]: I0121 00:08:56.588553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerStarted","Data":"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42"} Jan 21 00:08:56 crc kubenswrapper[5030]: I0121 00:08:56.589819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" event={"ID":"0eaa38f0-4707-4be1-a3ca-b343dff5e36a","Type":"ContainerStarted","Data":"fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a"} Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.601452 5030 generic.go:334] "Generic (PLEG): container finished" podID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerID="18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42" exitCode=0 Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.601519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerDied","Data":"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42"} Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.605171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" event={"ID":"0eaa38f0-4707-4be1-a3ca-b343dff5e36a","Type":"ContainerStarted","Data":"d46f0597a253211b0fae46ccba166f5d11c876c7f259f66e0ac54845d6296249"} Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.639519 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" podStartSLOduration=2.003229618 podStartE2EDuration="2.639499126s" podCreationTimestamp="2026-01-21 00:08:55 +0000 UTC" firstStartedPulling="2026-01-21 00:08:56.272229264 +0000 UTC m=+5608.592489552" lastFinishedPulling="2026-01-21 00:08:56.908498772 +0000 UTC m=+5609.228759060" observedRunningTime="2026-01-21 00:08:57.632818634 +0000 UTC m=+5609.953078922" watchObservedRunningTime="2026-01-21 00:08:57.639499126 +0000 UTC m=+5609.959759414" Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.764516 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.764909 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:57 crc kubenswrapper[5030]: I0121 00:08:57.807011 
5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:08:58 crc kubenswrapper[5030]: I0121 00:08:58.666084 5030 generic.go:334] "Generic (PLEG): container finished" podID="0eaa38f0-4707-4be1-a3ca-b343dff5e36a" containerID="d46f0597a253211b0fae46ccba166f5d11c876c7f259f66e0ac54845d6296249" exitCode=0 Jan 21 00:08:58 crc kubenswrapper[5030]: I0121 00:08:58.666742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" event={"ID":"0eaa38f0-4707-4be1-a3ca-b343dff5e36a","Type":"ContainerDied","Data":"d46f0597a253211b0fae46ccba166f5d11c876c7f259f66e0ac54845d6296249"} Jan 21 00:08:59 crc kubenswrapper[5030]: I0121 00:08:59.677833 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerStarted","Data":"32c10319cc98b71529f1f81f9eebbbd38461c2834d697fa32ec41494bb153f86"} Jan 21 00:08:59 crc kubenswrapper[5030]: I0121 00:08:59.997360 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.079592 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes\") pod \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.079914 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory\") pod \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.079976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jbjg\" (UniqueName: \"kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg\") pod \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\" (UID: \"0eaa38f0-4707-4be1-a3ca-b343dff5e36a\") " Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.085332 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg" (OuterVolumeSpecName: "kube-api-access-7jbjg") pod "0eaa38f0-4707-4be1-a3ca-b343dff5e36a" (UID: "0eaa38f0-4707-4be1-a3ca-b343dff5e36a"). InnerVolumeSpecName "kube-api-access-7jbjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.103505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory" (OuterVolumeSpecName: "inventory") pod "0eaa38f0-4707-4be1-a3ca-b343dff5e36a" (UID: "0eaa38f0-4707-4be1-a3ca-b343dff5e36a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.108870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "0eaa38f0-4707-4be1-a3ca-b343dff5e36a" (UID: "0eaa38f0-4707-4be1-a3ca-b343dff5e36a"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.181275 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.181307 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.181324 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jbjg\" (UniqueName: \"kubernetes.io/projected/0eaa38f0-4707-4be1-a3ca-b343dff5e36a-kube-api-access-7jbjg\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.691769 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" event={"ID":"0eaa38f0-4707-4be1-a3ca-b343dff5e36a","Type":"ContainerDied","Data":"fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a"} Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.691829 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbcd06f623e94eedd5433b508bcb1fd17ccf0dc964d0e99d27be81d6859102a" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.691903 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.697448 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerID="32c10319cc98b71529f1f81f9eebbbd38461c2834d697fa32ec41494bb153f86" exitCode=0 Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.697605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerDied","Data":"32c10319cc98b71529f1f81f9eebbbd38461c2834d697fa32ec41494bb153f86"} Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.704479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerStarted","Data":"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631"} Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.769707 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz"] Jan 21 00:09:00 crc kubenswrapper[5030]: E0121 00:09:00.770052 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaa38f0-4707-4be1-a3ca-b343dff5e36a" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.770069 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaa38f0-4707-4be1-a3ca-b343dff5e36a" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.770214 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaa38f0-4707-4be1-a3ca-b343dff5e36a" containerName="run-os-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.770718 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.776058 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.776790 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.776812 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85f7g" podStartSLOduration=4.168489059 podStartE2EDuration="8.776790986s" podCreationTimestamp="2026-01-21 00:08:52 +0000 UTC" firstStartedPulling="2026-01-21 00:08:55.56022495 +0000 UTC m=+5607.880485238" lastFinishedPulling="2026-01-21 00:09:00.168526877 +0000 UTC m=+5612.488787165" observedRunningTime="2026-01-21 00:09:00.768766521 +0000 UTC m=+5613.089026849" watchObservedRunningTime="2026-01-21 00:09:00.776790986 +0000 UTC m=+5613.097051274" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.776942 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.777049 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.777130 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.797810 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz"] Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.890824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.890909 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.890960 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891053 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891106 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7pd\" (UniqueName: \"kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891301 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.891612 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.992954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993016 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993072 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7pd\" (UniqueName: \"kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993194 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.993250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.999307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.999612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:00 crc kubenswrapper[5030]: I0121 00:09:00.999694 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: 
\"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.004263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.004461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.004693 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.004727 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.004973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.008399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.011680 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.012904 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7pd\" (UniqueName: \"kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd\") pod \"install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.101427 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.582119 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz"] Jan 21 00:09:01 crc kubenswrapper[5030]: W0121 00:09:01.585791 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bb49995_af5d_457a_9f76_67eae8b2a408.slice/crio-3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f WatchSource:0}: Error finding container 3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f: Status 404 returned error can't find the container with id 3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.728075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerStarted","Data":"f36feba575813ce4a9719fb0a36ca80d2a7ef5c1350ea62e4b1fdc18ff8abf23"} Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.730247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" event={"ID":"0bb49995-af5d-457a-9f76-67eae8b2a408","Type":"ContainerStarted","Data":"3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f"} Jan 21 00:09:01 crc kubenswrapper[5030]: I0121 00:09:01.752705 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4tnv" podStartSLOduration=3.088468039 podStartE2EDuration="8.752682839s" podCreationTimestamp="2026-01-21 00:08:53 +0000 UTC" firstStartedPulling="2026-01-21 00:08:55.570047368 +0000 UTC m=+5607.890307656" lastFinishedPulling="2026-01-21 00:09:01.234262168 +0000 UTC m=+5613.554522456" observedRunningTime="2026-01-21 00:09:01.745957175 +0000 UTC m=+5614.066217463" watchObservedRunningTime="2026-01-21 00:09:01.752682839 +0000 UTC m=+5614.072943137" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.033229 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.033554 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.114687 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.544486 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.544563 5030 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.600877 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:09:03 crc kubenswrapper[5030]: I0121 00:09:03.747273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" event={"ID":"0bb49995-af5d-457a-9f76-67eae8b2a408","Type":"ContainerStarted","Data":"baed88bf47793cc103183bf100cfc830e86ad5195ccf15741aa726a2d053e430"} Jan 21 00:09:04 crc kubenswrapper[5030]: I0121 00:09:04.758459 5030 generic.go:334] "Generic (PLEG): container finished" podID="0bb49995-af5d-457a-9f76-67eae8b2a408" containerID="baed88bf47793cc103183bf100cfc830e86ad5195ccf15741aa726a2d053e430" exitCode=0 Jan 21 00:09:04 crc kubenswrapper[5030]: I0121 00:09:04.759680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" event={"ID":"0bb49995-af5d-457a-9f76-67eae8b2a408","Type":"ContainerDied","Data":"baed88bf47793cc103183bf100cfc830e86ad5195ccf15741aa726a2d053e430"} Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.109264 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166188 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7pd\" (UniqueName: \"kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166225 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166334 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.166454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle\") pod \"0bb49995-af5d-457a-9f76-67eae8b2a408\" (UID: \"0bb49995-af5d-457a-9f76-67eae8b2a408\") " Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.171818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.172053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.172193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.173525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd" (OuterVolumeSpecName: "kube-api-access-gl7pd") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "kube-api-access-gl7pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.173627 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.174725 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.175352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.175469 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.176832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.193338 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory" (OuterVolumeSpecName: "inventory") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.195442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "0bb49995-af5d-457a-9f76-67eae8b2a408" (UID: "0bb49995-af5d-457a-9f76-67eae8b2a408"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268490 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268541 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268574 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7pd\" (UniqueName: \"kubernetes.io/projected/0bb49995-af5d-457a-9f76-67eae8b2a408-kube-api-access-gl7pd\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268593 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268611 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268652 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268671 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268688 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268707 5030 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268725 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.268743 5030 reconciler_common.go:293] "Volume detached for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb49995-af5d-457a-9f76-67eae8b2a408-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.783413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" event={"ID":"0bb49995-af5d-457a-9f76-67eae8b2a408","Type":"ContainerDied","Data":"3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f"} Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.783467 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc4bda5e3b34825f804a8e71a0862210d4edc1ff5237cfef8b99b2dee77033f" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.783528 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.882102 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt"] Jan 21 00:09:06 crc kubenswrapper[5030]: E0121 00:09:06.882486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb49995-af5d-457a-9f76-67eae8b2a408" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.882506 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb49995-af5d-457a-9f76-67eae8b2a408" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.882706 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb49995-af5d-457a-9f76-67eae8b2a408" containerName="install-certs-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.883231 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.885545 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.885951 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.885979 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.886004 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.886068 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"ovncontroller-config" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.886141 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.898365 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt"] Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.981091 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvkj\" (UniqueName: \"kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.981146 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.981178 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.981219 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:06 crc kubenswrapper[5030]: I0121 00:09:06.981246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.082263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.082349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.082473 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvkj\" (UniqueName: \"kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.082522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.082570 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.083610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.087424 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.088285 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.088738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.102675 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvkj\" (UniqueName: \"kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj\") pod \"ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.201586 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.466408 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt"] Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.816212 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" event={"ID":"2787768b-6299-4a11-a990-4d436661b035","Type":"ContainerStarted","Data":"6b9ef990bdc0f98997e5e37866c95c4d1b33c5a28215f3c53e48dab94ee419b5"} Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.848824 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.927330 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.977942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 21 00:09:07 crc kubenswrapper[5030]: I0121 00:09:07.978225 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9nnfj" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="registry-server" containerID="cri-o://0b1c1e87cf452f7c0e8d907d8bd7c6982af26ab5a0ea69f838a99868391098fb" gracePeriod=2 Jan 21 00:09:10 crc kubenswrapper[5030]: I0121 00:09:10.893349 5030 generic.go:334] "Generic (PLEG): container finished" podID="f880cab6-348f-406b-ab12-6053781eedce" containerID="0b1c1e87cf452f7c0e8d907d8bd7c6982af26ab5a0ea69f838a99868391098fb" exitCode=0 Jan 21 00:09:10 crc kubenswrapper[5030]: I0121 00:09:10.893491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerDied","Data":"0b1c1e87cf452f7c0e8d907d8bd7c6982af26ab5a0ea69f838a99868391098fb"} Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 
00:09:11.610571 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nnfj" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.648940 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content\") pod \"f880cab6-348f-406b-ab12-6053781eedce\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.649057 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities\") pod \"f880cab6-348f-406b-ab12-6053781eedce\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.649241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25g75\" (UniqueName: \"kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75\") pod \"f880cab6-348f-406b-ab12-6053781eedce\" (UID: \"f880cab6-348f-406b-ab12-6053781eedce\") " Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.649962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities" (OuterVolumeSpecName: "utilities") pod "f880cab6-348f-406b-ab12-6053781eedce" (UID: "f880cab6-348f-406b-ab12-6053781eedce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.653519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75" (OuterVolumeSpecName: "kube-api-access-25g75") pod "f880cab6-348f-406b-ab12-6053781eedce" (UID: "f880cab6-348f-406b-ab12-6053781eedce"). InnerVolumeSpecName "kube-api-access-25g75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.699901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f880cab6-348f-406b-ab12-6053781eedce" (UID: "f880cab6-348f-406b-ab12-6053781eedce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.751386 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25g75\" (UniqueName: \"kubernetes.io/projected/f880cab6-348f-406b-ab12-6053781eedce-kube-api-access-25g75\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.751433 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.751445 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f880cab6-348f-406b-ab12-6053781eedce-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.907215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9nnfj" event={"ID":"f880cab6-348f-406b-ab12-6053781eedce","Type":"ContainerDied","Data":"7ff7a9f196b9838ecdf9c755d296121cdcb3f52ec287bea7b9330b11549e9a34"} Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.907314 5030 scope.go:117] "RemoveContainer" containerID="0b1c1e87cf452f7c0e8d907d8bd7c6982af26ab5a0ea69f838a99868391098fb" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.907731 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9nnfj" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.908813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" event={"ID":"2787768b-6299-4a11-a990-4d436661b035","Type":"ContainerStarted","Data":"5684c664f654279461f9b79890c2a571e256b4b31a47b47161c9ff10184e5879"} Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.933581 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" podStartSLOduration=1.923886714 podStartE2EDuration="5.933561966s" podCreationTimestamp="2026-01-21 00:09:06 +0000 UTC" firstStartedPulling="2026-01-21 00:09:07.469106277 +0000 UTC m=+5619.789366575" lastFinishedPulling="2026-01-21 00:09:11.478781529 +0000 UTC m=+5623.799041827" observedRunningTime="2026-01-21 00:09:11.926024834 +0000 UTC m=+5624.246285142" watchObservedRunningTime="2026-01-21 00:09:11.933561966 +0000 UTC m=+5624.253822254" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.937535 5030 scope.go:117] "RemoveContainer" containerID="c96c7e6c2c1f1b1e7b52997f64002751e0b87c85c79c229f2d0c80bd328f6e52" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.974657 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.979485 5030 scope.go:117] "RemoveContainer" containerID="10a97929e918afbe30046bbbda875def4a77352e6bc72b87106bea65e5c5f099" Jan 21 00:09:11 crc kubenswrapper[5030]: I0121 00:09:11.981933 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9nnfj"] Jan 21 00:09:12 crc kubenswrapper[5030]: I0121 00:09:12.922533 5030 generic.go:334] "Generic (PLEG): container finished" podID="2787768b-6299-4a11-a990-4d436661b035" containerID="5684c664f654279461f9b79890c2a571e256b4b31a47b47161c9ff10184e5879" exitCode=0 Jan 21 00:09:12 crc 
kubenswrapper[5030]: I0121 00:09:12.922579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" event={"ID":"2787768b-6299-4a11-a990-4d436661b035","Type":"ContainerDied","Data":"5684c664f654279461f9b79890c2a571e256b4b31a47b47161c9ff10184e5879"} Jan 21 00:09:13 crc kubenswrapper[5030]: I0121 00:09:13.078468 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:13 crc kubenswrapper[5030]: I0121 00:09:13.601482 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:09:13 crc kubenswrapper[5030]: I0121 00:09:13.973578 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f880cab6-348f-406b-ab12-6053781eedce" path="/var/lib/kubelet/pods/f880cab6-348f-406b-ab12-6053781eedce/volumes" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.224081 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.291600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle\") pod \"2787768b-6299-4a11-a990-4d436661b035\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.291725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory\") pod \"2787768b-6299-4a11-a990-4d436661b035\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.291787 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes\") pod \"2787768b-6299-4a11-a990-4d436661b035\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.291863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvvkj\" (UniqueName: \"kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj\") pod \"2787768b-6299-4a11-a990-4d436661b035\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.291940 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0\") pod \"2787768b-6299-4a11-a990-4d436661b035\" (UID: \"2787768b-6299-4a11-a990-4d436661b035\") " Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.297778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2787768b-6299-4a11-a990-4d436661b035" (UID: "2787768b-6299-4a11-a990-4d436661b035"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.297927 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj" (OuterVolumeSpecName: "kube-api-access-nvvkj") pod "2787768b-6299-4a11-a990-4d436661b035" (UID: "2787768b-6299-4a11-a990-4d436661b035"). InnerVolumeSpecName "kube-api-access-nvvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.311461 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "2787768b-6299-4a11-a990-4d436661b035" (UID: "2787768b-6299-4a11-a990-4d436661b035"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.313735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory" (OuterVolumeSpecName: "inventory") pod "2787768b-6299-4a11-a990-4d436661b035" (UID: "2787768b-6299-4a11-a990-4d436661b035"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.315243 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "2787768b-6299-4a11-a990-4d436661b035" (UID: "2787768b-6299-4a11-a990-4d436661b035"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.393052 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.393307 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvvkj\" (UniqueName: \"kubernetes.io/projected/2787768b-6299-4a11-a990-4d436661b035-kube-api-access-nvvkj\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.393424 5030 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/2787768b-6299-4a11-a990-4d436661b035-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.393504 5030 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.393596 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2787768b-6299-4a11-a990-4d436661b035-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.947045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" event={"ID":"2787768b-6299-4a11-a990-4d436661b035","Type":"ContainerDied","Data":"6b9ef990bdc0f98997e5e37866c95c4d1b33c5a28215f3c53e48dab94ee419b5"} Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.947094 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9ef990bdc0f98997e5e37866c95c4d1b33c5a28215f3c53e48dab94ee419b5" Jan 21 00:09:14 crc kubenswrapper[5030]: I0121 00:09:14.947121 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.014191 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.014453 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85f7g" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="registry-server" containerID="cri-o://9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631" gracePeriod=2 Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.035303 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6"] Jan 21 00:09:15 crc kubenswrapper[5030]: E0121 00:09:15.035711 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="extract-utilities" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.035733 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="extract-utilities" Jan 21 00:09:15 crc kubenswrapper[5030]: E0121 00:09:15.035765 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="extract-content" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.035774 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="extract-content" Jan 21 00:09:15 crc kubenswrapper[5030]: E0121 00:09:15.035790 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="registry-server" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.035799 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="registry-server" Jan 21 00:09:15 crc kubenswrapper[5030]: E0121 00:09:15.035829 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2787768b-6299-4a11-a990-4d436661b035" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.035839 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2787768b-6299-4a11-a990-4d436661b035" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.036903 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2787768b-6299-4a11-a990-4d436661b035" containerName="ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.036970 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f880cab6-348f-406b-ab12-6053781eedce" containerName="registry-server" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.037983 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.055940 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.056071 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.056525 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.055956 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.056861 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-metadata-agent-neutron-config" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.056980 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-metadata-neutron-config" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.057067 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.071994 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6"] Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvxx\" (UniqueName: \"kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104588 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104659 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104688 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104710 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.104778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206208 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206270 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206362 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvxx\" (UniqueName: \"kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206477 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.206515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.212241 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.212523 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: 
\"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.212805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.213279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.215765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.215891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.226756 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.226864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvxx\" (UniqueName: \"kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx\") pod \"neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.384777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.468153 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.510352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities\") pod \"c077515c-ffe7-4674-acc5-6ede3db5e452\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.510418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q8t2\" (UniqueName: \"kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2\") pod \"c077515c-ffe7-4674-acc5-6ede3db5e452\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.510474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content\") pod \"c077515c-ffe7-4674-acc5-6ede3db5e452\" (UID: \"c077515c-ffe7-4674-acc5-6ede3db5e452\") " Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.511365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities" (OuterVolumeSpecName: "utilities") pod "c077515c-ffe7-4674-acc5-6ede3db5e452" (UID: "c077515c-ffe7-4674-acc5-6ede3db5e452"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.515615 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2" (OuterVolumeSpecName: "kube-api-access-4q8t2") pod "c077515c-ffe7-4674-acc5-6ede3db5e452" (UID: "c077515c-ffe7-4674-acc5-6ede3db5e452"). InnerVolumeSpecName "kube-api-access-4q8t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.617613 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.617685 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q8t2\" (UniqueName: \"kubernetes.io/projected/c077515c-ffe7-4674-acc5-6ede3db5e452-kube-api-access-4q8t2\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.628416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c077515c-ffe7-4674-acc5-6ede3db5e452" (UID: "c077515c-ffe7-4674-acc5-6ede3db5e452"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.645339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.719089 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c077515c-ffe7-4674-acc5-6ede3db5e452-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.967402 5030 generic.go:334] "Generic (PLEG): container finished" podID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerID="9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631" exitCode=0 Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.967492 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85f7g" Jan 21 00:09:15 crc kubenswrapper[5030]: W0121 00:09:15.968847 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb88677_5ad8_42e9_8d98_b3c15d80c448.slice/crio-bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de WatchSource:0}: Error finding container bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de: Status 404 returned error can't find the container with id bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.975245 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6"] Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.975280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerDied","Data":"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631"} Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.975306 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85f7g" event={"ID":"c077515c-ffe7-4674-acc5-6ede3db5e452","Type":"ContainerDied","Data":"13d803828118213a3cdcf3b93f5616f5e63ab052553968aa1aaa45e89ce0a578"} Jan 21 00:09:15 crc kubenswrapper[5030]: I0121 00:09:15.975329 5030 scope.go:117] "RemoveContainer" containerID="9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.016393 5030 scope.go:117] "RemoveContainer" containerID="18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.057690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.058006 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fv846" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="registry-server" containerID="cri-o://fef85f95d9352905e4aa39b823442e3681224682f150a380347d28a466215ced" gracePeriod=2 Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.076286 5030 scope.go:117] "RemoveContainer" containerID="cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.111960 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.118786 5030 scope.go:117] "RemoveContainer" containerID="9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631" Jan 21 00:09:16 crc kubenswrapper[5030]: E0121 00:09:16.122830 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631\": container with ID starting with 9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631 not found: ID does not exist" containerID="9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.122896 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631"} err="failed to get container status \"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631\": rpc error: code = NotFound desc = could not find container \"9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631\": container with ID starting with 9820f5fb0b6f3412d2264859f445257d8919deaf384d4446b3f8ef3e36caa631 not found: ID does not exist" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.122929 5030 scope.go:117] "RemoveContainer" containerID="18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42" Jan 21 00:09:16 crc kubenswrapper[5030]: E0121 00:09:16.134824 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42\": container with ID starting with 18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42 not found: ID does not exist" containerID="18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.134902 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42"} err="failed to get container status \"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42\": rpc error: code = NotFound desc = could not find container \"18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42\": container with ID starting with 18fb6c51e628c99ea0edf87ce3a99aff977c4baa976713d12773b6e8ce018b42 not found: ID does not exist" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.134934 5030 scope.go:117] "RemoveContainer" containerID="cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.136328 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85f7g"] Jan 21 00:09:16 crc kubenswrapper[5030]: E0121 00:09:16.138776 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9\": container with ID starting with cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9 not found: ID does not exist" containerID="cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.138837 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9"} err="failed to get container status \"cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9\": rpc error: code = NotFound desc = could not find container \"cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9\": container with ID starting with cd147a5ff6098f428cc68988baa56501a1b302ac3d463655b297222c007295d9 not found: ID does not exist" Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.979432 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" event={"ID":"3eb88677-5ad8-42e9-8d98-b3c15d80c448","Type":"ContainerStarted","Data":"07a3eb05f554cd9db9ec72f8a584308cda93ea2b3337655bb7dbd9b74dd1130d"} Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.981810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" event={"ID":"3eb88677-5ad8-42e9-8d98-b3c15d80c448","Type":"ContainerStarted","Data":"bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de"} Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.988599 5030 generic.go:334] "Generic (PLEG): container finished" podID="58bea95b-6632-463b-8ff9-24a85f213dde" containerID="fef85f95d9352905e4aa39b823442e3681224682f150a380347d28a466215ced" exitCode=0 Jan 21 00:09:16 crc kubenswrapper[5030]: I0121 00:09:16.988690 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerDied","Data":"fef85f95d9352905e4aa39b823442e3681224682f150a380347d28a466215ced"} Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.002708 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" podStartSLOduration=1.481137043 podStartE2EDuration="2.002688838s" podCreationTimestamp="2026-01-21 00:09:15 +0000 UTC" firstStartedPulling="2026-01-21 00:09:15.975263636 +0000 UTC m=+5628.295523924" lastFinishedPulling="2026-01-21 00:09:16.496815431 +0000 UTC m=+5628.817075719" observedRunningTime="2026-01-21 00:09:16.99867085 +0000 UTC m=+5629.318931158" watchObservedRunningTime="2026-01-21 00:09:17.002688838 +0000 UTC m=+5629.322949126" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.035082 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.142798 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkrt8\" (UniqueName: \"kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8\") pod \"58bea95b-6632-463b-8ff9-24a85f213dde\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.142929 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities\") pod \"58bea95b-6632-463b-8ff9-24a85f213dde\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.143049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content\") pod \"58bea95b-6632-463b-8ff9-24a85f213dde\" (UID: \"58bea95b-6632-463b-8ff9-24a85f213dde\") " Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.145546 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities" (OuterVolumeSpecName: "utilities") pod "58bea95b-6632-463b-8ff9-24a85f213dde" (UID: "58bea95b-6632-463b-8ff9-24a85f213dde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.148745 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8" (OuterVolumeSpecName: "kube-api-access-jkrt8") pod "58bea95b-6632-463b-8ff9-24a85f213dde" (UID: "58bea95b-6632-463b-8ff9-24a85f213dde"). InnerVolumeSpecName "kube-api-access-jkrt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.164076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58bea95b-6632-463b-8ff9-24a85f213dde" (UID: "58bea95b-6632-463b-8ff9-24a85f213dde"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.245339 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkrt8\" (UniqueName: \"kubernetes.io/projected/58bea95b-6632-463b-8ff9-24a85f213dde-kube-api-access-jkrt8\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.245379 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.245393 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58bea95b-6632-463b-8ff9-24a85f213dde-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.973511 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" path="/var/lib/kubelet/pods/c077515c-ffe7-4674-acc5-6ede3db5e452/volumes" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.999328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv846" event={"ID":"58bea95b-6632-463b-8ff9-24a85f213dde","Type":"ContainerDied","Data":"50d80229f9d6786d308f83d20e80bc964cb49f4d6cf800bce640a2f53c804bc8"} Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.999352 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv846" Jan 21 00:09:17 crc kubenswrapper[5030]: I0121 00:09:17.999385 5030 scope.go:117] "RemoveContainer" containerID="fef85f95d9352905e4aa39b823442e3681224682f150a380347d28a466215ced" Jan 21 00:09:18 crc kubenswrapper[5030]: I0121 00:09:18.035901 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 21 00:09:18 crc kubenswrapper[5030]: I0121 00:09:18.043016 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv846"] Jan 21 00:09:18 crc kubenswrapper[5030]: I0121 00:09:18.086144 5030 scope.go:117] "RemoveContainer" containerID="891f24e0df6c930a7d936beacdc4e4f1f368ffaf6e5fc0c599677bc0f64fe044" Jan 21 00:09:18 crc kubenswrapper[5030]: I0121 00:09:18.106107 5030 scope.go:117] "RemoveContainer" containerID="73be9cf8f1339b142151c1171c75060eb7944d2f49dffe59b4dc40eb865bd793" Jan 21 00:09:19 crc kubenswrapper[5030]: I0121 00:09:19.011930 5030 generic.go:334] "Generic (PLEG): container finished" podID="3eb88677-5ad8-42e9-8d98-b3c15d80c448" containerID="07a3eb05f554cd9db9ec72f8a584308cda93ea2b3337655bb7dbd9b74dd1130d" exitCode=0 Jan 21 00:09:19 crc kubenswrapper[5030]: I0121 00:09:19.011961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" event={"ID":"3eb88677-5ad8-42e9-8d98-b3c15d80c448","Type":"ContainerDied","Data":"07a3eb05f554cd9db9ec72f8a584308cda93ea2b3337655bb7dbd9b74dd1130d"} Jan 21 00:09:19 crc kubenswrapper[5030]: I0121 00:09:19.980893 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" path="/var/lib/kubelet/pods/58bea95b-6632-463b-8ff9-24a85f213dde/volumes" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.285091 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386008 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mvxx\" (UniqueName: \"kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386156 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.386444 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2\") pod \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\" (UID: \"3eb88677-5ad8-42e9-8d98-b3c15d80c448\") " Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.392159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx" (OuterVolumeSpecName: "kube-api-access-8mvxx") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). 
InnerVolumeSpecName "kube-api-access-8mvxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.396728 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.410016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.411117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.411150 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1" (OuterVolumeSpecName: "nova-metadata-neutron-config-1") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "nova-metadata-neutron-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.411178 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory" (OuterVolumeSpecName: "inventory") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.412260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2" (OuterVolumeSpecName: "nova-metadata-neutron-config-2") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "nova-metadata-neutron-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.426498 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "3eb88677-5ad8-42e9-8d98-b3c15d80c448" (UID: "3eb88677-5ad8-42e9-8d98-b3c15d80c448"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.487847 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-2\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488123 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mvxx\" (UniqueName: \"kubernetes.io/projected/3eb88677-5ad8-42e9-8d98-b3c15d80c448-kube-api-access-8mvxx\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488200 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488281 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488341 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488396 5030 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-1\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-nova-metadata-neutron-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488449 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:20 crc kubenswrapper[5030]: I0121 00:09:20.488503 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3eb88677-5ad8-42e9-8d98-b3c15d80c448-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.030214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" event={"ID":"3eb88677-5ad8-42e9-8d98-b3c15d80c448","Type":"ContainerDied","Data":"bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de"} Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.030262 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf59cceebb39674b9bdbeecfb954707aca8598e4d2dd51e00fd3ceea9d7e19de" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.031450 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.103566 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc"] Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104165 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="extract-content" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104195 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="extract-content" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104227 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="extract-utilities" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104236 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="extract-utilities" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104255 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb88677-5ad8-42e9-8d98-b3c15d80c448" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104264 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb88677-5ad8-42e9-8d98-b3c15d80c448" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104278 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="extract-content" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104285 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="extract-content" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104303 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="extract-utilities" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104310 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="extract-utilities" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104323 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104331 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: E0121 00:09:21.104348 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104355 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104557 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c077515c-ffe7-4674-acc5-6ede3db5e452" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104574 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3eb88677-5ad8-42e9-8d98-b3c15d80c448" containerName="neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.104589 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="58bea95b-6632-463b-8ff9-24a85f213dde" containerName="registry-server" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.105194 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.107684 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.107871 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.108012 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.108363 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-ovn-agent-neutron-config" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.109014 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.109970 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.124901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc"] Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.197855 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vglr\" (UniqueName: \"kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.198263 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.198471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.198543 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.198639 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.299366 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.299414 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.299446 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.299490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vglr\" (UniqueName: \"kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.299552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.305438 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: 
\"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.305946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.306346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.318489 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.328641 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vglr\" (UniqueName: \"kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr\") pod \"neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.422140 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:21 crc kubenswrapper[5030]: I0121 00:09:21.900632 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc"] Jan 21 00:09:22 crc kubenswrapper[5030]: I0121 00:09:22.041794 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" event={"ID":"47f0cf25-98ae-4e27-a478-d6953938b014","Type":"ContainerStarted","Data":"b2efdefbd6a52446a5dfba0e4c2aadb12f899c7b6f765515dfb2585c227cc43c"} Jan 21 00:09:23 crc kubenswrapper[5030]: I0121 00:09:23.051921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" event={"ID":"47f0cf25-98ae-4e27-a478-d6953938b014","Type":"ContainerStarted","Data":"e0ed5fefbb791137c5aa9809673b42ec2898f058c02141144c8d75cb153943bb"} Jan 21 00:09:23 crc kubenswrapper[5030]: I0121 00:09:23.101275 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" podStartSLOduration=1.314938733 podStartE2EDuration="2.101249719s" podCreationTimestamp="2026-01-21 00:09:21 +0000 UTC" firstStartedPulling="2026-01-21 00:09:21.906206323 +0000 UTC m=+5634.226466621" lastFinishedPulling="2026-01-21 00:09:22.692517319 +0000 UTC m=+5635.012777607" observedRunningTime="2026-01-21 00:09:23.070548305 +0000 UTC m=+5635.390808603" watchObservedRunningTime="2026-01-21 00:09:23.101249719 +0000 UTC m=+5635.421509997" Jan 21 00:09:25 crc kubenswrapper[5030]: I0121 00:09:25.071939 5030 generic.go:334] "Generic (PLEG): container finished" podID="47f0cf25-98ae-4e27-a478-d6953938b014" containerID="e0ed5fefbb791137c5aa9809673b42ec2898f058c02141144c8d75cb153943bb" exitCode=0 Jan 21 00:09:25 crc kubenswrapper[5030]: I0121 00:09:25.071991 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" event={"ID":"47f0cf25-98ae-4e27-a478-d6953938b014","Type":"ContainerDied","Data":"e0ed5fefbb791137c5aa9809673b42ec2898f058c02141144c8d75cb153943bb"} Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.351296 5030 scope.go:117] "RemoveContainer" containerID="21057c50bf3441617fadd94c896c6a5cac3e9ac85b073879e5c2245b37717ab8" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.364553 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.381461 5030 scope.go:117] "RemoveContainer" containerID="e4ac8dd6948d0ea55265f9098316b293c94e72d19e89441a23b08995c34cde72" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.414051 5030 scope.go:117] "RemoveContainer" containerID="4deddf2d60cbd5a1d761a5715a9fb78ae7235375fa933bcf8f4675c23125d31e" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.457060 5030 scope.go:117] "RemoveContainer" containerID="cc4c2de831173441106ea6dd1fd1c7911b0707c11132dd7aa40e8b8cc79474bc" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.473874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes\") pod \"47f0cf25-98ae-4e27-a478-d6953938b014\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.473979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle\") pod \"47f0cf25-98ae-4e27-a478-d6953938b014\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.474126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory\") pod \"47f0cf25-98ae-4e27-a478-d6953938b014\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.474167 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vglr\" (UniqueName: \"kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr\") pod \"47f0cf25-98ae-4e27-a478-d6953938b014\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.474199 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0\") pod \"47f0cf25-98ae-4e27-a478-d6953938b014\" (UID: \"47f0cf25-98ae-4e27-a478-d6953938b014\") " Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.479499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr" (OuterVolumeSpecName: "kube-api-access-4vglr") pod "47f0cf25-98ae-4e27-a478-d6953938b014" (UID: "47f0cf25-98ae-4e27-a478-d6953938b014"). InnerVolumeSpecName "kube-api-access-4vglr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.479506 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle" (OuterVolumeSpecName: "neutron-ovn-combined-ca-bundle") pod "47f0cf25-98ae-4e27-a478-d6953938b014" (UID: "47f0cf25-98ae-4e27-a478-d6953938b014"). InnerVolumeSpecName "neutron-ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.483582 5030 scope.go:117] "RemoveContainer" containerID="d534c8ea94b95c2c94ff4109e16cc8f97d0f06c4118d05fe469a56cad543a368" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.496197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory" (OuterVolumeSpecName: "inventory") pod "47f0cf25-98ae-4e27-a478-d6953938b014" (UID: "47f0cf25-98ae-4e27-a478-d6953938b014"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.499166 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "47f0cf25-98ae-4e27-a478-d6953938b014" (UID: "47f0cf25-98ae-4e27-a478-d6953938b014"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.499953 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-agent-neutron-config-0") pod "47f0cf25-98ae-4e27-a478-d6953938b014" (UID: "47f0cf25-98ae-4e27-a478-d6953938b014"). InnerVolumeSpecName "neutron-ovn-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.506680 5030 scope.go:117] "RemoveContainer" containerID="e5b77e31575aea46b2642fe181515b8e3096d8e55836850fec4cb8f05c24832b" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.530911 5030 scope.go:117] "RemoveContainer" containerID="a52f73fa0e272c089051843538b533563c54e09f5ff5207ac2d758d5e49e7cac" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.552725 5030 scope.go:117] "RemoveContainer" containerID="1b3ba403d5e893e33075b3c3e2c708f162680bd74867f03fa13f7e625e25754d" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.575276 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.575316 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vglr\" (UniqueName: \"kubernetes.io/projected/47f0cf25-98ae-4e27-a478-d6953938b014-kube-api-access-4vglr\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.575328 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.575342 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.575356 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/47f0cf25-98ae-4e27-a478-d6953938b014-neutron-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:26 crc kubenswrapper[5030]: I0121 00:09:26.580093 5030 scope.go:117] "RemoveContainer" containerID="bf1a5dc9da4f6f22aa1c0bce914a19fa47df1650802843f948da6a0a5a55d884" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.094287 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" event={"ID":"47f0cf25-98ae-4e27-a478-d6953938b014","Type":"ContainerDied","Data":"b2efdefbd6a52446a5dfba0e4c2aadb12f899c7b6f765515dfb2585c227cc43c"} Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.094334 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2efdefbd6a52446a5dfba0e4c2aadb12f899c7b6f765515dfb2585c227cc43c" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.094393 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.446031 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z"] Jan 21 00:09:27 crc kubenswrapper[5030]: E0121 00:09:27.446656 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f0cf25-98ae-4e27-a478-d6953938b014" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.446672 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f0cf25-98ae-4e27-a478-d6953938b014" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.446830 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f0cf25-98ae-4e27-a478-d6953938b014" containerName="neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.447306 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.450225 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.452342 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.452509 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.452680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-sriov-agent-neutron-config" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.452785 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.455060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.467762 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z"] Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.635523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.635575 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.635605 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.635653 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.635727 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbcx\" (UniqueName: \"kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.737246 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.738097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.738133 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.738158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.738218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbcx\" (UniqueName: \"kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.741166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.741206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.749131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.753088 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.754191 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbcx\" (UniqueName: \"kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx\") pod \"neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:27 crc kubenswrapper[5030]: I0121 00:09:27.764270 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:28 crc kubenswrapper[5030]: I0121 00:09:28.204937 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z"] Jan 21 00:09:29 crc kubenswrapper[5030]: I0121 00:09:29.112496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" event={"ID":"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75","Type":"ContainerStarted","Data":"a46fda98e57748f10eff1ce2aa138efaec9232c86f690fd035415de04574a92a"} Jan 21 00:09:30 crc kubenswrapper[5030]: I0121 00:09:30.624528 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:31 crc kubenswrapper[5030]: I0121 00:09:31.137307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" event={"ID":"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75","Type":"ContainerStarted","Data":"b39773b4420068e26289e2e633b7c381c5ccf26a1ec7464ff2e04f7c962bfa77"} Jan 21 00:09:31 crc kubenswrapper[5030]: I0121 00:09:31.160024 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" podStartSLOduration=1.751904311 podStartE2EDuration="4.160005501s" podCreationTimestamp="2026-01-21 00:09:27 +0000 UTC" firstStartedPulling="2026-01-21 00:09:28.214565532 +0000 UTC m=+5640.534825830" lastFinishedPulling="2026-01-21 00:09:30.622666732 +0000 UTC m=+5642.942927020" observedRunningTime="2026-01-21 00:09:31.15584796 +0000 UTC m=+5643.476108288" watchObservedRunningTime="2026-01-21 00:09:31.160005501 +0000 UTC m=+5643.480265789" Jan 21 00:09:33 crc kubenswrapper[5030]: I0121 00:09:33.167156 5030 generic.go:334] "Generic (PLEG): container finished" podID="c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" containerID="b39773b4420068e26289e2e633b7c381c5ccf26a1ec7464ff2e04f7c962bfa77" exitCode=0 Jan 21 00:09:33 crc kubenswrapper[5030]: I0121 00:09:33.167256 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" event={"ID":"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75","Type":"ContainerDied","Data":"b39773b4420068e26289e2e633b7c381c5ccf26a1ec7464ff2e04f7c962bfa77"} Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.485089 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.564715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzbcx\" (UniqueName: \"kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx\") pod \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.564867 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory\") pod \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.564984 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle\") pod \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.565019 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes\") pod \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.565078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0\") pod \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\" (UID: \"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75\") " Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.571085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx" (OuterVolumeSpecName: "kube-api-access-gzbcx") pod "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" (UID: "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75"). InnerVolumeSpecName "kube-api-access-gzbcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.571245 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" (UID: "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.587010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory" (OuterVolumeSpecName: "inventory") pod "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" (UID: "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.591442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" (UID: "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.592943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" (UID: "c14a0bb0-fad1-4d7f-bd3d-06f24af42d75"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.666898 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.666961 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.666978 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.666999 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzbcx\" (UniqueName: \"kubernetes.io/projected/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-kube-api-access-gzbcx\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:34 crc kubenswrapper[5030]: I0121 00:09:34.667014 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.186451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" event={"ID":"c14a0bb0-fad1-4d7f-bd3d-06f24af42d75","Type":"ContainerDied","Data":"a46fda98e57748f10eff1ce2aa138efaec9232c86f690fd035415de04574a92a"} Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.186877 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46fda98e57748f10eff1ce2aa138efaec9232c86f690fd035415de04574a92a" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.186950 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.262054 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh"] Jan 21 00:09:35 crc kubenswrapper[5030]: E0121 00:09:35.262388 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.262409 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.262578 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" containerName="neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.263113 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.266155 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.266924 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.267481 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.267596 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.268937 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.270241 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"neutron-dhcp-agent-neutron-config" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.272174 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh"] Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.378194 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-css97\" (UniqueName: \"kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.378335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " 
pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.378448 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.378692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.378787 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.480476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.480539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.480577 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-css97\" (UniqueName: \"kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.480608 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc 
kubenswrapper[5030]: I0121 00:09:35.480663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.484919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.485033 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.485058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.488984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.508374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-css97\" (UniqueName: \"kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97\") pod \"neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:35 crc kubenswrapper[5030]: I0121 00:09:35.580473 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:36 crc kubenswrapper[5030]: I0121 00:09:36.010738 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh"] Jan 21 00:09:36 crc kubenswrapper[5030]: I0121 00:09:36.196013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" event={"ID":"c8dc50b1-bd1b-4da8-aae6-9776d1e41237","Type":"ContainerStarted","Data":"c9d8f48d48686fc18b2f6fae3832bd7fc16e6fb8448e21910531d05a095674b8"} Jan 21 00:09:38 crc kubenswrapper[5030]: I0121 00:09:38.214879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" event={"ID":"c8dc50b1-bd1b-4da8-aae6-9776d1e41237","Type":"ContainerStarted","Data":"30521d269776a85fe20908b38ff56ff1cc6ea0771a52f32b9b6db57321f4327a"} Jan 21 00:09:38 crc kubenswrapper[5030]: I0121 00:09:38.242033 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" podStartSLOduration=2.024434566 podStartE2EDuration="3.242017309s" podCreationTimestamp="2026-01-21 00:09:35 +0000 UTC" firstStartedPulling="2026-01-21 00:09:36.014532739 +0000 UTC m=+5648.334793027" lastFinishedPulling="2026-01-21 00:09:37.232115482 +0000 UTC m=+5649.552375770" observedRunningTime="2026-01-21 00:09:38.235732446 +0000 UTC m=+5650.555992734" watchObservedRunningTime="2026-01-21 00:09:38.242017309 +0000 UTC m=+5650.562277597" Jan 21 00:09:39 crc kubenswrapper[5030]: I0121 00:09:39.225819 5030 generic.go:334] "Generic (PLEG): container finished" podID="c8dc50b1-bd1b-4da8-aae6-9776d1e41237" containerID="30521d269776a85fe20908b38ff56ff1cc6ea0771a52f32b9b6db57321f4327a" exitCode=0 Jan 21 00:09:39 crc kubenswrapper[5030]: I0121 00:09:39.225870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" event={"ID":"c8dc50b1-bd1b-4da8-aae6-9776d1e41237","Type":"ContainerDied","Data":"30521d269776a85fe20908b38ff56ff1cc6ea0771a52f32b9b6db57321f4327a"} Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.567822 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.667704 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle\") pod \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.667776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-css97\" (UniqueName: \"kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97\") pod \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.667824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory\") pod \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.667873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes\") pod \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.667973 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0\") pod \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\" (UID: \"c8dc50b1-bd1b-4da8-aae6-9776d1e41237\") " Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.673395 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "c8dc50b1-bd1b-4da8-aae6-9776d1e41237" (UID: "c8dc50b1-bd1b-4da8-aae6-9776d1e41237"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.673735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97" (OuterVolumeSpecName: "kube-api-access-css97") pod "c8dc50b1-bd1b-4da8-aae6-9776d1e41237" (UID: "c8dc50b1-bd1b-4da8-aae6-9776d1e41237"). InnerVolumeSpecName "kube-api-access-css97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.693330 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "c8dc50b1-bd1b-4da8-aae6-9776d1e41237" (UID: "c8dc50b1-bd1b-4da8-aae6-9776d1e41237"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.693865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory" (OuterVolumeSpecName: "inventory") pod "c8dc50b1-bd1b-4da8-aae6-9776d1e41237" (UID: "c8dc50b1-bd1b-4da8-aae6-9776d1e41237"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.698553 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c8dc50b1-bd1b-4da8-aae6-9776d1e41237" (UID: "c8dc50b1-bd1b-4da8-aae6-9776d1e41237"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.769612 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.769713 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.769726 5030 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.769742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-css97\" (UniqueName: \"kubernetes.io/projected/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-kube-api-access-css97\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:40 crc kubenswrapper[5030]: I0121 00:09:40.769754 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8dc50b1-bd1b-4da8-aae6-9776d1e41237-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.075048 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9"] Jan 21 00:09:41 crc kubenswrapper[5030]: E0121 00:09:41.075482 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dc50b1-bd1b-4da8-aae6-9776d1e41237" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.075506 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dc50b1-bd1b-4da8-aae6-9776d1e41237" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.075714 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8dc50b1-bd1b-4da8-aae6-9776d1e41237" containerName="neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.076295 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.078504 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"libvirt-secret" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.085977 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9"] Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.175682 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.175786 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.175824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjg9\" (UniqueName: \"kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.175908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.175944 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.246473 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" event={"ID":"c8dc50b1-bd1b-4da8-aae6-9776d1e41237","Type":"ContainerDied","Data":"c9d8f48d48686fc18b2f6fae3832bd7fc16e6fb8448e21910531d05a095674b8"} Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.246805 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9d8f48d48686fc18b2f6fae3832bd7fc16e6fb8448e21910531d05a095674b8" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.246551 5030 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.277773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.278118 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjg9\" (UniqueName: \"kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.278247 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.278339 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.278440 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.282498 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.282500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.282760 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.283562 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.299657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjg9\" (UniqueName: \"kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9\") pod \"libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.393200 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:41 crc kubenswrapper[5030]: I0121 00:09:41.828008 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9"] Jan 21 00:09:41 crc kubenswrapper[5030]: W0121 00:09:41.832830 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28cf46b6_f08b_4182_a7b9_024347e4ab97.slice/crio-18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b WatchSource:0}: Error finding container 18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b: Status 404 returned error can't find the container with id 18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b Jan 21 00:09:42 crc kubenswrapper[5030]: I0121 00:09:42.254778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" event={"ID":"28cf46b6-f08b-4182-a7b9-024347e4ab97","Type":"ContainerStarted","Data":"18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b"} Jan 21 00:09:45 crc kubenswrapper[5030]: I0121 00:09:45.277746 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" event={"ID":"28cf46b6-f08b-4182-a7b9-024347e4ab97","Type":"ContainerStarted","Data":"32943b0f86d22d82b455cd36d24f1bf4c6589515dd381e4fee0072632633c544"} Jan 21 00:09:45 crc kubenswrapper[5030]: I0121 00:09:45.297246 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" podStartSLOduration=1.6312931449999999 podStartE2EDuration="4.297225785s" podCreationTimestamp="2026-01-21 00:09:41 +0000 UTC" firstStartedPulling="2026-01-21 00:09:41.834394803 +0000 UTC m=+5654.154655091" lastFinishedPulling="2026-01-21 00:09:44.500327453 +0000 UTC m=+5656.820587731" observedRunningTime="2026-01-21 00:09:45.293009753 +0000 UTC m=+5657.613270041" watchObservedRunningTime="2026-01-21 00:09:45.297225785 +0000 UTC m=+5657.617486073" Jan 21 00:09:46 crc 
kubenswrapper[5030]: I0121 00:09:46.287218 5030 generic.go:334] "Generic (PLEG): container finished" podID="28cf46b6-f08b-4182-a7b9-024347e4ab97" containerID="32943b0f86d22d82b455cd36d24f1bf4c6589515dd381e4fee0072632633c544" exitCode=0 Jan 21 00:09:46 crc kubenswrapper[5030]: I0121 00:09:46.287323 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" event={"ID":"28cf46b6-f08b-4182-a7b9-024347e4ab97","Type":"ContainerDied","Data":"32943b0f86d22d82b455cd36d24f1bf4c6589515dd381e4fee0072632633c544"} Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.706473 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.786150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle\") pod \"28cf46b6-f08b-4182-a7b9-024347e4ab97\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.786193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes\") pod \"28cf46b6-f08b-4182-a7b9-024347e4ab97\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.786232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory\") pod \"28cf46b6-f08b-4182-a7b9-024347e4ab97\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.786270 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncjg9\" (UniqueName: \"kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9\") pod \"28cf46b6-f08b-4182-a7b9-024347e4ab97\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.786355 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0\") pod \"28cf46b6-f08b-4182-a7b9-024347e4ab97\" (UID: \"28cf46b6-f08b-4182-a7b9-024347e4ab97\") " Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.795856 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "28cf46b6-f08b-4182-a7b9-024347e4ab97" (UID: "28cf46b6-f08b-4182-a7b9-024347e4ab97"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.795887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9" (OuterVolumeSpecName: "kube-api-access-ncjg9") pod "28cf46b6-f08b-4182-a7b9-024347e4ab97" (UID: "28cf46b6-f08b-4182-a7b9-024347e4ab97"). InnerVolumeSpecName "kube-api-access-ncjg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.827791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory" (OuterVolumeSpecName: "inventory") pod "28cf46b6-f08b-4182-a7b9-024347e4ab97" (UID: "28cf46b6-f08b-4182-a7b9-024347e4ab97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.827934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "28cf46b6-f08b-4182-a7b9-024347e4ab97" (UID: "28cf46b6-f08b-4182-a7b9-024347e4ab97"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.828295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "28cf46b6-f08b-4182-a7b9-024347e4ab97" (UID: "28cf46b6-f08b-4182-a7b9-024347e4ab97"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.887512 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.887559 5030 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.887572 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.887582 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28cf46b6-f08b-4182-a7b9-024347e4ab97-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:47 crc kubenswrapper[5030]: I0121 00:09:47.887596 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncjg9\" (UniqueName: \"kubernetes.io/projected/28cf46b6-f08b-4182-a7b9-024347e4ab97-kube-api-access-ncjg9\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.305518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" event={"ID":"28cf46b6-f08b-4182-a7b9-024347e4ab97","Type":"ContainerDied","Data":"18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b"} Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.305570 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d45899959ac36fe2c4d746269b65c7a499fd1629bc673705cb5afbf942c00b" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.305964 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.373486 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8"] Jan 21 00:09:48 crc kubenswrapper[5030]: E0121 00:09:48.374176 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cf46b6-f08b-4182-a7b9-024347e4ab97" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.374198 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cf46b6-f08b-4182-a7b9-024347e4ab97" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.374360 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cf46b6-f08b-4182-a7b9-024347e4ab97" containerName="libvirt-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.375045 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.377956 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.378133 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-migration-ssh-key" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.378179 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.378143 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.378334 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.380102 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"nova-cell1-compute-config" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.380930 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-8qnk2" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.390726 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8"] Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423577 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle\") pod 
\"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423750 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423805 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423823 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.423850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qj22\" (UniqueName: \"kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.525581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc 
kubenswrapper[5030]: I0121 00:09:48.525990 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526116 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qj22\" (UniqueName: \"kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526533 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526695 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.526849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.529982 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: 
\"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.530024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.531243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.531277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.531384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.531663 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.532407 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.541824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qj22\" (UniqueName: \"kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22\") pod \"nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:48 crc kubenswrapper[5030]: I0121 00:09:48.695448 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:49 crc kubenswrapper[5030]: I0121 00:09:49.144459 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8"] Jan 21 00:09:49 crc kubenswrapper[5030]: W0121 00:09:49.147577 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a200254_a58d_405a_b7d8_46b961fa72ee.slice/crio-4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657 WatchSource:0}: Error finding container 4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657: Status 404 returned error can't find the container with id 4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657 Jan 21 00:09:49 crc kubenswrapper[5030]: I0121 00:09:49.314031 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" event={"ID":"1a200254-a58d-405a-b7d8-46b961fa72ee","Type":"ContainerStarted","Data":"4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657"} Jan 21 00:09:52 crc kubenswrapper[5030]: I0121 00:09:52.349485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" event={"ID":"1a200254-a58d-405a-b7d8-46b961fa72ee","Type":"ContainerStarted","Data":"2e73c302e23b021531eea2bbe45fd7a8ceb15c63e6d12ce0821b3f9462c3900d"} Jan 21 00:09:55 crc kubenswrapper[5030]: I0121 00:09:55.379068 5030 generic.go:334] "Generic (PLEG): container finished" podID="1a200254-a58d-405a-b7d8-46b961fa72ee" containerID="2e73c302e23b021531eea2bbe45fd7a8ceb15c63e6d12ce0821b3f9462c3900d" exitCode=0 Jan 21 00:09:55 crc kubenswrapper[5030]: I0121 00:09:55.379129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" event={"ID":"1a200254-a58d-405a-b7d8-46b961fa72ee","Type":"ContainerDied","Data":"2e73c302e23b021531eea2bbe45fd7a8ceb15c63e6d12ce0821b3f9462c3900d"} Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.824788 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956206 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956314 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956352 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956466 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qj22\" (UniqueName: \"kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956519 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.956634 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle\") pod \"1a200254-a58d-405a-b7d8-46b961fa72ee\" (UID: \"1a200254-a58d-405a-b7d8-46b961fa72ee\") " Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.966544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.969940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22" (OuterVolumeSpecName: "kube-api-access-8qj22") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "kube-api-access-8qj22". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.984840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.987082 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory" (OuterVolumeSpecName: "inventory") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.997202 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:56 crc kubenswrapper[5030]: I0121 00:09:56.997258 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.006159 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.009518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1a200254-a58d-405a-b7d8-46b961fa72ee" (UID: "1a200254-a58d-405a-b7d8-46b961fa72ee"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.058864 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.058922 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059039 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059055 5030 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059069 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qj22\" (UniqueName: \"kubernetes.io/projected/1a200254-a58d-405a-b7d8-46b961fa72ee-kube-api-access-8qj22\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059080 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059091 5030 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.059106 5030 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a200254-a58d-405a-b7d8-46b961fa72ee-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.398225 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" event={"ID":"1a200254-a58d-405a-b7d8-46b961fa72ee","Type":"ContainerDied","Data":"4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657"} Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.398461 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb0943d5b9433e86f9a70ad3725b22dc75640752b155260e6607a2feb553657" Jan 21 00:09:57 crc kubenswrapper[5030]: I0121 00:09:57.398308 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.452812 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.463870 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-multinodeset-edpm-compute-no-nodes-7ltfc"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.474188 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.483256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.496168 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.503025 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-multinodeset-edpm-compute-no-nodes-p86n8"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.516935 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.523745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.533152 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-multinodeset-edpm-compute-no-nodes-zt7m5"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.539963 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-no-nodes-r82gl"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.549427 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-beta-nodesetpqvkq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.556177 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-multinodeset-edpm-compute-beta-nodeset-c8rc8"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.562800 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.569229 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.580465 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.588091 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.595604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.603312 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-multinodeset-edpm-compute-no-nodes-sjqlq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.610362 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-multinodeset-edpm-compute-no-nodes-jrpdr"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.617877 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-multinodeset-edpm-compute-no-nodes-wv4gt"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.624737 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.632165 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.639168 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.646795 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-multinodeset-edpm-compute-no-nodes-k2g5z"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.655461 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-multinodeset-edpm-compute-no-nodes-4bwc6"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.662724 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.669771 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-multinodeset-edpm-compute-no-nodes-5p47t"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.676439 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-multinodeset-edpm-compute-no-nodes-5txqn"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.685105 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.695156 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-multinodeset-edpm-compute-no-nodes-5fxdh"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.710012 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.710066 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-multinodeset-edpm-compute-no-nodes-scvhz"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.710084 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-multinodeset-edpm-compute-no-nodes-5c2tw"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.717176 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/libvirt-edpm-multinodeset-edpm-compute-no-nodes-97gw9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.742587 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.746945 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-svc-edpm-compute-no-nodes-ovrd-edpm-compute-no-nodesglr4"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.765257 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.774303 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-updated-ovn-cm-edpm-compute-no-nd6hsm"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.786810 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.793712 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.799347 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.805420 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.811078 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.816728 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.822493 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.829763 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.833106 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/bootstrap-edpm-compute-no-nodes-edpm-compute-no-nodes-mjp2d"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.838211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.843248 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-os-edpm-compute-no-nodes-edpm-compute-no-nodes-5dkzs"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.848810 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/run-os-edpm-compute-no-nodes-edpm-compute-no-nodes-jt2xj"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.854740 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.860156 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.865393 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/libvirt-edpm-compute-no-nodes-edpm-compute-no-nodes-cbqbq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.870305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.875231 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-dhcp-edpm-compute-no-nodes-edpm-compute-no-nodes-jwj26"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.879939 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.885024 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-sriov-edpm-compute-no-nodes-edpm-compute-no-nodes-2rxxk"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.890411 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-edpm-compute-no-nodes-edpm-compute-no-nodes-z5gnq"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.896077 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.901652 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.906745 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-os-edpm-compute-no-nodes-edpm-compute-no-nodes-xzqgg"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.912477 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-kwx67"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.917383 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/neutron-metadata-edpm-compute-no-nodes-edpm-compute-no-nod7g6f9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.923255 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/ovn-edpm-compute-no-nodes-edpm-compute-no-nodes-9skfc"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.929050 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/nova-edpm-compute-no-nodes-edpm-compute-no-nodes-dbwnl"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.932662 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/configure-network-edpm-compute-no-nodes-edpm-compute-no-no4wr6q"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.937262 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/download-cache-edpm-compute-no-nodes-edpm-compute-no-nodesqdmw5"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.942915 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-kuttl-tests/validate-network-edpm-compute-no-nodes-edpm-compute-no-nodgzmw9"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.947983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449"] Jan 21 00:09:59 crc kubenswrapper[5030]: E0121 00:09:59.948322 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a200254-a58d-405a-b7d8-46b961fa72ee" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.948341 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a200254-a58d-405a-b7d8-46b961fa72ee" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.948475 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a200254-a58d-405a-b7d8-46b961fa72ee" containerName="nova-edpm-multinodeset-edpm-compute-no-nodes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.949242 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.952692 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.957408 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.962892 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.964763 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:09:59 crc kubenswrapper[5030]: E0121 00:09:59.967002 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc edpm-compute-no-nodes kube-api-access-f9r75], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" podUID="658ea176-5943-44a6-aa3b-1ec684ae1d82" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.979174 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0255ce78-12a6-4072-b45e-5eafad612f0b" path="/var/lib/kubelet/pods/0255ce78-12a6-4072-b45e-5eafad612f0b/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.987189 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb49995-af5d-457a-9f76-67eae8b2a408" path="/var/lib/kubelet/pods/0bb49995-af5d-457a-9f76-67eae8b2a408/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.988044 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eaa38f0-4707-4be1-a3ca-b343dff5e36a" path="/var/lib/kubelet/pods/0eaa38f0-4707-4be1-a3ca-b343dff5e36a/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.989089 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103c5df9-e30f-4d80-96fd-75cfcc07ce18" path="/var/lib/kubelet/pods/103c5df9-e30f-4d80-96fd-75cfcc07ce18/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.990796 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d6caab-b199-4c48-ba30-6b3a23a2f921" path="/var/lib/kubelet/pods/10d6caab-b199-4c48-ba30-6b3a23a2f921/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.991868 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a200254-a58d-405a-b7d8-46b961fa72ee" path="/var/lib/kubelet/pods/1a200254-a58d-405a-b7d8-46b961fa72ee/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.992915 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2787768b-6299-4a11-a990-4d436661b035" path="/var/lib/kubelet/pods/2787768b-6299-4a11-a990-4d436661b035/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.996715 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cf46b6-f08b-4182-a7b9-024347e4ab97" path="/var/lib/kubelet/pods/28cf46b6-f08b-4182-a7b9-024347e4ab97/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.997342 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2add332c-08b7-4a36-836a-bd74ec1404ce" path="/var/lib/kubelet/pods/2add332c-08b7-4a36-836a-bd74ec1404ce/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.997948 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8" path="/var/lib/kubelet/pods/32c4ac6a-8fbe-44b6-a6e7-beebc0a141d8/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.998934 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3439eae5-4f69-4f67-9cd2-6b25b38792b6" path="/var/lib/kubelet/pods/3439eae5-4f69-4f67-9cd2-6b25b38792b6/volumes" Jan 21 00:09:59 crc kubenswrapper[5030]: I0121 00:09:59.999841 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb88677-5ad8-42e9-8d98-b3c15d80c448" path="/var/lib/kubelet/pods/3eb88677-5ad8-42e9-8d98-b3c15d80c448/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.000291 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9r75\" (UniqueName: \"kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.000390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.000432 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.000457 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.001088 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f0cf25-98ae-4e27-a478-d6953938b014" path="/var/lib/kubelet/pods/47f0cf25-98ae-4e27-a478-d6953938b014/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.001581 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d56d7c-1323-4c76-90ad-41273710d00a" path="/var/lib/kubelet/pods/51d56d7c-1323-4c76-90ad-41273710d00a/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.002110 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fd9305-02e2-4e20-9b4e-2835ab4fb12e" path="/var/lib/kubelet/pods/68fd9305-02e2-4e20-9b4e-2835ab4fb12e/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.003019 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b30f34-e1f3-44c1-8d05-9fc1ec51e677" path="/var/lib/kubelet/pods/72b30f34-e1f3-44c1-8d05-9fc1ec51e677/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.003518 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9337bf6b-bdfa-4575-993e-cd766ae96e00" path="/var/lib/kubelet/pods/9337bf6b-bdfa-4575-993e-cd766ae96e00/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.004026 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93625bb1-fc41-4613-9bc1-ef512f010bbc" path="/var/lib/kubelet/pods/93625bb1-fc41-4613-9bc1-ef512f010bbc/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.004492 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97700265-3bbc-43be-b0de-a5e05ecf1fd4" path="/var/lib/kubelet/pods/97700265-3bbc-43be-b0de-a5e05ecf1fd4/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.005443 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c" path="/var/lib/kubelet/pods/9b7db6b6-6ede-4e21-93f6-c3afdf6fdd2c/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.005936 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5c3d22d-f0a9-4638-a23c-0b4c56390a6e" path="/var/lib/kubelet/pods/a5c3d22d-f0a9-4638-a23c-0b4c56390a6e/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.006422 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d8524f-463c-4a40-9b09-fbc9e80ded50" path="/var/lib/kubelet/pods/a5d8524f-463c-4a40-9b09-fbc9e80ded50/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.007765 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5c1628-f417-46b1-8d56-49ecf0a67139" path="/var/lib/kubelet/pods/af5c1628-f417-46b1-8d56-49ecf0a67139/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.008237 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2" path="/var/lib/kubelet/pods/b6602e9f-5eca-4fd2-b3bc-667e3fdbf9c2/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.008705 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1ac8e8-7505-4239-92d4-748025edb7d3" path="/var/lib/kubelet/pods/bb1ac8e8-7505-4239-92d4-748025edb7d3/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.009504 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14a0bb0-fad1-4d7f-bd3d-06f24af42d75" path="/var/lib/kubelet/pods/c14a0bb0-fad1-4d7f-bd3d-06f24af42d75/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.010046 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c378999a-6e05-40e9-9386-b0e79d90d988" path="/var/lib/kubelet/pods/c378999a-6e05-40e9-9386-b0e79d90d988/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.010510 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e39394-d293-4bc0-a41b-71ab76c7a740" path="/var/lib/kubelet/pods/c4e39394-d293-4bc0-a41b-71ab76c7a740/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.011058 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68f004c-ce56-43ba-bee3-581bf3c7dce8" path="/var/lib/kubelet/pods/c68f004c-ce56-43ba-bee3-581bf3c7dce8/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.011945 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8224f59-84ad-49de-af58-6853e11fe09e" path="/var/lib/kubelet/pods/c8224f59-84ad-49de-af58-6853e11fe09e/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.012375 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dc50b1-bd1b-4da8-aae6-9776d1e41237" path="/var/lib/kubelet/pods/c8dc50b1-bd1b-4da8-aae6-9776d1e41237/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.012829 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3580aa5-c95d-4d3c-8b9a-da3fe92633b8" path="/var/lib/kubelet/pods/f3580aa5-c95d-4d3c-8b9a-da3fe92633b8/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.013651 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d46dc3-5c30-46c4-881e-b02affc0e9a3" path="/var/lib/kubelet/pods/f3d46dc3-5c30-46c4-881e-b02affc0e9a3/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.014111 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba4cf5f-c4fd-49ec-9d00-f5724174b6a3" 
path="/var/lib/kubelet/pods/fba4cf5f-c4fd-49ec-9d00-f5724174b6a3/volumes" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.014542 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz9d9\" (UniqueName: \"kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102250 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9r75\" (UniqueName: \"kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102266 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102370 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.102449 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: E0121 00:10:00.102546 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-no-nodes: configmap "edpm-compute-no-nodes" not found Jan 21 00:10:00 crc kubenswrapper[5030]: E0121 00:10:00.102589 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes podName:658ea176-5943-44a6-aa3b-1ec684ae1d82 nodeName:}" failed. No retries permitted until 2026-01-21 00:10:00.602574872 +0000 UTC m=+5672.922835160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "edpm-compute-no-nodes" (UniqueName: "kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes") pod "dnsmasq-dnsmasq-64864b6d57-7n449" (UID: "658ea176-5943-44a6-aa3b-1ec684ae1d82") : configmap "edpm-compute-no-nodes" not found Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.103235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.104730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.119985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9r75\" (UniqueName: \"kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.204292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.204369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz9d9\" (UniqueName: \"kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.204420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.205472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.205564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: 
\"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.221024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz9d9\" (UniqueName: \"kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9\") pod \"dnsmasq-dnsmasq-84b9f45d47-559v4\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.288063 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.424965 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.449464 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.508753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc\") pod \"658ea176-5943-44a6-aa3b-1ec684ae1d82\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.508945 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config\") pod \"658ea176-5943-44a6-aa3b-1ec684ae1d82\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.508971 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9r75\" (UniqueName: \"kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75\") pod \"658ea176-5943-44a6-aa3b-1ec684ae1d82\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.510331 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "658ea176-5943-44a6-aa3b-1ec684ae1d82" (UID: "658ea176-5943-44a6-aa3b-1ec684ae1d82"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.510414 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config" (OuterVolumeSpecName: "config") pod "658ea176-5943-44a6-aa3b-1ec684ae1d82" (UID: "658ea176-5943-44a6-aa3b-1ec684ae1d82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.515056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75" (OuterVolumeSpecName: "kube-api-access-f9r75") pod "658ea176-5943-44a6-aa3b-1ec684ae1d82" (UID: "658ea176-5943-44a6-aa3b-1ec684ae1d82"). 
InnerVolumeSpecName "kube-api-access-f9r75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.615428 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-7n449\" (UID: \"658ea176-5943-44a6-aa3b-1ec684ae1d82\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.615615 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.615652 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.615666 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9r75\" (UniqueName: \"kubernetes.io/projected/658ea176-5943-44a6-aa3b-1ec684ae1d82-kube-api-access-f9r75\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:00 crc kubenswrapper[5030]: E0121 00:10:00.615742 5030 configmap.go:193] Couldn't get configMap openstack-kuttl-tests/edpm-compute-no-nodes: configmap "edpm-compute-no-nodes" not found Jan 21 00:10:00 crc kubenswrapper[5030]: E0121 00:10:00.615799 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes podName:658ea176-5943-44a6-aa3b-1ec684ae1d82 nodeName:}" failed. No retries permitted until 2026-01-21 00:10:01.615780326 +0000 UTC m=+5673.936040614 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "edpm-compute-no-nodes" (UniqueName: "kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes") pod "dnsmasq-dnsmasq-64864b6d57-7n449" (UID: "658ea176-5943-44a6-aa3b-1ec684ae1d82") : configmap "edpm-compute-no-nodes" not found Jan 21 00:10:00 crc kubenswrapper[5030]: I0121 00:10:00.734079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.458575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerStarted","Data":"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c"} Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.458998 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerStarted","Data":"067523dd090d1ce5651aa226e9d0a37af6f002db382925a9597510272ebe733a"} Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.458606 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449" Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.519386 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449"] Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.527060 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-7n449"] Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.637162 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/658ea176-5943-44a6-aa3b-1ec684ae1d82-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:01 crc kubenswrapper[5030]: I0121 00:10:01.987477 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658ea176-5943-44a6-aa3b-1ec684ae1d82" path="/var/lib/kubelet/pods/658ea176-5943-44a6-aa3b-1ec684ae1d82/volumes" Jan 21 00:10:02 crc kubenswrapper[5030]: I0121 00:10:02.469576 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerID="4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c" exitCode=0 Jan 21 00:10:02 crc kubenswrapper[5030]: I0121 00:10:02.469644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerDied","Data":"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c"} Jan 21 00:10:03 crc kubenswrapper[5030]: I0121 00:10:03.483038 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerStarted","Data":"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6"} Jan 21 00:10:03 crc kubenswrapper[5030]: I0121 00:10:03.483466 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:03 crc kubenswrapper[5030]: I0121 00:10:03.506531 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" podStartSLOduration=4.506503657 podStartE2EDuration="4.506503657s" podCreationTimestamp="2026-01-21 00:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:10:03.50334122 +0000 UTC m=+5675.823601538" watchObservedRunningTime="2026-01-21 00:10:03.506503657 +0000 UTC m=+5675.826763985" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.480910 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-sq55g"] Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.487973 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-sq55g"] Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.584002 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gzzm2"] Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.585156 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.587305 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.587636 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.587782 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.587832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.592368 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gzzm2"] Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.680398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.680590 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhrj\" (UniqueName: \"kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.680685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.781519 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.781590 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.781693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhrj\" (UniqueName: \"kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.782519 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " 
pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.782848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.800059 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhrj\" (UniqueName: \"kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj\") pod \"crc-storage-crc-gzzm2\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:08 crc kubenswrapper[5030]: I0121 00:10:08.902854 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:09 crc kubenswrapper[5030]: I0121 00:10:09.384733 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gzzm2"] Jan 21 00:10:09 crc kubenswrapper[5030]: I0121 00:10:09.537171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gzzm2" event={"ID":"83f9bcae-ba65-4143-8a26-62d22d4d0819","Type":"ContainerStarted","Data":"8b207e7203f3b08c60cfc54e506bdbb17ac5f852d08be5ee23dd7a2d5a4173c9"} Jan 21 00:10:09 crc kubenswrapper[5030]: I0121 00:10:09.972912 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367334e1-8d9b-4474-9fb9-89f154b07667" path="/var/lib/kubelet/pods/367334e1-8d9b-4474-9fb9-89f154b07667/volumes" Jan 21 00:10:10 crc kubenswrapper[5030]: I0121 00:10:10.157717 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:10:10 crc kubenswrapper[5030]: I0121 00:10:10.157811 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:10:10 crc kubenswrapper[5030]: I0121 00:10:10.289819 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:10 crc kubenswrapper[5030]: I0121 00:10:10.376704 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"] Jan 21 00:10:10 crc kubenswrapper[5030]: I0121 00:10:10.377616 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="dnsmasq-dns" containerID="cri-o://3b3cc079576b6c41c4dc4d8e2cb418e086300f92064e461c1d7aec46247d0edc" gracePeriod=10 Jan 21 00:10:11 crc kubenswrapper[5030]: I0121 00:10:11.564468 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerID="3b3cc079576b6c41c4dc4d8e2cb418e086300f92064e461c1d7aec46247d0edc" exitCode=0 Jan 21 00:10:11 crc kubenswrapper[5030]: I0121 00:10:11.564545 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" event={"ID":"5a1b02bd-9738-4790-baff-925f03c0a71f","Type":"ContainerDied","Data":"3b3cc079576b6c41c4dc4d8e2cb418e086300f92064e461c1d7aec46247d0edc"} Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.548486 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.589526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" event={"ID":"5a1b02bd-9738-4790-baff-925f03c0a71f","Type":"ContainerDied","Data":"01bef595be97b30751deff224ad18388a639e170a1fb16a66688b36711455a6f"} Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.589831 5030 scope.go:117] "RemoveContainer" containerID="3b3cc079576b6c41c4dc4d8e2cb418e086300f92064e461c1d7aec46247d0edc" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.589995 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.612033 5030 scope.go:117] "RemoveContainer" containerID="7f9326088d5c814000efb93bf7558908e6312eb5f0a00c5107378adf3023b911" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.662163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config\") pod \"5a1b02bd-9738-4790-baff-925f03c0a71f\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.662254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc\") pod \"5a1b02bd-9738-4790-baff-925f03c0a71f\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.662362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjfcb\" (UniqueName: \"kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb\") pod \"5a1b02bd-9738-4790-baff-925f03c0a71f\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.662391 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes\") pod \"5a1b02bd-9738-4790-baff-925f03c0a71f\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.662457 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset\") pod \"5a1b02bd-9738-4790-baff-925f03c0a71f\" (UID: \"5a1b02bd-9738-4790-baff-925f03c0a71f\") " Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.670433 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb" (OuterVolumeSpecName: "kube-api-access-hjfcb") pod "5a1b02bd-9738-4790-baff-925f03c0a71f" (UID: "5a1b02bd-9738-4790-baff-925f03c0a71f"). InnerVolumeSpecName "kube-api-access-hjfcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.700095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "5a1b02bd-9738-4790-baff-925f03c0a71f" (UID: "5a1b02bd-9738-4790-baff-925f03c0a71f"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.700133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset" (OuterVolumeSpecName: "edpm-compute-beta-nodeset") pod "5a1b02bd-9738-4790-baff-925f03c0a71f" (UID: "5a1b02bd-9738-4790-baff-925f03c0a71f"). InnerVolumeSpecName "edpm-compute-beta-nodeset". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.703048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config" (OuterVolumeSpecName: "config") pod "5a1b02bd-9738-4790-baff-925f03c0a71f" (UID: "5a1b02bd-9738-4790-baff-925f03c0a71f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.717486 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "5a1b02bd-9738-4790-baff-925f03c0a71f" (UID: "5a1b02bd-9738-4790-baff-925f03c0a71f"). InnerVolumeSpecName "edpm-compute-no-nodes". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.765351 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.765394 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.765413 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjfcb\" (UniqueName: \"kubernetes.io/projected/5a1b02bd-9738-4790-baff-925f03c0a71f-kube-api-access-hjfcb\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.765426 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.765439 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-beta-nodeset\" (UniqueName: \"kubernetes.io/configmap/5a1b02bd-9738-4790-baff-925f03c0a71f-edpm-compute-beta-nodeset\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.931599 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"] Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.941371 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5"] Jan 21 00:10:13 crc kubenswrapper[5030]: I0121 00:10:13.972660 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" path="/var/lib/kubelet/pods/5a1b02bd-9738-4790-baff-925f03c0a71f/volumes" Jan 21 00:10:18 crc kubenswrapper[5030]: I0121 00:10:18.416317 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-67886899f9-5psd5" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.239:5353: i/o timeout" Jan 21 00:10:21 crc kubenswrapper[5030]: I0121 00:10:21.675923 5030 generic.go:334] "Generic (PLEG): container finished" podID="83f9bcae-ba65-4143-8a26-62d22d4d0819" containerID="9078165806f3ddb811540570b6df86d43d23e2910910bec703d999a27a431972" exitCode=0 Jan 21 00:10:21 crc kubenswrapper[5030]: I0121 00:10:21.675982 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gzzm2" event={"ID":"83f9bcae-ba65-4143-8a26-62d22d4d0819","Type":"ContainerDied","Data":"9078165806f3ddb811540570b6df86d43d23e2910910bec703d999a27a431972"} Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.015347 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.141823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage\") pod \"83f9bcae-ba65-4143-8a26-62d22d4d0819\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.141992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfhrj\" (UniqueName: \"kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj\") pod \"83f9bcae-ba65-4143-8a26-62d22d4d0819\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.142030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt\") pod \"83f9bcae-ba65-4143-8a26-62d22d4d0819\" (UID: \"83f9bcae-ba65-4143-8a26-62d22d4d0819\") " Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.142198 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "83f9bcae-ba65-4143-8a26-62d22d4d0819" (UID: "83f9bcae-ba65-4143-8a26-62d22d4d0819"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.142305 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/83f9bcae-ba65-4143-8a26-62d22d4d0819-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.146633 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj" (OuterVolumeSpecName: "kube-api-access-rfhrj") pod "83f9bcae-ba65-4143-8a26-62d22d4d0819" (UID: "83f9bcae-ba65-4143-8a26-62d22d4d0819"). InnerVolumeSpecName "kube-api-access-rfhrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.160885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "83f9bcae-ba65-4143-8a26-62d22d4d0819" (UID: "83f9bcae-ba65-4143-8a26-62d22d4d0819"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.243052 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfhrj\" (UniqueName: \"kubernetes.io/projected/83f9bcae-ba65-4143-8a26-62d22d4d0819-kube-api-access-rfhrj\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.243088 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/83f9bcae-ba65-4143-8a26-62d22d4d0819-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.701016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gzzm2" event={"ID":"83f9bcae-ba65-4143-8a26-62d22d4d0819","Type":"ContainerDied","Data":"8b207e7203f3b08c60cfc54e506bdbb17ac5f852d08be5ee23dd7a2d5a4173c9"} Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.701282 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b207e7203f3b08c60cfc54e506bdbb17ac5f852d08be5ee23dd7a2d5a4173c9" Jan 21 00:10:23 crc kubenswrapper[5030]: I0121 00:10:23.701053 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gzzm2" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.803408 5030 scope.go:117] "RemoveContainer" containerID="7dd27b2996adc0ef0d867132f450301a477661cef0106f18ed271dbf2dae7bf5" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.828939 5030 scope.go:117] "RemoveContainer" containerID="45fa506a7c13811f70e578240df840f22276a03f7ce42680ae95a7384bcf8aca" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.854183 5030 scope.go:117] "RemoveContainer" containerID="7e78918f3cd2ec77333e5bf501e588f142d2ac1fac7efe1267fda1c35affa090" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.880085 5030 scope.go:117] "RemoveContainer" containerID="fc669352154f42c2e7f1a3a6479262acbc8674ce5b236f4e5ece1cc7865c794d" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.905549 5030 scope.go:117] "RemoveContainer" containerID="eae02cf9d85bac20bfc235b84d8e1016ef91f21f75218679b7ac6189da8adfc9" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.931732 5030 scope.go:117] "RemoveContainer" containerID="e23f615058c2e18301e1bba79c8abfd2f23643f312553b46b4ffbe78b01a18ca" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.956248 5030 scope.go:117] "RemoveContainer" containerID="9a3b99b979ff6f98a24aa52e56f001865bfa347ccfcc6000baddf9d101b99fa1" Jan 21 00:10:26 crc kubenswrapper[5030]: I0121 00:10:26.979391 5030 scope.go:117] "RemoveContainer" containerID="a3ec44899c258b554f9d9ff568c58803c6610ea8e8bbdf8f877a2deebf7aa4b7" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.020056 5030 scope.go:117] "RemoveContainer" containerID="3ce78b355797f87ef2cb1d7308b05170fa5147487151cf2accedf6170e20ac72" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.047559 5030 scope.go:117] "RemoveContainer" containerID="a5163e335a2657d84e00e57cac5dbe4bf62d749cbb0df60fb460b25d9a4845dc" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.097291 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gzzm2"] Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.105670 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gzzm2"] Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.247662 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["crc-storage/crc-storage-crc-592ch"] Jan 21 00:10:27 crc kubenswrapper[5030]: E0121 00:10:27.248007 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83f9bcae-ba65-4143-8a26-62d22d4d0819" containerName="storage" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248025 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f9bcae-ba65-4143-8a26-62d22d4d0819" containerName="storage" Jan 21 00:10:27 crc kubenswrapper[5030]: E0121 00:10:27.248039 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="dnsmasq-dns" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248046 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="dnsmasq-dns" Jan 21 00:10:27 crc kubenswrapper[5030]: E0121 00:10:27.248060 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="init" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248068 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="init" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248217 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b02bd-9738-4790-baff-925f03c0a71f" containerName="dnsmasq-dns" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248235 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="83f9bcae-ba65-4143-8a26-62d22d4d0819" containerName="storage" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.248995 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.251153 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.251207 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.252054 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.252248 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.256075 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-592ch"] Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.310088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.310772 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9nv\" (UniqueName: \"kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.311155 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.412197 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9nv\" (UniqueName: \"kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.412271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.412347 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.413014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.413800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.442594 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9nv\" (UniqueName: \"kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv\") pod \"crc-storage-crc-592ch\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.574797 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.972494 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f9bcae-ba65-4143-8a26-62d22d4d0819" path="/var/lib/kubelet/pods/83f9bcae-ba65-4143-8a26-62d22d4d0819/volumes" Jan 21 00:10:27 crc kubenswrapper[5030]: I0121 00:10:27.986887 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-592ch"] Jan 21 00:10:28 crc kubenswrapper[5030]: I0121 00:10:28.767361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-592ch" event={"ID":"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d","Type":"ContainerStarted","Data":"7067a5a76ef66e8fe1b93f1bdc830570cf776a9ce5ea1801ff39c1c93dea36ab"} Jan 21 00:10:29 crc kubenswrapper[5030]: I0121 00:10:29.779963 5030 generic.go:334] "Generic (PLEG): container finished" podID="cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" containerID="758cbd33cafaa2fc69bce8b0657402bc66e32624cdb40e875281c8e72b08398f" exitCode=0 Jan 21 00:10:29 crc kubenswrapper[5030]: I0121 00:10:29.780059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-592ch" event={"ID":"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d","Type":"ContainerDied","Data":"758cbd33cafaa2fc69bce8b0657402bc66e32624cdb40e875281c8e72b08398f"} Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.040342 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.187791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt\") pod \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.187877 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9nv\" (UniqueName: \"kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv\") pod \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.187944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" (UID: "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.187972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage\") pod \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\" (UID: \"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d\") " Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.188264 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.192714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv" (OuterVolumeSpecName: "kube-api-access-mv9nv") pod "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" (UID: "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d"). InnerVolumeSpecName "kube-api-access-mv9nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.210586 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" (UID: "cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.289674 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9nv\" (UniqueName: \"kubernetes.io/projected/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-kube-api-access-mv9nv\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.289720 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.801751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-592ch" event={"ID":"cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d","Type":"ContainerDied","Data":"7067a5a76ef66e8fe1b93f1bdc830570cf776a9ce5ea1801ff39c1c93dea36ab"} Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.801815 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7067a5a76ef66e8fe1b93f1bdc830570cf776a9ce5ea1801ff39c1c93dea36ab" Jan 21 00:10:31 crc kubenswrapper[5030]: I0121 00:10:31.801837 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-592ch" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.526665 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:34 crc kubenswrapper[5030]: E0121 00:10:34.527243 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" containerName="storage" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.527259 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" containerName="storage" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.527431 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" containerName="storage" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.528158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.530734 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-tls" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.547760 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.640186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbd9t\" (UniqueName: \"kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.640269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.640504 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.640669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.742564 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.742862 
5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbd9t\" (UniqueName: \"kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.742925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.743054 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.743809 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.744517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.744609 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.775271 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbd9t\" (UniqueName: \"kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t\") pod \"dnsmasq-dnsmasq-78c7b787f5-qgcqk\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:34 crc kubenswrapper[5030]: I0121 00:10:34.849924 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:35 crc kubenswrapper[5030]: I0121 00:10:35.121444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:35 crc kubenswrapper[5030]: I0121 00:10:35.847260 5030 generic.go:334] "Generic (PLEG): container finished" podID="5300e6a1-a48d-4827-b2a5-654042621c71" containerID="b26ba1cde7111b91d056b4c29f8604b8654cf8d80047bbc541c8354b9b25a606" exitCode=0 Jan 21 00:10:35 crc kubenswrapper[5030]: I0121 00:10:35.847470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" event={"ID":"5300e6a1-a48d-4827-b2a5-654042621c71","Type":"ContainerDied","Data":"b26ba1cde7111b91d056b4c29f8604b8654cf8d80047bbc541c8354b9b25a606"} Jan 21 00:10:35 crc kubenswrapper[5030]: I0121 00:10:35.848002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" event={"ID":"5300e6a1-a48d-4827-b2a5-654042621c71","Type":"ContainerStarted","Data":"9fd0d1ca376a1ab7bfcc80c463115423aed822a546d099cb96d6e48afec87d1e"} Jan 21 00:10:36 crc kubenswrapper[5030]: I0121 00:10:36.860377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" event={"ID":"5300e6a1-a48d-4827-b2a5-654042621c71","Type":"ContainerStarted","Data":"ba580e2de24e2320ef8cb2b32d2f6e3271464521133e70296418cad7660cf6f3"} Jan 21 00:10:36 crc kubenswrapper[5030]: I0121 00:10:36.861905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:36 crc kubenswrapper[5030]: I0121 00:10:36.885900 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" podStartSLOduration=2.885882271 podStartE2EDuration="2.885882271s" podCreationTimestamp="2026-01-21 00:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:10:36.880100641 +0000 UTC m=+5709.200360949" watchObservedRunningTime="2026-01-21 00:10:36.885882271 +0000 UTC m=+5709.206142559" Jan 21 00:10:40 crc kubenswrapper[5030]: I0121 00:10:40.157245 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:10:40 crc kubenswrapper[5030]: I0121 00:10:40.157596 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:10:44 crc kubenswrapper[5030]: I0121 00:10:44.851997 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:44 crc kubenswrapper[5030]: I0121 00:10:44.929237 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:10:44 crc kubenswrapper[5030]: I0121 00:10:44.929561 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="dnsmasq-dns" containerID="cri-o://2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6" gracePeriod=10 Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.058846 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.060039 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.076769 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.115323 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrf2z\" (UniqueName: \"kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.115388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.115434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.115519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.217401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.217773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.217936 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrf2z\" (UniqueName: 
\"kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.218009 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.218873 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.219102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.219111 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.235576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrf2z\" (UniqueName: \"kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z\") pod \"dnsmasq-dnsmasq-79cc674687-npl44\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.288521 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.5:5353: connect: connection refused" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.437129 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.855695 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.868691 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.980484 5030 generic.go:334] "Generic (PLEG): container finished" podID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerID="2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6" exitCode=0 Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.980866 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.984808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" event={"ID":"7b2e7b50-765c-4f4a-b9da-69f16961e8b1","Type":"ContainerStarted","Data":"b36d32b4d828fec3cc2ee394b65c9839dc02cbe0812194062d4e1025ca6ad537"} Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.984840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerDied","Data":"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6"} Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.984862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4" event={"ID":"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84","Type":"ContainerDied","Data":"067523dd090d1ce5651aa226e9d0a37af6f002db382925a9597510272ebe733a"} Jan 21 00:10:45 crc kubenswrapper[5030]: I0121 00:10:45.984878 5030 scope.go:117] "RemoveContainer" containerID="2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.003667 5030 scope.go:117] "RemoveContainer" containerID="4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.022217 5030 scope.go:117] "RemoveContainer" containerID="2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6" Jan 21 00:10:46 crc kubenswrapper[5030]: E0121 00:10:46.022587 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6\": container with ID starting with 2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6 not found: ID does not exist" containerID="2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.022677 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6"} err="failed to get container status \"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6\": rpc error: code = NotFound desc = could not find container \"2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6\": container with ID starting with 2658a8799e479577625d206e371483b3ea2b159884cf659798eef4826fc755e6 not found: ID does not exist" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.022713 5030 scope.go:117] "RemoveContainer" containerID="4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c" Jan 21 00:10:46 crc kubenswrapper[5030]: E0121 00:10:46.023162 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c\": container with ID starting with 4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c not found: ID does not exist" containerID="4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.023195 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c"} err="failed 
to get container status \"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c\": rpc error: code = NotFound desc = could not find container \"4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c\": container with ID starting with 4608c8179775a48d643dc7bdd6928bf7d6cea588a081c51c0092cfc8cd51089c not found: ID does not exist" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.033436 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz9d9\" (UniqueName: \"kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9\") pod \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.033507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc\") pod \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.033545 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config\") pod \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\" (UID: \"3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84\") " Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.037707 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9" (OuterVolumeSpecName: "kube-api-access-dz9d9") pod "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" (UID: "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84"). InnerVolumeSpecName "kube-api-access-dz9d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.065231 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" (UID: "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.075712 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config" (OuterVolumeSpecName: "config") pod "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" (UID: "3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.135550 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.135602 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.135644 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz9d9\" (UniqueName: \"kubernetes.io/projected/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84-kube-api-access-dz9d9\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.310992 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.316388 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-559v4"] Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.994363 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerID="dc49ff8692861e4255c6051f313a37d6b3f437b9726d6a0e3002bb1776e705e7" exitCode=0 Jan 21 00:10:46 crc kubenswrapper[5030]: I0121 00:10:46.994408 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" event={"ID":"7b2e7b50-765c-4f4a-b9da-69f16961e8b1","Type":"ContainerDied","Data":"dc49ff8692861e4255c6051f313a37d6b3f437b9726d6a0e3002bb1776e705e7"} Jan 21 00:10:47 crc kubenswrapper[5030]: I0121 00:10:47.972374 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" path="/var/lib/kubelet/pods/3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84/volumes" Jan 21 00:10:48 crc kubenswrapper[5030]: I0121 00:10:48.009045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" event={"ID":"7b2e7b50-765c-4f4a-b9da-69f16961e8b1","Type":"ContainerStarted","Data":"a8e4fdac2501ca40c43af66648ac121f8b07dca717af20dfc5b34338c337f8ef"} Jan 21 00:10:48 crc kubenswrapper[5030]: I0121 00:10:48.009274 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:48 crc kubenswrapper[5030]: I0121 00:10:48.031060 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" podStartSLOduration=3.031038849 podStartE2EDuration="3.031038849s" podCreationTimestamp="2026-01-21 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:10:48.024403129 +0000 UTC m=+5720.344663417" watchObservedRunningTime="2026-01-21 00:10:48.031038849 +0000 UTC m=+5720.351299137" Jan 21 00:10:55 crc kubenswrapper[5030]: I0121 00:10:55.438931 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:10:55 crc kubenswrapper[5030]: I0121 00:10:55.543565 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:55 crc 
kubenswrapper[5030]: I0121 00:10:55.543983 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="dnsmasq-dns" containerID="cri-o://ba580e2de24e2320ef8cb2b32d2f6e3271464521133e70296418cad7660cf6f3" gracePeriod=10 Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.093291 5030 generic.go:334] "Generic (PLEG): container finished" podID="5300e6a1-a48d-4827-b2a5-654042621c71" containerID="ba580e2de24e2320ef8cb2b32d2f6e3271464521133e70296418cad7660cf6f3" exitCode=0 Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.093369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" event={"ID":"5300e6a1-a48d-4827-b2a5-654042621c71","Type":"ContainerDied","Data":"ba580e2de24e2320ef8cb2b32d2f6e3271464521133e70296418cad7660cf6f3"} Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.445702 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.559217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls\") pod \"5300e6a1-a48d-4827-b2a5-654042621c71\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.559344 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbd9t\" (UniqueName: \"kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t\") pod \"5300e6a1-a48d-4827-b2a5-654042621c71\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.559379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc\") pod \"5300e6a1-a48d-4827-b2a5-654042621c71\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.559458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config\") pod \"5300e6a1-a48d-4827-b2a5-654042621c71\" (UID: \"5300e6a1-a48d-4827-b2a5-654042621c71\") " Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.594004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t" (OuterVolumeSpecName: "kube-api-access-hbd9t") pod "5300e6a1-a48d-4827-b2a5-654042621c71" (UID: "5300e6a1-a48d-4827-b2a5-654042621c71"). InnerVolumeSpecName "kube-api-access-hbd9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.605686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "5300e6a1-a48d-4827-b2a5-654042621c71" (UID: "5300e6a1-a48d-4827-b2a5-654042621c71"). InnerVolumeSpecName "openstack-edpm-tls". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.608397 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config" (OuterVolumeSpecName: "config") pod "5300e6a1-a48d-4827-b2a5-654042621c71" (UID: "5300e6a1-a48d-4827-b2a5-654042621c71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.611513 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "5300e6a1-a48d-4827-b2a5-654042621c71" (UID: "5300e6a1-a48d-4827-b2a5-654042621c71"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.660926 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.660962 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.660978 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbd9t\" (UniqueName: \"kubernetes.io/projected/5300e6a1-a48d-4827-b2a5-654042621c71-kube-api-access-hbd9t\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.660987 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/5300e6a1-a48d-4827-b2a5-654042621c71-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.864964 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq"] Jan 21 00:10:57 crc kubenswrapper[5030]: E0121 00:10:57.865256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="init" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865272 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="init" Jan 21 00:10:57 crc kubenswrapper[5030]: E0121 00:10:57.865287 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="init" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865293 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="init" Jan 21 00:10:57 crc kubenswrapper[5030]: E0121 00:10:57.865305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="dnsmasq-dns" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865311 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="dnsmasq-dns" Jan 21 00:10:57 crc kubenswrapper[5030]: E0121 00:10:57.865326 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="dnsmasq-dns" Jan 21 00:10:57 crc 
kubenswrapper[5030]: I0121 00:10:57.865332 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="dnsmasq-dns" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865452 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" containerName="dnsmasq-dns" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865471 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e24dc89-c2c7-4ca9-b561-9f68c0ac5c84" containerName="dnsmasq-dns" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.865924 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.867995 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.868101 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.868111 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dnsnames-second-certs-0" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.868560 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dnsnames-default-certs-0" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.869010 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.869054 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.870657 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.876668 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq"] Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.965761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.965843 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.965873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.965911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvfz\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.965972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.966035 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:57 crc kubenswrapper[5030]: I0121 00:10:57.966059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067309 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067347 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tsvfz\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067490 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.067537 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.071712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.071737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.071783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.071822 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.072002 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.072102 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.082831 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvfz\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz\") pod \"install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.103712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" event={"ID":"5300e6a1-a48d-4827-b2a5-654042621c71","Type":"ContainerDied","Data":"9fd0d1ca376a1ab7bfcc80c463115423aed822a546d099cb96d6e48afec87d1e"} Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.104052 5030 scope.go:117] "RemoveContainer" containerID="ba580e2de24e2320ef8cb2b32d2f6e3271464521133e70296418cad7660cf6f3" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.103787 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.123931 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.128115 5030 scope.go:117] "RemoveContainer" containerID="b26ba1cde7111b91d056b4c29f8604b8654cf8d80047bbc541c8354b9b25a606" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.130713 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-78c7b787f5-qgcqk"] Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.191193 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:10:58 crc kubenswrapper[5030]: I0121 00:10:58.614073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq"] Jan 21 00:10:58 crc kubenswrapper[5030]: W0121 00:10:58.619040 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6287ec83_eeba_458f_a91e_5667d0858735.slice/crio-244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233 WatchSource:0}: Error finding container 244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233: Status 404 returned error can't find the container with id 244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233 Jan 21 00:10:59 crc kubenswrapper[5030]: I0121 00:10:59.116499 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" event={"ID":"6287ec83-eeba-458f-a91e-5667d0858735","Type":"ContainerStarted","Data":"244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233"} Jan 21 00:10:59 crc kubenswrapper[5030]: I0121 00:10:59.970500 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5300e6a1-a48d-4827-b2a5-654042621c71" path="/var/lib/kubelet/pods/5300e6a1-a48d-4827-b2a5-654042621c71/volumes" Jan 21 00:11:03 crc kubenswrapper[5030]: I0121 00:11:03.159855 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" event={"ID":"6287ec83-eeba-458f-a91e-5667d0858735","Type":"ContainerStarted","Data":"c4a06335f88871083eb1b64b57f710ab53596ef67f62067e5dca2e32320bb1e4"} Jan 21 00:11:03 crc kubenswrapper[5030]: I0121 00:11:03.184500 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" podStartSLOduration=2.95086716 podStartE2EDuration="6.184485296s" podCreationTimestamp="2026-01-21 00:10:57 +0000 UTC" firstStartedPulling="2026-01-21 00:10:58.621595459 +0000 UTC m=+5730.941855747" lastFinishedPulling="2026-01-21 00:11:01.855213595 +0000 UTC m=+5734.175473883" observedRunningTime="2026-01-21 00:11:03.176304737 +0000 UTC m=+5735.496565025" watchObservedRunningTime="2026-01-21 00:11:03.184485296 +0000 UTC m=+5735.504745584" Jan 21 00:11:06 crc kubenswrapper[5030]: I0121 00:11:06.185947 5030 generic.go:334] "Generic (PLEG): container finished" podID="6287ec83-eeba-458f-a91e-5667d0858735" containerID="c4a06335f88871083eb1b64b57f710ab53596ef67f62067e5dca2e32320bb1e4" exitCode=0 Jan 21 00:11:06 crc kubenswrapper[5030]: I0121 
00:11:06.186248 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" event={"ID":"6287ec83-eeba-458f-a91e-5667d0858735","Type":"ContainerDied","Data":"c4a06335f88871083eb1b64b57f710ab53596ef67f62067e5dca2e32320bb1e4"} Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.477019 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623220 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvfz\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623438 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623504 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.623535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle\") pod \"6287ec83-eeba-458f-a91e-5667d0858735\" (UID: \"6287ec83-eeba-458f-a91e-5667d0858735\") " Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.628139 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0" (OuterVolumeSpecName: 
"openstack-edpm-tls-tls-dnsnames-second-certs-0") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "openstack-edpm-tls-tls-dnsnames-second-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.628253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz" (OuterVolumeSpecName: "kube-api-access-tsvfz") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "kube-api-access-tsvfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.628609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.628749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle" (OuterVolumeSpecName: "tls-dnsnames-combined-ca-bundle") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "tls-dnsnames-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.629635 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dnsnames-default-certs-0") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "openstack-edpm-tls-tls-dnsnames-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.651870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory" (OuterVolumeSpecName: "inventory") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.654435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "6287ec83-eeba-458f-a91e-5667d0858735" (UID: "6287ec83-eeba-458f-a91e-5667d0858735"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725465 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725551 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dnsnames-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725591 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725609 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsvfz\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-kube-api-access-tsvfz\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725756 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6287ec83-eeba-458f-a91e-5667d0858735-tls-dnsnames-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:07 crc kubenswrapper[5030]: I0121 00:11:07.725769 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dnsnames-second-certs-0\" (UniqueName: \"kubernetes.io/projected/6287ec83-eeba-458f-a91e-5667d0858735-openstack-edpm-tls-tls-dnsnames-second-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.204751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" event={"ID":"6287ec83-eeba-458f-a91e-5667d0858735","Type":"ContainerDied","Data":"244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233"} Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.204797 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244f4333ca489f86b2d08ef39aae4a66a18070c8b0a4dfacd185ee041c81b233" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.204881 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.274983 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2"] Jan 21 00:11:08 crc kubenswrapper[5030]: E0121 00:11:08.275299 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6287ec83-eeba-458f-a91e-5667d0858735" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.275323 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6287ec83-eeba-458f-a91e-5667d0858735" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.275519 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6287ec83-eeba-458f-a91e-5667d0858735" containerName="install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.276087 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.277817 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.278044 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.278219 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.278763 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.278999 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.288411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2"] Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.435865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc4tv\" (UniqueName: \"kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.435974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.436053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.436087 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.536980 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.537039 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.537094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc4tv\" (UniqueName: \"kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.537158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.542183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.542803 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: 
\"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.548068 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.558044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc4tv\" (UniqueName: \"kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv\") pod \"tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:08 crc kubenswrapper[5030]: I0121 00:11:08.592176 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:09 crc kubenswrapper[5030]: I0121 00:11:09.006307 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2"] Jan 21 00:11:09 crc kubenswrapper[5030]: I0121 00:11:09.216474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" event={"ID":"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d","Type":"ContainerStarted","Data":"0b32197c3c7f88a400fd4428e949e73e0ac1dcca4efc5a833a1fd22847930c75"} Jan 21 00:11:10 crc kubenswrapper[5030]: I0121 00:11:10.157140 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:11:10 crc kubenswrapper[5030]: I0121 00:11:10.157217 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:11:10 crc kubenswrapper[5030]: I0121 00:11:10.157268 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:11:10 crc kubenswrapper[5030]: I0121 00:11:10.158043 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:11:10 crc kubenswrapper[5030]: I0121 00:11:10.158108 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" 
containerID="cri-o://83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" gracePeriod=600 Jan 21 00:11:11 crc kubenswrapper[5030]: I0121 00:11:11.239691 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" exitCode=0 Jan 21 00:11:11 crc kubenswrapper[5030]: I0121 00:11:11.239805 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba"} Jan 21 00:11:11 crc kubenswrapper[5030]: I0121 00:11:11.240197 5030 scope.go:117] "RemoveContainer" containerID="9b88bdb3b6c450bc938e22d6a7c3a8c64bf73cd7d39cd07687b66e4ace86044b" Jan 21 00:11:12 crc kubenswrapper[5030]: E0121 00:11:12.409429 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:11:13 crc kubenswrapper[5030]: I0121 00:11:13.261599 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:11:13 crc kubenswrapper[5030]: E0121 00:11:13.262095 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:11:17 crc kubenswrapper[5030]: I0121 00:11:17.313262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" event={"ID":"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d","Type":"ContainerStarted","Data":"b502b9e619433261ba720c46b8d94cf90dc56f4d7747ceab0da0c15d5aa7beff"} Jan 21 00:11:17 crc kubenswrapper[5030]: I0121 00:11:17.336961 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" podStartSLOduration=1.7960421659999999 podStartE2EDuration="9.336939004s" podCreationTimestamp="2026-01-21 00:11:08 +0000 UTC" firstStartedPulling="2026-01-21 00:11:09.018602406 +0000 UTC m=+5741.338862694" lastFinishedPulling="2026-01-21 00:11:16.559499224 +0000 UTC m=+5748.879759532" observedRunningTime="2026-01-21 00:11:17.334612887 +0000 UTC m=+5749.654873185" watchObservedRunningTime="2026-01-21 00:11:17.336939004 +0000 UTC m=+5749.657199292" Jan 21 00:11:20 crc kubenswrapper[5030]: I0121 00:11:20.355164 5030 generic.go:334] "Generic (PLEG): container finished" podID="a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" containerID="b502b9e619433261ba720c46b8d94cf90dc56f4d7747ceab0da0c15d5aa7beff" exitCode=0 Jan 21 00:11:20 crc kubenswrapper[5030]: I0121 00:11:20.355271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" 
event={"ID":"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d","Type":"ContainerDied","Data":"b502b9e619433261ba720c46b8d94cf90dc56f4d7747ceab0da0c15d5aa7beff"} Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.657937 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.752419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle\") pod \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.752498 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls\") pod \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.752540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc4tv\" (UniqueName: \"kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv\") pod \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.752581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory\") pod \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\" (UID: \"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d\") " Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.760552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle" (OuterVolumeSpecName: "tls-dnsnames-combined-ca-bundle") pod "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" (UID: "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d"). InnerVolumeSpecName "tls-dnsnames-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.762857 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv" (OuterVolumeSpecName: "kube-api-access-sc4tv") pod "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" (UID: "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d"). InnerVolumeSpecName "kube-api-access-sc4tv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.773435 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory" (OuterVolumeSpecName: "inventory") pod "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" (UID: "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.780585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" (UID: "a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.853576 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.853607 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc4tv\" (UniqueName: \"kubernetes.io/projected/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-kube-api-access-sc4tv\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.853631 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:21 crc kubenswrapper[5030]: I0121 00:11:21.853641 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dnsnames-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d-tls-dnsnames-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:22 crc kubenswrapper[5030]: I0121 00:11:22.374186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" event={"ID":"a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d","Type":"ContainerDied","Data":"0b32197c3c7f88a400fd4428e949e73e0ac1dcca4efc5a833a1fd22847930c75"} Jan 21 00:11:22 crc kubenswrapper[5030]: I0121 00:11:22.374516 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b32197c3c7f88a400fd4428e949e73e0ac1dcca4efc5a833a1fd22847930c75" Jan 21 00:11:22 crc kubenswrapper[5030]: I0121 00:11:22.374570 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.727317 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb"] Jan 21 00:11:24 crc kubenswrapper[5030]: E0121 00:11:24.727926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.727941 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.728087 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" containerName="tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.728563 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.732160 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.732396 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.732556 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.732880 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.733569 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.733734 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.738876 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb"] Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.739071 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.892762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wdzw\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.892839 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.892955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.893071 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.893097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.893130 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.893205 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.893244 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.994726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wdzw\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.994788 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.995766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.996151 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.996950 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.997048 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.997117 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:24 crc kubenswrapper[5030]: I0121 00:11:24.997215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.009257 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.009837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.010177 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.010461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.010876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.011486 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.012272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.013977 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wdzw\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw\") pod \"install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.061206 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.549705 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb"] Jan 21 00:11:25 crc kubenswrapper[5030]: I0121 00:11:25.962428 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:11:25 crc kubenswrapper[5030]: E0121 00:11:25.962824 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:11:26 crc kubenswrapper[5030]: I0121 00:11:26.406445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" event={"ID":"552195c8-5b4f-4685-ace9-e1b093ad8617","Type":"ContainerStarted","Data":"6b60948631f006d6a9d18bca30bc7814cf68deb67ca44b604e156bacaab52391"} Jan 21 00:11:26 crc kubenswrapper[5030]: I0121 00:11:26.407120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" event={"ID":"552195c8-5b4f-4685-ace9-e1b093ad8617","Type":"ContainerStarted","Data":"e4a2d98a6ea95feb3174644b8b8c3d6c04b6c7eb0778e284c1caf7b7310adfdc"} Jan 21 00:11:26 crc kubenswrapper[5030]: I0121 00:11:26.432315 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" podStartSLOduration=1.861327616 podStartE2EDuration="2.432295769s" podCreationTimestamp="2026-01-21 00:11:24 +0000 UTC" firstStartedPulling="2026-01-21 00:11:25.558024221 +0000 UTC m=+5757.878284509" lastFinishedPulling="2026-01-21 00:11:26.128992374 +0000 UTC m=+5758.449252662" observedRunningTime="2026-01-21 00:11:26.426825356 +0000 UTC m=+5758.747085644" watchObservedRunningTime="2026-01-21 00:11:26.432295769 +0000 UTC m=+5758.752556057" Jan 21 00:11:27 crc kubenswrapper[5030]: I0121 00:11:27.219878 5030 scope.go:117] "RemoveContainer" containerID="c84defbcbb2770b6b3cb2a981832bd3f083c624307887a3d429e51e7f8a4fbde" Jan 21 00:11:27 crc kubenswrapper[5030]: I0121 00:11:27.242885 5030 scope.go:117] "RemoveContainer" containerID="783ac0d3a9708daa66821522ab1b5e4696a21225ac14d5964b5e946f983faf3c" Jan 21 00:11:27 crc kubenswrapper[5030]: I0121 00:11:27.285878 5030 scope.go:117] "RemoveContainer" containerID="c3a73a233ab47c20d06f04d011e64ad85d7f4ceb9feb1d02e46bb2523739a1bf" Jan 21 00:11:27 crc kubenswrapper[5030]: I0121 00:11:27.356152 5030 scope.go:117] "RemoveContainer" containerID="fb29cdea2b9b93472258dc8bb21a524cd8ebc0614e4c2b5f68f20f99aa6b6424" Jan 21 00:11:27 crc kubenswrapper[5030]: I0121 00:11:27.379319 5030 scope.go:117] "RemoveContainer" containerID="e12808262dd0591c349493480d7d466ca3fcc6c780b33b812aa498bc6c79775d" Jan 21 00:11:29 crc kubenswrapper[5030]: I0121 00:11:29.437526 5030 generic.go:334] "Generic (PLEG): container finished" podID="552195c8-5b4f-4685-ace9-e1b093ad8617" containerID="6b60948631f006d6a9d18bca30bc7814cf68deb67ca44b604e156bacaab52391" exitCode=0 Jan 21 
00:11:29 crc kubenswrapper[5030]: I0121 00:11:29.437606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" event={"ID":"552195c8-5b4f-4685-ace9-e1b093ad8617","Type":"ContainerDied","Data":"6b60948631f006d6a9d18bca30bc7814cf68deb67ca44b604e156bacaab52391"} Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.781150 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.914987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915094 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915178 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915212 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915284 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.915335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wdzw\" (UniqueName: 
\"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw\") pod \"552195c8-5b4f-4685-ace9-e1b093ad8617\" (UID: \"552195c8-5b4f-4685-ace9-e1b093ad8617\") " Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.921768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.922003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.922777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.923705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.924691 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.928050 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw" (OuterVolumeSpecName: "kube-api-access-2wdzw") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "kube-api-access-2wdzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.939963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:30 crc kubenswrapper[5030]: I0121 00:11:30.945995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory" (OuterVolumeSpecName: "inventory") pod "552195c8-5b4f-4685-ace9-e1b093ad8617" (UID: "552195c8-5b4f-4685-ace9-e1b093ad8617"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.016991 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017065 5030 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017077 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017089 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017100 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017115 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017130 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wdzw\" (UniqueName: \"kubernetes.io/projected/552195c8-5b4f-4685-ace9-e1b093ad8617-kube-api-access-2wdzw\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.017142 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/552195c8-5b4f-4685-ace9-e1b093ad8617-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.466939 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" event={"ID":"552195c8-5b4f-4685-ace9-e1b093ad8617","Type":"ContainerDied","Data":"e4a2d98a6ea95feb3174644b8b8c3d6c04b6c7eb0778e284c1caf7b7310adfdc"} Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.467241 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4a2d98a6ea95feb3174644b8b8c3d6c04b6c7eb0778e284c1caf7b7310adfdc" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.467414 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.546495 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p"] Jan 21 00:11:31 crc kubenswrapper[5030]: E0121 00:11:31.561562 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552195c8-5b4f-4685-ace9-e1b093ad8617" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.561592 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="552195c8-5b4f-4685-ace9-e1b093ad8617" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.561890 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="552195c8-5b4f-4685-ace9-e1b093ad8617" containerName="install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.562339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p"] Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.562444 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.568115 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.568197 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.568307 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.568663 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.571559 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.725852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.726239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.726345 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prr59\" (UniqueName: \"kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.726409 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.828069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prr59\" (UniqueName: \"kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc 
kubenswrapper[5030]: I0121 00:11:31.828204 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.828251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.828311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.833375 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.833379 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.834699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.848145 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prr59\" (UniqueName: \"kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59\") pod \"tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:31 crc kubenswrapper[5030]: I0121 00:11:31.882937 5030 util.go:30] "No sandbox for pod can be found. 
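
Each job pod in this excerpt goes through the same volume cycle: VerifyControllerAttachedVolume, MountVolume started and MountVolume.SetUp succeeded for every secret and projected volume before the pod's sandbox is started, then UnmountVolume started, UnmountVolume.TearDown succeeded and "Volume detached" once the job container exits. A rough Python sketch for extracting that per-volume operation timeline from a journal excerpt like this one; the regex and helper are illustrative, keyed to the message strings visible above, not an existing tool:

    import re
    from collections import defaultdict

    # Operations as they appear in the kubelet messages above; quotes may be escaped (\") in the journal.
    OP_RE = re.compile(
        r'(?P<op>VerifyControllerAttachedVolume started|MountVolume started|'
        r'MountVolume\.SetUp succeeded|UnmountVolume started|'
        r'UnmountVolume\.TearDown succeeded|Volume detached) for volume '
        r'\\?"(?P<volume>[^"\\]+)\\?"'
    )

    def volume_timeline(lines):
        """Group volume operations by volume name, in the order they were logged."""
        timeline = defaultdict(list)
        for line in lines:
            for m in OP_RE.finditer(line):
                timeline[m.group("volume")].append(m.group("op"))
        return dict(timeline)

    # Usage (hypothetical file name):
    # with open("kubelet-journal.log") as f:
    #     for vol, ops in volume_timeline(f).items():
    #         print(vol, "->", " -> ".join(ops))
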
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:32 crc kubenswrapper[5030]: I0121 00:11:32.300821 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p"] Jan 21 00:11:32 crc kubenswrapper[5030]: I0121 00:11:32.476963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" event={"ID":"f269dd4a-049c-4639-b987-7831fcb249dc","Type":"ContainerStarted","Data":"3e2cdd0aae07de85bd3bb1cc065c61112e321f0488db34aaf5f50d4d28558f8a"} Jan 21 00:11:35 crc kubenswrapper[5030]: I0121 00:11:35.504517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" event={"ID":"f269dd4a-049c-4639-b987-7831fcb249dc","Type":"ContainerStarted","Data":"dea7270de9afd4ceb98f5f5e373eb14e2c823503fc52aa4f92211f7f733286c3"} Jan 21 00:11:35 crc kubenswrapper[5030]: I0121 00:11:35.525285 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" podStartSLOduration=2.6755690469999998 podStartE2EDuration="4.525257686s" podCreationTimestamp="2026-01-21 00:11:31 +0000 UTC" firstStartedPulling="2026-01-21 00:11:32.302770291 +0000 UTC m=+5764.623030579" lastFinishedPulling="2026-01-21 00:11:34.15245893 +0000 UTC m=+5766.472719218" observedRunningTime="2026-01-21 00:11:35.520294726 +0000 UTC m=+5767.840555044" watchObservedRunningTime="2026-01-21 00:11:35.525257686 +0000 UTC m=+5767.845517994" Jan 21 00:11:37 crc kubenswrapper[5030]: I0121 00:11:37.526157 5030 generic.go:334] "Generic (PLEG): container finished" podID="f269dd4a-049c-4639-b987-7831fcb249dc" containerID="dea7270de9afd4ceb98f5f5e373eb14e2c823503fc52aa4f92211f7f733286c3" exitCode=0 Jan 21 00:11:37 crc kubenswrapper[5030]: I0121 00:11:37.526280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" event={"ID":"f269dd4a-049c-4639-b987-7831fcb249dc","Type":"ContainerDied","Data":"dea7270de9afd4ceb98f5f5e373eb14e2c823503fc52aa4f92211f7f733286c3"} Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.827598 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.929414 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prr59\" (UniqueName: \"kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59\") pod \"f269dd4a-049c-4639-b987-7831fcb249dc\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.929481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory\") pod \"f269dd4a-049c-4639-b987-7831fcb249dc\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.929561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls\") pod \"f269dd4a-049c-4639-b987-7831fcb249dc\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.929641 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle\") pod \"f269dd4a-049c-4639-b987-7831fcb249dc\" (UID: \"f269dd4a-049c-4639-b987-7831fcb249dc\") " Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.935472 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "f269dd4a-049c-4639-b987-7831fcb249dc" (UID: "f269dd4a-049c-4639-b987-7831fcb249dc"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.935887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59" (OuterVolumeSpecName: "kube-api-access-prr59") pod "f269dd4a-049c-4639-b987-7831fcb249dc" (UID: "f269dd4a-049c-4639-b987-7831fcb249dc"). InnerVolumeSpecName "kube-api-access-prr59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.953374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory" (OuterVolumeSpecName: "inventory") pod "f269dd4a-049c-4639-b987-7831fcb249dc" (UID: "f269dd4a-049c-4639-b987-7831fcb249dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:38 crc kubenswrapper[5030]: I0121 00:11:38.956323 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "f269dd4a-049c-4639-b987-7831fcb249dc" (UID: "f269dd4a-049c-4639-b987-7831fcb249dc"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.031224 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prr59\" (UniqueName: \"kubernetes.io/projected/f269dd4a-049c-4639-b987-7831fcb249dc-kube-api-access-prr59\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.031261 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.031271 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.031280 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f269dd4a-049c-4639-b987-7831fcb249dc-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.544166 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" event={"ID":"f269dd4a-049c-4639-b987-7831fcb249dc","Type":"ContainerDied","Data":"3e2cdd0aae07de85bd3bb1cc065c61112e321f0488db34aaf5f50d4d28558f8a"} Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.544208 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2cdd0aae07de85bd3bb1cc065c61112e321f0488db34aaf5f50d4d28558f8a" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.544246 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.962169 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:11:39 crc kubenswrapper[5030]: E0121 00:11:39.962435 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.991339 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq"] Jan 21 00:11:39 crc kubenswrapper[5030]: E0121 00:11:39.991647 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f269dd4a-049c-4639-b987-7831fcb249dc" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.991665 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f269dd4a-049c-4639-b987-7831fcb249dc" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.991808 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f269dd4a-049c-4639-b987-7831fcb249dc" containerName="tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.992240 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.996079 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.996101 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.996351 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.996516 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:39 crc kubenswrapper[5030]: I0121 00:11:39.996541 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.007863 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq"] Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.150900 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.151115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.151186 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.151338 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqfc\" (UniqueName: \"kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.252842 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.253257 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqfc\" (UniqueName: \"kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.253307 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.253368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.257545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.258023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.258953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.276331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqfc\" (UniqueName: \"kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc\") pod \"custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq\" (UID: 
\"e729d672-5409-47b0-9e4c-6a6c875782f8\") " pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.320520 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:40 crc kubenswrapper[5030]: I0121 00:11:40.799836 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq"] Jan 21 00:11:41 crc kubenswrapper[5030]: I0121 00:11:41.560698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" event={"ID":"e729d672-5409-47b0-9e4c-6a6c875782f8","Type":"ContainerStarted","Data":"3f84dc638d3c3bc7d40226ca98070268572b6907cbf834a2015ea7e7d12f6476"} Jan 21 00:11:43 crc kubenswrapper[5030]: I0121 00:11:43.584193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" event={"ID":"e729d672-5409-47b0-9e4c-6a6c875782f8","Type":"ContainerStarted","Data":"e17f8d6088fec44a34a9d5f5c2d13bbaadb9f6e8a7aa11a615410041648c8f82"} Jan 21 00:11:43 crc kubenswrapper[5030]: I0121 00:11:43.606812 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" podStartSLOduration=2.979329748 podStartE2EDuration="4.606791149s" podCreationTimestamp="2026-01-21 00:11:39 +0000 UTC" firstStartedPulling="2026-01-21 00:11:40.794253534 +0000 UTC m=+5773.114513822" lastFinishedPulling="2026-01-21 00:11:42.421714935 +0000 UTC m=+5774.741975223" observedRunningTime="2026-01-21 00:11:43.600939068 +0000 UTC m=+5775.921199366" watchObservedRunningTime="2026-01-21 00:11:43.606791149 +0000 UTC m=+5775.927051437" Jan 21 00:11:46 crc kubenswrapper[5030]: I0121 00:11:46.613931 5030 generic.go:334] "Generic (PLEG): container finished" podID="e729d672-5409-47b0-9e4c-6a6c875782f8" containerID="e17f8d6088fec44a34a9d5f5c2d13bbaadb9f6e8a7aa11a615410041648c8f82" exitCode=0 Jan 21 00:11:46 crc kubenswrapper[5030]: I0121 00:11:46.614029 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" event={"ID":"e729d672-5409-47b0-9e4c-6a6c875782f8","Type":"ContainerDied","Data":"e17f8d6088fec44a34a9d5f5c2d13bbaadb9f6e8a7aa11a615410041648c8f82"} Jan 21 00:11:47 crc kubenswrapper[5030]: I0121 00:11:47.959496 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.000384 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fqfc\" (UniqueName: \"kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc\") pod \"e729d672-5409-47b0-9e4c-6a6c875782f8\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.000578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle\") pod \"e729d672-5409-47b0-9e4c-6a6c875782f8\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.000644 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls\") pod \"e729d672-5409-47b0-9e4c-6a6c875782f8\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.000682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory\") pod \"e729d672-5409-47b0-9e4c-6a6c875782f8\" (UID: \"e729d672-5409-47b0-9e4c-6a6c875782f8\") " Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.005765 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc" (OuterVolumeSpecName: "kube-api-access-9fqfc") pod "e729d672-5409-47b0-9e4c-6a6c875782f8" (UID: "e729d672-5409-47b0-9e4c-6a6c875782f8"). InnerVolumeSpecName "kube-api-access-9fqfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.006869 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "e729d672-5409-47b0-9e4c-6a6c875782f8" (UID: "e729d672-5409-47b0-9e4c-6a6c875782f8"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.023533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory" (OuterVolumeSpecName: "inventory") pod "e729d672-5409-47b0-9e4c-6a6c875782f8" (UID: "e729d672-5409-47b0-9e4c-6a6c875782f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.028207 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "e729d672-5409-47b0-9e4c-6a6c875782f8" (UID: "e729d672-5409-47b0-9e4c-6a6c875782f8"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.102815 5030 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.102915 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.102933 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e729d672-5409-47b0-9e4c-6a6c875782f8-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.102950 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fqfc\" (UniqueName: \"kubernetes.io/projected/e729d672-5409-47b0-9e4c-6a6c875782f8-kube-api-access-9fqfc\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.638937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" event={"ID":"e729d672-5409-47b0-9e4c-6a6c875782f8","Type":"ContainerDied","Data":"3f84dc638d3c3bc7d40226ca98070268572b6907cbf834a2015ea7e7d12f6476"} Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.638984 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f84dc638d3c3bc7d40226ca98070268572b6907cbf834a2015ea7e7d12f6476" Jan 21 00:11:48 crc kubenswrapper[5030]: I0121 00:11:48.639003 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.291604 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg"] Jan 21 00:11:50 crc kubenswrapper[5030]: E0121 00:11:50.293075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e729d672-5409-47b0-9e4c-6a6c875782f8" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.293095 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e729d672-5409-47b0-9e4c-6a6c875782f8" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.293245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e729d672-5409-47b0-9e4c-6a6c875782f8" containerName="custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.293774 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.295256 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.295578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.296370 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.296928 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.297926 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.298911 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.300142 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.321966 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg"] Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345271 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345333 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345366 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345483 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle\") pod 
\"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345539 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345558 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.345646 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.447734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.447847 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod 
\"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448108 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448200 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448263 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.448315 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.454016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.454038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " 
pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.454157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.454507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.455926 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.459163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.460054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.479507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:50 crc kubenswrapper[5030]: I0121 00:11:50.659350 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:51 crc kubenswrapper[5030]: I0121 00:11:51.115125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg"] Jan 21 00:11:51 crc kubenswrapper[5030]: I0121 00:11:51.670295 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" event={"ID":"a4e84f57-c154-49aa-8741-6e5068d94481","Type":"ContainerStarted","Data":"2d4daf37887fa232ba8265248de65fb0a95320b59c57ec62ee5de8b4916f4ebd"} Jan 21 00:11:53 crc kubenswrapper[5030]: I0121 00:11:53.689345 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" event={"ID":"a4e84f57-c154-49aa-8741-6e5068d94481","Type":"ContainerStarted","Data":"808c466715a032ee3209d45b915fc9c59a888761ad015562a2cacadf4accc0d5"} Jan 21 00:11:53 crc kubenswrapper[5030]: I0121 00:11:53.721850 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" podStartSLOduration=2.047079911 podStartE2EDuration="3.72182537s" podCreationTimestamp="2026-01-21 00:11:50 +0000 UTC" firstStartedPulling="2026-01-21 00:11:51.123079907 +0000 UTC m=+5783.443340205" lastFinishedPulling="2026-01-21 00:11:52.797825366 +0000 UTC m=+5785.118085664" observedRunningTime="2026-01-21 00:11:53.71317531 +0000 UTC m=+5786.033435638" watchObservedRunningTime="2026-01-21 00:11:53.72182537 +0000 UTC m=+5786.042085698" Jan 21 00:11:54 crc kubenswrapper[5030]: I0121 00:11:54.962388 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:11:54 crc kubenswrapper[5030]: E0121 00:11:54.962691 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:11:56 crc kubenswrapper[5030]: I0121 00:11:56.726569 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4e84f57-c154-49aa-8741-6e5068d94481" containerID="808c466715a032ee3209d45b915fc9c59a888761ad015562a2cacadf4accc0d5" exitCode=0 Jan 21 00:11:56 crc kubenswrapper[5030]: I0121 00:11:56.726686 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" event={"ID":"a4e84f57-c154-49aa-8741-6e5068d94481","Type":"ContainerDied","Data":"808c466715a032ee3209d45b915fc9c59a888761ad015562a2cacadf4accc0d5"} Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.236352 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309739 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309811 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309916 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.309997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.310027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle\") pod \"a4e84f57-c154-49aa-8741-6e5068d94481\" (UID: \"a4e84f57-c154-49aa-8741-6e5068d94481\") " Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.315704 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod 
"a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.315749 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.315926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.316126 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.316259 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.317359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9" (OuterVolumeSpecName: "kube-api-access-62qd9") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "kube-api-access-62qd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.337328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.342275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory" (OuterVolumeSpecName: "inventory") pod "a4e84f57-c154-49aa-8741-6e5068d94481" (UID: "a4e84f57-c154-49aa-8741-6e5068d94481"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411310 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-kube-api-access-62qd9\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411350 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411361 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411375 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411389 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411400 5030 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4e84f57-c154-49aa-8741-6e5068d94481-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411413 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.411424 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a4e84f57-c154-49aa-8741-6e5068d94481-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.747807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" event={"ID":"a4e84f57-c154-49aa-8741-6e5068d94481","Type":"ContainerDied","Data":"2d4daf37887fa232ba8265248de65fb0a95320b59c57ec62ee5de8b4916f4ebd"} Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.747853 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d4daf37887fa232ba8265248de65fb0a95320b59c57ec62ee5de8b4916f4ebd" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.747921 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.843063 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:11:58 crc kubenswrapper[5030]: E0121 00:11:58.843658 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e84f57-c154-49aa-8741-6e5068d94481" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.843678 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e84f57-c154-49aa-8741-6e5068d94481" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.843856 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e84f57-c154-49aa-8741-6e5068d94481" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.844433 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.846974 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-openstack-edpm-tls" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.847262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.847461 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.847598 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-dockercfg-4jfcd" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.852199 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.855166 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.918105 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.918171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qjcc\" (UniqueName: \"kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.918197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:58 crc kubenswrapper[5030]: I0121 00:11:58.918379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.020003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.020098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qjcc\" (UniqueName: \"kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.020123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.020196 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.024871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.025179 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.031870 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.036953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qjcc\" (UniqueName: \"kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc\") pod \"tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.085332 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp"] Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.086705 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.090413 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-tls-dns-ips-default-certs-0" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.091721 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"openstack-edpm-tls-custom-tls-dns-default-certs-0" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.102944 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp"] Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.164975 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223211 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223279 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223307 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223333 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223598 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.223910 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:11:59 crc kubenswrapper[5030]: I0121 00:11:59.224077 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326147 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326225 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326249 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326294 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326314 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326340 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.326380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.331010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.331198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.331443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.332688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.333315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.333541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.340089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.347178 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:11:59.434247 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:00 crc kubenswrapper[5030]: E0121 00:11:59.434460 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:12:00.877683 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:12:00 crc kubenswrapper[5030]: I0121 00:12:00.941136 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp"] Jan 21 00:12:00 crc kubenswrapper[5030]: E0121 00:12:00.946725 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:01 crc kubenswrapper[5030]: E0121 00:12:01.640290 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:01 crc kubenswrapper[5030]: I0121 00:12:01.773137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" event={"ID":"87c31536-fea3-4cf7-989b-4873e208454f","Type":"ContainerStarted","Data":"807090d1107a9c59a9389067cbfe24c7664ac6affbcec79edcb91a21857e2c82"} Jan 21 00:12:01 crc kubenswrapper[5030]: I0121 00:12:01.774548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" event={"ID":"23dd7785-7f14-44ac-b615-c48e0367f949","Type":"ContainerStarted","Data":"b0dad132493957c43c18a181b366ffb6e2167787ffc6cf1d64a7e5fd41d2d45a"} Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.784260 5030 generic.go:334] "Generic (PLEG): container finished" podID="87c31536-fea3-4cf7-989b-4873e208454f" containerID="a6159386ce2f2f438d145b7614214979b9486e0514f17ad647c7f1ac436d8b13" exitCode=0 Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.784341 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" event={"ID":"87c31536-fea3-4cf7-989b-4873e208454f","Type":"ContainerDied","Data":"a6159386ce2f2f438d145b7614214979b9486e0514f17ad647c7f1ac436d8b13"} Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.787231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" event={"ID":"23dd7785-7f14-44ac-b615-c48e0367f949","Type":"ContainerStarted","Data":"6cdf699538371ab232fbd636fd875076b1ecdabcdefe8ee91d0fb83a745b3bba"} Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.866300 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" podStartSLOduration=3.66620519 podStartE2EDuration="4.820525429s" podCreationTimestamp="2026-01-21 00:11:58 +0000 UTC" firstStartedPulling="2026-01-21 
00:12:00.89620012 +0000 UTC m=+5793.216460408" lastFinishedPulling="2026-01-21 00:12:02.050520359 +0000 UTC m=+5794.370780647" observedRunningTime="2026-01-21 00:12:02.816328317 +0000 UTC m=+5795.136588645" watchObservedRunningTime="2026-01-21 00:12:02.820525429 +0000 UTC m=+5795.140785707" Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.875227 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp"] Jan 21 00:12:02 crc kubenswrapper[5030]: I0121 00:12:02.880659 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp"] Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.173482 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm"] Jan 21 00:12:03 crc kubenswrapper[5030]: E0121 00:12:03.174227 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c31536-fea3-4cf7-989b-4873e208454f" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.174253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c31536-fea3-4cf7-989b-4873e208454f" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.174488 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c31536-fea3-4cf7-989b-4873e208454f" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.175158 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.194792 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm"] Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333210 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: 
\"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333584 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333731 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.333945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.334037 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.435855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.435909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.435932 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.435957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.435979 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.436017 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.436097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.436128 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.442441 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.442573 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.442845 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.443307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.443481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.443605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.445335 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.462163 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9\") pod \"install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.497574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:03 crc kubenswrapper[5030]: E0121 00:12:03.497703 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:03 crc kubenswrapper[5030]: I0121 00:12:03.936373 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm"] Jan 21 00:12:03 crc kubenswrapper[5030]: W0121 00:12:03.942519 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod109f040c_197c_43ba_b93f_271f3c4e47a5.slice/crio-6862abb762aa24bb8b76ac387f14c956634689b2c294eb4995d9e7614c5aca62 WatchSource:0}: Error finding container 6862abb762aa24bb8b76ac387f14c956634689b2c294eb4995d9e7614c5aca62: Status 404 returned error can't find the container with id 6862abb762aa24bb8b76ac387f14c956634689b2c294eb4995d9e7614c5aca62 Jan 21 00:12:03 crc kubenswrapper[5030]: E0121 00:12:03.943911 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.066486 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.248965 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249188 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249278 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249391 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249451 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.249587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9\") pod \"87c31536-fea3-4cf7-989b-4873e208454f\" (UID: \"87c31536-fea3-4cf7-989b-4873e208454f\") " Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.254863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod 
"87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.255105 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.255200 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.260560 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.260741 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.261359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9" (OuterVolumeSpecName: "kube-api-access-62qd9") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "kube-api-access-62qd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.272943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory" (OuterVolumeSpecName: "inventory") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.276497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "87c31536-fea3-4cf7-989b-4873e208454f" (UID: "87c31536-fea3-4cf7-989b-4873e208454f"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351248 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-kube-api-access-62qd9\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351530 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351544 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/87c31536-fea3-4cf7-989b-4873e208454f-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351555 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351567 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351578 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351588 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.351598 5030 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c31536-fea3-4cf7-989b-4873e208454f-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:04 crc kubenswrapper[5030]: E0121 00:12:04.761631 5030 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" hostnameMaxLen=63 truncatedHostname="install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.805544 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" event={"ID":"109f040c-197c-43ba-b93f-271f3c4e47a5","Type":"ContainerStarted","Data":"6862abb762aa24bb8b76ac387f14c956634689b2c294eb4995d9e7614c5aca62"} Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.807367 5030 scope.go:117] "RemoveContainer" containerID="a6159386ce2f2f438d145b7614214979b9486e0514f17ad647c7f1ac436d8b13" Jan 21 00:12:04 crc kubenswrapper[5030]: I0121 00:12:04.807415 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-w7lcp" Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.818543 5030 generic.go:334] "Generic (PLEG): container finished" podID="23dd7785-7f14-44ac-b615-c48e0367f949" containerID="6cdf699538371ab232fbd636fd875076b1ecdabcdefe8ee91d0fb83a745b3bba" exitCode=0 Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.818665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" event={"ID":"23dd7785-7f14-44ac-b615-c48e0367f949","Type":"ContainerDied","Data":"6cdf699538371ab232fbd636fd875076b1ecdabcdefe8ee91d0fb83a745b3bba"} Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.822323 5030 generic.go:334] "Generic (PLEG): container finished" podID="109f040c-197c-43ba-b93f-271f3c4e47a5" containerID="1e55b80f2a72abb7314cce6960684c3f1b899fb93aa4bc5f6c14740462bcaa2a" exitCode=0 Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.822363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" event={"ID":"109f040c-197c-43ba-b93f-271f3c4e47a5","Type":"ContainerDied","Data":"1e55b80f2a72abb7314cce6960684c3f1b899fb93aa4bc5f6c14740462bcaa2a"} Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.886690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm"] Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.898308 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm"] Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.956826 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb"] Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.970787 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c31536-fea3-4cf7-989b-4873e208454f" path="/var/lib/kubelet/pods/87c31536-fea3-4cf7-989b-4873e208454f/volumes" Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.971264 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-ovrd-openstack-edpm-f92sb"] Jan 21 00:12:05 crc kubenswrapper[5030]: I0121 00:12:05.978346 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.000899 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.009772 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.015990 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-tls-dns-openstack-edpm-tls-ovrd-openstack-edpm-tls-9xttq"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.023559 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.031466 5030 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-openstack-edpm-tls-ovrd-openstack-edpm-tls-8pf8p"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.037503 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.044940 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.051797 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/install-certs-ovrd-openstack-edpm-tls-openstack-edpm-tls-mwfjq"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.058019 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.064214 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dnsnames-openstack-edpm-tls-openstack-edpm-tls-rj9s2"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.069033 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:06 crc kubenswrapper[5030]: E0121 00:12:06.069302 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="109f040c-197c-43ba-b93f-271f3c4e47a5" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.069314 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="109f040c-197c-43ba-b93f-271f3c4e47a5" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.069468 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="109f040c-197c-43ba-b93f-271f3c4e47a5" containerName="install-certs-ovrd-certs-refresh-openstack-edpm-tls" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.070200 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.079171 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.182240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhczg\" (UniqueName: \"kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.182320 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.182342 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.283580 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhczg\" (UniqueName: \"kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.283660 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.283681 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.284545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.284575 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 
00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.305156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhczg\" (UniqueName: \"kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg\") pod \"dnsmasq-dnsmasq-84b9f45d47-l74hc\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.409329 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:06 crc kubenswrapper[5030]: I0121 00:12:06.821687 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:06 crc kubenswrapper[5030]: W0121 00:12:06.825942 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b5f2ad4_9390_4f53_b6c3_aaa2ef112e31.slice/crio-577cc34b4acb2f76b27798fa468190793d8fed3b5de4f6fdd86dd04fe867ec14 WatchSource:0}: Error finding container 577cc34b4acb2f76b27798fa468190793d8fed3b5de4f6fdd86dd04fe867ec14: Status 404 returned error can't find the container with id 577cc34b4acb2f76b27798fa468190793d8fed3b5de4f6fdd86dd04fe867ec14 Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.082294 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.125017 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.201195 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls\") pod \"23dd7785-7f14-44ac-b615-c48e0367f949\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.201325 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle\") pod \"23dd7785-7f14-44ac-b615-c48e0367f949\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.201380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qjcc\" (UniqueName: \"kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc\") pod \"23dd7785-7f14-44ac-b615-c48e0367f949\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.201442 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory\") pod \"23dd7785-7f14-44ac-b615-c48e0367f949\" (UID: \"23dd7785-7f14-44ac-b615-c48e0367f949\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.205101 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc" (OuterVolumeSpecName: "kube-api-access-2qjcc") pod 
"23dd7785-7f14-44ac-b615-c48e0367f949" (UID: "23dd7785-7f14-44ac-b615-c48e0367f949"). InnerVolumeSpecName "kube-api-access-2qjcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.205130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "23dd7785-7f14-44ac-b615-c48e0367f949" (UID: "23dd7785-7f14-44ac-b615-c48e0367f949"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.219802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory" (OuterVolumeSpecName: "inventory") pod "23dd7785-7f14-44ac-b615-c48e0367f949" (UID: "23dd7785-7f14-44ac-b615-c48e0367f949"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.222967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "23dd7785-7f14-44ac-b615-c48e0367f949" (UID: "23dd7785-7f14-44ac-b615-c48e0367f949"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.302716 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.302810 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.302885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.302968 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303049 5030 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle\") pod \"109f040c-197c-43ba-b93f-271f3c4e47a5\" (UID: \"109f040c-197c-43ba-b93f-271f3c4e47a5\") " Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303481 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303503 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qjcc\" (UniqueName: \"kubernetes.io/projected/23dd7785-7f14-44ac-b615-c48e0367f949-kube-api-access-2qjcc\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303518 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.303529 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/23dd7785-7f14-44ac-b615-c48e0367f949-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.306314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-tls-dns-ips-default-certs-0") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "openstack-edpm-tls-tls-dns-ips-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.306963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle" (OuterVolumeSpecName: "install-certs-ovrd-combined-ca-bundle") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "install-certs-ovrd-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.308843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle" (OuterVolumeSpecName: "tls-dns-ips-combined-ca-bundle") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "tls-dns-ips-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.309279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle" (OuterVolumeSpecName: "custom-tls-dns-combined-ca-bundle") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "custom-tls-dns-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.309453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9" (OuterVolumeSpecName: "kube-api-access-62qd9") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "kube-api-access-62qd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.310641 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-tls-custom-tls-dns-default-certs-0") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "openstack-edpm-tls-custom-tls-dns-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.325312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory" (OuterVolumeSpecName: "inventory") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.331366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls" (OuterVolumeSpecName: "ssh-key-openstack-edpm-tls") pod "109f040c-197c-43ba-b93f-271f3c4e47a5" (UID: "109f040c-197c-43ba-b93f-271f3c4e47a5"). InnerVolumeSpecName "ssh-key-openstack-edpm-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404834 5030 reconciler_common.go:293] "Volume detached for volume \"custom-tls-dns-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-custom-tls-dns-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404868 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-custom-tls-dns-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-custom-tls-dns-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404885 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62qd9\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-kube-api-access-62qd9\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404896 5030 reconciler_common.go:293] "Volume detached for volume \"install-certs-ovrd-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-install-certs-ovrd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404909 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404918 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls-tls-dns-ips-default-certs-0\" (UniqueName: \"kubernetes.io/projected/109f040c-197c-43ba-b93f-271f3c4e47a5-openstack-edpm-tls-tls-dns-ips-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404931 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-tls\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-ssh-key-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.404941 5030 reconciler_common.go:293] "Volume detached for volume \"tls-dns-ips-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/109f040c-197c-43ba-b93f-271f3c4e47a5-tls-dns-ips-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.841015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerStarted","Data":"43808b530f21b598eca811bc4b36de08321af911b57be25ef2de3be33db93cad"} Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.841380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerStarted","Data":"577cc34b4acb2f76b27798fa468190793d8fed3b5de4f6fdd86dd04fe867ec14"} Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.843887 5030 scope.go:117] "RemoveContainer" containerID="1e55b80f2a72abb7314cce6960684c3f1b899fb93aa4bc5f6c14740462bcaa2a" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.843951 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/install-certs-ovrd-certs-refresh-openstack-edpm-tls-mkxmg-debug-79svm" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.851787 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" event={"ID":"23dd7785-7f14-44ac-b615-c48e0367f949","Type":"ContainerDied","Data":"b0dad132493957c43c18a181b366ffb6e2167787ffc6cf1d64a7e5fd41d2d45a"} Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.851824 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0dad132493957c43c18a181b366ffb6e2167787ffc6cf1d64a7e5fd41d2d45a" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.851869 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.946984 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.953799 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/tls-dns-ips-certs-refresh-openstack-edpm-tls-v8zpb"] Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.969961 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:12:07 crc kubenswrapper[5030]: E0121 00:12:07.970341 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.979434 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="109f040c-197c-43ba-b93f-271f3c4e47a5" path="/var/lib/kubelet/pods/109f040c-197c-43ba-b93f-271f3c4e47a5/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.980170 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23dd7785-7f14-44ac-b615-c48e0367f949" path="/var/lib/kubelet/pods/23dd7785-7f14-44ac-b615-c48e0367f949/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.980894 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552195c8-5b4f-4685-ace9-e1b093ad8617" path="/var/lib/kubelet/pods/552195c8-5b4f-4685-ace9-e1b093ad8617/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.981520 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6287ec83-eeba-458f-a91e-5667d0858735" path="/var/lib/kubelet/pods/6287ec83-eeba-458f-a91e-5667d0858735/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.982967 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e84f57-c154-49aa-8741-6e5068d94481" path="/var/lib/kubelet/pods/a4e84f57-c154-49aa-8741-6e5068d94481/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.983577 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d" path="/var/lib/kubelet/pods/a75dfcc1-bbfa-44bc-b616-f2ff3e0cc17d/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.984258 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e729d672-5409-47b0-9e4c-6a6c875782f8" path="/var/lib/kubelet/pods/e729d672-5409-47b0-9e4c-6a6c875782f8/volumes" Jan 21 00:12:07 crc kubenswrapper[5030]: I0121 00:12:07.985580 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f269dd4a-049c-4639-b987-7831fcb249dc" path="/var/lib/kubelet/pods/f269dd4a-049c-4639-b987-7831fcb249dc/volumes" Jan 21 00:12:08 crc kubenswrapper[5030]: I0121 00:12:08.869865 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerID="43808b530f21b598eca811bc4b36de08321af911b57be25ef2de3be33db93cad" exitCode=0 Jan 21 00:12:08 crc kubenswrapper[5030]: I0121 00:12:08.869938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerDied","Data":"43808b530f21b598eca811bc4b36de08321af911b57be25ef2de3be33db93cad"} Jan 21 00:12:09 crc kubenswrapper[5030]: I0121 00:12:09.883116 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerStarted","Data":"0abbd64c785752eec005749b253689ef56538e2b5eb9865e3aa6441640191890"} Jan 21 00:12:09 crc kubenswrapper[5030]: I0121 00:12:09.883455 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:09 crc kubenswrapper[5030]: I0121 00:12:09.907266 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" podStartSLOduration=3.907249031 podStartE2EDuration="3.907249031s" podCreationTimestamp="2026-01-21 00:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:12:09.902076656 +0000 UTC m=+5802.222336944" watchObservedRunningTime="2026-01-21 00:12:09.907249031 +0000 UTC m=+5802.227509319" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.294666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-592ch"] Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.300481 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-592ch"] Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.438746 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jgdtg"] Jan 21 00:12:14 crc kubenswrapper[5030]: E0121 00:12:14.439372 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23dd7785-7f14-44ac-b615-c48e0367f949" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.439482 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23dd7785-7f14-44ac-b615-c48e0367f949" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.439899 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23dd7785-7f14-44ac-b615-c48e0367f949" containerName="tls-dns-ips-certs-refresh-openstack-edpm-tls" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.440825 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.443014 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.443079 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.443345 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.447930 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.449876 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jgdtg"] Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.531593 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.532110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd687\" (UniqueName: \"kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.532230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.633037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.633382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.633568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd687\" (UniqueName: \"kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.633785 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " 
pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.634793 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.654617 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd687\" (UniqueName: \"kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687\") pod \"crc-storage-crc-jgdtg\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:14 crc kubenswrapper[5030]: I0121 00:12:14.762997 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:15 crc kubenswrapper[5030]: I0121 00:12:15.194214 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jgdtg"] Jan 21 00:12:15 crc kubenswrapper[5030]: I0121 00:12:15.945062 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jgdtg" event={"ID":"6e0ad6ae-a386-42e5-8246-bb563230b508","Type":"ContainerStarted","Data":"872de74aef18a16f81624c855f70cc4711334e4ca8f367bb4c60ea48f7f1a3c1"} Jan 21 00:12:15 crc kubenswrapper[5030]: I0121 00:12:15.972388 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d" path="/var/lib/kubelet/pods/cb4d4cb1-293d-481c-9dd7-3f1797a2fa9d/volumes" Jan 21 00:12:16 crc kubenswrapper[5030]: I0121 00:12:16.411422 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:16 crc kubenswrapper[5030]: I0121 00:12:16.501568 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:12:16 crc kubenswrapper[5030]: I0121 00:12:16.502072 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="dnsmasq-dns" containerID="cri-o://a8e4fdac2501ca40c43af66648ac121f8b07dca717af20dfc5b34338c337f8ef" gracePeriod=10 Jan 21 00:12:16 crc kubenswrapper[5030]: I0121 00:12:16.957096 5030 generic.go:334] "Generic (PLEG): container finished" podID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerID="a8e4fdac2501ca40c43af66648ac121f8b07dca717af20dfc5b34338c337f8ef" exitCode=0 Jan 21 00:12:16 crc kubenswrapper[5030]: I0121 00:12:16.957129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" event={"ID":"7b2e7b50-765c-4f4a-b9da-69f16961e8b1","Type":"ContainerDied","Data":"a8e4fdac2501ca40c43af66648ac121f8b07dca717af20dfc5b34338c337f8ef"} Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.218530 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.277569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrf2z\" (UniqueName: \"kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z\") pod \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.277696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls\") pod \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.277774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config\") pod \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.277823 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc\") pod \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\" (UID: \"7b2e7b50-765c-4f4a-b9da-69f16961e8b1\") " Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.283750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z" (OuterVolumeSpecName: "kube-api-access-zrf2z") pod "7b2e7b50-765c-4f4a-b9da-69f16961e8b1" (UID: "7b2e7b50-765c-4f4a-b9da-69f16961e8b1"). InnerVolumeSpecName "kube-api-access-zrf2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.318199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "7b2e7b50-765c-4f4a-b9da-69f16961e8b1" (UID: "7b2e7b50-765c-4f4a-b9da-69f16961e8b1"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.319081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls" (OuterVolumeSpecName: "openstack-edpm-tls") pod "7b2e7b50-765c-4f4a-b9da-69f16961e8b1" (UID: "7b2e7b50-765c-4f4a-b9da-69f16961e8b1"). InnerVolumeSpecName "openstack-edpm-tls". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.320539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config" (OuterVolumeSpecName: "config") pod "7b2e7b50-765c-4f4a-b9da-69f16961e8b1" (UID: "7b2e7b50-765c-4f4a-b9da-69f16961e8b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.379795 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-tls\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-openstack-edpm-tls\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.379834 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.379844 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.379857 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrf2z\" (UniqueName: \"kubernetes.io/projected/7b2e7b50-765c-4f4a-b9da-69f16961e8b1-kube-api-access-zrf2z\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.966716 5030 generic.go:334] "Generic (PLEG): container finished" podID="6e0ad6ae-a386-42e5-8246-bb563230b508" containerID="4cd150a4f0dcf557aa527bf36c716d85a95bde3fe3f482844022156f6b8740c7" exitCode=0 Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.971088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jgdtg" event={"ID":"6e0ad6ae-a386-42e5-8246-bb563230b508","Type":"ContainerDied","Data":"4cd150a4f0dcf557aa527bf36c716d85a95bde3fe3f482844022156f6b8740c7"} Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.972277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" event={"ID":"7b2e7b50-765c-4f4a-b9da-69f16961e8b1","Type":"ContainerDied","Data":"b36d32b4d828fec3cc2ee394b65c9839dc02cbe0812194062d4e1025ca6ad537"} Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.972317 5030 scope.go:117] "RemoveContainer" containerID="a8e4fdac2501ca40c43af66648ac121f8b07dca717af20dfc5b34338c337f8ef" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.972360 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44" Jan 21 00:12:17 crc kubenswrapper[5030]: I0121 00:12:17.996447 5030 scope.go:117] "RemoveContainer" containerID="dc49ff8692861e4255c6051f313a37d6b3f437b9726d6a0e3002bb1776e705e7" Jan 21 00:12:18 crc kubenswrapper[5030]: I0121 00:12:18.024374 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:12:18 crc kubenswrapper[5030]: I0121 00:12:18.029783 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-79cc674687-npl44"] Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.230687 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.306249 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd687\" (UniqueName: \"kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687\") pod \"6e0ad6ae-a386-42e5-8246-bb563230b508\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.306301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage\") pod \"6e0ad6ae-a386-42e5-8246-bb563230b508\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.306407 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt\") pod \"6e0ad6ae-a386-42e5-8246-bb563230b508\" (UID: \"6e0ad6ae-a386-42e5-8246-bb563230b508\") " Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.306666 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6e0ad6ae-a386-42e5-8246-bb563230b508" (UID: "6e0ad6ae-a386-42e5-8246-bb563230b508"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.306842 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6e0ad6ae-a386-42e5-8246-bb563230b508-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.310979 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687" (OuterVolumeSpecName: "kube-api-access-xd687") pod "6e0ad6ae-a386-42e5-8246-bb563230b508" (UID: "6e0ad6ae-a386-42e5-8246-bb563230b508"). InnerVolumeSpecName "kube-api-access-xd687". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.323518 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6e0ad6ae-a386-42e5-8246-bb563230b508" (UID: "6e0ad6ae-a386-42e5-8246-bb563230b508"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.407750 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd687\" (UniqueName: \"kubernetes.io/projected/6e0ad6ae-a386-42e5-8246-bb563230b508-kube-api-access-xd687\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.408095 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6e0ad6ae-a386-42e5-8246-bb563230b508-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.961998 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:12:19 crc kubenswrapper[5030]: E0121 00:12:19.962401 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.972474 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" path="/var/lib/kubelet/pods/7b2e7b50-765c-4f4a-b9da-69f16961e8b1/volumes" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.998377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jgdtg" event={"ID":"6e0ad6ae-a386-42e5-8246-bb563230b508","Type":"ContainerDied","Data":"872de74aef18a16f81624c855f70cc4711334e4ca8f367bb4c60ea48f7f1a3c1"} Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.998426 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jgdtg" Jan 21 00:12:19 crc kubenswrapper[5030]: I0121 00:12:19.998425 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872de74aef18a16f81624c855f70cc4711334e4ca8f367bb4c60ea48f7f1a3c1" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.222761 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jgdtg"] Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.229186 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jgdtg"] Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385397 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-pm6b8"] Jan 21 00:12:23 crc kubenswrapper[5030]: E0121 00:12:23.385775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="init" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385794 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="init" Jan 21 00:12:23 crc kubenswrapper[5030]: E0121 00:12:23.385807 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0ad6ae-a386-42e5-8246-bb563230b508" containerName="storage" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385814 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0ad6ae-a386-42e5-8246-bb563230b508" containerName="storage" Jan 21 00:12:23 crc kubenswrapper[5030]: E0121 00:12:23.385831 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="dnsmasq-dns" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="dnsmasq-dns" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385956 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2e7b50-765c-4f4a-b9da-69f16961e8b1" containerName="dnsmasq-dns" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.385982 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0ad6ae-a386-42e5-8246-bb563230b508" containerName="storage" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.386464 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.388495 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.389030 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.389077 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.389082 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.392527 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pm6b8"] Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.467467 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.467821 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrr7l\" (UniqueName: \"kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.467948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.569220 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.570131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrr7l\" (UniqueName: \"kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.570271 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.571333 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " 
pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.571615 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.605726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrr7l\" (UniqueName: \"kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l\") pod \"crc-storage-crc-pm6b8\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.701203 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:23 crc kubenswrapper[5030]: I0121 00:12:23.975283 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0ad6ae-a386-42e5-8246-bb563230b508" path="/var/lib/kubelet/pods/6e0ad6ae-a386-42e5-8246-bb563230b508/volumes" Jan 21 00:12:24 crc kubenswrapper[5030]: I0121 00:12:24.109698 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pm6b8"] Jan 21 00:12:24 crc kubenswrapper[5030]: W0121 00:12:24.113922 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e339d48_7041_45c7_a8fd_8586e0bcb73e.slice/crio-9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184 WatchSource:0}: Error finding container 9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184: Status 404 returned error can't find the container with id 9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184 Jan 21 00:12:25 crc kubenswrapper[5030]: I0121 00:12:25.039590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pm6b8" event={"ID":"1e339d48-7041-45c7-a8fd-8586e0bcb73e","Type":"ContainerStarted","Data":"9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184"} Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.059897 5030 generic.go:334] "Generic (PLEG): container finished" podID="1e339d48-7041-45c7-a8fd-8586e0bcb73e" containerID="af80139a16fc5d56aad0d1464a7af3e6a4ffa08bff4a2f1c3865175be23c6fc8" exitCode=0 Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.059989 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pm6b8" event={"ID":"1e339d48-7041-45c7-a8fd-8586e0bcb73e","Type":"ContainerDied","Data":"af80139a16fc5d56aad0d1464a7af3e6a4ffa08bff4a2f1c3865175be23c6fc8"} Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.528096 5030 scope.go:117] "RemoveContainer" containerID="af34ceed8e9bf13ec3c26ccbf34aa4a2647431beac065ed3581c2efc00908b28" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.568402 5030 scope.go:117] "RemoveContainer" containerID="a7d9774025737c4497227ec8725c0294a993dbee1f310820daaa88fd05c97532" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.597896 5030 scope.go:117] "RemoveContainer" containerID="b0d506265f978beb8126f6798cd50bf16df24d80854f3b6172f602219dded71b" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.635816 5030 scope.go:117] "RemoveContainer" containerID="8c661467137337265d146274a69301d98c05212632a21587c294aedf83e72985" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 
00:12:27.654412 5030 scope.go:117] "RemoveContainer" containerID="95861705696a2581bb7d9cb33534f21eec2421d6086413b3d6c124867f1992e8" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.682309 5030 scope.go:117] "RemoveContainer" containerID="c038ec01c492b5f8a12c4451e87132666135683daa656c8c5bfb22f57684b6c4" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.719463 5030 scope.go:117] "RemoveContainer" containerID="c57fa0852a94326fd2eb1efb569129a3b88c52bbd0c71a3d2d8eb0d1f3e05ce0" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.745498 5030 scope.go:117] "RemoveContainer" containerID="4398953fb94de880cdde55183ed2f8d0aae947333b5028db7dfbe6fbfed0e008" Jan 21 00:12:27 crc kubenswrapper[5030]: I0121 00:12:27.777181 5030 scope.go:117] "RemoveContainer" containerID="7f354130cfe70a506e0b8da7fc2de2a746b7f0bec1340ad75d7e0807dd9c86f0" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.321670 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.392656 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage\") pod \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.392753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrr7l\" (UniqueName: \"kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l\") pod \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.392777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt\") pod \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\" (UID: \"1e339d48-7041-45c7-a8fd-8586e0bcb73e\") " Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.393014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1e339d48-7041-45c7-a8fd-8586e0bcb73e" (UID: "1e339d48-7041-45c7-a8fd-8586e0bcb73e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.396703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l" (OuterVolumeSpecName: "kube-api-access-lrr7l") pod "1e339d48-7041-45c7-a8fd-8586e0bcb73e" (UID: "1e339d48-7041-45c7-a8fd-8586e0bcb73e"). InnerVolumeSpecName "kube-api-access-lrr7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.408864 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1e339d48-7041-45c7-a8fd-8586e0bcb73e" (UID: "1e339d48-7041-45c7-a8fd-8586e0bcb73e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.493958 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1e339d48-7041-45c7-a8fd-8586e0bcb73e-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.493999 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrr7l\" (UniqueName: \"kubernetes.io/projected/1e339d48-7041-45c7-a8fd-8586e0bcb73e-kube-api-access-lrr7l\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:28 crc kubenswrapper[5030]: I0121 00:12:28.494009 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1e339d48-7041-45c7-a8fd-8586e0bcb73e-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:29 crc kubenswrapper[5030]: I0121 00:12:29.095416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pm6b8" event={"ID":"1e339d48-7041-45c7-a8fd-8586e0bcb73e","Type":"ContainerDied","Data":"9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184"} Jan 21 00:12:29 crc kubenswrapper[5030]: I0121 00:12:29.095504 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pm6b8" Jan 21 00:12:29 crc kubenswrapper[5030]: I0121 00:12:29.095524 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e9f7b37336b0ffd00fd47eda69d0f7505d5643c1f16fea89bb46e61074a5184" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.700464 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:31 crc kubenswrapper[5030]: E0121 00:12:31.703469 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e339d48-7041-45c7-a8fd-8586e0bcb73e" containerName="storage" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.703585 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e339d48-7041-45c7-a8fd-8586e0bcb73e" containerName="storage" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.703997 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e339d48-7041-45c7-a8fd-8586e0bcb73e" containerName="storage" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.711679 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.711835 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.714709 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-extramounts" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.765084 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.765534 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.765692 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.765764 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.867049 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.867455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.867722 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.867929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.867930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.868473 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.869115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.889100 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55\") pod \"dnsmasq-dnsmasq-76cd9645f5-xvnj5\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:31 crc kubenswrapper[5030]: I0121 00:12:31.962857 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:12:31 crc kubenswrapper[5030]: E0121 00:12:31.963322 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:12:32 crc kubenswrapper[5030]: I0121 00:12:32.037374 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:32 crc kubenswrapper[5030]: I0121 00:12:32.474398 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:33 crc kubenswrapper[5030]: I0121 00:12:33.136195 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" event={"ID":"46a97e5c-2126-4f43-a86c-c040672f56ac","Type":"ContainerStarted","Data":"8640dc8cf7e86dbb92326fac1b96aecbabc5b1ba9ed093a36ca4c1bd185b6c7c"} Jan 21 00:12:34 crc kubenswrapper[5030]: I0121 00:12:34.148022 5030 generic.go:334] "Generic (PLEG): container finished" podID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerID="32fe4f6465612862d4585cafd1d0c175eb313122dc01de192c987c948162b06a" exitCode=0 Jan 21 00:12:34 crc kubenswrapper[5030]: I0121 00:12:34.148092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" event={"ID":"46a97e5c-2126-4f43-a86c-c040672f56ac","Type":"ContainerDied","Data":"32fe4f6465612862d4585cafd1d0c175eb313122dc01de192c987c948162b06a"} Jan 21 00:12:35 crc kubenswrapper[5030]: I0121 00:12:35.161933 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" event={"ID":"46a97e5c-2126-4f43-a86c-c040672f56ac","Type":"ContainerStarted","Data":"47e606e5ffbe4497598fe8854bd9cb875bbc06e4188a1c689bb803cc466f7af1"} Jan 21 00:12:35 crc kubenswrapper[5030]: I0121 00:12:35.162429 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:35 crc kubenswrapper[5030]: I0121 00:12:35.188843 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" podStartSLOduration=4.188826054 podStartE2EDuration="4.188826054s" podCreationTimestamp="2026-01-21 00:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:12:35.182867 +0000 UTC m=+5827.503127298" watchObservedRunningTime="2026-01-21 00:12:35.188826054 +0000 UTC m=+5827.509086342" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.041801 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.110551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.110914 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="dnsmasq-dns" containerID="cri-o://0abbd64c785752eec005749b253689ef56538e2b5eb9865e3aa6441640191890" gracePeriod=10 Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.256200 5030 generic.go:334] "Generic (PLEG): container finished" podID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerID="0abbd64c785752eec005749b253689ef56538e2b5eb9865e3aa6441640191890" exitCode=0 Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.256251 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" 
event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerDied","Data":"0abbd64c785752eec005749b253689ef56538e2b5eb9865e3aa6441640191890"} Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.615554 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.735808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc\") pod \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.735898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhczg\" (UniqueName: \"kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg\") pod \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.736010 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config\") pod \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\" (UID: \"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31\") " Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.741192 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg" (OuterVolumeSpecName: "kube-api-access-jhczg") pod "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" (UID: "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31"). InnerVolumeSpecName "kube-api-access-jhczg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.769476 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" (UID: "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.787585 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config" (OuterVolumeSpecName: "config") pod "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" (UID: "4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.837366 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.837425 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhczg\" (UniqueName: \"kubernetes.io/projected/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-kube-api-access-jhczg\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:42 crc kubenswrapper[5030]: I0121 00:12:42.837444 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.272117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" event={"ID":"4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31","Type":"ContainerDied","Data":"577cc34b4acb2f76b27798fa468190793d8fed3b5de4f6fdd86dd04fe867ec14"} Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.272153 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc" Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.272203 5030 scope.go:117] "RemoveContainer" containerID="0abbd64c785752eec005749b253689ef56538e2b5eb9865e3aa6441640191890" Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.307370 5030 scope.go:117] "RemoveContainer" containerID="43808b530f21b598eca811bc4b36de08321af911b57be25ef2de3be33db93cad" Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.314213 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.324224 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-l74hc"] Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.963426 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:12:43 crc kubenswrapper[5030]: E0121 00:12:43.964375 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:12:43 crc kubenswrapper[5030]: I0121 00:12:43.974357 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" path="/var/lib/kubelet/pods/4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31/volumes" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.830254 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:12:46 crc kubenswrapper[5030]: E0121 00:12:46.830933 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="dnsmasq-dns" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.830950 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="dnsmasq-dns" Jan 21 00:12:46 crc kubenswrapper[5030]: E0121 00:12:46.831001 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="init" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.831010 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="init" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.831309 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5f2ad4-9390-4f53-b6c3-aaa2ef112e31" containerName="dnsmasq-dns" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.832218 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.869248 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.902346 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82f2t\" (UniqueName: \"kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.902648 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:46 crc kubenswrapper[5030]: I0121 00:12:46.902745 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.004335 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.004402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.004469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82f2t\" (UniqueName: \"kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc 
kubenswrapper[5030]: I0121 00:12:47.005864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.006136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.027830 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82f2t\" (UniqueName: \"kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t\") pod \"dnsmasq-dnsmasq-84b9f45d47-g5mjf\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.149505 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:47 crc kubenswrapper[5030]: I0121 00:12:47.395929 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:12:48 crc kubenswrapper[5030]: I0121 00:12:48.327893 5030 generic.go:334] "Generic (PLEG): container finished" podID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerID="eb264058be944df9b4ab4e3d081bd0021dcea6ee40423f309ac9d58c0b622518" exitCode=0 Jan 21 00:12:48 crc kubenswrapper[5030]: I0121 00:12:48.328005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" event={"ID":"85281a47-0cc3-42cd-b349-082ac2d4f10e","Type":"ContainerDied","Data":"eb264058be944df9b4ab4e3d081bd0021dcea6ee40423f309ac9d58c0b622518"} Jan 21 00:12:48 crc kubenswrapper[5030]: I0121 00:12:48.328244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" event={"ID":"85281a47-0cc3-42cd-b349-082ac2d4f10e","Type":"ContainerStarted","Data":"8a6ff4e21d3defb5f7287bc5f087b8ad26f76f5e38c7a3e3501809500c0d6f5d"} Jan 21 00:12:49 crc kubenswrapper[5030]: I0121 00:12:49.341611 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" event={"ID":"85281a47-0cc3-42cd-b349-082ac2d4f10e","Type":"ContainerStarted","Data":"45189cd6b4d5c85a574cc281f266181738365081b919e180c7cb2da4c6ecca46"} Jan 21 00:12:49 crc kubenswrapper[5030]: I0121 00:12:49.342620 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:49 crc kubenswrapper[5030]: I0121 00:12:49.371473 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" podStartSLOduration=3.371450626 podStartE2EDuration="3.371450626s" podCreationTimestamp="2026-01-21 00:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:12:49.365734488 +0000 UTC m=+5841.685994776" watchObservedRunningTime="2026-01-21 00:12:49.371450626 
+0000 UTC m=+5841.691710924" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.140703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-pm6b8"] Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.147866 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-pm6b8"] Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.298122 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-mglm8"] Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.299481 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.302128 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.302297 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.302545 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.303881 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.309571 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mglm8"] Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.335348 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.335410 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbktj\" (UniqueName: \"kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.335455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.437413 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbktj\" (UniqueName: \"kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.437764 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.437875 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.438090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.438444 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.456935 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbktj\" (UniqueName: \"kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj\") pod \"crc-storage-crc-mglm8\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.642462 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:55 crc kubenswrapper[5030]: I0121 00:12:55.972345 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e339d48-7041-45c7-a8fd-8586e0bcb73e" path="/var/lib/kubelet/pods/1e339d48-7041-45c7-a8fd-8586e0bcb73e/volumes" Jan 21 00:12:56 crc kubenswrapper[5030]: I0121 00:12:56.100390 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-mglm8"] Jan 21 00:12:56 crc kubenswrapper[5030]: I0121 00:12:56.401810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mglm8" event={"ID":"07027afb-0244-415d-b9ce-d9aed22cca89","Type":"ContainerStarted","Data":"1a5c314edc115e99e7556d80b9e761d71393449645f5fe6b06fe052afb3de9e1"} Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.151279 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.199530 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.200102 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="dnsmasq-dns" containerID="cri-o://47e606e5ffbe4497598fe8854bd9cb875bbc06e4188a1c689bb803cc466f7af1" gracePeriod=10 Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.418456 5030 generic.go:334] "Generic (PLEG): container finished" podID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerID="47e606e5ffbe4497598fe8854bd9cb875bbc06e4188a1c689bb803cc466f7af1" exitCode=0 Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.418557 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" 
event={"ID":"46a97e5c-2126-4f43-a86c-c040672f56ac","Type":"ContainerDied","Data":"47e606e5ffbe4497598fe8854bd9cb875bbc06e4188a1c689bb803cc466f7af1"} Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.426549 5030 generic.go:334] "Generic (PLEG): container finished" podID="07027afb-0244-415d-b9ce-d9aed22cca89" containerID="c68321cd97426dedda3236234ce414ff9647646f6d40a29e549a00832aba34f9" exitCode=0 Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.426648 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mglm8" event={"ID":"07027afb-0244-415d-b9ce-d9aed22cca89","Type":"ContainerDied","Data":"c68321cd97426dedda3236234ce414ff9647646f6d40a29e549a00832aba34f9"} Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.585919 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.673822 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts\") pod \"46a97e5c-2126-4f43-a86c-c040672f56ac\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.673939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55\") pod \"46a97e5c-2126-4f43-a86c-c040672f56ac\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.674018 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc\") pod \"46a97e5c-2126-4f43-a86c-c040672f56ac\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.674087 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config\") pod \"46a97e5c-2126-4f43-a86c-c040672f56ac\" (UID: \"46a97e5c-2126-4f43-a86c-c040672f56ac\") " Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.681881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55" (OuterVolumeSpecName: "kube-api-access-cjj55") pod "46a97e5c-2126-4f43-a86c-c040672f56ac" (UID: "46a97e5c-2126-4f43-a86c-c040672f56ac"). InnerVolumeSpecName "kube-api-access-cjj55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.711816 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config" (OuterVolumeSpecName: "config") pod "46a97e5c-2126-4f43-a86c-c040672f56ac" (UID: "46a97e5c-2126-4f43-a86c-c040672f56ac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.720263 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts" (OuterVolumeSpecName: "edpm-extramounts") pod "46a97e5c-2126-4f43-a86c-c040672f56ac" (UID: "46a97e5c-2126-4f43-a86c-c040672f56ac"). InnerVolumeSpecName "edpm-extramounts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.720901 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "46a97e5c-2126-4f43-a86c-c040672f56ac" (UID: "46a97e5c-2126-4f43-a86c-c040672f56ac"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.775501 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-extramounts\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-edpm-extramounts\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.775537 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjj55\" (UniqueName: \"kubernetes.io/projected/46a97e5c-2126-4f43-a86c-c040672f56ac-kube-api-access-cjj55\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.775548 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.775558 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46a97e5c-2126-4f43-a86c-c040672f56ac-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:57 crc kubenswrapper[5030]: I0121 00:12:57.968274 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:12:57 crc kubenswrapper[5030]: E0121 00:12:57.968781 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.441077 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.441079 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5" event={"ID":"46a97e5c-2126-4f43-a86c-c040672f56ac","Type":"ContainerDied","Data":"8640dc8cf7e86dbb92326fac1b96aecbabc5b1ba9ed093a36ca4c1bd185b6c7c"} Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.441171 5030 scope.go:117] "RemoveContainer" containerID="47e606e5ffbe4497598fe8854bd9cb875bbc06e4188a1c689bb803cc466f7af1" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.470726 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.473921 5030 scope.go:117] "RemoveContainer" containerID="32fe4f6465612862d4585cafd1d0c175eb313122dc01de192c987c948162b06a" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.477679 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-76cd9645f5-xvnj5"] Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.750278 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.788186 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbktj\" (UniqueName: \"kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj\") pod \"07027afb-0244-415d-b9ce-d9aed22cca89\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.788287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage\") pod \"07027afb-0244-415d-b9ce-d9aed22cca89\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.788323 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt\") pod \"07027afb-0244-415d-b9ce-d9aed22cca89\" (UID: \"07027afb-0244-415d-b9ce-d9aed22cca89\") " Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.788533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "07027afb-0244-415d-b9ce-d9aed22cca89" (UID: "07027afb-0244-415d-b9ce-d9aed22cca89"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.788755 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/07027afb-0244-415d-b9ce-d9aed22cca89-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.794056 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj" (OuterVolumeSpecName: "kube-api-access-mbktj") pod "07027afb-0244-415d-b9ce-d9aed22cca89" (UID: "07027afb-0244-415d-b9ce-d9aed22cca89"). InnerVolumeSpecName "kube-api-access-mbktj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.804963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "07027afb-0244-415d-b9ce-d9aed22cca89" (UID: "07027afb-0244-415d-b9ce-d9aed22cca89"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.890747 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbktj\" (UniqueName: \"kubernetes.io/projected/07027afb-0244-415d-b9ce-d9aed22cca89-kube-api-access-mbktj\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:58 crc kubenswrapper[5030]: I0121 00:12:58.890807 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/07027afb-0244-415d-b9ce-d9aed22cca89-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:12:59 crc kubenswrapper[5030]: I0121 00:12:59.457587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-mglm8" event={"ID":"07027afb-0244-415d-b9ce-d9aed22cca89","Type":"ContainerDied","Data":"1a5c314edc115e99e7556d80b9e761d71393449645f5fe6b06fe052afb3de9e1"} Jan 21 00:12:59 crc kubenswrapper[5030]: I0121 00:12:59.458024 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5c314edc115e99e7556d80b9e761d71393449645f5fe6b06fe052afb3de9e1" Jan 21 00:12:59 crc kubenswrapper[5030]: I0121 00:12:59.457702 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-mglm8" Jan 21 00:12:59 crc kubenswrapper[5030]: I0121 00:12:59.979038 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" path="/var/lib/kubelet/pods/46a97e5c-2126-4f43-a86c-c040672f56ac/volumes" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.352886 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:05 crc kubenswrapper[5030]: E0121 00:13:05.353458 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="dnsmasq-dns" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.353472 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="dnsmasq-dns" Jan 21 00:13:05 crc kubenswrapper[5030]: E0121 00:13:05.353483 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="init" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.353491 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="init" Jan 21 00:13:05 crc kubenswrapper[5030]: E0121 00:13:05.353508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07027afb-0244-415d-b9ce-d9aed22cca89" containerName="storage" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.353514 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="07027afb-0244-415d-b9ce-d9aed22cca89" containerName="storage" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.353693 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a97e5c-2126-4f43-a86c-c040672f56ac" containerName="dnsmasq-dns" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 
00:13:05.353716 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="07027afb-0244-415d-b9ce-d9aed22cca89" containerName="storage" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.354695 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.369507 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.385876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.386311 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.386497 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5d5\" (UniqueName: \"kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.487438 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.487486 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5d5\" (UniqueName: \"kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.487559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.488220 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.488229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities\") pod 
\"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.522774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5d5\" (UniqueName: \"kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5\") pod \"redhat-operators-cfg94\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:05 crc kubenswrapper[5030]: I0121 00:13:05.683178 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:06 crc kubenswrapper[5030]: I0121 00:13:06.121085 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:06 crc kubenswrapper[5030]: W0121 00:13:06.127971 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9080363_ed5a_4be9_bc15_448c996d4628.slice/crio-b08280388da89ba2e5f79bb598246c595d6d6d0578de398b21308383a833ad86 WatchSource:0}: Error finding container b08280388da89ba2e5f79bb598246c595d6d6d0578de398b21308383a833ad86: Status 404 returned error can't find the container with id b08280388da89ba2e5f79bb598246c595d6d6d0578de398b21308383a833ad86 Jan 21 00:13:06 crc kubenswrapper[5030]: I0121 00:13:06.526413 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerStarted","Data":"b08280388da89ba2e5f79bb598246c595d6d6d0578de398b21308383a833ad86"} Jan 21 00:13:07 crc kubenswrapper[5030]: I0121 00:13:07.534223 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9080363-ed5a-4be9-bc15-448c996d4628" containerID="2161082fec78f3f7053ac20b54b45e2eae7c90480f5e7fee64413140474a886a" exitCode=0 Jan 21 00:13:07 crc kubenswrapper[5030]: I0121 00:13:07.534315 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerDied","Data":"2161082fec78f3f7053ac20b54b45e2eae7c90480f5e7fee64413140474a886a"} Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.483713 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-mglm8"] Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.489719 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-mglm8"] Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.618065 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tldd2"] Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.619939 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.621915 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.622151 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.622147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.622693 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.632500 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tldd2"] Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.743491 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.743578 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qq2\" (UniqueName: \"kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.743613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.845287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.845371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qq2\" (UniqueName: \"kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.845411 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.845591 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " 
pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.846318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.878709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qq2\" (UniqueName: \"kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2\") pod \"crc-storage-crc-tldd2\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:08 crc kubenswrapper[5030]: I0121 00:13:08.946354 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:09 crc kubenswrapper[5030]: I0121 00:13:09.413613 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tldd2"] Jan 21 00:13:09 crc kubenswrapper[5030]: I0121 00:13:09.549425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tldd2" event={"ID":"24bd3397-0c91-43ec-9af2-50dd9b14e067","Type":"ContainerStarted","Data":"d0d838932dfc1f8ef54a928cdf3234f687e717393fbafc3faf717c57d23944ee"} Jan 21 00:13:09 crc kubenswrapper[5030]: I0121 00:13:09.971070 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07027afb-0244-415d-b9ce-d9aed22cca89" path="/var/lib/kubelet/pods/07027afb-0244-415d-b9ce-d9aed22cca89/volumes" Jan 21 00:13:10 crc kubenswrapper[5030]: I0121 00:13:10.563250 5030 generic.go:334] "Generic (PLEG): container finished" podID="24bd3397-0c91-43ec-9af2-50dd9b14e067" containerID="dae39f7e3837254aa58a115b0e9f14ef10e199f7f4924deb8c74aa6723b1bb05" exitCode=0 Jan 21 00:13:10 crc kubenswrapper[5030]: I0121 00:13:10.563386 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tldd2" event={"ID":"24bd3397-0c91-43ec-9af2-50dd9b14e067","Type":"ContainerDied","Data":"dae39f7e3837254aa58a115b0e9f14ef10e199f7f4924deb8c74aa6723b1bb05"} Jan 21 00:13:10 crc kubenswrapper[5030]: I0121 00:13:10.568144 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9080363-ed5a-4be9-bc15-448c996d4628" containerID="3faf6e747bd670b9fea7bee9ebf88d2c09994d641ecfb67fd254da130418cc0e" exitCode=0 Jan 21 00:13:10 crc kubenswrapper[5030]: I0121 00:13:10.568208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerDied","Data":"3faf6e747bd670b9fea7bee9ebf88d2c09994d641ecfb67fd254da130418cc0e"} Jan 21 00:13:10 crc kubenswrapper[5030]: I0121 00:13:10.962243 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:13:10 crc kubenswrapper[5030]: E0121 00:13:10.962939 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 
00:13:11.580443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerStarted","Data":"d2be95238ab3f5c07a81fc4c0a1fea0e1d16ce6ac890812c99e209ee1e3e9e29"} Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.599888 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cfg94" podStartSLOduration=3.108711671 podStartE2EDuration="6.599871881s" podCreationTimestamp="2026-01-21 00:13:05 +0000 UTC" firstStartedPulling="2026-01-21 00:13:07.536015895 +0000 UTC m=+5859.856276183" lastFinishedPulling="2026-01-21 00:13:11.027176085 +0000 UTC m=+5863.347436393" observedRunningTime="2026-01-21 00:13:11.599263157 +0000 UTC m=+5863.919523455" watchObservedRunningTime="2026-01-21 00:13:11.599871881 +0000 UTC m=+5863.920132169" Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.927265 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.990947 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qq2\" (UniqueName: \"kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2\") pod \"24bd3397-0c91-43ec-9af2-50dd9b14e067\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.991032 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage\") pod \"24bd3397-0c91-43ec-9af2-50dd9b14e067\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.991190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt\") pod \"24bd3397-0c91-43ec-9af2-50dd9b14e067\" (UID: \"24bd3397-0c91-43ec-9af2-50dd9b14e067\") " Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.991306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "24bd3397-0c91-43ec-9af2-50dd9b14e067" (UID: "24bd3397-0c91-43ec-9af2-50dd9b14e067"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.991534 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/24bd3397-0c91-43ec-9af2-50dd9b14e067-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:11 crc kubenswrapper[5030]: I0121 00:13:11.996455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2" (OuterVolumeSpecName: "kube-api-access-t8qq2") pod "24bd3397-0c91-43ec-9af2-50dd9b14e067" (UID: "24bd3397-0c91-43ec-9af2-50dd9b14e067"). InnerVolumeSpecName "kube-api-access-t8qq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.010820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "24bd3397-0c91-43ec-9af2-50dd9b14e067" (UID: "24bd3397-0c91-43ec-9af2-50dd9b14e067"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.093161 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qq2\" (UniqueName: \"kubernetes.io/projected/24bd3397-0c91-43ec-9af2-50dd9b14e067-kube-api-access-t8qq2\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.093222 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/24bd3397-0c91-43ec-9af2-50dd9b14e067-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.588612 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tldd2" Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.588718 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tldd2" event={"ID":"24bd3397-0c91-43ec-9af2-50dd9b14e067","Type":"ContainerDied","Data":"d0d838932dfc1f8ef54a928cdf3234f687e717393fbafc3faf717c57d23944ee"} Jan 21 00:13:12 crc kubenswrapper[5030]: I0121 00:13:12.588748 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d838932dfc1f8ef54a928cdf3234f687e717393fbafc3faf717c57d23944ee" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.299272 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n"] Jan 21 00:13:15 crc kubenswrapper[5030]: E0121 00:13:15.299820 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24bd3397-0c91-43ec-9af2-50dd9b14e067" containerName="storage" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.299832 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="24bd3397-0c91-43ec-9af2-50dd9b14e067" containerName="storage" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.299977 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="24bd3397-0c91-43ec-9af2-50dd9b14e067" containerName="storage" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.300750 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.303113 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-edpm-multinode" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.326361 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n"] Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.342719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9rz\" (UniqueName: \"kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.342779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.342810 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.342870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.414549 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n"] Jan 21 00:13:15 crc kubenswrapper[5030]: E0121 00:13:15.415024 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dnsmasq-svc kube-api-access-8x9rz openstack-edpm-multinode], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" podUID="d555e402-eec6-4804-b441-d130e5e0cfbe" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.436459 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.437741 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.443655 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9rz\" (UniqueName: \"kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.443712 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.443755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.443784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.444653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.444759 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.444938 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.460958 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.476431 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9rz\" (UniqueName: \"kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz\") pod \"dnsmasq-dnsmasq-58854494b5-wbc4n\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc 
kubenswrapper[5030]: I0121 00:13:15.545636 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.545949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk62t\" (UniqueName: \"kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.546022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.546047 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.621777 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.632338 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.646904 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.646958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.647020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.647043 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk62t\" (UniqueName: \"kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.647810 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.647971 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.648109 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.667217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk62t\" (UniqueName: \"kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t\") pod \"dnsmasq-dnsmasq-59887957c5-xpzwt\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.683401 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 
00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.683797 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.747587 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config\") pod \"d555e402-eec6-4804-b441-d130e5e0cfbe\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.747697 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x9rz\" (UniqueName: \"kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz\") pod \"d555e402-eec6-4804-b441-d130e5e0cfbe\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.747748 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc\") pod \"d555e402-eec6-4804-b441-d130e5e0cfbe\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.747789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode\") pod \"d555e402-eec6-4804-b441-d130e5e0cfbe\" (UID: \"d555e402-eec6-4804-b441-d130e5e0cfbe\") " Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.748251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config" (OuterVolumeSpecName: "config") pod "d555e402-eec6-4804-b441-d130e5e0cfbe" (UID: "d555e402-eec6-4804-b441-d130e5e0cfbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.748368 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "d555e402-eec6-4804-b441-d130e5e0cfbe" (UID: "d555e402-eec6-4804-b441-d130e5e0cfbe"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.748455 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d555e402-eec6-4804-b441-d130e5e0cfbe" (UID: "d555e402-eec6-4804-b441-d130e5e0cfbe"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.750986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz" (OuterVolumeSpecName: "kube-api-access-8x9rz") pod "d555e402-eec6-4804-b441-d130e5e0cfbe" (UID: "d555e402-eec6-4804-b441-d130e5e0cfbe"). InnerVolumeSpecName "kube-api-access-8x9rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.752473 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.850461 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.850893 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x9rz\" (UniqueName: \"kubernetes.io/projected/d555e402-eec6-4804-b441-d130e5e0cfbe-kube-api-access-8x9rz\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.850904 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:15 crc kubenswrapper[5030]: I0121 00:13:15.850915 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/d555e402-eec6-4804-b441-d130e5e0cfbe-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.266007 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.634507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" event={"ID":"516053ba-95fc-459e-b835-bbbc1092b9ab","Type":"ContainerStarted","Data":"00423861b20cb7f0e04fcd9c4f08ac85ecec30c3daa22102f7ad32613547402b"} Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.634673 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n" Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.679448 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n"] Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.688745 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-58854494b5-wbc4n"] Jan 21 00:13:16 crc kubenswrapper[5030]: I0121 00:13:16.737710 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cfg94" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="registry-server" probeResult="failure" output=< Jan 21 00:13:16 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 21 00:13:16 crc kubenswrapper[5030]: > Jan 21 00:13:17 crc kubenswrapper[5030]: I0121 00:13:17.976657 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d555e402-eec6-4804-b441-d130e5e0cfbe" path="/var/lib/kubelet/pods/d555e402-eec6-4804-b441-d130e5e0cfbe/volumes" Jan 21 00:13:20 crc kubenswrapper[5030]: I0121 00:13:20.677165 5030 generic.go:334] "Generic (PLEG): container finished" podID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerID="e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e" exitCode=0 Jan 21 00:13:20 crc kubenswrapper[5030]: I0121 00:13:20.677235 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" event={"ID":"516053ba-95fc-459e-b835-bbbc1092b9ab","Type":"ContainerDied","Data":"e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e"} Jan 21 00:13:24 crc kubenswrapper[5030]: I0121 00:13:24.963124 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:13:24 crc kubenswrapper[5030]: E0121 00:13:24.965364 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.724276 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.744534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" event={"ID":"516053ba-95fc-459e-b835-bbbc1092b9ab","Type":"ContainerStarted","Data":"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171"} Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.745453 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.770249 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" podStartSLOduration=10.770229653 podStartE2EDuration="10.770229653s" podCreationTimestamp="2026-01-21 00:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:13:25.765310603 
+0000 UTC m=+5878.085570891" watchObservedRunningTime="2026-01-21 00:13:25.770229653 +0000 UTC m=+5878.090489951" Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.775930 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:25 crc kubenswrapper[5030]: I0121 00:13:25.972074 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:26 crc kubenswrapper[5030]: I0121 00:13:26.756488 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cfg94" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="registry-server" containerID="cri-o://d2be95238ab3f5c07a81fc4c0a1fea0e1d16ce6ac890812c99e209ee1e3e9e29" gracePeriod=2 Jan 21 00:13:27 crc kubenswrapper[5030]: I0121 00:13:27.770699 5030 generic.go:334] "Generic (PLEG): container finished" podID="a9080363-ed5a-4be9-bc15-448c996d4628" containerID="d2be95238ab3f5c07a81fc4c0a1fea0e1d16ce6ac890812c99e209ee1e3e9e29" exitCode=0 Jan 21 00:13:27 crc kubenswrapper[5030]: I0121 00:13:27.771829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerDied","Data":"d2be95238ab3f5c07a81fc4c0a1fea0e1d16ce6ac890812c99e209ee1e3e9e29"} Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.089032 5030 scope.go:117] "RemoveContainer" containerID="1dee82a08442e7115d8fc7132222f50cc9c262689696e05bbe12723f05e13928" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.124362 5030 scope.go:117] "RemoveContainer" containerID="b908271e6c33798d690460914aeb3907afebd8bb8992aed9d312e8b5d796dd7c" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.156169 5030 scope.go:117] "RemoveContainer" containerID="e43f71e8ef63bd9bd54911f9972fed66e390abceaf1aa9bd5d895db3b9c311e2" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.200271 5030 scope.go:117] "RemoveContainer" containerID="68eca8223b3ceb007c67bdf76460ad1845ff7e01fb93b8ecbb8cd84f015dc8b3" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.236020 5030 scope.go:117] "RemoveContainer" containerID="15ef8289575339fe902af26da37aaa611d70617294878ec110682d3c22ffdd21" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.263237 5030 scope.go:117] "RemoveContainer" containerID="f15501a966c5d5350e24d57b2d179b04ae0d6d080867da95227104c4672cc0fc" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.308883 5030 scope.go:117] "RemoveContainer" containerID="bb3f88c095f0b16a51144b9a0e63fbcaf1b4088bb32a7d50f84e6404b9c7f4eb" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.339106 5030 scope.go:117] "RemoveContainer" containerID="54725a2010d44952cce43671c2f4cbbfa72fc077f14151c5f54d9dfd04047849" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.365589 5030 scope.go:117] "RemoveContainer" containerID="62bbe7d608f220881fa325385b732cea042f38f208f2d3bc3e495bf44b1671d8" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.379634 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.402379 5030 scope.go:117] "RemoveContainer" containerID="ba2ebdaadaf5424fb657a643d775069242a84d88f1220fc7d9fb80d24f7e7fc3" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.434260 5030 scope.go:117] "RemoveContainer" containerID="9ad440a17a700ea0b70f5d65e67661366e51e90b7fb29ee2a9e1a2f80faf07b5" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.452517 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities\") pod \"a9080363-ed5a-4be9-bc15-448c996d4628\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.452588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content\") pod \"a9080363-ed5a-4be9-bc15-448c996d4628\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.452665 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq5d5\" (UniqueName: \"kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5\") pod \"a9080363-ed5a-4be9-bc15-448c996d4628\" (UID: \"a9080363-ed5a-4be9-bc15-448c996d4628\") " Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.453653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities" (OuterVolumeSpecName: "utilities") pod "a9080363-ed5a-4be9-bc15-448c996d4628" (UID: "a9080363-ed5a-4be9-bc15-448c996d4628"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.459402 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5" (OuterVolumeSpecName: "kube-api-access-tq5d5") pod "a9080363-ed5a-4be9-bc15-448c996d4628" (UID: "a9080363-ed5a-4be9-bc15-448c996d4628"). InnerVolumeSpecName "kube-api-access-tq5d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.554232 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.554265 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq5d5\" (UniqueName: \"kubernetes.io/projected/a9080363-ed5a-4be9-bc15-448c996d4628-kube-api-access-tq5d5\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.592063 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9080363-ed5a-4be9-bc15-448c996d4628" (UID: "a9080363-ed5a-4be9-bc15-448c996d4628"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.656241 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9080363-ed5a-4be9-bc15-448c996d4628-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.786529 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfg94" event={"ID":"a9080363-ed5a-4be9-bc15-448c996d4628","Type":"ContainerDied","Data":"b08280388da89ba2e5f79bb598246c595d6d6d0578de398b21308383a833ad86"} Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.786594 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfg94" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.786611 5030 scope.go:117] "RemoveContainer" containerID="d2be95238ab3f5c07a81fc4c0a1fea0e1d16ce6ac890812c99e209ee1e3e9e29" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.808099 5030 scope.go:117] "RemoveContainer" containerID="3faf6e747bd670b9fea7bee9ebf88d2c09994d641ecfb67fd254da130418cc0e" Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.831205 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.838223 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cfg94"] Jan 21 00:13:28 crc kubenswrapper[5030]: I0121 00:13:28.838606 5030 scope.go:117] "RemoveContainer" containerID="2161082fec78f3f7053ac20b54b45e2eae7c90480f5e7fee64413140474a886a" Jan 21 00:13:29 crc kubenswrapper[5030]: I0121 00:13:29.971299 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" path="/var/lib/kubelet/pods/a9080363-ed5a-4be9-bc15-448c996d4628/volumes" Jan 21 00:13:30 crc kubenswrapper[5030]: I0121 00:13:30.755549 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:30 crc kubenswrapper[5030]: I0121 00:13:30.807422 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:13:30 crc kubenswrapper[5030]: I0121 00:13:30.807674 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="dnsmasq-dns" containerID="cri-o://45189cd6b4d5c85a574cc281f266181738365081b919e180c7cb2da4c6ecca46" gracePeriod=10 Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.753073 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:31 crc kubenswrapper[5030]: E0121 00:13:31.753556 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="extract-content" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.753578 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="extract-content" Jan 21 00:13:31 crc kubenswrapper[5030]: E0121 00:13:31.753612 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="extract-utilities" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 
00:13:31.753651 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="extract-utilities" Jan 21 00:13:31 crc kubenswrapper[5030]: E0121 00:13:31.753681 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="registry-server" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.753693 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="registry-server" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.753939 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9080363-ed5a-4be9-bc15-448c996d4628" containerName="registry-server" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.755433 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.760718 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.805671 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.805761 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxcgx\" (UniqueName: \"kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.805828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.805880 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.820542 5030 generic.go:334] "Generic (PLEG): container finished" podID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerID="45189cd6b4d5c85a574cc281f266181738365081b919e180c7cb2da4c6ecca46" exitCode=0 Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.820600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" event={"ID":"85281a47-0cc3-42cd-b349-082ac2d4f10e","Type":"ContainerDied","Data":"45189cd6b4d5c85a574cc281f266181738365081b919e180c7cb2da4c6ecca46"} Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.907388 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jxcgx\" (UniqueName: \"kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.907463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.907508 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.907550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.908340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.908456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.908531 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:31 crc kubenswrapper[5030]: I0121 00:13:31.927882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxcgx\" (UniqueName: \"kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx\") pod \"dnsmasq-dnsmasq-5d55f47b6c-v29fs\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.110767 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.149916 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.23:5353: connect: connection refused" Jan 21 00:13:32 crc kubenswrapper[5030]: W0121 00:13:32.511513 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod836e8025_5146_48c8_b4a3_99e6af074715.slice/crio-dc0fb2fbb70a269f36a1970d9a8fbd63bf73e9799fd012f8273450810cc713c9 WatchSource:0}: Error finding container dc0fb2fbb70a269f36a1970d9a8fbd63bf73e9799fd012f8273450810cc713c9: Status 404 returned error can't find the container with id dc0fb2fbb70a269f36a1970d9a8fbd63bf73e9799fd012f8273450810cc713c9 Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.515386 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.646019 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.717383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc\") pod \"85281a47-0cc3-42cd-b349-082ac2d4f10e\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.717437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82f2t\" (UniqueName: \"kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t\") pod \"85281a47-0cc3-42cd-b349-082ac2d4f10e\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.717544 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config\") pod \"85281a47-0cc3-42cd-b349-082ac2d4f10e\" (UID: \"85281a47-0cc3-42cd-b349-082ac2d4f10e\") " Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.721444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t" (OuterVolumeSpecName: "kube-api-access-82f2t") pod "85281a47-0cc3-42cd-b349-082ac2d4f10e" (UID: "85281a47-0cc3-42cd-b349-082ac2d4f10e"). InnerVolumeSpecName "kube-api-access-82f2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.747943 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config" (OuterVolumeSpecName: "config") pod "85281a47-0cc3-42cd-b349-082ac2d4f10e" (UID: "85281a47-0cc3-42cd-b349-082ac2d4f10e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.750473 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "85281a47-0cc3-42cd-b349-082ac2d4f10e" (UID: "85281a47-0cc3-42cd-b349-082ac2d4f10e"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.819746 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.819800 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/85281a47-0cc3-42cd-b349-082ac2d4f10e-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.819827 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82f2t\" (UniqueName: \"kubernetes.io/projected/85281a47-0cc3-42cd-b349-082ac2d4f10e-kube-api-access-82f2t\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.831747 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" event={"ID":"836e8025-5146-48c8-b4a3-99e6af074715","Type":"ContainerStarted","Data":"dc0fb2fbb70a269f36a1970d9a8fbd63bf73e9799fd012f8273450810cc713c9"} Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.833744 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" event={"ID":"85281a47-0cc3-42cd-b349-082ac2d4f10e","Type":"ContainerDied","Data":"8a6ff4e21d3defb5f7287bc5f087b8ad26f76f5e38c7a3e3501809500c0d6f5d"} Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.833848 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.833999 5030 scope.go:117] "RemoveContainer" containerID="45189cd6b4d5c85a574cc281f266181738365081b919e180c7cb2da4c6ecca46" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.854920 5030 scope.go:117] "RemoveContainer" containerID="eb264058be944df9b4ab4e3d081bd0021dcea6ee40423f309ac9d58c0b622518" Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.867430 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:13:32 crc kubenswrapper[5030]: I0121 00:13:32.872591 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-g5mjf"] Jan 21 00:13:33 crc kubenswrapper[5030]: I0121 00:13:33.843757 5030 generic.go:334] "Generic (PLEG): container finished" podID="836e8025-5146-48c8-b4a3-99e6af074715" containerID="0175b1a8e2eee6e7e381a2671ff9616f08211f33815de4d168f7aaee058f6a76" exitCode=0 Jan 21 00:13:33 crc kubenswrapper[5030]: I0121 00:13:33.843808 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" event={"ID":"836e8025-5146-48c8-b4a3-99e6af074715","Type":"ContainerDied","Data":"0175b1a8e2eee6e7e381a2671ff9616f08211f33815de4d168f7aaee058f6a76"} Jan 21 00:13:33 crc kubenswrapper[5030]: I0121 00:13:33.971740 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" path="/var/lib/kubelet/pods/85281a47-0cc3-42cd-b349-082ac2d4f10e/volumes" Jan 21 00:13:34 crc kubenswrapper[5030]: I0121 00:13:34.876944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" event={"ID":"836e8025-5146-48c8-b4a3-99e6af074715","Type":"ContainerStarted","Data":"9ba7f5ea726cd4bb0bfcc4aa4955f1b360cb55cbb43ab08a4ba2f2e01776f052"} Jan 21 00:13:34 crc kubenswrapper[5030]: I0121 00:13:34.877390 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:34 crc kubenswrapper[5030]: I0121 00:13:34.897135 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" podStartSLOduration=3.897114582 podStartE2EDuration="3.897114582s" podCreationTimestamp="2026-01-21 00:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:13:34.893598397 +0000 UTC m=+5887.213858695" watchObservedRunningTime="2026-01-21 00:13:34.897114582 +0000 UTC m=+5887.217374860" Jan 21 00:13:38 crc kubenswrapper[5030]: I0121 00:13:38.961953 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:13:38 crc kubenswrapper[5030]: E0121 00:13:38.962453 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.112839 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.180317 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.180611 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="dnsmasq-dns" containerID="cri-o://bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171" gracePeriod=10 Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.350929 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:13:42 crc kubenswrapper[5030]: E0121 00:13:42.351710 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="init" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.351733 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="init" Jan 21 00:13:42 crc kubenswrapper[5030]: E0121 00:13:42.351746 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="dnsmasq-dns" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.351756 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="dnsmasq-dns" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.351938 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="85281a47-0cc3-42cd-b349-082ac2d4f10e" containerName="dnsmasq-dns" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.352889 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.362237 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.479920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.480027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvrd\" (UniqueName: \"kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.480059 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.480097 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.580984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.581209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.581344 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvrd\" (UniqueName: \"kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.581443 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: 
\"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.582114 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.582385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.582429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.605545 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvrd\" (UniqueName: \"kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd\") pod \"dnsmasq-dnsmasq-964c896d7-wbw5w\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.607927 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.671824 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.783999 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode\") pod \"516053ba-95fc-459e-b835-bbbc1092b9ab\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.784388 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc\") pod \"516053ba-95fc-459e-b835-bbbc1092b9ab\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.784432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk62t\" (UniqueName: \"kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t\") pod \"516053ba-95fc-459e-b835-bbbc1092b9ab\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.784485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config\") pod \"516053ba-95fc-459e-b835-bbbc1092b9ab\" (UID: \"516053ba-95fc-459e-b835-bbbc1092b9ab\") " Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.788252 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t" (OuterVolumeSpecName: "kube-api-access-nk62t") pod "516053ba-95fc-459e-b835-bbbc1092b9ab" (UID: "516053ba-95fc-459e-b835-bbbc1092b9ab"). InnerVolumeSpecName "kube-api-access-nk62t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.820978 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "516053ba-95fc-459e-b835-bbbc1092b9ab" (UID: "516053ba-95fc-459e-b835-bbbc1092b9ab"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.821098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "516053ba-95fc-459e-b835-bbbc1092b9ab" (UID: "516053ba-95fc-459e-b835-bbbc1092b9ab"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.827309 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config" (OuterVolumeSpecName: "config") pod "516053ba-95fc-459e-b835-bbbc1092b9ab" (UID: "516053ba-95fc-459e-b835-bbbc1092b9ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.885796 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.885830 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.885843 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/516053ba-95fc-459e-b835-bbbc1092b9ab-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.885857 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk62t\" (UniqueName: \"kubernetes.io/projected/516053ba-95fc-459e-b835-bbbc1092b9ab-kube-api-access-nk62t\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.944471 5030 generic.go:334] "Generic (PLEG): container finished" podID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerID="bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171" exitCode=0 Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.944511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" event={"ID":"516053ba-95fc-459e-b835-bbbc1092b9ab","Type":"ContainerDied","Data":"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171"} Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.944538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" event={"ID":"516053ba-95fc-459e-b835-bbbc1092b9ab","Type":"ContainerDied","Data":"00423861b20cb7f0e04fcd9c4f08ac85ecec30c3daa22102f7ad32613547402b"} Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.944540 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.944556 5030 scope.go:117] "RemoveContainer" containerID="bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.980596 5030 scope.go:117] "RemoveContainer" containerID="e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e" Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.981872 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:42 crc kubenswrapper[5030]: I0121 00:13:42.995180 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-59887957c5-xpzwt"] Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.004883 5030 scope.go:117] "RemoveContainer" containerID="bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171" Jan 21 00:13:43 crc kubenswrapper[5030]: E0121 00:13:43.005322 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171\": container with ID starting with bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171 not found: ID does not exist" containerID="bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171" Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.005373 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171"} err="failed to get container status \"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171\": rpc error: code = NotFound desc = could not find container \"bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171\": container with ID starting with bcfe097b2c9c19b1fe9d770e2823f64159de149693301be6074a345cad557171 not found: ID does not exist" Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.005406 5030 scope.go:117] "RemoveContainer" containerID="e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e" Jan 21 00:13:43 crc kubenswrapper[5030]: E0121 00:13:43.005789 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e\": container with ID starting with e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e not found: ID does not exist" containerID="e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e" Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.005818 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e"} err="failed to get container status \"e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e\": rpc error: code = NotFound desc = could not find container \"e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e\": container with ID starting with e696f51ad723cdfe5bcd7389d6e7868250642a447e95649bad0e55533535719e not found: ID does not exist" Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.118841 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.958952 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerID="e78e6da9bc064f524c441e151206e5302085cb323b1020f54affc4a2ab222245" exitCode=0 Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.959123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" event={"ID":"0001fa43-24c5-4c19-879b-0a21f3d77490","Type":"ContainerDied","Data":"e78e6da9bc064f524c441e151206e5302085cb323b1020f54affc4a2ab222245"} Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.959219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" event={"ID":"0001fa43-24c5-4c19-879b-0a21f3d77490","Type":"ContainerStarted","Data":"be66565c15ff92d56939e8c0553069181e0360fb1c7e696448ca060fa50526af"} Jan 21 00:13:43 crc kubenswrapper[5030]: I0121 00:13:43.993880 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" path="/var/lib/kubelet/pods/516053ba-95fc-459e-b835-bbbc1092b9ab/volumes" Jan 21 00:13:44 crc kubenswrapper[5030]: I0121 00:13:44.967815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" event={"ID":"0001fa43-24c5-4c19-879b-0a21f3d77490","Type":"ContainerStarted","Data":"744e0c3ef5305df675b4d892b17d4b55530867fa2e4164a327683f34a6b1eba4"} Jan 21 00:13:44 crc kubenswrapper[5030]: I0121 00:13:44.967931 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:51 crc kubenswrapper[5030]: I0121 00:13:51.961867 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:13:51 crc kubenswrapper[5030]: E0121 00:13:51.962522 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:13:52 crc kubenswrapper[5030]: I0121 00:13:52.673851 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:13:52 crc kubenswrapper[5030]: I0121 00:13:52.694460 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" podStartSLOduration=10.694440915 podStartE2EDuration="10.694440915s" podCreationTimestamp="2026-01-21 00:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:13:44.989999165 +0000 UTC m=+5897.310259463" watchObservedRunningTime="2026-01-21 00:13:52.694440915 +0000 UTC m=+5905.014701203" Jan 21 00:13:52 crc kubenswrapper[5030]: I0121 00:13:52.733692 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:52 crc kubenswrapper[5030]: I0121 00:13:52.734151 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="dnsmasq-dns" 
containerID="cri-o://9ba7f5ea726cd4bb0bfcc4aa4955f1b360cb55cbb43ab08a4ba2f2e01776f052" gracePeriod=10 Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.033486 5030 generic.go:334] "Generic (PLEG): container finished" podID="836e8025-5146-48c8-b4a3-99e6af074715" containerID="9ba7f5ea726cd4bb0bfcc4aa4955f1b360cb55cbb43ab08a4ba2f2e01776f052" exitCode=0 Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.033761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" event={"ID":"836e8025-5146-48c8-b4a3-99e6af074715","Type":"ContainerDied","Data":"9ba7f5ea726cd4bb0bfcc4aa4955f1b360cb55cbb43ab08a4ba2f2e01776f052"} Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.109231 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.210980 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:13:53 crc kubenswrapper[5030]: E0121 00:13:53.211309 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211332 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: E0121 00:13:53.211351 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="init" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211357 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="init" Jan 21 00:13:53 crc kubenswrapper[5030]: E0121 00:13:53.211367 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="init" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211373 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="init" Jan 21 00:13:53 crc kubenswrapper[5030]: E0121 00:13:53.211400 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211406 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211528 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="836e8025-5146-48c8-b4a3-99e6af074715" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.211543 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="516053ba-95fc-459e-b835-bbbc1092b9ab" containerName="dnsmasq-dns" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.212296 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.224264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.241599 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxcgx\" (UniqueName: \"kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx\") pod \"836e8025-5146-48c8-b4a3-99e6af074715\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.241680 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc\") pod \"836e8025-5146-48c8-b4a3-99e6af074715\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.241718 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config\") pod \"836e8025-5146-48c8-b4a3-99e6af074715\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.241771 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode\") pod \"836e8025-5146-48c8-b4a3-99e6af074715\" (UID: \"836e8025-5146-48c8-b4a3-99e6af074715\") " Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.251261 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx" (OuterVolumeSpecName: "kube-api-access-jxcgx") pod "836e8025-5146-48c8-b4a3-99e6af074715" (UID: "836e8025-5146-48c8-b4a3-99e6af074715"). InnerVolumeSpecName "kube-api-access-jxcgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.280588 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "836e8025-5146-48c8-b4a3-99e6af074715" (UID: "836e8025-5146-48c8-b4a3-99e6af074715"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.282828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "836e8025-5146-48c8-b4a3-99e6af074715" (UID: "836e8025-5146-48c8-b4a3-99e6af074715"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.286222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config" (OuterVolumeSpecName: "config") pod "836e8025-5146-48c8-b4a3-99e6af074715" (UID: "836e8025-5146-48c8-b4a3-99e6af074715"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344160 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xzpz\" (UniqueName: \"kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344238 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344295 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344315 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344329 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxcgx\" (UniqueName: \"kubernetes.io/projected/836e8025-5146-48c8-b4a3-99e6af074715-kube-api-access-jxcgx\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.344338 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/836e8025-5146-48c8-b4a3-99e6af074715-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.446094 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.446165 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.446232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xzpz\" (UniqueName: \"kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: 
\"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.447091 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.447134 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.469710 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xzpz\" (UniqueName: \"kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz\") pod \"dnsmasq-dnsmasq-84b9f45d47-8npzv\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:53 crc kubenswrapper[5030]: I0121 00:13:53.529240 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.013593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.047818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" event={"ID":"836e8025-5146-48c8-b4a3-99e6af074715","Type":"ContainerDied","Data":"dc0fb2fbb70a269f36a1970d9a8fbd63bf73e9799fd012f8273450810cc713c9"} Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.047837 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs" Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.047882 5030 scope.go:117] "RemoveContainer" containerID="9ba7f5ea726cd4bb0bfcc4aa4955f1b360cb55cbb43ab08a4ba2f2e01776f052" Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.049012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" event={"ID":"ffac9277-dcd9-4837-8aab-a71e98801227","Type":"ContainerStarted","Data":"e35f15b7de86fbc6fad40251caacd2acbb82e0c36c94d288bba98ef504b06cac"} Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.135822 5030 scope.go:117] "RemoveContainer" containerID="0175b1a8e2eee6e7e381a2671ff9616f08211f33815de4d168f7aaee058f6a76" Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.163643 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:54 crc kubenswrapper[5030]: I0121 00:13:54.172160 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-5d55f47b6c-v29fs"] Jan 21 00:13:55 crc kubenswrapper[5030]: I0121 00:13:55.059859 5030 generic.go:334] "Generic (PLEG): container finished" podID="ffac9277-dcd9-4837-8aab-a71e98801227" containerID="461a138132bb072f17f3e33b59cdc6dd32351633d5b970c30c1f5f6e7c837aa8" exitCode=0 Jan 21 00:13:55 crc kubenswrapper[5030]: I0121 00:13:55.059970 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" event={"ID":"ffac9277-dcd9-4837-8aab-a71e98801227","Type":"ContainerDied","Data":"461a138132bb072f17f3e33b59cdc6dd32351633d5b970c30c1f5f6e7c837aa8"} Jan 21 00:13:55 crc kubenswrapper[5030]: I0121 00:13:55.971556 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836e8025-5146-48c8-b4a3-99e6af074715" path="/var/lib/kubelet/pods/836e8025-5146-48c8-b4a3-99e6af074715/volumes" Jan 21 00:13:56 crc kubenswrapper[5030]: I0121 00:13:56.069955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" event={"ID":"ffac9277-dcd9-4837-8aab-a71e98801227","Type":"ContainerStarted","Data":"f11afcbd7acb15bf56d3b55de497c22052975f0f4e1139ac5e5f6b6e965b4c55"} Jan 21 00:13:56 crc kubenswrapper[5030]: I0121 00:13:56.070099 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:13:56 crc kubenswrapper[5030]: I0121 00:13:56.094820 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" podStartSLOduration=3.094802854 podStartE2EDuration="3.094802854s" podCreationTimestamp="2026-01-21 00:13:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:13:56.08804686 +0000 UTC m=+5908.408307168" watchObservedRunningTime="2026-01-21 00:13:56.094802854 +0000 UTC m=+5908.415063132" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.779594 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tldd2"] Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.784643 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tldd2"] Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.972074 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="24bd3397-0c91-43ec-9af2-50dd9b14e067" path="/var/lib/kubelet/pods/24bd3397-0c91-43ec-9af2-50dd9b14e067/volumes" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.972708 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jt84z"] Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.973476 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jt84z"] Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.973543 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.975303 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.975504 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.976490 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:14:01 crc kubenswrapper[5030]: I0121 00:14:01.977011 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.074538 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5tr4\" (UniqueName: \"kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.075281 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.075435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.177292 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.177598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.177957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5tr4\" (UniqueName: \"kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4\") pod \"crc-storage-crc-jt84z\" (UID: 
\"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.177906 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.178294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.199141 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5tr4\" (UniqueName: \"kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4\") pod \"crc-storage-crc-jt84z\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.318347 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.753424 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jt84z"] Jan 21 00:14:02 crc kubenswrapper[5030]: I0121 00:14:02.760176 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:14:03 crc kubenswrapper[5030]: I0121 00:14:03.136807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jt84z" event={"ID":"514c6de7-b0d1-405f-9732-1f400a548c4f","Type":"ContainerStarted","Data":"71450730ff212c32094c90deffac2dc1ee9eb679716f39a36e98cea7bb6e8af0"} Jan 21 00:14:03 crc kubenswrapper[5030]: I0121 00:14:03.531606 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:14:03 crc kubenswrapper[5030]: I0121 00:14:03.588312 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:14:03 crc kubenswrapper[5030]: I0121 00:14:03.588579 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="dnsmasq-dns" containerID="cri-o://744e0c3ef5305df675b4d892b17d4b55530867fa2e4164a327683f34a6b1eba4" gracePeriod=10 Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.147980 5030 generic.go:334] "Generic (PLEG): container finished" podID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerID="744e0c3ef5305df675b4d892b17d4b55530867fa2e4164a327683f34a6b1eba4" exitCode=0 Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.148055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" event={"ID":"0001fa43-24c5-4c19-879b-0a21f3d77490","Type":"ContainerDied","Data":"744e0c3ef5305df675b4d892b17d4b55530867fa2e4164a327683f34a6b1eba4"} Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.560392 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.738413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc\") pod \"0001fa43-24c5-4c19-879b-0a21f3d77490\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.738478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config\") pod \"0001fa43-24c5-4c19-879b-0a21f3d77490\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.738678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rvrd\" (UniqueName: \"kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd\") pod \"0001fa43-24c5-4c19-879b-0a21f3d77490\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.738696 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode\") pod \"0001fa43-24c5-4c19-879b-0a21f3d77490\" (UID: \"0001fa43-24c5-4c19-879b-0a21f3d77490\") " Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.743389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd" (OuterVolumeSpecName: "kube-api-access-6rvrd") pod "0001fa43-24c5-4c19-879b-0a21f3d77490" (UID: "0001fa43-24c5-4c19-879b-0a21f3d77490"). InnerVolumeSpecName "kube-api-access-6rvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.778823 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "0001fa43-24c5-4c19-879b-0a21f3d77490" (UID: "0001fa43-24c5-4c19-879b-0a21f3d77490"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.788370 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode" (OuterVolumeSpecName: "openstack-edpm-multinode") pod "0001fa43-24c5-4c19-879b-0a21f3d77490" (UID: "0001fa43-24c5-4c19-879b-0a21f3d77490"). InnerVolumeSpecName "openstack-edpm-multinode". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.794935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config" (OuterVolumeSpecName: "config") pod "0001fa43-24c5-4c19-879b-0a21f3d77490" (UID: "0001fa43-24c5-4c19-879b-0a21f3d77490"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.841823 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rvrd\" (UniqueName: \"kubernetes.io/projected/0001fa43-24c5-4c19-879b-0a21f3d77490-kube-api-access-6rvrd\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.841861 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-multinode\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-openstack-edpm-multinode\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.841872 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:04 crc kubenswrapper[5030]: I0121 00:14:04.841885 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0001fa43-24c5-4c19-879b-0a21f3d77490-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.163343 5030 generic.go:334] "Generic (PLEG): container finished" podID="514c6de7-b0d1-405f-9732-1f400a548c4f" containerID="c85362660680b33fba94dd833f9713999ae13302dd327217622edfaab4a3d463" exitCode=0 Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.163424 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jt84z" event={"ID":"514c6de7-b0d1-405f-9732-1f400a548c4f","Type":"ContainerDied","Data":"c85362660680b33fba94dd833f9713999ae13302dd327217622edfaab4a3d463"} Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.166470 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" event={"ID":"0001fa43-24c5-4c19-879b-0a21f3d77490","Type":"ContainerDied","Data":"be66565c15ff92d56939e8c0553069181e0360fb1c7e696448ca060fa50526af"} Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.166548 5030 scope.go:117] "RemoveContainer" containerID="744e0c3ef5305df675b4d892b17d4b55530867fa2e4164a327683f34a6b1eba4" Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.166615 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w" Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.205870 5030 scope.go:117] "RemoveContainer" containerID="e78e6da9bc064f524c441e151206e5302085cb323b1020f54affc4a2ab222245" Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.240916 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.249821 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-964c896d7-wbw5w"] Jan 21 00:14:05 crc kubenswrapper[5030]: I0121 00:14:05.969927 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" path="/var/lib/kubelet/pods/0001fa43-24c5-4c19-879b-0a21f3d77490/volumes" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.541292 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.565817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt\") pod \"514c6de7-b0d1-405f-9732-1f400a548c4f\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.565923 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage\") pod \"514c6de7-b0d1-405f-9732-1f400a548c4f\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.565987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5tr4\" (UniqueName: \"kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4\") pod \"514c6de7-b0d1-405f-9732-1f400a548c4f\" (UID: \"514c6de7-b0d1-405f-9732-1f400a548c4f\") " Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.565990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "514c6de7-b0d1-405f-9732-1f400a548c4f" (UID: "514c6de7-b0d1-405f-9732-1f400a548c4f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.566225 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/514c6de7-b0d1-405f-9732-1f400a548c4f-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.582962 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4" (OuterVolumeSpecName: "kube-api-access-q5tr4") pod "514c6de7-b0d1-405f-9732-1f400a548c4f" (UID: "514c6de7-b0d1-405f-9732-1f400a548c4f"). InnerVolumeSpecName "kube-api-access-q5tr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.590899 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "514c6de7-b0d1-405f-9732-1f400a548c4f" (UID: "514c6de7-b0d1-405f-9732-1f400a548c4f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.667142 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/514c6de7-b0d1-405f-9732-1f400a548c4f-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.667190 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5tr4\" (UniqueName: \"kubernetes.io/projected/514c6de7-b0d1-405f-9732-1f400a548c4f-kube-api-access-q5tr4\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:06 crc kubenswrapper[5030]: I0121 00:14:06.963561 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:14:06 crc kubenswrapper[5030]: E0121 00:14:06.963980 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:14:07 crc kubenswrapper[5030]: I0121 00:14:07.186064 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jt84z" event={"ID":"514c6de7-b0d1-405f-9732-1f400a548c4f","Type":"ContainerDied","Data":"71450730ff212c32094c90deffac2dc1ee9eb679716f39a36e98cea7bb6e8af0"} Jan 21 00:14:07 crc kubenswrapper[5030]: I0121 00:14:07.186115 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71450730ff212c32094c90deffac2dc1ee9eb679716f39a36e98cea7bb6e8af0" Jan 21 00:14:07 crc kubenswrapper[5030]: I0121 00:14:07.186143 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jt84z" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.043425 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jt84z"] Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.049071 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jt84z"] Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.160862 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-xcxt5"] Jan 21 00:14:10 crc kubenswrapper[5030]: E0121 00:14:10.161143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="514c6de7-b0d1-405f-9732-1f400a548c4f" containerName="storage" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="514c6de7-b0d1-405f-9732-1f400a548c4f" containerName="storage" Jan 21 00:14:10 crc kubenswrapper[5030]: E0121 00:14:10.161173 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="init" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="init" Jan 21 00:14:10 crc kubenswrapper[5030]: E0121 00:14:10.161188 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="dnsmasq-dns" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161194 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="dnsmasq-dns" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161361 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0001fa43-24c5-4c19-879b-0a21f3d77490" containerName="dnsmasq-dns" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161378 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="514c6de7-b0d1-405f-9732-1f400a548c4f" containerName="storage" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.161855 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.163856 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.164145 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.164163 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.164507 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.174135 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xcxt5"] Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.322053 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.322108 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.322696 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2zb\" (UniqueName: \"kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.424192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2zb\" (UniqueName: \"kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.424317 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.424349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.424805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " 
pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.425263 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.457326 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2zb\" (UniqueName: \"kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb\") pod \"crc-storage-crc-xcxt5\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.486289 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:10 crc kubenswrapper[5030]: I0121 00:14:10.755114 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-xcxt5"] Jan 21 00:14:11 crc kubenswrapper[5030]: I0121 00:14:11.219391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xcxt5" event={"ID":"abf37276-206c-458e-8998-6fe0df5024b7","Type":"ContainerStarted","Data":"73614227afea22a001c59b4b8d115b6a42ca648f57d3ba501cd4893f0c5f4f53"} Jan 21 00:14:11 crc kubenswrapper[5030]: I0121 00:14:11.970264 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="514c6de7-b0d1-405f-9732-1f400a548c4f" path="/var/lib/kubelet/pods/514c6de7-b0d1-405f-9732-1f400a548c4f/volumes" Jan 21 00:14:12 crc kubenswrapper[5030]: I0121 00:14:12.228572 5030 generic.go:334] "Generic (PLEG): container finished" podID="abf37276-206c-458e-8998-6fe0df5024b7" containerID="ab23fc188f9006404bc34c587b59f313b0487a9749aefd311c320d4905613bf2" exitCode=0 Jan 21 00:14:12 crc kubenswrapper[5030]: I0121 00:14:12.228661 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xcxt5" event={"ID":"abf37276-206c-458e-8998-6fe0df5024b7","Type":"ContainerDied","Data":"ab23fc188f9006404bc34c587b59f313b0487a9749aefd311c320d4905613bf2"} Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.494998 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.569480 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr2zb\" (UniqueName: \"kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb\") pod \"abf37276-206c-458e-8998-6fe0df5024b7\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.569539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage\") pod \"abf37276-206c-458e-8998-6fe0df5024b7\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.569566 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt\") pod \"abf37276-206c-458e-8998-6fe0df5024b7\" (UID: \"abf37276-206c-458e-8998-6fe0df5024b7\") " Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.569802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "abf37276-206c-458e-8998-6fe0df5024b7" (UID: "abf37276-206c-458e-8998-6fe0df5024b7"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.574722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb" (OuterVolumeSpecName: "kube-api-access-pr2zb") pod "abf37276-206c-458e-8998-6fe0df5024b7" (UID: "abf37276-206c-458e-8998-6fe0df5024b7"). InnerVolumeSpecName "kube-api-access-pr2zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.587375 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "abf37276-206c-458e-8998-6fe0df5024b7" (UID: "abf37276-206c-458e-8998-6fe0df5024b7"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.671311 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/abf37276-206c-458e-8998-6fe0df5024b7-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.671362 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/abf37276-206c-458e-8998-6fe0df5024b7-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:13 crc kubenswrapper[5030]: I0121 00:14:13.671380 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr2zb\" (UniqueName: \"kubernetes.io/projected/abf37276-206c-458e-8998-6fe0df5024b7-kube-api-access-pr2zb\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:14 crc kubenswrapper[5030]: I0121 00:14:14.246893 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-xcxt5" event={"ID":"abf37276-206c-458e-8998-6fe0df5024b7","Type":"ContainerDied","Data":"73614227afea22a001c59b4b8d115b6a42ca648f57d3ba501cd4893f0c5f4f53"} Jan 21 00:14:14 crc kubenswrapper[5030]: I0121 00:14:14.247191 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73614227afea22a001c59b4b8d115b6a42ca648f57d3ba501cd4893f0c5f4f53" Jan 21 00:14:14 crc kubenswrapper[5030]: I0121 00:14:14.246947 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-xcxt5" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.867836 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:16 crc kubenswrapper[5030]: E0121 00:14:16.868217 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf37276-206c-458e-8998-6fe0df5024b7" containerName="storage" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.868232 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf37276-206c-458e-8998-6fe0df5024b7" containerName="storage" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.868413 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf37276-206c-458e-8998-6fe0df5024b7" containerName="storage" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.869309 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.871192 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.884686 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.934799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.935133 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.935162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpj2j\" (UniqueName: \"kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:16 crc kubenswrapper[5030]: I0121 00:14:16.935202 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.036614 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.036702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.036727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpj2j\" (UniqueName: \"kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.036755 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.037789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.037888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.038092 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.064974 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpj2j\" (UniqueName: \"kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j\") pod \"dnsmasq-dnsmasq-64864b6d57-p8xnf\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.191582 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:17 crc kubenswrapper[5030]: I0121 00:14:17.603382 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:17 crc kubenswrapper[5030]: W0121 00:14:17.613227 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0d97df5_4aa5_474d_83dd_cfcc487db61a.slice/crio-e9e07dda4181d92d76d484c3b761e2a1a80162e077cffbb513b9627b1e4956a7 WatchSource:0}: Error finding container e9e07dda4181d92d76d484c3b761e2a1a80162e077cffbb513b9627b1e4956a7: Status 404 returned error can't find the container with id e9e07dda4181d92d76d484c3b761e2a1a80162e077cffbb513b9627b1e4956a7 Jan 21 00:14:18 crc kubenswrapper[5030]: I0121 00:14:18.291415 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerID="80e1fb526a53af02d36132aca6456e17281c6789641fc525142e7d7be7bc2a22" exitCode=0 Jan 21 00:14:18 crc kubenswrapper[5030]: I0121 00:14:18.291478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" event={"ID":"d0d97df5-4aa5-474d-83dd-cfcc487db61a","Type":"ContainerDied","Data":"80e1fb526a53af02d36132aca6456e17281c6789641fc525142e7d7be7bc2a22"} Jan 21 00:14:18 crc kubenswrapper[5030]: I0121 00:14:18.292348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" event={"ID":"d0d97df5-4aa5-474d-83dd-cfcc487db61a","Type":"ContainerStarted","Data":"e9e07dda4181d92d76d484c3b761e2a1a80162e077cffbb513b9627b1e4956a7"} Jan 21 00:14:18 crc kubenswrapper[5030]: I0121 00:14:18.962588 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:14:18 crc kubenswrapper[5030]: E0121 00:14:18.963128 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:14:19 crc kubenswrapper[5030]: I0121 00:14:19.300566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" event={"ID":"d0d97df5-4aa5-474d-83dd-cfcc487db61a","Type":"ContainerStarted","Data":"ec50c59b6f1b2bbdd07a4051dd936f5f2b49ad9223d411cf98609b488f254d1c"} Jan 21 00:14:19 crc kubenswrapper[5030]: I0121 00:14:19.300766 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:19 crc kubenswrapper[5030]: I0121 00:14:19.326318 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" podStartSLOduration=3.32630084 podStartE2EDuration="3.32630084s" podCreationTimestamp="2026-01-21 00:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:14:19.320978951 +0000 UTC m=+5931.641239239" watchObservedRunningTime="2026-01-21 00:14:19.32630084 +0000 UTC m=+5931.646561128" Jan 21 00:14:27 crc kubenswrapper[5030]: I0121 
00:14:27.193074 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:27 crc kubenswrapper[5030]: I0121 00:14:27.243586 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:14:27 crc kubenswrapper[5030]: I0121 00:14:27.243977 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="dnsmasq-dns" containerID="cri-o://f11afcbd7acb15bf56d3b55de497c22052975f0f4e1139ac5e5f6b6e965b4c55" gracePeriod=10 Jan 21 00:14:27 crc kubenswrapper[5030]: I0121 00:14:27.413391 5030 generic.go:334] "Generic (PLEG): container finished" podID="ffac9277-dcd9-4837-8aab-a71e98801227" containerID="f11afcbd7acb15bf56d3b55de497c22052975f0f4e1139ac5e5f6b6e965b4c55" exitCode=0 Jan 21 00:14:27 crc kubenswrapper[5030]: I0121 00:14:27.413439 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" event={"ID":"ffac9277-dcd9-4837-8aab-a71e98801227","Type":"ContainerDied","Data":"f11afcbd7acb15bf56d3b55de497c22052975f0f4e1139ac5e5f6b6e965b4c55"} Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.143978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.205151 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc\") pod \"ffac9277-dcd9-4837-8aab-a71e98801227\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.205264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xzpz\" (UniqueName: \"kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz\") pod \"ffac9277-dcd9-4837-8aab-a71e98801227\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.205335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config\") pod \"ffac9277-dcd9-4837-8aab-a71e98801227\" (UID: \"ffac9277-dcd9-4837-8aab-a71e98801227\") " Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.212216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz" (OuterVolumeSpecName: "kube-api-access-7xzpz") pod "ffac9277-dcd9-4837-8aab-a71e98801227" (UID: "ffac9277-dcd9-4837-8aab-a71e98801227"). InnerVolumeSpecName "kube-api-access-7xzpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.250937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "ffac9277-dcd9-4837-8aab-a71e98801227" (UID: "ffac9277-dcd9-4837-8aab-a71e98801227"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.255248 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config" (OuterVolumeSpecName: "config") pod "ffac9277-dcd9-4837-8aab-a71e98801227" (UID: "ffac9277-dcd9-4837-8aab-a71e98801227"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.306157 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.306215 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xzpz\" (UniqueName: \"kubernetes.io/projected/ffac9277-dcd9-4837-8aab-a71e98801227-kube-api-access-7xzpz\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.306231 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffac9277-dcd9-4837-8aab-a71e98801227-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.423001 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" event={"ID":"ffac9277-dcd9-4837-8aab-a71e98801227","Type":"ContainerDied","Data":"e35f15b7de86fbc6fad40251caacd2acbb82e0c36c94d288bba98ef504b06cac"} Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.423105 5030 scope.go:117] "RemoveContainer" containerID="f11afcbd7acb15bf56d3b55de497c22052975f0f4e1139ac5e5f6b6e965b4c55" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.423114 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.452909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.454600 5030 scope.go:117] "RemoveContainer" containerID="461a138132bb072f17f3e33b59cdc6dd32351633d5b970c30c1f5f6e7c837aa8" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.460607 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-8npzv"] Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.634549 5030 scope.go:117] "RemoveContainer" containerID="06019265b6eabe0b81be6109c0bffdb530cf5cf5af10923985e45d1e1edf92e2" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.656911 5030 scope.go:117] "RemoveContainer" containerID="b30c2cf2fc3a3e832fdddeb1326de45892385ac7e44ead48ab4e5ccfec463134" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.691685 5030 scope.go:117] "RemoveContainer" containerID="194c3efc28459b64e8648a553459099e72c31634fc55d2467db7d065d5c4cc67" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.744018 5030 scope.go:117] "RemoveContainer" containerID="2f2e657df57e980501620d8a2ad742358ddd970a47135f5766de4a461ddbe991" Jan 21 00:14:28 crc kubenswrapper[5030]: I0121 00:14:28.763438 5030 scope.go:117] "RemoveContainer" containerID="11a82d346b30da8208a3328089fa26f7c1cff67802177ca59adcd25b382bc1b8" Jan 21 00:14:29 crc kubenswrapper[5030]: I0121 00:14:29.972252 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" path="/var/lib/kubelet/pods/ffac9277-dcd9-4837-8aab-a71e98801227/volumes" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.807486 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:31 crc kubenswrapper[5030]: E0121 00:14:31.808079 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="dnsmasq-dns" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.808092 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="dnsmasq-dns" Jan 21 00:14:31 crc kubenswrapper[5030]: E0121 00:14:31.808108 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="init" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.808114 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="init" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.808254 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffac9277-dcd9-4837-8aab-a71e98801227" containerName="dnsmasq-dns" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.808769 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.811312 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.811493 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-2" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.811787 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.811906 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-x9vc6" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.811840 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-1" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.812283 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.812654 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.812999 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"kuttl-service-cm-0" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.818402 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961253 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961334 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: 
\"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961434 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961478 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961526 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92kp4\" (UniqueName: \"kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961556 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961576 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.961600 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:31 crc kubenswrapper[5030]: I0121 00:14:31.962132 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:14:31 crc kubenswrapper[5030]: E0121 00:14:31.962351 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062489 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92kp4\" (UniqueName: \"kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062650 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062688 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062715 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062741 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062823 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062931 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062959 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.062985 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.063813 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.064227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1\") pod 
\"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.064463 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.064608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.064889 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.065038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.065055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.068800 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.068811 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " 
pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.069320 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.080571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92kp4\" (UniqueName: \"kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4\") pod \"kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.126770 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:32 crc kubenswrapper[5030]: I0121 00:14:32.776193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:32 crc kubenswrapper[5030]: W0121 00:14:32.779774 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9283a30a_200f_46b9_988e_a6aa110f5152.slice/crio-57e73c9ea9031e1f4a0fb9169db71c7e1ca44466998b8501b94a801856e45030 WatchSource:0}: Error finding container 57e73c9ea9031e1f4a0fb9169db71c7e1ca44466998b8501b94a801856e45030: Status 404 returned error can't find the container with id 57e73c9ea9031e1f4a0fb9169db71c7e1ca44466998b8501b94a801856e45030 Jan 21 00:14:33 crc kubenswrapper[5030]: I0121 00:14:33.468762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" event={"ID":"9283a30a-200f-46b9-988e-a6aa110f5152","Type":"ContainerStarted","Data":"57e73c9ea9031e1f4a0fb9169db71c7e1ca44466998b8501b94a801856e45030"} Jan 21 00:14:34 crc kubenswrapper[5030]: I0121 00:14:34.478743 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" event={"ID":"9283a30a-200f-46b9-988e-a6aa110f5152","Type":"ContainerStarted","Data":"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900"} Jan 21 00:14:34 crc kubenswrapper[5030]: I0121 00:14:34.502727 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" podStartSLOduration=2.887434133 podStartE2EDuration="3.502707943s" podCreationTimestamp="2026-01-21 00:14:31 +0000 UTC" firstStartedPulling="2026-01-21 00:14:32.782823981 +0000 UTC m=+5945.103084259" lastFinishedPulling="2026-01-21 00:14:33.398097781 +0000 UTC m=+5945.718358069" observedRunningTime="2026-01-21 00:14:34.498986954 +0000 UTC m=+5946.819247252" watchObservedRunningTime="2026-01-21 00:14:34.502707943 +0000 UTC m=+5946.822968241" Jan 21 00:14:35 crc kubenswrapper[5030]: I0121 00:14:35.921663 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:35 crc kubenswrapper[5030]: I0121 00:14:35.993649 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:14:35 crc kubenswrapper[5030]: I0121 00:14:35.994924 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.009477 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.129681 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.129887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt58f\" (UniqueName: \"kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.129930 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.231542 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt58f\" (UniqueName: \"kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.231673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.231714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.232466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc 
kubenswrapper[5030]: I0121 00:14:36.232507 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.252958 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt58f\" (UniqueName: \"kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f\") pod \"dnsmasq-dnsmasq-84b9f45d47-b2r7s\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.334031 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.509730 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" podUID="9283a30a-200f-46b9-988e-a6aa110f5152" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" containerID="cri-o://658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900" gracePeriod=30 Jan 21 00:14:36 crc kubenswrapper[5030]: I0121 00:14:36.581135 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:14:37 crc kubenswrapper[5030]: I0121 00:14:37.523452 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerID="92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c" exitCode=0 Jan 21 00:14:37 crc kubenswrapper[5030]: I0121 00:14:37.523542 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" event={"ID":"a2d97883-4659-457b-ad5d-94cf7d74f62c","Type":"ContainerDied","Data":"92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c"} Jan 21 00:14:37 crc kubenswrapper[5030]: I0121 00:14:37.524916 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" event={"ID":"a2d97883-4659-457b-ad5d-94cf7d74f62c","Type":"ContainerStarted","Data":"6c0d055ccdb1d91a6a8e878bc9c6e06261c5fe1c3916700ae4d00a97c391e0fa"} Jan 21 00:14:38 crc kubenswrapper[5030]: I0121 00:14:38.535286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" event={"ID":"a2d97883-4659-457b-ad5d-94cf7d74f62c","Type":"ContainerStarted","Data":"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720"} Jan 21 00:14:38 crc kubenswrapper[5030]: I0121 00:14:38.535462 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:38 crc kubenswrapper[5030]: I0121 00:14:38.566530 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" podStartSLOduration=3.566512899 podStartE2EDuration="3.566512899s" podCreationTimestamp="2026-01-21 00:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:14:38.562945862 +0000 UTC 
m=+5950.883206160" watchObservedRunningTime="2026-01-21 00:14:38.566512899 +0000 UTC m=+5950.886773187" Jan 21 00:14:43 crc kubenswrapper[5030]: I0121 00:14:43.901455 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-xcxt5"] Jan 21 00:14:43 crc kubenswrapper[5030]: I0121 00:14:43.906474 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-xcxt5"] Jan 21 00:14:43 crc kubenswrapper[5030]: I0121 00:14:43.963734 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:14:43 crc kubenswrapper[5030]: E0121 00:14:43.963964 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:14:43 crc kubenswrapper[5030]: I0121 00:14:43.972210 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abf37276-206c-458e-8998-6fe0df5024b7" path="/var/lib/kubelet/pods/abf37276-206c-458e-8998-6fe0df5024b7/volumes" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.083639 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-m8zfs"] Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.085035 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.087934 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.088210 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.088413 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.093612 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.094111 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-m8zfs"] Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.250341 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.250445 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.250502 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsvxn\" (UniqueName: 
\"kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.349455 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.353549 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.353604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.353665 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsvxn\" (UniqueName: \"kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.353839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.356794 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.392131 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsvxn\" (UniqueName: \"kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn\") pod \"crc-storage-crc-m8zfs\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.422165 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.454880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.454938 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.454963 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455009 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455123 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92kp4\" (UniqueName: \"kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-0-1\" 
(UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.455255 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0\") pod \"9283a30a-200f-46b9-988e-a6aa110f5152\" (UID: \"9283a30a-200f-46b9-988e-a6aa110f5152\") " Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.460648 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle" (OuterVolumeSpecName: "kuttl-service-combined-ca-bundle") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.462944 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4" (OuterVolumeSpecName: "kube-api-access-92kp4") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kube-api-access-92kp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.474880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0" (OuterVolumeSpecName: "kuttl-service-cm-0-0") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-0-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.475149 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0" (OuterVolumeSpecName: "kuttl-service-cm-2-0") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-2-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.476703 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2" (OuterVolumeSpecName: "kuttl-service-cm-0-2") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-0-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.478074 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1" (OuterVolumeSpecName: "kuttl-service-cm-1-1") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-1-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.479036 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory" (OuterVolumeSpecName: "inventory") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.479088 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2" (OuterVolumeSpecName: "kuttl-service-cm-1-2") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-1-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.480265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1" (OuterVolumeSpecName: "kuttl-service-cm-0-1") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-0-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.483758 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.484311 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0" (OuterVolumeSpecName: "kuttl-service-cm-1-0") pod "9283a30a-200f-46b9-988e-a6aa110f5152" (UID: "9283a30a-200f-46b9-988e-a6aa110f5152"). InnerVolumeSpecName "kuttl-service-cm-1-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557025 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557280 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557294 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-2\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-2\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557304 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-2-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-2-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557315 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557325 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557336 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557350 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92kp4\" (UniqueName: \"kubernetes.io/projected/9283a30a-200f-46b9-988e-a6aa110f5152-kube-api-access-92kp4\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557358 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/9283a30a-200f-46b9-988e-a6aa110f5152-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557367 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-0-1\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-0-1\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.557378 5030 reconciler_common.go:293] "Volume detached for volume \"kuttl-service-cm-1-0\" (UniqueName: \"kubernetes.io/configmap/9283a30a-200f-46b9-988e-a6aa110f5152-kuttl-service-cm-1-0\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.587483 5030 generic.go:334] "Generic (PLEG): container finished" podID="9283a30a-200f-46b9-988e-a6aa110f5152" containerID="658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900" exitCode=254 Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.587522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" event={"ID":"9283a30a-200f-46b9-988e-a6aa110f5152","Type":"ContainerDied","Data":"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900"} Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.587547 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" event={"ID":"9283a30a-200f-46b9-988e-a6aa110f5152","Type":"ContainerDied","Data":"57e73c9ea9031e1f4a0fb9169db71c7e1ca44466998b8501b94a801856e45030"} Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.587564 5030 scope.go:117] "RemoveContainer" containerID="658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.587699 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.617996 5030 scope.go:117] "RemoveContainer" containerID="658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900" Jan 21 00:14:44 crc kubenswrapper[5030]: E0121 00:14:44.618448 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900\": container with ID starting with 658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900 not found: ID does not exist" containerID="658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.618500 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900"} err="failed to get container status \"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900\": rpc error: code = NotFound desc = could not find container \"658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900\": container with ID starting with 658a8db2c9d1121544a17eb3708297a372badd347832bc6e065f8ccfd30a4900 not found: ID does not exist" Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.642400 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.649408 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes-5v5ht"] Jan 21 00:14:44 crc kubenswrapper[5030]: I0121 00:14:44.867238 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-m8zfs"] Jan 21 00:14:45 crc kubenswrapper[5030]: I0121 00:14:45.600680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m8zfs" event={"ID":"ae52089a-bc15-4c10-9405-68ae71992e24","Type":"ContainerStarted","Data":"bbe3e2938fd24d5797822b932b1aac3625efdb2df77277b1ec121135fd35a209"} Jan 21 00:14:45 crc kubenswrapper[5030]: I0121 00:14:45.972242 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9283a30a-200f-46b9-988e-a6aa110f5152" path="/var/lib/kubelet/pods/9283a30a-200f-46b9-988e-a6aa110f5152/volumes" Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.336147 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.389482 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.389926 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="dnsmasq-dns" containerID="cri-o://ec50c59b6f1b2bbdd07a4051dd936f5f2b49ad9223d411cf98609b488f254d1c" gracePeriod=10 Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.613967 5030 generic.go:334] "Generic (PLEG): container finished" podID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerID="ec50c59b6f1b2bbdd07a4051dd936f5f2b49ad9223d411cf98609b488f254d1c" exitCode=0 Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.614010 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" event={"ID":"d0d97df5-4aa5-474d-83dd-cfcc487db61a","Type":"ContainerDied","Data":"ec50c59b6f1b2bbdd07a4051dd936f5f2b49ad9223d411cf98609b488f254d1c"} Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.616661 5030 generic.go:334] "Generic (PLEG): container finished" podID="ae52089a-bc15-4c10-9405-68ae71992e24" containerID="6954db4fba719163fd4731ada0e3754c7a3aeea31be25d67085b1222c9ec7c8c" exitCode=0 Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.616783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m8zfs" event={"ID":"ae52089a-bc15-4c10-9405-68ae71992e24","Type":"ContainerDied","Data":"6954db4fba719163fd4731ada0e3754c7a3aeea31be25d67085b1222c9ec7c8c"} Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.931249 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.993021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc\") pod \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.993106 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config\") pod \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.993182 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes\") pod \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.993248 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpj2j\" (UniqueName: \"kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j\") pod \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\" (UID: \"d0d97df5-4aa5-474d-83dd-cfcc487db61a\") " Jan 21 00:14:46 crc kubenswrapper[5030]: I0121 00:14:46.998690 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j" (OuterVolumeSpecName: "kube-api-access-fpj2j") pod "d0d97df5-4aa5-474d-83dd-cfcc487db61a" (UID: "d0d97df5-4aa5-474d-83dd-cfcc487db61a"). InnerVolumeSpecName "kube-api-access-fpj2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.027345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "d0d97df5-4aa5-474d-83dd-cfcc487db61a" (UID: "d0d97df5-4aa5-474d-83dd-cfcc487db61a"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.028489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d0d97df5-4aa5-474d-83dd-cfcc487db61a" (UID: "d0d97df5-4aa5-474d-83dd-cfcc487db61a"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.041750 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config" (OuterVolumeSpecName: "config") pod "d0d97df5-4aa5-474d-83dd-cfcc487db61a" (UID: "d0d97df5-4aa5-474d-83dd-cfcc487db61a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.094730 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.095242 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpj2j\" (UniqueName: \"kubernetes.io/projected/d0d97df5-4aa5-474d-83dd-cfcc487db61a-kube-api-access-fpj2j\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.095255 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.095265 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0d97df5-4aa5-474d-83dd-cfcc487db61a-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.626357 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.626350 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf" event={"ID":"d0d97df5-4aa5-474d-83dd-cfcc487db61a","Type":"ContainerDied","Data":"e9e07dda4181d92d76d484c3b761e2a1a80162e077cffbb513b9627b1e4956a7"} Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.626876 5030 scope.go:117] "RemoveContainer" containerID="ec50c59b6f1b2bbdd07a4051dd936f5f2b49ad9223d411cf98609b488f254d1c" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.662819 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.668891 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-p8xnf"] Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.677856 5030 scope.go:117] "RemoveContainer" containerID="80e1fb526a53af02d36132aca6456e17281c6789641fc525142e7d7be7bc2a22" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.904274 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:47 crc kubenswrapper[5030]: I0121 00:14:47.971171 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" path="/var/lib/kubelet/pods/d0d97df5-4aa5-474d-83dd-cfcc487db61a/volumes" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.009350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt\") pod \"ae52089a-bc15-4c10-9405-68ae71992e24\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.009431 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage\") pod \"ae52089a-bc15-4c10-9405-68ae71992e24\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.010306 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ae52089a-bc15-4c10-9405-68ae71992e24" (UID: "ae52089a-bc15-4c10-9405-68ae71992e24"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.017094 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsvxn\" (UniqueName: \"kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn\") pod \"ae52089a-bc15-4c10-9405-68ae71992e24\" (UID: \"ae52089a-bc15-4c10-9405-68ae71992e24\") " Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.017835 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ae52089a-bc15-4c10-9405-68ae71992e24-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.020805 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn" (OuterVolumeSpecName: "kube-api-access-dsvxn") pod "ae52089a-bc15-4c10-9405-68ae71992e24" (UID: "ae52089a-bc15-4c10-9405-68ae71992e24"). InnerVolumeSpecName "kube-api-access-dsvxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.027410 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ae52089a-bc15-4c10-9405-68ae71992e24" (UID: "ae52089a-bc15-4c10-9405-68ae71992e24"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.119202 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsvxn\" (UniqueName: \"kubernetes.io/projected/ae52089a-bc15-4c10-9405-68ae71992e24-kube-api-access-dsvxn\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.119239 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ae52089a-bc15-4c10-9405-68ae71992e24-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.642015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-m8zfs" event={"ID":"ae52089a-bc15-4c10-9405-68ae71992e24","Type":"ContainerDied","Data":"bbe3e2938fd24d5797822b932b1aac3625efdb2df77277b1ec121135fd35a209"} Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.642077 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe3e2938fd24d5797822b932b1aac3625efdb2df77277b1ec121135fd35a209" Jan 21 00:14:48 crc kubenswrapper[5030]: I0121 00:14:48.642097 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-m8zfs" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.406654 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-m8zfs"] Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.414552 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-m8zfs"] Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.551937 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qzwgf"] Jan 21 00:14:51 crc kubenswrapper[5030]: E0121 00:14:51.552543 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae52089a-bc15-4c10-9405-68ae71992e24" containerName="storage" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.552590 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae52089a-bc15-4c10-9405-68ae71992e24" containerName="storage" Jan 21 00:14:51 crc kubenswrapper[5030]: E0121 00:14:51.552673 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="dnsmasq-dns" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.552696 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="dnsmasq-dns" Jan 21 00:14:51 crc kubenswrapper[5030]: E0121 00:14:51.552770 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="init" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.552789 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="init" Jan 21 00:14:51 crc kubenswrapper[5030]: E0121 00:14:51.552819 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9283a30a-200f-46b9-988e-a6aa110f5152" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.552841 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9283a30a-200f-46b9-988e-a6aa110f5152" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.553261 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ae52089a-bc15-4c10-9405-68ae71992e24" containerName="storage" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.553314 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d97df5-4aa5-474d-83dd-cfcc487db61a" containerName="dnsmasq-dns" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.553337 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9283a30a-200f-46b9-988e-a6aa110f5152" containerName="kuttl-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.554893 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.558149 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.566257 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.566379 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.567675 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.568047 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qzwgf"] Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.670041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmqld\" (UniqueName: \"kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.670167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.670284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.771692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.772003 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.772121 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cmqld\" (UniqueName: \"kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.772366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.772730 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.796472 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmqld\" (UniqueName: \"kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld\") pod \"crc-storage-crc-qzwgf\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.887752 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:51 crc kubenswrapper[5030]: I0121 00:14:51.970953 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae52089a-bc15-4c10-9405-68ae71992e24" path="/var/lib/kubelet/pods/ae52089a-bc15-4c10-9405-68ae71992e24/volumes" Jan 21 00:14:52 crc kubenswrapper[5030]: I0121 00:14:52.378444 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qzwgf"] Jan 21 00:14:52 crc kubenswrapper[5030]: I0121 00:14:52.676240 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qzwgf" event={"ID":"b9f57e7f-7203-4536-bea5-91dc8828c783","Type":"ContainerStarted","Data":"c681a4d8e48952b85fc8b95f0c147753fb9120d8dec3f10bf50a92799c3cd1e6"} Jan 21 00:14:53 crc kubenswrapper[5030]: I0121 00:14:53.703691 5030 generic.go:334] "Generic (PLEG): container finished" podID="b9f57e7f-7203-4536-bea5-91dc8828c783" containerID="86dfd28a16763e6122806fe4b880bd8e7aada9e1039f68fae429d50a23fc23ab" exitCode=0 Jan 21 00:14:53 crc kubenswrapper[5030]: I0121 00:14:53.703938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qzwgf" event={"ID":"b9f57e7f-7203-4536-bea5-91dc8828c783","Type":"ContainerDied","Data":"86dfd28a16763e6122806fe4b880bd8e7aada9e1039f68fae429d50a23fc23ab"} Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.052327 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.126942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt\") pod \"b9f57e7f-7203-4536-bea5-91dc8828c783\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.127319 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmqld\" (UniqueName: \"kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld\") pod \"b9f57e7f-7203-4536-bea5-91dc8828c783\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.127044 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b9f57e7f-7203-4536-bea5-91dc8828c783" (UID: "b9f57e7f-7203-4536-bea5-91dc8828c783"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.127380 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage\") pod \"b9f57e7f-7203-4536-bea5-91dc8828c783\" (UID: \"b9f57e7f-7203-4536-bea5-91dc8828c783\") " Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.127751 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b9f57e7f-7203-4536-bea5-91dc8828c783-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.132711 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld" (OuterVolumeSpecName: "kube-api-access-cmqld") pod "b9f57e7f-7203-4536-bea5-91dc8828c783" (UID: "b9f57e7f-7203-4536-bea5-91dc8828c783"). InnerVolumeSpecName "kube-api-access-cmqld". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.147117 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b9f57e7f-7203-4536-bea5-91dc8828c783" (UID: "b9f57e7f-7203-4536-bea5-91dc8828c783"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.229165 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmqld\" (UniqueName: \"kubernetes.io/projected/b9f57e7f-7203-4536-bea5-91dc8828c783-kube-api-access-cmqld\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.229218 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b9f57e7f-7203-4536-bea5-91dc8828c783-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.731917 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qzwgf" event={"ID":"b9f57e7f-7203-4536-bea5-91dc8828c783","Type":"ContainerDied","Data":"c681a4d8e48952b85fc8b95f0c147753fb9120d8dec3f10bf50a92799c3cd1e6"} Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.731992 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c681a4d8e48952b85fc8b95f0c147753fb9120d8dec3f10bf50a92799c3cd1e6" Jan 21 00:14:55 crc kubenswrapper[5030]: I0121 00:14:55.731961 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qzwgf" Jan 21 00:14:57 crc kubenswrapper[5030]: I0121 00:14:57.972586 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:14:57 crc kubenswrapper[5030]: E0121 00:14:57.973319 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.380394 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:14:58 crc kubenswrapper[5030]: E0121 00:14:58.380837 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f57e7f-7203-4536-bea5-91dc8828c783" containerName="storage" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.380854 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f57e7f-7203-4536-bea5-91dc8828c783" containerName="storage" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.381027 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f57e7f-7203-4536-bea5-91dc8828c783" containerName="storage" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.381905 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.384507 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-no-nodes-custom-svc" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.402924 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.478336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.478417 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.478472 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.478743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v75\" (UniqueName: \"kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.579578 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.579682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.579738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v75\" (UniqueName: \"kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.579768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.580921 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.580950 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.581399 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.599861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v75\" (UniqueName: \"kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75\") pod \"dnsmasq-dnsmasq-6fdb84cf7c-c5gj6\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:58 crc kubenswrapper[5030]: I0121 00:14:58.734911 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:14:59 crc kubenswrapper[5030]: W0121 00:14:59.178042 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16d2c5a6_d548_406d_9b99_7bcfd82a333c.slice/crio-c7ee1b35ece3894d3a8a4fb97462e4941555306f3c55d1507c6a1edd6054abd1 WatchSource:0}: Error finding container c7ee1b35ece3894d3a8a4fb97462e4941555306f3c55d1507c6a1edd6054abd1: Status 404 returned error can't find the container with id c7ee1b35ece3894d3a8a4fb97462e4941555306f3c55d1507c6a1edd6054abd1 Jan 21 00:14:59 crc kubenswrapper[5030]: I0121 00:14:59.178971 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:14:59 crc kubenswrapper[5030]: I0121 00:14:59.763379 5030 generic.go:334] "Generic (PLEG): container finished" podID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerID="1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1" exitCode=0 Jan 21 00:14:59 crc kubenswrapper[5030]: I0121 00:14:59.763489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" event={"ID":"16d2c5a6-d548-406d-9b99-7bcfd82a333c","Type":"ContainerDied","Data":"1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1"} Jan 21 00:14:59 crc kubenswrapper[5030]: I0121 00:14:59.763725 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" event={"ID":"16d2c5a6-d548-406d-9b99-7bcfd82a333c","Type":"ContainerStarted","Data":"c7ee1b35ece3894d3a8a4fb97462e4941555306f3c55d1507c6a1edd6054abd1"} Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.134852 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl"] Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.135948 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.138202 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.138516 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.147283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl"] Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.262479 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.262560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkg5j\" (UniqueName: \"kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.262596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.363759 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkg5j\" (UniqueName: \"kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.363812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.364826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.366101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume\") pod 
\"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.368960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.381647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkg5j\" (UniqueName: \"kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j\") pod \"collect-profiles-29482575-lmgpl\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.455345 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.775040 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" event={"ID":"16d2c5a6-d548-406d-9b99-7bcfd82a333c","Type":"ContainerStarted","Data":"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e"} Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.775373 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:15:00 crc kubenswrapper[5030]: I0121 00:15:00.800284 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" podStartSLOduration=2.8002672520000003 podStartE2EDuration="2.800267252s" podCreationTimestamp="2026-01-21 00:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:15:00.793062006 +0000 UTC m=+5973.113322294" watchObservedRunningTime="2026-01-21 00:15:00.800267252 +0000 UTC m=+5973.120527540" Jan 21 00:15:01 crc kubenswrapper[5030]: I0121 00:15:01.105385 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl"] Jan 21 00:15:01 crc kubenswrapper[5030]: W0121 00:15:01.117932 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c6a312_8157_4460_9800_7f2df0edc7fa.slice/crio-d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f WatchSource:0}: Error finding container d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f: Status 404 returned error can't find the container with id d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f Jan 21 00:15:01 crc kubenswrapper[5030]: I0121 00:15:01.784426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" event={"ID":"b7c6a312-8157-4460-9800-7f2df0edc7fa","Type":"ContainerStarted","Data":"7e2729c7f2dedcba7ffe2a612f970854d6a0b3bacf69942ee1e4c7982d3371fc"} Jan 21 00:15:01 crc kubenswrapper[5030]: I0121 00:15:01.784788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" event={"ID":"b7c6a312-8157-4460-9800-7f2df0edc7fa","Type":"ContainerStarted","Data":"d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f"} Jan 21 00:15:02 crc kubenswrapper[5030]: I0121 00:15:02.795568 5030 generic.go:334] "Generic (PLEG): container finished" podID="b7c6a312-8157-4460-9800-7f2df0edc7fa" containerID="7e2729c7f2dedcba7ffe2a612f970854d6a0b3bacf69942ee1e4c7982d3371fc" exitCode=0 Jan 21 00:15:02 crc kubenswrapper[5030]: I0121 00:15:02.795676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" event={"ID":"b7c6a312-8157-4460-9800-7f2df0edc7fa","Type":"ContainerDied","Data":"7e2729c7f2dedcba7ffe2a612f970854d6a0b3bacf69942ee1e4c7982d3371fc"} Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.084116 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.118434 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume\") pod \"b7c6a312-8157-4460-9800-7f2df0edc7fa\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.118521 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkg5j\" (UniqueName: \"kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j\") pod \"b7c6a312-8157-4460-9800-7f2df0edc7fa\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.118560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume\") pod \"b7c6a312-8157-4460-9800-7f2df0edc7fa\" (UID: \"b7c6a312-8157-4460-9800-7f2df0edc7fa\") " Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.119442 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "b7c6a312-8157-4460-9800-7f2df0edc7fa" (UID: "b7c6a312-8157-4460-9800-7f2df0edc7fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.126922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b7c6a312-8157-4460-9800-7f2df0edc7fa" (UID: "b7c6a312-8157-4460-9800-7f2df0edc7fa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.127000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j" (OuterVolumeSpecName: "kube-api-access-tkg5j") pod "b7c6a312-8157-4460-9800-7f2df0edc7fa" (UID: "b7c6a312-8157-4460-9800-7f2df0edc7fa"). InnerVolumeSpecName "kube-api-access-tkg5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.220560 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7c6a312-8157-4460-9800-7f2df0edc7fa-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.220601 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7c6a312-8157-4460-9800-7f2df0edc7fa-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.220615 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkg5j\" (UniqueName: \"kubernetes.io/projected/b7c6a312-8157-4460-9800-7f2df0edc7fa-kube-api-access-tkg5j\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.815961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" event={"ID":"b7c6a312-8157-4460-9800-7f2df0edc7fa","Type":"ContainerDied","Data":"d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f"} Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.816002 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58d6506e717b2086e366db4058cdef0cde27e9fcb9c6d4face394a7cdcdb56f" Jan 21 00:15:04 crc kubenswrapper[5030]: I0121 00:15:04.816049 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl" Jan 21 00:15:05 crc kubenswrapper[5030]: I0121 00:15:05.165652 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm"] Jan 21 00:15:05 crc kubenswrapper[5030]: I0121 00:15:05.174546 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482530-pd5cm"] Jan 21 00:15:05 crc kubenswrapper[5030]: I0121 00:15:05.971114 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a201bafe-f06d-4f32-a740-c15f9ea69d8d" path="/var/lib/kubelet/pods/a201bafe-f06d-4f32-a740-c15f9ea69d8d/volumes" Jan 21 00:15:08 crc kubenswrapper[5030]: I0121 00:15:08.736830 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:15:08 crc kubenswrapper[5030]: I0121 00:15:08.800437 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:15:08 crc kubenswrapper[5030]: I0121 00:15:08.801003 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="dnsmasq-dns" containerID="cri-o://0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720" gracePeriod=10 Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.214516 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.398273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc\") pod \"a2d97883-4659-457b-ad5d-94cf7d74f62c\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.398423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config\") pod \"a2d97883-4659-457b-ad5d-94cf7d74f62c\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.398479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt58f\" (UniqueName: \"kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f\") pod \"a2d97883-4659-457b-ad5d-94cf7d74f62c\" (UID: \"a2d97883-4659-457b-ad5d-94cf7d74f62c\") " Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.403894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f" (OuterVolumeSpecName: "kube-api-access-nt58f") pod "a2d97883-4659-457b-ad5d-94cf7d74f62c" (UID: "a2d97883-4659-457b-ad5d-94cf7d74f62c"). InnerVolumeSpecName "kube-api-access-nt58f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.431729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config" (OuterVolumeSpecName: "config") pod "a2d97883-4659-457b-ad5d-94cf7d74f62c" (UID: "a2d97883-4659-457b-ad5d-94cf7d74f62c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.431959 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "a2d97883-4659-457b-ad5d-94cf7d74f62c" (UID: "a2d97883-4659-457b-ad5d-94cf7d74f62c"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.500037 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.500077 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt58f\" (UniqueName: \"kubernetes.io/projected/a2d97883-4659-457b-ad5d-94cf7d74f62c-kube-api-access-nt58f\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.500088 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/a2d97883-4659-457b-ad5d-94cf7d74f62c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.857232 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerID="0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720" exitCode=0 Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.857282 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" event={"ID":"a2d97883-4659-457b-ad5d-94cf7d74f62c","Type":"ContainerDied","Data":"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720"} Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.857312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" event={"ID":"a2d97883-4659-457b-ad5d-94cf7d74f62c","Type":"ContainerDied","Data":"6c0d055ccdb1d91a6a8e878bc9c6e06261c5fe1c3916700ae4d00a97c391e0fa"} Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.857330 5030 scope.go:117] "RemoveContainer" containerID="0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.857289 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.886266 5030 scope.go:117] "RemoveContainer" containerID="92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.889282 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.895501 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-b2r7s"] Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.913658 5030 scope.go:117] "RemoveContainer" containerID="0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720" Jan 21 00:15:09 crc kubenswrapper[5030]: E0121 00:15:09.914037 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720\": container with ID starting with 0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720 not found: ID does not exist" containerID="0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.914080 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720"} err="failed to get container status \"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720\": rpc error: code = NotFound desc = could not find container \"0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720\": container with ID starting with 0c421a9b50a99f6e78e376ada5955b4f3f3a4dfff91cce82062474041dc10720 not found: ID does not exist" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.914105 5030 scope.go:117] "RemoveContainer" containerID="92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c" Jan 21 00:15:09 crc kubenswrapper[5030]: E0121 00:15:09.914427 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c\": container with ID starting with 92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c not found: ID does not exist" containerID="92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.914455 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c"} err="failed to get container status \"92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c\": rpc error: code = NotFound desc = could not find container \"92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c\": container with ID starting with 92a7a88d4d4b90acecfe03d4abe956cfdcda1431adc23ee57939ee2e432e9f5c not found: ID does not exist" Jan 21 00:15:09 crc kubenswrapper[5030]: I0121 00:15:09.973459 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" path="/var/lib/kubelet/pods/a2d97883-4659-457b-ad5d-94cf7d74f62c/volumes" Jan 21 00:15:11 crc kubenswrapper[5030]: I0121 00:15:11.962239 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:15:11 crc 
kubenswrapper[5030]: E0121 00:15:11.962475 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.337123 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw"] Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.339920 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="dnsmasq-dns" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.339957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="dnsmasq-dns" Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.339969 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c6a312-8157-4460-9800-7f2df0edc7fa" containerName="collect-profiles" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.339978 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c6a312-8157-4460-9800-7f2df0edc7fa" containerName="collect-profiles" Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.339990 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="init" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.339997 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="init" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.340201 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c6a312-8157-4460-9800-7f2df0edc7fa" containerName="collect-profiles" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.340224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d97883-4659-457b-ad5d-94cf7d74f62c" containerName="dnsmasq-dns" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.340773 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.343046 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.343097 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-no-nodes-custom-svc-dockercfg-mqm7g" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.343355 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.343456 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-no-nodes-custom-svc" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.345137 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.357716 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw"] Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.454171 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8k5j\" (UniqueName: \"kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.454247 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.454286 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.454357 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.487367 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw"] Jan 21 00:15:13 crc 
kubenswrapper[5030]: E0121 00:15:13.488097 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[custom-img-svc-combined-ca-bundle inventory kube-api-access-s8k5j ssh-key-edpm-no-nodes-custom-svc], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" podUID="5a143e25-1454-4e45-8fec-80c44da2f6fa" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.555337 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.556083 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8k5j\" (UniqueName: \"kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.556152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.556177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.556224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.556514 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.557389 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.557438 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory podName:5a143e25-1454-4e45-8fec-80c44da2f6fa nodeName:}" failed. No retries permitted until 2026-01-21 00:15:14.057423555 +0000 UTC m=+5986.377683843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa") : secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.570239 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s8k5j for pod openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw: failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 00:15:13 crc kubenswrapper[5030]: E0121 00:15:13.570308 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j podName:5a143e25-1454-4e45-8fec-80c44da2f6fa nodeName:}" failed. No retries permitted until 2026-01-21 00:15:14.070289437 +0000 UTC m=+5986.390549725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s8k5j" (UniqueName: "kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa") : failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.570323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.577386 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.608199 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.658093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.658229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnbm4\" (UniqueName: \"kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.658390 5030 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.759944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.760046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnbm4\" (UniqueName: \"kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.760102 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.760919 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.760930 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.780272 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnbm4\" (UniqueName: \"kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4\") pod \"dnsmasq-dnsmasq-84b9f45d47-xbrhr\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.889865 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.901894 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:13 crc kubenswrapper[5030]: I0121 00:15:13.933878 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.065458 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc\") pod \"5a143e25-1454-4e45-8fec-80c44da2f6fa\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.065606 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle\") pod \"5a143e25-1454-4e45-8fec-80c44da2f6fa\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.065939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:14 crc kubenswrapper[5030]: E0121 00:15:14.066134 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 00:15:14 crc kubenswrapper[5030]: E0121 00:15:14.066199 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory podName:5a143e25-1454-4e45-8fec-80c44da2f6fa nodeName:}" failed. No retries permitted until 2026-01-21 00:15:15.066182231 +0000 UTC m=+5987.386442519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa") : secret "dataplanenodeset-edpm-no-nodes-custom-svc" not found Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.069467 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc" (OuterVolumeSpecName: "ssh-key-edpm-no-nodes-custom-svc") pod "5a143e25-1454-4e45-8fec-80c44da2f6fa" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa"). InnerVolumeSpecName "ssh-key-edpm-no-nodes-custom-svc". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.070723 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle" (OuterVolumeSpecName: "custom-img-svc-combined-ca-bundle") pod "5a143e25-1454-4e45-8fec-80c44da2f6fa" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa"). InnerVolumeSpecName "custom-img-svc-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.167800 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8k5j\" (UniqueName: \"kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.168139 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-ssh-key-edpm-no-nodes-custom-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.168196 5030 reconciler_common.go:293] "Volume detached for volume \"custom-img-svc-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-custom-img-svc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:14 crc kubenswrapper[5030]: E0121 00:15:14.171181 5030 projected.go:194] Error preparing data for projected volume kube-api-access-s8k5j for pod openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw: failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 00:15:14 crc kubenswrapper[5030]: E0121 00:15:14.171278 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j podName:5a143e25-1454-4e45-8fec-80c44da2f6fa nodeName:}" failed. No retries permitted until 2026-01-21 00:15:15.171255869 +0000 UTC m=+5987.491516167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s8k5j" (UniqueName: "kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa") : failed to fetch token: serviceaccounts "edpm-no-nodes-custom-svc" not found Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.356212 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.905791 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerID="38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d" exitCode=0 Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.906108 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.905937 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" event={"ID":"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101","Type":"ContainerDied","Data":"38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d"} Jan 21 00:15:14 crc kubenswrapper[5030]: I0121 00:15:14.906293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" event={"ID":"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101","Type":"ContainerStarted","Data":"9c5d3c25fd7a15abc89381730aae99a9990bce68426b8cee428100c0459ef9f5"} Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.067386 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw"] Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.073336 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw"] Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.083686 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory\") pod \"custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw\" (UID: \"5a143e25-1454-4e45-8fec-80c44da2f6fa\") " pod="openstack-kuttl-tests/custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" Jan 21 00:15:15 crc kubenswrapper[5030]: E0121 00:15:15.083799 5030 secret.go:188] Couldn't get secret openstack-kuttl-tests/dataplanenodeset-edpm-no-nodes-custom-svc: object "openstack-kuttl-tests"/"dataplanenodeset-edpm-no-nodes-custom-svc" not registered Jan 21 00:15:15 crc kubenswrapper[5030]: E0121 00:15:15.083859 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory podName:5a143e25-1454-4e45-8fec-80c44da2f6fa nodeName:}" failed. No retries permitted until 2026-01-21 00:15:17.083846026 +0000 UTC m=+5989.404106314 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "inventory" (UniqueName: "kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory") pod "custom-img-svc-edpm-compute-no-nodes-edpm-no-nodes-custom-h87vw" (UID: "5a143e25-1454-4e45-8fec-80c44da2f6fa") : object "openstack-kuttl-tests"/"dataplanenodeset-edpm-no-nodes-custom-svc" not registered Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.186013 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a143e25-1454-4e45-8fec-80c44da2f6fa-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.186378 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8k5j\" (UniqueName: \"kubernetes.io/projected/5a143e25-1454-4e45-8fec-80c44da2f6fa-kube-api-access-s8k5j\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.917985 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" event={"ID":"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101","Type":"ContainerStarted","Data":"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852"} Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.918845 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.937611 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" podStartSLOduration=2.9375855570000002 podStartE2EDuration="2.937585557s" podCreationTimestamp="2026-01-21 00:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:15:15.935548087 +0000 UTC m=+5988.255808395" watchObservedRunningTime="2026-01-21 00:15:15.937585557 +0000 UTC m=+5988.257845845" Jan 21 00:15:15 crc kubenswrapper[5030]: I0121 00:15:15.973689 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a143e25-1454-4e45-8fec-80c44da2f6fa" path="/var/lib/kubelet/pods/5a143e25-1454-4e45-8fec-80c44da2f6fa/volumes" Jan 21 00:15:21 crc kubenswrapper[5030]: I0121 00:15:21.910305 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qzwgf"] Jan 21 00:15:21 crc kubenswrapper[5030]: I0121 00:15:21.920533 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qzwgf"] Jan 21 00:15:21 crc kubenswrapper[5030]: I0121 00:15:21.975193 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9f57e7f-7203-4536-bea5-91dc8828c783" path="/var/lib/kubelet/pods/b9f57e7f-7203-4536-bea5-91dc8828c783/volumes" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.060046 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-4qds8"] Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.061052 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.062885 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4qds8"] Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.063892 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.065894 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.066108 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.066234 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.203362 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.203441 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.203542 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbnq\" (UniqueName: \"kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.304898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.304996 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.305069 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbnq\" (UniqueName: \"kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.305422 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " 
pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.307163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.329735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbnq\" (UniqueName: \"kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq\") pod \"crc-storage-crc-4qds8\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.410708 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:22 crc kubenswrapper[5030]: I0121 00:15:22.850591 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4qds8"] Jan 21 00:15:23 crc kubenswrapper[5030]: I0121 00:15:23.006911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qds8" event={"ID":"9825bee0-3f22-42f1-ae1b-056a35b9ba3f","Type":"ContainerStarted","Data":"9fb36f1f7b15c13f6295b0c358cfcd006824c309d1d31ac27cb08aaace32198e"} Jan 21 00:15:23 crc kubenswrapper[5030]: I0121 00:15:23.935798 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.012901 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.013337 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerName="dnsmasq-dns" containerID="cri-o://f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e" gracePeriod=10 Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.028213 5030 generic.go:334] "Generic (PLEG): container finished" podID="9825bee0-3f22-42f1-ae1b-056a35b9ba3f" containerID="9ed66246b5a326d1c96d2d8f373e75242df9f08d33bc6fe16c1953efd4660339" exitCode=0 Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.028778 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qds8" event={"ID":"9825bee0-3f22-42f1-ae1b-056a35b9ba3f","Type":"ContainerDied","Data":"9ed66246b5a326d1c96d2d8f373e75242df9f08d33bc6fe16c1953efd4660339"} Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.450965 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.545814 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config\") pod \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.545889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc\") pod \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.545912 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc\") pod \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.546079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8v75\" (UniqueName: \"kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75\") pod \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\" (UID: \"16d2c5a6-d548-406d-9b99-7bcfd82a333c\") " Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.551226 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75" (OuterVolumeSpecName: "kube-api-access-l8v75") pod "16d2c5a6-d548-406d-9b99-7bcfd82a333c" (UID: "16d2c5a6-d548-406d-9b99-7bcfd82a333c"). InnerVolumeSpecName "kube-api-access-l8v75". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.578516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "16d2c5a6-d548-406d-9b99-7bcfd82a333c" (UID: "16d2c5a6-d548-406d-9b99-7bcfd82a333c"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.581444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config" (OuterVolumeSpecName: "config") pod "16d2c5a6-d548-406d-9b99-7bcfd82a333c" (UID: "16d2c5a6-d548-406d-9b99-7bcfd82a333c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.583963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc" (OuterVolumeSpecName: "edpm-no-nodes-custom-svc") pod "16d2c5a6-d548-406d-9b99-7bcfd82a333c" (UID: "16d2c5a6-d548-406d-9b99-7bcfd82a333c"). InnerVolumeSpecName "edpm-no-nodes-custom-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.647802 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8v75\" (UniqueName: \"kubernetes.io/projected/16d2c5a6-d548-406d-9b99-7bcfd82a333c-kube-api-access-l8v75\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.647848 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.647858 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:24 crc kubenswrapper[5030]: I0121 00:15:24.647868 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-no-nodes-custom-svc\" (UniqueName: \"kubernetes.io/configmap/16d2c5a6-d548-406d-9b99-7bcfd82a333c-edpm-no-nodes-custom-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.040658 5030 generic.go:334] "Generic (PLEG): container finished" podID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerID="f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e" exitCode=0 Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.040745 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.040774 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" event={"ID":"16d2c5a6-d548-406d-9b99-7bcfd82a333c","Type":"ContainerDied","Data":"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e"} Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.041534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6" event={"ID":"16d2c5a6-d548-406d-9b99-7bcfd82a333c","Type":"ContainerDied","Data":"c7ee1b35ece3894d3a8a4fb97462e4941555306f3c55d1507c6a1edd6054abd1"} Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.041743 5030 scope.go:117] "RemoveContainer" containerID="f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.065239 5030 scope.go:117] "RemoveContainer" containerID="1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.090163 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.096450 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-6fdb84cf7c-c5gj6"] Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.120857 5030 scope.go:117] "RemoveContainer" containerID="f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e" Jan 21 00:15:25 crc kubenswrapper[5030]: E0121 00:15:25.121397 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e\": container with ID starting with f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e not found: ID does not exist" 
containerID="f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.121434 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e"} err="failed to get container status \"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e\": rpc error: code = NotFound desc = could not find container \"f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e\": container with ID starting with f6159c44c967fa1bbd0bb6c62d115191f333705b1afe3b5bf3b8d4deba1a3d7e not found: ID does not exist" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.121457 5030 scope.go:117] "RemoveContainer" containerID="1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1" Jan 21 00:15:25 crc kubenswrapper[5030]: E0121 00:15:25.121789 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1\": container with ID starting with 1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1 not found: ID does not exist" containerID="1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.121834 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1"} err="failed to get container status \"1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1\": rpc error: code = NotFound desc = could not find container \"1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1\": container with ID starting with 1a2233278e2a449b6ee4308d0fad8accd05a202a93cc269845570751671db7a1 not found: ID does not exist" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.340255 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.356540 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt\") pod \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.356604 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage\") pod \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.356710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbnq\" (UniqueName: \"kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq\") pod \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\" (UID: \"9825bee0-3f22-42f1-ae1b-056a35b9ba3f\") " Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.357334 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9825bee0-3f22-42f1-ae1b-056a35b9ba3f" (UID: "9825bee0-3f22-42f1-ae1b-056a35b9ba3f"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.361170 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq" (OuterVolumeSpecName: "kube-api-access-fmbnq") pod "9825bee0-3f22-42f1-ae1b-056a35b9ba3f" (UID: "9825bee0-3f22-42f1-ae1b-056a35b9ba3f"). InnerVolumeSpecName "kube-api-access-fmbnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.374129 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9825bee0-3f22-42f1-ae1b-056a35b9ba3f" (UID: "9825bee0-3f22-42f1-ae1b-056a35b9ba3f"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.457676 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmbnq\" (UniqueName: \"kubernetes.io/projected/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-kube-api-access-fmbnq\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.457745 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.457757 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9825bee0-3f22-42f1-ae1b-056a35b9ba3f-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:25 crc kubenswrapper[5030]: I0121 00:15:25.971489 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" path="/var/lib/kubelet/pods/16d2c5a6-d548-406d-9b99-7bcfd82a333c/volumes" Jan 21 00:15:26 crc kubenswrapper[5030]: I0121 00:15:26.055035 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4qds8" event={"ID":"9825bee0-3f22-42f1-ae1b-056a35b9ba3f","Type":"ContainerDied","Data":"9fb36f1f7b15c13f6295b0c358cfcd006824c309d1d31ac27cb08aaace32198e"} Jan 21 00:15:26 crc kubenswrapper[5030]: I0121 00:15:26.055084 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fb36f1f7b15c13f6295b0c358cfcd006824c309d1d31ac27cb08aaace32198e" Jan 21 00:15:26 crc kubenswrapper[5030]: I0121 00:15:26.055274 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4qds8" Jan 21 00:15:26 crc kubenswrapper[5030]: I0121 00:15:26.962511 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:15:26 crc kubenswrapper[5030]: E0121 00:15:26.963394 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:15:28 crc kubenswrapper[5030]: I0121 00:15:28.929660 5030 scope.go:117] "RemoveContainer" containerID="cc8347fdd1e6eec1f6d192d22b0b150e056b13e81e240ab57f39a88cc76a545f" Jan 21 00:15:28 crc kubenswrapper[5030]: I0121 00:15:28.964652 5030 scope.go:117] "RemoveContainer" containerID="e0ed5fefbb791137c5aa9809673b42ec2898f058c02141144c8d75cb153943bb" Jan 21 00:15:28 crc kubenswrapper[5030]: I0121 00:15:28.986111 5030 scope.go:117] "RemoveContainer" containerID="a21170eaaaa8bd732a221e30c9e24e967c320d9a23616f558d13013131437541" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.019610 5030 scope.go:117] "RemoveContainer" containerID="5684c664f654279461f9b79890c2a571e256b4b31a47b47161c9ff10184e5879" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.058503 5030 scope.go:117] "RemoveContainer" containerID="d46f0597a253211b0fae46ccba166f5d11c876c7f259f66e0ac54845d6296249" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.091970 5030 scope.go:117] "RemoveContainer" containerID="7ef770f8b6fb8a9dfbf7cf9dbe9f107c4a5dcb6df01d2136ef60b4b74a34b569" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.118675 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-4qds8"] Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.119783 5030 scope.go:117] "RemoveContainer" containerID="de2eb28606c6759e62cd01298595b14a1ffa5b4dfa6d7941f92c84212e3b6908" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.123908 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-4qds8"] Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.139743 5030 scope.go:117] "RemoveContainer" containerID="7fa7223501a2a9f0fb1d9ac88a72f6ada4e6ae3525f87be4e1affde884fd2858" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.164890 5030 scope.go:117] "RemoveContainer" containerID="07a3eb05f554cd9db9ec72f8a584308cda93ea2b3337655bb7dbd9b74dd1130d" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.190477 5030 scope.go:117] "RemoveContainer" containerID="baed88bf47793cc103183bf100cfc830e86ad5195ccf15741aa726a2d053e430" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.277995 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qll8j"] Jan 21 00:15:29 crc kubenswrapper[5030]: E0121 00:15:29.278277 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerName="init" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.278297 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerName="init" Jan 21 00:15:29 crc kubenswrapper[5030]: E0121 00:15:29.278307 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" 
containerName="dnsmasq-dns" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.278312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerName="dnsmasq-dns" Jan 21 00:15:29 crc kubenswrapper[5030]: E0121 00:15:29.278328 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9825bee0-3f22-42f1-ae1b-056a35b9ba3f" containerName="storage" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.278335 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9825bee0-3f22-42f1-ae1b-056a35b9ba3f" containerName="storage" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.278457 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9825bee0-3f22-42f1-ae1b-056a35b9ba3f" containerName="storage" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.278477 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d2c5a6-d548-406d-9b99-7bcfd82a333c" containerName="dnsmasq-dns" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.279099 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.281653 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.281761 5030 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-pg8n5" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.281985 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.283486 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.295533 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qll8j"] Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.414940 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.415095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.415128 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9nrm\" (UniqueName: \"kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.516812 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" 
Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.516858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.516884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9nrm\" (UniqueName: \"kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.517250 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.518210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.547720 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9nrm\" (UniqueName: \"kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm\") pod \"crc-storage-crc-qll8j\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.608027 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:29 crc kubenswrapper[5030]: I0121 00:15:29.982827 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9825bee0-3f22-42f1-ae1b-056a35b9ba3f" path="/var/lib/kubelet/pods/9825bee0-3f22-42f1-ae1b-056a35b9ba3f/volumes" Jan 21 00:15:30 crc kubenswrapper[5030]: I0121 00:15:30.044339 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qll8j"] Jan 21 00:15:30 crc kubenswrapper[5030]: I0121 00:15:30.096591 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qll8j" event={"ID":"5844649c-280e-4b8e-a2ed-c09accd605fd","Type":"ContainerStarted","Data":"b7690a75edc5ed3b771ff13c9ef317f2fd282fceb2e71455566b27680ba66023"} Jan 21 00:15:31 crc kubenswrapper[5030]: I0121 00:15:31.107284 5030 generic.go:334] "Generic (PLEG): container finished" podID="5844649c-280e-4b8e-a2ed-c09accd605fd" containerID="65c0e006b42c95fbd72b93f7c39f8f4c01afa8eeb81efc7506ad43f664829adc" exitCode=0 Jan 21 00:15:31 crc kubenswrapper[5030]: I0121 00:15:31.107460 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qll8j" event={"ID":"5844649c-280e-4b8e-a2ed-c09accd605fd","Type":"ContainerDied","Data":"65c0e006b42c95fbd72b93f7c39f8f4c01afa8eeb81efc7506ad43f664829adc"} Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.456400 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.565385 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage\") pod \"5844649c-280e-4b8e-a2ed-c09accd605fd\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.565568 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9nrm\" (UniqueName: \"kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm\") pod \"5844649c-280e-4b8e-a2ed-c09accd605fd\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.565609 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt\") pod \"5844649c-280e-4b8e-a2ed-c09accd605fd\" (UID: \"5844649c-280e-4b8e-a2ed-c09accd605fd\") " Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.566022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5844649c-280e-4b8e-a2ed-c09accd605fd" (UID: "5844649c-280e-4b8e-a2ed-c09accd605fd"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.572125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm" (OuterVolumeSpecName: "kube-api-access-l9nrm") pod "5844649c-280e-4b8e-a2ed-c09accd605fd" (UID: "5844649c-280e-4b8e-a2ed-c09accd605fd"). InnerVolumeSpecName "kube-api-access-l9nrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.601280 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5844649c-280e-4b8e-a2ed-c09accd605fd" (UID: "5844649c-280e-4b8e-a2ed-c09accd605fd"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.667689 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9nrm\" (UniqueName: \"kubernetes.io/projected/5844649c-280e-4b8e-a2ed-c09accd605fd-kube-api-access-l9nrm\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.667726 5030 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5844649c-280e-4b8e-a2ed-c09accd605fd-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:32 crc kubenswrapper[5030]: I0121 00:15:32.667738 5030 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5844649c-280e-4b8e-a2ed-c09accd605fd-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:33 crc kubenswrapper[5030]: I0121 00:15:33.131416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qll8j" event={"ID":"5844649c-280e-4b8e-a2ed-c09accd605fd","Type":"ContainerDied","Data":"b7690a75edc5ed3b771ff13c9ef317f2fd282fceb2e71455566b27680ba66023"} Jan 21 00:15:33 crc kubenswrapper[5030]: I0121 00:15:33.131470 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7690a75edc5ed3b771ff13c9ef317f2fd282fceb2e71455566b27680ba66023" Jan 21 00:15:33 crc kubenswrapper[5030]: I0121 00:15:33.131510 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qll8j" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.815391 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:15:35 crc kubenswrapper[5030]: E0121 00:15:35.816006 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5844649c-280e-4b8e-a2ed-c09accd605fd" containerName="storage" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.816018 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5844649c-280e-4b8e-a2ed-c09accd605fd" containerName="storage" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.816151 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5844649c-280e-4b8e-a2ed-c09accd605fd" containerName="storage" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.816878 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.819443 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.825794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.927050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.927176 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p9rq\" (UniqueName: \"kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.927213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:35 crc kubenswrapper[5030]: I0121 00:15:35.927230 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.028431 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p9rq\" (UniqueName: \"kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.028522 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.028551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.028605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.029677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.030323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.030437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.060106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p9rq\" (UniqueName: \"kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq\") pod \"dnsmasq-dnsmasq-64864b6d57-9h5cc\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.171461 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:36 crc kubenswrapper[5030]: I0121 00:15:36.596815 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:15:37 crc kubenswrapper[5030]: I0121 00:15:37.166830 5030 generic.go:334] "Generic (PLEG): container finished" podID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerID="666267ac45d3dfc090c45bcb6bba55be8b63632f65cce7737cce52453278b8ef" exitCode=0 Jan 21 00:15:37 crc kubenswrapper[5030]: I0121 00:15:37.166936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" event={"ID":"d475005e-6dd1-42ff-a7d7-e983199f2c79","Type":"ContainerDied","Data":"666267ac45d3dfc090c45bcb6bba55be8b63632f65cce7737cce52453278b8ef"} Jan 21 00:15:37 crc kubenswrapper[5030]: I0121 00:15:37.167389 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" event={"ID":"d475005e-6dd1-42ff-a7d7-e983199f2c79","Type":"ContainerStarted","Data":"15411f7570ab18a6bbefb0e4f4d962158a8113020c3ee75e1fcef1fb390786a7"} Jan 21 00:15:38 crc kubenswrapper[5030]: I0121 00:15:38.178215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" event={"ID":"d475005e-6dd1-42ff-a7d7-e983199f2c79","Type":"ContainerStarted","Data":"24908274e6f6b95663e742f913c7ffca6ad5269d0fe425f4ebbd1152a9de54a6"} Jan 21 00:15:38 crc kubenswrapper[5030]: I0121 00:15:38.178634 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:38 crc kubenswrapper[5030]: I0121 00:15:38.201857 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" podStartSLOduration=3.201831801 podStartE2EDuration="3.201831801s" podCreationTimestamp="2026-01-21 00:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:15:38.200375356 +0000 UTC m=+6010.520635684" watchObservedRunningTime="2026-01-21 00:15:38.201831801 +0000 UTC m=+6010.522092109" Jan 21 00:15:38 crc kubenswrapper[5030]: I0121 00:15:38.962037 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:15:38 crc kubenswrapper[5030]: E0121 00:15:38.962286 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.173874 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.256846 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.257147 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" 
podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="dnsmasq-dns" containerID="cri-o://a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852" gracePeriod=10 Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.656971 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.821637 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc\") pod \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.821806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnbm4\" (UniqueName: \"kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4\") pod \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.821898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config\") pod \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\" (UID: \"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101\") " Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.829384 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4" (OuterVolumeSpecName: "kube-api-access-dnbm4") pod "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" (UID: "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101"). InnerVolumeSpecName "kube-api-access-dnbm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.867465 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config" (OuterVolumeSpecName: "config") pod "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" (UID: "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.874837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" (UID: "3f04d0a9-d5ec-4650-90cc-b7db9c1d2101"). InnerVolumeSpecName "dnsmasq-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.923439 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.923491 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:46 crc kubenswrapper[5030]: I0121 00:15:46.924080 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnbm4\" (UniqueName: \"kubernetes.io/projected/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101-kube-api-access-dnbm4\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.280905 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerID="a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852" exitCode=0 Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.280969 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" event={"ID":"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101","Type":"ContainerDied","Data":"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852"} Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.281006 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" event={"ID":"3f04d0a9-d5ec-4650-90cc-b7db9c1d2101","Type":"ContainerDied","Data":"9c5d3c25fd7a15abc89381730aae99a9990bce68426b8cee428100c0459ef9f5"} Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.281014 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.281033 5030 scope.go:117] "RemoveContainer" containerID="a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.323218 5030 scope.go:117] "RemoveContainer" containerID="38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.349699 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.355882 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-xbrhr"] Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.358880 5030 scope.go:117] "RemoveContainer" containerID="a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852" Jan 21 00:15:47 crc kubenswrapper[5030]: E0121 00:15:47.359344 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852\": container with ID starting with a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852 not found: ID does not exist" containerID="a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.359383 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852"} err="failed to get container status \"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852\": rpc error: code = NotFound desc = could not find container \"a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852\": container with ID starting with a00e2eb084c862eb002653905c316f35ee9354b6f17eb33a3cdd5265e5823852 not found: ID does not exist" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.359405 5030 scope.go:117] "RemoveContainer" containerID="38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d" Jan 21 00:15:47 crc kubenswrapper[5030]: E0121 00:15:47.359812 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d\": container with ID starting with 38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d not found: ID does not exist" containerID="38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.359860 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d"} err="failed to get container status \"38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d\": rpc error: code = NotFound desc = could not find container \"38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d\": container with ID starting with 38d0099cce948ed7140a63385f1d334d4df80a4aba7f0d0a3e07e19b60506c3d not found: ID does not exist" Jan 21 00:15:47 crc kubenswrapper[5030]: I0121 00:15:47.979046 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" path="/var/lib/kubelet/pods/3f04d0a9-d5ec-4650-90cc-b7db9c1d2101/volumes" Jan 21 00:15:49 crc 
kubenswrapper[5030]: I0121 00:15:49.962099 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:15:49 crc kubenswrapper[5030]: E0121 00:15:49.962700 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.781195 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9"] Jan 21 00:15:50 crc kubenswrapper[5030]: E0121 00:15:50.782878 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="init" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.782922 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="init" Jan 21 00:15:50 crc kubenswrapper[5030]: E0121 00:15:50.782988 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="dnsmasq-dns" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.783006 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="dnsmasq-dns" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.783344 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f04d0a9-d5ec-4650-90cc-b7db9c1d2101" containerName="dnsmasq-dns" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.784106 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.786683 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-gwxgx" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.787320 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.787422 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.788807 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.789716 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.800150 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9"] Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.907101 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6r4\" (UniqueName: \"kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.907163 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.907268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:50 crc kubenswrapper[5030]: I0121 00:15:50.907332 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.008344 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6r4\" (UniqueName: 
\"kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.008394 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.008776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.008887 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.018275 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.018395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.022512 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.036836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6r4\" (UniqueName: \"kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9\" (UID: 
\"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.145843 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:51 crc kubenswrapper[5030]: W0121 00:15:51.580438 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c05cf2e_50b6_433d_90ec_5cfa46a2e3bc.slice/crio-c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1 WatchSource:0}: Error finding container c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1: Status 404 returned error can't find the container with id c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1 Jan 21 00:15:51 crc kubenswrapper[5030]: I0121 00:15:51.581077 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9"] Jan 21 00:15:52 crc kubenswrapper[5030]: I0121 00:15:52.329571 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" event={"ID":"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc","Type":"ContainerStarted","Data":"c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1"} Jan 21 00:15:53 crc kubenswrapper[5030]: I0121 00:15:53.361682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" event={"ID":"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc","Type":"ContainerStarted","Data":"2ad8dded7146ad34908f5b8c82eb71ed3858392783cbb24fd9d9fed95c890b76"} Jan 21 00:15:53 crc kubenswrapper[5030]: I0121 00:15:53.383259 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" podStartSLOduration=2.766930504 podStartE2EDuration="3.383238478s" podCreationTimestamp="2026-01-21 00:15:50 +0000 UTC" firstStartedPulling="2026-01-21 00:15:51.583847408 +0000 UTC m=+6023.904107726" lastFinishedPulling="2026-01-21 00:15:52.200155382 +0000 UTC m=+6024.520415700" observedRunningTime="2026-01-21 00:15:53.376827432 +0000 UTC m=+6025.697087720" watchObservedRunningTime="2026-01-21 00:15:53.383238478 +0000 UTC m=+6025.703498766" Jan 21 00:15:54 crc kubenswrapper[5030]: I0121 00:15:54.373105 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" containerID="2ad8dded7146ad34908f5b8c82eb71ed3858392783cbb24fd9d9fed95c890b76" exitCode=2 Jan 21 00:15:54 crc kubenswrapper[5030]: I0121 00:15:54.373162 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" event={"ID":"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc","Type":"ContainerDied","Data":"2ad8dded7146ad34908f5b8c82eb71ed3858392783cbb24fd9d9fed95c890b76"} Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.670957 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.782642 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f6r4\" (UniqueName: \"kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4\") pod \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.782703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle\") pod \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.782866 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory\") pod \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.782924 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes\") pod \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\" (UID: \"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc\") " Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.788361 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" (UID: "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.789504 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4" (OuterVolumeSpecName: "kube-api-access-5f6r4") pod "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" (UID: "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc"). InnerVolumeSpecName "kube-api-access-5f6r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.804399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory" (OuterVolumeSpecName: "inventory") pod "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" (UID: "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.815558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" (UID: "8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.884687 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f6r4\" (UniqueName: \"kubernetes.io/projected/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-kube-api-access-5f6r4\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.884728 5030 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.884744 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:55 crc kubenswrapper[5030]: I0121 00:15:55.884758 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:15:56 crc kubenswrapper[5030]: I0121 00:15:56.390016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" event={"ID":"8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc","Type":"ContainerDied","Data":"c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1"} Jan 21 00:15:56 crc kubenswrapper[5030]: I0121 00:15:56.390136 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9" Jan 21 00:15:56 crc kubenswrapper[5030]: I0121 00:15:56.390148 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c67813e8225095434c17270e635582e672ee36f06dad29ef321c737f9d6b46f1" Jan 21 00:16:02 crc kubenswrapper[5030]: I0121 00:16:02.961882 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:16:02 crc kubenswrapper[5030]: E0121 00:16:02.963697 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.026468 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb"] Jan 21 00:16:03 crc kubenswrapper[5030]: E0121 00:16:03.027080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.027100 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.027241 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" 
containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.027910 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.030752 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.030985 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.031255 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.032397 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.032814 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-gwxgx" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.042058 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb"] Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.097526 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.097693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.097739 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzv6\" (UniqueName: \"kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.097863 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.199082 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.199145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzv6\" (UniqueName: \"kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.199167 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.199229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.205398 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.205412 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.205824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.215550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzv6\" (UniqueName: 
\"kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.356147 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:03 crc kubenswrapper[5030]: I0121 00:16:03.658899 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb"] Jan 21 00:16:04 crc kubenswrapper[5030]: I0121 00:16:04.483113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" event={"ID":"3f81bbf4-bb07-4b85-900f-c49fd761e3c9","Type":"ContainerStarted","Data":"1084215f143ef30d91eeaaebf9c9bb81f8a928c758a760d252e97d08227b0783"} Jan 21 00:16:04 crc kubenswrapper[5030]: I0121 00:16:04.483434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" event={"ID":"3f81bbf4-bb07-4b85-900f-c49fd761e3c9","Type":"ContainerStarted","Data":"ab889259d43c6dd52cbe744bd626b205fb08fc36b196f0272c8dce7b194127e3"} Jan 21 00:16:04 crc kubenswrapper[5030]: I0121 00:16:04.503326 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" podStartSLOduration=0.948242558 podStartE2EDuration="1.503307896s" podCreationTimestamp="2026-01-21 00:16:03 +0000 UTC" firstStartedPulling="2026-01-21 00:16:03.665134494 +0000 UTC m=+6035.985394792" lastFinishedPulling="2026-01-21 00:16:04.220199842 +0000 UTC m=+6036.540460130" observedRunningTime="2026-01-21 00:16:04.495049246 +0000 UTC m=+6036.815309544" watchObservedRunningTime="2026-01-21 00:16:04.503307896 +0000 UTC m=+6036.823568194" Jan 21 00:16:06 crc kubenswrapper[5030]: I0121 00:16:06.509090 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f81bbf4-bb07-4b85-900f-c49fd761e3c9" containerID="1084215f143ef30d91eeaaebf9c9bb81f8a928c758a760d252e97d08227b0783" exitCode=2 Jan 21 00:16:06 crc kubenswrapper[5030]: I0121 00:16:06.509196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" event={"ID":"3f81bbf4-bb07-4b85-900f-c49fd761e3c9","Type":"ContainerDied","Data":"1084215f143ef30d91eeaaebf9c9bb81f8a928c758a760d252e97d08227b0783"} Jan 21 00:16:07 crc kubenswrapper[5030]: I0121 00:16:07.807564 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:07 crc kubenswrapper[5030]: I0121 00:16:07.995471 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory\") pod \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " Jan 21 00:16:07 crc kubenswrapper[5030]: I0121 00:16:07.995794 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle\") pod \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " Jan 21 00:16:07 crc kubenswrapper[5030]: I0121 00:16:07.996072 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes\") pod \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " Jan 21 00:16:07 crc kubenswrapper[5030]: I0121 00:16:07.996136 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzv6\" (UniqueName: \"kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6\") pod \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\" (UID: \"3f81bbf4-bb07-4b85-900f-c49fd761e3c9\") " Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.001697 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "3f81bbf4-bb07-4b85-900f-c49fd761e3c9" (UID: "3f81bbf4-bb07-4b85-900f-c49fd761e3c9"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.002161 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6" (OuterVolumeSpecName: "kube-api-access-4zzv6") pod "3f81bbf4-bb07-4b85-900f-c49fd761e3c9" (UID: "3f81bbf4-bb07-4b85-900f-c49fd761e3c9"). InnerVolumeSpecName "kube-api-access-4zzv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.038521 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory" (OuterVolumeSpecName: "inventory") pod "3f81bbf4-bb07-4b85-900f-c49fd761e3c9" (UID: "3f81bbf4-bb07-4b85-900f-c49fd761e3c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.038880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "3f81bbf4-bb07-4b85-900f-c49fd761e3c9" (UID: "3f81bbf4-bb07-4b85-900f-c49fd761e3c9"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.097744 5030 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.097776 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.097792 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zzv6\" (UniqueName: \"kubernetes.io/projected/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-kube-api-access-4zzv6\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.097802 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f81bbf4-bb07-4b85-900f-c49fd761e3c9-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.529467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" event={"ID":"3f81bbf4-bb07-4b85-900f-c49fd761e3c9","Type":"ContainerDied","Data":"ab889259d43c6dd52cbe744bd626b205fb08fc36b196f0272c8dce7b194127e3"} Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.529509 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab889259d43c6dd52cbe744bd626b205fb08fc36b196f0272c8dce7b194127e3" Jan 21 00:16:08 crc kubenswrapper[5030]: I0121 00:16:08.529541 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb" Jan 21 00:16:14 crc kubenswrapper[5030]: I0121 00:16:14.962595 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:16:15 crc kubenswrapper[5030]: I0121 00:16:15.644479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b"} Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.029975 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2"] Jan 21 00:16:26 crc kubenswrapper[5030]: E0121 00:16:26.030808 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f81bbf4-bb07-4b85-900f-c49fd761e3c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.030823 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f81bbf4-bb07-4b85-900f-c49fd761e3c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.030958 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f81bbf4-bb07-4b85-900f-c49fd761e3c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.031431 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.033384 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.033454 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67k5\" (UniqueName: \"kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.033598 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.033852 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.038467 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.038547 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.038468 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.038845 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-gwxgx" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.042903 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.055359 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2"] Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.135114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.135181 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67k5\" (UniqueName: \"kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.135223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.135285 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.150313 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.150421 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.150446 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.161103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67k5\" (UniqueName: \"kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.366987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.590582 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2"] Jan 21 00:16:26 crc kubenswrapper[5030]: I0121 00:16:26.756842 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" event={"ID":"c7ec236f-9cf8-4e07-ac95-cae07a6848eb","Type":"ContainerStarted","Data":"36527d3ee1ce8c20bfc6de43a23619575159906dec51b771f35efc6780f43b66"} Jan 21 00:16:28 crc kubenswrapper[5030]: I0121 00:16:28.778070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" event={"ID":"c7ec236f-9cf8-4e07-ac95-cae07a6848eb","Type":"ContainerStarted","Data":"3229f0d80fcf56c29c7b93700bf3ac47371f65c7b8b5fd6d40bfa996fe440891"} Jan 21 00:16:28 crc kubenswrapper[5030]: I0121 00:16:28.808317 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" podStartSLOduration=1.586242266 podStartE2EDuration="2.808294037s" podCreationTimestamp="2026-01-21 00:16:26 +0000 UTC" firstStartedPulling="2026-01-21 00:16:26.599529491 +0000 UTC m=+6058.919789779" lastFinishedPulling="2026-01-21 00:16:27.821581242 +0000 UTC m=+6060.141841550" observedRunningTime="2026-01-21 00:16:28.803223103 +0000 UTC m=+6061.123483431" watchObservedRunningTime="2026-01-21 00:16:28.808294037 +0000 UTC m=+6061.128554325" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.401412 5030 scope.go:117] "RemoveContainer" containerID="9078165806f3ddb811540570b6df86d43d23e2910910bec703d999a27a431972" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.421524 5030 scope.go:117] "RemoveContainer" containerID="32943b0f86d22d82b455cd36d24f1bf4c6589515dd381e4fee0072632633c544" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.456999 5030 scope.go:117] "RemoveContainer" containerID="30521d269776a85fe20908b38ff56ff1cc6ea0771a52f32b9b6db57321f4327a" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.480309 5030 scope.go:117] "RemoveContainer" containerID="2e73c302e23b021531eea2bbe45fd7a8ceb15c63e6d12ce0821b3f9462c3900d" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.508545 5030 scope.go:117] "RemoveContainer" containerID="b39773b4420068e26289e2e633b7c381c5ccf26a1ec7464ff2e04f7c962bfa77" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.535708 5030 scope.go:117] "RemoveContainer" containerID="758cbd33cafaa2fc69bce8b0657402bc66e32624cdb40e875281c8e72b08398f" Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.789820 5030 generic.go:334] "Generic (PLEG): container finished" podID="c7ec236f-9cf8-4e07-ac95-cae07a6848eb" containerID="3229f0d80fcf56c29c7b93700bf3ac47371f65c7b8b5fd6d40bfa996fe440891" exitCode=2 Jan 21 00:16:29 crc kubenswrapper[5030]: I0121 00:16:29.789897 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" event={"ID":"c7ec236f-9cf8-4e07-ac95-cae07a6848eb","Type":"ContainerDied","Data":"3229f0d80fcf56c29c7b93700bf3ac47371f65c7b8b5fd6d40bfa996fe440891"} Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.140337 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.318853 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle\") pod \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.318972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory\") pod \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.319148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w67k5\" (UniqueName: \"kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5\") pod \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.319253 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes\") pod \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\" (UID: \"c7ec236f-9cf8-4e07-ac95-cae07a6848eb\") " Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.324727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "c7ec236f-9cf8-4e07-ac95-cae07a6848eb" (UID: "c7ec236f-9cf8-4e07-ac95-cae07a6848eb"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.326573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5" (OuterVolumeSpecName: "kube-api-access-w67k5") pod "c7ec236f-9cf8-4e07-ac95-cae07a6848eb" (UID: "c7ec236f-9cf8-4e07-ac95-cae07a6848eb"). InnerVolumeSpecName "kube-api-access-w67k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.358416 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "c7ec236f-9cf8-4e07-ac95-cae07a6848eb" (UID: "c7ec236f-9cf8-4e07-ac95-cae07a6848eb"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.360254 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory" (OuterVolumeSpecName: "inventory") pod "c7ec236f-9cf8-4e07-ac95-cae07a6848eb" (UID: "c7ec236f-9cf8-4e07-ac95-cae07a6848eb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.421787 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w67k5\" (UniqueName: \"kubernetes.io/projected/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-kube-api-access-w67k5\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.422192 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.422216 5030 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.422240 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7ec236f-9cf8-4e07-ac95-cae07a6848eb-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.809748 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" event={"ID":"c7ec236f-9cf8-4e07-ac95-cae07a6848eb","Type":"ContainerDied","Data":"36527d3ee1ce8c20bfc6de43a23619575159906dec51b771f35efc6780f43b66"} Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.809802 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36527d3ee1ce8c20bfc6de43a23619575159906dec51b771f35efc6780f43b66" Jan 21 00:16:31 crc kubenswrapper[5030]: I0121 00:16:31.809837 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.043771 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9"] Jan 21 00:17:09 crc kubenswrapper[5030]: E0121 00:17:09.044754 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ec236f-9cf8-4e07-ac95-cae07a6848eb" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.044772 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ec236f-9cf8-4e07-ac95-cae07a6848eb" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.044959 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ec236f-9cf8-4e07-ac95-cae07a6848eb" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.045570 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.047855 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"edpm-compute-no-nodes-dockercfg-gwxgx" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.049370 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"combined-ca-bundle" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.050048 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-kuttl-tests"/"openstack-aee-default-env" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.050495 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplane-ansible-ssh-private-key-secret" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.052324 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-kuttl-tests"/"dataplanenodeset-edpm-compute-no-nodes" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.069775 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9"] Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.159413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjx67\" (UniqueName: \"kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.159509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.159554 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.159607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.261501 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: 
\"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.261569 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.261666 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.261721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjx67\" (UniqueName: \"kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.270600 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.270720 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.273116 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.291255 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjx67\" (UniqueName: \"kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67\") pod \"failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9\" (UID: 
\"59517e78-00c3-4b68-8060-d9c952b669c9\") " pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.372838 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:09 crc kubenswrapper[5030]: I0121 00:17:09.846713 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9"] Jan 21 00:17:10 crc kubenswrapper[5030]: I0121 00:17:10.167151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" event={"ID":"59517e78-00c3-4b68-8060-d9c952b669c9","Type":"ContainerStarted","Data":"45a8e96540ebdb57c0cc46271a65eae22d81fcdbfc86c98966cf0fedb3dd151e"} Jan 21 00:17:11 crc kubenswrapper[5030]: I0121 00:17:11.176597 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" event={"ID":"59517e78-00c3-4b68-8060-d9c952b669c9","Type":"ContainerStarted","Data":"e73bcaa6e8845c42c762d3ac37774ce0b650251773b18e1b5219e96b854a79f7"} Jan 21 00:17:11 crc kubenswrapper[5030]: I0121 00:17:11.196931 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" podStartSLOduration=1.624521467 podStartE2EDuration="2.196904685s" podCreationTimestamp="2026-01-21 00:17:09 +0000 UTC" firstStartedPulling="2026-01-21 00:17:09.860290925 +0000 UTC m=+6102.180551223" lastFinishedPulling="2026-01-21 00:17:10.432674153 +0000 UTC m=+6102.752934441" observedRunningTime="2026-01-21 00:17:11.19589636 +0000 UTC m=+6103.516156648" watchObservedRunningTime="2026-01-21 00:17:11.196904685 +0000 UTC m=+6103.517165003" Jan 21 00:17:12 crc kubenswrapper[5030]: I0121 00:17:12.184888 5030 generic.go:334] "Generic (PLEG): container finished" podID="59517e78-00c3-4b68-8060-d9c952b669c9" containerID="e73bcaa6e8845c42c762d3ac37774ce0b650251773b18e1b5219e96b854a79f7" exitCode=2 Jan 21 00:17:12 crc kubenswrapper[5030]: I0121 00:17:12.185440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" event={"ID":"59517e78-00c3-4b68-8060-d9c952b669c9","Type":"ContainerDied","Data":"e73bcaa6e8845c42c762d3ac37774ce0b650251773b18e1b5219e96b854a79f7"} Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.475243 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.545183 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle\") pod \"59517e78-00c3-4b68-8060-d9c952b669c9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.545329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes\") pod \"59517e78-00c3-4b68-8060-d9c952b669c9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.545375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjx67\" (UniqueName: \"kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67\") pod \"59517e78-00c3-4b68-8060-d9c952b669c9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.545424 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory\") pod \"59517e78-00c3-4b68-8060-d9c952b669c9\" (UID: \"59517e78-00c3-4b68-8060-d9c952b669c9\") " Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.549828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle" (OuterVolumeSpecName: "failed-service-combined-ca-bundle") pod "59517e78-00c3-4b68-8060-d9c952b669c9" (UID: "59517e78-00c3-4b68-8060-d9c952b669c9"). InnerVolumeSpecName "failed-service-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.549848 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67" (OuterVolumeSpecName: "kube-api-access-mjx67") pod "59517e78-00c3-4b68-8060-d9c952b669c9" (UID: "59517e78-00c3-4b68-8060-d9c952b669c9"). InnerVolumeSpecName "kube-api-access-mjx67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.564310 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory" (OuterVolumeSpecName: "inventory") pod "59517e78-00c3-4b68-8060-d9c952b669c9" (UID: "59517e78-00c3-4b68-8060-d9c952b669c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.581389 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes" (OuterVolumeSpecName: "ssh-key-edpm-compute-no-nodes") pod "59517e78-00c3-4b68-8060-d9c952b669c9" (UID: "59517e78-00c3-4b68-8060-d9c952b669c9"). InnerVolumeSpecName "ssh-key-edpm-compute-no-nodes". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.647015 5030 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-ssh-key-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.647225 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjx67\" (UniqueName: \"kubernetes.io/projected/59517e78-00c3-4b68-8060-d9c952b669c9-kube-api-access-mjx67\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.647311 5030 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:13 crc kubenswrapper[5030]: I0121 00:17:13.647368 5030 reconciler_common.go:293] "Volume detached for volume \"failed-service-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59517e78-00c3-4b68-8060-d9c952b669c9-failed-service-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.204267 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" event={"ID":"59517e78-00c3-4b68-8060-d9c952b669c9","Type":"ContainerDied","Data":"45a8e96540ebdb57c0cc46271a65eae22d81fcdbfc86c98966cf0fedb3dd151e"} Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.204307 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a8e96540ebdb57c0cc46271a65eae22d81fcdbfc86c98966cf0fedb3dd151e" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.204345 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.694654 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.705607 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.717239 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.724217 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.732610 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes84zf9"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.739152 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodestsvp9"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.746994 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes2cbs2"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.754479 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/failed-service-edpm-compute-no-nodes-edpm-compute-no-nodesnm2gb"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.822894 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z"] Jan 21 00:17:14 crc kubenswrapper[5030]: E0121 00:17:14.823190 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59517e78-00c3-4b68-8060-d9c952b669c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.823208 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59517e78-00c3-4b68-8060-d9c952b669c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.823352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59517e78-00c3-4b68-8060-d9c952b669c9" containerName="failed-service-edpm-compute-no-nodes-edpm-compute-no-nodes" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.824046 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.834924 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z"] Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.868330 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8d5\" (UniqueName: \"kubernetes.io/projected/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-kube-api-access-kj8d5\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.868424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.868666 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.970334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8d5\" (UniqueName: \"kubernetes.io/projected/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-kube-api-access-kj8d5\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.970719 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.970766 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.971570 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-config\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.971771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-dnsmasq-svc\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 
00:17:14 crc kubenswrapper[5030]: I0121 00:17:14.991947 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8d5\" (UniqueName: \"kubernetes.io/projected/f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d-kube-api-access-kj8d5\") pod \"dnsmasq-dnsmasq-84b9f45d47-kd68z\" (UID: \"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d\") " pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.141665 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.541404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z"] Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.975969 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f81bbf4-bb07-4b85-900f-c49fd761e3c9" path="/var/lib/kubelet/pods/3f81bbf4-bb07-4b85-900f-c49fd761e3c9/volumes" Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.978346 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59517e78-00c3-4b68-8060-d9c952b669c9" path="/var/lib/kubelet/pods/59517e78-00c3-4b68-8060-d9c952b669c9/volumes" Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.979773 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc" path="/var/lib/kubelet/pods/8c05cf2e-50b6-433d-90ec-5cfa46a2e3bc/volumes" Jan 21 00:17:15 crc kubenswrapper[5030]: I0121 00:17:15.980969 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ec236f-9cf8-4e07-ac95-cae07a6848eb" path="/var/lib/kubelet/pods/c7ec236f-9cf8-4e07-ac95-cae07a6848eb/volumes" Jan 21 00:17:16 crc kubenswrapper[5030]: I0121 00:17:16.220245 5030 generic.go:334] "Generic (PLEG): container finished" podID="f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d" containerID="8c5418f613766d01be41814d1751528f77945bffa64d280382dc975d85f6d607" exitCode=0 Jan 21 00:17:16 crc kubenswrapper[5030]: I0121 00:17:16.220360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" event={"ID":"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d","Type":"ContainerDied","Data":"8c5418f613766d01be41814d1751528f77945bffa64d280382dc975d85f6d607"} Jan 21 00:17:16 crc kubenswrapper[5030]: I0121 00:17:16.220700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" event={"ID":"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d","Type":"ContainerStarted","Data":"9a90cb7fecaff8fe70bd6b82e0ec8b903e77a37d001426efc6d8077a52a84574"} Jan 21 00:17:17 crc kubenswrapper[5030]: I0121 00:17:17.233325 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" event={"ID":"f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d","Type":"ContainerStarted","Data":"c40bbe3b54ed34d6bd2bce64d4a552565dced3a2e280727b87c7ab6708feb92c"} Jan 21 00:17:17 crc kubenswrapper[5030]: I0121 00:17:17.233527 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:17 crc kubenswrapper[5030]: I0121 00:17:17.265065 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" podStartSLOduration=3.265045388 podStartE2EDuration="3.265045388s" podCreationTimestamp="2026-01-21 00:17:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:17:17.260229253 +0000 UTC m=+6109.580489551" watchObservedRunningTime="2026-01-21 00:17:17.265045388 +0000 UTC m=+6109.585305676" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.156571 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.157036 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" containerName="manager" containerID="cri-o://4b004e6eb4eea715e4d1d1225ebf941708a403dfaa7b81a3c79884fc564712b1" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.192660 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.192860 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" podUID="df584eed-267c-41ec-b019-6af9322630da" containerName="manager" containerID="cri-o://e65d03a216edb99e7ff8a94c7c71b7abe78524f83d9aa3650be11579769cdee0" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.208451 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.208732 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" podUID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" containerName="manager" containerID="cri-o://40584f451b0c6b41ad84e7c1a4e84ad8546905f21a6fe716959214b8c35e3b2e" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.221128 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.221327 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" containerName="manager" containerID="cri-o://99b032a36da1aba76e618c982844826d6f9973086eeb750125c45339572b6225" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.243186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.243431 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" containerName="manager" containerID="cri-o://0aa89417ba2f6c0729f3c89c1c5b9fc64701d1b0b7276ab33b14d2c745d5d230" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.252371 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.252566 5030 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerName="manager" containerID="cri-o://b720d678cbbfa9862850b06990f6e49cefc3e136fd89594ae535f4c99f8bf0f2" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.258774 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.258987 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" containerName="operator" containerID="cri-o://10c57e78eced15714eb0ea981712a0028cd5ff4428918476dae035318d85d001" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.268270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.268501 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" containerName="manager" containerID="cri-o://5fb34c8e29ac29cd7e990d10eebe752719765609284b211b3eb8c9c3932928d3" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.274720 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.274972 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" containerName="manager" containerID="cri-o://0b82f35514b33886fb833fea714971e2e950a0b2b66749e86203ca588d1951df" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.280356 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.280954 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" podUID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" containerName="manager" containerID="cri-o://29788f9d9bf1fc564feb6809bf41ef8c112495b8e2f29373c56d5f6af7a181b4" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.298732 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.298987 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" podUID="23bf9edf-4e02-4e6b-815d-c957b80275f0" containerName="manager" containerID="cri-o://09ab300b6027b5cb312945c9d6e4e14bd2732b961fe4b0fc413eabbd873d0271" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.340677 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.340911 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerName="manager" containerID="cri-o://eb9e1f492058fcaf14bce1a2e0dddb06a461741cf089920f0fd020ef503b9cb8" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.363747 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.363915 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" containerName="manager" containerID="cri-o://71b58c9561c91a3cafd3472b3584f780486758bc8767e83ad2f39a640c5cd1d5" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.383142 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.383355 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" podUID="b73ba68f-7653-47c8-bd19-4a3f05afad72" containerName="manager" containerID="cri-o://c3c465ef33d6aab6dd4b7551ec2eac0ee39cb7f40222d561b8e2d9fac3cdaa83" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.407238 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.408040 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" podUID="96630154-0812-445c-bd03-9a2066468891" containerName="manager" containerID="cri-o://6565260d57febc636c85ea1add62c31f1272ab12c8e5392a27d79459653a3bd3" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.413293 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.413523 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" containerName="manager" containerID="cri-o://24c0e2734280ba547ae5ef0d76f16dddb92ff6ddcf0d95bf42ae9dbf234ed756" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.443600 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.443813 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" containerName="manager" containerID="cri-o://24e74137735a20307ecb6198bd20f9e08759fa32dbac36ab6f996478b164bc37" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.458770 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.458998 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" containerName="manager" containerID="cri-o://7ded8ff756807b2d90f4c65bdc54ae4f12c5406d841f6298740660fc611f6f3b" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.476748 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.476986 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerName="manager" containerID="cri-o://aca7b7399a1f0721b34e01b5e2aa237c8d9035db4cf09a589bb3d945a0420ae7" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.493183 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.493430 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" podUID="a2343f20-8163-4f12-85f6-d621d416444b" containerName="manager" containerID="cri-o://a04440c595a263c44973ca82a87dcf37eb9e37396951ad858d0befee7751d427" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.506825 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.507008 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" containerName="manager" containerID="cri-o://f316aa61f7ada9f40c18b6a646bb9bdcb06db631aab04eb4aa5b0679c6f42397" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.527513 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.527822 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerName="manager" containerID="cri-o://79c3796b053a1e0dbfc7557940757713515a2dd19e862a09c993127480e0e1fc" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.536886 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.537065 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-t75jj" podUID="ff5263c0-d16c-41dd-a473-a329c770c17c" containerName="registry-server" containerID="cri-o://d5cd2ad4e448b2529a942ea1ce402bbbff882edee6616a2f81d0df880a69cb7f" gracePeriod=30 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.551723 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.551916 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" 
podUID="123d952b-3c67-49ea-9c09-82b04a14d494" containerName="manager" containerID="cri-o://5abe53d03f892d3544b17d81eba417d5a56af62aff5ab61676205fa77053baf6" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.588817 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.589078 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" podUID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" containerName="operator" containerID="cri-o://395ed9d798d4cf55d2e4f993c6ce58fafbcf3f2feee517ef1ae91409e5ef8021" gracePeriod=10 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.600404 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.611822 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/7f8269a825e737cb1f2e67fcbeccb826d8bfc6ea337cf3db10b8143e2eb5n6v"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.619319 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.619566 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" podUID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" containerName="catalog-operator" containerID="cri-o://c69fc5730d416ab0b9e63766bd41ff5288d7a1376f17aa93612b835bc23c8ef9" gracePeriod=30 Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.625164 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.626391 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.631803 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl"] Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.638305 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.638391 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-srv-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.638455 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzq9n\" (UniqueName: \"kubernetes.io/projected/ad686649-19fc-40ba-bb30-86fcc6511e79-kube-api-access-xzq9n\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.738982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.739063 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-srv-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.739129 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzq9n\" (UniqueName: \"kubernetes.io/projected/ad686649-19fc-40ba-bb30-86fcc6511e79-kube-api-access-xzq9n\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.753574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-profile-collector-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.753580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/ad686649-19fc-40ba-bb30-86fcc6511e79-srv-cert\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:18 crc kubenswrapper[5030]: I0121 00:17:18.766530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzq9n\" (UniqueName: \"kubernetes.io/projected/ad686649-19fc-40ba-bb30-86fcc6511e79-kube-api-access-xzq9n\") pod \"catalog-operator-68c6474976-m5ltl\" (UID: \"ad686649-19fc-40ba-bb30-86fcc6511e79\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.262495 5030 generic.go:334] "Generic (PLEG): container finished" podID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" containerID="395ed9d798d4cf55d2e4f993c6ce58fafbcf3f2feee517ef1ae91409e5ef8021" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.262567 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" event={"ID":"5fbec18c-c7ab-43eb-9440-c2a8973d3532","Type":"ContainerDied","Data":"395ed9d798d4cf55d2e4f993c6ce58fafbcf3f2feee517ef1ae91409e5ef8021"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.265787 5030 generic.go:334] "Generic (PLEG): container finished" podID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" containerID="24e74137735a20307ecb6198bd20f9e08759fa32dbac36ab6f996478b164bc37" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.265866 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" event={"ID":"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2","Type":"ContainerDied","Data":"24e74137735a20307ecb6198bd20f9e08759fa32dbac36ab6f996478b164bc37"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.269456 5030 generic.go:334] "Generic (PLEG): container finished" podID="b73ba68f-7653-47c8-bd19-4a3f05afad72" containerID="c3c465ef33d6aab6dd4b7551ec2eac0ee39cb7f40222d561b8e2d9fac3cdaa83" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.269548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" event={"ID":"b73ba68f-7653-47c8-bd19-4a3f05afad72","Type":"ContainerDied","Data":"c3c465ef33d6aab6dd4b7551ec2eac0ee39cb7f40222d561b8e2d9fac3cdaa83"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.276523 5030 generic.go:334] "Generic (PLEG): container finished" podID="ad595f62-3880-4f00-8f70-879d673dcf55" containerID="0aa89417ba2f6c0729f3c89c1c5b9fc64701d1b0b7276ab33b14d2c745d5d230" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.276646 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" event={"ID":"ad595f62-3880-4f00-8f70-879d673dcf55","Type":"ContainerDied","Data":"0aa89417ba2f6c0729f3c89c1c5b9fc64701d1b0b7276ab33b14d2c745d5d230"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.278459 5030 generic.go:334] "Generic (PLEG): container finished" podID="708640d3-cade-4b2d-b378-1f25cafb3575" containerID="f316aa61f7ada9f40c18b6a646bb9bdcb06db631aab04eb4aa5b0679c6f42397" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.278543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" 
event={"ID":"708640d3-cade-4b2d-b378-1f25cafb3575","Type":"ContainerDied","Data":"f316aa61f7ada9f40c18b6a646bb9bdcb06db631aab04eb4aa5b0679c6f42397"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.283568 5030 generic.go:334] "Generic (PLEG): container finished" podID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerID="eb9e1f492058fcaf14bce1a2e0dddb06a461741cf089920f0fd020ef503b9cb8" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.283681 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" event={"ID":"9a08b9f6-0526-42ba-9439-1fad41456c73","Type":"ContainerDied","Data":"eb9e1f492058fcaf14bce1a2e0dddb06a461741cf089920f0fd020ef503b9cb8"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.287199 5030 generic.go:334] "Generic (PLEG): container finished" podID="ff5263c0-d16c-41dd-a473-a329c770c17c" containerID="d5cd2ad4e448b2529a942ea1ce402bbbff882edee6616a2f81d0df880a69cb7f" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.287277 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t75jj" event={"ID":"ff5263c0-d16c-41dd-a473-a329c770c17c","Type":"ContainerDied","Data":"d5cd2ad4e448b2529a942ea1ce402bbbff882edee6616a2f81d0df880a69cb7f"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.289226 5030 generic.go:334] "Generic (PLEG): container finished" podID="bee1b017-6c33-42be-82ba-9af5cd5b5362" containerID="10c57e78eced15714eb0ea981712a0028cd5ff4428918476dae035318d85d001" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.289333 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" event={"ID":"bee1b017-6c33-42be-82ba-9af5cd5b5362","Type":"ContainerDied","Data":"10c57e78eced15714eb0ea981712a0028cd5ff4428918476dae035318d85d001"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.291637 5030 generic.go:334] "Generic (PLEG): container finished" podID="dac66b05-4f64-4bad-9d59-9c01ccd16018" containerID="99b032a36da1aba76e618c982844826d6f9973086eeb750125c45339572b6225" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.291739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" event={"ID":"dac66b05-4f64-4bad-9d59-9c01ccd16018","Type":"ContainerDied","Data":"99b032a36da1aba76e618c982844826d6f9973086eeb750125c45339572b6225"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.293955 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4f33608-dc70-40e3-b757-8410b4f1938e" containerID="7ded8ff756807b2d90f4c65bdc54ae4f12c5406d841f6298740660fc611f6f3b" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.294048 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" event={"ID":"a4f33608-dc70-40e3-b757-8410b4f1938e","Type":"ContainerDied","Data":"7ded8ff756807b2d90f4c65bdc54ae4f12c5406d841f6298740660fc611f6f3b"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.296166 5030 generic.go:334] "Generic (PLEG): container finished" podID="96630154-0812-445c-bd03-9a2066468891" containerID="6565260d57febc636c85ea1add62c31f1272ab12c8e5392a27d79459653a3bd3" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.296229 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" event={"ID":"96630154-0812-445c-bd03-9a2066468891","Type":"ContainerDied","Data":"6565260d57febc636c85ea1add62c31f1272ab12c8e5392a27d79459653a3bd3"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.297937 5030 generic.go:334] "Generic (PLEG): container finished" podID="f32a8aff-01df-47f8-83db-03ee02703e0f" containerID="4b004e6eb4eea715e4d1d1225ebf941708a403dfaa7b81a3c79884fc564712b1" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.298023 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" event={"ID":"f32a8aff-01df-47f8-83db-03ee02703e0f","Type":"ContainerDied","Data":"4b004e6eb4eea715e4d1d1225ebf941708a403dfaa7b81a3c79884fc564712b1"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.300184 5030 generic.go:334] "Generic (PLEG): container finished" podID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" containerID="c69fc5730d416ab0b9e63766bd41ff5288d7a1376f17aa93612b835bc23c8ef9" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.300249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" event={"ID":"dce6cbef-8eea-40d0-83ee-8790d2bd9450","Type":"ContainerDied","Data":"c69fc5730d416ab0b9e63766bd41ff5288d7a1376f17aa93612b835bc23c8ef9"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.302133 5030 generic.go:334] "Generic (PLEG): container finished" podID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" containerID="40584f451b0c6b41ad84e7c1a4e84ad8546905f21a6fe716959214b8c35e3b2e" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.302196 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" event={"ID":"64490fad-ad11-4eea-b07f-c3c90f9bab8f","Type":"ContainerDied","Data":"40584f451b0c6b41ad84e7c1a4e84ad8546905f21a6fe716959214b8c35e3b2e"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.302262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" event={"ID":"64490fad-ad11-4eea-b07f-c3c90f9bab8f","Type":"ContainerDied","Data":"cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.302287 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.304114 5030 generic.go:334] "Generic (PLEG): container finished" podID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" containerID="0b82f35514b33886fb833fea714971e2e950a0b2b66749e86203ca588d1951df" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.304181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" event={"ID":"caa5b6cd-4c8e-424e-8ba9-578ccd662eab","Type":"ContainerDied","Data":"0b82f35514b33886fb833fea714971e2e950a0b2b66749e86203ca588d1951df"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.304199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" 
event={"ID":"caa5b6cd-4c8e-424e-8ba9-578ccd662eab","Type":"ContainerDied","Data":"91fe49cac6b767fc3fad43be2ab43912ec4d57d9bf705669148fa947cc8530e1"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.304212 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fe49cac6b767fc3fad43be2ab43912ec4d57d9bf705669148fa947cc8530e1" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.321547 5030 generic.go:334] "Generic (PLEG): container finished" podID="1eab1afc-3438-48e1-9210-c49f8c94316d" containerID="5fb34c8e29ac29cd7e990d10eebe752719765609284b211b3eb8c9c3932928d3" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.321741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" event={"ID":"1eab1afc-3438-48e1-9210-c49f8c94316d","Type":"ContainerDied","Data":"5fb34c8e29ac29cd7e990d10eebe752719765609284b211b3eb8c9c3932928d3"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.326520 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" containerID="71b58c9561c91a3cafd3472b3584f780486758bc8767e83ad2f39a640c5cd1d5" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.326580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" event={"ID":"d7b05cde-2eb1-4879-bc97-dd0a420d7617","Type":"ContainerDied","Data":"71b58c9561c91a3cafd3472b3584f780486758bc8767e83ad2f39a640c5cd1d5"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.334361 5030 generic.go:334] "Generic (PLEG): container finished" podID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerID="b720d678cbbfa9862850b06990f6e49cefc3e136fd89594ae535f4c99f8bf0f2" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.334433 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" event={"ID":"1d5ff7f0-0960-44b0-8637-446aa3906d52","Type":"ContainerDied","Data":"b720d678cbbfa9862850b06990f6e49cefc3e136fd89594ae535f4c99f8bf0f2"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.338472 5030 generic.go:334] "Generic (PLEG): container finished" podID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerID="79c3796b053a1e0dbfc7557940757713515a2dd19e862a09c993127480e0e1fc" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.338576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" event={"ID":"98fe7bc7-a9a3-4681-9d86-2d134e784ed0","Type":"ContainerDied","Data":"79c3796b053a1e0dbfc7557940757713515a2dd19e862a09c993127480e0e1fc"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.340563 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2343f20-8163-4f12-85f6-d621d416444b" containerID="a04440c595a263c44973ca82a87dcf37eb9e37396951ad858d0befee7751d427" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.340654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" event={"ID":"a2343f20-8163-4f12-85f6-d621d416444b","Type":"ContainerDied","Data":"a04440c595a263c44973ca82a87dcf37eb9e37396951ad858d0befee7751d427"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.342257 5030 generic.go:334] "Generic (PLEG): container finished" podID="23bf9edf-4e02-4e6b-815d-c957b80275f0" 
containerID="09ab300b6027b5cb312945c9d6e4e14bd2732b961fe4b0fc413eabbd873d0271" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.342292 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" event={"ID":"23bf9edf-4e02-4e6b-815d-c957b80275f0","Type":"ContainerDied","Data":"09ab300b6027b5cb312945c9d6e4e14bd2732b961fe4b0fc413eabbd873d0271"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.342307 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" event={"ID":"23bf9edf-4e02-4e6b-815d-c957b80275f0","Type":"ContainerDied","Data":"4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.342319 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4411971495e539214bd5762d84173cef8610b70cdcc4feddee4278edf42064a1" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.345282 5030 generic.go:334] "Generic (PLEG): container finished" podID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerID="aca7b7399a1f0721b34e01b5e2aa237c8d9035db4cf09a589bb3d945a0420ae7" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.345336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" event={"ID":"c9aea915-81f5-4204-9d3b-4d107d4678a1","Type":"ContainerDied","Data":"aca7b7399a1f0721b34e01b5e2aa237c8d9035db4cf09a589bb3d945a0420ae7"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.347668 5030 generic.go:334] "Generic (PLEG): container finished" podID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" containerID="29788f9d9bf1fc564feb6809bf41ef8c112495b8e2f29373c56d5f6af7a181b4" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.347731 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" event={"ID":"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225","Type":"ContainerDied","Data":"29788f9d9bf1fc564feb6809bf41ef8c112495b8e2f29373c56d5f6af7a181b4"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.349573 5030 generic.go:334] "Generic (PLEG): container finished" podID="123d952b-3c67-49ea-9c09-82b04a14d494" containerID="5abe53d03f892d3544b17d81eba417d5a56af62aff5ab61676205fa77053baf6" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.349606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" event={"ID":"123d952b-3c67-49ea-9c09-82b04a14d494","Type":"ContainerDied","Data":"5abe53d03f892d3544b17d81eba417d5a56af62aff5ab61676205fa77053baf6"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.357072 5030 generic.go:334] "Generic (PLEG): container finished" podID="bf527df4-5b46-4894-ac83-7b5613ee7fae" containerID="24c0e2734280ba547ae5ef0d76f16dddb92ff6ddcf0d95bf42ae9dbf234ed756" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.362797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" event={"ID":"bf527df4-5b46-4894-ac83-7b5613ee7fae","Type":"ContainerDied","Data":"24c0e2734280ba547ae5ef0d76f16dddb92ff6ddcf0d95bf42ae9dbf234ed756"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.363730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" event={"ID":"bf527df4-5b46-4894-ac83-7b5613ee7fae","Type":"ContainerDied","Data":"7e34a090f497852624dff94cac24642c505e29a0112d097e507c1d063804471c"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.363745 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e34a090f497852624dff94cac24642c505e29a0112d097e507c1d063804471c" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.367808 5030 generic.go:334] "Generic (PLEG): container finished" podID="df584eed-267c-41ec-b019-6af9322630da" containerID="e65d03a216edb99e7ff8a94c7c71b7abe78524f83d9aa3650be11579769cdee0" exitCode=0 Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.367845 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" event={"ID":"df584eed-267c-41ec-b019-6af9322630da","Type":"ContainerDied","Data":"e65d03a216edb99e7ff8a94c7c71b7abe78524f83d9aa3650be11579769cdee0"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.367870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" event={"ID":"df584eed-267c-41ec-b019-6af9322630da","Type":"ContainerDied","Data":"92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc"} Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.367884 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ca70581cc9b932f7b9414457a3edb7c6bd6d159de093e030c1cbff4f7535cc" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.768827 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.772109 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.786047 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.791292 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.804068 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.811664 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.820274 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.829851 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.839591 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.844528 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.854171 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.857606 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.866162 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.872841 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") pod \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.872889 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbwjb\" (UniqueName: \"kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb\") pod \"bf527df4-5b46-4894-ac83-7b5613ee7fae\" (UID: \"bf527df4-5b46-4894-ac83-7b5613ee7fae\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.872918 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdgwt\" (UniqueName: \"kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt\") pod \"23bf9edf-4e02-4e6b-815d-c957b80275f0\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.872950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv5lf\" (UniqueName: \"kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf\") pod \"df584eed-267c-41ec-b019-6af9322630da\" (UID: \"df584eed-267c-41ec-b019-6af9322630da\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.872985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") pod \"23bf9edf-4e02-4e6b-815d-c957b80275f0\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knmnv\" (UniqueName: \"kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv\") pod \"caa5b6cd-4c8e-424e-8ba9-578ccd662eab\" (UID: \"caa5b6cd-4c8e-424e-8ba9-578ccd662eab\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873049 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h9vz\" (UniqueName: \"kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz\") pod \"1eab1afc-3438-48e1-9210-c49f8c94316d\" (UID: \"1eab1afc-3438-48e1-9210-c49f8c94316d\") " Jan 21 00:17:19 crc 
kubenswrapper[5030]: I0121 00:17:19.873069 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-587tm\" (UniqueName: \"kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm\") pod \"dac66b05-4f64-4bad-9d59-9c01ccd16018\" (UID: \"dac66b05-4f64-4bad-9d59-9c01ccd16018\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873125 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz7hr\" (UniqueName: \"kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr\") pod \"96630154-0812-445c-bd03-9a2066468891\" (UID: \"96630154-0812-445c-bd03-9a2066468891\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll987\" (UniqueName: \"kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987\") pod \"1d5ff7f0-0960-44b0-8637-446aa3906d52\" (UID: \"1d5ff7f0-0960-44b0-8637-446aa3906d52\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873169 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrxxp\" (UniqueName: \"kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp\") pod \"f32a8aff-01df-47f8-83db-03ee02703e0f\" (UID: \"f32a8aff-01df-47f8-83db-03ee02703e0f\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") pod \"23bf9edf-4e02-4e6b-815d-c957b80275f0\" (UID: \"23bf9edf-4e02-4e6b-815d-c957b80275f0\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp9zq\" (UniqueName: \"kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq\") pod \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\" (UID: \"64490fad-ad11-4eea-b07f-c3c90f9bab8f\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873239 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ndzq\" (UniqueName: \"kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq\") pod \"ad595f62-3880-4f00-8f70-879d673dcf55\" (UID: \"ad595f62-3880-4f00-8f70-879d673dcf55\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.873266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fcjv\" (UniqueName: \"kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv\") pod \"bee1b017-6c33-42be-82ba-9af5cd5b5362\" (UID: \"bee1b017-6c33-42be-82ba-9af5cd5b5362\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.874093 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.882127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt" (OuterVolumeSpecName: "kube-api-access-hdgwt") pod "23bf9edf-4e02-4e6b-815d-c957b80275f0" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0"). InnerVolumeSpecName "kube-api-access-hdgwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.882643 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb" (OuterVolumeSpecName: "kube-api-access-kbwjb") pod "bf527df4-5b46-4894-ac83-7b5613ee7fae" (UID: "bf527df4-5b46-4894-ac83-7b5613ee7fae"). InnerVolumeSpecName "kube-api-access-kbwjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.883845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "23bf9edf-4e02-4e6b-815d-c957b80275f0" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.885217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz" (OuterVolumeSpecName: "kube-api-access-5h9vz") pod "1eab1afc-3438-48e1-9210-c49f8c94316d" (UID: "1eab1afc-3438-48e1-9210-c49f8c94316d"). InnerVolumeSpecName "kube-api-access-5h9vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.886989 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq" (OuterVolumeSpecName: "kube-api-access-tp9zq") pod "64490fad-ad11-4eea-b07f-c3c90f9bab8f" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f"). InnerVolumeSpecName "kube-api-access-tp9zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.890093 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr" (OuterVolumeSpecName: "kube-api-access-zz7hr") pod "96630154-0812-445c-bd03-9a2066468891" (UID: "96630154-0812-445c-bd03-9a2066468891"). InnerVolumeSpecName "kube-api-access-zz7hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.892724 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq" (OuterVolumeSpecName: "kube-api-access-2ndzq") pod "ad595f62-3880-4f00-8f70-879d673dcf55" (UID: "ad595f62-3880-4f00-8f70-879d673dcf55"). InnerVolumeSpecName "kube-api-access-2ndzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.893497 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert" (OuterVolumeSpecName: "cert") pod "64490fad-ad11-4eea-b07f-c3c90f9bab8f" (UID: "64490fad-ad11-4eea-b07f-c3c90f9bab8f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.893548 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.893986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "23bf9edf-4e02-4e6b-815d-c957b80275f0" (UID: "23bf9edf-4e02-4e6b-815d-c957b80275f0"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.894780 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv" (OuterVolumeSpecName: "kube-api-access-knmnv") pod "caa5b6cd-4c8e-424e-8ba9-578ccd662eab" (UID: "caa5b6cd-4c8e-424e-8ba9-578ccd662eab"). InnerVolumeSpecName "kube-api-access-knmnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.895096 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf" (OuterVolumeSpecName: "kube-api-access-dv5lf") pod "df584eed-267c-41ec-b019-6af9322630da" (UID: "df584eed-267c-41ec-b019-6af9322630da"). InnerVolumeSpecName "kube-api-access-dv5lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.898199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm" (OuterVolumeSpecName: "kube-api-access-587tm") pod "dac66b05-4f64-4bad-9d59-9c01ccd16018" (UID: "dac66b05-4f64-4bad-9d59-9c01ccd16018"). InnerVolumeSpecName "kube-api-access-587tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.900308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987" (OuterVolumeSpecName: "kube-api-access-ll987") pod "1d5ff7f0-0960-44b0-8637-446aa3906d52" (UID: "1d5ff7f0-0960-44b0-8637-446aa3906d52"). InnerVolumeSpecName "kube-api-access-ll987". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.901525 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp" (OuterVolumeSpecName: "kube-api-access-xrxxp") pod "f32a8aff-01df-47f8-83db-03ee02703e0f" (UID: "f32a8aff-01df-47f8-83db-03ee02703e0f"). InnerVolumeSpecName "kube-api-access-xrxxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.902364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv" (OuterVolumeSpecName: "kube-api-access-6fcjv") pod "bee1b017-6c33-42be-82ba-9af5cd5b5362" (UID: "bee1b017-6c33-42be-82ba-9af5cd5b5362"). InnerVolumeSpecName "kube-api-access-6fcjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.918870 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.918978 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.947238 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.948211 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.953480 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t75jj" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.960512 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.965337 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.973890 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.973961 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0012c2-48d6-40b9-8490-c0b904e785b7" path="/var/lib/kubelet/pods/9a0012c2-48d6-40b9-8490-c0b904e785b7/volumes" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976086 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkmlr\" (UniqueName: \"kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr\") pod \"123d952b-3c67-49ea-9c09-82b04a14d494\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert\") pod \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976259 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45n7f\" (UniqueName: \"kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f\") pod \"9a08b9f6-0526-42ba-9439-1fad41456c73\" (UID: \"9a08b9f6-0526-42ba-9439-1fad41456c73\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976316 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr8vv\" (UniqueName: \"kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv\") pod \"ff5263c0-d16c-41dd-a473-a329c770c17c\" (UID: \"ff5263c0-d16c-41dd-a473-a329c770c17c\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976350 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkwdf\" (UniqueName: 
\"kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf\") pod \"5fbec18c-c7ab-43eb-9440-c2a8973d3532\" (UID: \"5fbec18c-c7ab-43eb-9440-c2a8973d3532\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976373 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert\") pod \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prs4f\" (UniqueName: \"kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f\") pod \"d7b05cde-2eb1-4879-bc97-dd0a420d7617\" (UID: \"d7b05cde-2eb1-4879-bc97-dd0a420d7617\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976432 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q95sh\" (UniqueName: \"kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh\") pod \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\" (UID: \"dce6cbef-8eea-40d0-83ee-8790d2bd9450\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") pod \"123d952b-3c67-49ea-9c09-82b04a14d494\" (UID: \"123d952b-3c67-49ea-9c09-82b04a14d494\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nzfn\" (UniqueName: \"kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn\") pod \"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225\" (UID: \"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2cl4\" (UniqueName: \"kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4\") pod \"98fe7bc7-a9a3-4681-9d86-2d134e784ed0\" (UID: \"98fe7bc7-a9a3-4681-9d86-2d134e784ed0\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.976562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbhk5\" (UniqueName: \"kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5\") pod \"a4f33608-dc70-40e3-b757-8410b4f1938e\" (UID: \"a4f33608-dc70-40e3-b757-8410b4f1938e\") " Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977049 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv5lf\" (UniqueName: \"kubernetes.io/projected/df584eed-267c-41ec-b019-6af9322630da-kube-api-access-dv5lf\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977070 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977084 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knmnv\" (UniqueName: \"kubernetes.io/projected/caa5b6cd-4c8e-424e-8ba9-578ccd662eab-kube-api-access-knmnv\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc 
kubenswrapper[5030]: I0121 00:17:19.977097 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h9vz\" (UniqueName: \"kubernetes.io/projected/1eab1afc-3438-48e1-9210-c49f8c94316d-kube-api-access-5h9vz\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977110 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-587tm\" (UniqueName: \"kubernetes.io/projected/dac66b05-4f64-4bad-9d59-9c01ccd16018-kube-api-access-587tm\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977122 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz7hr\" (UniqueName: \"kubernetes.io/projected/96630154-0812-445c-bd03-9a2066468891-kube-api-access-zz7hr\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977134 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll987\" (UniqueName: \"kubernetes.io/projected/1d5ff7f0-0960-44b0-8637-446aa3906d52-kube-api-access-ll987\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977146 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrxxp\" (UniqueName: \"kubernetes.io/projected/f32a8aff-01df-47f8-83db-03ee02703e0f-kube-api-access-xrxxp\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977157 5030 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23bf9edf-4e02-4e6b-815d-c957b80275f0-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977170 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp9zq\" (UniqueName: \"kubernetes.io/projected/64490fad-ad11-4eea-b07f-c3c90f9bab8f-kube-api-access-tp9zq\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977181 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ndzq\" (UniqueName: \"kubernetes.io/projected/ad595f62-3880-4f00-8f70-879d673dcf55-kube-api-access-2ndzq\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977193 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fcjv\" (UniqueName: \"kubernetes.io/projected/bee1b017-6c33-42be-82ba-9af5cd5b5362-kube-api-access-6fcjv\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977204 5030 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64490fad-ad11-4eea-b07f-c3c90f9bab8f-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977217 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbwjb\" (UniqueName: \"kubernetes.io/projected/bf527df4-5b46-4894-ac83-7b5613ee7fae-kube-api-access-kbwjb\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.977229 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdgwt\" (UniqueName: \"kubernetes.io/projected/23bf9edf-4e02-4e6b-815d-c957b80275f0-kube-api-access-hdgwt\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.984861 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr" (OuterVolumeSpecName: 
"kube-api-access-bkmlr") pod "123d952b-3c67-49ea-9c09-82b04a14d494" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494"). InnerVolumeSpecName "kube-api-access-bkmlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:19 crc kubenswrapper[5030]: I0121 00:17:19.986605 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.025820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4" (OuterVolumeSpecName: "kube-api-access-g2cl4") pod "98fe7bc7-a9a3-4681-9d86-2d134e784ed0" (UID: "98fe7bc7-a9a3-4681-9d86-2d134e784ed0"). InnerVolumeSpecName "kube-api-access-g2cl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.025935 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert" (OuterVolumeSpecName: "cert") pod "123d952b-3c67-49ea-9c09-82b04a14d494" (UID: "123d952b-3c67-49ea-9c09-82b04a14d494"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.026941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "dce6cbef-8eea-40d0-83ee-8790d2bd9450" (UID: "dce6cbef-8eea-40d0-83ee-8790d2bd9450"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027053 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf" (OuterVolumeSpecName: "kube-api-access-gkwdf") pod "5fbec18c-c7ab-43eb-9440-c2a8973d3532" (UID: "5fbec18c-c7ab-43eb-9440-c2a8973d3532"). InnerVolumeSpecName "kube-api-access-gkwdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027121 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh" (OuterVolumeSpecName: "kube-api-access-q95sh") pod "dce6cbef-8eea-40d0-83ee-8790d2bd9450" (UID: "dce6cbef-8eea-40d0-83ee-8790d2bd9450"). InnerVolumeSpecName "kube-api-access-q95sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5" (OuterVolumeSpecName: "kube-api-access-qbhk5") pod "a4f33608-dc70-40e3-b757-8410b4f1938e" (UID: "a4f33608-dc70-40e3-b757-8410b4f1938e"). InnerVolumeSpecName "kube-api-access-qbhk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f" (OuterVolumeSpecName: "kube-api-access-prs4f") pod "d7b05cde-2eb1-4879-bc97-dd0a420d7617" (UID: "d7b05cde-2eb1-4879-bc97-dd0a420d7617"). 
InnerVolumeSpecName "kube-api-access-prs4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f" (OuterVolumeSpecName: "kube-api-access-45n7f") pod "9a08b9f6-0526-42ba-9439-1fad41456c73" (UID: "9a08b9f6-0526-42ba-9439-1fad41456c73"). InnerVolumeSpecName "kube-api-access-45n7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.027655 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn" (OuterVolumeSpecName: "kube-api-access-4nzfn") pod "5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" (UID: "5810f6d7-0bb3-4137-8ce4-03f0b1e9b225"). InnerVolumeSpecName "kube-api-access-4nzfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.032535 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "dce6cbef-8eea-40d0-83ee-8790d2bd9450" (UID: "dce6cbef-8eea-40d0-83ee-8790d2bd9450"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.033115 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv" (OuterVolumeSpecName: "kube-api-access-qr8vv") pod "ff5263c0-d16c-41dd-a473-a329c770c17c" (UID: "ff5263c0-d16c-41dd-a473-a329c770c17c"). InnerVolumeSpecName "kube-api-access-qr8vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.036940 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.081953 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2kv\" (UniqueName: \"kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv\") pod \"b73ba68f-7653-47c8-bd19-4a3f05afad72\" (UID: \"b73ba68f-7653-47c8-bd19-4a3f05afad72\") " Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.082027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbnhw\" (UniqueName: \"kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw\") pod \"a2343f20-8163-4f12-85f6-d621d416444b\" (UID: \"a2343f20-8163-4f12-85f6-d621d416444b\") " Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084238 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7ttt\" (UniqueName: \"kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt\") pod \"c9aea915-81f5-4204-9d3b-4d107d4678a1\" (UID: \"c9aea915-81f5-4204-9d3b-4d107d4678a1\") " Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084734 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45n7f\" (UniqueName: \"kubernetes.io/projected/9a08b9f6-0526-42ba-9439-1fad41456c73-kube-api-access-45n7f\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084751 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr8vv\" (UniqueName: \"kubernetes.io/projected/ff5263c0-d16c-41dd-a473-a329c770c17c-kube-api-access-qr8vv\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084763 5030 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084773 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkwdf\" (UniqueName: \"kubernetes.io/projected/5fbec18c-c7ab-43eb-9440-c2a8973d3532-kube-api-access-gkwdf\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084782 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prs4f\" (UniqueName: \"kubernetes.io/projected/d7b05cde-2eb1-4879-bc97-dd0a420d7617-kube-api-access-prs4f\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084791 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q95sh\" (UniqueName: \"kubernetes.io/projected/dce6cbef-8eea-40d0-83ee-8790d2bd9450-kube-api-access-q95sh\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084801 5030 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/123d952b-3c67-49ea-9c09-82b04a14d494-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084813 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nzfn\" (UniqueName: \"kubernetes.io/projected/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225-kube-api-access-4nzfn\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084822 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2cl4\" 
(UniqueName: \"kubernetes.io/projected/98fe7bc7-a9a3-4681-9d86-2d134e784ed0-kube-api-access-g2cl4\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084832 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbhk5\" (UniqueName: \"kubernetes.io/projected/a4f33608-dc70-40e3-b757-8410b4f1938e-kube-api-access-qbhk5\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084842 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkmlr\" (UniqueName: \"kubernetes.io/projected/123d952b-3c67-49ea-9c09-82b04a14d494-kube-api-access-bkmlr\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.084851 5030 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dce6cbef-8eea-40d0-83ee-8790d2bd9450-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.089446 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw" (OuterVolumeSpecName: "kube-api-access-lbnhw") pod "a2343f20-8163-4f12-85f6-d621d416444b" (UID: "a2343f20-8163-4f12-85f6-d621d416444b"). InnerVolumeSpecName "kube-api-access-lbnhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.089818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv" (OuterVolumeSpecName: "kube-api-access-rk2kv") pod "b73ba68f-7653-47c8-bd19-4a3f05afad72" (UID: "b73ba68f-7653-47c8-bd19-4a3f05afad72"). InnerVolumeSpecName "kube-api-access-rk2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.105289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt" (OuterVolumeSpecName: "kube-api-access-z7ttt") pod "c9aea915-81f5-4204-9d3b-4d107d4678a1" (UID: "c9aea915-81f5-4204-9d3b-4d107d4678a1"). InnerVolumeSpecName "kube-api-access-z7ttt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.185652 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2kv\" (UniqueName: \"kubernetes.io/projected/b73ba68f-7653-47c8-bd19-4a3f05afad72-kube-api-access-rk2kv\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.185994 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbnhw\" (UniqueName: \"kubernetes.io/projected/a2343f20-8163-4f12-85f6-d621d416444b-kube-api-access-lbnhw\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.186006 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7ttt\" (UniqueName: \"kubernetes.io/projected/c9aea915-81f5-4204-9d3b-4d107d4678a1-kube-api-access-z7ttt\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.305265 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.318559 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.348347 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.383509 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" event={"ID":"5810f6d7-0bb3-4137-8ce4-03f0b1e9b225","Type":"ContainerDied","Data":"935f3c59e93c64cea85ed2d5be55e63cafcaf1639be8be9438362ca7681d6735"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.383599 5030 scope.go:117] "RemoveContainer" containerID="29788f9d9bf1fc564feb6809bf41ef8c112495b8e2f29373c56d5f6af7a181b4" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.384259 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.385415 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" event={"ID":"123d952b-3c67-49ea-9c09-82b04a14d494","Type":"ContainerDied","Data":"4c56725c9af1fd9984d9fe795fecb822e35ef346cecd8821f436327204352292"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.385510 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.388613 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" event={"ID":"dac66b05-4f64-4bad-9d59-9c01ccd16018","Type":"ContainerDied","Data":"0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.388949 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.391546 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" event={"ID":"1eab1afc-3438-48e1-9210-c49f8c94316d","Type":"ContainerDied","Data":"7153f1dca68387e2a6d3300318f24efd0e08123231f30ee0fa253ab14ef21e89"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.391658 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.397938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" event={"ID":"5fbec18c-c7ab-43eb-9440-c2a8973d3532","Type":"ContainerDied","Data":"f4c3382246a3c95961ab4f3a240007cd3a134d3284a23038ebd7b692ddbdc754"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.398219 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.399398 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.399423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5" event={"ID":"ad595f62-3880-4f00-8f70-879d673dcf55","Type":"ContainerDied","Data":"3f4609d1c2943327f98dc0860442ae37a003eb0c8c8dd464054dc25633d8a127"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.402373 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.402401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg" event={"ID":"bee1b017-6c33-42be-82ba-9af5cd5b5362","Type":"ContainerDied","Data":"5085c8ef0fa89c562c35f33a6f3a6baef9af8d7b2386c6a5a533595050133ed2"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.404053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z7wr\" (UniqueName: \"kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr\") pod \"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2\" (UID: \"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2\") " Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.404154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hrc7\" (UniqueName: \"kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7\") pod \"708640d3-cade-4b2d-b378-1f25cafb3575\" (UID: \"708640d3-cade-4b2d-b378-1f25cafb3575\") " Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.407110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" event={"ID":"a2343f20-8163-4f12-85f6-d621d416444b","Type":"ContainerDied","Data":"1b9233cacbdaab03a3e512ad967827c332cd4d73c727aee17bdf4c3317255495"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.407174 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.408519 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr" (OuterVolumeSpecName: "kube-api-access-8z7wr") pod "1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" (UID: "1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2"). InnerVolumeSpecName "kube-api-access-8z7wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.409468 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7" (OuterVolumeSpecName: "kube-api-access-5hrc7") pod "708640d3-cade-4b2d-b378-1f25cafb3575" (UID: "708640d3-cade-4b2d-b378-1f25cafb3575"). InnerVolumeSpecName "kube-api-access-5hrc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.410972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" event={"ID":"1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2","Type":"ContainerDied","Data":"45056e4b4551d7ca50fe1510a2a3fcb124f7a6e3dcb50c106c56b9280376cd45"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.411067 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.413556 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.413645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p" event={"ID":"b73ba68f-7653-47c8-bd19-4a3f05afad72","Type":"ContainerDied","Data":"9aa3520b44cb611b8c2c70cf9782e71781c58c8199642c5756659871ca3ad8a4"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.416538 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" event={"ID":"d7b05cde-2eb1-4879-bc97-dd0a420d7617","Type":"ContainerDied","Data":"27b8e7fdbf0a93aa969d05b8df05ba304376d0624a935c3b19db1986ad813da8"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.416584 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.418606 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" event={"ID":"ad686649-19fc-40ba-bb30-86fcc6511e79","Type":"ContainerStarted","Data":"6538c10985bec6592eb3fb28f95678fb634b019dcc9dfa823207c812201a2973"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.421343 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" event={"ID":"708640d3-cade-4b2d-b378-1f25cafb3575","Type":"ContainerDied","Data":"e598c82ee9bfff80aef89685a049df0958cabccae007296671b9eb8dc6155114"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.421361 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.424133 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.424113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj" event={"ID":"f32a8aff-01df-47f8-83db-03ee02703e0f","Type":"ContainerDied","Data":"af20e3150d908877f0530fe90a7c29d6c4c54f8acdd7115e791b3d0bdbfcc7c4"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.426899 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" event={"ID":"dce6cbef-8eea-40d0-83ee-8790d2bd9450","Type":"ContainerDied","Data":"02ed1f2589a1078b2496e47c2ca39670e4d7cfcff601ab8d54db6520fdb20c7d"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.426959 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.429766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" event={"ID":"98fe7bc7-a9a3-4681-9d86-2d134e784ed0","Type":"ContainerDied","Data":"30aed801a968d758af4b6d855a2ef6d6d91c6050f073d908fb33f6377fed4227"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.429776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.441951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-t75jj" event={"ID":"ff5263c0-d16c-41dd-a473-a329c770c17c","Type":"ContainerDied","Data":"c260c58585e003a105245cb984aedbbbc3ec07de889640d7a0a11403d88ab5a0"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.442078 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-t75jj" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.450242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" event={"ID":"c9aea915-81f5-4204-9d3b-4d107d4678a1","Type":"ContainerDied","Data":"04eeeed3ac64e1451efbd97883acb294ae843803f74752e4477422cc9b11a002"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.450356 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.455703 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" event={"ID":"9a08b9f6-0526-42ba-9439-1fad41456c73","Type":"ContainerDied","Data":"f3e2cbeb0278267c45cf2f1949649c84fee44f1c3ae03089f9b5363dffeb03f2"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.455834 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.460405 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" event={"ID":"a4f33608-dc70-40e3-b757-8410b4f1938e","Type":"ContainerDied","Data":"d77ad8f7e577fc1539cec8914e2c76b1cf272490a60878313932d418a9bee7dd"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.460483 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.464658 5030 scope.go:117] "RemoveContainer" containerID="5abe53d03f892d3544b17d81eba417d5a56af62aff5ab61676205fa77053baf6" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.465117 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" event={"ID":"96630154-0812-445c-bd03-9a2066468891","Type":"ContainerDied","Data":"f0f53acb4e69e31489ae5f14455a28bc895789db2a9d9911dfac82840d78f0c9"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.465885 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.469271 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" event={"ID":"1d5ff7f0-0960-44b0-8637-446aa3906d52","Type":"ContainerDied","Data":"64e21b6bd260ff06d4c3d53a805a0c54484d958266d1ba19717faa6bf864d9c3"} Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.469332 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.469391 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.469486 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.469864 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.470157 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.470184 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.505543 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z7wr\" (UniqueName: \"kubernetes.io/projected/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2-kube-api-access-8z7wr\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.505575 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hrc7\" (UniqueName: \"kubernetes.io/projected/708640d3-cade-4b2d-b378-1f25cafb3575-kube-api-access-5hrc7\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.538394 5030 scope.go:117] "RemoveContainer" containerID="99b032a36da1aba76e618c982844826d6f9973086eeb750125c45339572b6225" Jan 21 00:17:20 crc kubenswrapper[5030]: E0121 00:17:20.565456 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64490fad_ad11_4eea_b07f_c3c90f9bab8f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa5b6cd_4c8e_424e_8ba9_578ccd662eab.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eab1afc_3438_48e1_9210_c49f8c94316d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa5b6cd_4c8e_424e_8ba9_578ccd662eab.slice/crio-91fe49cac6b767fc3fad43be2ab43912ec4d57d9bf705669148fa947cc8530e1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64490fad_ad11_4eea_b07f_c3c90f9bab8f.slice/crio-cef5e3ff95f86d527a7368b737f75cb7a5ac6347b7bf32ad5c02df992761da30\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eab1afc_3438_48e1_9210_c49f8c94316d.slice/crio-7153f1dca68387e2a6d3300318f24efd0e08123231f30ee0fa253ab14ef21e89\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac66b05_4f64_4bad_9d59_9c01ccd16018.slice/crio-0c3c66e40c7c2b8ae9854585da60c4d307ff190a708bb1b04184653096c0fab0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5810f6d7_0bb3_4137_8ce4_03f0b1e9b225.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b05cde_2eb1_4879_bc97_dd0a420d7617.slice/crio-27b8e7fdbf0a93aa969d05b8df05ba304376d0624a935c3b19db1986ad813da8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce6cbef_8eea_40d0_83ee_8790d2bd9450.slice/crio-02ed1f2589a1078b2496e47c2ca39670e4d7cfcff601ab8d54db6520fdb20c7d\": RecentStats: unable to find data in memory cache]" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.565523 5030 scope.go:117] "RemoveContainer" containerID="5fb34c8e29ac29cd7e990d10eebe752719765609284b211b3eb8c9c3932928d3" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.614100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.622142 5030 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pxvmq"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.627057 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.71:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.653396 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.72:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.663364 5030 scope.go:117] "RemoveContainer" containerID="395ed9d798d4cf55d2e4f993c6ce58fafbcf3f2feee517ef1ae91409e5ef8021" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.663690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.663727 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-mvzhb"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.670264 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.678002 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-2cdl5"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.707079 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.724683 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-dh9wl"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.728912 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.73:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.738912 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.746725 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.74:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.752047 5030 scope.go:117] "RemoveContainer" containerID="0aa89417ba2f6c0729f3c89c1c5b9fc64701d1b0b7276ab33b14d2c745d5d230" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 
00:17:20.762709 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-hw2rq"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.778929 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.798877 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-qqnjk"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.807010 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.832682 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-6hb48"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.838763 5030 scope.go:117] "RemoveContainer" containerID="10c57e78eced15714eb0ea981712a0028cd5ff4428918476dae035318d85d001" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.845079 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.849859 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-dvhb6"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.865684 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.872070 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-cq5dj"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.882696 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.903638 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-nrctv"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.906420 5030 scope.go:117] "RemoveContainer" containerID="a04440c595a263c44973ca82a87dcf37eb9e37396951ad858d0befee7751d427" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.938867 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.963685 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-jgsmp"] Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.993953 5030 scope.go:117] "RemoveContainer" containerID="24e74137735a20307ecb6198bd20f9e08759fa32dbac36ab6f996478b164bc37" Jan 21 00:17:20 crc kubenswrapper[5030]: I0121 00:17:20.995023 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.002717 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-w79zj"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.008947 5030 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.011779 5030 scope.go:117] "RemoveContainer" containerID="c3c465ef33d6aab6dd4b7551ec2eac0ee39cb7f40222d561b8e2d9fac3cdaa83" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.015947 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b9875986d7xnrt"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.025167 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.029924 5030 scope.go:117] "RemoveContainer" containerID="71b58c9561c91a3cafd3472b3584f780486758bc8767e83ad2f39a640c5cd1d5" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.030264 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-pl7qm"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.048457 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.056793 5030 scope.go:117] "RemoveContainer" containerID="f316aa61f7ada9f40c18b6a646bb9bdcb06db631aab04eb4aa5b0679c6f42397" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.056915 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-sjfdt"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.063754 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.069796 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-g9v2l"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.078201 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.081158 5030 scope.go:117] "RemoveContainer" containerID="4b004e6eb4eea715e4d1d1225ebf941708a403dfaa7b81a3c79884fc564712b1" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.084924 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-lwzdd"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.090993 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.095713 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6d4d7d8545-fz9q7"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.100370 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.107768 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-t75jj"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.111162 5030 scope.go:117] "RemoveContainer" containerID="c69fc5730d416ab0b9e63766bd41ff5288d7a1376f17aa93612b835bc23c8ef9" Jan 21 00:17:21 crc 
kubenswrapper[5030]: I0121 00:17:21.113067 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.122457 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-9qgvp"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.127144 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.131132 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8nglg"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.131569 5030 scope.go:117] "RemoveContainer" containerID="79c3796b053a1e0dbfc7557940757713515a2dd19e862a09c993127480e0e1fc" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.135027 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.138719 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-n2m7p"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.143244 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.146580 5030 scope.go:117] "RemoveContainer" containerID="d5cd2ad4e448b2529a942ea1ce402bbbff882edee6616a2f81d0df880a69cb7f" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.147416 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-dpmdr"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.151354 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.155162 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-gtjfb"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.159224 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.161373 5030 scope.go:117] "RemoveContainer" containerID="aca7b7399a1f0721b34e01b5e2aa237c8d9035db4cf09a589bb3d945a0420ae7" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.162856 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-75bfd788c8-dslzz"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.166460 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.170174 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-mq6l5"] Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.177598 5030 scope.go:117] "RemoveContainer" containerID="eb9e1f492058fcaf14bce1a2e0dddb06a461741cf089920f0fd020ef503b9cb8" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.191939 5030 scope.go:117] 
"RemoveContainer" containerID="7ded8ff756807b2d90f4c65bdc54ae4f12c5406d841f6298740660fc611f6f3b" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.206205 5030 scope.go:117] "RemoveContainer" containerID="6565260d57febc636c85ea1add62c31f1272ab12c8e5392a27d79459653a3bd3" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.220492 5030 scope.go:117] "RemoveContainer" containerID="b720d678cbbfa9862850b06990f6e49cefc3e136fd89594ae535f4c99f8bf0f2" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.490683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" event={"ID":"ad686649-19fc-40ba-bb30-86fcc6511e79","Type":"ContainerStarted","Data":"544ac73f0c272ad9b72a162b04442160ed82bebd6c77f0b7981d5c3cf3fa1a42"} Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.490913 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.498040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.508100 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-m5ltl" podStartSLOduration=3.5080830389999997 podStartE2EDuration="3.508083039s" podCreationTimestamp="2026-01-21 00:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:17:21.507673539 +0000 UTC m=+6113.827933827" watchObservedRunningTime="2026-01-21 00:17:21.508083039 +0000 UTC m=+6113.828343327" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.974966 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123d952b-3c67-49ea-9c09-82b04a14d494" path="/var/lib/kubelet/pods/123d952b-3c67-49ea-9c09-82b04a14d494/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.975776 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" path="/var/lib/kubelet/pods/1d5ff7f0-0960-44b0-8637-446aa3906d52/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.976184 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" path="/var/lib/kubelet/pods/1eab1afc-3438-48e1-9210-c49f8c94316d/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.977321 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" path="/var/lib/kubelet/pods/1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.979174 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bf9edf-4e02-4e6b-815d-c957b80275f0" path="/var/lib/kubelet/pods/23bf9edf-4e02-4e6b-815d-c957b80275f0/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.979700 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" path="/var/lib/kubelet/pods/5810f6d7-0bb3-4137-8ce4-03f0b1e9b225/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.980095 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" 
path="/var/lib/kubelet/pods/5fbec18c-c7ab-43eb-9440-c2a8973d3532/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.981490 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" path="/var/lib/kubelet/pods/64490fad-ad11-4eea-b07f-c3c90f9bab8f/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.982728 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" path="/var/lib/kubelet/pods/708640d3-cade-4b2d-b378-1f25cafb3575/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.983197 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96630154-0812-445c-bd03-9a2066468891" path="/var/lib/kubelet/pods/96630154-0812-445c-bd03-9a2066468891/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.983703 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" path="/var/lib/kubelet/pods/98fe7bc7-a9a3-4681-9d86-2d134e784ed0/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.984817 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" path="/var/lib/kubelet/pods/9a08b9f6-0526-42ba-9439-1fad41456c73/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.985362 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2343f20-8163-4f12-85f6-d621d416444b" path="/var/lib/kubelet/pods/a2343f20-8163-4f12-85f6-d621d416444b/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.985820 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" path="/var/lib/kubelet/pods/a4f33608-dc70-40e3-b757-8410b4f1938e/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.986436 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" path="/var/lib/kubelet/pods/ad595f62-3880-4f00-8f70-879d673dcf55/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.987636 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73ba68f-7653-47c8-bd19-4a3f05afad72" path="/var/lib/kubelet/pods/b73ba68f-7653-47c8-bd19-4a3f05afad72/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.988123 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" path="/var/lib/kubelet/pods/bee1b017-6c33-42be-82ba-9af5cd5b5362/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.988562 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" path="/var/lib/kubelet/pods/bf527df4-5b46-4894-ac83-7b5613ee7fae/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.989035 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" path="/var/lib/kubelet/pods/c9aea915-81f5-4204-9d3b-4d107d4678a1/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.989995 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" path="/var/lib/kubelet/pods/caa5b6cd-4c8e-424e-8ba9-578ccd662eab/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.990533 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" 
path="/var/lib/kubelet/pods/d7b05cde-2eb1-4879-bc97-dd0a420d7617/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.991201 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" path="/var/lib/kubelet/pods/dac66b05-4f64-4bad-9d59-9c01ccd16018/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.992906 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" path="/var/lib/kubelet/pods/dce6cbef-8eea-40d0-83ee-8790d2bd9450/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.993938 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df584eed-267c-41ec-b019-6af9322630da" path="/var/lib/kubelet/pods/df584eed-267c-41ec-b019-6af9322630da/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.994386 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" path="/var/lib/kubelet/pods/f32a8aff-01df-47f8-83db-03ee02703e0f/volumes" Jan 21 00:17:21 crc kubenswrapper[5030]: I0121 00:17:21.995820 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5263c0-d16c-41dd-a473-a329c770c17c" path="/var/lib/kubelet/pods/ff5263c0-d16c-41dd-a473-a329c770c17c/volumes" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.144931 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-84b9f45d47-kd68z" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.210072 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.210302 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="dnsmasq-dns" containerID="cri-o://24908274e6f6b95663e742f913c7ffca6ad5269d0fe425f4ebbd1152a9de54a6" gracePeriod=10 Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.549754 5030 generic.go:334] "Generic (PLEG): container finished" podID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerID="24908274e6f6b95663e742f913c7ffca6ad5269d0fe425f4ebbd1152a9de54a6" exitCode=0 Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.549837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" event={"ID":"d475005e-6dd1-42ff-a7d7-e983199f2c79","Type":"ContainerDied","Data":"24908274e6f6b95663e742f913c7ffca6ad5269d0fe425f4ebbd1152a9de54a6"} Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.619921 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.800855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p9rq\" (UniqueName: \"kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq\") pod \"d475005e-6dd1-42ff-a7d7-e983199f2c79\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.802158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config\") pod \"d475005e-6dd1-42ff-a7d7-e983199f2c79\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.802330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc\") pod \"d475005e-6dd1-42ff-a7d7-e983199f2c79\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.802455 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes\") pod \"d475005e-6dd1-42ff-a7d7-e983199f2c79\" (UID: \"d475005e-6dd1-42ff-a7d7-e983199f2c79\") " Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.807154 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq" (OuterVolumeSpecName: "kube-api-access-6p9rq") pod "d475005e-6dd1-42ff-a7d7-e983199f2c79" (UID: "d475005e-6dd1-42ff-a7d7-e983199f2c79"). InnerVolumeSpecName "kube-api-access-6p9rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.853277 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc" (OuterVolumeSpecName: "dnsmasq-svc") pod "d475005e-6dd1-42ff-a7d7-e983199f2c79" (UID: "d475005e-6dd1-42ff-a7d7-e983199f2c79"). InnerVolumeSpecName "dnsmasq-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.858265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes" (OuterVolumeSpecName: "edpm-compute-no-nodes") pod "d475005e-6dd1-42ff-a7d7-e983199f2c79" (UID: "d475005e-6dd1-42ff-a7d7-e983199f2c79"). InnerVolumeSpecName "edpm-compute-no-nodes". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.868343 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config" (OuterVolumeSpecName: "config") pod "d475005e-6dd1-42ff-a7d7-e983199f2c79" (UID: "d475005e-6dd1-42ff-a7d7-e983199f2c79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.903805 5030 reconciler_common.go:293] "Volume detached for volume \"dnsmasq-svc\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-dnsmasq-svc\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.903839 5030 reconciler_common.go:293] "Volume detached for volume \"edpm-compute-no-nodes\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-edpm-compute-no-nodes\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.903850 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p9rq\" (UniqueName: \"kubernetes.io/projected/d475005e-6dd1-42ff-a7d7-e983199f2c79-kube-api-access-6p9rq\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:25 crc kubenswrapper[5030]: I0121 00:17:25.903861 5030 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d475005e-6dd1-42ff-a7d7-e983199f2c79-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.559838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" event={"ID":"d475005e-6dd1-42ff-a7d7-e983199f2c79","Type":"ContainerDied","Data":"15411f7570ab18a6bbefb0e4f4d962158a8113020c3ee75e1fcef1fb390786a7"} Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.559890 5030 scope.go:117] "RemoveContainer" containerID="24908274e6f6b95663e742f913c7ffca6ad5269d0fe425f4ebbd1152a9de54a6" Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.560028 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc" Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.580526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.585589 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-kuttl-tests/dnsmasq-dnsmasq-64864b6d57-9h5cc"] Jan 21 00:17:26 crc kubenswrapper[5030]: I0121 00:17:26.591016 5030 scope.go:117] "RemoveContainer" containerID="666267ac45d3dfc090c45bcb6bba55be8b63632f65cce7737cce52453278b8ef" Jan 21 00:17:27 crc kubenswrapper[5030]: I0121 00:17:27.970434 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" path="/var/lib/kubelet/pods/d475005e-6dd1-42ff-a7d7-e983199f2c79/volumes" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.649889 5030 scope.go:117] "RemoveContainer" containerID="0b82f35514b33886fb833fea714971e2e950a0b2b66749e86203ca588d1951df" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.669191 5030 scope.go:117] "RemoveContainer" containerID="324a62c6af727b39111590bbd594cae9d7d93c4239e962a7da2abaf5131d0cac" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.692687 5030 scope.go:117] "RemoveContainer" containerID="b502b9e619433261ba720c46b8d94cf90dc56f4d7747ceab0da0c15d5aa7beff" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.727860 5030 scope.go:117] "RemoveContainer" containerID="24c0e2734280ba547ae5ef0d76f16dddb92ff6ddcf0d95bf42ae9dbf234ed756" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.744903 5030 scope.go:117] "RemoveContainer" containerID="40584f451b0c6b41ad84e7c1a4e84ad8546905f21a6fe716959214b8c35e3b2e" Jan 21 00:17:29 crc 
kubenswrapper[5030]: I0121 00:17:29.766510 5030 scope.go:117] "RemoveContainer" containerID="6b60948631f006d6a9d18bca30bc7814cf68deb67ca44b604e156bacaab52391" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.811129 5030 scope.go:117] "RemoveContainer" containerID="09ab300b6027b5cb312945c9d6e4e14bd2732b961fe4b0fc413eabbd873d0271" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.844589 5030 scope.go:117] "RemoveContainer" containerID="e65d03a216edb99e7ff8a94c7c71b7abe78524f83d9aa3650be11579769cdee0" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.878768 5030 scope.go:117] "RemoveContainer" containerID="9f2d63d3e6ab738ff2f2a269569e5b6cf524ea4f47c18034d1306d5a3cc5b29b" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.907346 5030 scope.go:117] "RemoveContainer" containerID="35a4a73cc7b2a93dccec18d3a9260524935c827138006c6b0a4b3e14c932fee9" Jan 21 00:17:29 crc kubenswrapper[5030]: I0121 00:17:29.934302 5030 scope.go:117] "RemoveContainer" containerID="c4a06335f88871083eb1b64b57f710ab53596ef67f62067e5dca2e32320bb1e4" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.294814 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295840 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295859 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295884 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df584eed-267c-41ec-b019-6af9322630da" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295893 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df584eed-267c-41ec-b019-6af9322630da" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295903 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123d952b-3c67-49ea-9c09-82b04a14d494" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295912 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="123d952b-3c67-49ea-9c09-82b04a14d494" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295929 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295939 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295953 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295962 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.295975 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.295983 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: 
E0121 00:18:06.295993 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73ba68f-7653-47c8-bd19-4a3f05afad72" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296000 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73ba68f-7653-47c8-bd19-4a3f05afad72" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="dnsmasq-dns" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296025 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="dnsmasq-dns" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296039 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" containerName="catalog-operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296047 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" containerName="catalog-operator" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296062 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296070 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296084 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296091 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296107 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296114 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296131 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2343f20-8163-4f12-85f6-d621d416444b" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296139 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2343f20-8163-4f12-85f6-d621d416444b" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296149 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296158 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296172 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296179 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296191 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad595f62-3880-4f00-8f70-879d673dcf55" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296199 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296212 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296220 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296233 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96630154-0812-445c-bd03-9a2066468891" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296241 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="96630154-0812-445c-bd03-9a2066468891" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296254 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296262 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296273 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296281 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296292 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23bf9edf-4e02-4e6b-815d-c957b80275f0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296300 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bf9edf-4e02-4e6b-815d-c957b80275f0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296311 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296319 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296332 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296340 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="init" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296365 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="init" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 
00:18:06.296388 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296402 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5263c0-d16c-41dd-a473-a329c770c17c" containerName="registry-server" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5263c0-d16c-41dd-a473-a329c770c17c" containerName="registry-server" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296426 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296433 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: E0121 00:18:06.296448 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296457 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296611 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dce6cbef-8eea-40d0-83ee-8790d2bd9450" containerName="catalog-operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296647 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac66b05-4f64-4bad-9d59-9c01ccd16018" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296666 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b05cde-2eb1-4879-bc97-dd0a420d7617" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296685 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad595f62-3880-4f00-8f70-879d673dcf55" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296696 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2343f20-8163-4f12-85f6-d621d416444b" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296712 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="23bf9edf-4e02-4e6b-815d-c957b80275f0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296728 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="123d952b-3c67-49ea-9c09-82b04a14d494" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296742 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df584eed-267c-41ec-b019-6af9322630da" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296753 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5263c0-d16c-41dd-a473-a329c770c17c" containerName="registry-server" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296764 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fe7bc7-a9a3-4681-9d86-2d134e784ed0" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296779 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa5b6cd-4c8e-424e-8ba9-578ccd662eab" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296791 5030 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1d5ff7f0-0960-44b0-8637-446aa3906d52" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296801 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9aea915-81f5-4204-9d3b-4d107d4678a1" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296813 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="96630154-0812-445c-bd03-9a2066468891" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296831 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee1b017-6c33-42be-82ba-9af5cd5b5362" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296850 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32a8aff-01df-47f8-83db-03ee02703e0f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296868 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d475005e-6dd1-42ff-a7d7-e983199f2c79" containerName="dnsmasq-dns" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296887 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5810f6d7-0bb3-4137-8ce4-03f0b1e9b225" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296907 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a08b9f6-0526-42ba-9439-1fad41456c73" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296921 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73ba68f-7653-47c8-bd19-4a3f05afad72" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.296941 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eab1afc-3438-48e1-9210-c49f8c94316d" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297001 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb76f65-90b7-4ad2-9f0a-02ea66e5b6e2" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297069 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="64490fad-ad11-4eea-b07f-c3c90f9bab8f" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297105 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f33608-dc70-40e3-b757-8410b4f1938e" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf527df4-5b46-4894-ac83-7b5613ee7fae" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297129 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="708640d3-cade-4b2d-b378-1f25cafb3575" containerName="manager" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297142 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbec18c-c7ab-43eb-9440-c2a8973d3532" containerName="operator" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.297790 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.301640 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.302841 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.304658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-pc2cn" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.305534 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.334007 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdlv\" (UniqueName: \"kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv\") pod \"mariadb-operator-index-kc6h7\" (UID: \"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d\") " pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.435365 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xdlv\" (UniqueName: \"kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv\") pod \"mariadb-operator-index-kc6h7\" (UID: \"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d\") " pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.454235 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xdlv\" (UniqueName: \"kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv\") pod \"mariadb-operator-index-kc6h7\" (UID: \"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d\") " pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.617609 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:06 crc kubenswrapper[5030]: I0121 00:18:06.641463 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.049470 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.050677 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.074125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.087667 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.146779 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmpj\" (UniqueName: \"kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj\") pod \"mariadb-operator-index-h4n9w\" (UID: \"57c9dd6b-175a-44f3-a3e7-90f211937483\") " pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.248551 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmpj\" (UniqueName: \"kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj\") pod \"mariadb-operator-index-h4n9w\" (UID: \"57c9dd6b-175a-44f3-a3e7-90f211937483\") " pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.271349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmpj\" (UniqueName: \"kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj\") pod \"mariadb-operator-index-h4n9w\" (UID: \"57c9dd6b-175a-44f3-a3e7-90f211937483\") " pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.421903 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.649210 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.878247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h4n9w" event={"ID":"57c9dd6b-175a-44f3-a3e7-90f211937483","Type":"ContainerStarted","Data":"fece8b1a2609d19a5b995791ad370400f65ff59305975e5fbe7b5cfc77112a31"} Jan 21 00:18:07 crc kubenswrapper[5030]: I0121 00:18:07.879871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kc6h7" event={"ID":"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d","Type":"ContainerStarted","Data":"f156ef197f5a8ef9d9c7ba88c0ab327724d095547a2d93cc7933ddfa67f370a8"} Jan 21 00:18:08 crc kubenswrapper[5030]: I0121 00:18:08.890663 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kc6h7" event={"ID":"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d","Type":"ContainerStarted","Data":"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628"} Jan 21 00:18:08 crc kubenswrapper[5030]: I0121 00:18:08.891278 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-kc6h7" podUID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" containerName="registry-server" containerID="cri-o://50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628" gracePeriod=2 Jan 21 00:18:08 crc kubenswrapper[5030]: I0121 00:18:08.899123 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h4n9w" event={"ID":"57c9dd6b-175a-44f3-a3e7-90f211937483","Type":"ContainerStarted","Data":"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40"} Jan 21 00:18:08 crc kubenswrapper[5030]: I0121 00:18:08.916228 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-kc6h7" podStartSLOduration=2.001805076 podStartE2EDuration="2.916201806s" podCreationTimestamp="2026-01-21 00:18:06 +0000 UTC" firstStartedPulling="2026-01-21 00:18:07.087659011 +0000 UTC m=+6159.407919299" lastFinishedPulling="2026-01-21 00:18:08.002055741 +0000 UTC m=+6160.322316029" observedRunningTime="2026-01-21 00:18:08.912764083 +0000 UTC m=+6161.233024381" watchObservedRunningTime="2026-01-21 00:18:08.916201806 +0000 UTC m=+6161.236462114" Jan 21 00:18:08 crc kubenswrapper[5030]: I0121 00:18:08.933159 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-h4n9w" podStartSLOduration=1.454984954 podStartE2EDuration="1.93313905s" podCreationTimestamp="2026-01-21 00:18:07 +0000 UTC" firstStartedPulling="2026-01-21 00:18:07.667981268 +0000 UTC m=+6159.988241576" lastFinishedPulling="2026-01-21 00:18:08.146135374 +0000 UTC m=+6160.466395672" observedRunningTime="2026-01-21 00:18:08.926603284 +0000 UTC m=+6161.246863582" watchObservedRunningTime="2026-01-21 00:18:08.93313905 +0000 UTC m=+6161.253399348" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.760526 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-kc6h7_ea3c7cab-4ba7-486e-8b3b-3deaf567e35d/registry-server/0.log" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.760891 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.800349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xdlv\" (UniqueName: \"kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv\") pod \"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d\" (UID: \"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d\") " Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.810778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv" (OuterVolumeSpecName: "kube-api-access-9xdlv") pod "ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" (UID: "ea3c7cab-4ba7-486e-8b3b-3deaf567e35d"). InnerVolumeSpecName "kube-api-access-9xdlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.901977 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xdlv\" (UniqueName: \"kubernetes.io/projected/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d-kube-api-access-9xdlv\") on node \"crc\" DevicePath \"\"" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907034 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-kc6h7_ea3c7cab-4ba7-486e-8b3b-3deaf567e35d/registry-server/0.log" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907089 5030 generic.go:334] "Generic (PLEG): container finished" podID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" containerID="50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628" exitCode=1 Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kc6h7" event={"ID":"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d","Type":"ContainerDied","Data":"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628"} Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907203 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-kc6h7" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907230 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-kc6h7" event={"ID":"ea3c7cab-4ba7-486e-8b3b-3deaf567e35d","Type":"ContainerDied","Data":"f156ef197f5a8ef9d9c7ba88c0ab327724d095547a2d93cc7933ddfa67f370a8"} Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.907251 5030 scope.go:117] "RemoveContainer" containerID="50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.927735 5030 scope.go:117] "RemoveContainer" containerID="50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628" Jan 21 00:18:09 crc kubenswrapper[5030]: E0121 00:18:09.928316 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628\": container with ID starting with 50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628 not found: ID does not exist" containerID="50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.928402 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628"} err="failed to get container status \"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628\": rpc error: code = NotFound desc = could not find container \"50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628\": container with ID starting with 50e922695ea01ba45d7cfb018b8e73e3c0e95dfad1f7d17a787bca860ecc2628 not found: ID does not exist" Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.945539 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.950578 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-kc6h7"] Jan 21 00:18:09 crc kubenswrapper[5030]: I0121 00:18:09.973081 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" path="/var/lib/kubelet/pods/ea3c7cab-4ba7-486e-8b3b-3deaf567e35d/volumes" Jan 21 00:18:17 crc kubenswrapper[5030]: I0121 00:18:17.422754 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:17 crc kubenswrapper[5030]: I0121 00:18:17.423090 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:17 crc kubenswrapper[5030]: I0121 00:18:17.448000 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:18 crc kubenswrapper[5030]: I0121 00:18:18.017290 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.394540 5030 scope.go:117] "RemoveContainer" containerID="af80139a16fc5d56aad0d1464a7af3e6a4ffa08bff4a2f1c3865175be23c6fc8" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.424860 5030 scope.go:117] "RemoveContainer" containerID="6cdf699538371ab232fbd636fd875076b1ecdabcdefe8ee91d0fb83a745b3bba" Jan 21 00:18:30 
crc kubenswrapper[5030]: I0121 00:18:30.449854 5030 scope.go:117] "RemoveContainer" containerID="808c466715a032ee3209d45b915fc9c59a888761ad015562a2cacadf4accc0d5" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.486100 5030 scope.go:117] "RemoveContainer" containerID="4cd150a4f0dcf557aa527bf36c716d85a95bde3fe3f482844022156f6b8740c7" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.504452 5030 scope.go:117] "RemoveContainer" containerID="dea7270de9afd4ceb98f5f5e373eb14e2c823503fc52aa4f92211f7f733286c3" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.534012 5030 scope.go:117] "RemoveContainer" containerID="e17f8d6088fec44a34a9d5f5c2d13bbaadb9f6e8a7aa11a615410041648c8f82" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.692713 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp"] Jan 21 00:18:30 crc kubenswrapper[5030]: E0121 00:18:30.693098 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" containerName="registry-server" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.693120 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" containerName="registry-server" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.693288 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3c7cab-4ba7-486e-8b3b-3deaf567e35d" containerName="registry-server" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.694446 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.701890 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp"] Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.734591 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.841693 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.841769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgfxs\" (UniqueName: \"kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.841872 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " 
pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.942992 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.943074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgfxs\" (UniqueName: \"kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.943109 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.943959 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.944197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:30 crc kubenswrapper[5030]: I0121 00:18:30.963747 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgfxs\" (UniqueName: \"kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:31 crc kubenswrapper[5030]: I0121 00:18:31.048872 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:31 crc kubenswrapper[5030]: I0121 00:18:31.456844 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp"] Jan 21 00:18:32 crc kubenswrapper[5030]: I0121 00:18:32.074157 5030 generic.go:334] "Generic (PLEG): container finished" podID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerID="555c8eef907969632d5648a21d3b0319730cf56a78b22a4fe57a6084e80e7e9f" exitCode=0 Jan 21 00:18:32 crc kubenswrapper[5030]: I0121 00:18:32.074273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerDied","Data":"555c8eef907969632d5648a21d3b0319730cf56a78b22a4fe57a6084e80e7e9f"} Jan 21 00:18:32 crc kubenswrapper[5030]: I0121 00:18:32.074890 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerStarted","Data":"82ede166a81b218c2ccebb053be65aa18153ea249a830456926b5af1defbc418"} Jan 21 00:18:33 crc kubenswrapper[5030]: I0121 00:18:33.083172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerStarted","Data":"6ab77e0659379b1d9e7ff1fe4cf2d42d2b2882b795dd39477a8a0ddbab18aab1"} Jan 21 00:18:34 crc kubenswrapper[5030]: I0121 00:18:34.092892 5030 generic.go:334] "Generic (PLEG): container finished" podID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerID="6ab77e0659379b1d9e7ff1fe4cf2d42d2b2882b795dd39477a8a0ddbab18aab1" exitCode=0 Jan 21 00:18:34 crc kubenswrapper[5030]: I0121 00:18:34.093263 5030 generic.go:334] "Generic (PLEG): container finished" podID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerID="49387d81df18a11895c2eaa3aae68f8bdf0370436f03e99bc447e8ffe027ee1a" exitCode=0 Jan 21 00:18:34 crc kubenswrapper[5030]: I0121 00:18:34.093002 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerDied","Data":"6ab77e0659379b1d9e7ff1fe4cf2d42d2b2882b795dd39477a8a0ddbab18aab1"} Jan 21 00:18:34 crc kubenswrapper[5030]: I0121 00:18:34.093367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerDied","Data":"49387d81df18a11895c2eaa3aae68f8bdf0370436f03e99bc447e8ffe027ee1a"} Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.429683 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.509104 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgfxs\" (UniqueName: \"kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs\") pod \"375155e3-b3f2-44e8-a551-1050e4e072b4\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.509205 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle\") pod \"375155e3-b3f2-44e8-a551-1050e4e072b4\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.509273 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util\") pod \"375155e3-b3f2-44e8-a551-1050e4e072b4\" (UID: \"375155e3-b3f2-44e8-a551-1050e4e072b4\") " Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.510295 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle" (OuterVolumeSpecName: "bundle") pod "375155e3-b3f2-44e8-a551-1050e4e072b4" (UID: "375155e3-b3f2-44e8-a551-1050e4e072b4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.514752 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs" (OuterVolumeSpecName: "kube-api-access-jgfxs") pod "375155e3-b3f2-44e8-a551-1050e4e072b4" (UID: "375155e3-b3f2-44e8-a551-1050e4e072b4"). InnerVolumeSpecName "kube-api-access-jgfxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.527141 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util" (OuterVolumeSpecName: "util") pod "375155e3-b3f2-44e8-a551-1050e4e072b4" (UID: "375155e3-b3f2-44e8-a551-1050e4e072b4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.610492 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.610757 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375155e3-b3f2-44e8-a551-1050e4e072b4-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:18:35 crc kubenswrapper[5030]: I0121 00:18:35.610846 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgfxs\" (UniqueName: \"kubernetes.io/projected/375155e3-b3f2-44e8-a551-1050e4e072b4-kube-api-access-jgfxs\") on node \"crc\" DevicePath \"\"" Jan 21 00:18:36 crc kubenswrapper[5030]: I0121 00:18:36.109934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" event={"ID":"375155e3-b3f2-44e8-a551-1050e4e072b4","Type":"ContainerDied","Data":"82ede166a81b218c2ccebb053be65aa18153ea249a830456926b5af1defbc418"} Jan 21 00:18:36 crc kubenswrapper[5030]: I0121 00:18:36.109998 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ede166a81b218c2ccebb053be65aa18153ea249a830456926b5af1defbc418" Jan 21 00:18:36 crc kubenswrapper[5030]: I0121 00:18:36.110030 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp" Jan 21 00:18:40 crc kubenswrapper[5030]: I0121 00:18:40.157173 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:18:40 crc kubenswrapper[5030]: I0121 00:18:40.157768 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.762943 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:18:42 crc kubenswrapper[5030]: E0121 00:18:42.763471 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="pull" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.763484 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="pull" Jan 21 00:18:42 crc kubenswrapper[5030]: E0121 00:18:42.763495 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="extract" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.763501 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="extract" Jan 21 00:18:42 crc kubenswrapper[5030]: E0121 00:18:42.763520 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="util" Jan 21 
00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.763526 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="util" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.763708 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" containerName="extract" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.764141 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.766414 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.766529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.766732 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-d8w6m" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.780471 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.810349 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.810406 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqnk5\" (UniqueName: \"kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.810580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.912134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.912218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: 
\"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.912252 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqnk5\" (UniqueName: \"kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.918527 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.918880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:42 crc kubenswrapper[5030]: I0121 00:18:42.931340 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqnk5\" (UniqueName: \"kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5\") pod \"mariadb-operator-controller-manager-76878c7469-zppmd\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:43 crc kubenswrapper[5030]: I0121 00:18:43.082683 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:43 crc kubenswrapper[5030]: I0121 00:18:43.617843 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:18:43 crc kubenswrapper[5030]: W0121 00:18:43.625806 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e620d3_b6ad_4847_94fa_23e999bebfca.slice/crio-1630300bbae8359feaf6521ad26adaf28662620576998f09a10a58508489e174 WatchSource:0}: Error finding container 1630300bbae8359feaf6521ad26adaf28662620576998f09a10a58508489e174: Status 404 returned error can't find the container with id 1630300bbae8359feaf6521ad26adaf28662620576998f09a10a58508489e174 Jan 21 00:18:44 crc kubenswrapper[5030]: I0121 00:18:44.200137 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" event={"ID":"97e620d3-b6ad-4847-94fa-23e999bebfca","Type":"ContainerStarted","Data":"1630300bbae8359feaf6521ad26adaf28662620576998f09a10a58508489e174"} Jan 21 00:18:47 crc kubenswrapper[5030]: I0121 00:18:47.227102 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" event={"ID":"97e620d3-b6ad-4847-94fa-23e999bebfca","Type":"ContainerStarted","Data":"89ac43add713166d036b64ce67b58bc1f08b82876bccda292a8c2a8ca3da9c54"} Jan 21 00:18:47 crc kubenswrapper[5030]: I0121 00:18:47.227518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:47 crc kubenswrapper[5030]: I0121 00:18:47.249417 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" podStartSLOduration=2.703671357 podStartE2EDuration="5.249395859s" podCreationTimestamp="2026-01-21 00:18:42 +0000 UTC" firstStartedPulling="2026-01-21 00:18:43.627969311 +0000 UTC m=+6195.948229599" lastFinishedPulling="2026-01-21 00:18:46.173693813 +0000 UTC m=+6198.493954101" observedRunningTime="2026-01-21 00:18:47.245832813 +0000 UTC m=+6199.566093101" watchObservedRunningTime="2026-01-21 00:18:47.249395859 +0000 UTC m=+6199.569656147" Jan 21 00:18:53 crc kubenswrapper[5030]: I0121 00:18:53.090948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.450556 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.451727 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.457211 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-22q5b" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.467476 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.509413 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8pvx\" (UniqueName: \"kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx\") pod \"infra-operator-index-md4t4\" (UID: \"702a9aa6-bb25-4831-83be-50970db674b0\") " pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.611215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8pvx\" (UniqueName: \"kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx\") pod \"infra-operator-index-md4t4\" (UID: \"702a9aa6-bb25-4831-83be-50970db674b0\") " pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.643820 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8pvx\" (UniqueName: \"kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx\") pod \"infra-operator-index-md4t4\" (UID: \"702a9aa6-bb25-4831-83be-50970db674b0\") " pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:18:56 crc kubenswrapper[5030]: I0121 00:18:56.772773 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:18:57 crc kubenswrapper[5030]: I0121 00:18:57.240968 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:18:57 crc kubenswrapper[5030]: I0121 00:18:57.293308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-md4t4" event={"ID":"702a9aa6-bb25-4831-83be-50970db674b0","Type":"ContainerStarted","Data":"0c6ec5416529b379f1d0b992695d2ed87876c9aeeabe5e9bdb80c9a978596fcf"} Jan 21 00:18:59 crc kubenswrapper[5030]: I0121 00:18:59.314095 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-md4t4" event={"ID":"702a9aa6-bb25-4831-83be-50970db674b0","Type":"ContainerStarted","Data":"2054f09e461f55bbf67cdcefc016529884b2e616ee4279b261f715286770a5d3"} Jan 21 00:18:59 crc kubenswrapper[5030]: I0121 00:18:59.332593 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-md4t4" podStartSLOduration=2.350441328 podStartE2EDuration="3.332570276s" podCreationTimestamp="2026-01-21 00:18:56 +0000 UTC" firstStartedPulling="2026-01-21 00:18:57.252219194 +0000 UTC m=+6209.572479472" lastFinishedPulling="2026-01-21 00:18:58.234348132 +0000 UTC m=+6210.554608420" observedRunningTime="2026-01-21 00:18:59.329526243 +0000 UTC m=+6211.649786591" watchObservedRunningTime="2026-01-21 00:18:59.332570276 +0000 UTC m=+6211.652830584" Jan 21 00:19:06 crc kubenswrapper[5030]: I0121 00:19:06.773644 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:19:06 crc kubenswrapper[5030]: I0121 00:19:06.774217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:19:06 crc kubenswrapper[5030]: I0121 00:19:06.801911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:19:07 crc kubenswrapper[5030]: I0121 00:19:07.396523 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.286232 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr"] Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.287814 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.289770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.301853 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr"] Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.380300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.380881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9bj\" (UniqueName: \"kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.381120 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.482488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9bj\" (UniqueName: \"kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.482582 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.482702 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.483226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.483323 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.509522 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9bj\" (UniqueName: \"kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:08 crc kubenswrapper[5030]: I0121 00:19:08.604526 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:09 crc kubenswrapper[5030]: I0121 00:19:09.055998 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr"] Jan 21 00:19:09 crc kubenswrapper[5030]: I0121 00:19:09.395607 5030 generic.go:334] "Generic (PLEG): container finished" podID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerID="3515afd57f607427efd5ce2644f5dd488504d28f646955b9c4ef6c112083c664" exitCode=0 Jan 21 00:19:09 crc kubenswrapper[5030]: I0121 00:19:09.395697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerDied","Data":"3515afd57f607427efd5ce2644f5dd488504d28f646955b9c4ef6c112083c664"} Jan 21 00:19:09 crc kubenswrapper[5030]: I0121 00:19:09.395735 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerStarted","Data":"d4545d492b7a9ef1d881197b8c83164d90556374c43741048bc24a7f6a587d69"} Jan 21 00:19:09 crc kubenswrapper[5030]: I0121 00:19:09.398220 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:19:10 crc kubenswrapper[5030]: I0121 00:19:10.157188 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:19:10 crc kubenswrapper[5030]: I0121 00:19:10.157251 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:19:10 crc 
kubenswrapper[5030]: I0121 00:19:10.403770 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerStarted","Data":"36ccad619f00d34d28f42dcfbcbd572472f7da9ef5c7cbe484910369636a8140"} Jan 21 00:19:11 crc kubenswrapper[5030]: I0121 00:19:11.414808 5030 generic.go:334] "Generic (PLEG): container finished" podID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerID="36ccad619f00d34d28f42dcfbcbd572472f7da9ef5c7cbe484910369636a8140" exitCode=0 Jan 21 00:19:11 crc kubenswrapper[5030]: I0121 00:19:11.414865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerDied","Data":"36ccad619f00d34d28f42dcfbcbd572472f7da9ef5c7cbe484910369636a8140"} Jan 21 00:19:12 crc kubenswrapper[5030]: I0121 00:19:12.426485 5030 generic.go:334] "Generic (PLEG): container finished" podID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerID="1a814456dfcca7833db34b890f1cd5233c2654b9d22554d4f0ba998177248e68" exitCode=0 Jan 21 00:19:12 crc kubenswrapper[5030]: I0121 00:19:12.426575 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerDied","Data":"1a814456dfcca7833db34b890f1cd5233c2654b9d22554d4f0ba998177248e68"} Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.708841 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.764047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle\") pod \"c91ab469-6741-4eb3-a3d6-f651b5925221\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.764759 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util\") pod \"c91ab469-6741-4eb3-a3d6-f651b5925221\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.764807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9bj\" (UniqueName: \"kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj\") pod \"c91ab469-6741-4eb3-a3d6-f651b5925221\" (UID: \"c91ab469-6741-4eb3-a3d6-f651b5925221\") " Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.766661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle" (OuterVolumeSpecName: "bundle") pod "c91ab469-6741-4eb3-a3d6-f651b5925221" (UID: "c91ab469-6741-4eb3-a3d6-f651b5925221"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.773860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj" (OuterVolumeSpecName: "kube-api-access-kk9bj") pod "c91ab469-6741-4eb3-a3d6-f651b5925221" (UID: "c91ab469-6741-4eb3-a3d6-f651b5925221"). InnerVolumeSpecName "kube-api-access-kk9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.783364 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util" (OuterVolumeSpecName: "util") pod "c91ab469-6741-4eb3-a3d6-f651b5925221" (UID: "c91ab469-6741-4eb3-a3d6-f651b5925221"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.866951 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.867014 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9bj\" (UniqueName: \"kubernetes.io/projected/c91ab469-6741-4eb3-a3d6-f651b5925221-kube-api-access-kk9bj\") on node \"crc\" DevicePath \"\"" Jan 21 00:19:13 crc kubenswrapper[5030]: I0121 00:19:13.867036 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c91ab469-6741-4eb3-a3d6-f651b5925221-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:19:14 crc kubenswrapper[5030]: I0121 00:19:14.447181 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" event={"ID":"c91ab469-6741-4eb3-a3d6-f651b5925221","Type":"ContainerDied","Data":"d4545d492b7a9ef1d881197b8c83164d90556374c43741048bc24a7f6a587d69"} Jan 21 00:19:14 crc kubenswrapper[5030]: I0121 00:19:14.447272 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr" Jan 21 00:19:14 crc kubenswrapper[5030]: I0121 00:19:14.447286 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4545d492b7a9ef1d881197b8c83164d90556374c43741048bc24a7f6a587d69" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.681174 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:19:20 crc kubenswrapper[5030]: E0121 00:19:20.682005 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="util" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.682019 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="util" Jan 21 00:19:20 crc kubenswrapper[5030]: E0121 00:19:20.682036 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="pull" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.682041 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="pull" Jan 21 00:19:20 crc kubenswrapper[5030]: E0121 00:19:20.682050 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="extract" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.682056 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="extract" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.682175 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" containerName="extract" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.682829 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.686252 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"galera-openstack-dockercfg-h9swc" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.686290 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-scripts" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.687161 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openshift-service-ca.crt" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.687906 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"openstack-config-data" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.690404 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"kube-root-ca.crt" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.702561 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.703850 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.720924 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.722123 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.726974 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.734914 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.746865 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763442 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763511 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763567 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763594 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763675 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763697 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfzc\" 
(UniqueName: \"kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763735 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxzf\" (UniqueName: \"kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763770 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763822 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.763846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g249t\" (UniqueName: \"kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865575 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865610 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865669 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865692 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865760 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865797 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865836 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865899 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfzc\" (UniqueName: \"kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.865986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxzf\" (UniqueName: \"kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.867163 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.867467 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") device mount path \"/mnt/openstack/pv17\"" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.867635 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.867754 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.867875 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") device mount path \"/mnt/openstack/pv01\"" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.868125 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.868538 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.868864 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.868892 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.869902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.883000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxzf\" (UniqueName: \"kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.887302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.887880 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfzc\" (UniqueName: \"kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.895878 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-2\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 
00:19:20.967219 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967396 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") device mount path \"/mnt/openstack/pv10\"" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967436 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g249t\" (UniqueName: \"kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.967483 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.969742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.969826 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.974706 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.974831 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.994337 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g249t\" (UniqueName: \"kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.997585 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-1\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:20 crc kubenswrapper[5030]: I0121 00:19:20.999897 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:21 crc kubenswrapper[5030]: I0121 00:19:21.019001 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:21 crc kubenswrapper[5030]: I0121 00:19:21.037339 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:21 crc kubenswrapper[5030]: I0121 00:19:21.493005 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:19:21 crc kubenswrapper[5030]: I0121 00:19:21.597842 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:19:21 crc kubenswrapper[5030]: W0121 00:19:21.599475 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7186abfd_4ab6_445a_9c9b_9f512a483acc.slice/crio-8f4142684a5b0214c8f7f7403c804d3eefdef80000e85ae0df6c477e32ecdb34 WatchSource:0}: Error finding container 8f4142684a5b0214c8f7f7403c804d3eefdef80000e85ae0df6c477e32ecdb34: Status 404 returned error can't find the container with id 8f4142684a5b0214c8f7f7403c804d3eefdef80000e85ae0df6c477e32ecdb34 Jan 21 00:19:21 crc kubenswrapper[5030]: W0121 00:19:21.606045 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f30d64_1c48_455e_be8e_ad27d8c2ea05.slice/crio-8242797e49bfdf22fb24bcd096eaa7c9df3a668ab4390f0cd4ef952551908301 WatchSource:0}: Error finding container 8242797e49bfdf22fb24bcd096eaa7c9df3a668ab4390f0cd4ef952551908301: Status 404 returned error can't find the container with id 8242797e49bfdf22fb24bcd096eaa7c9df3a668ab4390f0cd4ef952551908301 Jan 21 00:19:21 crc kubenswrapper[5030]: I0121 00:19:21.609508 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.505084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerStarted","Data":"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5"} Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.505456 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerStarted","Data":"c3e712fc7ab1f93f738be4b9b775ffaf1841f32e1e9f3a80b8bc82aa63fcd39d"} Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.509464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerStarted","Data":"92b1c0a0bc749e7799c8de35103ad87a58265f3d9420d8a702a0d0f314bccc4f"} Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.509525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerStarted","Data":"8f4142684a5b0214c8f7f7403c804d3eefdef80000e85ae0df6c477e32ecdb34"} Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.511920 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerStarted","Data":"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71"} Jan 21 00:19:22 crc kubenswrapper[5030]: I0121 00:19:22.511963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerStarted","Data":"8242797e49bfdf22fb24bcd096eaa7c9df3a668ab4390f0cd4ef952551908301"} Jan 21 
00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.534018 5030 generic.go:334] "Generic (PLEG): container finished" podID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerID="7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71" exitCode=0 Jan 21 00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.534112 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerDied","Data":"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71"} Jan 21 00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.537242 5030 generic.go:334] "Generic (PLEG): container finished" podID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerID="05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5" exitCode=0 Jan 21 00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.537314 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerDied","Data":"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5"} Jan 21 00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.544081 5030 generic.go:334] "Generic (PLEG): container finished" podID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerID="92b1c0a0bc749e7799c8de35103ad87a58265f3d9420d8a702a0d0f314bccc4f" exitCode=0 Jan 21 00:19:25 crc kubenswrapper[5030]: I0121 00:19:25.544150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerDied","Data":"92b1c0a0bc749e7799c8de35103ad87a58265f3d9420d8a702a0d0f314bccc4f"} Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.552608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerStarted","Data":"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5"} Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.554659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerStarted","Data":"da836f0d9f3388acf4da9363a2b325ea0e940ccd63797e39c38b07bb56277b34"} Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.556582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerStarted","Data":"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27"} Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.585143 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-0" podStartSLOduration=7.585122507 podStartE2EDuration="7.585122507s" podCreationTimestamp="2026-01-21 00:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:19:26.579753599 +0000 UTC m=+6238.900013887" watchObservedRunningTime="2026-01-21 00:19:26.585122507 +0000 UTC m=+6238.905382795" Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.602592 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-1" podStartSLOduration=7.6025732139999995 podStartE2EDuration="7.602573214s" podCreationTimestamp="2026-01-21 00:19:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:19:26.596303784 +0000 UTC m=+6238.916564092" watchObservedRunningTime="2026-01-21 00:19:26.602573214 +0000 UTC m=+6238.922833502" Jan 21 00:19:26 crc kubenswrapper[5030]: I0121 00:19:26.625839 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/openstack-galera-2" podStartSLOduration=7.62581948 podStartE2EDuration="7.62581948s" podCreationTimestamp="2026-01-21 00:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:19:26.621006614 +0000 UTC m=+6238.941266912" watchObservedRunningTime="2026-01-21 00:19:26.62581948 +0000 UTC m=+6238.946079768" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.501303 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.502923 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.508171 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rhlkq" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.511583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.514014 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.588043 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.588110 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.588150 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgtql\" (UniqueName: \"kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.689871 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: 
\"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.689953 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.689987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgtql\" (UniqueName: \"kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.696380 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.696411 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.707366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgtql\" (UniqueName: \"kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql\") pod \"infra-operator-controller-manager-795f7dd4bc-p4mz4\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:29 crc kubenswrapper[5030]: I0121 00:19:29.822390 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:30 crc kubenswrapper[5030]: I0121 00:19:30.266916 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:19:30 crc kubenswrapper[5030]: I0121 00:19:30.582910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" event={"ID":"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2","Type":"ContainerStarted","Data":"2a1558083c2d85111a58cc8552a6cba0038418f1fec0b2d6a713050490fa5c00"} Jan 21 00:19:30 crc kubenswrapper[5030]: I0121 00:19:30.634055 5030 scope.go:117] "RemoveContainer" containerID="c68321cd97426dedda3236234ce414ff9647646f6d40a29e549a00832aba34f9" Jan 21 00:19:30 crc kubenswrapper[5030]: I0121 00:19:30.654890 5030 scope.go:117] "RemoveContainer" containerID="dae39f7e3837254aa58a115b0e9f14ef10e199f7f4924deb8c74aa6723b1bb05" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.000808 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.000857 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.019944 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.019986 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.038528 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:31 crc kubenswrapper[5030]: I0121 00:19:31.038576 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:31 crc kubenswrapper[5030]: E0121 00:19:31.690791 5030 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.9:45436->38.102.83.9:37955: write tcp 38.102.83.9:45436->38.102.83.9:37955: write: connection reset by peer Jan 21 00:19:33 crc kubenswrapper[5030]: I0121 00:19:33.621930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" event={"ID":"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2","Type":"ContainerStarted","Data":"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579"} Jan 21 00:19:33 crc kubenswrapper[5030]: I0121 00:19:33.622687 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:33 crc kubenswrapper[5030]: I0121 00:19:33.644855 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" podStartSLOduration=2.473246433 podStartE2EDuration="4.644831804s" podCreationTimestamp="2026-01-21 00:19:29 +0000 UTC" firstStartedPulling="2026-01-21 00:19:30.279159129 +0000 UTC m=+6242.599419417" lastFinishedPulling="2026-01-21 00:19:32.4507445 +0000 UTC m=+6244.771004788" observedRunningTime="2026-01-21 00:19:33.638161045 +0000 UTC m=+6245.958421343" watchObservedRunningTime="2026-01-21 00:19:33.644831804 +0000 UTC 
m=+6245.965092092" Jan 21 00:19:35 crc kubenswrapper[5030]: I0121 00:19:35.712003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:35 crc kubenswrapper[5030]: I0121 00:19:35.782182 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.785387 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-6mq5v"] Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.786589 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.790086 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.802466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-6mq5v"] Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.831936 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.943123 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcblj\" (UniqueName: \"kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:39 crc kubenswrapper[5030]: I0121 00:19:39.943460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.044898 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.045055 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcblj\" (UniqueName: \"kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.046436 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.071370 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kcblj\" (UniqueName: \"kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj\") pod \"root-account-create-update-6mq5v\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.104166 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.157591 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.158002 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.158062 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.158833 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.158891 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b" gracePeriod=600 Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.412425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-6mq5v"] Jan 21 00:19:40 crc kubenswrapper[5030]: W0121 00:19:40.422144 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d2205e_2da2_4ea7_9959_1e41b88525a7.slice/crio-1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b WatchSource:0}: Error finding container 1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b: Status 404 returned error can't find the container with id 1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.689172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" event={"ID":"95d2205e-2da2-4ea7-9959-1e41b88525a7","Type":"ContainerStarted","Data":"e0f6a4620408e63ad7436bb6d266362aeeab1b7f6b6008ec905800bf26491915"} Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.689487 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" 
event={"ID":"95d2205e-2da2-4ea7-9959-1e41b88525a7","Type":"ContainerStarted","Data":"1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b"} Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.691776 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b" exitCode=0 Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.691929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b"} Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.691986 5030 scope.go:117] "RemoveContainer" containerID="83a87d37dc7be75c7d1082ad21c764c2150aac8866a45584d3ca8a34c5c71bba" Jan 21 00:19:40 crc kubenswrapper[5030]: I0121 00:19:40.709329 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" podStartSLOduration=1.709298236 podStartE2EDuration="1.709298236s" podCreationTimestamp="2026-01-21 00:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:19:40.708195759 +0000 UTC m=+6253.028456047" watchObservedRunningTime="2026-01-21 00:19:40.709298236 +0000 UTC m=+6253.029558524" Jan 21 00:19:41 crc kubenswrapper[5030]: I0121 00:19:41.099749 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="barbican-kuttl-tests/openstack-galera-2" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="galera" probeResult="failure" output=< Jan 21 00:19:41 crc kubenswrapper[5030]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 00:19:41 crc kubenswrapper[5030]: > Jan 21 00:19:41 crc kubenswrapper[5030]: I0121 00:19:41.701895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575"} Jan 21 00:19:43 crc kubenswrapper[5030]: I0121 00:19:43.718127 5030 generic.go:334] "Generic (PLEG): container finished" podID="95d2205e-2da2-4ea7-9959-1e41b88525a7" containerID="e0f6a4620408e63ad7436bb6d266362aeeab1b7f6b6008ec905800bf26491915" exitCode=0 Jan 21 00:19:43 crc kubenswrapper[5030]: I0121 00:19:43.718201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" event={"ID":"95d2205e-2da2-4ea7-9959-1e41b88525a7","Type":"ContainerDied","Data":"e0f6a4620408e63ad7436bb6d266362aeeab1b7f6b6008ec905800bf26491915"} Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.035708 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.132425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts\") pod \"95d2205e-2da2-4ea7-9959-1e41b88525a7\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.132511 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcblj\" (UniqueName: \"kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj\") pod \"95d2205e-2da2-4ea7-9959-1e41b88525a7\" (UID: \"95d2205e-2da2-4ea7-9959-1e41b88525a7\") " Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.133293 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95d2205e-2da2-4ea7-9959-1e41b88525a7" (UID: "95d2205e-2da2-4ea7-9959-1e41b88525a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.137985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj" (OuterVolumeSpecName: "kube-api-access-kcblj") pod "95d2205e-2da2-4ea7-9959-1e41b88525a7" (UID: "95d2205e-2da2-4ea7-9959-1e41b88525a7"). InnerVolumeSpecName "kube-api-access-kcblj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.233658 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d2205e-2da2-4ea7-9959-1e41b88525a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.233697 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcblj\" (UniqueName: \"kubernetes.io/projected/95d2205e-2da2-4ea7-9959-1e41b88525a7-kube-api-access-kcblj\") on node \"crc\" DevicePath \"\"" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.250108 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:19:45 crc kubenswrapper[5030]: E0121 00:19:45.250361 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d2205e-2da2-4ea7-9959-1e41b88525a7" containerName="mariadb-account-create-update" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.250374 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d2205e-2da2-4ea7-9959-1e41b88525a7" containerName="mariadb-account-create-update" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.250521 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d2205e-2da2-4ea7-9959-1e41b88525a7" containerName="mariadb-account-create-update" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.250976 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.253558 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-wxdbz" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.259962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.436800 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzgn\" (UniqueName: \"kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn\") pod \"rabbitmq-cluster-operator-index-5qf86\" (UID: \"d11ee764-dc27-451e-a58c-ab22788647f5\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.537810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzgn\" (UniqueName: \"kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn\") pod \"rabbitmq-cluster-operator-index-5qf86\" (UID: \"d11ee764-dc27-451e-a58c-ab22788647f5\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.554087 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzgn\" (UniqueName: \"kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn\") pod \"rabbitmq-cluster-operator-index-5qf86\" (UID: \"d11ee764-dc27-451e-a58c-ab22788647f5\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.620430 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.757172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" event={"ID":"95d2205e-2da2-4ea7-9959-1e41b88525a7","Type":"ContainerDied","Data":"1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b"} Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.757451 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea17871f9ad410ddba859b57c012a707ee9c5994fdaa3d5bf2a1e62f4dcc54b" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.757230 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-6mq5v" Jan 21 00:19:45 crc kubenswrapper[5030]: I0121 00:19:45.950989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:19:46 crc kubenswrapper[5030]: I0121 00:19:46.768240 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" event={"ID":"d11ee764-dc27-451e-a58c-ab22788647f5","Type":"ContainerStarted","Data":"ad57e99a6924fb06c555bdfc76ab06e401a2a814479fed13baedb3af4f540798"} Jan 21 00:19:47 crc kubenswrapper[5030]: I0121 00:19:47.133894 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:47 crc kubenswrapper[5030]: I0121 00:19:47.353642 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:19:50 crc kubenswrapper[5030]: I0121 00:19:50.808420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" event={"ID":"d11ee764-dc27-451e-a58c-ab22788647f5","Type":"ContainerStarted","Data":"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04"} Jan 21 00:19:50 crc kubenswrapper[5030]: I0121 00:19:50.823596 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" podStartSLOduration=1.5358913379999999 podStartE2EDuration="5.823579137s" podCreationTimestamp="2026-01-21 00:19:45 +0000 UTC" firstStartedPulling="2026-01-21 00:19:45.98860434 +0000 UTC m=+6258.308864628" lastFinishedPulling="2026-01-21 00:19:50.276292139 +0000 UTC m=+6262.596552427" observedRunningTime="2026-01-21 00:19:50.821023405 +0000 UTC m=+6263.141283693" watchObservedRunningTime="2026-01-21 00:19:50.823579137 +0000 UTC m=+6263.143839425" Jan 21 00:19:51 crc kubenswrapper[5030]: I0121 00:19:51.318015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:51 crc kubenswrapper[5030]: I0121 00:19:51.380248 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:19:55 crc kubenswrapper[5030]: I0121 00:19:55.621329 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:55 crc kubenswrapper[5030]: I0121 00:19:55.621936 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:55 crc kubenswrapper[5030]: I0121 00:19:55.654536 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:55 crc kubenswrapper[5030]: I0121 00:19:55.893451 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.686753 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2"] Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.688403 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.690474 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.697982 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2"] Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.839212 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.839553 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5lb\" (UniqueName: \"kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.839591 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.941080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.941177 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5lb\" (UniqueName: \"kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.941224 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.941848 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.942072 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:58 crc kubenswrapper[5030]: I0121 00:19:58.969647 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5lb\" (UniqueName: \"kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:59 crc kubenswrapper[5030]: I0121 00:19:59.031070 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:19:59 crc kubenswrapper[5030]: I0121 00:19:59.459103 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2"] Jan 21 00:19:59 crc kubenswrapper[5030]: I0121 00:19:59.881718 5030 generic.go:334] "Generic (PLEG): container finished" podID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerID="8a7188ca92d1fdbf5c23abfd3bd0b629ba4d14649d1755bebc4d171a62b91cdf" exitCode=0 Jan 21 00:19:59 crc kubenswrapper[5030]: I0121 00:19:59.883060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" event={"ID":"09177594-8d5e-4b0e-a063-5f97d4d75df7","Type":"ContainerDied","Data":"8a7188ca92d1fdbf5c23abfd3bd0b629ba4d14649d1755bebc4d171a62b91cdf"} Jan 21 00:19:59 crc kubenswrapper[5030]: I0121 00:19:59.883088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" event={"ID":"09177594-8d5e-4b0e-a063-5f97d4d75df7","Type":"ContainerStarted","Data":"6c7757e89d7e707e8d61c9bb28f9f820ac31b819d98686bd0ada7aecbe2e61cf"} Jan 21 00:20:01 crc kubenswrapper[5030]: I0121 00:20:01.901173 5030 generic.go:334] "Generic (PLEG): container finished" podID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerID="54d48e34c2f889854f1855038bf62bf09eabd8fc4b0ea020e2d64be9a5d1322f" exitCode=0 Jan 21 00:20:01 crc kubenswrapper[5030]: I0121 00:20:01.901250 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" event={"ID":"09177594-8d5e-4b0e-a063-5f97d4d75df7","Type":"ContainerDied","Data":"54d48e34c2f889854f1855038bf62bf09eabd8fc4b0ea020e2d64be9a5d1322f"} Jan 21 00:20:02 crc kubenswrapper[5030]: I0121 00:20:02.912287 5030 generic.go:334] "Generic (PLEG): container finished" podID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerID="8218d5aaf397f53ba3fe22face6fb221c680e54ba03e3666cf2f887e43d2980d" exitCode=0 Jan 21 00:20:02 crc kubenswrapper[5030]: I0121 00:20:02.912359 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" event={"ID":"09177594-8d5e-4b0e-a063-5f97d4d75df7","Type":"ContainerDied","Data":"8218d5aaf397f53ba3fe22face6fb221c680e54ba03e3666cf2f887e43d2980d"} Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.235015 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.323050 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle\") pod \"09177594-8d5e-4b0e-a063-5f97d4d75df7\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.323197 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp5lb\" (UniqueName: \"kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb\") pod \"09177594-8d5e-4b0e-a063-5f97d4d75df7\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.323261 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util\") pod \"09177594-8d5e-4b0e-a063-5f97d4d75df7\" (UID: \"09177594-8d5e-4b0e-a063-5f97d4d75df7\") " Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.324007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle" (OuterVolumeSpecName: "bundle") pod "09177594-8d5e-4b0e-a063-5f97d4d75df7" (UID: "09177594-8d5e-4b0e-a063-5f97d4d75df7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.327836 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb" (OuterVolumeSpecName: "kube-api-access-wp5lb") pod "09177594-8d5e-4b0e-a063-5f97d4d75df7" (UID: "09177594-8d5e-4b0e-a063-5f97d4d75df7"). InnerVolumeSpecName "kube-api-access-wp5lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.339213 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util" (OuterVolumeSpecName: "util") pod "09177594-8d5e-4b0e-a063-5f97d4d75df7" (UID: "09177594-8d5e-4b0e-a063-5f97d4d75df7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.421129 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:20:04 crc kubenswrapper[5030]: E0121 00:20:04.421396 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="pull" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.421412 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="pull" Jan 21 00:20:04 crc kubenswrapper[5030]: E0121 00:20:04.421434 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="extract" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.421442 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="extract" Jan 21 00:20:04 crc kubenswrapper[5030]: E0121 00:20:04.421453 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="util" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.421459 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="util" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.421600 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" containerName="extract" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.422060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.424570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp5lb\" (UniqueName: \"kubernetes.io/projected/09177594-8d5e-4b0e-a063-5f97d4d75df7-kube-api-access-wp5lb\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.424613 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.424646 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09177594-8d5e-4b0e-a063-5f97d4d75df7-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.424686 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"memcached-memcached-dockercfg-ll7mh" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.425603 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"memcached-config-data" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.433466 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.525888 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.525949 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.525980 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkq5\" (UniqueName: \"kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.627910 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.628176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.628825 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkq5\" (UniqueName: \"kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.628780 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.628772 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.656991 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkq5\" (UniqueName: \"kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5\") pod \"memcached-0\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.736205 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.931810 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" event={"ID":"09177594-8d5e-4b0e-a063-5f97d4d75df7","Type":"ContainerDied","Data":"6c7757e89d7e707e8d61c9bb28f9f820ac31b819d98686bd0ada7aecbe2e61cf"} Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.932051 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7757e89d7e707e8d61c9bb28f9f820ac31b819d98686bd0ada7aecbe2e61cf" Jan 21 00:20:04 crc kubenswrapper[5030]: I0121 00:20:04.932068 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2" Jan 21 00:20:05 crc kubenswrapper[5030]: I0121 00:20:05.142354 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:20:05 crc kubenswrapper[5030]: W0121 00:20:05.148612 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58fca83f_7678_4c7a_bff7_5830b38c788f.slice/crio-359218c9dc9840c54b6e60061a1c3df73e7bebd8580ca911b6c349e069148fe5 WatchSource:0}: Error finding container 359218c9dc9840c54b6e60061a1c3df73e7bebd8580ca911b6c349e069148fe5: Status 404 returned error can't find the container with id 359218c9dc9840c54b6e60061a1c3df73e7bebd8580ca911b6c349e069148fe5 Jan 21 00:20:05 crc kubenswrapper[5030]: I0121 00:20:05.939925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"58fca83f-7678-4c7a-bff7-5830b38c788f","Type":"ContainerStarted","Data":"128532b232d14abd012ae5553e81841d7ca0434be38a7c8565cae820e60e4e7e"} Jan 21 00:20:05 crc kubenswrapper[5030]: I0121 00:20:05.940320 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"58fca83f-7678-4c7a-bff7-5830b38c788f","Type":"ContainerStarted","Data":"359218c9dc9840c54b6e60061a1c3df73e7bebd8580ca911b6c349e069148fe5"} Jan 21 00:20:05 crc kubenswrapper[5030]: I0121 00:20:05.940350 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:05 crc kubenswrapper[5030]: I0121 00:20:05.974729 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/memcached-0" podStartSLOduration=1.974701576 podStartE2EDuration="1.974701576s" podCreationTimestamp="2026-01-21 00:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:20:05.972884903 +0000 UTC m=+6278.293145201" watchObservedRunningTime="2026-01-21 00:20:05.974701576 +0000 UTC m=+6278.294961904" Jan 21 00:20:14 crc kubenswrapper[5030]: I0121 00:20:14.737779 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:20:20 crc kubenswrapper[5030]: I0121 00:20:20.871191 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:20:20 crc kubenswrapper[5030]: I0121 00:20:20.873428 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:20:20 crc kubenswrapper[5030]: I0121 00:20:20.877659 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-gcnbk" Jan 21 00:20:20 crc kubenswrapper[5030]: I0121 00:20:20.888926 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:20:20 crc kubenswrapper[5030]: I0121 00:20:20.976226 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8lf\" (UniqueName: \"kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf\") pod \"rabbitmq-cluster-operator-779fc9694b-hx8sj\" (UID: \"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:20:21 crc kubenswrapper[5030]: I0121 00:20:21.077870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8lf\" (UniqueName: \"kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf\") pod \"rabbitmq-cluster-operator-779fc9694b-hx8sj\" (UID: \"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:20:21 crc kubenswrapper[5030]: I0121 00:20:21.097586 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8lf\" (UniqueName: \"kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf\") pod \"rabbitmq-cluster-operator-779fc9694b-hx8sj\" (UID: \"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:20:21 crc kubenswrapper[5030]: I0121 00:20:21.195874 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:20:21 crc kubenswrapper[5030]: I0121 00:20:21.610107 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:20:21 crc kubenswrapper[5030]: W0121 00:20:21.614919 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c6c096a_7e9f_4a9e_8145_23ceb5dc0b1e.slice/crio-53b4fc04149be862d61104a47469be731be019b7abae3edbbb9a0664b98290b8 WatchSource:0}: Error finding container 53b4fc04149be862d61104a47469be731be019b7abae3edbbb9a0664b98290b8: Status 404 returned error can't find the container with id 53b4fc04149be862d61104a47469be731be019b7abae3edbbb9a0664b98290b8 Jan 21 00:20:22 crc kubenswrapper[5030]: I0121 00:20:22.052221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" event={"ID":"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e","Type":"ContainerStarted","Data":"53b4fc04149be862d61104a47469be731be019b7abae3edbbb9a0664b98290b8"} Jan 21 00:20:23 crc kubenswrapper[5030]: I0121 00:20:23.060501 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" event={"ID":"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e","Type":"ContainerStarted","Data":"1260533e34f0b0f41dc1b59c06bb0c914bc20a24a56808fef2d4f41df1999dd7"} Jan 21 00:20:23 crc kubenswrapper[5030]: I0121 00:20:23.078102 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" podStartSLOduration=3.078079326 podStartE2EDuration="3.078079326s" podCreationTimestamp="2026-01-21 00:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:20:23.074680065 +0000 UTC m=+6295.394940343" watchObservedRunningTime="2026-01-21 00:20:23.078079326 +0000 UTC m=+6295.398339624" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.054457 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.055819 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.061989 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-j8xc2" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.067381 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.112756 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n84h\" (UniqueName: \"kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h\") pod \"keystone-operator-index-9zm4t\" (UID: \"e1e6d5b8-1b67-4435-85d1-6bd44aecb164\") " pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.214088 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n84h\" (UniqueName: \"kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h\") pod \"keystone-operator-index-9zm4t\" (UID: \"e1e6d5b8-1b67-4435-85d1-6bd44aecb164\") " pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.239238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n84h\" (UniqueName: \"kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h\") pod \"keystone-operator-index-9zm4t\" (UID: \"e1e6d5b8-1b67-4435-85d1-6bd44aecb164\") " pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.438474 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.711181 5030 scope.go:117] "RemoveContainer" containerID="c85362660680b33fba94dd833f9713999ae13302dd327217622edfaab4a3d463" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.736961 5030 scope.go:117] "RemoveContainer" containerID="ab23fc188f9006404bc34c587b59f313b0487a9749aefd311c320d4905613bf2" Jan 21 00:20:30 crc kubenswrapper[5030]: I0121 00:20:30.857204 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:30 crc kubenswrapper[5030]: W0121 00:20:30.860918 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1e6d5b8_1b67_4435_85d1_6bd44aecb164.slice/crio-f27feb8e5f7ab26dd6b4ddbbe3bb339dc29fa7b952835d357d9a4b06e8ece0e3 WatchSource:0}: Error finding container f27feb8e5f7ab26dd6b4ddbbe3bb339dc29fa7b952835d357d9a4b06e8ece0e3: Status 404 returned error can't find the container with id f27feb8e5f7ab26dd6b4ddbbe3bb339dc29fa7b952835d357d9a4b06e8ece0e3 Jan 21 00:20:31 crc kubenswrapper[5030]: I0121 00:20:31.133261 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-9zm4t" event={"ID":"e1e6d5b8-1b67-4435-85d1-6bd44aecb164","Type":"ContainerStarted","Data":"f27feb8e5f7ab26dd6b4ddbbe3bb339dc29fa7b952835d357d9a4b06e8ece0e3"} Jan 21 00:20:32 crc kubenswrapper[5030]: I0121 00:20:32.142550 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-9zm4t" event={"ID":"e1e6d5b8-1b67-4435-85d1-6bd44aecb164","Type":"ContainerStarted","Data":"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c"} Jan 21 00:20:32 crc kubenswrapper[5030]: I0121 00:20:32.165024 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-9zm4t" podStartSLOduration=1.2641684180000001 podStartE2EDuration="2.165007553s" podCreationTimestamp="2026-01-21 00:20:30 +0000 UTC" firstStartedPulling="2026-01-21 00:20:30.862815757 +0000 UTC m=+6303.183076045" lastFinishedPulling="2026-01-21 00:20:31.763654882 +0000 UTC m=+6304.083915180" observedRunningTime="2026-01-21 00:20:32.159914631 +0000 UTC m=+6304.480174919" watchObservedRunningTime="2026-01-21 00:20:32.165007553 +0000 UTC m=+6304.485267831" Jan 21 00:20:34 crc kubenswrapper[5030]: I0121 00:20:34.448517 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:34 crc kubenswrapper[5030]: I0121 00:20:34.448763 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-9zm4t" podUID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" containerName="registry-server" containerID="cri-o://e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c" gracePeriod=2 Jan 21 00:20:34 crc kubenswrapper[5030]: I0121 00:20:34.864837 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:34 crc kubenswrapper[5030]: I0121 00:20:34.988574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n84h\" (UniqueName: \"kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h\") pod \"e1e6d5b8-1b67-4435-85d1-6bd44aecb164\" (UID: \"e1e6d5b8-1b67-4435-85d1-6bd44aecb164\") " Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.001837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h" (OuterVolumeSpecName: "kube-api-access-8n84h") pod "e1e6d5b8-1b67-4435-85d1-6bd44aecb164" (UID: "e1e6d5b8-1b67-4435-85d1-6bd44aecb164"). InnerVolumeSpecName "kube-api-access-8n84h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.051542 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:20:35 crc kubenswrapper[5030]: E0121 00:20:35.051926 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" containerName="registry-server" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.051943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" containerName="registry-server" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.052103 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" containerName="registry-server" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.052703 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.068960 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.091366 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n84h\" (UniqueName: \"kubernetes.io/projected/e1e6d5b8-1b67-4435-85d1-6bd44aecb164-kube-api-access-8n84h\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.162868 5030 generic.go:334] "Generic (PLEG): container finished" podID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" containerID="e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c" exitCode=0 Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.162921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-9zm4t" event={"ID":"e1e6d5b8-1b67-4435-85d1-6bd44aecb164","Type":"ContainerDied","Data":"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c"} Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.162947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-9zm4t" event={"ID":"e1e6d5b8-1b67-4435-85d1-6bd44aecb164","Type":"ContainerDied","Data":"f27feb8e5f7ab26dd6b4ddbbe3bb339dc29fa7b952835d357d9a4b06e8ece0e3"} Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.162964 5030 scope.go:117] "RemoveContainer" containerID="e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.163070 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-9zm4t" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.187646 5030 scope.go:117] "RemoveContainer" containerID="e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c" Jan 21 00:20:35 crc kubenswrapper[5030]: E0121 00:20:35.195605 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c\": container with ID starting with e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c not found: ID does not exist" containerID="e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.195687 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c"} err="failed to get container status \"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c\": rpc error: code = NotFound desc = could not find container \"e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c\": container with ID starting with e7003500e5f4bd1c5c8e31580c783d11dd555d5045eb1e7617feb80d5b44425c not found: ID does not exist" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.197753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxq4\" (UniqueName: \"kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4\") pod \"keystone-operator-index-jp4q5\" (UID: \"915a4368-1348-488c-a968-bec3d1b87d23\") " pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.211423 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.218479 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-9zm4t"] Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.298961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxq4\" (UniqueName: \"kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4\") pod \"keystone-operator-index-jp4q5\" (UID: \"915a4368-1348-488c-a968-bec3d1b87d23\") " pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.315927 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxq4\" (UniqueName: \"kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4\") pod \"keystone-operator-index-jp4q5\" (UID: \"915a4368-1348-488c-a968-bec3d1b87d23\") " pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.377540 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.820855 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:20:35 crc kubenswrapper[5030]: I0121 00:20:35.972881 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1e6d5b8-1b67-4435-85d1-6bd44aecb164" path="/var/lib/kubelet/pods/e1e6d5b8-1b67-4435-85d1-6bd44aecb164/volumes" Jan 21 00:20:36 crc kubenswrapper[5030]: I0121 00:20:36.172485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jp4q5" event={"ID":"915a4368-1348-488c-a968-bec3d1b87d23","Type":"ContainerStarted","Data":"a0cc466ca2d21bb456115cb0b4df6bb46edd0941ed80b8bec062e701129d458c"} Jan 21 00:20:37 crc kubenswrapper[5030]: I0121 00:20:37.181573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jp4q5" event={"ID":"915a4368-1348-488c-a968-bec3d1b87d23","Type":"ContainerStarted","Data":"9d23c2a86c88a78de8b357da28b04ae29725c56e0216ad5bee6e2867c610abb7"} Jan 21 00:20:37 crc kubenswrapper[5030]: I0121 00:20:37.198317 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-jp4q5" podStartSLOduration=1.734044205 podStartE2EDuration="2.198296328s" podCreationTimestamp="2026-01-21 00:20:35 +0000 UTC" firstStartedPulling="2026-01-21 00:20:35.825906384 +0000 UTC m=+6308.146166672" lastFinishedPulling="2026-01-21 00:20:36.290158507 +0000 UTC m=+6308.610418795" observedRunningTime="2026-01-21 00:20:37.193474692 +0000 UTC m=+6309.513734980" watchObservedRunningTime="2026-01-21 00:20:37.198296328 +0000 UTC m=+6309.518556646" Jan 21 00:20:45 crc kubenswrapper[5030]: I0121 00:20:45.379330 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:45 crc kubenswrapper[5030]: I0121 00:20:45.380016 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:45 crc kubenswrapper[5030]: I0121 00:20:45.419403 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:46 crc kubenswrapper[5030]: I0121 00:20:46.333342 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.492185 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7"] Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.495731 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.500103 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.505316 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7"] Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.586027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.586090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-284g5\" (UniqueName: \"kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.586112 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.687244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.687354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-284g5\" (UniqueName: \"kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.687392 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.687871 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.687977 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.712475 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-284g5\" (UniqueName: \"kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:47 crc kubenswrapper[5030]: I0121 00:20:47.854124 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:48 crc kubenswrapper[5030]: I0121 00:20:48.274777 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7"] Jan 21 00:20:49 crc kubenswrapper[5030]: I0121 00:20:49.286569 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2def119-ddef-4378-84d0-61650bf36ad5" containerID="0476f5ecbb75f96d7a6a54d3a60ae225d4226ce19053893c8d06c97357fa2a58" exitCode=0 Jan 21 00:20:49 crc kubenswrapper[5030]: I0121 00:20:49.286610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" event={"ID":"b2def119-ddef-4378-84d0-61650bf36ad5","Type":"ContainerDied","Data":"0476f5ecbb75f96d7a6a54d3a60ae225d4226ce19053893c8d06c97357fa2a58"} Jan 21 00:20:49 crc kubenswrapper[5030]: I0121 00:20:49.286647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" event={"ID":"b2def119-ddef-4378-84d0-61650bf36ad5","Type":"ContainerStarted","Data":"41f10c5129c0534e5d6daf2a2184ac12130d3949b10be992be72a993b28da1aa"} Jan 21 00:20:51 crc kubenswrapper[5030]: I0121 00:20:51.313390 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2def119-ddef-4378-84d0-61650bf36ad5" containerID="43f6d68ddf3d8f9396197db2bb3d0f4b0486309dc12bd8012350f620bd080bd2" exitCode=0 Jan 21 00:20:51 crc kubenswrapper[5030]: I0121 00:20:51.313493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" event={"ID":"b2def119-ddef-4378-84d0-61650bf36ad5","Type":"ContainerDied","Data":"43f6d68ddf3d8f9396197db2bb3d0f4b0486309dc12bd8012350f620bd080bd2"} Jan 21 00:20:52 crc kubenswrapper[5030]: I0121 00:20:52.322809 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2def119-ddef-4378-84d0-61650bf36ad5" containerID="b5d8dcfea8a041c375b7f1b749671da811924b9bbdc38e69112881bb9c534734" exitCode=0 Jan 21 00:20:52 crc kubenswrapper[5030]: I0121 00:20:52.322852 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" event={"ID":"b2def119-ddef-4378-84d0-61650bf36ad5","Type":"ContainerDied","Data":"b5d8dcfea8a041c375b7f1b749671da811924b9bbdc38e69112881bb9c534734"} Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.625380 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.783257 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-284g5\" (UniqueName: \"kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5\") pod \"b2def119-ddef-4378-84d0-61650bf36ad5\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.783315 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle\") pod \"b2def119-ddef-4378-84d0-61650bf36ad5\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.783347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util\") pod \"b2def119-ddef-4378-84d0-61650bf36ad5\" (UID: \"b2def119-ddef-4378-84d0-61650bf36ad5\") " Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.784405 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle" (OuterVolumeSpecName: "bundle") pod "b2def119-ddef-4378-84d0-61650bf36ad5" (UID: "b2def119-ddef-4378-84d0-61650bf36ad5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.789411 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5" (OuterVolumeSpecName: "kube-api-access-284g5") pod "b2def119-ddef-4378-84d0-61650bf36ad5" (UID: "b2def119-ddef-4378-84d0-61650bf36ad5"). InnerVolumeSpecName "kube-api-access-284g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.797807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util" (OuterVolumeSpecName: "util") pod "b2def119-ddef-4378-84d0-61650bf36ad5" (UID: "b2def119-ddef-4378-84d0-61650bf36ad5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.884400 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-284g5\" (UniqueName: \"kubernetes.io/projected/b2def119-ddef-4378-84d0-61650bf36ad5-kube-api-access-284g5\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.884426 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:53 crc kubenswrapper[5030]: I0121 00:20:53.884436 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2def119-ddef-4378-84d0-61650bf36ad5-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:20:54 crc kubenswrapper[5030]: I0121 00:20:54.340273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" event={"ID":"b2def119-ddef-4378-84d0-61650bf36ad5","Type":"ContainerDied","Data":"41f10c5129c0534e5d6daf2a2184ac12130d3949b10be992be72a993b28da1aa"} Jan 21 00:20:54 crc kubenswrapper[5030]: I0121 00:20:54.340314 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f10c5129c0534e5d6daf2a2184ac12130d3949b10be992be72a993b28da1aa" Jan 21 00:20:54 crc kubenswrapper[5030]: I0121 00:20:54.340357 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.309868 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:20:56 crc kubenswrapper[5030]: E0121 00:20:56.310492 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="extract" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.310508 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="extract" Jan 21 00:20:56 crc kubenswrapper[5030]: E0121 00:20:56.310524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="util" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.310532 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="util" Jan 21 00:20:56 crc kubenswrapper[5030]: E0121 00:20:56.310558 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="pull" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.310567 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="pull" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.310734 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" containerName="extract" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.311593 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.313506 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.313967 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-server-conf" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.314172 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-default-user" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.314493 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"rabbitmq-server-dockercfg-7hhpg" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.314818 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"barbican-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.328149 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.423818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.423921 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.423953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc47j\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.423978 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.424013 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.424092 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " 
pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.424124 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.424153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.526325 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.526396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.526503 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.527218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.527403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.527435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc47j\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.527464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" 
Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.527518 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.528113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.528214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.528514 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.532534 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.532769 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.536094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.538465 5030 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.538590 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9ae0b3ea352fd8ce279b5a5da12c8e7d28e3f940920c772bf4d690ee9da94bb/globalmount\"" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.544243 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc47j\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.572280 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") pod \"rabbitmq-server-0\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:56 crc kubenswrapper[5030]: I0121 00:20:56.629916 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:20:57 crc kubenswrapper[5030]: I0121 00:20:57.100264 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:20:57 crc kubenswrapper[5030]: I0121 00:20:57.378476 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerStarted","Data":"fdd2f81e22b36e48311d5ab0b6d90f032f67103d6a419ceae034e9dc8c5cef1e"} Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.175451 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.176995 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.179812 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9kzfd" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.180443 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.183041 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.243765 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j77m7\" (UniqueName: \"kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.243997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.244093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.346107 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j77m7\" (UniqueName: \"kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.346268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.346319 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.353270 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.359258 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.364575 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j77m7\" (UniqueName: \"kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7\") pod \"keystone-operator-controller-manager-5fcd4486c5-gqfs6\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:03 crc kubenswrapper[5030]: I0121 00:21:03.507742 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:09 crc kubenswrapper[5030]: I0121 00:21:09.892260 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:21:10 crc kubenswrapper[5030]: W0121 00:21:10.938417 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod134304ed_5bea_4841_aaa0_d7d7284fc67e.slice/crio-8851eead1903c6a0cde3092e2c9cbe1e1ecf1cb7b6390ef5a789981080b56fca WatchSource:0}: Error finding container 8851eead1903c6a0cde3092e2c9cbe1e1ecf1cb7b6390ef5a789981080b56fca: Status 404 returned error can't find the container with id 8851eead1903c6a0cde3092e2c9cbe1e1ecf1cb7b6390ef5a789981080b56fca Jan 21 00:21:11 crc kubenswrapper[5030]: I0121 00:21:11.510077 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" event={"ID":"134304ed-5bea-4841-aaa0-d7d7284fc67e","Type":"ContainerStarted","Data":"8851eead1903c6a0cde3092e2c9cbe1e1ecf1cb7b6390ef5a789981080b56fca"} Jan 21 00:21:11 crc kubenswrapper[5030]: E0121 00:21:11.574439 5030 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="rabbitmq:4.1.1-management" Jan 21 00:21:11 crc kubenswrapper[5030]: E0121 00:21:11.574525 5030 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="rabbitmq:4.1.1-management" Jan 21 00:21:11 crc kubenswrapper[5030]: E0121 00:21:11.574797 5030 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:rabbitmq:4.1.1-management,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > 
/var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zc47j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000750000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_barbican-kuttl-tests(07630df9-d9b5-4944-939a-6a284e5e1488): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 00:21:11 crc kubenswrapper[5030]: E0121 00:21:11.576073 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" Jan 21 00:21:12 crc kubenswrapper[5030]: E0121 00:21:12.521544 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"rabbitmq:4.1.1-management\\\"\"" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" Jan 21 00:21:15 crc kubenswrapper[5030]: I0121 00:21:15.540819 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" 
event={"ID":"134304ed-5bea-4841-aaa0-d7d7284fc67e","Type":"ContainerStarted","Data":"02a538876acf22f39c0a24a305c6ce365f4a2ec0d2a2beea93a36cfc0fe5890c"} Jan 21 00:21:15 crc kubenswrapper[5030]: I0121 00:21:15.541419 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:15 crc kubenswrapper[5030]: I0121 00:21:15.573531 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" podStartSLOduration=8.474065027 podStartE2EDuration="12.573510317s" podCreationTimestamp="2026-01-21 00:21:03 +0000 UTC" firstStartedPulling="2026-01-21 00:21:10.942585476 +0000 UTC m=+6343.262845764" lastFinishedPulling="2026-01-21 00:21:15.042030766 +0000 UTC m=+6347.362291054" observedRunningTime="2026-01-21 00:21:15.568889497 +0000 UTC m=+6347.889149785" watchObservedRunningTime="2026-01-21 00:21:15.573510317 +0000 UTC m=+6347.893770605" Jan 21 00:21:23 crc kubenswrapper[5030]: I0121 00:21:23.512933 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.077549 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.079969 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.082099 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-lw8k2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.086204 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.189996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhfr7\" (UniqueName: \"kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7\") pod \"barbican-operator-index-gc5n2\" (UID: \"def6e261-6609-4230-a535-baaba57c2757\") " pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.291158 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhfr7\" (UniqueName: \"kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7\") pod \"barbican-operator-index-gc5n2\" (UID: \"def6e261-6609-4230-a535-baaba57c2757\") " pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.311839 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhfr7\" (UniqueName: \"kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7\") pod \"barbican-operator-index-gc5n2\" (UID: \"def6e261-6609-4230-a535-baaba57c2757\") " pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.413675 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.703269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerStarted","Data":"9754df6db66703e2cca4cfda28e6dc5b5b74240ec8edd308b113c6623d6f191d"} Jan 21 00:21:29 crc kubenswrapper[5030]: I0121 00:21:29.717014 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:29 crc kubenswrapper[5030]: W0121 00:21:29.725934 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef6e261_6609_4230_a535_baaba57c2757.slice/crio-bd7a71cb5b545f55696fd4cc06be0076445fce53ac44ebb909efd0d32ca5fd09 WatchSource:0}: Error finding container bd7a71cb5b545f55696fd4cc06be0076445fce53ac44ebb909efd0d32ca5fd09: Status 404 returned error can't find the container with id bd7a71cb5b545f55696fd4cc06be0076445fce53ac44ebb909efd0d32ca5fd09 Jan 21 00:21:30 crc kubenswrapper[5030]: I0121 00:21:30.710951 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gc5n2" event={"ID":"def6e261-6609-4230-a535-baaba57c2757","Type":"ContainerStarted","Data":"bd7a71cb5b545f55696fd4cc06be0076445fce53ac44ebb909efd0d32ca5fd09"} Jan 21 00:21:30 crc kubenswrapper[5030]: I0121 00:21:30.808352 5030 scope.go:117] "RemoveContainer" containerID="86dfd28a16763e6122806fe4b880bd8e7aada9e1039f68fae429d50a23fc23ab" Jan 21 00:21:30 crc kubenswrapper[5030]: I0121 00:21:30.833922 5030 scope.go:117] "RemoveContainer" containerID="9ed66246b5a326d1c96d2d8f373e75242df9f08d33bc6fe16c1953efd4660339" Jan 21 00:21:30 crc kubenswrapper[5030]: I0121 00:21:30.862970 5030 scope.go:117] "RemoveContainer" containerID="6954db4fba719163fd4731ada0e3754c7a3aeea31be25d67085b1222c9ec7c8c" Jan 21 00:21:32 crc kubenswrapper[5030]: I0121 00:21:32.733540 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gc5n2" event={"ID":"def6e261-6609-4230-a535-baaba57c2757","Type":"ContainerStarted","Data":"fe2f67ee20e6395b1772b04c9fd66ad636dd62dbd48d1844aa7f2f4f259176be"} Jan 21 00:21:32 crc kubenswrapper[5030]: I0121 00:21:32.752321 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-gc5n2" podStartSLOduration=1.153740364 podStartE2EDuration="3.752297149s" podCreationTimestamp="2026-01-21 00:21:29 +0000 UTC" firstStartedPulling="2026-01-21 00:21:29.735308435 +0000 UTC m=+6362.055568723" lastFinishedPulling="2026-01-21 00:21:32.33386522 +0000 UTC m=+6364.654125508" observedRunningTime="2026-01-21 00:21:32.74900256 +0000 UTC m=+6365.069262868" watchObservedRunningTime="2026-01-21 00:21:32.752297149 +0000 UTC m=+6365.072557437" Jan 21 00:21:33 crc kubenswrapper[5030]: I0121 00:21:33.255083 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:33 crc kubenswrapper[5030]: I0121 00:21:33.857721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:21:33 crc kubenswrapper[5030]: I0121 00:21:33.859132 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:33 crc kubenswrapper[5030]: I0121 00:21:33.875181 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:21:33 crc kubenswrapper[5030]: I0121 00:21:33.973129 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plczk\" (UniqueName: \"kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk\") pod \"barbican-operator-index-gcvmv\" (UID: \"3756e766-1fab-47ec-aefc-8ad5e10f4fe4\") " pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.074855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plczk\" (UniqueName: \"kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk\") pod \"barbican-operator-index-gcvmv\" (UID: \"3756e766-1fab-47ec-aefc-8ad5e10f4fe4\") " pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.099365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plczk\" (UniqueName: \"kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk\") pod \"barbican-operator-index-gcvmv\" (UID: \"3756e766-1fab-47ec-aefc-8ad5e10f4fe4\") " pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.175086 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.605667 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:21:34 crc kubenswrapper[5030]: W0121 00:21:34.612428 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3756e766_1fab_47ec_aefc_8ad5e10f4fe4.slice/crio-5e2b5ac4963abb37b01709f5149f5669183664de8f887685a870804272bf0ea6 WatchSource:0}: Error finding container 5e2b5ac4963abb37b01709f5149f5669183664de8f887685a870804272bf0ea6: Status 404 returned error can't find the container with id 5e2b5ac4963abb37b01709f5149f5669183664de8f887685a870804272bf0ea6 Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.748929 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-gc5n2" podUID="def6e261-6609-4230-a535-baaba57c2757" containerName="registry-server" containerID="cri-o://fe2f67ee20e6395b1772b04c9fd66ad636dd62dbd48d1844aa7f2f4f259176be" gracePeriod=2 Jan 21 00:21:34 crc kubenswrapper[5030]: I0121 00:21:34.749207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gcvmv" event={"ID":"3756e766-1fab-47ec-aefc-8ad5e10f4fe4","Type":"ContainerStarted","Data":"5e2b5ac4963abb37b01709f5149f5669183664de8f887685a870804272bf0ea6"} Jan 21 00:21:35 crc kubenswrapper[5030]: I0121 00:21:35.758093 5030 generic.go:334] "Generic (PLEG): container finished" podID="def6e261-6609-4230-a535-baaba57c2757" containerID="fe2f67ee20e6395b1772b04c9fd66ad636dd62dbd48d1844aa7f2f4f259176be" exitCode=0 Jan 21 00:21:35 crc kubenswrapper[5030]: I0121 00:21:35.758171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gc5n2" 
event={"ID":"def6e261-6609-4230-a535-baaba57c2757","Type":"ContainerDied","Data":"fe2f67ee20e6395b1772b04c9fd66ad636dd62dbd48d1844aa7f2f4f259176be"} Jan 21 00:21:35 crc kubenswrapper[5030]: I0121 00:21:35.997203 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.107594 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhfr7\" (UniqueName: \"kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7\") pod \"def6e261-6609-4230-a535-baaba57c2757\" (UID: \"def6e261-6609-4230-a535-baaba57c2757\") " Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.115002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7" (OuterVolumeSpecName: "kube-api-access-xhfr7") pod "def6e261-6609-4230-a535-baaba57c2757" (UID: "def6e261-6609-4230-a535-baaba57c2757"). InnerVolumeSpecName "kube-api-access-xhfr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.210286 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhfr7\" (UniqueName: \"kubernetes.io/projected/def6e261-6609-4230-a535-baaba57c2757-kube-api-access-xhfr7\") on node \"crc\" DevicePath \"\"" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.766961 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gc5n2" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.766943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gc5n2" event={"ID":"def6e261-6609-4230-a535-baaba57c2757","Type":"ContainerDied","Data":"bd7a71cb5b545f55696fd4cc06be0076445fce53ac44ebb909efd0d32ca5fd09"} Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.767155 5030 scope.go:117] "RemoveContainer" containerID="fe2f67ee20e6395b1772b04c9fd66ad636dd62dbd48d1844aa7f2f4f259176be" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.769304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gcvmv" event={"ID":"3756e766-1fab-47ec-aefc-8ad5e10f4fe4","Type":"ContainerStarted","Data":"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec"} Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.795420 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-gcvmv" podStartSLOduration=2.578581105 podStartE2EDuration="3.795394552s" podCreationTimestamp="2026-01-21 00:21:33 +0000 UTC" firstStartedPulling="2026-01-21 00:21:34.620016289 +0000 UTC m=+6366.940276577" lastFinishedPulling="2026-01-21 00:21:35.836829736 +0000 UTC m=+6368.157090024" observedRunningTime="2026-01-21 00:21:36.792865612 +0000 UTC m=+6369.113125910" watchObservedRunningTime="2026-01-21 00:21:36.795394552 +0000 UTC m=+6369.115654840" Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.817317 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:36 crc kubenswrapper[5030]: I0121 00:21:36.827308 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-gc5n2"] Jan 21 00:21:37 crc kubenswrapper[5030]: I0121 00:21:37.973537 5030 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def6e261-6609-4230-a535-baaba57c2757" path="/var/lib/kubelet/pods/def6e261-6609-4230-a535-baaba57c2757/volumes" Jan 21 00:21:44 crc kubenswrapper[5030]: I0121 00:21:44.176323 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:44 crc kubenswrapper[5030]: I0121 00:21:44.176923 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:44 crc kubenswrapper[5030]: I0121 00:21:44.204486 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:44 crc kubenswrapper[5030]: I0121 00:21:44.866193 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.703549 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww"] Jan 21 00:21:46 crc kubenswrapper[5030]: E0121 00:21:46.703890 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def6e261-6609-4230-a535-baaba57c2757" containerName="registry-server" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.703902 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="def6e261-6609-4230-a535-baaba57c2757" containerName="registry-server" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.704033 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="def6e261-6609-4230-a535-baaba57c2757" containerName="registry-server" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.704952 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.707273 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.717529 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww"] Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.867489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkw8\" (UniqueName: \"kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.867908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.868108 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.970111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.970280 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkw8\" (UniqueName: \"kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.970388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.971294 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.971297 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:46 crc kubenswrapper[5030]: I0121 00:21:46.998226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkw8\" (UniqueName: \"kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8\") pod \"efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:47 crc kubenswrapper[5030]: I0121 00:21:47.023706 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:21:47 crc kubenswrapper[5030]: I0121 00:21:47.431435 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww"] Jan 21 00:21:47 crc kubenswrapper[5030]: W0121 00:21:47.434972 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b2e63e_c166_461e_92f0_71614f7ff73b.slice/crio-0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266 WatchSource:0}: Error finding container 0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266: Status 404 returned error can't find the container with id 0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266 Jan 21 00:21:47 crc kubenswrapper[5030]: I0121 00:21:47.857751 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerStarted","Data":"0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266"} Jan 21 00:21:48 crc kubenswrapper[5030]: I0121 00:21:48.867167 5030 generic.go:334] "Generic (PLEG): container finished" podID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerID="0972703b71c7a6099b4d59d3fb431741000ac8ef39554b6381b063e428753517" exitCode=0 Jan 21 00:21:48 crc kubenswrapper[5030]: I0121 00:21:48.867221 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerDied","Data":"0972703b71c7a6099b4d59d3fb431741000ac8ef39554b6381b063e428753517"} Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.244198 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-7313-account-create-update-q222b"] Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.245314 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.247345 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-db-secret" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.257830 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-create-bn7lq"] Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.260470 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.263281 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-7313-account-create-update-q222b"] Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.270786 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-bn7lq"] Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.411436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnqtv\" (UniqueName: \"kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.411499 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxgg\" (UniqueName: \"kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.411585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.412233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.513749 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.513925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " 
pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.513975 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnqtv\" (UniqueName: \"kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.514023 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxgg\" (UniqueName: \"kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.515343 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.515644 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.535008 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnqtv\" (UniqueName: \"kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv\") pod \"keystone-db-create-bn7lq\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.535084 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxgg\" (UniqueName: \"kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg\") pod \"keystone-7313-account-create-update-q222b\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.561832 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:49 crc kubenswrapper[5030]: I0121 00:21:49.575264 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.632979 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-7313-account-create-update-q222b"] Jan 21 00:21:51 crc kubenswrapper[5030]: W0121 00:21:51.636395 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c2d75e_5dd8_4544_adb0_c6a956b69393.slice/crio-ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c WatchSource:0}: Error finding container ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c: Status 404 returned error can't find the container with id ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.655176 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-bn7lq"] Jan 21 00:21:51 crc kubenswrapper[5030]: W0121 00:21:51.664879 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60451bed_d2a2_4033_8c5b_10d20ded889a.slice/crio-671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67 WatchSource:0}: Error finding container 671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67: Status 404 returned error can't find the container with id 671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67 Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.898159 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerStarted","Data":"3c6c4ba7dcd5ddeff09c58f2dcac4c82f8cce6d8c2ff50244ca4eae88cfdc6f2"} Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.903009 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" event={"ID":"d5c2d75e-5dd8-4544-adb0-c6a956b69393","Type":"ContainerStarted","Data":"dd306b1025814562f10ac884155612dc7902b43a48408d4f5c79b5dd83e322b0"} Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.903067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" event={"ID":"d5c2d75e-5dd8-4544-adb0-c6a956b69393","Type":"ContainerStarted","Data":"ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c"} Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.904934 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" event={"ID":"60451bed-d2a2-4033-8c5b-10d20ded889a","Type":"ContainerStarted","Data":"4780eff14b9160167f2aebd7335dbeef4f119149e7c378823fe1b8f44466bd30"} Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.904967 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" event={"ID":"60451bed-d2a2-4033-8c5b-10d20ded889a","Type":"ContainerStarted","Data":"671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67"} Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.939512 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" podStartSLOduration=2.939491821 podStartE2EDuration="2.939491821s" podCreationTimestamp="2026-01-21 00:21:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:21:51.93317192 +0000 UTC m=+6384.253432208" watchObservedRunningTime="2026-01-21 00:21:51.939491821 +0000 UTC m=+6384.259752109" Jan 21 00:21:51 crc kubenswrapper[5030]: I0121 00:21:51.954448 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" podStartSLOduration=2.954430329 podStartE2EDuration="2.954430329s" podCreationTimestamp="2026-01-21 00:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:21:51.952168444 +0000 UTC m=+6384.272428742" watchObservedRunningTime="2026-01-21 00:21:51.954430329 +0000 UTC m=+6384.274690617" Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.913111 5030 generic.go:334] "Generic (PLEG): container finished" podID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerID="3c6c4ba7dcd5ddeff09c58f2dcac4c82f8cce6d8c2ff50244ca4eae88cfdc6f2" exitCode=0 Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.913201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerDied","Data":"3c6c4ba7dcd5ddeff09c58f2dcac4c82f8cce6d8c2ff50244ca4eae88cfdc6f2"} Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.915372 5030 generic.go:334] "Generic (PLEG): container finished" podID="d5c2d75e-5dd8-4544-adb0-c6a956b69393" containerID="dd306b1025814562f10ac884155612dc7902b43a48408d4f5c79b5dd83e322b0" exitCode=0 Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.915442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" event={"ID":"d5c2d75e-5dd8-4544-adb0-c6a956b69393","Type":"ContainerDied","Data":"dd306b1025814562f10ac884155612dc7902b43a48408d4f5c79b5dd83e322b0"} Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.917241 5030 generic.go:334] "Generic (PLEG): container finished" podID="60451bed-d2a2-4033-8c5b-10d20ded889a" containerID="4780eff14b9160167f2aebd7335dbeef4f119149e7c378823fe1b8f44466bd30" exitCode=0 Jan 21 00:21:52 crc kubenswrapper[5030]: I0121 00:21:52.917291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" event={"ID":"60451bed-d2a2-4033-8c5b-10d20ded889a","Type":"ContainerDied","Data":"4780eff14b9160167f2aebd7335dbeef4f119149e7c378823fe1b8f44466bd30"} Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.333473 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.396148 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktxgg\" (UniqueName: \"kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg\") pod \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.396224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts\") pod \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\" (UID: \"d5c2d75e-5dd8-4544-adb0-c6a956b69393\") " Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.397045 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5c2d75e-5dd8-4544-adb0-c6a956b69393" (UID: "d5c2d75e-5dd8-4544-adb0-c6a956b69393"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.401744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg" (OuterVolumeSpecName: "kube-api-access-ktxgg") pod "d5c2d75e-5dd8-4544-adb0-c6a956b69393" (UID: "d5c2d75e-5dd8-4544-adb0-c6a956b69393"). InnerVolumeSpecName "kube-api-access-ktxgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.441189 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.501539 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts\") pod \"60451bed-d2a2-4033-8c5b-10d20ded889a\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.501614 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnqtv\" (UniqueName: \"kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv\") pod \"60451bed-d2a2-4033-8c5b-10d20ded889a\" (UID: \"60451bed-d2a2-4033-8c5b-10d20ded889a\") " Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.503544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60451bed-d2a2-4033-8c5b-10d20ded889a" (UID: "60451bed-d2a2-4033-8c5b-10d20ded889a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.504499 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktxgg\" (UniqueName: \"kubernetes.io/projected/d5c2d75e-5dd8-4544-adb0-c6a956b69393-kube-api-access-ktxgg\") on node \"crc\" DevicePath \"\"" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.504553 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5c2d75e-5dd8-4544-adb0-c6a956b69393-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.504569 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60451bed-d2a2-4033-8c5b-10d20ded889a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.505916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv" (OuterVolumeSpecName: "kube-api-access-cnqtv") pod "60451bed-d2a2-4033-8c5b-10d20ded889a" (UID: "60451bed-d2a2-4033-8c5b-10d20ded889a"). InnerVolumeSpecName "kube-api-access-cnqtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.607264 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnqtv\" (UniqueName: \"kubernetes.io/projected/60451bed-d2a2-4033-8c5b-10d20ded889a-kube-api-access-cnqtv\") on node \"crc\" DevicePath \"\"" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.934057 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" event={"ID":"d5c2d75e-5dd8-4544-adb0-c6a956b69393","Type":"ContainerDied","Data":"ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c"} Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.934295 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb51797a79eea1c88b57ed1428dea60c57d5282bb51f9a3e2fd6ad20add6f5c" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.934084 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-7313-account-create-update-q222b" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.935836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" event={"ID":"60451bed-d2a2-4033-8c5b-10d20ded889a","Type":"ContainerDied","Data":"671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67"} Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.935868 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-create-bn7lq" Jan 21 00:21:54 crc kubenswrapper[5030]: I0121 00:21:54.935876 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="671bb8dc9edf8c363df27832cf845f2615f685ef3926ba2e9e4eaa64f652bd67" Jan 21 00:21:55 crc kubenswrapper[5030]: E0121 00:21:55.095904 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c2d75e_5dd8_4544_adb0_c6a956b69393.slice\": RecentStats: unable to find data in memory cache]" Jan 21 00:21:59 crc kubenswrapper[5030]: I0121 00:21:59.979682 5030 generic.go:334] "Generic (PLEG): container finished" podID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerID="d3fb2075d76030e7e163397240e228668676e0fecc64314fe4f656308408ca6d" exitCode=0 Jan 21 00:21:59 crc kubenswrapper[5030]: I0121 00:21:59.979726 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerDied","Data":"d3fb2075d76030e7e163397240e228668676e0fecc64314fe4f656308408ca6d"} Jan 21 00:22:00 crc kubenswrapper[5030]: I0121 00:22:00.990081 5030 generic.go:334] "Generic (PLEG): container finished" podID="07630df9-d9b5-4944-939a-6a284e5e1488" containerID="9754df6db66703e2cca4cfda28e6dc5b5b74240ec8edd308b113c6623d6f191d" exitCode=0 Jan 21 00:22:00 crc kubenswrapper[5030]: I0121 00:22:00.990171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerDied","Data":"9754df6db66703e2cca4cfda28e6dc5b5b74240ec8edd308b113c6623d6f191d"} Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.336951 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.416054 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkw8\" (UniqueName: \"kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8\") pod \"91b2e63e-c166-461e-92f0-71614f7ff73b\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.416110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle\") pod \"91b2e63e-c166-461e-92f0-71614f7ff73b\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.416264 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util\") pod \"91b2e63e-c166-461e-92f0-71614f7ff73b\" (UID: \"91b2e63e-c166-461e-92f0-71614f7ff73b\") " Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.419230 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle" (OuterVolumeSpecName: "bundle") pod "91b2e63e-c166-461e-92f0-71614f7ff73b" (UID: "91b2e63e-c166-461e-92f0-71614f7ff73b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.420751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8" (OuterVolumeSpecName: "kube-api-access-pgkw8") pod "91b2e63e-c166-461e-92f0-71614f7ff73b" (UID: "91b2e63e-c166-461e-92f0-71614f7ff73b"). InnerVolumeSpecName "kube-api-access-pgkw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.430880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util" (OuterVolumeSpecName: "util") pod "91b2e63e-c166-461e-92f0-71614f7ff73b" (UID: "91b2e63e-c166-461e-92f0-71614f7ff73b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.517874 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.517912 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgkw8\" (UniqueName: \"kubernetes.io/projected/91b2e63e-c166-461e-92f0-71614f7ff73b-kube-api-access-pgkw8\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.517926 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/91b2e63e-c166-461e-92f0-71614f7ff73b-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.999505 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" event={"ID":"91b2e63e-c166-461e-92f0-71614f7ff73b","Type":"ContainerDied","Data":"0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266"} Jan 21 00:22:01 crc kubenswrapper[5030]: I0121 00:22:01.999544 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0802ba13fcdead5de4c0f64555d77bccad9ac799f8d76f8645e6fd8eb9709266" Jan 21 00:22:02 crc kubenswrapper[5030]: I0121 00:22:01.999558 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww" Jan 21 00:22:02 crc kubenswrapper[5030]: I0121 00:22:02.004380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerStarted","Data":"709e96694e3e9eb5e91d2a073603f195918fdc2221a73e50ac46808fffb4edd7"} Jan 21 00:22:02 crc kubenswrapper[5030]: I0121 00:22:02.004581 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:22:02 crc kubenswrapper[5030]: I0121 00:22:02.039640 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.303234728 podStartE2EDuration="1m7.039597912s" podCreationTimestamp="2026-01-21 00:20:55 +0000 UTC" firstStartedPulling="2026-01-21 00:20:57.115923736 +0000 UTC m=+6329.436184024" lastFinishedPulling="2026-01-21 00:21:27.85228692 +0000 UTC m=+6360.172547208" observedRunningTime="2026-01-21 00:22:02.031011177 +0000 UTC m=+6394.351271465" watchObservedRunningTime="2026-01-21 00:22:02.039597912 +0000 UTC m=+6394.359858210" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.047194 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:22:10 crc kubenswrapper[5030]: E0121 00:22:10.048067 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="pull" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048081 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="pull" Jan 21 00:22:10 crc kubenswrapper[5030]: E0121 00:22:10.048096 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="extract" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048103 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="extract" Jan 21 00:22:10 crc kubenswrapper[5030]: E0121 00:22:10.048118 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="util" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048125 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="util" Jan 21 00:22:10 crc kubenswrapper[5030]: E0121 00:22:10.048137 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c2d75e-5dd8-4544-adb0-c6a956b69393" containerName="mariadb-account-create-update" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048142 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c2d75e-5dd8-4544-adb0-c6a956b69393" containerName="mariadb-account-create-update" Jan 21 00:22:10 crc kubenswrapper[5030]: E0121 00:22:10.048159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60451bed-d2a2-4033-8c5b-10d20ded889a" containerName="mariadb-database-create" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048166 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60451bed-d2a2-4033-8c5b-10d20ded889a" containerName="mariadb-database-create" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048315 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="60451bed-d2a2-4033-8c5b-10d20ded889a" containerName="mariadb-database-create" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048330 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c2d75e-5dd8-4544-adb0-c6a956b69393" containerName="mariadb-account-create-update" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048340 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" containerName="extract" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.048815 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.051145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.051351 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-g76qj" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.058346 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.157466 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.157543 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.243207 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.243303 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thw76\" (UniqueName: \"kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.243337 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.344547 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thw76\" (UniqueName: \"kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.344606 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.344685 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.351013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.358891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.361368 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thw76\" (UniqueName: \"kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76\") pod \"barbican-operator-controller-manager-55fdffbbdb-4w8zl\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.369517 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:10 crc kubenswrapper[5030]: I0121 00:22:10.825191 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:22:11 crc kubenswrapper[5030]: I0121 00:22:11.066276 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" event={"ID":"55a2d438-b8dc-4dac-a66b-7e596e32c4f3","Type":"ContainerStarted","Data":"2901b67f1e91fb15b5d9ea3d02dc466b9ab51deac6ef4b190f61da34e8b143a7"} Jan 21 00:22:16 crc kubenswrapper[5030]: I0121 00:22:16.103401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" event={"ID":"55a2d438-b8dc-4dac-a66b-7e596e32c4f3","Type":"ContainerStarted","Data":"e0fe572a232be498772b65e6c7c1ad48706f6539a7fc0ed07ded17263eee1415"} Jan 21 00:22:16 crc kubenswrapper[5030]: I0121 00:22:16.104040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:16 crc kubenswrapper[5030]: I0121 00:22:16.138658 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" podStartSLOduration=2.212769029 podStartE2EDuration="6.13863756s" podCreationTimestamp="2026-01-21 00:22:10 +0000 UTC" firstStartedPulling="2026-01-21 00:22:10.830400696 +0000 UTC m=+6403.150660974" lastFinishedPulling="2026-01-21 00:22:14.756269217 +0000 UTC m=+6407.076529505" observedRunningTime="2026-01-21 00:22:16.123183801 +0000 UTC m=+6408.443444099" watchObservedRunningTime="2026-01-21 00:22:16.13863756 +0000 UTC m=+6408.458897848" Jan 21 00:22:16 crc kubenswrapper[5030]: I0121 00:22:16.632815 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.169259 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-fmtbr"] Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.170400 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.179384 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.179992 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-h8r6k" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.180239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.180406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.187075 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-fmtbr"] Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.258952 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.259400 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbp55\" (UniqueName: \"kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.361144 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.361321 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbp55\" (UniqueName: \"kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.377851 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.384136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbp55\" (UniqueName: \"kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55\") pod \"keystone-db-sync-fmtbr\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.540021 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:17 crc kubenswrapper[5030]: I0121 00:22:17.789253 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-fmtbr"] Jan 21 00:22:17 crc kubenswrapper[5030]: W0121 00:22:17.791684 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a0e7639_c75c_41f1_8301_8d2928da6566.slice/crio-f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e WatchSource:0}: Error finding container f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e: Status 404 returned error can't find the container with id f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e Jan 21 00:22:18 crc kubenswrapper[5030]: I0121 00:22:18.117234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" event={"ID":"6a0e7639-c75c-41f1-8301-8d2928da6566","Type":"ContainerStarted","Data":"f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e"} Jan 21 00:22:20 crc kubenswrapper[5030]: I0121 00:22:20.132607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" event={"ID":"6a0e7639-c75c-41f1-8301-8d2928da6566","Type":"ContainerStarted","Data":"b0dcf5f69ea5f517008215877759c4e6c14e3065f71f19c488496bc5b4d56dbb"} Jan 21 00:22:20 crc kubenswrapper[5030]: I0121 00:22:20.154372 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" podStartSLOduration=1.622892193 podStartE2EDuration="3.154357689s" podCreationTimestamp="2026-01-21 00:22:17 +0000 UTC" firstStartedPulling="2026-01-21 00:22:17.794266443 +0000 UTC m=+6410.114526731" lastFinishedPulling="2026-01-21 00:22:19.325731929 +0000 UTC m=+6411.645992227" observedRunningTime="2026-01-21 00:22:20.148520289 +0000 UTC m=+6412.468780587" watchObservedRunningTime="2026-01-21 00:22:20.154357689 +0000 UTC m=+6412.474617977" Jan 21 00:22:20 crc kubenswrapper[5030]: I0121 00:22:20.374643 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:22:25 crc kubenswrapper[5030]: I0121 00:22:25.168537 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a0e7639-c75c-41f1-8301-8d2928da6566" containerID="b0dcf5f69ea5f517008215877759c4e6c14e3065f71f19c488496bc5b4d56dbb" exitCode=0 Jan 21 00:22:25 crc kubenswrapper[5030]: I0121 00:22:25.168698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" event={"ID":"6a0e7639-c75c-41f1-8301-8d2928da6566","Type":"ContainerDied","Data":"b0dcf5f69ea5f517008215877759c4e6c14e3065f71f19c488496bc5b4d56dbb"} Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.506151 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.660021 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbp55\" (UniqueName: \"kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55\") pod \"6a0e7639-c75c-41f1-8301-8d2928da6566\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.660146 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data\") pod \"6a0e7639-c75c-41f1-8301-8d2928da6566\" (UID: \"6a0e7639-c75c-41f1-8301-8d2928da6566\") " Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.665163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55" (OuterVolumeSpecName: "kube-api-access-nbp55") pod "6a0e7639-c75c-41f1-8301-8d2928da6566" (UID: "6a0e7639-c75c-41f1-8301-8d2928da6566"). InnerVolumeSpecName "kube-api-access-nbp55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.697186 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data" (OuterVolumeSpecName: "config-data") pod "6a0e7639-c75c-41f1-8301-8d2928da6566" (UID: "6a0e7639-c75c-41f1-8301-8d2928da6566"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.763057 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbp55\" (UniqueName: \"kubernetes.io/projected/6a0e7639-c75c-41f1-8301-8d2928da6566-kube-api-access-nbp55\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:26 crc kubenswrapper[5030]: I0121 00:22:26.763094 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0e7639-c75c-41f1-8301-8d2928da6566-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.185055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" event={"ID":"6a0e7639-c75c-41f1-8301-8d2928da6566","Type":"ContainerDied","Data":"f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e"} Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.185101 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d3dee2eada509052909ef460c443be4e40e3b82832c8a586722a6f54133b5e" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.185383 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-db-sync-fmtbr" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.353752 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-95cdt"] Jan 21 00:22:27 crc kubenswrapper[5030]: E0121 00:22:27.354231 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0e7639-c75c-41f1-8301-8d2928da6566" containerName="keystone-db-sync" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.354245 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0e7639-c75c-41f1-8301-8d2928da6566" containerName="keystone-db-sync" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.354375 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0e7639-c75c-41f1-8301-8d2928da6566" containerName="keystone-db-sync" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.354846 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.356511 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.357051 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.357099 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.357578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.357768 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-h8r6k" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.367752 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-95cdt"] Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.473496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.473655 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ptr\" (UniqueName: \"kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.473757 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.473997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.474537 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.576038 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.576145 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.576175 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.576214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.576244 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ptr\" (UniqueName: \"kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.580111 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.580324 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.580683 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data\") pod 
\"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.583090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.591356 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ptr\" (UniqueName: \"kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr\") pod \"keystone-bootstrap-95cdt\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:27 crc kubenswrapper[5030]: I0121 00:22:27.672309 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:28 crc kubenswrapper[5030]: I0121 00:22:28.142592 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-95cdt"] Jan 21 00:22:28 crc kubenswrapper[5030]: W0121 00:22:28.148842 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d16aeb_b00f_463b_840d_2662ece08fe7.slice/crio-772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba WatchSource:0}: Error finding container 772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba: Status 404 returned error can't find the container with id 772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba Jan 21 00:22:28 crc kubenswrapper[5030]: I0121 00:22:28.162737 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"osp-secret" Jan 21 00:22:28 crc kubenswrapper[5030]: I0121 00:22:28.193539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" event={"ID":"e7d16aeb-b00f-463b-840d-2662ece08fe7","Type":"ContainerStarted","Data":"772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba"} Jan 21 00:22:29 crc kubenswrapper[5030]: I0121 00:22:29.202376 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" event={"ID":"e7d16aeb-b00f-463b-840d-2662ece08fe7","Type":"ContainerStarted","Data":"96423988b979eabdb9dd2652b8c3e27f9c7b0b171119e451f241c0402de3b5ec"} Jan 21 00:22:29 crc kubenswrapper[5030]: I0121 00:22:29.223202 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" podStartSLOduration=2.223185386 podStartE2EDuration="2.223185386s" podCreationTimestamp="2026-01-21 00:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:22:29.216025104 +0000 UTC m=+6421.536285392" watchObservedRunningTime="2026-01-21 00:22:29.223185386 +0000 UTC m=+6421.543445674" Jan 21 00:22:30 crc kubenswrapper[5030]: I0121 00:22:30.980878 5030 scope.go:117] "RemoveContainer" containerID="3229f0d80fcf56c29c7b93700bf3ac47371f65c7b8b5fd6d40bfa996fe440891" Jan 21 00:22:31 crc kubenswrapper[5030]: I0121 00:22:31.006840 5030 scope.go:117] "RemoveContainer" 
containerID="1084215f143ef30d91eeaaebf9c9bb81f8a928c758a760d252e97d08227b0783" Jan 21 00:22:31 crc kubenswrapper[5030]: I0121 00:22:31.047436 5030 scope.go:117] "RemoveContainer" containerID="2ad8dded7146ad34908f5b8c82eb71ed3858392783cbb24fd9d9fed95c890b76" Jan 21 00:22:37 crc kubenswrapper[5030]: I0121 00:22:37.257307 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7d16aeb-b00f-463b-840d-2662ece08fe7" containerID="96423988b979eabdb9dd2652b8c3e27f9c7b0b171119e451f241c0402de3b5ec" exitCode=0 Jan 21 00:22:37 crc kubenswrapper[5030]: I0121 00:22:37.257410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" event={"ID":"e7d16aeb-b00f-463b-840d-2662ece08fe7","Type":"ContainerDied","Data":"96423988b979eabdb9dd2652b8c3e27f9c7b0b171119e451f241c0402de3b5ec"} Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.563338 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.621897 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data\") pod \"e7d16aeb-b00f-463b-840d-2662ece08fe7\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.621976 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys\") pod \"e7d16aeb-b00f-463b-840d-2662ece08fe7\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.622033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9ptr\" (UniqueName: \"kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr\") pod \"e7d16aeb-b00f-463b-840d-2662ece08fe7\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.622132 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys\") pod \"e7d16aeb-b00f-463b-840d-2662ece08fe7\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.622159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts\") pod \"e7d16aeb-b00f-463b-840d-2662ece08fe7\" (UID: \"e7d16aeb-b00f-463b-840d-2662ece08fe7\") " Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.628142 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e7d16aeb-b00f-463b-840d-2662ece08fe7" (UID: "e7d16aeb-b00f-463b-840d-2662ece08fe7"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.628359 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr" (OuterVolumeSpecName: "kube-api-access-s9ptr") pod "e7d16aeb-b00f-463b-840d-2662ece08fe7" (UID: "e7d16aeb-b00f-463b-840d-2662ece08fe7"). InnerVolumeSpecName "kube-api-access-s9ptr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.628737 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e7d16aeb-b00f-463b-840d-2662ece08fe7" (UID: "e7d16aeb-b00f-463b-840d-2662ece08fe7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.635863 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts" (OuterVolumeSpecName: "scripts") pod "e7d16aeb-b00f-463b-840d-2662ece08fe7" (UID: "e7d16aeb-b00f-463b-840d-2662ece08fe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.645682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data" (OuterVolumeSpecName: "config-data") pod "e7d16aeb-b00f-463b-840d-2662ece08fe7" (UID: "e7d16aeb-b00f-463b-840d-2662ece08fe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.723434 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.723806 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.723820 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.723832 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e7d16aeb-b00f-463b-840d-2662ece08fe7-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:38 crc kubenswrapper[5030]: I0121 00:22:38.723852 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9ptr\" (UniqueName: \"kubernetes.io/projected/e7d16aeb-b00f-463b-840d-2662ece08fe7-kube-api-access-s9ptr\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.273692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" event={"ID":"e7d16aeb-b00f-463b-840d-2662ece08fe7","Type":"ContainerDied","Data":"772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba"} Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.273745 5030 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="772c498fb3eec7147e8220516ce2b6a123cd317114f346d4102944a6f986e1ba" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.273759 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-bootstrap-95cdt" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.358703 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:22:39 crc kubenswrapper[5030]: E0121 00:22:39.358999 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d16aeb-b00f-463b-840d-2662ece08fe7" containerName="keystone-bootstrap" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.359016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d16aeb-b00f-463b-840d-2662ece08fe7" containerName="keystone-bootstrap" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.359134 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d16aeb-b00f-463b-840d-2662ece08fe7" containerName="keystone-bootstrap" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.359681 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.361758 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-config-data" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.364756 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-keystone-dockercfg-h8r6k" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.365765 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone-scripts" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.365816 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"keystone" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.379106 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.433705 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.433791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.433816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.433922 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjxrf\" (UniqueName: 
\"kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.434104 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.535983 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjxrf\" (UniqueName: \"kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.536851 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.536954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.537001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.537076 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.540401 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.540605 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.541194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.543032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.554946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjxrf\" (UniqueName: \"kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf\") pod \"keystone-567879f7bf-95bbl\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:39 crc kubenswrapper[5030]: I0121 00:22:39.678087 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:40 crc kubenswrapper[5030]: I0121 00:22:40.089929 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:22:40 crc kubenswrapper[5030]: I0121 00:22:40.157304 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:22:40 crc kubenswrapper[5030]: I0121 00:22:40.157411 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:22:40 crc kubenswrapper[5030]: I0121 00:22:40.281553 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" event={"ID":"bd6ed570-11c2-4baf-b681-8e330085006b","Type":"ContainerStarted","Data":"ee334a54da40d88755b58a39c94a571d47434027095fb36b744e693edd8b91a6"} Jan 21 00:22:41 crc kubenswrapper[5030]: I0121 00:22:41.290594 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" event={"ID":"bd6ed570-11c2-4baf-b681-8e330085006b","Type":"ContainerStarted","Data":"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb"} Jan 21 00:22:41 crc kubenswrapper[5030]: I0121 00:22:41.290959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:22:41 crc kubenswrapper[5030]: I0121 00:22:41.308403 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" podStartSLOduration=2.308382772 podStartE2EDuration="2.308382772s" podCreationTimestamp="2026-01-21 00:22:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:22:41.305429931 +0000 UTC m=+6433.625690239" watchObservedRunningTime="2026-01-21 00:22:41.308382772 +0000 UTC m=+6433.628643060" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 
00:22:47.371697 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-2thtv"] Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.374279 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.380591 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-2thtv"] Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.390435 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p"] Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.399919 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.402264 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.419242 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p"] Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.458231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.458343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.458388 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnnc\" (UniqueName: \"kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.458435 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bvg\" (UniqueName: \"kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.559945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnnc\" (UniqueName: \"kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.560058 5030 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-l4bvg\" (UniqueName: \"kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.560172 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.560239 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.561239 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.561322 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.579558 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bvg\" (UniqueName: \"kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg\") pod \"barbican-db-create-2thtv\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.581040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnnc\" (UniqueName: \"kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc\") pod \"barbican-d7f7-account-create-update-x4j5p\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.704426 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:47 crc kubenswrapper[5030]: I0121 00:22:47.719030 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.131199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p"] Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.196609 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-2thtv"] Jan 21 00:22:48 crc kubenswrapper[5030]: W0121 00:22:48.199929 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0735a7_8329_4e49_ae6d_f67fdccf1ddc.slice/crio-52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9 WatchSource:0}: Error finding container 52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9: Status 404 returned error can't find the container with id 52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9 Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.354910 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" event={"ID":"725c2e18-6911-427f-acef-152827de9010","Type":"ContainerStarted","Data":"5f221d2d0d551cd23f128dcce507a6d1a26d8589212885b5f75b7ba3a5f38b95"} Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.355223 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" event={"ID":"725c2e18-6911-427f-acef-152827de9010","Type":"ContainerStarted","Data":"2d28f69c114f7589461438cae42d8b85c510dd13c34ea3270ab8c4ca70a5195c"} Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.356420 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-2thtv" event={"ID":"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc","Type":"ContainerStarted","Data":"52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9"} Jan 21 00:22:48 crc kubenswrapper[5030]: I0121 00:22:48.370579 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" podStartSLOduration=1.370554889 podStartE2EDuration="1.370554889s" podCreationTimestamp="2026-01-21 00:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:22:48.367292661 +0000 UTC m=+6440.687552959" watchObservedRunningTime="2026-01-21 00:22:48.370554889 +0000 UTC m=+6440.690815207" Jan 21 00:22:49 crc kubenswrapper[5030]: I0121 00:22:49.367444 5030 generic.go:334] "Generic (PLEG): container finished" podID="3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" containerID="bb30e01ccbfbc917799867a4523d8b03ee98e8ae39fdc6661b3a6bd49ea96738" exitCode=0 Jan 21 00:22:49 crc kubenswrapper[5030]: I0121 00:22:49.367513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-2thtv" event={"ID":"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc","Type":"ContainerDied","Data":"bb30e01ccbfbc917799867a4523d8b03ee98e8ae39fdc6661b3a6bd49ea96738"} Jan 21 00:22:49 crc kubenswrapper[5030]: I0121 00:22:49.371201 5030 generic.go:334] "Generic (PLEG): container finished" podID="725c2e18-6911-427f-acef-152827de9010" containerID="5f221d2d0d551cd23f128dcce507a6d1a26d8589212885b5f75b7ba3a5f38b95" exitCode=0 Jan 21 00:22:49 crc kubenswrapper[5030]: I0121 00:22:49.371238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" event={"ID":"725c2e18-6911-427f-acef-152827de9010","Type":"ContainerDied","Data":"5f221d2d0d551cd23f128dcce507a6d1a26d8589212885b5f75b7ba3a5f38b95"} Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.759867 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.766383 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.828721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdnnc\" (UniqueName: \"kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc\") pod \"725c2e18-6911-427f-acef-152827de9010\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.828886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts\") pod \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.828942 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts\") pod \"725c2e18-6911-427f-acef-152827de9010\" (UID: \"725c2e18-6911-427f-acef-152827de9010\") " Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.829022 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bvg\" (UniqueName: \"kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg\") pod \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\" (UID: \"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc\") " Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.830381 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" (UID: "3c0735a7-8329-4e49-ae6d-f67fdccf1ddc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.837652 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg" (OuterVolumeSpecName: "kube-api-access-l4bvg") pod "3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" (UID: "3c0735a7-8329-4e49-ae6d-f67fdccf1ddc"). InnerVolumeSpecName "kube-api-access-l4bvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.840420 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc" (OuterVolumeSpecName: "kube-api-access-kdnnc") pod "725c2e18-6911-427f-acef-152827de9010" (UID: "725c2e18-6911-427f-acef-152827de9010"). InnerVolumeSpecName "kube-api-access-kdnnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.841095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "725c2e18-6911-427f-acef-152827de9010" (UID: "725c2e18-6911-427f-acef-152827de9010"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.931194 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdnnc\" (UniqueName: \"kubernetes.io/projected/725c2e18-6911-427f-acef-152827de9010-kube-api-access-kdnnc\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.931248 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.931259 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725c2e18-6911-427f-acef-152827de9010-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:50 crc kubenswrapper[5030]: I0121 00:22:50.931270 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bvg\" (UniqueName: \"kubernetes.io/projected/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc-kube-api-access-l4bvg\") on node \"crc\" DevicePath \"\"" Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.385873 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" event={"ID":"725c2e18-6911-427f-acef-152827de9010","Type":"ContainerDied","Data":"2d28f69c114f7589461438cae42d8b85c510dd13c34ea3270ab8c4ca70a5195c"} Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.385907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p" Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.385917 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d28f69c114f7589461438cae42d8b85c510dd13c34ea3270ab8c4ca70a5195c" Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.389607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-2thtv" event={"ID":"3c0735a7-8329-4e49-ae6d-f67fdccf1ddc","Type":"ContainerDied","Data":"52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9"} Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.389653 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a80d036450f72947257f01a8fe9f989319d35c6afd1be4a11dc0b91d6425c9" Jan 21 00:22:51 crc kubenswrapper[5030]: I0121 00:22:51.389674 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-2thtv" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.665014 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-g78qf"] Jan 21 00:22:52 crc kubenswrapper[5030]: E0121 00:22:52.665604 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" containerName="mariadb-database-create" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.665636 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" containerName="mariadb-database-create" Jan 21 00:22:52 crc kubenswrapper[5030]: E0121 00:22:52.665657 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c2e18-6911-427f-acef-152827de9010" containerName="mariadb-account-create-update" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.665665 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c2e18-6911-427f-acef-152827de9010" containerName="mariadb-account-create-update" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.665814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" containerName="mariadb-database-create" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.665828 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="725c2e18-6911-427f-acef-152827de9010" containerName="mariadb-account-create-update" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.666397 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.668651 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-cf5gh" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.668719 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.676770 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-g78qf"] Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.756487 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.756651 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45nvz\" (UniqueName: \"kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.857736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45nvz\" (UniqueName: \"kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.857840 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.863254 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.875894 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45nvz\" (UniqueName: \"kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz\") pod \"barbican-db-sync-g78qf\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:52 crc kubenswrapper[5030]: I0121 00:22:52.987816 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:22:53 crc kubenswrapper[5030]: I0121 00:22:53.450026 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-g78qf"] Jan 21 00:22:54 crc kubenswrapper[5030]: I0121 00:22:54.411415 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" event={"ID":"cc08d4c3-b17a-49b4-891f-1178e802d65a","Type":"ContainerStarted","Data":"fafa25626775c155cde6fa482cef84cba58e2b86c8fd8fcf6e2e0afc0db17064"} Jan 21 00:22:55 crc kubenswrapper[5030]: I0121 00:22:55.421401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" event={"ID":"cc08d4c3-b17a-49b4-891f-1178e802d65a","Type":"ContainerStarted","Data":"f546ff270d13e87756b68bc69b14cfc012088fbc136eaae01b98149a7d9090e1"} Jan 21 00:22:55 crc kubenswrapper[5030]: I0121 00:22:55.441351 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" podStartSLOduration=2.435801363 podStartE2EDuration="3.441327761s" podCreationTimestamp="2026-01-21 00:22:52 +0000 UTC" firstStartedPulling="2026-01-21 00:22:53.454293369 +0000 UTC m=+6445.774553657" lastFinishedPulling="2026-01-21 00:22:54.459819767 +0000 UTC m=+6446.780080055" observedRunningTime="2026-01-21 00:22:55.435862 +0000 UTC m=+6447.756122308" watchObservedRunningTime="2026-01-21 00:22:55.441327761 +0000 UTC m=+6447.761588049" Jan 21 00:23:00 crc kubenswrapper[5030]: I0121 00:23:00.463480 5030 generic.go:334] "Generic (PLEG): container finished" podID="cc08d4c3-b17a-49b4-891f-1178e802d65a" containerID="f546ff270d13e87756b68bc69b14cfc012088fbc136eaae01b98149a7d9090e1" exitCode=0 Jan 21 00:23:00 crc kubenswrapper[5030]: I0121 00:23:00.463666 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" event={"ID":"cc08d4c3-b17a-49b4-891f-1178e802d65a","Type":"ContainerDied","Data":"f546ff270d13e87756b68bc69b14cfc012088fbc136eaae01b98149a7d9090e1"} Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.760236 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.832292 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45nvz\" (UniqueName: \"kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz\") pod \"cc08d4c3-b17a-49b4-891f-1178e802d65a\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.832476 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data\") pod \"cc08d4c3-b17a-49b4-891f-1178e802d65a\" (UID: \"cc08d4c3-b17a-49b4-891f-1178e802d65a\") " Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.842212 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cc08d4c3-b17a-49b4-891f-1178e802d65a" (UID: "cc08d4c3-b17a-49b4-891f-1178e802d65a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.842291 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz" (OuterVolumeSpecName: "kube-api-access-45nvz") pod "cc08d4c3-b17a-49b4-891f-1178e802d65a" (UID: "cc08d4c3-b17a-49b4-891f-1178e802d65a"). InnerVolumeSpecName "kube-api-access-45nvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.933873 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45nvz\" (UniqueName: \"kubernetes.io/projected/cc08d4c3-b17a-49b4-891f-1178e802d65a-kube-api-access-45nvz\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:01 crc kubenswrapper[5030]: I0121 00:23:01.933912 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cc08d4c3-b17a-49b4-891f-1178e802d65a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.480407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" event={"ID":"cc08d4c3-b17a-49b4-891f-1178e802d65a","Type":"ContainerDied","Data":"fafa25626775c155cde6fa482cef84cba58e2b86c8fd8fcf6e2e0afc0db17064"} Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.480452 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fafa25626775c155cde6fa482cef84cba58e2b86c8fd8fcf6e2e0afc0db17064" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.480493 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-g78qf" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.745285 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:02 crc kubenswrapper[5030]: E0121 00:23:02.745964 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc08d4c3-b17a-49b4-891f-1178e802d65a" containerName="barbican-db-sync" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.745985 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc08d4c3-b17a-49b4-891f-1178e802d65a" containerName="barbican-db-sync" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.746110 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc08d4c3-b17a-49b4-891f-1178e802d65a" containerName="barbican-db-sync" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.747021 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.752006 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-cf5gh" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.756951 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-worker-config-data" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.757797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-config-data" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.761059 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.762407 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.776696 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.779916 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.798527 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849568 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849702 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkhx\" (UniqueName: \"kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849753 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849816 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh7jv\" (UniqueName: \"kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.849947 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkhx\" (UniqueName: \"kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951777 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh7jv\" (UniqueName: \"kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951808 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " 
pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.951888 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.952362 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.955349 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.960846 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.964302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.973500 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data\") pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.978058 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.993122 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh7jv\" (UniqueName: \"kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv\") pod \"barbican-worker-8545978bf5-wsfw6\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:02 crc kubenswrapper[5030]: I0121 00:23:02.999187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkhx\" (UniqueName: \"kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx\") 
pod \"barbican-keystone-listener-6d64c545dd-s8mlw\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.030674 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.033005 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.038746 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.043930 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.083006 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.098250 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.154573 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.154718 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.154783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjp6k\" (UniqueName: \"kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.154818 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.255868 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.256226 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.256278 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjp6k\" (UniqueName: \"kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.256311 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.256655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.268726 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.273737 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.276385 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjp6k\" (UniqueName: \"kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k\") pod \"barbican-api-564dbbdd96-b77ld\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.370304 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.579122 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:03 crc kubenswrapper[5030]: W0121 00:23:03.591204 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda527ae0b_f731_403e_b261_9092b798f544.slice/crio-ea751a9a4cf0d7d15ecfae46bae1514d3e65ef385480f60ba42195960fcf5990 WatchSource:0}: Error finding container ea751a9a4cf0d7d15ecfae46bae1514d3e65ef385480f60ba42195960fcf5990: Status 404 returned error can't find the container with id ea751a9a4cf0d7d15ecfae46bae1514d3e65ef385480f60ba42195960fcf5990 Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.631934 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:03 crc kubenswrapper[5030]: W0121 00:23:03.639178 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f3e9f82_8640_4645_b3da_d678830b75cb.slice/crio-9f1e58939c2cd05b0f432f845686a20cfab3c881b04aa6585198141ae706b3b4 WatchSource:0}: Error finding container 9f1e58939c2cd05b0f432f845686a20cfab3c881b04aa6585198141ae706b3b4: Status 404 returned error can't find the container with id 9f1e58939c2cd05b0f432f845686a20cfab3c881b04aa6585198141ae706b3b4 Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.794914 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:03 crc kubenswrapper[5030]: W0121 00:23:03.799431 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod104fd241_bda1_4f9a_99a5_17f913975981.slice/crio-e96d4b3f48b588348de0118caae00030ea24e0e6c63d6dd67ab1f34546ba579f WatchSource:0}: Error finding container e96d4b3f48b588348de0118caae00030ea24e0e6c63d6dd67ab1f34546ba579f: Status 404 returned error can't find the container with id e96d4b3f48b588348de0118caae00030ea24e0e6c63d6dd67ab1f34546ba579f Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.816870 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.822950 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.837035 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.965173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.965267 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.965344 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:03 crc kubenswrapper[5030]: I0121 00:23:03.965373 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmm62\" (UniqueName: \"kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.037900 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.039644 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.056142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.066401 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.066647 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.066735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmm62\" (UniqueName: \"kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.066848 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.070897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.071209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.072331 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.087318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmm62\" (UniqueName: \"kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62\") pod \"barbican-api-564dbbdd96-xfdxs\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.147125 5030 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.169151 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.169240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzg4\" (UniqueName: \"kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.169300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.169325 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.227312 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.229194 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.237981 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.271601 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.271735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzg4\" (UniqueName: \"kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.272815 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.272895 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.273456 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.277292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.278942 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.297928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svzg4\" (UniqueName: \"kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4\") pod \"barbican-keystone-listener-6d64c545dd-hzgqw\" (UID: 
\"104f10b4-2a7a-4295-89a5-f1040fd74574\") " pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.374985 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.375082 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bcvf\" (UniqueName: \"kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.375249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.375525 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.426414 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.477418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.477526 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.477549 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.477604 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bcvf\" (UniqueName: \"kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.478106 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.481561 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.484467 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.485492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.500758 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bcvf\" (UniqueName: \"kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf\") pod \"barbican-worker-8545978bf5-4pwfh\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565020 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerStarted","Data":"1eed9e5fcb657e295f02eb1548116521353aeb8fa56707eea44c2f5ad7c7e439"} Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerStarted","Data":"a29a219ed889f9c2a16f8449476d999818a02e6bccd8b17e1f56f0c33130f249"} Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerStarted","Data":"e96d4b3f48b588348de0118caae00030ea24e0e6c63d6dd67ab1f34546ba579f"} Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565148 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565888 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.565915 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.581994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerStarted","Data":"ea751a9a4cf0d7d15ecfae46bae1514d3e65ef385480f60ba42195960fcf5990"} Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.620982 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" podStartSLOduration=2.6209542150000003 podStartE2EDuration="2.620954215s" podCreationTimestamp="2026-01-21 00:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:04.60818951 +0000 UTC m=+6456.928449798" watchObservedRunningTime="2026-01-21 00:23:04.620954215 +0000 UTC m=+6456.941214513" Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.621059 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerStarted","Data":"9f1e58939c2cd05b0f432f845686a20cfab3c881b04aa6585198141ae706b3b4"} Jan 21 00:23:04 crc kubenswrapper[5030]: I0121 00:23:04.946768 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:04 crc kubenswrapper[5030]: W0121 00:23:04.953110 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod104f10b4_2a7a_4295_89a5_f1040fd74574.slice/crio-13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7 WatchSource:0}: Error finding container 13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7: Status 404 returned error can't find the container with id 13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7 Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.036940 5030 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:05 crc kubenswrapper[5030]: W0121 00:23:05.048216 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95a44ec0_cea4_44d5_84e0_8468a74c72b9.slice/crio-1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498 WatchSource:0}: Error finding container 1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498: Status 404 returned error can't find the container with id 1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498 Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.394666 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.566112 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.629134 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerStarted","Data":"c214a0ed31fc8abe049a628d2f86c356902f4367e7de355914cc7fbdbaee9372"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.630088 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerStarted","Data":"433204c6eec430cbb10390cefa5fd5db439270c4694cc72117c33a73472a48d2"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.631262 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerStarted","Data":"13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.635563 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerStarted","Data":"f9b81a1d14551f5d7202ece2da011aa043d3d485bfcc7e76ca93afed325bf829"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.635603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerStarted","Data":"dfeb54aef3facf25f13f13516125d56a0629a19f762bca6d2ef8fae9c6885c74"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.636391 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerStarted","Data":"1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498"} Jan 21 00:23:05 crc kubenswrapper[5030]: I0121 00:23:05.768362 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.647675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerStarted","Data":"2caaad1dad98d3f0bde280b9f62f75f37b9a16e1ef62be6fd79c2d57aae1f591"} Jan 21 00:23:06 crc 
kubenswrapper[5030]: I0121 00:23:06.647991 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener-log" containerID="cri-o://e5c76fade9d56f705e934b1239b542d08c1aed1713f5c0e36100b29dbde26be3" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.648065 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener" containerID="cri-o://2caaad1dad98d3f0bde280b9f62f75f37b9a16e1ef62be6fd79c2d57aae1f591" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.648008 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerStarted","Data":"e5c76fade9d56f705e934b1239b542d08c1aed1713f5c0e36100b29dbde26be3"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.652734 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerStarted","Data":"488cf71ce904ba65bfbbd66a025830faa669f188e1f004c6f1820bfd1f9a3e2e"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.652925 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api-log" containerID="cri-o://f9b81a1d14551f5d7202ece2da011aa043d3d485bfcc7e76ca93afed325bf829" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.652979 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.652998 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api" containerID="cri-o://488cf71ce904ba65bfbbd66a025830faa669f188e1f004c6f1820bfd1f9a3e2e" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.653065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.655294 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerStarted","Data":"67abbc44be874834bfc74bf06dbef4f876db30dd4f2391d60959932155e14081"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.655334 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerStarted","Data":"e3cc4bf99227158bdae8763a00722f9cc15c09290fa146bcd19dbf2bd9a7b163"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.655444 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker-log" 
containerID="cri-o://e3cc4bf99227158bdae8763a00722f9cc15c09290fa146bcd19dbf2bd9a7b163" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.655522 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker" containerID="cri-o://67abbc44be874834bfc74bf06dbef4f876db30dd4f2391d60959932155e14081" gracePeriod=30 Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.664521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerStarted","Data":"f8449311aba6960e9776f5d128658a0f2296ffb181af1b38a4c51523946b7c93"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.669985 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" podStartSLOduration=2.669964908 podStartE2EDuration="2.669964908s" podCreationTimestamp="2026-01-21 00:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:06.66752478 +0000 UTC m=+6458.987785078" watchObservedRunningTime="2026-01-21 00:23:06.669964908 +0000 UTC m=+6458.990225196" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.671361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerStarted","Data":"64525601efc78dd745920bd43f8b888d269f03d8f7a926c0c7ac47b3ab5f942e"} Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.695493 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" podStartSLOduration=2.695463807 podStartE2EDuration="2.695463807s" podCreationTimestamp="2026-01-21 00:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:06.687503137 +0000 UTC m=+6459.007763425" watchObservedRunningTime="2026-01-21 00:23:06.695463807 +0000 UTC m=+6459.015724095" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.723676 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" podStartSLOduration=3.7235945900000003 podStartE2EDuration="3.72359459s" podCreationTimestamp="2026-01-21 00:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:06.705078367 +0000 UTC m=+6459.025338675" watchObservedRunningTime="2026-01-21 00:23:06.72359459 +0000 UTC m=+6459.043854878" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.732174 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" podStartSLOduration=3.895630615 podStartE2EDuration="4.732148294s" podCreationTimestamp="2026-01-21 00:23:02 +0000 UTC" firstStartedPulling="2026-01-21 00:23:03.599836755 +0000 UTC m=+6455.920097043" lastFinishedPulling="2026-01-21 00:23:04.436354434 +0000 UTC m=+6456.756614722" observedRunningTime="2026-01-21 00:23:06.72109085 +0000 UTC m=+6459.041351138" watchObservedRunningTime="2026-01-21 00:23:06.732148294 +0000 UTC 
m=+6459.052408582" Jan 21 00:23:06 crc kubenswrapper[5030]: I0121 00:23:06.742766 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" podStartSLOduration=3.917539028 podStartE2EDuration="4.742745518s" podCreationTimestamp="2026-01-21 00:23:02 +0000 UTC" firstStartedPulling="2026-01-21 00:23:03.642551795 +0000 UTC m=+6455.962812073" lastFinishedPulling="2026-01-21 00:23:04.467758285 +0000 UTC m=+6456.788018563" observedRunningTime="2026-01-21 00:23:06.740116085 +0000 UTC m=+6459.060376373" watchObservedRunningTime="2026-01-21 00:23:06.742745518 +0000 UTC m=+6459.063005806" Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.034905 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.121186 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.320573 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.685861 5030 generic.go:334] "Generic (PLEG): container finished" podID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerID="e5c76fade9d56f705e934b1239b542d08c1aed1713f5c0e36100b29dbde26be3" exitCode=143 Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.685955 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerDied","Data":"e5c76fade9d56f705e934b1239b542d08c1aed1713f5c0e36100b29dbde26be3"} Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.687784 5030 generic.go:334] "Generic (PLEG): container finished" podID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerID="f9b81a1d14551f5d7202ece2da011aa043d3d485bfcc7e76ca93afed325bf829" exitCode=143 Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.687865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerDied","Data":"f9b81a1d14551f5d7202ece2da011aa043d3d485bfcc7e76ca93afed325bf829"} Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.689324 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerID="e3cc4bf99227158bdae8763a00722f9cc15c09290fa146bcd19dbf2bd9a7b163" exitCode=143 Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.689879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerDied","Data":"e3cc4bf99227158bdae8763a00722f9cc15c09290fa146bcd19dbf2bd9a7b163"} Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.690347 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api-log" containerID="cri-o://a29a219ed889f9c2a16f8449476d999818a02e6bccd8b17e1f56f0c33130f249" gracePeriod=30 Jan 21 00:23:07 crc kubenswrapper[5030]: I0121 00:23:07.691065 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" 
podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api" containerID="cri-o://1eed9e5fcb657e295f02eb1548116521353aeb8fa56707eea44c2f5ad7c7e439" gracePeriod=30 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.545441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-g78qf"] Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.554655 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-g78qf"] Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.563957 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbicand7f7-account-delete-lk46v"] Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.564876 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.578142 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicand7f7-account-delete-lk46v"] Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.665680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.665726 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9c6\" (UniqueName: \"kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.697883 5030 generic.go:334] "Generic (PLEG): container finished" podID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerID="488cf71ce904ba65bfbbd66a025830faa669f188e1f004c6f1820bfd1f9a3e2e" exitCode=0 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.697961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerDied","Data":"488cf71ce904ba65bfbbd66a025830faa669f188e1f004c6f1820bfd1f9a3e2e"} Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.700966 5030 generic.go:334] "Generic (PLEG): container finished" podID="104fd241-bda1-4f9a-99a5-17f913975981" containerID="a29a219ed889f9c2a16f8449476d999818a02e6bccd8b17e1f56f0c33130f249" exitCode=143 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.701047 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerDied","Data":"a29a219ed889f9c2a16f8449476d999818a02e6bccd8b17e1f56f0c33130f249"} Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.701324 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener-log" containerID="cri-o://433204c6eec430cbb10390cefa5fd5db439270c4694cc72117c33a73472a48d2" gracePeriod=30 Jan 21 00:23:08 crc kubenswrapper[5030]: 
I0121 00:23:08.701412 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker-log" containerID="cri-o://c214a0ed31fc8abe049a628d2f86c356902f4367e7de355914cc7fbdbaee9372" gracePeriod=30 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.701495 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker" containerID="cri-o://f8449311aba6960e9776f5d128658a0f2296ffb181af1b38a4c51523946b7c93" gracePeriod=30 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.701342 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener" containerID="cri-o://64525601efc78dd745920bd43f8b888d269f03d8f7a926c0c7ac47b3ab5f942e" gracePeriod=30 Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.767046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.767111 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9c6\" (UniqueName: \"kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.768550 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.789688 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9c6\" (UniqueName: \"kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6\") pod \"barbicand7f7-account-delete-lk46v\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:08 crc kubenswrapper[5030]: I0121 00:23:08.884274 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.334546 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbicand7f7-account-delete-lk46v"] Jan 21 00:23:09 crc kubenswrapper[5030]: W0121 00:23:09.341257 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b49b609_47f9_4f9e_98ba_04775e57464f.slice/crio-b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1 WatchSource:0}: Error finding container b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1: Status 404 returned error can't find the container with id b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1 Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.540286 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.709650 5030 generic.go:334] "Generic (PLEG): container finished" podID="a527ae0b-f731-403e-b261-9092b798f544" containerID="c214a0ed31fc8abe049a628d2f86c356902f4367e7de355914cc7fbdbaee9372" exitCode=143 Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.709712 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerDied","Data":"c214a0ed31fc8abe049a628d2f86c356902f4367e7de355914cc7fbdbaee9372"} Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.711231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" event={"ID":"6b49b609-47f9-4f9e-98ba-04775e57464f","Type":"ContainerStarted","Data":"b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1"} Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.713211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" event={"ID":"a1ec622f-3921-4633-a2d4-e29f771dd532","Type":"ContainerDied","Data":"dfeb54aef3facf25f13f13516125d56a0629a19f762bca6d2ef8fae9c6885c74"} Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.713246 5030 scope.go:117] "RemoveContainer" containerID="488cf71ce904ba65bfbbd66a025830faa669f188e1f004c6f1820bfd1f9a3e2e" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.713249 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.715500 5030 generic.go:334] "Generic (PLEG): container finished" podID="104fd241-bda1-4f9a-99a5-17f913975981" containerID="1eed9e5fcb657e295f02eb1548116521353aeb8fa56707eea44c2f5ad7c7e439" exitCode=0 Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.715541 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerDied","Data":"1eed9e5fcb657e295f02eb1548116521353aeb8fa56707eea44c2f5ad7c7e439"} Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.720744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs\") pod \"a1ec622f-3921-4633-a2d4-e29f771dd532\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.720792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data\") pod \"a1ec622f-3921-4633-a2d4-e29f771dd532\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.720981 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom\") pod \"a1ec622f-3921-4633-a2d4-e29f771dd532\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.721035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmm62\" (UniqueName: \"kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62\") pod \"a1ec622f-3921-4633-a2d4-e29f771dd532\" (UID: \"a1ec622f-3921-4633-a2d4-e29f771dd532\") " Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.721509 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs" (OuterVolumeSpecName: "logs") pod "a1ec622f-3921-4633-a2d4-e29f771dd532" (UID: "a1ec622f-3921-4633-a2d4-e29f771dd532"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.726743 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a1ec622f-3921-4633-a2d4-e29f771dd532" (UID: "a1ec622f-3921-4633-a2d4-e29f771dd532"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.730016 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62" (OuterVolumeSpecName: "kube-api-access-tmm62") pod "a1ec622f-3921-4633-a2d4-e29f771dd532" (UID: "a1ec622f-3921-4633-a2d4-e29f771dd532"). InnerVolumeSpecName "kube-api-access-tmm62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.734882 5030 scope.go:117] "RemoveContainer" containerID="f9b81a1d14551f5d7202ece2da011aa043d3d485bfcc7e76ca93afed325bf829" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.766264 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data" (OuterVolumeSpecName: "config-data") pod "a1ec622f-3921-4633-a2d4-e29f771dd532" (UID: "a1ec622f-3921-4633-a2d4-e29f771dd532"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.824448 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.824869 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmm62\" (UniqueName: \"kubernetes.io/projected/a1ec622f-3921-4633-a2d4-e29f771dd532-kube-api-access-tmm62\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.824886 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ec622f-3921-4633-a2d4-e29f771dd532-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.824900 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1ec622f-3921-4633-a2d4-e29f771dd532-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:09 crc kubenswrapper[5030]: I0121 00:23:09.971026 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc08d4c3-b17a-49b4-891f-1178e802d65a" path="/var/lib/kubelet/pods/cc08d4c3-b17a-49b4-891f-1178e802d65a/volumes" Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.132703 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.140003 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-xfdxs"] Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.157779 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.157852 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.157905 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.158701 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575"} 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.158776 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" gracePeriod=600 Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.726997 5030 generic.go:334] "Generic (PLEG): container finished" podID="a527ae0b-f731-403e-b261-9092b798f544" containerID="f8449311aba6960e9776f5d128658a0f2296ffb181af1b38a4c51523946b7c93" exitCode=0 Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.727067 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerDied","Data":"f8449311aba6960e9776f5d128658a0f2296ffb181af1b38a4c51523946b7c93"} Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.731508 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerID="433204c6eec430cbb10390cefa5fd5db439270c4694cc72117c33a73472a48d2" exitCode=143 Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.731639 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerDied","Data":"433204c6eec430cbb10390cefa5fd5db439270c4694cc72117c33a73472a48d2"} Jan 21 00:23:10 crc kubenswrapper[5030]: I0121 00:23:10.733346 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" event={"ID":"6b49b609-47f9-4f9e-98ba-04775e57464f","Type":"ContainerStarted","Data":"321aeec566632b05c5a0effe066112c99f2f9161e670b82f2a4f8a4e2d3081be"} Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.279878 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.282314 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.351470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjp6k\" (UniqueName: \"kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k\") pod \"104fd241-bda1-4f9a-99a5-17f913975981\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.351703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs\") pod \"104fd241-bda1-4f9a-99a5-17f913975981\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.351735 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data\") pod \"104fd241-bda1-4f9a-99a5-17f913975981\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.351760 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom\") pod \"104fd241-bda1-4f9a-99a5-17f913975981\" (UID: \"104fd241-bda1-4f9a-99a5-17f913975981\") " Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.353006 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs" (OuterVolumeSpecName: "logs") pod "104fd241-bda1-4f9a-99a5-17f913975981" (UID: "104fd241-bda1-4f9a-99a5-17f913975981"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.356813 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "104fd241-bda1-4f9a-99a5-17f913975981" (UID: "104fd241-bda1-4f9a-99a5-17f913975981"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.356880 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k" (OuterVolumeSpecName: "kube-api-access-vjp6k") pod "104fd241-bda1-4f9a-99a5-17f913975981" (UID: "104fd241-bda1-4f9a-99a5-17f913975981"). InnerVolumeSpecName "kube-api-access-vjp6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.395307 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data" (OuterVolumeSpecName: "config-data") pod "104fd241-bda1-4f9a-99a5-17f913975981" (UID: "104fd241-bda1-4f9a-99a5-17f913975981"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.453804 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.453860 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjp6k\" (UniqueName: \"kubernetes.io/projected/104fd241-bda1-4f9a-99a5-17f913975981-kube-api-access-vjp6k\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.453875 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104fd241-bda1-4f9a-99a5-17f913975981-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.453894 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104fd241-bda1-4f9a-99a5-17f913975981-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.743898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" event={"ID":"104fd241-bda1-4f9a-99a5-17f913975981","Type":"ContainerDied","Data":"e96d4b3f48b588348de0118caae00030ea24e0e6c63d6dd67ab1f34546ba579f"} Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.743974 5030 scope.go:117] "RemoveContainer" containerID="1eed9e5fcb657e295f02eb1548116521353aeb8fa56707eea44c2f5ad7c7e439" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.744008 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.789235 5030 scope.go:117] "RemoveContainer" containerID="a29a219ed889f9c2a16f8449476d999818a02e6bccd8b17e1f56f0c33130f249" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.792042 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.798552 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-564dbbdd96-b77ld"] Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.976649 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="104fd241-bda1-4f9a-99a5-17f913975981" path="/var/lib/kubelet/pods/104fd241-bda1-4f9a-99a5-17f913975981/volumes" Jan 21 00:23:11 crc kubenswrapper[5030]: I0121 00:23:11.977279 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" path="/var/lib/kubelet/pods/a1ec622f-3921-4633-a2d4-e29f771dd532/volumes" Jan 21 00:23:12 crc kubenswrapper[5030]: I0121 00:23:12.753988 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerID="64525601efc78dd745920bd43f8b888d269f03d8f7a926c0c7ac47b3ab5f942e" exitCode=0 Jan 21 00:23:12 crc kubenswrapper[5030]: I0121 00:23:12.754066 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerDied","Data":"64525601efc78dd745920bd43f8b888d269f03d8f7a926c0c7ac47b3ab5f942e"} Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.197842 5030 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs\") pod \"a527ae0b-f731-403e-b261-9092b798f544\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284586 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom\") pod \"a527ae0b-f731-403e-b261-9092b798f544\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh7jv\" (UniqueName: \"kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv\") pod \"a527ae0b-f731-403e-b261-9092b798f544\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284721 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data\") pod \"a527ae0b-f731-403e-b261-9092b798f544\" (UID: \"a527ae0b-f731-403e-b261-9092b798f544\") " Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs" (OuterVolumeSpecName: "logs") pod "a527ae0b-f731-403e-b261-9092b798f544" (UID: "a527ae0b-f731-403e-b261-9092b798f544"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.284978 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ae0b-f731-403e-b261-9092b798f544-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.294134 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a527ae0b-f731-403e-b261-9092b798f544" (UID: "a527ae0b-f731-403e-b261-9092b798f544"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.294160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv" (OuterVolumeSpecName: "kube-api-access-nh7jv") pod "a527ae0b-f731-403e-b261-9092b798f544" (UID: "a527ae0b-f731-403e-b261-9092b798f544"). InnerVolumeSpecName "kube-api-access-nh7jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.325649 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data" (OuterVolumeSpecName: "config-data") pod "a527ae0b-f731-403e-b261-9092b798f544" (UID: "a527ae0b-f731-403e-b261-9092b798f544"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.386279 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.386340 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a527ae0b-f731-403e-b261-9092b798f544-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.386365 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh7jv\" (UniqueName: \"kubernetes.io/projected/a527ae0b-f731-403e-b261-9092b798f544-kube-api-access-nh7jv\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.777765 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" exitCode=0 Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.777925 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575"} Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.778035 5030 scope.go:117] "RemoveContainer" containerID="25666b6d120603c062926c4e08877d49612e752e6df3c058aacbe8d2325dd00b" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.784493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" event={"ID":"a527ae0b-f731-403e-b261-9092b798f544","Type":"ContainerDied","Data":"ea751a9a4cf0d7d15ecfae46bae1514d3e65ef385480f60ba42195960fcf5990"} Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.784536 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.810578 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" podStartSLOduration=5.810556458 podStartE2EDuration="5.810556458s" podCreationTimestamp="2026-01-21 00:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:13.797933716 +0000 UTC m=+6466.118194024" watchObservedRunningTime="2026-01-21 00:23:13.810556458 +0000 UTC m=+6466.130816756" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.833834 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.848157 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-wsfw6"] Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.947180 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:13 crc kubenswrapper[5030]: I0121 00:23:13.980095 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a527ae0b-f731-403e-b261-9092b798f544" path="/var/lib/kubelet/pods/a527ae0b-f731-403e-b261-9092b798f544/volumes" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.097024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom\") pod \"3f3e9f82-8640-4645-b3da-d678830b75cb\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.097091 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qkhx\" (UniqueName: \"kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx\") pod \"3f3e9f82-8640-4645-b3da-d678830b75cb\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.097150 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs\") pod \"3f3e9f82-8640-4645-b3da-d678830b75cb\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.097211 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data\") pod \"3f3e9f82-8640-4645-b3da-d678830b75cb\" (UID: \"3f3e9f82-8640-4645-b3da-d678830b75cb\") " Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.099574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs" (OuterVolumeSpecName: "logs") pod "3f3e9f82-8640-4645-b3da-d678830b75cb" (UID: "3f3e9f82-8640-4645-b3da-d678830b75cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.101803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f3e9f82-8640-4645-b3da-d678830b75cb" (UID: "3f3e9f82-8640-4645-b3da-d678830b75cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.102187 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx" (OuterVolumeSpecName: "kube-api-access-9qkhx") pod "3f3e9f82-8640-4645-b3da-d678830b75cb" (UID: "3f3e9f82-8640-4645-b3da-d678830b75cb"). InnerVolumeSpecName "kube-api-access-9qkhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.131854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data" (OuterVolumeSpecName: "config-data") pod "3f3e9f82-8640-4645-b3da-d678830b75cb" (UID: "3f3e9f82-8640-4645-b3da-d678830b75cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:14 crc kubenswrapper[5030]: E0121 00:23:14.191978 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.199324 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.199394 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qkhx\" (UniqueName: \"kubernetes.io/projected/3f3e9f82-8640-4645-b3da-d678830b75cb-kube-api-access-9qkhx\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.199413 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f3e9f82-8640-4645-b3da-d678830b75cb-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.199429 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f3e9f82-8640-4645-b3da-d678830b75cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.204139 5030 scope.go:117] "RemoveContainer" containerID="f8449311aba6960e9776f5d128658a0f2296ffb181af1b38a4c51523946b7c93" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.229825 5030 scope.go:117] "RemoveContainer" containerID="c214a0ed31fc8abe049a628d2f86c356902f4367e7de355914cc7fbdbaee9372" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.792993 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:23:14 crc kubenswrapper[5030]: E0121 00:23:14.793209 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.795823 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" event={"ID":"3f3e9f82-8640-4645-b3da-d678830b75cb","Type":"ContainerDied","Data":"9f1e58939c2cd05b0f432f845686a20cfab3c881b04aa6585198141ae706b3b4"} Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.795860 5030 scope.go:117] "RemoveContainer" containerID="64525601efc78dd745920bd43f8b888d269f03d8f7a926c0c7ac47b3ab5f942e" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.795935 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.800580 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b49b609-47f9-4f9e-98ba-04775e57464f" containerID="321aeec566632b05c5a0effe066112c99f2f9161e670b82f2a4f8a4e2d3081be" exitCode=0 Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.800644 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" event={"ID":"6b49b609-47f9-4f9e-98ba-04775e57464f","Type":"ContainerDied","Data":"321aeec566632b05c5a0effe066112c99f2f9161e670b82f2a4f8a4e2d3081be"} Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.817372 5030 scope.go:117] "RemoveContainer" containerID="433204c6eec430cbb10390cefa5fd5db439270c4694cc72117c33a73472a48d2" Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.870586 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:14 crc kubenswrapper[5030]: I0121 00:23:14.876585 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-s8mlw"] Jan 21 00:23:15 crc kubenswrapper[5030]: I0121 00:23:15.976330 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" path="/var/lib/kubelet/pods/3f3e9f82-8640-4645-b3da-d678830b75cb/volumes" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.174313 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.335395 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx9c6\" (UniqueName: \"kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6\") pod \"6b49b609-47f9-4f9e-98ba-04775e57464f\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.335462 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts\") pod \"6b49b609-47f9-4f9e-98ba-04775e57464f\" (UID: \"6b49b609-47f9-4f9e-98ba-04775e57464f\") " Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.336098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b49b609-47f9-4f9e-98ba-04775e57464f" (UID: "6b49b609-47f9-4f9e-98ba-04775e57464f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.349789 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6" (OuterVolumeSpecName: "kube-api-access-mx9c6") pod "6b49b609-47f9-4f9e-98ba-04775e57464f" (UID: "6b49b609-47f9-4f9e-98ba-04775e57464f"). InnerVolumeSpecName "kube-api-access-mx9c6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.436826 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b49b609-47f9-4f9e-98ba-04775e57464f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.436872 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx9c6\" (UniqueName: \"kubernetes.io/projected/6b49b609-47f9-4f9e-98ba-04775e57464f-kube-api-access-mx9c6\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.821491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" event={"ID":"6b49b609-47f9-4f9e-98ba-04775e57464f","Type":"ContainerDied","Data":"b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1"} Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.821908 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2648e6c3004a11c365a3e851c7c1c07cf3ebecebb86b7a8e05a998d69efb0a1" Jan 21 00:23:16 crc kubenswrapper[5030]: I0121 00:23:16.821540 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbicand7f7-account-delete-lk46v" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.591902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-2thtv"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.601905 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-2thtv"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.614295 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbicand7f7-account-delete-lk46v"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.622377 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.630100 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbicand7f7-account-delete-lk46v"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.638662 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-d7f7-account-create-update-x4j5p"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.684828 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-create-n4mfq"] Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685318 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.685410 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685483 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.685533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker-log" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685588 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.685662 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685737 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.685788 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685849 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.685898 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.685952 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686016 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.686081 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener-log" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.686191 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker" Jan 21 00:23:18 crc kubenswrapper[5030]: E0121 00:23:18.686314 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b49b609-47f9-4f9e-98ba-04775e57464f" containerName="mariadb-account-delete" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686377 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b49b609-47f9-4f9e-98ba-04775e57464f" containerName="mariadb-account-delete" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686647 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.686753 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687025 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b49b609-47f9-4f9e-98ba-04775e57464f" containerName="mariadb-account-delete" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687085 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api" Jan 21 00:23:18 crc 
kubenswrapper[5030]: I0121 00:23:18.687143 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687198 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ec622f-3921-4633-a2d4-e29f771dd532" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687248 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="104fd241-bda1-4f9a-99a5-17f913975981" containerName="barbican-api-log" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687299 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a527ae0b-f731-403e-b261-9092b798f544" containerName="barbican-worker" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687355 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3e9f82-8640-4645-b3da-d678830b75cb" containerName="barbican-keystone-listener" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.687971 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.692830 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-n4mfq"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.768920 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blttg\" (UniqueName: \"kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.769045 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.798119 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-051f-account-create-update-scdqw"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.799213 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.801172 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-db-secret" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.804364 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-051f-account-create-update-scdqw"] Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.870987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.871046 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrmx\" (UniqueName: \"kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.871135 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.871245 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blttg\" (UniqueName: \"kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.871953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.888566 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blttg\" (UniqueName: \"kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg\") pod \"barbican-db-create-n4mfq\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.972745 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.972805 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrmx\" 
(UniqueName: \"kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.973487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:18 crc kubenswrapper[5030]: I0121 00:23:18.991197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrmx\" (UniqueName: \"kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx\") pod \"barbican-051f-account-create-update-scdqw\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.005973 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.113024 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.551390 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-n4mfq"] Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.620952 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-051f-account-create-update-scdqw"] Jan 21 00:23:19 crc kubenswrapper[5030]: W0121 00:23:19.625348 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb98e963_8539_4554_aaeb_9624f18d9ec0.slice/crio-7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1 WatchSource:0}: Error finding container 7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1: Status 404 returned error can't find the container with id 7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1 Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.844821 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" event={"ID":"db98e963-8539-4554-aaeb-9624f18d9ec0","Type":"ContainerStarted","Data":"0a0ef5b928090da58165f3a969293287ea6eacde529e8f548b3ae25dfe8d3364"} Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.845360 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" event={"ID":"db98e963-8539-4554-aaeb-9624f18d9ec0","Type":"ContainerStarted","Data":"7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1"} Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.846457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" event={"ID":"eb686269-ac6c-4fab-9b4e-0e5187e7e391","Type":"ContainerStarted","Data":"f66814a9de46d7736f6c47ceed05542d04c0b9c09a550511cd01151c616c4ede"} Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.846491 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="barbican-kuttl-tests/barbican-db-create-n4mfq" event={"ID":"eb686269-ac6c-4fab-9b4e-0e5187e7e391","Type":"ContainerStarted","Data":"3eacb8ed8f0bda8a88bc0f053ca1ffcaacb074ff0b0a4df86801d6a83f5cea7a"} Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.865057 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" podStartSLOduration=1.8650354660000001 podStartE2EDuration="1.865035466s" podCreationTimestamp="2026-01-21 00:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:19.858458428 +0000 UTC m=+6472.178718716" watchObservedRunningTime="2026-01-21 00:23:19.865035466 +0000 UTC m=+6472.185295754" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.879013 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" podStartSLOduration=1.878993959 podStartE2EDuration="1.878993959s" podCreationTimestamp="2026-01-21 00:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:19.873534929 +0000 UTC m=+6472.193795237" watchObservedRunningTime="2026-01-21 00:23:19.878993959 +0000 UTC m=+6472.199254247" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.972257 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0735a7-8329-4e49-ae6d-f67fdccf1ddc" path="/var/lib/kubelet/pods/3c0735a7-8329-4e49-ae6d-f67fdccf1ddc/volumes" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.972941 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b49b609-47f9-4f9e-98ba-04775e57464f" path="/var/lib/kubelet/pods/6b49b609-47f9-4f9e-98ba-04775e57464f/volumes" Jan 21 00:23:19 crc kubenswrapper[5030]: I0121 00:23:19.973514 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725c2e18-6911-427f-acef-152827de9010" path="/var/lib/kubelet/pods/725c2e18-6911-427f-acef-152827de9010/volumes" Jan 21 00:23:20 crc kubenswrapper[5030]: I0121 00:23:20.855340 5030 generic.go:334] "Generic (PLEG): container finished" podID="db98e963-8539-4554-aaeb-9624f18d9ec0" containerID="0a0ef5b928090da58165f3a969293287ea6eacde529e8f548b3ae25dfe8d3364" exitCode=0 Jan 21 00:23:20 crc kubenswrapper[5030]: I0121 00:23:20.855409 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" event={"ID":"db98e963-8539-4554-aaeb-9624f18d9ec0","Type":"ContainerDied","Data":"0a0ef5b928090da58165f3a969293287ea6eacde529e8f548b3ae25dfe8d3364"} Jan 21 00:23:20 crc kubenswrapper[5030]: I0121 00:23:20.858049 5030 generic.go:334] "Generic (PLEG): container finished" podID="eb686269-ac6c-4fab-9b4e-0e5187e7e391" containerID="f66814a9de46d7736f6c47ceed05542d04c0b9c09a550511cd01151c616c4ede" exitCode=0 Jan 21 00:23:20 crc kubenswrapper[5030]: I0121 00:23:20.858113 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" event={"ID":"eb686269-ac6c-4fab-9b4e-0e5187e7e391","Type":"ContainerDied","Data":"f66814a9de46d7736f6c47ceed05542d04c0b9c09a550511cd01151c616c4ede"} Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.246916 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.259498 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.428076 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blttg\" (UniqueName: \"kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg\") pod \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.428337 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts\") pod \"db98e963-8539-4554-aaeb-9624f18d9ec0\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.428386 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts\") pod \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\" (UID: \"eb686269-ac6c-4fab-9b4e-0e5187e7e391\") " Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.428443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrmx\" (UniqueName: \"kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx\") pod \"db98e963-8539-4554-aaeb-9624f18d9ec0\" (UID: \"db98e963-8539-4554-aaeb-9624f18d9ec0\") " Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.429125 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb686269-ac6c-4fab-9b4e-0e5187e7e391" (UID: "eb686269-ac6c-4fab-9b4e-0e5187e7e391"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.429289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db98e963-8539-4554-aaeb-9624f18d9ec0" (UID: "db98e963-8539-4554-aaeb-9624f18d9ec0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.435596 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx" (OuterVolumeSpecName: "kube-api-access-dkrmx") pod "db98e963-8539-4554-aaeb-9624f18d9ec0" (UID: "db98e963-8539-4554-aaeb-9624f18d9ec0"). InnerVolumeSpecName "kube-api-access-dkrmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.437394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg" (OuterVolumeSpecName: "kube-api-access-blttg") pod "eb686269-ac6c-4fab-9b4e-0e5187e7e391" (UID: "eb686269-ac6c-4fab-9b4e-0e5187e7e391"). InnerVolumeSpecName "kube-api-access-blttg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.529947 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blttg\" (UniqueName: \"kubernetes.io/projected/eb686269-ac6c-4fab-9b4e-0e5187e7e391-kube-api-access-blttg\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.529983 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db98e963-8539-4554-aaeb-9624f18d9ec0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.529994 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb686269-ac6c-4fab-9b4e-0e5187e7e391-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.530003 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrmx\" (UniqueName: \"kubernetes.io/projected/db98e963-8539-4554-aaeb-9624f18d9ec0-kube-api-access-dkrmx\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.882689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" event={"ID":"db98e963-8539-4554-aaeb-9624f18d9ec0","Type":"ContainerDied","Data":"7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1"} Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.882731 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7162ddb1f14cf9c1ba760d2b48d51fb4a23329091f4b7e255a6df3358e7b96f1" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.882798 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-051f-account-create-update-scdqw" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.884838 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" event={"ID":"eb686269-ac6c-4fab-9b4e-0e5187e7e391","Type":"ContainerDied","Data":"3eacb8ed8f0bda8a88bc0f053ca1ffcaacb074ff0b0a4df86801d6a83f5cea7a"} Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.884883 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eacb8ed8f0bda8a88bc0f053ca1ffcaacb074ff0b0a4df86801d6a83f5cea7a" Jan 21 00:23:22 crc kubenswrapper[5030]: I0121 00:23:22.884946 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-create-n4mfq" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.049054 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fdfsw"] Jan 21 00:23:24 crc kubenswrapper[5030]: E0121 00:23:24.049644 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb686269-ac6c-4fab-9b4e-0e5187e7e391" containerName="mariadb-database-create" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.049657 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb686269-ac6c-4fab-9b4e-0e5187e7e391" containerName="mariadb-database-create" Jan 21 00:23:24 crc kubenswrapper[5030]: E0121 00:23:24.049685 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db98e963-8539-4554-aaeb-9624f18d9ec0" containerName="mariadb-account-create-update" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.049692 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="db98e963-8539-4554-aaeb-9624f18d9ec0" containerName="mariadb-account-create-update" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.049821 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="db98e963-8539-4554-aaeb-9624f18d9ec0" containerName="mariadb-account-create-update" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.049844 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb686269-ac6c-4fab-9b4e-0e5187e7e391" containerName="mariadb-database-create" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.050337 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.052145 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-gz78d" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.052732 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.060032 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fdfsw"] Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.152846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.152932 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.153005 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkp9s\" (UniqueName: \"kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.254561 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkp9s\" (UniqueName: \"kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.254725 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.254774 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.259658 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.261016 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.273891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkp9s\" (UniqueName: \"kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s\") pod \"barbican-db-sync-fdfsw\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.427678 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.843960 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fdfsw"] Jan 21 00:23:24 crc kubenswrapper[5030]: W0121 00:23:24.845205 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ae5ba5_d0d0_47bc_86d5_ea30ad1ad48c.slice/crio-d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66 WatchSource:0}: Error finding container d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66: Status 404 returned error can't find the container with id d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66 Jan 21 00:23:24 crc kubenswrapper[5030]: I0121 00:23:24.901797 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" event={"ID":"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c","Type":"ContainerStarted","Data":"d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66"} Jan 21 00:23:25 crc kubenswrapper[5030]: I0121 00:23:25.911610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" event={"ID":"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c","Type":"ContainerStarted","Data":"77cbef79247a9104c4a4c6cac7f81b37a3df69ae6fc58ae4d4fb5f11d2c53f32"} Jan 21 00:23:25 crc kubenswrapper[5030]: I0121 00:23:25.929134 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" podStartSLOduration=1.929116442 podStartE2EDuration="1.929116442s" podCreationTimestamp="2026-01-21 00:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:25.925965796 +0000 UTC m=+6478.246226084" watchObservedRunningTime="2026-01-21 00:23:25.929116442 +0000 UTC m=+6478.249376730" Jan 21 00:23:26 crc kubenswrapper[5030]: I0121 00:23:26.921780 5030 generic.go:334] "Generic (PLEG): container finished" podID="27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" containerID="77cbef79247a9104c4a4c6cac7f81b37a3df69ae6fc58ae4d4fb5f11d2c53f32" exitCode=0 Jan 21 00:23:26 crc kubenswrapper[5030]: I0121 00:23:26.921871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" event={"ID":"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c","Type":"ContainerDied","Data":"77cbef79247a9104c4a4c6cac7f81b37a3df69ae6fc58ae4d4fb5f11d2c53f32"} Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.274283 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.421630 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkp9s\" (UniqueName: \"kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s\") pod \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.421754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle\") pod \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.421882 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data\") pod \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\" (UID: \"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c\") " Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.426885 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" (UID: "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.427133 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s" (OuterVolumeSpecName: "kube-api-access-wkp9s") pod "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" (UID: "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c"). InnerVolumeSpecName "kube-api-access-wkp9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.442774 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" (UID: "27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.524319 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.524371 5030 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.524390 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkp9s\" (UniqueName: \"kubernetes.io/projected/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c-kube-api-access-wkp9s\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.936674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" event={"ID":"27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c","Type":"ContainerDied","Data":"d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66"} Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.936709 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-db-sync-fdfsw" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.936718 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d724443a6fcf18ca3f290953fe6b38c5cec540406a0b7e1ee1040593b0adea66" Jan 21 00:23:28 crc kubenswrapper[5030]: I0121 00:23:28.962718 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:23:28 crc kubenswrapper[5030]: E0121 00:23:28.962995 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.210067 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:29 crc kubenswrapper[5030]: E0121 00:23:29.210373 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" containerName="barbican-db-sync" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.210388 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" containerName="barbican-db-sync" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.210521 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" containerName="barbican-db-sync" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.211259 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.213863 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"combined-ca-bundle" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.214450 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-barbican-dockercfg-gz78d" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.231986 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.233516 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.238392 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240131 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240302 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240579 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrps\" (UniqueName: \"kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: 
\"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240637 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240673 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nr65\" (UniqueName: \"kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.240809 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.282186 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.293125 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.301957 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.305523 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-internal-svc" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.305832 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"cert-barbican-public-svc" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.314181 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"barbican-api-config-data" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.332828 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342061 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342136 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342162 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rvr\" (UniqueName: \"kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342200 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342229 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342251 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342283 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342327 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342349 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrps\" (UniqueName: \"kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342368 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342385 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342403 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342423 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nr65\" (UniqueName: \"kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: 
\"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342443 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.342464 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.346069 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.349095 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.349291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.349443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.351765 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.354321 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.355055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.367089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nr65\" (UniqueName: \"kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65\") pod \"barbican-worker-85784bf4dc-2lm7r\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.368214 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrps\" (UniqueName: \"kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.372837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom\") pod \"barbican-keystone-listener-66b76d7c46-rp7jw\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443673 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443783 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443817 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443838 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc 
kubenswrapper[5030]: I0121 00:23:29.443863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rvr\" (UniqueName: \"kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.443906 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.444674 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.448915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.450073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.450764 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.450999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.451393 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.465276 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rvr\" (UniqueName: \"kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr\") pod \"barbican-api-76c59dc47b-hfwgd\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " 
pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.540596 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.553980 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.621457 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:29 crc kubenswrapper[5030]: I0121 00:23:29.975388 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.045578 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.105848 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.267639 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fdfsw"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.277570 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-sync-fdfsw"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.313930 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/barbican051f-account-delete-gt4q6"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.315435 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.325469 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican051f-account-delete-gt4q6"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.334312 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.343776 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.353330 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.358437 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.358523 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7dj\" (UniqueName: \"kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.460254 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7dj\" (UniqueName: \"kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.460459 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.461238 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.486011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7dj\" (UniqueName: \"kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj\") pod \"barbican051f-account-delete-gt4q6\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.754180 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.969884 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerStarted","Data":"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.969932 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerStarted","Data":"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.969944 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerStarted","Data":"a4dca8ecef274d7ee909bfb07263419e7c93750e10368a42ca0e781ee482f78d"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.970077 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker-log" containerID="cri-o://ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.970489 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker" containerID="cri-o://e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981431 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerStarted","Data":"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerStarted","Data":"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerStarted","Data":"f8fb2b91c21e1da53874c2be807d5ee6b48371a32ebe67628a8d3fd20063f9eb"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981632 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api-log" containerID="cri-o://c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981724 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api" containerID="cri-o://92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.981855 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.988517 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerStarted","Data":"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.988560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerStarted","Data":"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.988573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerStarted","Data":"d9208312f3b1ace32760d507be2c9325f8e18615dd5448ced48ad47bda268f53"} Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.988707 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener-log" containerID="cri-o://6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.988829 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener" containerID="cri-o://e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244" gracePeriod=30 Jan 21 00:23:30 crc kubenswrapper[5030]: I0121 00:23:30.997878 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" podStartSLOduration=1.997852313 podStartE2EDuration="1.997852313s" podCreationTimestamp="2026-01-21 00:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:30.98934739 +0000 UTC m=+6483.309607688" watchObservedRunningTime="2026-01-21 00:23:30.997852313 +0000 UTC m=+6483.318112611" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.035478 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" podStartSLOduration=2.035458902 podStartE2EDuration="2.035458902s" podCreationTimestamp="2026-01-21 00:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:31.026669211 +0000 UTC m=+6483.346929499" watchObservedRunningTime="2026-01-21 00:23:31.035458902 +0000 UTC m=+6483.355719190" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.055357 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" podStartSLOduration=2.055340507 podStartE2EDuration="2.055340507s" podCreationTimestamp="2026-01-21 00:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:31.044466997 +0000 UTC m=+6483.364727285" watchObservedRunningTime="2026-01-21 00:23:31.055340507 +0000 UTC m=+6483.375600795" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.132279 5030 scope.go:117] "RemoveContainer" containerID="e73bcaa6e8845c42c762d3ac37774ce0b650251773b18e1b5219e96b854a79f7" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.195048 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/barbican051f-account-delete-gt4q6"] Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.813524 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.886375 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7rvr\" (UniqueName: \"kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.886430 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.886484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.886502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.886527 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.887160 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.887200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs\") pod \"130aa456-410d-436b-aa9f-6f56ace1354a\" (UID: \"130aa456-410d-436b-aa9f-6f56ace1354a\") " Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 
00:23:31.887700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs" (OuterVolumeSpecName: "logs") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.891489 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.891557 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr" (OuterVolumeSpecName: "kube-api-access-m7rvr") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "kube-api-access-m7rvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.919916 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.923884 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.925068 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data" (OuterVolumeSpecName: "config-data") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.938283 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "130aa456-410d-436b-aa9f-6f56ace1354a" (UID: "130aa456-410d-436b-aa9f-6f56ace1354a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.982730 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c" path="/var/lib/kubelet/pods/27ae5ba5-d0d0-47bc-86d5-ea30ad1ad48c/volumes" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988080 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988220 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/130aa456-410d-436b-aa9f-6f56ace1354a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988285 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7rvr\" (UniqueName: \"kubernetes.io/projected/130aa456-410d-436b-aa9f-6f56ace1354a-kube-api-access-m7rvr\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988375 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988449 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988569 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.988677 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/130aa456-410d-436b-aa9f-6f56ace1354a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.997969 5030 generic.go:334] "Generic (PLEG): container finished" podID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerID="ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8" exitCode=143 Jan 21 00:23:31 crc kubenswrapper[5030]: I0121 00:23:31.998037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerDied","Data":"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000464 5030 generic.go:334] "Generic (PLEG): container finished" podID="130aa456-410d-436b-aa9f-6f56ace1354a" containerID="92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" exitCode=0 Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000560 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerDied","Data":"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000603 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" 
event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerDied","Data":"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000633 5030 scope.go:117] "RemoveContainer" containerID="92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000547 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000777 5030 generic.go:334] "Generic (PLEG): container finished" podID="130aa456-410d-436b-aa9f-6f56ace1354a" containerID="c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" exitCode=143 Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.000947 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd" event={"ID":"130aa456-410d-436b-aa9f-6f56ace1354a","Type":"ContainerDied","Data":"f8fb2b91c21e1da53874c2be807d5ee6b48371a32ebe67628a8d3fd20063f9eb"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.003594 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerID="6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9" exitCode=143 Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.003693 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerDied","Data":"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.005453 5030 generic.go:334] "Generic (PLEG): container finished" podID="fa697694-b7e0-42b9-9911-08d88b66ac82" containerID="29eb6c85004f3f4c0e27be580f3a686a708892e41dfe3e1b9d8b0047281c06ed" exitCode=0 Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.005494 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" event={"ID":"fa697694-b7e0-42b9-9911-08d88b66ac82","Type":"ContainerDied","Data":"29eb6c85004f3f4c0e27be580f3a686a708892e41dfe3e1b9d8b0047281c06ed"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.005518 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" event={"ID":"fa697694-b7e0-42b9-9911-08d88b66ac82","Type":"ContainerStarted","Data":"d8855e851aee07ef0c20626fe967a5835a3d2d73b1e2365d88e848a17e93a51c"} Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.026048 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.033140 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-api-76c59dc47b-hfwgd"] Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.041351 5030 scope.go:117] "RemoveContainer" containerID="c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.060210 5030 scope.go:117] "RemoveContainer" containerID="92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" Jan 21 00:23:32 crc kubenswrapper[5030]: E0121 00:23:32.060697 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735\": container with ID starting with 92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735 not found: ID does not exist" containerID="92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.060736 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735"} err="failed to get container status \"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735\": rpc error: code = NotFound desc = could not find container \"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735\": container with ID starting with 92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735 not found: ID does not exist" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.060760 5030 scope.go:117] "RemoveContainer" containerID="c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" Jan 21 00:23:32 crc kubenswrapper[5030]: E0121 00:23:32.061113 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0\": container with ID starting with c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0 not found: ID does not exist" containerID="c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.061178 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0"} err="failed to get container status \"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0\": rpc error: code = NotFound desc = could not find container \"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0\": container with ID starting with c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0 not found: ID does not exist" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.061212 5030 scope.go:117] "RemoveContainer" containerID="92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.061556 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735"} err="failed to get container status \"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735\": rpc error: code = NotFound desc = could not find container \"92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735\": container with ID starting with 92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735 not found: ID does not exist" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.061582 5030 scope.go:117] "RemoveContainer" containerID="c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.061841 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0"} err="failed to get container status \"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0\": rpc error: code = NotFound desc = could not find container \"c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0\": container with ID starting with 
c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0 not found: ID does not exist" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.540830 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.553299 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697613 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle\") pod \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697701 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcrps\" (UniqueName: \"kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps\") pod \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nr65\" (UniqueName: \"kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65\") pod \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom\") pod \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data\") pod \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697886 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle\") pod \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697911 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom\") pod \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs\") pod \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.697987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs\") pod \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\" (UID: \"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.698012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data\") pod \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\" (UID: \"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19\") " Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.698380 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs" (OuterVolumeSpecName: "logs") pod "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" (UID: "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.698814 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs" (OuterVolumeSpecName: "logs") pod "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" (UID: "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.703171 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" (UID: "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.703505 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps" (OuterVolumeSpecName: "kube-api-access-fcrps") pod "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" (UID: "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19"). InnerVolumeSpecName "kube-api-access-fcrps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.703630 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65" (OuterVolumeSpecName: "kube-api-access-8nr65") pod "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" (UID: "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a"). InnerVolumeSpecName "kube-api-access-8nr65". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.703818 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" (UID: "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.721265 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" (UID: "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.722049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" (UID: "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.736569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data" (OuterVolumeSpecName: "config-data") pod "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" (UID: "c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.741038 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data" (OuterVolumeSpecName: "config-data") pod "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" (UID: "bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799288 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799344 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcrps\" (UniqueName: \"kubernetes.io/projected/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-kube-api-access-fcrps\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799363 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nr65\" (UniqueName: \"kubernetes.io/projected/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-kube-api-access-8nr65\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799375 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799386 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799396 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799413 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799428 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc 
kubenswrapper[5030]: I0121 00:23:32.799440 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:32 crc kubenswrapper[5030]: I0121 00:23:32.799450 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.016963 5030 generic.go:334] "Generic (PLEG): container finished" podID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerID="e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd" exitCode=1 Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.017024 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.017087 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerDied","Data":"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd"} Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.017148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r" event={"ID":"c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a","Type":"ContainerDied","Data":"a4dca8ecef274d7ee909bfb07263419e7c93750e10368a42ca0e781ee482f78d"} Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.017169 5030 scope.go:117] "RemoveContainer" containerID="e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.024589 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerID="e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244" exitCode=1 Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.024963 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.024938 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerDied","Data":"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244"} Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.025071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw" event={"ID":"bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19","Type":"ContainerDied","Data":"d9208312f3b1ace32760d507be2c9325f8e18615dd5448ced48ad47bda268f53"} Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.039367 5030 scope.go:117] "RemoveContainer" containerID="ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.059269 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.068423 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-85784bf4dc-2lm7r"] Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.070124 5030 scope.go:117] "RemoveContainer" containerID="e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd" Jan 21 00:23:33 crc kubenswrapper[5030]: E0121 00:23:33.070593 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd\": container with ID starting with e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd not found: ID does not exist" containerID="e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.070750 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd"} err="failed to get container status \"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd\": rpc error: code = NotFound desc = could not find container \"e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd\": container with ID starting with e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd not found: ID does not exist" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.070882 5030 scope.go:117] "RemoveContainer" containerID="ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8" Jan 21 00:23:33 crc kubenswrapper[5030]: E0121 00:23:33.071232 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8\": container with ID starting with ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8 not found: ID does not exist" containerID="ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.071274 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8"} err="failed to get container status \"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8\": rpc error: code = NotFound 
desc = could not find container \"ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8\": container with ID starting with ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8 not found: ID does not exist" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.071291 5030 scope.go:117] "RemoveContainer" containerID="e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.075805 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.081109 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-66b76d7c46-rp7jw"] Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.091105 5030 scope.go:117] "RemoveContainer" containerID="6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.117287 5030 scope.go:117] "RemoveContainer" containerID="e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244" Jan 21 00:23:33 crc kubenswrapper[5030]: E0121 00:23:33.118644 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244\": container with ID starting with e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244 not found: ID does not exist" containerID="e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.118681 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244"} err="failed to get container status \"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244\": rpc error: code = NotFound desc = could not find container \"e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244\": container with ID starting with e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244 not found: ID does not exist" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.118734 5030 scope.go:117] "RemoveContainer" containerID="6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9" Jan 21 00:23:33 crc kubenswrapper[5030]: E0121 00:23:33.119298 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9\": container with ID starting with 6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9 not found: ID does not exist" containerID="6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.119339 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9"} err="failed to get container status \"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9\": rpc error: code = NotFound desc = could not find container \"6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9\": container with ID starting with 6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9 not found: ID does not exist" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.270450 5030 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.412570 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z7dj\" (UniqueName: \"kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj\") pod \"fa697694-b7e0-42b9-9911-08d88b66ac82\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.413114 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts\") pod \"fa697694-b7e0-42b9-9911-08d88b66ac82\" (UID: \"fa697694-b7e0-42b9-9911-08d88b66ac82\") " Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.413881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa697694-b7e0-42b9-9911-08d88b66ac82" (UID: "fa697694-b7e0-42b9-9911-08d88b66ac82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.417686 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj" (OuterVolumeSpecName: "kube-api-access-6z7dj") pod "fa697694-b7e0-42b9-9911-08d88b66ac82" (UID: "fa697694-b7e0-42b9-9911-08d88b66ac82"). InnerVolumeSpecName "kube-api-access-6z7dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.514770 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z7dj\" (UniqueName: \"kubernetes.io/projected/fa697694-b7e0-42b9-9911-08d88b66ac82-kube-api-access-6z7dj\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.514815 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa697694-b7e0-42b9-9911-08d88b66ac82-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.972297 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" path="/var/lib/kubelet/pods/130aa456-410d-436b-aa9f-6f56ace1354a/volumes" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.973210 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" path="/var/lib/kubelet/pods/bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19/volumes" Jan 21 00:23:33 crc kubenswrapper[5030]: I0121 00:23:33.973949 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" path="/var/lib/kubelet/pods/c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a/volumes" Jan 21 00:23:34 crc kubenswrapper[5030]: I0121 00:23:34.032489 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" event={"ID":"fa697694-b7e0-42b9-9911-08d88b66ac82","Type":"ContainerDied","Data":"d8855e851aee07ef0c20626fe967a5835a3d2d73b1e2365d88e848a17e93a51c"} Jan 21 00:23:34 crc kubenswrapper[5030]: I0121 00:23:34.032529 5030 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d8855e851aee07ef0c20626fe967a5835a3d2d73b1e2365d88e848a17e93a51c" Jan 21 00:23:34 crc kubenswrapper[5030]: I0121 00:23:34.032581 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican051f-account-delete-gt4q6" Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.351556 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-n4mfq"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.360942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican051f-account-delete-gt4q6"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.372477 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-db-create-n4mfq"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.383007 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican051f-account-delete-gt4q6"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.392399 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-051f-account-create-update-scdqw"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.401273 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-051f-account-create-update-scdqw"] Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.973833 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db98e963-8539-4554-aaeb-9624f18d9ec0" path="/var/lib/kubelet/pods/db98e963-8539-4554-aaeb-9624f18d9ec0/volumes" Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.974774 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb686269-ac6c-4fab-9b4e-0e5187e7e391" path="/var/lib/kubelet/pods/eb686269-ac6c-4fab-9b4e-0e5187e7e391/volumes" Jan 21 00:23:35 crc kubenswrapper[5030]: I0121 00:23:35.975471 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa697694-b7e0-42b9-9911-08d88b66ac82" path="/var/lib/kubelet/pods/fa697694-b7e0-42b9-9911-08d88b66ac82/volumes" Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.712325 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa697694_b7e0_42b9_9911_08d88b66ac82.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa697694_b7e0_42b9_9911_08d88b66ac82.slice: no such file or directory Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.712409 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130aa456_410d_436b_aa9f_6f56ace1354a.slice/crio-conmon-92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130aa456_410d_436b_aa9f_6f56ace1354a.slice/crio-conmon-92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735.scope: no such file or directory Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.712435 5030 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130aa456_410d_436b_aa9f_6f56ace1354a.slice/crio-92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130aa456_410d_436b_aa9f_6f56ace1354a.slice/crio-92ce378d8c5ec9463e4fd18882fa00a71bcb870db77d79f14ecc9653bf214735.scope: no such file or directory Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.713972 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc65e0a8a_9877_4ec4_a9f1_2888ddb61d2a.slice/crio-ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8.scope WatchSource:0}: Error finding container ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8: Status 404 returned error can't find the container with id ddc6e88e93007ca56455b003d1bd6c5257af00587c9ec8747b840478302d5bf8 Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.717888 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcae5f9e_55e7_44ec_b08c_7c79e8bfbf19.slice/crio-6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9.scope WatchSource:0}: Error finding container 6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9: Status 404 returned error can't find the container with id 6c89d235cfb2ea806e97bd9f7df06a5d1b24a63f68a2433e44b9deb1a170c9f9 Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.718327 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130aa456_410d_436b_aa9f_6f56ace1354a.slice/crio-c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0.scope WatchSource:0}: Error finding container c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0: Status 404 returned error can't find the container with id c188125c137c5bc1fe48a6ea4e3763c3c92382377fd1b74bcaf5aede8c4e3af0 Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.718892 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc65e0a8a_9877_4ec4_a9f1_2888ddb61d2a.slice/crio-e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd.scope WatchSource:0}: Error finding container e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd: Status 404 returned error can't find the container with id e07c0c9b9b8f6c577011df29916472e8c62bd675d3e2c5440fce5a8fe1db91bd Jan 21 00:23:36 crc kubenswrapper[5030]: W0121 00:23:36.719157 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcae5f9e_55e7_44ec_b08c_7c79e8bfbf19.slice/crio-e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244.scope WatchSource:0}: Error finding container e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244: Status 404 returned error can't find the container with id e2758c25c738d5a3524ee232b9c22d8f72d5ce921815ce7277d09ec3ba3d1244 Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.068908 5030 generic.go:334] "Generic (PLEG): container finished" podID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerID="2caaad1dad98d3f0bde280b9f62f75f37b9a16e1ef62be6fd79c2d57aae1f591" exitCode=137 Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.068977 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" 
event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerDied","Data":"2caaad1dad98d3f0bde280b9f62f75f37b9a16e1ef62be6fd79c2d57aae1f591"} Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.069301 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" event={"ID":"104f10b4-2a7a-4295-89a5-f1040fd74574","Type":"ContainerDied","Data":"13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7"} Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.069318 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13a25c6adc06c014fc7f7c66847e2577d656df38d22403b55008f53cef15aae7" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.071100 5030 generic.go:334] "Generic (PLEG): container finished" podID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerID="67abbc44be874834bfc74bf06dbef4f876db30dd4f2391d60959932155e14081" exitCode=137 Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.071138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerDied","Data":"67abbc44be874834bfc74bf06dbef4f876db30dd4f2391d60959932155e14081"} Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.071165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" event={"ID":"95a44ec0-cea4-44d5-84e0-8468a74c72b9","Type":"ContainerDied","Data":"1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498"} Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.071354 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ccb6d215cf0efbaf74eb40dbf9cfe0624afb594d641c70e8096f63990ff6498" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.075937 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.082679 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274117 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bcvf\" (UniqueName: \"kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf\") pod \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs\") pod \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs\") pod \"104f10b4-2a7a-4295-89a5-f1040fd74574\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom\") pod \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data\") pod \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\" (UID: \"95a44ec0-cea4-44d5-84e0-8468a74c72b9\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data\") pod \"104f10b4-2a7a-4295-89a5-f1040fd74574\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom\") pod \"104f10b4-2a7a-4295-89a5-f1040fd74574\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svzg4\" (UniqueName: \"kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4\") pod \"104f10b4-2a7a-4295-89a5-f1040fd74574\" (UID: \"104f10b4-2a7a-4295-89a5-f1040fd74574\") " Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274608 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs" (OuterVolumeSpecName: "logs") pod "104f10b4-2a7a-4295-89a5-f1040fd74574" (UID: "104f10b4-2a7a-4295-89a5-f1040fd74574"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.274850 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/104f10b4-2a7a-4295-89a5-f1040fd74574-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.275049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs" (OuterVolumeSpecName: "logs") pod "95a44ec0-cea4-44d5-84e0-8468a74c72b9" (UID: "95a44ec0-cea4-44d5-84e0-8468a74c72b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.297049 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4" (OuterVolumeSpecName: "kube-api-access-svzg4") pod "104f10b4-2a7a-4295-89a5-f1040fd74574" (UID: "104f10b4-2a7a-4295-89a5-f1040fd74574"). InnerVolumeSpecName "kube-api-access-svzg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.302830 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95a44ec0-cea4-44d5-84e0-8468a74c72b9" (UID: "95a44ec0-cea4-44d5-84e0-8468a74c72b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.304860 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf" (OuterVolumeSpecName: "kube-api-access-7bcvf") pod "95a44ec0-cea4-44d5-84e0-8468a74c72b9" (UID: "95a44ec0-cea4-44d5-84e0-8468a74c72b9"). InnerVolumeSpecName "kube-api-access-7bcvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.317839 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "104f10b4-2a7a-4295-89a5-f1040fd74574" (UID: "104f10b4-2a7a-4295-89a5-f1040fd74574"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.372682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data" (OuterVolumeSpecName: "config-data") pod "95a44ec0-cea4-44d5-84e0-8468a74c72b9" (UID: "95a44ec0-cea4-44d5-84e0-8468a74c72b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.376709 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.376926 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svzg4\" (UniqueName: \"kubernetes.io/projected/104f10b4-2a7a-4295-89a5-f1040fd74574-kube-api-access-svzg4\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.377003 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bcvf\" (UniqueName: \"kubernetes.io/projected/95a44ec0-cea4-44d5-84e0-8468a74c72b9-kube-api-access-7bcvf\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.377104 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95a44ec0-cea4-44d5-84e0-8468a74c72b9-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.377185 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.377251 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a44ec0-cea4-44d5-84e0-8468a74c72b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.386340 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data" (OuterVolumeSpecName: "config-data") pod "104f10b4-2a7a-4295-89a5-f1040fd74574" (UID: "104f10b4-2a7a-4295-89a5-f1040fd74574"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:37 crc kubenswrapper[5030]: I0121 00:23:37.479165 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/104f10b4-2a7a-4295-89a5-f1040fd74574-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.081251 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh" Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.081362 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw" Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.110831 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.120271 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-worker-8545978bf5-4pwfh"] Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.125571 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:38 crc kubenswrapper[5030]: I0121 00:23:38.130451 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/barbican-keystone-listener-6d64c545dd-hzgqw"] Jan 21 00:23:39 crc kubenswrapper[5030]: I0121 00:23:39.975670 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" path="/var/lib/kubelet/pods/104f10b4-2a7a-4295-89a5-f1040fd74574/volumes" Jan 21 00:23:39 crc kubenswrapper[5030]: I0121 00:23:39.977135 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" path="/var/lib/kubelet/pods/95a44ec0-cea4-44d5-84e0-8468a74c72b9/volumes" Jan 21 00:23:43 crc kubenswrapper[5030]: I0121 00:23:43.961894 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:23:43 crc kubenswrapper[5030]: E0121 00:23:43.962589 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.082278 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-fmtbr"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.090332 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-95cdt"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.098363 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-sync-fmtbr"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.110179 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-bootstrap-95cdt"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.122323 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.122590 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" podUID="bd6ed570-11c2-4baf-b681-8e330085006b" containerName="keystone-api" containerID="cri-o://2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb" gracePeriod=30 Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.159154 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/keystone7313-account-delete-zvhjp"] Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.160057 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.162401 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.162677 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.162736 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.162810 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.162860 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.162935 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.162986 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.163053 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.163105 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.163159 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.163207 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api-log" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.163259 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.163306 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.163405 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.163479 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.163540 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa697694-b7e0-42b9-9911-08d88b66ac82" containerName="mariadb-account-delete" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.163591 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa697694-b7e0-42b9-9911-08d88b66ac82" containerName="mariadb-account-delete" Jan 21 00:23:50 crc 
kubenswrapper[5030]: E0121 00:23:50.163670 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.164294 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: E0121 00:23:50.164371 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.164423 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168688 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168772 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa697694-b7e0-42b9-9911-08d88b66ac82" containerName="mariadb-account-delete" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168787 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168814 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168842 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168871 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a44ec0-cea4-44d5-84e0-8468a74c72b9" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168886 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="104f10b4-2a7a-4295-89a5-f1040fd74574" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168908 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168918 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcae5f9e-55e7-44ec-b08c-7c79e8bfbf19" containerName="barbican-keystone-listener-log" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168943 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65e0a8a-9877-4ec4-a9f1-2888ddb61d2a" containerName="barbican-worker" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.168968 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="130aa456-410d-436b-aa9f-6f56ace1354a" containerName="barbican-api" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.170064 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.197168 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone7313-account-delete-zvhjp"] Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.297738 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.298008 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9w7\" (UniqueName: \"kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.399188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.399243 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9w7\" (UniqueName: \"kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.400211 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.419112 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9w7\" (UniqueName: \"kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7\") pod \"keystone7313-account-delete-zvhjp\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.501180 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:50 crc kubenswrapper[5030]: I0121 00:23:50.973989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/keystone7313-account-delete-zvhjp"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.197384 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" event={"ID":"e289ac96-4ca2-4b3b-b176-72ff05794567","Type":"ContainerStarted","Data":"7455b0c322ade772e025ce4626b591e5262bdd366e579932bb533fe1332bd7fd"} Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.197682 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" event={"ID":"e289ac96-4ca2-4b3b-b176-72ff05794567","Type":"ContainerStarted","Data":"8098ffb9ce5e2ec3d834f0f1f70c887bfd3ac8dc5a72bd7d523d68e0615a6ec2"} Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.215480 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" podStartSLOduration=1.21546222 podStartE2EDuration="1.21546222s" podCreationTimestamp="2026-01-21 00:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:23:51.211741481 +0000 UTC m=+6503.532001769" watchObservedRunningTime="2026-01-21 00:23:51.21546222 +0000 UTC m=+6503.535722508" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.306775 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-6mq5v"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.312056 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-6mq5v"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.325991 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["barbican-kuttl-tests/root-account-create-update-2fbmq"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.327153 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.333563 5030 reflector.go:368] Caches populated for *v1.Secret from object-"barbican-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.335166 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.335546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.336505 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg5f\" (UniqueName: \"kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.348077 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-2fbmq"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.365982 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.375524 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.406043 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-2fbmq"] Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.406765 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-llg5f operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="barbican-kuttl-tests/root-account-create-update-2fbmq" podUID="ed38483b-10f7-426c-9f97-b6f87c0f0bdc" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.437663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.437747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg5f\" (UniqueName: \"kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.437845 5030 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.437907 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. No retries permitted until 2026-01-21 00:23:51.937891504 +0000 UTC m=+6504.258151792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : configmap "openstack-scripts" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.441632 5030 projected.go:194] Error preparing data for projected volume kube-api-access-llg5f for pod barbican-kuttl-tests/root-account-create-update-2fbmq: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.441715 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. No retries permitted until 2026-01-21 00:23:51.941693925 +0000 UTC m=+6504.261954213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-llg5f" (UniqueName: "kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.513428 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-2" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="galera" containerID="cri-o://da836f0d9f3388acf4da9363a2b325ea0e940ccd63797e39c38b07bb56277b34" gracePeriod=30 Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.942884 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg5f\" (UniqueName: \"kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.943369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.943506 5030 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.943587 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. No retries permitted until 2026-01-21 00:23:52.943564818 +0000 UTC m=+6505.263825126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : configmap "openstack-scripts" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.946589 5030 projected.go:194] Error preparing data for projected volume kube-api-access-llg5f for pod barbican-kuttl-tests/root-account-create-update-2fbmq: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:51 crc kubenswrapper[5030]: E0121 00:23:51.946690 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. No retries permitted until 2026-01-21 00:23:52.946668972 +0000 UTC m=+6505.266929330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-llg5f" (UniqueName: "kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.977185 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0e7639-c75c-41f1-8301-8d2928da6566" path="/var/lib/kubelet/pods/6a0e7639-c75c-41f1-8301-8d2928da6566/volumes" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.978109 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d2205e-2da2-4ea7-9959-1e41b88525a7" path="/var/lib/kubelet/pods/95d2205e-2da2-4ea7-9959-1e41b88525a7/volumes" Jan 21 00:23:51 crc kubenswrapper[5030]: I0121 00:23:51.978809 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d16aeb-b00f-463b-840d-2662ece08fe7" path="/var/lib/kubelet/pods/e7d16aeb-b00f-463b-840d-2662ece08fe7/volumes" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.023264 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.023529 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/memcached-0" podUID="58fca83f-7678-4c7a-bff7-5830b38c788f" containerName="memcached" containerID="cri-o://128532b232d14abd012ae5553e81841d7ca0434be38a7c8565cae820e60e4e7e" gracePeriod=30 Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.206242 5030 generic.go:334] "Generic (PLEG): container finished" podID="e289ac96-4ca2-4b3b-b176-72ff05794567" containerID="7455b0c322ade772e025ce4626b591e5262bdd366e579932bb533fe1332bd7fd" exitCode=0 Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.206309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" event={"ID":"e289ac96-4ca2-4b3b-b176-72ff05794567","Type":"ContainerDied","Data":"7455b0c322ade772e025ce4626b591e5262bdd366e579932bb533fe1332bd7fd"} Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.223067 5030 generic.go:334] "Generic (PLEG): container finished" podID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerID="da836f0d9f3388acf4da9363a2b325ea0e940ccd63797e39c38b07bb56277b34" exitCode=0 Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.223231 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.223245 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerDied","Data":"da836f0d9f3388acf4da9363a2b325ea0e940ccd63797e39c38b07bb56277b34"} Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.279094 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.502879 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.586555 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655063 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnfzc\" (UniqueName: \"kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655481 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655580 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655683 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.655730 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts\") pod \"7186abfd-4ab6-445a-9c9b-9f512a483acc\" (UID: \"7186abfd-4ab6-445a-9c9b-9f512a483acc\") " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.656130 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.656317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.656399 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.656832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.665267 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc" (OuterVolumeSpecName: "kube-api-access-lnfzc") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "kube-api-access-lnfzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.671382 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "mysql-db") pod "7186abfd-4ab6-445a-9c9b-9f512a483acc" (UID: "7186abfd-4ab6-445a-9c9b-9f512a483acc"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757574 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757660 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7186abfd-4ab6-445a-9c9b-9f512a483acc-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757684 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757700 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnfzc\" (UniqueName: \"kubernetes.io/projected/7186abfd-4ab6-445a-9c9b-9f512a483acc-kube-api-access-lnfzc\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757733 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.757746 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7186abfd-4ab6-445a-9c9b-9f512a483acc-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.789042 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.859501 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.960652 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llg5f\" (UniqueName: \"kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.960757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts\") pod \"root-account-create-update-2fbmq\" (UID: \"ed38483b-10f7-426c-9f97-b6f87c0f0bdc\") " pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:52 crc kubenswrapper[5030]: E0121 00:23:52.960848 5030 configmap.go:193] Couldn't get configMap barbican-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:23:52 crc kubenswrapper[5030]: E0121 00:23:52.960894 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. 
No retries permitted until 2026-01-21 00:23:54.960881027 +0000 UTC m=+6507.281141315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : configmap "openstack-scripts" not found Jan 21 00:23:52 crc kubenswrapper[5030]: I0121 00:23:52.962464 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:23:52 crc kubenswrapper[5030]: E0121 00:23:52.964796 5030 projected.go:194] Error preparing data for projected volume kube-api-access-llg5f for pod barbican-kuttl-tests/root-account-create-update-2fbmq: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:52 crc kubenswrapper[5030]: E0121 00:23:52.964837 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f podName:ed38483b-10f7-426c-9f97-b6f87c0f0bdc nodeName:}" failed. No retries permitted until 2026-01-21 00:23:54.964827102 +0000 UTC m=+6507.285087390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-llg5f" (UniqueName: "kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f") pod "root-account-create-update-2fbmq" (UID: "ed38483b-10f7-426c-9f97-b6f87c0f0bdc") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.295430 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-2" event={"ID":"7186abfd-4ab6-445a-9c9b-9f512a483acc","Type":"ContainerDied","Data":"8f4142684a5b0214c8f7f7403c804d3eefdef80000e85ae0df6c477e32ecdb34"} Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.295482 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-2" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.295509 5030 scope.go:117] "RemoveContainer" containerID="da836f0d9f3388acf4da9363a2b325ea0e940ccd63797e39c38b07bb56277b34" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.298475 5030 generic.go:334] "Generic (PLEG): container finished" podID="58fca83f-7678-4c7a-bff7-5830b38c788f" containerID="128532b232d14abd012ae5553e81841d7ca0434be38a7c8565cae820e60e4e7e" exitCode=0 Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.298566 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"58fca83f-7678-4c7a-bff7-5830b38c788f","Type":"ContainerDied","Data":"128532b232d14abd012ae5553e81841d7ca0434be38a7c8565cae820e60e4e7e"} Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.298987 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/root-account-create-update-2fbmq" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.351806 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.353014 5030 scope.go:117] "RemoveContainer" containerID="92b1c0a0bc749e7799c8de35103ad87a58265f3d9420d8a702a0d0f314bccc4f" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.360400 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-2"] Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.383928 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/rabbitmq-server-0" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="rabbitmq" containerID="cri-o://709e96694e3e9eb5e91d2a073603f195918fdc2221a73e50ac46808fffb4edd7" gracePeriod=604800 Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.388100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-2fbmq"] Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.395448 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/root-account-create-update-2fbmq"] Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.488598 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llg5f\" (UniqueName: \"kubernetes.io/projected/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-kube-api-access-llg5f\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.488638 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed38483b-10f7-426c-9f97-b6f87c0f0bdc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.522346 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.570956 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-1" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="galera" containerID="cri-o://f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27" gracePeriod=28 Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.594662 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxkq5\" (UniqueName: \"kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5\") pod \"58fca83f-7678-4c7a-bff7-5830b38c788f\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.594731 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data\") pod \"58fca83f-7678-4c7a-bff7-5830b38c788f\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.594800 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config\") pod \"58fca83f-7678-4c7a-bff7-5830b38c788f\" (UID: \"58fca83f-7678-4c7a-bff7-5830b38c788f\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.596042 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "58fca83f-7678-4c7a-bff7-5830b38c788f" (UID: "58fca83f-7678-4c7a-bff7-5830b38c788f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.596942 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data" (OuterVolumeSpecName: "config-data") pod "58fca83f-7678-4c7a-bff7-5830b38c788f" (UID: "58fca83f-7678-4c7a-bff7-5830b38c788f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.604961 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5" (OuterVolumeSpecName: "kube-api-access-dxkq5") pod "58fca83f-7678-4c7a-bff7-5830b38c788f" (UID: "58fca83f-7678-4c7a-bff7-5830b38c788f"). InnerVolumeSpecName "kube-api-access-dxkq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.658251 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696041 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts\") pod \"e289ac96-4ca2-4b3b-b176-72ff05794567\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696181 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn9w7\" (UniqueName: \"kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7\") pod \"e289ac96-4ca2-4b3b-b176-72ff05794567\" (UID: \"e289ac96-4ca2-4b3b-b176-72ff05794567\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696465 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxkq5\" (UniqueName: \"kubernetes.io/projected/58fca83f-7678-4c7a-bff7-5830b38c788f-kube-api-access-dxkq5\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696483 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696493 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/58fca83f-7678-4c7a-bff7-5830b38c788f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.696471 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e289ac96-4ca2-4b3b-b176-72ff05794567" (UID: "e289ac96-4ca2-4b3b-b176-72ff05794567"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.703322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7" (OuterVolumeSpecName: "kube-api-access-xn9w7") pod "e289ac96-4ca2-4b3b-b176-72ff05794567" (UID: "e289ac96-4ca2-4b3b-b176-72ff05794567"). InnerVolumeSpecName "kube-api-access-xn9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.798385 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn9w7\" (UniqueName: \"kubernetes.io/projected/e289ac96-4ca2-4b3b-b176-72ff05794567-kube-api-access-xn9w7\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.798452 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e289ac96-4ca2-4b3b-b176-72ff05794567-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.809444 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.899176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys\") pod \"bd6ed570-11c2-4baf-b681-8e330085006b\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.899535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys\") pod \"bd6ed570-11c2-4baf-b681-8e330085006b\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.899676 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjxrf\" (UniqueName: \"kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf\") pod \"bd6ed570-11c2-4baf-b681-8e330085006b\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.899710 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts\") pod \"bd6ed570-11c2-4baf-b681-8e330085006b\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.899824 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data\") pod \"bd6ed570-11c2-4baf-b681-8e330085006b\" (UID: \"bd6ed570-11c2-4baf-b681-8e330085006b\") " Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.906080 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bd6ed570-11c2-4baf-b681-8e330085006b" (UID: "bd6ed570-11c2-4baf-b681-8e330085006b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.906850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts" (OuterVolumeSpecName: "scripts") pod "bd6ed570-11c2-4baf-b681-8e330085006b" (UID: "bd6ed570-11c2-4baf-b681-8e330085006b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.908802 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bd6ed570-11c2-4baf-b681-8e330085006b" (UID: "bd6ed570-11c2-4baf-b681-8e330085006b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.912937 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf" (OuterVolumeSpecName: "kube-api-access-xjxrf") pod "bd6ed570-11c2-4baf-b681-8e330085006b" (UID: "bd6ed570-11c2-4baf-b681-8e330085006b"). InnerVolumeSpecName "kube-api-access-xjxrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.921112 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.921365 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" podUID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" containerName="manager" containerID="cri-o://e0fe572a232be498772b65e6c7c1ad48706f6539a7fc0ed07ded17263eee1415" gracePeriod=10 Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.937804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data" (OuterVolumeSpecName: "config-data") pod "bd6ed570-11c2-4baf-b681-8e330085006b" (UID: "bd6ed570-11c2-4baf-b681-8e330085006b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.973678 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" path="/var/lib/kubelet/pods/7186abfd-4ab6-445a-9c9b-9f512a483acc/volumes" Jan 21 00:23:53 crc kubenswrapper[5030]: I0121 00:23:53.974359 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed38483b-10f7-426c-9f97-b6f87c0f0bdc" path="/var/lib/kubelet/pods/ed38483b-10f7-426c-9f97-b6f87c0f0bdc/volumes" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.001692 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.001744 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.001759 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.001774 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjxrf\" (UniqueName: \"kubernetes.io/projected/bd6ed570-11c2-4baf-b681-8e330085006b-kube-api-access-xjxrf\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.001788 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd6ed570-11c2-4baf-b681-8e330085006b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.178456 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.178689 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-gcvmv" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" containerID="cri-o://3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" gracePeriod=30 Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.202825 5030 log.go:32] "ExecSync cmd from runtime service 
failed" err=< Jan 21 00:23:54 crc kubenswrapper[5030]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor Jan 21 00:23:54 crc kubenswrapper[5030]: fail startup Jan 21 00:23:54 crc kubenswrapper[5030]: , stdout: , stderr: , exit code -1 Jan 21 00:23:54 crc kubenswrapper[5030]: > containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.205798 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.209822 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.209890 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" probeType="Readiness" pod="openstack-operators/barbican-operator-index-gcvmv" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.212779 5030 log.go:32] "ExecSync cmd from runtime service failed" err=< Jan 21 00:23:54 crc kubenswrapper[5030]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor Jan 21 00:23:54 crc kubenswrapper[5030]: fail startup Jan 21 00:23:54 crc kubenswrapper[5030]: , stdout: , stderr: , exit code -1 Jan 21 00:23:54 crc kubenswrapper[5030]: > containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.215504 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.215995 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" cmd=["grpc_health_probe","-addr=:50051"] Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.216050 5030 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec is running failed: container process not found" 
probeType="Liveness" pod="openstack-operators/barbican-operator-index-gcvmv" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.228237 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.235211 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/efb8278422048e49f60447fe08bd0467bd977c487c5bfd9acd08760cddvzqww"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.307120 5030 generic.go:334] "Generic (PLEG): container finished" podID="bd6ed570-11c2-4baf-b681-8e330085006b" containerID="2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb" exitCode=0 Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.307173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" event={"ID":"bd6ed570-11c2-4baf-b681-8e330085006b","Type":"ContainerDied","Data":"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb"} Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.307200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" event={"ID":"bd6ed570-11c2-4baf-b681-8e330085006b","Type":"ContainerDied","Data":"ee334a54da40d88755b58a39c94a571d47434027095fb36b744e693edd8b91a6"} Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.307216 5030 scope.go:117] "RemoveContainer" containerID="2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.307291 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone-567879f7bf-95bbl" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.317555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/memcached-0" event={"ID":"58fca83f-7678-4c7a-bff7-5830b38c788f","Type":"ContainerDied","Data":"359218c9dc9840c54b6e60061a1c3df73e7bebd8580ca911b6c349e069148fe5"} Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.317653 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/memcached-0" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.321998 5030 generic.go:334] "Generic (PLEG): container finished" podID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" containerID="e0fe572a232be498772b65e6c7c1ad48706f6539a7fc0ed07ded17263eee1415" exitCode=0 Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.322138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" event={"ID":"55a2d438-b8dc-4dac-a66b-7e596e32c4f3","Type":"ContainerDied","Data":"e0fe572a232be498772b65e6c7c1ad48706f6539a7fc0ed07ded17263eee1415"} Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.323608 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" event={"ID":"e289ac96-4ca2-4b3b-b176-72ff05794567","Type":"ContainerDied","Data":"8098ffb9ce5e2ec3d834f0f1f70c887bfd3ac8dc5a72bd7d523d68e0615a6ec2"} Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.323660 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8098ffb9ce5e2ec3d834f0f1f70c887bfd3ac8dc5a72bd7d523d68e0615a6ec2" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.323722 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/keystone7313-account-delete-zvhjp" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.371811 5030 scope.go:117] "RemoveContainer" containerID="2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb" Jan 21 00:23:54 crc kubenswrapper[5030]: E0121 00:23:54.372210 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb\": container with ID starting with 2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb not found: ID does not exist" containerID="2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.372246 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb"} err="failed to get container status \"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb\": rpc error: code = NotFound desc = could not find container \"2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb\": container with ID starting with 2dd206dae3650135eff50c52e9a793c7f7f1b630f067e8ff660b1318634891eb not found: ID does not exist" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.372272 5030 scope.go:117] "RemoveContainer" containerID="128532b232d14abd012ae5553e81841d7ca0434be38a7c8565cae820e60e4e7e" Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.394053 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.400198 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-567879f7bf-95bbl"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.409726 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 00:23:54.416455 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/memcached-0"] Jan 21 00:23:54 crc kubenswrapper[5030]: I0121 
00:23:54.951592 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.013485 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert\") pod \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.013723 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thw76\" (UniqueName: \"kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76\") pod \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.013797 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert\") pod \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\" (UID: \"55a2d438-b8dc-4dac-a66b-7e596e32c4f3\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.019218 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "55a2d438-b8dc-4dac-a66b-7e596e32c4f3" (UID: "55a2d438-b8dc-4dac-a66b-7e596e32c4f3"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.019257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "55a2d438-b8dc-4dac-a66b-7e596e32c4f3" (UID: "55a2d438-b8dc-4dac-a66b-7e596e32c4f3"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.019317 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76" (OuterVolumeSpecName: "kube-api-access-thw76") pod "55a2d438-b8dc-4dac-a66b-7e596e32c4f3" (UID: "55a2d438-b8dc-4dac-a66b-7e596e32c4f3"). InnerVolumeSpecName "kube-api-access-thw76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.115231 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thw76\" (UniqueName: \"kubernetes.io/projected/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-kube-api-access-thw76\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.115280 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.115295 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/55a2d438-b8dc-4dac-a66b-7e596e32c4f3-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.184949 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-bn7lq"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.210349 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-db-create-bn7lq"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.237823 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone-7313-account-create-update-q222b"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.238068 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/keystone7313-account-delete-zvhjp"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.241702 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone-7313-account-create-update-q222b"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.249875 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/keystone7313-account-delete-zvhjp"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.293894 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.359898 5030 generic.go:334] "Generic (PLEG): container finished" podID="07630df9-d9b5-4944-939a-6a284e5e1488" containerID="709e96694e3e9eb5e91d2a073603f195918fdc2221a73e50ac46808fffb4edd7" exitCode=0 Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.359962 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerDied","Data":"709e96694e3e9eb5e91d2a073603f195918fdc2221a73e50ac46808fffb4edd7"} Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.361643 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" event={"ID":"55a2d438-b8dc-4dac-a66b-7e596e32c4f3","Type":"ContainerDied","Data":"2901b67f1e91fb15b5d9ea3d02dc466b9ab51deac6ef4b190f61da34e8b143a7"} Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.361675 5030 scope.go:117] "RemoveContainer" containerID="e0fe572a232be498772b65e6c7c1ad48706f6539a7fc0ed07ded17263eee1415" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.361806 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.375124 5030 generic.go:334] "Generic (PLEG): container finished" podID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" exitCode=0 Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.375165 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-gcvmv" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.375182 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gcvmv" event={"ID":"3756e766-1fab-47ec-aefc-8ad5e10f4fe4","Type":"ContainerDied","Data":"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec"} Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.375356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-gcvmv" event={"ID":"3756e766-1fab-47ec-aefc-8ad5e10f4fe4","Type":"ContainerDied","Data":"5e2b5ac4963abb37b01709f5149f5669183664de8f887685a870804272bf0ea6"} Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.393190 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.398756 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-55fdffbbdb-4w8zl"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.409745 5030 scope.go:117] "RemoveContainer" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.420312 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plczk\" (UniqueName: \"kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk\") pod \"3756e766-1fab-47ec-aefc-8ad5e10f4fe4\" (UID: \"3756e766-1fab-47ec-aefc-8ad5e10f4fe4\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.423796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk" (OuterVolumeSpecName: "kube-api-access-plczk") pod "3756e766-1fab-47ec-aefc-8ad5e10f4fe4" (UID: "3756e766-1fab-47ec-aefc-8ad5e10f4fe4"). InnerVolumeSpecName "kube-api-access-plczk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.454508 5030 scope.go:117] "RemoveContainer" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" Jan 21 00:23:55 crc kubenswrapper[5030]: E0121 00:23:55.455081 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec\": container with ID starting with 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec not found: ID does not exist" containerID="3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.455140 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec"} err="failed to get container status \"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec\": rpc error: code = NotFound desc = could not find container \"3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec\": container with ID starting with 3feaf5ccd4f3e0ffb48ebd71ca64d89c91e2df05a10c8f1af824b715975f62ec not found: ID does not exist" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.522297 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plczk\" (UniqueName: \"kubernetes.io/projected/3756e766-1fab-47ec-aefc-8ad5e10f4fe4-kube-api-access-plczk\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.550703 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="barbican-kuttl-tests/openstack-galera-0" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="galera" containerID="cri-o://c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5" gracePeriod=26 Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.627446 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.711193 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.719385 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-gcvmv"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.790006 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.790358 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" podUID="134304ed-5bea-4841-aaa0-d7d7284fc67e" containerName="manager" containerID="cri-o://02a538876acf22f39c0a24a305c6ce365f4a2ec0d2a2beea93a36cfc0fe5890c" gracePeriod=10 Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827301 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827349 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827477 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827497 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.827541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc47j\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc 
kubenswrapper[5030]: I0121 00:23:55.827714 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") pod \"07630df9-d9b5-4944-939a-6a284e5e1488\" (UID: \"07630df9-d9b5-4944-939a-6a284e5e1488\") " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.828298 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.828348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.828475 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.835810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.836037 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info" (OuterVolumeSpecName: "pod-info") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.836478 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j" (OuterVolumeSpecName: "kube-api-access-zc47j") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "kube-api-access-zc47j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.837196 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830" (OuterVolumeSpecName: "persistence") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "pvc-2157812c-8de0-4411-8737-01292dafa830". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.927886 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "07630df9-d9b5-4944-939a-6a284e5e1488" (UID: "07630df9-d9b5-4944-939a-6a284e5e1488"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929368 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/07630df9-d9b5-4944-939a-6a284e5e1488-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929405 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/07630df9-d9b5-4944-939a-6a284e5e1488-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929420 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929435 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929454 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc47j\" (UniqueName: \"kubernetes.io/projected/07630df9-d9b5-4944-939a-6a284e5e1488-kube-api-access-zc47j\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929505 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") on node \"crc\" " Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929529 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/07630df9-d9b5-4944-939a-6a284e5e1488-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.929883 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/07630df9-d9b5-4944-939a-6a284e5e1488-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.948293 5030 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.948446 5030 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2157812c-8de0-4411-8737-01292dafa830" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830") on node "crc" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.970115 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" path="/var/lib/kubelet/pods/3756e766-1fab-47ec-aefc-8ad5e10f4fe4/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.970773 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" path="/var/lib/kubelet/pods/55a2d438-b8dc-4dac-a66b-7e596e32c4f3/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.971212 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fca83f-7678-4c7a-bff7-5830b38c788f" path="/var/lib/kubelet/pods/58fca83f-7678-4c7a-bff7-5830b38c788f/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.971674 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60451bed-d2a2-4033-8c5b-10d20ded889a" path="/var/lib/kubelet/pods/60451bed-d2a2-4033-8c5b-10d20ded889a/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.972705 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b2e63e-c166-461e-92f0-71614f7ff73b" path="/var/lib/kubelet/pods/91b2e63e-c166-461e-92f0-71614f7ff73b/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.973304 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6ed570-11c2-4baf-b681-8e330085006b" path="/var/lib/kubelet/pods/bd6ed570-11c2-4baf-b681-8e330085006b/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.974469 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c2d75e-5dd8-4544-adb0-c6a956b69393" path="/var/lib/kubelet/pods/d5c2d75e-5dd8-4544-adb0-c6a956b69393/volumes" Jan 21 00:23:55 crc kubenswrapper[5030]: I0121 00:23:55.975462 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e289ac96-4ca2-4b3b-b176-72ff05794567" path="/var/lib/kubelet/pods/e289ac96-4ca2-4b3b-b176-72ff05794567/volumes" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.031806 5030 reconciler_common.go:293] "Volume detached for volume \"pvc-2157812c-8de0-4411-8737-01292dafa830\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2157812c-8de0-4411-8737-01292dafa830\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.054513 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.054798 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-jp4q5" podUID="915a4368-1348-488c-a968-bec3d1b87d23" containerName="registry-server" containerID="cri-o://9d23c2a86c88a78de8b357da28b04ae29725c56e0216ad5bee6e2867c610abb7" gracePeriod=30 Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.117760 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.122512 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069p8tr7"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.260979 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.386903 5030 generic.go:334] "Generic (PLEG): container finished" podID="915a4368-1348-488c-a968-bec3d1b87d23" containerID="9d23c2a86c88a78de8b357da28b04ae29725c56e0216ad5bee6e2867c610abb7" exitCode=0 Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.386995 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jp4q5" event={"ID":"915a4368-1348-488c-a968-bec3d1b87d23","Type":"ContainerDied","Data":"9d23c2a86c88a78de8b357da28b04ae29725c56e0216ad5bee6e2867c610abb7"} Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.388538 5030 generic.go:334] "Generic (PLEG): container finished" podID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerID="f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27" exitCode=0 Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.388588 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerDied","Data":"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27"} Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.388615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-1" event={"ID":"45f30d64-1c48-455e-be8e-ad27d8c2ea05","Type":"ContainerDied","Data":"8242797e49bfdf22fb24bcd096eaa7c9df3a668ab4390f0cd4ef952551908301"} Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.388652 5030 scope.go:117] "RemoveContainer" containerID="f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.388756 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-1" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.391973 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/rabbitmq-server-0" event={"ID":"07630df9-d9b5-4944-939a-6a284e5e1488","Type":"ContainerDied","Data":"fdd2f81e22b36e48311d5ab0b6d90f032f67103d6a419ceae034e9dc8c5cef1e"} Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.392047 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="barbican-kuttl-tests/rabbitmq-server-0" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.406470 5030 generic.go:334] "Generic (PLEG): container finished" podID="134304ed-5bea-4841-aaa0-d7d7284fc67e" containerID="02a538876acf22f39c0a24a305c6ce365f4a2ec0d2a2beea93a36cfc0fe5890c" exitCode=0 Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.407075 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" event={"ID":"134304ed-5bea-4841-aaa0-d7d7284fc67e","Type":"ContainerDied","Data":"02a538876acf22f39c0a24a305c6ce365f4a2ec0d2a2beea93a36cfc0fe5890c"} Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.419521 5030 scope.go:117] "RemoveContainer" containerID="7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435553 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435673 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435725 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g249t\" (UniqueName: \"kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435842 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.435908 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\" (UID: \"45f30d64-1c48-455e-be8e-ad27d8c2ea05\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.437081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.437234 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.438903 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.439557 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.440095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.451974 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.456877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t" (OuterVolumeSpecName: "kube-api-access-g249t") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "kube-api-access-g249t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.460023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "45f30d64-1c48-455e-be8e-ad27d8c2ea05" (UID: "45f30d64-1c48-455e-be8e-ad27d8c2ea05"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.473764 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/rabbitmq-server-0"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.539126 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.539171 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g249t\" (UniqueName: \"kubernetes.io/projected/45f30d64-1c48-455e-be8e-ad27d8c2ea05-kube-api-access-g249t\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.539188 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.539204 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45f30d64-1c48-455e-be8e-ad27d8c2ea05-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.539245 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.543134 5030 scope.go:117] "RemoveContainer" containerID="f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27" Jan 21 00:23:56 crc kubenswrapper[5030]: E0121 00:23:56.543741 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27\": container with ID starting with f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27 not found: ID does not exist" containerID="f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.543880 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27"} err="failed to get container status \"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27\": rpc error: code = NotFound desc = could not find container \"f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27\": container with ID starting with f56399eb18b3c55f1de6def42262adcf48ba83d8116adb6e6612c6c83d3dcd27 not found: ID does not exist" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.543913 5030 scope.go:117] "RemoveContainer" containerID="7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71" Jan 21 00:23:56 crc kubenswrapper[5030]: E0121 00:23:56.544225 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71\": container with ID starting with 7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71 not found: ID does not exist" containerID="7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.544271 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71"} err="failed to get container status \"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71\": rpc error: code = NotFound desc = could not find container \"7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71\": container with ID starting with 7dfde2d724dca65b19fe11d0be962fbf619387faf1a3d9df7a0ebff64cf4aa71 not found: ID does not exist" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.544327 5030 scope.go:117] "RemoveContainer" containerID="709e96694e3e9eb5e91d2a073603f195918fdc2221a73e50ac46808fffb4edd7" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.552545 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.562447 5030 scope.go:117] "RemoveContainer" containerID="9754df6db66703e2cca4cfda28e6dc5b5b74240ec8edd308b113c6623d6f191d" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.627218 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.641755 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.721572 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.734255 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-1"] Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.742834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxq4\" (UniqueName: \"kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4\") pod \"915a4368-1348-488c-a968-bec3d1b87d23\" (UID: \"915a4368-1348-488c-a968-bec3d1b87d23\") " Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.750708 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4" (OuterVolumeSpecName: "kube-api-access-gxxq4") pod "915a4368-1348-488c-a968-bec3d1b87d23" (UID: "915a4368-1348-488c-a968-bec3d1b87d23"). InnerVolumeSpecName "kube-api-access-gxxq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.844445 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxxq4\" (UniqueName: \"kubernetes.io/projected/915a4368-1348-488c-a968-bec3d1b87d23-kube-api-access-gxxq4\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.951129 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:23:56 crc kubenswrapper[5030]: I0121 00:23:56.962979 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:23:56 crc kubenswrapper[5030]: E0121 00:23:56.963643 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.030614 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.149126 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfxzf\" (UniqueName: \"kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.149982 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j77m7\" (UniqueName: \"kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7\") pod \"134304ed-5bea-4841-aaa0-d7d7284fc67e\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.150207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.150305 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert\") pod \"134304ed-5bea-4841-aaa0-d7d7284fc67e\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.150392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.150518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert\") pod \"134304ed-5bea-4841-aaa0-d7d7284fc67e\" (UID: \"134304ed-5bea-4841-aaa0-d7d7284fc67e\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.150607 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.151326 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.151588 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default\") pod \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\" (UID: \"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8\") " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.152467 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.153023 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.153767 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.154545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.155481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf" (OuterVolumeSpecName: "kube-api-access-zfxzf") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "kube-api-access-zfxzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.156095 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7" (OuterVolumeSpecName: "kube-api-access-j77m7") pod "134304ed-5bea-4841-aaa0-d7d7284fc67e" (UID: "134304ed-5bea-4841-aaa0-d7d7284fc67e"). InnerVolumeSpecName "kube-api-access-j77m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.156214 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "134304ed-5bea-4841-aaa0-d7d7284fc67e" (UID: "134304ed-5bea-4841-aaa0-d7d7284fc67e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.158228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "134304ed-5bea-4841-aaa0-d7d7284fc67e" (UID: "134304ed-5bea-4841-aaa0-d7d7284fc67e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.169917 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" (UID: "233dc5f6-9da2-4f64-9e14-9ffe70f02ab8"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253606 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253672 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253689 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253700 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253711 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfxzf\" (UniqueName: \"kubernetes.io/projected/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-kube-api-access-zfxzf\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253720 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j77m7\" (UniqueName: \"kubernetes.io/projected/134304ed-5bea-4841-aaa0-d7d7284fc67e-kube-api-access-j77m7\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253759 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.253769 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/134304ed-5bea-4841-aaa0-d7d7284fc67e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: 
I0121 00:23:57.253781 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.266098 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.354860 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.437024 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.437358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6" event={"ID":"134304ed-5bea-4841-aaa0-d7d7284fc67e","Type":"ContainerDied","Data":"8851eead1903c6a0cde3092e2c9cbe1e1ecf1cb7b6390ef5a789981080b56fca"} Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.437449 5030 scope.go:117] "RemoveContainer" containerID="02a538876acf22f39c0a24a305c6ce365f4a2ec0d2a2beea93a36cfc0fe5890c" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.447425 5030 generic.go:334] "Generic (PLEG): container finished" podID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerID="c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5" exitCode=0 Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.447534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerDied","Data":"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5"} Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.447568 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="barbican-kuttl-tests/openstack-galera-0" event={"ID":"233dc5f6-9da2-4f64-9e14-9ffe70f02ab8","Type":"ContainerDied","Data":"c3e712fc7ab1f93f738be4b9b775ffaf1841f32e1e9f3a80b8bc82aa63fcd39d"} Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.447715 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="barbican-kuttl-tests/openstack-galera-0" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.468615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-jp4q5" event={"ID":"915a4368-1348-488c-a968-bec3d1b87d23","Type":"ContainerDied","Data":"a0cc466ca2d21bb456115cb0b4df6bb46edd0941ed80b8bec062e701129d458c"} Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.468773 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-jp4q5" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.473715 5030 scope.go:117] "RemoveContainer" containerID="c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.484874 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.488547 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5fcd4486c5-gqfs6"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.501779 5030 scope.go:117] "RemoveContainer" containerID="05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.506335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.513800 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["barbican-kuttl-tests/openstack-galera-0"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.519333 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.529314 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-jp4q5"] Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.536911 5030 scope.go:117] "RemoveContainer" containerID="c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5" Jan 21 00:23:57 crc kubenswrapper[5030]: E0121 00:23:57.537316 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5\": container with ID starting with c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5 not found: ID does not exist" containerID="c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.537351 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5"} err="failed to get container status \"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5\": rpc error: code = NotFound desc = could not find container \"c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5\": container with ID starting with c4d75f0985bf9e20df958dd4476f2c9b358b54ecc1e575263291b5c952e424e5 not found: ID does not exist" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.537374 5030 scope.go:117] "RemoveContainer" containerID="05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5" Jan 21 00:23:57 crc kubenswrapper[5030]: E0121 00:23:57.539987 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5\": container with ID starting with 05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5 not found: ID does not exist" containerID="05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.540054 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5"} err="failed to get container status \"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5\": rpc error: code = NotFound desc = could not find container \"05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5\": container with ID starting with 05b31a37506c8133de1a9c58bb77541c708e68a751680365757a81d56c53c8f5 not found: ID does not exist" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.540083 5030 scope.go:117] "RemoveContainer" containerID="9d23c2a86c88a78de8b357da28b04ae29725c56e0216ad5bee6e2867c610abb7" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.970823 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" path="/var/lib/kubelet/pods/07630df9-d9b5-4944-939a-6a284e5e1488/volumes" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.971417 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134304ed-5bea-4841-aaa0-d7d7284fc67e" path="/var/lib/kubelet/pods/134304ed-5bea-4841-aaa0-d7d7284fc67e/volumes" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.971987 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" path="/var/lib/kubelet/pods/233dc5f6-9da2-4f64-9e14-9ffe70f02ab8/volumes" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.973140 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" path="/var/lib/kubelet/pods/45f30d64-1c48-455e-be8e-ad27d8c2ea05/volumes" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.973614 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915a4368-1348-488c-a968-bec3d1b87d23" path="/var/lib/kubelet/pods/915a4368-1348-488c-a968-bec3d1b87d23/volumes" Jan 21 00:23:57 crc kubenswrapper[5030]: I0121 00:23:57.974069 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2def119-ddef-4378-84d0-61650bf36ad5" path="/var/lib/kubelet/pods/b2def119-ddef-4378-84d0-61650bf36ad5/volumes" Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.255777 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.255979 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" podUID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" containerName="operator" containerID="cri-o://1260533e34f0b0f41dc1b59c06bb0c914bc20a24a56808fef2d4f41df1999dd7" gracePeriod=10 Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.493358 5030 generic.go:334] "Generic (PLEG): container finished" podID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" containerID="1260533e34f0b0f41dc1b59c06bb0c914bc20a24a56808fef2d4f41df1999dd7" exitCode=0 Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.493426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" event={"ID":"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e","Type":"ContainerDied","Data":"1260533e34f0b0f41dc1b59c06bb0c914bc20a24a56808fef2d4f41df1999dd7"} Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.552526 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.554032 5030 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" podUID="d11ee764-dc27-451e-a58c-ab22788647f5" containerName="registry-server" containerID="cri-o://ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04" gracePeriod=30 Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.572280 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2"] Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.579103 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59049nq2"] Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.747086 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.877163 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx8lf\" (UniqueName: \"kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf\") pod \"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e\" (UID: \"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e\") " Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.897969 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf" (OuterVolumeSpecName: "kube-api-access-bx8lf") pod "4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" (UID: "4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e"). InnerVolumeSpecName "kube-api-access-bx8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:58 crc kubenswrapper[5030]: I0121 00:23:58.979951 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx8lf\" (UniqueName: \"kubernetes.io/projected/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e-kube-api-access-bx8lf\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.179780 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.284012 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzgn\" (UniqueName: \"kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn\") pod \"d11ee764-dc27-451e-a58c-ab22788647f5\" (UID: \"d11ee764-dc27-451e-a58c-ab22788647f5\") " Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.288197 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn" (OuterVolumeSpecName: "kube-api-access-2fzgn") pod "d11ee764-dc27-451e-a58c-ab22788647f5" (UID: "d11ee764-dc27-451e-a58c-ab22788647f5"). InnerVolumeSpecName "kube-api-access-2fzgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.385347 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzgn\" (UniqueName: \"kubernetes.io/projected/d11ee764-dc27-451e-a58c-ab22788647f5-kube-api-access-2fzgn\") on node \"crc\" DevicePath \"\"" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.513006 5030 generic.go:334] "Generic (PLEG): container finished" podID="d11ee764-dc27-451e-a58c-ab22788647f5" containerID="ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04" exitCode=0 Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.513167 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.513171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" event={"ID":"d11ee764-dc27-451e-a58c-ab22788647f5","Type":"ContainerDied","Data":"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04"} Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.513232 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5qf86" event={"ID":"d11ee764-dc27-451e-a58c-ab22788647f5","Type":"ContainerDied","Data":"ad57e99a6924fb06c555bdfc76ab06e401a2a814479fed13baedb3af4f540798"} Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.513270 5030 scope.go:117] "RemoveContainer" containerID="ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.523850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" event={"ID":"4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e","Type":"ContainerDied","Data":"53b4fc04149be862d61104a47469be731be019b7abae3edbbb9a0664b98290b8"} Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.524010 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.543475 5030 scope.go:117] "RemoveContainer" containerID="ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04" Jan 21 00:23:59 crc kubenswrapper[5030]: E0121 00:23:59.545894 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04\": container with ID starting with ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04 not found: ID does not exist" containerID="ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.545955 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04"} err="failed to get container status \"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04\": rpc error: code = NotFound desc = could not find container \"ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04\": container with ID starting with ab4f7e0c4afee90f08937e15065d68ce2b23161ac4519df9119304b2a77dbc04 not found: ID does not exist" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.545989 5030 scope.go:117] "RemoveContainer" containerID="1260533e34f0b0f41dc1b59c06bb0c914bc20a24a56808fef2d4f41df1999dd7" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.554287 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.564091 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5qf86"] Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.569098 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.574510 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-hx8sj"] Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.970268 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09177594-8d5e-4b0e-a063-5f97d4d75df7" path="/var/lib/kubelet/pods/09177594-8d5e-4b0e-a063-5f97d4d75df7/volumes" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.971057 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" path="/var/lib/kubelet/pods/4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e/volumes" Jan 21 00:23:59 crc kubenswrapper[5030]: I0121 00:23:59.971494 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11ee764-dc27-451e-a58c-ab22788647f5" path="/var/lib/kubelet/pods/d11ee764-dc27-451e-a58c-ab22788647f5/volumes" Jan 21 00:24:05 crc kubenswrapper[5030]: I0121 00:24:05.884076 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:24:05 crc kubenswrapper[5030]: I0121 00:24:05.886503 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" podUID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" containerName="manager" 
containerID="cri-o://860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579" gracePeriod=10 Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.155767 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.156082 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-md4t4" podUID="702a9aa6-bb25-4831-83be-50970db674b0" containerName="registry-server" containerID="cri-o://2054f09e461f55bbf67cdcefc016529884b2e616ee4279b261f715286770a5d3" gracePeriod=30 Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.202734 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr"] Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.215062 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1dkbhr"] Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.389197 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.494330 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgtql\" (UniqueName: \"kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql\") pod \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.494474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert\") pod \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.494560 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert\") pod \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\" (UID: \"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2\") " Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.500992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" (UID: "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.501199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql" (OuterVolumeSpecName: "kube-api-access-sgtql") pod "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" (UID: "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2"). InnerVolumeSpecName "kube-api-access-sgtql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.502721 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" (UID: "362e9235-2f96-4edd-8cc4-a22b8fb7b6d2"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.577257 5030 generic.go:334] "Generic (PLEG): container finished" podID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" containerID="860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579" exitCode=0 Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.577318 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.577309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" event={"ID":"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2","Type":"ContainerDied","Data":"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579"} Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.577475 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4" event={"ID":"362e9235-2f96-4edd-8cc4-a22b8fb7b6d2","Type":"ContainerDied","Data":"2a1558083c2d85111a58cc8552a6cba0038418f1fec0b2d6a713050490fa5c00"} Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.577508 5030 scope.go:117] "RemoveContainer" containerID="860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.581839 5030 generic.go:334] "Generic (PLEG): container finished" podID="702a9aa6-bb25-4831-83be-50970db674b0" containerID="2054f09e461f55bbf67cdcefc016529884b2e616ee4279b261f715286770a5d3" exitCode=0 Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.581879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-md4t4" event={"ID":"702a9aa6-bb25-4831-83be-50970db674b0","Type":"ContainerDied","Data":"2054f09e461f55bbf67cdcefc016529884b2e616ee4279b261f715286770a5d3"} Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.591735 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.596321 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.596363 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgtql\" (UniqueName: \"kubernetes.io/projected/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-kube-api-access-sgtql\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.596377 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.605389 5030 scope.go:117] "RemoveContainer" containerID="860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579" Jan 21 00:24:06 crc kubenswrapper[5030]: E0121 00:24:06.608597 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579\": container with ID starting with 860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579 not found: ID does not exist" containerID="860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.608671 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579"} err="failed to get container status \"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579\": rpc error: code = NotFound desc = could not find container \"860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579\": container with ID starting with 860a74dd5faeee55096fbe1222ffe83b368db43dd3cc161a34d4204edc6c2579 not found: ID does not exist" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.626168 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.631188 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-795f7dd4bc-p4mz4"] Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.697575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8pvx\" (UniqueName: \"kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx\") pod \"702a9aa6-bb25-4831-83be-50970db674b0\" (UID: \"702a9aa6-bb25-4831-83be-50970db674b0\") " Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.700834 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx" (OuterVolumeSpecName: "kube-api-access-t8pvx") pod "702a9aa6-bb25-4831-83be-50970db674b0" (UID: "702a9aa6-bb25-4831-83be-50970db674b0"). InnerVolumeSpecName "kube-api-access-t8pvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:24:06 crc kubenswrapper[5030]: I0121 00:24:06.800147 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8pvx\" (UniqueName: \"kubernetes.io/projected/702a9aa6-bb25-4831-83be-50970db674b0-kube-api-access-t8pvx\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.189251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.189463 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" podUID="97e620d3-b6ad-4847-94fa-23e999bebfca" containerName="manager" containerID="cri-o://89ac43add713166d036b64ce67b58bc1f08b82876bccda292a8c2a8ca3da9c54" gracePeriod=10 Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.465864 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.474564 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-h4n9w" podUID="57c9dd6b-175a-44f3-a3e7-90f211937483" containerName="registry-server" containerID="cri-o://9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40" gracePeriod=30 Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.517702 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.522472 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bsz4qp"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.595464 5030 generic.go:334] "Generic (PLEG): container finished" podID="97e620d3-b6ad-4847-94fa-23e999bebfca" containerID="89ac43add713166d036b64ce67b58bc1f08b82876bccda292a8c2a8ca3da9c54" exitCode=0 Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.595561 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" event={"ID":"97e620d3-b6ad-4847-94fa-23e999bebfca","Type":"ContainerDied","Data":"89ac43add713166d036b64ce67b58bc1f08b82876bccda292a8c2a8ca3da9c54"} Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.597765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-md4t4" event={"ID":"702a9aa6-bb25-4831-83be-50970db674b0","Type":"ContainerDied","Data":"0c6ec5416529b379f1d0b992695d2ed87876c9aeeabe5e9bdb80c9a978596fcf"} Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.597828 5030 scope.go:117] "RemoveContainer" containerID="2054f09e461f55bbf67cdcefc016529884b2e616ee4279b261f715286770a5d3" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.597993 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-md4t4" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.759271 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.773503 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.778197 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-md4t4"] Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.910776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.917074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqnk5\" (UniqueName: \"kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5\") pod \"97e620d3-b6ad-4847-94fa-23e999bebfca\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.917177 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert\") pod \"97e620d3-b6ad-4847-94fa-23e999bebfca\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.917269 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert\") pod \"97e620d3-b6ad-4847-94fa-23e999bebfca\" (UID: \"97e620d3-b6ad-4847-94fa-23e999bebfca\") " Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.922222 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "97e620d3-b6ad-4847-94fa-23e999bebfca" (UID: "97e620d3-b6ad-4847-94fa-23e999bebfca"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.922407 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5" (OuterVolumeSpecName: "kube-api-access-cqnk5") pod "97e620d3-b6ad-4847-94fa-23e999bebfca" (UID: "97e620d3-b6ad-4847-94fa-23e999bebfca"). InnerVolumeSpecName "kube-api-access-cqnk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.931076 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "97e620d3-b6ad-4847-94fa-23e999bebfca" (UID: "97e620d3-b6ad-4847-94fa-23e999bebfca"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.986951 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" path="/var/lib/kubelet/pods/362e9235-2f96-4edd-8cc4-a22b8fb7b6d2/volumes" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.988673 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375155e3-b3f2-44e8-a551-1050e4e072b4" path="/var/lib/kubelet/pods/375155e3-b3f2-44e8-a551-1050e4e072b4/volumes" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.991651 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702a9aa6-bb25-4831-83be-50970db674b0" path="/var/lib/kubelet/pods/702a9aa6-bb25-4831-83be-50970db674b0/volumes" Jan 21 00:24:07 crc kubenswrapper[5030]: I0121 00:24:07.992372 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91ab469-6741-4eb3-a3d6-f651b5925221" path="/var/lib/kubelet/pods/c91ab469-6741-4eb3-a3d6-f651b5925221/volumes" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.018081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gmpj\" (UniqueName: \"kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj\") pod \"57c9dd6b-175a-44f3-a3e7-90f211937483\" (UID: \"57c9dd6b-175a-44f3-a3e7-90f211937483\") " Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.022738 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.023132 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqnk5\" (UniqueName: \"kubernetes.io/projected/97e620d3-b6ad-4847-94fa-23e999bebfca-kube-api-access-cqnk5\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.023165 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97e620d3-b6ad-4847-94fa-23e999bebfca-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.024276 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj" (OuterVolumeSpecName: "kube-api-access-4gmpj") pod "57c9dd6b-175a-44f3-a3e7-90f211937483" (UID: "57c9dd6b-175a-44f3-a3e7-90f211937483"). InnerVolumeSpecName "kube-api-access-4gmpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.123807 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gmpj\" (UniqueName: \"kubernetes.io/projected/57c9dd6b-175a-44f3-a3e7-90f211937483-kube-api-access-4gmpj\") on node \"crc\" DevicePath \"\"" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.606429 5030 generic.go:334] "Generic (PLEG): container finished" podID="57c9dd6b-175a-44f3-a3e7-90f211937483" containerID="9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40" exitCode=0 Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.606498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h4n9w" event={"ID":"57c9dd6b-175a-44f3-a3e7-90f211937483","Type":"ContainerDied","Data":"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40"} Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.606521 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h4n9w" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.606540 5030 scope.go:117] "RemoveContainer" containerID="9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.606528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h4n9w" event={"ID":"57c9dd6b-175a-44f3-a3e7-90f211937483","Type":"ContainerDied","Data":"fece8b1a2609d19a5b995791ad370400f65ff59305975e5fbe7b5cfc77112a31"} Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.609694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" event={"ID":"97e620d3-b6ad-4847-94fa-23e999bebfca","Type":"ContainerDied","Data":"1630300bbae8359feaf6521ad26adaf28662620576998f09a10a58508489e174"} Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.609785 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.627716 5030 scope.go:117] "RemoveContainer" containerID="9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40" Jan 21 00:24:08 crc kubenswrapper[5030]: E0121 00:24:08.628127 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40\": container with ID starting with 9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40 not found: ID does not exist" containerID="9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.628165 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40"} err="failed to get container status \"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40\": rpc error: code = NotFound desc = could not find container \"9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40\": container with ID starting with 9576524e8323da7db57298b88f0dddedf4750d64317985c6b2e6459b77e6bc40 not found: ID does not exist" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.628190 5030 scope.go:117] "RemoveContainer" containerID="89ac43add713166d036b64ce67b58bc1f08b82876bccda292a8c2a8ca3da9c54" Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.633762 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.641313 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-76878c7469-zppmd"] Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.648915 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:24:08 crc kubenswrapper[5030]: I0121 00:24:08.654318 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-h4n9w"] Jan 21 00:24:09 crc kubenswrapper[5030]: I0121 00:24:09.970029 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c9dd6b-175a-44f3-a3e7-90f211937483" path="/var/lib/kubelet/pods/57c9dd6b-175a-44f3-a3e7-90f211937483/volumes" Jan 21 00:24:09 crc kubenswrapper[5030]: I0121 00:24:09.970821 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e620d3-b6ad-4847-94fa-23e999bebfca" path="/var/lib/kubelet/pods/97e620d3-b6ad-4847-94fa-23e999bebfca/volumes" Jan 21 00:24:10 crc kubenswrapper[5030]: I0121 00:24:10.962572 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:24:10 crc kubenswrapper[5030]: E0121 00:24:10.962962 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:24:22 crc kubenswrapper[5030]: I0121 00:24:22.962114 5030 scope.go:117] "RemoveContainer" 
containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:24:22 crc kubenswrapper[5030]: E0121 00:24:22.962871 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:24:34 crc kubenswrapper[5030]: I0121 00:24:34.962373 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:24:34 crc kubenswrapper[5030]: E0121 00:24:34.963326 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:24:49 crc kubenswrapper[5030]: I0121 00:24:49.962235 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:24:49 crc kubenswrapper[5030]: E0121 00:24:49.963142 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:00 crc kubenswrapper[5030]: I0121 00:25:00.962588 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:25:00 crc kubenswrapper[5030]: E0121 00:25:00.963393 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289133 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289442 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289466 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289481 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289490 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" containerName="manager" Jan 21 00:25:01 crc 
kubenswrapper[5030]: E0121 00:25:01.289508 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289516 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289527 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11ee764-dc27-451e-a58c-ab22788647f5" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289535 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11ee764-dc27-451e-a58c-ab22788647f5" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289545 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289553 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289571 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6ed570-11c2-4baf-b681-8e330085006b" containerName="keystone-api" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289578 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6ed570-11c2-4baf-b681-8e330085006b" containerName="keystone-api" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289589 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289596 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289610 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289637 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289652 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e289ac96-4ca2-4b3b-b176-72ff05794567" containerName="mariadb-account-delete" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289659 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e289ac96-4ca2-4b3b-b176-72ff05794567" containerName="mariadb-account-delete" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289671 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" containerName="operator" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289679 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" containerName="operator" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289699 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289706 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: 
E0121 00:25:01.289717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="setup-container" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289724 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="setup-container" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289736 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289742 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="mysql-bootstrap" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289756 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702a9aa6-bb25-4831-83be-50970db674b0" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289763 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="702a9aa6-bb25-4831-83be-50970db674b0" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289774 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289781 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289790 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c9dd6b-175a-44f3-a3e7-90f211937483" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289797 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c9dd6b-175a-44f3-a3e7-90f211937483" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289811 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58fca83f-7678-4c7a-bff7-5830b38c788f" containerName="memcached" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289818 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fca83f-7678-4c7a-bff7-5830b38c788f" containerName="memcached" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289830 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e620d3-b6ad-4847-94fa-23e999bebfca" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e620d3-b6ad-4847-94fa-23e999bebfca" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289850 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134304ed-5bea-4841-aaa0-d7d7284fc67e" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289857 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="134304ed-5bea-4841-aaa0-d7d7284fc67e" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289870 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="rabbitmq" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289877 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="rabbitmq" Jan 21 00:25:01 crc kubenswrapper[5030]: E0121 00:25:01.289890 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="915a4368-1348-488c-a968-bec3d1b87d23" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.289897 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="915a4368-1348-488c-a968-bec3d1b87d23" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290040 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11ee764-dc27-451e-a58c-ab22788647f5" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290053 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a2d438-b8dc-4dac-a66b-7e596e32c4f3" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290065 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3756e766-1fab-47ec-aefc-8ad5e10f4fe4" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290079 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c6c096a-7e9f-4a9e-8145-23ceb5dc0b1e" containerName="operator" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290094 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f30d64-1c48-455e-be8e-ad27d8c2ea05" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290104 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="07630df9-d9b5-4944-939a-6a284e5e1488" containerName="rabbitmq" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290115 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="134304ed-5bea-4841-aaa0-d7d7284fc67e" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290123 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6ed570-11c2-4baf-b681-8e330085006b" containerName="keystone-api" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290132 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c9dd6b-175a-44f3-a3e7-90f211937483" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290142 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="702a9aa6-bb25-4831-83be-50970db674b0" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290150 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="58fca83f-7678-4c7a-bff7-5830b38c788f" containerName="memcached" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290160 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e620d3-b6ad-4847-94fa-23e999bebfca" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290167 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7186abfd-4ab6-445a-9c9b-9f512a483acc" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290180 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="233dc5f6-9da2-4f64-9e14-9ffe70f02ab8" containerName="galera" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290189 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="362e9235-2f96-4edd-8cc4-a22b8fb7b6d2" containerName="manager" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290202 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="915a4368-1348-488c-a968-bec3d1b87d23" containerName="registry-server" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290214 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e289ac96-4ca2-4b3b-b176-72ff05794567" containerName="mariadb-account-delete" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.290845 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.294835 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.294931 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.297464 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-whg75" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.311840 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.364312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh4l\" (UniqueName: \"kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l\") pod \"mariadb-operator-index-z9shv\" (UID: \"ebc399f6-b5c4-4e7b-9128-2c339687a516\") " pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.465782 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh4l\" (UniqueName: \"kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l\") pod \"mariadb-operator-index-z9shv\" (UID: \"ebc399f6-b5c4-4e7b-9128-2c339687a516\") " pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.487181 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh4l\" (UniqueName: \"kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l\") pod \"mariadb-operator-index-z9shv\" (UID: \"ebc399f6-b5c4-4e7b-9128-2c339687a516\") " pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.618358 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.663033 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.873971 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.874829 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.889667 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:25:01 crc kubenswrapper[5030]: I0121 00:25:01.978227 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klbpj\" (UniqueName: \"kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj\") pod \"mariadb-operator-index-fgsht\" (UID: \"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59\") " pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.079877 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klbpj\" (UniqueName: \"kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj\") pod \"mariadb-operator-index-fgsht\" (UID: \"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59\") " pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.091668 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.097961 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.102857 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klbpj\" (UniqueName: \"kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj\") pod \"mariadb-operator-index-fgsht\" (UID: \"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59\") " pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.193221 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:02 crc kubenswrapper[5030]: I0121 00:25:02.390532 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.030260 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-fgsht" event={"ID":"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59","Type":"ContainerStarted","Data":"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506"} Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.030316 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-fgsht" event={"ID":"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59","Type":"ContainerStarted","Data":"a981fc1ec3dda9b939ec3252eec526ce1013e8a3adf73257fbb3c9b79410b7ea"} Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.033658 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-z9shv" event={"ID":"ebc399f6-b5c4-4e7b-9128-2c339687a516","Type":"ContainerStarted","Data":"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598"} Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.033695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-z9shv" event={"ID":"ebc399f6-b5c4-4e7b-9128-2c339687a516","Type":"ContainerStarted","Data":"4bee2756c3eb719ad5691333fa7e731f66662efcb4d82d8407f8b07c62bb05c8"} Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.033804 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-z9shv" podUID="ebc399f6-b5c4-4e7b-9128-2c339687a516" containerName="registry-server" containerID="cri-o://cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598" gracePeriod=2 Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.045798 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-fgsht" podStartSLOduration=1.611020124 podStartE2EDuration="2.045774972s" podCreationTimestamp="2026-01-21 00:25:01 +0000 UTC" firstStartedPulling="2026-01-21 00:25:02.397284796 +0000 UTC m=+6574.717545084" lastFinishedPulling="2026-01-21 00:25:02.832039644 +0000 UTC m=+6575.152299932" observedRunningTime="2026-01-21 00:25:03.044855689 +0000 UTC m=+6575.365115997" watchObservedRunningTime="2026-01-21 00:25:03.045774972 +0000 UTC m=+6575.366035260" Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.083504 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-z9shv" podStartSLOduration=1.529109606 podStartE2EDuration="2.083484603s" podCreationTimestamp="2026-01-21 00:25:01 +0000 UTC" firstStartedPulling="2026-01-21 00:25:02.097701037 +0000 UTC m=+6574.417961315" lastFinishedPulling="2026-01-21 00:25:02.652076034 +0000 UTC m=+6574.972336312" observedRunningTime="2026-01-21 00:25:03.063581378 +0000 UTC m=+6575.383841696" watchObservedRunningTime="2026-01-21 00:25:03.083484603 +0000 UTC m=+6575.403744891" Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.349448 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.401724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxh4l\" (UniqueName: \"kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l\") pod \"ebc399f6-b5c4-4e7b-9128-2c339687a516\" (UID: \"ebc399f6-b5c4-4e7b-9128-2c339687a516\") " Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.407463 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l" (OuterVolumeSpecName: "kube-api-access-cxh4l") pod "ebc399f6-b5c4-4e7b-9128-2c339687a516" (UID: "ebc399f6-b5c4-4e7b-9128-2c339687a516"). InnerVolumeSpecName "kube-api-access-cxh4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:25:03 crc kubenswrapper[5030]: I0121 00:25:03.503519 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxh4l\" (UniqueName: \"kubernetes.io/projected/ebc399f6-b5c4-4e7b-9128-2c339687a516-kube-api-access-cxh4l\") on node \"crc\" DevicePath \"\"" Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.046192 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-z9shv" Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.046214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-z9shv" event={"ID":"ebc399f6-b5c4-4e7b-9128-2c339687a516","Type":"ContainerDied","Data":"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598"} Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.046146 5030 generic.go:334] "Generic (PLEG): container finished" podID="ebc399f6-b5c4-4e7b-9128-2c339687a516" containerID="cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598" exitCode=0 Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.047128 5030 scope.go:117] "RemoveContainer" containerID="cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598" Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.053036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-z9shv" event={"ID":"ebc399f6-b5c4-4e7b-9128-2c339687a516","Type":"ContainerDied","Data":"4bee2756c3eb719ad5691333fa7e731f66662efcb4d82d8407f8b07c62bb05c8"} Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.065807 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.070251 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-z9shv"] Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.090174 5030 scope.go:117] "RemoveContainer" containerID="cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598" Jan 21 00:25:04 crc kubenswrapper[5030]: E0121 00:25:04.090890 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598\": container with ID starting with cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598 not found: ID does not exist" containerID="cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598" Jan 21 00:25:04 crc kubenswrapper[5030]: I0121 00:25:04.090924 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598"} err="failed to get container status \"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598\": rpc error: code = NotFound desc = could not find container \"cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598\": container with ID starting with cb63dbccc7719c0ba5e4e8c406bf69aa6ba2ddca29182c1bc01dc98584e12598 not found: ID does not exist" Jan 21 00:25:05 crc kubenswrapper[5030]: I0121 00:25:05.971799 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc399f6-b5c4-4e7b-9128-2c339687a516" path="/var/lib/kubelet/pods/ebc399f6-b5c4-4e7b-9128-2c339687a516/volumes" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.906132 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv"] Jan 21 00:25:07 crc kubenswrapper[5030]: E0121 00:25:07.906755 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc399f6-b5c4-4e7b-9128-2c339687a516" containerName="registry-server" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.906773 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc399f6-b5c4-4e7b-9128-2c339687a516" containerName="registry-server" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.906925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc399f6-b5c4-4e7b-9128-2c339687a516" containerName="registry-server" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.907967 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.912191 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:25:07 crc kubenswrapper[5030]: I0121 00:25:07.923353 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv"] Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.066377 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6kcm\" (UniqueName: \"kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.066743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.066896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " 
pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.167957 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.168020 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.168056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6kcm\" (UniqueName: \"kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.168856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.169073 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.185767 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6kcm\" (UniqueName: \"kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.226894 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:08 crc kubenswrapper[5030]: I0121 00:25:08.641460 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv"] Jan 21 00:25:09 crc kubenswrapper[5030]: I0121 00:25:09.091912 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerID="ce34f37c43d1ef02c865bcfa15b6652fa8311a3a8ebf277a435dce37d5e3dad0" exitCode=0 Jan 21 00:25:09 crc kubenswrapper[5030]: I0121 00:25:09.092016 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" event={"ID":"ab25a3ce-1d1e-4621-93d3-51d691be4b9f","Type":"ContainerDied","Data":"ce34f37c43d1ef02c865bcfa15b6652fa8311a3a8ebf277a435dce37d5e3dad0"} Jan 21 00:25:09 crc kubenswrapper[5030]: I0121 00:25:09.092233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" event={"ID":"ab25a3ce-1d1e-4621-93d3-51d691be4b9f","Type":"ContainerStarted","Data":"d3b3227d1cafa588b8a49e5b069509989a8004a9f5a80854186cac07353f2391"} Jan 21 00:25:11 crc kubenswrapper[5030]: I0121 00:25:11.961662 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:25:11 crc kubenswrapper[5030]: E0121 00:25:11.962189 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:12 crc kubenswrapper[5030]: I0121 00:25:12.115733 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerID="b5d63c50bd5504ebbdb52eedc03b351d2a212b1cd402d8f005df68b3ba72e750" exitCode=0 Jan 21 00:25:12 crc kubenswrapper[5030]: I0121 00:25:12.115784 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" event={"ID":"ab25a3ce-1d1e-4621-93d3-51d691be4b9f","Type":"ContainerDied","Data":"b5d63c50bd5504ebbdb52eedc03b351d2a212b1cd402d8f005df68b3ba72e750"} Jan 21 00:25:12 crc kubenswrapper[5030]: I0121 00:25:12.193366 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:12 crc kubenswrapper[5030]: I0121 00:25:12.193411 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:12 crc kubenswrapper[5030]: I0121 00:25:12.238850 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:13 crc kubenswrapper[5030]: I0121 00:25:13.123403 5030 generic.go:334] "Generic (PLEG): container finished" podID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerID="a99af352c8b003deea9af8fd281ac887d2d89e23b061a480da35bfdb2e0b70c0" exitCode=0 Jan 21 00:25:13 crc kubenswrapper[5030]: I0121 00:25:13.123493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" event={"ID":"ab25a3ce-1d1e-4621-93d3-51d691be4b9f","Type":"ContainerDied","Data":"a99af352c8b003deea9af8fd281ac887d2d89e23b061a480da35bfdb2e0b70c0"} Jan 21 00:25:13 crc kubenswrapper[5030]: I0121 00:25:13.153065 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.400520 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.553027 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle\") pod \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.553159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6kcm\" (UniqueName: \"kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm\") pod \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.553189 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util\") pod \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\" (UID: \"ab25a3ce-1d1e-4621-93d3-51d691be4b9f\") " Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.553998 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle" (OuterVolumeSpecName: "bundle") pod "ab25a3ce-1d1e-4621-93d3-51d691be4b9f" (UID: "ab25a3ce-1d1e-4621-93d3-51d691be4b9f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.559009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm" (OuterVolumeSpecName: "kube-api-access-t6kcm") pod "ab25a3ce-1d1e-4621-93d3-51d691be4b9f" (UID: "ab25a3ce-1d1e-4621-93d3-51d691be4b9f"). InnerVolumeSpecName "kube-api-access-t6kcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.569322 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util" (OuterVolumeSpecName: "util") pod "ab25a3ce-1d1e-4621-93d3-51d691be4b9f" (UID: "ab25a3ce-1d1e-4621-93d3-51d691be4b9f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.654817 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.654872 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6kcm\" (UniqueName: \"kubernetes.io/projected/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-kube-api-access-t6kcm\") on node \"crc\" DevicePath \"\"" Jan 21 00:25:14 crc kubenswrapper[5030]: I0121 00:25:14.654889 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab25a3ce-1d1e-4621-93d3-51d691be4b9f-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:25:15 crc kubenswrapper[5030]: I0121 00:25:15.157163 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" event={"ID":"ab25a3ce-1d1e-4621-93d3-51d691be4b9f","Type":"ContainerDied","Data":"d3b3227d1cafa588b8a49e5b069509989a8004a9f5a80854186cac07353f2391"} Jan 21 00:25:15 crc kubenswrapper[5030]: I0121 00:25:15.157204 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b3227d1cafa588b8a49e5b069509989a8004a9f5a80854186cac07353f2391" Jan 21 00:25:15 crc kubenswrapper[5030]: I0121 00:25:15.157234 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv" Jan 21 00:25:22 crc kubenswrapper[5030]: I0121 00:25:22.962439 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:25:22 crc kubenswrapper[5030]: E0121 00:25:22.963086 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.873388 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:25:24 crc kubenswrapper[5030]: E0121 00:25:24.873717 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="util" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.873731 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="util" Jan 21 00:25:24 crc kubenswrapper[5030]: E0121 00:25:24.873749 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="extract" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.873757 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="extract" Jan 21 00:25:24 crc kubenswrapper[5030]: E0121 00:25:24.873796 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="pull" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.873805 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="pull" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.873943 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" containerName="extract" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.874447 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.876425 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.876700 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wgfjk" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.878020 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 00:25:24 crc kubenswrapper[5030]: I0121 00:25:24.893543 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.007406 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.007783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.007834 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqts\" (UniqueName: \"kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.108662 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.108810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " 
pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.108912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqts\" (UniqueName: \"kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.122870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.123135 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.130471 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqts\" (UniqueName: \"kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts\") pod \"mariadb-operator-controller-manager-547684f4d7-2vk2d\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.196432 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:25 crc kubenswrapper[5030]: I0121 00:25:25.628992 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:25:26 crc kubenswrapper[5030]: I0121 00:25:26.272280 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" event={"ID":"c46a4962-cf58-47ea-8ea6-b603605c747b","Type":"ContainerStarted","Data":"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee"} Jan 21 00:25:26 crc kubenswrapper[5030]: I0121 00:25:26.272692 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" event={"ID":"c46a4962-cf58-47ea-8ea6-b603605c747b","Type":"ContainerStarted","Data":"a6690e7cba154e5d0b3b06be4e6e6081f32a5a7ba133ca5e9f8d152f72974112"} Jan 21 00:25:26 crc kubenswrapper[5030]: I0121 00:25:26.273001 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:26 crc kubenswrapper[5030]: I0121 00:25:26.292208 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" podStartSLOduration=2.292168363 podStartE2EDuration="2.292168363s" podCreationTimestamp="2026-01-21 00:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:25:26.29165952 +0000 UTC m=+6598.611919828" watchObservedRunningTime="2026-01-21 00:25:26.292168363 +0000 UTC m=+6598.612428651" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.466031 5030 scope.go:117] "RemoveContainer" containerID="555c8eef907969632d5648a21d3b0319730cf56a78b22a4fe57a6084e80e7e9f" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.492477 5030 scope.go:117] "RemoveContainer" containerID="49387d81df18a11895c2eaa3aae68f8bdf0370436f03e99bc447e8ffe027ee1a" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.518081 5030 scope.go:117] "RemoveContainer" containerID="6ab77e0659379b1d9e7ff1fe4cf2d42d2b2882b795dd39477a8a0ddbab18aab1" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.542721 5030 scope.go:117] "RemoveContainer" containerID="36ccad619f00d34d28f42dcfbcbd572472f7da9ef5c7cbe484910369636a8140" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.565701 5030 scope.go:117] "RemoveContainer" containerID="1a814456dfcca7833db34b890f1cd5233c2654b9d22554d4f0ba998177248e68" Jan 21 00:25:31 crc kubenswrapper[5030]: I0121 00:25:31.584645 5030 scope.go:117] "RemoveContainer" containerID="3515afd57f607427efd5ce2644f5dd488504d28f646955b9c4ef6c112083c664" Jan 21 00:25:35 crc kubenswrapper[5030]: I0121 00:25:35.202565 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:25:36 crc kubenswrapper[5030]: I0121 00:25:36.962194 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:25:36 crc kubenswrapper[5030]: E0121 00:25:36.962559 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.056598 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.058721 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.067150 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.160976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrc7r\" (UniqueName: \"kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r\") pod \"infra-operator-index-zhhkb\" (UID: \"ae461baa-acde-4165-bb3a-4fdbd928937a\") " pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.262801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrc7r\" (UniqueName: \"kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r\") pod \"infra-operator-index-zhhkb\" (UID: \"ae461baa-acde-4165-bb3a-4fdbd928937a\") " pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.283197 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrc7r\" (UniqueName: \"kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r\") pod \"infra-operator-index-zhhkb\" (UID: \"ae461baa-acde-4165-bb3a-4fdbd928937a\") " pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.433585 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:42 crc kubenswrapper[5030]: I0121 00:25:42.851790 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:43 crc kubenswrapper[5030]: I0121 00:25:43.454176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-zhhkb" event={"ID":"ae461baa-acde-4165-bb3a-4fdbd928937a","Type":"ContainerStarted","Data":"ff6f740c23113bbd525bed8a8f678583b6b737dea2be2df1f5f5db8221daa906"} Jan 21 00:25:44 crc kubenswrapper[5030]: I0121 00:25:44.463402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-zhhkb" event={"ID":"ae461baa-acde-4165-bb3a-4fdbd928937a","Type":"ContainerStarted","Data":"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f"} Jan 21 00:25:44 crc kubenswrapper[5030]: I0121 00:25:44.480465 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-zhhkb" podStartSLOduration=1.986118893 podStartE2EDuration="2.480442765s" podCreationTimestamp="2026-01-21 00:25:42 +0000 UTC" firstStartedPulling="2026-01-21 00:25:42.861149971 +0000 UTC m=+6615.181410249" lastFinishedPulling="2026-01-21 00:25:43.355473833 +0000 UTC m=+6615.675734121" observedRunningTime="2026-01-21 00:25:44.476478 +0000 UTC m=+6616.796738298" watchObservedRunningTime="2026-01-21 00:25:44.480442765 +0000 UTC m=+6616.800703053" Jan 21 00:25:45 crc kubenswrapper[5030]: I0121 00:25:45.826071 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.211285 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.213458 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.235162 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.235210 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.235398 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.235461 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-ms2hx" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.237555 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.242395 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.249093 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.251413 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.256221 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.278232 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.282645 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.289394 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320541 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phztj\" (UniqueName: \"kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320634 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320687 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320712 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320727 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc 
kubenswrapper[5030]: I0121 00:25:46.320825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320894 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320961 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4xwd\" (UniqueName: \"kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.320986 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423076 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g987\" (UniqueName: \"kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423170 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423207 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " 
pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423231 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4xwd\" (UniqueName: \"kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423299 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423327 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423350 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423371 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phztj\" (UniqueName: \"kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423397 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423421 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc 
kubenswrapper[5030]: I0121 00:25:46.423464 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423482 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423509 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423528 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.423897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.424126 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") device mount path \"/mnt/openstack/pv18\"" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.424699 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.424963 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.425494 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.425637 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.425901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.426401 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") device mount path \"/mnt/openstack/pv04\"" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.427709 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.428194 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.445740 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.446308 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phztj\" (UniqueName: \"kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.447136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.451409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4xwd\" (UniqueName: 
\"kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd\") pod \"openstack-galera-2\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.475478 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-zhhkb" podUID="ae461baa-acde-4165-bb3a-4fdbd928937a" containerName="registry-server" containerID="cri-o://ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f" gracePeriod=2 Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.524484 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g987\" (UniqueName: \"kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.524916 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.524954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.525223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.525258 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.525288 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.525152 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") device mount path \"/mnt/openstack/pv03\"" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.525791 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated\") pod 
\"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.526121 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.526408 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.526786 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.546110 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g987\" (UniqueName: \"kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.549888 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-1\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.582796 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.607098 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.610524 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.639181 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.640156 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.646015 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-78jwr" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.646663 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.728125 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27t47\" (UniqueName: \"kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47\") pod \"infra-operator-index-7c4sv\" (UID: \"86f764f9-6250-4fda-8c47-96f9795a2fc6\") " pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.829531 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27t47\" (UniqueName: \"kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47\") pod \"infra-operator-index-7c4sv\" (UID: \"86f764f9-6250-4fda-8c47-96f9795a2fc6\") " pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.848247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27t47\" (UniqueName: \"kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47\") pod \"infra-operator-index-7c4sv\" (UID: \"86f764f9-6250-4fda-8c47-96f9795a2fc6\") " pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.852695 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.930506 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrc7r\" (UniqueName: \"kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r\") pod \"ae461baa-acde-4165-bb3a-4fdbd928937a\" (UID: \"ae461baa-acde-4165-bb3a-4fdbd928937a\") " Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.933542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r" (OuterVolumeSpecName: "kube-api-access-zrc7r") pod "ae461baa-acde-4165-bb3a-4fdbd928937a" (UID: "ae461baa-acde-4165-bb3a-4fdbd928937a"). InnerVolumeSpecName "kube-api-access-zrc7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:25:46 crc kubenswrapper[5030]: I0121 00:25:46.971318 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.032780 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrc7r\" (UniqueName: \"kubernetes.io/projected/ae461baa-acde-4165-bb3a-4fdbd928937a-kube-api-access-zrc7r\") on node \"crc\" DevicePath \"\"" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.057976 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:25:47 crc kubenswrapper[5030]: W0121 00:25:47.069026 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0577056e_7cb4_41ab_8763_f97dcbfa4f37.slice/crio-f8a787fc12fb04eaf61b204fee5319b04c4a69251ac36a51f9b6d9e3274bb785 WatchSource:0}: Error finding container f8a787fc12fb04eaf61b204fee5319b04c4a69251ac36a51f9b6d9e3274bb785: Status 404 returned error can't find the container with id f8a787fc12fb04eaf61b204fee5319b04c4a69251ac36a51f9b6d9e3274bb785 Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.118270 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:25:47 crc kubenswrapper[5030]: W0121 00:25:47.148184 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3af3725b_a540_4b57_9655_0087cea23aa0.slice/crio-c091f8dfb5135dc8afec64ad97aaac9348a03d304b2ee147376ed1105a33f85a WatchSource:0}: Error finding container c091f8dfb5135dc8afec64ad97aaac9348a03d304b2ee147376ed1105a33f85a: Status 404 returned error can't find the container with id c091f8dfb5135dc8afec64ad97aaac9348a03d304b2ee147376ed1105a33f85a Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.177075 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.182173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:25:47 crc kubenswrapper[5030]: W0121 00:25:47.204743 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f764f9_6250_4fda_8c47_96f9795a2fc6.slice/crio-0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0 WatchSource:0}: Error finding container 0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0: Status 404 returned error can't find the container with id 0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0 Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.485060 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerStarted","Data":"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.485359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerStarted","Data":"c091f8dfb5135dc8afec64ad97aaac9348a03d304b2ee147376ed1105a33f85a"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.489815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7c4sv" 
event={"ID":"86f764f9-6250-4fda-8c47-96f9795a2fc6","Type":"ContainerStarted","Data":"0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.491652 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerStarted","Data":"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.491680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerStarted","Data":"f8a787fc12fb04eaf61b204fee5319b04c4a69251ac36a51f9b6d9e3274bb785"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.494430 5030 generic.go:334] "Generic (PLEG): container finished" podID="ae461baa-acde-4165-bb3a-4fdbd928937a" containerID="ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f" exitCode=0 Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.494500 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-zhhkb" event={"ID":"ae461baa-acde-4165-bb3a-4fdbd928937a","Type":"ContainerDied","Data":"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.494522 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-zhhkb" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.494551 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-zhhkb" event={"ID":"ae461baa-acde-4165-bb3a-4fdbd928937a","Type":"ContainerDied","Data":"ff6f740c23113bbd525bed8a8f678583b6b737dea2be2df1f5f5db8221daa906"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.494574 5030 scope.go:117] "RemoveContainer" containerID="ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.501278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerStarted","Data":"9a0d8a8b1a729f99b29930c0d5e0f8a964b5aa9e6ff5d124174b406f41407965"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.501359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerStarted","Data":"062e6991dd6a95ed2a8fbda6821475998bd94411ac73c979320f305a0ce8fab8"} Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.534861 5030 scope.go:117] "RemoveContainer" containerID="ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f" Jan 21 00:25:47 crc kubenswrapper[5030]: E0121 00:25:47.535474 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f\": container with ID starting with ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f not found: ID does not exist" containerID="ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.535517 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f"} err="failed to get container status \"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f\": rpc error: code = NotFound desc = could not find container \"ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f\": container with ID starting with ba1f65a778f4e68878a5b9b770a3f3866ec46c7794edb970c381ef2243aeb62f not found: ID does not exist" Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.579447 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.585864 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-zhhkb"] Jan 21 00:25:47 crc kubenswrapper[5030]: I0121 00:25:47.970857 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae461baa-acde-4165-bb3a-4fdbd928937a" path="/var/lib/kubelet/pods/ae461baa-acde-4165-bb3a-4fdbd928937a/volumes" Jan 21 00:25:49 crc kubenswrapper[5030]: I0121 00:25:49.517421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7c4sv" event={"ID":"86f764f9-6250-4fda-8c47-96f9795a2fc6","Type":"ContainerStarted","Data":"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412"} Jan 21 00:25:49 crc kubenswrapper[5030]: I0121 00:25:49.534667 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-7c4sv" podStartSLOduration=2.921681453 podStartE2EDuration="3.534647059s" podCreationTimestamp="2026-01-21 00:25:46 +0000 UTC" firstStartedPulling="2026-01-21 00:25:47.207379238 +0000 UTC m=+6619.527639526" lastFinishedPulling="2026-01-21 00:25:47.820344844 +0000 UTC m=+6620.140605132" observedRunningTime="2026-01-21 00:25:49.530135052 +0000 UTC m=+6621.850395360" watchObservedRunningTime="2026-01-21 00:25:49.534647059 +0000 UTC m=+6621.854907347" Jan 21 00:25:49 crc kubenswrapper[5030]: I0121 00:25:49.962762 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:25:49 crc kubenswrapper[5030]: E0121 00:25:49.963064 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:25:56 crc kubenswrapper[5030]: I0121 00:25:56.972426 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:56 crc kubenswrapper[5030]: I0121 00:25:56.973072 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:57 crc kubenswrapper[5030]: I0121 00:25:57.012869 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:57 crc kubenswrapper[5030]: I0121 00:25:57.591171 5030 generic.go:334] "Generic (PLEG): container finished" podID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerID="af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2" exitCode=0 Jan 21 00:25:57 crc kubenswrapper[5030]: 
I0121 00:25:57.591289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerDied","Data":"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2"} Jan 21 00:25:57 crc kubenswrapper[5030]: I0121 00:25:57.624071 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.601564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerStarted","Data":"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c"} Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.604402 5030 generic.go:334] "Generic (PLEG): container finished" podID="59579353-499a-4005-8702-38e5be21d855" containerID="9a0d8a8b1a729f99b29930c0d5e0f8a964b5aa9e6ff5d124174b406f41407965" exitCode=0 Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.604467 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerDied","Data":"9a0d8a8b1a729f99b29930c0d5e0f8a964b5aa9e6ff5d124174b406f41407965"} Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.606480 5030 generic.go:334] "Generic (PLEG): container finished" podID="3af3725b-a540-4b57-9655-0087cea23aa0" containerID="2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74" exitCode=0 Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.606582 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerDied","Data":"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74"} Jan 21 00:25:58 crc kubenswrapper[5030]: I0121 00:25:58.663447 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=13.663426729 podStartE2EDuration="13.663426729s" podCreationTimestamp="2026-01-21 00:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:25:58.63709445 +0000 UTC m=+6630.957354748" watchObservedRunningTime="2026-01-21 00:25:58.663426729 +0000 UTC m=+6630.983687017" Jan 21 00:25:59 crc kubenswrapper[5030]: I0121 00:25:59.616507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerStarted","Data":"e6836d2cd0525dcc81f5327e90f13f4b8342fe25b3f7ca5d7231bc3df10a4354"} Jan 21 00:25:59 crc kubenswrapper[5030]: I0121 00:25:59.620742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerStarted","Data":"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f"} Jan 21 00:26:00 crc kubenswrapper[5030]: I0121 00:26:00.644358 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=15.644337825000001 podStartE2EDuration="15.644337825s" podCreationTimestamp="2026-01-21 00:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-21 00:26:00.641715162 +0000 UTC m=+6632.961975470" watchObservedRunningTime="2026-01-21 00:26:00.644337825 +0000 UTC m=+6632.964598123" Jan 21 00:26:00 crc kubenswrapper[5030]: I0121 00:26:00.664934 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=15.664909856 podStartE2EDuration="15.664909856s" podCreationTimestamp="2026-01-21 00:25:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:26:00.658412151 +0000 UTC m=+6632.978672439" watchObservedRunningTime="2026-01-21 00:26:00.664909856 +0000 UTC m=+6632.985170144" Jan 21 00:26:00 crc kubenswrapper[5030]: I0121 00:26:00.962589 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:26:00 crc kubenswrapper[5030]: E0121 00:26:00.962823 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.682138 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22"] Jan 21 00:26:05 crc kubenswrapper[5030]: E0121 00:26:05.683019 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae461baa-acde-4165-bb3a-4fdbd928937a" containerName="registry-server" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.683038 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae461baa-acde-4165-bb3a-4fdbd928937a" containerName="registry-server" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.683192 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae461baa-acde-4165-bb3a-4fdbd928937a" containerName="registry-server" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.684508 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.686130 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.705146 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22"] Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.722848 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr7t7\" (UniqueName: \"kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.722917 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.722965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.824266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7t7\" (UniqueName: \"kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.824323 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.824361 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.824836 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.824959 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:05 crc kubenswrapper[5030]: I0121 00:26:05.844574 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr7t7\" (UniqueName: \"kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.007180 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.456342 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22"] Jan 21 00:26:06 crc kubenswrapper[5030]: W0121 00:26:06.460913 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e006733_b13c_4371_922e_b87b7a68a197.slice/crio-3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca WatchSource:0}: Error finding container 3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca: Status 404 returned error can't find the container with id 3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.583015 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.583075 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.608299 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.608587 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.611318 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.611365 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.680272 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" 
event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerStarted","Data":"8219be5f3668ca7881c7749e39a10f27cb54e97f6465d96081a6bd72380f6803"} Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.680339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerStarted","Data":"3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca"} Jan 21 00:26:06 crc kubenswrapper[5030]: I0121 00:26:06.700552 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:26:07 crc kubenswrapper[5030]: I0121 00:26:07.689938 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e006733-b13c-4371-922e-b87b7a68a197" containerID="8219be5f3668ca7881c7749e39a10f27cb54e97f6465d96081a6bd72380f6803" exitCode=0 Jan 21 00:26:07 crc kubenswrapper[5030]: I0121 00:26:07.689992 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerDied","Data":"8219be5f3668ca7881c7749e39a10f27cb54e97f6465d96081a6bd72380f6803"} Jan 21 00:26:07 crc kubenswrapper[5030]: I0121 00:26:07.773232 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:26:08 crc kubenswrapper[5030]: I0121 00:26:08.716588 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e006733-b13c-4371-922e-b87b7a68a197" containerID="e84b6ed2beeb34a63c1a4828cc53135a0fa535c31340599e05b0dc1db6b4cb5c" exitCode=0 Jan 21 00:26:08 crc kubenswrapper[5030]: I0121 00:26:08.716796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerDied","Data":"e84b6ed2beeb34a63c1a4828cc53135a0fa535c31340599e05b0dc1db6b4cb5c"} Jan 21 00:26:09 crc kubenswrapper[5030]: I0121 00:26:09.726025 5030 generic.go:334] "Generic (PLEG): container finished" podID="8e006733-b13c-4371-922e-b87b7a68a197" containerID="665c94c69f82a90b6f6b3a7d0f22f91a514d01ed6d9711663321031bedb6f5fd" exitCode=0 Jan 21 00:26:09 crc kubenswrapper[5030]: I0121 00:26:09.726096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerDied","Data":"665c94c69f82a90b6f6b3a7d0f22f91a514d01ed6d9711663321031bedb6f5fd"} Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.140157 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.302251 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util\") pod \"8e006733-b13c-4371-922e-b87b7a68a197\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.302335 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr7t7\" (UniqueName: \"kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7\") pod \"8e006733-b13c-4371-922e-b87b7a68a197\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.302464 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle\") pod \"8e006733-b13c-4371-922e-b87b7a68a197\" (UID: \"8e006733-b13c-4371-922e-b87b7a68a197\") " Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.304275 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle" (OuterVolumeSpecName: "bundle") pod "8e006733-b13c-4371-922e-b87b7a68a197" (UID: "8e006733-b13c-4371-922e-b87b7a68a197"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.309286 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7" (OuterVolumeSpecName: "kube-api-access-mr7t7") pod "8e006733-b13c-4371-922e-b87b7a68a197" (UID: "8e006733-b13c-4371-922e-b87b7a68a197"). InnerVolumeSpecName "kube-api-access-mr7t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.315599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util" (OuterVolumeSpecName: "util") pod "8e006733-b13c-4371-922e-b87b7a68a197" (UID: "8e006733-b13c-4371-922e-b87b7a68a197"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.404108 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.404151 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e006733-b13c-4371-922e-b87b7a68a197-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.404168 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr7t7\" (UniqueName: \"kubernetes.io/projected/8e006733-b13c-4371-922e-b87b7a68a197-kube-api-access-mr7t7\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.745150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" event={"ID":"8e006733-b13c-4371-922e-b87b7a68a197","Type":"ContainerDied","Data":"3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca"} Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.745202 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3486188413ce06af7a95e7160e795b0251ad9e23503ec7497e6ec66b0a832bca" Jan 21 00:26:11 crc kubenswrapper[5030]: I0121 00:26:11.745243 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22" Jan 21 00:26:14 crc kubenswrapper[5030]: I0121 00:26:14.961599 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:26:14 crc kubenswrapper[5030]: E0121 00:26:14.962151 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.309885 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/root-account-create-update-2kwnl"] Jan 21 00:26:15 crc kubenswrapper[5030]: E0121 00:26:15.310587 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="pull" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.310760 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="pull" Jan 21 00:26:15 crc kubenswrapper[5030]: E0121 00:26:15.310901 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="extract" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.311029 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="extract" Jan 21 00:26:15 crc kubenswrapper[5030]: E0121 00:26:15.311143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="util" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.311256 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="util" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.311581 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e006733-b13c-4371-922e-b87b7a68a197" containerName="extract" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.312432 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.314840 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.321211 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-2kwnl"] Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.467611 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqmhd\" (UniqueName: \"kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.468010 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.569085 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.569568 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqmhd\" (UniqueName: \"kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.570358 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.587477 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqmhd\" (UniqueName: \"kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd\") pod \"root-account-create-update-2kwnl\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:15 crc kubenswrapper[5030]: I0121 00:26:15.633289 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:16 crc kubenswrapper[5030]: I0121 00:26:16.092821 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-2kwnl"] Jan 21 00:26:16 crc kubenswrapper[5030]: I0121 00:26:16.692052 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="galera" probeResult="failure" output=< Jan 21 00:26:16 crc kubenswrapper[5030]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 00:26:16 crc kubenswrapper[5030]: > Jan 21 00:26:16 crc kubenswrapper[5030]: I0121 00:26:16.798114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" event={"ID":"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c","Type":"ContainerStarted","Data":"0bd6f7cd4661abab219cb48e19460cb50db6a6760f28d6b5d07031de996fa6a7"} Jan 21 00:26:16 crc kubenswrapper[5030]: I0121 00:26:16.798187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" event={"ID":"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c","Type":"ContainerStarted","Data":"eaf5dbfc257f5b572ff24426fe6b6f5cb03e6236aecc85a356f75e61907774a4"} Jan 21 00:26:16 crc kubenswrapper[5030]: I0121 00:26:16.816535 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" podStartSLOduration=1.816516053 podStartE2EDuration="1.816516053s" podCreationTimestamp="2026-01-21 00:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:26:16.812317983 +0000 UTC m=+6649.132578291" watchObservedRunningTime="2026-01-21 00:26:16.816516053 +0000 UTC m=+6649.136776341" Jan 21 00:26:18 crc kubenswrapper[5030]: I0121 00:26:18.732663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:26:18 crc kubenswrapper[5030]: I0121 00:26:18.813070 5030 generic.go:334] "Generic (PLEG): container finished" podID="3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" containerID="0bd6f7cd4661abab219cb48e19460cb50db6a6760f28d6b5d07031de996fa6a7" exitCode=0 Jan 21 00:26:18 crc kubenswrapper[5030]: I0121 00:26:18.813129 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" event={"ID":"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c","Type":"ContainerDied","Data":"0bd6f7cd4661abab219cb48e19460cb50db6a6760f28d6b5d07031de996fa6a7"} Jan 21 00:26:18 crc kubenswrapper[5030]: I0121 00:26:18.830319 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.099656 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.255792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqmhd\" (UniqueName: \"kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd\") pod \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.256280 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts\") pod \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\" (UID: \"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c\") " Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.256917 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" (UID: "3bf8a32f-b9b9-4652-bae4-a4585fd99b3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.261118 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd" (OuterVolumeSpecName: "kube-api-access-tqmhd") pod "3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" (UID: "3bf8a32f-b9b9-4652-bae4-a4585fd99b3c"). InnerVolumeSpecName "kube-api-access-tqmhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.358535 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.358583 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqmhd\" (UniqueName: \"kubernetes.io/projected/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c-kube-api-access-tqmhd\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.829169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" event={"ID":"3bf8a32f-b9b9-4652-bae4-a4585fd99b3c","Type":"ContainerDied","Data":"eaf5dbfc257f5b572ff24426fe6b6f5cb03e6236aecc85a356f75e61907774a4"} Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.829220 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaf5dbfc257f5b572ff24426fe6b6f5cb03e6236aecc85a356f75e61907774a4" Jan 21 00:26:20 crc kubenswrapper[5030]: I0121 00:26:20.829288 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-2kwnl" Jan 21 00:26:22 crc kubenswrapper[5030]: I0121 00:26:22.270465 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:26:22 crc kubenswrapper[5030]: I0121 00:26:22.339669 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.642087 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:26 crc kubenswrapper[5030]: E0121 00:26:26.642852 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" containerName="mariadb-account-create-update" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.642870 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" containerName="mariadb-account-create-update" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.643078 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" containerName="mariadb-account-create-update" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.644530 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.668767 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.759232 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.759299 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.759331 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5z6\" (UniqueName: \"kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.860380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.860447 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities\") pod \"certified-operators-mmwt9\" 
(UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.860476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5z6\" (UniqueName: \"kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.860915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.860968 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.881154 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5z6\" (UniqueName: \"kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6\") pod \"certified-operators-mmwt9\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:26 crc kubenswrapper[5030]: I0121 00:26:26.972329 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:27 crc kubenswrapper[5030]: I0121 00:26:27.478593 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:27 crc kubenswrapper[5030]: W0121 00:26:27.486280 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d264c1_1c25_4188_a1fc_f0a4f0981135.slice/crio-53cbae2fcadc796ce14b463026ac7bc3adc657fff0a2ac74d59a29b62e9f0adf WatchSource:0}: Error finding container 53cbae2fcadc796ce14b463026ac7bc3adc657fff0a2ac74d59a29b62e9f0adf: Status 404 returned error can't find the container with id 53cbae2fcadc796ce14b463026ac7bc3adc657fff0a2ac74d59a29b62e9f0adf Jan 21 00:26:27 crc kubenswrapper[5030]: I0121 00:26:27.876914 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerStarted","Data":"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512"} Jan 21 00:26:27 crc kubenswrapper[5030]: I0121 00:26:27.877241 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerStarted","Data":"53cbae2fcadc796ce14b463026ac7bc3adc657fff0a2ac74d59a29b62e9f0adf"} Jan 21 00:26:28 crc kubenswrapper[5030]: I0121 00:26:28.885560 5030 generic.go:334] "Generic (PLEG): container finished" podID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerID="65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512" exitCode=0 Jan 21 00:26:28 crc kubenswrapper[5030]: I0121 00:26:28.885610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerDied","Data":"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512"} Jan 21 00:26:29 crc kubenswrapper[5030]: I0121 00:26:29.968195 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:26:29 crc kubenswrapper[5030]: E0121 00:26:29.969047 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:26:30 crc kubenswrapper[5030]: I0121 00:26:30.899855 5030 generic.go:334] "Generic (PLEG): container finished" podID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerID="d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05" exitCode=0 Jan 21 00:26:30 crc kubenswrapper[5030]: I0121 00:26:30.899972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerDied","Data":"d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05"} Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.648478 5030 scope.go:117] "RemoveContainer" containerID="54d48e34c2f889854f1855038bf62bf09eabd8fc4b0ea020e2d64be9a5d1322f" Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.676468 5030 scope.go:117] "RemoveContainer" 
containerID="8a7188ca92d1fdbf5c23abfd3bd0b629ba4d14649d1755bebc4d171a62b91cdf" Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.700698 5030 scope.go:117] "RemoveContainer" containerID="e0f6a4620408e63ad7436bb6d266362aeeab1b7f6b6008ec905800bf26491915" Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.733703 5030 scope.go:117] "RemoveContainer" containerID="8218d5aaf397f53ba3fe22face6fb221c680e54ba03e3666cf2f887e43d2980d" Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.911130 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerStarted","Data":"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3"} Jan 21 00:26:31 crc kubenswrapper[5030]: I0121 00:26:31.931122 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmwt9" podStartSLOduration=3.471741449 podStartE2EDuration="5.931100238s" podCreationTimestamp="2026-01-21 00:26:26 +0000 UTC" firstStartedPulling="2026-01-21 00:26:28.887215752 +0000 UTC m=+6661.207476040" lastFinishedPulling="2026-01-21 00:26:31.346574541 +0000 UTC m=+6663.666834829" observedRunningTime="2026-01-21 00:26:31.928045355 +0000 UTC m=+6664.248305663" watchObservedRunningTime="2026-01-21 00:26:31.931100238 +0000 UTC m=+6664.251360526" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.031424 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.032533 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.038016 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.044283 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-587mr" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.051199 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.077807 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfdqj\" (UniqueName: \"kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.077948 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.077996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.179402 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.179555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfdqj\" (UniqueName: \"kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.179628 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.186260 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.195206 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.201361 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfdqj\" (UniqueName: \"kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj\") pod \"infra-operator-controller-manager-756b778986-4q6xf\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.352592 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.808056 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:26:34 crc kubenswrapper[5030]: W0121 00:26:34.810575 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cc407a_76e6_4487_bb8b_23286acdd0a5.slice/crio-adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5 WatchSource:0}: Error finding container adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5: Status 404 returned error can't find the container with id adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5 Jan 21 00:26:34 crc kubenswrapper[5030]: I0121 00:26:34.937960 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" event={"ID":"19cc407a-76e6-4487-bb8b-23286acdd0a5","Type":"ContainerStarted","Data":"adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5"} Jan 21 00:26:35 crc kubenswrapper[5030]: I0121 00:26:35.946813 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" event={"ID":"19cc407a-76e6-4487-bb8b-23286acdd0a5","Type":"ContainerStarted","Data":"b99755c77b32a58453549ab8c94e9e24439bab281d44c2fb01f200043ced6c0b"} Jan 21 00:26:35 crc kubenswrapper[5030]: I0121 00:26:35.947246 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:36 crc kubenswrapper[5030]: I0121 00:26:36.972744 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:36 crc kubenswrapper[5030]: I0121 00:26:36.972789 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:37 crc kubenswrapper[5030]: I0121 00:26:37.022697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:37 crc kubenswrapper[5030]: I0121 00:26:37.052550 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" podStartSLOduration=3.052517469 podStartE2EDuration="3.052517469s" podCreationTimestamp="2026-01-21 00:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:26:35.973184437 +0000 UTC m=+6668.293444725" watchObservedRunningTime="2026-01-21 00:26:37.052517469 +0000 UTC m=+6669.372777787" Jan 21 00:26:38 crc kubenswrapper[5030]: I0121 00:26:38.000636 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:38 crc kubenswrapper[5030]: I0121 00:26:38.628463 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:39 crc kubenswrapper[5030]: I0121 00:26:39.972216 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmwt9" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="registry-server" 
containerID="cri-o://529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3" gracePeriod=2 Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.381552 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.485675 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content\") pod \"97d264c1-1c25-4188-a1fc-f0a4f0981135\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.485859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities\") pod \"97d264c1-1c25-4188-a1fc-f0a4f0981135\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.485937 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5z6\" (UniqueName: \"kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6\") pod \"97d264c1-1c25-4188-a1fc-f0a4f0981135\" (UID: \"97d264c1-1c25-4188-a1fc-f0a4f0981135\") " Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.486667 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities" (OuterVolumeSpecName: "utilities") pod "97d264c1-1c25-4188-a1fc-f0a4f0981135" (UID: "97d264c1-1c25-4188-a1fc-f0a4f0981135"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.487385 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.490887 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6" (OuterVolumeSpecName: "kube-api-access-2m5z6") pod "97d264c1-1c25-4188-a1fc-f0a4f0981135" (UID: "97d264c1-1c25-4188-a1fc-f0a4f0981135"). InnerVolumeSpecName "kube-api-access-2m5z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.537308 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97d264c1-1c25-4188-a1fc-f0a4f0981135" (UID: "97d264c1-1c25-4188-a1fc-f0a4f0981135"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.589516 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5z6\" (UniqueName: \"kubernetes.io/projected/97d264c1-1c25-4188-a1fc-f0a4f0981135-kube-api-access-2m5z6\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.589560 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97d264c1-1c25-4188-a1fc-f0a4f0981135-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.981394 5030 generic.go:334] "Generic (PLEG): container finished" podID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerID="529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3" exitCode=0 Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.981445 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerDied","Data":"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3"} Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.981466 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmwt9" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.981492 5030 scope.go:117] "RemoveContainer" containerID="529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3" Jan 21 00:26:40 crc kubenswrapper[5030]: I0121 00:26:40.981478 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmwt9" event={"ID":"97d264c1-1c25-4188-a1fc-f0a4f0981135","Type":"ContainerDied","Data":"53cbae2fcadc796ce14b463026ac7bc3adc657fff0a2ac74d59a29b62e9f0adf"} Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.005872 5030 scope.go:117] "RemoveContainer" containerID="d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.016248 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.023079 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmwt9"] Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.033089 5030 scope.go:117] "RemoveContainer" containerID="65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.057953 5030 scope.go:117] "RemoveContainer" containerID="529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3" Jan 21 00:26:41 crc kubenswrapper[5030]: E0121 00:26:41.058371 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3\": container with ID starting with 529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3 not found: ID does not exist" containerID="529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.058404 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3"} err="failed to get container status 
\"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3\": rpc error: code = NotFound desc = could not find container \"529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3\": container with ID starting with 529f16f1815f03c908a43f3339345a658a4bc57417a461d8c7fe054d9ec3a4e3 not found: ID does not exist" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.058426 5030 scope.go:117] "RemoveContainer" containerID="d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05" Jan 21 00:26:41 crc kubenswrapper[5030]: E0121 00:26:41.058791 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05\": container with ID starting with d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05 not found: ID does not exist" containerID="d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.058813 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05"} err="failed to get container status \"d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05\": rpc error: code = NotFound desc = could not find container \"d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05\": container with ID starting with d21a9f8786d758e2f9d4d25ee862e53d3c66ae5d2747fee7c601c244a56fdc05 not found: ID does not exist" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.058829 5030 scope.go:117] "RemoveContainer" containerID="65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512" Jan 21 00:26:41 crc kubenswrapper[5030]: E0121 00:26:41.059078 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512\": container with ID starting with 65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512 not found: ID does not exist" containerID="65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.059103 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512"} err="failed to get container status \"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512\": rpc error: code = NotFound desc = could not find container \"65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512\": container with ID starting with 65d62223b0e074e769aceb72115fd0e252229e13d21c1e2f52611883147b4512 not found: ID does not exist" Jan 21 00:26:41 crc kubenswrapper[5030]: I0121 00:26:41.975359 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" path="/var/lib/kubelet/pods/97d264c1-1c25-4188-a1fc-f0a4f0981135/volumes" Jan 21 00:26:43 crc kubenswrapper[5030]: I0121 00:26:43.962843 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:26:43 crc kubenswrapper[5030]: E0121 00:26:43.963503 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:26:44 crc kubenswrapper[5030]: I0121 00:26:44.358711 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.016288 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:26:45 crc kubenswrapper[5030]: E0121 00:26:45.016614 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="extract-content" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.016651 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="extract-content" Jan 21 00:26:45 crc kubenswrapper[5030]: E0121 00:26:45.016668 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="extract-utilities" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.016675 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="extract-utilities" Jan 21 00:26:45 crc kubenswrapper[5030]: E0121 00:26:45.016694 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="registry-server" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.016702 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="registry-server" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.016838 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97d264c1-1c25-4188-a1fc-f0a4f0981135" containerName="registry-server" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.017759 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.027072 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.160722 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.160789 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.160858 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72f7h\" (UniqueName: \"kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.262132 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.262190 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.262223 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72f7h\" (UniqueName: \"kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.262648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.262708 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.289370 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-72f7h\" (UniqueName: \"kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h\") pod \"redhat-operators-fxxhb\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.337199 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:26:45 crc kubenswrapper[5030]: I0121 00:26:45.830313 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:26:46 crc kubenswrapper[5030]: I0121 00:26:46.017455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerStarted","Data":"5f4b512a9167966ef00062448d02cac6e785289311846da2a0a0eae649921f59"} Jan 21 00:26:47 crc kubenswrapper[5030]: I0121 00:26:47.028511 5030 generic.go:334] "Generic (PLEG): container finished" podID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerID="2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd" exitCode=0 Jan 21 00:26:47 crc kubenswrapper[5030]: I0121 00:26:47.028664 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerDied","Data":"2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd"} Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.603985 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.605437 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.607998 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-rbvl5" Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.613528 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.741372 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmm4\" (UniqueName: \"kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4\") pod \"rabbitmq-cluster-operator-index-jbt6q\" (UID: \"4f47d75b-9f36-4d13-8b91-a5836565f670\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.842734 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmm4\" (UniqueName: \"kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4\") pod \"rabbitmq-cluster-operator-index-jbt6q\" (UID: \"4f47d75b-9f36-4d13-8b91-a5836565f670\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.872366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmm4\" (UniqueName: \"kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4\") pod \"rabbitmq-cluster-operator-index-jbt6q\" (UID: \"4f47d75b-9f36-4d13-8b91-a5836565f670\") " pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:49 crc kubenswrapper[5030]: I0121 00:26:49.922586 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:50 crc kubenswrapper[5030]: I0121 00:26:50.527130 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:26:50 crc kubenswrapper[5030]: W0121 00:26:50.531139 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f47d75b_9f36_4d13_8b91_a5836565f670.slice/crio-471c1c0a97416d2552eb6b59d2ff6ac9fcf8dd8b408710a0ddf295ebea029726 WatchSource:0}: Error finding container 471c1c0a97416d2552eb6b59d2ff6ac9fcf8dd8b408710a0ddf295ebea029726: Status 404 returned error can't find the container with id 471c1c0a97416d2552eb6b59d2ff6ac9fcf8dd8b408710a0ddf295ebea029726 Jan 21 00:26:51 crc kubenswrapper[5030]: I0121 00:26:51.072671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" event={"ID":"4f47d75b-9f36-4d13-8b91-a5836565f670","Type":"ContainerStarted","Data":"471c1c0a97416d2552eb6b59d2ff6ac9fcf8dd8b408710a0ddf295ebea029726"} Jan 21 00:26:51 crc kubenswrapper[5030]: I0121 00:26:51.077600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerStarted","Data":"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052"} Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.086652 5030 generic.go:334] "Generic (PLEG): container finished" podID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerID="3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052" exitCode=0 Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.086702 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerDied","Data":"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052"} Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.732309 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.733685 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.736333 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-w97cm" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.736392 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.737829 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.795365 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.795431 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsbv\" (UniqueName: \"kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.795580 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.896379 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.896444 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsbv\" (UniqueName: \"kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.896556 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.897215 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.897283 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " 
pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:52 crc kubenswrapper[5030]: I0121 00:26:52.916484 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsbv\" (UniqueName: \"kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv\") pod \"memcached-0\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:53 crc kubenswrapper[5030]: I0121 00:26:53.060875 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:53 crc kubenswrapper[5030]: I0121 00:26:53.675388 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:26:53 crc kubenswrapper[5030]: W0121 00:26:53.683996 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e8db42f_632b_4d70_9c6f_1bd38c0f1c47.slice/crio-825379b1ecbad96c19a675651d7f9d1537a51a3471290002710e60ce36fe8045 WatchSource:0}: Error finding container 825379b1ecbad96c19a675651d7f9d1537a51a3471290002710e60ce36fe8045: Status 404 returned error can't find the container with id 825379b1ecbad96c19a675651d7f9d1537a51a3471290002710e60ce36fe8045 Jan 21 00:26:54 crc kubenswrapper[5030]: I0121 00:26:54.102729 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47","Type":"ContainerStarted","Data":"825379b1ecbad96c19a675651d7f9d1537a51a3471290002710e60ce36fe8045"} Jan 21 00:26:54 crc kubenswrapper[5030]: I0121 00:26:54.104369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" event={"ID":"4f47d75b-9f36-4d13-8b91-a5836565f670","Type":"ContainerStarted","Data":"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177"} Jan 21 00:26:54 crc kubenswrapper[5030]: I0121 00:26:54.126758 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" podStartSLOduration=2.397770929 podStartE2EDuration="5.126733201s" podCreationTimestamp="2026-01-21 00:26:49 +0000 UTC" firstStartedPulling="2026-01-21 00:26:50.5340627 +0000 UTC m=+6682.854323028" lastFinishedPulling="2026-01-21 00:26:53.263024992 +0000 UTC m=+6685.583285300" observedRunningTime="2026-01-21 00:26:54.119091379 +0000 UTC m=+6686.439351677" watchObservedRunningTime="2026-01-21 00:26:54.126733201 +0000 UTC m=+6686.446993499" Jan 21 00:26:55 crc kubenswrapper[5030]: I0121 00:26:55.124203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerStarted","Data":"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512"} Jan 21 00:26:55 crc kubenswrapper[5030]: I0121 00:26:55.125562 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47","Type":"ContainerStarted","Data":"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7"} Jan 21 00:26:56 crc kubenswrapper[5030]: I0121 00:26:56.132945 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:26:56 crc kubenswrapper[5030]: I0121 00:26:56.162135 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" 
podStartSLOduration=4.162114718 podStartE2EDuration="4.162114718s" podCreationTimestamp="2026-01-21 00:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:26:56.154753933 +0000 UTC m=+6688.475014231" watchObservedRunningTime="2026-01-21 00:26:56.162114718 +0000 UTC m=+6688.482375006" Jan 21 00:26:56 crc kubenswrapper[5030]: I0121 00:26:56.196250 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxxhb" podStartSLOduration=4.7721061769999995 podStartE2EDuration="12.196229433s" podCreationTimestamp="2026-01-21 00:26:44 +0000 UTC" firstStartedPulling="2026-01-21 00:26:47.031102635 +0000 UTC m=+6679.351362923" lastFinishedPulling="2026-01-21 00:26:54.455225891 +0000 UTC m=+6686.775486179" observedRunningTime="2026-01-21 00:26:56.183864658 +0000 UTC m=+6688.504124976" watchObservedRunningTime="2026-01-21 00:26:56.196229433 +0000 UTC m=+6688.516489721" Jan 21 00:26:56 crc kubenswrapper[5030]: I0121 00:26:56.966196 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:26:56 crc kubenswrapper[5030]: E0121 00:26:56.966533 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:26:59 crc kubenswrapper[5030]: I0121 00:26:59.923091 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:59 crc kubenswrapper[5030]: I0121 00:26:59.923458 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:26:59 crc kubenswrapper[5030]: I0121 00:26:59.952156 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:27:00 crc kubenswrapper[5030]: I0121 00:27:00.187784 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.063876 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.261493 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj"] Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.262988 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.296662 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.303800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj"] Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.353560 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvr4\" (UniqueName: \"kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.353633 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.353842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.454940 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvr4\" (UniqueName: \"kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.454995 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.455046 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.455449 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.455518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.483269 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvr4\" (UniqueName: \"kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:03 crc kubenswrapper[5030]: I0121 00:27:03.621287 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:04 crc kubenswrapper[5030]: I0121 00:27:04.067128 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj"] Jan 21 00:27:04 crc kubenswrapper[5030]: I0121 00:27:04.188442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" event={"ID":"df48f69d-ead6-4779-a18d-9bcdd762986b","Type":"ContainerStarted","Data":"531de053f76afdea624078a1b3b36037c3e7a570f098aff07508551b1695c5b9"} Jan 21 00:27:05 crc kubenswrapper[5030]: I0121 00:27:05.209336 5030 generic.go:334] "Generic (PLEG): container finished" podID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerID="2470bd93dd8f3ad9fc7ced919be5682dafb8952e759911e4dfef592d61dd9387" exitCode=0 Jan 21 00:27:05 crc kubenswrapper[5030]: I0121 00:27:05.209407 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" event={"ID":"df48f69d-ead6-4779-a18d-9bcdd762986b","Type":"ContainerDied","Data":"2470bd93dd8f3ad9fc7ced919be5682dafb8952e759911e4dfef592d61dd9387"} Jan 21 00:27:05 crc kubenswrapper[5030]: I0121 00:27:05.339291 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:05 crc kubenswrapper[5030]: I0121 00:27:05.339351 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:05 crc kubenswrapper[5030]: I0121 00:27:05.388461 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:06 crc kubenswrapper[5030]: I0121 00:27:06.218496 5030 generic.go:334] "Generic (PLEG): container finished" podID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerID="7a833b3ae51014b4e193ee088e7e1b19f9d2fec8fb5cd1d43e8d27cfbc42c368" exitCode=0 Jan 21 00:27:06 crc kubenswrapper[5030]: I0121 00:27:06.218809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" event={"ID":"df48f69d-ead6-4779-a18d-9bcdd762986b","Type":"ContainerDied","Data":"7a833b3ae51014b4e193ee088e7e1b19f9d2fec8fb5cd1d43e8d27cfbc42c368"} Jan 21 00:27:06 crc kubenswrapper[5030]: I0121 00:27:06.272703 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:07 crc kubenswrapper[5030]: I0121 00:27:07.226896 5030 generic.go:334] "Generic (PLEG): container finished" podID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerID="0af5d8f42d6307b99b24b85866749933415ebe7af60453391f53f910014ffa6c" exitCode=0 Jan 21 00:27:07 crc kubenswrapper[5030]: I0121 00:27:07.227082 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" event={"ID":"df48f69d-ead6-4779-a18d-9bcdd762986b","Type":"ContainerDied","Data":"0af5d8f42d6307b99b24b85866749933415ebe7af60453391f53f910014ffa6c"} Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.501361 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.674790 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util\") pod \"df48f69d-ead6-4779-a18d-9bcdd762986b\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.674843 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle\") pod \"df48f69d-ead6-4779-a18d-9bcdd762986b\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.674966 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmvr4\" (UniqueName: \"kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4\") pod \"df48f69d-ead6-4779-a18d-9bcdd762986b\" (UID: \"df48f69d-ead6-4779-a18d-9bcdd762986b\") " Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.676140 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle" (OuterVolumeSpecName: "bundle") pod "df48f69d-ead6-4779-a18d-9bcdd762986b" (UID: "df48f69d-ead6-4779-a18d-9bcdd762986b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.681460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4" (OuterVolumeSpecName: "kube-api-access-nmvr4") pod "df48f69d-ead6-4779-a18d-9bcdd762986b" (UID: "df48f69d-ead6-4779-a18d-9bcdd762986b"). InnerVolumeSpecName "kube-api-access-nmvr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.694320 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util" (OuterVolumeSpecName: "util") pod "df48f69d-ead6-4779-a18d-9bcdd762986b" (UID: "df48f69d-ead6-4779-a18d-9bcdd762986b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.777037 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.777338 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/df48f69d-ead6-4779-a18d-9bcdd762986b-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:08 crc kubenswrapper[5030]: I0121 00:27:08.777356 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmvr4\" (UniqueName: \"kubernetes.io/projected/df48f69d-ead6-4779-a18d-9bcdd762986b-kube-api-access-nmvr4\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.243046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" event={"ID":"df48f69d-ead6-4779-a18d-9bcdd762986b","Type":"ContainerDied","Data":"531de053f76afdea624078a1b3b36037c3e7a570f098aff07508551b1695c5b9"} Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.243092 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531de053f76afdea624078a1b3b36037c3e7a570f098aff07508551b1695c5b9" Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.243135 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj" Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.397125 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.397649 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxxhb" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="registry-server" containerID="cri-o://cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512" gracePeriod=2 Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.814672 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.994321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities\") pod \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.994693 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content\") pod \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.994801 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72f7h\" (UniqueName: \"kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h\") pod \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\" (UID: \"c408b887-9b0d-4ae0-ad25-f6f519e2141b\") " Jan 21 00:27:09 crc kubenswrapper[5030]: I0121 00:27:09.995976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities" (OuterVolumeSpecName: "utilities") pod "c408b887-9b0d-4ae0-ad25-f6f519e2141b" (UID: "c408b887-9b0d-4ae0-ad25-f6f519e2141b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:09.999282 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h" (OuterVolumeSpecName: "kube-api-access-72f7h") pod "c408b887-9b0d-4ae0-ad25-f6f519e2141b" (UID: "c408b887-9b0d-4ae0-ad25-f6f519e2141b"). InnerVolumeSpecName "kube-api-access-72f7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.096603 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.096659 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72f7h\" (UniqueName: \"kubernetes.io/projected/c408b887-9b0d-4ae0-ad25-f6f519e2141b-kube-api-access-72f7h\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.111099 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c408b887-9b0d-4ae0-ad25-f6f519e2141b" (UID: "c408b887-9b0d-4ae0-ad25-f6f519e2141b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.197907 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c408b887-9b0d-4ae0-ad25-f6f519e2141b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.252488 5030 generic.go:334] "Generic (PLEG): container finished" podID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerID="cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512" exitCode=0 Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.252529 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxxhb" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.252539 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerDied","Data":"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512"} Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.252573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxxhb" event={"ID":"c408b887-9b0d-4ae0-ad25-f6f519e2141b","Type":"ContainerDied","Data":"5f4b512a9167966ef00062448d02cac6e785289311846da2a0a0eae649921f59"} Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.252599 5030 scope.go:117] "RemoveContainer" containerID="cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.293543 5030 scope.go:117] "RemoveContainer" containerID="3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.305749 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.310098 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxxhb"] Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.337703 5030 scope.go:117] "RemoveContainer" containerID="2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.354341 5030 scope.go:117] "RemoveContainer" containerID="cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512" Jan 21 00:27:10 crc kubenswrapper[5030]: E0121 00:27:10.354822 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512\": container with ID starting with cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512 not found: ID does not exist" containerID="cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.354859 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512"} err="failed to get container status \"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512\": rpc error: code = NotFound desc = could not find container \"cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512\": container with ID starting with cf0d61b309a8b99ad497a559795fe8c304100c15c324beb7fce3dca11e039512 not found: ID does not exist" Jan 21 00:27:10 crc 
kubenswrapper[5030]: I0121 00:27:10.354884 5030 scope.go:117] "RemoveContainer" containerID="3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052" Jan 21 00:27:10 crc kubenswrapper[5030]: E0121 00:27:10.355091 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052\": container with ID starting with 3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052 not found: ID does not exist" containerID="3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.355116 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052"} err="failed to get container status \"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052\": rpc error: code = NotFound desc = could not find container \"3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052\": container with ID starting with 3fc225dbecdd53684d80da9d563bf19ff0e2f7a909d4329f783cb685fe2c7052 not found: ID does not exist" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.355133 5030 scope.go:117] "RemoveContainer" containerID="2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd" Jan 21 00:27:10 crc kubenswrapper[5030]: E0121 00:27:10.355364 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd\": container with ID starting with 2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd not found: ID does not exist" containerID="2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd" Jan 21 00:27:10 crc kubenswrapper[5030]: I0121 00:27:10.355389 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd"} err="failed to get container status \"2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd\": rpc error: code = NotFound desc = could not find container \"2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd\": container with ID starting with 2967c319cc1aa3c3082179c0f003823a6ca29bd27dedaf8a6973b6536c0b37fd not found: ID does not exist" Jan 21 00:27:11 crc kubenswrapper[5030]: I0121 00:27:11.966138 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:27:11 crc kubenswrapper[5030]: E0121 00:27:11.967144 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:27:11 crc kubenswrapper[5030]: I0121 00:27:11.972096 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" path="/var/lib/kubelet/pods/c408b887-9b0d-4ae0-ad25-f6f519e2141b/volumes" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.202093 5030 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.202955 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="extract-content" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.202971 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="extract-content" Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.202988 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="util" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.202996 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="util" Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.203012 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="registry-server" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203019 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="registry-server" Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.203039 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="extract-utilities" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203047 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="extract-utilities" Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.203060 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="extract" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="extract" Jan 21 00:27:23 crc kubenswrapper[5030]: E0121 00:27:23.203080 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="pull" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203087 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="pull" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203235 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c408b887-9b0d-4ae0-ad25-f6f519e2141b" containerName="registry-server" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203250 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" containerName="extract" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.203810 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.206518 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-8vnbb" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.215497 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.382398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgp57\" (UniqueName: \"kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57\") pod \"rabbitmq-cluster-operator-779fc9694b-6kpl5\" (UID: \"0af03329-faed-4eeb-bad2-482e5318f619\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.484883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgp57\" (UniqueName: \"kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57\") pod \"rabbitmq-cluster-operator-779fc9694b-6kpl5\" (UID: \"0af03329-faed-4eeb-bad2-482e5318f619\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.505144 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgp57\" (UniqueName: \"kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57\") pod \"rabbitmq-cluster-operator-779fc9694b-6kpl5\" (UID: \"0af03329-faed-4eeb-bad2-482e5318f619\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.524755 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:27:23 crc kubenswrapper[5030]: I0121 00:27:23.948492 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:27:24 crc kubenswrapper[5030]: I0121 00:27:24.358150 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" event={"ID":"0af03329-faed-4eeb-bad2-482e5318f619","Type":"ContainerStarted","Data":"6258392ce7c15b7d00a12401c0196dfcbd9e49f57f5074754f05eb4ee62206f3"} Jan 21 00:27:24 crc kubenswrapper[5030]: I0121 00:27:24.358507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" event={"ID":"0af03329-faed-4eeb-bad2-482e5318f619","Type":"ContainerStarted","Data":"f966472710d68c224401985a89400e07f2d9aec8871f1fa521cf7808650fb609"} Jan 21 00:27:24 crc kubenswrapper[5030]: I0121 00:27:24.377256 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" podStartSLOduration=1.377232263 podStartE2EDuration="1.377232263s" podCreationTimestamp="2026-01-21 00:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:27:24.371743371 +0000 UTC m=+6716.692003679" watchObservedRunningTime="2026-01-21 00:27:24.377232263 +0000 UTC m=+6716.697492561" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.006002 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.007590 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.018553 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.105762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.105846 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvj8b\" (UniqueName: \"kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.105884 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.207051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.207141 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvj8b\" (UniqueName: \"kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.207183 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.207612 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.207707 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.228844 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jvj8b\" (UniqueName: \"kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b\") pod \"redhat-marketplace-cg8lk\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.334250 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.767153 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:25 crc kubenswrapper[5030]: W0121 00:27:25.768597 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod227e9165_78cf_4844_8bfb_ffda2221ee37.slice/crio-97cca5433c1b65c51b76d817566117e7358645bbe4ade85fdbcc035ecc802059 WatchSource:0}: Error finding container 97cca5433c1b65c51b76d817566117e7358645bbe4ade85fdbcc035ecc802059: Status 404 returned error can't find the container with id 97cca5433c1b65c51b76d817566117e7358645bbe4ade85fdbcc035ecc802059 Jan 21 00:27:25 crc kubenswrapper[5030]: I0121 00:27:25.962174 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:27:25 crc kubenswrapper[5030]: E0121 00:27:25.962357 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:27:26 crc kubenswrapper[5030]: I0121 00:27:26.401392 5030 generic.go:334] "Generic (PLEG): container finished" podID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerID="ea1de3cd8d1ec5bde8220e8b32c858eedecfbecd8a4a993a8646fad5b3672ee2" exitCode=0 Jan 21 00:27:26 crc kubenswrapper[5030]: I0121 00:27:26.401936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerDied","Data":"ea1de3cd8d1ec5bde8220e8b32c858eedecfbecd8a4a993a8646fad5b3672ee2"} Jan 21 00:27:26 crc kubenswrapper[5030]: I0121 00:27:26.402015 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerStarted","Data":"97cca5433c1b65c51b76d817566117e7358645bbe4ade85fdbcc035ecc802059"} Jan 21 00:27:27 crc kubenswrapper[5030]: I0121 00:27:27.410949 5030 generic.go:334] "Generic (PLEG): container finished" podID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerID="c43f23a81b70867270323d83be0e7ef51961f9eec48d24cc32e15c65cc6e1ea9" exitCode=0 Jan 21 00:27:27 crc kubenswrapper[5030]: I0121 00:27:27.411005 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerDied","Data":"c43f23a81b70867270323d83be0e7ef51961f9eec48d24cc32e15c65cc6e1ea9"} Jan 21 00:27:28 crc kubenswrapper[5030]: I0121 00:27:28.424895 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" 
event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerStarted","Data":"342a3c8df5bbefc057affc0366f2e9d6121bdcd8f4605a68eed7381f5fa36036"} Jan 21 00:27:28 crc kubenswrapper[5030]: I0121 00:27:28.447458 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cg8lk" podStartSLOduration=2.9742242 podStartE2EDuration="4.447441414s" podCreationTimestamp="2026-01-21 00:27:24 +0000 UTC" firstStartedPulling="2026-01-21 00:27:26.403645085 +0000 UTC m=+6718.723905373" lastFinishedPulling="2026-01-21 00:27:27.876862299 +0000 UTC m=+6720.197122587" observedRunningTime="2026-01-21 00:27:28.441396829 +0000 UTC m=+6720.761657137" watchObservedRunningTime="2026-01-21 00:27:28.447441414 +0000 UTC m=+6720.767701702" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.206169 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.207546 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.224060 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.377475 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzdcv\" (UniqueName: \"kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.377708 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.377762 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.479535 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.479592 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.479689 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzdcv\" (UniqueName: 
\"kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.480067 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.480108 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.503516 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzdcv\" (UniqueName: \"kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv\") pod \"community-operators-wf4rj\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:30 crc kubenswrapper[5030]: I0121 00:27:30.530033 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:31 crc kubenswrapper[5030]: I0121 00:27:31.068613 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:31 crc kubenswrapper[5030]: W0121 00:27:31.072002 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd786ba_7142_4b2b_8b47_8ef1dae07126.slice/crio-63e3982a55ead39e8063bdf9c43bcd5f84629d18b1c3f4917c0fa5c208955248 WatchSource:0}: Error finding container 63e3982a55ead39e8063bdf9c43bcd5f84629d18b1c3f4917c0fa5c208955248: Status 404 returned error can't find the container with id 63e3982a55ead39e8063bdf9c43bcd5f84629d18b1c3f4917c0fa5c208955248 Jan 21 00:27:31 crc kubenswrapper[5030]: I0121 00:27:31.445877 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerStarted","Data":"63e3982a55ead39e8063bdf9c43bcd5f84629d18b1c3f4917c0fa5c208955248"} Jan 21 00:27:31 crc kubenswrapper[5030]: I0121 00:27:31.801176 5030 scope.go:117] "RemoveContainer" containerID="b5d8dcfea8a041c375b7f1b749671da811924b9bbdc38e69112881bb9c534734" Jan 21 00:27:31 crc kubenswrapper[5030]: I0121 00:27:31.844474 5030 scope.go:117] "RemoveContainer" containerID="0476f5ecbb75f96d7a6a54d3a60ae225d4226ce19053893c8d06c97357fa2a58" Jan 21 00:27:31 crc kubenswrapper[5030]: I0121 00:27:31.910388 5030 scope.go:117] "RemoveContainer" containerID="43f6d68ddf3d8f9396197db2bb3d0f4b0486309dc12bd8012350f620bd080bd2" Jan 21 00:27:32 crc kubenswrapper[5030]: I0121 00:27:32.455388 5030 generic.go:334] "Generic (PLEG): container finished" podID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerID="2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6" exitCode=0 Jan 21 00:27:32 crc kubenswrapper[5030]: I0121 00:27:32.455436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerDied","Data":"2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6"} Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.007344 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.008577 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.011507 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-m8pml" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.020398 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.120485 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clt5l\" (UniqueName: \"kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l\") pod \"keystone-operator-index-m59dl\" (UID: \"d6c9fb65-9372-46f7-b41e-bddf586f31d4\") " pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.221876 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clt5l\" (UniqueName: \"kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l\") pod \"keystone-operator-index-m59dl\" (UID: \"d6c9fb65-9372-46f7-b41e-bddf586f31d4\") " pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.240061 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clt5l\" (UniqueName: \"kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l\") pod \"keystone-operator-index-m59dl\" (UID: \"d6c9fb65-9372-46f7-b41e-bddf586f31d4\") " pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.326612 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:33 crc kubenswrapper[5030]: I0121 00:27:33.809661 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:27:34 crc kubenswrapper[5030]: I0121 00:27:34.488297 5030 generic.go:334] "Generic (PLEG): container finished" podID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerID="bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb" exitCode=0 Jan 21 00:27:34 crc kubenswrapper[5030]: I0121 00:27:34.488374 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerDied","Data":"bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb"} Jan 21 00:27:34 crc kubenswrapper[5030]: I0121 00:27:34.492109 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-m59dl" event={"ID":"d6c9fb65-9372-46f7-b41e-bddf586f31d4","Type":"ContainerStarted","Data":"7ce7e78f9cde34952a70e85bf0664398449eeb1c884b9b21585fa29e564efd39"} Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.336064 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.336120 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.379896 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.500448 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-m59dl" event={"ID":"d6c9fb65-9372-46f7-b41e-bddf586f31d4","Type":"ContainerStarted","Data":"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887"} Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.503128 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerStarted","Data":"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481"} Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.518963 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-m59dl" podStartSLOduration=2.9835273879999997 podStartE2EDuration="3.518947703s" podCreationTimestamp="2026-01-21 00:27:32 +0000 UTC" firstStartedPulling="2026-01-21 00:27:33.821708306 +0000 UTC m=+6726.141968594" lastFinishedPulling="2026-01-21 00:27:34.357128621 +0000 UTC m=+6726.677388909" observedRunningTime="2026-01-21 00:27:35.513844201 +0000 UTC m=+6727.834104489" watchObservedRunningTime="2026-01-21 00:27:35.518947703 +0000 UTC m=+6727.839207991" Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.548911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:35 crc kubenswrapper[5030]: I0121 00:27:35.565352 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wf4rj" podStartSLOduration=3.112727163 podStartE2EDuration="5.565333901s" podCreationTimestamp="2026-01-21 00:27:30 +0000 UTC" firstStartedPulling="2026-01-21 
00:27:32.456846261 +0000 UTC m=+6724.777106549" lastFinishedPulling="2026-01-21 00:27:34.909452999 +0000 UTC m=+6727.229713287" observedRunningTime="2026-01-21 00:27:35.536676157 +0000 UTC m=+6727.856936445" watchObservedRunningTime="2026-01-21 00:27:35.565333901 +0000 UTC m=+6727.885594189" Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.398039 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.398577 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cg8lk" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="registry-server" containerID="cri-o://342a3c8df5bbefc057affc0366f2e9d6121bdcd8f4605a68eed7381f5fa36036" gracePeriod=2 Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.530205 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.530363 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.583040 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.638459 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:40 crc kubenswrapper[5030]: I0121 00:27:40.961744 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:27:40 crc kubenswrapper[5030]: E0121 00:27:40.962211 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:27:41 crc kubenswrapper[5030]: I0121 00:27:41.566118 5030 generic.go:334] "Generic (PLEG): container finished" podID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerID="342a3c8df5bbefc057affc0366f2e9d6121bdcd8f4605a68eed7381f5fa36036" exitCode=0 Jan 21 00:27:41 crc kubenswrapper[5030]: I0121 00:27:41.566203 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerDied","Data":"342a3c8df5bbefc057affc0366f2e9d6121bdcd8f4605a68eed7381f5fa36036"} Jan 21 00:27:41 crc kubenswrapper[5030]: I0121 00:27:41.926326 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.051878 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content\") pod \"227e9165-78cf-4844-8bfb-ffda2221ee37\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.051954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities\") pod \"227e9165-78cf-4844-8bfb-ffda2221ee37\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.052108 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvj8b\" (UniqueName: \"kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b\") pod \"227e9165-78cf-4844-8bfb-ffda2221ee37\" (UID: \"227e9165-78cf-4844-8bfb-ffda2221ee37\") " Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.053271 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities" (OuterVolumeSpecName: "utilities") pod "227e9165-78cf-4844-8bfb-ffda2221ee37" (UID: "227e9165-78cf-4844-8bfb-ffda2221ee37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.057811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b" (OuterVolumeSpecName: "kube-api-access-jvj8b") pod "227e9165-78cf-4844-8bfb-ffda2221ee37" (UID: "227e9165-78cf-4844-8bfb-ffda2221ee37"). InnerVolumeSpecName "kube-api-access-jvj8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.077768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "227e9165-78cf-4844-8bfb-ffda2221ee37" (UID: "227e9165-78cf-4844-8bfb-ffda2221ee37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.154363 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.154415 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/227e9165-78cf-4844-8bfb-ffda2221ee37-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.154436 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvj8b\" (UniqueName: \"kubernetes.io/projected/227e9165-78cf-4844-8bfb-ffda2221ee37-kube-api-access-jvj8b\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.575068 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cg8lk" event={"ID":"227e9165-78cf-4844-8bfb-ffda2221ee37","Type":"ContainerDied","Data":"97cca5433c1b65c51b76d817566117e7358645bbe4ade85fdbcc035ecc802059"} Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.575133 5030 scope.go:117] "RemoveContainer" containerID="342a3c8df5bbefc057affc0366f2e9d6121bdcd8f4605a68eed7381f5fa36036" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.575119 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cg8lk" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.604037 5030 scope.go:117] "RemoveContainer" containerID="c43f23a81b70867270323d83be0e7ef51961f9eec48d24cc32e15c65cc6e1ea9" Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.621262 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.633814 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cg8lk"] Jan 21 00:27:42 crc kubenswrapper[5030]: I0121 00:27:42.635669 5030 scope.go:117] "RemoveContainer" containerID="ea1de3cd8d1ec5bde8220e8b32c858eedecfbecd8a4a993a8646fad5b3672ee2" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.326788 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.327123 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.365310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.617196 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.674384 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:27:43 crc kubenswrapper[5030]: E0121 00:27:43.674732 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="registry-server" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.674748 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="registry-server" Jan 21 00:27:43 crc kubenswrapper[5030]: E0121 00:27:43.674777 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="extract-content" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.674785 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="extract-content" Jan 21 00:27:43 crc kubenswrapper[5030]: E0121 00:27:43.674796 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="extract-utilities" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.674804 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="extract-utilities" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.674964 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" containerName="registry-server" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.676831 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.681205 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.682088 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-4ppt6" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.682236 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.682353 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.682462 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.691550 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.784908 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.784951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.784993 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc 
kubenswrapper[5030]: I0121 00:27:43.785019 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlwl\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.785203 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.785275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.785427 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.785463 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.886691 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.886849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.886878 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.887934 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " 
pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.887989 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.888041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.888074 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlwl\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.888123 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.888714 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.888960 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.889322 5030 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.889353 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e63b07f320ae4d037f36c12a45e04e968a0c5ea91177d6dce526bf641703db22/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.889608 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.892598 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.893014 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.896315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.906643 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxlwl\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.922403 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") pod \"rabbitmq-server-0\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:43 crc kubenswrapper[5030]: I0121 00:27:43.972052 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="227e9165-78cf-4844-8bfb-ffda2221ee37" path="/var/lib/kubelet/pods/227e9165-78cf-4844-8bfb-ffda2221ee37/volumes" Jan 21 00:27:44 crc kubenswrapper[5030]: I0121 00:27:44.001573 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:27:44 crc kubenswrapper[5030]: I0121 00:27:44.397509 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:44 crc kubenswrapper[5030]: I0121 00:27:44.397803 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wf4rj" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="registry-server" containerID="cri-o://bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481" gracePeriod=2 Jan 21 00:27:44 crc kubenswrapper[5030]: I0121 00:27:44.474846 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:27:44 crc kubenswrapper[5030]: W0121 00:27:44.484930 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7cb0fd7_6a53_4ab5_a144_9495044acffb.slice/crio-0b115e0ffd70572e4d1729e17c81e61afca1995289fc050f3d08b035fe311afa WatchSource:0}: Error finding container 0b115e0ffd70572e4d1729e17c81e61afca1995289fc050f3d08b035fe311afa: Status 404 returned error can't find the container with id 0b115e0ffd70572e4d1729e17c81e61afca1995289fc050f3d08b035fe311afa Jan 21 00:27:44 crc kubenswrapper[5030]: I0121 00:27:44.596679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerStarted","Data":"0b115e0ffd70572e4d1729e17c81e61afca1995289fc050f3d08b035fe311afa"} Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.012674 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.086397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities\") pod \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.086551 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzdcv\" (UniqueName: \"kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv\") pod \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.086643 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content\") pod \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\" (UID: \"2dd786ba-7142-4b2b-8b47-8ef1dae07126\") " Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.088018 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities" (OuterVolumeSpecName: "utilities") pod "2dd786ba-7142-4b2b-8b47-8ef1dae07126" (UID: "2dd786ba-7142-4b2b-8b47-8ef1dae07126"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.095227 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv" (OuterVolumeSpecName: "kube-api-access-gzdcv") pod "2dd786ba-7142-4b2b-8b47-8ef1dae07126" (UID: "2dd786ba-7142-4b2b-8b47-8ef1dae07126"). InnerVolumeSpecName "kube-api-access-gzdcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.141412 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dd786ba-7142-4b2b-8b47-8ef1dae07126" (UID: "2dd786ba-7142-4b2b-8b47-8ef1dae07126"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.187926 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzdcv\" (UniqueName: \"kubernetes.io/projected/2dd786ba-7142-4b2b-8b47-8ef1dae07126-kube-api-access-gzdcv\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.187964 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.187974 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dd786ba-7142-4b2b-8b47-8ef1dae07126-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.620700 5030 generic.go:334] "Generic (PLEG): container finished" podID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerID="bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481" exitCode=0 Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.620762 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wf4rj" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.620780 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerDied","Data":"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481"} Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.620815 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wf4rj" event={"ID":"2dd786ba-7142-4b2b-8b47-8ef1dae07126","Type":"ContainerDied","Data":"63e3982a55ead39e8063bdf9c43bcd5f84629d18b1c3f4917c0fa5c208955248"} Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.620841 5030 scope.go:117] "RemoveContainer" containerID="bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.624124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerStarted","Data":"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440"} Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.672375 5030 scope.go:117] "RemoveContainer" containerID="bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.693435 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.708705 5030 scope.go:117] "RemoveContainer" containerID="2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.711130 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wf4rj"] Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.728144 5030 scope.go:117] "RemoveContainer" containerID="bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481" Jan 21 00:27:46 crc kubenswrapper[5030]: E0121 00:27:46.729569 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481\": container with ID starting with bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481 not found: ID does not exist" containerID="bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.729599 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481"} err="failed to get container status \"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481\": rpc error: code = NotFound desc = could not find container \"bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481\": container with ID starting with bdc37c46f1667599ed7b953a6735c72d7a7532efc9072abf525d4631446cd481 not found: ID does not exist" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.729641 5030 scope.go:117] "RemoveContainer" containerID="bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb" Jan 21 00:27:46 crc kubenswrapper[5030]: E0121 00:27:46.730088 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb\": container with ID starting with bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb not found: ID does not exist" containerID="bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.730116 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb"} err="failed to get container status \"bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb\": rpc error: code = NotFound desc = could not find container \"bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb\": container with ID starting with bdb0c448c027659ce78b8ec81e7b1ae5536d3596546160189290331c6a20c0cb not found: ID does not exist" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.730129 5030 scope.go:117] "RemoveContainer" containerID="2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6" Jan 21 00:27:46 crc kubenswrapper[5030]: E0121 00:27:46.730385 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6\": container with ID starting with 2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6 not found: ID does not exist" containerID="2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6" Jan 21 00:27:46 crc kubenswrapper[5030]: I0121 00:27:46.730403 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6"} err="failed to get container status \"2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6\": rpc error: code = NotFound desc = could not find container \"2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6\": container with ID starting with 2d2d9d1c346b8a5c89da4b28484e197babd3de9c7d4ffa27ad66097c556708d6 not found: ID does not exist" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.675755 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c"] Jan 21 00:27:47 crc kubenswrapper[5030]: E0121 00:27:47.676312 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="registry-server" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.676324 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="registry-server" Jan 21 00:27:47 crc kubenswrapper[5030]: E0121 00:27:47.676339 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="extract-content" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.676347 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="extract-content" Jan 21 00:27:47 crc kubenswrapper[5030]: E0121 00:27:47.676380 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="extract-utilities" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.676390 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="extract-utilities" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 
00:27:47.676535 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" containerName="registry-server" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.677591 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.679736 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.690860 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c"] Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.815166 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7s6d\" (UniqueName: \"kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.815220 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.815555 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.916882 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.916930 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7s6d\" (UniqueName: \"kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.917021 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " 
pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.917437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.917490 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.935013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7s6d\" (UniqueName: \"kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.970338 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd786ba-7142-4b2b-8b47-8ef1dae07126" path="/var/lib/kubelet/pods/2dd786ba-7142-4b2b-8b47-8ef1dae07126/volumes" Jan 21 00:27:47 crc kubenswrapper[5030]: I0121 00:27:47.995031 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:48 crc kubenswrapper[5030]: I0121 00:27:48.458794 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c"] Jan 21 00:27:48 crc kubenswrapper[5030]: I0121 00:27:48.640850 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerStarted","Data":"a3cfc47eb4f253bcf7c9049a5a3f1353b2fd4703fd12e31c960072b49ebaf3c8"} Jan 21 00:27:48 crc kubenswrapper[5030]: I0121 00:27:48.641136 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerStarted","Data":"e4449bc09f10a0365f222ec93997021eed784a850ebe7a5b483656c8990b30b8"} Jan 21 00:27:49 crc kubenswrapper[5030]: I0121 00:27:49.648638 5030 generic.go:334] "Generic (PLEG): container finished" podID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerID="a3cfc47eb4f253bcf7c9049a5a3f1353b2fd4703fd12e31c960072b49ebaf3c8" exitCode=0 Jan 21 00:27:49 crc kubenswrapper[5030]: I0121 00:27:49.648685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerDied","Data":"a3cfc47eb4f253bcf7c9049a5a3f1353b2fd4703fd12e31c960072b49ebaf3c8"} Jan 21 00:27:50 crc kubenswrapper[5030]: I0121 00:27:50.657853 5030 generic.go:334] "Generic (PLEG): container finished" podID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerID="96ee8ddb1e016ee8f7bab5581ed6f41fcce35cfd02a63c7f4769776f20939482" exitCode=0 Jan 21 00:27:50 crc kubenswrapper[5030]: I0121 00:27:50.658012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerDied","Data":"96ee8ddb1e016ee8f7bab5581ed6f41fcce35cfd02a63c7f4769776f20939482"} Jan 21 00:27:51 crc kubenswrapper[5030]: I0121 00:27:51.668314 5030 generic.go:334] "Generic (PLEG): container finished" podID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerID="bf2100e8315acf3920b1c710954850e1b86e673c2a40eeccdc4165aba597b394" exitCode=0 Jan 21 00:27:51 crc kubenswrapper[5030]: I0121 00:27:51.668400 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerDied","Data":"bf2100e8315acf3920b1c710954850e1b86e673c2a40eeccdc4165aba597b394"} Jan 21 00:27:52 crc kubenswrapper[5030]: I0121 00:27:52.996224 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.103419 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util\") pod \"cfa7767c-5909-48ed-9d3f-8046f21ee493\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.103518 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle\") pod \"cfa7767c-5909-48ed-9d3f-8046f21ee493\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.103550 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7s6d\" (UniqueName: \"kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d\") pod \"cfa7767c-5909-48ed-9d3f-8046f21ee493\" (UID: \"cfa7767c-5909-48ed-9d3f-8046f21ee493\") " Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.104574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle" (OuterVolumeSpecName: "bundle") pod "cfa7767c-5909-48ed-9d3f-8046f21ee493" (UID: "cfa7767c-5909-48ed-9d3f-8046f21ee493"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.116199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d" (OuterVolumeSpecName: "kube-api-access-f7s6d") pod "cfa7767c-5909-48ed-9d3f-8046f21ee493" (UID: "cfa7767c-5909-48ed-9d3f-8046f21ee493"). InnerVolumeSpecName "kube-api-access-f7s6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.120188 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util" (OuterVolumeSpecName: "util") pod "cfa7767c-5909-48ed-9d3f-8046f21ee493" (UID: "cfa7767c-5909-48ed-9d3f-8046f21ee493"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.205646 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.205692 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfa7767c-5909-48ed-9d3f-8046f21ee493-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.205712 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7s6d\" (UniqueName: \"kubernetes.io/projected/cfa7767c-5909-48ed-9d3f-8046f21ee493-kube-api-access-f7s6d\") on node \"crc\" DevicePath \"\"" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.715462 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" event={"ID":"cfa7767c-5909-48ed-9d3f-8046f21ee493","Type":"ContainerDied","Data":"e4449bc09f10a0365f222ec93997021eed784a850ebe7a5b483656c8990b30b8"} Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.715875 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4449bc09f10a0365f222ec93997021eed784a850ebe7a5b483656c8990b30b8" Jan 21 00:27:53 crc kubenswrapper[5030]: I0121 00:27:53.715500 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c" Jan 21 00:27:55 crc kubenswrapper[5030]: I0121 00:27:55.961687 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:27:55 crc kubenswrapper[5030]: E0121 00:27:55.962017 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.962276 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:28:04 crc kubenswrapper[5030]: E0121 00:28:04.965182 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="util" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.965404 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="util" Jan 21 00:28:04 crc kubenswrapper[5030]: E0121 00:28:04.965523 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="pull" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.965613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="pull" Jan 21 00:28:04 crc kubenswrapper[5030]: E0121 00:28:04.965759 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="extract" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.965841 5030 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="extract" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.966107 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" containerName="extract" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.966832 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.969136 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.969689 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ncmsh" Jan 21 00:28:04 crc kubenswrapper[5030]: I0121 00:28:04.979535 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.082743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflmf\" (UniqueName: \"kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.082824 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.082853 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.184134 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflmf\" (UniqueName: \"kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.184233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.184275 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.190000 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.190040 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.206494 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflmf\" (UniqueName: \"kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf\") pod \"keystone-operator-controller-manager-6dcc6c7f8c-np8dd\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.296694 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.769365 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:28:05 crc kubenswrapper[5030]: I0121 00:28:05.805809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" event={"ID":"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d","Type":"ContainerStarted","Data":"5f83cb9953091da6c1a6ee0e7f183e0f7c8db41d1df89cb10471bf4480592aae"} Jan 21 00:28:06 crc kubenswrapper[5030]: I0121 00:28:06.813665 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" event={"ID":"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d","Type":"ContainerStarted","Data":"4c65ea1f851941676d7928ea51d2203d75494f65b132d93fda2f4070408fc80c"} Jan 21 00:28:06 crc kubenswrapper[5030]: I0121 00:28:06.813985 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:06 crc kubenswrapper[5030]: I0121 00:28:06.832971 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" podStartSLOduration=2.8329495590000002 podStartE2EDuration="2.832949559s" podCreationTimestamp="2026-01-21 00:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:06.828818 +0000 UTC m=+6759.149078288" watchObservedRunningTime="2026-01-21 00:28:06.832949559 +0000 UTC m=+6759.153209847" Jan 21 00:28:09 crc kubenswrapper[5030]: I0121 00:28:09.961593 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:28:09 crc kubenswrapper[5030]: E0121 00:28:09.962103 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:28:15 crc kubenswrapper[5030]: I0121 00:28:15.301490 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:28:17 crc kubenswrapper[5030]: I0121 00:28:17.900046 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerID="4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440" exitCode=0 Jan 21 00:28:17 crc kubenswrapper[5030]: I0121 00:28:17.900348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerDied","Data":"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440"} Jan 21 00:28:18 crc kubenswrapper[5030]: I0121 00:28:18.910896 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" 
event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerStarted","Data":"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab"} Jan 21 00:28:18 crc kubenswrapper[5030]: I0121 00:28:18.911423 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:28:18 crc kubenswrapper[5030]: I0121 00:28:18.939429 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.939409332 podStartE2EDuration="36.939409332s" podCreationTimestamp="2026-01-21 00:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:18.938434188 +0000 UTC m=+6771.258694506" watchObservedRunningTime="2026-01-21 00:28:18.939409332 +0000 UTC m=+6771.259669620" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.547953 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7"] Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.549652 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.552104 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.553264 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qlgln"] Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.554417 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.559923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7"] Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.568555 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qlgln"] Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.728550 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.728953 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxw4p\" (UniqueName: \"kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.728984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rksb\" (UniqueName: \"kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc 
kubenswrapper[5030]: I0121 00:28:24.729041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.830161 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxw4p\" (UniqueName: \"kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.830210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rksb\" (UniqueName: \"kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.830302 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.830342 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.831028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.831789 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.848929 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rksb\" (UniqueName: \"kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb\") pod \"keystone-5c4b-account-create-update-xrwq7\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.848984 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxw4p\" (UniqueName: 
\"kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p\") pod \"keystone-db-create-qlgln\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.872970 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.885641 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:24 crc kubenswrapper[5030]: I0121 00:28:24.962735 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:28:25 crc kubenswrapper[5030]: I0121 00:28:25.348029 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qlgln"] Jan 21 00:28:25 crc kubenswrapper[5030]: W0121 00:28:25.350835 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef1cc5de_a3f5_4caf_9b1d_42ad9cd47439.slice/crio-826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0 WatchSource:0}: Error finding container 826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0: Status 404 returned error can't find the container with id 826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0 Jan 21 00:28:25 crc kubenswrapper[5030]: I0121 00:28:25.416569 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7"] Jan 21 00:28:25 crc kubenswrapper[5030]: I0121 00:28:25.972394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qlgln" event={"ID":"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439","Type":"ContainerStarted","Data":"826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0"} Jan 21 00:28:25 crc kubenswrapper[5030]: I0121 00:28:25.972807 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44"} Jan 21 00:28:25 crc kubenswrapper[5030]: I0121 00:28:25.972839 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" event={"ID":"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b","Type":"ContainerStarted","Data":"60991a9e4756622689e2b5c00231964590dd2e45f672f4c757b5e67366461f78"} Jan 21 00:28:26 crc kubenswrapper[5030]: I0121 00:28:26.979601 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" event={"ID":"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b","Type":"ContainerStarted","Data":"e36d136a3637d5ad02280a8e1b410b569b56fb8482dda19f04f012da725652b8"} Jan 21 00:28:26 crc kubenswrapper[5030]: I0121 00:28:26.983189 5030 generic.go:334] "Generic (PLEG): container finished" podID="ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" containerID="7d1f0f75a66fcedfda2393b7f594449db330ff4fa34d4c0af418df22393c8133" exitCode=0 Jan 21 00:28:26 crc kubenswrapper[5030]: I0121 00:28:26.983236 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qlgln" 
event={"ID":"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439","Type":"ContainerDied","Data":"7d1f0f75a66fcedfda2393b7f594449db330ff4fa34d4c0af418df22393c8133"} Jan 21 00:28:26 crc kubenswrapper[5030]: I0121 00:28:26.994206 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" podStartSLOduration=2.994188739 podStartE2EDuration="2.994188739s" podCreationTimestamp="2026-01-21 00:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:26.992724714 +0000 UTC m=+6779.312985012" watchObservedRunningTime="2026-01-21 00:28:26.994188739 +0000 UTC m=+6779.314449027" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.319511 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.491703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxw4p\" (UniqueName: \"kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p\") pod \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.491763 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts\") pod \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\" (UID: \"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439\") " Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.492777 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" (UID: "ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.502987 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p" (OuterVolumeSpecName: "kube-api-access-dxw4p") pod "ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" (UID: "ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439"). InnerVolumeSpecName "kube-api-access-dxw4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.594035 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxw4p\" (UniqueName: \"kubernetes.io/projected/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-kube-api-access-dxw4p\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.594071 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.996905 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-qlgln" event={"ID":"ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439","Type":"ContainerDied","Data":"826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0"} Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.996957 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="826cededaecec3c95b0d2e1afaafa248b53d092f49f8024de33acabbbb3b74f0" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.996961 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-qlgln" Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.998381 5030 generic.go:334] "Generic (PLEG): container finished" podID="42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" containerID="e36d136a3637d5ad02280a8e1b410b569b56fb8482dda19f04f012da725652b8" exitCode=0 Jan 21 00:28:28 crc kubenswrapper[5030]: I0121 00:28:28.998416 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" event={"ID":"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b","Type":"ContainerDied","Data":"e36d136a3637d5ad02280a8e1b410b569b56fb8482dda19f04f012da725652b8"} Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.293876 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.420577 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts\") pod \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.420690 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rksb\" (UniqueName: \"kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb\") pod \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\" (UID: \"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b\") " Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.421514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" (UID: "42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.428043 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb" (OuterVolumeSpecName: "kube-api-access-4rksb") pod "42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" (UID: "42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b"). InnerVolumeSpecName "kube-api-access-4rksb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.529245 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:30 crc kubenswrapper[5030]: I0121 00:28:30.529570 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rksb\" (UniqueName: \"kubernetes.io/projected/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b-kube-api-access-4rksb\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:31 crc kubenswrapper[5030]: I0121 00:28:31.013410 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" event={"ID":"42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b","Type":"ContainerDied","Data":"60991a9e4756622689e2b5c00231964590dd2e45f672f4c757b5e67366461f78"} Jan 21 00:28:31 crc kubenswrapper[5030]: I0121 00:28:31.013452 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60991a9e4756622689e2b5c00231964590dd2e45f672f4c757b5e67366461f78" Jan 21 00:28:31 crc kubenswrapper[5030]: I0121 00:28:31.013818 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7" Jan 21 00:28:31 crc kubenswrapper[5030]: I0121 00:28:31.984689 5030 scope.go:117] "RemoveContainer" containerID="d3fb2075d76030e7e163397240e228668676e0fecc64314fe4f656308408ca6d" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.033127 5030 scope.go:117] "RemoveContainer" containerID="0972703b71c7a6099b4d59d3fb431741000ac8ef39554b6381b063e428753517" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.058171 5030 scope.go:117] "RemoveContainer" containerID="96423988b979eabdb9dd2652b8c3e27f9c7b0b171119e451f241c0402de3b5ec" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.110220 5030 scope.go:117] "RemoveContainer" containerID="dd306b1025814562f10ac884155612dc7902b43a48408d4f5c79b5dd83e322b0" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.133735 5030 scope.go:117] "RemoveContainer" containerID="4780eff14b9160167f2aebd7335dbeef4f119149e7c378823fe1b8f44466bd30" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.156497 5030 scope.go:117] "RemoveContainer" containerID="3c6c4ba7dcd5ddeff09c58f2dcac4c82f8cce6d8c2ff50244ca4eae88cfdc6f2" Jan 21 00:28:32 crc kubenswrapper[5030]: I0121 00:28:32.179742 5030 scope.go:117] "RemoveContainer" containerID="b0dcf5f69ea5f517008215877759c4e6c14e3065f71f19c488496bc5b4d56dbb" Jan 21 00:28:34 crc kubenswrapper[5030]: I0121 00:28:34.004874 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.107873 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gc8rd"] Jan 21 00:28:35 crc kubenswrapper[5030]: E0121 00:28:35.108386 5030 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" containerName="mariadb-database-create" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.108397 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" containerName="mariadb-database-create" Jan 21 00:28:35 crc kubenswrapper[5030]: E0121 00:28:35.108418 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" containerName="mariadb-account-create-update" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.108425 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" containerName="mariadb-account-create-update" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.108548 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" containerName="mariadb-account-create-update" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.108562 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" containerName="mariadb-database-create" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.109011 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.111827 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.111895 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lkdjx" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.112042 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.112223 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.120697 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gc8rd"] Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.204452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pt9\" (UniqueName: \"kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.204519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.305902 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pt9\" (UniqueName: \"kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.305982 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.323901 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.327670 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pt9\" (UniqueName: \"kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9\") pod \"keystone-db-sync-gc8rd\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.461835 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:35 crc kubenswrapper[5030]: I0121 00:28:35.877751 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gc8rd"] Jan 21 00:28:36 crc kubenswrapper[5030]: I0121 00:28:36.118444 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" event={"ID":"6790d30b-ff25-4d9d-b438-fca7670cbbc0","Type":"ContainerStarted","Data":"18b310f5677dbab8b33c8dbf2417026617f2f343c50e055b678ace6c25336fb1"} Jan 21 00:28:36 crc kubenswrapper[5030]: I0121 00:28:36.118492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" event={"ID":"6790d30b-ff25-4d9d-b438-fca7670cbbc0","Type":"ContainerStarted","Data":"aa4b050f17351a98c5c80c0eda5765eca8d5a3af923a13d05ef6bbc1f962719b"} Jan 21 00:28:36 crc kubenswrapper[5030]: I0121 00:28:36.140795 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" podStartSLOduration=1.140777134 podStartE2EDuration="1.140777134s" podCreationTimestamp="2026-01-21 00:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:36.138815428 +0000 UTC m=+6788.459075716" watchObservedRunningTime="2026-01-21 00:28:36.140777134 +0000 UTC m=+6788.461037422" Jan 21 00:28:40 crc kubenswrapper[5030]: I0121 00:28:40.151839 5030 generic.go:334] "Generic (PLEG): container finished" podID="6790d30b-ff25-4d9d-b438-fca7670cbbc0" containerID="18b310f5677dbab8b33c8dbf2417026617f2f343c50e055b678ace6c25336fb1" exitCode=0 Jan 21 00:28:40 crc kubenswrapper[5030]: I0121 00:28:40.151903 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" event={"ID":"6790d30b-ff25-4d9d-b438-fca7670cbbc0","Type":"ContainerDied","Data":"18b310f5677dbab8b33c8dbf2417026617f2f343c50e055b678ace6c25336fb1"} Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.420052 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.498774 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7pt9\" (UniqueName: \"kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9\") pod \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.498834 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data\") pod \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\" (UID: \"6790d30b-ff25-4d9d-b438-fca7670cbbc0\") " Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.503892 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9" (OuterVolumeSpecName: "kube-api-access-v7pt9") pod "6790d30b-ff25-4d9d-b438-fca7670cbbc0" (UID: "6790d30b-ff25-4d9d-b438-fca7670cbbc0"). InnerVolumeSpecName "kube-api-access-v7pt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.542344 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data" (OuterVolumeSpecName: "config-data") pod "6790d30b-ff25-4d9d-b438-fca7670cbbc0" (UID: "6790d30b-ff25-4d9d-b438-fca7670cbbc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.601020 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7pt9\" (UniqueName: \"kubernetes.io/projected/6790d30b-ff25-4d9d-b438-fca7670cbbc0-kube-api-access-v7pt9\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:41 crc kubenswrapper[5030]: I0121 00:28:41.601096 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6790d30b-ff25-4d9d-b438-fca7670cbbc0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.171165 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" event={"ID":"6790d30b-ff25-4d9d-b438-fca7670cbbc0","Type":"ContainerDied","Data":"aa4b050f17351a98c5c80c0eda5765eca8d5a3af923a13d05ef6bbc1f962719b"} Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.171480 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa4b050f17351a98c5c80c0eda5765eca8d5a3af923a13d05ef6bbc1f962719b" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.171217 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-gc8rd" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.407648 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cfrdx"] Jan 21 00:28:42 crc kubenswrapper[5030]: E0121 00:28:42.408016 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6790d30b-ff25-4d9d-b438-fca7670cbbc0" containerName="keystone-db-sync" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.408037 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6790d30b-ff25-4d9d-b438-fca7670cbbc0" containerName="keystone-db-sync" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.408189 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6790d30b-ff25-4d9d-b438-fca7670cbbc0" containerName="keystone-db-sync" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.408772 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.413571 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.413814 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.413821 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.413864 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.414594 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lkdjx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.417404 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cfrdx"] Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.516199 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.516280 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.516317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.516339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts\") pod 
\"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.516382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9rdl\" (UniqueName: \"kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.617821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.617937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.617970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.618784 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9rdl\" (UniqueName: \"kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.619098 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.623101 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.624090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.624221 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data\") pod \"keystone-bootstrap-cfrdx\" (UID: 
\"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.627932 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.637844 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9rdl\" (UniqueName: \"kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl\") pod \"keystone-bootstrap-cfrdx\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:42 crc kubenswrapper[5030]: I0121 00:28:42.728532 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:43 crc kubenswrapper[5030]: I0121 00:28:43.123821 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cfrdx"] Jan 21 00:28:43 crc kubenswrapper[5030]: W0121 00:28:43.125326 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a913cce_580c_4b09_8cd8_d62a7a052b6a.slice/crio-e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a WatchSource:0}: Error finding container e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a: Status 404 returned error can't find the container with id e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a Jan 21 00:28:43 crc kubenswrapper[5030]: I0121 00:28:43.179436 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" event={"ID":"7a913cce-580c-4b09-8cd8-d62a7a052b6a","Type":"ContainerStarted","Data":"e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a"} Jan 21 00:28:44 crc kubenswrapper[5030]: I0121 00:28:44.213741 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" event={"ID":"7a913cce-580c-4b09-8cd8-d62a7a052b6a","Type":"ContainerStarted","Data":"85e5ab42373262369eafd038b4738728ebc79353bb01234ec5eb3a70e11ba12d"} Jan 21 00:28:44 crc kubenswrapper[5030]: I0121 00:28:44.232940 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" podStartSLOduration=2.232919992 podStartE2EDuration="2.232919992s" podCreationTimestamp="2026-01-21 00:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:44.230778761 +0000 UTC m=+6796.551039049" watchObservedRunningTime="2026-01-21 00:28:44.232919992 +0000 UTC m=+6796.553180280" Jan 21 00:28:48 crc kubenswrapper[5030]: I0121 00:28:48.249556 5030 generic.go:334] "Generic (PLEG): container finished" podID="7a913cce-580c-4b09-8cd8-d62a7a052b6a" containerID="85e5ab42373262369eafd038b4738728ebc79353bb01234ec5eb3a70e11ba12d" exitCode=0 Jan 21 00:28:48 crc kubenswrapper[5030]: I0121 00:28:48.249720 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" 
event={"ID":"7a913cce-580c-4b09-8cd8-d62a7a052b6a","Type":"ContainerDied","Data":"85e5ab42373262369eafd038b4738728ebc79353bb01234ec5eb3a70e11ba12d"} Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.593786 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.619847 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9rdl\" (UniqueName: \"kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl\") pod \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.620005 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts\") pod \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.620300 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data\") pod \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.620423 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys\") pod \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.620470 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys\") pod \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\" (UID: \"7a913cce-580c-4b09-8cd8-d62a7a052b6a\") " Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.626645 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl" (OuterVolumeSpecName: "kube-api-access-q9rdl") pod "7a913cce-580c-4b09-8cd8-d62a7a052b6a" (UID: "7a913cce-580c-4b09-8cd8-d62a7a052b6a"). InnerVolumeSpecName "kube-api-access-q9rdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.626922 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7a913cce-580c-4b09-8cd8-d62a7a052b6a" (UID: "7a913cce-580c-4b09-8cd8-d62a7a052b6a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.628845 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts" (OuterVolumeSpecName: "scripts") pod "7a913cce-580c-4b09-8cd8-d62a7a052b6a" (UID: "7a913cce-580c-4b09-8cd8-d62a7a052b6a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.634751 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7a913cce-580c-4b09-8cd8-d62a7a052b6a" (UID: "7a913cce-580c-4b09-8cd8-d62a7a052b6a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.642565 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data" (OuterVolumeSpecName: "config-data") pod "7a913cce-580c-4b09-8cd8-d62a7a052b6a" (UID: "7a913cce-580c-4b09-8cd8-d62a7a052b6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.722389 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.722426 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.722436 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.722449 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9rdl\" (UniqueName: \"kubernetes.io/projected/7a913cce-580c-4b09-8cd8-d62a7a052b6a-kube-api-access-q9rdl\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:49 crc kubenswrapper[5030]: I0121 00:28:49.722489 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a913cce-580c-4b09-8cd8-d62a7a052b6a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.266013 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" event={"ID":"7a913cce-580c-4b09-8cd8-d62a7a052b6a","Type":"ContainerDied","Data":"e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a"} Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.266057 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0aa3a5d4f3ea0471b15ef02fa6d52e33722dca79c9af3129c51ec144489e69a" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.266059 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-cfrdx" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.334226 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:28:50 crc kubenswrapper[5030]: E0121 00:28:50.334549 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a913cce-580c-4b09-8cd8-d62a7a052b6a" containerName="keystone-bootstrap" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.334570 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a913cce-580c-4b09-8cd8-d62a7a052b6a" containerName="keystone-bootstrap" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.334746 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a913cce-580c-4b09-8cd8-d62a7a052b6a" containerName="keystone-bootstrap" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.335327 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.336959 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.337797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.338431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lkdjx" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.338931 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.358192 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.440998 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.441089 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lgxz\" (UniqueName: \"kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.441288 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.441514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 
00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.441744 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.543037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.543114 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.543176 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.543218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.544056 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lgxz\" (UniqueName: \"kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.548013 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.548487 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.549396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.549586 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.565744 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lgxz\" (UniqueName: \"kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz\") pod \"keystone-56984ff75-4qcx5\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:50 crc kubenswrapper[5030]: I0121 00:28:50.650239 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:51 crc kubenswrapper[5030]: I0121 00:28:51.075960 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:28:51 crc kubenswrapper[5030]: I0121 00:28:51.273364 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" event={"ID":"60fd8dca-d952-497c-ad4e-fc309e4efc1c","Type":"ContainerStarted","Data":"8ed9b1a39fc472790eaed8b11e4bcb9f64bdcbbc67b1155365be237ca14c3c2a"} Jan 21 00:28:52 crc kubenswrapper[5030]: I0121 00:28:52.280683 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" event={"ID":"60fd8dca-d952-497c-ad4e-fc309e4efc1c","Type":"ContainerStarted","Data":"ed2af3b1efa3aaf58b14a7b55150f738c08be4afa1029337c8c28c6e1888bbe9"} Jan 21 00:28:52 crc kubenswrapper[5030]: I0121 00:28:52.281732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:28:52 crc kubenswrapper[5030]: I0121 00:28:52.304643 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" podStartSLOduration=2.304605082 podStartE2EDuration="2.304605082s" podCreationTimestamp="2026-01-21 00:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:28:52.295596706 +0000 UTC m=+6804.615857004" watchObservedRunningTime="2026-01-21 00:28:52.304605082 +0000 UTC m=+6804.624865370" Jan 21 00:29:22 crc kubenswrapper[5030]: I0121 00:29:22.101462 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:29:23 crc kubenswrapper[5030]: E0121 00:29:23.486412 5030 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-56984ff75-4qcx5_60fd8dca-d952-497c-ad4e-fc309e4efc1c/keystone-api/0.log" line={} Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.845771 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.846848 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.867802 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.883780 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqttz\" (UniqueName: \"kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.883842 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.884034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.884121 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.884189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.985743 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqttz\" (UniqueName: \"kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.985835 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.985968 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.986103 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.986216 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.992526 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.993081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.993781 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:23 crc kubenswrapper[5030]: I0121 00:29:23.998351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.008226 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqttz\" (UniqueName: \"kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz\") pod \"keystone-5fbff95654-4bfp9\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.101885 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cfrdx"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.114563 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-cfrdx"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.124662 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gc8rd"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.129657 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-gc8rd"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.134779 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.135431 5030 kubelet_pods.go:1007] "Unable to 
retrieve pull secret, the image pull may not succeed." pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" secret="" err="secret \"keystone-keystone-dockercfg-lkdjx\" not found" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.135527 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.142062 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.142307 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" podUID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" containerName="keystone-api" containerID="cri-o://ed2af3b1efa3aaf58b14a7b55150f738c08be4afa1029337c8c28c6e1888bbe9" gracePeriod=30 Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.173821 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.174795 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.188445 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.191883 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.191951 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbvcd\" (UniqueName: \"kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.294375 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.294449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbvcd\" (UniqueName: \"kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.296435 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " 
pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.318853 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbvcd\" (UniqueName: \"kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd\") pod \"keystone5c4b-account-delete-z2f2x\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.515306 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.613453 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:24 crc kubenswrapper[5030]: I0121 00:29:24.752157 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x"] Jan 21 00:29:24 crc kubenswrapper[5030]: W0121 00:29:24.755026 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99bb76d9_2580_4560_bd31_56c2bc692855.slice/crio-94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f WatchSource:0}: Error finding container 94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f: Status 404 returned error can't find the container with id 94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.553700 5030 generic.go:334] "Generic (PLEG): container finished" podID="99bb76d9-2580-4560-bd31-56c2bc692855" containerID="d2229cb87a4c9dc69e631510bb947178738e79bb959e6ff85af07cdc34fee6aa" exitCode=0 Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.553783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" event={"ID":"99bb76d9-2580-4560-bd31-56c2bc692855","Type":"ContainerDied","Data":"d2229cb87a4c9dc69e631510bb947178738e79bb959e6ff85af07cdc34fee6aa"} Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.554071 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" event={"ID":"99bb76d9-2580-4560-bd31-56c2bc692855","Type":"ContainerStarted","Data":"94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f"} Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.557177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" event={"ID":"78a4119c-52a8-433d-a55b-0ad3747a2e22","Type":"ContainerStarted","Data":"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661"} Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.557214 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" event={"ID":"78a4119c-52a8-433d-a55b-0ad3747a2e22","Type":"ContainerStarted","Data":"c21cbb9a2d129c073acbc63a8d8ced1ec644543dc8add44a366ea8e9381a0e1f"} Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.557332 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" podUID="78a4119c-52a8-433d-a55b-0ad3747a2e22" containerName="keystone-api" containerID="cri-o://881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661" gracePeriod=30 Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 
00:29:25.557659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.590659 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" podStartSLOduration=2.590641109 podStartE2EDuration="2.590641109s" podCreationTimestamp="2026-01-21 00:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:25.589922472 +0000 UTC m=+6837.910182760" watchObservedRunningTime="2026-01-21 00:29:25.590641109 +0000 UTC m=+6837.910901397" Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.940248 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.971869 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6790d30b-ff25-4d9d-b438-fca7670cbbc0" path="/var/lib/kubelet/pods/6790d30b-ff25-4d9d-b438-fca7670cbbc0/volumes" Jan 21 00:29:25 crc kubenswrapper[5030]: I0121 00:29:25.972537 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a913cce-580c-4b09-8cd8-d62a7a052b6a" path="/var/lib/kubelet/pods/7a913cce-580c-4b09-8cd8-d62a7a052b6a/volumes" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.020898 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqttz\" (UniqueName: \"kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz\") pod \"78a4119c-52a8-433d-a55b-0ad3747a2e22\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.020980 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts\") pod \"78a4119c-52a8-433d-a55b-0ad3747a2e22\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.021036 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data\") pod \"78a4119c-52a8-433d-a55b-0ad3747a2e22\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.021071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys\") pod \"78a4119c-52a8-433d-a55b-0ad3747a2e22\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.021095 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys\") pod \"78a4119c-52a8-433d-a55b-0ad3747a2e22\" (UID: \"78a4119c-52a8-433d-a55b-0ad3747a2e22\") " Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.026419 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "78a4119c-52a8-433d-a55b-0ad3747a2e22" (UID: "78a4119c-52a8-433d-a55b-0ad3747a2e22"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.026879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz" (OuterVolumeSpecName: "kube-api-access-xqttz") pod "78a4119c-52a8-433d-a55b-0ad3747a2e22" (UID: "78a4119c-52a8-433d-a55b-0ad3747a2e22"). InnerVolumeSpecName "kube-api-access-xqttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.026997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "78a4119c-52a8-433d-a55b-0ad3747a2e22" (UID: "78a4119c-52a8-433d-a55b-0ad3747a2e22"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.027047 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts" (OuterVolumeSpecName: "scripts") pod "78a4119c-52a8-433d-a55b-0ad3747a2e22" (UID: "78a4119c-52a8-433d-a55b-0ad3747a2e22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.042422 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data" (OuterVolumeSpecName: "config-data") pod "78a4119c-52a8-433d-a55b-0ad3747a2e22" (UID: "78a4119c-52a8-433d-a55b-0ad3747a2e22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.122351 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqttz\" (UniqueName: \"kubernetes.io/projected/78a4119c-52a8-433d-a55b-0ad3747a2e22-kube-api-access-xqttz\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.122398 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.122409 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.122418 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.122427 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/78a4119c-52a8-433d-a55b-0ad3747a2e22-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.566565 5030 generic.go:334] "Generic (PLEG): container finished" podID="78a4119c-52a8-433d-a55b-0ad3747a2e22" containerID="881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661" exitCode=0 Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.566656 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.566680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" event={"ID":"78a4119c-52a8-433d-a55b-0ad3747a2e22","Type":"ContainerDied","Data":"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661"} Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.566730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5fbff95654-4bfp9" event={"ID":"78a4119c-52a8-433d-a55b-0ad3747a2e22","Type":"ContainerDied","Data":"c21cbb9a2d129c073acbc63a8d8ced1ec644543dc8add44a366ea8e9381a0e1f"} Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.566762 5030 scope.go:117] "RemoveContainer" containerID="881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.601746 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.604938 5030 scope.go:117] "RemoveContainer" containerID="881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661" Jan 21 00:29:26 crc kubenswrapper[5030]: E0121 00:29:26.607450 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661\": container with ID starting with 881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661 not found: ID does not exist" containerID="881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.607525 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661"} err="failed to get container status \"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661\": rpc error: code = NotFound desc = could not find container \"881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661\": container with ID starting with 881ee4c4116ad9a79957a5b43ac8244193dadc0a39cdc82929c28ea094c03661 not found: ID does not exist" Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.608844 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5fbff95654-4bfp9"] Jan 21 00:29:26 crc kubenswrapper[5030]: I0121 00:29:26.939388 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.038737 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts\") pod \"99bb76d9-2580-4560-bd31-56c2bc692855\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.039033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbvcd\" (UniqueName: \"kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd\") pod \"99bb76d9-2580-4560-bd31-56c2bc692855\" (UID: \"99bb76d9-2580-4560-bd31-56c2bc692855\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.039738 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99bb76d9-2580-4560-bd31-56c2bc692855" (UID: "99bb76d9-2580-4560-bd31-56c2bc692855"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.043408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd" (OuterVolumeSpecName: "kube-api-access-wbvcd") pod "99bb76d9-2580-4560-bd31-56c2bc692855" (UID: "99bb76d9-2580-4560-bd31-56c2bc692855"). InnerVolumeSpecName "kube-api-access-wbvcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.139827 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99bb76d9-2580-4560-bd31-56c2bc692855-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.139874 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbvcd\" (UniqueName: \"kubernetes.io/projected/99bb76d9-2580-4560-bd31-56c2bc692855-kube-api-access-wbvcd\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.576310 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" event={"ID":"99bb76d9-2580-4560-bd31-56c2bc692855","Type":"ContainerDied","Data":"94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f"} Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.576345 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e82c38d26f40c0cf9f9237ae4ae06e8483558127986995fa26188cfb952e5f" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.576392 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.579552 5030 generic.go:334] "Generic (PLEG): container finished" podID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" containerID="ed2af3b1efa3aaf58b14a7b55150f738c08be4afa1029337c8c28c6e1888bbe9" exitCode=0 Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.579593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" event={"ID":"60fd8dca-d952-497c-ad4e-fc309e4efc1c","Type":"ContainerDied","Data":"ed2af3b1efa3aaf58b14a7b55150f738c08be4afa1029337c8c28c6e1888bbe9"} Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.696111 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.849084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts\") pod \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.849130 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys\") pod \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.849154 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys\") pod \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.849304 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lgxz\" (UniqueName: \"kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz\") pod \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.849348 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data\") pod \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\" (UID: \"60fd8dca-d952-497c-ad4e-fc309e4efc1c\") " Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.852982 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "60fd8dca-d952-497c-ad4e-fc309e4efc1c" (UID: "60fd8dca-d952-497c-ad4e-fc309e4efc1c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.853990 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "60fd8dca-d952-497c-ad4e-fc309e4efc1c" (UID: "60fd8dca-d952-497c-ad4e-fc309e4efc1c"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.903768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz" (OuterVolumeSpecName: "kube-api-access-8lgxz") pod "60fd8dca-d952-497c-ad4e-fc309e4efc1c" (UID: "60fd8dca-d952-497c-ad4e-fc309e4efc1c"). InnerVolumeSpecName "kube-api-access-8lgxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.903801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts" (OuterVolumeSpecName: "scripts") pod "60fd8dca-d952-497c-ad4e-fc309e4efc1c" (UID: "60fd8dca-d952-497c-ad4e-fc309e4efc1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.908055 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data" (OuterVolumeSpecName: "config-data") pod "60fd8dca-d952-497c-ad4e-fc309e4efc1c" (UID: "60fd8dca-d952-497c-ad4e-fc309e4efc1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.951081 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.951123 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.951139 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.951158 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lgxz\" (UniqueName: \"kubernetes.io/projected/60fd8dca-d952-497c-ad4e-fc309e4efc1c-kube-api-access-8lgxz\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.951172 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60fd8dca-d952-497c-ad4e-fc309e4efc1c-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:27 crc kubenswrapper[5030]: I0121 00:29:27.973332 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a4119c-52a8-433d-a55b-0ad3747a2e22" path="/var/lib/kubelet/pods/78a4119c-52a8-433d-a55b-0ad3747a2e22/volumes" Jan 21 00:29:28 crc kubenswrapper[5030]: I0121 00:29:28.587667 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" event={"ID":"60fd8dca-d952-497c-ad4e-fc309e4efc1c","Type":"ContainerDied","Data":"8ed9b1a39fc472790eaed8b11e4bcb9f64bdcbbc67b1155365be237ca14c3c2a"} Jan 21 00:29:28 crc kubenswrapper[5030]: I0121 00:29:28.587721 5030 scope.go:117] "RemoveContainer" containerID="ed2af3b1efa3aaf58b14a7b55150f738c08be4afa1029337c8c28c6e1888bbe9" Jan 21 00:29:28 crc kubenswrapper[5030]: I0121 00:29:28.587827 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-56984ff75-4qcx5" Jan 21 00:29:28 crc kubenswrapper[5030]: I0121 00:29:28.609876 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:29:28 crc kubenswrapper[5030]: I0121 00:29:28.618969 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-56984ff75-4qcx5"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.200996 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qlgln"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.213342 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-qlgln"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.222148 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.230211 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.238655 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone5c4b-account-delete-z2f2x"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.247274 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5c4b-account-create-update-xrwq7"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.424766 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-2qsh9"] Jan 21 00:29:29 crc kubenswrapper[5030]: E0121 00:29:29.425280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: E0121 00:29:29.425326 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a4119c-52a8-433d-a55b-0ad3747a2e22" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425338 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a4119c-52a8-433d-a55b-0ad3747a2e22" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: E0121 00:29:29.425361 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bb76d9-2580-4560-bd31-56c2bc692855" containerName="mariadb-account-delete" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425370 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bb76d9-2580-4560-bd31-56c2bc692855" containerName="mariadb-account-delete" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425555 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a4119c-52a8-433d-a55b-0ad3747a2e22" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425572 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bb76d9-2580-4560-bd31-56c2bc692855" containerName="mariadb-account-delete" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.425601 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" containerName="keystone-api" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.426277 5030 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.433941 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.441800 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.447490 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.464226 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.478669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-2qsh9"] Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.573759 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p6ph\" (UniqueName: \"kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.573996 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4549\" (UniqueName: \"kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549\") pod \"keystone-db-create-2qsh9\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.574126 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts\") pod \"keystone-db-create-2qsh9\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.574240 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.676162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p6ph\" (UniqueName: \"kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.676287 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4549\" (UniqueName: \"kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549\") pod \"keystone-db-create-2qsh9\" (UID: 
\"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.676334 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts\") pod \"keystone-db-create-2qsh9\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.676384 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.677318 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.677592 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts\") pod \"keystone-db-create-2qsh9\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.703440 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p6ph\" (UniqueName: \"kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph\") pod \"keystone-b4e1-account-create-update-96cxm\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.708474 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4549\" (UniqueName: \"kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549\") pod \"keystone-db-create-2qsh9\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.752788 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.767061 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.970600 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b" path="/var/lib/kubelet/pods/42b5e7f5-5e8d-41f6-9b44-0d4d43a4884b/volumes" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.971486 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fd8dca-d952-497c-ad4e-fc309e4efc1c" path="/var/lib/kubelet/pods/60fd8dca-d952-497c-ad4e-fc309e4efc1c/volumes" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.972036 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bb76d9-2580-4560-bd31-56c2bc692855" path="/var/lib/kubelet/pods/99bb76d9-2580-4560-bd31-56c2bc692855/volumes" Jan 21 00:29:29 crc kubenswrapper[5030]: I0121 00:29:29.972482 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439" path="/var/lib/kubelet/pods/ef1cc5de-a3f5-4caf-9b1d-42ad9cd47439/volumes" Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.191797 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm"] Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.222817 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-2qsh9"] Jan 21 00:29:30 crc kubenswrapper[5030]: W0121 00:29:30.233548 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd10a0df7_74d0_4a72_bc07_8bbe67876ceb.slice/crio-aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439 WatchSource:0}: Error finding container aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439: Status 404 returned error can't find the container with id aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439 Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.610108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" event={"ID":"d10a0df7-74d0-4a72-bc07-8bbe67876ceb","Type":"ContainerStarted","Data":"e5b0157672e1b421cd5af4613cdeaedc6adfa85567c61cfc76b1b321d2c705ed"} Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.610169 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" event={"ID":"d10a0df7-74d0-4a72-bc07-8bbe67876ceb","Type":"ContainerStarted","Data":"aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439"} Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.612184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" event={"ID":"166fed61-e938-4987-a036-2796621d548f","Type":"ContainerStarted","Data":"48a807b9894bd45560717aba56ef65fc945fee6bec861d7922d84e23fcbf0980"} Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.612210 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" event={"ID":"166fed61-e938-4987-a036-2796621d548f","Type":"ContainerStarted","Data":"23b2986793fe5f9d9ec90f0927ef54d9cc1f1c86bade5bbf18e94296d071a07d"} Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.625287 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" podStartSLOduration=1.625265725 
podStartE2EDuration="1.625265725s" podCreationTimestamp="2026-01-21 00:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:30.623638887 +0000 UTC m=+6842.943899185" watchObservedRunningTime="2026-01-21 00:29:30.625265725 +0000 UTC m=+6842.945526013" Jan 21 00:29:30 crc kubenswrapper[5030]: I0121 00:29:30.641618 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" podStartSLOduration=1.6415980860000001 podStartE2EDuration="1.641598086s" podCreationTimestamp="2026-01-21 00:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:30.637965609 +0000 UTC m=+6842.958225907" watchObservedRunningTime="2026-01-21 00:29:30.641598086 +0000 UTC m=+6842.961858384" Jan 21 00:29:31 crc kubenswrapper[5030]: I0121 00:29:31.624505 5030 generic.go:334] "Generic (PLEG): container finished" podID="166fed61-e938-4987-a036-2796621d548f" containerID="48a807b9894bd45560717aba56ef65fc945fee6bec861d7922d84e23fcbf0980" exitCode=0 Jan 21 00:29:31 crc kubenswrapper[5030]: I0121 00:29:31.624599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" event={"ID":"166fed61-e938-4987-a036-2796621d548f","Type":"ContainerDied","Data":"48a807b9894bd45560717aba56ef65fc945fee6bec861d7922d84e23fcbf0980"} Jan 21 00:29:31 crc kubenswrapper[5030]: I0121 00:29:31.627604 5030 generic.go:334] "Generic (PLEG): container finished" podID="d10a0df7-74d0-4a72-bc07-8bbe67876ceb" containerID="e5b0157672e1b421cd5af4613cdeaedc6adfa85567c61cfc76b1b321d2c705ed" exitCode=0 Jan 21 00:29:31 crc kubenswrapper[5030]: I0121 00:29:31.627668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" event={"ID":"d10a0df7-74d0-4a72-bc07-8bbe67876ceb","Type":"ContainerDied","Data":"e5b0157672e1b421cd5af4613cdeaedc6adfa85567c61cfc76b1b321d2c705ed"} Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.317923 5030 scope.go:117] "RemoveContainer" containerID="e5c76fade9d56f705e934b1239b542d08c1aed1713f5c0e36100b29dbde26be3" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.335449 5030 scope.go:117] "RemoveContainer" containerID="f546ff270d13e87756b68bc69b14cfc012088fbc136eaae01b98149a7d9090e1" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.360158 5030 scope.go:117] "RemoveContainer" containerID="5f221d2d0d551cd23f128dcce507a6d1a26d8589212885b5f75b7ba3a5f38b95" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.391960 5030 scope.go:117] "RemoveContainer" containerID="29eb6c85004f3f4c0e27be580f3a686a708892e41dfe3e1b9d8b0047281c06ed" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.432528 5030 scope.go:117] "RemoveContainer" containerID="0a0ef5b928090da58165f3a969293287ea6eacde529e8f548b3ae25dfe8d3364" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.463323 5030 scope.go:117] "RemoveContainer" containerID="77cbef79247a9104c4a4c6cac7f81b37a3df69ae6fc58ae4d4fb5f11d2c53f32" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.489874 5030 scope.go:117] "RemoveContainer" containerID="67abbc44be874834bfc74bf06dbef4f876db30dd4f2391d60959932155e14081" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.513896 5030 scope.go:117] "RemoveContainer" 
containerID="f66814a9de46d7736f6c47ceed05542d04c0b9c09a550511cd01151c616c4ede" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.536224 5030 scope.go:117] "RemoveContainer" containerID="2caaad1dad98d3f0bde280b9f62f75f37b9a16e1ef62be6fd79c2d57aae1f591" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.555578 5030 scope.go:117] "RemoveContainer" containerID="bb30e01ccbfbc917799867a4523d8b03ee98e8ae39fdc6661b3a6bd49ea96738" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.574392 5030 scope.go:117] "RemoveContainer" containerID="321aeec566632b05c5a0effe066112c99f2f9161e670b82f2a4f8a4e2d3081be" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.592087 5030 scope.go:117] "RemoveContainer" containerID="e3cc4bf99227158bdae8763a00722f9cc15c09290fa146bcd19dbf2bd9a7b163" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.951662 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:32 crc kubenswrapper[5030]: I0121 00:29:32.964431 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.124482 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts\") pod \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.124917 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p6ph\" (UniqueName: \"kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph\") pod \"166fed61-e938-4987-a036-2796621d548f\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.125013 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4549\" (UniqueName: \"kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549\") pod \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\" (UID: \"d10a0df7-74d0-4a72-bc07-8bbe67876ceb\") " Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.125067 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts\") pod \"166fed61-e938-4987-a036-2796621d548f\" (UID: \"166fed61-e938-4987-a036-2796621d548f\") " Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.125336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d10a0df7-74d0-4a72-bc07-8bbe67876ceb" (UID: "d10a0df7-74d0-4a72-bc07-8bbe67876ceb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.125662 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.125958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "166fed61-e938-4987-a036-2796621d548f" (UID: "166fed61-e938-4987-a036-2796621d548f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.130151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph" (OuterVolumeSpecName: "kube-api-access-5p6ph") pod "166fed61-e938-4987-a036-2796621d548f" (UID: "166fed61-e938-4987-a036-2796621d548f"). InnerVolumeSpecName "kube-api-access-5p6ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.130362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549" (OuterVolumeSpecName: "kube-api-access-s4549") pod "d10a0df7-74d0-4a72-bc07-8bbe67876ceb" (UID: "d10a0df7-74d0-4a72-bc07-8bbe67876ceb"). InnerVolumeSpecName "kube-api-access-s4549". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.226839 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/166fed61-e938-4987-a036-2796621d548f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.226885 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p6ph\" (UniqueName: \"kubernetes.io/projected/166fed61-e938-4987-a036-2796621d548f-kube-api-access-5p6ph\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.226897 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4549\" (UniqueName: \"kubernetes.io/projected/d10a0df7-74d0-4a72-bc07-8bbe67876ceb-kube-api-access-s4549\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.662168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" event={"ID":"d10a0df7-74d0-4a72-bc07-8bbe67876ceb","Type":"ContainerDied","Data":"aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439"} Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.662233 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa390ae449650e52dcbd13dbd76fc7fbf234ca960f520b63d6412481f535e439" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.662195 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-2qsh9" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.663984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" event={"ID":"166fed61-e938-4987-a036-2796621d548f","Type":"ContainerDied","Data":"23b2986793fe5f9d9ec90f0927ef54d9cc1f1c86bade5bbf18e94296d071a07d"} Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.664021 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b2986793fe5f9d9ec90f0927ef54d9cc1f1c86bade5bbf18e94296d071a07d" Jan 21 00:29:33 crc kubenswrapper[5030]: I0121 00:29:33.664057 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.003249 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wz7zw"] Jan 21 00:29:35 crc kubenswrapper[5030]: E0121 00:29:35.003864 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10a0df7-74d0-4a72-bc07-8bbe67876ceb" containerName="mariadb-database-create" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.003878 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10a0df7-74d0-4a72-bc07-8bbe67876ceb" containerName="mariadb-database-create" Jan 21 00:29:35 crc kubenswrapper[5030]: E0121 00:29:35.003888 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="166fed61-e938-4987-a036-2796621d548f" containerName="mariadb-account-create-update" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.003894 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="166fed61-e938-4987-a036-2796621d548f" containerName="mariadb-account-create-update" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.004009 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="166fed61-e938-4987-a036-2796621d548f" containerName="mariadb-account-create-update" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.004025 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10a0df7-74d0-4a72-bc07-8bbe67876ceb" containerName="mariadb-database-create" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.004483 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.006551 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.007448 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.009926 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.010797 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-bcgqp" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.014240 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wz7zw"] Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.054014 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.054732 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4d9\" (UniqueName: \"kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.156255 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4d9\" (UniqueName: \"kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.156450 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.162915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.171717 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4d9\" (UniqueName: \"kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9\") pod \"keystone-db-sync-wz7zw\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.321796 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:35 crc kubenswrapper[5030]: I0121 00:29:35.727962 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wz7zw"] Jan 21 00:29:36 crc kubenswrapper[5030]: I0121 00:29:36.689252 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" event={"ID":"baa555fb-a825-46f7-9f4b-56c9041a50a5","Type":"ContainerStarted","Data":"145da04b1e990964301aeca19c397782cb6b889bf464d9d3ee9a7d40bf1a35f6"} Jan 21 00:29:36 crc kubenswrapper[5030]: I0121 00:29:36.689671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" event={"ID":"baa555fb-a825-46f7-9f4b-56c9041a50a5","Type":"ContainerStarted","Data":"ec2305504d0d8a01021157f6eeb5082f5c865cda08e098b5959a5ac61eced978"} Jan 21 00:29:36 crc kubenswrapper[5030]: I0121 00:29:36.708838 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" podStartSLOduration=2.7088212670000003 podStartE2EDuration="2.708821267s" podCreationTimestamp="2026-01-21 00:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:36.702870695 +0000 UTC m=+6849.023131003" watchObservedRunningTime="2026-01-21 00:29:36.708821267 +0000 UTC m=+6849.029081555" Jan 21 00:29:37 crc kubenswrapper[5030]: I0121 00:29:37.699585 5030 generic.go:334] "Generic (PLEG): container finished" podID="baa555fb-a825-46f7-9f4b-56c9041a50a5" containerID="145da04b1e990964301aeca19c397782cb6b889bf464d9d3ee9a7d40bf1a35f6" exitCode=0 Jan 21 00:29:37 crc kubenswrapper[5030]: I0121 00:29:37.699761 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" event={"ID":"baa555fb-a825-46f7-9f4b-56c9041a50a5","Type":"ContainerDied","Data":"145da04b1e990964301aeca19c397782cb6b889bf464d9d3ee9a7d40bf1a35f6"} Jan 21 00:29:38 crc kubenswrapper[5030]: I0121 00:29:38.987363 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.032179 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data\") pod \"baa555fb-a825-46f7-9f4b-56c9041a50a5\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.032298 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z4d9\" (UniqueName: \"kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9\") pod \"baa555fb-a825-46f7-9f4b-56c9041a50a5\" (UID: \"baa555fb-a825-46f7-9f4b-56c9041a50a5\") " Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.037404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9" (OuterVolumeSpecName: "kube-api-access-8z4d9") pod "baa555fb-a825-46f7-9f4b-56c9041a50a5" (UID: "baa555fb-a825-46f7-9f4b-56c9041a50a5"). InnerVolumeSpecName "kube-api-access-8z4d9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.067318 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data" (OuterVolumeSpecName: "config-data") pod "baa555fb-a825-46f7-9f4b-56c9041a50a5" (UID: "baa555fb-a825-46f7-9f4b-56c9041a50a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.134197 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa555fb-a825-46f7-9f4b-56c9041a50a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.134235 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z4d9\" (UniqueName: \"kubernetes.io/projected/baa555fb-a825-46f7-9f4b-56c9041a50a5-kube-api-access-8z4d9\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.717125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" event={"ID":"baa555fb-a825-46f7-9f4b-56c9041a50a5","Type":"ContainerDied","Data":"ec2305504d0d8a01021157f6eeb5082f5c865cda08e098b5959a5ac61eced978"} Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.717163 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec2305504d0d8a01021157f6eeb5082f5c865cda08e098b5959a5ac61eced978" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.717180 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-wz7zw" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.903405 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-82nsz"] Jan 21 00:29:39 crc kubenswrapper[5030]: E0121 00:29:39.903714 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa555fb-a825-46f7-9f4b-56c9041a50a5" containerName="keystone-db-sync" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.903725 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa555fb-a825-46f7-9f4b-56c9041a50a5" containerName="keystone-db-sync" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.903839 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa555fb-a825-46f7-9f4b-56c9041a50a5" containerName="keystone-db-sync" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.904326 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.907215 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.907215 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.907402 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.909325 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-bcgqp" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.914425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-82nsz"] Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.916060 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.945850 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.945916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.945945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.945972 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hv4v\" (UniqueName: \"kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:39 crc kubenswrapper[5030]: I0121 00:29:39.946027 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.047388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc 
kubenswrapper[5030]: I0121 00:29:40.047442 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hv4v\" (UniqueName: \"kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.047504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.047554 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.047594 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.052052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.052274 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.052347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.055834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.068460 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hv4v\" (UniqueName: \"kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v\") pod \"keystone-bootstrap-82nsz\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.225393 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:40 crc kubenswrapper[5030]: I0121 00:29:40.812063 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-82nsz"] Jan 21 00:29:40 crc kubenswrapper[5030]: W0121 00:29:40.814672 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dfdf4e5_9d35_4a50_b83b_f202403e2b9a.slice/crio-e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0 WatchSource:0}: Error finding container e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0: Status 404 returned error can't find the container with id e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0 Jan 21 00:29:41 crc kubenswrapper[5030]: I0121 00:29:41.738785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" event={"ID":"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a","Type":"ContainerStarted","Data":"be02a06a2b2147d69c64894132b9ca632c3726d522bde0749ccbfa1e4fe8dbc9"} Jan 21 00:29:41 crc kubenswrapper[5030]: I0121 00:29:41.739030 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" event={"ID":"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a","Type":"ContainerStarted","Data":"e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0"} Jan 21 00:29:41 crc kubenswrapper[5030]: I0121 00:29:41.773106 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" podStartSLOduration=2.773082682 podStartE2EDuration="2.773082682s" podCreationTimestamp="2026-01-21 00:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:41.76464318 +0000 UTC m=+6854.084903468" watchObservedRunningTime="2026-01-21 00:29:41.773082682 +0000 UTC m=+6854.093342970" Jan 21 00:29:43 crc kubenswrapper[5030]: I0121 00:29:43.756319 5030 generic.go:334] "Generic (PLEG): container finished" podID="6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" containerID="be02a06a2b2147d69c64894132b9ca632c3726d522bde0749ccbfa1e4fe8dbc9" exitCode=0 Jan 21 00:29:43 crc kubenswrapper[5030]: I0121 00:29:43.756434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" event={"ID":"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a","Type":"ContainerDied","Data":"be02a06a2b2147d69c64894132b9ca632c3726d522bde0749ccbfa1e4fe8dbc9"} Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.123206 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.225715 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hv4v\" (UniqueName: \"kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v\") pod \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.225783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data\") pod \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.225827 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys\") pod \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.225954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys\") pod \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.225998 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts\") pod \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\" (UID: \"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a\") " Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.232799 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" (UID: "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.232835 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts" (OuterVolumeSpecName: "scripts") pod "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" (UID: "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.235360 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v" (OuterVolumeSpecName: "kube-api-access-8hv4v") pod "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" (UID: "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a"). InnerVolumeSpecName "kube-api-access-8hv4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.235779 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" (UID: "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.249896 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data" (OuterVolumeSpecName: "config-data") pod "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" (UID: "6dfdf4e5-9d35-4a50-b83b-f202403e2b9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.328243 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.328304 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.328330 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hv4v\" (UniqueName: \"kubernetes.io/projected/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-kube-api-access-8hv4v\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.328351 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.328371 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.776829 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" event={"ID":"6dfdf4e5-9d35-4a50-b83b-f202403e2b9a","Type":"ContainerDied","Data":"e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0"} Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.776879 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5b6a10c5fe86aa421889e783cd20ad79ec306b02d87bb335728a41c7128b2b0" Jan 21 00:29:45 crc kubenswrapper[5030]: I0121 00:29:45.776888 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-82nsz" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.218394 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:29:46 crc kubenswrapper[5030]: E0121 00:29:46.218693 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" containerName="keystone-bootstrap" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.218705 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" containerName="keystone-bootstrap" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.218859 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" containerName="keystone-bootstrap" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.219330 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.222412 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.222987 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.223165 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-bcgqp" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.223331 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.245411 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.344416 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szlmx\" (UniqueName: \"kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.344489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.344512 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.344548 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.344571 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.445802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.445855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.445918 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szlmx\" (UniqueName: \"kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.445958 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.445982 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.450218 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.450443 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.451420 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.458890 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.464277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szlmx\" (UniqueName: \"kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx\") pod \"keystone-8644957655-qkpdb\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:46 crc kubenswrapper[5030]: I0121 00:29:46.551154 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:47 crc kubenswrapper[5030]: I0121 00:29:47.036574 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:29:47 crc kubenswrapper[5030]: I0121 00:29:47.795493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" event={"ID":"8b34c50c-963f-41ee-b456-b9cf7604dad8","Type":"ContainerStarted","Data":"078707d75d825a795e573db3649a20439280d0bce6ec108c8e0cb9faa3c50c28"} Jan 21 00:29:47 crc kubenswrapper[5030]: I0121 00:29:47.795959 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:29:47 crc kubenswrapper[5030]: I0121 00:29:47.795981 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" event={"ID":"8b34c50c-963f-41ee-b456-b9cf7604dad8","Type":"ContainerStarted","Data":"e7625091a2eff6be4565c0075e2bf3c8f6391b7d76c514df017ec99aa07fb323"} Jan 21 00:29:47 crc kubenswrapper[5030]: I0121 00:29:47.819444 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" podStartSLOduration=1.819425404 podStartE2EDuration="1.819425404s" podCreationTimestamp="2026-01-21 00:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:29:47.814054145 +0000 UTC m=+6860.134314443" watchObservedRunningTime="2026-01-21 00:29:47.819425404 +0000 UTC m=+6860.139685692" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.132469 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr"] Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.133933 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.136085 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.136342 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.150505 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr"] Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.267098 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-894h2\" (UniqueName: \"kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.267229 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.267306 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.369044 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.369131 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894h2\" (UniqueName: \"kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.369212 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.372604 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume\") pod 
\"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.374855 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.400807 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894h2\" (UniqueName: \"kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2\") pod \"collect-profiles-29482590-h86lr\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.455808 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.873712 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr"] Jan 21 00:30:00 crc kubenswrapper[5030]: I0121 00:30:00.903373 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" event={"ID":"3af7c689-d338-4c09-8b45-78037d8dc249","Type":"ContainerStarted","Data":"6f1cb2f0206828473a2f551d2fb738c5619761b4ae56feca2ca4401b2a55d19e"} Jan 21 00:30:01 crc kubenswrapper[5030]: I0121 00:30:01.912183 5030 generic.go:334] "Generic (PLEG): container finished" podID="3af7c689-d338-4c09-8b45-78037d8dc249" containerID="dcf8bdbe79b2a2b8f83042f25096f786d0b00a604e9bc49fef0c8005aef64b68" exitCode=0 Jan 21 00:30:01 crc kubenswrapper[5030]: I0121 00:30:01.912312 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" event={"ID":"3af7c689-d338-4c09-8b45-78037d8dc249","Type":"ContainerDied","Data":"dcf8bdbe79b2a2b8f83042f25096f786d0b00a604e9bc49fef0c8005aef64b68"} Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.243664 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.416101 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume\") pod \"3af7c689-d338-4c09-8b45-78037d8dc249\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.416231 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-894h2\" (UniqueName: \"kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2\") pod \"3af7c689-d338-4c09-8b45-78037d8dc249\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.416309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume\") pod \"3af7c689-d338-4c09-8b45-78037d8dc249\" (UID: \"3af7c689-d338-4c09-8b45-78037d8dc249\") " Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.417022 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume" (OuterVolumeSpecName: "config-volume") pod "3af7c689-d338-4c09-8b45-78037d8dc249" (UID: "3af7c689-d338-4c09-8b45-78037d8dc249"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.423788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3af7c689-d338-4c09-8b45-78037d8dc249" (UID: "3af7c689-d338-4c09-8b45-78037d8dc249"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.423914 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2" (OuterVolumeSpecName: "kube-api-access-894h2") pod "3af7c689-d338-4c09-8b45-78037d8dc249" (UID: "3af7c689-d338-4c09-8b45-78037d8dc249"). InnerVolumeSpecName "kube-api-access-894h2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.517511 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3af7c689-d338-4c09-8b45-78037d8dc249-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.517734 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-894h2\" (UniqueName: \"kubernetes.io/projected/3af7c689-d338-4c09-8b45-78037d8dc249-kube-api-access-894h2\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.517794 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3af7c689-d338-4c09-8b45-78037d8dc249-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.927724 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" event={"ID":"3af7c689-d338-4c09-8b45-78037d8dc249","Type":"ContainerDied","Data":"6f1cb2f0206828473a2f551d2fb738c5619761b4ae56feca2ca4401b2a55d19e"} Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.927765 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f1cb2f0206828473a2f551d2fb738c5619761b4ae56feca2ca4401b2a55d19e" Jan 21 00:30:03 crc kubenswrapper[5030]: I0121 00:30:03.927780 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr" Jan 21 00:30:04 crc kubenswrapper[5030]: I0121 00:30:04.039051 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n"] Jan 21 00:30:04 crc kubenswrapper[5030]: I0121 00:30:04.047320 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482545-fj69n"] Jan 21 00:30:05 crc kubenswrapper[5030]: I0121 00:30:05.972662 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d12c4e-cd2b-4d9e-ab44-6ea82cd18829" path="/var/lib/kubelet/pods/10d12c4e-cd2b-4d9e-ab44-6ea82cd18829/volumes" Jan 21 00:30:17 crc kubenswrapper[5030]: I0121 00:30:17.972504 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.327709 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:19 crc kubenswrapper[5030]: E0121 00:30:19.328103 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af7c689-d338-4c09-8b45-78037d8dc249" containerName="collect-profiles" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.328118 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af7c689-d338-4c09-8b45-78037d8dc249" containerName="collect-profiles" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.328284 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af7c689-d338-4c09-8b45-78037d8dc249" containerName="collect-profiles" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.328926 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.341413 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.342344 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.366975 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.385384 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.466881 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467012 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467041 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpvz\" (UniqueName: \"kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467095 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467152 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys\") pod \"keystone-8644957655-zdr97\" (UID: 
\"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467422 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzq9\" (UniqueName: \"kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.467514 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569275 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzq9\" (UniqueName: \"kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569341 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569565 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569607 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpvz\" (UniqueName: \"kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569672 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys\") pod 
\"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569705 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569736 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.569799 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577366 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577883 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " 
pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.577965 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.585876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.586052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.590738 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpvz\" (UniqueName: \"kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz\") pod \"keystone-8644957655-k9b85\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.594247 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzq9\" (UniqueName: \"kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9\") pod \"keystone-8644957655-zdr97\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.661778 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:19 crc kubenswrapper[5030]: I0121 00:30:19.678451 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:20 crc kubenswrapper[5030]: I0121 00:30:20.097102 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:20 crc kubenswrapper[5030]: I0121 00:30:20.138719 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:20 crc kubenswrapper[5030]: W0121 00:30:20.142307 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05fec125_22a1_4329_a4d9_bd3773ee1f24.slice/crio-a3a3bc657b0e64da4974dbabfca80a6a9d6e86878a0cdcc1e5901199fdf772d4 WatchSource:0}: Error finding container a3a3bc657b0e64da4974dbabfca80a6a9d6e86878a0cdcc1e5901199fdf772d4: Status 404 returned error can't find the container with id a3a3bc657b0e64da4974dbabfca80a6a9d6e86878a0cdcc1e5901199fdf772d4 Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.091972 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" event={"ID":"05fec125-22a1-4329-a4d9-bd3773ee1f24","Type":"ContainerStarted","Data":"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa"} Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.092233 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.092244 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" event={"ID":"05fec125-22a1-4329-a4d9-bd3773ee1f24","Type":"ContainerStarted","Data":"a3a3bc657b0e64da4974dbabfca80a6a9d6e86878a0cdcc1e5901199fdf772d4"} Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.093902 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" event={"ID":"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce","Type":"ContainerStarted","Data":"3079b268d94dfa3c0c46020d2e096a891531869b7a2bf90d2e90ac9bf42c0e83"} Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.094372 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" event={"ID":"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce","Type":"ContainerStarted","Data":"d69d1f9382f3141648c4438ef962e915426016477686737c271b7a624712c436"} Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.094512 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.110276 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" podStartSLOduration=2.110252966 podStartE2EDuration="2.110252966s" podCreationTimestamp="2026-01-21 00:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:30:21.110089912 +0000 UTC m=+6893.430350200" watchObservedRunningTime="2026-01-21 00:30:21.110252966 +0000 UTC m=+6893.430513264" Jan 21 00:30:21 crc kubenswrapper[5030]: I0121 00:30:21.130805 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" podStartSLOduration=2.130786947 podStartE2EDuration="2.130786947s" podCreationTimestamp="2026-01-21 00:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:30:21.126328439 +0000 UTC m=+6893.446588727" watchObservedRunningTime="2026-01-21 00:30:21.130786947 +0000 UTC m=+6893.451047225" Jan 21 00:30:32 crc kubenswrapper[5030]: I0121 00:30:32.762605 5030 scope.go:117] "RemoveContainer" containerID="7455b0c322ade772e025ce4626b591e5262bdd366e579932bb533fe1332bd7fd" Jan 21 00:30:32 crc kubenswrapper[5030]: I0121 00:30:32.783013 5030 scope.go:117] "RemoveContainer" containerID="d97e7dad220d6c0976d63306667b72803bea126b66f2551e2435f8710999b82b" Jan 21 00:30:40 crc kubenswrapper[5030]: I0121 00:30:40.156948 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:30:40 crc kubenswrapper[5030]: I0121 00:30:40.157437 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:30:51 crc kubenswrapper[5030]: I0121 00:30:51.156954 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:51 crc kubenswrapper[5030]: I0121 00:30:51.176690 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:52 crc kubenswrapper[5030]: I0121 00:30:52.182522 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:52 crc kubenswrapper[5030]: I0121 00:30:52.182973 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" podUID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" containerName="keystone-api" containerID="cri-o://3079b268d94dfa3c0c46020d2e096a891531869b7a2bf90d2e90ac9bf42c0e83" gracePeriod=30 Jan 21 00:30:52 crc kubenswrapper[5030]: I0121 00:30:52.197915 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:52 crc kubenswrapper[5030]: I0121 00:30:52.198139 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" podUID="05fec125-22a1-4329-a4d9-bd3773ee1f24" containerName="keystone-api" containerID="cri-o://9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa" gracePeriod=30 Jan 21 00:30:53 crc kubenswrapper[5030]: I0121 00:30:53.367072 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:30:53 crc kubenswrapper[5030]: I0121 00:30:53.367351 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" podUID="8b34c50c-963f-41ee-b456-b9cf7604dad8" containerName="keystone-api" containerID="cri-o://078707d75d825a795e573db3649a20439280d0bce6ec108c8e0cb9faa3c50c28" gracePeriod=30 Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.412504 5030 generic.go:334] "Generic (PLEG): container finished" podID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" 
containerID="3079b268d94dfa3c0c46020d2e096a891531869b7a2bf90d2e90ac9bf42c0e83" exitCode=0 Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.413121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" event={"ID":"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce","Type":"ContainerDied","Data":"3079b268d94dfa3c0c46020d2e096a891531869b7a2bf90d2e90ac9bf42c0e83"} Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.572907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.671028 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.674727 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znpvz\" (UniqueName: \"kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz\") pod \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675726 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys\") pod \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys\") pod \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675845 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys\") pod \"05fec125-22a1-4329-a4d9-bd3773ee1f24\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675941 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgzq9\" (UniqueName: \"kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9\") pod \"05fec125-22a1-4329-a4d9-bd3773ee1f24\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675972 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys\") pod \"05fec125-22a1-4329-a4d9-bd3773ee1f24\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.675997 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data\") pod \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.676018 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts\") pod 
\"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\" (UID: \"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.676035 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data\") pod \"05fec125-22a1-4329-a4d9-bd3773ee1f24\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.676060 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts\") pod \"05fec125-22a1-4329-a4d9-bd3773ee1f24\" (UID: \"05fec125-22a1-4329-a4d9-bd3773ee1f24\") " Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.680832 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "05fec125-22a1-4329-a4d9-bd3773ee1f24" (UID: "05fec125-22a1-4329-a4d9-bd3773ee1f24"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.681260 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz" (OuterVolumeSpecName: "kube-api-access-znpvz") pod "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" (UID: "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce"). InnerVolumeSpecName "kube-api-access-znpvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.681454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts" (OuterVolumeSpecName: "scripts") pod "05fec125-22a1-4329-a4d9-bd3773ee1f24" (UID: "05fec125-22a1-4329-a4d9-bd3773ee1f24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.681822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts" (OuterVolumeSpecName: "scripts") pod "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" (UID: "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.681807 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" (UID: "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.682661 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "05fec125-22a1-4329-a4d9-bd3773ee1f24" (UID: "05fec125-22a1-4329-a4d9-bd3773ee1f24"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.683348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9" (OuterVolumeSpecName: "kube-api-access-pgzq9") pod "05fec125-22a1-4329-a4d9-bd3773ee1f24" (UID: "05fec125-22a1-4329-a4d9-bd3773ee1f24"). InnerVolumeSpecName "kube-api-access-pgzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.683729 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" (UID: "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.706294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data" (OuterVolumeSpecName: "config-data") pod "05fec125-22a1-4329-a4d9-bd3773ee1f24" (UID: "05fec125-22a1-4329-a4d9-bd3773ee1f24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.708809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data" (OuterVolumeSpecName: "config-data") pod "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" (UID: "70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777823 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777861 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgzq9\" (UniqueName: \"kubernetes.io/projected/05fec125-22a1-4329-a4d9-bd3773ee1f24-kube-api-access-pgzq9\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777871 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777881 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777890 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777899 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777907 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/05fec125-22a1-4329-a4d9-bd3773ee1f24-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777916 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znpvz\" (UniqueName: \"kubernetes.io/projected/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-kube-api-access-znpvz\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777924 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:55 crc kubenswrapper[5030]: I0121 00:30:55.777932 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.423532 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.423687 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-k9b85" event={"ID":"70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce","Type":"ContainerDied","Data":"d69d1f9382f3141648c4438ef962e915426016477686737c271b7a624712c436"} Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.424230 5030 scope.go:117] "RemoveContainer" containerID="3079b268d94dfa3c0c46020d2e096a891531869b7a2bf90d2e90ac9bf42c0e83" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.425383 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.425428 5030 generic.go:334] "Generic (PLEG): container finished" podID="05fec125-22a1-4329-a4d9-bd3773ee1f24" containerID="9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa" exitCode=0 Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.425482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" event={"ID":"05fec125-22a1-4329-a4d9-bd3773ee1f24","Type":"ContainerDied","Data":"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa"} Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.425531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-zdr97" event={"ID":"05fec125-22a1-4329-a4d9-bd3773ee1f24","Type":"ContainerDied","Data":"a3a3bc657b0e64da4974dbabfca80a6a9d6e86878a0cdcc1e5901199fdf772d4"} Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.449786 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.460463 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-k9b85"] Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.460549 5030 scope.go:117] "RemoveContainer" containerID="9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.465998 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.472076 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["keystone-kuttl-tests/keystone-8644957655-zdr97"] Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.479907 5030 scope.go:117] "RemoveContainer" containerID="9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa" Jan 21 00:30:56 crc kubenswrapper[5030]: E0121 00:30:56.480546 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa\": container with ID starting with 9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa not found: ID does not exist" containerID="9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa" Jan 21 00:30:56 crc kubenswrapper[5030]: I0121 00:30:56.480601 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa"} err="failed to get container status \"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa\": rpc error: code = NotFound desc = could not find container \"9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa\": container with ID starting with 9733ad074021a4bd17a138322c8930beabf33933a544ad9bf2c22fad23c297fa not found: ID does not exist" Jan 21 00:30:57 crc kubenswrapper[5030]: I0121 00:30:57.435439 5030 generic.go:334] "Generic (PLEG): container finished" podID="8b34c50c-963f-41ee-b456-b9cf7604dad8" containerID="078707d75d825a795e573db3649a20439280d0bce6ec108c8e0cb9faa3c50c28" exitCode=0 Jan 21 00:30:57 crc kubenswrapper[5030]: I0121 00:30:57.435510 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" event={"ID":"8b34c50c-963f-41ee-b456-b9cf7604dad8","Type":"ContainerDied","Data":"078707d75d825a795e573db3649a20439280d0bce6ec108c8e0cb9faa3c50c28"} Jan 21 00:30:57 crc kubenswrapper[5030]: I0121 00:30:57.970424 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fec125-22a1-4329-a4d9-bd3773ee1f24" path="/var/lib/kubelet/pods/05fec125-22a1-4329-a4d9-bd3773ee1f24/volumes" Jan 21 00:30:57 crc kubenswrapper[5030]: I0121 00:30:57.971076 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" path="/var/lib/kubelet/pods/70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce/volumes" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.002195 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.007768 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szlmx\" (UniqueName: \"kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx\") pod \"8b34c50c-963f-41ee-b456-b9cf7604dad8\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.007836 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data\") pod \"8b34c50c-963f-41ee-b456-b9cf7604dad8\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.007885 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts\") pod \"8b34c50c-963f-41ee-b456-b9cf7604dad8\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.007901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys\") pod \"8b34c50c-963f-41ee-b456-b9cf7604dad8\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.007923 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys\") pod \"8b34c50c-963f-41ee-b456-b9cf7604dad8\" (UID: \"8b34c50c-963f-41ee-b456-b9cf7604dad8\") " Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.014829 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx" (OuterVolumeSpecName: "kube-api-access-szlmx") pod "8b34c50c-963f-41ee-b456-b9cf7604dad8" (UID: "8b34c50c-963f-41ee-b456-b9cf7604dad8"). InnerVolumeSpecName "kube-api-access-szlmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.014926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8b34c50c-963f-41ee-b456-b9cf7604dad8" (UID: "8b34c50c-963f-41ee-b456-b9cf7604dad8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.015004 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8b34c50c-963f-41ee-b456-b9cf7604dad8" (UID: "8b34c50c-963f-41ee-b456-b9cf7604dad8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.015976 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts" (OuterVolumeSpecName: "scripts") pod "8b34c50c-963f-41ee-b456-b9cf7604dad8" (UID: "8b34c50c-963f-41ee-b456-b9cf7604dad8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.075485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data" (OuterVolumeSpecName: "config-data") pod "8b34c50c-963f-41ee-b456-b9cf7604dad8" (UID: "8b34c50c-963f-41ee-b456-b9cf7604dad8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.109118 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.109159 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.109172 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.109187 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szlmx\" (UniqueName: \"kubernetes.io/projected/8b34c50c-963f-41ee-b456-b9cf7604dad8-kube-api-access-szlmx\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.109197 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b34c50c-963f-41ee-b456-b9cf7604dad8-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.445569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" event={"ID":"8b34c50c-963f-41ee-b456-b9cf7604dad8","Type":"ContainerDied","Data":"e7625091a2eff6be4565c0075e2bf3c8f6391b7d76c514df017ec99aa07fb323"} Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.445608 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-8644957655-qkpdb" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.445670 5030 scope.go:117] "RemoveContainer" containerID="078707d75d825a795e573db3649a20439280d0bce6ec108c8e0cb9faa3c50c28" Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.473037 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:30:58 crc kubenswrapper[5030]: I0121 00:30:58.478569 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-8644957655-qkpdb"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.533676 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wz7zw"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.542926 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-wz7zw"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.550249 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-82nsz"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.568403 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-82nsz"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.609449 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb"] Jan 21 00:30:59 crc kubenswrapper[5030]: E0121 00:30:59.609799 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b34c50c-963f-41ee-b456-b9cf7604dad8" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.609817 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b34c50c-963f-41ee-b456-b9cf7604dad8" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: E0121 00:30:59.609851 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fec125-22a1-4329-a4d9-bd3773ee1f24" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.609858 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fec125-22a1-4329-a4d9-bd3773ee1f24" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: E0121 00:30:59.609880 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.609887 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.610080 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b34c50c-963f-41ee-b456-b9cf7604dad8" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.610105 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fec125-22a1-4329-a4d9-bd3773ee1f24" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.610122 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fa9a43-b7a0-45a4-86aa-e5a3b8e6bbce" containerName="keystone-api" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.610679 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.619025 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb"] Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.630873 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.630962 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4982\" (UniqueName: \"kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.732268 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.732552 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4982\" (UniqueName: \"kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.733159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.765879 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4982\" (UniqueName: \"kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982\") pod \"keystoneb4e1-account-delete-lkvxb\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.929234 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.971485 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfdf4e5-9d35-4a50-b83b-f202403e2b9a" path="/var/lib/kubelet/pods/6dfdf4e5-9d35-4a50-b83b-f202403e2b9a/volumes" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.972528 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b34c50c-963f-41ee-b456-b9cf7604dad8" path="/var/lib/kubelet/pods/8b34c50c-963f-41ee-b456-b9cf7604dad8/volumes" Jan 21 00:30:59 crc kubenswrapper[5030]: I0121 00:30:59.973205 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa555fb-a825-46f7-9f4b-56c9041a50a5" path="/var/lib/kubelet/pods/baa555fb-a825-46f7-9f4b-56c9041a50a5/volumes" Jan 21 00:31:00 crc kubenswrapper[5030]: I0121 00:31:00.358612 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb"] Jan 21 00:31:00 crc kubenswrapper[5030]: I0121 00:31:00.465037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" event={"ID":"b6462dc5-3667-4b13-9c23-ba2fbd0e8965","Type":"ContainerStarted","Data":"bc767e1ac69681b2262560c2b57fd6319117338709779812ca4ffa13e4914893"} Jan 21 00:31:01 crc kubenswrapper[5030]: I0121 00:31:01.476795 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" event={"ID":"b6462dc5-3667-4b13-9c23-ba2fbd0e8965","Type":"ContainerStarted","Data":"dc44f1f3719f1d4ebc0751b891dd3d95c0e3ff0f296ca0bb29b8e68b4862ffd5"} Jan 21 00:31:01 crc kubenswrapper[5030]: I0121 00:31:01.503716 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" podStartSLOduration=2.503696246 podStartE2EDuration="2.503696246s" podCreationTimestamp="2026-01-21 00:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:31:01.497288203 +0000 UTC m=+6933.817548511" watchObservedRunningTime="2026-01-21 00:31:01.503696246 +0000 UTC m=+6933.823956534" Jan 21 00:31:03 crc kubenswrapper[5030]: I0121 00:31:03.496515 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6462dc5-3667-4b13-9c23-ba2fbd0e8965" containerID="dc44f1f3719f1d4ebc0751b891dd3d95c0e3ff0f296ca0bb29b8e68b4862ffd5" exitCode=0 Jan 21 00:31:03 crc kubenswrapper[5030]: I0121 00:31:03.496697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" event={"ID":"b6462dc5-3667-4b13-9c23-ba2fbd0e8965","Type":"ContainerDied","Data":"dc44f1f3719f1d4ebc0751b891dd3d95c0e3ff0f296ca0bb29b8e68b4862ffd5"} Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.821166 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.925007 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4982\" (UniqueName: \"kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982\") pod \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.925977 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts\") pod \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\" (UID: \"b6462dc5-3667-4b13-9c23-ba2fbd0e8965\") " Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.926801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6462dc5-3667-4b13-9c23-ba2fbd0e8965" (UID: "b6462dc5-3667-4b13-9c23-ba2fbd0e8965"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.927071 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:04 crc kubenswrapper[5030]: I0121 00:31:04.931392 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982" (OuterVolumeSpecName: "kube-api-access-q4982") pod "b6462dc5-3667-4b13-9c23-ba2fbd0e8965" (UID: "b6462dc5-3667-4b13-9c23-ba2fbd0e8965"). InnerVolumeSpecName "kube-api-access-q4982". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:31:05 crc kubenswrapper[5030]: I0121 00:31:05.028849 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4982\" (UniqueName: \"kubernetes.io/projected/b6462dc5-3667-4b13-9c23-ba2fbd0e8965-kube-api-access-q4982\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:05 crc kubenswrapper[5030]: I0121 00:31:05.514523 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" event={"ID":"b6462dc5-3667-4b13-9c23-ba2fbd0e8965","Type":"ContainerDied","Data":"bc767e1ac69681b2262560c2b57fd6319117338709779812ca4ffa13e4914893"} Jan 21 00:31:05 crc kubenswrapper[5030]: I0121 00:31:05.514568 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc767e1ac69681b2262560c2b57fd6319117338709779812ca4ffa13e4914893" Jan 21 00:31:05 crc kubenswrapper[5030]: I0121 00:31:05.514983 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.642945 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-2qsh9"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.651395 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.658516 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-2qsh9"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.670700 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.678475 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-b4e1-account-create-update-96cxm"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.691397 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystoneb4e1-account-delete-lkvxb"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.730537 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-kjzqb"] Jan 21 00:31:09 crc kubenswrapper[5030]: E0121 00:31:09.730925 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6462dc5-3667-4b13-9c23-ba2fbd0e8965" containerName="mariadb-account-delete" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.730950 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6462dc5-3667-4b13-9c23-ba2fbd0e8965" containerName="mariadb-account-delete" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.731102 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6462dc5-3667-4b13-9c23-ba2fbd0e8965" containerName="mariadb-account-delete" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.731763 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.743947 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-kjzqb"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.831411 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.832263 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.834262 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.847489 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c"] Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.909670 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dlgv\" (UniqueName: \"kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.910074 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.971478 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="166fed61-e938-4987-a036-2796621d548f" path="/var/lib/kubelet/pods/166fed61-e938-4987-a036-2796621d548f/volumes" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.972363 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6462dc5-3667-4b13-9c23-ba2fbd0e8965" path="/var/lib/kubelet/pods/b6462dc5-3667-4b13-9c23-ba2fbd0e8965/volumes" Jan 21 00:31:09 crc kubenswrapper[5030]: I0121 00:31:09.973015 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10a0df7-74d0-4a72-bc07-8bbe67876ceb" path="/var/lib/kubelet/pods/d10a0df7-74d0-4a72-bc07-8bbe67876ceb/volumes" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.011208 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s5lz\" (UniqueName: \"kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.011291 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.011368 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dlgv\" (UniqueName: \"kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.011424 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.012051 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.029599 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dlgv\" (UniqueName: \"kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv\") pod \"keystone-db-create-kjzqb\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.064840 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.114087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s5lz\" (UniqueName: \"kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.114390 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.116224 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.136805 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s5lz\" (UniqueName: \"kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz\") pod \"keystone-006f-account-create-update-7vh4c\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.150907 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.158139 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.158184 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.493262 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-kjzqb"] Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.547936 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" event={"ID":"77263086-8bb9-41a9-a195-618ac936a600","Type":"ContainerStarted","Data":"418ddab173da8ecdb09a521c1c4ede1616061bb02f93502527d2607407abad52"} Jan 21 00:31:10 crc kubenswrapper[5030]: I0121 00:31:10.641532 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c"] Jan 21 00:31:11 crc kubenswrapper[5030]: I0121 00:31:11.557573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" event={"ID":"ecf95579-ed70-4cb1-85e6-c06c65ad6225","Type":"ContainerStarted","Data":"e357e6e87971c8f44391bfe94b1f997081b94e3f0f5ae5b6100057a666f24a5b"} Jan 21 00:31:11 crc kubenswrapper[5030]: I0121 00:31:11.557637 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" event={"ID":"ecf95579-ed70-4cb1-85e6-c06c65ad6225","Type":"ContainerStarted","Data":"00196431952b38b5001d5dade12abdecb630fe58e75ec3aa92e8c44ffc243cff"} Jan 21 00:31:11 crc kubenswrapper[5030]: I0121 00:31:11.559558 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" event={"ID":"77263086-8bb9-41a9-a195-618ac936a600","Type":"ContainerStarted","Data":"dbc574f7c305deb7a35afe9abfffcd0e7f2e11f79f47afe93d06fcc812a1880b"} Jan 21 00:31:11 crc kubenswrapper[5030]: I0121 00:31:11.585865 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" podStartSLOduration=2.5858428460000003 podStartE2EDuration="2.585842846s" podCreationTimestamp="2026-01-21 00:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:31:11.580206262 +0000 UTC m=+6943.900466570" watchObservedRunningTime="2026-01-21 00:31:11.585842846 +0000 UTC m=+6943.906103174" Jan 21 00:31:12 crc kubenswrapper[5030]: I0121 00:31:12.569552 5030 generic.go:334] "Generic (PLEG): container finished" podID="77263086-8bb9-41a9-a195-618ac936a600" containerID="dbc574f7c305deb7a35afe9abfffcd0e7f2e11f79f47afe93d06fcc812a1880b" exitCode=0 Jan 21 00:31:12 crc kubenswrapper[5030]: I0121 00:31:12.569680 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" 
event={"ID":"77263086-8bb9-41a9-a195-618ac936a600","Type":"ContainerDied","Data":"dbc574f7c305deb7a35afe9abfffcd0e7f2e11f79f47afe93d06fcc812a1880b"} Jan 21 00:31:12 crc kubenswrapper[5030]: I0121 00:31:12.572131 5030 generic.go:334] "Generic (PLEG): container finished" podID="ecf95579-ed70-4cb1-85e6-c06c65ad6225" containerID="e357e6e87971c8f44391bfe94b1f997081b94e3f0f5ae5b6100057a666f24a5b" exitCode=0 Jan 21 00:31:12 crc kubenswrapper[5030]: I0121 00:31:12.572194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" event={"ID":"ecf95579-ed70-4cb1-85e6-c06c65ad6225","Type":"ContainerDied","Data":"e357e6e87971c8f44391bfe94b1f997081b94e3f0f5ae5b6100057a666f24a5b"} Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.952786 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.960959 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.977921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts\") pod \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.978168 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s5lz\" (UniqueName: \"kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz\") pod \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\" (UID: \"ecf95579-ed70-4cb1-85e6-c06c65ad6225\") " Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.978258 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts\") pod \"77263086-8bb9-41a9-a195-618ac936a600\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.978397 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dlgv\" (UniqueName: \"kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv\") pod \"77263086-8bb9-41a9-a195-618ac936a600\" (UID: \"77263086-8bb9-41a9-a195-618ac936a600\") " Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.981035 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecf95579-ed70-4cb1-85e6-c06c65ad6225" (UID: "ecf95579-ed70-4cb1-85e6-c06c65ad6225"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.982172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77263086-8bb9-41a9-a195-618ac936a600" (UID: "77263086-8bb9-41a9-a195-618ac936a600"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.990351 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz" (OuterVolumeSpecName: "kube-api-access-7s5lz") pod "ecf95579-ed70-4cb1-85e6-c06c65ad6225" (UID: "ecf95579-ed70-4cb1-85e6-c06c65ad6225"). InnerVolumeSpecName "kube-api-access-7s5lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:31:13 crc kubenswrapper[5030]: I0121 00:31:13.990820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv" (OuterVolumeSpecName: "kube-api-access-9dlgv") pod "77263086-8bb9-41a9-a195-618ac936a600" (UID: "77263086-8bb9-41a9-a195-618ac936a600"). InnerVolumeSpecName "kube-api-access-9dlgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.079608 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecf95579-ed70-4cb1-85e6-c06c65ad6225-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.079948 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s5lz\" (UniqueName: \"kubernetes.io/projected/ecf95579-ed70-4cb1-85e6-c06c65ad6225-kube-api-access-7s5lz\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.079965 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77263086-8bb9-41a9-a195-618ac936a600-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.079978 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dlgv\" (UniqueName: \"kubernetes.io/projected/77263086-8bb9-41a9-a195-618ac936a600-kube-api-access-9dlgv\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.595513 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.595496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-kjzqb" event={"ID":"77263086-8bb9-41a9-a195-618ac936a600","Type":"ContainerDied","Data":"418ddab173da8ecdb09a521c1c4ede1616061bb02f93502527d2607407abad52"} Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.595789 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="418ddab173da8ecdb09a521c1c4ede1616061bb02f93502527d2607407abad52" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.597175 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" event={"ID":"ecf95579-ed70-4cb1-85e6-c06c65ad6225","Type":"ContainerDied","Data":"00196431952b38b5001d5dade12abdecb630fe58e75ec3aa92e8c44ffc243cff"} Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.597209 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00196431952b38b5001d5dade12abdecb630fe58e75ec3aa92e8c44ffc243cff" Jan 21 00:31:14 crc kubenswrapper[5030]: I0121 00:31:14.597275 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.399732 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-mw5mh"] Jan 21 00:31:20 crc kubenswrapper[5030]: E0121 00:31:20.400495 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf95579-ed70-4cb1-85e6-c06c65ad6225" containerName="mariadb-account-create-update" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.400507 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf95579-ed70-4cb1-85e6-c06c65ad6225" containerName="mariadb-account-create-update" Jan 21 00:31:20 crc kubenswrapper[5030]: E0121 00:31:20.400517 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77263086-8bb9-41a9-a195-618ac936a600" containerName="mariadb-database-create" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.400523 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="77263086-8bb9-41a9-a195-618ac936a600" containerName="mariadb-database-create" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.400651 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf95579-ed70-4cb1-85e6-c06c65ad6225" containerName="mariadb-account-create-update" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.400669 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="77263086-8bb9-41a9-a195-618ac936a600" containerName="mariadb-database-create" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.401234 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.411012 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.411154 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.411730 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.411997 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.412232 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hq5np" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.420055 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-mw5mh"] Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.572378 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.572540 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " 
pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.572592 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtpq\" (UniqueName: \"kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.673977 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.674041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtpq\" (UniqueName: \"kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.674087 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.679959 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.680037 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.691530 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtpq\" (UniqueName: \"kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq\") pod \"keystone-db-sync-mw5mh\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:20 crc kubenswrapper[5030]: I0121 00:31:20.721167 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:21 crc kubenswrapper[5030]: I0121 00:31:21.131173 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-mw5mh"] Jan 21 00:31:21 crc kubenswrapper[5030]: I0121 00:31:21.649676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" event={"ID":"8c2b8621-d619-40fd-a45e-bd7be6d7662b","Type":"ContainerStarted","Data":"f1cdf9e598e9b521e87c899799e6cc70e3ae1b33cefb9c0395e2c70c15546272"} Jan 21 00:31:21 crc kubenswrapper[5030]: I0121 00:31:21.649930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" event={"ID":"8c2b8621-d619-40fd-a45e-bd7be6d7662b","Type":"ContainerStarted","Data":"6e196c82d3df483ef11df4611e918b09335ba1811ad56e098a67b1fee1b9de5d"} Jan 21 00:31:21 crc kubenswrapper[5030]: I0121 00:31:21.662653 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" podStartSLOduration=1.6626128690000002 podStartE2EDuration="1.662612869s" podCreationTimestamp="2026-01-21 00:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:31:21.66224792 +0000 UTC m=+6953.982508208" watchObservedRunningTime="2026-01-21 00:31:21.662612869 +0000 UTC m=+6953.982873147" Jan 21 00:31:23 crc kubenswrapper[5030]: I0121 00:31:23.666800 5030 generic.go:334] "Generic (PLEG): container finished" podID="8c2b8621-d619-40fd-a45e-bd7be6d7662b" containerID="f1cdf9e598e9b521e87c899799e6cc70e3ae1b33cefb9c0395e2c70c15546272" exitCode=0 Jan 21 00:31:23 crc kubenswrapper[5030]: I0121 00:31:23.666883 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" event={"ID":"8c2b8621-d619-40fd-a45e-bd7be6d7662b","Type":"ContainerDied","Data":"f1cdf9e598e9b521e87c899799e6cc70e3ae1b33cefb9c0395e2c70c15546272"} Jan 21 00:31:24 crc kubenswrapper[5030]: I0121 00:31:24.953315 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.136053 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data\") pod \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.136234 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxtpq\" (UniqueName: \"kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq\") pod \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.136266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle\") pod \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\" (UID: \"8c2b8621-d619-40fd-a45e-bd7be6d7662b\") " Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.145481 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq" (OuterVolumeSpecName: "kube-api-access-pxtpq") pod "8c2b8621-d619-40fd-a45e-bd7be6d7662b" (UID: "8c2b8621-d619-40fd-a45e-bd7be6d7662b"). InnerVolumeSpecName "kube-api-access-pxtpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.160417 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c2b8621-d619-40fd-a45e-bd7be6d7662b" (UID: "8c2b8621-d619-40fd-a45e-bd7be6d7662b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.173217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data" (OuterVolumeSpecName: "config-data") pod "8c2b8621-d619-40fd-a45e-bd7be6d7662b" (UID: "8c2b8621-d619-40fd-a45e-bd7be6d7662b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.238553 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxtpq\" (UniqueName: \"kubernetes.io/projected/8c2b8621-d619-40fd-a45e-bd7be6d7662b-kube-api-access-pxtpq\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.238595 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.238607 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c2b8621-d619-40fd-a45e-bd7be6d7662b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.681906 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" event={"ID":"8c2b8621-d619-40fd-a45e-bd7be6d7662b","Type":"ContainerDied","Data":"6e196c82d3df483ef11df4611e918b09335ba1811ad56e098a67b1fee1b9de5d"} Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.681955 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e196c82d3df483ef11df4611e918b09335ba1811ad56e098a67b1fee1b9de5d" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.682008 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-mw5mh" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.861699 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tsv7n"] Jan 21 00:31:25 crc kubenswrapper[5030]: E0121 00:31:25.862038 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c2b8621-d619-40fd-a45e-bd7be6d7662b" containerName="keystone-db-sync" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.862053 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2b8621-d619-40fd-a45e-bd7be6d7662b" containerName="keystone-db-sync" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.862210 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c2b8621-d619-40fd-a45e-bd7be6d7662b" containerName="keystone-db-sync" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.862730 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.865551 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.865761 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.865984 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.866346 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.866607 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hq5np" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.870195 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.879575 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tsv7n"] Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.948652 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9mc\" (UniqueName: \"kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.948709 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.948877 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.948934 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.949144 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:25 crc kubenswrapper[5030]: I0121 00:31:25.949188 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050474 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050534 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050576 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9mc\" (UniqueName: \"kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050600 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.050692 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.055103 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.055347 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.055351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts\") pod 
\"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.055551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.056447 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.068410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9mc\" (UniqueName: \"kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc\") pod \"keystone-bootstrap-tsv7n\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.188789 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.627600 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tsv7n"] Jan 21 00:31:26 crc kubenswrapper[5030]: I0121 00:31:26.696367 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" event={"ID":"a4147780-1399-4ab8-995e-5ec0de039cf6","Type":"ContainerStarted","Data":"631f0671210f918ef99baf1a58f302458139271a73f0576bbbf4c5195cee88ab"} Jan 21 00:31:27 crc kubenswrapper[5030]: I0121 00:31:27.706737 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" event={"ID":"a4147780-1399-4ab8-995e-5ec0de039cf6","Type":"ContainerStarted","Data":"aa65ac67de2df03c7176e533a33ff638fc1de2375779abda5ebf53e2759f1291"} Jan 21 00:31:27 crc kubenswrapper[5030]: I0121 00:31:27.742284 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" podStartSLOduration=2.742265738 podStartE2EDuration="2.742265738s" podCreationTimestamp="2026-01-21 00:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:31:27.726392718 +0000 UTC m=+6960.046653026" watchObservedRunningTime="2026-01-21 00:31:27.742265738 +0000 UTC m=+6960.062526026" Jan 21 00:31:30 crc kubenswrapper[5030]: I0121 00:31:30.732271 5030 generic.go:334] "Generic (PLEG): container finished" podID="a4147780-1399-4ab8-995e-5ec0de039cf6" containerID="aa65ac67de2df03c7176e533a33ff638fc1de2375779abda5ebf53e2759f1291" exitCode=0 Jan 21 00:31:30 crc kubenswrapper[5030]: I0121 00:31:30.732342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" event={"ID":"a4147780-1399-4ab8-995e-5ec0de039cf6","Type":"ContainerDied","Data":"aa65ac67de2df03c7176e533a33ff638fc1de2375779abda5ebf53e2759f1291"} Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.059347 5030 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255030 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255081 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255118 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns9mc\" (UniqueName: \"kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255196 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.255383 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data\") pod \"a4147780-1399-4ab8-995e-5ec0de039cf6\" (UID: \"a4147780-1399-4ab8-995e-5ec0de039cf6\") " Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.262850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.263793 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc" (OuterVolumeSpecName: "kube-api-access-ns9mc") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "kube-api-access-ns9mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.264894 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.265301 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts" (OuterVolumeSpecName: "scripts") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.280374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data" (OuterVolumeSpecName: "config-data") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.281100 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4147780-1399-4ab8-995e-5ec0de039cf6" (UID: "a4147780-1399-4ab8-995e-5ec0de039cf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357374 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357416 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357426 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357443 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns9mc\" (UniqueName: \"kubernetes.io/projected/a4147780-1399-4ab8-995e-5ec0de039cf6-kube-api-access-ns9mc\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357453 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.357464 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4147780-1399-4ab8-995e-5ec0de039cf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.747423 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" event={"ID":"a4147780-1399-4ab8-995e-5ec0de039cf6","Type":"ContainerDied","Data":"631f0671210f918ef99baf1a58f302458139271a73f0576bbbf4c5195cee88ab"} Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.747824 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="631f0671210f918ef99baf1a58f302458139271a73f0576bbbf4c5195cee88ab" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.747590 5030 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tsv7n" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.825209 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:31:32 crc kubenswrapper[5030]: E0121 00:31:32.825574 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4147780-1399-4ab8-995e-5ec0de039cf6" containerName="keystone-bootstrap" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.825601 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4147780-1399-4ab8-995e-5ec0de039cf6" containerName="keystone-bootstrap" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.825798 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4147780-1399-4ab8-995e-5ec0de039cf6" containerName="keystone-bootstrap" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.826380 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.828885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.828958 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.828986 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.828986 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.829122 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-hq5np" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.829321 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.830195 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.838210 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.970782 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.970875 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.970945 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.970991 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthpm\" (UniqueName: \"kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.971070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.971142 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.971173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:32 crc kubenswrapper[5030]: I0121 00:31:32.971197 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.072089 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.072182 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthpm\" (UniqueName: \"kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.072253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.072990 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.073019 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.073061 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.073093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.073881 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.076869 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.076902 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.077080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.078093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.078897 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.093183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.093837 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthpm\" (UniqueName: \"kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.094302 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys\") pod \"keystone-7bc979fd67-nvdp6\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.144561 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.569540 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:31:33 crc kubenswrapper[5030]: I0121 00:31:33.755249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" event={"ID":"4960b9db-6ab7-4520-8e12-ba262877c7f4","Type":"ContainerStarted","Data":"16dbdc7267fe446f5bd352f5b6192df086aeae173e4e24ea59f41b1860a134f1"} Jan 21 00:31:34 crc kubenswrapper[5030]: I0121 00:31:34.763428 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" event={"ID":"4960b9db-6ab7-4520-8e12-ba262877c7f4","Type":"ContainerStarted","Data":"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9"} Jan 21 00:31:34 crc kubenswrapper[5030]: I0121 00:31:34.763825 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:31:34 crc kubenswrapper[5030]: I0121 00:31:34.783964 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" podStartSLOduration=2.783945654 podStartE2EDuration="2.783945654s" podCreationTimestamp="2026-01-21 00:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:31:34.781212239 +0000 UTC m=+6967.101472537" watchObservedRunningTime="2026-01-21 00:31:34.783945654 +0000 UTC m=+6967.104205942" Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.157533 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:31:40 
crc kubenswrapper[5030]: I0121 00:31:40.158081 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.158127 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.158821 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.158890 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44" gracePeriod=600 Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.807134 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44" exitCode=0 Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.807208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44"} Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.807504 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89"} Jan 21 00:31:40 crc kubenswrapper[5030]: I0121 00:31:40.807534 5030 scope.go:117] "RemoveContainer" containerID="727e5a25d6744d8b056f26bd2ee308149d3a4acb0f0dde0f1d0053870c73e575" Jan 21 00:32:04 crc kubenswrapper[5030]: I0121 00:32:04.558955 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.431142 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tsv7n"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.442279 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-mw5mh"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.449064 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tsv7n"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.456313 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-mw5mh"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.464057 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.464409 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" podUID="4960b9db-6ab7-4520-8e12-ba262877c7f4" containerName="keystone-api" containerID="cri-o://3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9" gracePeriod=30 Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.494679 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone006f-account-delete-2h855"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.495853 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.506322 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone006f-account-delete-2h855"] Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.564108 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.564234 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgc6\" (UniqueName: \"kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.665488 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgc6\" (UniqueName: \"kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.665599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.666922 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.685062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgc6\" (UniqueName: \"kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6\") pod \"keystone006f-account-delete-2h855\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc 
kubenswrapper[5030]: I0121 00:32:05.821971 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.979039 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2b8621-d619-40fd-a45e-bd7be6d7662b" path="/var/lib/kubelet/pods/8c2b8621-d619-40fd-a45e-bd7be6d7662b/volumes" Jan 21 00:32:05 crc kubenswrapper[5030]: I0121 00:32:05.980121 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4147780-1399-4ab8-995e-5ec0de039cf6" path="/var/lib/kubelet/pods/a4147780-1399-4ab8-995e-5ec0de039cf6/volumes" Jan 21 00:32:06 crc kubenswrapper[5030]: I0121 00:32:06.293669 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone006f-account-delete-2h855"] Jan 21 00:32:07 crc kubenswrapper[5030]: I0121 00:32:07.000483 5030 generic.go:334] "Generic (PLEG): container finished" podID="b42005fb-725f-486e-a453-3fd6d9852909" containerID="56577b69e355f56132268618b6805e8f83d49f38733f447f7a860aa109364f60" exitCode=0 Jan 21 00:32:07 crc kubenswrapper[5030]: I0121 00:32:07.000573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" event={"ID":"b42005fb-725f-486e-a453-3fd6d9852909","Type":"ContainerDied","Data":"56577b69e355f56132268618b6805e8f83d49f38733f447f7a860aa109364f60"} Jan 21 00:32:07 crc kubenswrapper[5030]: I0121 00:32:07.001041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" event={"ID":"b42005fb-725f-486e-a453-3fd6d9852909","Type":"ContainerStarted","Data":"318a80f03f3e6336f6150f9c3776d083989a603b6049a6af6fc2d198a8b84207"} Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.370549 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.407663 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgc6\" (UniqueName: \"kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6\") pod \"b42005fb-725f-486e-a453-3fd6d9852909\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.407883 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts\") pod \"b42005fb-725f-486e-a453-3fd6d9852909\" (UID: \"b42005fb-725f-486e-a453-3fd6d9852909\") " Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.409163 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b42005fb-725f-486e-a453-3fd6d9852909" (UID: "b42005fb-725f-486e-a453-3fd6d9852909"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.413328 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6" (OuterVolumeSpecName: "kube-api-access-wfgc6") pod "b42005fb-725f-486e-a453-3fd6d9852909" (UID: "b42005fb-725f-486e-a453-3fd6d9852909"). 
InnerVolumeSpecName "kube-api-access-wfgc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.510393 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgc6\" (UniqueName: \"kubernetes.io/projected/b42005fb-725f-486e-a453-3fd6d9852909-kube-api-access-wfgc6\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.510444 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b42005fb-725f-486e-a453-3fd6d9852909-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:08 crc kubenswrapper[5030]: I0121 00:32:08.999815 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.020856 5030 generic.go:334] "Generic (PLEG): container finished" podID="4960b9db-6ab7-4520-8e12-ba262877c7f4" containerID="3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9" exitCode=0 Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.020918 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.021011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" event={"ID":"4960b9db-6ab7-4520-8e12-ba262877c7f4","Type":"ContainerDied","Data":"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9"} Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.021081 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7bc979fd67-nvdp6" event={"ID":"4960b9db-6ab7-4520-8e12-ba262877c7f4","Type":"ContainerDied","Data":"16dbdc7267fe446f5bd352f5b6192df086aeae173e4e24ea59f41b1860a134f1"} Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.021104 5030 scope.go:117] "RemoveContainer" containerID="3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.023052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" event={"ID":"b42005fb-725f-486e-a453-3fd6d9852909","Type":"ContainerDied","Data":"318a80f03f3e6336f6150f9c3776d083989a603b6049a6af6fc2d198a8b84207"} Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.023100 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="318a80f03f3e6336f6150f9c3776d083989a603b6049a6af6fc2d198a8b84207" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.023163 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone006f-account-delete-2h855" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.048629 5030 scope.go:117] "RemoveContainer" containerID="3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9" Jan 21 00:32:09 crc kubenswrapper[5030]: E0121 00:32:09.049134 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9\": container with ID starting with 3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9 not found: ID does not exist" containerID="3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.049158 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9"} err="failed to get container status \"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9\": rpc error: code = NotFound desc = could not find container \"3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9\": container with ID starting with 3017f07e2b6b01902e12aeeab22e856cfc2d6f5d6846aec6a9107dfb4841baa9 not found: ID does not exist" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119287 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119311 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119336 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119377 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119417 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.119473 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sthpm\" (UniqueName: \"kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm\") pod \"4960b9db-6ab7-4520-8e12-ba262877c7f4\" (UID: \"4960b9db-6ab7-4520-8e12-ba262877c7f4\") " Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.123569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts" (OuterVolumeSpecName: "scripts") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.123629 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.124646 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.126706 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm" (OuterVolumeSpecName: "kube-api-access-sthpm") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "kube-api-access-sthpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.137865 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data" (OuterVolumeSpecName: "config-data") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.138731 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.153460 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.154941 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4960b9db-6ab7-4520-8e12-ba262877c7f4" (UID: "4960b9db-6ab7-4520-8e12-ba262877c7f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221499 5030 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221536 5030 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221546 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221558 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221569 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221582 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sthpm\" (UniqueName: \"kubernetes.io/projected/4960b9db-6ab7-4520-8e12-ba262877c7f4-kube-api-access-sthpm\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221592 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.221601 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4960b9db-6ab7-4520-8e12-ba262877c7f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.382755 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.385566 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7bc979fd67-nvdp6"] Jan 21 00:32:09 crc kubenswrapper[5030]: I0121 00:32:09.972594 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4960b9db-6ab7-4520-8e12-ba262877c7f4" path="/var/lib/kubelet/pods/4960b9db-6ab7-4520-8e12-ba262877c7f4/volumes" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.509556 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-kjzqb"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.518261 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-kjzqb"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.525529 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.532284 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone006f-account-delete-2h855"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.539954 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-006f-account-create-update-7vh4c"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.547263 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone006f-account-delete-2h855"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.593430 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-fhpvh"] Jan 21 00:32:10 crc kubenswrapper[5030]: E0121 00:32:10.593980 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42005fb-725f-486e-a453-3fd6d9852909" containerName="mariadb-account-delete" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.594098 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42005fb-725f-486e-a453-3fd6d9852909" containerName="mariadb-account-delete" Jan 21 00:32:10 crc kubenswrapper[5030]: E0121 00:32:10.594180 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4960b9db-6ab7-4520-8e12-ba262877c7f4" containerName="keystone-api" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.594234 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4960b9db-6ab7-4520-8e12-ba262877c7f4" containerName="keystone-api" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.594422 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42005fb-725f-486e-a453-3fd6d9852909" containerName="mariadb-account-delete" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.594496 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4960b9db-6ab7-4520-8e12-ba262877c7f4" containerName="keystone-api" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.595208 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.614415 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-fhpvh"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.641312 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.641495 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fhl\" (UniqueName: \"kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.701130 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.702037 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.704239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.708430 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg"] Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.742674 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fhl\" (UniqueName: \"kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.742743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8w2\" (UniqueName: \"kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.742833 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.743022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" 
Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.743863 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.759495 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fhl\" (UniqueName: \"kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl\") pod \"keystone-db-create-fhpvh\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.844913 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.845230 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8w2\" (UniqueName: \"kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.846126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.861923 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8w2\" (UniqueName: \"kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2\") pod \"keystone-b4d4-account-create-update-jv9qg\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:10 crc kubenswrapper[5030]: I0121 00:32:10.959665 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.014998 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.389588 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-fhpvh"] Jan 21 00:32:11 crc kubenswrapper[5030]: W0121 00:32:11.392007 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86b2905d_8c8f_48c3_92e7_23405af95b06.slice/crio-981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657 WatchSource:0}: Error finding container 981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657: Status 404 returned error can't find the container with id 981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657 Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.472037 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg"] Jan 21 00:32:11 crc kubenswrapper[5030]: W0121 00:32:11.490909 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4df90c8_bc7b_4e92_9f0d_347a800b9c6f.slice/crio-38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e WatchSource:0}: Error finding container 38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e: Status 404 returned error can't find the container with id 38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.970237 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77263086-8bb9-41a9-a195-618ac936a600" path="/var/lib/kubelet/pods/77263086-8bb9-41a9-a195-618ac936a600/volumes" Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.971663 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42005fb-725f-486e-a453-3fd6d9852909" path="/var/lib/kubelet/pods/b42005fb-725f-486e-a453-3fd6d9852909/volumes" Jan 21 00:32:11 crc kubenswrapper[5030]: I0121 00:32:11.973664 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf95579-ed70-4cb1-85e6-c06c65ad6225" path="/var/lib/kubelet/pods/ecf95579-ed70-4cb1-85e6-c06c65ad6225/volumes" Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.049139 5030 generic.go:334] "Generic (PLEG): container finished" podID="86b2905d-8c8f-48c3-92e7-23405af95b06" containerID="7b522ee6f1171a2b131795b927d6459049aca19d599238d12de629b848864714" exitCode=0 Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.049205 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" event={"ID":"86b2905d-8c8f-48c3-92e7-23405af95b06","Type":"ContainerDied","Data":"7b522ee6f1171a2b131795b927d6459049aca19d599238d12de629b848864714"} Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.049247 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" event={"ID":"86b2905d-8c8f-48c3-92e7-23405af95b06","Type":"ContainerStarted","Data":"981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657"} Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.052123 5030 generic.go:334] "Generic (PLEG): container finished" podID="e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" containerID="5d53470662d918c0bf1c6a06d565912c08a2ea67b746add59032ed20a4f45c82" exitCode=0 Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.052194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" event={"ID":"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f","Type":"ContainerDied","Data":"5d53470662d918c0bf1c6a06d565912c08a2ea67b746add59032ed20a4f45c82"} Jan 21 00:32:12 crc kubenswrapper[5030]: I0121 00:32:12.052234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" event={"ID":"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f","Type":"ContainerStarted","Data":"38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e"} Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.444490 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.450067 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.484448 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2fhl\" (UniqueName: \"kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl\") pod \"86b2905d-8c8f-48c3-92e7-23405af95b06\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.484494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts\") pod \"86b2905d-8c8f-48c3-92e7-23405af95b06\" (UID: \"86b2905d-8c8f-48c3-92e7-23405af95b06\") " Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.484559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp8w2\" (UniqueName: \"kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2\") pod \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.484681 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts\") pod \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\" (UID: \"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f\") " Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.485533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" (UID: "e4df90c8-bc7b-4e92-9f0d-347a800b9c6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.485678 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86b2905d-8c8f-48c3-92e7-23405af95b06" (UID: "86b2905d-8c8f-48c3-92e7-23405af95b06"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.490722 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2" (OuterVolumeSpecName: "kube-api-access-zp8w2") pod "e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" (UID: "e4df90c8-bc7b-4e92-9f0d-347a800b9c6f"). InnerVolumeSpecName "kube-api-access-zp8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.492051 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl" (OuterVolumeSpecName: "kube-api-access-f2fhl") pod "86b2905d-8c8f-48c3-92e7-23405af95b06" (UID: "86b2905d-8c8f-48c3-92e7-23405af95b06"). InnerVolumeSpecName "kube-api-access-f2fhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.586183 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp8w2\" (UniqueName: \"kubernetes.io/projected/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-kube-api-access-zp8w2\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.586222 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.586233 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2fhl\" (UniqueName: \"kubernetes.io/projected/86b2905d-8c8f-48c3-92e7-23405af95b06-kube-api-access-f2fhl\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:13 crc kubenswrapper[5030]: I0121 00:32:13.586242 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86b2905d-8c8f-48c3-92e7-23405af95b06-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.065459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" event={"ID":"86b2905d-8c8f-48c3-92e7-23405af95b06","Type":"ContainerDied","Data":"981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657"} Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.065501 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981b7c5c4e39bb59c59def46c7e8bc664e59d7db77551b9218815dd5cbc45657" Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.065554 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-fhpvh" Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.067097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" event={"ID":"e4df90c8-bc7b-4e92-9f0d-347a800b9c6f","Type":"ContainerDied","Data":"38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e"} Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.067137 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38dd46e4f5be251a15c8f92f996806aba5fb8d548fb6882445edf58237073f0e" Jan 21 00:32:14 crc kubenswrapper[5030]: I0121 00:32:14.067166 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.249885 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-p59rb"] Jan 21 00:32:16 crc kubenswrapper[5030]: E0121 00:32:16.250343 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" containerName="mariadb-account-create-update" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.250354 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" containerName="mariadb-account-create-update" Jan 21 00:32:16 crc kubenswrapper[5030]: E0121 00:32:16.250378 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b2905d-8c8f-48c3-92e7-23405af95b06" containerName="mariadb-database-create" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.250384 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b2905d-8c8f-48c3-92e7-23405af95b06" containerName="mariadb-database-create" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.250573 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b2905d-8c8f-48c3-92e7-23405af95b06" containerName="mariadb-database-create" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.250590 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" containerName="mariadb-account-create-update" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.251053 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.252770 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.253380 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.253621 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.253714 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-8bxw8" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.261309 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-p59rb"] Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.329966 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.330025 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlzzv\" (UniqueName: \"kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.431160 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.431233 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlzzv\" (UniqueName: \"kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.437044 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.451860 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlzzv\" (UniqueName: \"kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv\") pod \"keystone-db-sync-p59rb\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.581332 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:16 crc kubenswrapper[5030]: I0121 00:32:16.802595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-p59rb"] Jan 21 00:32:17 crc kubenswrapper[5030]: I0121 00:32:17.090024 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" event={"ID":"2cb5aea1-626e-4cd5-90f6-28c875d93c38","Type":"ContainerStarted","Data":"710255d5b4b972d99e63e2fd70e88a8f0cae991538c4d3ffb0e60f44134d3678"} Jan 21 00:32:17 crc kubenswrapper[5030]: I0121 00:32:17.090398 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" event={"ID":"2cb5aea1-626e-4cd5-90f6-28c875d93c38","Type":"ContainerStarted","Data":"2f08ea43c09412ac1fd1977e27e02ae16cca256f4dd52f392a76cbad03e15d56"} Jan 21 00:32:17 crc kubenswrapper[5030]: I0121 00:32:17.108288 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" podStartSLOduration=1.108266766 podStartE2EDuration="1.108266766s" podCreationTimestamp="2026-01-21 00:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:32:17.103107434 +0000 UTC m=+7009.423367742" watchObservedRunningTime="2026-01-21 00:32:17.108266766 +0000 UTC m=+7009.428527054" Jan 21 00:32:19 crc kubenswrapper[5030]: I0121 00:32:19.142555 5030 generic.go:334] "Generic (PLEG): container finished" podID="2cb5aea1-626e-4cd5-90f6-28c875d93c38" containerID="710255d5b4b972d99e63e2fd70e88a8f0cae991538c4d3ffb0e60f44134d3678" exitCode=0 Jan 21 00:32:19 crc kubenswrapper[5030]: I0121 00:32:19.142706 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" event={"ID":"2cb5aea1-626e-4cd5-90f6-28c875d93c38","Type":"ContainerDied","Data":"710255d5b4b972d99e63e2fd70e88a8f0cae991538c4d3ffb0e60f44134d3678"} Jan 21 
00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.429946 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.594327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data\") pod \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.594412 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlzzv\" (UniqueName: \"kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv\") pod \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\" (UID: \"2cb5aea1-626e-4cd5-90f6-28c875d93c38\") " Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.599479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv" (OuterVolumeSpecName: "kube-api-access-nlzzv") pod "2cb5aea1-626e-4cd5-90f6-28c875d93c38" (UID: "2cb5aea1-626e-4cd5-90f6-28c875d93c38"). InnerVolumeSpecName "kube-api-access-nlzzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.641020 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data" (OuterVolumeSpecName: "config-data") pod "2cb5aea1-626e-4cd5-90f6-28c875d93c38" (UID: "2cb5aea1-626e-4cd5-90f6-28c875d93c38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.696492 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb5aea1-626e-4cd5-90f6-28c875d93c38-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:20 crc kubenswrapper[5030]: I0121 00:32:20.696529 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlzzv\" (UniqueName: \"kubernetes.io/projected/2cb5aea1-626e-4cd5-90f6-28c875d93c38-kube-api-access-nlzzv\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.159657 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" event={"ID":"2cb5aea1-626e-4cd5-90f6-28c875d93c38","Type":"ContainerDied","Data":"2f08ea43c09412ac1fd1977e27e02ae16cca256f4dd52f392a76cbad03e15d56"} Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.159705 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-p59rb" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.159709 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f08ea43c09412ac1fd1977e27e02ae16cca256f4dd52f392a76cbad03e15d56" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.330577 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-fk9p7"] Jan 21 00:32:21 crc kubenswrapper[5030]: E0121 00:32:21.330958 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb5aea1-626e-4cd5-90f6-28c875d93c38" containerName="keystone-db-sync" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.330980 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb5aea1-626e-4cd5-90f6-28c875d93c38" containerName="keystone-db-sync" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.331179 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb5aea1-626e-4cd5-90f6-28c875d93c38" containerName="keystone-db-sync" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.331877 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.337394 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.337509 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.337581 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-8bxw8" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.337415 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.350791 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.358661 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-fk9p7"] Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.516956 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.517249 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ngl4\" (UniqueName: \"kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.517289 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: 
I0121 00:32:21.517317 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.517355 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.619396 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.619530 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.619581 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.619670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ngl4\" (UniqueName: \"kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.619727 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.623928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.624156 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.627043 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.627404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.642999 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ngl4\" (UniqueName: \"kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4\") pod \"keystone-bootstrap-fk9p7\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:21 crc kubenswrapper[5030]: I0121 00:32:21.660948 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:22 crc kubenswrapper[5030]: I0121 00:32:22.102270 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-fk9p7"] Jan 21 00:32:22 crc kubenswrapper[5030]: I0121 00:32:22.168107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" event={"ID":"bcf29296-38a7-41e1-a930-3c39db057d03","Type":"ContainerStarted","Data":"08ccfb79b499720cf1b15cce50ced7d41fb601c79cf7968ae6162eae6c663f2b"} Jan 21 00:32:23 crc kubenswrapper[5030]: I0121 00:32:23.178454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" event={"ID":"bcf29296-38a7-41e1-a930-3c39db057d03","Type":"ContainerStarted","Data":"81a0f88de49ccf3e6b4b4162003f10bea3472fdea17c6bba60ef9a0938200944"} Jan 21 00:32:23 crc kubenswrapper[5030]: I0121 00:32:23.202012 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" podStartSLOduration=2.201993312 podStartE2EDuration="2.201993312s" podCreationTimestamp="2026-01-21 00:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:32:23.199300037 +0000 UTC m=+7015.519560365" watchObservedRunningTime="2026-01-21 00:32:23.201993312 +0000 UTC m=+7015.522253600" Jan 21 00:32:26 crc kubenswrapper[5030]: I0121 00:32:26.203429 5030 generic.go:334] "Generic (PLEG): container finished" podID="bcf29296-38a7-41e1-a930-3c39db057d03" containerID="81a0f88de49ccf3e6b4b4162003f10bea3472fdea17c6bba60ef9a0938200944" exitCode=0 Jan 21 00:32:26 crc kubenswrapper[5030]: I0121 00:32:26.203528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" event={"ID":"bcf29296-38a7-41e1-a930-3c39db057d03","Type":"ContainerDied","Data":"81a0f88de49ccf3e6b4b4162003f10bea3472fdea17c6bba60ef9a0938200944"} Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.506491 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.622859 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts\") pod \"bcf29296-38a7-41e1-a930-3c39db057d03\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.622939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data\") pod \"bcf29296-38a7-41e1-a930-3c39db057d03\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.623011 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ngl4\" (UniqueName: \"kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4\") pod \"bcf29296-38a7-41e1-a930-3c39db057d03\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.623084 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys\") pod \"bcf29296-38a7-41e1-a930-3c39db057d03\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.623199 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys\") pod \"bcf29296-38a7-41e1-a930-3c39db057d03\" (UID: \"bcf29296-38a7-41e1-a930-3c39db057d03\") " Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.628967 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bcf29296-38a7-41e1-a930-3c39db057d03" (UID: "bcf29296-38a7-41e1-a930-3c39db057d03"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.629542 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts" (OuterVolumeSpecName: "scripts") pod "bcf29296-38a7-41e1-a930-3c39db057d03" (UID: "bcf29296-38a7-41e1-a930-3c39db057d03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.629576 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bcf29296-38a7-41e1-a930-3c39db057d03" (UID: "bcf29296-38a7-41e1-a930-3c39db057d03"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.636057 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4" (OuterVolumeSpecName: "kube-api-access-4ngl4") pod "bcf29296-38a7-41e1-a930-3c39db057d03" (UID: "bcf29296-38a7-41e1-a930-3c39db057d03"). InnerVolumeSpecName "kube-api-access-4ngl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.643362 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data" (OuterVolumeSpecName: "config-data") pod "bcf29296-38a7-41e1-a930-3c39db057d03" (UID: "bcf29296-38a7-41e1-a930-3c39db057d03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.724866 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.724907 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.724922 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ngl4\" (UniqueName: \"kubernetes.io/projected/bcf29296-38a7-41e1-a930-3c39db057d03-kube-api-access-4ngl4\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.724933 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:27 crc kubenswrapper[5030]: I0121 00:32:27.724942 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcf29296-38a7-41e1-a930-3c39db057d03-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.224525 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" event={"ID":"bcf29296-38a7-41e1-a930-3c39db057d03","Type":"ContainerDied","Data":"08ccfb79b499720cf1b15cce50ced7d41fb601c79cf7968ae6162eae6c663f2b"} Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.224571 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ccfb79b499720cf1b15cce50ced7d41fb601c79cf7968ae6162eae6c663f2b" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.224586 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-fk9p7" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.303112 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:32:28 crc kubenswrapper[5030]: E0121 00:32:28.303583 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf29296-38a7-41e1-a930-3c39db057d03" containerName="keystone-bootstrap" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.303616 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf29296-38a7-41e1-a930-3c39db057d03" containerName="keystone-bootstrap" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.303888 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf29296-38a7-41e1-a930-3c39db057d03" containerName="keystone-bootstrap" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.304468 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.306633 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.306660 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-8bxw8" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.309452 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.311753 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.317567 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.436743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.436841 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.436885 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54cd5\" (UniqueName: \"kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.436926 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.437039 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.538521 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.538670 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.538721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.538768 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.538803 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54cd5\" (UniqueName: \"kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.542825 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.543024 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.543540 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.551564 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.559539 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54cd5\" (UniqueName: \"kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5\") pod \"keystone-686b8f9f97-68wfk\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:28 crc kubenswrapper[5030]: I0121 00:32:28.625862 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:29 crc kubenswrapper[5030]: I0121 00:32:29.022536 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:32:29 crc kubenswrapper[5030]: I0121 00:32:29.235612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" event={"ID":"5a257181-dee7-4137-afce-18919adc890d","Type":"ContainerStarted","Data":"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df"} Jan 21 00:32:29 crc kubenswrapper[5030]: I0121 00:32:29.235696 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" event={"ID":"5a257181-dee7-4137-afce-18919adc890d","Type":"ContainerStarted","Data":"03c750ef563e879f8feac0df43d751ff775f11c69ed0d078caef83287c67a318"} Jan 21 00:32:29 crc kubenswrapper[5030]: I0121 00:32:29.235794 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:32:29 crc kubenswrapper[5030]: I0121 00:32:29.261743 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" podStartSLOduration=1.2617177339999999 podStartE2EDuration="1.261717734s" podCreationTimestamp="2026-01-21 00:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:32:29.255346891 +0000 UTC m=+7021.575607179" watchObservedRunningTime="2026-01-21 00:32:29.261717734 +0000 UTC m=+7021.581978022" Jan 21 00:33:00 crc kubenswrapper[5030]: I0121 00:33:00.117457 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.145711 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-p59rb"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.160942 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-fk9p7"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.167504 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-fk9p7"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.173818 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-p59rb"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.181079 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.181381 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" podUID="5a257181-dee7-4137-afce-18919adc890d" containerName="keystone-api" containerID="cri-o://391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df" gracePeriod=30 Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.196991 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.198166 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.204546 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.271607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.271701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vw8d\" (UniqueName: \"kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.373925 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.373993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vw8d\" (UniqueName: \"kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.375677 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.395755 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vw8d\" (UniqueName: \"kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d\") pod \"keystoneb4d4-account-delete-sxbm4\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.519217 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.936816 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4"] Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.971997 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb5aea1-626e-4cd5-90f6-28c875d93c38" path="/var/lib/kubelet/pods/2cb5aea1-626e-4cd5-90f6-28c875d93c38/volumes" Jan 21 00:33:17 crc kubenswrapper[5030]: I0121 00:33:17.972595 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf29296-38a7-41e1-a930-3c39db057d03" path="/var/lib/kubelet/pods/bcf29296-38a7-41e1-a930-3c39db057d03/volumes" Jan 21 00:33:18 crc kubenswrapper[5030]: I0121 00:33:18.647119 5030 generic.go:334] "Generic (PLEG): container finished" podID="8801d226-fd80-4d26-b60a-17e9dfd36efa" containerID="e7cc14c387712ca0756f7dc60d3602c3043fc8c95d0a87ed8bb5ad426b567115" exitCode=0 Jan 21 00:33:18 crc kubenswrapper[5030]: I0121 00:33:18.647193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" event={"ID":"8801d226-fd80-4d26-b60a-17e9dfd36efa","Type":"ContainerDied","Data":"e7cc14c387712ca0756f7dc60d3602c3043fc8c95d0a87ed8bb5ad426b567115"} Jan 21 00:33:18 crc kubenswrapper[5030]: I0121 00:33:18.648602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" event={"ID":"8801d226-fd80-4d26-b60a-17e9dfd36efa","Type":"ContainerStarted","Data":"435068e0592e05532c835ed3a493224aa42e10aef8fd4e69c46a823286e574f7"} Jan 21 00:33:19 crc kubenswrapper[5030]: I0121 00:33:19.967043 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.122741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vw8d\" (UniqueName: \"kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d\") pod \"8801d226-fd80-4d26-b60a-17e9dfd36efa\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.122946 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts\") pod \"8801d226-fd80-4d26-b60a-17e9dfd36efa\" (UID: \"8801d226-fd80-4d26-b60a-17e9dfd36efa\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.123432 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8801d226-fd80-4d26-b60a-17e9dfd36efa" (UID: "8801d226-fd80-4d26-b60a-17e9dfd36efa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.128909 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d" (OuterVolumeSpecName: "kube-api-access-4vw8d") pod "8801d226-fd80-4d26-b60a-17e9dfd36efa" (UID: "8801d226-fd80-4d26-b60a-17e9dfd36efa"). InnerVolumeSpecName "kube-api-access-4vw8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.224844 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8801d226-fd80-4d26-b60a-17e9dfd36efa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.224896 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vw8d\" (UniqueName: \"kubernetes.io/projected/8801d226-fd80-4d26-b60a-17e9dfd36efa-kube-api-access-4vw8d\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.586762 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.669924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" event={"ID":"8801d226-fd80-4d26-b60a-17e9dfd36efa","Type":"ContainerDied","Data":"435068e0592e05532c835ed3a493224aa42e10aef8fd4e69c46a823286e574f7"} Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.669980 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="435068e0592e05532c835ed3a493224aa42e10aef8fd4e69c46a823286e574f7" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.670011 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.671507 5030 generic.go:334] "Generic (PLEG): container finished" podID="5a257181-dee7-4137-afce-18919adc890d" containerID="391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df" exitCode=0 Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.671548 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" event={"ID":"5a257181-dee7-4137-afce-18919adc890d","Type":"ContainerDied","Data":"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df"} Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.671573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" event={"ID":"5a257181-dee7-4137-afce-18919adc890d","Type":"ContainerDied","Data":"03c750ef563e879f8feac0df43d751ff775f11c69ed0d078caef83287c67a318"} Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.671597 5030 scope.go:117] "RemoveContainer" containerID="391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.671699 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-686b8f9f97-68wfk" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.694926 5030 scope.go:117] "RemoveContainer" containerID="391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df" Jan 21 00:33:20 crc kubenswrapper[5030]: E0121 00:33:20.695380 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df\": container with ID starting with 391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df not found: ID does not exist" containerID="391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.695454 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df"} err="failed to get container status \"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df\": rpc error: code = NotFound desc = could not find container \"391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df\": container with ID starting with 391b03cd9d3caa79bf0d57d045b4a6fde0cb28108dfaa78ee7ffafd381eab0df not found: ID does not exist" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.731400 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54cd5\" (UniqueName: \"kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5\") pod \"5a257181-dee7-4137-afce-18919adc890d\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.731441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys\") pod \"5a257181-dee7-4137-afce-18919adc890d\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.731523 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys\") pod \"5a257181-dee7-4137-afce-18919adc890d\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.731555 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts\") pod \"5a257181-dee7-4137-afce-18919adc890d\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.731581 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data\") pod \"5a257181-dee7-4137-afce-18919adc890d\" (UID: \"5a257181-dee7-4137-afce-18919adc890d\") " Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.736482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5a257181-dee7-4137-afce-18919adc890d" (UID: "5a257181-dee7-4137-afce-18919adc890d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.736544 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5" (OuterVolumeSpecName: "kube-api-access-54cd5") pod "5a257181-dee7-4137-afce-18919adc890d" (UID: "5a257181-dee7-4137-afce-18919adc890d"). InnerVolumeSpecName "kube-api-access-54cd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.736748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts" (OuterVolumeSpecName: "scripts") pod "5a257181-dee7-4137-afce-18919adc890d" (UID: "5a257181-dee7-4137-afce-18919adc890d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.736828 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5a257181-dee7-4137-afce-18919adc890d" (UID: "5a257181-dee7-4137-afce-18919adc890d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.753843 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data" (OuterVolumeSpecName: "config-data") pod "5a257181-dee7-4137-afce-18919adc890d" (UID: "5a257181-dee7-4137-afce-18919adc890d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.833007 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.833042 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.833057 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.833075 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54cd5\" (UniqueName: \"kubernetes.io/projected/5a257181-dee7-4137-afce-18919adc890d-kube-api-access-54cd5\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:20 crc kubenswrapper[5030]: I0121 00:33:20.833091 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5a257181-dee7-4137-afce-18919adc890d-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:21 crc kubenswrapper[5030]: I0121 00:33:21.006832 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:33:21 crc kubenswrapper[5030]: I0121 00:33:21.018961 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-686b8f9f97-68wfk"] Jan 21 00:33:21 crc kubenswrapper[5030]: I0121 00:33:21.971491 5030 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="5a257181-dee7-4137-afce-18919adc890d" path="/var/lib/kubelet/pods/5a257181-dee7-4137-afce-18919adc890d/volumes" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.169683 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-fhpvh"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.181297 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-fhpvh"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.190681 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.200729 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.209502 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-b4d4-account-create-update-jv9qg"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.216766 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystoneb4d4-account-delete-sxbm4"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.348273 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-3507-account-create-update-698k7"] Jan 21 00:33:22 crc kubenswrapper[5030]: E0121 00:33:22.351097 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a257181-dee7-4137-afce-18919adc890d" containerName="keystone-api" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.351130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a257181-dee7-4137-afce-18919adc890d" containerName="keystone-api" Jan 21 00:33:22 crc kubenswrapper[5030]: E0121 00:33:22.351153 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8801d226-fd80-4d26-b60a-17e9dfd36efa" containerName="mariadb-account-delete" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.351161 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8801d226-fd80-4d26-b60a-17e9dfd36efa" containerName="mariadb-account-delete" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.351335 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a257181-dee7-4137-afce-18919adc890d" containerName="keystone-api" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.351362 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8801d226-fd80-4d26-b60a-17e9dfd36efa" containerName="mariadb-account-delete" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.352008 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.354529 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.355504 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-r8l7f"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.358177 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.360901 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3507-account-create-update-698k7"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.365255 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-r8l7f"] Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.460616 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdfkv\" (UniqueName: \"kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.460768 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spgf5\" (UniqueName: \"kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.460803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.460860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.562769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spgf5\" (UniqueName: \"kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.562852 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.562944 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.562996 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bdfkv\" (UniqueName: \"kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.563917 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.565639 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.583259 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spgf5\" (UniqueName: \"kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5\") pod \"keystone-3507-account-create-update-698k7\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.584928 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdfkv\" (UniqueName: \"kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv\") pod \"keystone-db-create-r8l7f\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.678740 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:22 crc kubenswrapper[5030]: I0121 00:33:22.692737 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.111396 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-3507-account-create-update-698k7"] Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.198125 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-r8l7f"] Jan 21 00:33:23 crc kubenswrapper[5030]: W0121 00:33:23.219836 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode360cc0c_7528_4b74_8586_a4dab5c4b543.slice/crio-95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1 WatchSource:0}: Error finding container 95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1: Status 404 returned error can't find the container with id 95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1 Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.697241 5030 generic.go:334] "Generic (PLEG): container finished" podID="e360cc0c-7528-4b74-8586-a4dab5c4b543" containerID="4ba799b592fbda20fe7252c859392107b30d8f277c1ebc8f05b28486a902a62f" exitCode=0 Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.697289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" event={"ID":"e360cc0c-7528-4b74-8586-a4dab5c4b543","Type":"ContainerDied","Data":"4ba799b592fbda20fe7252c859392107b30d8f277c1ebc8f05b28486a902a62f"} Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.697580 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" event={"ID":"e360cc0c-7528-4b74-8586-a4dab5c4b543","Type":"ContainerStarted","Data":"95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1"} Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.699438 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7019e9a-6250-4154-85b8-d6a45e4ea550" containerID="20700d5af49b815ba8e55ca4711ca8c4cc6294881f9f4bf5b9ca73220236bfc4" exitCode=0 Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.699485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" event={"ID":"f7019e9a-6250-4154-85b8-d6a45e4ea550","Type":"ContainerDied","Data":"20700d5af49b815ba8e55ca4711ca8c4cc6294881f9f4bf5b9ca73220236bfc4"} Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.699506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" event={"ID":"f7019e9a-6250-4154-85b8-d6a45e4ea550","Type":"ContainerStarted","Data":"af7c436a001a9d527155b0fa2f3e36873235588f3c3c996995e71cf25878d58a"} Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.969241 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b2905d-8c8f-48c3-92e7-23405af95b06" path="/var/lib/kubelet/pods/86b2905d-8c8f-48c3-92e7-23405af95b06/volumes" Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.969795 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8801d226-fd80-4d26-b60a-17e9dfd36efa" path="/var/lib/kubelet/pods/8801d226-fd80-4d26-b60a-17e9dfd36efa/volumes" Jan 21 00:33:23 crc kubenswrapper[5030]: I0121 00:33:23.970293 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4df90c8-bc7b-4e92-9f0d-347a800b9c6f" path="/var/lib/kubelet/pods/e4df90c8-bc7b-4e92-9f0d-347a800b9c6f/volumes" 
Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.002566 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.075828 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.102176 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdfkv\" (UniqueName: \"kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv\") pod \"e360cc0c-7528-4b74-8586-a4dab5c4b543\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.102317 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts\") pod \"e360cc0c-7528-4b74-8586-a4dab5c4b543\" (UID: \"e360cc0c-7528-4b74-8586-a4dab5c4b543\") " Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.103157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e360cc0c-7528-4b74-8586-a4dab5c4b543" (UID: "e360cc0c-7528-4b74-8586-a4dab5c4b543"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.107591 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv" (OuterVolumeSpecName: "kube-api-access-bdfkv") pod "e360cc0c-7528-4b74-8586-a4dab5c4b543" (UID: "e360cc0c-7528-4b74-8586-a4dab5c4b543"). InnerVolumeSpecName "kube-api-access-bdfkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.203224 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts\") pod \"f7019e9a-6250-4154-85b8-d6a45e4ea550\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.203326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spgf5\" (UniqueName: \"kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5\") pod \"f7019e9a-6250-4154-85b8-d6a45e4ea550\" (UID: \"f7019e9a-6250-4154-85b8-d6a45e4ea550\") " Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.203720 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e360cc0c-7528-4b74-8586-a4dab5c4b543-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.203742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdfkv\" (UniqueName: \"kubernetes.io/projected/e360cc0c-7528-4b74-8586-a4dab5c4b543-kube-api-access-bdfkv\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.203744 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7019e9a-6250-4154-85b8-d6a45e4ea550" (UID: "f7019e9a-6250-4154-85b8-d6a45e4ea550"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.206552 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5" (OuterVolumeSpecName: "kube-api-access-spgf5") pod "f7019e9a-6250-4154-85b8-d6a45e4ea550" (UID: "f7019e9a-6250-4154-85b8-d6a45e4ea550"). InnerVolumeSpecName "kube-api-access-spgf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.305183 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7019e9a-6250-4154-85b8-d6a45e4ea550-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.305237 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spgf5\" (UniqueName: \"kubernetes.io/projected/f7019e9a-6250-4154-85b8-d6a45e4ea550-kube-api-access-spgf5\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.716145 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.716172 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-3507-account-create-update-698k7" event={"ID":"f7019e9a-6250-4154-85b8-d6a45e4ea550","Type":"ContainerDied","Data":"af7c436a001a9d527155b0fa2f3e36873235588f3c3c996995e71cf25878d58a"} Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.716213 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7c436a001a9d527155b0fa2f3e36873235588f3c3c996995e71cf25878d58a" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.718092 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" event={"ID":"e360cc0c-7528-4b74-8586-a4dab5c4b543","Type":"ContainerDied","Data":"95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1"} Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.718127 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-r8l7f" Jan 21 00:33:25 crc kubenswrapper[5030]: I0121 00:33:25.718136 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95aa4a896ed27dfaaa9f75b408b6f957955cb81fd215668d5d46973a647c82a1" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.951322 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-hvrcc"] Jan 21 00:33:27 crc kubenswrapper[5030]: E0121 00:33:27.951905 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e360cc0c-7528-4b74-8586-a4dab5c4b543" containerName="mariadb-database-create" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.951922 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e360cc0c-7528-4b74-8586-a4dab5c4b543" containerName="mariadb-database-create" Jan 21 00:33:27 crc kubenswrapper[5030]: E0121 00:33:27.951949 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7019e9a-6250-4154-85b8-d6a45e4ea550" containerName="mariadb-account-create-update" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.951957 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7019e9a-6250-4154-85b8-d6a45e4ea550" containerName="mariadb-account-create-update" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.952138 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7019e9a-6250-4154-85b8-d6a45e4ea550" containerName="mariadb-account-create-update" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.952155 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e360cc0c-7528-4b74-8586-a4dab5c4b543" containerName="mariadb-database-create" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.952759 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.954993 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lk7ss" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.955050 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.955050 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.956714 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:33:27 crc kubenswrapper[5030]: I0121 00:33:27.977597 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-hvrcc"] Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.048496 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.048661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwc7\" (UniqueName: \"kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.150806 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.150937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmwc7\" (UniqueName: \"kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.159357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.171098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwc7\" (UniqueName: \"kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7\") pod \"keystone-db-sync-hvrcc\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.290292 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lk7ss" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 
00:33:28.298070 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:28 crc kubenswrapper[5030]: I0121 00:33:28.766465 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-hvrcc"] Jan 21 00:33:29 crc kubenswrapper[5030]: I0121 00:33:29.753380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" event={"ID":"66d8042e-0563-4e85-b7a2-cfb6c402c4ef","Type":"ContainerStarted","Data":"cd0d8838c94903e8d90b606e6f1db3341d42d75604b250d537ba0b49b23b96e3"} Jan 21 00:33:29 crc kubenswrapper[5030]: I0121 00:33:29.753694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" event={"ID":"66d8042e-0563-4e85-b7a2-cfb6c402c4ef","Type":"ContainerStarted","Data":"994649014c5ae77d51d3b87c23265b2f2b090ea4f88b015629397b961457eb80"} Jan 21 00:33:29 crc kubenswrapper[5030]: I0121 00:33:29.772569 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" podStartSLOduration=2.772550378 podStartE2EDuration="2.772550378s" podCreationTimestamp="2026-01-21 00:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:33:29.767999879 +0000 UTC m=+7082.088260167" watchObservedRunningTime="2026-01-21 00:33:29.772550378 +0000 UTC m=+7082.092810666" Jan 21 00:33:31 crc kubenswrapper[5030]: I0121 00:33:31.774307 5030 generic.go:334] "Generic (PLEG): container finished" podID="66d8042e-0563-4e85-b7a2-cfb6c402c4ef" containerID="cd0d8838c94903e8d90b606e6f1db3341d42d75604b250d537ba0b49b23b96e3" exitCode=0 Jan 21 00:33:31 crc kubenswrapper[5030]: I0121 00:33:31.774401 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" event={"ID":"66d8042e-0563-4e85-b7a2-cfb6c402c4ef","Type":"ContainerDied","Data":"cd0d8838c94903e8d90b606e6f1db3341d42d75604b250d537ba0b49b23b96e3"} Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.050124 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.128047 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmwc7\" (UniqueName: \"kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7\") pod \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.128153 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data\") pod \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\" (UID: \"66d8042e-0563-4e85-b7a2-cfb6c402c4ef\") " Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.133426 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7" (OuterVolumeSpecName: "kube-api-access-pmwc7") pod "66d8042e-0563-4e85-b7a2-cfb6c402c4ef" (UID: "66d8042e-0563-4e85-b7a2-cfb6c402c4ef"). InnerVolumeSpecName "kube-api-access-pmwc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.164480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data" (OuterVolumeSpecName: "config-data") pod "66d8042e-0563-4e85-b7a2-cfb6c402c4ef" (UID: "66d8042e-0563-4e85-b7a2-cfb6c402c4ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.229647 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmwc7\" (UniqueName: \"kubernetes.io/projected/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-kube-api-access-pmwc7\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.229680 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d8042e-0563-4e85-b7a2-cfb6c402c4ef-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.796264 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" event={"ID":"66d8042e-0563-4e85-b7a2-cfb6c402c4ef","Type":"ContainerDied","Data":"994649014c5ae77d51d3b87c23265b2f2b090ea4f88b015629397b961457eb80"} Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.796679 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994649014c5ae77d51d3b87c23265b2f2b090ea4f88b015629397b961457eb80" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.796393 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-hvrcc" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.976495 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-mv6sd"] Jan 21 00:33:33 crc kubenswrapper[5030]: E0121 00:33:33.976851 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d8042e-0563-4e85-b7a2-cfb6c402c4ef" containerName="keystone-db-sync" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.976879 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d8042e-0563-4e85-b7a2-cfb6c402c4ef" containerName="keystone-db-sync" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.977090 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d8042e-0563-4e85-b7a2-cfb6c402c4ef" containerName="keystone-db-sync" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.977674 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.983658 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.983852 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lk7ss" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.983975 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.984091 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.984197 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:33:33 crc kubenswrapper[5030]: I0121 00:33:33.988338 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-mv6sd"] Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.040806 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psdft\" (UniqueName: \"kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.041201 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.041282 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.041383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.041517 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.143142 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc 
kubenswrapper[5030]: I0121 00:33:34.143840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.143993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.144047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.144178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psdft\" (UniqueName: \"kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.148410 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.148482 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.149307 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.148374 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.164128 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psdft\" (UniqueName: \"kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft\") pod \"keystone-bootstrap-mv6sd\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.303247 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.788129 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-mv6sd"] Jan 21 00:33:34 crc kubenswrapper[5030]: I0121 00:33:34.811698 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" event={"ID":"031af4ed-5133-4a3b-88ea-f857438fb2e4","Type":"ContainerStarted","Data":"0c12b796ff5f91c7a17dba3cd35da6cc0bea6d0214a8f587cc2a7bfbc93282a4"} Jan 21 00:33:36 crc kubenswrapper[5030]: I0121 00:33:36.828037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" event={"ID":"031af4ed-5133-4a3b-88ea-f857438fb2e4","Type":"ContainerStarted","Data":"c843368ab535d1e1653c99ecd54a5521ef468b82ba991af78ca288109814605b"} Jan 21 00:33:36 crc kubenswrapper[5030]: I0121 00:33:36.854947 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" podStartSLOduration=3.854918408 podStartE2EDuration="3.854918408s" podCreationTimestamp="2026-01-21 00:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:33:36.848727949 +0000 UTC m=+7089.168988277" watchObservedRunningTime="2026-01-21 00:33:36.854918408 +0000 UTC m=+7089.175178736" Jan 21 00:33:39 crc kubenswrapper[5030]: I0121 00:33:39.860647 5030 generic.go:334] "Generic (PLEG): container finished" podID="031af4ed-5133-4a3b-88ea-f857438fb2e4" containerID="c843368ab535d1e1653c99ecd54a5521ef468b82ba991af78ca288109814605b" exitCode=0 Jan 21 00:33:39 crc kubenswrapper[5030]: I0121 00:33:39.860719 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" event={"ID":"031af4ed-5133-4a3b-88ea-f857438fb2e4","Type":"ContainerDied","Data":"c843368ab535d1e1653c99ecd54a5521ef468b82ba991af78ca288109814605b"} Jan 21 00:33:40 crc kubenswrapper[5030]: I0121 00:33:40.157258 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:33:40 crc kubenswrapper[5030]: I0121 00:33:40.157346 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.163152 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.258777 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts\") pod \"031af4ed-5133-4a3b-88ea-f857438fb2e4\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.258833 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data\") pod \"031af4ed-5133-4a3b-88ea-f857438fb2e4\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.258869 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys\") pod \"031af4ed-5133-4a3b-88ea-f857438fb2e4\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.259915 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psdft\" (UniqueName: \"kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft\") pod \"031af4ed-5133-4a3b-88ea-f857438fb2e4\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.260071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys\") pod \"031af4ed-5133-4a3b-88ea-f857438fb2e4\" (UID: \"031af4ed-5133-4a3b-88ea-f857438fb2e4\") " Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.265972 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "031af4ed-5133-4a3b-88ea-f857438fb2e4" (UID: "031af4ed-5133-4a3b-88ea-f857438fb2e4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.266433 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts" (OuterVolumeSpecName: "scripts") pod "031af4ed-5133-4a3b-88ea-f857438fb2e4" (UID: "031af4ed-5133-4a3b-88ea-f857438fb2e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.266454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "031af4ed-5133-4a3b-88ea-f857438fb2e4" (UID: "031af4ed-5133-4a3b-88ea-f857438fb2e4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.266888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft" (OuterVolumeSpecName: "kube-api-access-psdft") pod "031af4ed-5133-4a3b-88ea-f857438fb2e4" (UID: "031af4ed-5133-4a3b-88ea-f857438fb2e4"). InnerVolumeSpecName "kube-api-access-psdft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.282778 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data" (OuterVolumeSpecName: "config-data") pod "031af4ed-5133-4a3b-88ea-f857438fb2e4" (UID: "031af4ed-5133-4a3b-88ea-f857438fb2e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.362160 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.362211 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.362239 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.362266 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/031af4ed-5133-4a3b-88ea-f857438fb2e4-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.362298 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psdft\" (UniqueName: \"kubernetes.io/projected/031af4ed-5133-4a3b-88ea-f857438fb2e4-kube-api-access-psdft\") on node \"crc\" DevicePath \"\"" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.877927 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" event={"ID":"031af4ed-5133-4a3b-88ea-f857438fb2e4","Type":"ContainerDied","Data":"0c12b796ff5f91c7a17dba3cd35da6cc0bea6d0214a8f587cc2a7bfbc93282a4"} Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.877969 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c12b796ff5f91c7a17dba3cd35da6cc0bea6d0214a8f587cc2a7bfbc93282a4" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.878021 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-mv6sd" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.980731 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:33:41 crc kubenswrapper[5030]: E0121 00:33:41.981017 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031af4ed-5133-4a3b-88ea-f857438fb2e4" containerName="keystone-bootstrap" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.981035 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="031af4ed-5133-4a3b-88ea-f857438fb2e4" containerName="keystone-bootstrap" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.981230 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="031af4ed-5133-4a3b-88ea-f857438fb2e4" containerName="keystone-bootstrap" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.981864 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.983917 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.984031 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.983917 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-lk7ss" Jan 21 00:33:41 crc kubenswrapper[5030]: I0121 00:33:41.983971 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.002425 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.072260 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.072318 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.072465 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sh4t\" (UniqueName: \"kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.072494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.072551 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.173441 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.173487 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.173527 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sh4t\" (UniqueName: \"kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.173598 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.173645 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.177508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.177872 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.181414 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.191281 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sh4t\" (UniqueName: \"kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.192508 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts\") pod \"keystone-d76475f55-v2lnk\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.347424 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.796274 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:33:42 crc kubenswrapper[5030]: I0121 00:33:42.886861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" event={"ID":"b2d30b10-958f-4bc4-a57d-36b673db4890","Type":"ContainerStarted","Data":"e5fae6d6b4d3ce834030497462ed10641515bc31eefb439bf08cccda65ac60b2"} Jan 21 00:33:43 crc kubenswrapper[5030]: I0121 00:33:43.896638 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" event={"ID":"b2d30b10-958f-4bc4-a57d-36b673db4890","Type":"ContainerStarted","Data":"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8"} Jan 21 00:33:43 crc kubenswrapper[5030]: I0121 00:33:43.897096 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:33:43 crc kubenswrapper[5030]: I0121 00:33:43.917799 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" podStartSLOduration=2.917775571 podStartE2EDuration="2.917775571s" podCreationTimestamp="2026-01-21 00:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:33:43.909769069 +0000 UTC m=+7096.230029357" watchObservedRunningTime="2026-01-21 00:33:43.917775571 +0000 UTC m=+7096.238035869" Jan 21 00:34:10 crc kubenswrapper[5030]: I0121 00:34:10.157423 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:34:10 crc kubenswrapper[5030]: I0121 00:34:10.158388 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:34:13 crc kubenswrapper[5030]: I0121 00:34:13.896327 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.740049 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.741036 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.743566 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.743604 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.749076 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-wbzvl" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.749592 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.870078 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.870223 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.870518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmnv\" (UniqueName: \"kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.972481 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbmnv\" (UniqueName: \"kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.972555 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.972579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.974461 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 
00:34:14.980023 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:14 crc kubenswrapper[5030]: I0121 00:34:14.991054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmnv\" (UniqueName: \"kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv\") pod \"openstackclient\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:15 crc kubenswrapper[5030]: I0121 00:34:15.063924 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 00:34:15 crc kubenswrapper[5030]: I0121 00:34:15.487440 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:34:15 crc kubenswrapper[5030]: I0121 00:34:15.497932 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:34:16 crc kubenswrapper[5030]: I0121 00:34:16.176564 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"ca1a59bd-f8de-4010-8a23-77a265e55b49","Type":"ContainerStarted","Data":"350fcda5a970b9d84c001e2fef0a77ab2f357a323cd07e464ce1d538a2416a84"} Jan 21 00:34:17 crc kubenswrapper[5030]: I0121 00:34:17.186442 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"ca1a59bd-f8de-4010-8a23-77a265e55b49","Type":"ContainerStarted","Data":"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013"} Jan 21 00:34:17 crc kubenswrapper[5030]: I0121 00:34:17.204039 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstackclient" podStartSLOduration=2.68680295 podStartE2EDuration="3.204023762s" podCreationTimestamp="2026-01-21 00:34:14 +0000 UTC" firstStartedPulling="2026-01-21 00:34:15.497666271 +0000 UTC m=+7127.817926569" lastFinishedPulling="2026-01-21 00:34:16.014887063 +0000 UTC m=+7128.335147381" observedRunningTime="2026-01-21 00:34:17.200593779 +0000 UTC m=+7129.520854097" watchObservedRunningTime="2026-01-21 00:34:17.204023762 +0000 UTC m=+7129.524284050" Jan 21 00:34:32 crc kubenswrapper[5030]: I0121 00:34:32.970601 5030 scope.go:117] "RemoveContainer" containerID="7d1f0f75a66fcedfda2393b7f594449db330ff4fa34d4c0af418df22393c8133" Jan 21 00:34:33 crc kubenswrapper[5030]: I0121 00:34:33.006385 5030 scope.go:117] "RemoveContainer" containerID="e36d136a3637d5ad02280a8e1b410b569b56fb8482dda19f04f012da725652b8" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.157242 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.157874 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.157934 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.158853 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.158925 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" gracePeriod=600 Jan 21 00:34:40 crc kubenswrapper[5030]: E0121 00:34:40.280009 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.398190 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" exitCode=0 Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.398259 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89"} Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.398525 5030 scope.go:117] "RemoveContainer" containerID="db98105b3432d0931d8d075baf245bf404651b9f54b3e055d9bfc97178ee7e44" Jan 21 00:34:40 crc kubenswrapper[5030]: I0121 00:34:40.399328 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:34:40 crc kubenswrapper[5030]: E0121 00:34:40.401208 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:34:53 crc kubenswrapper[5030]: I0121 00:34:53.962894 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:34:53 crc kubenswrapper[5030]: E0121 00:34:53.963752 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:35:05 crc kubenswrapper[5030]: I0121 00:35:05.962479 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:35:05 crc kubenswrapper[5030]: E0121 00:35:05.963223 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:35:17 crc kubenswrapper[5030]: I0121 00:35:17.966578 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:35:17 crc kubenswrapper[5030]: E0121 00:35:17.967304 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:35:32 crc kubenswrapper[5030]: I0121 00:35:32.963026 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:35:32 crc kubenswrapper[5030]: E0121 00:35:32.964144 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:35:33 crc kubenswrapper[5030]: I0121 00:35:33.097917 5030 scope.go:117] "RemoveContainer" containerID="e5b0157672e1b421cd5af4613cdeaedc6adfa85567c61cfc76b1b321d2c705ed" Jan 21 00:35:33 crc kubenswrapper[5030]: I0121 00:35:33.120892 5030 scope.go:117] "RemoveContainer" containerID="48a807b9894bd45560717aba56ef65fc945fee6bec861d7922d84e23fcbf0980" Jan 21 00:35:33 crc kubenswrapper[5030]: I0121 00:35:33.160372 5030 scope.go:117] "RemoveContainer" containerID="18b310f5677dbab8b33c8dbf2417026617f2f343c50e055b678ace6c25336fb1" Jan 21 00:35:33 crc kubenswrapper[5030]: I0121 00:35:33.216840 5030 scope.go:117] "RemoveContainer" containerID="d2229cb87a4c9dc69e631510bb947178738e79bb959e6ff85af07cdc34fee6aa" Jan 21 00:35:33 crc kubenswrapper[5030]: I0121 00:35:33.234712 5030 scope.go:117] "RemoveContainer" containerID="85e5ab42373262369eafd038b4738728ebc79353bb01234ec5eb3a70e11ba12d" Jan 21 00:35:44 crc kubenswrapper[5030]: I0121 00:35:44.962295 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:35:44 crc kubenswrapper[5030]: E0121 00:35:44.962892 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:35:55 crc kubenswrapper[5030]: I0121 00:35:55.961987 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:35:55 crc kubenswrapper[5030]: E0121 00:35:55.962703 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:36:06 crc kubenswrapper[5030]: I0121 00:36:06.962689 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:36:06 crc kubenswrapper[5030]: E0121 00:36:06.963427 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:36:19 crc kubenswrapper[5030]: I0121 00:36:19.962389 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:36:19 crc kubenswrapper[5030]: E0121 00:36:19.963188 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:36:20 crc kubenswrapper[5030]: I0121 00:36:20.051215 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-2kwnl"] Jan 21 00:36:20 crc kubenswrapper[5030]: I0121 00:36:20.058136 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-2kwnl"] Jan 21 00:36:21 crc kubenswrapper[5030]: I0121 00:36:21.971921 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf8a32f-b9b9-4652-bae4-a4585fd99b3c" path="/var/lib/kubelet/pods/3bf8a32f-b9b9-4652-bae4-a4585fd99b3c/volumes" Jan 21 00:36:31 crc kubenswrapper[5030]: I0121 00:36:31.962689 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:36:31 crc kubenswrapper[5030]: E0121 00:36:31.963516 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 
21 00:36:33 crc kubenswrapper[5030]: I0121 00:36:33.401329 5030 scope.go:117] "RemoveContainer" containerID="be02a06a2b2147d69c64894132b9ca632c3726d522bde0749ccbfa1e4fe8dbc9" Jan 21 00:36:33 crc kubenswrapper[5030]: I0121 00:36:33.435015 5030 scope.go:117] "RemoveContainer" containerID="145da04b1e990964301aeca19c397782cb6b889bf464d9d3ee9a7d40bf1a35f6" Jan 21 00:36:33 crc kubenswrapper[5030]: I0121 00:36:33.470940 5030 scope.go:117] "RemoveContainer" containerID="0bd6f7cd4661abab219cb48e19460cb50db6a6760f28d6b5d07031de996fa6a7" Jan 21 00:36:44 crc kubenswrapper[5030]: I0121 00:36:44.962587 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:36:44 crc kubenswrapper[5030]: E0121 00:36:44.963406 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:36:58 crc kubenswrapper[5030]: I0121 00:36:58.961827 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:36:58 crc kubenswrapper[5030]: E0121 00:36:58.963617 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:37:09 crc kubenswrapper[5030]: I0121 00:37:09.962394 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:37:09 crc kubenswrapper[5030]: E0121 00:37:09.963281 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:37:23 crc kubenswrapper[5030]: I0121 00:37:23.962424 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:37:23 crc kubenswrapper[5030]: E0121 00:37:23.963248 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.056455 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.059750 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.069969 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.168582 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.168781 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9454\" (UniqueName: \"kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.168906 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.270520 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9454\" (UniqueName: \"kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.270599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.270684 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.271200 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.271315 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.303516 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t9454\" (UniqueName: \"kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454\") pod \"redhat-marketplace-stk4m\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.384222 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.541878 5030 scope.go:117] "RemoveContainer" containerID="aa65ac67de2df03c7176e533a33ff638fc1de2375779abda5ebf53e2759f1291" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.589904 5030 scope.go:117] "RemoveContainer" containerID="f1cdf9e598e9b521e87c899799e6cc70e3ae1b33cefb9c0395e2c70c15546272" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.642916 5030 scope.go:117] "RemoveContainer" containerID="dc44f1f3719f1d4ebc0751b891dd3d95c0e3ff0f296ca0bb29b8e68b4862ffd5" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.658207 5030 scope.go:117] "RemoveContainer" containerID="e357e6e87971c8f44391bfe94b1f997081b94e3f0f5ae5b6100057a666f24a5b" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.673209 5030 scope.go:117] "RemoveContainer" containerID="dbc574f7c305deb7a35afe9abfffcd0e7f2e11f79f47afe93d06fcc812a1880b" Jan 21 00:37:33 crc kubenswrapper[5030]: I0121 00:37:33.830839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:34 crc kubenswrapper[5030]: I0121 00:37:34.785387 5030 generic.go:334] "Generic (PLEG): container finished" podID="307659df-121b-45d6-a201-00c5fbe5d363" containerID="8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569" exitCode=0 Jan 21 00:37:34 crc kubenswrapper[5030]: I0121 00:37:34.785434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerDied","Data":"8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569"} Jan 21 00:37:34 crc kubenswrapper[5030]: I0121 00:37:34.785459 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerStarted","Data":"ff9d31e661d0fdae64c7f938a6ef1281c9cbeeed3d0fd291a0a66e8748b493a4"} Jan 21 00:37:35 crc kubenswrapper[5030]: I0121 00:37:35.794322 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerStarted","Data":"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921"} Jan 21 00:37:36 crc kubenswrapper[5030]: I0121 00:37:36.803409 5030 generic.go:334] "Generic (PLEG): container finished" podID="307659df-121b-45d6-a201-00c5fbe5d363" containerID="4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921" exitCode=0 Jan 21 00:37:36 crc kubenswrapper[5030]: I0121 00:37:36.803508 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerDied","Data":"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921"} Jan 21 00:37:37 crc kubenswrapper[5030]: I0121 00:37:37.821689 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" 
event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerStarted","Data":"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017"} Jan 21 00:37:37 crc kubenswrapper[5030]: I0121 00:37:37.842074 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-stk4m" podStartSLOduration=2.314569947 podStartE2EDuration="4.842055714s" podCreationTimestamp="2026-01-21 00:37:33 +0000 UTC" firstStartedPulling="2026-01-21 00:37:34.787434167 +0000 UTC m=+7327.107694475" lastFinishedPulling="2026-01-21 00:37:37.314919954 +0000 UTC m=+7329.635180242" observedRunningTime="2026-01-21 00:37:37.838925649 +0000 UTC m=+7330.159185947" watchObservedRunningTime="2026-01-21 00:37:37.842055714 +0000 UTC m=+7330.162316002" Jan 21 00:37:37 crc kubenswrapper[5030]: I0121 00:37:37.967307 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:37:37 crc kubenswrapper[5030]: E0121 00:37:37.967669 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:37:43 crc kubenswrapper[5030]: I0121 00:37:43.384775 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:43 crc kubenswrapper[5030]: I0121 00:37:43.385126 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:43 crc kubenswrapper[5030]: I0121 00:37:43.431456 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:43 crc kubenswrapper[5030]: I0121 00:37:43.908309 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:43 crc kubenswrapper[5030]: I0121 00:37:43.972027 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:45 crc kubenswrapper[5030]: I0121 00:37:45.882814 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-stk4m" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="registry-server" containerID="cri-o://cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017" gracePeriod=2 Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.305392 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.394686 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content\") pod \"307659df-121b-45d6-a201-00c5fbe5d363\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.394792 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9454\" (UniqueName: \"kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454\") pod \"307659df-121b-45d6-a201-00c5fbe5d363\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.394858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities\") pod \"307659df-121b-45d6-a201-00c5fbe5d363\" (UID: \"307659df-121b-45d6-a201-00c5fbe5d363\") " Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.396199 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities" (OuterVolumeSpecName: "utilities") pod "307659df-121b-45d6-a201-00c5fbe5d363" (UID: "307659df-121b-45d6-a201-00c5fbe5d363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.402431 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454" (OuterVolumeSpecName: "kube-api-access-t9454") pod "307659df-121b-45d6-a201-00c5fbe5d363" (UID: "307659df-121b-45d6-a201-00c5fbe5d363"). InnerVolumeSpecName "kube-api-access-t9454". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.415995 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "307659df-121b-45d6-a201-00c5fbe5d363" (UID: "307659df-121b-45d6-a201-00c5fbe5d363"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.496503 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.496552 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9454\" (UniqueName: \"kubernetes.io/projected/307659df-121b-45d6-a201-00c5fbe5d363-kube-api-access-t9454\") on node \"crc\" DevicePath \"\"" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.496571 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/307659df-121b-45d6-a201-00c5fbe5d363-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.891409 5030 generic.go:334] "Generic (PLEG): container finished" podID="307659df-121b-45d6-a201-00c5fbe5d363" containerID="cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017" exitCode=0 Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.891471 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerDied","Data":"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017"} Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.891524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-stk4m" event={"ID":"307659df-121b-45d6-a201-00c5fbe5d363","Type":"ContainerDied","Data":"ff9d31e661d0fdae64c7f938a6ef1281c9cbeeed3d0fd291a0a66e8748b493a4"} Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.891535 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-stk4m" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.891544 5030 scope.go:117] "RemoveContainer" containerID="cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.926000 5030 scope.go:117] "RemoveContainer" containerID="4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.926912 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.936567 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-stk4m"] Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.948569 5030 scope.go:117] "RemoveContainer" containerID="8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.974827 5030 scope.go:117] "RemoveContainer" containerID="cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017" Jan 21 00:37:46 crc kubenswrapper[5030]: E0121 00:37:46.975335 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017\": container with ID starting with cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017 not found: ID does not exist" containerID="cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.975409 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017"} err="failed to get container status \"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017\": rpc error: code = NotFound desc = could not find container \"cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017\": container with ID starting with cd6c2e6af593f6d7ca35966e2784691218c2584cc0cb5662e6b1619c92279017 not found: ID does not exist" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.975478 5030 scope.go:117] "RemoveContainer" containerID="4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921" Jan 21 00:37:46 crc kubenswrapper[5030]: E0121 00:37:46.975901 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921\": container with ID starting with 4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921 not found: ID does not exist" containerID="4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.975934 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921"} err="failed to get container status \"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921\": rpc error: code = NotFound desc = could not find container \"4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921\": container with ID starting with 4a706fd42028751e5b5b8ebdfe407f1693c11813aba6b8e23789c017e3596921 not found: ID does not exist" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.975953 5030 scope.go:117] "RemoveContainer" 
containerID="8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569" Jan 21 00:37:46 crc kubenswrapper[5030]: E0121 00:37:46.976317 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569\": container with ID starting with 8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569 not found: ID does not exist" containerID="8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569" Jan 21 00:37:46 crc kubenswrapper[5030]: I0121 00:37:46.976348 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569"} err="failed to get container status \"8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569\": rpc error: code = NotFound desc = could not find container \"8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569\": container with ID starting with 8e89dd65d15e2af99bdb94fb2a0bb26a0590cda3e77ff206f1415ecaf9a94569 not found: ID does not exist" Jan 21 00:37:47 crc kubenswrapper[5030]: I0121 00:37:47.969950 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="307659df-121b-45d6-a201-00c5fbe5d363" path="/var/lib/kubelet/pods/307659df-121b-45d6-a201-00c5fbe5d363/volumes" Jan 21 00:37:51 crc kubenswrapper[5030]: I0121 00:37:51.963394 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:37:51 crc kubenswrapper[5030]: E0121 00:37:51.964662 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.368684 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:37:56 crc kubenswrapper[5030]: E0121 00:37:56.369858 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="extract-content" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.369877 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="extract-content" Jan 21 00:37:56 crc kubenswrapper[5030]: E0121 00:37:56.369898 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="registry-server" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.369908 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="registry-server" Jan 21 00:37:56 crc kubenswrapper[5030]: E0121 00:37:56.369921 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="extract-utilities" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.369930 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="extract-utilities" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.370140 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="307659df-121b-45d6-a201-00c5fbe5d363" containerName="registry-server" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.371332 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.379453 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.462270 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.462369 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvnrj\" (UniqueName: \"kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.462436 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.564115 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.564209 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.564281 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvnrj\" (UniqueName: \"kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.564719 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.564782 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities\") pod \"community-operators-mj7rz\" (UID: 
\"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.593480 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvnrj\" (UniqueName: \"kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj\") pod \"community-operators-mj7rz\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.692537 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:37:56 crc kubenswrapper[5030]: I0121 00:37:56.958271 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:37:57 crc kubenswrapper[5030]: I0121 00:37:57.969100 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerID="8fcddf1eae7bfb342518bae04d8838b6b4f730598a086d69dba8d93190c86857" exitCode=0 Jan 21 00:37:57 crc kubenswrapper[5030]: I0121 00:37:57.976319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerDied","Data":"8fcddf1eae7bfb342518bae04d8838b6b4f730598a086d69dba8d93190c86857"} Jan 21 00:37:57 crc kubenswrapper[5030]: I0121 00:37:57.976375 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerStarted","Data":"c79734be4890013c60be2589d3a7c0c1218bde296fdd3a25454ce1e9bd5d4a3f"} Jan 21 00:37:58 crc kubenswrapper[5030]: I0121 00:37:58.980178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerStarted","Data":"d26343bb95d4704d82a563e83836463c1b7f353c9cdd2c3cfcd7da5c3033a277"} Jan 21 00:37:59 crc kubenswrapper[5030]: I0121 00:37:59.989012 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerID="d26343bb95d4704d82a563e83836463c1b7f353c9cdd2c3cfcd7da5c3033a277" exitCode=0 Jan 21 00:37:59 crc kubenswrapper[5030]: I0121 00:37:59.989197 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerDied","Data":"d26343bb95d4704d82a563e83836463c1b7f353c9cdd2c3cfcd7da5c3033a277"} Jan 21 00:38:01 crc kubenswrapper[5030]: I0121 00:38:00.999796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerStarted","Data":"f87efe7cf9924a4c8e95593d6060ac329878d8b76390a035a08b1249fd02e692"} Jan 21 00:38:01 crc kubenswrapper[5030]: I0121 00:38:01.020565 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mj7rz" podStartSLOduration=2.535838509 podStartE2EDuration="5.020542808s" podCreationTimestamp="2026-01-21 00:37:56 +0000 UTC" firstStartedPulling="2026-01-21 00:37:57.971511626 +0000 UTC m=+7350.291771924" lastFinishedPulling="2026-01-21 00:38:00.456215925 +0000 UTC m=+7352.776476223" observedRunningTime="2026-01-21 00:38:01.015811505 +0000 UTC m=+7353.336071803" 
watchObservedRunningTime="2026-01-21 00:38:01.020542808 +0000 UTC m=+7353.340803096" Jan 21 00:38:06 crc kubenswrapper[5030]: I0121 00:38:06.693967 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:06 crc kubenswrapper[5030]: I0121 00:38:06.694487 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:06 crc kubenswrapper[5030]: I0121 00:38:06.736910 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:06 crc kubenswrapper[5030]: I0121 00:38:06.962584 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:38:06 crc kubenswrapper[5030]: E0121 00:38:06.963177 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:38:07 crc kubenswrapper[5030]: I0121 00:38:07.081109 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:07 crc kubenswrapper[5030]: I0121 00:38:07.141930 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:38:09 crc kubenswrapper[5030]: I0121 00:38:09.059359 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mj7rz" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="registry-server" containerID="cri-o://f87efe7cf9924a4c8e95593d6060ac329878d8b76390a035a08b1249fd02e692" gracePeriod=2 Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.072486 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerID="f87efe7cf9924a4c8e95593d6060ac329878d8b76390a035a08b1249fd02e692" exitCode=0 Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.072579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerDied","Data":"f87efe7cf9924a4c8e95593d6060ac329878d8b76390a035a08b1249fd02e692"} Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.593959 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.683892 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities\") pod \"4f2a1240-46dc-4692-863d-a32e6f11caff\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.683991 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content\") pod \"4f2a1240-46dc-4692-863d-a32e6f11caff\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.684025 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvnrj\" (UniqueName: \"kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj\") pod \"4f2a1240-46dc-4692-863d-a32e6f11caff\" (UID: \"4f2a1240-46dc-4692-863d-a32e6f11caff\") " Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.684983 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities" (OuterVolumeSpecName: "utilities") pod "4f2a1240-46dc-4692-863d-a32e6f11caff" (UID: "4f2a1240-46dc-4692-863d-a32e6f11caff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.690656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj" (OuterVolumeSpecName: "kube-api-access-lvnrj") pod "4f2a1240-46dc-4692-863d-a32e6f11caff" (UID: "4f2a1240-46dc-4692-863d-a32e6f11caff"). InnerVolumeSpecName "kube-api-access-lvnrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.741000 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f2a1240-46dc-4692-863d-a32e6f11caff" (UID: "4f2a1240-46dc-4692-863d-a32e6f11caff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.785426 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.785465 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f2a1240-46dc-4692-863d-a32e6f11caff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:38:10 crc kubenswrapper[5030]: I0121 00:38:10.785482 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvnrj\" (UniqueName: \"kubernetes.io/projected/4f2a1240-46dc-4692-863d-a32e6f11caff-kube-api-access-lvnrj\") on node \"crc\" DevicePath \"\"" Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.085921 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mj7rz" event={"ID":"4f2a1240-46dc-4692-863d-a32e6f11caff","Type":"ContainerDied","Data":"c79734be4890013c60be2589d3a7c0c1218bde296fdd3a25454ce1e9bd5d4a3f"} Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.086010 5030 scope.go:117] "RemoveContainer" containerID="f87efe7cf9924a4c8e95593d6060ac329878d8b76390a035a08b1249fd02e692" Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.086047 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mj7rz" Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.110139 5030 scope.go:117] "RemoveContainer" containerID="d26343bb95d4704d82a563e83836463c1b7f353c9cdd2c3cfcd7da5c3033a277" Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.129538 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.136771 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mj7rz"] Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.150937 5030 scope.go:117] "RemoveContainer" containerID="8fcddf1eae7bfb342518bae04d8838b6b4f730598a086d69dba8d93190c86857" Jan 21 00:38:11 crc kubenswrapper[5030]: I0121 00:38:11.977646 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" path="/var/lib/kubelet/pods/4f2a1240-46dc-4692-863d-a32e6f11caff/volumes" Jan 21 00:38:18 crc kubenswrapper[5030]: I0121 00:38:18.962417 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:38:18 crc kubenswrapper[5030]: E0121 00:38:18.962958 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.748598 5030 scope.go:117] "RemoveContainer" containerID="81a0f88de49ccf3e6b4b4162003f10bea3472fdea17c6bba60ef9a0938200944" Jan 21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.789689 5030 scope.go:117] "RemoveContainer" containerID="710255d5b4b972d99e63e2fd70e88a8f0cae991538c4d3ffb0e60f44134d3678" Jan 
21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.821494 5030 scope.go:117] "RemoveContainer" containerID="5d53470662d918c0bf1c6a06d565912c08a2ea67b746add59032ed20a4f45c82" Jan 21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.841838 5030 scope.go:117] "RemoveContainer" containerID="56577b69e355f56132268618b6805e8f83d49f38733f447f7a860aa109364f60" Jan 21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.869285 5030 scope.go:117] "RemoveContainer" containerID="7b522ee6f1171a2b131795b927d6459049aca19d599238d12de629b848864714" Jan 21 00:38:33 crc kubenswrapper[5030]: I0121 00:38:33.962566 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:38:33 crc kubenswrapper[5030]: E0121 00:38:33.962782 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:38:47 crc kubenswrapper[5030]: I0121 00:38:47.967348 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:38:47 crc kubenswrapper[5030]: E0121 00:38:47.968267 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:38:59 crc kubenswrapper[5030]: I0121 00:38:59.963320 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:38:59 crc kubenswrapper[5030]: E0121 00:38:59.964597 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:39:12 crc kubenswrapper[5030]: I0121 00:39:12.963040 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:39:12 crc kubenswrapper[5030]: E0121 00:39:12.963994 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:39:24 crc kubenswrapper[5030]: I0121 00:39:24.962886 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:39:24 crc kubenswrapper[5030]: E0121 00:39:24.963776 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:39:33 crc kubenswrapper[5030]: I0121 00:39:33.984331 5030 scope.go:117] "RemoveContainer" containerID="e7cc14c387712ca0756f7dc60d3602c3043fc8c95d0a87ed8bb5ad426b567115" Jan 21 00:39:36 crc kubenswrapper[5030]: I0121 00:39:36.962170 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:39:36 crc kubenswrapper[5030]: E0121 00:39:36.962727 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:39:48 crc kubenswrapper[5030]: I0121 00:39:48.962116 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:39:49 crc kubenswrapper[5030]: I0121 00:39:49.981733 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b"} Jan 21 00:42:10 crc kubenswrapper[5030]: I0121 00:42:10.157503 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:42:10 crc kubenswrapper[5030]: I0121 00:42:10.159278 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:42:40 crc kubenswrapper[5030]: I0121 00:42:40.157698 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:42:40 crc kubenswrapper[5030]: I0121 00:42:40.158388 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.157425 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.158174 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.158235 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.158887 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.158955 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b" gracePeriod=600 Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.661567 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b" exitCode=0 Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.661633 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b"} Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.661954 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9"} Jan 21 00:43:10 crc kubenswrapper[5030]: I0121 00:43:10.661984 5030 scope.go:117] "RemoveContainer" containerID="4cbf0a67df09086a2451ce8cbcc9a8a9847a887687100df5df121e74840c0a89" Jan 21 00:43:26 crc kubenswrapper[5030]: I0121 00:43:26.062041 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-3507-account-create-update-698k7"] Jan 21 00:43:26 crc kubenswrapper[5030]: I0121 00:43:26.069359 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-r8l7f"] Jan 21 00:43:26 crc kubenswrapper[5030]: I0121 00:43:26.082068 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-r8l7f"] Jan 21 00:43:26 crc kubenswrapper[5030]: I0121 00:43:26.089478 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-3507-account-create-update-698k7"] Jan 21 00:43:27 crc kubenswrapper[5030]: I0121 00:43:27.976601 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e360cc0c-7528-4b74-8586-a4dab5c4b543" path="/var/lib/kubelet/pods/e360cc0c-7528-4b74-8586-a4dab5c4b543/volumes" Jan 21 00:43:27 crc 
kubenswrapper[5030]: I0121 00:43:27.977685 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7019e9a-6250-4154-85b8-d6a45e4ea550" path="/var/lib/kubelet/pods/f7019e9a-6250-4154-85b8-d6a45e4ea550/volumes" Jan 21 00:43:33 crc kubenswrapper[5030]: I0121 00:43:33.060346 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-hvrcc"] Jan 21 00:43:33 crc kubenswrapper[5030]: I0121 00:43:33.070239 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-hvrcc"] Jan 21 00:43:33 crc kubenswrapper[5030]: I0121 00:43:33.971418 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d8042e-0563-4e85-b7a2-cfb6c402c4ef" path="/var/lib/kubelet/pods/66d8042e-0563-4e85-b7a2-cfb6c402c4ef/volumes" Jan 21 00:43:34 crc kubenswrapper[5030]: I0121 00:43:34.073499 5030 scope.go:117] "RemoveContainer" containerID="20700d5af49b815ba8e55ca4711ca8c4cc6294881f9f4bf5b9ca73220236bfc4" Jan 21 00:43:34 crc kubenswrapper[5030]: I0121 00:43:34.102558 5030 scope.go:117] "RemoveContainer" containerID="cd0d8838c94903e8d90b606e6f1db3341d42d75604b250d537ba0b49b23b96e3" Jan 21 00:43:34 crc kubenswrapper[5030]: I0121 00:43:34.134283 5030 scope.go:117] "RemoveContainer" containerID="4ba799b592fbda20fe7252c859392107b30d8f277c1ebc8f05b28486a902a62f" Jan 21 00:43:41 crc kubenswrapper[5030]: I0121 00:43:41.035110 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-mv6sd"] Jan 21 00:43:41 crc kubenswrapper[5030]: I0121 00:43:41.041287 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-mv6sd"] Jan 21 00:43:41 crc kubenswrapper[5030]: I0121 00:43:41.984874 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031af4ed-5133-4a3b-88ea-f857438fb2e4" path="/var/lib/kubelet/pods/031af4ed-5133-4a3b-88ea-f857438fb2e4/volumes" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.372678 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.374242 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstackclient" podUID="ca1a59bd-f8de-4010-8a23-77a265e55b49" containerName="openstackclient" containerID="cri-o://b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013" gracePeriod=30 Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.779349 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.790293 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret\") pod \"ca1a59bd-f8de-4010-8a23-77a265e55b49\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.790363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbmnv\" (UniqueName: \"kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv\") pod \"ca1a59bd-f8de-4010-8a23-77a265e55b49\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.790447 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config\") pod \"ca1a59bd-f8de-4010-8a23-77a265e55b49\" (UID: \"ca1a59bd-f8de-4010-8a23-77a265e55b49\") " Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.797363 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv" (OuterVolumeSpecName: "kube-api-access-dbmnv") pod "ca1a59bd-f8de-4010-8a23-77a265e55b49" (UID: "ca1a59bd-f8de-4010-8a23-77a265e55b49"). InnerVolumeSpecName "kube-api-access-dbmnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.811523 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ca1a59bd-f8de-4010-8a23-77a265e55b49" (UID: "ca1a59bd-f8de-4010-8a23-77a265e55b49"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.818606 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ca1a59bd-f8de-4010-8a23-77a265e55b49" (UID: "ca1a59bd-f8de-4010-8a23-77a265e55b49"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.891526 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.891570 5030 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ca1a59bd-f8de-4010-8a23-77a265e55b49-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:23 crc kubenswrapper[5030]: I0121 00:44:23.891586 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbmnv\" (UniqueName: \"kubernetes.io/projected/ca1a59bd-f8de-4010-8a23-77a265e55b49-kube-api-access-dbmnv\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.266023 5030 generic.go:334] "Generic (PLEG): container finished" podID="ca1a59bd-f8de-4010-8a23-77a265e55b49" containerID="b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013" exitCode=143 Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.266082 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.266107 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"ca1a59bd-f8de-4010-8a23-77a265e55b49","Type":"ContainerDied","Data":"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013"} Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.266200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstackclient" event={"ID":"ca1a59bd-f8de-4010-8a23-77a265e55b49","Type":"ContainerDied","Data":"350fcda5a970b9d84c001e2fef0a77ab2f357a323cd07e464ce1d538a2416a84"} Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.266249 5030 scope.go:117] "RemoveContainer" containerID="b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.298537 5030 scope.go:117] "RemoveContainer" containerID="b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013" Jan 21 00:44:24 crc kubenswrapper[5030]: E0121 00:44:24.299184 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013\": container with ID starting with b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013 not found: ID does not exist" containerID="b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.299236 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013"} err="failed to get container status \"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013\": rpc error: code = NotFound desc = could not find container \"b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013\": container with ID starting with b01104de51a609d2b64bf30459460f431dca56911e7676f3e1bc5db24228a013 not found: ID does not exist" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.303488 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:44:24 crc 
kubenswrapper[5030]: I0121 00:44:24.309028 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.474551 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.474790 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" podUID="b2d30b10-958f-4bc4-a57d-36b673db4890" containerName="keystone-api" containerID="cri-o://afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8" gracePeriod=30 Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.539341 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone3507-account-delete-rcx8w"] Jan 21 00:44:24 crc kubenswrapper[5030]: E0121 00:44:24.539718 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1a59bd-f8de-4010-8a23-77a265e55b49" containerName="openstackclient" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.539739 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1a59bd-f8de-4010-8a23-77a265e55b49" containerName="openstackclient" Jan 21 00:44:24 crc kubenswrapper[5030]: E0121 00:44:24.539760 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="registry-server" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.539768 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="registry-server" Jan 21 00:44:24 crc kubenswrapper[5030]: E0121 00:44:24.539782 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="extract-content" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.539792 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="extract-content" Jan 21 00:44:24 crc kubenswrapper[5030]: E0121 00:44:24.539817 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="extract-utilities" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.539826 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="extract-utilities" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.540024 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2a1240-46dc-4692-863d-a32e6f11caff" containerName="registry-server" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.540057 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1a59bd-f8de-4010-8a23-77a265e55b49" containerName="openstackclient" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.540766 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.547299 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3507-account-delete-rcx8w"] Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.609295 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b647\" (UniqueName: \"kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.609360 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.711320 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b647\" (UniqueName: \"kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.711420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.712301 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.732691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b647\" (UniqueName: \"kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647\") pod \"keystone3507-account-delete-rcx8w\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:24 crc kubenswrapper[5030]: I0121 00:44:24.856252 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:25 crc kubenswrapper[5030]: I0121 00:44:25.285283 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone3507-account-delete-rcx8w"] Jan 21 00:44:25 crc kubenswrapper[5030]: I0121 00:44:25.971311 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1a59bd-f8de-4010-8a23-77a265e55b49" path="/var/lib/kubelet/pods/ca1a59bd-f8de-4010-8a23-77a265e55b49/volumes" Jan 21 00:44:26 crc kubenswrapper[5030]: I0121 00:44:26.284052 5030 generic.go:334] "Generic (PLEG): container finished" podID="64bd91b0-63e2-4b7d-a6f2-5dec227693fa" containerID="3896a6e6bb21c09b1e7b87a886accc1b0d58aa8f0af3cbfb046abdfa425f4bac" exitCode=0 Jan 21 00:44:26 crc kubenswrapper[5030]: I0121 00:44:26.284105 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" event={"ID":"64bd91b0-63e2-4b7d-a6f2-5dec227693fa","Type":"ContainerDied","Data":"3896a6e6bb21c09b1e7b87a886accc1b0d58aa8f0af3cbfb046abdfa425f4bac"} Jan 21 00:44:26 crc kubenswrapper[5030]: I0121 00:44:26.284173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" event={"ID":"64bd91b0-63e2-4b7d-a6f2-5dec227693fa","Type":"ContainerStarted","Data":"536f0f306ce398fa2866a1f6f8e95073e2606db0885cfeee07f56c3755a2405e"} Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.580518 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.754226 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b647\" (UniqueName: \"kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647\") pod \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.754758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts\") pod \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\" (UID: \"64bd91b0-63e2-4b7d-a6f2-5dec227693fa\") " Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.756046 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64bd91b0-63e2-4b7d-a6f2-5dec227693fa" (UID: "64bd91b0-63e2-4b7d-a6f2-5dec227693fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.763278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647" (OuterVolumeSpecName: "kube-api-access-5b647") pod "64bd91b0-63e2-4b7d-a6f2-5dec227693fa" (UID: "64bd91b0-63e2-4b7d-a6f2-5dec227693fa"). InnerVolumeSpecName "kube-api-access-5b647". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.856074 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.856143 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b647\" (UniqueName: \"kubernetes.io/projected/64bd91b0-63e2-4b7d-a6f2-5dec227693fa-kube-api-access-5b647\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:27 crc kubenswrapper[5030]: I0121 00:44:27.938751 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.058135 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sh4t\" (UniqueName: \"kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t\") pod \"b2d30b10-958f-4bc4-a57d-36b673db4890\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.058222 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data\") pod \"b2d30b10-958f-4bc4-a57d-36b673db4890\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.058290 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys\") pod \"b2d30b10-958f-4bc4-a57d-36b673db4890\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.058362 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys\") pod \"b2d30b10-958f-4bc4-a57d-36b673db4890\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.058494 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts\") pod \"b2d30b10-958f-4bc4-a57d-36b673db4890\" (UID: \"b2d30b10-958f-4bc4-a57d-36b673db4890\") " Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.061715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t" (OuterVolumeSpecName: "kube-api-access-7sh4t") pod "b2d30b10-958f-4bc4-a57d-36b673db4890" (UID: "b2d30b10-958f-4bc4-a57d-36b673db4890"). InnerVolumeSpecName "kube-api-access-7sh4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.062160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b2d30b10-958f-4bc4-a57d-36b673db4890" (UID: "b2d30b10-958f-4bc4-a57d-36b673db4890"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.062609 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts" (OuterVolumeSpecName: "scripts") pod "b2d30b10-958f-4bc4-a57d-36b673db4890" (UID: "b2d30b10-958f-4bc4-a57d-36b673db4890"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.063439 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b2d30b10-958f-4bc4-a57d-36b673db4890" (UID: "b2d30b10-958f-4bc4-a57d-36b673db4890"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.075791 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data" (OuterVolumeSpecName: "config-data") pod "b2d30b10-958f-4bc4-a57d-36b673db4890" (UID: "b2d30b10-958f-4bc4-a57d-36b673db4890"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.160192 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.160226 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sh4t\" (UniqueName: \"kubernetes.io/projected/b2d30b10-958f-4bc4-a57d-36b673db4890-kube-api-access-7sh4t\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.160239 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.160248 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.160257 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2d30b10-958f-4bc4-a57d-36b673db4890-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.301913 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.301888 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone3507-account-delete-rcx8w" event={"ID":"64bd91b0-63e2-4b7d-a6f2-5dec227693fa","Type":"ContainerDied","Data":"536f0f306ce398fa2866a1f6f8e95073e2606db0885cfeee07f56c3755a2405e"} Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.302118 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="536f0f306ce398fa2866a1f6f8e95073e2606db0885cfeee07f56c3755a2405e" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.304168 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2d30b10-958f-4bc4-a57d-36b673db4890" containerID="afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8" exitCode=0 Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.304226 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.304233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" event={"ID":"b2d30b10-958f-4bc4-a57d-36b673db4890","Type":"ContainerDied","Data":"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8"} Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.304289 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-d76475f55-v2lnk" event={"ID":"b2d30b10-958f-4bc4-a57d-36b673db4890","Type":"ContainerDied","Data":"e5fae6d6b4d3ce834030497462ed10641515bc31eefb439bf08cccda65ac60b2"} Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.304310 5030 scope.go:117] "RemoveContainer" containerID="afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.343241 5030 scope.go:117] "RemoveContainer" containerID="afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8" Jan 21 00:44:28 crc kubenswrapper[5030]: E0121 00:44:28.344356 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8\": container with ID starting with afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8 not found: ID does not exist" containerID="afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.344550 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8"} err="failed to get container status \"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8\": rpc error: code = NotFound desc = could not find container \"afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8\": container with ID starting with afa6ce71049dc9a297d92e1cf76315d70bef6a9c2ddacbb46d8340997fe31bc8 not found: ID does not exist" Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.344647 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:44:28 crc kubenswrapper[5030]: I0121 00:44:28.349836 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-d76475f55-v2lnk"] Jan 21 00:44:29 crc kubenswrapper[5030]: I0121 00:44:29.580848 5030 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone3507-account-delete-rcx8w"] Jan 21 00:44:29 crc kubenswrapper[5030]: I0121 00:44:29.586406 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone3507-account-delete-rcx8w"] Jan 21 00:44:29 crc kubenswrapper[5030]: I0121 00:44:29.978515 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64bd91b0-63e2-4b7d-a6f2-5dec227693fa" path="/var/lib/kubelet/pods/64bd91b0-63e2-4b7d-a6f2-5dec227693fa/volumes" Jan 21 00:44:29 crc kubenswrapper[5030]: I0121 00:44:29.979707 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d30b10-958f-4bc4-a57d-36b673db4890" path="/var/lib/kubelet/pods/b2d30b10-958f-4bc4-a57d-36b673db4890/volumes" Jan 21 00:44:34 crc kubenswrapper[5030]: I0121 00:44:34.200451 5030 scope.go:117] "RemoveContainer" containerID="c843368ab535d1e1653c99ecd54a5521ef468b82ba991af78ca288109814605b" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.271424 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/root-account-create-update-s7b2j"] Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.271757 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64bd91b0-63e2-4b7d-a6f2-5dec227693fa" containerName="mariadb-account-delete" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.271771 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="64bd91b0-63e2-4b7d-a6f2-5dec227693fa" containerName="mariadb-account-delete" Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.271806 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d30b10-958f-4bc4-a57d-36b673db4890" containerName="keystone-api" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.271814 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d30b10-958f-4bc4-a57d-36b673db4890" containerName="keystone-api" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.271969 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d30b10-958f-4bc4-a57d-36b673db4890" containerName="keystone-api" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.271986 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="64bd91b0-63e2-4b7d-a6f2-5dec227693fa" containerName="mariadb-account-delete" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.272528 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.275618 5030 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.285185 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-s7b2j"] Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.338075 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.346245 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.356502 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.365887 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.365964 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrpxr\" (UniqueName: \"kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.370875 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-s7b2j"] Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.371430 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qrpxr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/root-account-create-update-s7b2j" podUID="6aef87c5-faf5-4580-8c7c-1304f1592744" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.467331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.467387 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpxr\" (UniqueName: \"kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.467482 5030 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.467559 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:35.967539889 +0000 UTC m=+7748.287800177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : configmap "openstack-scripts" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.471530 5030 projected.go:194] Error preparing data for projected volume kube-api-access-qrpxr for pod keystone-kuttl-tests/root-account-create-update-s7b2j: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.471596 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:35.971586606 +0000 UTC m=+7748.291846894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qrpxr" (UniqueName: "kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.496526 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-2" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="galera" containerID="cri-o://35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f" gracePeriod=30 Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.935474 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.936005 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/memcached-0" podUID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" containerName="memcached" containerID="cri-o://ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7" gracePeriod=30 Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.972935 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: I0121 00:44:35.972997 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpxr\" (UniqueName: \"kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.973098 5030 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.973167 5030 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:36.973149313 +0000 UTC m=+7749.293409601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : configmap "openstack-scripts" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.976514 5030 projected.go:194] Error preparing data for projected volume kube-api-access-qrpxr for pod keystone-kuttl-tests/root-account-create-update-s7b2j: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:35 crc kubenswrapper[5030]: E0121 00:44:35.976607 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:36.976587536 +0000 UTC m=+7749.296847824 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qrpxr" (UniqueName: "kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.318388 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.365965 5030 generic.go:334] "Generic (PLEG): container finished" podID="3af3725b-a540-4b57-9655-0087cea23aa0" containerID="35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f" exitCode=0 Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.366048 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.366105 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.367054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerDied","Data":"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f"} Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.367101 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"3af3725b-a540-4b57-9655-0087cea23aa0","Type":"ContainerDied","Data":"c091f8dfb5135dc8afec64ad97aaac9348a03d304b2ee147376ed1105a33f85a"} Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.367118 5030 scope.go:117] "RemoveContainer" containerID="35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.375915 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.390430 5030 scope.go:117] "RemoveContainer" containerID="2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.414566 5030 scope.go:117] "RemoveContainer" containerID="35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f" Jan 21 00:44:36 crc kubenswrapper[5030]: E0121 00:44:36.415059 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f\": container with ID starting with 35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f not found: ID does not exist" containerID="35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.415097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f"} err="failed to get container status \"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f\": rpc error: code = NotFound desc = could not find container \"35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f\": container with ID starting with 35cf9a354de6a03e1d2f76b7c36b7d348684cc280dac6e0b42684214d10b579f not found: ID does not exist" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.415122 5030 scope.go:117] "RemoveContainer" containerID="2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74" Jan 21 00:44:36 crc kubenswrapper[5030]: E0121 00:44:36.415479 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74\": container with ID starting with 2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74 not found: ID does not exist" containerID="2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.415525 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74"} err="failed to get container status \"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74\": rpc error: code = NotFound desc = could not find container \"2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74\": container with ID starting with 2a4ce284153fef8d4a6d8f8bf008b95c83d6857b6d8e308d18de3e10cea8cf74 not found: ID does not exist" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.466439 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.488232 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config\") pod \"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.488321 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts\") pod 
\"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.489242 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.489595 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.489672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.489720 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4xwd\" (UniqueName: \"kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd\") pod \"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491379 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated\") pod \"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491413 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default\") pod \"3af3725b-a540-4b57-9655-0087cea23aa0\" (UID: \"3af3725b-a540-4b57-9655-0087cea23aa0\") " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491924 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491946 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.491977 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.499312 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd" (OuterVolumeSpecName: "kube-api-access-d4xwd") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "kube-api-access-d4xwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.501397 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "mysql-db") pod "3af3725b-a540-4b57-9655-0087cea23aa0" (UID: "3af3725b-a540-4b57-9655-0087cea23aa0"). InnerVolumeSpecName "local-storage18-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.593823 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.594084 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3af3725b-a540-4b57-9655-0087cea23aa0-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.594107 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.594120 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4xwd\" (UniqueName: \"kubernetes.io/projected/3af3725b-a540-4b57-9655-0087cea23aa0-kube-api-access-d4xwd\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.607381 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.694730 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.695104 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.699914 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Jan 21 00:44:36 crc kubenswrapper[5030]: I0121 00:44:36.823012 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.009388 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.009458 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrpxr\" (UniqueName: \"kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr\") pod \"root-account-create-update-s7b2j\" (UID: \"6aef87c5-faf5-4580-8c7c-1304f1592744\") " pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:37 crc kubenswrapper[5030]: E0121 00:44:37.010400 5030 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 00:44:37 crc kubenswrapper[5030]: E0121 00:44:37.010455 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:39.010439556 +0000 UTC m=+7751.330699844 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : configmap "openstack-scripts" not found Jan 21 00:44:37 crc kubenswrapper[5030]: E0121 00:44:37.019641 5030 projected.go:194] Error preparing data for projected volume kube-api-access-qrpxr for pod keystone-kuttl-tests/root-account-create-update-s7b2j: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:37 crc kubenswrapper[5030]: E0121 00:44:37.019715 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr podName:6aef87c5-faf5-4580-8c7c-1304f1592744 nodeName:}" failed. No retries permitted until 2026-01-21 00:44:39.019698929 +0000 UTC m=+7751.339959217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qrpxr" (UniqueName: "kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr") pod "root-account-create-update-s7b2j" (UID: "6aef87c5-faf5-4580-8c7c-1304f1592744") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.201907 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.312646 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data\") pod \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.312758 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwsbv\" (UniqueName: \"kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv\") pod \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.312858 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config\") pod \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\" (UID: \"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47\") " Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.313485 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data" (OuterVolumeSpecName: "config-data") pod "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" (UID: "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.313583 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" (UID: "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.317533 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv" (OuterVolumeSpecName: "kube-api-access-vwsbv") pod "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" (UID: "5e8db42f-632b-4d70-9c6f-1bd38c0f1c47"). InnerVolumeSpecName "kube-api-access-vwsbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375542 5030 generic.go:334] "Generic (PLEG): container finished" podID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" containerID="ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7" exitCode=0 Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375591 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375613 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/root-account-create-update-s7b2j" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47","Type":"ContainerDied","Data":"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7"} Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"5e8db42f-632b-4d70-9c6f-1bd38c0f1c47","Type":"ContainerDied","Data":"825379b1ecbad96c19a675651d7f9d1537a51a3471290002710e60ce36fe8045"} Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.375713 5030 scope.go:117] "RemoveContainer" containerID="ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.409726 5030 scope.go:117] "RemoveContainer" containerID="ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7" Jan 21 00:44:37 crc kubenswrapper[5030]: E0121 00:44:37.414248 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7\": container with ID starting with ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7 not found: ID does not exist" containerID="ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.414301 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7"} err="failed to get container status \"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7\": rpc error: code = NotFound desc = could not find container \"ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7\": container with ID starting with ead90f2213be7eeed0596c3695de24918d1426e2e96d77e3af783b5a5d85bef7 not found: ID does not exist" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.415175 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.415202 5030 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-vwsbv\" (UniqueName: \"kubernetes.io/projected/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kube-api-access-vwsbv\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.415215 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.427097 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-s7b2j"] Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.430370 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/rabbitmq-server-0" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="rabbitmq" containerID="cri-o://b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab" gracePeriod=604800 Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.440342 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/root-account-create-update-s7b2j"] Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.448335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.453822 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.517084 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrpxr\" (UniqueName: \"kubernetes.io/projected/6aef87c5-faf5-4580-8c7c-1304f1592744-kube-api-access-qrpxr\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.518942 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6aef87c5-faf5-4580-8c7c-1304f1592744-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.548161 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-1" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="galera" containerID="cri-o://e6836d2cd0525dcc81f5327e90f13f4b8342fe25b3f7ca5d7231bc3df10a4354" gracePeriod=28 Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.972204 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" path="/var/lib/kubelet/pods/3af3725b-a540-4b57-9655-0087cea23aa0/volumes" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.973113 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" path="/var/lib/kubelet/pods/5e8db42f-632b-4d70-9c6f-1bd38c0f1c47/volumes" Jan 21 00:44:37 crc kubenswrapper[5030]: I0121 00:44:37.973577 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aef87c5-faf5-4580-8c7c-1304f1592744" path="/var/lib/kubelet/pods/6aef87c5-faf5-4580-8c7c-1304f1592744/volumes" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.222751 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.222963 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" 
podUID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" containerName="manager" containerID="cri-o://4c65ea1f851941676d7928ea51d2203d75494f65b132d93fda2f4070408fc80c" gracePeriod=10 Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.385227 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" containerID="4c65ea1f851941676d7928ea51d2203d75494f65b132d93fda2f4070408fc80c" exitCode=0 Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.385274 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" event={"ID":"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d","Type":"ContainerDied","Data":"4c65ea1f851941676d7928ea51d2203d75494f65b132d93fda2f4070408fc80c"} Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.483411 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.490922 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-m59dl" podUID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" containerName="registry-server" containerID="cri-o://755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887" gracePeriod=30 Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.525795 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c"] Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.556786 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a069zx78c"] Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.698147 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.744229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert\") pod \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.744364 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jflmf\" (UniqueName: \"kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf\") pod \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.744411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert\") pod \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\" (UID: \"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d\") " Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.761850 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" (UID: "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.769859 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf" (OuterVolumeSpecName: "kube-api-access-jflmf") pod "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" (UID: "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d"). InnerVolumeSpecName "kube-api-access-jflmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.769985 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" (UID: "8ed0d97f-1c38-4cdd-bacf-507b3ca9442d"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.846320 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.846363 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jflmf\" (UniqueName: \"kubernetes.io/projected/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-kube-api-access-jflmf\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.846387 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:38 crc kubenswrapper[5030]: I0121 00:44:38.960115 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.047033 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.048452 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clt5l\" (UniqueName: \"kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l\") pod \"d6c9fb65-9372-46f7-b41e-bddf586f31d4\" (UID: \"d6c9fb65-9372-46f7-b41e-bddf586f31d4\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.056098 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l" (OuterVolumeSpecName: "kube-api-access-clt5l") pod "d6c9fb65-9372-46f7-b41e-bddf586f31d4" (UID: "d6c9fb65-9372-46f7-b41e-bddf586f31d4"). InnerVolumeSpecName "kube-api-access-clt5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxlwl\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150326 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150372 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150528 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150562 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150678 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150707 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret\") pod \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\" (UID: \"f7cb0fd7-6a53-4ab5-a144-9495044acffb\") " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.150982 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clt5l\" (UniqueName: \"kubernetes.io/projected/d6c9fb65-9372-46f7-b41e-bddf586f31d4-kube-api-access-clt5l\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.151444 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: 
"f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.151499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.151516 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.154145 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.155128 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info" (OuterVolumeSpecName: "pod-info") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.155895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl" (OuterVolumeSpecName: "kube-api-access-pxlwl") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "kube-api-access-pxlwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.180478 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122" (OuterVolumeSpecName: "persistence") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.233786 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f7cb0fd7-6a53-4ab5-a144-9495044acffb" (UID: "f7cb0fd7-6a53-4ab5-a144-9495044acffb"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252275 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxlwl\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-kube-api-access-pxlwl\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252313 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252324 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252336 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f7cb0fd7-6a53-4ab5-a144-9495044acffb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252373 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") on node \"crc\" " Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252390 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f7cb0fd7-6a53-4ab5-a144-9495044acffb-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252402 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f7cb0fd7-6a53-4ab5-a144-9495044acffb-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.252428 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f7cb0fd7-6a53-4ab5-a144-9495044acffb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.271752 5030 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.271925 5030 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122") on node "crc" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.353414 5030 reconciler_common.go:293] "Volume detached for volume \"pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c30dd48f-3a59-4b5a-bfc8-1af571c6a122\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.404157 5030 generic.go:334] "Generic (PLEG): container finished" podID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" containerID="755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887" exitCode=0 Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.404269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-m59dl" event={"ID":"d6c9fb65-9372-46f7-b41e-bddf586f31d4","Type":"ContainerDied","Data":"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887"} Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.404328 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-m59dl" event={"ID":"d6c9fb65-9372-46f7-b41e-bddf586f31d4","Type":"ContainerDied","Data":"7ce7e78f9cde34952a70e85bf0664398449eeb1c884b9b21585fa29e564efd39"} Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.404353 5030 scope.go:117] "RemoveContainer" containerID="755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.404648 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-m59dl" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.413121 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" event={"ID":"8ed0d97f-1c38-4cdd-bacf-507b3ca9442d","Type":"ContainerDied","Data":"5f83cb9953091da6c1a6ee0e7f183e0f7c8db41d1df89cb10471bf4480592aae"} Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.413295 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.417238 5030 generic.go:334] "Generic (PLEG): container finished" podID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerID="b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab" exitCode=0 Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.417278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerDied","Data":"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab"} Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.417304 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"f7cb0fd7-6a53-4ab5-a144-9495044acffb","Type":"ContainerDied","Data":"0b115e0ffd70572e4d1729e17c81e61afca1995289fc050f3d08b035fe311afa"} Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.417378 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.430686 5030 scope.go:117] "RemoveContainer" containerID="755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887" Jan 21 00:44:39 crc kubenswrapper[5030]: E0121 00:44:39.433109 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887\": container with ID starting with 755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887 not found: ID does not exist" containerID="755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.433171 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887"} err="failed to get container status \"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887\": rpc error: code = NotFound desc = could not find container \"755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887\": container with ID starting with 755affc83f2ef84b3880fe7174ec8e87f766b88bfd198d5173e255d0ada9b887 not found: ID does not exist" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.433202 5030 scope.go:117] "RemoveContainer" containerID="4c65ea1f851941676d7928ea51d2203d75494f65b132d93fda2f4070408fc80c" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.448256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.461987 5030 scope.go:117] "RemoveContainer" containerID="b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.470093 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-m59dl"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.485540 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.488776 5030 scope.go:117] "RemoveContainer" containerID="4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.494409 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.508270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.517472 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6dcc6c7f8c-np8dd"] Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.528679 5030 scope.go:117] "RemoveContainer" containerID="b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab" Jan 21 00:44:39 crc kubenswrapper[5030]: E0121 00:44:39.529374 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab\": container with ID starting with b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab not found: ID does not exist" 
containerID="b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.529435 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab"} err="failed to get container status \"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab\": rpc error: code = NotFound desc = could not find container \"b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab\": container with ID starting with b43618f37e6fd146988e55f65eb23ca78ad00971c8fd42c8f85bb61794f473ab not found: ID does not exist" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.529472 5030 scope.go:117] "RemoveContainer" containerID="4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440" Jan 21 00:44:39 crc kubenswrapper[5030]: E0121 00:44:39.529841 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440\": container with ID starting with 4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440 not found: ID does not exist" containerID="4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.529886 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440"} err="failed to get container status \"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440\": rpc error: code = NotFound desc = could not find container \"4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440\": container with ID starting with 4a561d8a37b7ed574eeac7ab94192f94f75cebcb0c5335b3a5b084bc65d47440 not found: ID does not exist" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.638823 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/openstack-galera-0" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="galera" containerID="cri-o://16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c" gracePeriod=26 Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.969043 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" path="/var/lib/kubelet/pods/8ed0d97f-1c38-4cdd-bacf-507b3ca9442d/volumes" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.969548 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa7767c-5909-48ed-9d3f-8046f21ee493" path="/var/lib/kubelet/pods/cfa7767c-5909-48ed-9d3f-8046f21ee493/volumes" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.970118 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" path="/var/lib/kubelet/pods/d6c9fb65-9372-46f7-b41e-bddf586f31d4/volumes" Jan 21 00:44:39 crc kubenswrapper[5030]: I0121 00:44:39.971183 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" path="/var/lib/kubelet/pods/f7cb0fd7-6a53-4ab5-a144-9495044acffb/volumes" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.429014 5030 generic.go:334] "Generic (PLEG): container finished" podID="59579353-499a-4005-8702-38e5be21d855" containerID="e6836d2cd0525dcc81f5327e90f13f4b8342fe25b3f7ca5d7231bc3df10a4354" exitCode=0 Jan 21 00:44:40 crc 
kubenswrapper[5030]: I0121 00:44:40.429083 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerDied","Data":"e6836d2cd0525dcc81f5327e90f13f4b8342fe25b3f7ca5d7231bc3df10a4354"} Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.649369 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.734254 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774119 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774185 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774241 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774309 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g987\" (UniqueName: \"kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.774409 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config\") pod \"59579353-499a-4005-8702-38e5be21d855\" (UID: \"59579353-499a-4005-8702-38e5be21d855\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.775010 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.775239 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.775440 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.776234 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.782400 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987" (OuterVolumeSpecName: "kube-api-access-9g987") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "kube-api-access-9g987". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.787403 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "59579353-499a-4005-8702-38e5be21d855" (UID: "59579353-499a-4005-8702-38e5be21d855"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875578 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875746 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phztj\" (UniqueName: \"kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875906 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.875990 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default\") pod \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\" (UID: \"0577056e-7cb4-41ab-8763-f97dcbfa4f37\") " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.876002 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.876378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.876748 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.876879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.876999 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877036 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877053 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877074 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/59579353-499a-4005-8702-38e5be21d855-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877088 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g987\" (UniqueName: \"kubernetes.io/projected/59579353-499a-4005-8702-38e5be21d855-kube-api-access-9g987\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877101 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.877160 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59579353-499a-4005-8702-38e5be21d855-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.882251 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj" (OuterVolumeSpecName: "kube-api-access-phztj") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "kube-api-access-phztj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.885246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "0577056e-7cb4-41ab-8763-f97dcbfa4f37" (UID: "0577056e-7cb4-41ab-8763-f97dcbfa4f37"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.896778 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978127 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phztj\" (UniqueName: \"kubernetes.io/projected/0577056e-7cb4-41ab-8763-f97dcbfa4f37-kube-api-access-phztj\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978168 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978183 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978194 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978204 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0577056e-7cb4-41ab-8763-f97dcbfa4f37-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.978213 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:40 crc kubenswrapper[5030]: I0121 00:44:40.990119 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.079695 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.438997 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"59579353-499a-4005-8702-38e5be21d855","Type":"ContainerDied","Data":"062e6991dd6a95ed2a8fbda6821475998bd94411ac73c979320f305a0ce8fab8"} Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.439046 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.439052 5030 scope.go:117] "RemoveContainer" containerID="e6836d2cd0525dcc81f5327e90f13f4b8342fe25b3f7ca5d7231bc3df10a4354" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.442163 5030 generic.go:334] "Generic (PLEG): container finished" podID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerID="16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c" exitCode=0 Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.442194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerDied","Data":"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c"} Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.442219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"0577056e-7cb4-41ab-8763-f97dcbfa4f37","Type":"ContainerDied","Data":"f8a787fc12fb04eaf61b204fee5319b04c4a69251ac36a51f9b6d9e3274bb785"} Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.442369 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.460717 5030 scope.go:117] "RemoveContainer" containerID="9a0d8a8b1a729f99b29930c0d5e0f8a964b5aa9e6ff5d124174b406f41407965" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.477100 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.485166 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.490758 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.495491 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.503772 5030 scope.go:117] "RemoveContainer" containerID="16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.530965 5030 scope.go:117] "RemoveContainer" containerID="af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.546235 5030 scope.go:117] "RemoveContainer" containerID="16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c" Jan 21 00:44:41 crc kubenswrapper[5030]: E0121 00:44:41.546582 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c\": container with ID starting with 16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c not found: ID does not exist" containerID="16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.546612 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c"} err="failed to get container status \"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c\": rpc error: code = NotFound desc = could not find 
container \"16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c\": container with ID starting with 16a95d1de8157d23ee43ea886c42e2f2eb5193fe74f30e540f791006d0ed323c not found: ID does not exist" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.546652 5030 scope.go:117] "RemoveContainer" containerID="af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2" Jan 21 00:44:41 crc kubenswrapper[5030]: E0121 00:44:41.546989 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2\": container with ID starting with af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2 not found: ID does not exist" containerID="af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.547039 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2"} err="failed to get container status \"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2\": rpc error: code = NotFound desc = could not find container \"af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2\": container with ID starting with af676271e3940564b0e02d55fa6babe6225fb1646cef723aa133def11239e1b2 not found: ID does not exist" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.972441 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" path="/var/lib/kubelet/pods/0577056e-7cb4-41ab-8763-f97dcbfa4f37/volumes" Jan 21 00:44:41 crc kubenswrapper[5030]: I0121 00:44:41.973599 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59579353-499a-4005-8702-38e5be21d855" path="/var/lib/kubelet/pods/59579353-499a-4005-8702-38e5be21d855/volumes" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.191183 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.191703 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" podUID="19cc407a-76e6-4487-bb8b-23286acdd0a5" containerName="manager" containerID="cri-o://b99755c77b32a58453549ab8c94e9e24439bab281d44c2fb01f200043ced6c0b" gracePeriod=10 Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.439441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.447112 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-7c4sv" podUID="86f764f9-6250-4fda-8c47-96f9795a2fc6" containerName="registry-server" containerID="cri-o://70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412" gracePeriod=30 Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.491579 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22"] Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.498307 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c19jf22"] Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.498552 
5030 generic.go:334] "Generic (PLEG): container finished" podID="19cc407a-76e6-4487-bb8b-23286acdd0a5" containerID="b99755c77b32a58453549ab8c94e9e24439bab281d44c2fb01f200043ced6c0b" exitCode=0 Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.498583 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" event={"ID":"19cc407a-76e6-4487-bb8b-23286acdd0a5","Type":"ContainerDied","Data":"b99755c77b32a58453549ab8c94e9e24439bab281d44c2fb01f200043ced6c0b"} Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.726265 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.792475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert\") pod \"19cc407a-76e6-4487-bb8b-23286acdd0a5\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.792806 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfdqj\" (UniqueName: \"kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj\") pod \"19cc407a-76e6-4487-bb8b-23286acdd0a5\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.792873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert\") pod \"19cc407a-76e6-4487-bb8b-23286acdd0a5\" (UID: \"19cc407a-76e6-4487-bb8b-23286acdd0a5\") " Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.809992 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "19cc407a-76e6-4487-bb8b-23286acdd0a5" (UID: "19cc407a-76e6-4487-bb8b-23286acdd0a5"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.813003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj" (OuterVolumeSpecName: "kube-api-access-rfdqj") pod "19cc407a-76e6-4487-bb8b-23286acdd0a5" (UID: "19cc407a-76e6-4487-bb8b-23286acdd0a5"). InnerVolumeSpecName "kube-api-access-rfdqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.813127 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "19cc407a-76e6-4487-bb8b-23286acdd0a5" (UID: "19cc407a-76e6-4487-bb8b-23286acdd0a5"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.868850 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.901098 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.901144 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfdqj\" (UniqueName: \"kubernetes.io/projected/19cc407a-76e6-4487-bb8b-23286acdd0a5-kube-api-access-rfdqj\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:48 crc kubenswrapper[5030]: I0121 00:44:48.901163 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/19cc407a-76e6-4487-bb8b-23286acdd0a5-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.002366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27t47\" (UniqueName: \"kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47\") pod \"86f764f9-6250-4fda-8c47-96f9795a2fc6\" (UID: \"86f764f9-6250-4fda-8c47-96f9795a2fc6\") " Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.005824 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47" (OuterVolumeSpecName: "kube-api-access-27t47") pod "86f764f9-6250-4fda-8c47-96f9795a2fc6" (UID: "86f764f9-6250-4fda-8c47-96f9795a2fc6"). InnerVolumeSpecName "kube-api-access-27t47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.104718 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27t47\" (UniqueName: \"kubernetes.io/projected/86f764f9-6250-4fda-8c47-96f9795a2fc6-kube-api-access-27t47\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.508587 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" event={"ID":"19cc407a-76e6-4487-bb8b-23286acdd0a5","Type":"ContainerDied","Data":"adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5"} Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.508683 5030 scope.go:117] "RemoveContainer" containerID="b99755c77b32a58453549ab8c94e9e24439bab281d44c2fb01f200043ced6c0b" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.508825 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-756b778986-4q6xf" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.511270 5030 generic.go:334] "Generic (PLEG): container finished" podID="86f764f9-6250-4fda-8c47-96f9795a2fc6" containerID="70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412" exitCode=0 Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.511327 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7c4sv" event={"ID":"86f764f9-6250-4fda-8c47-96f9795a2fc6","Type":"ContainerDied","Data":"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412"} Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.511368 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7c4sv" event={"ID":"86f764f9-6250-4fda-8c47-96f9795a2fc6","Type":"ContainerDied","Data":"0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0"} Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.511331 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-7c4sv" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.528836 5030 scope.go:117] "RemoveContainer" containerID="70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.544745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.548544 5030 scope.go:117] "RemoveContainer" containerID="70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412" Jan 21 00:44:49 crc kubenswrapper[5030]: E0121 00:44:49.549049 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412\": container with ID starting with 70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412 not found: ID does not exist" containerID="70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.549075 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412"} err="failed to get container status \"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412\": rpc error: code = NotFound desc = could not find container \"70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412\": container with ID starting with 70686d4e0a9ea3c67cae06b5d1c08584f81bc6d248bca11263fe35cbd871f412 not found: ID does not exist" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.554735 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-7c4sv"] Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.561124 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.567245 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-756b778986-4q6xf"] Jan 21 00:44:49 crc kubenswrapper[5030]: E0121 00:44:49.631309 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f764f9_6250_4fda_8c47_96f9795a2fc6.slice/crio-0022c71671268cbfb7afb7baae1366bf352f3809e953d0c4b016fe9892c877c0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cc407a_76e6_4487_bb8b_23286acdd0a5.slice/crio-adf4b820821bb53d02891411b4a0bd1167450ea920bd645ae734937c321984e5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f764f9_6250_4fda_8c47_96f9795a2fc6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cc407a_76e6_4487_bb8b_23286acdd0a5.slice\": RecentStats: unable to find data in memory cache]" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.973010 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cc407a-76e6-4487-bb8b-23286acdd0a5" path="/var/lib/kubelet/pods/19cc407a-76e6-4487-bb8b-23286acdd0a5/volumes" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.973986 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f764f9-6250-4fda-8c47-96f9795a2fc6" path="/var/lib/kubelet/pods/86f764f9-6250-4fda-8c47-96f9795a2fc6/volumes" Jan 21 00:44:49 crc kubenswrapper[5030]: I0121 00:44:49.974654 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e006733-b13c-4371-922e-b87b7a68a197" path="/var/lib/kubelet/pods/8e006733-b13c-4371-922e-b87b7a68a197/volumes" Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.496278 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.496526 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" podUID="c46a4962-cf58-47ea-8ea6-b603605c747b" containerName="manager" containerID="cri-o://7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee" gracePeriod=10 Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.768855 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.769089 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-fgsht" podUID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" containerName="registry-server" containerID="cri-o://c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506" gracePeriod=30 Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.801637 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv"] Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.810370 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bzvplv"] Jan 21 00:44:50 crc kubenswrapper[5030]: I0121 00:44:50.975396 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.037268 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlqts\" (UniqueName: \"kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts\") pod \"c46a4962-cf58-47ea-8ea6-b603605c747b\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.037421 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert\") pod \"c46a4962-cf58-47ea-8ea6-b603605c747b\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.037478 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert\") pod \"c46a4962-cf58-47ea-8ea6-b603605c747b\" (UID: \"c46a4962-cf58-47ea-8ea6-b603605c747b\") " Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.060963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "c46a4962-cf58-47ea-8ea6-b603605c747b" (UID: "c46a4962-cf58-47ea-8ea6-b603605c747b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.061064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "c46a4962-cf58-47ea-8ea6-b603605c747b" (UID: "c46a4962-cf58-47ea-8ea6-b603605c747b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.070278 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts" (OuterVolumeSpecName: "kube-api-access-qlqts") pod "c46a4962-cf58-47ea-8ea6-b603605c747b" (UID: "c46a4962-cf58-47ea-8ea6-b603605c747b"). InnerVolumeSpecName "kube-api-access-qlqts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.139398 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlqts\" (UniqueName: \"kubernetes.io/projected/c46a4962-cf58-47ea-8ea6-b603605c747b-kube-api-access-qlqts\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.139432 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.139442 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c46a4962-cf58-47ea-8ea6-b603605c747b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.176495 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.240569 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klbpj\" (UniqueName: \"kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj\") pod \"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59\" (UID: \"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59\") " Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.243919 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj" (OuterVolumeSpecName: "kube-api-access-klbpj") pod "c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" (UID: "c09abaf9-84c0-44d0-8aa0-5d930ccf4d59"). InnerVolumeSpecName "kube-api-access-klbpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.342145 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klbpj\" (UniqueName: \"kubernetes.io/projected/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59-kube-api-access-klbpj\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.538987 5030 generic.go:334] "Generic (PLEG): container finished" podID="c46a4962-cf58-47ea-8ea6-b603605c747b" containerID="7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee" exitCode=0 Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.539056 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" event={"ID":"c46a4962-cf58-47ea-8ea6-b603605c747b","Type":"ContainerDied","Data":"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee"} Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.539057 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.539084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d" event={"ID":"c46a4962-cf58-47ea-8ea6-b603605c747b","Type":"ContainerDied","Data":"a6690e7cba154e5d0b3b06be4e6e6081f32a5a7ba133ca5e9f8d152f72974112"} Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.539106 5030 scope.go:117] "RemoveContainer" containerID="7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.542396 5030 generic.go:334] "Generic (PLEG): container finished" podID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" containerID="c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506" exitCode=0 Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.542449 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-fgsht" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.542453 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-fgsht" event={"ID":"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59","Type":"ContainerDied","Data":"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506"} Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.543685 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-fgsht" event={"ID":"c09abaf9-84c0-44d0-8aa0-5d930ccf4d59","Type":"ContainerDied","Data":"a981fc1ec3dda9b939ec3252eec526ce1013e8a3adf73257fbb3c9b79410b7ea"} Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.561486 5030 scope.go:117] "RemoveContainer" containerID="7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee" Jan 21 00:44:51 crc kubenswrapper[5030]: E0121 00:44:51.562543 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee\": container with ID starting with 7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee not found: ID does not exist" containerID="7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.562578 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee"} err="failed to get container status \"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee\": rpc error: code = NotFound desc = could not find container \"7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee\": container with ID starting with 7820aa1133a01328a28dcaa37da8286034c2e6a5e219291842df1f7cc562f7ee not found: ID does not exist" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.562615 5030 scope.go:117] "RemoveContainer" containerID="c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.573436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.584355 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-547684f4d7-2vk2d"] Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.591360 5030 scope.go:117] "RemoveContainer" containerID="c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506" Jan 21 00:44:51 crc kubenswrapper[5030]: E0121 00:44:51.591944 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506\": container with ID starting with c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506 not found: ID does not exist" containerID="c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.591973 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506"} err="failed to get container status \"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506\": rpc error: code = NotFound desc = 
could not find container \"c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506\": container with ID starting with c3666ef5c94bc23c780da6d907fababf6dd4c49bf284beafc162d95fbe105506 not found: ID does not exist" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.611720 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.621194 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-fgsht"] Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.973682 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab25a3ce-1d1e-4621-93d3-51d691be4b9f" path="/var/lib/kubelet/pods/ab25a3ce-1d1e-4621-93d3-51d691be4b9f/volumes" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.974290 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" path="/var/lib/kubelet/pods/c09abaf9-84c0-44d0-8aa0-5d930ccf4d59/volumes" Jan 21 00:44:51 crc kubenswrapper[5030]: I0121 00:44:51.974777 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46a4962-cf58-47ea-8ea6-b603605c747b" path="/var/lib/kubelet/pods/c46a4962-cf58-47ea-8ea6-b603605c747b/volumes" Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.121157 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.121757 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" podUID="0af03329-faed-4eeb-bad2-482e5318f619" containerName="operator" containerID="cri-o://6258392ce7c15b7d00a12401c0196dfcbd9e49f57f5074754f05eb4ee62206f3" gracePeriod=10 Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.460604 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.460875 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" podUID="4f47d75b-9f36-4d13-8b91-a5836565f670" containerName="registry-server" containerID="cri-o://0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177" gracePeriod=30 Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.495145 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj"] Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.501097 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e59028bbj"] Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.559806 5030 generic.go:334] "Generic (PLEG): container finished" podID="0af03329-faed-4eeb-bad2-482e5318f619" containerID="6258392ce7c15b7d00a12401c0196dfcbd9e49f57f5074754f05eb4ee62206f3" exitCode=0 Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.559861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" event={"ID":"0af03329-faed-4eeb-bad2-482e5318f619","Type":"ContainerDied","Data":"6258392ce7c15b7d00a12401c0196dfcbd9e49f57f5074754f05eb4ee62206f3"} Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.930174 5030 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:44:53 crc kubenswrapper[5030]: I0121 00:44:53.978808 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df48f69d-ead6-4779-a18d-9bcdd762986b" path="/var/lib/kubelet/pods/df48f69d-ead6-4779-a18d-9bcdd762986b/volumes" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.062932 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.089826 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmm4\" (UniqueName: \"kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4\") pod \"4f47d75b-9f36-4d13-8b91-a5836565f670\" (UID: \"4f47d75b-9f36-4d13-8b91-a5836565f670\") " Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.095811 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4" (OuterVolumeSpecName: "kube-api-access-smmm4") pod "4f47d75b-9f36-4d13-8b91-a5836565f670" (UID: "4f47d75b-9f36-4d13-8b91-a5836565f670"). InnerVolumeSpecName "kube-api-access-smmm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.191417 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgp57\" (UniqueName: \"kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57\") pod \"0af03329-faed-4eeb-bad2-482e5318f619\" (UID: \"0af03329-faed-4eeb-bad2-482e5318f619\") " Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.191742 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmm4\" (UniqueName: \"kubernetes.io/projected/4f47d75b-9f36-4d13-8b91-a5836565f670-kube-api-access-smmm4\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.195874 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57" (OuterVolumeSpecName: "kube-api-access-hgp57") pod "0af03329-faed-4eeb-bad2-482e5318f619" (UID: "0af03329-faed-4eeb-bad2-482e5318f619"). InnerVolumeSpecName "kube-api-access-hgp57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.292420 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgp57\" (UniqueName: \"kubernetes.io/projected/0af03329-faed-4eeb-bad2-482e5318f619-kube-api-access-hgp57\") on node \"crc\" DevicePath \"\"" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.567142 5030 generic.go:334] "Generic (PLEG): container finished" podID="4f47d75b-9f36-4d13-8b91-a5836565f670" containerID="0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177" exitCode=0 Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.567203 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.567243 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" event={"ID":"4f47d75b-9f36-4d13-8b91-a5836565f670","Type":"ContainerDied","Data":"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177"} Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.567286 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-jbt6q" event={"ID":"4f47d75b-9f36-4d13-8b91-a5836565f670","Type":"ContainerDied","Data":"471c1c0a97416d2552eb6b59d2ff6ac9fcf8dd8b408710a0ddf295ebea029726"} Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.567306 5030 scope.go:117] "RemoveContainer" containerID="0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.568959 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" event={"ID":"0af03329-faed-4eeb-bad2-482e5318f619","Type":"ContainerDied","Data":"f966472710d68c224401985a89400e07f2d9aec8871f1fa521cf7808650fb609"} Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.568983 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.588425 5030 scope.go:117] "RemoveContainer" containerID="0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177" Jan 21 00:44:54 crc kubenswrapper[5030]: E0121 00:44:54.588872 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177\": container with ID starting with 0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177 not found: ID does not exist" containerID="0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.588929 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177"} err="failed to get container status \"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177\": rpc error: code = NotFound desc = could not find container \"0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177\": container with ID starting with 0e9fbc48aca84c99ee185bfefc0b80cedae43d7f4c8af54a06705865c87c0177 not found: ID does not exist" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.588954 5030 scope.go:117] "RemoveContainer" containerID="6258392ce7c15b7d00a12401c0196dfcbd9e49f57f5074754f05eb4ee62206f3" Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.617495 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.623173 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-jbt6q"] Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.628336 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:44:54 crc kubenswrapper[5030]: I0121 00:44:54.633460 5030 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-6kpl5"] Jan 21 00:44:55 crc kubenswrapper[5030]: I0121 00:44:55.969320 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af03329-faed-4eeb-bad2-482e5318f619" path="/var/lib/kubelet/pods/0af03329-faed-4eeb-bad2-482e5318f619/volumes" Jan 21 00:44:55 crc kubenswrapper[5030]: I0121 00:44:55.970011 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f47d75b-9f36-4d13-8b91-a5836565f670" path="/var/lib/kubelet/pods/4f47d75b-9f36-4d13-8b91-a5836565f670/volumes" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.158762 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns"] Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159371 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159386 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159403 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159409 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159423 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="rabbitmq" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159429 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="rabbitmq" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159441 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159447 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" containerName="memcached" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159460 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" containerName="memcached" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159474 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159480 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159491 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159497 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159507 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159513 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159524 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="setup-container" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159529 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="setup-container" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159542 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af03329-faed-4eeb-bad2-482e5318f619" containerName="operator" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159549 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af03329-faed-4eeb-bad2-482e5318f619" containerName="operator" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159561 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f764f9-6250-4fda-8c47-96f9795a2fc6" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159571 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f764f9-6250-4fda-8c47-96f9795a2fc6" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159581 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159589 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159596 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159604 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159636 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f47d75b-9f36-4d13-8b91-a5836565f670" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159644 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f47d75b-9f36-4d13-8b91-a5836565f670" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159656 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159664 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="mysql-bootstrap" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159672 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46a4962-cf58-47ea-8ea6-b603605c747b" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159681 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46a4962-cf58-47ea-8ea6-b603605c747b" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: E0121 00:45:00.159692 5030 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="19cc407a-76e6-4487-bb8b-23286acdd0a5" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159699 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cc407a-76e6-4487-bb8b-23286acdd0a5" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159808 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cc407a-76e6-4487-bb8b-23286acdd0a5" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159819 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cb0fd7-6a53-4ab5-a144-9495044acffb" containerName="rabbitmq" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159833 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f764f9-6250-4fda-8c47-96f9795a2fc6" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159840 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af03329-faed-4eeb-bad2-482e5318f619" containerName="operator" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159848 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c9fb65-9372-46f7-b41e-bddf586f31d4" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159855 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8db42f-632b-4d70-9c6f-1bd38c0f1c47" containerName="memcached" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159866 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="0577056e-7cb4-41ab-8763-f97dcbfa4f37" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159874 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af3725b-a540-4b57-9655-0087cea23aa0" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159880 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09abaf9-84c0-44d0-8aa0-5d930ccf4d59" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159892 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f47d75b-9f36-4d13-8b91-a5836565f670" containerName="registry-server" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159901 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46a4962-cf58-47ea-8ea6-b603605c747b" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159910 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed0d97f-1c38-4cdd-bacf-507b3ca9442d" containerName="manager" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.159917 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="59579353-499a-4005-8702-38e5be21d855" containerName="galera" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.160418 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.163123 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.163248 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.167218 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns"] Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.276970 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.277026 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.277115 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lzlq\" (UniqueName: \"kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.378511 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.378864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lzlq\" (UniqueName: \"kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.379027 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.380077 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume\") pod 
\"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.388517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.395712 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lzlq\" (UniqueName: \"kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq\") pod \"collect-profiles-29482605-cs5ns\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.489350 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:00 crc kubenswrapper[5030]: I0121 00:45:00.901715 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns"] Jan 21 00:45:01 crc kubenswrapper[5030]: I0121 00:45:01.632083 5030 generic.go:334] "Generic (PLEG): container finished" podID="786c9aee-56d9-48c8-8a1d-eafb58602db2" containerID="01a00cdffaa9716dc9aab8d7a0ade65f03582010d732231552948440d7f5f5f5" exitCode=0 Jan 21 00:45:01 crc kubenswrapper[5030]: I0121 00:45:01.632194 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" event={"ID":"786c9aee-56d9-48c8-8a1d-eafb58602db2","Type":"ContainerDied","Data":"01a00cdffaa9716dc9aab8d7a0ade65f03582010d732231552948440d7f5f5f5"} Jan 21 00:45:01 crc kubenswrapper[5030]: I0121 00:45:01.632361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" event={"ID":"786c9aee-56d9-48c8-8a1d-eafb58602db2","Type":"ContainerStarted","Data":"d220b6f295f4bd0b1b2646cc760c17232da3218ce8e78f63ed692b5b74e1a3e6"} Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.915043 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.926783 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume\") pod \"786c9aee-56d9-48c8-8a1d-eafb58602db2\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.926936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lzlq\" (UniqueName: \"kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq\") pod \"786c9aee-56d9-48c8-8a1d-eafb58602db2\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.926985 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume\") pod \"786c9aee-56d9-48c8-8a1d-eafb58602db2\" (UID: \"786c9aee-56d9-48c8-8a1d-eafb58602db2\") " Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.927926 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume" (OuterVolumeSpecName: "config-volume") pod "786c9aee-56d9-48c8-8a1d-eafb58602db2" (UID: "786c9aee-56d9-48c8-8a1d-eafb58602db2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.933453 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "786c9aee-56d9-48c8-8a1d-eafb58602db2" (UID: "786c9aee-56d9-48c8-8a1d-eafb58602db2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:45:02 crc kubenswrapper[5030]: I0121 00:45:02.934372 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq" (OuterVolumeSpecName: "kube-api-access-2lzlq") pod "786c9aee-56d9-48c8-8a1d-eafb58602db2" (UID: "786c9aee-56d9-48c8-8a1d-eafb58602db2"). InnerVolumeSpecName "kube-api-access-2lzlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.028715 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lzlq\" (UniqueName: \"kubernetes.io/projected/786c9aee-56d9-48c8-8a1d-eafb58602db2-kube-api-access-2lzlq\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.029024 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/786c9aee-56d9-48c8-8a1d-eafb58602db2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.029110 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/786c9aee-56d9-48c8-8a1d-eafb58602db2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.646108 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" event={"ID":"786c9aee-56d9-48c8-8a1d-eafb58602db2","Type":"ContainerDied","Data":"d220b6f295f4bd0b1b2646cc760c17232da3218ce8e78f63ed692b5b74e1a3e6"} Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.646152 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.646162 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d220b6f295f4bd0b1b2646cc760c17232da3218ce8e78f63ed692b5b74e1a3e6" Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.982110 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn"] Jan 21 00:45:03 crc kubenswrapper[5030]: I0121 00:45:03.987677 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482560-xcgvn"] Jan 21 00:45:05 crc kubenswrapper[5030]: I0121 00:45:05.976238 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc" path="/var/lib/kubelet/pods/c290e3b2-94f3-4ba3-bbb0-5c4ec1933edc/volumes" Jan 21 00:45:10 crc kubenswrapper[5030]: I0121 00:45:10.157197 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:45:10 crc kubenswrapper[5030]: I0121 00:45:10.157985 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.011150 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:22 crc kubenswrapper[5030]: E0121 00:45:22.012020 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786c9aee-56d9-48c8-8a1d-eafb58602db2" containerName="collect-profiles" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.012036 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="786c9aee-56d9-48c8-8a1d-eafb58602db2" containerName="collect-profiles" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.012224 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="786c9aee-56d9-48c8-8a1d-eafb58602db2" containerName="collect-profiles" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.013413 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.038588 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.115976 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk4wd\" (UniqueName: \"kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.116147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.116258 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.217792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.218210 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk4wd\" (UniqueName: \"kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.218256 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.219187 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.219198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.239291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk4wd\" (UniqueName: \"kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd\") pod \"redhat-operators-66xwl\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.340217 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.785717 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:22 crc kubenswrapper[5030]: I0121 00:45:22.800610 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerStarted","Data":"c2eeccce23b62579d5326d88a1da01c929123d0a98c034ee833ad548f538471c"} Jan 21 00:45:23 crc kubenswrapper[5030]: I0121 00:45:23.810927 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerID="872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660" exitCode=0 Jan 21 00:45:23 crc kubenswrapper[5030]: I0121 00:45:23.811017 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerDied","Data":"872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660"} Jan 21 00:45:23 crc kubenswrapper[5030]: I0121 00:45:23.813212 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:45:24 crc kubenswrapper[5030]: I0121 00:45:24.829660 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerStarted","Data":"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991"} Jan 21 00:45:25 crc kubenswrapper[5030]: I0121 00:45:25.837967 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerID="3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991" exitCode=0 Jan 21 00:45:25 crc kubenswrapper[5030]: I0121 00:45:25.838049 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerDied","Data":"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991"} Jan 21 00:45:26 crc kubenswrapper[5030]: I0121 00:45:26.847388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerStarted","Data":"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0"} Jan 21 00:45:26 crc kubenswrapper[5030]: I0121 00:45:26.873440 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-66xwl" podStartSLOduration=3.445776258 podStartE2EDuration="5.873422389s" 
podCreationTimestamp="2026-01-21 00:45:21 +0000 UTC" firstStartedPulling="2026-01-21 00:45:23.812909801 +0000 UTC m=+7796.133170089" lastFinishedPulling="2026-01-21 00:45:26.240555932 +0000 UTC m=+7798.560816220" observedRunningTime="2026-01-21 00:45:26.870445498 +0000 UTC m=+7799.190705796" watchObservedRunningTime="2026-01-21 00:45:26.873422389 +0000 UTC m=+7799.193682697" Jan 21 00:45:32 crc kubenswrapper[5030]: I0121 00:45:32.340940 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:32 crc kubenswrapper[5030]: I0121 00:45:32.341593 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:32 crc kubenswrapper[5030]: I0121 00:45:32.388238 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:32 crc kubenswrapper[5030]: I0121 00:45:32.928345 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:32 crc kubenswrapper[5030]: I0121 00:45:32.980045 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.290624 5030 scope.go:117] "RemoveContainer" containerID="bf2100e8315acf3920b1c710954850e1b86e673c2a40eeccdc4165aba597b394" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.324476 5030 scope.go:117] "RemoveContainer" containerID="2470bd93dd8f3ad9fc7ced919be5682dafb8952e759911e4dfef592d61dd9387" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.343606 5030 scope.go:117] "RemoveContainer" containerID="a99af352c8b003deea9af8fd281ac887d2d89e23b061a480da35bfdb2e0b70c0" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.371312 5030 scope.go:117] "RemoveContainer" containerID="665c94c69f82a90b6f6b3a7d0f22f91a514d01ed6d9711663321031bedb6f5fd" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.400338 5030 scope.go:117] "RemoveContainer" containerID="0af5d8f42d6307b99b24b85866749933415ebe7af60453391f53f910014ffa6c" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.422577 5030 scope.go:117] "RemoveContainer" containerID="e84b6ed2beeb34a63c1a4828cc53135a0fa535c31340599e05b0dc1db6b4cb5c" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.440681 5030 scope.go:117] "RemoveContainer" containerID="b5d63c50bd5504ebbdb52eedc03b351d2a212b1cd402d8f005df68b3ba72e750" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.456280 5030 scope.go:117] "RemoveContainer" containerID="8219be5f3668ca7881c7749e39a10f27cb54e97f6465d96081a6bd72380f6803" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.472032 5030 scope.go:117] "RemoveContainer" containerID="96ee8ddb1e016ee8f7bab5581ed6f41fcce35cfd02a63c7f4769776f20939482" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.488814 5030 scope.go:117] "RemoveContainer" containerID="a3cfc47eb4f253bcf7c9049a5a3f1353b2fd4703fd12e31c960072b49ebaf3c8" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.504904 5030 scope.go:117] "RemoveContainer" containerID="ce34f37c43d1ef02c865bcfa15b6652fa8311a3a8ebf277a435dce37d5e3dad0" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.531083 5030 scope.go:117] "RemoveContainer" containerID="7a833b3ae51014b4e193ee088e7e1b19f9d2fec8fb5cd1d43e8d27cfbc42c368" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.552106 5030 scope.go:117] "RemoveContainer" 
containerID="e11e07bcf23ae86edc15b54aa67d20fe38ea1d2124730c4723e7388e0bd2d17e" Jan 21 00:45:34 crc kubenswrapper[5030]: I0121 00:45:34.904632 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-66xwl" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="registry-server" containerID="cri-o://dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0" gracePeriod=2 Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.031721 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.033394 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.050768 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.097218 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qwn8\" (UniqueName: \"kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.097276 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.097314 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.198696 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qwn8\" (UniqueName: \"kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.199124 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.199146 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.199774 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.199943 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.219735 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qwn8\" (UniqueName: \"kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8\") pod \"certified-operators-s72mm\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.362880 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.375508 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.506535 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities\") pod \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.507034 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk4wd\" (UniqueName: \"kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd\") pod \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.507088 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content\") pod \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\" (UID: \"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae\") " Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.513658 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities" (OuterVolumeSpecName: "utilities") pod "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" (UID: "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.530855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd" (OuterVolumeSpecName: "kube-api-access-fk4wd") pod "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" (UID: "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae"). InnerVolumeSpecName "kube-api-access-fk4wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.612205 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk4wd\" (UniqueName: \"kubernetes.io/projected/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-kube-api-access-fk4wd\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.612274 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.670707 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.914931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerStarted","Data":"0e7ed36bd97c6577f46b1cd03c4725b705520a81a56ff32c9e3b7e4e21edb394"} Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.917131 5030 generic.go:334] "Generic (PLEG): container finished" podID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerID="dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0" exitCode=0 Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.917160 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerDied","Data":"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0"} Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.917180 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-66xwl" event={"ID":"b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae","Type":"ContainerDied","Data":"c2eeccce23b62579d5326d88a1da01c929123d0a98c034ee833ad548f538471c"} Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.917199 5030 scope.go:117] "RemoveContainer" containerID="dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.917234 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-66xwl" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.934669 5030 scope.go:117] "RemoveContainer" containerID="3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.958254 5030 scope.go:117] "RemoveContainer" containerID="872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.973769 5030 scope.go:117] "RemoveContainer" containerID="dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0" Jan 21 00:45:35 crc kubenswrapper[5030]: E0121 00:45:35.974247 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0\": container with ID starting with dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0 not found: ID does not exist" containerID="dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.974287 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0"} err="failed to get container status \"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0\": rpc error: code = NotFound desc = could not find container \"dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0\": container with ID starting with dadbfd46a7e9b63d07c04e8e68ee9ee14cf1f92be4756f4a20115871311463a0 not found: ID does not exist" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.974315 5030 scope.go:117] "RemoveContainer" containerID="3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991" Jan 21 00:45:35 crc kubenswrapper[5030]: E0121 00:45:35.974706 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991\": container with ID starting with 3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991 not found: ID does not exist" containerID="3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.974735 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991"} err="failed to get container status \"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991\": rpc error: code = NotFound desc = could not find container \"3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991\": container with ID starting with 3cc0895eaf79bb4cffac217e2ae0f260fd859db4bd75e4796412195b84753991 not found: ID does not exist" Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.974764 5030 scope.go:117] "RemoveContainer" containerID="872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660" Jan 21 00:45:35 crc kubenswrapper[5030]: E0121 00:45:35.974989 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660\": container with ID starting with 872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660 not found: ID does not exist" containerID="872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660" 
Jan 21 00:45:35 crc kubenswrapper[5030]: I0121 00:45:35.975012 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660"} err="failed to get container status \"872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660\": rpc error: code = NotFound desc = could not find container \"872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660\": container with ID starting with 872dafea92e140879875efd1951bb35a6e66aadad4a3946142bbc28199a4b660 not found: ID does not exist" Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.706029 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" (UID: "b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.727809 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.852015 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.860724 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-66xwl"] Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.925447 5030 generic.go:334] "Generic (PLEG): container finished" podID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerID="922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086" exitCode=0 Jan 21 00:45:36 crc kubenswrapper[5030]: I0121 00:45:36.925496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerDied","Data":"922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086"} Jan 21 00:45:37 crc kubenswrapper[5030]: I0121 00:45:37.940755 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerStarted","Data":"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18"} Jan 21 00:45:37 crc kubenswrapper[5030]: I0121 00:45:37.979840 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" path="/var/lib/kubelet/pods/b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae/volumes" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.469975 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.477321 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a25lhs"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.483595 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.492077 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc9vkrm"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.503978 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.510913 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7135f7bk"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.517614 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.523893 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08sh756"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.534016 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.534386 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rpccd" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="registry-server" containerID="cri-o://e0eae01eca9efd3b51d95b741a01b46c7fe7f2fbdb5f37fbf2681b21e9f7e81f" gracePeriod=30 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.546550 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.554465 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.554789 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xqz49" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="registry-server" containerID="cri-o://914eace812bf63e7559810a7997a45ad8eb574795938f005742a6218faa5d52c" gracePeriod=30 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.560335 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.560584 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerName="marketplace-operator" containerID="cri-o://c691bfa4049ecba097c1f9a152b736785f885cf82af4c678a6026f230bec332c" gracePeriod=30 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.592725 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.593112 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4tnv" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="registry-server" containerID="cri-o://f36feba575813ce4a9719fb0a36ca80d2a7ef5c1350ea62e4b1fdc18ff8abf23" gracePeriod=30 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.601882 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdn28"] Jan 21 00:45:38 crc kubenswrapper[5030]: E0121 
00:45:38.602440 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="extract-utilities" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.602465 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="extract-utilities" Jan 21 00:45:38 crc kubenswrapper[5030]: E0121 00:45:38.602486 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="registry-server" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.602495 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="registry-server" Jan 21 00:45:38 crc kubenswrapper[5030]: E0121 00:45:38.602512 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="extract-content" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.602520 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="extract-content" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.602762 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a8b7b5-ba7d-4b6a-9809-dde8db8587ae" containerName="registry-server" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.603236 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.609163 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.609852 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kf5kh" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="registry-server" containerID="cri-o://376793db80f69aae127680fc34ea4dae95d690fbea083d5f60fe681742db9c1a" gracePeriod=30 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.613177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdn28"] Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.754669 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.754737 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.754771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrb4\" (UniqueName: \"kubernetes.io/projected/576826de-d45b-4d66-99a0-ed9a215b0e2f-kube-api-access-dgrb4\") pod \"marketplace-operator-79b997595-zdn28\" (UID: 
\"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.858646 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrb4\" (UniqueName: \"kubernetes.io/projected/576826de-d45b-4d66-99a0-ed9a215b0e2f-kube-api-access-dgrb4\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.858870 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.858937 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.860584 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.868396 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/576826de-d45b-4d66-99a0-ed9a215b0e2f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.883775 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrb4\" (UniqueName: \"kubernetes.io/projected/576826de-d45b-4d66-99a0-ed9a215b0e2f-kube-api-access-dgrb4\") pod \"marketplace-operator-79b997595-zdn28\" (UID: \"576826de-d45b-4d66-99a0-ed9a215b0e2f\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.949578 5030 generic.go:334] "Generic (PLEG): container finished" podID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerID="e0eae01eca9efd3b51d95b741a01b46c7fe7f2fbdb5f37fbf2681b21e9f7e81f" exitCode=0 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.949654 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerDied","Data":"e0eae01eca9efd3b51d95b741a01b46c7fe7f2fbdb5f37fbf2681b21e9f7e81f"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.955996 5030 generic.go:334] "Generic (PLEG): container finished" podID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerID="f36feba575813ce4a9719fb0a36ca80d2a7ef5c1350ea62e4b1fdc18ff8abf23" exitCode=0 Jan 21 00:45:38 crc 
kubenswrapper[5030]: I0121 00:45:38.956133 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerDied","Data":"f36feba575813ce4a9719fb0a36ca80d2a7ef5c1350ea62e4b1fdc18ff8abf23"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.960150 5030 generic.go:334] "Generic (PLEG): container finished" podID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerID="d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18" exitCode=0 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.960208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerDied","Data":"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.963865 5030 generic.go:334] "Generic (PLEG): container finished" podID="3f09a830-0564-41d1-8e00-3232119becba" containerID="376793db80f69aae127680fc34ea4dae95d690fbea083d5f60fe681742db9c1a" exitCode=0 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.963931 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerDied","Data":"376793db80f69aae127680fc34ea4dae95d690fbea083d5f60fe681742db9c1a"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.967387 5030 generic.go:334] "Generic (PLEG): container finished" podID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerID="914eace812bf63e7559810a7997a45ad8eb574795938f005742a6218faa5d52c" exitCode=0 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.967440 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerDied","Data":"914eace812bf63e7559810a7997a45ad8eb574795938f005742a6218faa5d52c"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.967484 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xqz49" event={"ID":"046ad923-ab58-4298-83a4-7d2627b5eee9","Type":"ContainerDied","Data":"08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f"} Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.967497 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e6827d2eff5b02e65fa03484aa099068f20dd1a46a03b555be6e0d3527ab7f" Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.969533 5030 generic.go:334] "Generic (PLEG): container finished" podID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerID="c691bfa4049ecba097c1f9a152b736785f885cf82af4c678a6026f230bec332c" exitCode=0 Jan 21 00:45:38 crc kubenswrapper[5030]: I0121 00:45:38.969602 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" event={"ID":"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d","Type":"ContainerDied","Data":"c691bfa4049ecba097c1f9a152b736785f885cf82af4c678a6026f230bec332c"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.022602 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.031107 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xqz49" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.080862 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.088535 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.094776 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.099795 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.163776 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp4fx\" (UniqueName: \"kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx\") pod \"046ad923-ab58-4298-83a4-7d2627b5eee9\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.163899 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content\") pod \"046ad923-ab58-4298-83a4-7d2627b5eee9\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.163949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities\") pod \"046ad923-ab58-4298-83a4-7d2627b5eee9\" (UID: \"046ad923-ab58-4298-83a4-7d2627b5eee9\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.164888 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities" (OuterVolumeSpecName: "utilities") pod "046ad923-ab58-4298-83a4-7d2627b5eee9" (UID: "046ad923-ab58-4298-83a4-7d2627b5eee9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.169855 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx" (OuterVolumeSpecName: "kube-api-access-tp4fx") pod "046ad923-ab58-4298-83a4-7d2627b5eee9" (UID: "046ad923-ab58-4298-83a4-7d2627b5eee9"). InnerVolumeSpecName "kube-api-access-tp4fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.221287 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "046ad923-ab58-4298-83a4-7d2627b5eee9" (UID: "046ad923-ab58-4298-83a4-7d2627b5eee9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.265308 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znl2x\" (UniqueName: \"kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x\") pod \"3f09a830-0564-41d1-8e00-3232119becba\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.265363 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities\") pod \"3f09a830-0564-41d1-8e00-3232119becba\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.265392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca\") pod \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.265474 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84lsw\" (UniqueName: \"kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw\") pod \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266028 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities" (OuterVolumeSpecName: "utilities") pod "3f09a830-0564-41d1-8e00-3232119becba" (UID: "3f09a830-0564-41d1-8e00-3232119becba"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266291 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjmc9\" (UniqueName: \"kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9\") pod \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities\") pod \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266393 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content\") pod \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266420 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities\") pod \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\" (UID: \"157b727a-8cd0-44ae-8406-1dcaf981ffdd\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266440 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9jwz\" (UniqueName: \"kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz\") pod \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266456 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content\") pod \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\" (UID: \"1ca0b585-e59f-437b-895a-9fd2ce709f2e\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266479 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content\") pod \"3f09a830-0564-41d1-8e00-3232119becba\" (UID: \"3f09a830-0564-41d1-8e00-3232119becba\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266509 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics\") pod \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\" (UID: \"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d\") " Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266841 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266861 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/046ad923-ab58-4298-83a4-7d2627b5eee9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266857 5030 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" (UID: "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.266875 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp4fx\" (UniqueName: \"kubernetes.io/projected/046ad923-ab58-4298-83a4-7d2627b5eee9-kube-api-access-tp4fx\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.267058 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.267656 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities" (OuterVolumeSpecName: "utilities") pod "157b727a-8cd0-44ae-8406-1dcaf981ffdd" (UID: "157b727a-8cd0-44ae-8406-1dcaf981ffdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.270300 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw" (OuterVolumeSpecName: "kube-api-access-84lsw") pod "157b727a-8cd0-44ae-8406-1dcaf981ffdd" (UID: "157b727a-8cd0-44ae-8406-1dcaf981ffdd"). InnerVolumeSpecName "kube-api-access-84lsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.270349 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x" (OuterVolumeSpecName: "kube-api-access-znl2x") pod "3f09a830-0564-41d1-8e00-3232119becba" (UID: "3f09a830-0564-41d1-8e00-3232119becba"). InnerVolumeSpecName "kube-api-access-znl2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.270454 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" (UID: "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.272005 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz" (OuterVolumeSpecName: "kube-api-access-s9jwz") pod "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" (UID: "827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d"). InnerVolumeSpecName "kube-api-access-s9jwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.273908 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9" (OuterVolumeSpecName: "kube-api-access-tjmc9") pod "1ca0b585-e59f-437b-895a-9fd2ce709f2e" (UID: "1ca0b585-e59f-437b-895a-9fd2ce709f2e"). InnerVolumeSpecName "kube-api-access-tjmc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.279735 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities" (OuterVolumeSpecName: "utilities") pod "1ca0b585-e59f-437b-895a-9fd2ce709f2e" (UID: "1ca0b585-e59f-437b-895a-9fd2ce709f2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.290346 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca0b585-e59f-437b-895a-9fd2ce709f2e" (UID: "1ca0b585-e59f-437b-895a-9fd2ce709f2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.321333 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "157b727a-8cd0-44ae-8406-1dcaf981ffdd" (UID: "157b727a-8cd0-44ae-8406-1dcaf981ffdd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369038 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84lsw\" (UniqueName: \"kubernetes.io/projected/157b727a-8cd0-44ae-8406-1dcaf981ffdd-kube-api-access-84lsw\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369070 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjmc9\" (UniqueName: \"kubernetes.io/projected/1ca0b585-e59f-437b-895a-9fd2ce709f2e-kube-api-access-tjmc9\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369080 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369090 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369099 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157b727a-8cd0-44ae-8406-1dcaf981ffdd-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369108 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9jwz\" (UniqueName: \"kubernetes.io/projected/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-kube-api-access-s9jwz\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369119 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca0b585-e59f-437b-895a-9fd2ce709f2e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369128 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369138 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znl2x\" (UniqueName: \"kubernetes.io/projected/3f09a830-0564-41d1-8e00-3232119becba-kube-api-access-znl2x\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.369147 5030 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.398449 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f09a830-0564-41d1-8e00-3232119becba" (UID: "3f09a830-0564-41d1-8e00-3232119becba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.471226 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f09a830-0564-41d1-8e00-3232119becba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.487607 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdn28"] Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.973130 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0334a325-c5cb-4543-8580-812e53d249c9" path="/var/lib/kubelet/pods/0334a325-c5cb-4543-8580-812e53d249c9/volumes" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.976884 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kf5kh" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.977824 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="374f7b93-b687-492f-8b68-c3b20c6566e6" path="/var/lib/kubelet/pods/374f7b93-b687-492f-8b68-c3b20c6566e6/volumes" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.979521 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.979949 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4" path="/var/lib/kubelet/pods/6ba9bbff-2ea5-475c-8cbb-f81f74cf37b4/volumes" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.981680 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2493d5a-ed39-48f7-a46f-f5b19cc201ae" path="/var/lib/kubelet/pods/d2493d5a-ed39-48f7-a46f-f5b19cc201ae/volumes" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.982279 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpccd" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.983131 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kf5kh" event={"ID":"3f09a830-0564-41d1-8e00-3232119becba","Type":"ContainerDied","Data":"e54fc46728846cc0f96a4a3d9eaf746b98c740e210bd0b416375a75b5230bf8d"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.983174 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jpsk9" event={"ID":"827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d","Type":"ContainerDied","Data":"f7f45411bb5d77f082c3efa4826fab77eecf9bacc887a61f4500f0ca142588c5"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.983191 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpccd" event={"ID":"157b727a-8cd0-44ae-8406-1dcaf981ffdd","Type":"ContainerDied","Data":"111768af66beda35aed5ae66398604eb69cd074faeecfe7b2a03a9f0c868e564"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.983215 5030 scope.go:117] "RemoveContainer" containerID="376793db80f69aae127680fc34ea4dae95d690fbea083d5f60fe681742db9c1a" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.985691 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4tnv" event={"ID":"1ca0b585-e59f-437b-895a-9fd2ce709f2e","Type":"ContainerDied","Data":"60dbffcb0977b5d13ffc26db81224b605dfdc3f091c43d011c649ea5faa1d922"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.985814 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4tnv" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.990273 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" event={"ID":"576826de-d45b-4d66-99a0-ed9a215b0e2f","Type":"ContainerStarted","Data":"43f3e88a25945d9ee77a45e58ba34885fa44d58a68c3c84e80cc471a26004319"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.990336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" event={"ID":"576826de-d45b-4d66-99a0-ed9a215b0e2f","Type":"ContainerStarted","Data":"8d304397ecff8137fcc2b785e4150f5ed2a0d49ea2c98a445fe21c0a0b999ae4"} Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.992893 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.992964 5030 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zdn28 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.1.171:8080/healthz\": dial tcp 10.217.1.171:8080: connect: connection refused" start-of-body= Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.992992 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" podUID="576826de-d45b-4d66-99a0-ed9a215b0e2f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.1.171:8080/healthz\": dial tcp 10.217.1.171:8080: connect: connection refused" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.997098 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xqz49" Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.997140 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s72mm" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="registry-server" containerID="cri-o://551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76" gracePeriod=30 Jan 21 00:45:39 crc kubenswrapper[5030]: I0121 00:45:39.997617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerStarted","Data":"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76"} Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.012239 5030 scope.go:117] "RemoveContainer" containerID="5feec5fcea41b0014502a011a58bd3b031086aa3df92b6c554b36baf9d8784e2" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.031354 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" podStartSLOduration=2.031329881 podStartE2EDuration="2.031329881s" podCreationTimestamp="2026-01-21 00:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:45:40.019833166 +0000 UTC m=+7812.340093464" watchObservedRunningTime="2026-01-21 00:45:40.031329881 +0000 UTC m=+7812.351590179" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.043405 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s72mm" podStartSLOduration=2.392630166 podStartE2EDuration="5.043387441s" podCreationTimestamp="2026-01-21 00:45:35 +0000 UTC" firstStartedPulling="2026-01-21 00:45:36.926674454 +0000 UTC m=+7809.246934732" lastFinishedPulling="2026-01-21 00:45:39.577431719 +0000 UTC m=+7811.897692007" observedRunningTime="2026-01-21 00:45:40.042555871 +0000 UTC m=+7812.362816159" watchObservedRunningTime="2026-01-21 00:45:40.043387441 +0000 UTC m=+7812.363647729" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.114329 5030 scope.go:117] "RemoveContainer" containerID="5309b03a7907a030a6b8b6e3458914eb744ba927bf5c048ac173cbcc6279d89f" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.117909 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.124160 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kf5kh"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.156909 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.157203 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.174161 5030 scope.go:117] "RemoveContainer" 
containerID="c691bfa4049ecba097c1f9a152b736785f885cf82af4c678a6026f230bec332c" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.194848 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.202003 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jpsk9"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.220061 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.221061 5030 scope.go:117] "RemoveContainer" containerID="e0eae01eca9efd3b51d95b741a01b46c7fe7f2fbdb5f37fbf2681b21e9f7e81f" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.229489 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xqz49"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.237722 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.257180 5030 scope.go:117] "RemoveContainer" containerID="e2cc1922b10bab485194876b2ee89c889b82b0d77dd03a6e014604e6f3420189" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.264446 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rpccd"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.277156 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.284116 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4tnv"] Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.293556 5030 scope.go:117] "RemoveContainer" containerID="8207e1592085806e1f3d97f81fc4dadde2881307788bbf90f7cfa7410bc37f67" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.309206 5030 scope.go:117] "RemoveContainer" containerID="f36feba575813ce4a9719fb0a36ca80d2a7ef5c1350ea62e4b1fdc18ff8abf23" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.331464 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s72mm_91e74c80-5caf-4922-9b71-e2003b2c5486/registry-server/0.log" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.332035 5030 scope.go:117] "RemoveContainer" containerID="32c10319cc98b71529f1f81f9eebbbd38461c2834d697fa32ec41494bb153f86" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.333022 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.354416 5030 scope.go:117] "RemoveContainer" containerID="1c8c49a30bd63c73c01ade5fa4ffe498035336c12b09dfd2f8e47e8d97752883" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.484166 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content\") pod \"91e74c80-5caf-4922-9b71-e2003b2c5486\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.484252 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities\") pod \"91e74c80-5caf-4922-9b71-e2003b2c5486\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.484392 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qwn8\" (UniqueName: \"kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8\") pod \"91e74c80-5caf-4922-9b71-e2003b2c5486\" (UID: \"91e74c80-5caf-4922-9b71-e2003b2c5486\") " Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.485156 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities" (OuterVolumeSpecName: "utilities") pod "91e74c80-5caf-4922-9b71-e2003b2c5486" (UID: "91e74c80-5caf-4922-9b71-e2003b2c5486"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.488705 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8" (OuterVolumeSpecName: "kube-api-access-7qwn8") pod "91e74c80-5caf-4922-9b71-e2003b2c5486" (UID: "91e74c80-5caf-4922-9b71-e2003b2c5486"). InnerVolumeSpecName "kube-api-access-7qwn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.536480 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91e74c80-5caf-4922-9b71-e2003b2c5486" (UID: "91e74c80-5caf-4922-9b71-e2003b2c5486"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.585497 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qwn8\" (UniqueName: \"kubernetes.io/projected/91e74c80-5caf-4922-9b71-e2003b2c5486-kube-api-access-7qwn8\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.585533 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:40 crc kubenswrapper[5030]: I0121 00:45:40.585543 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e74c80-5caf-4922-9b71-e2003b2c5486-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.012178 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s72mm_91e74c80-5caf-4922-9b71-e2003b2c5486/registry-server/0.log" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.012971 5030 generic.go:334] "Generic (PLEG): container finished" podID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerID="551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76" exitCode=1 Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.013040 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s72mm" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.013032 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerDied","Data":"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76"} Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.013094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s72mm" event={"ID":"91e74c80-5caf-4922-9b71-e2003b2c5486","Type":"ContainerDied","Data":"0e7ed36bd97c6577f46b1cd03c4725b705520a81a56ff32c9e3b7e4e21edb394"} Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.013121 5030 scope.go:117] "RemoveContainer" containerID="551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.016443 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zdn28" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.047891 5030 scope.go:117] "RemoveContainer" containerID="d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.068411 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5j2cd"] Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.068992 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069012 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="extract-content" Jan 21 00:45:41 crc 
kubenswrapper[5030]: I0121 00:45:41.069035 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069048 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069057 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069075 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069084 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069099 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069107 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069121 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069150 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069162 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069170 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069181 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069205 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069212 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="extract-content" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069223 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f09a830-0564-41d1-8e00-3232119becba" 
containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069231 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069245 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069253 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069265 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerName="marketplace-operator" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069272 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerName="marketplace-operator" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069280 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069287 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069313 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.069323 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069331 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="extract-utilities" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069466 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" containerName="marketplace-operator" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069482 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069499 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069511 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f09a830-0564-41d1-8e00-3232119becba" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069522 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.069534 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" containerName="registry-server" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.070693 5030 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.073876 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.083272 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j2cd"] Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.090637 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.092467 5030 scope.go:117] "RemoveContainer" containerID="922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.095391 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s72mm"] Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.114307 5030 scope.go:117] "RemoveContainer" containerID="551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.115714 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76\": container with ID starting with 551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76 not found: ID does not exist" containerID="551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.115798 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76"} err="failed to get container status \"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76\": rpc error: code = NotFound desc = could not find container \"551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76\": container with ID starting with 551f93bbcc376fe7e46fc1b1bc9cb784f10e882e85e67cf699ff32dc1d899e76 not found: ID does not exist" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.115839 5030 scope.go:117] "RemoveContainer" containerID="d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.116324 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18\": container with ID starting with d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18 not found: ID does not exist" containerID="d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.116360 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18"} err="failed to get container status \"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18\": rpc error: code = NotFound desc = could not find container \"d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18\": container with ID starting with d9c298eebc328c1e0ebbb59d7bb34697ccacf522a182b973a861f15b526f3f18 not found: ID does not exist" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.116383 5030 scope.go:117] "RemoveContainer" 
containerID="922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086" Jan 21 00:45:41 crc kubenswrapper[5030]: E0121 00:45:41.116691 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086\": container with ID starting with 922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086 not found: ID does not exist" containerID="922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.116728 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086"} err="failed to get container status \"922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086\": rpc error: code = NotFound desc = could not find container \"922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086\": container with ID starting with 922132d05c9620db40e7e8b2653fcd5a020b1e1bccddf1628b9fcafca7ce8086 not found: ID does not exist" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.194864 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-utilities\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.194990 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc44r\" (UniqueName: \"kubernetes.io/projected/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-kube-api-access-bc44r\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.195034 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-catalog-content\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.296525 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-utilities\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.296839 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc44r\" (UniqueName: \"kubernetes.io/projected/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-kube-api-access-bc44r\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.296961 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-catalog-content\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " 
pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.297389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-catalog-content\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.298875 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-utilities\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.313549 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc44r\" (UniqueName: \"kubernetes.io/projected/a5dfcb0d-c01c-4da3-90ee-0ddb739503aa-kube-api-access-bc44r\") pod \"community-operators-5j2cd\" (UID: \"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa\") " pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.435060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.633384 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tbj59"] Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.634511 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.636557 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.641649 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbj59"] Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.803585 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-catalog-content\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.803666 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-utilities\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.803854 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thpm9\" (UniqueName: \"kubernetes.io/projected/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-kube-api-access-thpm9\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.835019 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5j2cd"] 
Jan 21 00:45:41 crc kubenswrapper[5030]: W0121 00:45:41.840590 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5dfcb0d_c01c_4da3_90ee_0ddb739503aa.slice/crio-796098af119751c656ca120423422323f0c8d6ae16afaae5f2792554a20c2c6a WatchSource:0}: Error finding container 796098af119751c656ca120423422323f0c8d6ae16afaae5f2792554a20c2c6a: Status 404 returned error can't find the container with id 796098af119751c656ca120423422323f0c8d6ae16afaae5f2792554a20c2c6a Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.905149 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thpm9\" (UniqueName: \"kubernetes.io/projected/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-kube-api-access-thpm9\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.905572 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-catalog-content\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.905616 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-utilities\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.906185 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-utilities\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.906346 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-catalog-content\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.930342 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thpm9\" (UniqueName: \"kubernetes.io/projected/b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f-kube-api-access-thpm9\") pod \"redhat-marketplace-tbj59\" (UID: \"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f\") " pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.950011 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.981700 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046ad923-ab58-4298-83a4-7d2627b5eee9" path="/var/lib/kubelet/pods/046ad923-ab58-4298-83a4-7d2627b5eee9/volumes" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.982981 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157b727a-8cd0-44ae-8406-1dcaf981ffdd" path="/var/lib/kubelet/pods/157b727a-8cd0-44ae-8406-1dcaf981ffdd/volumes" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.983683 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca0b585-e59f-437b-895a-9fd2ce709f2e" path="/var/lib/kubelet/pods/1ca0b585-e59f-437b-895a-9fd2ce709f2e/volumes" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.988242 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f09a830-0564-41d1-8e00-3232119becba" path="/var/lib/kubelet/pods/3f09a830-0564-41d1-8e00-3232119becba/volumes" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.988939 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d" path="/var/lib/kubelet/pods/827c9e1a-bcdf-4b7d-bc22-d71fc4954c8d/volumes" Jan 21 00:45:41 crc kubenswrapper[5030]: I0121 00:45:41.989824 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e74c80-5caf-4922-9b71-e2003b2c5486" path="/var/lib/kubelet/pods/91e74c80-5caf-4922-9b71-e2003b2c5486/volumes" Jan 21 00:45:42 crc kubenswrapper[5030]: I0121 00:45:42.105387 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j2cd" event={"ID":"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa","Type":"ContainerStarted","Data":"796098af119751c656ca120423422323f0c8d6ae16afaae5f2792554a20c2c6a"} Jan 21 00:45:42 crc kubenswrapper[5030]: I0121 00:45:42.275989 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbj59"] Jan 21 00:45:42 crc kubenswrapper[5030]: W0121 00:45:42.283475 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2db6aa4_5ae0_4c18_ab4e_6f9de18fbb1f.slice/crio-e59aad61cf56150f0c378bf60d7a52de214a773fe29f0903d4dbfaf706067640 WatchSource:0}: Error finding container e59aad61cf56150f0c378bf60d7a52de214a773fe29f0903d4dbfaf706067640: Status 404 returned error can't find the container with id e59aad61cf56150f0c378bf60d7a52de214a773fe29f0903d4dbfaf706067640 Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.114159 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f" containerID="8286a850e52b57dd4bba9b418063b3ddbfe21e38299d7b41920b07b1943b6d6d" exitCode=0 Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.114226 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbj59" event={"ID":"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f","Type":"ContainerDied","Data":"8286a850e52b57dd4bba9b418063b3ddbfe21e38299d7b41920b07b1943b6d6d"} Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.114291 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbj59" event={"ID":"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f","Type":"ContainerStarted","Data":"e59aad61cf56150f0c378bf60d7a52de214a773fe29f0903d4dbfaf706067640"} Jan 21 00:45:43 crc kubenswrapper[5030]: 
I0121 00:45:43.116724 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5dfcb0d-c01c-4da3-90ee-0ddb739503aa" containerID="457a5b582987f97aaa052cb896eb3dc76b076e36d8b7197406a71280538162ae" exitCode=0 Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.117482 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j2cd" event={"ID":"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa","Type":"ContainerDied","Data":"457a5b582987f97aaa052cb896eb3dc76b076e36d8b7197406a71280538162ae"} Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.430851 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2ts88"] Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.448478 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.451135 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.456258 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ts88"] Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.527680 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-catalog-content\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.528056 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-utilities\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.528094 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rcvl\" (UniqueName: \"kubernetes.io/projected/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-kube-api-access-4rcvl\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.629369 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-utilities\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.629455 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rcvl\" (UniqueName: \"kubernetes.io/projected/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-kube-api-access-4rcvl\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.629539 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-catalog-content\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.630098 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-utilities\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.630169 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-catalog-content\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.656062 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rcvl\" (UniqueName: \"kubernetes.io/projected/c1cd5152-d5f9-4d07-b4bb-81abc83676f6-kube-api-access-4rcvl\") pod \"certified-operators-2ts88\" (UID: \"c1cd5152-d5f9-4d07-b4bb-81abc83676f6\") " pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:43 crc kubenswrapper[5030]: I0121 00:45:43.779735 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.033810 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xq8j"] Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.035575 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.042550 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.048428 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xq8j"] Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.126364 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f" containerID="0343ec0643c795c82cd879ed4fe6dd915adf703f796b404df61b80ae3112f779" exitCode=0 Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.126599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbj59" event={"ID":"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f","Type":"ContainerDied","Data":"0343ec0643c795c82cd879ed4fe6dd915adf703f796b404df61b80ae3112f779"} Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.131961 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j2cd" event={"ID":"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa","Type":"ContainerStarted","Data":"c60a0dd273cb14e3314002813e6d9dec5a165e5057d612b301498b948b5491cc"} Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.138644 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht6c2\" (UniqueName: \"kubernetes.io/projected/83770f5d-fd84-473c-926c-c2a14010aa1c-kube-api-access-ht6c2\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.138689 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-catalog-content\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.138771 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-utilities\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.230361 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2ts88"] Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.240093 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-utilities\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.240191 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht6c2\" (UniqueName: \"kubernetes.io/projected/83770f5d-fd84-473c-926c-c2a14010aa1c-kube-api-access-ht6c2\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc 
kubenswrapper[5030]: I0121 00:45:44.240217 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-catalog-content\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.240686 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-catalog-content\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.241157 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83770f5d-fd84-473c-926c-c2a14010aa1c-utilities\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: W0121 00:45:44.252924 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1cd5152_d5f9_4d07_b4bb_81abc83676f6.slice/crio-ca79e893961cb7f61148d0d4a5ea03b87d1e8472f9b88092a4cf02056424b417 WatchSource:0}: Error finding container ca79e893961cb7f61148d0d4a5ea03b87d1e8472f9b88092a4cf02056424b417: Status 404 returned error can't find the container with id ca79e893961cb7f61148d0d4a5ea03b87d1e8472f9b88092a4cf02056424b417 Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.258833 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht6c2\" (UniqueName: \"kubernetes.io/projected/83770f5d-fd84-473c-926c-c2a14010aa1c-kube-api-access-ht6c2\") pod \"redhat-operators-4xq8j\" (UID: \"83770f5d-fd84-473c-926c-c2a14010aa1c\") " pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.360276 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:44 crc kubenswrapper[5030]: I0121 00:45:44.751294 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xq8j"] Jan 21 00:45:44 crc kubenswrapper[5030]: W0121 00:45:44.755055 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83770f5d_fd84_473c_926c_c2a14010aa1c.slice/crio-cebea808f054d2fdd4bc4c08f9001baa3a82b07cf476d6c76e2b41876e9f1b75 WatchSource:0}: Error finding container cebea808f054d2fdd4bc4c08f9001baa3a82b07cf476d6c76e2b41876e9f1b75: Status 404 returned error can't find the container with id cebea808f054d2fdd4bc4c08f9001baa3a82b07cf476d6c76e2b41876e9f1b75 Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.147748 5030 generic.go:334] "Generic (PLEG): container finished" podID="a5dfcb0d-c01c-4da3-90ee-0ddb739503aa" containerID="c60a0dd273cb14e3314002813e6d9dec5a165e5057d612b301498b948b5491cc" exitCode=0 Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.147818 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j2cd" event={"ID":"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa","Type":"ContainerDied","Data":"c60a0dd273cb14e3314002813e6d9dec5a165e5057d612b301498b948b5491cc"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.149081 5030 generic.go:334] "Generic (PLEG): container finished" podID="83770f5d-fd84-473c-926c-c2a14010aa1c" containerID="a125491c01e77538831e731eee26e28a9fc6e13db54a352e8f41ae744f31a856" exitCode=0 Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.149156 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xq8j" event={"ID":"83770f5d-fd84-473c-926c-c2a14010aa1c","Type":"ContainerDied","Data":"a125491c01e77538831e731eee26e28a9fc6e13db54a352e8f41ae744f31a856"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.149193 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xq8j" event={"ID":"83770f5d-fd84-473c-926c-c2a14010aa1c","Type":"ContainerStarted","Data":"cebea808f054d2fdd4bc4c08f9001baa3a82b07cf476d6c76e2b41876e9f1b75"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.152090 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1cd5152-d5f9-4d07-b4bb-81abc83676f6" containerID="a51b76de6eb6368c67b391f15d42efae03732ea5c6523150aad143b2bd3c17e4" exitCode=0 Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.152168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ts88" event={"ID":"c1cd5152-d5f9-4d07-b4bb-81abc83676f6","Type":"ContainerDied","Data":"a51b76de6eb6368c67b391f15d42efae03732ea5c6523150aad143b2bd3c17e4"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.152219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ts88" event={"ID":"c1cd5152-d5f9-4d07-b4bb-81abc83676f6","Type":"ContainerStarted","Data":"ca79e893961cb7f61148d0d4a5ea03b87d1e8472f9b88092a4cf02056424b417"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.155238 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbj59" event={"ID":"b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f","Type":"ContainerStarted","Data":"9401c9af6b4ba88c9578abb6c5078ff07d5ffdf0ea9a2917b839708da9d84038"} Jan 21 00:45:45 crc kubenswrapper[5030]: I0121 00:45:45.218424 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tbj59" podStartSLOduration=2.718332654 podStartE2EDuration="4.218404624s" podCreationTimestamp="2026-01-21 00:45:41 +0000 UTC" firstStartedPulling="2026-01-21 00:45:43.115728733 +0000 UTC m=+7815.435989021" lastFinishedPulling="2026-01-21 00:45:44.615800703 +0000 UTC m=+7816.936060991" observedRunningTime="2026-01-21 00:45:45.218392944 +0000 UTC m=+7817.538653232" watchObservedRunningTime="2026-01-21 00:45:45.218404624 +0000 UTC m=+7817.538664912" Jan 21 00:45:46 crc kubenswrapper[5030]: I0121 00:45:46.165096 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5j2cd" event={"ID":"a5dfcb0d-c01c-4da3-90ee-0ddb739503aa","Type":"ContainerStarted","Data":"992ba1ceace7fb56b3a75e80ae18591a0b37dc8416834656872e9ebe33ad903e"} Jan 21 00:45:46 crc kubenswrapper[5030]: I0121 00:45:46.167281 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ts88" event={"ID":"c1cd5152-d5f9-4d07-b4bb-81abc83676f6","Type":"ContainerStarted","Data":"332267ec70cf51937c13abd19fe3276f477fadc18ea1c5c7a5d9ee0a20915044"} Jan 21 00:45:46 crc kubenswrapper[5030]: I0121 00:45:46.197714 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5j2cd" podStartSLOduration=2.683342905 podStartE2EDuration="5.197692976s" podCreationTimestamp="2026-01-21 00:45:41 +0000 UTC" firstStartedPulling="2026-01-21 00:45:43.118456428 +0000 UTC m=+7815.438716716" lastFinishedPulling="2026-01-21 00:45:45.632806499 +0000 UTC m=+7817.953066787" observedRunningTime="2026-01-21 00:45:46.183802003 +0000 UTC m=+7818.504062291" watchObservedRunningTime="2026-01-21 00:45:46.197692976 +0000 UTC m=+7818.517953264" Jan 21 00:45:47 crc kubenswrapper[5030]: I0121 00:45:47.174146 5030 generic.go:334] "Generic (PLEG): container finished" podID="83770f5d-fd84-473c-926c-c2a14010aa1c" containerID="0239451e13c092697655c51e5762ebdd05f0d82bab3ad413a9882f584b34027e" exitCode=0 Jan 21 00:45:47 crc kubenswrapper[5030]: I0121 00:45:47.174218 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xq8j" event={"ID":"83770f5d-fd84-473c-926c-c2a14010aa1c","Type":"ContainerDied","Data":"0239451e13c092697655c51e5762ebdd05f0d82bab3ad413a9882f584b34027e"} Jan 21 00:45:47 crc kubenswrapper[5030]: I0121 00:45:47.176222 5030 generic.go:334] "Generic (PLEG): container finished" podID="c1cd5152-d5f9-4d07-b4bb-81abc83676f6" containerID="332267ec70cf51937c13abd19fe3276f477fadc18ea1c5c7a5d9ee0a20915044" exitCode=0 Jan 21 00:45:47 crc kubenswrapper[5030]: I0121 00:45:47.176303 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ts88" event={"ID":"c1cd5152-d5f9-4d07-b4bb-81abc83676f6","Type":"ContainerDied","Data":"332267ec70cf51937c13abd19fe3276f477fadc18ea1c5c7a5d9ee0a20915044"} Jan 21 00:45:48 crc kubenswrapper[5030]: I0121 00:45:48.185283 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2ts88" event={"ID":"c1cd5152-d5f9-4d07-b4bb-81abc83676f6","Type":"ContainerStarted","Data":"3d42034209aac209fe26239deca8d145b97c8a59843158046f5a5b65027d3477"} Jan 21 00:45:48 crc kubenswrapper[5030]: I0121 00:45:48.187645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xq8j" 
event={"ID":"83770f5d-fd84-473c-926c-c2a14010aa1c","Type":"ContainerStarted","Data":"d2fed1842d6ec98eba4ae57d6de673a7b266392f18cbe1f3bc61d00944db2cb0"} Jan 21 00:45:48 crc kubenswrapper[5030]: I0121 00:45:48.207117 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2ts88" podStartSLOduration=2.766770775 podStartE2EDuration="5.207091899s" podCreationTimestamp="2026-01-21 00:45:43 +0000 UTC" firstStartedPulling="2026-01-21 00:45:45.153441946 +0000 UTC m=+7817.473702234" lastFinishedPulling="2026-01-21 00:45:47.59376308 +0000 UTC m=+7819.914023358" observedRunningTime="2026-01-21 00:45:48.203339998 +0000 UTC m=+7820.523600286" watchObservedRunningTime="2026-01-21 00:45:48.207091899 +0000 UTC m=+7820.527352197" Jan 21 00:45:48 crc kubenswrapper[5030]: I0121 00:45:48.223351 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xq8j" podStartSLOduration=1.737061852 podStartE2EDuration="4.223325948s" podCreationTimestamp="2026-01-21 00:45:44 +0000 UTC" firstStartedPulling="2026-01-21 00:45:45.150506795 +0000 UTC m=+7817.470767103" lastFinishedPulling="2026-01-21 00:45:47.636770911 +0000 UTC m=+7819.957031199" observedRunningTime="2026-01-21 00:45:48.220222504 +0000 UTC m=+7820.540482792" watchObservedRunningTime="2026-01-21 00:45:48.223325948 +0000 UTC m=+7820.543586276" Jan 21 00:45:51 crc kubenswrapper[5030]: I0121 00:45:51.435278 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:51 crc kubenswrapper[5030]: I0121 00:45:51.435773 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:51 crc kubenswrapper[5030]: I0121 00:45:51.508691 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:51 crc kubenswrapper[5030]: I0121 00:45:51.951226 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:51 crc kubenswrapper[5030]: I0121 00:45:51.951288 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:52 crc kubenswrapper[5030]: I0121 00:45:52.002090 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:52 crc kubenswrapper[5030]: I0121 00:45:52.252765 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tbj59" Jan 21 00:45:52 crc kubenswrapper[5030]: I0121 00:45:52.259648 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5j2cd" Jan 21 00:45:53 crc kubenswrapper[5030]: I0121 00:45:53.780778 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:53 crc kubenswrapper[5030]: I0121 00:45:53.781115 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:53 crc kubenswrapper[5030]: I0121 00:45:53.838762 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:54 crc kubenswrapper[5030]: I0121 
00:45:54.269518 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2ts88" Jan 21 00:45:54 crc kubenswrapper[5030]: I0121 00:45:54.361244 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:54 crc kubenswrapper[5030]: I0121 00:45:54.361529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:54 crc kubenswrapper[5030]: I0121 00:45:54.406510 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:45:55 crc kubenswrapper[5030]: I0121 00:45:55.273990 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xq8j" Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.157805 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.158393 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.158439 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.159042 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.159086 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" gracePeriod=600 Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.354617 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" exitCode=0 Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.354668 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9"} Jan 21 00:46:10 crc kubenswrapper[5030]: I0121 00:46:10.354771 5030 scope.go:117] "RemoveContainer" containerID="69843e2a363b78a7b8bb08cb53e5f4ea850f7a953289a3808799af70fc7ca97b" Jan 21 00:46:10 crc kubenswrapper[5030]: E0121 00:46:10.781732 5030 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:46:11 crc kubenswrapper[5030]: I0121 00:46:11.363113 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:46:11 crc kubenswrapper[5030]: E0121 00:46:11.363372 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:46:26 crc kubenswrapper[5030]: I0121 00:46:26.962228 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:46:26 crc kubenswrapper[5030]: E0121 00:46:26.963005 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.780506 5030 scope.go:117] "RemoveContainer" containerID="ed0db88c97257f61aca10598a82a169e629e234db569a57b1eb4765d28c80a7f" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.815741 5030 scope.go:117] "RemoveContainer" containerID="e8da43346499a27b3d7f9747979f3fed05d549e5d2324129196255ea4a8a0aca" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.867153 5030 scope.go:117] "RemoveContainer" containerID="7fbba8623965287f6f79bbf1f6d27d61e11abc8acc492b03cebfaee1740afe30" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.894137 5030 scope.go:117] "RemoveContainer" containerID="914eace812bf63e7559810a7997a45ad8eb574795938f005742a6218faa5d52c" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.926157 5030 scope.go:117] "RemoveContainer" containerID="aec4f01612e489033df5a614d5560ed5de61b02d76761f9a9a237794773a1d46" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.951615 5030 scope.go:117] "RemoveContainer" containerID="be36195b526a7d6bf6b9a7c2898ffaed2eba98b03ee9ade26bdb19e273e31813" Jan 21 00:46:34 crc kubenswrapper[5030]: I0121 00:46:34.978072 5030 scope.go:117] "RemoveContainer" containerID="b69357e614609f04ab61fe32497743b2badf25fd3acde5392a4b47da28f2195f" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.006027 5030 scope.go:117] "RemoveContainer" containerID="ef640f906dbaa76beab7a1716314d8aaa49658be0d369fda671e8f07ba30aa04" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.024570 5030 scope.go:117] "RemoveContainer" containerID="4d4d9a33bdfb8c13fbeb0946ff60dcf68c6090edf9a6bc630714daa89c02f200" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.048315 5030 scope.go:117] "RemoveContainer" containerID="8dbd0fe360394c16eba3d53eba7c7f66786ce4fb1d84c4fe592f6ceb2dfd1694" Jan 21 00:46:35 crc 
kubenswrapper[5030]: I0121 00:46:35.062819 5030 scope.go:117] "RemoveContainer" containerID="443cbed4d3a144e31b0f62f3bd0cb624b9732b6f101e000f07f81af64c3085e4" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.080335 5030 scope.go:117] "RemoveContainer" containerID="f7834ee254a7a5a4f19eaf01533b66dfb93b213e49fb150346aaedcd27a55264" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.103398 5030 scope.go:117] "RemoveContainer" containerID="b8ac4d14b12c8614e819576692c333883243a0e4407cd600961844da2528502c" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.134589 5030 scope.go:117] "RemoveContainer" containerID="8ae5072d7810fdcd0eec0e344c272820bf4e46792893a41ff51d2afed7ffba3b" Jan 21 00:46:35 crc kubenswrapper[5030]: I0121 00:46:35.157242 5030 scope.go:117] "RemoveContainer" containerID="511fae4c547cf98ed6f03319217eb7844557daa0df007ccbda74603d727145c7" Jan 21 00:46:41 crc kubenswrapper[5030]: I0121 00:46:41.962201 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:46:41 crc kubenswrapper[5030]: E0121 00:46:41.962854 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:46:54 crc kubenswrapper[5030]: I0121 00:46:54.962094 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:46:54 crc kubenswrapper[5030]: E0121 00:46:54.965069 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:47:08 crc kubenswrapper[5030]: I0121 00:47:08.962742 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:47:08 crc kubenswrapper[5030]: E0121 00:47:08.963896 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:47:21 crc kubenswrapper[5030]: I0121 00:47:21.962665 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:47:21 crc kubenswrapper[5030]: E0121 00:47:21.964214 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:47:33 crc kubenswrapper[5030]: I0121 00:47:33.962738 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:47:33 crc kubenswrapper[5030]: E0121 00:47:33.963537 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:47:45 crc kubenswrapper[5030]: I0121 00:47:45.963046 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:47:45 crc kubenswrapper[5030]: E0121 00:47:45.963953 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:47:56 crc kubenswrapper[5030]: I0121 00:47:56.961710 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:47:56 crc kubenswrapper[5030]: E0121 00:47:56.962511 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:48:10 crc kubenswrapper[5030]: I0121 00:48:10.961529 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:48:10 crc kubenswrapper[5030]: E0121 00:48:10.962238 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:48:21 crc kubenswrapper[5030]: I0121 00:48:21.962475 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:48:21 crc kubenswrapper[5030]: E0121 00:48:21.964198 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:48:34 crc kubenswrapper[5030]: I0121 00:48:34.962034 5030 scope.go:117] "RemoveContainer" 
containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:48:34 crc kubenswrapper[5030]: E0121 00:48:34.962903 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.206926 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.209257 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.229800 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.246825 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.246971 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.247017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqsgn\" (UniqueName: \"kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.348192 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.348274 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqsgn\" (UniqueName: \"kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.348338 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc 
kubenswrapper[5030]: I0121 00:48:41.348861 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.348985 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.365691 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqsgn\" (UniqueName: \"kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn\") pod \"redhat-marketplace-h87lm\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.531990 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:41 crc kubenswrapper[5030]: I0121 00:48:41.985845 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:42 crc kubenswrapper[5030]: I0121 00:48:42.572122 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerID="d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe" exitCode=0 Jan 21 00:48:42 crc kubenswrapper[5030]: I0121 00:48:42.572168 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerDied","Data":"d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe"} Jan 21 00:48:42 crc kubenswrapper[5030]: I0121 00:48:42.573239 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerStarted","Data":"cf96fae6387e3cd2e56037de3e05d9d78d513deeb377daf7638903c613f4dc91"} Jan 21 00:48:44 crc kubenswrapper[5030]: I0121 00:48:44.592615 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerID="508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349" exitCode=0 Jan 21 00:48:44 crc kubenswrapper[5030]: I0121 00:48:44.592766 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerDied","Data":"508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349"} Jan 21 00:48:45 crc kubenswrapper[5030]: I0121 00:48:45.601760 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerStarted","Data":"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c"} Jan 21 00:48:45 crc kubenswrapper[5030]: I0121 00:48:45.628954 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h87lm" podStartSLOduration=2.164989554 
podStartE2EDuration="4.628930755s" podCreationTimestamp="2026-01-21 00:48:41 +0000 UTC" firstStartedPulling="2026-01-21 00:48:42.574197766 +0000 UTC m=+7994.894458054" lastFinishedPulling="2026-01-21 00:48:45.038138947 +0000 UTC m=+7997.358399255" observedRunningTime="2026-01-21 00:48:45.624216113 +0000 UTC m=+7997.944476391" watchObservedRunningTime="2026-01-21 00:48:45.628930755 +0000 UTC m=+7997.949191043" Jan 21 00:48:46 crc kubenswrapper[5030]: I0121 00:48:46.962591 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:48:46 crc kubenswrapper[5030]: E0121 00:48:46.962895 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:48:51 crc kubenswrapper[5030]: I0121 00:48:51.532708 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:51 crc kubenswrapper[5030]: I0121 00:48:51.534006 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:51 crc kubenswrapper[5030]: I0121 00:48:51.583388 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:51 crc kubenswrapper[5030]: I0121 00:48:51.702911 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:51 crc kubenswrapper[5030]: I0121 00:48:51.812246 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:53 crc kubenswrapper[5030]: I0121 00:48:53.651331 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h87lm" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="registry-server" containerID="cri-o://65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c" gracePeriod=2 Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.617722 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.637596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content\") pod \"7e058f88-40e6-47d2-9fa4-05a204f5158f\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.637703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities\") pod \"7e058f88-40e6-47d2-9fa4-05a204f5158f\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.637754 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqsgn\" (UniqueName: \"kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn\") pod \"7e058f88-40e6-47d2-9fa4-05a204f5158f\" (UID: \"7e058f88-40e6-47d2-9fa4-05a204f5158f\") " Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.638867 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities" (OuterVolumeSpecName: "utilities") pod "7e058f88-40e6-47d2-9fa4-05a204f5158f" (UID: "7e058f88-40e6-47d2-9fa4-05a204f5158f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.658404 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerID="65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c" exitCode=0 Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.658454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerDied","Data":"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c"} Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.658485 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h87lm" event={"ID":"7e058f88-40e6-47d2-9fa4-05a204f5158f","Type":"ContainerDied","Data":"cf96fae6387e3cd2e56037de3e05d9d78d513deeb377daf7638903c613f4dc91"} Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.658506 5030 scope.go:117] "RemoveContainer" containerID="65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.658696 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h87lm" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.665952 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e058f88-40e6-47d2-9fa4-05a204f5158f" (UID: "7e058f88-40e6-47d2-9fa4-05a204f5158f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.701682 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn" (OuterVolumeSpecName: "kube-api-access-bqsgn") pod "7e058f88-40e6-47d2-9fa4-05a204f5158f" (UID: "7e058f88-40e6-47d2-9fa4-05a204f5158f"). InnerVolumeSpecName "kube-api-access-bqsgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.736750 5030 scope.go:117] "RemoveContainer" containerID="508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.738925 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.738948 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e058f88-40e6-47d2-9fa4-05a204f5158f-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.738963 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqsgn\" (UniqueName: \"kubernetes.io/projected/7e058f88-40e6-47d2-9fa4-05a204f5158f-kube-api-access-bqsgn\") on node \"crc\" DevicePath \"\"" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.753935 5030 scope.go:117] "RemoveContainer" containerID="d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.772294 5030 scope.go:117] "RemoveContainer" containerID="65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c" Jan 21 00:48:54 crc kubenswrapper[5030]: E0121 00:48:54.772877 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c\": container with ID starting with 65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c not found: ID does not exist" containerID="65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.772909 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c"} err="failed to get container status \"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c\": rpc error: code = NotFound desc = could not find container \"65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c\": container with ID starting with 65d0cb2a42199d2a797eea86d457f2cdebd23a942f7d05b926b4211a133c844c not found: ID does not exist" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.772936 5030 scope.go:117] "RemoveContainer" containerID="508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349" Jan 21 00:48:54 crc kubenswrapper[5030]: E0121 00:48:54.773176 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349\": container with ID starting with 508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349 not found: ID does not exist" 
containerID="508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.773197 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349"} err="failed to get container status \"508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349\": rpc error: code = NotFound desc = could not find container \"508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349\": container with ID starting with 508392265e94b84d0d0242d32d45340fea2d07dd7524c336dad29ef729700349 not found: ID does not exist" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.773214 5030 scope.go:117] "RemoveContainer" containerID="d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe" Jan 21 00:48:54 crc kubenswrapper[5030]: E0121 00:48:54.773604 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe\": container with ID starting with d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe not found: ID does not exist" containerID="d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.773725 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe"} err="failed to get container status \"d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe\": rpc error: code = NotFound desc = could not find container \"d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe\": container with ID starting with d937b6e5537c10442058a2d61fd4d5e861a1f9daaa30cd8578246b4a3eb566fe not found: ID does not exist" Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.993814 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:54 crc kubenswrapper[5030]: I0121 00:48:54.998347 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h87lm"] Jan 21 00:48:55 crc kubenswrapper[5030]: I0121 00:48:55.970302 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" path="/var/lib/kubelet/pods/7e058f88-40e6-47d2-9fa4-05a204f5158f/volumes" Jan 21 00:49:01 crc kubenswrapper[5030]: I0121 00:49:01.962698 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:49:01 crc kubenswrapper[5030]: E0121 00:49:01.963852 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:49:16 crc kubenswrapper[5030]: I0121 00:49:16.962138 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:49:16 crc kubenswrapper[5030]: E0121 00:49:16.962987 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:49:29 crc kubenswrapper[5030]: I0121 00:49:29.962647 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:49:29 crc kubenswrapper[5030]: E0121 00:49:29.963326 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:49:40 crc kubenswrapper[5030]: I0121 00:49:40.962273 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:49:40 crc kubenswrapper[5030]: E0121 00:49:40.962984 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:49:51 crc kubenswrapper[5030]: I0121 00:49:51.962544 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:49:51 crc kubenswrapper[5030]: E0121 00:49:51.963449 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:50:06 crc kubenswrapper[5030]: I0121 00:50:06.962006 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:50:06 crc kubenswrapper[5030]: E0121 00:50:06.962725 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:50:18 crc kubenswrapper[5030]: I0121 00:50:18.962139 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:50:18 crc kubenswrapper[5030]: E0121 00:50:18.962861 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:50:31 crc kubenswrapper[5030]: I0121 00:50:31.963127 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:50:31 crc kubenswrapper[5030]: E0121 00:50:31.964179 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:50:35 crc kubenswrapper[5030]: I0121 00:50:35.404194 5030 scope.go:117] "RemoveContainer" containerID="3896a6e6bb21c09b1e7b87a886accc1b0d58aa8f0af3cbfb046abdfa425f4bac" Jan 21 00:50:45 crc kubenswrapper[5030]: I0121 00:50:45.961865 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:50:45 crc kubenswrapper[5030]: E0121 00:50:45.962634 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:50:59 crc kubenswrapper[5030]: I0121 00:50:59.962843 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:50:59 crc kubenswrapper[5030]: E0121 00:50:59.963800 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.381194 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:10 crc kubenswrapper[5030]: E0121 00:51:10.382113 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="registry-server" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.382130 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="registry-server" Jan 21 00:51:10 crc kubenswrapper[5030]: E0121 00:51:10.382174 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="extract-utilities" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.382183 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="extract-utilities" Jan 21 00:51:10 crc kubenswrapper[5030]: E0121 00:51:10.382197 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="extract-content" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.382205 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="extract-content" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.382352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e058f88-40e6-47d2-9fa4-05a204f5158f" containerName="registry-server" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.383494 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.393391 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.538339 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j572m\" (UniqueName: \"kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.538704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.538776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.639801 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.639858 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.639922 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j572m\" (UniqueName: \"kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.640404 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc 
kubenswrapper[5030]: I0121 00:51:10.640459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.659685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j572m\" (UniqueName: \"kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m\") pod \"community-operators-jzftp\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:10 crc kubenswrapper[5030]: I0121 00:51:10.757120 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:11 crc kubenswrapper[5030]: I0121 00:51:11.046759 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:11 crc kubenswrapper[5030]: I0121 00:51:11.616101 5030 generic.go:334] "Generic (PLEG): container finished" podID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerID="af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce" exitCode=0 Jan 21 00:51:11 crc kubenswrapper[5030]: I0121 00:51:11.616344 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerDied","Data":"af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce"} Jan 21 00:51:11 crc kubenswrapper[5030]: I0121 00:51:11.616369 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerStarted","Data":"fbfb76ebc551b5bacbd57bd62f337b897fb74a3305ee8f3bfcb9c9aec006e6d6"} Jan 21 00:51:11 crc kubenswrapper[5030]: I0121 00:51:11.618284 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:51:13 crc kubenswrapper[5030]: I0121 00:51:13.637284 5030 generic.go:334] "Generic (PLEG): container finished" podID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerID="6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966" exitCode=0 Jan 21 00:51:13 crc kubenswrapper[5030]: I0121 00:51:13.637353 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerDied","Data":"6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966"} Jan 21 00:51:13 crc kubenswrapper[5030]: I0121 00:51:13.962165 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:51:14 crc kubenswrapper[5030]: I0121 00:51:14.645427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerStarted","Data":"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52"} Jan 21 00:51:14 crc kubenswrapper[5030]: I0121 00:51:14.647481 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3"} Jan 21 00:51:14 crc kubenswrapper[5030]: I0121 00:51:14.664573 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jzftp" podStartSLOduration=2.106559455 podStartE2EDuration="4.664554003s" podCreationTimestamp="2026-01-21 00:51:10 +0000 UTC" firstStartedPulling="2026-01-21 00:51:11.618071952 +0000 UTC m=+8143.938332240" lastFinishedPulling="2026-01-21 00:51:14.1760665 +0000 UTC m=+8146.496326788" observedRunningTime="2026-01-21 00:51:14.660607328 +0000 UTC m=+8146.980867626" watchObservedRunningTime="2026-01-21 00:51:14.664554003 +0000 UTC m=+8146.984814291" Jan 21 00:51:20 crc kubenswrapper[5030]: I0121 00:51:20.758141 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:20 crc kubenswrapper[5030]: I0121 00:51:20.758789 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:20 crc kubenswrapper[5030]: I0121 00:51:20.830876 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:21 crc kubenswrapper[5030]: I0121 00:51:21.766397 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:21 crc kubenswrapper[5030]: I0121 00:51:21.821794 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:23 crc kubenswrapper[5030]: I0121 00:51:23.718856 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jzftp" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="registry-server" containerID="cri-o://b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52" gracePeriod=2 Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.666749 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.726164 5030 generic.go:334] "Generic (PLEG): container finished" podID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerID="b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52" exitCode=0 Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.726222 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jzftp" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.726211 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerDied","Data":"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52"} Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.726576 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jzftp" event={"ID":"5eba378e-4e31-4889-a3a3-16dcab1eb573","Type":"ContainerDied","Data":"fbfb76ebc551b5bacbd57bd62f337b897fb74a3305ee8f3bfcb9c9aec006e6d6"} Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.726600 5030 scope.go:117] "RemoveContainer" containerID="b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.742839 5030 scope.go:117] "RemoveContainer" containerID="6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.758688 5030 scope.go:117] "RemoveContainer" containerID="af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.784477 5030 scope.go:117] "RemoveContainer" containerID="b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52" Jan 21 00:51:24 crc kubenswrapper[5030]: E0121 00:51:24.784995 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52\": container with ID starting with b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52 not found: ID does not exist" containerID="b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.785050 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52"} err="failed to get container status \"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52\": rpc error: code = NotFound desc = could not find container \"b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52\": container with ID starting with b13160b44ce423d0f3f9da0d5415c4d1b6fa95f3c89451263a573671b9170e52 not found: ID does not exist" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.785081 5030 scope.go:117] "RemoveContainer" containerID="6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966" Jan 21 00:51:24 crc kubenswrapper[5030]: E0121 00:51:24.785519 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966\": container with ID starting with 6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966 not found: ID does not exist" containerID="6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.785552 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966"} err="failed to get container status \"6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966\": rpc error: code = NotFound desc = could not find container 
\"6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966\": container with ID starting with 6b1bc144684f53d4a28db144bfff185cdac5afecd004cd752d8ccc4e22695966 not found: ID does not exist" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.785571 5030 scope.go:117] "RemoveContainer" containerID="af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce" Jan 21 00:51:24 crc kubenswrapper[5030]: E0121 00:51:24.785910 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce\": container with ID starting with af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce not found: ID does not exist" containerID="af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.785943 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce"} err="failed to get container status \"af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce\": rpc error: code = NotFound desc = could not find container \"af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce\": container with ID starting with af231f45e8ab7c944c22fca6116b58bb904b4dc0685f2e3493179d03dd5c97ce not found: ID does not exist" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.857368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j572m\" (UniqueName: \"kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m\") pod \"5eba378e-4e31-4889-a3a3-16dcab1eb573\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.857443 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content\") pod \"5eba378e-4e31-4889-a3a3-16dcab1eb573\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.857475 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities\") pod \"5eba378e-4e31-4889-a3a3-16dcab1eb573\" (UID: \"5eba378e-4e31-4889-a3a3-16dcab1eb573\") " Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.858374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities" (OuterVolumeSpecName: "utilities") pod "5eba378e-4e31-4889-a3a3-16dcab1eb573" (UID: "5eba378e-4e31-4889-a3a3-16dcab1eb573"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.858443 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.864228 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m" (OuterVolumeSpecName: "kube-api-access-j572m") pod "5eba378e-4e31-4889-a3a3-16dcab1eb573" (UID: "5eba378e-4e31-4889-a3a3-16dcab1eb573"). 
InnerVolumeSpecName "kube-api-access-j572m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.916193 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eba378e-4e31-4889-a3a3-16dcab1eb573" (UID: "5eba378e-4e31-4889-a3a3-16dcab1eb573"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.959492 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j572m\" (UniqueName: \"kubernetes.io/projected/5eba378e-4e31-4889-a3a3-16dcab1eb573-kube-api-access-j572m\") on node \"crc\" DevicePath \"\"" Jan 21 00:51:24 crc kubenswrapper[5030]: I0121 00:51:24.959546 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eba378e-4e31-4889-a3a3-16dcab1eb573-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:51:25 crc kubenswrapper[5030]: I0121 00:51:25.055869 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:25 crc kubenswrapper[5030]: I0121 00:51:25.060307 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jzftp"] Jan 21 00:51:25 crc kubenswrapper[5030]: I0121 00:51:25.971454 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" path="/var/lib/kubelet/pods/5eba378e-4e31-4889-a3a3-16dcab1eb573/volumes" Jan 21 00:53:40 crc kubenswrapper[5030]: I0121 00:53:40.157267 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:53:40 crc kubenswrapper[5030]: I0121 00:53:40.158001 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:54:10 crc kubenswrapper[5030]: I0121 00:54:10.157791 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:54:10 crc kubenswrapper[5030]: I0121 00:54:10.158489 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.644452 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 00:54:11 crc kubenswrapper[5030]: E0121 00:54:11.644961 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" 
containerName="registry-server" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.644972 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="registry-server" Jan 21 00:54:11 crc kubenswrapper[5030]: E0121 00:54:11.644984 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="extract-content" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.644990 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="extract-content" Jan 21 00:54:11 crc kubenswrapper[5030]: E0121 00:54:11.645004 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="extract-utilities" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.645012 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="extract-utilities" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.645123 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eba378e-4e31-4889-a3a3-16dcab1eb573" containerName="registry-server" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.645508 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.648940 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.649070 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-z2qk6" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.657283 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.666138 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.725701 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7gb\" (UniqueName: \"kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb\") pod \"mariadb-operator-index-2m42b\" (UID: \"b4a4dd1c-2a1a-49f6-800e-f484edac8630\") " pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.826769 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7gb\" (UniqueName: \"kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb\") pod \"mariadb-operator-index-2m42b\" (UID: \"b4a4dd1c-2a1a-49f6-800e-f484edac8630\") " pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.849306 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7gb\" (UniqueName: \"kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb\") pod \"mariadb-operator-index-2m42b\" (UID: \"b4a4dd1c-2a1a-49f6-800e-f484edac8630\") " pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:11 crc kubenswrapper[5030]: I0121 00:54:11.960362 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:12 crc kubenswrapper[5030]: I0121 00:54:12.392743 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 00:54:12 crc kubenswrapper[5030]: I0121 00:54:12.886402 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2m42b" event={"ID":"b4a4dd1c-2a1a-49f6-800e-f484edac8630","Type":"ContainerStarted","Data":"1ad322bf48dd4186890f4dc6232fa84b68c295acf409a08090934f2d4cc6d819"} Jan 21 00:54:13 crc kubenswrapper[5030]: I0121 00:54:13.897786 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2m42b" event={"ID":"b4a4dd1c-2a1a-49f6-800e-f484edac8630","Type":"ContainerStarted","Data":"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1"} Jan 21 00:54:13 crc kubenswrapper[5030]: I0121 00:54:13.918128 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-2m42b" podStartSLOduration=2.332589365 podStartE2EDuration="2.91810652s" podCreationTimestamp="2026-01-21 00:54:11 +0000 UTC" firstStartedPulling="2026-01-21 00:54:12.401880762 +0000 UTC m=+8324.722141060" lastFinishedPulling="2026-01-21 00:54:12.987397927 +0000 UTC m=+8325.307658215" observedRunningTime="2026-01-21 00:54:13.911096922 +0000 UTC m=+8326.231357220" watchObservedRunningTime="2026-01-21 00:54:13.91810652 +0000 UTC m=+8326.238366818" Jan 21 00:54:21 crc kubenswrapper[5030]: I0121 00:54:21.969961 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:21 crc kubenswrapper[5030]: I0121 00:54:21.970576 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:21 crc kubenswrapper[5030]: I0121 00:54:21.991110 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:22 crc kubenswrapper[5030]: I0121 00:54:22.984709 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.578594 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt"] Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.580529 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.582680 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.589202 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt"] Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.708398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.708490 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.708518 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn5gg\" (UniqueName: \"kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.809841 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.809948 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.809984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn5gg\" (UniqueName: \"kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.810557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.810557 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.828151 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn5gg\" (UniqueName: \"kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg\") pod \"a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.900059 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:54:28 crc kubenswrapper[5030]: I0121 00:54:28.907869 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:29 crc kubenswrapper[5030]: I0121 00:54:29.347933 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt"] Jan 21 00:54:29 crc kubenswrapper[5030]: W0121 00:54:29.354017 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66b998ce_d629_4444_a463_3cccdff8895b.slice/crio-c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26 WatchSource:0}: Error finding container c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26: Status 404 returned error can't find the container with id c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26 Jan 21 00:54:29 crc kubenswrapper[5030]: I0121 00:54:29.995782 5030 generic.go:334] "Generic (PLEG): container finished" podID="66b998ce-d629-4444-a463-3cccdff8895b" containerID="5e5c187b2897d2120605eda63d7bd7c076ca69b038506e43e7d41a1926cd3fa3" exitCode=0 Jan 21 00:54:29 crc kubenswrapper[5030]: I0121 00:54:29.995861 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" event={"ID":"66b998ce-d629-4444-a463-3cccdff8895b","Type":"ContainerDied","Data":"5e5c187b2897d2120605eda63d7bd7c076ca69b038506e43e7d41a1926cd3fa3"} Jan 21 00:54:29 crc kubenswrapper[5030]: I0121 00:54:29.996215 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" event={"ID":"66b998ce-d629-4444-a463-3cccdff8895b","Type":"ContainerStarted","Data":"c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26"} Jan 21 00:54:31 crc kubenswrapper[5030]: I0121 00:54:31.006830 5030 generic.go:334] "Generic (PLEG): container finished" podID="66b998ce-d629-4444-a463-3cccdff8895b" 
containerID="aba101a4eea595818e1a187ad716bc13083a6477652da52dbb2ddeb6f0b2c736" exitCode=0 Jan 21 00:54:31 crc kubenswrapper[5030]: I0121 00:54:31.006976 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" event={"ID":"66b998ce-d629-4444-a463-3cccdff8895b","Type":"ContainerDied","Data":"aba101a4eea595818e1a187ad716bc13083a6477652da52dbb2ddeb6f0b2c736"} Jan 21 00:54:32 crc kubenswrapper[5030]: I0121 00:54:32.015050 5030 generic.go:334] "Generic (PLEG): container finished" podID="66b998ce-d629-4444-a463-3cccdff8895b" containerID="9cf0ecb3132f7ae4f4d1cb6a7868f5b0cca4cc9a588e6f1f8cdf0d7639d25a3a" exitCode=0 Jan 21 00:54:32 crc kubenswrapper[5030]: I0121 00:54:32.015158 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" event={"ID":"66b998ce-d629-4444-a463-3cccdff8895b","Type":"ContainerDied","Data":"9cf0ecb3132f7ae4f4d1cb6a7868f5b0cca4cc9a588e6f1f8cdf0d7639d25a3a"} Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.417101 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.579703 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util\") pod \"66b998ce-d629-4444-a463-3cccdff8895b\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.579761 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle\") pod \"66b998ce-d629-4444-a463-3cccdff8895b\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.579901 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn5gg\" (UniqueName: \"kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg\") pod \"66b998ce-d629-4444-a463-3cccdff8895b\" (UID: \"66b998ce-d629-4444-a463-3cccdff8895b\") " Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.580877 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle" (OuterVolumeSpecName: "bundle") pod "66b998ce-d629-4444-a463-3cccdff8895b" (UID: "66b998ce-d629-4444-a463-3cccdff8895b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.587675 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg" (OuterVolumeSpecName: "kube-api-access-xn5gg") pod "66b998ce-d629-4444-a463-3cccdff8895b" (UID: "66b998ce-d629-4444-a463-3cccdff8895b"). InnerVolumeSpecName "kube-api-access-xn5gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.597238 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util" (OuterVolumeSpecName: "util") pod "66b998ce-d629-4444-a463-3cccdff8895b" (UID: "66b998ce-d629-4444-a463-3cccdff8895b"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.681588 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.681652 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66b998ce-d629-4444-a463-3cccdff8895b-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:54:33 crc kubenswrapper[5030]: I0121 00:54:33.681668 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn5gg\" (UniqueName: \"kubernetes.io/projected/66b998ce-d629-4444-a463-3cccdff8895b-kube-api-access-xn5gg\") on node \"crc\" DevicePath \"\"" Jan 21 00:54:34 crc kubenswrapper[5030]: I0121 00:54:34.030097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" event={"ID":"66b998ce-d629-4444-a463-3cccdff8895b","Type":"ContainerDied","Data":"c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26"} Jan 21 00:54:34 crc kubenswrapper[5030]: I0121 00:54:34.030134 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d875505b45e6b7a63bf39f4f7985f9f39c5c146f4fdc606bd534662afcaa26" Jan 21 00:54:34 crc kubenswrapper[5030]: I0121 00:54:34.030233 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt" Jan 21 00:54:40 crc kubenswrapper[5030]: I0121 00:54:40.157910 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:54:40 crc kubenswrapper[5030]: I0121 00:54:40.158251 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:54:40 crc kubenswrapper[5030]: I0121 00:54:40.158293 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:54:40 crc kubenswrapper[5030]: I0121 00:54:40.158922 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:54:40 crc kubenswrapper[5030]: I0121 00:54:40.158968 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3" gracePeriod=600 Jan 21 00:54:41 crc kubenswrapper[5030]: I0121 00:54:41.072578 5030 generic.go:334] "Generic 
(PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3" exitCode=0 Jan 21 00:54:41 crc kubenswrapper[5030]: I0121 00:54:41.072679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3"} Jan 21 00:54:41 crc kubenswrapper[5030]: I0121 00:54:41.072900 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87"} Jan 21 00:54:41 crc kubenswrapper[5030]: I0121 00:54:41.072926 5030 scope.go:117] "RemoveContainer" containerID="7a7d3c6ab9d3574cd64c55f971eb58848f5e65eb3534c5bbd0f0d44ef6a3e6b9" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.933343 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 00:54:45 crc kubenswrapper[5030]: E0121 00:54:45.934171 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="extract" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.934186 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="extract" Jan 21 00:54:45 crc kubenswrapper[5030]: E0121 00:54:45.934199 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="pull" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.934206 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="pull" Jan 21 00:54:45 crc kubenswrapper[5030]: E0121 00:54:45.934237 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="util" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.934250 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="util" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.934397 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b998ce-d629-4444-a463-3cccdff8895b" containerName="extract" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.935028 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.939236 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.939264 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.953638 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-857tc" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.957104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.968899 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nc9\" (UniqueName: \"kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.968984 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:45 crc kubenswrapper[5030]: I0121 00:54:45.969075 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.070480 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nc9\" (UniqueName: \"kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.070579 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.070735 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") 
" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.078055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.101891 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.104029 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nc9\" (UniqueName: \"kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9\") pod \"mariadb-operator-controller-manager-84ccc575b7-gxztw\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.253197 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:46 crc kubenswrapper[5030]: I0121 00:54:46.660923 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 00:54:46 crc kubenswrapper[5030]: W0121 00:54:46.666599 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode782a2bc_b80e_43ca_9fb0_49f1fba7739b.slice/crio-d00df3285a40bcfaeb234edb7c047f52a7de7fe06baaad294dda1859a7cc9ffc WatchSource:0}: Error finding container d00df3285a40bcfaeb234edb7c047f52a7de7fe06baaad294dda1859a7cc9ffc: Status 404 returned error can't find the container with id d00df3285a40bcfaeb234edb7c047f52a7de7fe06baaad294dda1859a7cc9ffc Jan 21 00:54:47 crc kubenswrapper[5030]: I0121 00:54:47.109722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" event={"ID":"e782a2bc-b80e-43ca-9fb0-49f1fba7739b","Type":"ContainerStarted","Data":"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c"} Jan 21 00:54:47 crc kubenswrapper[5030]: I0121 00:54:47.109762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" event={"ID":"e782a2bc-b80e-43ca-9fb0-49f1fba7739b","Type":"ContainerStarted","Data":"d00df3285a40bcfaeb234edb7c047f52a7de7fe06baaad294dda1859a7cc9ffc"} Jan 21 00:54:47 crc kubenswrapper[5030]: I0121 00:54:47.109852 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:54:47 crc kubenswrapper[5030]: I0121 00:54:47.136692 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" podStartSLOduration=2.136671022 podStartE2EDuration="2.136671022s" 
podCreationTimestamp="2026-01-21 00:54:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:54:47.131051478 +0000 UTC m=+8359.451311796" watchObservedRunningTime="2026-01-21 00:54:47.136671022 +0000 UTC m=+8359.456931310" Jan 21 00:54:56 crc kubenswrapper[5030]: I0121 00:54:56.259383 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.330765 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.332160 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.337263 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-vdfrj" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.339436 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.524642 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txcv\" (UniqueName: \"kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv\") pod \"infra-operator-index-nhwzj\" (UID: \"6b3d7a00-a57e-4327-bc57-7ee501d8eb12\") " pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.626303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txcv\" (UniqueName: \"kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv\") pod \"infra-operator-index-nhwzj\" (UID: \"6b3d7a00-a57e-4327-bc57-7ee501d8eb12\") " pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.652596 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txcv\" (UniqueName: \"kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv\") pod \"infra-operator-index-nhwzj\" (UID: \"6b3d7a00-a57e-4327-bc57-7ee501d8eb12\") " pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:02 crc kubenswrapper[5030]: I0121 00:55:02.954276 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:03 crc kubenswrapper[5030]: I0121 00:55:03.209595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.219438 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-nhwzj" event={"ID":"6b3d7a00-a57e-4327-bc57-7ee501d8eb12","Type":"ContainerStarted","Data":"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364"} Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.219785 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-nhwzj" event={"ID":"6b3d7a00-a57e-4327-bc57-7ee501d8eb12","Type":"ContainerStarted","Data":"d86cf1d6cecca203d5b2a31b6f99da857bf5aa6ef587785766b4865963ff900e"} Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.234042 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-nhwzj" podStartSLOduration=1.618848128 podStartE2EDuration="2.234023773s" podCreationTimestamp="2026-01-21 00:55:02 +0000 UTC" firstStartedPulling="2026-01-21 00:55:03.222490327 +0000 UTC m=+8375.542750615" lastFinishedPulling="2026-01-21 00:55:03.837665972 +0000 UTC m=+8376.157926260" observedRunningTime="2026-01-21 00:55:04.230080768 +0000 UTC m=+8376.550341056" watchObservedRunningTime="2026-01-21 00:55:04.234023773 +0000 UTC m=+8376.554284061" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.454987 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.456238 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.458523 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-config-data" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.458729 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"kube-root-ca.crt" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.458879 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openstack-scripts" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.459058 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"openshift-service-ca.crt" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.459284 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"galera-openstack-dockercfg-j42pw" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.468805 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.488638 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.490230 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.495830 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.497003 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.515447 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.519679 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557280 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557430 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2phh7\" (UniqueName: \"kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557492 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.557519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661198 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc 
kubenswrapper[5030]: I0121 00:55:04.661269 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661303 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661352 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661380 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661504 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661565 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661596 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661631 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwts\" (UniqueName: \"kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 
00:55:04.661668 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x882\" (UniqueName: \"kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661695 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661721 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661748 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661776 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2phh7\" (UniqueName: \"kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661851 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.661870 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.662571 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc 
kubenswrapper[5030]: I0121 00:55:04.663025 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.663267 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") device mount path \"/mnt/openstack/pv07\"" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.663459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.666277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.682038 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2phh7\" (UniqueName: \"kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.682369 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763612 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763721 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763757 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763810 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.763970 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764058 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwts\" (UniqueName: \"kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764097 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x882\" (UniqueName: \"kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764152 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764178 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764234 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764397 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated\") pod \"openstack-galera-2\" (UID: 
\"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764500 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") device mount path \"/mnt/openstack/pv14\"" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764845 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.764983 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") device mount path \"/mnt/openstack/pv15\"" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.765055 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.765183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.765395 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.765936 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.766050 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.767012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 
00:55:04.781481 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.781850 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x882\" (UniqueName: \"kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.781944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwwts\" (UniqueName: \"kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts\") pod \"openstack-galera-1\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.784692 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.786946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"openstack-galera-2\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.808504 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:04 crc kubenswrapper[5030]: I0121 00:55:04.843312 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:05 crc kubenswrapper[5030]: I0121 00:55:05.253611 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 00:55:05 crc kubenswrapper[5030]: W0121 00:55:05.258918 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97ea303d_f60d_42ad_a86b_0b371360c2b1.slice/crio-001e23f4f61d1708295310b3706481d45ffc1a56ad4ec7174fa9003cdd8c1b92 WatchSource:0}: Error finding container 001e23f4f61d1708295310b3706481d45ffc1a56ad4ec7174fa9003cdd8c1b92: Status 404 returned error can't find the container with id 001e23f4f61d1708295310b3706481d45ffc1a56ad4ec7174fa9003cdd8c1b92 Jan 21 00:55:05 crc kubenswrapper[5030]: I0121 00:55:05.304595 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 00:55:05 crc kubenswrapper[5030]: I0121 00:55:05.362994 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 00:55:05 crc kubenswrapper[5030]: W0121 00:55:05.372145 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod765d3219_1f0d_4bda_8f7f_716ee4cb51ae.slice/crio-77f2e866895ee11a00785af56c0f92cffa95772ddd75419689b6c3bf65dfe8bc WatchSource:0}: Error finding container 77f2e866895ee11a00785af56c0f92cffa95772ddd75419689b6c3bf65dfe8bc: Status 404 returned error can't find the container with id 77f2e866895ee11a00785af56c0f92cffa95772ddd75419689b6c3bf65dfe8bc Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.234177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerStarted","Data":"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d"} Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.234513 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerStarted","Data":"77f2e866895ee11a00785af56c0f92cffa95772ddd75419689b6c3bf65dfe8bc"} Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.236027 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerStarted","Data":"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4"} Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.236111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerStarted","Data":"2af0eef2c894852b53610ad3fc7bc00566546a5ef4b28578bfd4902329b14fd2"} Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.237426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerStarted","Data":"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148"} Jan 21 00:55:06 crc kubenswrapper[5030]: I0121 00:55:06.237457 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerStarted","Data":"001e23f4f61d1708295310b3706481d45ffc1a56ad4ec7174fa9003cdd8c1b92"} Jan 21 00:55:09 crc 
kubenswrapper[5030]: I0121 00:55:09.269220 5030 generic.go:334] "Generic (PLEG): container finished" podID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerID="dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d" exitCode=0 Jan 21 00:55:09 crc kubenswrapper[5030]: I0121 00:55:09.269378 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerDied","Data":"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d"} Jan 21 00:55:09 crc kubenswrapper[5030]: I0121 00:55:09.272877 5030 generic.go:334] "Generic (PLEG): container finished" podID="69e29420-6e97-43b9-8a13-78ee659137a3" containerID="ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4" exitCode=0 Jan 21 00:55:09 crc kubenswrapper[5030]: I0121 00:55:09.272980 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerDied","Data":"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4"} Jan 21 00:55:09 crc kubenswrapper[5030]: I0121 00:55:09.280997 5030 generic.go:334] "Generic (PLEG): container finished" podID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerID="ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148" exitCode=0 Jan 21 00:55:09 crc kubenswrapper[5030]: I0121 00:55:09.281054 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerDied","Data":"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148"} Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.290388 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerStarted","Data":"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7"} Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.293363 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerStarted","Data":"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529"} Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.295609 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerStarted","Data":"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073"} Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.310776 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-0" podStartSLOduration=7.310750099 podStartE2EDuration="7.310750099s" podCreationTimestamp="2026-01-21 00:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:55:10.308456144 +0000 UTC m=+8382.628716452" watchObservedRunningTime="2026-01-21 00:55:10.310750099 +0000 UTC m=+8382.631010477" Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.341025 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-2" podStartSLOduration=7.341006084 podStartE2EDuration="7.341006084s" podCreationTimestamp="2026-01-21 00:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:55:10.33753548 +0000 UTC m=+8382.657795778" watchObservedRunningTime="2026-01-21 00:55:10.341006084 +0000 UTC m=+8382.661266372" Jan 21 00:55:10 crc kubenswrapper[5030]: I0121 00:55:10.360310 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/openstack-galera-1" podStartSLOduration=7.360292054 podStartE2EDuration="7.360292054s" podCreationTimestamp="2026-01-21 00:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:55:10.356669098 +0000 UTC m=+8382.676929396" watchObservedRunningTime="2026-01-21 00:55:10.360292054 +0000 UTC m=+8382.680552342" Jan 21 00:55:12 crc kubenswrapper[5030]: I0121 00:55:12.954871 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:12 crc kubenswrapper[5030]: I0121 00:55:12.955271 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:12 crc kubenswrapper[5030]: I0121 00:55:12.981547 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:13 crc kubenswrapper[5030]: I0121 00:55:13.345008 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.775913 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6"] Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.778746 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.780602 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.785528 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.785697 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.787612 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6"] Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.809663 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.809715 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.844332 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.844387 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.856957 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.857140 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.857414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66xb\" (UniqueName: \"kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.958599 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.958677 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.958738 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66xb\" (UniqueName: \"kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.959164 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.959388 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:14 crc kubenswrapper[5030]: I0121 00:55:14.979161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66xb\" (UniqueName: \"kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb\") pod \"ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:15 crc kubenswrapper[5030]: I0121 00:55:15.098728 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:15 crc kubenswrapper[5030]: I0121 00:55:15.596193 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6"] Jan 21 00:55:16 crc kubenswrapper[5030]: I0121 00:55:16.340095 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ec3adc6-5340-444c-b220-8cb79b493534" containerID="9545e32aa27716e913bacfcac37330599ebcd471a287020024a71734eed884b2" exitCode=0 Jan 21 00:55:16 crc kubenswrapper[5030]: I0121 00:55:16.340148 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" event={"ID":"8ec3adc6-5340-444c-b220-8cb79b493534","Type":"ContainerDied","Data":"9545e32aa27716e913bacfcac37330599ebcd471a287020024a71734eed884b2"} Jan 21 00:55:16 crc kubenswrapper[5030]: I0121 00:55:16.340187 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" event={"ID":"8ec3adc6-5340-444c-b220-8cb79b493534","Type":"ContainerStarted","Data":"df1c850f926d9e1eff0edd33fdba44bee98cd038a1fe17ae519307bb4ea8ebb3"} Jan 21 00:55:17 crc kubenswrapper[5030]: I0121 00:55:17.347469 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ec3adc6-5340-444c-b220-8cb79b493534" containerID="6f50096578fa9013b1f2b0602ea184139cf29271f5e6baef23dfeb4f46554d6f" exitCode=0 Jan 21 00:55:17 crc kubenswrapper[5030]: I0121 00:55:17.347543 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" event={"ID":"8ec3adc6-5340-444c-b220-8cb79b493534","Type":"ContainerDied","Data":"6f50096578fa9013b1f2b0602ea184139cf29271f5e6baef23dfeb4f46554d6f"} Jan 21 00:55:18 crc kubenswrapper[5030]: I0121 00:55:18.355957 5030 generic.go:334] "Generic (PLEG): container finished" podID="8ec3adc6-5340-444c-b220-8cb79b493534" containerID="6d810c1c168a54f2cd0fd942db694c7a8b69bad89e1ef93b4a16c53b7595b45e" exitCode=0 Jan 21 00:55:18 crc kubenswrapper[5030]: I0121 00:55:18.356000 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" event={"ID":"8ec3adc6-5340-444c-b220-8cb79b493534","Type":"ContainerDied","Data":"6d810c1c168a54f2cd0fd942db694c7a8b69bad89e1ef93b4a16c53b7595b45e"} Jan 21 00:55:18 crc kubenswrapper[5030]: I0121 00:55:18.884826 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:18 crc kubenswrapper[5030]: I0121 00:55:18.980819 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.643220 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.735065 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle\") pod \"8ec3adc6-5340-444c-b220-8cb79b493534\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.735192 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util\") pod \"8ec3adc6-5340-444c-b220-8cb79b493534\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.735235 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f66xb\" (UniqueName: \"kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb\") pod \"8ec3adc6-5340-444c-b220-8cb79b493534\" (UID: \"8ec3adc6-5340-444c-b220-8cb79b493534\") " Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.738086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle" (OuterVolumeSpecName: "bundle") pod "8ec3adc6-5340-444c-b220-8cb79b493534" (UID: "8ec3adc6-5340-444c-b220-8cb79b493534"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.741279 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb" (OuterVolumeSpecName: "kube-api-access-f66xb") pod "8ec3adc6-5340-444c-b220-8cb79b493534" (UID: "8ec3adc6-5340-444c-b220-8cb79b493534"). InnerVolumeSpecName "kube-api-access-f66xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.763556 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util" (OuterVolumeSpecName: "util") pod "8ec3adc6-5340-444c-b220-8cb79b493534" (UID: "8ec3adc6-5340-444c-b220-8cb79b493534"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.838067 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.838105 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ec3adc6-5340-444c-b220-8cb79b493534-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:19 crc kubenswrapper[5030]: I0121 00:55:19.838124 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f66xb\" (UniqueName: \"kubernetes.io/projected/8ec3adc6-5340-444c-b220-8cb79b493534-kube-api-access-f66xb\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:20 crc kubenswrapper[5030]: I0121 00:55:20.373045 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" event={"ID":"8ec3adc6-5340-444c-b220-8cb79b493534","Type":"ContainerDied","Data":"df1c850f926d9e1eff0edd33fdba44bee98cd038a1fe17ae519307bb4ea8ebb3"} Jan 21 00:55:20 crc kubenswrapper[5030]: I0121 00:55:20.373292 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1c850f926d9e1eff0edd33fdba44bee98cd038a1fe17ae519307bb4ea8ebb3" Jan 21 00:55:20 crc kubenswrapper[5030]: I0121 00:55:20.373125 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.533734 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/root-account-create-update-p57x5"] Jan 21 00:55:23 crc kubenswrapper[5030]: E0121 00:55:23.534279 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="pull" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.534292 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="pull" Jan 21 00:55:23 crc kubenswrapper[5030]: E0121 00:55:23.534306 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="extract" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.534311 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="extract" Jan 21 00:55:23 crc kubenswrapper[5030]: E0121 00:55:23.534343 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="util" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.534349 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="util" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.534463 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" containerName="extract" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.534932 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.540029 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.548791 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-p57x5"] Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.698383 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.698507 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mxs\" (UniqueName: \"kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.800683 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mxs\" (UniqueName: \"kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.800753 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.801518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.820559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mxs\" (UniqueName: \"kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs\") pod \"root-account-create-update-p57x5\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:23 crc kubenswrapper[5030]: I0121 00:55:23.851476 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:24 crc kubenswrapper[5030]: I0121 00:55:24.091214 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-p57x5"] Jan 21 00:55:24 crc kubenswrapper[5030]: I0121 00:55:24.399234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-p57x5" event={"ID":"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c","Type":"ContainerStarted","Data":"1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341"} Jan 21 00:55:24 crc kubenswrapper[5030]: I0121 00:55:24.982926 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/openstack-galera-2" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="galera" probeResult="failure" output=< Jan 21 00:55:24 crc kubenswrapper[5030]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 21 00:55:24 crc kubenswrapper[5030]: > Jan 21 00:55:25 crc kubenswrapper[5030]: I0121 00:55:25.407641 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-p57x5" event={"ID":"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c","Type":"ContainerStarted","Data":"313f8b3f2da47a1746bf985046b8a211ae982f8ee7c23e1d6210dbe1f6e0b658"} Jan 21 00:55:25 crc kubenswrapper[5030]: I0121 00:55:25.429158 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/root-account-create-update-p57x5" podStartSLOduration=2.429143314 podStartE2EDuration="2.429143314s" podCreationTimestamp="2026-01-21 00:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:55:25.427176396 +0000 UTC m=+8397.747436684" watchObservedRunningTime="2026-01-21 00:55:25.429143314 +0000 UTC m=+8397.749403602" Jan 21 00:55:26 crc kubenswrapper[5030]: I0121 00:55:26.415989 5030 generic.go:334] "Generic (PLEG): container finished" podID="f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" containerID="313f8b3f2da47a1746bf985046b8a211ae982f8ee7c23e1d6210dbe1f6e0b658" exitCode=0 Jan 21 00:55:26 crc kubenswrapper[5030]: I0121 00:55:26.416070 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-p57x5" event={"ID":"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c","Type":"ContainerDied","Data":"313f8b3f2da47a1746bf985046b8a211ae982f8ee7c23e1d6210dbe1f6e0b658"} Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.685538 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.860753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8mxs\" (UniqueName: \"kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs\") pod \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.860874 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts\") pod \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\" (UID: \"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c\") " Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.862033 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" (UID: "f1e9a747-3b2d-450e-a1a3-aa49bce30b7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.868172 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs" (OuterVolumeSpecName: "kube-api-access-p8mxs") pod "f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" (UID: "f1e9a747-3b2d-450e-a1a3-aa49bce30b7c"). InnerVolumeSpecName "kube-api-access-p8mxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.962150 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:27 crc kubenswrapper[5030]: I0121 00:55:27.962198 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8mxs\" (UniqueName: \"kubernetes.io/projected/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c-kube-api-access-p8mxs\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:28 crc kubenswrapper[5030]: I0121 00:55:28.430208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/root-account-create-update-p57x5" event={"ID":"f1e9a747-3b2d-450e-a1a3-aa49bce30b7c","Type":"ContainerDied","Data":"1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341"} Jan 21 00:55:28 crc kubenswrapper[5030]: I0121 00:55:28.430540 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341" Jan 21 00:55:28 crc kubenswrapper[5030]: I0121 00:55:28.430268 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-p57x5" Jan 21 00:55:29 crc kubenswrapper[5030]: E0121 00:55:29.985766 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache]" Jan 21 00:55:30 crc kubenswrapper[5030]: I0121 00:55:30.499266 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:30 crc kubenswrapper[5030]: I0121 00:55:30.573668 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 00:55:33 crc kubenswrapper[5030]: I0121 00:55:33.726684 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:33 crc kubenswrapper[5030]: I0121 00:55:33.801973 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 00:55:40 crc kubenswrapper[5030]: E0121 00:55:40.135888 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache]" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.341504 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:44 crc kubenswrapper[5030]: E0121 00:55:44.342548 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" containerName="mariadb-account-create-update" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.342568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" containerName="mariadb-account-create-update" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.343012 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" containerName="mariadb-account-create-update" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.345207 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.378254 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.464613 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.464743 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9hhp\" (UniqueName: \"kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.464766 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.566136 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.566232 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.566304 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9hhp\" (UniqueName: \"kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.566698 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.566790 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.584685 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m9hhp\" (UniqueName: \"kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp\") pod \"redhat-operators-62n88\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:44 crc kubenswrapper[5030]: I0121 00:55:44.675035 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:45 crc kubenswrapper[5030]: I0121 00:55:45.094166 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:45 crc kubenswrapper[5030]: W0121 00:55:45.097644 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b5ca99e_ccdf_449d_ba92_d7aa6e530bc8.slice/crio-dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537 WatchSource:0}: Error finding container dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537: Status 404 returned error can't find the container with id dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537 Jan 21 00:55:45 crc kubenswrapper[5030]: I0121 00:55:45.550475 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerID="660c7c47b4ce9198fa1e10731a96148a762dc67a08300a2f17f2309d9c28a481" exitCode=0 Jan 21 00:55:45 crc kubenswrapper[5030]: I0121 00:55:45.550526 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerDied","Data":"660c7c47b4ce9198fa1e10731a96148a762dc67a08300a2f17f2309d9c28a481"} Jan 21 00:55:45 crc kubenswrapper[5030]: I0121 00:55:45.550555 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerStarted","Data":"dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537"} Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.269677 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.270712 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.274676 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.274761 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5v59r" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.343840 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.390868 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jps\" (UniqueName: \"kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.390916 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.391084 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.492724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jps\" (UniqueName: \"kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.492792 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.492830 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.499266 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.505248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.512468 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jps\" (UniqueName: \"kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps\") pod \"infra-operator-controller-manager-7bbfcdfb-zm69z\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.559853 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerStarted","Data":"dea408f2fb16854903785168f08ae2e2a27098e7f45f6a16d2055c375b34d7ec"} Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.590296 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:46 crc kubenswrapper[5030]: I0121 00:55:46.982326 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 00:55:46 crc kubenswrapper[5030]: W0121 00:55:46.989937 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89886f93_b57e_42ec_8964_f3507f40f605.slice/crio-7c0c256307f00fccaeb81edfde8d0e1c01d2b95bbfbf692dadb47c5ca5fc1a89 WatchSource:0}: Error finding container 7c0c256307f00fccaeb81edfde8d0e1c01d2b95bbfbf692dadb47c5ca5fc1a89: Status 404 returned error can't find the container with id 7c0c256307f00fccaeb81edfde8d0e1c01d2b95bbfbf692dadb47c5ca5fc1a89 Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.568308 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" event={"ID":"89886f93-b57e-42ec-8964-f3507f40f605","Type":"ContainerStarted","Data":"485e66ca9210e81a0121ffff6f240bb16490308e7425939494ba1e56bc2af2f5"} Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.569003 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.569419 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" event={"ID":"89886f93-b57e-42ec-8964-f3507f40f605","Type":"ContainerStarted","Data":"7c0c256307f00fccaeb81edfde8d0e1c01d2b95bbfbf692dadb47c5ca5fc1a89"} Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.570379 5030 generic.go:334] "Generic (PLEG): container finished" podID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" 
containerID="dea408f2fb16854903785168f08ae2e2a27098e7f45f6a16d2055c375b34d7ec" exitCode=0 Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.570447 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerDied","Data":"dea408f2fb16854903785168f08ae2e2a27098e7f45f6a16d2055c375b34d7ec"} Jan 21 00:55:47 crc kubenswrapper[5030]: I0121 00:55:47.614927 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" podStartSLOduration=1.614904999 podStartE2EDuration="1.614904999s" podCreationTimestamp="2026-01-21 00:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:55:47.592044712 +0000 UTC m=+8419.912305000" watchObservedRunningTime="2026-01-21 00:55:47.614904999 +0000 UTC m=+8419.935165287" Jan 21 00:55:48 crc kubenswrapper[5030]: I0121 00:55:48.579862 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerStarted","Data":"f71182d02453732a55eeeb4e606df262415f5b5f75bba3cfa3a99aa861aff025"} Jan 21 00:55:48 crc kubenswrapper[5030]: I0121 00:55:48.601500 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-62n88" podStartSLOduration=2.162134199 podStartE2EDuration="4.601480938s" podCreationTimestamp="2026-01-21 00:55:44 +0000 UTC" firstStartedPulling="2026-01-21 00:55:45.55225633 +0000 UTC m=+8417.872516618" lastFinishedPulling="2026-01-21 00:55:47.991603079 +0000 UTC m=+8420.311863357" observedRunningTime="2026-01-21 00:55:48.596502149 +0000 UTC m=+8420.916762447" watchObservedRunningTime="2026-01-21 00:55:48.601480938 +0000 UTC m=+8420.921741236" Jan 21 00:55:50 crc kubenswrapper[5030]: E0121 00:55:50.290989 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache]" Jan 21 00:55:54 crc kubenswrapper[5030]: I0121 00:55:54.676198 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:54 crc kubenswrapper[5030]: I0121 00:55:54.676967 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:54 crc kubenswrapper[5030]: I0121 00:55:54.719727 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.130579 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.132253 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.140747 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.215004 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.215090 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5c27\" (UniqueName: \"kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.215139 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.316605 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.316730 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c27\" (UniqueName: \"kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.316772 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.317115 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.317166 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.334759 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f5c27\" (UniqueName: \"kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27\") pod \"certified-operators-5xkwl\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:55 crc kubenswrapper[5030]: I0121 00:55:55.451680 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:55.683033 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:55.734350 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:56.594889 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:56.644959 5030 generic.go:334] "Generic (PLEG): container finished" podID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerID="94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885" exitCode=0 Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:56.646483 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerDied","Data":"94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885"} Jan 21 00:55:56 crc kubenswrapper[5030]: I0121 00:55:56.646521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerStarted","Data":"b44f320be6a9e8e20bfc3a174ae34d1eb984d4615bfda4724d5dae9ac2c27157"} Jan 21 00:55:57 crc kubenswrapper[5030]: I0121 00:55:57.652319 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerStarted","Data":"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df"} Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.323990 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.324365 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-62n88" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="registry-server" containerID="cri-o://f71182d02453732a55eeeb4e606df262415f5b5f75bba3cfa3a99aa861aff025" gracePeriod=2 Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.660546 5030 generic.go:334] "Generic (PLEG): container finished" podID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerID="c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df" exitCode=0 Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.660596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerDied","Data":"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df"} Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.666437 5030 generic.go:334] "Generic (PLEG): container finished" 
podID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerID="f71182d02453732a55eeeb4e606df262415f5b5f75bba3cfa3a99aa861aff025" exitCode=0 Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.666479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerDied","Data":"f71182d02453732a55eeeb4e606df262415f5b5f75bba3cfa3a99aa861aff025"} Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.666507 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62n88" event={"ID":"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8","Type":"ContainerDied","Data":"dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537"} Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.666518 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd786d9ef87e3a721efb81c23111fb231e49e59101ed5428122171dd00acb537" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.698764 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.772734 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content\") pod \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.778789 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9hhp\" (UniqueName: \"kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp\") pod \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.778920 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities\") pod \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\" (UID: \"3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8\") " Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.779727 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities" (OuterVolumeSpecName: "utilities") pod "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" (UID: "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.785148 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp" (OuterVolumeSpecName: "kube-api-access-m9hhp") pod "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" (UID: "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8"). InnerVolumeSpecName "kube-api-access-m9hhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.881122 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.881221 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9hhp\" (UniqueName: \"kubernetes.io/projected/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-kube-api-access-m9hhp\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.894647 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" (UID: "3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:55:58 crc kubenswrapper[5030]: I0121 00:55:58.982858 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.676055 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerStarted","Data":"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90"} Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.676078 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62n88" Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.703787 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5xkwl" podStartSLOduration=2.213905767 podStartE2EDuration="4.703768266s" podCreationTimestamp="2026-01-21 00:55:55 +0000 UTC" firstStartedPulling="2026-01-21 00:55:56.650233425 +0000 UTC m=+8428.970493723" lastFinishedPulling="2026-01-21 00:55:59.140095934 +0000 UTC m=+8431.460356222" observedRunningTime="2026-01-21 00:55:59.698841349 +0000 UTC m=+8432.019101637" watchObservedRunningTime="2026-01-21 00:55:59.703768266 +0000 UTC m=+8432.024028554" Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.714839 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.720305 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-62n88"] Jan 21 00:55:59 crc kubenswrapper[5030]: I0121 00:55:59.969840 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" path="/var/lib/kubelet/pods/3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8/volumes" Jan 21 00:56:00 crc kubenswrapper[5030]: E0121 00:56:00.429345 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache]" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.452221 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.452547 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.523278 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.759743 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.931726 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 00:56:05 crc kubenswrapper[5030]: E0121 00:56:05.932125 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="extract-content" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.932146 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="extract-content" Jan 21 00:56:05 crc kubenswrapper[5030]: E0121 00:56:05.932173 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="registry-server" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.932182 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="registry-server" Jan 21 00:56:05 crc kubenswrapper[5030]: E0121 00:56:05.932197 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="extract-utilities" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.932205 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="extract-utilities" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.932352 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5ca99e-ccdf-449d-ba92-d7aa6e530bc8" containerName="registry-server" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.933106 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.936221 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-lr85l" Jan 21 00:56:05 crc kubenswrapper[5030]: I0121 00:56:05.939440 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.095265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8m66\" (UniqueName: \"kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66\") pod \"rabbitmq-cluster-operator-index-bh77x\" (UID: \"a13b4821-61bf-4351-aa07-6fde3696ada3\") " pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.196840 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8m66\" (UniqueName: \"kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66\") pod \"rabbitmq-cluster-operator-index-bh77x\" (UID: \"a13b4821-61bf-4351-aa07-6fde3696ada3\") " pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.217248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8m66\" (UniqueName: \"kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66\") pod \"rabbitmq-cluster-operator-index-bh77x\" (UID: \"a13b4821-61bf-4351-aa07-6fde3696ada3\") " pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.256594 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.792371 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 00:56:06 crc kubenswrapper[5030]: I0121 00:56:06.926197 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.406406 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.407753 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.409483 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"memcached-memcached-dockercfg-m6stq" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.409905 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"memcached-config-data" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.421904 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.516519 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8lz\" (UniqueName: \"kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.516589 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.516661 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.618779 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.620070 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.620195 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8lz\" (UniqueName: \"kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.620318 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.621089 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 
00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.643973 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8lz\" (UniqueName: \"kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz\") pod \"memcached-0\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.723544 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.735573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" event={"ID":"a13b4821-61bf-4351-aa07-6fde3696ada3","Type":"ContainerStarted","Data":"daf6eb95d51e8c344f1ebadc8767f20b18ebc5c2c75e7ce0d366777b47a63e80"} Jan 21 00:56:07 crc kubenswrapper[5030]: I0121 00:56:07.735767 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5xkwl" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="registry-server" containerID="cri-o://3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90" gracePeriod=2 Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.153354 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.199300 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 00:56:08 crc kubenswrapper[5030]: W0121 00:56:08.202824 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c9e208_00ec_4363_b4e8_91e11eb22555.slice/crio-a2b742ab0b4245ecf00b1c3eee204061c02aadd7b26e9fd6e57aa785839d4416 WatchSource:0}: Error finding container a2b742ab0b4245ecf00b1c3eee204061c02aadd7b26e9fd6e57aa785839d4416: Status 404 returned error can't find the container with id a2b742ab0b4245ecf00b1c3eee204061c02aadd7b26e9fd6e57aa785839d4416 Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.227750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities\") pod \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.228028 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content\") pod \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.228193 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5c27\" (UniqueName: \"kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27\") pod \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\" (UID: \"c43b00a2-5ec0-48e4-b3b3-5dec6753c545\") " Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.228714 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities" (OuterVolumeSpecName: "utilities") pod "c43b00a2-5ec0-48e4-b3b3-5dec6753c545" (UID: "c43b00a2-5ec0-48e4-b3b3-5dec6753c545"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.237763 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27" (OuterVolumeSpecName: "kube-api-access-f5c27") pod "c43b00a2-5ec0-48e4-b3b3-5dec6753c545" (UID: "c43b00a2-5ec0-48e4-b3b3-5dec6753c545"). InnerVolumeSpecName "kube-api-access-f5c27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.284421 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c43b00a2-5ec0-48e4-b3b3-5dec6753c545" (UID: "c43b00a2-5ec0-48e4-b3b3-5dec6753c545"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.329881 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5c27\" (UniqueName: \"kubernetes.io/projected/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-kube-api-access-f5c27\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.329911 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.329923 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c43b00a2-5ec0-48e4-b3b3-5dec6753c545-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.742800 5030 generic.go:334] "Generic (PLEG): container finished" podID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerID="3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90" exitCode=0 Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.742851 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerDied","Data":"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90"} Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.742903 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xkwl" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.742942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkwl" event={"ID":"c43b00a2-5ec0-48e4-b3b3-5dec6753c545","Type":"ContainerDied","Data":"b44f320be6a9e8e20bfc3a174ae34d1eb984d4615bfda4724d5dae9ac2c27157"} Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.742970 5030 scope.go:117] "RemoveContainer" containerID="3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.745135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" event={"ID":"a13b4821-61bf-4351-aa07-6fde3696ada3","Type":"ContainerStarted","Data":"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39"} Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.749356 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"e7c9e208-00ec-4363-b4e8-91e11eb22555","Type":"ContainerStarted","Data":"7b55930a5340962e8be1a11e7fbc1beecd596da8a8badc35c7d5f04aefcf0ae2"} Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.749404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"e7c9e208-00ec-4363-b4e8-91e11eb22555","Type":"ContainerStarted","Data":"a2b742ab0b4245ecf00b1c3eee204061c02aadd7b26e9fd6e57aa785839d4416"} Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.749855 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.764145 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" podStartSLOduration=2.852180828 podStartE2EDuration="3.764131652s" podCreationTimestamp="2026-01-21 00:56:05 +0000 UTC" firstStartedPulling="2026-01-21 00:56:06.808397231 +0000 UTC m=+8439.128657519" lastFinishedPulling="2026-01-21 00:56:07.720348055 +0000 UTC m=+8440.040608343" observedRunningTime="2026-01-21 00:56:08.760279161 +0000 UTC m=+8441.080539449" watchObservedRunningTime="2026-01-21 00:56:08.764131652 +0000 UTC m=+8441.084391940" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.769651 5030 scope.go:117] "RemoveContainer" containerID="c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.791756 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/memcached-0" podStartSLOduration=1.791697062 podStartE2EDuration="1.791697062s" podCreationTimestamp="2026-01-21 00:56:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:56:08.779013798 +0000 UTC m=+8441.099274086" watchObservedRunningTime="2026-01-21 00:56:08.791697062 +0000 UTC m=+8441.111957360" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.806859 5030 scope.go:117] "RemoveContainer" containerID="94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.809773 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.815304 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-5xkwl"] Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.825146 5030 scope.go:117] "RemoveContainer" containerID="3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90" Jan 21 00:56:08 crc kubenswrapper[5030]: E0121 00:56:08.825596 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90\": container with ID starting with 3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90 not found: ID does not exist" containerID="3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.825654 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90"} err="failed to get container status \"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90\": rpc error: code = NotFound desc = could not find container \"3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90\": container with ID starting with 3937a3efd311084e6e9d818f2996090715e99c8e222a9d9fce33023c4bda0e90 not found: ID does not exist" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.825683 5030 scope.go:117] "RemoveContainer" containerID="c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df" Jan 21 00:56:08 crc kubenswrapper[5030]: E0121 00:56:08.825952 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df\": container with ID starting with c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df not found: ID does not exist" containerID="c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.825988 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df"} err="failed to get container status \"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df\": rpc error: code = NotFound desc = could not find container \"c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df\": container with ID starting with c6b287ad88151c0b7c02b995f7f6d43fd3255586a5b0de7517a27b886f8e26df not found: ID does not exist" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.826008 5030 scope.go:117] "RemoveContainer" containerID="94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885" Jan 21 00:56:08 crc kubenswrapper[5030]: E0121 00:56:08.826291 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885\": container with ID starting with 94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885 not found: ID does not exist" containerID="94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885" Jan 21 00:56:08 crc kubenswrapper[5030]: I0121 00:56:08.826311 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885"} err="failed to get container status \"94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885\": rpc error: code = NotFound desc = could 
not find container \"94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885\": container with ID starting with 94306a98c4eb1c16ae7c72da18769014438f47e63d3a9e468606095a99194885 not found: ID does not exist" Jan 21 00:56:09 crc kubenswrapper[5030]: I0121 00:56:09.970675 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" path="/var/lib/kubelet/pods/c43b00a2-5ec0-48e4-b3b3-5dec6753c545/volumes" Jan 21 00:56:10 crc kubenswrapper[5030]: E0121 00:56:10.610614 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache]" Jan 21 00:56:16 crc kubenswrapper[5030]: I0121 00:56:16.256997 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:16 crc kubenswrapper[5030]: I0121 00:56:16.257895 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:16 crc kubenswrapper[5030]: I0121 00:56:16.296561 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:16 crc kubenswrapper[5030]: I0121 00:56:16.840639 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.725016 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/memcached-0" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.978357 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz"] Jan 21 00:56:17 crc kubenswrapper[5030]: E0121 00:56:17.978697 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="registry-server" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.978717 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="registry-server" Jan 21 00:56:17 crc kubenswrapper[5030]: E0121 00:56:17.978753 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="extract-content" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.978762 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="extract-content" Jan 21 00:56:17 crc kubenswrapper[5030]: E0121 00:56:17.978776 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="extract-utilities" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.978784 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="extract-utilities" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.978929 5030 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c43b00a2-5ec0-48e4-b3b3-5dec6753c545" containerName="registry-server" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.980124 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.982574 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:56:17 crc kubenswrapper[5030]: I0121 00:56:17.988442 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz"] Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.076867 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.077145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg6jm\" (UniqueName: \"kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.077233 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.179469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg6jm\" (UniqueName: \"kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.179550 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.179714 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc 
kubenswrapper[5030]: I0121 00:56:18.180139 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.180261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.202124 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg6jm\" (UniqueName: \"kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.302328 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:18 crc kubenswrapper[5030]: I0121 00:56:18.893907 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz"] Jan 21 00:56:19 crc kubenswrapper[5030]: I0121 00:56:19.833745 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerID="7da48c6e0fcad7380f1a23fde74dd3de0c57ce507f99042be00cec0105e5cf7d" exitCode=0 Jan 21 00:56:19 crc kubenswrapper[5030]: I0121 00:56:19.834011 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" event={"ID":"dd7fb511-f3ec-40b9-89ec-4ab91ff34662","Type":"ContainerDied","Data":"7da48c6e0fcad7380f1a23fde74dd3de0c57ce507f99042be00cec0105e5cf7d"} Jan 21 00:56:19 crc kubenswrapper[5030]: I0121 00:56:19.834036 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" event={"ID":"dd7fb511-f3ec-40b9-89ec-4ab91ff34662","Type":"ContainerStarted","Data":"d06337203c2f9b90c36759c0a8243db7d599688973a403d2ec68e73da0570545"} Jan 21 00:56:19 crc kubenswrapper[5030]: I0121 00:56:19.835883 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 00:56:20 crc kubenswrapper[5030]: E0121 00:56:20.780866 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd7fb511_f3ec_40b9_89ec_4ab91ff34662.slice/crio-conmon-aeafb711c77295fbcc62fa5f2c4994e91bd200a7ef026936ff6048dc51504a20.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e9a747_3b2d_450e_a1a3_aa49bce30b7c.slice/crio-1172bbe521da0e1e1724ff07cde1ba7087398ca8978ed9593bffa760953a3341\": RecentStats: unable to find data in memory cache]" Jan 21 00:56:20 crc kubenswrapper[5030]: I0121 00:56:20.841650 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerID="aeafb711c77295fbcc62fa5f2c4994e91bd200a7ef026936ff6048dc51504a20" exitCode=0 Jan 21 00:56:20 crc kubenswrapper[5030]: I0121 00:56:20.841710 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" event={"ID":"dd7fb511-f3ec-40b9-89ec-4ab91ff34662","Type":"ContainerDied","Data":"aeafb711c77295fbcc62fa5f2c4994e91bd200a7ef026936ff6048dc51504a20"} Jan 21 00:56:21 crc kubenswrapper[5030]: I0121 00:56:21.849765 5030 generic.go:334] "Generic (PLEG): container finished" podID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerID="0c767475252e8cf0c92481394d72ba89f291aa44ec2e3a9db234c14cf5975616" exitCode=0 Jan 21 00:56:21 crc kubenswrapper[5030]: I0121 00:56:21.849809 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" event={"ID":"dd7fb511-f3ec-40b9-89ec-4ab91ff34662","Type":"ContainerDied","Data":"0c767475252e8cf0c92481394d72ba89f291aa44ec2e3a9db234c14cf5975616"} Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.153396 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.253418 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle\") pod \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.253525 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util\") pod \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.253597 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg6jm\" (UniqueName: \"kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm\") pod \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\" (UID: \"dd7fb511-f3ec-40b9-89ec-4ab91ff34662\") " Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.254394 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle" (OuterVolumeSpecName: "bundle") pod "dd7fb511-f3ec-40b9-89ec-4ab91ff34662" (UID: "dd7fb511-f3ec-40b9-89ec-4ab91ff34662"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.260612 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm" (OuterVolumeSpecName: "kube-api-access-jg6jm") pod "dd7fb511-f3ec-40b9-89ec-4ab91ff34662" (UID: "dd7fb511-f3ec-40b9-89ec-4ab91ff34662"). 
InnerVolumeSpecName "kube-api-access-jg6jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.270408 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util" (OuterVolumeSpecName: "util") pod "dd7fb511-f3ec-40b9-89ec-4ab91ff34662" (UID: "dd7fb511-f3ec-40b9-89ec-4ab91ff34662"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.355754 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.355817 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg6jm\" (UniqueName: \"kubernetes.io/projected/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-kube-api-access-jg6jm\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.355845 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd7fb511-f3ec-40b9-89ec-4ab91ff34662-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.876479 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" event={"ID":"dd7fb511-f3ec-40b9-89ec-4ab91ff34662","Type":"ContainerDied","Data":"d06337203c2f9b90c36759c0a8243db7d599688973a403d2ec68e73da0570545"} Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.876519 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d06337203c2f9b90c36759c0a8243db7d599688973a403d2ec68e73da0570545" Jan 21 00:56:23 crc kubenswrapper[5030]: I0121 00:56:23.876542 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.698949 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 00:56:34 crc kubenswrapper[5030]: E0121 00:56:34.699831 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="extract" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.699845 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="extract" Jan 21 00:56:34 crc kubenswrapper[5030]: E0121 00:56:34.699877 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="pull" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.699885 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="pull" Jan 21 00:56:34 crc kubenswrapper[5030]: E0121 00:56:34.699918 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="util" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.699928 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="util" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.700078 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" containerName="extract" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.700634 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.708073 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 00:56:34 crc kubenswrapper[5030]: W0121 00:56:34.708397 5030 reflector.go:561] object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-k875x": failed to list *v1.Secret: secrets "rabbitmq-cluster-operator-dockercfg-k875x" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Jan 21 00:56:34 crc kubenswrapper[5030]: E0121 00:56:34.708438 5030 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"rabbitmq-cluster-operator-dockercfg-k875x\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"rabbitmq-cluster-operator-dockercfg-k875x\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.733709 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j47s\" (UniqueName: \"kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s\") pod \"rabbitmq-cluster-operator-779fc9694b-ndhjf\" (UID: \"e8269d80-4238-4c5a-a441-9997d386bc34\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.835726 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8j47s\" (UniqueName: \"kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s\") pod \"rabbitmq-cluster-operator-779fc9694b-ndhjf\" (UID: \"e8269d80-4238-4c5a-a441-9997d386bc34\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 00:56:34 crc kubenswrapper[5030]: I0121 00:56:34.857587 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j47s\" (UniqueName: \"kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s\") pod \"rabbitmq-cluster-operator-779fc9694b-ndhjf\" (UID: \"e8269d80-4238-4c5a-a441-9997d386bc34\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 00:56:35 crc kubenswrapper[5030]: I0121 00:56:35.976277 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-k875x" Jan 21 00:56:35 crc kubenswrapper[5030]: I0121 00:56:35.978128 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 00:56:36 crc kubenswrapper[5030]: I0121 00:56:36.448843 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 00:56:36 crc kubenswrapper[5030]: I0121 00:56:36.973268 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" event={"ID":"e8269d80-4238-4c5a-a441-9997d386bc34","Type":"ContainerStarted","Data":"4a7a839bb5d430d5fab0f5f43471e084c2117be25a88edb928f43aba64877b11"} Jan 21 00:56:36 crc kubenswrapper[5030]: I0121 00:56:36.973573 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" event={"ID":"e8269d80-4238-4c5a-a441-9997d386bc34","Type":"ContainerStarted","Data":"927fa3fdc0602626f57ab05cdd1108aa5bb3000de0d835b76bf0ccd0150bd08c"} Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.157783 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.158056 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.736692 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" podStartSLOduration=6.736669817 podStartE2EDuration="6.736669817s" podCreationTimestamp="2026-01-21 00:56:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:56:36.989894094 +0000 UTC m=+8469.310154462" watchObservedRunningTime="2026-01-21 00:56:40.736669817 +0000 UTC m=+8473.056930115" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.741373 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 00:56:40 crc 
kubenswrapper[5030]: I0121 00:56:40.743230 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.748475 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-c2s2h" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.752397 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.818677 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hv6j\" (UniqueName: \"kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j\") pod \"keystone-operator-index-7ptpw\" (UID: \"1228307e-d29d-44a4-b399-7be045068d83\") " pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.919860 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hv6j\" (UniqueName: \"kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j\") pod \"keystone-operator-index-7ptpw\" (UID: \"1228307e-d29d-44a4-b399-7be045068d83\") " pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:40 crc kubenswrapper[5030]: I0121 00:56:40.939505 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hv6j\" (UniqueName: \"kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j\") pod \"keystone-operator-index-7ptpw\" (UID: \"1228307e-d29d-44a4-b399-7be045068d83\") " pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:41 crc kubenswrapper[5030]: I0121 00:56:41.109117 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:41 crc kubenswrapper[5030]: I0121 00:56:41.532865 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 00:56:41 crc kubenswrapper[5030]: W0121 00:56:41.538750 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1228307e_d29d_44a4_b399_7be045068d83.slice/crio-ef019f23e851ad4ad5e94b9310ecb03ce6f40666441376b15f3e9391cf9bed77 WatchSource:0}: Error finding container ef019f23e851ad4ad5e94b9310ecb03ce6f40666441376b15f3e9391cf9bed77: Status 404 returned error can't find the container with id ef019f23e851ad4ad5e94b9310ecb03ce6f40666441376b15f3e9391cf9bed77 Jan 21 00:56:42 crc kubenswrapper[5030]: I0121 00:56:42.015218 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-7ptpw" event={"ID":"1228307e-d29d-44a4-b399-7be045068d83","Type":"ContainerStarted","Data":"ef019f23e851ad4ad5e94b9310ecb03ce6f40666441376b15f3e9391cf9bed77"} Jan 21 00:56:43 crc kubenswrapper[5030]: I0121 00:56:43.023331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-7ptpw" event={"ID":"1228307e-d29d-44a4-b399-7be045068d83","Type":"ContainerStarted","Data":"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec"} Jan 21 00:56:43 crc kubenswrapper[5030]: I0121 00:56:43.038280 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-7ptpw" podStartSLOduration=2.477873257 podStartE2EDuration="3.038259731s" podCreationTimestamp="2026-01-21 00:56:40 +0000 UTC" firstStartedPulling="2026-01-21 00:56:41.542216586 +0000 UTC m=+8473.862476874" lastFinishedPulling="2026-01-21 00:56:42.10260306 +0000 UTC m=+8474.422863348" observedRunningTime="2026-01-21 00:56:43.034887801 +0000 UTC m=+8475.355148089" watchObservedRunningTime="2026-01-21 00:56:43.038259731 +0000 UTC m=+8475.358520019" Jan 21 00:56:51 crc kubenswrapper[5030]: I0121 00:56:51.110265 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:51 crc kubenswrapper[5030]: I0121 00:56:51.110832 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:51 crc kubenswrapper[5030]: I0121 00:56:51.155637 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:52 crc kubenswrapper[5030]: I0121 00:56:52.121188 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 00:56:53 crc kubenswrapper[5030]: I0121 00:56:53.973082 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c"] Jan 21 00:56:53 crc kubenswrapper[5030]: I0121 00:56:53.974948 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:53 crc kubenswrapper[5030]: I0121 00:56:53.980776 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:56:53 crc kubenswrapper[5030]: I0121 00:56:53.985129 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c"] Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.019719 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.020070 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtdd\" (UniqueName: \"kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.020147 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.122050 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtdd\" (UniqueName: \"kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.122418 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.122485 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.123161 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.123551 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.141856 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtdd\" (UniqueName: \"kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd\") pod \"34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.298946 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.463380 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.465101 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.470119 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-plugins-conf" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.470268 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-default-user" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.471072 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"rabbitmq-server-conf" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.471299 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.472369 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"rabbitmq-server-dockercfg-fgg8f" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.476250 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.528815 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.528876 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") pod 
\"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529016 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5zb\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529127 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529192 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529239 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.529275 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630654 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630723 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630754 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630776 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5zb\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630826 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630855 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.630875 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.631543 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.632196 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.631510 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.633645 5030 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.633673 5030 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b0c739217b4750a3439521d170f53ec53236aa8dbfd1cabd567d478d29cfdc92/globalmount\"" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.636028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.636314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.644182 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.653753 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5zb\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.674159 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") pod \"rabbitmq-server-0\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.726061 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c"] Jan 21 00:56:54 crc kubenswrapper[5030]: I0121 00:56:54.797554 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:56:55 crc kubenswrapper[5030]: I0121 00:56:55.110316 5030 generic.go:334] "Generic (PLEG): container finished" podID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerID="aa186b9697e4ba4350fccfd2c0abef0a265a570f0afd7b9123bb25bd3f684018" exitCode=0 Jan 21 00:56:55 crc kubenswrapper[5030]: I0121 00:56:55.110361 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" event={"ID":"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c","Type":"ContainerDied","Data":"aa186b9697e4ba4350fccfd2c0abef0a265a570f0afd7b9123bb25bd3f684018"} Jan 21 00:56:55 crc kubenswrapper[5030]: I0121 00:56:55.110647 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" event={"ID":"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c","Type":"ContainerStarted","Data":"bd61b6add5874839782e2d22db06ac5240cb8bd62f85af2ce7b04abd169a1e58"} Jan 21 00:56:55 crc kubenswrapper[5030]: I0121 00:56:55.240520 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 00:56:55 crc kubenswrapper[5030]: W0121 00:56:55.244224 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a69a2c7_1b3d_40ed_b50c_0bf65178e971.slice/crio-09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5 WatchSource:0}: Error finding container 09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5: Status 404 returned error can't find the container with id 09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5 Jan 21 00:56:56 crc kubenswrapper[5030]: I0121 00:56:56.120124 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerStarted","Data":"09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5"} Jan 21 00:56:56 crc kubenswrapper[5030]: I0121 00:56:56.122577 5030 generic.go:334] "Generic (PLEG): container finished" podID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerID="107e7033c34de7c0c125c4880b396725e6e9d48af2eeac73d97f60ac327462ff" exitCode=0 Jan 21 00:56:56 crc kubenswrapper[5030]: I0121 00:56:56.122615 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" event={"ID":"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c","Type":"ContainerDied","Data":"107e7033c34de7c0c125c4880b396725e6e9d48af2eeac73d97f60ac327462ff"} Jan 21 00:56:57 crc kubenswrapper[5030]: I0121 00:56:57.130805 5030 generic.go:334] "Generic (PLEG): container finished" podID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerID="ab3ff7039505912d60b0fec22b60d658bb779bc1c989b82524a148b365c90e77" exitCode=0 Jan 21 00:56:57 crc kubenswrapper[5030]: I0121 00:56:57.131871 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" event={"ID":"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c","Type":"ContainerDied","Data":"ab3ff7039505912d60b0fec22b60d658bb779bc1c989b82524a148b365c90e77"} Jan 21 00:56:57 crc kubenswrapper[5030]: I0121 00:56:57.133355 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" 
event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerStarted","Data":"6dc1aabd2b51d0e4d03fd0db98c955ebcfba0f98870576cc430d3c12bd87f714"} Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.496295 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.599033 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle\") pod \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.599229 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util\") pod \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.599289 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdtdd\" (UniqueName: \"kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd\") pod \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\" (UID: \"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c\") " Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.599940 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle" (OuterVolumeSpecName: "bundle") pod "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" (UID: "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.605355 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd" (OuterVolumeSpecName: "kube-api-access-pdtdd") pod "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" (UID: "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c"). InnerVolumeSpecName "kube-api-access-pdtdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.613081 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util" (OuterVolumeSpecName: "util") pod "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" (UID: "f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.701345 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.701388 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdtdd\" (UniqueName: \"kubernetes.io/projected/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-kube-api-access-pdtdd\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:58 crc kubenswrapper[5030]: I0121 00:56:58.701405 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:56:59 crc kubenswrapper[5030]: I0121 00:56:59.153272 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" event={"ID":"f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c","Type":"ContainerDied","Data":"bd61b6add5874839782e2d22db06ac5240cb8bd62f85af2ce7b04abd169a1e58"} Jan 21 00:56:59 crc kubenswrapper[5030]: I0121 00:56:59.153590 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd61b6add5874839782e2d22db06ac5240cb8bd62f85af2ce7b04abd169a1e58" Jan 21 00:56:59 crc kubenswrapper[5030]: I0121 00:56:59.153439 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.637978 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 00:57:06 crc kubenswrapper[5030]: E0121 00:57:06.639097 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="util" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.639113 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="util" Jan 21 00:57:06 crc kubenswrapper[5030]: E0121 00:57:06.639126 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="extract" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.639134 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="extract" Jan 21 00:57:06 crc kubenswrapper[5030]: E0121 00:57:06.639143 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="pull" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.639150 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="pull" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.639300 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" containerName="extract" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.639926 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.642498 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-z45dw" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.650595 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.662559 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.743022 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.743287 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6krs\" (UniqueName: \"kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.743393 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.845663 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.845794 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6krs\" (UniqueName: \"kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.845821 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.851665 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.851728 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.865915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6krs\" (UniqueName: \"kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs\") pod \"keystone-operator-controller-manager-d8c4cd77f-hlgjv\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:06 crc kubenswrapper[5030]: I0121 00:57:06.960052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:07 crc kubenswrapper[5030]: I0121 00:57:07.408997 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 00:57:08 crc kubenswrapper[5030]: I0121 00:57:08.213219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" event={"ID":"b2756282-c0f3-41b0-bdce-ab04afad52f8","Type":"ContainerStarted","Data":"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6"} Jan 21 00:57:08 crc kubenswrapper[5030]: I0121 00:57:08.213498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" event={"ID":"b2756282-c0f3-41b0-bdce-ab04afad52f8","Type":"ContainerStarted","Data":"0f76c9d42ef35e963c76406e1bba68eecb5a755efbef406295ac5e94e232d5e3"} Jan 21 00:57:08 crc kubenswrapper[5030]: I0121 00:57:08.213716 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:08 crc kubenswrapper[5030]: I0121 00:57:08.248051 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" podStartSLOduration=2.248030557 podStartE2EDuration="2.248030557s" podCreationTimestamp="2026-01-21 00:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:57:08.243295363 +0000 UTC m=+8500.563555651" watchObservedRunningTime="2026-01-21 00:57:08.248030557 +0000 UTC m=+8500.568290845" Jan 21 00:57:10 crc kubenswrapper[5030]: I0121 00:57:10.156939 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:57:10 
crc kubenswrapper[5030]: I0121 00:57:10.157311 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:57:16 crc kubenswrapper[5030]: I0121 00:57:16.967503 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 00:57:23 crc kubenswrapper[5030]: I0121 00:57:23.939137 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 00:57:23 crc kubenswrapper[5030]: I0121 00:57:23.940525 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:23 crc kubenswrapper[5030]: I0121 00:57:23.943178 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-mk65v" Jan 21 00:57:23 crc kubenswrapper[5030]: I0121 00:57:23.977247 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 00:57:24 crc kubenswrapper[5030]: I0121 00:57:24.008983 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqgc\" (UniqueName: \"kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc\") pod \"horizon-operator-index-zd2q6\" (UID: \"72237d07-9875-484f-9e9b-f282a65d0136\") " pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:24 crc kubenswrapper[5030]: I0121 00:57:24.110739 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqgc\" (UniqueName: \"kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc\") pod \"horizon-operator-index-zd2q6\" (UID: \"72237d07-9875-484f-9e9b-f282a65d0136\") " pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:24 crc kubenswrapper[5030]: I0121 00:57:24.135541 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqgc\" (UniqueName: \"kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc\") pod \"horizon-operator-index-zd2q6\" (UID: \"72237d07-9875-484f-9e9b-f282a65d0136\") " pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:24 crc kubenswrapper[5030]: I0121 00:57:24.265682 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:24 crc kubenswrapper[5030]: I0121 00:57:24.709839 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.355723 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-zd2q6" event={"ID":"72237d07-9875-484f-9e9b-f282a65d0136","Type":"ContainerStarted","Data":"47d1138d4ce9f0d0703e0715386b0a5bdefe1f828e862450777ebfa8b333a74e"} Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.758687 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-create-ltfzw"] Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.761320 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.766173 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7"] Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.768221 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.772969 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-db-secret" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.785877 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-ltfzw"] Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.800102 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7"] Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.843066 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.843153 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwsr\" (UniqueName: \"kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.843189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.843268 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsqt\" (UniqueName: \"kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.945370 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.945469 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwsr\" (UniqueName: \"kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 
00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.945512 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.946054 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.946576 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.946963 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsqt\" (UniqueName: \"kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.967052 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsqt\" (UniqueName: \"kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt\") pod \"keystone-d3cf-account-create-update-h76k7\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:25 crc kubenswrapper[5030]: I0121 00:57:25.966953 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwsr\" (UniqueName: \"kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr\") pod \"keystone-db-create-ltfzw\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.093194 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.101861 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.371872 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-zd2q6" event={"ID":"72237d07-9875-484f-9e9b-f282a65d0136","Type":"ContainerStarted","Data":"cfd45482f337d7f5159f114a3ff93b7177c4ab855c5d661cfc8a1ef57f24d710"} Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.565383 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-zd2q6" podStartSLOduration=2.152019123 podStartE2EDuration="3.565359309s" podCreationTimestamp="2026-01-21 00:57:23 +0000 UTC" firstStartedPulling="2026-01-21 00:57:24.719574128 +0000 UTC m=+8517.039834416" lastFinishedPulling="2026-01-21 00:57:26.132914314 +0000 UTC m=+8518.453174602" observedRunningTime="2026-01-21 00:57:26.389391929 +0000 UTC m=+8518.709652227" watchObservedRunningTime="2026-01-21 00:57:26.565359309 +0000 UTC m=+8518.885619607" Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.568771 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-ltfzw"] Jan 21 00:57:26 crc kubenswrapper[5030]: W0121 00:57:26.571484 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30675ef2_c4d2_4fd1_8f10_ceef070130ef.slice/crio-4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a WatchSource:0}: Error finding container 4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a: Status 404 returned error can't find the container with id 4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a Jan 21 00:57:26 crc kubenswrapper[5030]: I0121 00:57:26.636704 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7"] Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.380277 5030 generic.go:334] "Generic (PLEG): container finished" podID="fc41df9c-0cdf-454e-a20a-d287a65bf2c0" containerID="f1f5446ba6bb631af20b295d86ddd9a37fbb2ac53843d294cebc0cedb52823b2" exitCode=0 Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.380339 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" event={"ID":"fc41df9c-0cdf-454e-a20a-d287a65bf2c0","Type":"ContainerDied","Data":"f1f5446ba6bb631af20b295d86ddd9a37fbb2ac53843d294cebc0cedb52823b2"} Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.380673 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" event={"ID":"fc41df9c-0cdf-454e-a20a-d287a65bf2c0","Type":"ContainerStarted","Data":"33f3afb6cdda68e779d4870db0b9ab0ee72eff83f37a0078b57703e45a51ad99"} Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.381929 5030 generic.go:334] "Generic (PLEG): container finished" podID="30675ef2-c4d2-4fd1-8f10-ceef070130ef" containerID="2bd42947a7b68c2dfc89ee2480c9171b269eb691899ede07105a29d74d0df7a5" exitCode=0 Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.382298 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" event={"ID":"30675ef2-c4d2-4fd1-8f10-ceef070130ef","Type":"ContainerDied","Data":"2bd42947a7b68c2dfc89ee2480c9171b269eb691899ede07105a29d74d0df7a5"} Jan 21 00:57:27 crc kubenswrapper[5030]: I0121 00:57:27.382329 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" event={"ID":"30675ef2-c4d2-4fd1-8f10-ceef070130ef","Type":"ContainerStarted","Data":"4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a"} Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.767853 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.773295 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.891863 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwsqt\" (UniqueName: \"kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt\") pod \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.891950 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts\") pod \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\" (UID: \"fc41df9c-0cdf-454e-a20a-d287a65bf2c0\") " Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.892073 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwsr\" (UniqueName: \"kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr\") pod \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.892132 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts\") pod \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\" (UID: \"30675ef2-c4d2-4fd1-8f10-ceef070130ef\") " Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.893094 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc41df9c-0cdf-454e-a20a-d287a65bf2c0" (UID: "fc41df9c-0cdf-454e-a20a-d287a65bf2c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.893479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30675ef2-c4d2-4fd1-8f10-ceef070130ef" (UID: "30675ef2-c4d2-4fd1-8f10-ceef070130ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.897573 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt" (OuterVolumeSpecName: "kube-api-access-lwsqt") pod "fc41df9c-0cdf-454e-a20a-d287a65bf2c0" (UID: "fc41df9c-0cdf-454e-a20a-d287a65bf2c0"). InnerVolumeSpecName "kube-api-access-lwsqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.898653 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr" (OuterVolumeSpecName: "kube-api-access-6qwsr") pod "30675ef2-c4d2-4fd1-8f10-ceef070130ef" (UID: "30675ef2-c4d2-4fd1-8f10-ceef070130ef"). InnerVolumeSpecName "kube-api-access-6qwsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.994161 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qwsr\" (UniqueName: \"kubernetes.io/projected/30675ef2-c4d2-4fd1-8f10-ceef070130ef-kube-api-access-6qwsr\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.994201 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30675ef2-c4d2-4fd1-8f10-ceef070130ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.994213 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwsqt\" (UniqueName: \"kubernetes.io/projected/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-kube-api-access-lwsqt\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:28 crc kubenswrapper[5030]: I0121 00:57:28.994225 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc41df9c-0cdf-454e-a20a-d287a65bf2c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.436436 5030 generic.go:334] "Generic (PLEG): container finished" podID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerID="6dc1aabd2b51d0e4d03fd0db98c955ebcfba0f98870576cc430d3c12bd87f714" exitCode=0 Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.436522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerDied","Data":"6dc1aabd2b51d0e4d03fd0db98c955ebcfba0f98870576cc430d3c12bd87f714"} Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.441429 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" event={"ID":"30675ef2-c4d2-4fd1-8f10-ceef070130ef","Type":"ContainerDied","Data":"4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a"} Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.441517 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee2f521d812dc3d3729940879aa36921509f2bbd773181a6ae0e587dbdaad4a" Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.441648 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-create-ltfzw" Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.446119 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" event={"ID":"fc41df9c-0cdf-454e-a20a-d287a65bf2c0","Type":"ContainerDied","Data":"33f3afb6cdda68e779d4870db0b9ab0ee72eff83f37a0078b57703e45a51ad99"} Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.446158 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f3afb6cdda68e779d4870db0b9ab0ee72eff83f37a0078b57703e45a51ad99" Jan 21 00:57:29 crc kubenswrapper[5030]: I0121 00:57:29.446338 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7" Jan 21 00:57:30 crc kubenswrapper[5030]: I0121 00:57:30.453421 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerStarted","Data":"9cadf200e8321abc58854abfd8226569f7dd3a279740b95b9d6a4487243f4eec"} Jan 21 00:57:30 crc kubenswrapper[5030]: I0121 00:57:30.454735 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:57:30 crc kubenswrapper[5030]: I0121 00:57:30.474362 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.474345373 podStartE2EDuration="37.474345373s" podCreationTimestamp="2026-01-21 00:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:57:30.469867016 +0000 UTC m=+8522.790127314" watchObservedRunningTime="2026-01-21 00:57:30.474345373 +0000 UTC m=+8522.794605651" Jan 21 00:57:34 crc kubenswrapper[5030]: I0121 00:57:34.266149 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:34 crc kubenswrapper[5030]: I0121 00:57:34.267384 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:34 crc kubenswrapper[5030]: I0121 00:57:34.379794 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:34 crc kubenswrapper[5030]: I0121 00:57:34.528948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.570075 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f"] Jan 21 00:57:38 crc kubenswrapper[5030]: E0121 00:57:38.570632 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30675ef2-c4d2-4fd1-8f10-ceef070130ef" containerName="mariadb-database-create" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.570647 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="30675ef2-c4d2-4fd1-8f10-ceef070130ef" containerName="mariadb-database-create" Jan 21 00:57:38 crc kubenswrapper[5030]: E0121 00:57:38.570667 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc41df9c-0cdf-454e-a20a-d287a65bf2c0" containerName="mariadb-account-create-update" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 
00:57:38.570674 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc41df9c-0cdf-454e-a20a-d287a65bf2c0" containerName="mariadb-account-create-update" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.570807 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc41df9c-0cdf-454e-a20a-d287a65bf2c0" containerName="mariadb-account-create-update" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.570821 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="30675ef2-c4d2-4fd1-8f10-ceef070130ef" containerName="mariadb-database-create" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.571744 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.574431 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vb9p8" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.591955 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f"] Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.651300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.651684 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.651758 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg9p\" (UniqueName: \"kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.753264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg9p\" (UniqueName: \"kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.753358 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " 
pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.753558 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.754210 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.754517 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.779267 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg9p\" (UniqueName: \"kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p\") pod \"653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:38 crc kubenswrapper[5030]: I0121 00:57:38.895691 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:39 crc kubenswrapper[5030]: I0121 00:57:39.302601 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f"] Jan 21 00:57:39 crc kubenswrapper[5030]: W0121 00:57:39.309475 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2070258a_189d_4670_91c2_5717863726ae.slice/crio-94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870 WatchSource:0}: Error finding container 94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870: Status 404 returned error can't find the container with id 94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870 Jan 21 00:57:39 crc kubenswrapper[5030]: I0121 00:57:39.541558 5030 generic.go:334] "Generic (PLEG): container finished" podID="2070258a-189d-4670-91c2-5717863726ae" containerID="04811408c907a49bf788d7ea7ddc76e7c8306fddbf5f99dc614d2776f2f0029a" exitCode=0 Jan 21 00:57:39 crc kubenswrapper[5030]: I0121 00:57:39.541640 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" event={"ID":"2070258a-189d-4670-91c2-5717863726ae","Type":"ContainerDied","Data":"04811408c907a49bf788d7ea7ddc76e7c8306fddbf5f99dc614d2776f2f0029a"} Jan 21 00:57:39 crc kubenswrapper[5030]: I0121 00:57:39.541913 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" event={"ID":"2070258a-189d-4670-91c2-5717863726ae","Type":"ContainerStarted","Data":"94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870"} Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.156882 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.156957 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.157005 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.157738 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.157805 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" 
containerID="cri-o://279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" gracePeriod=600 Jan 21 00:57:40 crc kubenswrapper[5030]: E0121 00:57:40.282832 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.550147 5030 generic.go:334] "Generic (PLEG): container finished" podID="2070258a-189d-4670-91c2-5717863726ae" containerID="e591307ec74214b89b97944e550020a26717239a534f5ae914919227e523f4f1" exitCode=0 Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.550201 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" event={"ID":"2070258a-189d-4670-91c2-5717863726ae","Type":"ContainerDied","Data":"e591307ec74214b89b97944e550020a26717239a534f5ae914919227e523f4f1"} Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.555570 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" exitCode=0 Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.555612 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87"} Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.555670 5030 scope.go:117] "RemoveContainer" containerID="90cfc701935004f0388f24a1bca80a66959af9772ba9792f889c6871c9551ae3" Jan 21 00:57:40 crc kubenswrapper[5030]: I0121 00:57:40.556307 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:57:40 crc kubenswrapper[5030]: E0121 00:57:40.556558 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:57:41 crc kubenswrapper[5030]: I0121 00:57:41.565786 5030 generic.go:334] "Generic (PLEG): container finished" podID="2070258a-189d-4670-91c2-5717863726ae" containerID="7949d61d73192f4f4d01d89fc08f93b8402e0b08d7e61324e248dc9fe8036c89" exitCode=0 Jan 21 00:57:41 crc kubenswrapper[5030]: I0121 00:57:41.565879 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" event={"ID":"2070258a-189d-4670-91c2-5717863726ae","Type":"ContainerDied","Data":"7949d61d73192f4f4d01d89fc08f93b8402e0b08d7e61324e248dc9fe8036c89"} Jan 21 00:57:42 crc kubenswrapper[5030]: I0121 00:57:42.872846 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.028741 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util\") pod \"2070258a-189d-4670-91c2-5717863726ae\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.028839 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg9p\" (UniqueName: \"kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p\") pod \"2070258a-189d-4670-91c2-5717863726ae\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.028936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle\") pod \"2070258a-189d-4670-91c2-5717863726ae\" (UID: \"2070258a-189d-4670-91c2-5717863726ae\") " Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.029958 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle" (OuterVolumeSpecName: "bundle") pod "2070258a-189d-4670-91c2-5717863726ae" (UID: "2070258a-189d-4670-91c2-5717863726ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.039660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p" (OuterVolumeSpecName: "kube-api-access-gpg9p") pod "2070258a-189d-4670-91c2-5717863726ae" (UID: "2070258a-189d-4670-91c2-5717863726ae"). InnerVolumeSpecName "kube-api-access-gpg9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.044064 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util" (OuterVolumeSpecName: "util") pod "2070258a-189d-4670-91c2-5717863726ae" (UID: "2070258a-189d-4670-91c2-5717863726ae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.130611 5030 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-util\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.130674 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg9p\" (UniqueName: \"kubernetes.io/projected/2070258a-189d-4670-91c2-5717863726ae-kube-api-access-gpg9p\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.130687 5030 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2070258a-189d-4670-91c2-5717863726ae-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.594141 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" event={"ID":"2070258a-189d-4670-91c2-5717863726ae","Type":"ContainerDied","Data":"94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870"} Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.594518 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94f4a6a8a930a1872f5820227589bdad94f3912e9cf26bbfcf86af9e4c99c870" Jan 21 00:57:43 crc kubenswrapper[5030]: I0121 00:57:43.594298 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f" Jan 21 00:57:44 crc kubenswrapper[5030]: I0121 00:57:44.802212 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.349982 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-z29j4"] Jan 21 00:57:45 crc kubenswrapper[5030]: E0121 00:57:45.350609 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="extract" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.350654 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="extract" Jan 21 00:57:45 crc kubenswrapper[5030]: E0121 00:57:45.350686 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="util" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.350692 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="util" Jan 21 00:57:45 crc kubenswrapper[5030]: E0121 00:57:45.350707 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="pull" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.350713 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="pull" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.350840 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2070258a-189d-4670-91c2-5717863726ae" containerName="extract" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.351273 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.354821 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.354973 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.355537 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-jg2gl" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.355767 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.369951 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-z29j4"] Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.485740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lpdf\" (UniqueName: \"kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.485896 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.587559 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.587724 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lpdf\" (UniqueName: \"kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.594365 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.605563 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lpdf\" (UniqueName: \"kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf\") pod \"keystone-db-sync-z29j4\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:45 crc kubenswrapper[5030]: I0121 00:57:45.674045 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:46 crc kubenswrapper[5030]: I0121 00:57:46.078168 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-z29j4"] Jan 21 00:57:46 crc kubenswrapper[5030]: W0121 00:57:46.081885 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15170089_9ec5_40d2_919a_cb56efb2154f.slice/crio-5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce WatchSource:0}: Error finding container 5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce: Status 404 returned error can't find the container with id 5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce Jan 21 00:57:46 crc kubenswrapper[5030]: I0121 00:57:46.617506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" event={"ID":"15170089-9ec5-40d2-919a-cb56efb2154f","Type":"ContainerStarted","Data":"180135f2bebd1daf383df1e6f748c196b1c0faa13222d5e5d3624f25327469f5"} Jan 21 00:57:46 crc kubenswrapper[5030]: I0121 00:57:46.617965 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" event={"ID":"15170089-9ec5-40d2-919a-cb56efb2154f","Type":"ContainerStarted","Data":"5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce"} Jan 21 00:57:46 crc kubenswrapper[5030]: I0121 00:57:46.644746 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" podStartSLOduration=1.644436263 podStartE2EDuration="1.644436263s" podCreationTimestamp="2026-01-21 00:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:57:46.634930876 +0000 UTC m=+8538.955191184" watchObservedRunningTime="2026-01-21 00:57:46.644436263 +0000 UTC m=+8538.964696561" Jan 21 00:57:49 crc kubenswrapper[5030]: I0121 00:57:49.640396 5030 generic.go:334] "Generic (PLEG): container finished" podID="15170089-9ec5-40d2-919a-cb56efb2154f" containerID="180135f2bebd1daf383df1e6f748c196b1c0faa13222d5e5d3624f25327469f5" exitCode=0 Jan 21 00:57:49 crc kubenswrapper[5030]: I0121 00:57:49.640472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" event={"ID":"15170089-9ec5-40d2-919a-cb56efb2154f","Type":"ContainerDied","Data":"180135f2bebd1daf383df1e6f748c196b1c0faa13222d5e5d3624f25327469f5"} Jan 21 00:57:50 crc kubenswrapper[5030]: I0121 00:57:50.985411 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.076347 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data\") pod \"15170089-9ec5-40d2-919a-cb56efb2154f\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.076596 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lpdf\" (UniqueName: \"kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf\") pod \"15170089-9ec5-40d2-919a-cb56efb2154f\" (UID: \"15170089-9ec5-40d2-919a-cb56efb2154f\") " Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.082988 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf" (OuterVolumeSpecName: "kube-api-access-2lpdf") pod "15170089-9ec5-40d2-919a-cb56efb2154f" (UID: "15170089-9ec5-40d2-919a-cb56efb2154f"). InnerVolumeSpecName "kube-api-access-2lpdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.122111 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data" (OuterVolumeSpecName: "config-data") pod "15170089-9ec5-40d2-919a-cb56efb2154f" (UID: "15170089-9ec5-40d2-919a-cb56efb2154f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.178435 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lpdf\" (UniqueName: \"kubernetes.io/projected/15170089-9ec5-40d2-919a-cb56efb2154f-kube-api-access-2lpdf\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.178474 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15170089-9ec5-40d2-919a-cb56efb2154f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.659695 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" event={"ID":"15170089-9ec5-40d2-919a-cb56efb2154f","Type":"ContainerDied","Data":"5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce"} Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.659746 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd8effce7765a7ec9b75e3cd3b6b12e90294e8ef038d34a0396c11f21b507ce" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.660184 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-db-sync-z29j4" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.886185 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-wjlc6"] Jan 21 00:57:51 crc kubenswrapper[5030]: E0121 00:57:51.886601 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15170089-9ec5-40d2-919a-cb56efb2154f" containerName="keystone-db-sync" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.890676 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="15170089-9ec5-40d2-919a-cb56efb2154f" containerName="keystone-db-sync" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.891000 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="15170089-9ec5-40d2-919a-cb56efb2154f" containerName="keystone-db-sync" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.891519 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.894135 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"osp-secret" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.894147 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.897378 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.897578 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-jg2gl" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.898239 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.929965 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-wjlc6"] Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.996559 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.996965 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.997009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.997050 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys\") pod 
\"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:51 crc kubenswrapper[5030]: I0121 00:57:51.997088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmn9s\" (UniqueName: \"kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.098241 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.098521 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.098693 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.098832 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.098984 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmn9s\" (UniqueName: \"kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.102032 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.102080 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.104232 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data\") pod \"keystone-bootstrap-wjlc6\" (UID: 
\"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.108093 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.116248 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmn9s\" (UniqueName: \"kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s\") pod \"keystone-bootstrap-wjlc6\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.214457 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.618556 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-wjlc6"] Jan 21 00:57:52 crc kubenswrapper[5030]: I0121 00:57:52.675524 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" event={"ID":"e9794b8b-70dc-4472-a009-da61c1920ca3","Type":"ContainerStarted","Data":"5dd3e08f973457809f7dc7fef3ae1951c778afa57824d3830b00c02a4f97083e"} Jan 21 00:57:53 crc kubenswrapper[5030]: I0121 00:57:53.682849 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" event={"ID":"e9794b8b-70dc-4472-a009-da61c1920ca3","Type":"ContainerStarted","Data":"487c41c3adc4e03adfbe069d0ec07221133bdf999ec4e1a5240978f7657e977c"} Jan 21 00:57:53 crc kubenswrapper[5030]: I0121 00:57:53.702035 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" podStartSLOduration=2.702019661 podStartE2EDuration="2.702019661s" podCreationTimestamp="2026-01-21 00:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:57:53.700036634 +0000 UTC m=+8546.020296922" watchObservedRunningTime="2026-01-21 00:57:53.702019661 +0000 UTC m=+8546.022279949" Jan 21 00:57:54 crc kubenswrapper[5030]: I0121 00:57:54.962251 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:57:54 crc kubenswrapper[5030]: E0121 00:57:54.962730 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:57:55 crc kubenswrapper[5030]: I0121 00:57:55.699193 5030 generic.go:334] "Generic (PLEG): container finished" podID="e9794b8b-70dc-4472-a009-da61c1920ca3" containerID="487c41c3adc4e03adfbe069d0ec07221133bdf999ec4e1a5240978f7657e977c" exitCode=0 Jan 21 00:57:55 crc kubenswrapper[5030]: I0121 00:57:55.699288 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" 
event={"ID":"e9794b8b-70dc-4472-a009-da61c1920ca3","Type":"ContainerDied","Data":"487c41c3adc4e03adfbe069d0ec07221133bdf999ec4e1a5240978f7657e977c"} Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.045465 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.177052 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys\") pod \"e9794b8b-70dc-4472-a009-da61c1920ca3\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.177103 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys\") pod \"e9794b8b-70dc-4472-a009-da61c1920ca3\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.177207 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts\") pod \"e9794b8b-70dc-4472-a009-da61c1920ca3\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.177266 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmn9s\" (UniqueName: \"kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s\") pod \"e9794b8b-70dc-4472-a009-da61c1920ca3\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.177327 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data\") pod \"e9794b8b-70dc-4472-a009-da61c1920ca3\" (UID: \"e9794b8b-70dc-4472-a009-da61c1920ca3\") " Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.182157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts" (OuterVolumeSpecName: "scripts") pod "e9794b8b-70dc-4472-a009-da61c1920ca3" (UID: "e9794b8b-70dc-4472-a009-da61c1920ca3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.182715 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e9794b8b-70dc-4472-a009-da61c1920ca3" (UID: "e9794b8b-70dc-4472-a009-da61c1920ca3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.183024 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s" (OuterVolumeSpecName: "kube-api-access-fmn9s") pod "e9794b8b-70dc-4472-a009-da61c1920ca3" (UID: "e9794b8b-70dc-4472-a009-da61c1920ca3"). InnerVolumeSpecName "kube-api-access-fmn9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.183313 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e9794b8b-70dc-4472-a009-da61c1920ca3" (UID: "e9794b8b-70dc-4472-a009-da61c1920ca3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.203474 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data" (OuterVolumeSpecName: "config-data") pod "e9794b8b-70dc-4472-a009-da61c1920ca3" (UID: "e9794b8b-70dc-4472-a009-da61c1920ca3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.278939 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.278974 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmn9s\" (UniqueName: \"kubernetes.io/projected/e9794b8b-70dc-4472-a009-da61c1920ca3-kube-api-access-fmn9s\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.278986 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.278994 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.279004 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e9794b8b-70dc-4472-a009-da61c1920ca3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.715037 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" event={"ID":"e9794b8b-70dc-4472-a009-da61c1920ca3","Type":"ContainerDied","Data":"5dd3e08f973457809f7dc7fef3ae1951c778afa57824d3830b00c02a4f97083e"} Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.715312 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd3e08f973457809f7dc7fef3ae1951c778afa57824d3830b00c02a4f97083e" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.715106 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-bootstrap-wjlc6" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.900140 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 00:57:57 crc kubenswrapper[5030]: E0121 00:57:57.900591 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9794b8b-70dc-4472-a009-da61c1920ca3" containerName="keystone-bootstrap" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.900613 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9794b8b-70dc-4472-a009-da61c1920ca3" containerName="keystone-bootstrap" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.900919 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9794b8b-70dc-4472-a009-da61c1920ca3" containerName="keystone-bootstrap" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.901883 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.905419 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-config-data" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.905426 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.905417 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-scripts" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.906886 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"keystone-keystone-dockercfg-jg2gl" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.921647 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.989414 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.989456 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.989489 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5lj\" (UniqueName: \"kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.989558 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 
00:57:57 crc kubenswrapper[5030]: I0121 00:57:57.989574 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.090863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.090908 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.090981 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.091001 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.091030 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5lj\" (UniqueName: \"kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.094834 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.095277 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.095701 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.103183 5030 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.108770 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5lj\" (UniqueName: \"kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj\") pod \"keystone-cb575f7bd-6ml4v\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.279434 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:58 crc kubenswrapper[5030]: I0121 00:57:58.718539 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 00:57:59 crc kubenswrapper[5030]: I0121 00:57:59.729764 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" event={"ID":"966dfb91-704f-4ed2-b419-b735a6f876a7","Type":"ContainerStarted","Data":"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a"} Jan 21 00:57:59 crc kubenswrapper[5030]: I0121 00:57:59.730052 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" event={"ID":"966dfb91-704f-4ed2-b419-b735a6f876a7","Type":"ContainerStarted","Data":"560c36d21bf89f858b981bddceacf8a136907fb832af1a7989661f449451ee45"} Jan 21 00:57:59 crc kubenswrapper[5030]: I0121 00:57:59.730187 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:57:59 crc kubenswrapper[5030]: I0121 00:57:59.756703 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" podStartSLOduration=2.7566803 podStartE2EDuration="2.7566803s" podCreationTimestamp="2026-01-21 00:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:57:59.74625729 +0000 UTC m=+8552.066517588" watchObservedRunningTime="2026-01-21 00:57:59.7566803 +0000 UTC m=+8552.076940588" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.797015 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.798475 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.800595 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8fs2p" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.802100 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.811104 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.874664 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.874720 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.874769 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95p7\" (UniqueName: \"kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.976425 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95p7\" (UniqueName: \"kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.976548 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.976571 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.982536 5030 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.982580 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:03 crc kubenswrapper[5030]: I0121 00:58:03.998198 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95p7\" (UniqueName: \"kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7\") pod \"horizon-operator-controller-manager-6b8c99cf5f-9xvqk\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:04 crc kubenswrapper[5030]: I0121 00:58:04.133091 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:04 crc kubenswrapper[5030]: I0121 00:58:04.552556 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 00:58:04 crc kubenswrapper[5030]: I0121 00:58:04.766769 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" event={"ID":"c0c199ff-c9e9-4b75-9639-003ffc8845b8","Type":"ContainerStarted","Data":"ed4a54773e6f1a0647bf06335dfe35af9345807473e5ccf21078854102126a05"} Jan 21 00:58:04 crc kubenswrapper[5030]: I0121 00:58:04.766836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" event={"ID":"c0c199ff-c9e9-4b75-9639-003ffc8845b8","Type":"ContainerStarted","Data":"b2b2ed4b23c5272763c95ea3d0bbd7d44c72e84473d9f8e6d4df2a2bbff9b152"} Jan 21 00:58:04 crc kubenswrapper[5030]: I0121 00:58:04.766904 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:06 crc kubenswrapper[5030]: I0121 00:58:06.961924 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:58:06 crc kubenswrapper[5030]: E0121 00:58:06.962617 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:58:14 crc kubenswrapper[5030]: I0121 00:58:14.138039 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 00:58:14 crc kubenswrapper[5030]: I0121 00:58:14.157021 5030 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" podStartSLOduration=11.157002297 podStartE2EDuration="11.157002297s" podCreationTimestamp="2026-01-21 00:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:58:04.788325307 +0000 UTC m=+8557.108585625" watchObservedRunningTime="2026-01-21 00:58:14.157002297 +0000 UTC m=+8566.477262585" Jan 21 00:58:20 crc kubenswrapper[5030]: I0121 00:58:20.961872 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:58:20 crc kubenswrapper[5030]: E0121 00:58:20.962736 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:58:29 crc kubenswrapper[5030]: I0121 00:58:29.761218 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 00:58:31 crc kubenswrapper[5030]: I0121 00:58:31.961667 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:58:31 crc kubenswrapper[5030]: E0121 00:58:31.962186 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.294066 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.296102 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.298295 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-jkt4j" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.299244 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.299406 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.300395 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.303302 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.383704 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.383778 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.383798 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.383828 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.383862 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkd9z\" (UniqueName: \"kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.399913 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.401019 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.418892 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.485186 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.485282 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.485305 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.485331 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.485367 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkd9z\" (UniqueName: \"kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.486081 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.486126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.486783 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.491722 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key\") pod \"horizon-9b986b9c-v9xxv\" (UID: 
\"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.504429 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkd9z\" (UniqueName: \"kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z\") pod \"horizon-9b986b9c-v9xxv\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.587189 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.587396 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.587607 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.587740 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.587791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmpw\" (UniqueName: \"kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.619125 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.689374 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.689462 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmpw\" (UniqueName: \"kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.689561 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.689676 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.689802 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.691273 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.692469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.695359 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.695384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 
00:58:39.714152 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmpw\" (UniqueName: \"kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw\") pod \"horizon-598f976c49-nrrkf\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:39 crc kubenswrapper[5030]: I0121 00:58:39.716942 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:40 crc kubenswrapper[5030]: I0121 00:58:40.054192 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:58:40 crc kubenswrapper[5030]: W0121 00:58:40.057524 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda580e97c_5bf6_4dfc_a36b_2faa72c85f7f.slice/crio-77e51c769be409f74b3f69c9337d2a67d1b441d542c5d86839bdfa1a5856fe14 WatchSource:0}: Error finding container 77e51c769be409f74b3f69c9337d2a67d1b441d542c5d86839bdfa1a5856fe14: Status 404 returned error can't find the container with id 77e51c769be409f74b3f69c9337d2a67d1b441d542c5d86839bdfa1a5856fe14 Jan 21 00:58:40 crc kubenswrapper[5030]: I0121 00:58:40.124106 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:58:41 crc kubenswrapper[5030]: I0121 00:58:41.088200 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerStarted","Data":"2b0fa54d139202db90d685378598c2205c8855d47595239b5c0a3431660137d3"} Jan 21 00:58:41 crc kubenswrapper[5030]: I0121 00:58:41.091167 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerStarted","Data":"77e51c769be409f74b3f69c9337d2a67d1b441d542c5d86839bdfa1a5856fe14"} Jan 21 00:58:46 crc kubenswrapper[5030]: I0121 00:58:46.962107 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:58:46 crc kubenswrapper[5030]: E0121 00:58:46.962733 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.151852 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerStarted","Data":"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227"} Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.152348 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerStarted","Data":"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc"} Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.153773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" 
event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerStarted","Data":"edcddb03ca554ec7b8363e1bd9a207b0f493017962e580be17e06521ebd48489"} Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.153799 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerStarted","Data":"65ae67a38359edf27af76d9369160a0f81f857cb883fa2d170b2a0e2f0ffff63"} Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.174917 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podStartSLOduration=2.339375842 podStartE2EDuration="9.174902689s" podCreationTimestamp="2026-01-21 00:58:39 +0000 UTC" firstStartedPulling="2026-01-21 00:58:40.126982441 +0000 UTC m=+8592.447242729" lastFinishedPulling="2026-01-21 00:58:46.962509288 +0000 UTC m=+8599.282769576" observedRunningTime="2026-01-21 00:58:48.170429052 +0000 UTC m=+8600.490689350" watchObservedRunningTime="2026-01-21 00:58:48.174902689 +0000 UTC m=+8600.495162977" Jan 21 00:58:48 crc kubenswrapper[5030]: I0121 00:58:48.200246 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podStartSLOduration=2.302414298 podStartE2EDuration="9.200230185s" podCreationTimestamp="2026-01-21 00:58:39 +0000 UTC" firstStartedPulling="2026-01-21 00:58:40.059710612 +0000 UTC m=+8592.379970900" lastFinishedPulling="2026-01-21 00:58:46.957526499 +0000 UTC m=+8599.277786787" observedRunningTime="2026-01-21 00:58:48.199082297 +0000 UTC m=+8600.519342585" watchObservedRunningTime="2026-01-21 00:58:48.200230185 +0000 UTC m=+8600.520490483" Jan 21 00:58:49 crc kubenswrapper[5030]: I0121 00:58:49.619539 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:49 crc kubenswrapper[5030]: I0121 00:58:49.619637 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:58:49 crc kubenswrapper[5030]: I0121 00:58:49.718229 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:49 crc kubenswrapper[5030]: I0121 00:58:49.718380 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:58:59 crc kubenswrapper[5030]: I0121 00:58:59.621816 5030 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.206:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.206:8080: connect: connection refused" Jan 21 00:58:59 crc kubenswrapper[5030]: I0121 00:58:59.719025 5030 prober.go:107] "Probe failed" probeType="Startup" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.207:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.207:8080: connect: connection refused" Jan 21 00:59:01 crc kubenswrapper[5030]: I0121 00:59:01.962739 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:59:01 crc kubenswrapper[5030]: E0121 00:59:01.964133 5030 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.476834 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.478461 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.488381 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5pc6\" (UniqueName: \"kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.488452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.488545 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.495123 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.590353 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.590543 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5pc6\" (UniqueName: \"kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.590703 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.590915 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.591252 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.610437 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5pc6\" (UniqueName: \"kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6\") pod \"redhat-marketplace-mdd9p\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:02 crc kubenswrapper[5030]: I0121 00:59:02.817700 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:03 crc kubenswrapper[5030]: I0121 00:59:03.349106 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:03 crc kubenswrapper[5030]: W0121 00:59:03.356422 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d5277c_ebb1_40d4_994c_1e6a4685c875.slice/crio-9883f17f129da3afd9b04ae113662917cb5157c64f0253e83fab42fb2afc762c WatchSource:0}: Error finding container 9883f17f129da3afd9b04ae113662917cb5157c64f0253e83fab42fb2afc762c: Status 404 returned error can't find the container with id 9883f17f129da3afd9b04ae113662917cb5157c64f0253e83fab42fb2afc762c Jan 21 00:59:03 crc kubenswrapper[5030]: E0121 00:59:03.716922 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d5277c_ebb1_40d4_994c_1e6a4685c875.slice/crio-conmon-6718283850ac8d48b2cfa207be3451eb084c81a3318f4f6fb9cb2105df06dc6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d5277c_ebb1_40d4_994c_1e6a4685c875.slice/crio-6718283850ac8d48b2cfa207be3451eb084c81a3318f4f6fb9cb2105df06dc6b.scope\": RecentStats: unable to find data in memory cache]" Jan 21 00:59:04 crc kubenswrapper[5030]: I0121 00:59:04.325981 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerID="6718283850ac8d48b2cfa207be3451eb084c81a3318f4f6fb9cb2105df06dc6b" exitCode=0 Jan 21 00:59:04 crc kubenswrapper[5030]: I0121 00:59:04.326022 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerDied","Data":"6718283850ac8d48b2cfa207be3451eb084c81a3318f4f6fb9cb2105df06dc6b"} Jan 21 00:59:04 crc kubenswrapper[5030]: I0121 00:59:04.326046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerStarted","Data":"9883f17f129da3afd9b04ae113662917cb5157c64f0253e83fab42fb2afc762c"} Jan 21 00:59:05 crc kubenswrapper[5030]: I0121 00:59:05.336213 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerStarted","Data":"f3ee3854db02bf648ae6e87095ee6c16809f7710e5ae1d5d01124bcef125d981"} Jan 21 00:59:06 crc kubenswrapper[5030]: I0121 00:59:06.349268 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerID="f3ee3854db02bf648ae6e87095ee6c16809f7710e5ae1d5d01124bcef125d981" exitCode=0 Jan 21 00:59:06 crc kubenswrapper[5030]: I0121 00:59:06.349358 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerDied","Data":"f3ee3854db02bf648ae6e87095ee6c16809f7710e5ae1d5d01124bcef125d981"} Jan 21 00:59:07 crc kubenswrapper[5030]: I0121 00:59:07.358750 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerStarted","Data":"201772614cb6632f7ce6477d3e410c746d4437b4e57fff25fa259ab09884200a"} Jan 21 00:59:07 crc kubenswrapper[5030]: I0121 00:59:07.379887 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mdd9p" podStartSLOduration=2.975885122 podStartE2EDuration="5.379863475s" podCreationTimestamp="2026-01-21 00:59:02 +0000 UTC" firstStartedPulling="2026-01-21 00:59:04.328189939 +0000 UTC m=+8616.648450227" lastFinishedPulling="2026-01-21 00:59:06.732168292 +0000 UTC m=+8619.052428580" observedRunningTime="2026-01-21 00:59:07.377584391 +0000 UTC m=+8619.697844699" watchObservedRunningTime="2026-01-21 00:59:07.379863475 +0000 UTC m=+8619.700123763" Jan 21 00:59:11 crc kubenswrapper[5030]: I0121 00:59:11.360676 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:59:11 crc kubenswrapper[5030]: I0121 00:59:11.533285 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:59:12 crc kubenswrapper[5030]: I0121 00:59:12.817898 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:12 crc kubenswrapper[5030]: I0121 00:59:12.818005 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:12 crc kubenswrapper[5030]: I0121 00:59:12.870079 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.044281 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.245116 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.308816 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.407189 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon-log" 
containerID="cri-o://65ae67a38359edf27af76d9369160a0f81f857cb883fa2d170b2a0e2f0ffff63" gracePeriod=30 Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.407287 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" containerID="cri-o://edcddb03ca554ec7b8363e1bd9a207b0f493017962e580be17e06521ebd48489" gracePeriod=30 Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.459468 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:13 crc kubenswrapper[5030]: I0121 00:59:13.506513 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:15 crc kubenswrapper[5030]: I0121 00:59:15.422958 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mdd9p" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="registry-server" containerID="cri-o://201772614cb6632f7ce6477d3e410c746d4437b4e57fff25fa259ab09884200a" gracePeriod=2 Jan 21 00:59:15 crc kubenswrapper[5030]: I0121 00:59:15.962901 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:59:15 crc kubenswrapper[5030]: E0121 00:59:15.963671 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.189414 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn"] Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.191438 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.196776 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-policy" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.211837 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn"] Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.263004 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn"] Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.263581 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data horizon-secret-key kube-api-access-c7hsq logs policy scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" podUID="ba7f42e3-85f4-40ce-9f69-7bc141825b20" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.271431 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.272230 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" containerID="cri-o://0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227" gracePeriod=30 Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.271931 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon-log" containerID="cri-o://01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc" gracePeriod=30 Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.332911 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.333595 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.333859 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hsq\" (UniqueName: \"kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.334020 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.334129 5030 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.334300 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.433287 5030 generic.go:334] "Generic (PLEG): container finished" podID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerID="201772614cb6632f7ce6477d3e410c746d4437b4e57fff25fa259ab09884200a" exitCode=0 Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.433366 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.433722 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerDied","Data":"201772614cb6632f7ce6477d3e410c746d4437b4e57fff25fa259ab09884200a"} Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435680 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435752 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435804 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.435813 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435864 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hsq\" (UniqueName: \"kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.435878 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. 
No retries permitted until 2026-01-21 00:59:16.935859275 +0000 UTC m=+8629.256119573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : configmap "horizon-scripts" not found Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435909 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.435942 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.436035 5030 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.436068 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:16.93605793 +0000 UTC m=+8629.256318218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : secret "horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.436226 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.436273 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:16.936263415 +0000 UTC m=+8629.256523703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : configmap "horizon-config-data" not found Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.436314 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.436352 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.438644 5030 projected.go:194] Error preparing data for projected volume kube-api-access-c7hsq for pod horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn: failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.438702 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:16.938689163 +0000 UTC m=+8629.258949461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c7hsq" (UniqueName: "kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.482602 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.484587 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.638979 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities\") pod \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639079 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy\") pod \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639123 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content\") pod \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639158 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5pc6\" (UniqueName: \"kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6\") pod \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\" (UID: \"f4d5277c-ebb1-40d4-994c-1e6a4685c875\") " Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639260 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs\") pod \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639584 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy" (OuterVolumeSpecName: "policy") pod "ba7f42e3-85f4-40ce-9f69-7bc141825b20" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20"). InnerVolumeSpecName "policy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639912 5030 reconciler_common.go:293] "Volume detached for volume \"policy\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-policy\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.639936 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities" (OuterVolumeSpecName: "utilities") pod "f4d5277c-ebb1-40d4-994c-1e6a4685c875" (UID: "f4d5277c-ebb1-40d4-994c-1e6a4685c875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.640246 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs" (OuterVolumeSpecName: "logs") pod "ba7f42e3-85f4-40ce-9f69-7bc141825b20" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.658479 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6" (OuterVolumeSpecName: "kube-api-access-h5pc6") pod "f4d5277c-ebb1-40d4-994c-1e6a4685c875" (UID: "f4d5277c-ebb1-40d4-994c-1e6a4685c875"). InnerVolumeSpecName "kube-api-access-h5pc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.665965 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4d5277c-ebb1-40d4-994c-1e6a4685c875" (UID: "f4d5277c-ebb1-40d4-994c-1e6a4685c875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.741877 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.741923 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5pc6\" (UniqueName: \"kubernetes.io/projected/f4d5277c-ebb1-40d4-994c-1e6a4685c875-kube-api-access-h5pc6\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.741933 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba7f42e3-85f4-40ce-9f69-7bc141825b20-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.741942 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4d5277c-ebb1-40d4-994c-1e6a4685c875-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.945332 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.945426 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.945512 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-scripts: configmap "horizon-scripts" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.945603 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:17.945578368 +0000 UTC m=+8630.265838696 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : configmap "horizon-scripts" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.945683 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/horizon-config-data: configmap "horizon-config-data" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.945787 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:17.945764182 +0000 UTC m=+8630.266024550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : configmap "horizon-config-data" not found Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.945515 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hsq\" (UniqueName: \"kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: I0121 00:59:16.945993 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key\") pod \"horizon-6f6c5c44d8-fktdn\" (UID: \"ba7f42e3-85f4-40ce-9f69-7bc141825b20\") " pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.946185 5030 secret.go:188] Couldn't get secret horizon-kuttl-tests/horizon: secret "horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.946274 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:17.946254844 +0000 UTC m=+8630.266515122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "horizon-secret-key" (UniqueName: "kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : secret "horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.948843 5030 projected.go:194] Error preparing data for projected volume kube-api-access-c7hsq for pod horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn: failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 00:59:16 crc kubenswrapper[5030]: E0121 00:59:16.948916 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq podName:ba7f42e3-85f4-40ce-9f69-7bc141825b20 nodeName:}" failed. No retries permitted until 2026-01-21 00:59:17.948900978 +0000 UTC m=+8630.269161286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c7hsq" (UniqueName: "kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq") pod "horizon-6f6c5c44d8-fktdn" (UID: "ba7f42e3-85f4-40ce-9f69-7bc141825b20") : failed to fetch token: serviceaccounts "horizon-horizon" not found Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.448046 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mdd9p" event={"ID":"f4d5277c-ebb1-40d4-994c-1e6a4685c875","Type":"ContainerDied","Data":"9883f17f129da3afd9b04ae113662917cb5157c64f0253e83fab42fb2afc762c"} Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.448147 5030 scope.go:117] "RemoveContainer" containerID="201772614cb6632f7ce6477d3e410c746d4437b4e57fff25fa259ab09884200a" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.448229 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mdd9p" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.451980 5030 generic.go:334] "Generic (PLEG): container finished" podID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerID="edcddb03ca554ec7b8363e1bd9a207b0f493017962e580be17e06521ebd48489" exitCode=0 Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.452100 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.452097 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerDied","Data":"edcddb03ca554ec7b8363e1bd9a207b0f493017962e580be17e06521ebd48489"} Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.480063 5030 scope.go:117] "RemoveContainer" containerID="f3ee3854db02bf648ae6e87095ee6c16809f7710e5ae1d5d01124bcef125d981" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.527256 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn"] Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.529099 5030 scope.go:117] "RemoveContainer" containerID="6718283850ac8d48b2cfa207be3451eb084c81a3318f4f6fb9cb2105df06dc6b" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.537092 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-6f6c5c44d8-fktdn"] Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.543796 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.550234 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mdd9p"] Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.662600 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.662662 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba7f42e3-85f4-40ce-9f69-7bc141825b20-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.662681 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/ba7f42e3-85f4-40ce-9f69-7bc141825b20-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.662699 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7hsq\" (UniqueName: \"kubernetes.io/projected/ba7f42e3-85f4-40ce-9f69-7bc141825b20-kube-api-access-c7hsq\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.972607 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7f42e3-85f4-40ce-9f69-7bc141825b20" path="/var/lib/kubelet/pods/ba7f42e3-85f4-40ce-9f69-7bc141825b20/volumes" Jan 21 00:59:17 crc kubenswrapper[5030]: I0121 00:59:17.973254 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" path="/var/lib/kubelet/pods/f4d5277c-ebb1-40d4-994c-1e6a4685c875/volumes" Jan 21 00:59:19 crc kubenswrapper[5030]: I0121 00:59:19.620221 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.206:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.206:8080: connect: connection refused" Jan 21 00:59:19 crc kubenswrapper[5030]: I0121 00:59:19.719755 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.207:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.207:8080: connect: connection refused" Jan 21 00:59:20 crc kubenswrapper[5030]: I0121 00:59:20.479074 5030 generic.go:334] "Generic (PLEG): container finished" podID="1117f071-bdf5-4566-bf84-c921da09b23f" containerID="0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227" exitCode=0 Jan 21 00:59:20 crc kubenswrapper[5030]: I0121 00:59:20.479120 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerDied","Data":"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227"} Jan 21 00:59:26 crc kubenswrapper[5030]: I0121 00:59:26.962533 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:59:26 crc kubenswrapper[5030]: E0121 00:59:26.963405 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:59:29 crc kubenswrapper[5030]: I0121 00:59:29.620775 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.206:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.206:8080: connect: connection refused" Jan 21 00:59:29 crc kubenswrapper[5030]: I0121 00:59:29.718231 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" 
podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.207:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.207:8080: connect: connection refused" Jan 21 00:59:39 crc kubenswrapper[5030]: I0121 00:59:39.620080 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.206:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.206:8080: connect: connection refused" Jan 21 00:59:39 crc kubenswrapper[5030]: I0121 00:59:39.620659 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:59:39 crc kubenswrapper[5030]: I0121 00:59:39.718095 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.207:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.207:8080: connect: connection refused" Jan 21 00:59:39 crc kubenswrapper[5030]: I0121 00:59:39.718204 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:59:41 crc kubenswrapper[5030]: I0121 00:59:41.963230 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:59:41 crc kubenswrapper[5030]: E0121 00:59:41.963769 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.673593 5030 generic.go:334] "Generic (PLEG): container finished" podID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerID="65ae67a38359edf27af76d9369160a0f81f857cb883fa2d170b2a0e2f0ffff63" exitCode=137 Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.673659 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerDied","Data":"65ae67a38359edf27af76d9369160a0f81f857cb883fa2d170b2a0e2f0ffff63"} Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.832577 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.904575 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkd9z\" (UniqueName: \"kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z\") pod \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.904672 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data\") pod \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.904711 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs\") pod \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.904743 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts\") pod \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.904817 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key\") pod \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\" (UID: \"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f\") " Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.906012 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs" (OuterVolumeSpecName: "logs") pod "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" (UID: "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.909567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" (UID: "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.909950 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z" (OuterVolumeSpecName: "kube-api-access-kkd9z") pod "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" (UID: "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f"). InnerVolumeSpecName "kube-api-access-kkd9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.925545 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data" (OuterVolumeSpecName: "config-data") pod "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" (UID: "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:59:43 crc kubenswrapper[5030]: I0121 00:59:43.927157 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts" (OuterVolumeSpecName: "scripts") pod "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" (UID: "a580e97c-5bf6-4dfc-a36b-2faa72c85f7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.006465 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.006509 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkd9z\" (UniqueName: \"kubernetes.io/projected/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-kube-api-access-kkd9z\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.006551 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.006564 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.006579 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.687788 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" event={"ID":"a580e97c-5bf6-4dfc-a36b-2faa72c85f7f","Type":"ContainerDied","Data":"77e51c769be409f74b3f69c9337d2a67d1b441d542c5d86839bdfa1a5856fe14"} Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.687844 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-9b986b9c-v9xxv" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.688330 5030 scope.go:117] "RemoveContainer" containerID="edcddb03ca554ec7b8363e1bd9a207b0f493017962e580be17e06521ebd48489" Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.719504 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.724013 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-9b986b9c-v9xxv"] Jan 21 00:59:44 crc kubenswrapper[5030]: I0121 00:59:44.888276 5030 scope.go:117] "RemoveContainer" containerID="65ae67a38359edf27af76d9369160a0f81f857cb883fa2d170b2a0e2f0ffff63" Jan 21 00:59:45 crc kubenswrapper[5030]: I0121 00:59:45.973315 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" path="/var/lib/kubelet/pods/a580e97c-5bf6-4dfc-a36b-2faa72c85f7f/volumes" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.658593 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.711101 5030 generic.go:334] "Generic (PLEG): container finished" podID="1117f071-bdf5-4566-bf84-c921da09b23f" containerID="01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc" exitCode=137 Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.711178 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerDied","Data":"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc"} Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.711228 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" event={"ID":"1117f071-bdf5-4566-bf84-c921da09b23f","Type":"ContainerDied","Data":"2b0fa54d139202db90d685378598c2205c8855d47595239b5c0a3431660137d3"} Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.711264 5030 scope.go:117] "RemoveContainer" containerID="0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.711455 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-598f976c49-nrrkf" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.848247 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbmpw\" (UniqueName: \"kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw\") pod \"1117f071-bdf5-4566-bf84-c921da09b23f\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.848368 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key\") pod \"1117f071-bdf5-4566-bf84-c921da09b23f\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.848416 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts\") pod \"1117f071-bdf5-4566-bf84-c921da09b23f\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.848454 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data\") pod \"1117f071-bdf5-4566-bf84-c921da09b23f\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.848729 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs\") pod \"1117f071-bdf5-4566-bf84-c921da09b23f\" (UID: \"1117f071-bdf5-4566-bf84-c921da09b23f\") " Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.849794 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs" (OuterVolumeSpecName: "logs") pod "1117f071-bdf5-4566-bf84-c921da09b23f" (UID: "1117f071-bdf5-4566-bf84-c921da09b23f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.854352 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw" (OuterVolumeSpecName: "kube-api-access-xbmpw") pod "1117f071-bdf5-4566-bf84-c921da09b23f" (UID: "1117f071-bdf5-4566-bf84-c921da09b23f"). InnerVolumeSpecName "kube-api-access-xbmpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.863718 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1117f071-bdf5-4566-bf84-c921da09b23f" (UID: "1117f071-bdf5-4566-bf84-c921da09b23f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.866913 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts" (OuterVolumeSpecName: "scripts") pod "1117f071-bdf5-4566-bf84-c921da09b23f" (UID: "1117f071-bdf5-4566-bf84-c921da09b23f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.872729 5030 scope.go:117] "RemoveContainer" containerID="01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.883315 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data" (OuterVolumeSpecName: "config-data") pod "1117f071-bdf5-4566-bf84-c921da09b23f" (UID: "1117f071-bdf5-4566-bf84-c921da09b23f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.915170 5030 scope.go:117] "RemoveContainer" containerID="0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227" Jan 21 00:59:46 crc kubenswrapper[5030]: E0121 00:59:46.916063 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227\": container with ID starting with 0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227 not found: ID does not exist" containerID="0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.916149 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227"} err="failed to get container status \"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227\": rpc error: code = NotFound desc = could not find container \"0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227\": container with ID starting with 0a6d81f8d94f509e941c93ba8b637d200e190d0074739d748fe8c68120b4a227 not found: ID does not exist" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.916218 5030 scope.go:117] "RemoveContainer" containerID="01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc" Jan 21 00:59:46 crc kubenswrapper[5030]: E0121 00:59:46.916749 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc\": container with ID starting with 01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc not found: ID does not exist" containerID="01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.916819 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc"} err="failed to get container status \"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc\": rpc error: code = NotFound desc = could not find container \"01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc\": container with ID starting with 01ec1c89e83079b5d720d825afd3b6e61978c4d23acad4a16565e52fe64605bc not found: ID does not exist" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.950160 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbmpw\" (UniqueName: \"kubernetes.io/projected/1117f071-bdf5-4566-bf84-c921da09b23f-kube-api-access-xbmpw\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.950195 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1117f071-bdf5-4566-bf84-c921da09b23f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.950205 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.950214 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1117f071-bdf5-4566-bf84-c921da09b23f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:46 crc kubenswrapper[5030]: I0121 00:59:46.950223 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1117f071-bdf5-4566-bf84-c921da09b23f-logs\") on node \"crc\" DevicePath \"\"" Jan 21 00:59:47 crc kubenswrapper[5030]: I0121 00:59:47.048633 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:59:47 crc kubenswrapper[5030]: I0121 00:59:47.053004 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-598f976c49-nrrkf"] Jan 21 00:59:47 crc kubenswrapper[5030]: I0121 00:59:47.974322 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" path="/var/lib/kubelet/pods/1117f071-bdf5-4566-bf84-c921da09b23f/volumes" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.293712 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294099 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="extract-utilities" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294115 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="extract-utilities" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294145 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294153 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294169 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294177 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294192 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="registry-server" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294200 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="registry-server" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294212 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294219 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294240 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="extract-content" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294247 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="extract-content" Jan 21 00:59:48 crc kubenswrapper[5030]: E0121 00:59:48.294262 5030 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294269 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294442 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294458 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294469 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d5277c-ebb1-40d4-994c-1e6a4685c875" containerName="registry-server" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294482 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a580e97c-5bf6-4dfc-a36b-2faa72c85f7f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.294492 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1117f071-bdf5-4566-bf84-c921da09b23f" containerName="horizon-log" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.295354 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.297335 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-config-data" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.297430 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon-horizon-dockercfg-v4wmq" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.297443 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"horizon" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.297429 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"combined-ca-bundle" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.297642 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"horizon-kuttl-tests"/"horizon-scripts" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.298629 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"cert-horizon-svc" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.301702 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.348467 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.350126 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.373862 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.473356 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.473415 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb4qx\" (UniqueName: \"kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.473471 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.473987 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474054 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474088 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474145 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474177 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc 
kubenswrapper[5030]: I0121 00:59:48.474265 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474350 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474402 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474466 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzvw\" (UniqueName: \"kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474501 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.474533 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.575874 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.575939 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.575987 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " 
pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576012 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576040 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576071 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576148 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576193 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576242 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzvw\" (UniqueName: \"kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576266 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576297 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc 
kubenswrapper[5030]: I0121 00:59:48.576322 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.576354 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb4qx\" (UniqueName: \"kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.577007 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.577371 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.577386 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.578075 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.578183 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.578567 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.581090 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.581389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.581796 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.582389 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.582951 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.589655 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.594010 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb4qx\" (UniqueName: \"kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx\") pod \"horizon-8cd586586-mphzt\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.604375 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzvw\" (UniqueName: \"kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw\") pod \"horizon-754c467c7d-s8jx7\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.618443 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.664728 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:48 crc kubenswrapper[5030]: I0121 00:59:48.921360 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 00:59:48 crc kubenswrapper[5030]: W0121 00:59:48.937355 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod785e75c0_0b35_454f_934d_cec21d5f2749.slice/crio-71d6a446ce81f66571d9875b1e1aae26637330a3a3032b3feeea332e5f06cb1e WatchSource:0}: Error finding container 71d6a446ce81f66571d9875b1e1aae26637330a3a3032b3feeea332e5f06cb1e: Status 404 returned error can't find the container with id 71d6a446ce81f66571d9875b1e1aae26637330a3a3032b3feeea332e5f06cb1e Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.169602 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 00:59:49 crc kubenswrapper[5030]: W0121 00:59:49.176542 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b3b980_8af5_466d_a4b8_a9dec837c10e.slice/crio-8ef78e129111ecdd6fb034c559e5d786486ad524f0a14a94fde03fc4a22fd0b7 WatchSource:0}: Error finding container 8ef78e129111ecdd6fb034c559e5d786486ad524f0a14a94fde03fc4a22fd0b7: Status 404 returned error can't find the container with id 8ef78e129111ecdd6fb034c559e5d786486ad524f0a14a94fde03fc4a22fd0b7 Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.759125 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerStarted","Data":"82db556c4f1f8a9be908349876a13003ec9e43684d8e05cc93ce170b78c0b121"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.759192 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerStarted","Data":"41a9164621a0c7dc2e9abe48ba6d72650fefadf1b5fe40772a5232b0bb695337"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.759208 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerStarted","Data":"71d6a446ce81f66571d9875b1e1aae26637330a3a3032b3feeea332e5f06cb1e"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.760694 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerStarted","Data":"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.760728 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerStarted","Data":"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.760742 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerStarted","Data":"8ef78e129111ecdd6fb034c559e5d786486ad524f0a14a94fde03fc4a22fd0b7"} Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.789967 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podStartSLOduration=1.789947911 podStartE2EDuration="1.789947911s" podCreationTimestamp="2026-01-21 00:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:59:49.782704528 +0000 UTC m=+8662.102964836" watchObservedRunningTime="2026-01-21 00:59:49.789947911 +0000 UTC m=+8662.110208199" Jan 21 00:59:49 crc kubenswrapper[5030]: I0121 00:59:49.816438 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podStartSLOduration=1.8164198040000001 podStartE2EDuration="1.816419804s" podCreationTimestamp="2026-01-21 00:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 00:59:49.81036053 +0000 UTC m=+8662.130620828" watchObservedRunningTime="2026-01-21 00:59:49.816419804 +0000 UTC m=+8662.136680092" Jan 21 00:59:55 crc kubenswrapper[5030]: I0121 00:59:55.961700 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 00:59:55 crc kubenswrapper[5030]: E0121 00:59:55.962350 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 00:59:58 crc kubenswrapper[5030]: I0121 00:59:58.618580 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:58 crc kubenswrapper[5030]: I0121 00:59:58.618917 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 00:59:58 crc kubenswrapper[5030]: I0121 00:59:58.664795 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 00:59:58 crc kubenswrapper[5030]: I0121 00:59:58.664883 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.134424 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv"] Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.137041 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.139306 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.139491 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.144265 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv"] Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.292335 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.292676 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcg7\" (UniqueName: \"kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.292791 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.394047 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcg7\" (UniqueName: \"kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.394420 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.394596 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.395351 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume\") pod 
\"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.400961 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.414291 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcg7\" (UniqueName: \"kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7\") pod \"collect-profiles-29482620-k2sfv\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.463289 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:00 crc kubenswrapper[5030]: I0121 01:00:00.894113 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv"] Jan 21 01:00:01 crc kubenswrapper[5030]: I0121 01:00:01.866946 5030 generic.go:334] "Generic (PLEG): container finished" podID="6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" containerID="2a31a645a937ba5dd4b9e8967a1fa6475ebf9a9ed215b390ef6cb6d72abaa9b1" exitCode=0 Jan 21 01:00:01 crc kubenswrapper[5030]: I0121 01:00:01.867278 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" event={"ID":"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5","Type":"ContainerDied","Data":"2a31a645a937ba5dd4b9e8967a1fa6475ebf9a9ed215b390ef6cb6d72abaa9b1"} Jan 21 01:00:01 crc kubenswrapper[5030]: I0121 01:00:01.867309 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" event={"ID":"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5","Type":"ContainerStarted","Data":"c617cad2ac9aabefb3c88dcc4247e3d1186c852cbaad5cc0aa1b7ea1b28a5215"} Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.175321 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.338245 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume\") pod \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.338361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume\") pod \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.338461 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbcg7\" (UniqueName: \"kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7\") pod \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\" (UID: \"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5\") " Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.339993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" (UID: "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.344599 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7" (OuterVolumeSpecName: "kube-api-access-wbcg7") pod "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" (UID: "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5"). InnerVolumeSpecName "kube-api-access-wbcg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.357217 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" (UID: "6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.440783 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbcg7\" (UniqueName: \"kubernetes.io/projected/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-kube-api-access-wbcg7\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.440834 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.440849 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.882798 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" event={"ID":"6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5","Type":"ContainerDied","Data":"c617cad2ac9aabefb3c88dcc4247e3d1186c852cbaad5cc0aa1b7ea1b28a5215"} Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.883120 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c617cad2ac9aabefb3c88dcc4247e3d1186c852cbaad5cc0aa1b7ea1b28a5215" Jan 21 01:00:03 crc kubenswrapper[5030]: I0121 01:00:03.882860 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv" Jan 21 01:00:04 crc kubenswrapper[5030]: E0121 01:00:04.041402 5030 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0a6e2a_f088_4d4a_a5e5_5a62b4218cf5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0a6e2a_f088_4d4a_a5e5_5a62b4218cf5.slice/crio-c617cad2ac9aabefb3c88dcc4247e3d1186c852cbaad5cc0aa1b7ea1b28a5215\": RecentStats: unable to find data in memory cache]" Jan 21 01:00:04 crc kubenswrapper[5030]: I0121 01:00:04.091732 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl"] Jan 21 01:00:04 crc kubenswrapper[5030]: I0121 01:00:04.101553 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482575-lmgpl"] Jan 21 01:00:05 crc kubenswrapper[5030]: I0121 01:00:05.974739 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c6a312-8157-4460-9800-7f2df0edc7fa" path="/var/lib/kubelet/pods/b7c6a312-8157-4460-9800-7f2df0edc7fa/volumes" Jan 21 01:00:08 crc kubenswrapper[5030]: I0121 01:00:08.962091 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:00:08 crc kubenswrapper[5030]: E0121 01:00:08.962449 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:00:10 crc 
kubenswrapper[5030]: I0121 01:00:10.442581 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 01:00:10 crc kubenswrapper[5030]: I0121 01:00:10.534095 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:12 crc kubenswrapper[5030]: I0121 01:00:12.076217 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 01:00:12 crc kubenswrapper[5030]: I0121 01:00:12.230895 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:12 crc kubenswrapper[5030]: I0121 01:00:12.294523 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 01:00:12 crc kubenswrapper[5030]: I0121 01:00:12.975356 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" containerID="cri-o://82db556c4f1f8a9be908349876a13003ec9e43684d8e05cc93ce170b78c0b121" gracePeriod=30 Jan 21 01:00:12 crc kubenswrapper[5030]: I0121 01:00:12.975308 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon-log" containerID="cri-o://41a9164621a0c7dc2e9abe48ba6d72650fefadf1b5fe40772a5232b0bb695337" gracePeriod=30 Jan 21 01:00:13 crc kubenswrapper[5030]: I0121 01:00:13.416146 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 01:00:13 crc kubenswrapper[5030]: I0121 01:00:13.421089 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon-log" containerID="cri-o://fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd" gracePeriod=30 Jan 21 01:00:13 crc kubenswrapper[5030]: I0121 01:00:13.421183 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" containerID="cri-o://8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2" gracePeriod=30 Jan 21 01:00:17 crc kubenswrapper[5030]: I0121 01:00:17.004184 5030 generic.go:334] "Generic (PLEG): container finished" podID="785e75c0-0b35-454f-934d-cec21d5f2749" containerID="82db556c4f1f8a9be908349876a13003ec9e43684d8e05cc93ce170b78c0b121" exitCode=0 Jan 21 01:00:17 crc kubenswrapper[5030]: I0121 01:00:17.004242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerDied","Data":"82db556c4f1f8a9be908349876a13003ec9e43684d8e05cc93ce170b78c0b121"} Jan 21 01:00:17 crc kubenswrapper[5030]: I0121 01:00:17.006900 5030 generic.go:334] "Generic (PLEG): container finished" podID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerID="8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2" exitCode=0 Jan 21 01:00:17 crc kubenswrapper[5030]: I0121 01:00:17.006943 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" 
event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerDied","Data":"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2"} Jan 21 01:00:18 crc kubenswrapper[5030]: I0121 01:00:18.619769 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.210:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.210:8443: connect: connection refused" Jan 21 01:00:18 crc kubenswrapper[5030]: I0121 01:00:18.665912 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.211:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.211:8443: connect: connection refused" Jan 21 01:00:23 crc kubenswrapper[5030]: I0121 01:00:23.967076 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:00:23 crc kubenswrapper[5030]: E0121 01:00:23.969700 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:00:28 crc kubenswrapper[5030]: I0121 01:00:28.620074 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.210:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.210:8443: connect: connection refused" Jan 21 01:00:28 crc kubenswrapper[5030]: I0121 01:00:28.665386 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.211:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.211:8443: connect: connection refused" Jan 21 01:00:35 crc kubenswrapper[5030]: I0121 01:00:35.602932 5030 scope.go:117] "RemoveContainer" containerID="7e2729c7f2dedcba7ffe2a612f970854d6a0b3bacf69942ee1e4c7982d3371fc" Jan 21 01:00:37 crc kubenswrapper[5030]: I0121 01:00:37.962359 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:00:37 crc kubenswrapper[5030]: E0121 01:00:37.963300 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:00:38 crc kubenswrapper[5030]: I0121 01:00:38.619826 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.1.210:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.210:8443: connect: connection refused" Jan 21 01:00:38 crc kubenswrapper[5030]: I0121 01:00:38.619977 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 01:00:38 crc kubenswrapper[5030]: I0121 01:00:38.666177 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.211:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.211:8443: connect: connection refused" Jan 21 01:00:38 crc kubenswrapper[5030]: I0121 01:00:38.666310 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.223869 5030 generic.go:334] "Generic (PLEG): container finished" podID="785e75c0-0b35-454f-934d-cec21d5f2749" containerID="41a9164621a0c7dc2e9abe48ba6d72650fefadf1b5fe40772a5232b0bb695337" exitCode=137 Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.223898 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerDied","Data":"41a9164621a0c7dc2e9abe48ba6d72650fefadf1b5fe40772a5232b0bb695337"} Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.394061 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516077 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516165 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516230 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516277 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rzvw\" (UniqueName: \"kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516357 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516381 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.516411 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data\") pod \"785e75c0-0b35-454f-934d-cec21d5f2749\" (UID: \"785e75c0-0b35-454f-934d-cec21d5f2749\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.517803 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs" (OuterVolumeSpecName: "logs") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.523257 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.540438 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data" (OuterVolumeSpecName: "config-data") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.541398 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw" (OuterVolumeSpecName: "kube-api-access-2rzvw") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "kube-api-access-2rzvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.546640 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts" (OuterVolumeSpecName: "scripts") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.548947 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.578249 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "785e75c0-0b35-454f-934d-cec21d5f2749" (UID: "785e75c0-0b35-454f-934d-cec21d5f2749"). 
InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618531 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618786 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618799 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618809 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/785e75c0-0b35-454f-934d-cec21d5f2749-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618820 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785e75c0-0b35-454f-934d-cec21d5f2749-logs\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618828 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785e75c0-0b35-454f-934d-cec21d5f2749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.618840 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rzvw\" (UniqueName: \"kubernetes.io/projected/785e75c0-0b35-454f-934d-cec21d5f2749-kube-api-access-2rzvw\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.713535 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.825796 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.825848 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.825888 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.825910 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.825992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.826029 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb4qx\" (UniqueName: \"kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.826085 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data\") pod \"78b3b980-8af5-466d-a4b8-a9dec837c10e\" (UID: \"78b3b980-8af5-466d-a4b8-a9dec837c10e\") " Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.827182 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs" (OuterVolumeSpecName: "logs") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.830538 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.830618 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx" (OuterVolumeSpecName: "kube-api-access-wb4qx") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "kube-api-access-wb4qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.842768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts" (OuterVolumeSpecName: "scripts") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.843112 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data" (OuterVolumeSpecName: "config-data") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.845558 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.860840 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "78b3b980-8af5-466d-a4b8-a9dec837c10e" (UID: "78b3b980-8af5-466d-a4b8-a9dec837c10e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928188 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928251 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb4qx\" (UniqueName: \"kubernetes.io/projected/78b3b980-8af5-466d-a4b8-a9dec837c10e-kube-api-access-wb4qx\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928273 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78b3b980-8af5-466d-a4b8-a9dec837c10e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928296 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928316 5030 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928333 5030 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78b3b980-8af5-466d-a4b8-a9dec837c10e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:43 crc kubenswrapper[5030]: I0121 01:00:43.928351 5030 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b3b980-8af5-466d-a4b8-a9dec837c10e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.233142 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" event={"ID":"785e75c0-0b35-454f-934d-cec21d5f2749","Type":"ContainerDied","Data":"71d6a446ce81f66571d9875b1e1aae26637330a3a3032b3feeea332e5f06cb1e"} Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.233196 5030 scope.go:117] "RemoveContainer" containerID="82db556c4f1f8a9be908349876a13003ec9e43684d8e05cc93ce170b78c0b121" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.233255 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/horizon-754c467c7d-s8jx7" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.235480 5030 generic.go:334] "Generic (PLEG): container finished" podID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerID="fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd" exitCode=137 Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.235535 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.235528 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerDied","Data":"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd"} Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.235590 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/horizon-8cd586586-mphzt" event={"ID":"78b3b980-8af5-466d-a4b8-a9dec837c10e","Type":"ContainerDied","Data":"8ef78e129111ecdd6fb034c559e5d786486ad524f0a14a94fde03fc4a22fd0b7"} Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.271476 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.280368 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-8cd586586-mphzt"] Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.314691 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.330341 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/horizon-754c467c7d-s8jx7"] Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.494227 5030 scope.go:117] "RemoveContainer" containerID="41a9164621a0c7dc2e9abe48ba6d72650fefadf1b5fe40772a5232b0bb695337" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.508744 5030 scope.go:117] "RemoveContainer" containerID="8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.667859 5030 scope.go:117] "RemoveContainer" containerID="fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.690285 5030 scope.go:117] "RemoveContainer" containerID="8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2" Jan 21 01:00:44 crc kubenswrapper[5030]: E0121 01:00:44.691138 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2\": container with ID starting with 8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2 not found: ID does not exist" containerID="8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.691193 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2"} err="failed to get container status \"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2\": rpc error: code = NotFound desc = could not find container \"8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2\": container with ID starting with 8f7b52746d1a59a2aa6a54061ee44abf42eb20e532056c7a47ae09c1722af1e2 not found: ID does not exist" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.691227 5030 scope.go:117] "RemoveContainer" containerID="fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd" Jan 21 01:00:44 crc kubenswrapper[5030]: E0121 01:00:44.691778 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd\": container with ID starting with fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd not found: ID does not exist" containerID="fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd" Jan 21 01:00:44 crc kubenswrapper[5030]: I0121 01:00:44.691818 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd"} err="failed to get container status \"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd\": rpc error: code = NotFound desc = could not find container \"fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd\": container with ID starting with fac6507d03beb09311cd2197a0f9365cf0a9eb0bea8adcfd1cb6201c4198c5cd not found: ID does not exist" Jan 21 01:00:45 crc kubenswrapper[5030]: I0121 01:00:45.976306 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" path="/var/lib/kubelet/pods/785e75c0-0b35-454f-934d-cec21d5f2749/volumes" Jan 21 01:00:45 crc kubenswrapper[5030]: I0121 01:00:45.977121 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" path="/var/lib/kubelet/pods/78b3b980-8af5-466d-a4b8-a9dec837c10e/volumes" Jan 21 01:00:48 crc kubenswrapper[5030]: I0121 01:00:48.961763 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:00:48 crc kubenswrapper[5030]: E0121 01:00:48.962398 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.276216 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-wjlc6"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.286433 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-bootstrap-wjlc6"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.299739 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-z29j4"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.312697 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-sync-z29j4"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.323010 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.323209 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" podUID="966dfb91-704f-4ed2-b419-b735a6f876a7" containerName="keystone-api" containerID="cri-o://57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a" gracePeriod=30 Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.348816 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:51 crc kubenswrapper[5030]: E0121 01:00:51.349189 5030 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349212 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: E0121 01:00:51.349230 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349239 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: E0121 01:00:51.349256 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" containerName="collect-profiles" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349265 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" containerName="collect-profiles" Jan 21 01:00:51 crc kubenswrapper[5030]: E0121 01:00:51.349285 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349294 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: E0121 01:00:51.349305 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349312 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349478 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" containerName="collect-profiles" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349496 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349506 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349516 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="785e75c0-0b35-454f-934d-cec21d5f2749" containerName="horizon-log" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.349533 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b3b980-8af5-466d-a4b8-a9dec837c10e" containerName="horizon" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.350162 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.356872 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.482017 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhhn\" (UniqueName: \"kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.482079 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.583417 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhhn\" (UniqueName: \"kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.583493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.584261 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.612357 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhhn\" (UniqueName: \"kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn\") pod \"keystoned3cf-account-delete-n6qbc\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:51 crc kubenswrapper[5030]: I0121 01:00:51.670355 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.001172 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15170089-9ec5-40d2-919a-cb56efb2154f" path="/var/lib/kubelet/pods/15170089-9ec5-40d2-919a-cb56efb2154f/volumes" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.007459 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9794b8b-70dc-4472-a009-da61c1920ca3" path="/var/lib/kubelet/pods/e9794b8b-70dc-4472-a009-da61c1920ca3/volumes" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.019843 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-p57x5"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.029574 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-p57x5"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.067585 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["horizon-kuttl-tests/root-account-create-update-gmltn"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.069511 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.075691 5030 reflector.go:368] Caches populated for *v1.Secret from object-"horizon-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.083033 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-gmltn"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.098410 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.105690 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.117158 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.129067 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-gmltn"] Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.129570 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-k5tjn operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="horizon-kuttl-tests/root-account-create-update-gmltn" podUID="3773b922-c87a-4a71-95d5-385611972a96" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.139875 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.201913 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.201954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tjn\" (UniqueName: 
\"kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.254918 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-2" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="galera" containerID="cri-o://51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073" gracePeriod=30 Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.303041 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.303105 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tjn\" (UniqueName: \"kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.303184 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.303264 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts podName:3773b922-c87a-4a71-95d5-385611972a96 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:52.803245547 +0000 UTC m=+8725.123505835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts") pod "root-account-create-update-gmltn" (UID: "3773b922-c87a-4a71-95d5-385611972a96") : configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.307682 5030 projected.go:194] Error preparing data for projected volume kube-api-access-k5tjn for pod horizon-kuttl-tests/root-account-create-update-gmltn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.307743 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn podName:3773b922-c87a-4a71-95d5-385611972a96 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:52.807730034 +0000 UTC m=+8725.127990322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k5tjn" (UniqueName: "kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn") pod "root-account-create-update-gmltn" (UID: "3773b922-c87a-4a71-95d5-385611972a96") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.309354 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.309365 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" event={"ID":"d7b5fe58-53a7-4726-8286-24f697bf4295","Type":"ContainerStarted","Data":"31f7e43773b12e26b6a64fcca28da9df987a9d79e462b8048f16d9ec95383423"} Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.309394 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" event={"ID":"d7b5fe58-53a7-4726-8286-24f697bf4295","Type":"ContainerStarted","Data":"3434a979439a924c3fee7adcf9831a76e276057c8a3251497d40477d1cc55a37"} Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.310676 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" secret="" err="secret \"galera-openstack-dockercfg-j42pw\" not found" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.318358 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.338918 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" podStartSLOduration=1.3388919989999999 podStartE2EDuration="1.338891999s" podCreationTimestamp="2026-01-21 01:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 01:00:52.333111811 +0000 UTC m=+8724.653372099" watchObservedRunningTime="2026-01-21 01:00:52.338891999 +0000 UTC m=+8724.659152287" Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.404974 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.405062 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts podName:d7b5fe58-53a7-4726-8286-24f697bf4295 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:52.905040002 +0000 UTC m=+8725.225300300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts") pod "keystoned3cf-account-delete-n6qbc" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295") : configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.656140 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.656375 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/memcached-0" podUID="e7c9e208-00ec-4363-b4e8-91e11eb22555" containerName="memcached" containerID="cri-o://7b55930a5340962e8be1a11e7fbc1beecd596da8a8badc35c7d5f04aefcf0ae2" gracePeriod=30 Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.811893 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: I0121 01:00:52.811952 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tjn\" (UniqueName: \"kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn\") pod \"root-account-create-update-gmltn\" (UID: \"3773b922-c87a-4a71-95d5-385611972a96\") " pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.812031 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.812132 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts podName:3773b922-c87a-4a71-95d5-385611972a96 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:53.812111699 +0000 UTC m=+8726.132371987 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts") pod "root-account-create-update-gmltn" (UID: "3773b922-c87a-4a71-95d5-385611972a96") : configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.816043 5030 projected.go:194] Error preparing data for projected volume kube-api-access-k5tjn for pod horizon-kuttl-tests/root-account-create-update-gmltn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.816120 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn podName:3773b922-c87a-4a71-95d5-385611972a96 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:53.816104325 +0000 UTC m=+8726.136364703 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k5tjn" (UniqueName: "kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn") pod "root-account-create-update-gmltn" (UID: "3773b922-c87a-4a71-95d5-385611972a96") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.913984 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:52 crc kubenswrapper[5030]: E0121 01:00:52.914060 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts podName:d7b5fe58-53a7-4726-8286-24f697bf4295 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:53.914044328 +0000 UTC m=+8726.234304626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts") pod "keystoned3cf-account-delete-n6qbc" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295") : configmap "openstack-scripts" not found Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.056173 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218404 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218425 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218496 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x882\" (UniqueName: \"kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218636 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config\") pod \"69e29420-6e97-43b9-8a13-78ee659137a3\" (UID: \"69e29420-6e97-43b9-8a13-78ee659137a3\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.218897 
5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219059 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219194 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219348 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219472 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219490 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219500 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69e29420-6e97-43b9-8a13-78ee659137a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.219510 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69e29420-6e97-43b9-8a13-78ee659137a3-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.224345 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882" (OuterVolumeSpecName: "kube-api-access-7x882") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "kube-api-access-7x882". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.229500 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "mysql-db") pod "69e29420-6e97-43b9-8a13-78ee659137a3" (UID: "69e29420-6e97-43b9-8a13-78ee659137a3"). InnerVolumeSpecName "local-storage15-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.263633 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.317105 5030 generic.go:334] "Generic (PLEG): container finished" podID="e7c9e208-00ec-4363-b4e8-91e11eb22555" containerID="7b55930a5340962e8be1a11e7fbc1beecd596da8a8badc35c7d5f04aefcf0ae2" exitCode=0 Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.317199 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"e7c9e208-00ec-4363-b4e8-91e11eb22555","Type":"ContainerDied","Data":"7b55930a5340962e8be1a11e7fbc1beecd596da8a8badc35c7d5f04aefcf0ae2"} Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.318474 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerID="31f7e43773b12e26b6a64fcca28da9df987a9d79e462b8048f16d9ec95383423" exitCode=1 Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.318531 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" event={"ID":"d7b5fe58-53a7-4726-8286-24f697bf4295","Type":"ContainerDied","Data":"31f7e43773b12e26b6a64fcca28da9df987a9d79e462b8048f16d9ec95383423"} Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.318988 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" secret="" err="secret \"galera-openstack-dockercfg-j42pw\" not found" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.319018 5030 scope.go:117] "RemoveContainer" containerID="31f7e43773b12e26b6a64fcca28da9df987a9d79e462b8048f16d9ec95383423" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.321978 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x882\" (UniqueName: \"kubernetes.io/projected/69e29420-6e97-43b9-8a13-78ee659137a3-kube-api-access-7x882\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.322038 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324099 5030 generic.go:334] "Generic (PLEG): container finished" podID="69e29420-6e97-43b9-8a13-78ee659137a3" containerID="51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073" exitCode=0 Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324175 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/root-account-create-update-gmltn" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324184 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerDied","Data":"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073"} Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324227 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-2" event={"ID":"69e29420-6e97-43b9-8a13-78ee659137a3","Type":"ContainerDied","Data":"2af0eef2c894852b53610ad3fc7bc00566546a5ef4b28578bfd4902329b14fd2"} Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324249 5030 scope.go:117] "RemoveContainer" containerID="51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.324393 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-2" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.344280 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.402987 5030 scope.go:117] "RemoveContainer" containerID="ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.417898 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.424597 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.430862 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-2"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.440881 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-gmltn"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.442744 5030 scope.go:117] "RemoveContainer" containerID="51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073" Jan 21 01:00:53 crc kubenswrapper[5030]: E0121 01:00:53.443066 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073\": container with ID starting with 51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073 not found: ID does not exist" containerID="51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.443099 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073"} err="failed to get container status \"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073\": rpc error: code = NotFound desc = could not find container \"51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073\": container with ID starting with 51b12193306751d5e555d7c9e3fd34b74dd7fed0b2f6d2e6d883b6c0073e0073 not found: ID does not exist" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.443119 5030 scope.go:117] 
"RemoveContainer" containerID="ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4" Jan 21 01:00:53 crc kubenswrapper[5030]: E0121 01:00:53.443471 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4\": container with ID starting with ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4 not found: ID does not exist" containerID="ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.443560 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4"} err="failed to get container status \"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4\": rpc error: code = NotFound desc = could not find container \"ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4\": container with ID starting with ef6d71aade146bf53cd5abeee0fc3eeda4a2bdb0dc8e70df7289bcb6182508c4 not found: ID does not exist" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.446013 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/root-account-create-update-gmltn"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.538335 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.626075 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.628936 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data\") pod \"e7c9e208-00ec-4363-b4e8-91e11eb22555\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.628992 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config\") pod \"e7c9e208-00ec-4363-b4e8-91e11eb22555\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.629024 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8lz\" (UniqueName: \"kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz\") pod \"e7c9e208-00ec-4363-b4e8-91e11eb22555\" (UID: \"e7c9e208-00ec-4363-b4e8-91e11eb22555\") " Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.629316 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3773b922-c87a-4a71-95d5-385611972a96-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.629333 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tjn\" (UniqueName: \"kubernetes.io/projected/3773b922-c87a-4a71-95d5-385611972a96-kube-api-access-k5tjn\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.630284 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data" (OuterVolumeSpecName: "config-data") pod 
"e7c9e208-00ec-4363-b4e8-91e11eb22555" (UID: "e7c9e208-00ec-4363-b4e8-91e11eb22555"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.630604 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e7c9e208-00ec-4363-b4e8-91e11eb22555" (UID: "e7c9e208-00ec-4363-b4e8-91e11eb22555"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.636820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz" (OuterVolumeSpecName: "kube-api-access-hv8lz") pod "e7c9e208-00ec-4363-b4e8-91e11eb22555" (UID: "e7c9e208-00ec-4363-b4e8-91e11eb22555"). InnerVolumeSpecName "kube-api-access-hv8lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.669386 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/rabbitmq-server-0" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="rabbitmq" containerID="cri-o://9cadf200e8321abc58854abfd8226569f7dd3a279740b95b9d6a4487243f4eec" gracePeriod=604800 Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.730510 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.730660 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7c9e208-00ec-4363-b4e8-91e11eb22555-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.730686 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv8lz\" (UniqueName: \"kubernetes.io/projected/e7c9e208-00ec-4363-b4e8-91e11eb22555-kube-api-access-hv8lz\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:53 crc kubenswrapper[5030]: E0121 01:00:53.934711 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:53 crc kubenswrapper[5030]: E0121 01:00:53.934784 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts podName:d7b5fe58-53a7-4726-8286-24f697bf4295 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:55.934768523 +0000 UTC m=+8728.255028811 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts") pod "keystoned3cf-account-delete-n6qbc" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295") : configmap "openstack-scripts" not found Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.970785 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3773b922-c87a-4a71-95d5-385611972a96" path="/var/lib/kubelet/pods/3773b922-c87a-4a71-95d5-385611972a96/volumes" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.971213 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" path="/var/lib/kubelet/pods/69e29420-6e97-43b9-8a13-78ee659137a3/volumes" Jan 21 01:00:53 crc kubenswrapper[5030]: I0121 01:00:53.971840 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e9a747-3b2d-450e-a1a3-aa49bce30b7c" path="/var/lib/kubelet/pods/f1e9a747-3b2d-450e-a1a3-aa49bce30b7c/volumes" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.304135 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-1" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="galera" containerID="cri-o://b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" gracePeriod=28 Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.333843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/memcached-0" event={"ID":"e7c9e208-00ec-4363-b4e8-91e11eb22555","Type":"ContainerDied","Data":"a2b742ab0b4245ecf00b1c3eee204061c02aadd7b26e9fd6e57aa785839d4416"} Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.333875 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/memcached-0" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.333903 5030 scope.go:117] "RemoveContainer" containerID="7b55930a5340962e8be1a11e7fbc1beecd596da8a8badc35c7d5f04aefcf0ae2" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.344309 5030 generic.go:334] "Generic (PLEG): container finished" podID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerID="1473694a0e15eae816ea5d37f246d11083699e89d5197e69e400038a4f7a11c6" exitCode=1 Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.344437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" event={"ID":"d7b5fe58-53a7-4726-8286-24f697bf4295","Type":"ContainerDied","Data":"1473694a0e15eae816ea5d37f246d11083699e89d5197e69e400038a4f7a11c6"} Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.344986 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" secret="" err="secret \"galera-openstack-dockercfg-j42pw\" not found" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.345020 5030 scope.go:117] "RemoveContainer" containerID="1473694a0e15eae816ea5d37f246d11083699e89d5197e69e400038a4f7a11c6" Jan 21 01:00:54 crc kubenswrapper[5030]: E0121 01:00:54.345260 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoned3cf-account-delete-n6qbc_horizon-kuttl-tests(d7b5fe58-53a7-4726-8286-24f697bf4295)\"" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.360162 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.367783 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/memcached-0"] Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.371898 5030 scope.go:117] "RemoveContainer" containerID="31f7e43773b12e26b6a64fcca28da9df987a9d79e462b8048f16d9ec95383423" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.798176 5030 prober.go:107] "Probe failed" probeType="Readiness" pod="horizon-kuttl-tests/rabbitmq-server-0" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.196:5672: connect: connection refused" Jan 21 01:00:54 crc kubenswrapper[5030]: E0121 01:00:54.852857 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 01:00:54 crc kubenswrapper[5030]: E0121 01:00:54.857882 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 01:00:54 crc kubenswrapper[5030]: E0121 01:00:54.860338 5030 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 21 01:00:54 crc kubenswrapper[5030]: E0121 01:00:54.860385 5030 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="horizon-kuttl-tests/openstack-galera-1" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="galera" Jan 21 01:00:54 crc kubenswrapper[5030]: I0121 01:00:54.873645 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.052437 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts\") pod \"966dfb91-704f-4ed2-b419-b735a6f876a7\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.052500 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data\") pod \"966dfb91-704f-4ed2-b419-b735a6f876a7\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.052574 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys\") pod \"966dfb91-704f-4ed2-b419-b735a6f876a7\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.053753 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn5lj\" (UniqueName: \"kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj\") pod \"966dfb91-704f-4ed2-b419-b735a6f876a7\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.053904 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys\") pod \"966dfb91-704f-4ed2-b419-b735a6f876a7\" (UID: \"966dfb91-704f-4ed2-b419-b735a6f876a7\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.064904 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "966dfb91-704f-4ed2-b419-b735a6f876a7" (UID: "966dfb91-704f-4ed2-b419-b735a6f876a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.067870 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj" (OuterVolumeSpecName: "kube-api-access-wn5lj") pod "966dfb91-704f-4ed2-b419-b735a6f876a7" (UID: "966dfb91-704f-4ed2-b419-b735a6f876a7"). InnerVolumeSpecName "kube-api-access-wn5lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.075879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts" (OuterVolumeSpecName: "scripts") pod "966dfb91-704f-4ed2-b419-b735a6f876a7" (UID: "966dfb91-704f-4ed2-b419-b735a6f876a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.083788 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "966dfb91-704f-4ed2-b419-b735a6f876a7" (UID: "966dfb91-704f-4ed2-b419-b735a6f876a7"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.122809 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data" (OuterVolumeSpecName: "config-data") pod "966dfb91-704f-4ed2-b419-b735a6f876a7" (UID: "966dfb91-704f-4ed2-b419-b735a6f876a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.157423 5030 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.157460 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn5lj\" (UniqueName: \"kubernetes.io/projected/966dfb91-704f-4ed2-b419-b735a6f876a7-kube-api-access-wn5lj\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.157471 5030 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.157482 5030 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.157490 5030 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966dfb91-704f-4ed2-b419-b735a6f876a7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.354188 5030 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" secret="" err="secret \"galera-openstack-dockercfg-j42pw\" not found" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.354641 5030 scope.go:117] "RemoveContainer" containerID="1473694a0e15eae816ea5d37f246d11083699e89d5197e69e400038a4f7a11c6" Jan 21 01:00:55 crc kubenswrapper[5030]: E0121 01:00:55.354965 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystoned3cf-account-delete-n6qbc_horizon-kuttl-tests(d7b5fe58-53a7-4726-8286-24f697bf4295)\"" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.356412 5030 generic.go:334] "Generic (PLEG): container finished" podID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerID="9cadf200e8321abc58854abfd8226569f7dd3a279740b95b9d6a4487243f4eec" exitCode=0 Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.356472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerDied","Data":"9cadf200e8321abc58854abfd8226569f7dd3a279740b95b9d6a4487243f4eec"} Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.356496 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/rabbitmq-server-0" event={"ID":"7a69a2c7-1b3d-40ed-b50c-0bf65178e971","Type":"ContainerDied","Data":"09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5"} Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.356507 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09f6f0fb6f03fe0db252f9c53bd79af5f4fb76ad4698ca4a8ea33a5e1e0ea0f5" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.358214 5030 generic.go:334] "Generic (PLEG): container finished" podID="966dfb91-704f-4ed2-b419-b735a6f876a7" containerID="57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a" exitCode=0 Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.358249 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" event={"ID":"966dfb91-704f-4ed2-b419-b735a6f876a7","Type":"ContainerDied","Data":"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a"} Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.358266 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" event={"ID":"966dfb91-704f-4ed2-b419-b735a6f876a7","Type":"ContainerDied","Data":"560c36d21bf89f858b981bddceacf8a136907fb832af1a7989661f449451ee45"} Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.358282 5030 scope.go:117] "RemoveContainer" containerID="57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.358355 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystone-cb575f7bd-6ml4v" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.363274 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.390414 5030 scope.go:117] "RemoveContainer" containerID="57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a" Jan 21 01:00:55 crc kubenswrapper[5030]: E0121 01:00:55.390984 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a\": container with ID starting with 57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a not found: ID does not exist" containerID="57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.391018 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a"} err="failed to get container status \"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a\": rpc error: code = NotFound desc = could not find container \"57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a\": container with ID starting with 57065a24729abf4957b4a0944ad2f07503fd28ed164189f8f1ba47b15d05c54a not found: ID does not exist" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.420912 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.432706 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-cb575f7bd-6ml4v"] Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464110 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464155 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464190 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464237 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464297 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464322 5030 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464574 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464939 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464975 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5zb\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb\") pod \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\" (UID: \"7a69a2c7-1b3d-40ed-b50c-0bf65178e971\") " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.464993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.465406 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.465483 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.465496 5030 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.468801 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info" (OuterVolumeSpecName: "pod-info") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.474613 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb" (OuterVolumeSpecName: "kube-api-access-vh5zb") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "kube-api-access-vh5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.474768 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.480997 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f" (OuterVolumeSpecName: "persistence") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.536250 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7a69a2c7-1b3d-40ed-b50c-0bf65178e971" (UID: "7a69a2c7-1b3d-40ed-b50c-0bf65178e971"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.567186 5030 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.567216 5030 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.567226 5030 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.567255 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") on node \"crc\" " Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.567272 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5zb\" (UniqueName: \"kubernetes.io/projected/7a69a2c7-1b3d-40ed-b50c-0bf65178e971-kube-api-access-vh5zb\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.584302 5030 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.584470 5030 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f") on node "crc" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.668213 5030 reconciler_common.go:293] "Volume detached for volume \"pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8c3ddc-8151-425a-8584-70aef4f0023f\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.971134 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966dfb91-704f-4ed2-b419-b735a6f876a7" path="/var/lib/kubelet/pods/966dfb91-704f-4ed2-b419-b735a6f876a7/volumes" Jan 21 01:00:55 crc kubenswrapper[5030]: I0121 01:00:55.972170 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c9e208-00ec-4363-b4e8-91e11eb22555" path="/var/lib/kubelet/pods/e7c9e208-00ec-4363-b4e8-91e11eb22555/volumes" Jan 21 01:00:55 crc kubenswrapper[5030]: E0121 01:00:55.974472 5030 configmap.go:193] Couldn't get configMap horizon-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 21 01:00:55 crc kubenswrapper[5030]: E0121 01:00:55.974539 5030 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts podName:d7b5fe58-53a7-4726-8286-24f697bf4295 nodeName:}" failed. No retries permitted until 2026-01-21 01:00:59.974526365 +0000 UTC m=+8732.294786653 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts") pod "keystoned3cf-account-delete-n6qbc" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295") : configmap "openstack-scripts" not found Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.215686 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.338861 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="horizon-kuttl-tests/openstack-galera-0" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="galera" containerID="cri-o://4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7" gracePeriod=26 Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.358270 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-ltfzw"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.363663 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-db-create-ltfzw"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.371401 5030 generic.go:334] "Generic (PLEG): container finished" podID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" exitCode=0 Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.371454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerDied","Data":"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529"} Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.371480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-1" event={"ID":"765d3219-1f0d-4bda-8f7f-716ee4cb51ae","Type":"ContainerDied","Data":"77f2e866895ee11a00785af56c0f92cffa95772ddd75419689b6c3bf65dfe8bc"} Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.371498 5030 scope.go:117] "RemoveContainer" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.371569 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-1" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.373352 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/rabbitmq-server-0" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.378095 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwwts\" (UniqueName: \"kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380567 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380591 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380646 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.380812 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated\") pod \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\" (UID: \"765d3219-1f0d-4bda-8f7f-716ee4cb51ae\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.381358 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.381374 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.381378 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.384437 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.386881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts" (OuterVolumeSpecName: "kube-api-access-zwwts") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "kube-api-access-zwwts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.390370 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystone-d3cf-account-create-update-h76k7"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.399847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.407782 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.413531 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "mysql-db") pod "765d3219-1f0d-4bda-8f7f-716ee4cb51ae" (UID: "765d3219-1f0d-4bda-8f7f-716ee4cb51ae"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.414876 5030 scope.go:117] "RemoveContainer" containerID="dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.415989 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/rabbitmq-server-0"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.438732 5030 scope.go:117] "RemoveContainer" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" Jan 21 01:00:56 crc kubenswrapper[5030]: E0121 01:00:56.439532 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529\": container with ID starting with b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529 not found: ID does not exist" containerID="b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.439580 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529"} err="failed to get container status \"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529\": rpc error: code = NotFound desc = could not find container \"b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529\": container with ID starting with b7b65b71d7ebc4f07dd1a84afb1fc82ba6a6d979e2b8b867b0a996a9aad6e529 not found: ID does not exist" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.439609 5030 scope.go:117] "RemoveContainer" containerID="dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d" Jan 21 01:00:56 crc kubenswrapper[5030]: E0121 01:00:56.439965 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d\": container with ID starting with dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d not found: ID does not exist" containerID="dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.440009 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d"} err="failed to get container status \"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d\": rpc error: code = NotFound desc = could not find container \"dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d\": container with ID starting with dc9d1a054be07d981dd9d1bcb5876e4565863e1a7c546cefa2a6b6f270a3468d not found: ID does not exist" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482229 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482261 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwwts\" (UniqueName: \"kubernetes.io/projected/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kube-api-access-zwwts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482273 5030 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482315 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482336 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.482347 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/765d3219-1f0d-4bda-8f7f-716ee4cb51ae-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.514437 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.583490 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.606405 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.710251 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.715544 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-1"] Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.785286 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhhn\" (UniqueName: \"kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn\") pod \"d7b5fe58-53a7-4726-8286-24f697bf4295\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.785361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts\") pod \"d7b5fe58-53a7-4726-8286-24f697bf4295\" (UID: \"d7b5fe58-53a7-4726-8286-24f697bf4295\") " Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.786014 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7b5fe58-53a7-4726-8286-24f697bf4295" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.789404 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn" (OuterVolumeSpecName: "kube-api-access-dwhhn") pod "d7b5fe58-53a7-4726-8286-24f697bf4295" (UID: "d7b5fe58-53a7-4726-8286-24f697bf4295"). InnerVolumeSpecName "kube-api-access-dwhhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.886784 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhhn\" (UniqueName: \"kubernetes.io/projected/d7b5fe58-53a7-4726-8286-24f697bf4295-kube-api-access-dwhhn\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:56 crc kubenswrapper[5030]: I0121 01:00:56.886819 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7b5fe58-53a7-4726-8286-24f697bf4295-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.029022 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191358 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191605 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191674 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2phh7\" (UniqueName: \"kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191708 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191744 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.191807 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated\") pod \"97ea303d-f60d-42ad-a86b-0b371360c2b1\" (UID: \"97ea303d-f60d-42ad-a86b-0b371360c2b1\") " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.192314 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.192385 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.192484 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.192837 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.195167 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7" (OuterVolumeSpecName: "kube-api-access-2phh7") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "kube-api-access-2phh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.201085 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "97ea303d-f60d-42ad-a86b-0b371360c2b1" (UID: "97ea303d-f60d-42ad-a86b-0b371360c2b1"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.293799 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.293865 5030 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.293887 5030 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.293932 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2phh7\" (UniqueName: \"kubernetes.io/projected/97ea303d-f60d-42ad-a86b-0b371360c2b1-kube-api-access-2phh7\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.293996 5030 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.294023 5030 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/97ea303d-f60d-42ad-a86b-0b371360c2b1-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.311043 5030 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.383263 5030 generic.go:334] "Generic (PLEG): container finished" podID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerID="4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7" exitCode=0 Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.383404 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="horizon-kuttl-tests/openstack-galera-0" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.383404 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerDied","Data":"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7"} Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.383599 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/openstack-galera-0" event={"ID":"97ea303d-f60d-42ad-a86b-0b371360c2b1","Type":"ContainerDied","Data":"001e23f4f61d1708295310b3706481d45ffc1a56ad4ec7174fa9003cdd8c1b92"} Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.383650 5030 scope.go:117] "RemoveContainer" containerID="4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.389545 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" event={"ID":"d7b5fe58-53a7-4726-8286-24f697bf4295","Type":"ContainerDied","Data":"3434a979439a924c3fee7adcf9831a76e276057c8a3251497d40477d1cc55a37"} Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.389712 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.395115 5030 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.414079 5030 scope.go:117] "RemoveContainer" containerID="ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.419087 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.432006 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/openstack-galera-0"] Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.441563 5030 scope.go:117] "RemoveContainer" containerID="4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7" Jan 21 01:00:57 crc kubenswrapper[5030]: E0121 01:00:57.441989 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7\": container with ID starting with 4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7 not found: ID does not exist" containerID="4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.442028 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7"} err="failed to get container status \"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7\": rpc error: code = NotFound desc = could not find container \"4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7\": container with ID starting with 4922eaeb36c3932ef49092ccba1cbc0da53558a79e4c3a7dd993f0d1fb3bccd7 not found: ID does not exist" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.442053 5030 scope.go:117] "RemoveContainer" 
containerID="ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148" Jan 21 01:00:57 crc kubenswrapper[5030]: E0121 01:00:57.442362 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148\": container with ID starting with ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148 not found: ID does not exist" containerID="ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.442446 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148"} err="failed to get container status \"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148\": rpc error: code = NotFound desc = could not find container \"ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148\": container with ID starting with ce3afd85a7f097d9424a8c117e098c0bd738ff1e79884c717fe5cad083874148 not found: ID does not exist" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.442478 5030 scope.go:117] "RemoveContainer" containerID="1473694a0e15eae816ea5d37f246d11083699e89d5197e69e400038a4f7a11c6" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.442588 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.448794 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["horizon-kuttl-tests/keystoned3cf-account-delete-n6qbc"] Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.970792 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30675ef2-c4d2-4fd1-8f10-ceef070130ef" path="/var/lib/kubelet/pods/30675ef2-c4d2-4fd1-8f10-ceef070130ef/volumes" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.971383 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" path="/var/lib/kubelet/pods/765d3219-1f0d-4bda-8f7f-716ee4cb51ae/volumes" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.972071 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" path="/var/lib/kubelet/pods/7a69a2c7-1b3d-40ed-b50c-0bf65178e971/volumes" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.973225 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" path="/var/lib/kubelet/pods/97ea303d-f60d-42ad-a86b-0b371360c2b1/volumes" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.973782 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" path="/var/lib/kubelet/pods/d7b5fe58-53a7-4726-8286-24f697bf4295/volumes" Jan 21 01:00:57 crc kubenswrapper[5030]: I0121 01:00:57.974718 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc41df9c-0cdf-454e-a20a-d287a65bf2c0" path="/var/lib/kubelet/pods/fc41df9c-0cdf-454e-a20a-d287a65bf2c0/volumes" Jan 21 01:00:59 crc kubenswrapper[5030]: I0121 01:00:59.832378 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 01:00:59 crc kubenswrapper[5030]: I0121 01:00:59.832913 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" podUID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" containerName="manager" containerID="cri-o://ed4a54773e6f1a0647bf06335dfe35af9345807473e5ccf21078854102126a05" gracePeriod=10 Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.120571 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.120834 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-zd2q6" podUID="72237d07-9875-484f-9e9b-f282a65d0136" containerName="registry-server" containerID="cri-o://cfd45482f337d7f5159f114a3ff93b7177c4ab855c5d661cfc8a1ef57f24d710" gracePeriod=30 Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.163902 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f"] Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.169571 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/653b3b52ace42a43c00bbe11ec09bdc12d146890fa0ae4874ac941672astp5f"] Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.422878 5030 generic.go:334] "Generic (PLEG): container finished" podID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" containerID="ed4a54773e6f1a0647bf06335dfe35af9345807473e5ccf21078854102126a05" exitCode=0 Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.422939 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" event={"ID":"c0c199ff-c9e9-4b75-9639-003ffc8845b8","Type":"ContainerDied","Data":"ed4a54773e6f1a0647bf06335dfe35af9345807473e5ccf21078854102126a05"} Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.424162 5030 generic.go:334] "Generic (PLEG): container finished" podID="72237d07-9875-484f-9e9b-f282a65d0136" containerID="cfd45482f337d7f5159f114a3ff93b7177c4ab855c5d661cfc8a1ef57f24d710" exitCode=0 Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.424183 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-zd2q6" event={"ID":"72237d07-9875-484f-9e9b-f282a65d0136","Type":"ContainerDied","Data":"cfd45482f337d7f5159f114a3ff93b7177c4ab855c5d661cfc8a1ef57f24d710"} Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.571150 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.693917 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.749100 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfqgc\" (UniqueName: \"kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc\") pod \"72237d07-9875-484f-9e9b-f282a65d0136\" (UID: \"72237d07-9875-484f-9e9b-f282a65d0136\") " Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.754078 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc" (OuterVolumeSpecName: "kube-api-access-bfqgc") pod "72237d07-9875-484f-9e9b-f282a65d0136" (UID: "72237d07-9875-484f-9e9b-f282a65d0136"). 
InnerVolumeSpecName "kube-api-access-bfqgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.850254 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert\") pod \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.850366 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95p7\" (UniqueName: \"kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7\") pod \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.850441 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert\") pod \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\" (UID: \"c0c199ff-c9e9-4b75-9639-003ffc8845b8\") " Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.850762 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfqgc\" (UniqueName: \"kubernetes.io/projected/72237d07-9875-484f-9e9b-f282a65d0136-kube-api-access-bfqgc\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.853048 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "c0c199ff-c9e9-4b75-9639-003ffc8845b8" (UID: "c0c199ff-c9e9-4b75-9639-003ffc8845b8"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.854321 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7" (OuterVolumeSpecName: "kube-api-access-w95p7") pod "c0c199ff-c9e9-4b75-9639-003ffc8845b8" (UID: "c0c199ff-c9e9-4b75-9639-003ffc8845b8"). InnerVolumeSpecName "kube-api-access-w95p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.854325 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "c0c199ff-c9e9-4b75-9639-003ffc8845b8" (UID: "c0c199ff-c9e9-4b75-9639-003ffc8845b8"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.952115 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.952156 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95p7\" (UniqueName: \"kubernetes.io/projected/c0c199ff-c9e9-4b75-9639-003ffc8845b8-kube-api-access-w95p7\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.952171 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0c199ff-c9e9-4b75-9639-003ffc8845b8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:00 crc kubenswrapper[5030]: I0121 01:01:00.962781 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:01:00 crc kubenswrapper[5030]: E0121 01:01:00.963142 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.433146 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" event={"ID":"c0c199ff-c9e9-4b75-9639-003ffc8845b8","Type":"ContainerDied","Data":"b2b2ed4b23c5272763c95ea3d0bbd7d44c72e84473d9f8e6d4df2a2bbff9b152"} Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.433212 5030 scope.go:117] "RemoveContainer" containerID="ed4a54773e6f1a0647bf06335dfe35af9345807473e5ccf21078854102126a05" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.433226 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.436114 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-zd2q6" event={"ID":"72237d07-9875-484f-9e9b-f282a65d0136","Type":"ContainerDied","Data":"47d1138d4ce9f0d0703e0715386b0a5bdefe1f828e862450777ebfa8b333a74e"} Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.436168 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-zd2q6" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.473556 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.476176 5030 scope.go:117] "RemoveContainer" containerID="cfd45482f337d7f5159f114a3ff93b7177c4ab855c5d661cfc8a1ef57f24d710" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.480485 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-zd2q6"] Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.495129 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.501754 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6b8c99cf5f-9xvqk"] Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.969184 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2070258a-189d-4670-91c2-5717863726ae" path="/var/lib/kubelet/pods/2070258a-189d-4670-91c2-5717863726ae/volumes" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.970186 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72237d07-9875-484f-9e9b-f282a65d0136" path="/var/lib/kubelet/pods/72237d07-9875-484f-9e9b-f282a65d0136/volumes" Jan 21 01:01:01 crc kubenswrapper[5030]: I0121 01:01:01.970618 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" path="/var/lib/kubelet/pods/c0c199ff-c9e9-4b75-9639-003ffc8845b8/volumes" Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.559745 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.559984 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" podUID="b2756282-c0f3-41b0-bdce-ab04afad52f8" containerName="manager" containerID="cri-o://8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6" gracePeriod=10 Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.842772 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.843253 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-7ptpw" podUID="1228307e-d29d-44a4-b399-7be045068d83" containerName="registry-server" containerID="cri-o://13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec" gracePeriod=30 Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.916360 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c"] Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.926258 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/34e4e076b11e40e2796f19ad3bfdac5929942b93224fbc520400e0a0692b74c"] Jan 21 01:01:03 crc kubenswrapper[5030]: I0121 01:01:03.970697 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c" 
path="/var/lib/kubelet/pods/f07e08e1-2a96-4cb3-86e1-85f3cf6c3e9c/volumes" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.211785 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.289016 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.307396 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert\") pod \"b2756282-c0f3-41b0-bdce-ab04afad52f8\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.307460 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6krs\" (UniqueName: \"kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs\") pod \"b2756282-c0f3-41b0-bdce-ab04afad52f8\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.307559 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert\") pod \"b2756282-c0f3-41b0-bdce-ab04afad52f8\" (UID: \"b2756282-c0f3-41b0-bdce-ab04afad52f8\") " Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.346910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "b2756282-c0f3-41b0-bdce-ab04afad52f8" (UID: "b2756282-c0f3-41b0-bdce-ab04afad52f8"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.346945 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "b2756282-c0f3-41b0-bdce-ab04afad52f8" (UID: "b2756282-c0f3-41b0-bdce-ab04afad52f8"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.347003 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs" (OuterVolumeSpecName: "kube-api-access-k6krs") pod "b2756282-c0f3-41b0-bdce-ab04afad52f8" (UID: "b2756282-c0f3-41b0-bdce-ab04afad52f8"). InnerVolumeSpecName "kube-api-access-k6krs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.409949 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hv6j\" (UniqueName: \"kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j\") pod \"1228307e-d29d-44a4-b399-7be045068d83\" (UID: \"1228307e-d29d-44a4-b399-7be045068d83\") " Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.410215 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.410228 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6krs\" (UniqueName: \"kubernetes.io/projected/b2756282-c0f3-41b0-bdce-ab04afad52f8-kube-api-access-k6krs\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.410238 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2756282-c0f3-41b0-bdce-ab04afad52f8-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.412810 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j" (OuterVolumeSpecName: "kube-api-access-8hv6j") pod "1228307e-d29d-44a4-b399-7be045068d83" (UID: "1228307e-d29d-44a4-b399-7be045068d83"). InnerVolumeSpecName "kube-api-access-8hv6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.460704 5030 generic.go:334] "Generic (PLEG): container finished" podID="b2756282-c0f3-41b0-bdce-ab04afad52f8" containerID="8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6" exitCode=0 Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.460755 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.460777 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" event={"ID":"b2756282-c0f3-41b0-bdce-ab04afad52f8","Type":"ContainerDied","Data":"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6"} Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.460811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv" event={"ID":"b2756282-c0f3-41b0-bdce-ab04afad52f8","Type":"ContainerDied","Data":"0f76c9d42ef35e963c76406e1bba68eecb5a755efbef406295ac5e94e232d5e3"} Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.460828 5030 scope.go:117] "RemoveContainer" containerID="8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.462721 5030 generic.go:334] "Generic (PLEG): container finished" podID="1228307e-d29d-44a4-b399-7be045068d83" containerID="13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec" exitCode=0 Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.462765 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-7ptpw" event={"ID":"1228307e-d29d-44a4-b399-7be045068d83","Type":"ContainerDied","Data":"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec"} Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.462791 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-7ptpw" event={"ID":"1228307e-d29d-44a4-b399-7be045068d83","Type":"ContainerDied","Data":"ef019f23e851ad4ad5e94b9310ecb03ce6f40666441376b15f3e9391cf9bed77"} Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.462770 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-7ptpw" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.495730 5030 scope.go:117] "RemoveContainer" containerID="8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6" Jan 21 01:01:04 crc kubenswrapper[5030]: E0121 01:01:04.496233 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6\": container with ID starting with 8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6 not found: ID does not exist" containerID="8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.496288 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6"} err="failed to get container status \"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6\": rpc error: code = NotFound desc = could not find container \"8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6\": container with ID starting with 8f707196ed547c20be3be18b7836f589ce4588be76d9efdc733cd54d067d5fe6 not found: ID does not exist" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.496321 5030 scope.go:117] "RemoveContainer" containerID="13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.511010 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hv6j\" (UniqueName: \"kubernetes.io/projected/1228307e-d29d-44a4-b399-7be045068d83-kube-api-access-8hv6j\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.512444 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.517827 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-7ptpw"] Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.518863 5030 scope.go:117] "RemoveContainer" containerID="13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec" Jan 21 01:01:04 crc kubenswrapper[5030]: E0121 01:01:04.519284 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec\": container with ID starting with 13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec not found: ID does not exist" containerID="13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.519333 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec"} err="failed to get container status \"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec\": rpc error: code = NotFound desc = could not find container \"13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec\": container with ID starting with 13495d862f4ccc5c2a790ac890d96c66bba4de5fc565d1be0ee291b28dc283ec not found: ID does not exist" Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.523229 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 01:01:04 crc kubenswrapper[5030]: I0121 01:01:04.531966 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-d8c4cd77f-hlgjv"] Jan 21 01:01:05 crc kubenswrapper[5030]: I0121 01:01:05.973268 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1228307e-d29d-44a4-b399-7be045068d83" path="/var/lib/kubelet/pods/1228307e-d29d-44a4-b399-7be045068d83/volumes" Jan 21 01:01:05 crc kubenswrapper[5030]: I0121 01:01:05.974031 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2756282-c0f3-41b0-bdce-ab04afad52f8" path="/var/lib/kubelet/pods/b2756282-c0f3-41b0-bdce-ab04afad52f8/volumes" Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.229310 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.229908 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" podUID="e8269d80-4238-4c5a-a441-9997d386bc34" containerName="operator" containerID="cri-o://4a7a839bb5d430d5fab0f5f43471e084c2117be25a88edb928f43aba64877b11" gracePeriod=10 Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.494592 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.495018 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" podUID="a13b4821-61bf-4351-aa07-6fde3696ada3" containerName="registry-server" containerID="cri-o://0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39" gracePeriod=30 Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.509031 5030 generic.go:334] "Generic (PLEG): container finished" podID="e8269d80-4238-4c5a-a441-9997d386bc34" containerID="4a7a839bb5d430d5fab0f5f43471e084c2117be25a88edb928f43aba64877b11" exitCode=0 Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.509085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" event={"ID":"e8269d80-4238-4c5a-a441-9997d386bc34","Type":"ContainerDied","Data":"4a7a839bb5d430d5fab0f5f43471e084c2117be25a88edb928f43aba64877b11"} Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.526172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz"] Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.535247 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jmjjz"] Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.743088 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.875159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j47s\" (UniqueName: \"kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s\") pod \"e8269d80-4238-4c5a-a441-9997d386bc34\" (UID: \"e8269d80-4238-4c5a-a441-9997d386bc34\") " Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.884993 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s" (OuterVolumeSpecName: "kube-api-access-8j47s") pod "e8269d80-4238-4c5a-a441-9997d386bc34" (UID: "e8269d80-4238-4c5a-a441-9997d386bc34"). InnerVolumeSpecName "kube-api-access-8j47s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.966938 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 01:01:08 crc kubenswrapper[5030]: I0121 01:01:08.976502 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j47s\" (UniqueName: \"kubernetes.io/projected/e8269d80-4238-4c5a-a441-9997d386bc34-kube-api-access-8j47s\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.077664 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8m66\" (UniqueName: \"kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66\") pod \"a13b4821-61bf-4351-aa07-6fde3696ada3\" (UID: \"a13b4821-61bf-4351-aa07-6fde3696ada3\") " Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.080754 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66" (OuterVolumeSpecName: "kube-api-access-g8m66") pod "a13b4821-61bf-4351-aa07-6fde3696ada3" (UID: "a13b4821-61bf-4351-aa07-6fde3696ada3"). InnerVolumeSpecName "kube-api-access-g8m66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.179416 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8m66\" (UniqueName: \"kubernetes.io/projected/a13b4821-61bf-4351-aa07-6fde3696ada3-kube-api-access-g8m66\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.517040 5030 generic.go:334] "Generic (PLEG): container finished" podID="a13b4821-61bf-4351-aa07-6fde3696ada3" containerID="0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39" exitCode=0 Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.517160 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.517171 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" event={"ID":"a13b4821-61bf-4351-aa07-6fde3696ada3","Type":"ContainerDied","Data":"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39"} Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.517207 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-bh77x" event={"ID":"a13b4821-61bf-4351-aa07-6fde3696ada3","Type":"ContainerDied","Data":"daf6eb95d51e8c344f1ebadc8767f20b18ebc5c2c75e7ce0d366777b47a63e80"} Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.517224 5030 scope.go:117] "RemoveContainer" containerID="0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.518675 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" event={"ID":"e8269d80-4238-4c5a-a441-9997d386bc34","Type":"ContainerDied","Data":"927fa3fdc0602626f57ab05cdd1108aa5bb3000de0d835b76bf0ccd0150bd08c"} Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.518741 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.537820 5030 scope.go:117] "RemoveContainer" containerID="0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39" Jan 21 01:01:09 crc kubenswrapper[5030]: E0121 01:01:09.538469 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39\": container with ID starting with 0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39 not found: ID does not exist" containerID="0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.538526 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39"} err="failed to get container status \"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39\": rpc error: code = NotFound desc = could not find container \"0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39\": container with ID starting with 0a002f0c3c126c68b8cae441c383cb653e053248b23a3ed1c079f3725740bf39 not found: ID does not exist" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.538558 5030 scope.go:117] "RemoveContainer" containerID="4a7a839bb5d430d5fab0f5f43471e084c2117be25a88edb928f43aba64877b11" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.552017 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.569163 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-bh77x"] Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.573866 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.582866 5030 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-ndhjf"] Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.985148 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13b4821-61bf-4351-aa07-6fde3696ada3" path="/var/lib/kubelet/pods/a13b4821-61bf-4351-aa07-6fde3696ada3/volumes" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.985654 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7fb511-f3ec-40b9-89ec-4ab91ff34662" path="/var/lib/kubelet/pods/dd7fb511-f3ec-40b9-89ec-4ab91ff34662/volumes" Jan 21 01:01:09 crc kubenswrapper[5030]: I0121 01:01:09.986159 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8269d80-4238-4c5a-a441-9997d386bc34" path="/var/lib/kubelet/pods/e8269d80-4238-4c5a-a441-9997d386bc34/volumes" Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.211500 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.212185 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" podUID="89886f93-b57e-42ec-8964-f3507f40f605" containerName="manager" containerID="cri-o://485e66ca9210e81a0121ffff6f240bb16490308e7425939494ba1e56bc2af2f5" gracePeriod=10 Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.497555 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.497760 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-nhwzj" podUID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" containerName="registry-server" containerID="cri-o://f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364" gracePeriod=30 Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.543264 5030 generic.go:334] "Generic (PLEG): container finished" podID="89886f93-b57e-42ec-8964-f3507f40f605" containerID="485e66ca9210e81a0121ffff6f240bb16490308e7425939494ba1e56bc2af2f5" exitCode=0 Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.543317 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" event={"ID":"89886f93-b57e-42ec-8964-f3507f40f605","Type":"ContainerDied","Data":"485e66ca9210e81a0121ffff6f240bb16490308e7425939494ba1e56bc2af2f5"} Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.554702 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6"] Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.562164 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/ea82580bc5724477f94b47db468c840840d4aaf95efc52f7d04b6353c1krwn6"] Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.938771 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 01:01:11 crc kubenswrapper[5030]: I0121 01:01:11.973537 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ec3adc6-5340-444c-b220-8cb79b493534" path="/var/lib/kubelet/pods/8ec3adc6-5340-444c-b220-8cb79b493534/volumes" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.021849 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2txcv\" (UniqueName: \"kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv\") pod \"6b3d7a00-a57e-4327-bc57-7ee501d8eb12\" (UID: \"6b3d7a00-a57e-4327-bc57-7ee501d8eb12\") " Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.028514 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv" (OuterVolumeSpecName: "kube-api-access-2txcv") pod "6b3d7a00-a57e-4327-bc57-7ee501d8eb12" (UID: "6b3d7a00-a57e-4327-bc57-7ee501d8eb12"). InnerVolumeSpecName "kube-api-access-2txcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.124803 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2txcv\" (UniqueName: \"kubernetes.io/projected/6b3d7a00-a57e-4327-bc57-7ee501d8eb12-kube-api-access-2txcv\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.421187 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.532634 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert\") pod \"89886f93-b57e-42ec-8964-f3507f40f605\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.532695 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48jps\" (UniqueName: \"kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps\") pod \"89886f93-b57e-42ec-8964-f3507f40f605\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.532745 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert\") pod \"89886f93-b57e-42ec-8964-f3507f40f605\" (UID: \"89886f93-b57e-42ec-8964-f3507f40f605\") " Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.537223 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "89886f93-b57e-42ec-8964-f3507f40f605" (UID: "89886f93-b57e-42ec-8964-f3507f40f605"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.537470 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps" (OuterVolumeSpecName: "kube-api-access-48jps") pod "89886f93-b57e-42ec-8964-f3507f40f605" (UID: "89886f93-b57e-42ec-8964-f3507f40f605"). 
InnerVolumeSpecName "kube-api-access-48jps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.537849 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "89886f93-b57e-42ec-8964-f3507f40f605" (UID: "89886f93-b57e-42ec-8964-f3507f40f605"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.551053 5030 generic.go:334] "Generic (PLEG): container finished" podID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" containerID="f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364" exitCode=0 Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.551094 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-nhwzj" event={"ID":"6b3d7a00-a57e-4327-bc57-7ee501d8eb12","Type":"ContainerDied","Data":"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364"} Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.551119 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-nhwzj" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.551138 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-nhwzj" event={"ID":"6b3d7a00-a57e-4327-bc57-7ee501d8eb12","Type":"ContainerDied","Data":"d86cf1d6cecca203d5b2a31b6f99da857bf5aa6ef587785766b4865963ff900e"} Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.551160 5030 scope.go:117] "RemoveContainer" containerID="f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.552714 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" event={"ID":"89886f93-b57e-42ec-8964-f3507f40f605","Type":"ContainerDied","Data":"7c0c256307f00fccaeb81edfde8d0e1c01d2b95bbfbf692dadb47c5ca5fc1a89"} Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.552787 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.600391 5030 scope.go:117] "RemoveContainer" containerID="f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364" Jan 21 01:01:12 crc kubenswrapper[5030]: E0121 01:01:12.601289 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364\": container with ID starting with f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364 not found: ID does not exist" containerID="f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.601332 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364"} err="failed to get container status \"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364\": rpc error: code = NotFound desc = could not find container \"f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364\": container with ID starting with f874dff165783d41ede0f3d6424c81f24faef8bd813e57c2c40e5cfe67985364 not found: ID does not exist" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.601369 5030 scope.go:117] "RemoveContainer" containerID="485e66ca9210e81a0121ffff6f240bb16490308e7425939494ba1e56bc2af2f5" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.623185 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.627493 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7bbfcdfb-zm69z"] Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.634688 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.635011 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.635063 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48jps\" (UniqueName: \"kubernetes.io/projected/89886f93-b57e-42ec-8964-f3507f40f605-kube-api-access-48jps\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.635084 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/89886f93-b57e-42ec-8964-f3507f40f605-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.639111 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-nhwzj"] Jan 21 01:01:12 crc kubenswrapper[5030]: I0121 01:01:12.963178 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:01:12 crc kubenswrapper[5030]: E0121 01:01:12.964389 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.655123 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.655367 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" podUID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" containerName="manager" containerID="cri-o://389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c" gracePeriod=10 Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.939138 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.939701 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-2m42b" podUID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" containerName="registry-server" containerID="cri-o://3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1" gracePeriod=30 Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.977350 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" path="/var/lib/kubelet/pods/6b3d7a00-a57e-4327-bc57-7ee501d8eb12/volumes" Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.977859 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89886f93-b57e-42ec-8964-f3507f40f605" path="/var/lib/kubelet/pods/89886f93-b57e-42ec-8964-f3507f40f605/volumes" Jan 21 01:01:13 crc kubenswrapper[5030]: I0121 01:01:13.992203 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.002735 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/a53044f29d0f89c197912873b7cc34569484f5de61ee55394f4939720bb46jt"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.230605 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.370097 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert\") pod \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.370175 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert\") pod \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.370332 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nc9\" (UniqueName: \"kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9\") pod \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\" (UID: \"e782a2bc-b80e-43ca-9fb0-49f1fba7739b\") " Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.381365 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "e782a2bc-b80e-43ca-9fb0-49f1fba7739b" (UID: "e782a2bc-b80e-43ca-9fb0-49f1fba7739b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.382872 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "e782a2bc-b80e-43ca-9fb0-49f1fba7739b" (UID: "e782a2bc-b80e-43ca-9fb0-49f1fba7739b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.384567 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9" (OuterVolumeSpecName: "kube-api-access-t9nc9") pod "e782a2bc-b80e-43ca-9fb0-49f1fba7739b" (UID: "e782a2bc-b80e-43ca-9fb0-49f1fba7739b"). InnerVolumeSpecName "kube-api-access-t9nc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.422261 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.471465 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nc9\" (UniqueName: \"kubernetes.io/projected/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-kube-api-access-t9nc9\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.471511 5030 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.471525 5030 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e782a2bc-b80e-43ca-9fb0-49f1fba7739b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.572217 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7gb\" (UniqueName: \"kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb\") pod \"b4a4dd1c-2a1a-49f6-800e-f484edac8630\" (UID: \"b4a4dd1c-2a1a-49f6-800e-f484edac8630\") " Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.573067 5030 generic.go:334] "Generic (PLEG): container finished" podID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" containerID="389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c" exitCode=0 Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.573191 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.573177 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" event={"ID":"e782a2bc-b80e-43ca-9fb0-49f1fba7739b","Type":"ContainerDied","Data":"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c"} Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.573451 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw" event={"ID":"e782a2bc-b80e-43ca-9fb0-49f1fba7739b","Type":"ContainerDied","Data":"d00df3285a40bcfaeb234edb7c047f52a7de7fe06baaad294dda1859a7cc9ffc"} Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.573495 5030 scope.go:117] "RemoveContainer" containerID="389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.576986 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb" (OuterVolumeSpecName: "kube-api-access-zq7gb") pod "b4a4dd1c-2a1a-49f6-800e-f484edac8630" (UID: "b4a4dd1c-2a1a-49f6-800e-f484edac8630"). InnerVolumeSpecName "kube-api-access-zq7gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.577466 5030 generic.go:334] "Generic (PLEG): container finished" podID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" containerID="3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1" exitCode=0 Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.577490 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2m42b" event={"ID":"b4a4dd1c-2a1a-49f6-800e-f484edac8630","Type":"ContainerDied","Data":"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1"} Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.577506 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2m42b" event={"ID":"b4a4dd1c-2a1a-49f6-800e-f484edac8630","Type":"ContainerDied","Data":"1ad322bf48dd4186890f4dc6232fa84b68c295acf409a08090934f2d4cc6d819"} Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.577552 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-2m42b" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.608439 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.608743 5030 scope.go:117] "RemoveContainer" containerID="389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c" Jan 21 01:01:14 crc kubenswrapper[5030]: E0121 01:01:14.611585 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c\": container with ID starting with 389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c not found: ID does not exist" containerID="389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.611757 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c"} err="failed to get container status \"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c\": rpc error: code = NotFound desc = could not find container \"389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c\": container with ID starting with 389f49bfdef8a43d16568344267040b205b0bd720c6f6e6a1b7369130f84a29c not found: ID does not exist" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.611837 5030 scope.go:117] "RemoveContainer" containerID="3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.618895 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-84ccc575b7-gxztw"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.624889 5030 scope.go:117] "RemoveContainer" containerID="3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1" Jan 21 01:01:14 crc kubenswrapper[5030]: E0121 01:01:14.625345 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1\": container with ID starting with 3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1 not found: ID does not exist" 
containerID="3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.625456 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1"} err="failed to get container status \"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1\": rpc error: code = NotFound desc = could not find container \"3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1\": container with ID starting with 3784e0105b255c41ce70aa3f9a211156d86ff67658ed04e0fdc99946d76662a1 not found: ID does not exist" Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.625710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.629562 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-2m42b"] Jan 21 01:01:14 crc kubenswrapper[5030]: I0121 01:01:14.674107 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7gb\" (UniqueName: \"kubernetes.io/projected/b4a4dd1c-2a1a-49f6-800e-f484edac8630-kube-api-access-zq7gb\") on node \"crc\" DevicePath \"\"" Jan 21 01:01:15 crc kubenswrapper[5030]: I0121 01:01:15.971845 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b998ce-d629-4444-a463-3cccdff8895b" path="/var/lib/kubelet/pods/66b998ce-d629-4444-a463-3cccdff8895b/volumes" Jan 21 01:01:15 crc kubenswrapper[5030]: I0121 01:01:15.972769 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" path="/var/lib/kubelet/pods/b4a4dd1c-2a1a-49f6-800e-f484edac8630/volumes" Jan 21 01:01:15 crc kubenswrapper[5030]: I0121 01:01:15.973410 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" path="/var/lib/kubelet/pods/e782a2bc-b80e-43ca-9fb0-49f1fba7739b/volumes" Jan 21 01:01:27 crc kubenswrapper[5030]: I0121 01:01:27.968464 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:01:27 crc kubenswrapper[5030]: E0121 01:01:27.969474 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.788902 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfsr9/must-gather-d2zjm"] Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789694 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c9e208-00ec-4363-b4e8-91e11eb22555" containerName="memcached" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789707 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c9e208-00ec-4363-b4e8-91e11eb22555" containerName="memcached" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789719 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789725 5030 
state_mem.go:107] "Deleted CPUSet assignment" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789734 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89886f93-b57e-42ec-8964-f3507f40f605" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789740 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="89886f93-b57e-42ec-8964-f3507f40f605" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789747 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789753 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789781 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789787 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789798 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789805 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789813 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789832 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789846 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="setup-container" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789852 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="setup-container" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789862 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8269d80-4238-4c5a-a441-9997d386bc34" containerName="operator" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789868 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8269d80-4238-4c5a-a441-9997d386bc34" containerName="operator" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789876 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="rabbitmq" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789883 5030 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="rabbitmq" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789889 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966dfb91-704f-4ed2-b419-b735a6f876a7" containerName="keystone-api" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789894 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="966dfb91-704f-4ed2-b419-b735a6f876a7" containerName="keystone-api" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789903 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13b4821-61bf-4351-aa07-6fde3696ada3" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789910 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13b4821-61bf-4351-aa07-6fde3696ada3" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789918 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789924 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789932 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2756282-c0f3-41b0-bdce-ab04afad52f8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789937 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2756282-c0f3-41b0-bdce-ab04afad52f8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789947 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72237d07-9875-484f-9e9b-f282a65d0136" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789952 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="72237d07-9875-484f-9e9b-f282a65d0136" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789966 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789971 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789980 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.789985 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.789995 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790001 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.790009 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790015 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="mysql-bootstrap" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.790026 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790033 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: E0121 01:01:32.790046 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1228307e-d29d-44a4-b399-7be045068d83" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790052 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="1228307e-d29d-44a4-b399-7be045068d83" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790164 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="72237d07-9875-484f-9e9b-f282a65d0136" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790176 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790182 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b5fe58-53a7-4726-8286-24f697bf4295" containerName="mariadb-account-delete" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790189 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a4dd1c-2a1a-49f6-800e-f484edac8630" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790200 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="966dfb91-704f-4ed2-b419-b735a6f876a7" containerName="keystone-api" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790211 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8269d80-4238-4c5a-a441-9997d386bc34" containerName="operator" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790221 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="765d3219-1f0d-4bda-8f7f-716ee4cb51ae" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790228 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e29420-6e97-43b9-8a13-78ee659137a3" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790237 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13b4821-61bf-4351-aa07-6fde3696ada3" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790245 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3d7a00-a57e-4327-bc57-7ee501d8eb12" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790253 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="89886f93-b57e-42ec-8964-f3507f40f605" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790264 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="1228307e-d29d-44a4-b399-7be045068d83" containerName="registry-server" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790274 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a69a2c7-1b3d-40ed-b50c-0bf65178e971" containerName="rabbitmq" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790279 5030 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e782a2bc-b80e-43ca-9fb0-49f1fba7739b" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790286 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c9e208-00ec-4363-b4e8-91e11eb22555" containerName="memcached" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790297 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c199ff-c9e9-4b75-9639-003ffc8845b8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790305 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ea303d-f60d-42ad-a86b-0b371360c2b1" containerName="galera" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790312 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2756282-c0f3-41b0-bdce-ab04afad52f8" containerName="manager" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.790939 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.793246 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tfsr9"/"kube-root-ca.crt" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.795282 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tfsr9"/"openshift-service-ca.crt" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.806694 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfsr9/must-gather-d2zjm"] Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.891167 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8456a6d3-7e0b-4a72-9018-720e00a7bb01-must-gather-output\") pod \"must-gather-d2zjm\" (UID: \"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.891516 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zplfr\" (UniqueName: \"kubernetes.io/projected/8456a6d3-7e0b-4a72-9018-720e00a7bb01-kube-api-access-zplfr\") pod \"must-gather-d2zjm\" (UID: \"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.993214 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zplfr\" (UniqueName: \"kubernetes.io/projected/8456a6d3-7e0b-4a72-9018-720e00a7bb01-kube-api-access-zplfr\") pod \"must-gather-d2zjm\" (UID: \"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.993435 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8456a6d3-7e0b-4a72-9018-720e00a7bb01-must-gather-output\") pod \"must-gather-d2zjm\" (UID: \"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:32 crc kubenswrapper[5030]: I0121 01:01:32.993903 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8456a6d3-7e0b-4a72-9018-720e00a7bb01-must-gather-output\") pod \"must-gather-d2zjm\" (UID: 
\"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:33 crc kubenswrapper[5030]: I0121 01:01:33.018459 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zplfr\" (UniqueName: \"kubernetes.io/projected/8456a6d3-7e0b-4a72-9018-720e00a7bb01-kube-api-access-zplfr\") pod \"must-gather-d2zjm\" (UID: \"8456a6d3-7e0b-4a72-9018-720e00a7bb01\") " pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:33 crc kubenswrapper[5030]: I0121 01:01:33.106253 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" Jan 21 01:01:33 crc kubenswrapper[5030]: I0121 01:01:33.512784 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfsr9/must-gather-d2zjm"] Jan 21 01:01:33 crc kubenswrapper[5030]: I0121 01:01:33.521653 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:01:33 crc kubenswrapper[5030]: I0121 01:01:33.761924 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" event={"ID":"8456a6d3-7e0b-4a72-9018-720e00a7bb01","Type":"ContainerStarted","Data":"15fd1730da23cef04f6925ec66d81c53aee8489be52b55c522a935f5b9dd1184"} Jan 21 01:01:35 crc kubenswrapper[5030]: I0121 01:01:35.684354 5030 scope.go:117] "RemoveContainer" containerID="6d810c1c168a54f2cd0fd942db694c7a8b69bad89e1ef93b4a16c53b7595b45e" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.583168 5030 scope.go:117] "RemoveContainer" containerID="9545e32aa27716e913bacfcac37330599ebcd471a287020024a71734eed884b2" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.624868 5030 scope.go:117] "RemoveContainer" containerID="aba101a4eea595818e1a187ad716bc13083a6477652da52dbb2ddeb6f0b2c736" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.648562 5030 scope.go:117] "RemoveContainer" containerID="9cf0ecb3132f7ae4f4d1cb6a7868f5b0cca4cc9a588e6f1f8cdf0d7639d25a3a" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.677947 5030 scope.go:117] "RemoveContainer" containerID="5e5c187b2897d2120605eda63d7bd7c076ca69b038506e43e7d41a1926cd3fa3" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.727531 5030 scope.go:117] "RemoveContainer" containerID="313f8b3f2da47a1746bf985046b8a211ae982f8ee7c23e1d6210dbe1f6e0b658" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.749917 5030 scope.go:117] "RemoveContainer" containerID="6f50096578fa9013b1f2b0602ea184139cf29271f5e6baef23dfeb4f46554d6f" Jan 21 01:01:39 crc kubenswrapper[5030]: I0121 01:01:39.962351 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:01:39 crc kubenswrapper[5030]: E0121 01:01:39.962713 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:01:40 crc kubenswrapper[5030]: I0121 01:01:40.826297 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" 
event={"ID":"8456a6d3-7e0b-4a72-9018-720e00a7bb01","Type":"ContainerStarted","Data":"6b545d631d6f947215dff7d9f112c392221b6521b0ff894d2ba81ace65d7451a"} Jan 21 01:01:40 crc kubenswrapper[5030]: I0121 01:01:40.826607 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" event={"ID":"8456a6d3-7e0b-4a72-9018-720e00a7bb01","Type":"ContainerStarted","Data":"fcc97d68808c1d74a5dd20919c356c352def27326021a991d81d79c9c80b446d"} Jan 21 01:01:40 crc kubenswrapper[5030]: I0121 01:01:40.842488 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfsr9/must-gather-d2zjm" podStartSLOduration=2.712524502 podStartE2EDuration="8.842472181s" podCreationTimestamp="2026-01-21 01:01:32 +0000 UTC" firstStartedPulling="2026-01-21 01:01:33.521591125 +0000 UTC m=+8765.841851433" lastFinishedPulling="2026-01-21 01:01:39.651538814 +0000 UTC m=+8771.971799112" observedRunningTime="2026-01-21 01:01:40.841615541 +0000 UTC m=+8773.161875829" watchObservedRunningTime="2026-01-21 01:01:40.842472181 +0000 UTC m=+8773.162732469" Jan 21 01:01:52 crc kubenswrapper[5030]: I0121 01:01:52.962141 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:01:52 crc kubenswrapper[5030]: E0121 01:01:52.962996 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:02:05 crc kubenswrapper[5030]: I0121 01:02:05.962152 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:02:05 crc kubenswrapper[5030]: E0121 01:02:05.963063 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.109961 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bxt9t/must-gather-9nkp2"] Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.110996 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.113164 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bxt9t"/"openshift-service-ca.crt" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.113331 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bxt9t"/"kube-root-ca.crt" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.122390 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bxt9t/must-gather-9nkp2"] Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.229326 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmm7f\" (UniqueName: \"kubernetes.io/projected/5921dbf0-7f3b-4e0e-aeee-18502de052a5-kube-api-access-nmm7f\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.229426 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5921dbf0-7f3b-4e0e-aeee-18502de052a5-must-gather-output\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.330382 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5921dbf0-7f3b-4e0e-aeee-18502de052a5-must-gather-output\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.330463 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmm7f\" (UniqueName: \"kubernetes.io/projected/5921dbf0-7f3b-4e0e-aeee-18502de052a5-kube-api-access-nmm7f\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.330881 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5921dbf0-7f3b-4e0e-aeee-18502de052a5-must-gather-output\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.348876 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmm7f\" (UniqueName: \"kubernetes.io/projected/5921dbf0-7f3b-4e0e-aeee-18502de052a5-kube-api-access-nmm7f\") pod \"must-gather-9nkp2\" (UID: \"5921dbf0-7f3b-4e0e-aeee-18502de052a5\") " pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.428690 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.731319 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.737339 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.760410 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:02:07 crc kubenswrapper[5030]: I0121 01:02:07.935678 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bxt9t/must-gather-9nkp2"] Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.045676 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" event={"ID":"5921dbf0-7f3b-4e0e-aeee-18502de052a5","Type":"ContainerStarted","Data":"7caffc22768fabdf9b602e6d5048c9781cd317a0c3364daa72d565c2394c1657"} Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.847378 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.855861 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.861996 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.868356 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.877029 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.881862 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.899914 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.909025 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.919914 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.951460 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:02:08 crc kubenswrapper[5030]: I0121 01:02:08.962730 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:02:09 crc kubenswrapper[5030]: I0121 01:02:09.060285 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" event={"ID":"5921dbf0-7f3b-4e0e-aeee-18502de052a5","Type":"ContainerStarted","Data":"3c9a0d6651740dedc968864124516ad823cd5cf6f3d317038c36684e961e4b5e"} Jan 21 01:02:09 crc kubenswrapper[5030]: I0121 01:02:09.060730 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" event={"ID":"5921dbf0-7f3b-4e0e-aeee-18502de052a5","Type":"ContainerStarted","Data":"123eb5143dbee7274b18c946d5a621ccd2bcd720d9d35f4e07547ef08d04d509"} Jan 21 01:02:10 crc kubenswrapper[5030]: I0121 01:02:10.178088 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:02:10 crc kubenswrapper[5030]: I0121 01:02:10.185006 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:02:19 crc kubenswrapper[5030]: I0121 01:02:19.818225 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-82699_607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9/control-plane-machine-set-operator/0.log" Jan 21 01:02:19 crc kubenswrapper[5030]: I0121 01:02:19.830825 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/kube-rbac-proxy/0.log" Jan 21 01:02:19 crc kubenswrapper[5030]: I0121 01:02:19.840414 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/machine-api-operator/0.log" Jan 21 01:02:19 crc kubenswrapper[5030]: I0121 01:02:19.962475 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:02:19 crc kubenswrapper[5030]: E0121 01:02:19.962739 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.100041 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bxt9t/must-gather-9nkp2" podStartSLOduration=16.100018877 podStartE2EDuration="16.100018877s" podCreationTimestamp="2026-01-21 01:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 01:02:09.075535199 +0000 UTC m=+8801.395795477" watchObservedRunningTime="2026-01-21 01:02:23.100018877 +0000 UTC m=+8815.420279175" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.101782 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.103458 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.112542 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.130546 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.130790 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbcg\" (UniqueName: \"kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.130928 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.231829 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.231912 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.232081 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbcg\" (UniqueName: \"kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.232426 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.232746 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.249506 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6vbcg\" (UniqueName: \"kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg\") pod \"community-operators-9zlw7\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.445757 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:23 crc kubenswrapper[5030]: I0121 01:02:23.952079 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:23 crc kubenswrapper[5030]: W0121 01:02:23.961204 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d5944a_47d5_4124_ad88_2c7823d7a190.slice/crio-9134d0451ffd40a87753b28c5c257498f83b9d78c3caedfa294ca6ba8e7c16f4 WatchSource:0}: Error finding container 9134d0451ffd40a87753b28c5c257498f83b9d78c3caedfa294ca6ba8e7c16f4: Status 404 returned error can't find the container with id 9134d0451ffd40a87753b28c5c257498f83b9d78c3caedfa294ca6ba8e7c16f4 Jan 21 01:02:24 crc kubenswrapper[5030]: I0121 01:02:24.164135 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerStarted","Data":"9134d0451ffd40a87753b28c5c257498f83b9d78c3caedfa294ca6ba8e7c16f4"} Jan 21 01:02:25 crc kubenswrapper[5030]: I0121 01:02:25.170744 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerID="2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2" exitCode=0 Jan 21 01:02:25 crc kubenswrapper[5030]: I0121 01:02:25.170940 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerDied","Data":"2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2"} Jan 21 01:02:25 crc kubenswrapper[5030]: I0121 01:02:25.746085 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:02:25 crc kubenswrapper[5030]: I0121 01:02:25.816263 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:02:25 crc kubenswrapper[5030]: I0121 01:02:25.828568 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:02:26 crc kubenswrapper[5030]: I0121 01:02:26.178617 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerStarted","Data":"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b"} Jan 21 01:02:27 crc kubenswrapper[5030]: I0121 01:02:27.186478 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerID="99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b" exitCode=0 Jan 21 01:02:27 crc kubenswrapper[5030]: I0121 01:02:27.186559 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerDied","Data":"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b"} Jan 21 01:02:28 crc kubenswrapper[5030]: I0121 01:02:28.196357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerStarted","Data":"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396"} Jan 21 01:02:28 crc kubenswrapper[5030]: I0121 01:02:28.216487 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9zlw7" podStartSLOduration=2.7072436619999998 podStartE2EDuration="5.216457684s" podCreationTimestamp="2026-01-21 01:02:23 +0000 UTC" firstStartedPulling="2026-01-21 01:02:25.172448599 +0000 UTC m=+8817.492708897" lastFinishedPulling="2026-01-21 01:02:27.681662641 +0000 UTC m=+8820.001922919" observedRunningTime="2026-01-21 01:02:28.212377726 +0000 UTC m=+8820.532638024" watchObservedRunningTime="2026-01-21 01:02:28.216457684 +0000 UTC m=+8820.536717972" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.797383 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-k6zls_1420cf7f-07c1-473d-9976-b0978952519d/nmstate-console-plugin/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.815280 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tgc5w_c1bb67ab-63d4-45d9-95e3-696c32134f61/nmstate-handler/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.836787 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/nmstate-metrics/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.853280 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/kube-rbac-proxy/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.869599 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mwgvw_c6f7ddb6-5178-42be-98ca-d39830d57dbc/nmstate-operator/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.879471 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-bkdvp_eb8c76be-9c0b-4240-955e-016a3f5e0ccd/nmstate-webhook/0.log" Jan 21 01:02:30 crc kubenswrapper[5030]: I0121 01:02:30.961873 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:02:30 crc kubenswrapper[5030]: E0121 01:02:30.962078 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:02:33 crc kubenswrapper[5030]: I0121 01:02:33.447447 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:33 crc kubenswrapper[5030]: I0121 01:02:33.448728 5030 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:33 crc kubenswrapper[5030]: I0121 01:02:33.500430 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:34 crc kubenswrapper[5030]: I0121 01:02:34.275125 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:34 crc kubenswrapper[5030]: I0121 01:02:34.320544 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.058215 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.063994 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.091392 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.241792 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9zlw7" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="registry-server" containerID="cri-o://3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396" gracePeriod=2 Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.569765 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.627662 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.642353 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities\") pod \"e0d5944a-47d5-4124-ad88-2c7823d7a190\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.642541 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbcg\" (UniqueName: \"kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg\") pod \"e0d5944a-47d5-4124-ad88-2c7823d7a190\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.642791 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content\") pod \"e0d5944a-47d5-4124-ad88-2c7823d7a190\" (UID: \"e0d5944a-47d5-4124-ad88-2c7823d7a190\") " Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.648910 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities" (OuterVolumeSpecName: "utilities") pod "e0d5944a-47d5-4124-ad88-2c7823d7a190" (UID: "e0d5944a-47d5-4124-ad88-2c7823d7a190"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.651772 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.663854 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg" (OuterVolumeSpecName: "kube-api-access-6vbcg") pod "e0d5944a-47d5-4124-ad88-2c7823d7a190" (UID: "e0d5944a-47d5-4124-ad88-2c7823d7a190"). InnerVolumeSpecName "kube-api-access-6vbcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.671260 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.713294 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0d5944a-47d5-4124-ad88-2c7823d7a190" (UID: "e0d5944a-47d5-4124-ad88-2c7823d7a190"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.717902 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.727857 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.743892 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbcg\" (UniqueName: \"kubernetes.io/projected/e0d5944a-47d5-4124-ad88-2c7823d7a190-kube-api-access-6vbcg\") on node \"crc\" DevicePath \"\"" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.743925 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:02:36 crc kubenswrapper[5030]: I0121 01:02:36.743936 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0d5944a-47d5-4124-ad88-2c7823d7a190-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.246781 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.249594 5030 generic.go:334] "Generic (PLEG): container finished" podID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerID="3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396" exitCode=0 Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.249650 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" 
event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerDied","Data":"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396"} Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.249677 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9zlw7" event={"ID":"e0d5944a-47d5-4124-ad88-2c7823d7a190","Type":"ContainerDied","Data":"9134d0451ffd40a87753b28c5c257498f83b9d78c3caedfa294ca6ba8e7c16f4"} Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.249696 5030 scope.go:117] "RemoveContainer" containerID="3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.249822 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9zlw7" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.272587 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.283751 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.283861 5030 scope.go:117] "RemoveContainer" containerID="99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.296302 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.299094 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.304777 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9zlw7"] Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.313828 5030 scope.go:117] "RemoveContainer" containerID="2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.348391 5030 scope.go:117] "RemoveContainer" containerID="3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396" Jan 21 01:02:37 crc kubenswrapper[5030]: E0121 01:02:37.348922 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396\": container with ID starting with 3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396 not found: ID does not exist" containerID="3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.348969 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396"} err="failed to get container status \"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396\": rpc error: code = NotFound desc = could not find container \"3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396\": container with ID starting with 3f4ce3ddacdb0dd17e88e0ffa2cbb29ba04215687620a6aeb77d0af07a421396 not found: ID does not exist" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.348989 5030 scope.go:117] "RemoveContainer" 
containerID="99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b" Jan 21 01:02:37 crc kubenswrapper[5030]: E0121 01:02:37.349313 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b\": container with ID starting with 99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b not found: ID does not exist" containerID="99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.349331 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b"} err="failed to get container status \"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b\": rpc error: code = NotFound desc = could not find container \"99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b\": container with ID starting with 99fd6d8dd4b875f6165364dc3dd2c6a9b46bcf987d7eddcad0d3e40c00e1237b not found: ID does not exist" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.349342 5030 scope.go:117] "RemoveContainer" containerID="2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2" Jan 21 01:02:37 crc kubenswrapper[5030]: E0121 01:02:37.349660 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2\": container with ID starting with 2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2 not found: ID does not exist" containerID="2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.349678 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2"} err="failed to get container status \"2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2\": rpc error: code = NotFound desc = could not find container \"2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2\": container with ID starting with 2846e1cd3c619469487413b33f29bad98b5d25978bdf27fc5b2490e6603136d2 not found: ID does not exist" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.383357 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.407107 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.417211 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.427155 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.440430 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.466091 5030 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.478767 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:02:37 crc kubenswrapper[5030]: I0121 01:02:37.970948 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" path="/var/lib/kubelet/pods/e0d5944a-47d5-4124-ad88-2c7823d7a190/volumes" Jan 21 01:02:38 crc kubenswrapper[5030]: I0121 01:02:38.635998 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:02:38 crc kubenswrapper[5030]: I0121 01:02:38.643604 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.113059 5030 scope.go:117] "RemoveContainer" containerID="0c767475252e8cf0c92481394d72ba89f291aa44ec2e3a9db234c14cf5975616" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.130293 5030 scope.go:117] "RemoveContainer" containerID="f71182d02453732a55eeeb4e606df262415f5b5f75bba3cfa3a99aa861aff025" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.162567 5030 scope.go:117] "RemoveContainer" containerID="dea408f2fb16854903785168f08ae2e2a27098e7f45f6a16d2055c375b34d7ec" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.188042 5030 scope.go:117] "RemoveContainer" containerID="660c7c47b4ce9198fa1e10731a96148a762dc67a08300a2f17f2309d9c28a481" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.216761 5030 scope.go:117] "RemoveContainer" containerID="7da48c6e0fcad7380f1a23fde74dd3de0c57ce507f99042be00cec0105e5cf7d" Jan 21 01:02:40 crc kubenswrapper[5030]: I0121 01:02:40.239406 5030 scope.go:117] "RemoveContainer" containerID="aeafb711c77295fbcc62fa5f2c4994e91bd200a7ef026936ff6048dc51504a20" Jan 21 01:02:43 crc kubenswrapper[5030]: I0121 01:02:43.634402 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:02:43 crc kubenswrapper[5030]: I0121 01:02:43.646811 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:02:43 crc kubenswrapper[5030]: I0121 01:02:43.666962 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.678567 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.687429 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.695380 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.708260 5030 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.716851 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.726255 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.735233 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.753542 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.769534 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.793678 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.802317 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:02:44 crc kubenswrapper[5030]: I0121 01:02:44.962222 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:02:45 crc kubenswrapper[5030]: I0121 01:02:45.329596 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb"} Jan 21 01:02:45 crc kubenswrapper[5030]: I0121 01:02:45.982559 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:02:46 crc kubenswrapper[5030]: I0121 01:02:46.003819 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:02:48 crc kubenswrapper[5030]: I0121 01:02:48.721363 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-82699_607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9/control-plane-machine-set-operator/0.log" Jan 21 01:02:48 crc kubenswrapper[5030]: I0121 01:02:48.735379 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/kube-rbac-proxy/0.log" Jan 21 01:02:48 crc kubenswrapper[5030]: I0121 01:02:48.742717 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/machine-api-operator/0.log" Jan 21 01:02:55 crc kubenswrapper[5030]: I0121 01:02:55.010832 5030 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:02:55 crc kubenswrapper[5030]: I0121 01:02:55.090257 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:02:55 crc kubenswrapper[5030]: I0121 01:02:55.100923 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.821150 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-k6zls_1420cf7f-07c1-473d-9976-b0978952519d/nmstate-console-plugin/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.839681 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tgc5w_c1bb67ab-63d4-45d9-95e3-696c32134f61/nmstate-handler/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.853459 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/nmstate-metrics/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.864862 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/kube-rbac-proxy/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.879837 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mwgvw_c6f7ddb6-5178-42be-98ca-d39830d57dbc/nmstate-operator/0.log" Jan 21 01:03:00 crc kubenswrapper[5030]: I0121 01:03:00.899820 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-bkdvp_eb8c76be-9c0b-4240-955e-016a3f5e0ccd/nmstate-webhook/0.log" Jan 21 01:03:06 crc kubenswrapper[5030]: I0121 01:03:06.883456 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:03:06 crc kubenswrapper[5030]: I0121 01:03:06.893214 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:06 crc kubenswrapper[5030]: I0121 01:03:06.909125 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:06 crc kubenswrapper[5030]: I0121 01:03:06.942177 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:03:06 crc kubenswrapper[5030]: I0121 01:03:06.952016 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:03:07 crc kubenswrapper[5030]: I0121 01:03:07.503260 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-kd68z_f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d/dnsmasq-dns/0.log" Jan 21 01:03:07 crc kubenswrapper[5030]: I0121 01:03:07.516377 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-kd68z_f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d/init/0.log" Jan 21 01:03:13 crc kubenswrapper[5030]: I0121 01:03:13.099692 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:03:13 crc kubenswrapper[5030]: I0121 01:03:13.108455 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:03:13 crc kubenswrapper[5030]: I0121 01:03:13.141425 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.434577 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/registry-server/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.444146 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/extract-utilities/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.497800 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/extract-content/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.671976 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.685509 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.691500 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.701360 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.717562 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.737466 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.745290 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.762011 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.775219 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.821665 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:03:14 crc kubenswrapper[5030]: I0121 01:03:14.832181 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.249943 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/registry-server/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.256726 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/extract-utilities/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.266077 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/extract-content/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.300324 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zdn28_576826de-d45b-4d66-99a0-ed9a215b0e2f/marketplace-operator/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.426557 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/registry-server/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.437882 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/extract-utilities/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.443004 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/extract-content/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.802307 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/registry-server/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.807202 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/extract-utilities/0.log" Jan 21 01:03:15 crc kubenswrapper[5030]: I0121 01:03:15.817132 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/extract-content/0.log" Jan 21 01:03:16 crc kubenswrapper[5030]: I0121 01:03:16.294716 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:03:16 crc kubenswrapper[5030]: I0121 01:03:16.310584 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:03:20 crc kubenswrapper[5030]: I0121 01:03:20.334856 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:03:20 crc kubenswrapper[5030]: I0121 01:03:20.345335 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:20 crc kubenswrapper[5030]: I0121 01:03:20.358029 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:20 crc kubenswrapper[5030]: I0121 01:03:20.382874 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:03:20 crc kubenswrapper[5030]: I0121 01:03:20.393473 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:03:37 crc kubenswrapper[5030]: I0121 01:03:37.673876 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-kd68z_f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d/dnsmasq-dns/0.log" Jan 21 01:03:37 crc kubenswrapper[5030]: I0121 01:03:37.686027 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-kuttl-tests_dnsmasq-dnsmasq-84b9f45d47-kd68z_f9d1fb79-a6e2-4cb1-8990-8665d9f5e57d/init/0.log" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.303187 5030 scope.go:117] "RemoveContainer" containerID="ab3ff7039505912d60b0fec22b60d658bb779bc1c989b82524a148b365c90e77" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.342961 5030 scope.go:117] "RemoveContainer" containerID="aa186b9697e4ba4350fccfd2c0abef0a265a570f0afd7b9123bb25bd3f684018" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.359847 5030 scope.go:117] "RemoveContainer" containerID="2bd42947a7b68c2dfc89ee2480c9171b269eb691899ede07105a29d74d0df7a5" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.392836 5030 scope.go:117] "RemoveContainer" containerID="9cadf200e8321abc58854abfd8226569f7dd3a279740b95b9d6a4487243f4eec" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.413729 5030 scope.go:117] "RemoveContainer" containerID="f1f5446ba6bb631af20b295d86ddd9a37fbb2ac53843d294cebc0cedb52823b2" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.439517 5030 scope.go:117] "RemoveContainer" containerID="e591307ec74214b89b97944e550020a26717239a534f5ae914919227e523f4f1" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.471568 5030 scope.go:117] "RemoveContainer" containerID="04811408c907a49bf788d7ea7ddc76e7c8306fddbf5f99dc614d2776f2f0029a" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.497110 5030 scope.go:117] "RemoveContainer" containerID="6dc1aabd2b51d0e4d03fd0db98c955ebcfba0f98870576cc430d3c12bd87f714" Jan 21 01:03:40 crc kubenswrapper[5030]: I0121 01:03:40.514018 5030 scope.go:117] "RemoveContainer" containerID="107e7033c34de7c0c125c4880b396725e6e9d48af2eeac73d97f60ac327462ff" Jan 21 01:03:44 crc kubenswrapper[5030]: I0121 01:03:44.929308 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/registry-server/0.log" Jan 21 01:03:44 crc 
kubenswrapper[5030]: I0121 01:03:44.936326 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/extract-utilities/0.log" Jan 21 01:03:44 crc kubenswrapper[5030]: I0121 01:03:44.942088 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2ts88_c1cd5152-d5f9-4d07-b4bb-81abc83676f6/extract-content/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.410836 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/registry-server/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.417582 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/extract-utilities/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.431804 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5j2cd_a5dfcb0d-c01c-4da3-90ee-0ddb739503aa/extract-content/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.463303 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zdn28_576826de-d45b-4d66-99a0-ed9a215b0e2f/marketplace-operator/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.553841 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/registry-server/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.558947 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/extract-utilities/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.566197 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tbj59_b2db6aa4-5ae0-4c18-ab4e-6f9de18fbb1f/extract-content/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.808382 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/registry-server/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.814820 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/extract-utilities/0.log" Jan 21 01:03:45 crc kubenswrapper[5030]: I0121 01:03:45.820862 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4xq8j_83770f5d-fd84-473c-926c-c2a14010aa1c/extract-content/0.log" Jan 21 01:03:52 crc kubenswrapper[5030]: I0121 01:03:52.021165 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:03:52 crc kubenswrapper[5030]: I0121 01:03:52.055708 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:52 crc kubenswrapper[5030]: I0121 01:03:52.064409 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:03:52 crc kubenswrapper[5030]: I0121 01:03:52.096337 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:03:52 crc kubenswrapper[5030]: I0121 01:03:52.107812 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:04:40 crc kubenswrapper[5030]: I0121 01:04:40.572278 5030 scope.go:117] "RemoveContainer" containerID="180135f2bebd1daf383df1e6f748c196b1c0faa13222d5e5d3624f25327469f5" Jan 21 01:04:40 crc kubenswrapper[5030]: I0121 01:04:40.603542 5030 scope.go:117] "RemoveContainer" containerID="487c41c3adc4e03adfbe069d0ec07221133bdf999ec4e1a5240978f7657e977c" Jan 21 01:04:40 crc kubenswrapper[5030]: I0121 01:04:40.652783 5030 scope.go:117] "RemoveContainer" containerID="7949d61d73192f4f4d01d89fc08f93b8402e0b08d7e61324e248dc9fe8036c89" Jan 21 01:05:10 crc kubenswrapper[5030]: I0121 01:05:10.157765 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:05:10 crc kubenswrapper[5030]: I0121 01:05:10.158329 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:05:40 crc kubenswrapper[5030]: I0121 01:05:40.157720 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:05:40 crc kubenswrapper[5030]: I0121 01:05:40.158200 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:05:49 crc kubenswrapper[5030]: I0121 01:05:49.672327 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:05:49 crc kubenswrapper[5030]: I0121 01:05:49.683185 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:05:49 crc kubenswrapper[5030]: I0121 01:05:49.698102 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:05:49 crc 
kubenswrapper[5030]: I0121 01:05:49.739841 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:05:49 crc kubenswrapper[5030]: I0121 01:05:49.755317 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:05:50 crc kubenswrapper[5030]: I0121 01:05:50.204173 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:05:50 crc kubenswrapper[5030]: I0121 01:05:50.273441 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:05:50 crc kubenswrapper[5030]: I0121 01:05:50.284452 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.327922 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.335572 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.355411 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.524198 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-k6zls_1420cf7f-07c1-473d-9976-b0978952519d/nmstate-console-plugin/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.543799 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tgc5w_c1bb67ab-63d4-45d9-95e3-696c32134f61/nmstate-handler/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.576495 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/nmstate-metrics/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.601540 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/kube-rbac-proxy/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.622837 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mwgvw_c6f7ddb6-5178-42be-98ca-d39830d57dbc/nmstate-operator/0.log" Jan 21 01:05:51 crc kubenswrapper[5030]: I0121 01:05:51.642250 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-bkdvp_eb8c76be-9c0b-4240-955e-016a3f5e0ccd/nmstate-webhook/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.525938 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.535863 
5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.547420 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.561715 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.572254 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.583752 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.595970 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.609897 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.622171 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.655459 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:05:52 crc kubenswrapper[5030]: I0121 01:05:52.664677 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:05:53 crc kubenswrapper[5030]: I0121 01:05:53.887655 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:05:53 crc kubenswrapper[5030]: I0121 01:05:53.931774 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:05:55 crc kubenswrapper[5030]: I0121 01:05:55.357260 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:05:55 crc kubenswrapper[5030]: I0121 01:05:55.440941 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:05:55 crc kubenswrapper[5030]: I0121 01:05:55.454371 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:05:56 crc kubenswrapper[5030]: I0121 01:05:56.333342 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-82699_607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9/control-plane-machine-set-operator/0.log" Jan 21 01:05:56 crc kubenswrapper[5030]: I0121 01:05:56.345997 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/kube-rbac-proxy/0.log" Jan 21 01:05:56 crc kubenswrapper[5030]: I0121 01:05:56.353495 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/machine-api-operator/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.477119 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/kube-multus-additional-cni-plugins/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.484027 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/egress-router-binary-copy/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.491964 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/cni-plugins/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.501570 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/bond-cni-plugin/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.517401 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/routeoverride-cni/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.525206 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/whereabouts-cni-bincopy/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.534779 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/whereabouts-cni/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.604144 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-mh8ms_09197347-c50f-4fe1-b20d-3baabf9869fc/multus-admission-controller/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.609318 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-mh8ms_09197347-c50f-4fe1-b20d-3baabf9869fc/kube-rbac-proxy/0.log" Jan 21 01:05:58 crc kubenswrapper[5030]: I0121 01:05:58.648564 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/2.log" Jan 21 01:05:59 crc kubenswrapper[5030]: I0121 01:05:59.091713 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/3.log" Jan 21 01:05:59 crc kubenswrapper[5030]: I0121 01:05:59.278850 5030 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-f2dgj_2b3c414b-bc29-41aa-a369-ecc2cc809691/network-metrics-daemon/0.log" Jan 21 01:05:59 crc kubenswrapper[5030]: I0121 01:05:59.292086 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f2dgj_2b3c414b-bc29-41aa-a369-ecc2cc809691/kube-rbac-proxy/0.log" Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.157206 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.157849 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.157901 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.158565 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.158645 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb" gracePeriod=600 Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.526885 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb" exitCode=0 Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.526945 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb"} Jan 21 01:06:10 crc kubenswrapper[5030]: I0121 01:06:10.526980 5030 scope.go:117] "RemoveContainer" containerID="279f6d4d96552d9ac8025abbbb4fc8acb2bade073de4b1c6433852b406522c87" Jan 21 01:06:11 crc kubenswrapper[5030]: I0121 01:06:11.534984 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8"} Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.708302 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:14 crc kubenswrapper[5030]: E0121 01:06:14.709983 5030 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="extract-utilities" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.710067 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="extract-utilities" Jan 21 01:06:14 crc kubenswrapper[5030]: E0121 01:06:14.710133 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="registry-server" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.710194 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="registry-server" Jan 21 01:06:14 crc kubenswrapper[5030]: E0121 01:06:14.710269 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="extract-content" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.710348 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="extract-content" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.710544 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d5944a-47d5-4124-ad88-2c7823d7a190" containerName="registry-server" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.711504 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.736939 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.834006 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.834107 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgp7\" (UniqueName: \"kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.834143 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.935086 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgp7\" (UniqueName: \"kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.935336 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content\") pod \"redhat-operators-d9llr\" 
(UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.935457 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.936045 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.936911 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:14 crc kubenswrapper[5030]: I0121 01:06:14.966469 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgp7\" (UniqueName: \"kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7\") pod \"redhat-operators-d9llr\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:15 crc kubenswrapper[5030]: I0121 01:06:15.032314 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:15 crc kubenswrapper[5030]: I0121 01:06:15.306010 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:15 crc kubenswrapper[5030]: I0121 01:06:15.557128 5030 generic.go:334] "Generic (PLEG): container finished" podID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerID="91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184" exitCode=0 Jan 21 01:06:15 crc kubenswrapper[5030]: I0121 01:06:15.557332 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerDied","Data":"91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184"} Jan 21 01:06:15 crc kubenswrapper[5030]: I0121 01:06:15.557474 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerStarted","Data":"3e221390f7031f8911a9547f71c1c7de48e5876f856654898dd72972d7d1482a"} Jan 21 01:06:17 crc kubenswrapper[5030]: I0121 01:06:17.571067 5030 generic.go:334] "Generic (PLEG): container finished" podID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerID="0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2" exitCode=0 Jan 21 01:06:17 crc kubenswrapper[5030]: I0121 01:06:17.571198 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerDied","Data":"0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2"} Jan 21 01:06:18 crc kubenswrapper[5030]: I0121 01:06:18.579618 5030 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerStarted","Data":"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb"} Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.504586 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-l4rnz_1492b841-c24d-49a9-843c-7348405b87be/prometheus-operator/0.log" Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.521588 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-9zq6j_3a9c5ee4-1f00-4f1a-8147-f2ad854edb84/prometheus-operator-admission-webhook/0.log" Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.550264 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6447bc6f6d-gxhpr_7410b4ea-f2f7-4350-aa51-0730f3463d7b/prometheus-operator-admission-webhook/0.log" Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.583163 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-n46gl_1daef7a7-0684-404a-8e5a-4850d692e21e/operator/0.log" Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.596706 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-wlqst_711a3beb-9e88-42cd-86f6-0e88e52c05e3/perses-operator/0.log" Jan 21 01:06:22 crc kubenswrapper[5030]: I0121 01:06:22.991655 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:06:23 crc kubenswrapper[5030]: I0121 01:06:23.058777 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:06:23 crc kubenswrapper[5030]: I0121 01:06:23.074371 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.476399 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-k6zls_1420cf7f-07c1-473d-9976-b0978952519d/nmstate-console-plugin/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.501877 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tgc5w_c1bb67ab-63d4-45d9-95e3-696c32134f61/nmstate-handler/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.512363 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/nmstate-metrics/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.528252 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-xfzjh_795abd33-3a47-45c5-a84b-7a2a69168a13/kube-rbac-proxy/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.558574 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mwgvw_c6f7ddb6-5178-42be-98ca-d39830d57dbc/nmstate-operator/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.567909 5030 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/controller/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.569387 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-bkdvp_eb8c76be-9c0b-4240-955e-016a3f5e0ccd/nmstate-webhook/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.574373 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-9npwj_c2b9b98a-8ac1-4b34-bf47-b04f6e498149/kube-rbac-proxy/0.log" Jan 21 01:06:24 crc kubenswrapper[5030]: I0121 01:06:24.594476 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/controller/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.032591 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.033519 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.081948 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.105602 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d9llr" podStartSLOduration=8.65336321 podStartE2EDuration="11.105588787s" podCreationTimestamp="2026-01-21 01:06:14 +0000 UTC" firstStartedPulling="2026-01-21 01:06:15.558543372 +0000 UTC m=+9047.878803660" lastFinishedPulling="2026-01-21 01:06:18.010768949 +0000 UTC m=+9050.331029237" observedRunningTime="2026-01-21 01:06:18.605999867 +0000 UTC m=+9050.926260155" watchObservedRunningTime="2026-01-21 01:06:25.105588787 +0000 UTC m=+9057.425849075" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.532210 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.539368 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/reloader/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.547866 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/frr-metrics/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.560061 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.568376 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/kube-rbac-proxy-frr/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.583120 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-frr-files/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.601047 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-reloader/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.609776 5030 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cdvkw_d655cbb5-1c38-4c7c-b60c-13abdf46828b/cp-metrics/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.619941 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lfp28_aab7d871-bc62-4186-898d-e80a2905bc64/frr-k8s-webhook-server/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.647662 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6fd7497c88-v4gzk_17d40a45-26fa-4c9d-91df-767448cd1cf6/manager/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.660055 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74877969b9-vf5sw_bfce14b1-de45-4c8d-852f-044ee0214880/webhook-server/0.log" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.782230 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:25 crc kubenswrapper[5030]: I0121 01:06:25.830743 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:27 crc kubenswrapper[5030]: I0121 01:06:27.033981 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/speaker/0.log" Jan 21 01:06:27 crc kubenswrapper[5030]: I0121 01:06:27.046404 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2ttl_bbacd039-a930-489a-9aca-1fac0748973b/kube-rbac-proxy/0.log" Jan 21 01:06:27 crc kubenswrapper[5030]: I0121 01:06:27.636899 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d9llr" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="registry-server" containerID="cri-o://2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb" gracePeriod=2 Jan 21 01:06:28 crc kubenswrapper[5030]: I0121 01:06:28.288559 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-zlqvx_945ea6f2-a630-486b-bd03-4e07011e659f/cert-manager-controller/0.log" Jan 21 01:06:28 crc kubenswrapper[5030]: I0121 01:06:28.734999 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-v7jrd_5be4ddc5-9bf2-4ad6-b46c-43e5fc062c8b/cert-manager-cainjector/0.log" Jan 21 01:06:28 crc kubenswrapper[5030]: I0121 01:06:28.788363 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-z6v29_d61e12f1-4790-4ba8-a7c5-58e755a5190e/cert-manager-webhook/0.log" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.177382 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.223832 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities\") pod \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.224041 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content\") pod \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.224078 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzgp7\" (UniqueName: \"kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7\") pod \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\" (UID: \"e58e25eb-fb67-4c13-a4fb-5029dc200d52\") " Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.224529 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities" (OuterVolumeSpecName: "utilities") pod "e58e25eb-fb67-4c13-a4fb-5029dc200d52" (UID: "e58e25eb-fb67-4c13-a4fb-5029dc200d52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.232804 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7" (OuterVolumeSpecName: "kube-api-access-mzgp7") pod "e58e25eb-fb67-4c13-a4fb-5029dc200d52" (UID: "e58e25eb-fb67-4c13-a4fb-5029dc200d52"). InnerVolumeSpecName "kube-api-access-mzgp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.325440 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzgp7\" (UniqueName: \"kubernetes.io/projected/e58e25eb-fb67-4c13-a4fb-5029dc200d52-kube-api-access-mzgp7\") on node \"crc\" DevicePath \"\"" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.325480 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.400796 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e58e25eb-fb67-4c13-a4fb-5029dc200d52" (UID: "e58e25eb-fb67-4c13-a4fb-5029dc200d52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.426451 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e58e25eb-fb67-4c13-a4fb-5029dc200d52-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.650998 5030 generic.go:334] "Generic (PLEG): container finished" podID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerID="2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb" exitCode=0 Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.651101 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d9llr" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.651084 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerDied","Data":"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb"} Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.651154 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d9llr" event={"ID":"e58e25eb-fb67-4c13-a4fb-5029dc200d52","Type":"ContainerDied","Data":"3e221390f7031f8911a9547f71c1c7de48e5876f856654898dd72972d7d1482a"} Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.651176 5030 scope.go:117] "RemoveContainer" containerID="2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.683531 5030 scope.go:117] "RemoveContainer" containerID="0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.685550 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.690362 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d9llr"] Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.715746 5030 scope.go:117] "RemoveContainer" containerID="91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.745214 5030 scope.go:117] "RemoveContainer" containerID="2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb" Jan 21 01:06:29 crc kubenswrapper[5030]: E0121 01:06:29.745734 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb\": container with ID starting with 2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb not found: ID does not exist" containerID="2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.745818 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb"} err="failed to get container status \"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb\": rpc error: code = NotFound desc = could not find container \"2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb\": container with ID starting with 2c2b01ca6290340134715c414508d767c19b747927953aacaa498677adce2fbb not found: ID does not exist" Jan 21 01:06:29 crc 
kubenswrapper[5030]: I0121 01:06:29.745854 5030 scope.go:117] "RemoveContainer" containerID="0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2" Jan 21 01:06:29 crc kubenswrapper[5030]: E0121 01:06:29.746246 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2\": container with ID starting with 0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2 not found: ID does not exist" containerID="0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.746296 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2"} err="failed to get container status \"0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2\": rpc error: code = NotFound desc = could not find container \"0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2\": container with ID starting with 0f359cad990ea855cadddb127595c5410023d7e81a06c82c1a010e620bf8e3d2 not found: ID does not exist" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.746327 5030 scope.go:117] "RemoveContainer" containerID="91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184" Jan 21 01:06:29 crc kubenswrapper[5030]: E0121 01:06:29.746630 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184\": container with ID starting with 91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184 not found: ID does not exist" containerID="91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.746655 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184"} err="failed to get container status \"91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184\": rpc error: code = NotFound desc = could not find container \"91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184\": container with ID starting with 91003e8e8fe74b4a05cfd9d9b5462b2965761ff32ce0260124c2b23042074184 not found: ID does not exist" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.862022 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-82699_607ebcfe-acb3-47b7-bb3d-c73a30cdd8b9/control-plane-machine-set-operator/0.log" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.879761 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/kube-rbac-proxy/0.log" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.887766 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t7nch_09fc7d72-f430-4d90-9ab9-1d79db43b695/machine-api-operator/0.log" Jan 21 01:06:29 crc kubenswrapper[5030]: I0121 01:06:29.992664 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" path="/var/lib/kubelet/pods/e58e25eb-fb67-4c13-a4fb-5029dc200d52/volumes" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.741704 5030 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/kube-multus-additional-cni-plugins/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.756552 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/egress-router-binary-copy/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.768281 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/cni-plugins/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.777877 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/bond-cni-plugin/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.784958 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/routeoverride-cni/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.794114 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/whereabouts-cni-bincopy/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.801656 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zzvtq_18c97a93-96a8-40e8-9f36-513f91962906/whereabouts-cni/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.865047 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-mh8ms_09197347-c50f-4fe1-b20d-3baabf9869fc/multus-admission-controller/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.874040 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-mh8ms_09197347-c50f-4fe1-b20d-3baabf9869fc/kube-rbac-proxy/0.log" Jan 21 01:06:31 crc kubenswrapper[5030]: I0121 01:06:31.922374 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/2.log" Jan 21 01:06:32 crc kubenswrapper[5030]: I0121 01:06:32.387406 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n8v4f_7e610661-5072-4aa0-b1f1-75410b7f663b/kube-multus/3.log" Jan 21 01:06:32 crc kubenswrapper[5030]: I0121 01:06:32.566106 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f2dgj_2b3c414b-bc29-41aa-a369-ecc2cc809691/network-metrics-daemon/0.log" Jan 21 01:06:32 crc kubenswrapper[5030]: I0121 01:06:32.577335 5030 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f2dgj_2b3c414b-bc29-41aa-a369-ecc2cc809691/kube-rbac-proxy/0.log" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.763458 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:06 crc kubenswrapper[5030]: E0121 01:08:06.764518 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="registry-server" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.764534 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="registry-server" Jan 21 01:08:06 crc kubenswrapper[5030]: E0121 01:08:06.764547 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="extract-utilities" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.764556 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="extract-utilities" Jan 21 01:08:06 crc kubenswrapper[5030]: E0121 01:08:06.764571 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="extract-content" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.764580 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="extract-content" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.764779 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="e58e25eb-fb67-4c13-a4fb-5029dc200d52" containerName="registry-server" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.766018 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.787779 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.870783 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.870865 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xnm\" (UniqueName: \"kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.870939 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.971479 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.971544 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xnm\" (UniqueName: \"kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.971597 5030 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.972034 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:06 crc kubenswrapper[5030]: I0121 01:08:06.972292 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:07 crc kubenswrapper[5030]: I0121 01:08:07.044279 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xnm\" (UniqueName: \"kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm\") pod \"certified-operators-zcj57\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:07 crc kubenswrapper[5030]: I0121 01:08:07.091215 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:07 crc kubenswrapper[5030]: I0121 01:08:07.615071 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:08 crc kubenswrapper[5030]: I0121 01:08:08.354247 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b667904-e003-467b-8609-407a4ec7ad2a" containerID="dcb372323d6f993844ed30819a68005b85ba8bd9d93d78dc56afc994bafe7037" exitCode=0 Jan 21 01:08:08 crc kubenswrapper[5030]: I0121 01:08:08.354331 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerDied","Data":"dcb372323d6f993844ed30819a68005b85ba8bd9d93d78dc56afc994bafe7037"} Jan 21 01:08:08 crc kubenswrapper[5030]: I0121 01:08:08.354651 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerStarted","Data":"23633d3307701e6acf4589fe4da68711ef26f92dd0e848a5eb3f0e578ae884fc"} Jan 21 01:08:08 crc kubenswrapper[5030]: I0121 01:08:08.357007 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:08:09 crc kubenswrapper[5030]: I0121 01:08:09.366426 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerStarted","Data":"57094980c0975a538386de77434c0c57713f101a688cc5170b9988350b40000f"} Jan 21 01:08:10 crc kubenswrapper[5030]: I0121 01:08:10.157133 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:08:10 crc kubenswrapper[5030]: I0121 01:08:10.157224 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:08:10 crc kubenswrapper[5030]: I0121 01:08:10.377012 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b667904-e003-467b-8609-407a4ec7ad2a" containerID="57094980c0975a538386de77434c0c57713f101a688cc5170b9988350b40000f" exitCode=0 Jan 21 01:08:10 crc kubenswrapper[5030]: I0121 01:08:10.377061 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerDied","Data":"57094980c0975a538386de77434c0c57713f101a688cc5170b9988350b40000f"} Jan 21 01:08:11 crc kubenswrapper[5030]: I0121 01:08:11.386836 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerStarted","Data":"ddd9f4b30ee700c3e9216e650ee371dc55d53a4894cf7a50401d61c8c92b1b5d"} Jan 21 01:08:11 crc kubenswrapper[5030]: I0121 01:08:11.411688 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zcj57" podStartSLOduration=2.961427105 podStartE2EDuration="5.411617094s" podCreationTimestamp="2026-01-21 01:08:06 +0000 UTC" firstStartedPulling="2026-01-21 01:08:08.356332892 +0000 UTC m=+9160.676593190" lastFinishedPulling="2026-01-21 01:08:10.806522891 +0000 UTC m=+9163.126783179" observedRunningTime="2026-01-21 01:08:11.405182621 +0000 UTC m=+9163.725442919" watchObservedRunningTime="2026-01-21 01:08:11.411617094 +0000 UTC m=+9163.731877422" Jan 21 01:08:17 crc kubenswrapper[5030]: I0121 01:08:17.091930 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:17 crc kubenswrapper[5030]: I0121 01:08:17.096992 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:17 crc kubenswrapper[5030]: I0121 01:08:17.157010 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:17 crc kubenswrapper[5030]: I0121 01:08:17.480245 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:17 crc kubenswrapper[5030]: I0121 01:08:17.519840 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:19 crc kubenswrapper[5030]: I0121 01:08:19.455100 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zcj57" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="registry-server" containerID="cri-o://ddd9f4b30ee700c3e9216e650ee371dc55d53a4894cf7a50401d61c8c92b1b5d" gracePeriod=2 Jan 21 01:08:20 crc kubenswrapper[5030]: I0121 01:08:20.467271 5030 generic.go:334] "Generic (PLEG): container finished" podID="9b667904-e003-467b-8609-407a4ec7ad2a" containerID="ddd9f4b30ee700c3e9216e650ee371dc55d53a4894cf7a50401d61c8c92b1b5d" exitCode=0 Jan 
21 01:08:20 crc kubenswrapper[5030]: I0121 01:08:20.467336 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerDied","Data":"ddd9f4b30ee700c3e9216e650ee371dc55d53a4894cf7a50401d61c8c92b1b5d"} Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.065917 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.186670 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities\") pod \"9b667904-e003-467b-8609-407a4ec7ad2a\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.186757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24xnm\" (UniqueName: \"kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm\") pod \"9b667904-e003-467b-8609-407a4ec7ad2a\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.186795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content\") pod \"9b667904-e003-467b-8609-407a4ec7ad2a\" (UID: \"9b667904-e003-467b-8609-407a4ec7ad2a\") " Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.196539 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities" (OuterVolumeSpecName: "utilities") pod "9b667904-e003-467b-8609-407a4ec7ad2a" (UID: "9b667904-e003-467b-8609-407a4ec7ad2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.197007 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm" (OuterVolumeSpecName: "kube-api-access-24xnm") pod "9b667904-e003-467b-8609-407a4ec7ad2a" (UID: "9b667904-e003-467b-8609-407a4ec7ad2a"). InnerVolumeSpecName "kube-api-access-24xnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.250635 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b667904-e003-467b-8609-407a4ec7ad2a" (UID: "9b667904-e003-467b-8609-407a4ec7ad2a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.288763 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.288825 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24xnm\" (UniqueName: \"kubernetes.io/projected/9b667904-e003-467b-8609-407a4ec7ad2a-kube-api-access-24xnm\") on node \"crc\" DevicePath \"\"" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.288842 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b667904-e003-467b-8609-407a4ec7ad2a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.477454 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zcj57" event={"ID":"9b667904-e003-467b-8609-407a4ec7ad2a","Type":"ContainerDied","Data":"23633d3307701e6acf4589fe4da68711ef26f92dd0e848a5eb3f0e578ae884fc"} Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.477792 5030 scope.go:117] "RemoveContainer" containerID="ddd9f4b30ee700c3e9216e650ee371dc55d53a4894cf7a50401d61c8c92b1b5d" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.477524 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zcj57" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.498306 5030 scope.go:117] "RemoveContainer" containerID="57094980c0975a538386de77434c0c57713f101a688cc5170b9988350b40000f" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.514706 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.522383 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zcj57"] Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.543315 5030 scope.go:117] "RemoveContainer" containerID="dcb372323d6f993844ed30819a68005b85ba8bd9d93d78dc56afc994bafe7037" Jan 21 01:08:21 crc kubenswrapper[5030]: I0121 01:08:21.970112 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" path="/var/lib/kubelet/pods/9b667904-e003-467b-8609-407a4ec7ad2a/volumes" Jan 21 01:08:40 crc kubenswrapper[5030]: I0121 01:08:40.158171 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:08:40 crc kubenswrapper[5030]: I0121 01:08:40.159766 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.157223 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.157841 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.157887 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.158504 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.158558 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" gracePeriod=600 Jan 21 01:09:10 crc kubenswrapper[5030]: E0121 01:09:10.289566 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.883509 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" exitCode=0 Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.883930 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8"} Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.883967 5030 scope.go:117] "RemoveContainer" containerID="493fb309220013c06d3075f6b5de033d2728601f3b32363f4acb6dd909e988fb" Jan 21 01:09:10 crc kubenswrapper[5030]: I0121 01:09:10.884525 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:09:10 crc kubenswrapper[5030]: E0121 01:09:10.884844 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:09:24 crc kubenswrapper[5030]: I0121 01:09:24.961712 5030 scope.go:117] "RemoveContainer" 
containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:09:24 crc kubenswrapper[5030]: E0121 01:09:24.962491 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:09:37 crc kubenswrapper[5030]: I0121 01:09:37.970462 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:09:37 crc kubenswrapper[5030]: E0121 01:09:37.971577 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:09:50 crc kubenswrapper[5030]: I0121 01:09:50.962685 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:09:50 crc kubenswrapper[5030]: E0121 01:09:50.964271 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:10:02 crc kubenswrapper[5030]: I0121 01:10:02.962402 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:10:02 crc kubenswrapper[5030]: E0121 01:10:02.963130 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.816334 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:07 crc kubenswrapper[5030]: E0121 01:10:07.818235 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="extract-content" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.818275 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="extract-content" Jan 21 01:10:07 crc kubenswrapper[5030]: E0121 01:10:07.818293 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="extract-utilities" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.818304 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="extract-utilities" 
Jan 21 01:10:07 crc kubenswrapper[5030]: E0121 01:10:07.818354 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="registry-server" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.818367 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="registry-server" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.818587 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b667904-e003-467b-8609-407a4ec7ad2a" containerName="registry-server" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.820040 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.831583 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.917165 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.917242 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:07 crc kubenswrapper[5030]: I0121 01:10:07.917452 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj97c\" (UniqueName: \"kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.019493 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj97c\" (UniqueName: \"kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.019675 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.019716 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.020209 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.020492 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.050653 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj97c\" (UniqueName: \"kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c\") pod \"redhat-marketplace-7wv9n\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.194856 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:08 crc kubenswrapper[5030]: I0121 01:10:08.433438 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:09 crc kubenswrapper[5030]: I0121 01:10:09.343410 5030 generic.go:334] "Generic (PLEG): container finished" podID="6734f104-da95-4452-b9c4-1c1595452786" containerID="75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c" exitCode=0 Jan 21 01:10:09 crc kubenswrapper[5030]: I0121 01:10:09.343511 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerDied","Data":"75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c"} Jan 21 01:10:09 crc kubenswrapper[5030]: I0121 01:10:09.343757 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerStarted","Data":"562476e07c28a981804420ddde5df990f81a9e1c7107209d9987833d8bd0ecdc"} Jan 21 01:10:10 crc kubenswrapper[5030]: I0121 01:10:10.351231 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerStarted","Data":"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401"} Jan 21 01:10:11 crc kubenswrapper[5030]: I0121 01:10:11.360475 5030 generic.go:334] "Generic (PLEG): container finished" podID="6734f104-da95-4452-b9c4-1c1595452786" containerID="a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401" exitCode=0 Jan 21 01:10:11 crc kubenswrapper[5030]: I0121 01:10:11.360534 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerDied","Data":"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401"} Jan 21 01:10:12 crc kubenswrapper[5030]: I0121 01:10:12.368219 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerStarted","Data":"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485"} Jan 21 01:10:12 crc kubenswrapper[5030]: I0121 01:10:12.388277 
5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7wv9n" podStartSLOduration=2.936955007 podStartE2EDuration="5.38825766s" podCreationTimestamp="2026-01-21 01:10:07 +0000 UTC" firstStartedPulling="2026-01-21 01:10:09.348732493 +0000 UTC m=+9281.668992791" lastFinishedPulling="2026-01-21 01:10:11.800035156 +0000 UTC m=+9284.120295444" observedRunningTime="2026-01-21 01:10:12.386709413 +0000 UTC m=+9284.706969721" watchObservedRunningTime="2026-01-21 01:10:12.38825766 +0000 UTC m=+9284.708517948" Jan 21 01:10:17 crc kubenswrapper[5030]: I0121 01:10:17.966712 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:10:17 crc kubenswrapper[5030]: E0121 01:10:17.967558 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:10:18 crc kubenswrapper[5030]: I0121 01:10:18.195433 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:18 crc kubenswrapper[5030]: I0121 01:10:18.195511 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:18 crc kubenswrapper[5030]: I0121 01:10:18.266532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:18 crc kubenswrapper[5030]: I0121 01:10:18.464197 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:18 crc kubenswrapper[5030]: I0121 01:10:18.518285 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:20 crc kubenswrapper[5030]: I0121 01:10:20.425744 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7wv9n" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="registry-server" containerID="cri-o://e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485" gracePeriod=2 Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.396254 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.434020 5030 generic.go:334] "Generic (PLEG): container finished" podID="6734f104-da95-4452-b9c4-1c1595452786" containerID="e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485" exitCode=0 Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.434073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerDied","Data":"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485"} Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.434111 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7wv9n" event={"ID":"6734f104-da95-4452-b9c4-1c1595452786","Type":"ContainerDied","Data":"562476e07c28a981804420ddde5df990f81a9e1c7107209d9987833d8bd0ecdc"} Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.434131 5030 scope.go:117] "RemoveContainer" containerID="e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.434266 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7wv9n" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.455016 5030 scope.go:117] "RemoveContainer" containerID="a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.475937 5030 scope.go:117] "RemoveContainer" containerID="75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.489909 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content\") pod \"6734f104-da95-4452-b9c4-1c1595452786\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.489987 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj97c\" (UniqueName: \"kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c\") pod \"6734f104-da95-4452-b9c4-1c1595452786\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.490071 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities\") pod \"6734f104-da95-4452-b9c4-1c1595452786\" (UID: \"6734f104-da95-4452-b9c4-1c1595452786\") " Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.490902 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities" (OuterVolumeSpecName: "utilities") pod "6734f104-da95-4452-b9c4-1c1595452786" (UID: "6734f104-da95-4452-b9c4-1c1595452786"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.496056 5030 scope.go:117] "RemoveContainer" containerID="e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485" Jan 21 01:10:21 crc kubenswrapper[5030]: E0121 01:10:21.496838 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485\": container with ID starting with e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485 not found: ID does not exist" containerID="e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.496913 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485"} err="failed to get container status \"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485\": rpc error: code = NotFound desc = could not find container \"e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485\": container with ID starting with e85db2adefd48f9130d2d143119ae5ec8d7cbec6b400fbd209bf69c159c3b485 not found: ID does not exist" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.496948 5030 scope.go:117] "RemoveContainer" containerID="a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.497086 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c" (OuterVolumeSpecName: "kube-api-access-jj97c") pod "6734f104-da95-4452-b9c4-1c1595452786" (UID: "6734f104-da95-4452-b9c4-1c1595452786"). InnerVolumeSpecName "kube-api-access-jj97c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:10:21 crc kubenswrapper[5030]: E0121 01:10:21.497338 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401\": container with ID starting with a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401 not found: ID does not exist" containerID="a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.497392 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401"} err="failed to get container status \"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401\": rpc error: code = NotFound desc = could not find container \"a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401\": container with ID starting with a51e65f31cec95cb28ce93f0ba6bb1aa21f85af4ae6f2197dccc68a8d4b99401 not found: ID does not exist" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.497428 5030 scope.go:117] "RemoveContainer" containerID="75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c" Jan 21 01:10:21 crc kubenswrapper[5030]: E0121 01:10:21.497727 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c\": container with ID starting with 75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c not found: ID does not exist" containerID="75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.497761 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c"} err="failed to get container status \"75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c\": rpc error: code = NotFound desc = could not find container \"75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c\": container with ID starting with 75fe99f1d4acf69268843009f6e9df68e52f846b055c4823de11dbcd0b97351c not found: ID does not exist" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.514216 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6734f104-da95-4452-b9c4-1c1595452786" (UID: "6734f104-da95-4452-b9c4-1c1595452786"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.591893 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.591934 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj97c\" (UniqueName: \"kubernetes.io/projected/6734f104-da95-4452-b9c4-1c1595452786-kube-api-access-jj97c\") on node \"crc\" DevicePath \"\"" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.591949 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6734f104-da95-4452-b9c4-1c1595452786-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.766986 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.771455 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7wv9n"] Jan 21 01:10:21 crc kubenswrapper[5030]: I0121 01:10:21.980162 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6734f104-da95-4452-b9c4-1c1595452786" path="/var/lib/kubelet/pods/6734f104-da95-4452-b9c4-1c1595452786/volumes" Jan 21 01:10:32 crc kubenswrapper[5030]: I0121 01:10:32.962801 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:10:32 crc kubenswrapper[5030]: E0121 01:10:32.963541 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:10:46 crc kubenswrapper[5030]: I0121 01:10:46.963111 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:10:46 crc kubenswrapper[5030]: E0121 01:10:46.964093 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:10:57 crc kubenswrapper[5030]: I0121 01:10:57.970349 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:10:57 crc kubenswrapper[5030]: E0121 01:10:57.972268 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:11:09 crc kubenswrapper[5030]: I0121 01:11:09.962186 5030 
scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:11:09 crc kubenswrapper[5030]: E0121 01:11:09.962936 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:11:21 crc kubenswrapper[5030]: I0121 01:11:21.961912 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:11:21 crc kubenswrapper[5030]: E0121 01:11:21.962702 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:11:36 crc kubenswrapper[5030]: I0121 01:11:36.962058 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:11:36 crc kubenswrapper[5030]: E0121 01:11:36.962693 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:11:50 crc kubenswrapper[5030]: I0121 01:11:50.962275 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:11:50 crc kubenswrapper[5030]: E0121 01:11:50.963045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:12:05 crc kubenswrapper[5030]: I0121 01:12:05.962139 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:12:05 crc kubenswrapper[5030]: E0121 01:12:05.963044 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:12:18 crc kubenswrapper[5030]: I0121 01:12:18.962454 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:12:18 crc kubenswrapper[5030]: E0121 01:12:18.963157 5030 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:12:33 crc kubenswrapper[5030]: I0121 01:12:33.962918 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:12:33 crc kubenswrapper[5030]: E0121 01:12:33.964048 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:12:44 crc kubenswrapper[5030]: I0121 01:12:44.962207 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:12:44 crc kubenswrapper[5030]: E0121 01:12:44.963048 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:12:58 crc kubenswrapper[5030]: I0121 01:12:58.962576 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:12:58 crc kubenswrapper[5030]: E0121 01:12:58.963516 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:13:11 crc kubenswrapper[5030]: I0121 01:13:11.965269 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:13:11 crc kubenswrapper[5030]: E0121 01:13:11.966501 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:13:24 crc kubenswrapper[5030]: I0121 01:13:24.962611 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:13:24 crc kubenswrapper[5030]: E0121 01:13:24.963672 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:13:35 crc kubenswrapper[5030]: I0121 01:13:35.962389 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:13:35 crc kubenswrapper[5030]: E0121 01:13:35.963170 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.243939 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:36 crc kubenswrapper[5030]: E0121 01:13:36.244421 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="extract-utilities" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.244453 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="extract-utilities" Jan 21 01:13:36 crc kubenswrapper[5030]: E0121 01:13:36.244493 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="extract-content" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.244506 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="extract-content" Jan 21 01:13:36 crc kubenswrapper[5030]: E0121 01:13:36.244702 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="registry-server" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.244729 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="registry-server" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.248395 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6734f104-da95-4452-b9c4-1c1595452786" containerName="registry-server" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.250004 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.259076 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.382093 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq8cg\" (UniqueName: \"kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.382174 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.382336 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.483765 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.483849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq8cg\" (UniqueName: \"kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.483883 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.484415 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.484518 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.512918 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pq8cg\" (UniqueName: \"kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg\") pod \"community-operators-2fqcw\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.586203 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:36 crc kubenswrapper[5030]: I0121 01:13:36.877776 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:37 crc kubenswrapper[5030]: I0121 01:13:37.905683 5030 generic.go:334] "Generic (PLEG): container finished" podID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerID="521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3" exitCode=0 Jan 21 01:13:37 crc kubenswrapper[5030]: I0121 01:13:37.905736 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerDied","Data":"521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3"} Jan 21 01:13:37 crc kubenswrapper[5030]: I0121 01:13:37.906073 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerStarted","Data":"d1cad67f8bde7f06da6910de59b097bf0ff9a995ecbb1858abe9bc219abc7d5d"} Jan 21 01:13:37 crc kubenswrapper[5030]: I0121 01:13:37.908405 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:13:38 crc kubenswrapper[5030]: I0121 01:13:38.917792 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerStarted","Data":"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e"} Jan 21 01:13:39 crc kubenswrapper[5030]: I0121 01:13:39.929435 5030 generic.go:334] "Generic (PLEG): container finished" podID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerID="368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e" exitCode=0 Jan 21 01:13:39 crc kubenswrapper[5030]: I0121 01:13:39.929498 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerDied","Data":"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e"} Jan 21 01:13:40 crc kubenswrapper[5030]: I0121 01:13:40.937468 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerStarted","Data":"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061"} Jan 21 01:13:40 crc kubenswrapper[5030]: I0121 01:13:40.957386 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2fqcw" podStartSLOduration=2.564347856 podStartE2EDuration="4.957365557s" podCreationTimestamp="2026-01-21 01:13:36 +0000 UTC" firstStartedPulling="2026-01-21 01:13:37.907983723 +0000 UTC m=+9490.228244031" lastFinishedPulling="2026-01-21 01:13:40.301001434 +0000 UTC m=+9492.621261732" observedRunningTime="2026-01-21 01:13:40.957052669 +0000 UTC m=+9493.277312977" watchObservedRunningTime="2026-01-21 
01:13:40.957365557 +0000 UTC m=+9493.277625865" Jan 21 01:13:46 crc kubenswrapper[5030]: I0121 01:13:46.587028 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:46 crc kubenswrapper[5030]: I0121 01:13:46.587767 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:46 crc kubenswrapper[5030]: I0121 01:13:46.644512 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:47 crc kubenswrapper[5030]: I0121 01:13:47.067732 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:47 crc kubenswrapper[5030]: I0121 01:13:47.126139 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:47 crc kubenswrapper[5030]: I0121 01:13:47.968134 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:13:47 crc kubenswrapper[5030]: E0121 01:13:47.969657 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.020546 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2fqcw" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="registry-server" containerID="cri-o://f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061" gracePeriod=2 Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.432321 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.613757 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content\") pod \"192db6f1-fb04-46b5-956d-9b94874b8ea8\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.613880 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities\") pod \"192db6f1-fb04-46b5-956d-9b94874b8ea8\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.614014 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq8cg\" (UniqueName: \"kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg\") pod \"192db6f1-fb04-46b5-956d-9b94874b8ea8\" (UID: \"192db6f1-fb04-46b5-956d-9b94874b8ea8\") " Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.615822 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities" (OuterVolumeSpecName: "utilities") pod "192db6f1-fb04-46b5-956d-9b94874b8ea8" (UID: "192db6f1-fb04-46b5-956d-9b94874b8ea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.636366 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg" (OuterVolumeSpecName: "kube-api-access-pq8cg") pod "192db6f1-fb04-46b5-956d-9b94874b8ea8" (UID: "192db6f1-fb04-46b5-956d-9b94874b8ea8"). InnerVolumeSpecName "kube-api-access-pq8cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.664457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "192db6f1-fb04-46b5-956d-9b94874b8ea8" (UID: "192db6f1-fb04-46b5-956d-9b94874b8ea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.715882 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.715919 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/192db6f1-fb04-46b5-956d-9b94874b8ea8-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:13:49 crc kubenswrapper[5030]: I0121 01:13:49.715940 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq8cg\" (UniqueName: \"kubernetes.io/projected/192db6f1-fb04-46b5-956d-9b94874b8ea8-kube-api-access-pq8cg\") on node \"crc\" DevicePath \"\"" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.030733 5030 generic.go:334] "Generic (PLEG): container finished" podID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerID="f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061" exitCode=0 Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.030777 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerDied","Data":"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061"} Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.030800 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fqcw" event={"ID":"192db6f1-fb04-46b5-956d-9b94874b8ea8","Type":"ContainerDied","Data":"d1cad67f8bde7f06da6910de59b097bf0ff9a995ecbb1858abe9bc219abc7d5d"} Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.030817 5030 scope.go:117] "RemoveContainer" containerID="f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.030910 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2fqcw" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.053441 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.055326 5030 scope.go:117] "RemoveContainer" containerID="368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.058488 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2fqcw"] Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.077052 5030 scope.go:117] "RemoveContainer" containerID="521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.101093 5030 scope.go:117] "RemoveContainer" containerID="f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061" Jan 21 01:13:50 crc kubenswrapper[5030]: E0121 01:13:50.101409 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061\": container with ID starting with f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061 not found: ID does not exist" containerID="f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.101445 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061"} err="failed to get container status \"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061\": rpc error: code = NotFound desc = could not find container \"f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061\": container with ID starting with f49c5b84a08f0117f9dc0d023eba2936646f300e61cc0d13466c619e5b1ff061 not found: ID does not exist" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.101463 5030 scope.go:117] "RemoveContainer" containerID="368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e" Jan 21 01:13:50 crc kubenswrapper[5030]: E0121 01:13:50.101701 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e\": container with ID starting with 368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e not found: ID does not exist" containerID="368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.101722 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e"} err="failed to get container status \"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e\": rpc error: code = NotFound desc = could not find container \"368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e\": container with ID starting with 368e76c828e479160d73b029e6ea286a9f2565bb25b742e06db6bfce7724b59e not found: ID does not exist" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.101735 5030 scope.go:117] "RemoveContainer" containerID="521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3" Jan 21 01:13:50 crc kubenswrapper[5030]: E0121 01:13:50.102123 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3\": container with ID starting with 521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3 not found: ID does not exist" containerID="521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3" Jan 21 01:13:50 crc kubenswrapper[5030]: I0121 01:13:50.102143 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3"} err="failed to get container status \"521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3\": rpc error: code = NotFound desc = could not find container \"521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3\": container with ID starting with 521abce4dd580855c8ddc7d536ce4ef4ac1efa43a6446ad4abb608c0ae0cbbb3 not found: ID does not exist" Jan 21 01:13:51 crc kubenswrapper[5030]: I0121 01:13:51.972394 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" path="/var/lib/kubelet/pods/192db6f1-fb04-46b5-956d-9b94874b8ea8/volumes" Jan 21 01:14:01 crc kubenswrapper[5030]: I0121 01:14:01.965660 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:14:01 crc kubenswrapper[5030]: E0121 01:14:01.967004 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:14:13 crc kubenswrapper[5030]: I0121 01:14:13.962199 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:14:16 crc kubenswrapper[5030]: I0121 01:14:16.183605 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5"} Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.181765 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr"] Jan 21 01:15:00 crc kubenswrapper[5030]: E0121 01:15:00.182812 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="extract-content" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.182837 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="extract-content" Jan 21 01:15:00 crc kubenswrapper[5030]: E0121 01:15:00.182897 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="extract-utilities" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.182911 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="extract-utilities" Jan 21 01:15:00 crc kubenswrapper[5030]: E0121 01:15:00.182930 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" 
containerName="registry-server" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.182942 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="registry-server" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.183144 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="192db6f1-fb04-46b5-956d-9b94874b8ea8" containerName="registry-server" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.183830 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.187851 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.188350 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.196677 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr"] Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.315954 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.316036 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fkz\" (UniqueName: \"kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.316067 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.416920 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fkz\" (UniqueName: \"kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.416986 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.417036 5030 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.417870 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.423610 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.437011 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fkz\" (UniqueName: \"kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz\") pod \"collect-profiles-29482635-b86sr\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.503690 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:00 crc kubenswrapper[5030]: I0121 01:15:00.971876 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr"] Jan 21 01:15:01 crc kubenswrapper[5030]: I0121 01:15:01.511545 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" event={"ID":"f21ee884-1482-4a4e-8d9d-7b864dbcebe8","Type":"ContainerStarted","Data":"ba0745472815c03dc1f5c0543f28aa36d0cdc17d522dcc7cdfe7045697046d26"} Jan 21 01:15:01 crc kubenswrapper[5030]: I0121 01:15:01.511645 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" event={"ID":"f21ee884-1482-4a4e-8d9d-7b864dbcebe8","Type":"ContainerStarted","Data":"0eb452026750c7230a0664b8cd8994afcf02f3cbe1f644b844fa0156cf713412"} Jan 21 01:15:01 crc kubenswrapper[5030]: I0121 01:15:01.528548 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" podStartSLOduration=1.52852666 podStartE2EDuration="1.52852666s" podCreationTimestamp="2026-01-21 01:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 01:15:01.523604361 +0000 UTC m=+9573.843864649" watchObservedRunningTime="2026-01-21 01:15:01.52852666 +0000 UTC m=+9573.848786948" Jan 21 01:15:02 crc kubenswrapper[5030]: I0121 01:15:02.520566 5030 generic.go:334] "Generic (PLEG): container finished" podID="f21ee884-1482-4a4e-8d9d-7b864dbcebe8" containerID="ba0745472815c03dc1f5c0543f28aa36d0cdc17d522dcc7cdfe7045697046d26" exitCode=0 Jan 21 01:15:02 crc kubenswrapper[5030]: I0121 01:15:02.520631 5030 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" event={"ID":"f21ee884-1482-4a4e-8d9d-7b864dbcebe8","Type":"ContainerDied","Data":"ba0745472815c03dc1f5c0543f28aa36d0cdc17d522dcc7cdfe7045697046d26"} Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.764179 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.767401 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume\") pod \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.768700 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume" (OuterVolumeSpecName: "config-volume") pod "f21ee884-1482-4a4e-8d9d-7b864dbcebe8" (UID: "f21ee884-1482-4a4e-8d9d-7b864dbcebe8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.868486 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8fkz\" (UniqueName: \"kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz\") pod \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.868546 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume\") pod \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\" (UID: \"f21ee884-1482-4a4e-8d9d-7b864dbcebe8\") " Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.868865 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.874511 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz" (OuterVolumeSpecName: "kube-api-access-w8fkz") pod "f21ee884-1482-4a4e-8d9d-7b864dbcebe8" (UID: "f21ee884-1482-4a4e-8d9d-7b864dbcebe8"). InnerVolumeSpecName "kube-api-access-w8fkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.886289 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f21ee884-1482-4a4e-8d9d-7b864dbcebe8" (UID: "f21ee884-1482-4a4e-8d9d-7b864dbcebe8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.970646 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8fkz\" (UniqueName: \"kubernetes.io/projected/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-kube-api-access-w8fkz\") on node \"crc\" DevicePath \"\"" Jan 21 01:15:03 crc kubenswrapper[5030]: I0121 01:15:03.970681 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f21ee884-1482-4a4e-8d9d-7b864dbcebe8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:15:04 crc kubenswrapper[5030]: I0121 01:15:04.535697 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" event={"ID":"f21ee884-1482-4a4e-8d9d-7b864dbcebe8","Type":"ContainerDied","Data":"0eb452026750c7230a0664b8cd8994afcf02f3cbe1f644b844fa0156cf713412"} Jan 21 01:15:04 crc kubenswrapper[5030]: I0121 01:15:04.536038 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb452026750c7230a0664b8cd8994afcf02f3cbe1f644b844fa0156cf713412" Jan 21 01:15:04 crc kubenswrapper[5030]: I0121 01:15:04.535785 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482635-b86sr" Jan 21 01:15:04 crc kubenswrapper[5030]: I0121 01:15:04.578941 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr"] Jan 21 01:15:04 crc kubenswrapper[5030]: I0121 01:15:04.584852 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482590-h86lr"] Jan 21 01:15:05 crc kubenswrapper[5030]: I0121 01:15:05.983762 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af7c689-d338-4c09-8b45-78037d8dc249" path="/var/lib/kubelet/pods/3af7c689-d338-4c09-8b45-78037d8dc249/volumes" Jan 21 01:15:40 crc kubenswrapper[5030]: I0121 01:15:40.897564 5030 scope.go:117] "RemoveContainer" containerID="dcf8bdbe79b2a2b8f83042f25096f786d0b00a604e9bc49fef0c8005aef64b68" Jan 21 01:16:40 crc kubenswrapper[5030]: I0121 01:16:40.157118 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:16:40 crc kubenswrapper[5030]: I0121 01:16:40.157814 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.365994 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:16:55 crc kubenswrapper[5030]: E0121 01:16:55.366800 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21ee884-1482-4a4e-8d9d-7b864dbcebe8" containerName="collect-profiles" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.366819 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ee884-1482-4a4e-8d9d-7b864dbcebe8" containerName="collect-profiles" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 
01:16:55.366987 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21ee884-1482-4a4e-8d9d-7b864dbcebe8" containerName="collect-profiles" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.368091 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.385178 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.451685 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jqf7\" (UniqueName: \"kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.451770 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.451803 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.606253 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jqf7\" (UniqueName: \"kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.606415 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.606476 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.607300 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.607384 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities\") pod 
\"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.641126 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jqf7\" (UniqueName: \"kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7\") pod \"redhat-operators-qbxg6\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:55 crc kubenswrapper[5030]: I0121 01:16:55.687767 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:16:56 crc kubenswrapper[5030]: I0121 01:16:56.119031 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:16:56 crc kubenswrapper[5030]: I0121 01:16:56.289234 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerStarted","Data":"b48dd99bd697b622a3832d3aeea5a49917de7351513e503c05d0241b8f9fc65e"} Jan 21 01:16:57 crc kubenswrapper[5030]: I0121 01:16:57.297228 5030 generic.go:334] "Generic (PLEG): container finished" podID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerID="5c7ed28bb2db5c11d7f18109df6a11f0bc795bb7efed02af5e81b241cd517472" exitCode=0 Jan 21 01:16:57 crc kubenswrapper[5030]: I0121 01:16:57.297279 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerDied","Data":"5c7ed28bb2db5c11d7f18109df6a11f0bc795bb7efed02af5e81b241cd517472"} Jan 21 01:16:59 crc kubenswrapper[5030]: I0121 01:16:59.321449 5030 generic.go:334] "Generic (PLEG): container finished" podID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerID="01ea9194eedb3a8b50dca0153c928294a6df880f7c95ac120d6a72ce868882ef" exitCode=0 Jan 21 01:16:59 crc kubenswrapper[5030]: I0121 01:16:59.321600 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerDied","Data":"01ea9194eedb3a8b50dca0153c928294a6df880f7c95ac120d6a72ce868882ef"} Jan 21 01:17:00 crc kubenswrapper[5030]: I0121 01:17:00.331434 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerStarted","Data":"109c6b2a906ce15fa6781341df74e19df53d5a332d82b961fc5896008ddc0f5a"} Jan 21 01:17:00 crc kubenswrapper[5030]: I0121 01:17:00.356289 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbxg6" podStartSLOduration=2.8824282979999998 podStartE2EDuration="5.356271733s" podCreationTimestamp="2026-01-21 01:16:55 +0000 UTC" firstStartedPulling="2026-01-21 01:16:57.299176874 +0000 UTC m=+9689.619437162" lastFinishedPulling="2026-01-21 01:16:59.773020289 +0000 UTC m=+9692.093280597" observedRunningTime="2026-01-21 01:17:00.350950465 +0000 UTC m=+9692.671210753" watchObservedRunningTime="2026-01-21 01:17:00.356271733 +0000 UTC m=+9692.676532031" Jan 21 01:17:05 crc kubenswrapper[5030]: I0121 01:17:05.688866 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:05 crc kubenswrapper[5030]: I0121 
01:17:05.689582 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:06 crc kubenswrapper[5030]: I0121 01:17:06.736961 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qbxg6" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="registry-server" probeResult="failure" output=< Jan 21 01:17:06 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 21 01:17:06 crc kubenswrapper[5030]: > Jan 21 01:17:10 crc kubenswrapper[5030]: I0121 01:17:10.157058 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:17:10 crc kubenswrapper[5030]: I0121 01:17:10.157155 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:17:15 crc kubenswrapper[5030]: I0121 01:17:15.757388 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:15 crc kubenswrapper[5030]: I0121 01:17:15.816563 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:16 crc kubenswrapper[5030]: I0121 01:17:16.006919 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:17:17 crc kubenswrapper[5030]: I0121 01:17:17.448132 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbxg6" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="registry-server" containerID="cri-o://109c6b2a906ce15fa6781341df74e19df53d5a332d82b961fc5896008ddc0f5a" gracePeriod=2 Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.485635 5030 generic.go:334] "Generic (PLEG): container finished" podID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerID="109c6b2a906ce15fa6781341df74e19df53d5a332d82b961fc5896008ddc0f5a" exitCode=0 Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.485840 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerDied","Data":"109c6b2a906ce15fa6781341df74e19df53d5a332d82b961fc5896008ddc0f5a"} Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.823843 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.888405 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities\") pod \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.888484 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jqf7\" (UniqueName: \"kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7\") pod \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.888600 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content\") pod \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\" (UID: \"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18\") " Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.892984 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities" (OuterVolumeSpecName: "utilities") pod "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" (UID: "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.897879 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7" (OuterVolumeSpecName: "kube-api-access-2jqf7") pod "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" (UID: "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18"). InnerVolumeSpecName "kube-api-access-2jqf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.990502 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jqf7\" (UniqueName: \"kubernetes.io/projected/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-kube-api-access-2jqf7\") on node \"crc\" DevicePath \"\"" Jan 21 01:17:20 crc kubenswrapper[5030]: I0121 01:17:20.990764 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.018853 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" (UID: "d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.092232 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.500739 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbxg6" event={"ID":"d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18","Type":"ContainerDied","Data":"b48dd99bd697b622a3832d3aeea5a49917de7351513e503c05d0241b8f9fc65e"} Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.501650 5030 scope.go:117] "RemoveContainer" containerID="109c6b2a906ce15fa6781341df74e19df53d5a332d82b961fc5896008ddc0f5a" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.500819 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbxg6" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.517451 5030 scope.go:117] "RemoveContainer" containerID="01ea9194eedb3a8b50dca0153c928294a6df880f7c95ac120d6a72ce868882ef" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.535206 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.541433 5030 scope.go:117] "RemoveContainer" containerID="5c7ed28bb2db5c11d7f18109df6a11f0bc795bb7efed02af5e81b241cd517472" Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.542484 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbxg6"] Jan 21 01:17:21 crc kubenswrapper[5030]: I0121 01:17:21.988846 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" path="/var/lib/kubelet/pods/d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18/volumes" Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.157107 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.157617 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.157685 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.158373 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.158442 5030 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5" gracePeriod=600 Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.662488 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5" exitCode=0 Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.662579 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5"} Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.662911 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8"} Jan 21 01:17:40 crc kubenswrapper[5030]: I0121 01:17:40.662938 5030 scope.go:117] "RemoveContainer" containerID="881f717c3ed402d263b51c42afaee315c1aa3bab98b1eb240387630009e518d8" Jan 21 01:19:40 crc kubenswrapper[5030]: I0121 01:19:40.157873 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:19:40 crc kubenswrapper[5030]: I0121 01:19:40.158494 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:20:10 crc kubenswrapper[5030]: I0121 01:20:10.157186 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:20:10 crc kubenswrapper[5030]: I0121 01:20:10.157772 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.157696 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.158975 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.159044 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.160071 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.160168 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" gracePeriod=600 Jan 21 01:20:40 crc kubenswrapper[5030]: E0121 01:20:40.283647 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.976436 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" exitCode=0 Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.976493 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8"} Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.976538 5030 scope.go:117] "RemoveContainer" containerID="33db64f5d0e88164e41df8765af71d48f8d5443b0a39e8669699e20e991a96f5" Jan 21 01:20:40 crc kubenswrapper[5030]: I0121 01:20:40.977165 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:20:40 crc kubenswrapper[5030]: E0121 01:20:40.977457 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.852313 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:20:48 crc kubenswrapper[5030]: E0121 01:20:48.853454 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="registry-server" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.853484 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" 
containerName="registry-server" Jan 21 01:20:48 crc kubenswrapper[5030]: E0121 01:20:48.853516 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="extract-content" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.853533 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="extract-content" Jan 21 01:20:48 crc kubenswrapper[5030]: E0121 01:20:48.853585 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="extract-utilities" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.853602 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="extract-utilities" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.853925 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11d8ee4-bedb-4f9e-9cfa-cd4dbf5dfa18" containerName="registry-server" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.856060 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.859441 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.903766 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.903997 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:48 crc kubenswrapper[5030]: I0121 01:20:48.904390 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqkz\" (UniqueName: \"kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.006029 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.006143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.006218 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wvqkz\" (UniqueName: \"kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.006946 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.007229 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.027319 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqkz\" (UniqueName: \"kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz\") pod \"redhat-marketplace-f67sg\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.208263 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:49 crc kubenswrapper[5030]: I0121 01:20:49.428823 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:20:50 crc kubenswrapper[5030]: I0121 01:20:50.073021 5030 generic.go:334] "Generic (PLEG): container finished" podID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerID="c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36" exitCode=0 Jan 21 01:20:50 crc kubenswrapper[5030]: I0121 01:20:50.073103 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerDied","Data":"c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36"} Jan 21 01:20:50 crc kubenswrapper[5030]: I0121 01:20:50.073153 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerStarted","Data":"ef2515f509d481ac6a94c17de0a00cbedfb4310babfdd3943d2343fba9b0c98c"} Jan 21 01:20:50 crc kubenswrapper[5030]: I0121 01:20:50.076890 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:20:51 crc kubenswrapper[5030]: I0121 01:20:51.962744 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:20:51 crc kubenswrapper[5030]: E0121 01:20:51.963677 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:20:52 crc 
kubenswrapper[5030]: I0121 01:20:52.095031 5030 generic.go:334] "Generic (PLEG): container finished" podID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerID="e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975" exitCode=0 Jan 21 01:20:52 crc kubenswrapper[5030]: I0121 01:20:52.095098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerDied","Data":"e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975"} Jan 21 01:20:53 crc kubenswrapper[5030]: I0121 01:20:53.105977 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerStarted","Data":"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1"} Jan 21 01:20:53 crc kubenswrapper[5030]: I0121 01:20:53.134609 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f67sg" podStartSLOduration=2.688256163 podStartE2EDuration="5.134588727s" podCreationTimestamp="2026-01-21 01:20:48 +0000 UTC" firstStartedPulling="2026-01-21 01:20:50.076334099 +0000 UTC m=+9922.396594427" lastFinishedPulling="2026-01-21 01:20:52.522666663 +0000 UTC m=+9924.842926991" observedRunningTime="2026-01-21 01:20:53.13010187 +0000 UTC m=+9925.450362208" watchObservedRunningTime="2026-01-21 01:20:53.134588727 +0000 UTC m=+9925.454849035" Jan 21 01:20:59 crc kubenswrapper[5030]: I0121 01:20:59.209529 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:59 crc kubenswrapper[5030]: I0121 01:20:59.210046 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:20:59 crc kubenswrapper[5030]: I0121 01:20:59.291105 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:21:00 crc kubenswrapper[5030]: I0121 01:21:00.207819 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:21:00 crc kubenswrapper[5030]: I0121 01:21:00.248752 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.175664 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f67sg" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="registry-server" containerID="cri-o://3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1" gracePeriod=2 Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.544606 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.617852 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqkz\" (UniqueName: \"kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz\") pod \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.617962 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content\") pod \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.618004 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities\") pod \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\" (UID: \"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2\") " Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.619203 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities" (OuterVolumeSpecName: "utilities") pod "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" (UID: "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.631087 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz" (OuterVolumeSpecName: "kube-api-access-wvqkz") pod "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" (UID: "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2"). InnerVolumeSpecName "kube-api-access-wvqkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.656437 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" (UID: "d4b96dd1-eb89-4a4d-9311-7f35833cc8f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.719472 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvqkz\" (UniqueName: \"kubernetes.io/projected/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-kube-api-access-wvqkz\") on node \"crc\" DevicePath \"\"" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.719521 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:21:02 crc kubenswrapper[5030]: I0121 01:21:02.719535 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.185430 5030 generic.go:334] "Generic (PLEG): container finished" podID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerID="3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1" exitCode=0 Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.185464 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f67sg" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.185522 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerDied","Data":"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1"} Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.186041 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f67sg" event={"ID":"d4b96dd1-eb89-4a4d-9311-7f35833cc8f2","Type":"ContainerDied","Data":"ef2515f509d481ac6a94c17de0a00cbedfb4310babfdd3943d2343fba9b0c98c"} Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.186080 5030 scope.go:117] "RemoveContainer" containerID="3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.218104 5030 scope.go:117] "RemoveContainer" containerID="e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.262084 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.267240 5030 scope.go:117] "RemoveContainer" containerID="c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.275086 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f67sg"] Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.284437 5030 scope.go:117] "RemoveContainer" containerID="3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1" Jan 21 01:21:03 crc kubenswrapper[5030]: E0121 01:21:03.284999 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1\": container with ID starting with 3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1 not found: ID does not exist" containerID="3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.285038 5030 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1"} err="failed to get container status \"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1\": rpc error: code = NotFound desc = could not find container \"3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1\": container with ID starting with 3d83b6307eb76adeaa5b531350cd2a10afeb71ef3bf9c7384761169e29f8e5c1 not found: ID does not exist" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.285063 5030 scope.go:117] "RemoveContainer" containerID="e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975" Jan 21 01:21:03 crc kubenswrapper[5030]: E0121 01:21:03.285790 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975\": container with ID starting with e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975 not found: ID does not exist" containerID="e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.285832 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975"} err="failed to get container status \"e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975\": rpc error: code = NotFound desc = could not find container \"e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975\": container with ID starting with e82ce53bfd9bfb18f6c2da528d63a1c7f100e4bda59e11906b30633bb55fb975 not found: ID does not exist" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.285865 5030 scope.go:117] "RemoveContainer" containerID="c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36" Jan 21 01:21:03 crc kubenswrapper[5030]: E0121 01:21:03.286362 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36\": container with ID starting with c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36 not found: ID does not exist" containerID="c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.286391 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36"} err="failed to get container status \"c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36\": rpc error: code = NotFound desc = could not find container \"c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36\": container with ID starting with c5056deb4fe0b9c97bfe96c558aedc7c1a9a11181df94e6888507dc48f9bee36 not found: ID does not exist" Jan 21 01:21:03 crc kubenswrapper[5030]: I0121 01:21:03.973472 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" path="/var/lib/kubelet/pods/d4b96dd1-eb89-4a4d-9311-7f35833cc8f2/volumes" Jan 21 01:21:06 crc kubenswrapper[5030]: I0121 01:21:06.962853 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:21:06 crc kubenswrapper[5030]: E0121 01:21:06.963807 5030 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:21:20 crc kubenswrapper[5030]: I0121 01:21:20.964500 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:21:20 crc kubenswrapper[5030]: E0121 01:21:20.965828 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:21:35 crc kubenswrapper[5030]: I0121 01:21:35.961904 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:21:35 crc kubenswrapper[5030]: E0121 01:21:35.962758 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:21:47 crc kubenswrapper[5030]: I0121 01:21:47.971045 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:21:47 crc kubenswrapper[5030]: E0121 01:21:47.972436 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:21:58 crc kubenswrapper[5030]: I0121 01:21:58.962961 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:21:58 crc kubenswrapper[5030]: E0121 01:21:58.963780 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:22:09 crc kubenswrapper[5030]: I0121 01:22:09.967518 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:22:09 crc kubenswrapper[5030]: E0121 01:22:09.968858 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:22:20 crc kubenswrapper[5030]: I0121 01:22:20.962613 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:22:20 crc kubenswrapper[5030]: E0121 01:22:20.963648 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:22:33 crc kubenswrapper[5030]: I0121 01:22:33.963795 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:22:33 crc kubenswrapper[5030]: E0121 01:22:33.964584 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:22:44 crc kubenswrapper[5030]: I0121 01:22:44.962929 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:22:44 crc kubenswrapper[5030]: E0121 01:22:44.966243 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:22:57 crc kubenswrapper[5030]: I0121 01:22:57.968927 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:22:57 crc kubenswrapper[5030]: E0121 01:22:57.969908 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:23:12 crc kubenswrapper[5030]: I0121 01:23:12.962517 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:23:12 crc kubenswrapper[5030]: E0121 01:23:12.963943 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:23:25 crc kubenswrapper[5030]: I0121 01:23:25.962890 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:23:25 crc kubenswrapper[5030]: E0121 01:23:25.963929 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:23:40 crc kubenswrapper[5030]: I0121 01:23:40.961795 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:23:40 crc kubenswrapper[5030]: E0121 01:23:40.962667 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:23:52 crc kubenswrapper[5030]: I0121 01:23:52.962755 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:23:52 crc kubenswrapper[5030]: E0121 01:23:52.963970 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:24:05 crc kubenswrapper[5030]: I0121 01:24:05.965413 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:24:05 crc kubenswrapper[5030]: E0121 01:24:05.966399 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:24:20 crc kubenswrapper[5030]: I0121 01:24:20.962378 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:24:20 crc kubenswrapper[5030]: E0121 01:24:20.965612 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:24:32 crc kubenswrapper[5030]: I0121 01:24:32.962180 5030 scope.go:117] "RemoveContainer" 
containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:24:32 crc kubenswrapper[5030]: E0121 01:24:32.964135 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:24:45 crc kubenswrapper[5030]: I0121 01:24:45.963198 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:24:45 crc kubenswrapper[5030]: E0121 01:24:45.963960 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.405970 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:24:59 crc kubenswrapper[5030]: E0121 01:24:59.409489 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="extract-utilities" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.409512 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="extract-utilities" Jan 21 01:24:59 crc kubenswrapper[5030]: E0121 01:24:59.409606 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="extract-content" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.409655 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="extract-content" Jan 21 01:24:59 crc kubenswrapper[5030]: E0121 01:24:59.409684 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="registry-server" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.409693 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="registry-server" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.410336 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b96dd1-eb89-4a4d-9311-7f35833cc8f2" containerName="registry-server" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.416891 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.422984 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.472856 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.472974 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.473068 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjl7\" (UniqueName: \"kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.574746 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.575164 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.575215 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjl7\" (UniqueName: \"kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.575262 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.575657 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.604655 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cnjl7\" (UniqueName: \"kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7\") pod \"community-operators-m29nr\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.745318 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:24:59 crc kubenswrapper[5030]: I0121 01:24:59.963244 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:24:59 crc kubenswrapper[5030]: E0121 01:24:59.963725 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:25:00 crc kubenswrapper[5030]: I0121 01:25:00.323189 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.161770 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.163586 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.173177 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.294601 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerID="b47ed85e85d1f258cd455767850f2e4af2631c5a10d5b29dbc71216a04fd54a5" exitCode=0 Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.294671 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerDied","Data":"b47ed85e85d1f258cd455767850f2e4af2631c5a10d5b29dbc71216a04fd54a5"} Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.294700 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerStarted","Data":"b5268e5c8d6a51a00daf0ca3aa7364c504507e30ca3b2a11d85c749c2695b416"} Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.308137 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.308231 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2zt\" (UniqueName: \"kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt\") pod \"certified-operators-4mfcf\" (UID: 
\"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.308343 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.410180 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.410644 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.410863 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2zt\" (UniqueName: \"kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.410882 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.411409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.439162 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2zt\" (UniqueName: \"kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt\") pod \"certified-operators-4mfcf\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.497033 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:01 crc kubenswrapper[5030]: I0121 01:25:01.755056 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:02 crc kubenswrapper[5030]: I0121 01:25:02.304181 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerID="7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f" exitCode=0 Jan 21 01:25:02 crc kubenswrapper[5030]: I0121 01:25:02.304265 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerDied","Data":"7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f"} Jan 21 01:25:02 crc kubenswrapper[5030]: I0121 01:25:02.304311 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerStarted","Data":"9903d8e266b62e5f318fedb561c21ec11f85acce843b50ec37219b3c420d9e6a"} Jan 21 01:25:02 crc kubenswrapper[5030]: I0121 01:25:02.309186 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerStarted","Data":"01dccb5b19f71f8355c44a4712b9615198d267d276680f62c562dbff1f3ef08d"} Jan 21 01:25:03 crc kubenswrapper[5030]: I0121 01:25:03.318186 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerID="01dccb5b19f71f8355c44a4712b9615198d267d276680f62c562dbff1f3ef08d" exitCode=0 Jan 21 01:25:03 crc kubenswrapper[5030]: I0121 01:25:03.318755 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerDied","Data":"01dccb5b19f71f8355c44a4712b9615198d267d276680f62c562dbff1f3ef08d"} Jan 21 01:25:03 crc kubenswrapper[5030]: I0121 01:25:03.325585 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerStarted","Data":"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac"} Jan 21 01:25:04 crc kubenswrapper[5030]: I0121 01:25:04.337472 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerStarted","Data":"e52a9aeca5fd591753ba5bd5c06d6c4d3702ea97566316601d23a64e99f31edc"} Jan 21 01:25:04 crc kubenswrapper[5030]: I0121 01:25:04.341419 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerID="df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac" exitCode=0 Jan 21 01:25:04 crc kubenswrapper[5030]: I0121 01:25:04.341480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerDied","Data":"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac"} Jan 21 01:25:04 crc kubenswrapper[5030]: I0121 01:25:04.372243 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m29nr" podStartSLOduration=2.849931642 podStartE2EDuration="5.372219692s" 
podCreationTimestamp="2026-01-21 01:24:59 +0000 UTC" firstStartedPulling="2026-01-21 01:25:01.297700543 +0000 UTC m=+10173.617960841" lastFinishedPulling="2026-01-21 01:25:03.819988563 +0000 UTC m=+10176.140248891" observedRunningTime="2026-01-21 01:25:04.363183925 +0000 UTC m=+10176.683444303" watchObservedRunningTime="2026-01-21 01:25:04.372219692 +0000 UTC m=+10176.692480010" Jan 21 01:25:05 crc kubenswrapper[5030]: I0121 01:25:05.350449 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerStarted","Data":"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f"} Jan 21 01:25:05 crc kubenswrapper[5030]: I0121 01:25:05.378305 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4mfcf" podStartSLOduration=1.9001480050000001 podStartE2EDuration="4.378282433s" podCreationTimestamp="2026-01-21 01:25:01 +0000 UTC" firstStartedPulling="2026-01-21 01:25:02.307202627 +0000 UTC m=+10174.627462925" lastFinishedPulling="2026-01-21 01:25:04.785337055 +0000 UTC m=+10177.105597353" observedRunningTime="2026-01-21 01:25:05.370388113 +0000 UTC m=+10177.690648421" watchObservedRunningTime="2026-01-21 01:25:05.378282433 +0000 UTC m=+10177.698542761" Jan 21 01:25:09 crc kubenswrapper[5030]: I0121 01:25:09.745825 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:09 crc kubenswrapper[5030]: I0121 01:25:09.746249 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:09 crc kubenswrapper[5030]: I0121 01:25:09.806378 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:10 crc kubenswrapper[5030]: I0121 01:25:10.476730 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:10 crc kubenswrapper[5030]: I0121 01:25:10.542893 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:25:11 crc kubenswrapper[5030]: I0121 01:25:11.498190 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:11 crc kubenswrapper[5030]: I0121 01:25:11.498292 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:11 crc kubenswrapper[5030]: I0121 01:25:11.555868 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:12 crc kubenswrapper[5030]: I0121 01:25:12.421790 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m29nr" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="registry-server" containerID="cri-o://e52a9aeca5fd591753ba5bd5c06d6c4d3702ea97566316601d23a64e99f31edc" gracePeriod=2 Jan 21 01:25:12 crc kubenswrapper[5030]: I0121 01:25:12.497912 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:12 crc kubenswrapper[5030]: I0121 01:25:12.937584 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.438409 5030 generic.go:334] "Generic (PLEG): container finished" podID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerID="e52a9aeca5fd591753ba5bd5c06d6c4d3702ea97566316601d23a64e99f31edc" exitCode=0 Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.438614 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerDied","Data":"e52a9aeca5fd591753ba5bd5c06d6c4d3702ea97566316601d23a64e99f31edc"} Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.582895 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.613329 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnjl7\" (UniqueName: \"kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7\") pod \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.613724 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities\") pod \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.613820 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content\") pod \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\" (UID: \"7e446e62-77ef-4678-8b95-00d1a93bf5ec\") " Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.629925 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities" (OuterVolumeSpecName: "utilities") pod "7e446e62-77ef-4678-8b95-00d1a93bf5ec" (UID: "7e446e62-77ef-4678-8b95-00d1a93bf5ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.630554 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7" (OuterVolumeSpecName: "kube-api-access-cnjl7") pod "7e446e62-77ef-4678-8b95-00d1a93bf5ec" (UID: "7e446e62-77ef-4678-8b95-00d1a93bf5ec"). InnerVolumeSpecName "kube-api-access-cnjl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.672673 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e446e62-77ef-4678-8b95-00d1a93bf5ec" (UID: "7e446e62-77ef-4678-8b95-00d1a93bf5ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.715429 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.715468 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnjl7\" (UniqueName: \"kubernetes.io/projected/7e446e62-77ef-4678-8b95-00d1a93bf5ec-kube-api-access-cnjl7\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:13 crc kubenswrapper[5030]: I0121 01:25:13.715481 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e446e62-77ef-4678-8b95-00d1a93bf5ec-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.451347 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m29nr" event={"ID":"7e446e62-77ef-4678-8b95-00d1a93bf5ec","Type":"ContainerDied","Data":"b5268e5c8d6a51a00daf0ca3aa7364c504507e30ca3b2a11d85c749c2695b416"} Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.451398 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m29nr" Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.451422 5030 scope.go:117] "RemoveContainer" containerID="e52a9aeca5fd591753ba5bd5c06d6c4d3702ea97566316601d23a64e99f31edc" Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.459878 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4mfcf" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="registry-server" containerID="cri-o://59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f" gracePeriod=2 Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.484976 5030 scope.go:117] "RemoveContainer" containerID="01dccb5b19f71f8355c44a4712b9615198d267d276680f62c562dbff1f3ef08d" Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.492212 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.512527 5030 scope.go:117] "RemoveContainer" containerID="b47ed85e85d1f258cd455767850f2e4af2631c5a10d5b29dbc71216a04fd54a5" Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.520612 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m29nr"] Jan 21 01:25:14 crc kubenswrapper[5030]: I0121 01:25:14.962720 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:25:14 crc kubenswrapper[5030]: E0121 01:25:14.963216 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.429806 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.470829 5030 generic.go:334] "Generic (PLEG): container finished" podID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerID="59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f" exitCode=0 Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.470909 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerDied","Data":"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f"} Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.470963 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4mfcf" event={"ID":"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0","Type":"ContainerDied","Data":"9903d8e266b62e5f318fedb561c21ec11f85acce843b50ec37219b3c420d9e6a"} Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.470987 5030 scope.go:117] "RemoveContainer" containerID="59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.470913 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4mfcf" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.495356 5030 scope.go:117] "RemoveContainer" containerID="df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.515521 5030 scope.go:117] "RemoveContainer" containerID="7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.554508 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv2zt\" (UniqueName: \"kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt\") pod \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.554954 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content\") pod \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.555062 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities\") pod \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\" (UID: \"c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0\") " Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.557229 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities" (OuterVolumeSpecName: "utilities") pod "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" (UID: "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.564160 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt" (OuterVolumeSpecName: "kube-api-access-tv2zt") pod "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" (UID: "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0"). InnerVolumeSpecName "kube-api-access-tv2zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.564761 5030 scope.go:117] "RemoveContainer" containerID="59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f" Jan 21 01:25:15 crc kubenswrapper[5030]: E0121 01:25:15.565279 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f\": container with ID starting with 59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f not found: ID does not exist" containerID="59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.565335 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f"} err="failed to get container status \"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f\": rpc error: code = NotFound desc = could not find container \"59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f\": container with ID starting with 59ef10e8116ac568723b2d907041f35c46e8297b2a061cd7084f8611272fe06f not found: ID does not exist" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.565369 5030 scope.go:117] "RemoveContainer" containerID="df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac" Jan 21 01:25:15 crc kubenswrapper[5030]: E0121 01:25:15.565755 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac\": container with ID starting with df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac not found: ID does not exist" containerID="df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.565799 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac"} err="failed to get container status \"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac\": rpc error: code = NotFound desc = could not find container \"df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac\": container with ID starting with df5ee6c58a912d6c58aad0b2b3379e9341b9f9813a3908e4f017fac526b15bac not found: ID does not exist" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.565832 5030 scope.go:117] "RemoveContainer" containerID="7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f" Jan 21 01:25:15 crc kubenswrapper[5030]: E0121 01:25:15.566182 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f\": container with ID starting with 7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f not found: ID does not 
exist" containerID="7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.566217 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f"} err="failed to get container status \"7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f\": rpc error: code = NotFound desc = could not find container \"7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f\": container with ID starting with 7e3e435344c39a1f2eda15f3bbc4ed801234e60303fb8e4dd8cf311ffa81487f not found: ID does not exist" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.628650 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" (UID: "c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.658434 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv2zt\" (UniqueName: \"kubernetes.io/projected/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-kube-api-access-tv2zt\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.658521 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.658545 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.806269 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.816159 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4mfcf"] Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.971936 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" path="/var/lib/kubelet/pods/7e446e62-77ef-4678-8b95-00d1a93bf5ec/volumes" Jan 21 01:25:15 crc kubenswrapper[5030]: I0121 01:25:15.972567 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" path="/var/lib/kubelet/pods/c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0/volumes" Jan 21 01:25:27 crc kubenswrapper[5030]: I0121 01:25:27.971788 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:25:27 crc kubenswrapper[5030]: E0121 01:25:27.972903 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:25:40 crc kubenswrapper[5030]: I0121 01:25:40.962506 5030 scope.go:117] "RemoveContainer" 
containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:25:41 crc kubenswrapper[5030]: I0121 01:25:41.671975 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d"} Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.573342 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574705 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574725 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574737 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574744 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574755 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="extract-content" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574763 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="extract-content" Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574775 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="extract-utilities" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574784 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="extract-utilities" Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574805 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="extract-utilities" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574811 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="extract-utilities" Jan 21 01:27:57 crc kubenswrapper[5030]: E0121 01:27:57.574821 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="extract-content" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574827 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="extract-content" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574965 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5aaadeb-0122-45cd-a8e5-d2ce0ffec1a0" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.574991 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e446e62-77ef-4678-8b95-00d1a93bf5ec" containerName="registry-server" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.576200 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.585016 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.676140 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvln\" (UniqueName: \"kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.676246 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.676460 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.778945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvln\" (UniqueName: \"kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.779080 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.779143 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.780094 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.780113 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.800227 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2hvln\" (UniqueName: \"kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln\") pod \"redhat-operators-2fhwk\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:57 crc kubenswrapper[5030]: I0121 01:27:57.942479 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:27:58 crc kubenswrapper[5030]: I0121 01:27:58.422706 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:27:58 crc kubenswrapper[5030]: I0121 01:27:58.924062 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerID="ab03579cdbd964cd0ffe94ca9c91bde886c500ccdde5f2e3256e83e7423abf3c" exitCode=0 Jan 21 01:27:58 crc kubenswrapper[5030]: I0121 01:27:58.924269 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerDied","Data":"ab03579cdbd964cd0ffe94ca9c91bde886c500ccdde5f2e3256e83e7423abf3c"} Jan 21 01:27:58 crc kubenswrapper[5030]: I0121 01:27:58.924293 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerStarted","Data":"f917448b01cdd95fd090b94779b81c2ec5fce4fa57651926baa7dda00a4a4746"} Jan 21 01:27:58 crc kubenswrapper[5030]: I0121 01:27:58.925979 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:28:00 crc kubenswrapper[5030]: I0121 01:28:00.945510 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerID="5ddb36f68d71ead9af4aaaa28457848fbf3ff1f89fbb26d9f4c85d9a44167417" exitCode=0 Jan 21 01:28:00 crc kubenswrapper[5030]: I0121 01:28:00.945569 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerDied","Data":"5ddb36f68d71ead9af4aaaa28457848fbf3ff1f89fbb26d9f4c85d9a44167417"} Jan 21 01:28:01 crc kubenswrapper[5030]: I0121 01:28:01.982377 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerStarted","Data":"75f3bfdf6eb2a53f2393c5b73325b4d8128dd21fb762ccbf86cac70722e37412"} Jan 21 01:28:01 crc kubenswrapper[5030]: I0121 01:28:01.990079 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2fhwk" podStartSLOduration=2.496318005 podStartE2EDuration="4.99006233s" podCreationTimestamp="2026-01-21 01:27:57 +0000 UTC" firstStartedPulling="2026-01-21 01:27:58.925792156 +0000 UTC m=+10351.246052444" lastFinishedPulling="2026-01-21 01:28:01.419536441 +0000 UTC m=+10353.739796769" observedRunningTime="2026-01-21 01:28:01.986064697 +0000 UTC m=+10354.306324985" watchObservedRunningTime="2026-01-21 01:28:01.99006233 +0000 UTC m=+10354.310322618" Jan 21 01:28:07 crc kubenswrapper[5030]: I0121 01:28:07.943721 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:07 crc kubenswrapper[5030]: I0121 01:28:07.945094 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:09 crc kubenswrapper[5030]: I0121 01:28:09.019811 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2fhwk" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="registry-server" probeResult="failure" output=< Jan 21 01:28:09 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 21 01:28:09 crc kubenswrapper[5030]: > Jan 21 01:28:10 crc kubenswrapper[5030]: I0121 01:28:10.158877 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:28:10 crc kubenswrapper[5030]: I0121 01:28:10.159018 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:28:18 crc kubenswrapper[5030]: I0121 01:28:18.025387 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:18 crc kubenswrapper[5030]: I0121 01:28:18.113179 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:18 crc kubenswrapper[5030]: I0121 01:28:18.282491 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:28:19 crc kubenswrapper[5030]: I0121 01:28:19.129443 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2fhwk" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="registry-server" containerID="cri-o://75f3bfdf6eb2a53f2393c5b73325b4d8128dd21fb762ccbf86cac70722e37412" gracePeriod=2 Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.154408 5030 generic.go:334] "Generic (PLEG): container finished" podID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerID="75f3bfdf6eb2a53f2393c5b73325b4d8128dd21fb762ccbf86cac70722e37412" exitCode=0 Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.154480 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerDied","Data":"75f3bfdf6eb2a53f2393c5b73325b4d8128dd21fb762ccbf86cac70722e37412"} Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.319558 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.358074 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hvln\" (UniqueName: \"kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln\") pod \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.358200 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content\") pod \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.358561 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities\") pod \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\" (UID: \"7d88c20d-3b2a-457c-a586-3915ce1d4cd5\") " Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.359488 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities" (OuterVolumeSpecName: "utilities") pod "7d88c20d-3b2a-457c-a586-3915ce1d4cd5" (UID: "7d88c20d-3b2a-457c-a586-3915ce1d4cd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.364482 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln" (OuterVolumeSpecName: "kube-api-access-2hvln") pod "7d88c20d-3b2a-457c-a586-3915ce1d4cd5" (UID: "7d88c20d-3b2a-457c-a586-3915ce1d4cd5"). InnerVolumeSpecName "kube-api-access-2hvln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.460063 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.460111 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hvln\" (UniqueName: \"kubernetes.io/projected/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-kube-api-access-2hvln\") on node \"crc\" DevicePath \"\"" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.517210 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d88c20d-3b2a-457c-a586-3915ce1d4cd5" (UID: "7d88c20d-3b2a-457c-a586-3915ce1d4cd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:28:21 crc kubenswrapper[5030]: I0121 01:28:21.561383 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d88c20d-3b2a-457c-a586-3915ce1d4cd5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.164425 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fhwk" event={"ID":"7d88c20d-3b2a-457c-a586-3915ce1d4cd5","Type":"ContainerDied","Data":"f917448b01cdd95fd090b94779b81c2ec5fce4fa57651926baa7dda00a4a4746"} Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.164498 5030 scope.go:117] "RemoveContainer" containerID="75f3bfdf6eb2a53f2393c5b73325b4d8128dd21fb762ccbf86cac70722e37412" Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.164495 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2fhwk" Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.195379 5030 scope.go:117] "RemoveContainer" containerID="5ddb36f68d71ead9af4aaaa28457848fbf3ff1f89fbb26d9f4c85d9a44167417" Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.196742 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.209506 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2fhwk"] Jan 21 01:28:22 crc kubenswrapper[5030]: I0121 01:28:22.224476 5030 scope.go:117] "RemoveContainer" containerID="ab03579cdbd964cd0ffe94ca9c91bde886c500ccdde5f2e3256e83e7423abf3c" Jan 21 01:28:23 crc kubenswrapper[5030]: I0121 01:28:23.975960 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" path="/var/lib/kubelet/pods/7d88c20d-3b2a-457c-a586-3915ce1d4cd5/volumes" Jan 21 01:28:40 crc kubenswrapper[5030]: I0121 01:28:40.157266 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:28:40 crc kubenswrapper[5030]: I0121 01:28:40.157821 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.157358 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.157918 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.157974 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.158608 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.158694 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d" gracePeriod=600 Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.548024 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d" exitCode=0 Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.548085 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d"} Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.548357 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef"} Jan 21 01:29:10 crc kubenswrapper[5030]: I0121 01:29:10.548377 5030 scope.go:117] "RemoveContainer" containerID="bc93575931f809b191e9277765863c935d818b0da18de167ce7862f2933a3ac8" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.163457 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d"] Jan 21 01:30:00 crc kubenswrapper[5030]: E0121 01:30:00.164506 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="extract-content" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.164528 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="extract-content" Jan 21 01:30:00 crc kubenswrapper[5030]: E0121 01:30:00.164577 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="extract-utilities" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.164589 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="extract-utilities" Jan 21 01:30:00 crc kubenswrapper[5030]: E0121 01:30:00.164606 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="registry-server" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.164641 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="registry-server" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.164853 5030 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7d88c20d-3b2a-457c-a586-3915ce1d4cd5" containerName="registry-server" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.165556 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.168885 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.175917 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.177379 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghxvw\" (UniqueName: \"kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.177845 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.177953 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d"] Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.178494 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.293951 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.294120 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghxvw\" (UniqueName: \"kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.294264 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 
01:30:00.298137 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.304648 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.320409 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghxvw\" (UniqueName: \"kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw\") pod \"collect-profiles-29482650-wxz8d\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.491684 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:00 crc kubenswrapper[5030]: I0121 01:30:00.972240 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d"] Jan 21 01:30:01 crc kubenswrapper[5030]: I0121 01:30:01.001830 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" event={"ID":"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec","Type":"ContainerStarted","Data":"70c3fcc2f53d6b00c37fc3fdfb394244045edd515deb170e288a948fba0ec198"} Jan 21 01:30:02 crc kubenswrapper[5030]: I0121 01:30:02.012086 5030 generic.go:334] "Generic (PLEG): container finished" podID="d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" containerID="82c50539eff5b026531993028d6e94266e1e8a9b8f54e039f36392fb287fceaf" exitCode=0 Jan 21 01:30:02 crc kubenswrapper[5030]: I0121 01:30:02.012242 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" event={"ID":"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec","Type":"ContainerDied","Data":"82c50539eff5b026531993028d6e94266e1e8a9b8f54e039f36392fb287fceaf"} Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.443776 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.547684 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume\") pod \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.547808 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghxvw\" (UniqueName: \"kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw\") pod \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.547873 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume\") pod \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\" (UID: \"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec\") " Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.548760 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume" (OuterVolumeSpecName: "config-volume") pod "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" (UID: "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.567963 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw" (OuterVolumeSpecName: "kube-api-access-ghxvw") pod "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" (UID: "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec"). InnerVolumeSpecName "kube-api-access-ghxvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.574281 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" (UID: "d18143a0-b4fd-4f0c-bbd4-da67bcd87bec"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.648864 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.648898 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghxvw\" (UniqueName: \"kubernetes.io/projected/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-kube-api-access-ghxvw\") on node \"crc\" DevicePath \"\"" Jan 21 01:30:03 crc kubenswrapper[5030]: I0121 01:30:03.648910 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d18143a0-b4fd-4f0c-bbd4-da67bcd87bec-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:30:04 crc kubenswrapper[5030]: I0121 01:30:04.029233 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" event={"ID":"d18143a0-b4fd-4f0c-bbd4-da67bcd87bec","Type":"ContainerDied","Data":"70c3fcc2f53d6b00c37fc3fdfb394244045edd515deb170e288a948fba0ec198"} Jan 21 01:30:04 crc kubenswrapper[5030]: I0121 01:30:04.029601 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70c3fcc2f53d6b00c37fc3fdfb394244045edd515deb170e288a948fba0ec198" Jan 21 01:30:04 crc kubenswrapper[5030]: I0121 01:30:04.029499 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482650-wxz8d" Jan 21 01:30:04 crc kubenswrapper[5030]: I0121 01:30:04.552614 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns"] Jan 21 01:30:04 crc kubenswrapper[5030]: I0121 01:30:04.561148 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482605-cs5ns"] Jan 21 01:30:05 crc kubenswrapper[5030]: I0121 01:30:05.970991 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="786c9aee-56d9-48c8-8a1d-eafb58602db2" path="/var/lib/kubelet/pods/786c9aee-56d9-48c8-8a1d-eafb58602db2/volumes" Jan 21 01:30:41 crc kubenswrapper[5030]: I0121 01:30:41.210807 5030 scope.go:117] "RemoveContainer" containerID="01a00cdffaa9716dc9aab8d7a0ade65f03582010d732231552948440d7f5f5f5" Jan 21 01:31:10 crc kubenswrapper[5030]: I0121 01:31:10.157239 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:31:10 crc kubenswrapper[5030]: I0121 01:31:10.157964 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:31:40 crc kubenswrapper[5030]: I0121 01:31:40.157191 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 21 01:31:40 crc kubenswrapper[5030]: I0121 01:31:40.157837 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:32:10 crc kubenswrapper[5030]: I0121 01:32:10.157145 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:32:10 crc kubenswrapper[5030]: I0121 01:32:10.157825 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:32:10 crc kubenswrapper[5030]: I0121 01:32:10.157912 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:32:10 crc kubenswrapper[5030]: I0121 01:32:10.158933 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:32:10 crc kubenswrapper[5030]: I0121 01:32:10.159018 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" gracePeriod=600 Jan 21 01:32:10 crc kubenswrapper[5030]: E0121 01:32:10.310819 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:32:11 crc kubenswrapper[5030]: I0121 01:32:11.157701 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" exitCode=0 Jan 21 01:32:11 crc kubenswrapper[5030]: I0121 01:32:11.157783 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef"} Jan 21 01:32:11 crc kubenswrapper[5030]: I0121 01:32:11.159003 5030 scope.go:117] "RemoveContainer" containerID="78b1da2cde9787cf2c664e3bd56ef6fd9a12427f788989cc55758f5210bb8b0d" Jan 21 01:32:11 crc kubenswrapper[5030]: I0121 01:32:11.159761 5030 scope.go:117] "RemoveContainer" 
containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:32:11 crc kubenswrapper[5030]: E0121 01:32:11.160260 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:32:23 crc kubenswrapper[5030]: I0121 01:32:23.962328 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:32:23 crc kubenswrapper[5030]: E0121 01:32:23.963713 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:32:38 crc kubenswrapper[5030]: I0121 01:32:38.962743 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:32:38 crc kubenswrapper[5030]: E0121 01:32:38.963706 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:32:52 crc kubenswrapper[5030]: I0121 01:32:52.962160 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:32:52 crc kubenswrapper[5030]: E0121 01:32:52.963161 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:33:03 crc kubenswrapper[5030]: I0121 01:33:03.962651 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:33:03 crc kubenswrapper[5030]: E0121 01:33:03.963284 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:33:16 crc kubenswrapper[5030]: I0121 01:33:16.963472 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:33:16 crc kubenswrapper[5030]: E0121 01:33:16.964470 5030 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:33:27 crc kubenswrapper[5030]: I0121 01:33:27.968498 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:33:27 crc kubenswrapper[5030]: E0121 01:33:27.969453 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:33:40 crc kubenswrapper[5030]: I0121 01:33:40.962734 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:33:40 crc kubenswrapper[5030]: E0121 01:33:40.963723 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:33:55 crc kubenswrapper[5030]: I0121 01:33:55.968950 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:33:55 crc kubenswrapper[5030]: E0121 01:33:55.969907 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:34:09 crc kubenswrapper[5030]: I0121 01:34:09.977974 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:34:09 crc kubenswrapper[5030]: E0121 01:34:09.979392 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:34:24 crc kubenswrapper[5030]: I0121 01:34:24.963290 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:34:24 crc kubenswrapper[5030]: E0121 01:34:24.964715 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:34:37 crc kubenswrapper[5030]: I0121 01:34:37.966130 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:34:37 crc kubenswrapper[5030]: E0121 01:34:37.966948 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:34:49 crc kubenswrapper[5030]: I0121 01:34:49.965330 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:34:49 crc kubenswrapper[5030]: E0121 01:34:49.965922 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:00 crc kubenswrapper[5030]: I0121 01:35:00.962793 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:00 crc kubenswrapper[5030]: E0121 01:35:00.964045 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:11 crc kubenswrapper[5030]: I0121 01:35:11.969061 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:11 crc kubenswrapper[5030]: E0121 01:35:11.970003 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:22 crc kubenswrapper[5030]: I0121 01:35:22.962544 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:22 crc kubenswrapper[5030]: E0121 01:35:22.966030 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" 
podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.468189 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:30 crc kubenswrapper[5030]: E0121 01:35:30.469198 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" containerName="collect-profiles" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.469221 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" containerName="collect-profiles" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.469503 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18143a0-b4fd-4f0c-bbd4-da67bcd87bec" containerName="collect-profiles" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.471534 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.558120 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.580382 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.580641 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.580725 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6g4\" (UniqueName: \"kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.682345 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.682427 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.682495 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6g4\" (UniqueName: \"kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4\") pod \"certified-operators-mwgtt\" (UID: 
\"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.682944 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.684136 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.709556 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6g4\" (UniqueName: \"kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4\") pod \"certified-operators-mwgtt\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:30 crc kubenswrapper[5030]: I0121 01:35:30.847269 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:31 crc kubenswrapper[5030]: I0121 01:35:31.270447 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:31 crc kubenswrapper[5030]: I0121 01:35:31.843490 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerID="f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4" exitCode=0 Jan 21 01:35:31 crc kubenswrapper[5030]: I0121 01:35:31.843585 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerDied","Data":"f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4"} Jan 21 01:35:31 crc kubenswrapper[5030]: I0121 01:35:31.843772 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerStarted","Data":"098679e8940176542f9d9393825cadf51d9671e766005cc5a7509e579939224f"} Jan 21 01:35:31 crc kubenswrapper[5030]: I0121 01:35:31.845779 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:35:32 crc kubenswrapper[5030]: I0121 01:35:32.854335 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerStarted","Data":"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70"} Jan 21 01:35:33 crc kubenswrapper[5030]: I0121 01:35:33.866798 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerID="e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70" exitCode=0 Jan 21 01:35:33 crc kubenswrapper[5030]: I0121 01:35:33.866886 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" 
event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerDied","Data":"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70"} Jan 21 01:35:34 crc kubenswrapper[5030]: I0121 01:35:34.876173 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerStarted","Data":"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1"} Jan 21 01:35:34 crc kubenswrapper[5030]: I0121 01:35:34.904543 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwgtt" podStartSLOduration=2.262217261 podStartE2EDuration="4.904527664s" podCreationTimestamp="2026-01-21 01:35:30 +0000 UTC" firstStartedPulling="2026-01-21 01:35:31.845459042 +0000 UTC m=+10804.165719340" lastFinishedPulling="2026-01-21 01:35:34.487769415 +0000 UTC m=+10806.808029743" observedRunningTime="2026-01-21 01:35:34.900929259 +0000 UTC m=+10807.221189567" watchObservedRunningTime="2026-01-21 01:35:34.904527664 +0000 UTC m=+10807.224787952" Jan 21 01:35:36 crc kubenswrapper[5030]: I0121 01:35:36.962451 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:36 crc kubenswrapper[5030]: E0121 01:35:36.963187 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:40 crc kubenswrapper[5030]: I0121 01:35:40.848048 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:40 crc kubenswrapper[5030]: I0121 01:35:40.848405 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:40 crc kubenswrapper[5030]: I0121 01:35:40.923545 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:41 crc kubenswrapper[5030]: I0121 01:35:41.001061 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:41 crc kubenswrapper[5030]: I0121 01:35:41.169750 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:42 crc kubenswrapper[5030]: I0121 01:35:42.940593 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwgtt" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="registry-server" containerID="cri-o://3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1" gracePeriod=2 Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.575805 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.622296 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk6g4\" (UniqueName: \"kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4\") pod \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.622689 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities\") pod \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.622719 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content\") pod \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\" (UID: \"6f9a1d2c-b662-48fa-836d-e8eba7a62017\") " Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.636566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4" (OuterVolumeSpecName: "kube-api-access-zk6g4") pod "6f9a1d2c-b662-48fa-836d-e8eba7a62017" (UID: "6f9a1d2c-b662-48fa-836d-e8eba7a62017"). InnerVolumeSpecName "kube-api-access-zk6g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.644614 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities" (OuterVolumeSpecName: "utilities") pod "6f9a1d2c-b662-48fa-836d-e8eba7a62017" (UID: "6f9a1d2c-b662-48fa-836d-e8eba7a62017"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.666458 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9a1d2c-b662-48fa-836d-e8eba7a62017" (UID: "6f9a1d2c-b662-48fa-836d-e8eba7a62017"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.724538 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk6g4\" (UniqueName: \"kubernetes.io/projected/6f9a1d2c-b662-48fa-836d-e8eba7a62017-kube-api-access-zk6g4\") on node \"crc\" DevicePath \"\"" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.724596 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.724653 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a1d2c-b662-48fa-836d-e8eba7a62017-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.951494 5030 generic.go:334] "Generic (PLEG): container finished" podID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerID="3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1" exitCode=0 Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.952412 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerDied","Data":"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1"} Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.952521 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgtt" event={"ID":"6f9a1d2c-b662-48fa-836d-e8eba7a62017","Type":"ContainerDied","Data":"098679e8940176542f9d9393825cadf51d9671e766005cc5a7509e579939224f"} Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.952627 5030 scope.go:117] "RemoveContainer" containerID="3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.952808 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgtt" Jan 21 01:35:43 crc kubenswrapper[5030]: I0121 01:35:43.995453 5030 scope.go:117] "RemoveContainer" containerID="e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.028850 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.028976 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwgtt"] Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.039519 5030 scope.go:117] "RemoveContainer" containerID="f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.068840 5030 scope.go:117] "RemoveContainer" containerID="3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1" Jan 21 01:35:44 crc kubenswrapper[5030]: E0121 01:35:44.069488 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1\": container with ID starting with 3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1 not found: ID does not exist" containerID="3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.069545 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1"} err="failed to get container status \"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1\": rpc error: code = NotFound desc = could not find container \"3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1\": container with ID starting with 3f91f5b12fe398b1d353c9fac44f79ce39be1e36662b5dea605bc1d2fc7bd2c1 not found: ID does not exist" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.069579 5030 scope.go:117] "RemoveContainer" containerID="e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70" Jan 21 01:35:44 crc kubenswrapper[5030]: E0121 01:35:44.070283 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70\": container with ID starting with e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70 not found: ID does not exist" containerID="e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.070342 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70"} err="failed to get container status \"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70\": rpc error: code = NotFound desc = could not find container \"e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70\": container with ID starting with e27cf2ad710f769703f7223473b5adbbac87a151987da14d992206c1d74d7c70 not found: ID does not exist" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.070379 5030 scope.go:117] "RemoveContainer" containerID="f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4" Jan 21 01:35:44 crc kubenswrapper[5030]: E0121 01:35:44.071144 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4\": container with ID starting with f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4 not found: ID does not exist" containerID="f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4" Jan 21 01:35:44 crc kubenswrapper[5030]: I0121 01:35:44.071264 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4"} err="failed to get container status \"f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4\": rpc error: code = NotFound desc = could not find container \"f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4\": container with ID starting with f1361c7a660f768e4d46090fbd282ef73e47586a95bb6feb1e1cbc35d897d7f4 not found: ID does not exist" Jan 21 01:35:45 crc kubenswrapper[5030]: I0121 01:35:45.977794 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" path="/var/lib/kubelet/pods/6f9a1d2c-b662-48fa-836d-e8eba7a62017/volumes" Jan 21 01:35:47 crc kubenswrapper[5030]: I0121 01:35:47.965714 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:47 crc kubenswrapper[5030]: E0121 01:35:47.965923 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:35:59 crc kubenswrapper[5030]: I0121 01:35:59.963247 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:35:59 crc kubenswrapper[5030]: E0121 01:35:59.965934 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:36:10 crc kubenswrapper[5030]: I0121 01:36:10.961603 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:36:10 crc kubenswrapper[5030]: E0121 01:36:10.962553 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.851724 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:14 crc kubenswrapper[5030]: E0121 01:36:14.853153 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" 
containerName="extract-content" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.853186 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="extract-content" Jan 21 01:36:14 crc kubenswrapper[5030]: E0121 01:36:14.853229 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="registry-server" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.853246 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="registry-server" Jan 21 01:36:14 crc kubenswrapper[5030]: E0121 01:36:14.853291 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="extract-utilities" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.853308 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="extract-utilities" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.853553 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9a1d2c-b662-48fa-836d-e8eba7a62017" containerName="registry-server" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.855234 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.892949 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.940955 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.941009 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wnl\" (UniqueName: \"kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:14 crc kubenswrapper[5030]: I0121 01:36:14.941116 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.042682 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.042747 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities\") pod \"community-operators-b57xt\" (UID: 
\"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.042773 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wnl\" (UniqueName: \"kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.043439 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.043662 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.065542 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wnl\" (UniqueName: \"kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl\") pod \"community-operators-b57xt\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.192925 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:15 crc kubenswrapper[5030]: I0121 01:36:15.489851 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:16 crc kubenswrapper[5030]: I0121 01:36:16.218725 5030 generic.go:334] "Generic (PLEG): container finished" podID="a8326b13-8186-4729-a44c-45c360697dec" containerID="ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2" exitCode=0 Jan 21 01:36:16 crc kubenswrapper[5030]: I0121 01:36:16.218843 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerDied","Data":"ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2"} Jan 21 01:36:16 crc kubenswrapper[5030]: I0121 01:36:16.219151 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerStarted","Data":"5842f20006b44bacfa1c9aa498653e71a55bd687c4956e292bcdfe007e399f81"} Jan 21 01:36:17 crc kubenswrapper[5030]: I0121 01:36:17.229715 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerStarted","Data":"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab"} Jan 21 01:36:18 crc kubenswrapper[5030]: I0121 01:36:18.241325 5030 generic.go:334] "Generic (PLEG): container finished" podID="a8326b13-8186-4729-a44c-45c360697dec" containerID="0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab" exitCode=0 Jan 21 01:36:18 crc kubenswrapper[5030]: I0121 01:36:18.241443 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerDied","Data":"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab"} Jan 21 01:36:19 crc kubenswrapper[5030]: I0121 01:36:19.250213 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerStarted","Data":"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7"} Jan 21 01:36:19 crc kubenswrapper[5030]: I0121 01:36:19.271952 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b57xt" podStartSLOduration=2.738469985 podStartE2EDuration="5.271932039s" podCreationTimestamp="2026-01-21 01:36:14 +0000 UTC" firstStartedPulling="2026-01-21 01:36:16.222435143 +0000 UTC m=+10848.542695471" lastFinishedPulling="2026-01-21 01:36:18.755897197 +0000 UTC m=+10851.076157525" observedRunningTime="2026-01-21 01:36:19.266352157 +0000 UTC m=+10851.586612455" watchObservedRunningTime="2026-01-21 01:36:19.271932039 +0000 UTC m=+10851.592192337" Jan 21 01:36:21 crc kubenswrapper[5030]: I0121 01:36:21.966167 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:36:21 crc kubenswrapper[5030]: E0121 01:36:21.967042 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:36:25 crc kubenswrapper[5030]: I0121 01:36:25.193560 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:25 crc kubenswrapper[5030]: I0121 01:36:25.193952 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:25 crc kubenswrapper[5030]: I0121 01:36:25.278748 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:25 crc kubenswrapper[5030]: I0121 01:36:25.357070 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:25 crc kubenswrapper[5030]: I0121 01:36:25.528418 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:27 crc kubenswrapper[5030]: I0121 01:36:27.309327 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b57xt" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="registry-server" containerID="cri-o://83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7" gracePeriod=2 Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.339448 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.354437 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerDied","Data":"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7"} Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.354514 5030 scope.go:117] "RemoveContainer" containerID="83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.354347 5030 generic.go:334] "Generic (PLEG): container finished" podID="a8326b13-8186-4729-a44c-45c360697dec" containerID="83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7" exitCode=0 Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.354649 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b57xt" event={"ID":"a8326b13-8186-4729-a44c-45c360697dec","Type":"ContainerDied","Data":"5842f20006b44bacfa1c9aa498653e71a55bd687c4956e292bcdfe007e399f81"} Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.354710 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b57xt" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.387836 5030 scope.go:117] "RemoveContainer" containerID="0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.415328 5030 scope.go:117] "RemoveContainer" containerID="ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.437205 5030 scope.go:117] "RemoveContainer" containerID="83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7" Jan 21 01:36:28 crc kubenswrapper[5030]: E0121 01:36:28.441162 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7\": container with ID starting with 83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7 not found: ID does not exist" containerID="83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.441297 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7"} err="failed to get container status \"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7\": rpc error: code = NotFound desc = could not find container \"83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7\": container with ID starting with 83c027e2ae00bc25b90e37c2a42019045a7837707060d7a1e4ccfd3d0307cec7 not found: ID does not exist" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.441393 5030 scope.go:117] "RemoveContainer" containerID="0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.441361 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5wnl\" (UniqueName: \"kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl\") pod \"a8326b13-8186-4729-a44c-45c360697dec\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.441795 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities\") pod \"a8326b13-8186-4729-a44c-45c360697dec\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.441921 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content\") pod \"a8326b13-8186-4729-a44c-45c360697dec\" (UID: \"a8326b13-8186-4729-a44c-45c360697dec\") " Jan 21 01:36:28 crc kubenswrapper[5030]: E0121 01:36:28.446956 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab\": container with ID starting with 0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab not found: ID does not exist" containerID="0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.446994 5030 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab"} err="failed to get container status \"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab\": rpc error: code = NotFound desc = could not find container \"0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab\": container with ID starting with 0f66609b2de37742c7b2e5f3971d8c440c67d947869c4264d88b15fd91f67cab not found: ID does not exist" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.447013 5030 scope.go:117] "RemoveContainer" containerID="ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2" Jan 21 01:36:28 crc kubenswrapper[5030]: E0121 01:36:28.447468 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2\": container with ID starting with ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2 not found: ID does not exist" containerID="ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.447589 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2"} err="failed to get container status \"ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2\": rpc error: code = NotFound desc = could not find container \"ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2\": container with ID starting with ae4c517caafdac0a7eb5e73a8856b33e5185e0caa09706d47997f94d47910fc2 not found: ID does not exist" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.447787 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities" (OuterVolumeSpecName: "utilities") pod "a8326b13-8186-4729-a44c-45c360697dec" (UID: "a8326b13-8186-4729-a44c-45c360697dec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.448457 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl" (OuterVolumeSpecName: "kube-api-access-c5wnl") pod "a8326b13-8186-4729-a44c-45c360697dec" (UID: "a8326b13-8186-4729-a44c-45c360697dec"). InnerVolumeSpecName "kube-api-access-c5wnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.520211 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8326b13-8186-4729-a44c-45c360697dec" (UID: "a8326b13-8186-4729-a44c-45c360697dec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.544593 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.544729 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5wnl\" (UniqueName: \"kubernetes.io/projected/a8326b13-8186-4729-a44c-45c360697dec-kube-api-access-c5wnl\") on node \"crc\" DevicePath \"\"" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.544752 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8326b13-8186-4729-a44c-45c360697dec-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.691172 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:28 crc kubenswrapper[5030]: I0121 01:36:28.711694 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b57xt"] Jan 21 01:36:29 crc kubenswrapper[5030]: I0121 01:36:29.979325 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8326b13-8186-4729-a44c-45c360697dec" path="/var/lib/kubelet/pods/a8326b13-8186-4729-a44c-45c360697dec/volumes" Jan 21 01:36:32 crc kubenswrapper[5030]: I0121 01:36:32.962959 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:36:32 crc kubenswrapper[5030]: E0121 01:36:32.963689 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:36:47 crc kubenswrapper[5030]: I0121 01:36:47.966947 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:36:47 crc kubenswrapper[5030]: E0121 01:36:47.967686 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:37:02 crc kubenswrapper[5030]: I0121 01:37:02.962151 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:37:02 crc kubenswrapper[5030]: E0121 01:37:02.963255 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:37:15 crc kubenswrapper[5030]: I0121 01:37:15.969913 5030 
scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:37:16 crc kubenswrapper[5030]: I0121 01:37:16.803762 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad"} Jan 21 01:39:40 crc kubenswrapper[5030]: I0121 01:39:40.157848 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:39:40 crc kubenswrapper[5030]: I0121 01:39:40.158608 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:40:10 crc kubenswrapper[5030]: I0121 01:40:10.157557 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:40:10 crc kubenswrapper[5030]: I0121 01:40:10.158096 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.329872 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:14 crc kubenswrapper[5030]: E0121 01:40:14.330848 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="registry-server" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.330879 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="registry-server" Jan 21 01:40:14 crc kubenswrapper[5030]: E0121 01:40:14.330927 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="extract-content" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.330943 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="extract-content" Jan 21 01:40:14 crc kubenswrapper[5030]: E0121 01:40:14.330987 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="extract-utilities" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.331004 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="extract-utilities" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.331319 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8326b13-8186-4729-a44c-45c360697dec" containerName="registry-server" Jan 21 01:40:14 crc kubenswrapper[5030]: 
I0121 01:40:14.332895 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.350023 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.437042 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.437736 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.437799 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx4h2\" (UniqueName: \"kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.539381 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.539449 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.539502 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx4h2\" (UniqueName: \"kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.540143 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.540425 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: 
I0121 01:40:14.559012 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx4h2\" (UniqueName: \"kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2\") pod \"redhat-marketplace-kwrwv\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:14 crc kubenswrapper[5030]: I0121 01:40:14.714668 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:15 crc kubenswrapper[5030]: I0121 01:40:15.004085 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:15 crc kubenswrapper[5030]: I0121 01:40:15.405762 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerID="178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d" exitCode=0 Jan 21 01:40:15 crc kubenswrapper[5030]: I0121 01:40:15.405837 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerDied","Data":"178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d"} Jan 21 01:40:15 crc kubenswrapper[5030]: I0121 01:40:15.405889 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerStarted","Data":"d4aa552176b01431c97207d9c16b5757277e0cbee2cb8804a2dfe21e6b5d5179"} Jan 21 01:40:16 crc kubenswrapper[5030]: I0121 01:40:16.415870 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerStarted","Data":"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab"} Jan 21 01:40:17 crc kubenswrapper[5030]: I0121 01:40:17.426742 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerID="721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab" exitCode=0 Jan 21 01:40:17 crc kubenswrapper[5030]: I0121 01:40:17.426796 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerDied","Data":"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab"} Jan 21 01:40:18 crc kubenswrapper[5030]: I0121 01:40:18.438176 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerStarted","Data":"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582"} Jan 21 01:40:18 crc kubenswrapper[5030]: I0121 01:40:18.474353 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwrwv" podStartSLOduration=2.023884501 podStartE2EDuration="4.474332831s" podCreationTimestamp="2026-01-21 01:40:14 +0000 UTC" firstStartedPulling="2026-01-21 01:40:15.408573345 +0000 UTC m=+11087.728833673" lastFinishedPulling="2026-01-21 01:40:17.859021675 +0000 UTC m=+11090.179282003" observedRunningTime="2026-01-21 01:40:18.463422438 +0000 UTC m=+11090.783682766" watchObservedRunningTime="2026-01-21 01:40:18.474332831 +0000 UTC m=+11090.794593159" Jan 21 01:40:24 crc kubenswrapper[5030]: I0121 01:40:24.715183 5030 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:24 crc kubenswrapper[5030]: I0121 01:40:24.717118 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:24 crc kubenswrapper[5030]: I0121 01:40:24.768232 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:25 crc kubenswrapper[5030]: I0121 01:40:25.560256 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:25 crc kubenswrapper[5030]: I0121 01:40:25.615285 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:27 crc kubenswrapper[5030]: I0121 01:40:27.509409 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwrwv" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="registry-server" containerID="cri-o://e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582" gracePeriod=2 Jan 21 01:40:27 crc kubenswrapper[5030]: I0121 01:40:27.958349 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.063159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content\") pod \"b0723085-dd4f-4e49-a833-49aa99e0f34a\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.063247 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities\") pod \"b0723085-dd4f-4e49-a833-49aa99e0f34a\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.063279 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx4h2\" (UniqueName: \"kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2\") pod \"b0723085-dd4f-4e49-a833-49aa99e0f34a\" (UID: \"b0723085-dd4f-4e49-a833-49aa99e0f34a\") " Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.064151 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities" (OuterVolumeSpecName: "utilities") pod "b0723085-dd4f-4e49-a833-49aa99e0f34a" (UID: "b0723085-dd4f-4e49-a833-49aa99e0f34a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.068660 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2" (OuterVolumeSpecName: "kube-api-access-zx4h2") pod "b0723085-dd4f-4e49-a833-49aa99e0f34a" (UID: "b0723085-dd4f-4e49-a833-49aa99e0f34a"). InnerVolumeSpecName "kube-api-access-zx4h2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.118895 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0723085-dd4f-4e49-a833-49aa99e0f34a" (UID: "b0723085-dd4f-4e49-a833-49aa99e0f34a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.165571 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.165681 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0723085-dd4f-4e49-a833-49aa99e0f34a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.165709 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx4h2\" (UniqueName: \"kubernetes.io/projected/b0723085-dd4f-4e49-a833-49aa99e0f34a-kube-api-access-zx4h2\") on node \"crc\" DevicePath \"\"" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.519325 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwrwv" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.519351 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerDied","Data":"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582"} Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.519312 5030 generic.go:334] "Generic (PLEG): container finished" podID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerID="e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582" exitCode=0 Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.519497 5030 scope.go:117] "RemoveContainer" containerID="e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.519519 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwrwv" event={"ID":"b0723085-dd4f-4e49-a833-49aa99e0f34a","Type":"ContainerDied","Data":"d4aa552176b01431c97207d9c16b5757277e0cbee2cb8804a2dfe21e6b5d5179"} Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.562462 5030 scope.go:117] "RemoveContainer" containerID="721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.563151 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.574267 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwrwv"] Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.576834 5030 scope.go:117] "RemoveContainer" containerID="178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.606241 5030 scope.go:117] "RemoveContainer" containerID="e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582" Jan 21 01:40:28 crc kubenswrapper[5030]: E0121 01:40:28.606798 5030 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582\": container with ID starting with e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582 not found: ID does not exist" containerID="e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.606846 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582"} err="failed to get container status \"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582\": rpc error: code = NotFound desc = could not find container \"e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582\": container with ID starting with e7813da92af639279a47063dbbbc41e9724f95de07eb9a98b0369fc7addec582 not found: ID does not exist" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.606873 5030 scope.go:117] "RemoveContainer" containerID="721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab" Jan 21 01:40:28 crc kubenswrapper[5030]: E0121 01:40:28.607391 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab\": container with ID starting with 721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab not found: ID does not exist" containerID="721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.607456 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab"} err="failed to get container status \"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab\": rpc error: code = NotFound desc = could not find container \"721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab\": container with ID starting with 721c67cd82795a1036c4ba2798a93b788f9397b4a73aea64a7249abb474627ab not found: ID does not exist" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.607495 5030 scope.go:117] "RemoveContainer" containerID="178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d" Jan 21 01:40:28 crc kubenswrapper[5030]: E0121 01:40:28.608060 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d\": container with ID starting with 178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d not found: ID does not exist" containerID="178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d" Jan 21 01:40:28 crc kubenswrapper[5030]: I0121 01:40:28.608097 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d"} err="failed to get container status \"178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d\": rpc error: code = NotFound desc = could not find container \"178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d\": container with ID starting with 178404202db1d59146b10c4055d7ada6395d53e8209b812d35a1b4a66f50370d not found: ID does not exist" Jan 21 01:40:29 crc kubenswrapper[5030]: I0121 01:40:29.976963 5030 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" path="/var/lib/kubelet/pods/b0723085-dd4f-4e49-a833-49aa99e0f34a/volumes" Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.157688 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.158600 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.158734 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.160065 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.160186 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad" gracePeriod=600 Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.622253 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad" exitCode=0 Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.622455 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad"} Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.622773 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerStarted","Data":"40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9"} Jan 21 01:40:40 crc kubenswrapper[5030]: I0121 01:40:40.622815 5030 scope.go:117] "RemoveContainer" containerID="a1fe6f517345dfcf1f9dc17faa47883b634fffe0db61f8af8c8df57202745bef" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.383897 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:41:53 crc kubenswrapper[5030]: E0121 01:41:53.384527 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="registry-server" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.384539 5030 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="registry-server" Jan 21 01:41:53 crc kubenswrapper[5030]: E0121 01:41:53.384561 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="extract-content" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.384568 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="extract-content" Jan 21 01:41:53 crc kubenswrapper[5030]: E0121 01:41:53.384576 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="extract-utilities" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.384582 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="extract-utilities" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.384696 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0723085-dd4f-4e49-a833-49aa99e0f34a" containerName="registry-server" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.386310 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.419710 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.482322 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7h2c\" (UniqueName: \"kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.482375 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.482459 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.583106 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.583188 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7h2c\" (UniqueName: \"kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.583209 5030 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.583672 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.583742 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.616485 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7h2c\" (UniqueName: \"kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c\") pod \"redhat-operators-7s6j6\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:53 crc kubenswrapper[5030]: I0121 01:41:53.784846 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:41:54 crc kubenswrapper[5030]: I0121 01:41:54.243907 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:41:55 crc kubenswrapper[5030]: I0121 01:41:55.226770 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerID="050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3" exitCode=0 Jan 21 01:41:55 crc kubenswrapper[5030]: I0121 01:41:55.226824 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerDied","Data":"050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3"} Jan 21 01:41:55 crc kubenswrapper[5030]: I0121 01:41:55.227161 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerStarted","Data":"55c348db8d2634dea5efe7409ebd89488c382f8458e09ab0ce2e54ce8922779c"} Jan 21 01:41:55 crc kubenswrapper[5030]: I0121 01:41:55.228851 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:41:57 crc kubenswrapper[5030]: I0121 01:41:57.241427 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerStarted","Data":"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7"} Jan 21 01:41:58 crc kubenswrapper[5030]: I0121 01:41:58.251542 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerID="d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7" exitCode=0 Jan 21 01:41:58 crc kubenswrapper[5030]: I0121 01:41:58.251679 5030 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerDied","Data":"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7"} Jan 21 01:41:59 crc kubenswrapper[5030]: I0121 01:41:59.259110 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerStarted","Data":"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67"} Jan 21 01:41:59 crc kubenswrapper[5030]: I0121 01:41:59.282824 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7s6j6" podStartSLOduration=2.732183785 podStartE2EDuration="6.282809747s" podCreationTimestamp="2026-01-21 01:41:53 +0000 UTC" firstStartedPulling="2026-01-21 01:41:55.228547596 +0000 UTC m=+11187.548807894" lastFinishedPulling="2026-01-21 01:41:58.779173528 +0000 UTC m=+11191.099433856" observedRunningTime="2026-01-21 01:41:59.279459046 +0000 UTC m=+11191.599719364" watchObservedRunningTime="2026-01-21 01:41:59.282809747 +0000 UTC m=+11191.603070035" Jan 21 01:42:03 crc kubenswrapper[5030]: I0121 01:42:03.785277 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:03 crc kubenswrapper[5030]: I0121 01:42:03.785819 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:04 crc kubenswrapper[5030]: I0121 01:42:04.842373 5030 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7s6j6" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="registry-server" probeResult="failure" output=< Jan 21 01:42:04 crc kubenswrapper[5030]: timeout: failed to connect service ":50051" within 1s Jan 21 01:42:04 crc kubenswrapper[5030]: > Jan 21 01:42:13 crc kubenswrapper[5030]: I0121 01:42:13.874060 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:13 crc kubenswrapper[5030]: I0121 01:42:13.976304 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:14 crc kubenswrapper[5030]: I0121 01:42:14.128592 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.376146 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7s6j6" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="registry-server" containerID="cri-o://2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67" gracePeriod=2 Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.868892 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.919159 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities\") pod \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.919272 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7h2c\" (UniqueName: \"kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c\") pod \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.919324 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content\") pod \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\" (UID: \"2e77a329-fdb7-433f-bc8d-955c7e8ff189\") " Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.920336 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities" (OuterVolumeSpecName: "utilities") pod "2e77a329-fdb7-433f-bc8d-955c7e8ff189" (UID: "2e77a329-fdb7-433f-bc8d-955c7e8ff189"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:42:15 crc kubenswrapper[5030]: I0121 01:42:15.932847 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c" (OuterVolumeSpecName: "kube-api-access-t7h2c") pod "2e77a329-fdb7-433f-bc8d-955c7e8ff189" (UID: "2e77a329-fdb7-433f-bc8d-955c7e8ff189"). InnerVolumeSpecName "kube-api-access-t7h2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.020746 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7h2c\" (UniqueName: \"kubernetes.io/projected/2e77a329-fdb7-433f-bc8d-955c7e8ff189-kube-api-access-t7h2c\") on node \"crc\" DevicePath \"\"" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.020775 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.064881 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e77a329-fdb7-433f-bc8d-955c7e8ff189" (UID: "2e77a329-fdb7-433f-bc8d-955c7e8ff189"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.124290 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e77a329-fdb7-433f-bc8d-955c7e8ff189-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.387744 5030 generic.go:334] "Generic (PLEG): container finished" podID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerID="2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67" exitCode=0 Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.387811 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerDied","Data":"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67"} Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.387865 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s6j6" event={"ID":"2e77a329-fdb7-433f-bc8d-955c7e8ff189","Type":"ContainerDied","Data":"55c348db8d2634dea5efe7409ebd89488c382f8458e09ab0ce2e54ce8922779c"} Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.387895 5030 scope.go:117] "RemoveContainer" containerID="2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.387927 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s6j6" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.424387 5030 scope.go:117] "RemoveContainer" containerID="d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.456193 5030 scope.go:117] "RemoveContainer" containerID="050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.472283 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.482075 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7s6j6"] Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.486598 5030 scope.go:117] "RemoveContainer" containerID="2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67" Jan 21 01:42:16 crc kubenswrapper[5030]: E0121 01:42:16.490750 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67\": container with ID starting with 2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67 not found: ID does not exist" containerID="2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.490803 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67"} err="failed to get container status \"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67\": rpc error: code = NotFound desc = could not find container \"2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67\": container with ID starting with 2c15d70e6e0e8440d73a72b14c41bbbfdaea088aaca3b7a217d3a018e1618d67 not found: ID does not exist" Jan 21 01:42:16 crc 
kubenswrapper[5030]: I0121 01:42:16.490835 5030 scope.go:117] "RemoveContainer" containerID="d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7" Jan 21 01:42:16 crc kubenswrapper[5030]: E0121 01:42:16.491210 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7\": container with ID starting with d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7 not found: ID does not exist" containerID="d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.491245 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7"} err="failed to get container status \"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7\": rpc error: code = NotFound desc = could not find container \"d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7\": container with ID starting with d6ccdeebbdcf13984b4f3169f2314bf274db35dca9d600460c362eb4767b6ef7 not found: ID does not exist" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.491270 5030 scope.go:117] "RemoveContainer" containerID="050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3" Jan 21 01:42:16 crc kubenswrapper[5030]: E0121 01:42:16.491737 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3\": container with ID starting with 050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3 not found: ID does not exist" containerID="050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3" Jan 21 01:42:16 crc kubenswrapper[5030]: I0121 01:42:16.491765 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3"} err="failed to get container status \"050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3\": rpc error: code = NotFound desc = could not find container \"050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3\": container with ID starting with 050867319b159b86e072c65b99d7b744c61e6fc132704eabb4d2bef877ea06b3 not found: ID does not exist" Jan 21 01:42:17 crc kubenswrapper[5030]: I0121 01:42:17.976183 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" path="/var/lib/kubelet/pods/2e77a329-fdb7-433f-bc8d-955c7e8ff189/volumes" Jan 21 01:42:40 crc kubenswrapper[5030]: I0121 01:42:40.158129 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:42:40 crc kubenswrapper[5030]: I0121 01:42:40.158869 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:43:10 crc kubenswrapper[5030]: I0121 01:43:10.157138 5030 patch_prober.go:28] interesting 
pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:43:10 crc kubenswrapper[5030]: I0121 01:43:10.158100 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:43:40 crc kubenswrapper[5030]: I0121 01:43:40.156761 5030 patch_prober.go:28] interesting pod/machine-config-daemon-qb97t container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 01:43:40 crc kubenswrapper[5030]: I0121 01:43:40.158326 5030 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 01:43:40 crc kubenswrapper[5030]: I0121 01:43:40.158447 5030 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" Jan 21 01:43:40 crc kubenswrapper[5030]: I0121 01:43:40.159088 5030 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9"} pod="openshift-machine-config-operator/machine-config-daemon-qb97t" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 01:43:40 crc kubenswrapper[5030]: I0121 01:43:40.159211 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerName="machine-config-daemon" containerID="cri-o://40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" gracePeriod=600 Jan 21 01:43:40 crc kubenswrapper[5030]: E0121 01:43:40.298938 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:43:41 crc kubenswrapper[5030]: I0121 01:43:41.144131 5030 generic.go:334] "Generic (PLEG): container finished" podID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" exitCode=0 Jan 21 01:43:41 crc kubenswrapper[5030]: I0121 01:43:41.144222 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" event={"ID":"a2a963db-b558-4e9a-8e56-12636c3fe1c2","Type":"ContainerDied","Data":"40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9"} Jan 21 01:43:41 crc kubenswrapper[5030]: 
I0121 01:43:41.144535 5030 scope.go:117] "RemoveContainer" containerID="114c3338e822553138f2fc4279dfe57ddc45eb3abeb91bb6eea525ed256713ad" Jan 21 01:43:41 crc kubenswrapper[5030]: I0121 01:43:41.145139 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:43:41 crc kubenswrapper[5030]: E0121 01:43:41.145412 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:43:55 crc kubenswrapper[5030]: I0121 01:43:55.963836 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:43:55 crc kubenswrapper[5030]: E0121 01:43:55.964910 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:44:06 crc kubenswrapper[5030]: I0121 01:44:06.962503 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:44:06 crc kubenswrapper[5030]: E0121 01:44:06.963408 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:44:21 crc kubenswrapper[5030]: I0121 01:44:21.965698 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:44:21 crc kubenswrapper[5030]: E0121 01:44:21.967732 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:44:32 crc kubenswrapper[5030]: I0121 01:44:32.962692 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:44:32 crc kubenswrapper[5030]: E0121 01:44:32.963996 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:44:45 crc kubenswrapper[5030]: I0121 
01:44:45.962594 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:44:45 crc kubenswrapper[5030]: E0121 01:44:45.963731 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:44:59 crc kubenswrapper[5030]: I0121 01:44:59.963021 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:44:59 crc kubenswrapper[5030]: E0121 01:44:59.964194 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.206504 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq"] Jan 21 01:45:00 crc kubenswrapper[5030]: E0121 01:45:00.207083 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="extract-content" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.207136 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="extract-content" Jan 21 01:45:00 crc kubenswrapper[5030]: E0121 01:45:00.207196 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="registry-server" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.207216 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="registry-server" Jan 21 01:45:00 crc kubenswrapper[5030]: E0121 01:45:00.207266 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="extract-utilities" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.207290 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="extract-utilities" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.207667 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e77a329-fdb7-433f-bc8d-955c7e8ff189" containerName="registry-server" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.208699 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.211716 5030 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.213103 5030 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.218703 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq"] Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.225860 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9mn\" (UniqueName: \"kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.225915 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.225959 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.326945 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9mn\" (UniqueName: \"kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.327007 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.327037 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.328028 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume\") pod 
\"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.350217 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.353559 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9mn\" (UniqueName: \"kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn\") pod \"collect-profiles-29482665-dp5pq\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.531537 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:00 crc kubenswrapper[5030]: I0121 01:45:00.972461 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq"] Jan 21 01:45:00 crc kubenswrapper[5030]: W0121 01:45:00.981548 5030 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc3d531a_4a70_4302_bc7a_8ac13b421047.slice/crio-f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58 WatchSource:0}: Error finding container f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58: Status 404 returned error can't find the container with id f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58 Jan 21 01:45:01 crc kubenswrapper[5030]: I0121 01:45:01.995767 5030 generic.go:334] "Generic (PLEG): container finished" podID="dc3d531a-4a70-4302-bc7a-8ac13b421047" containerID="cc30b44bde1591b96cf6fb04db1a886b7fb671e83fc86978a9b0d547df5e5861" exitCode=0 Jan 21 01:45:01 crc kubenswrapper[5030]: I0121 01:45:01.996305 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" event={"ID":"dc3d531a-4a70-4302-bc7a-8ac13b421047","Type":"ContainerDied","Data":"cc30b44bde1591b96cf6fb04db1a886b7fb671e83fc86978a9b0d547df5e5861"} Jan 21 01:45:01 crc kubenswrapper[5030]: I0121 01:45:01.996359 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" event={"ID":"dc3d531a-4a70-4302-bc7a-8ac13b421047","Type":"ContainerStarted","Data":"f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58"} Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.349574 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.481507 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume\") pod \"dc3d531a-4a70-4302-bc7a-8ac13b421047\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.481682 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn9mn\" (UniqueName: \"kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn\") pod \"dc3d531a-4a70-4302-bc7a-8ac13b421047\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.481750 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume\") pod \"dc3d531a-4a70-4302-bc7a-8ac13b421047\" (UID: \"dc3d531a-4a70-4302-bc7a-8ac13b421047\") " Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.482820 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc3d531a-4a70-4302-bc7a-8ac13b421047" (UID: "dc3d531a-4a70-4302-bc7a-8ac13b421047"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.490569 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn" (OuterVolumeSpecName: "kube-api-access-kn9mn") pod "dc3d531a-4a70-4302-bc7a-8ac13b421047" (UID: "dc3d531a-4a70-4302-bc7a-8ac13b421047"). InnerVolumeSpecName "kube-api-access-kn9mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.491566 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc3d531a-4a70-4302-bc7a-8ac13b421047" (UID: "dc3d531a-4a70-4302-bc7a-8ac13b421047"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.584335 5030 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc3d531a-4a70-4302-bc7a-8ac13b421047-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.584810 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn9mn\" (UniqueName: \"kubernetes.io/projected/dc3d531a-4a70-4302-bc7a-8ac13b421047-kube-api-access-kn9mn\") on node \"crc\" DevicePath \"\"" Jan 21 01:45:03 crc kubenswrapper[5030]: I0121 01:45:03.584849 5030 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc3d531a-4a70-4302-bc7a-8ac13b421047-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 01:45:04 crc kubenswrapper[5030]: I0121 01:45:04.012549 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" event={"ID":"dc3d531a-4a70-4302-bc7a-8ac13b421047","Type":"ContainerDied","Data":"f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58"} Jan 21 01:45:04 crc kubenswrapper[5030]: I0121 01:45:04.012987 5030 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f179e14427fc205153f5e4734533194001c3643944d653fc6abd8a68ab703a58" Jan 21 01:45:04 crc kubenswrapper[5030]: I0121 01:45:04.012585 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482665-dp5pq" Jan 21 01:45:04 crc kubenswrapper[5030]: I0121 01:45:04.442539 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv"] Jan 21 01:45:04 crc kubenswrapper[5030]: I0121 01:45:04.448632 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482620-k2sfv"] Jan 21 01:45:05 crc kubenswrapper[5030]: I0121 01:45:05.977129 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5" path="/var/lib/kubelet/pods/6a0a6e2a-f088-4d4a-a5e5-5a62b4218cf5/volumes" Jan 21 01:45:14 crc kubenswrapper[5030]: I0121 01:45:14.962328 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:45:14 crc kubenswrapper[5030]: E0121 01:45:14.963267 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:45:25 crc kubenswrapper[5030]: I0121 01:45:25.962734 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:45:25 crc kubenswrapper[5030]: E0121 01:45:25.964100 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:45:39 crc kubenswrapper[5030]: I0121 01:45:39.966116 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:45:39 crc kubenswrapper[5030]: E0121 01:45:39.967123 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:45:41 crc kubenswrapper[5030]: I0121 01:45:41.571079 5030 scope.go:117] "RemoveContainer" containerID="2a31a645a937ba5dd4b9e8967a1fa6475ebf9a9ed215b390ef6cb6d72abaa9b1" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.789277 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:45:46 crc kubenswrapper[5030]: E0121 01:45:46.793831 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3d531a-4a70-4302-bc7a-8ac13b421047" containerName="collect-profiles" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.793853 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3d531a-4a70-4302-bc7a-8ac13b421047" containerName="collect-profiles" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.794028 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3d531a-4a70-4302-bc7a-8ac13b421047" containerName="collect-profiles" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.795351 5030 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.802459 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.819173 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.819284 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntqd\" (UniqueName: \"kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.819398 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.920954 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.921051 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntqd\" (UniqueName: \"kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.921162 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.921552 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.921763 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:46 crc kubenswrapper[5030]: I0121 01:45:46.945719 5030 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lntqd\" (UniqueName: \"kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd\") pod \"certified-operators-x5rk5\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:47 crc kubenswrapper[5030]: I0121 01:45:47.130574 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:47 crc kubenswrapper[5030]: I0121 01:45:47.504458 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:45:48 crc kubenswrapper[5030]: I0121 01:45:48.388279 5030 generic.go:334] "Generic (PLEG): container finished" podID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerID="09b9d2098372461080aae1d0f682eb1f49ae81aee35ab0e89357086378909318" exitCode=0 Jan 21 01:45:48 crc kubenswrapper[5030]: I0121 01:45:48.388342 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerDied","Data":"09b9d2098372461080aae1d0f682eb1f49ae81aee35ab0e89357086378909318"} Jan 21 01:45:48 crc kubenswrapper[5030]: I0121 01:45:48.388380 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerStarted","Data":"cdbf07f9f596d993fbd453a1d110010c2155dd5718f0b6715d8dc369d86d02a2"} Jan 21 01:45:49 crc kubenswrapper[5030]: I0121 01:45:49.403492 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerStarted","Data":"6c152ff450a7b2b3eed7e112cd1191d45494d7a7b8a3a3686dee7ef24a5a737a"} Jan 21 01:45:50 crc kubenswrapper[5030]: I0121 01:45:50.415062 5030 generic.go:334] "Generic (PLEG): container finished" podID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerID="6c152ff450a7b2b3eed7e112cd1191d45494d7a7b8a3a3686dee7ef24a5a737a" exitCode=0 Jan 21 01:45:50 crc kubenswrapper[5030]: I0121 01:45:50.415464 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerDied","Data":"6c152ff450a7b2b3eed7e112cd1191d45494d7a7b8a3a3686dee7ef24a5a737a"} Jan 21 01:45:51 crc kubenswrapper[5030]: I0121 01:45:51.427098 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerStarted","Data":"bd367f2cf8a0068258902e8c22dbc32ad6026f1c5d48f021c9843ea02780d6fe"} Jan 21 01:45:51 crc kubenswrapper[5030]: I0121 01:45:51.964555 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:45:51 crc kubenswrapper[5030]: E0121 01:45:51.965055 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.132315 5030 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.132920 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.211306 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.240201 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x5rk5" podStartSLOduration=8.781863448 podStartE2EDuration="11.240176811s" podCreationTimestamp="2026-01-21 01:45:46 +0000 UTC" firstStartedPulling="2026-01-21 01:45:48.391360895 +0000 UTC m=+11420.711621223" lastFinishedPulling="2026-01-21 01:45:50.849674268 +0000 UTC m=+11423.169934586" observedRunningTime="2026-01-21 01:45:51.461719418 +0000 UTC m=+11423.781979706" watchObservedRunningTime="2026-01-21 01:45:57.240176811 +0000 UTC m=+11429.560437129" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.535115 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:45:57 crc kubenswrapper[5030]: I0121 01:45:57.592436 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:45:59 crc kubenswrapper[5030]: I0121 01:45:59.484402 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x5rk5" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="registry-server" containerID="cri-o://bd367f2cf8a0068258902e8c22dbc32ad6026f1c5d48f021c9843ea02780d6fe" gracePeriod=2 Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.494803 5030 generic.go:334] "Generic (PLEG): container finished" podID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerID="bd367f2cf8a0068258902e8c22dbc32ad6026f1c5d48f021c9843ea02780d6fe" exitCode=0 Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.494994 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerDied","Data":"bd367f2cf8a0068258902e8c22dbc32ad6026f1c5d48f021c9843ea02780d6fe"} Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.550613 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.657660 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content\") pod \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.658133 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities\") pod \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.658180 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lntqd\" (UniqueName: \"kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd\") pod \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\" (UID: \"deae4c27-9aa0-475f-ae4e-67b80cf01ad9\") " Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.662464 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities" (OuterVolumeSpecName: "utilities") pod "deae4c27-9aa0-475f-ae4e-67b80cf01ad9" (UID: "deae4c27-9aa0-475f-ae4e-67b80cf01ad9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.680383 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd" (OuterVolumeSpecName: "kube-api-access-lntqd") pod "deae4c27-9aa0-475f-ae4e-67b80cf01ad9" (UID: "deae4c27-9aa0-475f-ae4e-67b80cf01ad9"). InnerVolumeSpecName "kube-api-access-lntqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.719499 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deae4c27-9aa0-475f-ae4e-67b80cf01ad9" (UID: "deae4c27-9aa0-475f-ae4e-67b80cf01ad9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.761107 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.761156 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:46:00 crc kubenswrapper[5030]: I0121 01:46:00.761180 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lntqd\" (UniqueName: \"kubernetes.io/projected/deae4c27-9aa0-475f-ae4e-67b80cf01ad9-kube-api-access-lntqd\") on node \"crc\" DevicePath \"\"" Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.507929 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rk5" event={"ID":"deae4c27-9aa0-475f-ae4e-67b80cf01ad9","Type":"ContainerDied","Data":"cdbf07f9f596d993fbd453a1d110010c2155dd5718f0b6715d8dc369d86d02a2"} Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.507993 5030 scope.go:117] "RemoveContainer" containerID="bd367f2cf8a0068258902e8c22dbc32ad6026f1c5d48f021c9843ea02780d6fe" Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.508027 5030 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5rk5" Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.547893 5030 scope.go:117] "RemoveContainer" containerID="6c152ff450a7b2b3eed7e112cd1191d45494d7a7b8a3a3686dee7ef24a5a737a" Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.558710 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.575325 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x5rk5"] Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.576118 5030 scope.go:117] "RemoveContainer" containerID="09b9d2098372461080aae1d0f682eb1f49ae81aee35ab0e89357086378909318" Jan 21 01:46:01 crc kubenswrapper[5030]: I0121 01:46:01.975920 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" path="/var/lib/kubelet/pods/deae4c27-9aa0-475f-ae4e-67b80cf01ad9/volumes" Jan 21 01:46:06 crc kubenswrapper[5030]: I0121 01:46:06.961517 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:46:06 crc kubenswrapper[5030]: E0121 01:46:06.962275 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:46:19 crc kubenswrapper[5030]: I0121 01:46:19.962277 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:46:19 crc kubenswrapper[5030]: E0121 01:46:19.963248 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:46:34 crc kubenswrapper[5030]: I0121 01:46:34.963444 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:46:34 crc kubenswrapper[5030]: E0121 01:46:34.964393 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:46:46 crc kubenswrapper[5030]: I0121 01:46:46.961857 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:46:46 crc kubenswrapper[5030]: E0121 01:46:46.962790 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:46:57 crc kubenswrapper[5030]: I0121 01:46:57.970068 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:46:57 crc kubenswrapper[5030]: E0121 01:46:57.970977 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.662805 5030 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:05 crc kubenswrapper[5030]: E0121 01:47:05.664136 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="registry-server" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.664152 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="registry-server" Jan 21 01:47:05 crc kubenswrapper[5030]: E0121 01:47:05.664180 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="extract-utilities" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.664189 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="extract-utilities" Jan 21 01:47:05 crc kubenswrapper[5030]: E0121 01:47:05.664218 5030 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="extract-content" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 
01:47:05.664225 5030 state_mem.go:107] "Deleted CPUSet assignment" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="extract-content" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.664357 5030 memory_manager.go:354] "RemoveStaleState removing state" podUID="deae4c27-9aa0-475f-ae4e-67b80cf01ad9" containerName="registry-server" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.665478 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.672544 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.775055 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.775213 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.775274 5030 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rgx\" (UniqueName: \"kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.876849 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.876942 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.876972 5030 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rgx\" (UniqueName: \"kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.877771 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc 
kubenswrapper[5030]: I0121 01:47:05.877824 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.913466 5030 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rgx\" (UniqueName: \"kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx\") pod \"community-operators-lsqmb\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:05 crc kubenswrapper[5030]: I0121 01:47:05.996052 5030 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:06 crc kubenswrapper[5030]: I0121 01:47:06.329974 5030 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:07 crc kubenswrapper[5030]: I0121 01:47:07.055565 5030 generic.go:334] "Generic (PLEG): container finished" podID="b922a3eb-743f-418b-a5ed-6a2675e3b31c" containerID="7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b" exitCode=0 Jan 21 01:47:07 crc kubenswrapper[5030]: I0121 01:47:07.055674 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerDied","Data":"7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b"} Jan 21 01:47:07 crc kubenswrapper[5030]: I0121 01:47:07.055942 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerStarted","Data":"18e5d23d0ee2ab934ea829d832768857688721f87c336cb33c79eab85c63b943"} Jan 21 01:47:07 crc kubenswrapper[5030]: I0121 01:47:07.059169 5030 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 01:47:08 crc kubenswrapper[5030]: I0121 01:47:08.963180 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:47:08 crc kubenswrapper[5030]: E0121 01:47:08.964296 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:47:09 crc kubenswrapper[5030]: I0121 01:47:09.114567 5030 generic.go:334] "Generic (PLEG): container finished" podID="b922a3eb-743f-418b-a5ed-6a2675e3b31c" containerID="e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61" exitCode=0 Jan 21 01:47:09 crc kubenswrapper[5030]: I0121 01:47:09.114593 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerDied","Data":"e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61"} Jan 21 01:47:10 crc kubenswrapper[5030]: I0121 01:47:10.135861 5030 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerStarted","Data":"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81"} Jan 21 01:47:10 crc kubenswrapper[5030]: I0121 01:47:10.176107 5030 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lsqmb" podStartSLOduration=2.549785872 podStartE2EDuration="5.176081414s" podCreationTimestamp="2026-01-21 01:47:05 +0000 UTC" firstStartedPulling="2026-01-21 01:47:07.058891982 +0000 UTC m=+11499.379152260" lastFinishedPulling="2026-01-21 01:47:09.685187514 +0000 UTC m=+11502.005447802" observedRunningTime="2026-01-21 01:47:10.166730913 +0000 UTC m=+11502.486991261" watchObservedRunningTime="2026-01-21 01:47:10.176081414 +0000 UTC m=+11502.496341742" Jan 21 01:47:15 crc kubenswrapper[5030]: I0121 01:47:15.996442 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:15 crc kubenswrapper[5030]: I0121 01:47:15.997041 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:16 crc kubenswrapper[5030]: I0121 01:47:16.071532 5030 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:16 crc kubenswrapper[5030]: I0121 01:47:16.249890 5030 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:16 crc kubenswrapper[5030]: I0121 01:47:16.318822 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.201516 5030 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lsqmb" podUID="b922a3eb-743f-418b-a5ed-6a2675e3b31c" containerName="registry-server" containerID="cri-o://c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81" gracePeriod=2 Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.761569 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.878502 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content\") pod \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.878554 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities\") pod \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.878855 5030 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24rgx\" (UniqueName: \"kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx\") pod \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\" (UID: \"b922a3eb-743f-418b-a5ed-6a2675e3b31c\") " Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.879934 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities" (OuterVolumeSpecName: "utilities") pod "b922a3eb-743f-418b-a5ed-6a2675e3b31c" (UID: "b922a3eb-743f-418b-a5ed-6a2675e3b31c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.887009 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx" (OuterVolumeSpecName: "kube-api-access-24rgx") pod "b922a3eb-743f-418b-a5ed-6a2675e3b31c" (UID: "b922a3eb-743f-418b-a5ed-6a2675e3b31c"). InnerVolumeSpecName "kube-api-access-24rgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.955253 5030 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b922a3eb-743f-418b-a5ed-6a2675e3b31c" (UID: "b922a3eb-743f-418b-a5ed-6a2675e3b31c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.980088 5030 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24rgx\" (UniqueName: \"kubernetes.io/projected/b922a3eb-743f-418b-a5ed-6a2675e3b31c-kube-api-access-24rgx\") on node \"crc\" DevicePath \"\"" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.980165 5030 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 01:47:18 crc kubenswrapper[5030]: I0121 01:47:18.980200 5030 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b922a3eb-743f-418b-a5ed-6a2675e3b31c-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.209969 5030 generic.go:334] "Generic (PLEG): container finished" podID="b922a3eb-743f-418b-a5ed-6a2675e3b31c" containerID="c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81" exitCode=0 Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.210012 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerDied","Data":"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81"} Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.210039 5030 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lsqmb" event={"ID":"b922a3eb-743f-418b-a5ed-6a2675e3b31c","Type":"ContainerDied","Data":"18e5d23d0ee2ab934ea829d832768857688721f87c336cb33c79eab85c63b943"} Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.210055 5030 scope.go:117] "RemoveContainer" containerID="c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.210161 5030 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lsqmb" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.249560 5030 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.254357 5030 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lsqmb"] Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.255500 5030 scope.go:117] "RemoveContainer" containerID="e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.287242 5030 scope.go:117] "RemoveContainer" containerID="7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.318313 5030 scope.go:117] "RemoveContainer" containerID="c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81" Jan 21 01:47:19 crc kubenswrapper[5030]: E0121 01:47:19.319022 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81\": container with ID starting with c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81 not found: ID does not exist" containerID="c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.319054 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81"} err="failed to get container status \"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81\": rpc error: code = NotFound desc = could not find container \"c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81\": container with ID starting with c520c2a007fba3e8c4fb9d924443a2892db45e84ffbf7608906dda643e106b81 not found: ID does not exist" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.319072 5030 scope.go:117] "RemoveContainer" containerID="e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61" Jan 21 01:47:19 crc kubenswrapper[5030]: E0121 01:47:19.320371 5030 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61\": container with ID starting with e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61 not found: ID does not exist" containerID="e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.320400 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61"} err="failed to get container status \"e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61\": rpc error: code = NotFound desc = could not find container \"e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61\": container with ID starting with e073b6ce4f701088e85cc2e262242da7e09baf8e55921eed3a9c31535b353e61 not found: ID does not exist" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.320418 5030 scope.go:117] "RemoveContainer" containerID="7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b" Jan 21 01:47:19 crc kubenswrapper[5030]: E0121 01:47:19.320752 5030 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b\": container with ID starting with 7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b not found: ID does not exist" containerID="7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.320778 5030 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b"} err="failed to get container status \"7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b\": rpc error: code = NotFound desc = could not find container \"7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b\": container with ID starting with 7e5a00e6a73e9b2beded78adf62c47c94ba778121d126c2699ce3e4d6ea5959b not found: ID does not exist" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.962540 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:47:19 crc kubenswrapper[5030]: E0121 01:47:19.963042 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:47:19 crc kubenswrapper[5030]: I0121 01:47:19.977870 5030 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b922a3eb-743f-418b-a5ed-6a2675e3b31c" path="/var/lib/kubelet/pods/b922a3eb-743f-418b-a5ed-6a2675e3b31c/volumes" Jan 21 01:47:33 crc kubenswrapper[5030]: I0121 01:47:33.961829 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:47:33 crc kubenswrapper[5030]: E0121 01:47:33.962553 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:47:47 crc kubenswrapper[5030]: I0121 01:47:47.976535 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:47:47 crc kubenswrapper[5030]: E0121 01:47:47.977547 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:48:00 crc kubenswrapper[5030]: I0121 01:48:00.963285 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:48:00 crc kubenswrapper[5030]: E0121 01:48:00.966551 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:48:13 crc kubenswrapper[5030]: I0121 01:48:13.963565 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:48:13 crc kubenswrapper[5030]: E0121 01:48:13.964606 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:48:28 crc kubenswrapper[5030]: I0121 01:48:28.962729 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:48:28 crc kubenswrapper[5030]: E0121 01:48:28.963855 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2" Jan 21 01:48:39 crc kubenswrapper[5030]: I0121 01:48:39.962749 5030 scope.go:117] "RemoveContainer" containerID="40898e40a8b88b4a997b493f83df63976d7837501e5a5bf30af9e7c3943ffcc9" Jan 21 01:48:39 crc kubenswrapper[5030]: E0121 01:48:39.963776 5030 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qb97t_openshift-machine-config-operator(a2a963db-b558-4e9a-8e56-12636c3fe1c2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qb97t" podUID="a2a963db-b558-4e9a-8e56-12636c3fe1c2"